repo_name (stringlengths 4-116) | path (stringlengths 3-942) | size (stringlengths 1-7) | content (stringlengths 3-1.05M) | license (stringclasses, 15 values)
---|---|---|---|---|
nylnook/nylnook-website | src/documents/en/blog/krita-brushes-pack-v2.html.md | 7656 | ---
title: "Krita Brushes Presets Pack v2"
date: 2016-09-16 14:00
thumb: '/img/blog/brush-pack-v2/icon-nylnook-brush-pack-v2-art-pen.jpg'
lang_fr: '/fr/blog/pack-brosses-krita-v2'
tags:
- download
- graphic novel
- making of
- tutorials
---

Time for an update! I'm happy to introduce **36 brush presets** for digital painting that I crafted for and with **[Krita 3.0.1](https://krita.org/)**, used for [my latest comic](/en/comics/mokatori-ep0-the-end/)... This is version 2.
They are free to use, under a [Creative Commons Zero License](http://creativecommons.org/publicdomain/zero/1.0/deed)!
## What's new in v2?
First of all, there are now 2 packs, depending on your stylus: does it support rotation in addition to pressure and tilt?
If you have a [Wacom Art Pen](https://www.wacom.com/en-us/store/pens/art-pen) or a similar stylus that supports rotation, you will be interested in the Art Pen pack:
[](https://github.com/nylnook/nylnook-krita-brushes/releases/download/v2/nylnook-v2-art.bundle)
And if you have any other pen, you will be interested in the Generic Pen pack, which emulates rotation on some brushes:
[](https://github.com/nylnook/nylnook-krita-brushes/releases/download/v2/nylnook-v2-gen.bundle)
Emulation is achieved with brand-new Krita features: Drawing Angle Lock (introduced in 3.0) and Fuzzy Stroke (introduced in 3.0.1)!
I also:
- added 3 new brush presets (Aboriginal Dots, Ink Power Rectangle and Clone Tool)
- updated almost every brush
- removed 2 not-so-interesting brushes that were no longer working
- updated textures and redid some
- and the basic brushes now use the "Greater" blending mode, which gives better results when a stroke overlaps another stroke, but works only on a transparent layer: if you want to use them on an opaque layer like a background, just switch the blending mode back to Normal
## Installation
Download the [Generic Pen Bundle](https://github.com/nylnook/nylnook-krita-brushes/releases/download/v2/nylnook-v2-gen.bundle), or the [Art Pen Bundle](https://github.com/nylnook/nylnook-krita-brushes/releases/download/v2/nylnook-v2-art.bundle).
In Krita, go to *Settings > Manage Resources… > Import Bundle/Resource*, and make sure the bundle is in the *Active Bundles* column on the left.
You should choose one of the bundles, and not install or activate both of them; otherwise the Krita tagging system will be confused by the brushes that are common to both packs.
## Usage
I usually use them on a large canvas (minimum 2K)... so these presets may look big on a smaller canvas.
### Small Icons
Brushes with a rotation icon for the Art Pen pack are meant to be used with a stylus **supporting rotation** like the [Wacom Art Pen](https://www.wacom.com/en-us/store/pens/art-pen) (the best stylus I know, if you want my opinion). This allows thick and thin strokes, essential for inking.
Brushes with a G-with-an-arrow icon for "Generic rotation" are brushes with **emulated rotation** which can work with any stylus, and rely on the Krita features Drawing Angle Lock and Fuzzy Stroke. Most of them are in the Generic Pen pack, but you can find two in the Art Pen pack where Fuzzy Stroke is more interesting than controlled rotation.
Brushes with a drop icon mix their colors with the color already on the canvas... so they feel "wet".
Brushes with mixing and rotation use more computing power than other brushes, especially when they are combined with textures. They should work on any recent computer nevertheless ;)
### Naming
As the Krita tagging system is sometimes capricious, every brush preset name starts with "Nylnook" so you can find them quickly. Then they are sorted by type:
**Aboriginal Dots**: I created this one specially to mimic Australian Aboriginal dot painting for a specific project. Just draw your line and the preset will paint dots along the way in this Aboriginal style.

**Airbrush** is a textured airbrush for shading; it's more interesting with a texture ;)

**Basic** brushes are the simplest, and the least demanding for your computer. Slightly noisy to allow soft mixing between colors. Now using the "Greater" blending mode.

**Block** allows large blocking-in of colors, in speed painting for example. Noise instead of texture to make it quicker.

**Clone Tool** allows copying part of an image onto another part. Define the source spot with Ctrl+click. It uses the airbrush texture for more random, less repetitive mixing.

**Erase**: one really hard (just erase that mistake now in one stroke) and one soft with a texture for shading.

**Fill or Erase Shape**: for quick filling, or quick erasing of large areas with the "E" shortcut.

**Ink**: brushes for quick inking or details, and some experiments for original inking.

**Ink Power**: the three best inking brushes, but they are hard to use: I recommend the Dynamic Brush tool to draw with them.

**Paint**: Three brushes with rotation and mixing for "real" painting or watercoloring.

**Pencil**: a simple pencil for sketches, really similar to the default Pencil 2B, with additional settings for more realism

**Poska**: small marker brushes inspired by the famous [Posca](http://www.posca.com) markers

## Compatibility
Compatible with Krita **3.0.1** (not 3.0), and the next point releases at least ;)
## Changelog
**September 16th 2016**: 36 brushes crafted for and with Krita 3.0.1, used for [my comics](http://nylnook.com/en/comics/)... This is version 2!
**January 7th 2016**: 25 brushes crafted for and with Krita 2.9, used for [my comics](http://nylnook.com/en/comics/)... This is version 1!
**April 24th 2015**: 12 brushes I had been crafting since Krita 2.8 and finalized with Krita 2.9... They are working, but more work is needed! This is a beta.
## License
CC-0 / Public Domain. Credit *Camille Bissuel* if needed.
## Thanks
These brushes were born with the inspiration of other brushes made by these great people:
- [Timothée Giet](http://timotheegiet.com)
- [David Revoy](http://davidrevoy.com/)
- [Pablo Cazorla](http://www.pcazorla.com/)
- [Wolthera van Hövell](http://wolthera.info/)
## Full Sources
You can find the full sources [here on Framagit](https://framagit.org/nylnook/nylnook-krita-brushes), so you can get each brush individually, icons, and so on...
| mit |
thedavisproject/davis-web | src/middleware/initContext.js | 87 | module.exports = (req, res, next) => {
req.context = req.context || {};
next();
};
| mit |
achan/angular-previewer | app/scripts/services/imagePreviewService.js | 392 | 'use strict';
angular.module('achan.previewer').service('imagePreviewService', function () {
var source;
var ImagePreviewService = {
render: function (scope, element) {
element.html('<img src="' + source + '" class="img-responsive" />');
},
forSource: function (src) {
source = src;
return ImagePreviewService;
}
};
return ImagePreviewService;
});
| mit |
gemvein/cooperative | spec/models/group_spec.rb | 768 | require 'spec_helper'
describe Group do
# Check that gems are installed
# Acts as Taggable on gem
it { should have_many(:base_tags).through(:taggings) }
# Check that appropriate fields are accessible
it { should allow_mass_assignment_of(:name) }
it { should allow_mass_assignment_of(:description) }
it { should allow_mass_assignment_of(:public) }
it { should allow_mass_assignment_of(:tag_list) }
# Check that validations are happening properly
it { should validate_presence_of(:name) }
context 'Class Methods' do
describe '#open_to_the_public' do
include_context 'groups support'
subject { Group.open_to_the_public }
it { should include public_group }
it { should_not include private_group }
end
end
end | mit |
Balise42/sattools | include/dimacsgenerator.h | 619 | #ifndef DIMACSGENERATOR_H
#define DIMACSGENERATOR_H 1
#include <vector>
#include <fstream>
#include "cnfclause.h"
#include "cnfformula.h"
#include "satgenerator.h"
/** A very very basic DIMACS parser. Only parses for cnf formulas. */
class DimacsGenerator : public SatGenerator{
private:
std::string filename;
public:
/** constructor - reads the formula from the file and creates it
@param filename the file to read
@param k arity of a clause */
DimacsGenerator(std::string filename, unsigned int k);
/** reads the file and fills f with the resulting sat formula */
void generate_sat(CNFFormula & f);
};
#endif
| mit |
samvartaka/keyak-python | utils.py | 1775 | # -*- coding: utf-8 -*-
# Keyak v2 implementation by Jos Wetzels and Wouter Bokslag
# hereby denoted as "the implementer".
# Based on Keccak Python and Keyak v2 C++ implementations
# by the Keccak, Keyak and Ketje Teams, namely, Guido Bertoni,
# Joan Daemen, Michaël Peeters, Gilles Van Assche and Ronny Van Keer
#
# For more information, feedback or questions, please refer to:
# http://keyak.noekeon.org/
# http://keccak.noekeon.org/
# http://ketje.noekeon.org/
from StringIO import StringIO
class stringStream(StringIO):
# Peek (extract byte without advancing position, return None if no more stream is available)
def peek(self):
oldPos = self.tell()
b = self.read(1)
newPos = self.tell()
if((newPos == (oldPos+1)) and (b != '')):
r = ord(b)
else:
r = None
self.seek(oldPos, 0)
return r
# Pop a single byte (as integer representation)
def get(self):
return ord(self.read(1))
# Push a single byte (as integer representation)
def put(self, b):
self.write(chr(b))
return
# Erase buffered contents
def erase(self):
self.truncate(0)
self.seek(0, 0)
return
# Set buffered contents
def setvalue(self, s):
self.erase()
self.write(s)
return
def hasMore(I):
return (I.peek() != None)
def enc8(x):
if (x > 255):
raise Exception("The integer %d cannot be encoded on 8 bits." % x)
else:
return x
# Constant-time comparison from the Django source: https://github.com/django/django/blob/master/django/utils/crypto.py
# Is constant-time only if both strings are of equal length but given the use-case that is always the case.
def constant_time_compare(val1, val2):
if len(val1) != len(val2):
return False
result = 0
for x, y in zip(val1, val2):
result |= ord(x) ^ ord(y)
return result == 0 | mit |
jilse/jilse.github.io | _posts/2016/2016-07-27-random-2016-Lake-Superior-sailing-picutres.md | 953 | ---
layout: post
title: Summer Sailing
excerpt: Random photos of summer sailing and favorite anchorages.
categories: 2016-LakeSuperior
date: 2016-07-27
published: true
image:
ogimage: "2016/DSCF3171.jpg"
images-array:
- path: 2016/DSCF3080.jpg
label:
- path: 2016/DSCF3095.jpg
label:
- path: 2016/DSCF3096.jpg
label:
- path: 2016/DSCF3100.jpg
label:
- path: 2016/DSCF3119.jpg
label:
- path: 2016/DSCF3121.jpg
label:
- path: 2016/DSCF3156.jpg
label: I like my rope rigging enough that I'm always taking pictures of it. Dead eyes and lashings with a huge ship in the background. Could it be more nautical?
- path: 2016/DSCF3159.jpg
label:
- path: 2016/DSCF3171.jpg
label:
---
I mysteriously lost most of the pictures from our summer circle tour. This is what I could find. Basically the sunset is from the day before we left and there are a few from the Apostle Islands just before we got home. That's it! | mit |
msavela/deliver | middleware_test.go | 2081 | package deliver
import (
"testing"
"net/http"
"net/http/httptest"
"reflect"
)
func TestMiddlewareBasic(t *testing.T) {
d := New()
d.Use(MiddlewareHandlerFunc(func(res Response, req *Request, next func()) {
res.Send("Hello")
}))
response, body := testMiddleware(t, d)
expect(t, body, "Hello")
expect(t, response.Code, http.StatusOK)
}
func TestMiddlewareMultiple(t *testing.T) {
d := New()
content := ""
d.Use(MiddlewareHandlerFunc(func(res Response, req *Request, next func()) {
content += "Hello"
next()
}))
d.Use(MiddlewareHandlerFunc(func(res Response, req *Request, next func()) {
content += "World"
res.SetStatus(http.StatusOK)
}))
response, _ := testMiddleware(t, d)
expect(t, content, "HelloWorld")
expect(t, response.Code, http.StatusOK)
}
func TestMiddlewareMultipleAfter(t *testing.T) {
d := New()
content := ""
d.Use(MiddlewareHandlerFunc(func(res Response, req *Request, next func()) {
next()
content += "Hello"
}))
d.Use(MiddlewareHandlerFunc(func(res Response, req *Request, next func()) {
content += "World"
res.SetStatus(http.StatusOK)
}))
response, _ := testMiddleware(t, d)
expect(t, content, "WorldHello")
expect(t, response.Code, http.StatusOK)
}
func TestMiddlewareMultipleInterrupt(t *testing.T) {
d := New()
content := ""
d.Use(MiddlewareHandlerFunc(func(res Response, req *Request, next func()) {
content += "Hello"
}))
d.Use(MiddlewareHandlerFunc(func(res Response, req *Request, next func()) {
content += "Should not be called"
res.SetStatus(http.StatusOK)
}))
response, _ := testMiddleware(t, d)
expect(t, content, "Hello")
expect(t, response.Code, http.StatusNotFound)
}
/* Helpers */
func testMiddleware(t *testing.T, deliver *Deliver) (*httptest.ResponseRecorder, string) {
response := httptest.NewRecorder()
deliver.ServeHTTP(response, (*http.Request)(nil))
return response, response.Body.String()
}
func expect(t *testing.T, a interface{}, b interface{}) {
if a != b {
t.Errorf("Expected %v (%v) - Got %v (%v)", b, reflect.TypeOf(b), a, reflect.TypeOf(a))
}
} | mit |
plotly/plotly.py | packages/python/plotly/plotly/validators/sankey/hoverlabel/_bordercolor.py | 482 | import _plotly_utils.basevalidators
class BordercolorValidator(_plotly_utils.basevalidators.ColorValidator):
def __init__(
self, plotly_name="bordercolor", parent_name="sankey.hoverlabel", **kwargs
):
super(BordercolorValidator, self).__init__(
plotly_name=plotly_name,
parent_name=parent_name,
array_ok=kwargs.pop("array_ok", True),
edit_type=kwargs.pop("edit_type", "calc"),
**kwargs
)
| mit |
datosgobar/portal-andino | Dockerfile | 1633 | # If a docker build of portal-andino is run without the "--build-arg IMAGE_VERSION={portal-base version}" parameter, the default ARG IMAGE_VERSION below is used
ARG IMAGE_VERSION=release-0.11.3
FROM datosgobar/portal-base:$IMAGE_VERSION
MAINTAINER Leandro Gomez<[email protected]>
ARG PORTAL_VERSION
ENV CKAN_HOME /usr/lib/ckan/default
ENV CKAN_DIST_MEDIA /usr/lib/ckan/default/src/ckanext-gobar-theme/ckanext/gobar_theme/public/user_images
ENV CKAN_DEFAULT /etc/ckan/default
WORKDIR /portal
# portal-andino-theme
RUN $CKAN_HOME/bin/pip install -e git+https://github.com/datosgobar/portal-andino-theme.git@0c4b0021bde0e312505e0e4ff90a2d017c755f98#egg=ckanext-gobar_theme && \
$CKAN_HOME/bin/pip install -r $CKAN_HOME/src/ckanext-gobar-theme/requirements.txt && \
/etc/ckan_init.d/build-combined-ckan-mo.sh $CKAN_HOME/src/ckanext-gobar-theme/ckanext/gobar_theme/i18n/es/LC_MESSAGES/ckan.po
# Series de Tiempo Ar explorer
RUN $CKAN_HOME/bin/pip install -e git+https://github.com/datosgobar/[email protected]#egg=ckanext-seriestiempoarexplorer
# DCAT dependencies (the plugin itself is installed from portal-andino-theme's `requirements.txt`)
RUN $CKAN_HOME/bin/pip install -r $CKAN_HOME/src/ckanext-dcat/requirements.txt
RUN mkdir -p $CKAN_DIST_MEDIA
RUN chown -R www-data:www-data $CKAN_DIST_MEDIA
RUN chmod u+rwx $CKAN_DIST_MEDIA
RUN echo "$PORTAL_VERSION" > /portal/version
RUN mkdir -p /var/lib/ckan/theme_config/templates
RUN cp $CKAN_HOME/src/ckanext-gobar-theme/ckanext/gobar_theme/templates/seccion-acerca.html /var/lib/ckan/theme_config/templates
VOLUME $CKAN_DIST_MEDIA $CKAN_DEFAULT
| mit |
jakegough/jaytwo.CommonLib | CommonLib.Futures/Numbers/MathUtility.cs | 534 | using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
namespace jaytwo.Common.Futures.Numbers
{
public static class MathUtility
{
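// Population standard deviation: square root of the mean squared deviation from the mean.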
public static double StandardDeviation(IEnumerable<double> data)
{
var average = data.Average();
var individualDeviations = data.Select(x => Math.Pow(x - average, 2));
return Math.Sqrt(individualDeviations.Average());
}
public static double StandardDeviation(params double[] data)
{
return StandardDeviation((IEnumerable<double>)data);
}
}
} | mit |
AndreiMisiukevich/FFImageLoading | samples/ImageLoading.Forms.Sample/WinPhoneSL/FFImageLoading.Forms.Sample.WinPhoneSL/LocalizedStrings.cs | 407 | using FFImageLoading.Forms.Sample.WinPhoneSL.Resources;
namespace FFImageLoading.Forms.Sample.WinPhoneSL
{
/// <summary>
/// Provides access to string resources.
/// </summary>
public class LocalizedStrings
{
private static AppResources _localizedResources = new AppResources();
public AppResources LocalizedResources { get { return _localizedResources; } }
}
}
| mit |
sethbusque/pizzaccio | src_custom/com/aws/global/dao/PizzaDAO.java | 1598 | package com.aws.global.dao;
import java.util.ArrayList;
import com.aws.global.classes.Pizza;
import com.aws.global.common.base.BaseDAO;
import com.aws.global.mapper.PizzaRowMapper;
public class PizzaDAO extends BaseDAO{
//SQL Statement when user adds a pizza to his inventory
public void addPizza(String pizzaName, int pizzaPrice)
{
String sql = "INSERT INTO PIZZA (pizza_id, pizza_name, pizza_price) VALUES (NULL, ?, ?);";
getJdbcTemplate().update(sql, new Object[] { pizzaName, pizzaPrice});
}
//SQL Statement when user wants to get a list of pizzas
public ArrayList<Pizza> getAllPizza()
{
String sql = "SELECT * FROM Pizza";
ArrayList<Pizza> pizzas = (ArrayList<Pizza>) getJdbcTemplate().query(sql, new PizzaRowMapper());
return pizzas;
}
//SQL Statement when user wants to get a pizza record using a pizza id
public Pizza getPizzaById(int id)
{
String sql = "SELECT * FROM PIZZA WHERE pizza_id = ?";
Pizza pizza = (Pizza)getJdbcTemplate().queryForObject(
sql, new Object[] { id },
new PizzaRowMapper());
return pizza;
}
//SQL Statement when user wants to update a certain pizza's information
public void editPizza(String pizza_name, int pizza_price, int id)
{
String sql = "UPDATE PIZZA SET pizza_name = ?, pizza_price = ? WHERE pizza_id = ?;";
getJdbcTemplate().update(sql, new Object[] { pizza_name, pizza_price, id });
}
//SQL Statement when user wants to delete a pizza information
public void deletePizza(int id)
{
String sql = "DELETE FROM PIZZA WHERE pizza_id = ?";
getJdbcTemplate().update(sql, new Object[] { id });
}
}
| mit |
howardhou/DataStatistic | Example/Pods/Target Support Files/DataStatistic/DataStatistic-umbrella.h | 248 | #ifdef __OBJC__
#import <UIKit/UIKit.h>
#endif
#import "DataStatistic.h"
#import "TalkingData.h"
#import "TalkingDataSMS.h"
FOUNDATION_EXPORT double DataStatisticVersionNumber;
FOUNDATION_EXPORT const unsigned char DataStatisticVersionString[];
| mit |
mafintosh/mongojs | test/test-expose-bson-types.js | 408 | var test = require('./tape')
var mongojs = require('../index')
test('should export bson types', function (t) {
t.ok(mongojs.Binary)
t.ok(mongojs.Code)
t.ok(mongojs.DBRef)
t.ok(mongojs.Double)
t.ok(mongojs.Long)
t.ok(mongojs.MinKey)
t.ok(mongojs.MaxKey)
t.ok(mongojs.ObjectID)
t.ok(mongojs.ObjectId)
t.ok(mongojs.Symbol)
t.ok(mongojs.Timestamp)
t.ok(mongojs.Decimal128)
t.end()
})
| mit |
coleww/twitter_bot_generator | such_streaming_bot/src/such_streaming_bot.rb | 115 | class SuchStreamingBot
class << self
def matches? text
!!(text =~ /hello world/)
end
end
end
| mit |
imco/nmx | frontend/README.md | 603 | # NOMS/NMX/Normas Frontend
The three frontends live on separate development branches:
* master ---> http://noms.imco.org.mx
* nmx ---> http://nmx.imco.org.mx
* normas ---> http://normas.imco.org.mx
To switch branches, use the GIT command
`git checkout ${BRANCH}`
The deployment site build is run from the `frontend` folder
> #### NOTE:
> Each site must be built individually from its own branch.
## Build & development
Run `grunt` for building and `grunt serve` for preview.
## Testing
Running `grunt test` will run the unit tests with karma.
| mit |
Monkios/ClientServerGame | ConsoleClient/Properties/AssemblyInfo.cs | 1546 | using System.Reflection;
using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;
// General information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("Client")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("Client")]
[assembly: AssemblyCopyright("Copyright © 2017")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
// Setting ComVisible to false makes the types in this assembly invisible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]
// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("210de826-2c8c-4023-a45b-777ce845b803")]
// Version information for an assembly consists of the following four values:
//
// Major Version
// Minor Version
// Build Number
// Revision
//
// You can specify all the values or you can default the Build and Revision Numbers
// by using the '*' as shown below:
// [assembly: AssemblyVersion("1.0.*")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]
| mit |
instaclick/PDI-Plugin-Step-BloomFilter | ic-filter/src/main/java/com/instaclick/filter/DataFilter.java | 780 | package com.instaclick.filter;
/**
* Defines a behavior that should be implement by all filter
*
* @author Fabio B. Silva <[email protected]>
*/
public interface DataFilter
{
/**
* Adds the given {@link Data} if it does not exists
*
* @param data
*
* @return <b>TRUE</b> if the the {@link Data} does not exists; <b>FALSE</b> otherwise
*/
public boolean add(Data data);
/**
* Check if the given {@link Data} exists
*
* @param data
*
* @return <b>TRUE</b> if the the {@link Data} does not exists; <b>FALSE</b> otherwise
*/
public boolean contains(Data data);
/**
* Flushes the filter data, this operation should be invoked at the end of the filter
*/
public void flush();
} | mit |
sazid/codes | problem_solving/codeforces/676C.cpp | 881 | #include <bits/stdc++.h>
using namespace std;
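// Sliding window: length of the longest substring of s containing at most k
// occurrences of the character x (changing those k characters yields a uniform run).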
int count_consecutive(string &s, int n, int k, char x) {
int mx_count = 0;
int x_count = 0;
int curr_count = 0;
int l = 0;
int r = 0;
while (r < n) {
if (x_count <= k) {
if (s[r] == x)
x_count++;
r++;
curr_count++;
if (s[r-1] != x) mx_count = max(mx_count, curr_count);
} else {
if (s[l] == x) {
x_count--;
}
l++;
curr_count--;
}
}
if (s[s.size()-1] == x && x_count) mx_count++;
return mx_count;
}
int main() {
ios::sync_with_stdio(false);
cin.tie(nullptr);
int n, k;
string s;
cin >> n >> k;
cin >> s;
cout << max(count_consecutive(s, n, k, 'b'), count_consecutive(s, n, k, 'a')) << endl;
return 0;
}
| mit |
stereocat/expectacle | README.md | 10236 | # Expectacle
[](https://badge.fury.io/rb/expectacle)
Expectacle ("expect + spectacle") is a small wrapper of `pty`/`expect`.
It can send commands (command-list) to hosts (including network devices etc)
using telnet/ssh session.
Expectacle is portable (instead of less feature).
Because it depends on only standard modules (YAML, ERB, PTY, Expect and Logger).
It can work on almost ruby(>2.2) system without installation other gems. (probably...)
## Installation
Add this line to your application's Gemfile:
```ruby
gem 'expectacle'
```
And then execute:
$ bundle
Or install it yourself as:
$ gem install expectacle
## Usage
### Send commands to hosts
See [exe/run_command](./exe/run_command) and [vendor directory](./vendor).
`run_command` can send commands to hosts with the `-r`/`--run` option.
$ bundle exec run_command -r -h l2switch.yml -c cisco_show_arp.yml
- See details of the command line options with the `--help` option.
- [l2switch.yml](./vendor/hosts/l2switch.yml) is a host-list file.
It contains the data definitions for each host to send commands to.
- In the username and password (login/enable) parameters,
you can write environment variables in ERB manner to avoid writing raw login information.
- [exe/readne](./exe/readne) is a small bash script to set environment variables in your shell.
```
$ export L2SW_USER=`./exe/readne`
Input: (type username)
$ export L2SW_PASS=`./exe/readne`
Input: (type password)
```
- `Expectacle::Thrower` reads a prompt-file chosen by the "type" parameter in the host-list file.
- In a prompt-file, the prompt regexps used for interactive operation with a host
are defined. (These regexps are common information for a host group (vendor, OS, ...).)
- The prompt-file is looked up by the filename `#{type}_prompt.yml` in the [prompts directory](./vendor/prompts),
where the `type` parameter is defined in the host-list file.
- [cisco_show_arp.yml](./vendor/commands/cisco_show_arp.yml) is a command-list file.
- It is a list of commands.
- Each file is written in YAML.
### Parameter expansion and preview
Expectacle has a parameter expansion feature using ERB.
In a command-list file,
you can write command strings that include environment variables and parameters defined in the host file.
See the [Parameter definitions](#parameter-definitions) section for details of the parameter expansion feature (and the ERB sketch in the Command list with ERB section).
Thereby, there is some risk of sending dangerous commands because of wrong parameter or command definitions.
So you can preview the expanded command strings and parameters that will be sent to a host before actually executing them.
For example:
```
stereocat@tftpserver:~/expectacle$ bundle exec run_command -p -h l2switch.yml -c cisco_save_config_tftp.yml
---
:spawn_cmd: ssh -o StrictHostKeyChecking=no -o KexAlgorithms=+diffie-hellman-group1-sha1
-l cisco 192.168.20.150
:prompt:
:password: "^Password\\s*:"
:username: "^Username\\s*:"
:sub1: "\\][:\\?]"
:sub2: "\\[confirm\\]"
:yn: "\\[yes\\/no\\]:"
:command1: "^[\\w\\-]+>"
:command2: "^[\\w\\-]+(:?\\(config\\))?\\#"
:enable_password: SAME_AS_LOGIN_PASSWORD
:enable_command: enable
:host:
:hostname: l2sw1
:type: c3750g
:ipaddr: 192.168.20.150
:protocol: ssh
:username: cisco
:password: ********
:enable: ********
:tftp_server: 192.168.20.170
:commands:
- copy run start
- copy run tftp://192.168.20.170/l2sw1.confg
---
(snip)
```
**Notice** : Passwords are masked in the example above, but actually, raw password strings are printed out.
### Change place of log message
With `-l`/`--logfile`, [run_command](./exe/run_command) changes the logging IO to a file instead of standard-out (the default).
$ bundle exec run_command -r -l foo.log -h l2switch.yml -c cisco_show_arp.yml
With `-s`/`--syslog`, [run_command](./exe/run_command) changes logging instance to `syslog/logger`.
So, log messages are printed out to syslog on localhost.
$ bundle exec run_command -rs -h l2switch.yml -c cisco_show_arp.yml
**Notice** : When `--logfile` and `--syslog` are specified at the same time, `--syslog` is used for logging.
### Quiet mode
With `-q`/`--quiet`, [run_command](./exe/run_command) stops printing results
received from a host to standard out. For example:
$ bundle exec run_command -rq -h l2switch.yml -c cisco_show_arp.yml
the command prints only log messages (without host output) to standard out.
If you use the syslog (`-s`) and quiet (`-q`) options,
nothing appears in the terminal (standard out).
$ bundle exec run_command -rqs -h l2switch.yml -c cisco_show_arp.yml
## Parameter Definitions
### Expectacle::Thrower
`Expectacle::Thrower` argument description.
- `:timeout` : (Optional) Timeout interval (sec) to connect a host.
(default: 60sec)
- `:verbose` : (Optional) When `:verbose` is `false`,
`Expectacle` does not output spawned process input/output to standard-out(`$stdout`).
(default: `true`)
- `:base_dir`: (Optional) Base path to search host/prompt/command files.
(default: current working directory (`Dir.pwd`))
- `#{base_dir}/commands`: command-list file directory.
- `#{base_dir}/prompts` : prompt-file directory.
- `#{base_dir}/hosts` : host-file directory.
- `:logger` : (Optional) IO object used for logging `Expectacle` operations.
(default: `$stdout`)
**Notice** : When `Expectacle` succeeds in connecting to (spawning) a host,
it will change the user mode to privilege (root/super-user/enable) first, ASAP.
All commands are executed in privilege mode on the host.
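As a rough illustration only, here is what creating a thrower with the options above might look like. This is a sketch under the assumption that the options are passed to the constructor as a hash (an assumption of this example, not something this README states); see [exe/run_command](./exe/run_command) for the actual wiring.
```ruby
require 'expectacle'

# Assumption: the documented options (:timeout, :verbose, :base_dir, :logger)
# are handed to the constructor as a hash. Check exe/run_command for real usage.
thrower = Expectacle::Thrower.new(
  timeout: 120,       # wait up to 120 sec when connecting to a host
  verbose: false,     # do not echo spawned process I/O to $stdout
  base_dir: Dir.pwd,  # hosts/, prompts/ and commands/ are searched under here
  logger: $stdout     # IO object used for logging
)
```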
### Host-list parameter
A host-list file is a list of host-parameters.
- `:hostname`: String used to indicate the host name.
- `:type`: Host type (used to choose the prompt-file).
- `:ipaddr`: IP(v4) address used to connect to the host.
- `:protocol`: Protocol used to connect to the host (telnet or ssh).
- `:username`: Login name.
- `:password`: Login password.
- `:enable`: Password to enter privilege mode.
ERB can be used in `:username`, `:password` and `:enable` to set values from environment variables.
You can add other parameter(s) to refer to in command-list files.
See also: [Command list](#command-list-with-erb) section.
### Prompt parameter
A prompt file is a table of the prompt regexps for a host group (type).
- `:password`: Login password prompt
- `:username`: Login username prompt
- `:sub1`: Sub command prompt
- `:sub2`: Sub command prompt
- `:yn`: Yes/No prompt
- `:command1`: Command prompt (normal mode)
- `:command2`: Command prompt (privilege mode)
- `enable_password`: Enable password prompt
- `enable_command`: Command to enter privilege mode
(only this parameter is not a "prompt regexp")
### Command list with ERB
A command-list is a simple list of command strings.
A command string can contain host-parameter references via ERB.
For example, if you want to save the configuration of a Cisco device to a tftp server:
- Add a parameter with the tftp server info (IP address) in the [host-list file](vendor/hosts/l2switch.yml).
```YAML
- :hostname : 'l2sw1'
:type : 'c3750g'
:ipaddr : '192.168.20.150'
:protocol : 'ssh'
:username : "<%= ENV['L2SW_USER'] %>"
:password : "<%= ENV['L2SW_PASS'] %>"
:enable : "<%= ENV['L2SW_PASS'] %>"
:tftp_server: '192.168.20.170'
- :hostname : 'l2sw2'
:type : 'c3750g'
:ipaddr : '192.168.20.151'
:protocol : 'ssh'
:username : "<%= ENV['L2SW_USER'] %>"
:password : "<%= ENV['L2SW_PASS'] %>"
:enable : "<%= ENV['L2SW_PASS'] %>"
:tftp_server: '192.168.20.170'
```
- Write a [command-list file](vendor/commands/cisco_save_config_tftp.yml) using ERB.
When sending a command to a host, the ERB string is evaluated in the `Expectacle::Thrower` binding.
So it can refer to host-parameters through the `@host_param` hash (see the sketch after the example below).
- When the command-list below is executed, the host configuration will be saved as a file named `l2sw1.confg` on the tftp server.
- See also: the [parameter preview](#parameter-expansion-and-preview) section.
```YAML
- "copy run start"
- "copy run tftp://<%= @host_param[:tftp_server] %>/<%= @host_param[:hostname] %>.confg"
```
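To see how such a command template expands, here is a minimal sketch using only Ruby's standard `erb` library. The `@host_param` hash below is just an illustration shaped like one host-list entry (it is not Expectacle's internal object), but the expansion result matches the preview shown earlier.
```ruby
require 'erb'

# Illustrative host parameters, shaped like one entry of a host-list file.
@host_param = { hostname: 'l2sw1', tftp_server: '192.168.20.170' }

template = 'copy run tftp://<%= @host_param[:tftp_server] %>/<%= @host_param[:hostname] %>.confg'

# ERB evaluates the template in the given binding, so @host_param is visible,
# which mirrors how command strings see host parameters inside Expectacle::Thrower.
puts ERB.new(template).result(binding)
# => copy run tftp://192.168.20.170/l2sw1.confg
```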
## Default SSH Options
When using the `ssh` (OpenSSH) command to spawn a device session, the user can set options for the command via `#{base_dir}/opts/ssh_opts.yml`.
With options listed as in [ssh_opts.yml](./vendor/opts/ssh_opts.yml),
```
- '-o StrictHostKeyChecking=no'
- '-o KexAlgorithms=+diffie-hellman-group1-sha1'
- '-o Ciphers=+aes128-cbc,3des-cbc,aes192-cbc,aes256-cbc'
```
it works the same as the `~/.ssh/config` below.
```
Host *
StrictHostKeyChecking no
KexAlgorithms +diffie-hellman-group1-sha1
Ciphers +aes128-cbc,3des-cbc,aes192-cbc,aes256-cbc
```
## Use Local Serial Port
Expectacle can use the `cu` (call up another system) command to operate a device via a local serial port.
First, install `cu`. If you use Ubuntu, install it with `apt`.
```
sudo apt install cu
```
Next, set the parameter `:protocol` to `cu`, and write the `cu` command options as `:cu_opts`. Usually, one serial port corresponds to one device, so the host parameter `:cu_opts` is used as the options to connect to a host via its serial port. For example:
```
- :hostname : 'l2sw1'
:type : 'c3750g'
:protocol : 'cu'
:cu_opts : '-l /dev/ttyUSB0 -s 9600'
```
The file `#{base_dir}/opts/cu_opts.yml` has the default options for the `cu` command.
Finally, execute `run_command` with `sudo`, because superuser permission is required to handle the local device.
```
sudo -E bundle exec run_command -r -h l2switch.yml -c cisco_show_version.yml
```
**Notice** : Without the `sudo -E` (`--preserve-env`) option, it does not preserve environment variables such as the username/password and others you defined.
## TODO
### Sub prompt operation (interactive command)
The feature for sub-prompts (interactive commands) is not enough yet.
For now, Expectacle sends a fixed response for each sub-prompt.
(These actions were defined for Cisco to execute the "copy run" example above...)
- Yes/No (`:yn`) : always sends "yes"
- Sub prompt (`:sub1` and `:sub2`) : always sends an empty string (RETURN)
### Error handling
Expectacle does not have an error message handling feature.
If a host returns an error message when Expectacle sends a command,
Expectacle ignores it and continues sending the rest of the commands (until the command list is empty).
## Contributing
Bug reports and pull requests are welcome on GitHub at <https://github.com/stereocat/expectacle>.
## License
The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
| mit |
ryanburns23/font-image | test/font-image_test.html | 802 | <!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, minimum-scale=1, initial-scale=1, user-scalable=yes">
<title>font-image test</title>
<script src="../../webcomponentsjs/webcomponents-lite.js"></script>
<script src="../../web-component-tester/browser.js"></script>
<link rel="import" href="../font-image.html">
</head>
<body>
<test-fixture id="basic">
<template>
<font-image></font-image>
</template>
</test-fixture>
<script>
suite('font-image', function() {
test('instantiating the element works', function() {
var element = fixture('basic');
assert.equal(element.is, 'font-image');
});
});
</script>
</body>
</html>
| mit |
voltagex/b2-csharp | BaxterWorks.B2/Extensions/BucketExtensions.cs | 2003 | using BaxterWorks.B2.Exceptions;
using BaxterWorks.B2.Types;
namespace BaxterWorks.B2.Extensions
{
public static class BucketExtensions
{
public static Bucket GetOrCreateBucket(this ServiceStackB2Api client, CreateBucketRequest request)
{
try
{
return client.CreateBucket(request);
}
catch (DuplicateBucketException) //todo: there are other ways this could fail
{
return client.GetBucketByName(request.BucketName);
}
}
/// <summary>
/// Get an existing bucket, or create a new one if it doesn't exist. Defaults to a private bucket
/// </summary>
/// <param name="client"></param>
/// <param name="bucketName"></param>
/// <returns><see cref="Bucket"/></returns>
public static Bucket GetOrCreateBucket(this ServiceStackB2Api client, string bucketName)
{
try
{
return client.CreateBucket(bucketName);
}
catch (DuplicateBucketException) //todo: there are other ways this could fail
{
return client.GetBucketByName(bucketName);
}
}
public static Bucket OverwriteBucket(this ServiceStackB2Api client, CreateBucketRequest request, bool deleteFiles = false)
{
try
{
return client.CreateBucket(request);
}
catch (DuplicateBucketException) //todo: there are other ways this could fail
{
Bucket targetBucket = client.GetBucketByName(request.BucketName);
if (deleteFiles)
{
client.DeleteBucketRecursively(targetBucket);
}
else
{
client.DeleteBucket(targetBucket);
}
return client.CreateBucket(request);
}
}
}
} | mit |
mdimitrov/oldcrafts | css/favorites.css | 808 | .favorites-container{
margin-top: 16px;
min-height: 350px;
padding-bottom: 30px;
font-family: Helvetica, Verdana;
border-radius: 2px;
background: rgba(255,255,255,0.8);
border: solid 2px rgba(255,255,255,0.3);
-webkit-box-shadow: 2px 2px 5px rgba(0,0,0,0.2);
-moz-box-shadow: 2px 2px 5px rgba(0,0,0,0.2);
box-shadow: 2px 2px 5px rgba(0,0,0,0.2);
}
.favorites-container p {
color: #50737a;
margin-left: 40px;
margin-top: 20px;
}
.cat-small.favorite{
display: inline-block;
margin-left: 10px;
margin-top: 10px;
}
.cat-small.favorite .image img {
}
a.credit {
display:block;
font-size: 11px;
font-family: Arial,Helvetica,sans-serif;
color: #999;
position: absolute;
bottom: 6px;
right: 7px;
}
a.credit:hover{
color: #fff;
} | mit |
PtitNoony/FxTreeMap | src/main/java/com/github/ptitnoony/components/fxtreemap/MapData.java | 3281 | /*
* The MIT License
*
* Copyright 2017 Arnaud Hamon
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package com.github.ptitnoony.components.fxtreemap;
import java.beans.PropertyChangeListener;
import java.util.List;
/**
*
* @author ahamon
*/
public interface MapData {
/**
* Data type that represents whether a data represents a single object
* (ie LEAF) or an aggregation of objects (ie NODE)
*/
enum DataType {
LEAF, NODE
};
DataType getType();
/**
* Get the data value.
*
* @return the data value
*/
double getValue();
/**
* Set the data value. If the data has children data, their value will be
* set with the same percentage of the value they use to have before the
* setValue is applied. The value must be equal or greater to 0.
*
* @param newValue the new data value
*/
void setValue(double newValue);
/**
* Get the data name.
*
* @return the data name
*/
String getName();
/**
* Set the data name.
*
* @param newName the new data name
*/
void setName(String newName);
/**
* If the data is an aggregation of children data.
*
* @return if the data is an aggregation of children data
*/
boolean hasChildrenData();
/**
* Get the children aggregated data if any.
*
* @return the list of aggregated data
*/
List<MapData> getChildrenData();
/**
* Add a child data. If the data had no child before, adding a child data
* will override the previously set data value.
*
* @param data the data to be added as a child data to aggregate
*/
void addChildrenData(MapData data);
/**
* Remove a child data.
*
* @param data the data to be removed
*/
void removeChildrenData(MapData data);
/**
* Add a property change listener.
*
* @param listener the listener to be added
*/
void addPropertyChangeListener(PropertyChangeListener listener);
/**
* Remove a property change listener.
*
* @param listener the listener to be removed
*/
void removePropertyChangeListener(PropertyChangeListener listener);
}
| mit |
ianrumford/potrubi | lib/potrubi/potrubi.rb | 137 |
# Potrubi
gemName = 'potrubi'
#requireList = %w(mixin/bootstrap)
#requireList.each {|r| require_relative "#{gemName}/#{r}"}
__END__
| mit |
slundberg/shap | shap/explainers/_deep/deep_pytorch.py | 16170 | import numpy as np
import warnings
from .._explainer import Explainer
from packaging import version
torch = None
class PyTorchDeep(Explainer):
def __init__(self, model, data):
# try and import pytorch
global torch
if torch is None:
import torch
if version.parse(torch.__version__) < version.parse("0.4"):
warnings.warn("Your PyTorch version is older than 0.4 and not supported.")
# check if we have multiple inputs
self.multi_input = False
if type(data) == list:
self.multi_input = True
if type(data) != list:
data = [data]
self.data = data
self.layer = None
self.input_handle = None
self.interim = False
self.interim_inputs_shape = None
self.expected_value = None # to keep the DeepExplainer base happy
if type(model) == tuple:
self.interim = True
model, layer = model
model = model.eval()
self.layer = layer
self.add_target_handle(self.layer)
# if we are taking an interim layer, the 'data' is going to be the input
# of the interim layer; we will capture this using a forward hook
with torch.no_grad():
_ = model(*data)
interim_inputs = self.layer.target_input
if type(interim_inputs) is tuple:
# this should always be true, but just to be safe
self.interim_inputs_shape = [i.shape for i in interim_inputs]
else:
self.interim_inputs_shape = [interim_inputs.shape]
self.target_handle.remove()
del self.layer.target_input
self.model = model.eval()
self.multi_output = False
self.num_outputs = 1
with torch.no_grad():
outputs = model(*data)
# also get the device everything is running on
self.device = outputs.device
if outputs.shape[1] > 1:
self.multi_output = True
self.num_outputs = outputs.shape[1]
self.expected_value = outputs.mean(0).cpu().numpy()
def add_target_handle(self, layer):
input_handle = layer.register_forward_hook(get_target_input)
self.target_handle = input_handle
def add_handles(self, model, forward_handle, backward_handle):
"""
Add handles to all non-container layers in the model.
Recursively for non-container layers
"""
handles_list = []
model_children = list(model.children())
if model_children:
for child in model_children:
handles_list.extend(self.add_handles(child, forward_handle, backward_handle))
else: # leaves
handles_list.append(model.register_forward_hook(forward_handle))
handles_list.append(model.register_backward_hook(backward_handle))
return handles_list
def remove_attributes(self, model):
"""
Removes the x and y attributes which were added by the forward handles
Recursively searches for non-container layers
"""
for child in model.children():
if 'nn.modules.container' in str(type(child)):
self.remove_attributes(child)
else:
try:
del child.x
except AttributeError:
pass
try:
del child.y
except AttributeError:
pass
def gradient(self, idx, inputs):
self.model.zero_grad()
X = [x.requires_grad_() for x in inputs]
outputs = self.model(*X)
selected = [val for val in outputs[:, idx]]
grads = []
if self.interim:
interim_inputs = self.layer.target_input
for idx, input in enumerate(interim_inputs):
grad = torch.autograd.grad(selected, input,
retain_graph=True if idx + 1 < len(interim_inputs) else None,
allow_unused=True)[0]
if grad is not None:
grad = grad.cpu().numpy()
else:
grad = torch.zeros_like(X[idx]).cpu().numpy()
grads.append(grad)
del self.layer.target_input
return grads, [i.detach().cpu().numpy() for i in interim_inputs]
else:
for idx, x in enumerate(X):
grad = torch.autograd.grad(selected, x,
retain_graph=True if idx + 1 < len(X) else None,
allow_unused=True)[0]
if grad is not None:
grad = grad.cpu().numpy()
else:
grad = torch.zeros_like(X[idx]).cpu().numpy()
grads.append(grad)
return grads
def shap_values(self, X, ranked_outputs=None, output_rank_order="max", check_additivity=False):
# X ~ self.model_input
# X_data ~ self.data
# check if we have multiple inputs
if not self.multi_input:
assert type(X) != list, "Expected a single tensor model input!"
X = [X]
else:
assert type(X) == list, "Expected a list of model inputs!"
X = [x.detach().to(self.device) for x in X]
if ranked_outputs is not None and self.multi_output:
with torch.no_grad():
model_output_values = self.model(*X)
# rank and determine the model outputs that we will explain
if output_rank_order == "max":
_, model_output_ranks = torch.sort(model_output_values, descending=True)
elif output_rank_order == "min":
_, model_output_ranks = torch.sort(model_output_values, descending=False)
elif output_rank_order == "max_abs":
_, model_output_ranks = torch.sort(torch.abs(model_output_values), descending=True)
else:
assert False, "output_rank_order must be max, min, or max_abs!"
model_output_ranks = model_output_ranks[:, :ranked_outputs]
else:
model_output_ranks = (torch.ones((X[0].shape[0], self.num_outputs)).int() *
torch.arange(0, self.num_outputs).int())
# add the gradient handles
handles = self.add_handles(self.model, add_interim_values, deeplift_grad)
if self.interim:
self.add_target_handle(self.layer)
# compute the attributions
output_phis = []
for i in range(model_output_ranks.shape[1]):
phis = []
if self.interim:
for k in range(len(self.interim_inputs_shape)):
phis.append(np.zeros((X[0].shape[0], ) + self.interim_inputs_shape[k][1: ]))
else:
for k in range(len(X)):
phis.append(np.zeros(X[k].shape))
for j in range(X[0].shape[0]):
# tile the inputs to line up with the background data samples
tiled_X = [X[l][j:j + 1].repeat(
(self.data[l].shape[0],) + tuple([1 for k in range(len(X[l].shape) - 1)])) for l
in range(len(X))]
joint_x = [torch.cat((tiled_X[l], self.data[l]), dim=0) for l in range(len(X))]
# run attribution computation graph
feature_ind = model_output_ranks[j, i]
sample_phis = self.gradient(feature_ind, joint_x)
# assign the attributions to the right part of the output arrays
if self.interim:
sample_phis, output = sample_phis
x, data = [], []
for k in range(len(output)):
x_temp, data_temp = np.split(output[k], 2)
x.append(x_temp)
data.append(data_temp)
for l in range(len(self.interim_inputs_shape)):
phis[l][j] = (sample_phis[l][self.data[l].shape[0]:] * (x[l] - data[l])).mean(0)
else:
for l in range(len(X)):
phis[l][j] = (torch.from_numpy(sample_phis[l][self.data[l].shape[0]:]).to(self.device) * (X[l][j: j + 1] - self.data[l])).cpu().detach().numpy().mean(0)
output_phis.append(phis[0] if not self.multi_input else phis)
# cleanup; remove all gradient handles
for handle in handles:
handle.remove()
self.remove_attributes(self.model)
if self.interim:
self.target_handle.remove()
if not self.multi_output:
return output_phis[0]
elif ranked_outputs is not None:
return output_phis, model_output_ranks
else:
return output_phis
# Module hooks
def deeplift_grad(module, grad_input, grad_output):
"""The backward hook which computes the deeplift
gradient for an nn.Module
"""
# first, get the module type
module_type = module.__class__.__name__
# first, check the module is supported
if module_type in op_handler:
if op_handler[module_type].__name__ not in ['passthrough', 'linear_1d']:
return op_handler[module_type](module, grad_input, grad_output)
else:
print('Warning: unrecognized nn.Module: {}'.format(module_type))
return grad_input
def add_interim_values(module, input, output):
"""The forward hook used to save interim tensors, detached
from the graph. Used to calculate the multipliers
"""
try:
del module.x
except AttributeError:
pass
try:
del module.y
except AttributeError:
pass
module_type = module.__class__.__name__
if module_type in op_handler:
func_name = op_handler[module_type].__name__
# First, check for cases where we don't need to save the x and y tensors
if func_name == 'passthrough':
pass
else:
# check only the 0th input varies
for i in range(len(input)):
if i != 0 and type(output) is tuple:
assert input[i] == output[i], "Only the 0th input may vary!"
# if a new method is added, it must be added here too. This ensures tensors
# are only saved if necessary
if func_name in ['maxpool', 'nonlinear_1d']:
# only save tensors if necessary
if type(input) is tuple:
setattr(module, 'x', torch.nn.Parameter(input[0].detach()))
else:
setattr(module, 'x', torch.nn.Parameter(input.detach()))
if type(output) is tuple:
setattr(module, 'y', torch.nn.Parameter(output[0].detach()))
else:
setattr(module, 'y', torch.nn.Parameter(output.detach()))
if module_type in failure_case_modules:
input[0].register_hook(deeplift_tensor_grad)
def get_target_input(module, input, output):
"""A forward hook which saves the tensor - attached to its graph.
Used if we want to explain the interim outputs of a model
"""
try:
del module.target_input
except AttributeError:
pass
setattr(module, 'target_input', input)
# From the documentation: "The current implementation will not have the presented behavior for
# complex Module that perform many operations. In some failure cases, grad_input and grad_output
# will only contain the gradients for a subset of the inputs and outputs.
# The tensor hook below handles such failure cases (currently, MaxPool1d). In such cases, the deeplift
# grad should still be computed, and then appended to the complex_model_gradients list. The tensor hook
# will then retrieve the proper gradient from this list.
failure_case_modules = ['MaxPool1d']
def deeplift_tensor_grad(grad):
return_grad = complex_module_gradients[-1]
del complex_module_gradients[-1]
return return_grad
complex_module_gradients = []
def passthrough(module, grad_input, grad_output):
"""No change made to gradients"""
return None
def maxpool(module, grad_input, grad_output):
pool_to_unpool = {
'MaxPool1d': torch.nn.functional.max_unpool1d,
'MaxPool2d': torch.nn.functional.max_unpool2d,
'MaxPool3d': torch.nn.functional.max_unpool3d
}
pool_to_function = {
'MaxPool1d': torch.nn.functional.max_pool1d,
'MaxPool2d': torch.nn.functional.max_pool2d,
'MaxPool3d': torch.nn.functional.max_pool3d
}
delta_in = module.x[: int(module.x.shape[0] / 2)] - module.x[int(module.x.shape[0] / 2):]
dup0 = [2] + [1 for i in delta_in.shape[1:]]
# we also need to check if the output is a tuple
y, ref_output = torch.chunk(module.y, 2)
cross_max = torch.max(y, ref_output)
diffs = torch.cat([cross_max - ref_output, y - cross_max], 0)
# all of this just to unpool the outputs
with torch.no_grad():
_, indices = pool_to_function[module.__class__.__name__](
module.x, module.kernel_size, module.stride, module.padding,
module.dilation, module.ceil_mode, True)
xmax_pos, rmax_pos = torch.chunk(pool_to_unpool[module.__class__.__name__](
grad_output[0] * diffs, indices, module.kernel_size, module.stride,
module.padding, list(module.x.shape)), 2)
org_input_shape = grad_input[0].shape # for the maxpool 1d
grad_input = [None for _ in grad_input]
grad_input[0] = torch.where(torch.abs(delta_in) < 1e-7, torch.zeros_like(delta_in),
(xmax_pos + rmax_pos) / delta_in).repeat(dup0)
if module.__class__.__name__ == 'MaxPool1d':
complex_module_gradients.append(grad_input[0])
# the grad input that is returned doesn't matter, since it will immediately be
# be overridden by the grad in the complex_module_gradient
grad_input[0] = torch.ones(org_input_shape)
return tuple(grad_input)
def linear_1d(module, grad_input, grad_output):
"""No change made to gradients."""
return None
def nonlinear_1d(module, grad_input, grad_output):
delta_out = module.y[: int(module.y.shape[0] / 2)] - module.y[int(module.y.shape[0] / 2):]
delta_in = module.x[: int(module.x.shape[0] / 2)] - module.x[int(module.x.shape[0] / 2):]
dup0 = [2] + [1 for i in delta_in.shape[1:]]
# handles numerical instabilities where delta_in is very small by
# just taking the gradient in those cases
grads = [None for _ in grad_input]
grads[0] = torch.where(torch.abs(delta_in.repeat(dup0)) < 1e-6, grad_input[0],
grad_output[0] * (delta_out / delta_in).repeat(dup0))
return tuple(grads)
op_handler = {}
# passthrough ops, where we make no change to the gradient
op_handler['Dropout3d'] = passthrough
op_handler['Dropout2d'] = passthrough
op_handler['Dropout'] = passthrough
op_handler['AlphaDropout'] = passthrough
op_handler['Conv1d'] = linear_1d
op_handler['Conv2d'] = linear_1d
op_handler['Conv3d'] = linear_1d
op_handler['ConvTranspose1d'] = linear_1d
op_handler['ConvTranspose2d'] = linear_1d
op_handler['ConvTranspose3d'] = linear_1d
op_handler['Linear'] = linear_1d
op_handler['AvgPool1d'] = linear_1d
op_handler['AvgPool2d'] = linear_1d
op_handler['AvgPool3d'] = linear_1d
op_handler['AdaptiveAvgPool1d'] = linear_1d
op_handler['AdaptiveAvgPool2d'] = linear_1d
op_handler['AdaptiveAvgPool3d'] = linear_1d
op_handler['BatchNorm1d'] = linear_1d
op_handler['BatchNorm2d'] = linear_1d
op_handler['BatchNorm3d'] = linear_1d
op_handler['LeakyReLU'] = nonlinear_1d
op_handler['ReLU'] = nonlinear_1d
op_handler['ELU'] = nonlinear_1d
op_handler['Sigmoid'] = nonlinear_1d
op_handler["Tanh"] = nonlinear_1d
op_handler["Softplus"] = nonlinear_1d
op_handler['Softmax'] = nonlinear_1d
op_handler['MaxPool1d'] = maxpool
op_handler['MaxPool2d'] = maxpool
op_handler['MaxPool3d'] = maxpool
| mit |
thesheps/lemonade | src/Lemonade.Data/Commands/IUpdateFeature.cs | 157 | using Lemonade.Data.Entities;
namespace Lemonade.Data.Commands
{
public interface IUpdateFeature
{
void Execute(Feature feature);
}
} | mit |
eric-dowty/eric-dowty.github.io | apps/rails-pub-sub-node-server/node_modules/socket.io/node_modules/socket.io-client/node_modules/engine.io-client/node_modules/ws/node_modules/utf-8-validate/build/Makefile | 13890 | # We borrow heavily from the kernel build setup, though we are simpler since
# we don't have Kconfig tweaking settings on us.
# The implicit make rules have it looking for RCS files, among other things.
# We instead explicitly write all the rules we care about.
# It's even quicker (saves ~200ms) to pass -r on the command line.
MAKEFLAGS=-r
# The source directory tree.
srcdir := ..
abs_srcdir := $(abspath $(srcdir))
# The name of the builddir.
builddir_name ?= .
# The V=1 flag on command line makes us verbosely print command lines.
ifdef V
quiet=
else
quiet=quiet_
endif
# Specify BUILDTYPE=Release on the command line for a release build.
BUILDTYPE ?= Release
# Directory all our build output goes into.
# Note that this must be two directories beneath src/ for unit tests to pass,
# as they reach into the src/ directory for data with relative paths.
builddir ?= $(builddir_name)/$(BUILDTYPE)
abs_builddir := $(abspath $(builddir))
depsdir := $(builddir)/.deps
# Object output directory.
obj := $(builddir)/obj
abs_obj := $(abspath $(obj))
# We build up a list of every single one of the targets so we can slurp in the
# generated dependency rule Makefiles in one pass.
all_deps :=
CC.target ?= $(CC)
CFLAGS.target ?= $(CFLAGS)
CXX.target ?= $(CXX)
CXXFLAGS.target ?= $(CXXFLAGS)
LINK.target ?= $(LINK)
LDFLAGS.target ?= $(LDFLAGS)
AR.target ?= $(AR)
# C++ apps need to be linked with g++.
#
# Note: flock is used to serialize linking. Linking is a memory-intensive
# process so running parallel links can often lead to thrashing. To disable
# the serialization, override LINK via an environment variable as follows:
#
# export LINK=g++
#
# This will allow make to invoke N linker processes as specified in -jN.
LINK ?= ./gyp-mac-tool flock $(builddir)/linker.lock $(CXX.target)
# TODO(evan): move all cross-compilation logic to gyp-time so we don't need
# to replicate this environment fallback in make as well.
CC.host ?= gcc
CFLAGS.host ?=
CXX.host ?= g++
CXXFLAGS.host ?=
LINK.host ?= $(CXX.host)
LDFLAGS.host ?=
AR.host ?= ar
# Define a dir function that can handle spaces.
# http://www.gnu.org/software/make/manual/make.html#Syntax-of-Functions
# "leading spaces cannot appear in the text of the first argument as written.
# These characters can be put into the argument value by variable substitution."
empty :=
space := $(empty) $(empty)
# http://stackoverflow.com/questions/1189781/using-make-dir-or-notdir-on-a-path-with-spaces
replace_spaces = $(subst $(space),?,$1)
unreplace_spaces = $(subst ?,$(space),$1)
dirx = $(call unreplace_spaces,$(dir $(call replace_spaces,$1)))
# Flags to make gcc output dependency info. Note that you need to be
# careful here to use the flags that ccache and distcc can understand.
# We write to a dep file on the side first and then rename at the end
# so we can't end up with a broken dep file.
depfile = $(depsdir)/$(call replace_spaces,$@).d
DEPFLAGS = -MMD -MF $(depfile).raw
# We have to fixup the deps output in a few ways.
# (1) the file output should mention the proper .o file.
# ccache or distcc lose the path to the target, so we convert a rule of
# the form:
# foobar.o: DEP1 DEP2
# into
# path/to/foobar.o: DEP1 DEP2
# (2) we want missing files not to cause us to fail to build.
# We want to rewrite
# foobar.o: DEP1 DEP2 \
# DEP3
# to
# DEP1:
# DEP2:
# DEP3:
# so if the files are missing, they're just considered phony rules.
# We have to do some pretty insane escaping to get those backslashes
# and dollar signs past make, the shell, and sed at the same time.
# Doesn't work with spaces, but that's fine: .d files have spaces in
# their names replaced with other characters.
define fixup_dep
# The depfile may not exist if the input file didn't have any #includes.
touch $(depfile).raw
# Fixup path as in (1).
sed -e "s|^$(notdir $@)|$@|" $(depfile).raw >> $(depfile)
# Add extra rules as in (2).
# We remove slashes and replace spaces with new lines;
# remove blank lines;
# delete the first line and append a colon to the remaining lines.
sed -e 's|\\||' -e 'y| |\n|' $(depfile).raw |\
grep -v '^$$' |\
sed -e 1d -e 's|$$|:|' \
>> $(depfile)
rm $(depfile).raw
endef
# Command definitions:
# - cmd_foo is the actual command to run;
# - quiet_cmd_foo is the brief-output summary of the command.
quiet_cmd_cc = CC($(TOOLSET)) $@
cmd_cc = $(CC.$(TOOLSET)) $(GYP_CFLAGS) $(DEPFLAGS) $(CFLAGS.$(TOOLSET)) -c -o $@ $<
quiet_cmd_cxx = CXX($(TOOLSET)) $@
cmd_cxx = $(CXX.$(TOOLSET)) $(GYP_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
quiet_cmd_objc = CXX($(TOOLSET)) $@
cmd_objc = $(CC.$(TOOLSET)) $(GYP_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $<
quiet_cmd_objcxx = CXX($(TOOLSET)) $@
cmd_objcxx = $(CXX.$(TOOLSET)) $(GYP_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $<
# Commands for precompiled header files.
quiet_cmd_pch_c = CXX($(TOOLSET)) $@
cmd_pch_c = $(CC.$(TOOLSET)) $(GYP_PCH_CFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
quiet_cmd_pch_cc = CXX($(TOOLSET)) $@
cmd_pch_cc = $(CC.$(TOOLSET)) $(GYP_PCH_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
quiet_cmd_pch_m = CXX($(TOOLSET)) $@
cmd_pch_m = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $<
quiet_cmd_pch_mm = CXX($(TOOLSET)) $@
cmd_pch_mm = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $<
# gyp-mac-tool is written next to the root Makefile by gyp.
# Use $(4) for the command, since $(2) and $(3) are used as flags by do_cmd
# already.
quiet_cmd_mac_tool = MACTOOL $(4) $<
cmd_mac_tool = ./gyp-mac-tool $(4) $< "$@"
quiet_cmd_mac_package_framework = PACKAGE FRAMEWORK $@
cmd_mac_package_framework = ./gyp-mac-tool package-framework "$@" $(4)
quiet_cmd_infoplist = INFOPLIST $@
cmd_infoplist = $(CC.$(TOOLSET)) -E -P -Wno-trigraphs -x c $(INFOPLIST_DEFINES) "$<" -o "$@"
quiet_cmd_touch = TOUCH $@
cmd_touch = touch $@
quiet_cmd_copy = COPY $@
# send stderr to /dev/null to ignore messages when linking directories.
cmd_copy = rm -rf "$@" && cp -af "$<" "$@"
quiet_cmd_alink = LIBTOOL-STATIC $@
cmd_alink = rm -f $@ && ./gyp-mac-tool filter-libtool libtool $(GYP_LIBTOOLFLAGS) -static -o $@ $(filter %.o,$^)
quiet_cmd_link = LINK($(TOOLSET)) $@
cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS)
quiet_cmd_solink = SOLINK($(TOOLSET)) $@
cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS)
quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
cmd_solink_module = $(LINK.$(TOOLSET)) -bundle $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS)
# Define an escape_quotes function to escape single quotes.
# This allows us to handle quotes properly as long as we always use
# use single quotes and escape_quotes.
escape_quotes = $(subst ','\'',$(1))
# This comment is here just to include a ' to unconfuse syntax highlighting.
# Define an escape_vars function to escape '$' variable syntax.
# This allows us to read/write command lines with shell variables (e.g.
# $LD_LIBRARY_PATH), without triggering make substitution.
escape_vars = $(subst $$,$$$$,$(1))
# Helper that expands to a shell command to echo a string exactly as it is in
# make. This uses printf instead of echo because printf's behaviour with respect
# to escape sequences is more portable than echo's across different shells
# (e.g., dash, bash).
exact_echo = printf '%s\n' '$(call escape_quotes,$(1))'
# Helper to compare the command we're about to run against the command
# we logged the last time we ran the command. Produces an empty
# string (false) when the commands match.
# Tricky point: Make has no string-equality test function.
# The kernel uses the following, but it seems like it would have false
# positives, where one string reordered its arguments.
# arg_check = $(strip $(filter-out $(cmd_$(1)), $(cmd_$@)) \
# $(filter-out $(cmd_$@), $(cmd_$(1))))
# We instead substitute each for the empty string into the other, and
# say they're equal if both substitutions produce the empty string.
# .d files contain ? instead of spaces, take that into account.
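# For example (illustrative): if the logged command was "cc -c foo.c" and the
# new command is "cc -O2 -c foo.c", neither substitution yields the empty
# string, so command_changed is non-empty and the rule is re-run.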
command_changed = $(or $(subst $(cmd_$(1)),,$(cmd_$(call replace_spaces,$@))),\
$(subst $(cmd_$(call replace_spaces,$@)),,$(cmd_$(1))))
# Helper that is non-empty when a prerequisite changes.
# Normally make does this implicitly, but we force rules to always run
# so we can check their command lines.
# $? -- new prerequisites
# $| -- order-only dependencies
prereq_changed = $(filter-out FORCE_DO_CMD,$(filter-out $|,$?))
# Helper that executes all postbuilds until one fails.
define do_postbuilds
@E=0;\
for p in $(POSTBUILDS); do\
eval $$p;\
E=$$?;\
if [ $$E -ne 0 ]; then\
break;\
fi;\
done;\
if [ $$E -ne 0 ]; then\
rm -rf "$@";\
exit $$E;\
fi
endef
# do_cmd: run a command via the above cmd_foo names, if necessary.
# Should always run for a given target to handle command-line changes.
# Second argument, if non-zero, makes it do asm/C/C++ dependency munging.
# Third argument, if non-zero, makes it do POSTBUILDS processing.
# Note: We intentionally do NOT call dirx for depfile, since it contains ? for
# spaces already and dirx strips the ? characters.
define do_cmd
$(if $(or $(command_changed),$(prereq_changed)),
@$(call exact_echo, $($(quiet)cmd_$(1)))
@mkdir -p "$(call dirx,$@)" "$(dir $(depfile))"
$(if $(findstring flock,$(word 2,$(cmd_$1))),
@$(cmd_$(1))
@echo " $(quiet_cmd_$(1)): Finished",
@$(cmd_$(1))
)
@$(call exact_echo,$(call escape_vars,cmd_$(call replace_spaces,$@) := $(cmd_$(1)))) > $(depfile)
@$(if $(2),$(fixup_dep))
$(if $(and $(3), $(POSTBUILDS)),
$(call do_postbuilds)
)
)
endef
# Declare the "all" target first so it is the default,
# even though we don't have the deps yet.
.PHONY: all
all:
# make looks for ways to re-generate included makefiles, but in our case, we
# don't have a direct way. Explicitly telling make that it has nothing to do
# for them makes it go faster.
%.d: ;
# Use FORCE_DO_CMD to force a target to run. Should be coupled with
# do_cmd.
.PHONY: FORCE_DO_CMD
FORCE_DO_CMD:
TOOLSET := target
# Suffix rules, putting all outputs into $(obj).
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.c FORCE_DO_CMD
@$(call do_cmd,cc,1)
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.cc FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.cpp FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.cxx FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.m FORCE_DO_CMD
@$(call do_cmd,objc,1)
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.mm FORCE_DO_CMD
@$(call do_cmd,objcxx,1)
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.S FORCE_DO_CMD
@$(call do_cmd,cc,1)
$(obj).$(TOOLSET)/%.o: $(srcdir)/%.s FORCE_DO_CMD
@$(call do_cmd,cc,1)
# Try building from generated source, too.
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.c FORCE_DO_CMD
@$(call do_cmd,cc,1)
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.cc FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.cpp FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.cxx FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.m FORCE_DO_CMD
@$(call do_cmd,objc,1)
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.mm FORCE_DO_CMD
@$(call do_cmd,objcxx,1)
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.S FORCE_DO_CMD
@$(call do_cmd,cc,1)
$(obj).$(TOOLSET)/%.o: $(obj).$(TOOLSET)/%.s FORCE_DO_CMD
@$(call do_cmd,cc,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.c FORCE_DO_CMD
@$(call do_cmd,cc,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.cc FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.cpp FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.cxx FORCE_DO_CMD
@$(call do_cmd,cxx,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.m FORCE_DO_CMD
@$(call do_cmd,objc,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.mm FORCE_DO_CMD
@$(call do_cmd,objcxx,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.S FORCE_DO_CMD
@$(call do_cmd,cc,1)
$(obj).$(TOOLSET)/%.o: $(obj)/%.s FORCE_DO_CMD
@$(call do_cmd,cc,1)
ifeq ($(strip $(foreach prefix,$(NO_LOAD),\
$(findstring $(join ^,$(prefix)),\
$(join ^,validation.target.mk)))),)
include validation.target.mk
endif
quiet_cmd_regen_makefile = ACTION Regenerating $@
cmd_regen_makefile = cd $(srcdir); /usr/local/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp_main.py -fmake --ignore-environment "--toplevel-dir=." -I/Users/ericdowty/Techblog/apps/rails-pub-sub-node-server/node_modules/socket.io/node_modules/socket.io-client/node_modules/engine.io-client/node_modules/ws/node_modules/utf-8-validate/build/config.gypi -I/usr/local/lib/node_modules/npm/node_modules/node-gyp/addon.gypi -I/Users/ericdowty/.node-gyp/0.12.4/common.gypi "--depth=." "-Goutput_dir=." "--generator-output=build" "-Dlibrary=shared_library" "-Dvisibility=default" "-Dnode_root_dir=/Users/ericdowty/.node-gyp/0.12.4" "-Dmodule_root_dir=/Users/ericdowty/Techblog/apps/rails-pub-sub-node-server/node_modules/socket.io/node_modules/socket.io-client/node_modules/engine.io-client/node_modules/ws/node_modules/utf-8-validate" binding.gyp
Makefile: $(srcdir)/../../../../../../../../../../../../../.node-gyp/0.12.4/common.gypi $(srcdir)/../../../../../../../../../../../../../../../usr/local/lib/node_modules/npm/node_modules/node-gyp/addon.gypi $(srcdir)/build/config.gypi $(srcdir)/binding.gyp
$(call do_cmd,regen_makefile)
# "all" is a concatenation of the "all" targets from all the included
# sub-makefiles. This is just here to clarify.
all:
# Add in dependency-tracking rules. $(all_deps) is the list of every single
# target in our tree. Only consider the ones with .d (dependency) info:
d_files := $(wildcard $(foreach f,$(all_deps),$(depsdir)/$(f).d))
ifneq ($(d_files),)
include $(d_files)
endif
| mit |
jsguy/misojs-codemirror-component | codemirror.component.js | 2410 | /*
Misojs Codemirror component
*/
var m = require('mithril'),
basePath = "external/codemirror/",
pjson = require("./package.json");
// Here we have a few fixes to make CodeMirror work in node - we only set each one
// up if it doesn't already exist, otherwise we would override the browser versions
global.document = global.document || {};
global.document.createElement = global.document.createElement || function(){
return {
setAttribute: function(){}
};
};
global.window = global.window || {};
global.window.getSelection = global.window.getSelection || function(){
return false;
};
global.navigator = global.navigator || {};
global.navigator.userAgent = global.navigator.userAgent || "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.130 Safari/537.36";
// Grab code mirror and the javascript language
// Note: you cannot dynamically require with browserify,
// so we must get whatever modes we need here.
// If you need other languages, simply require them statically in your program.
var CodeMirror = require('codemirror');
require("codemirror/mode/javascript/javascript.js");
require("codemirror/mode/htmlmixed/htmlmixed.js");
require("codemirror/mode/css/css.js");
// Our component
var CodemirrorComponent = {
// Returns a textarea
view: function(ctrl, attrs) {
return m("div", [
// It is ok to include CSS here - the browser will cache it,
// though a more ideal setup would be the ability to load only
// once when required.
m("LINK", { href: basePath + "lib/codemirror.css", rel: "stylesheet"}),
m("textarea", {config: CodemirrorComponent.config(attrs)}, attrs.value())
]);
},
config: function(attrs) {
return function(element, isInitialized) {
if(typeof CodeMirror !== 'undefined') {
if (!isInitialized) {
var editor = CodeMirror.fromTextArea(element, {
lineNumbers: true
});
editor.on("change", function(instance, object) {
m.startComputation();
attrs.value(editor.doc.getValue());
if (typeof attrs.onchange == "function"){
attrs.onchange(instance, object);
}
m.endComputation();
});
}
} else {
console.warn('ERROR: You need Codemirror in the page');
}
};
}
};
// Allow the user to pass in arguments when loading.
module.exports = function(args){
if(args && args.basePath) {
basePath = args.basePath;
}
return CodemirrorComponent;
}; | mit |
speedland/wcg | supports/github/api.go | 3000 | // github package provides an API client for github.com
//
// Copyright (C) 2014 Yohei Sasaki
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
//
package github
import (
"bytes"
"encoding/json"
"fmt"
"io/ioutil"
"net/http"
"strconv"
"time"
)
const BaseUrl = "https://api.github.com"
type Client struct {
RateLimit int
RateLimitRemaining int
RateLimitReset time.Time
baseUrl string
client *http.Client
}
func NewClient(c *http.Client) *Client {
return &Client{
baseUrl: BaseUrl,
client: c,
}
}
type MarkdownMode string
var Markdown = MarkdownMode("markdown")
var Gfm = MarkdownMode("gfm")
type ApiError struct {
Status int
Body string
*Client
}
func (e *ApiError) Error() string {
return fmt.Sprintf("Github API Error: %d - %v", e.Status, e.Body)
}
func NewApiError(status int, body string, c *Client) *ApiError {
return &ApiError{Status: status, Body: body, Client: c}
}
func IsApiError(err error) bool {
switch err.(type) {
case *ApiError:
return true
default:
return false
}
}
// Call /markdown API
// See: https://developer.github.com/v3/markdown/
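// Example (illustrative; the http.Client and repository name are placeholders):
//
// c := NewClient(http.DefaultClient)
// html, err := c.Markdown("**hello**", Gfm, "octocat/Hello-World")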
func (g *Client) Markdown(text string, mode MarkdownMode, context string) (string, error) {
url := g.baseUrl + "/markdown"
body := map[string]string{
"text": text,
"mode": string(mode),
"context": context,
}
buff, _ := json.Marshal(body)
resp, err := g.client.Post(url, "application/json", bytes.NewBuffer(buff))
if err != nil {
return "", err
}
defer resp.Body.Close()
buff, err = ioutil.ReadAll(resp.Body)
if err != nil {
return "", err
}
g.updateRateLimit(resp)
if resp.StatusCode != http.StatusOK {
return "", NewApiError(resp.StatusCode, string(buff), g)
}
return string(buff), nil
}
// LimitExceeded reports whether the client has exceeded its rate limit.
func (g *Client) LimitExceeded() bool {
if g.RateLimit == 0 && g.RateLimitRemaining == 0 { // initial value
return false
}
return g.RateLimitRemaining == 0
}
func (g *Client) updateRateLimit(resp *http.Response) {
limit := resp.Header.Get("X-Ratelimit-Limit")
i, err := strconv.ParseInt(limit, 10, 32)
if err == nil {
g.RateLimit = int(i)
}
remaining := resp.Header.Get("X-Ratelimit-Remaining")
i, err = strconv.ParseInt(remaining, 10, 32)
if err == nil {
g.RateLimitRemaining = int(i)
}
reset := resp.Header.Get("X-Ratelimit-Reset")
i, err = strconv.ParseInt(reset, 10, 32)
if err == nil {
g.RateLimitReset = time.Unix(i, 0)
}
}
| mit |
orocrm/platform | src/Oro/Bundle/AttachmentBundle/Tests/Unit/Entity/FileTest.php | 3094 | <?php
namespace Oro\Bundle\AttachmentBundle\Tests\Unit\Entity;
use Oro\Bundle\AttachmentBundle\Entity\File;
use Oro\Bundle\UserBundle\Entity\User;
use Oro\Component\Testing\Unit\EntityTestCaseTrait;
use Oro\Component\Testing\Unit\EntityTrait;
use Symfony\Component\HttpFoundation\File\File as ComponentFile;
class FileTest extends \PHPUnit\Framework\TestCase
{
use EntityTestCaseTrait;
use EntityTrait;
/** @var File */
private $entity;
protected function setUp()
{
$this->entity = new File();
}
public function testAccessors(): void
{
$properties = [
['id', 1],
['uuid', '123e4567-e89b-12d3-a456-426655440000', false],
['owner', new User()],
['filename', 'sample_filename'],
['extension', 'smplext'],
['mimeType', 'sample/mime-type'],
['originalFilename', 'sample_original_filename'],
['fileSize', 12345],
['parentEntityClass', \stdClass::class],
['parentEntityId', 2],
['parentEntityFieldName', 'sampleFieldName'],
['createdAt', new \DateTime('today')],
['updatedAt', new \DateTime('today')],
['file', new ComponentFile('sample/file', false)],
['emptyFile', true],
];
static::assertPropertyAccessors($this->entity, $properties);
}
public function testPrePersists(): void
{
$testDate = new \DateTime('now', new \DateTimeZone('UTC'));
$this->entity->prePersist();
$this->entity->preUpdate();
$this->assertEquals($testDate->format('Y-m-d'), $this->entity->getCreatedAt()->format('Y-m-d'));
$this->assertEquals($testDate->format('Y-m-d'), $this->entity->getUpdatedAt()->format('Y-m-d'));
}
public function testEmptyFile(): void
{
$this->assertNull($this->entity->isEmptyFile());
$this->entity->setEmptyFile(true);
$this->assertTrue($this->entity->isEmptyFile());
}
public function testToString(): void
{
$this->assertSame('', $this->entity->__toString());
$this->entity->setFilename('file.doc');
$this->entity->setOriginalFilename('original.doc');
$this->assertEquals('file.doc (original.doc)', $this->entity->__toString());
}
public function testSerialize(): void
{
$this->assertSame(serialize([null, null, $this->entity->getUuid()]), $this->entity->serialize());
$this->assertEquals(
serialize([1, 'sample_filename', 'test-uuid']),
$this->getEntity(
File::class,
['id' => 1, 'filename' => 'sample_filename', 'uuid' => 'test-uuid']
)->serialize()
);
}
public function testUnserialize(): void
{
$this->entity->unserialize(serialize([1, 'sample_filename', 'test-uuid']));
$this->assertSame('sample_filename', $this->entity->getFilename());
$this->assertSame(1, $this->entity->getId());
$this->assertSame('test-uuid', $this->entity->getUuid());
}
}
| mit |
chrisvroberts/json_diff | lib/json_diff.rb | 2550 | require 'json_diff/version'
# Provides helper methods to compare object trees (like those generated by JSON.parse)
# and generate a list of their differences.
module JSONDiff
# Generates an Array of differences between the two supplied object trees with Hash roots.
#
# @param a [Hash] the left hand side of the comparison.
# @param b [Hash] the right hand side of the comparison.
# @param path [String] the JSON path at which `a` and `b` are found in a larger object tree.
# @return [Array<String>] the differences found between `a` and `b`.
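# @example Comparing two flat objects (illustrative)
#   JSONDiff.objects({'a' => 1}, {'a' => 2})
#   #=> ["value mismatch: /a '1' != '2'"]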
def self.objects(a, b, path='')
differences = []
a.each do |k, v|
if b.has_key? k
if v.class != b[k].class
differences << "type mismatch: #{path}/#{k} '#{v.class}' != '#{b[k].class}'"
else
if v.is_a? Hash
differences += objects(v, b[k], "#{path}/#{k}")
elsif v.is_a? Array
differences += arrays(v, b[k], "#{path}/#{k}")
elsif v != b[k] # String, TrueClass, FalseClass, NilClass, Float, Fixnum
differences << "value mismatch: #{path}/#{k} '#{v}' != '#{b[k]}'"
end
end
else
differences << "b is missing: #{path}/#{k}"
end
end
(b.keys - a.keys).each do |k|
differences << "a is missing: #{path}/#{k}"
end
differences
end
# Generates an Array of differences between the two supplied object trees with Array roots.
#
# @param a [Array] the left hand side of the comparison.
# @param b [Array] the right hand side of the comparison.
# @param path [String] the JSON path at which `a` and `b` are found in a larger object tree.
# @return [Array<String>] the differences found between `a` and `b`.
def self.arrays(a, b, path='/')
differences = []
if a.size != b.size
differences << "size mismatch: #{path}"
else
a.zip(b).each_with_index do |pair, index|
if pair[0].class != pair[1].class
differences << "type mismatch: #{path}[#{index}] '#{pair[0].class}' != '#{pair[1].class}'"
else
if pair[0].is_a? Hash
differences += objects(pair[0], pair[1], "#{path}[#{index}]")
elsif pair[0].is_a? Array
differences += arrays(pair[0], pair[1], "#{path}[#{index}]")
elsif pair[0] != pair[1] # String, TrueClass, FalseClass, NilClass, Float, Fixnum
differences << "value mismatch: #{path}[#{index}] '#{pair[0]}' != '#{pair[1]}'"
end
end
end
end
differences
end
end
| mit |
timorleste/health-facility | att/health_facilities_dbf257.html | 5889 | <html>
<head>
<meta http-equiv="Page-Enter" content="revealTrans(Duration=4,Transition=12)">
<meta http-equiv="content-type" content="text/html; charset=utf-8">
<meta name="language" content="en" />
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1"/>
<meta name="viewport" content="width=device-width"/>
<meta name="author" content="T">
<META NAME="robots" content="index">
<META NAME="robots" content="follow">
<meta name="description" content="Our mission is to help facilitate the newly independent East-Timor in accessing the voices and the key events in its past and fostering an understanding of its unique cultural heritage; to help unite and secure the identity of the country and its people; and to contribute to the forging of a new democratic nation. To ensure the survival of unique resources which contain and communicate the experience and the story of Timor-Leste and to place these at the service of peace and understanding, in Timor-Leste, Regionally and Internationally.">
<meta name="keywords" content="CAMSTL, East-Timor, Timor-Leste, Timor-Lorosae">
<title>Raster and Vector Attributes</title>
</head>
<body bgcolor="#FFFFFF">
<table width="90" border="0" cellspacing="2" cellpadding="2" align="Center">
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Id
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
386
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Facility
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
Railacoleten HP
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
District
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
Ermera
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Code
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
HP
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Owner_
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
Government
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Management
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
Railaco CHC
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Visitmnth
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
0
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Physician
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
0.00
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Nurse
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
1.00
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Midwife
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
0.00
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Dental
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
0.00
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Lab
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
0.00
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Ass_nurse
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
0.00
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Other
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
0.00
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Assumption
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Check_date
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
20030211
</font>
</td>
</tr>
<tr>
<td bgcolor="#CC3333">
<font face="Courier New, Courier, mono" size="-1">
Verified
</font>
</td>
<td bgcolor="#CCCCCC">
<font face="Courier New, Courier, mono" size="-1">
CHC verified
</font>
</td>
</tr>
</table>
</body>
</html>
| mit |
KlishGroup/prose-pogs | pogs/A/ATLNBHX/E5A/index.md | 2613 | ---
layout: page
title: Erickson 50th Anniversary
date: 2016-05-24
author: Kyle Fitzpatrick
tags: weekly links, java
status: published
summary: Ut porta eleifend purus, vitae semper nunc blandit.
banner: images/banner/office-01.jpg
booking:
startDate: 09/28/2017
endDate: 09/29/2017
ctyhocn: ATLNBHX
groupCode: E5A
published: true
---
Mauris volutpat, turpis eget convallis convallis, felis est vulputate neque, eu laoreet nulla mi at leo. Nam eu aliquam tellus. Morbi nec eleifend tellus, pretium imperdiet mauris. Maecenas vitae porta odio, vitae fringilla purus. Duis eu euismod nunc. Aenean ac consectetur lacus. Vivamus turpis nulla, bibendum mattis consectetur vel, tempor vitae turpis. Aliquam quis feugiat enim. Morbi et finibus urna, quis elementum dui. Phasellus a orci sapien. Suspendisse vestibulum, risus vel ultrices ultrices, eros tortor tristique sapien, a eleifend tortor turpis et velit. Aliquam venenatis leo est, a cursus ante lacinia ac. Sed id libero lorem.
1 Nunc dignissim nulla consectetur leo commodo, vel aliquet tellus cursus.
Etiam eget odio nec tellus ultricies semper et faucibus lectus. Fusce vel odio tellus. Mauris efficitur massa in purus fermentum congue. Donec euismod massa lacus, a convallis nisi egestas ut. Praesent sodales, mi a convallis commodo, turpis tortor sollicitudin ipsum, vitae iaculis tortor ligula in ante. Vestibulum elementum eleifend dolor eu tincidunt. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Mauris efficitur tortor sit amet ante condimentum, ut aliquam metus porttitor. Vivamus a nisi nec neque mattis dignissim. Nunc vel facilisis erat. Sed porttitor enim vitae rhoncus placerat. Ut tincidunt lacus nec dignissim imperdiet. Integer tristique libero justo. Curabitur convallis urna in purus tempor auctor ac id elit.
Praesent mauris lorem, pharetra et orci sed, rhoncus tempus libero. Maecenas sed nibh a mauris porta elementum rutrum a erat. Cras diam ex, venenatis at bibendum eget, pellentesque a ipsum. Donec ut malesuada urna. Aliquam neque mauris, volutpat nec ullamcorper vel, lobortis eu urna. Ut imperdiet porta ante, ac vehicula augue. Donec vulputate magna sit amet turpis volutpat, quis ultrices libero luctus. Duis dignissim magna pellentesque gravida facilisis. Proin ut libero vel orci aliquam malesuada ut in ex. Aliquam semper elit velit. Donec aliquet metus vitae ultrices ultricies. Ut laoreet malesuada ante ac ultricies. Phasellus nec augue nibh. Vestibulum nec venenatis quam, quis dictum dolor. Nulla tortor tortor, iaculis nec nulla sit amet, ullamcorper lobortis augue.
| mit |
nadimtuhin/facebook-activity-monitor | src/content/store/Story.js | 1020 | import { observable, action } from 'mobx';
import Fuse from 'fuse.js';
import Activity from './../utils/Activity';
import noop from 'lodash/noop';
import uniqBy from 'lodash/uniqBy';
const inactive = Activity(500);
export default class Story {
@observable keyword = '';
@observable allStories = [];
@observable stories = [];
@action search(keyword) {
this.keyword = keyword;
inactive().then(() => {
this.stories = this.searchStories(this.allStories, this.keyword);
}, noop);
}
@action addStories(stories){
this.allStories = uniqBy(this.allStories.concat(stories), story => story.key);
this.stories = this.searchStories(this.allStories, this.keyword);
}
searchStories(stories, keyword) {
/**
 * threshold controls how fuzzy the match is:
 * 0.0 requires an exact match, 1.0 matches anything.
 * @type {{threshold: number, keys: string[]}}
 */
const options = {
threshold: 0.2,
keys: ['text']
};
const google = new Fuse(stories, options);
return google.search(keyword) || [];
}
}
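// Illustrative usage sketch (the story shape is an assumption; only `key` and
// `text` are used by this store, and Activity(500) is assumed to resolve after
// ~500ms of inactivity):
//
// const store = new Story();
// store.addStories([{ key: '1', text: 'hello world' }]);
// store.search('hello'); // store.stories refreshes once input goes quiet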
| mit |
shaggytwodope/rtv | rtv/submission_page.py | 11574 | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
import time
import curses
from . import docs
from .content import SubmissionContent, SubredditContent
from .page import Page, PageController, logged_in
from .objects import Navigator, Color, Command
from .exceptions import TemporaryFileError
class SubmissionController(PageController):
character_map = {}
class SubmissionPage(Page):
FOOTER = docs.FOOTER_SUBMISSION
def __init__(self, reddit, term, config, oauth, url=None, submission=None):
super(SubmissionPage, self).__init__(reddit, term, config, oauth)
self.controller = SubmissionController(self, keymap=config.keymap)
if url:
self.content = SubmissionContent.from_url(
reddit, url, term.loader,
max_comment_cols=config['max_comment_cols'])
else:
self.content = SubmissionContent(
submission, term.loader,
max_comment_cols=config['max_comment_cols'])
# Start at the submission post, which is indexed as -1
self.nav = Navigator(self.content.get, page_index=-1)
self.selected_subreddit = None
@SubmissionController.register(Command('SUBMISSION_TOGGLE_COMMENT'))
def toggle_comment(self):
"Toggle the selected comment tree between visible and hidden"
current_index = self.nav.absolute_index
self.content.toggle(current_index)
# This logic handles a display edge case after a comment toggle. We
# want to make sure that when we re-draw the page, the cursor stays at
# its current absolute position on the screen. In order to do this,
# apply a fixed offset if, while inverted, we either try to hide the
# bottom comment or toggle any of the middle comments.
if self.nav.inverted:
data = self.content.get(current_index)
if data['hidden'] or self.nav.cursor_index != 0:
window = self._subwindows[-1][0]
n_rows, _ = window.getmaxyx()
self.nav.flip(len(self._subwindows) - 1)
self.nav.top_item_height = n_rows
@SubmissionController.register(Command('SUBMISSION_EXIT'))
def exit_submission(self):
"Close the submission and return to the subreddit page"
self.active = False
@SubmissionController.register(Command('REFRESH'))
def refresh_content(self, order=None, name=None):
"Re-download comments and reset the page index"
order = order or self.content.order
url = name or self.content.name
with self.term.loader('Refreshing page'):
self.content = SubmissionContent.from_url(
self.reddit, url, self.term.loader, order=order,
max_comment_cols=self.config['max_comment_cols'])
if not self.term.loader.exception:
self.nav = Navigator(self.content.get, page_index=-1)
@SubmissionController.register(Command('PROMPT'))
def prompt_subreddit(self):
"Open a prompt to navigate to a different subreddit"
name = self.term.prompt_input('Enter page: /')
if name is not None:
with self.term.loader('Loading page'):
content = SubredditContent.from_name(
self.reddit, name, self.term.loader)
if not self.term.loader.exception:
self.selected_subreddit = content
self.active = False
@SubmissionController.register(Command('SUBMISSION_OPEN_IN_BROWSER'))
def open_link(self):
"Open the selected item with the webbrowser"
data = self.get_selected_item()
url = data.get('permalink')
if url:
self.term.open_browser(url)
else:
self.term.flash()
@SubmissionController.register(Command('SUBMISSION_OPEN_IN_PAGER'))
def open_pager(self):
"Open the selected item with the system's pager"
data = self.get_selected_item()
if data['type'] == 'Submission':
text = '\n\n'.join((data['permalink'], data['text']))
self.term.open_pager(text)
elif data['type'] == 'Comment':
text = '\n\n'.join((data['permalink'], data['body']))
self.term.open_pager(text)
else:
self.term.flash()
@SubmissionController.register(Command('SUBMISSION_POST'))
@logged_in
def add_comment(self):
"""
Submit a reply to the selected item.
Selected item:
Submission - add a top level comment
Comment - add a comment reply
"""
data = self.get_selected_item()
if data['type'] == 'Submission':
body = data['text']
reply = data['object'].add_comment
elif data['type'] == 'Comment':
body = data['body']
reply = data['object'].reply
else:
self.term.flash()
return
# Construct the text that will be displayed in the editor file.
# The post body will be commented out and added for reference
lines = ['# |' + line for line in body.split('\n')]
content = '\n'.join(lines)
comment_info = docs.COMMENT_FILE.format(
author=data['author'],
type=data['type'].lower(),
content=content)
with self.term.open_editor(comment_info) as comment:
if not comment:
self.term.show_notification('Canceled')
return
with self.term.loader('Posting', delay=0):
reply(comment)
# Give reddit time to process the submission
time.sleep(2.0)
if self.term.loader.exception is None:
self.refresh_content()
else:
raise TemporaryFileError()
@SubmissionController.register(Command('DELETE'))
@logged_in
def delete_comment(self):
"Delete the selected comment"
if self.get_selected_item()['type'] == 'Comment':
self.delete_item()
else:
self.term.flash()
@SubmissionController.register(Command('SUBMISSION_OPEN_IN_URLVIEWER'))
def comment_urlview(self):
data = self.get_selected_item()
comment = data.get('body') or data.get('text') or data.get('url_full')
if comment:
self.term.open_urlview(comment)
else:
self.term.flash()
def _draw_item(self, win, data, inverted):
if data['type'] == 'MoreComments':
return self._draw_more_comments(win, data)
elif data['type'] == 'HiddenComment':
return self._draw_more_comments(win, data)
elif data['type'] == 'Comment':
return self._draw_comment(win, data, inverted)
else:
return self._draw_submission(win, data)
def _draw_comment(self, win, data, inverted):
n_rows, n_cols = win.getmaxyx()
n_cols -= 1
# Handle the case where the window is not large enough to fit the text.
valid_rows = range(0, n_rows)
offset = 0 if not inverted else -(data['n_rows'] - n_rows)
# If there isn't enough space to fit the comment body on the screen,
# replace the last line with a notification.
split_body = data['split_body']
if data['n_rows'] > n_rows:
# Only when there is a single comment on the page and not inverted
if not inverted and len(self._subwindows) == 0:
cutoff = data['n_rows'] - n_rows + 1
split_body = split_body[:-cutoff]
split_body.append('(Not enough space to display)')
row = offset
if row in valid_rows:
attr = curses.A_BOLD
attr |= (Color.BLUE if not data['is_author'] else Color.GREEN)
self.term.add_line(win, '{author} '.format(**data), row, 1, attr)
if data['flair']:
attr = curses.A_BOLD | Color.YELLOW
self.term.add_line(win, '{flair} '.format(**data), attr=attr)
text, attr = self.term.get_arrow(data['likes'])
self.term.add_line(win, text, attr=attr)
self.term.add_line(win, ' {score} {created} '.format(**data))
if data['gold']:
text, attr = self.term.guilded
self.term.add_line(win, text, attr=attr)
if data['stickied']:
text, attr = '[stickied]', Color.GREEN
self.term.add_line(win, text, attr=attr)
if data['saved']:
text, attr = '[saved]', Color.GREEN
self.term.add_line(win, text, attr=attr)
for row, text in enumerate(split_body, start=offset+1):
if row in valid_rows:
self.term.add_line(win, text, row, 1)
# Unfortunately vline() doesn't support custom color so we have to
# build it one segment at a time.
attr = Color.get_level(data['level'])
x = 0
for y in range(n_rows):
self.term.addch(win, y, x, self.term.vline, attr)
return attr | self.term.vline
def _draw_more_comments(self, win, data):
n_rows, n_cols = win.getmaxyx()
n_cols -= 1
self.term.add_line(win, '{body}'.format(**data), 0, 1)
self.term.add_line(
win, ' [{count}]'.format(**data), attr=curses.A_BOLD)
attr = Color.get_level(data['level'])
self.term.addch(win, 0, 0, self.term.vline, attr)
return attr | self.term.vline
def _draw_submission(self, win, data):
n_rows, n_cols = win.getmaxyx()
n_cols -= 3 # one for each side of the border + one for offset
for row, text in enumerate(data['split_title'], start=1):
self.term.add_line(win, text, row, 1, curses.A_BOLD)
row = len(data['split_title']) + 1
attr = curses.A_BOLD | Color.GREEN
self.term.add_line(win, '{author}'.format(**data), row, 1, attr)
attr = curses.A_BOLD | Color.YELLOW
if data['flair']:
self.term.add_line(win, ' {flair}'.format(**data), attr=attr)
self.term.add_line(win, ' {created} {subreddit}'.format(**data))
row = len(data['split_title']) + 2
attr = curses.A_UNDERLINE | Color.BLUE
self.term.add_line(win, '{url}'.format(**data), row, 1, attr)
offset = len(data['split_title']) + 3
# Cut off text if there is not enough room to display the whole post
split_text = data['split_text']
if data['n_rows'] > n_rows:
cutoff = data['n_rows'] - n_rows + 1
split_text = split_text[:-cutoff]
split_text.append('(Not enough space to display)')
for row, text in enumerate(split_text, start=offset):
self.term.add_line(win, text, row, 1)
row = len(data['split_title']) + len(split_text) + 3
self.term.add_line(win, '{score} '.format(**data), row, 1)
text, attr = self.term.get_arrow(data['likes'])
self.term.add_line(win, text, attr=attr)
self.term.add_line(win, ' {comments} '.format(**data))
if data['gold']:
text, attr = self.term.guilded
self.term.add_line(win, text, attr=attr)
if data['nsfw']:
text, attr = 'NSFW', (curses.A_BOLD | Color.RED)
self.term.add_line(win, text, attr=attr)
if data['saved']:
text, attr = '[saved]', Color.GREEN
self.term.add_line(win, text, attr=attr)
win.border()
| mit |
kasoki/project-zombye | src/source/zombye/gameplay/gameplay_system.cpp | 1741 | #include <zombye/core/game.hpp>
#include <zombye/gameplay/camera_follow_component.hpp>
#include <zombye/gameplay/game_states.hpp>
#include <zombye/gameplay/gameplay_system.hpp>
#include <zombye/gameplay/states/menu_state.hpp>
#include <zombye/gameplay/states/play_state.hpp>
#include <zombye/gameplay/state_component.hpp>
#include <zombye/scripting/scripting_system.hpp>
#include <zombye/utils/state_machine.hpp>
#include <zombye/utils/component_helper.hpp>
zombye::gameplay_system::gameplay_system(zombye::game *game) {
sm_ = std::unique_ptr<zombye::state_machine>(new zombye::state_machine(game));
init_game_states();
}
void zombye::gameplay_system::init_game_states() {
sm_->add<zombye::menu_state>(GAME_STATE_MENU);
sm_->add<zombye::play_state>(GAME_STATE_PLAY);
}
void zombye::gameplay_system::use(std::string name) {
sm_->use(name);
}
void zombye::gameplay_system::dispose_current() {
sm_->dispose_current();
}
void zombye::gameplay_system::update(float delta_time) {
sm_->update(delta_time);
for (auto& c : camera_follow_components_) {
c->update(delta_time);
}
for (auto& s : state_components_) {
s->update(delta_time);
}
}
void zombye::gameplay_system::register_component(camera_follow_component* component) {
camera_follow_components_.emplace_back(component);
}
void zombye::gameplay_system::unregister_component(camera_follow_component* component) {
remove(camera_follow_components_, component);
}
void zombye::gameplay_system::register_component(state_component* component) {
state_components_.emplace_back(component);
}
void zombye::gameplay_system::unregister_component(state_component* component) {
remove(state_components_, component);
}
| mit |
genediazjr/hapitodo | test/schemasTest.js | 1261 | 'use strict';
const Schemas = require('../server/schemas');
const Code = require('code');
const Lab = require('lab');
const expect = Code.expect;
const lab = exports.lab = Lab.script();
const describe = lab.describe;
const it = lab.it;
describe('server/schemas.todoSchema', () => {
it('validates object', (done) => {
expect(Schemas.todoSchema.validate({ test: 'val' }).error).to.exist();
return done();
});
it('allows id as string, done as boolean, and content as string', (done) => {
expect(Schemas.todoSchema.validate({ id: 1 }).error).to.exist();
expect(Schemas.todoSchema.validate({ id: 'id' }).error).to.not.exist();
expect(Schemas.todoSchema.validate({ done: false}).error).to.not.exist();
expect(Schemas.todoSchema.validate({ done: 'somtest'}).error).to.exist();
expect(Schemas.todoSchema.validate({ done: 'false'}).error).to.not.exist();
expect(Schemas.todoSchema.validate({ content: 1234567 }).error).to.exist();
expect(Schemas.todoSchema.validate({ content: 'test' }).error).to.not.exist();
return done();
});
it('exposes a todoObject', (done) => {
expect(Schemas.todoObject).to.be.an.object();
return done();
});
});
| mit |
ngageoint/geopackage-android | geopackage-sdk/src/androidTest/java/mil/nga/geopackage/extension/rtree/RTreeIndexExtensionCreateTest.java | 690 | package mil.nga.geopackage.extension.rtree;
import org.junit.Test;
import java.sql.SQLException;
import mil.nga.geopackage.CreateGeoPackageTestCase;
/**
* Test RTree Extension from a created database
*
* @author osbornb
*/
public class RTreeIndexExtensionCreateTest extends CreateGeoPackageTestCase {
/**
* Constructor
*/
public RTreeIndexExtensionCreateTest() {
}
/**
* Test RTree
*
* @throws SQLException upon error
*/
@Test
public void testRTree() throws SQLException {
RTreeIndexExtensionUtils.testRTree(geoPackage);
}
@Override
public boolean allowEmptyFeatures() {
return false;
}
}
| mit |
vorushin/moodbox_aka_risovaska | client/messagefile.h | 4309 | #ifndef MESSAGEFILE_H
#define MESSAGEFILE_H
#include "xmlserializable.h"
#include <QList>
#include <QByteArray>
#include <QImage>
#include "messagekey.h"
#include "messagetypemix.h"
namespace Velasquez
{
class DrawingElement;
}
namespace MoodBox
{
#define MESSAGE_FILE_EXTENSION ".mbm"
#define MESSAGE_TIMESTAMP "dd.MM.yyyy hh-mm-ss-zzz"
#define MESSAGE_FILE_XML_TAGNAME "Message"
#define MESSAGE_FILE_ID_ATTRIBUTE "Id"
#define MESSAGE_TYPE_XML_ATTRIBUTE "Type"
#define MESSAGE_SENTDATE_XML_ATTRIBUTE "SentDate"
#define MESSAGE_RECIPIENT_XML_ATTRIBUTE "Recipient"
#define MESSAGE_AUTHOR_XML_ATTRIBUTE "Author"
#define MESSAGE_AUTHORLOGIN_XML_ATTRIBUTE "AuthorLogin"
#define MESSAGE_SENT_XML_ATTRIBUTE "Sent"
#define MESSAGE_ISPUBLIC_XML_ATTRIBUTE "IsPublic"
#define MESSAGE_ONBEHALF_XML_TAGNAME "OnBehalf"
#define MESSAGE_TYPE_PRIVATE_VALUE "Private"
#define MESSAGE_TYPE_FRIENDS_VALUE "Friends"
#define MESSAGE_TYPE_CHANNEL_VALUE "Channel"
#define MESSAGE_YES_VALUE "Yes"
#define MESSAGE_NO_VALUE "No"
#define METAINFO_IMAGEFORMAT_TITLE "Image"
#define METAINFO_IMAGEFORMAT_TAGNAME "Type"
#define METAINFO_IMAGEFORMAT_VALUEPREFIX "image/"
// Drawing Message file base info
class MessageFileBase : public MessageTypeMix, protected XmlSerializable
{
public:
MessageFileBase();
MessageFileBase(MessageType::MessageTypeEnum type, qint32 authorId);
MessageFileBase(qint32 recipientId, qint32 authorId);
inline void setId(qint32 id) { key.setId(id); };
inline qint32 getId() const { return key.getId(); };
inline void setSentDate(const QDateTime &date) { key.setDate(date); };
inline QDateTime getSentDate() const { return key.getDate(); };
inline void setAuthor(qint32 authorId) { this->authorId = authorId; };
inline qint32 getAuthor() const { return authorId; };
inline void setAuthorLogin(const QString &authorLogin) { this->authorLogin = authorLogin; };
inline QString getAuthorLogin() const { return authorLogin; };
inline void setSent(bool sent) { this->sent = sent; };
inline bool getSent() const { return sent; };
inline void setPublic(bool isPublic) { this->isPublic = isPublic; };
inline bool getPublic() const { return isPublic; };
inline void setFileName(const QString &fileName) { this->fileName = fileName; };
inline QString getFileName() const { return this->fileName; };
inline MessageKey getMessageKey() const { return key; };
SerializationResult save();
SerializationResult load();
static QString messageTypeToString(MessageType::MessageTypeEnum type);
static MessageType::MessageTypeEnum messageTypeFromString(const QString &string, bool *ok = NULL);
static QString messageBoolToString(bool value);
static bool messageBoolFromString(const QString &string);
protected:
virtual SerializationResult saveToXml(QXmlStreamWriter* writer) const;
virtual SerializationResult loadFromXml(QXmlStreamReader* reader);
virtual SerializationResult saveContentToXml(QXmlStreamWriter* writer) const;
virtual SerializationResult loadContentFromXml(QXmlStreamReader* reader);
private:
MessageKey key;
qint32 authorId;
QString authorLogin;
bool sent;
bool isPublic;
QString fileName;
};
// Drawing Message file with elements buffer
class MessageFile : public MessageFileBase
{
public:
MessageFile();
MessageFile(MessageType::MessageTypeEnum type, qint32 authorId);
MessageFile(qint32 recipientId, qint32 authorId);
// Information
inline void setInfo(const QString &info) { this->info = info; };
QString getInfo() const { return info; };
// Image format info
QString getImageFormatFromInfo() const;
// Preview works
void setPreview(const QImage &preview);
void setPreview(const QByteArray &previewBytes);
inline QImage getPreview() const { return preview; };
inline QByteArray getPreviewBytes() const { return previewBytes; };
protected:
virtual SerializationResult saveContentToXml(QXmlStreamWriter* writer) const;
virtual SerializationResult loadContentFromXml(QXmlStreamReader* reader);
private:
QString info;
QImage preview;
QByteArray previewBytes;
void updateBytesFromPreview();
void updatePreviewFromBytes();
};
}
#endif // MESSAGEFILE_H | mit |
codingfriend1/Feathers-Vue | README.md | 4364 | # Feathers-Vue
> A Vue 2 and FeathersJS 2 fullstack app with authentication, email verification, and email support."
## About
This project uses [Feathers](http://feathersjs.com). An open source web framework for building modern real-time applications and Vue 2 with Server Side Rendering.
This project is not finished, but it can be ready to use if you are content with what it offers.
Features
- SASS
- Stylus
- Pug
- ES6, ES7, and ES8
- Webpack
- Vue Stash - For Redux Store
- Bootstrap
- Lodash
- jQuery
- FontAwesome
- Validate client side data with mongoose schemas
## Getting Started
Getting up and running is as easy as 1, 2, 3, 4.
There are multiple ways to start/develop the app.
### Develop with docker
Don't install node_modules locally
1. Create an `environment-dev.env` and `environment.env` file to hold your environment variables. These files are ignored by git. You'll want a DATABASE_URL and your gmail info for email verification
```
DATABASE_URL=mongodb://db/feathersvuedevelopment
[email protected]
[email protected]
GMAIL_PASSWORD=your_pass_password
```
_See [How to set an app password](https://support.google.com/accounts/answer/185833)_
2. Run npm start
```
npm start
```
3. To see the production build locally
```
npm run build-qa
npm run qa
```
4. To switch back to development use
```
npm run build-dev
npm start
```
Switching contexts between production and development requires a full docker build with no cache.
### Develop without docker
1. Make sure you have [NodeJS](https://nodejs.org/) and [npm](https://www.npmjs.com/) installed.
2. Install your dependencies
```
cd path/to/Feathers-Vue; npm install
```
3. Start your app locally
```
mongod
```
```
npm run dev
```
4. In production run
```
npm run build
npm run production
```
If you want emails to work using gmail add the following environment variables
```
export [email protected]
export GMAIL_PASS=yourpassword or app-password
```
_See [How to set an app password](https://support.google.com/accounts/answer/185833)_
## Testing
Simply run `npm test` to run all the server side unit tests in the `test/` directory, or run `npm run integration` to run the client side tests.
## Scaffolding
Feathers has a powerful command line interface. Here are a few things it can do:
```
$ npm install -g feathers-cli # Install Feathers CLI
$ feathers generate service # Generate a new Service
$ feathers generate hook # Generate a new Hook
$ feathers generate model # Generate a new Model
$ feathers help # Show all commands
```
## Help
For more information on all the things you can do with Feathers visit [docs.feathersjs.com](http://docs.feathersjs.com).
## Looking for mobile?
I'm working on a cordova starter with feathers 2, Vue 2, and Framework 7. Visit the `cordova` branch of this repo.
[Cordova Branch](https://github.com/codingfriend1/Feathers-Vue/tree/cordova)
## Gitlab Auto Deployment
1. Create a digitalocean instance from using the one-click docker instance.
2. ssh into the instance and run
```
sudo apt-get update
sudo apt-get upgrade
sudo apt-get -y install python-pip
sudo pip install docker-compose
```
3. Edit `sshd_config`
```
nano /etc/ssh/sshd_config
```
4. At the bottom of the file change `PasswordAuthentication`
```
PasswordAuthentication yes
```
5. Run `reload ssh`
6. Set the secret environment variables in gitlab
```
DATABASE_URL=mongodb://db/feathersvue
DEPLOYMENT_SERVER_IP=your_ip_address
DEPLOYMENT_SERVER_PASS=your_user_password
DEPLOYMENT_SERVER_USER=your_server_user
```
7. Update `docker-compose.autodeploy.yml` web image to point to your hosted image.
8. Update `gitlab-ci.yml` in the `only` sections so the deploy jobs run only on the branches you want to deploy from (see the example snippet after this list).
9. Push changes in git to gitlab.
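As a rough illustration of step 8, the `only` block of a deploy job in `gitlab-ci.yml` could be restricted like this (the job name `deploy` and the branch `master` are assumptions — match them to your actual pipeline):

```
deploy:
  only:
    - master
```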
## Breaking Changes
- Removed mongoose validation from client side and replaced with Yup.
- Reconstructed server-side rendering to use updated instructions in vuejs.
- Moved server-entry file into app.
## License
Copyright (c) 2016
Licensed under the [MIT license](LICENSE).
| mit |
dfcreative/popoff | overlay.js | 2294 | /**
* @module popoff/overlay
*
* Because overlay-component is hopelessly out of date.
* This is modern rewrite.
*/
const Emitter = require('events').EventEmitter;
const inherits = require('inherits');
const extend = require('xtend/mutable');
module.exports = Overlay;
/**
* Initialize a new `Overlay`.
*
* @param {Object} options
* @api public
*/
function Overlay(options) {
if (!(this instanceof Overlay)) return new Overlay(options);
Emitter.call(this);
extend(this, options);
if (!this.container) {
this.container = document.body || document.documentElement;
}
//create overlay element
this.element = document.createElement('div');
this.element.classList.add('popoff-overlay');
if (this.closable) {
this.element.addEventListener('click', e => {
this.hide();
});
this.element.classList.add('popoff-closable');
}
}
inherits(Overlay, Emitter);
//close overlay by click
Overlay.prototype.closable = true;
/**
* Show the overlay.
*
* Emits "show" event.
*
* @return {Overlay}
* @api public
*/
Overlay.prototype.show = function () {
this.emit('show');
this.container.appendChild(this.element);
//class removed in a timeout to save animation
setTimeout( () => {
this.element.classList.add('popoff-visible');
this.emit('afterShow');
}, 10);
return this;
};
/**
* Hide the overlay.
*
* Emits "hide" event.
*
* @return {Overlay}
* @api public
*/
Overlay.prototype.hide = function () {
this.emit('hide');
this.element.classList.remove('popoff-visible');
this.element.addEventListener('transitionend', end);
this.element.addEventListener('webkitTransitionEnd', end);
this.element.addEventListener('otransitionend', end);
this.element.addEventListener('oTransitionEnd', end);
this.element.addEventListener('msTransitionEnd', end);
var to = setTimeout(end, 1000);
var that = this;
function end () {
that.element.removeEventListener('transitionend', end);
that.element.removeEventListener('webkitTransitionEnd', end);
that.element.removeEventListener('otransitionend', end);
that.element.removeEventListener('oTransitionEnd', end);
that.element.removeEventListener('msTransitionEnd', end);
clearTimeout(to);
that.container.removeChild(that.element);
that.emit('afterHide');
}
return this;
};
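// Illustrative usage sketch (the require path and option values are assumptions,
// shown only to demonstrate the API above):
//
// var Overlay = require('popoff/overlay');
// var overlay = Overlay({ closable: true });
// overlay.on('afterHide', function () { console.log('overlay removed'); });
// overlay.show(); // appends .popoff-overlay to document.body
// overlay.hide(); // fades out, then emits "afterHide"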
| mit |
timkrentz/SunTracker | IMU/VTK-6.2.0/IO/MINC/Testing/Python/TestMNITagPoints.py | 3826 | #!/usr/bin/env python
import os
import vtk
from vtk.test import Testing
from vtk.util.misc import vtkGetDataRoot
VTK_DATA_ROOT = vtkGetDataRoot()
# Test label reading from an MNI tag file
#
# The current directory must be writeable.
#
try:
fname = "mni-tagtest.tag"
channel = open(fname, "wb")
channel.close()
# create some random points in a sphere
#
sphere1 = vtk.vtkPointSource()
sphere1.SetNumberOfPoints(13)
xform = vtk.vtkTransform()
xform.RotateWXYZ(20, 1, 0, 0)
xformFilter = vtk.vtkTransformFilter()
xformFilter.SetTransform(xform)
xformFilter.SetInputConnection(sphere1.GetOutputPort())
labels = vtk.vtkStringArray()
labels.InsertNextValue("0")
labels.InsertNextValue("1")
labels.InsertNextValue("2")
labels.InsertNextValue("3")
labels.InsertNextValue("Halifax")
labels.InsertNextValue("Toronto")
labels.InsertNextValue("Vancouver")
labels.InsertNextValue("Larry")
labels.InsertNextValue("Bob")
labels.InsertNextValue("Jackie")
labels.InsertNextValue("10")
labels.InsertNextValue("11")
labels.InsertNextValue("12")
weights = vtk.vtkDoubleArray()
weights.InsertNextValue(1.0)
weights.InsertNextValue(1.1)
weights.InsertNextValue(1.2)
weights.InsertNextValue(1.3)
weights.InsertNextValue(1.4)
weights.InsertNextValue(1.5)
weights.InsertNextValue(1.6)
weights.InsertNextValue(1.7)
weights.InsertNextValue(1.8)
weights.InsertNextValue(1.9)
weights.InsertNextValue(0.9)
weights.InsertNextValue(0.8)
weights.InsertNextValue(0.7)
writer = vtk.vtkMNITagPointWriter()
writer.SetFileName(fname)
writer.SetInputConnection(sphere1.GetOutputPort())
writer.SetInputConnection(1, xformFilter.GetOutputPort())
writer.SetLabelText(labels)
writer.SetWeights(weights)
writer.SetComments("Volume 1: sphere points\nVolume 2: transformed points")
writer.Write()
reader = vtk.vtkMNITagPointReader()
reader.CanReadFile(fname)
reader.SetFileName(fname)
textProp = vtk.vtkTextProperty()
textProp.SetFontSize(12)
textProp.SetColor(1.0, 1.0, 0.5)
labelHier = vtk.vtkPointSetToLabelHierarchy()
labelHier.SetInputConnection(reader.GetOutputPort())
labelHier.SetTextProperty(textProp)
labelHier.SetLabelArrayName("LabelText")
labelHier.SetMaximumDepth(15)
labelHier.SetTargetLabelCount(12)
labelMapper = vtk.vtkLabelPlacementMapper()
labelMapper.SetInputConnection(labelHier.GetOutputPort())
labelMapper.UseDepthBufferOff()
labelMapper.SetShapeToRect()
labelMapper.SetStyleToOutline()
labelActor = vtk.vtkActor2D()
labelActor.SetMapper(labelMapper)
glyphSource = vtk.vtkSphereSource()
glyphSource.SetRadius(0.01)
glyph = vtk.vtkGlyph3D()
glyph.SetSourceConnection(glyphSource.GetOutputPort())
glyph.SetInputConnection(reader.GetOutputPort())
mapper = vtk.vtkDataSetMapper()
mapper.SetInputConnection(glyph.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)
# Create rendering stuff
ren1 = vtk.vtkRenderer()
renWin = vtk.vtkRenderWindow()
renWin.SetMultiSamples(0)
renWin.AddRenderer(ren1)
iren = vtk.vtkRenderWindowInteractor()
iren.SetRenderWindow(renWin)
# Add the actors to the renderer, set the background and size
#
ren1.AddViewProp(actor)
ren1.AddViewProp(labelActor)
ren1.SetBackground(0, 0, 0)
renWin.SetSize(300, 300)
renWin.Render()
try:
os.remove(fname)
except OSError:
pass
# render the image
#
# iren.Start()
except IOError:
print "Unable to test the writer/reader."
| mit |
donaldinou/frontend | src/Viteloge/CoreBundle/Resources/descriptions/28173.html | 1962 | <div class="commune_descr limited">
<p>
Gasville-Oisème is
a commune located in the Eure-et-Loir department, in the Centre region. It had 1 176 inhabitants in 2008.</p>
<p>The commune offers a few sports facilities, including a horse-riding centre and a walking loop.</p>
<p>In 2011, the housing stock in Gasville-Oisème consisted of 17 apartments and 496 houses, making for
a fairly balanced market.</p>
<p>In Gasville-Oisème, the average sale price of an apartment is estimated at 12 611 € per m². The average purchase price of a house stands at 1 739 € per m². For rentals, the average value is 19.59 € per m² per month.</p>
<p>Gasville-Oisème is located only 7 kilometres from Chartres, so students who need inexpensive housing may consider renting an apartment in Gasville-Oisème. Gasville-Oisème is also a good rental investment thanks to its proximity to Chartres and its universities. It should also be possible to find an apartment to buy. </p>
<p>Near Gasville-Oisème lie the towns of
<a href="{{VLROOT}}/immobilier/saint-prest_28358/">Saint-Prest</a> located 2 km away, 2 135 inhabitants,
<a href="{{VLROOT}}/immobilier/leves_28209/">Lèves</a> located 6 km away, 4 405 inhabitants,
<a href="{{VLROOT}}/immobilier/soulaires_28379/">Soulaires</a> located 4 km away, 427 inhabitants,
<a href="{{VLROOT}}/immobilier/poisvilliers_28301/">Poisvilliers</a> located 7 km away, 327 inhabitants,
<a href="{{VLROOT}}/immobilier/jouy_28201/">Jouy</a> located 3 km away, 1 888 inhabitants,
<a href="{{VLROOT}}/immobilier/champseru_28073/">Champseru</a> located 7 km away, 305 inhabitants,
among others. Moreover, Gasville-Oisème is located only seven km from <a href="{{VLROOT}}/immobilier/chartres_28085/">Chartres</a>.</p>
</div>
| mit |
liemqv/EventFlow | Source/EventFlow.Hangfire/Integration/HangfireJobScheduler.cs | 3367 | // The MIT License (MIT)
//
// Copyright (c) 2015 Rasmus Mikkelsen
// https://github.com/rasmus/EventFlow
//
// Permission is hereby granted, free of charge, to any person obtaining a copy of
// this software and associated documentation files (the "Software"), to deal in
// the Software without restriction, including without limitation the rights to
// use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
// the Software, and to permit persons to whom the Software is furnished to do so,
// subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in all
// copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
// FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
// COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
// IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
// CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
using EventFlow.Core;
using EventFlow.Jobs;
using EventFlow.Logs;
using Hangfire;
using System;
using System.Threading;
using System.Threading.Tasks;
namespace EventFlow.Hangfire.Integration
{
public class HangfireJobScheduler : IJobScheduler
{
private readonly IBackgroundJobClient _backgroundJobClient;
private readonly IJobDefinitionService _jobDefinitionService;
private readonly IJsonSerializer _jsonSerializer;
private readonly ILog _log;
public HangfireJobScheduler(
ILog log,
IJsonSerializer jsonSerializer,
IBackgroundJobClient backgroundJobClient,
IJobDefinitionService jobDefinitionService)
{
_log = log;
_jsonSerializer = jsonSerializer;
_backgroundJobClient = backgroundJobClient;
_jobDefinitionService = jobDefinitionService;
}
public Task<IJobId> ScheduleNowAsync(IJob job, CancellationToken cancellationToken)
{
return ScheduleAsync(job, (c, d, j) => _backgroundJobClient.Enqueue<IJobRunner>(r => r.Execute(d.Name, d.Version, j)));
}
public Task<IJobId> ScheduleAsync(IJob job, DateTimeOffset runAt, CancellationToken cancellationToken)
{
return ScheduleAsync(job, (c, d, j) => _backgroundJobClient.Schedule<IJobRunner>(r => r.Execute(d.Name, d.Version, j), runAt));
}
public Task<IJobId> ScheduleAsync(IJob job, TimeSpan delay, CancellationToken cancellationToken)
{
return ScheduleAsync(job, (c, d, j) => _backgroundJobClient.Schedule<IJobRunner>(r => r.Execute(d.Name, d.Version, j), delay));
}
private Task<IJobId> ScheduleAsync(IJob job, Func<IBackgroundJobClient, JobDefinition, string, string> schedule)
{
var jobDefinition = _jobDefinitionService.GetJobDefinition(job.GetType());
var json = _jsonSerializer.Serialize(job);
var id = schedule(_backgroundJobClient, jobDefinition, json);
_log.Verbose($"Scheduled job '{id}' in Hangfire");
return Task.FromResult<IJobId>(new HangfireJobId(id));
}
}
} | mit |
vicols92/linkify | resources/includes/TRASH/KANSKESPARA.php | 3055 | if (isset($_POST['upload'])) {
    $target = "../img/".basename($_FILES['image']['name']); // destination path inside the upload directory
$image = $_FILES['image']['name'];
$msg = "";
$sql = "UPDATE user SET avatar='$image' WHERE id='$id'";
mysqli_query($conn, $sql);
if (move_uploaded_file($_FILES['image']['tmp_name'], $target)) {
$msg = "Uploaded file.";
} else {
$msg = "there was a problem uploading image";
}
}
if (isset($_POST['upload'])) {
$imageName = mysqli_real_escape_string($conn, $_FILES['image']['name']);
$imageData = mysqli_real_escape_string($conn,file_get_contents($_FILES['image']['tmp_name']));
$imageType = mysqli_real_escape_string($conn, $_FILES['image']['type']);
if (substr($imageType, 0, 5) == "image") {
$sql = "UPDATE user SET avatar='$imageData' WHERE id='$id'";
mysqli_query($conn, $sql);
echo "Image uploaded!";
} else {
echo "only images are allowed";
}
}
$sql = "SELECT * FROM user";
$result = mysqli_query($conn, $sql);
if (mysqli_num_rows($result) > 0) {
while ($row = mysqli_fetch_assoc($result)) {
$id = $row['id'];
$sqlImg = "SELECT * FROM profileimg WHERE userid='$id'";
$resultImg = mysqli_query($conn, $sqlImg);
while ($rowImg = mysqli_fetch_assoc($resultImg)) {
echo "<div class='user-container'>";
if ($rowImg['status'] == 0) {
echo "<img src='img/profile".$id.".jpg?'".mt_rand().">";
} else {
echo "<img src='img/profiledefault.jpg'>";
}
echo "<p>".$row['uid']."</p>";
echo "</div>";
}
}
} else {
echo "There are no users yet!";
}
// if ($userRow['image'] == "") {
echo "<form action='".upload($conn)."' method='POST' enctype='multipart/form-data'>
<input type='file' name='file'>
<button type='submit' name='submit'>UPLOAD</button>
</form>";
if (isset($_POST['upload'])) {
} else {
}
// }
// function upload($conn) {
// $id = $_SESSION['id'];
//
//
// if (isset($_POST['submit'])) {
// $file = $_FILES['file'];
// $fileName = $file['name'];
// $fileTmpName = $file['tmp_name'];
// $fileSize = $file['size'];
// $fileError = $file['error'];
// $fileType = $file['type'];
//
// $fileExt = explode('.', $fileName);
// $fileActualExt = strtolower(end($fileExt));
//
// $allowed = array('jpg', 'jpeg', 'png', 'pdf');
//
// if (in_array($fileActualExt, $allowed)) {
// if ($fileError === 0) {
// if ($fileSize < 1000000) {
// $fileNameNew = "profile".$id.".".$fileActualExt;
// $fileDestination = 'uploads/'.$fileNameNew;
// move_uploaded_file($fileTmpName, $fileDestination);
// $sql = "UPDATE profileimg SET status =0 WHERE userid='$id'";
// $result = mysqli_query($conn, $sql);
// header("Location: index.php?uploadsuccess");
// } else {
// echo "Your file is too big!";
// }
// } else {
// echo "There was an error uploading your file!";
// }
// } else {
// echo "You cannot upload files of this type!";
// }
// }
// }
| mit |
neurodrone/earthquake | lib/earthquake/ext.rb | 1538 | module Twitter
class JSONStream
protected
def reconnect_after timeout
@reconnect_callback.call(timeout, @reconnect_retries) if @reconnect_callback
if timeout == 0
reconnect @options[:host], @options[:port]
start_tls if @options[:ssl]
else
EventMachine.add_timer(timeout) do
reconnect @options[:host], @options[:port]
start_tls if @options[:ssl]
end
end
end
end
end
module TwitterOAuth
class Client
private
def consumer
@consumer ||= OAuth::Consumer.new(
@consumer_key,
@consumer_secret,
{ :site => 'https://api.twitter.com', :proxy => @proxy }
)
end
end
end
class String
def c(*codes)
return self if Earthquake.config[:lolize]
codes = codes.flatten.map { |code|
case code
when String, Symbol
Earthquake.config[:color][code.to_sym] rescue nil
else
code
end
}.compact.unshift(0)
"\e[#{codes.join(';')}m#{self}\e[0m"
end
def coloring(pattern, color = nil, &block)
return self if Earthquake.config[:lolize]
self.gsub(pattern) do |i|
applied_colors = $`.scan(/\e\[[\d;]+m/)
c = color || block.call(i)
"#{i.c(c)}#{applied_colors.join}"
end
end
t = {
?& => "&",
?< => "<",
?> => ">",
?' => "'",
?" => """,
}
define_method(:u) do
gsub(/(#{Regexp.union(t.values)})/o, t.invert)
end
define_method(:e) do
gsub(/[#{t.keys.join}]/o, t)
end
end
| mit |
jordanwallwork/jello | src/Jello/Nodes/TerminalNode.cs | 197 | namespace Jello.Nodes
{
public abstract class TerminalNode<T> : Node<T> where T : class
{
public override INode GetSingleChild()
{
return null;
}
}
} | mit |
mattdbridges/dotify | lib/dotify/version/checker.rb | 622 | module Dotify
class Version
    # The Checker class is responsible for
# reaching out to Rubygems.org and retrieving
# the latest gem version.
class Checker
class << self
attr_reader :result, :resp
end
def self.check_latest_release!
@result = (latest == Version.build.level)
end
def self.latest
fetch.map { |v| v['number'] }.max
end
private
def self.fetch
require 'multi_json'
@resp = Net::HTTP.get('rubygems.org', '/api/v1/versions/dotify.json')
MultiJson.load(@resp)
end
end
end
end | mit |
RandCoin/randcoin | src/irc.cpp | 10568 | // Copyright (c) 2009-2010 Satoshi Nakamoto
// Copyright (c) 2009-2012 The Bitcoin developers
// Copyright (c) 2011-2012 Litecoin Developers
// Copyright (c) 2013 Fastcoin Developers
// Distributed under the MIT/X11 software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#include "irc.h"
#include "net.h"
#include "strlcpy.h"
#include "base58.h"
using namespace std;
using namespace boost;
int nGotIRCAddresses = 0;
void ThreadIRCSeed2(void* parg);
#pragma pack(push, 1)
struct ircaddr
{
struct in_addr ip;
short port;
};
#pragma pack(pop)
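// Encode an IPv4 address and port as 'u' plus Base58Check data; the result is used as this node's IRC nickname.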
string EncodeAddress(const CService& addr)
{
struct ircaddr tmp;
if (addr.GetInAddr(&tmp.ip))
{
tmp.port = htons(addr.GetPort());
vector<unsigned char> vch(UBEGIN(tmp), UEND(tmp));
return string("u") + EncodeBase58Check(vch);
}
return "";
}
bool DecodeAddress(string str, CService& addr)
{
vector<unsigned char> vch;
if (!DecodeBase58Check(str.substr(1), vch))
return false;
struct ircaddr tmp;
if (vch.size() != sizeof(tmp))
return false;
memcpy(&tmp, &vch[0], sizeof(tmp));
addr = CService(tmp.ip, ntohs(tmp.port));
return true;
}
static bool Send(SOCKET hSocket, const char* pszSend)
{
if (strstr(pszSend, "PONG") != pszSend)
printf("IRC SENDING: %s\n", pszSend);
const char* psz = pszSend;
const char* pszEnd = psz + strlen(psz);
while (psz < pszEnd)
{
int ret = send(hSocket, psz, pszEnd - psz, MSG_NOSIGNAL);
if (ret < 0)
return false;
psz += ret;
}
return true;
}
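// Read one line from the IRC socket, transparently replying to server PINGs with PONGs.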
bool RecvLineIRC(SOCKET hSocket, string& strLine)
{
loop
{
bool fRet = RecvLine(hSocket, strLine);
if (fRet)
{
if (fShutdown)
return false;
vector<string> vWords;
ParseString(strLine, ' ', vWords);
if (vWords.size() >= 1 && vWords[0] == "PING")
{
strLine[1] = 'O';
strLine += '\r';
Send(hSocket, strLine.c_str());
continue;
}
}
return fRet;
}
}
int RecvUntil(SOCKET hSocket, const char* psz1, const char* psz2=NULL, const char* psz3=NULL, const char* psz4=NULL)
{
loop
{
string strLine;
strLine.reserve(10000);
if (!RecvLineIRC(hSocket, strLine))
return 0;
printf("IRC %s\n", strLine.c_str());
if (psz1 && strLine.find(psz1) != string::npos)
return 1;
if (psz2 && strLine.find(psz2) != string::npos)
return 2;
if (psz3 && strLine.find(psz3) != string::npos)
return 3;
if (psz4 && strLine.find(psz4) != string::npos)
return 4;
}
}
bool Wait(int nSeconds)
{
if (fShutdown)
return false;
printf("IRC waiting %d seconds to reconnect\n", nSeconds);
for (int i = 0; i < nSeconds; i++)
{
if (fShutdown)
return false;
Sleep(1000);
}
return true;
}
bool RecvCodeLine(SOCKET hSocket, const char* psz1, string& strRet)
{
strRet.clear();
loop
{
string strLine;
if (!RecvLineIRC(hSocket, strLine))
return false;
vector<string> vWords;
ParseString(strLine, ' ', vWords);
if (vWords.size() < 2)
continue;
if (vWords[1] == psz1)
{
printf("IRC %s\n", strLine.c_str());
strRet = strLine;
return true;
}
}
}
bool GetIPFromIRC(SOCKET hSocket, string strMyName, CNetAddr& ipRet)
{
Send(hSocket, strprintf("USERHOST %s\r", strMyName.c_str()).c_str());
string strLine;
if (!RecvCodeLine(hSocket, "302", strLine))
return false;
vector<string> vWords;
ParseString(strLine, ' ', vWords);
if (vWords.size() < 4)
return false;
string str = vWords[3];
if (str.rfind("@") == string::npos)
return false;
string strHost = str.substr(str.rfind("@")+1);
// Hybrid IRC used by lfnet always returns IP when you userhost yourself,
// but in case another IRC is ever used this should work.
printf("GetIPFromIRC() got userhost %s\n", strHost.c_str());
CNetAddr addr(strHost, true);
if (!addr.IsValid())
return false;
ipRet = addr;
return true;
}
void ThreadIRCSeed(void* parg)
{
IMPLEMENT_RANDOMIZE_STACK(ThreadIRCSeed(parg));
// Make this thread recognisable as the IRC seeding thread
RenameThread("bitcoin-ircseed");
try
{
ThreadIRCSeed2(parg);
}
catch (std::exception& e) {
PrintExceptionContinue(&e, "ThreadIRCSeed()");
} catch (...) {
PrintExceptionContinue(NULL, "ThreadIRCSeed()");
}
printf("ThreadIRCSeed exited\n");
}
void ThreadIRCSeed2(void* parg)
{
    /* Don't advertise on IRC if we don't allow incoming connections */
if (mapArgs.count("-connect") || fNoListen)
return;
if (!GetBoolArg("-irc", false))
return;
printf("ThreadIRCSeed started\n");
int nErrorWait = 10;
int nRetryWait = 10;
while (!fShutdown)
{
CService addrConnect("92.243.23.21", 6667); // irc.lfnet.org
CService addrIRC("irc.lfnet.org", 6667, true);
if (addrIRC.IsValid())
addrConnect = addrIRC;
SOCKET hSocket;
if (!ConnectSocket(addrConnect, hSocket))
{
printf("IRC connect failed\n");
nErrorWait = nErrorWait * 11 / 10;
if (Wait(nErrorWait += 60))
continue;
else
return;
}
if (!RecvUntil(hSocket, "Found your hostname", "using your IP address instead", "Couldn't look up your hostname", "ignoring hostname"))
{
closesocket(hSocket);
hSocket = INVALID_SOCKET;
nErrorWait = nErrorWait * 11 / 10;
if (Wait(nErrorWait += 60))
continue;
else
return;
}
CNetAddr addrIPv4("1.2.3.4"); // arbitrary IPv4 address to make GetLocal prefer IPv4 addresses
CService addrLocal;
string strMyName;
if (GetLocal(addrLocal, &addrIPv4))
strMyName = EncodeAddress(GetLocalAddress(&addrConnect));
if (strMyName == "")
strMyName = strprintf("x%u", GetRand(1000000000));
Send(hSocket, strprintf("NICK %s\r", strMyName.c_str()).c_str());
Send(hSocket, strprintf("USER %s 8 * : %s\r", strMyName.c_str(), strMyName.c_str()).c_str());
int nRet = RecvUntil(hSocket, " 004 ", " 433 ");
if (nRet != 1)
{
closesocket(hSocket);
hSocket = INVALID_SOCKET;
if (nRet == 2)
{
printf("IRC name already in use\n");
Wait(10);
continue;
}
nErrorWait = nErrorWait * 11 / 10;
if (Wait(nErrorWait += 60))
continue;
else
return;
}
Sleep(500);
// Get our external IP from the IRC server and re-nick before joining the channel
CNetAddr addrFromIRC;
if (GetIPFromIRC(hSocket, strMyName, addrFromIRC))
{
printf("GetIPFromIRC() returned %s\n", addrFromIRC.ToString().c_str());
if (addrFromIRC.IsRoutable())
{
                // IRC lets you re-nick
AddLocal(addrFromIRC, LOCAL_IRC);
strMyName = EncodeAddress(GetLocalAddress(&addrConnect));
Send(hSocket, strprintf("NICK %s\r", strMyName.c_str()).c_str());
}
}
if (fTestNet) {
Send(hSocket, "JOIN #fastcoinTEST3\r");
Send(hSocket, "WHO #fastcoinTEST3\r");
} else {
// randomly join #fastcoin00-#fastcoin99
int channel_number = GetRandInt(100);
channel_number = 0; // Fastcoin: for now, just use one channel
Send(hSocket, strprintf("JOIN #fastcoin%02d\r", channel_number).c_str());
Send(hSocket, strprintf("WHO #fastcoin%02d\r", channel_number).c_str());
}
int64 nStart = GetTime();
string strLine;
strLine.reserve(10000);
while (!fShutdown && RecvLineIRC(hSocket, strLine))
{
if (strLine.empty() || strLine.size() > 900 || strLine[0] != ':')
continue;
vector<string> vWords;
ParseString(strLine, ' ', vWords);
if (vWords.size() < 2)
continue;
char pszName[10000];
pszName[0] = '\0';
if (vWords[1] == "352" && vWords.size() >= 8)
{
// index 7 is limited to 16 characters
// could get full length name at index 10, but would be different from join messages
strlcpy(pszName, vWords[7].c_str(), sizeof(pszName));
printf("IRC got who\n");
}
if (vWords[1] == "JOIN" && vWords[0].size() > 1)
{
// :[email protected] JOIN :#channelname
strlcpy(pszName, vWords[0].c_str() + 1, sizeof(pszName));
if (strchr(pszName, '!'))
*strchr(pszName, '!') = '\0';
printf("IRC got join\n");
}
if (pszName[0] == 'u')
{
CAddress addr;
if (DecodeAddress(pszName, addr))
{
addr.nTime = GetAdjustedTime();
if (addrman.Add(addr, addrConnect, 51 * 60))
printf("IRC got new address: %s\n", addr.ToString().c_str());
nGotIRCAddresses++;
}
else
{
printf("IRC decode failed\n");
}
}
}
closesocket(hSocket);
hSocket = INVALID_SOCKET;
if (GetTime() - nStart > 20 * 60)
{
nErrorWait /= 3;
nRetryWait /= 3;
}
nRetryWait = nRetryWait * 11 / 10;
if (!Wait(nRetryWait += 60))
return;
}
}
#ifdef TEST
int main(int argc, char *argv[])
{
WSADATA wsadata;
if (WSAStartup(MAKEWORD(2,2), &wsadata) != NO_ERROR)
{
printf("Error at WSAStartup()\n");
return false;
}
ThreadIRCSeed(NULL);
WSACleanup();
return 0;
}
#endif
| mit |
fyskam/FysKams-sangbok | FysKamsSangbok/app/src/main/java/fyskam/fyskamssngbok/NavigationDrawerFragment.java | 10600 | package fyskam.fyskamssngbok;
import android.app.Activity;
import android.app.ActionBar;
import android.app.Fragment;
import android.support.v4.app.ActionBarDrawerToggle;
import android.support.v4.view.GravityCompat;
import android.support.v4.widget.DrawerLayout;
import android.content.SharedPreferences;
import android.content.res.Configuration;
import android.os.Bundle;
import android.preference.PreferenceManager;
import android.view.LayoutInflater;
import android.view.Menu;
import android.view.MenuInflater;
import android.view.MenuItem;
import android.view.View;
import android.view.ViewGroup;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.ListView;
import android.widget.Toast;
/**
* Fragment used for managing interactions for and presentation of a navigation drawer.
* See the <a href="https://developer.android.com/design/patterns/navigation-drawer.html#Interaction">
* design guidelines</a> for a complete explanation of the behaviors implemented here.
*/
public class NavigationDrawerFragment extends Fragment {
/**
* Remember the position of the selected item.
*/
private static final String STATE_SELECTED_POSITION = "selected_navigation_drawer_position";
/**
* Per the design guidelines, you should show the drawer on launch until the user manually
* expands it. This shared preference tracks this.
*/
private static final String PREF_USER_LEARNED_DRAWER = "navigation_drawer_learned";
/**
* A pointer to the current callbacks instance (the Activity).
*/
private NavigationDrawerCallbacks mCallbacks;
/**
* Helper component that ties the action bar to the navigation drawer.
*/
private ActionBarDrawerToggle mDrawerToggle;
private DrawerLayout mDrawerLayout;
private ListView mDrawerListView;
private View mFragmentContainerView;
private int mCurrentSelectedPosition = 0;
private boolean mFromSavedInstanceState;
private boolean mUserLearnedDrawer;
public NavigationDrawerFragment() {
}
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Read in the flag indicating whether or not the user has demonstrated awareness of the
// drawer. See PREF_USER_LEARNED_DRAWER for details.
SharedPreferences sp = PreferenceManager.getDefaultSharedPreferences(getActivity());
mUserLearnedDrawer = sp.getBoolean(PREF_USER_LEARNED_DRAWER, false);
if (savedInstanceState != null) {
mCurrentSelectedPosition = savedInstanceState.getInt(STATE_SELECTED_POSITION);
mFromSavedInstanceState = true;
}
// Select either the default item (0) or the last selected item.
selectItem(mCurrentSelectedPosition);
}
@Override
public void onActivityCreated (Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);
// Indicate that this fragment would like to influence the set of actions in the action bar.
setHasOptionsMenu(true);
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
mDrawerListView = (ListView) inflater.inflate(
R.layout.fragment_navigation_drawer, container, false);
mDrawerListView.setOnItemClickListener(new AdapterView.OnItemClickListener() {
@Override
public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
selectItem(position);
}
});
mDrawerListView.setAdapter(new ArrayAdapter<String>(
getActionBar().getThemedContext(),
android.R.layout.simple_list_item_activated_1,
android.R.id.text1,
new String[]{
getString(R.string.title_section1),
getString(R.string.title_section2),
getString(R.string.title_section3),
}));
mDrawerListView.setItemChecked(mCurrentSelectedPosition, true);
return mDrawerListView;
}
public boolean isDrawerOpen() {
return mDrawerLayout != null && mDrawerLayout.isDrawerOpen(mFragmentContainerView);
}
/**
* Users of this fragment must call this method to set up the navigation drawer interactions.
*
* @param fragmentId The android:id of this fragment in its activity's layout.
* @param drawerLayout The DrawerLayout containing this fragment's UI.
*/
public void setUp(int fragmentId, DrawerLayout drawerLayout) {
mFragmentContainerView = getActivity().findViewById(fragmentId);
mDrawerLayout = drawerLayout;
// set a custom shadow that overlays the main content when the drawer opens
mDrawerLayout.setDrawerShadow(R.drawable.drawer_shadow, GravityCompat.START);
// set up the drawer's list view with items and click listener
ActionBar actionBar = getActionBar();
actionBar.setDisplayHomeAsUpEnabled(true);
actionBar.setHomeButtonEnabled(true);
        // ActionBarDrawerToggle ties together the proper interactions
// between the navigation drawer and the action bar app icon.
mDrawerToggle = new ActionBarDrawerToggle(
getActivity(), /* host Activity */
mDrawerLayout, /* DrawerLayout object */
R.drawable.ic_drawer, /* nav drawer image to replace 'Up' caret */
R.string.navigation_drawer_open, /* "open drawer" description for accessibility */
R.string.navigation_drawer_close /* "close drawer" description for accessibility */
) {
@Override
public void onDrawerClosed(View drawerView) {
super.onDrawerClosed(drawerView);
if (!isAdded()) {
return;
}
getActivity().invalidateOptionsMenu(); // calls onPrepareOptionsMenu()
}
@Override
public void onDrawerOpened(View drawerView) {
super.onDrawerOpened(drawerView);
if (!isAdded()) {
return;
}
if (!mUserLearnedDrawer) {
// The user manually opened the drawer; store this flag to prevent auto-showing
// the navigation drawer automatically in the future.
mUserLearnedDrawer = true;
SharedPreferences sp = PreferenceManager
.getDefaultSharedPreferences(getActivity());
sp.edit().putBoolean(PREF_USER_LEARNED_DRAWER, true).apply();
}
getActivity().invalidateOptionsMenu(); // calls onPrepareOptionsMenu()
}
};
// If the user hasn't 'learned' about the drawer, open it to introduce them to the drawer,
// per the navigation drawer design guidelines.
if (!mUserLearnedDrawer && !mFromSavedInstanceState) {
mDrawerLayout.openDrawer(mFragmentContainerView);
}
// Defer code dependent on restoration of previous instance state.
mDrawerLayout.post(new Runnable() {
@Override
public void run() {
mDrawerToggle.syncState();
}
});
mDrawerLayout.setDrawerListener(mDrawerToggle);
}
private void selectItem(int position) {
mCurrentSelectedPosition = position;
if (mDrawerListView != null) {
mDrawerListView.setItemChecked(position, true);
}
if (mDrawerLayout != null) {
mDrawerLayout.closeDrawer(mFragmentContainerView);
}
if (mCallbacks != null) {
mCallbacks.onNavigationDrawerItemSelected(position);
}
}
@Override
public void onAttach(Activity activity) {
super.onAttach(activity);
try {
mCallbacks = (NavigationDrawerCallbacks) activity;
} catch (ClassCastException e) {
throw new ClassCastException("Activity must implement NavigationDrawerCallbacks.");
}
}
@Override
public void onDetach() {
super.onDetach();
mCallbacks = null;
}
@Override
public void onSaveInstanceState(Bundle outState) {
super.onSaveInstanceState(outState);
outState.putInt(STATE_SELECTED_POSITION, mCurrentSelectedPosition);
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
// Forward the new configuration the drawer toggle component.
mDrawerToggle.onConfigurationChanged(newConfig);
}
@Override
public void onCreateOptionsMenu(Menu menu, MenuInflater inflater) {
// If the drawer is open, show the global app actions in the action bar. See also
// showGlobalContextActionBar, which controls the top-left area of the action bar.
if (mDrawerLayout != null && isDrawerOpen()) {
inflater.inflate(R.menu.global, menu);
showGlobalContextActionBar();
}
super.onCreateOptionsMenu(menu, inflater);
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
if (mDrawerToggle.onOptionsItemSelected(item)) {
return true;
}
if (item.getItemId() == R.id.action_example) {
Toast.makeText(getActivity(), "Example action.", Toast.LENGTH_SHORT).show();
return true;
}
return super.onOptionsItemSelected(item);
}
/**
* Per the navigation drawer design guidelines, updates the action bar to show the global app
* 'context', rather than just what's in the current screen.
*/
private void showGlobalContextActionBar() {
ActionBar actionBar = getActionBar();
actionBar.setDisplayShowTitleEnabled(true);
actionBar.setNavigationMode(ActionBar.NAVIGATION_MODE_STANDARD);
actionBar.setTitle(R.string.app_name);
}
private ActionBar getActionBar() {
return getActivity().getActionBar();
}
/**
* Callbacks interface that all activities using this fragment must implement.
*/
public static interface NavigationDrawerCallbacks {
/**
* Called when an item in the navigation drawer is selected.
*/
void onNavigationDrawerItemSelected(int position);
}
}
| mit |
tomreyn/godot | scene/resources/surface_tool.h | 5483 | /*************************************************************************/
/* surface_tool.h */
/*************************************************************************/
/* This file is part of: */
/* GODOT ENGINE */
/* http://www.godotengine.org */
/*************************************************************************/
/* Copyright (c) 2007-2017 Juan Linietsky, Ariel Manzur. */
/* Copyright (c) 2014-2017 Godot Engine contributors (cf. AUTHORS.md) */
/* */
/* Permission is hereby granted, free of charge, to any person obtaining */
/* a copy of this software and associated documentation files (the */
/* "Software"), to deal in the Software without restriction, including */
/* without limitation the rights to use, copy, modify, merge, publish, */
/* distribute, sublicense, and/or sell copies of the Software, and to */
/* permit persons to whom the Software is furnished to do so, subject to */
/* the following conditions: */
/* */
/* The above copyright notice and this permission notice shall be */
/* included in all copies or substantial portions of the Software. */
/* */
/* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, */
/* EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF */
/* MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.*/
/* IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY */
/* CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, */
/* TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE */
/* SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. */
/*************************************************************************/
#ifndef SURFACE_TOOL_H
#define SURFACE_TOOL_H
#include "scene/resources/mesh.h"
#include "thirdparty/misc/mikktspace.h"
class SurfaceTool : public Reference {
GDCLASS(SurfaceTool, Reference);
public:
struct Vertex {
Vector3 vertex;
Color color;
Vector3 normal; // normal, binormal, tangent
Vector3 binormal;
Vector3 tangent;
Vector2 uv;
Vector2 uv2;
Vector<int> bones;
Vector<float> weights;
bool operator==(const Vertex &p_vertex) const;
Vertex() {}
};
private:
struct VertexHasher {
static _FORCE_INLINE_ uint32_t hash(const Vertex &p_vtx);
};
bool begun;
bool first;
Mesh::PrimitiveType primitive;
int format;
Ref<Material> material;
//arrays
List<Vertex> vertex_array;
List<int> index_array;
Map<int, bool> smooth_groups;
//memory
Color last_color;
Vector3 last_normal;
Vector2 last_uv;
Vector2 last_uv2;
Vector<int> last_bones;
Vector<float> last_weights;
Plane last_tangent;
void _create_list_from_arrays(Array arr, List<Vertex> *r_vertex, List<int> *r_index, int &lformat);
void _create_list(const Ref<Mesh> &p_existing, int p_surface, List<Vertex> *r_vertex, List<int> *r_index, int &lformat);
//mikktspace callbacks
static int mikktGetNumFaces(const SMikkTSpaceContext *pContext);
static int mikktGetNumVerticesOfFace(const SMikkTSpaceContext *pContext, const int iFace);
static void mikktGetPosition(const SMikkTSpaceContext *pContext, float fvPosOut[], const int iFace, const int iVert);
static void mikktGetNormal(const SMikkTSpaceContext *pContext, float fvNormOut[], const int iFace, const int iVert);
static void mikktGetTexCoord(const SMikkTSpaceContext *pContext, float fvTexcOut[], const int iFace, const int iVert);
static void mikktSetTSpaceBasic(const SMikkTSpaceContext *pContext, const float fvTangent[], const float fSign, const int iFace, const int iVert);
protected:
static void _bind_methods();
public:
void begin(Mesh::PrimitiveType p_primitive);
void add_vertex(const Vector3 &p_vertex);
void add_color(Color p_color);
void add_normal(const Vector3 &p_normal);
void add_tangent(const Plane &p_tangent);
void add_uv(const Vector2 &p_uv);
void add_uv2(const Vector2 &p_uv);
void add_bones(const Vector<int> &p_indices);
void add_weights(const Vector<float> &p_weights);
void add_smooth_group(bool p_smooth);
void add_triangle_fan(const Vector<Vector3> &p_vertexes, const Vector<Vector2> &p_uvs = Vector<Vector2>(), const Vector<Color> &p_colors = Vector<Color>(), const Vector<Vector2> &p_uv2s = Vector<Vector2>(), const Vector<Vector3> &p_normals = Vector<Vector3>(), const Vector<Plane> &p_tangents = Vector<Plane>());
void add_index(int p_index);
void index();
void deindex();
void generate_normals();
void generate_tangents();
void add_to_format(int p_flags) { format |= p_flags; }
void set_material(const Ref<Material> &p_material);
void clear();
List<Vertex> &get_vertex_array() { return vertex_array; }
void create_from_triangle_arrays(const Array &p_arrays);
Array commit_to_arrays();
void create_from(const Ref<Mesh> &p_existing, int p_surface);
void append_from(const Ref<Mesh> &p_existing, int p_surface, const Transform &p_xform);
Ref<ArrayMesh> commit(const Ref<ArrayMesh> &p_existing = Ref<ArrayMesh>());
SurfaceTool();
};
#endif
| mit |
dawidd6/qtictactoe | include/Menu.h | 201 | #pragma once
class Menu : public QWidget
{
private:
QGridLayout layout;
QPushButton play_single;
QPushButton play_2v2;
QPushButton play_multi;
public:
Menu(Window *window, Game *game);
};
| mit |
Pulgama/supriya | supriya/patterns/EventPattern.py | 1545 | import uuid
from uqbar.objects import new
from supriya.patterns.Pattern import Pattern
class EventPattern(Pattern):
### CLASS VARIABLES ###
__slots__ = ()
### SPECIAL METHODS ###
def _coerce_iterator_output(self, expr, state=None):
import supriya.patterns
if not isinstance(expr, supriya.patterns.Event):
expr = supriya.patterns.NoteEvent(**expr)
if expr.get("uuid") is None:
expr = new(expr, uuid=uuid.uuid4())
return expr
### PUBLIC METHODS ###
def play(self, clock=None, server=None):
import supriya.patterns
import supriya.realtime
event_player = supriya.patterns.RealtimeEventPlayer(
self, clock=clock, server=server or supriya.realtime.Server.default()
)
event_player.start()
return event_player
def with_bus(self, calculation_rate="audio", channel_count=None, release_time=0.25):
import supriya.patterns
return supriya.patterns.Pbus(
self,
calculation_rate=calculation_rate,
channel_count=channel_count,
release_time=release_time,
)
def with_effect(self, synthdef, release_time=0.25, **settings):
import supriya.patterns
return supriya.patterns.Pfx(
self, synthdef=synthdef, release_time=release_time, **settings
)
def with_group(self, release_time=0.25):
import supriya.patterns
return supriya.patterns.Pgroup(self, release_time=release_time)
| mit |
crazyfacka/text2meo | data/lib/levenshtein.js | 1980 | /*
Copyright (c) 2011 Andrei Mackenzie
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
*/
// Compute the edit distance between the two given strings
exports.getEditDistance = function(a, b){
if(a.length == 0) return b.length;
if(b.length == 0) return a.length;
var matrix = [];
// increment along the first column of each row
var i;
for(i = 0; i <= b.length; i++){
matrix[i] = [i];
}
// increment each column in the first row
var j;
for(j = 0; j <= a.length; j++){
matrix[0][j] = j;
}
// Fill in the rest of the matrix
for(i = 1; i <= b.length; i++){
for(j = 1; j <= a.length; j++){
if(b.charAt(i-1) == a.charAt(j-1)){
matrix[i][j] = matrix[i-1][j-1];
} else {
matrix[i][j] = Math.min(matrix[i-1][j-1] + 1, // substitution
Math.min(matrix[i][j-1] + 1, // insertion
matrix[i-1][j] + 1)); // deletion
}
}
}
return matrix[b.length][a.length];
};
| mit |
SurgicalSteel/Competitive-Programming | Kattis-Solutions/Pervasive Heart Monitor.cs | 1202 | using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Pervasive_Heart_Monitor
{
class Program
{
        static bool checknum(string s)
        {
            // a token counts as numeric if it contains any digit (0-9) or a decimal point
            return (s.Contains("0") || s.Contains("1") || s.Contains("2") || s.Contains("3") || s.Contains("4") || s.Contains("5") || s.Contains("6") || s.Contains("7") || s.Contains("8") || s.Contains("9") || s.Contains("."));
        }
static void Main(string[] args)
{
string s = Console.ReadLine();
while (!string.IsNullOrEmpty(s))
{
float total=0;
int n=0;
string name = "";
string[] ssplit = s.Split();
for(int i = 0; i < ssplit.Length; i++)
{
if (!checknum(ssplit[i])) { name += (ssplit[i] + " "); }
else
{
total += float.Parse(ssplit[i]);
n++;
}
}
Console.WriteLine("{0:F6} {1}", (float)(total / n), name);
s = Console.ReadLine();
}
}
}
}
| mit |
morkt/GARbro | ArcFormats/Ivory/ArcSG.cs | 2865 | //! \file ArcSG.cs
//! \date 2018 Feb 01
//! \brief 'fSGX' multi-frame image container.
//
// Copyright (C) 2018 by morkt
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to
// deal in the Software without restriction, including without limitation the
// rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
// sell copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
// FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
// IN THE SOFTWARE.
//
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.IO;
namespace GameRes.Formats.Ivory
{
[Export(typeof(ArchiveFormat))]
public class SgOpener : ArchiveFormat
{
public override string Tag { get { return "SG/cOBJ"; } }
public override string Description { get { return "Ivory multi-frame image"; } }
public override uint Signature { get { return 0x58475366; } } // 'fSGX'
public override bool IsHierarchic { get { return false; } }
public override bool CanWrite { get { return false; } }
public override ArcFile TryOpen (ArcView file)
{
long offset = 8;
var base_name = Path.GetFileNameWithoutExtension (file.Name);
var dir = new List<Entry>();
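            // Walk the chain of 'cOBJ' chunks; every chunk that embeds an 'fSG ' image becomes an archive entry.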
while (offset < file.MaxOffset && file.View.AsciiEqual (offset, "cOBJ"))
{
uint obj_size = file.View.ReadUInt32 (offset+4);
if (0 == obj_size)
break;
if (file.View.AsciiEqual (offset+0x10, "fSG "))
{
var entry = new Entry {
Name = string.Format ("{0}#{1}", base_name, dir.Count),
Type = "image",
Offset = offset+0x10,
Size = file.View.ReadUInt32 (offset+0x14),
};
dir.Add (entry);
}
offset += obj_size;
}
if (0 == dir.Count)
return null;
return new ArcFile (file, this, dir);
}
}
}
| mit |
Avarea/Programming-Fundamentals | Lists/02TrackDownloader/Program.cs | 1263 | using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace track_downloader
{
class Program
{
static void Main(string[] args)
{
string[] blacklisted = Console.ReadLine().Split();
List<string> filenames = ReadFilenames();
filenames = filenames
.Where(a => ContainsBlackListed(a, blacklisted))
.OrderBy(a => a)
.ToList();
Console.WriteLine(string.Join("\r\n", filenames));
}
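        // Returns true when the filename contains none of the blacklisted words, i.e. the track should be kept.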
private static bool ContainsBlackListed(string a, string[] blacklisted)
{
for (int i = 0; i < blacklisted.Length; i++)
{
if (a.Contains(blacklisted[i]))
{
return false;
}
}
return true;
}
private static List<string> ReadFilenames()
{
List<string> output = new List<string>();
string filename = Console.ReadLine();
while (filename != "end")
{
output.Add(filename);
filename = Console.ReadLine();
}
return output;
}
}
} | mit |
logger-app/logger-app | src/containers/Group/Group.js | 545 | import React, { PropTypes } from 'react';
import { connect } from 'react-redux';
import GroupPage from './GroupPage.js';
import GroupNotFoundPage from './GroupNotFoundPage.js';
const Group = ({ isValid, groupId }) => (isValid ? <GroupPage groupId={groupId} /> : <GroupNotFoundPage groupId={groupId} />);
Group.propTypes = {
groupId: PropTypes.string,
isValid: PropTypes.bool.isRequired
};
export default connect(state => ({
groupId: state.router.params.groupId,
isValid: !!state.groups.data[state.router.params.groupId]
}))(Group);
| mit |
gravity00/SimplePersistence | Examples/SimplePersistence.Example.Console/SimplePersistence.Example.Console.UoW.EF/Migrations/Configuration.cs | 2232 | using System;
using System.Data.Entity;
using System.Data.Entity.Migrations;
using SimplePersistence.Example.Console.Models.Logging;
using SimplePersistence.Example.Console.UoW.EF.Mapping;
namespace SimplePersistence.Example.Console.UoW.EF.Migrations
{
public sealed class Configuration : DropCreateDatabaseIfModelChanges<ConsoleDbContext>//DbMigrationsConfiguration<ConsoleDbContext>
{
//public Configuration()
//{
// AutomaticMigrationsEnabled = false;
//}
protected override void Seed(ConsoleDbContext context)
{
context.Applications.AddOrUpdate(
e => e.Id,
new Application
{
Id = "SimplePersistence.Example.Console",
Description = "The SimplePersistence.Example.Console application",
CreatedOn = DateTime.Now,
CreatedBy = "seed.migration"
});
context.Levels.AddOrUpdate(
e => e.Id,
new Level
{
Id = "DEBUG",
Description = "Debug",
CreatedOn = DateTime.Now,
CreatedBy = "seed.migration"
},
new Level
{
Id = "Info",
Description = "Information",
CreatedOn = DateTime.Now,
CreatedBy = "seed.migration"
},
new Level
{
Id = "WARN",
Description = "Warning",
CreatedOn = DateTime.Now,
CreatedBy = "seed.migration"
},
new Level
{
Id = "ERROR",
Description = "Error",
CreatedOn = DateTime.Now,
CreatedBy = "seed.migration"
},
new Level
{
Id = "FATAL",
Description = "Fatal",
CreatedOn = DateTime.Now,
CreatedBy = "seed.migration"
});
}
}
}
| mit |
nyc-fiddler-crabs-2015/Velox | db/migrate/20150328200056_create_itineraries.rb | 176 | class CreateItineraries < ActiveRecord::Migration
def change
create_table :itineraries do |t|
t.references :user
t.timestamps null: false
end
end
end
| mit |
baranov1ch/node-vcdiff | src/vcdiff.h | 2632 | // node-vcdiff
// https://github.com/baranov1ch/node-vcdiff
//
// Copyright 2014 Alexey Baranov <[email protected]>
// Released under the MIT license
#ifndef VCDIFF_H_
#define VCDIFF_H_
#include <memory>
#include <string>
#include <node.h>
#include <node_object_wrap.h>
#include <uv.h>
#include <v8.h>
namespace open_vcdiff {
class OutputStringInterface;
}
class VcdCtx : public node::ObjectWrap {
public:
enum class Mode {
ENCODE,
DECODE,
};
enum class Error {
OK,
INIT_ERROR,
ENCODE_ERROR,
DECODE_ERROR,
};
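  // Interface implemented by the encoding and decoding back-ends; each call appends its output to the supplied string.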
class Coder {
public:
virtual Error Start(open_vcdiff::OutputStringInterface* out) = 0;
virtual Error Process(const char* data,
size_t len,
open_vcdiff::OutputStringInterface* out) = 0;
virtual Error Finish(open_vcdiff::OutputStringInterface* out) = 0;
virtual ~Coder() {}
};
VcdCtx(std::unique_ptr<Coder> coder);
virtual ~VcdCtx();
static void Init(v8::Handle<v8::Object> exports);
static v8::Persistent<v8::Function> constructor;
static void New(const v8::FunctionCallbackInfo<v8::Value>& args);
static void WriteAsync(const v8::FunctionCallbackInfo<v8::Value>& args);
static void WriteSync(const v8::FunctionCallbackInfo<v8::Value>& args);
static void Close(const v8::FunctionCallbackInfo<v8::Value>& args);
private:
enum class State {
IDLE,
PROCESSING,
FINALIZING,
DONE,
};
struct WorkData {
VcdCtx* ctx;
v8::Isolate* isolate;
const char* data;
size_t len;
bool is_last;
};
static void WriteInternal(
const v8::FunctionCallbackInfo<v8::Value>& args, bool async);
v8::Local<v8::Object> Write(
v8::Local<v8::Object> buffer, bool is_last, bool async);
v8::Local<v8::Array> FinishWrite(v8::Isolate* isolate);
void Process(const char* data, size_t len, bool is_last);
bool CheckError(v8::Isolate* isolate);
void SendError(v8::Isolate* isolate);
void Close();
void Reset();
bool HasError() const;
v8::Local<v8::Object> GetOutputBuffer(v8::Isolate* isolate);
static const char* GetErrorString(Error err);
static void ProcessShim(uv_work_t* work_req);
static void AfterShim(uv_work_t* work_req, int status);
std::unique_ptr<Coder> coder_;
v8::Persistent<v8::Object> in_buffer_; // hold reference when async
uv_work_t work_req_;
bool write_in_progress_ = false;
bool pending_close_ = false;
State state_ = State::IDLE;
Error err_ = Error::OK;
std::string output_buffer_;
VcdCtx(const VcdCtx& other) = delete;
VcdCtx& operator=(const VcdCtx& other) = delete;
};
#endif // VCDIFF_H_
| mit |
reimagined/resolve | packages/runtime/adapters/eventstore-adapters/eventstore-base/src/cursor-operations.ts | 3122 | import assert from 'assert'
import {
THREAD_COUNT,
CURSOR_BUFFER_SIZE,
THREAD_COUNTER_BYTE_LENGTH,
} from './constants'
import {
InputCursor,
StoredEventBatchPointer,
StoredEventPointer,
} from './types'
const checkThreadArrayLength = (threadArray: Array<number>): void => {
assert.strictEqual(
threadArray.length,
THREAD_COUNT,
'Cursor must be represented by array of 256 numbers'
)
}
export const initThreadArray = (): Array<number> => {
const threadCounters = new Array<number>(THREAD_COUNT)
threadCounters.fill(0)
return threadCounters
}
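// Pack the 256 per-thread counters into a fixed-size big-endian buffer and return it base64-encoded.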
export const threadArrayToCursor = (threadArray: Array<number>): string => {
checkThreadArrayLength(threadArray)
const cursorBuffer: Buffer = Buffer.alloc(CURSOR_BUFFER_SIZE)
for (let i = 0; i < threadArray.length; ++i) {
cursorBuffer.writeUIntBE(
threadArray[i],
i * THREAD_COUNTER_BYTE_LENGTH,
THREAD_COUNTER_BYTE_LENGTH
)
}
return cursorBuffer.toString('base64')
}
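// Decode a base64 cursor back into its 256 per-thread counters; a null cursor maps to all zeros.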
export const cursorToThreadArray = (cursor: InputCursor): Array<number> => {
if (cursor == null) return initThreadArray()
const cursorBuffer = Buffer.from(cursor, 'base64')
assert.strictEqual(
cursorBuffer.length,
CURSOR_BUFFER_SIZE,
'Wrong size of cursor buffer'
)
const threadCounters = new Array<number>(THREAD_COUNT)
for (let i = 0; i < cursorBuffer.length / THREAD_COUNTER_BYTE_LENGTH; i++) {
threadCounters[i] = cursorBuffer.readUIntBE(
i * THREAD_COUNTER_BYTE_LENGTH,
THREAD_COUNTER_BYTE_LENGTH
)
}
return threadCounters
}
export const emptyLoadEventsResult = (
cursor: InputCursor
): StoredEventBatchPointer => {
return {
cursor: cursor == null ? threadArrayToCursor(initThreadArray()) : cursor,
events: [],
}
}
const calculateMaxThreadArray = (
threadArrays: Array<Array<number>>
): Array<number> => {
const maxThreadArray = initThreadArray()
for (const threadArray of threadArrays) {
checkThreadArrayLength(threadArray)
for (let i = 0; i < THREAD_COUNT; ++i) {
maxThreadArray[i] = Math.max(maxThreadArray[i], threadArray[i])
}
}
return maxThreadArray
}
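// Returns true when the event/cursor pairs form a gapless continuation of startingCursor across all 256 threads.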
export const checkEventsContinuity = (
startingCursor: InputCursor,
eventCursorPairs: StoredEventPointer[]
): boolean => {
const startingThreadArray = cursorToThreadArray(startingCursor)
const tuples = eventCursorPairs.map(({ event, cursor }) => {
return {
event,
cursor,
threadArray: cursorToThreadArray(cursor),
}
})
for (let i = 0; i < tuples.length; ++i) {
assert.strictEqual(
tuples[i].event.threadCounter,
tuples[i].threadArray[tuples[i].event.threadId] - 1
)
if (
startingThreadArray[tuples[i].event.threadId] >
tuples[i].event.threadCounter
) {
return false
}
}
const maxThreadArray = calculateMaxThreadArray(
tuples.map((t) => t.threadArray)
)
for (const t of tuples) {
startingThreadArray[t.event.threadId]++
}
for (let i = 0; i < THREAD_COUNT; ++i) {
if (maxThreadArray[i] !== startingThreadArray[i]) {
return false
}
}
return true
}
| mit |
ibruton/ibruton.github.io | _posts/2015-8-30-Week-1-Review.md | 1577 | ---
layout: post
title: Week 1 Review
---
## Weekly Review (8/30/15)
It's Sunday afternoon on the 30th of August, and I've finally managed to set this Jekyll thing up, which means I can now talk about my time in the class this past week! Hooray!
Besides being the only class I have on Mondays, Wednesdays, and Fridays, Object Oriented Programming is great so far. Professor Downing structures the course so that the incoming student has a background in programming, but not necessarily in C++. This is great because I (as well as many other students in the course, I'd assume) have made it through Data Structures (CS 314, a prerequisite to the prerequisite of this class, CS 429), but have no background in the C++ language.
While I'm not the biggest fan of the teaching method Prof. Downing uses (calling on students at random to answer questions), the lectures he has given thus far are very interesting and in-depth, taking time to explain even the most minute things. I feel like this course will give students more than just knowledge of a new language at the end of the day; I think it'll give them a much more in-depth idea of how to approach programming problems, as well as a set of tools to use during the development process that they didn't even know they had.
## Tip of the Week
http://gitref.org/
Familiarizing yourself with Git and the commands associated with it will prove to be invaluable to you as you progress in your programming career. The above website is a bare-bones, to-the-point reference site that helps this process.
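As a rough sketch of what that reference covers (the repository URL and file name below are just placeholders, not from any real project), the day-to-day commands boil down to:
```
# clone an existing project and inspect its state
git clone https://github.com/example/project.git
cd project
git status
# stage a change, record it locally, then publish it
git add main.cpp
git commit -m "Fix off-by-one in loop bounds"
git push origin master
```
Each of these is explained in the first few pages of that site, so it makes a good companion while working through course projects.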
| mit |
jade-press/jade-press.org | gulpfile.js | 2290 |
'use strict'
let
ugly = require('gulp-uglify')
,gulp = require('gulp')
,watch = require('gulp-watch')
,plumber = require('gulp-plumber')
,newer = require('gulp-newer')
,stylus = require('gulp-stylus')
,jade = require('gulp-jade')
,concat = require('gulp-concat')
,rename = require('gulp-rename')
,runSequence = require('run-sequence')
,_ = require('lodash')
,path = require('path')
,fs = require('fs')
,spawn = require('cross-spawn')
let
cssFolder = __dirname + '/public/css'
,jsFolder = __dirname + '/public/js'
,views = __dirname + '/views'
,stylusOptions = {
compress: true
}
,uglyOptions = {
}
gulp.task('stylus', function() {
gulp.src(cssFolder + '/*.styl')
/*
.pipe(newer({
dest: cssFolder
,map: function(path) {
return path.replace(/\.styl$/, '.css')
}
}))
*/
.pipe(plumber())
.pipe(stylus(stylusOptions))
.pipe(gulp.dest(cssFolder))
})
gulp.task('ugly', function() {
gulp.src(jsFolder + '/*.js')
.pipe(newer({
dest: jsFolder
,map: function(path) {
return path.replace(/\.dev.js$/, '.min.js')
}
}))
.pipe(plumber())
.pipe(rename(function (path) {
path.basename = path.basename.replace('.dev', '.min')
}))
.pipe(gulp.dest(jsFolder))
.pipe(ugly(uglyOptions))
.pipe(gulp.dest(jsFolder))
})
let config = require('./config')
let pack = require('./package.json')
let packj = require('./node_modules/jade-press/package.json')
config.version = pack.version
config.siteDesc = packj.description
gulp.task('jade', function() {
gulp.src(views + '/*.jade')
.pipe(plumber())
.pipe(jade({
locals: config
}))
.pipe(gulp.dest(__dirname))
})
gulp.task('server', function (cb) {
var runner = spawn(
'node'
,['server']
,{
stdio: 'inherit'
}
)
runner.on('exit', function (code) {
process.exit(code)
})
runner.on('error', function (err) {
cb(err)
})
})
gulp.task('watch', function () {
runSequence('server')
watch([cssFolder + '/*.styl', cssFolder + '/parts/*.styl'], function() {
runSequence('stylus')
})
watch(jsFolder, function() {
runSequence('ugly')
})
watch([
views + '/*.jade'
,views + '/parts/*.jade'
], function() {
runSequence('jade')
}
)
})
gulp.task('default', ['watch'])
gulp.task('dist', function() {
runSequence('stylus', 'ugly', 'jade')
}) | mit |
justarrived/just-match-frontend | src/app/common/templates/company-job.html | 3011 | <section class="user-job-wrapper">
<user-job-header></user-job-header>
<div class="main-title primary" ng-show="!ctrl.hasInvoice && ctrl.showStatus">
<div class="job-owner-header">
<!-- Choose candidate -->
<div ng-show="!ctrl.accepted && !ctrl.will_perform">
<h2>{{'common.you_have' | translate}} {{job_user.data.length}} {{'company.assignments.applications' | translate}}</h2>
</div>
<div ng-show="!ctrl.accepted && !ctrl.will_perform && job_user.data.length > 0">
<a class="button" href="#{{routes.company.job_candidates.resolve(job_obj)}}">{{'company.assignments.candidates.select.title' | translate}}</a>
</div>
<!-- Wait user accepted -->
<div ng-show="ctrl.accepted && !ctrl.will_perform" ng-click="ctrl.gotoAcceptedCandidate()">
<h2>{{ctrl.user_apply.attributes["first-name"]}} {{'common.have' | translate}} {{ctrl.remainHours}}{{'common.hour' | translate}}
{{ctrl.remainMinutes}}{{'common.min' | translate}}<br/>
{{'assignment.status.user_application.time_remaining' | translate}}</h2>
</div>
<!-- Job ongoing -->
<div ng-show="ctrl.will_perform && !ctrl.canPerformed" ng-click="ctrl.gotoAcceptedCandidate()">
<h2>{{ctrl.user_apply.attributes["first-name"]}} {{'assignment.status.applicant_approved' | translate}}</h2>
</div>
<!-- Job performed confirmation -->
<div ng-show="(ctrl.will_perform || ctrl.performed) && ctrl.canPerformed">
<h2>{{'assignment.is_approved' | translate}}</h2>
</div>
<div class="buttons" ng-show="(ctrl.will_perform || ctrl.performed) && ctrl.canPerformed">
<a class="button small" href="" ng-click="ctrl.ownerCancelPerformed()">{{'common.no' | translate}}</a>
<a class="button small" href="" ng-click="modalPerformShow=true; isPerformed=false;">{{'common.yes' | translate}}</a>
</div>
</div>
</div>
<div class="main-content" ng-hide="userModalPerformShow">
<ul>
<li class="select">
<a href="#{{routes.company.job_comments.resolve(job_obj)}}">
<span>{{'user.assignments.comments' | translate}}</span><br/>
{{comments_amt}} {{'user.assignments.comment_count' | translate}}
</a>
</li>
<li class="select" ng-repeat="chat in ctrl.userChats.data">
<a href="" ng-click="ctrl.gotoChat(chat['job-users'].id,chat.id);"><span>{{'user.assignments.chat' | translate}}</span><br>{{'user.assignments.conversation' | translate}} {{chat.users.attributes["first-name"]}} {{chat.users.attributes["last-name"]}}</a>
</li>
</ul>
</div>
<!-- OWNER modal form performed to create invoice -->
<company-job-perform></company-job-perform>
</section> | mit |
drm343/MyWorld | docs/search/all_3.html | 1020 | <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html><head><title></title>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta name="generator" content="Doxygen 1.8.13"/>
<link rel="stylesheet" type="text/css" href="search.css"/>
<script type="text/javascript" src="all_3.js"></script>
<script type="text/javascript" src="search.js"></script>
</head>
<body class="SRPage">
<div id="SRIndex">
<div class="SRStatus" id="Loading">載入中...</div>
<div id="SRResults"></div>
<script type="text/javascript"><!--
createResults();
--></script>
<div class="SRStatus" id="Searching">搜尋中...</div>
<div class="SRStatus" id="NoMatches">無符合項目</div>
<script type="text/javascript"><!--
document.getElementById("Loading").style.display="none";
document.getElementById("NoMatches").style.display="none";
var searchResults = new SearchResults("searchResults");
searchResults.Search();
--></script>
</div>
</body>
</html>
| mit |
esimkowitz/Slide-Summarizer | web/public/views/oauth2callback.php | 998 | <?php
require_once __DIR__.'/../../../vendor/autoload.php';
require 'templates/base.php';
session_start();
$client = new Google_Client();
$client->setApplicationName("Slide-Summarizer");
if ($credentials_file = getOAuthCredentialsFile()) {
// set the location manually
$client->setAuthConfig($credentials_file);
$credentials_json = json_decode(file_get_contents($credentials_file));
}
else {
echo missingServiceAccountDetailsWarning();
return;
}
$client->setRedirectUri('https://' . $_SERVER['HTTP_HOST'] . '/oauth2callback');
$client->addScope(Google_Service_Drive::DRIVE_READONLY);
if (! isset($_GET['code'])) {
$auth_url = $client->createAuthUrl();
header('Location: ' . filter_var($auth_url, FILTER_SANITIZE_URL));
} else {
$client->authenticate($_GET['code']);
$_SESSION['access_token'] = $client->getAccessToken();
$redirect_uri = 'https://' . $_SERVER['HTTP_HOST'] . '/';
header('Location: ' . filter_var($redirect_uri, FILTER_SANITIZE_URL));
} | mit |
ymherklotz/YAGE | yage/physics/particlebody.cpp | 1426 | /** ---------------------------------------------------------------------------
* -*- c++ -*-
* @file: particlebody.cpp
*
* Copyright (c) 2017 Yann Herklotz Grave <[email protected]>
* MIT License, see LICENSE file for more details.
* ----------------------------------------------------------------------------
*/
#include <yage/physics/particlebody.h>
#include <cmath>
namespace yage
{
ParticleBody::ParticleBody(const Vector2d &position, double mass,
const Vector2d &velocity, bool gravity)
: Body(position, mass, velocity, gravity)
{
}
void ParticleBody::applyForce(const Vector2d &force)
{
force_ += force;
}
void ParticleBody::update()
{
// set the time_step for 60fps
double time_step = 1.0 / 60.0;
// set the last acceleration
Vector2d last_acceleration = acceleration_;
// update the position of the body
position_ += velocity_ * time_step +
(0.5 * last_acceleration * std::pow(time_step, 2));
// update the acceleration
if (gravity_) {
acceleration_ =
Vector2d(force_.x() / mass_, (GRAVITY + force_.y()) / mass_);
} else {
acceleration_ = Vector2d(force_.x() / mass_, force_.y() / mass_);
}
Vector2d avg_acceleration = (acceleration_ + last_acceleration) / 2.0;
// update the velocity of the body
velocity_ += avg_acceleration * time_step;
}
} // namespace yage
| mit |
hustwyk/trainTicket | GeneratedFiles/ui_modifystationdlg.h | 3909 | /********************************************************************************
** Form generated from reading UI file 'modifystationdlg.ui'
**
** Created by: Qt User Interface Compiler version 5.7.0
**
** WARNING! All changes made in this file will be lost when recompiling UI file!
********************************************************************************/
#ifndef UI_MODIFYSTATIONDLG_H
#define UI_MODIFYSTATIONDLG_H
#include <QtCore/QVariant>
#include <QtWidgets/QAction>
#include <QtWidgets/QApplication>
#include <QtWidgets/QButtonGroup>
#include <QtWidgets/QDialog>
#include <QtWidgets/QGridLayout>
#include <QtWidgets/QGroupBox>
#include <QtWidgets/QHBoxLayout>
#include <QtWidgets/QHeaderView>
#include <QtWidgets/QLabel>
#include <QtWidgets/QLineEdit>
#include <QtWidgets/QPushButton>
#include <QtWidgets/QSpacerItem>
#include <QtWidgets/QVBoxLayout>
QT_BEGIN_NAMESPACE
class Ui_ModifyStationDlg
{
public:
QGridLayout *gridLayout_2;
QGroupBox *groupBox;
QGridLayout *gridLayout;
QVBoxLayout *verticalLayout;
QHBoxLayout *horizontalLayout;
QLabel *label;
QLineEdit *lineEdit;
QHBoxLayout *horizontalLayout_2;
QSpacerItem *horizontalSpacer;
QPushButton *pushButton;
void setupUi(QDialog *ModifyStationDlg)
{
if (ModifyStationDlg->objectName().isEmpty())
ModifyStationDlg->setObjectName(QStringLiteral("ModifyStationDlg"));
ModifyStationDlg->resize(311, 111);
gridLayout_2 = new QGridLayout(ModifyStationDlg);
gridLayout_2->setObjectName(QStringLiteral("gridLayout_2"));
groupBox = new QGroupBox(ModifyStationDlg);
groupBox->setObjectName(QStringLiteral("groupBox"));
gridLayout = new QGridLayout(groupBox);
gridLayout->setObjectName(QStringLiteral("gridLayout"));
verticalLayout = new QVBoxLayout();
verticalLayout->setObjectName(QStringLiteral("verticalLayout"));
horizontalLayout = new QHBoxLayout();
horizontalLayout->setObjectName(QStringLiteral("horizontalLayout"));
label = new QLabel(groupBox);
label->setObjectName(QStringLiteral("label"));
label->setAlignment(Qt::AlignRight|Qt::AlignTrailing|Qt::AlignVCenter);
horizontalLayout->addWidget(label);
lineEdit = new QLineEdit(groupBox);
lineEdit->setObjectName(QStringLiteral("lineEdit"));
horizontalLayout->addWidget(lineEdit);
verticalLayout->addLayout(horizontalLayout);
horizontalLayout_2 = new QHBoxLayout();
horizontalLayout_2->setObjectName(QStringLiteral("horizontalLayout_2"));
horizontalSpacer = new QSpacerItem(40, 20, QSizePolicy::Expanding, QSizePolicy::Minimum);
horizontalLayout_2->addItem(horizontalSpacer);
pushButton = new QPushButton(groupBox);
pushButton->setObjectName(QStringLiteral("pushButton"));
horizontalLayout_2->addWidget(pushButton);
verticalLayout->addLayout(horizontalLayout_2);
gridLayout->addLayout(verticalLayout, 0, 0, 1, 1);
gridLayout_2->addWidget(groupBox, 0, 0, 1, 1);
retranslateUi(ModifyStationDlg);
QMetaObject::connectSlotsByName(ModifyStationDlg);
} // setupUi
void retranslateUi(QDialog *ModifyStationDlg)
{
ModifyStationDlg->setWindowTitle(QApplication::translate("ModifyStationDlg", "ModifyStation", 0));
groupBox->setTitle(QApplication::translate("ModifyStationDlg", "GroupBox", 0));
label->setText(QApplication::translate("ModifyStationDlg", "\350\257\267\350\276\223\345\205\245\344\277\256\346\224\271\345\200\274\357\274\232", 0));
pushButton->setText(QApplication::translate("ModifyStationDlg", "\346\217\220\344\272\244", 0));
} // retranslateUi
};
namespace Ui {
class ModifyStationDlg: public Ui_ModifyStationDlg {};
} // namespace Ui
QT_END_NAMESPACE
#endif // UI_MODIFYSTATIONDLG_H
| mit |
chriske/nytimes_api_demo | app/src/main/java/hu/autsoft/nytimes/exception/OkHttpException.java | 172 | package hu.autsoft.nytimes.exception;
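/**
 * Unchecked wrapper around low-level OkHttp/IO failures, so callers can propagate
 * network errors without declaring checked exceptions.
 */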
public class OkHttpException extends RuntimeException {
public OkHttpException(Throwable cause) {
super(cause);
}
}
| mit |
awto/effectfuljs | packages/serialization/test/main.js | 27905 | const Lib = require("../src/main");
const assert = require("assert");
describe("plain object output", function () {
context("for `JSON.stringify` serializable objects", function () {
it("should have resembling structure", function () {
const obj1 = { a: 1, b: "b", c: true };
const res = Lib.write(obj1);
const exp1 = {
f: [
["a", 1],
["b", "b"],
["c", true]
]
};
assert.deepStrictEqual(res, exp1);
assert.equal(Lib.stringify(obj1), JSON.stringify(res));
const obj2 = { obj1 };
const exp2 = { f: [["obj1", exp1]] };
assert.deepStrictEqual(Lib.write(obj2), exp2);
assert.equal(Lib.stringify(obj2), JSON.stringify(exp2));
});
context("with `alwaysByRef: true`", function () {
it("should use references environment", function () {
const obj1 = { a: { a: 1 }, b: { b: "b" }, c: { c: true } };
const res = Lib.write(obj1, { alwaysByRef: true });
const exp1 = {
r: 3,
x: [
{ f: [["c", true]] },
{ f: [["b", "b"]] },
{ f: [["a", 1]] },
{
f: [
["a", { r: 2 }],
["b", { r: 1 }],
["c", { r: 0 }]
]
}
]
};
assert.deepStrictEqual(res, exp1);
const obj2 = Lib.read(res);
assert.notStrictEqual(obj1, obj2);
assert.deepStrictEqual(obj1, obj2);
});
});
});
it("should have correct format for shared values", function () {
const root = { val: "hi" };
root.rec1 = { obj1: root, obj2: root, obj3: { obj: root } };
root.rec2 = root;
assert.deepStrictEqual(Lib.write(root), {
r: 0,
x: [
{
f: [
["val", "hi"],
[
"rec1",
{
f: [
["obj1", { r: 0 }],
["obj2", { r: 0 }],
["obj3", { f: [["obj", { r: 0 }]] }]
]
}
],
["rec2", { r: 0 }]
]
}
]
});
});
});
describe("special values", function () {
it("should correctly restore them", function () {
const root = { undef: undefined, nul: null, nan: NaN };
const res = Lib.write(root);
assert.deepStrictEqual(res, {
f: [
["undef", { $: "undefined" }],
["nul", null],
["nan", { $: "NaN" }]
]
});
const { undef, nul, nan } = Lib.read(res);
assert.strictEqual(undef, undefined);
assert.strictEqual(nul, null);
assert.ok(typeof nan === "number" && Number.isNaN(nan));
});
});
describe("reading plain object", function () {
it("should correctly assign shared values", function () {
const obj = Lib.read({
r: 0,
x: [
{
f: [
["val", "hi"],
[
"rec1",
{
f: [
["obj1", { r: 0 }],
["obj2", { r: 0 }],
["obj3", { f: [["obj", { r: 0 }]] }]
]
}
],
["rec2", { r: 0 }]
]
}
]
});
assert.strictEqual(Object.keys(obj).sort().join(), "rec1,rec2,val");
assert.strictEqual(obj.val, "hi");
assert.strictEqual(Object.keys(obj.rec1).sort().join(), "obj1,obj2,obj3");
assert.strictEqual(Object.keys(obj.rec1.obj3).sort().join(), "obj");
assert.strictEqual(obj.rec2, obj);
assert.strictEqual(obj.rec1.obj1, obj);
assert.strictEqual(obj.rec1.obj2, obj);
assert.strictEqual(obj.rec1.obj3.obj, obj);
});
});
describe("object with parent", function () {
function MyObj() {
this.a = 1;
this.b = "b";
this.c = true;
}
Lib.regConstructor(MyObj);
it("should output `$` attribute", function () {
const obj1 = new MyObj();
assert.deepEqual(Lib.write(obj1), {
$: "MyObj",
f: [
["a", 1],
["b", "b"],
["c", true]
]
});
function Object() {
this.a = obj1;
}
Lib.regConstructor(Object);
assert.deepEqual(Lib.write(new Object()), {
$: "Object_1",
f: [
[
"a",
{
f: [
["a", 1],
["b", "b"],
["c", true]
],
$: "MyObj"
}
]
]
});
});
it("should use `$` attribute to resolve a type on read", function () {
const obj1 = Lib.read({
$: "MyObj",
f: [
["a", 1],
["b", "b"],
["c", true]
]
});
assert.strictEqual(obj1.constructor, MyObj);
assert.equal(Object.keys(obj1).sort().join(), "a,b,c");
assert.strictEqual(obj1.a, 1);
assert.strictEqual(obj1.b, "b");
assert.strictEqual(obj1.c, true);
});
context("for shared values", function () {
function Obj2() {}
Lib.regConstructor(Obj2);
it("should write shared values in `shared` map", function () {
const root = new Obj2();
root.l1 = new Obj2();
root.l1.back = root.l1;
assert.deepStrictEqual(Lib.write(root), {
f: [["l1", { r: 0 }]],
$: "Obj2",
x: [{ f: [["back", { r: 0 }]], $: "Obj2" }]
});
});
it("should use `#shared` keys to resolve prototypes on read", function () {
const obj1 = Lib.read({
f: [["l1", { r: 0 }]],
$: "Obj2",
x: [{ f: [["back", { r: 0 }]], $: "Obj2" }]
});
assert.strictEqual(obj1.constructor, Obj2);
assert.deepEqual(Object.keys(obj1), ["l1"]);
assert.deepEqual(Object.keys(obj1.l1), ["back"]);
assert.strictEqual(obj1.l1.constructor, Obj2);
assert.strictEqual(obj1.l1.back, obj1.l1);
});
});
});
describe("prototypes chain", function () {
it("should correctly store and recover all references", function () {
class C1 {
constructor(p) {
this.p1 = p;
}
}
class C2 extends C1 {
constructor() {
super("A");
this.c1 = new C1(this);
}
}
Lib.regOpaqueObject(C1.prototype, "C1");
Lib.regOpaqueObject(C2.prototype, "C2");
const obj = new C2();
C1.prototype.p_prop_1 = "prop_1";
const res = Lib.write(obj);
C1.prototype.p_prop_1 = "changed";
assert.deepEqual(res, {
r: 0,
x: [
{
p: { $: "C2" },
f: [
["p1", "A"],
[
"c1",
{
p: { $: "C1", f: [["p_prop_1", "prop_1"]] },
f: [["p1", { r: 0 }]]
}
]
]
}
]
});
const r2 = Lib.read(res);
assert.ok(r2 instanceof C1);
assert.ok(r2 instanceof C2);
assert.strictEqual(r2.constructor, C2);
assert.strictEqual(Object.getPrototypeOf(r2).constructor, C2);
assert.strictEqual(r2.c1.constructor, C1);
assert.strictEqual(r2.c1.p1, r2);
assert.equal(r2.p1, "A");
assert.strictEqual(C1.prototype.p_prop_1, "prop_1");
class C3 {
constructor(val) {
this.a = val;
}
}
Lib.regOpaqueObject(C3.prototype, "C3", { props: false });
class C4 extends C3 {
constructor() {
super("A");
this.b = "B";
}
}
Lib.regOpaqueObject(C4.prototype, "C4");
const obj2 = new C4();
const res2 = Lib.write(obj2);
assert.deepEqual(res2, {
p: {
$: "C4"
},
f: [
["a", "A"],
["b", "B"]
]
});
const obj3 = Lib.read(res2);
assert.ok(obj3 instanceof C3);
assert.ok(obj3 instanceof C4);
assert.equal(obj3.a, "A");
assert.equal(obj3.b, "B");
assert.equal(
Object.getPrototypeOf(Object.getPrototypeOf(obj3)),
Object.getPrototypeOf(Object.getPrototypeOf(obj2))
);
});
});
describe("property's descriptor", function () {
it("should correctly store and recover all settings", function () {
const a = {};
let setCalled = 0;
let getCalled = 0;
let back;
let val;
const descr = {
set(value) {
assert.strictEqual(this, back);
setCalled++;
val = value;
},
get() {
assert.strictEqual(this, back);
getCalled++;
return a;
}
};
Object.defineProperty(a, "prop", descr);
const psym1 = Symbol("prop");
const psym2 = Symbol("prop");
Object.defineProperty(a, psym1, {
value: "B",
enumerable: true
});
Object.defineProperty(a, psym2, {
value: "C",
configurable: true
});
Object.defineProperty(a, Symbol.for("prop"), {
value: "D",
writable: true
});
Lib.regOpaqueObject(descr.set, "dset");
Lib.regOpaqueObject(descr.get, "dget");
const opts = { symsByName: new Map() };
const res = Lib.write(a, opts);
assert.deepEqual(res, {
f: [
["prop", null, 15, { $: "dget" }, { $: "dset" }],
[{ name: "prop" }, "B", 5],
[{ name: "prop", id: 1 }, "C", 6],
[{ key: "prop" }, "D", 3]
]
});
back = Lib.read(res, opts);
assert.deepEqual(Object.getOwnPropertySymbols(back), [
psym1,
psym2,
Symbol.for("prop")
]);
assert.strictEqual(setCalled, 0);
assert.strictEqual(getCalled, 0);
back.prop = "A";
assert.strictEqual(setCalled, 1);
assert.strictEqual(getCalled, 0);
assert.strictEqual(val, "A");
assert.strictEqual(back.prop, a);
assert.strictEqual(
Object.getOwnPropertyDescriptor(back, Symbol("prop")),
void 0
);
assert.deepEqual(Object.getOwnPropertyDescriptor(back, "prop"), {
enumerable: false,
configurable: false,
...descr
});
assert.deepEqual(Object.getOwnPropertyDescriptor(back, psym1), {
value: "B",
writable: false,
enumerable: true,
configurable: false
});
assert.deepEqual(Object.getOwnPropertyDescriptor(back, psym2), {
value: "C",
writable: false,
enumerable: false,
configurable: true
});
assert.deepEqual(
Object.getOwnPropertyDescriptor(back, Symbol.for("prop")),
{
value: "D",
writable: true,
enumerable: false,
configurable: false
}
);
});
});
describe("arrays serialization", function () {
context("without shared references", function () {
it("should be similar to `JSON.stringify`/`JSON.parse`", function () {
const obj = { arr: [1, "a", [true, [false, null]], undefined] };
const res = Lib.write(obj);
assert.deepStrictEqual(res, {
f: [["arr", [1, "a", [true, [false, null]], { $: "undefined" }]]]
});
const back = Lib.read(res);
assert.deepStrictEqual(obj, back);
});
it("doesn't support Array as root", function () {
assert.throws(() => Lib.write([1, 2]), TypeError);
});
});
it("should handle shared references", function () {
const obj = { arr: [1, "a", [true, [false, null]], undefined] };
obj.arr.push(obj.arr);
const res = Lib.write(obj);
assert.notStrictEqual(res, obj);
assert.deepStrictEqual(res, {
f: [["arr", { r: 0 }]],
x: [[1, "a", [true, [false, null]], { $: "undefined" }, { r: 0 }]]
});
const back = Lib.read(res);
assert.notStrictEqual(res, back);
assert.deepStrictEqual(obj, back);
});
});
describe("`Set` serialization", function () {
context("without shared references", function () {
it("should output `JSON.stringify` serializable object", function () {
const arr = [1, "a", [true, [false, null]], undefined];
const obj = { set: new Set(arr) };
obj.set.someNum = 100;
obj.set.self = obj.set;
const res = Lib.write(obj);
assert.deepStrictEqual(res, {
f: [["set", { r: 0 }]],
x: [
{
$: "Set",
l: [1, "a", [true, [false, null]], { $: "undefined" }],
f: [
["someNum", 100],
["self", { r: 0 }]
]
}
]
});
const back = Lib.read(res);
assert.deepStrictEqual(obj, back);
});
});
it("should handle shared references", function () {
const obj = new Set([1, "a", [true, [false, null]], undefined]);
obj.add(obj);
const res = Lib.write(obj);
assert.notStrictEqual(res, obj);
assert.deepStrictEqual(res, {
r: 0,
x: [
{
$: "Set",
l: [1, "a", [true, [false, null]], { $: "undefined" }, { r: 0 }]
}
]
});
const back = Lib.read(res);
assert.notStrictEqual(res, back);
assert.deepStrictEqual(obj, back);
});
});
describe("`Map` serialization", function () {
context("without shared references", function () {
it("should output `JSON.stringify` serializable object", function () {
const arr = [[1, "a"], [true, [false, null]], [undefined]];
const obj = { map: new Map(arr) };
const res = Lib.write(obj);
assert.deepStrictEqual(res, {
f: [
[
"map",
{
$: "Map",
k: [1, true, { $: "undefined" }],
v: ["a", [false, null], { $: "undefined" }]
}
]
]
});
const back = Lib.read(res);
assert.deepStrictEqual(obj, back);
});
});
it("should handle shared references", function () {
const obj = new Map([[1, "a"], [true, [false, null]], [undefined]]);
obj.set(obj, obj);
const res = Lib.write(obj);
assert.notStrictEqual(res, obj);
assert.deepStrictEqual(res, {
r: 0,
x: [
{
$: "Map",
k: [1, true, { $: "undefined" }, { r: 0 }],
v: ["a", [false, null], { $: "undefined" }, { r: 0 }]
}
]
});
const back = Lib.read(res);
assert.notStrictEqual(res, back);
assert.deepStrictEqual(obj, back);
});
});
describe("opaque objects serialization", function () {
it("should throw for not registered objects", function () {
function a() {}
assert.throws(() => Lib.write({ a }), TypeError);
});
it("should not throw if `ignore:true`", function () {
function a() {}
assert.deepStrictEqual(Lib.write({ a }, { ignore: true }), {});
});
it("should output object's name if registered", function () {
function a() {}
Lib.regOpaqueObject(a);
assert.deepStrictEqual(Lib.write({ a }), { f: [["a", { $: "a" }]] });
Lib.regOpaqueObject(a);
assert.deepStrictEqual(Lib.read({ f: [["a", { $: "a" }]] }), { a });
(function () {
function a() {}
Lib.regOpaqueObject(a);
assert.deepStrictEqual(Lib.write({ a }), { f: [["a", { $: "a_1" }]] });
assert.deepStrictEqual(Lib.read({ f: [["a", { $: "a_1" }]] }), { a });
})();
});
it("should not serialize properties specified before its registration", function () {
const obj = {
prop1: "p1",
[Symbol.for("sym#a")]: "s1",
[Symbol.for("sym#b")]: "s2",
prop2: "p2",
[3]: "N3",
[4]: "N4"
};
Lib.regOpaqueObject(obj, "A");
obj.prop1 = "P2";
obj.prop3 = "p3";
obj[Symbol.for("sym#a")] = "S1";
obj[Symbol.for("sym#c")] = "s3";
obj[4] = "n4";
obj[5] = "n5";
assert.deepStrictEqual(Lib.write({ obj }), {
f: [
[
"obj",
{
$: "A",
f: [
["4", "n4"],
["5", "n5"],
["prop1", "P2"],
["prop3", "p3"],
[
{
key: "sym#a"
},
"S1"
],
[
{
key: "sym#c"
},
"s3"
]
]
}
]
]
});
});
});
describe("opaque primitive value serialization", function () {
it("should output object's name if registered", function () {
const a = Symbol("a");
Lib.regOpaquePrim(a, "sa");
assert.ok(!a[Lib.descriptorSymbol]);
assert.deepStrictEqual(Lib.write({ a }), {
f: [["a", { $: "sa" }]]
});
Lib.regOpaquePrim(a, "sb");
assert.deepStrictEqual(Lib.read({ f: [["a", { $: "sa" }]] }), {
a
});
(function () {
const a = Symbol("a");
Lib.regOpaquePrim(a, "sa");
assert.deepStrictEqual(Lib.write({ a }), {
f: [["a", { $: "sa_1" }]]
});
assert.deepStrictEqual(Lib.read({ f: [["a", { $: "sa_1" }]] }), { a });
})();
});
});
describe("Symbols serialization", function () {
it("should keep values", function () {
const a1 = Symbol("a");
const a2 = Symbol("a");
const b = Symbol("b");
const g = Symbol.for("g");
const opts = { symsByName: new Map() };
const res = Lib.write({ a1, a2, b1: b, b2: b, g }, opts);
assert.deepStrictEqual(res, {
f: [
["a1", { name: "a", $: "Symbol" }],
["a2", { name: "a", id: 1, $: "Symbol" }],
["b1", { name: "b", $: "Symbol" }],
["b2", { name: "b", $: "Symbol" }],
["g", { key: "g", $: "Symbol" }]
]
});
const { a1: ra1, a2: ra2, b1: rb1, b2: rb2, g: rg } = Lib.read(res, opts);
assert.strictEqual(a1, ra1);
assert.strictEqual(a2, ra2);
assert.strictEqual(b, rb1);
assert.strictEqual(b, rb2);
assert.strictEqual(rg, g);
const { a1: la1, a2: la2, b1: lb1, b2: lb2, g: lg } = Lib.read(res, {
ignore: true
});
assert.notStrictEqual(a1, la1);
assert.notStrictEqual(a2, la2);
assert.notStrictEqual(lb1, b);
assert.notStrictEqual(lb2, b);
assert.strictEqual(lg, g);
assert.strictEqual(lb1, lb2);
assert.equal(String(la1), "Symbol(a)");
assert.equal(String(la2), "Symbol(a)");
assert.equal(String(lb1), "Symbol(b)");
assert.equal(String(lb2), "Symbol(b)");
});
});
describe("type with `$$typeof` attribute", function () {
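// Registers a custom descriptor keyed by the value's $$typeof tag (100), so such
// values are written as {$: "hundred"} and reconstructed by the overridden read().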
Lib.regDescriptor({
name: "hundred",
typeofTag: 100,
read(ctx, json) {
return { $$typeof: 100 };
},
write(ctx, value) {
return { $: "hundred" };
},
props: false
});
it("should use overriden methods", function () {
assert.deepStrictEqual(Lib.write({ $$typeof: 100 }), {
$: "hundred"
});
assert.deepStrictEqual(Lib.read({ $: "hundred" }), { $$typeof: 100 });
});
});
describe("bind function arguments", function () {
it("should be serializable", function () {
const obj = {};
function f1(a1, a2, a3) {
return [this, a1, a2, a3];
}
const a1 = {},
a2 = {},
a3 = {};
Lib.regOpaqueObject(obj, "obj");
Lib.regOpaqueObject(f1);
Lib.regOpaqueObject(a1, "arg");
Lib.regOpaqueObject(a2, "arg");
const bind = Lib.bind(f1, obj, a1, a2);
bind.someNum = 100;
bind.rec = bind;
const fjson = Lib.write({ f: bind });
assert.deepStrictEqual(fjson, {
f: [
[
"f",
{
r: 0
}
]
],
x: [
{
f: [
["someNum", 100],
[
"rec",
{
r: 0
}
],
[
{
$: "#this"
},
{
$: "obj"
}
],
[
{
$: "#fun"
},
{
$: "f1"
}
],
[
{
$: "#args"
},
[
{
$: "arg"
},
{
$: "arg_1"
}
]
]
],
$: "Bind"
}
]
});
const f2 = Lib.read(fjson).f;
assert.notStrictEqual(f1, f2);
const res = f2(a3);
assert.strictEqual(res.length, 4);
const [robj, ra1, ra2, ra3] = res;
assert.strictEqual(obj, robj);
assert.strictEqual(a1, ra1);
assert.strictEqual(a2, ra2);
assert.strictEqual(a3, ra3);
});
});
describe("RegExp", function () {
it("should be serializable", function () {
const re1 = /\w+/;
const re2 = /ho/g;
const s1 = "uho-ho-ho";
re2.test(s1);
const res = Lib.write({ re1, re2 });
assert.deepEqual(res, {
f: [
["re1", { src: "\\w+", flags: "", $: "RegExp" }],
["re2", { src: "ho", flags: "g", last: 3, $: "RegExp" }]
]
});
const { re1: bre1, re2: bre2 } = Lib.read(res);
assert.equal(re1.src, bre1.src);
assert.equal(re1.flags, bre1.flags);
assert.equal(re1.lastIndex, bre1.lastIndex);
assert.equal(re2.src, bre2.src);
assert.equal(re2.flags, bre2.flags);
assert.equal(re2.lastIndex, bre2.lastIndex);
});
});
describe("not serializable values", function () {
it("should throw an exception if `ignore:falsy`", function () {
function A() {}
try {
Lib.write({ A });
} catch (e) {
assert.equal(e.constructor, TypeError);
assert.equal(
e.message,
`not serializable value "function A() {}" at "1"(A) of "A"`
);
return;
}
assert.fail("should throw");
});
it("should be ignored if `ignore:true`", function () {
function A() {}
const d = Lib.write({ A }, { ignore: true });
const r = Lib.read(d);
assert.deepEqual(r, {});
});
it('should register an opaque descriptor `ignore:"opaque"`', function () {
function A() {}
const d = Lib.write({ A, b: A }, { ignore: "opaque" });
const r = Lib.read(d);
assert.deepEqual(r, { A, b: A });
});
it("should register an opaque descriptor with auto-opaque descriptor", function () {
function A() {}
Lib.regAutoOpaqueConstr(A, true);
const a = new A();
const d = Lib.write({ a, b: a }, { ignore: "opaque" });
const r = Lib.read(d);
assert.deepEqual(r, { a, b: a });
});
it('should be converted into a not usable placeholder if `ignore:"placeholder"`', function () {
function A() {}
const d = Lib.write({ A }, { ignore: "placeholder" });
const r = Lib.read(d);
try {
r.A();
} catch (e) {
assert.equal(e.constructor, TypeError);
assert.equal(e.message, "apply in a not restored object");
return;
}
assert.fail("should throw");
});
});
describe("TypedArray", function () {
it("should be serializable", function () {
const arr1 = new Int32Array([1, 2, 3, 4, 5]);
const arr2 = new Uint32Array(arr1.buffer, 8);
const d = Lib.write({ arr1, arr2 }, {});
assert.deepStrictEqual(d, {
f: [
[
"arr1",
{
o: 0,
l: 5,
b: {
r: 0
},
$: "Int32Array"
}
],
[
"arr2",
{
o: 8,
l: 3,
b: {
r: 0
},
$: "Uint32Array"
}
]
],
x: [
{
d: "AQAAAAIAAAADAAAABAAAAAUAAAA=",
$: "ArrayBuffer"
}
]
});
const { arr1: rarr1, arr2: rarr2 } = Lib.read(d);
assert.equal(rarr1.constructor, Int32Array);
assert.equal(rarr2.constructor, Uint32Array);
assert.notStrictEqual(arr1, rarr1);
assert.notStrictEqual(arr2, rarr2);
assert.deepStrictEqual(arr1, rarr1);
assert.deepStrictEqual(arr2, rarr2);
});
});
describe("WeakSet/WeakMap", function () {
it("should be serializable", function () {
const set = new WeakSet();
const map = new WeakMap();
const map2 = new WeakMap();
const obj1 = {};
const obj2 = {};
Lib.regOpaqueObject(obj1, "w#obj1");
set.add(obj1).add(obj2);
map.set(obj1, "obj1").set(obj2, "obj2");
map2.set(obj1, "2obj1");
assert.ok(set.has(obj1));
assert.ok(map.has(obj1));
assert.strictEqual(map.get(obj1), "obj1");
assert.strictEqual(map.get({}), void 0);
const d = Lib.write({ set, map, map2, obj1, obj2 });
assert.deepStrictEqual(d, {
x: [{}, { $: "w#obj1" }],
f: [
["set", { v: [{ r: 0 }, { r: 1 }], $: "WeakSet" }],
[
"map",
{
k: [{ r: 1 }, { r: 0 }],
v: ["obj1", "obj2"],
$: "WeakMap"
}
],
[
"map2",
{
k: [{ r: 1 }],
v: ["2obj1"],
$: "WeakMap"
}
],
["obj1", { r: 1 }],
["obj2", { r: 0 }]
]
});
const {
set: rset,
map: rmap,
map2: rmap2,
obj1: robj1,
obj2: robj2
} = Lib.read(d);
assert.strictEqual(robj1, obj1);
assert.notStrictEqual(robj2, obj2);
assert.ok(rset.has(obj1));
assert.ok(set.delete(obj1));
assert.ok(!set.has(obj1));
assert.ok(rset.has(obj1));
assert.ok(rset.has(robj2));
assert.ok(!rset.has(obj2));
assert.ok(!set.has(robj2));
assert.strictEqual(rmap.get(obj1), "obj1");
assert.strictEqual(rmap2.get(obj1), "2obj1");
assert.ok(map.delete(obj1));
assert.strictEqual(rmap.get(obj1), "obj1");
assert.ok(rset.delete(robj2));
assert.ok(!rset.has(robj2));
assert.ok(!rset.delete(robj2));
assert.ok(!rset.has(robj2));
});
});
describe("WeakSet/WeakMap workaround", function () {
it("should be serializable", function () {
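// Unlike native WeakSet/WeakMap, the workaround classes keep a per-collection
// symbol in their "prop" field and record membership/values as that symbol's
// property on the member objects themselves, which is what makes them
// serializable (see the expected "@effectful/weakset"/"@effectful/weakmap"
// property entries below).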
const set = new Lib.WeakSetWorkaround();
const map = new Lib.WeakMapWorkaround();
const map2 = new Lib.WeakMapWorkaround();
const obj1 = {};
const obj2 = {};
Lib.regOpaqueObject(obj1, "w##obj1");
set.add(obj1).add(obj2);
map.set(obj1, "obj1").set(obj2, "obj2");
map2.set(obj1, "2obj1");
assert.ok(set.has(obj1));
assert.ok(map.has(obj1));
assert.strictEqual(map.get(obj1), "obj1");
assert.strictEqual(map.get({}), void 0);
const d = Lib.write({ set, map, map2, obj1, obj2 });
assert.deepStrictEqual(d, {
f: [
[
"set",
{
f: [["prop", { name: "@effectful/weakset", $: "Symbol" }]],
$: "WeakSet#"
}
],
[
"map",
{
f: [["prop", { name: "@effectful/weakmap", $: "Symbol" }]],
$: "WeakMap#"
}
],
[
"map2",
{
f: [["prop", { name: "@effectful/weakmap", id: 1, $: "Symbol" }]],
$: "WeakMap#"
}
],
[
"obj1",
{
$: "w##obj1",
f: [
[{ name: "@effectful/weakset" }, true, 2],
[{ name: "@effectful/weakmap" }, "obj1", 2],
[{ name: "@effectful/weakmap", id: 1 }, "2obj1", 2]
]
}
],
[
"obj2",
{
f: [
[{ name: "@effectful/weakset" }, true, 2],
[{ name: "@effectful/weakmap" }, "obj2", 2]
]
}
]
]
});
const {
set: rset,
map: rmap,
map2: rmap2,
obj1: robj1,
obj2: robj2
} = Lib.read(d);
assert.strictEqual(robj1, obj1);
assert.notStrictEqual(robj2, obj2);
assert.ok(rset.has(obj1));
assert.ok(set.delete(obj1));
assert.ok(!set.has(obj1));
assert.ok(rset.has(obj1));
assert.ok(rset.has(robj2));
assert.ok(!rset.has(obj2));
assert.ok(!set.has(robj2));
assert.strictEqual(rmap.get(obj1), "obj1");
assert.strictEqual(rmap2.get(obj1), "2obj1");
assert.ok(map.delete(obj1));
assert.strictEqual(rmap.get(obj1), "obj1");
assert.ok(rset.delete(robj2));
assert.ok(!rset.has(robj2));
assert.ok(!rset.delete(robj2));
assert.ok(!rset.has(robj2));
});
});
describe("BigInt", function () {
it("should be serializable", function () {
const num = 2n ** 10000n;
const doc = Lib.write({ num });
assert.equal(doc.f[0][0], "num");
assert.ok(doc.f[0][1].int.substr);
assert.equal(doc.f[0][1].int.length, 3011);
assert.strictEqual(Lib.read(doc).num, num);
});
});
| mit |
Gomdoree/Snake | src/main.cpp | 3451 | #include "GameCtrl.h"
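// 16x30 bitmap used to render the title screen: 1 = filled block, 0 = empty block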
char title[16][30]=
{
{0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0},
{0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0},
{0,1,1,1,1,1,0,1,0,0,0,1,0,0,1,1,0,0,0,1,0,0,1,0,1,1,1,1,1,0},
{0,1,0,0,0,0,0,1,1,0,0,1,0,1,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0},
{0,1,1,1,1,1,0,1,0,1,0,1,0,1,1,1,1,0,0,1,1,0,0,0,1,1,1,1,1,0},
{0,0,0,0,0,1,0,1,0,0,1,1,0,1,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0},
{0,1,1,1,1,1,0,1,0,0,0,1,0,1,0,0,1,0,0,1,0,0,1,0,1,1,1,1,1,0},
{0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0},
{0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0},
{0,0,1,1,1,1,1,0,0,0,1,1,0,0,0,1,1,0,0,0,1,1,0,0,1,1,1,1,1,0},
{0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1,1,0,0,0,1,1,0,0,1,0,0,0,0,0},
{0,0,1,0,0,1,1,0,0,1,1,1,1,0,0,1,0,1,0,1,0,1,0,0,1,1,1,1,1,0},
{0,0,1,1,1,1,1,0,0,1,0,0,1,0,0,1,0,1,0,1,0,1,0,0,1,0,0,0,0,0},
{0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1,0,0,1,0,0,1,0,0,1,1,1,1,1,0},
{0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0},
{0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0}
};
int main() {
int menu = 0;
for(int x=0;x<16;x++){
printf("\t");
for(int y=0;y<30;y++){
if(title[x][y] == 1){
printf("■");
}
if(title[x][y] == 0){
printf("□");
}
}
printf("\n");
}
printf("\n");
printf("\t\t\t\t<MENU>\n");
printf("\t\t\t 1.Snake Game(Manual Mode)\n");
printf("\t\t\t 2.See How to find Shortest Path\n");
printf("\t\t\t 3.See How to find Longest Path\n");
printf("\t\t\t 4.AI Hamiltonian Cycle\n");
printf("\t\t\t 5.AI Graph Search\n");
printf("\t\t\t 6.Exit\n");
printf("\t\t\t =>Input Menu Number : ");
scanf("%d",&menu);
auto game = GameCtrl::getInstance(0);
// Set FPS. Default is 60.0
game->setFPS(60.0);
// Set the interval time between each snake's movement. Default is 30 ms.
// For the classic snake game, 150 ms works well.
game->setMoveInterval(150);
// Set whether to record the snake's movements to file. Default is true.
// The movements will be written to a file named "movements.txt".
game->setRecordMovements(true);
// Set map's size(including boundaries). Default is 10*10. Minimum is 5*5.
game->setMapRow(10);
game->setMapCol(10);
if(menu==1){ //Snake Game
// Set whether to enable the snake AI. Default is true.
game->setEnableAI(false);
// Set whether to use a hamiltonian cycle to guide the AI. Default is true.
game->setEnableHamilton(true);
// Set whether to run the test program. Default is false.
// You can select different testing methods by modifying GameCtrl::test().
game->setRunTest(false);
}
else if(menu==2){ //Shortest Path
game->setEnableAI(false);
game->setEnableHamilton(true);
game->setRunTest(true);
game->setMapRow(20);
game->setMapCol(20);
}
else if(menu==3){ //Longest Path
game = GameCtrl::getInstance(1); // reassign the outer variable; a shadowed local here would leave run() on the wrong instance
game->setEnableAI(false);
game->setEnableHamilton(true);
game->setRunTest(true);
game->setMapRow(20);
game->setMapCol(20);
}
else if(menu==4){ //Hamiltonian Cycle
game->setEnableAI(true);
game->setEnableHamilton(true);
game->setRunTest(false);
}
else if(menu==5){ //AI
game->setEnableAI(true);
game->setEnableHamilton(false);
game->setRunTest(false);
}
else
return 0;
return game->run();
}
| mit |
mjago/CW | test/test_dsl.rb | 2411 | require 'simplecov'
$VERBOSE = nil #FIXME
SimpleCov.start
require 'minitest/autorun'
require 'minitest/pride'
require_relative '../lib/cw'
class TestNumbers < MiniTest::Test
ROOT = File.expand_path File.dirname(__FILE__) + '/../'
def setup
@dsl = CW::Dsl.new
end
def teardown
@dsl = nil
end
def test_words_returns_words
words = @dsl.words
assert words.is_a? Array
assert_equal 1000, words.size
assert_equal 'the', words.first
end
def test_load_common_words_returns_words
words = @dsl.load_common_words
assert words.is_a? Array
assert_equal 1000, words.size
assert_equal 'the', words.first
end
def test_most_load_most_common_words_returns_words
words = @dsl.load_most_common_words
assert words.is_a? Array
assert_equal 500, words.size
assert_equal 'the', words.first
end
def test_load_abbreviations
words = @dsl.load_abbreviations
assert words.is_a? Array
assert_equal 85, words.size
assert_equal 'abt', words.first
end
def test_reverse
@dsl.words = %w[a b c]
assert_equal CW::Dsl, @dsl.class
@dsl.reverse
assert_equal 'c', @dsl.words.first
end
def test_double_words
@dsl.words = %w[a b]
@dsl.double_words
assert_equal %w[a a b b], @dsl.words
end
def test_letters_numbers
assert_equal %w[a b c d e], @dsl.letters_numbers[0..4]
assert_equal %w[y z], @dsl.words[24..25]
assert_equal %w[0 1], @dsl.words[26..27]
assert_equal %w[8 9], @dsl.words[34..35]
end
def test_random_numbers
@dsl.random_numbers
assert_equal Array, @dsl.words.class
assert_equal 50, @dsl.words.size
@dsl.words.each do |wrd|
assert wrd.size == 4
assert wrd.to_i > 0
assert wrd.to_i < 10000
end
end
def test_random_letters
@dsl.random_letters
assert_equal Array, @dsl.words.class
assert_equal 50, @dsl.words.size
@dsl.words.each do |wrd|
assert wrd.size == 4
end
end
def test_random_letters_numbers
@dsl.random_letters_numbers
assert_equal Array, @dsl.words.class
assert_equal 50, @dsl.words.size
@dsl.words.each do |wrd|
assert wrd.size == 4
end
end
def test_alphabet
@dsl.alphabet
assert_equal ["a b c d e f g h i j k l m n o p q r s t u v w x y z"], @dsl.words
end
def test_numbers
assert_equal ['0', '1', '2', '3', '4'], @dsl.numbers[0..4]
end
end
| mit |
Wadpam/lets-the-right-one-in | lroi-lib/src/main/java/se/leiflandia/lroi/utils/AuthUtils.java | 3349 | package se.leiflandia.lroi.utils;
import android.accounts.Account;
import android.accounts.AccountManager;
import android.content.Context;
import android.content.SharedPreferences;
import android.text.TextUtils;
import se.leiflandia.lroi.auth.model.AccessToken;
import se.leiflandia.lroi.auth.model.UserCredentials;
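/**
 * Helpers for tracking the "active" account of a given account type: accounts are
 * stored in AccountManager, while SharedPreferences keeps the name of the one that
 * is currently active.
 */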
public class AuthUtils {
private static final String PREF_ACTIVE_ACCOUNT = "active_account";
private static final String PREFS_NAME = "se.leiflandia.lroi.prefs";
public static void removeActiveAccount(Context context, String accountType) {
Account account = getActiveAccount(context, accountType);
if (account != null) {
AccountManager.get(context).removeAccount(account, null, null);
}
setActiveAccountName(context, null);
}
public static Account getActiveAccount(final Context context, final String accountType) {
Account[] accounts = AccountManager.get(context).getAccountsByType(accountType);
return getActiveAccount(accounts, getActiveAccountName(context));
}
public static boolean hasActiveAccount(final Context context, final String accountType) {
return getActiveAccount(context, accountType) != null;
}
private static String getActiveAccountName(final Context context) {
return getSharedPreferences(context)
.getString(PREF_ACTIVE_ACCOUNT, null);
}
public static void setActiveAccountName(final Context context, final String name) {
getSharedPreferences(context).edit()
.putString(PREF_ACTIVE_ACCOUNT, name)
.commit();
}
private static Account getActiveAccount(final Account[] accounts, final String activeAccountName) {
for (Account account : accounts) {
if (TextUtils.equals(account.name, activeAccountName)) {
return account;
}
}
return null;
}
private static SharedPreferences getSharedPreferences(final Context context) {
return context.getSharedPreferences(PREFS_NAME, Context.MODE_PRIVATE);
}
/**
* Saves an authorized account in AccountManager and sets it as the active account.
*/
public static void setAuthorizedAccount(Context context, UserCredentials credentials, AccessToken token, String authtokenType, String accountType) {
final AccountManager accountManager = AccountManager.get(context);
Account account = findOrCreateAccount(accountManager, credentials.getUsername(), token.getRefreshToken(), accountType);
accountManager.setAuthToken(account, authtokenType, token.getAccessToken());
setActiveAccountName(context, account.name);
}
/**
* Sets the password of the matching account, creating a new account if none exists.
*/
private static Account findOrCreateAccount(AccountManager accountManager, String username, String refreshToken, String accountType) {
for (Account account : accountManager.getAccountsByType(accountType)) {
if (account.name.equals(username)) {
accountManager.setPassword(account, refreshToken);
return account;
}
}
Account account = new Account(username, accountType);
accountManager.addAccountExplicitly(account, refreshToken, null);
return account;
}
}
| mit |
ossimlabs/ossim-plugins | potrace/src/backend_geojson.h | 360 | /* Copyright (C) 2001-2015 Peter Selinger.
This file is part of Potrace. It is free software and it is covered
by the GNU General Public License. See the file COPYING for details. */
#ifndef BACKEND_GEO_H
#define BACKEND_GEO_H
#include "potracelib.h"
int page_geojson(FILE *fout, potrace_path_t *plist, int as_polygons);
#endif /* BACKEND_GEO_H */
| mit |
chamunks/alpine-sickbeard-arm | run.sh | 171 | docker run -d \
--name=sickbeard \
-v $(pwd)/data:/data \
-v $(pwd)/config/config.ini:/app/config.ini \
-p 8081:8081 \
chamunks/alpine-sickbeard-arm:latest
| mit |
LazyTarget/Reportz | Reportz.Scripting/Commands/ExecuteScriptCommand.cs | 4213 | using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using System.Xml.Linq;
using Reportz.Scripting.Attributes;
using Reportz.Scripting.Classes;
using Reportz.Scripting.Interfaces;
namespace Reportz.Scripting.Commands
{
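/// <summary>
/// Command that resolves a named script from the current script context, executes it
/// in a child variable scope with the configured arguments, and raises the optional
/// "error" and "complete" events afterwards.
/// </summary>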
[ScriptElementAlias("execute-script")]
public class ExecuteScriptCommand : IExecutable, IScriptElement
{
private IScriptParser _instantiator;
private XElement _element;
private ArgCollection _argsCollection;
private EventCollection _eventsCollection;
public ExecuteScriptCommand()
{
}
public string ScriptName { get; private set; }
public IExecutableResult Execute(IExecutableArgs args)
{
IExecutableResult result = null;
args = args ?? new ExecutableArgs { Scope = new VariableScope() };
try
{
var ctx = args?.Scope.GetScriptContext();
var scriptName = args?.Arguments?.FirstOrDefault(x => x.Key == "scriptName")?.Value?.ToString();
scriptName = scriptName ?? ScriptName;
if (string.IsNullOrWhiteSpace(scriptName))
throw new Exception($"Invalid script name.");
var script = ctx?.ScriptScope?.GetScript(scriptName);
if (script == null)
throw new Exception($"Could not get script '{scriptName}'. ");
var scriptArgs = new ExecutableArgs();
scriptArgs.Scope = args.Scope.CreateChild();
//scriptArgs.Arguments = _argsCollection?._vars?.Values?.ToArray();
var scriptArgVars = _argsCollection?._vars?.Values?.ToArray();
if (scriptArgVars != null)
{
foreach (var scriptVar in scriptArgVars)
{
var r = scriptVar.Execute(scriptArgs);
}
}
result = script.Execute(scriptArgs);
return result;
}
catch (Exception ex)
{
IEvent errorEvent;
// Guard against scripts configured without an <events> element.
if (_eventsCollection != null && _eventsCollection._events.TryGetValue("error", out errorEvent) && errorEvent != null)
{
var exceptionVar = new Variable
{
Key = "$$Exception",
Value = ex,
};
args.Scope?.SetVariable(exceptionVar);
errorEvent.Execute(args);
}
return result;
// todo: implement 'catch' logic. catch="true" on <event key="error">. Or only if wrapped inside <try> <catch>
// todo: implement test <throw> tag
//throw;
}
finally
{
IEvent completeEvent;
if (_eventsCollection != null && _eventsCollection._events.TryGetValue("complete", out completeEvent) && completeEvent != null)
{
var resultVar = new Variable
{
Key = "$$Result",
Value = result?.Result,
};
args.Scope?.SetVariable(resultVar);
completeEvent.Execute(args);
}
}
}
public void Configure(IScriptParser parser, XElement element)
{
_instantiator = parser;
_element = element;
ScriptName = element?.Attribute("scriptName")?.Value ??
element?.Attribute("name")?.Value;
var argsElem = element?.Element("arguments");
if (argsElem != null)
{
var arg = new ArgCollection();
arg.Configure(parser, argsElem);
_argsCollection = arg;
}
var eventsElem = element?.Element("events");
if (eventsElem != null)
{
var events = new EventCollection();
events.Configure(parser, eventsElem);
_eventsCollection = events;
}
}
}
}
| mit |
ravikumargh/Police | modules/crimes/client/config/crime.client.routes.js | 1154 | (function () {
'use strict';
angular
.module('crimes.routes')
.config(routeConfig);
routeConfig.$inject = ['$stateProvider'];
function routeConfig($stateProvider) {
$stateProvider
.state('crimes', {
abstract: true,
url: '/crimes',
template: '<ui-view/>'
})
.state('crimes.list', {
url: '',
templateUrl: '/modules/crimes/client/views/list-crimes.client.view.html',
controller: 'CrimesListController',
controllerAs: 'vm',
data: {
pageTitle: 'Crimes List'
}
})
.state('crimes.view', {
url: '/:crimeId',
templateUrl: '/modules/crimes/client/views/view-crimes.client.view.html',
controller: 'CrimesController',
controllerAs: 'vm',
resolve: {
crimeResolve: getCrime
},
data: {
pageTitle: 'Crimes {{ crimeResolve.title }}'
}
});
}
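// Route resolve helper: loads the crime referenced by the :crimeId param before the state activates.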
getCrime.$inject = ['$stateParams', 'CrimesService'];
function getCrime($stateParams, CrimesService) {
return CrimesService.get({
crimeId: $stateParams.crimeId
}).$promise;
}
}());
| mit |
matthewsot/CocoaSharp | Headers/PrivateFrameworks/iWorkImport/GQDWPColumn.h | 492 | /*
* Generated by class-dump 3.3.4 (64 bit).
*
* class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2011 by Steve Nygard.
*/
#import "NSObject.h"
// Not exported
@interface GQDWPColumn : NSObject
{
long long mIndex;
float mWidth;
float mSpacing;
_Bool mHasSpacing;
}
+ (const struct StateSpec *)stateForReading;
- (float)spacing;
- (_Bool)hasSpacing;
- (float)width;
- (long long)index;
- (int)readAttributesFromReader:(struct _xmlTextReader *)arg1;
@end
| mit |
raokarthik74/Discovery | discovery/discovery/AppDelegate.h | 280 | //
// AppDelegate.h
// discovery
//
// Created by Karthik Rao on 1/20/17.
// Copyright © 2017 Karthik Rao. All rights reserved.
//
#import <UIKit/UIKit.h>
@interface AppDelegate : UIResponder <UIApplicationDelegate>
@property (strong, nonatomic) UIWindow *window;
@end
| mit |
mhaddir/draco | apps/filters/filters-news.php | 1680 | <?php
if( $kind = ic_get_post( 'kind' ) ) {
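// Map the short kind code to its canonical kind and filter-name prefix: # = cod, @ = loc, * = str, + = usr.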
if( $kind == 'c' || $kind == 'cod' ) { $sym = '#'; $kind = 'cod'; }
else if( $kind == 'l' || $kind == 'loc' ) { $sym = '@'; $kind = 'loc'; }
else if( $kind == 's' || $kind == 'str' ) { $sym = '*'; $kind = 'str'; }
else if( $kind == 'u' || $kind == 'usr' ) { $sym = '+'; $kind = 'usr'; }
if( $name = ic_get_post( 'filters-req' ) ) {
$filter = $_SESSION['filters'][$sym.$name];
$filter->manage( 'req' );
} else if( $name = ic_get_post( 'filters-sign' ) ) {
$filter = $_SESSION['filters'][$sym.$name];
$filter->manage( 'sign' );
} else if( $name = ic_get_post( 'trashed_filter' ) ) {
$filter = $_SESSION['filters'][$sym.$name];
$filter->manage( 'clear' );
} else if( $name = ic_get_post( 'new_filter' ) ) {
$sign = ( $post_meta = ic_get_post( 'new_sign' ) ) ? $post_meta : 'd';
$filter = new filter( $name, $kind, $sign );
$filter->manage( 'add' );
} $filter_change = TRUE;
}
if( isset( $filter_change ) || ic_get_post( 'filters_refresh' ) ) {
foreach( $_SESSION['filters'] as $filter ) $filter->tag();
} else {
$sign_filters = get_setting( 'sign_filters');
$on = $sign_filters != 1 ? ',\'d\'' : ',0';
$filts = $sign_filters != 1 ? 'style="display:none;"' : '';
$hover = get_setting( 'hover_help' );
?>
<div id="hover-help">
<div class="hover-off" id="hh-ftradd"><?php tr( 'ftradd', 'h' ) ?></div>
</div>
<div class="text-box" id="shell-filters">
<div class="filters-choice filters-choice-signs"><?php filter::add_filters( 'input-filters', 'addFilter()', 'addFilterSelect()' ) ?></div>
<div id="filters">
<?php
$filter_change = TRUE;
include( 'filters-main.php' );
echo '</div></div>';
}
| mit |
dplarson/gitlabhq | app/models/project_services/issue_tracker_service.rb | 3048 | class IssueTrackerService < Service
validate :one_issue_tracker, if: :activated?, on: :manual_change
default_value_for :category, 'issue_tracker'
# Pattern used to extract links from comments
# Override this method on services that uses different patterns
# This pattern does not support cross-project references
# The other code assumes that this pattern is a superset of all
# overriden patterns. See ReferenceRegexes::EXTERNAL_PATTERN
def self.reference_pattern
@reference_pattern ||= %r{(\b[A-Z][A-Z0-9_]+-|#{Issue.reference_prefix})(?<issue>\d+)}
end
def default?
default
end
def issue_url(iid)
self.issues_url.gsub(':id', iid.to_s)
end
def issue_tracker_path
project_url
end
def new_issue_path
new_issue_url
end
def issue_path(iid)
issue_url(iid)
end
def fields
[
{ type: 'text', name: 'description', placeholder: description },
{ type: 'text', name: 'project_url', placeholder: 'Project url', required: true },
{ type: 'text', name: 'issues_url', placeholder: 'Issue url', required: true },
{ type: 'text', name: 'new_issue_url', placeholder: 'New Issue url', required: true }
]
end
# Initialize with default properties values
# or receive a block with custom properties
def initialize_properties(&block)
return unless properties.nil?
if enabled_in_gitlab_config
if block_given?
yield
else
self.properties = {
title: issues_tracker['title'],
project_url: issues_tracker['project_url'],
issues_url: issues_tracker['issues_url'],
new_issue_url: issues_tracker['new_issue_url']
}
end
else
self.properties = {}
end
end
def self.supported_events
%w(push)
end
def execute(data)
return unless supported_events.include?(data[:object_kind])
message = "#{self.type} was unable to reach #{self.project_url}. Check the url and try again."
result = false
begin
response = HTTParty.head(self.project_url, verify: true)
if response
message = "#{self.type} received response #{response.code} when attempting to connect to #{self.project_url}"
result = true
end
rescue HTTParty::Error, Timeout::Error, SocketError, Errno::ECONNRESET, Errno::ECONNREFUSED, OpenSSL::SSL::SSLError => error
message = "#{self.type} had an error when trying to connect to #{self.project_url}: #{error.message}"
end
Rails.logger.info(message)
result
end
private
def enabled_in_gitlab_config
Gitlab.config.issues_tracker &&
Gitlab.config.issues_tracker.values.any? &&
issues_tracker
end
def issues_tracker
Gitlab.config.issues_tracker[to_param]
end
def one_issue_tracker
return if template?
return if project.blank?
if project.services.external_issue_trackers.where.not(id: id).any?
errors.add(:base, 'Another issue tracker is already in use. Only one issue tracker service can be active at a time')
end
end
end
| mit |
timothytran/ngModular | grunt/compass.js | 192 | /**
* Compile sass files to css using compass
*/
module.exports = {
dev: {
options: {
config: 'config.rb',
environment: 'development'
}
},
};
| mit |
vc3/Cognito.Stripe | Cognito.Stripe/Classes/FileUpload.cs | 377 | using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace Cognito.Stripe.Classes
{
public class FileUpload : BaseObject
{
public override string Object { get { return "file_upload"; } }
public string Purpose { get; set; }
public int Size { get; set; }
public string Type { get; set; }
public string URL { get; set; }
}
}
| mit |
anquinn/infrastructure_projects | db/migrate/20170513191438_change_integer_limits.rb | 214 | class ChangeIntegerLimits < ActiveRecord::Migration
def change
change_column :projects, :federal_contribution, :integer, limit: 8
change_column :projects, :total_eligible_cost, :integer, limit: 8
end
end
| mit |
SibirCTF/2014-jury-system-fhq | php/fhq/test_mail.php | 568 | <?php
exit;
include_once "config/config.php";
include_once "engine/fhq_class_security.php";
include_once "engine/fhq_class_database.php";
include_once "engine/fhq_class_mail.php";
if(!isset($_GET['email']))
{
echo "not found parametr ?email=";
exit;
};
$email = $_GET['email'];
echo "send to mail: ".htmlspecialchars($email)."<br>";
$security = new fhq_security();
$db = new fhq_database();
$mail = new fhq_mail();
echo "mail created <br>";
$error = "";
echo "trying to send email<br>";
$mail->send($email,'','','Test Mail', 'Test messages', $error);
echo "sent";
echo $error;
?>
| mit |
F5Networks/f5-ansible-modules | ansible_collections/f5networks/f5_modules/plugins/modules/bigiq_regkey_license_assignment.py | 19962 | #!/usr/bin/python
# -*- coding: utf-8 -*-
#
# Copyright: (c) 2017, F5 Networks Inc.
# GNU General Public License v3.0 (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
DOCUMENTATION = r'''
---
module: bigiq_regkey_license_assignment
short_description: Manage regkey license assignment on BIG-IPs from a BIG-IQ
description:
- Manages the assignment of regkey licenses on a BIG-IQ. Assignment means
the license either is already assigned to a BIG-IP or still needs to be assigned to one.
Additionally, this module supports revoking the assignments from BIG-IP devices.
version_added: "1.0.0"
options:
pool:
description:
- The registration key pool to use.
type: str
required: True
key:
description:
- The registration key you want to assign from the pool.
type: str
required: True
device:
description:
- When C(managed) is C(no), specifies the address, or hostname, where the BIG-IQ
can reach the remote device to register.
- When C(managed) is C(yes), specifies the managed device, or device UUID, that
you want to register.
- If C(managed) is C(yes), it is very important that you do not have more than
one device with the same name. BIG-IQ internally recognizes devices by their ID,
so this module cannot guarantee that the correct device is registered when names
collide; whichever matching device BIG-IQ returns is the one that is used.
type: str
required: True
managed:
description:
- Whether the specified device is a managed or un-managed device.
- When C(state) is C(present), this parameter is required.
type: bool
device_port:
description:
- Specifies the port of the remote device to connect to.
- If this parameter is not specified, the default is C(443).
type: int
default: 443
device_username:
description:
- The username used to connect to the remote device.
- This username should be one that has sufficient privileges on the remote device
to do licensing. Usually this is the C(Administrator) role.
- When C(managed) is C(no), this parameter is required.
type: str
device_password:
description:
- The password of the C(device_username).
- When C(managed) is C(no), this parameter is required.
type: str
state:
description:
- When C(present), ensures the device is assigned the specified license.
- When C(absent), ensures the license is revoked from the remote device and freed
on the BIG-IQ.
type: str
choices:
- present
- absent
default: present
extends_documentation_fragment: f5networks.f5_modules.f5
author:
- Tim Rupp (@caphrim007)
'''
EXAMPLES = r'''
- name: Register an unmanaged device
bigiq_regkey_license_assignment:
pool: my-regkey-pool
key: XXXX-XXXX-XXXX-XXXX-XXXX
device: 1.1.1.1
managed: no
device_username: admin
device_password: secret
state: present
provider:
user: admin
password: secret
server: lb.mydomain.com
delegate_to: localhost
- name: Register a managed device, by name
bigiq_regkey_license_assignment:
pool: my-regkey-pool
key: XXXX-XXXX-XXXX-XXXX-XXXX
device: bigi1.foo.com
managed: yes
state: present
provider:
user: admin
password: secret
server: lb.mydomain.com
delegate_to: localhost
- name: Register a managed device, by UUID
bigiq_regkey_license_assignment:
pool: my-regkey-pool
key: XXXX-XXXX-XXXX-XXXX-XXXX
device: 7141a063-7cf8-423f-9829-9d40599fa3e0
managed: yes
state: present
provider:
user: admin
password: secret
server: lb.mydomain.com
delegate_to: localhost
'''
RETURN = r'''
# only common fields returned
'''
import re
import time
from datetime import datetime
from ansible.module_utils.basic import AnsibleModule
from ..module_utils.bigip import F5RestClient
from ..module_utils.common import (
F5ModuleError, AnsibleF5Parameters, f5_argument_spec
)
from ..module_utils.icontrol import bigiq_version
from ..module_utils.ipaddress import is_valid_ip
from ..module_utils.teem import send_teem
class Parameters(AnsibleF5Parameters):
api_map = {
'deviceReference': 'device_reference',
'deviceAddress': 'device_address',
'httpsPort': 'device_port'
}
api_attributes = [
'deviceReference', 'deviceAddress', 'httpsPort', 'managed'
]
returnables = [
'device_address', 'device_reference', 'device_username', 'device_password',
'device_port', 'managed'
]
updatables = [
'device_reference', 'device_address', 'device_username', 'device_password',
'device_port', 'managed'
]
def to_return(self):
result = {}
try:
for returnable in self.returnables:
result[returnable] = getattr(self, returnable)
result = self._filter_params(result)
except Exception:
raise
return result
class ApiParameters(Parameters):
pass
class ModuleParameters(Parameters):
@property
def device_password(self):
if self._values['device_password'] is None:
return None
return self._values['device_password']
@property
def device_username(self):
if self._values['device_username'] is None:
return None
return self._values['device_username']
@property
def device_address(self):
if self.device_is_address:
return self._values['device']
@property
def device_port(self):
if self._values['device_port'] is None:
return None
return int(self._values['device_port'])
@property
def device_is_address(self):
if is_valid_ip(self.device):
return True
return False
@property
def device_is_id(self):
pattern = r'[A-Za-z0-9]{8}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}-[A-Za-z0-9]{12}'
if re.match(pattern, self.device):
return True
return False
@property
def device_is_name(self):
if not self.device_is_address and not self.device_is_id:
return True
return False
@property
def device_reference(self):
if not self.managed:
return None
if self.device_is_address:
# This range lookup is how you do lookups for single IP addresses. Weird.
filter = "address+eq+'{0}...{0}'".format(self.device)
elif self.device_is_name:
filter = "hostname+eq+'{0}'".format(self.device)
elif self.device_is_id:
filter = "uuid+eq+'{0}'".format(self.device)
else:
raise F5ModuleError(
"Unknown device format '{0}'".format(self.device)
)
uri = "https://{0}:{1}/mgmt/shared/resolver/device-groups/cm-bigip-allBigIpDevices/devices/" \
"?$filter={2}&$top=1".format(self.client.provider['server'],
self.client.provider['server_port'], filter)
resp = self.client.api.get(uri)
try:
response = resp.json()
except ValueError as ex:
raise F5ModuleError(str(ex))
if resp.status == 200 and response['totalItems'] == 0:
raise F5ModuleError(
"No device with the specified address was found."
)
elif 'code' in response and response['code'] == 400:
if 'message' in response:
raise F5ModuleError(response['message'])
else:
raise F5ModuleError(resp._content)
id = response['items'][0]['uuid']
result = dict(
link='https://localhost/mgmt/shared/resolver/device-groups/cm-bigip-allBigIpDevices/devices/{0}'.format(id)
)
return result
@property
def pool_id(self):
filter = "(name%20eq%20'{0}')".format(self.pool)
uri = 'https://{0}:{1}/mgmt/cm/device/licensing/pool/regkey/licenses?$filter={2}&$top=1'.format(
self.client.provider['server'],
self.client.provider['server_port'],
filter
)
resp = self.client.api.get(uri)
try:
response = resp.json()
except ValueError as ex:
raise F5ModuleError(str(ex))
if resp.status == 200 and response['totalItems'] == 0:
raise F5ModuleError(
"No pool with the specified name was found."
)
elif 'code' in response and response['code'] == 400:
if 'message' in response:
raise F5ModuleError(response['message'])
else:
raise F5ModuleError(resp._content)
return response['items'][0]['id']
@property
def member_id(self):
if self.device_is_address:
# This range lookup is how you do lookups for single IP addresses. Weird.
filter = "deviceAddress+eq+'{0}...{0}'".format(self.device)
elif self.device_is_name:
filter = "deviceName+eq+'{0}'".format(self.device)
elif self.device_is_id:
filter = "deviceMachineId+eq+'{0}'".format(self.device)
else:
raise F5ModuleError(
"Unknown device format '{0}'".format(self.device)
)
uri = 'https://{0}:{1}/mgmt/cm/device/licensing/pool/regkey/licenses/{2}/offerings/{3}/members/' \
'?$filter={4}'.format(self.client.provider['server'], self.client.provider['server_port'],
self.pool_id, self.key, filter)
resp = self.client.api.get(uri)
try:
response = resp.json()
except ValueError as ex:
raise F5ModuleError(str(ex))
if resp.status == 200 and response['totalItems'] == 0:
return None
elif 'code' in response and response['code'] == 400:
if 'message' in response:
raise F5ModuleError(response['message'])
else:
raise F5ModuleError(resp._content)
result = response['items'][0]['id']
return result
class Changes(Parameters):
pass
class UsableChanges(Changes):
@property
def device_port(self):
if self._values['managed']:
return None
return self._values['device_port']
@property
def device_username(self):
if self._values['managed']:
return None
return self._values['device_username']
@property
def device_password(self):
if self._values['managed']:
return None
return self._values['device_password']
@property
def device_reference(self):
if not self._values['managed']:
return None
return self._values['device_reference']
@property
def device_address(self):
if self._values['managed']:
return None
return self._values['device_address']
@property
def managed(self):
return None
class ReportableChanges(Changes):
pass
class Difference(object):
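"""Computes which attributes differ between the desired (want) and current (have) state."""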
def __init__(self, want, have=None):
self.want = want
self.have = have
def compare(self, param):
try:
result = getattr(self, param)
return result
except AttributeError:
return self.__default(param)
def __default(self, param):
attr1 = getattr(self.want, param)
try:
attr2 = getattr(self.have, param)
if attr1 != attr2:
return attr1
except AttributeError:
return attr1
class ModuleManager(object):
def __init__(self, *args, **kwargs):
self.module = kwargs.get('module', None)
self.client = F5RestClient(**self.module.params)
self.want = ModuleParameters(params=self.module.params, client=self.client)
self.have = ApiParameters()
self.changes = UsableChanges()
def _set_changed_options(self):
changed = {}
for key in Parameters.returnables:
if getattr(self.want, key) is not None:
changed[key] = getattr(self.want, key)
if changed:
self.changes = UsableChanges(params=changed)
def _update_changed_options(self):
diff = Difference(self.want, self.have)
updatables = Parameters.updatables
changed = dict()
for k in updatables:
change = diff.compare(k)
if change is None:
continue
else:
if isinstance(change, dict):
changed.update(change)
else:
changed[k] = change
if changed:
self.changes = UsableChanges(params=changed)
return True
return False
def should_update(self):
result = self._update_changed_options()
if result:
return True
return False
def exec_module(self):
start = datetime.now().isoformat()
version = bigiq_version(self.client)
changed = False
result = dict()
state = self.want.state
if state == "present":
changed = self.present()
elif state == "absent":
changed = self.absent()
reportable = ReportableChanges(params=self.changes.to_return())
changes = reportable.to_return()
result.update(**changes)
result.update(dict(changed=changed))
self._announce_deprecations(result)
send_teem(start, self.module, version)
return result
def _announce_deprecations(self, result):
warnings = result.pop('__warnings', [])
for warning in warnings:
self.module.deprecate(
msg=warning['msg'],
version=warning['version']
)
def present(self):
if self.exists():
return False
return self.create()
def exists(self):
if self.want.member_id is None:
return False
uri = 'https://{0}:{1}/mgmt/cm/device/licensing/pool/regkey/licenses/{2}/offerings/{3}/members/{4}'.format(
self.client.provider['server'],
self.client.provider['server_port'],
self.want.pool_id,
self.want.key,
self.want.member_id
)
resp = self.client.api.get(uri)
if resp.status == 200:
return True
return False
def remove(self):
self._set_changed_options()
if self.module.check_mode:
return True
self.remove_from_device()
if self.exists():
raise F5ModuleError("Failed to delete the resource.")
# Artificial sleeping to wait for remote licensing (on BIG-IP) to complete
#
# This should be something that BIG-IQ can do natively in 6.1-ish time.
time.sleep(60)
return True
def create(self):
self._set_changed_options()
if not self.want.managed:
if self.want.device_username is None:
raise F5ModuleError(
"You must specify a 'device_username' when working with unmanaged devices."
)
if self.want.device_password is None:
raise F5ModuleError(
"You must specify a 'device_password' when working with unmanaged devices."
)
if self.module.check_mode:
return True
self.create_on_device()
if not self.exists():
raise F5ModuleError(
"Failed to license the remote device."
)
self.wait_for_device_to_be_licensed()
# Artificial sleeping to wait for remote licensing (on BIG-IP) to complete
#
# This should be something that BIG-IQ can do natively in 6.1-ish time.
time.sleep(60)
return True
def create_on_device(self):
params = self.changes.api_params()
uri = 'https://{0}:{1}/mgmt/cm/device/licensing/pool/regkey/licenses/{2}/offerings/{3}/members/'.format(
self.client.provider['server'],
self.client.provider['server_port'],
self.want.pool_id,
self.want.key
)
if not self.want.managed:
params['username'] = self.want.device_username
params['password'] = self.want.device_password
resp = self.client.api.post(uri, json=params)
try:
response = resp.json()
except ValueError as ex:
raise F5ModuleError(str(ex))
if 'code' in response and response['code'] == 400:
if 'message' in response:
raise F5ModuleError(response['message'])
else:
raise F5ModuleError(resp.content)
def wait_for_device_to_be_licensed(self):
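# Poll the offering member until its status reads LICENSED three times in a row
# (the counter resets on any non-LICENSED response).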
count = 0
uri = 'https://{0}:{1}/mgmt/cm/device/licensing/pool/regkey/licenses/{2}/offerings/{3}/members/{4}'.format(
self.client.provider['server'],
self.client.provider['server_port'],
self.want.pool_id,
self.want.key,
self.want.member_id
)
while count < 3:
resp = self.client.api.get(uri)
try:
response = resp.json()
except ValueError as ex:
raise F5ModuleError(str(ex))
if 'code' in response and response['code'] == 400:
if 'message' in response:
raise F5ModuleError(response['message'])
else:
raise F5ModuleError(resp.content)
if response['status'] == 'LICENSED':
count += 1
else:
count = 0
def absent(self):
if self.exists():
return self.remove()
return False
def remove_from_device(self):
uri = 'https://{0}:{1}/mgmt/cm/device/licensing/pool/regkey/licenses/{2}/offerings/{3}/members/{4}'.format(
self.client.provider['server'],
self.client.provider['server_port'],
self.want.pool_id,
self.want.key,
self.want.member_id
)
params = {}
if not self.want.managed:
params.update(self.changes.api_params())
params['id'] = self.want.member_id
params['username'] = self.want.device_username
params['password'] = self.want.device_password
self.client.api.delete(uri, json=params)
class ArgumentSpec(object):
def __init__(self):
self.supports_check_mode = True
argument_spec = dict(
pool=dict(required=True),
key=dict(required=True, no_log=True),
device=dict(required=True),
managed=dict(type='bool'),
device_port=dict(type='int', default=443),
device_username=dict(no_log=True),
device_password=dict(no_log=True),
state=dict(default='present', choices=['absent', 'present'])
)
self.argument_spec = {}
self.argument_spec.update(f5_argument_spec)
self.argument_spec.update(argument_spec)
self.required_if = [
['state', 'present', ['key', 'managed']],
['managed', False, ['device', 'device_username', 'device_password']],
['managed', True, ['device']]
]
def main():
spec = ArgumentSpec()
module = AnsibleModule(
argument_spec=spec.argument_spec,
supports_check_mode=spec.supports_check_mode,
required_if=spec.required_if
)
try:
mm = ModuleManager(module=module)
results = mm.exec_module()
module.exit_json(**results)
except F5ModuleError as ex:
module.fail_json(msg=str(ex))
if __name__ == '__main__':
main()
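
# Illustrative playbook usage, not part of the original file. The module name
# below is an assumption inferred from this file's purpose; the parameters
# match the ArgumentSpec above.
#
# - name: Assign a regkey pool license to an unmanaged BIG-IP
#   bigiq_regkey_license_assignment:
#     pool: my-regkey-pool
#     key: "XXXX-XXXX-XXXX-XXXX-XXXX"
#     device: 192.0.2.10
#     managed: no
#     device_username: admin
#     device_password: secret
#     state: present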
| mit |
thiagorthomaz/design-patterns | abstractFactory/ConsoleFactory.class.php | 153 | <?php
/**
*
* @author thiago
*/
interface ConsoleFactory {
    public function create_console_microsoft();
    public function create_console_sony();
}
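
/*
 * Illustrative sketch, not part of the original file: one possible concrete
 * factory implementing this interface. The class and product names below are
 * assumptions for demonstration only.
 *
 * class DefaultConsoleFactory implements ConsoleFactory {
 *     public function create_console_microsoft() { return new MicrosoftConsole(); }
 *     public function create_console_sony() { return new SonyConsole(); }
 * }
 */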
| mit |
mikeloll/mikeloll.github.io.jekyll | _site/tag/systemjs/index.html | 3998 | <!DOCTYPE html>
<html lang="en-us">
<head>
<link href="http://gmpg.org/xfn/11" rel="profile">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta http-equiv="content-type" content="text/html; charset=utf-8">
<!-- Enable responsiveness on mobile devices-->
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1">
<title>
Posts Tagged “systemjs” · Mike Loll
</title>
<!-- CSS -->
<link rel="stylesheet" href="/public/css/poole.css">
<link rel="stylesheet" href="/public/css/syntax.css">
<link rel="stylesheet" href="/public/css/lanyon.css">
<link rel="stylesheet" href="/public/css/mikeloll.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=PT+Serif:400,400italic,700%7CPT+Sans:400">
<!-- Icons -->
<link rel="apple-touch-icon-precomposed" sizes="144x144" href="/public/apple-touch-icon-precomposed.png">
<link rel="shortcut icon" href="/public/favicon.ico">
<!-- RSS -->
<link rel="alternate" type="application/rss+xml" title="RSS" href="/atom.xml">
</head>
<body>
<!-- Target for toggling the sidebar `.sidebar-checkbox` is for regular
styles, `#sidebar-checkbox` for behavior. -->
<input type="checkbox" class="sidebar-checkbox" id="sidebar-checkbox">
<!-- Toggleable sidebar -->
<div class="sidebar" id="sidebar">
<div class="sidebar-item">
<p></p>
</div>
<nav class="sidebar-nav">
<a class="sidebar-nav-item" href="/">Home</a>
<a class="sidebar-nav-item" href="/about/">About</a>
<a class="sidebar-nav-item" href="/archive/">Archive</a>
</nav>
<div class="sidebar-item">
<p>
© 2015. All rights reserved.
</p>
</div>
</div>
<!-- Wrap is the content to shift when toggling the sidebar. We wrap the
content to avoid any CSS collisions with our real content. -->
<div class="wrap">
<div class="masthead">
<div class="container">
<h3 class="masthead-title">
<a href="/" title="Home">Mike Loll</a>
<small>A collection of random tech related things.</small>
</h3>
</div>
</div>
<div class="container content">
<h2 class="post_title">Posts Tagged “systemjs”</h2>
<ul>
<li>
01/02/14 <a class="archive_list_article_link" href='/2014/01/02/introducing-lanyon/'>Introducing Lanyon</a>
<p class="summary"></p>
<a class="tag_list_link" href="/tag/angular2">angular2</a> <a class="tag_list_link" href="/tag/typescript">typescript</a> <a class="tag_list_link" href="/tag/systemjs">systemjs</a>
</li>
</ul>
</div>
</div>
<label for="sidebar-checkbox" class="sidebar-toggle"></label>
<script>
(function(document) {
var toggle = document.querySelector('.sidebar-toggle');
var sidebar = document.querySelector('#sidebar');
var checkbox = document.querySelector('#sidebar-checkbox');
document.addEventListener('click', function(e) {
var target = e.target;
if(!checkbox.checked ||
sidebar.contains(target) ||
(target === checkbox || target === toggle)) return;
checkbox.checked = false;
}, false);
})(document);
</script>
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-70120850-1', 'auto');
ga('send', 'pageview');
</script>
</body>
</html>
| mit |
timrourke/dotEnv | controllers/_ApplicationController.rb | 84 | class ApplicationController < Sinatra::Base
  require 'bundler'
  Bundler.require
end | mit |
una/heiroglyph | app/js/data/iconListInfo.js | 1423 | // JSON Object of all of the icons and their tags
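// Hypothetical usage elsewhere in the app (the import path and consumer code
// are assumptions for illustration):
//   import icons from './data/iconListInfo';
//   icons.apple.tags // -> ['apple', 'fruit', 'food']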
export default {
apple : {
name : 'apple',
color: '#be0000',
image : 'apple67.svg',
tags: ['apple', 'fruit', 'food'],
categories: ['food', 'supermarket']
},
bread : {
name : 'bread',
color: '#c26b24',
image : 'bread14.svg',
tags: ['bread', 'food', 'wheat', 'bake'],
categories: ['food', 'supermarket']
},
broccoli : {
name : 'broccoli',
color: '#16a900',
image : 'broccoli.svg',
tags: ['broccoli', 'food', 'vegetable'],
categories: ['food', 'supermarket']
},
cheese : {
name : 'cheese',
color: '#ffe625',
image : 'cheese7.svg',
tags: ['cheese', 'food', 'dairy'],
categories: ['food', 'supermarket']
},
shopping : {
name : 'shopping',
color: '#f32393',
image : 'shopping225.svg',
tags: ['shopping', 'bag'],
categories: ['shopping', 'supermarket']
},
cart : {
name : 'cart',
color: '#9ba990',
image : 'supermarket1.svg',
tags: ['cart', 'shopping'],
categories: ['shopping', 'supermarket']
},
fish : {
name : 'fish',
color: '#6d7ca9',
image : 'fish52.svg',
tags: ['fish', 'food'],
categories: ['fish', 'supermarket']
},
giftbox : {
name : 'giftbox',
color: '#f32393',
image : 'giftbox56.svg',
tags: ['giftbox', 'gift', 'present'],
categories: ['gift', 'shopping', 'supermarket']
}
}; | mit |
Vilsepi/after | backend/tool.sh | 2234 | #!/bin/bash
#
# bash strict mode
set -euo pipefail
IFS=$'\n\t'
USAGE="Usage:\n
Requires AWS CLI tools and credentials configured.\n
./tool.sh install mySourceDirectory\n
./tool.sh create mySourceDirectory myAWSLambdaFunctionName myIAMRoleARN\n
./tool.sh update mySourceDirectory myAWSLambdaFunctionName\n
./tool.sh invoke myAWSLambdaFunctionName\n
"
REGION="eu-west-1"
PROFILE="heap"
# Install pip requirements for a Python lambda
function install_requirements {
FUNCTION_DIRECTORY=$2
cd $FUNCTION_DIRECTORY
pip install -r requirements.txt -t .
}
# Creates a new lambda function
function create {
FUNCTION_DIRECTORY=$2
FUNCTION_ARN_NAME=$3
ROLE_ARN=$4
mkdir -p build
cd $FUNCTION_DIRECTORY
zip -FSr ../build/$FUNCTION_DIRECTORY.zip .
cd ..
aws lambda create-function\
--function-name $FUNCTION_ARN_NAME\
--runtime python2.7\
--role $ROLE_ARN\
--handler main.lambda_handler\
--timeout 15\
--memory-size 128\
--zip-file fileb://build/$FUNCTION_DIRECTORY.zip
}
# Packages and uploads the source code of a AWS Lambda function and deploys it live.
function upload_lambda_source {
FUNCTION_DIRECTORY=$2
FUNCTION_ARN_NAME=$3
mkdir -p build
cd $FUNCTION_DIRECTORY
zip -FSr ../build/$FUNCTION_DIRECTORY.zip .
cd ..
aws lambda update-function-code --profile $PROFILE --region $REGION --function-name $FUNCTION_ARN_NAME --zip-file fileb://build/$FUNCTION_DIRECTORY.zip
}
# Invokes an AWS Lambda function and outputs its result
function invoke {
FUNCTION_ARN_NAME=$2
aws lambda invoke --profile $PROFILE --region $REGION --function-name $FUNCTION_ARN_NAME /dev/stdout
}
function help_and_exit {
echo -e $USAGE
exit 1
}
# Subcommand handling
if [ $# -lt 1 ]
then
help_and_exit
fi
case "$1" in
install)
if (( $# == 2 )); then
install_requirements "$@"
else
help_and_exit
fi
;;
create)
if (( $# == 4 )); then
create "$@"
else
help_and_exit
fi
;;
update)
if (( $# == 3 )); then
upload_lambda_source "$@"
else
help_and_exit
fi
;;
invoke)
if (( $# == 2 )); then
invoke "$@"
else
help_and_exit
fi
;;
*)
echo "Error: No such subcommand"
help_and_exit
esac
| mit |
Catorpilor/LeetCode | 37_sudoku_solver/sudoku.go | 3258 | package sudoku
import "fmt"
const (
n = 3
N = 3 * 3
)
var (
resolved bool
)
func solveSudoku(board [][]byte) [][]byte {
// box size 3
row := make([][]int, N)
columns := make([][]int, N)
box := make([][]int, N)
res := make([][]byte, N)
for i := 0; i < N; i++ {
row[i] = make([]int, N+1)
columns[i] = make([]int, N+1)
box[i] = make([]int, N+1)
res[i] = make([]byte, N)
copy(res[i], board[i])
}
for i := 0; i < N; i++ {
for j := 0; j < N; j++ {
if board[i][j] != '.' {
placeNumberAtPos(res, i, j, int(board[i][j]-'0'), row, columns, box)
}
}
}
fmt.Printf("row: %v\n, column: %v\n, box: %v\n", row, columns, box)
permute(res, 0, 0, row, columns, box)
return res
}
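// placeNumberAtPos writes digit num at board[i][j] and increments the usage
// counters for the corresponding row, column and 3x3 box.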
func placeNumberAtPos(board [][]byte, i, j, num int, row, columns, box [][]int) {
boxIdx := (i/n)*n + j/n
(row)[i][num]++
(columns)[j][num]++
(box)[boxIdx][num]++
(board)[i][j] = byte('0' + num)
}
// removeNumberAtPos undoes placeNumberAtPos: it clears board[i][j] and
// decrements the row/column/box usage counters for num so the digit can be
// tried elsewhere during backtracking.
func removeNumberAtPos(board [][]byte, i, j, num int, row, columns, box [][]int) {
	boxIdx := (i/n)*n + j/n
	row[i][num]--
	columns[j][num]--
	box[boxIdx][num]--
	board[i][j] = '.'
}
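// isValidPosForNum reports whether num can legally be placed at (i, j), i.e.
// the digit is not yet used in that row, column or 3x3 box.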
func isValidPosForNum(i, j, num int, row, columns, box [][]int) bool {
boxIdx := (i/n)*n + j/n
return row[i][num]+columns[j][num]+box[boxIdx][num] == 0
}
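// permute tries each candidate digit for the cell at (i, j) and backtracks
// via removeNumberAtPos when a placement does not lead to a solution.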
func permute(board [][]byte, i, j int, row, column, box [][]int) {
if board[i][j] == '.' {
for k := 1; k <= N; k++ {
if isValidPosForNum(i, j, k, row, column, box) {
placeNumberAtPos(board, i, j, k, row, column, box)
fmt.Printf("place k:%d at row: %d and col:%d, row[i][k]= %d, col[j][k] = %d and box[boxidx][k] = %d\n", k, i, j, row[i][k], column[j][k],
box[(i/n)*n+j/n][k])
placeNext(board, i, j, row, column, box)
fmt.Printf("place next then k:%d at row: %d and col:%d, row[i][k]= %d, col[j][k] = %d and box[boxidx][k] = %d\n", k, i, j, row[i][k], column[j][k],
box[(i/n)*n+j/n][k])
if !resolved {
removeNumberAtPos(board, i, j, k, row, column, box)
}
}
}
} else {
placeNext(board, i, j, row, column, box)
}
}
func placeNext(board [][]byte, i, j int, row, column, box [][]int) {
if i == N-1 && j == N-1 {
resolved = true
}
fmt.Printf("board: %v\n, row: %v \n, column: %v\n, box: %v\n", board, row, column, box)
if j == N-1 {
fmt.Println("next row")
permute(board, i+1, 0, row, column, box)
} else {
fmt.Println("next column")
permute(board, i, j+1, row, column, box)
}
}
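// solveSudoku2 is a simpler variant that solves the board in place using
// plain recursive backtracking without per-row/column/box counters.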
func solveSudoku2(board [][]byte) [][]byte {
if len(board) == 0 {
return board
}
solve(board)
return board
}
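// solve fills the first empty cell with each valid digit in turn and recurses;
// it returns true once the whole board is filled consistently.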
func solve(board [][]byte) bool {
var c byte
for i := 0; i < len(board); i++ {
for j := 0; j < len(board[0]); j++ {
if board[i][j] == '.' {
for c = '1'; c <= '9'; c++ {
if isValid(board, i, j, c) {
board[i][j] = c
if solve(board) {
return true
} else {
board[i][j] = '.'
}
}
}
return false
}
}
}
return true
}
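// isValid checks whether placing c at (row, col) conflicts with an existing
// digit in the same row, column or 3x3 box.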
func isValid(board [][]byte, row int, col int, c byte) bool {
for i := 0; i < 9; i++ {
if board[i][col] != '.' && board[i][col] == c {
return false
}
if board[row][i] != '.' && board[row][i] == c {
return false
}
if board[3*(row/3)+i/3][3*(col/3)+i%3] != '.' && board[3*(row/3)+i/3][3*(col/3)+i%3] == c {
return false
}
}
return true
}
| mit |
ngeor/nunitrunner-plugin | src/main/resources/net/ngeor/plugins/nunitrunner/MSBuildBuilder/help-assemblyVersion.html | 79 | <div>
The version string to use when patching assembly version files.
</div>
| mit |
Andre-LA/Ink | gui/components/rect_transform.lua | 432 | return function(parameters)
return {
position = parameters.position or {x = 0, y = 0, z = 0},
scale = parameters.scale or {x = 0, y = 0},
anchors = parameters.anchors or {up = 1, left = 1, right = 1, down = 1},
offset = parameters.offset or {up = 0, left = 0, right = 0, down = 0},
useTween = parameters.useTween == nil and true or parameters.useTween, -- defaults to true while still honouring an explicit false
velocity = parameters.velocity or 5
}
end
| mit |
theDrake/opengl-experiments | Fractals/Fractals/Fractal.cpp | 2212 | #include "Fractal.h"
Color::Color() : r(0.0), g(0.0), b(0.0) {}
Color::Color(double rin, double gin, double bin) : r(rin), g(gin), b(bin) {}
Fractal::Fractal(int width, int height)
: width_(width), height_(height), center_x_(0.0), center_y_(0.0),
max_distance_sqr_(4.0), max_iteration_(32) {
pixel_size_ = 4.0 / width_;
}
Fractal::~Fractal() {}
void Fractal::Initialize() {
RecalcMins();
CreateColors();
CalculateEscapeTime();
}
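// Builds a simple palette: one color per escape-time value, with points that
// never escape (max_iteration_) drawn in black.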
void Fractal::CreateColors() {
int i;
double r, g, b;
colors_.resize(max_iteration_ + 1);
for (i = 0; i < max_iteration_; i++) {
r = 1.0 * i / (double) max_iteration_;
g = 0.5 * i / (double) max_iteration_;
b = 1.0 * i / (double) max_iteration_;
colors_[i] = Color(r, g, b);
}
colors_[max_iteration_] = Color(0.0, 0.0, 0.0);
}
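// Computes the escape time for every pixel, mapping pixel coordinates to
// fractal-space coordinates via pixel_size_ and the current center.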
void Fractal::CalculateEscapeTime() {
int i, j;
double x, y, xmin, ymin;
xmin = center_x_ - pixel_size_ * (width_ / 2.0 - 0.5);
ymin = center_y_ - pixel_size_ * (height_ / 2.0 - 0.5);
escape_times_.resize(height_);
for (j = 0; j < height_; j++) {
escape_times_[j].resize(width_);
for (i = 0; i < width_; i++) {
x = xmin + i * pixel_size_;
y = ymin + j * pixel_size_;
escape_times_[j][i] = EscapeOne(x, y);
}
}
}
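// Draws the fractal by plotting each pixel with the color assigned to its
// precomputed escape time.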
void Fractal::Draw() {
int x, y, iter;
for (y = 0; y < height_; y++) {
for (x = 0; x < width_; x++) {
iter = escape_times_[y][x];
glColor3d(colors_[iter].r, colors_[iter].g, colors_[iter].b);
glBegin(GL_POINTS);
glVertex2d(x, y);
glEnd();
}
}
}
void Fractal::Center(double x, double y) {
RecalcCenter(x, y);
RecalcMins();
CalculateEscapeTime();
}
void Fractal::ZoomIn(double x, double y) {
RecalcCenter(x, y);
pixel_size_ /= 2.0;
RecalcMins();
CalculateEscapeTime();
}
void Fractal::ZoomOut(double x, double y) {
RecalcCenter(x, y);
pixel_size_ *= 2.0;
RecalcMins();
CalculateEscapeTime();
}
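// Converts a pixel coordinate (x, y) into fractal-space coordinates and makes
// it the new center.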
void Fractal::RecalcCenter(double x, double y) {
center_x_ = min_x_ + pixel_size_ * x;
center_y_ = min_y_ + pixel_size_ * y;
}
void Fractal::RecalcMins() {
min_x_ = center_x_ - pixel_size_ * (width_ / 2.0 - 0.5);
min_y_ = center_y_ - pixel_size_ * (height_ / 2.0 - 0.5);
}
| mit |
MechanicalMen/Mechanical3 | source/Mechanical3.Portable/IO/FileSystems/SemiThreadSafeMemoryFileSystemReader.cs | 17319 | using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using Mechanical3.Core;
namespace Mechanical3.IO.FileSystems
{
//// NOTE: For still more speed, you could ditch abstract file systems, file paths and streams altogether, and just use byte arrays directly.
//// Obviously this is a compromise between performance and ease of use.
/// <summary>
/// Provides performant, semi thread-safe access to an in-memory copy of the contents of an <see cref="IFileSystemReader"/>.
/// Access to <see cref="IFileSystemReader"/> members is thread-safe.
/// <see cref="Stream"/> instances returned are NOT thread-safe, and they must not be used after they are closed or disposed (one of which should happen exactly one time).
/// </summary>
public class SemiThreadSafeMemoryFileSystemReader : IFileSystemReader
{
#region ByteArrayReaderStream
/// <summary>
/// Provides thread-safe, read-only access to the wrapped byte array.
/// </summary>
private class ByteArrayReaderStream : Stream
{
//// NOTE: the default asynchronous implementations call their synchronous versions.
//// NOTE: we don't use Store*, to save a few allocations
#region ObjectPool
internal static class ObjectPool
{
private static readonly ConcurrentBag<ByteArrayReaderStream> Pool = new ConcurrentBag<ByteArrayReaderStream>();
internal static ByteArrayReaderStream Get( byte[] data )
{
ByteArrayReaderStream stream;
if( !Pool.TryTake(out stream) )
stream = new ByteArrayReaderStream();
stream.OnInitialize(data);
return stream;
}
internal static void Put( ByteArrayReaderStream stream )
{
Pool.Add(stream);
}
internal static void Clear()
{
ByteArrayReaderStream stream;
while( true )
{
if( Pool.TryTake(out stream) )
GC.SuppressFinalize(stream);
else
break; // pool is empty
}
}
}
#endregion
#region Private Fields
private byte[] array;
private int position; // NOTE: (position == length) == EOF
private bool isOpen;
#endregion
#region Constructor
private ByteArrayReaderStream()
: base()
{
}
#endregion
#region Private Methods
private void OnInitialize( byte[] data )
{
this.array = data;
this.position = 0;
this.isOpen = true;
}
private void OnClose()
{
this.isOpen = false;
this.array = null;
this.position = -1;
}
private void ThrowIfClosed()
{
if( !this.isOpen )
throw new ObjectDisposedException(message: "The stream was already closed!", innerException: null);
}
#endregion
#region Stream
protected override void Dispose( bool disposing )
{
this.OnClose();
ObjectPool.Put(this);
if( !disposing )
{
// called from finalizer
GC.ReRegisterForFinalize(this);
}
}
public override bool CanRead
{
get
{
this.ThrowIfClosed();
return true;
}
}
public override bool CanSeek
{
get
{
this.ThrowIfClosed();
return true;
}
}
public override bool CanTimeout
{
get
{
this.ThrowIfClosed();
return false;
}
}
public override bool CanWrite
{
get
{
this.ThrowIfClosed();
return false;
}
}
public override long Length
{
get
{
this.ThrowIfClosed();
return this.array.Length;
}
}
public override long Position
{
get
{
this.ThrowIfClosed();
return this.position;
}
set
{
this.ThrowIfClosed();
if( value < 0 || this.array.Length < value )
throw new ArgumentOutOfRangeException();
this.position = (int)value;
}
}
public override long Seek( long offset, SeekOrigin origin )
{
this.ThrowIfClosed();
int newPosition;
switch( origin )
{
case SeekOrigin.Begin:
newPosition = (int)offset;
break;
case SeekOrigin.Current:
newPosition = this.position + (int)offset;
break;
case SeekOrigin.End:
newPosition = this.array.Length + (int)offset;
break;
default:
throw new ArgumentException("Invalid SeekOrigin!");
}
if( newPosition < 0
|| newPosition > this.array.Length )
throw new ArgumentOutOfRangeException();
this.position = newPosition;
return this.position;
}
public override int ReadByte()
{
this.ThrowIfClosed();
if( this.position == this.array.Length )
{
// end of stream
return -1;
}
else
{
return this.array[this.position++];
}
}
public override int Read( byte[] buffer, int offset, int bytesToRead )
{
this.ThrowIfClosed();
int bytesLeft = this.array.Length - this.position;
if( bytesLeft < bytesToRead ) bytesToRead = bytesLeft;
if( bytesToRead == 0 )
return 0;
Buffer.BlockCopy(src: this.array, srcOffset: this.position, dst: buffer, dstOffset: offset, count: bytesToRead);
this.position += bytesToRead;
return bytesToRead;
}
public override void Flush()
{
throw new NotSupportedException();
}
public override void SetLength( long value )
{
throw new NotSupportedException();
}
public override void WriteByte( byte value )
{
throw new NotSupportedException();
}
public override void Write( byte[] buffer, int offset, int count )
{
throw new NotSupportedException();
}
#endregion
}
#endregion
#region Private Fields
/* From MSDN:
* A Dictionary can support multiple readers concurrently, as long as the collection is not modified.
* Even so, enumerating through a collection is intrinsically not a thread-safe procedure. In the
* rare case where an enumeration contends with write accesses, the collection must be locked during
* the entire enumeration. To allow the collection to be accessed by multiple threads for reading
* and writing, you must implement your own synchronization.
*
* ... (or there is also ConcurrentDictionary)
*/
//// NOTE: since after they are filled, the dictionaries won't be modified, we are fine.
private readonly FilePath[] rootFolderEntries;
private readonly Dictionary<FilePath, FilePath[]> nonRootFolderEntries;
private readonly Dictionary<FilePath, byte[]> fileContents;
private readonly Dictionary<FilePath, string> hostPaths;
private readonly string rootHostPath;
#endregion
#region Constructors
/// <summary>
/// Copies the current contents of the specified <see cref="IFileSystemReader"/>
/// into a new <see cref="SemiThreadSafeMemoryFileSystemReader"/> instance.
/// </summary>
/// <param name="readerToCopy">The abstract file system to copy the current contents of, into memory.</param>
/// <returns>A new <see cref="SemiThreadSafeMemoryFileSystemReader"/> instance.</returns>
public static SemiThreadSafeMemoryFileSystemReader CopyFrom( IFileSystemReader readerToCopy )
{
return new SemiThreadSafeMemoryFileSystemReader(readerToCopy);
}
/// <summary>
/// Initializes a new instance of the <see cref="SemiThreadSafeMemoryFileSystemReader"/> class.
/// </summary>
/// <param name="readerToCopy">The abstract file system to copy the current contents of, into memory.</param>
private SemiThreadSafeMemoryFileSystemReader( IFileSystemReader readerToCopy )
{
this.rootFolderEntries = readerToCopy.GetPaths();
this.nonRootFolderEntries = new Dictionary<FilePath, FilePath[]>();
this.fileContents = new Dictionary<FilePath, byte[]>();
if( readerToCopy.SupportsToHostPath )
{
this.hostPaths = new Dictionary<FilePath, string>();
this.rootHostPath = readerToCopy.ToHostPath(null);
}
using( var tmpStream = new MemoryStream() )
{
foreach( var e in this.rootFolderEntries )
this.AddRecursively(e, readerToCopy, tmpStream);
}
}
#endregion
#region Private Methods
private void AddRecursively( FilePath entry, IFileSystemReader readerToCopy, MemoryStream tmpStream )
{
if( entry.IsDirectory )
{
var subEntries = readerToCopy.GetPaths(entry);
this.nonRootFolderEntries.Add(entry, subEntries);
foreach( var e in subEntries )
this.AddRecursively(e, readerToCopy, tmpStream);
}
else
{
this.fileContents.Add(entry, ReadFileContents(entry, readerToCopy, tmpStream));
}
if( this.SupportsToHostPath )
this.hostPaths.Add(entry, readerToCopy.ToHostPath(entry));
}
private static byte[] ReadFileContents( FilePath filePath, IFileSystemReader readerToCopy, MemoryStream tmpStream )
{
tmpStream.SetLength(0);
long? fileSize = null;
if( readerToCopy.SupportsGetFileSize )
{
fileSize = readerToCopy.GetFileSize(filePath);
ThrowIfFileTooBig(filePath, fileSize.Value);
}
using( var stream = readerToCopy.ReadFile(filePath) )
{
if( !fileSize.HasValue
&& stream.CanSeek )
{
fileSize = stream.Length;
ThrowIfFileTooBig(filePath, fileSize.Value);
}
stream.CopyTo(tmpStream);
if( !fileSize.HasValue )
ThrowIfFileTooBig(filePath, tmpStream.Length);
return tmpStream.ToArray();
}
}
private static void ThrowIfFileTooBig( FilePath filePath, long fileSize )
{
if( fileSize > int.MaxValue ) // that's the largest our stream implementation can support
throw new Exception("One of the files is too large!").Store(nameof(filePath), filePath).Store(nameof(fileSize), fileSize);
}
#endregion
#region IFileSystemBase
/// <summary>
/// Gets a value indicating whether the ToHostPath method is supported.
/// </summary>
/// <value><c>true</c> if the method is supported; otherwise, <c>false</c>.</value>
public bool SupportsToHostPath
{
get { return this.hostPaths.NotNullReference(); }
}
/// <summary>
/// Gets the string the underlying system uses to represent the specified file or directory.
/// </summary>
/// <param name="path">The path to the file or directory.</param>
/// <returns>The string the underlying system uses to represent the specified <paramref name="path"/>.</returns>
public string ToHostPath( FilePath path )
{
if( !this.SupportsToHostPath )
throw new NotSupportedException().StoreFileLine();
if( path.NullReference() )
return this.rootHostPath;
string result;
if( this.hostPaths.TryGetValue(path, out result) )
return result;
else
throw new FileNotFoundException("The specified file or directory was not found!").Store(nameof(path), path);
}
#endregion
#region IFileSystemReader
/// <summary>
/// Gets the paths to the direct children of the specified directory.
/// Subdirectories are not searched.
/// </summary>
/// <param name="directoryPath">The path specifying the directory to list the direct children of; or <c>null</c> to specify the root of this file system.</param>
/// <returns>The paths of the files and directories found.</returns>
public FilePath[] GetPaths( FilePath directoryPath = null )
{
if( directoryPath.NotNullReference()
&& !directoryPath.IsDirectory )
throw new ArgumentException("Argument is not a directory!").Store(nameof(directoryPath), directoryPath);
FilePath[] paths;
if( directoryPath.NullReference() )
{
paths = this.rootFolderEntries;
}
else
{
if( !this.nonRootFolderEntries.TryGetValue(directoryPath, out paths) )
throw new FileNotFoundException("The specified file or directory was not found!").Store(nameof(directoryPath), directoryPath);
}
// NOTE: Unfortunately we need to make a copy, since arrays are writable.
// TODO: return ImmutableArray from GetPaths.
var copy = new FilePath[paths.Length];
Array.Copy(sourceArray: paths, destinationArray: copy, length: paths.Length);
return copy;
}
/// <summary>
/// Opens the specified file for reading.
/// </summary>
/// <param name="filePath">The path specifying the file to open.</param>
/// <returns>A <see cref="Stream"/> representing the file opened.</returns>
public Stream ReadFile( FilePath filePath )
{
if( filePath.NullReference()
|| filePath.IsDirectory )
throw new ArgumentException("Argument is not a file!").Store(nameof(filePath), filePath);
byte[] bytes;
if( !this.fileContents.TryGetValue(filePath, out bytes) )
throw new FileNotFoundException("The specified file or directory was not found!").Store(nameof(filePath), filePath);
return ByteArrayReaderStream.ObjectPool.Get(bytes);
}
/// <summary>
/// Gets a value indicating whether the GetFileSize method is supported.
/// </summary>
/// <value><c>true</c> if the method is supported; otherwise, <c>false</c>.</value>
public bool SupportsGetFileSize
{
get { return true; }
}
/// <summary>
/// Gets the size, in bytes, of the specified file.
/// </summary>
/// <param name="filePath">The file to get the size of.</param>
/// <returns>The size of the specified file in bytes.</returns>
public long GetFileSize( FilePath filePath )
{
if( filePath.NullReference()
|| filePath.IsDirectory )
throw new ArgumentException("Argument is not a file!").Store(nameof(filePath), filePath);
byte[] bytes;
if( !this.fileContents.TryGetValue(filePath, out bytes) )
throw new FileNotFoundException("The specified file or directory was not found!").Store(nameof(filePath), filePath);
return bytes.Length;
}
#endregion
}
}
| mit |
LunneMarketingGroup/Grunt-Website-Template | config.rb | 323 | http_path = "/"
css_dir = "assets/css/src"
sass_dir = "assets/sass"
images_dir = "assets/img"
javascripts_dir = "assets/js"
fonts_dir = "assets/font"
http_fonts_path = "assets/font"
http_images_path = "assets/img"
output_style = :nested
relative_assets = false
line_comments = false
| mit |