branch_name
stringclasses 149
values | text
stringlengths 23
89.3M
| directory_id
stringlengths 40
40
| languages
listlengths 1
19
| num_files
int64 1
11.8k
| repo_language
stringclasses 38
values | repo_name
stringlengths 6
114
| revision_id
stringlengths 40
40
| snapshot_id
stringlengths 40
40
|
---|---|---|---|---|---|---|---|---|
refs/heads/master | <repo_name>elidoran/cosmos-path-helpers<file_sep>/README.md
# Unlinked from Atmosphere
I no longer use any of these and won't be making any updates.
Also, FlowRouter is busy growing its API and I expect they'll make a "helpers"
package soon.
# Path Helpers [](https://travis-ci.org/elidoran/cosmos-path-helpers)
I moved from [Iron Router](http://github.com/iron-meteor/iron-router) to FlowRouter and missed some of the helpers.
I implemented those helpers in this package allowing their use with any future router implementations.
Some of the helpers require a path generating implementation. This package allows its users to provide path generation to it.
I made a package which uses FlowRouter for path generation (it doesn't have these helpers builtin).
Iron Router has these helpers and its own path generating ability. (It doesn't have the isPath helpers, but zimme:iron-router-active provides them to Iron Router).
## Available Implementations
1. FlowRouter -> [cosmos:flow-router-path-helpers](http://github.com/elidoran/cosmos-flow-router-path-helpers)
```
meteor add cosmos:flow-router-path-helpers
```
It would be possible to provide path generation in this package similar to FlowRouter, however, I chose to make use of their implementation instead of making my own.
## Helpers
### Similar to Iron Router
1. pathFor
2. urlFor
3. linkTo
The difference is there are no named routes. This supports FlowRouter which has no named routes. You must specify the path pattern instead.
### Similar to [zimme:iron-router-active](https://github.com/zimme/meteor-iron-router-active)
1. isPath
2. isPathNot
These do not require a router implementation. They use `windows.location`.
Usage is similar to [zimme:iron-router-active](https://github.com/zimme/meteor-iron-router-active). The differences:
1. no 'Active' in the name
2. the "not" name has it as a suffix so it reads horizontally like:
`{{isPathNot path='/some/where'}}`
3. can specify a `regex` value *or* a `path` value
## Usage: pathFor / urlFor / linkTo
`pathFor` and `urlFor` both produce a string from a `path` pattern and optional extras: `params`, `query`, and `anchor`.
I used `anchor` instead of `hash` because Blaze stores the data provided to helpers using key `hash` and I wanted to avoid similar names.
`pathFor` is a relative path and `urlFor` is an absolute path.
If we have these optional extras provided by our template helper:
``` coffeescript
# CoffeeScript
Template.things.helpers
getParams: -> {id: '12345'}
getQuery: -> {display: 'brief', extras: 10}
```
``` javascript
// JavaScript
Template.things.helpers({
getParams: function() {
return {id: '12345'};
},
getQuery: function() {
return {display: 'brief', extras: 10};
}
});
```
Then this:
```
{{ pathFor path='/things/:id' params=getParams query=getQuery anchor='hashFrag'}}
```
Produces:
```
/things/12345?dislpay=brief&extras=10#hashFrag
```
`urlFor` uses `Meteor.absoluteUrl` to produce:
```html
http://yourhost:port/things/12345?dislpay=brief&extras=10#hashFrag
```
`linkTo` accepts content and produces an anchor element. This:
```html
{{#linkTo path='/items/:id' params=getParams query=getQuery anchor='hashFrag'}}
<span style="color:blue;">
Item
</span>
{{#/linkTo}}
```
Produces:
```html
<a href="/items/12345?display=brief&extras=10#hashFrag">
<span style="color:blue;">
Item
</span>
</a>
```
## Usage: isPath / isPathNot
### Attribute Use
`isPath` returns `active`, or the specified `className`, when the path matches; otherwise it returns **false** and the helper leaves the attribute blank.
`isPathNot` returns `disabled`, or the specified `className`, when the path *doesn't* match; otherwise it returns **false** and the helper leaves the attribute blank.
```html
<li class="{{isPath path='/home'}}">Home</li>
<li class="{{isPath regex='/profile|/options'}}">User</li>
```
### Conditional Use
Use them with `{{#if}}` in templates to control content
```html
{{#if isPath path='/home'}}
<div>some content</div>
{{/if}}
```
## How to implement in your app or package?
A (simplified) example implementation using FlowRouter ([full impl](http://github.com/elidoran/cosmos-flow-router-path-helpers/blob/master/flowRouterImpl.coffee)):
```coffeescript
# CoffeeScript:
PathHelpers.path = (options) ->
FlowRouter.path options.path, options?.params, options?.query
```
```javascript
// JavaScript:
PathHelpers.path = function flowRouterPath(options) {
return FlowRouter.path(options.path, options.params, options.query);
};
```
## Possible: Router agnostic implementation
The path conversion code doesn't need to be linked to a router implementation. I used the already made ability of FlowRouter to generate the paths because I'm using FlowRouter.
If someone provided router agnostic code which performs similar to `FlowRouter.path` then it could be combined with this package and used with any router (except Iron Router which implements these helpers in its own way).
<file_sep>/lib/globalObject.js
// Must use a JS file to export objects in Meteor. Doing it in CS is convoluted.
// export this object so other packages can alter the path function to their own
// implementation
PathHelpers = {
path: function defaultPathImpl(options) {
throw new Error("Implement PathHelpers.path, or use package cosmos:flow-router-path-helpers");
}
};
| deb1e7478ba0c3d9772233f9a417137fd238d31d | [
"Markdown",
"JavaScript"
]
| 2 | Markdown | elidoran/cosmos-path-helpers | c68f2c3d0ba7b13bc0ed2e1631f54c506647cda3 | c62a40e9cc1e0c6db4858253ca7385be476d02b2 |
refs/heads/main | <file_sep>import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';
import { Config } from '../helpers/config';
@Injectable({
providedIn: 'root'
})
export class UserService {
constructor(
private http: HttpClient,
private config: Config
) { }
}<file_sep><?php
/**
* Default success response
*
* @param string $message
* @param string $code
* @return array
*/
function success($message, $code = CODE::S_GENERIC) {
return [
'message' => $message,
'code' => $code
];
}
/**
* Default error response
*
* @param string $message
* @param string $code
* @return array
*/
function error($message, $code = CODE::E_GENERIC) {
return [
'message' => $message,
'code' => $code
];
}
/**
* Default api response structure
*
* @param array $data
* @param string $message
* @param string $code
* @return array
*/
function default_response($data, $message = null, $code = CODE::S_GENERIC){
$response = [
'code' => $code,
'data' => $data
];
if ($message)
$response['message'] = $message;
return $response;
}<file_sep><?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class Customer extends Model
{
/**
* Model fillable fields
*
* @var array
*/
protected $fillable = [
'company_name',
'fantasy_name',
'active',
'cnpj',
'state_registration',
'municipal_registration',
'address_id',
'user_id'
];
/**
* Model field casts
*
* @var array
*/
protected $casts = [
'active' => 'boolean'
];
/**
* User
*
* @return \Illuminate\Database\Eloquent\Relations\BelongsTo
*/
public function user() {
return $this->belongsTo(User::class, 'user_id');
}
/**
* Customers email addresses
*
* @return \Illuminate\Database\Eloquent\Relations\HasMany
*/
public function email_adresses() {
return $this->hasMany(EmailAddress::class, 'customer_id');
}
/**
* Customer Addresses
*
* @return \Illuminate\Database\Eloquent\Relations\HasOne
*/
public function address() {
return $this->hasOne(Address::class, 'address_id');
}
/**
* Customer Jobs
*
* @return \Illuminate\Database\Eloquent\Relations\HasMany
*/
public function jobs() {
return $this->hasMany(Job::class, 'customer_id');
}
}
<file_sep>import { BrowserModule } from '@angular/platform-browser';
import { NgModule } from '@angular/core';
import { FormsModule } from '@angular/forms';
import { HttpClientModule } from '@angular/common/http';
import { SnotifyModule, SnotifyService, ToastDefaults } from 'ng-snotify';
import { AppRoutingModule } from './app-routing.module';
import { AppComponent } from './app.component';
import { LoginComponent } from './components/auth/login/login.component';
import { RegisterComponent } from './components/auth/register/register.component';
import { HomeComponent } from './components/app/home/home.component';
import { AuthLayoutComponent } from './components/layouts/auth-layout/auth-layout.component';
import { AppLayoutComponent } from './components/layouts/app-layout/app-layout.component';
import { BrowserAnimationsModule } from '@angular/platform-browser/animations';
import { NbThemeModule, NbLayoutModule, NbSidebarModule, NbMenuModule } from '@nebular/theme';
import { NbEvaIconsModule } from '@nebular/eva-icons';
@NgModule({
declarations: [
AppComponent,
LoginComponent,
RegisterComponent,
HomeComponent,
AuthLayoutComponent,
AppLayoutComponent
],
imports: [
BrowserModule,
AppRoutingModule,
FormsModule,
HttpClientModule,
SnotifyModule,
BrowserAnimationsModule,
NbThemeModule.forRoot({ name: 'dark' }),
NbLayoutModule,
NbEvaIconsModule,
NbSidebarModule.forRoot(),
NbMenuModule.forRoot()
],
providers: [
{ provide: 'SnotifyToastConfig', useValue: ToastDefaults},
SnotifyService
],
bootstrap: [AppComponent]
})
export class AppModule { }
<file_sep><?php
namespace App\Constants;
class Code {
/**
* Errors
*/
const E_GENERIC = 'E-001';
const E_PROCESS = 'E-002';
const E_DATABASE = 'E-003';
const E_INVALID_TOKEN = 'E-004';
/**
* Success
*/
const S_GENERIC = 'S-001';
const S_CREATED = 'S-002';
const S_UPDATED = 'S-003';
const S_DELETED = 'S-004';
}<file_sep><?php
namespace App\Models;
use Tymon\JWTAuth\Contracts\JWTSubject;
use Illuminate\Notifications\Notifiable;
use Illuminate\Foundation\Auth\User as Authenticatable;
use App\Traits\FormatTimestamps;
class User extends Authenticatable implements JWTSubject
{
use Notifiable, FormatTimestamps;
/**
* Model fillable fields
*
* @var array
*/
protected $fillable = [
'email',
'first_name',
'last_name',
'password',
'address_id'
];
/**
* Appends custom attributes
*
* @var array
*/
protected $appends = [
'full_name'
];
/**
* The hidden attributes
*
* @var array
*/
protected $hidden = [
'password'
];
/**
* User Customers
*
* @return \Illuminate\Database\Eloquent\Relations\HasMany
*/
public function customers() {
return $this->hasMany(Customer::class, 'user_id');
}
/**
* User Jobs
*
* @return \Illuminate\Database\Eloquent\Relations\HasMany
*/
public function jobs() {
return $this->hasMany(Job::class, 'user_id');
}
/**
* Customer Addresses
*
* @return \Illuminate\Database\Eloquent\Relations\HasOne
*/
public function address() {
return $this->hasOne(Address::class, 'address_id');
}
/**
* Return the user full name
*
* @return string
*/
public function getFullNameAttribute(){
return "{$this->first_name} {$this->last_name}";
}
/**
* Encrypt the new password
*
* @return string
*/
public function setPasswordAttribute($value){
$this->attributes['password'] = <PASSWORD>($value);
}
/**
* Get the identifier that will be stored in the subject claim of the JWT.
*
* @return mixed
*/
public function getJWTIdentifier()
{
return $this->getKey();
}
/**
* Return a key value array, containing any custom claims to be added to the JWT.
*
* @return array
*/
public function getJWTCustomClaims()
{
return [];
}
}
<file_sep><?php
namespace App\Traits;
use Carbon\Carbon;
trait FormatTimestamps
{
public function getCreatedAtAttribute($value){
$value = new Carbon($value);
return $value->format('d/m/Y H:i');
}
public function getUpdatedAtAttribute($value){
$value = new Carbon($value);
return $value->format('d/m/Y H:i');
}
}<file_sep><?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class EmailAddress extends Model
{
/**
* Model fillable fields
*
* @var array
*/
protected $fillable = [
'email',
'customer_id',
'default'
];
/**
* Model field casts
*
* @var array
*/
protected $casts = [
'default' => 'boolean'
];
/**
* Email address of the customer
*
* @return \Illuminate\Database\Eloquent\Relations\BelongsTo
*/
public function customer() {
return $this->belongsTo(Customer::Class, 'customer_id');
}
}
<file_sep># SisMEI
- Made with Laravel 6 and Angular 11
## Database ER Diagram
<file_sep>import { Component, OnInit } from '@angular/core';
@Component({
selector: 'app-app-layout',
templateUrl: './app-layout.component.html',
styleUrls: ['./app-layout.component.scss']
})
export class AppLayoutComponent implements OnInit {
public menu = [
{
title: 'Home',
icon: 'home-outline',
link: '/home',
home: true,
},
{
title: 'Jobs',
icon: 'settings-2-outline',
link: '/jobs',
},
{
title: 'Customers',
icon: 'people-outline',
link: '/customers',
}
];
constructor() { }
ngOnInit(): void {
}
}
<file_sep><?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class Job extends Model
{
/**
* Model fillable fields
*
* @var array
*/
protected $fillable = [
'name',
'description',
'status_id',
'user_id',
'customer_id',
'parent_id',
'estimated_hours',
'due_date',
'start_date',
'done_ratio'
];
/**
* Job User owner
*
* @return \Illuminate\Database\Eloquent\Relations\BelongsTo
*/
public function user() {
return $this->belongsTo(User::class, 'user_id');
}
/**
* Job Customer
*
* @return \Illuminate\Database\Eloquent\Relations\BelongsTo
*/
public function customer() {
return $this->belongsTo(Customer::class, 'customer_id');
}
/**
* Job Comments
*
* @return \Illuminate\Database\Eloquent\Relations\HasMany
*/
public function comments() {
return $this->hasMany(JobComment::class, 'job_id');
}
/**
* Job Status
*
* @return \Illuminate\Database\Eloquent\Relations\HasOne
*/
public function status() {
return $this->hasOne(JobStatus::class, 'status_id');
}
/**
* Job time entries
*
* @return \Illuminate\Database\Eloquent\Relations\HasMany
*/
public function time_entries() {
return $this->hasMany(TimeEntry::class, 'job_id');
}
}
<file_sep>import { Component, OnInit } from '@angular/core';
import { AuthService } from 'src/app/services/auth.service';
import { SnotifyService } from 'ng-snotify';
import { TokenService } from 'src/app/services/token.service';
import { Router } from '@angular/router';
@Component({
selector: 'app-login',
templateUrl: './login.component.html',
styleUrls: ['./login.component.scss']
})
export class LoginComponent implements OnInit {
form: any = {
email: null,
password: null
};
constructor(
private auth: AuthService,
private token: TokenService,
private notify: SnotifyService,
private router: Router
) { }
ngOnInit(): void {
}
public onSubmit() {
this.auth.login(this.form.email, this.form.password).subscribe( (response) => {
this.notify.success("Signed in successfully");
this.token.saveToken(response.access_token);
this.router.navigateByUrl('/home');
}, (error) => {
if (error.status == 401) {
this.notify.error("Password or email incorrect");
} else {
this.notify.error("Error during authentication");
}
})
}
}
<file_sep><?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class JobComment extends Model
{
/**
* Model fillable fields
*
* @var array
*/
protected $fillable = [
'comment',
'job_id'
];
/**
* Commented job
*
* @return \Illuminate\Database\Eloquent\Relations\BelongsTo
*/
public function job() {
return $this->belongsTo(Job::class, 'job_id');
}
}
<file_sep><?php
return [
CODE::E_GENERIC => 'An error occurred while performing your request.',
CODE::E_PROCESS => 'An error occurred while processing your request.',
CODE::E_DATABASE => 'An error occurred while trying to execute your request on the database.',
CODE::E_INVALID_TOKEN => 'Your access token is invalid',
CODE::S_GENERIC => 'Operation finished with success',
CODE::S_CREATED => 'Register created successfully',
CODE::S_UPDATED => 'Register updated successfully',
CODE::S_DELETED => 'Register removed successfully',
];<file_sep><?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class JobStatus extends Model
{
/**
* Model fillable fields
*
* @var array
*/
protected $fillable = [
'name',
'is_closed',
'is_default'
];
/**
* Model field casts
*
* @var array
*/
protected $casts = [
'is_closed' => 'boolean',
'is_default' => 'boolean'
];
/**
* Jobs with status
*
* @return \Illuminate\Database\Eloquent\Relations\HasMany
*/
public function jobs() {
return $this->hasMany(Job::class, 'status_id');
}
}
<file_sep>import { environment } from "../../environments/environment";
import { Injectable } from '@angular/core';
@Injectable()
export class Config {
private apiURL = environment['API_URL'];
public getApiUrl() {
return this.apiURL;
}
}
<file_sep><?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class TimeEntry extends Model
{
/**
* Model fillable fields
*
* @var array
*/
protected $fillable = [
'comment',
'spent_on',
'hours',
'job_id'
];
/**
* Job of the time entry
*
* @return \Illuminate\Database\Eloquent\Relations\BelongsTo
*/
public function job() {
return $this->belongsTo(Job::class, 'job_id');
}
}
| 076c286256eab992cd104ceda50c500574fcec23 | [
"Markdown",
"TypeScript",
"PHP"
]
| 17 | TypeScript | PauloAK/SisMEI | d6958b7257b086cc960f001d6aff5a7929f160ea | 5d16e7e053b7bd1e77c63649d29c1b31acebddef |
refs/heads/master | <repo_name>mshiihara/classwork_sound<file_sep>/openal_playstream/main.cpp
#include "OpenAL.h"
int main(void) {
OpenAL openAL;
// OpenAL初期化
openAL.init();
// 再生
openAL.play("sample.wav");
// OpenAL終了
openAL.clear();
return 0;
}<file_sep>/openal_playstream/OpenAL.h
#pragma once
#include <al.h>
#include <alc.h>
#include <string.h>
#include "WaveFile.h"
// ストリームに使用するバッファの数
#define NUMBUFFERS 4
#define MAX_NUM_WAVEID 1024
class OpenAL {
ALCdevice* device;
ALCcontext* context;
WaveFile waveFile;
public:
void init();
void play(const char* filename);
void clear();
};<file_sep>/openal_playstream/WaveFile.h
#pragma once
#include <Windows.h>
#include <stdio.h>
#include <guiddef.h>
#include <mmreg.h>
// RIFFチャンクを格納する為の構造体
struct RIFFHeader {
char tag[4];
unsigned long size;
char format[4];
};
struct RIFFChunk {
char tag[4];
unsigned long size;
};
struct WAVEFMT {
unsigned short usFormatTag;
unsigned short usChannels;
unsigned long ulSamplesPerSec;
unsigned long ulAvgBytesPerSec;
unsigned short usBlockAlign;
unsigned short usBitsPerSample;
unsigned short usSize;
unsigned short usReserved;
unsigned long ulChannelMask;
GUID guidSubFormat;
};
enum WAVEFILETYPE
{
WF_EX = 1,
WF_EXT = 2
};
struct WAVEFILEINFO {
WAVEFILETYPE wfType; // PCMなのかEXTENSIBLEなのかを区別する為の情報
WAVEFORMATEXTENSIBLE wfEXT; // フォーマット情報
unsigned long waveSize; // waveデータの大きさ
unsigned long waveChunkPos; // waveチャンクのファイルポインタ
};
class WaveFile {
public:
FILE* fp;
WAVEFMT waveFmt;
WAVEFILEINFO waveInfo;
void open(const char* filename);
long read(void* pData, int bufferSize);
private:
bool checkRIFFHeader();
void readHeader();
void readFMT_(RIFFChunk& riffChunk);
};<file_sep>/openal_playstream/OpenAL.cpp
#include "OpenAL.h"
void OpenAL::init() {
// OpenALを開く
device = alcOpenDevice(nullptr);
context = nullptr;
if (device) {
context = alcCreateContext(device, nullptr);
alcMakeContextCurrent(context);
}
}
void OpenAL::play(const char* filename) {
// バッファの作成
ALuint buffers[NUMBUFFERS];
alGenBuffers(NUMBUFFERS, buffers);
// ソースの作成
ALuint source;
alGenSources(1, &source);
ALuint uiBuffer;
WAVEFILEINFO* m_WaveIDs[MAX_NUM_WAVEID];
// 中身をゼロで初期化
memset(&m_WaveIDs, 0, sizeof(m_WaveIDs));
int waveId;
unsigned long ulFrequency = 0;
unsigned long ulFormat = 0;
unsigned long ulBufferSize;
WAVEFORMATEX wfe;
void* pData = NULL;
ALint iBuffersProcessed;
// waveファイルを開く
waveFile.open("sample.wav");
long lLoop = 0;
for (lLoop = 0; lLoop < MAX_NUM_WAVEID; lLoop++) {
if (!m_WaveIDs[lLoop]){
m_WaveIDs[lLoop] = &waveFile.waveInfo;
waveId = lLoop;
break;
}
}
ulFrequency = m_WaveIDs[waveId]->wfEXT.Format.nSamplesPerSec;
//
// フォーマットを取得
//
// 普通のwaveファイル
if (m_WaveIDs[waveId]->wfType == WF_EX) {
// 1チャンネル モノラル
if (m_WaveIDs[waveId]->wfEXT.Format.nChannels == 1) {
// 量子化ビット数で分岐
switch (m_WaveIDs[waveId]->wfEXT.Format.wBitsPerSample) {
case 4:
ulFormat = alGetEnumValue("AL_FORMAT_MONO_IMA4");
break;
case 8:
ulFormat = alGetEnumValue("AL_FORMAT_MONO8");
break;
case 16:
ulFormat = alGetEnumValue("AL_FORMAT_MONO16");
break;
}
}
// 2チャンネル ステレオ
else if (m_WaveIDs[waveId]->wfEXT.Format.nChannels == 2) {
// 量子化ビット数で分岐
switch (m_WaveIDs[waveId]->wfEXT.Format.wBitsPerSample) {
case 4:
ulFormat = alGetEnumValue("AL_FORMAT_STEREO_IMA4");
break;
case 8:
ulFormat = alGetEnumValue("AL_FORMAT_STEREO8");
break;
case 16:
ulFormat = alGetEnumValue("AL_FORMAT_STEREO16");
break;
}
}
// 4チャンネル
else if ((m_WaveIDs[waveId]->wfEXT.Format.nChannels == 4)
&& (m_WaveIDs[waveId]->wfEXT.Format.wBitsPerSample == 16)) {
ulFormat = alGetEnumValue("AL_FORMAT_QUAD16");
}
}
// 拡張されたwaveファイル
else if (m_WaveIDs[waveId]->wfType == WF_EXT) {
//todo: 後で実装する。サンプルデータも今無いし・・・
printf("未実装\n");
}
//WAVEFORMATEXを取得
memcpy(&wfe, &(m_WaveIDs[waveId]->wfEXT.Format),
sizeof(WAVEFORMATEX));
//
// 250mmに近いブロックアライメントの倍数をもとめる
//
// 1s/4 = 250mm
ulBufferSize = wfe.nAvgBytesPerSec >> 2;
// ブロックアライメントの倍数からはみ出している部分を引く
ulBufferSize -= (ulBufferSize % wfe.nBlockAlign);
// バッファを確保
if (ulFormat != 0) {
pData = malloc(ulBufferSize);
// dataチャンクに移動
fseek(waveFile.fp, m_WaveIDs[waveId]->waveChunkPos, SEEK_SET);
// バッファにデータを読み込み
for (int i = 0; i < NUMBUFFERS; i++) {
// ファイルからデータを読み取り
long len = waveFile.read(pData, ulBufferSize);
alBufferData(buffers[i], ulFormat, pData, len, ulFrequency);
alSourceQueueBuffers(source, 1, &buffers[i]);
}
}
alSourcePlay(source);
while (1) {
// ESCで処理終了
if (GetAsyncKeyState(VK_ESCAPE)) { break; }
// 再生済みのバッファ数を求める
iBuffersProcessed = 0;
alGetSourcei(source, AL_BUFFERS_PROCESSED, &iBuffersProcessed);
// 再生済みのバッファがあった
while (iBuffersProcessed) {
// キューから使用済みのバッファを1つ取り出す(削除)
// uiBufferniには削除されたバッファの名前(識別する為の値)が格納される
uiBuffer = 0;
alSourceUnqueueBuffers(source, 1, &uiBuffer);
// ファイルからデータを読み取り
long len = waveFile.read(pData, ulBufferSize);
alBufferData(uiBuffer, ulFormat, pData, len, ulFrequency);
alSourceQueueBuffers(source, 1, &uiBuffer);
// 使用済みバッファの数を一つ減らす
iBuffersProcessed--;
}
// 現在の状態を取得
ALint iState;
alGetSourcei(source, AL_SOURCE_STATE, &iState);
// 再生していなければ処理を終了
if (iState != AL_PLAYING) {
break;
}
}
}
void OpenAL::clear() {
// OpenALを閉じる
alcMakeContextCurrent(nullptr);
alcDestroyContext(context);
alcCloseDevice(device);
}<file_sep>/openal_playstream/WaveFile.cpp
#include "WaveFile.h"
void WaveFile::open(const char * filename) {
fopen_s(&fp, filename, "rb");
// ヘッダ情報を読み込み
if (fp) {
readHeader();
}
else {
printf("ファイルを開くことに失敗しました。\n");
}
}
long WaveFile::read(void * pData, int bufferSize) {
//
// 読み込もうと考えているサイズがファイルに残っているか?
//
unsigned long ulOffset = ftell(fp);
if ((ulOffset - waveInfo.waveChunkPos + bufferSize) > waveInfo.waveSize) {
bufferSize = waveInfo.waveSize - (ulOffset - waveInfo.waveChunkPos);
}
// ファイルからデータを読み取り
return fread(pData, 1, bufferSize, fp);
}
bool WaveFile::checkRIFFHeader() {
// ヘッダを読み取り
RIFFHeader riffHeader;
fread(&riffHeader, 1, sizeof(RIFFHeader), fp);
// 読み取ったヘッダがRIFFであるか確認
if (_strnicmp(riffHeader.tag, "RIFF", 4) == 0) {
return true;
}
return false;
}
void WaveFile::readHeader() {
//RIFFファイルかチェック
if (checkRIFFHeader()) {
printf("RIFFヘッダを読み取りました\n");
RIFFChunk riffChunk;
while (fread(&riffChunk, 1, sizeof(RIFFChunk), fp) == sizeof(RIFFChunk)) {
// 読み取ったチャンクがfmt であるか確認
if (_strnicmp(riffChunk.tag, "fmt ", 4) == 0) {
readFMT_(riffChunk);
}
else if (_strnicmp(riffChunk.tag, "data", 4) == 0) {
printf("dataチャンク発見\n");
//waveデータのサイズを取得
waveInfo.waveSize = riffChunk.size;
//後でwaveデータを読み込む際のセーブポイント
waveInfo.waveChunkPos = ftell(fp);
}
else {
// 次のチャンクへ移動
fseek(fp, riffChunk.size, SEEK_CUR);
}
}
}
else {
printf("ヘッダがRIFFではありませんでした\n");
}
}
void WaveFile::readFMT_(RIFFChunk & riffChunk) {
printf("fmt チャンクを発見\n");
if (riffChunk.size <= sizeof(WAVEFMT)) {
//フォーマット情報を読み取り
fread(&waveFmt, 1, riffChunk.size, fp);
printf("usFormatTag:%d\nusChannels:%d\nulSamplesPerSec:%d\nulAvgBytesPerSec:%d\nusBlockAlign:%d\nusBitsPerSample:%d\nusSize:%d\nusReserved:%d\nulChannelMask:%d\nguidSubFormat:%d\n",
waveFmt.usFormatTag,
waveFmt.usChannels,
waveFmt.ulSamplesPerSec,
waveFmt.ulAvgBytesPerSec,
waveFmt.usBlockAlign,
waveFmt.usBitsPerSample,
waveFmt.usSize,
waveFmt.usReserved,
waveFmt.ulChannelMask,
waveFmt.guidSubFormat);
// 一般的なのwaveファイルか?
if (waveFmt.usFormatTag == WAVE_FORMAT_PCM) {
waveInfo.wfType = WF_EX;
memcpy(&waveInfo.wfEXT.Format, &waveFmt, sizeof(PCMWAVEFORMAT));
}
// 3チャンネル以上の特別なwaveファイルか?
else if (waveFmt.usFormatTag == WAVE_FORMAT_EXTENSIBLE) {
waveInfo.wfType = WF_EXT;
memcpy(&waveInfo.wfEXT, &waveFmt, sizeof(WAVEFORMATEXTENSIBLE));
}
}
else {
// 次のチャンクへ移動
fseek(fp, riffChunk.size, SEEK_CUR);
}
}
| ef8089ca12b845bbb40e9754d179b909f883cc9c | [
"C++"
]
| 5 | C++ | mshiihara/classwork_sound | b0e536467e1200891872bfccab8cedf78cd55914 | ac9f27ff278f6a608412d5d19132c4c973eee62b |
refs/heads/master | <file_sep># beat-brain
Converting a beatbox wav file to a MIDI beat through the use of a neural net
Record a beatbox and bounce to wav file, break up wav into pieces representing each "hit" of said oral instrument, and feed a matrix of
frequencies and their intensities (through a Fast Fourier Transform) to a neural network. The NN will hopefully recognize whether each
"hit" is a snare, kick, or a hi-hat. Once we have that data, we can convert it to a midi file and mimic the original beat using software
instruments.
<file_sep># <NAME> (Jan 5th, 2018)
# Steps (and some code) from https://www.analyticsvidhya.com/blog/2017/05/neural-network-from-scratch-in-python-and-r/
# Neural Network for taking a matrix derived from a Fast Fourier Transform and determining whether it
# is a snare, kick, or hi-hat
import numpy as np
# Input array
# This will be changed based on how many discrete frequency ranges Grant's FFT code returns.
# Most likely, the number of frequencies will be the number of subarrays, with the frequency's intensity
# represented as a binary number in the subarray.
X=np.array([[1,0,1,0],[1,0,1,1]])
# Output array
# This will also likely need to be changed so that the number of subarrays is the same as the input, but only the
# last two numbers will be used to form a binary number (either 00, 01, or 10; 11 unused) representing snare, kick
# or hi-hat
y=np.array([[1],[1]])
# Sigmoid Function
# May be worth messing around and trying different functions for the backpropagation to see if it yields better results.
def sigmoid(x):
return 1/(1 + np.exp(-x))
# Derivative of Sigmoid Function
# Will have to change this if I decide to change the sigmoid function to something else.
def derivatives_sigmoid(x):
return x * (1 - x)
# Variable initialization
# More cycles is obviously more accurate but also more time consuming. Will have to try a bunch of different orders of magnitude and
# to optimize time vs. accuracy. If the FFT yields significantly different results for the different types of sounds, it may not be all
# that hard for the neural network to tell which type it is.
cycles = 10000 #Setting training iterations
lr = 50 #Setting learning rate --> messing around and higher seemed to be better, but haven't tried it with any actual data yet
inputlayer_neurons = X.shape[1] #number of features in data set
hiddenlayer_neurons = 3 #number of hidden layer neurons --> Haven't messed with this number yet
output_neurons = 1 #number of neurons at output layer
# Weight and bias initialization
# All random as of now but as I do more research, perhaps I'll find better values to set them to initially
weight_hidden=np.random.uniform(size=(inputlayer_neurons, hiddenlayer_neurons))
bias_hidden=np.random.uniform(size=(1, hiddenlayer_neurons))
weight_out=np.random.uniform(size=(hiddenlayer_neurons,output_neurons))
bias_out=np.random.uniform(size=(1,output_neurons))
for i in range(cycles):
# Forward Propogation
hidden_layer_input1 = np.dot(X,weight_hidden) #matrix multiplication of input values with hidden layer perceptron weights
hidden_layer_input = hidden_layer_input1 + bias_hidden #add hidden bias to the input
hiddenlayer_activations = sigmoid(hidden_layer_input) #apply function
output_layer_input1 = np.dot(hiddenlayer_activations,weight_out) #matrix multiplication of function output and output perceptron weights
output_layer_input = output_layer_input1 + bias_out #add output bias to output
output = sigmoid(output_layer_input) #function again
# Backpropagation
# Need to really hammer home the concept of what's going on here... as of now I've just followed online resources.
# Will definitely be helpful to really understand it if I want to improve it.
error = y-output #how far NN output was from target output
slope_output_layer = derivatives_sigmoid(output)
slope_hidden_layer = derivatives_sigmoid(hiddenlayer_activations)
d_output = error * slope_output_layer
Error_at_hidden_layer = d_output.dot(weight_out.T)
d_hiddenlayer = Error_at_hidden_layer * slope_hidden_layer
weight_out += hiddenlayer_activations.T.dot(d_output) * lr
bias_out += np.sum(d_output, axis=0,keepdims=True) * lr
weight_hidden += X.T.dot(d_hiddenlayer) * lr
bias_hidden += np.sum(d_hiddenlayer, axis=0,keepdims=True) * lr
print(output)
| 4237b1f680fd0aff8bb2e9816e84c7e929e222f7 | [
"Markdown",
"Python"
]
| 2 | Markdown | oconnorg1/beat-brain | d38925e7082f388ad785d82e3d2cf9fe2eb6b351 | 6144af6ba7ac24626fe5ef739d481023bad0a5b3 |
refs/heads/master | <file_sep>package com.example.EstudoManyToMany.model;
import java.io.Serializable;
import javax.persistence.EmbeddedId;
import javax.persistence.Entity;
import com.fasterxml.jackson.annotation.JsonIgnore;
import lombok.Getter;
import lombok.Setter;
@Entity
@Getter @Setter
public class ItemPedido implements Serializable {
private static final long serialVersionUID = 1L;
@JsonIgnore
@EmbeddedId
private ItemPedidoPK id = new ItemPedidoPK();
private Double desconto;
private Integer quantidade;
private Double preco;
}
<file_sep>package com.example.EstudoManyToMany.respository;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
import com.example.EstudoManyToMany.model.Cliente;
@Repository
public interface ClienteRepository extends JpaRepository<Cliente, Integer> {
}
<file_sep>package com.example.EstudoManyToMany.respository;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
import com.example.EstudoManyToMany.model.Endereco;
@Repository
public interface EnderecoRepository extends JpaRepository<Endereco, Integer> {
}
<file_sep>package com.example.EstudoManyToMany.model;
import javax.persistence.Entity;
import com.example.EstudoManyToMany.model.enums.EstadoPagamento;
import lombok.Getter;
import lombok.Setter;
@Entity
@Getter @Setter
public class PagamentoComCartao extends Pagamento {
public PagamentoComCartao(Integer id2, EstadoPagamento estado2, Pedido pedido2, int i) {
super(id2, estado2, pedido2);
// TODO Auto-generated constructor stub
}
private static final long serialVersionUID = 1L;
private Integer numeroDeParcelas;
}
<file_sep>package com.example.EstudoManyToMany.model;
import java.util.Date;
import javax.persistence.Entity;
import com.example.EstudoManyToMany.model.enums.EstadoPagamento;
import com.fasterxml.jackson.annotation.JsonFormat;
import lombok.Getter;
import lombok.Setter;
@Entity
@Getter @Setter
public class PagamentoComBoleto extends Pagamento {
public PagamentoComBoleto(Integer id2, EstadoPagamento estado2, Pedido pedido2, Date date, Object object) {
super(id2, estado2, pedido2);
// TODO Auto-generated constructor stub
}
private static final long serialVersionUID = 1L;
@JsonFormat(pattern="dd/MM/yyyy")
private Date dataVencimento;
@JsonFormat(pattern="dd/MM/yyyy")
private Date dataPagamento;
}
| 57e6c86e2087943f9b4c747018202a0705f3c6d1 | [
"Java"
]
| 5 | Java | lucasmaia19/estudo-uml | 5ad88936946121ea43a98faea0093e960a567d1a | e5c64420d75a9454cb49fab8a813afe4deae9227 |
refs/heads/master | <repo_name>bseymour/nextjs-1234<file_sep>/pages/imagetest.js
import Image from 'next/image'
function Home() {
return (
<>
<h1>My Homepage</h1>
<Image
src="https://storyus-core.s3.eu-west-2.amazonaws.com/core/site/_v1/family-sunset-austria.jpg"
alt="Picture of the author"
width={650}
height={480}
/>
<p>Welcome to my homepage!</p>
</>
)
}
export default Home
<file_sep>/pages/imagetest2.js
import Image from 'next/image'
function Home() {
return (
<>
<h1>My Homepage</h1>
<Image
src="https://cdn-o7.outfit7.com/wp-content/uploads/2021/05/banner_3_NEW.png"
alt="Picture of the author"
width={1800}
height={492}
/>
<p>Welcome to my homepage!</p>
</>
)
}
export default Home
| a160e2823cd777342e8d6460dd1b0af20f55e82e | [
"JavaScript"
]
| 2 | JavaScript | bseymour/nextjs-1234 | bdf13a28758549f1f3ed339c86461bb4c39567ee | 626a4d5ce21ab5d41d70a34c3ff0dfdc0942e4ee |
refs/heads/master | <file_sep>const ToDo = require("../models/db");
const path = require("path");
const bodyParser = require("body-parser");
let slugOne = {};
function showAuth(req, res) {
res.sendFile(path.join(__dirname + "/auth.html"));
}
function getUsersFromDB(req, res) {
ToDo.dbUsersModel.find({}, (err, users) => {
if (err) {
res.send(404);
res.send("User not found");
}
console.log("getUserFromDB", req.params);
res.send(users);
});
}
function getDataFromDB(req, res) {
ToDo.dbRequestModel.find({}, (err, users) => {
if (err) {
res.send(404);
res.send("User not found");
}
res.send(users);
});
}
function showUpdate(req, res) {
console.log(slugOne);
res.sendFile(path.join(__dirname + "/update.html"));
}
function createUpdate(req, res) {
console.log(slugOne);
ToDo.dbRequestModel.findOne({ slug: slugOne }, (err, data) => {
//updating that data
dataForChanges = {
topic: req.body.topic,
type: req.body.type,
duration: req.body.duration,
location: req.body.location,
dataID: req.body.dataID,
description: req.body.description
};
data.topic = dataForChanges.topic;
data.type = dataForChanges.type;
data.duration = dataForChanges.duration;
data.location = dataForChanges.location;
data.description = dataForChanges.description;
data.save(function(err, data) {
if (err) return handleError(err);
res.send(data);
});
});
}
function showSingle(req, res) {
ToDo.dbRequestModel.findOne({ slug: req.params.slug }, (err, data) => {
if (err) {
res.send(404);
res.send("Data not found");
}
slugOne = req.params.slug;
console.log(slugOne);
res.send(data);
});
}
function showCreate(req, res) {
console.log('addd');
res.sendFile(path.join(__dirname + "/form.html"));
}
function createData(req, res) {
const data = new ToDo.dbRequestModel({
topic: req.body.topic,
type: req.body.type,
duration: req.body.duration,
location: req.body.location,
dataID: req.body.dataID,
description: req.body.description
});
data.save(err => {
if (err) {
throw err;
}
});
}
function showUpdate(req, res) {
console.log(slugOne);
res.sendFile(path.join(__dirname + "/update.html"));
}
function createUpdate(req, res) {
console.log(slugOne);
ToDo.dbRequestModel.findOne({ slug: slugOne }, (err, data) => {
//updating that data
dataForChanges = {
topic: req.body.topic,
type: req.body.type,
duration: req.body.duration,
location: req.body.location,
dataID: req.body.dataID,
description: req.body.description
};
data.topic = dataForChanges.topic;
data.type = dataForChanges.type;
data.duration = dataForChanges.duration;
data.location = dataForChanges.location;
data.description = dataForChanges.description;
data.save(function(err, data) {
if (err) return handleError(err);
res.send(data);
});
});
}
module.exports = {
showAuth,
getUsersFromDB,
getDataFromDB,
showSingle,
showCreate,
createData,
showUpdate,
createUpdate
};
<file_sep>const mongoose = require('mongoose');
Schema = mongoose.Schema;
//create a schema
const dbUsersSchema = new Schema({
username: String,
password: <PASSWORD>,
email: String,
name: String,
surname: String,
sex: String,
location: String,
about: String,
myRequests: String ,
slug: {
type: String,
unique: true
}});
const dbRequestSchema = new Schema({
topic: String,
type: String,
description: String,
duration: String,
location: String,
slug: {
type: String,
unique: true
}
});
// middlware which make sure that slug is created from name
dbRequestSchema.pre('save', function(next){
this.slug = slugify(this.topic);
next();
});
dbUsersSchema.pre('save', function(next){
this.slug = slugify(this.topic);
next();
});
//create the model
const dbRequestModel = mongoose.model('requests', dbRequestSchema);
const dbUsersModel = mongoose.model('users', dbUsersSchema);
//export the model
module.exports = {
dbRequestModel,
dbUsersModel
};
function slugify(text) {
return text.toString().toLowerCase()
.replace(/\s+/g, '-') // Replace spaces with -
.replace(/[^\w\-]+/g, '') // Remove all non-word chars
.replace(/\-\-+/g, '-') // Replace multiple - with single -
.replace(/^-+/, '') // Trim - from start of text
.replace(/-+$/, ''); // Trim - from end of text
}<file_sep>require('dotenv').config();
const mainController = require('./controllers/main.controller');
const express = require('express');
const app = express();
const mongoose = require('mongoose');
const bodyParser = require("body-parser");
const port = process.env.port || 8080;
const passport = require('passport');
app.use(passport.initialize());
app.use(passport.session());
mongoose.connect(process.env.DB_URI);
// mongoose.connect('mongodb://localhost/MyDatabase');
const db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error:'));
db.once('open', function() {
console.log("successfull conected to database");
});
app.use(bodyParser.urlencoded({extended: true}));
app.use(require('./routes'));
app.listen(port, ()=>{
console.log(`Server started at port ${port}`);
})
passport.serializeUser(function(user, cb) {
cb(null, user.id);
});
passport.deserializeUser(function(id, cb) {
User.findById(id, function(err, user) {
cb(err, user);
});
}); | c239b7da4be78abb785f79660f207bd2e97fa90b | [
"JavaScript"
]
| 3 | JavaScript | grynyk/mersee | 0437fbbda0b809e2c4632197e51196afe73822e6 | 381925badfe6902aee188cdfefad2c7747c1d58d |
refs/heads/main | <repo_name>tisya207/c97-project<file_sep>/game.py
import random
number = random.randint(1,9)
#print(number)
print('GUESS A NUMBER: (you random no. range is 1-9)')
chances=1
while chances <=5:
guess=int(input('enter your guess '))
print(guess)
if guess == number:
print("congo you've guessed the no. right!!")
break
elif guess > number:
print('the no is smaller than the no. you have guessed')
elif guess < number:
print('the no is greater than the no. you have guessed')
chances+=1
if not chances<5:
print('you lose, the correct number is ' + number) | 59a8cc640aea65ec105e2e28fd667adf335dc0a2 | [
"Python"
]
| 1 | Python | tisya207/c97-project | c4bdb71228923e8580e90b70dad65c3693c3a3b4 | 59b8e7d0668fc8c3604ccc87e8ada1ef787279e9 |
refs/heads/master | <file_sep>#First:
#cur_weather = %Q{ It's a hot day outside
#Grab your umbrellas...}
#
#cur_weather.lines do |line|
# line.sub! 'hot', 'rainy'
# p "#{line}" # Actual Line
# p "#{line.strip}" # removes \n and any leading and trailing space
# p "#{line.lstrip}" # removes the leading space
# p "#{line.rstrip}" # removes the trailing space and \n
# p "#{line.chomp}" # removes only \n
#end
#
###Output:
#
## => " It's a rainy day outside \n"
## => "It's a rainy day outside"
## => "It's a rainy day outside \n"
## => " It's a rainy day outside"
## => " It's a rainy day outside "
## => "Grab your umbrellas..."
## => "Grab your umbrellas..."
## => "Grab your umbrellas..."
## => "Grab your umbrellas..."
## => "Grab your umbrellas..."
#Second
#case 19
# when 0..18 then p "Minor"
# else p "Adult"
#end
#Third
#my_hash = Hash.new("No such key yet")
#p my_hash[0] # "No such key yet"
#p my_hash["some_key"] # "No such key yet"
#Fourth
#def adjust (props = {fore: "red" , back: "white"})
# puts "Foreground: #{props[:fore]}" if props[:fore]
# puts "Background: #{props[:back]}" if props[:back]
#end
#
#adjust
#adjust ({:fore => "green"})
#adjust back: "yella"
#adjust :back => "yella"
class Person
attr_accessor :name
attr_reader :age
def initialize (name, ageVar)
@name = name
# self.age = ageVar
@age = age unless age > 100
p age
end
# def age= age
# @age = age unless age > 100
# end
end
p1 = Person.new("John", 20)
p "#{p1.name} #{p1.age}"
p1.age = 123
p p1.age
<file_sep>freq = Hash.new(0)
File.foreach('test.txt') do |line|
line.split.each do |word|
freq[word.downcase] += 1
end
end
p freq.select{ |key, value| value == freq.each_value.max}.keys
<file_sep>class LineAnalyzer
attr_accessor :highest_wf_count, :highest_wf_words, :content, :line_number
def initialize(c,n)
@content = c
@line_number = n
calculate_word_frequency(c,n) #return
end
def calculate_word_frequency(line, n) # calculates result
word_count = Hash.new(0)
line.split.each do |word|
word_count[word.downcase] += 1
end
self.highest_wf_count = word_count.each_value.max
self.highest_wf_words = word_count.select{ |key, value| value == word_count.each_value.max}.keys
end
end
class Solution
attr_reader :analyzers, :highest_count_across_lines, :highest_count_words_across_lines
def initialize
@analyzers = []
end
def analyze_file()
line_no = 0
File.foreach('test.txt') do |lines|
line_no += 1
new_LA_obj = LineAnalyzer.new(lines, line_no)
@analyzers << new_LA_obj
end
end
def calculate_line_with_highest_frequency()
@highest_count_across_lines = 0
@highest_count_words_across_lines = []
@analyzers.each do |obj|
if obj.highest_wf_count >= @highest_count_across_lines
@highest_count_across_lines = obj.highest_wf_count
@highest_count_words_across_lines.push obj
end
end
end
def print_highest_word_frequency_across_lines()
puts "The following words have the highest word frequency per line: "
@highest_count_words_across_lines.each do |obj|
puts "#{obj.highest_wf_words} appears in line #{obj.line_number}"
end
end
end
| 88eb01de92968e5d0b22d2758a9ef456931266b7 | [
"Ruby"
]
| 3 | Ruby | indieNik/fullstack-course1-module2 | 9c2c0d6ff00c3379f0dff1d46793faa5bcb5e7b2 | 46688e6994acc2a2daa40fa4971c4f78e8019960 |
refs/heads/master | <file_sep>## Loading S3 file with Airflow to ETL with Redshift
The purpose of this project is to build an adapted data model thanks to python to load data in a JSON file format and wrangle them into a star schema (see the ERD) with the pipeline written as a code thanks to [AirFlow](https://airflow.apache.org/).
### Main Goal
The company Sparkify need to analyse their data to better understand the way users (free/paid) use theirs services.
With this data pipeline we will be able to schedule, monitor and build more easily the ETL of this data.
### Data Pipeline

### Data Model
This pipeline finally is made to build this DB star schema below to make easier the data analysis

### Run it
After implementing your dags,
Go to the [AirFlow UI](http://localhost:8080/admin/) after several seconds then run the DAG.
<file_sep>class SqlQueries:
songplay_table_insert = ("""
INSERT INTO songplays (start_time, user_id, level, song_id,artist_id, session_id, location, user_agent)
SELECT timestamp 'epoch' + CAST(e.ts AS BIGINT)/1000 * interval '1 second', e.userId, e.level, s.song_id, s.artist_id,e.sessionId, e.location, e.userAgent
FROM staging_events e
LEFT JOIN staging_songs s ON e.artist = s.artist_name AND e.song = s.title AND e.length = s.duration WHERE e.page = 'NextSong';
""")
user_table_insert = ("""
INSERT INTO users (user_id, first_name, last_name, gender, level)
SELECT DISTINCT userId, firstName, lastName, gender, level
FROM staging_events
WHERE userId IS NOT NULL;
""")
song_table_insert = ("""
INSERT INTO songs (song_id, title, artist_id, year, duration)
SELECT DISTINCT song_id, title, artist_id, year, duration
FROM staging_songs
WHERE song_id IS NOT NULL;
""")
artist_table_insert = ("""
INSERT INTO artists (artist_id, name, location, latitude, longitude)
SELECT DISTINCT artist_id, artist_name, artist_location, artist_latitude, artist_longitude
FROM staging_songs
WHERE artist_id IS NOT NULL;
""")
time_table_insert = ("""
INSERT INTO time (start_time, hour, day, week, month, year, weekday)
SELECT DISTINCT start_time,
EXTRACT(hour FROM start_time),
EXTRACT(day FROM start_time),
EXTRACT(week FROM start_time),
EXTRACT(month FROM start_time),
EXTRACT(year FROM start_time),
EXTRACT(weekday FROM start_time)
FROM songplays
WHERE start_time IS NOT NULL;
""")<file_sep>from airflow.hooks.postgres_hook import PostgresHook
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults
from airflow.contrib.hooks.aws_hook import AwsHook
class StageToRedshiftOperator(BaseOperator):
""""
Copies JSON files from an S3 bucket into a Redshift staging table.
The table is cleared first if it exists.
Keyword arguments:
redshift_conn_id -- the connection id for redshift.
aws_credential_id -- credentails used to access AWS
table_name -- the name of the table to inset data into.
s3_bucket -- suffix of the S3 bucket.
s3_key -- prefix of the S3 bucket.
copy_json_option -- the path to JSON formatter.
region -- the region of the Redshift cluster.
data_format -- specify the format of your data.
"""
ui_color = '#358140'
copy_query = "COPY {} FROM '{}' ACCESS_KEY_ID '{}' SECRET_ACCESS_KEY '{}' REGION '{}' {} '{}'"
@apply_defaults
def __init__(self,
redshift_conn_id="",
aws_credential_id="",
table_name = "",
s3_bucket="",
s3_key = "",
copy_json_option='',
region="",
data_format="",
*args, **kwargs):
super(StageToRedshiftOperator, self).__init__(*args, **kwargs)
self.redshift_conn_id = redshift_conn_id
self.aws_credential_id = aws_credential_id
self.table_name = table_name
self.s3_bucket = s3_bucket
self.s3_key = s3_key
self.copy_json_option = copy_json_option
self.region = region
self.data_format= data_format
self.execution_date = kwargs.get('execution_date')
def execute(self, context):
aws_hook = AwsHook(self.aws_credential_id)
credentials = aws_hook.get_credentials()
s3_path = "s3://{}/{}".format(self.s3_bucket, self.s3_key)
self.log.info(f"Picking staging file for table {self.table_name} from location : {s3_path}")
copy_query = self.copy_query.format(self.table_name, s3_path, credentials.access_key, credentials.secret_key, self.region, self.data_format, self.copy_json_option)
redshift_hook = PostgresHook(postgres_conn_id = self.redshift_conn_id)
self.log.info("Deleting data from the Redshift table destination")
redshift_hook.run("DELETE FROM {}".format(self.table_name))
redshift_hook.run(copy_query)
self.log.info(f"Table {self.table_name} staged successfully!!")<file_sep>from airflow.hooks.postgres_hook import PostgresHook
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults
class DataQualityOperator(BaseOperator):
"""
Performs data quality checks on all tables.
The check if each table have at least 1 row.
Keyword arguments:
redshift_conn_id -- the connection id for redshift.
tables -- the list of all tables to check it.
"""
ui_color = '#89DA59'
@apply_defaults
def __init__(self,
redshift_conn_id="",
tables = [],
*args, **kwargs):
super(DataQualityOperator, self).__init__(*args, **kwargs)
self.redshift_conn_id = redshift_conn_id
self.tables = tables
def check_if_rows(self, redshift_hook, table):
records = redshift_hook.get_records(f"select count(*) from {table};")
if len(records) < 1 or len(records[0]) < 1 or records[0][0] < 1:
raise ValueError(f"Data Quality validation failed for table : {table}")
def execute(self, context):
redshift_hook = PostgresHook(postgres_conn_id = self.redshift_conn_id)
for table in self.tables:
self.log.error(f"Data Quality validation failed for table : {table}.")
self.check_if_rows(redshift_hook, table)
self.log.info(f"Data Quality Validation Passed on table : {table}!!!") <file_sep>from airflow.hooks.postgres_hook import PostgresHook
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults
class LoadDimensionOperator(BaseOperator):
"""
Insert data from staging tables into the dimension table.
The target table is checked first if we want to append or not.
Keyword arguments:
redshift_conn_id -- the connection id for redshift.
sql_query -- the SQL insert statement.
table_name -- the name of the table to inset data into.
append_only -- boolean variable to choose either append or not.
"""
ui_color = '#80BD9E'
@apply_defaults
def __init__(self,
redshift_conn_id="",
sql_query = "",
table_name = "",
append_only="",
*args, **kwargs):
super(LoadDimensionOperator, self).__init__(*args, **kwargs)
self.redshift_conn_id = redshift_conn_id
self.sql_query = sql_query
self.table_name = table_name
self.append_only=append_only
def execute(self, context):
redshift_hook = PostgresHook(postgres_conn_id = self.redshift_conn_id)
self.log.info(f"Running query to load data into Dimension Table {self.table_name}")
if not self.append_only:
self.log.info("Delete befor Insert the dimension table {}".format(self.table_name))
redshift_hook.run("DELETE FROM {}".format(self.table_name))
self.log.info(self.sql_query)
self.log.info(self.table_name)
redshift_hook.run(self.sql_query)
self.log.info(f"Dimension Table {self.table_name} loaded.")
| 774041cfdf1e5b1ee799bab6ebb90845e40c7d01 | [
"Markdown",
"Python"
]
| 5 | Markdown | Allhabiy/DEND-Data-Pipelines-With-Airflow | 7f611ff2f10eddcdaee70a6aab78dfeee4fb35a4 | a821c3f26ff3aa971dd5c767f9dd055844b819af |
refs/heads/master | <repo_name>ericeslinger/localForage-getItems<file_sep>/src/localforage-getitems.js
(function() {
'use strict';
// Promises!
var Promise = (typeof module !== 'undefined' && module.exports) ?
require('promise') : this.Promise;
var globalObject = this;
var serializer = null;
var ModuleType = {
DEFINE: 1,
EXPORT: 2,
WINDOW: 3
};
// Attaching to window (i.e. no module loader) is the assumed,
// simple default.
var moduleType = ModuleType.WINDOW;
// Find out what kind of module setup we have; if none, we'll just attach
// localForage to the main window.
if (typeof define === 'function' && define.amd) {
moduleType = ModuleType.DEFINE;
} else if (typeof module !== 'undefined' && module.exports) {
moduleType = ModuleType.EXPORT;
}
function localforageGetItems(keys, callback) {
var localforageInstance = this;
var currentDriver = localforageInstance.driver();
if (currentDriver === localforageInstance.INDEXEDDB) {
return getItemsIndexedDB.call(localforageInstance, keys, callback);
} else if (currentDriver === localforageInstance.WEBSQL) {
return getItemsWebsql.call(localforageInstance, keys, callback);
} else {
return getItemsGeneric.call(localforageInstance, keys, callback);
}
}
function getItemsGeneric(keys, callback) {
var localforageInstance = this;
var promise = new Promise(function(resolve, reject) {
var itemPromises = [];
for (var i = 0, len = keys.length; i < len; i++) {
itemPromises.push(getItemKeyValue.call(localforageInstance, keys[i]));
}
Promise.all(itemPromises).then(function(keyValuePairs) {
var result = {};
for (var i = 0, len = keyValuePairs.length; i < len; i++) {
var keyValuePair = keyValuePairs[i];
result[keyValuePair.key] = keyValuePair.value;
}
resolve(result);
}).catch(reject);
});
executeCallback(promise, callback);
return promise;
}
function getItemsIndexedDB(keys, callback) {
var localforageInstance = this;
function comparer (a,b) {
return a < b ? -1 : a > b ? 1 : 0;
}
var promise = new Promise(function(resolve, reject) {
localforageInstance.ready().then(function() {
// Thanks https://hacks.mozilla.org/2014/06/breaking-the-borders-of-indexeddb/
var dbInfo = localforageInstance._dbInfo;
var store = dbInfo.db.transaction(dbInfo.storeName, 'readonly')
.objectStore(dbInfo.storeName);
// Initialize IDBKeyRange; fall back to vendor-prefixed versions if needed.
var IDBKeyRange = IDBKeyRange || globalObject.IDBKeyRange || globalObject.webkitIndexedDB ||
globalObject.mozIndexedDB || globalObject.OIndexedDB ||
globalObject.msIndexedDB;
var set = keys.sort(comparer);
var keyRangeValue = IDBKeyRange.bound(keys[0], keys[keys.length - 1], false, false);
var req = store.openCursor(keyRangeValue);
var result = {};
var i = 0;
req.onsuccess = function (/*event*/) {
var cursor = req.result; // event.target.result;
if (!cursor) {
resolve(result);
return;
}
var key = cursor.key;
while (key > set[i]) {
// The cursor has passed beyond this key. Check next.
i++;
if (i === set.length) {
// There is no next. Stop searching.
resolve(result);
return;
}
}
if (key === set[i]) {
// The current cursor value should be included and we should continue
// a single step in case next item has the same key or possibly our
// next key in set.
var value = cursor.value;
if (value === undefined) {
value = null;
}
result[key] = value;
// onfound(cursor.value);
cursor.continue();
} else {
// cursor.key not yet at set[i]. Forward cursor to the next key to hunt for.
cursor.continue(set[i]);
}
};
req.onerror = function(/*event*/) {
reject(req.error);
};
}).catch(reject);
});
executeCallback(promise, callback);
return promise;
}
function getItemsWebsql(keys, callback) {
var localforageInstance = this;
var promise = new Promise(function(resolve, reject) {
localforageInstance.ready().then(function() {
return getSerializer(localforageInstance);
}).then(function(serializer) {
var dbInfo = localforageInstance._dbInfo;
dbInfo.db.transaction(function(t) {
var queryParts = new Array(keys.length);
for (var i = 0, len = keys.length; i < len; i++) {
queryParts[i] = '?';
}
t.executeSql('SELECT * FROM ' + dbInfo.storeName +
' WHERE (key IN (' + queryParts.join(',') + '))', keys,
function(t, results) {
var result = {};
var rows = results.rows;
for (var i = 0, len = rows.length; i < len; i++) {
var item = rows.item(i);
var value = item.value;
// Check to see if this is serialized content we need to
// unpack.
if (value) {
value = serializer.deserialize(value);
}
result[item.key] = value;
}
resolve(result);
},
function(t, error) {
reject(error);
});
});
}).catch(reject);
});
executeCallback(promise, callback);
return promise;
}
function getSerializer(localforageInstance) {
if (serializer) {
return Promise.resolve(serializer);
}
// add support for localforage v1.3.x
if (localforageInstance &&
typeof localforageInstance.getSerializer === 'function') {
return localforageInstance.getSerializer();
} else {
throw new Error('This library requires localforage 1.3 or higher');
}
// var serializerPromise = new Promise(function(resolve/*, reject*/) {
// // We allow localForage to be declared as a module or as a
// // library available without AMD/require.js.
// if (moduleType === ModuleType.DEFINE) {
// require(['localforageSerializer'], resolve);
// } else if (moduleType === ModuleType.EXPORT) {
// // Making it browserify friendly
// resolve(require('./../utils/serializer'));
// } else {
// resolve(globalObject.localforageSerializer);
// }
// });
//
return serializerPromise.then(function(lib) {
serializer = lib;
return Promise.resolve(serializer);
});
}
function getItemKeyValue(key, callback) {
var localforageInstance = this;
var promise = localforageInstance.getItem(key).then(function(value) {
return {
key: key,
value: value
};
});
executeCallback(promise, callback);
return promise;
}
function executeCallback(promise, callback) {
if (callback) {
promise.then(function(result) {
callback(null, result);
}, function(error) {
callback(error);
});
}
}
function extendPrototype(localforage) {
var localforagePrototype = Object.getPrototypeOf(localforage);
if (localforagePrototype) {
localforagePrototype.getItems = localforageGetItems;
localforagePrototype.getItems.indexedDB = function(){
return getItemsIndexedDB.apply(this, arguments);
};
localforagePrototype.getItems.websql = function(){
return getItemsWebsql.apply(this, arguments);
};
localforagePrototype.getItems.generic = function(){
return getItemsGeneric.apply(this, arguments);
};
}
}
extendPrototype(localforage);
if (moduleType === ModuleType.DEFINE) {
define('localforageGetItems', function() {
return localforageGetItems;
});
} else if (moduleType === ModuleType.EXPORT) {
module.exports = localforageGetItems;
} else {
this.localforageGetItems = localforageGetItems;
}
}).call(window);
<file_sep>/README.md
localForage-getItems
====================
[](https://www.npmjs.com/package/localforage-getitems)
Adds getItems method to [localForage](https://github.com/mozilla/localForage).
## Requirements
* [localForage](https://github.com/mozilla/localForage) v1.2.1+
## Installation
`npm i localforage-getitems`
##jsperf links
* [default driver order (indexedDB prefered)](http://jsperf.com/localforage-getitems/3)
* [websql (not for firefox)](http://jsperf.com/localforage-getitems-websql)
| bd47940f5736216f9ae3238c269a9e2ea44c5cba | [
"JavaScript",
"Markdown"
]
| 2 | JavaScript | ericeslinger/localForage-getItems | 748fc839eb900cad5a931aa09c0869c0c8f832ca | fbabb9712948665b1d48de998a466adbac7cf0a2 |
refs/heads/main | <file_sep># ZOLA
A simple high level language which is statically compiled to x86_64 and arm64. It focuses on producing efficient code for iterative algorithms.
## The Language
### Arrays
### Vectors
### Iterators
### Generators
## The Backend
<file_sep>#ifndef __codegen_H_
#define __codegen_H_
struct SEXPR;
void codegen(struct SEXPR* node);
#endif <file_sep>#include <string.h>
#include <stdlib.h>
#include <stdio.h>
#include "semantics.h"
#include "ast.h"
#include "types.h"
#include "builtins.h"
#include "frontend.h"
#include "errors.h"
/**
* TODO:
* - complete builtin list
* - inline functions
* - this file is way smaller than it should be!
*/
static int isbuiltin(char* name)
{
if(name == NULL) return 0;
else if(0 == strcmp(name, "__add__")) return __ADD__;
else if(0 == strcmp(name, "__sub__")) return __SUB__;
else if(0 == strcmp(name, "__mul__")) return __MUL__;
else if(0 == strcmp(name, "__div__")) return __DIV__;
else if(0 == strcmp(name, "__udiv__")) return __UDIV__;
else if(0 == strcmp(name, "__rem__")) return __REM__;
else if(0 == strcmp(name, "__urem__")) return __UREM__;
else if(0 == strcmp(name, "__shl__")) return __SHL__;
else if(0 == strcmp(name, "__shr__")) return __SHR__;
else if(0 == strcmp(name, "__sar__")) return __SAR__;
else if(0 == strcmp(name, "__and__")) return __AND__;
else if(0 == strcmp(name, "__or__")) return __OR__;
else if(0 == strcmp(name, "__xor__")) return __XOR__;
return 0;
}
struct SEXPR* analyze(struct SEXPR* sexpr)
{
struct ZL_CONTEXT* ctx = NULL; // TODO: implement dispatcher
return visit_node(sexpr, ctx);
}
struct SEXPR* visit_node(struct SEXPR* sexpr, struct ZL_CONTEXT* ctx)
{
const unsigned flag = sexpr->flag;
if(flag & AST_FLAG_ATOM)
{
return visit_atom(sexpr, ctx);
}
else if(flag & AST_FLAG_CALL)
{
return visit_call(sexpr, ctx);
}
else if(flag & AST_FLAG_LIST)
{
return visit_list(sexpr, ctx);
}
    printf("[WARNING]: couldn't optimize node\n");
return sexpr;
}
struct SEXPR* visit_atom(struct SEXPR* atom, struct ZL_CONTEXT* ctx)
{
zlassert(atom, "visit_atom( NULL )");
zlassert(atom->flag & AST_FLAG_ATOM, "visit_atom( non-atom )");
if(atom->flag & AST_FLAG_NUMBER)
{
// type = i64
}
else if(atom->flag & AST_FLAG_SYMBOL)
{
int cmp;
if((cmp = isbuiltin(atom->atom)))
{
// atom->tag = AST_ATOM_BUILTIN;
// atom->flag = cmp; // which builtin
}
else
{
// variable lookup
}
}
return atom;
}
struct SEXPR* visit_call(struct SEXPR* args, struct ZL_CONTEXT* ctx)
{
struct SEXPR* head = args;
// TODO: analyze head & maybe inline
    while(args && !isnil(args))
{
args->car = visit_node(args->car, ctx);
args = args->cdr;
}
return head;
}
/**
* only call on head
*/
struct SEXPR* visit_list(struct SEXPR* block, struct ZL_CONTEXT* ctx)
{
zlassert(block, "visit_block( NULL )");
zlassert(block->car, "visit_block( corrupt )");
struct SEXPR* head = block;
// { expr; } ==> cons(expr nil) ==> expr
if(isnil(block->cdr))
{
visit_node(block->car, ctx);
struct SEXPR* tmp = block->car;
free(block);
return tmp;
}
    printf("[WARNING]: only partial semantic analysis\n");
    /* visit every statement in the block; context scoping is still TODO */
    // struct ZL_CONTEXT* local = ZL0_malloc(sizeof(struct ZL_CONTEXT));
    // local->parent = ctx;
    struct SEXPR* iter = block;
    while(!isnil(iter))
    {
        iter->car = visit_node(iter->car, ctx);
        iter = iter->cdr;
    }
    // TODO: always inline this
return block;
}
void assert_type(char* type, struct SEXPR* node, struct ZL_CONTEXT* ctx)
{
    (void) ctx;
    /* strcmp() returns 0 on a match, so a non-zero result is a mismatch */
    if(node->type == NULL || strcmp(type, node->type) != 0)
    {
        zlerror("type assertion failed", NULL);
    }
}<file_sep>#ifndef __SCANNER_H_
#define __SCANNER_H_
enum {
ZL1_TOKEN_EOF = 10000,
ZL1_TOKEN_SYMBOL = 10001,
ZL1_TOKEN_INTEGER = 10002,
ZL1_TOKEN_LPAREN = 10003,
ZL1_TOKEN_RPAREN = 10004,
ZL1_TOKEN_LBRACE = 10005,
ZL1_TOKEN_RBRACE = 10006,
ZL1_TOKEN_COLONS = 10007,
ZL1_TOKEN_SEMICOLON = 10008,
};
struct TOKEN {
int tag;
char* val;
};
typedef struct lexer_s lexer_t; // TODO: convert to new style
lexer_t* ZL1_create(char* src, char* filename);
struct TOKEN* __attribute__((always_inline)) lookahead(lexer_t* lex);
void consume(lexer_t* lex);
void ZL1_free(lexer_t* lex);
int predict(int token, lexer_t* lex);
#endif
<file_sep>#ifndef __DISPATCHER_H_
#define __DISPATCHER_H_
struct SEXPR;
enum {
VAR_FLAG_CALLABLE = 0x1,
VAR_FLAG_INLINE = 0x2,
VAR_FLAG_INITIALIZED = 0x4,
VAR_FLAG_MUTATED = 0x8,
VAR_FLAG_RECURSIVE = 0x10,
VAR_FLAG_CONSTANT = 0x20,
VAR_FLAG_PURE = 0x40,
};
struct VARIABLE {
unsigned flag;
const char* name;
const char* type; // lambda_t in case of a function
};
struct CALLABLE {
int unique;
const char* rtype;
const char** argt;
};
void dispatch();
#endif<file_sep>/**
* AST to IR compiler
*/
#ifndef __SEMANTICS_H_
#define __SEMANTICS_H_
struct SEXPR;
struct ZL_CONTEXT {
struct ZL_CONTEXT* parent;
};
struct SEXPR* analyze(struct SEXPR*);
struct SEXPR* visit_list(struct SEXPR*, struct ZL_CONTEXT*);
struct SEXPR* visit_call(struct SEXPR*, struct ZL_CONTEXT*);
struct SEXPR* visit_atom(struct SEXPR*, struct ZL_CONTEXT*);
struct SEXPR* visit_node(struct SEXPR*, struct ZL_CONTEXT*);
void assert_type(char*, struct SEXPR*, struct ZL_CONTEXT*);
#endif
<file_sep>#ifndef __AST_H_
#define __AST_H_
enum {
// 0 => nil
AST_FLAG_ATOM = 0x1,
AST_FLAG_CONS = 0x2,
AST_FLAG_LIST = 0x4,
AST_FLAG_CALL = 0x8,
AST_FLAG_BUILTIN = 0x10,
AST_FLAG_LAMBDA = 0x20,
AST_FLAG_MACRO = 0x40,
AST_FLAG_NUMBER = 0x80,
AST_FLAG_SYMBOL = 0x100,
};
/**
 * A parsed s-expression: either an atom (symbol / number literal)
 * or a cons cell, as indicated by `flag`.
 */
struct SEXPR {
unsigned flag; // atom / node // block / call / literal / symbol ...
char* type; // sexpr :: type
union {
struct { char* atom; };
struct {
struct SEXPR* car;
struct SEXPR* cdr;
};
};
};
int isnil(struct SEXPR*);
extern struct SEXPR nil;
#endif<file_sep>// #define __TRACE_PARSER
#include <string.h>
#include <stdlib.h>
#include <stdio.h>
#include "ast.h"
#include "types.h"
#include "parser.h"
#include "scanner.h"
#include "frontend.h"
#include "errors.h"
struct SEXPR nil = { .flag = 0, .type = "nil_t" };
extern inline int __attribute__((always_inline)) isnil(struct SEXPR* sexpr)
{
return sexpr->flag == 0;
}
/**
* TODO:
* - better error messages
* - implement macros
* - scan for memory leaks
* - clarify ambiguous grammar rules
*
* grammar rules:
*
* atom ::= SYMBOL
* atom ::= INTEGER
* list ::= LBRACE stmt* RBRACE
* expr ::= [list|atom|LPAREN expr RPAREN] [LPAREN expr* RPAREN] [COLONS SYMBOL]
* expr ::= LPAREN expr RPAREN
* stmt ::= expr SEMICOLON
*
*
* () == nil
* ( ex1 ) == ex1
*
* { } == nil
* { ex1; } == ex1
* { ex1; ex2; } == cons(ex1 cons(ex2 nil) [block]) [block]
*
* f() == cons(f nil) [call]
* { f; }() == cons(f nil) [call]
* (g())() == cons(cons(g nil) [call] nil) [call]
*/
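/**
 * worked example (illustrative, derived from the rules above):
 *
 *   { f(1); 2; }
 *
 * parses to
 *
 *   cons(cons(f cons(1 nil)) [call] cons(2 nil) [block]) [block]
 *
 * i.e. a two-statement block whose first statement is the call f(1)
 */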
struct SEXPR* parse(lexer_t* lex)
{
return parse_expr(zlmalloc(sizeof(struct SEXPR)), lex);
}
int parse_atom(struct SEXPR* atom, lexer_t* lex)
{
zlassert(lex, "parse_atom( NULL )");
struct TOKEN* tok = lookahead(lex);
if(tok->tag == ZL1_TOKEN_INTEGER)
{
atom->atom = tok->val;
atom->flag = AST_FLAG_ATOM | AST_FLAG_NUMBER;
atom->type = "i64";
consume(lex);
return success;
}
else if(tok->tag == ZL1_TOKEN_SYMBOL)
{
atom->atom = tok->val;
atom->flag = AST_FLAG_ATOM | AST_FLAG_SYMBOL;
consume(lex);
return success;
}
return failure;
}
int parse_stmt(struct SEXPR* stmt, lexer_t* lex)
{
zlassert(lex, "parse_stmt( NULL )");
#ifdef __TRACE_PARSER
static int count = 0;
printf("parse_stmt#%d( ... )\n", ++count);
#endif
if(parse_expr(stmt, lex))
return failure;
if(predict(ZL1_TOKEN_SEMICOLON, lex))
zlerror("expected semicolon", NULL);
return success;
}
int parse_list(struct SEXPR* list, lexer_t* lex)
{
zlassert(lex, "parse_list( NULL )");
#ifdef __TRACE_PARSER
static int count = 0;
printf("parse_list#%d( ... )\n", ++count);
#endif
if(predict(ZL1_TOKEN_LBRACE, lex))
return failure;
if(predict(ZL1_TOKEN_RBRACE, lex) == success)
{
*list = nil; /* { } == nil, matching the grammar comment and the () case */
return success;
}
list->car = zlmalloc(sizeof(struct SEXPR));
list->flag = AST_FLAG_CONS | AST_FLAG_LIST;
if(parse_stmt(list->car, lex))
zlerror("expected statement or semicolon", NULL);
struct SEXPR* tail = list;
while(predict(ZL1_TOKEN_RBRACE, lex))
{
if(predict(ZL1_TOKEN_EOF, lex) == success)
zlerror("unexpected end of file", NULL);
struct SEXPR* node = zlmalloc(sizeof(struct SEXPR));
node->flag = AST_FLAG_CONS | AST_FLAG_LIST;
node->car = zlmalloc(sizeof(struct SEXPR));
if(parse_stmt(node->car, lex))
zlerror("expected statement or closing brace", NULL);
tail->cdr = node;
tail = node;
}
tail->cdr = &nil;
return success;
}
int parse_expr(struct SEXPR* expr, lexer_t* lex)
{
zlassert(lex, "parse_expr( NULL )");
#ifdef __TRACE_PARSER
static int count = 0;
printf("parse_expr#%d( ... )\n", ++count);
#endif
if(predict(ZL1_TOKEN_LPAREN, lex) == success)
{
if(predict(ZL1_TOKEN_RPAREN, lex) == success)
{
*expr = nil;
return success;
}
if(parse_expr(expr, lex) == failure)
zlerror("expected expression", NULL);
if(predict(ZL1_TOKEN_RPAREN, lex) == failure)
zlerror("expected closing parenthesis", NULL);
// consume(lex);
}
else if(parse_atom(expr, lex) && parse_list(expr, lex))
{
return failure;
}
if(predict(ZL1_TOKEN_LPAREN, lex) == success)
{
struct SEXPR* func = zlmalloc(sizeof(struct SEXPR));
memcpy(func, expr, sizeof(struct SEXPR));
expr->car = func;
expr->type = NULL; // case: f :: i32 ();
expr->flag = AST_FLAG_CONS | AST_FLAG_CALL;
if(predict(ZL1_TOKEN_RPAREN, lex) == success)
{
expr->cdr = &nil;
return success;
}
else
{
struct SEXPR* tail = zlmalloc(sizeof(struct SEXPR));
expr->cdr = tail; // cons(f ...)
tail->car = zlmalloc(sizeof(struct SEXPR));
tail->flag = AST_FLAG_CONS;
if(parse_expr(tail->car, lex) == failure)
zlerror("expected expression or closing parenthesis", NULL);
while(predict(ZL1_TOKEN_RPAREN, lex))
{
if(predict(ZL1_TOKEN_EOF, lex) == success)
zlerror("unexpected end of file", NULL);
struct SEXPR* next = zlmalloc(sizeof(struct SEXPR));
next->car = zlmalloc(sizeof(struct SEXPR));
next->flag = AST_FLAG_CONS;
if(parse_expr(next->car, lex) == failure)
zlerror("expected expression or closing parenthesis", NULL);
tail->cdr = next;
tail = next;
}
tail->cdr = &nil;
}
}
if(predict(ZL1_TOKEN_COLONS, lex) == success)
{
if(lookahead(lex)->tag != ZL1_TOKEN_SYMBOL)
zlerror("expected type symbol", NULL);
expr->type = lookahead(lex)->val;
consume(lex);
}
return success;
}
<file_sep>#define __TRACE_CODEGEN
#include <stdio.h>
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include "frontend.h"
#include "ast.h"
#include "codegen.h"
#include "builtins.h"
#include "types.h"
#include "semantics.h"
#include "errors.h"
typedef struct ref_s {
unsigned char* name;
unsigned char type;
} ref_t;
typedef struct block_s {
unsigned int idx;
FILE* out;
} block_t;
static void __attribute__((always_inline)) newtmp(char* buffer, block_t* block) {
sprintf(buffer, "%%%d", block->idx++);
}
void codegen__addl__(char*, struct SEXPR*, block_t*);
static char iltype(const int type)
{
switch(type)
{
case TYPE_U32:
case TYPE_I32:
return 'w';
case TYPE_U64:
case TYPE_I64:
return 'l';
case TYPE_F32:
return 's';
case TYPE_F64:
return 'd';
}
return '\0';
}
void codegen_atom(ref_t* ref, struct SEXPR* node, block_t* block)
{
}
void codegen_node(ref_t* ref, struct SEXPR* node, block_t* block)
{
if(node->flag & AST_FLAG_ATOM)
{
codegen_atom(ref, node, block);
}
else if(node->flag & AST_FLAG_CALL)
{
struct SEXPR* args = node->cdr;
if(node->flag & AST_FLAG_BUILTIN)
{
}
else if(node->flag & AST_FLAG_SYMBOL)
{
}
else
{
zlfatal("codegen_node: unhandled node kind");
}
}
free(node);
}
void codegen(struct SEXPR* node)
{
block_t* block = zlmalloc(sizeof(block_t));
block->idx = 0;
block->out = stdout;
codegen_node(NULL, node, block);
free(block); // ?!
}
/**
* the following functions all assume:
* - the parameter types match
* - there are only two parameters
*/
#define BINOP(instr, type) \
void codegen__ ## instr ## type ## __(char* buf, struct SEXPR* args, block_t* block) \
{ \
ref_t op1 = {0}, op2 = {0}; \
codegen_node(&op1, args, block); \
codegen_node(&op2, args->cdr, block); \
newtmp(buf, block); \
fprintf(block->out, "\t%s =%s %s %s, %s\n", buf, #type, #instr, op1.name, op2.name); \
}
BINOP(add, l)
BINOP(add, w)
BINOP(sub, l)
BINOP(sub, w)<file_sep>/**
* consists of one huge array for all referenced objects.
* a variable lookup will give back the index of the variable (if found)
*
*
*
* semantics dispatcher
*
* lookup(x, ctx) ------------->
* <------------ ctx[hash(x)]
* declaration
*
* codegen
*
* get 4738 idx: 4738 -->
*
*/<file_sep>/**
* implementation of a simple hashmap which is used for all sorts of variable lookup
*/
#include <string.h>
#include <stdlib.h>
#include "ast.h"
#include "frontend.h"
#define HASHSIZE 20
typedef struct dict_s {
struct dict_s* next;
struct AST_NODE* node;
char* name;
} DICT;
static unsigned hash(char* name)
{
unsigned hashval = 0;
while(*name)
hashval = 31 * hashval + *name++;
return hashval % HASHSIZE;
}
static DICT* lookup(DICT** map, char* name)
{
DICT* np;
for(np = map[hash(name)]; np; np = np->next)
{
if(0 == strcmp(name, np->name))
{
return np;
}
}
return NULL;
}
void push(DICT** map, char* name, struct AST_NODE* node)
{
DICT* np;
unsigned hashval;
if((np = lookup(map, name)) == NULL)
{
np = zlmalloc(sizeof(DICT));
np->name = zlstrdup(name);
hashval = hash(name);
np->next = map[hashval];
map[hashval] = np;
}
else
{
free(np->node);
}
np->node = node;
}
<file_sep>#ifndef __BUILTINS_H_
#define __BUILTINS_H_
enum {
// arithmetic
__ADD__,
__SUB__,
__MUL__,
__DIV__, __UDIV__,
__REM__, __UREM__,
__SHL__,
__SHR__,
__SAR__,
__AND__,
__OR__,
__XOR__,
// comparisons
__CEQL__, __CEQW__,
__CNEL__, __CNEW__,
__CSLEL__, __CSLEW__,
__CULEL__, __CULEW__,
__CSLTL__, __CSLTW__,
__CULTL__, __CULTW__,
__CSGEL__, __CSGEW__,
__CUGEL__, __CUGEW__,
__CSGTL__, __CSGTW__,
__CUGTL__, __CUGTW__,
__CEQD__, __CEQS__,
__CNED__, __CNES__,
__CLED__, __CLES__,
__CLTD__, __CLTS__,
__CGED__, __CGES__,
__CGTD__, __CGTS__,
__COD__, __COS__, /* not NaN */
__CUOD__, __CUOS__, /* is NaN */
// conversion
__DTOSI__, __STOSI__,
__SLTOF__, __SWTOF__,
__TRUNCD__, __EXTS__,
__COPY__, __CAST__,
};
#endif<file_sep>#ifndef __TYPES_H_
#define __TYPES_H_
enum {
TYPE_I64,
TYPE_I32,
TYPE_I16,
TYPE_I8,
TYPE_U64,
TYPE_U32,
TYPE_U16,
TYPE_U8,
TYPE_F64,
TYPE_F32,
};
#endif<file_sep>/**
* changes to be done:
* better tracing
* better error messages
* unit testing
* function renaming
*/
#include <stdlib.h>
#include <stdio.h>
#include <ctype.h>
#include <string.h>
#include "frontend.h"
#include "scanner.h"
#include "errors.h"
typedef struct lexer_s {
size_t lineno;
char *filename, *src;
struct TOKEN* next;
} lexer_t;
/**
* SEMICOLON ::= \;
* LPAREN ::= \(
* RPAREN ::= \)
* LBRACE ::= \{
* RBRACE ::= \}
* COLONS ::= \:\:
* INTEGER ::= \d+
* SYMBOL ::= [^\s]*
*
*/
static int issymbol(char c)
{
static const char* viable = "~!?./,<>_+=-:$%^&\\[]";
return isalnum(c) || strchr(viable, c);
}
static void ZL1_next__(lexer_t* lex, struct TOKEN* tok)
{
zlassert(lex && tok, "ZL1_next__( NULL , NULL )");
tok->tag = ZL1_TOKEN_EOF;
// skip spaces
while(*lex->src != '\0' && isspace(*lex->src))
{
if(*lex->src == '\n')
{
lex->lineno++;
}
lex->src++;
}
char c = *lex->src;
if(isdigit(c))
{
size_t len = 1;
while(isdigit(*(lex->src + len)))
{
len++;
}
tok->tag = ZL1_TOKEN_INTEGER;
tok->val = zlmalloc(sizeof(char) * (len + 1));
zlstrncpy(tok->val, lex->src, len + 1);
lex->src += len;
}
else if(c == ';')
{
tok->tag = ZL1_TOKEN_SEMICOLON;
lex->src++;
}
else if(c == ':')
{
if(*(lex->src + 1) == ':')
{
tok->tag = ZL1_TOKEN_COLONS;
lex->src += 2;
}
else
{
/* without this branch a lone ':' left the tag at EOF and never advanced src */
zlerror("expected '::'", NULL);
}
}
else if(c == '(')
{
tok->tag = ZL1_TOKEN_LPAREN;
lex->src++;
}
else if(c == ')')
{
tok->tag = ZL1_TOKEN_RPAREN;
lex->src++;
}
else if(c == '{')
{
tok->tag = ZL1_TOKEN_LBRACE;
lex->src++;
}
else if(c == '}')
{
tok->tag = ZL1_TOKEN_RBRACE;
lex->src++;
}
else if(issymbol(c))
{
size_t len = 1;
while(issymbol(*(lex->src + len)))
{
len++;
}
tok->tag = ZL1_TOKEN_SYMBOL;
tok->val = zlmalloc(sizeof(char) * (len + 1));
zlstrncpy(tok->val, lex->src, len + 1);
lex->src += len;
}
}
void consume(lexer_t* lex)
{
zlassert(lex, "consume( NULL )");
free(lex->next); // should be always assigned
lex->next = zlmalloc(sizeof(struct TOKEN));/* safe */
ZL1_next__(lex, lex->next);
#ifdef __TRACE_LEXER
printf("consume( ... )\n");
#endif
}
struct TOKEN* __attribute__((always_inline)) lookahead(lexer_t* lex)
{
zlassert(lex, "ZL1_lookahead( NULL )");
#ifdef __TRACE_LEXER
static int count = 1;
printf("ZL1_lookahead#%d( ... ) -> ...\n", count++);
#endif
return lex->next;
}
int predict(int token, lexer_t* lex)
{
if(lex->next->tag == token)
{
consume(lex);
return success;
}
return failure;
}
lexer_t* ZL1_create(char* src, char* filename)
{
#ifdef __TRACE_LEXER
printf("ZL1_create( ... )\n");
#endif
lexer_t* lex = zlmalloc(sizeof(lexer_t));
lex->src = src; /* unsafe */
lex->filename = filename; /* unsafe */
lex->lineno = 1;
lex->next = zlmalloc(sizeof(struct TOKEN));
ZL1_next__(lex, lex->next);
return lex;
}
void ZL1_free(lexer_t* lex)
{
#ifdef __TRACE_LEXER
printf("ZL1_free( ... )\n");
#endif
zlassert(lex, "ZL1_free( NULL )");
free(lex->next);
free(lex);
}
<file_sep>#ifndef __PARSER_H_
#define __PARSER_H_
#include "ast.h"
typedef struct lexer_s lexer_t;
struct SEXPR* parse(lexer_t* lex);
int parse_atom(struct SEXPR*, lexer_t* lex);
int parse_stmt(struct SEXPR*, lexer_t* lex);
int parse_list(struct SEXPR*, lexer_t* lex);
int parse_expr(struct SEXPR*, lexer_t* lex);
#endif
<file_sep>#ifndef __FRONTEND_H_
#define __FRONTEND_H_
#include <stddef.h>
#define success 0
#define failure 1
/**
* debug tracing hook (unused in the current sources)
*/
void ZL0_trace(char *);
/**
* malloc wrapper that terminates the compiler if allocation fails
*/
void* __attribute__((always_inline)) zlmalloc(size_t);
/**
* strncpy variant that guarantees '\0' termination
*/
char* zlstrncpy(char*, char*, size_t);
char* __attribute__((always_inline)) zlstrdup(char*);
#endif
<file_sep>// Zola
// #define __TRACE_PARSER
// #define __TRACE_EVAL
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>
#include <stdarg.h>
#include <string.h>
#include "frontend.h"
#include "ast.h"
#include "parser.h"
#include "scanner.h"
#include "semantics.h"
#include "errors.h"
/**
* frontend optimizations:
* compile time execution
* inlining
*
* backend optimizations:
* sparse const propagation
* copy elimination
* dead instruction elimination
*/
void __attribute__((noreturn, noinline)) zlcrash(const char *msg, const char* file, int line)
{
// cleanup is left to the OS
fprintf(stderr, "compiler-crash: %s (%s:%d)\n", msg, file, line);
exit(-1);
}
void __attribute__((noreturn, noinline)) zlerror(const char* msg, void* pos)
{
fprintf(stderr, "syntax error: %s\n", msg);
exit(-1);
}
/**
* a function that acts like malloc but which terminates
* if malloc fails to return a valid pointer
*/
void* __attribute__((always_inline)) zlmalloc(size_t size)
{
void* ptr = malloc(size);
zlassert(ptr, "out of dynamic memory!");
return ptr;
}
/**
* a custom strncpy function which ensures
* that the returned string is \0 terminated
*/
char* zlstrncpy(char* dest, char* src, size_t n)
{
zlassert(n > 0, "zlstrncpy( ..., -1)");
size_t i;
for(i = 0; i < n - 1 && src[i]; i++)
{
dest[i] = src[i];
}
dest[i] = '\0';
return dest;
}
char* __attribute__((always_inline)) zlstrdup(char *src)
{
char* dup = zlmalloc(strlen(src) + 1);
strcpy(dup, src);
return dup;
}
// driver
void debug_lexer(char *src)
{
lexer_t* lex = ZL1_create(src, "<debug>");
while(lookahead(lex)->tag != ZL1_TOKEN_EOF)
{
struct TOKEN* tok = lookahead(lex);
consume(lex);
printf("[LEXER] >> %d\n", tok->tag);
free(tok);
}
ZL1_free(lex);
}
void debug_parser(char *src)
{
lexer_t* lex = ZL1_create(src, "<debug>");
struct SEXPR* expr = zlmalloc(sizeof(struct SEXPR));
parse_expr(expr, lex);
printf("\n");
free(expr);
ZL1_free(lex);
}
static inline void ptabs(int c)
{
for(int i = 0; i < c; i++)
printf(" ");
}
static void dump_ast(struct SEXPR* node)
{
if(node->flag & AST_FLAG_ATOM)
{
printf("%s", node->atom);
}
else if(node->flag & AST_FLAG_CONS)
{
printf("cons(");
dump_ast(node->car);
printf(" ");
dump_ast(node->cdr);
printf(")");
}
else if(!node->flag)
{
printf("nil");
}
else
{
printf("oops!");
}
}
/**
* TODO:
* - DO NOT edit the ast
* - implement dispatcher
* - finish simple codegen
* - add error messages
* - add macros
*/
int main(int argc, char **argv)
{
//debug_parser("{ def!; 38239; f(); }");
lexer_t* lex = ZL1_create("{ this; dude; }(1 2 3)", "<unknown>");
struct SEXPR* node;
node = parse(lex);
dump_ast(node);
printf("\n");
node = analyze(node);
dump_ast(node);
printf("\n");
// printf("codegen...\n");
// codegen(node);
// free(node);
ZL1_free(lex);
return 0;
}
<file_sep>#ifndef __ERRORS_H_
#define __ERRORS_H_
#ifndef NDEBUG
#define zlassert(cond, msg) do { if(!(cond)) zlcrash(msg, __FILE__, __LINE__); } while(0)
#else
#define zlassert(cond, msg)
#endif
#define zlfatal(msg) zlcrash(msg, __FILE__, __LINE__)
void __attribute__((noreturn, noinline)) zlcrash(const char*, const char*, int);
void __attribute__((noreturn, noinline)) zlerror(const char*, void*);
#endif | a21629ec4c5e0925bd875410321a3b45a634eb2f | [
"Markdown",
"C"
]
| 18 | Markdown | tr00/zola | f8ef1a3828001fb90ae7444a02d8d5a28c2edc8d | 1f7845e2f54e2d94645dead818ae68055af4996d |
refs/heads/master | <repo_name>martinfelbaba/gs-jpa<file_sep>/README.md
# gs-jpa
Demo project for Spring Boot Jpa, Rest
To run:
mvn clean package
java -jar target/gs-jpa-0.0.3-SNAPSHOT.jar<file_sep>/src/main/java/com/mf/gs/domain/package-info.java
/**
* Domain/model classes.
*
* @since 1.0
* @author martin
* @version 1.0
*/
package com.mf.gs.domain; | 8f5cb3b13b921bfe31a63bf8e4cb21f31eeee823 | [
"Markdown",
"Java"
]
| 2 | Markdown | martinfelbaba/gs-jpa | e66a2041ad12a069eba62e868b198a2768ce6261 | e8eff8162dd4b88a1eda5a4e2d38873a413039d6 |
refs/heads/master | <file_sep>/**
* @desc This js document is used for creating ajax jQuery functionality for online chat
* @author <NAME>
*/
$(document).ready(function(){
//Hiding elements that represents errors
$('#error1').hide();
$('#error2').hide();
getMessages();
getUsers();
//Validation of event
$('#message').keyup(function(){
checkMessage();
});
$('#send-message').click(function(event){
event.preventDefault();
let message = $('#message').val();
if(message == '') return false;
let validation = checkMessage();
if(validation === true) return false;
//Ajax jQuery request
$.post(base_url+'chat/add-message', {user_id: user_id, message: message}, function(data){
if(data[0] === 'invalid') window.location.href = base_url+'chat';
}, 'json');
$('#message').val('');
getMessages();
});
//checking length and allowed characters for the Message field
function checkMessage(){
let messageLength = $('#message').val().length;
if(messageLength >200){
$('#error1').html('You can send up to 200 characters in the Message field.');
$('#error1').show();
return true;
}else {
$('#error1').hide();
}
let pattern = /^[-a-zA-Z0-9\s.:,@]*$/;
if(pattern.test($('#message').val())){
$('#error2').hide();
}else{
$('#error2').html('Characters allowed in the Message field are: . , : , - , , , @, spaces , letters and numbers.');
$('#error2').show();
return true;
}
}
/**
* Generating ajax request for showing all chat messages
*/
function getMessages(){
$.post(base_url+'all-messages', function(data){
let noMsg = '<p class="alert alert-warning">No new messages.<p>';
if(data[0] === 'No new messages.'){
$('#chat').html(noMsg);
return false;
}
let allMsg = '';
$.each(data, function(i) {
allMsg += '<p class="alert alert-success"><i>'+data[i][2]+'</i> <strong>'+data[i][0]+'</strong> '+data[i][1]+'</p>';
});
$('#chat').html(allMsg);
}, 'json');
}
/**
* Generating ajax request for showing current users
*/
function getUsers(){
$.post(base_url+'all-users', function(data){
if(data.length == 0){
window.location.href = base_url+'chat';
return false;
}
let allUsers = '';
$.each(data, function(i){
allUsers += '<li class="list-group-item">'+ data[i].username +'</li>';
});
$('#users').html(allUsers);
}, 'json');
}
/**
* Generating ajax request for checking if username exists
*/
function checkUser(){
$.post(base_url+'check-user', {userid: user_id}, function(data){
if(data[0] === 'invalid') window.location.href = base_url+'chat';
}, 'json');
}
setInterval(getUsers, 60*1000);
setInterval(getMessages, 6*1000);
setInterval(checkUser, 10*1000);
}
);<file_sep><?php
/**
* @desc This class is used for all online chat functionality
* @author <NAME>
*/
class Chat_model extends CI_Model{
//login
public function login($username){
$query = $this->db->get_where('users', array('username' => $username));
if($query->num_rows() === 1){
return false;
}else{
$data = array(
'username' => $username ,
);
$this->db->insert('users', $data);
$query = $this->db->get_where('users', array('username' => $username));
return $query->row();
}
}
public function add_message($user_id, $message){
$message = $this->db->escape_str($message);
$data = array(
'user_id' => $user_id ,
'message' => $message ,
);
return $this->db->insert('messages', $data);
}
/*
* Getting all messages from database
*/
public function get_messages(){
$this->db->select('users.username, messages.message, DATE_FORMAT(messages.created_at, "%H:%i:%s") AS created_at');
$this->db->from('messages');
$this->db->join('users', 'messages.user_id = users.user_id');
$this->db->order_by("created_at", "desc");
$query = $this->db->get();
return $query;
}
/*
* Checking if messages are older than 10 min. and deleting them if so
*/
public function delete_messages(){
$query = 'DELETE FROM messages WHERE created_at < (NOW() - INTERVAL 10 MINUTE)';
$this->db->query($query);
}
public function all_users(){
$query = 'SELECT username FROM users';
$result = $this->db->query($query);
return $result->result();
}
/*
* Checking if usernames are older than 30 min. and deleting them if so
* Every user need to relog every 30 min. approximately
*/
public function delete_users(){
$query = 'DELETE FROM users WHERE created_at < (NOW() - INTERVAL 30 MINUTE)';
$this->db->query($query);
}
/*
* Checking if username exists in database
*/
public function check_user($user_id){
$query = $this->db->get_where('users', array('user_id' => $user_id));
return $query;
}
}<file_sep><?php if($this->session->flashdata('user_loggedin')) : ?>
<p class="alert alert-success"><?= $this->session->flashdata('user_loggedin') ?></p>
<?php endif; ?>
<script>
let user_id = '<?php echo $_SESSION['user_id']; ?>';
let base_url = '<?= base_url() ?>';
</script>
<div id="messageArea">
<div class="row">
<div class="col-md-4">
<h1><?= $title ?></h1>
<h3>Rules</h3>
<p>Every user need to relog every 30 min.</p>
<p>Messages older than 10 min. will be deleted from chat.</p>
<h3>Active Users</h3>
<ul class="list-group" id="users">
</ul>
</div>
<div class="col-md-8">
<label>Enter message: </label>
<textarea class="form-control" id="message"></textarea>
<p id="error1" class="alert alert-danger"></p>
<p id="error2" class="alert alert-danger"></p>
<br>
<input id="send-message" class="btn btn-primary" type="submit" value="Send message...">
<br><br>
<div id="chat">
</div>
</div>
</div>
</div>
</div>
<script src="<?= base_url() ?>js/chat.js"></script><file_sep><!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<title>Online Chat</title>
<link rel="stylesheet" href="<?= base_url(); ?>css/style.css">
<!-- Bootstrap core CSS -->
<link href="https://getbootstrap.com/docs/4.1/dist/css/bootstrap.min.css" rel="stylesheet">
<link href="<?= base_url() ?>css/style.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="https://getbootstrap.com/docs/4.1/examples/navbar-fixed/navbar-top-fixed.css" rel="stylesheet">
<link href="https://fonts.googleapis.com/css?family=Source+Sans+Pro" rel="stylesheet">
<script
src="https://code.jquery.com/jquery-3.3.1.js"
integrity="<KEY>
crossorigin="anonymous"></script>
</head>
<body id="body">
<div class="container-fluid">
<file_sep><?php
/**
* @desc This class is used for all online chat functionality
* @author <NAME>
*/
class Chat extends CI_Controller {
public function index(){
$data['title'] = 'Login to online chat';
$this->load->view('templates/header');
$this->load->view('chat/chat', $data);
$this->load->view('templates/footer');
}
public function login(){
$data['title'] = 'Login to online chat';
$this->form_validation->set_error_delimiters('<p class="alert alert-danger">', '</p>');
$this->form_validation->set_rules('username', 'Username', 'required|trim|min_length[2]|max_length[20]|htmlspecialchars|callback_validate_username');
//Login form validation
if($this->form_validation->run() === false){
$this->load->view('templates/header');
$this->load->view('chat/chat', $data);
$this->load->view('templates/footer');
}else{
$username = $this->input->post('username');
//Login
$logovanje = $this->chat_model->login($username);
if($logovanje){
//Create sesssion
$user_data = array(
'login' => 'User is logged in for chatting.',
'user_id' => $logovanje->user_id
);
$this->session->set_userdata($user_data);
//set flash message
$this->session->set_flashdata('user_loggedin', 'You are now logged in to chat.');
redirect(base_url().'chat-start');
}else{
//set flash message
$this->session->set_flashdata('user_failed', 'This username is already taken.');
redirect(base_url().'chat');
}
}
}
/*
* Validation on input from login form
*/
public function validate_username($username){
$reg_exp = '/^[a-zA-Z0-9]*$/';
if(!(preg_match($reg_exp,$username))){
$this->form_validation->set_message('validate_username', 'The {field} field can contain only letters and numbers.');
return false;
}else{
return true;
}
}
/*
* After successful login
*/
public function chat_start(){
if($this->session->userdata('login') !== 'User is logged in for chatting.'){
redirect(base_url().'chat');
}
$data['title'] = 'Welcome to Chat';
$this->load->view('templates/header');
$this->load->view('chat/chat-start', $data);
$this->load->view('templates/footer');
}
/*
* Adding and validating chat messages
*/
public function add_message(){
if($this->session->userdata('login') !== 'User is logged in for chatting.'){
redirect(base_url().'chat');
}
$user_id = $this->input->post('user_id');
$user_id = (int)$user_id;
$res = $this->chat_model->check_user($user_id);
$invalid = array('invalid');
if($res->num_rows() == 0) {
echo json_encode($invalid);
exit();
}
$message = $this->input->post('message');
$message = self::validate_message($message);
if($message === false) return false;
$result = $this->chat_model->add_message($user_id, $message);
}
/*
* Validation of chat messages
*/
public static function validate_message($message){
$reg_exp = "/^[-a-zA-Z0-9\s.:,@]*$/";
if(
(strlen($message)>200)
||
(!(preg_match($reg_exp,$message)))
) {
return false;
} else return $message;
}
/*
* Showing all chat messages from the database that are not older than 10 min.
*/
public function all_messages(){
if($this->session->userdata('login') !== 'User is logged in for chatting.'){
redirect(base_url().'chat');
}
$this->chat_model->delete_messages();
$no_messages = array('No new messages.');
$messages = $this->chat_model->get_messages();
if($messages->num_rows() > 0){
$result = array();
$res = array();
foreach($messages->result() as $msg){
array_push($res, $msg->username);
array_push($res, $msg->message);
array_push($res, $msg->created_at);
array_push($result, $res);
$res = array();
}
echo json_encode($result);
}else echo json_encode($no_messages);
}
/*
* Showing current users of the chat
*/
public function all_users(){
if($this->session->userdata('login') !== 'User is logged in for chatting.'){
redirect(base_url().'chat');
}
$this->chat_model->delete_users();
echo json_encode($this->chat_model->all_users());
}
/*
* Checking if user exists
*/
public function check_user(){
$user_id = $this->input->post('userid');
$user_id = (int)$user_id;
$res = $this->chat_model->check_user($user_id);
$invalid = array('invalid');
if($res->num_rows() == 0) {
session_destroy();
echo json_encode($invalid);
}
}
}<file_sep> <form id="loginForm" action="<?= base_url().'chat/login' ?>" method="post">
<div class="row">
<div class="col-md-4"></div>
<div class="col-md-4">
<?= validation_errors() ?>
<?php if($this->session->flashdata('user_failed')) : ?>
<p class="alert alert-danger"><?= $this->session->flashdata('user_failed') ?></p>
<?php endif; ?>
<h1 class="text-center"><?= $title ?></h1>
<div class="form-group">
<input type="text" id="username" class="form-control" name="username" placeholder="Username" required autofocus>
</div>
<p id="error1" class="alert alert-danger"></p>
<p id="error2" class="alert alert-danger"></p>
<button id="submit" type="submit" class="btn btn-primary btn-block">Login</button>
</div>
<div class="col-md-4"></div>
</div>
</form>
<script src="<?= base_url() ?>js/validation.js"></script> | ac21da91a30eadb7b68228c5d9806df9185af6f9 | [
"JavaScript",
"PHP"
]
| 6 | JavaScript | brankozecevic/online-chat-with-php-jquery-ajax-codeigniter | 1d1728e2ea1fa43782a3db2510d3bf4643a835d9 | 152b9b7c04cb34c43d2c93c6d5aca96d0e067dae |
refs/heads/master | <repo_name>JunaidGool/shoeCatalogue<file_sep>/revised_ShoeCatalogue.js
document.getElementById("addStockButton").addEventListener("click", addStock);
document.getElementById("addStockButton").addEventListener("click", getSum);
document.getElementById("addStockButton").addEventListener("click", totalQuantities);
document.getElementById("viewStockLevelsButton").addEventListener("click", myStockChart);
document.getElementById("brandDropDownOutput").addEventListener("change", brandFilter);
document.getElementById("colorDropDownOutput").addEventListener("change", colorFilter);
document.getElementById("sizeDropDownOutput").addEventListener("change", sizeFilter);
document.getElementById("stockController").addEventListener("click", viewStockController);
document.getElementById("viewStock").addEventListener("click", viewStockTable);
document.getElementById("openCatalogue").addEventListener("click", viewCatalogue);
var shoeTableScript = document.getElementById("shoeTableScript").innerHTML;
var shoeTableTemplate = Handlebars.compile(shoeTableScript);
var brandDropDownScript = document.getElementById("brandDropDownScript").innerHTML;
var brandDropDownTemplate = Handlebars.compile(brandDropDownScript);
var colorDropDownScript = document.getElementById("colorDropDownScript").innerHTML;
var colorDropDownTemplate = Handlebars.compile(colorDropDownScript);
var sizeDropDownScript = document.getElementById("sizeDropDownScript").innerHTML;
var sizeDropDownTemplate = Handlebars.compile(sizeDropDownScript);
var shoeDataArray = [{brandName:"addidas",
shoeSize:8,
shoeColor:"white",
shoeQuantity:26},
{brandName:"nike",
shoeSize:7,
shoeColor:"red",
shoeQuantity:32},
{brandName:"puma",
shoeSize:12,
shoeColor:"blue",
shoeQuantity:19},
{brandName:"reebok",
shoeSize:10,
shoeColor:"black",
shoeQuantity:16},
];
var addidasQuantity = [26];
var nikeQuantity = [32];
var reebokQuantity = [19];
var pumaQuantity = [16];
var duplicates =[];
var tableLocation = document.getElementById("tableOutput");
var brandDropDownLocation = document.getElementById("brandDropDownOutput");
var colorDropDownLocation = document.getElementById("colorDropDownOutput");
var sizeDropDownLocation = document.getElementById("sizeDropDownOutput");
var shoeData = "";
if (localStorage.getItem("shoeDataArray")){
shoeDataArray = JSON.parse(localStorage.getItem("shoeDataArray"));
}
tableLocation.innerHTML = shoeTableTemplate({
shoeTable: shoeDataArray
});
// brandDropDownLocation.innerHTML = brandDropDownTemplate({
// dropDownBrands: shoeDataArray
// });
// colorDropDownLocation.innerHTML = colorDropDownTemplate({
// dropDownColors: shoeDataArray
// });
// sizeDropDownLocation.innerHTML = sizeDropDownTemplate({
// dropDownSizes: shoeDataArray
// });
if (localStorage.getItem("addidasQuantity")){
addidasQuantity = JSON.parse(localStorage.getItem("addidasQuantity"));
}
document.getElementById("addidasQuantity").innerHTML = addidasQuantity.reduce(getSum);
function showBrands(){
var brandsList = [];
var brandsMap = {};
//create a unique list of brands
// filter my shoe array to only contain unique brands
for (var i = 0; i < shoeDataArray.length; i++) {
var shoe = shoeDataArray[i];
if (brandsMap[shoe.brandName] === undefined){
brandsList.push({brandName : shoe.brandName});
brandsMap[shoe.brandName] = shoe.brandName;
}
}
// display the options in the dropdown
var brandDropDownObjects ={dropDownBrands: brandsList};
var brandDropDownData = brandDropDownTemplate(brandDropDownObjects);
document.getElementById("brandDropDownOutput").innerHTML = brandDropDownData;
}
showBrands();
function showColors(){
var colorsList = [];
var colorsMap = {};
//create a unique list of brands
// filter my shoe array to only contain unique brands
for (var i = 0; i < shoeDataArray.length; i++) {
var shoe = shoeDataArray[i];
if (colorsMap[shoe.shoeColor] === undefined){
colorsList.push({shoeColor : shoe.shoeColor});
colorsMap[shoe.shoeColor] = shoe.shoeColor;
}
}
// display the options in the dropdown
var colorsDropDownObjects ={dropDownColors: colorsList};
var colorsDropDownData = colorDropDownTemplate(colorsDropDownObjects);
document.getElementById("colorDropDownOutput").innerHTML = colorsDropDownData;
}
showColors();
function showSizes(){
var sizeList = [];
var sizeMap = {};
//create a unique list of brands
// filter my shoe array to only contain unique brands
for (var i = 0; i < shoeDataArray.length; i++) {
var shoe = shoeDataArray[i];
if (sizeMap[shoe.shoeSize] === undefined){
sizeList.push({shoeSize : shoe.shoeSize});
sizeMap[shoe.shoeSize] = shoe.shoeSize;
}
}
// display the options in the dropdown
var sizeDropDownObjects ={dropDownSizes: sizeList};
var sizeDropDownData = sizeDropDownTemplate(sizeDropDownObjects);
document.getElementById("sizeDropDownOutput").innerHTML = sizeDropDownData;
}
showSizes();
if (localStorage.getItem("nikeQuantity")){
nikeQuantity = JSON.parse(localStorage.getItem("nikeQuantity"));
}
document.getElementById("nikeQuantity").innerHTML = nikeQuantity.reduce(getSum);
if (localStorage.getItem("reebokQuantity")){
reebokQuantity = JSON.parse(localStorage.getItem("reebokQuantity"));
}
document.getElementById("reebokQuantity").innerHTML = reebokQuantity.reduce(getSum);
if (localStorage.getItem("pumaQuantity")){
pumaQuantity = JSON.parse(localStorage.getItem("pumaQuantity"));
}
document.getElementById("pumaQuantity").innerHTML = pumaQuantity.reduce(getSum);
localStorage.setItem("shoeDataArray", JSON.stringify(shoeDataArray));
storedShoeData = JSON.parse(localStorage.getItem("shoeDataArray"));
var brandOptions = document.querySelectorAll(".brandOptions")
var colorOptions = document.querySelectorAll(".colorOptions")
var sizeOptions = document.querySelectorAll(".sizeOptions")
function addStock(){
var tableShoeObjects = {shoeTable: shoeDataArray};
var colorDropDownObjects ={dropDownColors: shoeDataArray};
var sizeDropDownObjects ={dropDownSizes: shoeDataArray};
var brandNameInput = document.getElementById("brandNameInputID");
var shoeSizeInput = document.getElementById("shoeSizeInputID");
var shoeColorInput = document.getElementById("shoeColorInputID");
var shoeQuantityInput = document.getElementById("shoeQuantityInputID");
document.getElementById("imagesCollection").style.visibility = "hidden";
for (var i = 0; i < brandOptions.length; i++){
if (brandOptions[i].value === document.getElementById("brandNameInputID").value){ // compare each option, not the NodeList itself
duplicates.push({brandName: brandNameInput.value});
alert("duplicate");
}
}
if (brandNameInput.value.startsWith("addidas") && shoeSizeInput.value != ("") && shoeColorInput.value != ("") && shoeQuantityInput.value !=("") ){
shoeDataArray.push({brandName:brandNameInput.value,
shoeSize:shoeSizeInput.value,
shoeColor:shoeColorInput.value,
shoeQuantity:parseInt(shoeQuantityInput.value)});
addidasQuantity.push(parseInt(shoeQuantityInput.value));
localStorage.setItem("addidasQuantity", JSON.stringify(addidasQuantity));
storedShoeData = JSON.parse(localStorage.getItem("addidasQuantity"));
document.getElementById("outputSelectedBrand").innerHTML = "BRAND : " + brandNameInput.value;
document.getElementById("outputSelectedSize").innerHTML = "SIZE : " + shoeSizeInput.value;
document.getElementById("outputSelectedColor").innerHTML = "COLOR : " + shoeColorInput.value;
document.getElementById("outputSelectedQuantity").innerHTML = "Quantity : " + shoeQuantityInput.value;
document.getElementById("addidasPic").style.visibility = "visible";
document.getElementById("nikePic").style.visibility = "hidden";
document.getElementById("pumaPic").style.visibility = "hidden";
document.getElementById("reebokPic").style.visibility = "hidden";
document.getElementById("brandPic").style.visibility = "hidden";
}
else if (brandNameInput.value.startsWith("nike") && shoeSizeInput.value != ("") && shoeColorInput.value != ("") && shoeQuantityInput.value !=("") ){
shoeDataArray.push({brandName:brandNameInput.value,
shoeSize:shoeSizeInput.value,
shoeColor:shoeColorInput.value,
shoeQuantity:parseInt(shoeQuantityInput.value)});
nikeQuantity.push(parseInt(shoeQuantityInput.value));
localStorage.setItem("nikeQuantity", JSON.stringify(nikeQuantity));
storedShoeData = JSON.parse(localStorage.getItem("nikeQuantity"));
document.getElementById("outputSelectedBrand").innerHTML = "BRAND : " + brandNameInput.value;
document.getElementById("outputSelectedSize").innerHTML = "SIZE : " + shoeSizeInput.value;
document.getElementById("outputSelectedColor").innerHTML = "COLOR : " + shoeColorInput.value;
document.getElementById("outputSelectedQuantity").innerHTML = "Quantity: " + shoeQuantityInput.value;
document.getElementById("addidasPic").style.visibility = "hidden";
document.getElementById("nikePic").style.visibility = "visible";
document.getElementById("pumaPic").style.visibility = "hidden";
document.getElementById("reebokPic").style.visibility = "hidden";
document.getElementById("brandPic").style.visibility = "hidden";
}
else if (brandNameInput.value.startsWith("puma") && shoeSizeInput.value != ("") && shoeColorInput.value != ("") && shoeQuantityInput.value !=("") ){
shoeDataArray.push({brandName:brandNameInput.value,
shoeSize:shoeSizeInput.value,
shoeColor:shoeColorInput.value,
shoeQuantity:parseInt(shoeQuantityInput.value)});
pumaQuantity.push(parseInt(shoeQuantityInput.value));
localStorage.setItem("pumaQuantity", JSON.stringify(pumaQuantity));
storedShoeData = JSON.parse(localStorage.getItem("pumaQuantity"));
document.getElementById("outputSelectedBrand").innerHTML = "BRAND : " + brandNameInput.value;
document.getElementById("outputSelectedSize").innerHTML = "SIZE : " + shoeSizeInput.value;
document.getElementById("outputSelectedColor").innerHTML = "COLOR : " + shoeColorInput.value;
document.getElementById("outputSelectedQuantity").innerHTML = "Quantity: " + shoeQuantityInput.value;
document.getElementById("addidasPic").style.visibility = "hidden";
document.getElementById("nikePic").style.visibility = "hidden";
document.getElementById("pumaPic").style.visibility = "visible";
document.getElementById("reebokPic").style.visibility = "hidden";
document.getElementById("brandPic").style.visibility = "hidden";
}
else if (brandNameInput.value.startsWith("reebok") && shoeSizeInput.value != ("") && shoeColorInput.value != ("") && shoeQuantityInput.value !=("") ){
shoeDataArray.push({brandName:brandNameInput.value,
shoeSize:shoeSizeInput.value,
shoeColor:shoeColorInput.value,
shoeQuantity:parseInt(shoeQuantityInput.value)});
reebokQuantity.push(parseInt(shoeQuantityInput.value));
localStorage.setItem("reebokQuantity", JSON.stringify(reebokQuantity));
storedShoeData = JSON.parse(localStorage.getItem("reebokQuantity"));
document.getElementById("outputSelectedBrand").innerHTML = "BRAND : " + brandNameInput.value;
document.getElementById("outputSelectedSize").innerHTML = "SIZE : " + shoeSizeInput.value;
document.getElementById("outputSelectedColor").innerHTML = "COLOR : " + shoeColorInput.value;
document.getElementById("outputSelectedQuantity").innerHTML = "Quantity: " + shoeQuantityInput.value;
document.getElementById("addidasPic").style.visibility = "hidden";
document.getElementById("nikePic").style.visibility = "hidden";
document.getElementById("pumaPic").style.visibility = "hidden";
document.getElementById("reebokPic").style.visibility = "visible";
document.getElementById("brandPic").style.visibility = "hidden";
}else{
document.getElementById("imagesCollection").style.visibility = "hidden";
document.getElementById("outputSelectedBrand").innerHTML = "INVALID"
document.getElementById("outputSelectedSize").innerHTML = "We Specialize in"
document.getElementById("outputSelectedColor").innerHTML = "Addidas, Nike, Puma & Reebok ONLY"
document.getElementById("outputSelectedQuantity").innerHTML = "please try again and fill all boxes"
document.getElementById("addidasPic").style.visibility = "hidden";
document.getElementById("nikePic").style.visibility = "hidden";
document.getElementById("pumaPic").style.visibility = "hidden";
document.getElementById("reebokPic").style.visibility = "hidden";
document.getElementById("brandPic").style.visibility = "visible";
};
var shoeTableData = shoeTableTemplate(tableShoeObjects);
document.getElementById("tableOutput").innerHTML = shoeTableData;
showBrands();
showSizes();
showColors();
var colorDropDownData = colorDropDownTemplate(colorDropDownObjects);
document.getElementById("colorDropDownOutput").innerHTML = colorDropDownData;
var sizeDropDownData = sizeDropDownTemplate(sizeDropDownObjects);
document.getElementById("sizeDropDownOutput").innerHTML = sizeDropDownData;
document.getElementById("tableOutput").style.visibility = "visible";
document.getElementById("myChart").style.visibility = "hidden";
document.getElementById("imagesCollection").style.visibility = "hidden";
localStorage.setItem("shoeDataArray", JSON.stringify(shoeDataArray));
storedShoeData = JSON.parse(localStorage.getItem("shoeDataArray"));
};
function getSum(total, num){
return total + num;
}
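// Side note (illustrative helper, not called anywhere in this app): calling
// reduce without an initial value throws a TypeError on an empty array.
// Passing 0 as the initial value avoids that edge case:
function getSumSafe(values) {
    return values.reduce(function (total, num) { return total + num; }, 0);
}
// getSumSafe([]) -> 0, getSumSafe([1, 2, 3]) -> 6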
function totalQuantities(item) {
document.getElementById("addidasQuantity").innerHTML = addidasQuantity.reduce(getSum);
document.getElementById("nikeQuantity").innerHTML = nikeQuantity.reduce(getSum);
document.getElementById("pumaQuantity").innerHTML = pumaQuantity.reduce(getSum);
document.getElementById("reebokQuantity").innerHTML = reebokQuantity.reduce(getSum);
};
function myStockChart () {
document.getElementById("tableOutput").style.visibility = "hidden";
document.getElementById("myChart").style.visibility = "visible";
var ctx = document.getElementById('myChart').getContext('2d');
var myChart = new Chart(ctx, {
type: 'bar',
data: {
labels: ['Brands'],
datasets: [{
label: 'Addidas',
data: [document.getElementById("addidasQuantity").innerHTML],
backgroundColor: "rgba(153,255,51,0.6)"
}, {
label: 'Nike',
data: [document.getElementById("nikeQuantity").innerHTML],
backgroundColor: "rgba(255,153,0,0.6)"
}, {
label: 'Puma',
data: [document.getElementById("pumaQuantity").innerHTML],
backgroundColor: "red"
}, {
label: 'Reebok',
data: [document.getElementById("reebokQuantity").innerHTML],
backgroundColor: "blue"
}]
}
});
};
function brandFilter(evt){
    var filteredBrand = shoeDataArray.filter(function(shoeTable){
if (evt.target.id === 'brandNameFilterSelect'){
return shoeTable.brandName === evt.target.value;
}
return true;
});
tableLocation.innerHTML = shoeTableTemplate({
shoeTable: filteredBrand
});
};
function colorFilter(evt){
    var filteredColor = shoeDataArray.filter(function(shoeTable){
if (evt.target.id === 'colorFilterSelect'){
return shoeTable.shoeColor === evt.target.value;
}
return true;
});
tableLocation.innerHTML = shoeTableTemplate({
shoeTable: filteredColor
});
};
function sizeFilter(evt){
    var filteredSize = shoeDataArray.filter(function(shoeTable){
if (evt.target.id === 'sizeFilterSelect'){
return shoeTable.shoeSize === evt.target.value;
}
return true;
});
tableLocation.innerHTML = shoeTableTemplate({
shoeTable: filteredSize
});
};
function viewStockTable() {
document.getElementById("tableOutput").style.visibility = "visible";
document.getElementById("myChart").style.visibility = "hidden";
};
function viewStockController() {
document.getElementById("selectionCriteriaOutput").style.visibility = "visible";
document.getElementById("brandQuantities").style.visibility = "visible";
document.getElementById("shoeInputs").style.visibility = "visible";
document.getElementById("tableOutput").style.visibility = "visible";
document.getElementById("filterCriteria").style.visibility = "visible";
document.getElementById("brandDropDownOutput").style.visibility = "visible";
document.getElementById("colorText").style.visibility = "visible";
document.getElementById("colorDropDownOutput").style.visibility = "visible";
document.getElementById("sizeText").style.visibility = "visible";
document.getElementById("sizeDropDownOutput").style.visibility = "visible";
document.getElementById("sizeFilterSelect").style.visibility = "visible";
document.getElementById("myWelcome").style.visibility = "hidden";
document.getElementById("myChart").style.visibility = "hidden";
};
function viewCatalogue() {
document.getElementById("selectionCriteriaOutput").style.visibility = "hidden";
document.getElementById("brandQuantities").style.visibility = "hidden";
document.getElementById("shoeInputs").style.visibility = "hidden";
document.getElementById("tableOutput").style.visibility = "visible";
document.getElementById("filterCriteria").style.visibility = "visible";
document.getElementById("brandDropDownOutput").style.visibility = "visible";
document.getElementById("colorText").style.visibility = "visible";
document.getElementById("colorDropDownOutput").style.visibility = "visible";
document.getElementById("sizeText").style.visibility = "visible";
document.getElementById("sizeDropDownOutput").style.visibility = "visible";
document.getElementById("sizeFilterSelect").style.visibility = "visible";
document.getElementById("myWelcome").style.visibility = "hidden";
document.getElementById("imagesCollection").style.visibility = "visible";
document.getElementById("addidasPic").style.visibility = "hidden";
document.getElementById("nikePic").style.visibility = "hidden";
document.getElementById("pumaPic").style.visibility = "hidden";
document.getElementById("reebokPic").style.visibility = "hidden";
document.getElementById("brandPic").style.visibility = "hidden";
document.getElementById("myChart").style.visibility = "hidden";
};
| bd1333388cbcb5ed97ed6a9fc8b0192d968c9e02 | [
"JavaScript"
]
| 1 | JavaScript | JunaidGool/shoeCatalogue | 886befdbcb4cd153addd9cbfbfb8b1537b802e08 | d58b4ae7236a53c81aafd946b1948a7649f8c13a |
refs/heads/master | <repo_name>shaiksameer46/coditas<file_sep>/src/App.js
import React, { Component } from "react";
import "./App.css";
import { Dropdown } from "react-bootstrap";
import axios from "axios";
class App extends Component {
state = {
username: "",
users: [],
repos: [],
toggle: false,
currentPage: 1,
todosPerPage: 5
};
constructor() {
super();
this.handleClick = this.handleClick.bind(this);
}
handleSubmit = async event => {
event.preventDefault();
const query = this.state.username;
const { data: users } = await axios.get(
"https://api.github.com/search/users?q=" + query
);
this.setState({ users: users.items });
this.setState({ username: "" });
this.setState({ details: "Details" });
    this.setState({ repos: [] });
};
handleClick(event) {
this.setState({
currentPage: Number(event.target.id)
});
}
handleChange = event => {
this.setState({ username: event.target.value });
};
repodetails = async search => {
const { data: repos } = await axios.get(
"https://api.github.com/users/" + search + "/repos"
);
this.setState({ repos });
console.log(this.state.repos[0].owner.login);
this.setState(state => ({
toggle: !state.toggle
}));
};
sortasc = data => {
function sortOn(property) {
var sortOrder = 1;
if (property[0] === "-") {
sortOrder = -1;
property = property.substr(1);
}
return function(a, b) {
if (sortOrder === -1) {
return b[property].localeCompare(a[property]);
} else {
return a[property].localeCompare(b[property]);
}
};
}
const info = data.sort(sortOn("login"));
this.setState({ users: info });
};
sortdsc = data => {
function sortOn(property) {
var sortOrder = 1;
if (property[0] === "-") {
sortOrder = -1;
property = property.substr(1);
}
return function(a, b) {
if (sortOrder === -1) {
return b[property].localeCompare(a[property]);
} else {
return a[property].localeCompare(b[property]);
}
};
}
const info = data.sort(sortOn("-login"));
this.setState({ users: info });
};
render() {
const { currentPage, todosPerPage, users } = this.state;
const indexOfLastTodo = currentPage * todosPerPage;
const indexOfFirstTodo = indexOfLastTodo - todosPerPage;
const currentTodos = users.slice(indexOfFirstTodo, indexOfLastTodo);
const pageNumbers = [];
for (let i = 1; i <= Math.ceil(users.length / todosPerPage); i++) {
pageNumbers.push(i);
}
const renderPageNumbers = pageNumbers.map(number => {
return (
        <li key={number}>
          <button
            className="btn btn-small"
            id={number}
            onClick={this.handleClick}
          >
            {number}
          </button>
</li>
);
});
return (
<React.Fragment>
<nav class="navbar navbar-expand-lg navbar-dark bg-dark container mt-1">
<div class="collapse navbar-collapse" id="navbarSupportedContent">
<ul class="navbar-nav mr-auto">
<Dropdown className="dropdownstyle">
<Dropdown.Toggle variant="success" id="dropdown-basic">
Sort By Name
</Dropdown.Toggle>
<Dropdown.Menu>
<Dropdown.Item
onClick={this.sortasc.bind(this, this.state.users)}
>
Name A-Z
</Dropdown.Item>
<Dropdown.Item
onClick={this.sortdsc.bind(this, this.state.users)}
>
Name Z-A
</Dropdown.Item>
</Dropdown.Menu>
</Dropdown>
</ul>
<form
onSubmit={this.handleSubmit}
className="form-inline my-2 my-lg-0 formstyle"
>
<input
className="form-control mr-sm-2"
id="search"
type="text"
autoComplete="off"
value={this.state.username}
onChange={this.handleChange}
placeholder="Enter user name"
/>
<button
type="submit"
className="btn btn-outline-success my-2 my-sm-0"
>
search
</button>
</form>
</div>
</nav>
{this.state.users.length
? currentTodos.map(user => (
<div className="template jumbotron">
<div>
<img
src={user.avatar_url}
className="iconDetails rounded-circle"
alt="not found git_image"
/>
</div>
<div className="styleplate">
<h2>{user.login}</h2>
<h5>Profile : {user.html_url}</h5>
<button
className="buttonstyle btn btn-primary"
onClick={this.repodetails.bind(this, user.login)}
>
{this.state.toggle ? "Collapse" : "Details"}
</button>
<p className="valuestyle">Data one : Value one</p>
<p className="valuestyle">Data two : Value two</p>
</div>
{this.state.repos.length ? (
<div className="panel panel-default panelstyle">
<table className="table table-dark">
{this.state.repos.map(repo =>
this.state.toggle &&
repo.owner.login === user.login ? (
                          <tbody key={repo.id}>
                            <tr>
<td>
<p>{repo.name}</p>
</td>
<td>
<p>{repo.language}</p>
</td>
</tr>
</tbody>
) : (
""
)
)}
</table>
</div>
) : (
""
)}
</div>
))
: ""}
<div className="col xs-6-push layout">
<ul className="pagination">{renderPageNumbers}</ul>
</div>
</React.Fragment>
);
}
}
export default App;
| fcfafd1f4a4b380a8aea443339a74ac730b2a25f | [
"JavaScript"
]
| 1 | JavaScript | shaiksameer46/coditas | 95c21c0295e13b20923da2a140dd60b5f3733a6b | 4a1335351e0042080ef1ee189ff3682fa6523992 |
refs/heads/master | <file_sep># Page-Replacement-Algorithms
Implementations of the FIFO and LRU page-replacement algorithms.
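The repo's FIFO implementation is in Java; as a rough companion sketch of the LRU policy (illustrative JavaScript, not the repo's code — the name `lruPageFaults` is made up here). A `Map` preserves insertion order, so re-inserting a page on every access keeps the first key as the least-recently-used candidate for eviction:

```javascript
// Count page faults under the LRU policy for a reference string.
function lruPageFaults(pages, capacity) {
    if (capacity === 0) return 0;
    const memory = new Map(); // page -> true, ordered oldest access first
    let faults = 0;
    for (const page of pages) {
        if (memory.has(page)) {
            memory.delete(page); // refresh this page's recency
        } else {
            faults++;
            if (memory.size === capacity) {
                // evict the least-recently-used page (the oldest Map key)
                memory.delete(memory.keys().next().value);
            }
        }
        memory.set(page, true);
    }
    return faults;
}
```

For the classic reference string `1,2,3,4,1,2,5,1,2,3,4,5` with 3 frames this counts 10 faults (the same string on which FIFO counts 9, the usual Bélády's-anomaly example).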
<file_sep>import java.util.HashSet;
import java.util.LinkedList;
import java.util.Queue;
class FIFO
{
// Method to find page faults using indexes
static int pageFaults(int pages[], int capacity)
{
        // If the given memory capacity is zero, return 0 immediately, since this implementation counts no faults in that case
if (capacity == 0)
return 0;
// In order to have a First In First Out implementation, we use a queue (linked list) to store the pages.
Queue<Integer> memory = new LinkedList<>() ;
// Using a HashSet in order to check in O(1) if a page exists in the queue or not
HashSet<Integer> pagesInFrames = new HashSet<>(capacity);
int faults = 0; // Initializing the page faults
// Iterating through every page
for (int page : pages)
{
// If this page is not in the set (which means is not in the queue either) we add it, otherwise we ignore it
if(!pagesInFrames.contains(page))
{
// If the queue is currently full, removing the first inserted page from the queue and also from
// the hash set in order to add the new page.
if (memory.size() == capacity)
{
int firstPage = memory.poll(); // Pops the first page of the queue
pagesInFrames.remove(firstPage); // Removes the first page from the hash set
}
memory.add(page); // Pushing the new page into the queue
pagesInFrames.add(page); // Adding the new page into the hash set
faults++; // Increasing page faults, since thew new page was not found
}
}
return faults;
}
// Driver Method to test your algorithm with a simple example
public static void main(String args[])
{
/*
* This is an array that holds the reference string for all
* page requests.
*/
int pages[] = {5, 1, 0, 3, 2, 3, 0, 4, 2, 3, 0, 3, 5, 2};
// This is the number of available page frames
int memoryCapacity = 3;
int faults = pageFaults(pages, memoryCapacity);
System.out.println(faults);
}
}
| 7cf265d4be48ecd2cc2370ce200172f0b4c0e316 | [
"Markdown",
"Java"
]
| 2 | Markdown | nikopetr/Page-Replacement-Algorithms | a7064e20d7a7c4190f6ca8de58d545a4934ef195 | c34d18ecfb72500ca0ff30ac55b741e312683add |
refs/heads/master | <file_sep># npm-install-cached
Cache node_modules and symlink it to app instead of using npm's caching strategy.
# Example
``` yaml
deploy:
steps:
- npm-install-cached
```
# License
The MIT License (MIT)
# Changelog
## 0.1.3
- Prune after install
## 0.1.0
- Initial release
<file_sep>repo_node_modules="$WERCKER_CACHE_DIR/wercker/npm-cache/$WERCKER_GIT_OWNER/$WERCKER_GIT_REPOSITORY/node_modules"
mkdir -p "$repo_node_modules"
ln -s "$repo_node_modules" "$WERCKER_ROOT/node_modules"
npm install $WERCKER_NPM_INSTALL_OPTIONS
npm prune
| c3e5982ea20b93813aa2a624dfa4c90a7fadac87 | [
"Markdown",
"Shell"
]
| 2 | Markdown | p60/step-npm-install-cached | 521086c50fafd7187c35be8086aafa5bedb368af | 3d1ed409d9af74ef72c437777558a7cfe3df463b |
refs/heads/master | <file_sep>let deskop = document.querySelector('.deskop');
let deskop_leftbox_inputcontainer_citiselect_submit = document.querySelector('.deskop_leftbox_inputcontainer_citiselect_submit');
let deskop_leftbox_inputcontainer_citiselect_text = document.querySelector('.deskop_leftbox_inputcontainer_citiselect_text');
let deskop_rightbox_citybox_iconcontainer = document.querySelector('.deskop_rightbox_citybox_iconcontainer');
let deskop_rightbox_citybox_city = document.querySelector('.deskop_rightbox_citybox_city');
let deskop_rightbox_citybox_description = document.querySelector('.deskop_rightbox_citybox_description');
let deskop_rightbox_tempbox_celsius = document.querySelector('.deskop_rightbox_tempbox_celsius');
let deskop_rightbox_tempbox_tempmin = document.querySelector('.deskop_rightbox_tempbox_tempmin');
let deskop_rightbox_tempbox_tempmax = document.querySelector('.deskop_rightbox_tempbox_tempmax');
let deskop_rightbox_tempbox_cisnienie = document.querySelector('.deskop_rightbox_tempbox_cisnienie');
let deskop_rightbox_tempbox_wilgotnosc = document.querySelector('.deskop_rightbox_tempbox_wilgotnosc');
let deskop_rightbox_tempbox_wiatr = document.querySelector('.deskop_rightbox_tempbox_wiatr');
let deskop_nextdaybox_day1_day = document.querySelector('.deskop_nextdaybox_day1_day');
let deskop_nextdaybox_day1_weathericonbox = document.querySelector('.deskop_nextdaybox_day1_weathericonbox');
let deskop_nextdaybox_day1_tempbox_tempminmax = document.querySelector('.deskop_nextdaybox_day1_tempbox_tempminmax');
let deskop_nextdaybox_day2_weathericonbox = document.querySelector('.deskop_nextdaybox_day2_weathericonbox');
let deskop_nextdaybox_day2_tempbox_tempminmax = document.querySelector('.deskop_nextdaybox_day2_tempbox_tempminmax');
let deskop_nextdaybox_day3_weathericonbox = document.querySelector('.deskop_nextdaybox_day3_weathericonbox');
let deskop_nextdaybox_day3_tempbox_tempminmax = document.querySelector('.deskop_nextdaybox_day3_tempbox_tempminmax');
let deskop_nextdaybox_day4_weathericonbox = document.querySelector('.deskop_nextdaybox_day4_weathericonbox');
let deskop_nextdaybox_day4_tempbox_tempminmax = document.querySelector('.deskop_nextdaybox_day4_tempbox_tempminmax');
let deskop_nextdaybox_day5_weathericonbox = document.querySelector('.deskop_nextdaybox_day5_weathericonbox');
let deskop_nextdaybox_day5_tempbox_tempminmax = document.querySelector('.deskop_nextdaybox_day5_tempbox_tempminmax');
let deskop_leftbox_inputcontainer_localise_submit = document.querySelector('.deskop_leftbox_inputcontainer_localise_submit');
let deskop_rightbox_noticecontainer = document.querySelector('.deskop_rightbox_noticecontainer');
function displayBlock() {
return deskop.style.display = "block";
}
displayBlock()
console.log("deskop")
// searching by city name
deskop_leftbox_inputcontainer_citiselect_submit.addEventListener('click', function (name) {
fetch('http://api.openweathermap.org/data/2.5/forecast?q=' + deskop_leftbox_inputcontainer_citiselect_text.value + '&appid=50a7aa80fa492fa92e874d23ad061374&lang=pl&units=metric')
.then(response => response.json())
.then(data => {
let iconId = data.list[0].weather[0].icon
let city = data.city.name;
let description = data.list[0].weather[0].description
let tempValue = data.list[0].main.temp;
let temprouded = Math.round(tempValue);
let temp_min = data.list[0].main.temp_min;
let tempminrouded = Math.round(temp_min);
let temp_max = data.list[0].main.temp_max;
let tempmaxrouded = Math.round(temp_max);
let pressure = data.list[0].main.pressure;
let humidity = data.list[0].main.humidity;
let speed = data.list[0].wind.speed;
let speedrounded = Math.round(speed);
let iconId_day1 = data.list[1].weather[0].icon;
let temp_min_day1 = data.list[1].main.temp_min;
let temp_min_day1_rounded = Math.round(temp_min_day1);
let temp_max_day1 = data.list[1].main.temp_max;
let temp_max_day1_rounded = Math.round(temp_max_day1);
let iconId_day2 = data.list[2].weather[0].icon;
let temp_min_day2 = data.list[2].main.temp_min;
let temp_min_day2_rounded = Math.round(temp_min_day2);
let temp_max_day2 = data.list[2].main.temp_max;
let temp_max_day2_rounded = Math.round(temp_max_day2);
let iconId_day3 = data.list[3].weather[0].icon;
let temp_min_day3 = data.list[3].main.temp_min;
let temp_min_day3_rounded = Math.round(temp_min_day3);
let temp_max_day3 = data.list[3].main.temp_max;
let temp_max_day3_rounded = Math.round(temp_max_day3);
let iconId_day4 = data.list[4].weather[0].icon;
let temp_min_day4 = data.list[4].main.temp_min;
let temp_min_day4_rounded = Math.round(temp_min_day4);
let temp_max_day4 = data.list[4].main.temp_max;
let temp_max_day4_rounded = Math.round(temp_max_day4);
let iconId_day5 = data.list[5].weather[0].icon;
let temp_min_day5 = data.list[5].main.temp_min;
let temp_min_day5_rounded = Math.round(temp_min_day5);
let temp_max_day5 = data.list[5].main.temp_max;
let temp_max_day5_rounded = Math.round(temp_max_day5);
deskop_rightbox_citybox_description.innerHTML = description
deskop_rightbox_citybox_iconcontainer.innerHTML = `<img class="deskop_rightbox_citybox_weathericon" src="icons/${iconId}.png"/>`;
deskop_rightbox_citybox_city.innerHTML = city;
deskop_rightbox_tempbox_celsius.innerHTML = temprouded + "°C";
deskop_rightbox_tempbox_tempmin.innerHTML = "temp. min " + tempminrouded + "°C";
deskop_rightbox_tempbox_tempmax.innerHTML = "temp. max " + tempmaxrouded + "°C";
deskop_rightbox_tempbox_cisnienie.innerHTML = "ciśnienie " + pressure + " hPa"
deskop_rightbox_tempbox_wilgotnosc.innerHTML = "wilgotność " + humidity + "%"
deskop_rightbox_tempbox_wiatr.innerHTML = "wiatr " + speedrounded + " km/h"
deskop_nextdaybox_day1_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day1_weathericon" src="icons/${iconId_day1}.png"/>`;
deskop_nextdaybox_day1_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day1_rounded + "°C " +
"temp. max " + temp_max_day1_rounded + "°C";
deskop_nextdaybox_day2_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day2_weathericon" src="icons/${iconId_day2}.png"/>`;
deskop_nextdaybox_day2_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day2_rounded + "°C " +
"temp. max " + temp_max_day2_rounded + "°C";
deskop_nextdaybox_day3_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day3_weathericon" src="icons/${iconId_day3}.png"/>`;
deskop_nextdaybox_day3_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day3_rounded + "°C " +
"temp. max " + temp_max_day3_rounded + "°C";
deskop_nextdaybox_day4_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day4_weathericon" src="icons/${iconId_day4}.png"/>`;
deskop_nextdaybox_day4_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day4_rounded + "°C " +
"temp. max " + temp_max_day4_rounded + "°C";
deskop_nextdaybox_day5_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day5_weathericon" src="icons/${iconId_day5}.png"/>`;
deskop_nextdaybox_day5_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day5_rounded + "°C " +
"temp. max " + temp_max_day5_rounded + "°C";
deskop_leftbox_inputcontainer_citiselect_text.value = "";
})
.catch(err => alert("Nie ma takiego miasta. Wprowadź poprawną nazwę miasta"));
})
// searching by latitude and longitude
deskop_leftbox_inputcontainer_localise_submit.addEventListener('click', function (name) {
if (navigator.geolocation) {
navigator.geolocation.getCurrentPosition(setPosition, showError);
} else {
deskop_rightbox_noticecontainer.style.display = "block";
}
})
function setPosition(position) {
let latitude = position.coords.latitude;
let longitude = position.coords.longitude;
console.log(latitude)
console.log(longitude)
getWeather(latitude, longitude);
}
function showError(error) {
deskop_rightbox_noticecontainer.style.display = "block";
}
function getWeather(latitude, longitude) {
fetch('http://api.openweathermap.org/data/2.5/forecast?lat=' + latitude + '&lon=' + longitude + '&appid=50a7aa80fa492fa92e874d23ad061374&lang=pl&units=metric')
.then(response => response.json())
.then(data => {
let iconId = data.list[0].weather[0].icon
let city = data.city.name;
let description = data.list[0].weather[0].description
let tempValue = data.list[0].main.temp;
let temprouded = Math.round(tempValue);
let temp_min = data.list[0].main.temp_min;
let tempminrouded = Math.round(temp_min);
let temp_max = data.list[0].main.temp_max;
let tempmaxrouded = Math.round(temp_max);
let pressure = data.list[0].main.pressure;
let humidity = data.list[0].main.humidity;
let speed = data.list[0].wind.speed;
let speedrounded = Math.round(speed);
let iconId_day1 = data.list[1].weather[0].icon;
let temp_min_day1 = data.list[1].main.temp_min;
let temp_min_day1_rounded = Math.round(temp_min_day1);
let temp_max_day1 = data.list[1].main.temp_max;
let temp_max_day1_rounded = Math.round(temp_max_day1);
let iconId_day2 = data.list[2].weather[0].icon;
let temp_min_day2 = data.list[2].main.temp_min;
let temp_min_day2_rounded = Math.round(temp_min_day2);
let temp_max_day2 = data.list[2].main.temp_max;
let temp_max_day2_rounded = Math.round(temp_max_day2);
let iconId_day3 = data.list[3].weather[0].icon;
let temp_min_day3 = data.list[3].main.temp_min;
let temp_min_day3_rounded = Math.round(temp_min_day3);
let temp_max_day3 = data.list[3].main.temp_max;
let temp_max_day3_rounded = Math.round(temp_max_day3);
let iconId_day4 = data.list[4].weather[0].icon;
let temp_min_day4 = data.list[4].main.temp_min;
let temp_min_day4_rounded = Math.round(temp_min_day4);
let temp_max_day4 = data.list[4].main.temp_max;
let temp_max_day4_rounded = Math.round(temp_max_day4);
let iconId_day5 = data.list[5].weather[0].icon;
let temp_min_day5 = data.list[5].main.temp_min;
let temp_min_day5_rounded = Math.round(temp_min_day5);
let temp_max_day5 = data.list[5].main.temp_max;
let temp_max_day5_rounded = Math.round(temp_max_day5);
deskop_rightbox_citybox_description.innerHTML = description
deskop_rightbox_citybox_iconcontainer.innerHTML = `<img class="deskop_rightbox_citybox_weathericon" src="icons/${iconId}.png"/>`;
deskop_rightbox_citybox_city.innerHTML = city;
deskop_rightbox_tempbox_celsius.innerHTML = temprouded + "°C";
deskop_rightbox_tempbox_tempmin.innerHTML = "temp. min " + tempminrouded + "°C";
deskop_rightbox_tempbox_tempmax.innerHTML = "temp. max " + tempmaxrouded + "°C";
deskop_rightbox_tempbox_cisnienie.innerHTML = "ciśnienie " + pressure + " hPa"
deskop_rightbox_tempbox_wilgotnosc.innerHTML = "wilgotność " + humidity + "%"
deskop_rightbox_tempbox_wiatr.innerHTML = "wiatr " + speedrounded + " km/h"
deskop_nextdaybox_day1_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day1_weathericon" src="icons/${iconId_day1}.png"/>`;
deskop_nextdaybox_day1_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day1_rounded + "°C " +
"temp. max " + temp_max_day1_rounded + "°C";
deskop_nextdaybox_day2_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day2_weathericon" src="icons/${iconId_day2}.png"/>`;
deskop_nextdaybox_day2_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day2_rounded + "°C " +
"temp. max " + temp_max_day2_rounded + "°C";
deskop_nextdaybox_day3_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day3_weathericon" src="icons/${iconId_day3}.png"/>`;
deskop_nextdaybox_day3_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day3_rounded + "°C " +
"temp. max " + temp_max_day3_rounded + "°C";
deskop_nextdaybox_day4_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day4_weathericon" src="icons/${iconId_day4}.png"/>`;
deskop_nextdaybox_day4_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day4_rounded + "°C " +
"temp. max " + temp_max_day4_rounded + "°C";
deskop_nextdaybox_day5_weathericonbox.innerHTML = `<img class="deskop_nextdaybox_day5_weathericon" src="icons/${iconId_day5}.png"/>`;
deskop_nextdaybox_day5_tempbox_tempminmax.innerHTML = "temp. min " + temp_min_day5_rounded + "°C " +
"temp. max " + temp_max_day5_rounded + "°C";
})
.catch(err => alert("Coś poszło nie tak. Odśwież stronę i spróbuj ponownie"));
}
function GetTime1() {
let date = new Date();
let hour = date.getHours();
let minute = date.getMinutes();
let day = date.getDate();
let dayday = date.getDay();
let month = date.getMonth();
let year = date.getFullYear();
if (minute < 10) minute = "0" + minute;
let days = new Array("niedziela", "poniedziałek", "wtorek", "środa", "czwartek", "piątek", "sobota", "niedziela", "poniedziałek", "wtorek", "środa", "czwartek", "piątek");
let months = new Array("stycznia", "lutego", "marca", "kwietnia", "maja", "czerwca", "lipca", "sierpnia", "września", "października", "listopada", "grudnia");
let showdate = days[dayday] + ', ' + day + ' ' + months[month] + ' ' + year + " " + hour + ':' + minute
let day1 = days[dayday + 1]
let day2 = days[dayday + 2]
let day3 = days[dayday + 3]
let day4 = days[dayday + 4]
let day5 = days[dayday + 5]
document.querySelector('.deskop_rightbox_clock').innerHTML = showdate
document.querySelector('.deskop_nextdaybox_day1_day').innerHTML = day1
document.querySelector('.deskop_nextdaybox_day2_day').innerHTML = day2
document.querySelector('.deskop_nextdaybox_day3_day').innerHTML = day3
document.querySelector('.deskop_nextdaybox_day4_day').innerHTML = day4
document.querySelector('.deskop_nextdaybox_day5_day').innerHTML = day5
setTimeout(GetTime1, 60000);
}
GetTime1();<file_sep># weather-app
Open the link to see it in Figma:
https://www.figma.com/file/I2JOLte32NF6VbqVpgjYhL/pada_-Copy?node-id=0%3A1
<file_sep>let scriptSrc = 'app.js';
if (window.outerWidth <= 768)
scriptSrc = './js/mobile.js';
else
scriptSrc = './js/deskop.js';
let script = document.createElement('script');
script.src = scriptSrc;
let head = document.getElementsByTagName('head')[0];
head.appendChild(script); | d94027d1c1ae2f19c9b776f22c4ce655a52c6d63 | [
"JavaScript",
"Markdown"
]
| 3 | JavaScript | phase8/weather-app | 0462e2679576e20ca9e5dcb24d562012ee13214d | 49a1d184aa05207341c22b7f3a3a48df3670cba6 |
refs/heads/master | <file_sep># App
App deployed to Heroku
<file_sep>angular.module('App')
.controller(
'clientesCtrl',['$scope','clientesServer',
function($scope, Clientes){
$scope.clearCliente = function(){
$scope.cliente = {};
}
$scope.clientes = Clientes.list();
$scope.addCliente = function() {
if($scope.nuevo){
$scope.cliente.fechaCreacion = Date();
Clientes.saveCliente($scope.cliente);
}else{
$scope.cliente.fechaUpdate = Date();
Clientes.updateCliente($scope.cliente)
}
$scope.clearCliente();
}
$scope.detalleCliente = function(cliente){
$scope.cliente = cliente;
}
$scope.deleteCliente = function(){
Clientes.deleteCliente($scope.cliente);
$scope.clearCliente();
}
}
]);<file_sep>/*angular.module('App')
.directive('propiaDisabled', function ()
{
return {
restrict: 'A',
link: function (scope,forma){
if(forma)
scope.disable = "disabled";
}
};
});
*/<file_sep>angular.module('App')
.controller(
'loginUserCtrl',['$scope','usersServer',
function($scope, usersServer){
$scope.register = function () {
usersServer.autentificar($scope.user.email,$scope.user.password,'viewClient');
};
}
]);
<file_sep>var app = angular.module("App");
// re-usable factory that exposes the provincias collection as a synchronized $firebaseArray
app.factory('localidadesServer', ['$firebaseArray',
function($firebaseArray) {
var ref = new Firebase("https://pitoco.firebaseio.com/tools/provincias");
var Provincias = $firebaseArray(ref);
return {
listProvincias: function(){
console.log("Reference to the DB");
return Provincias;
},
saveProvincia: function(provincia){
Provincias.$add(provincia).then(function(ref) {
var id = ref.key();
console.log("Adding provincia, id: " + id);
Provincias.$indexFor(id); // returns location in the array
})
},
updateProvincia: function(provincia){
Provincias.$save(provincia).then(function(ref) {
ref.key() === provincia.$id; // true
console.log("Update provincia, id: " + provincia.$id);
})
},
deleteProvincia: function(provincia){
Provincias.$remove(provincia).then(function(ref) {
console.log("Delete provincia, id: " + provincia.$id);
ref.key() === provincia.$id; // true
})
}
};
}
]);
<file_sep>angular.module('App',['ui.router', 'firebase'])
.controller(
'MainCtrl',['$scope','clientesServer','localidadesServer','$state',
function($scope, clientesServer, localidadesServer, $state){
$scope.clientes = clientesServer;
$scope.provincias = localidadesServer;
$scope.closeSesion = function(){
$state.go('login');
}
}
]);
<file_sep>var app = angular.module("App");
// re-usable factory that performs email/password authentication against Firebase
app.factory('usersServer', ['$state',
function($state) {
var users = new Firebase("https://pitoco.firebaseio.com");
return {
autentificar: function(email,password,redirect){
users.authWithPassword({
email : email,
password : password
}, function(error, authData) {
if (error) {
console.log("Login Failed!", error);
} else {
console.log("Authenticated successfully with payload:", authData);
$state.go(redirect);
}
}
);
}
};
}
]);
| 974a4ae3d35b8562d3b461c757f1c855fd95d727 | [
"Markdown",
"JavaScript"
]
| 7 | Markdown | cachi87/App | dd3ef485c9990805e78797550e6fe852d44e72c5 | 902a882d969698c7c016b90fb8b17792a0d50ccb |
refs/heads/master | <file_sep>json.extract! @comment, :id, :content, :photo_id, :commenter_id, :created_at, :updated_at
<file_sep>json.extract! @favoriting, :id, :photo_id, :favorited_by_id, :created_at, :updated_at
<file_sep>Rails.application.routes.draw do
devise_for :users
root 'users#index'
resources :photos
resources :followings
resources :favoritings
resources :comments
resources :users
end
<file_sep>json.extract! @following, :id, :leader_id, :follower_id, :created_at, :updated_at
<file_sep>class Following < ActiveRecord::Base
belongs_to :leader, :class_name => "User", :foreign_key => "leader_id"
belongs_to :follower, :class_name => "User", :foreign_key => "follower_id"
end
<file_sep>json.extract! @photo, :id, :caption, :owner_id, :image, :created_at, :updated_at
<file_sep>require 'test_helper'
class FavoritingsControllerTest < ActionController::TestCase
setup do
@favoriting = favoritings(:one)
end
test "should get index" do
get :index
assert_response :success
assert_not_nil assigns(:favoritings)
end
test "should get new" do
get :new
assert_response :success
end
test "should create favoriting" do
assert_difference('Favoriting.count') do
post :create, favoriting: { favorited_by_id: @favoriting.favorited_by_id, photo_id: @favoriting.photo_id }
end
assert_redirected_to favoriting_path(assigns(:favoriting))
end
test "should show favoriting" do
get :show, id: @favoriting
assert_response :success
end
test "should get edit" do
get :edit, id: @favoriting
assert_response :success
end
test "should update favoriting" do
patch :update, id: @favoriting, favoriting: { favorited_by_id: @favoriting.favorited_by_id, photo_id: @favoriting.photo_id }
assert_redirected_to favoriting_path(assigns(:favoriting))
end
test "should destroy favoriting" do
assert_difference('Favoriting.count', -1) do
delete :destroy, id: @favoriting
end
assert_redirected_to favoritings_path
end
end
<file_sep>class User < ActiveRecord::Base
# Include default devise modules. Others available are:
# :confirmable, :lockable, :timeoutable and :omniauthable
devise :database_authenticatable, :registerable,
:recoverable, :rememberable, :trackable, :validatable
has_many :own_photos, :class_name => "Photo", :foreign_key => "owner_id"
has_many :comments, :class_name => "Comment", :foreign_key => "commenter_id"
has_many :favoritings, :class_name => "Favoriting", :foreign_key => "favorited_by_id"
has_many :favorite_photos, :through => :favoritings, :source => :photo
has_many :followings_where_leader, :class_name => "Following", :foreign_key => "leader_id"
has_many :followers, :through => :followings_where_leader, :source => :follower
has_many :followings_where_follower, :class_name => "Following", :foreign_key => "follower_id"
has_many :leaders, :through => :followings_where_follower, :source => :leader
end
<file_sep>User.destroy_all
Photo.destroy_all
20.times do
User.create!([
{
:name => Faker::Name.name,
:username => Faker::Team.name,
:email => Faker::Internet.email,
:password => Faker::Internet.password}])
end
@users = User.all
@users.each do |user|
photo = Photo.new
photo.caption = Faker::Lorem.sentences(1).join
photo.image = Faker::Avatar.image
photo.owner_id = user.id
photo.save
end
puts "#{User.count} Users created"
puts "#{Photo.count} photos created"
<file_sep>class Comment < ActiveRecord::Base
belongs_to :photo
belongs_to :commenter, :class_name => "User", :foreign_key => "commenter_id"
end
<file_sep>class Photo < ActiveRecord::Base
belongs_to :owner, :class_name => "User", :foreign_key => "owner_id"
has_many :comments
has_many :favoritings, :dependent => :destroy
end
<file_sep>json.array!(@favoritings) do |favoriting|
json.extract! favoriting, :id, :photo_id, :favorited_by_id
json.url favoriting_url(favoriting, format: :json)
end
<file_sep>class Favoriting < ActiveRecord::Base
belongs_to :photo
belongs_to :favorited_by, :class_name => "User", :foreign_key => "favorited_by_id"
end
| ece644dd8d85f2aff3707b4bdb8b18c97a3e071d | [
"Ruby"
]
| 13 | Ruby | eubanksanator/myst | 94cf0c99bc4da729a89b0c5d62d28addbbfdbdbc | cc374bdb97cea4cec8b1ebf5219e57d094659f43 |
refs/heads/main | <repo_name>olympic-games-navarro/olympic-games-navarro.io<file_sep>/script_streamgraph.R
# Load libraries ----------------------------------------------------------
library(tidyverse)
library(ggplot2)
library(lubridate)
library(devtools)
#install.packages("remotes")
#remotes::install_github("hrbrmstr/streamgraph", force = TRUE)
library(streamgraph)
library(cowplot)
library(htmlwidgets)
olympics <- readr::read_csv('https://raw.githubusercontent.com/rfordatascience/tidytuesday/master/data/2021/2021-07-27/olympics.csv')
medals <- c("Gold", "Silver", "Bronze")
oly <- olympics %>%
filter(medal %in% medals) %>%
filter(season == "Summer") %>%
filter(sport == "Swimming") %>%
count(team, year) %>%
rename(total = "n")
#remove the ones with total less than 10
tops <- olympics %>%
filter(medal %in% medals) %>%
filter(season == "Summer") %>%
filter(sport == "Swimming") %>%
count(team) %>%
rename(total_total = "n") %>%
arrange(total_total)
#merge to filter
oly <- merge(x = oly, y = tops, by = "team", all.y = TRUE)
#filter an add empty years
oly <- oly %>%
filter(total_total>20) %>%
drop_na() %>%
add_row(year = 1916, total = 0) %>%
add_row(year = 1940, total = 0)
class(oly$year)
oly <- transform(oly, year = as.Date(as.character(year), "%Y"))
## streamgraph
#plot
pal <- c("#065874","#1b9dbf", "#043b52", "#4eb8d7","#086a8d", "#044364",
"#0a7796", "#044c5c", "#35b6dc", "#065874",
"#1b9dbf","#043b52", "#4eb8d7", "#086a8d","#044364",
"#0a7796", "#044c5c", "#35b6dc", "#065874")
# Shape: classic
p1 <- streamgraph(oly, key="team", value="total", date="year",
width="1000px", height="300px", interactive = TRUE) %>%
sg_axis_x(tick_interval=5, tick_units = "year", tick_format="%Y") %>%
sg_legend(TRUE, "Team: ") %>%
sg_axis_y(0) %>%
sg_fill_manual(values=pal) %>%
sg_annotate(label="I World War", x=as.Date("1913-08-01"),
y=50, color="black", size=10) %>%
sg_annotate(label="II World War", x=as.Date("1938-08-01"),
y=50, color="black", size=10)
p1
# save the widget
saveWidget(p1, "widget.html")
| 854c6ddf0e0477f63c6a7e63cd848fae5dd9d717 | [
"R"
]
| 1 | R | olympic-games-navarro/olympic-games-navarro.io | d9784423ff98b83068bba13120e3174f2c0ed7c7 | 5e9a49e1a9267eea955f640c4212953e3b5bd741 |
refs/heads/master | <file_sep>from django import forms
class FeedbackForm(forms.Form):
name = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control', 'placeholder': 'Enter your name'}))
email = forms.EmailField(widget=forms.TextInput(attrs={'class': 'form-control',
'placeholder': 'Enter your email address'}))
feedback = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control',
'placeholder': 'Provide your feedback here'}))
def clean_name(self):
""" Sanitize the name field """
name_data = self.cleaned_data['name']
return name_data
def clean_email(self):
""" Sanitize the email field """
email_data = self.cleaned_data['email']
return email_data
def clean_feedback(self):
""" Sanitize the feedback field """
feedback_data = self.cleaned_data['feedback']
return feedback_data
<file_sep>from django.db import models
class Airquality(models.Model):
id = models.IntegerField(primary_key=True)
air_quality_level = models.IntegerField()
location = models.ForeignKey('Location', on_delete=models.CASCADE)
datetime = models.DateTimeField()
class Meta:
db_table = 'core_airquality'
class Campusbuilding(models.Model):
id = models.IntegerField(primary_key=True)
building_name = models.CharField(max_length=120)
created_at = models.DateTimeField()
location = models.ForeignKey('Location', on_delete=models.CASCADE)
class Meta:
db_table = 'core_campusbuilding'
class Decibel(models.Model):
id = models.IntegerField(primary_key=True)
decibel_level = models.DecimalField(max_digits=10, decimal_places=2)
location = models.ForeignKey('Location', on_delete=models.CASCADE)
datetime = models.DateTimeField()
class Meta:
db_table = 'core_decibel'
class Humidity(models.Model):
id = models.IntegerField(primary_key=True)
humidity_level = models.DecimalField(max_digits=10, decimal_places=2)
location = models.ForeignKey('Location', on_delete=models.CASCADE)
datetime = models.DateTimeField()
class Meta:
db_table = 'core_humidity'
class Location(models.Model):
id = models.IntegerField(primary_key=True)
location_name = models.CharField(max_length=220)
created_at = models.DateTimeField()
modified_at = models.DateTimeField()
class Meta:
db_table = 'core_location'
class Pressure(models.Model):
id = models.IntegerField(primary_key=True, null=False)
pressure_level = models.DecimalField(max_digits=10, decimal_places=2)
location = models.ForeignKey(Location, on_delete=models.CASCADE)
datetime = models.DateTimeField()
class Meta:
db_table = 'core_pressure'
class Sensortype(models.Model):
id = models.IntegerField(primary_key=True)
sensor_type = models.CharField(max_length=220)
created_at = models.DateTimeField()
modified_at = models.DateTimeField()
class Meta:
db_table = 'core_sensortype'
class Singleboardcomputer(models.Model):
id = models.IntegerField(primary_key=True)
single_board_computer_name = models.CharField(max_length=220)
operating_system = models.CharField(max_length=220)
address = models.CharField(max_length=220)
created_at = models.DateTimeField()
modified_at = models.DateTimeField()
location = models.ForeignKey(Location, on_delete=models.CASCADE)
single_board_computer_type_id = models.ForeignKey('Singleboardcomputertype', on_delete=models.CASCADE)
class Meta:
db_table = 'core_singleboardcomputer'
class Singleboardcomputertype(models.Model):
id = models.IntegerField(primary_key=True)
computer_type = models.CharField(max_length=220)
created_at = models.DateTimeField()
modified_at = models.DateTimeField()
class Meta:
db_table = 'core_singleboardcomputertype'
class Temperature(models.Model):
id = models.IntegerField(primary_key=True)
temperature_level = models.DecimalField(max_digits=10, decimal_places=2)
location = models.ForeignKey(Location, on_delete=models.CASCADE)
datetime = models.DateTimeField()
class Meta:
db_table = 'core_temperature'
class User(models.Model):
id = models.IntegerField(primary_key=True)
ip_address = models.CharField(max_length=220)
created_at = models.DateTimeField()
class Meta:
db_table = 'core_user'
class Data(models.Model):
# id = models.IntegerField(primary_key=True)
mac = models.CharField(max_length=17, blank=True, null=True)
temp = models.FloatField(blank=True, null=True)
pres = models.FloatField(blank=True, null=True)
hum = models.FloatField(blank=True, null=True)
gas = models.FloatField(blank=True, null=True)
lux = models.IntegerField(blank=True, null=True)
db = models.FloatField(blank=True, null=True)
dt = models.DateTimeField(blank=True, null=True)
class Meta:
# managed = False
db_table = 'data'
class Feedback(models.Model):
name = models.CharField(max_length=220, null=True)
email = models.EmailField(max_length=70, null=True)
feedback = models.CharField(max_length=220, null=True)
<file_sep># Generated by Django 2.1.1 on 2018-11-26 19:33
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Airquality',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('air_quality_level', models.IntegerField()),
('datetime', models.DateTimeField()),
],
options={
'db_table': 'core_airquality',
},
),
migrations.CreateModel(
name='Campusbuilding',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('building_name', models.CharField(max_length=120)),
('created_at', models.DateTimeField()),
],
options={
'db_table': 'core_campusbuilding',
},
),
migrations.CreateModel(
name='Data',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('mac', models.CharField(blank=True, max_length=17, null=True)),
('temp', models.FloatField(blank=True, null=True)),
('pres', models.FloatField(blank=True, null=True)),
('hum', models.FloatField(blank=True, null=True)),
('gas', models.FloatField(blank=True, null=True)),
('lux', models.IntegerField(blank=True, null=True)),
('db', models.FloatField(blank=True, null=True)),
('dt', models.DateTimeField(blank=True, null=True)),
],
options={
'db_table': 'data',
},
),
migrations.CreateModel(
name='Decibel',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('decibel_level', models.DecimalField(decimal_places=2, max_digits=10)),
('datetime', models.DateTimeField()),
],
options={
'db_table': 'core_decibel',
},
),
migrations.CreateModel(
name='Feedback',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=220, null=True)),
('email', models.EmailField(max_length=70, null=True)),
('feedback', models.CharField(max_length=220, null=True)),
],
),
migrations.CreateModel(
name='Humidity',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('humidity_level', models.DecimalField(decimal_places=2, max_digits=10)),
('datetime', models.DateTimeField()),
],
options={
'db_table': 'core_humidity',
},
),
migrations.CreateModel(
name='Location',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('location_name', models.CharField(max_length=220)),
('created_at', models.DateTimeField()),
('modified_at', models.DateTimeField()),
],
options={
'db_table': 'core_location',
},
),
migrations.CreateModel(
name='Pressure',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('pressure_level', models.DecimalField(decimal_places=2, max_digits=10)),
('datetime', models.DateTimeField()),
('location', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Location')),
],
options={
'db_table': 'core_pressure',
},
),
migrations.CreateModel(
name='Sensortype',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('sensor_type', models.CharField(max_length=220)),
('created_at', models.DateTimeField()),
('modified_at', models.DateTimeField()),
],
options={
'db_table': 'core_sensortype',
},
),
migrations.CreateModel(
name='Singleboardcomputer',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('single_board_computer_name', models.CharField(max_length=220)),
('operating_system', models.CharField(max_length=220)),
('address', models.CharField(max_length=220)),
('created_at', models.DateTimeField()),
('modified_at', models.DateTimeField()),
('location', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Location')),
],
options={
'db_table': 'core_singleboardcomputer',
},
),
migrations.CreateModel(
name='Singleboardcomputertype',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('computer_type', models.CharField(max_length=220)),
('created_at', models.DateTimeField()),
('modified_at', models.DateTimeField()),
],
options={
'db_table': 'core_singleboardcomputertype',
},
),
migrations.CreateModel(
name='Temperature',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('temperature_level', models.DecimalField(decimal_places=2, max_digits=10)),
('datetime', models.DateTimeField()),
('location', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Location')),
],
options={
'db_table': 'core_temperature',
},
),
migrations.CreateModel(
name='User',
fields=[
('id', models.IntegerField(primary_key=True, serialize=False)),
('ip_address', models.CharField(max_length=220)),
('created_at', models.DateTimeField()),
],
options={
'db_table': 'core_user',
},
),
migrations.AddField(
model_name='singleboardcomputer',
name='single_board_computer_type_id',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Singleboardcomputertype'),
),
migrations.AddField(
model_name='humidity',
name='location',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Location'),
),
migrations.AddField(
model_name='decibel',
name='location',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Location'),
),
migrations.AddField(
model_name='campusbuilding',
name='location',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Location'),
),
migrations.AddField(
model_name='airquality',
name='location',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.Location'),
),
]
<file_sep>"""smartcampus URL Configuration
The `urlpatterns` list routes URLs to views. For more information please see:
https://docs.djangoproject.com/en/2.1/topics/http/urls/
Examples:
Function views
1. Add an import: from my_app import views
2. Add a URL to urlpatterns: path('', views.home, name='home')
Class-based views
1. Add an import: from other_app.views import Home
2. Add a URL to urlpatterns: path('', Home.as_view(), name='home')
Including another URLconf
1. Import the include() function: from django.urls import include, path
2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
"""
from django.urls import path
from core import views
urlpatterns = [
path('', views.home, name='home'),
path('home/', views.home, name='home'),
path('select_location/', views.select_location, name='select location'),
path('campus_report/', views.campus_report, name='campus report'),
path('feedback/', views.feedback, name='feedback'),
path('campus_report/report_2018_07_28', views.report_2018_07_28, name='report_2018_07_28'),
]
<file_sep>from django.contrib import admin
from .models import *
admin.site.register(Data)
admin.site.register(Location)
admin.site.register(Temperature)
<file_sep># ssw695_smart_campus
[](https://travis-ci.org/kxue4/ssw695_smart_campus)
## Description
This project uses Raspberry Pis with various sensors to collect data. The data is then stored in an on-premises database. The sensors collect temperature, light, pressure, humidity, and air quality.
For this project, we created a web interface that displays the metrics data in charts, tables, and maps. The goal of the project is to create a smart city proof of concept for the Stevens campus, which provides useful campus information for Stevens students.
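As a rough sketch of the per-reading shape described above (the field names mirror the columns of the app's `data` table; the dataclass itself is illustrative, not project code), a batch of sensor samples can be represented and summarized like this:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Reading:
    """One sensor sample; fields mirror the columns of the `data` table."""
    mac: str        # MAC address identifying the Raspberry Pi that sent the sample
    temp: float     # temperature in degrees Celsius
    hum: float      # relative humidity in percent
    pres: float     # barometric pressure
    dt: datetime    # time the sample was taken

def average_temperature(readings):
    """Average the temperature over a batch of samples, rounded to 2 places."""
    return round(sum(r.temp for r in readings) / len(readings), 2)

samples = [
    Reading("b8:27:eb:00:00:01", 21.5, 48.0, 1013.2, datetime(2018, 7, 28, 12, 0)),
    Reading("b8:27:eb:00:00:01", 22.5, 47.5, 1013.0, datetime(2018, 7, 28, 13, 0)),
]
print(average_temperature(samples))  # prints 22.0
```

The same kind of aggregation (averaging a day's readings per metric) is what the web interface's daily report view performs over rows from the database.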
## Team members
- <NAME>
- <NAME>
- <NAME>
- <NAME>
## Web Development Framework
- Django
## Programming language and Code Standard
- Python PEP8
## Web Server
- Amazon EC2
## Database
- Amazon RDS MySQL
## Scrum tools
- Taiga.io
## Testing tools
- Django's unit testing framework
<file_sep>
beautifulsoup4==4.6.0
bs4==0.0.1
click==6.7
coverage==4.5.1
Django==2.1.1
Flask==0.12.2
Fraction==1.1.0
itsdangerous==0.24
Jinja2==2.10
MarkupSafe==1.0
mysqlclient==1.3.13
numpy==1.14.1
pandas==0.23.4
prettytable==0.7.2
python-dateutil==2.7.3
python-http-client==3.1.0
pytz==2018.5
sendgrid==5.6.0
six==1.11.0
views==0.3
Werkzeug==0.12.2
<file_sep>from django.test import TestCase
from .models import *
from .forms import FeedbackForm
# Models
class LocationModelTest(TestCase):
def test_verbose_name_location(self):
"""
Should return True for table name.
"""
self.assertEqual(str(Location._meta.db_table), "core_location")
class DataModelTest(TestCase):
def test_verbose_name_data(self):
"""
Should return True for table name.
"""
self.assertEqual(str(Data._meta.db_table), "data")
class PressureModelTest(TestCase):
def test_verbose_name_pressure(self):
"""
Should return True for table name.
"""
self.assertEqual(str(Pressure._meta.db_table), "core_pressure")
class SensortypeModelTest(TestCase):
def test_verbose_name_sensortype(self):
"""
Should return True for table name.
"""
self.assertEqual(str(Sensortype._meta.db_table), "core_sensortype")
class SingleboardcomputerModelTest(TestCase):
def test_verbose_name_singleboardcomputer(self):
"""
Should return True for table name.
"""
self.assertEqual(str(Singleboardcomputer._meta.db_table), "core_singleboardcomputer")
class SingleboardcomputertypeModelTest(TestCase):
def test_verbose_name_singleboardcomputertype(self):
"""
Should return True for table name.
"""
self.assertEqual(str(Singleboardcomputertype._meta.db_table), "core_singleboardcomputertype")
class TemperatureModelTest(TestCase):
def test_verbose_name_temperature(self):
"""
Should return True for table name.
"""
self.assertEqual(str(Temperature._meta.db_table), "core_temperature")
class UserModelTest(TestCase):
def test_verbose_name_user(self):
"""
Should return True for table name.
"""
self.assertEqual(str(User._meta.db_table), "core_user")
# Urls
class ProjectTests(TestCase):
def test_homepage(self):
"""
Test that /home (no trailing slash) responds with a 301 redirect.
"""
response = self.client.get('/home')
self.assertEqual(response.status_code, 301)
def test_locationpage(self):
"""
Test that /select_location (no trailing slash) responds with a 301 redirect.
"""
response = self.client.get('/select_location')
self.assertEqual(response.status_code, 301)
def test_campus_reportpage(self):
"""
Test the status code for campus report page should equal 200.
"""
response = self.client.get('/campus_report/')
self.assertEqual(response.status_code, 200)
def test_feedbackpage(self):
"""
Test the status code for feedback page should equal 200.
"""
response = self.client.get('/feedback/')
self.assertEqual(response.status_code, 200)
<file_sep>"""
smartcampus URL Configuration
"""
from django.contrib import admin
from django.urls import path, include
urlpatterns = [
path('admin/', admin.site.urls),
path('', include('core.urls')),
path('home/', include('core.urls')),
path('select_location/', include('core.urls')),
path('campus_report/', include('core.urls')),
path('feedback/', include('core.urls'))
]
<file_sep>window.onload = function(){
var btnGroup = document.getElementsByClassName("btn-group")[0];
var myButton = btnGroup.getElementsByTagName("button");
var myDiv = document.getElementsByClassName("map-canvas");
for(var i = 0; i<myButton.length;i++){
myButton[i].index = i;
myButton[i].onclick = function(){
for(var i = 0; i < myButton.length; i++){
myButton[i].className="btn btn-default";
myDiv[i].style.display="none";
}
this.className = "btn btn-danger";
myDiv[this.index].style.display = "block";
}
}
}
<file_sep>var dom = document.getElementById("humi_chart");
var myChart = echarts.init(dom);
option = null;
option = {
title: {
text: 'Humidity in a week',
subtext: 'in %'
},
tooltip: {
trigger: 'axis'
},
xAxis: {
type: 'category',
boundaryGap: false,
data: ['10/1','10/2','10/3','10/4','10/5','10/6','10/7'] // This is the tag of X-axis
},
yAxis: {
type: 'value',
min: 48, // This is the min value of Y-axis
max: 52, // This is the max value of Y-axis
},
series: [
{
name:'%',
type:'line',
data:[48.37, 49.41, 50.12, 48.19, 51.22, 50.13, 48.99], // This is the value of X-axis
markPoint: {
data: [
{type: 'max', name: 'Max'},
{type: 'min', name: 'Min'}
]
},
markLine: {
data: [
{type: 'average', name: 'average'}
]
}
},
]
};
if (option && typeof option === "object") {
myChart.setOption(option, true);
}
window.onresize = myChart.resize;<file_sep>from django.shortcuts import render
from django.forms.models import model_to_dict
from django.core.serializers.json import DjangoJSONEncoder
import json
from django.http import HttpResponseRedirect
from .forms import FeedbackForm
import sendgrid
import os
from sendgrid.helpers.mail import *
from .models import *
def home(request):
"""
GET request for the home page.
Returns the last metric values from the Data table.
"""
posts = [Data.objects.latest('id')]
cont = {'posts': posts}
return render(request, 'home.html', cont)
def select_location(request):
"""
GET request to the location page
Returns metrics in the chart displays.
"""
dataset = Data.objects.all()
posts = [dataset.latest('id')]
temps = []
hums = []
press = []
gass = []
year = 2018
month = 7
day = 26
hour = 4
minutes = 00
for i in range(6):
temps.append(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first().temp)
hums.append(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first().hum)
press.append(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first().pres)
gass.append(round(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first().gas / 2000000))
minutes += 10
return render(request, 'select_location.html', {'posts': posts, 'temps': json.dumps(temps, cls=DjangoJSONEncoder),
'hums': json.dumps(hums, cls=DjangoJSONEncoder),
'press': json.dumps(press, cls=DjangoJSONEncoder),
'gass': json.dumps(gass, cls=DjangoJSONEncoder)})
def campus_report(request):
"""
GET request for the campus reports page.
"""
return render(request, 'campus_report.html')
def report_2018_07_28(request):
"""
GET request for the reports page.
"""
# daily report only
# 2018-07-28 00:00 to 2018-07-28 20:59
dataset = Data.objects.all()
temps = []
hums = []
press = []
gass = []
year = 2018
month = 7
day = 28
hour = 00
minutes = 00
for h in range(24):
try:
temps.append(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first().temp)
except AttributeError:
temps.append('NA')
try:
hums.append(model_to_dict(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first())['hum'])
except AttributeError:
hums.append('NA')
try:
press.append(model_to_dict(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first())['pres'])
except AttributeError:
press.append('NA')
try:
gass.append(round(model_to_dict(dataset.filter(dt__year=year).filter(dt__month=month).filter(dt__day=day)
.filter(dt__hour=hour).filter(dt__minute=minutes).first())['gas'] / 2000000))
except AttributeError:
gass.append('NA')
hour += 1
times = ['00:00','01:00','02:00','03:00','04:00','05:00', '06:00', '07:00', '08:00', '09:00', '10:00', '11:00',
'12:00', '13:00', '14:00', '15:00', '16:00', '17:00', '18:00', '19:00', '20:00', '21:00', '22:00', '23:00']
temp_a = [i for i in temps if i != 'NA']
hum_a = [i for i in hums if i != 'NA']
pres_a = [i for i in press if i != 'NA']
gas_a = [i for i in gass if i != 'NA']
average = [round(sum(temp_a) / len(temp_a), 2), round(sum(hum_a) / len(hum_a), 2),
round(sum(pres_a) / len(pres_a), 2), sum(gas_a) / len(gas_a)]
return render(request, 'report_2018_07_28.html', {'times': times, 'temps': temps, 'hums': hums, 'press': press,
'gass': gass, 'average': average})
def feedback(request):
"""
GET request returns the form on the feedback page.
POST request sends an email.
"""
if request.method == 'POST':
feedback_form = FeedbackForm(request.POST)
print(feedback_form)
if feedback_form.is_valid():
feedback_instance = Feedback()
feedback_instance.name = feedback_form.cleaned_data['name']
feedback_instance.email = feedback_form.cleaned_data['email']
feedback_instance.feedback = feedback_form.cleaned_data['feedback']
feedback_instance.save()
print(feedback_instance.name)
print(feedback_instance.email)
print(feedback_instance.feedback)
# === Sendgrid email ===
sg = sendgrid.SendGridAPIClient(apikey=os.environ.get('SENDGRID_API_KEY'))
from_email = Email('<EMAIL>')
to_email = Email("<EMAIL>")
subject = "Feedback"
content = Content("text/plain", "Name: {} \nEmail: {} Feedback: {}".format(feedback_form.cleaned_data['name'],
feedback_form.cleaned_data['email'],
feedback_form.cleaned_data['feedback']))
mail = Mail(from_email, subject, to_email, content)
response = sg.client.mail.send.post(request_body=mail.get())
print(response.status_code)
print(response.body)
print(response.headers)
return HttpResponseRedirect('/')
else:
feedback_form = FeedbackForm()
context = {
'form': feedback_form,
}
return render(request, 'feedback.html', context)
| 8997b800ae041d2df6f5ee7cfe557499353c3569 | [
"Markdown",
"Python",
"Text",
"JavaScript"
]
| 12 | Python | kxue4/ssw695_smart_campus | afc715a637dac090c2e14f15a85e29c12a6ced1a | 8be30fe47a6963d63c2231fbcade43d9afa87306 |
refs/heads/master | <file_sep>// Tagua VM
//
//
// New BSD License
//
// Copyright © 2016-2016, <NAME>.
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above copyright
// notice, this list of conditions and the following disclaimer in the
// documentation and/or other materials provided with the distribution.
// * Neither the name of the Hoa nor the names of its contributors may be
// used to endorse or promote products derived from this software without
// specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
//! Binary to drive the `tagua_vm` library.
extern crate tagua_vm;
use tagua_vm::language;
use tagua_vm::shared::VERSION;
use std::env;
use std::fs::File;
use std::io::prelude::*;
use std::process;
enum ExitCode {
Ok,
InvalidOption,
MissingFile,
InvalidFile,
MultipleFiles,
Panic
}
fn usage() -> String {
"Usage: tvm [options] [file]\
\nOptions:\
\n -v, --version Print version.\
\n -h, --help This help.".to_string()
}
fn version() -> String {
format!("Tagua VM v{}", VERSION)
}
fn file(filename: &str) {
match File::open(filename) {
Ok(mut file) => {
let mut buffer = Vec::new();
match file.read_to_end(&mut buffer) {
Ok(_) =>
language::compiler::vm::compile(
language::parser::parse(&buffer[..])
),
Err(error) => {
println!(
"Error while reading file {}: {:?}.",
filename,
error
);
exit(ExitCode::Panic);
}
}
},
Err(error) => {
println!(
"Could not open input file {}; reason: {}.",
filename,
error
);
exit(ExitCode::InvalidFile);
}
};
}
fn exit(code: ExitCode) {
process::exit(code as i32);
}
pub fn process_options(arguments: Vec<String>) {
let mut input = None;
for argument in arguments {
match argument.chars().next() {
Some('-') =>
match argument.as_ref() {
"-v" | "--version" => {
println!("{}", version());
exit(ExitCode::Ok);
},
"-h" | "--help" => {
println!("{}", usage());
exit(ExitCode::Ok);
},
_ => {
println!("Invalid option `{}`.\n\n{}", argument, usage());
exit(ExitCode::InvalidOption);
}
},
Some(_) => {
if input == None {
input = Some(argument);
} else {
println!(
"Multiple input files are not allowed \
(issue with the `{}` input).\n\n{}",
argument,
usage()
);
exit(ExitCode::MultipleFiles);
}
},
None => {
println!("{}", usage());
exit(ExitCode::Panic);
}
}
}
if let Some(filename) = input {
file(&filename[..]);
} else {
println!("No file provided.\n\n{}", usage());
exit(ExitCode::MissingFile);
}
}
fn main() {
let arguments: Vec<String> = env::args().skip(1).collect();
process_options(arguments);
}
#[cfg(test)]
mod tests {
use tagua_vm::shared::VERSION;
use super::usage;
use super::version;
#[test]
fn case_usage() {
assert_eq!(
"Usage: tvm [options] [file]\
\nOptions:\
\n -v, --version Print version.\
\n -h, --help This help.".to_string(),
usage()
);
}
#[test]
fn case_version() {
assert_eq!(
format!("Tagua VM v{}", VERSION),
version()
);
}
}
<file_sep>// Tagua VM
//
//
// New BSD License
//
// Copyright © 2016-2016, <NAME>.
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above copyright
// notice, this list of conditions and the following disclaimer in the
// documentation and/or other materials provided with the distribution.
// * Neither the name of the Hoa nor the names of its contributors may be
// used to endorse or promote products derived from this software without
// specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
//! Language manipulation (like the lexer, parser, compiler…).
//!
//! This module is only responsible of the language manipulation, not its
//! execution. The trivial workflow is the following:
//!
//! * Let `datum` be of kind `&[u8]`,
//! * The `parser` module will apply two analyzers and producer:
//! * Lexical analyzer, also called lexer, to basically split `datum` in a
//! sequence of lexemes, also called tokens,
//! * Syntax analyzer, also called parser, to check that lexemes in the
//! sequence are well ordered,
//! * Finally, if lexemes are well ordered, an Abstract Syntax Tree (AST)
//! is produced.
//! * This AST will be visited by the `compiler` module visitors to apply
//! transformations or to check properties.
pub mod compiler;
pub mod parser;
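
The lexer → parser → AST workflow documented above can be illustrated with a small, self-contained sketch. This is only an illustration written in Python for brevity: none of the names below exist in `tagua_vm`, and the real parser handles full PHP rather than toy additions.

```python
# Illustrative sketch of the documented workflow: bytes -> lexemes -> AST.
# All names here are hypothetical; they only mirror the pipeline shape.

def lex(datum: bytes):
    """Lexical analysis: split raw bytes into a sequence of lexemes (tokens)."""
    tokens, i = [], 0
    while i < len(datum):
        c = datum[i:i + 1]
        if c.isdigit():
            j = i
            while j < len(datum) and datum[j:j + 1].isdigit():
                j += 1
            tokens.append(("int", int(datum[i:j])))
            i = j
        elif c == b"+":
            tokens.append(("plus", None))
            i += 1
        elif c in (b" ", b"\t"):
            i += 1  # whitespace carries no meaning here
        else:
            raise SyntaxError(f"unexpected byte {c!r}")
    return tokens

def parse(tokens):
    """Syntactic analysis: check that lexemes are well ordered, build an AST."""
    kind, value = tokens[0]
    assert kind == "int"
    node = ("int", value)
    i = 1
    while i < len(tokens):
        assert tokens[i][0] == "plus" and tokens[i + 1][0] == "int"
        node = ("add", node, ("int", tokens[i + 1][1]))
        i += 2
    return node

ast = parse(lex(b"7 + 42"))
```

Feeding `b"7 + 42"` through the pipeline yields a nested `("add", …)` node, which is the kind of structure a compiler visitor would then translate to an intermediate representation.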
<file_sep>// Tagua VM
//
//
// New BSD License
//
// Copyright © 2016-2016, <NAME>.
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above copyright
// notice, this list of conditions and the following disclaimer in the
// documentation and/or other materials provided with the distribution.
// * Neither the name of the Hoa nor the names of its contributors may be
// used to endorse or promote products derived from this software without
// specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
//! Transform an AST to VM intermediate representation.
use llvm::native_type::VMRepresentation;
use llvm;
use parser::ast;
/// Compile an AST to VM intermediate representation.
pub fn compile(ast: ast::Addition) {
let context = llvm::context::Context::new();
let mut module = llvm::module::Module::new("foobar", &context);
let mut builder = llvm::builder::Builder::new(&context);
let function = llvm::function::Function::new(
&module,
"f",
&mut [],
llvm::native_type::int64_type(&context)
);
let basic_block = function.new_basic_block("entry");
builder.move_to_end(basic_block);
let addition = builder.add(
7u64.to_vm_representation(&context),
42u64.to_vm_representation(&context),
"addition"
);
builder.return_value(addition);
let engine_result = llvm::engine::Engine::new(
&mut module,
&llvm::engine::Options {
level : llvm::engine::OptimizationLevel::NoOptimizations,
code_model: llvm::engine::CodeModel::Default
}
);
match engine_result {
Ok(engine) =>
println!(
"THE result is {}.",
engine.run_function(&function, &mut [])
),
Err(_) =>
panic!(
"Cannot execute the following module:\n{}",
module
)
}
}
<file_sep># Tagua VM
[](https://travis-ci.org/tagua-vm/tagua-vm)
[](https://webchat.freenode.net/?channels=#taguavm)
[](https://gitter.im/tagua-vm/tagua-vm)
Tagua VM is an experimental [PHP][1] Virtual Machine that
guarantees safety and quality by removing large classes of vulnerabilities
thanks to [the Rust language][2] and [the LLVM Compiler
Infrastructure][3].
## Introduction
PHP is an extremely popular programming language. In 2015, [PHP was used by
more than 80% of all websites][4]. However, almost [500 known severe
vulnerabilities have been recorded][5], of which almost [50 have a high CVE
score][6]. This is inherent to any popular language, but it is dangerous nonetheless.
The goal of this project is to provide a PHP VM that guarantees safety and
quality by removing large classes of vulnerabilities. This will be achieved by
using appropriated tools like Rust and LLVM. Rust is a remarkable language that
brings strong guarantees about memory safety. This is also a fast language that
competes well with C. It can also talk to C very easily. LLVM is a famous
compiler infrastructure that brings modernity, state-of-the-art algorithms,
performance, developer tool chains…
This project will solve three problems at once:
1. **Providing safety and high quality by removing large classes of
vulnerabilities and thus avoid the cost of dramatic bugs**,
2. **Providing modernity, a new developer experience and state-of-the-art
algorithms, and thus performance**,
3. **Providing a set of libraries that will compose the VM and that can be
reused outside of the project (like the parser, analysers, extensions
etc.)**.
## Why PHP is critical?
[PHP][1] is a popular programming language. Today, it powers a large part of
the software we use daily on the Internet. To list a few: Wikipedia, Facebook,
Baidu, Yahoo, or Pinterest, but also software you can install for your own
purposes, such as WordPress (blog and website), Drupal (CMS), Joomla (CMS),
Magento (commerce), Shopify (commerce), Moodle (learning), phpBB (forum)… In
2015, PHP was used as the server-side programming language on more than [80% of
all websites][4].
Due to its unique position in the Internet landscape, a single vulnerability
could have a huge impact.
* Economical: Imagine a shop; a bug in the checkout and products can no
longer be sold, or prices can be wrong. This is critical for both a small and a
big business. Note that this could be triggered by a malicious person,
* Privacy: Imagine a social network with hundreds of thousands of enthusiastic
users; a bug could leak private or personal information,
* Organisational: Imagine a music festival; a bug in the event software can
cause the cancellation of several months of work,
* Any other: Memory corruption, segmentation faults, data races… all these
classes of bugs can be critical on so many levels.
PHP VMs have recorded almost [500 known vulnerabilities][5], of which almost
[50 have a CVE score greater than or equal to 9 out of 10][6]. Many of
them, including the most dangerous, concern memory corruption [[7], [8], [9]] or
errors in parsers [[10], [11], [12], [13], [14]]. These vulnerabilities enable,
for instance, remote code execution or Denial of Service, two attack
vectors with a serious impact on a whole infrastructure.
This situation is real for any programming language (like [Python][15] or
[Java][16]). Nevertheless, the criticality of a vulnerability is closely tied
to the popularity of the language. In the case of PHP, a single vulnerability
can be dangerous in so many ways. However, this is not the fault of the
language itself: all the listed vulnerabilities are due to the VMs.
## What is the big plan?
Currently, PHP has two major virtual machines (VMs): [Zend Engine][17] and
[HHVM][18]. Zend Engine is the original VM, it is mainly written in C and
counts hundreds of contributors. HHVM is mainly written in C++ and also counts
hundreds of contributors. HHVM, by being more recent, has a more
state-of-the-art approach and offers features like Just-In-Time compilation.
However, both VMs are written in unsafe languages, where segmentation faults,
data races, memory corruptions, etc. are very frequent and constitute severe
errors and vulnerabilities, as presented in the previous section.
Tagua VM has a different approach.
* It is written in [Rust][2]: A language that guarantees memory safety,
threads without data races (by using the move semantics, data ownership,
borrowing, lifetime…), zero-cost abstractions, minimal runtime and, as
a bonus, efficient C bindings, in addition to being as fast as C,
* It relies on [LLVM][3] for the compiler backend: A solid,
state-of-the-art, research, widely used modular and reusable compiler and
toolchains technologies.
The classes of vulnerabilities and bugs mentioned earlier are almost entirely
removed by the Rust language. This is part of its guarantees. It does not remove
the need for complete test suites and security audits, though.
### Safety first
Due to the popularity of PHP, it is extremely important to have a safe VM to
run these applications.
Since the early days of Computer Science, numerous bugs and vulnerabilities in
OSes (like Linux or BSD), in libraries (like glibc), and in major programs (like
Bash, X.org or PHP VMs) have been found, simply due to the lack of memory and
type safety. Intrinsically, Rust enforces safety statically, hence removing most
of the memory vulnerabilities like segmentation faults or data races.
### High quality
The quality of a project can be defined in various ways. Here is what we mean
when speaking about quality.
* Documentation: Always up to date and as detailed as possible, for both the
API and the user documentation.
* Unit tests: Each function, each structure, each trait is unit tested. No
code lands without a unit test.
* Integration tests: Tagua VM is both a set of libraries and a binary; the
libraries part also has an integration test suite.
* Continuous Integration: Each set of commits must compile and must not
introduce a regression on any build target.
### Performance
The term “performance” must be defined. By “performance” we mean speed
and memory efficiency. While speed is not the top priority, memory is: it is
part of safety. When safety is ensured and quality is high enough to detect most
regressions, we can safely patch the VM to get better performance if and
when necessary.
Obviously, we commit to using state-of-the-art algorithms and data structures to
ensure excellent performance.
## Roadmap
(Under rewriting).
## Installing
To install Tagua VM, you must have Rust (see [the Rust installation page][19])
and LLVM (see [the LLVM installation page][20]) installed. It is important to
have `llvm-config` and `cargo` available in your path. [Cargo][21] is the Rust
package manager.
To build a release version:
```sh
$ cargo build --release
$ ./target/release/tvm --help
```
To build a development version:
```sh
$ cargo build
$ ./target/debug/tvm --help
```
### Using Docker
If installing Rust and LLVM on your machine is too much, Docker might be an
alternative: it provides everything needed to build, test and run Tagua VM.
First, build the Docker image:
```sh
$ docker build -t tagua-vm-dev .
```
Now, it is possible to run a container from this image:
```sh
$ docker run --rm -it -v $(pwd):/source tagua-vm-dev
```
If this command succeeds, you are inside a fresh container. To see if
everything is fine, you can start the test suite:
```sh
$ cargo test
```
## Contributing
Do whatever you want. Just respect the license and the other contributors. Your
favorite tool is going to be:
```sh
$ cargo test
```
to run all the test suites (unit test suites, integration test suites and documentation test suites).
### カンバン ([Kanban](https://en.wikipedia.org/wiki/Kanban))
In order to get an overview of what needs to be done, what is in progress and
what has been done recently, [a kanban board is
available](https://waffle.io/tagua-vm/tagua-vm).
## Documentation and help
The documentation is automatically uploaded online at the following address:
https://tagua-vm.github.io/tagua-vm.
To generate it locally, please, run the following command:
```sh
$ cargo doc --open
```
To get help on IRC, please join the official [`#taguavm` channel on
Freenode](https://webchat.freenode.net/?channels=#taguavm). Alternatively, there
is a [mirrored room on Gitter](https://gitter.im/tagua-vm/tagua-vm).
## Libraries
Tagua VM is designed as a set of libraries that can work outside of the
project. So far, the following libraries live outside of the project:
* [`libtagua_parser`][22], a safe, fast and memory-efficient PHP parser
(lexical and syntactic analysers).
## License
Tagua VM is under the New BSD License (BSD-3-Clause):
```
New BSD License
Copyright © 2016-2016, <NAME>.
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of the Hoa nor the names of its contributors may be
used to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
```
[1]: http://php.net/
[2]: https://www.rust-lang.org/
[3]: http://llvm.org/
[4]: https://w3techs.com/technologies/details/pl-php/all/all
[5]: https://www.cvedetails.com/vendor/74/PHP.html
[6]: https://www.cvedetails.com/vulnerability-list.php?vendor_id=74&product_id=&version_id=&page=1&hasexp=0&opdos=0&opec=0&opov=0&opcsrf=0&opgpriv=0&opsqli=0&opxss=0&opdirt=0&opmemc=0&ophttprs=0&opbyp=0&opfileinc=0&opginf=0&cvssscoremin=9&cvssscoremax=0&year=0&month=0&cweid=0&order=1&trc=495&sha=3a14a3e67be8aa88a16a1018e06ffe4e0d940af5
[7]: https://www.cvedetails.com/cve/CVE-2016-2554/
[8]: https://www.cvedetails.com/cve/CVE-2015-8880/
[9]: https://www.cvedetails.com/cve/CVE-2015-5589/
[10]: https://www.cvedetails.com/cve/CVE-2015-8617/
[11]: https://www.cvedetails.com/cve/CVE-2015-4642/
[12]: https://www.cvedetails.com/cve/CVE-2015-4601/
[13]: https://www.cvedetails.com/cve/CVE-2016-4544/
[14]: https://www.cvedetails.com/cve/CVE-2016-4539/
[15]: https://www.cvedetails.com/vulnerability-list/vendor_id-10210/Python.html
[16]: http://www.cvedetails.com/vulnerability-list.php?vendor_id=5&product_id=1526&version_id=&page=1&hasexp=0&opdos=0&opec=0&opov=0&opcsrf=0&opgpriv=0&opsqli=0&opxss=0&opdirt=0&opmemc=0&ophttprs=0&opbyp=0&opfileinc=0&opginf=0&cvssscoremin=9&cvssscoremax=0&year=0&month=0&cweid=0&order=1&trc=435&sha=0050c3562eb6901e183f2b2b636c9769579a0fb8
[17]: http://zend.com/
[18]: http://hhvm.com/
[19]: https://www.rust-lang.org/downloads.html
[20]: http://llvm.org/releases/download.html
[21]: http://doc.crates.io/guide.html
[22]: https://github.com/tagua-vm/parser
<file_sep>[package]
name = "tagua-vm"
version = "0.0.1"
authors = ["<NAME> <<EMAIL>>"]
repository = "https://github.com/tagua-vm/tagua-vm"
description = "Tagua VM is an experimental PHP Virtual Machine written with the Rust language and the LLVM Compiler Infrastructure."
readme = "README.md"
keywords = ["php", "virtual machine"]
license = "BSD-3-Clause"
[lib]
name = "tagua_vm"
path = "source/lib.rs"
test = true
doctest = true
bench = true
doc = true
harness = true
[[bin]]
name = "tvm"
path = "source/bin/tvm.rs"
test = true
doctest = true
bench = true
doc = true
harness = true
[profile.dev]
opt-level = 0
debug = true
rpath = false
lto = false
debug-assertions = true
codegen-units = 1
[profile.release]
opt-level = 3
debug = false
rpath = false
lto = true
debug-assertions = false
codegen-units = 1
[profile.test]
opt-level = 0
debug = true
rpath = false
lto = false
debug-assertions = true
codegen-units = 1
[dependencies]
tagua-llvm = {git = "https://github.com/tagua-vm/llvm", rev = "master"}
tagua-parser = {git = "https://github.com/tagua-vm/parser", rev = "master"}
| e9430bfb5922e3b2fd8c39ce5bbf3407449fbdf1 | [
"Markdown",
"Rust",
"TOML"
]
| 5 | Rust | ammarfaizi2/tagua-vm | a8f7e50dd9331d49a34ac02d49c638270b8019fb | cdbde24492e3a804a669f3ee3cb3fb3d83ae39ed |
refs/heads/master | <file_sep>using System.Text.Json;
using System.IO;
using System.Text.RegularExpressions;
using DigitalerPlanstempel.Utility;
namespace DigitalerPlanstempel.Template
{
/// <summary>
/// Creates and saves the template.
/// </summary>
public class ModelTemplate
{
public string JsonTemplate { get; private set; }
public ModelTemplate(string modelName, string storeyRestriction, string examinationRestriction)
{
var template = new CreateTemplate(modelName, examinationRestriction);
var model = template.GetBuilding(storeyRestriction);
var path = Constants.FolderPath + @"ModelTemplates\" + modelName + ".json";
var options = new JsonSerializerOptions
{
WriteIndented = true,
};
JsonTemplate = JsonSerializer.Serialize(model, options);
JsonTemplate = Regex.Unescape(JsonTemplate);
File.WriteAllText(path, JsonTemplate);
}
}
}
<file_sep>using System.Collections.Generic;
namespace DigitalerPlanstempel.BuildingElements
{
public class Wall
{
public string WallHash { get; set; }
public string GlobalId { get; set; }
public CartesianPoint StartPoint { get; set; }
public CartesianPoint EndPoint { get; set; }
public List<Property> AllProperties { get; set; }
public List<Door> AllDoors { get; set; }
public List<Window> AllWindows { get; set; }
}
}
<file_sep>using System.Collections.Generic;
namespace DigitalerPlanstempel.BuildingElements
{
public class Window
{
public string WindowHash { get; set; }
public string GlobalId { get; set; }
public CartesianPoint StartPoint { get; set; }
public List<Property> AllProperties { get; set; }
}
}
<file_sep>using System.Collections.Generic;
namespace DigitalerPlanstempel.Template
{
public class Building
{
public string BuildingHash { get; set; }
public List<Storey> AllStoreys { get; set; }
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace DigitalerPlanstempel.Utility
{
public static class Constants
{
public static string FolderPath => Path.GetFullPath(Path.Combine(System.AppDomain.CurrentDomain.BaseDirectory, @"..\..\"));
}
}
<file_sep>
namespace DigitalerPlanstempel.BuildingElements
{
public class CartesianPoint
{
public string PointHash { get; set; }
public string X { get; set; }
public string Y { get; set; }
public string Z { get; set; }
}
}
<file_sep>using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Security.Cryptography;
using DigitalerPlanstempel.Neo4j;
using DigitalerPlanstempel.Utility;
using DigitalerPlanstempel.BuildingElements;
namespace DigitalerPlanstempel.Template
{
/// <summary>
/// Determines all levels of the template.
/// </summary>
public class CreateTemplate
{
//The functions of this class call functions of the Neo4JConnector class to build the queries.
public Neo4JConnector Con;
private string ExaminationRestriction;
public CreateTemplate(string modelName, string examinationRestriction)
{
var uri = Neo4jAccessData.Uri;
var user = Neo4jAccessData.User;
var password = <PASSWORD>;
ExaminationRestriction = examinationRestriction;
Con = new Neo4JConnector(uri, user, password, modelName);
}
/// <summary>
/// List of property names; can be extended as desired.
/// </summary>
public List<string> GetPropertyNames(int id, string type)
{
var propertyNames = new List<string>();
if (type == "Wall")
{
propertyNames.Add("Height");
propertyNames.Add("Length");
propertyNames.Add("Width");
propertyNames.Add("Material");
propertyNames.Add("'LoadBearing'");
if (ExaminationRestriction != "Statikprüfung")
{
propertyNames.Add("'ThermalTransmittance'");
propertyNames.Add("Name");
}
}
else
{
propertyNames.Add("Height");
propertyNames.Add("Width");
if (ExaminationRestriction != "Statikprüfung")
{
propertyNames.Add("Material");
propertyNames.Add("Name");
}
}
return propertyNames;
}
/// <summary>
/// Creates the Property class.
/// Properties are determined specifically for the given building-element type.
/// </summary>
public Property GetProperties(int id, string name, string entityName)
{
var result = new Property();
result.PropertyName = name;
if (entityName == "Wall")
{
result.PropertyValue = Con.GetPropertyValueWall(id, name);
result.PropertyHash = Con.GetPropertyHashWall(id, name);
}
else if (entityName == "Door" || entityName == "Window")
{
result.PropertyValue = Con.GetPropertyValueDoorOrWindow(id, name);
result.PropertyHash = Con.GetPropertyHashDoorOrWindow(id, name);
}
return result;
}
/// <summary>
/// Creates the Door class.
/// Calls the GetProperties function.
/// </summary>
public Door GetDoor(int id)
{
var result = new Door();
var allProperties = new List<Property>();
var propertyNames = GetPropertyNames(id, "Door");
foreach (var name in propertyNames)
{
allProperties.Add(GetProperties(id, name, "Door"));
}
result.GlobalId = Con.GetGlobalId(id);
result.AllProperties = allProperties.OrderBy(o => o.PropertyName).ToList();
result.StartPoint = Con.GetStartPointDoorOrWindow(id);
string doorJson = JsonSerializer.Serialize(result);
using (SHA256 sha256Hash = SHA256.Create())
{
result.DoorHash = Hashing.GetHash(sha256Hash, doorJson);
}
return result;
}
/// <summary>
/// Creates the Window class.
/// Calls the GetProperties function.
/// </summary>
public Window GetWindow(int id)
{
var result = new Window();
var allProperties = new List<Property>();
var propertyNames = GetPropertyNames(id, "Window");
foreach (var name in propertyNames)
{
allProperties.Add(GetProperties(id, name, "Window"));
}
result.GlobalId = Con.GetGlobalId(id);
result.AllProperties = allProperties.OrderBy(o => o.PropertyName).ToList();
result.StartPoint = Con.GetStartPointDoorOrWindow(id);
string doorJson = JsonSerializer.Serialize(result);
using (SHA256 sha256Hash = SHA256.Create())
{
result.WindowHash = Hashing.GetHash(sha256Hash, doorJson);
}
return result;
}
/// <summary>
/// Creates the Wall class.
/// Calls the GetWindow, GetDoor and GetProperties functions.
/// </summary>
public Wall GetWall(int id)
{
var result = new Wall();
var allProperties = new List<Property>();
var propertyNames = GetPropertyNames(id, "Wall");
//Determine all properties for a wall
foreach (var name in propertyNames)
{
allProperties.Add(GetProperties(id, name, "Wall"));
}
//Determine all doors for a wall
var allDoors = new List<Door>();
var doorIds = Con.GetDoorOrWindowIds(id, " IFCDOOR");
foreach(var doorId in doorIds)
{
allDoors.Add(GetDoor(doorId));
}
result.AllDoors = allDoors.OrderBy(o=>o.StartPoint.X).ThenByDescending(o=>o.StartPoint.Y).ToList();
//Determine all windows for a wall
var allWindows = new List<Window>();
var windowIds = Con.GetDoorOrWindowIds(id, " IFCWINDOW");
foreach (var windowId in windowIds)
{
allWindows.Add(GetWindow(windowId));
}
result.AllWindows = allWindows.OrderBy(o => o.StartPoint.X).ThenByDescending(o => o.StartPoint.Y).ToList();
//Determine the start point and end point of the wall.
result.GlobalId = Con.GetGlobalId(id);
result.AllProperties = allProperties.OrderBy(o => o.PropertyName).ToList();
result.StartPoint = Con.GetStartPointWall(id);
var tempProperty = result.AllProperties.Where(o => o.PropertyName == "Length").ToList();
result.EndPoint = Con.GetEndPointWall(id, result.StartPoint, tempProperty[0]);
//Hash all attributes of the wall.
string wallJson = JsonSerializer.Serialize(result);
using (SHA256 sha256Hash = SHA256.Create())
{
result.WallHash = Hashing.GetHash(sha256Hash, wallJson);
}
return result;
}
/// <summary>
/// Creates the Storey class.
/// Calls the GetWall function.
/// </summary>
public Storey GetStorey(string storey)
{
var result = new Storey();
var allWalls = new List<Wall>();
var wallIds = Con.GetWallIds(storey);
foreach(var id in wallIds)
{
//Distinguish whether this is a structural check ("Statikprüfung") or a regular building permit ("Baugenehmigung").
//For a structural check, non-load-bearing elements are no longer listed.
var temp = GetWall(id);
if (temp.AllProperties.Any(item=>item.PropertyValue== "IFCBOOLEAN(.T.)") && ExaminationRestriction=="Statikprüfung")
{
allWalls.Add(temp);
}
else if(ExaminationRestriction == "Baugenehmigung")
{
allWalls.Add(temp);
}
}
//Walls are ordered by their start point
result.AllWalls = allWalls.OrderBy(o => o.StartPoint.X).ThenByDescending(o=>o.StartPoint.Y).ToList();
result.StoreyName = storey;
string storeyJson = JsonSerializer.Serialize(result);
using (SHA256 sha256Hash = SHA256.Create())
{
result.StoreyHash = Hashing.GetHash(sha256Hash, storeyJson);
}
return result;
}
/// <summary>
/// Entry point when creating a template.
/// Creates the Building class.
/// Calls the GetStorey function.
/// </summary>
public Building GetBuilding(string storeyRestriction)
{
var result = new Building();
var allStoreys = new List<Storey>();
//Determine all existing storeys
var storeyNames = Con.GetStoreyNames();
var tempStoreyNames = storeyNames.Select(item => (string)item.Clone()).ToList();
//For a partial check, all storeys except the selected one are removed
if(storeyRestriction != "Gesamtes Gebäude")
{
foreach(var item in tempStoreyNames)
{
if (item.Contains(storeyRestriction)==false)
{
storeyNames.Remove(item);
}
}
}
//Create all storeys as instances of the Storey class
foreach(var name in storeyNames)
{
allStoreys.Add(GetStorey(name));
}
result.AllStoreys = allStoreys;
//Hash the entire building with all its elements
string buildingJson = JsonSerializer.Serialize(result);
using (SHA256 sha256Hash = SHA256.Create())
{
result.BuildingHash = Hashing.GetHash(sha256Hash, buildingJson);
}
return result;
}
}
}
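The template built by `CreateTemplate` hashes bottom-up: each door, window, wall, storey and finally the building is JSON-serialized and SHA-256-hashed, so a parent hash covers its children's hashes (a Merkle-tree-like scheme). The sketch below illustrates that idea in Python; the structure and field names are simplified stand-ins, not the project's actual classes.

```python
import hashlib
import json

def sha256_hex(text: str) -> str:
    """SHA-256 of a string, as a hex digest (mirrors Hashing.GetHash)."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def hash_wall(wall: dict) -> dict:
    # Serialize the wall with a cleared hash field, then store the digest,
    # roughly what GetWall does with JsonSerializer + SHA256.
    wall = dict(wall, WallHash=None)
    wall["WallHash"] = sha256_hex(json.dumps(wall, sort_keys=True))
    return wall

def hash_storey(name: str, walls: list) -> dict:
    # Walls are ordered deterministically before hashing, mirroring the
    # OrderBy(StartPoint) calls, so the same model always yields the same hash.
    storey = {
        "StoreyName": name,
        "StoreyHash": None,
        "AllWalls": sorted((hash_wall(w) for w in walls),
                           key=lambda w: w["StartPoint"]),
    }
    storey["StoreyHash"] = sha256_hex(json.dumps(storey, sort_keys=True))
    return storey

s1 = hash_storey("EG", [{"StartPoint": [0, 0], "Length": 5.0, "WallHash": None}])
s2 = hash_storey("EG", [{"StartPoint": [0, 0], "Length": 6.0, "WallHash": None}])
```

Changing any property of a wall changes its `WallHash` and, transitively, the enclosing `StoreyHash`, which is what lets a checker detect a modification at any level of the template.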
<file_sep>using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text.Json;
using Neo4j.Driver.V1;
using DigitalerPlanstempel.Utility;
using DigitalerPlanstempel.BuildingElements;
namespace DigitalerPlanstempel.Neo4j
{
/// <summary>
/// This class establishes the connection to the graph database.
/// All queries are defined here.
/// </summary>
public class Neo4JConnector : IDisposable
{
private readonly IDriver _driver;
private string ModelName;
public Neo4JConnector(string uri, string user, string password, string modelName)
{
ModelName = modelName;
_driver = GraphDatabase.Driver(uri, AuthTokens.Basic(user, password));
}
public void Dispose()
{
// Dispose the driver if it exists
_driver?.Dispose();
}
/// <summary>
/// Queries all IFCBUILDINGSTOREY nodes for a specific model.
/// </summary>
public List<string> GetStoreyNames()
{
var storeyName = new List<string>();
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n:IFCBUILDINGSTOREY) " +
"WHERE n.model = $ModelName " +
"RETURN n.prop2 AS names",
new {ModelName});
foreach(var record in result)
{
storeyName.Add(record["names"].As<string>());
}
return result;
});
}
return storeyName;
}
/// <summary>
/// Queries all IFCWALLSTANDARDCASE entity ids for a specific storey.
/// </summary>
public List<int> GetWallIds(string storey)
{
var wallIds = new List<int>();
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(storey:IFCBUILDINGSTOREY)-[]-(:IFCRELCONTAINEDINSPATIALSTRUCTURE)-[]-(walls:IFCWALLSTANDARDCASE) " +
"WHERE storey.model = $ModelName AND storey.prop2 = $storey " +
"RETURN walls.EntityId as entityId",
new { ModelName, storey });
foreach (var record in result)
{
wallIds.Add(record["entityId"].As<int>());
}
return result;
});
}
return wallIds;
}
/// <summary>
/// Queries all IFCWINDOW or IFCDOOR entity ids for a specific wall.
/// </summary>
public List<int> GetDoorOrWindowIds(int id, string type)
{
var doorIds = new List<int>();
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(wall:IFCWALLSTANDARDCASE)-[]-(:IFCRELVOIDSELEMENT)-[]-(:IFCOPENINGELEMENT)-[]-(:IFCRELFILLSELEMENT)-[]-(element) " +
"WHERE wall.model = $ModelName AND wall.EntityId = $id AND element.name= $type " +
"RETURN element.EntityId as entityId",
new { ModelName, id, type });
foreach (var record in result)
{
doorIds.Add(record["entityId"].As<int>());
}
return result;
});
}
return doorIds;
}
/// <summary>
/// Determines the GlobalId of a specific node.
/// </summary>
public string GetGlobalId(int id)
{
var globalId = "";
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n) " +
"WHERE n.model = $ModelName AND n.EntityId = $id " +
"RETURN n.prop0 as id",
new { ModelName, id });
foreach (var record in result)
{
globalId = record["id"].As<string>();
}
return result;
});
}
return globalId;
}
/// <summary>
/// Determines the EntityId of a changed element.
/// </summary>
public int GetIdForChangedElements(string globalId)
{
int entityId=0;
if (globalId == null)
{
return entityId;
}
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n) " +
"WHERE n.model = $ModelName AND n.prop0 = $globalId " +
"RETURN n.EntityId as id",
new { ModelName, globalId});
foreach (var record in result)
{
entityId = record["id"].As<int>();
}
return result;
});
}
return entityId;
}
/// <summary>
/// Queries a single property of a wall.
/// Depending on the kind of property, different queries are required.
/// </summary>
public string GetPropertyValueWall(int id, string name)
{
string propertyValue = null;
if (name == "Length" || name == "Width" || name == "Height")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
//Round the results so that rounding differences do not change the hash value.
var result = tx.Run("MATCH(n1)-[]-(n2:IFCRELDEFINESBYPROPERTIES)-[]-(n3:IFCELEMENTQUANTITY)-[]-(n4:IFCQUANTITYLENGTH) " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName AND n4.prop0 CONTAINS $name " +
"WITH toFloat(n4.prop3) AS float " +
"return round(100*float)/100 AS value",
new { ModelName, id, name});
foreach (var record in result)
{
propertyValue = record["value"].As<string>();
}
return result;
});
}
}
else if (name == "Material")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1:IFCWALLSTANDARDCASE)-[]-(n2:IFCRELASSOCIATESMATERIAL)-[]-(n3:IFCMATERIALLAYERSETUSAGE)-[]-(n4:IFCMATERIALLAYERSET)-[]-(n5:IFCMATERIALLAYER)-[]-(n6:IFCMATERIAL) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"RETURN n6.prop0 AS material",
new { ModelName, id});
foreach (var record in result)
{
propertyValue = record["material"].As<string>();
}
return result;
});
}
}
else if (name == "Name")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1) " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName " +
"return n1.prop2 as value",
new { ModelName, id, name });
foreach (var record in result)
{
propertyValue = record["value"].As<string>();
}
return result;
});
}
}
else
{
//All properties stored in an IFCPROPERTYSINGLEVALUE node can be queried here.
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(relProperties:IFCRELDEFINESBYPROPERTIES)-[]-(propertySet:IFCPROPERTYSET)-[]-(property:IFCPROPERTYSINGLEVALUE) " +
"WHERE n1.model = $ModelName AND n1.EntityId = $id AND property.prop0 = $name " +
"RETURN property.prop2 as value",
new { ModelName, id, name });
foreach (var record in result)
{
propertyValue = record["value"].As<string>();
}
return result;
});
}
}
return propertyValue;
}
/// <summary>
/// Queries a single property of a door or a window.
/// Depending on the kind of property, different queries are required.
/// For the same property, the queries for windows and doors differ from those for walls.
/// </summary>
public string GetPropertyValueDoorOrWindow(int id, string name)
{
string propertyValue = null;
if (name == "Height")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1) " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName " +
"WITH toFloat(n1.prop8) AS float " +
"return round(100*float)/100 AS value",
new { ModelName, id, name });
foreach (var record in result)
{
propertyValue = record["value"].As<string>();
}
return result;
});
}
}
else if (name == "Width")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1) " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName " +
"WITH toFloat(n1.prop9) AS float " +
"return round(100*float)/100 AS value",
new { ModelName, id, name });
foreach (var record in result)
{
propertyValue = record["value"].As<string>();
}
return result;
});
}
}
else if (name == "Material")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCRELASSOCIATESMATERIAL)-[]-(n3:IFCMATERIALLIST)-[]-(n4:IFCMATERIAL) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"RETURN n4.prop0 AS material",
new { ModelName, id });
foreach (var record in result)
{
if (propertyValue == null)
{
propertyValue = record["material"].As<string>();
}
else
{
propertyValue = propertyValue + " und " + record["material"].As<string>();
}
}
return result;
});
}
}
else if (name == "Name")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1) " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName " +
"return n1.prop2 as value",
new { ModelName, id, name });
foreach (var record in result)
{
propertyValue = record["value"].As<string>();
}
return result;
});
}
}
else
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(relProperties:IFCRELDEFINESBYPROPERTIES)-[]-(propertySet:IFCPROPERTYSET)-[]-(property:IFCPROPERTYSINGLEVALUE) " +
"WHERE n1.model = $ModelName AND n1.EntityId = $id AND property.prop0 = $name " +
"RETURN property.prop2 as value",
new { ModelName, id, name });
foreach (var record in result)
{
if(record["value"].As<string>()!= "IFCLABEL('')")
{
propertyValue = record["value"].As<string>();
}
}
return result;
});
}
}
return propertyValue;
}
/// <summary>
/// Queries the start point of the wall.
/// </summary>
public CartesianPoint GetStartPointWall(int id)
{
var startPoint = new CartesianPoint();
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCLOCALPLACEMENT)-[]-(n3:IFCAXIS2PLACEMENT3D)-[]-(n4:IFCCARTESIANPOINT) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"RETURN n4 AS node",
new { ModelName, id });
foreach (var record in result)
{
var node = record["node"].As<INode>();
var numbers = node["prop0"].As<List<double>>();
startPoint.X = string.Format("{0:0.00}",numbers[0]);
startPoint.Y = string.Format("{0:0.00}", numbers[1]);
startPoint.Z = string.Format("{0:0.00}", numbers[2]);
}
return result;
});
}
startPoint.PointHash = GetPointHashWall(id);
return startPoint;
}
/// <summary>
/// Determines the end point of the wall.
/// The end point is computed from the start point, the length and the direction of the wall.
/// </summary>
public CartesianPoint GetEndPointWall(int id, CartesianPoint startPoint, Property length)
{
var endPoint = new CartesianPoint();
var directionXY = new Direction { X = 1.0, Y = 0.0, Z = 0.0 };
var directionZ = new Direction { X = 0.0, Y = 0.0, Z = 1.0 };
//Determine the direction of the wall
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(wall)-[*3]-(n1:IFCDIRECTION) " +
"WHERE wall.EntityId = $id AND wall.model = $ModelName " +
"RETURN n1 AS node",
new {ModelName, id});
foreach (var record in result)
{
var node = record["node"].As<INode>();
var numbers = node["prop0"].As<List<double>>();
if (numbers[2] == 1)
{
directionZ.Z = numbers[2];
}
else
{
directionXY.X = numbers[0];
directionXY.Y = numbers[1];
}
}
return result;
});
}
//Compute the end point from the start point, the length and the direction of the wall
var norm = Math.Sqrt(Math.Pow(directionXY.X, 2) + Math.Pow(directionXY.Y, 2));
var endPointX = Convert.ToDouble(startPoint.X) + Convert.ToDouble(length.PropertyValue) * directionXY.X / norm;
var endPointY = Convert.ToDouble(startPoint.Y) + Convert.ToDouble(length.PropertyValue) * directionXY.Y / norm;
var endPointZ = Convert.ToDouble(startPoint.Z);
endPoint.X = string.Format("{0:0.00}", endPointX);
endPoint.Y = string.Format("{0:0.00}", endPointY);
endPoint.Z = string.Format("{0:0.00}", endPointZ);
//Hash the end point
string pointJson = JsonSerializer.Serialize(endPoint);
using (MD5 md5Hash = MD5.Create())
{
endPoint.PointHash = Hashing.GetHash(md5Hash, pointJson);
}
return endPoint;
}
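// Worked example for the end-point computation above (illustrative values only,
// not taken from a real model): with start point (0.00, 0.00, 0.00), wall length
// 5.0 and direction (1, 1, 0), the normalisation factor is sqrt(1^2 + 1^2) ≈ 1.4142,
// so the end point is approximately X = 3.54, Y = 3.54, Z = 0.00.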
/// <summary>
/// Queries the start point of the door or window.
/// </summary>
public CartesianPoint GetStartPointDoorOrWindow(int id)
{
var startPoint = new CartesianPoint();
var version = "";
//The way the start point is determined differs slightly between models.
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCLOCALPLACEMENT)-[]-(n3:IFCAXIS2PLACEMENT3D)-[]-(n4:IFCCARTESIANPOINT) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"RETURN n4 AS node",
new { ModelName, id });
foreach (var record in result)
{
var node = record["node"].As<INode>();
var numbers = node["prop0"].As<List<double>>();
startPoint.X = string.Format("{0:0.00}", numbers[0]);
startPoint.Y = string.Format("{0:0.00}", numbers[1]);
startPoint.Z = string.Format("{0:0.00}", numbers[2]);
version = "first";
}
return result;
});
if (startPoint.X == "0,00" && startPoint.Y == "0,00" && startPoint.Z == "0,00")
{
var nodeResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCRELFILLSELEMENT)-[]-(n3:IFCOPENINGELEMENT)-[]-(n4:IFCLOCALPLACEMENT)-[]-(n5:IFCAXIS2PLACEMENT3D)-[]-(n6:IFCCARTESIANPOINT) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"RETURN n6 AS node",
new { ModelName, id });
foreach (var record in result)
{
var node = record["node"].As<INode>();
var numbers = node["prop0"].As<List<double>>();
startPoint.X = string.Format("{0:0.00}", numbers[0]);
startPoint.Y = string.Format("{0:0.00}", numbers[1]);
startPoint.Z = string.Format("{0:0.00}", numbers[2]);
version = "second";
}
return result;
});
}
}
//The hash has to be computed differently depending on how the start point was determined.
startPoint.PointHash = GetPointHashDoorOrWindow(id, version);
return startPoint;
}
/// <summary>
/// Determines the hash value for a wall property.
/// </summary>
public string GetPropertyHashWall(int id, string name)
{
var propertyHash = "";
//The hash computation depends on how the property values are determined above; see GetPropertyValueWall.
if (name == "Length" || name == "Width" || name == "Height")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCRELDEFINESBYPROPERTIES)-[]-(n3:IFCELEMENTQUANTITY)-[]-(n4:IFCQUANTITYLENGTH) " +
"WITH n1,n4 " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName AND n4.prop0 CONTAINS $name " +
"WITH round(100*toFloat(n4.prop3))/100 AS value,n1,n4 " +
"WITH collect([n1.name,n4.name,n4.prop0,value]) AS result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id, name});
foreach (var record in result)
{
propertyHash = record["hash"].As<string>();
}
return result;
});
}
}
else if (name == "Material")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1:IFCWALLSTANDARDCASE)-[]-(n2:IFCRELASSOCIATESMATERIAL)-[]-(n3:IFCMATERIALLAYERSETUSAGE)-[]-(n4:IFCMATERIALLAYERSET)-[]-(n5:IFCMATERIALLAYER)-[]-(n6:IFCMATERIAL) " +
"WITH n1,n6 " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"WITH collect([n1.name,n6.name,n6.prop0]) as result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id });
foreach (var record in result)
{
propertyHash = record["hash"].As<string>();
}
return result;
});
}
}
else
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n)-[r1]-(n1:IFCRELDEFINESBYPROPERTIES)-[r2]-(n2:IFCPROPERTYSET)-[r3]-(n3:IFCPROPERTYSINGLEVALUE) " +
"WHERE n.model = $ModelName AND n.EntityId = $id AND n3.prop0 = $name " +
"WITH collect([n1.name,n3.name,n3.prop0,n3.prop2]) AS result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id, name });
foreach (var record in result)
{
propertyHash = record["hash"].As<string>();
}
return result;
});
}
}
return propertyHash;
}
/// <summary>
/// Determines the hash value for a property of a window or a door.
/// </summary>
public string GetPropertyHashDoorOrWindow(int id, string name)
{
var propertyHash = "";
//The hash computation depends on how the property values are determined above; see GetPropertyValueDoorOrWindow.
if (name == "Height")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1) " +
"WITH n1 " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName " +
"WITH round(100*toFloat(n1.prop8))/100 AS value,n1 " +
"WITH collect([n1.name,value]) AS result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id, name });
foreach (var record in result)
{
propertyHash = record["hash"].As<string>();
}
return result;
});
}
}
else if (name == "Width")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1) " +
"WITH n1 " +
"WHERE n1.EntityId=$id AND n1.model =$ModelName " +
"WITH round(100*toFloat(n1.prop9))/100 AS value,n1 " +
"WITH collect([n1.name,value]) AS result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id, name });
foreach (var record in result)
{
propertyHash = record["hash"].As<string>();
}
return result;
});
}
}
else if (name == "Material")
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCRELASSOCIATESMATERIAL)-[]-(n3:IFCMATERIALLIST)-[]-(n4:IFCMATERIAL) " +
"WITH n1,n4 " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"WITH collect([n1.name,n4.name,n4.prop0]) as result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id });
foreach (var record in result)
{
propertyHash = record["hash"].As<string>();
}
return result;
});
}
}
else
{
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n)-[]-(n1:IFCRELDEFINESBYPROPERTIES)-[]-(n2:IFCPROPERTYSET)-[]-(n3:IFCPROPERTYSINGLEVALUE) " +
"WHERE n.model = $ModelName AND n.EntityId = $id AND n3.prop0 = $name " +
"WITH collect([n1.name,n3.name,n3.prop0,n3.prop2]) AS result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id, name });
foreach (var record in result)
{
propertyHash = record["hash"].As<string>();
}
return result;
});
}
}
return propertyHash;
}
/// <summary>
/// Determines the hash value for the start point of a wall.
/// </summary>
public string GetPointHashWall(int id)
{
var pointHash = "";
using (var session = _driver.Session())
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCLOCALPLACEMENT)-[]-(n3:IFCAXIS2PLACEMENT3D)-[]-(n4:IFCCARTESIANPOINT) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"WITH collect([n4.name,n4.prop0]) as result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id });
foreach (var record in result)
{
pointHash = record["hash"].As<string>();
}
return result;
});
}
return pointHash;
}
/// <summary>
/// Determines the hash value for the start point of a door or a window.
/// </summary>
public string GetPointHashDoorOrWindow(int id, string version)
{
var pointHash = "";
using (var session = _driver.Session())
{
if(version == "first")
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCLOCALPLACEMENT)-[]-(n3:IFCAXIS2PLACEMENT3D)-[]-(n4:IFCCARTESIANPOINT) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"WITH collect([n4.name,n4.prop0]) as result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id });
foreach (var record in result)
{
pointHash = record["hash"].As<string>();
}
return result;
});
}
else
{
var entityResult = session.WriteTransaction(tx =>
{
var result = tx.Run("MATCH(n1)-[]-(n2:IFCRELFILLSELEMENT)-[]-(n3:IFCOPENINGELEMENT)-[]-(n4:IFCLOCALPLACEMENT)-[]-(n5:IFCAXIS2PLACEMENT3D)-[]-(n6:IFCCARTESIANPOINT) " +
"WHERE n1.model = $ModelName AND n1.EntityId=$id " +
"WITH collect([n6.name,n6.prop0]) as result " +
"RETURN apoc.hashing.fingerprint(result) AS hash",
new { ModelName, id });
foreach (var record in result)
{
pointHash = record["hash"].As<string>();
}
return result;
});
}
}
return pointHash;
}
}
}
<file_sep>
namespace DigitalerPlanstempel.Neo4j
{
public static class Neo4jAccessData
{
public static string Uri => "bolt://localhost:7687";
public static string User => "neo4j";
public static string Password => "<PASSWORD>";
}
}
<file_sep>
namespace DigitalerPlanstempel.BuildingElements
{
public class Property
{
public string PropertyHash { get; set; }
public string PropertyName { get; set; }
public string PropertyValue { get; set; }
}
}
<file_sep>
namespace DigitalerPlanstempel.Utility
{
/// <summary>
/// Class for recording the detected changes.
/// </summary>
public class Distinction
{
public string StoreyName { get; set; }
public string WallGlobalId { get; set; }
public string WallPropertyName { get; set; }
public string WallElement { get; set; }
public string WallElementGlobalId { get; set; }
public string WallElementPropertyName { get; set; }
public string Status { get; set; }
public string OldValue { get; set; }
public string NewValue { get; set; }
public Distinction Copy()
{
return (Distinction)this.MemberwiseClone();
}
}
}
<file_sep>using Microsoft.Win32;
using System.Collections.Generic;
using System.Windows;
using System.Windows.Controls;
using ModelGraphGen;
using DigitalerPlanstempel.Template;
using System;
using Newtonsoft.Json.Linq;
using System.IO;
using DigitalerPlanstempel.Utility;
using Xbim.Ifc;
using Xbim.ModelGeometry.Scene;
using DigitalerPlanstempel.Neo4j;
namespace DigitalerPlanstempel
{
/// <summary>
/// Interaction logic for ComparisonPage.xaml
/// </summary>
public partial class ComparisonPage : Page
{
private Permission Permission;
private ModelTemplate Model2;
List<Distinction> Distinctions;
public ComparisonPage(Permission Permission)
{
InitializeComponent();
this.Permission = Permission;
}
/// <summary>
/// Adds the opened file to the graph database
/// </summary>
private void ClickOnModelToNeo4j(object sender, RoutedEventArgs e)
{
OpenFileDialog openFileDialog = new OpenFileDialog();
openFileDialog.Filter = "IFC-Files (*.ifc)|*.ifc";
string sourceFile = null;
string resultFile = null;
if (openFileDialog.ShowDialog() == true)
{
sourceFile = openFileDialog.FileName;
}
if (sourceFile != null)
{
var parser = new Instance2Neo4jParser
{
SourceLocation = sourceFile,
TargetLocation = resultFile
};
var modelName = "model2";
parser.SendToDb(false, modelName, Neo4jAccessData.Uri, Neo4jAccessData.User, Neo4jAccessData.Password);
MessageBox.Show("Modell wurde zur Graphdatenbank hinzugefügt.", "");
}
else
{
MessageBox.Show("Ifc-Datei konnte nicht hinzugefügt werden. Probieren Sie es erneut.");
}
}
/// <summary>
/// Compares the two templates
/// </summary>
private void ClickOnComparison(object sender, RoutedEventArgs e)
{
ComparisonGrid.Visibility = Visibility.Visible;
if (Permission.VerifySignature())
{
LabelSignaturVerifizierung.Visibility = Visibility.Visible;
}
//Parse the JSON files into dynamic objects to access the elements they contain.
dynamic templateModel = JObject.Parse(Permission.ModelTemplate);
dynamic model2 = JObject.Parse(Model2.JsonTemplate);
Comparison comparison = new Comparison();
Distinctions = comparison.CompareBuilding(templateModel, model2);
//All changes are displayed in a table.
NewElements.ItemsSource = Distinctions;
}
/// <summary>
/// Creates the template for the second model
/// </summary>
private void ClickOnSchablone(object sender, RoutedEventArgs e)
{
Model2 = new ModelTemplate("model2",Permission.StoreyRestriction, Permission.ExaminationRestriction);
VergleichenButton.Visibility = Visibility.Visible;
}
/// <summary>
/// Opens the HTML page of the xBIM WeXplorer. The code for using the web viewer was taken from
/// http://docs.xbim.net/XbimWebUI/tutorial-5_Colourful_building.html | http://docs.xbim.net/XbimWebUI/5_Colourful_building.live.html and https://github.com/xBimTeam/XbimWebUI
/// </summary>
private void ClickOnGrafischDarstellen(object sender, RoutedEventArgs e)
{
var path = Constants.FolderPath + @"WebViewer\data\templateModel.ifc";
//Create the model's IFC file from the Base64 string
Byte[] bytes = Convert.FromBase64String(Permission.ModelFile);
File.WriteAllBytes(path, bytes);
string wexBimB64 = "";
//Create the wexBim file from the model; files in wexBim format are required by the web viewer
using (var model = IfcStore.Open(path))
{
var context = new Xbim3DModelContext(model);
context.CreateContext();
var wexBimFilename = Path.ChangeExtension(path, "wexBIM");
using (var wexBiMfile = File.Create(wexBimFilename))
{
using (var wexBimBinaryWriter = new BinaryWriter(wexBiMfile))
{
model.SaveAsWexBim(wexBimBinaryWriter);
wexBimBinaryWriter.Close();
}
wexBiMfile.Close();
}
//Convert the wexBim file to a Base64 string so it can be used in the JavaScript file of the web viewer.
var bytesWexBim = File.ReadAllBytes(wexBimFilename);
wexBimB64 = Convert.ToBase64String(bytesWexBim);
}
File.Delete(path);
//Determine the EntityIds of all changed building elements
var tempNeo4j = new Neo4JConnector(Neo4jAccessData.Uri, Neo4jAccessData.User, Neo4jAccessData.Password,"<PASSWORD>");
List<int> entityIds = new List<int>();
foreach (var item in Distinctions)
{
var wallGlobalId = tempNeo4j.GetIdForChangedElements(item.WallGlobalId);
var wallElementGlobalId = tempNeo4j.GetIdForChangedElements(item.WallElementGlobalId);
if (!entityIds.Contains(wallGlobalId) && wallGlobalId != 0 && wallElementGlobalId==0)
{
entityIds.Add(wallGlobalId);
}
else if (!entityIds.Contains(wallElementGlobalId) && wallElementGlobalId !=0)
{
entityIds.Add(wallElementGlobalId);
}
}
//Store the element ids and the Base64 string of the wexBim file.
var pathChangedElements = Constants.FolderPath + @"WebViewer\data\templateModel.js";
string input = "var data = {\"changedElements\" : [" + String.Join(",", entityIds) + "]};";
input += $@"var dataWexBimB64 = ""{wexBimB64}"";";
File.WriteAllText(pathChangedElements, input);
//Open the web viewer
System.Diagnostics.Process.Start(Constants.FolderPath + @"WebViewer\webViewer.html");
}
//Restart the application
private void ClickOnZurueck(object sender, RoutedEventArgs e)
{
System.Diagnostics.Process.Start(Application.ResourceAssembly.Location);
Application.Current.Shutdown();
}
//Indicates which RadioButton is selected
private void RadioButtonElement_Checked(object sender, RoutedEventArgs e)
{
var radioButton = sender as RadioButton;
var buildingElementRestriction = radioButton.Content.ToString();
var tempDistinctions = new List<Distinction>();
if (Distinctions != null)
{
foreach (var item in Distinctions)
{
tempDistinctions.Add(item.Copy());
}
switch (buildingElementRestriction)
{
case "Wände":
tempDistinctions.RemoveAll(item => item.WallElement != null);
NewElements.ItemsSource = tempDistinctions;
break;
case "Fenster":
tempDistinctions.RemoveAll(item => item.WallElement == "Tür" || item.WallElement == null);
NewElements.ItemsSource = tempDistinctions;
break;
case "Türen":
tempDistinctions.RemoveAll(item => item.WallElement == "Fenster" || item.WallElement == null);
NewElements.ItemsSource = tempDistinctions;
break;
case "Alle genannten Optionen":
NewElements.ItemsSource = tempDistinctions;
break;
}
}
}
}
}
<file_sep>using System.Collections.Generic;
namespace DigitalerPlanstempel.BuildingElements
{
public class Door
{
public string DoorHash { get; set; }
public string GlobalId { get; set; }
public CartesianPoint StartPoint { get; set; }
public List<Property> AllProperties { get; set; }
}
}
<file_sep>using System.Collections.Generic;
using DigitalerPlanstempel.BuildingElements;
namespace DigitalerPlanstempel.Template
{
public class Storey
{
public string StoreyHash { get; set; }
public string StoreyName { get; set; }
public List<Wall> AllWalls { get; set; }
}
}
<file_sep>using System;
using System.IO;
using System.Windows;
using Microsoft.Win32;
using ModelGraphGen;
using DigitalerPlanstempel.Template;
using DigitalerPlanstempel.Utility;
using System.Windows.Controls;
using DigitalerPlanstempel.Neo4j;
namespace DigitalerPlanstempel
{
/// <summary>
/// Interaction logic for MainWindow.xaml
/// </summary>
public partial class MainWindow : System.Windows.Window
{
private Permission Permission;
private string SourcefilePath;
private string StoreyRestriction;
private string ExaminationRestriction;
public MainWindow()
{
InitializeComponent();
}
/// <summary>
/// Confirms the check. The model can be added to the graph database, or only a template is created.
/// </summary>
private void ClickOnNewTemplate(object sender, RoutedEventArgs e)
{
OpenFileDialog openFileDialog = new OpenFileDialog();
openFileDialog.Filter = "IFC-Files (*.ifc)|*.ifc";
string sourceFile = null;
string resultFile = null;
if (openFileDialog.ShowDialog() == true)
{
LabelShowOpenFile.Content = openFileDialog.FileName;
sourceFile = openFileDialog.FileName;
SourcefilePath = openFileDialog.FileName;
}
if (ExaminerName.Text == "" || ModelName.Text == "" || sourceFile ==null || ExaminerName.Text == "Name des Prüfers" || ModelName.Text == "Name des Gebäudemodells")
{
MessageBox.Show("Geben sie Prüfer- und Modellname, sowie die zu prüfende IFC-Datei an. ");
}
else
{
var result = MessageBox.Show("Soll die IFC-Datei in die Graphdatenbank geladen werden?", "", MessageBoxButton.YesNo);
//Add the model to the graph database
if (result == MessageBoxResult.Yes)
{
var parser = new Instance2Neo4jParser
{
SourceLocation = sourceFile,
TargetLocation = resultFile
};
var modelName = "templateModel";
parser.SendToDb(false, modelName, Neo4jAccessData.Uri, Neo4jAccessData.User, Neo4jAccessData.Password);
MessageBox.Show("IFC-Datei wurde in die Graphdatenbank geladen.", "Neue Schablone");
}
//Create the template for the checked model
var TemplateModel = new ModelTemplate("templateModel", StoreyRestriction, ExaminationRestriction);
Byte[] bytes = File.ReadAllBytes(sourceFile);
string file = Convert.ToBase64String(bytes);
//Create a new permission
Permission = new Permission(ExaminerName.Text, ModelName.Text, TemplateModel.JsonTemplate, file, StoreyRestriction, ExaminationRestriction);
//Open the next page
ComparisonPage comparisonPage = new ComparisonPage(Permission);
this.Content = comparisonPage;
}
}
//Indicates which RadioButton is selected
private void RadioButtonStorey_Checked(object sender, RoutedEventArgs e)
{
var radioButton = sender as RadioButton;
StoreyRestriction = radioButton.Content.ToString();
}
//Indicates which RadioButton is selected
private void RadioButtonExamination_Checked(object sender, RoutedEventArgs e)
{
var radioButton = sender as RadioButton;
ExaminationRestriction = radioButton.Content.ToString();
}
}
}
<file_sep>using System.Collections.Generic;
using System.Linq;
namespace DigitalerPlanstempel.Utility
{
/// <summary>
/// Compares two created templates
/// </summary>
public class Comparison
{
public List<Distinction> Distinctions = new List<Distinction>();
/// <summary>
/// Comparison at building level; if the hash values do not match, CompareStoreys is called
/// </summary>
public List<Distinction> CompareBuilding(dynamic templateModel, dynamic model2 )
{
if(templateModel.BuildingHash != model2.BuildingHash)
{
for (int i = 0; i < templateModel.AllStoreys.Count && i < model2.AllStoreys.Count; i++)
{
CompareStoreys(templateModel.AllStoreys[i], model2.AllStoreys[i]);
}
}
return Distinctions;
}
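// Design note on the comparison strategy below (a summary of the existing logic,
// with illustrative storey names): the serialized templates form a hash tree, e.g.
//   BuildingHash
//   +-- StoreyHash "EG"
//   |     +-- WallHash (per GlobalId) -> PropertyHash, PointHash, ...
//   +-- StoreyHash "OG"
// Equal hashes at any level mean the whole subtree is unchanged, so the methods
// below only descend into branches whose hashes differ.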
/// <summary>
/// Comparison at storey level; if the hash values do not match, CompareWalls is called for each wall
/// </summary>
public void CompareStoreys(dynamic templateModelStorey, dynamic model2Storey)
{
var distinction = new Distinction();
if (templateModelStorey.StoreyHash != model2Storey.StoreyHash)
{
distinction.StoreyName = model2Storey.StoreyName;
//If the storey name differs, this is recorded in a Distinction
if(templateModelStorey.StoreyName != model2Storey.StoreyName)
{
Distinction storeyNameDistinction = distinction.Copy();
storeyNameDistinction.Status = "Der Name des Stockwerks hat sich geändert.";
Distinctions.Add(storeyNameDistinction);
}
//Determine all wall GlobalIds of the storey
var templateModelStoreyWallGlobalIds = new List<string>();
var model2StoreyWallGlobalIds = new List<string>();
foreach (var item in templateModelStorey.AllWalls)
{
templateModelStoreyWallGlobalIds.Add(item.GlobalId.ToString());
}
foreach(var item in model2Storey.AllWalls)
{
model2StoreyWallGlobalIds.Add(item.GlobalId.ToString());
}
//Determine whether a wall was added or removed
var missingWalls1 = templateModelStoreyWallGlobalIds.Except(model2StoreyWallGlobalIds).ToList();
foreach(var item in missingWalls1)
{
var temp = distinction.Copy();
temp.WallGlobalId = item;
temp.Status = "Diese Wand existiert nicht mehr im aktuellen Gebäudemodell.";
Distinctions.Add(temp);
}
var missingWalls2 = model2StoreyWallGlobalIds.Except(templateModelStoreyWallGlobalIds).ToList();
foreach (var item in missingWalls2)
{
var temp = distinction.Copy();
temp.WallGlobalId = item;
temp.Status = "Diese Wand existiert nur im aktuellen Gebäudemodell.";
Distinctions.Add(temp);
}
            //Compare the individual walls
foreach (var item in templateModelStorey.AllWalls)
{
foreach(var element in model2Storey.AllWalls)
{
if(item.GlobalId == element.GlobalId)
{
CompareWalls(item, element, distinction);
}
}
}
}
}
    /// <summary>
    /// Comparison at the wall level; if the hash values do not match, the start and end points are checked and CompareProperties, CompareAllDoors and CompareAllWindows are called
    /// </summary>
public void CompareWalls(dynamic templateModelWall, dynamic model2Wall, Distinction tempDistinction)
{
var distinction = tempDistinction.Copy();
if (templateModelWall.WallHash != model2Wall.WallHash)
{
            //If the start point differs, it is recorded in a Distinction
if(templateModelWall.StartPoint.PointHash != model2Wall.StartPoint.PointHash)
{
Distinction startPointDistinction = distinction.Copy();
startPointDistinction.WallGlobalId = templateModelWall.GlobalId;
                startPointDistinction.WallPropertyName = "Start point";
                startPointDistinction.Status = "The start points of the two walls do not match.";
startPointDistinction.OldValue = "X = " + templateModelWall.StartPoint.X + " | Y = " + templateModelWall.StartPoint.Y + " | Z = " + templateModelWall.StartPoint.Z;
startPointDistinction.NewValue = "X = " + model2Wall.StartPoint.X + " | Y = " + model2Wall.StartPoint.Y + " | Z = " + model2Wall.StartPoint.Z;
Distinctions.Add(startPointDistinction);
}
            //If the end point differs, it is recorded in a Distinction
if (templateModelWall.EndPoint.PointHash != model2Wall.EndPoint.PointHash)
{
Distinction endPointDistinction = distinction.Copy();
endPointDistinction.WallGlobalId = templateModelWall.GlobalId;
                endPointDistinction.WallPropertyName = "End point";
                endPointDistinction.Status = "The end points of the two walls do not match.";
endPointDistinction.OldValue = "X = " + templateModelWall.EndPoint.X + " | Y = " + templateModelWall.EndPoint.Y + " | Z = " + templateModelWall.EndPoint.Z;
endPointDistinction.NewValue = "X = " + model2Wall.EndPoint.X + " | Y = " + model2Wall.EndPoint.Y + " | Z = " + model2Wall.EndPoint.Z;
Distinctions.Add(endPointDistinction);
}
            //Compare all properties of these walls
for(int i = 0; i < templateModelWall.AllProperties.Count; i++)
{
distinction.WallGlobalId = templateModelWall.GlobalId;
CompareProperties(templateModelWall.AllProperties[i], model2Wall.AllProperties[i], distinction,"Wall");
}
            //Compare the windows and doors of these walls
CompareAllDoors(templateModelWall.AllDoors, model2Wall.AllDoors, distinction);
CompareAllWindows(templateModelWall.AllWindows, model2Wall.AllWindows, distinction);
}
}
    /// <summary>
    /// Comparison at the property level; if the hash values do not match, the change is added to Distinctions
    /// </summary>
public void CompareProperties(dynamic templateModelProperty, dynamic model2Property, Distinction tempDistinction, string type)
{
var distinction = tempDistinction.Copy();
if(templateModelProperty.PropertyHash != model2Property.PropertyHash && type == "Wall")
{
distinction.WallPropertyName = templateModelProperty.PropertyName;
distinction.Status = "Property: " + templateModelProperty.PropertyName + " hat sich in der aktuellen Version verändert.";
distinction.OldValue = templateModelProperty.PropertyValue;
distinction.NewValue = model2Property.PropertyValue;
Distinctions.Add(distinction);
}
else if (templateModelProperty.PropertyHash != model2Property.PropertyHash && type == "notWall")
{
distinction.WallElementPropertyName = templateModelProperty.PropertyName;
distinction.Status = "Property: " + templateModelProperty.PropertyName + " hat sich in der aktuellen Version verändert.";
distinction.OldValue = templateModelProperty.PropertyValue;
distinction.NewValue = model2Property.PropertyValue;
Distinctions.Add(distinction);
}
}
    /// <summary>
    /// Detect doors that were deleted or newly added. Doors with the same id in both models are then compared
    /// </summary>
public void CompareAllDoors(dynamic templateModelAllDoors, dynamic model2AllDoors,Distinction tempDistinction)
{
var distinction = tempDistinction.Copy();
var templateModelDoorGlobalIds = new List<string>();
var model2DoorGlobalIds = new List<string>();
foreach (var item in templateModelAllDoors)
{
templateModelDoorGlobalIds.Add(item.GlobalId.ToString());
}
foreach (var item in model2AllDoors)
{
model2DoorGlobalIds.Add(item.GlobalId.ToString());
}
var missingDoors1 = templateModelDoorGlobalIds.Except(model2DoorGlobalIds).ToList();
foreach (var item in missingDoors1)
{
var temp = distinction.Copy();
temp.WallElementGlobalId = item;
temp.WallElement = "Tür";
temp.Status = "Diese Tür existiert nicht mehr im aktuellen Gebäudemodell.";
Distinctions.Add(temp);
}
var missingDoors2 = model2DoorGlobalIds.Except(templateModelDoorGlobalIds).ToList();
foreach (var item in missingDoors2)
{
var temp = distinction.Copy();
temp.WallElementGlobalId = item;
            temp.WallElement = "Door";
            temp.Status = "This door exists only in the current building model.";
Distinctions.Add(temp);
}
foreach (var item in templateModelAllDoors)
{
foreach (var element in model2AllDoors)
{
if (item.GlobalId == element.GlobalId)
{
CompareDoors(item, element, distinction);
}
}
}
}
    /// <summary>
    /// Comparison at the door level; if the hash values do not match, the change is added to Distinctions
    /// </summary>
public void CompareDoors(dynamic templateModelDoor, dynamic model2Door, Distinction tempDistinction)
{
var distinction = tempDistinction.Copy();
if(templateModelDoor.DoorHash != model2Door.DoorHash)
{
            distinction.WallElement = "Door";
distinction.WallElementGlobalId = templateModelDoor.GlobalId;
if (templateModelDoor.StartPoint.PointHash != model2Door.StartPoint.PointHash)
{
Distinction startPointDistinction = distinction.Copy();
                startPointDistinction.WallElementPropertyName = "Start point";
                startPointDistinction.Status = "The start points of the two doors do not match.";
startPointDistinction.OldValue = "X = " + templateModelDoor.StartPoint.X + " | Y = " + templateModelDoor.StartPoint.Y + " | Z = " + templateModelDoor.StartPoint.Z;
startPointDistinction.NewValue = "X = " + model2Door.StartPoint.X + " | Y = " + model2Door.StartPoint.Y + " | Z = " + model2Door.StartPoint.Z;
Distinctions.Add(startPointDistinction);
}
for (int i = 0; i < templateModelDoor.AllProperties.Count; i++)
{
CompareProperties(templateModelDoor.AllProperties[i], model2Door.AllProperties[i], distinction,"notWall");
}
}
}
    /// <summary>
    /// Detect windows that were deleted or newly added. Windows with the same id in both models are then compared
    /// </summary>
public void CompareAllWindows(dynamic templateModelAllWindows, dynamic model2AllWindows, Distinction tempDistinction)
{
var distinction = tempDistinction.Copy();
var templateModelWindowGlobalIds = new List<string>();
var model2WindowGlobalIds = new List<string>();
foreach (var item in templateModelAllWindows)
{
templateModelWindowGlobalIds.Add(item.GlobalId.ToString());
}
foreach (var item in model2AllWindows)
{
model2WindowGlobalIds.Add(item.GlobalId.ToString());
}
var missingWindows1 = templateModelWindowGlobalIds.Except(model2WindowGlobalIds).ToList();
foreach (var item in missingWindows1)
{
var temp = distinction.Copy();
temp.WallElementGlobalId = item;
            temp.WallElement = "Window";
            temp.Status = "This window no longer exists in the current building model.";
Distinctions.Add(temp);
}
var missingWindows2 = model2WindowGlobalIds.Except(templateModelWindowGlobalIds).ToList();
foreach (var item in missingWindows2)
{
var temp = distinction.Copy();
temp.WallElementGlobalId = item;
            temp.WallElement = "Window";
            temp.Status = "This window exists only in the current building model.";
Distinctions.Add(temp);
}
foreach (var item in templateModelAllWindows)
{
foreach (var element in model2AllWindows)
{
if (item.GlobalId == element.GlobalId)
{
CompareWindows(item, element, distinction);
}
}
}
}
    /// <summary>
    /// Comparison at the window level; if the hash values do not match, the change is added to Distinctions
    /// </summary>
public void CompareWindows(dynamic templateModelWindow, dynamic model2Window, Distinction tempDistinction)
{
var distinction = tempDistinction.Copy();
if (templateModelWindow.WindowHash != model2Window.WindowHash)
{
            distinction.WallElement = "Window";
distinction.WallElementGlobalId = templateModelWindow.GlobalId;
if (templateModelWindow.StartPoint.PointHash != model2Window.StartPoint.PointHash)
{
Distinction startPointDistinction = distinction.Copy();
                startPointDistinction.WallElementPropertyName = "Start point";
                startPointDistinction.Status = "The start points of the two windows do not match.";
startPointDistinction.OldValue = "X = " + templateModelWindow.StartPoint.X + " | Y = " + templateModelWindow.StartPoint.Y + " | Z = " + templateModelWindow.StartPoint.Z;
startPointDistinction.NewValue = "X = " + model2Window.StartPoint.X + " | Y = " + model2Window.StartPoint.Y + " | Z = " + model2Window.StartPoint.Z;
Distinctions.Add(startPointDistinction);
}
for (int i = 0; i < templateModelWindow.AllProperties.Count; i++)
{
CompareProperties(templateModelWindow.AllProperties[i], model2Window.AllProperties[i], distinction,"notWall");
}
}
}
}
}
<file_sep>using System;
using System.Security.Cryptography;
namespace DigitalerPlanstempel.Utility
{
    /// <summary>
    /// A confirmed examination is stored in this class.
    /// </summary>
public class Permission
{
public string ExaminerName { get; private set; }
public string ModelName { get; private set; }
public string ModelFile { get; private set; }
public string ModelTemplate { get; private set; }
public string StoreyRestriction { get; private set; }
public string ExaminationRestriction { get; private set; }
public byte[] ModelSignature { get; private set; }
public DateTime PermissionDate { get; private set; }
private RSACryptoServiceProvider Rsa;
public Permission(string examiner, string modelName, string modelTemplate, string file, string storeyRestriction, string examinationRestriction)
{
ExaminerName = examiner;
ModelName = modelName;
ModelFile = file;
StoreyRestriction = storeyRestriction;
ExaminationRestriction = examinationRestriction;
ModelTemplate = modelTemplate;
PermissionDate = DateTime.Now;
byte[] hashValue = HashPermissionValues();
var signature = Signature.CreateSignature(hashValue);
ModelSignature = signature.Item1;
Rsa = signature.Item2;
}
    /// <summary>
    /// Verify the digital signature
    /// </summary>
public bool VerifySignature()
{
byte[] hashValue = HashPermissionValues();
var result = Signature.VerifySignature(hashValue, ModelSignature, Rsa);
return result;
}
    /// <summary>
    /// Create the hash value from the examiner name, model name, model file, template and the time of the examination.
    /// </summary>
public byte[] HashPermissionValues()
{
string modelInfo = ExaminerName + ModelName + ModelFile + ModelTemplate + PermissionDate.ToString();
byte[] hashValue;
using (SHA256 sha256 = SHA256.Create())
{
hashValue = Hashing.GetHashBytes(sha256, modelInfo);
}
return hashValue;
}
}
}
<file_sep>using System;
using System.Security.Cryptography;
namespace DigitalerPlanstempel.Utility
{
public static class Signature
{
        //The following functions were adapted from https://docs.microsoft.com/de-de/dotnet/standard/security/cryptographic-signatures
public static (byte[], RSACryptoServiceProvider) CreateSignature(byte[] hashValue)
{
//The value to hold the signed value.
byte[] signedHashValue;
//Generate a public/private key pair.
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider();
//Create an RSAPKCS1SignatureFormatter object and pass it the
//RSACryptoServiceProvider to transfer the private key.
RSAPKCS1SignatureFormatter rsaFormatter = new RSAPKCS1SignatureFormatter(rsa);
        //Set the hash algorithm to SHA256.
rsaFormatter.SetHashAlgorithm("SHA256");
//Create a signature for hashValue and assign it to
//signedHashValue.
signedHashValue = rsaFormatter.CreateSignature(hashValue);
return (signedHashValue,rsa);
}
public static bool VerifySignature(byte[] hashValue,byte[] signedHashValue, RSACryptoServiceProvider rsa)
{
RSAPKCS1SignatureDeformatter rsaDeformatter = new RSAPKCS1SignatureDeformatter(rsa);
rsaDeformatter.SetHashAlgorithm("SHA256");
if (rsaDeformatter.VerifySignature(hashValue, signedHashValue))
{
Console.WriteLine("The signature is valid.");
return true;
}
else
{
Console.WriteLine("The signature is not valid.");
return false;
}
}
}
}
| 8d08d32a92bc3d734fa8149e42b17dc6e2c3fa4c | [
"C#"
]
| 18 | C# | kilianSieben/DigitalerPlanstempel | b97221ddfff23ad54b43378d4c5dead15ac43a3e | 005a4cb7b939044cd45a77c3261932869a9add4a |
refs/heads/main | <repo_name>AaronBison/TejUtRendszer<file_sep>/app/src/main/java/com/example/tejutrendszer/adapters/BottleItemAdapter.kt
package com.example.tejutrendszer.adapters
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.TextView
import androidx.recyclerview.widget.RecyclerView
import com.example.tejutrendszer.R
import com.example.tejutrendszer.data.Bottle
import kotlinx.android.synthetic.main.bottles_item.view.*
import kotlinx.android.synthetic.main.customer_item.view.*
class BottleItemAdapter(private val bottleItemList: List<Bottle>): RecyclerView.Adapter<BottleItemAdapter.BottleItemViewHolder>() {
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): BottleItemViewHolder {
val itemView = LayoutInflater.from(parent.context).inflate(R.layout.bottles_item, parent, false)
return BottleItemViewHolder(itemView)
}
override fun onBindViewHolder(holder: BottleItemViewHolder, position: Int) {
val currentItem = bottleItemList[position]
holder.bottleName.text = currentItem.bottleName
holder.bottleCount.text = currentItem.bottleCount
}
override fun getItemCount() = bottleItemList.size
class BottleItemViewHolder(itemView: View) : RecyclerView.ViewHolder(itemView) {
val bottleName: TextView = itemView.bottle_name
val bottleCount: TextView = itemView.bottle_count
}
}<file_sep>/app/src/main/java/com/example/tejutrendszer/ui/viewmodels/DashboardViewModel.kt
package com.example.tejutrendszer.ui.viewmodels
import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel
import com.example.tejutrendszer.data.Customer
import com.example.tejutrendszer.data.CustomerRepository
class DashboardViewModel(private val customerRepository: CustomerRepository) : ViewModel() {
fun getCustomers() = customerRepository.getCustomer()
fun addCustomers(customer: Customer) = customerRepository.addCustomer(customer)
private val _text = MutableLiveData<String>().apply {
value = "This is dashboard Fragment"
}
val text: LiveData<String> = _text
}<file_sep>/app/src/main/java/com/example/tejutrendszer/data/CustomerRepository.kt
package com.example.tejutrendszer.data
class CustomerRepository private constructor(private val customerDao: CustomerDao){
fun addCustomer(customer: Customer){
customerDao.addCustomer(customer)
}
fun getCustomer() = customerDao.getCustomers()
companion object{
@Volatile private var instance: CustomerRepository ?= null
fun getInstance(customerDao: CustomerDao) =
instance ?: synchronized(this){
instance ?: CustomerRepository(customerDao).also { instance = it }
}
}
}<file_sep>/app/src/main/java/com/example/tejutrendszer/ui/views/DashboardFragment.kt
package com.example.tejutrendszer.ui.views
import android.os.Bundle
import android.util.Log
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import androidx.core.view.isVisible
import androidx.fragment.app.Fragment
import androidx.recyclerview.widget.DefaultItemAnimator
import androidx.recyclerview.widget.LinearLayoutManager
import com.example.tejutrendszer.R
import com.example.tejutrendszer.adapters.CustomerItemAdapter
import com.example.tejutrendszer.data.Customer
import kotlinx.android.synthetic.main.fragment_dashboard.*
import java.io.*
import java.text.SimpleDateFormat
import java.util.*
import kotlin.collections.ArrayList
class DashboardFragment : Fragment() {
private var customerList : List<Customer> = listOf()
private val generalDateFormat = SimpleDateFormat("yyyy MMMM dd (EEEE)")
private val todayDateFormat = SimpleDateFormat("dd")
override fun onCreateView(
inflater: LayoutInflater, container: ViewGroup?,
savedInstanceState: Bundle?
): View? {
customerList = readCustomersFromCSV()
return inflater.inflate(R.layout.fragment_dashboard, container, false)
}
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
date_indicator.text = generalDateFormat.format(Date()).toString()
if(customerList.isNotEmpty()){
recycler_view_dashboard.hasFixedSize()
recycler_view_dashboard.layoutManager = LinearLayoutManager(context)
recycler_view_dashboard.itemAnimator = DefaultItemAnimator()
recycler_view_dashboard.adapter = CustomerItemAdapter(customerList)
}else{
text_empty_rv.isVisible = true
}
super.onViewCreated(view, savedInstanceState)
}
private fun readCustomersFromCSV(): List<Customer>{
val listOfCustomers: MutableList<Customer> = mutableListOf()
// Open input file
val inputStream = resources.openRawResource(R.raw.customers)
val reader = BufferedReader(inputStream.bufferedReader())
val iterator = reader.lineSequence().iterator()
while (iterator.hasNext()){
val currentLine = iterator.next()
            // Split by "*"
val tokens = currentLine.split("*").toTypedArray()
val today = todayDateFormat.format(Date()).toInt()
Log.w("today.toString()",today.toString())
Log.w("tokens[today]",tokens[today])
if(tokens[today] != ""){
val customerName = tokens[0]
val customerLiter = tokens[today]
val customerLiterThisMonth = tokens[tokens.size-2]
val customerDept = tokens[tokens.size-1]
val customer = Customer(customerName,customerDept,customerLiter)
Log.e("WTF",listOfCustomers.toString())
listOfCustomers.add(customer)
}
}
Log.w("TAG", listOfCustomers.toString())
reader.close()
return listOfCustomers
}
private fun generateDummyList(size: Int): List<Customer> {
val list = ArrayList<Customer>()
        for (i in 0 until size){
val item = Customer("Name $i", "130 lej", "2 l")
list += item
}
return list
}
}<file_sep>/app/src/main/java/com/example/tejutrendszer/data/CustomerDao.kt
package com.example.tejutrendszer.data
import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
class CustomerDao {
private val customerList = mutableListOf<Customer>()
private val customers = MutableLiveData<List<Customer>>()
init {
customers.value = customerList
}
fun addCustomer(customer: Customer){
customerList.add(customer)
customers.value = customerList
}
fun getCustomers() = customers as LiveData<List<Customer>>
}<file_sep>/settings.gradle
rootProject.name = "TejUtRendszer"
include ':app'
<file_sep>/app/src/main/java/com/example/tejutrendszer/data/Customer.kt
package com.example.tejutrendszer.data
data class Customer(val customerName: String, val customerDebt: String, val customerLiter: String) {
override fun toString(): String {
return "$customerName - $customerDebt lei - $customerLiter l"
}
}<file_sep>/app/src/main/java/com/example/tejutrendszer/ui/views/HomeFragment.kt
package com.example.tejutrendszer.ui.views
import android.os.Bundle
import android.util.Log
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import androidx.fragment.app.Fragment
import androidx.recyclerview.widget.DefaultItemAnimator
import androidx.recyclerview.widget.LinearLayoutManager
import com.example.tejutrendszer.R
import com.example.tejutrendszer.adapters.BottleItemAdapter
import com.example.tejutrendszer.data.Bottle
import kotlinx.android.synthetic.main.fragment_home.*
import java.io.BufferedReader
import java.text.SimpleDateFormat
import java.util.*
class HomeFragment : Fragment() {
private var bottleList : List<Bottle> = listOf()
private val todayDateFormat = SimpleDateFormat("dd")
override fun onCreateView(
inflater: LayoutInflater, container: ViewGroup?,
savedInstanceState: Bundle?
): View? {
bottleList = readBottlesFromCSV()
return inflater.inflate(R.layout.fragment_home, container, false)
}
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
recycler_view_home.hasFixedSize()
recycler_view_home.layoutManager = LinearLayoutManager(context)
recycler_view_home.itemAnimator = DefaultItemAnimator()
recycler_view_home.adapter = BottleItemAdapter(bottleList)
super.onViewCreated(view, savedInstanceState)
}
private fun readBottlesFromCSV(): List<Bottle>{
val listOfBottles: MutableList<Bottle> = mutableListOf()
// Open input file
val inputStream = resources.openRawResource(R.raw.summary)
val reader = BufferedReader(inputStream.bufferedReader())
val iterator = reader.lineSequence().iterator()
val today = todayDateFormat.format(Date()).toInt()
while (iterator.hasNext()){
val currentLine = iterator.next()
            // Split by "*"
val tokens = currentLine.split("*").toTypedArray()
if (tokens[today] != ""){
listOfBottles.add(Bottle(tokens[0],tokens[today]))
}else{
listOfBottles.add(Bottle(tokens[0],"0"))
}
}
return listOfBottles
}
}<file_sep>/app/src/main/java/com/example/tejutrendszer/adapters/CustomerItemAdapter.kt
package com.example.tejutrendszer.adapters
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.TextView
import androidx.recyclerview.widget.RecyclerView
import com.example.tejutrendszer.R
import com.example.tejutrendszer.data.Customer
import kotlinx.android.synthetic.main.customer_item.view.*
class CustomerItemAdapter(private val customerItemList: List<Customer>) : RecyclerView.Adapter<CustomerItemAdapter.CustomerItemViewHolder>() {
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): CustomerItemViewHolder {
val itemView = LayoutInflater.from(parent.context).inflate(R.layout.customer_item, parent, false)
return CustomerItemViewHolder(itemView)
}
override fun onBindViewHolder(holder: CustomerItemViewHolder, position: Int) {
val currentItem = customerItemList[position]
holder.customerName.text = currentItem.customerName
holder.customerDebt.text = currentItem.customerDebt
holder.customerLiter.text = currentItem.customerLiter
}
override fun getItemCount() = customerItemList.size
class CustomerItemViewHolder(itemView: View) : RecyclerView.ViewHolder(itemView) {
val customerName: TextView = itemView.customer_name
val customerDebt: TextView = itemView.customer_debt
val customerLiter: TextView = itemView.customer_liter
}
}<file_sep>/app/src/main/java/com/example/tejutrendszer/data/Bottle.kt
package com.example.tejutrendszer.data
data class Bottle(val bottleName: String, val bottleCount: String) {
override fun toString(): String {
return "$bottleName - $bottleCount db"
}
}<file_sep>/app/src/main/java/com/example/tejutrendszer/ui/MainActivity.kt
package com.example.tejutrendszer.ui
import android.os.Bundle
import android.view.View
import com.google.android.material.bottomnavigation.BottomNavigationView
import androidx.appcompat.app.AppCompatActivity
import androidx.navigation.findNavController
import androidx.navigation.ui.AppBarConfiguration
import androidx.navigation.ui.setupActionBarWithNavController
import androidx.navigation.ui.setupWithNavController
import com.example.tejutrendszer.R
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
// Hide top action bar
window.decorView.systemUiVisibility = View.SYSTEM_UI_FLAG_FULLSCREEN;
supportActionBar?.hide()
// Bottom navigation setup
val navView: BottomNavigationView = findViewById(R.id.nav_view)
val navController = findNavController(R.id.nav_host_fragment_activity_main)
val appBarConfiguration = AppBarConfiguration(
setOf(
R.id.navigation_home, R.id.navigation_dashboard, R.id.navigation_notifications
)
)
setupActionBarWithNavController(navController, appBarConfiguration)
navView.setupWithNavController(navController)
}
} | 7670b2486509af2658928449d39e2f3594058d7b | [
"Kotlin",
"Gradle"
]
| 11 | Kotlin | AaronBison/TejUtRendszer | ff1d180533fd0e1aedd0311bc1ef011f345c8c70 | 9c80dad8934f1f15c8de52b59103d37732dd68d7 |
refs/heads/master | <repo_name>ankutalev/16208_kutalev<file_sep>/Befunge/MoveRight.java
public class MoveRight implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.move(Direction.RIGHT);
}
}
<file_sep>/gui/controller.cpp
#include "controller.h"
#include <fstream>
#include <iostream>
#include "mainwindow.h"
#include <QtCore/QCoreApplication>
controller* controller::get_instance() {
static controller C;
return &C;
}
void controller::setModel(model* a) { simulator = a; }
void controller::setView(MainWindow& window, int argc, char** argv) {
argcc = argc;
argvv = argv;
view = &window;
start = false;
}
void controller::getChangedCell(const std::string& Name, int Address,
bool isDanger) {
view->controllerCommand(Warriors[Name], Address, isDanger);
}
bool controller::startDrawing() { return controller::start; }
void controller::warriorRegister(const std::string& name, int Pos) {
Warriors.insert(std::pair<std::string, int>(name, Pos));
}
void controller::signalToStart(int wnum, std::string* files, int size) {
controller::start = true;
try {
simulator->load(wnum, files, size);
} catch (std::exception& t) {
controller::start = false;
view->filesError(t.what());
}
if (controller::start) {
msize = size;
this->wnum = wnum;
mfiles = files;
view->reOrganize();
simulator->startGame();
}
}
void controller::checkStartOptions(int warNum, std::string* files, int size,
int separation) {
simulator->setRange(separation);
simulator->setWNumbers(warNum);
try {
simulator->setSize(size);
} catch (std::runtime_error& x) {
view->sizeError(x.what());
return;
}
for (int i = 0; i < warNum; i++) {
std::ifstream checkFile(files[i]);
if (!checkFile.is_open()) {
view->filesOpenError(files[i]);
return;
}
}
signalToStart(warNum, files, size);
}
std::string controller::getWName(int pos) {
if (pos == -1) return "this is MARS base field";
  for (auto& it : Warriors)
    if (it.second == pos) return it.first;
  return "unknown warrior";
}
std::string controller::getCommandName(int pos) {
return simulator->getInstructionName(pos);
}
void controller::warEnd(const std::string& win) { view->rezults(win); }
void controller::repeat() {
Warriors.clear();
simulator->coreClear();
simulator->startGame();
}
<file_sep>/Befunge/PrintTopAsInt.java
public class PrintTopAsInt implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.printTopAsInt();
}
}
<file_sep>/Befunge/StackSwap.java
public class StackSwap implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.swap();
}
}
<file_sep>/Befunge/MoveRandom.java
public class MoveRandom implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.move(Direction.RANDOM);
}
}
<file_sep>/core_war/nop_command.cpp
#include "factory.hpp"
class Nop_command : public Instruction {
public:
explicit Nop_command (Modifiers x) { Fields.OpcodeMod = x; Name = "NOP"; }
Nop_command() = default;
void Execution(ExecutionContext& Executor) override {
std::cout<<Executor.getCurrentWarriorName()<< " have a nice cat! <^-^>/ meow!!"<<std::endl;
}
Nop_command* Clone() override {
return new Nop_command(*this);
}
};
namespace {
Instruction* nopab() {return new Nop_command(Modifiers::AB);}
Instruction* nopba() {return new Nop_command(Modifiers::BA);}
Instruction* nopta() {return new Nop_command(Modifiers::A);}
Instruction* noptb() {return new Nop_command(Modifiers::B);}
Instruction* noptf() {return new Nop_command(Modifiers::F);}
Instruction* nopx() {return new Nop_command(Modifiers::X);}
Instruction* nopi() {return new Nop_command(Modifiers::I);}
bool a = Factory::get_instance()->regist3r("NOP.AB",&nopab);
bool b = Factory::get_instance()->regist3r("NOP.BA",&nopba);
bool c = Factory::get_instance()->regist3r("NOP.A",&nopta);
bool d = Factory::get_instance()->regist3r("NOP.B",&noptb);
bool f = Factory::get_instance()->regist3r("NOP.F",&noptf);
bool e = Factory::get_instance()->regist3r("NOP.X",&nopx);
bool g = Factory::get_instance()->regist3r("NOP.I",&nopi);
bool w = Factory::get_instance()->nameRegister("NOP","NOP");
}<file_sep>/darkest_dungeon/Ability.java
import java.util.ArrayList;
public abstract class Ability {
abstract public void choseTargets(BattleContext field);
abstract void using (BattleContext field);
abstract boolean isLegal(BattleContext field);
public String getName() {
return name;
}
abstract void upgrade();
protected byte level = 0;
protected boolean isPurchased = false;
protected boolean isActive = false;
protected void setName(String name) {
this.name = name;
}
private String name;
protected ArrayList<Unit> targets = new ArrayList<>();
protected Unit user;
//protected
}
<file_sep>/core_war/warrior.hpp
//
//
#ifndef CORE_WAR_WARRIOR_HPP
#define CORE_WAR_WARRIOR_HPP
class Warrior;
#include "fstream"
#include "instructions.hpp"
#include "memory"
class Warrior {
friend class Loader;
public:
void Born(char[],size_t);
private:
int FlowCounter=1;
std::string Author;
std::string Name;
std::vector<std::shared_ptr<Instruction> > Instructions;
int StartPosition=0;
};
#endif //CORE_WAR_WARRIOR_HPP
<file_sep>/chess/src/Knight.java
public class Knight extends Figure {
Knight(Side side) {
super(side);
}
@Override
public String getName() {
return "[" + side.name() + " K]";
}
@Override
protected void getListOfAvailiblePosition(ChessBoard board, int i, int j) {
celllLooking(board,i+2,j+1);
celllLooking(board, i+2, j-1);
celllLooking(board, i-2, j+1);
celllLooking(board,i-2,j-1);
}
}
<file_sep>/gui/mainwindow.cpp
#include "mainwindow.h"
#include "ui_mainwindow.h"
//#include "/home/ulyssess/qtProjects/core_war/renderarea.h"
#include <info.h>
#include <winners.h>
#include <QDesktopWidget>
#include <QDialogButtonBox>
#include <QMessageBox>
#include <QScrollBar>
#include <iostream>
#include "controller.h"
MainWindow::MainWindow(QWidget* parent)
: QMainWindow(parent), ui(new Ui::MainWindow) {
ui->setupUi(this);
controller::get_instance()->setView(*this, 0, nullptr);
ui->speedDown->hide();
ui->speedUp->hide();
connect(ui->speedDown, SIGNAL(released()), ui->rArea, SLOT(speedDownClick()));
connect(ui->speedUp, SIGNAL(released()), ui->rArea, SLOT(speedUpClick()));
filesErrors[0] = ui->error1;
filesErrors[1] = ui->error2;
filesErrors[2] = ui->error3;
filesErrors[3] = ui->error4;
filesErrors[4] = ui->error5;
filesErrors[5] = ui->error6;
// controller::setView(*this,0,nullptr);
// parent
// ui->rArea = new RenderArea;
ui->gridLayout->setAlignment(Qt::AlignRight);
// ui->pushButton_2->hide();
ui->lineEdit_13->hide();
ui->lineEdit_14->hide();
ui->lineEdit_15->hide();
ui->lineEdit_11->hide();
ui->label_20->hide();
ui->label_21->hide();
ui->label_22->hide();
ui->label_23->hide();
ui->error1->setStyleSheet("color: rgb(100, 0, 0)");
ui->error2->setStyleSheet("color: rgb(100, 0, 0)");
ui->error3->setStyleSheet("color: rgb(100, 0, 0)");
ui->error4->setStyleSheet("color: rgb(100, 0, 0)");
ui->error5->setStyleSheet("color: rgb(100, 0, 0)");
ui->error6->setStyleSheet("color: rgb(100, 0, 0)");
ui->error7->setStyleSheet("color: rgb(100, 0, 0)");
ui->error8->setStyleSheet("color: rgb(100, 0, 0)");
ui->error1->hide();
ui->error2->hide();
ui->error3->hide();
ui->error4->hide();
ui->error5->hide();
ui->error6->hide();
ui->error7->hide();
ui->error8->hide();
ui->lineEdit_3->setValidator(new QIntValidator);
ui->lineEdit_4->setValidator(new QIntValidator);
}
MainWindow::~MainWindow() { delete ui; }
void MainWindow::on_comboBox_activated(int index) {
switch (index) {
case (4):
ui->lineEdit_13->show();
ui->lineEdit_14->show();
ui->lineEdit_15->show();
ui->lineEdit_11->show();
ui->label_20->show();
ui->label_21->show();
ui->label_22->show();
ui->label_23->show();
break;
case (3):
std::cout << "5?";
ui->lineEdit_13->show();
ui->lineEdit_14->show();
ui->lineEdit_15->show();
ui->lineEdit_11->hide();
ui->label_20->show();
ui->label_21->show();
ui->label_22->show();
ui->label_23->hide();
break;
case (2):
std::cout << "4?";
ui->lineEdit_13->show();
ui->lineEdit_14->show();
ui->lineEdit_15->hide();
ui->lineEdit_11->hide();
ui->label_20->show();
ui->label_21->show();
ui->label_22->hide();
ui->label_23->hide();
break;
case (1):
ui->lineEdit_13->show();
ui->lineEdit_14->hide();
ui->lineEdit_15->hide();
ui->lineEdit_11->hide();
ui->label_20->show();
ui->label_21->hide();
ui->label_22->hide();
ui->label_23->hide();
std::cout << "3?";
break;
case (0):
ui->lineEdit_13->hide();
ui->lineEdit_14->hide();
ui->lineEdit_15->hide();
ui->lineEdit_11->hide();
ui->label_20->hide();
ui->label_21->hide();
ui->label_22->hide();
ui->label_23->hide();
}
for (int i = 0; i < 6; i++) filesErrors[i]->hide();
}
void MainWindow::on_checkBox_2_clicked() {
ui->checkBox->setChecked(false);
ui->comboBox->setEnabled(true);
on_comboBox_activated(0);
ui->comboBox->setCurrentIndex(0);
ui->lineEdit_3->setEnabled(true);
ui->lineEdit_4->setEnabled(true);
}
void MainWindow::sizeError(const std::string& message) {
  std::string x =
      "Size must be larger for this input! For these settings you need at least ";
  x += message;
  QMessageBox::information(this, "Error", x.c_str());
}
void MainWindow::separationError() {
  ui->error8->setText(
      "Invalid separation range! It must be between 1 and size - 1");
ui->error8->show();
}
void MainWindow::on_checkBox_clicked() {
ui->checkBox_2->setChecked(false);
on_comboBox_activated(0);
ui->comboBox->setCurrentIndex(0);
ui->comboBox->setEnabled(false);
ui->lineEdit_3->setEnabled(false);
ui->lineEdit_3->setText("8192");
ui->lineEdit_4->setEnabled(false);
ui->lineEdit_4->setText("300");
}
void MainWindow::on_pushButton_clicked() {
int wNum = ui->comboBox->currentText().toInt();
std::string files[6];
files[0] = ui->lineEdit->text().toStdString();
files[1] = ui->lineEdit_2->text().toStdString();
files[2] = ui->lineEdit_13->text().toStdString();
files[3] = ui->lineEdit_14->text().toStdString();
files[4] = ui->lineEdit_15->text().toStdString();
files[5] = ui->lineEdit_11->text().toStdString();
ui->rArea->setSize(ui->lineEdit_3->text().toInt());
controller::get_instance()->checkStartOptions(wNum, files,
ui->lineEdit_3->text().toInt(),
ui->lineEdit_4->text().toInt());
}
void MainWindow::reOrganize() {
ui->pushButton->hide();
ui->gridWidget->hide();
ui->speedDown->show();
ui->speedUp->show();
ui->rArea->setMinimumSize(ui->rArea->getAreaWidth(),
ui->rArea->getAreaHeight());
}
void MainWindow::off() {
exit(0);
}
void MainWindow::filesError(const char* message) {
// std::cout<<"dorou";
// std::string x = "Size must be larger for this input! To you settings at
// least ";
// x+= message;
QMessageBox::information(this, "Error", message);
}
void MainWindow::filesOpenError(const std::string& file) {
std::string x = "Can't open this file: ";
x += file;
QMessageBox::information(this, "Error", x.c_str());
}
void MainWindow::controllerCommand(int Warrior, int Address, bool isDanger) {
ui->rArea->ControlCommand(Warrior, Address, isDanger);
}
void MainWindow::rezults(const std::string& win) {
ui->rArea->repaint();
winners* rez = new winners;
rez->setWindowTitle("Results");
rez->getWinner(win);
QRect rect = QApplication::desktop()->availableGeometry(rez);
rez->move(rect.width() / 2 - rez->width() / 2,
rect.height() / 2 - rez->height() / 2);
QMessageBox info;
info.setDefaultButton(QMessageBox::Ok);
QPushButton* repeat = info.addButton(
QString::fromStdString("Repeat battle with those warriors"),
QMessageBox::NoRole);
QPushButton* leave = info.addButton(
QString::fromStdString("Leave this Vietnam"), QMessageBox::YesRole);
QPushButton* newgame = info.addButton(
QString::fromStdString("Start new game with new warriors and settings"),
QMessageBox::AcceptRole);
info.setWindowFlags(Qt::CustomizeWindowHint | Qt::WindowTitleHint);
info.setWindowTitle("Rezults");
info.setDefaultButton(leave);
info.setText(QString::fromStdString(win));
info.exec();
if (info.clickedButton() == leave) exit(22);
if (info.clickedButton() == repeat) {
ui->rArea->tick();
controller::get_instance()->repeat();
}
if (info.clickedButton() == newgame) {
throw std::runtime_error("heh");
}
// QMessageBox::information(this, "Rezults", win.c_str());
// rez->show();
}
void MainWindow::on_actionExit_triggered() { exit(22); }
<file_sep>/torrent/src/TorrentFile.java
import java.util.List;
public class TorrentFile {
    // private String announceURL; // TODO: a tracker may be added someday
    TorrentFile(jBittorrentAPI.TorrentFile file) {
        this.comment = file.comment;
        this.fileName = file.saveAs;
        this.pieceLength = file.pieceLength;
        this.binaryHash = file.info_hash_as_binary;
        this.pieceHashValues = file.piece_hash_values_as_binary;
        this.pieceHashValuesAsHex = file.piece_hash_values_as_hex;
        this.pieceHashValuesAsUrl = file.piece_hash_values_as_url;
        this.fileLength = file.total_length;
        this.hashAsHexString = file.info_hash_as_hex;
    }
public String getFileName() {
return fileName;
}
public long getPieceLength() {
return pieceLength;
}
    public int getPiecesCount() {
        return pieceHashValues.size();
    }
    public String getPieceHashValueAsHex(int index) {
        return pieceHashValuesAsHex.get(index);
    }
    public byte[] getPieceHashValueAsBinary(int index) {
        return pieceHashValues.get(index);
    }
public String getHashAsHexString() {
return hashAsHexString;
}
public long getFileLength() {
return fileLength;
}
private String comment;
private String createdBy;
private long creationDate;
private String encoding;
private String fileName;
private long pieceLength;
/* In case of multiple files torrent, fileName is the name of a directory
* and name contains the path of the file to be saved in this directory
*
*/
private List name;
private List length;
private byte[] binaryHash;
private String hashAsHexString;
private String info_hash_as_url;
private long fileLength;
private List<byte[]> pieceHashValues;
private List<String> pieceHashValuesAsHex;
private List<String> pieceHashValuesAsUrl;
}
<file_sep>/gui/winners.h
#ifndef WINNERS_H
#define WINNERS_H
#include <QWidget>
#include <QLabel>
#include <QHBoxLayout>
#include <QPushButton>
class winners : public QWidget
{
Q_OBJECT
public:
explicit winners(QWidget *parent = 0);
signals:
public slots:
void getWinner(const std::string& win);
void exitVietname();
private:
QLabel* winner;
QPushButton* rematch;
QPushButton* newGame;
QPushButton* exit_;
QHBoxLayout* lay;
};
#endif // WINNERS_H
<file_sep>/core_war/ExecutionContext.hpp
//
//
#ifndef CORE_WAR_EXECUTIONCONTEXT_HPP
#define CORE_WAR_EXECUTIONCONTEXT_HPP
class ExecutionContext;
#include "mars.h"
class ExecutionContext {
public:
ExecutionContext(MARS &in, CircularBuffer &q, CircularBuffer::Iterator *it)
: Simulator(&in), Queue(&q), it(it) {}
void setA(int which, int what);
void setB(int which, int what);
int getA(int which) const;
int getB(int which) const;
int getCurrentAddress() const;
void SetOffsets(Data &in);
void SetOffset(Mods mod, int operand, int *off);
void DeleteFromQueue();
void AddToQueue(int);
void ChangeCommand(int from, int which);
void ForwardQueue();
void ChangeCurrentFlowAddress(int Address);
const std::string &getCurrentWarriorName() const;
bool isInstructionsEqual(int a, int b);
private:
MARS *Simulator = nullptr;
CircularBuffer *Queue = nullptr;
CircularBuffer::Iterator *it = nullptr;
};
#endif // CORE_WAR_EXECUTIONCONTEXT_HPP
<file_sep>/core_war/mov_command.cpp
#include "factory.hpp"
class Mov_command : public Instruction {
public:
  explicit Mov_command(Modifiers x) { Fields.OpcodeMod = x; Name = "MOV"; }
void Execution(ExecutionContext &Executor) override {
Executor.SetOffsets(Fields);
std::cout << Executor.getCurrentWarriorName()
<< " make some kind of mov from: " << Fields.AOffset
<< " to: " << Fields.BOffset << std::endl;
switch (Fields.OpcodeMod) {
case (Modifiers::A):
Executor.setA(Fields.BOffset, Executor.getA(Fields.AOffset));
break;
case (Modifiers::B):
Executor.setB(Fields.BOffset, Executor.getB(Fields.AOffset));
break;
case (Modifiers::AB):
Executor.setB(Fields.BOffset, Executor.getA(Fields.AOffset));
break;
case (Modifiers::BA):
Executor.setA(Fields.BOffset, Executor.getB(Fields.AOffset));
break;
case (Modifiers::F):
Executor.setA(Fields.BOffset, Executor.getA(Fields.AOffset));
Executor.setB(Fields.BOffset, Executor.getB(Fields.AOffset));
break;
case (Modifiers::X):
Executor.setB(Fields.BOffset, Executor.getA(Fields.AOffset));
Executor.setA(Fields.BOffset, Executor.getB(Fields.AOffset));
break;
case (Modifiers::Not):
case (Modifiers::I):
Executor.ChangeCommand(Fields.AOffset, Fields.BOffset);
}
Executor.ForwardQueue();
}
Instruction *Clone() override { return new Mov_command(*this); }
};
namespace {
Instruction *movab() { return new Mov_command(Modifiers::AB); }
Instruction *movba() { return new Mov_command(Modifiers::BA); }
Instruction *mova() { return new Mov_command(Modifiers::A); }
Instruction *movb() { return new Mov_command(Modifiers::B); }
Instruction *movf() { return new Mov_command(Modifiers::F); }
Instruction *movx() { return new Mov_command(Modifiers::X); }
Instruction *movi() { return new Mov_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("MOV.AB", &movab);
bool b = Factory::get_instance()->regist3r("MOV.BA", &movba);
bool c = Factory::get_instance()->regist3r("MOV.A", &mova);
bool d = Factory::get_instance()->regist3r("MOV.B", &movb);
bool f = Factory::get_instance()->regist3r("MOV.F", &movf);
bool e = Factory::get_instance()->regist3r("MOV.X", &movx);
bool g = Factory::get_instance()->regist3r("MOV.I", &movi);
bool w = Factory::get_instance()->nameRegister("MOV","MOV");
}<file_sep>/gui/model.cpp
#include "model.h"
model::model(QObject *parent) : QObject(parent) {}
void model::setSize(int size) { simulator.setSize(size); }
void model::setWNumbers(int number) { simulator.setNumberOfWarriors(number); }
void model::setRange(int range) { simulator.setMinSeparation(range); }
void model::load(int wnum, std::string *files, int size) {
Warrior x;
for (int i = 0; i < wnum; i++) {
Warriors.push_back(x);
Warriors[i].Born(files[i].c_str(), simulator.GetMaxProcess());
}
Loader::loadCommands(Warriors, simulator);
}
void model::startGame() { simulator.Battle(); }
std::string model::getInstructionName(int pos) {
return simulator.getInstructionName(pos);
}
void model::coreClear() {
simulator.coreClear();
Loader::loadCommands(Warriors, simulator);
}
<file_sep>/Befunge/Befunge_93.java
import java.util.*;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.BufferedReader;
import java.io.IOException;
public class Befunge_93 implements Befunge {
public void run() {
Command currentCommand;
Character currentSymbol = table[verticalIndex][horyzontalIndex];
int i = 0;
        while (isWorking) {
// System.out.printf("CURRENT SYMBOL AND HIS CODE %c %d %n",currentSymbol,(int)currentSymbol);
if (currentSymbol ==' ')
bf93env.move(currentDir);
else {
if (loadedCommands.containsKey(currentSymbol))
currentCommand = loadedCommands.get(currentSymbol);
else {
try {
currentCommand = Factory.getInstance().getCommand(currentSymbol.toString());
} catch (ClassNotFoundException exc) {
if ((int)currentSymbol!=0)
System.out.println(exc.getMessage());
return;
}
loadedCommands.put(currentSymbol, currentCommand);
}
currentCommand.execute(bf93env);
}
currentSymbol = table[verticalIndex][horyzontalIndex];
i++;
}
}
public void setupTable() throws IOException {
BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));
String commandLine;
System.out.println("BEFUNGE-93 CONSOLE INPUT MODE; HERE THE RULES:");
System.out.printf("1)MAXIMUM SYMBOLS IN ONE LINE - %d%n",height);
System.out.printf("2)MAXIMUM LINES - %d ",width);
System.out.println("(all lines after this limit will be ignored)");
System.out.println("3)TO STOP INPUT TYPE 'END' without quotes IN NEW(!!!) LINE ");
for (int i = 0 ; i < width;i++) {
            commandLine = reader.readLine();
            // readLine() returns null on end of input; treat that like "END"
            if (commandLine == null || commandLine.equals("END"))
                break;
if (commandLine.length() > height)
throw new IOException ("Dude,read rulez - that line was too long!");
System.arraycopy(commandLine.toCharArray(),0,table[i],0,commandLine.length());
}
}
Befunge_93() {
bf93env = new ExecutionContext();
loadedCommands = new HashMap <>();
Symbols = new Stack<> ();
table = new char[width][height];
for(int i =0; i< width;i++)
Arrays.fill(table[i],' ');
}
private BefungeEnvironment bf93env;
private HashMap<Character,Command> loadedCommands;
private Direction currentDir = Direction.RIGHT;
private Stack<Integer> Symbols;
private int horyzontalIndex = 0;
private int verticalIndex = 0;
private final int width = 25;
private final int height = 80;
private char table[][];
private boolean isWorking = true;
class ExecutionContext implements BefungeEnvironment {
@Override public void move (Direction dir) {
currentDir = dir;
/* System.out.printf("MY COORDINATS: %d %d %n",horyzontalIndex,verticalIndex);
if (horyzontalIndex==15& verticalIndex==1) {
System.out.println((int) table[verticalIndex][horyzontalIndex]);
System.out.println((int)' ');
}
if (table[horyzontalIndex][verticalIndex]==' ')
System.out.println("BIG ENOUGH AAAAAAAAAAAAA ");*/
switch (dir) {
case UP:
verticalIndex= (verticalIndex-1+ width) % width;
break;
case DOWN:
verticalIndex= (verticalIndex+1+ width) % width;
break;
case LEFT:
horyzontalIndex= (horyzontalIndex-1+ height) % height;
break;
case RIGHT:
horyzontalIndex= (horyzontalIndex+1+ height) % height;
break;
                case RANDOM:
                    // nextInt(4) covers all four movement directions; nextInt(3)
                    // could never pick the fourth one. Assumes UP, DOWN, LEFT,
                    // RIGHT are the first four values of the Direction enum.
                    Random yield = new Random(System.currentTimeMillis());
                    int rand = yield.nextInt(4);
                    move(Direction.values()[rand]);
}
}
@Override public void ifMove(Direction dir) {
int check = getTop();
currentDir = (check==0) ? dir.opposite() : dir;
move(currentDir);
}
@Override public void push(){
int x=0;
if(!Symbols.empty()) {
x = Symbols.peek();
Symbols.push(x);
}
move(currentDir);
}
@Override public void skip(){
move(currentDir);
move(currentDir);
}
@Override public void endProgram(){
isWorking=!isWorking;
}
@Override public void getNumber() {
Symbols.push(Character.digit(table[verticalIndex][horyzontalIndex],10));
move(currentDir);
}
@Override public void swap () {
int x = getTop(), y = getTop();
Symbols.push(x);
Symbols.push(y);
move(currentDir);
}
@Override public void deleteTop(){
getTop();
move(currentDir);
}
        @Override public void putOnStack() {
            int x = getTop(), y = getTop();
            char value = (char) getTop();
            // wrap coordinates the same way getFromStack does, so out-of-range
            // indices do not throw
            table[(x + width) % width][(y + height) % height] = value;
            move(currentDir);
        }
@Override public void getFromStack(){
int x=getTop(),y=getTop();
Symbols.push((int)table[(x+width)%width][(y+height)%height]);
move(currentDir);
}
@Override public void getStringOfASCI(){
do {
move(currentDir);
if (table[verticalIndex][horyzontalIndex]!='"')
Symbols.push((int)table[verticalIndex][horyzontalIndex]);
}while (table[verticalIndex][horyzontalIndex]!='"');
move(currentDir);
}
@Override public void arithOperations(Operations op) {
int x = getTop();
int y = getTop();
switch (op) {
case DIV:
if (x != 0)
Symbols.push(y / x);
else
Symbols.push(0);
break;
case MUL:
Symbols.push(x * y);
break;
case SUB:
Symbols.push(y - x);
break;
case SUM:
Symbols.push(x + y);
break;
case MOD:
Symbols.push(y % x);
break;
}
move(currentDir);
}
@Override public void stackRevert() {
int x =getTop();
if (x == 0)
Symbols.push(1);
else
Symbols.push(0);
move(currentDir);
}
@Override public void stackMoreThan() {
int x,y;
try {
x = Symbols.pop();
}
catch (EmptyStackException exc) {
Symbols.push(0);
move(currentDir);
return;
}
try {
y = Symbols.pop();
}
catch (EmptyStackException exc) {
Symbols.push(0);
move(currentDir);
return;
}
if (x < y) {
Symbols.push(x);
Symbols.push(y);
Symbols.push(1);
} else {
Symbols.push(x);
Symbols.push(y);
Symbols.push(0);
}
move(currentDir);
}
@Override public void askForNumber(){
System.out.println("Type a number!");
Scanner in = new Scanner(System.in);
int x;
try {
x = in.nextInt();
}
catch (InputMismatchException exc) {
System.out.println("I SAID TYPE A NUMBER,DUMB USER!!!");
isWorking=!isWorking;
return;
}
Symbols.push(x);
move(currentDir);
}
@Override public void askForSymbol(){
System.out.println("Type a symbol!");
Scanner in = new Scanner(System.in);
char x = in.next().charAt(0);
Symbols.push((int)(x));
move(currentDir);
}
@Override public void printTopAsInt() {
int x = getTop();
System.out.printf("%d ", x);
move(currentDir);
}
@Override public void printTopAsSymbol(){
int x;
try {
x = Symbols.pop();
}
catch ( EmptyStackException exc) {
// System.out.println(0);
move(currentDir);
return;
}
System.out.printf("%c",x);
move(currentDir);
}
private int getTop() {
int x = 0;
try {
x = Symbols.pop();
}
catch (EmptyStackException exc) {
// Symbols.push(0);
}
return x;
}
}
}
<file_sep>/gui/info.cpp
#include "info.h"
infoWindow::infoWindow(QWidget *parent) : QWidget(parent)
{
lay = new QVBoxLayout;
name = new QLabel;
command = new QLabel;
danger = new QLabel;
position = new QLabel;
lay->addWidget(name);
lay->addWidget(command);
lay->addWidget(danger);
lay->addWidget(position);
setLayout(lay);
}
infoWindow::~infoWindow() {
  // lay is owned by this widget after setLayout(), so Qt deletes it automatically
}
void infoWindow::setName(const std::string &nam) {
name->setText("Name:"+QString::fromStdString(nam));
}
void infoWindow::setCommand(const std::string &com) {
command->setText("Command:"+QString::fromStdString(com));
}
void infoWindow::setDanger(bool isdanger) {
(isdanger) ? danger->setText("DANGER!!11") : danger->setText("Not danger");
}
void infoWindow::setPosition(int x) {
position->setText("Position: " + QString::number(x));
}
<file_sep>/darkest_dungeon/TickedBuffs.java
public abstract class TickedBuffs implements Effect {
//@Override public abstract void giveInfluence();
@Override
public void giveInfluence(Unit unit,BattleContext field) {
}
}
<file_sep>/darkest_dungeon/BleedEffect.java
public class BleedEffect extends TickedEffects {
BleedEffect(int damage,int ticks) {
this.ticks = ticks;
this.damage = damage;
}
@Override
public void makeEffect(Unit unit,BattleContext field) {
unit.takeDamage((short)damage, DamageType.PURE,field);
System.out.println(unit.getClass().getSimpleName() + " bleeded for " +damage);
}
private int damage;
}
<file_sep>/core_war/main.cpp
#include "factory.hpp"
#include "loader.hpp"
int main(int argc, char *argv[]) {
  if (argc < 2) {
    std::cout << "Usage: " << argv[0] << " <config file> <warrior files...>\n";
    return -1;
  }
MARS XXX;
std::vector<Warrior> Warriors;
Warrior x;
int t = XXX.SetConfiguration(argv[1]);
for (int i = 1 + t; i < argc; i++) {
Warriors.push_back(x);
Warriors[i - 1 - t].Born(argv[i], XXX.GetMaxProcess());
}
std::cout << "heh";
Loader::loadCommands(Warriors,XXX);
XXX.Battle();
return 0;
}<file_sep>/templates hashtable and shared ptr/main.cpp
#include <iostream>
#include <list>
#include <algorithm>
#include "template_shared_ptr.h"
#include "template_hash_table.hpp"
#include <gtest/gtest.h>
static const int HashConst = 31;
size_t HashCalculation(const std::string &in) {
  if (in.empty()) return 0;  // guard: an empty key would index out of bounds below
  size_t length = in.length();
  size_t HashValue = in[length - 1] % HashConst;
  for (int i = length - 2; i >= 0; i--)
    HashValue = HashValue * HashConst + in[i] % HashConst;
  return HashValue;
}
class Value {
public:
Value(int x, int y):age(x),weight(y),yx(new int[age]),abc(new int[age]) {
}
Value() : age(100500),weight(-5),yx(new int[age]),abc(new int [10]) { }
~Value() { delete[] yx ;}
Value(const Value& t) {
age = t.age;
weight=t.weight;
yx= new int [age];
std::copy(t.yx,t.yx+age,yx);
abc=t.abc;
}
Value& operator = (const Value & t) {
if (this!=&t) {
delete [] yx;
age=t.age;
weight=t.weight;
yx= new int [age];
std::copy(t.yx,t.yx+age,yx);
abc=t.abc;
}
return *this;
}
int age;
int weight;
private:
int* yx;
mutable shared_ptr<int> abc;
};
size_t charhash(const char& x) {
return x;
}
class Unit {
public:
virtual std::string move(int x,int y){
return "UNIT M O V E";
}
virtual ~Unit() {}
};
typedef Unit*(*function)();
class Zergling : public Unit {
public:
std::string move (int x, int y) override {
std::string a = "Zergling move!";
return a;
}
};
class Battlecruiser :public Unit {
std::string move (int x, int y) override {
return "Battlecruiser move!";
}
};
class SiegeTank: public Unit {
std::string move (int x, int y) override {
return "SiegeTank move!";
}
};
Unit* zerg () {return new Zergling;}
Unit* bc() {return new Battlecruiser;}
Unit* tank() {return new SiegeTank;}
TEST(Insert_and_empty_testing, poni) {
HashTable<std::string,Value,&HashCalculation> X;
Value Y;
ASSERT_EQ(true,X.Empty());
ASSERT_EQ(true, X.Insert("RAINBOW DASH",Y));
ASSERT_EQ(false,X.Empty());
ASSERT_EQ(true,X.Insert("APPLE JACK",Y));
ASSERT_EQ(true,X.Insert("GREAT AND POWERFUL TRIXIE",Y));
ASSERT_FALSE(X!=X);
}
TEST(Insert_and_empty_testing, charint) {
HashTable<char,int,&charhash> X;
Value Y;
ASSERT_EQ(true,X.Empty());
ASSERT_EQ(true, X.Insert('a',5));
ASSERT_EQ(false,X.Empty());
ASSERT_EQ(true,X.Insert('b',7));
ASSERT_EQ(true,X.Insert('t',100500));
}
TEST(Insert_and_empty_testing, starcraft) {
HashTable<std::string, function, &HashCalculation> X;
ASSERT_EQ(X.Insert("zergling",&zerg),true);
ASSERT_EQ(X.Insert("battlecruiser",&bc),true);
ASSERT_EQ(X.Insert("siegetank",&tank),true);
ASSERT_EQ(X.Insert("zergling",&zerg),false);
}
TEST (getting_objects,starcraft) {
HashTable<std::string, function, &HashCalculation> X;
ASSERT_EQ(X.Insert("zergling",&zerg),true);
ASSERT_EQ(X.Insert("battlecruiser",&bc),true);
ASSERT_EQ(X.Insert("siegetank",&tank),true);
Unit* tmp = X["zergling"]();
std::string a = "Zergling move!";
std::string b = tmp->move(0,0);
ASSERT_EQ(b, a);
delete tmp;
}
TEST(Contains,poni) {
HashTable<std::string,Value,&HashCalculation> X;
Value Y;
ASSERT_EQ(X.Contains("RAINBOW DASH"),false);
X.Insert("RAINBOW DASH",Y);
ASSERT_EQ(X.Contains("RAINBOW DASH"),true);
ASSERT_EQ(X.Contains("TWILIGHT SPARKLE"),false);
}
TEST(Contains,charint) {
HashTable<char,int,&charhash> X;
ASSERT_EQ(X.Contains('a'),false);
X.Insert('a',100500);
ASSERT_EQ(X.Contains('a'),true);
ASSERT_EQ(X.Contains('t'),false);
}
TEST(Clear,poni) {
HashTable<std::string,Value,&HashCalculation> X;
Value Y;
X.Insert("RARITY",Y);
X.Clear();
ASSERT_EQ(true,X.Empty());
}
TEST(Clear,charint) {
HashTable<char,int,&charhash> X;
X.Insert('a',-123);
X.Clear();
ASSERT_EQ(true,X.Empty());
}
TEST(Erase, poni) {
HashTable<std::string,Value,&HashCalculation> X;
Value Y(100500,20),T;
X.Insert("RAINBOW DASH",T);
X.Insert("FLUTTERSHY",Y);
ASSERT_EQ(X.Erase("FLUTTERSHY"),true);
ASSERT_EQ(X.Erase("KAMA THE BULLET"),false);
ASSERT_EQ(X.Contains("FLUTTERSHY"),false);
}
TEST(Erase, charint) {
HashTable<char,int,&charhash> X;
X.Insert('a',2);
X.Insert(' ',3);
ASSERT_EQ(X.Erase(' '),true);
ASSERT_EQ(X.Erase('t'),false);
ASSERT_EQ(X.Contains(' '),false);
}
TEST(BRACKETS,poni){
HashTable<std::string,Value,&HashCalculation> X;
const Value Y(3,5),D;
X.Insert("RAINBOW DASH",Y);
Value C = X.At("RAINBOW DASH");
const Value g = X.At("RAINBOW DASH");
Value kama = X["KAMA THE BULLET"];
ASSERT_EQ(C.age,Y.age);
ASSERT_EQ(C.weight,Y.weight);
ASSERT_EQ(D.age,kama.age);
ASSERT_EQ(D.weight,kama.weight);
const HashTable<std::string,Value,&HashCalculation> T = X;
const Value i =T.At("RAINBOW DASH");
}
TEST(BRACKETS,charint){
HashTable<char,int,&charhash> X;
X.Insert('a',2);
int C = X['a'];
int kama = 2;
ASSERT_EQ(kama,C);
}
TEST(size,poni) {
HashTable<std::string,Value,&HashCalculation> X;
Value Y;
ASSERT_EQ(0,X.Size());
X.Insert("RAINBOW DASH",Y);
ASSERT_EQ(1,X.Size());
ASSERT_EQ(true,X.Insert("APPLE JACK",Y));
ASSERT_EQ(2,X.Size());
X.Insert("GREAT AND POWERFUL TRIXIE",Y);
ASSERT_EQ(3,X.Size());
ASSERT_EQ(X.Insert("PINKIE PIE",Y),true);
ASSERT_EQ(4,X.Size());
X.Erase("PINKIE PIE");
ASSERT_EQ(3,X.Size());
X.Erase("KAMA THE BULLET");
ASSERT_EQ(3,X.Size());
}
TEST(size,charint) {
HashTable<char,int,&charhash> X;
ASSERT_EQ(0,X.Size());
X.Insert('R',1);
ASSERT_EQ(1,X.Size());
ASSERT_EQ(true,X.Insert('A',2));
ASSERT_EQ(2,X.Size());
X.Insert('G',3);
ASSERT_EQ(3,X.Size());
ASSERT_EQ(X.Insert('P',4),true);
ASSERT_EQ(4,X.Size());
X.Erase('P');
ASSERT_EQ(3,X.Size());
X.Erase('B');
ASSERT_EQ(3,X.Size());
}
TEST(at,poni) {
HashTable<std::string,Value,&HashCalculation> X;
Value Y(5,4);
X.Insert("RAINBOW DASH",Y);
Value R = X.At("RAINBOW DASH");
ASSERT_EQ(5,R.age);
ASSERT_EQ(4,R.weight);
ASSERT_ANY_THROW(R = X.At("TWILIGHT SPARKLE"));
}
TEST(at,charint) {
HashTable<char,int,&charhash> X;
X.Insert('R',2);
int R = X.At('R');
ASSERT_EQ(2,R);
ASSERT_ANY_THROW(R = X.At('g'));
}
TEST(equal,poni) {
HashTable<std::string,Value,&HashCalculation> X1,Y1,X2,Y2;
Value P;
X1.Insert("RAINBOW DASH",P);
Y1.Insert("RAINBOW DASH",P);
ASSERT_EQ(true,X1==Y1);
X2.Insert("APPLE JACK",P);
ASSERT_EQ(false,X2==X1);
Y2=X2;
ASSERT_EQ(true,X2==Y2);
HashTable<std::string,Value,&HashCalculation> Z = Y2;
ASSERT_EQ(Y2==Z,true);
X2.Erase("APPLE JACK");
Y1.Insert("QWERTY",P);
ASSERT_TRUE(X2!=Y1);
}
TEST(equal,charint) {
HashTable<char,int,&charhash> X1,Y1,X2,Y2;
int P = 2;
X1.Insert('R',P);
Y1.Insert('R',P);
ASSERT_EQ(true,X1==Y1);
X2.Insert('A',P);
ASSERT_EQ(false,X2==X1);
Y2=X2;
ASSERT_EQ(true,X2==Y2);
HashTable<char,int,&charhash> Z = Y2;
ASSERT_EQ(Y2==Z,true);
X2.Erase('A');
ASSERT_EQ(false,X2==Y1);
}
int main(int argc, char** argv) {
testing::InitGoogleTest(&argc,argv);
return RUN_ALL_TESTS();
}<file_sep>/gui/model.h
#ifndef MODEL_H
#define MODEL_H
#include <QObject>
#include"/home/ulyssess/CLionProjects/obertka_repo/16208_kutalev/core_war/loader.hpp"
class model : public QObject
{
Q_OBJECT
public:
explicit model(QObject *parent = 0);
signals:
public slots:
void coreClear();
void setSize(int size);
void setRange(int range);
void setWNumbers(int number);
    void load(int wnum, std::string *files, int size);
void startGame();
std::string getInstructionName(int pos);
private:
MARS simulator;
std::vector<Warrior> Warriors;
};
#endif // MODEL_H
<file_sep>/chess/src/ChessCell.java
import javax.imageio.ImageIO;
import javax.swing.*;
import java.awt.*;
public class ChessCell extends JButton {
ChessCell (int x) {
colour = x;
Image img = null;
        try {
            img = ImageIO.read(getClass().getResource("icons8-pawn-filled-50.png"));
        } catch (Exception exc) {
            exc.printStackTrace();  // log instead of silently swallowing the error
        }
        if (img != null)  // avoid passing a null Image to ImageIcon
            setIcon(new ImageIcon(img));
if (x%2 == 0)
setBackground(Color.orange);
else
setBackground(Color.GRAY);
}
private int colour;
}
<file_sep>/core_war/div_command.cpp
#include "factory.hpp"
class Div_command : public Instruction {
public:
explicit Div_command(Modifiers x) { Fields.OpcodeMod = x; Name = "DIV"; }
void Execution(ExecutionContext &Executor) override {
Executor.SetOffsets(Fields);
if (!Executor.getA(Fields.AOffset) && (Fields.OpcodeMod != Modifiers::BA) &&
Fields.OpcodeMod != Modifiers::B) {
Executor.DeleteFromQueue();
return;
}
if (!Executor.getB(Fields.AOffset) && (Fields.OpcodeMod != Modifiers::AB) &&
Fields.OpcodeMod != Modifiers::A) {
Executor.DeleteFromQueue();
return;
}
switch (Fields.OpcodeMod) {
case (Modifiers::A):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.BOffset) /
Executor.getA(Fields.AOffset));
break;
case (Modifiers::AB):
Executor.setB(Fields.BOffset,
Executor.getB(Fields.BOffset) /
Executor.getA(Fields.AOffset));
break;
case (Modifiers::B):
Executor.setB(Fields.BOffset,
Executor.getB(Fields.BOffset) /
Executor.getB(Fields.AOffset));
break;
case (Modifiers::BA):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.BOffset) /
Executor.getB(Fields.AOffset));
break;
case (Modifiers::I):
case (Modifiers::Not):
case (Modifiers::F):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.BOffset) /
Executor.getA(Fields.AOffset));
Executor.setB(Fields.BOffset,
Executor.getB(Fields.BOffset) /
Executor.getB(Fields.AOffset));
break;
case (Modifiers::X):
Executor.setB(Fields.BOffset,
Executor.getB(Fields.BOffset) /
Executor.getA(Fields.AOffset));
Executor.setA(Fields.BOffset,
Executor.getA(Fields.BOffset) /
Executor.getB(Fields.AOffset));
break;
}
Executor.ForwardQueue();
}
Div_command *Clone() override { return new Div_command(*this); }
};
namespace {
Instruction *divab() { return new Div_command(Modifiers::AB); }
Instruction *divba() { return new Div_command(Modifiers::BA); }
Instruction *divta() { return new Div_command(Modifiers::A); }
Instruction *divtb() { return new Div_command(Modifiers::B); }
Instruction *divtf() { return new Div_command(Modifiers::F); }
Instruction *divx() { return new Div_command(Modifiers::X); }
Instruction *divi() { return new Div_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("DIV.AB", &divab);
bool b = Factory::get_instance()->regist3r("DIV.BA", &divba);
bool c = Factory::get_instance()->regist3r("DIV.A", &divta);
bool d = Factory::get_instance()->regist3r("DIV.B", &divtb);
bool f = Factory::get_instance()->regist3r("DIV.F", &divtf);
bool e = Factory::get_instance()->regist3r("DIV.X", &divx);
bool g = Factory::get_instance()->regist3r("DIV.I", &divi);
bool w = Factory::get_instance()->nameRegister("DIV","DIV");
}
<file_sep>/Befunge/ReadString.java
public class ReadString implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.getStringOfASCI();
}
}
<file_sep>/gui/info.h
#ifndef VISIONAREA_H
#define VISIONAREA_H
#include <QWidget>
#include <QLabel>
#include <QLayout>
#include "/home/ulyssess/qtProjects/core_war/renderarea.h"
class infoWindow : public QWidget
{
Q_OBJECT
public:
explicit infoWindow(QWidget *parent = 0);
~infoWindow();
signals:
public slots:
void setName(const std::string& name);
void setCommand(const std::string& command);
void setDanger(bool x);
void setPosition(int x);
private:
/// RenderArea* fullModel;
QLabel* name = nullptr;
QLabel* command = nullptr;
QLabel* danger = nullptr;
QLabel* position = nullptr;
QVBoxLayout* lay = nullptr;
};
#endif // VISIONAREA_H
<file_sep>/hash_table/EntriesList.cpp
//
// Created by ulyssess on 05.10.17.
//
<file_sep>/hash_table/tests.cpp
#include "main.cpp"
#include <gtest/gtest.h>
TEST(Insert_and_empty_testing, poni) {
HashTable X;
Value Y;
ASSERT_EQ(true,X.Empty());
ASSERT_EQ(true, X.Insert("RAINBOW DASH",Y));
ASSERT_EQ(false,X.Empty());
ASSERT_EQ(true,X.Insert("APPLE JACK",Y));
ASSERT_EQ(true,X.Insert("GREAT AND POWERFUL TRIXIE",Y));
}
TEST(Contains,poni) {
HashTable X;
Value Y;
ASSERT_EQ(X.Contains("RAINBOW DASH"),false);
X.Insert("RAINBOW DASH",Y);
ASSERT_EQ(X.Contains("RAINBOW DASH"),true);
ASSERT_EQ(X.Contains("TWILIGHT SPARKLE"),false);
}
TEST(Clear,poni) {
HashTable X;
Value Y;
X.Insert("RARITY",Y);
X.Clear();
ASSERT_EQ(true,X.Empty());
}
TEST(Erase, poni) {
HashTable X;
Value Y(100500,20),T;
X.Insert("RAINBOW DASH",T);
X.Insert("FLUTTERSHY",Y);
ASSERT_EQ(X.Erase("FLUTTERSHY"),true);
ASSERT_EQ(X.Erase("KAMA THE BULLET"),false);
ASSERT_EQ(X.Contains("FLUTTERSHY"),false);
}
TEST(BRACKETS,poni){
HashTable X;
Value Y(3,5),D;
X.Insert("RAINBOW DASH",Y);
Value C = X["RAINBOW DASH"];
Value kama = X["KAMA THE BULLET"];
ASSERT_EQ(C.age,Y.age);
ASSERT_EQ(C.weight,Y.weight);
ASSERT_EQ(D.age,kama.age);
ASSERT_EQ(D.weight,kama.weight);
}
TEST(size,poni) {
HashTable X;
Value Y;
X.Insert("RAINBOW DASH",Y);
X.Insert("APPLE JACK",Y);
X.Insert("GREAT AND POWERFUL TRIXIE",Y);
X.Insert("PINKIE PIE",Y);
ASSERT_EQ(4,X.Size());
X.Erase("PINKIE PIE");
ASSERT_EQ(3,X.Size());
}
TEST(at,poni) {
HashTable X;
Value Y(5,4);
X.Insert("RAINBOW DASH",Y);
Value R = X.At("RAINBOW DASH");
ASSERT_EQ(5,R.age);
ASSERT_EQ(4,R.weight);
ASSERT_ANY_THROW(R = X.At("TWILIGHT SPARKLE"));
}
TEST(equal,poni) {
HashTable X1,Y1,X2,Y2;
Value P;
X1.Insert("RAINBOW DASH",P);
Y1.Insert("RAINBOW DASH",P);
ASSERT_EQ(true,X1==Y1);
X2.Insert("APPLE JACK",P);
ASSERT_EQ(false,X2==X1);
Y2=X2;
ASSERT_EQ(true,X2==Y2);
HashTable Z = Y2;
ASSERT_EQ(Y2==Z,true);
X2.Erase("APPLE JACK");
ASSERT_EQ(false,X2==Y1);
}
int main(int argc, char** argv)
{
testing::InitGoogleTest(&argc, argv);
return RUN_ALL_TESTS();
}
<file_sep>/Befunge/SkipNext.java
public class SkipNext implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.skip();
}
}
<file_sep>/Befunge/Befunge.java
import java.io.IOException;
public interface Befunge {
void setupTable() throws IOException;
void run();
}
<file_sep>/Befunge/DivOperation.java
public class DivOperation implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.arithOperations(Operations.DIV);
}
}
<file_sep>/core_war/mars.cpp
//#include <vector>
//#include <iostream>
#include "mars.h"
//void MARS::LoadCommands(std::vector<Warrior> &in) {
// for (size_t i = 0; i < size; i++) {
// Core.push_back(Initial->Clone());
// }
// Flow tmp;
// for (Warrior &X : in) { // for each Warrior
// for (size_t i = 0; i < X.Instructions.size(); i++) {
// delete Core[(Position + i) % size];
// Core[(Position + i) % size] =
// X.Instructions[i].get()->Clone(); // putting instructions in core
// }
// tmp.Name = X.Name;
// tmp.Author = X.Author;
// tmp.Address = (Position + X.StartPosition) % size;
// tmp.FlowCounter = &X.FlowCounter;
// Flows.Insert(tmp);
// Position = (Position + SeparationRange) % size;
// }
//}
void MARS::Battle() {
auto it = Flows.begin();
ExecutionContext Executor(*this, Flows, &it);
size_t i = 0;
while (Flows.GameIsOn(i, TieCycles)) {
Core[(*it).Address]->Execution(Executor);
++it;
i++;
}
// std::cout << "AFTER " << i << " TURNS THE SITUATION IS:" << std::endl;
for (size_t j = 0; j < size; j++) {
Core[j]->GettingCode(j);
}
if (i == TieCycles)
std::cout << "NO WINNER IN THIS BLOODY WAR!!!!" << std::endl;
else {
it = Flows.begin();
std::cout << "THE WINNER IS " << (*it).Name << " BY " << (*it).Author
<< " AFTER " << i << " TURNS" << std::endl;
}
}
void MARS::MakeOption(const std::string &option, const std::string &value) {
int tmp = std::stoi(value);
if (tmp <= 0)
throw std::runtime_error("invalid value!");
if (option == "size")
size = tmp;
else if (option == "tiecycles")
TieCycles = tmp;
else if (option == "maxlength")
MaxLength = tmp;
else if (option == "maxprocess")
MaxProcess = tmp;
else if (option == "minrange")
MinSeparation = tmp;
else if (option == "range")
SeparationRange = tmp;
else if (option == "warriors")
NumberOfWarriors = tmp;
else
throw std::runtime_error("invalid option");
}
int MARS::SetConfiguration(const char *in) {
std::ifstream fin;
fin.open(in);
if (!fin.is_open())
throw std::runtime_error("no file in directory!");
std::string RedCode;
std::string tmp;
getline(fin, RedCode);
if (RedCode != ";config") {
std::cout << "it's not a config file; the battle will run with ICWS-94 "
"standard settings (if the battle can start at all)";
return 0;
}
try {
while (fin >> RedCode) {
getline(fin, tmp);
MakeOption(RedCode, tmp);
}
} catch (std::exception &x) {
std::cout << "Some lines were incorrect; they will fall back to ICWS-94 "
"standard settings";
return 1;
}
std::mt19937 rng(rd());
std::uniform_int_distribution<int> uni(
MinSeparation, size - 2 * NumberOfWarriors * MinSeparation);
std::uniform_int_distribution<int> unip(0, size);
SeparationRange = uni(rng);
Position = unip(rng);
return 1;
}
MARS::MARS() {
std::mt19937 rng(rd());
std::uniform_int_distribution<int> uni(
MinSeparation, size - 2 * NumberOfWarriors * MinSeparation);
std::uniform_int_distribution<int> unip(0, size);
SeparationRange = uni(rng);
Position = unip(rng);
Initial = Factory::get_instance()->create("DAT.F");
}
MARS::~MARS() {
delete Initial;
for (auto &i : Core)
delete i;
}
/*int main() {
//}
// catch (std::exception){
// throw std::runtime_error("Invalid file");
// }
// X.SetConfiguration("config.redcode");
// std:: vector<Warrior> Y;
Instruction Imp(Opcodes:: MOV,Modifiers:: Not,Mods:: Not, Mods:: Not,
0,1);
std::vector<Instruction> q;
q.push_back(Imp);
Warrior IMP (q);
std::vector<Warrior> warriors;
warriors.push_back(IMP);
X.LoadCommands(warriors); */
// Dat_command Z;
// Mov_command Y;
/* Instruction * Imp = Factory::get_instance()->create(Opcodes::MOV);
Imp->AOperand = 0;
Imp->BOperand = 1;
Imp->BOperandMod = Mods::Dollar;
Imp->AOperandMod = Mods::Dollar;
Imp->OpcodeMod = Modifiers ::I;
std::vector<Instruction*> q;
q.push_back(Imp);
Warrior IMP (q);*/
// Imp = new Mov_command (Opcodes:: MOV, Modifiers:: Not, Mods:: Dollar,
// Mods:: Dollar, 0, 1);
/*
std::vector<Warrior> warriors;
warriors.push_back(IMP);
X.LoadCommands(warriors);
std::cout<<"sffsaf";
MEOWWWW <* ^_^ *> delete Imp; */
// Instruction *D1 = new Dat_command;
// std::vector<Instruction*> w;
/* Instruction* D1 = Factory::get_instance()->create(Opcodes::DAT);
Instruction* D2 = Factory::get_instance()->create(Opcodes::ADD);
D2->AOperand=5;
D2->BOperand = -1;
D2->AOperandMod = Mods::Lattice;
D2->BOperandMod= Mods::Not;
// Instruction* D2 = new Add_command(Opcodes:: ADD, Modifiers:: Not, Mods::
Lattice, Mods:: Not, 5, -1);
// Instruction* D3 = new Mov_command(Opcodes:: MOV, Modifiers:: Not, Mods::
Lattice, Mods:: Dog, 0, -2);
Instruction* D3 = Factory::get_instance()->create(Opcodes::MOV);
D3->Body = Opcodes ::MOV;
D3->OpcodeMod = Modifiers ::Not;
D3->AOperand=-2;
D3->BOperand = -2;
D3->BOperandMod = Mods::Dog;
// Instruction* D4 = new Jmp_command(Mods ::Not,-2);
Instruction* D4 = Factory::get_instance()->create(Opcodes::JMP);
D4->AOperand=-2;
w.push_back(D1);
w.push_back(D2);
w.push_back(D3);
w.push_back(D4);
Warrior DWARF (w);*/
// std::vector<Warrior> warriors;
// warriors.push_back(DWARF);
// warriors.push_back(IMP);
// X.LoadCommands(warriors);
// std::cout<<"sffsaf";
/* while(q.GameIsOn()) {
std::cout<<(*(*it).FlowCounter)<<std::endl;
it++;
if (it==q.data.end())
it=q.data.begin();
y++;
if(y==3)
it=q.DeleteCurrent(it);
}
//delete D1;
//delete D2;
//delete D3;
//delete D4;
delete Imp;*/
/* return 0;
}*/
<file_sep>/darkest_dungeon/Factory.java
import java.io.*;
import java.lang.reflect.Field;
import java.lang.reflect.InvocationTargetException;
import java.util.Properties;
import java.util.TreeMap;
public class Factory {
public static Factory getInstance() {//throws FactoryInvalidInitialisation{
if(f==null)
f = new Factory();
return f;
}
// public Unit getUnit(String name){//} throws FactoryCommandNotFoundException {
//
// Class<?> c;
// Object unit = null;
// try {
// String statsFile = name + "Stats.txt";
// InputStream is = Factory.class.getClassLoader().getResourceAsStream(statsFile);
// stats.load(is);
// }
// catch (IOException exc) {
// // throw new FactoryCommandNotFoundException ("Unknown command");
// System.out.println("HSDHSDFHSD");
// }
// try {
// c = Class.forName(name);
// unit = c.getDeclaredConstructor(Properties.class).newInstance(stats);
// }
// catch (ClassNotFoundException | NoSuchMethodException | IllegalAccessException | InstantiationException | InvocationTargetException exc) {
// System.out.println(exc.getMessage()+ "AAAAAAAAAAAAAAAA");
//// Field [] fields = unit.getClass().getSuperclass().getSuperclass().getDeclaredFields();
//// for (Field f:fields) {
//// if (f.getName().equals("armor")) {
//// System.out.println("heh");
//// //f.setAccessible(true);
//// f.set(unit, (short)20);
//// }
// //f.set(unit,Integer.parseInt(stats.get(f.getName())));
// //System.out.println(f.getName());
// }
// //stats.forEach(unit,(field,value)->unit.getClass().getDefield.);
//
// return (Unit) unit;
// }
public Unit getUnit(String name) throws Exception {
Class <?> c = Class.forName(name);
Object obj = c.getDeclaredConstructor().newInstance();
return (Unit) obj;
}
// private Factory() {// throws FactoryInvalidInitialisation{
// //commands = new Properties();
// InputStream is = Factory.class.getClassLoader().getResourceAsStream(configFileName);
// if (is!=null) {
// try {
// commands.load(is);
// }
// catch (IOException exc) {
// //throw new FactoryInvalidInitialisation("Missing config file!");
// }
// }
// }
private static final String configFileName = "unutstat.txt";//???
private static Factory f = null;
private Properties stats = new Properties();
private Properties abilities = new Properties();
}
<file_sep>/chess/src/CellState.java
public enum CellState {
EMPTY,WITH_FRIEND,WITH_ENEMY
}
<file_sep>/templates hashtable and shared ptr/template_hash_table.hpp
#ifndef HASHTABLE_TEMPLATE_TEMPLATE_HASH_TABLE_H
#define HASHTABLE_TEMPLATE_TEMPLATE_HASH_TABLE_H
#include <algorithm>
#include <cstddef>
#include <list>
#include <stdexcept>
template <class Key, class Value> struct Node {
Node(const Key &k, const Value &v) : Current(v), CurrentKey(k){};
explicit Node(const Key &k) : CurrentKey(k){};
friend bool operator==(const Node &a, const Node &b) {
return a.CurrentKey == b.CurrentKey;
}
Value Current;
Key CurrentKey;
};
template <class Key, class Value, size_t (*HashFunction)(const Key &)>
class HashTable {
public:
HashTable() : HashTable(defaultCapacity) {};
explicit HashTable(size_t in)
: size(0), capacity(in),
data(new std::list<Node<Key, Value>>[capacity]){}; // constructor with
// any capacity
HashTable(const HashTable &t) : size(t.size), capacity(t.capacity), loadFactor(t.loadFactor), data(new std::list<Node<Key, Value>>[capacity]) { // copy constructor
std::copy(t.data, t.data + t.capacity, data);
}
~HashTable() { delete[] data; }
void Swap(HashTable &t) {
std::swap(capacity, t.capacity);
std::swap(size, t.size);
std::swap(loadFactor, t.loadFactor);
std::list<Node<Key, Value>> *tmp1 = data;
data = t.data;
t.data = tmp1;
}
HashTable &operator=(const HashTable &t) {
if (this != &t) {
delete[] data;
size = t.size;
capacity = t.capacity;
loadFactor = t.loadFactor;
data = new std::list<Node<Key, Value>>[capacity];
std::copy(t.data, t.data + capacity, data);
}
return *this;
}
void Clear() { // clear fields
delete[] data;
size = 0;
capacity = defaultCapacity;
loadFactor = 0;
data = new std::list<Node<Key, Value>>[capacity];
}
bool Erase(const Key &k) { // erase field with key ; true if success
if (!Contains(k))
return false;
size_t i = HashFunction(k) % capacity;
Node<Key, Value> D(k);
data[i].remove(D);
size--;
return true;
}
bool Insert(const Key &k, const Value &v) { // add field ; true if success
if (Contains(k))
return false;
Node<Key, Value> D(k, v);
size_t i = HashFunction(k) % capacity;
data[i].push_back(D); // check for push???
size++;
loadFactor = static_cast<double>(size) / capacity;
if (loadFactor > HighLoadFactor)
if (!Rehashing())
return false;
return true;
}
bool Contains(const Key &k) const { // checking value's availability with key
Node<Key, Value> D (k);
size_t i = HashFunction(k) % capacity;
return (std::find(data[i].begin(), data[i].end(), D) != data[i].end());
}
Value &operator[](const Key &k) {
if (!Contains(k)) {
Value Default;
Insert(k, Default);
}
Node<Key, Value> D (k);
size_t i = HashFunction(k) % capacity;
return std::find(data[i].begin(), data[i].end(), D)->Current;
}
Value &At(const Key &k) {
if (Contains(k)) {
Node<Key, Value> D (k);
size_t i = HashFunction(k) % capacity;
return std::find(data[i].begin(), data[i].end(), D)->Current;
} else
throw std::runtime_error("no key");
}
const Value &At(const Key &k)
const { // return key's value; if there is no value - throw exception
if (Contains(k)) {
Node<Key, Value> D (k);
size_t i = HashFunction(k) % capacity;
return std::find(data[i].begin(), data[i].end(), D)->Current;
} else
throw std::runtime_error("no key");
}
size_t Size() const { return size; }
bool Empty() const { return (size == 0); }
friend bool operator==(const HashTable &a, const HashTable &b) {
if (a.capacity != b.capacity)
return false;
for (size_t i = 0; i < a.capacity; i++) {
for (auto it = a.data[i].begin(); it != a.data[i].end(); ++it)
if (!b.Contains((*it).CurrentKey))
return false;
}
for (size_t i = 0; i < a.capacity; i++) {
for (auto it = b.data[i].begin(); it != b.data[i].end(); ++it)
if (!a.Contains((*it).CurrentKey))
return false;
}
return true;
}
friend bool operator!=(const HashTable &a, const HashTable &b) {
return !(a == b);
}
private:
size_t capacity = defaultCapacity;
size_t size = 0;
double loadFactor = 0;
std::list<Node<Key, Value>> *data = nullptr;
bool Rehashing() {
HashTable tmp((capacity * HighLoadFactor));
for (size_t i = 0; i < capacity; i++) {
for (auto it = data[i].begin(); it != data[i].end(); ++it) {
Node<Key, Value> D = (*it);
tmp.Insert(D.CurrentKey, D.Current);
}
}
Swap(tmp);
return true;
}
static const size_t defaultCapacity = 1;
static const size_t HighLoadFactor = 2;
};
#endif // HASHTABLE_TEMPLATE_TEMPLATE_HASH_TABLE_H
<file_sep>/torrent/src/FileBuilder.java
public class FileBuilder {
}
<file_sep>/core_war/spl_command.cpp
#include "factory.hpp"
class Spl_command : public Instruction {
public:
explicit Spl_command(Modifiers x) { Fields.OpcodeMod = x; Name = "SPL"; }
void Execution(ExecutionContext &Executor) override {
if (Fields.AOperandMod == Mods::Lattice)
Fields.AOffset = Fields.AOperand;
else
Executor.SetOffset(Fields.AOperandMod, Fields.AOperand, &Fields.AOffset);
Executor.AddToQueue(Fields.AOffset);
Executor.ForwardQueue();
}
Spl_command *Clone() override { return new Spl_command(*this); }
};
namespace {
Instruction *splab() { return new Spl_command(Modifiers::AB); }
Instruction *splba() { return new Spl_command(Modifiers::BA); }
Instruction *splta() { return new Spl_command(Modifiers::A); }
Instruction *spltb() { return new Spl_command(Modifiers::B); }
Instruction *spltf() { return new Spl_command(Modifiers::F); }
Instruction *splx() { return new Spl_command(Modifiers::X); }
Instruction *spli() { return new Spl_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("SPL.AB", &splab);
bool b = Factory::get_instance()->regist3r("SPL.BA", &splba);
bool c = Factory::get_instance()->regist3r("SPL.A", &splta);
bool d = Factory::get_instance()->regist3r("SPL.B", &spltb);
bool f = Factory::get_instance()->regist3r("SPL.F", &spltf);
bool e = Factory::get_instance()->regist3r("SPL.X", &splx);
bool g = Factory::get_instance()->regist3r("SPL.I", &spli);
bool w = Factory::get_instance()->nameRegister("SPL","SPL");
}<file_sep>/darkest_dungeon/Hero.java
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Properties;
import java.util.Scanner;
import java.util.concurrent.ThreadLocalRandom;
public abstract class Hero extends Unit {
Hero(Properties data) {
super(data);
Field[] fields = getClass().getDeclaredFields();
System.out.println(fields.length);
for (Field f: fields) {
if (data.containsKey(f.getName())) {
f.setAccessible(true);
try {
f.set(this, data.get(f.getName()));
}
catch (Exception exc) {
System.out.println("Something has happened in our woods!");
}
}
}
campAbilities= new ArrayList<>();
battleAbilities = new ArrayList<>();
}
Hero() {
super();
campAbilities= new ArrayList<>();
// battleAbilities = new ArrayList<>();
}
@Override
public boolean takeDamage(short dmg, DamageType type,BattleContext field) {
short dmgDealt=dmg;
short armor = getArmor();
if (type !=DamageType.PURE) {
dmgDealt = (armor==0) ? dmgDealt : (short)(armor *dmg / 100);
}
if (type==DamageType.CRITICAL) {
Hero[] tmp = new Hero[1];
tmp[0] = this;
stressIncreased(tmp,CRIT_STRESS_INCREASE_CONST);
}
boolean deathCheck = isItDeathDoor(dmgDealt);
if(deathCheck&&deathBlowCheck()) {
field.removeFromBattleField(this,false);
}
return deathCheck;
}
@Override
public void battleAction(BattleContext field) {
System.out.println("I am hero!");
getAvailibleAbilities(field);
// if (availibleAbilities.isEmpty())
//skipTurn();
int choosen = chooseAbility(field);
// if (choosen==-1)
// skipTurn();
// else {
Ability current = battleAbilities.get(choosen);
System.out.println(this.getClass().getSimpleName() + " chooses " + current.getClass().getSimpleName());
current.choseTargets(field);
current.using(field);
//}
}
//******************************DANGER!!! CANCER METHOD USE ONLY FOR 1ST TIME DEBUGGING*******
private int chooseAbility(BattleContext field) {
System.out.println("Available abilities: ");
int i = 0;
for (Ability abil: availibleAbilities) {
System.out.println(abil.getName() + i);
i++;
}
// boolean hasLegalAbility = false;
//
// for (Ability skill:battleAbilities) {
// if (skill.isLegal(field)) {
// hasLegalAbility = true;
// }
// }
// System.out.println(hasLegalAbility);
// if (!hasLegalAbility) {
// return -1;
// }
// else {
// //System.out.println("tuta");
Scanner in = new Scanner(System.in);
int x = in.nextInt();
// if (!battleAbilities.get(x).isLegal(field)) {
// in.close();
// System.out.println("You can't choose this ability!");
// chooseAbility(field);
// } else
return x;
// }
// return -1;
}
//*****************************************************************************************
// @Override
// public void skipTurn() {
// System.out.println(name+" Skipped turn!");
// }
private void faithCheck() {
int check = ThreadLocalRandom.current().nextInt(0,100);
mind = (check > deathBlowResist)? MindStatus.VIRTOUSED : MindStatus.AFFICTIONED;
}
protected boolean deathBlowCheck() {
int check = ThreadLocalRandom.current().nextInt(0,100);
return check > deathBlowResist;
}
public void stressIncreased(Hero[] targets,byte value) {
for (Hero target: targets) {
target.stress += value;
if (target.stress >= STRESS_AFFICTION_CONST && target.mind == MindStatus.NORMAL)
target.faithCheck();
if (target.stress >= STRESS_HEART_ATTACK_CONST) {
if (target.deathBlowCheck())
deathWithOutCorpse();
else
target.stress = STRESS_HEART_ATTACK_CONST;
}
}
}
public short getStress() {
return stress;
}
private void getAvailibleAbilities(BattleContext field) {
availibleAbilities.clear();
for (Ability abil:battleAbilities) {
if (abil.isLegal(field))
availibleAbilities.add(abil);
}
}
private ArrayList<Ability> campAbilities;
//protected ArrayList<Ability> battleAbilities;
protected ArrayList<Ability> availibleAbilities = new ArrayList<>();
private short deseaseResist=20;
private short deathBlowResist=63;
private short stress=0;
protected short trapChance=20;
private MindStatus mind = MindStatus.NORMAL;
protected String name = "<NAME>";
private Status status = Status.ALIVE;
private final byte CRIT_STRESS_INCREASE_CONST = 7;
private final short STRESS_AFFICTION_CONST = 100;
private final short STRESS_HEART_ATTACK_CONST = 200;
}
<file_sep>/Befunge/StackDeleteTop.java
public class StackDeleteTop implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.deleteTop();
}
}
<file_sep>/core_war/jmp_command.cpp
//#include "instructions.hpp"
#include "factory.hpp"
class Jmp_command : public Instruction {
public:
explicit Jmp_command(Modifiers x) { Fields.OpcodeMod = x; Name = "JMP"; }
void Execution(ExecutionContext& Executor) override {
if (Fields.AOperandMod==Mods::Lattice)
Fields.AOffset = Fields.AOperand;
else
Executor.SetOffset(Fields.AOperandMod,Fields.AOperand,&Fields.AOffset);
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
std::cout << Executor.getCurrentWarriorName() << " jumps to address " << Executor.getCurrentAddress() << std::endl;
}
Jmp_command *Clone() override { return new Jmp_command(*this); }
};
namespace {
Instruction *Jmpab() { return new Jmp_command(Modifiers::AB); }
Instruction *Jmpba() { return new Jmp_command(Modifiers::BA); }
Instruction *Jmpta() { return new Jmp_command(Modifiers::A); }
Instruction *Jmptb() { return new Jmp_command(Modifiers::B); }
Instruction *Jmptf() { return new Jmp_command(Modifiers::F); }
Instruction *Jmpx() { return new Jmp_command(Modifiers::X); }
Instruction *Jmpi() { return new Jmp_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("JMP.AB", &Jmpab);
bool b = Factory::get_instance()->regist3r("JMP.BA", &Jmpba);
bool c = Factory::get_instance()->regist3r("JMP.A", &Jmpta);
bool d = Factory::get_instance()->regist3r("JMP.B", &Jmptb);
bool f = Factory::get_instance()->regist3r("JMP.F", &Jmptf);
bool e = Factory::get_instance()->regist3r("JMP.X", &Jmpx);
bool g = Factory::get_instance()->regist3r("JMP.I", &Jmpi);
bool w = Factory::get_instance()->nameRegister("JMP","JMP");
}<file_sep>/core_war/jmn_command.cpp
#include "factory.hpp"
class Jmn_command : public Instruction {
public:
explicit Jmn_command(Modifiers x) { Fields.OpcodeMod = x; Name = "JMN"; }
void Execution(ExecutionContext &Executor) override {
if (Fields.AOperandMod == Mods::Lattice &&
Fields.BOperandMod == Mods::Lattice) {
Fields.AOffset = Fields.AOperand;
Fields.BOffset = Fields.BOperand;
} else {
Executor.SetOffsets(Fields);
}
switch (Fields.OpcodeMod) {
case (Modifiers::A):
case (Modifiers::BA):
if (Executor.getA(Fields.BOffset))
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
else
Executor.ForwardQueue();
break;
case (Modifiers::Not):
case (Modifiers::B):
case (Modifiers::AB):
if (Executor.getB(Fields.BOffset)) {
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
std::cout << Executor.getCurrentWarriorName() << " jump to "
<< Executor.getCurrentAddress() << " address" << std::endl;
} else {
Executor.ForwardQueue();
std::cout << Executor.getCurrentWarriorName()
<< " will execute the next command at address "
<< Executor.getCurrentAddress() << std::endl;
}
break;
case (Modifiers::I):
case (Modifiers::F):
case (Modifiers::X):
if (Executor.getA(Fields.BOffset) || Executor.getB(Fields.BOffset)) {
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
std::cout << Executor.getCurrentWarriorName() << " jump to "
<< Executor.getCurrentAddress() << " address" << std::endl;
} else {
Executor.ForwardQueue();
std::cout << Executor.getCurrentWarriorName()
<< " will execute the next command at address "
<< Executor.getCurrentAddress() << std::endl;
}
break;
}
}
Jmn_command *Clone() override { return new Jmn_command(*this); }
};
namespace {
Instruction *Jmnab() { return new Jmn_command(Modifiers::AB); }
Instruction *Jmnba() { return new Jmn_command(Modifiers::BA); }
Instruction *Jmnta() { return new Jmn_command(Modifiers::A); }
Instruction *Jmntb() { return new Jmn_command(Modifiers::B); }
Instruction *Jmntf() { return new Jmn_command(Modifiers::F); }
Instruction *Jmnx() { return new Jmn_command(Modifiers::X); }
Instruction *Jmni() { return new Jmn_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("JMN.AB", &Jmnab);
bool b = Factory::get_instance()->regist3r("JMN.BA", &Jmnba);
bool c = Factory::get_instance()->regist3r("JMN.A", &Jmnta);
bool d = Factory::get_instance()->regist3r("JMN.B", &Jmntb);
bool f = Factory::get_instance()->regist3r("JMN.F", &Jmntf);
bool e = Factory::get_instance()->regist3r("JMN.X", &Jmnx);
bool g = Factory::get_instance()->regist3r("JMN.I", &Jmni);
bool w = Factory::get_instance()->nameRegister("JMN","JMN");
}<file_sep>/gui/controller.h
#ifndef CONTROLLER_H
#define CONTROLLER_H
#include <map>
#include "model.h"
class MainWindow;
class MARS;
class controller
{
public:
void repeat();
void setModel(model* a);
void setView(MainWindow &window,int argc, char** argv);
void getChangedCell(const std::string& Name, int Address,bool isDanger);
void checkStartOptions(int argc, std::string* files,int size,int separation);
bool startDrawing();
void warEnd(const std::string& win);
void warriorRegister(const std::string &name, int Pos);
static controller* get_instance();
void signalToStart(int wnum, std::string* files,int size);
std::string getWName(int pos);
std::string getCommandName(int pos);
void needInfo(const std::vector<std::string>& info,int x,int y);
void update();
private:
controller() = default;
MainWindow* view;
bool start = true;
int argcc;
char** argvv;
std::map <std::string, int> Warriors;
model* simulator;
int wnum;
std::string* mfiles;
int msize;
};
#endif // CONTROLLER_H
<file_sep>/core_war/djn_command.cpp
#include "factory.hpp"
class Djn_command : public Instruction {
public:
explicit Djn_command(Modifiers x) { Fields.OpcodeMod = x; Name = "DJN"; }
void Execution(ExecutionContext &Executor) override {
Executor.SetOffsets(Fields);
if (Fields.AOperandMod == Mods::Lattice)
Fields.AOffset = Fields.AOperand;
if (Fields.BOperandMod == Mods::Lattice)
Fields.BOffset = Fields.BOperand;
switch (Fields.OpcodeMod) {
case (Modifiers::A):
case (Modifiers::BA):
Executor.setA(Fields.BOffset, Executor.getA(Fields.BOffset) - 1);
if (Executor.getA(Fields.BOffset))
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
else
Executor.ForwardQueue();
break;
case (Modifiers::B):
case (Modifiers::AB):
Executor.setB(Fields.BOffset, Executor.getB(Fields.BOffset) - 1);
if (Executor.getB(Fields.BOffset))
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
else
Executor.ForwardQueue();
break;
case (Modifiers::I):
case (Modifiers::F):
case (Modifiers::X):
case (Modifiers::Not):
Executor.setB(Fields.BOffset, Executor.getB(Fields.BOffset) - 1);
Executor.setA(Fields.BOffset, Executor.getA(Fields.BOffset) - 1);
if (Executor.getA(Fields.BOffset) || Executor.getB(Fields.BOffset))
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
else
Executor.ForwardQueue();
break;
}
if (Executor.getCurrentAddress() == Fields.AOffset) {
std::cout << Executor.getCurrentWarriorName() << " jump to "
<< Fields.AOffset << " address" << std::endl;
} else
std::cout << Executor.getCurrentWarriorName()
<< " will execute the next command at address " << Fields.AOffset
<< std::endl;
}
Djn_command *Clone() override { return new Djn_command(*this); }
};
namespace {
Instruction *Djnab() { return new Djn_command(Modifiers::AB); }
Instruction *Djnba() { return new Djn_command(Modifiers::BA); }
Instruction *Djnta() { return new Djn_command(Modifiers::A); }
Instruction *Djntb() { return new Djn_command(Modifiers::B); }
Instruction *Djntf() { return new Djn_command(Modifiers::F); }
Instruction *Djnx() { return new Djn_command(Modifiers::X); }
Instruction *Djni() { return new Djn_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("DJN.AB", &Djnab);
bool b = Factory::get_instance()->regist3r("DJN.BA", &Djnba);
bool c = Factory::get_instance()->regist3r("DJN.A", &Djnta);
bool d = Factory::get_instance()->regist3r("DJN.B", &Djntb);
bool f = Factory::get_instance()->regist3r("DJN.F", &Djntf);
bool e = Factory::get_instance()->regist3r("DJN.X", &Djnx);
bool g = Factory::get_instance()->regist3r("DJN.I", &Djni);
bool w = Factory::get_instance()->nameRegister("DJN","DJN");
}
<file_sep>/gui/mainwindow.h
#ifndef MAINWINDOW_H
#define MAINWINDOW_H
#include <QMainWindow>
#include <QLabel>
namespace Ui {
class MainWindow;
}
class MainWindow : public QMainWindow
{
Q_OBJECT
public:
explicit MainWindow(QWidget *parent = 0);
~MainWindow();
public slots:
void controllerCommand(int Warrior, int Address,bool isDanger);
void filesOpenError(const std::string& file);
void filesError(const char* message);
void reOrganize();
void sizeError(const std::string& message);
void separationError();
void rezults(const std::string& winner);
void off();
private slots:
void on_checkBox_2_clicked();
void on_comboBox_activated(int index);
void on_checkBox_clicked();
void on_pushButton_clicked();
void on_actionExit_triggered();
private:
Ui::MainWindow *ui;
QLabel* filesErrors[6];
};
#endif // MAINWINDOW_H
<file_sep>/core_war/factory.hpp
//
//
#ifndef CORE_WAR_FACTORY_HPP
#define CORE_WAR_FACTORY_HPP
class Factory;
#include "instructions.hpp"
#include "template_hash_table.hpp"
static const int HashConst = 31;
size_t HashCalculation (const std::string& in );
class Factory {
public:
typedef Instruction *(*creator_t)();
static Factory *get_instance() {
static Factory f;
return &f;
}
Instruction* create(const std::string& a) {
std::cout<<a<<std::endl;
if (!creatorzzz.Contains(a))
throw std::runtime_error("unknown command");
return creatorzzz[a]();
}
bool regist3r(const std::string& a, creator_t create) {
creatorzzz[a] = create;
return true;
}
bool Containz(const std::string& k) {
return creatorzzz.Contains(k);
}
bool nameRegister(const std::string& a,const std::string& b) {
if(namez.Contains(a))
throw std::runtime_error("Two commands have a same name!");
namez[a] = b;
return true;
}
private:
HashTable<std::string,creator_t ,&HashCalculation> creatorzzz;
HashTable<std::string,std::string,&HashCalculation> namez;
};
#endif //CORE_WAR_FACTORY_HPP
<file_sep>/darkest_dungeon/TickedEffects.java
public abstract class TickedEffects implements Effect {
final public void giveInfluence(Unit unit,BattleContext field) {
if (ticks>0) {
makeEffect(unit,field);
ticks--;
}
else
unit.loseEffect(this);
}
abstract public void makeEffect(Unit unit,BattleContext field);
protected int ticks = 3;
}
<file_sep>/core_war/warrior.cpp
//
//
#include <cstring>
#include "warrior.hpp"
void Warrior::Born(char *in, size_t limit) {
std::ifstream fin;
fin.open(in);
if (!fin.is_open())
throw std::runtime_error("no file in directory!");
else {
size_t LineCounter = 0;
std::string RedCode;
std::string tmp;
getline(fin, RedCode);
LineCounter++;
if (RedCode != ";redcode-94")
throw std::runtime_error("not_redcode-94_file");
while (fin >> RedCode && RedCode[0] == ';') {
getline(fin, tmp);
LineCounter++;
if (RedCode == ";author")
Author = tmp;
if (RedCode == ";name")
Name = tmp;
}
if (RedCode == "ORG") {
getline(fin, tmp);
LineCounter++;
StartPosition = std::stoi(tmp);
fin >> RedCode;
}
size_t instruct_counter = 0;
try {
do { // getting Opcode
std::shared_ptr<Instruction> a;
a.reset(Factory::get_instance()->create(RedCode));
fin >> tmp; // getting AFIELD
if (Instruction::IsModAllowed(tmp[0])) { // checking for amod
a.get()->GetData().AOperandMod = static_cast<Mods>(tmp[0]);
a.get()->GetData().AOperand =
std::stoi(tmp.substr(1, tmp.length() - 1));
} else
a.get()->GetData().AOperand = std::stoi(tmp.substr(0, tmp.length()));
fin >> tmp; // getting B-field
if (Instruction::IsModAllowed(tmp[0])) {
a.get()->GetData().BOperandMod = static_cast<Mods>(tmp[0]);
a.get()->GetData().BOperand =
std::stoi(tmp.substr(1, tmp.length() - 1));
} else
a.get()->GetData().BOperand = std::stoi(tmp.substr(0, tmp.length()));
instruct_counter++;
Instructions.push_back(a);
getline(fin, tmp); // ignore comments
LineCounter++;
} while (fin >> RedCode);
}
catch (std::exception &x) {
std::string c = "Invalid command at ";
c += std::to_string(LineCounter + 1);
c += " line in file ";
c += in;
throw std::runtime_error(c);
}
if (instruct_counter == 0 || instruct_counter > limit)
throw std::runtime_error("too many or zero instructions in warrior!");
// }
// catch (std::exception) {
// throw std::runtime_error("incorrect file format");
//}
}
}
<file_sep>/core_war/mul_command.cpp
#include "factory.hpp"
class Mul_command: public Instruction {
public:
explicit Mul_command(Modifiers x) { Fields.OpcodeMod = x; Name = "MUL"; }
void Execution(ExecutionContext &Executor) override {
Executor.SetOffsets(Fields);
std::cout << Executor.getCurrentWarriorName()
<< " multiplies fields from: " << Fields.AOffset
<< " to: " << Fields.BOffset << std::endl;
switch (Fields.OpcodeMod) {
case (Modifiers::A):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.AOffset) *
Executor.getA(Fields.BOffset));
break;
case (Modifiers::B):
Executor.setB(Fields.BOffset,
Executor.getB(Fields.AOffset) *
Executor.getB(Fields.BOffset));
break;
case (Modifiers::AB):
Executor.setB(Fields.BOffset,
Executor.getA(Fields.AOffset) *
Executor.getB(Fields.BOffset));
break;
case (Modifiers::BA):
Executor.setA(Fields.BOffset,
Executor.getB(Fields.AOffset) *
Executor.getA(Fields.BOffset));
break;
case (Modifiers::Not):
case (Modifiers::I):
case (Modifiers::F):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.AOffset) *
Executor.getA(Fields.BOffset));
Executor.setB(Fields.BOffset,
Executor.getB(Fields.AOffset) *
Executor.getB(Fields.BOffset));
break;
case (Modifiers::X):
Executor.setB(Fields.BOffset,
Executor.getA(Fields.AOffset) *
Executor.getB(Fields.BOffset));
Executor.setA(Fields.BOffset,
Executor.getB(Fields.AOffset) *
Executor.getA(Fields.BOffset));
break;
}
Executor.ForwardQueue();
}
Mul_command* Clone() override { return new Mul_command(*this); }
};
namespace {
Instruction* Mulab() {return new Mul_command(Modifiers::AB);}
Instruction* Mulba() {return new Mul_command(Modifiers::BA);}
Instruction* Multa() {return new Mul_command(Modifiers::A);}
Instruction* Multb() {return new Mul_command(Modifiers::B);}
Instruction* Multf() {return new Mul_command(Modifiers::F);}
Instruction* Mulx() {return new Mul_command(Modifiers::X);}
Instruction* Muli() {return new Mul_command(Modifiers::I);}
bool a = Factory::get_instance()->regist3r("MUL.AB",&Mulab);
bool b = Factory::get_instance()->regist3r("MUL.BA",&Mulba);
bool c = Factory::get_instance()->regist3r("MUL.A",&Multa);
bool d = Factory::get_instance()->regist3r("MUL.B",&Multb);
bool f = Factory::get_instance()->regist3r("MUL.F",&Multf);
bool e = Factory::get_instance()->regist3r("MUL.X",&Mulx);
bool g = Factory::get_instance()->regist3r("MUL.I",&Muli);
bool w = Factory::get_instance()->nameRegister("MUL","MUL");
}
<file_sep>/core_war/CircularBuffer.cpp
#include "CircularBuffer.hpp"
#include "mars.h"
bool CircularBuffer::GameIsOn(size_t Count, size_t Tie) {
return (WarCounter > 1 && Count != Tie);
}
void CircularBuffer::InsertPrev(Iterator &it, Flow &in) {
(*(*it).FlowCounter)++;
  std::cout << (*it).Name << " was born at address " << in.Address
            << "; total flows now: " << *(*it).FlowCounter << std::endl;
data.insert(it.cur, in);
}
void CircularBuffer::DeleteCurrent(CircularBuffer::Iterator &i) {
(*(*i).FlowCounter)--;
  std::cout << "Flow counter of warrior " << (*i).Name << " = "
            << *(*i).FlowCounter << std::endl;
if (!(*(*i).FlowCounter)) {
WarCounter--;
}
i.cur = data.erase(i.cur);
i.beg = data.begin();
}
void CircularBuffer::Insert(Flow &in) {
WarCounter++;
data.push_back(in);
}
CircularBuffer::Iterator CircularBuffer::begin() { return Iterator(data); }<file_sep>/hash_table/CMakeLists.txt
cmake_minimum_required(VERSION 3.6)
#project(testing)
#add_definitions(-fprofile-arcs)
#add_definitions(-ftest-coverage)
#add_definitions(-lgcov)
add_definitions(-o)
add_definitions(--coverage)
#add_definitions(-g)
#set(CMAKE_CXX_STANDARD 11)
find_package(GTest REQUIRED)
include_directories(${GTEST_INCLUDE_DIRS})
#set(SOURCE_FILES main.cpp fact.cpp)
add_executable(runTests tests.cpp )
target_link_libraries(runTests ${GTEST_LIBRARIES} pthread)
<file_sep>/gui/winners.cpp
#include "winners.h"
#include "controller.h"
winners::winners(QWidget *parent) : QWidget(parent) {
lay = new QHBoxLayout;
winner = new QLabel;
rematch = new QPushButton;
rematch->setText("Rematch with those warriors");
exit_ = new QPushButton;
  exit_->setText("EXIT FROM THIS VIETNAM");
newGame = new QPushButton;
newGame->setText("New game with new warriors and settings");
lay->addWidget(winner);
lay->addWidget(rematch);
lay->addWidget(exit_);
lay->addWidget(newGame);
setLayout(lay);
newGame->hide();
rematch->hide();
connect(exit_, SIGNAL(clicked()), this, SLOT(exitVietname()));
}
void winners::exitVietname() { exit(-322); }
void winners::getWinner(const std::string &win) {
winner->setText(QString::fromStdString(win));
}
<file_sep>/Befunge/AskForNumber.java
public class AskForNumber implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.askForNumber();
}
}
<file_sep>/core_war/sub_command.cpp
#include "factory.hpp"
class Sub_command: public Instruction {
public:
explicit Sub_command(Modifiers x) { Fields.OpcodeMod = x;Name = "SUB";}
void Execution(ExecutionContext &Executor) override {
Executor.SetOffsets(Fields);
std::cout << Executor.getCurrentWarriorName()
              << " performs SUB from: " << Fields.AOffset
<< " to: " << Fields.BOffset << std::endl;
switch (Fields.OpcodeMod) {
case (Modifiers::A):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.AOffset) -
Executor.getA(Fields.BOffset));
break;
case (Modifiers::B):
Executor.setB(Fields.BOffset,
Executor.getB(Fields.AOffset) -
Executor.getB(Fields.BOffset));
break;
case (Modifiers::AB):
Executor.setB(Fields.BOffset,
Executor.getA(Fields.AOffset) -
Executor.getB(Fields.BOffset));
break;
case (Modifiers::BA):
Executor.setA(Fields.BOffset,
Executor.getB(Fields.AOffset) -
Executor.getA(Fields.BOffset));
break;
case (Modifiers::Not):
case (Modifiers::I):
case (Modifiers::F):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.AOffset) -
Executor.getA(Fields.BOffset));
Executor.setB(Fields.BOffset,
Executor.getB(Fields.AOffset) -
Executor.getB(Fields.BOffset));
break;
case (Modifiers::X):
Executor.setB(Fields.BOffset,
Executor.getA(Fields.AOffset) -
Executor.getB(Fields.BOffset));
Executor.setA(Fields.BOffset,
Executor.getB(Fields.AOffset) -
Executor.getA(Fields.BOffset));
break;
}
Executor.ForwardQueue();
}
Sub_command* Clone() override {
return new Sub_command(*this);
}
};
namespace {
Instruction* Subab() {return new Sub_command(Modifiers::AB);}
Instruction* Subba() {return new Sub_command(Modifiers::BA);}
Instruction* Subta() {return new Sub_command(Modifiers::A);}
Instruction* Subtb() {return new Sub_command(Modifiers::B);}
Instruction* Subtf() {return new Sub_command(Modifiers::F);}
Instruction* Subx() {return new Sub_command(Modifiers::X);}
Instruction* Subi() {return new Sub_command(Modifiers::I);}
bool a = Factory::get_instance()->regist3r("SUB.AB",&Subab);
bool b = Factory::get_instance()->regist3r("SUB.BA",&Subba);
bool c = Factory::get_instance()->regist3r("SUB.A",&Subta);
bool d = Factory::get_instance()->regist3r("SUB.B",&Subtb);
bool f = Factory::get_instance()->regist3r("SUB.F",&Subtf);
bool e = Factory::get_instance()->regist3r("SUB.X",&Subx);
bool g = Factory::get_instance()->regist3r("SUB.I",&Subi);
bool w = Factory::get_instance()->nameRegister("SUB","SUB");
}
<file_sep>/core_war/jmz_command.cpp
#include "factory.hpp"
class Jmz_command : public Instruction {
public:
explicit Jmz_command(Modifiers x) { Fields.OpcodeMod = x;Name = "JMZ"; }
void Execution(ExecutionContext &Executor) override {
if (Fields.AOperandMod == Mods::Lattice &&
Fields.BOperandMod == Mods::Lattice) {
Fields.AOffset = Fields.AOperand;
Fields.BOffset = Fields.BOperand;
} else {
Executor.SetOffsets(Fields);
}
switch (Fields.OpcodeMod) {
case (Modifiers::A):
case (Modifiers::BA):
if (!Executor.getA(Fields.BOffset))
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
else
Executor.ForwardQueue();
break;
case (Modifiers::Not):
case (Modifiers::B):
case (Modifiers::AB):
if (!Executor.getB(Fields.BOffset)) {
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
std::cout << Executor.getCurrentWarriorName() << " jump to "
<< Executor.getCurrentAddress() << " address" << std::endl;
} else {
Executor.ForwardQueue();
std::cout << Executor.getCurrentWarriorName()
<< " will execute next command in "
                  << Executor.getCurrentAddress() << " address" << std::endl;
}
break;
case (Modifiers::I):
case (Modifiers::F):
case (Modifiers::X):
if (!Executor.getA(Fields.BOffset) && !Executor.getB(Fields.BOffset)) {
Executor.ChangeCurrentFlowAddress(Fields.AOffset);
std::cout << Executor.getCurrentWarriorName() << " jump to "
<< Executor.getCurrentAddress() << " address" << std::endl;
} else {
Executor.ForwardQueue();
std::cout << Executor.getCurrentWarriorName()
<< " will execute next command in "
                  << Executor.getCurrentAddress() << " address" << std::endl;
}
break;
}
}
Jmz_command *Clone() override { return new Jmz_command(*this); }
};
namespace {
Instruction *Jmzab() { return new Jmz_command(Modifiers::AB); }
Instruction *Jmzba() { return new Jmz_command(Modifiers::BA); }
Instruction *Jmzta() { return new Jmz_command(Modifiers::A); }
Instruction *Jmztb() { return new Jmz_command(Modifiers::B); }
Instruction *Jmztf() { return new Jmz_command(Modifiers::F); }
Instruction *Jmzx() { return new Jmz_command(Modifiers::X); }
Instruction *Jmzi() { return new Jmz_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("JMZ.AB", &Jmzab);
bool b = Factory::get_instance()->regist3r("JMZ.BA", &Jmzba);
bool c = Factory::get_instance()->regist3r("JMZ.A", &Jmzta);
bool d = Factory::get_instance()->regist3r("JMZ.B", &Jmztb);
bool f = Factory::get_instance()->regist3r("JMZ.F", &Jmztf);
bool e = Factory::get_instance()->regist3r("JMZ.X", &Jmzx);
bool g = Factory::get_instance()->regist3r("JMZ.I", &Jmzi);
bool w = Factory::get_instance()->nameRegister("JMZ","JMZ");
}<file_sep>/core_war/add_command.cpp
#include "factory.hpp"
class Add_command : public Instruction {
public:
explicit Add_command(Modifiers x) { Fields.OpcodeMod = x; Name = "ADD"; }
void Execution(ExecutionContext &Executor) override {
Executor.SetOffsets(Fields);
std::cout << Executor.getCurrentWarriorName()
              << " performs ADD from: " << Fields.AOffset
<< " to: " << Fields.BOffset << std::endl;
switch (Fields.OpcodeMod) {
case (Modifiers::A):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.AOffset) +
Executor.getA(Fields.BOffset));
break;
case (Modifiers::B):
Executor.setB(Fields.BOffset,
Executor.getB(Fields.AOffset) +
Executor.getB(Fields.BOffset));
break;
case (Modifiers::AB):
Executor.setB(Fields.BOffset,
Executor.getA(Fields.AOffset) +
Executor.getB(Fields.BOffset));
break;
case (Modifiers::BA):
Executor.setA(Fields.BOffset,
Executor.getB(Fields.AOffset) +
Executor.getA(Fields.BOffset));
break;
case (Modifiers::Not):
case (Modifiers::I):
case (Modifiers::F):
Executor.setA(Fields.BOffset,
Executor.getA(Fields.AOffset) +
Executor.getA(Fields.BOffset));
Executor.setB(Fields.BOffset,
Executor.getB(Fields.AOffset) +
Executor.getB(Fields.BOffset));
break;
case (Modifiers::X):
Executor.setB(Fields.BOffset,
Executor.getA(Fields.AOffset) +
Executor.getB(Fields.BOffset));
Executor.setA(Fields.BOffset,
Executor.getB(Fields.AOffset) +
Executor.getA(Fields.BOffset));
break;
}
Executor.ForwardQueue();
}
Add_command *Clone() override { return new Add_command(*this); }
};
namespace {
Instruction *addab() { return new Add_command(Modifiers::AB); }
Instruction *addba() { return new Add_command(Modifiers::BA); }
Instruction *addta() { return new Add_command(Modifiers::A); }
Instruction *addtb() { return new Add_command(Modifiers::B); }
Instruction *addtf() { return new Add_command(Modifiers::F); }
Instruction *addx() { return new Add_command(Modifiers::X); }
Instruction *addi() { return new Add_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("ADD.AB", &addab);
bool b = Factory::get_instance()->regist3r("ADD.BA", &addba);
bool c = Factory::get_instance()->regist3r("ADD.A", &addta);
bool d = Factory::get_instance()->regist3r("ADD.B", &addtb);
bool f = Factory::get_instance()->regist3r("ADD.F", &addtf);
bool e = Factory::get_instance()->regist3r("ADD.X", &addx);
bool g = Factory::get_instance()->regist3r("ADD.I", &addi);
bool w = Factory::get_instance()->nameRegister("ADD","ADD");
}
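The ADD, SUB, and MUL commands all share one dispatch pattern: the opcode modifier selects which A/B fields of the source and destination cells are combined. A standalone sketch for addition (the `Cell`/`apply` names are assumptions for illustration, not the repo's API; F also stands in for I and the default case, as in the switches above):

```cpp
#include <cassert>

// Each core cell holds an A-operand and a B-operand.
struct Cell { int a, b; };

enum class Mod { A, B, AB, BA, F, X };

// Compute "dst = dst + src" under the given modifier, mirroring the
// switch in Add_command above.
Cell apply(Mod m, Cell src, Cell dst) {
    switch (m) {
        case Mod::A:  dst.a += src.a; break;                 // A-field from A-field
        case Mod::B:  dst.b += src.b; break;                 // B-field from B-field
        case Mod::AB: dst.b += src.a; break;                 // src A into dst B
        case Mod::BA: dst.a += src.b; break;                 // src B into dst A
        case Mod::F:  dst.a += src.a; dst.b += src.b; break; // both fields, parallel
        case Mod::X:  dst.b += src.a; dst.a += src.b; break; // both fields, crossed
    }
    return dst;
}
```

In the repo the X case writes the B-field first and then re-reads through the core, so evaluation order matters when the source and destination are the same cell; this sketch sidesteps that by operating on copies.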
<file_sep>/chess/src/Bishop.java
public class Bishop extends Figure {
Bishop(Side side) {
super(side);
}
@Override
protected void getListOfAvailiblePosition(ChessBoard board, int i, int j) {
diagonalSearch(board, i, j);
}
private void diagonalSearch(ChessBoard board, int i, int j) {
int scaledI = i, scaledJ = j;
System.out.println(i + " " + j);
while (celllLooking(board, ++scaledI, ++scaledJ)) ;
scaledI = i;
scaledJ = j;
while (celllLooking(board, ++scaledI, --scaledJ)) ;
scaledI = i;
scaledJ = j;
while (celllLooking(board, --scaledI, ++scaledJ)) ;
scaledI = i;
scaledJ = j;
while (celllLooking(board, --scaledI, --scaledJ)) ;
}
@Override
public String getName() {
return "[" + side.name() + " B]";
}
}
<file_sep>/Befunge/AskForSymbol.java
public class AskForSymbol implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.askForSymbol();
}
}
<file_sep>/darkest_dungeon/MoveAbility.java
public class MoveAbility extends Ability {
MoveAbility(int maxBack,int maxForward) {
this.maxBack = maxBack;
this.maxForward = maxForward;
}
@Override
boolean isLegal(BattleContext field) {
int currentPos = field.getUnitPosition(this.user);
return false;
}
@Override
void upgrade() {
}
@Override
public void choseTargets(BattleContext field) {
}
@Override
void using(BattleContext field) {
}
private int maxForward;
private int maxBack;
}
<file_sep>/Befunge/StackMoreThan.java
public class StackMoreThan implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.stackMoreThan();
}
}
<file_sep>/core_war/slt_command.cpp
#include "factory.hpp"
class Slt_command : public Instruction {
public:
explicit Slt_command(Modifiers x) { Fields.OpcodeMod = x;Name = "SLT";}
void Execution (ExecutionContext& Executor) override {
Executor.SetOffsets(Fields);
        switch (Fields.OpcodeMod) {
            case (Modifiers::A):
                if (Executor.getA(Fields.AOffset) < Executor.getA(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::Not):
            case (Modifiers::B):
                if (Executor.getB(Fields.AOffset) < Executor.getB(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::AB):
                if (Executor.getA(Fields.AOffset) < Executor.getB(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::BA):
                if (Executor.getB(Fields.AOffset) < Executor.getA(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::F):
            case (Modifiers::I):
                if (Executor.getA(Fields.AOffset) < Executor.getA(Fields.BOffset) ||
                    Executor.getB(Fields.AOffset) < Executor.getB(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::X):
                if (Executor.getA(Fields.AOffset) < Executor.getB(Fields.BOffset) ||
                    Executor.getB(Fields.AOffset) < Executor.getA(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
        }
Executor.ForwardQueue();
}
Slt_command* Clone() override {
return new Slt_command(*this);
}
};
namespace {
Instruction* sltab() {return new Slt_command(Modifiers::AB);}
Instruction* sltba() {return new Slt_command(Modifiers::BA);}
Instruction* sltta() {return new Slt_command(Modifiers::A);}
Instruction* slttb() {return new Slt_command(Modifiers::B);}
Instruction* slttf() {return new Slt_command(Modifiers::F);}
Instruction* sltx() {return new Slt_command(Modifiers::X);}
Instruction* slti() {return new Slt_command(Modifiers::I);}
bool a = Factory::get_instance()->regist3r("SLT.AB",&sltab);
bool b = Factory::get_instance()->regist3r("SLT.BA",&sltba);
bool c = Factory::get_instance()->regist3r("SLT.A",&sltta);
bool d = Factory::get_instance()->regist3r("SLT.B",&slttb);
bool f = Factory::get_instance()->regist3r("SLT.F",&slttf);
bool e = Factory::get_instance()->regist3r("SLT.X",&sltx);
bool g = Factory::get_instance()->regist3r("SLT.I",&slti);
bool w = Factory::get_instance()->nameRegister("SLT","SLT");
}<file_sep>/core_war/heder.h
//
// Created by ulyssess on 14.10.17.
//
#ifndef CORE_WAR_HEDER_H
#define CORE_WAR_HEDER_H
#endif //CORE_WAR_HEDER_H
<file_sep>/gui/main.cpp
#include <QApplication>
#include "controller.h"
#include "mainwindow.h"
int main(int argc, char* argv[]) {
  model XXX;
  controller::get_instance()->setModel(&XXX);
  QApplication a(argc, argv);
  // Calling main() recursively is not allowed in C++; restart the UI in a loop instead.
  while (true) {
    MainWindow w;
    try {
      w.show();
      a.exec();
      return 0;
    } catch (std::exception&) {
      w.hide();
    }
  }
}
<file_sep>/core_war/factory.cpp
#include "factory.hpp"
size_t HashCalculation(const std::string &in) {
size_t length = in.length();
size_t HashValue = in[length - 1] % HashConst;
for (int i = length - 2; i >= 0; i--)
HashValue = HashValue * HashConst + in[i] % HashConst;
return HashValue;
}
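HashCalculation above is a right-to-left polynomial rolling hash over the string's characters. A standalone version for experimentation (`HashConst` is defined elsewhere in the repo; 31 is assumed here purely for illustration):

```cpp
#include <cassert>
#include <string>

const size_t kHashConst = 31;  // stand-in for the repo's HashConst

// Right-to-left polynomial hash: h = c0 + c1*K + c2*K^2 + ... (mod-reduced chars).
size_t hashCalc(const std::string& in) {
    size_t length = in.length();
    size_t h = in[length - 1] % kHashConst;
    for (int i = static_cast<int>(length) - 2; i >= 0; i--)
        h = h * kHashConst + in[i] % kHashConst;
    return h;
}
```

The factory presumably uses this value to index registered opcode names; the key property is determinism: equal strings always hash equal.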
<file_sep>/torrent/src/PiecesCreator.java
import java.util.ArrayList;
import java.util.List;
public class PiecesCreator {
public List<FilePiece> getFilePieces(TorrentFile torrentFile) {
ArrayList<FilePiece> pieces = new ArrayList<>();
final long fileLength = torrentFile.getFileLength();
final long pieceLength = torrentFile.getPieceLength();
final long pieceCount = torrentFile.getPiecesCount();
int i;
        for (i = 0; i < pieceCount - 1; i++) {
            pieces.add(new FilePiece(i, pieceLength));
        }
        pieces.add(new FilePiece(i, fileLength - (pieceCount - 1) * pieceLength)); // last piece may be smaller than the rest
return pieces;
}
}
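The last-piece size follows directly from the file and piece lengths: every piece but the last is `pieceLength` bytes, and the final piece gets the remainder. A quick standalone check of that arithmetic (written in C++ for consistency with the rest of the repo; names are illustrative):

```cpp
#include <cassert>
#include <cstdint>

// Size of piece `index` (0-based) when a file of `fileLength` bytes is split
// into pieces of `pieceLength` bytes, the last one possibly shorter.
int64_t pieceSize(int64_t fileLength, int64_t pieceLength, int64_t index) {
    int64_t pieceCount = (fileLength + pieceLength - 1) / pieceLength;  // ceiling division
    if (index < pieceCount - 1)
        return pieceLength;
    return fileLength - (pieceCount - 1) * pieceLength;  // remainder for the last piece
}
```

For example, a 1000-byte file with 300-byte pieces yields four pieces of sizes 300, 300, 300, and 100.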
<file_sep>/core_war/instructions.hpp
//
//
#ifndef CORE_WAR_INSTRUCTIONS_HPP
#define CORE_WAR_INSTRUCTIONS_HPP
class Instruction;
struct Data;
enum class Mods:char;
enum class Modifiers;
#include <vector>
#include <iostream>
#include "ExecutionContext.hpp"
enum class Mods : char {
Lattice = '#',
Dollar = '$',
Dog = '@',
Star ='*' ,
Less = '<',
More = '>',
Open ='{',
Close ='}',
Not = ' '
};
enum class Modifiers { A = 1, B =2, AB=3, BA=4, F=5, X=6, I=7, Not=0};
struct Data {
Modifiers OpcodeMod = Modifiers::F;
Mods AOperandMod = Mods:: Dollar;
Mods BOperandMod = Mods::Dollar;
int AOperand = 0;
int BOperand = 0;
int AOffset = 0;
int BOffset = 0;
};
class Instruction {
public:
Instruction(){ Fields.OpcodeMod=Modifiers::Not;}
virtual void Execution (ExecutionContext& Executor) = 0;
    size_t GettingCode(unsigned i) {
        if (Fields.OpcodeMod == Modifiers::F && !Fields.AOperand && !Fields.BOperand)
            return 0;
        std::cout << static_cast<int>(Fields.OpcodeMod)
                  << static_cast<char>(Fields.AOperandMod) << Fields.AOperand
                  << static_cast<char>(Fields.BOperandMod) << Fields.BOperand
                  << " " << i << std::endl;
        return 0;
    }
Instruction (const Instruction& in) = default;
virtual Instruction* Clone() = 0;
static bool IsModAllowed (char c) {
const char allowed [8] = {'@','*','$','#','{','}','>','<'};
for (char i : allowed)
if (c== i)
return true;
return false;
}
Data& GetData() { return Fields;}
virtual ~Instruction() = default;
protected:
std::string Name;
Data Fields;
};
#endif //CORE_WAR_INSTRUCTIONS_HPP
<file_sep>/Befunge/GetFromStack.java
public class GetFromStack implements Command {
@Override
public void execute(BefungeEnvironment context) {
context.getFromStack();
}
}
<file_sep>/core_war/seq_command.cpp
#include "factory.hpp"
class Seq_command : public Instruction {
public:
explicit Seq_command(Modifiers x) { Fields.OpcodeMod = x;Name = "SEQ";}
Seq_command() = default;
void Execution (ExecutionContext& Executor) override {
Executor.SetOffsets(Fields);
        switch (Fields.OpcodeMod) {
            case (Modifiers::A):
                if (Executor.getA(Fields.AOffset) == Executor.getA(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::B):
                if (Executor.getB(Fields.AOffset) == Executor.getB(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::AB):
                if (Executor.getA(Fields.AOffset) == Executor.getB(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::BA):
                if (Executor.getB(Fields.AOffset) == Executor.getA(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::F):
                if (Executor.getA(Fields.AOffset) == Executor.getA(Fields.BOffset) &&
                    Executor.getB(Fields.AOffset) == Executor.getB(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::X):
                if (Executor.getA(Fields.AOffset) == Executor.getB(Fields.BOffset) &&
                    Executor.getB(Fields.AOffset) == Executor.getA(Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
            case (Modifiers::Not):
            case (Modifiers::I):
                if (Executor.isInstructionsEqual(Fields.AOffset, Fields.BOffset)) {
                    Executor.ForwardQueue();
                    Executor.ForwardQueue();
                    return;
                }
                break;
        }
Executor.ForwardQueue();
}
Seq_command* Clone() override {
return new Seq_command(*this);
}
};
namespace {
Instruction* seqab() {return new Seq_command(Modifiers::AB);}
Instruction* seqba() {return new Seq_command(Modifiers::BA);}
Instruction* seqta() {return new Seq_command(Modifiers::A);}
Instruction* seqtb() {return new Seq_command(Modifiers::B);}
Instruction* seqtf() {return new Seq_command(Modifiers::F);}
Instruction* seqx() {return new Seq_command(Modifiers::X);}
Instruction* seqi() {return new Seq_command(Modifiers::I);}
bool a = Factory::get_instance()->regist3r("SEQ.AB",&seqab);
bool b = Factory::get_instance()->regist3r("SEQ.BA",&seqba);
bool c = Factory::get_instance()->regist3r("SEQ.A",&seqta);
bool d = Factory::get_instance()->regist3r("SEQ.B",&seqtb);
bool f = Factory::get_instance()->regist3r("SEQ.F",&seqtf);
bool e = Factory::get_instance()->regist3r("SEQ.X",&seqx);
bool g = Factory::get_instance()->regist3r("SEQ.I",&seqi);
bool w = Factory::get_instance()->nameRegister("SEQ","SEQ");
}<file_sep>/hash_table/HashTable.h
//
// Created by ulyssess on 05.10.17.
//
#ifndef UNTITLED_HASHTABLE_H
#define UNTITLED_HASHTABLE_H
#include "HashTable.cpp"
class HashTable {
public:
    HashTable(); // default constructor
    ~HashTable(); // destructor
    HashTable(size_t); // constructor with a given capacity
    HashTable(const HashTable&); // copy constructor
    HashTable(HashTable&&); // move constructor
    void Swap(HashTable&); // swap contents
    HashTable& operator =(const HashTable&); // copy assignment
    HashTable& operator = (HashTable&&); // move assignment
    void Clear(); // remove all entries
    bool Erase(const Key&); // erase entry with key; true on success
    bool Insert(const Key& , const Value&); // add entry; true on success
    bool Contains(const Key&) const; // check whether a value exists for the key
    Value& operator[](const Key&); // return key's value; UNSAFE method;
    // if there is no value for this key, the method inserts a default value and returns a reference to it
    Value& At(const Key&); // return key's value; throws if absent
    const Value& At(const Key&) const; // return key's value; throws if absent
    size_t Size() const; // return current number of entries
    bool Empty() const; // true if empty
    void Show();
    bool Implace (const Key& k , const Value& v);
friend bool operator == (const HashTable&,const HashTable&);
friend bool operator != (const HashTable&,const HashTable&);
private:
size_t size = 0;
size_t capacity = 2;
EntriesList* data = nullptr;
int loadFactor = 0;
bool ReHashing();
};
#endif //UNTITLED_HASHTABLE_H
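The interface above describes a separate-chaining table (`EntriesList` buckets) that rehashes when the load factor grows. A minimal self-contained sketch of that scheme, using `std::list` buckets in place of the repo's `EntriesList` and a load-factor threshold of 1 (names, types, and threshold are assumptions for illustration):

```cpp
#include <cassert>
#include <functional>
#include <list>
#include <stdexcept>
#include <string>
#include <utility>
#include <vector>

// Separate chaining: each bucket is a list of key/value pairs; when the
// entry count reaches the bucket count, buckets are doubled and entries
// redistributed (the role of ReHashing in the header above).
class MiniTable {
public:
    bool Insert(const std::string& k, int v) {
        if (Contains(k)) return false;
        buckets_[index(k)].push_back({k, v});
        if (++size_ >= buckets_.size())  // load factor >= 1: grow
            rehash();
        return true;
    }
    bool Contains(const std::string& k) const {
        for (const auto& e : buckets_[index(k)])
            if (e.first == k) return true;
        return false;
    }
    int& At(const std::string& k) {
        for (auto& e : buckets_[index(k)])
            if (e.first == k) return e.second;
        throw std::out_of_range("key not found");
    }
    size_t Size() const { return size_; }
private:
    size_t index(const std::string& k) const {
        return std::hash<std::string>{}(k) % buckets_.size();
    }
    void rehash() {
        std::vector<std::list<std::pair<std::string, int>>> bigger(buckets_.size() * 2);
        for (auto& bucket : buckets_)
            for (auto& e : bucket)  // re-bucket every entry under the new capacity
                bigger[std::hash<std::string>{}(e.first) % bigger.size()].push_back(std::move(e));
        buckets_.swap(bigger);
    }
    std::vector<std::list<std::pair<std::string, int>>> buckets_{2};
    size_t size_ = 0;
};
```

The repo's table instead rebuilds into a temporary `HashTable` and calls `Swap`, but the bucket-and-redistribute mechanics are the same.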
<file_sep>/darkest_dungeon/Status.java
public enum Status {
ALIVE,DEAD,MISSING
}
<file_sep>/templates hashtable and shared ptr/template_shared_ptr.h
//
// Created by ulyssess on 12.10.17.
//
#ifndef HASHTABLE_TEMPLATE_TEMPLATE_SHARED_PTR_H
#define HASHTABLE_TEMPLATE_TEMPLATE_SHARED_PTR_H
template <class T>
struct Counter {
size_t count = 1;
T* a_ = nullptr;
Counter() :a_(nullptr){};
Counter(T* a): a_(a) { }
    ~Counter() {
        count = 0;
        delete a_;  // scalar delete: this pointer owns a single object, matching reset()
    }
};
template <class T>
class shared_ptr {
public:
shared_ptr() {
ctr = new Counter<T>(nullptr);
}
shared_ptr(T* a) {
ctr = new Counter<T>(a);
}
shared_ptr(shared_ptr &t) {
ctr = t.ctr;
ctr->count++;
}
~shared_ptr() {
if (ctr->count == 1)
delete ctr;
else
ctr->count--;
}
shared_ptr& operator = (shared_ptr& t) {
if (this!= &t) {
if (ctr->count == 1)
delete ctr;
else
ctr->count--;
ctr = t.ctr;
ctr->count++;
}
return *this;
}
T* release () {
T* tmp = ctr->a_;
ctr->count = 0 ;
ctr->a_= nullptr;
return tmp;
}
    void reset(T* other) {
        if (ctr->count == 1) {
            delete ctr->a_;   // sole owner: reuse the counter block
            ctr->a_ = other;
        } else {
            ctr->count--;     // detach from the shared counter
            ctr = new Counter<T>(other);
        }
    }
T* get () {
return ctr->a_;
}
private:
Counter<T>* ctr = nullptr;
};
#endif //HASHTABLE_TEMPLATE_TEMPLATE_SHARED_PTR_H
<file_sep>/core_war/CircularBuffer.hpp
//
//
#ifndef CORE_WAR_CIRCULARBUFFER_HPP
#define CORE_WAR_CIRCULARBUFFER_HPP
#include <list>
#include <iostream>
struct Flow {
int* FlowCounter;
int Address;
std::string Author;
std::string Name;
};
class CircularBuffer{
friend class MARS;
public:
class Iterator {
friend class CircularBuffer;
public:
explicit Iterator (std::list<Flow>& d) : beg(d.begin()), cur(d.begin()), end(d.end()) {}
Flow& operator *() const {
return *cur;
}
Iterator& operator++() {
if(++cur==end)
cur=beg;
return *this;
}
private:
std::list<Flow>::iterator beg;
std::list<Flow>::iterator cur;
std::list<Flow>::iterator end;
};
CircularBuffer() = default;
explicit CircularBuffer (size_t Counter) : FlowCounter(Counter) {}
~CircularBuffer() = default;
void Insert (Flow&);
void InsertPrev (Iterator&,Flow&);
bool GameIsOn(size_t,size_t);
void DeleteCurrent(Iterator&);
Iterator begin();
size_t GetFlowCounter() { return FlowCounter;}
private:
std::list<Flow> data;
int WarCounter=0;
size_t FlowCounter = 1;
};
#endif //CORE_WAR_CIRCULARBUFFER_HPP
<file_sep>/hash_table/HashTable.cpp
#include <iostream>
#include <stdexcept>
HashTable::HashTable(HashTable&& t) {
    data = t.data;
    size = t.size;
    capacity = t.capacity;
    loadFactor = t.loadFactor;
    t.data = nullptr;
    t.size = 0;
    t.capacity = 2;
    t.loadFactor = 0;
}
HashTable& HashTable:: operator = (HashTable&& t) {
// cout<< "move operator";
if (this!= &t) {
delete [] data;
data = t.data;
size = t.size;
capacity = t.capacity;
loadFactor=t.loadFactor;
t.data = nullptr;
}
return *this;
}
void HashTable::Clear() {
delete [] data;
size = 0;
capacity = 2;
loadFactor = 0;
data = new EntriesList[capacity];
}
Value& HashTable::operator[](const Key& k) {
if(this->Contains(k))
return data[HashCalculation(k) % capacity].At(k);
else {
Value Default;
this->Insert(k,Default);
return data[HashCalculation(k) % capacity].At(k);
}
}
bool operator == (const HashTable& t1, const HashTable& t2) {
if (t1.size!=t2.size)
return false;
for (size_t i = 0 ; i< t1.capacity; i++)
if(t1.data[i]!=t2.data[i])
return false;
return true;
}
bool operator != (const HashTable& t1, const HashTable& t2) {
return !(t1==t2);
}
Value& HashTable::At(const Key& k) {
    if (this->Contains(k))
        return this->data[HashCalculation(k)%capacity].At(k);
    throw std::out_of_range("HashTable::At: key not found");
}
HashTable:: HashTable() {
size = 0;
data = new EntriesList[capacity];
}
HashTable:: HashTable(size_t c): capacity(c) {
data = new EntriesList[capacity];
}
HashTable& HashTable :: operator = (const HashTable& t) {
if (this != &t) {
delete[] data;
size = t.size;
capacity = t.capacity;
loadFactor = t.loadFactor;
data = new EntriesList[capacity];
copy(t.data, t.data + t.capacity, data);
}
return *this;
}
HashTable::HashTable(const HashTable& t) {
// cout<<"zashel v konstructor"<<endl;
size = t.size;
capacity = t.capacity;
loadFactor = t.loadFactor;
data = new EntriesList[capacity];
for(size_t i = 0; i < capacity ; i++) {
data[i] = t.data[i];
}
}
HashTable:: ~HashTable() {
delete []data;
}
void HashTable:: Swap(HashTable& t ) {
size_t tmp = size;
size = t.size;
t.size = tmp;
tmp= capacity;
capacity=t.capacity;
t.capacity=tmp;
tmp = loadFactor;
loadFactor=t.loadFactor;
t.loadFactor=tmp;
EntriesList* tmp1 = data;
data=t.data;
t.data=tmp1;
}
bool HashTable:: Insert(const Key& k, const Value& t) {
if (Contains(k))
return false;
size_t index = HashCalculation(k) % capacity;
if(!data[index].addElement(k,t))
return false;
size++;
loadFactor=size/capacity;
if (loadFactor >= HighLoadFactor)
if (!ReHashing())
return false;
return true;
}
bool HashTable:: ReHashing() {
HashTable tmp(capacity * HighLoadFactor); // tmp hashtable with more capacity
for (size_t i = 0 ;i <capacity; i++) { // clear all lists in old and adding values tmp table
while(!data[i].IsEmpty()) {
Node ntmp = data[i].GetFirst();
if (!tmp.Insert(ntmp.CurrentKey,ntmp.Current))
return false;
}
}
Swap(tmp);
return true;
}
size_t HashTable::Size() const {
return size;
}
bool HashTable ::Empty() const {
    return size == 0;
}
bool HashTable::Contains(const Key& k) const {
size_t index = HashCalculation(k) % capacity;
return data[index].Contains(k);
}
bool HashTable::Erase(const Key& k) {
size_t index = HashCalculation(k) % capacity;
return data[index].Erase(k);
}
void HashTable::Show () {
    for (size_t i = 0; i < capacity; i++) {
        data[i].showList();
        std::cout << std::endl;
    }
}
<file_sep>/core_war/loader.cpp
//
//
#include "loader.hpp"
void Loader::loadCommands(std::vector<Warrior>& in, MARS& Simulator)
{
for (size_t i = 0; i<Simulator.size; i++) {
Simulator.Core.push_back(Simulator.Initial->Clone());
}
Flow tmp;
for (Warrior& X : in) { // for each Warrior
for (size_t i = 0; i<X.Instructions.size(); i++) {
delete Simulator.Core[(Simulator.Position+i)%Simulator.size];
Simulator.Core[(Simulator.Position+i)%Simulator.size] =
X.Instructions[i].get()->Clone(); // putting instructions in core
}
tmp.Name = X.Name;
tmp.Author = X.Author;
tmp.Address = (Simulator.Position+X.StartPosition)%Simulator.size;
tmp.FlowCounter = &X.FlowCounter;
Simulator.Flows.Insert(tmp);
Simulator.Position = (Simulator.Position+Simulator.SeparationRange)%Simulator.size;
}
}
<file_sep>/core_war/dat_command.cpp
#include "factory.hpp"
class Dat_command : public Instruction {
public:
explicit Dat_command(Modifiers x) { Fields.OpcodeMod = x; Name = "DAT"; }
void Execution(ExecutionContext &Executor) override {
std::cout << Executor.getCurrentWarriorName() << " BLOW UP!!! AT "<<Executor.getCurrentAddress()
<< std::endl;
Executor.DeleteFromQueue();
}
Instruction *Clone() override { return new Dat_command(*this); }
};
namespace {
Instruction *datab() { return new Dat_command(Modifiers::AB); }
Instruction *datba() { return new Dat_command(Modifiers::BA); }
Instruction *data() { return new Dat_command(Modifiers::A); }
Instruction *datb() { return new Dat_command(Modifiers::B); }
Instruction *datf() { return new Dat_command(Modifiers::F); }
Instruction *datx() { return new Dat_command(Modifiers::X); }
Instruction *dati() { return new Dat_command(Modifiers::I); }
bool a = Factory::get_instance()->regist3r("DAT.AB", &datab);
bool b = Factory::get_instance()->regist3r("DAT.BA", &datba);
bool c = Factory::get_instance()->regist3r("DAT.A", &data);
bool d = Factory::get_instance()->regist3r("DAT.B", &datb);
bool f = Factory::get_instance()->regist3r("DAT.F", &datf);
bool e = Factory::get_instance()->regist3r("DAT.X", &datx);
bool g = Factory::get_instance()->regist3r("DAT.I", &dati);
bool w = Factory::get_instance()->nameRegister("DAT","DAT");
}<file_sep>/core_war/Context.cpp
#include "ExecutionContext.hpp"
void ExecutionContext::SetOffset(Mods x, int Operand, int* AorB)
{
int size = Simulator->size;
(**it).Address %= size;
if ((**it).Address<0)
(**it).Address += size;
switch (x) {
case (Mods::Lattice):*AorB = 0;
break;
case (Mods::Star):*AorB = getA((**it).Address+Operand);
break;
case (Mods::Dog):*AorB = getB((**it).Address+Operand);
break;
case (Mods::Open):setA(((**it).Address+Operand), getA((**it).Address+Operand)-1);
*AorB = getA((**it).Address+Operand);
break;
case (Mods::Close):*AorB = getA((**it).Address+Operand);
setA(((**it).Address+Operand), getA((**it).Address+Operand)+1);
break;
case (Mods::More):*AorB = getB((**it).Address+Operand);
setB(((**it).Address+Operand), getB((**it).Address+Operand)+1);
break;
case (Mods::Less):setB(((**it).Address+Operand), getB((**it).Address+Operand)-1);
*AorB = getB((**it).Address+Operand);
break;
case (Mods::Dollar):
case (Mods::Not):*AorB = Operand;
break;
}
*AorB = (*AorB+(**it).Address);
*AorB %= size;
if (*AorB<0)
(*AorB) += size;
}
void ExecutionContext::SetOffsets(Data& in)
{
SetOffset(in.AOperandMod, in.AOperand, &in.AOffset);
SetOffset(in.BOperandMod, in.BOperand, &in.BOffset);
}
void ExecutionContext::setA(int which, int what)
{
int tmp = Simulator->size;
Simulator->Core[(which%tmp+tmp)%tmp]->GetData().AOperand = what;
}
void ExecutionContext::setB(int which, int what)
{
int tmp = Simulator->size;
Simulator->Core[(which%tmp+tmp)%tmp]->GetData().BOperand = what;
}
void ExecutionContext::ChangeCommand(int from, int which)
{
int tmp = Simulator->size;
if ((which%tmp+tmp)%tmp==(from%tmp+tmp)%tmp)
return;
delete Simulator->Core[(which%tmp+tmp)%tmp];
Simulator->Core[(which%tmp+tmp)%tmp] =
Simulator->Core[(from%tmp+tmp)%tmp]->Clone();
}
int ExecutionContext::getA(int which) const
{
int size = Simulator->size;
return Simulator->Core[(which%size+size)%size]->GetData().AOperand;
}
int ExecutionContext::getB(int which) const
{
int size = Simulator->size;
return Simulator->Core[(which%size+size)%size]->GetData().BOperand;
}
const std::string& ExecutionContext::getCurrentWarriorName() const
{
return (**it).Name;
}
void ExecutionContext::DeleteFromQueue() { Queue->DeleteCurrent(*it); }
void ExecutionContext::ForwardQueue()
{
(**it).Address = ((**it).Address+1)%Simulator->size;
}
void ExecutionContext::ChangeCurrentFlowAddress(int Address)
{
int size = Simulator->size;
(**it).Address = (Address%size+size)%size;
}
int ExecutionContext::getCurrentAddress() const { return (**it).Address; }
bool ExecutionContext::isInstructionsEqual(int a, int b)
{
int size = Simulator->size;
return Simulator->Core[((a%size)+size)%size]==
Simulator->Core[((b%size)+size)%size];
}
void ExecutionContext::AddToQueue(int Address)
{
if (*(**it).FlowCounter<Queue->GetFlowCounter()) {
Flow tmp = (**it);
int size = Simulator->size;
tmp.Address = (Address%size+size)%size;
Queue->InsertPrev(*it, tmp);
}
    else std::cout << "TOO MANY FLOWS FOR " << (**it).Name << std::endl;
}
<file_sep>/core_war/mars.h
//
//
#ifndef CORE_WAR_MARS_H
#define CORE_WAR_MARS_H
class MARS;
#include <random>
#include "CircularBuffer.hpp"
#include "factory.hpp"
#include "warrior.hpp"
//#include "template_hashtable_hpp"
class MARS {
friend class CircularBuffer;
friend class ExecutionContext;
friend class Loader;
public:
MARS();
~MARS();
int SetConfiguration(const char*);
void LoadCommands(std::vector<Warrior>& in);
void Battle();
size_t GetMaxProcess() const { return MaxProcess; }
void MakeOption(const std::string& option, const std::string& value);
private:
int size = 8192;
size_t TieCycles = 1000000;
Instruction* Initial;
size_t MaxLength = 300;
size_t MaxProcess = 64;
int MinSeparation = 300;
int SeparationRange = MinSeparation;
int NumberOfWarriors = 2;
std::vector<Instruction*> Core;
CircularBuffer Flows = CircularBuffer(MaxProcess);
std::random_device rd;
int Position = 0;
};
#endif //CORE_WAR_MARS_H
<file_sep>/hash_table/main.cpp
#include <iostream>
#include <string>
#include <assert.h>
using namespace std;
typedef string Key;
const int HashConst = 31;
const int HighLoadFactor = 2;
size_t HashCalculation (const Key& in) {
    if (in.empty())
        return 0;
    size_t length = in.length();
    size_t HashValue = in[length-1] % HashConst;
    for (int i = (int)length-2; i >= 0; i--)
        HashValue = HashValue*HashConst + in[i]%HashConst;
    return HashValue;
}
struct Value {
unsigned age;
unsigned weight;
Value(int,int);
Value();
};
Value::Value () {
age = 20;
weight = 75;
}
Value::Value (int a, int w):age(a),weight(w) {
}
struct Node {
Value Current;
Key CurrentKey = "Default Anonymous";
Node* Next = nullptr;
Node(const Key&,const Value&);
};
Node::Node(const Key& k ,const Value& t) {
CurrentKey = k;
Current.age = t.age;
Current.weight = t.weight;
Next = nullptr;
}
class EntriesList {
public:
EntriesList();
~EntriesList(); // destructor
EntriesList& operator=(const EntriesList&);// new assignment
EntriesList(const EntriesList&); // copy constructor;
bool addElement(const Key&,const Value&); // add element to the end of list
bool Erase(const Key&); // remove element from list
void showList();
bool Contains(const Key&) const; // checking value's availability with key
bool IsEmpty () const;
Node* GetFirst();
Value& At (const Key&) const;
    bool Emplace (const Key& k , const Value& v);
void Clear();
friend bool operator == (const EntriesList&,const EntriesList&);
private:
Node* head;
};
void EntriesList::Clear() {
while(head) {
Node* tmp = head;
head=head->Next;
delete tmp;
}
head = nullptr;
}
bool operator == (const EntriesList& t1, const EntriesList& t2) {
Node* tmp = t1.head;
while (tmp) {
if(!t2.Contains(tmp->CurrentKey))
return false;
tmp=tmp->Next;
}
tmp = t2.head;
while (tmp) {
if(!t1.Contains(tmp->CurrentKey))
return false;
tmp=tmp->Next;
}
return true;
}
bool operator != (const EntriesList& t1, const EntriesList& t2) {
return !(t1==t2);
}
Value& EntriesList::At(const Key& k) const {
    Node* tmp = head;
    while (tmp) {
        if (tmp->CurrentKey==k)
            return tmp->Current;
        tmp = tmp->Next;
    }
    throw exception(); // key not found; throw by value, not a pointer
}
bool EntriesList::IsEmpty() const {
return (head) ? false : true;
}
Node* EntriesList :: GetFirst() {
Node* tmp = head;
head=head->Next;
return tmp;
}
bool EntriesList::Contains (const Key& k) const {
Node* tmp = head;
while(tmp) {
if (tmp->CurrentKey==k)
return true;
tmp=tmp->Next;
}
return false;
}
EntriesList:: EntriesList() {
head = nullptr;
}
EntriesList:: EntriesList(const EntriesList&t) {
head = nullptr;
Node* tmp = t.head;
while (tmp){
this->addElement(tmp->CurrentKey,tmp->Current);
tmp=tmp->Next;
}
}
EntriesList& EntriesList:: operator = (const EntriesList& t) {
if (this!= &t) {
Node* tmp = t.head;
        Clear(); // drop existing nodes before copying from t
while (tmp) {
this->addElement(tmp->CurrentKey, tmp->Current);
tmp = tmp->Next;
}
}
return *this;
}
EntriesList:: ~EntriesList() {
while(head) {
Node* tmp= head;
head=head->Next;
delete tmp;
}
head = nullptr;
}
bool EntriesList :: addElement(const Key& k, const Value& info) {
if (!head) {
head = new Node(k,info);
if (!head)
return false;
}
else {
Node* tmp = head;
while(tmp->Next)
tmp=tmp->Next;
tmp->Next = new Node(k,info);
if (!tmp->Next)
return false;
}
return true;
}
void EntriesList :: showList() {
if(!head)
cout<<"Empty list"<<endl;
Node* tmp = head;
while (tmp) {
cout<<"Name:"<< tmp->CurrentKey << endl << "Age:" << tmp->Current.age << endl << "Weight:" << tmp->Current.weight << endl;
tmp=tmp->Next;
}
}
bool EntriesList :: Erase (const Key& k ) {
if (!head)
return false;
Node* tmp = head;
if (tmp->CurrentKey==k) {
head=head->Next;
delete tmp;
return true;
}
while (tmp->Next) {
if (tmp->Next->CurrentKey==k) {
Node* tmp1= tmp->Next;
tmp->Next=tmp1->Next;
delete tmp1;
return true;
}
tmp=tmp->Next;
}
return false;
}
class HashTable {
public:
HashTable(); // default constructor
~HashTable();// destructor
HashTable(size_t); // constuctor with any capacity
HashTable(const HashTable&); // assignment constructor
HashTable(HashTable&&); // move constructor;
void Swap(HashTable&);// swap values
HashTable& operator =(const HashTable&); // new assignment
HashTable& operator = (HashTable&&); // move operator;
void Clear(); // clear fields
bool Erase(const Key&); // erase field with key ; true if success
bool Insert(const Key& , const Value&); // add field ; true if success
bool Contains(const Key&) const; // checking value's availability with key
Value& operator[](const Key&); //return key's value; UNSAFE method;
// if there is no value for this key, method will add default value and return link to it
Value& At(const Key&);// return key's value; if there is no value - throw exception
const Value& At(const Key&) const; // return key's value; if there is no value - throw exception
size_t Size() const; // return current hashtable's size
bool Empty() const; // true if empty
void Show();
    bool Emplace (const Key& k , const Value& v);
friend bool operator == (const HashTable&,const HashTable&);
friend bool operator != (const HashTable&,const HashTable&);
private:
size_t size = 0;
size_t capacity = 2;
EntriesList* data = nullptr;
int loadFactor = 0;
bool ReHashing();
};
HashTable::HashTable(HashTable&& t) {
    data = t.data;
    size = t.size;
    capacity = t.capacity; // must be taken from t, or indexing breaks after a move
    loadFactor = t.loadFactor;
    t.data = nullptr;
    t.size = 0;
    t.capacity = 2;
    t.loadFactor = 0;
}
HashTable& HashTable:: operator = (HashTable&& t) {
// cout<< "move operator";
if (this!= &t) {
delete [] data;
data = t.data;
size = t.size;
capacity = t.capacity;
loadFactor=t.loadFactor;
t.data = nullptr;
}
return *this;
}
void HashTable::Clear() {
delete [] data;
size = 0;
capacity = 2;
loadFactor = 0;
data = new EntriesList[capacity];
}
Value& HashTable::operator[](const Key& k) {
if(this->Contains(k))
return data[HashCalculation(k) % capacity].At(k);
else {
Value Default;
this->Insert(k,Default);
return data[HashCalculation(k) % capacity].At(k);
}
}
bool operator == (const HashTable& t1, const HashTable& t2) {
if (t1.size!=t2.size)
return false;
for (size_t i = 0 ; i< t1.capacity; i++)
if(t1.data[i]!=t2.data[i])
return false;
return true;
}
bool operator != (const HashTable& t1, const HashTable& t2) {
return !(t1==t2);
}
Value& HashTable::At(const Key& k) {
if (this->Contains(k))
return this->data[HashCalculation(k)%capacity].At(k);
    else throw exception();
}
HashTable:: HashTable() {
capacity = 2;
loadFactor = 0;
size = 0;
data = new EntriesList[capacity];
}
HashTable:: HashTable(size_t c): capacity(c) {
data = new EntriesList[capacity];
}
HashTable& HashTable :: operator = (const HashTable& t) {
if (this != &t) {
delete[] data;
size = t.size;
capacity = t.capacity;
loadFactor = t.loadFactor;
data = new EntriesList[capacity];
copy(t.data, t.data + t.capacity, data);
}
return *this;
}
HashTable::HashTable(const HashTable& t) {
// cout<<"zashel v konstructor"<<endl;
size = t.size;
capacity = t.capacity;
loadFactor = t.loadFactor;
data = new EntriesList[capacity];
for(size_t i = 0; i < capacity ; i++) {
data[i] = t.data[i];
}
}
HashTable:: ~HashTable() {
delete []data;
}
void HashTable:: Swap(HashTable& t ) {
size_t tmp = size;
size = t.size;
t.size = tmp;
tmp = capacity;
capacity=t.capacity;
t.capacity=tmp;
tmp = loadFactor;
loadFactor=t.loadFactor;
t.loadFactor=tmp;
EntriesList* tmp1 = data;
data = t.data;
t.data=tmp1;
}
bool HashTable:: Insert(const Key& k, const Value& t) {
if (Contains(k))
return false;
size_t index = HashCalculation(k) % capacity;
if(!data[index].addElement(k,t))
return false;
size++;
loadFactor=size/capacity;
if (loadFactor >= HighLoadFactor)
if (!ReHashing())
return false;
return true;
}
bool HashTable:: ReHashing() {
HashTable tmp(capacity * HighLoadFactor); // tmp hashtable with more capacity
for (size_t i = 0 ;i <capacity; i++) { // clear all lists in old and adding values tmp table
while(!data[i].IsEmpty()) {
Node* ntmp = data[i].GetFirst();
if (!tmp.Insert(ntmp->CurrentKey,ntmp->Current))
return false;
delete ntmp;
}
}
Swap(tmp);
return true;
}
size_t HashTable::Size() const {
return size;
}
bool HashTable ::Empty() const {
return (!size) ? true : false;
}
bool HashTable::Contains(const Key& k) const {
size_t index = HashCalculation(k) % capacity;
return data[index].Contains(k);
}
bool HashTable::Erase(const Key& k) {
    size_t index = HashCalculation(k) % capacity;
    if (!data[index].Erase(k)) // only decrement size if something was actually removed
        return false;
    size--;
    return true;
}
void HashTable::Show () {
for (size_t i = 0; i<capacity;i++) {
data[i].showList();
std::cout<<endl;
}
}
/*int main()
{
Value a,b,c,d,g;
HashTable Z;
Z.Insert("KAMA THE BULLET",a);
Z.Insert("MAGA THE LEZGIN",b);
//cout<<"poka vse oke"<<endl;
Z.Show();
HashTable X = Z;
HashTable* V = new HashTable;
// cout<<"add"<<endl;
V->Insert("dasad",d);
HashTable* B = new HashTable;
*B = *V;
delete B;
delete V;
// cout<<"heh"<<endl;
return 0;
}*/
<file_sep>/chess/src/Main.java
public class Main {
public static void main(String[] args) {
ChessBoard x = new ClassicBoard();
x.setUpBoard();
x.showBoard();
}
}
<file_sep>/core_war/loader.hpp
//
//
#ifndef CORE_WAR_LOADER_HPP
#define CORE_WAR_LOADER_HPP
class Loader;
#include "mars.h"
class Loader {
public:
static void loadCommands (std::vector <Warrior>& in,MARS& Simulator);
};
#endif //CORE_WAR_LOADER_HPP
<file_sep>/IncomingListener.java
import java.io.*;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.channels.*;
import java.util.*;
public class IncomingListener implements Runnable {
private void sendInfoMessage(SocketChannel sc) {
String fileName = "bivis i batthhead";
System.out.println(fileName);
ByteBuffer a = ByteBuffer.allocate(4);
a.putInt(fileName.length());
a.rewind();
try {
sc.write(a);
sc.write(ByteBuffer.wrap(fileName.getBytes()));
}
catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void run() {
try (InputStreamReader loadfile = new InputStreamReader(IncomingListener.class.getClassLoader().getResourceAsStream(propertiesFileName), "UTF-8")) {
pathesToFiles.load(loadfile);
Selector selector = Selector.open();
ServerSocketChannel ssc = ServerSocketChannel.open();
ssc.bind(new InetSocketAddress("localhost", PORT));
ssc.configureBlocking(false);
int ops = ssc.validOps();
ssc.register(selector, ops, null);
int connections;
ByteBuffer readBuffer = ByteBuffer.allocate(4);
SocketChannel is = null;
String keyIp = "null";
while (true) {
connections = selector.select(1000);
                if (connections == 0) {
                    System.out.println("waiting for peer");
                    continue;
                }
Set<SelectionKey> keys = selector.selectedKeys();
Iterator<SelectionKey> it = keys.iterator();
while (it.hasNext()) {
SelectionKey key = it.next();
if (key.isAcceptable()) {
SocketChannel socket = ssc.accept();
keyIp = socket.getRemoteAddress().toString();
peersStatus.put(keyIp, true);
socket.configureBlocking(false);
socket.register(selector, SelectionKey.OP_READ | SelectionKey.OP_WRITE);
System.out.println("got one!");
}
if (key.isReadable()) {
                        System.out.println("readable!");
if (peersStatus.get(keyIp)) {
readBuffer.clear();
is = (SocketChannel) key.channel();
System.out.println(is);
int len = is.read(readBuffer);
if (len==-1){
                                System.out.println("peer died");
key.cancel();
break;
}
System.out.println(len);
readBuffer.rewind();
messageLen = readBuffer.getInt();
heh = ByteBuffer.allocate(messageLen);
System.out.println(messageLen);
peersStatus.replace(keyIp, false);
is.read(heh);
System.out.println(heh);
} else {
if (heh.hasRemaining()) {
                                System.out.println("still downloading!");
if (is.read(heh)==-1) {
                                    System.out.println("peer died!");
heh.clear();
readBuffer.clear();
key.cancel();
break;
}
System.out.println(heh);
} else if (heh.position() == messageLen) {
System.out.println(heh);
heh.clear();
                                System.out.println("download complete!");
peersStatus.replace(keyIp, true);
}
}
}
else if (key.isWritable()) {
if (heh.position()==messageLen) {
                            System.out.println("writing!");
String msg = new String(heh.array());
String[] msgData = msg.split(" ");
heh.clear();
if (msgData.length==1) {
if (pathesToFiles.containsKey(msg)) {
ByteBuffer x = ByteBuffer.allocate(4);
x.putInt("ALL".length());
x.rewind();
is.write(x);
is.write(ByteBuffer.wrap("ALL".getBytes()));
}
else {
ByteBuffer x = ByteBuffer.allocate(4);
x.putInt("NONE".length());
x.rewind();
is.write(x);
is.write(ByteBuffer.wrap("NONE".getBytes()));
}
peersStatus.replace(keyIp, true);
}
// piece message
else {
                                System.out.println("this is a piece message!");
for (String s:msgData) {
System.out.println(s);
}
}
}
// is = (SocketChannel)key.channel();
// if (is.write(ByteBuffer.wrap("sosi huy! like esli bi(d)lo".getBytes()))==0) {
// System.out.println("can't write!");
// key.cancel();
// }
// try {
// Thread.sleep(5000);
// } catch (InterruptedException e) {
// e.printStackTrace();
                    }
                    it.remove(); // remove the key inside the iteration, per the Selector pattern
                }
            }
} catch (IOException e) {
e.printStackTrace();
}
}
public static void startListening() {
}
private ByteBuffer heh = ByteBuffer.allocate(4);
private final int ASYNC_BUFF_LENGTH = 1024;
private List<Byte> tmpBuffer = new ArrayList<>();
private Map<String, ArrayList<Integer>> tableOfAvailability = new HashMap<>();
private Map <String,Boolean> peersStatus = new HashMap<>();
private List<Integer> myPieces = new ArrayList<>();
private int messageLen =-1;
private final int PORT = 9875;
private Properties pathesToFiles = new Properties();
private final String propertiesFileName = "files.properties";
}<file_sep>/gui/renderarea.cpp
/****************************************************************************
**
** Copyright (C) 2016 The Qt Company Ltd.
** Contact: https://www.qt.io/licensing/
**
** This file is part of the examples of the Qt Toolkit.
**
** $QT_BEGIN_LICENSE:BSD$
** Commercial License Usage
** Licensees holding valid commercial Qt licenses may use this file in
** accordance with the commercial license agreement provided with the
** Software or, alternatively, in accordance with the terms contained in
** a written agreement between you and The Qt Company. For licensing terms
** and conditions see https://www.qt.io/terms-conditions. For further
** information use the contact form at https://www.qt.io/contact-us.
**
** BSD License Usage
** Alternatively, you may use this file under the terms of the BSD license
** as follows:
**
** "Redistribution and use in source and binary forms, with or without
** modification, are permitted provided that the following conditions are
** met:
** * Redistributions of source code must retain the above copyright
** notice, this list of conditions and the following disclaimer.
** * Redistributions in binary form must reproduce the above copyright
** notice, this list of conditions and the following disclaimer in
** the documentation and/or other materials provided with the
** distribution.
** * Neither the name of The Qt Company Ltd nor the names of its
** contributors may be used to endorse or promote products derived
** from this software without specific prior written permission.
**
**
** THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
** "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
** LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
** A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
** OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
** SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
** LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
** DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
** THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
** (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
** OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE."
**
** $QT_END_LICENSE$
**
****************************************************************************/
#include "renderarea.h"
#include <QPainter>
#include <QMouseEvent>
#include <iostream>
#include<QWindow>
#include <QScrollBar>
#include<info.h>
#include "controller.h"
//! [0]
RenderArea::RenderArea(QWidget *parent)
: QWidget(parent)
{
setBackgroundRole(QPalette::Shadow);
setAutoFillBackground(true);
//scroll = parent;
}
//! [0]
void RenderArea::speedUpClick() {
std::cout<<" STATES FOR DRAWING= " << StatesForDrawing <<std::endl;
if (StatesForDrawing+10> UpLimit)
StatesForDrawing = UpLimit;
else
StatesForDrawing+=10;
}
void RenderArea::speedDownClick() {
std::cout<<" STATES FOR DRAWING= " << StatesForDrawing <<std::endl;
if (StatesForDrawing-10<DownLimit)
StatesForDrawing = DownLimit;
else
StatesForDrawing-=10;
}
void RenderArea::tick() {
Fields.clear();
std::pair <int, bool> x {-1,true};
for (int i = 0; i< Size;i++) {
Fields.push_back (x);
}
}
void RenderArea::ButtonClick() {
drawing=!drawing;
}
//! [1]
//!
QSize RenderArea::minimumSizeHint() const
{
return QSize(100, 100);
}
//! [1]
//! [2]
QSize RenderArea::sizeHint() const
{
return QSize(400, 200);
}
//! [2]
//! [3]
void RenderArea::setShape(Shape shape)
{
update();
}
//! [3]
//! [4]
//!
void RenderArea::setPen(const QPen &pen)
{
update();
}
//! [4]
//! [5]
void RenderArea::setBrush(const QBrush &brush)
{
update();
}
//! [5]
//! [6]
void RenderArea::setAntialiased(bool antialiased)
{
update();
}
//! [6]
//! [7]
void RenderArea::setTransformed(bool transformed)
{
update();
}
//! [7]
//! [8]
void RenderArea::ControlCommand(int Warrior, int Address, bool isDanger) {
// if(drawing) {
drawing = true;
States++;
Fields[Address] = {Warrior,isDanger};
if(States==StatesForDrawing) {
repaint();
States = 0;
}
}
void RenderArea::paintEvent(QPaintEvent * /* event */)
{
if (drawing) {
QColor c;
QPainter p (this);
int warNum;
for (int i = 0; i < Size;i++) {
warNum =std::get<0>(Fields[i]);
if( warNum== -1)
c = Qt::black;
else {
c = (std::get<1>(Fields[i])) ? Colours[warNum*2+1] : Colours[warNum*2];
}
QPen x(c,5,Qt::SolidLine);
p.setPen(x);
p.fillRect(widthRange*(i%CellsInWidth)+1,heightRange*(i/CellsInWidth) ,CellSize,CellSize,c);
}
}
}
void RenderArea::mouseDoubleClickEvent(QMouseEvent *event) {
if (drawing && event->button()==Qt::LeftButton) {
if (event->x() % widthRange <= CellSize && event->y() % heightRange <= CellSize ) {
int CellNumber = event->x()/ widthRange + (event->y() / heightRange)*CellsInWidth;
std::string WarriorName = controller::get_instance()->getWName(std::get<0>(Fields[CellNumber]));
std::shared_ptr<infoWindow> info(new infoWindow());
info.get()->setWindowTitle("Info about cell");
info.get()->setName(WarriorName);
info.get()->setCommand(controller::get_instance()->getCommandName(CellNumber));
info.get()->setPosition(CellNumber+1);
info.get()->setDanger(Fields[CellNumber].second);
info.get()->setGeometry(event->x(),event->y(),200,200);
info.get()->setBackgroundRole(QPalette::Shadow);
info.get()->setAutoFillBackground(true);
infos.push_back(info);
info->show();
}
}
}
void RenderArea::wheelEvent(QWheelEvent *event) {
const int degrees = event->delta() / 8;
int steps = degrees / 15;
int scaleFactor = 1;
widthRange = CellSize +2;
heightRange = CellSize +4;
if(steps > 0) {
CellSize = (CellSize >= maxSize) ? CellSize : (CellSize + scaleFactor);
widthRange = (CellSize >= maxSize) ? widthRange : (widthRange + scaleFactor);
heightRange = (CellSize >= maxSize) ? heightRange : (heightRange + scaleFactor);
if(size().width()> widthRange*CellsInWidth)
widthRange = size().width()/ CellsInWidth +1;
if(size().height()> heightRange*StringsInHeight)
            heightRange = size().height()/ StringsInHeight +1;
}
else {
CellSize = (CellSize <= minSize) ? CellSize : (CellSize - scaleFactor);
widthRange = (CellSize <= minSize) ? widthRange : (widthRange - scaleFactor);
heightRange = (CellSize <= minSize) ? heightRange : (heightRange - scaleFactor);
//setMinimumSize(widthRange*CellsInWidth,heightRange*StringsInHeight);
if(size().width()> widthRange*CellsInWidth)
widthRange = size().width()/ CellsInWidth +1;
if(size().height()> heightRange*StringsInHeight)
            heightRange = size().height()/ StringsInHeight +1;
}
repaint();
}
void RenderArea::setScroll(QScrollArea *sc) {
scroll = sc;
}
void RenderArea::mouseMoveEvent(QMouseEvent *event) {
QPoint delta = event->pos() - last;
if (delta.x() < -30 && scroll->horizontalScrollBar()->value()!= scroll->horizontalScrollBar()->maximum()) {
scroll->horizontalScrollBar()->setValue(scroll->horizontalScrollBar()->value() +150);
}
else if (delta.x()>30 &&scroll->horizontalScrollBar()->value()!= scroll->horizontalScrollBar()->minimum())
scroll->horizontalScrollBar()->setValue(scroll->horizontalScrollBar()->value() -150);
if (delta.y() < -30 && scroll->verticalScrollBar()->value()!= scroll->verticalScrollBar()->maximum())
scroll->verticalScrollBar()->setValue(scroll->verticalScrollBar()->value() +150);
else if (delta.y()>30 && scroll->verticalScrollBar()->value()!= scroll->verticalScrollBar()->minimum())
scroll->verticalScrollBar()->setValue(scroll->verticalScrollBar()->value() -150);
}
void RenderArea::mousePressEvent(QMouseEvent *event) {
if(event->button()==Qt::LeftButton)
last = event->pos();
else
infos.clear();
}
int RenderArea::getAreaWidth() {
return widthRange*CellsInWidth;
}
int RenderArea::getAreaHeight() {
return heightRange* StringsInHeight;
}
void RenderArea::setSize(int size) {
Size = size;
StringsInHeight = Size / CellsInWidth +1;
std::pair <int, bool> x {-1,true};
Fields.clear();
for (int i = 0; i< Size;i++) {
Fields.push_back (x);
}
setMinimumSize(widthRange*CellsInWidth,heightRange*StringsInHeight);
}
//! [13]
<file_sep>/Befunge/Command.java
public interface Command {
void execute (BefungeEnvironment context);
}
<file_sep>/darkest_dungeon/JesterHero.java
import java.util.Properties;
public class JesterHero extends Hero {
JesterHero(Properties data) {
super(data);
        System.out.println("here");
battleAbilities.add(1,new JesterDirkStabAbility(this));
battleAbilities.add(2,new JesterHarvestAbility(this));
}
JesterHero() {
super();
battleAbilities.add(1,new JesterDirkStabAbility(this));
battleAbilities.add(2,new JesterHarvestAbility(this));
maxHP = 19;
stunResist = 20;
bleedResist = 40;
moveResist = 20;
debuffResist = 40;
blightResist = 40;
armor = 0;
speed = 7;
dodge = 15;
accuracy = 100;
minDamage = 4;
maxDamage = 7;
critChance = 7;
size = 1;
}
@Override
public void moveBack(BattleContext field) {
field.changeUnitPosition(this,3,MoveDirection.BACK);
}
@Override
public void moveForward(BattleContext field) {
field.changeUnitPosition(this,3,MoveDirection.FORWARD);
}
@Override
protected void deathWithOutCorpse() {
System.out.println("Jester dead!");
}
@Override
protected void deathWithCorpse() {
System.out.println("Jester dead!");
}
}
<file_sep>/core_war/instructions.cpp
//
// Created by ulyssess on 24.10.17.
//
#include "instructions.hpp"
/*bool Mov_command::Execution(std::vector<Instruction> & Core, size_t Address) override {
/* size_t size = Core.size();
size_t Destination = BOperand;
size_t Source = AOperand;
/* switch (BOperandMod) { //to do for other operand mods (not only # and $)
case (Mods::Dollar):
Destination = (Destination +Address)%size;
case (Mods::Lattice):
}
if (BOperandMod==Mods::Dollar)
Destination = (Destination + Address)%size;
if(AOperandMod==Mods::Dollar)
Source = (Source+Address)%size;
*/
// Core[Destination]=Core[Source];
// return true;
/*bool Dat_command::Execution(std::vector<Instruction*> &Core, size_t Address) {
return false;
}*/
bool Instruction::nameRegister() {
return Factory::get_instance()->nameRegister(Name,Name);
}
<file_sep>/chess/src/Factory.java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
public class Factory {
public static Factory getInstance() throws FactoryInvalidInitialisation {
if (f == null)
f = new Factory();
return f;
}
public Figure getFigure(String id) throws FactoryFigureNotFoundException {
Class<?> c;
Object comm;
try {
c = Class.forName(Figures.getProperty(id));
comm = c.getDeclaredConstructor().newInstance();
} catch (Exception exc) {
throw new FactoryFigureNotFoundException("Unknown Figure");
}
return (Figure) comm;
}
private Factory() throws FactoryInvalidInitialisation {
Figures = new Properties();
InputStream is = Factory.class.getClassLoader().getResourceAsStream(configFileName);
if (is != null) {
try {
Figures.load(is);
} catch (IOException exc) {
throw new FactoryInvalidInitialisation("Missing config file!");
}
}
}
private static final String configFileName = "config.properties";
private static Factory f = null;
private Properties Figures;
}
<file_sep>/core_war/sne_command.cpp
#include "factory.hpp"
class Sne_command : public Instruction {
public:
explicit Sne_command(Modifiers x) { Fields.OpcodeMod = x; Name = "SNE";}
void Execution (ExecutionContext& Executor) override {
Executor.SetOffsets(Fields);
        // SNE: skip the next instruction when the compared fields differ.
        // Each case must break, otherwise control falls through to the next comparison.
        switch (Fields.OpcodeMod) {
        case (Modifiers::A):
            if (Executor.getA(Fields.AOffset)!=Executor.getA(Fields.BOffset)) {
                Executor.ForwardQueue();
                Executor.ForwardQueue();
                return;
            }
            break;
        case (Modifiers::B):
            if (Executor.getB(Fields.AOffset)!=Executor.getB(Fields.BOffset)) {
                Executor.ForwardQueue();
                Executor.ForwardQueue();
                return;
            }
            break;
        case (Modifiers::AB):
            if (Executor.getA(Fields.AOffset)!=Executor.getB(Fields.BOffset)) {
                Executor.ForwardQueue();
                Executor.ForwardQueue();
                return;
            }
            break;
        case (Modifiers::BA):
            if (Executor.getB(Fields.AOffset)!=Executor.getA(Fields.BOffset)) {
                Executor.ForwardQueue();
                Executor.ForwardQueue();
                return;
            }
            break;
        case (Modifiers::F):
            if (Executor.getA(Fields.AOffset)!=Executor.getA(Fields.BOffset)||Executor.getB(Fields.AOffset)!=Executor.getB(Fields.BOffset)) {
                Executor.ForwardQueue();
                Executor.ForwardQueue();
                return;
            }
            break;
        case (Modifiers::X):
            if (Executor.getA(Fields.AOffset)!=Executor.getB(Fields.BOffset)||Executor.getB(Fields.AOffset)!=Executor.getA(Fields.BOffset)) {
                Executor.ForwardQueue();
                Executor.ForwardQueue();
                return;
            }
            break;
        case (Modifiers::Not):
        case (Modifiers::I):
            if (!Executor.isInstructionsEqual(Fields.AOffset,Fields.BOffset)) {
                Executor.ForwardQueue();
                Executor.ForwardQueue();
                return;
            }
            break;
        }
Executor.ForwardQueue();
}
Sne_command* Clone() override {
return new Sne_command(*this);
}
};
namespace {
Instruction* sneab() {return new Sne_command(Modifiers::AB);}
Instruction* sneba() {return new Sne_command(Modifiers::BA);}
Instruction* sneta() {return new Sne_command(Modifiers::A);}
Instruction* snetb() {return new Sne_command(Modifiers::B);}
Instruction* snetf() {return new Sne_command(Modifiers::F);}
Instruction* snex() {return new Sne_command(Modifiers::X);}
Instruction* snei() {return new Sne_command(Modifiers::I);}
bool a = Factory::get_instance()->regist3r("SNE.AB",&sneab);
bool b = Factory::get_instance()->regist3r("SNE.BA",&sneba);
bool c = Factory::get_instance()->regist3r("SNE.A",&sneta);
bool d = Factory::get_instance()->regist3r("SNE.B",&snetb);
bool f = Factory::get_instance()->regist3r("SNE.F",&snetf);
bool e = Factory::get_instance()->regist3r("SNE.X",&snex);
bool g = Factory::get_instance()->regist3r("SNE.I",&snei);
bool w = Factory::get_instance()->nameRegister("SNE","SNE");
}<file_sep>/Befunge/Factory.java
import java.io.*;
import java.util.Properties;
import java.util.TreeMap;
public class Factory {
public static Factory getInstance(){
if(f==null)
f = new Factory();
return f;
}
public Command getCommand (String id) throws ClassNotFoundException {
Class<?> c;
Object comm;
try {
c = Class.forName(commands.getProperty(id));
comm = c.getDeclaredConstructor().newInstance();
}
catch (Exception exc) {
throw new ClassNotFoundException("Unknown command");
}
return (Command) comm;
}
private Factory() {
commands = new Properties();
InputStream is = Factory.class.getClassLoader().getResourceAsStream(configFileName);
if (is!=null) {
try {
commands.load(is);
}
catch (IOException exc) {
System.out.println("Missing config file!");
}
}
}
private final String configFileName = "config.txt";
private static Factory f = null;
private Properties commands;
}
| 03e92eca1c566dc5621d70f6b9e700f7ea4fa04e | [
"Java",
"CMake",
"C++",
"C"
]
| 87 | Java | ankutalev/16208_kutalev | e36ecdc14960175074bb0a6a62c31bddb77db1bf | 15610a5075bb9dd50c37998a06ce8ee5996e081e |
refs/heads/master | <repo_name>repanareddysekhar/Intro<file_sep>/OOPS/src/Interns.java
public class Interns extends Employee {
@Override
public void netSalary() {
System.out.println("Stipend");
}
}
| 2927464d029dab59db6695b60cf573731b4464f4 | [
"Java"
]
| 1 | Java | repanareddysekhar/Intro | e79e61297358dde550cbd994344c9e5c1ebefe86 | 6973b3a5864d873d3c6cd1ea15fb3ede52ad6ffd |
refs/heads/master | <file_sep># Python-Password-Generator
Python program to generate passwords without a GUI

<file_sep>import random
a='ABCDEFGHIJKLMNOPQRSTUVWXYZ'
b='abcdefghijklmnopqrstuvwxyz'
d='0123456789'
s='&~#{[|`^@]^$ù*;:!,?./§%µ£+°@^'
liste = []
print('''
▄████ ██▓███ ▄▄▄ ██████ ██████
██▒ ▀█▒▓██░ ██▒▒████▄ ▒██ ▒ ▒██ ▒
▒██░▄▄▄░▓██░ ██▓▒▒██ ▀█▄ ░ ▓██▄ ░ ▓██▄
░▓█ ██▓▒██▄█▓▒ ▒░██▄▄▄▄██ ▒ ██▒ ▒ ██▒
░▒▓███▀▒▒██▒ ░ ░ ▓█ ▓██▒▒██████▒▒▒██████▒▒
░▒ ▒ ▒▓▒░ ░ ░ ▒▒ ▓▒█░▒ ▒▓▒ ▒ ░▒ ▒▓▒ ▒ ░
░ ░ ░▒ ░ ▒ ▒▒ ░░ ░▒ ░ ░░ ░▒ ░ ░
░ ░ ░ ░░ ░ ▒ ░ ░ ░ ░ ░ ░
░ ░ ░ ░ ░
~3.0~
''')
def aleatoire():
    if not liste:
        print('Aucun jeu de caractères sélectionné.')
        return
    nb = int(input('Choisissez le nombre de caractère :> '))
    mot = ''
    for c in range(0, nb):
        x = random.randrange(len(liste))
        y = random.randrange(len(liste[x]))
        mot = mot + liste[x][y]
    print('''
###########################
#                         #
#   MDP : {0}
#                         #
###########################
'''.format(mot))
while True:
    ans_usr = input('''Choisir options:
a) Maj.
b) Min.
c) Chif.
d) Spec.
e) Lancer le proc.
f) Quit.
g) Retirer de la liste :> ''')
    if ans_usr == 'a':
        liste.append(a)
    if ans_usr == 'b':
        liste.append(b)
    if ans_usr == 'c':
        liste.append(d)
    if ans_usr == 'd':
        liste.append(s)
    if ans_usr == 'e':
        aleatoire()
    if ans_usr == 'f':
        break
    if ans_usr == 'g':
        while True:
            abc = input('''Sélectionner :>
a) Maj.
b) Min.
c) Chif.
d) Spec.
e) Sortir. :> ''')
            if abc == 'a' and a in liste:
                liste.remove(a)
            if abc == 'b' and b in liste:
                liste.remove(b)
            if abc == 'c' and d in liste:
                liste.remove(d)
            if abc == 'd' and s in liste:
                liste.remove(s)
            if abc == 'e':
                break
| cf5ba3e8849881009ef0c84c52e5900231545585 | [
"Markdown",
"Python"
]
| 2 | Markdown | Spocsk/Python-Password-Generator | c253af6f729f24380dd8ae81dfa523882e088223 | 94658da0b9b3b9048085e510120fa48c72e4eb21 |
refs/heads/master | <repo_name>Ayushjain2205/2-Runway-Airport-Simulation<file_sep>/header.h
struct Plane
{
int id;
int time;
struct Plane *next;
};
struct Queue
{
int count;
struct Plane *front, *rear;
};
struct Plane *newPlane(int id, int time);
struct Queue *createQueue();
void enQueue(struct Queue *q, struct Plane *temp);
struct Plane *deQueue(struct Queue *q);
int size(struct Queue *q);
int empty(struct Queue *q);
int full(struct Queue *q);
struct airport
{
struct Queue *R1_takeoff;
struct Queue *R2_takeoff;
struct Queue *R1_landing;
struct Queue *R2_landing;
int idletime;
int landwait, takeoffwait;
int nland, nplanes, nrefuse, ntakeoff;
struct Plane *pln;
};
void initairport(struct airport *ap);
void start(int *endtime, double *expectarrive, double *expectdepart);
void newplane(struct airport *ap, int curtime, int action);
void refuse(struct airport *ap, int action);
void land(struct airport *ap, struct Plane *pl, int curtime, int runway);
void fly(struct airport *ap, struct Plane *pl, int curtime, int runway);
void idle(struct airport *ap, int curtime);
void conclude(struct airport *ap, int endtime);
int randomnumber(double expectedvalue);
void apaddqueue(struct airport *ap, char type, int runway);
struct Plane *apdelqueue(struct airport *ap, char type, int runway);
int apsize(struct airport *ap, char type);
int runwayFull(struct airport *ap, char type, int runway);
int runwayEmpty(struct airport *ap, char type, int runway);
void myrandomize();
<file_sep>/client.c
#include <stdlib.h>
#include <ctype.h>
#include <math.h>
#include <time.h>
#include <limits.h>
#include <stdio.h>
#include"header.h"
#define MAX 3
#define ARRIVE 0
#define DEPART 1
int main()
{
//srand(time(0));
struct airport a;
int i, pri, curtime, endtime;
double expectarrive, expectdepart;
struct Plane *temp;
int test;
test = randomnumber(0.47);
printf("%d", test);
initairport(&a);
start(&endtime, &expectarrive, &expectdepart);
for (curtime = 1; curtime <= endtime; curtime++)
{
pri = randomnumber(expectarrive);
for (i = 1; i <= pri; i++)
{
newplane(&a, curtime, ARRIVE);
if (runwayFull(&a, 'l', 1))
refuse(&a, ARRIVE);
else
apaddqueue(&a, 'l', 1);
}
pri = randomnumber(expectdepart);
for (i = 1; i <= pri; i++)
{
newplane(&a, curtime, DEPART);
if (runwayFull(&a, 't', 2))
refuse(&a, DEPART);
else
apaddqueue(&a, 't', 2);
}
if (!(runwayEmpty(&a, 'l', 1))) //landing queue is not empty
{
temp = apdelqueue(&a, 'l', 1);
land(&a, temp, curtime, 1);
}// only if landing queue is empty take off is allowed
if (!(runwayEmpty(&a, 't', 2))) //takeoff queue is not empty
{
temp = apdelqueue(&a, 't', 2);
fly(&a, temp, curtime, 2);
}
else
{
idle(&a, curtime);
}
}
conclude(&a, endtime);
return 0;
}
<file_sep>/README.md
# 2-Runway-Airport-Simulation
Simulation of a two-runway airport where the user is required to enter:
-> Number of time units
-> Number of landings per unit time
-> Number of takeoffs per unit time
At the end of the simulation the user is given these statistics:
-> Number of landings
-> Number of takeoffs
-> Number of refusals
-> Idle time
-> Wait time
#include <stdlib.h>
#include <ctype.h>
#include <math.h>
#include <time.h>
#include <limits.h>
#include <stdio.h>
#define MAX 3
#define ARRIVE 0
#define DEPART 1
// A linked list (LL) node to store a queue entry
struct Plane
{
int id;
int time;
struct Plane *next;
};
// The queue, front stores the front node of LL and rear stores the
// last node of LL
struct Queue
{
int count;
struct Plane *front, *rear;
};
// A utility function to create a new linked list node.
struct Plane *newPlane(int id, int time)
{
struct Plane *temp = (struct Plane *)malloc(sizeof(struct Plane));
temp->id = id;
temp->time = time;
temp->next = NULL;
return temp;
}
// A utility function to create an empty queue
struct Queue *createQueue()
{
struct Queue *q = (struct Queue *)malloc(sizeof(struct Queue));
q->front = q->rear = NULL;
q->count = 0;
return q;
}
// The function to add a key k to q
void enQueue(struct Queue *q, struct Plane *temp)
{
// If queue is empty, then the new node is both front and rear
if (q->rear == NULL)
{
q->front = q->rear = temp;
q->count++;
return;
}
// Add the new node at the end of queue and change rear
q->rear->next = temp;
q->rear = temp;
q->count++;
}
// Function to remove a key from given queue q
struct Plane *deQueue(struct Queue *q)
{
// If queue is empty, there is nothing to remove
if (q->front == NULL)
{
return NULL;
}
// Store previous front and move front one node ahead
struct Plane *temp = q->front;
q->front = q->front->next;
q->count--;
// If front becomes NULL, then change rear also as NULL
if (q->front == NULL)
{
q->rear = NULL;
}
return temp;
}
int size(struct Queue *q)
{
return q->count;
}
int empty(struct Queue *q)
{
return (q->count <= 0);
}
int full(struct Queue *q)
{
return (q->count >= MAX);
}
struct airport
{
struct Queue *R1_takeoff;
struct Queue *R2_takeoff;
struct Queue *R1_landing;
struct Queue *R2_landing;
int idletime;
int landwait, takeoffwait;
int nland, nplanes, nrefuse, ntakeoff;
struct Plane *pln;
};
void initairport(struct airport *ap);
void start(int *endtime, double *expectarrive, double *expectdepart);
void newplane(struct airport *ap, int curtime, int action);
void refuse(struct airport *ap, int action);
void land(struct airport *ap, struct Plane *pl, int curtime, int runway);
void fly(struct airport *ap, struct Plane *pl, int curtime, int runway);
void idle(struct airport *ap, int curtime);
void conclude(struct airport *ap, int endtime);
int randomnumber(double expectedvalue);
void apaddqueue(struct airport *ap, char type, int runway);
struct Plane *apdelqueue(struct airport *ap, char type, int runway);
int apsize(struct airport *ap, char type);
int runwayFull(struct airport *ap, char type, int runway);
int runwayEmpty(struct airport *ap, char type, int runway);
void myrandomize();
void initairport(struct airport *ap)
{
ap->R1_landing = createQueue();
ap->R1_takeoff = createQueue();
ap->R2_landing = createQueue();
ap->R2_takeoff = createQueue();
ap->nplanes = ap->nland = ap->ntakeoff = ap->nrefuse = 0;
ap->landwait = ap->takeoffwait = ap->idletime = 0;
ap->pln = NULL;
}
void start(int *endtime, double *expectarrive, double *expectdepart)
{
int flag = 0;
char wish;
printf("\n\n\t\t WELCOME TO AIRPORT SIMULATOR \n\n");
printf("\n\t->Program that simulates an airport with 2 runways.\n");
printf("\t->One plane can land or depart from each runway in each unit of time.\n");
printf("\t->Runway 1 is for LANDING and Runway 2 is for TAKEOFF in general.\n");
printf("\t->Up to %d planes can be waiting to land or take off at any time.\n\n", MAX);
printf("\t->How many units of time will the simulation run?");
scanf("%d", endtime);
myrandomize();
do
{
printf("\n\t Expected number of arrivals per unit time ? ");
scanf("%lf", expectarrive);
printf("\n\tExpected number of departures per unit time ? ");
scanf("%lf", expectdepart);
if (*expectarrive < 0.0 || *expectdepart < 0.0)
{
printf("These numbers must be greater than 0.\n");
flag = 0;
}
else
{
if (*expectarrive + *expectdepart > 5.0)
{
printf("The airport will become saturated. Read new numbers(Enter y )? ");
scanf(" %c", &wish);
if (tolower(wish) == 'y')
flag = 0;
else
flag = 1;
}
else
flag = 1;
}
} while (flag == 0);
}
void newplane(struct airport *ap, int curtime, int action)
{
(ap->nplanes)++;
/* allocate a fresh node per plane; reusing a single ap->pln would
alias every queued plane (and ap->pln starts out unallocated) */
ap->pln = newPlane(ap->nplanes, curtime);
switch (action)
{
case ARRIVE:
printf("\t\tPlane %d ready to land.\n", ap->nplanes);
break;
case DEPART:
printf("\t\tPlane %d ready to take off.\n", ap->nplanes);
break;
}
}
void refuse(struct airport *ap, int action)
{
switch (action)
{
case ARRIVE:
printf("\t\tplane %d directed to another airport.\n", ap->pln->id);
break;
case DEPART:
printf("\t\tplane %d told to try later.\n", ap->pln->id);
break;
}
(ap->nrefuse)++;
}
void land(struct airport *ap, struct Plane *pl, int curtime, int runway)
{
int wait;
wait = curtime - pl->time;
printf("%d: Plane %d landed on Runway-%d ", curtime, pl->id, runway);
printf("in queue %d units \n", wait);
(ap->nland)++;
(ap->landwait) += wait;
}
void fly(struct airport *ap, struct Plane *pl, int curtime, int runway)
{
int wait;
wait = curtime - pl->time;
printf("\n%d: Plane %d took off from Runway-%d ", curtime, pl->id, runway);
printf("in queue %d units \n", wait);
(ap->ntakeoff)++;
(ap->takeoffwait) += wait;
}
void idle(struct airport *ap, int curtime)
{
printf("%d: Runway is idle.\n", curtime);
ap->idletime++;
}
void conclude(struct airport *ap, int endtime)
{
printf("\tSimulation has concluded after %d units.\n", endtime);
printf("\tTotal number of planes processed: %d\n", ap->nplanes);
printf("\tNumber of planes landed: %d\n", ap->nland);
printf("\tNumber of planes taken off: %d\n", ap->ntakeoff);
printf("\tNumber of planes refused use: %d\n", ap->nrefuse);
// printf("\tNumber left ready to land: %d\n", apsize(ap, 'l'));
// printf("\tNumber left ready to take off: %d\n", apsize(ap, 't'));
if (endtime > 0)
printf("\tPercentage of time runway idle: %.3lf \n", ((double)ap->idletime / endtime) * 100.0);
if (ap->nland > 0)
printf("\tAverage wait time to land: %.3lf \n", ((double)ap->landwait / ap->nland));
if (ap->ntakeoff > 0)
printf("\tAverage wait time to take off: %.3lf \n", ((double)ap->takeoffwait / ap->ntakeoff));
}
int randomnumber(double expectedvalue)
{
int n = 0; // counter of iterations
double limit;
double x; // pseudo-random number in [0, 1]
limit = exp(-expectedvalue);
x = rand() / (double)RAND_MAX;
while (x > limit)
{
n++;
x *= rand() / (double)RAND_MAX;
}
return n;
}
void apaddqueue(struct airport *ap, char type, int runway)
{
switch (tolower(type))
{
case 'l':
switch (runway)
{
case 1:
enQueue(ap->R1_landing, ap->pln);
break;
case 2:
enQueue(ap->R2_landing, ap->pln);
break;
}
break;
case 't':
switch (runway)
{
case 1:
enQueue(ap->R1_takeoff, ap->pln);
break;
case 2:
enQueue(ap->R2_takeoff, ap->pln);
break;
}
break;
}
}
struct Plane *apdelqueue(struct airport *ap, char type, int runway)
{
struct Plane *p1 = (struct Plane *)malloc(sizeof(struct Plane));
switch (tolower(type))
{
case 'l':
switch (runway)
{
case 1:
p1 = deQueue(ap->R1_landing);
break;
case 2:
p1 = deQueue(ap->R2_landing);
break;
}
break;
case 't':
switch (runway)
{
case 1:
p1 = deQueue(ap->R1_takeoff);
break;
case 2:
p1 = deQueue(ap->R2_takeoff);
break;
}
break;
}
return p1;
}
int apsize(struct airport *ap, char type)
{
switch (tolower(type))
{
case 'l':
return (size(ap->R1_landing) + size(ap->R2_landing));
case 't':
return (size(ap->R1_takeoff) + size(ap->R2_takeoff));
}
return 0;
}
int runwayFull(struct airport *ap, char type, int runway)
{
switch (tolower(type))
{
case 'l':
switch (runway)
{
case 1:
return (full(ap->R1_landing));
case 2:
return (full(ap->R2_landing));
}
break;
case 't':
switch (runway)
{
case 1:
return (full(ap->R1_takeoff));
case 2:
return (full(ap->R2_takeoff));
}
break;
}
return 0;
}
int runwayEmpty(struct airport *ap, char type, int runway)
{
switch (tolower(type))
{
case 'l':
switch (runway)
{
case 1:
return (empty(ap->R1_landing));
case 2:
return (empty(ap->R2_landing));
}
break;
case 't':
switch (runway)
{
case 1:
return (empty(ap->R1_takeoff));
case 2:
return (empty(ap->R2_takeoff));
}
break;
}
return 0;
}
void myrandomize()
{
srand((unsigned int)(time(NULL) % 10000));
}
void printQueue(struct Queue *q)
{
struct Plane *temp = q->front;
printf("data -> ");
while (temp != NULL)
{
printf("%d->", temp->id);
temp = temp->next;
}
} | 54ee2e795a903df3bdad96c1af40c792f265ecae | [
"Markdown",
"C"
]
| 4 | C | Ayushjain2205/2-Runway-Airport-Simulation | d2efcc7825782f7cfe2531e0e680314588a357c8 | 14c0edb14fb3b7a6002e9e5560fbdce70c98da7d |
refs/heads/master | <repo_name>revelrylabs/Harmonium-RN<file_sep>/src/Button/index.js
import React from 'react'
import { PropTypes } from 'prop-types'
import { Text, TouchableOpacity, View } from 'react-native'
import { transparent } from '../styles/colors'
import withTheme from '../styles/withTheme'
function getStyles (props) {
const { primary, accent, secondary, disabled, theme } = props
const { button, buttonDisabled, palette } = theme
const local = {
container: {}
}
if (!disabled) {
if (primary) {
local.container = { backgroundColor: palette.primaryColor }
} else if (accent) {
local.container = { backgroundColor: palette.accentColor }
} else if (secondary) {
local.container = {
borderWidth: 1,
borderColor: palette.primaryColor,
backgroundColor: transparent
}
local.text = { color: palette.primaryColor }
}
}
return {
container: [
button.container,
disabled && buttonDisabled.container,
local.container,
props.style.container
],
text: [
button.text,
disabled && buttonDisabled.text,
local.text,
props.style.text
]
}
}
class Button extends React.Component {
render () {
const { onPress, disabled, text } = this.props
const styles = getStyles(this.props)
const content = (
<View style={styles.container}>
<Text style={styles.text}>{text}</Text>
</View>
)
if (disabled) {
return content
}
return <TouchableOpacity onPress={onPress}>{content}</TouchableOpacity>
}
}
Button.propTypes = {
onPress: PropTypes.func.isRequired,
text: PropTypes.string.isRequired,
primary: PropTypes.bool,
secondary: PropTypes.bool,
accent: PropTypes.bool,
disabled: PropTypes.bool,
theme: PropTypes.object.isRequired
}
Button.defaultProps = {
primary: false,
accent: false,
secondary: false,
disabled: false,
style: {}
}
export default withTheme(Button)
<file_sep>/src/styles/typography.js
export const fontWeight = {
light: '300',
normal: '400',
medium: '500',
bold: '600'
}
export default {
fontWeight,
h1: {
fontWeight: fontWeight.bold,
fontSize: 32
},
h2: {
fontWeight: fontWeight.bold,
fontSize: 28
},
h3: {
fontWeight: fontWeight.bold,
fontSize: 24
},
h4: {
fontWeight: fontWeight.bold,
fontSize: 20
},
body: {
fontWeight: fontWeight.normal,
fontSize: 16,
lineHeight: 28
},
bodyLarge: {
fontWeight: fontWeight.normal,
fontSize: 18,
lineHeight: 30
},
bodySmall: {
fontWeight: fontWeight.normal,
fontSize: 12,
lineHeight: 20
},
buttons: {
fontWeight: fontWeight.medium,
fontSize: 18
}
}
<file_sep>/src/styles/spacing.js
export const globalPaddingTiny = 4
export const globalPaddingSmall = 8
export const globalPadding = 16
export const globalPaddingMedium = 24
export const globalPaddingLarge = 32
export const globalPaddingLarger = 48
export const globalPaddingJumbo = 64
export const globalMarginTiny = globalPaddingTiny
export const globalMarginSmall = globalPaddingSmall
export const globalMargin = globalPadding
export const globalMarginMedium = globalPaddingMedium
export const globalMarginLarge = globalPaddingLarge
export const globalMarginLarger = globalPaddingLarger
export const globalMarginJumbo = globalPaddingJumbo
<file_sep>/README.md
# Harmonium-RN
React Native Harmonium UI Toolkit
## Getting Started
```
npm install harmonium-rn
```
## Usage
Wrap your app in a `ThemeContext.Provider` to propagate your custom theme to all of your components. You can override the default theme by passing overrides into `getTheme` as the `value` for `ThemeContext.Provider`.
```
import React, { Component } from 'react';
import { COLOR, ThemeContext, getTheme } from 'harmonium-rn';
// You can override the default theme's color palette, individual component styles, and more
const themeOverrides = {
palette: {
primaryColor: COLOR.green500,
},
button: {
container: {
height: 30,
}
},
};
class Main extends Component {
render() {
return (
<ThemeContext.Provider value={getTheme(themeOverrides)}>
<App />
</ThemeContext.Provider>
);
}
}
```
You can access `theme` in props from anywhere:
```
import { withTheme } from 'harmonium-rn'
class CustomButton extends Component {
render() {
const { primaryColor } = this.props.theme.palette;
return ...
}
}
export default withTheme(CustomButton)
```
You can also pass style overrides to specific components instead of overriding elements for the entire theme:
```
<Button style={{container: {height: 30, backgroundColor: 'red'}}}>
```
## Development
- New components should be added to the `src` directory at the root of the project
- Each component should get its own directory with one `index.js` file
- Each component should have a `getStyles` function which takes both theme and style override props into account
- See the [Button](https://github.com/revelrylabs/Harmonium-RN/blob/master/src/Button/index.js#L7) component for a good example
- Please add stories to the example app storybook whenever adding new components
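
The `getStyles` convention above can be sketched in isolation. This is a minimal illustration under stated assumptions, not toolkit code: the `badge`/`badgeDisabled` theme keys and the sample theme object below are hypothetical.

```javascript
// Sketch of the getStyles convention: theme defaults first, then the
// disabled variant, then per-instance overrides from props.style.
// React Native merges style arrays left to right and ignores falsy
// entries, so later entries win. `badge`/`badgeDisabled` are
// hypothetical theme keys, not part of the default theme.
function getStyles (props) {
  const { theme, disabled } = props
  return {
    container: [
      theme.badge.container,
      disabled && theme.badgeDisabled.container,
      props.style.container
    ],
    text: [
      theme.badge.text,
      disabled && theme.badgeDisabled.text,
      props.style.text
    ]
  }
}

// A per-instance override layered on top of a theme default:
const theme = {
  badge: { container: { height: 24 }, text: { fontSize: 12 } },
  badgeDisabled: { container: { opacity: 0.5 }, text: {} }
}
const styles = getStyles({
  theme,
  disabled: false,
  style: { container: { height: 30 }, text: {} }
})
// styles.container is [{ height: 24 }, false, { height: 30 }]
```

A component built this way passes `styles.container` and `styles.text` straight into `View`/`Text`, exactly as `Button` does.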
## Running the Example App
- Run `npm run install:example` to install everything on the example app
- Run `npm run start:example` to start the example app
- NOTE: you will need to restart the example app if you make any changes to the harmonium package
<file_sep>/example/src/stories/Button.story.js
import React from 'react'
import {View} from 'react-native'
import {COLOR, Button, ThemeContext, getTheme} from 'harmonium-rn'
import {storiesOf} from '@storybook/react-native'
storiesOf('Button', module)
.addDecorator(story => (
<ThemeContext.Provider value={getTheme()}>
<View style={{padding: 30}}>{story()}</View>
</ThemeContext.Provider>
))
.add('Primary Button', () => (
<Button
primary
text='Primary Button'
onPress={() => console.warn('Clicked!')}
/>
))
.add('Secondary Button', () => (
<Button
secondary
text='Secondary Button'
onPress={() => console.warn('Clicked!')}
/>
))
.add('Accent Button', () => (
<Button
accent
text='Accent Button'
onPress={() => console.warn('Clicked!')}
/>
))
.add('Disabled Button', () => (
<Button
disabled
text='Disabled Button'
onPress={() => console.warn('Clicked!')}
/>
))
.add('Custom Button', () => (
<Button
style={{
container: {backgroundColor: COLOR.indigo900, borderRadius: 20},
text: {color: COLOR.indigo200},
}}
text='Custom Button'
onPress={() => console.warn('Clicked!')}
/>
))
| 3fa7c3df32fc316877fcd16e3c741e136d58e4ec | [
"JavaScript",
"Markdown"
]
| 5 | JavaScript | revelrylabs/Harmonium-RN | 2ecda72b349cd810f627bb0338cecbb85fb69246 | a99bbb5e8b798038c9bf8257917e60bae92afc56 |
refs/heads/dev | <file_sep>import IdentificationSuccessMessage from "@/protocol/network/messages/IdentificationSuccessMessage";
export default class IdentificationSuccessWithLoginTokenMessage extends IdentificationSuccessMessage {
public loginToken: string;
constructor(
login = "",
nickname = "",
accountId = 0,
communityId = 0,
hasRights = false,
secretQuestion = "",
subscriptionEndDate = 0,
wasAlreadyConnected = false,
accountCreation = 0,
loginToken = ""
) {
super(
login,
nickname,
accountId,
communityId,
hasRights,
secretQuestion,
subscriptionEndDate,
wasAlreadyConnected,
accountCreation
);
this.loginToken = loginToken;
}
}
<file_sep>import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameFightFighterInformations from "@/protocol/network/types/GameFightFighterInformations";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
import PlayerStatus from "@/protocol/network/types/PlayerStatus";
export default class GameFightFighterNamedInformations extends GameFightFighterInformations {
public name: string;
public status: PlayerStatus;
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats,
name = "",
status: PlayerStatus
) {
super(contextualId, look, disposition, teamId, alive, stats);
this.name = name;
this.status = status;
}
}
<file_sep>import AbstractFightDispellableEffect from "@/protocol/network/types/AbstractFightDispellableEffect";
export default class FightTemporaryBoostEffect extends AbstractFightDispellableEffect {
public delta: number;
constructor(
uid = 0,
targetid = 0,
turnduration = 0,
dispelable = 1,
spellid = 0,
parentboostuid = 0,
delta = 0
) {
super(uid, targetid, turnduration, dispelable, spellid, parentboostuid);
this.delta = delta;
}
}
<file_sep>import Account from "@/account";
import MerchantBuyAction from "../actions/merchants/MerchantBuyAction";
import OpenMerchantAction from "../actions/merchants/OpenMerchantAction";
export default class MerchantsAPI {
private account: Account;
constructor(account: Account) {
this.account = account;
}
public async open(cellId: number) {
await this.account.scripts.actionsManager.enqueueAction(
new OpenMerchantAction(cellId),
true
);
return true;
}
public async buy(gid: number, qty: number) {
await this.account.scripts.actionsManager.enqueueAction(
new MerchantBuyAction(gid, qty),
true
);
return true;
}
public objectsInShop() {
return this.account.game.merchants.objectsInShop();
}
}
<file_sep>import AbstractGameActionMessage from "@/protocol/network/messages/AbstractGameActionMessage";
export default class AbstractGameActionFightTargetedAbilityMessage extends AbstractGameActionMessage {
public targetId: number;
public destinationCellId: number;
public critical: number;
public silentCast: boolean;
constructor(
actionId = 0,
sourceId = 0,
targetId = 0,
destinationCellId = 0,
critical = 1,
silentCast = false
) {
super(actionId, sourceId);
this.targetId = targetId;
this.destinationCellId = destinationCellId;
this.critical = critical;
this.silentCast = silentCast;
}
}
<file_sep>import Account from "@/account";
import ScriptAction, {
ScriptActionResults
} from "@/scripts/actions/ScriptAction";
export default class CraftAction extends ScriptAction {
public _name: string = "CraftAction";
public gid: number;
public qty: number;
constructor(gid: number, qty: number) {
super();
this.gid = gid;
this.qty = qty;
}
public async process(account: Account): Promise<ScriptActionResults> {
if (
account.game.craft.setRecipe(this.gid) &&
account.game.craft.setQuantity(this.qty) &&
account.game.craft.ready()
) {
return ScriptAction.processingResult();
}
return ScriptAction.doneResult();
}
}
<file_sep>import Type from "@/protocol/network/types/Type";
export default class FightResultPvpData extends Type {
public grade: number;
public minHonorForGrade: number;
public maxHonorForGrade: number;
public honor: number;
public honorDelta: number;
constructor(
grade = 0,
minHonorForGrade = 0,
maxHonorForGrade = 0,
honor = 0,
honorDelta = 0
) {
super();
this.grade = grade;
this.minHonorForGrade = minHonorForGrade;
this.maxHonorForGrade = maxHonorForGrade;
this.honor = honor;
this.honorDelta = honorDelta;
}
}
<file_sep>import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameFightFighterInformations from "@/protocol/network/types/GameFightFighterInformations";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
export default class GameFightAIInformations extends GameFightFighterInformations {
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats
) {
super(contextualId, look, disposition, teamId, alive, stats);
}
}
<file_sep>import AbstractGameActionMessage from "@/protocol/network/messages/AbstractGameActionMessage";
import GameFightFighterInformations from "@/protocol/network/types/GameFightFighterInformations";
export default class GameActionFightSummonMessage extends AbstractGameActionMessage {
public summon: GameFightFighterInformations;
constructor(
actionId = 0,
sourceId = 0,
summon: GameFightFighterInformations
) {
super(actionId, sourceId);
this.summon = summon;
}
}
<file_sep>import Account from "@/account";
import Frames, { IFrame } from "@/frames";
import ExchangeBuyOkMessage from "@/protocol/network/messages/ExchangeBuyOkMessage";
import ExchangeLeaveMessage from "@/protocol/network/messages/ExchangeLeaveMessage";
import ExchangeStartOkHumanVendorMessage from "@/protocol/network/messages/ExchangeStartOkHumanVendorMessage";
export default class MerchantsFrame implements IFrame {
public register() {
Frames.dispatcher.register(
"ExchangeLeaveMessage",
this.HandleExchangeLeaveMessage,
this
);
Frames.dispatcher.register(
"ExchangeStartOkHumanVendorMessage",
this.HandleExchangeStartOkHumanVendorMessage,
this
);
Frames.dispatcher.register(
"ExchangeBuyOkMessage",
this.HandleExchangeBuyOkMessage,
this
);
}
private async HandleExchangeLeaveMessage(
account: Account,
message: ExchangeLeaveMessage
) {
account.game.merchants.UpdateExchangeLeaveMessage(message);
}
private async HandleExchangeStartOkHumanVendorMessage(
account: Account,
message: ExchangeStartOkHumanVendorMessage
) {
account.game.merchants.UpdateExchangeStartOkHumanVendorMessage(message);
}
private async HandleExchangeBuyOkMessage(
account: Account,
message: ExchangeBuyOkMessage
) {
account.game.merchants.UpdateExchangeBuyOkMessage(message);
}
}
<file_sep>import Account from "@/account";
import CraftAction from "@/scripts/actions/craft/CraftAction";
export default class CraftAPI {
private account: Account;
constructor(account: Account) {
this.account = account;
}
public async craft(gid: number, qty: number): Promise<boolean> {
await this.account.scripts.actionsManager.enqueueAction(
new CraftAction(gid, qty),
true
);
return true;
}
}
<file_sep>defaults.url=https://sentry.io/
defaults.org=cookie-project
defaults.project=cookietouch
auth.token=<PASSWORD>
cli.executable=./node_modules/@sentry/cli/bin/sentry-cli
<file_sep>import GameRolePlayMerchantInformations from "@/protocol/network/types/GameRolePlayMerchantInformations";
import GameRolePlayMerchantWithGuildInformations from "@/protocol/network/types/GameRolePlayMerchantWithGuildInformations";
export default class MerchantEntry {
public id: number = 0;
public name: string = "";
public cellId: number = 0;
constructor(
infos:
| GameRolePlayMerchantInformations
| GameRolePlayMerchantWithGuildInformations
) {
this.id = infos.contextualId;
this.name = infos.name;
this.cellId = infos.disposition.cellId;
}
}
<file_sep>import Message from "@/protocol/network/messages/Message";
import QuestActiveInformations from "@/protocol/network/types/QuestActiveInformations";
import QuestActiveDetailedInformations from "../types/QuestActiveDetailedInformations";
export default class QuestListMessage extends Message {
public finishedQuestsIds: number[];
public finishedQuestsCounts: number[];
public activeQuests: Array<
QuestActiveInformations | QuestActiveDetailedInformations
>;
constructor(
finishedQuestsIds: number[],
finishedQuestsCounts: number[],
activeQuests: Array<
QuestActiveInformations | QuestActiveDetailedInformations
>
) {
super();
this.finishedQuestsIds = finishedQuestsIds;
this.finishedQuestsCounts = finishedQuestsCounts;
this.activeQuests = activeQuests;
}
}
<file_sep>import FightLoot from "@/protocol/network/types/FightLoot";
import FightResultFighterListEntry from "@/protocol/network/types/FightResultFighterListEntry";
export default class FightResultMutantListEntry extends FightResultFighterListEntry {
public level: number;
constructor(
outcome = 0,
rewards: FightLoot,
id = 0,
alive = false,
level = 0
) {
super(outcome, rewards, id, alive);
this.level = level;
}
}
<file_sep>import ExchangeStartedMessage from "@/protocol/network/messages/ExchangeStartedMessage";
export default class ExchangeStartedWithPodsMessage extends ExchangeStartedMessage {
public firstCharacterId: number;
public firstCharacterCurrentWeight: number;
public firstCharacterMaxWeight: number;
public secondCharacterId: number;
public secondCharacterCurrentWeight: number;
public secondCharacterMaxWeight: number;
constructor(
exchangeType = 0,
firstCharacterId = 0,
firstCharacterCurrentWeight = 0,
firstCharacterMaxWeight = 0,
secondCharacterId = 0,
secondCharacterCurrentWeight = 0,
secondCharacterMaxWeight = 0
) {
super(exchangeType);
this.firstCharacterId = firstCharacterId;
this.firstCharacterCurrentWeight = firstCharacterCurrentWeight;
this.firstCharacterMaxWeight = firstCharacterMaxWeight;
this.secondCharacterId = secondCharacterId;
this.secondCharacterCurrentWeight = secondCharacterCurrentWeight;
this.secondCharacterMaxWeight = secondCharacterMaxWeight;
}
}
<file_sep>import Account from "@/account";
import { AccountStates } from "@/account/AccountStates";
import ExchangeBuyMessage from "@/protocol/network/messages/ExchangeBuyMessage";
import ExchangeBuyOkMessage from "@/protocol/network/messages/ExchangeBuyOkMessage";
import ExchangeLeaveMessage from "@/protocol/network/messages/ExchangeLeaveMessage";
import ExchangeOnHumanVendorRequestMessage from "@/protocol/network/messages/ExchangeOnHumanVendorRequestMessage";
import ExchangeStartOkHumanVendorMessage from "@/protocol/network/messages/ExchangeStartOkHumanVendorMessage";
import ObjectItemToSellInHumanVendorShop from "@/protocol/network/types/ObjectItemToSellInHumanVendorShop";
import LiteEvent from "@/utils/LiteEvent";
export default class Merchants {
public get MerchantOpened() {
return this.onMerchantOpened.expose();
}
public get ObjectBuyed() {
return this.onObjectBuyed.expose();
}
private account: Account;
private objectInfos: ObjectItemToSellInHumanVendorShop[] = [];
private onMerchantOpened = new LiteEvent<void>();
private onObjectBuyed = new LiteEvent<void>();
constructor(account: Account) {
this.account = account;
}
public open(cellId: number): boolean {
const merchant = this.account.game.map.merchants.find(
m => m.cellId === cellId
);
if (!merchant) {
return false;
}
this.account.network.sendMessageFree(
"ExchangeOnHumanVendorRequestMessage",
{
humanVendorCell: cellId,
humanVendorId: merchant.id
} as ExchangeOnHumanVendorRequestMessage
);
return true;
}
public objectsInShop() {
return this.objectInfos;
}
public buy(gid: number, qty: number): boolean {
const item = this.objectInfos.find(o => o.objectGID === gid);
if (!item) {
return false;
}
qty = qty <= 0 ? item.quantity : qty > item.quantity ? item.quantity : qty;
this.account.network.sendMessageFree("ExchangeBuyMessage", {
objectToBuyId: item.objectUID,
quantity: qty
} as ExchangeBuyMessage);
return true;
}
public UpdateExchangeLeaveMessage(message: ExchangeLeaveMessage) {
if (this.account.state !== AccountStates.TALKING) {
return;
}
this.account.state = AccountStates.NONE;
this.objectInfos = [];
}
public UpdateExchangeStartOkHumanVendorMessage(
message: ExchangeStartOkHumanVendorMessage
) {
this.account.state = AccountStates.TALKING;
this.objectInfos = message.objectsInfos;
this.onMerchantOpened.trigger();
}
public UpdateExchangeBuyOkMessage(message: ExchangeBuyOkMessage) {
this.onObjectBuyed.trigger();
}
}
<file_sep>import Message from "@/protocol/network/messages/Message";
import CharacterMinimalPlusLookInformations from "@/protocol/network/types/CharacterMinimalPlusLookInformations";
export default class PrismFightDefenderAddMessage extends Message {
public subAreaId: number;
public fightId: number;
public defender: CharacterMinimalPlusLookInformations;
constructor(
subAreaId = 0,
fightId = 0,
defender: CharacterMinimalPlusLookInformations
) {
super();
this.subAreaId = subAreaId;
this.fightId = fightId;
this.defender = defender;
}
}
<file_sep>import BasicGuildInformations from "@/protocol/network/types/BasicGuildInformations";
import FightLoot from "@/protocol/network/types/FightLoot";
import FightResultFighterListEntry from "@/protocol/network/types/FightResultFighterListEntry";
export default class FightResultTaxCollectorListEntry extends FightResultFighterListEntry {
public level: number;
public guildInfo: BasicGuildInformations;
public experienceForGuild: number;
constructor(
outcome = 0,
rewards: FightLoot,
id = 0,
alive = false,
level = 0,
guildInfo: BasicGuildInformations,
experienceForGuild = 0
) {
super(outcome, rewards, id, alive);
this.level = level;
this.guildInfo = guildInfo;
this.experienceForGuild = experienceForGuild;
}
}
<file_sep>import Account from "@/account";
import ScriptAction, {
ScriptActionResults
} from "@/scripts/actions/ScriptAction";
export default class OpenMerchantAction extends ScriptAction {
public _name: string = "OpenMerchantAction";
public cellId: number;
constructor(cellId: number) {
super();
this.cellId = cellId;
}
public async process(account: Account): Promise<ScriptActionResults> {
if (account.game.merchants.open(this.cellId)) {
return ScriptAction.processingResult();
}
account.logger.logWarning("Merchants", "No merchant on this cell");
return ScriptAction.failedResult();
}
}
<file_sep>import PartyInvitationMessage from "@/protocol/network/messages/PartyInvitationMessage";
export default class PartyInvitationDungeonMessage extends PartyInvitationMessage {
public dungeonId: number;
constructor(
partyId = 0,
partyType = 0,
maxParticipants = 0,
fromId = 0,
fromName = "",
toId = 0,
dungeonId = 0
) {
super(partyId, partyType, maxParticipants, fromId, fromName, toId);
this.dungeonId = dungeonId;
}
}
<file_sep>import Account from "@/account";
import DataManager from "@/protocol/data";
import Items from "@/protocol/data/classes/Items";
import { DataTypes } from "@/protocol/data/DataTypes";
export default class DataAPI {
constructor(account: Account) {
//
}
public async item(gid: number): Promise<Items> {
const item = await DataManager.get<Items>(DataTypes.Items, gid);
return item[0];
}
}
<file_sep>import AlignmentSubAreaUpdateMessage from "@/protocol/network/messages/AlignmentSubAreaUpdateMessage";
export default class AlignmentSubAreaUpdateExtendedMessage extends AlignmentSubAreaUpdateMessage {
public worldX: number;
public worldY: number;
public mapId: number;
public eventType: number;
constructor(
subAreaId = 0,
side = 0,
quiet = false,
worldX = 0,
worldY = 0,
mapId = 0,
eventType = 0
) {
super(subAreaId, side, quiet);
this.worldX = worldX;
this.worldY = worldY;
this.mapId = mapId;
this.eventType = eventType;
}
}
<file_sep>import Account from "@/account";
import ScriptAction, {
ScriptActionResults
} from "@/scripts/actions/ScriptAction";
export default class BuyAction extends ScriptAction {
public _name: string = "BuyAction";
public objectToBuyId: number;
public quantity: number;
constructor(objectToBuyId: number, quantity: number) {
super();
this.objectToBuyId = objectToBuyId;
this.quantity = quantity;
}
public async process(account: Account): Promise<ScriptActionResults> {
if (account.game.npcs.buyItem(this.objectToBuyId, this.quantity)) {
return ScriptAction.processingResult();
}
return ScriptAction.doneResult();
}
}
<file_sep>import Message from "@/protocol/network/messages/Message";
import HouseInformationsForSell from "@/protocol/network/types/HouseInformationsForSell";
export default class HouseToSellListMessage extends Message {
public houseList: HouseInformationsForSell[];
public pageIndex: number;
public totalPage: number;
constructor(
pageIndex = 0,
totalPage = 0,
houseList: HouseInformationsForSell[]
) {
super();
this.houseList = houseList;
this.pageIndex = pageIndex;
this.totalPage = totalPage;
}
}
<file_sep>import AbstractPartyMessage from "@/protocol/network/messages/AbstractPartyMessage";
export default class PartyInvitationMessage extends AbstractPartyMessage {
public partyType: number;
public maxParticipants: number;
public fromId: number;
public fromName: string;
public toId: number;
constructor(
partyId = 0,
partyType = 0,
maxParticipants = 0,
fromId = 0,
fromName = "",
toId = 0
) {
super(partyId);
this.partyType = partyType;
this.maxParticipants = maxParticipants;
this.fromId = fromId;
this.fromName = fromName;
this.toId = toId;
}
}
<file_sep>import ExchangeBidHouseInListAddedMessage from "@/protocol/network/messages/ExchangeBidHouseInListAddedMessage";
import ObjectEffect from "@/protocol/network/types/ObjectEffect";
export default class ExchangeBidHouseInListUpdatedMessage extends ExchangeBidHouseInListAddedMessage {
constructor(
itemUID = 0,
objGenericId = 0,
effects: ObjectEffect[],
prices: number[]
) {
super(itemUID, objGenericId, effects, prices);
}
}
<file_sep>import FightLoot from "@/protocol/network/types/FightLoot";
import FightResultAdditionalData from "@/protocol/network/types/FightResultAdditionalData";
import FightResultFighterListEntry from "@/protocol/network/types/FightResultFighterListEntry";
export default class FightResultPlayerListEntry extends FightResultFighterListEntry {
public additional: FightResultAdditionalData[];
public level: number;
constructor(
outcome = 0,
rewards: FightLoot,
id = 0,
alive = false,
level = 0,
additional: FightResultAdditionalData[]
) {
super(outcome, rewards, id, alive);
this.additional = additional;
this.level = level;
}
}
<file_sep>import Message from "@/protocol/network/messages/Message";
export default class HouseToSellFilterMessage extends Message {
public areaId: number;
public atLeastNbRoom: number;
public atLeastNbChest: number;
public skillRequested: number;
public maxPrice: number;
constructor(
areaId = 0,
atLeastNbRoom = 0,
atLeastNbChest = 0,
skillRequested = 0,
maxPrice = 0
) {
super();
this.areaId = areaId;
this.atLeastNbRoom = atLeastNbRoom;
this.atLeastNbChest = atLeastNbChest;
this.skillRequested = skillRequested;
this.maxPrice = maxPrice;
}
}
<file_sep># Cookie Themes
---
## Foreword
This guide explains the new Cookie Theme system, which is based on Material-UI.
Cookie Themes give you the ability to:
- Change the main color
- Change the secondary color
- Change font size
## Installation of a theme
For now, installing a theme is very easy: simply download your theme file or copy your theme JSON to:
```
./resources/themes/theme.json
```
## Theme Example
To help you create a theme, here is the template of the **Wood Cookie Theme**.
To easily create a theme you can use: [Material UI Theme Editor](https://in-your-saas.github.io/material-ui-theme-editor/)
```json
{
"palette": {
"common": {
"black": "#000",
"white": "#fff"
},
"background": {
"paper": "rgba(230, 200, 152, 1)",
"default": "rgba(255, 240, 218, 1)"
},
"primary": {
"light": "rgba(190, 116, 53, 1)",
"main": "rgba(139, 87, 42, 1)",
"dark": "rgba(100, 63, 31, 1)",
"contrastText": "#fff"
},
"secondary": {
"light": "rgba(35, 101, 177, 1)",
"main": "rgba(24, 89, 167, 1)",
"dark": "rgba(11, 55, 107, 1)",
"contrastText": "#fff"
},
"error": {
"light": "#e57373",
"main": "#f44336",
"dark": "#d32f2f",
"contrastText": "#fff"
},
"text": {
"primary": "rgba(0, 0, 0, 0.87)",
"secondary": "rgba(0, 0, 0, 0.54)",
"disabled": "rgba(0, 0, 0, 0.38)",
"hint": "rgba(0, 0, 0, 0.38)"
}
}
}
```
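As a sketch of how such a theme file might be validated before use, here is a minimal example. The function and interface names below are illustrative assumptions, not part of Cookie's actual API:

```typescript
// Illustrative sketch: parse a theme JSON string and check the fields the
// template above relies on. Names here are assumptions, not Cookie's real API.
interface ThemeColor {
  main: string;
  light?: string;
  dark?: string;
  contrastText?: string;
}

interface CookieTheme {
  palette: {
    primary: ThemeColor;
    secondary: ThemeColor;
  };
}

function parseTheme(json: string): CookieTheme {
  const theme = JSON.parse(json) as CookieTheme;
  if (!theme.palette || !theme.palette.primary || !theme.palette.primary.main) {
    throw new Error("Theme is missing palette.primary.main");
  }
  return theme;
}

const theme = parseTheme(
  '{"palette":{"primary":{"main":"rgba(139, 87, 42, 1)"},"secondary":{"main":"rgba(24, 89, 167, 1)"}}}'
);
console.log(theme.palette.primary.main); // rgba(139, 87, 42, 1)
```

A check like this would make a malformed `theme.json` fail loudly instead of silently falling back to partial colors.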
## Road map
Here are the future plans for the theming engine:
- A simple theme manager UI to select the current theme from the Cookie interface
- A custom theme format letting people identify, update, and name themes
- A theme sharing system
- A theme ranking system
## More Information
**Author:** **『 』**
**Last Edit:** 22 Feb 2019
<file_sep>import PaddockBuyableInformations from "@/protocol/network/types/PaddockBuyableInformations";
export default class PaddockAbandonnedInformations extends PaddockBuyableInformations {
public guildId: number;
constructor(
maxOutdoorMount = 0,
maxItems = 0,
price = 0,
locked = false,
guildId = 0
) {
super(maxOutdoorMount, maxItems, price, locked);
this.guildId = guildId;
}
}
<file_sep>import Message from "@/protocol/network/messages/Message";
import DungeonPartyFinderPlayer from "@/protocol/network/types/DungeonPartyFinderPlayer";
export default class DungeonPartyFinderRoomContentUpdateMessage extends Message {
public addedPlayers: DungeonPartyFinderPlayer[];
public removedPlayersIds: number[];
public dungeonId: number;
constructor(
dungeonId = 0,
addedPlayers: DungeonPartyFinderPlayer[],
removedPlayersIds: number[]
) {
super();
this.addedPlayers = addedPlayers;
this.removedPlayersIds = removedPlayersIds;
this.dungeonId = dungeonId;
}
}
<file_sep>import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameFightFighterInformations from "@/protocol/network/types/GameFightFighterInformations";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
export default class GameFightCompanionInformations extends GameFightFighterInformations {
public companionGenericId: number;
public level: number;
public masterId: number;
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats,
companionGenericId = 0,
level = 0,
masterId = 0
) {
super(contextualId, look, disposition, teamId, alive, stats);
this.companionGenericId = companionGenericId;
this.level = level;
this.masterId = masterId;
}
}
<file_sep>import GuildVersatileInformations from "@/protocol/network/types/GuildVersatileInformations";
export default class GuildInAllianceVersatileInformations extends GuildVersatileInformations {
public allianceId: number;
constructor(
guildId = 0,
leaderId = 0,
guildLevel = 0,
nbMembers = 0,
allianceId = 0
) {
super(guildId, leaderId, guildLevel, nbMembers);
this.allianceId = allianceId;
}
}
<file_sep>import IgnoredInformations from "@/protocol/network/types/IgnoredInformations";
export default class IgnoredOnlineInformations extends IgnoredInformations {
public playerId: number;
public playerName: string;
public breed: number;
public sex: boolean;
constructor(
accountId = 0,
accountName = "",
playerId = 0,
playerName = "",
breed = 0,
sex = false
) {
super(accountId, accountName);
this.playerId = playerId;
this.playerName = playerName;
this.breed = breed;
this.sex = sex;
}
}
<file_sep>import IdentificationMessage from "@/protocol/network/messages/IdentificationMessage";
export default class IdentificationAccountForceMessage extends IdentificationMessage {
public forcedAccountLogin: string;
constructor(
lang = "",
serverId = 0,
autoconnect = false,
useCertificate = false,
useLoginToken = false,
sessionOptionalSalt = 0,
forcedAccountLogin = "",
credentials: number[]
) {
super(
lang,
serverId,
autoconnect,
useCertificate,
useLoginToken,
sessionOptionalSalt,
credentials
);
this.forcedAccountLogin = forcedAccountLogin;
}
}
<file_sep>import ObjectEffect from "@/protocol/network/types/ObjectEffect";
import ObjectItemToSell from "@/protocol/network/types/ObjectItemToSell";
export default class ObjectItemToSellInBid extends ObjectItemToSell {
public unsoldDelay: number;
constructor(
objectgid = 0,
objectuid = 0,
quantity = 0,
objectprice = 0,
unsoldDelay = 0,
effects: ObjectEffect[]
) {
super(objectgid, objectuid, quantity, objectprice, effects);
this.unsoldDelay = unsoldDelay;
}
}
<file_sep>import Account from "@/account";
import BuyAction from "@/scripts/actions/npcs/BuyAction";
import NpcAction from "@/scripts/actions/npcs/NpcAction";
import NpcBankAction from "@/scripts/actions/npcs/NpcBankAction";
import ReplyAction from "@/scripts/actions/npcs/ReplyAction";
import SellAction from "@/scripts/actions/npcs/SellAction";
export default class NpcAPI {
private account: Account;
constructor(account: Account) {
this.account = account;
}
public async npcBank(npcId: number, replyId: number): Promise<boolean> {
if (npcId > 0 && !this.account.game.map.npcs.find(n => n.npcId === npcId)) {
return false;
}
await this.account.scripts.actionsManager.enqueueAction(
new NpcBankAction(npcId, replyId),
true
);
return true;
}
public async npc(npcId: number, actionIndex: number): Promise<boolean> {
if (npcId > 0 && !this.account.game.map.npcs.find(n => n.npcId === npcId)) {
return false;
}
await this.account.scripts.actionsManager.enqueueAction(
new NpcAction(npcId, actionIndex),
true
);
return true;
}
public async reply(replyId: number) {
await this.account.scripts.actionsManager.enqueueAction(
new ReplyAction(replyId),
true
);
}
public async sell(gid: number, quantity: number): Promise<boolean> {
await this.account.scripts.actionsManager.enqueueAction(
new SellAction(gid, quantity),
true
);
return true;
}
public async buy(gid: number, quantity: number): Promise<boolean> {
await this.account.scripts.actionsManager.enqueueAction(
new BuyAction(gid, quantity),
true
);
return true;
}
}
<file_sep>import AbstractGameActionFightTargetedAbilityMessage from "@/protocol/network/messages/AbstractGameActionFightTargetedAbilityMessage";
export default class GameActionFightSpellCastMessage extends AbstractGameActionFightTargetedAbilityMessage {
public spellId: number;
public spellLevel: number;
constructor(
actionId = 0,
sourceId = 0,
targetId = 0,
destinationCellId = 0,
critical = 1,
silentCast = false,
spellId = 0,
spellLevel = 0
) {
super(
actionId,
sourceId,
targetId,
destinationCellId,
critical,
silentCast
);
this.spellId = spellId;
this.spellLevel = spellLevel;
}
}
<file_sep>import AbstractFightTeamInformations from "@/protocol/network/types/AbstractFightTeamInformations";
export default class FightTeamLightInformations extends AbstractFightTeamInformations {
public teamMembersCount: number;
public meanLevel: number;
public hasFriend: boolean;
public hasGuildMember: boolean;
public hasAllianceMember: boolean;
public hasGroupMember: boolean;
public hasMytaxCollector: boolean;
constructor(
teamid = 2,
leaderid = 0,
teamside = 0,
teamtypeid = 0,
teamMembersCount = 0,
meanLevel = 0,
hasFriend = false,
hasGuildMember = false,
hasAllianceMember = false,
hasGroupMember = false,
hasMytaxCollector = false
) {
super(teamid, leaderid, teamside, teamtypeid);
this.teamMembersCount = teamMembersCount;
this.meanLevel = meanLevel;
this.hasFriend = hasFriend;
this.hasGuildMember = hasGuildMember;
this.hasAllianceMember = hasAllianceMember;
this.hasGroupMember = hasGroupMember;
this.hasMytaxCollector = hasMytaxCollector;
}
}
<file_sep>import LiteEvent, { ILiteEvent } from "@/utils/LiteEvent";
import firebase from "firebase/app";
import Account from ".";
interface IAccountStatsJSON {
connected: boolean;
}
export default class AccountStats implements IAccountStatsJSON {
public connected: boolean;
private authChangedUnsuscribe: firebase.Unsubscribe | undefined;
private stopDataSnapshot: (() => void) | undefined;
private globalDoc: firebase.firestore.DocumentReference | undefined;
private readonly onUpdated = new LiteEvent<void>();
public get Updated(): ILiteEvent<void> {
return this.onUpdated.expose();
}
private account: Account;
constructor(account: Account) {
this.account = account;
this.connected = false;
}
public removeListeners = () => {
if (this.authChangedUnsuscribe) {
this.authChangedUnsuscribe();
}
if (this.stopDataSnapshot) {
this.stopDataSnapshot();
}
};
public async load() {
this.authChangedUnsuscribe = firebase
.auth()
.onAuthStateChanged(async user => {
if (!user) {
return;
}
this.globalDoc = firebase
.firestore()
.doc(
`users/${user.uid}/config/accounts/${
this.account.accountConfig.username
}/characters/${this.account.game.character.name}/data`
);
this.stopDataSnapshot = this.globalDoc.onSnapshot(snapshot => {
this.updateFields(snapshot);
});
});
if (!this.globalDoc) {
return;
}
const data = await this.globalDoc.get();
this.updateFields(data);
}
public async save() {
if (!this.globalDoc) {
return;
}
const toSave: IAccountStatsJSON = {
connected: this.connected
};
await this.updateBotsConnected(this.connected);
await this.globalDoc.set(toSave);
console.log("has saved", toSave);
}
private updateFields(snapshot: firebase.firestore.DocumentSnapshot) {
if (!snapshot.exists) {
return;
}
const json = snapshot.data() as IAccountStatsJSON;
console.log("update", json);
this.connected = json.connected;
this.onUpdated.trigger();
}
private async updateBotsConnected(connected: boolean) {
const ref = firebase
.firestore()
.collection("stats")
.doc("users");
const newConnectedd = await firebase
.firestore()
.runTransaction(async transaction => {
const doc = await transaction.get(ref);
if (!doc.exists) {
transaction.set(ref, { connected: 0 });
return 0;
}
const newConnected =
Number(doc.data()!.connected) + (connected ? 1 : -1);
transaction.update(ref, {
connected: newConnected
});
return newConnected;
});
console.log(
`Transaction successfully committed and new connected bots is '${newConnectedd}'.`
);
}
}
<file_sep>import AbstractFightTeamInformations from "@/protocol/network/types/AbstractFightTeamInformations";
import FightTeamMemberInformations from "@/protocol/network/types/FightTeamMemberInformations";
export default class FightTeamInformations extends AbstractFightTeamInformations {
public teamMembers: FightTeamMemberInformations[];
constructor(
teamid = 2,
leaderid = 0,
teamside = 0,
teamtypeid = 0,
teamMembers: FightTeamMemberInformations[]
) {
super(teamid, leaderid, teamside, teamtypeid);
this.teamMembers = teamMembers;
}
}
<file_sep>import { library } from "@fortawesome/fontawesome-svg-core";
import { faKickstarterK } from "@fortawesome/free-brands-svg-icons";
import {
faBell,
faBolt,
faBriefcase,
faBullhorn,
faChartLine,
faChess,
faChessQueen,
faCoffee,
faCogs,
faGem,
faHeart,
faMap,
faMars,
faMoneyBillAlt,
faShoppingBag,
faStar,
faTrash,
faUser,
faVenus
} from "@fortawesome/free-solid-svg-icons";
library.add(
faChessQueen,
faTrash,
faBullhorn,
faChartLine,
faChess,
faCoffee,
faCogs,
faMap,
faMoneyBillAlt,
faShoppingBag,
faUser,
faVenus,
faMars,
faBolt,
faBriefcase,
faGem,
faHeart,
faStar,
faKickstarterK,
faBell
);
<file_sep>import Account from "@/account";
import { AccountStates } from "@/account/AccountStates";
import ExchangeLeaveMessage from "@/protocol/network/messages/ExchangeLeaveMessage";
import ExchangeStartOkMountMessage from "@/protocol/network/messages/ExchangeStartOkMountMessage";
// import MountClientData from "@/protocol/network/types/MountClientData";
import IClearable from "@/utils/IClearable";
import LiteEvent from "@/utils/LiteEvent";
export default class Breeding implements IClearable {
private account: Account;
// private paddockedMountsDescription: MountClientData[] = [];
// private stabledMountsDescription: MountClientData[] = [];
private readonly onPaddockOpened = new LiteEvent<void>();
private readonly onPaddockLeft = new LiteEvent<void>();
constructor(account: Account) {
this.account = account;
}
public get PaddockOpened() {
return this.onPaddockOpened.expose();
}
public get PaddockLeft() {
return this.onPaddockLeft.expose();
}
public clear() {
// this.paddockedMountsDescription = [];
// this.stabledMountsDescription = [];
}
public UpdateExchangeStartOkMountMessage(
message: ExchangeStartOkMountMessage
) {
this.account.state = AccountStates.PADDOCK;
// this.paddockedMountsDescription = message.paddockedMountsDescription;
// this.stabledMountsDescription = message.stabledMountsDescription;
this.onPaddockOpened.trigger();
}
public async UpdateExchangeLeaveMessage(message: ExchangeLeaveMessage) {
if (this.account.state !== AccountStates.PADDOCK) {
return;
}
this.account.state = AccountStates.NONE;
this.clear();
this.onPaddockLeft.trigger();
}
}
<file_sep>import GameFightFighterLightInformations from "@/protocol/network/types/GameFightFighterLightInformations";
export default class GameFightFighterTaxCollectorLightInformations extends GameFightFighterLightInformations {
public firstNameId: number;
public lastNameId: number;
constructor(
id = 0,
level = 0,
breed = 0,
sex = false,
alive = false,
firstNameId = 0,
lastNameId = 0
) {
super(id, level, breed, sex, alive);
this.firstNameId = firstNameId;
this.lastNameId = lastNameId;
}
}
<file_sep># CookieTouch API Documentation
[Summary](SUMMARY.md) | [Detailed summary](singlepage.md)
```ts
quests.isOngoing(questID: number): boolean
quests.isCompleted(questID: number): boolean
quests.currentStep(questID: number): Promise<number | null>
quests.objectivesNeeded(questID: number): Promise<number[] | null>
quests.needObjective(questID: number, objectiveID: number): Promise<boolean | null>
```
<hr>
## Summary
- [Quests](#quests)
- [isOngoing](#isongoing)
- [isCompleted](#iscompleted)
- [currentStep](#currentstep)
- [objectivesNeeded](#objectivesneeded)
- [needObjective](#needobjective)
# Quests
Quest-related functions
TODO
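As an illustration of how these functions might be combined in a script, here is a hedged sketch. The `quests` object below is a stand-in mock matching the signatures listed above; in a real script the bot provides the actual implementation:

```typescript
// Illustrative sketch only: this mock stands in for the real bot-provided
// quests API, using the signatures documented above.
const quests = {
  isOngoing: (questId: number): boolean => questId === 489,
  isCompleted: (questId: number): boolean => questId === 100,
  currentStep: async (questId: number): Promise<number | null> =>
    questId === 489 ? 2 : null,
  needObjective: async (
    questId: number,
    objectiveId: number
  ): Promise<boolean | null> => (questId === 489 ? objectiveId === 7 : null)
};

// Decide what a script should do with a given quest.
async function nextAction(questId: number): Promise<string> {
  if (quests.isCompleted(questId)) {
    return "already completed";
  }
  if (!quests.isOngoing(questId)) {
    return "start the quest";
  }
  const step = await quests.currentStep(questId);
  return `continue at step ${step}`;
}

nextAction(489).then(action => console.log(action)); // continue at step 2
```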
<file_sep>import GameActionFightLifePointsLostMessage from "@/protocol/network/messages/GameActionFightLifePointsLostMessage";
export default class GameActionFightLifeAndShieldPointsLostMessage extends GameActionFightLifePointsLostMessage {
public shieldLoss: number;
constructor(
actionId = 0,
sourceId = 0,
targetId = 0,
loss = 0,
permanentDamages = 0,
shieldLoss = 0
) {
super(actionId, sourceId, targetId, loss, permanentDamages);
this.shieldLoss = shieldLoss;
}
}
<file_sep>import PartyUpdateLightMessage from "@/protocol/network/messages/PartyUpdateLightMessage";
export default class PartyCompanionUpdateLightMessage extends PartyUpdateLightMessage {
public indexId: number;
constructor(
partyId = 0,
id = 0,
lifePoints = 0,
maxLifePoints = 0,
prospecting = 0,
regenRate = 0,
indexId = 0
) {
super(partyId, id, lifePoints, maxLifePoints, prospecting, regenRate);
this.indexId = indexId;
}
}
<file_sep>import Account from "@/account";
import ScriptAction, {
ScriptActionResults
} from "@/scripts/actions/ScriptAction";
export default class MerchantBuyAction extends ScriptAction {
public _name: string = "MerchantBuyAction";
public gid: number;
public quantity: number;
constructor(gid: number, quantity: number) {
super();
this.gid = gid;
this.quantity = quantity;
}
public async process(account: Account): Promise<ScriptActionResults> {
if (account.game.merchants.buy(this.gid, this.quantity)) {
return ScriptAction.processingResult();
}
return ScriptAction.doneResult();
}
}
<file_sep>import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
export default class GameFightMinimalStatsPreparation extends GameFightMinimalStats {
public initiative: number;
constructor(
lifePoints = 0,
maxLifePoints = 0,
baseMaxLifePoints = 0,
permanentDamagePercent = 0,
shieldPoints = 0,
actionPoints = 0,
maxActionPoints = 0,
movementPoints = 0,
maxMovementPoints = 0,
summoner = 0,
summoned = false,
neutralElementResistPercent = 0,
earthElementResistPercent = 0,
waterElementResistPercent = 0,
airElementResistPercent = 0,
fireElementResistPercent = 0,
neutralElementReduction = 0,
earthElementReduction = 0,
waterElementReduction = 0,
airElementReduction = 0,
fireElementReduction = 0,
criticalDamageFixedResist = 0,
pushDamageFixedResist = 0,
dodgePALostProbability = 0,
dodgePMLostProbability = 0,
tackleBlock = 0,
tackleEvade = 0,
invisibilityState = 0,
initiative = 0
) {
super(
lifePoints,
maxLifePoints,
baseMaxLifePoints,
permanentDamagePercent,
shieldPoints,
actionPoints,
maxActionPoints,
movementPoints,
maxMovementPoints,
summoner,
summoned,
neutralElementResistPercent,
earthElementResistPercent,
waterElementResistPercent,
airElementResistPercent,
fireElementResistPercent,
neutralElementReduction,
earthElementReduction,
waterElementReduction,
airElementReduction,
fireElementReduction,
criticalDamageFixedResist,
pushDamageFixedResist,
dodgePALostProbability,
dodgePMLostProbability,
tackleBlock,
tackleEvade,
invisibilityState
);
this.initiative = initiative;
}
}
<file_sep>import PartyInvitationDetailsMessage from "@/protocol/network/messages/PartyInvitationDetailsMessage";
import PartyGuestInformations from "@/protocol/network/types/PartyGuestInformations";
import PartyInvitationMemberInformations from "@/protocol/network/types/PartyInvitationMemberInformations";
export default class PartyInvitationDungeonDetailsMessage extends PartyInvitationDetailsMessage {
public playersDungeonReady: boolean[];
public dungeonId: number;
constructor(
partyId = 0,
partyType = 0,
fromId = 0,
fromName = "",
leaderId = 0,
dungeonId = 0,
members: PartyInvitationMemberInformations[],
guests: PartyGuestInformations[],
playersDungeonReady: boolean[]
) {
super(partyId, partyType, fromId, fromName, leaderId, members, guests);
this.playersDungeonReady = playersDungeonReady;
this.dungeonId = dungeonId;
}
}
<file_sep>import Account from "@/account";
import Frames, { IFrame } from "@/frames";
import ObjectAveragePricesMessage from "@/protocol/network/messages/ObjectAveragePricesMessage";
export default class AveragePricesFrame implements IFrame {
public register() {
Frames.dispatcher.register(
"ObjectAveragePricesMessage",
this.HandleObjectAveragePricesMessage,
this
);
}
private async HandleObjectAveragePricesMessage(
account: Account,
data: ObjectAveragePricesMessage
) {
//
}
}
<file_sep>import CharacterBaseCharacteristic from "@/protocol/network/types/CharacterBaseCharacteristic";
import Type from "@/protocol/network/types/Type";
export default class CharacterSpellModification extends Type {
public modificationType: number;
public spellId: number;
public value: CharacterBaseCharacteristic;
constructor(
modificationType = 0,
spellId = 0,
value: CharacterBaseCharacteristic
) {
super();
this.modificationType = modificationType;
this.spellId = spellId;
this.value = value;
}
}
<file_sep>import AbstractGameActionMessage from "@/protocol/network/messages/AbstractGameActionMessage";
import EntityLook from "@/protocol/network/types/EntityLook";
export default class GameActionFightChangeLookMessage extends AbstractGameActionMessage {
public targetId: number;
public entityLook: EntityLook;
constructor(
actionId = 0,
sourceId = 0,
targetId = 0,
entityLook: EntityLook
) {
super(actionId, sourceId);
this.targetId = targetId;
this.entityLook = entityLook;
}
}
<file_sep>export type Transfer = Array<ArrayBuffer | MessagePort | ImageBitmap>;
export interface ITypedWorker<In, Out> {
terminate: () => void;
onMessage: (output: Out) => void;
postMessage: (workerMessage: In, transfer?: Transfer) => void;
}
export function createWorker<In, Out>(
workerFunction: (
input: In,
cb: (_: Out, transfer?: Transfer) => void
) => void,
onMessage = (output: Out) => {
//
}
): ITypedWorker<In, Out> {
return new TypedWorker(workerFunction, onMessage);
}
class TypedWorker<In, Out> implements ITypedWorker<In, Out> {
private _nativeWorker: Worker;
constructor(
workerFunction: (
input: In,
cb: (_: Out, transfer?: Transfer) => void
) => void,
public onMessage = (output: Out) => {
//
}
) {
const postMessage = `(${workerFunction}).call(this, e.data, postMessage)`;
const workerFile = `self.onmessage=function(e){${postMessage}}`;
const blob = new Blob([workerFile], { type: "application/javascript" });
this._nativeWorker = new Worker(URL.createObjectURL(blob));
this._nativeWorker.onmessage = (messageEvent: MessageEvent) => {
this.onMessage(messageEvent.data);
};
}
/**
* Post message to worker for processing
* @param workerMessage message to send to worker
*/
public postMessage(workerMessage: In, transfer?: Transfer): void {
this._nativeWorker.postMessage(workerMessage, transfer);
}
public terminate(): void {
this._nativeWorker.terminate();
}
}
<file_sep>import GameFightResumeMessage from "@/protocol/network/messages/GameFightResumeMessage";
import FightDispellableEffectExtendedInformations from "@/protocol/network/types/FightDispellableEffectExtendedInformations";
import GameActionMark from "@/protocol/network/types/GameActionMark";
import GameFightResumeSlaveInfo from "@/protocol/network/types/GameFightResumeSlaveInfo";
import GameFightSpellCooldown from "@/protocol/network/types/GameFightSpellCooldown";
export default class GameFightResumeWithSlavesMessage extends GameFightResumeMessage {
public slavesInfo: GameFightResumeSlaveInfo[];
constructor(
gameTurn = 0,
summonCount = 0,
bombCount = 0,
effects: FightDispellableEffectExtendedInformations[],
marks: GameActionMark[],
spellCooldowns: GameFightSpellCooldown[],
slavesInfo: GameFightResumeSlaveInfo[]
) {
super(gameTurn, summonCount, bombCount, effects, marks, spellCooldowns);
this.slavesInfo = slavesInfo;
}
}
<file_sep>export function union(arr: any[]): any[] {
return [...new Set([].concat(...arr))];
}
export function notEmpty<TValue>(
value: TValue | null | undefined
): value is TValue {
return value !== null && value !== undefined;
}
export function shuffle<T>(array: T[]): T[] {
let counter = array.length;
// While there are elements in the array
while (counter > 0) {
// Pick a random index
const index = Math.floor(Math.random() * counter);
// Decrease counter by 1
counter--;
// And swap the last element with it
const temp = array[counter];
array[counter] = array[index];
array[index] = temp;
}
return array;
}
<file_sep>import AbstractFightDispellableEffect from "@/protocol/network/types/AbstractFightDispellableEffect";
export default class FightTemporarySpellImmunityEffect extends AbstractFightDispellableEffect {
public immuneSpellId: number;
constructor(
uid = 0,
targetId = 0,
turnDuration = 0,
dispelable = 1,
spellId = 0,
parentBoostUid = 0,
immuneSpellId = 0
) {
super(uid, targetId, turnDuration, dispelable, spellId, parentBoostUid);
this.immuneSpellId = immuneSpellId;
}
}
<file_sep># CookieTouch API Documentation
[Summary](SUMMARY.md) | [Detailed summary](singlepage.md)
<hr>
## Summary
- [Fight](#fight-somm)
- [canFight](#fight-can-fight)
- [fight](#fight-fight)
<h2 id ="fight-somm">Fight</h2>
All fight-related functions.
For both functions, all parameters are optional:
<table>
<thead>
<tr>
<th>Parameter</th>
<th>Type</th>
<th>Default value</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>forbiddenMonsters</td>
<td>number[]</td>
<td>null</td>
<td>List of forbidden monster IDs</td>
</tr>
<tr>
<td>mandatoryMonsters</td>
<td>number[]</td>
<td>null</td>
<td>List of mandatory monster IDs</td>
</tr>
<tr>
<td>minMonsters</td>
<td>number</td>
<td>1</td>
<td>Minimum number of monsters in the group</td>
</tr>
<tr>
<td>maxMonsters</td>
<td>number</td>
<td>8</td>
<td>Maximum number of monsters in the group</td>
</tr>
<tr>
<td>minMonstersLevel</td>
<td>number</td>
<td>1</td>
<td>Minimum level of the monster group</td>
</tr>
<tr>
<td>maxMonstersLevel</td>
<td>number</td>
<td>1000</td>
<td>Maximum level of the monster group</td>
</tr>
</tbody>
</table>
**Example:**
*In this example, we initialize everything related to fights in the config.*
```js
const config =
{
forbiddenMonsters: [150, 23],
mandatoryMonsters: [99],
minMonsters: 1,
maxMonsters: 8,
minMonstersLevel: 20,
maxMonstersLevel: 150
}
```
<hr>
<h2 id ="fight-can-fight">canFight(<code>forbiddenMonsters</code>, <code>mandatoryMonsters</code>, <code>minMonsters</code>, <code>maxMonsters</code>, <code>minLevel</code>, <code>maxLevel</code>)</h2>
- Return type: <a href="https://developer.mozilla.org/fr-Fr/docs/Web/JavaScript/Data_structures#Boolean_type">boolean</a>
Checks whether a monster group on the current map matches the parameters passed to the function.
Returns true if such a group exists, false otherwise.
```js
canFight([64], [68], 2, 6, 200, 600); // Checks whether, on this map, the bot can fight a group of 2 to 6 monsters containing at least one Wabbit and no Black Tiwabbit, with a total level between 200 and 600 inclusive.
```
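For illustration, the group filter these parameters describe can be sketched as a standalone predicate. The `MonsterGroup` shape and `groupMatches` name are hypothetical, not the bot's internal API, and empty arrays stand in for the `null` defaults:

```typescript
interface MonsterGroup {
  monsterIds: number[]; // monster GIDs in the group
  level: number; // total level of the group
}

// Mirrors the documented semantics of canFight's parameters.
function groupMatches(
  group: MonsterGroup,
  forbiddenMonsters: number[] = [],
  mandatoryMonsters: number[] = [],
  minMonsters = 1,
  maxMonsters = 8,
  minMonstersLevel = 1,
  maxMonstersLevel = 1000
): boolean {
  if (group.monsterIds.length < minMonsters) return false;
  if (group.monsterIds.length > maxMonsters) return false;
  if (group.level < minMonstersLevel || group.level > maxMonstersLevel) {
    return false;
  }
  // Reject the group if it contains any forbidden monster...
  if (group.monsterIds.some(id => forbiddenMonsters.includes(id))) {
    return false;
  }
  // ...and require every mandatory monster to be present.
  return mandatoryMonsters.every(id => group.monsterIds.includes(id));
}
```

With the example above, a group `[68, 70]` of level 300 matches, while any group containing GID 64 does not.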
<hr>
<h2 id ="fight-fight">fight(<code>forbiddenMonsters</code>, <code>mandatoryMonsters</code>, <code>minMonsters</code>, <code>maxMonsters</code>, <code>minLevel</code>, <code>maxLevel</code>)</h2>
- Return type: <a href="https://developer.mozilla.org/fr-Fr/docs/Web/JavaScript/Data_structures#Boolean_type">boolean</a>
Starts a fight against a group that matches the parameters passed to the function.
Returns true and attacks the group if a matching group exists, false if there is none.
```js
yield* await fight([], [], 2, 6, 200, 600); // Attacks a group on this map that matches the parameters: 2 to 6 monsters with a total level between 200 and 600 inclusive (no forbidden or mandatory monsters here).
```
<file_sep># CookieTouch API Documentation
[Summary](SUMMARY.md) | [Detailed summary](singlepage.md)
<hr>
## Contents
- [Craft](#craft)
- [craft](#craft)
# Craft
Functions related to crafting.
<h2 id="craft">craft(<code>gid</code>: <a href="https://developer.mozilla.org/fr-Fr/docs/Web/JavaScript/Data_structures#Number_type">number</a>, <code>qty</code>: <a href="https://developer.mozilla.org/fr-Fr/docs/Web/JavaScript/Data_structures#Number_type">number</a>)</h2>
Crafts a given quantity of an item.
**Example:**
```js
async function* craftitem() {
  yield* await use(400); // Uses the workshop with the action (bakes bread in this case (-1))
  yield* await craft.craft(468, 10); // Crafts 10 pain d'amakna (468 = pain d'amakna)
}
```
**All item GIDs can be found in [items.txt](https://github.com/yovanoc/cookietouch/blob/master/resources/identifiants/items.txt)**
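Before calling `craft.craft(gid, qty)`, a script may want to compute how many crafts the current ingredients allow. A minimal sketch, assuming hypothetical recipe and inventory maps (the `maxCrafts` helper is not part of the CookieTouch API):

```typescript
// Recipe maps ingredient GID -> quantity consumed per craft;
// inventory maps ingredient GID -> quantity currently held.
function maxCrafts(
  recipe: Map<number, number>,
  inventory: Map<number, number>
): number {
  let max = Infinity;
  for (const [gid, needed] of recipe) {
    const have = inventory.get(gid) || 0;
    // The scarcest ingredient bounds the number of crafts.
    max = Math.min(max, Math.floor(have / needed));
  }
  return max === Infinity ? 0 : max;
}
```

For example, a recipe needing 2 of one ingredient and 1 of another, with 10 and 3 of them in the inventory, allows 3 crafts.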
<file_sep>import GameFightFighterLightInformations from "@/protocol/network/types/GameFightFighterLightInformations";
export default class GameFightFighterCompanionLightInformations extends GameFightFighterLightInformations {
public companionId: number;
public masterId: number;
constructor(
id = 0,
level = 0,
breed = 0,
sex = false,
alive = false,
companionId = 0,
masterId = 0
) {
super(id, level, breed, sex, alive);
this.companionId = companionId;
this.masterId = masterId;
}
}
<file_sep>import AbstractPartyEventMessage from "@/protocol/network/messages/AbstractPartyEventMessage";
export default class PartyUpdateLightMessage extends AbstractPartyEventMessage {
public id: number;
public lifePoints: number;
public maxLifePoints: number;
public prospecting: number;
public regenRate: number;
constructor(
partyId = 0,
id = 0,
lifePoints = 0,
maxLifePoints = 0,
prospecting = 0,
regenRate = 0
) {
super(partyId);
this.id = id;
this.lifePoints = lifePoints;
this.maxLifePoints = maxLifePoints;
this.prospecting = prospecting;
this.regenRate = regenRate;
}
}
<file_sep>import AbstractGameActionFightTargetedAbilityMessage from "@/protocol/network/messages/AbstractGameActionFightTargetedAbilityMessage";
export default class GameActionFightCloseCombatMessage extends AbstractGameActionFightTargetedAbilityMessage {
public weaponGenericId: number;
constructor(
actionId = 0,
sourceId = 0,
targetId = 0,
destinationCellId = 0,
critical = 1,
silentCast = false,
weaponGenericId = 0
) {
super(
actionId,
sourceId,
targetId,
destinationCellId,
critical,
silentCast
);
this.weaponGenericId = weaponGenericId;
}
}
<file_sep>import { isDevelopment } from "common/env";
declare var __static: string;
export const staticPath = isDevelopment
? __static
: __dirname.replace(/app\.asar$/, "static");
<file_sep>import Account from "@/account";
import Frames, { IFrame } from "@/frames";
import ExchangeBuyOkMessage from "@/protocol/network/messages/ExchangeBuyOkMessage";
import ExchangeLeaveMessage from "@/protocol/network/messages/ExchangeLeaveMessage";
import ExchangeSellOkMessage from "@/protocol/network/messages/ExchangeSellOkMessage";
import ExchangeStartOkNpcShopMessage from "@/protocol/network/messages/ExchangeStartOkNpcShopMessage";
import LeaveDialogMessage from "@/protocol/network/messages/LeaveDialogMessage";
import NpcDialogCreationMessage from "@/protocol/network/messages/NpcDialogCreationMessage";
import NpcDialogQuestionMessage from "@/protocol/network/messages/NpcDialogQuestionMessage";
export default class NpcsFrame implements IFrame {
public register() {
Frames.dispatcher.register(
"NpcDialogCreationMessage",
this.HandleNpcDialogCreationMessage,
this
);
Frames.dispatcher.register(
"NpcDialogQuestionMessage",
this.HandleNpcDialogQuestionMessage,
this
);
Frames.dispatcher.register(
"ExchangeLeaveMessage",
this.HandleExchangeLeaveMessage,
this
);
Frames.dispatcher.register(
"LeaveDialogMessage",
this.HandleLeaveDialogMessage,
this
);
Frames.dispatcher.register(
"ExchangeStartOkNpcShopMessage",
this.HandleExchangeStartOkNpcShopMessage,
this
);
Frames.dispatcher.register(
"ExchangeSellOkMessage",
this.HandleExchangeSellOkMessage,
this
);
Frames.dispatcher.register(
"ExchangeBuyOkMessage",
this.HandleExchangeBuyOkMessage,
this
);
}
private async HandleNpcDialogCreationMessage(
account: Account,
message: NpcDialogCreationMessage
) {
account.game.npcs.UpdateNpcDialogCreationMessage(message);
}
private async HandleNpcDialogQuestionMessage(
account: Account,
message: NpcDialogQuestionMessage
) {
account.game.npcs.UpdateNpcDialogQuestionMessage(message);
}
private async HandleExchangeLeaveMessage(
account: Account,
message: ExchangeLeaveMessage
) {
account.game.npcs.UpdateExchangeLeaveMessage(message);
}
private async HandleLeaveDialogMessage(
account: Account,
message: LeaveDialogMessage
) {
account.game.npcs.UpdateLeaveDialogMessage(message);
}
private async HandleExchangeStartOkNpcShopMessage(
account: Account,
message: ExchangeStartOkNpcShopMessage
) {
account.game.npcs.UpdateNpcShopMessage(message);
}
private async HandleExchangeSellOkMessage(
account: Account,
message: ExchangeSellOkMessage
) {
account.game.npcs.UpdateExchangeSellOkMessage(message);
}
private async HandleExchangeBuyOkMessage(
account: Account,
message: ExchangeBuyOkMessage
) {
account.game.npcs.UpdateExchangeBuyOkMessage(message);
}
}
<file_sep>import Account from "@/account";
import LanguageManager from "@/configurations/language/LanguageManager";
import ScriptAction, {
ScriptActionResults
} from "@/scripts/actions/ScriptAction";
export default class UsePaddockAction extends ScriptAction {
public _name: string = "UsePaddockAction";
public async process(account: Account): Promise<ScriptActionResults> {
if (!account.game.map.paddock) {
return ScriptAction.failedResult();
}
if (
account.game.managers.interactives.useInteractiveByCellId(
account.game.map.paddock.cellId
)
) {
return ScriptAction.processingResult();
}
account.scripts.stopScript(
LanguageManager.trans(
"errorInteractiveCell",
account.game.map.paddock.cellId
)
);
return ScriptAction.failedResult();
}
}
<file_sep>import ExchangeBidPriceMessage from "@/protocol/network/messages/ExchangeBidPriceMessage";
export default class ExchangeBidPriceForSellerMessage extends ExchangeBidPriceMessage {
public minimalPrices: number[];
public allIdentical: boolean;
constructor(
genericId = 0,
averagePrice = 0,
allIdentical = false,
minimalPrices: number[]
) {
super(genericId, averagePrice);
this.minimalPrices = minimalPrices;
this.allIdentical = allIdentical;
}
}
<file_sep>import MapComplementaryInformationsDataMessage from "@/protocol/network/messages/MapComplementaryInformationsDataMessage";
import FightCommonInformations from "@/protocol/network/types/FightCommonInformations";
import GameRolePlayActorInformations from "@/protocol/network/types/GameRolePlayActorInformations";
import HouseInformations from "@/protocol/network/types/HouseInformations";
import HouseInformationsInside from "@/protocol/network/types/HouseInformationsInside";
import InteractiveElement from "@/protocol/network/types/InteractiveElement";
import MapObstacle from "@/protocol/network/types/MapObstacle";
import StatedElement from "@/protocol/network/types/StatedElement";
export default class MapComplementaryInformationsDataInHouseMessage extends MapComplementaryInformationsDataMessage {
public currentHouse: HouseInformationsInside;
constructor(
subAreaId = 0,
mapId = 0,
currentHouse: HouseInformationsInside,
houses: HouseInformations[],
actors: GameRolePlayActorInformations[],
interactiveElements: InteractiveElement[],
statedElements: StatedElement[],
obstacles: MapObstacle[],
fights: FightCommonInformations[]
) {
super(
subAreaId,
mapId,
houses,
actors,
interactiveElements,
statedElements,
obstacles,
fights
);
this.currentHouse = currentHouse;
}
}
<file_sep>import FightTemporaryBoostEffect from "@/protocol/network/types/FightTemporaryBoostEffect";
export default class FightTemporaryBoostWeaponDamagesEffect extends FightTemporaryBoostEffect {
public weaponTypeId: number;
constructor(
uid = 0,
targetId = 0,
turnDuration = 0,
dispelable = 1,
spellId = 0,
parentBoostUid = 0,
delta = 0,
weaponTypeId = 0
) {
super(
uid,
targetId,
turnDuration,
dispelable,
spellId,
parentBoostUid,
delta
);
this.weaponTypeId = weaponTypeId;
}
}
<file_sep>import Type from "@/protocol/network/types/Type";
export default class GuildEmblem extends Type {
public symbolShape: number;
public symbolColor: number;
public backgroundShape: number;
public backgroundColor: number;
constructor(
symbolShape = 0,
symbolColor = 0,
backgroundShape = 0,
backgroundColor = 0
) {
super();
this.symbolShape = symbolShape;
this.symbolColor = symbolColor;
this.backgroundShape = backgroundShape;
this.backgroundColor = backgroundColor;
}
}
<file_sep>import GameFightFighterLightInformations from "@/protocol/network/types/GameFightFighterLightInformations";
export default class GameFightFighterNamedLightInformations extends GameFightFighterLightInformations {
public name: string;
constructor(
id = 0,
level = 0,
breed = 0,
sex = false,
alive = false,
name = ""
) {
super(id, level, breed, sex, alive);
this.name = name;
}
}
<file_sep>import Account from "@/account";
import Frames, { IFrame } from "@/frames";
import ExchangeCraftResultMessage from "@/protocol/network/messages/ExchangeCraftResultMessage";
import ExchangeCraftResultWithObjectDescMessage from "@/protocol/network/messages/ExchangeCraftResultWithObjectDescMessage";
import ExchangeItemAutoCraftRemainingMessage from "@/protocol/network/messages/ExchangeItemAutoCraftRemainingMessage";
import ExchangeItemAutoCraftStopedMessage from "@/protocol/network/messages/ExchangeItemAutoCraftStopedMessage";
import ExchangeObjectAddedMessage from "@/protocol/network/messages/ExchangeObjectAddedMessage";
import ExchangeReplayCountModifiedMessage from "@/protocol/network/messages/ExchangeReplayCountModifiedMessage";
import ExchangeStartOkCraftMessage from "@/protocol/network/messages/ExchangeStartOkCraftMessage";
import ExchangeStartOkCraftWithInformationMessage from "@/protocol/network/messages/ExchangeStartOkCraftWithInformationMessage";
export default class CraftFrame implements IFrame {
public register() {
Frames.dispatcher.register(
"ExchangeObjectAddedMessage",
this.HandleExchangeObjectAddedMessage,
this
);
Frames.dispatcher.register(
"ExchangeStartOkCraftWithInformationMessage",
this.HandleExchangeStartOkCraftWithInformationMessage,
this
);
Frames.dispatcher.register(
"ExchangeReplayCountModifiedMessage",
this.HandleExchangeReplayCountModifiedMessage,
this
);
Frames.dispatcher.register(
"ExchangeCraftResultMessage",
this.HandleExchangeCraftResultMessage,
this
);
Frames.dispatcher.register(
"ExchangeCraftResultWithObjectDescMessage",
this.HandleExchangeCraftResultWithObjectDescMessage,
this
);
Frames.dispatcher.register(
"ExchangeStartOkCraftMessage",
this.HandleExchangeStartOkCraftMessage,
this
);
Frames.dispatcher.register(
"ExchangeItemAutoCraftRemainingMessage",
this.HandleExchangeItemAutoCraftRemainingMessage,
this
);
Frames.dispatcher.register(
"ExchangeItemAutoCraftStopedMessage",
this.HandleExchangeItemAutoCraftStopedMessage,
this
);
}
private async HandleExchangeStartOkCraftWithInformationMessage(
account: Account,
message: ExchangeStartOkCraftWithInformationMessage
) {
account.game.craft.UpdateExchangeStartOkCraftWithInformationMessage(
message
);
}
private async HandleExchangeReplayCountModifiedMessage(
account: Account,
message: ExchangeReplayCountModifiedMessage
) {
account.game.craft.UpdateExchangeReplayCountModifiedMessage(message);
}
private async HandleExchangeObjectAddedMessage(
account: Account,
message: ExchangeObjectAddedMessage
) {
account.game.craft.UpdateExchangeObjectAddedMessage(message);
}
private async HandleExchangeCraftResultMessage(
account: Account,
message: ExchangeCraftResultMessage
) {
account.game.craft.UpdateExchangeCraftResultMessage(message);
}
private async HandleExchangeCraftResultWithObjectDescMessage(
account: Account,
message: ExchangeCraftResultWithObjectDescMessage
) {
account.game.craft.UpdateExchangeCraftResultWithObjectDescMessage(message);
}
private async HandleExchangeStartOkCraftMessage(
account: Account,
message: ExchangeStartOkCraftMessage
) {
account.game.craft.UpdateExchangeStartOkCraftMessage(message);
}
private async HandleExchangeItemAutoCraftRemainingMessage(
account: Account,
message: ExchangeItemAutoCraftRemainingMessage
) {
account.game.craft.UpdateExchangeItemAutoCraftRemainingMessage(message);
}
private async HandleExchangeItemAutoCraftStopedMessage(
account: Account,
message: ExchangeItemAutoCraftStopedMessage
) {
account.game.craft.UpdateExchangeItemAutoCraftStopedMessage(message);
}
}
<file_sep>import { createWorker, ITypedWorker } from "@/utils/TypedWorker";
console.log("Go Workers!");
// Let us first define the `input` type for our Worker.
// In our simple example we are going to create a 'IValues' interface.
interface IValues {
x: number;
y: number;
}
// Now we need to define the function that the worker will perform when it
// receives a message of type `IValues`.
// We would expect it to look something like this
function workFn(input: IValues): number {
return input.x + input.y;
}
// However, there are many use cases where we would like to return multiple
// results from a single input, or to return results asynchronously.
// We have to modify our `workFn`.
// We will, basically, replace `return` with a function call.
// function workFn(input: IValues, callback: (_: number) => void): void {
// callback(input.x + input.y);
// }
// Now that our worker has calculated the sum of x and y, we should do
// something with it. Let's create a function that just logs the result.
// We could also call this function the `outputFn` or `handleResultFn`
function logFn(result: number) {
console.log(`We received this response from the worker: ${result}`);
}
// Let's put this all together and create our TypedWebWorker.
const typedWorker: ITypedWorker<IValues, number> = createWorker(workFn, logFn);
// That's it! The Worker is now ready to process messages of type 'IValues'
// and log the results to the console.
typedWorker.postMessage({ x: 5, y: 5 }); // logs: "10"
typedWorker.postMessage({ x: 0, y: 0 }); // logs: "0"
typedWorker.terminate();
// You will get a compilation error if you try sending something other than a
// `IValues` object.
// typedWorker.postMessage({ x: 5, y: 5, z: 5 }); // this line will not compile
// typedWorker.postMessage("{ x: 5, y: 5}"); // this line will not compile
<file_sep># CookieTouch
- Fixed the empty inventory bug
- Added the Documentation and Identifiants menus
- Added the `inventory.equipedItems()` & `inventory.isEquiped(GID)` methods
- Added a `data.item(GID)` method
- Added `printBullet("message")` to send a notification from scripts
<file_sep>import Account from ".";
export enum BonusPackMoney {
KAMAS = "KMS",
GOULTINES = "GOU"
}
export enum BonusPackDuration {
WEEK = 7,
MONTH = 30
}
export function buyBonusPack(
acc: Account,
money = BonusPackMoney.KAMAS,
duration = BonusPackDuration.WEEK
) {
const handleSetShopDetailsSuccess = (account: Account, message: any) => {
account.network.send("shopOpenRequest");
account.network.unregisterMessage(setShopDetailsSuccessHandlerID);
};
const setShopDetailsSuccessHandlerID = acc.network.registerMessage(
"setShopDetailsSuccess",
handleSetShopDetailsSuccess
);
const handleShopOpenSuccess = (account: Account, message: any) => {
account.network.send("shopOpenCategoryRequest", {
categoryId: 557,
page: 1,
size: 3
});
    account.network.unregisterMessage(shopOpenSuccessHandlerID);
  };
  const shopOpenSuccessHandlerID = acc.network.registerMessage(
    "shopOpenSuccess",
    handleShopOpenSuccess
  );
const handleShopOpenCategorySuccess = (account: Account, message: any) => {
if (!account.data.bakHardToSoftCurrentRate) {
return;
}
const choice = message.articles.find((a: any) =>
(a.name as string).includes(duration.toString())
);
account.network.send("shopBuyRequest", {
amountHard: choice.price,
amountSoft: Math.round(
choice.price * account.data.bakHardToSoftCurrentRate
),
currency: money,
isMysteryBox: false,
purchase: [
{
id: choice.id,
quantity: 1
}
]
});
account.network.unregisterMessage(shopOpenCategorySuccessHandlerID);
};
const shopOpenCategorySuccessHandlerID = acc.network.registerMessage(
"shopOpenCategorySuccess",
handleShopOpenCategorySuccess
);
const handleShopBuySuccess = (account: Account, message: any) => {
account.network.unregisterMessage(shopBuySuccessHandlerID);
};
const shopBuySuccessHandlerID = acc.network.registerMessage(
"shopBuySuccess",
handleShopBuySuccess
);
acc.network.send("setShopDetailsRequest", {});
acc.network.send("restoreMysteryBox");
}
<file_sep>import ExchangeStartOkMountWithOutPaddockMessage from "@/protocol/network/messages/ExchangeStartOkMountWithOutPaddockMessage";
import MountClientData from "@/protocol/network/types/MountClientData";
export default class ExchangeStartOkMountMessage extends ExchangeStartOkMountWithOutPaddockMessage {
public paddockedMountsDescription: MountClientData[];
constructor(
stabledMountsDescription: MountClientData[],
paddockedMountsDescription: MountClientData[]
) {
super(stabledMountsDescription);
this.paddockedMountsDescription = paddockedMountsDescription;
}
}
<file_sep>import FightTemporaryBoostEffect from "@/protocol/network/types/FightTemporaryBoostEffect";
export default class FightTemporaryBoostStateEffect extends FightTemporaryBoostEffect {
public stateId: number;
constructor(
uid = 0,
targetId = 0,
turnDuration = 0,
dispelable = 1,
spellId = 0,
parentBoostUid = 0,
delta = 0,
stateId = 0
) {
super(
uid,
targetId,
turnDuration,
dispelable,
spellId,
parentBoostUid,
delta
);
this.stateId = stateId;
}
}
<file_sep>import ObjectEffect from "@/protocol/network/types/ObjectEffect";
export default class ObjectEffectDate extends ObjectEffect {
public year: number;
public month: number;
public day: number;
public hour: number;
public minute: number;
constructor(
actionid = 0,
year = 0,
month = 0,
day = 0,
hour = 0,
minute = 0
) {
super(actionid);
this.year = year;
this.month = month;
this.day = day;
this.hour = hour;
this.minute = minute;
}
}
<file_sep>import GameFightFighterLightInformations from "@/protocol/network/types/GameFightFighterLightInformations";
export default class GameFightFighterMonsterLightInformations extends GameFightFighterLightInformations {
public creatureGenericId: number;
constructor(
id = 0,
level = 0,
breed = 0,
sex = false,
alive = false,
creatureGenericId = 0
) {
super(id, level, breed, sex, alive);
this.creatureGenericId = creatureGenericId;
}
}
<file_sep>import BasicGuildInformations from "@/protocol/network/types/BasicGuildInformations";
import FriendInformations from "@/protocol/network/types/FriendInformations";
import PlayerStatus from "@/protocol/network/types/PlayerStatus";
export default class FriendOnlineInformations extends FriendInformations {
public playerId: number;
public playerName: string;
public level: number;
public alignmentSide: number;
public breed: number;
public sex: boolean;
public guildInfo: BasicGuildInformations;
public moodSmileyId: number;
public status: PlayerStatus;
constructor(
accountId = 0,
accountName = "",
playerState = 99,
lastConnection = 0,
achievementPoints = 0,
playerId = 0,
playerName = "",
level = 0,
alignmentSide = 0,
breed = 0,
sex = false,
guildInfo: BasicGuildInformations,
moodSmileyId = 0,
status: PlayerStatus
) {
super(
accountId,
accountName,
playerState,
lastConnection,
achievementPoints
);
this.playerId = playerId;
this.playerName = playerName;
this.level = level;
this.alignmentSide = alignmentSide;
this.breed = breed;
this.sex = sex;
this.guildInfo = guildInfo;
this.moodSmileyId = moodSmileyId;
this.status = status;
}
}
<file_sep>import ChatServerMessage from "@/protocol/network/messages/ChatServerMessage";
export default class ChatAdminServerMessage extends ChatServerMessage {
constructor(
channel = 0,
content = "",
timestamp = 0,
fingerprint = "",
senderId = 0,
senderName = "",
senderAccountId = 0
) {
super(
channel,
content,
timestamp,
fingerprint,
senderId,
senderName,
senderAccountId
);
}
}
<file_sep>import HttpsProxyAgent from "@/utils/proxy/https";
import { request, RequestOptions } from "https";
export default class Spark {
private static agent: HttpsProxyAgent;
public static setAgent(proxy: string) {
this.agent = new HttpsProxyAgent(proxy);
}
public static async get<T>(url: string) {
return this.request<T>(url, "GET");
}
public static async post<T>(url: string) {
return this.request<T>(url, "POST");
}
  private static async request<T>(url: string, method: string): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      const opts: RequestOptions = {
        agent: this.agent as any,
        host: url,
        method,
        path: "/",
        port: 443, // default HTTPS port; the https module cannot speak TLS on 80
        timeout: 10000
      };
      const post_req = request(opts, res => {
        res.setEncoding("utf8");
        let data = "";
        res.on("data", chunk => {
          data += chunk;
        });
        res.on("end", () => {
          try {
            resolve(JSON.parse(data));
          } catch (err) {
            reject(err);
          }
        });
      });
      post_req.on("error", reject);
      post_req.end();
    });
  }
}
<file_sep><a href="#"><img src="https://image.ibb.co/j8rmqJ/icon.png" align="left" height="80" width="80"></a>
<center>
<h1>
CookieTouch
<a href="https://travis-ci.org/cookie-project/cookietouch">
<img src="https://travis-ci.org/cookie-project/cookietouch.svg?branch=master" alt="Travis">
</a>
<a href="https://github.com/prettier/prettier">
<img src="https://img.shields.io/badge/code_style-prettier-ff69b4.svg?style=flat-square" alt="Prettier">
</a>
<a href="https://github.styleci.io/repos/101233419">
<img src="https://github.styleci.io/repos/101233419/shield?branch=master" alt="StyleCI">
</a>
</h1>
</center>
You need **Yarn** in order to use this bot properly.
[Yarn Install](https://yarnpkg.com/lang/en/docs/install/)
# Installation
```bash
git clone https://github.com/yovanoc/cookietouch/
cd cookietouch && yarn
```
## Documentation
The documentation is available at https://docs.cookietouch.com/
To update the documentation, simply edit the /docs/ folder.
## Social Links
- [Discord](https://discord.gg/R7HsTnD)
<file_sep>import ActorAlignmentInformations from "@/protocol/network/types/ActorAlignmentInformations";
import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameFightFighterNamedInformations from "@/protocol/network/types/GameFightFighterNamedInformations";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
import PlayerStatus from "@/protocol/network/types/PlayerStatus";
export default class GameFightCharacterInformations extends GameFightFighterNamedInformations {
public level: number;
public alignmentInfos: ActorAlignmentInformations;
public breed: number;
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats,
name = "",
status: PlayerStatus,
level = 0,
alignmentInfos: ActorAlignmentInformations,
breed = 0
) {
super(contextualId, look, disposition, teamId, alive, stats, name, status);
this.level = level;
this.alignmentInfos = alignmentInfos;
this.breed = breed;
}
}
<file_sep>import LiteEvent, { ILiteEvent } from "@/utils/LiteEvent";
import firebase from "firebase/app";
interface IDataConfigurationJSON {
aggressiveMonsters: number[];
}
export default class DataConfiguration {
public static aggressiveMonsters: number[] = [];
private static globalDoc: firebase.firestore.DocumentReference | undefined;
private static authChangedUnsuscribe: firebase.Unsubscribe | undefined;
private static stopDataSnapshot: (() => void) | undefined;
private static onUpdated = new LiteEvent<void>();
public static get Updated(): ILiteEvent<void> {
return this.onUpdated.expose();
}
public static async load() {
this.globalDoc = firebase.firestore().doc(`/config/data`);
this.stopDataSnapshot = this.globalDoc.onSnapshot(snapshot => {
this.updateFields(snapshot);
});
if (!this.globalDoc) {
return;
}
const data = await this.globalDoc.get();
this.updateFields(data);
}
public static removeListeners = () => {
if (DataConfiguration.authChangedUnsuscribe) {
DataConfiguration.authChangedUnsuscribe();
}
if (DataConfiguration.stopDataSnapshot) {
DataConfiguration.stopDataSnapshot();
}
};
public static async save() {
if (!this.globalDoc) {
return;
}
const toSave: IDataConfigurationJSON = {
aggressiveMonsters: this.aggressiveMonsters
};
await this.globalDoc.set(toSave);
}
private static updateFields(snapshot: firebase.firestore.DocumentSnapshot) {
if (!snapshot.exists) {
return;
}
const json = snapshot.data() as IDataConfigurationJSON;
this.aggressiveMonsters = json.aggressiveMonsters;
this.onUpdated.trigger();
}
}
<file_sep>import GuildFactsMessage from "@/protocol/network/messages/GuildFactsMessage";
import BasicNamedAllianceInformations from "@/protocol/network/types/BasicNamedAllianceInformations";
import CharacterMinimalInformations from "@/protocol/network/types/CharacterMinimalInformations";
import GuildFactSheetInformations from "@/protocol/network/types/GuildFactSheetInformations";
export default class GuildInAllianceFactsMessage extends GuildFactsMessage {
public allianceInfos: BasicNamedAllianceInformations;
constructor(
infos: GuildFactSheetInformations,
creationDate = 0,
nbTaxCollectors = 0,
enabled = false,
allianceInfos: BasicNamedAllianceInformations,
members: CharacterMinimalInformations[]
) {
super(infos, creationDate, nbTaxCollectors, enabled, members);
this.allianceInfos = allianceInfos;
}
}
<file_sep>import Account from "@/account";
import ScriptAction, {
ScriptActionResults
} from "@/scripts/actions/ScriptAction";
export default class SellAction extends ScriptAction {
public _name: string = "SellAction";
public objectToSellId: number;
public quantity: number;
constructor(objectToSellId: number, quantity: number) {
super();
this.objectToSellId = objectToSellId;
this.quantity = quantity;
}
public async process(account: Account): Promise<ScriptActionResults> {
if (account.game.npcs.sellItem(this.objectToSellId, this.quantity)) {
return ScriptAction.processingResult();
}
return ScriptAction.doneResult();
}
}
<file_sep>import SelectedServerDataMessage from "@/protocol/network/messages/SelectedServerDataMessage";
export default class SelectedServerDataExtendedMessage extends SelectedServerDataMessage {
public serverIds: number[];
constructor(
serverId = 0,
address = "",
port = 0,
canCreateNewCharacter = false,
ticket = "",
serverIds: number[]
) {
super(serverId, address, port, canCreateNewCharacter, ticket);
this.serverIds = serverIds;
}
}
<file_sep>import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameFightFighterNamedInformations from "@/protocol/network/types/GameFightFighterNamedInformations";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
import PlayerStatus from "@/protocol/network/types/PlayerStatus";
export default class GameFightMutantInformations extends GameFightFighterNamedInformations {
public powerLevel: number;
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats,
name = "",
status: PlayerStatus,
powerLevel = 0
) {
super(contextualId, look, disposition, teamId, alive, stats, name, status);
this.powerLevel = powerLevel;
}
}
<file_sep>import Account from "@/account";
import { AccountStates } from "@/account/AccountStates";
import ObjectEntry from "@/game/character/inventory/ObjectEntry";
import ExchangeCraftResultMessage from "@/protocol/network/messages/ExchangeCraftResultMessage";
import ExchangeCraftResultWithObjectDescMessage from "@/protocol/network/messages/ExchangeCraftResultWithObjectDescMessage";
import ExchangeItemAutoCraftRemainingMessage from "@/protocol/network/messages/ExchangeItemAutoCraftRemainingMessage";
import ExchangeItemAutoCraftStopedMessage from "@/protocol/network/messages/ExchangeItemAutoCraftStopedMessage";
import ExchangeObjectAddedMessage from "@/protocol/network/messages/ExchangeObjectAddedMessage";
import ExchangeReplayCountModifiedMessage from "@/protocol/network/messages/ExchangeReplayCountModifiedMessage";
import ExchangeStartOkCraftMessage from "@/protocol/network/messages/ExchangeStartOkCraftMessage";
import ExchangeStartOkCraftWithInformationMessage from "@/protocol/network/messages/ExchangeStartOkCraftWithInformationMessage";
import ObjectItemToSell from "@/protocol/network/types/ObjectItemToSell";
import LiteEvent from "@/utils/LiteEvent";
export default class Craft {
public remoteObjects: ObjectEntry[];
public objects: ObjectEntry[];
public objectsInfos: ObjectItemToSell[];
public remoteCurrentWeight: number = 0;
public currentWeight: number = 0;
public nbCase: number = 0;
public skillid: number = 0;
public count: number = 0;
private account: Account;
private readonly onCraftStarted = new LiteEvent<void>();
private readonly onCraftLeft = new LiteEvent<void>();
private readonly onCraftQuantityChanged = new LiteEvent<void>();
private readonly onCraftUpdated = new LiteEvent<void>();
constructor(account: Account) {
this.account = account;
this.objectsInfos = [];
this.remoteObjects = [];
this.objects = [];
}
public get CraftStarted() {
return this.onCraftStarted.expose();
}
public get CraftLeft() {
return this.onCraftLeft.expose();
}
  public get CraftQuantityChanged() {
    return this.onCraftQuantityChanged.expose();
  }
  public get CraftUpdated() {
    return this.onCraftUpdated.expose();
  }
public setRecipe(gid: number): boolean {
this.account.network.sendMessageFree("ExchangeSetCraftRecipeMessage", {
objectGID: gid
});
return true;
}
public setQuantity(qty: number): boolean {
this.account.network.sendMessageFree("ExchangeReplayMessage", {
count: qty
});
return true;
}
public ready(): boolean {
this.account.network.sendMessageFree("ExchangeReadyMessage", {
ready: true,
step: 2
});
return true;
}
public async UpdateExchangeObjectAddedMessage(
message: ExchangeObjectAddedMessage
) {
const newObj = await ObjectEntry.setup(message.object);
if (message.remote) {
this.remoteObjects.push(newObj);
this.remoteCurrentWeight += newObj.realWeight * newObj.quantity;
} else {
this.objects.push(newObj);
this.currentWeight += newObj.realWeight * newObj.quantity;
}
this.onCraftStarted.trigger();
}
public async UpdateExchangeReplayCountModifiedMessage(
message: ExchangeReplayCountModifiedMessage
) {
this.onCraftQuantityChanged.trigger();
}
public async UpdateExchangeStartOkCraftWithInformationMessage(
message: ExchangeStartOkCraftWithInformationMessage
) {
this.account.state = AccountStates.CRAFTING;
this.nbCase = message.nbCase;
this.skillid = message.skillId;
this.onCraftStarted.trigger();
}
public async UpdateExchangeCraftResultMessage(
message: ExchangeCraftResultMessage
) {
this.onCraftUpdated.trigger();
}
public async UpdateExchangeCraftResultWithObjectDescMessage(
message: ExchangeCraftResultWithObjectDescMessage
) {
this.onCraftUpdated.trigger();
}
public async UpdateExchangeStartOkCraftMessage(
message: ExchangeStartOkCraftMessage
) {
this.account.state = AccountStates.CRAFTING;
this.onCraftStarted.trigger();
}
public async UpdateExchangeItemAutoCraftRemainingMessage(
message: ExchangeItemAutoCraftRemainingMessage
) {
this.count = message.count;
}
public async UpdateExchangeItemAutoCraftStopedMessage(
message: ExchangeItemAutoCraftStopedMessage
) {
this.account.state = AccountStates.NONE;
this.onCraftLeft.trigger();
}
}
<file_sep>import AbstractGameActionMessage from "@/protocol/network/messages/AbstractGameActionMessage";
export default class GameActionFightTriggerGlyphTrapMessage extends AbstractGameActionMessage {
public markId: number;
public triggeringCharacterId: number;
public triggeredSpellId: number;
constructor(
actionId = 0,
sourceId = 0,
markId = 0,
triggeringCharacterId = 0,
triggeredSpellId = 0
) {
super(actionId, sourceId);
this.markId = markId;
this.triggeringCharacterId = triggeringCharacterId;
this.triggeredSpellId = triggeredSpellId;
}
}
<file_sep>import ActorAlignmentInformations from "@/protocol/network/types/ActorAlignmentInformations";
import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
import GameFightMonsterInformations from "@/protocol/network/types/GameFightMonsterInformations";
export default class GameFightMonsterWithAlignmentInformations extends GameFightMonsterInformations {
public alignmentInfos: ActorAlignmentInformations;
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats,
creatureGenericId = 0,
creatureGrade = 0,
alignmentInfos: ActorAlignmentInformations
) {
super(
contextualId,
look,
disposition,
teamId,
alive,
stats,
creatureGenericId,
creatureGrade
);
this.alignmentInfos = alignmentInfos;
}
}
<file_sep>import AbstractGameActionMessage from "@/protocol/network/messages/AbstractGameActionMessage";
import AbstractFightDispellableEffect from "@/protocol/network/types/AbstractFightDispellableEffect";
export default class GameActionFightDispellableEffectMessage extends AbstractGameActionMessage {
public effect: AbstractFightDispellableEffect;
constructor(
actionId = 0,
sourceId = 0,
effect: AbstractFightDispellableEffect
) {
super(actionId, sourceId);
this.effect = effect;
}
}
<file_sep>import BasicGuildInformations from "@/protocol/network/types/BasicGuildInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import FriendSpouseInformations from "@/protocol/network/types/FriendSpouseInformations";
export default class FriendSpouseOnlineInformations extends FriendSpouseInformations {
public mapId: number;
public subAreaId: number;
public inFight: boolean;
public followSpouse: boolean;
constructor(
spouseAccountId = 0,
spouseId = 0,
spouseName = "",
spouseLevel = 0,
breed = 0,
sex = 0,
spouseEntityLook: EntityLook,
guildInfo: BasicGuildInformations,
alignmentSide = 0,
mapId = 0,
subAreaId = 0,
inFight = false,
followSpouse = false
) {
super(
spouseAccountId,
spouseId,
spouseName,
spouseLevel,
breed,
sex,
spouseEntityLook,
guildInfo,
alignmentSide
);
this.mapId = mapId;
this.subAreaId = subAreaId;
this.inFight = inFight;
this.followSpouse = followSpouse;
}
}
<file_sep>import ChatAbstractServerMessage from "@/protocol/network/messages/ChatAbstractServerMessage";
export default class ChatServerCopyMessage extends ChatAbstractServerMessage {
public receiverId: number;
public receiverName: string;
constructor(
channel = 0,
content = "",
timestamp = 0,
fingerprint = "",
receiverId = 0,
receiverName = ""
) {
super(channel, content, timestamp, fingerprint);
this.receiverId = receiverId;
this.receiverName = receiverName;
}
}
<file_sep>import Message from "@/protocol/network/messages/Message";
export default class PaddockToSellFilterMessage extends Message {
public areaId: number;
public atLeastNbMount: number;
public atLeastNbMachine: number;
public maxPrice: number;
constructor(
areaId = 0,
atLeastNbMount = 0,
atLeastNbMachine = 0,
maxPrice = 0
) {
super();
this.areaId = areaId;
this.atLeastNbMount = atLeastNbMount;
this.atLeastNbMachine = atLeastNbMachine;
this.maxPrice = maxPrice;
}
}
<file_sep>declare module "react-color";
declare module "semver";
<file_sep>import FightTeamInformations from "@/protocol/network/types/FightTeamInformations";
import FightTeamMemberInformations from "@/protocol/network/types/FightTeamMemberInformations";
export default class FightAllianceTeamInformations extends FightTeamInformations {
public relation: number;
constructor(
teamid = 2,
leaderid = 0,
teamside = 0,
teamtypeid = 0,
relation = 0,
teammembers: FightTeamMemberInformations[]
) {
super(teamid, leaderid, teamside, teamtypeid, teammembers);
this.relation = relation;
}
}
<file_sep>import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameFightAIInformations from "@/protocol/network/types/GameFightAIInformations";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
export default class GameFightTaxCollectorInformations extends GameFightAIInformations {
public firstNameId: number;
public lastNameId: number;
public level: number;
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats,
firstNameId = 0,
lastNameId = 0,
level = 0
) {
super(contextualId, look, disposition, teamId, alive, stats);
this.firstNameId = firstNameId;
this.lastNameId = lastNameId;
this.level = level;
}
}
<file_sep>import Account from "@/account";
import ScriptAction, {
ScriptActionResults
} from "@/scripts/actions/ScriptAction";
export default class StartShopAction extends ScriptAction {
public _name: string = "StartShopAction";
public async process(account: Account): Promise<ScriptActionResults> {
account.game.exchange.startShop();
return ScriptAction.processingResult();
}
}
<file_sep>import Account from "@/account";
import Frames, { IFrame } from "@/frames";
import ExchangeLeaveMessage from "@/protocol/network/messages/ExchangeLeaveMessage";
import ExchangeStartOkMountMessage from "@/protocol/network/messages/ExchangeStartOkMountMessage";
export default class BreedingFrame implements IFrame {
public register() {
Frames.dispatcher.register(
"ExchangeStartOkMountMessage",
this.HandleExchangeStartOkMountMessage,
this
);
Frames.dispatcher.register(
"ExchangeLeaveMessage",
this.HandleExchangeLeaveMessage,
this
);
}
private async HandleExchangeStartOkMountMessage(
account: Account,
message: ExchangeStartOkMountMessage
) {
await account.game.breeding.UpdateExchangeStartOkMountMessage(message);
}
private async HandleExchangeLeaveMessage(
account: Account,
message: ExchangeLeaveMessage
) {
account.game.breeding.UpdateExchangeLeaveMessage(message);
}
}
<file_sep>import Message from "@/protocol/network/messages/Message";
import QuestActiveInformations from "@/protocol/network/types/QuestActiveInformations";
import QuestActiveDetailedInformations from "../types/QuestActiveDetailedInformations";
export default class QuestStepInfoMessage extends Message {
public infos: QuestActiveInformations | QuestActiveDetailedInformations;
constructor(
infos: QuestActiveInformations | QuestActiveDetailedInformations
) {
super();
this.infos = infos;
}
}
<file_sep>import EntityDispositionInformations from "@/protocol/network/types/EntityDispositionInformations";
import EntityLook from "@/protocol/network/types/EntityLook";
import GameContextActorInformations from "@/protocol/network/types/GameContextActorInformations";
import GameFightMinimalStats from "@/protocol/network/types/GameFightMinimalStats";
export default class GameFightFighterInformations extends GameContextActorInformations {
public teamId: number;
public alive: boolean;
public stats: GameFightMinimalStats;
constructor(
contextualId = 0,
look: EntityLook,
disposition: EntityDispositionInformations,
teamId = 2,
alive = false,
stats: GameFightMinimalStats
) {
super(contextualId, look, disposition);
this.teamId = teamId;
this.alive = alive;
this.stats = stats;
}
}
| 8490df8058b336a1d33322be4dfc4adcab09dbc5 | ["Markdown", "TypeScript", "INI"] | 103 | TypeScript | Bazaltic/cookietouch | 3e8bd960c2765cc98b68cda37aebe67a293c74f5 | bb57ad6c4fd3418e4734d0fd7647dd352ac58aaf |
refs/heads/master | <repo_name>bkbiggs/BF-Camera<file_sep>/camera.py
from picamera import PiCamera
from time import sleep
from datetime import datetime
import subprocess
import sys
# this data file is generated by sunrise.pl on the import system
file = open("sunrise_sunset.data", "r")
a = file.read().split(",")
file.close()
begin_limit = int(a[0]) * 60 + int(a[1])
# what time is it?
t = datetime.now()
begin_t = t.hour * 60 + t.minute
# is it too early to start?
if (begin_t < begin_limit):
print ( "dont start yet" )
sys.exit(0)
# what time is quitting time?
end_t = datetime(t.year, t.month, t.day, int(a[2]), int(a[3]), 0)
if (t > end_t):
print ("dont restart")
sys.exit(0)
thisHost = "_pi15"
image_path_header = "/home/pi/share/images/image"
icon_path_header = "/home/pi/share/images/icon_image"
fileext = thisHost + ".jpg"
# set up and turn on the camera
camera = PiCamera()
camera.rotation = 180
camera.exposure_mode = 'sports'
camera.resolution = (1280,1024)
# camera.brightness = 55
camera.start_preview()
#sleep(2)
for i in range ( (30 * 13) ):
t = datetime.now()
time_now = t.strftime("%Y%m%d%H%M")
image_pathname = image_path_header + time_now + fileext
icon_pathname = icon_path_header + time_now + fileext
print (time_now + " snap: " + image_pathname)
sleep(2)
camera.capture(image_pathname)
subprocess.call(['/bin/sh', '/home/pi/datetime_image.sh', image_pathname])
subprocess.call(['/bin/sh', '/home/pi/make_icon.sh', image_pathname, icon_pathname])
subprocess.call(['sudo', 'ln', "-f", image_pathname , '/var/www/html/myImage.jpg'])
if (t > end_t):
break
sleep (118)
camera.stop_preview()
# list-form subprocess.call does not interpret '<' as shell redirection,
# so feed the message body through stdin instead
with open('camera.py') as body:
    subprocess.call(['mail', '-s', 'shutting down', '<EMAIL>', '-A', '/var/www/html/myImage.jpg'], stdin=body)
print (t.strftime("%Y%m%d%H%M") + " closing camera collection...")
sleep (5)
<file_sep>/datetime_image.sh
#!/bin/bash
convert "$1" -font FreeSans \
-pointsize 48 -fill white -annotate +800+1000 \
%[exif:DateTimeOriginal] "$1"
<file_sep>/make_icon.sh
#!/bin/bash
convert "$1" -resize 256x192 "$2"
<file_sep>/README.md
# BF-Camera
This is the "code" that manages the capture of photos from the
bird feeder.
crontab is used to start (or restart) the camera.py master code.
camera.py loops over a 2 minute photo cycle, which <br>
- stamps the photo with the current datetime (datetime_image.sh)<br>
- creates the icon version of the photo for user (make_icon.sh)<br>
- "updates" a monitoring version of the current picture for web monitoring<br>
camera.py makes sure that the photo isn't captured unless it's taken between
sunrise and sunset. That time frame is provided in the sunrise_sunset.data file
updated from raspberrypi02 on a daily(?) basis.
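The daylight gate described above can be sketched as a small helper (a minimal sketch mirroring the checks in camera.py; the four comma-separated fields in sunrise_sunset.data are assumed to be sunrise hour, sunrise minute, sunset hour and sunset minute, which is how camera.py indexes them):

```python
from datetime import datetime

def within_capture_window(fields, now):
    # fields come from sunrise_sunset.data, e.g. ["6", "30", "20", "15"];
    # assumed order: sunrise hour, sunrise minute, sunset hour, sunset minute,
    # matching how camera.py indexes a[0]..a[3]
    sunrise_minutes = int(fields[0]) * 60 + int(fields[1])
    sunset = datetime(now.year, now.month, now.day, int(fields[2]), int(fields[3]), 0)
    now_minutes = now.hour * 60 + now.minute
    # camera.py refuses to start before sunrise and stops its loop after sunset
    return now_minutes >= sunrise_minutes and now <= sunset

print(within_capture_window(["6", "30", "20", "15"], datetime(2020, 6, 1, 12, 0)))  # True
```

camera.py applies the first comparison once at startup and the second inside its capture loop.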
remove_old_images.sh is also run from cron to remove all but the most recent 800
pictures. The import process includes a step to choose from the current pictures
any that we want to make available. That moves the files from the camera host to
raspberrypi04 pending "place". The rest are cleaned out by this script to keep
the storage within a clean range.
<file_sep>/remove_old_images.sh
#!/bin/bash
COUNTER=0
for i in $( ls -t1 ./share/images ); do
let COUNTER=COUNTER+1
if [ $COUNTER -gt 800 ]; then
echo $COUNTER: $i
rm ./share/images/$i
fi
done
<file_sep>/runcamera.sh
python3 /home/pi/camera.py
echo Wrapping up...
mail -s "raspberrypi09 camera not shutting down in 2 minutes" <EMAIL> -A /var/www/html/myImage.jpg < camera.py
sleep 10
echo delay to try to allow transmision to complete..
# wait two minutes to transfer closing email
sleep 110
echo finishing...
sleep 10
#sudo shutdown -h now
| 693701fe53c63b2ef8c98613bc5b4b1a79403c60 | ["Markdown", "Python", "Shell"] | 6 | Python | bkbiggs/BF-Camera | ef7b8058643ab1a7eaec96f02c12fea54aa7b08d | 74fb3832db31455b26f1f6b4bfb56e5c1de48e97 |
refs/heads/master | <repo_name>chris1287/thr<file_sep>/Cargo.toml
[package]
name = "thr"
version = "0.1.0"
authors = ["<NAME> <<EMAIL>>"]
[dependencies]
alsa = "0.2.0"
getopts = "0.2"
[lib]
crate-type = ["rlib", "dylib"]
<file_sep>/src/main.rs
extern crate getopts;
extern crate thr;
use thr::*;
struct Option {
pub short: &'static str,
pub long: &'static str,
pub function: fn(&str) -> u16,
pub description: &'static str,
pub domain: &'static str
}
static OPTIONS : &[Option] = &[
Option {short: "" , long: "amplifier" , function: get_amplifier , description: "set amplifier" , domain: "[clean; crunch; lead; brit; modern; bass; aco; flat]"},
Option {short: "" , long: "gain" , function: get_u16 , description: "set gain" , domain: "[0-100]"},
Option {short: "" , long: "master" , function: get_u16 , description: "set master" , domain: "[0-100]"},
Option {short: "" , long: "bass" , function: get_u16 , description: "set bass" , domain: "[0-100]"},
Option {short: "" , long: "middle" , function: get_u16 , description: "set middle" , domain: "[0-100]"},
Option {short: "" , long: "treble" , function: get_u16 , description: "set treble" , domain: "[0-100]"},
Option {short: "" , long: "cabinet" , function: get_cabinet , description: "set cabinet" , domain: "[usa4x12; usa2x12; brit4x12; brit2x12; cab1x12; cab4x10]"},
Option {short: "" , long: "gate" , function: get_gate , description: "set gate" , domain: "[on; off]"},
Option {short: "" , long: "gate-thr" , function: get_u16 , description: "set gate threshold" , domain: "[0-100]"},
Option {short: "" , long: "gate-rel" , function: get_u16 , description: "set gate release" , domain: "[0-100]"},
Option {short: "" , long: "compressor" , function: get_compressor , description: "set compressor" , domain: "[on; off]"},
Option {short: "" , long: "comp-type" , function: get_compressor_type , description: "set compressor type" , domain: "[stomp; rack]"},
Option {short: "" , long: "stomp-sus" , function: get_u16 , description: "set compressor stomp sustain" , domain: "[0-100]"},
Option {short: "" , long: "stomp-out" , function: get_u16 , description: "set compressor stomp output" , domain: "[0-100]"},
Option {short: "" , long: "rack-thr" , function: get_u16 , description: "set compressor rack threshold" , domain: "[0-1112]"},
Option {short: "" , long: "rack-att" , function: get_u16 , description: "set compressor rack attack" , domain: "[0-100]"},
Option {short: "" , long: "rack-rel" , function: get_u16 , description: "set compressor rack release" , domain: "[0-100]"},
Option {short: "" , long: "rack-ratio" , function: get_ratio , description: "set compressor rack ratio" , domain: "[1:1; 1:4; 1:8; 1:12; 1:20; 1:inf]"},
Option {short: "" , long: "rack-knee" , function: get_knee , description: "set compressor rack knee" , domain: "[soft; medium; hard]"},
Option {short: "" , long: "rack-out" , function: get_u16 , description: "set compressor rack output" , domain: "[0-1112]"},
Option {short: "" , long: "modulation" , function: get_modulation , description: "set modulation" , domain: "[on; off]"},
Option {short: "" , long: "mod-select" , function: get_modulation_selector , description: "set modulation selector" , domain: "[chorus; flanger; tremolo; phaser]"},
Option {short: "" , long: "chorus-speed" , function: get_u16 , description: "set chorus speed" , domain: "[0-100]"},
Option {short: "" , long: "chorus-depth" , function: get_u16 , description: "set chorus depth" , domain: "[0-100]"},
Option {short: "" , long: "chorus-mix" , function: get_u16 , description: "set chorus mix" , domain: "[0-100]"},
Option {short: "" , long: "flanger-speed" , function: get_u16 , description: "set flanger speed" , domain: "[0-100]"},
Option {short: "" , long: "flanger-manual" , function: get_u16 , description: "set flanger manual" , domain: "[0-100]"},
Option {short: "" , long: "flanger-depth" , function: get_u16 , description: "set flanger depth" , domain: "[0-100]"},
Option {short: "" , long: "flanger-feedback" , function: get_u16 , description: "set flanger feedback" , domain: "[0-100]"},
Option {short: "" , long: "flanger-spread" , function: get_u16 , description: "set flanger spread" , domain: "[0-100]"},
Option {short: "" , long: "tremolo-freq" , function: get_u16 , description: "set tremolo frequency" , domain: "[0-100]"},
Option {short: "" , long: "tremolo-depth" , function: get_u16 , description: "set tremolo depth" , domain: "[0-100]"},
Option {short: "" , long: "phaser-speed" , function: get_u16 , description: "set phaser speed" , domain: "[0-100]"},
Option {short: "" , long: "phaser-manual" , function: get_u16 , description: "set phaser manual" , domain: "[0-100]"},
Option {short: "" , long: "phaser-depth" , function: get_u16 , description: "set phaser depth" , domain: "[0-100]"},
Option {short: "" , long: "phaser-feedback" , function: get_u16 , description: "set phaser feedback" , domain: "[0-100]"},
Option {short: "" , long: "delay" , function: get_delay , description: "set delay" , domain: "[on; off]"},
Option {short: "" , long: "delay-time" , function: get_u16 , description: "set delay time" , domain: "[1-19983]"},
Option {short: "" , long: "delay-feedback" , function: get_u16 , description: "set delay feedback" , domain: "[0-100]"},
Option {short: "" , long: "delay-hcut" , function: get_u16 , description: "set delay high cut" , domain: "[1896-32001]"},
Option {short: "" , long: "delay-lcut" , function: get_u16 , description: "set delay low cut" , domain: "[21-15936]"},
Option {short: "" , long: "delay-level" , function: get_u16 , description: "set delay level" , domain: "[0-100]"},
Option {short: "" , long: "reverb" , function: get_reverb , description: "set reverb" , domain: "[on; off]"},
Option {short: "" , long: "reverb-type" , function: get_reverb_type , description: "set reverb type" , domain: "[room; plate; hall; spring]"},
Option {short: "" , long: "reverb-time" , function: get_u16 , description: "set reverb time" , domain: "[3-328]"},
Option {short: "" , long: "reverb-pre" , function: get_u16 , description: "set reverb pre" , domain: "[1-3920]"},
Option {short: "" , long: "reverb-lcut" , function: get_u16 , description: "set reverb low cut" , domain: "[21-15936]"},
Option {short: "" , long: "reverb-hcut" , function: get_u16 , description: "set reverb high cut" , domain: "[1896-32001]"},
Option {short: "" , long: "reverb-hratio" , function: get_u16 , description: "set reverb high ratio" , domain: "[1-10]"},
Option {short: "" , long: "reverb-lratio" , function: get_u16 , description: "set reverb low ratio" , domain: "[1-14]"},
Option {short: "" , long: "reverb-level" , function: get_u16 , description: "set reverb level" , domain: "[0-100]"},
Option {short: "" , long: "spring-reverb" , function: get_u16 , description: "set spring reverb" , domain: "[0-100]"},
Option {short: "" , long: "spring-filter" , function: get_u16 , description: "set spring filter" , domain: "[0-100]"}
];
fn get_opts() -> getopts::Options {
let mut opts = getopts::Options::new();
opts.long_only(true);
opts.optflag("" , "help" , "print help");
opts.optflag("" , "rawmidis" , "print available raw midi controllers");
opts.optflag("" , "dryrun" , "do not send sysex to device");
opts.optopt("" , "select" , "select raw midi controller" , "[hw:?,?,?]");
opts.optopt("" , "file" , "load file" , "[file name]");
opts.optflag("" , "monitor" , "monitor THR activity");
for o in OPTIONS {
opts.optopt(o.short, o.long, o.description, o.domain);
}
opts
}
fn print_usage(program: &str, opts: getopts::Options) {
let brief = format!("Usage: {} [options]", program);
print!("{}", opts.usage(&brief));
}
fn main() {
let args : Vec<String> = std::env::args().collect();
let opts = get_opts();
match opts.parse(&args) {
Ok(matches) => {
if matches.opt_present("help") {
print_usage("thr", opts);
return;
}
if matches.opt_present("rawmidis") {
print_rawmidis();
}
let device_name = matches.opt_str("select");
let device_name = match device_name {
Some(x) => x,
None => String::from("")
};
let dry = matches.opt_present("dryrun");
if matches.opt_present("monitor") {
match start(device_name.as_ref()) {
Ok(_) => {},
Err(_) => {}
};
}
let cmd = matches.opt_str("file");
if let Some(x) = cmd {
match load_file(&x) {
Some(sysex) => {
if dry {
print_sysex(&sysex);
} else {
match send_sysex(device_name.as_ref(), &sysex) {
Ok(()) => {},
Err(e) => { println!("{}", e); }
}
}
},
None => {
println!("invalid file");
}
};
};
for o in OPTIONS {
let cmd = matches.opt_str(o.long);
if let Some(x) = cmd {
match send_command(device_name.as_ref(), &get_knob(o.long), &(o.function)(&x), dry) {
Ok(()) => {},
Err(e) => { println!("{}", e); }
};
}
}
},
Err(e) => {
println!("{}", e);
}
};
}
<file_sep>/src/lib.rs
extern crate alsa;
use std::fs::File;
use std::io::Read;
use std::io::Write;
const HDR_SIZE: usize = 18;
const NAME_SIZE: usize = 128;
const DATA_START: usize = HDR_SIZE + NAME_SIZE;
const KEEP_ALIVE: &[u8] = &[0xF0, 0x43, 0x7D, 0x60, 0x44, 0x54, 0x41, 0x31, 0xF7];
const KNOB_TURN: &[u8] = &[0xF0, 0x43, 0x7D, 0x10, 0x41, 0x30, 0x01];
const PRESET_CHANGE: &[u8] = &[0xF0, 0x43, 0x7D, 0x00, 0x02, 0x0C, 0x44, 0x54, 0x41, 0x31, 0x41, 0x6C, 0x6C, 0x50, 0x00, 0x00, 0x7F, 0x7F];
#[no_mangle]
pub extern fn load_file(file_name: &str) -> Option<Vec<u8>> {
let header: [u8; 18] = [
0xF0, 0x43, 0x7D, 0x00,
0x02,
0x0C,
b'D', b'T', b'A', b'1', b'A', b'l', b'l', b'P', 0x00, 0x00, 0x7F, 0x7F
];
let mut sysex:Vec<u8> = Vec::new();
let mut file_content = [0; 265];
let mut file_handler = match File::open(file_name) {
Ok(file_handler) => file_handler,
Err(e) => {
println!("{}", e.to_string());
return None;
}
};
match file_handler.read(&mut file_content) {
Ok(x) => println!("read {} bytes from {}", x, file_name),
Err(e) => {
println!("{}", e.to_string());
return None;
}
};
let file_content = &file_content[9..];
let hcrc: u32 = header[6..].iter().map(|&x| x as u32).sum();
let fcrc: u32 = file_content.iter().map(|&x| x as u32).sum();
let mut crc: u32 = hcrc + fcrc;
crc = (!crc + 1) & 0x7F;
sysex.extend_from_slice(&header);
sysex.extend_from_slice(&file_content);
sysex.push(crc as u8);
sysex.push(0xF7);
Some(sysex)
}
#[no_mangle]
pub extern fn print_rawmidis() {
for card in alsa::card::Iter::new(){
match card {
Ok(card) => {
match alsa::Ctl::from_card(&card, false) {
Ok(ctl) => {
for rawmidi in alsa::rawmidi::Iter::new(&ctl) {
match rawmidi {
Ok(rawmidi) => {
println!("rawmidi {:?} hw:{},{},{} - {} ({})",
rawmidi.get_stream(),
card.get_index(),
rawmidi.get_device(),
rawmidi.get_subdevice(),
card.get_name().unwrap_or_else(|_| "".to_string()),
card.get_longname().unwrap_or_else(|_| "".to_string()));
},
Err(e) => {
println!("{}", e.to_string());
}
};
}
}, Err(e) => {
println!("{}", e.to_string());
}
};
},
Err(e) => {
println!("{}", e.to_string());
}
};
}
}
#[no_mangle]
pub extern fn send_sysex(name: &str, buf: &[u8]) -> Result<(), String> {
match alsa::rawmidi::Rawmidi::new(name, alsa::Direction::Playback, false) {
Ok(rawmidi) => {
let mut writer = rawmidi.io();
match writer.write(buf) {
Ok(n) => {
println!("{}: written {} bytes of {}", name, n, buf.len());
Ok(())
},
Err(e) => {
Err(e.to_string())
}
}
},
Err(e) => {
Err(e.to_string())
}
}
}
#[no_mangle]
pub extern fn send_command(name: &str, knob: &u8, value: &u16, dry: bool) -> Result<(), String> {
let sysex_set: [u8; 11] = [
0xF0, 0x43, 0x7D, 0x10, 0x41, 0x30, 0x01,
*knob, (value >> 8) as u8, (value & 0xFF) as u8,
0xF7];
if dry {
print_sysex(&sysex_set);
} else {
send_sysex(name, &sysex_set)?
}
Ok(())
}
#[no_mangle]
pub extern fn start(name: &str) -> Result<(), String> {
match alsa::rawmidi::Rawmidi::new(name, alsa::Direction::Capture, false) {
Ok(rawmidi) => {
let mut handler = rawmidi.io();
let mut cmd = Vec::new();
loop {
let mut buffer : [u8; 1] = [0x00];
match handler.read_exact(&mut buffer) {
Ok(_) => {
cmd.push(buffer[0]);
if buffer[0] == 0xF7 {
if is_cmd(&cmd, &KEEP_ALIVE) {
// ignore
} else if is_cmd(&cmd, &KNOB_TURN) {
dump_knob_turn(&cmd);
} else if is_cmd(&cmd, PRESET_CHANGE) {
dump_preset_name(&cmd);
dump_amplifier(&cmd);
dump_gain(&cmd);
dump_master(&cmd);
dump_bass(&cmd);
dump_middle(&cmd);
dump_treble(&cmd);
dump_cabinet(&cmd);
}
else {
print_sysex(&cmd);
}
cmd.clear();
}
}
Err(e) => {
return Err(e.to_string());
}
};
}
},
Err(e) => {
Err(e.to_string())
}
}
}
pub fn print_sysex(buf: &[u8]) {
for i in buf {
print!{" {:02X}", i}
}
println!();
}
pub fn get_u16(s: &str) -> u16 {
s.parse::<u16>().unwrap_or(0)
}
pub fn get_amplifier(name: &str) -> u16 {
match name {
"clean" => 0x00,
"crunch" => 0x01,
"lead" => 0x02,
"brit" => 0x03,
"modern" => 0x04,
"bass" => 0x05,
"aco" => 0x06,
"flat" => 0x07,
_ => 0x00
}
}
pub fn rev_amplifier(value: u8) -> &'static str {
match value {
0x00 => "clean",
0x01 => "crunch",
0x02 => "lead",
0x03 => "brit",
0x04 => "modern",
0x05 => "bass",
0x06 => "aco",
0x07 => "flat",
_ => ""
}
}
pub fn get_knob(name: &str) -> u8 {
match name {
"amplifier" => 0x00,
"gain" => 0x01,
"master" => 0x02,
"bass" => 0x03,
"middle" => 0x04,
"treble" => 0x05,
"cabinet" => 0x06,
"gate" => 0x5F,
"gate-thr" => 0x51,
"gate-rel" => 0x52,
"compressor" => 0x1F,
"comp-type" => 0x10,
"stomp-sus" => 0x11,
"stomp-out" => 0x12,
"rack-thr" => 0x11,
"rack-att" => 0x13,
"rack-rel" => 0x14,
"rack-ratio" => 0x15,
"rack-knee" => 0x16,
"rack-out" => 0x17,
"modulation" => 0x2F,
"mod-select" => 0x20,
"chorus-speed" => 0x21,
"chorus-depth" => 0x22,
"chorus-mix" => 0x23,
"flanger-speed" => 0x21,
"flanger-manual" => 0x22,
"flanger-depth" => 0x23,
"flanger-feedback" => 0x24,
"flanger-spread" => 0x25,
"tremolo-freq" => 0x21,
"tremolo-depth" => 0x22,
"phaser-speed" => 0x21,
"phaser-manual" => 0x22,
"phaser-depth" => 0x23,
"phaser-feedback" => 0x24,
"delay" => 0x3F,
"delay-time" => 0x31,
"delay-feedback" => 0x33,
"delay-hcut" => 0x34,
"delay-lcut" => 0x36,
"delay-level" => 0x38,
"reverb" => 0x4F,
"reverb-type" => 0x40,
"reverb-time" => 0x41,
"reverb-pre" => 0x43,
"reverb-lcut" => 0x45,
"reverb-hcut" => 0x47,
"reverb-hratio" => 0x49,
"reverb-lratio" => 0x4A,
"reverb-level" => 0x4B,
"spring-reverb" => 0x41,
"spring-filter" => 0x42,
_ => 0x00
}
}
pub fn rev_knob(value: u8) -> &'static str {
match value {
0x00 => "amplifier",
0x01 => "gain",
0x02 => "master",
0x03 => "bass",
0x04 => "middle",
0x05 => "treble",
0x06 => "cabinet",
0x5F => "gate",
0x51 => "gate-thr",
0x52 => "gate-rel",
0x1F => "compressor",
0x10 => "comp-type",
0x11 => "stomp-sus",
0x12 => "stomp-out",
0x13 => "rack-att",
0x14 => "rack-rel",
0x15 => "rack-ratio",
0x16 => "rack-knee",
0x17 => "rack-out",
0x2F => "modulation",
0x20 => "mod-select",
0x21 => "chorus-speed",
0x22 => "chorus-depth",
0x23 => "chorus-mix",
0x24 => "flanger-feedback",
0x25 => "flanger-spread",
0x3F => "delay",
0x31 => "delay-time",
0x33 => "delay-feedback",
0x34 => "delay-hcut",
0x36 => "delay-lcut",
0x38 => "delay-level",
0x4F => "reverb",
0x40 => "reverb-type",
0x41 => "reverb-time",
0x43 => "reverb-pre",
0x45 => "reverb-lcut",
0x47 => "reverb-hcut",
0x49 => "reverb-hratio",
0x4A => "reverb-lratio",
0x4B => "reverb-level",
0x42 => "spring-filter",
_ => ""
}
}
pub fn get_cabinet(name: &str) -> u16 {
match name {
"usa4x12" => 0x00,
"usa2x12" => 0x01,
"brit4x12" => 0x02,
"brit2x12" => 0x03,
"cab1x12" => 0x04,
"cab4x10" => 0x05,
_ => 0x00
}
}
pub fn rev_cabinet(value: u8) -> &'static str {
match value {
0x00 => "usa4x12",
0x01 => "usa2x12",
0x02 => "brit4x12",
0x03 => "brit2x12",
0x04 => "cab1x12",
0x05 => "cab4x10",
_ => ""
}
}
pub fn get_compressor(name: &str) -> u16 {
match name {
"on" => 0x00,
"off" => 0x7F,
_ => 0x00
}
}
pub fn get_compressor_type(name: &str) -> u16 {
match name {
"stomp" => 0x00,
"rack" => 0x01,
_ => 0x00
}
}
pub fn get_gate(name: &str) -> u16 {
match name {
"on" => 0x00,
"off" => 0x7F,
_ => 0x00
}
}
pub fn get_knee(name: &str) -> u16 {
match name {
"soft" => 0x00,
"medium" => 0x01,
"hard" => 0x02,
_ => 0x00
}
}
pub fn get_ratio(name: &str) -> u16 {
match name {
"1:1" => 0x00,
"1:4" => 0x01,
"1:8" => 0x02,
"1:12" => 0x03,
"1:20" => 0x04,
"1:inf" => 0x05,
_ => 0x00
}
}
pub fn get_modulation(name: &str) -> u16 {
match name {
"on" => 0x00,
"off" => 0x7F,
_ => 0x00
}
}
pub fn get_modulation_selector(name: &str) -> u16 {
match name {
"chorus" => 0x00,
"flanger" => 0x01,
"tremolo" => 0x02,
"phaser" => 0x03,
_ => 0x00
}
}
pub fn get_delay(name: &str) -> u16 {
match name {
"on" => 0x00,
"off" => 0x7F,
_ => 0x00
}
}
pub fn get_reverb(name: &str) -> u16 {
match name {
"on" => 0x00,
"off" => 0x7F,
_ => 0x00
}
}
pub fn get_reverb_type(name: &str) -> u16 {
match name {
"room" => 0x01,
"plate" => 0x02,
"hall" => 0x00,
"spring" => 0x03,
_ => 0x00
}
}
fn is_cmd(cmd: &[u8], cmd_to_check: &[u8]) -> bool {
if cmd_to_check.len() > cmd.len() {
return false;
}
&cmd[0..cmd_to_check.len()] == cmd_to_check
}
fn dump_knob_turn(cmd: &[u8]) {
if cmd.len() < 11 {
return;
}
let knob = rev_knob(cmd[7]);
let value : u32 = (cmd[8] as u32) * 10 + (cmd[9] as u32);
println!("{} = {}", knob, value);
}
fn dump_preset_name(cmd: &[u8]) {
let mut name = String::new();
for c in &cmd[HDR_SIZE..NAME_SIZE + 1] {
if *c != 0u8 {
name.push(*c as char);
} else {
break;
}
}
println!("preset name: {}", name);
}
fn dump_amplifier(cmd: &[u8]) {
println!("amplifier: {}", rev_amplifier(cmd[DATA_START + 0]));
}
fn dump_gain(cmd: &[u8]) {
println!("gain: {}", cmd[DATA_START + 1]);
}
fn dump_master(cmd: &[u8]) {
println!("master: {}", cmd[DATA_START + 2]);
}
fn dump_bass(cmd: &[u8]) {
println!("bass: {}", cmd[DATA_START + 3]);
}
fn dump_middle(cmd: &[u8]) {
println!("middle: {}", cmd[DATA_START + 4]);
}
fn dump_treble(cmd: &[u8]) {
println!("treble: {}", cmd[DATA_START + 5]);
}
fn dump_cabinet(cmd: &[u8]) {
println!("cabinet: {}", rev_cabinet(cmd[DATA_START + 6]));
}
| 69d4f8c9ffbbc2bf9756911b71be2f8e0c6f00e5 | ["TOML", "Rust"] | 3 | TOML | chris1287/thr | 7095462b17078e0def2e40ded9f4907bc4ffbd3e | e882502cdd9de42ff38bd40f9b36dc946b732bd0 |
refs/heads/master | <file_sep>const ENTER_KEYCODE = 13;
document.addEventListener('DOMContentLoaded', () => {
const form = document.querySelector('.form');
const items = document.querySelector('.items');
text.init(form, items);
});
const text = (() => {
let items;
function init(_form, _items) {
items = _items;
_form.addEventListener('submit', formHandler);
// TODO make the items in _items work
items.addEventListener('click', finish );
items.addEventListener('click', edit );
items.addEventListener('click', commit );
_form.addEventListener('submit', add );
items.addEventListener('click', deleteItem );
}
function formHandler(e) {
e.preventDefault();
console.log('hello world');
}
// event handler for completing an item
function finish(e) {
var element = e.target;
var parent = element.parentNode;
if (element.tagName == "INPUT" ){
if(element.type == "checkbox"){
if(element.checked){
parent.setAttribute("class", "item item--done");
}else{
parent.setAttribute("class", "item");
}
}
}
}
// event handler for editing an item
function edit(e) {
var element = e.target;
var parent = element.parentNode;
if (element.tagName == "SPAN" ){
var text = element.innerHTML;
var input = el("input", "item__edit", "edit");
input.setAttribute("type", "text");
input.setAttribute("value", text);
parent.insertBefore(input, parent.childNodes[2]);
element.remove();
input.focus();
}
}
// event handler for finishing an edit of an item
function commit(e) {
var element = e.target;
var parent = element.parentNode;
if (element.tagName == "INPUT" ){
if(element.type == "text"){
var text = element.value;
var span = el("span", "item__text", "commit");
span.appendChild(document.createTextNode(text));
parent.insertBefore(span, parent.childNodes[2]);
element.remove();
}
}
}
// function that adds a new item
function add(e) {
var inputValue = document.getElementsByClassName("form__input")[0].value;
if(inputValue.length != 0 && inputValue.trim() != ""){
var li = el("li", "item", "finish");
var input = el("input", "item__checkbox", "check");
input.setAttribute("type", "checkbox");
var span = el("span", "item__text", "strike");
span.appendChild(document.createTextNode(inputValue));
var button = el("button", "item__button", "delete");
button.appendChild(document.createTextNode("Eyða"));
li.append(input);
li.append(span);
li.append(button);
var ul = document.getElementsByClassName("items");
ul[0].append(li);
document.getElementsByClassName("form__input")[0].value= "";
}
}
// event handler for deleting an entry
function deleteItem(e) {
if (e.target.className == "item__button" ){
e.target.parentNode.remove();
}
}
// helper function to create an element (clickHandler is currently unused)
function el(type, className, clickHandler) {
var element = document.createElement(type);
element.classList.add(className);
return element;
}
return {
init: init
}
})(); | ef4be73129827c8ed73f82602f434786b632f0e8 | [
"JavaScript"
]
| 1 | JavaScript | mimozah/verkefni8 | bcb6db6fa78707cc7d15fb3353850ba63018a3f0 | 6883ad3c9a8905352ef697c698ca7e015103d35a |
refs/heads/master | <file_sep>
//1.Single element selector
let element = document.getElementById('myfirst');
// element = element.className
// element = element.childNodes
// element = element.parentNode
element.style.color = 'red'
element.innerText = '<NAME>'
element.innerHTML = '<b><NAME> a <NAME></b>'
// console.log(element);
let sel = document.querySelector('#myfirst4');
sel = document.querySelector('.child');
sel = document.querySelector('div');
sel.style.color = 'green'
// console.log(sel);
//multi element selector
let elems = document.getElementsByClassName('child');
elems = document.getElementsByTagName('div');
console.log(elems);
Array.from(elems).forEach(element => {
console.log(element);
element.style.color = 'blue';
});
| e87a841f1505278e4b42a3fcc04f3a2e950f6f27 | [
"JavaScript"
]
| 1 | JavaScript | mku306ukm/html-element-selectors | ffdd6df2e589e7be306bdee9f6fdce09f58da25b | 9d516907acbeeefe8e6657bfd8257e8e96f5109f |
refs/heads/master | <file_sep>package ru.napoleonit.common.ui
import android.content.Context
import android.os.Bundle
import android.view.View
import androidx.fragment.app.Fragment
import androidx.fragment.app.FragmentTransaction
import dagger.android.AndroidInjector
import dagger.android.DispatchingAndroidInjector
import dagger.android.HasAndroidInjector
import dagger.android.support.AndroidSupportInjection
import ru.napoleonit.common.R
import ru.napoleonit.common.navigation.LocalCiceronesHolder
import ru.napoleonit.common.navigation.command.ForwardWithSharedElements
import ru.napoleonit.common.navigation.router.TransitionsRouter
import ru.terrakok.cicerone.Navigator
import ru.terrakok.cicerone.NavigatorHolder
import ru.terrakok.cicerone.Router
import ru.terrakok.cicerone.android.support.SupportAppNavigator
import ru.terrakok.cicerone.android.support.SupportAppScreen
import ru.terrakok.cicerone.commands.Command
import javax.inject.Inject
abstract class ContainerFragment : BaseFragment(), HasAndroidInjector {
abstract val firstScreen: SupportAppScreen
@Inject
lateinit var androidInjector: DispatchingAndroidInjector<Any>
@Inject
lateinit var localCiceronesHolder: LocalCiceronesHolder
private val key = this::class.java.simpleName
val router: TransitionsRouter by lazy {
localCiceronesHolder.getOrCreate(key).router
}
private val navigatorHolder: NavigatorHolder by lazy {
localCiceronesHolder.getOrCreate(key).navigatorHolder
}
private lateinit var navigator: Navigator
override val layoutId: Int = R.layout.fragment_container
override fun onAttach(context: Context) {
AndroidSupportInjection.inject(this)
super.onAttach(context)
}
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
navigator = object : SupportAppNavigator(
activity,
childFragmentManager,
R.id.llFragmentContainer
) {
override fun setupFragmentTransaction(
command: Command,
currentFragment: Fragment?,
nextFragment: Fragment?,
fragmentTransaction: FragmentTransaction
) {
val sharedElements =
if (command is ForwardWithSharedElements) command.sharedElements else null
if (sharedElements != null)
fragmentTransaction.run {
setReorderingAllowed(true)
sharedElements.forEach {
addSharedElement(it.key, it.value)
}
}
(currentFragment as? ExitTransitionHandler)?.prepareExitTransitions(
fragmentTransaction
)
(nextFragment as? EnterTransitionHandler)?.prepareEnterTransitions(
fragmentTransaction
)
}
}
router.navigateTo(firstScreen)
}
override fun onResume() {
super.onResume()
navigatorHolder.setNavigator(navigator)
}
override fun onPause() {
super.onPause()
navigatorHolder.removeNavigator()
}
override fun androidInjector(): AndroidInjector<Any> = androidInjector
}<file_sep>package ru.napoleonit.di
import dagger.Binds
import dagger.Module
import dagger.Provides
import ru.napoleonit.featureCloser.FeatureCloserImpl
import ru.napoleonit.settings.dependency.FeatureCloser
import javax.inject.Singleton
@Module
interface AppModule {
@Singleton
@Binds
fun featureCloser(featureCloserImpl: FeatureCloserImpl): FeatureCloser
}
<file_sep>package ru.napoleonit.common.presentation
import moxy.MvpView
interface BaseView : MvpView<file_sep>package ru.napoleonit.network.api
import ru.napoleonit.network.service.ApiService
interface NetworkServiceApi {
fun apiService(): ApiService
}<file_sep>package ru.napoleonit.settings.di
import dagger.Module
import dagger.Provides
import ru.napoleonit.common.di.ActivityScope
import ru.napoleonit.common.navigation.router.TransitionsRouter
import ru.napoleonit.settings.ui.KittensContainerFragment
import ru.terrakok.cicerone.Router
@Module
class LocalNavigationModule {
@Provides
@ActivityScope
fun router(settingsContainer: KittensContainerFragment): TransitionsRouter =
settingsContainer.router
}
<file_sep>package ru.napoleonit.featureCloser
import ru.napoleonit.settings.dependency.FeatureCloser
import ru.terrakok.cicerone.Router
import javax.inject.Inject
import javax.inject.Named
class FeatureCloserImpl @Inject constructor(@Named("global") private val appRouter: Router) :
FeatureCloser {
override fun close() = appRouter.exit()
}<file_sep>package ru.napoleonit.settings.dependency
import ru.napoleonit.network.service.ApiService
import javax.inject.Inject
class SettingsDependencies @Inject constructor(
val apiService: ApiService,
val featureCloser: FeatureCloser
) {
// fun apiService(): ApiService
//
// fun closeFeature(): FeatureCloser
}<file_sep>package ru.napoleonit.settings.dependency
interface FeatureCloser {
fun close()
}
<file_sep>package ru.napoleonit.settings.di
import dagger.Module
import dagger.android.ContributesAndroidInjector
import ru.napoleonit.common.di.FragmentScope
import ru.napoleonit.settings.ui.KittenDetailFragment
import ru.napoleonit.settings.ui.KittensFragment
@Module
abstract class MainFragmentModule {
@ContributesAndroidInjector
@FragmentScope
abstract fun mainFragmentInjector(): KittensFragment
@ContributesAndroidInjector
@FragmentScope
abstract fun kittensDetailInjector(): KittenDetailFragment
}
<file_sep>package ru.napoleonit.network.di
import dagger.Component
import ru.napoleonit.network.api.NetworkServiceApi
import javax.inject.Singleton
@Component(modules = [NetworkModule::class])
@Singleton
interface NetworkComponent : NetworkServiceApi {
companion object {
val networkComponent: NetworkServiceApi by lazy {
DaggerNetworkComponent.builder()
.build()
}
}
}<file_sep>package ru.napoleonit.multimoduletests.featuresInjector
import androidx.fragment.app.Fragment
import ru.napoleonit.settings.ui.KittensContainerFragment
//import ru.napoleonit.settings.di.DaggerSettingsComponent
//import ru.napoleonit.settings.di.DaggerSettingsComponent_SettingsDependenciesComponent
import ru.terrakok.cicerone.android.support.SupportAppScreen
object FeaturesProxy {
//
fun startSettingsFeature(): SupportAppScreen {
return object : SupportAppScreen() {
override fun getFragment(): Fragment =
KittensContainerFragment()
}
// return DaggerSettingsComponent.builder()
// .settingsDependencies(
// DaggerSettingsComponent_SettingsDependenciesComponent.builder()
// .networkServiceApi(networkComponent)
//// .featureCloserDependency(appComponent)
// .build()
// )
// .build()
// .getScreen()
}
}<file_sep>package ru.napoleonit.multimoduletests.featuresInjector
@Retention(value = AnnotationRetention.RUNTIME)
annotation class PerFeature<file_sep>package ru.napoleonit.common.di
import androidx.fragment.app.Fragment
import androidx.fragment.app.FragmentFactory
import java.lang.Exception
import java.lang.RuntimeException
import javax.inject.Inject
import javax.inject.Provider
//class InjectionFragmentFactory @Inject constructor(
// private val loaders: Map<Class<out Fragment>, Provider<Fragment>>
//) : FragmentFactory() {
// override fun instantiate(classLoader: ClassLoader, className: String): Fragment {
//
// val fragmentClass = loadFragmentClass(classLoader, className)
// return try {
// val loader = loaders[fragmentClass]
// super.instantiate(classLoader, className)
// } catch (ex: Exception) {
// throw RuntimeException(ex)
// }
// }
//}<file_sep>ext.sdkVersions = [
kotlin : '1.3.61',
moxy : '2.0.2',
dagger : '2.25.4',
coroutines : '1.3.3',
network : [
retrofit : '2.6.0',
serialization_converter: '0.4.0'
],
cicerone : '5.0.0',
recyclerView: '1.1.0',
fragmentKtx : '1.1.0',
viewpager2 : '1.0.0',
glide : '4.9.0'
]<file_sep>package ru.napoleonit.settings.ui
import ru.napoleonit.common.ui.ContainerFragment
import ru.napoleonit.common.ui.createAppScreen
import ru.terrakok.cicerone.android.support.SupportAppScreen
class KittensContainerFragment : ContainerFragment() {
override val firstScreen: SupportAppScreen = createAppScreen<KittensFragment>()
}<file_sep>package ru.napoleonit.common.navigation.router
import android.view.View
import ru.napoleonit.common.navigation.command.ForwardWithSharedElements
import ru.terrakok.cicerone.Router
import ru.terrakok.cicerone.Screen
class TransitionsRouter : Router() {
fun navigateWithTransitions(screen: Screen, sharedElements: Map<View, String>) {
executeCommands(ForwardWithSharedElements(screen, sharedElements))
}
}<file_sep>package ru.napoleonit.common.navigation.command
import android.view.View
import ru.terrakok.cicerone.Screen
import ru.terrakok.cicerone.commands.Forward
open class ForwardWithSharedElements(screen: Screen, val sharedElements: Map<View, String>) :
Forward(screen)<file_sep>package ru.napoleonit.settings.ui
import android.graphics.drawable.Drawable
import android.os.Bundle
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import androidx.core.os.bundleOf
import androidx.fragment.app.Fragment
import com.bumptech.glide.Glide
import com.bumptech.glide.load.DataSource
import com.bumptech.glide.load.engine.GlideException
import com.bumptech.glide.request.RequestListener
import com.bumptech.glide.request.RequestOptions
import com.bumptech.glide.request.target.Target
import kotlinx.android.synthetic.main.layout_kitten_detail.*
import ru.napoleonit.common.di.InjectionFragment
import ru.napoleonit.common.navigation.router.TransitionsRouter
import ru.napoleonit.settings.R
import javax.inject.Inject
class KittenDetailFragment : Fragment(), InjectionFragment {
companion object {
const val ARGS = "args"
fun newInstance(url: String): KittenDetailFragment {
return KittenDetailFragment().apply {
arguments = bundleOf(ARGS to url)
}
}
}
@Inject
lateinit var router: TransitionsRouter
private val args by lazy {
arguments!!.getString(ARGS)
}
override fun onCreateView(
inflater: LayoutInflater,
container: ViewGroup?,
savedInstanceState: Bundle?
): View? = inflater.inflate(R.layout.layout_kitten_detail, container, false)
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
ivKitten.transitionName = args
btnBack.setOnClickListener {
// router.exit()
//
parentFragment?.fragmentManager?.popBackStack()
}
Glide.with(view)
.load(args)
.dontAnimate()
.override(Target.SIZE_ORIGINAL, Target.SIZE_ORIGINAL)
.addListener(object : RequestListener<Drawable> {
override fun onLoadFailed(
e: GlideException?,
model: Any?,
target: Target<Drawable>?,
isFirstResource: Boolean
): Boolean {
parentFragment?.startPostponedEnterTransition()
return false
}
override fun onResourceReady(
resource: Drawable?,
model: Any?,
target: Target<Drawable>?,
dataSource: DataSource?,
isFirstResource: Boolean
): Boolean {
parentFragment?.startPostponedEnterTransition()
return false
}
})
.into(ivKitten)
}
}<file_sep>//package ru.napoleonit.common.navigation.navigator
//
//import androidx.annotation.IdRes
//import androidx.fragment.app.FragmentActivity
//import androidx.fragment.app.FragmentManager
//import ru.terrakok.cicerone.android.support.SupportAppNavigator
//import ru.terrakok.cicerone.commands.Forward
//
//class TransitionsNavigator(
// private val activity: FragmentActivity,
// private val fm: FragmentManager,
// @IdRes private val containerId: Int
//) : SupportAppNavigator() {
//
// override fun fragmentForward(command: Forward?) {
// super.fragmentForward(command)
//
// }
//}<file_sep>include ':app', ':dynamicfeature', ':common', ':settings', ':network'
rootProject.name='MultimoduleTests'
<file_sep>package ru.napoleonit.common.di
import androidx.fragment.app.Fragment
import dagger.MapKey
import kotlin.reflect.KClass
@Retention(value = AnnotationRetention.RUNTIME)
annotation class FeatureScope
@Retention(value = AnnotationRetention.RUNTIME)
annotation class ActivityScope
@Retention(value = AnnotationRetention.RUNTIME)
annotation class FragmentScope
@MustBeDocumented
@Target(
AnnotationTarget.FUNCTION,
AnnotationTarget.PROPERTY_GETTER,
AnnotationTarget.PROPERTY_SETTER
)
@Retention(AnnotationRetention.RUNTIME)
@MapKey
annotation class FragmentKey(val value: KClass<out Fragment>)<file_sep>package ru.napoleonit.settings.di
import androidx.fragment.app.Fragment
import dagger.Component
import dagger.android.AndroidInjection
import dagger.android.AndroidInjector
import dagger.android.support.AndroidSupportInjection
import dagger.android.support.AndroidSupportInjectionModule
import ru.napoleonit.network.api.NetworkServiceApi
import ru.napoleonit.settings.api.ScreenFeatureApi
import ru.napoleonit.settings.dependency.FeatureCloser
import ru.napoleonit.settings.dependency.FeatureCloserDependency
import ru.napoleonit.settings.dependency.SettingsDependencies
//@Component(
// modules = [SettingsModule::class, AndroidSupportInjectionModule::class]
// dependencies = [SettingsDependencies::class]
//)
//interface SettingsComponent : ScreenFeatureApi, AndroidInjector<Fragment> {
// @Component(dependencies = [NetworkServiceApi::class, FeatureCloserDependency::class])
// interface SettingsDependenciesComponent : SettingsDependencies
//}<file_sep>package ru.napoleonit.common.ui
import androidx.fragment.app.FragmentTransaction
interface ExitTransitionHandler {
fun prepareExitTransitions(transaction: FragmentTransaction)
}
<file_sep>package ru.napoleonit.common.ui
import androidx.fragment.app.FragmentTransaction
interface EnterTransitionHandler {
fun prepareEnterTransitions(fragmentTransaction: FragmentTransaction)
}
<file_sep>package ru.napoleonit.common.di
import android.os.Bundle
import androidx.fragment.app.Fragment
import androidx.fragment.app.FragmentManager
import dagger.android.support.AndroidSupportInjection
class InjectionLifecycleCallback : FragmentManager.FragmentLifecycleCallbacks() {
override fun onFragmentCreated(fm: FragmentManager, f: Fragment, savedInstanceState: Bundle?) {
super.onFragmentCreated(fm, f, savedInstanceState)
// if (f is InjectionFragment) AndroidSupportInjection.inject(f)
}
}<file_sep>package ru.napoleonit.di
import dagger.Component
import dagger.android.AndroidInjectionModule
import dagger.android.support.AndroidSupportInjectionModule
import ru.napoleonit.BaseApplication
import ru.napoleonit.network.api.NetworkServiceApi
import ru.napoleonit.network.di.NetworkComponent.Companion.networkComponent
import javax.inject.Singleton
@Component(
dependencies = [NetworkServiceApi::class],
modules = [
NavigationModule::class,
AppModule::class,
FeaturesUIBindingModule::class,
AndroidInjectionModule::class,
AndroidSupportInjectionModule::class
]
)
@Singleton
interface AppComponent {
fun inject(app: BaseApplication)
companion object {
val appComponent: AppComponent by lazy {
DaggerAppComponent.builder()
.networkServiceApi(networkComponent)
.build()
}
}
}<file_sep>package ru.napoleonit.common.navigation
import androidx.fragment.app.Fragment
import ru.terrakok.cicerone.android.support.SupportAppScreen
fun Fragment.toScreen(): SupportAppScreen {
return object : SupportAppScreen() {
override fun getFragment() = this@toScreen
}
}<file_sep>apply plugin: 'com.android.library'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-android-extensions'
apply plugin: 'kotlin-kapt'
apply plugin: 'idea'
android {
compileSdkVersion 29
defaultConfig {
minSdkVersion 21
targetSdkVersion 29
versionCode 1
versionName "1.0"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
idea.module {
excludeDirs += file('build/')
}
dependencies {
implementation fileTree(dir: 'libs', include: ['*.jar'])
implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk7:$sdkVersions.kotlin"
api 'androidx.appcompat:appcompat:1.1.0'
api 'androidx.core:core-ktx:1.1.0'
implementation 'com.google.android.material:material:1.0.0'
implementation "androidx.fragment:fragment-ktx:$sdkVersions.fragmentKtx"
implementation "androidx.recyclerview:recyclerview:$sdkVersions.recyclerView"
implementation "androidx.viewpager2:viewpager2:$sdkVersions.viewpager2"
implementation "com.github.bumptech.glide:glide:$sdkVersions.glide"
kapt "com.github.bumptech.glide:compiler:$sdkVersions.glide"
implementation "com.google.dagger:dagger:$sdkVersions.dagger"
kapt "com.google.dagger:dagger-compiler:$sdkVersions.dagger"
implementation "com.google.dagger:dagger-android-support:$sdkVersions.dagger" // if you use the support libraries
implementation "com.google.dagger:dagger-android:$sdkVersions.dagger"
kapt "com.google.dagger:dagger-android-processor:$sdkVersions.dagger"
implementation project(':common')
implementation project(':network')
}
<file_sep>package ru.napoleonit.settings.ui
import androidx.fragment.app.Fragment
import androidx.viewpager2.adapter.FragmentStateAdapter
class KittensDetailAdapter(f: Fragment) : FragmentStateAdapter(f) {
override fun getItemCount() = 2
override fun createFragment(position: Int) =
KittenDetailFragment.newInstance(if (position == 0) KittensFragment.LINK else KittensFragment.LINK2)
}
<file_sep>package ru.napoleonit.settings.ui.kittens_list.adapter
import android.content.Context
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.ImageView
import androidx.annotation.LayoutRes
import androidx.core.content.ContextCompat
import androidx.recyclerview.widget.RecyclerView
import com.bumptech.glide.Glide
import kotlinx.android.synthetic.main.fragment_kitten_details.view.*
import kotlinx.android.synthetic.main.item_kitten.view.*
import ru.napoleonit.settings.R
import ru.napoleonit.settings.ui.KittensFragment
class KittensAdapter(val kittenResourceClickListener: (View, String) -> Unit) :
RecyclerView.Adapter<KittensAdapter.KittenViewHolder>() {
override fun getItemCount() = 2
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): KittenViewHolder {
return KittenViewHolder(parent.context.inflateView(R.layout.item_kitten, parent))
}
override fun onBindViewHolder(holder: KittenViewHolder, position: Int) {
return holder.bind(position)
}
inner class KittenViewHolder(itemView: View) : RecyclerView.ViewHolder(itemView) {
fun bind(position: Int) {
val url = if (position == 0) KittensFragment.LINK else KittensFragment.LINK2
itemView.transitionName = url
Glide.with(itemView)
.load(url)
.into(itemView as ImageView)
itemView.setOnClickListener {
kittenResourceClickListener.invoke(itemView, url)
}
}
}
}
fun Context.inflateView(@LayoutRes layoutId: Int, container: ViewGroup): View {
return LayoutInflater.from(this).inflate(layoutId, container, false)
}<file_sep>package ru.napoleonit.multimoduletests
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.fragment.app.commit
import dagger.android.AndroidInjection
import kotlinx.android.synthetic.main.activity_main.*
import ru.napoleonit.common.di.InjectionLifecycleCallback
import ru.napoleonit.multimoduletests.featuresInjector.FeaturesProxy
import ru.napoleonit.settings.R
import ru.terrakok.cicerone.Cicerone
import ru.terrakok.cicerone.Router
import ru.terrakok.cicerone.android.support.SupportAppNavigator
import javax.inject.Inject
class RootActivity : AppCompatActivity() {
@Inject
lateinit var globalCicerone: Cicerone<Router>
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
AndroidInjection.inject(this)
setContentView(R.layout.activity_main)
bottomNavigationView.setOnNavigationItemSelectedListener {
when (it.itemId) {
R.id.settings -> {
globalCicerone.router.navigateTo(FeaturesProxy.startSettingsFeature())
}
}
true
}
supportFragmentManager.registerFragmentLifecycleCallbacks(
InjectionLifecycleCallback(), true
)
}
override fun onPause() {
super.onPause()
globalCicerone.navigatorHolder.removeNavigator()
}
override fun onResumeFragments() {
super.onResumeFragments()
globalCicerone.navigatorHolder.setNavigator(object :
SupportAppNavigator(this, supportFragmentManager, R.id.flFragmentsContainer) {})
}
}<file_sep>package ru.napoleonit.common.di
interface InjectionFragment<file_sep>package ru.napoleonit.di
import dagger.Module
import dagger.Provides
import ru.napoleonit.common.navigation.LocalCiceronesHolder
import ru.terrakok.cicerone.Cicerone
import ru.terrakok.cicerone.Router
import javax.inject.Named
import javax.inject.Singleton
@Module
class NavigationModule {
@Provides
@Singleton
fun cicerone(): Cicerone<Router> = Cicerone.create()
@Provides
@Singleton
@Named("global")
fun provideGlobalRouter(cicerone: Cicerone<Router>): Router = cicerone.router
}
<file_sep>package ru.napoleonit.settings.api
import ru.terrakok.cicerone.android.support.SupportAppScreen
interface ScreenFeatureApi {
fun getScreen(): SupportAppScreen
}
<file_sep>package ru.napoleonit.settings.api
//import ru.napoleonit.MainFragment
import ru.terrakok.cicerone.android.support.SupportAppScreen
import javax.inject.Inject
class SettingsScreenFeatureImpl @Inject constructor() : ScreenFeatureApi {
override fun getScreen(): SupportAppScreen = object : SupportAppScreen(){}
}<file_sep>package ru.napoleonit.common.presentation
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.SupervisorJob
import kotlinx.coroutines.cancel
import moxy.MvpPresenter
abstract class BasePresenter<V : BaseView> : MvpPresenter<V>() {
protected val job = SupervisorJob()
protected val presenterScope = job + Dispatchers.Main
override fun onDestroy() {
super.onDestroy()
presenterScope.cancel()
}
}<file_sep>package ru.napoleonit.settings.dependency
interface FeatureCloserDependency {
val featureCloser: FeatureCloser
}<file_sep>package ru.napoleonit.common.navigation
import ru.napoleonit.common.navigation.router.TransitionsRouter
import ru.terrakok.cicerone.Cicerone
import ru.terrakok.cicerone.Router
import javax.inject.Inject
import javax.inject.Singleton
@Singleton
class LocalCiceronesHolder @Inject constructor() {
private val cicerones = HashMap<String, Cicerone<TransitionsRouter>>()
@Synchronized
fun getOrCreate(key: String) = cicerones.getOrPut(key, { Cicerone.create(TransitionsRouter()) })
}<file_sep>package ru.napoleonit.network.di
import com.jakewharton.retrofit2.converter.kotlinx.serialization.asConverterFactory
import dagger.Module
import dagger.Provides
import kotlinx.serialization.json.Json
import okhttp3.MediaType
import okhttp3.OkHttpClient
import retrofit2.Retrofit
import ru.napoleonit.network.domain.entity.serverInfo.TestServerInfo
import ru.napoleonit.network.service.ApiService
import java.util.concurrent.TimeUnit
import javax.inject.Singleton
@Module
class NetworkModule {
@Provides
@Singleton
fun provideOkHttpClient(): OkHttpClient = OkHttpClient.Builder()
.connectTimeout(15, TimeUnit.SECONDS)
.build()
@Provides
@Singleton
fun apiService(
client: OkHttpClient
): ApiService = Retrofit.Builder()
.addConverterFactory(Json.asConverterFactory(MediaType.get("application/json")))
.baseUrl(TestServerInfo.baseUrl)
.client(client)
.build()
.create(ApiService::class.java)
}<file_sep>package ru.napoleonit.di
import dagger.Module
import dagger.android.ContributesAndroidInjector
import ru.napoleonit.common.di.ActivityScope
import ru.napoleonit.multimoduletests.RootActivity
import ru.napoleonit.settings.di.LocalNavigationModule
import ru.napoleonit.settings.di.MainFragmentModule
import ru.napoleonit.settings.ui.KittensContainerFragment
@Module
abstract class FeaturesUIBindingModule {
@ContributesAndroidInjector(modules = [MainFragmentModule::class, LocalNavigationModule::class])
@ActivityScope
abstract fun settingsInjector(): KittensContainerFragment
@ContributesAndroidInjector
@ActivityScope
abstract fun rootActivity(): RootActivity
}
<file_sep>package ru.napoleonit.common.domain.usecase
abstract class UseCase {
abstract suspend operator fun invoke()
}<file_sep>package ru.napoleonit.network.domain.entity.serverInfo
object TestServerInfo : ServerInfo{
override val baseUrl: String = "https://test.retail-kb.itnap.ru/api/"
}<file_sep>package ru.napoleonit.network.service
import retrofit2.http.GET
import ru.napoleonit.network.domain.entity.CatalogResponse
interface ApiService {
@GET("v1/catalog")
suspend fun getCatalog(): CatalogResponse
}<file_sep>package ru.napoleonit.network.domain.entity.serverInfo
interface ServerInfo {
val baseUrl: String
}<file_sep>package ru.napoleonit.settings.ui
import android.app.SharedElementCallback
import android.os.Bundle
import android.transition.AutoTransition
import android.transition.Fade
import android.transition.Transition
import android.transition.TransitionSet
import android.view.View
import androidx.core.view.doOnLayout
import androidx.core.view.doOnPreDraw
import androidx.fragment.app.FragmentTransaction
import androidx.fragment.app.commit
import dagger.android.support.AndroidSupportInjection
import kotlinx.android.synthetic.main.settings_fragment.*
import ru.napoleonit.common.navigation.router.TransitionsRouter
import ru.napoleonit.common.navigation.toScreen
import ru.napoleonit.common.ui.BaseFragment
import ru.napoleonit.common.ui.ExitTransitionHandler
import ru.napoleonit.settings.R
import ru.napoleonit.settings.dependency.SettingsDependencies
import ru.napoleonit.settings.ui.kittens_list.adapter.KittensAdapter
import ru.terrakok.cicerone.Router
import java.util.*
import javax.inject.Inject
class KittensFragment : BaseFragment(), ExitTransitionHandler {
@Inject
lateinit var dependencies: SettingsDependencies
@Inject
lateinit var localRouter: TransitionsRouter
lateinit var kittensAdapter: KittensAdapter
override val layoutId = R.layout.settings_fragment
companion object {
const val LINK2 = "https://нашсамогон.рф/pictures/product/big/6866_big.jpg"
const val LINK =
"https://agro63.ru/wa-data/public/shop/products/90/49/4990/images/6030/6030.750.jpg"
}
override fun onCreate(savedInstanceState: Bundle?) {
AndroidSupportInjection.inject(this)
super.onCreate(savedInstanceState)
}
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
setExitSharedElementCallback(object : androidx.core.app.SharedElementCallback() {
override fun onMapSharedElements(
names: MutableList<String>,
sharedElements: MutableMap<String, View>
) {
super.onMapSharedElements(names, sharedElements)
// sharedElements[names[0]] = rvKittens.findViewHolderForAdapterPosition(0)!!.itemView
}
})
kittensAdapter = KittensAdapter { imageView, resource ->
val fragment = KittenDetailContainerFragment.newInstance(resource).apply {
sharedElementEnterTransition = createTransition()
sharedElementReturnTransition = createTransition()
}
requireFragmentManager().beginTransaction()
.setReorderingAllowed(true)
.addSharedElement(imageView, resource)
.addToBackStack("test")
.replace(
R.id.llFragmentContainer, fragment, "test"
)
.commit()
// }
// localRouter.navigateWithTransitions(
// KittenDetailContainerFragment.newInstance(resource).apply {
// sharedElementEnterTransition = createTransition()
// sharedElementReturnTransition = createTransition()
// }.toScreen(),
// mapOf(imageView to resource)
// )
}
rvKittens.adapter = kittensAdapter
rvKittens.viewTreeObserver.addOnPreDrawListener {
startPostponedEnterTransition()
true
}
}
override fun prepareExitTransitions(transaction: FragmentTransaction) = Unit
private fun createTransition(): Transition {
return TransitionSet().apply {
ordering = TransitionSet.ORDERING_TOGETHER
duration = 1000
addTransition(AutoTransition())
// addTransition(ChangeBounds())
// addTransition(ChangeTransform())
// addTransition(ChangeClipBounds())
// addTransition(ChangeImageTransform())
}
}
}<file_sep>package ru.napoleonit.network.domain.entity
import kotlinx.serialization.SerialName
import kotlinx.serialization.Serializable
@Serializable
data class CatalogResponse(@SerialName("test") val stub: String) {
}<file_sep>package ru.napoleonit.settings.ui
import android.os.Bundle
import android.transition.*
import android.view.View
import androidx.core.os.bundleOf
import androidx.fragment.app.FragmentTransaction
import kotlinx.android.synthetic.main.fragment_kitten_details.*
import ru.napoleonit.common.ui.BaseFragment
import ru.napoleonit.common.ui.EnterTransitionHandler
import ru.napoleonit.settings.R
class KittenDetailContainerFragment : BaseFragment(), EnterTransitionHandler {
companion object {
const val ARGS = "args"
fun newInstance(url: String): KittenDetailContainerFragment {
return KittenDetailContainerFragment().apply {
arguments = bundleOf(ARGS to url)
}
}
}
private val args by lazy {
arguments!!.getString(ARGS)
}
lateinit var kittensDetailsAdapter: KittensDetailAdapter
override val layoutId = R.layout.fragment_kitten_details
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
postponeEnterTransition()
kittensDetailsAdapter = KittensDetailAdapter(this)
kittensPager.adapter = kittensDetailsAdapter
kittensPager.setCurrentItem(if (args == KittensFragment.LINK) 0 else 1, false)
}
override fun prepareEnterTransitions(fragmentTransaction: FragmentTransaction) {
}
private fun createTransition(): Transition {
return TransitionSet().apply {
ordering = TransitionSet.ORDERING_TOGETHER
duration = 1000
addTransition(AutoTransition())
// addTransition(ChangeBounds())
// addTransition(ChangeTransform())
// addTransition(ChangeClipBounds())
// addTransition(ChangeImageTransform())
}
}
}<file_sep>package ru.napoleonit.common.ui
import androidx.fragment.app.Fragment
import ru.terrakok.cicerone.android.support.SupportAppScreen
inline fun <reified TF : Fragment> Fragment.toFragment(): TF? {
val classLoader = activity?.classLoader ?: return null
return fragmentManager?.fragmentFactory
?.instantiate(classLoader, TF::class.java.canonicalName!!) as? TF
}
inline fun <reified TF : Fragment> Fragment.createAppScreen(): SupportAppScreen =
object : SupportAppScreen() {
override fun getFragment(): Fragment? = toFragment<TF>()
}
//inline fun <reified TF : Fragment> Activity.createAppScreen(): SupportAppScreen =
// object : SupportAppScreen() {
// override fun getFragment(): Fragment {
// return
// }
// }
| 52a44354eab3755ae74938d5e0b2bdceeb01239f | [
"Kotlin",
"Gradle"
]
| 48 | Kotlin | valya1/MultimoduleTest | d30ac957ab7e56a92ca377c6b8c8d1d487b56586 | 0f606660940059160dc5c7d3403a2d9495ad729b |
refs/heads/master | <file_sep>platform :ios, '8.0'
pod 'libPusher', '~> 1.5'
<file_sep>//
// Test2ViewController.swift
// PusherTestSwift
//
// Created by <NAME> on 01/03/2015.
// Copyright (c) 2015 <NAME>. All rights reserved.
//
import UIKit
class Test2ViewController: UIViewController, PTPusherDelegate { //PTPusherPresenceChannelDelegate {
var client:AnyObject? = nil
var channel: PTPusherPresenceChannel = PTPusherPresenceChannel()
var api: PTPusherAPI = PTPusherAPI()
override func viewDidLoad() {
super.viewDidLoad()
self.client = PTPusher.pusherWithKey("3bed0224e216f7d3c3a4", delegate: self)
self.api = PTPusherAPI(key: "3bed0224e216f7d3c3a4", appID: "109110", secretKey: "c1e850ac3781ed00bf63")
self.client?.connect()
self.channel = self.client!.subscribeToPresenceChannelNamed("50")
self.client?.setAuthorizationURLFromString("http://bechmann.net76.net/topik/auth.php?channel_name=\(self.channel.name)&socket_id=\(self.client!.socketID)")
self.client?.authorizeWithCompletionHandler { (b, obj, error) -> Void in
//
}
//self.channel.bindToEventNamed("testEvent", handleWithBlock: { (response) -> Void in
//})
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
func pusher(pusher: PTPusher!, didReceiveErrorEvent errorEvent: PTPusherErrorEvent!) {
//
}
}
<file_sep>//
// ViewController.swift
// PusherTestSwift
//
// Created by <NAME> on 28/02/2015.
// Copyright (c) 2015 <NAME>. All rights reserved.
//
import UIKit
class ViewController: UIViewController, PTPusherDelegate {
var client:AnyObject? = nil
var channel: PTPusherChannel = PTPusherChannel()
var api: PTPusherAPI = PTPusherAPI()
@IBOutlet weak var lbl: UILabel!
@IBOutlet weak var input: UITextField!
override func viewDidLoad() {
super.viewDidLoad()
self.navigationItem.rightBarButtonItem = UIBarButtonItem(barButtonSystemItem: UIBarButtonSystemItem.Compose, target: self, action: "send")
self.client = PTPusher.pusherWithKey("3bed0224e216f7d3c3a4", delegate: self)
self.client?.connect()
self.api = PTPusherAPI(key: "3bed0224e216f7d3c3a4", appID: "109110", secretKey: "c1e850ac3781ed00bf63")
self.channel = self.client!.subscribeToChannelNamed("50")
self.channel.bindToEventNamed("testEvent", handleWithBlock: { (response) -> Void in
println("response: \(response)")
self.lbl.text = response.data["message"] as? String
})
}
override func viewDidAppear(animated: Bool) {
var v = Test2ViewController()
//self.presentViewController(v, animated:false, completion:nil)
}
func send(){
self.api.triggerEvent("testEvent", onChannel: "50", data: ["message": self.input.text], socketID: nil)
self.input.text = ""
}
}
| 0545036ff15b778786146ed24e05f1c244fb86ce | [
"Swift",
"Ruby"
]
| 3 | Ruby | a1exb1/PusherTestSwift | 66a8c9029cca3845e8c503e3da134c115ae0dc01 | cf2dc60fcd48d20d3573867685939c839e2b6d37 |
refs/heads/master | <repo_name>explore-node-js/node.js-parameter-handler<file_sep>/src/processor.js
const fs = require('fs');
const deepmerge = require('deepmerge');
const File = require('./file');
const chalk = require('chalk');
const overwriteFieldValue = require('node-object-field-resolver');
module.exports = class Processor {
constructor(config, cwd) {
this.config = config;
this.cwd = cwd;
}
process() {
this.files = [];
console.log(chalk.yellow(`>>> PROCESSING FILES`));
this.config.forEach((config) => {
let file = this.processFile(config);
this.files.push(file);
console.log(chalk.yellow(`>>>>> ${file.getSourcePath()}`));
})
}
write() {
console.log(chalk.green(`>>> WRITING FILES`));
this.files.forEach(file => {
console.log(chalk.green(`>>>>> ${file.getOutputPath()}`));
fs.writeFile(
file.getOutputPath(),
JSON.stringify(file.getContent(), null, 2),
{ encoding: 'UTF-8'},
(e) => { e && console.log(chalk.red(` >>>>> error ${e}`)); }
);
});
}
  /**
   * @param {{source: string, output: string, envMap: {}, skipUndefined: boolean}} config
   *
   * @returns {File}
   */
  processFile({source, output, envMap, skipUndefined}) {
    const file = new File();
    const pathSource = this.resolvePath(source);
    const pathOutput = this.resolvePath(output);

    // pathSource is already absolute here, so it can be read directly
    const sourceContent = fs.readFileSync(pathSource);
    const sourceJson = JSON.parse(sourceContent);

    const solvedJson = Processor.resolveOverwritten(envMap, skipUndefined);
    const completedJson = deepmerge(sourceJson, solvedJson);
file.setSourcePath(pathSource)
.setOutputPath(pathOutput)
.setContent(completedJson);
return file;
}
/**
* @param {string} path
*/
resolvePath(path) {
if ('/' === path.charAt(0)) {
return path;
}
return `${this.cwd}/${path}`;
}
static resolveOverwritten(envMap, skipUndefined) {
const obj = {};
Object.keys(envMap).forEach((abstractPath) => {
const envVariable = envMap[abstractPath];
const value = Processor.getEnvironmentValue(envVariable);
(undefined !== value || !skipUndefined) && overwriteFieldValue(abstractPath, value, obj);
});
return obj;
}
static getEnvironmentValue(i) {
return process.env[i] || undefined;
}
};
<file_sep>/index.js
#!/usr/bin/env node
'use strict';
const fs = require('fs');
const chalk = require('chalk');
const Processor = require('./src/processor');
let packageJsonPath = `${process.cwd()}/package.json`,
packageJsonContent = fs.readFileSync(packageJsonPath),
/** @param {{extra: {node_parameter_handler: []}}} content */
packageJson = JSON.parse(packageJsonContent);
try {
if (undefined === packageJson.extra) {
throw `node 'extra' is not defined`;
}
if (undefined === packageJson.extra.node_parameter_handler) {
throw `node 'node_parameter_handler' in 'extra' is not defined`;
}
if (!Array.isArray(packageJson.extra.node_parameter_handler)) {
throw `node 'node_parameter_handler' in 'extra' is not array`;
}
const processor = new Processor(packageJson.extra.node_parameter_handler, process.cwd());
processor.process();
processor.write();
} catch (e) {
console.log(chalk.red(`
${e}
example of package.json:
{
...
"extra": {
"node_parameter_handler": [
...
{
"source": "src/parameters.json.dist",
"output": "src/parameters.json",
"envMap": {
"node_path": "ENV_VARIABLE"
}
}
]
}
}
`));
}
<file_sep>/tests/file.test.js
"use strict";
const File = require('../src/file');
describe('file', () => {
describe('::constructor', () => {
it('fields [content|sourcePath|outputPath] should be defined', () => {
const file = new File();
expect(file.content).toBeDefined();
expect(file.sourcePath).toBeDefined();
expect(file.outputPath).toBeDefined();
})
});
describe('::(get|set)Content', () => {
it('getter should return content, what been set though setter', () => {
const file = new File();
const content = 'test content';
file.setContent(content);
expect(file.getContent()).toMatch(content);
})
});
describe('::(get|set)SourcePath', () => {
it('getter should return content, what been set though setter', () => {
const file = new File();
const content = 'test content';
file.setSourcePath(content);
expect(file.getSourcePath()).toMatch(content);
})
});
describe('::(get|set)OutputPath', () => {
it('getter should return content, what been set though setter', () => {
const file = new File();
const content = 'test content';
file.setOutputPath(content);
expect(file.getOutputPath()).toMatch(content);
})
});
});
<file_sep>/src/file.js
module.exports = class File {
constructor() {
this.sourcePath = '';
this.outputPath = '';
this.content = '';
}
/**
* @returns {string}
*/
getSourcePath() {
return this.sourcePath;
}
/**
* @param {string} path
*
* @returns {File}
*/
setSourcePath(path) {
this.sourcePath = path;
return this;
}
/**
* @returns {string}
*/
getOutputPath() {
return this.outputPath;
}
/**
* @param {string} path
*
* @returns {File}
*/
setOutputPath(path) {
this.outputPath = path;
return this;
}
/**
* @returns {string}
*/
getContent() {
return this.content;
}
/**
* @param {string} content
*
* @returns {File}
*/
setContent(content) {
this.content = content;
return this;
}
};
<file_sep>/README.md
[ci.tests-master-badge]: https://circleci.com/gh/explore-node-js/node.js-parameter-handler/tree/master.svg?style=svg
[ci.tests-master]: https://circleci.com/gh/explore-node-js/node.js-parameter-handler/tree/master
[ci.coverage-master-badge]: https://codecov.io/gh/explore-node-js/node.js-parameter-handler/branch/master/graph/badge.svg
[ci.coverage-master]: https://codecov.io/gh/explore-node-js/node.js-parameter-handler
[npm.package-badge]: https://badge.fury.io/js/node-parameter-handler.svg
[npm.package]: https://www.npmjs.com/package/node-parameter-handler
# node.js parameter handler
can be used as a config builder, inspired by [@Incenteev/ParameterHandler](https://github.com/Incenteev/ParameterHandler)

useful for 'legacy' applications; if you're using [react-scripts@3](https://www.npmjs.com/package/react-scripts), consider using a `.env` file instead, as in [this example repository](https://github.com/eugene-matvejev/battleship-game-gui-react-js)
[![build][ci.tests-master-badge]][ci.tests-master]
[![coverage][ci.coverage-master-badge]][ci.coverage-master]
[![coverage][npm.package-badge]][npm.package]
### how to install
`$ npm i node-parameter-handler` or `$ yarn add node-parameter-handler`
### software requirements
* [node.js](https://nodejs.org/)
* [npm](https://www.npmjs.com/) or [yarn](https://yarnpkg.com/)
### used technologies
* [jest](https://facebook.github.io/jest/) - only for tests
### used services
* [circle ci](https://circleci.com/dashboard)
* [codecov](https://codecov.io/)
* [code climate](https://codeclimate.com/)
* [snyk](https://snyk.io/)
### how to execute
* `$ node_modules/.bin/node-parameter-handler`
### how to execute tests
* `$ npm test`
* to execute tests with coverage `npm test -- --coverage`
### how to use
include the configs in the root _package.json_ under the 'extra' node

an example _package.json_:
```
{
...
"extra": {
"node_parameter_handler": [
{
"source": "tests/fixtures/settings.json.dist", # source
"output": "var/settings1.json", # output
"envMap": {
"touched": "BASE_URL", # json path to ENV VARIABLE
"test.touched": "PWD",
"test.test.touched": "HOME"
}
},
{
"source": "tests/fixtures/settings.json.dist", # source
"output": "var/settings2.json", # output
"skipUndefined": true, # if the env variable is not defined, leave the field unchanged
"envMap": {
"touched": "BASE_URL", # json path to ENV VARIABLE
"test.touched": "PWD",
"test.test.touched": "HOME"
}
}
]
}
}
```
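The dotted keys in `envMap` are expanded into a nested object before being deep-merged over the source file. A minimal sketch of that path expansion (`setByPath` is a hypothetical stand-in for the `node-object-field-resolver` call used in `processor.js`, and the values are example env-variable contents):

```javascript
// Hypothetical helper illustrating how an "a.b.c" envMap key becomes
// a nested object; the real work is done by node-object-field-resolver.
function setByPath(obj, path, value) {
  const keys = path.split('.');
  const last = keys.pop();
  let node = obj;
  for (const key of keys) {
    node = node[key] = node[key] || {};
  }
  node[last] = value;
  return obj;
}

const overrides = {};
setByPath(overrides, 'touched', 'https://example.test');  // e.g. BASE_URL
setByPath(overrides, 'test.test.touched', '/home/user');  // e.g. HOME

console.log(JSON.stringify(overrides));
// → {"touched":"https://example.test","test":{"test":{"touched":"/home/user"}}}
```

this override object is then deep-merged with the parsed source file and written to the configured `output` path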
| b059c27f1a66de3d04d40628c36d1d731f127ca3 | [
"JavaScript",
"Markdown"
]
| 5 | JavaScript | explore-node-js/node.js-parameter-handler | 88da17fd114cb86d8d6d285cc66e721ca1de341a | 1c7e6b4264c01a2da19ec29348942cd6c60374b9 |
refs/heads/master | <repo_name>aswinleojohn/PythonOCRPdf<file_sep>/ocr.py
import cv2
import pytesseract
from pdf2image import convert_from_path, convert_from_bytes
from PyPDF2 import PdfFileWriter, PdfFileReader
import uuid
#image = cv2.imread('DoraBootsForever.png')
text = None
images = convert_from_path('data.pdf')
for page in images:
    page.save('newout.jpg', 'JPEG')
image = cv2.imread('newout.jpg')
crop_img = image[990:1030, 1150:1600]
cv2.imwrite("cropped.jpg", crop_img)
#cv2.imshow("cropped", crop_img)
text = pytesseract.image_to_string('cropped.jpg')
ans = ''.join([n for n in text if n.isdigit()])
print('Employer Identification Number : '+str(ans))
#print(text)
crop_img = image[270:310, 150:1600]
cv2.imwrite("cropped1.jpg", crop_img)
#cv2.imshow("cropped", crop_img)
text = pytesseract.image_to_string('cropped1.jpg')
# ans = ''.join([n for n in text if n.isdigit()])
# print('Employer Identification Number :'+str(ans))
print('Name : '+text)<file_sep>/main.py
#pypdf does what I expect in this area. Using the following script:
#!/usr/bin/python
#
from PyPDF2 import PdfFileWriter, PdfFileReader

with open("/Users/leojohnashwin/pythonpdf/data.pdf", "rb") as in_f:
    input1 = PdfFileReader(in_f)
    output = PdfFileWriter()

    numPages = input1.getNumPages()
    #print "document has %s pages." % numPages

    for i in range(numPages):
        page = input1.getPage(i)
        #print page.mediaBox.getUpperRight_x(), page.mediaBox.getUpperRight_y()
        page.trimBox.lowerLeft = (413, 413)
        page.trimBox.upperRight = (555, 500)
        print(page.cropBox.getUpperLeft())
        print(page.cropBox.getUpperRight())
        print(page.cropBox.getLowerLeft())
        print(page.cropBox.getLowerRight())
        #page.cropBox.upperLeft = (400, 200)
        page.cropBox.upperRight = (555, 500)
        page.cropBox.lowerLeft = (413, 413)
        #page.cropBox.lowerRight = (555, 400)
        print(type(page))
        output.addPage(page)

    with open("out.pdf", "wb") as out_f:
        output.write(out_f)
| 4205bf3e1943992d5ce67221c4c647d63f10a5ba | [
"Python"
]
| 2 | Python | aswinleojohn/PythonOCRPdf | 8167fd369675b04411c962d1a6ec84863820d695 | 71b004ce5e1310ac91153625dc5c3c8efc8698e2 |
refs/heads/master | <repo_name>Lakshamana/spm<file_sep>/tmp/service/mapper/ToolDefinitionMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ToolDefinitionDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ToolDefinition} and its DTO {@link ToolDefinitionDTO}.
*/
@Mapper(componentModel = "spring", uses = {ToolTypeMapper.class, ArtifactTypeMapper.class})
public interface ToolDefinitionMapper extends EntityMapper<ToolDefinitionDTO, ToolDefinition> {
@Mapping(source = "theToolType.id", target = "theToolTypeId")
ToolDefinitionDTO toDto(ToolDefinition toolDefinition);
@Mapping(source = "theToolTypeId", target = "theToolType")
@Mapping(target = "removeTheArtifactTypes", ignore = true)
ToolDefinition toEntity(ToolDefinitionDTO toolDefinitionDTO);
default ToolDefinition fromId(Long id) {
if (id == null) {
return null;
}
ToolDefinition toolDefinition = new ToolDefinition();
toolDefinition.setId(id);
return toolDefinition;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/tools/ToolDefinitionDAO.java
package br.ufpa.labes.spm.repository.impl.tools;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.tools.IToolDefinitionDAO;
import br.ufpa.labes.spm.domain.ToolDefinition;
public class ToolDefinitionDAO extends BaseDAO<ToolDefinition, String>
implements IToolDefinitionDAO {
protected ToolDefinitionDAO(Class<ToolDefinition> businessClass) {
super(businessClass);
}
public ToolDefinitionDAO() {
super(ToolDefinition.class);
}
}
<file_sep>/tmp/service/mapper/SpmLogMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.SpmLogDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link SpmLog} and its DTO {@link SpmLogDTO}.
*/
@Mapper(componentModel = "spring", uses = {ProcessMapper.class})
public interface SpmLogMapper extends EntityMapper<SpmLogDTO, SpmLog> {
@Mapping(source = "theProcess.id", target = "theProcessId")
SpmLogDTO toDto(SpmLog spmLog);
@Mapping(source = "theProcessId", target = "theProcess")
@Mapping(target = "theEvents", ignore = true)
@Mapping(target = "removeTheEvent", ignore = true)
SpmLog toEntity(SpmLogDTO spmLogDTO);
default SpmLog fromId(Long id) {
if (id == null) {
return null;
}
SpmLog spmLog = new SpmLog();
spmLog.setId(id);
return spmLog;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ShareableMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ShareableDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Shareable} and its DTO {@link ShareableDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface ShareableMapper extends EntityMapper<ShareableDTO, Shareable> {
default Shareable fromId(Long id) {
if (id == null) {
return null;
}
Shareable shareable = new Shareable();
shareable.setId(id);
return shareable;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/impl/TemplateServiceImpl.java
package br.ufpa.labes.spm.service.impl;
import br.ufpa.labes.spm.service.TemplateService;
import br.ufpa.labes.spm.domain.Template;
import br.ufpa.labes.spm.repository.TemplateRepository;
import br.ufpa.labes.spm.service.dto.TemplateDTO;
import br.ufpa.labes.spm.service.mapper.TemplateMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link Template}.
*/
@Service
@Transactional
public class TemplateServiceImpl implements TemplateService {
private final Logger log = LoggerFactory.getLogger(TemplateServiceImpl.class);
private final TemplateRepository templateRepository;
private final TemplateMapper templateMapper;
public TemplateServiceImpl(TemplateRepository templateRepository, TemplateMapper templateMapper) {
this.templateRepository = templateRepository;
this.templateMapper = templateMapper;
}
/**
* Save a template.
*
* @param templateDTO the entity to save.
* @return the persisted entity.
*/
@Override
public TemplateDTO save(TemplateDTO templateDTO) {
log.debug("Request to save Template : {}", templateDTO);
Template template = templateMapper.toEntity(templateDTO);
template = templateRepository.save(template);
return templateMapper.toDto(template);
}
/**
* Get all the templates.
*
* @return the list of entities.
*/
@Override
@Transactional(readOnly = true)
public List<TemplateDTO> findAll() {
log.debug("Request to get all Templates");
return templateRepository.findAll().stream()
.map(templateMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one template by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Override
@Transactional(readOnly = true)
public Optional<TemplateDTO> findOne(Long id) {
log.debug("Request to get Template : {}", id);
return templateRepository.findById(id)
.map(templateMapper::toDto);
}
/**
* Delete the template by id.
*
* @param id the id of the entity.
*/
@Override
public void delete(Long id) {
log.debug("Request to delete Template : {}", id);
templateRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/BranchConCondToMultipleConResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.BranchConCondToMultipleConService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.BranchConCondToMultipleConDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.BranchConCondToMultipleCon}.
*/
@RestController
@RequestMapping("/api")
public class BranchConCondToMultipleConResource {
private final Logger log = LoggerFactory.getLogger(BranchConCondToMultipleConResource.class);
private static final String ENTITY_NAME = "branchConCondToMultipleCon";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final BranchConCondToMultipleConService branchConCondToMultipleConService;
public BranchConCondToMultipleConResource(BranchConCondToMultipleConService branchConCondToMultipleConService) {
this.branchConCondToMultipleConService = branchConCondToMultipleConService;
}
/**
* {@code POST /branch-con-cond-to-multiple-cons} : Create a new branchConCondToMultipleCon.
*
* @param branchConCondToMultipleConDTO the branchConCondToMultipleConDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new branchConCondToMultipleConDTO, or with status {@code 400 (Bad Request)} if the branchConCondToMultipleCon has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/branch-con-cond-to-multiple-cons")
public ResponseEntity<BranchConCondToMultipleConDTO> createBranchConCondToMultipleCon(@RequestBody BranchConCondToMultipleConDTO branchConCondToMultipleConDTO) throws URISyntaxException {
log.debug("REST request to save BranchConCondToMultipleCon : {}", branchConCondToMultipleConDTO);
if (branchConCondToMultipleConDTO.getId() != null) {
throw new BadRequestAlertException("A new branchConCondToMultipleCon cannot already have an ID", ENTITY_NAME, "idexists");
}
BranchConCondToMultipleConDTO result = branchConCondToMultipleConService.save(branchConCondToMultipleConDTO);
return ResponseEntity.created(new URI("/api/branch-con-cond-to-multiple-cons/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /branch-con-cond-to-multiple-cons} : Updates an existing branchConCondToMultipleCon.
*
* @param branchConCondToMultipleConDTO the branchConCondToMultipleConDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated branchConCondToMultipleConDTO,
* or with status {@code 400 (Bad Request)} if the branchConCondToMultipleConDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the branchConCondToMultipleConDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/branch-con-cond-to-multiple-cons")
public ResponseEntity<BranchConCondToMultipleConDTO> updateBranchConCondToMultipleCon(@RequestBody BranchConCondToMultipleConDTO branchConCondToMultipleConDTO) throws URISyntaxException {
log.debug("REST request to update BranchConCondToMultipleCon : {}", branchConCondToMultipleConDTO);
if (branchConCondToMultipleConDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
BranchConCondToMultipleConDTO result = branchConCondToMultipleConService.save(branchConCondToMultipleConDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, branchConCondToMultipleConDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /branch-con-cond-to-multiple-cons} : get all the branchConCondToMultipleCons.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of branchConCondToMultipleCons in body.
*/
@GetMapping("/branch-con-cond-to-multiple-cons")
public List<BranchConCondToMultipleConDTO> getAllBranchConCondToMultipleCons() {
log.debug("REST request to get all BranchConCondToMultipleCons");
return branchConCondToMultipleConService.findAll();
}
/**
* {@code GET /branch-con-cond-to-multiple-cons/:id} : get the "id" branchConCondToMultipleCon.
*
* @param id the id of the branchConCondToMultipleConDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the branchConCondToMultipleConDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/branch-con-cond-to-multiple-cons/{id}")
public ResponseEntity<BranchConCondToMultipleConDTO> getBranchConCondToMultipleCon(@PathVariable Long id) {
log.debug("REST request to get BranchConCondToMultipleCon : {}", id);
Optional<BranchConCondToMultipleConDTO> branchConCondToMultipleConDTO = branchConCondToMultipleConService.findOne(id);
return ResponseUtil.wrapOrNotFound(branchConCondToMultipleConDTO);
}
/**
* {@code DELETE /branch-con-cond-to-multiple-cons/:id} : delete the "id" branchConCondToMultipleCon.
*
* @param id the id of the branchConCondToMultipleConDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/branch-con-cond-to-multiple-cons/{id}")
public ResponseEntity<Void> deleteBranchConCondToMultipleCon(@PathVariable Long id) {
log.debug("REST request to delete BranchConCondToMultipleCon : {}", id);
branchConCondToMultipleConService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/OrganizationEstimationResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.OrganizationEstimationService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.OrganizationEstimationDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.OrganizationEstimation}.
*/
@RestController
@RequestMapping("/api")
public class OrganizationEstimationResource {
private final Logger log = LoggerFactory.getLogger(OrganizationEstimationResource.class);
private static final String ENTITY_NAME = "organizationEstimation";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final OrganizationEstimationService organizationEstimationService;
public OrganizationEstimationResource(OrganizationEstimationService organizationEstimationService) {
this.organizationEstimationService = organizationEstimationService;
}
/**
* {@code POST /organization-estimations} : Create a new organizationEstimation.
*
* @param organizationEstimationDTO the organizationEstimationDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new organizationEstimationDTO, or with status {@code 400 (Bad Request)} if the organizationEstimation has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/organization-estimations")
public ResponseEntity<OrganizationEstimationDTO> createOrganizationEstimation(@RequestBody OrganizationEstimationDTO organizationEstimationDTO) throws URISyntaxException {
log.debug("REST request to save OrganizationEstimation : {}", organizationEstimationDTO);
if (organizationEstimationDTO.getId() != null) {
throw new BadRequestAlertException("A new organizationEstimation cannot already have an ID", ENTITY_NAME, "idexists");
}
OrganizationEstimationDTO result = organizationEstimationService.save(organizationEstimationDTO);
return ResponseEntity.created(new URI("/api/organization-estimations/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /organization-estimations} : Updates an existing organizationEstimation.
*
* @param organizationEstimationDTO the organizationEstimationDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated organizationEstimationDTO,
* or with status {@code 400 (Bad Request)} if the organizationEstimationDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the organizationEstimationDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/organization-estimations")
public ResponseEntity<OrganizationEstimationDTO> updateOrganizationEstimation(@RequestBody OrganizationEstimationDTO organizationEstimationDTO) throws URISyntaxException {
log.debug("REST request to update OrganizationEstimation : {}", organizationEstimationDTO);
if (organizationEstimationDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
OrganizationEstimationDTO result = organizationEstimationService.save(organizationEstimationDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, organizationEstimationDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /organization-estimations} : get all the organizationEstimations.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of organizationEstimations in body.
*/
@GetMapping("/organization-estimations")
public List<OrganizationEstimationDTO> getAllOrganizationEstimations() {
log.debug("REST request to get all OrganizationEstimations");
return organizationEstimationService.findAll();
}
/**
* {@code GET /organization-estimations/:id} : get the "id" organizationEstimation.
*
* @param id the id of the organizationEstimationDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the organizationEstimationDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/organization-estimations/{id}")
public ResponseEntity<OrganizationEstimationDTO> getOrganizationEstimation(@PathVariable Long id) {
log.debug("REST request to get OrganizationEstimation : {}", id);
Optional<OrganizationEstimationDTO> organizationEstimationDTO = organizationEstimationService.findOne(id);
return ResponseUtil.wrapOrNotFound(organizationEstimationDTO);
}
/**
* {@code DELETE /organization-estimations/:id} : delete the "id" organizationEstimation.
*
* @param id the id of the organizationEstimationDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/organization-estimations/{id}")
public ResponseEntity<Void> deleteOrganizationEstimation(@PathVariable Long id) {
log.debug("REST request to delete OrganizationEstimation : {}", id);
organizationEstimationService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/EventTypeResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.EventTypeService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.EventTypeDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.EventType}.
*/
@RestController
@RequestMapping("/api")
public class EventTypeResource {
private final Logger log = LoggerFactory.getLogger(EventTypeResource.class);
private static final String ENTITY_NAME = "eventType";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final EventTypeService eventTypeService;
public EventTypeResource(EventTypeService eventTypeService) {
this.eventTypeService = eventTypeService;
}
/**
* {@code POST /event-types} : Create a new eventType.
*
* @param eventTypeDTO the eventTypeDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new eventTypeDTO, or with status {@code 400 (Bad Request)} if the eventType has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/event-types")
public ResponseEntity<EventTypeDTO> createEventType(@RequestBody EventTypeDTO eventTypeDTO) throws URISyntaxException {
log.debug("REST request to save EventType : {}", eventTypeDTO);
if (eventTypeDTO.getId() != null) {
throw new BadRequestAlertException("A new eventType cannot already have an ID", ENTITY_NAME, "idexists");
}
EventTypeDTO result = eventTypeService.save(eventTypeDTO);
return ResponseEntity.created(new URI("/api/event-types/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /event-types} : Updates an existing eventType.
*
* @param eventTypeDTO the eventTypeDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated eventTypeDTO,
* or with status {@code 400 (Bad Request)} if the eventTypeDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the eventTypeDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/event-types")
public ResponseEntity<EventTypeDTO> updateEventType(@RequestBody EventTypeDTO eventTypeDTO) throws URISyntaxException {
log.debug("REST request to update EventType : {}", eventTypeDTO);
if (eventTypeDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
EventTypeDTO result = eventTypeService.save(eventTypeDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, eventTypeDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /event-types} : get all the eventTypes.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of eventTypes in body.
*/
@GetMapping("/event-types")
public List<EventTypeDTO> getAllEventTypes() {
log.debug("REST request to get all EventTypes");
return eventTypeService.findAll();
}
/**
* {@code GET /event-types/:id} : get the "id" eventType.
*
* @param id the id of the eventTypeDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the eventTypeDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/event-types/{id}")
public ResponseEntity<EventTypeDTO> getEventType(@PathVariable Long id) {
log.debug("REST request to get EventType : {}", id);
Optional<EventTypeDTO> eventTypeDTO = eventTypeService.findOne(id);
return ResponseUtil.wrapOrNotFound(eventTypeDTO);
}
/**
* {@code DELETE /event-types/:id} : delete the "id" eventType.
*
* @param id the id of the eventTypeDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/event-types/{id}")
public ResponseEntity<Void> deleteEventType(@PathVariable Long id) {
log.debug("REST request to delete EventType : {}", id);
eventTypeService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processKnowledge/ResourceEstimationDAO.java
package br.ufpa.labes.spm.repository.impl.processKnowledge;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processKnowledge.IResourceEstimationDAO;
import br.ufpa.labes.spm.domain.ResourceEstimation;
public class ResourceEstimationDAO extends BaseDAO<ResourceEstimation, Integer>
implements IResourceEstimationDAO {
protected ResourceEstimationDAO(Class<ResourceEstimation> businessClass) {
super(businessClass);
}
public ResourceEstimationDAO() {
super(ResourceEstimation.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/IDAOFactory.java
package br.ufpa.labes.spm.repository.interfaces;
import java.util.Properties;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import br.ufpa.labes.spm.repository.interfaces.activities.IActivityDAO;
import br.ufpa.labes.spm.repository.interfaces.activities.IDecomposedDAO;
public class IDAOFactory {
private IActivityDAO iActivityDAO;
@SuppressWarnings("unused")
private IDecomposedDAO iDecomposedDAO;
public IDAOFactory() {
try {
Properties properties = new Properties();
InitialContext context = new InitialContext(properties);
Object obj = context.lookup("ActivityDAOLocal");
iActivityDAO = (IActivityDAO) obj;
} catch (NamingException e) {
e.printStackTrace();
}
}
public Object getIDAO(String daoName) {
// TODO: resolve the DAO matching daoName; currently always returns the Activity DAO
return iActivityDAO;
}
}
<file_sep>/tmp/service/mapper/WorkGroupTypeMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.WorkGroupTypeDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link WorkGroupType} and its DTO {@link WorkGroupTypeDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface WorkGroupTypeMapper extends EntityMapper<WorkGroupTypeDTO, WorkGroupType> {
@Mapping(target = "theWorkGroups", ignore = true)
@Mapping(target = "removeTheWorkGroup", ignore = true)
@Mapping(target = "theReqGroups", ignore = true)
@Mapping(target = "removeTheReqGroup", ignore = true)
@Mapping(target = "theWorkGroupInstSugs", ignore = true)
@Mapping(target = "removeTheWorkGroupInstSug", ignore = true)
WorkGroupType toEntity(WorkGroupTypeDTO workGroupTypeDTO);
default WorkGroupType fromId(Long id) {
if (id == null) {
return null;
}
WorkGroupType workGroupType = new WorkGroupType();
workGroupType.setId(id);
return workGroupType;
}
}
<file_sep>/tmp/service/mapper/EstimationMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.EstimationDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Estimation} and its DTO {@link EstimationDTO}.
*/
@Mapper(componentModel = "spring", uses = {MetricDefinitionMapper.class})
public interface EstimationMapper extends EntityMapper<EstimationDTO, Estimation> {
@Mapping(source = "metricDefinition.id", target = "metricDefinitionId")
EstimationDTO toDto(Estimation estimation);
@Mapping(source = "metricDefinitionId", target = "metricDefinition")
Estimation toEntity(EstimationDTO estimationDTO);
default Estimation fromId(Long id) {
if (id == null) {
return null;
}
Estimation estimation = new Estimation();
estimation.setId(id);
return estimation;
}
}
<file_sep>/tmp/service/LogEntryService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.LogEntry;
import br.ufpa.labes.spm.repository.LogEntryRepository;
import br.ufpa.labes.spm.service.dto.LogEntryDTO;
import br.ufpa.labes.spm.service.mapper.LogEntryMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link LogEntry}.
*/
@Service
@Transactional
public class LogEntryService {
private final Logger log = LoggerFactory.getLogger(LogEntryService.class);
private final LogEntryRepository logEntryRepository;
private final LogEntryMapper logEntryMapper;
public LogEntryService(LogEntryRepository logEntryRepository, LogEntryMapper logEntryMapper) {
this.logEntryRepository = logEntryRepository;
this.logEntryMapper = logEntryMapper;
}
/**
* Save a logEntry.
*
* @param logEntryDTO the entity to save.
* @return the persisted entity.
*/
public LogEntryDTO save(LogEntryDTO logEntryDTO) {
log.debug("Request to save LogEntry : {}", logEntryDTO);
LogEntry logEntry = logEntryMapper.toEntity(logEntryDTO);
logEntry = logEntryRepository.save(logEntry);
return logEntryMapper.toDto(logEntry);
}
/**
* Get all the logEntries.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<LogEntryDTO> findAll() {
log.debug("Request to get all LogEntries");
return logEntryRepository.findAll().stream()
.map(logEntryMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one logEntry by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<LogEntryDTO> findOne(Long id) {
log.debug("Request to get LogEntry : {}", id);
return logEntryRepository.findById(id)
.map(logEntryMapper::toDto);
}
/**
* Delete the logEntry by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete LogEntry : {}", id);
logEntryRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/tools/ClassMethodCallDAO.java
package br.ufpa.labes.spm.repository.impl.tools;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.tools.IClassMethodCallDAO;
import br.ufpa.labes.spm.domain.ClassMethodCall;
public class ClassMethodCallDAO extends BaseDAO<ClassMethodCall, String>
implements IClassMethodCallDAO {
protected ClassMethodCallDAO(Class<ClassMethodCall> businessClass) {
super(businessClass);
}
public ClassMethodCallDAO() {
super(ClassMethodCall.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/DependencyResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.DependencyService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.DependencyDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.Dependency}.
*/
@RestController
@RequestMapping("/api")
public class DependencyResource {
private final Logger log = LoggerFactory.getLogger(DependencyResource.class);
private static final String ENTITY_NAME = "dependency";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final DependencyService dependencyService;
public DependencyResource(DependencyService dependencyService) {
this.dependencyService = dependencyService;
}
/**
* {@code POST /dependencies} : Create a new dependency.
*
* @param dependencyDTO the dependencyDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new dependencyDTO, or with status {@code 400 (Bad Request)} if the dependency has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/dependencies")
public ResponseEntity<DependencyDTO> createDependency(@RequestBody DependencyDTO dependencyDTO) throws URISyntaxException {
log.debug("REST request to save Dependency : {}", dependencyDTO);
if (dependencyDTO.getId() != null) {
throw new BadRequestAlertException("A new dependency cannot already have an ID", ENTITY_NAME, "idexists");
}
DependencyDTO result = dependencyService.save(dependencyDTO);
return ResponseEntity.created(new URI("/api/dependencies/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /dependencies} : Updates an existing dependency.
*
* @param dependencyDTO the dependencyDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated dependencyDTO,
* or with status {@code 400 (Bad Request)} if the dependencyDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the dependencyDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/dependencies")
public ResponseEntity<DependencyDTO> updateDependency(@RequestBody DependencyDTO dependencyDTO) throws URISyntaxException {
log.debug("REST request to update Dependency : {}", dependencyDTO);
if (dependencyDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
DependencyDTO result = dependencyService.save(dependencyDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, dependencyDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /dependencies} : get all the dependencies.
*
* @param filter the filter of the request.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of dependencies in body.
*/
@GetMapping("/dependencies")
public List<DependencyDTO> getAllDependencies(@RequestParam(required = false) String filter) {
if ("themultiplecon-is-null".equals(filter)) {
log.debug("REST request to get all Dependencys where theMultipleCon is null");
return dependencyService.findAllWhereTheMultipleConIsNull();
}
log.debug("REST request to get all Dependencies");
return dependencyService.findAll();
}
/**
* {@code GET /dependencies/:id} : get the "id" dependency.
*
* @param id the id of the dependencyDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the dependencyDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/dependencies/{id}")
public ResponseEntity<DependencyDTO> getDependency(@PathVariable Long id) {
log.debug("REST request to get Dependency : {}", id);
Optional<DependencyDTO> dependencyDTO = dependencyService.findOne(id);
return ResponseUtil.wrapOrNotFound(dependencyDTO);
}
/**
* {@code DELETE /dependencies/:id} : delete the "id" dependency.
*
* @param id the id of the dependencyDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/dependencies/{id}")
public ResponseEntity<Void> deleteDependency(@PathVariable Long id) {
log.debug("REST request to delete Dependency : {}", id);
dependencyService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/tmp/service/mapper/WebAPSEEObjectMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.WebAPSEEObjectDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link WebAPSEEObject} and its DTO {@link WebAPSEEObjectDTO}.
*/
@Mapper(componentModel = "spring", uses = {GraphicCoordinateMapper.class})
public interface WebAPSEEObjectMapper extends EntityMapper<WebAPSEEObjectDTO, WebAPSEEObject> {
@Mapping(source = "theGraphicCoordinate.id", target = "theGraphicCoordinateId")
WebAPSEEObjectDTO toDto(WebAPSEEObject webAPSEEObject);
@Mapping(source = "theGraphicCoordinateId", target = "theGraphicCoordinate")
WebAPSEEObject toEntity(WebAPSEEObjectDTO webAPSEEObjectDTO);
default WebAPSEEObject fromId(Long id) {
if (id == null) {
return null;
}
WebAPSEEObject webAPSEEObject = new WebAPSEEObject();
webAPSEEObject.setId(id);
return webAPSEEObject;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/resources/ResourceDAO.java
package br.ufpa.labes.spm.repository.impl.resources;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.resources.IResourceDAO;
import br.ufpa.labes.spm.domain.Resource;
public class ResourceDAO extends BaseDAO<Resource, String> implements IResourceDAO {
protected ResourceDAO(Class<Resource> businessClass) {
super(businessClass);
}
public ResourceDAO() {
super(Resource.class);
}
}
<file_sep>/tmp/service/mapper/MetricMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.MetricDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Metric} and its DTO {@link MetricDTO}.
*/
@Mapper(componentModel = "spring", uses = {MetricDefinitionMapper.class})
public interface MetricMapper extends EntityMapper<MetricDTO, Metric> {
@Mapping(source = "metricDefinition.id", target = "metricDefinitionId")
MetricDTO toDto(Metric metric);
@Mapping(source = "metricDefinitionId", target = "metricDefinition")
Metric toEntity(MetricDTO metricDTO);
default Metric fromId(Long id) {
if (id == null) {
return null;
}
Metric metric = new Metric();
metric.setId(id);
return metric;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/WorkGroupInstSugMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.WorkGroupInstSugDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link WorkGroupInstSug} and its DTO {@link WorkGroupInstSugDTO}.
*/
@Mapper(componentModel = "spring", uses = {WorkGroupMapper.class, WorkGroupTypeMapper.class})
public interface WorkGroupInstSugMapper extends EntityMapper<WorkGroupInstSugDTO, WorkGroupInstSug> {
@Mapping(source = "groupChosen.id", target = "groupChosenId")
@Mapping(source = "groupTypeRequired.id", target = "groupTypeRequiredId")
WorkGroupInstSugDTO toDto(WorkGroupInstSug workGroupInstSug);
@Mapping(source = "groupChosenId", target = "groupChosen")
@Mapping(source = "groupTypeRequiredId", target = "groupTypeRequired")
@Mapping(target = "removeGroupSuggested", ignore = true)
WorkGroupInstSug toEntity(WorkGroupInstSugDTO workGroupInstSugDTO);
default WorkGroupInstSug fromId(Long id) {
if (id == null) {
return null;
}
WorkGroupInstSug workGroupInstSug = new WorkGroupInstSug();
workGroupInstSug.setId(id);
return workGroupInstSug;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ActivityEstimationService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ActivityEstimation;
import br.ufpa.labes.spm.repository.ActivityEstimationRepository;
import br.ufpa.labes.spm.service.dto.ActivityEstimationDTO;
import br.ufpa.labes.spm.service.mapper.ActivityEstimationMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ActivityEstimation}.
*/
@Service
@Transactional
public class ActivityEstimationService {
private final Logger log = LoggerFactory.getLogger(ActivityEstimationService.class);
private final ActivityEstimationRepository activityEstimationRepository;
private final ActivityEstimationMapper activityEstimationMapper;
public ActivityEstimationService(ActivityEstimationRepository activityEstimationRepository, ActivityEstimationMapper activityEstimationMapper) {
this.activityEstimationRepository = activityEstimationRepository;
this.activityEstimationMapper = activityEstimationMapper;
}
/**
* Save a activityEstimation.
*
* @param activityEstimationDTO the entity to save.
* @return the persisted entity.
*/
public ActivityEstimationDTO save(ActivityEstimationDTO activityEstimationDTO) {
log.debug("Request to save ActivityEstimation : {}", activityEstimationDTO);
ActivityEstimation activityEstimation = activityEstimationMapper.toEntity(activityEstimationDTO);
activityEstimation = activityEstimationRepository.save(activityEstimation);
return activityEstimationMapper.toDto(activityEstimation);
}
/**
* Get all the activityEstimations.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ActivityEstimationDTO> findAll() {
log.debug("Request to get all ActivityEstimations");
return activityEstimationRepository.findAll().stream()
.map(activityEstimationMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one activityEstimation by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ActivityEstimationDTO> findOne(Long id) {
log.debug("Request to get ActivityEstimation : {}", id);
return activityEstimationRepository.findById(id)
.map(activityEstimationMapper::toDto);
}
/**
* Delete the activityEstimation by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ActivityEstimation : {}", id);
activityEstimationRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/processModels/IProcessModelDAO.java
package br.ufpa.labes.spm.repository.interfaces.processModels;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ProcessModel;
public interface IProcessModelDAO extends IBaseDAO<ProcessModel, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processKnowledge/AgentEstimationDAO.java
package br.ufpa.labes.spm.repository.impl.processKnowledge;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processKnowledge.IAgentEstimationDAO;
import br.ufpa.labes.spm.domain.AgentEstimation;
public class AgentEstimationDAO extends BaseDAO<AgentEstimation, Integer>
implements IAgentEstimationDAO {
protected AgentEstimationDAO(Class<AgentEstimation> businessClass) {
super(businessClass);
}
public AgentEstimationDAO() {
super(AgentEstimation.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/MultipleConResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.MultipleConService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.MultipleConDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.MultipleCon}.
*/
@RestController
@RequestMapping("/api")
public class MultipleConResource {
private final Logger log = LoggerFactory.getLogger(MultipleConResource.class);
private static final String ENTITY_NAME = "multipleCon";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final MultipleConService multipleConService;
public MultipleConResource(MultipleConService multipleConService) {
this.multipleConService = multipleConService;
}
/**
* {@code POST /multiple-cons} : Create a new multipleCon.
*
* @param multipleConDTO the multipleConDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new multipleConDTO, or with status {@code 400 (Bad Request)} if the multipleCon has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/multiple-cons")
public ResponseEntity<MultipleConDTO> createMultipleCon(@RequestBody MultipleConDTO multipleConDTO) throws URISyntaxException {
log.debug("REST request to save MultipleCon : {}", multipleConDTO);
if (multipleConDTO.getId() != null) {
throw new BadRequestAlertException("A new multipleCon cannot already have an ID", ENTITY_NAME, "idexists");
}
MultipleConDTO result = multipleConService.save(multipleConDTO);
return ResponseEntity.created(new URI("/api/multiple-cons/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /multiple-cons} : Updates an existing multipleCon.
*
* @param multipleConDTO the multipleConDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated multipleConDTO,
* or with status {@code 400 (Bad Request)} if the multipleConDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the multipleConDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/multiple-cons")
public ResponseEntity<MultipleConDTO> updateMultipleCon(@RequestBody MultipleConDTO multipleConDTO) throws URISyntaxException {
log.debug("REST request to update MultipleCon : {}", multipleConDTO);
if (multipleConDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
MultipleConDTO result = multipleConService.save(multipleConDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, multipleConDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /multiple-cons} : get all the multipleCons.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of multipleCons in body.
*/
@GetMapping("/multiple-cons")
public List<MultipleConDTO> getAllMultipleCons() {
log.debug("REST request to get all MultipleCons");
return multipleConService.findAll();
}
/**
* {@code GET /multiple-cons/:id} : get the "id" multipleCon.
*
* @param id the id of the multipleConDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the multipleConDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/multiple-cons/{id}")
public ResponseEntity<MultipleConDTO> getMultipleCon(@PathVariable Long id) {
log.debug("REST request to get MultipleCon : {}", id);
Optional<MultipleConDTO> multipleConDTO = multipleConService.findOne(id);
return ResponseUtil.wrapOrNotFound(multipleConDTO);
}
/**
* {@code DELETE /multiple-cons/:id} : delete the "id" multipleCon.
*
* @param id the id of the multipleConDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/multiple-cons/{id}")
public ResponseEntity<Void> deleteMultipleCon(@PathVariable Long id) {
log.debug("REST request to delete MultipleCon : {}", id);
multipleConService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/tmp/service/mapper/ArtifactMetricMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ArtifactMetricDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ArtifactMetric} and its DTO {@link ArtifactMetricDTO}.
*/
@Mapper(componentModel = "spring", uses = {ArtifactMapper.class})
public interface ArtifactMetricMapper extends EntityMapper<ArtifactMetricDTO, ArtifactMetric> {
@Mapping(source = "artifact.id", target = "artifactId")
ArtifactMetricDTO toDto(ArtifactMetric artifactMetric);
@Mapping(source = "artifactId", target = "artifact")
ArtifactMetric toEntity(ArtifactMetricDTO artifactMetricDTO);
default ArtifactMetric fromId(Long id) {
if (id == null) {
return null;
}
ArtifactMetric artifactMetric = new ArtifactMetric();
artifactMetric.setId(id);
return artifactMetric;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/processKnowledge/IProcessMetricDAO.java
package br.ufpa.labes.spm.repository.interfaces.processKnowledge;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ProcessMetric;
public interface IProcessMetricDAO extends IBaseDAO<ProcessMetric, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ResourceMetricMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ResourceMetricDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ResourceMetric} and its DTO {@link ResourceMetricDTO}.
*/
@Mapper(componentModel = "spring", uses = {ResourceMapper.class})
public interface ResourceMetricMapper extends EntityMapper<ResourceMetricDTO, ResourceMetric> {
@Mapping(source = "resource.id", target = "resourceId")
ResourceMetricDTO toDto(ResourceMetric resourceMetric);
@Mapping(source = "resourceId", target = "resource")
ResourceMetric toEntity(ResourceMetricDTO resourceMetricDTO);
default ResourceMetric fromId(Long id) {
if (id == null) {
return null;
}
ResourceMetric resourceMetric = new ResourceMetric();
resourceMetric.setId(id);
return resourceMetric;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processKnowledge/EstimationDAO.java
package br.ufpa.labes.spm.repository.impl.processKnowledge;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processKnowledge.IEstimationDAO;
import br.ufpa.labes.spm.domain.Estimation;
public class EstimationDAO extends BaseDAO<Estimation, Integer> implements IEstimationDAO {
protected EstimationDAO(Class<Estimation> businessClass) {
super(businessClass);
}
public EstimationDAO() {
super(Estimation.class);
}
}
<file_sep>/tmp/service/WorkGroupService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.service.dto.WorkGroupDTO;
import java.util.List;
import java.util.Optional;
/**
* Service Interface for managing {@link br.ufpa.labes.spm.domain.WorkGroup}.
*/
public interface WorkGroupService {
/**
* Save a workGroup.
*
* @param workGroupDTO the entity to save.
* @return the persisted entity.
*/
WorkGroupDTO save(WorkGroupDTO workGroupDTO);
/**
* Get all the workGroups.
*
* @return the list of entities.
*/
List<WorkGroupDTO> findAll();
/**
* Get the "id" workGroup.
*
* @param id the id of the entity.
* @return the entity.
*/
Optional<WorkGroupDTO> findOne(Long id);
/**
* Delete the "id" workGroup.
*
* @param id the id of the entity.
*/
void delete(Long id);
}
<file_sep>/tmp/service/dto/RoleNeedsAbilityDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.Objects;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.RoleNeedsAbility} entity.
*/
public class RoleNeedsAbilityDTO implements Serializable {
private Long id;
private Integer degree;
private Long theRoleId;
private Long theAbilityId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public Integer getDegree() {
return degree;
}
public void setDegree(Integer degree) {
this.degree = degree;
}
public Long getTheRoleId() {
return theRoleId;
}
public void setTheRoleId(Long roleId) {
this.theRoleId = roleId;
}
public Long getTheAbilityId() {
return theAbilityId;
}
public void setTheAbilityId(Long abilityId) {
this.theAbilityId = abilityId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
RoleNeedsAbilityDTO roleNeedsAbilityDTO = (RoleNeedsAbilityDTO) o;
if (roleNeedsAbilityDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), roleNeedsAbilityDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "RoleNeedsAbilityDTO{" +
"id=" + getId() +
", degree=" + getDegree() +
", theRole=" + getTheRoleId() +
", theAbility=" + getTheAbilityId() +
"}";
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/AutomaticMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.AutomaticDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Automatic} and its DTO {@link AutomaticDTO}.
*/
@Mapper(componentModel = "spring", uses = {SubroutineMapper.class, ArtifactMapper.class})
public interface AutomaticMapper extends EntityMapper<AutomaticDTO, Automatic> {
@Mapping(source = "theSubroutine.id", target = "theSubroutineId")
@Mapping(source = "theArtifact.id", target = "theArtifactId")
AutomaticDTO toDto(Automatic automatic);
@Mapping(source = "theSubroutineId", target = "theSubroutine")
@Mapping(target = "theParameters", ignore = true)
@Mapping(target = "removeTheParameters", ignore = true)
@Mapping(source = "theArtifactId", target = "theArtifact")
Automatic toEntity(AutomaticDTO automaticDTO);
default Automatic fromId(Long id) {
if (id == null) {
return null;
}
Automatic automatic = new Automatic();
automatic.setId(id);
return automatic;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plannerInfo/IAgentInstSuggestionToAgentDAO.java
package br.ufpa.labes.spm.repository.interfaces.plannerInfo;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.AgentInstSuggestionToAgent;
public interface IAgentInstSuggestionToAgentDAO
extends IBaseDAO<AgentInstSuggestionToAgent, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/DevelopingSystemMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.DevelopingSystemDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link DevelopingSystem} and its DTO {@link DevelopingSystemDTO}.
*/
@Mapper(componentModel = "spring", uses = {CompanyMapper.class})
public interface DevelopingSystemMapper extends EntityMapper<DevelopingSystemDTO, DevelopingSystem> {
@Mapping(source = "theOrganization.id", target = "theOrganizationId")
DevelopingSystemDTO toDto(DevelopingSystem developingSystem);
@Mapping(source = "theOrganizationId", target = "theOrganization")
@Mapping(target = "theProjects", ignore = true)
@Mapping(target = "removeTheProject", ignore = true)
DevelopingSystem toEntity(DevelopingSystemDTO developingSystemDTO);
default DevelopingSystem fromId(Long id) {
if (id == null) {
return null;
}
DevelopingSystem developingSystem = new DevelopingSystem();
developingSystem.setId(id);
return developingSystem;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/interfaces/EnactmentEngine.java
/*
* Created on 29/03/2005
*/
package br.ufpa.labes.spm.service.interfaces;
import br.ufpa.labes.spm.exceptions.WebapseeException;
/**
* @author <NAME> (<EMAIL>) and <NAME>
* (<EMAIL>) LABES - Software Engineering Laboratory, Federal University of Para,
* Belem, Para, Brazil; Faculty of Computer Science, University of Stuttgart, Stuttgart,
* Baden-Württemberg, Germany
* @since April/2005
* @version 1.0
*/
public interface EnactmentEngine {
/**
* This method is called by the Process Manager. It creates the tasks in the Task Agendas of the
* Agents (Developers) with state Waiting, sets the state of the Process to Enacting and,
* according to the Multiple Connections and the Activities that are ready to begin, determines
* the Process Model state. Execution then continues until the state of the Process has been set
* to Finished.
*
* @param process_id the identifier of the Process to execute
*/
public void executeProcess(String process_id) throws WebapseeException;
/**
* This method is called by the Process Manager. It resets all the components of the Process
* (the Process Model, Activities, Connections, Resources, and Process Agendas), setting their
* states back to the initial state.
*
* @param process_id the identifier of the Process to reset
*/
public void resetProcess(String process_id) throws WebapseeException;
/**
* This method is called by an Agent of the Process to begin a Task in his Process Agenda. It
* sets the Task state to Active in his Agenda, and in the Process Model if it is not already
* Active. If the Activity is paused, the Activity is restarted.
*
* @param agent_id the identifier of the Agent
* @param act_id the identifier of the Activity
*/
public void beginTask(String agent_id, String act_id) throws WebapseeException;
/**
* This method is called by an Agent of the Process to finish a Task in his Process Agenda. It
* sets the Task state to Finished in his Agenda, and in the Process Model once all the Agents
* required for this Activity have also finished the Task. There is a particular situation in
* this method: when an Activity finishes, if it is the source of a Feedback Connection and the
* associated condition of the Feedback Connection is satisfied (true), the Feedback is executed
* until the condition is no longer satisfied (false).
*
* @param agent_id the identifier of the Agent
* @param act_id the identifier of the Activity
*/
public void finishTask(String agent_id, String act_id) throws WebapseeException;
/**
* This method is called by an Agent of the Process to pause a Task in his Process Agenda. It
* sets the Task state to Paused in his Agenda, and in the Process Model once all the Agents
* required for this Activity have also paused the Task.
*
* @param agent_id the identifier of the Agent
* @param act_id the identifier of the Activity
*/
public void pauseTask(String agent_id, String act_id) throws WebapseeException;
public void pauseActiveTasks(String agentIdent) throws WebapseeException;
/**
* This method is called by an Agent of the Process to delegate a Task in his Process Agenda to
* another Agent. It sets the Task state to Delegated in his Agenda and creates the Task on the
* Process Agenda of the other Agent.
*
* @param from_agent_id the identifier of the delegating Agent
* @param act_id the identifier of the Activity
* @param to_agent_id the identifier of the Agent receiving the Task
*/
public void delegateTask(String from_agent_id, String act_id, String to_agent_id)
throws WebapseeException;
/**
* This method is called by the Process Manager to fail an Activity. It sets the Activity state
* to Failed and propagates the failure to the successors according to their state, in the case
* of an Activity, and fails the Connections in the case of a Multiple Connection (BranchCon or
* JoinCon). There is a particular situation in this method as well: when an Activity fails, if
* it is the source of a Feedback Connection and the condition of the Feedback Connection is
* satisfied (true), the Feedback is executed until the condition is no longer satisfied (false).
* To be failed, an Activity must be Active or Paused.
*
* @param act_id the identifier of the Activity
*/
public void failActivity(String act_id) throws WebapseeException;
/**
* This method is called by the Process Manager to cancel an Activity. It sets the Activity
* state to Canceled and propagates the cancellation to the successors according to their state,
* in the case of an Activity, and cancels the Connections in the case of a Multiple Connection
* (BranchCon or JoinCon). To be canceled, an Activity must be Waiting or Ready.
*
* @param act_id the identifier of the Activity
*/
public void cancelActivity(String act_id) throws WebapseeException;
/**
* This method is called by the Process Manager to make a Shareable Resource unavailable. It
* sets the Resource state to Not Available.
*
* @param resource_id the identifier of the Resource
*/
public void makeUnavailable(String resource_id) throws WebapseeException;
/**
* This method is called by the Process Manager to make a Shareable Resource available. It sets
* the Resource state to Available.
*
* @param resource_id the identifier of the Resource
*/
public void makeAvailable(String resource_id) throws WebapseeException;
/**
* This method is called by the Process Manager to register a defect on an Exclusive Resource.
* It sets the Resource state to Defect.
*
* @param resource_id the identifier of the Resource
*/
public void registerDefect(String resource_id) throws WebapseeException;
// New Public Methods
public void createNewVersion(String act_id) throws WebapseeException;
}
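The Javadoc above implies a task state machine (Waiting, Active, Paused, Finished). The toy model below sketches those transitions; the enum names and transition rules are assumptions drawn from the documentation, not the engine's actual state constants:

```java
public class TaskLifecycleSketch {
    enum State { WAITING, ACTIVE, PAUSED, FINISHED }

    // beginTask: activates a waiting task, or restarts a paused one.
    static State begin(State s) {
        if (s == State.WAITING || s == State.PAUSED) return State.ACTIVE;
        throw new IllegalStateException("cannot begin from " + s);
    }

    // pauseTask: only an active task can be paused.
    static State pause(State s) {
        if (s == State.ACTIVE) return State.PAUSED;
        throw new IllegalStateException("cannot pause from " + s);
    }

    // finishTask: only an active task can finish.
    static State finish(State s) {
        if (s == State.ACTIVE) return State.FINISHED;
        throw new IllegalStateException("cannot finish from " + s);
    }

    public static void main(String[] args) {
        State s = State.WAITING;
        s = begin(s);   // ACTIVE
        s = pause(s);   // PAUSED
        s = begin(s);   // ACTIVE again
        s = finish(s);  // FINISHED
        System.out.println(s); // FINISHED
    }
}
```

In the real engine the per-agent task state and the Process Model state are tracked separately (e.g. the model only moves to Finished once every required agent has finished); this sketch covers a single task only.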
<file_sep>/tmp/service/impl/DynamicModelingImpl.java
package br.ufpa.labes.spm.service.impl;
import java.rmi.RemoteException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashSet;
import java.util.Iterator;
import java.util.StringTokenizer;
import br.ufpa.labes.spm.repository.impl.plainActivities.AutomaticDAO;
import br.ufpa.labes.spm.repository.impl.types.ArtifactTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.activities.IActivityDAO;
import br.ufpa.labes.spm.repository.interfaces.activities.IDecomposedDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IAgentDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IWorkGroupDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IRoleDAO;
import br.ufpa.labes.spm.repository.interfaces.artifacts.IArtifactDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IArtifactConDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IBranchConCondToActivityDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IBranchConCondToMultipleConDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IBranchConDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IConnectionDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IJoinConDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IMultipleConDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.ISimpleConDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IAutomaticDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IInvolvedArtifactsDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.INormalDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IParameterDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IReqAgentDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IReqWorkGroupDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IRequiredResourceDAO;
import br.ufpa.labes.spm.repository.interfaces.policies.staticPolicies.IPolConditionDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.IProcessDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.IProcessModelDAO;
import br.ufpa.labes.spm.repository.interfaces.resources.IConsumableDAO;
import br.ufpa.labes.spm.repository.interfaces.resources.IResourceDAO;
import br.ufpa.labes.spm.repository.interfaces.taskagenda.IProcessAgendaDAO;
import br.ufpa.labes.spm.repository.interfaces.taskagenda.ITaskDAO;
import br.ufpa.labes.spm.repository.interfaces.tools.ISubroutineDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IArtifactTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IWorkGroupTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IResourceTypeDAO;
import br.ufpa.labes.spm.service.dto.WebapseeObjectDTO;
import br.ufpa.labes.spm.exceptions.DAOException;
import br.ufpa.labes.spm.exceptions.ModelingException;
import br.ufpa.labes.spm.exceptions.ModelingWarning;
import br.ufpa.labes.spm.exceptions.WebapseeException;
import br.ufpa.labes.spm.domain.Activity;
import br.ufpa.labes.spm.domain.Decomposed;
import br.ufpa.labes.spm.domain.Plain;
import br.ufpa.labes.spm.domain.Agent;
import br.ufpa.labes.spm.domain.AgentPlaysRole;
import br.ufpa.labes.spm.domain.WorkGroup;
import br.ufpa.labes.spm.domain.Role;
import br.ufpa.labes.spm.domain.Artifact;
import br.ufpa.labes.spm.domain.ArtifactCon;
import br.ufpa.labes.spm.domain.BranchCon;
import br.ufpa.labes.spm.domain.BranchANDCon;
import br.ufpa.labes.spm.domain.BranchConCond;
import br.ufpa.labes.spm.domain.BranchConCondToActivity;
import br.ufpa.labes.spm.domain.BranchConCondToMultipleCon;
import br.ufpa.labes.spm.domain.Connection;
import br.ufpa.labes.spm.domain.Dependency;
import br.ufpa.labes.spm.domain.Feedback;
import br.ufpa.labes.spm.domain.JoinCon;
import br.ufpa.labes.spm.domain.MultipleCon;
import br.ufpa.labes.spm.domain.Sequence;
import br.ufpa.labes.spm.domain.SimpleCon;
import br.ufpa.labes.spm.domain.ModelingActivityEvent;
import br.ufpa.labes.spm.domain.Automatic;
import br.ufpa.labes.spm.domain.EnactionDescription;
import br.ufpa.labes.spm.domain.InvolvedArtifact;
import br.ufpa.labes.spm.domain.Normal;
import br.ufpa.labes.spm.domain.Parameter;
import br.ufpa.labes.spm.domain.ReqAgent;
import br.ufpa.labes.spm.domain.ReqWorkGroup;
import br.ufpa.labes.spm.domain.RequiredPeople;
import br.ufpa.labes.spm.domain.RequiredResource;
import br.ufpa.labes.spm.domain.PolCondition;
import br.ufpa.labes.spm.domain.ActivityEstimation;
import br.ufpa.labes.spm.domain.ActivityMetric;
import br.ufpa.labes.spm.domain.Process;
import br.ufpa.labes.spm.domain.ProcessModel;
import br.ufpa.labes.spm.domain.Consumable;
import br.ufpa.labes.spm.domain.Exclusive;
import br.ufpa.labes.spm.domain.Resource;
import br.ufpa.labes.spm.domain.Shareable;
import br.ufpa.labes.spm.domain.ProcessAgenda;
import br.ufpa.labes.spm.domain.Task;
import br.ufpa.labes.spm.domain.TaskAgenda;
import br.ufpa.labes.spm.domain.ClassMethodCall;
import br.ufpa.labes.spm.domain.Script;
import br.ufpa.labes.spm.domain.Subroutine;
import br.ufpa.labes.spm.domain.ActivityType;
import br.ufpa.labes.spm.domain.ArtifactType;
import br.ufpa.labes.spm.domain.WorkGroupType;
import br.ufpa.labes.spm.domain.ResourceType;
import br.ufpa.labes.spm.domain.Type;
import br.ufpa.labes.spm.service.interfaces.DynamicModeling;
import br.ufpa.labes.spm.service.interfaces.EnactmentEngineLocal;
import br.ufpa.labes.spm.service.interfaces.Logging;
import br.ufpa.labes.spm.service.interfaces.NotificationServices;
import br.ufpa.labes.spm.util.i18n.Messages;
public class DynamicModelingImpl implements DynamicModeling {
// private static String UPT = "UPT";
private static String ADD = "ADD";
private static String DEL = "DEL";
private static String DIRECTION_OUT = "OUT";
private static String DIRECTION_IN = "IN";
EnactmentEngineLocal enactmentEngine;
private Logging logging;
public NotificationServices remote;
IProcessDAO procDAO;
IDecomposedDAO decDAO;
IActivityDAO actDAO;
INormalDAO normDAO;
IAutomaticDAO autoDAO;
IArtifactDAO artDAO;
IProcessModelDAO pmodelDAO;
ISubroutineDAO subDAO;
IParameterDAO paramDAO;
IArtifactConDAO artConDAO;
IArtifactTypeDAO artTypeDAO;
IInvolvedArtifactsDAO invArtDAO;
IMultipleConDAO multiDAO;
IConnectionDAO conDAO;
IPolConditionDAO polConditionDAO;
IBranchConCondToMultipleConDAO bctmcDAO;
IJoinConDAO joinConDAO;
IBranchConDAO branchConDAO;
IWorkGroupTypeDAO WorkGroupTypeDAO;
IRoleDAO roleDAO;
IReqAgentDAO reqAgentDAO;
IAgentDAO agentDAO;
ITaskDAO taskDAO;
IWorkGroupDAO WorkGroupDAO;
IReqWorkGroupDAO reqWorkGroupDAO;
IResourceTypeDAO resTypeDAO;
IRequiredResourceDAO reqResDAO;
IResourceDAO resDAO;
IConsumableDAO consumableDAO;
IBranchConCondToActivityDAO branchConCondToActivityDAO;
ISimpleConDAO simpleDAO;
IProcessAgendaDAO pAgendaDAO;
@Override
public void imprimirNoConsole(String mensagem) {
System.out.println("Message coming from Flex!");
System.out.println(mensagem);
}
@Override
public Integer newNormalActivity(String level_id, String new_id) throws WebapseeException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception /* was DAOException */ e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new activity
// is not in the root process model.
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
System.out.println("model: " + pmodel.getPmState());
}
// End Checks for the parameters
// Now we start the implementation of the rules
String pmState = pmodel.getPmState();
if (!pmState.equals(ProcessModel.INSTANTIATED) && !pmState.equals(ProcessModel.ENACTING) && !pmState.equals(ProcessModel.ABSTRACT)
&& !pmState.equals(ProcessModel.REQUIREMENTS)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotAdded")); //$NON-NLS-1$
}
if ((process.getpStatus().name().equals(Process.ENACTING) || process.getpStatus().name().equals(Process.NOT_STARTED))) {
if (!this.isActivityInProcessModel(level_id + "." + new_id, pmodel)) { //$NON-NLS-1$
System.out.println("saving: " + pmodel.getPmState());
Normal actNorm = new Normal();
actNorm.setIdent(level_id + "." + new_id); //$NON-NLS-1$
actNorm.setName(new_id);
actNorm.setTheProcessModel(pmodel);
pmodel.getTheActivities().add(actNorm);
// Persistence Operations
procDAO.update(process);
if (actDecomposed != null)
decDAO.update(actDecomposed);
return actNorm.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + new_id + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrInModel")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.ModelingExcIsNOtReadyToIn")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
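The hierarchical `level_id` parsing used by `newNormalActivity` (and repeated in the other `new*Activity` methods) can be seen in isolation below. The sample id is purely illustrative; the point is that the first dot-separated token is the process id and the remaining tokens rebuild the ident of the enclosing decomposed activity:

```java
import java.util.StringTokenizer;

// Stand-alone sketch of the level_id handling: split off the process id,
// then re-accumulate the rest into the decomposed activity's ident.
public class LevelIdSketch {
    static String[] split(String level_id) {
        StringTokenizer st = new StringTokenizer(level_id, ".");
        String process_id = st.nextToken();
        String currentModel = process_id;
        while (st.hasMoreTokens()) {
            currentModel += "." + st.nextToken();
        }
        return new String[] { process_id, currentModel };
    }

    public static void main(String[] args) {
        String[] parts = split("proc1.design.review");
        System.out.println(parts[0]); // proc1
        System.out.println(parts[1]); // proc1.design.review
    }
}
```

Note that when `level_id` has sub-tokens, `currentModel` ends up equal to the whole `level_id`, which is exactly the secondary key used to look up the enclosing `Decomposed` activity.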
@Override
public Integer newDecomposedActivity(String level_id, String new_id) throws WebapseeException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new activity
// is not in the root process model.
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
String pmState = pmodel.getPmState();
if (!pmState.equals(ProcessModel.INSTANTIATED) && !pmState.equals(ProcessModel.ENACTING) && !pmState.equals(ProcessModel.ABSTRACT)
&& !pmState.equals(ProcessModel.REQUIREMENTS)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotAdded")); //$NON-NLS-1$
}
if ((process.getpStatus().name().equals(Process.ENACTING) || process.getpStatus().name().equals(Process.NOT_STARTED))) {
if (!this.isActivityInProcessModel(level_id + "." + new_id, pmodel)) { //$NON-NLS-1$
Decomposed actDec = new Decomposed();
actDec.setIdent(level_id + "." + new_id); //$NON-NLS-1$
actDec.setName(new_id);
actDec.setTheProcessModel(pmodel);
pmodel.getTheActivities().add(actDec);
ProcessModel refProcModel = actDec.getTheReferedProcessModel();
refProcModel.setTheDecomposed(actDec);
// Persistence Operations
procDAO.update(process);
if (actDecomposed != null)
decDAO.update(actDecomposed);
return actDec.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + new_id + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrInModel")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.ModelingExcIsNOtReadyToIn")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
@Override
public Integer newAutomaticActivity(String level_id, String new_id) throws WebapseeException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new activity
// is not in the root process model.
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
String pmState = pmodel.getPmState();
if (!pmState.equals(ProcessModel.INSTANTIATED) && !pmState.equals(ProcessModel.ENACTING) && !pmState.equals(ProcessModel.ABSTRACT)
&& !pmState.equals(ProcessModel.REQUIREMENTS)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotAdded")); //$NON-NLS-1$
}
if (process.getpStatus().name().equals(Process.NOT_STARTED)) {
if (!this.isActivityInProcessModel(level_id + "." + new_id, pmodel)) { //$NON-NLS-1$
Automatic actAuto = new Automatic();
actAuto.setIdent(level_id + "." + new_id); //$NON-NLS-1$
actAuto.setName(new_id);
actAuto.setTheProcessModel(pmodel);
pmodel.getTheActivities().add(actAuto);
// Persistence Operations
procDAO.update(process);
if (actDecomposed != null)
decDAO.update(actDecomposed);
return actAuto.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + new_id + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrInModel")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else if (process.getpStatus().name().equals(Process.ENACTING)) {
if (!this.isActivityInProcessModel(level_id + "." + new_id, pmodel)) { //$NON-NLS-1$
Automatic actAuto = new Automatic();
actAuto.setIdent(level_id + "." + new_id); //$NON-NLS-1$
actAuto.setName(new_id);
actAuto.getTheEnactionDescription().setStateWithMessage(Plain.WAITING);
actAuto.setTheProcessModel(pmodel);
pmodel.getTheActivities().add(actAuto);
// Persistence Operations
procDAO.update(process);
if (actDecomposed != null)
decDAO.update(actDecomposed);
return actAuto.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + new_id + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrInModel")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.ModelingExcIsNOtReadyToIn")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
@Override
public Integer defineAsDecomposed(String act_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// End Checks for the parameters
// Now we start the implementation of the rules
if (activity instanceof Decomposed)
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcIsAlrDecActv")); //$NON-NLS-1$
else if (activity instanceof Normal) { // Rule G1.4
Normal actNorm = (Normal) activity;
if (actNorm.getTheEnactionDescription().getState().equals("")) { //$NON-NLS-1$
if (this.hasInvolvedAgents(actNorm)) {
actNorm.setTheRequiredPeople(new HashSet());
}
Collection set = actNorm.getTheRequiredResources();
set.remove(null);
if (!set.isEmpty()) {
this.releaseResourcesFromActivity(actNorm);
actNorm.setTheRequiredResource(new HashSet());
}
String identAux = actNorm.getIdent();
actNorm.setIdent(""); //$NON-NLS-1$
// Persistence Operation
actDAO.update(actNorm);
Decomposed actDecomp = new Decomposed();
actDecomp.setIdent(identAux);
actDecomp.setName(actNorm.getName());
this.copyActivityRelationships(actNorm, actDecomp);
ProcessModel refProcModel = actDecomp.getTheReferedProcessModel();
refProcModel.setTheDecomposed(actDecomp);
// Persistence Operations
normDAO.daoDelete(actNorm);
actDAO.daoSave(actDecomp);
return actDecomp.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrRun")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else if (activity instanceof Automatic) { // Rule G1.5
Automatic actAuto = (Automatic) activity;
if (actAuto.getTheEnactionDescription().getState().equals("") //$NON-NLS-1$
&& actAuto.getTheSubroutine() == null) {
String identAux = actAuto.getIdent();
actAuto.setIdent(""); //$NON-NLS-1$
// Persistence Operation
actDAO.update(actAuto);
Decomposed actDecomp = new Decomposed();
actDecomp.setIdent(identAux);
actDecomp.setName(actAuto.getName());
this.copyActivityRelationships(actAuto, actDecomp);
ProcessModel refProcModel = actDecomp.getTheReferedProcessModel();
refProcModel.setTheDecomposed(actDecomp);
// Persistence Operations
autoDAO.daoDelete(actAuto);
actDAO.daoSave(actDecomp);
return actDecomp.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrRun")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
return null;
}
@Override
public Integer defineAsAutomatic(String act_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// End Checks for the parameters
// Now we start the implementation of the rules
if (activity instanceof Automatic)
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcIsAlrAutoActv")); //$NON-NLS-1$
else if (activity instanceof Normal) { // Rule G1.6
Normal actNorm = (Normal) activity;
if (actNorm.getTheEnactionDescription().getState().equals("")) { //$NON-NLS-1$
if (this.hasInvolvedAgents(actNorm)) {
actNorm.setTheRequiredPeople(new HashSet());
}
Collection set = actNorm.getTheRequiredResources();
set.remove(null);
if (!set.isEmpty()) {
this.releaseResourcesFromActivity(actNorm);
actNorm.setTheRequiredResource(new HashSet());
}
String identAux = actNorm.getIdent();
actNorm.setIdent(""); //$NON-NLS-1$
// Persistence Operation
actDAO.update(actNorm);
Automatic actAuto = new Automatic();
actAuto.setIdent(identAux);
actAuto.setName(actNorm.getName());
this.copyActivityRelationships(actNorm, actAuto);
// Persistence Operations
normDAO.daoDelete(actNorm);
actDAO.daoSave(actAuto);
return actAuto.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrRun")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else if (activity instanceof Decomposed) { // Rule G1.7
Decomposed actDecomp = (Decomposed) activity;
ProcessModel pmodel = actDecomp.getTheReferedProcessModel();
if (pmodel == null) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcHasNoRefProcMod")); //$NON-NLS-1$
} else {
Collection set = pmodel.getTheActivities();
set.remove(null);
if (!set.isEmpty()) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") //$NON-NLS-1$
+ activity.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasRefProcModDef")); //$NON-NLS-1$
} else {
String identAux = actDecomp.getIdent();
actDecomp.setIdent(""); //$NON-NLS-1$
// Persistence Operation
actDAO.update(actDecomp);
Automatic actAuto = new Automatic();
actAuto.setIdent(identAux);
actAuto.setName(actDecomp.getName());
this.copyActivityRelationships(actDecomp, actAuto);
// Persistence Operations
decDAO.daoDelete(actDecomp);
actDAO.daoSave(actAuto);
return actAuto.getId();
}
}
}
return null;
}
@Override
public Integer defineAsNormal(String act_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// End Checks for the parameters
// Now we start the implementation of the rules
if (activity instanceof Normal) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcIsAlrNorActv")); //$NON-NLS-1$
}
else if (activity instanceof Decomposed) { // Rule G1.8
Decomposed actDecomp = (Decomposed) activity;
ProcessModel pmodel = actDecomp.getTheReferedProcessModel();
if (pmodel == null) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcHasNoRefProcMod")); //$NON-NLS-1$
} else {
Collection set = pmodel.getTheActivities();
set.remove(null);
if (!set.isEmpty()) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") //$NON-NLS-1$
+ activity.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasRefProcModDef")); //$NON-NLS-1$
} else {
String identAux = actDecomp.getIdent();
actDecomp.setIdent(""); //$NON-NLS-1$
// Persistence Operation
actDAO.update(actDecomp);
Normal actNorm = new Normal();
actNorm.setIdent(identAux);
actNorm.setName(actDecomp.getName());
this.copyActivityRelationships(actDecomp, actNorm);
// Persistence Operations
actDAO.daoSave(actNorm);
decDAO.daoDelete(actDecomp);
return actNorm.getId();
}
}
} else if (activity instanceof Automatic) { // Rule G1.9
Automatic actAuto = (Automatic) activity;
if (actAuto.getTheEnactionDescription().getState().equals("") //$NON-NLS-1$
&& actAuto.getTheSubroutine() == null) {
String identAux = actAuto.getIdent();
actAuto.setIdent(""); //$NON-NLS-1$
// Persistence Operation
actDAO.update(actAuto);
Normal actNorm = new Normal();
actNorm.setIdent(identAux);
actNorm.setName(actAuto.getName());
this.copyActivityRelationships(actAuto, actNorm);
// Persistence Operations
AutomaticDAO autoDAO = new AutomaticDAO();
autoDAO.daoDelete(actAuto);
actDAO.daoSave(actNorm);
return actNorm.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + activity.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcIsAlrRun")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
return null;
}
/**
* Rules G1.10 and G1.11 are implemented in facades.EnactmentEngine
*/
/*
* public Activity createNewVersion(String act_id){
*
* }
*/
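/**
 * Associates an existing artifact as an input of a Normal activity.
 * Implements Rule G1.12 (activity not yet started) and Rule G1.13
 * (activity enacting but not canceled, failed, or finished); in the
 * latter case the involved agents are notified. Illustrative call
 * (both identifiers below are hypothetical):
 *
 * <pre>{@code
 * dynamicModeling.defineInputArtifact("TestProcess.Review", "RequirementsDoc");
 * }</pre>
 */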
@Override
public void defineInputArtifact(String act_id, String art_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object art;
try {
art = artDAO.retrieveBySecondaryKey(art_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtf") + //$NON-NLS-1$
art_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (art == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtf") + art_id + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Artifact artifact = (Artifact) art;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
if (state.equals("")) { // Rule G1.12 //$NON-NLS-1$
if (!this.hasTheFromArtifact(actNorm, artifact)) {
InvolvedArtifact inv = new InvolvedArtifact();
inv.setTheArtifact(artifact);
artifact.getTheInvolvedArtifacts().add(inv);
inv.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(inv);
inv.setTheArtifactType(artifact.getTheArtifactType());
artifact.getTheArtifactType().getTheInvolvedArtifacts().add(inv);
// Persistence Operations
normDAO.update(actNorm);
artDAO.update(artifact);
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcAlreadyCont") //$NON-NLS-1$ //$NON-NLS-2$
+ Messages.getString("facades.DynamicModeling.ModelingExcThisArtfInput")); //$NON-NLS-1$
}
} else if (!state.equals("") //$NON-NLS-1$
&& !state.equals(Plain.CANCELED) && !state.equals(Plain.FAILED) && !state.equals(Plain.FINISHED)) { // Rule G1.13
if (!this.hasTheFromArtifact(actNorm, artifact)) {
InvolvedArtifact inv = new InvolvedArtifact();
inv.setTheArtifact(artifact);
artifact.getTheInvolvedArtifacts().add(inv);
inv.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(inv);
inv.setTheArtifactType(artifact.getTheArtifactType());
artifact.getTheArtifactType().getTheInvolvedArtifacts().add(inv);
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcAlreadyCont") //$NON-NLS-1$ //$NON-NLS-2$
+ Messages.getString("facades.DynamicModeling.ModelingExcThisArtfInput")); //$NON-NLS-1$
}
if (this.hasInvolvedAgents(actNorm))
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtNewInpArtf"), artifact.getId(), DynamicModelingImpl.ADD, //$NON-NLS-1$
artifact.getClass(), artifact.getIdent(), DynamicModelingImpl.DIRECTION_IN);
// Persistence Operations
normDAO.update(actNorm);
artDAO.update(artifact);
}
}
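/**
 * Associates an existing artifact as an output of a Normal activity.
 * Implements Rule G1.14 (activity not yet started) and Rule G1.15
 * (activity enacting); any other state is rejected with a
 * ModelingException. Illustrative call (hypothetical identifiers):
 *
 * <pre>{@code
 * dynamicModeling.defineOutputArtifact("TestProcess.Review", "ReviewReport");
 * }</pre>
 */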
@Override
public void defineOutputArtifact(String act_id, String art_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object art;
try {
art = artDAO.retrieveBySecondaryKey(art_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtf") + //$NON-NLS-1$
art_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (art == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtf") + art_id + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Artifact artifact = (Artifact) art;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
if (state.equals("")) { // Rule G1.14 //$NON-NLS-1$
if (!this.hasTheToArtifact(actNorm, artifact)) {
InvolvedArtifact inv = new InvolvedArtifact();
inv.setTheArtifact(artifact);
artifact.getTheInvolvedArtifacts().add(inv);
inv.setOutInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactFromNormal().add(inv);
inv.setTheArtifactType(artifact.getTheArtifactType());
artifact.getTheArtifactType().getTheInvolvedArtifacts().add(inv);
// Persistence Operations
normDAO.update(actNorm);
artDAO.update(artifact);
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcAlreadyCont") //$NON-NLS-1$ //$NON-NLS-2$
+ Messages.getString("facades.DynamicModeling.ModelingExcThisArtfOut")); //$NON-NLS-1$
}
} else if (!state.equals("") //$NON-NLS-1$
&& !state.equals(Plain.CANCELED) && !state.equals(Plain.FAILED) && !state.equals(Plain.FINISHED)) { // Rule G1.15
if (!this.hasTheToArtifact(actNorm, artifact)) {
InvolvedArtifact inv = new InvolvedArtifact();
inv.setTheArtifact(artifact);
artifact.getTheInvolvedArtifacts().add(inv);
inv.setOutInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactFromNormal().add(inv);
inv.setTheArtifactType(artifact.getTheArtifactType());
artifact.getTheArtifactType().getTheInvolvedArtifacts().add(inv);
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcAlreadyCont") //$NON-NLS-1$ //$NON-NLS-2$
+ Messages.getString("facades.DynamicModeling.ModelingExcThisArtfOut")); //$NON-NLS-1$
}
if (this.hasInvolvedAgents(actNorm))
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtNewOutArtf"), artifact.getId(), DynamicModelingImpl.ADD, //$NON-NLS-1$
artifact.getClass(), artifact.getIdent(), DynamicModelingImpl.DIRECTION_OUT);
// Persistence Operations
normDAO.update(actNorm);
artDAO.update(artifact);
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcTheActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcStateNotOk") //$NON-NLS-1$ //$NON-NLS-2$
+ Messages.getString("facades.DynamicModeling.ModelingExcForNewOutArtf")); //$NON-NLS-1$
}
}
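/**
 * Attaches a Script subroutine and its parameters to an Automatic
 * activity and returns the activity's Oid. The ClassMethodCall
 * overload below does the same for a method-call subroutine.
 * Illustrative call (identifiers and variables are hypothetical):
 *
 * <pre>{@code
 * Integer oid = dynamicModeling.defineAutomatic("TestProcess.Build", buildScript, params);
 * }</pre>
 */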
@Override
public Integer defineAutomatic(String auto_id, Script script, Collection parameters) throws DAOException { // returns the Automatic's Oid
Object act;
try {
act = autoDAO.retrieveBySecondaryKey(auto_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
auto_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + auto_id + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Automatic automatic = (Automatic) act;
// Now we start the implementation of the rules
automatic.setTheSubroutine(script);
script.setTheAutomatic(automatic);
Iterator iter = parameters.iterator();
while (iter.hasNext()) {
Parameter param = (Parameter) iter.next();
param.setTheAutomatic(automatic);
automatic.getTheParameters().add(param);
}
// Persistence Operations
autoDAO.update(automatic);
return automatic.getId();
}
@Override
public Integer defineAutomatic(String auto_id, ClassMethodCall cmc, Collection parameters) throws DAOException { // retrieves the Automatic's Oid
Object act;
try {
act = autoDAO.retrieveBySecondaryKey(auto_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
auto_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + auto_id + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Automatic automatic = (Automatic) act;
// Now we start the implementation of the rules
automatic.setTheSubroutine(cmc);
automatic.setTheParameters(parameters);
cmc.setTheAutomatic(automatic);
Iterator iter = parameters.iterator();
while (iter.hasNext()) {
Parameter param = (Parameter) iter.next();
param.setTheAutomatic(automatic);
}
// Persistence Operations
autoDAO.update(automatic);
return automatic.getId();
}
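/**
 * Deletes an activity according to its concrete type: a Decomposed
 * activity is removed only while its referred process model has not
 * started (sub-activities are deleted recursively first when that
 * model is non-empty); an Automatic activity only while unstarted or
 * WAITING; a Normal activity also while READY. When the enclosing
 * process is ENACTING, the enactment engine re-evaluates fired
 * connections, ready activities, and process model states.
 * Illustrative call (hypothetical identifier):
 *
 * <pre>{@code
 * dynamicModeling.deleteActivity("TestProcess.Review");
 * }</pre>
 */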
@Override
public void deleteActivity(String act_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
System.out.println("Ident: " + act_id);
System.out.println(act);
if (act == null) {
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
}
Activity activity = (Activity) act;
ProcessModel processModel = activity.getTheProcessModel();
// End Checks for the parameters
// Now we start the implementation of the rules
if (activity instanceof Decomposed) {
Decomposed actDec = (Decomposed) activity;
ProcessModel referedProcessModel = actDec.getTheReferedProcessModel();
if (referedProcessModel != null) {
String state = referedProcessModel.getPmState();
if (state.equals(ProcessModel.REQUIREMENTS) || state.equals(ProcessModel.ABSTRACT) || state.equals(ProcessModel.INSTANTIATED)) {
Collection set = referedProcessModel.getTheActivities();
if (set.isEmpty()) {
this.removeAllConnectionsFromActivity(actDec);
processModel.getTheActivities().remove(actDec);
// Remove from the model
decDAO.daoDelete(actDec);
// Dynamic Changes related code
String processState = this.getTheProcess(processModel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(processModel.getId(), "Rule G1.17-G1.32"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(processModel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(processModel/* , currentSession */);
}
} else {
Collection acts = actDec.getTheReferedProcessModel().getTheActivities();
Iterator iterActivity = acts.iterator();
while (iterActivity.hasNext()) {
deleteActivity(((Activity) iterActivity.next()).getIdent());
}
deleteActivity(actDec.getIdent());
return;
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actDec.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlrStart")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
// Equivalent to an empty process model
this.removeAllConnectionsFromActivity(actDec/* , currentSession */);
processModel.getTheActivities().remove(actDec);
// Remove from the model
decDAO.daoDelete(actDec);
// Dynamic Changes related code
String processState = this.getTheProcess(processModel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(processModel.getId(), "Rule G1.17-G1.32"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(processModel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(processModel/* , currentSession */);
}
}
} else if (activity instanceof Automatic) {
Automatic actAuto = (Automatic) activity;
String state = actAuto.getTheEnactionDescription().getState();
if (state.equals("") || state.equals(Plain.WAITING)) { //$NON-NLS-1$
this.removeAllConnectionsFromActivity(actAuto/* , currentSession */);
processModel.getTheActivities().remove(actAuto);
actAuto.setTheProcessModel(null);
// Remove from the model
Subroutine subr = actAuto.getTheSubroutine();
if (subr != null) {
subDAO.daoDelete(subr);
}
Collection parameters = actAuto.getTheParameters();
Iterator iterParam = parameters.iterator();
while (iterParam.hasNext()) {
Parameter param = (Parameter) iterParam.next();
paramDAO.daoDelete(param);
}
// Remove from the model
autoDAO.daoDelete(actAuto);
// Dynamic Changes related code
String processState = this.getTheProcess(processModel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(processModel.getId(), "Rule G1.33-G1.48"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(processModel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(processModel/* , currentSession */);
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actAuto.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlrStart")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else if (activity instanceof Normal) {
Normal actNorm = (Normal) activity;
String state = actNorm.getTheEnactionDescription().getState();
if (state.equals("") || state.equals(Plain.WAITING) || state.equals(Plain.READY)) { //$NON-NLS-1$
this.removeAllConnectionsFromActivity(actNorm/* , currentSession */);
processModel.removeFromTheActivity(actNorm);
// Remove from the model
normDAO.daoDelete(actNorm);
// Dynamic Changes related code
String processState = this.getTheProcess(processModel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(processModel.getId(), "Rule G1.49-G1.98"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(processModel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(processModel/* , currentSession */);
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcHasAlrStart")); //$NON-NLS-1$
}
}
// Persistence Operations
pmodelDAO.update(processModel);
}
/**
* Related to section 2
*/
@Override
public Integer newArtifactConnection(String level_id) throws DAOException {
return newArtifactConnection_Internal(level_id).getId();
}
private ArtifactCon newArtifactConnection_Internal(String level_id) throws DAOException { // returns the new ArtifactCon
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null;
// Used only when the new connection is not in the root process model.
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Activity activity = null;
try {
// dec = decDAO.retrieveBySecondaryKey(currentModel);
activity = actDAO.retrieveBySecondaryKey(currentModel);
System.out.println("==============> Activity id: " + currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (activity == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
if (activity instanceof Decomposed) {
actDecomposed = (Decomposed) activity;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = activity.getTheProcessModel();
}
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
ArtifactCon artifactCon = new ArtifactCon();
artifactCon.setIdent(level_id);
artifactCon.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(artifactCon);
// Persistence Operations
artConDAO.daoSave(artifactCon);
if (actDecomposed != null)
decDAO.update(actDecomposed);
return artifactCon;
}
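/**
 * Convenience overload: creates a new artifact connection at the
 * given level and immediately binds it to the given artifact's type
 * and instance. Illustrative call (hypothetical identifiers):
 *
 * <pre>{@code
 * Long oid = dynamicModeling.newArtifactConnection("TestProcess.Model1", "DesignDoc");
 * }</pre>
 */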
@Override
public Long newArtifactConnection(String level_id, String art_id) throws DAOException, ModelingException { // returns the ArtifactCon's Oid
Artifact artifactAux = (Artifact) artDAO.retrieveBySecondaryKey(art_id);
String type_id = artifactAux.getTheArtifactType().getIdent();
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), type_id);
artifactCon = this.defineInstance_ArtifactConnection_Internal(artifactCon.getIdent(), art_id);
return artifactCon.getId();
}
@Override
public Long defineType_ArtifactConnection(String con_id, String type) throws DAOException {
return defineType_ArtifactConnection_Internal(con_id, type).getId();
}
private ArtifactCon defineType_ArtifactConnection_Internal(String con_id, String type) throws DAOException { // returns the ArtifactCon
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object artType;
try {
artType = artTypeDAO.retrieveBySecondaryKey(type);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfType") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artType == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfType") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactType artifactType = (ArtifactType) artType;
// End Checks for the parameters
// Now we start the implementation of the rules
artifactCon.setTheArtifactType(artifactType);
artifactType.getTheArtifactCon().add(artifactCon);
// Persistence Operations
artConDAO.update(artifactCon);
artTypeDAO.update(artifactType);
return artifactCon;
}
@Override
public Integer changeType_ArtifactConnection(String con_id, String newType) throws DAOException { // returns the ArtifactCon's Oid
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
ArtifactTypeDAO artTypeDAO = new ArtifactTypeDAO();
Object artType;
try {
artType = artTypeDAO.retrieveBySecondaryKey(newType);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfType") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artType == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfTyp") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactType newArtifactType = (ArtifactType) artType;
// End Checks for the parameters
// Now we start the implementation of the rules
Artifact artifact = artifactCon.getTheArtifact();
if (artifact != null) {
// Rule G2.3
artifact.getTheArtifactCon().remove(artifactCon);
artifactCon.setTheArtifact(null);
artifactCon.setTheArtifactType(newArtifactType);
newArtifactType.getTheArtifactCon().add(artifactCon);
} else {// Rule G2.4
artifactCon.setTheArtifactType(newArtifactType);
newArtifactType.getTheArtifactCon().add(artifactCon);
}
// Persistence Operations
artConDAO.update(artifactCon);
artTypeDAO.update(newArtifactType);
return artifactCon.getId();
}
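/**
 * Binds a concrete artifact instance to an artifact connection,
 * implementing Rules G2.5 to G2.8: the artifact's type must equal,
 * or be a subtype of, the connection's current type, and any
 * previously bound instance is detached first.
 */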
@Override
public Integer defineInstance_ArtifactConnection(String con_id, String artifact_id) throws DAOException, ModelingException {
return defineInstance_ArtifactConnection_Internal(con_id, artifact_id).getId();
}
private ArtifactCon defineInstance_ArtifactConnection_Internal(String con_id, String artifact_id) throws DAOException, ModelingException { // returns the ArtifactCon
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object art;
try {
art = artDAO.retrieveBySecondaryKey(artifact_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtf") + //$NON-NLS-1$
artifact_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (art == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtf") + artifact_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Artifact artifact = (Artifact) art;
// End Checks for the parameters
// Now we start the implementation of the rules
Artifact artifactFromCon = artifactCon.getTheArtifact();
ArtifactType artifactTypeFromCon = artifactCon.getTheArtifactType();
if (artifactFromCon == null) {
if (artifactTypeFromCon == null) {
// Rule G2.5
artifactCon.insertIntoTheArtifacts(artifact);
ArtifactType type = artifact.getTheArtifactType();
artifactCon.insertIntoTheArtifactType(type);
this.createInvolvedArtifacts(artifactCon, artifact, artifactFromCon, type);
} else if (artifactTypeFromCon.equals(artifact.getTheArtifactType())
|| this.isSubType(artifact.getTheArtifactType(), artifactTypeFromCon)) {
// Rule G2.7
artifactCon.insertIntoTheArtifacts(artifact);
this.createInvolvedArtifacts(artifactCon, artifact, artifactFromCon, artifactTypeFromCon);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheArtfTypeFromArtf")); //$NON-NLS-1$
}
} else {
// TODO: Check why the removeFromArtifactCon method is being called
if (artifactTypeFromCon == null) {
// Rule G2.6
artifactFromCon.removeFromTheArtifactCon(artifactCon);
artifactCon.insertIntoTheArtifacts(artifact);
ArtifactType type = artifact.getTheArtifactType();
artifactCon.insertIntoTheArtifactType(type);
this.createInvolvedArtifacts(artifactCon, artifact, artifactFromCon, type);
} else if (artifactTypeFromCon.equals(artifact.getTheArtifactType())
|| this.isSubType(artifact.getTheArtifactType(), artifactTypeFromCon)) {
// Rule G2.8
artifactFromCon.removeFromTheArtifactCon(artifactCon);
artifactCon.insertIntoTheArtifacts(artifact);
this.createInvolvedArtifacts(artifactCon, artifact, artifactFromCon, artifactTypeFromCon);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheArtfTypeFromArtf")); //$NON-NLS-1$
}
}
// Persistence Operations
artDAO.update(artifact);
artConDAO.update(artifactCon);
return artifactCon;
}
@Override
public Integer removeInstance_ArtifactConnection(String con_id) throws DAOException { // returns the ArtifactCon's Oid
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
// End Checks for the parameters
// Now we start the implementation of the rules
Artifact artifact = artifactCon.getTheArtifact();
artifact.removeFromTheArtifactCon(artifactCon);
// Removing from normal activities their input artifacts
Collection<Activity> toActivities = artifactCon.getToActivities();
Iterator iterToActivities = toActivities.iterator();
while (iterToActivities.hasNext()) {
Activity toActivity = (Activity) iterToActivities.next();
if (toActivity != null && toActivity instanceof Normal) {
Normal toNormal = (Normal) toActivity;
Collection inInvArts = toNormal.getInvolvedArtifactToNormal();
Iterator iterInInvArts = inInvArts.iterator();
while (iterInInvArts.hasNext()) {
InvolvedArtifact inInvArt = (InvolvedArtifact) iterInInvArts.next();
if (inInvArt != null) {
Artifact artFromInv = inInvArt.getTheArtifact();
if (artFromInv != null && artFromInv.equals(artifact)) {
inInvArt.removeFromTheArtifact();
inInvArt.removeFromTheArtifactType();
inInvArt.removeFromInInvolvedArtifacts();
invArtDAO.daoDelete(inInvArt);
break;
}
}
}
}
}
// Removing from normal activities their output artifacts
Collection<Activity> fromActivities = artifactCon.getFromActivity();
Iterator iterFromActivities = fromActivities.iterator();
while (iterFromActivities.hasNext()) {
Activity fromActivity = (Activity) iterFromActivities.next();
if (fromActivity != null && fromActivity instanceof Normal) {
Normal fromNormal = (Normal) fromActivity;
Collection outInvArts = fromNormal.getInvolvedArtifactFromNormal();
Iterator iterOutInvArts = outInvArts.iterator();
while (iterOutInvArts.hasNext()) {
InvolvedArtifact outInvArt = (InvolvedArtifact) iterOutInvArts.next();
if (outInvArt != null) {
Artifact artFromInv = outInvArt.getTheArtifact();
if (artFromInv != null && artFromInv.equals(artifact)) {
outInvArt.removeFromTheArtifact();
outInvArt.removeFromTheArtifactType();
outInvArt.removeFromInInvolvedArtifacts();
invArtDAO.daoDelete(outInvArt);
break;
}
}
}
}
}
// Persistence Operations
artConDAO.update(artifactCon);
return artifactCon.getId();
}
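/**
 * Connects an artifact connection as an output of an activity,
 * implementing Rules G2.10 to G2.13 for Normal activities: the
 * connection is accepted while the activity is unstarted, WAITING,
 * READY, ACTIVE, or PAUSED, and enacting activities trigger an
 * agent notification.
 */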
@Override
public Integer defineOutput_ArtifactConnection(String con_id, String act_id) throws DAOException, ModelingException, WebapseeException {
return defineOutput_ArtifactConnection_Internal(con_id, act_id).getId();
}
private ArtifactCon defineOutput_ArtifactConnection_Internal(String con_id, String act_id) throws DAOException, ModelingException,
WebapseeException { // returns the ArtifactCon
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity);
Artifact artifactFromCon = artifactCon.getTheArtifact();
ArtifactType artifactTypeFromCon = artifactCon.getTheArtifactType();
if (artifactTypeFromCon != null) {
if (activity instanceof Normal) {
Normal actNorm = (Normal) activity;
boolean normalContainsToArtifact = actNorm.getToArtifactCon().contains(artifactCon);
if (!normalContainsToArtifact) {
if (!this.hasTheToArtifact(actNorm, artifactFromCon)) {
if (state.equals("")) { // Rule G2.10 //$NON-NLS-1$
artifactCon.getFromActivity().add(actNorm);
actNorm.getToArtifactCon().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setOutInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactFromNormal().add(invArt);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else if (state.equals(Plain.WAITING) || state.equals(Plain.READY) || state.equals(Plain.ACTIVE)
|| state.equals(Plain.PAUSED)) { // Rule G2.11
artifactCon.getFromActivity().add(actNorm);
actNorm.getToArtifactCon().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setOutInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactFromNormal().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
if (this.hasInvolvedAgents(actNorm))
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtNewOutArtf"), //$NON-NLS-1$
artifactFromCon.getId(), DynamicModelingImpl.ADD, artifactFromCon.getClass(), artifactFromCon.getIdent(),
DynamicModelingImpl.DIRECTION_OUT);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvStateDoes")); //$NON-NLS-1$
} else { // The activity already has the artifact instance
					// defined in an involved artifact.
if (state.equals("")) { // Rule G2.12 //$NON-NLS-1$
artifactCon.getFromActivity().add(actNorm);
actNorm.getToArtifactCon().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setOutInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactFromNormal().add(invArt);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else if (state.equals(Plain.WAITING) || state.equals(Plain.READY) || state.equals(Plain.ACTIVE)
|| state.equals(Plain.PAUSED)) { // Rule G2.13
artifactCon.getFromActivity().add(actNorm);
actNorm.getToArtifactCon().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setOutInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactFromNormal().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
if (this.hasInvolvedAgents(actNorm) && (artifactFromCon != null)) // guard against a null artifact before notifying
						this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtNewOutArtf"), //$NON-NLS-1$
								artifactFromCon.getId(), DynamicModelingImpl.ADD, artifactFromCon.getClass(), artifactFromCon.getIdent(),
								DynamicModelingImpl.DIRECTION_OUT);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvStateDoes")); //$NON-NLS-1$
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvAlrHasArtf")); //$NON-NLS-1$
}
} else if (activity instanceof Decomposed) { // Rule G2.14
Decomposed actDec = (Decomposed) activity;
ProcessModel refProcModel = actDec.getTheReferedProcessModel();
if (!state.equals(ProcessModel.FINISHED) && !state.equals(ProcessModel.CANCELED) && !state.equals(ProcessModel.FAILED)
&& (refProcModel != null) && !actDec.getToArtifactCon().contains(artifactCon)) {
actDec.getToArtifactCon().add(artifactCon);
artifactCon.getFromActivity().add(actDec);
ArtifactCon newArtifactCon = new ArtifactCon();
newArtifactCon.setIdent(actDec.getIdent());
newArtifactCon = (ArtifactCon) artConDAO.daoSave(newArtifactCon);
newArtifactCon.setTheArtifactType(artifactTypeFromCon);
				artifactTypeFromCon.getTheArtifactCon().add(newArtifactCon);
				if (artifactFromCon != null) { // the connection may carry only the artifact type, not an instance
					newArtifactCon.setTheArtifact(artifactFromCon);
					artifactFromCon.getTheArtifactCon().add(newArtifactCon);
				}
refProcModel.getTheConnection().add(newArtifactCon);
newArtifactCon.setTheProcessModel(refProcModel);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvStateDoes")); //$NON-NLS-1$
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvShouldNormDecomp")); //$NON-NLS-1$
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheArtfTypeOfArtfConn")); //$NON-NLS-1$
}
}
@Override
	public Integer removeOutput_ArtifactConnection(String con_id, String act_id) throws DAOException, ModelingException { // returns the ArtifactCon Oid
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity);
		// The activity must not be finished, failed or canceled.
		// (The original check combined the clauses with ||, which made each clause, and thus the whole condition, always true.)
		if (!state.equals(Plain.FINISHED) && !state.equals(ProcessModel.FINISHED)
				&& !state.equals(Plain.FAILED) && !state.equals(ProcessModel.FAILED)
				&& !state.equals(Plain.CANCELED) && !state.equals(ProcessModel.CANCELED)) {
if (activity.getToArtifactCon().contains(artifactCon) || artifactCon.getFromActivity().contains(activity)) {
activity.getToArtifactCon().remove(artifactCon);
artifactCon.getFromActivity().remove(activity);
if (activity instanceof Normal) {
Normal actNorm = (Normal) activity;
Artifact artifact = artifactCon.getTheArtifact();
if (artifact != null) {
if (this.hasTheToArtifact(actNorm, artifactCon.getTheArtifact())) {
Collection invArts = actNorm.getInvolvedArtifactFromNormal();
Iterator iter = invArts.iterator();
InvolvedArtifact inv = null;
while (iter.hasNext()) {
InvolvedArtifact aux = (InvolvedArtifact) iter.next();
if (aux.getTheArtifact().equals(artifact)) {
inv = aux;
break;
}
}
if (inv != null) {
inv.removeFromOutInvolvedArtifacts();
if (this.hasInvolvedAgents(actNorm))
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtRemOutArtf"), artifact.getId(),
DynamicModelingImpl.DEL, artifact.getClass(), artifact.getIdent(), DynamicModelingImpl.DIRECTION_OUT); //$NON-NLS-1$
}
}
}
}
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcTheArtfConn") + artifactCon.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcCannBeResolv") //$NON-NLS-1$ //$NON-NLS-2$
+ Messages.getString("facades.DynamicModeling.ModelingExcItsNotConnActv") + activity.getIdent() + "!"); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcTheArtfConn") + artifactCon.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcCannBeResolv") //$NON-NLS-1$ //$NON-NLS-2$
+ Messages.getString("facades.DynamicModeling.ModelingExcItsSourceEnact")); //$NON-NLS-1$
}
}
@Override
public Integer defineInput_ArtifactConnection_Activity(String con_id, String act_id) throws DAOException, ModelingException {
return defineInput_ArtifactConnection_Activity_Internal(con_id, act_id).getId();
}
	private ArtifactCon defineInput_ArtifactConnection_Activity_Internal(String con_id, String act_id) throws DAOException, ModelingException { // returns the ArtifactCon
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity);
Artifact artifactFromCon = artifactCon.getTheArtifact();
ArtifactType artifactTypeFromCon = artifactCon.getTheArtifactType();
if (artifactTypeFromCon != null) {
if (activity instanceof Normal) {
Normal actNorm = (Normal) activity;
if (!actNorm.getFromArtifactCons().contains(artifactCon)) {
				String newInputArtifact = Messages.getString("facades.DynamicModeling.NotifyAgtNewInpArtf"); //$NON-NLS-1$
if (!this.hasTheFromArtifact(actNorm, artifactFromCon)) {
if (state.equals("")) { // Rule G2.10 //$NON-NLS-1$
artifactCon.getToActivities().add(actNorm);
actNorm.getFromArtifactCons().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else if (state.equals(Plain.WAITING) || state.equals(Plain.READY) || state.equals(Plain.ACTIVE)
|| state.equals(Plain.PAUSED)) { // Rule G2.11
artifactCon.getToActivities().add(actNorm);
actNorm.getFromArtifactCons().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
if (this.hasInvolvedAgents(actNorm) && (artifactFromCon != null)) // guard against a null artifact before notifying
							this.notifyAgents(actNorm, newInputArtifact,
									artifactFromCon.getId(), DynamicModelingImpl.ADD, artifactFromCon.getClass(), artifactFromCon.getIdent(),
									DynamicModelingImpl.DIRECTION_IN);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvStateDoes")); //$NON-NLS-1$
} else { // The activity already has the artifact instance
					// defined in an involved artifact.
if (state.equals("")) { // Rule G2.12 //$NON-NLS-1$
artifactCon.getToActivities().add(actNorm);
actNorm.getFromArtifactCons().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else if (state.equals(Plain.WAITING) || state.equals(Plain.READY) || state.equals(Plain.ACTIVE)
|| state.equals(Plain.PAUSED)) { // Rule G2.13
artifactCon.getToActivities().add(actNorm);
actNorm.getFromArtifactCons().add(artifactCon);
InvolvedArtifact invArt = new InvolvedArtifact();
if (artifactFromCon != null) {
invArt.setTheArtifact(artifactFromCon);
artifactFromCon.getTheInvolvedArtifacts().add(invArt);
}
invArt.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(invArt);
invArt.setTheArtifactType(artifactTypeFromCon);
artifactTypeFromCon.getTheInvolvedArtifacts().add(invArt);
if (this.hasInvolvedAgents(actNorm) && (artifactFromCon != null)) // guard against a null artifact before notifying
							this.notifyAgents(actNorm, newInputArtifact,
									artifactFromCon.getId(), DynamicModelingImpl.ADD, artifactFromCon.getClass(), artifactFromCon.getIdent(),
									DynamicModelingImpl.DIRECTION_IN);
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon;
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvStateDoes")); //$NON-NLS-1$
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvAlrHasArtf")); //$NON-NLS-1$
}
} else if (activity instanceof Decomposed) {
Decomposed actDec = (Decomposed) activity;
ProcessModel procModel = actDec.getTheProcessModel();
if (!state.equals(ProcessModel.FINISHED) && !state.equals(ProcessModel.CANCELED) && !state.equals(ProcessModel.FAILED)
&& (procModel != null) && !actDec.getFromArtifactCons().contains(artifactCon)) {
actDec.getFromArtifactCons().add(artifactCon);
artifactCon.getToActivities().add(actDec);
ProcessModel refProcModel = actDec.getTheReferedProcessModel();
if ((refProcModel != null) && !refProcModel.getTheConnection().contains(artifactCon)) {
ArtifactCon newArtifactCon = new ArtifactCon();
newArtifactCon.setIdent(actDec.getIdent());
newArtifactCon = (ArtifactCon) artConDAO.daoSave(newArtifactCon);
newArtifactCon.setTheArtifactType(artifactTypeFromCon);
artifactTypeFromCon.getTheArtifactCon().add(newArtifactCon);
if (artifactFromCon != null) {
newArtifactCon.setTheArtifact(artifactFromCon);
artifactFromCon.getTheArtifactCon().add(newArtifactCon);
}
refProcModel.getTheConnection().add(newArtifactCon);
newArtifactCon.setTheProcessModel(refProcModel);
// Persistence Operations
artConDAO.update(newArtifactCon);
actDAO.update(activity);
return artifactCon;
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvAlreadyConn")); //$NON-NLS-1$
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheDecompActvState")); //$NON-NLS-1$
} else
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheActvShouldNormDecomp")); //$NON-NLS-1$
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheArtfTypeOfArtfConn")); //$NON-NLS-1$
}
}
@Override
	public Integer removeInput_ArtifactConnection_Activity(String con_id, String act_id) throws DAOException, ModelingException { // returns the ArtifactCon Oid
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity);
		// The activity must not be finished, failed or canceled.
		// (The original check combined the clauses with ||, which made each clause, and thus the whole condition, always true.)
		if (!state.equals(Plain.FINISHED) && !state.equals(ProcessModel.FINISHED)
				&& !state.equals(Plain.FAILED) && !state.equals(ProcessModel.FAILED)
				&& !state.equals(Plain.CANCELED) && !state.equals(ProcessModel.CANCELED)) {
Collection artifactConnections = activity.getFromArtifactCons();
Collection toactivities = artifactCon.getToActivities();
if (artifactConnections.contains(artifactCon) || toactivities.contains(activity)) {
artifactConnections.remove(artifactCon);
toactivities.remove(activity);
if (activity instanceof Normal) {
Normal actNorm = (Normal) activity;
Artifact artifact = artifactCon.getTheArtifact();
if (artifact != null) {
if (this.hasTheFromArtifact(actNorm, artifactCon.getTheArtifact())) {
Collection invArts = actNorm.getInvolvedArtifactToNormal();
Iterator iter = invArts.iterator();
InvolvedArtifact inv = null;
while (iter.hasNext()) {
InvolvedArtifact aux = (InvolvedArtifact) iter.next();
if (aux.getTheArtifact().equals(artifact)) {
inv = aux;
break;
}
}
if (inv != null) {
inv.removeFromInInvolvedArtifacts();
if (this.hasInvolvedAgents(actNorm))
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtRemInpArtf"), artifact.getId(),
DynamicModelingImpl.DEL, artifact.getClass(), artifact.getIdent(), DynamicModelingImpl.DIRECTION_IN); //$NON-NLS-1$
}
}
}
}
// Persistence Operations
artConDAO.update(artifactCon);
actDAO.update(activity);
return artifactCon.getId();
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheArtfConn") //$NON-NLS-1$
+ artifactCon.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcCannBeResolv") //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcIsNOtReadyToIn") + activity.getIdent() + "!"); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheArtfConn") //$NON-NLS-1$
+ artifactCon.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcCannBeResolv") //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcItsSourceEnact")); //$NON-NLS-1$
}
}
@Override
public Integer defineInput_ArtifactConnection_Multiple(String con_id, String mcon_id) throws DAOException, ModelingException {
return defineInput_ArtifactConnection_Multiple_Internal(con_id, mcon_id).getId();
}
	private ArtifactCon defineInput_ArtifactConnection_Multiple_Internal(String con_id, String mcon_id) throws DAOException, ModelingException { // returns the ArtifactCon
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object multi;
try {
multi = multiDAO.retrieveBySecondaryKey(mcon_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
mcon_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + mcon_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon = (MultipleCon) multi;
// End Checks for the parameters
// Now we start the implementation of the rules
Artifact artifact = artifactCon.getTheArtifact();
ArtifactType artifactType = artifactCon.getTheArtifactType();
if (!multipleCon.getFromArtifactCons().contains(artifactCon) || !artifactCon.getToMultipleCon().contains(multipleCon)) {
multipleCon.getFromArtifactCons().add(artifactCon); // Rule G2.21
artifactCon.getToMultipleCon().add(multipleCon);
Collection suc = this.getSuccessors(multipleCon);
suc.remove(null);
if (!suc.isEmpty()) {
Iterator iter = suc.iterator();
while (iter.hasNext()) {
Object obj = (Object) iter.next();
if (obj instanceof Activity) {
Activity activity = (Activity) obj;
if (activity instanceof Normal) {
Normal actNorm = (Normal) obj;
if (!this.hasTheFromArtifact(actNorm, artifact)) {
String state = actNorm.getTheEnactionDescription().getState();
if (state.equals("")) { // Rule G.22 //$NON-NLS-1$
InvolvedArtifact invArt = new InvolvedArtifact();
invArt.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(invArt);
if (artifact != null) {
invArt.setTheArtifact(artifact);
artifact.getTheInvolvedArtifacts().add(invArt);
}
invArt.setTheArtifactType(artifactType);
artifactType.getTheInvolvedArtifacts().add(invArt);
} else if (!state.equals(Plain.FINISHED) && !state.equals(Plain.FAILED) && !state.equals(Plain.CANCELED)) { // Rule
// G2.23
InvolvedArtifact invArt = new InvolvedArtifact();
invArt.setInInvolvedArtifacts(actNorm);
actNorm.getInvolvedArtifactToNormal().add(invArt);
if (artifact != null) {
invArt.setTheArtifact(artifact);
artifact.getTheInvolvedArtifacts().add(invArt);
}
invArt.setTheArtifactType(artifactType);
artifactType.getTheInvolvedArtifacts().add(invArt);
if (this.hasInvolvedAgents(actNorm))
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtNewInpArtf"),
artifact.getId(), DynamicModelingImpl.ADD, artifact.getClass(), artifact.getIdent(),
DynamicModelingImpl.DIRECTION_IN); //$NON-NLS-1$
}
}
} else if (activity instanceof Decomposed) { // Rule
// G2.24
Decomposed actDec = (Decomposed) obj;
ProcessModel refProcModel = actDec.getTheReferedProcessModel();
if ((refProcModel != null) && !refProcModel.getTheConnection().contains(artifactCon)) {
String state = refProcModel.getPmState();
if (!state.equals("") //$NON-NLS-1$
&& !state.equals(ProcessModel.FINISHED)
&& !state.equals(ProcessModel.FAILED)
&& !state.equals(ProcessModel.CANCELED)) {
ArtifactCon newArtifactCon = new ArtifactCon();
newArtifactCon.setIdent(actDec.getIdent());
if (artifact != null) { // the connection may carry only the artifact type, not an instance
									newArtifactCon.setTheArtifact(artifact);
									artifact.getTheArtifactCon().add(newArtifactCon);
								}
newArtifactCon.setTheArtifactType(artifactType);
artifactType.getTheArtifactCon().add(newArtifactCon);
refProcModel.getTheConnection().add(newArtifactCon);
newArtifactCon.setTheProcessModel(refProcModel);
}
}
}
} else if (obj instanceof MultipleCon) { // Rule G2.25
MultipleCon newMulti = (MultipleCon) obj;
					// Calling this method recursively.
this.defineInput_ArtifactConnection_Multiple(artifactCon.getIdent(), newMulti.getIdent());
}
}
}
}
// Persistence Operations
artConDAO.update(artifactCon);
multiDAO.update(multipleCon);
return artifactCon;
}
@Override
	public Integer removeInput_ArtifactConnection_Multiple(String con_id, String multi_id) throws DAOException, ModelingException { // returns the ArtifactCon Oid
// Checks for the parameters
Object artCon;
try {
artCon = artConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (artCon == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcArtfConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ArtifactCon artifactCon = (ArtifactCon) artCon;
Object multi;
try {
multi = multiDAO.retrieveBySecondaryKey(multi_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccArtfConn") + //$NON-NLS-1$
multi_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + multi_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon = (MultipleCon) multi;
// End Checks for the parameters
// Now we start the implementation of the rules
if (multipleCon.getFromArtifactCons().contains(artifactCon) || artifactCon.getToMultipleCon().contains(multipleCon)) {
multipleCon.getFromArtifactCons().remove(artifactCon);
artifactCon.getToMultipleCon().remove(multipleCon);
// Persistence Operations
artConDAO.update((ArtifactCon) artCon);
multiDAO.update(multipleCon);
return artifactCon.getId();
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheArtfConn") //$NON-NLS-1$
+ artifactCon.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcCannBeResolv") //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcItsSourNotContains")); //$NON-NLS-1$
}
}
@Override
public void deleteConnection(String con_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object con;
try {
con = conDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (con == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcConnection") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Connection connection = (Connection) con;
// End Checks for the parameters
// Now we start the implementation of the rules
ProcessModel pmodel = connection.getTheProcessModel();
String pmState = pmodel.getPmState();
if (pmState.equals(ProcessModel.REQUIREMENTS) || pmState.equals(ProcessModel.ABSTRACT) || pmState.equals(ProcessModel.INSTANTIATED)
|| pmState.equals(ProcessModel.ENACTING)) { // Rule G2.27
if (connection instanceof Sequence) {
Sequence sequence = (Sequence) connection;
Activity to = sequence.getToActivities();
if (to != null) {
String state = this.getState(to);
if (to instanceof Normal) { // Rule G2.31
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
to.getFromSimpleCon().remove(sequence);
sequence.setToActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (to instanceof Automatic) { // Rule G2.32
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
to.getFromSimpleCon().remove(sequence);
sequence.setToActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (to instanceof Decomposed) { // Rule G2.33
to.getFromSimpleCon().remove(sequence);
sequence.setToActivity(null);
}
}
Activity from = sequence.getFromActivity();
if (from != null) {
String state = this.getState(from);
if (from instanceof Normal) { // Rule G2.28
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
from.getToSimpleCon().remove(sequence);
sequence.setFromActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (from instanceof Automatic) { // Rule G2.29
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
from.getToSimpleCon().remove(sequence);
sequence.setFromActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (from instanceof Decomposed) { // Rule G2.30
from.getToSimpleCon().remove(sequence);
sequence.setFromActivity(null);
}
}
} else if (connection instanceof Feedback) {
Feedback feedback = (Feedback) connection;
Activity to = feedback.getToActivities();
if (to != null) {
String state = this.getState(to);
if (to instanceof Normal) { // Rule G2.31
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
to.getFromSimpleCon().remove(feedback);
feedback.setToActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (to instanceof Automatic) { // Rule G2.32
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
to.getFromSimpleCon().remove(feedback);
feedback.setToActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (to instanceof Decomposed) { // Rule G2.33
to.getFromSimpleCon().remove(feedback);
feedback.setToActivity(null);
}
}
Activity from = feedback.getFromActivity();
if (from != null) {
String state = this.getState(from);
if (from instanceof Normal) { // Rule G2.28
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
from.getToSimpleCon().remove(feedback);
feedback.setFromActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (from instanceof Automatic) { // Rule G2.29
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
from.getToSimpleCon().remove(feedback);
feedback.setFromActivity(null);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcNotDelConAlrStart"));
}
} else if (from instanceof Decomposed) { // Rule G2.30
from.getToSimpleCon().remove(feedback);
feedback.setFromActivity(null);
}
}
PolCondition polCondition = feedback.getTheCondition();
if (polCondition != null) {
feedback.removeFromTheCondition();
polConditionDAO.daoDelete(polCondition);
}
} else if (connection instanceof JoinCon) {
JoinCon join = (JoinCon) connection;
// Deleting Successors and Predecessors instances of Activity
Activity to = join.getToActivities();
if (to != null) {
String state = this.getState(to);
if (to instanceof Normal) { // Rule G2.37
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
to.getFromJoin().remove(join);
join.setToActivity(null);
}
} else if (to instanceof Automatic) { // Rule G2.38
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
to.getFromJoin().remove(join);
join.setToActivity(null);
}
} else if (to instanceof Decomposed) { // Rule G2.39
to.getFromJoin().remove(join);
join.setToActivity(null);
}
}
Collection froms = join.getFromActivity();
Iterator iterFroms = froms.iterator();
				// Auxiliary collection
Collection aux = new HashSet();
while (iterFroms.hasNext()) {
Activity from = (Activity) iterFroms.next();
if (from != null) {
aux.add(from);
}
}
Iterator iterAux = aux.iterator();
while (iterAux.hasNext()) {
Activity from = (Activity) iterAux.next();
if (from != null) {
String state = this.getState(from);
if (from instanceof Normal) { // Rule G2.34
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
from.getToJoin().remove(join);
join.getFromActivity().remove(from);
}
} else if (from instanceof Automatic) { // Rule G2.35
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
from.getToJoin().remove(join);
join.getFromActivity().remove(from);
}
} else if (from instanceof Decomposed) { // Rule G2.36
from.getToJoin().remove(join);
join.getFromActivity().remove(from);
}
}
}
// Deleting Successors and Predecessors instances of Multiple
// Connection
MultipleCon toMC = join.getToMultipleCon();
if (toMC != null) { // Rule G2.40
if (!toMC.isFired().booleanValue()) {
if (toMC instanceof JoinCon) {
JoinCon toJoin = (JoinCon) toMC;
toJoin.getFromMultipleCon().remove(join);
join.setToMultipleCon(null);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) toMC;
toBranch.setFromMultipleConnection(null);
join.setToMultipleCon(null);
}
}
}
Collection fromsMC = join.getFromMultipleCon();
Iterator iterFromsMC = fromsMC.iterator();
// Auxiliary collection
Collection aux2 = new HashSet();
while (iterFromsMC.hasNext()) {
MultipleCon from = (MultipleCon) iterFromsMC.next();
if (from != null) {
aux2.add(from);
}
}
Iterator iterAux2 = aux2.iterator();
while (iterAux2.hasNext()) {
MultipleCon fromMC = (MultipleCon) iterAux2.next();
if (fromMC != null) { // Rule G2.41
if (!fromMC.isFired().booleanValue()) {
if (fromMC instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) fromMC;
fromJoin.setToMultipleCon(null);
join.getFromMultipleCon().remove(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) fromMC;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().remove(join);
join.getFromMultipleCon().remove(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
Collection fromBCTMCs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iterFromBCTMCs = fromBCTMCs.iterator();
while (iterFromBCTMCs.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iterFromBCTMCs.next();
if (bctmc.getTheMultipleCon().equals(join)) {
join.getFromMultipleCon().remove(fromBranchCond);
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
fromBranchCond.getTheBranchConCondToMultipleCon().remove(bctmc);
// Removing the BranchConCondToMultipleCon from the model
bctmcDAO.daoDelete(bctmc);
}
}
}
}
}
}
}
} else if (connection instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) connection;
Collection tos = branchAND.getToActivities();
Iterator iterTos = tos.iterator();
// Auxiliary collection
Collection aux = new HashSet();
while (iterTos.hasNext()) {
Activity to = (Activity) iterTos.next();
if (to != null) {
aux.add(to);
}
}
Iterator iterAux = aux.iterator();
while (iterAux.hasNext()) {
Activity to = (Activity) iterAux.next();
if (to != null) {
String state = this.getState(to);
if (to instanceof Normal) { // Rule G2.37
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
to.getFromBranchAND().remove(branchAND);
branchAND.getToActivities().remove(to);
}
} else if (to instanceof Automatic) { // Rule G2.38
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
to.getFromBranchAND().remove(branchAND);
branchAND.getToActivities().remove(to);
}
} else if (to instanceof Decomposed) { // Rule G2.39
to.getFromBranchAND().remove(branchAND);
branchAND.getToActivities().remove(to);
}
}
}
Activity from = branchAND.getFromActivity();
if (from != null) {
String state = this.getState(from);
if (from instanceof Normal) { // Rule G2.34
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY))
from.getToBranch().remove(branchAND);
} else if (from instanceof Automatic) { // Rule G2.35
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING))
from.getToBranch().remove(branchAND);
} else if (from instanceof Decomposed) // Rule G2.36
from.getToBranch().remove(branchAND);
}
// Deleting Successor and Predecessor instances of Multiple
// Connection
MultipleCon fromMC = branchAND.getFromMultipleConnection();
if (fromMC != null) { // Rule G2.40
if (!fromMC.isFired().booleanValue()) {
if (fromMC instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) fromMC;
fromJoin.setToMultipleCon(null);
branchAND.setFromMultipleConnection(null);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) fromMC;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().remove(branchAND);
branchAND.setFromMultipleConnection(null);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
Collection fromBCTMCs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iterFromBCTMCs = fromBCTMCs.iterator();
while (iterFromBCTMCs.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iterFromBCTMCs.next();
if (bctmc.getTheMultipleCon().equals(branchAND)) {
branchAND.setFromMultipleConnection(null);
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
fromBranchCond.getTheBranchConCondToMultipleCon().remove(bctmc);
// Removing the BranchConCondToMultipleCon from the model
bctmcDAO.daoDelete(bctmc);
}
}
}
}
}
}
Collection tosMC = branchAND.getToMultipleCon();
Iterator iterTosMC = tosMC.iterator();
// Auxiliary collection
Collection aux2 = new HashSet();
while (iterTosMC.hasNext()) {
MultipleCon toMulti = (MultipleCon) iterTosMC.next();
if (toMulti != null) {
aux2.add(toMulti);
}
}
Iterator iterAux2 = aux2.iterator();
while (iterAux2.hasNext()) {
MultipleCon toMC = (MultipleCon) iterAux2.next();
if (toMC != null) { // Rule G2.41
if (!toMC.isFired().booleanValue()) {
if (toMC instanceof JoinCon) {
JoinCon toJoin = (JoinCon) toMC;
branchAND.getToMultipleCon().remove(toJoin);
toJoin.getFromMultipleCon().remove(branchAND);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) toMC;
toBranch.setFromMultipleConnection(null);
branchAND.getToMultipleCon().remove(toBranch);
}
}
}
}
} else if (connection instanceof BranchConCond) {
BranchConCond branchCond = (BranchConCond) connection;
Collection bctas = branchCond.getTheBranchConCondToActivity();
Iterator iterBctas = bctas.iterator();
// Auxiliary collection
Collection aux = new HashSet();
while (iterBctas.hasNext()) {
BranchConCondToActivity to = (BranchConCondToActivity) iterBctas.next();
if (to != null) {
aux.add(to);
}
}
Iterator iterAux = aux.iterator();
while (iterAux.hasNext()) {
BranchConCondToActivity bcta = (BranchConCondToActivity) iterAux.next();
Activity to = bcta.getTheActivities();
if (to != null) {
String state = this.getState(to);
if (to instanceof Normal) { // Rule G2.37
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
to.getTheBranchConCondToActivity().remove(bcta);
bcta.setTheActivity(null);
bcta.setTheBranchConCond(null);
branchCond.getTheBranchConCondToActivity().remove(bcta);
}
} else if (to instanceof Automatic) { // Rule G2.38
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
to.getTheBranchConCondToActivity().remove(bcta);
bcta.setTheActivity(null);
bcta.setTheBranchConCond(null);
branchCond.getTheBranchConCondToActivity().remove(bcta);
}
} else if (to instanceof Decomposed) { // Rule G2.39
to.getTheBranchConCondToActivity().remove(bcta);
bcta.setTheActivity(null);
bcta.setTheBranchConCond(null);
branchCond.getTheBranchConCondToActivity().remove(bcta);
}
}
}
Activity from = branchCond.getFromActivity();
if (from != null) {
String state = this.getState(from);
if (from instanceof Normal) { // Rule G2.34
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
from.getToBranch().remove(branchCond);
branchCond.setFromActivity(null);
}
} else if (from instanceof Automatic) { // Rule G2.35
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING)) {
from.getToBranch().remove(branchCond);
branchCond.setFromActivity(null);
}
} else if (from instanceof Decomposed) { // Rule G2.36
from.getToBranch().remove(branchCond);
branchCond.setFromActivity(null);
}
}
// Deleting Successor and Predecessor instances of Multiple
// Connection
Collection bctmcs = branchCond.getTheBranchConCondToMultipleCon();
Iterator iterBctmcs = bctmcs.iterator();
// Auxiliary collection
Collection aux2 = new HashSet();
while (iterBctmcs.hasNext()) {
BranchConCondToMultipleCon fromBCTMC = (BranchConCondToMultipleCon) iterBctmcs.next();
if (fromBCTMC != null) {
aux2.add(fromBCTMC);
}
}
Iterator iterAux2 = aux2.iterator();
while (iterAux2.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iterAux2.next();
MultipleCon to = bctmc.getTheMultipleCon();
if (to != null) {
boolean fired = to.isFired().booleanValue();
if (!fired) { // Rule G2.40
if (to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) to;
toJoin.getFromMultipleCon().remove(bctmc.getTheBranchConCond());
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
branchCond.getTheBranchConCondToMultipleCon().remove(bctmc);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) to;
toBranch.setFromMultipleConnection(null);
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
branchCond.getTheBranchConCondToMultipleCon().remove(bctmc);
}
}
}
}
MultipleCon fromMC = branchCond.getFromMultipleConnection();
if (fromMC != null) {
boolean fired = fromMC.isFired().booleanValue();
if (!fired) { // Rule G2.41
if (fromMC instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) fromMC;
fromJoin.setToMultipleCon(null);
branchCond.setFromMultipleConnection(null);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) fromMC;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().remove(branchCond);
branchCond.setFromMultipleConnection(null);
} else {// is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
Collection branchCondToMultipleCons = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iterBranchCondToMultipleCons = branchCondToMultipleCons.iterator();
while (iterBranchCondToMultipleCons.hasNext()) {
BranchConCondToMultipleCon branchCondToMultipleCon = (BranchConCondToMultipleCon) iterBranchCondToMultipleCons.next();
if (branchCondToMultipleCon.getTheMultipleCon().equals(branchCond)) {
branchCond.setFromMultipleConnection(null);
branchCondToMultipleCon.setTheBranchConCond(null);
branchCondToMultipleCon.setTheMultipleCon(null);
fromBranchCond.getTheBranchConCondToMultipleCon().remove(branchCondToMultipleCon);
}
}
}
}
}
}
} else if (connection instanceof ArtifactCon) {
ArtifactCon artifactCon = (ArtifactCon) connection;
Collection actsTo = artifactCon.getToActivities();
Iterator iterActsTo = actsTo.iterator();
// Auxiliary collection
Collection aux = new HashSet();
while (iterActsTo.hasNext()) {
Activity to = (Activity) iterActsTo.next();
if (to != null) {
aux.add(to);
}
}
Iterator iterAux = aux.iterator();
while (iterAux.hasNext()) {
Activity act = (Activity) iterAux.next();
act.getFromArtifactCons().remove(artifactCon);
artifactCon.getToActivities().remove(act);
if (act instanceof Normal) {
Normal actNorm = (Normal) act;
Artifact artifact = artifactCon.getTheArtifact();
if (artifact != null) {
if (this.hasTheFromArtifact(actNorm, artifact)) {
Collection invArts = actNorm.getInvolvedArtifactToNormal();
Iterator iter = invArts.iterator();
InvolvedArtifact inv = null;
while (iter.hasNext()) {
InvolvedArtifact auxInv = (InvolvedArtifact) iter.next();
if (auxInv != null) {
if (auxInv.getTheArtifact() != null) {
if (auxInv.getTheArtifact().equals(artifact)) {
inv = auxInv;
break;
}
}
}
}
if (inv != null) {
inv.removeFromInInvolvedArtifacts();
Collection invAgents = this.getInvolvedAgents(actNorm);
Collection<String> ids = new HashSet<String>();
Iterator iterInvAgents = invAgents.iterator();
while (iterInvAgents.hasNext()) {
Agent agent = (Agent) iterInvAgents.next();
if (agent != null) {
ids.add(agent.getIdent());
}
}
if (!invAgents.isEmpty())
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtRemInpArtf"), ids,
artifact.getId(), DynamicModelingImpl.DEL, artifact.getClass(), artifact.getIdent(),
DynamicModelingImpl.DIRECTION_IN); //$NON-NLS-1$
}
}
}
}
}
Collection actsFrom = artifactCon.getFromActivity();
Iterator iterActsFrom = actsFrom.iterator();
// Auxiliary collection
Collection aux2 = new HashSet();
while (iterActsFrom.hasNext()) {
Activity from = (Activity) iterActsFrom.next();
if (from != null) {
aux2.add(from);
}
}
Iterator iterAux2 = aux2.iterator();
while (iterAux2.hasNext()) {
Activity act = (Activity) iterAux2.next();
act.getToArtifactCon().remove(artifactCon);
artifactCon.getFromActivity().remove(act);
if (act instanceof Normal) {
Normal actNorm = (Normal) act;
Artifact artifact = artifactCon.getTheArtifact();
if (artifact != null) {
if (this.hasTheToArtifact(actNorm, artifact)) {
Collection invArts = actNorm.getInvolvedArtifactFromNormal();
Iterator iter = invArts.iterator();
InvolvedArtifact inv = null;
while (iter.hasNext()) {
InvolvedArtifact auxInv = (InvolvedArtifact) iter.next();
if (auxInv != null) {
if (auxInv.getTheArtifact() != null) {
if (auxInv.getTheArtifact().equals(artifact)) {
inv = auxInv;
break;
}
}
}
}
if (inv != null) {
inv.removeFromOutInvolvedArtifacts();
Collection invAgents = this.getInvolvedAgents(actNorm);
Collection<String> ids = new HashSet<String>();
Iterator iterInvAgents = invAgents.iterator();
while (iterInvAgents.hasNext()) {
Agent agent = (Agent) iterInvAgents.next();
if (agent != null) {
ids.add(agent.getIdent());
}
}
if (!invAgents.isEmpty())
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgtRemOutArtf"), ids,
artifact.getId(), DynamicModelingImpl.DEL, artifact.getClass(), artifact.getIdent(),
DynamicModelingImpl.DIRECTION_OUT); //$NON-NLS-1$
}
}
}
}
}
Collection multisTo = artifactCon.getToMultipleCon();
Iterator iterMultisTo = multisTo.iterator();
// Auxiliary collection
Collection aux3 = new HashSet();
while (iterMultisTo.hasNext()) {
MultipleCon to = (MultipleCon) iterMultisTo.next();
if (to != null) {
aux3.add(to);
}
}
Iterator iterAux3 = aux3.iterator();
while (iterAux3.hasNext()) {
MultipleCon multiTo = (MultipleCon) iterAux3.next();
artifactCon.getToMultipleCon().remove(multiTo);
multiTo.getFromArtifactCons().remove(artifactCon);
}
}
pmodel.getTheConnection().remove(connection);
connection.setTheProcessModel(null);
// Persistence Operations
conDAO.daoDelete(connection);
// Dynamic Changes related code
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G2.27-G2.41"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
}
pmodelDAO.update(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheProcessModelState")); //$NON-NLS-1$
}
}
@Override
public Integer newInputArtifact(String level_id, Artifact artifact, Activity activity) throws WebapseeException, ModelingException { // returns the ArtifactCon Oid
Artifact artifactAux = (Artifact) artDAO.retrieveBySecondaryKey(artifact.getIdent());
String type_id = artifactAux.getTheArtifactType().getIdent();
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), type_id);
artifactCon = this.defineInstance_ArtifactConnection_Internal(artifactCon.getIdent(), artifact.getIdent());
artifactCon = this.defineInput_ArtifactConnection_Activity_Internal(artifactCon.getIdent(), activity.getIdent());
return artifactCon.getId();
}
// For the Flex editor
@Override
public String newInputArtifact(String level_id, String artifactIdent, String activityIdent) throws WebapseeException, ModelingException { // returns the ArtifactCon ident
Artifact artifactAux = (Artifact) artDAO.retrieveBySecondaryKey(artifactIdent);
Activity activity = actDAO.retrieveBySecondaryKey(activityIdent);
String type_id = artifactAux.getTheArtifactType().getIdent();
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
System.out.println("1-ArtifactCon: " + artifactCon.getIdent() + "; " + artifactCon.getTheArtifact());
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), type_id);
System.out.println("2-ArtifactCon: " + artifactCon.getIdent() + "; " + artifactCon.getTheArtifact());
artifactCon = this.defineInstance_ArtifactConnection_Internal(artifactCon.getIdent(), artifactIdent);
System.out.println("3-ArtifactCon: " + artifactCon.getIdent() + "; " + artifactCon.getTheArtifact());
artifactCon = this.defineInput_ArtifactConnection_Activity_Internal(artifactCon.getIdent(), activity.getIdent());
System.out.println("4-ArtifactCon: " + artifactCon.getIdent() + "; " + artifactCon.getTheArtifact());
activity = actDAO.retrieveBySecondaryKey(activityIdent);
System.out.println("------------ From");
for (ArtifactCon con : activity.getFromArtifactCons()) {
System.out.println(con.getTheArtifact().getIdent());
}
System.out.println("------------ To");
for (ArtifactCon con : activity.getToArtifactCons()) {
System.out.println(con.getTheArtifact().getIdent());
}
return artifactCon.getIdent();
}
// For the Flex editor
@Override
public String newOutputArtifact(String level_id, String artifactIdent, String activityIdent) throws WebapseeException, ModelingException {
// returns the ArtifactCon ident
Artifact artifactAux = (Artifact) artDAO.retrieveBySecondaryKey(artifactIdent);
Activity activity = actDAO.retrieveBySecondaryKey(activityIdent);
String type_id = artifactAux.getTheArtifactType().getIdent();
ArtifactCon artifactCon = artConDAO.retrieveBySecondaryKey(level_id);
if(artifactCon == null) artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), type_id);
artifactCon = this.defineInstance_ArtifactConnection_Internal(artifactCon.getIdent(), artifactIdent);
artifactCon = this.defineOutput_ArtifactConnection_Internal(artifactCon.getIdent(), activity.getIdent());
return artifactCon.getIdent();
}
@Override
public Integer newInputArtifact(String level_id, ArtifactType artifactType, Activity activity) throws WebapseeException, ModelingException { // returns the ArtifactCon Oid
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), artifactType.getIdent());
artifactCon = this.defineInput_ArtifactConnection_Activity_Internal(artifactCon.getIdent(), activity.getIdent());
return artifactCon.getId();
}
@Override
public Integer newInputArtifact(String level_id, Artifact artifact, MultipleCon multipleCon) throws WebapseeException, ModelingException { // returns the ArtifactCon Oid
Artifact artifactAux = (Artifact) artDAO.retrieveBySecondaryKey(artifact.getIdent());
String type_id = artifactAux.getTheArtifactType().getIdent();
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), type_id);
artifactCon = this.defineInstance_ArtifactConnection_Internal(artifactCon.getIdent(), artifact.getIdent());
artifactCon = this.defineInput_ArtifactConnection_Multiple_Internal(artifactCon.getIdent(), multipleCon.getIdent());
return artifactCon.getId();
}
@Override
public Integer newInputArtifact(String level_id, ArtifactType artifactType, MultipleCon multipleCon) throws WebapseeException, ModelingException { // returns the ArtifactCon Oid
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), artifactType.getIdent());
artifactCon = this.defineInput_ArtifactConnection_Multiple_Internal(artifactCon.getIdent(), multipleCon.getIdent());
return artifactCon.getId();
}
@Override
public Integer newOutputArtifact(String level_id, Artifact artifact, Activity activity) throws WebapseeException, ModelingException { // returns the ArtifactCon Oid
Artifact artifactAux = (Artifact) artDAO.retrieveBySecondaryKey(artifact.getIdent());
String type_id = artifactAux.getTheArtifactType().getIdent();
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), type_id);
artifactCon = this.defineInstance_ArtifactConnection_Internal(artifactCon.getIdent(), artifact.getIdent());
artifactCon = this.defineOutput_ArtifactConnection_Internal(artifactCon.getIdent(), activity.getIdent());
return artifactCon.getId();
}
@Override
public Integer newOutputArtifact(String level_id, ArtifactType artifactType, Activity activity) throws WebapseeException, ModelingException { // returns the ArtifactCon Oid
ArtifactCon artifactCon = this.newArtifactConnection_Internal(level_id);
artifactCon = this.defineType_ArtifactConnection_Internal(artifactCon.getIdent(), artifactType.getIdent());
artifactCon = this.defineOutput_ArtifactConnection_Internal(artifactCon.getIdent(), activity.getIdent());
return artifactCon.getId();
}
/**
* Related to section 3
*/
@Override
public Integer addSimpleConnection(String act_id_from, String act_id_to, String dependency) throws DAOException, WebapseeException { // returns the Sequence Oid
// Checks for the parameters
Object act_from;
try {
act_from = actDAO.retrieveBySecondaryKey(act_id_from);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id_from + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act_from == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id_from + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_from = (Activity) act_from;
Object act_to;
try {
act_to = actDAO.retrieveBySecondaryKey(act_id_to);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id_to + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act_to == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id_to + Messages.getString("facades.DynamicModeling.DaoExc_NotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_to = (Activity) act_to;
if (activity_from.equals(activity_to))
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheSourceDestinyConn")); //$NON-NLS-1$
// End Checks for the parameters
// Now we start the implementation of the rules
String state_from = this.getState(activity_from);
String state_to = this.getState(activity_to);
ProcessModel pmodel = activity_to.getTheProcessModel();
// Rule G3.1
if (((state_from.equals("") && (state_to.equals("")) //$NON-NLS-1$ //$NON-NLS-2$
|| (state_from.equals("") && state_to.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
|| (state_from.equals(ProcessModel.REQUIREMENTS) && state_to.equals("")) //$NON-NLS-1$
|| (state_from.equals(ProcessModel.REQUIREMENTS) && state_to.equals(ProcessModel.REQUIREMENTS))))
&& !this.controlFlow(activity_to, activity_from) && !this.areConnected(activity_from, activity_to)) {
Dependency dep = new Dependency();
dep.setKindDep(dependency);
Sequence seq = new Sequence();
seq.setIdent(this.getPath(act_id_to));
seq.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(seq);
seq.setTheDependency(dep);
dep.setTheSequence(seq);
seq.setFromActivity(activity_from);
seq.setToActivity(activity_to);
activity_from.getToSimpleCon().add(seq);
activity_to.getFromSimpleCon().add(seq);
Process process = this.getTheProcess(pmodel);
if (process.getpStatus().name().equals(Process.ENACTING)) {
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.1"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
}
// Persistence Operations
conDAO.daoSave(seq);
return seq.getId();
}
// Rule G3.2
else if (((!(state_from.equals(Plain.CANCELED) || state_from.equals(ProcessModel.CANCELED)) && !(state_from.equals(Plain.FAILED) || state_from
.equals(ProcessModel.FAILED))) && (state_to.equals("") || state_to.equals(Plain.WAITING) || state_to.equals(ProcessModel.REQUIREMENTS) //$NON-NLS-1$
|| state_to.equals(ProcessModel.ABSTRACT) || state_to.equals(ProcessModel.INSTANTIATED)))
&& !this.controlFlow(activity_to, activity_from) && !this.areConnected(activity_from, activity_to)) {
if (dependency.equals("end-start"))
this.makeWaiting(activity_to, "Rule G3.2");
Dependency dep = new Dependency();
dep.setKindDep(dependency);
Sequence seq = new Sequence();
seq.setIdent(this.getPath(act_id_to));
seq.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(seq);
seq.setTheDependency(dep);
dep.setTheSequence(seq);
seq.setFromActivity(activity_from);
activity_from.getToSimpleCon().add(seq);
seq.setToActivity(activity_to);
activity_to.getFromSimpleCon().add(seq);
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.2"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
// Persistence Operations
conDAO.daoSave(seq);
return seq.getId();
}
// Rule G3.3
else if (((state_from.equals(Plain.FINISHED) || state_from.equals(ProcessModel.FINISHED)) && (!(state_to.equals(Plain.CANCELED) || state_to
.equals(ProcessModel.CANCELED)) && !(state_to.equals(Plain.FAILED) || state_to.equals(ProcessModel.FAILED))))
&& !this.controlFlow(activity_to, activity_from) && !this.areConnected(activity_from, activity_to)) {
Dependency dep = new Dependency();
dep.setKindDep(dependency);
Sequence seq = new Sequence();
seq.setIdent(this.getPath(act_id_to));
seq.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(seq);
seq.setTheDependency(dep);
dep.setTheSequence(seq);
seq.setFromActivity(activity_from);
seq.setToActivity(activity_to);
activity_from.getToSimpleCon().add(seq);
activity_to.getFromSimpleCon().add(seq);
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.3"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
// Persistence Operations
conDAO.daoSave(seq);
return seq.getId();
}
// Rule G3.4
else if (((!(state_from.equals(Plain.CANCELED) || state_from.equals(ProcessModel.CANCELED)) && !(state_from.equals(Plain.FAILED) || state_from
.equals(ProcessModel.FAILED))) && (state_to.equals(Plain.READY) || state_to.equals(Plain.ACTIVE) || state_to.equals(Plain.PAUSED)
|| state_to.equals(ProcessModel.ENACTING) || state_to.equals(ProcessModel.INSTANTIATED)))
&& dependency.equals("end-end") && !this.controlFlow(activity_to, activity_from) //$NON-NLS-1$
&& !this.areConnected(activity_from, activity_to)) {
Dependency dep = new Dependency();
dep.setKindDep(dependency);
Sequence seq = new Sequence();
seq.setIdent(this.getPath(act_id_to));
seq.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(seq);
seq.setTheDependency(dep);
dep.setTheSequence(seq);
seq.setFromActivity(activity_from);
seq.setToActivity(activity_to);
activity_from.getToSimpleCon().add(seq);
activity_to.getFromSimpleCon().add(seq);
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.4"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
// Persistence Operations
conDAO.daoSave(seq);
return seq.getId();
}
// Rule G3.5
else if (((state_from.equals(Plain.ACTIVE) || state_from.equals(Plain.PAUSED) || state_from.equals(ProcessModel.ENACTING)) && (state_to
.equals(Plain.READY) || state_to.equals(ProcessModel.INSTANTIATED))) && dependency.equals("start-start") //$NON-NLS-1$
&& !this.controlFlow(activity_to, activity_from) && !this.areConnected(activity_from, activity_to)) {
Dependency dep = new Dependency();
dep.setKindDep(dependency);
Sequence seq = new Sequence();
seq.setIdent(this.getPath(act_id_to));
seq.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(seq);
seq.setTheDependency(dep);
dep.setTheSequence(seq);
seq.setFromActivity(activity_from);
seq.setToActivity(activity_to);
activity_from.getToSimpleCon().add(seq);
activity_to.getFromSimpleCon().add(seq);
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.5"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
// Persistence Operations
conDAO.daoSave(seq);
return seq.getId();
}
// Rule G3.6
else if ((!(state_from.equals(Plain.CANCELED) || state_from.equals(ProcessModel.CANCELED))
&& !(state_from.equals(Plain.FAILED) || state_from.equals(ProcessModel.FAILED))
&& !(state_from.equals(Plain.FINISHED) || state_from.equals(ProcessModel.FINISHED)) && (state_to.equals(ProcessModel.INSTANTIATED) || state_to
.equals(Plain.READY))) // (activity_to instanceof Normal &&
// state_to.equals(Plain.READY)))
&& (dependency.equals("end-start") || dependency.equals("start-start")) //$NON-NLS-1$ //$NON-NLS-2$
&& !this.controlFlow(activity_to, activity_from) && !this.areConnected(activity_from, activity_to)) {
if (dependency.equals("end-start"))
this.makeWaiting(activity_to, "Rule G3.6");
Dependency dep = new Dependency();
dep.setKindDep(dependency);
Sequence seq = new Sequence();
seq.setIdent(this.getPath(act_id_to));
seq.setTheProcessModel(activity_to.getTheProcessModel());
pmodel.getTheConnection().add(seq);
seq.setTheDependency(dep);
dep.setTheSequence(seq);
seq.setFromActivity(activity_from);
seq.setToActivity(activity_to);
activity_from.getToSimpleCon().add(seq);
activity_to.getFromSimpleCon().add(seq);
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.6"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
// Persistence Operations
conDAO.daoSave(seq);
return seq.getId();
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheCombinActvState")); //$NON-NLS-1$
}
}
@Override
public Integer addFeedbackConnection(String act_id_from, String act_id_to) throws DAOException, ModelingException, ModelingWarning { // returns the Feedback Oid
// Checks for the parameters
Object act_from;
try {
act_from = actDAO.retrieveBySecondaryKey(act_id_from);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id_from + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act_from == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id_from + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_from = (Activity) act_from;
Object act_to;
try {
act_to = actDAO.retrieveBySecondaryKey(act_id_to);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
act_id_to + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act_to == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id_to + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_to = (Activity) act_to;
// End Checks for the parameters
// Now we start the implementation of the rules
String state_from = this.getState(activity_from);
String state_to = this.getState(activity_to);
ProcessModel pmodel = activity_to.getTheProcessModel();
// Rule G3.7
if (((state_from.equals("") && state_to.equals("")) //$NON-NLS-1$ //$NON-NLS-2$
|| (state_from.equals("") && state_to.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
|| (state_from.equals(ProcessModel.REQUIREMENTS) && state_to.equals("")) //$NON-NLS-1$
|| (state_from.equals(ProcessModel.REQUIREMENTS) && state_to.equals(ProcessModel.REQUIREMENTS)))) {
boolean cf = this.controlFlow(activity_to, activity_from);
if (!cf) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlowBetw")); //$NON-NLS-1$
}
Feedback feed = new Feedback();
feed.setIdent(this.getPath(act_id_to));
feed.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(feed);
feed.setFromActivity(activity_from);
activity_from.getToSimpleCon().add(feed);
feed.setToActivity(activity_to);
activity_to.getFromSimpleCon().add(feed);
// Persistence Operations
conDAO.daoSave(feed);
return feed.getId();
} else if ((!(state_from.equals(Plain.FINISHED) || state_from.equals(ProcessModel.FINISHED))
&& !(state_from.equals(Plain.CANCELED) || state_from.equals(ProcessModel.CANCELED)) && !(state_from.equals(Plain.FAILED) || state_from
.equals(ProcessModel.FAILED)))) {
boolean cf = this.controlFlow(activity_to, activity_from);
if (!cf) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlowBetw")); //$NON-NLS-1$
}
Feedback feed = new Feedback();
feed.setIdent(this.getPath(act_id_to));
feed.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(feed);
feed.setFromActivity(activity_from);
activity_from.getToSimpleCon().add(feed);
feed.setToActivity(activity_to);
activity_to.getFromSimpleCon().add(feed);
// Persistence Operations
conDAO.daoSave(feed);
return feed.getId();
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheCombinActvStates")); //$NON-NLS-1$
}
}
/**
 * Creates a new AND join connection in the process model identified by level_id.
 * Returns the Oid of the new JoinCon.
 */
@Override
public Integer newJoinANDCon(String dependency, String level_id) throws DAOException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new connection is not in the root process model
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
Dependency dep = new Dependency();
dep.setKindDep(dependency);
JoinCon joinAND = new JoinCon();
joinAND.setIdent(level_id);
joinAND.setFired(Boolean.FALSE);
joinAND.setKindJoinCon("AND"); //$NON-NLS-1$
joinAND.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(joinAND);
joinAND.setTheDependency(dep);
dep.setTheMultipleCon(joinAND);
// Persistence Operations
conDAO.daoSave(joinAND);
return joinAND.getId();
}
/**
 * Creates a new OR join connection in the process model identified by level_id.
 * Returns the Oid of the new JoinCon.
 */
@Override
public Integer newJoinConOR(String dependency, String level_id) throws DAOException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new connection is not in the root process model
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
Dependency dep = new Dependency();
dep.setKindDep(dependency);
JoinCon joinOR = new JoinCon();
joinOR.setIdent(level_id);
joinOR.setFired(Boolean.FALSE);
joinOR.setKindJoinCon("OR"); //$NON-NLS-1$
joinOR.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(joinOR);
joinOR.setTheDependency(dep);
dep.setTheMultipleCon(joinOR);
// Persistence Operations
conDAO.daoSave(joinOR);
return joinOR.getId();
}
/**
 * Creates a new XOR join connection in the process model identified by level_id.
 * Returns the Oid of the new JoinCon.
 */
@Override
public Integer newJoinConXOR(String dependency, String level_id) throws DAOException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new connection is not in the root process model
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
Dependency dep = new Dependency();
dep.setKindDep(dependency);
JoinCon joinXOR = new JoinCon();
joinXOR.setIdent(level_id);
joinXOR.setFired(Boolean.FALSE);
joinXOR.setKindJoinCon("XOR"); //$NON-NLS-1$
joinXOR.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(joinXOR);
joinXOR.setTheDependency(dep);
dep.setTheMultipleCon(joinXOR);
// Persistence Operations
conDAO.daoSave(joinXOR);
return joinXOR.getId();
}
/**
 * Creates a new AND branch connection in the process model identified by level_id.
 * Returns the Oid of the new BranchANDCon.
 */
@Override
public Integer newBranchANDCon(String dependency, String level_id) throws DAOException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new connection is not in the root process model
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
Dependency dep = new Dependency();
dep.setKindDep(dependency);
BranchANDCon branchAND = new BranchANDCon();
branchAND.setIdent(level_id);
branchAND.setFired(Boolean.FALSE);
branchAND.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(branchAND);
branchAND.setTheDependency(dep);
dep.setTheMultipleCon(branchAND);
// Persistence Operations
conDAO.daoSave(branchAND);
return branchAND.getId();
}
/**
 * Creates a new OR branch connection in the process model identified by level_id.
 * Returns the Oid of the new BranchConCond.
 */
@Override
public Integer newBranchConOR(String dependency, String level_id) throws DAOException {
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new connection is not in the root process model
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
Dependency dep = new Dependency();
dep.setKindDep(dependency);
BranchConCond branchOR = new BranchConCond();
branchOR.setIdent(level_id);
branchOR.setFired(Boolean.FALSE);
branchOR.setKindBranchCon("OR"); //$NON-NLS-1$
branchOR.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(branchOR);
branchOR.setTheDependency(dep);
dep.setTheMultipleCon(branchOR);
// Persistence Operations
conDAO.daoSave(branchOR);
return branchOR.getId();
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#newBranchConXOR(
* java.lang.String, java.lang.String)
*/
@Override
public Integer newBranchConXOR(String dependency, String level_id) throws DAOException { // returns the Oid of the new BranchConCond
// Checks for the parameters
StringTokenizer st = new StringTokenizer(level_id, "."); //$NON-NLS-1$
String process_id = st.nextToken();
Object proc;
try {
proc = procDAO.retrieveBySecondaryKey(process_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccProcess") + //$NON-NLS-1$
process_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (proc == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcProcess") + process_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Process process = (Process) proc;
ProcessModel pmodel = null;
Decomposed actDecomposed = null; // used only if the new connection is not in the root process model
if (st.hasMoreTokens()) {
String currentModel = process_id;
while (st.hasMoreTokens()) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
}
Object dec = null;
try {
dec = decDAO.retrieveBySecondaryKey(currentModel);
} catch (Exception/* DAOException */e1) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActiv") + //$NON-NLS-1$
currentModel + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e1); //$NON-NLS-1$
}
if (dec == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDecomActv") + currentModel //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$
actDecomposed = (Decomposed) dec;
pmodel = actDecomposed.getTheReferedProcessModel();
} else {
pmodel = process.getTheProcessModel();
}
// End Checks for the parameters
// Now we start the implementation of the rules
Dependency dep = new Dependency();
dep.setKindDep(dependency);
BranchConCond branchXOR = new BranchConCond();
branchXOR.setIdent(level_id);
branchXOR.setFired(Boolean.FALSE);
branchXOR.setKindBranchCon("XOR"); //$NON-NLS-1$
branchXOR.setTheProcessModel(pmodel);
pmodel.getTheConnection().add(branchXOR);
branchXOR.setTheDependency(dep);
dep.setTheMultipleCon(branchXOR);
// Persistence Operations
conDAO.daoSave(branchXOR);
return branchXOR.getId();
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeMultiConPredecessorConnection(java.lang.String, java.lang.String)
*/
@Override
public void removeMultiConPredecessorConnection(String con_id, String from_con_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object multi;
try {
multi = multiDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon = (MultipleCon) multi;
Object from_multi;
try {
from_multi = multiDAO.retrieveBySecondaryKey(from_con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
from_con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (from_multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + from_con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon_from = (MultipleCon) from_multi;
// End Checks for the parameters
// Now we start the implementation of the rules
if (multipleCon_from instanceof BranchCon) {
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
if (multipleCon instanceof JoinCon) {
JoinCon join = (JoinCon) multipleCon;
join.getFromMultipleCon().remove(fromBranchAND);
fromBranchAND.getToMultipleCon().remove(join);
} else {
BranchCon branch = (BranchCon) multipleCon;
branch.setFromMultipleConnection(null);
fromBranchAND.getToMultipleCon().remove(branch);
}
} else if (fromBranch instanceof BranchConCond) {
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
if (multipleCon instanceof JoinCon) {
JoinCon join = (JoinCon) multipleCon;
Collection fromBCTMCs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iter2 = fromBCTMCs.iterator();
while (iter2.hasNext()) {
BranchConCondToMultipleCon fromBCTMC = (BranchConCondToMultipleCon) iter2.next();
if (fromBCTMC.getTheMultipleCon().equals(join)) {
fromBCTMC.setTheMultipleCon(null);
fromBCTMC.setTheBranchConCond(null);
fromBCTMCs.remove(fromBCTMC);
join.getFromMultipleCon().remove(fromBranchCond);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.15"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
break;
}
}
} else {
BranchCon branch = (BranchCon) multipleCon;
Collection fromBCTMCs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iter2 = fromBCTMCs.iterator();
while (iter2.hasNext()) {
BranchConCondToMultipleCon fromBCTMC = (BranchConCondToMultipleCon) iter2.next();
if (fromBCTMC.getTheMultipleCon().equals(branch)) {
fromBCTMC.setTheMultipleCon(null);
fromBCTMC.setTheBranchConCond(null);
fromBCTMCs.remove(fromBCTMC);
branch.setFromMultipleConnection(null);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.15"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
break;
}
}
}
}
} else if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
if (multipleCon instanceof JoinCon) {
JoinCon join = (JoinCon) multipleCon;
join.getFromMultipleCon().remove(fromJoin);
fromJoin.setToMultipleCon(null);
} else { // is BranchCon
BranchCon branch = (BranchCon) multipleCon;
branch.setFromMultipleConnection(null);
fromJoin.setToMultipleCon(null);
}
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.15"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
}
// Persistence Operations
multiDAO.update(multipleCon);
multiDAO.update(multipleCon_from);
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeMultiConPredecessorActivity(java.lang.String, java.lang.String)
*/
@Override
public void removeMultiConPredecessorActivity(String con_id, String act_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object multi;
try {
multi = multiDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon = (MultipleCon) multi;
Object from_act;
try {
from_act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (from_act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_from = (Activity) from_act;
// End Checks for the parameters
// Now we start the implementation of the rules
if (multipleCon instanceof BranchCon) {
BranchCon branch = (BranchCon) multipleCon;
activity_from.getToBranch().remove(branch);
branch.setFromActivity(null);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.16"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
} else if (multipleCon instanceof JoinCon) {
JoinCon join = (JoinCon) multipleCon;
activity_from.getToJoin().remove(join);
join.getFromActivity().remove(activity_from);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.16"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
}
// Persistence Operations
multiDAO.update(multipleCon);
actDAO.update(activity_from);
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeMultiConSuccessorConnection(java.lang.String, java.lang.String)
*/
@Override
public void removeMultiConSuccessorConnection(String con_id, String to_con_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object multi;
try {
multi = multiDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon = (MultipleCon) multi;
Object to_multi;
try {
to_multi = multiDAO.retrieveBySecondaryKey(to_con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
to_con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (to_multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + to_con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon_to = (MultipleCon) to_multi;
// End Checks for the parameters
// Now we start the implementation of the rules
if (multipleCon instanceof BranchCon) {
BranchCon branch = (BranchCon) multipleCon;
if (branch instanceof BranchConCond) { // Rule G5.33
BranchConCond branchCond = (BranchConCond) branch;
Collection bctmcs = branchCond.getTheBranchConCondToMultipleCon();
Iterator iter = bctmcs.iterator();
while (iter.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iter.next();
if (bctmc.getTheMultipleCon().equals(multipleCon_to)) {
bctmc.setTheMultipleCon(null);
//bctmc.setCondition(""); //$NON-NLS-1$
if (multipleCon_to instanceof BranchCon) {
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(null);
} else if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().remove(multipleCon);
}
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.33"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
break;
}
}
} else {
BranchANDCon branchAND = (BranchANDCon) branch;
if (multipleCon_to instanceof JoinCon) {
JoinCon joinTo = (JoinCon) multipleCon_to;
joinTo.getFromMultipleCon().remove(branchAND);
branchAND.getToMultipleCon().remove(joinTo);
} else { // is BranchCon
BranchCon branchTo = (BranchCon) multipleCon_to;
branchTo.setFromMultipleConnection(null);
branchAND.getToMultipleCon().remove(branchTo);
}
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.17"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/*
* ,
* currentSession
*/);
this.enactmentEngine.determineProcessModelStates(pmodel/*
* ,
* currentSession
*/);
}
}
} else {
JoinCon join = (JoinCon) multipleCon;
if (multipleCon_to instanceof BranchCon) {
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(null);
join.setToMultipleCon(null);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.17"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
} else if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().remove(join);
join.setToMultipleCon(null);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.17"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
}
}
// Persistence Operations
multiDAO.update(multipleCon);
multiDAO.update(multipleCon_to);
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeMultiConSuccessorActivity(java.lang.String, java.lang.String)
*/
@Override
public void removeMultiConSuccessorActivity(String con_id, String act_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object multi;
try {
multi = multiDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon = (MultipleCon) multi;
Object to_act;
try {
to_act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (to_act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_to = (Activity) to_act;
// End Checks for the parameters
// Now we start the implementation of the rules
if (multipleCon instanceof BranchCon) {
BranchCon branch = (BranchCon) multipleCon;
if (branch instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) branch;
branchAND.getToActivities().remove(activity_to);
activity_to.getToBranch().remove(branchAND);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.18"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
} else if (branch instanceof BranchConCond) { // Rule G5.34
BranchConCond branchCond = (BranchConCond) branch;
Collection aTBCs = branchCond.getTheBranchConCondToActivity();
Iterator iter = aTBCs.iterator();
BranchConCondToActivity aTBC = null;
while (iter.hasNext()) {
BranchConCondToActivity aux = (BranchConCondToActivity) iter.next();
if (aux.getTheActivity().equals(activity_to)) {
aTBC = aux;
break;
}
}
if (aTBC != null) {
aTBC.setTheActivity(null);
activity_to.getTheBranchConCondToActivity().remove(aTBC);
aTBC.setTheBranchConCond(null);
branchCond.getTheBranchConCondToActivity().remove(aTBC);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.34"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
}
}
} else if (multipleCon instanceof JoinCon) {
JoinCon join = (JoinCon) multipleCon;
join.setToActivity(null);
activity_to.getFromJoin().remove(join);
// Dynamic Changes related code
ProcessModel pmodel = multipleCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G3.18"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
}
// Persistence Operations
multiDAO.update(multipleCon);
actDAO.update(activity_to);
}
/**
* Related to section 4
*/
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#defineJoinConTo_Activity
* (java.lang.String, java.lang.String)
*/
@Override
public WebapseeObjectDTO defineJoinConTo_Activity(String con_id, String act_id) throws DAOException, WebapseeException { // previously returned Object
// Checks for the parameters
Object j;
try {
j = joinConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (j == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcJoinCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
JoinCon join = (JoinCon) j;
Object to_act;
try {
to_act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (to_act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_to = (Activity) to_act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity_to);
Collection suc = this.getSuccessors(join);
suc.remove(null);
if (suc.isEmpty() && !this.controlFlow(activity_to, join)) {
if (!join.isFired().booleanValue() && (state.equals("") || state.equals(ProcessModel.REQUIREMENTS))) { // Rule G4.1 //$NON-NLS-1$
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.1"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
} else if (state.equals(Plain.WAITING) || state.equals(ProcessModel.REQUIREMENTS) || state.equals(ProcessModel.ABSTRACT)
|| state.equals(ProcessModel.INSTANTIATED)) {
// Rule G4.2
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
if (activity_to instanceof Decomposed) {
this.makeDecomposedWaiting(((Decomposed) activity_to).getTheReferedProcessModel(), "Rule G4.2"); //$NON-NLS-1$
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.2"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
} else if (activity_to instanceof Normal || activity_to instanceof Decomposed) {
if (!join.isFired().booleanValue() && (state.equals(Plain.READY) || state.equals(ProcessModel.INSTANTIATED))) {
if (join.getTheDependency().getKindDep().equals("end-end")) { //$NON-NLS-1$
// Rule G4.3
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.3"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
} else {
// Rule G4.4
this.makeWaiting(activity_to, "Rule G4.4"); //$NON-NLS-1$
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.4"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
} else if (join.isFired().booleanValue() && !(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED))
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED))) {
// Rule G4.5
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.5"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
}
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
} else if ((join.getToActivities() != null) && !this.controlFlow(activity_to, join)) {
if ((state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
&& !join.isFired().booleanValue()) {
// Rule G4.6
Activity toAct = join.getToActivities();
toAct.getFromJoin().remove(join);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.6"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toAct.getId(), toAct.getClass().getName());
} else if (state.equals(Plain.WAITING) || state.equals(ProcessModel.REQUIREMENTS) || state.equals(ProcessModel.ABSTRACT)
|| state.equals(ProcessModel.INSTANTIATED)) {
// Rule G4.7
Activity toAct = join.getToActivities();
toAct.getFromJoin().remove(join);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.7"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toAct.getId(), toAct.getClass().getName());
} else if (activity_to instanceof Normal || activity_to instanceof Decomposed) {
if (!join.isFired().booleanValue() && (state.equals(Plain.READY) || state.equals(ProcessModel.INSTANTIATED))) {
if (join.getTheDependency().getKindDep().equals("end-end")) { //$NON-NLS-1$
// Rule G4.8
Activity toAct = join.getToActivities();
toAct.getFromJoin().remove(join);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.8"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toAct.getId(), toAct.getClass().getName());
} else {
// Rule G4.9
Activity toAct = join.getToActivities();
toAct.getFromJoin().remove(join);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
this.makeWaiting(activity_to, "Rule G4.9"); //$NON-NLS-1$
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.9"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toAct.getId(), toAct.getClass().getName());
}
}
} else if (join.isFired().booleanValue() && !(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED))
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED))) {
// Rule G4.10
Activity toAct = join.getToActivities();
toAct.getFromJoin().remove(join);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.10"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toAct.getId(), toAct.getClass().getName());
}
} else if ((join.getToMultipleCon() != null) && !this.controlFlow(activity_to, join)) {
if ((state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
&& !join.isFired().booleanValue()) {
// Rule G4.11
MultipleCon toMulti = join.getToMultipleCon();
if (toMulti instanceof JoinCon) {
JoinCon toJoin = (JoinCon) toMulti;
toJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) toMulti;
toBranch.setFromMultipleConnection(null);
}
join.setToMultipleCon(null);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.11"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
}
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toMulti.getId(), toMulti.getClass().getName());
} else if (state.equals(Plain.WAITING) || state.equals(ProcessModel.REQUIREMENTS) || state.equals(ProcessModel.ABSTRACT)) { // || state.equals(ProcessModel.INSTANTIATED)
// Rule G4.12
MultipleCon toMulti = join.getToMultipleCon();
if (toMulti instanceof JoinCon) {
JoinCon toJoin = (JoinCon) toMulti;
toJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) toMulti;
toBranch.setFromMultipleConnection(null);
}
join.setToMultipleCon(null);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.12"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toMulti.getId(), toMulti.getClass().getName());
} else if (activity_to instanceof Normal || activity_to instanceof Decomposed) {
if (!join.isFired().booleanValue() && (state.equals(Plain.READY) || state.equals(ProcessModel.INSTANTIATED))) {
if (join.getTheDependency().getKindDep().equals("end-end")) { //$NON-NLS-1$
// Rule G4.13
MultipleCon toMulti = join.getToMultipleCon();
if (toMulti instanceof JoinCon) {
JoinCon toJoin = (JoinCon) toMulti;
toJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) toMulti;
toBranch.setFromMultipleConnection(null);
}
join.setToMultipleCon(null);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.13"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toMulti.getId(), toMulti.getClass().getName());
} else {
// Rule G4.14
MultipleCon toMulti = join.getToMultipleCon();
if (toMulti instanceof JoinCon) {
JoinCon toJoin = (JoinCon) toMulti;
toJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) toMulti;
toBranch.setFromMultipleConnection(null);
}
join.setToMultipleCon(null);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
this.makeWaiting(activity_to, "Rule G4.14"); //$NON-NLS-1$
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.14"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId()/* , currentSession */);
this.enactmentEngine.determineProcessModelStates(pmodel/* , currentSession */);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toMulti.getId(), toMulti.getClass().getName());
}
}
} else if (join.isFired().booleanValue() && !(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED))
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED))) {
// Rule G4.15
MultipleCon toMulti = join.getToMultipleCon();
if (toMulti instanceof JoinCon) {
JoinCon toJoin = (JoinCon) toMulti;
toJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) toMulti;
toBranch.setFromMultipleConnection(null);
}
join.setToMultipleCon(null);
join.setToActivity(activity_to);
activity_to.getFromJoin().add(join);
ProcessModel pmodel = join.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.15"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_to);
return new WebapseeObjectDTO(toMulti.getId(), toMulti.getClass().getName());
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
return null;
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineJoinConTo_Connection(java.lang.String, java.lang.String)
*/
@Override
public Integer defineJoinConTo_Connection(String con_id, String to_con_id) throws DAOException, WebapseeException { // returns the old target Object's Oid
// Checks for the parameters
Object j;
try {
j = joinConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (j == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcJoinCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
JoinCon join = (JoinCon) j;
Object to_multi;
try {
to_multi = multiDAO.retrieveBySecondaryKey(to_con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
to_con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (to_multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + to_con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon_to = (MultipleCon) to_multi;
// End Checks for the parameters
// Now we start the implementation of the rules
Collection suc = this.getSuccessors(join);
suc.remove(null);
if (suc.isEmpty()) {
if (!this.controlFlow(multipleCon_to, join)) {
if (!join.isFired().booleanValue() && !multipleCon_to.isFired().booleanValue()) {
// Rule G4.16
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().add(join);
join.setToMultipleCon(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(join);
join.setToMultipleCon(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.16"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
} else if (join.isFired().booleanValue()) {
// Rule G4.18
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().add(join);
join.setToMultipleCon(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(join);
join.setToMultipleCon(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.18"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
// Persistence Operations
joinDAO.update(join);
multiDAO.update(multipleCon_to);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
} else if (join.getToActivities() != null) {
if (!this.controlFlow(multipleCon_to, join)) {
if (!join.isFired().booleanValue() && !multipleCon_to.isFired().booleanValue()) {
// Rule G4.17
Activity toAct = join.getToActivities();
toAct.getFromJoin().remove(join);
join.setToActivity(null);
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().add(join);
join.setToMultipleCon(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(join);
join.setToMultipleCon(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.17"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
joinDAO.update(join);
multiDAO.update(multipleCon_to);
return toAct.getId();
} else if (join.isFired().booleanValue()) {
// Rule G4.19
Activity toAct = join.getToActivities();
toAct.getFromJoin().remove(join);
join.setToActivity(null);
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().add(join);
join.setToMultipleCon(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(join);
join.setToMultipleCon(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.19"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
joinDAO.update(join);
multiDAO.update(multipleCon_to);
return toAct.getId();
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
} else if (join.getToMultipleCon() != null) {
if (!this.controlFlow(multipleCon_to, join)) {
if (!join.isFired().booleanValue() && !multipleCon_to.isFired().booleanValue()) {
// Rule G4.20
MultipleCon oldMulti = join.getToMultipleCon();
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
if (oldMulti instanceof JoinCon) {
JoinCon oldJoin = (JoinCon) oldMulti;
oldJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon oldBranch = (BranchCon) oldMulti;
oldBranch.setFromMultipleConnection(null);
}
toJoin.getFromMultipleCon().add(join);
join.setToMultipleCon(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
if (oldMulti instanceof JoinCon) {
JoinCon oldJoin = (JoinCon) oldMulti;
oldJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon oldBranch = (BranchCon) oldMulti;
oldBranch.setFromMultipleConnection(null);
}
toBranch.setFromMultipleConnection(join);
join.setToMultipleCon(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.20"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
joinDAO.update(join);
multiDAO.update(multipleCon_to);
return oldMulti.getId();
} else if (join.isFired().booleanValue()) {
// Rule G4.21
MultipleCon oldMulti = join.getToMultipleCon();
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
if (oldMulti instanceof JoinCon) {
JoinCon oldJoin = (JoinCon) oldMulti;
oldJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon oldBranch = (BranchCon) oldMulti;
oldBranch.setFromMultipleConnection(null);
}
toJoin.getFromMultipleCon().add(join);
join.setToMultipleCon(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
if (oldMulti instanceof JoinCon) {
JoinCon oldJoin = (JoinCon) oldMulti;
oldJoin.getFromMultipleCon().remove(join);
} else { // is BranchCon
BranchCon oldBranch = (BranchCon) oldMulti;
oldBranch.setFromMultipleConnection(null);
}
toBranch.setFromMultipleConnection(join);
join.setToMultipleCon(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.21"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
joinDAO.update(join);
multiDAO.update(multipleCon_to);
return oldMulti.getId();
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcJoinConIsNotOK")); //$NON-NLS-1$
}
return null;
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineJoinConFrom_Connection(java.lang.String, java.lang.String)
*/
@Override
public void defineJoinConFrom_Connection(String con_id, String from_con_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object j;
try {
j = joinConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (j == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcJoinCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
JoinCon join = (JoinCon) j;
Object from_multi;
try {
from_multi = multiDAO.retrieveBySecondaryKey(from_con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
from_con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (from_multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + from_con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon_from = (MultipleCon) from_multi;
// End Checks for the parameters
// Now we start the implementation of the rules
if (!this.isInPredecessors(multipleCon_from, join) && !this.controlFlow(join, multipleCon_from)) {
if ((!join.isFired().booleanValue())) {
if (join.getKindJoinCon().equals("XOR")) { //$NON-NLS-1$
// Rule G4.28
Collection preds = this.getPredecessors(multipleCon_from);
Iterator iter = preds.iterator();
boolean has = false;
while (iter.hasNext()) {
Object obj = (Object) iter.next();
if (obj instanceof BranchANDCon) {
has = true;
break;
}
}
if (!has) {
if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
fromJoin.setToMultipleCon(join);
join.getFromMultipleCon().add(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().add(join);
join.getFromMultipleCon().add(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
BranchConCondToMultipleCon bctmc = new BranchConCondToMultipleCon();
bctmc.setTheBranchCond(fromBranchCond);
bctmc.setTheMultipleCon(join);
fromBranchCond.getTheBranchConCondToMultipleCon().add(bctmc);
join.getFromMultipleCon().add(fromBranchCond);
}
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.28"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
} else {
// Rule G4.22
if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
fromJoin.setToMultipleCon(join);
join.getFromMultipleCon().add(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().add(join);
join.getFromMultipleCon().add(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
BranchConCondToMultipleCon bctmc = new BranchConCondToMultipleCon();
bctmc.setTheBranchCond(fromBranchCond);
bctmc.setTheMultipleCon(join);
fromBranchCond.getTheBranchConCondToMultipleCon().add(bctmc);
join.getFromMultipleCon().add(fromBranchCond);
}
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.22"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
} else {
if ((multipleCon_from.isFired().booleanValue())) {
if (join.getKindJoinCon().equals("XOR")) { //$NON-NLS-1$
// Rule G4.29
Collection preds = this.getPredecessors(multipleCon_from);
Iterator iter = preds.iterator();
boolean has = false;
while (iter.hasNext()) {
Object obj = (Object) iter.next();
if (obj instanceof BranchANDCon) {
has = true;
break;
}
}
if (!has) {
join.setFired(new Boolean(false));
if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
fromJoin.setToMultipleCon(join);
join.getFromMultipleCon().add(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().add(join);
join.getFromMultipleCon().add(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
BranchConCondToMultipleCon bctmc = new BranchConCondToMultipleCon();
bctmc.setTheBranchCond(fromBranchCond);
bctmc.setTheMultipleCon(join);
fromBranchCond.getTheBranchConCondToMultipleCon().add(bctmc);
join.getFromMultipleCon().add(fromBranchCond);
}
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.29"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
}
} else {
// Rule G4.23
if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
fromJoin.setToMultipleCon(join);
join.getFromMultipleCon().add(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().add(join);
join.getFromMultipleCon().add(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
BranchConCondToMultipleCon bctmc = new BranchConCondToMultipleCon();
bctmc.setTheBranchCond(fromBranchCond);
bctmc.setTheMultipleCon(join);
fromBranchCond.getTheBranchConCondToMultipleCon().add(bctmc);
join.getFromMultipleCon().add(fromBranchCond);
}
}
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.23"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
}
// Persistence Operations
joinDAO.update(join);
multiDAO.update(multipleCon_from);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineJoinConFrom_Activity(java.lang.String, java.lang.String)
*/
@Override
public void defineJoinConFrom_Activity(String con_id, String act_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object j;
try {
j = joinConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (j == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcJoinCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
JoinCon join = (JoinCon) j;
Object from_act;
try {
from_act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (from_act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_from = (Activity) from_act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity_from);
if (!this.isInPredecessors(activity_from, join) && !this.controlFlow(join, activity_from)) {
if ((!join.isFired().booleanValue())) {
if (join.getKindJoinCon().equals("XOR")) { //$NON-NLS-1$
Collection preds = this.getConnectionsFrom(activity_from);
Iterator iter = preds.iterator();
boolean has = false;
while (iter.hasNext()) {
Connection conn = (Connection) iter.next();
if (conn instanceof BranchANDCon) {
has = true;
break;
}
}
if (!has) {
if ((state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) // Rule G4.30 //$NON-NLS-1$
|| (!(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED)) // Rule G4.31
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED)))) {
join.getFromActivity().add(activity_from);
activity_from.getToJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.31"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActsFromBANDToJXOR"));
}
} else {
if ((state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) // Rule G4.24 //$NON-NLS-1$
|| (!(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED)) // Rule G4.25
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED)))) {
join.getFromActivity().add(activity_from);
activity_from.getToJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.25"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
}
} else {
if (join.getKindJoinCon().equals("XOR")) { //$NON-NLS-1$
Collection preds = this.getConnectionsFrom(activity_from);
Iterator iter = preds.iterator();
boolean has = false;
while (iter.hasNext()) {
Connection conn = (Connection) iter.next();
if (conn instanceof BranchANDCon) {
has = true;
break;
}
}
if (!has) {
if ((state.equals(Plain.FINISHED) || state.equals(ProcessModel.FINISHED)) // Rule G4.32
|| ((state.equals(Plain.ACTIVE) || state.equals(ProcessModel.ENACTING) || state.equals(Plain.PAUSED)) // Rule G4.33
&& join.getTheDependency().getKindDep().equals("start-start"))) { //$NON-NLS-1$
join.getFromActivity().add(activity_from);
activity_from.getToJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.33"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
} else {
throw new ModelingException(state + Messages.getString("facades.DynamicModeling.ModelingExcActsFromBANDToJXORStart"));
}
} else {
if ((state.equals(Plain.FINISHED) || state.equals(ProcessModel.FINISHED)) // Rule G4.26
|| ((state.equals(Plain.ACTIVE) || state.equals(ProcessModel.ENACTING) || state.equals(Plain.PAUSED)) // Rule G4.27
&& join.getTheDependency().getKindDep().equals("start-start"))) { //$NON-NLS-1$
join.getFromActivity().add(activity_from);
activity_from.getToJoin().add(join);
// Dynamic Changes related code
ProcessModel pmodel = join.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G4.27"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
}
}
// Persistence Operations
joinDAO.update(join);
actDAO.update(activity_from);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
}
/**
* Related to section 5
*/
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineBranchConFromActivity(java.lang.String, java.lang.String)
*/
@Override
public WebapseeObjectDTO defineBranchConFromActivity(String con_id, String act_id) throws DAOException, WebapseeException { // previously returned Object
// Checks for the parameters
Object b;
try {
b = branchConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (b == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcBranchCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
BranchCon branchCon = (BranchCon) b;
Object from_act;
try {
from_act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (from_act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_from = (Activity) from_act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity_from);
Collection pred = this.getPredecessors(branchCon);
pred.remove(null);
if (pred.isEmpty() && !this.controlFlow(branchCon, activity_from)) {
if (!branchCon.isFired().booleanValue() && ((state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
|| !(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED))
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED)))) {
// Rule G5.1 and G5.2
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.2"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
} else if (branchCon.isFired().booleanValue() && (state.equals(Plain.FINISHED) || state.equals(ProcessModel.FINISHED))) {
// Rule G5.3
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.3"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else if (branchCon.isFired().booleanValue() && branchCon.getTheDependency().getKindDep().equals("start-start") //$NON-NLS-1$
&& (state.equals(Plain.ACTIVE) || state.equals(Plain.PAUSED) || state.equals(ProcessModel.ENACTING))) {
// Rule G5.4
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.4"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
branchDAO.update(branchCon);
actDAO.update(activity_from);
} else if (branchCon.getFromActivity() != null && !this.controlFlow(branchCon, activity_from)) {
Object ret = null;
if (!branchCon.isFired().booleanValue() && ((state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
|| !(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED))
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED)))) {
// Rule G5.5 and G5.6
if (branchCon instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) branchCon;
Activity fromAct = branchAND.getFromActivity();
fromAct.getToBranch().remove(branchAND);
ret = fromAct;
} else {
BranchConCond branchCond = (BranchConCond) branchCon;
Activity fromAct = branchCond.getFromActivity();
fromAct.getToBranch().remove(branchCond);
ret = fromAct;
}
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.6"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
branchDAO.update(branchCon);
actDAO.update(activity_from);
return new WebapseeObjectDTO(((Activity) ret).getId(), ((Activity) ret).getClass().getName());
} else if (branchCon.isFired().booleanValue() && (state.equals(Plain.FINISHED) || state.equals(ProcessModel.FINISHED))) {
// Rule G5.7
if (branchCon instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) branchCon;
Activity fromAct = branchAND.getFromActivity();
fromAct.getToBranch().remove(branchAND);
ret = fromAct;
} else {
BranchConCond branchCond = (BranchConCond) branchCon;
Activity fromAct = branchCond.getFromActivity();
fromAct.getToBranch().remove(branchCond);
ret = fromAct;
}
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.7"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
branchDAO.update(branchCon);
actDAO.update(activity_from);
return new WebapseeObjectDTO(((Activity) ret).getId(), ((Activity) ret).getClass().getName());
} else if (branchCon.isFired().booleanValue() && branchCon.getTheDependency().getKindDep().equals("start-start") //$NON-NLS-1$
&& (state.equals(Plain.ACTIVE) || state.equals(Plain.PAUSED) || state.equals(ProcessModel.ENACTING))) {
// Rule G5.8
if (branchCon instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) branchCon;
Activity fromAct = branchAND.getFromActivity();
fromAct.getToBranch().remove(branchAND);
ret = fromAct;
} else {
BranchConCond branchCond = (BranchConCond) branchCon;
Activity fromAct = branchCond.getFromActivity();
fromAct.getToBranch().remove(branchCond);
ret = fromAct;
}
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.8"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
branchDAO.update(branchCon);
actDAO.update(activity_from);
return new WebapseeObjectDTO(((Activity) ret).getId(), ((Activity) ret).getClass().getName());
}
} else if (branchCon.getFromMultipleConnection() != null && !this.controlFlow(branchCon, activity_from)) {
if (!branchCon.isFired().booleanValue() && ((state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
|| !(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED))
&& !(state.equals(Plain.CANCELED) || state.equals(ProcessModel.CANCELED)))) {
// Rule G5.9 and G5.10
MultipleCon fromMulti = branchCon.getFromMultipleConnection();
if (fromMulti instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) fromMulti;
fromJoin.setToMultipleCon(null);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) fromMulti;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().remove(branchCon);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
Collection bctmcs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iterBctmcs = bctmcs.iterator();
while (iterBctmcs.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iterBctmcs.next();
if (bctmc.getTheMultipleCon().equals(branchCon)) {
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
bctmcs.remove(bctmc);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Removing from the model
bctmcDAO.daoDelete(bctmc);
break;
}
}
}
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.10"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
branchDAO.update(branchCon);
actDAO.update(activity_from);
return new WebapseeObjectDTO(fromMulti.getId(), fromMulti.getClass().getName());
} else if (branchCon.isFired().booleanValue() && (state.equals(Plain.FINISHED) || state.equals(ProcessModel.FINISHED))) {
// Rule G5.11
MultipleCon fromMulti = branchCon.getFromMultipleConnection();
if (fromMulti instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) fromMulti;
fromJoin.setToMultipleCon(null);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) fromMulti;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().remove(branchCon);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
Collection bctmcs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iterBctmcs = bctmcs.iterator();
while (iterBctmcs.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iterBctmcs.next();
if (bctmc.getTheMultipleCon().equals(branchCon)) {
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
bctmcs.remove(bctmc);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Removing from the model
bctmcDAO.daoDelete(bctmc);
break;
}
}
}
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.11"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
branchDAO.update(branchCon);
actDAO.update(activity_from);
return new WebapseeObjectDTO(fromMulti.getId(), fromMulti.getClass().getName());
} else if (branch.isFired().booleanValue() && branchCon.getTheDependency().getKindDep().equals("start-start") //$NON-NLS-1$
&& (state.equals(Plain.ACTIVE) || state.equals(Plain.PAUSED) || state.equals(ProcessModel.ENACTING))) {
// Rule G5.12
MultipleCon fromMulti = branchCon.getFromMultipleConnection();
if (fromMulti instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) fromMulti;
fromJoin.setToMultipleCon(null);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) fromMulti;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().remove(branchCon);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
Collection bctmcs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iterBctmcs = bctmcs.iterator();
while (iterBctmcs.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iterBctmcs.next();
if (bctmc.getTheMultipleCon().equals(branchCon)) {
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
bctmcs.remove(bctmc);
branchCon.setFromMultipleConnection(null);
branchCon.setFromActivity(activity_from);
activity_from.getToBranch().add(branchCon);
// Removing from the model
bctmcDAO.daoDelete(bctmc);
break;
}
}
}
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.12"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
branchDAO.update(branchCon);
actDAO.update(activity_from);
return new WebapseeObjectDTO(fromMulti.getId(), fromMulti.getClass().getName());
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
return null;
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineBranchConFromConnection(java.lang.String, java.lang.String)
*/
@Override
public WebapseeObjectDTO defineBranchConFromConnection(String con_id, String from_con_id) throws DAOException, WebapseeException { // previously returned Object
// Checks for the parameters
Object b;
try {
b = branchConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (b == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcBranchCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
BranchCon branchCon = (BranchCon) b;
Object from_multi;
try {
from_multi = multiDAO.retrieveBySecondaryKey(from_con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
from_con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (from_multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + from_con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon_from = (MultipleCon) from_multi;
// End Checks for the parameters
// Now we start the implementation of the rules
Collection pred = this.getPredecessors(branchCon);
pred.remove(null);
if (pred.isEmpty()) {
if ((!branchCon.isFired().booleanValue() || (branchCon.isFired().booleanValue() && multipleCon_from.isFired().booleanValue()))
&& !this.controlFlow(branchCon, multipleCon_from)) {
// Rule G5.13 and G5.16
if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
fromJoin.setToMultipleCon(branchCon);
branchCon.setFromMultipleConnection(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().add(branchCon);
branchCon.setFromMultipleConnection(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
BranchConCondToMultipleCon bctmc = new BranchConCondToMultipleCon();
bctmc.setTheBranchCond(fromBranchCond);
bctmc.setTheMultipleCon(branchCon);
fromBranchCond.getTheBranchConCondToMultipleCon().add(bctmc);
branchCon.setFromMultipleConnection(fromBranchCond);
}
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.13"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
branchDAO.update(branchCon);
multiDAO.update(multipleCon_from);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
} else if (branchCon.getFromActivity() != null) {
if ((!branchCon.isFired().booleanValue() || (branchCon.isFired().booleanValue() && multipleCon_from.isFired().booleanValue()))
&& !this.controlFlow(branchCon, multipleCon_from)) {
// Rule G5.14 and G5.17
Activity fromAct = branchCon.getFromActivity();
fromAct.getToBranch().remove(branchCon);
if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
fromJoin.setToMultipleCon(branchCon);
branchCon.setFromMultipleConnection(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().add(branchCon);
branchCon.setFromMultipleConnection(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
BranchConCondToMultipleCon bctmc = new BranchConCondToMultipleCon();
bctmc.setTheBranchCond(fromBranchCond);
bctmc.setTheMultipleCon(branchCon);
fromBranchCond.getTheBranchConCondToMultipleCon().add(bctmc);
branchCon.setFromMultipleConnection(fromBranchCond);
}
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.14"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
branchDAO.update(branchCon);
multiDAO.update(multipleCon_from);
return new WebapseeObjectDTO(fromAct.getId(), fromAct.getClass().getName());
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
} else if (branchCon.getFromMultipleConnection() != null) {
if ((!branchCon.isFired().booleanValue() || (branchCon.isFired().booleanValue() && multipleCon_from.isFired().booleanValue()))
&& !this.controlFlow(branchCon, multipleCon_from)) {
// Rule G5.15 and G5.18
// Removing old Multiple Connection from BranchCon
MultipleCon fromMulti = branchCon.getFromMultipleConnection();
if (fromMulti instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) fromMulti;
fromJoin.setToMultipleCon(null);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) fromMulti;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().remove(branchCon);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
Collection bctmcs = fromBranchCond.getTheBranchConCondToMultipleCon();
Iterator iterBctmcs = bctmcs.iterator();
while (iterBctmcs.hasNext()) {
BranchConCondToMultipleCon bctmc = (BranchConCondToMultipleCon) iterBctmcs.next();
if (bctmc.getTheMultipleCon().equals(branchCon)) {
bctmc.setTheBranchConCond(null);
bctmc.setTheMultipleCon(null);
bctmcs.remove(bctmc);
// Removing from the model
bctmcDAO.daoDelete(bctmc);
break; // stop iterating after removing from bctmcs, as the sibling loops do
}
}
}
}
// Adding New Multiple Connection from BranchCon
if (multipleCon_from instanceof JoinCon) {
JoinCon fromJoin = (JoinCon) multipleCon_from;
fromJoin.setToMultipleCon(branchCon);
branchCon.setFromMultipleConnection(fromJoin);
} else { // is BranchCon
BranchCon fromBranch = (BranchCon) multipleCon_from;
if (fromBranch instanceof BranchANDCon) {
BranchANDCon fromBranchAND = (BranchANDCon) fromBranch;
fromBranchAND.getToMultipleCon().add(branchCon);
branchCon.setFromMultipleConnection(fromBranchAND);
} else { // is BranchConCond
BranchConCond fromBranchCond = (BranchConCond) fromBranch;
BranchConCondToMultipleCon bctmc = new BranchConCondToMultipleCon();
bctmc.setTheBranchCond(fromBranchCond);
bctmc.setTheMultipleCon(branchCon);
fromBranchCond.getTheBranchConCondToMultipleCon().add(bctmc);
branchCon.setFromMultipleConnection(fromBranchCond);
}
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.15"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
branchDAO.update(branchCon);
multiDAO.update(multipleCon_from);
return new WebapseeObjectDTO(fromMulti.getId(), fromMulti.getClass().getName());
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereControlFlow")); //$NON-NLS-1$
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcBranchConIsNotOK")); //$NON-NLS-1$
}
return null;
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineBranchConToActivity(java.lang.String, java.lang.String)
*/
@Override
public void defineBranchConToActivity(String con_id, String act_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object b;
try {
b = branchConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (b == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcBranchCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
BranchCon branchCon = (BranchCon) b;
Object to_act;
try {
to_act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (to_act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity_to = (Activity) to_act;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = this.getState(activity_to);
if (!branchCon.isFired().booleanValue() && (state.equals("") || state.equals(ProcessModel.REQUIREMENTS)) //$NON-NLS-1$
&& !this.controlFlow(activity_to, branchCon)) {
if (branchCon instanceof BranchANDCon) { // Rule G5.19
BranchANDCon bAND = (BranchANDCon) branchCon;
Collection acts = bAND.getToActivities();
if (!acts.contains(activity_to)) {
bAND.getToActivities().add(activity_to);
activity_to.getFromBranchANDCon().add(bAND);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.19"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
} else {
// Rule G5.26
BranchConCond bCond = (BranchConCond) branchCon;
Collection actsTBC = bCond.getTheBranchConCondToActivity();
Iterator iter = actsTBC.iterator();
boolean has = false;
while (iter.hasNext()) {
BranchConCondToActivity aTBC = (BranchConCondToActivity) iter.next();
if (aTBC.getTheActivities().equals(activity_to)) {
has = true;
break;
}
}
if (!has) {
BranchConCondToActivity activityToBranchCond = new BranchConCondToActivity();
// activityToBranchCond.setCondition(condition);
activityToBranchCond.setTheBranchConCond(bCond);
activityToBranchCond.setTheActivity(activity_to);
activity_to.getTheBranchConCondToActivity().add(activityToBranchCond);
bCond.getTheBranchConCondToActivity().add(activityToBranchCond);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.26"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
}
} else if (!branchCon.isFired().booleanValue()
&& (state.equals(Plain.WAITING) || state.equals(ProcessModel.REQUIREMENTS) || state.equals(ProcessModel.ABSTRACT)) // || state.equals(ProcessModel.INSTANTIATED)
&& !this.controlFlow(activity_to, branchCon)) {
if (branchCon instanceof BranchANDCon) { // Rule G5.20
BranchANDCon bAND = (BranchANDCon) branchCon;
Collection acts = bAND.getToActivities();
if (!acts.contains(activity_to)) {
bAND.getToActivities().add(activity_to);
activity_to.getFromBranchANDCon().add(bAND);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.20"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
} else {
// Rule G5.27
BranchConCond bCond = (BranchConCond) branchCon;
Collection actsTBC = bCond.getTheBranchConCondToActivity();
Iterator iter = actsTBC.iterator();
boolean has = false;
while (iter.hasNext()) {
BranchConCondToActivity aTBC = (BranchConCondToActivity) iter.next();
if (aTBC.getTheActivities().equals(activity_to)) {
has = true;
break;
}
}
if (!has) {
BranchConCondToActivity activityToBranchConCond = new BranchConCondToActivity();
// activityToBranchConCond.setCondition(condition);
activityToBranchConCond.setTheBranchConCond(bCond);
activityToBranchConCond.setTheActivity(activity_to);
activity_to.getTheBranchConCondToActivity().add(activityToBranchConCond);
bCond.getTheBranchConCondToActivity().add(activityToBranchConCond);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.27"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
}
} else if (!branchCon.isFired().booleanValue()
&& ((state.equals(Plain.READY) || state.equals(ProcessModel.INSTANTIATED)) && branchCon.getTheDependency().getKindDep()
.equals("end-end")) //$NON-NLS-1$
&& !this.controlFlow(activity_to, branchCon)) {
if (branchCon instanceof BranchANDCon) { // Rule G5.21
BranchANDCon bAND = (BranchANDCon) branchCon;
Collection acts = bAND.getToActivities();
if (!acts.contains(activity_to)) {
bAND.getToActivities().add(activity_to);
activity_to.getFromBranchANDCon().add(bAND);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.21"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
} else {
// Rule G5.28
BranchConCond bCond = (BranchConCond) branchCon;
Collection actsTBC = bCond.getTheBranchConCondToActivity();
Iterator iter = actsTBC.iterator();
boolean has = false;
while (iter.hasNext()) {
BranchConCondToActivity aTBC = (BranchConCondToActivity) iter.next();
if (aTBC.getTheActivities().equals(activity_to)) {
has = true;
break;
}
}
if (!has) {
BranchConCondToActivity activityToBranchConCond = new BranchConCondToActivity();
// activityToBranchConCond.setCondition(condition);
activityToBranchConCond.setTheBranchConCond(bCond);
activityToBranchConCond.setTheActivity(activity_to);
activity_to.getTheBranchConCondToActivity().add(activityToBranchConCond);
bCond.getTheBranchConCondToActivity().add(activityToBranchConCond);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.28"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
}
} else if (!branchCon.isFired().booleanValue() && (state.equals(Plain.READY) || state.equals(ProcessModel.INSTANTIATED))
&& (branchCon.getTheDependency().getKindDep().equals("end-start") //$NON-NLS-1$
|| branchCon.getTheDependency().getKindDep().equals("start-start")) //$NON-NLS-1$
&& !this.controlFlow(activity_to, branchCon)) {
if (branchCon instanceof BranchANDCon) { // Rule G5.22
BranchANDCon bAND = (BranchANDCon) branchCon;
Collection acts = bAND.getToActivities();
if (!acts.contains(activity_to)) {
bAND.getToActivities().add(activity_to);
activity_to.getFromBranchANDCon().add(bAND);
this.makeWaiting(activity_to, "Rule G5.22"); //$NON-NLS-1$
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.22"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
} else {
// Rule G5.29
BranchConCond bCond = (BranchConCond) branchCon;
Collection actsTBC = bCond.getTheBranchConCondToActivity();
Iterator iter = actsTBC.iterator();
boolean has = false;
while (iter.hasNext()) {
BranchConCondToActivity aTBC = (BranchConCondToActivity) iter.next();
if (aTBC.getTheActivities().equals(activity_to)) {
has = true;
break;
}
}
if (!has) {
BranchConCondToActivity activityToBranchConCond = new BranchConCondToActivity();
// activityToBranchConCond.setCondition(condition);
activityToBranchConCond.setTheBranchConCond(bCond);
activityToBranchConCond.setTheActivity(activity_to);
activity_to.getTheBranchConCondToActivity().add(activityToBranchConCond);
bCond.getTheBranchConCondToActivity().add(activityToBranchConCond);
this.makeWaiting(activity_to, "Rule G5.29"); //$NON-NLS-1$
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.29"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
}
} else if (branchCon.isFired().booleanValue()
&& (!(state.equals(Plain.FAILED) || state.equals(ProcessModel.FAILED)) && !(state.equals(Plain.CANCELED) || state
.equals(ProcessModel.CANCELED))) && !this.controlFlow(activity_to, branchCon)) {
if (branchCon instanceof BranchANDCon) { // Rule G5.23
BranchANDCon bAND = (BranchANDCon) branchCon;
Collection acts = bAND.getToActivities();
if (!acts.contains(activity_to)) {
bAND.getToActivities().add(activity_to);
activity_to.getFromBranchANDCon().add(bAND);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.23"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
} else {
// Rule G5.30
BranchConCond bCond = (BranchConCond) branchCon;
Collection actsTBC = bCond.getTheBranchConCondToActivity();
Iterator iter = actsTBC.iterator();
boolean has = false;
while (iter.hasNext()) {
BranchConCondToActivity aTBC = (BranchConCondToActivity) iter.next();
if (aTBC.getTheActivities().equals(activity_to)) {
has = true;
break;
}
}
if (!has) {
BranchConCondToActivity activityToBranchConCond = new BranchConCondToActivity();
// activityToBranchConCond.setCondition(condition);
activityToBranchConCond.setTheBranchConCond(bCond);
activityToBranchConCond.setTheActivity(activity_to);
activity_to.getTheBranchConCondToActivity().add(activityToBranchConCond);
bCond.getTheBranchConCondToActivity().add(activityToBranchConCond);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.30"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheDependOrContr")); //$NON-NLS-1$
}
// Persistence Operations
actDAO.update(activity_to);
branchDAO.update(branchCon);
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineBranchConToConnection(java.lang.String, java.lang.String)
*/
@Override
public void defineBranchConToConnection(String con_id, String to_con_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object b;
try {
b = branchConDAO.retrieveBySecondaryKey(con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccJoinCon") + //$NON-NLS-1$
con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (b == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcBranchCon") + con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
BranchCon branchCon = (BranchCon) b;
Object to_multi;
try {
to_multi = multiDAO.retrieveBySecondaryKey(to_con_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccMultConn") + //$NON-NLS-1$
to_con_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (to_multi == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcMultConn") + to_con_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
MultipleCon multipleCon_to = (MultipleCon) to_multi;
// End Checks for the parameters
// Now we start the implementation of the rules
if (!branchCon.isFired().booleanValue() && !multipleCon_to.isFired().booleanValue() && !this.controlFlow(multipleCon_to, branchCon)) {
if (branchCon instanceof BranchANDCon) { // Rule G5.24
BranchANDCon bAND = (BranchANDCon) branchCon;
Collection multis = bAND.getToMultipleCon();
if (!multis.contains(multipleCon_to)) {
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().add(bAND);
bAND.getToMultipleCon().add(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(bAND);
bAND.getToMultipleCon().add(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.24"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisMultConnIsAlreConn")); //$NON-NLS-1$
}
} else { // Rule G5.31
BranchConCond bCond = (BranchConCond) branchCon;
Collection multisTBC = bCond.getTheBranchConCondToMultipleCon();
Iterator iter = multisTBC.iterator();
boolean has = false;
while (iter.hasNext()) {
BranchConCondToMultipleCon bCTMC = (BranchConCondToMultipleCon) iter.next();
if (bCTMC.getTheMultipleCon().equals(multipleCon_to)) {
has = true;
break;
}
}
if (!has) {
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
BranchConCondToMultipleCon branchConCondToMultipleCon = new BranchConCondToMultipleCon();
// branchConCondToMultipleCon.setCondition(condition);
branchConCondToMultipleCon.setTheBranchConCond(bCond);
branchConCondToMultipleCon.setTheMultipleCon(toJoin);
toJoin.getFromMultipleCon().add(bCond);
bCond.getTheBranchConCondToMultipleCon().add(branchConCondToMultipleCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.31"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
BranchConCondToMultipleCon branchConCondToMultipleCon = new BranchConCondToMultipleCon();
// branchConCondToMultipleCon.setCondition(condition);
branchConCondToMultipleCon.setTheBranchConCond(bCond);
branchConCondToMultipleCon.setTheMultipleCon(toBranch);
toBranch.setFromMultipleConnection(bCond);
bCond.getTheBranchConCondToMultipleCon().add(branchConCondToMultipleCon);
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.31"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisActvAlreadyConn")); //$NON-NLS-1$
}
}
} else if (branchCon.isFired().booleanValue() && !this.controlFlow(multipleCon_to, branchCon)) {
if (branchCon instanceof BranchANDCon) { // Rule G5.25
BranchANDCon bAND = (BranchANDCon) branchCon;
Collection multis = bAND.getToMultipleCon();
if (!multis.contains(multipleCon_to)) {
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
toJoin.getFromMultipleCon().add(bAND);
bAND.getToMultipleCon().add(toJoin);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
toBranch.setFromMultipleConnection(bAND);
bAND.getToMultipleCon().add(toBranch);
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.25"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisMultConnIsAlreConn")); //$NON-NLS-1$
}
} else { // Rule G5.32
BranchConCond bCond = (BranchConCond) branchCon;
Collection multisTBC = bCond.getTheBranchConCondToMultipleCon();
Iterator iter = multisTBC.iterator();
boolean has = false;
while (iter.hasNext()) {
BranchConCondToMultipleCon bCTMC = (BranchConCondToMultipleCon) iter.next();
if (bCTMC.getTheMultipleCon().equals(multipleCon_to)) {
has = true;
break;
}
}
if (!has) {
if (multipleCon_to instanceof JoinCon) {
JoinCon toJoin = (JoinCon) multipleCon_to;
BranchConCondToMultipleCon branchConCondToMultipleCon = new BranchConCondToMultipleCon();
// branchConCondToMultipleCon.setCondition(condition);
branchConCondToMultipleCon.setTheBranchConCond(bCond);
branchConCondToMultipleCon.setTheMultipleCon(toJoin);
toJoin.getFromMultipleCon().add(bCond);
bCond.getTheBranchConCondToMultipleCon().add(branchConCondToMultipleCon);
} else { // is BranchCon
BranchCon toBranch = (BranchCon) multipleCon_to;
BranchConCondToMultipleCon branchConCondToMultipleCon = new BranchConCondToMultipleCon();
// branchConCondToMultipleCon.setCondition(condition);
branchConCondToMultipleCon.setTheBranchConCond(bCond);
branchConCondToMultipleCon.setTheMultipleCon(toBranch);
toBranch.setFromMultipleConnection(bCond);
bCond.getTheBranchConCondToMultipleCon().add(branchConCondToMultipleCon);
}
// Dynamic Changes related code
ProcessModel pmodel = branchCon.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G5.32"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThisMultConnIsAlreConn")); //$NON-NLS-1$
}
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcThereIsContrFlowAlr")); //$NON-NLS-1$
}
// Persistence Operations
branchDAO.update(branchCon);
multiDAO.update(multipleCon_to);
}
/*
* Defined in Section 3: public void removeMultiConSuccessorConnection(String
* con_id, String to_con_id){ } public void
* removeMultiConSuccessorActivity(String con_id, String to_con_id){ }
*/
/**
* Related to section 6
*/
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#addRequiredRole
* (java.lang.String, java.lang.String)
*/
@Override
public void addRequiredRole(String act_id, String role_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object rol;
try {
rol = roleDAO.retrieveBySecondaryKey(role_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccRole") + //$NON-NLS-1$
role_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (rol == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcRole") + role_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Role role = (Role) rol;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
if (state.equals("") // Rule G6.1 //$NON-NLS-1$
|| (!state.equals(Plain.CANCELED) // Rule G6.2
&& !state.equals(Plain.FAILED) && !state.equals(Plain.FINISHED))) {
ReqAgent reqAgent = new ReqAgent();
role.getTheReqAgent().add(reqAgent);
reqAgent.setTheRole(role);
reqAgent.setTheNormal(actNorm);
actNorm.getTheRequiredPeople().add(reqAgent);
// Persistence Operations
actDAO.update(actNorm);
roleDAO.update(role);
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#addRequiredWorkGroupType
* (java.lang.String, java.lang.String)
*/
@Override
public void addRequiredWorkGroupType(String act_id, String g_type) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object gType;
try {
gType = WorkGroupTypeDAO.retrieveBySecondaryKey(g_type);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccWorkGroupType") + //$NON-NLS-1$
g_type + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (gType == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcWorkGroupType") + g_type + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
WorkGroupType workGroupType = (WorkGroupType) gType;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
if (state.equals("") // Rule G6.3 //$NON-NLS-1$
|| (!state.equals(Plain.CANCELED) // Rule G6.4
&& !state.equals(Plain.FAILED) && !state.equals(Plain.FINISHED))) {
ReqWorkGroup reqWorkGroup = new ReqWorkGroup();
workGroupType.getTheReqWorkGroup().add(reqWorkGroup);
reqWorkGroup.setTheWorkGroupType(workGroupType);
reqWorkGroup.setTheNormal(actNorm);
actNorm.getTheRequiredPeople().add(reqWorkGroup);
// Persistence Operations
actDAO.update(actNorm);
WorkGroupTypeDAO.update(workGroupType);
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#removeRequiredRole
* (java.lang.String, java.lang.String)
*/
@Override
public void removeRequiredRole(String act_id, String role_id) throws DAOException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object rol;
try {
rol = roleDAO.retrieveBySecondaryKey(role_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccRole") + //$NON-NLS-1$
role_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (rol == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcRole") + role_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Role role = (Role) rol;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection reqAgents = this.getRequiredAgents(actNorm);
Iterator iter = reqAgents.iterator();
if (state.equals("")) { // Rule G6.5 and G6.7 //$NON-NLS-1$
while (iter.hasNext()) {
ReqAgent reqAgent = (ReqAgent) iter.next();
if (reqAgent.getTheRole().equals(role)) {
actNorm.getTheRequiredPeople().remove(reqAgent);
reqAgent.setTheNormal(null);
if (reqAgent.getTheAgent() != null) {
reqAgent.getTheAgent().getTheReqAgent().remove(reqAgent);
reqAgent.setTheAgent(null);
}
reqAgent.setTheRole(null);
role.getTheReqAgent().remove(reqAgent);
// Persistence Operations
actDAO.update(actNorm);
roleDAO.update(role);
reqAgentDAO.daoDelete(reqAgent);
break;
}
}
} else {
while (iter.hasNext()) { // Rule G6.6 and G6.8
ReqAgent reqAgent = (ReqAgent) iter.next();
if (reqAgent.getTheAgent() == null && reqAgent.getTheRole().equals(role)) {
Collection reqPeople = actNorm.getTheRequiredPeople();
reqPeople.remove(reqAgent);
reqAgent.setTheNormal(null);
reqAgent.setTheRole(null);
role.getTheReqAgent().remove(reqAgent);
reqPeople.remove(null);
if (reqPeople.size() > 0) {
boolean allTasksFinished = enactmentEngine.isActivityFinished(actNorm);
if (allTasksFinished) {
try {
this.enactmentEngine.finishTask(actNorm);
} catch (DAOException e) {
// Persistence failure while finishing the task; log and continue
e.printStackTrace();
} catch (WebapseeException e) {
// Enactment failure while finishing the task; log and continue
e.printStackTrace();
}
}
}
// Persistence Operations
actDAO.update(actNorm);
roleDAO.update(role);
reqAgentDAO.daoDelete(reqAgent);
break;
}
}
}
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#defineRequiredAgent
* (java.lang.String, java.lang.String, java.lang.String)
*/
@Override
public boolean defineRequiredAgent(String act_id, String role_id, String ag_id) throws DAOException, WebapseeException {
System.out.println("Activity Id: " + act_id);
System.out.println("Role Id: " + role_id);
System.out.println("Agent Id: " + ag_id);
// Checks for the parameters
if (role_id == null || role_id.equals(""))
return false;
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + act_id
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e);
}
if (act == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound"));
Normal actNorm = (Normal) act;
Object ag;
try {
ag = agentDAO.retrieveBySecondaryKey(ag_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccAgent") + ag_id
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e);
}
if (ag == null)
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcAgent") + ag_id
+ Messages.getString("facades.DynamicModeling.DaoExcNotFound"));
Agent agent = (Agent) ag;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection invAgents = this.getInvolvedAgents(actNorm);
// Test for the agent is already allocated to the activity.
if (invAgents.contains(agent)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheagent") + agent.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcAlreadyAllocToActv") //$NON-NLS-1$
+ actNorm.getName());
}
ProcessModel pmodel = actNorm.getTheProcessModel();
Collection reqAgents = this.getRequiredAgents(actNorm);
Iterator iter = reqAgents.iterator();
if (state.equals("")) { // Rule G6.11 //$NON-NLS-1$
while (iter.hasNext()) {
ReqAgent reqAgent = (ReqAgent) iter.next();
if (reqAgent.getTheAgent() == null) {
// Check if the role is the one specified by parameter
if (!reqAgent.getTheRole().getIdent().equals(role_id))
continue;
if (this.playsRole(agent, reqAgent.getTheRole())) {
reqAgent.setTheAgent(agent);
agent.getTheReqAgent().add(reqAgent);
actNorm.getTheRequiredPeople().add(reqAgent);
reqAgent.setTheNormal(actNorm);
// Dynamic Changes related code
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
actNorm.getTheEnactionDescription().setStateWithMessage(Plain.WAITING);
this.logging.registerGlobalActivityEvent(actNorm, "ToWaiting", "Rule G6.11"); //$NON-NLS-1$ //$NON-NLS-2$
this.updateAgenda(agent, actNorm, Plain.WAITING, "Rule G6.11"); //$NON-NLS-1$
Collection ids = new HashSet();
Iterator iterAgentToSend = invAgents.iterator();
while (iterAgentToSend.hasNext()) {
Agent toSend = (Agent) iterAgentToSend.next();
ids.add(toSend.getIdent());
}
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgentActvAdded"), ids, agent.getId(),
DynamicModelingImpl.ADD, agent.getClass(), agent.getIdent(), null); //$NON-NLS-1$
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G6.11"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
actDAO.update(actNorm);
agentDAO.update(agent);
return true;
}
}
}
} else if ((state.equals(Plain.WAITING) || state.equals(Plain.READY))) {
// Rule G6.12
while (iter.hasNext()) {
ReqAgent reqAgent = (ReqAgent) iter.next();
// Check if the role is the one specified by parameter
if (!reqAgent.getTheRole().getIdent().equals(role_id))
continue;
if (reqAgent.getTheAgent() == null && this.playsRole(agent, reqAgent.getTheRole())) {
reqAgent.setTheAgent(agent);
agent.getTheReqAgent().add(reqAgent);
actNorm.getTheRequiredPeople().add(reqAgent);
reqAgent.setTheNormal(actNorm);
this.updateAgenda(agent, actNorm, state, "Rule G6.12"); //$NON-NLS-1$
Collection ids = new HashSet();
Iterator iterAgentToSend = invAgents.iterator();
while (iterAgentToSend.hasNext()) {
Agent toSend = (Agent) iterAgentToSend.next();
ids.add(toSend.getIdent());
}
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgentActvAdded"), ids, agent.getId(),
DynamicModelingImpl.ADD, agent.getClass(), agent.getIdent(), null); //$NON-NLS-1$
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G6.12"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
actDAO.update(actNorm);
agentDAO.update(agent);
return true;
}
}
} else if ((state.equals(Plain.ACTIVE) || state.equals(Plain.PAUSED))) {
// Rule G6.13
while (iter.hasNext()) {
ReqAgent reqAgent = (ReqAgent) iter.next();
// Check if the role is the one specified by parameter
if (!reqAgent.getTheRole().getIdent().equals(role_id))
continue;
if (reqAgent.getTheAgent() == null && this.playsRole(agent, reqAgent.getTheRole())) {
reqAgent.setTheAgent(agent);
agent.getTheReqAgent().add(reqAgent);
actNorm.getTheRequiredPeople().add(reqAgent);
reqAgent.setTheNormal(actNorm);
this.updateAgenda(agent, actNorm, Plain.READY, "Rule G6.13"); //$NON-NLS-1$
Collection ids = new HashSet();
Iterator iterAgentToSend = invAgents.iterator();
while (iterAgentToSend.hasNext()) {
Agent toSend = (Agent) iterAgentToSend.next();
ids.add(toSend.getIdent());
}
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgentActvAdded"), ids, agent.getId(),
DynamicModelingImpl.ADD, agent.getClass(), agent.getIdent(), null); //$NON-NLS-1$
// Dynamic Changes related code
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G6.13"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
actDAO.update(actNorm);
agentDAO.update(agent);
return true;
}
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
return false;
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#removeRequiredAgent
* (java.lang.String, java.lang.String)
*/
@Override
public void removeRequiredAgent(String act_id, String ag_id) throws DAOException, ModelingException {
// Checks for the parameters
// NormalDAO actDAO = new NormalDAO();
// actDAO.setSession(currentSession);
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActivi") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object ag;
try {
ag = agentDAO.retrieveBySecondaryKey(ag_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccAgent") + //$NON-NLS-1$
ag_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (ag == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcAgent") + ag_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Agent agent = (Agent) ag;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection reqAgents = this.getRequiredAgents(actNorm);
Collection involvedAgents = this.getInvolvedAgents(actNorm);
reqAgents.remove(null);
Iterator iter = reqAgents.iterator();
if (state.equals("")) { //$NON-NLS-1$
while (iter.hasNext()) {
ReqAgent reqAgent = (ReqAgent) iter.next();
if (reqAgent.getTheAgent() != null) {
if (reqAgent.getTheAgent().equals(agent) && this.playsRole(agent, reqAgent.getTheRole())) {
reqAgent.setTheAgent(null);
agent.getTheReqAgent().remove(reqAgent);
// Persistence Operations
actDAO.update(actNorm);
agentDAO.update(agent);
break;
}
}
}
} else if (!(state.equals(Plain.FAILED) || state.equals(Plain.CANCELED) || state.equals(Plain.FINISHED))) {
if (involvedAgents.size() == 1 && !state.equals(Plain.WAITING)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcLastAgent")); //$NON-NLS-1$
}
while (iter.hasNext()) {
ReqAgent reqAgent = (ReqAgent) iter.next();
if (reqAgent.getTheAgent() != null) {
if (reqAgent.getTheAgent().equals(agent) && this.playsRole(agent, reqAgent.getTheRole())) {
reqAgent.setTheAgent(null);
agent.getTheReqAgent().remove(reqAgent);
Process process = this.getTheProcess(actNorm.getTheProcessModel());
Collection agendas = agent.getTheTaskAgenda().getTheProcessAgenda();
Iterator iterAgendas = agendas.iterator();
while (iterAgendas.hasNext()) {
ProcessAgenda agenda = (ProcessAgenda) iterAgendas.next();
if (agenda.getTheProcess().equals(process)) {
this.removeTask(agenda, actNorm);
break;
}
}
Collection ids = new HashSet();
ids.add(agent.getIdent());
Iterator iterInvAgents = involvedAgents.iterator();
while (iterInvAgents.hasNext()) {
Agent invagent = (Agent) iterInvAgents.next();
ids.add(invagent.getIdent());
}
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgentActivityRemoved"), ids, agent.getId(), //$NON-NLS-1$
        DynamicModelingImpl.DEL, agent.getClass(), agent.getIdent(), null);
// Persistence Operations
actDAO.update(actNorm);
agentDAO.update(agent);
break;
}
}
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$
}
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#defineRequiredWorkGroup
* (java.lang.String, java.lang.String)
*/
@Override
public void defineRequiredWorkGroup(String act_id, String WorkGroup_id) throws DAOException, WebapseeException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object gr;
try {
gr = WorkGroupDAO.retrieveBySecondaryKey(WorkGroup_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccWorkGroup") + //$NON-NLS-1$
WorkGroup_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (gr == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcWorkGroup") + WorkGroup_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
WorkGroup WorkGroup = (WorkGroup) gr;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection reqWorkGroups = this.getRequiredWorkGroups(actNorm);
// Test whether the WorkGroup and its agents are already allocated to the
// activity.
Iterator iterExcp = reqWorkGroups.iterator();
while (iterExcp.hasNext()) {
ReqWorkGroup reqAux = (ReqWorkGroup) iterExcp.next();
if (reqAux.getTheWorkGroup() != null) {
if (reqAux.getTheWorkGroup().equals(WorkGroup)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheWorkGroup") + WorkGroup.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcIsAlreAllocToActv") //$NON-NLS-1$
+ actNorm.getName());
}
}
}
Collection agentsWorkGroup = WorkGroup.getTheAgents();
Collection invAgents = this.getInvolvedAgents(actNorm);
Iterator iterAgents = agentsWorkGroup.iterator();
while (iterAgents.hasNext()) {
Agent agent = (Agent) iterAgents.next();
if (agent != null) {
Iterator iterInvAg = invAgents.iterator();
while (iterInvAg.hasNext()) {
Agent invAg = (Agent) iterInvAg.next();
if (agent.equals(invAg)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheagent") + agent.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcIsAlreAllocToActv") //$NON-NLS-1$
+ actNorm.getName());
}
}
}
}
Iterator iter = reqWorkGroups.iterator();
if (state.equals("")) { //$NON-NLS-1$
// Rule G6.17
while (iter.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iter.next();
if (reqWorkGroup.getTheWorkGroup() == null) {
if (this.isSubType(WorkGroup.getTheWorkGroupType(), reqWorkGroup.getTheWorkGroupType())) {
reqWorkGroup.setTheWorkGroup(WorkGroup);
WorkGroup.getTheReqWorkGroup().add(reqWorkGroup);
actNorm.getTheRequiredPeople().add(reqWorkGroup);
reqWorkGroup.setTheNormal(actNorm);
// Dynamic Changes related code
ProcessModel pmodel = actNorm.getTheProcessModel();
String processState = this.getTheProcess(pmodel).getpStatus().name();
if (processState.equals(Process.ENACTING)) {
actNorm.getTheEnactionDescription().setStateWithMessage(Plain.WAITING);
this.logging.registerGlobalActivityEvent(actNorm, "ToWaiting", "Rule G6.17"); //$NON-NLS-1$ //$NON-NLS-2$
Collection agentsInWorkGroup = reqWorkGroup.getTheWorkGroup().getTheAgent();
Iterator iterAg = agentsInWorkGroup.iterator();
while (iterAg.hasNext()) {
Agent ag = (Agent) iterAg.next();
this.updateAgenda(ag, actNorm, Plain.WAITING, "Rule G6.17"); //$NON-NLS-1$
}
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G6.17"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
}
// Persistence Operations
actDAO.update(actNorm);
WorkGroupDAO.update(WorkGroup);
}
}
}
} else if ((state.equals(Plain.WAITING) || state.equals(Plain.READY))) {
// Rule G6.18
while (iter.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iter.next();
if (reqWorkGroup.getTheWorkGroup() == null) {
if (this.isSubType(WorkGroup.getTheWorkGroupType(), reqWorkGroup.getTheWorkGroupType())) {
reqWorkGroup.setTheWorkGroup(WorkGroup);
WorkGroup.getTheReqWorkGroup().add(reqWorkGroup);
actNorm.getTheRequiredPeople().add(reqWorkGroup);
reqWorkGroup.setTheNormal(actNorm);
Collection agentsInWorkGroup = reqWorkGroup.getTheWorkGroup().getTheAgent();
Iterator iterAg = agentsInWorkGroup.iterator();
while (iterAg.hasNext()) {
Agent ag = (Agent) iterAg.next();
this.updateAgenda(ag, actNorm, state, "Rule G6.18"); //$NON-NLS-1$
}
// Dynamic Changes related code
ProcessModel pmodel = actNorm.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G6.18"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
actDAO.update(actNorm);
WorkGroupDAO.update(WorkGroup);
}
}
}
} else if ((state.equals(Plain.ACTIVE) || state.equals(Plain.PAUSED))) {
// Rule G6.19
while (iter.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iter.next();
if (reqWorkGroup.getTheWorkGroup() == null) {
if (this.isSubType(WorkGroup.getTheWorkGroupType(), reqWorkGroup.getTheWorkGroupType())) {
reqWorkGroup.setTheWorkGroup(WorkGroup);
WorkGroup.getTheReqWorkGroup().add(reqWorkGroup);
actNorm.getTheRequiredPeople().add(reqWorkGroup);
reqWorkGroup.setTheNormal(actNorm);
Collection agentsInWorkGroup = reqWorkGroup.getTheWorkGroup().getTheAgent();
Iterator iterAg = agentsInWorkGroup.iterator();
while (iterAg.hasNext()) {
Agent ag = (Agent) iterAg.next();
this.updateAgenda(ag, actNorm, Plain.READY, "Rule G6.19"); //$NON-NLS-1$
}
// Dynamic Changes related code
ProcessModel pmodel = actNorm.getTheProcessModel();
this.enactmentEngine.searchForFiredConnections(pmodel.getId(), "Rule G6.19"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(pmodel.getId());
this.enactmentEngine.determineProcessModelStates(pmodel);
// Persistence Operations
actDAO.update(actNorm);
WorkGroupDAO.update(WorkGroup);
}
}
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#removeRequiredWorkGroup
* (java.lang.String, java.lang.String)
*/
@Override
public void removeRequiredWorkGroup(String act_id, String WorkGroup_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object gr;
try {
gr = WorkGroupDAO.retrieveBySecondaryKey(WorkGroup_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccWorkGroup") + //$NON-NLS-1$
WorkGroup_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (gr == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcWorkGroup") + WorkGroup_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
WorkGroup WorkGroup = (WorkGroup) gr;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection reqWorkGroups = this.getRequiredWorkGroups(actNorm);
Iterator iter = reqWorkGroups.iterator();
if (state.equals("")) { // Rule G6.32 //$NON-NLS-1$
while (iter.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iter.next();
if (reqWorkGroup.getTheWorkGroup() != null) {
if (reqWorkGroup.getTheWorkGroup().equals(WorkGroup)) {
reqWorkGroup.removeFromTheWorkGroup();
// Persistence Operations
actDAO.update(actNorm);
WorkGroupDAO.update(WorkGroup);
break;
}
}
}
} else if (!(state.equals(Plain.FAILED) || state.equals(Plain.CANCELED) || state.equals(Plain.FINISHED))) { // Rule G6.33
while (iter.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iter.next();
if (reqWorkGroup.getTheWorkGroup() != null) {
if (reqWorkGroup.getTheWorkGroup().equals(WorkGroup)) {
reqWorkGroup.removeFromTheWorkGroup();
Collection ids = new HashSet();
Collection agentsInWorkGroup = WorkGroup.getTheAgents();
Iterator iterAgents = agentsInWorkGroup.iterator();
while (iterAgents.hasNext()) {
Agent agent = (Agent) iterAgents.next();
Process process = this.getTheProcess(actNorm.getTheProcessModel());
Collection agendas = agent.getTheTaskAgenda().getTheProcessAgenda();
Iterator iterAgendas = agendas.iterator();
while (iterAgendas.hasNext()) {
ProcessAgenda agenda = (ProcessAgenda) iterAgendas.next();
if (agenda.getTheProcess().equals(process)) {
ids.add(agent.getIdent());
this.removeTask(agenda, actNorm);
break;
}
}
}
this.notifyAgents(actNorm, null, ids, WorkGroup.getId(),
DynamicModelingImpl.DEL, WorkGroup.getClass(), WorkGroup.getIdent(), null);
// Persistence Operations
actDAO.update(actNorm);
WorkGroupDAO.update(WorkGroup);
break;
}
}
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeRequiredWorkGroupType(java.lang.String, java.lang.String)
*/
@Override
public void removeRequiredWorkGroupType(String act_id, String g_type) throws DAOException {
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + act_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object gt;
try {
gt = WorkGroupTypeDAO.retrieveBySecondaryKey(g_type);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccWorkGroupType") + g_type //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (gt == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcWorkGroupType") + g_type + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
WorkGroupType WorkGroupType = (WorkGroupType) gt;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection reqWorkGroups = this.getRequiredWorkGroups(actNorm);
reqWorkGroups.remove(null);
Iterator iter = reqWorkGroups.iterator();
if (state.equals("")) { // Rules G6.24 and G6.26 //$NON-NLS-1$
while (iter.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iter.next();
if (reqWorkGroup.getTheWorkGroupType().equals(WorkGroupType)) {
actNorm.removeFromTheRequiredPeople(reqWorkGroup);
if (reqWorkGroup.getTheWorkGroup() != null) {
reqWorkGroup.getTheWorkGroup().removeFromTheReqWorkGroup(reqWorkGroup);
}
WorkGroupType.removeFromTheReqWorkGroup(reqWorkGroup);
// Persistence Operations
actDAO.update(actNorm);
WorkGroupTypeDAO.update(WorkGroupType);
reqWorkGroupDAO.daoDelete(reqWorkGroup);
break;
}
}
} else {
while (iter.hasNext()) { // Rules G6.25 and G6.27
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iter.next();
if (reqWorkGroup.getTheWorkGroup() == null && reqWorkGroup.getTheWorkGroupType().equals(WorkGroupType)) {
actNorm.removeFromTheRequiredPeople(reqWorkGroup);
WorkGroupType.removeFromTheReqWorkGroup(reqWorkGroup);
// Persistence Operations
actDAO.update(actNorm);
WorkGroupTypeDAO.update(WorkGroupType);
reqWorkGroupDAO.daoDelete(reqWorkGroup);
break;
}
}
}
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#newAgent(java
* .lang.String, java.lang.String, java.lang.String)
*/
@Override
public Integer newAgent(String agent_id, String role_id, String normal_id) throws WebapseeException { // returns the ReqAgent Oid
this.addRequiredRole(normal_id, role_id);
try {
this.defineRequiredAgent(normal_id, role_id, agent_id);
} catch (WebapseeException e) {
this.removeRequiredRole(normal_id, role_id);
throw e;
}
Normal normalAux = (Normal) normDAO.retrieveBySecondaryKey(normal_id);
Collection reqPeople = normalAux.getTheRequiredPeople();
Iterator iter = reqPeople.iterator();
while (iter.hasNext()) {
RequiredPeople reqP = (RequiredPeople) iter.next();
if (reqP instanceof ReqAgent) {
ReqAgent reqAgent = (ReqAgent) reqP;
if (reqAgent.getTheAgent() != null) {
if (reqAgent.getTheAgent().getIdent().equals(agent_id)) {
return reqAgent.getId();
}
}
}
}
return null;
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#newWorkGroup(java
* .lang.String, java.lang.String, java.lang.String)
*/
@Override
public Integer newWorkGroup(String WorkGroup_id, String WorkGroupType_id, String normal_id) throws WebapseeException, ModelingException { // returns the ReqWorkGroup Oid
this.addRequiredWorkGroupType(normal_id, WorkGroupType_id);
try {
this.defineRequiredWorkGroup(normal_id, WorkGroup_id);
} catch (WebapseeException e) {
this.removeRequiredWorkGroupType(normal_id, WorkGroupType_id);
throw e;
}
Normal normalAux = (Normal) normDAO.retrieveBySecondaryKey(normal_id);
Collection reqPeople = normalAux.getTheRequiredPeople();
Iterator iter = reqPeople.iterator();
while (iter.hasNext()) {
RequiredPeople reqP = (RequiredPeople) iter.next();
if (reqP instanceof ReqWorkGroup) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) reqP;
if (reqWorkGroup.getTheWorkGroup() != null) {
if (reqWorkGroup.getTheWorkGroup().getIdent().equals(WorkGroup_id)) {
return reqWorkGroup.getId();
}
}
}
}
return null;
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#newResource(java
* .lang.String, java.lang.String, java.lang.String, float)
*/
@Override
public Integer newResource(String resource_id, String resourceType_id, String normal_id, float amount_needed) throws WebapseeException,
        ModelingException { // returns the RequiredResource Oid
this.addRequiredResourceType(normal_id, resourceType_id);
try {
this.defineRequiredResource(normal_id, resource_id, amount_needed);
} catch (WebapseeException e) {
this.removeRequiredResourceType(normal_id, resourceType_id);
throw e;
}
Normal normalAux = (Normal) normDAO.retrieveBySecondaryKey(normal_id);
Collection reqResource = normalAux.getTheRequiredResources();
Iterator iter = reqResource.iterator();
while (iter.hasNext()) {
RequiredResource reqR = (RequiredResource) iter.next();
if (reqR.getTheResource() != null) {
if (reqR.getTheResource().getIdent().equals(resource_id)) {
return reqR.getId();
}
}
}
return null;
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeCompleteRequiredAgent(java.lang.String, java.lang.String,
* java.lang.String)
*/
@Override
public void removeCompleteRequiredAgent(String normal_id, String agent_id, String role_id) throws DAOException, ModelingException {
this.removeRequiredAgent(normal_id, agent_id);
this.removeRequiredRole(normal_id, role_id);
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeCompleteRequiredWorkGroup(java.lang.String, java.lang.String,
* java.lang.String)
*/
@Override
public void removeCompleteRequiredWorkGroup(String normal_id, String WorkGroup_id, String WorkGroupType_id) throws DAOException, ModelingException {
this.removeRequiredWorkGroup(normal_id, WorkGroup_id);
this.removeRequiredWorkGroupType(normal_id, WorkGroupType_id);
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeCompleteRequiredResource(java.lang.String, java.lang.String,
* java.lang.String)
*/
@Override
public void removeCompleteRequiredResource(String normal_id, String resource_id, String resourceType_id) throws DAOException, ModelingException {
this.removeRequiredResource(normal_id, resource_id);
this.removeRequiredResourceType(normal_id, resourceType_id);
}
/**
* Related to section 7
*/
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* addRequiredResourceType(java.lang.String, java.lang.String)
*/
@Override
public Integer addRequiredResourceType(String act_id, String resType_id) throws DAOException, ModelingException { // returns the RequiredResource Oid
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + act_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object resType;
try {
resType = resTypeDAO.retrieveBySecondaryKey(resType_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccResourceType") + resType_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (resType == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcResourType") + resType_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ResourceType resourceType = (ResourceType) resType;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
RequiredResource reqResource = null;
if (state.equals("") // Rule 7.1 //$NON-NLS-1$
|| !(state.equals(Plain.FAILED) // Rule 7.2
|| state.equals(Plain.CANCELED) || state.equals(Plain.FINISHED))) {
reqResource = new RequiredResource();
reqResource.setTheNormal(actNorm);
actNorm.getTheRequiredResources().add(reqResource);
reqResource.setTheResourceType(resourceType);
resourceType.getTheRequiredResources().add(reqResource);
// Persistence Operations
actDAO.update(actNorm);
resTypeDAO.update(resourceType);
return reqResource.getId();
} else {
throw new ModelingException(
        Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* changeRequiredResourceType(java.lang.String, java.lang.String,
* java.lang.String)
*/
@Override
public Integer changeRequiredResourceType(String act_id, String oldResType_id, String newResType_id) throws DAOException, ModelingException { // returns the RequiredResource Oid
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + act_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object oldResType;
try {
oldResType = resTypeDAO.retrieveBySecondaryKey(oldResType_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccResourceType") + oldResType_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (oldResType == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcResourType") + oldResType_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ResourceType oldResourceType = (ResourceType) oldResType;
Object newResType;
try {
newResType = resTypeDAO.retrieveBySecondaryKey(newResType_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccResourceType") + newResType_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (newResType == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcResourType") + newResType_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ResourceType newResourceType = (ResourceType) newResType;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
RequiredResource reqRes = null;
if (state.equals("") // Rule 7.3 //$NON-NLS-1$
|| !(state.equals(Plain.FAILED) // Rule 7.4
|| state.equals(Plain.CANCELED) || state.equals(Plain.FINISHED))) {
if (!this.hasTheRequiredResourceType(actNorm, newResourceType)) {
if (this.hasTheRequiredResourceType(actNorm, oldResourceType)) {
Collection reqRess = actNorm.getTheRequiredResources();
Iterator iter = reqRess.iterator();
while (iter.hasNext()) {
reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResourceType().equals(oldResourceType)) {
oldResourceType.getTheRequiredResources().remove(reqRes);
reqRes.setTheResourceType(newResourceType);
newResourceType.getTheRequiredResources().add(reqRes);
// Persistence Operations
actDAO.update(actNorm);
resTypeDAO.update(oldResourceType);
resTypeDAO.update(newResourceType);
return reqRes.getId();
}
}
// Reaching this point means no RequiredResource matched the old type,
// so fail explicitly instead of returning the id of an unrelated entry.
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheOldRequiResoType")); //$NON-NLS-1$
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheOldRequiResoType")); //$NON-NLS-1$
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheNewRequiResouType")); //$NON-NLS-1$
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeRequiredResourceType(java.lang.String, java.lang.String)
*/
@Override
public void removeRequiredResourceType(String act_id, String resType_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + act_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object resType;
try {
resType = resTypeDAO.retrieveBySecondaryKey(resType_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccResourceType") + resType_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (resType == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcResourType") + resType_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
ResourceType resourceType = (ResourceType) resType;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
if (state.equals("")) { // Rules G7.5 and G7.7 //$NON-NLS-1$
Collection reqRess = actNorm.getTheRequiredResources();
RequiredResource reqRes = null;
boolean find = false;
Iterator iter = reqRess.iterator();
while (iter.hasNext()) {
reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResourceType().equals(resourceType)) {
if (reqRes.getTheResource() == null) {
find = true;
break;
}
}
}
if (find) {
actNorm.getTheRequiredResources().remove(reqRes);
reqRes.setTheNormal(null);
reqRes.setTheResourceType(null);
resourceType.getTheRequiredResources().remove(reqRes);
// Persistence Operations
actDAO.update(actNorm);
resTypeDAO.update(resourceType);
reqResDAO.daoDelete(reqRes);
}
} else if (!(state.equals(Plain.FAILED) || state.equals(Plain.FINISHED) || state.equals(Plain.CANCELED))) { // Rules G7.6 and G7.8
Collection reqRess = actNorm.getTheRequiredResources();
RequiredResource reqRes = null;
boolean find = false;
Iterator iter = reqRess.iterator();
while (iter.hasNext()) {
reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResourceType().equals(resourceType) && reqRes.getTheResource() == null) {
find = true;
break;
}
}
if (find) {
actNorm.getTheRequiredResources().remove(reqRes);
reqRes.setTheNormal(null);
reqRes.setTheResourceType(null);
resourceType.getTheRequiredResources().remove(reqRes);
// Persistence Operations
actDAO.update(actNorm);
resTypeDAO.update(resourceType);
reqResDAO.daoDelete(reqRes);
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* defineRequiredResource(java.lang.String, java.lang.String, float)
*/
@Override
public Integer defineRequiredResource(String act_id, String res_id, float amount_needed) throws DAOException, ModelingException,
        WebapseeException { // returns the RequiredResource Oid
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException("Problem accessing activity " + act_id //$NON-NLS-1$
        + ": DAO execution failed: " + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
        "Activity " + act_id + " not found"); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object res;
try {
res = resDAO.retrieveBySecondaryKey(res_id);
} catch (Exception/* DAOException */e) {
throw new DAOException("Problem retrieving Resource " + res_id //$NON-NLS-1$
        + ": retrieveBySecondaryKey failed: " + e); //$NON-NLS-1$
}
if (res == null)
throw new DAOException(
        "Resource " + res_id + " not found"); //$NON-NLS-1$ //$NON-NLS-2$
Resource resource = (Resource) res;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection reqRess = actNorm.getTheRequiredResources();
// Test whether the resource is already allocated to the activity.
Iterator iterExcp = reqRess.iterator();
while (iterExcp.hasNext()) {
RequiredResource reqAux = (RequiredResource) iterExcp.next();
if (reqAux.getTheResource() != null) {
if (reqAux.getTheResource().equals(resource)) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheResource") + resource.getIdent() //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.ModelingExcIsAlreAllocToActv") //$NON-NLS-1$
+ actNorm.getIdent());
}
}
}
Iterator iter = reqRess.iterator();
boolean ok = false;
RequiredResource reqRes = null;
if (!this.hasTheResource(actNorm, resource)) {
if (state.equals("")) { //$NON-NLS-1$
while (iter.hasNext()) {
reqRes = (RequiredResource) iter.next();
if (resource.getTheResourceType() != null) {
if (this.isSubType(resource.getTheResourceType(), reqRes.getTheResourceType())) {
if (reqRes.getTheResource() == null) { // Rule G7.11
reqRes.setAmountNeeded(Float.valueOf(amount_needed));
reqRes.setTheResource(resource);
resource.getTheRequiredResources().add(reqRes);
ok = true;
// Persistence Operations
actDAO.update(actNorm);
resDAO.update(resource);
return reqRes.getId();
}
}
}
}
if (!ok) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheRequiTypeDoesNot")); //$NON-NLS-1$
}
return reqRes.getId();
} else if (state.equals(Plain.WAITING) || state.equals(Plain.READY) || state.equals(Plain.PAUSED)) {
while (iter.hasNext()) {
reqRes = (RequiredResource) iter.next();
if (resource.getTheResourceType() != null) {
if (this.isSubType(resource.getTheResourceType(), reqRes.getTheResourceType()) || reqRes.getTheResource() == null) { // Rule G7.12
reqRes.setAmountNeeded(new Float(amount_needed));
reqRes.setTheResource(resource);
resource.getTheRequiredResources().add(reqRes);
ok = true;
Collection ids = new HashSet();
Collection involvedAgents = this.getInvolvedAgents(actNorm);
Iterator iterInvAgents = involvedAgents.iterator();
while (iterInvAgents.hasNext()) {
Agent invagent = (Agent) iterInvAgents.next();
ids.add(invagent.getIdent());
}
this.notifyAgents(actNorm,
"", ids, resource.getId(), DynamicModelingImpl.ADD, resource.getClass(), resource.getIdent(), null); //$NON-NLS-1$
// Persistence Operations
actDAO.update(actNorm);
resDAO.update(resource);
return reqRes.getId();
}
}
}
if (!ok) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheRequiTypeDoesNot")); //$NON-NLS-1$
}
return reqRes.getId();
} else if (state.equals(Plain.ACTIVE)) {
while (iter.hasNext()) {
reqRes = (RequiredResource) iter.next();
if (resource.getTheResourceType() != null) {
if (this.isSubType(resource.getTheResourceType(), reqRes.getTheResourceType()) || reqRes.getTheResource() == null) { // Rule G7.13
reqRes.setAmountNeeded(new Float(amount_needed));
reqRes.setTheResource(resource);
resource.getTheRequiredResources().add(reqRes);
this.allocateResource(reqRes, resource, false);
ok = true;
Collection ids = new HashSet();
Collection involvedAgents = this.getInvolvedAgents(actNorm);
Iterator iterInvAgents = involvedAgents.iterator();
while (iterInvAgents.hasNext()) {
Agent invagent = (Agent) iterInvAgents.next();
ids.add(invagent.getIdent());
}
this.notifyAgents(actNorm,
"", ids, resource.getId(), DynamicModelingImpl.ADD, resource.getClass(), resource.getIdent(), null); //$NON-NLS-1$
// Persistence Operations
actDAO.update(actNorm);
resDAO.update(resource);
return reqRes.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcTheResource") + resource.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasNoResoTypeDef")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
}
if (!ok) {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcTheRequiTypeDoesNot")); //$NON-NLS-1$
}
return reqRes.getId();
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcAlreHasTheResou")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* removeRequiredResource(java.lang.String, java.lang.String)
*/
@Override
public void removeRequiredResource(String act_id, String res_id) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + act_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object res;
try {
res = resDAO.retrieveBySecondaryKey(res_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccResource") + res_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (res == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcResource") + res_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Resource resource = (Resource) res;
// End Checks for the parameters
// Now we start the implementation of the rules
String state = actNorm.getTheEnactionDescription().getState();
Collection reqRess = actNorm.getTheRequiredResources();
Iterator iter = reqRess.iterator();
if (this.hasTheResource(actNorm, resource)) {
if (state.equals("")) { //$NON-NLS-1$
while (iter.hasNext()) {
RequiredResource reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResource() != null) {
if (reqRes.getTheResource().equals(resource)) {
reqRes.setTheResource(null);
resource.getTheRequiredResources().remove(reqRes);
reqRes.setAmountNeeded(new Float(0));
// Persistence Operations
actDAO.update(actNorm);
resDAO.update(resource);
break;
}
}
}
} else if (state.equals(Plain.WAITING) || state.equals(Plain.READY) || state.equals(Plain.PAUSED)) {
while (iter.hasNext()) {
RequiredResource reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResource() != null) {
if (reqRes.getTheResource().equals(resource)) {
reqRes.setTheResource(null);
resource.getTheRequiredResources().remove(reqRes);
reqRes.setAmountNeeded(new Float(0));
Collection ids = new HashSet();
Collection involvedAgents = this.getInvolvedAgents(actNorm);
Iterator iterInvAgents = involvedAgents.iterator();
while (iterInvAgents.hasNext()) {
Agent invagent = (Agent) iterInvAgents.next();
ids.add(invagent.getIdent());
}
this.notifyAgents(actNorm,
"", ids, resource.getId(), DynamicModelingImpl.ADD, resource.getClass(), resource.getIdent(), null); //$NON-NLS-1$
// Persistence Operations
actDAO.update(actNorm);
resDAO.update(resource);
break;
}
}
}
} else if (state.equals(Plain.ACTIVE)) {
while (iter.hasNext()) {
RequiredResource reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResource() != null) {
if (reqRes.getTheResource().equals(resource)) {
reqRes.setTheResource(null);
resource.getTheRequiredResources().remove(reqRes);
reqRes.setAmountNeeded(new Float(0));
this.releaseResourceFromActivity(actNorm, resource);
Collection ids = new HashSet();
Collection involvedAgents = this.getInvolvedAgents(actNorm);
Iterator iterInvAgents = involvedAgents.iterator();
while (iterInvAgents.hasNext()) {
Agent invagent = (Agent) iterInvAgents.next();
ids.add(invagent.getIdent());
}
this.notifyAgents(
actNorm,
Messages.getString("facades.DynamicModeling.DaoExcResource") + resource.getIdent()
+ Messages.getString("facades.DynamicModeling.NotifyAgentRemoved"), ids, resource.getId(),
DynamicModelingImpl.DEL, resource.getClass(), resource.getIdent(), null); //$NON-NLS-1$ //$NON-NLS-2$
// Persistence Operations
actDAO.update(actNorm);
resDAO.update(resource);
break;
}
}
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcHasAlreadyFinis")); //$NON-NLS-1$ //$NON-NLS-2$
}
} else {
throw new ModelingException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + actNorm.getIdent() + Messages.getString("facades.DynamicModeling.ModelingExcDoesntHasResource")); //$NON-NLS-1$ //$NON-NLS-2$
}
}
/*
* (non-Javadoc)
*
* @see org.qrconsult.spm.services.impl.DynamicModelingExtraida#
* changeRequiredResourceAmount(java.lang.String, java.lang.String, float)
*/
@Override
public void changeRequiredResourceAmount(String act_id, String res_id, float new_amount_needed) throws DAOException, ModelingException {
// Checks for the parameters
Object act;
try {
act = normDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + act_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Normal actNorm = (Normal) act;
Object res;
try {
res = consumableDAO.retrieveBySecondaryKey(res_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccConsuResource") + res_id //$NON-NLS-1$
+ Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (res == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcConsumableResource") + res_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Consumable consumable = (Consumable) res;
// End Checks for the parameters
String state = actNorm.getTheEnactionDescription().getState();
Collection reqResources = actNorm.getTheRequiredResources();
Iterator iterRes = reqResources.iterator();
while (iterRes.hasNext()) {
RequiredResource reqRes = (RequiredResource) iterRes.next();
if (reqRes != null) {
Resource resource = reqRes.getTheResource();
if (resource != null) {
if (resource instanceof Consumable) {
Consumable cons = (Consumable) resource;
if (state.equals("") //$NON-NLS-1$
|| state.equals(Plain.WAITING) || state.equals(Plain.READY)) {
reqRes.setAmountNeeded(new Float(new_amount_needed));
break;
} else if (state.equals(Plain.ACTIVE) || state.equals(Plain.PAUSED)) {
float needed = reqRes.getAmountNeeded().floatValue();
float total = cons.getTotalQuantity().floatValue();
float used = cons.getAmountUsed().floatValue();
float balance = total - used;
if (new_amount_needed > needed) {
if (new_amount_needed <= balance) {
cons.setAmountUsed(new Float(used + new_amount_needed - needed));
reqRes.setAmountNeeded(new Float(new_amount_needed));
}
} else {
reqRes.setAmountNeeded(new Float(new_amount_needed));
}
} else {
throw new ModelingException(Messages.getString("facades.DynamicModeling.ModelingExcItIsNotAllowedChange")); //$NON-NLS-1$
}
}
}
}
}
// Persistence Operations
actDAO.update(actNorm);
}
/**
* Related to section 8
*/
/**
* Rule CF8.1, CF8.2, CF8.3 and CF8.4
*/
private boolean controlFlow(Activity act1, Activity act2) {
boolean ret = false;
if (this.areConnected(act1, act2)) { // Rule CF8.1 and CF8.2
ret = true;
} else {
Collection connsTo = this.getConnectionsTo(act1);
Iterator iter = connsTo.iterator();
while (iter.hasNext()) {
Connection connTo = (Connection) iter.next();
Collection suc = this.getSuccessors(connTo);
Iterator iterSuc = suc.iterator();
while (iterSuc.hasNext()) {
Object obj = (Object) iterSuc.next();
if (obj instanceof Activity) {
Activity act = (Activity) obj;
if (this.controlFlow(act, act2)) {
// Rule CF8.3
ret = true;
break;
}
} else if (obj instanceof MultipleCon) {
MultipleCon multi = (MultipleCon) obj;
if (this.controlFlow(multi, act2)) {
// Rule CF8.4
ret = true;
break;
}
}
}
}
}
return ret;
}
/**
* Rule CF8.5, CF8.6 and CF8.7
*/
private boolean controlFlow(Activity act, MultipleCon multi) {
boolean ret = false;
if (this.areConnected(act, multi)) { // Rule CF8.5
ret = true;
} else {
Collection connsTo = this.getConnectionsTo(act);
Iterator iter = connsTo.iterator();
while (iter.hasNext()) {
Connection connTo = (Connection) iter.next();
Collection suc = this.getSuccessors(connTo);
Iterator iterSuc = suc.iterator();
while (iterSuc.hasNext()) {
Object obj = (Object) iterSuc.next();
if (obj instanceof Activity) { // Rule CF8.6
Activity actSuc = (Activity) obj;
if (this.controlFlow(actSuc, multi)) {
ret = true;
break;
}
} else if (obj instanceof MultipleCon) { // Rule CF8.7
MultipleCon multiSuc = (MultipleCon) obj;
if (this.controlFlow(multiSuc, multi)) {
ret = true;
break;
}
}
}
}
}
return ret;
}
/**
* Rule CF8.8, CF8.9 and CF8.10
*/
private boolean controlFlow(MultipleCon multi, Activity act) {
boolean ret = false;
if (this.areConnected(multi, act)) { // Rule CF8.8
ret = true;
} else {
Collection suc = this.getSuccessors(multi);
Iterator iter = suc.iterator();
while (iter.hasNext()) {
Object obj = (Object) iter.next();
if (obj instanceof Activity) { // Rule CF8.9
Activity actSuc = (Activity) obj;
if (this.controlFlow(actSuc, act)) {
ret = true;
break;
}
} else if (obj instanceof MultipleCon) { // Rule CF8.10
MultipleCon multiSuc = (MultipleCon) obj;
if (this.controlFlow(multiSuc, act)) {
ret = true;
break;
}
}
}
}
return ret;
}
/**
* Rule CF8.11, CF8.12 and CF8.13
*/
private boolean controlFlow(MultipleCon multi1, MultipleCon multi2) {
boolean ret = false;
if (this.areConnected(multi1, multi2)) { // Rule CF8.11
ret = true;
} else {
Collection suc = this.getSuccessors(multi1);
Iterator iter = suc.iterator();
while (iter.hasNext()) {
Object obj = (Object) iter.next();
if (obj instanceof Activity) { // Rule CF8.12
Activity act = (Activity) obj;
if (this.controlFlow(act, multi2)) {
ret = true;
break;
}
} else if (obj instanceof MultipleCon) { // Rule CF8.13
MultipleCon multi = (MultipleCon) obj;
if (this.controlFlow(multi, multi2)) {
ret = true;
break;
}
}
}
}
return ret;
}
/**
* Add and Remove agents from WorkGroup.
*/
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#addAgentToWorkGroup
* (java.lang.String, java.lang.String)
*/
@Override
public void addAgentToWorkGroup(String agentId, String WorkGroupId) throws WebapseeException {
// Checks for the parameters
Object ag;
try {
ag = agentDAO.retrieveBySecondaryKey(agentId);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccAgent") + //$NON-NLS-1$
agentId + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (ag == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcAgent") + agentId + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Agent agent = (Agent) ag;
Object gr;
try {
gr = WorkGroupDAO.retrieveBySecondaryKey(WorkGroupId);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccWorkGroup") + //$NON-NLS-1$
WorkGroupId + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (gr == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcWorkGroup") + WorkGroupId + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
WorkGroup WorkGroup = (WorkGroup) gr;
// End Checks for the parameters
// Now we start the implementation of the rules
Collection processModels = new HashSet();
Collection reqWorkGroups = WorkGroup.getTheReqWorkGroup();
Iterator iterReqWorkGroups = reqWorkGroups.iterator();
while (iterReqWorkGroups.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iterReqWorkGroups.next();
Normal actNorm = reqWorkGroup.getTheNormal();
if (actNorm != null) {
Collection invAgents = this.getInvolvedAgents(actNorm);
String state = actNorm.getTheEnactionDescription().getState();
boolean contains = false;
Iterator iter = invAgents.iterator();
while (iter.hasNext()) {
Agent auxAg = (Agent) iter.next();
if (auxAg.equals(agent)) {
contains = true;
break;
}
}
// Check whether the agent is already allocated to the activity.
if (!contains) {
if (state.equals("")) { //$NON-NLS-1$
// Dynamic Changes related code
Process process = this.getTheProcess(actNorm.getTheProcessModel());
String procState = process.getpStatus().name();
if (procState.equals(Process.ENACTING)) {
actNorm.getTheEnactionDescription().setStateWithMessage(Plain.WAITING);
this.logging.registerGlobalActivityEvent(actNorm, "ToWaiting", "Rule Adding Agent to a WorkGroup"); //$NON-NLS-1$ //$NON-NLS-2$
this.updateAgenda(agent, actNorm, Plain.WAITING, "Rule Adding Agent to a WorkGroup"); //$NON-NLS-1$
Collection ids = new HashSet();
ids.add(agent.getIdent());
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgentActvAdded"), ids, agent.getId(),
DynamicModelingImpl.ADD, agent.getClass(), agent.getIdent(), null); //$NON-NLS-1$
ProcessModel pmodel = actNorm.getTheProcessModel();
processModels.add(pmodel);
}
} else if (!state.equals(Plain.FAILED) && !state.equals(Plain.FINISHED) && !state.equals(Plain.CANCELED)) {
// Dynamic Changes related code
this.updateAgenda(agent, actNorm, state, "Rule Adding Agent to a WorkGroup"); //$NON-NLS-1$
Collection ids = new HashSet();
ids.add(agent.getIdent());
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgentActvAdded"), ids, agent.getId(),
DynamicModelingImpl.ADD, agent.getClass(), agent.getIdent(), null); //$NON-NLS-1$
ProcessModel pmodel = actNorm.getTheProcessModel();
processModels.add(pmodel);
}
}
}
}
// Establishing the relationship
WorkGroup.insertIntoTheAgent(agent);
Iterator iterPMs = processModels.iterator();
while (iterPMs.hasNext()) {
ProcessModel procModel = (ProcessModel) iterPMs.next();
this.enactmentEngine.searchForFiredConnections(procModel.getId(), "Rule Adding Agent to a WorkGroup"); //$NON-NLS-1$
this.enactmentEngine.searchForReadyActivities(procModel.getId());
this.enactmentEngine.determineProcessModelStates(procModel);
}
// Persistence Operations
WorkGroupDAO.update(WorkGroup);
agentDAO.update(agent);
}
/*
* (non-Javadoc)
*
* @see
* org.qrconsult.spm.services.impl.DynamicModelingExtraida#removeAgentFromWorkGroup
* (java.lang.String, java.lang.String)
*/
@Override
public void removeAgentFromWorkGroup(String agentId, String WorkGroupId) throws WebapseeException {
// Checks for the parameters
Object ag;
try {
ag = agentDAO.retrieveBySecondaryKey(agentId);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccAgent") + //$NON-NLS-1$
agentId + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (ag == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcAgent") + agentId + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Agent agent = (Agent) ag;
Object gr;
try {
gr = WorkGroupDAO.retrieveBySecondaryKey(WorkGroupId);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccWorkGroup") + //$NON-NLS-1$
WorkGroupId + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (gr == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.DaoExcWorkGroup") + WorkGroupId + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
WorkGroup WorkGroup = (WorkGroup) gr;
// End Checks for the parameters
// Now we start the implementation of the rules
Collection reqWorkGroups = WorkGroup.getTheReqWorkGroup();
Iterator iterReqWorkGroups = reqWorkGroups.iterator();
while (iterReqWorkGroups.hasNext()) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) iterReqWorkGroups.next();
Normal actNorm = reqWorkGroup.getTheNormal();
if (actNorm != null) {
Collection invAgents = this.getInvolvedAgents(actNorm);
String state = actNorm.getTheEnactionDescription().getState();
// Check whether the agent is already allocated to the activity.
if (invAgents.contains(agent)) {
if (state.equals("")) { //$NON-NLS-1$
Collection ids = new HashSet();
Process process = this.getTheProcess(actNorm.getTheProcessModel());
String procState = process.getpStatus().name();
if (procState.equals(Process.ENACTING)) {
Collection agendas = agent.getTheTaskAgenda().getTheProcessAgenda();
Iterator iterAgendas = agendas.iterator();
while (iterAgendas.hasNext()) {
ProcessAgenda agenda = (ProcessAgenda) iterAgendas.next();
if (agenda.getTheProcess().equals(process)) {
ids.add(agent.getIdent());
this.removeTask(agenda, actNorm);
break;
}
}
}
} else if (!state.equals(Plain.FAILED) && !state.equals(Plain.FINISHED) && !state.equals(Plain.CANCELED)) {
Collection ids = new HashSet();
Collection agendas = agent.getTheTaskAgenda().getTheProcessAgenda();
Iterator iterAgendas = agendas.iterator();
while (iterAgendas.hasNext()) {
ProcessAgenda agenda = (ProcessAgenda) iterAgendas.next();
Process process = this.getTheProcess(actNorm.getTheProcessModel());
if (agenda.getTheProcess().equals(process)) {
ids.add(agent.getIdent());
this.removeTask(agenda, actNorm);
break;
}
}
this.notifyAgents(actNorm, Messages.getString("facades.DynamicModeling.NotifyAgentActivityRemoved"), ids, agent.getId(),
DynamicModelingImpl.DEL, agent.getClass(), agent.getIdent(), null); //$NON-NLS-1$
}
}
}
}
// Establishing the relationship
WorkGroup.removeFromTheAgent(agent);
// Persistence Operations
WorkGroupDAO.update(WorkGroup);
agentDAO.update(agent);
}
/***************************************************************************
* |***Auxiliary Methods***|*
**************************************************************************/
private WorkGroup cloneWorkGroup(WorkGroup source, String newWorkGroupId) {
if (source == null)
return null;
source.setActive(false);
WorkGroup destiny = new WorkGroup();
destiny.setIdent(newWorkGroupId);
destiny.setName(newWorkGroupId);
destiny.setDescription(source.getDescription());
destiny.setActive(true);
// destiny.setSubWorkGroups(null);
// destiny.setSuperWorkGroup(null);
destiny.setTheAgents(null);
destiny.setTheWorkGroupType(source.getTheWorkGroupType());
// destiny.setTheReqWorkGroup(theReqWorkGroup);
return destiny;
}
// Related to Agents
/**
* Adds a Task that refers to a Normal Activity to an Agent.
*/
private void addTaskToAgent(Agent agent, Normal actNorm) {
TaskAgenda taskAgenda = agent.getTheTaskAgenda();
Collection procAgendas = taskAgenda.getTheProcessAgenda();
procAgendas.remove(null);
if (procAgendas.isEmpty()) {
ProcessAgenda procAg = new ProcessAgenda();
Process pro = this.getTheProcess(actNorm.getTheProcessModel());
procAg.setTheProcess(pro);
pro.getTheProcessAgendas().add(procAg);
procAg.setTheTaskAgenda(taskAgenda);
procAgendas.add(procAg);
pAgendaDAO.addTask(procAg, actNorm);
} else {
Iterator iter = procAgendas.iterator();
boolean has = false;
while (iter.hasNext()) {
ProcessAgenda procAgenda = (ProcessAgenda) iter.next();
if (procAgenda.getTheProcess().equals(this.getTheProcess(actNorm.getTheProcessModel()))) {
pAgendaDAO.addTask(procAgenda, actNorm);
has = true;
}
}
if (!has) {
ProcessAgenda procAg = new ProcessAgenda();
Process pro = this.getTheProcess(actNorm.getTheProcessModel());
procAg.setTheProcess(pro);
pro.getTheProcessAgendas().add(procAg);
procAg.setTheTaskAgenda(taskAgenda);
procAgendas.add(procAg);
pAgendaDAO.addTask(procAg, actNorm);
}
}
taskAgenda.setTheProcessAgenda(procAgendas);
}
/**
* Returns all the Agents that are allocated to an Normal Activity.
*/
private Collection getInvolvedAgents(Normal normal) {
Collection involvedAgents = new HashSet();
Collection reqPeople = normal.getTheRequiredPeople();
Iterator iter = reqPeople.iterator();
while (iter.hasNext()) {
RequiredPeople reqP = (RequiredPeople) iter.next();
if (reqP instanceof ReqAgent) {
ReqAgent reqAgent = (ReqAgent) reqP;
Agent agent = reqAgent.getTheAgent();
if (agent != null)
involvedAgents.add(agent);
} else { // ReqWorkGroup
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) reqP;
WorkGroup WorkGroup = reqWorkGroup.getTheWorkGroup();
if (WorkGroup != null) {
Collection agents = WorkGroup.getTheAgents();
agents.remove(null);
if (!agents.isEmpty()) {
involvedAgents.addAll(agents);
}
}
}
}
return involvedAgents;
}
/**
* Returns all the RequiredAgents allocated to a Normal Activity.
*/
private Collection getRequiredAgents(Normal normal) {
Collection requiredAgents = new HashSet();
Collection reqPeople = normal.getTheRequiredPeople();
Iterator iter = reqPeople.iterator();
while (iter.hasNext()) {
RequiredPeople reqP = (RequiredPeople) iter.next();
if (reqP instanceof ReqAgent) {
ReqAgent reqAgent = (ReqAgent) reqP;
requiredAgents.add(reqAgent);
}
}
return requiredAgents;
}
/**
* Returns all the RequiredWorkGroups allocated to a Normal Activity.
*/
private Collection getRequiredWorkGroups(Normal normal) {
Collection requiredWorkGroups = new HashSet();
Collection reqPeople = normal.getTheRequiredPeople();
Iterator iter = reqPeople.iterator();
while (iter.hasNext()) {
RequiredPeople reqP = (RequiredPeople) iter.next();
if (reqP instanceof ReqWorkGroup) {
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) reqP;
requiredWorkGroups.add(reqWorkGroup);
}
}
return requiredWorkGroups;
}
/**
* Verifies if a Normal Activity has Required Agents.
*/
private boolean hasInvolvedAgents(Normal actNorm) {
boolean has = false;
Collection reqPeople = actNorm.getTheRequiredPeople();
Iterator iter = reqPeople.iterator();
while (iter.hasNext()) {
RequiredPeople reqP = (RequiredPeople) iter.next();
if (reqP instanceof ReqAgent) {
ReqAgent reqAgent = (ReqAgent) reqP;
if (reqAgent.getTheAgent() != null) {
has = true;
break;
}
} else { // ReqWorkGroup
ReqWorkGroup reqWorkGroup = (ReqWorkGroup) reqP;
Collection agents = reqWorkGroup.getTheWorkGroup().getTheAgents();
agents.remove(null);
if (!agents.isEmpty()) {
has = true;
break;
}
}
}
return has;
}
private void notifyAgentAboutActivityState(Agent agent, Task task, String newState) {
if (task == null) {
return;
}
// Notify the agent of the task's new state.
String message = "<MESSAGE>" + //$NON-NLS-1$
"<ACTIVITYSTATE>" + //$NON-NLS-1$
"<OID>" + task.getId() + "</OID>" + //$NON-NLS-1$ //$NON-NLS-2$
"<CLASS>" + task.getClass().getName() + "</CLASS>" + //$NON-NLS-1$ //$NON-NLS-2$
"<ID>" + task.getTheNormal().getIdent() + "</ID>" + //$NON-NLS-1$ //$NON-NLS-2$
"<NEWSTATE>" + newState + "</NEWSTATE>" + //$NON-NLS-1$ //$NON-NLS-2$
"<BY>APSEE_Manager</BY>" + //$NON-NLS-1$
"</ACTIVITYSTATE>" + //$NON-NLS-1$
"</MESSAGE>"; //$NON-NLS-1$
try {
if (this.remote != null) {
this.remote.sendMessageToUser(message, agent.getIdent());
}
} catch (Exception e) {
e.printStackTrace();
}
}
private void notifyAgents(Normal actNorm, String msg, Long oid, String messageType, Class classe, String identObj, String direction) {
Collection ids = new HashSet();
Collection agents = this.getInvolvedAgents(actNorm);
Iterator iter = agents.iterator();
while (iter.hasNext()) {
Agent agent = (Agent) iter.next();
ids.add(agent.getIdent());
}
// Send Message To Users
if (!ids.isEmpty()) {
String dir = "";
if (direction != null) {
dir = "<DIRECTION>" + direction + "</DIRECTION>";
}
String actId = "";
if (classe.equals(Artifact.class)) {
actId = "<ACTIVITY>" + actNorm.getIdent() + "</ACTIVITY>";
}
String message = "<MESSAGE>" + //$NON-NLS-1$
"<NOTIFY>" + //$NON-NLS-1$
"<OID>" + oid + "</OID>" + //$NON-NLS-1$ //$NON-NLS-2$
"<ID>" + identObj + "</ID>" + //$NON-NLS-1$ //$NON-NLS-2$
"<TYPE>" + messageType + "</TYPE>" + //$NON-NLS-1$
"<CLASS>" + classe.getName() + "</CLASS>" + //$NON-NLS-1$ //$NON-NLS-2$
"<BY>Apsee:" + msg + "</BY>" + //$NON-NLS-1$ //$NON-NLS-2$
dir + actId + "</NOTIFY>" + //$NON-NLS-1$
"</MESSAGE>"; //$NON-NLS-1$
try {
if (this.remote != null) {
this.remote.sendMessageToWorkGroup(message, ids);
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
private void notifyAgents(Normal actNorm, String msg, Collection ids, Long oid, String messageType, Class classe, String identObj,
String direction) {
ids.remove(null);
// Send Message To Users
if (ids != null && !ids.isEmpty()) {
String dir = "";
if (direction != null) {
dir = "<DIRECTION>" + direction + "</DIRECTION>";
}
String actId = "";
if (classe.equals(Artifact.class) || classe.equals(Agent.class) || classe.equals(WorkGroup.class) || classe.equals(Consumable.class)
|| classe.equals(Exclusive.class) || classe.equals(Shareable.class)) {
actId = "<ACTIVITY>" + actNorm.getIdent() + "</ACTIVITY>";
}
String message = "<MESSAGE>" + //$NON-NLS-1$
"<NOTIFY>" + //$NON-NLS-1$
"<OID>" + oid + "</OID>" + //$NON-NLS-1$ //$NON-NLS-2$
"<ID>" + identObj + "</ID>" + //$NON-NLS-1$ //$NON-NLS-2$
"<TYPE>" + messageType + "</TYPE>" + //$NON-NLS-1$
"<CLASS>" + classe.getName() + "</CLASS>" + //$NON-NLS-1$ //$NON-NLS-2$
"<BY>Apsee:" + msg + "</BY>" + //$NON-NLS-1$ //$NON-NLS-2$
dir + actId + "</NOTIFY>" + //$NON-NLS-1$
"</MESSAGE>"; //$NON-NLS-1$
try {
if (this.remote != null) {
this.remote.sendMessageToWorkGroup(message, ids);
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
private boolean playsRole(Agent ag, Role role) {
boolean ret = false;
Collection agPlaysRoles = ag.getTheAgentPlaysRoles();
Iterator iter = agPlaysRoles.iterator();
while (iter.hasNext()) {
AgentPlaysRole agPR = (AgentPlaysRole) iter.next();
if ((agPR.getTheRole() != null) && role != null) {
if (agPR.getTheRole().equals(role)) {
ret = true;
break;
}
}
}
return ret;
}
/**
* Removes the Task that refers to a Normal Activity from a Process Agenda.
*/
private void removeTask(ProcessAgenda procAgenda, Normal normal/*
* , Session
* currentSession
*/) {
Collection tasks = procAgenda.getTheTask();
Iterator iter = tasks.iterator();
while (iter.hasNext()) {
Task task = (Task) iter.next();
if (task == null)
continue;
if (task.getTheNormal().equals(normal)) {
tasks.remove(task);
task.setTheProcessAgenda(null);
String actIdent = task.getTheNormal().getIdent();
try {
taskDAO.daoDelete(task);
} catch (Exception/* DAOException */e) {
e.printStackTrace();
}
// Notify the agent about the removed task.
String message = "<MESSAGE>" + "<NOTIFY>" + "<OID>" + task.getId() + "</OID>" + "<ID>" + actIdent + "</ID>" + "<TYPE>DEL</TYPE>"
+ "<CLASS>" + task.getClass().getName() + "</CLASS>" + "<BY>APSEE_Manager</BY>" + "</NOTIFY>" + "</MESSAGE>";
try {
if (this.remote != null) {
this.remote.sendMessageToUser(message, procAgenda.getTheTaskAgenda().getTheAgent().getIdent());
}
} catch (Exception e) {
System.out.println("DynamicModelingImpl.removeTask() remote reference exception");
}
break;
}
}
}
/**
* Updates the Agent's agenda (Process Agenda) after an Event (Activity's
* State Change).
*/
private void updateAgenda(Agent agent, Normal actNorm, String state, String why/*
* ,
* Session
* currentSession
*/) {
ProcessAgenda agenda = null;
Collection procAgendas = agent.getTheTaskAgenda().getTheProcessAgenda();
Iterator iter = procAgendas.iterator();
while (iter.hasNext()) {
ProcessAgenda procAgenda = (ProcessAgenda) iter.next();
if (procAgenda.getTheProcess().equals(this.getTheProcess(actNorm.getTheProcessModel()))) {
agenda = procAgenda;
break;
}
}
if (agenda == null) {
agenda = new ProcessAgenda();
agenda.setTheTaskAgenda(agent.getTheTaskAgenda());
Process proc = this.getTheProcess(actNorm.getTheProcessModel());
agenda.setTheProcess(proc);
TaskAgenda taskAgenda = agent.getTheTaskAgenda();
taskAgenda.getTheProcessAgenda().add(agenda);
proc.getTheProcessAgendas().add(agenda);
}
Task task = pAgendaDAO.addTask(agenda, actNorm);
task.setLocalState(state);
this.notifyAgentAboutActivityState(agent, task, state);
this.logging.registerAgendaEvent(task, "To" + state, why); //$NON-NLS-1$
}
// Related to Artifacts
/**
* Verifies if an Activity already has an Input Artifact
*/
private boolean hasTheFromArtifact(Normal actNorm, Artifact artifact) {
boolean has = false;
Collection involvedFrom = actNorm.getTheInvolvedArtifactToNormals();
Iterator iter = involvedFrom.iterator();
while (iter.hasNext()) {
InvolvedArtifact invArt = (InvolvedArtifact) iter.next();
if (invArt.getTheArtifact() != null) {
if (invArt.getTheArtifact().equals(artifact)) {
has = true;
break;
}
}
}
return has;
}
/**
* Verifies if an Activity already has an Output Artifact
*/
private boolean hasTheToArtifact(Normal actNorm, Artifact artifact) {
boolean has = false;
Collection involvedTo = actNorm.getTheInvolvedArtifactsFromNormals();
Iterator iter = involvedTo.iterator();
while (iter.hasNext()) {
InvolvedArtifact invArt = (InvolvedArtifact) iter.next();
if (invArt.getTheArtifact() != null) {
if (invArt.getTheArtifact().equals(artifact)) {
has = true;
break;
}
}
}
return has;
}
/**
*
* Attributing to normal activities their involved artifacts
*/
private void createInvolvedArtifacts(ArtifactCon artifactCon, Artifact artifact, Artifact oldArtifact, ArtifactType type) {
if (oldArtifact == null) {
// Attributing to normal activities their input artifacts
Collection<Activity> toActivities = artifactCon.getToActivities();
Iterator iterToActivities = toActivities.iterator();
while (iterToActivities.hasNext()) {
Activity toActivity = (Activity) iterToActivities.next();
if (toActivity != null && toActivity instanceof Normal) {
InvolvedArtifact invArt = new InvolvedArtifact();
invArt.insertIntoTheArtifacts(artifact);
invArt.insertIntoTheArtifactType(type);
invArt.insertIntoInInvolvedArtifacts((Normal) toActivity);
}
}
// Attributing to normal activities their output artifacts
Collection<Activity> fromActivities = artifactCon.getFromActivities();
Iterator iterFromActivities = fromActivities.iterator();
while (iterFromActivities.hasNext()) {
Activity fromActivity = (Activity) iterFromActivities.next();
if (fromActivity != null && fromActivity instanceof Normal) {
InvolvedArtifact invArt = new InvolvedArtifact();
invArt.insertIntoTheArtifacts(artifact);
invArt.insertIntoTheArtifactType(type);
invArt.insertIntoOutInvolvedArtifacts((Normal) fromActivity);
}
}
} else {
// Attributing to normal activities their input artifacts
Collection<Activity> toActivities = artifactCon.getToActivities();
Iterator iterToActivities = toActivities.iterator();
while (iterToActivities.hasNext()) {
Activity toActivity = (Activity) iterToActivities.next();
if (toActivity != null && toActivity instanceof Normal) {
Normal toNormal = (Normal) toActivity;
Collection invArts = toNormal.getTheInvolvedArtifactToNormals();
Iterator iterInvArts = invArts.iterator();
while (iterInvArts.hasNext()) {
InvolvedArtifact invArt = (InvolvedArtifact) iterInvArts.next();
Artifact art = invArt.getTheArtifact();
if (art != null && art.equals(oldArtifact)) {
oldArtifact.removeTheInvolvedArtifacts(invArt);
invArt.insertIntoTheArtifacts(artifact);
break;
}
}
}
}
// Attributing to normal activities their output artifacts
Collection<Activity> fromActivities = artifactCon.getFromActivities();
Iterator iterFromActivities = fromActivities.iterator();
while (iterFromActivities.hasNext()) {
Activity fromActivity = (Activity) iterFromActivities.next();
if (fromActivity != null && fromActivity instanceof Normal) {
Normal fromNormal = (Normal) fromActivity;
Collection invArts = fromNormal.getTheInvolvedArtifactsFromNormals();
Iterator iterInvArts = invArts.iterator();
while (iterInvArts.hasNext()) {
InvolvedArtifact invArt = (InvolvedArtifact) iterInvArts.next();
Artifact art = invArt.getTheArtifact();
if (art != null && art.equals(oldArtifact)) {
oldArtifact.removeTheInvolvedArtifacts(invArt);
invArt.insertIntoTheArtifacts(artifact);
break;
}
}
}
}
}
}
// Related to Connections
private boolean areConnected(Activity act1, Activity act2) {
Collection toAct1 = this.getConnectionsTo(act1);
boolean has = false;
Iterator iter = toAct1.iterator();
while (iter.hasNext()) {
Connection con = (Connection) iter.next();
if (con instanceof Sequence) {
Sequence seq = (Sequence) con;
if (seq.getToActivity().equals(act2)) {
has = true;
break;
}
} else if (con instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) con;
if (branchAND.getToActivities().contains(act2)) {
has = true;
break;
}
} else if (con instanceof BranchConCond) {
BranchConCond branchCond = (BranchConCond) con;
Collection col = branchCond.getTheBranchConCondToActivity();
Iterator iter2 = col.iterator();
while (iter2.hasNext()) {
BranchConCondToActivity atbc = (BranchConCondToActivity) iter2.next();
if (atbc.getTheActivities().equals(act2)) {
has = true;
break;
}
}
if (has)
break;
} else if (con instanceof JoinCon) {
JoinCon join = (JoinCon) con;
if (join.getToActivities() != null) {
if (join.getToActivities().equals(act2)) {
has = true;
break;
}
}
}
}
return has;
}
private boolean areConnected(Activity act, MultipleCon multi) {
boolean are = false;
Collection toAct = this.getConnectionsTo(act);
Iterator iter = toAct.iterator();
while (iter.hasNext()) {
Connection con = (Connection) iter.next();
if (con instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) con;
if (branchAND.equals(multi)) {
are = true;
break;
}
} else if (con instanceof BranchConCond) {
BranchConCond branchCond = (BranchConCond) con;
if (branchCond.equals(multi)) {
are = true;
break;
}
} else if (con instanceof JoinCon) {
JoinCon join = (JoinCon) con;
if (join.equals(multi)) {
are = true;
break;
}
}
}
return are;
}
private boolean areConnected(MultipleCon multi, Activity act) {
Collection fromAct = this.getConnectionsFrom(act);
boolean are = false;
Iterator iter = fromAct.iterator();
while (iter.hasNext()) {
Object obj = iter.next();
Connection con = (Connection) obj;
if (con instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) con;
if (branchAND.equals(multi)) {
are = true;
break;
}
} else if (con instanceof BranchConCond) {
BranchConCond branchCond = (BranchConCond) con;
if (branchCond.equals(multi)) {
are = true;
break;
}
} else if (con instanceof JoinCon) {
JoinCon join = (JoinCon) con;
if (join.equals(multi)) {
are = true;
break;
}
}
}
return are;
}
private boolean areConnected(MultipleCon multi1, MultipleCon multi2) {
Collection suc = this.getSuccessors(multi1);
boolean are = false;
Iterator iter = suc.iterator();
while (iter.hasNext()) {
Object obj = (Object) iter.next();
if (obj instanceof MultipleCon) {
MultipleCon multi = (MultipleCon) obj;
if (multi instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) multi;
if (branchAND.equals(multi2)) {
are = true;
break;
}
} else if (multi instanceof BranchConCond) {
BranchConCond branchCond = (BranchConCond) multi;
if (branchCond.equals(multi2)) {
are = true;
break;
}
} else if (multi instanceof JoinCon) {
JoinCon join = (JoinCon) multi;
if (join.equals(multi2)) {
are = true;
break;
}
}
}
}
return are;
}
/**
* Returns a Collection with the predecessors of an Activity.
*/
private Collection getConnectionsFrom(Activity act) {
Collection connFrom = new ArrayList();
connFrom.addAll(act.getFromSimpleCons());
connFrom.addAll(act.getFromJoinCons());
connFrom.addAll(act.getFromBranchANDCons());
Collection bctas = act.getTheBranchConCondToActivity();
Iterator iterBctas = bctas.iterator();
while (iterBctas.hasNext()) {
BranchConCondToActivity bcta = (BranchConCondToActivity) iterBctas.next();
if (bcta.getTheBranchConCond() != null)
connFrom.add(bcta.getTheBranchConCond());
}
return connFrom;
}
/**
* Returns a Collection with the successors of an Activity.
*/
private Collection getConnectionsTo(Activity act) {
Collection connTo = new ArrayList();
connTo.addAll(act.getToSimpleCons());
connTo.addAll(act.getToJoinCons());
connTo.addAll(act.getToBranchCons());
return connTo;
}
/**
* Returns a Collection with the successors (Activities and Multiple
* Connection) of a Connection.
*/
private Collection getSuccessors(Connection conn) {
Collection succ = new ArrayList();
if (conn instanceof Sequence) {
Sequence seq = (Sequence) conn;
Activity act = seq.getToActivity();
if (act != null)
succ.add(act);
} else if (conn instanceof BranchCon) {
BranchCon branch = (BranchCon) conn;
if (branch instanceof BranchANDCon) {
BranchANDCon bAND = (BranchANDCon) branch;
succ.addAll(bAND.getToActivities());
succ.addAll(bAND.getToMultipleCon());
} else if (branch instanceof BranchConCond) {
BranchConCond bCond = (BranchConCond) branch;
Collection bctmc = bCond.getTheBranchConCondToMultipleCon();
Collection atbc = bCond.getTheBranchConCondToActivity();
Iterator iterMulti = bctmc.iterator(), iterAct = atbc.iterator();
while (iterMulti.hasNext()) {
BranchConCondToMultipleCon multi = (BranchConCondToMultipleCon) iterMulti.next();
MultipleCon multipleCon = multi.getTheMultipleCon();
if (multipleCon != null)
succ.add(multipleCon);
}
while (iterAct.hasNext()) {
BranchConCondToActivity act = (BranchConCondToActivity) iterAct.next();
Activity activity = act.getTheActivities();
if (activity != null)
succ.add(activity);
}
}
} else if (conn instanceof JoinCon) {
JoinCon join = (JoinCon) conn;
Activity activity = join.getToActivities();
if (activity != null) {
succ.add(activity);
}
MultipleCon multipleCon = join.getToMultipleCon();
if (multipleCon != null) {
succ.add(multipleCon);
}
}
return succ;
}
/**
* Returns a Collection with the predecessors (Activities and Multiple
* Connection) of a Connection.
*/
private Collection getPredecessors(Connection conn) {
Collection pred = new ArrayList();
if (conn instanceof Sequence) {
Sequence seq = (Sequence) conn;
Activity act = seq.getFromActivity();
if (act != null)
pred.add(act);
} else if (conn instanceof BranchCon) {
BranchCon branch = (BranchCon) conn;
Activity act = branch.getFromActivity();
if (act != null)
pred.add(act);
MultipleCon multi = branch.getFromMultipleConnection();
if (multi != null)
pred.add(multi);
} else if (conn instanceof JoinCon) {
JoinCon join = (JoinCon) conn;
pred.addAll(join.getFromActivity());
pred.addAll(join.getFromMultipleCon());
}
return pred;
}
private boolean isInPredecessors(Object obj, Connection conn) {
boolean ret = false;
Collection preds = this.getPredecessors(conn);
Iterator iter = preds.iterator();
while (iter.hasNext()) {
Object object = (Object) iter.next();
if (object instanceof Activity) {
Activity activity = (Activity) object;
if (activity.equals(obj)) {
ret = true;
break;
}
} else if (object instanceof MultipleCon) {
MultipleCon multipleCon = (MultipleCon) object;
if (multipleCon.equals(obj)) {
ret = true;
break;
}
}
}
return ret;
}
private void removeAllConnectionsFromActivity(Activity act/*
* , Session
* currentSession
*/) {
// Removing connections to...
Collection branchesTo = act.getToBranchCons();
Iterator iterBranchTo = branchesTo.iterator();
while (iterBranchTo.hasNext()) {
BranchCon branchTo = (BranchCon) iterBranchTo.next();
if (branchTo instanceof BranchANDCon) {
BranchANDCon branchAND = (BranchANDCon) branchTo;
branchAND.setFromActivity(null);
} else {
BranchConCond branchCond = (BranchConCond) branchTo;
branchCond.setFromActivity(null);
}
}
Collection joinsTo = act.getToJoinCon();
Iterator iterJoinTo = joinsTo.iterator();
while (iterJoinTo.hasNext()) {
JoinCon joinTo = (JoinCon) iterJoinTo.next();
joinTo.getFromActivity().remove(act);
}
Collection connsToDelete = new HashSet();
Collection simplesTo = act.getToSimpleCon();
Iterator iterSimpleTo = simplesTo.iterator();
while (iterSimpleTo.hasNext()) {
SimpleCon simpleTo = (SimpleCon) iterSimpleTo.next();
simpleTo.removeFromTheProcessModel();
Activity actTo = simpleTo.getToActivities();
actTo.removeFromFromSimpleCon(simpleTo);
connsToDelete.add(simpleTo);
}
act.clearToSimpleCon();
Iterator iterConnsToDelete = connsToDelete.iterator();
while (iterConnsToDelete.hasNext()) {
SimpleCon conn = (SimpleCon) iterConnsToDelete.next();
// Deleting simple connection from the database
try {
simpleDAO.daoDelete(conn);
} catch (Exception/* DAOException */e) {
e.printStackTrace();
}
}
connsToDelete.clear();
Collection artConsTo = act.getToArtifactCon();
Iterator iterArtTo = artConsTo.iterator();
while (iterArtTo.hasNext()) {
ArtifactCon artConTo = (ArtifactCon) iterArtTo.next();
artConTo.getFromActivity().remove(act);
}
// Removing connections from...
Collection branchesANDFrom = act.getFromBranchANDCon();
Iterator iterBranchANDFrom = branchesANDFrom.iterator();
while (iterBranchANDFrom.hasNext()) {
BranchANDCon branchANDFrom = (BranchANDCon) iterBranchANDFrom.next();
branchANDFrom.getToActivities().remove(act);
}
Collection bctaToDelete = new HashSet();
Collection bctas = act.getTheBranchConCondToActivity();
Iterator iterBCTAs = bctas.iterator();
while (iterBCTAs.hasNext()) {
BranchConCondToActivity branchCondToActivity = (BranchConCondToActivity) iterBCTAs.next();
bctaToDelete.add(branchCondToActivity);
}
bctas.clear();
Iterator iterBctasToDelete = bctaToDelete.iterator();
while (iterBctasToDelete.hasNext()) {
BranchConCondToActivity toDelete = (BranchConCondToActivity) iterBctasToDelete.next();
toDelete.removeFromTheBranchConCond();
// Deleting simple connection from the database
try {
branchConCondToActivityDAO.daoDelete(toDelete);
} catch (Exception/* DAOException */e) {
e.printStackTrace();
}
}
Collection joinsFrom = act.getFromJoinCon();
Iterator iterJoinFrom = joinsFrom.iterator();
while (iterJoinFrom.hasNext()) {
JoinCon joinFrom = (JoinCon) iterJoinFrom.next();
joinFrom.setToActivity(null);
}
Collection simplesFrom = act.getFromSimpleCon();
Iterator iterSimpleFrom = simplesFrom.iterator();
while (iterSimpleFrom.hasNext()) {
SimpleCon simpleFrom = (SimpleCon) iterSimpleFrom.next();
simpleFrom.removeFromTheProcessModel();
Activity actFrom = simpleFrom.getFromActivity();
actFrom.removeFromToSimpleCon(simpleFrom);
connsToDelete.add(simpleFrom);
}
act.clearFromSimpleCon();
iterConnsToDelete = connsToDelete.iterator();
while (iterConnsToDelete.hasNext()) {
SimpleCon conn = (SimpleCon) iterConnsToDelete.next();
// Deleting simple connection from the database
try {
simpleDAO.daoDelete(conn);
} catch (Exception/* DAOException */e) {
e.printStackTrace();
}
}
connsToDelete.clear();
Collection artConsFrom = act.getFromArtifactCons();
Iterator iterArtFrom = artConsFrom.iterator();
while (iterArtFrom.hasNext()) {
ArtifactCon artConFrom = (ArtifactCon) iterArtFrom.next();
artConFrom.getToActivities().remove(act);
}
}
// Related to Process Models
private void copyActivityRelationships(Activity actFrom, Activity actTo) {
if (actFrom.getTheActivityEstimations() != null) {
Collection temp = new HashSet();
Collection actEstimations = actFrom.getTheActivityEstimations();
Iterator iterActEstimations = actEstimations.iterator();
while (iterActEstimations.hasNext()) {
ActivityEstimation actEstimation = (ActivityEstimation) iterActEstimations.next();
temp.add(actEstimation);
// actFrom.getActivityEstimation().remove(actEstimation);
actEstimation.setTheActivity(actTo);
actTo.getTheActivityEstimations().add(actEstimation);
}
actFrom.getTheActivityEstimations().removeAll(temp);
}
if (actFrom.getActivityMetrics() != null) {
Collection temp = new HashSet();
Collection actMetrics = actFrom.getActivityMetrics();
Iterator iterActMetrics = actMetrics.iterator();
while (iterActMetrics.hasNext()) {
ActivityMetric actMetric = (ActivityMetric) iterActMetrics.next();
temp.add(actMetric);
// actFrom.getActivityMetric().remove(actMetric);
actMetric.setTheActivity(actTo);
actTo.getTheActivityMetrics().add(actMetric);
}
actFrom.getTheActivityMetrics().removeAll(temp);
}
if (actFrom.getFromArtifactCons() != null) {
Collection temp = new HashSet();
Collection fromArtifactCons = actFrom.getFromArtifactCons();
Iterator iterFromArtifactCons = fromArtifactCons.iterator();
while (iterFromArtifactCons.hasNext()) {
ArtifactCon fromArtifactCon = (ArtifactCon) iterFromArtifactCons.next();
temp.add(fromArtifactCon);
// actFrom.getFromArtifactCons().remove(fromArtifactCon);
fromArtifactCon.getToActivities().remove(actFrom);
fromArtifactCon.getToActivities().add(actTo);
actTo.getFromArtifactCons().add(fromArtifactCon);
}
actFrom.getFromArtifactCons().removeAll(temp);
}
if (actFrom.getFromBranchANDCon() != null) {
Collection temp = new HashSet();
Collection fromBranchANDs = actFrom.getFromBranchANDCon();
Iterator iterFromBranchANDs = fromBranchANDs.iterator();
while (iterFromBranchANDs.hasNext()) {
BranchANDCon fromBranchAND = (BranchANDCon) iterFromBranchANDs.next();
temp.add(fromBranchAND);
// actFrom.getFromBranchANDCon().remove(fromBranchAND);
fromBranchAND.getToActivities().remove(actFrom);
fromBranchAND.getToActivities().add(actTo);
actTo.getFromBranchANDCon().add(fromBranchAND);
}
actFrom.getFromBranchANDCon().removeAll(temp);
}
if (actFrom.getTheBranchConCondToActivity() != null) {
Collection temp = new HashSet();
Collection branchCondToActivitys = actFrom.getTheBranchConCondToActivity();
Iterator iterBranchConCondToActivitys = branchCondToActivitys.iterator();
while (iterBranchConCondToActivitys.hasNext()) {
BranchConCondToActivity branchCondToActivity = (BranchConCondToActivity) iterBranchConCondToActivitys.next();
temp.add(branchCondToActivity);
// actFrom.getTheBranchConCondToActivity().remove(branchCondToActivity);
branchCondToActivity.setTheActivity(actTo);
actTo.getTheBranchConCondToActivity().add(branchCondToActivity);
}
actFrom.getTheBranchConCondToActivity().removeAll(temp);
}
if (actFrom.getFromJoinCon() != null) {
Collection temp = new HashSet();
Collection fromJoins = actFrom.getFromJoinCon();
Iterator iterFromJoins = fromJoins.iterator();
while (iterFromJoins.hasNext()) {
JoinCon fromJoin = (JoinCon) iterFromJoins.next();
temp.add(fromJoin);
// actFrom.getFromJoinCon().remove(fromJoin);
fromJoin.setToActivity(actTo);
actTo.getFromJoinCon().add(fromJoin);
}
actFrom.getFromJoinCon().removeAll(temp);
}
if (actFrom.getFromSimpleCon() != null) {
Collection temp = new HashSet();
Collection fromSimpleCons = actFrom.getFromSimpleCon();
Iterator iterFromSimpleCons = fromSimpleCons.iterator();
while (iterFromSimpleCons.hasNext()) {
SimpleCon fromSimpleCon = (SimpleCon) iterFromSimpleCons.next();
actTo.getFromSimpleCon().add(fromSimpleCon);
temp.add(fromSimpleCon);
// actFrom.getFromSimpleCon().remove(fromSimpleCon);
fromSimpleCon.setToActivity(actTo);
}
actFrom.getFromSimpleCon().removeAll(temp);
}
if (actFrom.getHasVersions() != null) {
Collection temp = new HashSet();
Collection hasVersions = actFrom.getHasVersions();
Iterator iterHasVersions = hasVersions.iterator();
while (iterHasVersions.hasNext()) {
Activity version = (Activity) iterHasVersions.next();
temp.add(version);
// actFrom.getHasVersions().remove(version);
version.setIsVersion(actTo);
actTo.getHasVersions().add(version);
}
actFrom.getHasVersions().removeAll(temp);
}
if (actFrom.getIsVersion() != null) {
Activity isVersion = actFrom.getIsVersion();
actFrom.setIsVersion(null);
isVersion.getHasVersions().remove(actFrom);
actTo.setIsVersion(isVersion);
isVersion.getHasVersions().add(actTo);
}
if (actFrom.getTheActivitiesType() != null) {
ActivityType activityType = actFrom.getTheActivitiesType();
actFrom.setTheActivityType(null);
activityType.getTheActivities().remove(actFrom);
actTo.setTheActivityType(activityType);
activityType.getTheActivities().add(actTo);
}
if (actFrom.getTheModelingActivityEvent() != null) {
Collection temp = new HashSet();
Collection modelingActivityEvents = actFrom.getTheModelingActivityEvent();
Iterator iterModelingActivityEvents = modelingActivityEvents.iterator();
while (iterModelingActivityEvents.hasNext()) {
ModelingActivityEvent modelingActivityEvent = (ModelingActivityEvent) iterModelingActivityEvents.next();
temp.add(modelingActivityEvent);
// actFrom.getTheModelingActivityEvent().remove(modelingActivityEvent);
modelingActivityEvent.setTheActivity(actTo);
actTo.getTheModelingActivityEvent().add(modelingActivityEvent);
}
actFrom.getTheModelingActivityEvent().removeAll(temp);
}
if (actFrom.getToArtifactCon() != null) {
Collection temp = new HashSet();
Collection toArtifactCons = actFrom.getToArtifactCon();
Iterator iterToArtifactCons = toArtifactCons.iterator();
while (iterToArtifactCons.hasNext()) {
ArtifactCon toArtifactCon = (ArtifactCon) iterToArtifactCons.next();
temp.add(toArtifactCon);
// actFrom.getToArtifactCon().remove(toArtifactCon);
toArtifactCon.getFromActivity().remove(actFrom);
toArtifactCon.getFromActivity().add(actTo);
actTo.getToArtifactCon().add(toArtifactCon);
}
actFrom.getToArtifactCon().removeAll(temp);
}
if (actFrom.getToBranchCon() != null) {
Collection temp = new HashSet();
Collection toBranchs = actFrom.getToBranchCon();
Iterator iterToBranchs = toBranchs.iterator();
while (iterToBranchs.hasNext()) {
BranchCon toBranch = (BranchCon) iterToBranchs.next();
temp.add(toBranch);
// actFrom.getToBranchCon().remove(toBranch);
toBranch.setFromActivity(actTo);
actTo.getToBranchCon().add(toBranch);
}
actFrom.getToBranchCon().removeAll(temp);
}
if (actFrom.getToJoinCon() != null) {
Collection temp = new HashSet();
Collection toJoins = actFrom.getToJoinCon();
Iterator iterToJoins = toJoins.iterator();
while (iterToJoins.hasNext()) {
JoinCon toJoin = (JoinCon) iterToJoins.next();
temp.add(toJoin);
// actFrom.getToJoinCon().remove(toJoin);
toJoin.getFromActivity().remove(actFrom);
toJoin.getFromActivity().add(actTo);
actTo.getToJoinCon().add(toJoin);
}
actFrom.getToJoinCon().removeAll(temp);
}
if (actFrom.getToSimpleCon() != null) {
Collection temp = new HashSet();
Collection toSimpleCons = actFrom.getToSimpleCon();
Iterator iterToSimpleCons = toSimpleCons.iterator();
while (iterToSimpleCons.hasNext()) {
SimpleCon toSimpleCon = (SimpleCon) iterToSimpleCons.next();
temp.add(toSimpleCon);
// actFrom.getToSimpleCon().remove(toSimpleCon);
toSimpleCon.setFromActivity(actTo);
actTo.getToSimpleCon().add(toSimpleCon);
}
actFrom.getToSimpleCon().removeAll(temp);
}
ProcessModel fromProcessModel = actFrom.getTheProcessModel();
actFrom.setTheProcessModel(null);
fromProcessModel.getTheActivities().remove(actFrom);
actTo.setTheProcessModel(fromProcessModel);
fromProcessModel.getTheActivities().add(actTo);
}
/**
* Returns the path of the process model given a String.
*/
private String getPath(String string) {
StringTokenizer st = new StringTokenizer(string, "."); //$NON-NLS-1$
int i = st.countTokens();
String currentModel = st.nextToken();
i--;
while (i > 1) {
currentModel += "." + st.nextToken(); //$NON-NLS-1$
i--;
}
return currentModel;
}
/**
* Returns the state of an Activity, whether it is Plain or Decomposed.
*/
private String getState(Activity activity) {
String state;
if (activity instanceof Plain) {
Plain plain = (Plain) activity;
state = plain.getTheEnactionDescription().getState();
} else { // decomposed
Decomposed decomposed = (Decomposed) activity;
state = decomposed.getTheReferedProcessModel().getPmState();
}
return (state);
}
private Process getTheProcess(ProcessModel pmodel) {
Process process = null;
Decomposed decAct = pmodel.getTheDecomposed();
if (decAct == null)
process = pmodel.getTheProcess();
else
process = this.getTheProcess(decAct.getTheProcessModel());
return process;
}
private boolean isActivityInProcessModel(String act_id, ProcessModel pmodel) {
boolean find = false;
Collection acts = pmodel.getTheActivities();
Iterator iter = acts.iterator();
while (iter.hasNext()) {
Object obj = iter.next();
if (obj instanceof Activity) {
Activity activity = (Activity) obj;
if (activity.getIdent().equals(act_id)) {
find = true;
break;
}
}
}
return find;
}
private void makeWaiting(Activity activity, String why) {
if (activity != null) {
if (activity instanceof Normal) {
Normal normal = (Normal) activity;
EnactionDescription enact = normal.getTheEnactionDescription();
enact.setState(Plain.WAITING);
this.logging.registerModelingActivityEvent(normal, "To" + Plain.WAITING, why); //$NON-NLS-1$ //$NON-NLS-2$
this.makeActivityWaiting(normal, why);
} else if (activity instanceof Decomposed) {
Decomposed decomposed = (Decomposed) activity;
this.makeDecomposedWaiting(decomposed.getTheReferedProcessModel(), why);
}
}
}
private void makeDecomposedWaiting(ProcessModel processModel, String why) {
Collection activities = processModel.getTheActivities();
Iterator iterActivities = activities.iterator();
while (iterActivities.hasNext()) {
Activity activity = (Activity) iterActivities.next();
if (activity != null) {
if (activity instanceof Normal) {
Normal normal = (Normal) activity;
String state = normal.getTheEnactionDescription().getState();
if (state.equals(Plain.READY)) {
this.makeActivityWaiting(normal, why);
}
} else if (activity instanceof Decomposed) {
Decomposed decomposed = (Decomposed) activity;
this.makeDecomposedWaiting(decomposed.getTheReferedProcessModel(), why);
}
}
}
}
private void makeActivityWaiting(Normal actNorm, String why) {
Collection reqPeople = this.getInvolvedAgents(actNorm);
Iterator iter = reqPeople.iterator();
boolean has = iter.hasNext();
while (iter.hasNext()) {
Agent agent = (Agent) iter.next();
// this.addTaskToAgent (agent, actNorm, currentSession);
TaskAgenda agenda = agent.getTheTaskAgenda();
Collection procAgendas = agenda.getTheProcessAgenda();
Iterator iter2 = procAgendas.iterator();
boolean find = false;
while (iter2.hasNext()) {
ProcessAgenda procAgenda = (ProcessAgenda) iter2.next();
if (procAgenda.getTheProcess().equals(this.getTheProcess(actNorm.getTheProcessModel()))) {
Collection tasks = procAgenda.getTheTasks();
Iterator iter3 = tasks.iterator();
while (iter3.hasNext()) {
Task task = (Task) iter3.next();
if (task.getTheNormal().equals(actNorm)) {
task.setLocalState(Plain.WAITING);
this.logging.registerAgendaEvent(task, "To" + Plain.WAITING, why); //$NON-NLS-1$
this.notifyAgentAboutActivityState(agent, task, Plain.WAITING);
find = true;
break;
}
}
if (find)
break;
}
}
}
if (has) {
actNorm.getTheEnactionDescription().setStateWithMessage(Plain.WAITING);
this.logging.registerGlobalActivityEvent(actNorm, "ToWaiting", why); //$NON-NLS-1$
}
}
// Related to Resources
/**
* Called by APSEE Manager. Involves the rules: 12.1, 12.5, 12.6. Allocates a
* Required Resource to a Normal Activity, handling the case where the
* Resource is Consumable.
*/
private void allocateResource(RequiredResource reqResource, Resource res, boolean allocConsum) {
if (res instanceof Exclusive) {
Exclusive allocRes = (Exclusive) res;
allocRes.setState(Exclusive.LOCKED);
this.logging.registerResourceEvent(allocRes, reqResource.getTheNormal(), "ToLocked", "Rule G7.13"); //$NON-NLS-1$ //$NON-NLS-2$
} else if (res instanceof Consumable) {
Consumable allocRes = (Consumable) res;
Float needed = reqResource.getAmountNeeded();
Float total = allocRes.getTotalQuantity();
Float used = allocRes.getAmountUsed();
if (used < total) {
used = new Float(used.floatValue() + needed.floatValue());
if (total <= used) {
allocRes.setAmountUsed(total);
allocRes.setState(Consumable.FINISHED);
} else {
allocRes.setAmountUsed(used);
}
}
} else if (res instanceof Shareable) {
Shareable allocRes = (Shareable) res;
Collection reqResources = allocRes.getTheRequiredResources();
Iterator iter = reqResources.iterator();
while (iter.hasNext()) {
RequiredResource reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResource().equals(allocRes)) {
this.logging.registerResourceEvent(allocRes, reqResource.getTheNormal(), "Requested", "Rule G7.13"); //$NON-NLS-1$ //$NON-NLS-2$
}
}
}
}
private boolean hasTheRequiredResourceType(Normal actNorm, ResourceType resType) {
boolean has = false;
Collection reqResTypes = actNorm.getTheRequiredResources();
Iterator iter = reqResTypes.iterator();
while (iter.hasNext()) {
RequiredResource reqRes = (RequiredResource) iter.next();
if (reqRes.getTheResourceType().equals(resType) && reqRes.getTheResource() == null) {
has = true;
break;
}
}
return has;
}
private boolean hasTheResource(Normal actNorm, Resource res) {
boolean has = false;
Collection reqRess = actNorm.getTheRequiredResources();
Iterator iter = reqRess.iterator();
while (iter.hasNext()) {
RequiredResource reqRes = (RequiredResource) iter.next();
Resource resource = reqRes.getTheResource();
if (resource != null) {
if (resource.equals(res)) {
has = true;
break;
}
}
}
return has;
}
private boolean isSubType(Type subResType, WorkGroupType workGroupType) {
if (subResType.equals(workGroupType))
return true;
if (subResType.getSuperType() != null) {
return this.isSubType(subResType.getSuperType(), workGroupType);
} else
return false;
}
private boolean isSubType(Type subResType, ResourceType resType) {
if (subResType.equals(resType))
return true;
if (subResType.getSuperType() != null) {
return this.isSubType(subResType.getSuperType(), resType);
} else
return false;
}
private boolean isSubType(Type subArtType, ArtifactType artType) {
if (subArtType.equals(artType))
return true;
if (subArtType.getSuperType() != null) {
return this.isSubType(subArtType.getSuperType(), artType);
} else
return false;
}
private void releaseResourceFromActivity(Normal actNorm, Resource res) {
Collection activityResources = actNorm.getTheRequiredResources();
Iterator iter = activityResources.iterator();
while (iter.hasNext()) {
RequiredResource requiredResource = (RequiredResource) iter.next();
Resource resource = requiredResource.getTheResource();
if (resource != null) {
if (resource.equals(res)) {
if (resource instanceof Exclusive) {
Exclusive exc = (Exclusive) resource;
if (exc.getExclusiveStatus().name().equals(Exclusive.LOCKED)) {
exc.setState(Exclusive.AVAILABLE);
this.logging.registerResourceEvent(exc, actNorm, "ToAvailable", "Rule 7.19"); //$NON-NLS-1$ //$NON-NLS-2$
}
} else if (resource instanceof Shareable) {
Shareable sha = (Shareable) resource;
this.logging.registerResourceEvent(sha, actNorm, "Released", "Rule 7.19"); //$NON-NLS-1$ //$NON-NLS-2$
}
break;
}
}
}
}
/**
* Called by APSEE Manager. Involves the rules: G1.4. Releases all the
* resources from a Normal Activity, changing their states to Available.
*/
private void releaseResourcesFromActivity(Normal actNorm) {
Collection activityResources = actNorm.getTheRequiredResources();
Iterator iter = activityResources.iterator();
while (iter.hasNext()) {
RequiredResource requiredResource = (RequiredResource) iter.next();
Resource resource = requiredResource.getTheResource();
if (resource instanceof Exclusive) {
Exclusive exc = (Exclusive) resource;
if (exc.getExclusiveStatus().name().equals(Exclusive.LOCKED)) {
exc.setState(Exclusive.AVAILABLE);
this.logging.registerResourceEvent(exc, actNorm, "ToAvailable", "Rule 12.7"); //$NON-NLS-1$ //$NON-NLS-2$
}
} else if (resource instanceof Shareable) {
Shareable sha = (Shareable) resource;
this.logging.registerResourceEvent(sha, actNorm, "Released", "Rule 12.2"); //$NON-NLS-1$ //$NON-NLS-2$
}
}
}
// Sets the information of the WebapseeObjectDTO
private WebapseeObjectDTO newWebapseeObjectDTO(int oid, String className) {
WebapseeObjectDTO webapseeObjectDTO = new WebapseeObjectDTO();
webapseeObjectDTO.setId(oid);
webapseeObjectDTO.setClassName(className);
return webapseeObjectDTO;
}
@Override
public Integer getGlobalEvents(String act_id) throws DAOException, ModelingException {
System.out.println("atividades: "+act_id);
// Checks for the parameters
Object act;
try {
act = actDAO.retrieveBySecondaryKey(act_id);
} catch (Exception/* DAOException */e) {
throw new DAOException(Messages.getString("facades.DynamicModeling.DaoExcDatabaseAccActv") + //$NON-NLS-1$
act_id + Messages.getString("facades.DynamicModeling.DaoExcFailed") + e); //$NON-NLS-1$
}
if (act == null)
throw new DAOException(
Messages.getString("facades.DynamicModeling.ModelingExcActv") + act_id + Messages.getString("facades.DynamicModeling.DaoExcNotFound")); //$NON-NLS-1$ //$NON-NLS-2$
Activity activity = (Activity) act;
// Now we start the implementation of the rules
if (activity instanceof Decomposed){
System.out.println("eventos:2 "+activity.getTheProcessModel());
System.out.println("eventos:3 "+activity.getTheProcessModel().getTheProcess());
System.out.println("eventos:4 "+activity.getTheProcessModel().getTheProcess().getTheProcessEvent());
System.out.println("eventos:5 "+activity.getTheProcessModel().getTheProcess().getTheProcessEvent().iterator().next().getTheCatalogEvents());
System.out.println("eventos:6 "+activity.getTheProcessModel().getTheProcess().getTheProcessEvent().iterator().next().getTheCatalogEvents().getTheGlobalActivityEvent());
}
else if (activity instanceof Normal) { // Rule G1.4
Normal actNorm = (Normal) activity;
// System.out.println("events:1 "+actNorm.getTheActivitiesType().getTheProcess());
System.out.println("events:2 " + actNorm.getTheActivitiesType());
System.out.println("events:3 " + actNorm.getTheActivitiesType().getTheProcess().iterator().next().getTheProcessEvent());
System.out.println("events:4 " + actNorm.getTheActivitiesType().getTheProcess().iterator().next().getTheProcessEvent().iterator().next().getTheCatalogEvents());
System.out.println("events:5 " + actNorm.getTheActivitiesType().getTheProcess().iterator().next().getTheProcessEvent().iterator().next().getTheCatalogEvents().getTheGlobalActivityEvent());
}
return null;
}
}
<file_sep>/tmp/service/interfaces/EditorServices.java
package br.ufpa.labes.spm.service.interfaces;
import javax.ejb.Remote;
@Remote
public interface EditorServices {
public String getActivities( String sessionID );
public String getEditorContent( String pmodelID );
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/agent/ConfiDAO.java
package br.ufpa.labes.spm.repository.impl.agent;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IConfiDAO;
import br.ufpa.labes.spm.domain.SpmConfiguration;
public class ConfiDAO extends BaseDAO<SpmConfiguration, String> implements IConfiDAO {
protected ConfiDAO(Class<SpmConfiguration> businessClass) {
super(businessClass);
}
public ConfiDAO() {
super(SpmConfiguration.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/artifacts/ArtifactTaskDAO.java
package br.ufpa.labes.spm.repository.impl.artifacts;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.artifacts.IArtifactTaskDAO;
import br.ufpa.labes.spm.domain.ArtifactTask;
public class ArtifactTaskDAO extends BaseDAO<ArtifactTask, Integer> implements IArtifactTaskDAO {
protected ArtifactTaskDAO(Class<ArtifactTask> businessClass) {
super(businessClass);
}
public ArtifactTaskDAO() {
super(ArtifactTask.class);
}
}
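The DAOs above all specialize a generic `BaseDAO<T, ID>`, passing the entity class up through the constructor. A minimal self-contained sketch of that generic-DAO pattern, using a hypothetical in-memory `BaseDAO` stand-in (the real one delegates to JPA):

```java
import java.util.HashMap;
import java.util.Map;

public class GenericDaoDemo {
    // Hypothetical in-memory stand-in for the project's BaseDAO<T, ID>.
    static class BaseDAO<T, ID> {
        final Class<T> businessClass;
        final Map<ID, T> rows = new HashMap<>();
        BaseDAO(Class<T> businessClass) { this.businessClass = businessClass; }
        T daoSave(ID id, T entity) { rows.put(id, entity); return entity; }
        T retrieve(ID id) { return rows.get(id); }
    }

    // Hypothetical entity plus a concrete DAO, mirroring ArtifactTaskDAO's shape:
    // the subclass pins the entity type and key type.
    static class ArtifactTask { String name; }
    static class ArtifactTaskDAO extends BaseDAO<ArtifactTask, Integer> {
        ArtifactTaskDAO() { super(ArtifactTask.class); }
    }

    public static void main(String[] args) {
        ArtifactTaskDAO dao = new ArtifactTaskDAO();
        ArtifactTask t = new ArtifactTask();
        t.name = "review";
        dao.daoSave(1, t);
        System.out.println(dao.retrieve(1).name); // review
        System.out.println(dao.businessClass.getSimpleName()); // ArtifactTask
    }
}
```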
<file_sep>/tmp/service/impl/SystemServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import javax.persistence.Query;
import org.qrconsult.spm.converter.core.Converter;
import org.qrconsult.spm.converter.core.ConverterImpl;
import br.ufpa.labes.spm.exceptions.ImplementationException;
import br.ufpa.labes.spm.repository.interfaces.organizationPolicies.ICompanyDAO;
import br.ufpa.labes.spm.repository.interfaces.organizationPolicies.IProjectDAO;
import br.ufpa.labes.spm.repository.interfaces.organizationPolicies.IDevelopingSystemDAO;
import br.ufpa.labes.spm.service.dto.AgentDTO;
import br.ufpa.labes.spm.service.dto.ProjectDTO;
import br.ufpa.labes.spm.service.dto.DevelopingSystemDTO;
import br.ufpa.labes.spm.service.dto.SystemsDTO;
import br.ufpa.labes.spm.domain.Agent;
import br.ufpa.labes.spm.domain.AgentAffinityAgent;
import br.ufpa.labes.spm.domain.AgentHasAbility;
import br.ufpa.labes.spm.domain.AgentPlaysRole;
import br.ufpa.labes.spm.domain.WorkGroup;
import br.ufpa.labes.spm.domain.Company;
import br.ufpa.labes.spm.domain.Project;
import br.ufpa.labes.spm.domain.DevelopingSystem;
import br.ufpa.labes.spm.service.interfaces.ProjectServices;
import br.ufpa.labes.spm.service.interfaces.SystemServices;
public class SystemServicesImpl implements SystemServices {
private static final String SYSTEM_CLASSNAME = DevelopingSystem.class.getSimpleName();
IProjectDAO projectDAO;
IDevelopingSystemDAO developingSystemDAO;
ICompanyDAO comDAO;
Converter converter = new ConverterImpl();
ProjectServices projectServicesImpl = null;
public DevelopingSystem getSystemForName(String nome) {
String hql;
Query query;
List<DevelopingSystem> result = null;
hql = "select system from " + DevelopingSystem.class.getSimpleName() + " as system where system.name = :sysname";
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
query.setParameter("sysname", nome);
result = query.getResultList();
if (!result.isEmpty()) {
DevelopingSystem sys = result.get(0);
return sys;
} else {
return null;
}
}
private DevelopingSystemDTO convertSystemToSystemDTO(DevelopingSystem system) {
DevelopingSystemDTO systemDTO = new DevelopingSystemDTO();
try {
systemDTO = (DevelopingSystemDTO) converter.getDTO(system, DevelopingSystemDTO.class);
return systemDTO;
} catch (ImplementationException e) {
e.printStackTrace();
}
return null;
}
@Override
public DevelopingSystemDTO getSystem(String nameSystem) {
String hql;
Query query;
List<DevelopingSystem> result = null;
List<Project> abis = new ArrayList<Project>();
List<ProjectDTO> abisDTO = new ArrayList<ProjectDTO>();
DevelopingSystemDTO systemDTO = null;
ProjectDTO abi = null;
hql = "select system from " + DevelopingSystem.class.getSimpleName() + " as system where system.name = :sysname";
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
query.setParameter("sysname", nameSystem);
result = query.getResultList();
if (result.isEmpty()) {
return null;
}
Collection<Project> lis = result.get(0).getTheProject();
java.lang.System.out.println("project1");
if (!lis.isEmpty()) {
java.lang.System.out.println("project2222");
for (Project pro : lis) {
java.lang.System.out.println("project2");
abis.add(pro);
}
}
Converter converter = new ConverterImpl();
for (int i = 0; i < abis.size(); i++) {
try {
abi = (ProjectDTO) converter.getDTO(abis.get(i), ProjectDTO.class);
abisDTO.add(abi);
} catch (ImplementationException e) {
e.printStackTrace();
}
}
try {
systemDTO = (DevelopingSystemDTO) converter.getDTO(result.get(0), DevelopingSystemDTO.class);
if (abis != null) {
systemDTO.setProjetos(abisDTO);
}
} catch (ImplementationException e) {
e.printStackTrace();
}
return systemDTO;
}
@Override
public DevelopingSystemDTO saveSystem(DevelopingSystemDTO systemDTO) {
Converter converter = new ConverterImpl();
java.lang.System.out.println("got here before");
try {
String orgSystem = systemDTO.getIdent();
String systemName = systemDTO.getName();
Project abil;
if (orgSystem != null && !orgSystem.equals("") && systemName != null && !systemName.equals("")) {
Company organize = comDAO.retrieveBySecondaryKey(orgSystem);
DevelopingSystem system = this.getSystemForName(systemName);
if (organize != null) {
java.lang.System.out.println(organize.getAddress() + "trse");
java.lang.System.out.println("got here 1");
if (system != null) {
java.lang.System.out.println("got here 2");
system = (DevelopingSystem) converter.getEntity(systemDTO, system);
java.lang.System.out.println("Projects: " + systemDTO.getProjetos());
// Break every existing project-system link before saving the updated set
for (Project project : system.getTheProject()) {
project.setTheSystem(null);
}
system.setTheProject(null);
developingSystemDAO.update(system);
java.lang.System.out.println("size: " + systemDTO.getProjetos().size());
if (!systemDTO.getProjetos().isEmpty()) {
Collection<Project> lisAbi = new ArrayList<Project>();
for (ProjectDTO pro : systemDTO.getProjetos()) {
java.lang.System.out.println("Building projects 1");
abil = (Project) converter.getEntity(pro, Project.class);
abil.setTheSystem(system);
lisAbi.add(abil);
}
system.setTheProject(lisAbi);
}
system.setTheOrganization(organize);
system = developingSystemDAO.update(system);
java.lang.System.out.println("got here 4");
} else {
java.lang.System.out.println("got here 5");
system = (DevelopingSystem) converter.getEntity(systemDTO, DevelopingSystem.class);
if (systemDTO.getProjetos() != null && !systemDTO.getProjetos().isEmpty()) {
Collection<Project> lisAbi = new ArrayList<Project>();
List<Project> ar = this.getProjects();
for (ProjectDTO pro : systemDTO.getProjetos()) {
java.lang.System.out.println("Building projects 2");
for (int i = 0; i < ar.size(); i++) {
if (pro.getName().equals(ar.get(i).getName())) {
lisAbi.add(ar.get(i));
break;
}
}
}
system.setTheProject(lisAbi);
}
system.setTheOrganization(organize);
system = developingSystemDAO.daoSave(system);
}
java.lang.System.out.println("got here 6");
systemDTO = (DevelopingSystemDTO) converter.getDTO(system, DevelopingSystemDTO.class);
} else {
java.lang.System.out.println("got here 7");
return null;
}
} else {
java.lang.System.out.println("got here 8 " + orgSystem + ":");
return null;
}
} catch (ImplementationException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
java.lang.System.out.println("got here 9");
return systemDTO;
}
@Override
public Boolean removeSystem(DevelopingSystemDTO systemDTO) {
String hql;
Query query;
java.lang.System.out.println("re1" + systemDTO.getName());
hql = "select system from " + DevelopingSystem.class.getSimpleName() + " as system where system.name = :abiname";
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
query.setParameter("abiname", systemDTO.getName());
List<DevelopingSystem> result = query.getResultList();
java.lang.System.out.println("re1");
DevelopingSystem system = result.isEmpty() ? null : result.get(0);
if (system != null) {
if (system.getTheProject() != null) {
Collection<Project> re = system.getTheProject();
for (Project ro : re) {
// roleNeedsDAO.daoDelete(ro);
ro.setTheSystem(null);
}
}
java.lang.System.out.println("re2");
developingSystemDAO.daoDelete(system);
return true;
} else {
java.lang.System.out.println("re3");
return false;
}
}
@Override
public SystemsDTO getSystems(String term, String domain) {
String hql;
Query query;
List<String> resultList = null;
if (domain != null) {
hql = "select system.name from " + DevelopingSystem.class.getSimpleName()
+ " as system where system.name like :termo and system.ident = :domain";
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
query.setParameter("domain", domain);
query.setParameter("termo", "%" + term + "%");
resultList = query.getResultList();
} else {
hql = "select system.name from " + DevelopingSystem.class.getSimpleName()
+ " as system where system.name like :termo";
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
query.setParameter("termo", "%" + term + "%");
resultList = query.getResultList();
}
SystemsDTO systemsDTO = new SystemsDTO();
String[] names = new String[resultList.size()];
resultList.toArray(names);
systemsDTO.setNameSystemas(names);
return systemsDTO;
}
public String[] getSystemsCompany() {
String hql;
Query query;
hql = "select ident from " + Company.class.getSimpleName();
query = comDAO.getPersistenceContext().createQuery(hql);
List<String> comList = new ArrayList<String>();
comList = query.getResultList();
String[] list = new String[comList.size()];
comList.toArray(list);
return list;
}
@Override
public SystemsDTO getSystems() {
String hql;
Query query;
hql = "select name from " + DevelopingSystem.class.getSimpleName();
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
List<String> systems = new ArrayList<String>();
systems = query.getResultList();
String[] list = new String[systems.size()];
systems.toArray(list);
String[] list1 = getSystemsCompany();
SystemsDTO systemsDTO = new SystemsDTO();
systemsDTO.setNameSystemas(list);
systemsDTO.setNomeProjetos(list1);
hql = "SELECT system FROM " + SYSTEM_CLASSNAME + " AS system";
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
List<DevelopingSystem> result = query.getResultList();
java.lang.System.out.println("result: " + result);
java.lang.System.out.println(converter);
java.lang.System.out.println(systemsDTO.getSystemsDTO());
for (DevelopingSystem system2 : result) {
systemsDTO.getSystemsDTO().add(convertSystemToSystemDTO(system2));
}
return systemsDTO;
}
@Override
public Boolean removeProjectToSystem(DevelopingSystemDTO systemDTO) {
String hql;
Query query;
java.lang.System.out.println("re1" + systemDTO.getName());
hql = "select system from " + DevelopingSystem.class.getSimpleName() + " as system where system.name = :abiname";
query = developingSystemDAO.getPersistenceContext().createQuery(hql);
query.setParameter("abiname", systemDTO.getName());
List<DevelopingSystem> result = query.getResultList();
java.lang.System.out.println("re1");
DevelopingSystem system = result.isEmpty() ? null : result.get(0);
if (system != null) {
if (system.getTheProject() != null) {
Collection<Project> re = system.getTheProject();
for (Project ro : re) {
// roleNeedsDAO.daoDelete(ro);
ro.setTheSystem(null);
}
}
java.lang.System.out.println("re2");
system.setTheProject(null);
developingSystemDAO.update(system);
return true;
} else {
java.lang.System.out.println("re3");
return false;
}
}
@Override
public List<ProjectDTO> getProjectToSystem() {
String hql;
Query query;
List<Project> result = null;
List<ProjectDTO> resultDTO = new ArrayList<ProjectDTO>();
hql = "select project from " + Project.class.getSimpleName() + " as project";
query = projectDAO.getPersistenceContext().createQuery(hql);
result = query.getResultList();
Converter converter = new ConverterImpl();
if (!result.isEmpty()) {
ProjectDTO pro = null;
for (int i = 0; i < result.size(); i++) {
try {
pro = (ProjectDTO) converter.getDTO(result.get(i), ProjectDTO.class);
resultDTO.add(pro);
} catch (ImplementationException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
return resultDTO;
} else {
return null;
}
}
public List<Project> getProjects() {
String hql;
Query query;
List<Project> result = null;
// List<ProjectDTO> resultDTO = new ArrayList<ProjectDTO>();
hql = "select project from " + Project.class.getSimpleName() + " as project";
query = projectDAO.getPersistenceContext().createQuery(hql);
result = query.getResultList();
// Converter converter = new ConverterImpl();
if (!result.isEmpty()) {
// ProjectDTO pro = null;
// for(int i = 0; i < result.size();i++){
// try {
// pro = (ProjectDTO) converter.getDTO(result.get(i), ProjectDTO.class);
// resultDTO.add(pro);
// } catch (ImplementationException e) {
// TODO Auto-generated catch block
// e.printStackTrace();
// }
// }
return result;
} else {
return null;
}
}
/*public List<ProjectDTO> getProjectToSystem() {
this.projectServicesImpl = new ProjectServicesImpl();
ProjectsDTO prosDTO = this.projectServicesImpl.getProjects();
List<ProjectDTO> list = prosDTO.getProjects();
if (list.isEmpty()){
return null;
}else{
return list;
}
}*/
}
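The queries in the class above fetch a result list and then take its first row. A minimal sketch of a null-safe first-row helper for that idiom, using a plain list as a stand-in for the JPA result (the helper name is hypothetical):

```java
import java.util.Collections;
import java.util.List;

public class FirstResultDemo {
    // Returns the first row of a query result, or null when the result is empty,
    // so callers can null-check instead of risking IndexOutOfBoundsException.
    static <T> T firstOrNull(List<T> resultList) {
        return resultList.isEmpty() ? null : resultList.get(0);
    }

    public static void main(String[] args) {
        System.out.println((Object) firstOrNull(Collections.emptyList())); // null
        System.out.println(firstOrNull(List.of("sys"))); // sys
    }
}
```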
<file_sep>/tmp/service/ReqAgentService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ReqAgent;
import br.ufpa.labes.spm.repository.ReqAgentRepository;
import br.ufpa.labes.spm.service.dto.ReqAgentDTO;
import br.ufpa.labes.spm.service.mapper.ReqAgentMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ReqAgent}.
*/
@Service
@Transactional
public class ReqAgentService {
private final Logger log = LoggerFactory.getLogger(ReqAgentService.class);
private final ReqAgentRepository reqAgentRepository;
private final ReqAgentMapper reqAgentMapper;
public ReqAgentService(ReqAgentRepository reqAgentRepository, ReqAgentMapper reqAgentMapper) {
this.reqAgentRepository = reqAgentRepository;
this.reqAgentMapper = reqAgentMapper;
}
/**
* Save a reqAgent.
*
* @param reqAgentDTO the entity to save.
* @return the persisted entity.
*/
public ReqAgentDTO save(ReqAgentDTO reqAgentDTO) {
log.debug("Request to save ReqAgent : {}", reqAgentDTO);
ReqAgent reqAgent = reqAgentMapper.toEntity(reqAgentDTO);
reqAgent = reqAgentRepository.save(reqAgent);
return reqAgentMapper.toDto(reqAgent);
}
/**
* Get all the reqAgents.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ReqAgentDTO> findAll() {
log.debug("Request to get all ReqAgents");
return reqAgentRepository.findAll().stream()
.map(reqAgentMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one reqAgent by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ReqAgentDTO> findOne(Long id) {
log.debug("Request to get ReqAgent : {}", id);
return reqAgentRepository.findById(id)
.map(reqAgentMapper::toDto);
}
/**
* Delete the reqAgent by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ReqAgent : {}", id);
reqAgentRepository.deleteById(id);
}
}
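The service above follows the same round-trip as the other JHipster-style services in this repository: map DTO to entity, persist, map back to DTO. A self-contained sketch of that pattern, using hypothetical simplified `Item`/`ItemDTO` classes, a hand-written mapper, and an in-memory map standing in for the Spring Data repository:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.atomic.AtomicLong;

public class MapperRoundTrip {
    // Hypothetical simplified entity and DTO (stand-ins for ReqAgent/ReqAgentDTO).
    static class Item { Long id; String name; }
    static class ItemDTO { Long id; String name; }

    // Hand-written mapper playing the role MapStruct generates.
    static ItemDTO toDto(Item e) { ItemDTO d = new ItemDTO(); d.id = e.id; d.name = e.name; return d; }
    static Item toEntity(ItemDTO d) { Item e = new Item(); e.id = d.id; e.name = d.name; return e; }

    // In-memory stand-in for the repository.
    static final Map<Long, Item> store = new HashMap<>();
    static final AtomicLong seq = new AtomicLong();

    // Mirrors the service's save(): DTO -> entity -> persist -> DTO.
    static ItemDTO save(ItemDTO dto) {
        Item entity = toEntity(dto);
        if (entity.id == null) entity.id = seq.incrementAndGet();
        store.put(entity.id, entity);
        return toDto(entity);
    }

    // Mirrors findOne(): Optional over the lookup, mapped back to a DTO.
    static Optional<ItemDTO> findOne(Long id) {
        return Optional.ofNullable(store.get(id)).map(MapperRoundTrip::toDto);
    }

    public static void main(String[] args) {
        ItemDTO in = new ItemDTO();
        in.name = "demo";
        ItemDTO saved = save(in);
        System.out.println(saved.name); // demo
        System.out.println(findOne(saved.id).isPresent()); // true
    }
}
```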
<file_sep>/src/main/java/br/ufpa/labes/spm/service/AgentPlaysRoleService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.AgentPlaysRole;
import br.ufpa.labes.spm.repository.AgentPlaysRoleRepository;
import br.ufpa.labes.spm.service.dto.AgentPlaysRoleDTO;
import br.ufpa.labes.spm.service.mapper.AgentPlaysRoleMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link AgentPlaysRole}.
*/
@Service
@Transactional
public class AgentPlaysRoleService {
private final Logger log = LoggerFactory.getLogger(AgentPlaysRoleService.class);
private final AgentPlaysRoleRepository agentPlaysRoleRepository;
private final AgentPlaysRoleMapper agentPlaysRoleMapper;
public AgentPlaysRoleService(AgentPlaysRoleRepository agentPlaysRoleRepository, AgentPlaysRoleMapper agentPlaysRoleMapper) {
this.agentPlaysRoleRepository = agentPlaysRoleRepository;
this.agentPlaysRoleMapper = agentPlaysRoleMapper;
}
/**
* Save an agentPlaysRole.
*
* @param agentPlaysRoleDTO the entity to save.
* @return the persisted entity.
*/
public AgentPlaysRoleDTO save(AgentPlaysRoleDTO agentPlaysRoleDTO) {
log.debug("Request to save AgentPlaysRole : {}", agentPlaysRoleDTO);
AgentPlaysRole agentPlaysRole = agentPlaysRoleMapper.toEntity(agentPlaysRoleDTO);
agentPlaysRole = agentPlaysRoleRepository.save(agentPlaysRole);
return agentPlaysRoleMapper.toDto(agentPlaysRole);
}
/**
* Get all the agentPlaysRoles.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AgentPlaysRoleDTO> findAll() {
log.debug("Request to get all AgentPlaysRoles");
return agentPlaysRoleRepository.findAll().stream()
.map(agentPlaysRoleMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one agentPlaysRole by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<AgentPlaysRoleDTO> findOne(Long id) {
log.debug("Request to get AgentPlaysRole : {}", id);
return agentPlaysRoleRepository.findById(id)
.map(agentPlaysRoleMapper::toDto);
}
/**
* Delete the agentPlaysRole by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete AgentPlaysRole : {}", id);
agentPlaysRoleRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/people/OrganizationDAO.java
package br.ufpa.labes.spm.repository.impl.people;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.people.IOrganizationDAO;
import br.ufpa.labes.spm.domain.Organization;
public class OrganizationDAO extends BaseDAO<Organization, String> implements IOrganizationDAO {
public OrganizationDAO() {
super(Organization.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plannerInfo/IInstantiationPolicyLogDAO.java
package br.ufpa.labes.spm.repository.interfaces.plannerInfo;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.InstantiationPolicyLog;
public interface IInstantiationPolicyLogDAO extends IBaseDAO<InstantiationPolicyLog, Integer> {}
<file_sep>/tmp/service/mapper/VCSRepositoryMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.VCSRepositoryDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link VCSRepository} and its DTO {@link VCSRepositoryDTO}.
*/
@Mapper(componentModel = "spring", uses = {StructureMapper.class})
public interface VCSRepositoryMapper extends EntityMapper<VCSRepositoryDTO, VCSRepository> {
@Mapping(source = "theStructure.id", target = "theStructureId")
VCSRepositoryDTO toDto(VCSRepository vCSRepository);
@Mapping(source = "theStructureId", target = "theStructure")
@Mapping(target = "theArtifacts", ignore = true)
@Mapping(target = "removeTheArtifacts", ignore = true)
VCSRepository toEntity(VCSRepositoryDTO vCSRepositoryDTO);
default VCSRepository fromId(Long id) {
if (id == null) {
return null;
}
VCSRepository vCSRepository = new VCSRepository();
vCSRepository.setId(id);
return vCSRepository;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/AssetStatMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.AssetStatDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link AssetStat} and its DTO {@link AssetStatDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface AssetStatMapper extends EntityMapper<AssetStatDTO, AssetStat> {
@Mapping(target = "theAsset", ignore = true)
AssetStat toEntity(AssetStatDTO assetStatDTO);
default AssetStat fromId(Long id) {
if (id == null) {
return null;
}
AssetStat assetStat = new AssetStat();
assetStat.setId(id);
return assetStat;
}
}
<file_sep>/tmp/service/mapper/AgentWorkingLoadMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.AgentWorkingLoadDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link AgentWorkingLoad} and its DTO {@link AgentWorkingLoadDTO}.
*/
@Mapper(componentModel = "spring", uses = {AgentMapper.class})
public interface AgentWorkingLoadMapper extends EntityMapper<AgentWorkingLoadDTO, AgentWorkingLoad> {
@Mapping(source = "theAgent.id", target = "theAgentId")
AgentWorkingLoadDTO toDto(AgentWorkingLoad agentWorkingLoad);
@Mapping(source = "theAgentId", target = "theAgent")
AgentWorkingLoad toEntity(AgentWorkingLoadDTO agentWorkingLoadDTO);
default AgentWorkingLoad fromId(Long id) {
if (id == null) {
return null;
}
AgentWorkingLoad agentWorkingLoad = new AgentWorkingLoad();
agentWorkingLoad.setId(id);
return agentWorkingLoad;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/SequenceService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.Sequence;
import br.ufpa.labes.spm.repository.SequenceRepository;
import br.ufpa.labes.spm.service.dto.SequenceDTO;
import br.ufpa.labes.spm.service.mapper.SequenceMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;
/**
* Service Implementation for managing {@link Sequence}.
*/
@Service
@Transactional
public class SequenceService {
private final Logger log = LoggerFactory.getLogger(SequenceService.class);
private final SequenceRepository sequenceRepository;
private final SequenceMapper sequenceMapper;
public SequenceService(SequenceRepository sequenceRepository, SequenceMapper sequenceMapper) {
this.sequenceRepository = sequenceRepository;
this.sequenceMapper = sequenceMapper;
}
/**
* Save a sequence.
*
* @param sequenceDTO the entity to save.
* @return the persisted entity.
*/
public SequenceDTO save(SequenceDTO sequenceDTO) {
log.debug("Request to save Sequence : {}", sequenceDTO);
Sequence sequence = sequenceMapper.toEntity(sequenceDTO);
sequence = sequenceRepository.save(sequence);
return sequenceMapper.toDto(sequence);
}
/**
* Get all the sequences.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<SequenceDTO> findAll() {
log.debug("Request to get all Sequences");
return sequenceRepository.findAll().stream()
.map(sequenceMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get all the sequences where TheDependency is {@code null}.
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<SequenceDTO> findAllWhereTheDependencyIsNull() {
log.debug("Request to get all sequences where TheDependency is null");
return StreamSupport
.stream(sequenceRepository.findAll().spliterator(), false)
.filter(sequence -> sequence.getTheDependency() == null)
.map(sequenceMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one sequence by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<SequenceDTO> findOne(Long id) {
log.debug("Request to get Sequence : {}", id);
return sequenceRepository.findById(id)
.map(sequenceMapper::toDto);
}
/**
* Delete the sequence by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete Sequence : {}", id);
sequenceRepository.deleteById(id);
}
}
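`findAllWhereTheDependencyIsNull` above streams the full result set and keeps only entities whose association is null. A self-contained sketch of that filter-and-map idiom over a plain list, with a hypothetical `Seq` class standing in for the `Sequence` entity:

```java
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;

public class NullFilterDemo {
    // Hypothetical stand-in for the Sequence entity: dep is null when unattached.
    static class Seq {
        final String name; final String dep;
        Seq(String name, String dep) { this.name = name; this.dep = dep; }
    }

    // Same shape as the service method: stream an Iterable, keep entities
    // whose association is null, map each to its "DTO" (here just the name).
    static List<String> namesWhereDepIsNull(Iterable<Seq> all) {
        return StreamSupport.stream(all.spliterator(), false)
            .filter(s -> s.dep == null)
            .map(s -> s.name)
            .collect(Collectors.toCollection(LinkedList::new));
    }

    public static void main(String[] args) {
        List<Seq> all = Arrays.asList(new Seq("a", null), new Seq("b", "x"), new Seq("c", null));
        System.out.println(namesWhereDepIsNull(all)); // [a, c]
    }
}
```

Note the filtering happens in memory; for large tables a repository query method would push the null check into SQL instead.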
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/connections/DependencyDAO.java
package br.ufpa.labes.spm.repository.impl.connections;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IDependencyDAO;
import br.ufpa.labes.spm.domain.Dependency;
public class DependencyDAO extends BaseDAO<Dependency, Integer> implements IDependencyDAO {
protected DependencyDAO(Class<Dependency> businessClass) {
super(businessClass);
}
public DependencyDAO() {
super(Dependency.class);
}
}
<file_sep>/tmp/service/AgentHasAbilityService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.AgentHasAbility;
import br.ufpa.labes.spm.repository.AgentHasAbilityRepository;
import br.ufpa.labes.spm.service.dto.AgentHasAbilityDTO;
import br.ufpa.labes.spm.service.mapper.AgentHasAbilityMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link AgentHasAbility}.
*/
@Service
@Transactional
public class AgentHasAbilityService {
private final Logger log = LoggerFactory.getLogger(AgentHasAbilityService.class);
private final AgentHasAbilityRepository agentHasAbilityRepository;
private final AgentHasAbilityMapper agentHasAbilityMapper;
public AgentHasAbilityService(AgentHasAbilityRepository agentHasAbilityRepository, AgentHasAbilityMapper agentHasAbilityMapper) {
this.agentHasAbilityRepository = agentHasAbilityRepository;
this.agentHasAbilityMapper = agentHasAbilityMapper;
}
/**
* Save an agentHasAbility.
*
* @param agentHasAbilityDTO the entity to save.
* @return the persisted entity.
*/
public AgentHasAbilityDTO save(AgentHasAbilityDTO agentHasAbilityDTO) {
log.debug("Request to save AgentHasAbility : {}", agentHasAbilityDTO);
AgentHasAbility agentHasAbility = agentHasAbilityMapper.toEntity(agentHasAbilityDTO);
agentHasAbility = agentHasAbilityRepository.save(agentHasAbility);
return agentHasAbilityMapper.toDto(agentHasAbility);
}
/**
* Get all the agentHasAbilities.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AgentHasAbilityDTO> findAll() {
log.debug("Request to get all AgentHasAbilities");
return agentHasAbilityRepository.findAll().stream()
.map(agentHasAbilityMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one agentHasAbility by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<AgentHasAbilityDTO> findOne(Long id) {
log.debug("Request to get AgentHasAbility : {}", id);
return agentHasAbilityRepository.findById(id)
.map(agentHasAbilityMapper::toDto);
}
/**
* Delete the agentHasAbility by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete AgentHasAbility : {}", id);
agentHasAbilityRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/OrganizationEstimationService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.OrganizationEstimation;
import br.ufpa.labes.spm.repository.OrganizationEstimationRepository;
import br.ufpa.labes.spm.service.dto.OrganizationEstimationDTO;
import br.ufpa.labes.spm.service.mapper.OrganizationEstimationMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link OrganizationEstimation}.
*/
@Service
@Transactional
public class OrganizationEstimationService {
private final Logger log = LoggerFactory.getLogger(OrganizationEstimationService.class);
private final OrganizationEstimationRepository organizationEstimationRepository;
private final OrganizationEstimationMapper organizationEstimationMapper;
public OrganizationEstimationService(OrganizationEstimationRepository organizationEstimationRepository, OrganizationEstimationMapper organizationEstimationMapper) {
this.organizationEstimationRepository = organizationEstimationRepository;
this.organizationEstimationMapper = organizationEstimationMapper;
}
/**
* Save an organizationEstimation.
*
* @param organizationEstimationDTO the entity to save.
* @return the persisted entity.
*/
public OrganizationEstimationDTO save(OrganizationEstimationDTO organizationEstimationDTO) {
log.debug("Request to save OrganizationEstimation : {}", organizationEstimationDTO);
OrganizationEstimation organizationEstimation = organizationEstimationMapper.toEntity(organizationEstimationDTO);
organizationEstimation = organizationEstimationRepository.save(organizationEstimation);
return organizationEstimationMapper.toDto(organizationEstimation);
}
/**
* Get all the organizationEstimations.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<OrganizationEstimationDTO> findAll() {
log.debug("Request to get all OrganizationEstimations");
return organizationEstimationRepository.findAll().stream()
.map(organizationEstimationMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one organizationEstimation by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<OrganizationEstimationDTO> findOne(Long id) {
log.debug("Request to get OrganizationEstimation : {}", id);
return organizationEstimationRepository.findById(id)
.map(organizationEstimationMapper::toDto);
}
/**
* Delete the organizationEstimation by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete OrganizationEstimation : {}", id);
organizationEstimationRepository.deleteById(id);
}
}
<file_sep>/tmp/service/InvolvedArtifactService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.InvolvedArtifact;
import br.ufpa.labes.spm.repository.InvolvedArtifactRepository;
import br.ufpa.labes.spm.service.dto.InvolvedArtifactDTO;
import br.ufpa.labes.spm.service.mapper.InvolvedArtifactMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link InvolvedArtifact}.
*/
@Service
@Transactional
public class InvolvedArtifactService {
private final Logger log = LoggerFactory.getLogger(InvolvedArtifactService.class);
private final InvolvedArtifactRepository involvedArtifactRepository;
private final InvolvedArtifactMapper involvedArtifactMapper;
public InvolvedArtifactService(InvolvedArtifactRepository involvedArtifactRepository, InvolvedArtifactMapper involvedArtifactMapper) {
this.involvedArtifactRepository = involvedArtifactRepository;
this.involvedArtifactMapper = involvedArtifactMapper;
}
/**
* Save an involvedArtifact.

*
* @param involvedArtifactDTO the entity to save.
* @return the persisted entity.
*/
public InvolvedArtifactDTO save(InvolvedArtifactDTO involvedArtifactDTO) {
log.debug("Request to save InvolvedArtifact : {}", involvedArtifactDTO);
InvolvedArtifact involvedArtifact = involvedArtifactMapper.toEntity(involvedArtifactDTO);
involvedArtifact = involvedArtifactRepository.save(involvedArtifact);
return involvedArtifactMapper.toDto(involvedArtifact);
}
/**
* Get all the involvedArtifacts.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<InvolvedArtifactDTO> findAll() {
log.debug("Request to get all InvolvedArtifacts");
return involvedArtifactRepository.findAll().stream()
.map(involvedArtifactMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one involvedArtifact by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<InvolvedArtifactDTO> findOne(Long id) {
log.debug("Request to get InvolvedArtifact : {}", id);
return involvedArtifactRepository.findById(id)
.map(involvedArtifactMapper::toDto);
}
/**
* Delete the involvedArtifact by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete InvolvedArtifact : {}", id);
involvedArtifactRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ResourceEventMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ResourceEventDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ResourceEvent} and its DTO {@link ResourceEventDTO}.
*/
@Mapper(componentModel = "spring", uses = {NormalMapper.class, CatalogEventMapper.class, ResourceMapper.class})
public interface ResourceEventMapper extends EntityMapper<ResourceEventDTO, ResourceEvent> {
@Mapping(source = "theNormal.id", target = "theNormalId")
@Mapping(source = "theCatalogEvent.id", target = "theCatalogEventId")
@Mapping(source = "theResource.id", target = "theResourceId")
ResourceEventDTO toDto(ResourceEvent resourceEvent);
@Mapping(source = "theNormalId", target = "theNormal")
@Mapping(source = "theCatalogEventId", target = "theCatalogEvent")
@Mapping(source = "theResourceId", target = "theResource")
@Mapping(target = "theRequestorAgents", ignore = true)
@Mapping(target = "removeTheRequestorAgent", ignore = true)
ResourceEvent toEntity(ResourceEventDTO resourceEventDTO);
default ResourceEvent fromId(Long id) {
if (id == null) {
return null;
}
ResourceEvent resourceEvent = new ResourceEvent();
resourceEvent.setId(id);
return resourceEvent;
}
}
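The `fromId` default method above builds a reference entity that carries only its primary key, so MapStruct can wire a relationship without loading the target row. A standalone sketch of the pattern (the `Ref` class here is a hypothetical stand-in, not a project class):

```java
import java.util.Objects;

public class FromIdSketch {
    // Hypothetical entity stand-in: only the id matters for wiring a relationship.
    static class Ref {
        private Long id;
        Long getId() { return id; }
        void setId(Long id) { this.id = id; }
    }

    // Mirrors the generated fromId(Long): a null id means "no relationship".
    static Ref fromId(Long id) {
        if (id == null) {
            return null;
        }
        Ref ref = new Ref();
        ref.setId(id);
        return ref;
    }

    public static void main(String[] args) {
        assert fromId(null) == null;
        assert Objects.equals(fromId(42L).getId(), 42L);
    }
}
```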
<file_sep>/tmp/service/dto/StructureDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.Objects;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Structure} entity.
*/
public class StructureDTO implements Serializable {
private Long id;
private Long rootElementId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public Long getRootElementId() {
return rootElementId;
}
public void setRootElementId(Long nodeId) {
this.rootElementId = nodeId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
StructureDTO structureDTO = (StructureDTO) o;
if (structureDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), structureDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "StructureDTO{" +
"id=" + getId() +
", rootElement=" + getRootElementId() +
"}";
}
}
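All the generated DTOs share this id-only equality contract: two DTOs are equal only when both ids are non-null and equal, so an unsaved DTO never equals another unsaved one. A self-contained sketch of the contract (the `Dto` class is a hypothetical stand-in):

```java
import java.util.Objects;

public class DtoEqualitySketch {
    // Hypothetical DTO following the same id-only contract as StructureDTO.
    static class Dto {
        private final Long id;
        Dto(Long id) { this.id = id; }
        Long getId() { return id; }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (o == null || getClass() != o.getClass()) return false;
            Dto other = (Dto) o;
            // Unsaved instances (null id) are only equal by reference.
            if (other.getId() == null || getId() == null) return false;
            return Objects.equals(getId(), other.getId());
        }

        @Override
        public int hashCode() {
            return Objects.hashCode(getId());
        }
    }

    public static void main(String[] args) {
        // Same id -> equal, regardless of any other state.
        assert new Dto(1L).equals(new Dto(1L));
        // Two distinct unsaved DTOs (null ids) are never equal.
        assert !new Dto(null).equals(new Dto(null));
    }
}
```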
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plainActivities/IRequiredPeopleDAO.java
package br.ufpa.labes.spm.repository.interfaces.plainActivities;
import java.util.Collection;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.RequiredPeople;
public interface IRequiredPeopleDAO extends IBaseDAO<RequiredPeople, Integer> {
public Collection<String> getReqPeopleEmails(String normalIdent);
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/ArtifactTypeResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.ArtifactTypeService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.ArtifactTypeDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.ArtifactType}.
*/
@RestController
@RequestMapping("/api")
public class ArtifactTypeResource {
private final Logger log = LoggerFactory.getLogger(ArtifactTypeResource.class);
private static final String ENTITY_NAME = "artifactType";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final ArtifactTypeService artifactTypeService;
public ArtifactTypeResource(ArtifactTypeService artifactTypeService) {
this.artifactTypeService = artifactTypeService;
}
/**
* {@code POST /artifact-types} : Create a new artifactType.
*
* @param artifactTypeDTO the artifactTypeDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new artifactTypeDTO, or with status {@code 400 (Bad Request)} if the artifactType already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/artifact-types")
public ResponseEntity<ArtifactTypeDTO> createArtifactType(@RequestBody ArtifactTypeDTO artifactTypeDTO) throws URISyntaxException {
log.debug("REST request to save ArtifactType : {}", artifactTypeDTO);
if (artifactTypeDTO.getId() != null) {
throw new BadRequestAlertException("A new artifactType cannot already have an ID", ENTITY_NAME, "idexists");
}
ArtifactTypeDTO result = artifactTypeService.save(artifactTypeDTO);
return ResponseEntity.created(new URI("/api/artifact-types/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /artifact-types} : Updates an existing artifactType.
*
* @param artifactTypeDTO the artifactTypeDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated artifactTypeDTO,
* or with status {@code 400 (Bad Request)} if the artifactTypeDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the artifactTypeDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/artifact-types")
public ResponseEntity<ArtifactTypeDTO> updateArtifactType(@RequestBody ArtifactTypeDTO artifactTypeDTO) throws URISyntaxException {
log.debug("REST request to update ArtifactType : {}", artifactTypeDTO);
if (artifactTypeDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
ArtifactTypeDTO result = artifactTypeService.save(artifactTypeDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, artifactTypeDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /artifact-types} : get all the artifactTypes.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of artifactTypes in body.
*/
@GetMapping("/artifact-types")
public List<ArtifactTypeDTO> getAllArtifactTypes() {
log.debug("REST request to get all ArtifactTypes");
return artifactTypeService.findAll();
}
/**
* {@code GET /artifact-types/:id} : get the "id" artifactType.
*
* @param id the id of the artifactTypeDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the artifactTypeDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/artifact-types/{id}")
public ResponseEntity<ArtifactTypeDTO> getArtifactType(@PathVariable Long id) {
log.debug("REST request to get ArtifactType : {}", id);
Optional<ArtifactTypeDTO> artifactTypeDTO = artifactTypeService.findOne(id);
return ResponseUtil.wrapOrNotFound(artifactTypeDTO);
}
/**
* {@code DELETE /artifact-types/:id} : delete the "id" artifactType.
*
* @param id the id of the artifactTypeDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/artifact-types/{id}")
public ResponseEntity<Void> deleteArtifactType(@PathVariable Long id) {
log.debug("REST request to delete ArtifactType : {}", id);
artifactTypeService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/tmp/service/mapper/BranchConMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.BranchConDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link BranchCon} and its DTO {@link BranchConDTO}.
*/
@Mapper(componentModel = "spring", uses = {MultipleConMapper.class, ActivityMapper.class})
public interface BranchConMapper extends EntityMapper<BranchConDTO, BranchCon> {
@Mapping(source = "fromMultipleConnection.id", target = "fromMultipleConnectionId")
@Mapping(source = "fromActivity.id", target = "fromActivityId")
BranchConDTO toDto(BranchCon branchCon);
@Mapping(source = "fromMultipleConnectionId", target = "fromMultipleConnection")
@Mapping(source = "fromActivityId", target = "fromActivity")
BranchCon toEntity(BranchConDTO branchConDTO);
default BranchCon fromId(Long id) {
if (id == null) {
return null;
}
BranchCon branchCon = new BranchCon();
branchCon.setId(id);
return branchCon;
}
}
<file_sep>/tmp/service/mapper/WorkGroupMetricMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.WorkGroupMetricDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link WorkGroupMetric} and its DTO {@link WorkGroupMetricDTO}.
*/
@Mapper(componentModel = "spring", uses = {WorkGroupMapper.class})
public interface WorkGroupMetricMapper extends EntityMapper<WorkGroupMetricDTO, WorkGroupMetric> {
@Mapping(source = "workGroup.id", target = "workGroupId")
WorkGroupMetricDTO toDto(WorkGroupMetric workGroupMetric);
@Mapping(source = "workGroupId", target = "workGroup")
WorkGroupMetric toEntity(WorkGroupMetricDTO workGroupMetricDTO);
default WorkGroupMetric fromId(Long id) {
if (id == null) {
return null;
}
WorkGroupMetric workGroupMetric = new WorkGroupMetric();
workGroupMetric.setId(id);
return workGroupMetric;
}
}
<file_sep>/tmp/service/SubroutineService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.Subroutine;
import br.ufpa.labes.spm.repository.SubroutineRepository;
import br.ufpa.labes.spm.service.dto.SubroutineDTO;
import br.ufpa.labes.spm.service.mapper.SubroutineMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;
/**
* Service Implementation for managing {@link Subroutine}.
*/
@Service
@Transactional
public class SubroutineService {
private final Logger log = LoggerFactory.getLogger(SubroutineService.class);
private final SubroutineRepository subroutineRepository;
private final SubroutineMapper subroutineMapper;
public SubroutineService(SubroutineRepository subroutineRepository, SubroutineMapper subroutineMapper) {
this.subroutineRepository = subroutineRepository;
this.subroutineMapper = subroutineMapper;
}
/**
* Save a subroutine.
*
* @param subroutineDTO the entity to save.
* @return the persisted entity.
*/
public SubroutineDTO save(SubroutineDTO subroutineDTO) {
log.debug("Request to save Subroutine : {}", subroutineDTO);
Subroutine subroutine = subroutineMapper.toEntity(subroutineDTO);
subroutine = subroutineRepository.save(subroutine);
return subroutineMapper.toDto(subroutine);
}
/**
* Get all the subroutines.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<SubroutineDTO> findAll() {
log.debug("Request to get all Subroutines");
return subroutineRepository.findAll().stream()
.map(subroutineMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get all the subroutines where TheAutomatic is {@code null}.
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<SubroutineDTO> findAllWhereTheAutomaticIsNull() {
log.debug("Request to get all subroutines where TheAutomatic is null");
return StreamSupport
.stream(subroutineRepository.findAll().spliterator(), false)
.filter(subroutine -> subroutine.getTheAutomatic() == null)
.map(subroutineMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one subroutine by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<SubroutineDTO> findOne(Long id) {
log.debug("Request to get Subroutine : {}", id);
return subroutineRepository.findById(id)
.map(subroutineMapper::toDto);
}
/**
* Delete the subroutine by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete Subroutine : {}", id);
subroutineRepository.deleteById(id);
}
}
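Note that `findAllWhereTheAutomaticIsNull` loads every row and filters in memory; a Spring Data derived query such as `findByTheAutomaticIsNull()` would push the filter to the database instead. The in-memory variant reduces to a plain stream filter, sketched here with hypothetical stand-in types:

```java
import java.util.List;
import java.util.stream.Collectors;

public class NullFilterSketch {
    // Hypothetical stand-in for an entity with an optional one-to-one relation.
    record Item(long id, String theAutomatic) {}

    // Mirrors findAllWhereTheAutomaticIsNull: fetch all, keep rows whose relation is unset.
    static List<Long> idsWhereRelationIsNull(List<Item> all) {
        return all.stream()
            .filter(item -> item.theAutomatic() == null)
            .map(Item::id)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Item> items = List.of(new Item(1, null), new Item(2, "x"), new Item(3, null));
        assert idsWhereRelationIsNull(items).equals(List.of(1L, 3L));
    }
}
```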
<file_sep>/util_files/repo_inheritance.sh
#!/bin/bash
cd src/main/java/br/ufpa/labes/spm/repository
classes=`find . -regextype posix-extended -regex '(.*)Repository.java' | cut -d'/' -f2 | sed -r 's/(.*)Repository.java/\1/'`
for c in $classes; do
class_path=`echo "$c"Repository.java`
# Add DAO inheritance
super=`echo I"$c"DAO`
package=`find interfaces/ -iname "$super".java | cut -d'/' -f2`
if [[ "$package" != '' ]]; then
package=".$package"
fi
# NOTE: -n with the p flag only previews the change on stdout; use `sed -ri` to edit the file in place
sed -rn "/package/ s/(package .*)/\1\n\nimport br.ufpa.labes.spm.repository.interfaces$package.$super;\n/p" "$class_path"
# sed -rn "/public interface/ s/public interface "$c"Repository extends $super, (.*)/public interface "$c"Repository extends \1/p" $class_path
done
echo 'Finished adding inheritance.'
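The sed substitution above inserts an import for the matching DAO interface right after the `package` line. A minimal preview of that transform on a synthetic file (the class name and subpackage below are hypothetical examples):

```shell
# Build a one-line input file resembling a generated repository.
tmp=$(mktemp)
printf 'package br.ufpa.labes.spm.repository;\n' > "$tmp"

# Hypothetical values the loop in the script would compute.
c="Activity"
super="I${c}DAO"
package=".activities"

# Same sed expression as the script: append the DAO import after the package line.
result=$(sed -rn "/package/ s/(package .*)/\1\n\nimport br.ufpa.labes.spm.repository.interfaces$package.$super;\n/p" "$tmp")
echo "$result"
rm -f "$tmp"
```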
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/processKnowledge/IActivityMetricDAO.java
package br.ufpa.labes.spm.repository.interfaces.processKnowledge;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ActivityMetric;
public interface IActivityMetricDAO extends IBaseDAO<ActivityMetric, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/domain/enumeration/OperationEnum.java
package br.ufpa.labes.spm.domain.enumeration;
/** The OperationEnum enumeration. */
public enum OperationEnum {
CREATE,
DELETE,
UPDATE,
VIEW,
VOTE,
FAVORITE,
UNFAVORITE,
FOLLOW,
UNFOLLOW,
COMMENT
}
<file_sep>/tmp/service/RelationshipKindService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.RelationshipKind;
import br.ufpa.labes.spm.repository.RelationshipKindRepository;
import br.ufpa.labes.spm.service.dto.RelationshipKindDTO;
import br.ufpa.labes.spm.service.mapper.RelationshipKindMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link RelationshipKind}.
*/
@Service
@Transactional
public class RelationshipKindService {
private final Logger log = LoggerFactory.getLogger(RelationshipKindService.class);
private final RelationshipKindRepository relationshipKindRepository;
private final RelationshipKindMapper relationshipKindMapper;
public RelationshipKindService(RelationshipKindRepository relationshipKindRepository, RelationshipKindMapper relationshipKindMapper) {
this.relationshipKindRepository = relationshipKindRepository;
this.relationshipKindMapper = relationshipKindMapper;
}
/**
* Save a relationshipKind.
*
* @param relationshipKindDTO the entity to save.
* @return the persisted entity.
*/
public RelationshipKindDTO save(RelationshipKindDTO relationshipKindDTO) {
log.debug("Request to save RelationshipKind : {}", relationshipKindDTO);
RelationshipKind relationshipKind = relationshipKindMapper.toEntity(relationshipKindDTO);
relationshipKind = relationshipKindRepository.save(relationshipKind);
return relationshipKindMapper.toDto(relationshipKind);
}
/**
* Get all the relationshipKinds.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<RelationshipKindDTO> findAll() {
log.debug("Request to get all RelationshipKinds");
return relationshipKindRepository.findAll().stream()
.map(relationshipKindMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one relationshipKind by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<RelationshipKindDTO> findOne(Long id) {
log.debug("Request to get RelationshipKind : {}", id);
return relationshipKindRepository.findById(id)
.map(relationshipKindMapper::toDto);
}
/**
* Delete the relationshipKind by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete RelationshipKind : {}", id);
relationshipKindRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/InstantiationPolicyLogService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.InstantiationPolicyLog;
import br.ufpa.labes.spm.repository.InstantiationPolicyLogRepository;
import br.ufpa.labes.spm.service.dto.InstantiationPolicyLogDTO;
import br.ufpa.labes.spm.service.mapper.InstantiationPolicyLogMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link InstantiationPolicyLog}.
*/
@Service
@Transactional
public class InstantiationPolicyLogService {
private final Logger log = LoggerFactory.getLogger(InstantiationPolicyLogService.class);
private final InstantiationPolicyLogRepository instantiationPolicyLogRepository;
private final InstantiationPolicyLogMapper instantiationPolicyLogMapper;
public InstantiationPolicyLogService(InstantiationPolicyLogRepository instantiationPolicyLogRepository, InstantiationPolicyLogMapper instantiationPolicyLogMapper) {
this.instantiationPolicyLogRepository = instantiationPolicyLogRepository;
this.instantiationPolicyLogMapper = instantiationPolicyLogMapper;
}
/**
* Save an instantiationPolicyLog.
*
* @param instantiationPolicyLogDTO the entity to save.
* @return the persisted entity.
*/
public InstantiationPolicyLogDTO save(InstantiationPolicyLogDTO instantiationPolicyLogDTO) {
log.debug("Request to save InstantiationPolicyLog : {}", instantiationPolicyLogDTO);
InstantiationPolicyLog instantiationPolicyLog = instantiationPolicyLogMapper.toEntity(instantiationPolicyLogDTO);
instantiationPolicyLog = instantiationPolicyLogRepository.save(instantiationPolicyLog);
return instantiationPolicyLogMapper.toDto(instantiationPolicyLog);
}
/**
* Get all the instantiationPolicyLogs.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<InstantiationPolicyLogDTO> findAll() {
log.debug("Request to get all InstantiationPolicyLogs");
return instantiationPolicyLogRepository.findAll().stream()
.map(instantiationPolicyLogMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one instantiationPolicyLog by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<InstantiationPolicyLogDTO> findOne(Long id) {
log.debug("Request to get InstantiationPolicyLog : {}", id);
return instantiationPolicyLogRepository.findById(id)
.map(instantiationPolicyLogMapper::toDto);
}
/**
* Delete the instantiationPolicyLog by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete InstantiationPolicyLog : {}", id);
instantiationPolicyLogRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/ArtifactMetricRepository.java
package br.ufpa.labes.spm.repository;
import br.ufpa.labes.spm.domain.ArtifactMetric;
import org.springframework.data.jpa.repository.*;
import org.springframework.stereotype.Repository;
/**
* Spring Data repository for the ArtifactMetric entity.
*/
@SuppressWarnings("unused")
@Repository
public interface ArtifactMetricRepository extends JpaRepository<ArtifactMetric, Long> {
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plannerInfo/IAgentInstantiationSuggestionDAO.java
package br.ufpa.labes.spm.repository.interfaces.plannerInfo;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.AgentInstSug;
public interface IAgentInstantiationSuggestionDAO extends IBaseDAO<AgentInstSug, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/people/IPersonDAO.java
package br.ufpa.labes.spm.repository.interfaces.people;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Person;
public interface IPersonDAO extends IBaseDAO<Person, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/resources/ExclusiveDAO.java
package br.ufpa.labes.spm.repository.impl.resources;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.resources.IExclusiveDAO;
import br.ufpa.labes.spm.domain.Exclusive;
public class ExclusiveDAO extends BaseDAO<Exclusive, String> implements IExclusiveDAO {
protected ExclusiveDAO(Class<Exclusive> businessClass) {
super(businessClass);
}
public ExclusiveDAO() {
super(Exclusive.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/types/IMetricTypeDAO.java
package br.ufpa.labes.spm.repository.interfaces.types;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.MetricType;
public interface IMetricTypeDAO extends IBaseDAO<MetricType, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plainActivities/IInvolvedArtifactsDAO.java
package br.ufpa.labes.spm.repository.interfaces.plainActivities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.InvolvedArtifact;
public interface IInvolvedArtifactsDAO extends IBaseDAO<InvolvedArtifact, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/OrganizationMetricMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.OrganizationMetricDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link OrganizationMetric} and its DTO {@link OrganizationMetricDTO}.
*/
@Mapper(componentModel = "spring", uses = {OrganizationMapper.class, CompanyMapper.class})
public interface OrganizationMetricMapper extends EntityMapper<OrganizationMetricDTO, OrganizationMetric> {
@Mapping(source = "organization.id", target = "organizationId")
@Mapping(source = "company.id", target = "companyId")
OrganizationMetricDTO toDto(OrganizationMetric organizationMetric);
@Mapping(source = "organizationId", target = "organization")
@Mapping(source = "companyId", target = "company")
OrganizationMetric toEntity(OrganizationMetricDTO organizationMetricDTO);
default OrganizationMetric fromId(Long id) {
if (id == null) {
return null;
}
OrganizationMetric organizationMetric = new OrganizationMetric();
organizationMetric.setId(id);
return organizationMetric;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/log/ConnectionEventDAO.java
package br.ufpa.labes.spm.repository.impl.log;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.log.IConnectionEventDAO;
import br.ufpa.labes.spm.domain.ConnectionEvent;
public class ConnectionEventDAO extends BaseDAO<ConnectionEvent, Integer>
implements IConnectionEventDAO {
protected ConnectionEventDAO(Class<ConnectionEvent> businessClass) {
super(businessClass);
}
public ConnectionEventDAO() {
super(ConnectionEvent.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/dto/CompanyDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.Objects;
import javax.persistence.Lob;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Company} entity.
*/
public class CompanyDTO implements Serializable {
private Long id;
private String ident;
private String cnpj;
private String fantasyName;
private String socialReason;
private String acronym;
private String address;
private String phone;
@Lob
private String description;
@Lob
private byte[] image;
private String imageContentType;
private String url;
private Boolean automaticInstantiation;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getIdent() {
return ident;
}
public void setIdent(String ident) {
this.ident = ident;
}
public String getCnpj() {
return cnpj;
}
public void setCnpj(String cnpj) {
this.cnpj = cnpj;
}
public String getFantasyName() {
return fantasyName;
}
public void setFantasyName(String fantasyName) {
this.fantasyName = fantasyName;
}
public String getSocialReason() {
return socialReason;
}
public void setSocialReason(String socialReason) {
this.socialReason = socialReason;
}
public String getAcronym() {
return acronym;
}
public void setAcronym(String acronym) {
this.acronym = acronym;
}
public String getAddress() {
return address;
}
public void setAddress(String address) {
this.address = address;
}
public String getPhone() {
return phone;
}
public void setPhone(String phone) {
this.phone = phone;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public byte[] getImage() {
return image;
}
public void setImage(byte[] image) {
this.image = image;
}
public String getImageContentType() {
return imageContentType;
}
public void setImageContentType(String imageContentType) {
this.imageContentType = imageContentType;
}
public String getUrl() {
return url;
}
public void setUrl(String url) {
this.url = url;
}
public Boolean isAutomaticInstantiation() {
return automaticInstantiation;
}
public void setAutomaticInstantiation(Boolean automaticInstantiation) {
this.automaticInstantiation = automaticInstantiation;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
CompanyDTO companyDTO = (CompanyDTO) o;
if (companyDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), companyDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "CompanyDTO{" +
"id=" + getId() +
", ident='" + getIdent() + "'" +
", cnpj='" + getCnpj() + "'" +
", fantasyName='" + getFantasyName() + "'" +
", socialReason='" + getSocialReason() + "'" +
", acronym='" + getAcronym() + "'" +
", address='" + getAddress() + "'" +
", phone='" + getPhone() + "'" +
", description='" + getDescription() + "'" +
", image='" + getImage() + "'" +
", url='" + getUrl() + "'" +
", automaticInstantiation='" + isAutomaticInstantiation() + "'" +
"}";
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/organizationPolicies/IStructureDAO.java
package br.ufpa.labes.spm.repository.interfaces.organizationPolicies;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Structure;
public interface IStructureDAO extends IBaseDAO<Structure, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/resources/IExclusiveDAO.java
package br.ufpa.labes.spm.repository.interfaces.resources;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Exclusive;
public interface IExclusiveDAO extends IBaseDAO<Exclusive, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/activities/ActivityDAO.java
package br.ufpa.labes.spm.repository.impl.activities;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.activities.IActivityDAO;
import br.ufpa.labes.spm.domain.Activity;
public class ActivityDAO extends BaseDAO<Activity, String> implements IActivityDAO {
public ActivityDAO(Class<Activity> businessClass) {
super(businessClass);
}
public ActivityDAO() {
super(Activity.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ArtifactEstimationService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ArtifactEstimation;
import br.ufpa.labes.spm.repository.ArtifactEstimationRepository;
import br.ufpa.labes.spm.service.dto.ArtifactEstimationDTO;
import br.ufpa.labes.spm.service.mapper.ArtifactEstimationMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ArtifactEstimation}.
*/
@Service
@Transactional
public class ArtifactEstimationService {
private final Logger log = LoggerFactory.getLogger(ArtifactEstimationService.class);
private final ArtifactEstimationRepository artifactEstimationRepository;
private final ArtifactEstimationMapper artifactEstimationMapper;
public ArtifactEstimationService(ArtifactEstimationRepository artifactEstimationRepository, ArtifactEstimationMapper artifactEstimationMapper) {
this.artifactEstimationRepository = artifactEstimationRepository;
this.artifactEstimationMapper = artifactEstimationMapper;
}
/**
* Save an artifactEstimation.
*
* @param artifactEstimationDTO the entity to save.
* @return the persisted entity.
*/
public ArtifactEstimationDTO save(ArtifactEstimationDTO artifactEstimationDTO) {
log.debug("Request to save ArtifactEstimation : {}", artifactEstimationDTO);
ArtifactEstimation artifactEstimation = artifactEstimationMapper.toEntity(artifactEstimationDTO);
artifactEstimation = artifactEstimationRepository.save(artifactEstimation);
return artifactEstimationMapper.toDto(artifactEstimation);
}
/**
* Get all the artifactEstimations.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ArtifactEstimationDTO> findAll() {
log.debug("Request to get all ArtifactEstimations");
return artifactEstimationRepository.findAll().stream()
.map(artifactEstimationMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one artifactEstimation by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ArtifactEstimationDTO> findOne(Long id) {
log.debug("Request to get ArtifactEstimation : {}", id);
return artifactEstimationRepository.findById(id)
.map(artifactEstimationMapper::toDto);
}
/**
* Delete the artifactEstimation by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ArtifactEstimation : {}", id);
artifactEstimationRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ActivityMetricService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ActivityMetric;
import br.ufpa.labes.spm.repository.ActivityMetricRepository;
import br.ufpa.labes.spm.service.dto.ActivityMetricDTO;
import br.ufpa.labes.spm.service.mapper.ActivityMetricMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ActivityMetric}.
*/
@Service
@Transactional
public class ActivityMetricService {
private final Logger log = LoggerFactory.getLogger(ActivityMetricService.class);
private final ActivityMetricRepository activityMetricRepository;
private final ActivityMetricMapper activityMetricMapper;
public ActivityMetricService(ActivityMetricRepository activityMetricRepository, ActivityMetricMapper activityMetricMapper) {
this.activityMetricRepository = activityMetricRepository;
this.activityMetricMapper = activityMetricMapper;
}
/**
* Save an activityMetric.
*
* @param activityMetricDTO the entity to save.
* @return the persisted entity.
*/
public ActivityMetricDTO save(ActivityMetricDTO activityMetricDTO) {
log.debug("Request to save ActivityMetric : {}", activityMetricDTO);
ActivityMetric activityMetric = activityMetricMapper.toEntity(activityMetricDTO);
activityMetric = activityMetricRepository.save(activityMetric);
return activityMetricMapper.toDto(activityMetric);
}
/**
* Get all the activityMetrics.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ActivityMetricDTO> findAll() {
log.debug("Request to get all ActivityMetrics");
return activityMetricRepository.findAll().stream()
.map(activityMetricMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one activityMetric by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ActivityMetricDTO> findOne(Long id) {
log.debug("Request to get ActivityMetric : {}", id);
return activityMetricRepository.findById(id)
.map(activityMetricMapper::toDto);
}
/**
* Delete the activityMetric by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ActivityMetric : {}", id);
activityMetricRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/JoinConMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.JoinConDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link JoinCon} and its DTO {@link JoinConDTO}.
*/
@Mapper(componentModel = "spring", uses = {MultipleConMapper.class, ActivityMapper.class})
public interface JoinConMapper extends EntityMapper<JoinConDTO, JoinCon> {
@Mapping(source = "toMultipleCon.id", target = "toMultipleConId")
@Mapping(source = "toActivity.id", target = "toActivityId")
JoinConDTO toDto(JoinCon joinCon);
@Mapping(source = "toMultipleConId", target = "toMultipleCon")
@Mapping(target = "removeFromMultipleCon", ignore = true)
@Mapping(source = "toActivityId", target = "toActivity")
@Mapping(target = "fromActivities", ignore = true)
@Mapping(target = "removeFromActivity", ignore = true)
JoinCon toEntity(JoinConDTO joinConDTO);
default JoinCon fromId(Long id) {
if (id == null) {
return null;
}
JoinCon joinCon = new JoinCon();
joinCon.setId(id);
return joinCon;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/plainActivities/ReqAgentRequiresAbilityDAO.java
package br.ufpa.labes.spm.repository.impl.plainActivities;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IReqAgentRequiresAbilityDAO;
import br.ufpa.labes.spm.domain.ReqAgentRequiresAbility;
public class ReqAgentRequiresAbilityDAO extends BaseDAO<ReqAgentRequiresAbility, Integer>
implements IReqAgentRequiresAbilityDAO {
protected ReqAgentRequiresAbilityDAO(Class<ReqAgentRequiresAbility> businessClass) {
super(businessClass);
}
public ReqAgentRequiresAbilityDAO() {
super(ReqAgentRequiresAbility.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/log/IResourceEventDAO.java
package br.ufpa.labes.spm.repository.interfaces.log;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ResourceEvent;
public interface IResourceEventDAO extends IBaseDAO<ResourceEvent, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/RoleNeedsAbilityResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.RoleNeedsAbilityService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.RoleNeedsAbilityDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.RoleNeedsAbility}.
*/
@RestController
@RequestMapping("/api")
public class RoleNeedsAbilityResource {
private final Logger log = LoggerFactory.getLogger(RoleNeedsAbilityResource.class);
private static final String ENTITY_NAME = "roleNeedsAbility";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final RoleNeedsAbilityService roleNeedsAbilityService;
public RoleNeedsAbilityResource(RoleNeedsAbilityService roleNeedsAbilityService) {
this.roleNeedsAbilityService = roleNeedsAbilityService;
}
/**
* {@code POST /role-needs-abilities} : Create a new roleNeedsAbility.
*
* @param roleNeedsAbilityDTO the roleNeedsAbilityDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new roleNeedsAbilityDTO, or with status {@code 400 (Bad Request)} if the roleNeedsAbility already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/role-needs-abilities")
public ResponseEntity<RoleNeedsAbilityDTO> createRoleNeedsAbility(@RequestBody RoleNeedsAbilityDTO roleNeedsAbilityDTO) throws URISyntaxException {
log.debug("REST request to save RoleNeedsAbility : {}", roleNeedsAbilityDTO);
if (roleNeedsAbilityDTO.getId() != null) {
throw new BadRequestAlertException("A new roleNeedsAbility cannot already have an ID", ENTITY_NAME, "idexists");
}
RoleNeedsAbilityDTO result = roleNeedsAbilityService.save(roleNeedsAbilityDTO);
return ResponseEntity.created(new URI("/api/role-needs-abilities/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /role-needs-abilities} : Updates an existing roleNeedsAbility.
*
* @param roleNeedsAbilityDTO the roleNeedsAbilityDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated roleNeedsAbilityDTO,
* or with status {@code 400 (Bad Request)} if the roleNeedsAbilityDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the roleNeedsAbilityDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/role-needs-abilities")
public ResponseEntity<RoleNeedsAbilityDTO> updateRoleNeedsAbility(@RequestBody RoleNeedsAbilityDTO roleNeedsAbilityDTO) throws URISyntaxException {
log.debug("REST request to update RoleNeedsAbility : {}", roleNeedsAbilityDTO);
if (roleNeedsAbilityDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
RoleNeedsAbilityDTO result = roleNeedsAbilityService.save(roleNeedsAbilityDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, roleNeedsAbilityDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /role-needs-abilities} : get all the roleNeedsAbilities.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of roleNeedsAbilities in body.
*/
@GetMapping("/role-needs-abilities")
public List<RoleNeedsAbilityDTO> getAllRoleNeedsAbilities() {
log.debug("REST request to get all RoleNeedsAbilities");
return roleNeedsAbilityService.findAll();
}
/**
* {@code GET /role-needs-abilities/:id} : get the "id" roleNeedsAbility.
*
* @param id the id of the roleNeedsAbilityDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the roleNeedsAbilityDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/role-needs-abilities/{id}")
public ResponseEntity<RoleNeedsAbilityDTO> getRoleNeedsAbility(@PathVariable Long id) {
log.debug("REST request to get RoleNeedsAbility : {}", id);
Optional<RoleNeedsAbilityDTO> roleNeedsAbilityDTO = roleNeedsAbilityService.findOne(id);
return ResponseUtil.wrapOrNotFound(roleNeedsAbilityDTO);
}
/**
* {@code DELETE /role-needs-abilities/:id} : delete the "id" roleNeedsAbility.
*
* @param id the id of the roleNeedsAbilityDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/role-needs-abilities/{id}")
public ResponseEntity<Void> deleteRoleNeedsAbility(@PathVariable Long id) {
log.debug("REST request to delete RoleNeedsAbility : {}", id);
roleNeedsAbilityService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/driver/DriverDAO.java
package br.ufpa.labes.spm.repository.impl.driver;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.driver.IDriverDAO;
import br.ufpa.labes.spm.domain.Driver;
public class DriverDAO extends BaseDAO<Driver, String> implements IDriverDAO {
protected DriverDAO(Class<Driver> businessClass) {
super(businessClass);
}
public DriverDAO() {
super(Driver.class);
}
}
<file_sep>/tmp/service/SpmLogService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.SpmLog;
import br.ufpa.labes.spm.repository.SpmLogRepository;
import br.ufpa.labes.spm.service.dto.SpmLogDTO;
import br.ufpa.labes.spm.service.mapper.SpmLogMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link SpmLog}.
*/
@Service
@Transactional
public class SpmLogService {
private final Logger log = LoggerFactory.getLogger(SpmLogService.class);
private final SpmLogRepository spmLogRepository;
private final SpmLogMapper spmLogMapper;
public SpmLogService(SpmLogRepository spmLogRepository, SpmLogMapper spmLogMapper) {
this.spmLogRepository = spmLogRepository;
this.spmLogMapper = spmLogMapper;
}
/**
* Save an spmLog.
*
* @param spmLogDTO the entity to save.
* @return the persisted entity.
*/
public SpmLogDTO save(SpmLogDTO spmLogDTO) {
log.debug("Request to save SpmLog : {}", spmLogDTO);
SpmLog spmLog = spmLogMapper.toEntity(spmLogDTO);
spmLog = spmLogRepository.save(spmLog);
return spmLogMapper.toDto(spmLog);
}
/**
* Get all the spmLogs.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<SpmLogDTO> findAll() {
log.debug("Request to get all SpmLogs");
return spmLogRepository.findAll().stream()
.map(spmLogMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one spmLog by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<SpmLogDTO> findOne(Long id) {
log.debug("Request to get SpmLog : {}", id);
return spmLogRepository.findById(id)
.map(spmLogMapper::toDto);
}
/**
* Delete the spmLog by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete SpmLog : {}", id);
spmLogRepository.deleteById(id);
}
}
<file_sep>/tmp/service/impl/CalendarServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.Query;
import org.qrconsult.spm.converter.core.Converter;
import org.qrconsult.spm.converter.core.ConverterImpl;
import br.ufpa.labes.spm.exceptions.ImplementationException;
import br.ufpa.labes.spm.repository.interfaces.calendar.ICalendarDAO;
import br.ufpa.labes.spm.service.dto.CalendarDTO;
import br.ufpa.labes.spm.service.dto.CalendarsDTO;
import br.ufpa.labes.spm.service.dto.ProjectDTO;
import br.ufpa.labes.spm.domain.Calendar;
import br.ufpa.labes.spm.domain.Project;
import br.ufpa.labes.spm.service.interfaces.CalendarServices;
public class CalendarServicesImpl implements CalendarServices {
ICalendarDAO calendarDAO;
private Query query;
Converter converter = new ConverterImpl();
private static final String CALENDAR_CLASSNAME = Calendar.class.getName();
private Calendar calendar;
private Project project;
@Override
public void saveCalendar(CalendarDTO calendarDTO, ProjectDTO projetoOid) {
System.out.print("caiu no save");
project = this.convertProjectDTOToPtoject(projetoOid);
calendar = new Calendar();
try {
calendar = this.convertCalendarDTOToCalendar(calendarDTO);
calendar.setProject(project);
calendarDAO.daoSave(calendar);
calendar = null;
} catch (Exception e) {
e.printStackTrace();
}
}
private Calendar convertCalendarDTOToCalendar(CalendarDTO calendarDTO) {
try {
Calendar calendar = new Calendar();
calendar = (Calendar) converter.getEntity(calendarDTO, calendar);
return calendar;
} catch (ImplementationException e) {
e.printStackTrace();
}
return null;
}
private Project convertProjectDTOToPtoject(ProjectDTO projectDTO) {
try {
Project project = new Project();
project = (Project) converter.getEntity(projectDTO, project);
return project;
} catch (ImplementationException e) {
e.printStackTrace();
}
return null;
}
@Override
public void updateCalendar(CalendarDTO calendar, Integer projetoOid) {
// TODO Auto-generated method stub
}
@Override
public List<CalendarDTO> searchAllProjectCalendar() {
// TODO Auto-generated method stub
return null;
}
@Override
public Boolean deleteAgent(CalendarDTO calendar) {
// TODO Auto-generated method stub
return null;
}
@Override
public CalendarDTO searchCalendarProjectById(Integer calendarOid,
Integer projectOid) {
// TODO Auto-generated method stub
return null;
}
@Override
public CalendarDTO changeCalendarToPtoject(CalendarDTO calendar,
Integer projectOid) {
// TODO Auto-generated method stub
return null;
}
@Override
public String validNameProject(String name, Integer project_Oid) {
Query query = calendarDAO.getPersistenceContext().createQuery("SELECT calendar.name FROM " + Calendar.class.getName() + " AS calendar "
+ "WHERE calendar.project.oid like :project_Oid");
query.setParameter("project_Oid", project_Oid);
try{
return query.getSingleResult().toString();
}catch(Exception e){
e.printStackTrace();
return null;
}
}
@Override
public CalendarsDTO getCalendarsForProject() {
String hql_project = "SELECT ca FROM " + CALENDAR_CLASSNAME + " as ca)";
query = calendarDAO.getPersistenceContext().createQuery(hql_project);
List<Calendar> calendarList = query.getResultList();
System.out.println("tamanho da lista"+calendarList.size());
CalendarsDTO calendars = this.convertCalendarsToCalendarsDTO(calendarList);
return calendars;
}
@Override
public CalendarDTO getCalendarForProject(Integer project_oid) {
String hql = "SELECT calendar FROM " + Calendar.class.getName() + " calendar WHERE calendar.project.oid = :projectOid";
query = calendarDAO.getPersistenceContext().createQuery(hql);
query.setParameter("projectOid", project_oid);
// getSingleResult() throws NoResultException when no row matches, so the old
// null check could never trigger; use getResultList() and test for emptiness.
List<Calendar> results = query.getResultList();
if (results.isEmpty()) {
return null;
}
return this.convertCalendarToCalendarDTO((Calendar) results.get(0));
}
private CalendarsDTO convertCalendarsToCalendarsDTO(List<Calendar> calendars) {
CalendarsDTO calendarsDTO = new CalendarsDTO(new ArrayList<CalendarDTO>());
for (Calendar calendar : calendars) {
CalendarDTO calendarDTO = this.convertCalendarToCalendarDTO(calendar);
calendarsDTO.addCalendar(calendarDTO);
}
return calendarsDTO;
}
private CalendarDTO convertCalendarToCalendarDTO(Calendar calendar) {
try {
CalendarDTO calendarDTO = new CalendarDTO();
calendarDTO = (CalendarDTO) converter.getDTO(calendar, CalendarDTO.class);
return calendarDTO;
} catch (ImplementationException e) {
e.printStackTrace();
}
return null;
}
}
<file_sep>/tmp/service/mapper/ProcessModelEventMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ProcessModelEventDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ProcessModelEvent} and its DTO {@link ProcessModelEventDTO}.
*/
@Mapper(componentModel = "spring", uses = {ProcessModelMapper.class})
public interface ProcessModelEventMapper extends EntityMapper<ProcessModelEventDTO, ProcessModelEvent> {
@Mapping(source = "theProcessModel.id", target = "theProcessModelId")
ProcessModelEventDTO toDto(ProcessModelEvent processModelEvent);
@Mapping(source = "theProcessModelId", target = "theProcessModel")
@Mapping(target = "theCatalogEvents", ignore = true)
@Mapping(target = "removeTheCatalogEvents", ignore = true)
ProcessModelEvent toEntity(ProcessModelEventDTO processModelEventDTO);
default ProcessModelEvent fromId(Long id) {
if (id == null) {
return null;
}
ProcessModelEvent processModelEvent = new ProcessModelEvent();
processModelEvent.setId(id);
return processModelEvent;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/agent/AgentAffinityAgentDAO.java
package br.ufpa.labes.spm.repository.impl.agent;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IAgentAffinityAgentDAO;
import br.ufpa.labes.spm.domain.AgentAffinityAgent;
public class AgentAffinityAgentDAO extends BaseDAO<AgentAffinityAgent, Integer>
implements IAgentAffinityAgentDAO {
protected AgentAffinityAgentDAO(Class<AgentAffinityAgent> businessClass) {
super(businessClass);
}
public AgentAffinityAgentDAO() {
super(AgentAffinityAgent.class);
}
}
<file_sep>/tmp/service/mapper/AssetRelationshipMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.AssetRelationshipDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link AssetRelationship} and its DTO {@link AssetRelationshipDTO}.
*/
@Mapper(componentModel = "spring", uses = {RelationshipKindMapper.class, AssetMapper.class})
public interface AssetRelationshipMapper extends EntityMapper<AssetRelationshipDTO, AssetRelationship> {
@Mapping(source = "kind.id", target = "kindId")
@Mapping(source = "asset.id", target = "assetId")
@Mapping(source = "relatedAsset.id", target = "relatedAssetId")
AssetRelationshipDTO toDto(AssetRelationship assetRelationship);
@Mapping(source = "kindId", target = "kind")
@Mapping(source = "assetId", target = "asset")
@Mapping(source = "relatedAssetId", target = "relatedAsset")
AssetRelationship toEntity(AssetRelationshipDTO assetRelationshipDTO);
default AssetRelationship fromId(Long id) {
if (id == null) {
return null;
}
AssetRelationship assetRelationship = new AssetRelationship();
assetRelationship.setId(id);
return assetRelationship;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/CatalogEventResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.CatalogEventService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.CatalogEventDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.CatalogEvent}.
*/
@RestController
@RequestMapping("/api")
public class CatalogEventResource {
private final Logger log = LoggerFactory.getLogger(CatalogEventResource.class);
private static final String ENTITY_NAME = "catalogEvent";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final CatalogEventService catalogEventService;
public CatalogEventResource(CatalogEventService catalogEventService) {
this.catalogEventService = catalogEventService;
}
/**
* {@code POST /catalog-events} : Create a new catalogEvent.
*
* @param catalogEventDTO the catalogEventDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new catalogEventDTO, or with status {@code 400 (Bad Request)} if the catalogEvent already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/catalog-events")
public ResponseEntity<CatalogEventDTO> createCatalogEvent(@RequestBody CatalogEventDTO catalogEventDTO) throws URISyntaxException {
log.debug("REST request to save CatalogEvent : {}", catalogEventDTO);
if (catalogEventDTO.getId() != null) {
throw new BadRequestAlertException("A new catalogEvent cannot already have an ID", ENTITY_NAME, "idexists");
}
CatalogEventDTO result = catalogEventService.save(catalogEventDTO);
return ResponseEntity.created(new URI("/api/catalog-events/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /catalog-events} : Updates an existing catalogEvent.
*
* @param catalogEventDTO the catalogEventDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated catalogEventDTO,
* or with status {@code 400 (Bad Request)} if the catalogEventDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the catalogEventDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/catalog-events")
public ResponseEntity<CatalogEventDTO> updateCatalogEvent(@RequestBody CatalogEventDTO catalogEventDTO) throws URISyntaxException {
log.debug("REST request to update CatalogEvent : {}", catalogEventDTO);
if (catalogEventDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
CatalogEventDTO result = catalogEventService.save(catalogEventDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, catalogEventDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /catalog-events} : get all the catalogEvents.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of catalogEvents in body.
*/
@GetMapping("/catalog-events")
public List<CatalogEventDTO> getAllCatalogEvents() {
log.debug("REST request to get all CatalogEvents");
return catalogEventService.findAll();
}
/**
* {@code GET /catalog-events/:id} : get the "id" catalogEvent.
*
* @param id the id of the catalogEventDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the catalogEventDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/catalog-events/{id}")
public ResponseEntity<CatalogEventDTO> getCatalogEvent(@PathVariable Long id) {
log.debug("REST request to get CatalogEvent : {}", id);
Optional<CatalogEventDTO> catalogEventDTO = catalogEventService.findOne(id);
return ResponseUtil.wrapOrNotFound(catalogEventDTO);
}
/**
* {@code DELETE /catalog-events/:id} : delete the "id" catalogEvent.
*
* @param id the id of the catalogEventDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/catalog-events/{id}")
public ResponseEntity<Void> deleteCatalogEvent(@PathVariable Long id) {
log.debug("REST request to delete CatalogEvent : {}", id);
catalogEventService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/tmp/service/DevelopingSystemService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.service.dto.DevelopingSystemDTO;
import java.util.List;
import java.util.Optional;
/**
* Service Interface for managing {@link br.ufpa.labes.spm.domain.DevelopingSystem}.
*/
public interface DevelopingSystemService {
/**
* Save a developingSystem.
*
* @param developingSystemDTO the entity to save.
* @return the persisted entity.
*/
DevelopingSystemDTO save(DevelopingSystemDTO developingSystemDTO);
/**
* Get all the developingSystems.
*
* @return the list of entities.
*/
List<DevelopingSystemDTO> findAll();
/**
* Get the "id" developingSystem.
*
* @param id the id of the entity.
* @return the entity.
*/
Optional<DevelopingSystemDTO> findOne(Long id);
/**
* Delete the "id" developingSystem.
*
* @param id the id of the entity.
*/
void delete(Long id);
}
<file_sep>/tmp/service/mapper/RoleTypeMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.RoleTypeDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link RoleType} and its DTO {@link RoleTypeDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface RoleTypeMapper extends EntityMapper<RoleTypeDTO, RoleType> {
@Mapping(target = "theRoles", ignore = true)
@Mapping(target = "removeTheRole", ignore = true)
RoleType toEntity(RoleTypeDTO roleTypeDTO);
default RoleType fromId(Long id) {
if (id == null) {
return null;
}
RoleType roleType = new RoleType();
roleType.setId(id);
return roleType;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processModels/TemplateDAO.java
package br.ufpa.labes.spm.repository.impl.processModels;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.ITemplateDAO;
import br.ufpa.labes.spm.domain.Template;
public class TemplateDAO extends BaseDAO<Template, String> implements ITemplateDAO {
protected TemplateDAO(Class<Template> businessClass) {
super(businessClass);
}
public TemplateDAO() {
super(Template.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/log/AgendaEventDAO.java
package br.ufpa.labes.spm.repository.impl.log;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.log.IAgendaEventDAO;
import br.ufpa.labes.spm.domain.AgendaEvent;
public class AgendaEventDAO extends BaseDAO<AgendaEvent, Integer> implements IAgendaEventDAO {
protected AgendaEventDAO(Class<AgendaEvent> businessClass) {
super(businessClass);
}
public AgendaEventDAO() {
super(AgendaEvent.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/log/ProcessModelEventDAO.java
package br.ufpa.labes.spm.repository.impl.log;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.log.IProcessModelEventDAO;
import br.ufpa.labes.spm.domain.ProcessModelEvent;
public class ProcessModelEventDAO extends BaseDAO<ProcessModelEvent, Integer>
implements IProcessModelEventDAO {
protected ProcessModelEventDAO(Class<ProcessModelEvent> businessClass) {
super(businessClass);
}
public ProcessModelEventDAO() {
super(ProcessModelEvent.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/StructureMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.StructureDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Structure} and its DTO {@link StructureDTO}.
*/
@Mapper(componentModel = "spring", uses = {NodeMapper.class})
public interface StructureMapper extends EntityMapper<StructureDTO, Structure> {
@Mapping(source = "rootElement.id", target = "rootElementId")
StructureDTO toDto(Structure structure);
@Mapping(source = "rootElementId", target = "rootElement")
@Mapping(target = "theRepositories", ignore = true)
@Mapping(target = "removeTheRepository", ignore = true)
Structure toEntity(StructureDTO structureDTO);
default Structure fromId(Long id) {
if (id == null) {
return null;
}
Structure structure = new Structure();
structure.setId(id);
return structure;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/domain/enumeration/EmailProcessStatusNotifications.java
package br.ufpa.labes.spm.domain.enumeration;
/** The EmailProcessStatusNotifications enumeration. */
public enum EmailProcessStatusNotifications {
PROCESS_HAS_FINISHED,
FIRST_ACTIVITY_HAS_STARTED,
CONSUMABLE_RESOURCE_5_AMOUNT,
ACTIVITY_IS_READY_INSTANTIATED,
TASK_DELEGATED,
DECISION_BRANCH_COND
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plannerInfo/IInstantiationSuggestionDAO.java
package br.ufpa.labes.spm.repository.interfaces.plannerInfo;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.InstantiationSuggestion;
public interface IInstantiationSuggestionDAO extends IBaseDAO<InstantiationSuggestion, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ArtifactMetricService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ArtifactMetric;
import br.ufpa.labes.spm.repository.ArtifactMetricRepository;
import br.ufpa.labes.spm.service.dto.ArtifactMetricDTO;
import br.ufpa.labes.spm.service.mapper.ArtifactMetricMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ArtifactMetric}.
*/
@Service
@Transactional
public class ArtifactMetricService {
private final Logger log = LoggerFactory.getLogger(ArtifactMetricService.class);
private final ArtifactMetricRepository artifactMetricRepository;
private final ArtifactMetricMapper artifactMetricMapper;
public ArtifactMetricService(ArtifactMetricRepository artifactMetricRepository, ArtifactMetricMapper artifactMetricMapper) {
this.artifactMetricRepository = artifactMetricRepository;
this.artifactMetricMapper = artifactMetricMapper;
}
/**
* Save an artifactMetric.
*
* @param artifactMetricDTO the entity to save.
* @return the persisted entity.
*/
public ArtifactMetricDTO save(ArtifactMetricDTO artifactMetricDTO) {
log.debug("Request to save ArtifactMetric : {}", artifactMetricDTO);
ArtifactMetric artifactMetric = artifactMetricMapper.toEntity(artifactMetricDTO);
artifactMetric = artifactMetricRepository.save(artifactMetric);
return artifactMetricMapper.toDto(artifactMetric);
}
/**
* Get all the artifactMetrics.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ArtifactMetricDTO> findAll() {
log.debug("Request to get all ArtifactMetrics");
return artifactMetricRepository.findAll().stream()
.map(artifactMetricMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one artifactMetric by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ArtifactMetricDTO> findOne(Long id) {
log.debug("Request to get ArtifactMetric : {}", id);
return artifactMetricRepository.findById(id)
.map(artifactMetricMapper::toDto);
}
/**
* Delete the artifactMetric by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ArtifactMetric : {}", id);
artifactMetricRepository.deleteById(id);
}
}
<file_sep>/tmp/service/impl/TemplateServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Date;
import java.util.Enumeration;
import java.util.HashSet;
import java.util.Hashtable;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
import java.util.Properties;
import java.util.StringTokenizer;
import javax.persistence.Query;
import br.ufpa.labes.spm.repository.interfaces.activities.IActivityDAO;
import br.ufpa.labes.spm.repository.interfaces.activities.IDecomposedDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IAgentDAO;
import br.ufpa.labes.spm.repository.interfaces.artifacts.IArtifactDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IConnectionDAO;
import br.ufpa.labes.spm.repository.interfaces.organizationPolicies.IProjectDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IInvolvedArtifactsDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IReqAgentDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IReqWorkGroupDAO;
import br.ufpa.labes.spm.repository.interfaces.plainActivities.IRequiredResourceDAO;
import br.ufpa.labes.spm.repository.interfaces.processModelGraphic.IGraphicCoordinateDAO;
import br.ufpa.labes.spm.repository.interfaces.processModelGraphic.IWebAPSEEObjectDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.IDescriptionDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.IProcessDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.IProcessModelDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.ITemplateDAO;
import br.ufpa.labes.spm.service.dto.SimpleUpdateDTO;
import br.ufpa.labes.spm.service.dto.ProcessModelDTO;
import br.ufpa.labes.spm.service.dto.TemplateDTO;
import br.ufpa.labes.spm.exceptions.DAOException;
import br.ufpa.labes.spm.exceptions.DataBaseException;
import br.ufpa.labes.spm.exceptions.WebapseeException;
import br.ufpa.labes.spm.domain.Activity;
import br.ufpa.labes.spm.domain.Decomposed;
import br.ufpa.labes.spm.domain.Agent;
import br.ufpa.labes.spm.domain.Artifact;
import br.ufpa.labes.spm.domain.ArtifactCon;
import br.ufpa.labes.spm.domain.BranchCon;
import br.ufpa.labes.spm.domain.BranchANDCon;
import br.ufpa.labes.spm.domain.BranchConCond;
import br.ufpa.labes.spm.domain.BranchConCondToActivity;
import br.ufpa.labes.spm.domain.BranchConCondToMultipleCon;
import br.ufpa.labes.spm.domain.Connection;
import br.ufpa.labes.spm.domain.Dependency;
import br.ufpa.labes.spm.domain.Feedback;
import br.ufpa.labes.spm.domain.JoinCon;
import br.ufpa.labes.spm.domain.MultipleCon;
import br.ufpa.labes.spm.domain.Sequence;
import br.ufpa.labes.spm.domain.Project;
import br.ufpa.labes.spm.domain.Automatic;
import br.ufpa.labes.spm.domain.InvolvedArtifact;
import br.ufpa.labes.spm.domain.Normal;
import br.ufpa.labes.spm.domain.Parameter;
import br.ufpa.labes.spm.domain.ReqAgent;
import br.ufpa.labes.spm.domain.ReqAgentRequiresAbility;
import br.ufpa.labes.spm.domain.ReqWorkGroup;
import br.ufpa.labes.spm.domain.RequiredPeople;
import br.ufpa.labes.spm.domain.RequiredResource;
import br.ufpa.labes.spm.domain.GraphicCoordinate;
import br.ufpa.labes.spm.domain.WebAPSEEObject;
import br.ufpa.labes.spm.domain.Description;
import br.ufpa.labes.spm.domain.Process;
import br.ufpa.labes.spm.domain.ProcessModel;
import br.ufpa.labes.spm.domain.Template;
import br.ufpa.labes.spm.domain.Reservation;
import br.ufpa.labes.spm.domain.Subroutine;
import br.ufpa.labes.spm.domain.ToolParameter;
import org.qrconsult.spm.process.impl.CopyProcessServiceImpl;
import org.qrconsult.spm.process.interfaces.CopyProcessServices;
import br.ufpa.labes.spm.service.interfaces.DynamicModeling;
import br.ufpa.labes.spm.service.interfaces.TemplateServices;
import br.ufpa.labes.spm.util.i18n.Messages;
public class TemplateServicesImpl implements TemplateServices {
public static final String TEMPLATE_CLASSNAME = Template.class.getSimpleName();
public static final String
PROC_INST = "Process_Instantiation",
PROC_DEST = "Process_Destilling",
PROC_COMP = "Process_Composition",
COPY_PROC = "Copy_Process",
COPY_TEMP = "Copy_Template",
NEW_VERS_TEMP = "New_Version_Template",
DEC_DEST = "Decomposed_Destilling";
ITemplateDAO templateDAO;
CopyProcessServices copyServices;
IDescriptionDAO descriptionDAO;
IProcessDAO processDAO;
IDecomposedDAO decomposedDAO;
IProjectDAO projectDAO;
IAgentDAO agentDAO;
IProcessModelDAO processModelDAO;
IArtifactDAO artifactDAO;
IActivityDAO activityDAO;
DynamicModeling dynamicModeling;
IWebAPSEEObjectDAO webAPSEEObjectDAO;
IGraphicCoordinateDAO graphicCoordinateDAO;
IConnectionDAO connectionDAO;
IReqAgentDAO reqAgentDAO;
IReqWorkGroupDAO reqWorkGroupDAO;
IRequiredResourceDAO reqResourceDAO;
IInvolvedArtifactsDAO involvedArtifactsDAO;
private Query query;
private Properties artifactsIdents;
private Hashtable<String, Artifact> newArtifacts;
@Override
public Boolean createTemplate(String ident) {
Template template = new Template();
template.setIdent(ident);
ProcessModel processModel = new ProcessModel();
template.insertIntoTheInstances(processModel);
templateDAO.daoSave(template);
return true;
}
@Override
public List<TemplateDTO> getTemplates() {
String hql = "select o from " + TEMPLATE_CLASSNAME + " o";
this.query = templateDAO.getPersistenceContext().createQuery(hql);
List<Template> result = query.getResultList();
List<TemplateDTO> templatesDTO = this.convertTemplatesToTemplatesDTO(result);
System.out.println("lista dos templa:"+templatesDTO);
return templatesDTO;
}
@Override
public String createNewVersion(String template_id, String why, String version_id) {
Template template = null;
String state = "";
try {
template = getTemplate(template_id);
state = template.getTemplateState();
} catch (DAOException e) {
e.printStackTrace();
}
if(state.equals(Template.DEFINED)){
Object[] artifactsIdentsFromProcess = artifactDAO.getArtifactsIdentsFromProcessModelWithoutTemplates(template_id);
Properties artifactsIdents = null;
if( artifactsIdentsFromProcess != null){
artifactsIdents = new Properties();
String token = "";
for (int i = 0; i < artifactsIdentsFromProcess.length; i++) {
String oldArtifactIdent = (String) artifactsIdentsFromProcess[i];
artifactsIdents.put(oldArtifactIdent, oldArtifactIdent);
}
}
Template newTemplate = new Template();
// Make relationship with Description
// Modify newTemplate
newTemplate.setIdent(version_id);
//Copy relationships in common with Old Template
Hashtable<String, String> coordinates = copyServices.copyProcessModelData(template.getTheProcessModel(),newTemplate.getTheProcessModel(),template_id,newTemplate.getIdent(), CopyProcessServiceImpl.NEW_VERS_TEMP, artifactsIdents);
newTemplate = (Template)templateDAO.daoSave(newTemplate);
//Make relationships of Template versioning
Description desc = new Description();
desc.setTheNewVersion(newTemplate);
desc.setTheOldVersion(template);
desc.setWhy(why);
desc.setDate(new Date());
desc = (Description)descriptionDAO.daoSave(desc);
template.insertIntoTheDerivedVersionDescriptions(desc);
newTemplate.setTheOriginalVersionDescription(desc);
template.setTemplateState(Template.PENDING);
templateDAO.daoSave(newTemplate);
templateDAO.daoSave(template);
copyServices.saveCoordinates(coordinates, version_id);
return newTemplate.getIdent();
}else {
// System.out.println(Messages.getString("srcReuse.TempManag.TemplatesServices.templExecAtemplInThe")+" "+state+" "+Messages.getString("srcReuse.TempManag.TemplatesServices.templExecCannBe")); //$NON-NLS-1$ //$NON-NLS-2$
return "";
}
}
public Object[] getArtifactsIdentsFromProcessModelWithoutTemplates(String template_id){
return artifactDAO.getArtifactsIdentsFromProcessModelWithoutTemplates(template_id);
}
@Override
public void processDistilling(String process_id, String template_id) {
String[] s = process_id.split("\\.");
process_id = s[0];
System.out.println("state: " + s[0]);
Process process = (Process)processDAO.retrieveBySecondaryKey(process_id);
String state = process.getpStatus().name();
System.out.println("estado :"+state);
if(state.equalsIgnoreCase(Process.FINISHED) || state.equalsIgnoreCase(Process.ENACTING)){
Object[] artifactsIdentsFromProcess = artifactDAO.getArtifactsIdentsFromProcessModelWithoutTemplates(process_id);
Properties artifactsIdents = null;
if( artifactsIdentsFromProcess != null){
artifactsIdents = new Properties();
String token = "";
for (int i = 0; i < artifactsIdentsFromProcess.length; i++) {
String oldArtifactIdent = (String) artifactsIdentsFromProcess[i];
String newArtifactIdent = oldArtifactIdent;
if(oldArtifactIdent.contains("-")){
token = oldArtifactIdent.substring(0 , oldArtifactIdent.indexOf("-"));
if(!token.equals("") && token.trim().contains(process_id)){
newArtifactIdent = oldArtifactIdent.substring(oldArtifactIdent.indexOf("-")+1).trim();
}
}
artifactsIdents.put(oldArtifactIdent, newArtifactIdent);
}
}
Template template = new Template();
template.setIdent(template_id);
//Copy relationships in common with Process
Hashtable<String, String> coordinates = copyServices.copyProcessModelData(process.getTheProcessModel(),template.getTheProcessModel(),process_id,template.getIdent(), CopyProcessServiceImpl.PROC_DEST, artifactsIdents);
template = (Template)templateDAO.daoSave(template);
copyServices.saveCoordinates(coordinates, template_id);
}else if(state.equalsIgnoreCase(Template.DRAFT) || state.equalsIgnoreCase(Template.OUTDATED)|| state.equalsIgnoreCase(Template.DEFINED)||state.equalsIgnoreCase(Template.PENDING)||state.equalsIgnoreCase(Template.NOT_STARTED)){
System.out.println("atividade n�o iniciada");
System.out.println(Messages.getString("srcReuse.TempManag.TemplatesServices.templExecAProce")+" "+state+" "+Messages.getString("srcReuse.TempManag.TemplatesServices.templExecCannBeDist"));
}
else System.out.println(Messages.getString("srcReuse.TempManag.TemplatesServices.templExecAProce")+" "+state+" "+Messages.getString("srcReuse.TempManag.TemplatesServices.templExecCannBeDist")); //$NON-NLS-1$ //$NON-NLS-2$
}
@Override
public boolean copyTemplate(String newTemplateIdent,String oldTemplateIdent) throws DAOException {
Object[] artifactsIdentsFromUser = artifactDAO.getArtifactsIdentsFromProcessModelWithoutTemplates(oldTemplateIdent);
if( artifactsIdentsFromUser != null){
this.artifactsIdents = new Properties();
for (int i = 0; i < artifactsIdentsFromUser.length; i++) {
String artifactIdent = (String) artifactsIdentsFromUser[i];
this.artifactsIdents.put(artifactIdent, artifactIdent);
}
}
Template sourceTemplate = null;
Object newProcess = templateDAO.retrieveBySecondaryKey(newTemplateIdent);
if(newProcess != null){
throw new DAOException("Template "+newTemplateIdent+" already exists! Please choose a new ident!");
}
sourceTemplate=(Template) templateDAO.retrieveBySecondaryKey(oldTemplateIdent);
if(sourceTemplate==null){
throw new DAOException("Template "+oldTemplateIdent+" not found in database!Cannot copy from a null Process Model");
}
// At this point it is safe to instantiate a new Template to receive the old data
Template destinationTemplate = new Template();
destinationTemplate.setIdent(newTemplateIdent);
ProcessModel rootPM = sourceTemplate.getTheProcessModel();
if(rootPM==null){
throw new DAOException("Template "+oldTemplateIdent+" contains a null Process Model! Cannot copy from a null Process Model");
}
Hashtable<String, String> coordinates;
// now invoke a recursive method to copy the process model data
coordinates = copyProcessModelData(rootPM,destinationTemplate.getTheProcessModel(),oldTemplateIdent,newTemplateIdent, COPY_TEMP);
templateDAO.daoSave(destinationTemplate);
this.saveCoordinates(coordinates, newTemplateIdent);
return true;
}//end method
/**
* @param oldProcessModel the process model to copy from
* @param newProcessModel the process model to copy to
* @param oldProcessIdent the ident of the source process
* @param newProcessIdent the ident of the destination process
* @param operationType the copy operation type
* @return a table mapping copied object keys to their new graphic coordinate references
*/
public Hashtable<String, String> copyProcessModelData(ProcessModel oldProcessModel,
ProcessModel newProcessModel, String oldProcessIdent,
String newProcessIdent, String operationType) {
try{
Hashtable<String, String> coordinates = new Hashtable<String, String>();
Collection<Activity> activities = oldProcessModel.getTheActivity();
Collection<Connection> connections = oldProcessModel.getTheConnection();
Collection<Activity> newActivities = copyActivitiesOnProcessModel(activities,
oldProcessIdent, newProcessIdent, operationType,
newProcessIdent, coordinates);
newProcessModel.insertIntoTheActivity(newActivities);
Hashtable<String, Activity> activitiesTable = generateActivitiesTable(newActivities);
Collection<Connection> newConnections = copyConnectionsOnProcessModel(connections,
oldProcessIdent, newProcessIdent, activitiesTable,
operationType, coordinates);
newProcessModel.insertIntoTheConnection(newConnections);
return coordinates;
}catch(WebapseeException e){
e.printStackTrace();
}
return null;
}
/**
* @param activities the activities to copy
* @param oldProcessIdent the ident of the source process
* @param newProcessIdent the ident of the destination process
* @param operationType the copy operation type
* @param current_level the current process model level
* @param coordinates table collecting graphic coordinate mappings
* @return the copied activities
* @throws DAOException
*/
private Collection<Activity> copyActivitiesOnProcessModel(Collection<Activity> activities, String oldProcessIdent, String newProcessIdent, String operationType, String current_level, Hashtable<String, String> coordinates) throws DAOException {
Collection<Activity> newActivities = new HashSet<Activity>();
for(Iterator<Activity> activityIterator = activities.iterator(); activityIterator.hasNext();){
Activity currentActivity = activityIterator.next();
Activity newActivity =null;
if(currentActivity!=null){
String newActivityIdent = currentActivity.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
this.addCoordinate(currentActivity.getId(), currentActivity.getClass().getSimpleName(), WebAPSEEObject.ACTIVITY + "+" + newActivityIdent, coordinates);
if(currentActivity instanceof Normal){
newActivity = new Normal();
newActivity.setIdent(newActivityIdent);
copyNormalProperties(((Normal)newActivity),((Normal)currentActivity), operationType, current_level, coordinates);
}else if(currentActivity instanceof Automatic){
newActivity = new Automatic();
newActivity.setIdent(newActivityIdent);
copyAutomaticProperties(((Automatic)newActivity),((Automatic)currentActivity));
}else if(currentActivity instanceof Decomposed){
newActivity = new Decomposed();
newActivity.setIdent(newActivityIdent);
copyDecomposedProperties(((Decomposed)newActivity),((Decomposed)currentActivity),oldProcessIdent,newProcessIdent, operationType, coordinates);
}
if(currentActivity.getTheActivityType()!=null){
newActivity.insertIntoTheActivityType(currentActivity.getTheActivityType());
}
//add to main Collection
newActivities.add(newActivity);
}//end if !=null
}//end for
//after all , return the main Collection
return newActivities;
}
private void copyDecomposedProperties(Decomposed newDecomposed, Decomposed currentDecomposed,String oldProcessIdent,String newProcessIdent, String operationType, Hashtable<String, String> coordinates) throws DAOException {
if(currentDecomposed.getTheActivityType()!=null){
newDecomposed.setTheActivityType(currentDecomposed.getTheActivityType());
}
if(currentDecomposed.getName()!=null){
newDecomposed.setName(currentDecomposed.getName());
}
// Copy the sub process model, calling recursively
// New referred ProcessModel
ProcessModel newReferedProcessModel = new ProcessModel();
if(currentDecomposed.getTheReferedProcessModel()!=null){
//recursive invocation ....
coordinates.putAll(copyProcessModelData(currentDecomposed.getTheReferedProcessModel(),newReferedProcessModel, oldProcessIdent, newProcessIdent, operationType));
newDecomposed.setTheReferedProcessModel(newReferedProcessModel);
}
}//END METHOD
private void copyAutomaticProperties(Automatic newAutomatic, Automatic currentAutomatic) {
// set common attribute
newAutomatic.setTheArtifact(currentAutomatic.getTheArtifact());
//about parameters
Collection<Parameter> newParameters = null;
newParameters = copyAutomaticParameters(currentAutomatic.getTheParameters(),newAutomatic);
newAutomatic.setTheParameters(newParameters);
//about subroutine
if(currentAutomatic.getTheSubroutine()!=null){
Subroutine newSubRoutine = new Subroutine();
if(currentAutomatic.getTheSubroutine().getTheArtifactType()!=null){
newSubRoutine.insertIntoTheArtifactType(currentAutomatic.getTheSubroutine().getTheArtifactType());
}
newSubRoutine.insertIntoTheAutomatic(newAutomatic);
//need to copy a new ToolParameter
Collection<ToolParameter> newToolParameters = null;
newToolParameters = copySubroutineToolParameters(currentAutomatic.getTheSubroutine().getTheToolParameters(),newSubRoutine);
newSubRoutine.setTheToolParameters(newToolParameters);
}//end if
}//end method
private void addCoordinate(Integer theReferredOid, String className, String newObjRef, Hashtable<String, String> coordinates) throws DAOException{
String currentObjKey = className + ":" + String.valueOf(theReferredOid);
coordinates.put(currentObjKey, newObjRef);
}
/**
* @param newActivities the copied activities
* @return a table indexing the activities by ident
*/
private Hashtable<String,Activity> generateActivitiesTable(Collection<Activity> newActivities) {
Hashtable<String,Activity> activitiesTable = new Hashtable<String,Activity>(1000,1);
for(Iterator<Activity> actIterator = newActivities.iterator();actIterator.hasNext();){
Activity currentActivity = actIterator.next();
activitiesTable.put(currentActivity.getIdent(),currentActivity);
}
return activitiesTable;
}
public void saveCoordinates(Hashtable<String, String> coordinates, String processIdent){
try{
Enumeration<String> enumer = coordinates.keys();
WebAPSEEObject webObj;
GraphicCoordinate coord, newCoord;
Activity act;
Connection con;
ReqAgent reqAg;
ReqWorkGroup reqGr;
RequiredResource reqRes;
while (enumer.hasMoreElements()) {
String currentObjKey = (String) enumer.nextElement();
StringTokenizer currentObjTokenizer = new StringTokenizer(currentObjKey, ":");
String className = currentObjTokenizer.nextToken();
Integer theReferredOid = Integer.valueOf(currentObjTokenizer.nextToken());
webObj = webAPSEEObjectDAO.retrieveWebAPSEEObject(theReferredOid, className);
if(webObj!=null){
String key = coordinates.get(currentObjKey);
StringTokenizer tokenizer = new StringTokenizer(key,"+");
String webObjType = tokenizer.nextToken();
String webObjKey = tokenizer.nextToken();
coord = webObj.getTheGraphicCoordinate();
newCoord = new GraphicCoordinate();
newCoord.setTheProcess(processIdent);
newCoord.setX(coord.getX());
newCoord.setY(coord.getY());
newCoord.setVisible(coord.isVisible());
if(webObjType.equals(WebAPSEEObject.ACTIVITY)){
act = (Activity) activityDAO.retrieveBySecondaryKey(webObjKey);
if(act!=null){
webObj = new WebAPSEEObject(act.getId(),act.getClass().getSimpleName(),newCoord);
webAPSEEObjectDAO.daoSave(webObj);
}
}else if(webObjType.equals(WebAPSEEObject.CONNECTION)){
con = (Connection) connectionDAO.retrieveBySecondaryKey(webObjKey);
if(con!=null){
webObj = new WebAPSEEObject(con.getId(),con.getClass().getSimpleName(),newCoord);
webAPSEEObjectDAO.daoSave(webObj);
}
}else{
StringTokenizer webObjTokenizer = new StringTokenizer(webObjKey,":");
String typeIdent = null;
String instanceIdent = null;
String normalIdent = null;
if(webObjTokenizer.countTokens()==3){
typeIdent = webObjTokenizer.nextToken();
instanceIdent = webObjTokenizer.nextToken();
normalIdent = webObjTokenizer.nextToken();
}else if(webObjTokenizer.countTokens()==2){
typeIdent = webObjTokenizer.nextToken();
instanceIdent = "";
normalIdent = webObjTokenizer.nextToken();
}
if(webObjType.equals(WebAPSEEObject.REQ_AGENT)){
reqAg = reqAgentDAO.findReqAgentFromProcessModel(instanceIdent, typeIdent, normalIdent);
if(reqAg!=null){
webObj = new WebAPSEEObject(reqAg.getId(),reqAg.getClass().getSimpleName(),newCoord);
webAPSEEObjectDAO.daoSave(webObj);
}
}else if(webObjType.equals(WebAPSEEObject.REQ_WorkGroup)){
reqGr = reqWorkGroupDAO.findReqWorkGroupFromProcessModel(instanceIdent, typeIdent, normalIdent);
if(reqGr!=null){
webObj = new WebAPSEEObject(reqGr.getId(),reqGr.getClass().getSimpleName(),newCoord);
webAPSEEObjectDAO.daoSave(webObj);
}
}else if(webObjType.equals(WebAPSEEObject.REQ_RESOURCE)){
reqRes = reqResourceDAO.findRequiredResourceFromProcessModel(instanceIdent, typeIdent, normalIdent);
if(reqRes!=null){
webObj = new WebAPSEEObject(reqRes.getId(),reqRes.getClass().getSimpleName(),newCoord);
webAPSEEObjectDAO.daoSave(webObj);
}
}
}
}
}
}catch(WebapseeException e){
e.printStackTrace();
}
}
public void decomposedDistilling(String dec_id, String template_id) throws WebapseeException {
Decomposed decomposed = (Decomposed)decomposedDAO.retrieveBySecondaryKey(dec_id);
String state = decomposed.getTheReferedProcessModel().getPmState();
if(!state.equals(ProcessModel.REQUIREMENTS)){
Template template = new Template();
template.setIdent(template_id);
//Copy relationships in common with Process
Hashtable<String, String> coordinates = copyServices.copyProcessModelData(decomposed.getTheReferedProcessModel(),template.getTheProcessModel(),dec_id,template.getIdent(), CopyProcessServiceImpl.DEC_DEST, null);
template = (Template)templateDAO.daoSave(template);
copyServices.saveCoordinates(coordinates, template_id);
}else System.out.println(Messages.getString("srcReuse.TempManag.TemplatesServices.templExecADecomp")+" "+state+" "+Messages.getString("srcReuse.TempManag.TemplatesServices.templExecCannBeDist")); //$NON-NLS-1$ //$NON-NLS-2$
}
public String getStateOfTemplate(String template_id) throws WebapseeException{
Template template = getTemplate(template_id);
String state = template.getTemplateState();
if(state != null && !state.equals(""))
return state;
return null;
}
@Override
public void processInstantiation(String template_id, String instance_id, String userIdent) throws WebapseeException {
Template template = getTemplate(template_id);
Project projectExists = projectDAO.retrieveBySecondaryKey(instance_id);
if(projectExists != null) {
throw new WebapseeException("O Projeto " + instance_id + " já foi cadastrado no banco.");
} else {
Object[] artifactsIdentsFromTemplate = artifactDAO.getArtifactsIdentsFromProcessModelWithoutTemplates(template_id);
Properties artifactsIdents = null;
if( artifactsIdentsFromTemplate != null){
artifactsIdents = new Properties();
String token = "";
for (int i = 0; i < artifactsIdentsFromTemplate.length; i++) {
String oldArtifactIdent = (String) artifactsIdentsFromTemplate[i];
String newArtifactIdent = oldArtifactIdent;
if(oldArtifactIdent.contains("-")){
token = oldArtifactIdent.substring(0 , oldArtifactIdent.indexOf("-"));
if(!token.equals("") && token.trim().contains(template_id)){
newArtifactIdent = oldArtifactIdent.substring(oldArtifactIdent.indexOf("-")+1).trim();
}
}
artifactsIdents.put(oldArtifactIdent, newArtifactIdent);
}
}
Process instance = new Process();
instance.setIdent(instance_id);
//Copy relationships in common with Process
Hashtable<String, String> coordinates = copyServices.copyProcessModelData(template.getTheProcessModel(),instance.getTheProcessModel(),template_id,instance.getIdent(), CopyProcessServiceImpl.PROC_INST, artifactsIdents);
instance = (Process)processDAO.daoSave(instance);
//Create Project for the instance
Project project = new Project();
project.setIdent(instance_id);
project.setName(instance_id);
project = (Project) projectDAO.daoSave(project);
project.setProcessRefered(instance);
projectDAO.update(project);
//Define the manager to created project
Agent manager = (Agent) agentDAO.retrieveBySecondaryKey(userIdent);
instance.insertIntoTheAgent(manager);
processDAO.update(instance);
//Make relationships of Template versioning
ProcessModel pmodel = instance.getTheProcessModel();
pmodel.setTheOrigin(template);
template.insertIntoTheInstances(pmodel);
processModelDAO.daoSave(pmodel);
templateDAO.daoSave(template);
copyServices.saveCoordinates(coordinates, instance_id);
}
}
public void processComposition(String template_id, String currentLevel_id, Object[] artifactsIdentsFromUser) throws DAOException {
System.out.println("caiu na composi��o: "+artifactsIdentsFromUser[0]);
System.out.println("caiu na composi��o 2: "+artifactsIdentsFromUser[1]);
// Object[] artifactsIdentsFromUser = artifactDAO.getArtifactsIdentsFromProcessModelWithoutTemplates(template_id);
Properties artifactsIdents = null;
if( artifactsIdentsFromUser != null){
artifactsIdents = new Properties();
for (int i = 0; i < artifactsIdentsFromUser.length; i++) {
SimpleUpdateDTO artifactIdent = (SimpleUpdateDTO)artifactsIdentsFromUser[i];
artifactsIdents.put(artifactIdent.getOldValue(), artifactIdent.getNewValue());
}
}
Decomposed decomp1 = (Decomposed) decomposedDAO.retrieveBySecondaryKey(currentLevel_id);
ProcessModel pModel1 = decomp1.getTheReferedProcessModel();
Template template = (Template) templateDAO.retrieveBySecondaryKey(template_id);
String state = template.getTemplateState();
if(state.equals(Template.DEFINED) || state.equals(Template.PENDING)){
// Delete activities and connections of the Process Model
if(pModel1.getTheActivity().size()>0 || pModel1.getTheConnection().size()>0){
Collection<String> actIds = this.getActivitiesToDelete(pModel1);
Collection<String> conIds = this.getConnectionsToDelete(pModel1);
Iterator<String> iterActIds = actIds.iterator();
while (iterActIds.hasNext()) {
String id = (String) iterActIds.next();
try {
// The delete method from DynamicModeling is invoked for each activity
dynamicModeling.deleteActivity(id);
} catch (DAOException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
} catch (WebapseeException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
}
Iterator<String> iterConIds = conIds.iterator();
while (iterConIds.hasNext()) {
String id = (String) iterConIds.next();
try {
// The delete method from DynamicModeling is invoked for each connection
dynamicModeling.deleteConnection(id);
} catch (DAOException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
} catch (WebapseeException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
}
}
template = (Template) templateDAO.retrieveBySecondaryKey(template_id);
Decomposed decomp = (Decomposed) decomposedDAO.retrieveBySecondaryKey(currentLevel_id);
ProcessModel pModel = decomp.getTheReferedProcessModel();
//Copy relationships in common with Process
Hashtable<String, String> coordinates = copyServices.copyProcessModelData(template.getTheProcessModel(),pModel,template.getIdent(),currentLevel_id, CopyProcessServiceImpl.PROC_COMP, artifactsIdents);
//Make relationships of Template versioning
pModel.setTheOrigin(template);
template.insertIntoTheInstances(pModel);
templateDAO.update(template);
//Change the process model descriptor
copyServices.saveCoordinates(coordinates, currentLevel_id.substring(0, currentLevel_id.indexOf(".")));
}
else if(state.equals(Template.DRAFT) || state.equals(Template.OUTDATED)|| state.equals(Template.ENACTING)||state.equals(Template.FINISHED )||state.equals(Template.NOT_STARTED)){
System.out.println("atividade n�o definida agora"+state);
}
else System.out.println(Messages.getString("srcReuse.TempManag.TemplatesServices.templExecAtemplInThe")+" "+state+" "+Messages.getString("srcReuse.TempManag.TemplatesServices.templExecCannBeInst"));
}
// Collect the activity idents of the Process Model to delete
private Collection<String> getActivitiesToDelete(ProcessModel pModel){
Collection<String> actIds = new HashSet<String>();
//Get activities of the Process Model
Collection<Activity> activities = pModel.getTheActivity();
Iterator<Activity> iterActivity = activities.iterator();
while (iterActivity.hasNext()) {
Activity activity = iterActivity.next();
if(activity != null){
actIds.add(activity.getIdent());
}
}
return actIds;
}
// Collect the connection idents of the Process Model to delete
private Collection<String> getConnectionsToDelete(ProcessModel pModel){
Collection<String> conIds = new HashSet<String>();
//Get connections of the Process Model
Collection<Connection> connections = pModel.getTheConnection();
Iterator<Connection> iterConnection = connections.iterator();
while (iterConnection.hasNext()) {
Connection connection = iterConnection.next();
if(connection != null){
conIds.add(connection.getIdent());
}
}
return conIds;
}
@Override
public boolean toBecomeDefined(String template_id) throws WebapseeException {
Template template = getTemplate(template_id);
String state = template.getTemplateState();
if(state.equals(Template.DRAFT)){
template.setTemplateState(Template.DEFINED);
Description desc = template.getTheOriginalVersionDescription();
if(desc!=null){
Template oldTemplate = desc.getTheOldVersion();
oldTemplate.setTemplateState(Template.OUTDATED);
templateDAO.update(oldTemplate);
}
templateDAO.daoSave(template);
return true;
}else {
return false;
}
}
private TemplateDTO convertTemplateToTemplateDTO(Template template) {
TemplateDTO dto = new TemplateDTO();
dto.setIdent(template.getIdent());
dto.setTemplateState(template.getTemplateState());
dto.setTheInstances(new ArrayList<ProcessModelDTO>());
return dto;
}
private List<TemplateDTO> convertTemplatesToTemplatesDTO(List<Template> templates) {
List<TemplateDTO> templatesDTO = new ArrayList<TemplateDTO>();
for (Template template : templates) {
templatesDTO.add(this.convertTemplateToTemplateDTO(template));
}
return templatesDTO;
}
private Template getTemplate(String template_id) throws DAOException{
//checks of parameters
Template temp;
temp = templateDAO.retrieveBySecondaryKey(template_id);
if (temp == null){
// throw new DataBaseException(Messages.getString("srcReuse.TempManag.TemplatesServices.DBExcTempl")+template_id+Messages.getString("srcReuse.TempManag.TemplatesServices.DBExcDoesNot")); //$NON-NLS-1$ //$NON-NLS-2$
throw new DataBaseException("Deu erro ao buscar template"); //$NON-NLS-1$ //$NON-NLS-2$
}
// checks related to the state of the system
return temp;
}
/**
* @param connections the connections to copy
* @param oldProcessIdent the ident of the source process
* @param newProcessIdent the ident of the destination process
* @param activitiesTable the copied activities indexed by ident
* @param operationType the copy operation type
* @param coordinates table collecting graphic coordinate mappings
* @return the copied connections
* @throws DAOException
*/
private Collection<Connection> copyConnectionsOnProcessModel(Collection<Connection> connections, String oldProcessIdent, String newProcessIdent,Hashtable<String,Activity> activitiesTable, String operationType, Hashtable<String, String> coordinates) throws DAOException {
//need to take care about all kind of connections
Hashtable<String,Connection> connectionTable = new Hashtable<String,Connection>(1000,1);
HashSet<Connection> connectionsToResult = new HashSet<Connection>(1000);
Collection<Connection> postProcessingCollection = new LinkedList<Connection>();
for(Iterator<Connection> connectionsIterator = connections.iterator();connectionsIterator.hasNext();){
Connection currentConnection = connectionsIterator.next();
if(currentConnection!=null){
Connection newConnection = null;
String newConnectionIdent = currentConnection.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
if(currentConnection instanceof Sequence){
newConnection = new Sequence();
if( ((Sequence)currentConnection).getTheDependency() !=null){
Dependency newDependency = new Dependency();
newDependency.setKindDep( ((Sequence)currentConnection).getTheDependency().getKindDep());
newDependency.insertIntoTheSequence(((Sequence)newConnection));
}//end if
//about activities
//ToActivity
if ( ((Sequence)currentConnection).getToActivity()!=null)
{
//new activity ident
String actIdent = ((Sequence)currentConnection).getToActivity().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newToAct = activitiesTable.get(actIdent);
if(newToAct!=null){
((Sequence)newConnection).insertIntoToActivity(newToAct);
}//end if
}//end if != null
//FromActivity
if ( ((Sequence)currentConnection).getFromActivity()!=null)
{
//new activity ident
String actIdent = ((Sequence)currentConnection).getFromActivity().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newFromAct = activitiesTable.get(actIdent);
if(newFromAct!=null){
((Sequence)newConnection).insertIntoFromActivity(newFromAct);
}//end if
}//end if != null
newConnection.setIdent(newConnectionIdent);
this.addCoordinate(((Sequence)currentConnection).getId(), ((Sequence)currentConnection).getClass().getSimpleName(), WebAPSEEObject.CONNECTION + "+" + newConnectionIdent, coordinates);
}//########################
else if(currentConnection instanceof Feedback){
newConnection = new Feedback();
/*if( ((Feedback)currentConnection).getTheCondition() !=null){
Condition newCondition = new Condition();
newCondition.setCond( ((Feedback)currentConnection).getTheCondition().getCond());
newCondition.insertIntoIsConditionOf(((Feedback)newConnection));
}*///end if
//about activities
//ToActivity
if ( ((Feedback)currentConnection).getToActivity()!=null)
{
//new activity ident
String actIdent = ((Feedback)currentConnection).getToActivity().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newToAct = activitiesTable.get(actIdent);
if(newToAct!=null){
((Feedback)newConnection).insertIntoToActivity(newToAct);
}//end if
}//end if != null
//FromActivity
if ( ((Feedback)currentConnection).getFromActivity()!=null)
{
//new activity ident
String actIdent = ((Feedback)currentConnection).getFromActivity().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newFromAct = activitiesTable.get(actIdent);
if(newFromAct!=null){
((Feedback)newConnection).insertIntoFromActivity(newFromAct);
}//end if
}//end if != null
if(((Feedback)currentConnection).getTheCondition() != null){
((Feedback)newConnection).removeFromTheCondition();
((Feedback)newConnection).insertIntoTheCondition(((Feedback)currentConnection).getTheCondition().createClone());
}//end if condition != null
newConnection.setIdent(newConnectionIdent);
this.addCoordinate(((Feedback)currentConnection).getId(), ((Feedback)currentConnection).getClass().getSimpleName(), WebAPSEEObject.CONNECTION + "+" + newConnectionIdent, coordinates);
}//########################
else if(currentConnection instanceof ArtifactCon){
newConnection = new ArtifactCon();
if( ((ArtifactCon)currentConnection).getTheArtifactType()!=null){
((ArtifactCon)newConnection).insertIntoTheArtifactType( ((ArtifactCon)currentConnection).getTheArtifactType());
}
Artifact currentArtifact = ((ArtifactCon)currentConnection).getTheArtifact();
if( currentArtifact!=null ){
String currentArtifactIdent = currentArtifact.getIdent();
String newArtifactIdent = this.artifactsIdents.getProperty(currentArtifactIdent);
if(newArtifactIdent==null || currentArtifactIdent.equals(newArtifactIdent)){
((ArtifactCon)newConnection).insertIntoTheArtifacts( currentArtifact );
}else{
try {
Artifact newArtifact = this.newArtifacts.get(newArtifactIdent);
if(newArtifact==null){
newArtifact = (Artifact) artifactDAO.retrieveBySecondaryKey(newArtifactIdent);
if(newArtifact == null){
newArtifact = new Artifact();
newArtifact.setIdent(newArtifactIdent);
newArtifact.setDescription(currentArtifact.getDescription());
newArtifact.insertIntoTheArtifactType(((ArtifactCon) currentConnection).getTheArtifactType());
}
}
((ArtifactCon)newConnection).insertIntoTheArtifacts( newArtifact );
this.newArtifacts.put(newArtifactIdent, newArtifact);
} catch (Exception e) {
//artifact lookup/creation failed; log it and keep the connection without the renamed artifact
e.printStackTrace();
}
}
}
//about activities
//ToActivity
Collection toActivities = ((ArtifactCon)currentConnection).getToActivity();
for(Iterator<Activity> toActivityIterator = toActivities.iterator();toActivityIterator.hasNext();){
Activity currentToAct = toActivityIterator.next();
if(currentToAct!=null){
String actIdent = currentToAct.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newToAct = activitiesTable.get(actIdent);
if(newToAct!=null){
((ArtifactCon)newConnection).insertIntoToActivity(newToAct);
}//end if
}//end if != null
}//end for
//FromActivity
Collection fromActivities = ((ArtifactCon)currentConnection).getFromActivity();
for(Iterator<Activity> fromActivityIterator = fromActivities.iterator();fromActivityIterator.hasNext();){
Activity currentFromAct = fromActivityIterator.next();
if(currentFromAct!=null){
String actIdent = currentFromAct.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newFromAct = activitiesTable.get(actIdent);
if(newFromAct!=null){
((ArtifactCon)newConnection).insertIntoFromActivity(newFromAct);
}//end if
}//end if != null
}//end for
newConnection.setIdent(newConnectionIdent);
postProcessingCollection.add(currentConnection);
this.addCoordinate(((ArtifactCon)currentConnection).getId(), ((ArtifactCon)currentConnection).getClass().getSimpleName(), WebAPSEEObject.CONNECTION + "+" + newConnectionIdent, coordinates);
}//########################
else if(currentConnection instanceof BranchCon){
//########################
if(currentConnection instanceof BranchANDCon){
newConnection = new BranchANDCon();
//ToActivity
Collection toActivities = ((BranchANDCon)currentConnection).getToActivity();
for(Iterator<Activity> toActivityIterator = toActivities.iterator();toActivityIterator.hasNext();){
Activity currentToAct = toActivityIterator.next();
if(currentToAct!=null){
String actIdent = currentToAct.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newToAct = activitiesTable.get(actIdent);
if(newToAct!=null){
((BranchANDCon)newConnection).insertIntoToActivity(newToAct);
}//end if
}//end if != null
}//end for
}//########################
else if(currentConnection instanceof BranchConCond){
newConnection = new BranchConCond();
((BranchConCond)newConnection).setKindBranch( ((BranchConCond)currentConnection).getKindBranchCon());
// ToActivity
Collection toActivities = ((BranchConCond)currentConnection).getTheBranchConCondToActivity();
for(Iterator<BranchConCondToActivity> toActivityIterator = toActivities.iterator(); toActivityIterator.hasNext();){
BranchConCondToActivity currentToAct = toActivityIterator.next();
if(currentToAct!=null){
BranchConCondToActivity newBranchConCondToAct = new BranchConCondToActivity();
if(currentToAct.getTheActivity()!=null){
String actIdent = currentToAct.getTheActivity().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newToAct = activitiesTable.get(actIdent);
if(newToAct!=null){
newToAct.insertIntoTheBranchConCondToActivity(newBranchConCondToAct);
}//end if
}//end if != null
//about conditions
if(currentToAct.getTheCondition() != null){
newBranchConCondToAct.removeFromTheCondition();
newBranchConCondToAct.insertIntoTheCondition(currentToAct.getTheCondition().createClone());
}//end if condition != null
//add current element to newBranchConCond object
((BranchConCond)newConnection).insertIntoTheBranchConCondToActivity(newBranchConCondToAct);
}//end if != null
}//end for
//###########
// ((BranchConCond)currentConnection).getKindBranchCon();
//##########
}//end if
//common attribute for all branch connections:
//((BranchCon)newConnection).setFired(((BranchCon)currentConnection).getFired());
//about dependency
if( ((BranchCon)currentConnection).getTheDependency() !=null){
Dependency newDependency = new Dependency();
newDependency.setKindDep( ((BranchCon)currentConnection).getTheDependency().getKindDep());
newDependency.insertIntoTheMultipleCon( ((BranchCon)newConnection));
}//end if
//about from activity
if( ((BranchCon)currentConnection).getFromActivity()!=null ){
String actIdent = ((BranchCon)currentConnection).getFromActivity().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newFromAct = activitiesTable.get(actIdent);
if(newFromAct!=null){
((BranchCon)newConnection).insertIntoFromActivity(newFromAct);
}//end if
}//end if
newConnection.setIdent(newConnectionIdent);
postProcessingCollection.add(currentConnection);
this.addCoordinate(((BranchCon)currentConnection).getId(), ((BranchCon)currentConnection).getClass().getSimpleName(), WebAPSEEObject.CONNECTION + "+" + newConnectionIdent, coordinates);
}
else if(currentConnection instanceof JoinCon){
newConnection = new JoinCon();
//simple attributes
//((JoinCon)newConnection).setFired( ((JoinCon)currentConnection).getFired() );
((JoinCon)newConnection).setKindJoin(((JoinCon)currentConnection).getKindJoinCon());
//about dependency
if( ((JoinCon)currentConnection).getTheDependency() !=null){
Dependency newDependency = new Dependency();
newDependency.setKindDep( ((JoinCon)currentConnection).getTheDependency().getKindDep());
newDependency.insertIntoTheMultipleCon(((JoinCon)newConnection));
}//end if
//About activities
//ToActivity
//About to Activity
if( ((JoinCon)currentConnection).getToActivity()!=null ){
String actIdent = ((JoinCon)currentConnection).getToActivity().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newToAct = activitiesTable.get(actIdent);
if(newToAct!=null){
((JoinCon)newConnection).insertIntoToActivity(newToAct);
}//end if
}//end if
//FromActivity
Collection fromActivities = ((JoinCon)currentConnection).getFromActivity();
for(Iterator<Activity> fromActivityIterator = fromActivities.iterator();fromActivityIterator.hasNext();){
Activity currentFromAct = fromActivityIterator.next();
if(currentFromAct!=null){
String actIdent = currentFromAct.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
Activity newFromAct = activitiesTable.get(actIdent);
if(newFromAct!=null){
((JoinCon)newConnection).insertIntoFromActivity(newFromAct);
}//end if
}//end if != null
}//end for
newConnection.setIdent(newConnectionIdent);
postProcessingCollection.add(currentConnection);
this.addCoordinate(((JoinCon)currentConnection).getId(), ((JoinCon)currentConnection).getClass().getSimpleName(), WebAPSEEObject.CONNECTION + "+" + newConnectionIdent, coordinates);
}//end joinCon processing
//about connection type
if(currentConnection.getTheConnectionType()!=null){
newConnection.insertIntoTheConnectionType(currentConnection.getTheConnectionType());
}//end if
if(newConnection!=null){
connectionTable.put(newConnection.getIdent(),newConnection);
connectionsToResult.add(newConnection);
}
}//end if != null
}//end for
//After the first pass we still need to wire connection-to-connection links,
//because multiple connections (branches and joins) can reference other connections.
//So we iterate over the collected connections again to restore those relationships.
for(Iterator<Connection> postProcessingIterator = postProcessingCollection.iterator();postProcessingIterator.hasNext();){
Connection currentPostConnection = postProcessingIterator.next();
Connection alreadyCreatedConnection = connectionTable.get(currentPostConnection.getIdent().replaceFirst(oldProcessIdent,newProcessIdent));
//ToMultipleCon
if(currentPostConnection instanceof ArtifactCon){
//To MultipleCon
Collection toMultipleCon = ((ArtifactCon)currentPostConnection).getToMultipleCon();
for(Iterator<MultipleCon> toMultipleConIterator = toMultipleCon.iterator();toMultipleConIterator.hasNext();){
MultipleCon currentMultiple = toMultipleConIterator.next();
if(currentMultiple!=null){
String multipleIdent = currentMultiple.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
MultipleCon newToMultipleCon = (MultipleCon) connectionTable.get(multipleIdent);
if(newToMultipleCon!=null){
((ArtifactCon)alreadyCreatedConnection).insertIntoToMultipleCon(newToMultipleCon);
}//end if
}//end if != null
}//end for
}//end if artifactCon
else if(currentPostConnection instanceof BranchCon){
//From MultipleConnection
MultipleCon fromMultipleCon = ((BranchCon)currentPostConnection).getFromMultipleConnection();
if(fromMultipleCon!=null){
String multipleIdent = fromMultipleCon.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
MultipleCon newFromMultipleCon = (MultipleCon) connectionTable.get(multipleIdent);
if(newFromMultipleCon!=null){
((BranchCon)alreadyCreatedConnection).insertIntoFromMultipleConnection(newFromMultipleCon);
}//end if
}//end if
if(currentPostConnection instanceof BranchANDCon){
Collection toMultipleCon = ((BranchANDCon)currentPostConnection).getToMultipleCon();
for(Iterator<MultipleCon> toMultipleConIterator = toMultipleCon.iterator();toMultipleConIterator.hasNext();){
MultipleCon currentMultiple = toMultipleConIterator.next();
if(currentMultiple!=null){
String multipleIdent = currentMultiple.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
MultipleCon newToMultipleCon = (MultipleCon) connectionTable.get(multipleIdent);
if(newToMultipleCon!=null){
((BranchANDCon)alreadyCreatedConnection).insertIntoToMultipleCon(newToMultipleCon);
}//end if
}//end if != null
}//end for
}else if(currentPostConnection instanceof BranchConCond){
Collection toMultipleCon = ((BranchConCond)currentPostConnection).getTheBranchConCondToMultipleCon();
for(Iterator<BranchConCondToMultipleCon> toMultipleIterator = toMultipleCon.iterator();toMultipleIterator.hasNext();){
BranchConCondToMultipleCon currentToMult = toMultipleIterator.next();
if(currentToMult!=null){
BranchConCondToMultipleCon newBranchConCondToMult = new BranchConCondToMultipleCon();
if(currentToMult.getTheMultipleCon()!=null){
String multipleIdent = currentToMult.getTheMultipleCon().getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
MultipleCon newToMultipleCon = (MultipleCon) connectionTable.get(multipleIdent);
if(newToMultipleCon!=null){
newBranchConCondToMult.insertIntoTheMultipleCon(newToMultipleCon);
}//end if
}//end if != null
// about conditions
// newBranchConCondToMult.setCondition(currentToMult.getCondition());
((BranchConCond)alreadyCreatedConnection).insertIntoTheBranchCondToMultipleCon(newBranchConCondToMult);
}//end if != null
}//end for
}//end if
}
else if(currentPostConnection instanceof JoinCon){
//to MultipleCon
MultipleCon toMultipleCon = ((JoinCon)currentPostConnection).getToMultipleCon();
if(toMultipleCon!=null){
String multipleIdent = toMultipleCon.getIdent().replaceFirst(oldProcessIdent,newProcessIdent);
MultipleCon newToMultipleCon = (MultipleCon) connectionTable.get(multipleIdent);
if(newToMultipleCon!=null){
((JoinCon)alreadyCreatedConnection).insertIntoToMultipleCon(newToMultipleCon);
}//end if
}//end if
//No need to handle fromConnection here, because all "to" links were already wired above.
}
}//end for postProcessing
return connectionsToResult;
}
private Collection<ToolParameter> copySubroutineToolParameters(Collection theToolParameters, Subroutine newSubRoutine) {
Collection<ToolParameter> newParameters = new LinkedList<ToolParameter>();
for(Iterator<ToolParameter> paramIterator= theToolParameters.iterator();paramIterator.hasNext();){
ToolParameter currentParameter = paramIterator.next();
if(currentParameter!=null){
ToolParameter newParameter = new ToolParameter();
newParameter.insertIntoTheSubroutine(newSubRoutine);
newParameter.setLabel(currentParameter.getLabel());
newParameter.setSeparatorSymbol(currentParameter.getSeparatorSymbol());
if(currentParameter.getTheArtifactType()!=null){
newParameter.setTheArtifactType(currentParameter.getTheArtifactType());
}
if(currentParameter.getThePrimitiveType()!=null){
newParameter.setThePrimitiveType(currentParameter.getThePrimitiveType());
}
//add to main collection
newParameters.add(newParameter);
}//END IF
}//END FOR
//after all, return the new collection
return newParameters;
}
private Collection<Parameter> copyAutomaticParameters(Collection theParameters, Automatic newAutomatic) {
Collection<Parameter> newParameters = new LinkedList<Parameter>();
for(Iterator<Parameter> paramIterator= theParameters.iterator();paramIterator.hasNext();){
Parameter currentParameter = paramIterator.next();
if(currentParameter!=null){
Parameter newParameter = new Parameter();
newParameter.insertIntoTheAutomatic(newAutomatic);
newParameter.setDescription(currentParameter.getDescription());
//add to main collection
newParameters.add(newParameter);
}//END IF
}//END FOR
//after all, return the new collection
return newParameters;
}//END METHOD
/**
* Copies the simple attributes and the associated collections (reservations,
* involved artifacts, required resources and required people) from one Normal
* activity to another.
* @param destinationNormal the Normal activity receiving the copies
* @param sourceNormal the Normal activity being copied
* @param operationType the copy operation type (planned dates are skipped for PROC_DEST)
* @param current_level the current level, propagated to the involved-artifact copy
* @param coordinates table that accumulates coordinate mappings for the copied objects
* @throws DAOException
*/
private void copyNormalProperties(Normal destinationNormal,Normal sourceNormal, String operationType, String current_level, Hashtable<String, String> coordinates) throws DAOException{
//simple attributes
destinationNormal.setHowLong(sourceNormal.getHowLong());
destinationNormal.setHowLongUnit(sourceNormal.getHowLongUnit());
destinationNormal.setName(sourceNormal.getName());
destinationNormal.setScript(sourceNormal.getScript());
destinationNormal.setDelegable(sourceNormal.isDelegable());
destinationNormal.setAutoAllocable(sourceNormal.getAutoAllocable());
if(!operationType.equals(PROC_DEST)){
destinationNormal.setPlannedBegin(sourceNormal.getPlannedBegin());
destinationNormal.setPlannedEnd(sourceNormal.getPlannedEnd());
}
destinationNormal.setStaticOk(sourceNormal.getStaticOk());
destinationNormal.setRequirements(sourceNormal.getRequirements());
// about conditions
if(sourceNormal.getPreCondition()!=null){
/* Condition newPreCondition = new Condition();
newPreCondition.setCond(sourceNormal.getPreCondition().getCond());
newPreCondition.insertIntoPreCondition(destinationNormal);
*/
/*PolCondition newPreCondition = sourceNormal.getPreCondition().createClone();
destinationNormal.insertIntoPreCondition(newPreCondition);*/
}
if(sourceNormal.getPosCondition()!=null){
/* Condition newPosCondition = new Condition();
newPosCondition.setCond(sourceNormal.getPosCondition().getCond());
newPosCondition.insertIntoPosCondition(destinationNormal);
*/
/*PolCondition newPosCondition = sourceNormal.getPosCondition().createClone();
destinationNormal.insertIntoPosCondition(newPosCondition);*/
}
//about reservations
Collection<Reservation> newReservations = copyReservations(sourceNormal.getTheReservation(),destinationNormal);
destinationNormal.setTheReservation(newReservations);
//about involved artifacts
//involved to
Collection<InvolvedArtifact> newInvolvedArtifactsToNormal = copyInvolvedArtifacts(sourceNormal.getInvolvedArtifactToNormal(),destinationNormal,TO_INVOLVED, current_level);
destinationNormal.setInvolvedArtifactToNormal(newInvolvedArtifactsToNormal);
//involved from
Collection<InvolvedArtifact> newInvolvedArtifactsFromNormal = copyInvolvedArtifacts(sourceNormal.getInvolvedArtifactFromNormal(),destinationNormal,FROM_INVOLVED, current_level);
destinationNormal.setInvolvedArtifactFromNormal(newInvolvedArtifactsFromNormal);
//about required Resources
Collection<RequiredResource> newRequiredResources = copyRequiredResources(sourceNormal.getTheRequiredResource(),destinationNormal, operationType, coordinates);
destinationNormal.setTheRequiredResource(newRequiredResources);
//about required People
Collection<RequiredPeople> newRequiredPeople = copyRequiredPeople(sourceNormal.getTheRequiredPeople(),destinationNormal, operationType, coordinates);
destinationNormal.setTheRequiredPeople(newRequiredPeople);
}
/**
* Copies the RequiredPeople (required agents and WorkGroups) of a Normal activity.
* @param theRequiredPeople the source collection of required people
* @param destinationNormal the Normal activity receiving the copies
* @param operationType the copy operation type (agent/WorkGroup instance links are skipped for PROC_DEST)
* @param coordinates table that accumulates coordinate mappings for the copied objects
* @return the new collection of RequiredPeople
* @throws DAOException
*/
private Collection<RequiredPeople> copyRequiredPeople(Collection theRequiredPeople, Normal destinationNormal, String operationType, Hashtable<String, String> coordinates) throws DAOException {
Collection<RequiredPeople> newRequiredPeoples = new HashSet<RequiredPeople>();
String coordinateKey = null;
Collection<String> roles = new HashSet<String>();
for(Iterator<RequiredPeople> requiredIterator= theRequiredPeople.iterator();requiredIterator.hasNext();){
RequiredPeople currentReqPeople = requiredIterator.next();
if(currentReqPeople!=null){
if(currentReqPeople instanceof ReqAgent ){
ReqAgent currentReqAg = (ReqAgent)currentReqPeople;
ReqAgent newReqAgent = new ReqAgent();
if(currentReqAg.getTheRole()!=null && !roles.contains(currentReqAg.getTheRole().getIdent())){
newReqAgent.insertIntoTheRole( currentReqAg.getTheRole());
roles.add(currentReqAg.getTheRole().getIdent());
coordinateKey = newReqAgent.getTheRole().getIdent();
if(!operationType.equals(PROC_DEST)){
if(currentReqAg.getTheAgent()!=null){
newReqAgent.insertIntoTheAgent(currentReqAg.getTheAgent());
coordinateKey = coordinateKey + ":" + newReqAgent.getTheAgent().getIdent();
}
}
coordinateKey = coordinateKey + ":" + destinationNormal.getIdent();
//about ReqAgentRequiresAbility
Collection<ReqAgentRequiresAbility> newReqAgReqAbility=null;
newReqAgReqAbility = copyReqAgentReqAbility(currentReqAg.getTheReqAgentRequiresAbility(),newReqAgent);
newReqAgent.setTheReqAgentRequiresAbility(newReqAgReqAbility);
this.addCoordinate(currentReqAg.getId(), currentReqAg.getClass().getSimpleName(), WebAPSEEObject.REQ_AGENT +"+"+ coordinateKey, coordinates);
coordinateKey = null;
//the common attribute normal activity
newReqAgent.insertIntoTheNormal(destinationNormal);
//add to main collection
newRequiredPeoples.add(newReqAgent);
}
}else if(currentReqPeople instanceof ReqWorkGroup){
ReqWorkGroup currentReqWorkGroup = (ReqWorkGroup)currentReqPeople;
ReqWorkGroup newReqWorkGroup = new ReqWorkGroup();
if(currentReqWorkGroup.getTheWorkGroupType()!=null){
newReqWorkGroup.insertIntoTheWorkGroupType( currentReqWorkGroup.getTheWorkGroupType() );
coordinateKey = newReqWorkGroup.getTheWorkGroupType().getIdent();
}
if(!operationType.equals(PROC_DEST)){
if(currentReqWorkGroup.getTheWorkGroup()!=null){
newReqWorkGroup.insertIntoTheWorkGroup(currentReqWorkGroup.getTheWorkGroup());
coordinateKey = coordinateKey + ":" + newReqWorkGroup.getTheWorkGroup().getIdent();
}
}
coordinateKey = coordinateKey + ":" + destinationNormal.getIdent();
this.addCoordinate(currentReqWorkGroup.getId(), currentReqWorkGroup.getClass().getSimpleName(), WebAPSEEObject.REQ_WorkGroup +"+"+coordinateKey, coordinates);
coordinateKey = null;
//the common attribute normal activity
newReqWorkGroup.insertIntoTheNormal(destinationNormal);
//add to main collection
newRequiredPeoples.add(newReqWorkGroup);
}//end if
}//END IF
}//END FOR
//after all, return the new collection
return newRequiredPeoples;
}//end method
/**
* Copies the abilities required by a ReqAgent.
* @param theReqAgentRequiresAbility the source collection of required abilities
* @param newReqAgent the ReqAgent that will own the copies
* @return the new collection of ReqAgentRequiresAbility objects
*/
private Collection<ReqAgentRequiresAbility> copyReqAgentReqAbility(Collection theReqAgentRequiresAbility,ReqAgent newReqAgent) {
Collection<ReqAgentRequiresAbility> newReqAgReqAbilities = new HashSet<ReqAgentRequiresAbility>();
for(Iterator<ReqAgentRequiresAbility> requiredIterator= theReqAgentRequiresAbility.iterator();requiredIterator.hasNext();){
ReqAgentRequiresAbility currentReqAgReqAbility = requiredIterator.next();
if(currentReqAgReqAbility!=null){
ReqAgentRequiresAbility newReqAgReqAb = new ReqAgentRequiresAbility();
if(currentReqAgReqAbility.getTheAbility()!=null){
newReqAgReqAb.insertIntoTheAbility(currentReqAgReqAbility.getTheAbility());
}
newReqAgReqAb.setDegree(currentReqAgReqAbility.getDegree());
newReqAgReqAb.insertIntoTheReqAgent(newReqAgent);
//add to Main Collection
newReqAgReqAbilities.add(newReqAgReqAb);
}//END IF
}//END FOR
//after all, return the new collection
return newReqAgReqAbilities;
}
/**
* Copies the RequiredResources of a Normal activity.
* @param theRequiredResource the source collection of required resources
* @param destinationNormal the Normal activity receiving the copies
* @param operationType the copy operation type (resource instance links are skipped for PROC_DEST)
* @param coordinates table that accumulates coordinate mappings for the copied objects
* @return the new collection of RequiredResources
* @throws DAOException
*/
private Collection<RequiredResource> copyRequiredResources(Collection<RequiredResource> theRequiredResource, Normal destinationNormal, String operationType, Hashtable<String, String> coordinates) throws DAOException {
Collection<RequiredResource> newRequiredResources = new HashSet<RequiredResource>();
String coordinateKey = null;
for(Iterator<RequiredResource> requiredIterator= theRequiredResource.iterator();requiredIterator.hasNext();){
RequiredResource currentReqResource = requiredIterator.next();
if(currentReqResource!=null){
RequiredResource newRequiredResource = new RequiredResource();
if(currentReqResource.getTheResourceType()!=null){
newRequiredResource.insertIntoTheResourceType(currentReqResource.getTheResourceType());
coordinateKey = currentReqResource.getTheResourceType().getIdent();
}
if(!operationType.equals(PROC_DEST)){
if(currentReqResource.getTheResource()!=null){
newRequiredResource.insertIntoTheResource(currentReqResource.getTheResource());
newRequiredResource.setAmountNeeded(currentReqResource.getAmountNeeded());
coordinateKey = coordinateKey + ":" + currentReqResource.getTheResource().getIdent();
}
}
newRequiredResource.insertIntoTheNormal(destinationNormal);
coordinateKey = coordinateKey + ":" + destinationNormal.getIdent();
this.addCoordinate(currentReqResource.getId(), currentReqResource.getClass().getSimpleName(), WebAPSEEObject.REQ_RESOURCE +"+"+coordinateKey, coordinates);
coordinateKey = null;
//add to Main Collection
newRequiredResources.add(newRequiredResource);
}//END IF
}//END FOR
//after all, return the new collection
return newRequiredResources;
}//end method
/**
* @param theReservation collection of current reservations
* @param destinationNormal normal activity that will receive the new collection
* @return the new collection of reservations ... to be inserted into the new Normal activity
*/
private Collection<Reservation> copyReservations(Collection theReservation, Normal destinationNormal) {
Collection<Reservation> newInvolvedReservations = new HashSet<Reservation>();
for(Iterator<Reservation> reservationsIterator= theReservation.iterator();reservationsIterator.hasNext();){
Reservation currentReservation = reservationsIterator.next();
if(currentReservation!=null){
Reservation newReservation = new Reservation();
newReservation.insertIntoTheNormal(destinationNormal);
if(currentReservation.getTheExclusive()!=null){
newReservation.insertIntoTheExclusive(currentReservation.getTheExclusive());
}
//dates
newReservation.setFrom(currentReservation.getFrom());
newReservation.setTo(currentReservation.getTo());
//add to Main Collection
newInvolvedReservations.add(newReservation);
}//END IF
}//END FOR
//after all, return the new collection
return newInvolvedReservations;
}//end method
//private constants, to reuse the method that copies involved artifacts
private static final int TO_INVOLVED = 0, FROM_INVOLVED = 1;
/**
* Copies the InvolvedArtifacts of a Normal activity, resolving renamed artifacts
* through the artifactsIdents mapping when necessary.
* @param currentInvolvedArtifacts the source collection of involved artifacts
* @param newNormalReference the Normal activity receiving the copies
* @param kindRelationship TO_INVOLVED or FROM_INVOLVED
* @param current_level the current level
* @return the new collection of InvolvedArtifacts
* @throws DAOException
*/
private Collection<InvolvedArtifact> copyInvolvedArtifacts(Collection<InvolvedArtifact> currentInvolvedArtifacts,Normal newNormalReference,int kindRelationship, String current_level) throws DAOException {
Collection<InvolvedArtifact> newInvolvedArtifacts = new HashSet<InvolvedArtifact>();
for(Iterator<InvolvedArtifact> involvedIterator= currentInvolvedArtifacts.iterator();involvedIterator.hasNext();){
InvolvedArtifact currentInvolved = involvedIterator.next();
if(currentInvolved!=null){
InvolvedArtifact newInvolved = new InvolvedArtifact();
if(currentInvolved.getTheArtifactType()!=null){
newInvolved.insertIntoTheArtifactType(currentInvolved.getTheArtifactType());
}
Artifact currentArtifact = currentInvolved.getTheArtifact();
if(currentArtifact!=null){
String currentArtifactIdent = currentArtifact.getIdent();
String newArtifactIdent = this.artifactsIdents.getProperty(currentArtifactIdent);
if(newArtifactIdent==null || currentArtifactIdent.equals(newArtifactIdent)){
newInvolved.insertIntoTheArtifacts(currentArtifact);
}else{
Artifact newArtifact;
newArtifact = newArtifacts.get(newArtifactIdent);
if(newArtifact == null){
newArtifact = (Artifact) artifactDAO.retrieveBySecondaryKey(newArtifactIdent);
if(newArtifact == null){
newArtifact = new Artifact();
newArtifact.setIdent(newArtifactIdent);
newArtifact.setDescription(currentArtifact.getDescription());
newArtifact.insertIntoTheArtifactType(currentInvolved.getTheArtifactType());
}
}
newInvolved.insertIntoTheArtifacts(newArtifact);
this.newArtifacts.put(newArtifactIdent, newArtifact);
}
}
if(kindRelationship==TO_INVOLVED){
newInvolved.insertIntoInInvolvedArtifacts(newNormalReference);
}else if(kindRelationship==FROM_INVOLVED){
newInvolved.insertIntoOutInvolvedArtifacts(newNormalReference);
}
//add to Main Collection
newInvolvedArtifacts.add(newInvolved);
}//END IF
}//END FOR
//after all, return the new collection
return newInvolvedArtifacts;
}//end method
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/RequiredResourceResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.RequiredResourceService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.RequiredResourceDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.RequiredResource}.
*/
@RestController
@RequestMapping("/api")
public class RequiredResourceResource {
private final Logger log = LoggerFactory.getLogger(RequiredResourceResource.class);
private static final String ENTITY_NAME = "requiredResource";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final RequiredResourceService requiredResourceService;
public RequiredResourceResource(RequiredResourceService requiredResourceService) {
this.requiredResourceService = requiredResourceService;
}
/**
* {@code POST /required-resources} : Create a new requiredResource.
*
* @param requiredResourceDTO the requiredResourceDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new requiredResourceDTO, or with status {@code 400 (Bad Request)} if the requiredResource has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/required-resources")
public ResponseEntity<RequiredResourceDTO> createRequiredResource(@RequestBody RequiredResourceDTO requiredResourceDTO) throws URISyntaxException {
log.debug("REST request to save RequiredResource : {}", requiredResourceDTO);
if (requiredResourceDTO.getId() != null) {
throw new BadRequestAlertException("A new requiredResource cannot already have an ID", ENTITY_NAME, "idexists");
}
RequiredResourceDTO result = requiredResourceService.save(requiredResourceDTO);
return ResponseEntity.created(new URI("/api/required-resources/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /required-resources} : Updates an existing requiredResource.
*
* @param requiredResourceDTO the requiredResourceDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated requiredResourceDTO,
* or with status {@code 400 (Bad Request)} if the requiredResourceDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the requiredResourceDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/required-resources")
public ResponseEntity<RequiredResourceDTO> updateRequiredResource(@RequestBody RequiredResourceDTO requiredResourceDTO) throws URISyntaxException {
log.debug("REST request to update RequiredResource : {}", requiredResourceDTO);
if (requiredResourceDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
RequiredResourceDTO result = requiredResourceService.save(requiredResourceDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, requiredResourceDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /required-resources} : get all the requiredResources.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of requiredResources in body.
*/
@GetMapping("/required-resources")
public List<RequiredResourceDTO> getAllRequiredResources() {
log.debug("REST request to get all RequiredResources");
return requiredResourceService.findAll();
}
/**
* {@code GET /required-resources/:id} : get the "id" requiredResource.
*
* @param id the id of the requiredResourceDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the requiredResourceDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/required-resources/{id}")
public ResponseEntity<RequiredResourceDTO> getRequiredResource(@PathVariable Long id) {
log.debug("REST request to get RequiredResource : {}", id);
Optional<RequiredResourceDTO> requiredResourceDTO = requiredResourceService.findOne(id);
return ResponseUtil.wrapOrNotFound(requiredResourceDTO);
}
/**
* {@code DELETE /required-resources/:id} : delete the "id" requiredResource.
*
* @param id the id of the requiredResourceDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/required-resources/{id}")
public ResponseEntity<Void> deleteRequiredResource(@PathVariable Long id) {
log.debug("REST request to delete RequiredResource : {}", id);
requiredResourceService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
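The null-id guard in `updateRequiredResource` above is the standard pattern here: reject an update that carries no identifier before the service layer is touched. A minimal plain-Java sketch of that guard, where the `Dto` class is a hypothetical stand-in and `IllegalArgumentException` stands in for `BadRequestAlertException` (which the resource class maps to HTTP 400):

```java
// Sketch of the update guard in updateRequiredResource above.
// Dto and IllegalArgumentException are stand-ins, not project classes.
public class UpdateGuardSketch {
    static class Dto {
        final Long id;
        Dto(Long id) { this.id = id; }
        Long getId() { return id; }
    }

    // Mirrors the guard: a missing id means the client is trying to
    // update an entity that was never created.
    static Dto update(Dto dto) {
        if (dto.getId() == null) {
            throw new IllegalArgumentException("Invalid id"); // HTTP 400 in the resource class
        }
        return dto; // stand-in for requiredResourceService.save(dto)
    }

    // Returns true when a null-id update is rejected.
    static boolean rejectsNullId() {
        try {
            update(new Dto(null));
            return false;
        } catch (IllegalArgumentException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(rejectsNullId() && update(new Dto(7L)).getId() == 7L ? "ok" : "fail"); // prints "ok"
    }
}
```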
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/driver/IDriverDAO.java
package br.ufpa.labes.spm.repository.interfaces.driver;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Driver;
public interface IDriverDAO extends IBaseDAO<Driver, String> {}
<file_sep>/tmp/service/mapper/ActivityEstimationMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ActivityEstimationDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ActivityEstimation} and its DTO {@link ActivityEstimationDTO}.
*/
@Mapper(componentModel = "spring", uses = {ActivityMapper.class})
public interface ActivityEstimationMapper extends EntityMapper<ActivityEstimationDTO, ActivityEstimation> {
@Mapping(source = "activity.id", target = "activityId")
ActivityEstimationDTO toDto(ActivityEstimation activityEstimation);
@Mapping(source = "activityId", target = "activity")
ActivityEstimation toEntity(ActivityEstimationDTO activityEstimationDTO);
default ActivityEstimation fromId(Long id) {
if (id == null) {
return null;
}
ActivityEstimation activityEstimation = new ActivityEstimation();
activityEstimation.setId(id);
return activityEstimation;
}
}
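The `fromId` default method above is the usual MapStruct trick for wiring associations from DTO ids: build an entity shell carrying only the id, which JPA can treat as a foreign-key reference without fetching the row. A minimal plain-Java sketch of the same logic, with a hypothetical `ActivityRef` class and no MapStruct dependency:

```java
// Sketch of the fromId reference pattern used by the mappers above.
// ActivityRef is a hypothetical stand-in for a JPA entity.
public class FromIdSketch {
    static class ActivityRef {
        private Long id;
        Long getId() { return id; }
        void setId(Long id) { this.id = id; }
    }

    // Mirrors the generated fromId(Long): null maps to null; otherwise an
    // "empty" entity carrying just the id is returned as a reference.
    static ActivityRef fromId(Long id) {
        if (id == null) {
            return null;
        }
        ActivityRef ref = new ActivityRef();
        ref.setId(id);
        return ref;
    }

    public static void main(String[] args) {
        System.out.println(fromId(null) == null && fromId(42L).getId() == 42L ? "ok" : "fail"); // prints "ok"
    }
}
```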
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/SubroutineMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.SubroutineDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Subroutine} and its DTO {@link SubroutineDTO}.
*/
@Mapper(componentModel = "spring", uses = {ArtifactTypeMapper.class})
public interface SubroutineMapper extends EntityMapper<SubroutineDTO, Subroutine> {
@Mapping(source = "theArtifactType.id", target = "theArtifactTypeId")
SubroutineDTO toDto(Subroutine subroutine);
@Mapping(source = "theArtifactTypeId", target = "theArtifactType")
@Mapping(target = "theAutomatic", ignore = true)
@Mapping(target = "theToolParameters", ignore = true)
@Mapping(target = "removeTheToolParameters", ignore = true)
Subroutine toEntity(SubroutineDTO subroutineDTO);
default Subroutine fromId(Long id) {
if (id == null) {
return null;
}
Subroutine subroutine = new Subroutine();
subroutine.setId(id);
return subroutine;
}
}
<file_sep>/tmp/service/impl/LoggingImpl.java
package br.ufpa.labes.spm.service.impl;
import java.time.LocalDate;
import java.util.Date;
import br.ufpa.labes.spm.domain.Activity;
import br.ufpa.labes.spm.domain.Decomposed;
import br.ufpa.labes.spm.domain.Plain;
import br.ufpa.labes.spm.domain.BranchCon;
import br.ufpa.labes.spm.domain.JoinCon;
import br.ufpa.labes.spm.domain.AgendaEvent;
import br.ufpa.labes.spm.domain.GlobalActivityEvent;
import br.ufpa.labes.spm.domain.SpmLog;
import br.ufpa.labes.spm.domain.ModelingActivityEvent;
import br.ufpa.labes.spm.domain.ProcessEvent;
import br.ufpa.labes.spm.domain.ProcessModelEvent;
import br.ufpa.labes.spm.domain.ResourceEvent;
import br.ufpa.labes.spm.domain.Normal;
import br.ufpa.labes.spm.domain.ProcessModel;
import br.ufpa.labes.spm.domain.Resource;
import br.ufpa.labes.spm.domain.Task;
import br.ufpa.labes.spm.domain.Process;
import br.ufpa.labes.spm.domain.EventType;
import org.qrconsult.spm.notifications.mail.EventNotification;
import br.ufpa.labes.spm.service.interfaces.Logging;
public class LoggingImpl implements Logging {
public LoggingImpl() {
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerModelingActivityEvent(org.qrconsult.spm.model.activities.Activity, java.lang.String, java.lang.String)
*/
@Override
public void registerModelingActivityEvent(Activity activity, String what, String why) {
    EventType type = new EventType();
    type.setIdent("Modeling Event" + Math.random() * 10 + System.currentTimeMillis());
    type.setDescription("This is a modeling event. Automatically generated.");
    ModelingActivityEvent event = new ModelingActivityEvent();
    event.setTheActivity(activity);
    event.setWhy(why);
    event.setWhen(LocalDate.now());
    event.setIsCreatedByApsee(Boolean.TRUE);
    event.getTheCatalogEvents().setDescription(what);
    event.setTheEventType(type);
    type.getTheEvents().add(event);
    SpmLog log = this.getTheProcess(activity.getTheProcessModel()).getTheLog();
    event.setTheLog(log);
    log.getTheEvents().add(event);
    activity.getTheModelingActivityEvents().add(event);
    EventNotification.getInstance().notifyEvent(event);
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerGlobalActivityEvent(org.qrconsult.spm.model.activities.Plain, java.lang.String, java.lang.String)
*/
@Override
public void registerGlobalActivityEvent(Plain activity, String what, String why) {
    EventType type = new EventType();
    type.setIdent("Enactment Event" + Math.random() * 10 + System.currentTimeMillis());
    type.setDescription("This is an enactment event. Automatically generated.");
    GlobalActivityEvent event = new GlobalActivityEvent();
    SpmLog log = this.getTheProcess(activity.getTheProcessModel()).getTheLog();
    event.setTheLog(log);
    log.getTheEvents().add(event);
    event.setThePlain(activity);
    event.setWhy(why);
    event.setWhen(LocalDate.now());
    event.setIsCreatedByApsee(Boolean.TRUE);
    event.getTheCatalogEvents().setDescription(what);
    event.setTheEventType(type);
    type.getTheEvents().add(event);
    activity.getTheGlobalActivityEvents().add(event);
    EventNotification.getInstance().notifyEvent(event);
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerProcessEvent(org.qrconsult.spm.model.processModels.Process, java.lang.String, java.lang.String)
*/
@Override
public void registerProcessEvent(Process process, String what, String why) {
    EventType type = new EventType();
    type.setIdent("Enactment Event" + Math.random() * 10 + System.currentTimeMillis());
    type.setDescription("This is an enactment event. Automatically generated.");
    ProcessEvent event = new ProcessEvent();
    event.setTheProcess(process);
    event.setTheLog(process.getTheLog());
    process.getTheLog().getTheEvents().add(event);
    event.setWhy(why);
    event.setWhen(LocalDate.now());
    event.setIsCreatedByApsee(Boolean.TRUE);
    event.getTheCatalogEvents().setDescription(what);
    event.setTheEventType(type);
    type.getTheEvents().add(event);
    process.getTheProcessEvents().add(event);
    EventNotification.getInstance().notifyEvent(event);
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerBranchEvent(org.qrconsult.spm.model.connections.BranchCon, java.lang.String)
*/
@Override
public void registerBranchEvent(BranchCon branchCon, String why){
/* EventType type = new EventType();
type.setIdent("Enactment Event" + Math.random()*10+ System.currentTimeMillis());
type.setDescription("This is an enactment event. Automatically generated.");
ConnectionEvent event = new ConnectionEvent();
SpmLog log = this.getTheProcess(branchCon.getTheProcessModel()).getTheLog();
event.setTheLog(log);
log.getTheEvents().add(event);
event.setWhy (why);
event.setWhen (LocalDate.now());
event.setIsCreatedByApsee (new Boolean (true));
event.getTheCatalogEvents().setDescription ("Fired");
event.setTheEventType(type);
type.getTheEvents().add(event);
branchCon.getTheProcessModel().getProcessModelEvent.add (event);
*/
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerJoinEvent(org.qrconsult.spm.model.connections.JoinCon, java.lang.String)
*/
@Override
public void registerJoinEvent(JoinCon joinCon, String why){
/* EventType type = new EventType();
type.setIdent("Enactment Event" + Math.random()*10+ System.currentTimeMillis());
type.setDescription("This is an enactment event. Automatically generated.");
ConnectionEvent event = new ConnectionEvent();
SpmLog log = this.getTheProcess(joinCon.getTheProcessModel()).getTheLog();
event.setTheLog(log);
log.getTheEvents().add(event);
event.setWhy (why);
event.setWhen (LocalDate.now());
event.setIsCreatedByApsee (new Boolean (true));
event.getTheCatalogEvents().setDescription ("Fired");
event.setTheEventType(type);
type.getTheEvents().add(event);
joinCon.theProcessModelEvent.add (event);
*/
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerProcessModelEvent(org.qrconsult.spm.model.processModels.ProcessModel, java.lang.String, java.lang.String)
*/
@Override
public void registerProcessModelEvent(ProcessModel model, String what, String why) {
    EventType type = new EventType();
    type.setIdent("Enactment Event" + Math.random() * 10 + System.currentTimeMillis());
    type.setDescription("This is an enactment event. Automatically generated.");
    ProcessModelEvent event = new ProcessModelEvent();
    event.setTheProcessModel(model);
    SpmLog log = this.getTheProcess(model).getTheLog();
    event.setTheLog(log);
    log.getTheEvents().add(event);
    event.setWhy(why);
    event.setWhen(LocalDate.now());
    event.setIsCreatedByApsee(Boolean.TRUE);
    event.getTheCatalogEvents().setDescription(what);
    event.setTheEventType(type);
    type.getTheEvents().add(event);
    model.getTheProcessModelEvents().add(event);
    EventNotification.getInstance().notifyEvent(event);
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerResourceEvent(org.qrconsult.spm.model.resources.Resource, org.qrconsult.spm.model.plainActivities.Normal, java.lang.String, java.lang.String)
*/
@Override
public void registerResourceEvent(Resource resource, Normal actNorm, String what, String why) {
    EventType type = new EventType();
    type.setIdent("Enactment Event" + Math.random() * 10 + System.currentTimeMillis());
    type.setDescription("This is an enactment event. Automatically generated.");
    ResourceEvent event = new ResourceEvent();
    event.setTheResource(resource);
    if (actNorm != null) {
        event.setTheNormal(actNorm);
        SpmLog log = this.getTheProcess(actNorm.getTheProcessModel()).getTheLog();
        event.setTheLog(log);
        log.getTheEvents().add(event);
    }
    event.setWhy(why);
    event.setWhen(LocalDate.now());
    event.setIsCreatedByApsee(Boolean.FALSE);
    event.getTheCatalogEvents().setDescription(what);
    event.setTheEventType(type);
    type.getTheEvents().add(event);
    resource.getTheResourceEvents().add(event);
    EventNotification.getInstance().notifyEvent(event);
}
/* (non-Javadoc)
* @see org.qrconsult.spm.services.impl.Logging#registerAgendaEvent(org.qrconsult.spm.model.taskagenda.Task, java.lang.String, java.lang.String)
*/
@Override
public void registerAgendaEvent(Task task, String what, String why) {
    EventType type = new EventType();
    type.setIdent("Enactment Event" + Math.random() * 10 + System.currentTimeMillis());
    type.setDescription("This is an enactment event. Automatically generated.");
    AgendaEvent agEvent = new AgendaEvent();
    agEvent.setIsCreatedByApsee(Boolean.FALSE);
    SpmLog log = this.getTheProcess(task.getTheNormal().getTheProcessModel()).getTheLog();
    agEvent.setTheLog(log);
    log.getTheEvents().add(agEvent);
    agEvent.setWhen(LocalDate.now());
    agEvent.setWhy(why);
    agEvent.getTheCatalogEvents().setDescription(what);
    agEvent.getTheCatalogEvents().setTheAgendaEvent(agEvent);
    agEvent.setTheTask(task);
    agEvent.setTheEventType(type);
    type.getTheEvents().add(agEvent);
    task.getTheAgendaEvents().add(agEvent);
    EventNotification.getInstance().notifyEvent(agEvent);
}
// Utility
private Process getTheProcess(ProcessModel pmodel) {
    Decomposed decAct = pmodel.getTheDecomposed();
    if (decAct == null) {
        return pmodel.getTheProcess();
    }
    return this.getTheProcess(decAct.getTheProcessModel());
}
}
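The private `getTheProcess` utility above climbs from a nested `ProcessModel` up to its owning `Process` by following `Decomposed` links until it reaches the root model. A self-contained sketch of that recursion, using hypothetical stand-ins for the domain entities:

```java
// Sketch of the getTheProcess recursion in LoggingImpl above.
// Process, Decomposed, and ProcessModel here are hypothetical stand-ins.
public class ProcessWalkSketch {
    static class Process {
        final String name;
        Process(String n) { name = n; }
    }

    static class Decomposed {
        ProcessModel parentModel; // the model this decomposed activity lives in
    }

    static class ProcessModel {
        Process theProcess;       // set only on the root model
        Decomposed theDecomposed; // set on nested models
    }

    // Walk upward: a model without a Decomposed parent is the root and
    // knows its Process directly; otherwise recurse on the parent model.
    static Process getTheProcess(ProcessModel model) {
        if (model.theDecomposed == null) {
            return model.theProcess;
        }
        return getTheProcess(model.theDecomposed.parentModel);
    }

    // Builds a two-level model and resolves the root process name.
    static String demo() {
        ProcessModel root = new ProcessModel();
        root.theProcess = new Process("p1");
        Decomposed dec = new Decomposed();
        dec.parentModel = root;
        ProcessModel nested = new ProcessModel();
        nested.theDecomposed = dec;
        return getTheProcess(nested).name;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints "p1"
    }
}
```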
<file_sep>/tmp/service/mapper/MessageMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.MessageDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Message} and its DTO {@link MessageDTO}.
*/
@Mapper(componentModel = "spring", uses = {AuthorMapper.class, AssetMapper.class})
public interface MessageMapper extends EntityMapper<MessageDTO, Message> {
@Mapping(source = "sender.id", target = "senderId")
@Mapping(source = "recipient.id", target = "recipientId")
@Mapping(source = "theAsset.id", target = "theAssetId")
MessageDTO toDto(Message message);
@Mapping(source = "senderId", target = "sender")
@Mapping(source = "recipientId", target = "recipient")
@Mapping(source = "theAssetId", target = "theAsset")
Message toEntity(MessageDTO messageDTO);
default Message fromId(Long id) {
if (id == null) {
return null;
}
Message message = new Message();
message.setId(id);
return message;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/FeedbackService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.Feedback;
import br.ufpa.labes.spm.repository.FeedbackRepository;
import br.ufpa.labes.spm.service.dto.FeedbackDTO;
import br.ufpa.labes.spm.service.mapper.FeedbackMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link Feedback}.
*/
@Service
@Transactional
public class FeedbackService {
private final Logger log = LoggerFactory.getLogger(FeedbackService.class);
private final FeedbackRepository feedbackRepository;
private final FeedbackMapper feedbackMapper;
public FeedbackService(FeedbackRepository feedbackRepository, FeedbackMapper feedbackMapper) {
this.feedbackRepository = feedbackRepository;
this.feedbackMapper = feedbackMapper;
}
/**
* Save a feedback.
*
* @param feedbackDTO the entity to save.
* @return the persisted entity.
*/
public FeedbackDTO save(FeedbackDTO feedbackDTO) {
log.debug("Request to save Feedback : {}", feedbackDTO);
Feedback feedback = feedbackMapper.toEntity(feedbackDTO);
feedback = feedbackRepository.save(feedback);
return feedbackMapper.toDto(feedback);
}
/**
* Get all the feedbacks.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<FeedbackDTO> findAll() {
log.debug("Request to get all Feedbacks");
return feedbackRepository.findAll().stream()
.map(feedbackMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one feedback by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<FeedbackDTO> findOne(Long id) {
log.debug("Request to get Feedback : {}", id);
return feedbackRepository.findById(id)
.map(feedbackMapper::toDto);
}
/**
* Delete the feedback by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete Feedback : {}", id);
feedbackRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plainActivities/IReqWorkGroupDAO.java
package br.ufpa.labes.spm.repository.interfaces.plainActivities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ReqWorkGroup;
public interface IReqWorkGroupDAO extends IBaseDAO<ReqWorkGroup, Integer> {
public ReqWorkGroup findReqWorkGroupFromProcessModel(
    String groupIdent, String workGroupTypeIdent, String normalIdent);
}
<file_sep>/src/main/java/br/ufpa/labes/spm/domain/enumeration/EmailSecurityLevels.java
package br.ufpa.labes.spm.domain.enumeration;
/** The EmailSecurityLevels enumeration. */
public enum EmailSecurityLevels {
WITHOUT_SECURITY,
TLS_IF_AVAILABLE,
TLS,
SSL
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/resources/ConsumableDAO.java
package br.ufpa.labes.spm.repository.impl.resources;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.resources.IConsumableDAO;
import br.ufpa.labes.spm.domain.Consumable;
public class ConsumableDAO extends BaseDAO<Consumable, String> implements IConsumableDAO {
protected ConsumableDAO(Class<Consumable> businessClass) {
super(businessClass);
}
public ConsumableDAO() {
super(Consumable.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plainActivities/IAutomaticDAO.java
package br.ufpa.labes.spm.repository.interfaces.plainActivities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Automatic;
public interface IAutomaticDAO extends IBaseDAO<Automatic, String> {}
<file_sep>/tmp/service/mapper/CalendarDayMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.CalendarDayDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link CalendarDay} and its DTO {@link CalendarDayDTO}.
*/
@Mapper(componentModel = "spring", uses = {CalendarMapper.class})
public interface CalendarDayMapper extends EntityMapper<CalendarDayDTO, CalendarDay> {
@Mapping(source = "theCalendar.id", target = "theCalendarId")
CalendarDayDTO toDto(CalendarDay calendarDay);
@Mapping(source = "theCalendarId", target = "theCalendar")
CalendarDay toEntity(CalendarDayDTO calendarDayDTO);
default CalendarDay fromId(Long id) {
if (id == null) {
return null;
}
CalendarDay calendarDay = new CalendarDay();
calendarDay.setId(id);
return calendarDay;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/AuthorStatService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.AuthorStat;
import br.ufpa.labes.spm.repository.AuthorStatRepository;
import br.ufpa.labes.spm.service.dto.AuthorStatDTO;
import br.ufpa.labes.spm.service.mapper.AuthorStatMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link AuthorStat}.
*/
@Service
@Transactional
public class AuthorStatService {
private final Logger log = LoggerFactory.getLogger(AuthorStatService.class);
private final AuthorStatRepository authorStatRepository;
private final AuthorStatMapper authorStatMapper;
public AuthorStatService(AuthorStatRepository authorStatRepository, AuthorStatMapper authorStatMapper) {
this.authorStatRepository = authorStatRepository;
this.authorStatMapper = authorStatMapper;
}
/**
* Save an authorStat.
*
* @param authorStatDTO the entity to save.
* @return the persisted entity.
*/
public AuthorStatDTO save(AuthorStatDTO authorStatDTO) {
log.debug("Request to save AuthorStat : {}", authorStatDTO);
AuthorStat authorStat = authorStatMapper.toEntity(authorStatDTO);
authorStat = authorStatRepository.save(authorStat);
return authorStatMapper.toDto(authorStat);
}
/**
* Get all the authorStats.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AuthorStatDTO> findAll() {
log.debug("Request to get all AuthorStats");
return authorStatRepository.findAll().stream()
.map(authorStatMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one authorStat by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<AuthorStatDTO> findOne(Long id) {
log.debug("Request to get AuthorStat : {}", id);
return authorStatRepository.findById(id)
.map(authorStatMapper::toDto);
}
/**
* Delete the authorStat by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete AuthorStat : {}", id);
authorStatRepository.deleteById(id);
}
}
<file_sep>/tmp/service/impl/AgentServiceImpl.java
package br.ufpa.labes.spm.service.impl;
import br.ufpa.labes.spm.service.AgentService;
import br.ufpa.labes.spm.domain.Agent;
import br.ufpa.labes.spm.repository.AgentRepository;
import br.ufpa.labes.spm.service.dto.AgentDTO;
import br.ufpa.labes.spm.service.mapper.AgentMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;
/**
* Service Implementation for managing {@link Agent}.
*/
@Service
@Transactional
public class AgentServiceImpl implements AgentService {
private final Logger log = LoggerFactory.getLogger(AgentServiceImpl.class);
private final AgentRepository agentRepository;
private final AgentMapper agentMapper;
public AgentServiceImpl(AgentRepository agentRepository, AgentMapper agentMapper) {
this.agentRepository = agentRepository;
this.agentMapper = agentMapper;
}
/**
* Save an agent.
*
* @param agentDTO the entity to save.
* @return the persisted entity.
*/
@Override
public AgentDTO save(AgentDTO agentDTO) {
log.debug("Request to save Agent : {}", agentDTO);
Agent agent = agentMapper.toEntity(agentDTO);
agent = agentRepository.save(agent);
return agentMapper.toDto(agent);
}
/**
* Get all the agents.
*
* @return the list of entities.
*/
@Override
@Transactional(readOnly = true)
public List<AgentDTO> findAll() {
log.debug("Request to get all Agents");
return agentRepository.findAllWithEagerRelationships().stream()
.map(agentMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get all the agents with eager load of many-to-many relationships.
*
* @return the list of entities.
*/
public Page<AgentDTO> findAllWithEagerRelationships(Pageable pageable) {
return agentRepository.findAllWithEagerRelationships(pageable).map(agentMapper::toDto);
}
/**
* Get all the agents where TheChatMessage is {@code null}.
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AgentDTO> findAllWhereTheChatMessageIsNull() {
log.debug("Request to get all agents where TheChatMessage is null");
return StreamSupport
.stream(agentRepository.findAll().spliterator(), false)
.filter(agent -> agent.getTheChatMessage() == null)
.map(agentMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one agent by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Override
@Transactional(readOnly = true)
public Optional<AgentDTO> findOne(Long id) {
log.debug("Request to get Agent : {}", id);
return agentRepository.findOneWithEagerRelationships(id)
.map(agentMapper::toDto);
}
/**
* Delete the agent by id.
*
* @param id the id of the entity.
*/
@Override
public void delete(Long id) {
log.debug("Request to delete Agent : {}", id);
agentRepository.deleteById(id);
}
}
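`findAllWhereTheChatMessageIsNull` above streams the whole table and keeps only the agents whose one-to-one counterpart is unset. A minimal sketch of the same filter over plain collections (the `Agent` class here is a hypothetical stand-in, not the project entity):

```java
// Sketch of the stream-based "find all where association is null" pattern
// used in AgentServiceImpl above, with a stand-in Agent class.
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;

public class NullAssociationFilterSketch {
    static class Agent {
        final String name;
        final Object chatMessage; // stand-in for the one-to-one association
        Agent(String name, Object chatMessage) {
            this.name = name;
            this.chatMessage = chatMessage;
        }
        Object getTheChatMessage() { return chatMessage; }
    }

    // Mirrors findAllWhereTheChatMessageIsNull: iterate everything, keep
    // only the entities whose counterpart has not been set.
    static List<String> namesWithoutChat(Iterable<Agent> agents) {
        return StreamSupport.stream(agents.spliterator(), false)
            .filter(a -> a.getTheChatMessage() == null)
            .map(a -> a.name)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Agent> all = List.of(new Agent("a", null), new Agent("b", "hi"));
        System.out.println(namesWithoutChat(all)); // prints "[a]"
    }
}
```

Note this loads every row before filtering; for large tables a repository query method would be the better choice, but the generated code trades that for simplicity.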
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/activities/PlainDAO.java
package br.ufpa.labes.spm.repository.impl.activities;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.activities.IPlainDAO;
import br.ufpa.labes.spm.domain.Plain;
public class PlainDAO extends BaseDAO<Plain, String> implements IPlainDAO {
protected PlainDAO(Class<Plain> businessClass) {
super(businessClass);
}
public PlainDAO() {
super(Plain.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/BranchConCondToMultipleConMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.BranchConCondToMultipleConDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link BranchConCondToMultipleCon} and its DTO {@link BranchConCondToMultipleConDTO}.
*/
@Mapper(componentModel = "spring", uses = {MultipleConMapper.class, BranchConCondMapper.class})
public interface BranchConCondToMultipleConMapper extends EntityMapper<BranchConCondToMultipleConDTO, BranchConCondToMultipleCon> {
@Mapping(source = "theMultipleCon.id", target = "theMultipleConId")
@Mapping(source = "theBranchConCond.id", target = "theBranchConCondId")
BranchConCondToMultipleConDTO toDto(BranchConCondToMultipleCon branchConCondToMultipleCon);
@Mapping(source = "theMultipleConId", target = "theMultipleCon")
@Mapping(source = "theBranchConCondId", target = "theBranchConCond")
BranchConCondToMultipleCon toEntity(BranchConCondToMultipleConDTO branchConCondToMultipleConDTO);
default BranchConCondToMultipleCon fromId(Long id) {
if (id == null) {
return null;
}
BranchConCondToMultipleCon branchConCondToMultipleCon = new BranchConCondToMultipleCon();
branchConCondToMultipleCon.setId(id);
return branchConCondToMultipleCon;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processModels/ProcessModelDAO.java
package br.ufpa.labes.spm.repository.impl.processModels;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processModels.IProcessModelDAO;
import br.ufpa.labes.spm.domain.ProcessModel;
public class ProcessModelDAO extends BaseDAO<ProcessModel, Integer> implements IProcessModelDAO {
protected ProcessModelDAO(Class<ProcessModel> businessClass) {
super(businessClass);
}
public ProcessModelDAO() {
super(ProcessModel.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/types/ArtifactTypeDAO.java
package br.ufpa.labes.spm.repository.impl.types;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IArtifactTypeDAO;
import br.ufpa.labes.spm.domain.ArtifactType;
public class ArtifactTypeDAO extends BaseDAO<ArtifactType, String> implements IArtifactTypeDAO {
protected ArtifactTypeDAO(Class<ArtifactType> businessClass) {
super(businessClass);
}
public ArtifactTypeDAO() {
super(ArtifactType.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/domain/enumeration/ProcessModelStatus.java
package br.ufpa.labes.spm.domain.enumeration;
/** The ProcessModelStatus enumeration. */
public enum ProcessModelStatus {
REQUIREMENTS,
ABSTRACT,
INSTANTIATED,
ENACTING,
FAILED,
CANCELED,
MIXED,
FINISHED
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/types/IEventTypeDAO.java
package br.ufpa.labes.spm.repository.interfaces.types;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.EventType;
public interface IEventTypeDAO extends IBaseDAO<EventType, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/log/IGlobalActivityEventDAO.java
package br.ufpa.labes.spm.repository.interfaces.log;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.GlobalActivityEvent;
public interface IGlobalActivityEventDAO extends IBaseDAO<GlobalActivityEvent, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/ISequenceDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Sequence;
public interface ISequenceDAO extends IBaseDAO<Sequence, String> {}
<file_sep>/tmp/service/WorkGroupTypeService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.WorkGroupType;
import br.ufpa.labes.spm.repository.WorkGroupTypeRepository;
import br.ufpa.labes.spm.service.dto.WorkGroupTypeDTO;
import br.ufpa.labes.spm.service.mapper.WorkGroupTypeMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link WorkGroupType}.
*/
@Service
@Transactional
public class WorkGroupTypeService {
private final Logger log = LoggerFactory.getLogger(WorkGroupTypeService.class);
private final WorkGroupTypeRepository workGroupTypeRepository;
private final WorkGroupTypeMapper workGroupTypeMapper;
public WorkGroupTypeService(WorkGroupTypeRepository workGroupTypeRepository, WorkGroupTypeMapper workGroupTypeMapper) {
this.workGroupTypeRepository = workGroupTypeRepository;
this.workGroupTypeMapper = workGroupTypeMapper;
}
/**
* Save a workGroupType.
*
* @param workGroupTypeDTO the entity to save.
* @return the persisted entity.
*/
public WorkGroupTypeDTO save(WorkGroupTypeDTO workGroupTypeDTO) {
log.debug("Request to save WorkGroupType : {}", workGroupTypeDTO);
WorkGroupType workGroupType = workGroupTypeMapper.toEntity(workGroupTypeDTO);
workGroupType = workGroupTypeRepository.save(workGroupType);
return workGroupTypeMapper.toDto(workGroupType);
}
/**
* Get all the workGroupTypes.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<WorkGroupTypeDTO> findAll() {
log.debug("Request to get all WorkGroupTypes");
return workGroupTypeRepository.findAll().stream()
.map(workGroupTypeMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one workGroupType by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<WorkGroupTypeDTO> findOne(Long id) {
log.debug("Request to get WorkGroupType : {}", id);
return workGroupTypeRepository.findById(id)
.map(workGroupTypeMapper::toDto);
}
/**
* Delete the workGroupType by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete WorkGroupType : {}", id);
workGroupTypeRepository.deleteById(id);
}
}
<file_sep>/tmp/service/dto/MetricDTO.java
package br.ufpa.labes.spm.service.dto;
import java.time.LocalDate;
import java.io.Serializable;
import java.util.Objects;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Metric} entity.
*/
public class MetricDTO implements Serializable {
private Long id;
private Float value;
private String unit;
private LocalDate periodBegin;
private LocalDate periodEnd;
private Long metricDefinitionId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public Float getValue() {
return value;
}
public void setValue(Float value) {
this.value = value;
}
public String getUnit() {
return unit;
}
public void setUnit(String unit) {
this.unit = unit;
}
public LocalDate getPeriodBegin() {
return periodBegin;
}
public void setPeriodBegin(LocalDate periodBegin) {
this.periodBegin = periodBegin;
}
public LocalDate getPeriodEnd() {
return periodEnd;
}
public void setPeriodEnd(LocalDate periodEnd) {
this.periodEnd = periodEnd;
}
public Long getMetricDefinitionId() {
return metricDefinitionId;
}
public void setMetricDefinitionId(Long metricDefinitionId) {
this.metricDefinitionId = metricDefinitionId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
MetricDTO metricDTO = (MetricDTO) o;
if (metricDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), metricDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "MetricDTO{" +
"id=" + getId() +
", value=" + getValue() +
", unit='" + getUnit() + "'" +
", periodBegin='" + getPeriodBegin() + "'" +
", periodEnd='" + getPeriodEnd() + "'" +
", metricDefinition=" + getMetricDefinitionId() +
"}";
}
}
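`MetricDTO` (like the other DTOs in this package) implements identity-based equality: two instances are equal only when both carry the same non-null id, so unsaved objects never compare equal. A self-contained sketch of that contract, with a hypothetical `Dto` class:

```java
// Sketch of the id-based equals/hashCode contract used by MetricDTO above.
// Dto is a hypothetical stand-in for the generated DTO classes.
import java.util.Objects;

public class IdEqualitySketch {
    static class Dto {
        private final Long id;
        Dto(Long id) { this.id = id; }
        Long getId() { return id; }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (o == null || getClass() != o.getClass()) return false;
            Dto other = (Dto) o;
            // An entity without an id is never equal to another instance.
            if (other.getId() == null || getId() == null) return false;
            return Objects.equals(getId(), other.getId());
        }

        @Override
        public int hashCode() { return Objects.hashCode(getId()); }
    }

    public static void main(String[] args) {
        boolean ok = new Dto(1L).equals(new Dto(1L))
            && !new Dto(1L).equals(new Dto(2L))
            && !new Dto(null).equals(new Dto(null));
        System.out.println(ok ? "ok" : "fail"); // prints "ok"
    }
}
```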
<file_sep>/tmp/service/mapper/WorkGroupMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.WorkGroupDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link WorkGroup} and its DTO {@link WorkGroupDTO}.
*/
@Mapper(componentModel = "spring", uses = {WorkGroupTypeMapper.class})
public interface WorkGroupMapper extends EntityMapper<WorkGroupDTO, WorkGroup> {
@Mapping(source = "theGroupType.id", target = "theGroupTypeId")
@Mapping(source = "superGroup.id", target = "superGroupId")
WorkGroupDTO toDto(WorkGroup workGroup);
@Mapping(target = "theReqGroups", ignore = true)
@Mapping(target = "removeTheReqGroup", ignore = true)
@Mapping(target = "theWorkGroupMetrics", ignore = true)
@Mapping(target = "removeTheWorkGroupMetric", ignore = true)
@Mapping(target = "theWorkGroupEstimations", ignore = true)
@Mapping(target = "removeTheWorkGroupEstimation", ignore = true)
@Mapping(source = "theGroupTypeId", target = "theGroupType")
@Mapping(source = "superGroupId", target = "superGroup")
@Mapping(target = "subGroups", ignore = true)
@Mapping(target = "removeSubGroups", ignore = true)
@Mapping(target = "theWorkGroupInstSugs", ignore = true)
@Mapping(target = "removeTheWorkGroupInstSug", ignore = true)
@Mapping(target = "theAgents", ignore = true)
@Mapping(target = "removeTheAgent", ignore = true)
@Mapping(target = "theSuggestedGroups", ignore = true)
@Mapping(target = "removeTheSuggestedGroups", ignore = true)
WorkGroup toEntity(WorkGroupDTO workGroupDTO);
default WorkGroup fromId(Long id) {
if (id == null) {
return null;
}
WorkGroup workGroup = new WorkGroup();
workGroup.setId(id);
return workGroup;
}
}
<file_sep>/tmp/service/impl/HelpServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.TypedQuery;
import org.qrconsult.spm.converter.core.Converter;
import org.qrconsult.spm.converter.core.ConverterImpl;
import br.ufpa.labes.spm.exceptions.ImplementationException;
import br.ufpa.labes.spm.repository.interfaces.IHelpTopicDAO;
import br.ufpa.labes.spm.service.dto.HelpTopicDTO;
import br.ufpa.labes.spm.domain.HelpTopic;
import br.ufpa.labes.spm.service.interfaces.HelpServices;
import br.ufpa.labes.spm.util.ident.ConversorDeIdent;
import br.ufpa.labes.spm.util.ident.SemCaracteresEspeciais;
import br.ufpa.labes.spm.util.ident.TrocaEspacoPorPonto;
public class HelpServicesImpl implements HelpServices {
IHelpTopicDAO helpTopicDAO;
Converter converter = new ConverterImpl();
@Override
public List<HelpTopicDTO> getHelpTopics() {
List<HelpTopic> helpTopics = helpTopicDAO.getHelpTopics();
List<HelpTopicDTO> result = new ArrayList<HelpTopicDTO>();
try {
for (HelpTopic helpTopic : helpTopics) {
HelpTopicDTO helpTopicDTO = (HelpTopicDTO) converter.getDTO(helpTopic, HelpTopicDTO.class);
if (!helpTopic.getSessions().isEmpty()){
appendToDTO(helpTopicDTO, helpTopic.getSessions());
}
result.add(helpTopicDTO);
}
System.out.println("before returning");
return result;
} catch (ImplementationException e) {
e.printStackTrace();
return null;
}
}
@Override
public HelpTopicDTO getHelpTopicByToken(String token) {
TypedQuery<HelpTopic> query;
String hql = "SELECT h FROM " + HelpTopic.class.getSimpleName() + " h WHERE h.tokenRelated = :token";
query = helpTopicDAO.getPersistenceContext().createQuery(hql, HelpTopic.class);
query.setParameter("token", token);
List<HelpTopic> result = query.getResultList();
if(!result.isEmpty()) {
HelpTopic topic = result.get(0);
HelpTopicDTO helpTopicDTO = HelpTopicDTO.nullHelpTopicDTO();
try {
helpTopicDTO = (HelpTopicDTO) converter.getDTO(topic, HelpTopicDTO.class);
} catch (ImplementationException e) {
e.printStackTrace();
}
return helpTopicDTO;
}
return HelpTopicDTO.nullHelpTopicDTO();
}
public void appendToDTO(HelpTopicDTO rootDTO, List<HelpTopic> list){
for (int i = 0; i < list.size(); i++) {
HelpTopicDTO helpTopicDTO;
try {
helpTopicDTO = (HelpTopicDTO) converter.getDTO(list.get(i), HelpTopicDTO.class);
rootDTO.getSessions().add(helpTopicDTO);
if (!list.get(i).getSessions().isEmpty()){
appendToDTO(helpTopicDTO, list.get(i).getSessions());
}
} catch (ImplementationException e) {
e.printStackTrace();
}
}
}
@Override
public HelpTopicDTO saveTopic(HelpTopicDTO helpTopicDTO) {
if (helpTopicDTO.getIdent() == null || "".equals(helpTopicDTO.getIdent())) { // When creating a new help topic
if (helpTopicDTO.getFather() == null){
HelpTopic helpTopic;
try {
long n = helpTopicDAO.countTopics();
n++;
helpTopicDTO.setIdent(String.valueOf(n));
helpTopic = (HelpTopic) converter.getEntity(helpTopicDTO, HelpTopic.class);
helpTopicDAO.daoSave(helpTopic);
String newIdent = ConversorDeIdent.de(helpTopic.getId()+ " " + helpTopic.getTitle()).para(new SemCaracteresEspeciais())
.para(new TrocaEspacoPorPonto())
.ident();
System.out.println("-> " + newIdent);
helpTopic.setIdent(newIdent);
helpTopicDTO.setIdent(newIdent);
helpTopicDAO.update(helpTopic);
return helpTopicDTO;
} catch (ImplementationException e) {
e.printStackTrace();
return null;
}
} else {
HelpTopic father = helpTopicDAO.retrieveBySecondaryKey(helpTopicDTO.getFather().getIdent());
// long n = father.getSessions().size()+1;
//
// helpTopicDTO.setIdent(father.getIdent() + "." + n);
try {
HelpTopic helpTopic = (HelpTopic) converter.getEntity(helpTopicDTO, HelpTopic.class);
helpTopic.setFather(father);
father.getSessions().add(helpTopic);
helpTopicDAO.daoSave(father);
return helpTopicDTO;
} catch (ImplementationException e) {
e.printStackTrace();
return null;
}
}
} else { // When editing an existing help topic
HelpTopic helpTopic = helpTopicDAO.retrieveBySecondaryKey(helpTopicDTO.getIdent());
helpTopic.setTitle(helpTopicDTO.getTitle());
helpTopic.setContent(helpTopicDTO.getContent());
helpTopic.setTokenRelated(helpTopicDTO.getTokenRelated());
helpTopic = helpTopicDAO.daoSave(helpTopic);
try {
return (HelpTopicDTO) converter.getDTO(helpTopic, HelpTopicDTO.class);
} catch (ImplementationException e) {
e.printStackTrace();
return null;
}
}
}
@Override
public void removeTopic(HelpTopicDTO helpTopicDTO) {
System.out.println(helpTopicDTO);
HelpTopic helpTopic = helpTopicDAO.retrieveBySecondaryKey(helpTopicDTO.getIdent());
System.out.println(helpTopic);
helpTopicDAO.getPersistenceContext().remove(helpTopic);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/StructureResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.StructureService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.StructureDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.Structure}.
*/
@RestController
@RequestMapping("/api")
public class StructureResource {
private final Logger log = LoggerFactory.getLogger(StructureResource.class);
private static final String ENTITY_NAME = "structure";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final StructureService structureService;
public StructureResource(StructureService structureService) {
this.structureService = structureService;
}
/**
* {@code POST /structures} : Create a new structure.
*
* @param structureDTO the structureDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new structureDTO, or with status {@code 400 (Bad Request)} if the structure has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/structures")
public ResponseEntity<StructureDTO> createStructure(@RequestBody StructureDTO structureDTO) throws URISyntaxException {
log.debug("REST request to save Structure : {}", structureDTO);
if (structureDTO.getId() != null) {
throw new BadRequestAlertException("A new structure cannot already have an ID", ENTITY_NAME, "idexists");
}
StructureDTO result = structureService.save(structureDTO);
return ResponseEntity.created(new URI("/api/structures/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /structures} : Updates an existing structure.
*
* @param structureDTO the structureDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated structureDTO,
* or with status {@code 400 (Bad Request)} if the structureDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the structureDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/structures")
public ResponseEntity<StructureDTO> updateStructure(@RequestBody StructureDTO structureDTO) throws URISyntaxException {
log.debug("REST request to update Structure : {}", structureDTO);
if (structureDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
StructureDTO result = structureService.save(structureDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, structureDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /structures} : get all the structures.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of structures in body.
*/
@GetMapping("/structures")
public List<StructureDTO> getAllStructures() {
log.debug("REST request to get all Structures");
return structureService.findAll();
}
/**
* {@code GET /structures/:id} : get the "id" structure.
*
* @param id the id of the structureDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the structureDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/structures/{id}")
public ResponseEntity<StructureDTO> getStructure(@PathVariable Long id) {
log.debug("REST request to get Structure : {}", id);
Optional<StructureDTO> structureDTO = structureService.findOne(id);
return ResponseUtil.wrapOrNotFound(structureDTO);
}
/**
* {@code DELETE /structures/:id} : delete the "id" structure.
*
* @param id the id of the structureDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/structures/{id}")
public ResponseEntity<Void> deleteStructure(@PathVariable Long id) {
log.debug("REST request to delete Structure : {}", id);
structureService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/TagStatResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.TagStatsService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.TagStatsDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/** REST controller for managing {@link br.ufpa.labes.spm.domain.TagStats}. */
@RestController
@RequestMapping("/api")
public class TagStatResource {
private final Logger log = LoggerFactory.getLogger(TagStatResource.class);
private static final String ENTITY_NAME = "tagStat";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final TagStatsService tagStatsService;
public TagStatResource(TagStatsService tagStatsService) {
this.tagStatsService = tagStatsService;
}
/**
* {@code POST /tag-stats} : Create a new tagStat.
*
* @param tagStatsDTO the tagStatsDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new
* tagStatsDTO, or with status {@code 400 (Bad Request)} if the tagStat has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/tag-stats")
public ResponseEntity<TagStatsDTO> createTagStat(@RequestBody TagStatsDTO tagStatsDTO)
throws URISyntaxException {
log.debug("REST request to save TagStats : {}", tagStatsDTO);
if (tagStatsDTO.getId() != null) {
throw new BadRequestAlertException(
"A new tagStat cannot already have an ID", ENTITY_NAME, "idexists");
}
TagStatsDTO result = tagStatsService.save(tagStatsDTO);
return ResponseEntity.created(new URI("/api/tag-stats/" + result.getId()))
.headers(
HeaderUtil.createEntityCreationAlert(
applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /tag-stats} : Updates an existing tagStat.
*
* @param tagStatsDTO the tagStatsDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated
* tagStatsDTO, or with status {@code 400 (Bad Request)} if the tagStatsDTO is not valid, or
* with status {@code 500 (Internal Server Error)} if the tagStatsDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/tag-stats")
public ResponseEntity<TagStatsDTO> updateTagStat(@RequestBody TagStatsDTO tagStatsDTO)
throws URISyntaxException {
log.debug("REST request to update TagStats : {}", tagStatsDTO);
if (tagStatsDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
TagStatsDTO result = tagStatsService.save(tagStatsDTO);
return ResponseEntity.ok()
.headers(
HeaderUtil.createEntityUpdateAlert(
applicationName, true, ENTITY_NAME, tagStatsDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /tag-stats} : get all the tagStats.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of tagStats in
* body.
*/
@GetMapping("/tag-stats")
public List<TagStatsDTO> getAllTagStats() {
log.debug("REST request to get all TagStats");
return tagStatsService.findAll();
}
/**
* {@code GET /tag-stats/:id} : get the "id" tagStat.
*
* @param id the id of the tagStatsDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the tagStatsDTO,
* or with status {@code 404 (Not Found)}.
*/
@GetMapping("/tag-stats/{id}")
public ResponseEntity<TagStatsDTO> getTagStat(@PathVariable Long id) {
log.debug("REST request to get TagStats : {}", id);
Optional<TagStatsDTO> tagStatsDTO = tagStatsService.findOne(id);
return ResponseUtil.wrapOrNotFound(tagStatsDTO);
}
/**
* {@code DELETE /tag-stats/:id} : delete the "id" tagStat.
*
* @param id the id of the tagStatsDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/tag-stats/{id}")
public ResponseEntity<Void> deleteTagStat(@PathVariable Long id) {
log.debug("REST request to delete TagStats : {}", id);
tagStatsService.delete(id);
return ResponseEntity.noContent()
.headers(
HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString()))
.build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/GlobalActivityEventService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.GlobalActivityEvent;
import br.ufpa.labes.spm.repository.GlobalActivityEventRepository;
import br.ufpa.labes.spm.service.dto.GlobalActivityEventDTO;
import br.ufpa.labes.spm.service.mapper.GlobalActivityEventMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link GlobalActivityEvent}.
*/
@Service
@Transactional
public class GlobalActivityEventService {
private final Logger log = LoggerFactory.getLogger(GlobalActivityEventService.class);
private final GlobalActivityEventRepository globalActivityEventRepository;
private final GlobalActivityEventMapper globalActivityEventMapper;
public GlobalActivityEventService(GlobalActivityEventRepository globalActivityEventRepository, GlobalActivityEventMapper globalActivityEventMapper) {
this.globalActivityEventRepository = globalActivityEventRepository;
this.globalActivityEventMapper = globalActivityEventMapper;
}
/**
* Save a globalActivityEvent.
*
* @param globalActivityEventDTO the entity to save.
* @return the persisted entity.
*/
public GlobalActivityEventDTO save(GlobalActivityEventDTO globalActivityEventDTO) {
log.debug("Request to save GlobalActivityEvent : {}", globalActivityEventDTO);
GlobalActivityEvent globalActivityEvent = globalActivityEventMapper.toEntity(globalActivityEventDTO);
globalActivityEvent = globalActivityEventRepository.save(globalActivityEvent);
return globalActivityEventMapper.toDto(globalActivityEvent);
}
/**
* Get all the globalActivityEvents.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<GlobalActivityEventDTO> findAll() {
log.debug("Request to get all GlobalActivityEvents");
return globalActivityEventRepository.findAll().stream()
.map(globalActivityEventMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one globalActivityEvent by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<GlobalActivityEventDTO> findOne(Long id) {
log.debug("Request to get GlobalActivityEvent : {}", id);
return globalActivityEventRepository.findById(id)
.map(globalActivityEventMapper::toDto);
}
/**
* Delete the globalActivityEvent by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete GlobalActivityEvent : {}", id);
globalActivityEventRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/interfaces/NotificationServices.java
package br.ufpa.labes.spm.service.interfaces;
import java.rmi.RemoteException;
import java.util.Collection;
import br.ufpa.labes.spm.exceptions.DAOException;
import br.ufpa.labes.spm.exceptions.ObjectLockedException;
import br.ufpa.labes.spm.exceptions.UserDeniedException;
import br.ufpa.labes.spm.exceptions.UserInvalidException;
import br.ufpa.labes.spm.exceptions.UserNotManagerException;
public interface NotificationServices {
public abstract boolean isAgent(String name, String password) throws RemoteException;
public abstract boolean isAgentInGroup(String agent_id, String group_id)
throws DAOException, RemoteException;
public abstract boolean isActivityInTasks(String id_activity, String agent_id, String process_id)
throws DAOException, RemoteException;
public abstract boolean isOutOfWorkPeriod(String agent_id) throws DAOException, RemoteException;
public abstract boolean isManagerInProcess(String agent_id, String process_id)
throws DAOException, RemoteException;
public abstract Collection<String> getUsersInSession() throws DAOException, RemoteException;
public abstract boolean isSubordTo(String agent_subord_id, String agent_chef_id)
throws RemoteException;
public abstract boolean login(String name, String password) throws RemoteException;
public abstract boolean hasLogedIn(String name, String password) throws RemoteException;
public abstract boolean logoff(String name, String clientId) throws RemoteException;
public abstract float lockObject(
String userName, Class classe, Integer obj_oid, int ttl, int ttlbase)
throws UserDeniedException, UserNotManagerException, UserInvalidException, DAOException,
ObjectLockedException, RemoteException;
public abstract boolean unlockObject(
String UserName, Class obj_class, Integer object_oid, float key)
throws UserDeniedException, UserNotManagerException, UserInvalidException, DAOException,
ObjectLockedException, RemoteException;
public abstract boolean isLocked(Long oid, Class classe)
throws UserDeniedException, UserNotManagerException, UserInvalidException, DAOException,
ObjectLockedException, RemoteException;
public abstract String isLockedTo(Long oid, Class classe)
throws UserDeniedException, UserNotManagerException, UserInvalidException, DAOException,
ObjectLockedException, RemoteException;
public abstract String isLockedTo_with_key(Long oid, Class classe, float key)
throws UserDeniedException, UserNotManagerException, UserInvalidException, DAOException,
ObjectLockedException, RemoteException;
public abstract boolean isAgentInProcess(String agent_id, String process_id)
throws UserDeniedException, UserNotManagerException, UserInvalidException, DAOException,
RemoteException;
public abstract void sendMessage(String msg) throws RemoteException;
public abstract void sendMessageToGroup(String msg, Collection<String> names)
throws RemoteException;
public abstract void sendMessageToUser(String msg, String userName) throws RemoteException;
public abstract void registerCallbackServiceListener(
final String rmihost, final String rmiport, final String rmiservicename)
throws RemoteException;
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/TaskAgendaResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.TaskAgendaService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.TaskAgendaDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.TaskAgenda}.
*/
@RestController
@RequestMapping("/api")
public class TaskAgendaResource {
private final Logger log = LoggerFactory.getLogger(TaskAgendaResource.class);
private static final String ENTITY_NAME = "taskAgenda";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final TaskAgendaService taskAgendaService;
public TaskAgendaResource(TaskAgendaService taskAgendaService) {
this.taskAgendaService = taskAgendaService;
}
/**
* {@code POST /task-agenda} : Create a new taskAgenda.
*
* @param taskAgendaDTO the taskAgendaDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new taskAgendaDTO, or with status {@code 400 (Bad Request)} if the taskAgenda has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/task-agenda")
public ResponseEntity<TaskAgendaDTO> createTaskAgenda(@RequestBody TaskAgendaDTO taskAgendaDTO) throws URISyntaxException {
log.debug("REST request to save TaskAgenda : {}", taskAgendaDTO);
if (taskAgendaDTO.getId() != null) {
throw new BadRequestAlertException("A new taskAgenda cannot already have an ID", ENTITY_NAME, "idexists");
}
TaskAgendaDTO result = taskAgendaService.save(taskAgendaDTO);
return ResponseEntity.created(new URI("/api/task-agenda/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /task-agenda} : Updates an existing taskAgenda.
*
* @param taskAgendaDTO the taskAgendaDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated taskAgendaDTO,
* or with status {@code 400 (Bad Request)} if the taskAgendaDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the taskAgendaDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/task-agenda")
public ResponseEntity<TaskAgendaDTO> updateTaskAgenda(@RequestBody TaskAgendaDTO taskAgendaDTO) throws URISyntaxException {
log.debug("REST request to update TaskAgenda : {}", taskAgendaDTO);
if (taskAgendaDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
TaskAgendaDTO result = taskAgendaService.save(taskAgendaDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, taskAgendaDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /task-agenda} : get all the taskAgenda.
*
* @param filter the filter of the request.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of taskAgenda in body.
*/
@GetMapping("/task-agenda")
public List<TaskAgendaDTO> getAllTaskAgenda(@RequestParam(required = false) String filter) {
if ("theagent-is-null".equals(filter)) {
log.debug("REST request to get all TaskAgendas where theAgent is null");
return taskAgendaService.findAllWhereTheAgentIsNull();
}
log.debug("REST request to get all TaskAgenda");
return taskAgendaService.findAll();
}
/**
* {@code GET /task-agenda/:id} : get the "id" taskAgenda.
*
* @param id the id of the taskAgendaDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the taskAgendaDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/task-agenda/{id}")
public ResponseEntity<TaskAgendaDTO> getTaskAgenda(@PathVariable Long id) {
log.debug("REST request to get TaskAgenda : {}", id);
Optional<TaskAgendaDTO> taskAgendaDTO = taskAgendaService.findOne(id);
return ResponseUtil.wrapOrNotFound(taskAgendaDTO);
}
/**
* {@code DELETE /task-agenda/:id} : delete the "id" taskAgenda.
*
* @param id the id of the taskAgendaDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/task-agenda/{id}")
public ResponseEntity<Void> deleteTaskAgenda(@PathVariable Long id) {
log.debug("REST request to delete TaskAgenda : {}", id);
taskAgendaService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/LogEntryMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.LogEntryDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link LogEntry} and its DTO {@link LogEntryDTO}.
*/
@Mapper(componentModel = "spring", uses = {UserMapper.class})
public interface LogEntryMapper extends EntityMapper<LogEntryDTO, LogEntry> {
@Mapping(source = "user.id", target = "userId")
LogEntryDTO toDto(LogEntry logEntry);
@Mapping(source = "userId", target = "user")
LogEntry toEntity(LogEntryDTO logEntryDTO);
default LogEntry fromId(Long id) {
if (id == null) {
return null;
}
LogEntry logEntry = new LogEntry();
logEntry.setId(id);
return logEntry;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ConnectionTypeMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ConnectionTypeDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ConnectionType} and its DTO {@link ConnectionTypeDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface ConnectionTypeMapper extends EntityMapper<ConnectionTypeDTO, ConnectionType> {
@Mapping(target = "theConnections", ignore = true)
@Mapping(target = "removeTheConnection", ignore = true)
ConnectionType toEntity(ConnectionTypeDTO connectionTypeDTO);
default ConnectionType fromId(Long id) {
if (id == null) {
return null;
}
ConnectionType connectionType = new ConnectionType();
connectionType.setId(id);
return connectionType;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/log/IConnectionEventDAO.java
package br.ufpa.labes.spm.repository.interfaces.log;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ConnectionEvent;
public interface IConnectionEventDAO extends IBaseDAO<ConnectionEvent, Integer> {}
<file_sep>/tmp/service/dto/ActivityDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.HashSet;
import java.util.Set;
import java.util.Objects;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Activity} entity.
*/
public class ActivityDTO implements Serializable {
private Long id;
private String ident;
private String name;
private Boolean isVersion;
private Long theActivityTypeId;
private Set<JoinConDTO> toJoinCons = new HashSet<>();
private Set<BranchANDConDTO> fromBranchANDCons = new HashSet<>();
private Set<ArtifactConDTO> fromArtifactCons = new HashSet<>();
private Set<ArtifactConDTO> toArtifactCons = new HashSet<>();
private Long isVersionOfId;
private Long theProcessModelId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getIdent() {
return ident;
}
public void setIdent(String ident) {
this.ident = ident;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Boolean isIsVersion() {
return isVersion;
}
public void setIsVersion(Boolean isVersion) {
this.isVersion = isVersion;
}
public Long getTheActivityTypeId() {
return theActivityTypeId;
}
public void setTheActivityTypeId(Long activityTypeId) {
this.theActivityTypeId = activityTypeId;
}
public Set<JoinConDTO> getToJoinCons() {
return toJoinCons;
}
public void setToJoinCons(Set<JoinConDTO> joinCons) {
this.toJoinCons = joinCons;
}
public Set<BranchANDConDTO> getFromBranchANDCons() {
return fromBranchANDCons;
}
public void setFromBranchANDCons(Set<BranchANDConDTO> branchANDCons) {
this.fromBranchANDCons = branchANDCons;
}
public Set<ArtifactConDTO> getFromArtifactCons() {
return fromArtifactCons;
}
public void setFromArtifactCons(Set<ArtifactConDTO> artifactCons) {
this.fromArtifactCons = artifactCons;
}
public Set<ArtifactConDTO> getToArtifactCons() {
return toArtifactCons;
}
public void setToArtifactCons(Set<ArtifactConDTO> artifactCons) {
this.toArtifactCons = artifactCons;
}
public Long getIsVersionOfId() {
return isVersionOfId;
}
public void setIsVersionOfId(Long activityId) {
this.isVersionOfId = activityId;
}
public Long getTheProcessModelId() {
return theProcessModelId;
}
public void setTheProcessModelId(Long processModelId) {
this.theProcessModelId = processModelId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
ActivityDTO activityDTO = (ActivityDTO) o;
if (activityDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), activityDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "ActivityDTO{" +
"id=" + getId() +
", ident='" + getIdent() + "'" +
", name='" + getName() + "'" +
", isVersion='" + isIsVersion() + "'" +
", theActivityType=" + getTheActivityTypeId() +
", isVersionOf=" + getIsVersionOfId() +
", theProcessModel=" + getTheProcessModelId() +
"}";
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/types/EventTypeDAO.java
package br.ufpa.labes.spm.repository.impl.types;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IEventTypeDAO;
import br.ufpa.labes.spm.domain.EventType;
public class EventTypeDAO extends BaseDAO<EventType, String> implements IEventTypeDAO {
protected EventTypeDAO(Class<EventType> businessClass) {
super(businessClass);
}
public EventTypeDAO() {
super(EventType.class);
}
}
<file_sep>/tmp/service/mapper/NodeMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.NodeDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Node} and its DTO {@link NodeDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface NodeMapper extends EntityMapper<NodeDTO, Node> {
@Mapping(source = "parentNode.id", target = "parentNodeId")
NodeDTO toDto(Node node);
@Mapping(target = "children", ignore = true)
@Mapping(target = "removeChildren", ignore = true)
@Mapping(target = "theStructure", ignore = true)
@Mapping(source = "parentNodeId", target = "parentNode")
Node toEntity(NodeDTO nodeDTO);
default Node fromId(Long id) {
if (id == null) {
return null;
}
Node node = new Node();
node.setId(id);
return node;
}
}
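// Note: fromId builds a reference-only stub (only the id is set; every other
// field stays null). That is typically enough for MapStruct/JPA to resolve the
// parentNodeId -> parentNode association without loading the full entity.
// Hypothetical usage from another mapper: Node parent = nodeMapper.fromId(nodeDTO.getParentNodeId());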
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ResourceTypeMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ResourceTypeDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ResourceType} and its DTO {@link ResourceTypeDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface ResourceTypeMapper extends EntityMapper<ResourceTypeDTO, ResourceType> {
@Mapping(target = "theRequiredResources", ignore = true)
@Mapping(target = "removeTheRequiredResource", ignore = true)
@Mapping(target = "theResourceInstSugs", ignore = true)
@Mapping(target = "removeTheResourceInstSug", ignore = true)
@Mapping(target = "theResources", ignore = true)
@Mapping(target = "removeTheResource", ignore = true)
ResourceType toEntity(ResourceTypeDTO resourceTypeDTO);
default ResourceType fromId(Long id) {
if (id == null) {
return null;
}
ResourceType resourceType = new ResourceType();
resourceType.setId(id);
return resourceType;
}
}
<file_sep>/tmp/service/ExclusiveService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.Exclusive;
import br.ufpa.labes.spm.repository.ExclusiveRepository;
import br.ufpa.labes.spm.service.dto.ExclusiveDTO;
import br.ufpa.labes.spm.service.mapper.ExclusiveMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link Exclusive}.
*/
@Service
@Transactional
public class ExclusiveService {
private final Logger log = LoggerFactory.getLogger(ExclusiveService.class);
private final ExclusiveRepository exclusiveRepository;
private final ExclusiveMapper exclusiveMapper;
public ExclusiveService(ExclusiveRepository exclusiveRepository, ExclusiveMapper exclusiveMapper) {
this.exclusiveRepository = exclusiveRepository;
this.exclusiveMapper = exclusiveMapper;
}
/**
* Save an exclusive.
*
* @param exclusiveDTO the entity to save.
* @return the persisted entity.
*/
public ExclusiveDTO save(ExclusiveDTO exclusiveDTO) {
log.debug("Request to save Exclusive : {}", exclusiveDTO);
Exclusive exclusive = exclusiveMapper.toEntity(exclusiveDTO);
exclusive = exclusiveRepository.save(exclusive);
return exclusiveMapper.toDto(exclusive);
}
/**
* Get all the exclusives.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ExclusiveDTO> findAll() {
log.debug("Request to get all Exclusives");
return exclusiveRepository.findAll().stream()
.map(exclusiveMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one exclusive by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ExclusiveDTO> findOne(Long id) {
log.debug("Request to get Exclusive : {}", id);
return exclusiveRepository.findById(id)
.map(exclusiveMapper::toDto);
}
/**
* Delete the exclusive by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete Exclusive : {}", id);
exclusiveRepository.deleteById(id);
}
}
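// Usage sketch (hypothetical; assumes a Spring context has wired this service
// and that ExclusiveDTO carries the generated id back from save):
//
//   ExclusiveDTO dto = new ExclusiveDTO();
//   dto = exclusiveService.save(dto);                        // persists and returns the mapped DTO
//   Optional<ExclusiveDTO> found = exclusiveService.findOne(dto.getId());
//   exclusiveService.delete(dto.getId());                    // removes the row by primary key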
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ToolParameterService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ToolParameter;
import br.ufpa.labes.spm.repository.ToolParameterRepository;
import br.ufpa.labes.spm.service.dto.ToolParameterDTO;
import br.ufpa.labes.spm.service.mapper.ToolParameterMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ToolParameter}.
*/
@Service
@Transactional
public class ToolParameterService {
private final Logger log = LoggerFactory.getLogger(ToolParameterService.class);
private final ToolParameterRepository toolParameterRepository;
private final ToolParameterMapper toolParameterMapper;
public ToolParameterService(ToolParameterRepository toolParameterRepository, ToolParameterMapper toolParameterMapper) {
this.toolParameterRepository = toolParameterRepository;
this.toolParameterMapper = toolParameterMapper;
}
/**
* Save a toolParameter.
*
* @param toolParameterDTO the entity to save.
* @return the persisted entity.
*/
public ToolParameterDTO save(ToolParameterDTO toolParameterDTO) {
log.debug("Request to save ToolParameter : {}", toolParameterDTO);
ToolParameter toolParameter = toolParameterMapper.toEntity(toolParameterDTO);
toolParameter = toolParameterRepository.save(toolParameter);
return toolParameterMapper.toDto(toolParameter);
}
/**
* Get all the toolParameters.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ToolParameterDTO> findAll() {
log.debug("Request to get all ToolParameters");
return toolParameterRepository.findAll().stream()
.map(toolParameterMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one toolParameter by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ToolParameterDTO> findOne(Long id) {
log.debug("Request to get ToolParameter : {}", id);
return toolParameterRepository.findById(id)
.map(toolParameterMapper::toDto);
}
/**
* Delete the toolParameter by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ToolParameter : {}", id);
toolParameterRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/DecomposedResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.DecomposedService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.DecomposedDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.Decomposed}.
*/
@RestController
@RequestMapping("/api")
public class DecomposedResource {
private final Logger log = LoggerFactory.getLogger(DecomposedResource.class);
private static final String ENTITY_NAME = "decomposed";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final DecomposedService decomposedService;
public DecomposedResource(DecomposedService decomposedService) {
this.decomposedService = decomposedService;
}
/**
* {@code POST /decomposeds} : Create a new decomposed.
*
* @param decomposedDTO the decomposedDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new decomposedDTO, or with status {@code 400 (Bad Request)} if the decomposed has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/decomposeds")
public ResponseEntity<DecomposedDTO> createDecomposed(@RequestBody DecomposedDTO decomposedDTO) throws URISyntaxException {
log.debug("REST request to save Decomposed : {}", decomposedDTO);
if (decomposedDTO.getId() != null) {
throw new BadRequestAlertException("A new decomposed cannot already have an ID", ENTITY_NAME, "idexists");
}
DecomposedDTO result = decomposedService.save(decomposedDTO);
return ResponseEntity.created(new URI("/api/decomposeds/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /decomposeds} : Updates an existing decomposed.
*
* @param decomposedDTO the decomposedDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated decomposedDTO,
* or with status {@code 400 (Bad Request)} if the decomposedDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the decomposedDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/decomposeds")
public ResponseEntity<DecomposedDTO> updateDecomposed(@RequestBody DecomposedDTO decomposedDTO) throws URISyntaxException {
log.debug("REST request to update Decomposed : {}", decomposedDTO);
if (decomposedDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
DecomposedDTO result = decomposedService.save(decomposedDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, decomposedDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /decomposeds} : get all the decomposeds.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of decomposeds in body.
*/
@GetMapping("/decomposeds")
public List<DecomposedDTO> getAllDecomposeds() {
log.debug("REST request to get all Decomposeds");
return decomposedService.findAll();
}
/**
* {@code GET /decomposeds/:id} : get the "id" decomposed.
*
* @param id the id of the decomposedDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the decomposedDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/decomposeds/{id}")
public ResponseEntity<DecomposedDTO> getDecomposed(@PathVariable Long id) {
log.debug("REST request to get Decomposed : {}", id);
Optional<DecomposedDTO> decomposedDTO = decomposedService.findOne(id);
return ResponseUtil.wrapOrNotFound(decomposedDTO);
}
/**
* {@code DELETE /decomposeds/:id} : delete the "id" decomposed.
*
* @param id the id of the decomposedDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/decomposeds/{id}")
public ResponseEntity<Void> deleteDecomposed(@PathVariable Long id) {
log.debug("REST request to delete Decomposed : {}", id);
decomposedService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
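// Example requests against the endpoints above (hypothetical host/port and
// payload; the accepted fields depend on DecomposedDTO):
//
//   curl -X POST   http://localhost:8080/api/decomposeds -H "Content-Type: application/json" -d '{"ident":"d1"}'
//   curl           http://localhost:8080/api/decomposeds/1
//   curl -X DELETE http://localhost:8080/api/decomposeds/1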
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/tools/SubroutineDAO.java
package br.ufpa.labes.spm.repository.impl.tools;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.tools.ISubroutineDAO;
import br.ufpa.labes.spm.domain.Subroutine;
public class SubroutineDAO extends BaseDAO<Subroutine, Integer> implements ISubroutineDAO {
protected SubroutineDAO(Class<Subroutine> businessClass) {
super(businessClass);
}
public SubroutineDAO() {
super(Subroutine.class);
}
}
<file_sep>/tmp/service/mapper/ArtifactMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ArtifactDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Artifact} and its DTO {@link ArtifactDTO}.
*/
@Mapper(componentModel = "spring", uses = {ArtifactTypeMapper.class, VCSRepositoryMapper.class, ProjectMapper.class})
public interface ArtifactMapper extends EntityMapper<ArtifactDTO, Artifact> {
@Mapping(source = "theArtifactType.id", target = "theArtifactTypeId")
@Mapping(source = "derivedTo.id", target = "derivedToId")
@Mapping(source = "possess.id", target = "possessId")
@Mapping(source = "theRepository.id", target = "theRepositoryId")
@Mapping(source = "theProject.id", target = "theProjectId")
ArtifactDTO toDto(Artifact artifact);
@Mapping(target = "theInvolvedArtifacts", ignore = true)
@Mapping(target = "removeTheInvolvedArtifacts", ignore = true)
@Mapping(target = "theArtifactParams", ignore = true)
@Mapping(target = "removeTheArtifactParam", ignore = true)
@Mapping(target = "theAutomatics", ignore = true)
@Mapping(target = "removeTheAutomatic", ignore = true)
@Mapping(target = "theArtifactMetrics", ignore = true)
@Mapping(target = "removeTheArtifactMetric", ignore = true)
@Mapping(target = "theArtifactEstimations", ignore = true)
@Mapping(target = "removeTheArtifactEstimation", ignore = true)
@Mapping(source = "theArtifactTypeId", target = "theArtifactType")
@Mapping(source = "derivedToId", target = "derivedTo")
@Mapping(source = "possessId", target = "possess")
@Mapping(source = "theRepositoryId", target = "theRepository")
@Mapping(source = "theProjectId", target = "theProject")
@Mapping(target = "derivedFroms", ignore = true)
@Mapping(target = "removeDerivedFrom", ignore = true)
@Mapping(target = "belongsTos", ignore = true)
@Mapping(target = "removeBelongsTo", ignore = true)
@Mapping(target = "theArtifactTasks", ignore = true)
@Mapping(target = "removeTheArtifactTasks", ignore = true)
@Mapping(target = "theArtifactCons", ignore = true)
@Mapping(target = "removeTheArtifactCon", ignore = true)
Artifact toEntity(ArtifactDTO artifactDTO);
default Artifact fromId(Long id) {
if (id == null) {
return null;
}
Artifact artifact = new Artifact();
artifact.setId(id);
return artifact;
}
}
<file_sep>/tmp/service/mapper/DecomposedMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.DecomposedDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Decomposed} and its DTO {@link DecomposedDTO}.
*/
@Mapper(componentModel = "spring", uses = {ProcessModelMapper.class})
public interface DecomposedMapper extends EntityMapper<DecomposedDTO, Decomposed> {
@Mapping(source = "theReferedProcessModel.id", target = "theReferedProcessModelId")
DecomposedDTO toDto(Decomposed decomposed);
@Mapping(source = "theReferedProcessModelId", target = "theReferedProcessModel")
Decomposed toEntity(DecomposedDTO decomposedDTO);
default Decomposed fromId(Long id) {
if (id == null) {
return null;
}
Decomposed decomposed = new Decomposed();
decomposed.setId(id);
return decomposed;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ArtifactParamService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ArtifactParam;
import br.ufpa.labes.spm.repository.ArtifactParamRepository;
import br.ufpa.labes.spm.service.dto.ArtifactParamDTO;
import br.ufpa.labes.spm.service.mapper.ArtifactParamMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ArtifactParam}.
*/
@Service
@Transactional
public class ArtifactParamService {
private final Logger log = LoggerFactory.getLogger(ArtifactParamService.class);
private final ArtifactParamRepository artifactParamRepository;
private final ArtifactParamMapper artifactParamMapper;
public ArtifactParamService(ArtifactParamRepository artifactParamRepository, ArtifactParamMapper artifactParamMapper) {
this.artifactParamRepository = artifactParamRepository;
this.artifactParamMapper = artifactParamMapper;
}
/**
* Save an artifactParam.
*
* @param artifactParamDTO the entity to save.
* @return the persisted entity.
*/
public ArtifactParamDTO save(ArtifactParamDTO artifactParamDTO) {
log.debug("Request to save ArtifactParam : {}", artifactParamDTO);
ArtifactParam artifactParam = artifactParamMapper.toEntity(artifactParamDTO);
artifactParam = artifactParamRepository.save(artifactParam);
return artifactParamMapper.toDto(artifactParam);
}
/**
* Get all the artifactParams.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ArtifactParamDTO> findAll() {
log.debug("Request to get all ArtifactParams");
return artifactParamRepository.findAll().stream()
.map(artifactParamMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one artifactParam by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ArtifactParamDTO> findOne(Long id) {
log.debug("Request to get ArtifactParam : {}", id);
return artifactParamRepository.findById(id)
.map(artifactParamMapper::toDto);
}
/**
* Delete the artifactParam by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ArtifactParam : {}", id);
artifactParamRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/dto/AgentDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.HashSet;
import java.util.Set;
import java.util.Objects;
import javax.persistence.Lob;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Agent} entity.
*/
public class AgentDTO implements Serializable {
private Long id;
private String ident;
private String name;
private String email;
private Float costHour;
private String password;
private Integer tipoUser;
private Boolean isActive;
private Boolean online;
@Lob
private byte[] photo;
private String photoContentType;
private String upload;
@Lob
private String description;
private Long theTaskAgendaId;
private Long configurationId;
private Long theResourceEventId;
private Set<ProcessDTO> theProcesses = new HashSet<>();
private Set<WorkGroupDTO> theWorkGroups = new HashSet<>();
private Set<CompanyUnitDTO> theOrgUnits = new HashSet<>();
private Long theEmailConfigurationId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getIdent() {
return ident;
}
public void setIdent(String ident) {
this.ident = ident;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getEmail() {
return email;
}
public void setEmail(String email) {
this.email = email;
}
public Float getCostHour() {
return costHour;
}
public void setCostHour(Float costHour) {
this.costHour = costHour;
}
public Integer getTipoUser() {
return tipoUser;
}
public void setTipoUser(Integer tipoUser) {
this.tipoUser = tipoUser;
}
public Boolean isIsActive() {
return isActive;
}
public void setIsActive(Boolean isActive) {
this.isActive = isActive;
}
public Boolean isOnline() {
return online;
}
public void setOnline(Boolean online) {
this.online = online;
}
public byte[] getPhoto() {
return photo;
}
public void setPhoto(byte[] photo) {
this.photo = photo;
}
public String getPhotoContentType() {
return photoContentType;
}
public void setPhotoContentType(String photoContentType) {
this.photoContentType = photoContentType;
}
public String getUpload() {
return upload;
}
public void setUpload(String upload) {
this.upload = upload;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Long getTheTaskAgendaId() {
return theTaskAgendaId;
}
public void setTheTaskAgendaId(Long taskAgendaId) {
this.theTaskAgendaId = taskAgendaId;
}
public Long getConfigurationId() {
return configurationId;
}
public void setConfigurationId(Long spmConfigurationId) {
this.configurationId = spmConfigurationId;
}
public Long getTheResourceEventId() {
return theResourceEventId;
}
public void setTheResourceEventId(Long resourceEventId) {
this.theResourceEventId = resourceEventId;
}
public Set<ProcessDTO> getTheProcesses() {
return theProcesses;
}
public void setTheProcesses(Set<ProcessDTO> processes) {
this.theProcesses = processes;
}
public Set<WorkGroupDTO> getTheWorkGroups() {
return theWorkGroups;
}
public void setTheWorkGroups(Set<WorkGroupDTO> workGroups) {
this.theWorkGroups = workGroups;
}
public Set<CompanyUnitDTO> getTheOrgUnits() {
return theOrgUnits;
}
public void setTheOrgUnits(Set<CompanyUnitDTO> companyUnits) {
this.theOrgUnits = companyUnits;
}
public Long getTheEmailConfigurationId() {
return theEmailConfigurationId;
}
public void setTheEmailConfigurationId(Long emailConfigurationId) {
this.theEmailConfigurationId = emailConfigurationId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
AgentDTO agentDTO = (AgentDTO) o;
if (agentDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), agentDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "AgentDTO{" +
"id=" + getId() +
", ident='" + getIdent() + "'" +
", name='" + getName() + "'" +
", email='" + getEmail() + "'" +
", costHour=" + getCostHour() +
", password='" + getPassword() + "'" +
", tipoUser=" + getTipoUser() +
", isActive='" + isIsActive() + "'" +
", online='" + isOnline() + "'" +
", photo='" + getPhoto() + "'" +
", upload='" + getUpload() + "'" +
", description='" + getDescription() + "'" +
", theTaskAgenda=" + getTheTaskAgendaId() +
", configuration=" + getConfigurationId() +
", theResourceEvent=" + getTheResourceEventId() +
", theEmailConfiguration=" + getTheEmailConfigurationId() +
"}";
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/AgendaEventService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.AgendaEvent;
import br.ufpa.labes.spm.repository.AgendaEventRepository;
import br.ufpa.labes.spm.service.dto.AgendaEventDTO;
import br.ufpa.labes.spm.service.mapper.AgendaEventMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link AgendaEvent}.
*/
@Service
@Transactional
public class AgendaEventService {
private final Logger log = LoggerFactory.getLogger(AgendaEventService.class);
private final AgendaEventRepository agendaEventRepository;
private final AgendaEventMapper agendaEventMapper;
public AgendaEventService(AgendaEventRepository agendaEventRepository, AgendaEventMapper agendaEventMapper) {
this.agendaEventRepository = agendaEventRepository;
this.agendaEventMapper = agendaEventMapper;
}
/**
* Save an agendaEvent.
*
* @param agendaEventDTO the entity to save.
* @return the persisted entity.
*/
public AgendaEventDTO save(AgendaEventDTO agendaEventDTO) {
log.debug("Request to save AgendaEvent : {}", agendaEventDTO);
AgendaEvent agendaEvent = agendaEventMapper.toEntity(agendaEventDTO);
agendaEvent = agendaEventRepository.save(agendaEvent);
return agendaEventMapper.toDto(agendaEvent);
}
/**
* Get all the agendaEvents.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AgendaEventDTO> findAll() {
log.debug("Request to get all AgendaEvents");
return agendaEventRepository.findAll().stream()
.map(agendaEventMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one agendaEvent by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<AgendaEventDTO> findOne(Long id) {
log.debug("Request to get AgendaEvent : {}", id);
return agendaEventRepository.findById(id)
.map(agendaEventMapper::toDto);
}
/**
* Delete the agendaEvent by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete AgendaEvent : {}", id);
agendaEventRepository.deleteById(id);
}
}
<file_sep>/tmp/service/impl/ToolServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import javax.persistence.Query;
import org.qrconsult.spm.converter.core.Converter;
import org.qrconsult.spm.converter.core.ConverterImpl;
import br.ufpa.labes.spm.exceptions.ImplementationException;
import br.ufpa.labes.spm.repository.interfaces.tools.IToolDefinitionDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IArtifactTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IToolTypeDAO;
import br.ufpa.labes.spm.service.dto.ToolDTO;
import br.ufpa.labes.spm.service.dto.ToolsDTO;
import br.ufpa.labes.spm.service.dto.TypesDTO;
import br.ufpa.labes.spm.domain.ToolDefinition;
import br.ufpa.labes.spm.domain.ArtifactType;
import br.ufpa.labes.spm.domain.ToolType;
import br.ufpa.labes.spm.domain.Type;
import br.ufpa.labes.spm.service.interfaces.ToolServices;
public class ToolServicesImpl implements ToolServices {
private final String TOOL_CLASSNAME = ToolDefinition.class.getSimpleName();
IToolDefinitionDAO toolDAO;
IToolTypeDAO toolTypeDAO;
IArtifactTypeDAO artifactTypeDAO;
Converter converter = new ConverterImpl();
private Query query;
@Override
@SuppressWarnings("unchecked")
public TypesDTO getToolTypes() {
String hql;
List<Type> typesLists = new ArrayList<Type>();
hql = "from " + ToolType.class.getSimpleName();
query = toolTypeDAO.getPersistenceContext().createQuery(hql);
typesLists = query.getResultList();
TypesDTO typesDTO = new TypesDTO(typesLists.size());
int j = 0;
for (Type type : typesLists) {
String typeIdent = type.getIdent();
String superTypeIdent = (type.getSuperType() != null ? type
.getSuperType().getIdent() : "");
String rootType = ArtifactType.class.getSimpleName();
typesDTO.addType(typeIdent, superTypeIdent, rootType, j);
j++;
}
return typesDTO;
}
@Override
public ToolDTO saveTool(ToolDTO toolDTO) {
ToolDefinition tool = toolDAO.retrieveBySecondaryKey(toolDTO.getIdent());
ToolType toolType = toolTypeDAO.retrieveBySecondaryKey(toolDTO
.getTheToolType());
Collection<String> artifactTypeNames = toolDTO.getTheArtifactType();
if (tool != null) {
updateTool(tool, toolDTO);
} else {
tool = this.convertToolDTOToTool(toolDTO);
toolDAO.daoSave(tool);
String newIdent = toolDAO.generateIdent(tool.getName(), tool);
tool.setIdent(newIdent);
// toolDTO.setIdent(newIdent);
}
tool.setTheToolType(toolType);
this.updateDependencies(tool, toolDTO);
toolDAO.update(tool);
toolDTO = this.convertToolToToolDTO(tool);
toolDTO.setTheArtifactType(artifactTypeNames);
// System.out.println("----> Tool Oid: " + toolDTO.getId() + "; Name: "
// + toolDTO.getName() + "; Artifacts: " +
// toolDTO.getTheArtifactType().size());
return toolDTO;
}
private void updateTool(ToolDefinition tool, ToolDTO toolDTO) {
tool.setName(toolDTO.getName());
tool.setDescription(toolDTO.getDescription());
}
private void updateDependencies(ToolDefinition tool, ToolDTO toolDTO) {
List<ArtifactType> artifactTypes = new ArrayList<ArtifactType>();
for (String artifactType : toolDTO.getTheArtifactType()) {
ArtifactType artifact = artifactTypeDAO
.retrieveBySecondaryKey(artifactType);
artifactTypes.add(artifact);
artifact = null;
}
tool.setTheArtifactType(artifactTypes);
}
@Override
public ToolDTO getTool(String toolIdent) {
ToolDTO toolDTO = new ToolDTO();
ToolDefinition tool = toolDAO.retrieveBySecondaryKey(toolIdent);
if (tool != null) {
toolDTO = this.convertToolToToolDTO(tool);
}
return toolDTO;
}
@Override
@SuppressWarnings("unchecked")
public ToolsDTO getTools() {
String hql = "SELECT tool FROM " + TOOL_CLASSNAME + " AS tool";
query = toolDAO.getPersistenceContext().createQuery(hql);
ToolsDTO toolsDTO = new ToolsDTO();
List<ToolDefinition> result = query.getResultList();
if (!result.isEmpty()) {
ToolDTO toolDTO = null;
for (ToolDefinition tool : result) {
toolDTO = this.convertToolToToolDTO(tool);
toolsDTO.addTool(toolDTO);
// System.out.println("Tool, name: " + toolDTO.getName() +
// "; Artifacts: " + toolDTO.getTheArtifactType());
toolDTO = null;
}
}
return toolsDTO;
}
@Override
@SuppressWarnings("unchecked")
public ToolsDTO getTools(String toolName, String toolType, String artifact,
Boolean isActive) {
System.out.println("Parameters - name: " + toolName + "; type: "
+ toolType + "; artifact: " + artifact);
ToolsDTO toolsDTO = new ToolsDTO();
String hql = "SELECT tool FROM " + TOOL_CLASSNAME
+ " AS tool WHERE tool.name like :name ";
String typeFilter = (toolType.equals("")) ? ""
: " AND tool.theToolType.ident = :toolType";
query = toolDAO.getPersistenceContext().createQuery(hql + typeFilter);
query.setParameter("name", "%" + toolName + "%");
if (!toolType.equals("")) {
query.setParameter("toolType", toolType);
}
List<ToolDefinition> result = query.getResultList();
for (ToolDefinition toolDefinition : result) {
ToolDTO tool = this.convertToolToToolDTO(toolDefinition);
toolsDTO.addTool(tool);
}
if (!artifact.equals("")) {
this.filterArtifacts(artifact, toolsDTO);
}
return toolsDTO;
}
@Override
public Boolean removeTool(String toolIdent) {
ToolDefinition tool = toolDAO.retrieveBySecondaryKey(toolIdent);
if (tool != null) {
for (ArtifactType artifactType : tool.getTheArtifactType()) {
artifactType.getTheToolDefinition().remove(tool);
}
// tool.setTheArtifactType(new HashSet<ArtifactType>());
tool.getTheArtifactType().clear();
toolDAO.update(tool);
toolDAO.daoDelete(tool);
return true;
}
return false;
}
private ToolsDTO filterArtifacts(String artifact, ToolsDTO result) {
// Iterate backwards so removing a tool does not skip the element that
// shifts into the removed slot.
for (int i = result.size() - 1; i >= 0; i--) {
ToolDTO tool = result.getTool(i);
if (!tool.getTheArtifactType().contains(artifact)) {
result.removeTool(tool);
}
}
return result;
}
@Override
public Boolean removeArtifactFromTool(String artifactName, ToolDTO tool) {
ArtifactType artifactType = artifactTypeDAO
.retrieveBySecondaryKey(artifactName);
ToolDefinition toolDefinition = toolDAO.retrieveBySecondaryKey(tool.getIdent());
if ((artifactType != null) && (toolDefinition != null)) {
artifactType.removeFromTheToolDefinition(toolDefinition);
toolDAO.update(toolDefinition);
return true;
}
return false;
}
@SuppressWarnings({ "unchecked", "unused" })
private ToolDefinition getToolFromName(String toolName) {
String hql = "SELECT tool FROM " + TOOL_CLASSNAME
+ " as tool where tool.name = :name";
query = toolDAO.getPersistenceContext().createQuery(hql);
query.setParameter("name", toolName);
List<ToolDefinition> result = null;
result = query.getResultList();
if (!result.isEmpty()) {
ToolDefinition tool = result.get(0);
return tool;
}
return null;
}
private ToolDTO convertToolToToolDTO(ToolDefinition tool) {
try {
ToolDTO toolDTO = (ToolDTO) converter.getDTO(tool, ToolDTO.class);
toolDTO.setTheToolType(tool.getTheToolType().getIdent());
// System.out.println("Tool: " + tool.getName() +
// " ---------> Size: " + tool.getTheArtifactType().size());
List<String> artifactType = new ArrayList<String>();
for (ArtifactType artifact : tool.getTheArtifactType()) {
artifactType.add(artifact.getIdent());
}
toolDTO.setTheArtifactType(artifactType);
return toolDTO;
} catch (ImplementationException e) {
e.printStackTrace();
}
return new ToolDTO();
}
private ToolDefinition convertToolDTOToTool(ToolDTO toolDTO) {
try {
ToolDefinition tool = (ToolDefinition) converter.getEntity(toolDTO,
ToolDefinition.class);
return tool;
} catch (ImplementationException e) {
e.printStackTrace();
}
return new ToolDefinition();
}
@SuppressWarnings("unused")
private ToolDefinition convertToolDTOToTool(ToolDTO toolDTO,
ToolDefinition tool) {
try {
tool = (ToolDefinition) converter.getEntity(toolDTO, tool);
return tool;
} catch (ImplementationException e) {
e.printStackTrace();
}
return new ToolDefinition();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/OutOfWorkPeriodService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.OutOfWorkPeriod;
import br.ufpa.labes.spm.repository.OutOfWorkPeriodRepository;
import br.ufpa.labes.spm.service.dto.OutOfWorkPeriodDTO;
import br.ufpa.labes.spm.service.mapper.OutOfWorkPeriodMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link OutOfWorkPeriod}.
*/
@Service
@Transactional
public class OutOfWorkPeriodService {
private final Logger log = LoggerFactory.getLogger(OutOfWorkPeriodService.class);
private final OutOfWorkPeriodRepository outOfWorkPeriodRepository;
private final OutOfWorkPeriodMapper outOfWorkPeriodMapper;
public OutOfWorkPeriodService(OutOfWorkPeriodRepository outOfWorkPeriodRepository, OutOfWorkPeriodMapper outOfWorkPeriodMapper) {
this.outOfWorkPeriodRepository = outOfWorkPeriodRepository;
this.outOfWorkPeriodMapper = outOfWorkPeriodMapper;
}
/**
* Save an outOfWorkPeriod.
*
* @param outOfWorkPeriodDTO the entity to save.
* @return the persisted entity.
*/
public OutOfWorkPeriodDTO save(OutOfWorkPeriodDTO outOfWorkPeriodDTO) {
log.debug("Request to save OutOfWorkPeriod : {}", outOfWorkPeriodDTO);
OutOfWorkPeriod outOfWorkPeriod = outOfWorkPeriodMapper.toEntity(outOfWorkPeriodDTO);
outOfWorkPeriod = outOfWorkPeriodRepository.save(outOfWorkPeriod);
return outOfWorkPeriodMapper.toDto(outOfWorkPeriod);
}
/**
* Get all the outOfWorkPeriods.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<OutOfWorkPeriodDTO> findAll() {
log.debug("Request to get all OutOfWorkPeriods");
return outOfWorkPeriodRepository.findAll().stream()
.map(outOfWorkPeriodMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one outOfWorkPeriod by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<OutOfWorkPeriodDTO> findOne(Long id) {
log.debug("Request to get OutOfWorkPeriod : {}", id);
return outOfWorkPeriodRepository.findById(id)
.map(outOfWorkPeriodMapper::toDto);
}
/**
* Delete the outOfWorkPeriod by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete OutOfWorkPeriod : {}", id);
outOfWorkPeriodRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/processKnowledge/IWorkGroupEstimationDAO.java
package br.ufpa.labes.spm.repository.interfaces.processKnowledge;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.WorkGroupEstimation;
public interface IWorkGroupEstimationDAO extends IBaseDAO<WorkGroupEstimation, Integer> {}
<file_sep>/tmp/service/AbilityTypeService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.AbilityType;
import br.ufpa.labes.spm.repository.AbilityTypeRepository;
import br.ufpa.labes.spm.service.dto.AbilityTypeDTO;
import br.ufpa.labes.spm.service.mapper.AbilityTypeMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link AbilityType}.
*/
@Service
@Transactional
public class AbilityTypeService {
private final Logger log = LoggerFactory.getLogger(AbilityTypeService.class);
private final AbilityTypeRepository abilityTypeRepository;
private final AbilityTypeMapper abilityTypeMapper;
public AbilityTypeService(AbilityTypeRepository abilityTypeRepository, AbilityTypeMapper abilityTypeMapper) {
this.abilityTypeRepository = abilityTypeRepository;
this.abilityTypeMapper = abilityTypeMapper;
}
/**
* Save an abilityType.
*
* @param abilityTypeDTO the entity to save.
* @return the persisted entity.
*/
public AbilityTypeDTO save(AbilityTypeDTO abilityTypeDTO) {
log.debug("Request to save AbilityType : {}", abilityTypeDTO);
AbilityType abilityType = abilityTypeMapper.toEntity(abilityTypeDTO);
abilityType = abilityTypeRepository.save(abilityType);
return abilityTypeMapper.toDto(abilityType);
}
/**
* Get all the abilityTypes.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AbilityTypeDTO> findAll() {
log.debug("Request to get all AbilityTypes");
return abilityTypeRepository.findAll().stream()
.map(abilityTypeMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one abilityType by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<AbilityTypeDTO> findOne(Long id) {
log.debug("Request to get AbilityType : {}", id);
return abilityTypeRepository.findById(id)
.map(abilityTypeMapper::toDto);
}
/**
* Delete the abilityType by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete AbilityType : {}", id);
abilityTypeRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/taskagenda/IOcurrenceDAO.java
package br.ufpa.labes.spm.repository.interfaces.taskagenda;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Ocurrence;
public interface IOcurrenceDAO extends IBaseDAO<Ocurrence, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/domain/enumeration/WebAPSEEObjectType.java
package br.ufpa.labes.spm.domain.enumeration;
/** The WebAPSEEObjectType enumeration. */
public enum WebAPSEEObjectType {
ACTIVITY,
REQ_AGENT,
REQ_RESOURCE,
REQ_WORKGROUP,
CONNECTION
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/people/PersonDAO.java
package br.ufpa.labes.spm.repository.impl.people;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.people.IPersonDAO;
import br.ufpa.labes.spm.domain.Person;
public class PersonDAO extends BaseDAO<Person, String> implements IPersonDAO {
public PersonDAO() {
super(Person.class);
}
}
<file_sep>/tmp/service/mapper/ConnectionEventMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ConnectionEventDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ConnectionEvent} and its DTO {@link ConnectionEventDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface ConnectionEventMapper extends EntityMapper<ConnectionEventDTO, ConnectionEvent> {
@Mapping(target = "theCatalogEvents", ignore = true)
@Mapping(target = "removeTheCatalogEvents", ignore = true)
ConnectionEvent toEntity(ConnectionEventDTO connectionEventDTO);
default ConnectionEvent fromId(Long id) {
if (id == null) {
return null;
}
ConnectionEvent connectionEvent = new ConnectionEvent();
connectionEvent.setId(id);
return connectionEvent;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/PeopleInstSugService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.PeopleInstSug;
import br.ufpa.labes.spm.repository.PeopleInstSugRepository;
import br.ufpa.labes.spm.service.dto.PeopleInstSugDTO;
import br.ufpa.labes.spm.service.mapper.PeopleInstSugMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link PeopleInstSug}.
*/
@Service
@Transactional
public class PeopleInstSugService {
private final Logger log = LoggerFactory.getLogger(PeopleInstSugService.class);
private final PeopleInstSugRepository peopleInstSugRepository;
private final PeopleInstSugMapper peopleInstSugMapper;
public PeopleInstSugService(PeopleInstSugRepository peopleInstSugRepository, PeopleInstSugMapper peopleInstSugMapper) {
this.peopleInstSugRepository = peopleInstSugRepository;
this.peopleInstSugMapper = peopleInstSugMapper;
}
/**
* Save a peopleInstSug.
*
* @param peopleInstSugDTO the entity to save.
* @return the persisted entity.
*/
public PeopleInstSugDTO save(PeopleInstSugDTO peopleInstSugDTO) {
log.debug("Request to save PeopleInstSug : {}", peopleInstSugDTO);
PeopleInstSug peopleInstSug = peopleInstSugMapper.toEntity(peopleInstSugDTO);
peopleInstSug = peopleInstSugRepository.save(peopleInstSug);
return peopleInstSugMapper.toDto(peopleInstSug);
}
/**
* Get all the peopleInstSugs.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<PeopleInstSugDTO> findAll() {
log.debug("Request to get all PeopleInstSugs");
return peopleInstSugRepository.findAll().stream()
.map(peopleInstSugMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one peopleInstSug by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<PeopleInstSugDTO> findOne(Long id) {
log.debug("Request to get PeopleInstSug : {}", id);
return peopleInstSugRepository.findById(id)
.map(peopleInstSugMapper::toDto);
}
/**
* Delete the peopleInstSug by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete PeopleInstSug : {}", id);
peopleInstSugRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/agent/WorkGroupDAO.java
package br.ufpa.labes.spm.repository.impl.agent;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IWorkGroupDAO;
import br.ufpa.labes.spm.domain.WorkGroup;
public class WorkGroupDAO extends BaseDAO<WorkGroup, String> implements IWorkGroupDAO {
protected WorkGroupDAO(Class<WorkGroup> businessClass) {
super(businessClass);
}
public WorkGroupDAO() {
super(WorkGroup.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ResourcePossibleUseService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ResourcePossibleUse;
import br.ufpa.labes.spm.repository.ResourcePossibleUseRepository;
import br.ufpa.labes.spm.service.dto.ResourcePossibleUseDTO;
import br.ufpa.labes.spm.service.mapper.ResourcePossibleUseMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ResourcePossibleUse}.
*/
@Service
@Transactional
public class ResourcePossibleUseService {
private final Logger log = LoggerFactory.getLogger(ResourcePossibleUseService.class);
private final ResourcePossibleUseRepository resourcePossibleUseRepository;
private final ResourcePossibleUseMapper resourcePossibleUseMapper;
public ResourcePossibleUseService(ResourcePossibleUseRepository resourcePossibleUseRepository, ResourcePossibleUseMapper resourcePossibleUseMapper) {
this.resourcePossibleUseRepository = resourcePossibleUseRepository;
this.resourcePossibleUseMapper = resourcePossibleUseMapper;
}
/**
* Save a resourcePossibleUse.
*
* @param resourcePossibleUseDTO the entity to save.
* @return the persisted entity.
*/
public ResourcePossibleUseDTO save(ResourcePossibleUseDTO resourcePossibleUseDTO) {
log.debug("Request to save ResourcePossibleUse : {}", resourcePossibleUseDTO);
ResourcePossibleUse resourcePossibleUse = resourcePossibleUseMapper.toEntity(resourcePossibleUseDTO);
resourcePossibleUse = resourcePossibleUseRepository.save(resourcePossibleUse);
return resourcePossibleUseMapper.toDto(resourcePossibleUse);
}
/**
* Get all the resourcePossibleUses.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ResourcePossibleUseDTO> findAll() {
log.debug("Request to get all ResourcePossibleUses");
return resourcePossibleUseRepository.findAll().stream()
.map(resourcePossibleUseMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one resourcePossibleUse by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ResourcePossibleUseDTO> findOne(Long id) {
log.debug("Request to get ResourcePossibleUse : {}", id);
return resourcePossibleUseRepository.findById(id)
.map(resourcePossibleUseMapper::toDto);
}
/**
* Delete the resourcePossibleUse by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ResourcePossibleUse : {}", id);
resourcePossibleUseRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/log/IModelingActivityEventDAO.java
package br.ufpa.labes.spm.repository.interfaces.log;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ModelingActivityEvent;
public interface IModelingActivityEventDAO extends IBaseDAO<ModelingActivityEvent, Integer> {}
<file_sep>/tmp/service/dto/ArtifactDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.Objects;
import javax.persistence.Lob;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Artifact} entity.
*/
public class ArtifactDTO implements Serializable {
private Long id;
private String ident;
private String name;
@Lob
private String description;
private String pathName;
private String fileName;
private String latestVersion;
private Boolean isTemplate;
private Boolean isActive;
private Long theArtifactTypeId;
private Long derivedToId;
private Long possessId;
private Long theRepositoryId;
private Long theProjectId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getIdent() {
return ident;
}
public void setIdent(String ident) {
this.ident = ident;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public String getPathName() {
return pathName;
}
public void setPathName(String pathName) {
this.pathName = pathName;
}
public String getFileName() {
return fileName;
}
public void setFileName(String fileName) {
this.fileName = fileName;
}
public String getLatestVersion() {
return latestVersion;
}
public void setLatestVersion(String latestVersion) {
this.latestVersion = latestVersion;
}
public Boolean isIsTemplate() {
return isTemplate;
}
public void setIsTemplate(Boolean isTemplate) {
this.isTemplate = isTemplate;
}
public Boolean isIsActive() {
return isActive;
}
public void setIsActive(Boolean isActive) {
this.isActive = isActive;
}
public Long getTheArtifactTypeId() {
return theArtifactTypeId;
}
public void setTheArtifactTypeId(Long artifactTypeId) {
this.theArtifactTypeId = artifactTypeId;
}
public Long getDerivedToId() {
return derivedToId;
}
public void setDerivedToId(Long artifactId) {
this.derivedToId = artifactId;
}
public Long getPossessId() {
return possessId;
}
public void setPossessId(Long artifactId) {
this.possessId = artifactId;
}
public Long getTheRepositoryId() {
return theRepositoryId;
}
public void setTheRepositoryId(Long vCSRepositoryId) {
this.theRepositoryId = vCSRepositoryId;
}
public Long getTheProjectId() {
return theProjectId;
}
public void setTheProjectId(Long projectId) {
this.theProjectId = projectId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
ArtifactDTO artifactDTO = (ArtifactDTO) o;
if (artifactDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), artifactDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "ArtifactDTO{" +
"id=" + getId() +
", ident='" + getIdent() + "'" +
", name='" + getName() + "'" +
", description='" + getDescription() + "'" +
", pathName='" + getPathName() + "'" +
", fileName='" + getFileName() + "'" +
", latestVersion='" + getLatestVersion() + "'" +
", isTemplate='" + isIsTemplate() + "'" +
", isActive='" + isIsActive() + "'" +
", theArtifactType=" + getTheArtifactTypeId() +
", derivedTo=" + getDerivedToId() +
", possess=" + getPossessId() +
", theRepository=" + getTheRepositoryId() +
", theProject=" + getTheProjectId() +
"}";
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/organizationPolicies/NodeDAO.java
package br.ufpa.labes.spm.repository.impl.organizationPolicies;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.organizationPolicies.INodeDAO;
import br.ufpa.labes.spm.domain.Node;
public class NodeDAO extends BaseDAO<Node, String> implements INodeDAO {
protected NodeDAO(Class<Node> businessClass) {
super(businessClass);
}
public NodeDAO() {
super(Node.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/types/RoleTypeDAO.java
package br.ufpa.labes.spm.repository.impl.types;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IRoleTypeDAO;
import br.ufpa.labes.spm.domain.RoleType;
public class RoleTypeDAO extends BaseDAO<RoleType, String> implements IRoleTypeDAO {
protected RoleTypeDAO(Class<RoleType> businessClass) {
super(businessClass);
}
public RoleTypeDAO() {
super(RoleType.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/tools/IScriptDAO.java
package br.ufpa.labes.spm.repository.interfaces.tools;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Script;
public interface IScriptDAO extends IBaseDAO<Script, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/IMultipleConDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.MultipleCon;
public interface IMultipleConDAO extends IBaseDAO<MultipleCon, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/tools/PrimitiveTypeDAO.java
package br.ufpa.labes.spm.repository.impl.tools;
import br.ufpa.labes.spm.domain.PrimitiveType;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.tools.IPrimitiveTypeDAO;
public class PrimitiveTypeDAO extends BaseDAO<PrimitiveType, String> implements IPrimitiveTypeDAO {
protected PrimitiveTypeDAO(Class<PrimitiveType> businessClass) {
super(businessClass);
}
public PrimitiveTypeDAO() {
super(PrimitiveType.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/AgentMetricService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.AgentMetric;
import br.ufpa.labes.spm.repository.AgentMetricRepository;
import br.ufpa.labes.spm.service.dto.AgentMetricDTO;
import br.ufpa.labes.spm.service.mapper.AgentMetricMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link AgentMetric}.
*/
@Service
@Transactional
public class AgentMetricService {
private final Logger log = LoggerFactory.getLogger(AgentMetricService.class);
private final AgentMetricRepository agentMetricRepository;
private final AgentMetricMapper agentMetricMapper;
public AgentMetricService(AgentMetricRepository agentMetricRepository, AgentMetricMapper agentMetricMapper) {
this.agentMetricRepository = agentMetricRepository;
this.agentMetricMapper = agentMetricMapper;
}
/**
* Save an agentMetric.
*
* @param agentMetricDTO the entity to save.
* @return the persisted entity.
*/
public AgentMetricDTO save(AgentMetricDTO agentMetricDTO) {
log.debug("Request to save AgentMetric : {}", agentMetricDTO);
AgentMetric agentMetric = agentMetricMapper.toEntity(agentMetricDTO);
agentMetric = agentMetricRepository.save(agentMetric);
return agentMetricMapper.toDto(agentMetric);
}
/**
* Get all the agentMetrics.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AgentMetricDTO> findAll() {
log.debug("Request to get all AgentMetrics");
return agentMetricRepository.findAll().stream()
.map(agentMetricMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one agentMetric by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<AgentMetricDTO> findOne(Long id) {
log.debug("Request to get AgentMetric : {}", id);
return agentMetricRepository.findById(id)
.map(agentMetricMapper::toDto);
}
/**
* Delete the agentMetric by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete AgentMetric : {}", id);
agentMetricRepository.deleteById(id);
}
}
<file_sep>/tmp/service/mapper/CatalogEventMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.CatalogEventDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link CatalogEvent} and its DTO {@link CatalogEventDTO}.
*/
@Mapper(componentModel = "spring", uses = {ResourceEventMapper.class, ProcessModelEventMapper.class, AgendaEventMapper.class, ConnectionEventMapper.class, GlobalActivityEventMapper.class, ModelingActivityEventMapper.class, ProcessEventMapper.class, PlainMapper.class})
public interface CatalogEventMapper extends EntityMapper<CatalogEventDTO, CatalogEvent> {
@Mapping(source = "theResourceEvent.id", target = "theResourceEventId")
@Mapping(source = "theProcessModelEvent.id", target = "theProcessModelEventId")
@Mapping(source = "theAgendaEvent.id", target = "theAgendaEventId")
@Mapping(source = "theCatalogEvent.id", target = "theCatalogEventId")
@Mapping(source = "theConnectionEvent.id", target = "theConnectionEventId")
@Mapping(source = "theGlobalActivityEvent.id", target = "theGlobalActivityEventId")
@Mapping(source = "theModelingActivityEvent.id", target = "theModelingActivityEventId")
@Mapping(source = "theProcessEvent.id", target = "theProcessEventId")
@Mapping(source = "thePlain.id", target = "thePlainId")
CatalogEventDTO toDto(CatalogEvent catalogEvent);
@Mapping(source = "theResourceEventId", target = "theResourceEvent")
@Mapping(source = "theProcessModelEventId", target = "theProcessModelEvent")
@Mapping(source = "theAgendaEventId", target = "theAgendaEvent")
@Mapping(source = "theCatalogEventId", target = "theCatalogEvent")
@Mapping(source = "theConnectionEventId", target = "theConnectionEvent")
@Mapping(source = "theGlobalActivityEventId", target = "theGlobalActivityEvent")
@Mapping(source = "theModelingActivityEventId", target = "theModelingActivityEvent")
@Mapping(source = "theProcessEventId", target = "theProcessEvent")
@Mapping(source = "thePlainId", target = "thePlain")
@Mapping(target = "theEvents", ignore = true)
@Mapping(target = "removeTheEvent", ignore = true)
@Mapping(target = "theCatalogEvents", ignore = true)
@Mapping(target = "removeTheCatalogEvents", ignore = true)
CatalogEvent toEntity(CatalogEventDTO catalogEventDTO);
default CatalogEvent fromId(Long id) {
if (id == null) {
return null;
}
CatalogEvent catalogEvent = new CatalogEvent();
catalogEvent.setId(id);
return catalogEvent;
}
}
<file_sep>/tmp/service/AgentAffinityAgentService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.AgentAffinityAgent;
import br.ufpa.labes.spm.repository.AgentAffinityAgentRepository;
import br.ufpa.labes.spm.service.dto.AgentAffinityAgentDTO;
import br.ufpa.labes.spm.service.mapper.AgentAffinityAgentMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link AgentAffinityAgent}.
*/
@Service
@Transactional
public class AgentAffinityAgentService {
private final Logger log = LoggerFactory.getLogger(AgentAffinityAgentService.class);
private final AgentAffinityAgentRepository agentAffinityAgentRepository;
private final AgentAffinityAgentMapper agentAffinityAgentMapper;
public AgentAffinityAgentService(AgentAffinityAgentRepository agentAffinityAgentRepository, AgentAffinityAgentMapper agentAffinityAgentMapper) {
this.agentAffinityAgentRepository = agentAffinityAgentRepository;
this.agentAffinityAgentMapper = agentAffinityAgentMapper;
}
/**
* Save an agentAffinityAgent.
*
* @param agentAffinityAgentDTO the entity to save.
* @return the persisted entity.
*/
public AgentAffinityAgentDTO save(AgentAffinityAgentDTO agentAffinityAgentDTO) {
log.debug("Request to save AgentAffinityAgent : {}", agentAffinityAgentDTO);
AgentAffinityAgent agentAffinityAgent = agentAffinityAgentMapper.toEntity(agentAffinityAgentDTO);
agentAffinityAgent = agentAffinityAgentRepository.save(agentAffinityAgent);
return agentAffinityAgentMapper.toDto(agentAffinityAgent);
}
/**
* Get all the agentAffinityAgents.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<AgentAffinityAgentDTO> findAll() {
log.debug("Request to get all AgentAffinityAgents");
return agentAffinityAgentRepository.findAll().stream()
.map(agentAffinityAgentMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one agentAffinityAgent by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<AgentAffinityAgentDTO> findOne(Long id) {
log.debug("Request to get AgentAffinityAgent : {}", id);
return agentAffinityAgentRepository.findById(id)
.map(agentAffinityAgentMapper::toDto);
}
/**
* Delete the agentAffinityAgent by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete AgentAffinityAgent : {}", id);
agentAffinityAgentRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/IDependencyDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Dependency;
public interface IDependencyDAO extends IBaseDAO<Dependency, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/agent/IConfiDAO.java
package br.ufpa.labes.spm.repository.interfaces.agent;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.SpmConfiguration;
public interface IConfiDAO extends IBaseDAO<SpmConfiguration, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/IUserDAO.java
package br.ufpa.labes.spm.repository.interfaces;
import br.ufpa.labes.spm.domain.User;
public interface IUserDAO extends IBaseDAO<User, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/types/IArtifactTypeDAO.java
package br.ufpa.labes.spm.repository.interfaces.types;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ArtifactType;
public interface IArtifactTypeDAO extends IBaseDAO<ArtifactType, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ConnectionEventMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ConnectionEventDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ConnectionEvent} and its DTO {@link ConnectionEventDTO}.
*/
@Mapper(componentModel = "spring", uses = {CatalogEventMapper.class})
public interface ConnectionEventMapper extends EntityMapper<ConnectionEventDTO, ConnectionEvent> {
@Mapping(source = "theCatalogEvent.id", target = "theCatalogEventId")
ConnectionEventDTO toDto(ConnectionEvent connectionEvent);
@Mapping(source = "theCatalogEventId", target = "theCatalogEvent")
ConnectionEvent toEntity(ConnectionEventDTO connectionEventDTO);
default ConnectionEvent fromId(Long id) {
if (id == null) {
return null;
}
ConnectionEvent connectionEvent = new ConnectionEvent();
connectionEvent.setId(id);
return connectionEvent;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/AgentPlaysRoleMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.AgentPlaysRoleDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link AgentPlaysRole} and its DTO {@link AgentPlaysRoleDTO}.
*/
@Mapper(componentModel = "spring", uses = {AgentMapper.class, RoleMapper.class})
public interface AgentPlaysRoleMapper extends EntityMapper<AgentPlaysRoleDTO, AgentPlaysRole> {
@Mapping(source = "theAgent.id", target = "theAgentId")
@Mapping(source = "theRole.id", target = "theRoleId")
AgentPlaysRoleDTO toDto(AgentPlaysRole agentPlaysRole);
@Mapping(source = "theAgentId", target = "theAgent")
@Mapping(source = "theRoleId", target = "theRole")
AgentPlaysRole toEntity(AgentPlaysRoleDTO agentPlaysRoleDTO);
default AgentPlaysRole fromId(Long id) {
if (id == null) {
return null;
}
AgentPlaysRole agentPlaysRole = new AgentPlaysRole();
agentPlaysRole.setId(id);
return agentPlaysRole;
}
}
<file_sep>/tmp/service/ProcessModelEventService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ProcessModelEvent;
import br.ufpa.labes.spm.repository.ProcessModelEventRepository;
import br.ufpa.labes.spm.service.dto.ProcessModelEventDTO;
import br.ufpa.labes.spm.service.mapper.ProcessModelEventMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ProcessModelEvent}.
*/
@Service
@Transactional
public class ProcessModelEventService {
private final Logger log = LoggerFactory.getLogger(ProcessModelEventService.class);
private final ProcessModelEventRepository processModelEventRepository;
private final ProcessModelEventMapper processModelEventMapper;
public ProcessModelEventService(ProcessModelEventRepository processModelEventRepository, ProcessModelEventMapper processModelEventMapper) {
this.processModelEventRepository = processModelEventRepository;
this.processModelEventMapper = processModelEventMapper;
}
/**
* Save a processModelEvent.
*
* @param processModelEventDTO the entity to save.
* @return the persisted entity.
*/
public ProcessModelEventDTO save(ProcessModelEventDTO processModelEventDTO) {
log.debug("Request to save ProcessModelEvent : {}", processModelEventDTO);
ProcessModelEvent processModelEvent = processModelEventMapper.toEntity(processModelEventDTO);
processModelEvent = processModelEventRepository.save(processModelEvent);
return processModelEventMapper.toDto(processModelEvent);
}
/**
* Get all the processModelEvents.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ProcessModelEventDTO> findAll() {
log.debug("Request to get all ProcessModelEvents");
return processModelEventRepository.findAll().stream()
.map(processModelEventMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one processModelEvent by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ProcessModelEventDTO> findOne(Long id) {
log.debug("Request to get ProcessModelEvent : {}", id);
return processModelEventRepository.findById(id)
.map(processModelEventMapper::toDto);
}
/**
* Delete the processModelEvent by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ProcessModelEvent : {}", id);
processModelEventRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/OrganizationMetricService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.OrganizationMetric;
import br.ufpa.labes.spm.repository.OrganizationMetricRepository;
import br.ufpa.labes.spm.service.dto.OrganizationMetricDTO;
import br.ufpa.labes.spm.service.mapper.OrganizationMetricMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link OrganizationMetric}.
*/
@Service
@Transactional
public class OrganizationMetricService {
private final Logger log = LoggerFactory.getLogger(OrganizationMetricService.class);
private final OrganizationMetricRepository organizationMetricRepository;
private final OrganizationMetricMapper organizationMetricMapper;
public OrganizationMetricService(OrganizationMetricRepository organizationMetricRepository, OrganizationMetricMapper organizationMetricMapper) {
this.organizationMetricRepository = organizationMetricRepository;
this.organizationMetricMapper = organizationMetricMapper;
}
/**
* Save an organizationMetric.
*
* @param organizationMetricDTO the entity to save.
* @return the persisted entity.
*/
public OrganizationMetricDTO save(OrganizationMetricDTO organizationMetricDTO) {
log.debug("Request to save OrganizationMetric : {}", organizationMetricDTO);
OrganizationMetric organizationMetric = organizationMetricMapper.toEntity(organizationMetricDTO);
organizationMetric = organizationMetricRepository.save(organizationMetric);
return organizationMetricMapper.toDto(organizationMetric);
}
/**
* Get all the organizationMetrics.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<OrganizationMetricDTO> findAll() {
log.debug("Request to get all OrganizationMetrics");
return organizationMetricRepository.findAll().stream()
.map(organizationMetricMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one organizationMetric by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<OrganizationMetricDTO> findOne(Long id) {
log.debug("Request to get OrganizationMetric : {}", id);
return organizationMetricRepository.findById(id)
.map(organizationMetricMapper::toDto);
}
/**
* Delete the organizationMetric by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete OrganizationMetric : {}", id);
organizationMetricRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/assets/RelationshipKindDAO.java
package br.ufpa.labes.spm.repository.impl.assets;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.assets.IRelationshipKindDAO;
import br.ufpa.labes.spm.domain.RelationshipKind;
public class RelationshipKindDAO extends BaseDAO<RelationshipKind, String>
implements IRelationshipKindDAO {
public RelationshipKindDAO() {
super(RelationshipKind.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ChatLogMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ChatLogDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ChatLog} and its DTO {@link ChatLogDTO}.
*/
@Mapper(componentModel = "spring", uses = {AgentMapper.class})
public interface ChatLogMapper extends EntityMapper<ChatLogDTO, ChatLog> {
@Mapping(target = "removeInvolvedAgentsInChat", ignore = true)
default ChatLog fromId(Long id) {
if (id == null) {
return null;
}
ChatLog chatLog = new ChatLog();
chatLog.setId(id);
return chatLog;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/WebAPSEEObjectRepository.java
package br.ufpa.labes.spm.repository;
import br.ufpa.labes.spm.domain.WebAPSEEObject;
import org.springframework.data.jpa.repository.*;
import org.springframework.stereotype.Repository;
/**
* Spring Data repository for the WebAPSEEObject entity.
*/
@SuppressWarnings("unused")
@Repository
public interface WebAPSEEObjectRepository extends JpaRepository<WebAPSEEObject, Long> {
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/activities/IDecomposedDAO.java
package br.ufpa.labes.spm.repository.interfaces.activities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Decomposed;
public interface IDecomposedDAO extends IBaseDAO<Decomposed, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/processKnowledge/IAgentMetricDAO.java
package br.ufpa.labes.spm.repository.interfaces.processKnowledge;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.AgentMetric;
public interface IAgentMetricDAO extends IBaseDAO<AgentMetric, Integer> {}
<file_sep>/tmp/service/dto/ResourceDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.Objects;
import javax.persistence.Lob;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Resource} entity.
*/
public class ResourceDTO implements Serializable {
private Long id;
private String ident;
private String name;
@Lob
private String description;
private Float mtbfTime;
private String mtbfUnitTime;
private String currency;
private Float cost;
private Boolean isActive;
private Long belongsToId;
private Long requiresId;
private Long theResourceTypeId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getIdent() {
return ident;
}
public void setIdent(String ident) {
this.ident = ident;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Float getMtbfTime() {
return mtbfTime;
}
public void setMtbfTime(Float mtbfTime) {
this.mtbfTime = mtbfTime;
}
public String getMtbfUnitTime() {
return mtbfUnitTime;
}
public void setMtbfUnitTime(String mtbfUnitTime) {
this.mtbfUnitTime = mtbfUnitTime;
}
public String getCurrency() {
return currency;
}
public void setCurrency(String currency) {
this.currency = currency;
}
public Float getCost() {
return cost;
}
public void setCost(Float cost) {
this.cost = cost;
}
public Boolean isIsActive() {
return isActive;
}
public void setIsActive(Boolean isActive) {
this.isActive = isActive;
}
public Long getBelongsToId() {
return belongsToId;
}
public void setBelongsToId(Long resourceId) {
this.belongsToId = resourceId;
}
public Long getRequiresId() {
return requiresId;
}
public void setRequiresId(Long resourceId) {
this.requiresId = resourceId;
}
public Long getTheResourceTypeId() {
return theResourceTypeId;
}
public void setTheResourceTypeId(Long resourceTypeId) {
this.theResourceTypeId = resourceTypeId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
ResourceDTO resourceDTO = (ResourceDTO) o;
if (resourceDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), resourceDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "ResourceDTO{" +
"id=" + getId() +
", ident='" + getIdent() + "'" +
", name='" + getName() + "'" +
", description='" + getDescription() + "'" +
", mtbfTime=" + getMtbfTime() +
", mtbfUnitTime='" + getMtbfUnitTime() + "'" +
", currency='" + getCurrency() + "'" +
", cost=" + getCost() +
", isActive='" + isIsActive() + "'" +
", belongsTo=" + getBelongsToId() +
", requires=" + getRequiresId() +
", theResourceType=" + getTheResourceTypeId() +
"}";
}
}
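ResourceDTO's equals/hashCode pair above compares DTOs by id only, and deliberately treats any DTO with a null id (i.e. not yet persisted) as unequal to everything else regardless of field contents. A minimal standalone sketch of the same contract (the IdDto class is hypothetical, not part of the project):

```java
import java.util.Objects;

// Minimal stand-in mirroring ResourceDTO's id-based equality contract.
class IdDto {
    private final Long id;
    IdDto(Long id) { this.id = id; }
    Long getId() { return id; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        IdDto other = (IdDto) o;
        // DTOs without a persistence id never compare equal.
        if (other.getId() == null || getId() == null) return false;
        return Objects.equals(getId(), other.getId());
    }

    @Override
    public int hashCode() {
        return Objects.hashCode(getId());
    }
}

public class EqualsContractDemo {
    public static void main(String[] args) {
        System.out.println(new IdDto(1L).equals(new IdDto(1L)));     // true: same id
        System.out.println(new IdDto(null).equals(new IdDto(null))); // false: null ids never match
    }
}
```

This is the usual trade-off for persistence DTOs: equality is stable across field edits once an id exists, at the cost of violating reflexive content equality for unsaved instances.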
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/types/ActivityTypeDAO.java
package br.ufpa.labes.spm.repository.impl.types;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IActivityTypeDAO;
import br.ufpa.labes.spm.domain.ActivityType;
public class ActivityTypeDAO extends BaseDAO<ActivityType, String> implements IActivityTypeDAO {
protected ActivityTypeDAO(Class<ActivityType> businessClass) {
super(businessClass);
}
public ActivityTypeDAO() {
super(ActivityType.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/types/ConnectionTypeDAO.java
package br.ufpa.labes.spm.repository.impl.types;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IConnectionTypeDAO;
import br.ufpa.labes.spm.domain.ConnectionType;
public class ConnectionTypeDAO extends BaseDAO<ConnectionType, String>
implements IConnectionTypeDAO {
protected ConnectionTypeDAO(Class<ConnectionType> businessClass) {
super(businessClass);
}
public ConnectionTypeDAO() {
super(ConnectionType.class);
}
}
<file_sep>/tmp/service/mapper/AbilityMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.AbilityDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Ability} and its DTO {@link AbilityDTO}.
*/
@Mapper(componentModel = "spring", uses = {AbilityTypeMapper.class})
public interface AbilityMapper extends EntityMapper<AbilityDTO, Ability> {
@Mapping(source = "theAbilityType.id", target = "theAbilityTypeId")
AbilityDTO toDto(Ability ability);
@Mapping(target = "theReqAgentRequiresAbilities", ignore = true)
@Mapping(target = "removeTheReqAgentRequiresAbility", ignore = true)
@Mapping(target = "theAgentHasAbilities", ignore = true)
@Mapping(target = "removeTheAgentHasAbility", ignore = true)
@Mapping(source = "theAbilityTypeId", target = "theAbilityType")
@Mapping(target = "theRoleNeedsAbilities", ignore = true)
@Mapping(target = "removeTheRoleNeedsAbility", ignore = true)
Ability toEntity(AbilityDTO abilityDTO);
default Ability fromId(Long id) {
if (id == null) {
return null;
}
Ability ability = new Ability();
ability.setId(id);
return ability;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/email/IEmailDAO.java
package br.ufpa.labes.spm.repository.interfaces.email;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Email;
public interface IEmailDAO extends IBaseDAO<Email, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/people/AuthorDAO.java
package br.ufpa.labes.spm.repository.impl.people;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.people.IAuthorDAO;
import br.ufpa.labes.spm.domain.Author;
public class AuthorDAO extends BaseDAO<Author, String> implements IAuthorDAO {
public AuthorDAO() {
super(Author.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/types/WorkGroupTypeDAO.java
package br.ufpa.labes.spm.repository.impl.types;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IWorkGroupTypeDAO;
import br.ufpa.labes.spm.domain.WorkGroupType;
public class WorkGroupTypeDAO extends BaseDAO<WorkGroupType, String> implements IWorkGroupTypeDAO {
protected WorkGroupTypeDAO(Class<WorkGroupType> businessClass) {
super(businessClass);
}
public WorkGroupTypeDAO() {
super(WorkGroupType.class);
}
}
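The ReportServicesImpl that follows repeats the pattern `entry[i] != null ? formatter.format(entry[i]) : ""` in nearly every report method, each time with a fresh SimpleDateFormat. A small helper could centralize it; the sketch below is a hypothetical refactoring (the class and the formatOrEmpty method are not project code) using java.time's DateTimeFormatter, which, unlike SimpleDateFormat, is immutable and thread-safe, so one shared instance suffices:

```java
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Date;

// Hypothetical helper for the "format the date or return an empty string"
// pattern used throughout ReportServicesImpl.
public class DateFormatting {
    // Thread-safe; safe to share across report methods and threads.
    private static final DateTimeFormatter DAY =
            DateTimeFormatter.ofPattern("dd/MM/yyyy").withZone(ZoneId.systemDefault());

    static String formatOrEmpty(Object value) {
        if (value == null) return "";
        return DAY.format(((Date) value).toInstant());
    }

    public static void main(String[] args) {
        System.out.println(formatOrEmpty(null).isEmpty()); // true
    }
}
```

Report code would then read `item.setSinceDate(formatOrEmpty(entry[2]))` instead of repeating the ternary and the formatter construction.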
<file_sep>/tmp/service/impl/ReportServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import br.ufpa.labes.spm.repository.interfaces.IReportDAO;
import br.ufpa.labes.spm.service.dto.ActivitiesByAgentReportItem;
import br.ufpa.labes.spm.service.dto.ActivitiesByProcessReportItem;
import br.ufpa.labes.spm.service.dto.AgentItemBean;
import br.ufpa.labes.spm.service.dto.AgentMetricsReportItem;
import br.ufpa.labes.spm.service.dto.AgentsByWorkGroupReportItem;
import br.ufpa.labes.spm.service.dto.AgentsByRoleReportItem;
import br.ufpa.labes.spm.service.dto.AgentsReportItem;
import br.ufpa.labes.spm.service.dto.ArtifactMetricsReportItem;
import br.ufpa.labes.spm.service.dto.CostDeviationReportItem;
import br.ufpa.labes.spm.service.dto.DocumentManagementPlanItem;
import br.ufpa.labes.spm.service.dto.HumanResourcesPlanItem;
import br.ufpa.labes.spm.service.dto.KnowledgeItensReportItem;
import br.ufpa.labes.spm.service.dto.ProjectArtifactsReportItem;
import br.ufpa.labes.spm.service.dto.ProjectsBySystemReportItem;
import br.ufpa.labes.spm.service.dto.RequirementItem;
import br.ufpa.labes.spm.service.dto.ResourceItem;
import br.ufpa.labes.spm.service.dto.ResourceMetricsReportItem;
import br.ufpa.labes.spm.service.dto.ResourceStatesReportItem;
import br.ufpa.labes.spm.service.dto.ResourcesPlanItem;
import br.ufpa.labes.spm.service.dto.SchedulePlanItem;
import br.ufpa.labes.spm.service.dto.WorkBreakdownStructureItem;
import br.ufpa.labes.spm.service.dto.AllocableActivityItem;
import br.ufpa.labes.spm.service.dto.ResourcesCostConsumableItem;
import br.ufpa.labes.spm.service.dto.ResourcesCostExclusiveItem;
import br.ufpa.labes.spm.service.dto.ResourcesCostHumanItem;
import br.ufpa.labes.spm.service.dto.ResourcesCostPlanItem;
import br.ufpa.labes.spm.service.dto.ResourcesCostShareableItem;
import br.ufpa.labes.spm.service.interfaces.ReportServices;
public class ReportServicesImpl implements ReportServices {
IReportDAO reportDAO;
@Override
public List<AgentsReportItem> generateAgentReport(Date date) {
Date atDate = date;
Date todayDate = new Date();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
if (formatter.format(atDate).equals(formatter.format(todayDate))) {
atDate = null;
}
List<Object[]> data = reportDAO.getAgentsReportData(atDate);
if (data.size() == 0)
return new ArrayList<AgentsReportItem>();
List<AgentsReportItem> beansList = new ArrayList<AgentsReportItem>();
for ( Object[] entry : data ) {
AgentsReportItem item = new AgentsReportItem();
item.setAgentIdent( (String)entry[ 0 ] );
item.setAgentName( (String)entry[ 1 ] );
item.setAgentCostHour( (Double)entry[ 2 ] );
item.setAgentWorkload( (Integer)entry[ 3 ] );
beansList.add( item );
}
return beansList;
}
@SuppressWarnings("unchecked")
@Override
public List<Object> generateActivitiesByProcessReport(String processIdent) {
List<Object[]> data = reportDAO.getActivitiesByProcessReportData(processIdent);
if (data.size() == 0)
return new ArrayList<Object>();
List<Object> beansList = new ArrayList<Object>();
for ( Object[] entry : data ) {
List<Object[]> tasks = (List<Object[]>) entry[ 9 ];
for ( Object[] task : tasks ) {
ActivitiesByProcessReportItem item = new ActivitiesByProcessReportItem();
System.out.println("A => " + (String)entry[ 1 ] + " - " + ((Date)entry[ 3 ]).toString());
item.setActivityIdent( (String)entry[ 0 ] );
item.setActivityName( (String)entry[ 1 ] );
item.setActivityState( (String)entry[ 2 ] );
item.setPlannedBegin( (Date)entry[ 3 ] );
item.setPlannedEnd( (Date)entry[ 4 ] );
item.setActualBegin( (Date)entry[ 5 ] );
item.setActualEnd( (Date)entry[ 6 ] );
item.setPlannedAmountOfHours( (Double)entry[ 7 ] );
item.setActualAmountOfHours( (Double)entry[ 8 ] );
item.setAgentName( (String)task[ 2 ] );
try {
item.setLocalState( (String)task[ 3 ] );
item.setWorkingHours( Double.valueOf( (Float)task[ 4 ] ) );
}
catch (ArrayIndexOutOfBoundsException e) {
// some task rows carry no local state or working hours; keep the defaults
}
beansList.add( item );
}
}
return beansList;
}
@Override
public List<AgentsByRoleReportItem> generateAgentsByRoleReport() {
List<Object[]> data = reportDAO.getAgentsByRoleReportData();
System.out.println("No service: " + data);
if (data.size() == 0)
return new ArrayList<AgentsByRoleReportItem>();
List<AgentsByRoleReportItem> beansList = new ArrayList<AgentsByRoleReportItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
AgentsByRoleReportItem item = new AgentsByRoleReportItem();
item.setRoleName( (String)entry[ 0 ] );
item.setAgentName( (String)entry[ 1 ] );
item.setSinceDate( ( entry[ 2 ] != null ? formatter.format( entry[ 2 ] ) : "" ) );
beansList.add( item );
}
return beansList;
}
@Override
public List<AgentsByWorkGroupReportItem> generateAgentsByWorkGroupReport() {
List<Object[]> data = reportDAO.getAgentsByWorkGroupReportData();
if ( data.size() == 0 )
return new ArrayList<AgentsByWorkGroupReportItem>();
List<AgentsByWorkGroupReportItem> beansList = new ArrayList<AgentsByWorkGroupReportItem>();
for ( Object[] entry : data ) {
AgentsByWorkGroupReportItem item = new AgentsByWorkGroupReportItem();
item.setWorkGroupName( (String)entry[ 0 ] );
item.setAgentName( (String)entry[ 1 ] );
beansList.add( item );
}
return beansList;
}
@Override
public List<ProjectArtifactsReportItem> generateProjectArtifactsReport(String processIdent) {
List<Object[]> data = reportDAO.getProjectArtifactsReportData(processIdent);
if ( data.size() == 0 )
return new ArrayList<ProjectArtifactsReportItem>();
List<ProjectArtifactsReportItem> beansList = new ArrayList<ProjectArtifactsReportItem>();
for ( Object[] entry : data ) {
ProjectArtifactsReportItem item = new ProjectArtifactsReportItem();
item.setProjectName( (String)entry[ 0 ] );
item.setSystemName( (String)entry[ 1 ] );
item.setArtifactName( (String)entry[ 2 ] );
try {
item.setArtifactType( (String)entry[ 3 ] );
}
catch( ClassCastException e ) {
// some rows hold a non-String artifact type; report it and keep the default
System.out.println( entry[ 3 ] );
}
beansList.add( item );
}
return beansList;
}
@Override
public List<ActivitiesByAgentReportItem> generateActivitiesByAgentReport(String agentIdent) {
List<Object[]> data = reportDAO.getActivitiesByAgentsReportData(agentIdent, null, null, null, true );
if ( data.size() == 0 )
return new ArrayList<ActivitiesByAgentReportItem>();
List<ActivitiesByAgentReportItem> beansList = new ArrayList<ActivitiesByAgentReportItem>();
for ( Object[] entry : data ) {
ActivitiesByAgentReportItem item = new ActivitiesByAgentReportItem();
item.setAgentIdent( (String)entry[ 0 ] );
item.setAgentName( (String)entry[ 1 ] );
item.setAgentRole( (String)entry[ 2 ] );
item.setAgentCostHour( (Double)entry[ 3 ] );
item.setAgentWorkload( Integer.valueOf( (String)entry[ 4 ] ) );
item.setProcessIdent( (String)entry[ 5 ] );
item.setActivityName( (String)entry[ 6 ] );
item.setActivityState( (String)entry[ 7 ] );
item.setActivityDelay( (Integer)entry[ 8 ] );
item.setActivityActualEnd((String)entry[ 9 ]);
item.setActivityPlannedEnd((String)entry[ 10 ]);
item.setActivityActualBegin((String)entry[ 11 ]);
item.setActivityPlannedBegin((String)entry[ 12 ]);
beansList.add( item );
}
return beansList;
}
@Override
public List<CostDeviationReportItem> generateCostDeviationReport(String processIdent) {
List<Object[]> data = reportDAO.getCostDeviationReportData(processIdent);
if ( data.size() == 0 )
return new ArrayList<CostDeviationReportItem>();
List<CostDeviationReportItem> beansList = new ArrayList<CostDeviationReportItem>();
for ( Object[] entry : data ) {
CostDeviationReportItem item = new CostDeviationReportItem();
item.setProcessIdent( (String)entry[ 0 ] );
item.setActivityName( (String)entry[ 1 ] );
item.setActualCost( (Double)entry[ 2 ] );
item.setEstimatedCost( (Double)entry[ 3 ] );
beansList.add( item );
}
return beansList;
}
@Override
public List<ResourceStatesReportItem> generateResourceStatesReport() {
List<Object[]> data = reportDAO.getResourceStatesReportData();
if ( data.size() == 0 )
return new ArrayList<ResourceStatesReportItem>();
List<ResourceStatesReportItem> beansList = new ArrayList<ResourceStatesReportItem>();
for ( Object[] entry : data ) {
ResourceStatesReportItem item = new ResourceStatesReportItem();
item.setResourceIdent( (String)entry[ 0 ] );
item.setResourceType( (String)entry[ 1 ] );
item.setResourceClass( (String)entry[ 2 ] );
item.setResourceCost( (Double)entry[ 3 ] );
item.setResourceUnit( (String)entry[ 4 ] );
item.setResourceState( (String)entry[ 5 ] );
beansList.add( item );
}
return beansList;
}
@Override
public List<KnowledgeItensReportItem> generateKnowledgeItensReport(Date initialDate, Date finalDate) {
List<Object[]> data = reportDAO.getKnowledgeItensReportData(initialDate, finalDate);
if ( data.size() == 0 )
return new ArrayList<KnowledgeItensReportItem>();
List<KnowledgeItensReportItem> beansList = new ArrayList<KnowledgeItensReportItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
KnowledgeItensReportItem item = new KnowledgeItensReportItem();
item.setKnowledgeItemIdent( ( String ) entry[ 0 ] );
item.setKnowledgeItemDate( ( entry[ 1 ] != null ? formatter.format( entry[ 1 ] ) : "" ) );
item.setKnowledgeItemStatus( ( String ) entry[ 2 ] );
item.setAgentIdent( ( String ) entry[ 3 ] );
item.setAgentName( ( String ) entry[ 4 ] );
beansList.add( item );
}
return beansList;
}
@Override
public List<RequirementItem> generateRequirementItemReport(String params) {
/*SystemDAO systemDAO = new DevelopingSystemDAO();
System system;
system = null;
system = (System) DevelopingsystemDAO.retrieveBySecondaryKey(params.getParams().get(RequirementListBySystemReportParam.SYSTEM).getName());
Collection<Requirement> requirementsColl = null;*/
// TODO: requirement report generation is not yet implemented.
return null;
}
@Override
public List<AgentMetricsReportItem> generateAgentMetricsReport(String agentIdent) {
List<Object[]> data = reportDAO.getAgentMetricsReportData(agentIdent);
if ( data.size() == 0 )
return new ArrayList<AgentMetricsReportItem>();
List<AgentMetricsReportItem> beansList = new ArrayList<AgentMetricsReportItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
AgentMetricsReportItem item = new AgentMetricsReportItem();
item.setAgentIdent( (String)entry[ 0 ] );
item.setAgentName( (String)entry[ 1 ] );
item.setMetricDefinitionName( (String)entry[ 2 ] );
item.setMetricValue( Double.valueOf( (Float)entry[ 3 ] ) );
item.setMetricUnit( (String)entry[ 4 ] );
item.setMetricPeriodBegin( ( entry[ 5 ] != null ? formatter.format( entry[ 5 ] ) : "" ) );
item.setMetricPeriodEnd( ( entry[ 6 ] != null ? formatter.format( entry[ 6 ] ) : "" ) );
beansList.add( item );
}
return beansList;
}
@Override
public List<ArtifactMetricsReportItem> generateArtifactMetricsReport(String artifactIdent) {
List<Object[]> data = reportDAO.getArtifactMetricsReportData(artifactIdent);
List<ArtifactMetricsReportItem> beansList = new ArrayList<ArtifactMetricsReportItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
ArtifactMetricsReportItem item = new ArtifactMetricsReportItem();
item.setArtifactIdent( (String)entry[ 0 ] );
item.setArtifactType( (String)entry[ 1 ] );
item.setMetricDefinitionName( (String)entry[ 2 ] );
item.setMetricValue( Double.valueOf( (Float)entry[ 3 ] ) );
item.setMetricUnit( (String)entry[ 4 ] );
item.setMetricPeriodBegin( ( entry[ 5 ] != null ? formatter.format( entry[ 5 ] ) : "" ) );
item.setMetricPeriodEnd( ( entry[ 6 ] != null ? formatter.format( entry[ 6 ] ) : "" ) );
beansList.add( item );
}
return beansList;
}
@Override
public List<ResourceMetricsReportItem> generateResourceMetricsReport(String resourceIdent) {
List<Object[]> data = reportDAO.getResourceMetricsReportData(resourceIdent);
if ( data.size() == 0 )
return new ArrayList<ResourceMetricsReportItem>();
List<ResourceMetricsReportItem> beansList = new ArrayList<ResourceMetricsReportItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
ResourceMetricsReportItem item = new ResourceMetricsReportItem();
item.setResourceIdent( (String)entry[ 0 ] );
item.setResourceType( (String)entry[ 1 ] );
item.setMetricDefinitionName( (String)entry[ 2 ] );
item.setMetricValue( Double.valueOf( (Float)entry[ 3 ] ) );
item.setMetricUnit( (String)entry[ 4 ] );
item.setMetricPeriodBegin( ( entry[ 5 ] != null ? formatter.format( entry[ 5 ] ) : "" ) );
item.setMetricPeriodEnd( ( entry[ 6 ] != null ? formatter.format( entry[ 6 ] ) : "" ) );
beansList.add( item );
}
return beansList;
}
@Override
public List<SchedulePlanItem> generateSchedulePlanReport(String processIdent) {
List<Object[]> data = new ArrayList<Object[]>();
data.addAll(reportDAO.getScheduleData(processIdent));
if ( data.size() == 0 )
return new ArrayList<SchedulePlanItem>();
List<SchedulePlanItem> beansList = new ArrayList<SchedulePlanItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
SchedulePlanItem item = new SchedulePlanItem();
item.setProcessIdent( (String)entry[ 0 ] );
item.setActivityIdent( (String)entry[ 1 ] );
item.setActivityName( (String)entry[ 2 ] );
item.setActivityState( (String)entry[ 3 ] );
item.setActivityPlannedBegin( ( entry[ 4 ] != null ? formatter.format( entry[ 4 ] ) : "" ) );
item.setActivityPlannedEnd( entry[ 5 ] != null ? formatter.format( entry[ 5 ] ) : "" );
item.setActivityEstimation( (Double)entry[ 6 ] );
item.setActivityActualBegin( ( entry[ 7 ] != null ? formatter.format( entry[ 7 ] ) : "" ) );
item.setActivityActualEnd( ( entry[ 8 ] != null ? formatter.format( entry[ 8 ] ) : "" ) );
item.setActivityAvgWorkedHours( (Double)entry[ 9 ] );
// double division never throws ArithmeticException; guard the zero estimate directly
double deviation = (item.getActivityEstimation() == 0 ? 0 : item.getActivityAvgWorkedHours() / item.getActivityEstimation() - 1);
item.setHourDeviation( deviation );
beansList.add( item );
}
return beansList;
}
@Override
public List<DocumentManagementPlanItem> generateDocumentManagementPlanReport(String processIdent) {
List<Object[]> data = reportDAO.getDocumentManagementPlanData(processIdent);
if ( data.size() == 0 )
return new ArrayList<DocumentManagementPlanItem>();
List<DocumentManagementPlanItem> beansList = new ArrayList<DocumentManagementPlanItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
DocumentManagementPlanItem item = new DocumentManagementPlanItem();
item.setProcessIdent( (String)entry[ 0 ] );
item.setActivityName( (String)entry[ 1 ] );
item.setPlannedBegin( ( entry[ 2 ] != null ? formatter.format( entry[ 2 ] ) : "" ) );
item.setPlannedEnd( ( entry[ 3 ] != null ? formatter.format( entry[ 3 ] ) : "" ) );
item.setAgentName( (String)entry[ 4 ] );
item.setInputArtifact( (String)entry[ 5 ] );
item.setOutputArtifact( (String)entry[ 6 ] );
beansList.add( item );
}
return beansList;
}
@SuppressWarnings("unchecked")
@Override
public List<Object> generateHumanResourcesReport(String processIdent) {
List<Object[]> data = reportDAO.getHumanResourcesPlanData(processIdent);
List<AllocableActivityItem> activities = new ArrayList<AllocableActivityItem>();
List<Object> listOfReturn = new ArrayList<Object>();
if ( data.size() == 0 )
return new ArrayList<Object>();
List<HumanResourcesPlanItem> beansList = new ArrayList<HumanResourcesPlanItem>();
for ( Object[] entry : data ) {
HumanResourcesPlanItem item = new HumanResourcesPlanItem();
item.setActivityIdent( (String)entry[ 0 ] );
item.setActivityName( (String)entry[ 1 ] );
for (Object[] agentEntry : (List<Object[]>)entry[2]) {
AgentItemBean agentItem = new AgentItemBean();
agentItem.setRoleName((String)agentEntry[0]);
agentItem.setAgentName((String)agentEntry[1]);
agentItem.setWorkingHours((Double)agentEntry[2]);
item.addAgent(agentItem);
}
beansList.add( item );
}
listOfReturn.add(beansList);
List<Object[]> allocableActivities = reportDAO.getAllocableActivitiesData(processIdent);
for (Object[] activityArray : allocableActivities ) {
AllocableActivityItem activity = new AllocableActivityItem();
activity.setActivityIdent( (String)activityArray[ 0 ] );
activity.setActivityName( (String)activityArray[ 1 ] );
activity.setEstimatedHours( (Double)activityArray[ 2 ] );
activities.add( activity );
}
listOfReturn.add(activities); // add the converted items, not the raw query rows
return listOfReturn;
}
@SuppressWarnings("unchecked")
@Override
public List<ResourcesPlanItem> generateResourcesPlanReport(String processIdent) {
List<Object[]> data = reportDAO.getResourcesPlanData(processIdent);
if ( data.size() == 0 )
return new ArrayList<ResourcesPlanItem>();
List<ResourcesPlanItem> beansList = new ArrayList<ResourcesPlanItem>();
for ( Object[] entry : data ) {
ResourcesPlanItem item = new ResourcesPlanItem();
item.setProcessIdent( (String)entry[ 0 ] );
List<ResourceItem> exclusives = new ArrayList<ResourceItem>();
List<ResourceItem> consumables = new ArrayList<ResourceItem>();
List<ResourceItem> shareables = new ArrayList<ResourceItem>();
for (Object[] exclusiveArray : (List<Object[]>)entry[ 1 ] ) {
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
ResourceItem exclusive = new ResourceItem();
exclusive.setName( (String)exclusiveArray[ 0 ] );
exclusive.setDescription( (String)exclusiveArray[ 1 ] );
exclusive.setBeginDate( ( exclusiveArray[ 2 ] != null ? formatter.format( (Date)exclusiveArray[ 2 ] ) : "" ) );
exclusive.setEndDate( ( exclusiveArray[ 3 ] != null ? formatter.format( (Date)exclusiveArray[ 3 ] ) : "" ) );
exclusives.add( exclusive );
}
for (Object[] consumableArray : (List<Object[]>)entry[ 2 ] ) {
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
ResourcesCostConsumableItem consumable = new ResourcesCostConsumableItem();
consumable.setName( (String)consumableArray[ 0 ] );
consumable.setDescription( (String)consumableArray[ 1 ] );
consumable.setBeginDate( ( consumableArray[ 2 ] != null ? formatter.format( (Date)consumableArray[ 2 ] ) : "" ) );
consumable.setEndDate( ( consumableArray[ 3 ] != null ? formatter.format( (Date)consumableArray[ 3 ] ) : "" ) );
consumables.add( consumable );
}
for (Object[] shareableArray : (List<Object[]>)entry[ 3 ] ) {
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
ResourcesCostShareableItem shareable = new ResourcesCostShareableItem();
shareable.setName( (String)shareableArray[ 0 ] );
shareable.setDescription( (String)shareableArray[ 1 ] );
shareable.setBeginDate( ( shareableArray[ 2 ] != null ? formatter.format( (Date)shareableArray[ 2 ] ) : "" ) );
shareable.setEndDate( ( shareableArray[ 3 ] != null ? formatter.format( (Date)shareableArray[ 3 ] ) : "" ) );
shareables.add( shareable );
}
item.setExclusives( exclusives );
item.setConsumables( consumables );
item.setShareables( shareables );
beansList.add( item );
}
return beansList;
}
@Override
public List<ProjectsBySystemReportItem> generateProjectsBySystemReport(String systemIdent) {
List<Object[]> data = reportDAO.getProjectsBySystemReportData(systemIdent);
if ( data.size() == 0 )
return new ArrayList<ProjectsBySystemReportItem>();
List<ProjectsBySystemReportItem> beansList = new ArrayList<ProjectsBySystemReportItem>();
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for ( Object[] entry : data ) {
ProjectsBySystemReportItem item = new ProjectsBySystemReportItem();
item.setSystemIdent( (String)entry[ 0 ] );
item.setOrganizationIdent( (String)entry[ 1 ] );
item.setProjectIdent( (String)entry[ 2 ] );
item.setProjectBegin( ( entry[ 3 ] != null ? formatter.format( entry[ 3 ] ) : "" ) );
item.setProjectEnd( ( entry[ 4 ] != null ? formatter.format( entry[ 4 ] ) : "" ) );
beansList.add( item );
}
return beansList;
}
@Override
public List<WorkBreakdownStructureItem> generateWorkBreakdownStructureReport(String processIdent) {
List<Object[]> data = reportDAO.getWorkBreakdownStructureData(processIdent);
if ( data.size() == 0 )
return new ArrayList<WorkBreakdownStructureItem>();
List<WorkBreakdownStructureItem> beansList = new ArrayList<WorkBreakdownStructureItem>();
for ( Object[] entry : data ) {
WorkBreakdownStructureItem item = new WorkBreakdownStructureItem();
item.setProcessIdent( (String)entry[ 0 ] );
item.setActivityName( (String)entry[ 1 ] );
beansList.add( item );
}
return beansList;
}
@SuppressWarnings("unchecked")
@Override
public List<ResourcesCostPlanItem> generateResourcesCostPlanReport(String processIdent) {
List<Object[]> data = reportDAO.getResourcesCostPlanData(processIdent);
if ( data.size() == 0 )
return new ArrayList<ResourcesCostPlanItem>();
List<ResourcesCostPlanItem> beansList = new ArrayList<ResourcesCostPlanItem>();
for ( Object[] entry : data ) {
ResourcesCostPlanItem item = new ResourcesCostPlanItem();
item.setProcessIdent( (String)entry[ 0 ] );
List<ResourcesCostExclusiveItem> exclusives = new ArrayList<ResourcesCostExclusiveItem>();
List<ResourcesCostConsumableItem> consumables = new ArrayList<ResourcesCostConsumableItem>();
List<ResourcesCostShareableItem> shareables = new ArrayList<ResourcesCostShareableItem>();
List<ResourcesCostHumanItem> humans = new ArrayList<ResourcesCostHumanItem>();
List<AllocableActivityItem> activities = new ArrayList<AllocableActivityItem>();
for (Object[] exclusiveArray : (List<Object[]>)entry[ 1 ] ) {
ResourcesCostExclusiveItem exclusive = new ResourcesCostExclusiveItem();
exclusive.setName( (String)exclusiveArray[ 0 ] );
exclusive.setDescription( (String)exclusiveArray[ 1 ] );
exclusive.setCost( ((Float)exclusiveArray[ 2 ]).doubleValue() );
exclusives.add( exclusive );
}
for (Object[] consumableArray : (List<Object[]>)entry[ 2 ] ) {
ResourcesCostConsumableItem consumable = new ResourcesCostConsumableItem();
consumable.setName( (String)consumableArray[ 0 ] );
consumable.setDescription( (String)consumableArray[ 1 ] );
consumable.setAmount( ((Float)consumableArray[ 2 ]).doubleValue() );
consumable.setCost( ((Float)consumableArray[ 3 ]).doubleValue() );
consumables.add( consumable );
}
SimpleDateFormat formatter = new SimpleDateFormat( "dd/MM/yyyy" );
for (Object[] shareableArray : (List<Object[]>)entry[ 3 ] ) {
ResourcesCostShareableItem shareable = new ResourcesCostShareableItem();
shareable.setName( (String)shareableArray[ 0 ] );
shareable.setDescription( (String)shareableArray[ 1 ] );
shareable.setBeginDate( ( shareableArray[ 2 ] != null ? formatter.format( (Date)shareableArray[ 2 ] ) : "" ) );
shareable.setEndDate( ( shareableArray[ 3 ] != null ? formatter.format( (Date)shareableArray[ 3 ] ) : "" ) );
shareables.add( shareable );
}
for (Object[] humanArray : (List<Object[]>)entry[ 4 ] ) {
ResourcesCostHumanItem human = new ResourcesCostHumanItem();
human.setName( (String)humanArray[ 0 ] );
human.setCostHour( (Double)humanArray[ 1 ] );
human.setWorkingHours( (Double)humanArray[ 2 ] );
human.setEstimatedHours( (Double)humanArray[ 3 ] );
humans.add( human );
}
for (Object[] activityArray : (List<Object[]>)entry[ 5 ] ) {
AllocableActivityItem activity = new AllocableActivityItem();
activity.setActivityIdent( (String)activityArray[ 0 ] );
activity.setActivityName( (String)activityArray[ 1 ] );
activity.setEstimatedHours( (Double)activityArray[ 2 ] );
activities.add( activity );
}
item.setExclusives( exclusives );
item.setConsumables( consumables );
item.setShareables( shareables );
item.setHumans( humans );
item.setActivities(activities);
beansList.add( item );
}
return beansList;
}
}
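The report methods above repeat the pattern `entry[i] != null ? formatter.format(entry[i]) : ""` for nullable date columns. A minimal sketch of a helper capturing that pattern; the class and method names (`DateFormatSketch`, `formatOrEmpty`) are hypothetical illustrations, not part of this repo:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;

// Hypothetical helper (not in the repo): format a nullable date-like
// value with the dd/MM/yyyy pattern used by the report methods above.
public class DateFormatSketch {

    static String formatOrEmpty(Object value) {
        // SimpleDateFormat is not thread-safe, so create one per call here.
        SimpleDateFormat formatter = new SimpleDateFormat("dd/MM/yyyy");
        return value != null ? formatter.format(value) : "";
    }

    public static void main(String[] args) {
        System.out.println(formatOrEmpty(null).isEmpty()); // true
        Object date = new GregorianCalendar(2020, Calendar.JANUARY, 15).getTime();
        System.out.println(formatOrEmpty(date)); // 15/01/2020
    }
}
```

This keeps the null check in one place instead of repeating the ternary at every `setProjectBegin`/`setProjectEnd`-style call site.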
<file_sep>/src/main/java/br/ufpa/labes/spm/service/CalendarService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.service.dto.CalendarDTO;
import java.util.List;
import java.util.Optional;
/**
* Service Interface for managing {@link br.ufpa.labes.spm.domain.Calendar}.
*/
public interface CalendarService {
/**
* Save a calendar.
*
* @param calendarDTO the entity to save.
* @return the persisted entity.
*/
CalendarDTO save(CalendarDTO calendarDTO);
/**
* Get all the calendars.
*
* @return the list of entities.
*/
List<CalendarDTO> findAll();
/**
* Get the "id" calendar.
*
* @param id the id of the entity.
* @return the entity.
*/
Optional<CalendarDTO> findOne(Long id);
/**
* Delete the "id" calendar.
*
* @param id the id of the entity.
*/
void delete(Long id);
}
<file_sep>/util_files/entity_rename.sh
#!/bin/bash
for file in $(find . -regextype posix-extended -regex '.*DAO\.java'); do
echo "$file"
replace=$(echo "$file" | sed -r 's/(.*)DAO\.java/\1RepositoryQuery.java/')
mv "$file" "$replace"
done
echo 'Finished renaming entities.'
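The rename substitution in the script above can be exercised on a sample path in isolation (GNU sed's `-r` extended-regex flag, as in the script; BSD sed would need `-E`). The example filename is made up for illustration:

```shell
# Run the script's rename pattern on one example filename.
file="./src/main/java/CalendarDAO.java"
replace=$(echo "$file" | sed -r 's/(.*)DAO\.java/\1RepositoryQuery.java/')
echo "$replace"   # ./src/main/java/CalendarRepositoryQuery.java
```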
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/calendar/CalendarDAO.java
package br.ufpa.labes.spm.repository.impl.calendar;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.calendar.ICalendarDAO;
import br.ufpa.labes.spm.domain.Calendar;
public class CalendarDAO extends BaseDAO<Calendar, String> implements ICalendarDAO {
protected CalendarDAO(Class<Calendar> businessClass) {
super(businessClass);
}
public CalendarDAO() {
super(Calendar.class);
}
@Override
public String validNameProject(String name, Integer project_Oid) {
// TODO Auto-generated method stub
return null;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/taskagenda/ProcessAgendaDAO.java
package br.ufpa.labes.spm.repository.impl.taskagenda;
import java.util.Collection;
import java.util.Iterator;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.taskagenda.IProcessAgendaDAO;
import br.ufpa.labes.spm.repository.interfaces.taskagenda.ITaskDAO;
import br.ufpa.labes.spm.domain.Normal;
import br.ufpa.labes.spm.domain.ProcessAgenda;
import br.ufpa.labes.spm.domain.Task;
public class ProcessAgendaDAO extends BaseDAO<ProcessAgenda, Integer> implements IProcessAgendaDAO {
ITaskDAO taskDAO;
protected ProcessAgendaDAO(Class<ProcessAgenda> businessClass) {
super(businessClass);
}
public ProcessAgendaDAO() {
super(ProcessAgenda.class);
}
// TODO: HAVE TO FINISH THE INTEGRATION TO MESSAGE SERVICES
public Task addTask(ProcessAgenda pAgenda, Normal actNorm) {
// pAgenda = this.retrieve(pAgenda.getId());
Collection<Task> tasks = pAgenda.getTheTasks();
Iterator<Task> iter = tasks.iterator();
boolean has = false;
Task returnTask = null;
while (iter.hasNext()) {
Task taskAux = iter.next();
if (taskAux != null) {
if (taskAux.getTheNormal().getIdent().equals(actNorm.getIdent())) {
returnTask = taskAux;
has = true;
break;
}
}
}
if (!has) {
Task task = new Task();
task.setTheNormal(actNorm);
actNorm.getTheTasks().add(task);
task.setTheProcessAgenda(pAgenda);
pAgenda.getTheTasks().add(task);
task = (Task) taskDAO.daoSave(task);
//
// System.out.println("Error on save Task: ProcessAgenda.addTask("+
// actNorm.getIdent()+")");
// e1.printStackTrace();
// Notify the agente of the task!!
String message =
"<MESSAGE>"
+ "<NOTIFY>"
+ "<OID>"
+ task.getId()
+ "</OID>"
+ "<TYPE>ADD</TYPE>"
+ "<CLASS>"
+ task.getClass().getName()
+ "</CLASS>"
+ "<BY>APSEE_Manager</BY>"
+ "</NOTIFY>"
+ "</MESSAGE>";
/*
* try { if(ProcessAgenda.remote==null){ reloadRemote(); }
* if(ProcessAgenda.remote!=null){
* ProcessAgenda.remote.sendMessageToUser(message, this
* .getTheTaskAgenda().getTheAgent().getIdent()); }
*/
returnTask = task;
}
return returnTask;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/chat/MessageDAO.java
package br.ufpa.labes.spm.repository.impl.chat;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.chat.IChatMessageDAO;
import br.ufpa.labes.spm.domain.ChatMessage;
public class MessageDAO extends BaseDAO<ChatMessage, Integer> implements IChatMessageDAO {
protected MessageDAO(Class<ChatMessage> businessClass) {
super(businessClass);
}
public MessageDAO() {
super(ChatMessage.class);
}
}
<file_sep>/tmp/service/impl/TypeServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.Hashtable;
import java.util.Iterator;
import java.util.List;
import javax.persistence.Query;
import org.qrconsult.spm.beans.editor.WebAPSEEVO;
import org.qrconsult.spm.converter.core.Converter;
import org.qrconsult.spm.converter.core.ConverterImpl;
import br.ufpa.labes.spm.exceptions.ImplementationException;
import br.ufpa.labes.spm.repository.interfaces.types.IAbilityTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IActivityTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IArtifactTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IConnectionTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IEventTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IWorkGroupTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IKnowledgeTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IMetricTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IPolicyTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IResourceTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IRoleTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IToolTypeDAO;
import br.ufpa.labes.spm.repository.interfaces.types.ITypeDAO;
import br.ufpa.labes.spm.service.dto.TypeDTO;
import br.ufpa.labes.spm.service.dto.TypesDTO;
import br.ufpa.labes.spm.domain.AbilityType;
import br.ufpa.labes.spm.domain.ActivityType;
import br.ufpa.labes.spm.domain.ArtifactType;
import br.ufpa.labes.spm.domain.ConnectionType;
import br.ufpa.labes.spm.domain.EventType;
import br.ufpa.labes.spm.domain.WorkGroupType;
import br.ufpa.labes.spm.domain.KnowledgeType;
import br.ufpa.labes.spm.domain.MetricType;
import br.ufpa.labes.spm.domain.PolicyType;
import br.ufpa.labes.spm.domain.ResourceType;
import br.ufpa.labes.spm.domain.RoleType;
import br.ufpa.labes.spm.domain.ToolType;
import br.ufpa.labes.spm.domain.Type;
import br.ufpa.labes.spm.service.interfaces.TypeServices;
import br.ufpa.labes.spm.util.Translator;
public class TypeServicesImpl implements TypeServices{
ITypeDAO typeDAO;
IArtifactTypeDAO artTypeDAO;
IActivityTypeDAO actTypeDAO;
IToolTypeDAO toolTypeDAO;
IWorkGroupTypeDAO grpTypeDAO;
IMetricTypeDAO metTypeDAO;
IAbilityTypeDAO abiTypeDAO;
IRoleTypeDAO roleTypeDAO;
IResourceTypeDAO resTypeDAO;
IConnectionTypeDAO conTypeDAO;
IKnowledgeTypeDAO knowTypeDAO;
IEventTypeDAO eveTypeDAO;
IPolicyTypeDAO polTypeDAO;
Hashtable<String, Class<?>> typeClasses = new Hashtable<String, Class<?>>();
public TypeServicesImpl(){
typeClasses.put("AbilityType", AbilityType.class);
typeClasses.put("ActivityType", ActivityType.class);
typeClasses.put("ArtifactType", ArtifactType.class);
typeClasses.put("ConnectionType", ConnectionType.class);
typeClasses.put("EventType", EventType.class);
typeClasses.put("WorkGroupType", WorkGroupType.class);
typeClasses.put("PolicyType", KnowledgeType.class);
typeClasses.put("MetricType", MetricType.class);
typeClasses.put("PolicyType", PolicyType.class);
typeClasses.put("ResourceType", ResourceType.class);
typeClasses.put("RoleType", RoleType.class);
typeClasses.put("ToolType", ToolType.class);
}
@Override
public String[] getRootTypes(String typeClassName) {
String internalPackageName = Translator.getInternalPackageName(typeClassName);
String hql = "select type.ident from " + internalPackageName + " as type where type.superType is null";
Query query = typeDAO.getPersistenceContext().createQuery(hql);
List<String> list = query.getResultList();
String[] ret = new String[list.size()];
list.toArray(ret);
return ret;
}
@Override
public String[] getSubTypes(String typeIdent) {
String hql = "select type.ident from " + Type.class.getName() + " as type where type.superType is not null and type.superType.ident=:ident ";
Query query = typeDAO.getPersistenceContext().createQuery(hql);
query.setParameter("ident", typeIdent);
List<String> list = query.getResultList();
String[] ret = new String[list.size()];
list.toArray(ret);
return ret;
}
@Override
public TypesDTO getTypes() {
Class<?>[] typeClasses = { AbilityType.class, ActivityType.class,
ArtifactType.class, ConnectionType.class, EventType.class,
WorkGroupType.class, KnowledgeType.class, MetricType.class,
PolicyType.class, ResourceType.class, RoleType.class,
ToolType.class };
String hql;
Query query;
List<List<Type>> typesLists = new ArrayList<List<Type>>();
int sizeList = 0;
for (int i = 0; i < typeClasses.length; i++) {
Class<?> class_ = typeClasses[i];
hql = "from " + class_.getName();
query = typeDAO.getPersistenceContext().createQuery(hql);
typesLists.add(query.getResultList());
sizeList += typesLists.get(i).size();
}
TypesDTO typesDTO = new TypesDTO(sizeList);
int j = 0;
for (int i = 0; i < typesLists.size(); i++) {
List<Type> list = typesLists.get(i);
for (Type type : list) {
String typeIdent = type.getIdent();
String superTypeIdent = (type.getSuperType()!=null ? type.getSuperType().getIdent() : "");
String rootType = typeClasses[i].getSimpleName();
typesDTO.addType(typeIdent, superTypeIdent, rootType, j);
j++;
}
}
return typesDTO;
}
@Override
public TypeDTO getType(String typeIdent) {
try{
Type type = typeDAO.retrieveBySecondaryKey(typeIdent);
if(type != null) {
Converter converter = new ConverterImpl();
TypeDTO typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
typeDTO.setSuperTypeIdent(type.getSuperType()!=null ? type.getSuperType().getIdent() : "");
typeDTO.setRootType(getInstanceOfType(type));
return typeDTO;
}
}catch(ImplementationException e){
e.printStackTrace();
}
return null;
}
private String getInstanceOfType(Type type) {
if (type instanceof ArtifactType)
return ArtifactType.class.getSimpleName();
else if (type instanceof ActivityType)
return ActivityType.class.getSimpleName();
else if (type instanceof AbilityType)
return AbilityType.class.getSimpleName();
else if (type instanceof ToolType)
return ToolType.class.getSimpleName();
else if (type instanceof MetricType)
return MetricType.class.getSimpleName();
else if (type instanceof RoleType)
return RoleType.class.getSimpleName();
else if (type instanceof WorkGroupType)
return WorkGroupType.class.getSimpleName();
else
return "";
}
@Override
public TypeDTO saveType(TypeDTO typeDTO) {
try {
String typeIdent = typeDTO.getIdent();
String typeClass = typeDTO.getRootType();
if (typeIdent != null && !typeIdent.equals("") && typeClass != null && !typeClass.equals("")) {
if (typeClass.equals(ActivityType.class.getSimpleName())) {
ActivityType type = actTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (ActivityType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (ActivityType) converter.getEntity(typeDTO, ActivityType.class);
type = actTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
ActivityType superType = actTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
actTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
} else if (typeClass.equals(ArtifactType.class.getSimpleName())) {
ArtifactType type = artTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (ArtifactType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (ArtifactType) converter.getEntity(typeDTO, ArtifactType.class);
type = artTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
ArtifactType superType = artTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
artTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
} else if (typeClass.equals(AbilityType.class.getSimpleName())) {
AbilityType type = abiTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (AbilityType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (AbilityType) converter.getEntity(typeDTO, AbilityType.class);
type = abiTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
AbilityType superType = abiTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
abiTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
} else if (typeClass.equals(WorkGroupType.class.getSimpleName())) {
WorkGroupType type = grpTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (WorkGroupType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (WorkGroupType) converter.getEntity(typeDTO, WorkGroupType.class);
type = grpTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
WorkGroupType superType = grpTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
grpTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
} else if (typeClass.equals(RoleType.class.getSimpleName())) {
RoleType type = roleTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (RoleType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (RoleType) converter.getEntity(typeDTO, RoleType.class);
type = roleTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
RoleType superType = roleTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
roleTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
} else if (typeClass.equals(ResourceType.class.getSimpleName())) {
ResourceType type = resTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (ResourceType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (ResourceType) converter.getEntity(typeDTO, ResourceType.class);
type = resTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
ResourceType superType = resTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
resTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
} else if (typeClass.equals(ToolType.class.getSimpleName())) {
ToolType type = toolTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (ToolType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (ToolType) converter.getEntity(typeDTO, ToolType.class);
type = toolTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
ToolType superType = toolTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
toolTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
} else if (typeClass.equals(MetricType.class.getSimpleName())) {
MetricType type = metTypeDAO.retrieveBySecondaryKey(typeIdent);
Converter converter = new ConverterImpl();
if (type != null) {
type = (MetricType) converter.getEntity(typeDTO, type);
typeDTO.setSubtypesNumber(type.getSubTypes().size());
} else {
type = (MetricType) converter.getEntity(typeDTO, MetricType.class);
type = metTypeDAO.daoSave(type);
}
String superTypeIdent = typeDTO.getSuperTypeIdent();
if (superTypeIdent != null && !superTypeIdent.equals("")) {
MetricType superType = metTypeDAO.retrieveBySecondaryKey(superTypeIdent);
type.setSuperType(superType);
}
metTypeDAO.update(type);
typeDTO = (TypeDTO) converter.getDTO(type, TypeDTO.class);
}else return null;
return typeDTO;
}
} catch (ImplementationException e) {
e.printStackTrace();
}
return null;
}
@Override
public Boolean removeType(String typeIdent) {
Type type = typeDAO.retrieveBySecondaryKey(typeIdent);
if(type != null) {
typeDAO.daoDelete(type);
return true;
} else {
return false;
}
}
@Override
public TypesDTO getTypes(String termoBusca, String domainFilter, boolean orgFilter) {
System.out.println("Param�tros: " + termoBusca + ", " + domainFilter + ", " + orgFilter);
String hql;
Query query;
List<List<Type>> typesLists = new ArrayList<List<Type>>();
List<Class<?>> returnedClasses = new ArrayList<Class<?>>();
int sizeLists = 0;
if(domainFilter!=null){
Class<?> currentClass = typeClasses.get(domainFilter);
hql = "select type from " + currentClass.getName() + " as type where type.ident like :termo and type.userDefined is not :orgFilter";
query = typeDAO.getPersistenceContext().createQuery(hql);
query.setParameter("orgFilter", orgFilter);
query.setParameter("termo", "%"+ termoBusca + "%");
List<Type> resultList = query.getResultList();
typesLists.add(resultList);
returnedClasses.add(currentClass);
sizeLists = resultList.size();
}else{
Iterator<Class<?>> classIterator = typeClasses.values().iterator();
while (classIterator.hasNext()) {
Class<?> class_ = classIterator.next();
hql = "select type from " + class_.getName() + " as type where type.ident like :termo and type.userDefined is not :orgFilter";
query = typeDAO.getPersistenceContext().createQuery(hql);
query.setParameter("orgFilter", orgFilter);
query.setParameter("termo", "%"+ termoBusca + "%");
List<Type> resultList = query.getResultList();
typesLists.add(resultList);
returnedClasses.add(class_);
sizeLists += resultList.size();
}
}
TypesDTO typesDTO = new TypesDTO(sizeLists);
for (int i = 0; i < typesLists.size(); i++) {
List<Type> list = typesLists.get(i);
int j = 0;
for (Iterator<Type> iterator = list.iterator(); iterator.hasNext();) {
Type type = (Type) iterator.next();
String typeIdent = type.getIdent();
System.out.println("Tipo Selecionado: " + typeIdent);
String superTypeIdent = (type.getSuperType()!=null ? type.getSuperType().getIdent() : "");
String rootType = returnedClasses.get(i).getSimpleName();
typesDTO.addType(typeIdent, superTypeIdent, rootType, j);
j++;
}
}
return typesDTO;
}
public static void main(String[] args){
TypeServicesImpl impl = new TypeServicesImpl();
TypesDTO types = impl.getTypes("", null, false);
for (int i = 0; i < types.getTypeIdents().length; i++) {
System.out.println("Tipo Selecionado: " + types.getTypeIdents()[i]);
}
}
@Override
public List<WebAPSEEVO> getTypesVO(String domain) {
String internalPackageName = Translator.getInternalPackageName(domain);
String hql = "select type.ident from " + internalPackageName + " as type order by type.ident";
Query query = typeDAO.getPersistenceContext().createQuery(hql);
List<String> list = query.getResultList();
List<WebAPSEEVO> typesVO = new ArrayList<WebAPSEEVO>();
for(String ident : list){
WebAPSEEVO webapseeVO = new WebAPSEEVO();
webapseeVO.setIdent(ident);
typesVO.add(webapseeVO);
}
System.out.println(" # ArtifactTypes: " + typesVO.size());
return typesVO;
}
}
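The eight branches of `saveType` above are structurally identical, differing only in the concrete type class and DAO. Assuming the DAOs share retrieve/save/update operations (as the `IBaseDAO` implementations elsewhere in this repo do), the duplication could be collapsed with a generic helper. This is a sketch under that assumption; `TypeRepo`, `GenericSaveSketch`, and `MapRepo` are hypothetical stand-ins, not repo APIs:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch of collapsing the duplicated saveType branches with generics.
// TypeRepo/MapRepo are hypothetical stand-ins for the repo's DAO interfaces.
public class GenericSaveSketch {

    interface TypeRepo<T> {
        T retrieveBySecondaryKey(String ident);
        T daoSave(T entity);
        void update(T entity);
    }

    // One generic method replaces the per-type branches: look up by ident,
    // create via the factory if absent, then update and return the entity.
    static <T> T saveOrUpdate(TypeRepo<T> repo, String ident, Supplier<T> factory) {
        T entity = repo.retrieveBySecondaryKey(ident);
        if (entity == null) {
            entity = repo.daoSave(factory.get());
        }
        repo.update(entity);
        return entity;
    }

    // Minimal in-memory repo, just enough to demonstrate the flow.
    static class MapRepo implements TypeRepo<String> {
        final Map<String, String> store = new HashMap<>();
        public String retrieveBySecondaryKey(String ident) { return store.get(ident); }
        public String daoSave(String entity) { store.put(entity, entity); return entity; }
        public void update(String entity) { store.put(entity, entity); }
    }

    public static void main(String[] args) {
        MapRepo repo = new MapRepo();
        System.out.println(saveOrUpdate(repo, "ActivityType", () -> "ActivityType"));
    }
}
```

In the real service, the per-type converter calls and `setSuperType` lookup would move into the same helper, parameterized by the DAO and target class.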
<file_sep>/tmp/service/impl/MetricDefinitionServiceImpl.java
package br.ufpa.labes.spm.service.impl;
import br.ufpa.labes.spm.service.MetricDefinitionService;
import br.ufpa.labes.spm.domain.MetricDefinition;
import br.ufpa.labes.spm.repository.MetricDefinitionRepository;
import br.ufpa.labes.spm.service.dto.MetricDefinitionDTO;
import br.ufpa.labes.spm.service.mapper.MetricDefinitionMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link MetricDefinition}.
*/
@Service
@Transactional
public class MetricDefinitionServiceImpl implements MetricDefinitionService {
private final Logger log = LoggerFactory.getLogger(MetricDefinitionServiceImpl.class);
private final MetricDefinitionRepository metricDefinitionRepository;
private final MetricDefinitionMapper metricDefinitionMapper;
public MetricDefinitionServiceImpl(MetricDefinitionRepository metricDefinitionRepository, MetricDefinitionMapper metricDefinitionMapper) {
this.metricDefinitionRepository = metricDefinitionRepository;
this.metricDefinitionMapper = metricDefinitionMapper;
}
/**
* Save a metricDefinition.
*
* @param metricDefinitionDTO the entity to save.
* @return the persisted entity.
*/
@Override
public MetricDefinitionDTO save(MetricDefinitionDTO metricDefinitionDTO) {
log.debug("Request to save MetricDefinition : {}", metricDefinitionDTO);
MetricDefinition metricDefinition = metricDefinitionMapper.toEntity(metricDefinitionDTO);
metricDefinition = metricDefinitionRepository.save(metricDefinition);
return metricDefinitionMapper.toDto(metricDefinition);
}
/**
* Get all the metricDefinitions.
*
* @return the list of entities.
*/
@Override
@Transactional(readOnly = true)
public List<MetricDefinitionDTO> findAll() {
log.debug("Request to get all MetricDefinitions");
return metricDefinitionRepository.findAll().stream()
.map(metricDefinitionMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one metricDefinition by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Override
@Transactional(readOnly = true)
public Optional<MetricDefinitionDTO> findOne(Long id) {
log.debug("Request to get MetricDefinition : {}", id);
return metricDefinitionRepository.findById(id)
.map(metricDefinitionMapper::toDto);
}
/**
* Delete the metricDefinition by id.
*
* @param id the id of the entity.
*/
@Override
public void delete(Long id) {
log.debug("Request to delete MetricDefinition : {}", id);
metricDefinitionRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processKnowledge/ArtifactMetricDAO.java
package br.ufpa.labes.spm.repository.impl.processKnowledge;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processKnowledge.IArtifactMetricDAO;
import br.ufpa.labes.spm.domain.ArtifactMetric;
public class ArtifactMetricDAO extends BaseDAO<ArtifactMetric, Integer>
implements IArtifactMetricDAO {
protected ArtifactMetricDAO(Class<ArtifactMetric> businessClass) {
super(businessClass);
}
public ArtifactMetricDAO() {
super(ArtifactMetric.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ClassMethodCallService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ClassMethodCall;
import br.ufpa.labes.spm.repository.ClassMethodCallRepository;
import br.ufpa.labes.spm.service.dto.ClassMethodCallDTO;
import br.ufpa.labes.spm.service.mapper.ClassMethodCallMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ClassMethodCall}.
*/
@Service
@Transactional
public class ClassMethodCallService {
private final Logger log = LoggerFactory.getLogger(ClassMethodCallService.class);
private final ClassMethodCallRepository classMethodCallRepository;
private final ClassMethodCallMapper classMethodCallMapper;
public ClassMethodCallService(ClassMethodCallRepository classMethodCallRepository, ClassMethodCallMapper classMethodCallMapper) {
this.classMethodCallRepository = classMethodCallRepository;
this.classMethodCallMapper = classMethodCallMapper;
}
/**
* Save a classMethodCall.
*
* @param classMethodCallDTO the entity to save.
* @return the persisted entity.
*/
public ClassMethodCallDTO save(ClassMethodCallDTO classMethodCallDTO) {
log.debug("Request to save ClassMethodCall : {}", classMethodCallDTO);
ClassMethodCall classMethodCall = classMethodCallMapper.toEntity(classMethodCallDTO);
classMethodCall = classMethodCallRepository.save(classMethodCall);
return classMethodCallMapper.toDto(classMethodCall);
}
/**
* Get all the classMethodCalls.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ClassMethodCallDTO> findAll() {
log.debug("Request to get all ClassMethodCalls");
return classMethodCallRepository.findAll().stream()
.map(classMethodCallMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one classMethodCall by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ClassMethodCallDTO> findOne(Long id) {
log.debug("Request to get ClassMethodCall : {}", id);
return classMethodCallRepository.findById(id)
.map(classMethodCallMapper::toDto);
}
/**
* Delete the classMethodCall by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ClassMethodCall : {}", id);
classMethodCallRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/IBranchANDConDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.BranchANDCon;
public interface IBranchANDConDAO extends IBaseDAO<BranchANDCon, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ConsumableService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.Consumable;
import br.ufpa.labes.spm.repository.ConsumableRepository;
import br.ufpa.labes.spm.service.dto.ConsumableDTO;
import br.ufpa.labes.spm.service.mapper.ConsumableMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link Consumable}.
*/
@Service
@Transactional
public class ConsumableService {
private final Logger log = LoggerFactory.getLogger(ConsumableService.class);
private final ConsumableRepository consumableRepository;
private final ConsumableMapper consumableMapper;
public ConsumableService(ConsumableRepository consumableRepository, ConsumableMapper consumableMapper) {
this.consumableRepository = consumableRepository;
this.consumableMapper = consumableMapper;
}
/**
* Save a consumable.
*
* @param consumableDTO the entity to save.
* @return the persisted entity.
*/
public ConsumableDTO save(ConsumableDTO consumableDTO) {
log.debug("Request to save Consumable : {}", consumableDTO);
Consumable consumable = consumableMapper.toEntity(consumableDTO);
consumable = consumableRepository.save(consumable);
return consumableMapper.toDto(consumable);
}
/**
* Get all the consumables.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ConsumableDTO> findAll() {
log.debug("Request to get all Consumables");
return consumableRepository.findAll().stream()
.map(consumableMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one consumable by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ConsumableDTO> findOne(Long id) {
log.debug("Request to get Consumable : {}", id);
return consumableRepository.findById(id)
.map(consumableMapper::toDto);
}
/**
* Delete the consumable by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete Consumable : {}", id);
consumableRepository.deleteById(id);
}
}
<file_sep>/tmp/service/impl/RoleServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import javax.persistence.NoResultException;
import javax.persistence.Query;
import org.qrconsult.spm.converter.core.Converter;
import org.qrconsult.spm.converter.core.ConverterImpl;
import br.ufpa.labes.spm.exceptions.ImplementationException;
import br.ufpa.labes.spm.repository.interfaces.agent.IAbilityDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IAgentPlaysRoleDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IRoleDAO;
import br.ufpa.labes.spm.repository.interfaces.agent.IRoleNeedsAbilityDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IRoleTypeDAO;
import br.ufpa.labes.spm.service.dto.AbilityDTO;
import br.ufpa.labes.spm.service.dto.AgentDTO;
import br.ufpa.labes.spm.service.dto.RoleDTO;
import br.ufpa.labes.spm.service.dto.RolesDTO;
import br.ufpa.labes.spm.domain.Ability;
import br.ufpa.labes.spm.domain.Agent;
import br.ufpa.labes.spm.domain.AgentAffinityAgent;
import br.ufpa.labes.spm.domain.AgentHasAbility;
import br.ufpa.labes.spm.domain.AgentPlaysRole;
import br.ufpa.labes.spm.domain.WorkGroup;
import br.ufpa.labes.spm.domain.Role;
import br.ufpa.labes.spm.domain.RoleNeedsAbility;
import br.ufpa.labes.spm.domain.RoleType;
import br.ufpa.labes.spm.service.interfaces.RoleServices;
public class RoleServicesImpl implements RoleServices {
private static final String AGENT_PLAY_ROLE_CLASSNAME = AgentPlaysRole.class
.getSimpleName();
private static final String ROLE_CLASSNAME = Role.class.getSimpleName();
IAbilityDAO abilityDAO;
IRoleDAO roleDAO;
IRoleTypeDAO roleTypeDAO;
IRoleNeedsAbilityDAO roleNeedsDAO;
IAgentPlaysRoleDAO agentPlaysRole;
Converter converter = new ConverterImpl();
@SuppressWarnings("null")
public RoleDTO getRole(String nameRole) {
String hql;
Query query;
List<Role> result = null;
List<Ability> abis = new ArrayList<Ability>();
List<AbilityDTO> abisDTO = new ArrayList<AbilityDTO>();
List<Agent> agens = new ArrayList<Agent>();
List<AgentDTO> agensDTO = new ArrayList<AgentDTO>();
RoleDTO roleDTO = null;
AbilityDTO abi = null;
AgentDTO agen = null;
hql = "select role from " + Role.class.getSimpleName()
+ " as role where role.name = :rolname";
query = roleDAO.getPersistenceContext().createQuery(hql);
query.setParameter("rolname", nameRole);
result = query.getResultList();
if (result.isEmpty()) {
return null;
}
Collection<RoleNeedsAbility> lis = result.get(0)
.getTheRoleNeedsAbilities();
Collection<AgentPlaysRole> lisaux = result.get(0)
.getTheAgentPlaysRoles();
for (RoleNeedsAbility roleNeedsAbility : lis) {
abis.add(roleNeedsAbility.getTheAbility());
}
Converter converter = new ConverterImpl();
for (int i = 0; i < abis.size(); i++) {
try {
abi = (AbilityDTO) converter.getDTO(abis.get(i),
AbilityDTO.class);
abisDTO.add(abi);
} catch (ImplementationException e) {
e.printStackTrace();
}
}
for (AgentPlaysRole agentPlaysRole : lisaux) {
agens.add(agentPlaysRole.getTheAgent());
}
for (int i = 0; i < agens.size(); i++) {
try {
agen = (AgentDTO) converter
.getDTO(agens.get(i), AgentDTO.class);
agensDTO.add(agen);
} catch (ImplementationException e) {
e.printStackTrace();
}
}
try {
roleDTO = (RoleDTO) converter.getDTO(result.get(0), RoleDTO.class);
if (abis != null) {
roleDTO.setAbilityToRole(abisDTO);
}
if (agens != null) {
roleDTO.setAgentToRole(agensDTO);
}
} catch (ImplementationException e) {
e.printStackTrace();
}
if (result.get(0).getSubordinate() != null) {
roleDTO.setSuperType(result.get(0).getSubordinate().getName());
}
return roleDTO;
}
public Role getRoleForName(String nome) {
String hql;
Query query;
List<Role> result = null;
hql = "select role from " + Role.class.getSimpleName()
+ " as role where role.name = :rolname";
query = roleDAO.getPersistenceContext().createQuery(hql);
query.setParameter("rolname", nome);
result = query.getResultList();
if (!result.isEmpty()) {
Role role = result.get(0);
return role;
} else {
return null;
}
}
@SuppressWarnings("null")
public List<AbilityDTO> getAbilitysToRole() {
String hql;
Query query;
List<Ability> result = null;
List<AbilityDTO> resultDTO = new ArrayList<AbilityDTO>();
hql = "select ability from " + Ability.class.getSimpleName()
+ " as ability";
query = roleDAO.getPersistenceContext().createQuery(hql);
result = query.getResultList();
Converter converter = new ConverterImpl();
if (!result.isEmpty()) {
AbilityDTO abi = null;
for (int i = 0; i < result.size(); i++) {
try {
abi = (AbilityDTO) converter.getDTO(result.get(i),
AbilityDTO.class);
resultDTO.add(abi);
} catch (ImplementationException e) {
e.printStackTrace();
}
}
return resultDTO;
} else {
return null;
}
}
public RoleDTO saveRole(RoleDTO roleDTO) {
Converter converter = new ConverterImpl();
try {
String typeRole = roleDTO.getIdent();
String roleName = roleDTO.getName();
String roleSubName = roleDTO.getSuperType();
if (typeRole != null && !typeRole.equals("") && roleName != null
&& !roleName.equals("")) {
RoleType roleType = roleTypeDAO
.retrieveBySecondaryKey(typeRole);
Role role = this.getRoleForName(roleName);
Role roleSub = this.getRoleForName(roleSubName);
if (roleType != null) {
if (role != null) {
role = (Role) converter.getEntity(roleDTO, role);
if (roleSub != null) {
role.setSubordinate(roleSub);
}
role.setTheRoleType(roleType);
role = roleDAO.update(role);
} else {
role = (Role) converter.getEntity(roleDTO, Role.class);
if (roleSub != null) {
role.setSubordinate(roleSub);
}
role.setTheRoleType(roleType);
role = roleDAO.daoSave(role);
}
roleDTO = (RoleDTO) converter.getDTO(role, RoleDTO.class);
} else {
return null;
}
} else {
System.out.println("saveRole: missing role type or name " + typeRole + ":" + roleName);
return null;
}
} catch (ImplementationException e) {
e.printStackTrace();
}
return roleDTO;
}
public RoleDTO saveAbilityToRole(RoleDTO roleDTO) {
Converter converter = new ConverterImpl();
try {
// String typeRole = roleDTO.getIdent();
String roleName = roleDTO.getName();
// String roleSubName = roleDTO.getSuperType();
// RoleType roleType = roleTypeDAO.retrieveBySecondaryKey(typeRole);
Role role = this.getRoleForName(roleName);
// Role roleSub = this.getRoleForName(roleSubName);
Ability abil;
if (roleDTO.getAbilityToRole() != null
&& !roleDTO.getAbilityToRole().isEmpty() && role != null) {
Collection<RoleNeedsAbility> lisaux = role.getTheRoleNeedsAbilities();
if (lisaux != null) {
for (RoleNeedsAbility roleNeedsAbility : lisaux) {
roleNeedsDAO.daoDelete(roleNeedsAbility);
}
}
role = (Role) converter.getEntity(roleDTO, role);
RoleNeedsAbility roleNeeds;
Collection<RoleNeedsAbility> lisAbi = new ArrayList<RoleNeedsAbility>();
for (int i = 0; i < roleDTO.getAbilityToRole().size(); i++) {
roleNeeds = new RoleNeedsAbility();
abil = (Ability) converter.getEntity(roleDTO
.getAbilityToRole().get(i), Ability.class);
roleNeeds.setTheAbility(abil);
roleNeeds.setTheRole(role);
roleNeeds.setDegree(roleDTO.getNivelAbility());
lisAbi.add(roleNeeds);
}
role.setTheRoleNeedsAbility(lisAbi);
role = roleDAO.update(role);
} else {
return null;
}
roleDTO = (RoleDTO) converter.getDTO(role, RoleDTO.class);
} catch (ImplementationException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return roleDTO;
}
@SuppressWarnings("unchecked")
public Boolean removeRole(RoleDTO roleDTO) {
String hql = "select role from " + Role.class.getSimpleName()
+ " as role where role.name = :rolename";
Query query = roleDAO.getPersistenceContext().createQuery(hql);
query.setParameter("rolename", roleDTO.getName());
List<Role> result = query.getResultList();
if (result.isEmpty()) {
return false;
}
Role role = result.get(0);
if (role.getTheRoleNeedsAbilities() != null) {
for (RoleNeedsAbility ro : role.getTheRoleNeedsAbilities()) {
roleNeedsDAO.daoDelete(ro);
}
}
roleDAO.daoDelete(role);
return true;
}
public RolesDTO getRoles(String term, String domain) {
String hql;
Query query;
List<String> resultList = null;
if (domain != null) {
hql = "select role.name from "
+ Role.class.getSimpleName()
+ " as role where role.name like :termo and role.ident = :domain";
query = roleDAO.getPersistenceContext().createQuery(hql);
query.setParameter("domain", domain);
query.setParameter("termo", "%" + term + "%");
resultList = query.getResultList();
} else {
hql = "select role.name from " + Role.class.getSimpleName()
+ " as role where role.name like :termo";
query = roleDAO.getPersistenceContext().createQuery(hql);
query.setParameter("termo", "%" + term + "%");
resultList = query.getResultList();
}
RolesDTO rolesDTO = new RolesDTO();
String[] names = new String[resultList.size()];
resultList.toArray(names);
rolesDTO.setNamePapeis(names);
return rolesDTO;
}
public String[] getRoleTypes() {
String hql;
Query query;
hql = "select ident from " + RoleType.class.getSimpleName();
query = roleTypeDAO.getPersistenceContext().createQuery(hql);
List<String> abiTypeList = query.getResultList();
String[] list = new String[abiTypeList.size()];
abiTypeList.toArray(list);
return list;
}
public RolesDTO getRoles() {
String hql;
Query query;
hql = "select name from " + Role.class.getSimpleName();
query = roleDAO.getPersistenceContext().createQuery(hql);
List<String> roles = query.getResultList();
String[] list = new String[roles.size()];
roles.toArray(list);
String[] list1 = getRoleTypes();
RolesDTO rolesDTO = new RolesDTO();
rolesDTO.setNamePapeis(list);
rolesDTO.setTypePapeis(list1);
rolesDTO.setSuperType(list);
return rolesDTO;
}
@Override
public List<AgentDTO> getAgentToRole() {
String hql;
Query query;
List<Agent> result = null;
List<AgentDTO> resultDTO = new ArrayList<AgentDTO>();
hql = "select agent from " + Agent.class.getSimpleName() + " as agent";
query = roleDAO.getPersistenceContext().createQuery(hql);
result = query.getResultList();
Converter converter = new ConverterImpl();
if (!result.isEmpty()) {
AgentDTO abi = null;
for (int i = 0; i < result.size(); i++) {
try {
abi = (AgentDTO) converter.getDTO(result.get(i),
AgentDTO.class);
resultDTO.add(abi);
} catch (ImplementationException e) {
e.printStackTrace();
}
}
return resultDTO;
} else {
return null;
}
}
@Override
public RoleDTO saveAgentToRole(RoleDTO roleDTO) {
Converter converter = new ConverterImpl();
try {
String roleName = roleDTO.getName();
Role role = this.getRoleForName(roleName);
// Check for null before isEmpty() to avoid a NullPointerException.
if (roleDTO.getAgentToRole() == null
|| roleDTO.getAgentToRole().isEmpty() || role == null) {
return null;
}
// Remove the existing agent-role assignments before re-creating them.
Collection<AgentPlaysRole> lisaux = role.getTheAgentPlaysRole();
if (lisaux != null) {
for (AgentPlaysRole agentPlays : lisaux) {
this.agentPlaysRole.delete(agentPlays);
}
}
role = (Role) converter.getEntity(roleDTO, role);
Collection<AgentPlaysRole> lisAgent = new ArrayList<AgentPlaysRole>();
for (int i = 0; i < roleDTO.getAgentToRole().size(); i++) {
AgentPlaysRole roleplay = new AgentPlaysRole();
Agent agent = (Agent) converter.getEntity(roleDTO
.getAgentToRole().get(i), Agent.class);
roleplay.setTheAgent(agent);
roleplay.setTheRole(role);
lisAgent.add(roleplay);
}
role.setTheAgentPlaysRole(lisAgent);
role = roleDAO.update(role);
roleDTO = (RoleDTO) converter.getDTO(role, RoleDTO.class);
} catch (ImplementationException e) {
e.printStackTrace();
}
return roleDTO;
}
@Override
public Boolean removeAbilityRole(RoleDTO roleDTO) {
Converter converter = new ConverterImpl();
try {
String roleName = roleDTO.getName();
Role role = this.getRoleForName(roleName);
// Check for null before isEmpty() to avoid a NullPointerException.
if (roleDTO.getAbilityToRole() == null
|| roleDTO.getAbilityToRole().isEmpty() || role == null) {
return true;
}
// Delete the current ability assignments, then re-create the remaining ones.
Collection<RoleNeedsAbility> lisaux = role.getTheRoleNeedsAbilities();
if (lisaux != null) {
for (RoleNeedsAbility roleNeedsAbility : lisaux) {
roleNeedsDAO.daoDelete(roleNeedsAbility);
}
}
role = (Role) converter.getEntity(roleDTO, role);
Collection<RoleNeedsAbility> lisAbi = new ArrayList<RoleNeedsAbility>();
for (int i = 0; i < roleDTO.getAbilityToRole().size(); i++) {
RoleNeedsAbility roleNeeds = new RoleNeedsAbility();
Ability abil = (Ability) converter.getEntity(roleDTO
.getAbilityToRole().get(i), Ability.class);
roleNeeds.setTheAbility(abil);
roleNeeds.setTheRole(role);
lisAbi.add(roleNeeds);
}
role.setTheRoleNeedsAbilities(lisAbi);
role = roleDAO.update(role);
roleDTO = (RoleDTO) converter.getDTO(role, RoleDTO.class);
} catch (ImplementationException e) {
e.printStackTrace();
}
return true;
}
@Override
public Boolean removeAgentRole(RoleDTO roleDTO) {
Converter converter = new ConverterImpl();
try {
String roleName = roleDTO.getName();
Role role = this.getRoleForName(roleName);
// Check for null before isEmpty() to avoid a NullPointerException.
if (roleDTO.getAgentToRole() == null
|| roleDTO.getAgentToRole().isEmpty() || role == null) {
return true;
}
// Delete the current agent assignments, then re-create the remaining ones.
Collection<AgentPlaysRole> lisaux = role.getTheAgentPlaysRole();
if (lisaux != null) {
for (AgentPlaysRole agentPlay : lisaux) {
agentPlaysRole.delete(agentPlay);
}
}
role = (Role) converter.getEntity(roleDTO, role);
Collection<AgentPlaysRole> lisAgent = new ArrayList<AgentPlaysRole>();
// Iterate over the agent list (the original looped over the ability list by mistake).
for (int i = 0; i < roleDTO.getAgentToRole().size(); i++) {
AgentPlaysRole agentPlay = new AgentPlaysRole();
Agent agent = (Agent) converter.getEntity(roleDTO
.getAgentToRole().get(i), Agent.class);
agentPlay.setTheAgent(agent);
agentPlay.setTheRole(role);
lisAgent.add(agentPlay);
}
role.setTheAgentPlaysRole(lisAgent);
role = roleDAO.update(role);
roleDTO = (RoleDTO) converter.getDTO(role, RoleDTO.class);
} catch (ImplementationException e) {
e.printStackTrace();
}
return true;
}
public List<RoleDTO> getRolesToTree() {
String hql;
Query query;
List<Role> result = null;
List<RoleDTO> resultDTO = new ArrayList<RoleDTO>();
hql = "select role from " + Role.class.getSimpleName() + " as role";
query = roleDAO.getPersistenceContext().createQuery(hql);
result = query.getResultList();
Converter converter = new ConverterImpl();
if (!result.isEmpty()) {
RoleDTO role = null;
for (int i = 0; i < result.size(); i++) {
try {
role = (RoleDTO) converter.getDTO(result.get(i),
RoleDTO.class);
resultDTO.add(role);
} catch (ImplementationException e) {
e.printStackTrace();
}
}
return resultDTO;
} else {
return null;
}
}
@Override
public String getPerfil(RoleDTO agent, int oid) {
Query query = roleDAO
.getPersistenceContext()
.createQuery(
"SELECT r.name FROM "
+ AGENT_PLAY_ROLE_CLASSNAME
+ " AS a,"
+ ROLE_CLASSNAME
+ " AS r "
+ "WHERE a.theRole.oid=r.oid and a.theAgent.oid=" + oid);
try {
// Pick the highest-privilege role name; fall back to "Desenvolvedor".
List<?> roleNames = query.getResultList();
if (roleNames.contains("Administrador")) {
return "Administrador";
}
if (roleNames.contains("Gerente")) {
return "Gerente";
}
return "Desenvolvedor";
} catch (Exception e) {
e.printStackTrace();
return null;
}
}
private RoleDTO convertRoleToAgentDTO(Role role) {
try {
return (RoleDTO) converter.getDTO(role, RoleDTO.class);
} catch (ImplementationException e) {
e.printStackTrace();
}
return null;
}
}
<file_sep>/tmp/service/mapper/RelationshipKindMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.RelationshipKindDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link RelationshipKind} and its DTO {@link RelationshipKindDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface RelationshipKindMapper extends EntityMapper<RelationshipKindDTO, RelationshipKind> {
@Mapping(target = "theAssetRelationships", ignore = true)
@Mapping(target = "removeTheAssetRelationship", ignore = true)
RelationshipKind toEntity(RelationshipKindDTO relationshipKindDTO);
default RelationshipKind fromId(Long id) {
if (id == null) {
return null;
}
RelationshipKind relationshipKind = new RelationshipKind();
relationshipKind.setId(id);
return relationshipKind;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/IBranchConCondDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.BranchConCond;
public interface IBranchConCondDAO extends IBaseDAO<BranchConCond, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/domain/enumeration/EmailNotificationConfig.java
package br.ufpa.labes.spm.domain.enumeration;
/** The EmailNotificationConfig enumeration. */
public enum EmailNotificationConfig {
NOTIFY_ALL_MANAGERS,
NOTIFY_SPECIFIC_MANAGERS,
NOT_NOTIFY
}
<file_sep>/tmp/service/mapper/ReqAgentRequiresAbilityMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ReqAgentRequiresAbilityDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ReqAgentRequiresAbility} and its DTO {@link ReqAgentRequiresAbilityDTO}.
*/
@Mapper(componentModel = "spring", uses = {ReqAgentMapper.class, AbilityMapper.class})
public interface ReqAgentRequiresAbilityMapper extends EntityMapper<ReqAgentRequiresAbilityDTO, ReqAgentRequiresAbility> {
@Mapping(source = "theReqAgent.id", target = "theReqAgentId")
@Mapping(source = "theAbility.id", target = "theAbilityId")
ReqAgentRequiresAbilityDTO toDto(ReqAgentRequiresAbility reqAgentRequiresAbility);
@Mapping(source = "theReqAgentId", target = "theReqAgent")
@Mapping(source = "theAbilityId", target = "theAbility")
ReqAgentRequiresAbility toEntity(ReqAgentRequiresAbilityDTO reqAgentRequiresAbilityDTO);
default ReqAgentRequiresAbility fromId(Long id) {
if (id == null) {
return null;
}
ReqAgentRequiresAbility reqAgentRequiresAbility = new ReqAgentRequiresAbility();
reqAgentRequiresAbility.setId(id);
return reqAgentRequiresAbility;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/ActivityEstimationResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.ActivityEstimationService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.ActivityEstimationDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.ActivityEstimation}.
*/
@RestController
@RequestMapping("/api")
public class ActivityEstimationResource {
private final Logger log = LoggerFactory.getLogger(ActivityEstimationResource.class);
private static final String ENTITY_NAME = "activityEstimation";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final ActivityEstimationService activityEstimationService;
public ActivityEstimationResource(ActivityEstimationService activityEstimationService) {
this.activityEstimationService = activityEstimationService;
}
/**
* {@code POST /activity-estimations} : Create a new activityEstimation.
*
* @param activityEstimationDTO the activityEstimationDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new activityEstimationDTO, or with status {@code 400 (Bad Request)} if the activityEstimation has already an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/activity-estimations")
public ResponseEntity<ActivityEstimationDTO> createActivityEstimation(@RequestBody ActivityEstimationDTO activityEstimationDTO) throws URISyntaxException {
log.debug("REST request to save ActivityEstimation : {}", activityEstimationDTO);
if (activityEstimationDTO.getId() != null) {
throw new BadRequestAlertException("A new activityEstimation cannot already have an ID", ENTITY_NAME, "idexists");
}
ActivityEstimationDTO result = activityEstimationService.save(activityEstimationDTO);
return ResponseEntity.created(new URI("/api/activity-estimations/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /activity-estimations} : Updates an existing activityEstimation.
*
* @param activityEstimationDTO the activityEstimationDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated activityEstimationDTO,
* or with status {@code 400 (Bad Request)} if the activityEstimationDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the activityEstimationDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/activity-estimations")
public ResponseEntity<ActivityEstimationDTO> updateActivityEstimation(@RequestBody ActivityEstimationDTO activityEstimationDTO) throws URISyntaxException {
log.debug("REST request to update ActivityEstimation : {}", activityEstimationDTO);
if (activityEstimationDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
ActivityEstimationDTO result = activityEstimationService.save(activityEstimationDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, activityEstimationDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /activity-estimations} : get all the activityEstimations.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of activityEstimations in body.
*/
@GetMapping("/activity-estimations")
public List<ActivityEstimationDTO> getAllActivityEstimations() {
log.debug("REST request to get all ActivityEstimations");
return activityEstimationService.findAll();
}
/**
* {@code GET /activity-estimations/:id} : get the "id" activityEstimation.
*
* @param id the id of the activityEstimationDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the activityEstimationDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/activity-estimations/{id}")
public ResponseEntity<ActivityEstimationDTO> getActivityEstimation(@PathVariable Long id) {
log.debug("REST request to get ActivityEstimation : {}", id);
Optional<ActivityEstimationDTO> activityEstimationDTO = activityEstimationService.findOne(id);
return ResponseUtil.wrapOrNotFound(activityEstimationDTO);
}
/**
* {@code DELETE /activity-estimations/:id} : delete the "id" activityEstimation.
*
* @param id the id of the activityEstimationDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/activity-estimations/{id}")
public ResponseEntity<Void> deleteActivityEstimation(@PathVariable Long id) {
log.debug("REST request to delete ActivityEstimation : {}", id);
activityEstimationService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/agent/IAgentPlaysRoleDAO.java
package br.ufpa.labes.spm.repository.interfaces.agent;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.AgentPlaysRole;
public interface IAgentPlaysRoleDAO extends IBaseDAO<AgentPlaysRole, Integer> {}
<file_sep>/tmp/service/mapper/OutOfWorkPeriodMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.OutOfWorkPeriodDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link OutOfWorkPeriod} and its DTO {@link OutOfWorkPeriodDTO}.
*/
@Mapper(componentModel = "spring", uses = {AgentMapper.class})
public interface OutOfWorkPeriodMapper extends EntityMapper<OutOfWorkPeriodDTO, OutOfWorkPeriod> {
@Mapping(source = "theAgent.id", target = "theAgentId")
OutOfWorkPeriodDTO toDto(OutOfWorkPeriod outOfWorkPeriod);
@Mapping(source = "theAgentId", target = "theAgent")
OutOfWorkPeriod toEntity(OutOfWorkPeriodDTO outOfWorkPeriodDTO);
default OutOfWorkPeriod fromId(Long id) {
if (id == null) {
return null;
}
OutOfWorkPeriod outOfWorkPeriod = new OutOfWorkPeriod();
outOfWorkPeriod.setId(id);
return outOfWorkPeriod;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/connections/BranchConDAO.java
package br.ufpa.labes.spm.repository.impl.connections;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IBranchConDAO;
import br.ufpa.labes.spm.domain.BranchCon;
public class BranchConDAO extends BaseDAO<BranchCon, String> implements IBranchConDAO {
protected BranchConDAO(Class<BranchCon> businessClass) {
super(businessClass);
}
public BranchConDAO() {
super(BranchCon.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/WorkGroupMetricService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.WorkGroupMetric;
import br.ufpa.labes.spm.repository.WorkGroupMetricRepository;
import br.ufpa.labes.spm.service.dto.WorkGroupMetricDTO;
import br.ufpa.labes.spm.service.mapper.WorkGroupMetricMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link WorkGroupMetric}.
*/
@Service
@Transactional
public class WorkGroupMetricService {
private final Logger log = LoggerFactory.getLogger(WorkGroupMetricService.class);
private final WorkGroupMetricRepository workGroupMetricRepository;
private final WorkGroupMetricMapper workGroupMetricMapper;
public WorkGroupMetricService(WorkGroupMetricRepository workGroupMetricRepository, WorkGroupMetricMapper workGroupMetricMapper) {
this.workGroupMetricRepository = workGroupMetricRepository;
this.workGroupMetricMapper = workGroupMetricMapper;
}
/**
* Save a workGroupMetric.
*
* @param workGroupMetricDTO the entity to save.
* @return the persisted entity.
*/
public WorkGroupMetricDTO save(WorkGroupMetricDTO workGroupMetricDTO) {
log.debug("Request to save WorkGroupMetric : {}", workGroupMetricDTO);
WorkGroupMetric workGroupMetric = workGroupMetricMapper.toEntity(workGroupMetricDTO);
workGroupMetric = workGroupMetricRepository.save(workGroupMetric);
return workGroupMetricMapper.toDto(workGroupMetric);
}
/**
* Get all the workGroupMetrics.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<WorkGroupMetricDTO> findAll() {
log.debug("Request to get all WorkGroupMetrics");
return workGroupMetricRepository.findAll().stream()
.map(workGroupMetricMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one workGroupMetric by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<WorkGroupMetricDTO> findOne(Long id) {
log.debug("Request to get WorkGroupMetric : {}", id);
return workGroupMetricRepository.findById(id)
.map(workGroupMetricMapper::toDto);
}
/**
* Delete the workGroupMetric by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete WorkGroupMetric : {}", id);
workGroupMetricRepository.deleteById(id);
}
}
<file_sep>/tmp/service/mapper/AssetMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.AssetDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Asset} and its DTO {@link AssetDTO}.
*/
@Mapper(componentModel = "spring", uses = {AssetStatMapper.class, AuthorMapper.class})
public interface AssetMapper extends EntityMapper<AssetDTO, Asset> {
@Mapping(source = "stats.id", target = "statsId")
@Mapping(source = "owner.id", target = "ownerId")
AssetDTO toDto(Asset asset);
@Mapping(source = "statsId", target = "stats")
@Mapping(target = "authorStats", ignore = true)
@Mapping(target = "removeAuthorStats", ignore = true)
@Mapping(target = "tagStats", ignore = true)
@Mapping(target = "removeTagStats", ignore = true)
@Mapping(target = "lessonsLearneds", ignore = true)
@Mapping(target = "removeLessonsLearned", ignore = true)
@Mapping(target = "relatedAssets", ignore = true)
@Mapping(target = "removeRelatedAssets", ignore = true)
@Mapping(target = "relatedByAssets", ignore = true)
@Mapping(target = "removeRelatedByAssets", ignore = true)
@Mapping(target = "comments", ignore = true)
@Mapping(target = "removeComments", ignore = true)
@Mapping(source = "ownerId", target = "owner")
@Mapping(target = "removeFavoritedBy", ignore = true)
@Mapping(target = "removeFollowers", ignore = true)
@Mapping(target = "removeCollaborators", ignore = true)
Asset toEntity(AssetDTO assetDTO);
default Asset fromId(Long id) {
if (id == null) {
return null;
}
Asset asset = new Asset();
asset.setId(id);
return asset;
}
}
<file_sep>/tmp/service/mapper/ReqWorkGroupMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ReqWorkGroupDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ReqWorkGroup} and its DTO {@link ReqWorkGroupDTO}.
*/
@Mapper(componentModel = "spring", uses = {WorkGroupTypeMapper.class, WorkGroupMapper.class})
public interface ReqWorkGroupMapper extends EntityMapper<ReqWorkGroupDTO, ReqWorkGroup> {
@Mapping(source = "theWorkGroupType.id", target = "theWorkGroupTypeId")
@Mapping(source = "theWorkGroup.id", target = "theWorkGroupId")
ReqWorkGroupDTO toDto(ReqWorkGroup reqWorkGroup);
@Mapping(source = "theWorkGroupTypeId", target = "theWorkGroupType")
@Mapping(source = "theWorkGroupId", target = "theWorkGroup")
ReqWorkGroup toEntity(ReqWorkGroupDTO reqWorkGroupDTO);
default ReqWorkGroup fromId(Long id) {
if (id == null) {
return null;
}
ReqWorkGroup reqWorkGroup = new ReqWorkGroup();
reqWorkGroup.setId(id);
return reqWorkGroup;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/AgentPlaysRoleResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.AgentPlaysRoleService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.AgentPlaysRoleDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.AgentPlaysRole}.
*/
@RestController
@RequestMapping("/api")
public class AgentPlaysRoleResource {
private final Logger log = LoggerFactory.getLogger(AgentPlaysRoleResource.class);
private static final String ENTITY_NAME = "agentPlaysRole";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final AgentPlaysRoleService agentPlaysRoleService;
public AgentPlaysRoleResource(AgentPlaysRoleService agentPlaysRoleService) {
this.agentPlaysRoleService = agentPlaysRoleService;
}
/**
* {@code POST /agent-plays-roles} : Create a new agentPlaysRole.
*
* @param agentPlaysRoleDTO the agentPlaysRoleDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new agentPlaysRoleDTO, or with status {@code 400 (Bad Request)} if the agentPlaysRole already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/agent-plays-roles")
public ResponseEntity<AgentPlaysRoleDTO> createAgentPlaysRole(@RequestBody AgentPlaysRoleDTO agentPlaysRoleDTO) throws URISyntaxException {
log.debug("REST request to save AgentPlaysRole : {}", agentPlaysRoleDTO);
if (agentPlaysRoleDTO.getId() != null) {
throw new BadRequestAlertException("A new agentPlaysRole cannot already have an ID", ENTITY_NAME, "idexists");
}
AgentPlaysRoleDTO result = agentPlaysRoleService.save(agentPlaysRoleDTO);
return ResponseEntity.created(new URI("/api/agent-plays-roles/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /agent-plays-roles} : Updates an existing agentPlaysRole.
*
* @param agentPlaysRoleDTO the agentPlaysRoleDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated agentPlaysRoleDTO,
* or with status {@code 400 (Bad Request)} if the agentPlaysRoleDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the agentPlaysRoleDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/agent-plays-roles")
public ResponseEntity<AgentPlaysRoleDTO> updateAgentPlaysRole(@RequestBody AgentPlaysRoleDTO agentPlaysRoleDTO) throws URISyntaxException {
log.debug("REST request to update AgentPlaysRole : {}", agentPlaysRoleDTO);
if (agentPlaysRoleDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
AgentPlaysRoleDTO result = agentPlaysRoleService.save(agentPlaysRoleDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, agentPlaysRoleDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /agent-plays-roles} : get all the agentPlaysRoles.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of agentPlaysRoles in body.
*/
@GetMapping("/agent-plays-roles")
public List<AgentPlaysRoleDTO> getAllAgentPlaysRoles() {
log.debug("REST request to get all AgentPlaysRoles");
return agentPlaysRoleService.findAll();
}
/**
* {@code GET /agent-plays-roles/:id} : get the "id" agentPlaysRole.
*
* @param id the id of the agentPlaysRoleDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the agentPlaysRoleDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/agent-plays-roles/{id}")
public ResponseEntity<AgentPlaysRoleDTO> getAgentPlaysRole(@PathVariable Long id) {
log.debug("REST request to get AgentPlaysRole : {}", id);
Optional<AgentPlaysRoleDTO> agentPlaysRoleDTO = agentPlaysRoleService.findOne(id);
return ResponseUtil.wrapOrNotFound(agentPlaysRoleDTO);
}
/**
* {@code DELETE /agent-plays-roles/:id} : delete the "id" agentPlaysRole.
*
* @param id the id of the agentPlaysRoleDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/agent-plays-roles/{id}")
public ResponseEntity<Void> deleteAgentPlaysRole(@PathVariable Long id) {
log.debug("REST request to delete AgentPlaysRole : {}", id);
agentPlaysRoleService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/MultipleConService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.MultipleCon;
import br.ufpa.labes.spm.repository.MultipleConRepository;
import br.ufpa.labes.spm.service.dto.MultipleConDTO;
import br.ufpa.labes.spm.service.mapper.MultipleConMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link MultipleCon}.
*/
@Service
@Transactional
public class MultipleConService {
private final Logger log = LoggerFactory.getLogger(MultipleConService.class);
private final MultipleConRepository multipleConRepository;
private final MultipleConMapper multipleConMapper;
public MultipleConService(MultipleConRepository multipleConRepository, MultipleConMapper multipleConMapper) {
this.multipleConRepository = multipleConRepository;
this.multipleConMapper = multipleConMapper;
}
/**
* Save a multipleCon.
*
* @param multipleConDTO the entity to save.
* @return the persisted entity.
*/
public MultipleConDTO save(MultipleConDTO multipleConDTO) {
log.debug("Request to save MultipleCon : {}", multipleConDTO);
MultipleCon multipleCon = multipleConMapper.toEntity(multipleConDTO);
multipleCon = multipleConRepository.save(multipleCon);
return multipleConMapper.toDto(multipleCon);
}
/**
* Get all the multipleCons.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<MultipleConDTO> findAll() {
log.debug("Request to get all MultipleCons");
return multipleConRepository.findAll().stream()
.map(multipleConMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one multipleCon by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<MultipleConDTO> findOne(Long id) {
log.debug("Request to get MultipleCon : {}", id);
return multipleConRepository.findById(id)
.map(multipleConMapper::toDto);
}
/**
* Delete the multipleCon by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete MultipleCon : {}", id);
multipleConRepository.deleteById(id);
}
}
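The `findAll` method above converts each entity to its DTO with a stream and collects the results into a `LinkedList`. The same pattern in isolation, with an upper-casing function standing in for the real `MultipleConMapper::toDto`:

```java
import java.util.LinkedList;
import java.util.List;
import java.util.stream.Collectors;

// Isolated sketch of the entity -> DTO stream mapping used by findAll().
// Strings and String::toUpperCase are placeholders for the real JPA
// entities and MapStruct mapper.
public class StreamMappingDemo {

    static List<String> toDtos(List<String> entities) {
        return entities.stream()
            .map(String::toUpperCase) // stands in for mapper::toDto
            .collect(Collectors.toCollection(LinkedList::new));
    }

    public static void main(String[] args) {
        System.out.println(toDtos(List.of("a", "b"))); // prints [A, B]
    }
}
```

Collecting into `LinkedList` (rather than the default `ArrayList`) is simply the JHipster template's choice; the mapping logic is unchanged either way.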
<file_sep>/tmp/service/impl/ChatMessageServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.List;
import javax.persistence.TypedQuery;
import br.ufpa.labes.spm.repository.interfaces.agent.IAgentDAO;
import br.ufpa.labes.spm.repository.interfaces.chat.IChatMessageDAO;
import br.ufpa.labes.spm.domain.Agent;
import br.ufpa.labes.spm.domain.ChatMessage;
import br.ufpa.labes.spm.service.interfaces.ChatMessageServices;
public class ChatMessageServicesImpl implements ChatMessageServices {
IAgentDAO agentDAO;
IChatMessageDAO messageDAO;
@Override
public List<ChatMessage> buscarTodasAsMensagensDaConversa(String ident) {
// Fetch every message belonging to the conversation identified by "ident".
// The query is local to the method: a shared TypedQuery field is not thread-safe.
String hql = "SELECT m FROM ChatMessage m WHERE m.ident = :ident";
TypedQuery<ChatMessage> query = messageDAO.getPersistenceContext().createQuery(hql, ChatMessage.class);
query.setParameter("ident", ident);
return query.getResultList();
}
@Override
public void salvarMensagem(ChatMessage message, String de) {
// Only persist the message when the sending agent actually exists.
Agent agenteMsg = this.getAgent(de);
if (agenteMsg != null) {
message.setFromAgent(agenteMsg);
messageDAO.daoSave(message);
}
}
private Agent getAgent(String ident) {
// Look the agent up by name; the value is bound as a named parameter
// to avoid HQL injection. Returns null when no agent matches.
String hql = "SELECT a FROM Agent a WHERE a.name = :ident";
TypedQuery<Agent> query = agentDAO.getPersistenceContext().createQuery(hql, Agent.class);
query.setParameter("ident", ident);
List<Agent> agents = query.getResultList();
return agents.isEmpty() ? null : agents.get(0);
}
}
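The `getAgent` lookup above has to cope with `getResultList()` returning zero rows, taking the first element only when one exists. That "first result or fallback" guard can be expressed compactly with `Optional`; a plain-Java sketch, with a `List<String>` standing in for the JPA result list:

```java
import java.util.List;

// Sketch of the "first result or fallback" guard used by getAgent():
// the query may return zero rows, so take the first element only when
// present and fall back to a default otherwise.
public class FirstResultDemo {

    static String firstOrFallback(List<String> results, String fallback) {
        return results.stream().findFirst().orElse(fallback);
    }

    public static void main(String[] args) {
        System.out.println(firstOrFallback(List.of("alice", "bob"), "unknown")); // alice
        System.out.println(firstOrFallback(List.of(), "unknown"));               // unknown
    }
}
```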
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/processKnowledge/IEstimationDAO.java
package br.ufpa.labes.spm.repository.interfaces.processKnowledge;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Estimation;
public interface IEstimationDAO extends IBaseDAO<Estimation, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/IBranchConDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.BranchCon;
public interface IBranchConDAO extends IBaseDAO<BranchCon, String> {}
<file_sep>/tmp/service/ReqAgentRequiresAbilityService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ReqAgentRequiresAbility;
import br.ufpa.labes.spm.repository.ReqAgentRequiresAbilityRepository;
import br.ufpa.labes.spm.service.dto.ReqAgentRequiresAbilityDTO;
import br.ufpa.labes.spm.service.mapper.ReqAgentRequiresAbilityMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ReqAgentRequiresAbility}.
*/
@Service
@Transactional
public class ReqAgentRequiresAbilityService {
private final Logger log = LoggerFactory.getLogger(ReqAgentRequiresAbilityService.class);
private final ReqAgentRequiresAbilityRepository reqAgentRequiresAbilityRepository;
private final ReqAgentRequiresAbilityMapper reqAgentRequiresAbilityMapper;
public ReqAgentRequiresAbilityService(ReqAgentRequiresAbilityRepository reqAgentRequiresAbilityRepository, ReqAgentRequiresAbilityMapper reqAgentRequiresAbilityMapper) {
this.reqAgentRequiresAbilityRepository = reqAgentRequiresAbilityRepository;
this.reqAgentRequiresAbilityMapper = reqAgentRequiresAbilityMapper;
}
/**
* Save a reqAgentRequiresAbility.
*
* @param reqAgentRequiresAbilityDTO the entity to save.
* @return the persisted entity.
*/
public ReqAgentRequiresAbilityDTO save(ReqAgentRequiresAbilityDTO reqAgentRequiresAbilityDTO) {
log.debug("Request to save ReqAgentRequiresAbility : {}", reqAgentRequiresAbilityDTO);
ReqAgentRequiresAbility reqAgentRequiresAbility = reqAgentRequiresAbilityMapper.toEntity(reqAgentRequiresAbilityDTO);
reqAgentRequiresAbility = reqAgentRequiresAbilityRepository.save(reqAgentRequiresAbility);
return reqAgentRequiresAbilityMapper.toDto(reqAgentRequiresAbility);
}
/**
* Get all the reqAgentRequiresAbilities.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ReqAgentRequiresAbilityDTO> findAll() {
log.debug("Request to get all ReqAgentRequiresAbilities");
return reqAgentRequiresAbilityRepository.findAll().stream()
.map(reqAgentRequiresAbilityMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one reqAgentRequiresAbility by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ReqAgentRequiresAbilityDTO> findOne(Long id) {
log.debug("Request to get ReqAgentRequiresAbility : {}", id);
return reqAgentRequiresAbilityRepository.findById(id)
.map(reqAgentRequiresAbilityMapper::toDto);
}
/**
* Delete the reqAgentRequiresAbility by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ReqAgentRequiresAbility : {}", id);
reqAgentRequiresAbilityRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/IConnectionDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Connection;
public interface IConnectionDAO extends IBaseDAO<Connection, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processKnowledge/OrganizationEstimationDAO.java
package br.ufpa.labes.spm.repository.impl.processKnowledge;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processKnowledge.IOrganizationEstimationDAO;
import br.ufpa.labes.spm.domain.OrganizationEstimation;
public class OrganizationEstimationDAO extends BaseDAO<OrganizationEstimation, Integer>
implements IOrganizationEstimationDAO {
protected OrganizationEstimationDAO(Class<OrganizationEstimation> businessClass) {
super(businessClass);
}
public OrganizationEstimationDAO() {
super(OrganizationEstimation.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/dto/AbilityDTO.java
package br.ufpa.labes.spm.service.dto;
import java.io.Serializable;
import java.util.Objects;
import javax.persistence.Lob;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Ability} entity.
*/
public class AbilityDTO implements Serializable {
private Long id;
private String ident;
private String name;
@Lob
private String description;
private Long theAbilityTypeId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getIdent() {
return ident;
}
public void setIdent(String ident) {
this.ident = ident;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Long getTheAbilityTypeId() {
return theAbilityTypeId;
}
public void setTheAbilityTypeId(Long abilityTypeId) {
this.theAbilityTypeId = abilityTypeId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
AbilityDTO abilityDTO = (AbilityDTO) o;
if (abilityDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), abilityDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "AbilityDTO{" +
"id=" + getId() +
", ident='" + getIdent() + "'" +
", name='" + getName() + "'" +
", description='" + getDescription() + "'" +
", theAbilityType=" + getTheAbilityTypeId() +
"}";
}
}
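`AbilityDTO` implements identity-based equality: two DTOs compare equal only when both carry a non-null, matching id, so unsaved objects are never equal to anything except themselves. The contract in isolation (the `IdDto` class is a hypothetical stand-in):

```java
import java.util.Objects;

// Sketch of the id-based equals/hashCode contract used by AbilityDTO:
// objects with a null id (not yet persisted) never compare equal to a
// distinct instance, while persisted objects compare by primary key alone.
public class IdEqualityDemo {

    static class IdDto {
        final Long id;
        IdDto(Long id) { this.id = id; }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (o == null || getClass() != o.getClass()) return false;
            IdDto other = (IdDto) o;
            if (other.id == null || id == null) return false;
            return Objects.equals(id, other.id);
        }

        @Override
        public int hashCode() {
            return Objects.hashCode(id);
        }
    }

    public static void main(String[] args) {
        System.out.println(new IdDto(1L).equals(new IdDto(1L)));     // true
        System.out.println(new IdDto(null).equals(new IdDto(null))); // false
    }
}
```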
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/OutOfWorkPeriodResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.OutOfWorkPeriodService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.OutOfWorkPeriodDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.OutOfWorkPeriod}.
*/
@RestController
@RequestMapping("/api")
public class OutOfWorkPeriodResource {
private final Logger log = LoggerFactory.getLogger(OutOfWorkPeriodResource.class);
private static final String ENTITY_NAME = "outOfWorkPeriod";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final OutOfWorkPeriodService outOfWorkPeriodService;
public OutOfWorkPeriodResource(OutOfWorkPeriodService outOfWorkPeriodService) {
this.outOfWorkPeriodService = outOfWorkPeriodService;
}
/**
* {@code POST /out-of-work-periods} : Create a new outOfWorkPeriod.
*
* @param outOfWorkPeriodDTO the outOfWorkPeriodDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new outOfWorkPeriodDTO, or with status {@code 400 (Bad Request)} if the outOfWorkPeriod already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/out-of-work-periods")
public ResponseEntity<OutOfWorkPeriodDTO> createOutOfWorkPeriod(@RequestBody OutOfWorkPeriodDTO outOfWorkPeriodDTO) throws URISyntaxException {
log.debug("REST request to save OutOfWorkPeriod : {}", outOfWorkPeriodDTO);
if (outOfWorkPeriodDTO.getId() != null) {
throw new BadRequestAlertException("A new outOfWorkPeriod cannot already have an ID", ENTITY_NAME, "idexists");
}
OutOfWorkPeriodDTO result = outOfWorkPeriodService.save(outOfWorkPeriodDTO);
return ResponseEntity.created(new URI("/api/out-of-work-periods/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /out-of-work-periods} : Updates an existing outOfWorkPeriod.
*
* @param outOfWorkPeriodDTO the outOfWorkPeriodDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated outOfWorkPeriodDTO,
* or with status {@code 400 (Bad Request)} if the outOfWorkPeriodDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the outOfWorkPeriodDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/out-of-work-periods")
public ResponseEntity<OutOfWorkPeriodDTO> updateOutOfWorkPeriod(@RequestBody OutOfWorkPeriodDTO outOfWorkPeriodDTO) throws URISyntaxException {
log.debug("REST request to update OutOfWorkPeriod : {}", outOfWorkPeriodDTO);
if (outOfWorkPeriodDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
OutOfWorkPeriodDTO result = outOfWorkPeriodService.save(outOfWorkPeriodDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, outOfWorkPeriodDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /out-of-work-periods} : get all the outOfWorkPeriods.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of outOfWorkPeriods in body.
*/
@GetMapping("/out-of-work-periods")
public List<OutOfWorkPeriodDTO> getAllOutOfWorkPeriods() {
log.debug("REST request to get all OutOfWorkPeriods");
return outOfWorkPeriodService.findAll();
}
/**
* {@code GET /out-of-work-periods/:id} : get the "id" outOfWorkPeriod.
*
* @param id the id of the outOfWorkPeriodDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the outOfWorkPeriodDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/out-of-work-periods/{id}")
public ResponseEntity<OutOfWorkPeriodDTO> getOutOfWorkPeriod(@PathVariable Long id) {
log.debug("REST request to get OutOfWorkPeriod : {}", id);
Optional<OutOfWorkPeriodDTO> outOfWorkPeriodDTO = outOfWorkPeriodService.findOne(id);
return ResponseUtil.wrapOrNotFound(outOfWorkPeriodDTO);
}
/**
* {@code DELETE /out-of-work-periods/:id} : delete the "id" outOfWorkPeriod.
*
* @param id the id of the outOfWorkPeriodDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/out-of-work-periods/{id}")
public ResponseEntity<Void> deleteOutOfWorkPeriod(@PathVariable Long id) {
log.debug("REST request to delete OutOfWorkPeriod : {}", id);
outOfWorkPeriodService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plainActivities/INormalDAO.java
package br.ufpa.labes.spm.repository.interfaces.plainActivities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Artifact;
import br.ufpa.labes.spm.domain.Normal;
public interface INormalDAO extends IBaseDAO<Normal, Integer> {
public String[] getInvolvedAgentsForNormal(String normalIdent);
public String[] getRequiredResourcesForNormal(String normalIdent);
public Artifact[] getOutputArtifactsForNormal(String normalIdent);
public String[] getOutputArtifactsIdentsForNormal(String normalIdent);
public String[] getInputArtifactsIdentsForNormal(String normalIdent);
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/log/EventDAO.java
package br.ufpa.labes.spm.repository.impl.log;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.log.IEventDAO;
import br.ufpa.labes.spm.domain.Event;
public class EventDAO extends BaseDAO<Event, Integer> implements IEventDAO {
protected EventDAO(Class<Event> businessClass) {
super(businessClass);
}
public EventDAO() {
super(Event.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/OcurrenceResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.OcurrenceService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.OcurrenceDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.Ocurrence}.
*/
@RestController
@RequestMapping("/api")
public class OcurrenceResource {
private final Logger log = LoggerFactory.getLogger(OcurrenceResource.class);
private static final String ENTITY_NAME = "ocurrence";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final OcurrenceService ocurrenceService;
public OcurrenceResource(OcurrenceService ocurrenceService) {
this.ocurrenceService = ocurrenceService;
}
/**
* {@code POST /ocurrences} : Create a new ocurrence.
*
* @param ocurrenceDTO the ocurrenceDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new ocurrenceDTO, or with status {@code 400 (Bad Request)} if the ocurrence already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/ocurrences")
public ResponseEntity<OcurrenceDTO> createOcurrence(@RequestBody OcurrenceDTO ocurrenceDTO) throws URISyntaxException {
log.debug("REST request to save Ocurrence : {}", ocurrenceDTO);
if (ocurrenceDTO.getId() != null) {
throw new BadRequestAlertException("A new ocurrence cannot already have an ID", ENTITY_NAME, "idexists");
}
OcurrenceDTO result = ocurrenceService.save(ocurrenceDTO);
return ResponseEntity.created(new URI("/api/ocurrences/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /ocurrences} : Updates an existing ocurrence.
*
* @param ocurrenceDTO the ocurrenceDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated ocurrenceDTO,
* or with status {@code 400 (Bad Request)} if the ocurrenceDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the ocurrenceDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/ocurrences")
public ResponseEntity<OcurrenceDTO> updateOcurrence(@RequestBody OcurrenceDTO ocurrenceDTO) throws URISyntaxException {
log.debug("REST request to update Ocurrence : {}", ocurrenceDTO);
if (ocurrenceDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
OcurrenceDTO result = ocurrenceService.save(ocurrenceDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, ocurrenceDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /ocurrences} : get all the ocurrences.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of ocurrences in body.
*/
@GetMapping("/ocurrences")
public List<OcurrenceDTO> getAllOcurrences() {
log.debug("REST request to get all Ocurrences");
return ocurrenceService.findAll();
}
/**
* {@code GET /ocurrences/:id} : get the "id" ocurrence.
*
* @param id the id of the ocurrenceDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the ocurrenceDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/ocurrences/{id}")
public ResponseEntity<OcurrenceDTO> getOcurrence(@PathVariable Long id) {
log.debug("REST request to get Ocurrence : {}", id);
Optional<OcurrenceDTO> ocurrenceDTO = ocurrenceService.findOne(id);
return ResponseUtil.wrapOrNotFound(ocurrenceDTO);
}
/**
* {@code DELETE /ocurrences/:id} : delete the "id" ocurrence.
*
* @param id the id of the ocurrenceDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/ocurrences/{id}")
public ResponseEntity<Void> deleteOcurrence(@PathVariable Long id) {
log.debug("REST request to delete Ocurrence : {}", id);
ocurrenceService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plainActivities/IParameterDAO.java
package br.ufpa.labes.spm.repository.interfaces.plainActivities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Parameter;
public interface IParameterDAO extends IBaseDAO<Parameter, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/connections/IArtifactConDAO.java
package br.ufpa.labes.spm.repository.interfaces.connections;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ArtifactCon;
public interface IArtifactConDAO extends IBaseDAO<ArtifactCon, Integer> {}
<file_sep>/tmp/service/interfaces/MetricServices.java
package br.ufpa.labes.spm.service.interfaces;
import java.util.List;
import javax.ejb.Remote;
import br.ufpa.labes.spm.service.dto.MetricDTO;
import br.ufpa.labes.spm.service.dto.MetricDefinitionDTO;
@Remote
public interface MetricServices {
public MetricDTO saveMetric(MetricDTO metricDTO);
public List<MetricDefinitionDTO> getMetricsDefinitions();
public String[] getTypes(String domain);
public String[] getWithTypes(String domainFilter, String nameType);
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plainActivities/IArtifactParamDAO.java
package br.ufpa.labes.spm.repository.interfaces.plainActivities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ArtifactParam;
public interface IArtifactParamDAO extends IBaseDAO<ArtifactParam, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/AssetStatResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.AssetStatService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.AssetStatDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.AssetStat}.
*/
@RestController
@RequestMapping("/api")
public class AssetStatResource {
private final Logger log = LoggerFactory.getLogger(AssetStatResource.class);
private static final String ENTITY_NAME = "assetStat";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final AssetStatService assetStatService;
public AssetStatResource(AssetStatService assetStatService) {
this.assetStatService = assetStatService;
}
/**
* {@code POST /asset-stats} : Create a new assetStat.
*
* @param assetStatDTO the assetStatDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new assetStatDTO, or with status {@code 400 (Bad Request)} if the assetStat already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/asset-stats")
public ResponseEntity<AssetStatDTO> createAssetStat(@RequestBody AssetStatDTO assetStatDTO) throws URISyntaxException {
log.debug("REST request to save AssetStat : {}", assetStatDTO);
if (assetStatDTO.getId() != null) {
throw new BadRequestAlertException("A new assetStat cannot already have an ID", ENTITY_NAME, "idexists");
}
AssetStatDTO result = assetStatService.save(assetStatDTO);
return ResponseEntity.created(new URI("/api/asset-stats/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /asset-stats} : Updates an existing assetStat.
*
* @param assetStatDTO the assetStatDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated assetStatDTO,
* or with status {@code 400 (Bad Request)} if the assetStatDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the assetStatDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/asset-stats")
public ResponseEntity<AssetStatDTO> updateAssetStat(@RequestBody AssetStatDTO assetStatDTO) throws URISyntaxException {
log.debug("REST request to update AssetStat : {}", assetStatDTO);
if (assetStatDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
AssetStatDTO result = assetStatService.save(assetStatDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, assetStatDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /asset-stats} : get all the assetStats.
*
* @param filter the filter of the request.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of assetStats in body.
*/
@GetMapping("/asset-stats")
public List<AssetStatDTO> getAllAssetStats(@RequestParam(required = false) String filter) {
if ("theasset-is-null".equals(filter)) {
log.debug("REST request to get all AssetStats where theAsset is null");
return assetStatService.findAllWhereTheAssetIsNull();
}
log.debug("REST request to get all AssetStats");
return assetStatService.findAll();
}
/**
* {@code GET /asset-stats/:id} : get the "id" assetStat.
*
* @param id the id of the assetStatDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the assetStatDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/asset-stats/{id}")
public ResponseEntity<AssetStatDTO> getAssetStat(@PathVariable Long id) {
log.debug("REST request to get AssetStat : {}", id);
Optional<AssetStatDTO> assetStatDTO = assetStatService.findOne(id);
return ResponseUtil.wrapOrNotFound(assetStatDTO);
}
/**
* {@code DELETE /asset-stats/:id} : delete the "id" assetStat.
*
* @param id the id of the assetStatDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/asset-stats/{id}")
public ResponseEntity<Void> deleteAssetStat(@PathVariable Long id) {
log.debug("REST request to delete AssetStat : {}", id);
assetStatService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/activities/DecomposedDAO.java
package br.ufpa.labes.spm.repository.impl.activities;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.activities.IDecomposedDAO;
import br.ufpa.labes.spm.domain.Decomposed;
public class DecomposedDAO extends BaseDAO<Decomposed, String> implements IDecomposedDAO {
protected DecomposedDAO(Class<Decomposed> businessClass) {
super(businessClass);
}
public DecomposedDAO() {
super(Decomposed.class);
}
}
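The DAO classes in this project all follow the same shape: a concrete class that pins the entity and key type parameters of a generic `BaseDAO` and passes the entity class up to the superclass. A self-contained sketch of that pattern follows; this `BaseDAO` is a stand-in for the real `br.ufpa.labes.spm.repository.impl.BaseDAO` (which wires in a JPA persistence context, and whose `daoSave` signature differs), using an in-memory map so the idea is runnable on its own.

```java
import java.util.HashMap;
import java.util.Map;

class BaseDAO<T, K> {
    private final Class<T> businessClass;
    private final Map<K, T> store = new HashMap<>(); // in-memory stand-in for the EntityManager

    protected BaseDAO(Class<T> businessClass) {
        this.businessClass = businessClass;
    }

    public T daoSave(K key, T entity) {
        store.put(key, entity);
        return entity;
    }

    public T retrieve(K key) {
        return businessClass.cast(store.get(key));
    }
}

class Decomposed {
    final String ident;
    Decomposed(String ident) { this.ident = ident; }
}

// Concrete DAOs only fix the type parameters and hand the entity class to the base.
class DecomposedDAO extends BaseDAO<Decomposed, String> {
    public DecomposedDAO() {
        super(Decomposed.class);
    }
}
```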
<file_sep>/tmp/service/impl/MetricDefinitionServicesImpl.java
package br.ufpa.labes.spm.service.impl;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import javax.persistence.Query;
import org.qrconsult.spm.converter.core.Converter;
import org.qrconsult.spm.converter.core.ConverterImpl;
import br.ufpa.labes.spm.exceptions.ImplementationException;
import br.ufpa.labes.spm.repository.interfaces.processKnowledge.IMetricDefinitionDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IMetricTypeDAO;
import br.ufpa.labes.spm.service.dto.MetricDefinitionDTO;
import br.ufpa.labes.spm.service.dto.MetricDefinitionsDTO;
import br.ufpa.labes.spm.domain.MetricDefinition;
import br.ufpa.labes.spm.domain.MetricType;
import br.ufpa.labes.spm.service.interfaces.MetricDefinitionServices;
public class MetricDefinitionServicesImpl implements MetricDefinitionServices{
IMetricTypeDAO metricTypeDAO;
IMetricDefinitionDAO metricDefinitionDAO;
public MetricDefinition getMetricDefinitionP(String name) {
String hql;
Query query;
List<MetricDefinition> result = null;
hql = "select metricDefinition from " + MetricDefinition.class.getSimpleName() + " as metricDefinition where metricDefinition.name = :name";
query = metricDefinitionDAO.getPersistenceContext().createQuery(hql);
query.setParameter("name", name);
result = query.getResultList();
if (!result.isEmpty()) {
return result.get(0);
} else {
return null;
}
}
@Override
public MetricDefinitionDTO saveMetricDefinition(MetricDefinitionDTO metricDefinitionDTO) {
Converter converter = new ConverterImpl();
try {
String typeMetric = metricDefinitionDTO.getMetricType();
String metricName = metricDefinitionDTO.getName();
if (typeMetric != null && !typeMetric.equals("")) {
MetricType metricType = metricTypeDAO.retrieveBySecondaryKey(typeMetric);
MetricDefinition metricDefinition = this.getMetricDefinitionP(metricName);
if (metricType != null) {
if (metricDefinition != null) {
// A metric definition with this name already exists.
return null;
} else {
metricDefinition = (MetricDefinition) converter.getEntity(metricDefinitionDTO, MetricDefinition.class);
metricDefinition.setUnits(metricDefinitionDTO.getUnits());
metricDefinition.setMetricType(metricType);
metricDefinition = metricDefinitionDAO.daoSave(metricDefinition);
metricDefinitionDTO = (MetricDefinitionDTO) converter.getDTO(metricDefinition, MetricDefinitionDTO.class);
metricDefinitionDTO.setMetricType(metricType.getIdent());
}
} else {
// Unknown metric type.
return null;
}
} else {
return null;
}
} catch (ImplementationException e) {
e.printStackTrace();
}
return metricDefinitionDTO;
}
@Override
public String[] getMetricTypes() {
String hql = "select metricType.ident from " + MetricType.class.getSimpleName() + " as metricType";
Query query = metricTypeDAO.getPersistenceContext().createQuery(hql);
List<String> identList = query.getResultList();
String[] list = new String[identList.size()];
identList.toArray(list);
return list;
}
public String[] getMetricDefinitionsTypes() {
String hql = "select metricType.ident from " + MetricType.class.getSimpleName() + " as metricType";
Query query = metricTypeDAO.getPersistenceContext().createQuery(hql);
List<String> identList = query.getResultList();
String[] list = new String[identList.size()];
identList.toArray(list);
return list;
}
@Override
public MetricDefinitionsDTO getMetricDefinitions() {
String hql = "select metricDefinition.name from " + MetricDefinition.class.getSimpleName() + " as metricDefinition";
Query query = metricDefinitionDAO.getPersistenceContext().createQuery(hql);
List<String> names = query.getResultList();
String[] nameList = new String[names.size()];
names.toArray(nameList);
String[] typeList = getMetricDefinitionsTypes();
MetricDefinitionsDTO metricDefinitionsDTO = new MetricDefinitionsDTO();
metricDefinitionsDTO.setNameMetricsDefinitions(nameList);
metricDefinitionsDTO.setTypeMetrics(typeList);
return metricDefinitionsDTO;
}
@Override
public MetricDefinitionsDTO getMetricDefinitions(String term, String domain, boolean orgFilter) {
// Note: orgFilter is currently not applied to the query.
String hql;
Query query;
List<String> resultList = null;
if (domain != null) {
hql = "select metricDefinition.name from " + MetricDefinition.class.getSimpleName() + " as metricDefinition where metricDefinition.name like :termo and metricDefinition.ident = :domain";
query = metricDefinitionDAO.getPersistenceContext().createQuery(hql);
query.setParameter("domain", domain);
query.setParameter("termo", "%" + term + "%");
resultList = query.getResultList();
} else {
hql = "select metricDefinition.name from " + MetricDefinition.class.getSimpleName() + " as metricDefinition where metricDefinition.name like :termo";
query = metricDefinitionDAO.getPersistenceContext().createQuery(hql);
query.setParameter("termo", "%" + term + "%");
resultList = query.getResultList();
}
MetricDefinitionsDTO metricDefinitionsDTO = new MetricDefinitionsDTO();
String[] names = new String[resultList.size()];
resultList.toArray(names);
metricDefinitionsDTO.setNameMetricsDefinitions(names);
return metricDefinitionsDTO;
}
@Override
public MetricDefinitionDTO getMetricDefinitionDTO(String nameMetricDefinition) {
String hql = "select metricDefinition from " + MetricDefinition.class.getSimpleName() + " as metricDefinition where metricDefinition.name = :name";
Query query = metricDefinitionDAO.getPersistenceContext().createQuery(hql);
query.setParameter("name", nameMetricDefinition);
List<MetricDefinition> result = query.getResultList();
if (result.isEmpty()) {
return null;
}
MetricDefinitionDTO metricDefinitionDTO = null;
Converter converter = new ConverterImpl();
try {
MetricDefinition metricDefinition = result.get(0);
metricDefinitionDTO = (MetricDefinitionDTO) converter.getDTO(metricDefinition, MetricDefinitionDTO.class);
} catch (ImplementationException e) {
e.printStackTrace();
}
return metricDefinitionDTO;
}
}
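The HQL strings in this service are assembled by plain string concatenation; when filters such as the domain are optional, it is easy to leave a dangling `and` behind. A hedged sketch of a safer way to assemble optional conditions, in plain Java with no JPA dependency (the `QueryBuilder` name and the query fragments are illustrative, not part of the project):

```java
import java.util.ArrayList;
import java.util.List;

class QueryBuilder {
    // Joins only the conditions that actually apply, so no trailing "and"
    // can be left behind when an optional filter is absent.
    static String buildWhere(String base, List<String> conditions) {
        if (conditions.isEmpty()) {
            return base;
        }
        return base + " where " + String.join(" and ", conditions);
    }

    static String metricDefinitionQuery(String term, String domain) {
        List<String> conditions = new ArrayList<>();
        if (term != null) {
            conditions.add("m.name like :termo");
        }
        if (domain != null) {
            conditions.add("m.ident = :domain");
        }
        return buildWhere("select m.name from MetricDefinition as m", conditions);
    }
}
```

Actual values still go in through `query.setParameter`, exactly as the service already does; only the clause assembly changes.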
<file_sep>/tmp/service/impl/ToolDefinitionServiceImpl.java
package br.ufpa.labes.spm.service.impl;
import br.ufpa.labes.spm.service.ToolDefinitionService;
import br.ufpa.labes.spm.domain.ToolDefinition;
import br.ufpa.labes.spm.repository.ToolDefinitionRepository;
import br.ufpa.labes.spm.service.dto.ToolDefinitionDTO;
import br.ufpa.labes.spm.service.mapper.ToolDefinitionMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ToolDefinition}.
*/
@Service
@Transactional
public class ToolDefinitionServiceImpl implements ToolDefinitionService {
private final Logger log = LoggerFactory.getLogger(ToolDefinitionServiceImpl.class);
private final ToolDefinitionRepository toolDefinitionRepository;
private final ToolDefinitionMapper toolDefinitionMapper;
public ToolDefinitionServiceImpl(ToolDefinitionRepository toolDefinitionRepository, ToolDefinitionMapper toolDefinitionMapper) {
this.toolDefinitionRepository = toolDefinitionRepository;
this.toolDefinitionMapper = toolDefinitionMapper;
}
/**
* Save a toolDefinition.
*
* @param toolDefinitionDTO the entity to save.
* @return the persisted entity.
*/
@Override
public ToolDefinitionDTO save(ToolDefinitionDTO toolDefinitionDTO) {
log.debug("Request to save ToolDefinition : {}", toolDefinitionDTO);
ToolDefinition toolDefinition = toolDefinitionMapper.toEntity(toolDefinitionDTO);
toolDefinition = toolDefinitionRepository.save(toolDefinition);
return toolDefinitionMapper.toDto(toolDefinition);
}
/**
* Get all the toolDefinitions.
*
* @return the list of entities.
*/
@Override
@Transactional(readOnly = true)
public List<ToolDefinitionDTO> findAll() {
log.debug("Request to get all ToolDefinitions");
return toolDefinitionRepository.findAllWithEagerRelationships().stream()
.map(toolDefinitionMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get all the toolDefinitions with eager load of many-to-many relationships.
*
* @return the list of entities.
*/
public Page<ToolDefinitionDTO> findAllWithEagerRelationships(Pageable pageable) {
return toolDefinitionRepository.findAllWithEagerRelationships(pageable).map(toolDefinitionMapper::toDto);
}
/**
* Get one toolDefinition by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Override
@Transactional(readOnly = true)
public Optional<ToolDefinitionDTO> findOne(Long id) {
log.debug("Request to get ToolDefinition : {}", id);
return toolDefinitionRepository.findOneWithEagerRelationships(id)
.map(toolDefinitionMapper::toDto);
}
/**
* Delete the toolDefinition by id.
*
* @param id the id of the entity.
*/
@Override
public void delete(Long id) {
log.debug("Request to delete ToolDefinition : {}", id);
toolDefinitionRepository.deleteById(id);
}
}
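The `findAll` method above converts entities to DTOs with a stream pipeline, collecting into a `LinkedList`. A self-contained sketch of that pipeline follows; the entity, DTO, and mapper types here are illustrative stand-ins for the MapStruct-generated ones, not the project's real classes.

```java
import java.util.LinkedList;
import java.util.List;
import java.util.stream.Collectors;

class ToolEntity {
    final String name;
    ToolEntity(String name) { this.name = name; }
}

class ToolDto {
    final String name;
    ToolDto(String name) { this.name = name; }
}

class ToolMapperSketch {
    ToolDto toDto(ToolEntity e) {
        return new ToolDto(e.name);
    }

    // Mirrors the service's findAll(): map each entity and collect into a LinkedList.
    List<ToolDto> findAll(List<ToolEntity> entities) {
        return entities.stream()
            .map(this::toDto)
            .collect(Collectors.toCollection(LinkedList::new));
    }
}
```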
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/processKnowledge/ArtifactEstimationDAO.java
package br.ufpa.labes.spm.repository.impl.processKnowledge;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.processKnowledge.IArtifactEstimationDAO;
import br.ufpa.labes.spm.domain.ArtifactEstimation;
public class ArtifactEstimationDAO extends BaseDAO<ArtifactEstimation, Integer>
implements IArtifactEstimationDAO {
protected ArtifactEstimationDAO(Class<ArtifactEstimation> businessClass) {
super(businessClass);
}
public ArtifactEstimationDAO() {
super(ArtifactEstimation.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ArtifactConService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.ArtifactCon;
import br.ufpa.labes.spm.repository.ArtifactConRepository;
import br.ufpa.labes.spm.service.dto.ArtifactConDTO;
import br.ufpa.labes.spm.service.mapper.ArtifactConMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link ArtifactCon}.
*/
@Service
@Transactional
public class ArtifactConService {
private final Logger log = LoggerFactory.getLogger(ArtifactConService.class);
private final ArtifactConRepository artifactConRepository;
private final ArtifactConMapper artifactConMapper;
public ArtifactConService(ArtifactConRepository artifactConRepository, ArtifactConMapper artifactConMapper) {
this.artifactConRepository = artifactConRepository;
this.artifactConMapper = artifactConMapper;
}
/**
* Save an artifactCon.
*
* @param artifactConDTO the entity to save.
* @return the persisted entity.
*/
public ArtifactConDTO save(ArtifactConDTO artifactConDTO) {
log.debug("Request to save ArtifactCon : {}", artifactConDTO);
ArtifactCon artifactCon = artifactConMapper.toEntity(artifactConDTO);
artifactCon = artifactConRepository.save(artifactCon);
return artifactConMapper.toDto(artifactCon);
}
/**
* Get all the artifactCons.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ArtifactConDTO> findAll() {
log.debug("Request to get all ArtifactCons");
return artifactConRepository.findAllWithEagerRelationships().stream()
.map(artifactConMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get all the artifactCons with eager load of many-to-many relationships.
*
* @return the list of entities.
*/
public Page<ArtifactConDTO> findAllWithEagerRelationships(Pageable pageable) {
return artifactConRepository.findAllWithEagerRelationships(pageable).map(artifactConMapper::toDto);
}
/**
* Get one artifactCon by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ArtifactConDTO> findOne(Long id) {
log.debug("Request to get ArtifactCon : {}", id);
return artifactConRepository.findOneWithEagerRelationships(id)
.map(artifactConMapper::toDto);
}
/**
* Delete the artifactCon by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete ArtifactCon : {}", id);
artifactConRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/ResourcePossibleUseResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.ResourcePossibleUseService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.ResourcePossibleUseDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.ResourcePossibleUse}.
*/
@RestController
@RequestMapping("/api")
public class ResourcePossibleUseResource {
private final Logger log = LoggerFactory.getLogger(ResourcePossibleUseResource.class);
private static final String ENTITY_NAME = "resourcePossibleUse";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final ResourcePossibleUseService resourcePossibleUseService;
public ResourcePossibleUseResource(ResourcePossibleUseService resourcePossibleUseService) {
this.resourcePossibleUseService = resourcePossibleUseService;
}
/**
* {@code POST /resource-possible-uses} : Create a new resourcePossibleUse.
*
* @param resourcePossibleUseDTO the resourcePossibleUseDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new resourcePossibleUseDTO, or with status {@code 400 (Bad Request)} if the resourcePossibleUse already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/resource-possible-uses")
public ResponseEntity<ResourcePossibleUseDTO> createResourcePossibleUse(@RequestBody ResourcePossibleUseDTO resourcePossibleUseDTO) throws URISyntaxException {
log.debug("REST request to save ResourcePossibleUse : {}", resourcePossibleUseDTO);
if (resourcePossibleUseDTO.getId() != null) {
throw new BadRequestAlertException("A new resourcePossibleUse cannot already have an ID", ENTITY_NAME, "idexists");
}
ResourcePossibleUseDTO result = resourcePossibleUseService.save(resourcePossibleUseDTO);
return ResponseEntity.created(new URI("/api/resource-possible-uses/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /resource-possible-uses} : Updates an existing resourcePossibleUse.
*
* @param resourcePossibleUseDTO the resourcePossibleUseDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated resourcePossibleUseDTO,
* or with status {@code 400 (Bad Request)} if the resourcePossibleUseDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the resourcePossibleUseDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/resource-possible-uses")
public ResponseEntity<ResourcePossibleUseDTO> updateResourcePossibleUse(@RequestBody ResourcePossibleUseDTO resourcePossibleUseDTO) throws URISyntaxException {
log.debug("REST request to update ResourcePossibleUse : {}", resourcePossibleUseDTO);
if (resourcePossibleUseDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
ResourcePossibleUseDTO result = resourcePossibleUseService.save(resourcePossibleUseDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, resourcePossibleUseDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /resource-possible-uses} : get all the resourcePossibleUses.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of resourcePossibleUses in body.
*/
@GetMapping("/resource-possible-uses")
public List<ResourcePossibleUseDTO> getAllResourcePossibleUses() {
log.debug("REST request to get all ResourcePossibleUses");
return resourcePossibleUseService.findAll();
}
/**
* {@code GET /resource-possible-uses/:id} : get the "id" resourcePossibleUse.
*
* @param id the id of the resourcePossibleUseDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the resourcePossibleUseDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/resource-possible-uses/{id}")
public ResponseEntity<ResourcePossibleUseDTO> getResourcePossibleUse(@PathVariable Long id) {
log.debug("REST request to get ResourcePossibleUse : {}", id);
Optional<ResourcePossibleUseDTO> resourcePossibleUseDTO = resourcePossibleUseService.findOne(id);
return ResponseUtil.wrapOrNotFound(resourcePossibleUseDTO);
}
/**
* {@code DELETE /resource-possible-uses/:id} : delete the "id" resourcePossibleUse.
*
* @param id the id of the resourcePossibleUseDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/resource-possible-uses/{id}")
public ResponseEntity<Void> deleteResourcePossibleUse(@PathVariable Long id) {
log.debug("REST request to delete ResourcePossibleUse : {}", id);
resourcePossibleUseService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/organizationPolicies/CompanyUnitDAO.java
package br.ufpa.labes.spm.repository.impl.organizationPolicies;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.organizationPolicies.ICompanyUnitDAO;
import br.ufpa.labes.spm.domain.CompanyUnit;
public class CompanyUnitDAO extends BaseDAO<CompanyUnit, String> implements ICompanyUnitDAO {
protected CompanyUnitDAO(Class<CompanyUnit> businessClass) {
super(businessClass);
}
public CompanyUnitDAO() {
super(CompanyUnit.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/types/ToolTypeDAO.java
package br.ufpa.labes.spm.repository.impl.types;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.types.IToolTypeDAO;
import br.ufpa.labes.spm.domain.ToolType;
public class ToolTypeDAO extends BaseDAO<ToolType, String> implements IToolTypeDAO {
protected ToolTypeDAO(Class<ToolType> businessClass) {
super(businessClass);
}
public ToolTypeDAO() {
super(ToolType.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ProcessService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.service.dto.ProcessDTO;
import java.util.List;
import java.util.Optional;
/**
* Service Interface for managing {@link br.ufpa.labes.spm.domain.Process}.
*/
public interface ProcessService {
/**
* Save a process.
*
* @param processDTO the entity to save.
* @return the persisted entity.
*/
ProcessDTO save(ProcessDTO processDTO);
/**
* Get all the processes.
*
* @return the list of entities.
*/
List<ProcessDTO> findAll();
/**
* Get all the ProcessDTO where TheLog is {@code null}.
*
* @return the list of entities.
*/
List<ProcessDTO> findAllWhereTheLogIsNull();
/**
* Get the "id" process.
*
* @param id the id of the entity.
* @return the entity.
*/
Optional<ProcessDTO> findOne(Long id);
/**
* Delete the "id" process.
*
* @param id the id of the entity.
*/
void delete(Long id);
}
<file_sep>/tmp/service/mapper/ScriptMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ScriptDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Script} and its DTO {@link ScriptDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface ScriptMapper extends EntityMapper<ScriptDTO, Script> {
default Script fromId(Long id) {
if (id == null) {
return null;
}
Script script = new Script();
script.setId(id);
return script;
}
}
<file_sep>/tmp/service/mapper/RoleMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.RoleDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link Role} and its DTO {@link RoleDTO}.
*/
@Mapper(componentModel = "spring", uses = {RoleTypeMapper.class})
public interface RoleMapper extends EntityMapper<RoleDTO, Role> {
@Mapping(source = "subordinate.id", target = "subordinateId")
@Mapping(source = "theRoleType.id", target = "theRoleTypeId")
RoleDTO toDto(Role role);
@Mapping(target = "theReqAgents", ignore = true)
@Mapping(target = "removeTheReqAgent", ignore = true)
@Mapping(source = "subordinateId", target = "subordinate")
@Mapping(source = "theRoleTypeId", target = "theRoleType")
@Mapping(target = "theAgentPlaysRoles", ignore = true)
@Mapping(target = "removeTheAgentPlaysRole", ignore = true)
@Mapping(target = "commands", ignore = true)
@Mapping(target = "removeCommands", ignore = true)
@Mapping(target = "theRoleNeedsAbilities", ignore = true)
@Mapping(target = "removeTheRoleNeedsAbility", ignore = true)
@Mapping(target = "theAgentInstSugs", ignore = true)
@Mapping(target = "removeTheAgentInstSug", ignore = true)
Role toEntity(RoleDTO roleDTO);
default Role fromId(Long id) {
if (id == null) {
return null;
}
Role role = new Role();
role.setId(id);
return role;
}
}
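The `fromId` default method above is the usual MapStruct idiom for mapping an ID reference back to a stub entity: JPA can resolve the association from the id alone, without loading the full row. A minimal self-contained sketch of the same idiom (this `Role` is a plain stand-in for the JPA entity, and `RoleIdMapper` is an illustrative name):

```java
class Role {
    private Long id;
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
}

interface RoleIdMapper {
    // Builds a stub entity carrying only the id; null in maps to null out.
    default Role fromId(Long id) {
        if (id == null) {
            return null;
        }
        Role role = new Role();
        role.setId(id);
        return role;
    }
}
```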
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/plannerInfo/ResourcePossibleUseDAO.java
package br.ufpa.labes.spm.repository.impl.plannerInfo;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.plannerInfo.IResourcePossibleUseDAO;
import br.ufpa.labes.spm.domain.ResourcePossibleUse;
public class ResourcePossibleUseDAO extends BaseDAO<ResourcePossibleUse, Integer>
implements IResourcePossibleUseDAO {
protected ResourcePossibleUseDAO(Class<ResourcePossibleUse> businessClass) {
super(businessClass);
}
public ResourcePossibleUseDAO() {
super(ResourcePossibleUse.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/connections/BranchConCondToActivityDAO.java
package br.ufpa.labes.spm.repository.impl.connections;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.IBranchConCondToActivityDAO;
import br.ufpa.labes.spm.domain.BranchConCondToActivity;
public class BranchConCondToActivityDAO extends BaseDAO<BranchConCondToActivity, Integer>
implements IBranchConCondToActivityDAO {
protected BranchConCondToActivityDAO(Class<BranchConCondToActivity> businessClass) {
super(businessClass);
}
public BranchConCondToActivityDAO() {
super(BranchConCondToActivity.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/WorkGroupResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.WorkGroupService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.WorkGroupDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.WorkGroup}.
*/
@RestController
@RequestMapping("/api")
public class WorkGroupResource {
private final Logger log = LoggerFactory.getLogger(WorkGroupResource.class);
private static final String ENTITY_NAME = "workGroup";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final WorkGroupService workGroupService;
public WorkGroupResource(WorkGroupService workGroupService) {
this.workGroupService = workGroupService;
}
/**
* {@code POST /work-groups} : Create a new workGroup.
*
* @param workGroupDTO the workGroupDTO to create.
* @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new workGroupDTO, or with status {@code 400 (Bad Request)} if the workGroup already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/work-groups")
public ResponseEntity<WorkGroupDTO> createWorkGroup(@RequestBody WorkGroupDTO workGroupDTO) throws URISyntaxException {
log.debug("REST request to save WorkGroup : {}", workGroupDTO);
if (workGroupDTO.getId() != null) {
throw new BadRequestAlertException("A new workGroup cannot already have an ID", ENTITY_NAME, "idexists");
}
WorkGroupDTO result = workGroupService.save(workGroupDTO);
return ResponseEntity.created(new URI("/api/work-groups/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /work-groups} : Updates an existing workGroup.
*
* @param workGroupDTO the workGroupDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated workGroupDTO,
* or with status {@code 400 (Bad Request)} if the workGroupDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the workGroupDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/work-groups")
public ResponseEntity<WorkGroupDTO> updateWorkGroup(@RequestBody WorkGroupDTO workGroupDTO) throws URISyntaxException {
log.debug("REST request to update WorkGroup : {}", workGroupDTO);
if (workGroupDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
WorkGroupDTO result = workGroupService.save(workGroupDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, workGroupDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /work-groups} : get all the workGroups.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of workGroups in body.
*/
@GetMapping("/work-groups")
public List<WorkGroupDTO> getAllWorkGroups() {
log.debug("REST request to get all WorkGroups");
return workGroupService.findAll();
}
/**
* {@code GET /work-groups/:id} : get the "id" workGroup.
*
* @param id the id of the workGroupDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the workGroupDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/work-groups/{id}")
public ResponseEntity<WorkGroupDTO> getWorkGroup(@PathVariable Long id) {
log.debug("REST request to get WorkGroup : {}", id);
Optional<WorkGroupDTO> workGroupDTO = workGroupService.findOne(id);
return ResponseUtil.wrapOrNotFound(workGroupDTO);
}
/**
* {@code DELETE /work-groups/:id} : delete the "id" workGroup.
*
* @param id the id of the workGroupDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/work-groups/{id}")
public ResponseEntity<Void> deleteWorkGroup(@PathVariable Long id) {
log.debug("REST request to delete WorkGroup : {}", id);
workGroupService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/dto/ProjectDTO.java
package br.ufpa.labes.spm.service.dto;
import java.time.LocalDate;
import java.io.Serializable;
import java.util.Objects;
import javax.persistence.Lob;
/**
* A DTO for the {@link br.ufpa.labes.spm.domain.Project} entity.
*/
public class ProjectDTO implements Serializable {
private Long id;
private String ident;
private String name;
@Lob
private String description;
private LocalDate beginDate;
private LocalDate endDate;
private Boolean active;
private Long processReferedId;
private Long theSystemId;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getIdent() {
return ident;
}
public void setIdent(String ident) {
this.ident = ident;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public LocalDate getBeginDate() {
return beginDate;
}
public void setBeginDate(LocalDate beginDate) {
this.beginDate = beginDate;
}
public LocalDate getEndDate() {
return endDate;
}
public void setEndDate(LocalDate endDate) {
this.endDate = endDate;
}
public Boolean isActive() {
return active;
}
public void setActive(Boolean active) {
this.active = active;
}
public Long getProcessReferedId() {
return processReferedId;
}
public void setProcessReferedId(Long processId) {
this.processReferedId = processId;
}
public Long getTheSystemId() {
return theSystemId;
}
public void setTheSystemId(Long developingSystemId) {
this.theSystemId = developingSystemId;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
ProjectDTO projectDTO = (ProjectDTO) o;
if (projectDTO.getId() == null || getId() == null) {
return false;
}
return Objects.equals(getId(), projectDTO.getId());
}
@Override
public int hashCode() {
return Objects.hashCode(getId());
}
@Override
public String toString() {
return "ProjectDTO{" +
"id=" + getId() +
", ident='" + getIdent() + "'" +
", name='" + getName() + "'" +
", description='" + getDescription() + "'" +
", beginDate='" + getBeginDate() + "'" +
", endDate='" + getEndDate() + "'" +
", active='" + isActive() + "'" +
", processRefered=" + getProcessReferedId() +
", theSystem=" + getTheSystemId() +
"}";
}
}
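The `equals`/`hashCode` pair above is id-based: two DTOs are equal only when both carry the same non-null id, so unsaved DTOs (null id) never compare equal to anything but themselves. A minimal standalone sketch of that contract (class name `IdDto` is illustrative; the real logic lives in `ProjectDTO`):

```java
import java.util.Objects;

// Minimal sketch of the id-based equality pattern used by ProjectDTO.
class IdDto {
    private final Long id;
    IdDto(Long id) { this.id = id; }
    Long getId() { return id; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        IdDto other = (IdDto) o;
        // Unsaved DTOs (null id) are only equal to themselves (reference check above).
        if (other.getId() == null || getId() == null) return false;
        return Objects.equals(getId(), other.getId());
    }

    @Override
    public int hashCode() {
        return Objects.hashCode(getId());
    }
}

class IdDtoDemo {
    public static void main(String[] args) {
        IdDto a = new IdDto(1L), b = new IdDto(1L);
        IdDto unsaved = new IdDto(null);
        System.out.println(a.equals(b));            // true: same non-null id
        System.out.println(unsaved.equals(new IdDto(null))); // false: null ids
        System.out.println(unsaved.equals(unsaved)); // true: same reference
    }
}
```

This keeps equality stable across persistence round-trips, since only the database identity participates.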
<file_sep>/tmp/service/mapper/ArtifactTaskMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ArtifactTaskDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ArtifactTask} and its DTO {@link ArtifactTaskDTO}.
*/
@Mapper(componentModel = "spring", uses = {ArtifactMapper.class, TaskMapper.class})
public interface ArtifactTaskMapper extends EntityMapper<ArtifactTaskDTO, ArtifactTask> {
@Mapping(source = "theArtifact.id", target = "theArtifactId")
@Mapping(source = "theTask.id", target = "theTaskId")
ArtifactTaskDTO toDto(ArtifactTask artifactTask);
@Mapping(source = "theArtifactId", target = "theArtifact")
@Mapping(source = "theTaskId", target = "theTask")
ArtifactTask toEntity(ArtifactTaskDTO artifactTaskDTO);
default ArtifactTask fromId(Long id) {
if (id == null) {
return null;
}
ArtifactTask artifactTask = new ArtifactTask();
artifactTask.setId(id);
return artifactTask;
}
}
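The `fromId` default method above builds a stub entity that carries only the id, so MapStruct can wire an association from a DTO's foreign-key field without loading the full row. A standalone sketch of the idiom, with illustrative `StubEntity`/`StubMapper` names standing in for the generated types:

```java
// Sketch of the fromId idiom used by the MapStruct mappers: a stub entity
// carrying only the id is enough for JPA to set an association.
class StubEntity {
    private Long id;
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
}

interface StubMapper {
    default StubEntity fromId(Long id) {
        if (id == null) {
            return null; // a null foreign key maps to no association at all
        }
        StubEntity e = new StubEntity();
        e.setId(id);
        return e;
    }
}

class FromIdDemo {
    public static void main(String[] args) {
        StubMapper mapper = new StubMapper() {};
        System.out.println(mapper.fromId(42L).getId()); // 42
        System.out.println(mapper.fromId(null));        // null
    }
}
```

Because the method lives on the interface as a `default`, the generated mapper implementation picks it up without any extra configuration.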
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/organizationPolicies/ICompanyUnitDAO.java
package br.ufpa.labes.spm.repository.interfaces.organizationPolicies;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.CompanyUnit;
public interface ICompanyUnitDAO extends IBaseDAO<CompanyUnit, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/WorkGroupEstimationService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.WorkGroupEstimation;
import br.ufpa.labes.spm.repository.WorkGroupEstimationRepository;
import br.ufpa.labes.spm.service.dto.WorkGroupEstimationDTO;
import br.ufpa.labes.spm.service.mapper.WorkGroupEstimationMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link WorkGroupEstimation}.
*/
@Service
@Transactional
public class WorkGroupEstimationService {
private final Logger log = LoggerFactory.getLogger(WorkGroupEstimationService.class);
private final WorkGroupEstimationRepository workGroupEstimationRepository;
private final WorkGroupEstimationMapper workGroupEstimationMapper;
public WorkGroupEstimationService(WorkGroupEstimationRepository workGroupEstimationRepository, WorkGroupEstimationMapper workGroupEstimationMapper) {
this.workGroupEstimationRepository = workGroupEstimationRepository;
this.workGroupEstimationMapper = workGroupEstimationMapper;
}
/**
* Save a workGroupEstimation.
*
* @param workGroupEstimationDTO the entity to save.
* @return the persisted entity.
*/
public WorkGroupEstimationDTO save(WorkGroupEstimationDTO workGroupEstimationDTO) {
log.debug("Request to save WorkGroupEstimation : {}", workGroupEstimationDTO);
WorkGroupEstimation workGroupEstimation = workGroupEstimationMapper.toEntity(workGroupEstimationDTO);
workGroupEstimation = workGroupEstimationRepository.save(workGroupEstimation);
return workGroupEstimationMapper.toDto(workGroupEstimation);
}
/**
* Get all the workGroupEstimations.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<WorkGroupEstimationDTO> findAll() {
log.debug("Request to get all WorkGroupEstimations");
return workGroupEstimationRepository.findAll().stream()
.map(workGroupEstimationMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one workGroupEstimation by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<WorkGroupEstimationDTO> findOne(Long id) {
log.debug("Request to get WorkGroupEstimation : {}", id);
return workGroupEstimationRepository.findById(id)
.map(workGroupEstimationMapper::toDto);
}
/**
* Delete the workGroupEstimation by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete WorkGroupEstimation : {}", id);
workGroupEstimationRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/domain/CalendarDay.java
package br.ufpa.labes.spm.domain;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;
import javax.persistence.*;
import java.io.Serializable;
/** A CalendarDay. */
@Entity
@Table(name = "calendar_day")
@Cache(usage = CacheConcurrencyStrategy.NONSTRICT_READ_WRITE)
public class CalendarDay implements Serializable {
private static final long serialVersionUID = 1L;
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@Column(name = "day")
private String day;
@ManyToOne
@JsonIgnoreProperties("notWorkingDays")
private Calendar theCalendar;
// jhipster-needle-entity-add-field - JHipster will add fields here, do not remove
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getDay() {
return day;
}
public CalendarDay day(String day) {
this.day = day;
return this;
}
public void setDay(String day) {
this.day = day;
}
public Calendar getTheCalendar() {
return theCalendar;
}
public CalendarDay theCalendar(Calendar calendar) {
this.theCalendar = calendar;
return this;
}
public void setTheCalendar(Calendar calendar) {
this.theCalendar = calendar;
}
// jhipster-needle-entity-add-getters-setters - JHipster will add getters and setters here, do not remove
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (!(o instanceof CalendarDay)) {
return false;
}
return id != null && id.equals(((CalendarDay) o).id);
}
@Override
public int hashCode() {
return 31;
}
@Override
public String toString() {
return "CalendarDay{" + "id=" + getId() + ", day='" + getDay() + "'" + "}";
}
}
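`CalendarDay` follows the fluent mutator style JHipster generates alongside plain setters: each builder-like method (`day(...)`, `theCalendar(...)`) assigns the field and returns `this`, so construction chains. A trimmed, self-contained sketch of the pattern (the nested `Day` class mirrors `CalendarDay` but is not the real entity):

```java
// Sketch of the fluent mutator style on JHipster entities: each
// setter-like method returns `this`, so field assignment chains.
class FluentDemo {
    static class Day {
        private Long id;
        private String day;
        Day id(Long id) { this.id = id; return this; }
        Day day(String day) { this.day = day; return this; }
        @Override public String toString() {
            return "Day{id=" + id + ", day='" + day + "'}";
        }
    }

    public static void main(String[] args) {
        Day d = new Day().id(7L).day("2019-12-25");
        System.out.println(d); // Day{id=7, day='2019-12-25'}
    }
}
```

The plain JavaBean setters are kept as well so frameworks that rely on the `set*` convention (Jackson, JPA) still work.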
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/artifacts/IArtifactDAO.java
package br.ufpa.labes.spm.repository.interfaces.artifacts;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.service.dto.SimpleArtifactDescriptorDTO;
import br.ufpa.labes.spm.domain.Artifact;
public interface IArtifactDAO extends IBaseDAO<Artifact, String> {
Object[] getArtifactsIdentsFromProcessModelWithoutTemplates(String ident);
Artifact getByName(String name);
SimpleArtifactDescriptorDTO[] getInputArtifactsForNormal(String normalIdent);
SimpleArtifactDescriptorDTO[] getOutputArtifactsForNormal(String identActivity);
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ShareableService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.Shareable;
import br.ufpa.labes.spm.repository.ShareableRepository;
import br.ufpa.labes.spm.service.dto.ShareableDTO;
import br.ufpa.labes.spm.service.mapper.ShareableMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
/**
* Service Implementation for managing {@link Shareable}.
*/
@Service
@Transactional
public class ShareableService {
private final Logger log = LoggerFactory.getLogger(ShareableService.class);
private final ShareableRepository shareableRepository;
private final ShareableMapper shareableMapper;
public ShareableService(ShareableRepository shareableRepository, ShareableMapper shareableMapper) {
this.shareableRepository = shareableRepository;
this.shareableMapper = shareableMapper;
}
/**
* Save a shareable.
*
* @param shareableDTO the entity to save.
* @return the persisted entity.
*/
public ShareableDTO save(ShareableDTO shareableDTO) {
log.debug("Request to save Shareable : {}", shareableDTO);
Shareable shareable = shareableMapper.toEntity(shareableDTO);
shareable = shareableRepository.save(shareable);
return shareableMapper.toDto(shareable);
}
/**
* Get all the shareables.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<ShareableDTO> findAll() {
log.debug("Request to get all Shareables");
return shareableRepository.findAll().stream()
.map(shareableMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one shareable by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<ShareableDTO> findOne(Long id) {
log.debug("Request to get Shareable : {}", id);
return shareableRepository.findById(id)
.map(shareableMapper::toDto);
}
/**
* Delete the shareable by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete Shareable : {}", id);
shareableRepository.deleteById(id);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/activities/IActivityDAO.java
package br.ufpa.labes.spm.repository.interfaces.activities;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Activity;
public interface IActivityDAO extends IBaseDAO<Activity, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/NodeService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.domain.Node;
import br.ufpa.labes.spm.repository.NodeRepository;
import br.ufpa.labes.spm.service.dto.NodeDTO;
import br.ufpa.labes.spm.service.mapper.NodeMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.LinkedList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;
/**
* Service Implementation for managing {@link Node}.
*/
@Service
@Transactional
public class NodeService {
private final Logger log = LoggerFactory.getLogger(NodeService.class);
private final NodeRepository nodeRepository;
private final NodeMapper nodeMapper;
public NodeService(NodeRepository nodeRepository, NodeMapper nodeMapper) {
this.nodeRepository = nodeRepository;
this.nodeMapper = nodeMapper;
}
/**
* Save a node.
*
* @param nodeDTO the entity to save.
* @return the persisted entity.
*/
public NodeDTO save(NodeDTO nodeDTO) {
log.debug("Request to save Node : {}", nodeDTO);
Node node = nodeMapper.toEntity(nodeDTO);
node = nodeRepository.save(node);
return nodeMapper.toDto(node);
}
/**
* Get all the nodes.
*
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<NodeDTO> findAll() {
log.debug("Request to get all Nodes");
return nodeRepository.findAll().stream()
.map(nodeMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get all the nodes where TheStructure is {@code null}.
* @return the list of entities.
*/
@Transactional(readOnly = true)
public List<NodeDTO> findAllWhereTheStructureIsNull() {
log.debug("Request to get all nodes where TheStructure is null");
return StreamSupport
.stream(nodeRepository.findAll().spliterator(), false)
.filter(node -> node.getTheStructure() == null)
.map(nodeMapper::toDto)
.collect(Collectors.toCollection(LinkedList::new));
}
/**
* Get one node by id.
*
* @param id the id of the entity.
* @return the entity.
*/
@Transactional(readOnly = true)
public Optional<NodeDTO> findOne(Long id) {
log.debug("Request to get Node : {}", id);
return nodeRepository.findById(id)
.map(nodeMapper::toDto);
}
/**
* Delete the node by id.
*
* @param id the id of the entity.
*/
public void delete(Long id) {
log.debug("Request to delete Node : {}", id);
nodeRepository.deleteById(id);
}
}
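`findAllWhereTheStructureIsNull` resolves the inverse side of a one-to-one in memory: it loads every row and keeps those whose partner is absent. A standalone sketch of that filter shape (the `Node` record and field names here are illustrative, not the JPA entity):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;

// Sketch of the in-memory null-side filter used by the service:
// stream the full result set, keep rows whose one-to-one partner is null.
class NullSideFilterDemo {
    record Node(long id, String structure) {}

    static List<Long> orphanIds(Iterable<Node> all) {
        return StreamSupport.stream(all.spliterator(), false)
            .filter(n -> n.structure() == null) // same predicate shape as the service
            .map(Node::id)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Iterable<Node> all = List.of(
            new Node(1, "s1"), new Node(2, null), new Node(3, null));
        System.out.println(orphanIds(all)); // [2, 3]
    }
}
```

Note this scans the whole table on every call; if the repository supported it, a derived query along the lines of `findByTheStructureIsNull()` would push the filter into the database instead.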
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/SpmConfigurationResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.SpmConfigurationService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.SpmConfigurationDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.SpmConfiguration}.
*/
@RestController
@RequestMapping("/api")
public class SpmConfigurationResource {
private final Logger log = LoggerFactory.getLogger(SpmConfigurationResource.class);
private static final String ENTITY_NAME = "spmConfiguration";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final SpmConfigurationService spmConfigurationService;
public SpmConfigurationResource(SpmConfigurationService spmConfigurationService) {
this.spmConfigurationService = spmConfigurationService;
}
/**
* {@code POST /spm-configurations} : Create a new spmConfiguration.
*
* @param spmConfigurationDTO the spmConfigurationDTO to create.
 * @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new spmConfigurationDTO, or with status {@code 400 (Bad Request)} if the spmConfiguration already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/spm-configurations")
public ResponseEntity<SpmConfigurationDTO> createSpmConfiguration(@RequestBody SpmConfigurationDTO spmConfigurationDTO) throws URISyntaxException {
log.debug("REST request to save SpmConfiguration : {}", spmConfigurationDTO);
if (spmConfigurationDTO.getId() != null) {
throw new BadRequestAlertException("A new spmConfiguration cannot already have an ID", ENTITY_NAME, "idexists");
}
SpmConfigurationDTO result = spmConfigurationService.save(spmConfigurationDTO);
return ResponseEntity.created(new URI("/api/spm-configurations/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /spm-configurations} : Updates an existing spmConfiguration.
*
* @param spmConfigurationDTO the spmConfigurationDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated spmConfigurationDTO,
* or with status {@code 400 (Bad Request)} if the spmConfigurationDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the spmConfigurationDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/spm-configurations")
public ResponseEntity<SpmConfigurationDTO> updateSpmConfiguration(@RequestBody SpmConfigurationDTO spmConfigurationDTO) throws URISyntaxException {
log.debug("REST request to update SpmConfiguration : {}", spmConfigurationDTO);
if (spmConfigurationDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
SpmConfigurationDTO result = spmConfigurationService.save(spmConfigurationDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, spmConfigurationDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /spm-configurations} : get all the spmConfigurations.
*
* @param filter the filter of the request.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of spmConfigurations in body.
*/
@GetMapping("/spm-configurations")
public List<SpmConfigurationDTO> getAllSpmConfigurations(@RequestParam(required = false) String filter) {
if ("agent-is-null".equals(filter)) {
log.debug("REST request to get all SpmConfigurations where agent is null");
return spmConfigurationService.findAllWhereAgentIsNull();
}
log.debug("REST request to get all SpmConfigurations");
return spmConfigurationService.findAll();
}
/**
* {@code GET /spm-configurations/:id} : get the "id" spmConfiguration.
*
* @param id the id of the spmConfigurationDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the spmConfigurationDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/spm-configurations/{id}")
public ResponseEntity<SpmConfigurationDTO> getSpmConfiguration(@PathVariable Long id) {
log.debug("REST request to get SpmConfiguration : {}", id);
Optional<SpmConfigurationDTO> spmConfigurationDTO = spmConfigurationService.findOne(id);
return ResponseUtil.wrapOrNotFound(spmConfigurationDTO);
}
/**
* {@code DELETE /spm-configurations/:id} : delete the "id" spmConfiguration.
*
* @param id the id of the spmConfigurationDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/spm-configurations/{id}")
public ResponseEntity<Void> deleteSpmConfiguration(@PathVariable Long id) {
log.debug("REST request to delete SpmConfiguration : {}", id);
spmConfigurationService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/CalendarDayRepository.java
package br.ufpa.labes.spm.repository;
import br.ufpa.labes.spm.domain.CalendarDay;
import org.springframework.data.jpa.repository.*;
import org.springframework.stereotype.Repository;
/** Spring Data repository for the CalendarDay entity. */
@SuppressWarnings("unused")
@Repository
public interface CalendarDayRepository extends JpaRepository<CalendarDay, Long> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/SpmLogResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.SpmLogService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.SpmLogDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.SpmLog}.
*/
@RestController
@RequestMapping("/api")
public class SpmLogResource {
private final Logger log = LoggerFactory.getLogger(SpmLogResource.class);
private static final String ENTITY_NAME = "spmLog";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final SpmLogService spmLogService;
public SpmLogResource(SpmLogService spmLogService) {
this.spmLogService = spmLogService;
}
/**
* {@code POST /spm-logs} : Create a new spmLog.
*
* @param spmLogDTO the spmLogDTO to create.
 * @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new spmLogDTO, or with status {@code 400 (Bad Request)} if the spmLog already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/spm-logs")
public ResponseEntity<SpmLogDTO> createSpmLog(@RequestBody SpmLogDTO spmLogDTO) throws URISyntaxException {
log.debug("REST request to save SpmLog : {}", spmLogDTO);
if (spmLogDTO.getId() != null) {
throw new BadRequestAlertException("A new spmLog cannot already have an ID", ENTITY_NAME, "idexists");
}
SpmLogDTO result = spmLogService.save(spmLogDTO);
return ResponseEntity.created(new URI("/api/spm-logs/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /spm-logs} : Updates an existing spmLog.
*
* @param spmLogDTO the spmLogDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated spmLogDTO,
* or with status {@code 400 (Bad Request)} if the spmLogDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the spmLogDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/spm-logs")
public ResponseEntity<SpmLogDTO> updateSpmLog(@RequestBody SpmLogDTO spmLogDTO) throws URISyntaxException {
log.debug("REST request to update SpmLog : {}", spmLogDTO);
if (spmLogDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
SpmLogDTO result = spmLogService.save(spmLogDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, spmLogDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /spm-logs} : get all the spmLogs.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of spmLogs in body.
*/
@GetMapping("/spm-logs")
public List<SpmLogDTO> getAllSpmLogs() {
log.debug("REST request to get all SpmLogs");
return spmLogService.findAll();
}
/**
* {@code GET /spm-logs/:id} : get the "id" spmLog.
*
* @param id the id of the spmLogDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the spmLogDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/spm-logs/{id}")
public ResponseEntity<SpmLogDTO> getSpmLog(@PathVariable Long id) {
log.debug("REST request to get SpmLog : {}", id);
Optional<SpmLogDTO> spmLogDTO = spmLogService.findOne(id);
return ResponseUtil.wrapOrNotFound(spmLogDTO);
}
/**
* {@code DELETE /spm-logs/:id} : delete the "id" spmLog.
*
* @param id the id of the spmLogDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/spm-logs/{id}")
public ResponseEntity<Void> deleteSpmLog(@PathVariable Long id) {
log.debug("REST request to delete SpmLog : {}", id);
spmLogService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/plannerInfo/AgentInstSuggestionToAgentDAO.java
package br.ufpa.labes.spm.repository.impl.plannerInfo;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.plannerInfo.IAgentInstSuggestionToAgentDAO;
import br.ufpa.labes.spm.domain.AgentInstSuggestionToAgent;
public class AgentInstSuggestionToAgentDAO extends BaseDAO<AgentInstSuggestionToAgent, Integer>
implements IAgentInstSuggestionToAgentDAO {
protected AgentInstSuggestionToAgentDAO(Class<AgentInstSuggestionToAgent> businessClass) {
super(businessClass);
}
public AgentInstSuggestionToAgentDAO() {
super(AgentInstSuggestionToAgent.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/TaskAgendaMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.TaskAgendaDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link TaskAgenda} and its DTO {@link TaskAgendaDTO}.
*/
@Mapper(componentModel = "spring", uses = {})
public interface TaskAgendaMapper extends EntityMapper<TaskAgendaDTO, TaskAgenda> {
@Mapping(target = "theAgent", ignore = true)
@Mapping(target = "theProcessAgenda", ignore = true)
@Mapping(target = "removeTheProcessAgenda", ignore = true)
TaskAgenda toEntity(TaskAgendaDTO taskAgendaDTO);
default TaskAgenda fromId(Long id) {
if (id == null) {
return null;
}
TaskAgenda taskAgenda = new TaskAgenda();
taskAgenda.setId(id);
return taskAgenda;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/mapper/ReqAgentMapper.java
package br.ufpa.labes.spm.service.mapper;
import br.ufpa.labes.spm.domain.*;
import br.ufpa.labes.spm.service.dto.ReqAgentDTO;
import org.mapstruct.*;
/**
* Mapper for the entity {@link ReqAgent} and its DTO {@link ReqAgentDTO}.
*/
@Mapper(componentModel = "spring", uses = {AgentMapper.class, RoleMapper.class})
public interface ReqAgentMapper extends EntityMapper<ReqAgentDTO, ReqAgent> {
@Mapping(source = "theAgent.id", target = "theAgentId")
@Mapping(source = "theRole.id", target = "theRoleId")
ReqAgentDTO toDto(ReqAgent reqAgent);
@Mapping(source = "theAgentId", target = "theAgent")
@Mapping(source = "theRoleId", target = "theRole")
@Mapping(target = "theReqAgentRequiresAbilities", ignore = true)
@Mapping(target = "removeTheReqAgentRequiresAbility", ignore = true)
ReqAgent toEntity(ReqAgentDTO reqAgentDTO);
default ReqAgent fromId(Long id) {
if (id == null) {
return null;
}
ReqAgent reqAgent = new ReqAgent();
reqAgent.setId(id);
return reqAgent;
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/web/rest/PrimitiveParamResource.java
package br.ufpa.labes.spm.web.rest;
import br.ufpa.labes.spm.service.PrimitiveParamService;
import br.ufpa.labes.spm.web.rest.errors.BadRequestAlertException;
import br.ufpa.labes.spm.service.dto.PrimitiveParamDTO;
import io.github.jhipster.web.util.HeaderUtil;
import io.github.jhipster.web.util.ResponseUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Optional;
/**
* REST controller for managing {@link br.ufpa.labes.spm.domain.PrimitiveParam}.
*/
@RestController
@RequestMapping("/api")
public class PrimitiveParamResource {
private final Logger log = LoggerFactory.getLogger(PrimitiveParamResource.class);
private static final String ENTITY_NAME = "primitiveParam";
@Value("${jhipster.clientApp.name}")
private String applicationName;
private final PrimitiveParamService primitiveParamService;
public PrimitiveParamResource(PrimitiveParamService primitiveParamService) {
this.primitiveParamService = primitiveParamService;
}
/**
* {@code POST /primitive-params} : Create a new primitiveParam.
*
* @param primitiveParamDTO the primitiveParamDTO to create.
 * @return the {@link ResponseEntity} with status {@code 201 (Created)} and with body the new primitiveParamDTO, or with status {@code 400 (Bad Request)} if the primitiveParam already has an ID.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PostMapping("/primitive-params")
public ResponseEntity<PrimitiveParamDTO> createPrimitiveParam(@RequestBody PrimitiveParamDTO primitiveParamDTO) throws URISyntaxException {
log.debug("REST request to save PrimitiveParam : {}", primitiveParamDTO);
if (primitiveParamDTO.getId() != null) {
throw new BadRequestAlertException("A new primitiveParam cannot already have an ID", ENTITY_NAME, "idexists");
}
PrimitiveParamDTO result = primitiveParamService.save(primitiveParamDTO);
return ResponseEntity.created(new URI("/api/primitive-params/" + result.getId()))
.headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
.body(result);
}
/**
* {@code PUT /primitive-params} : Updates an existing primitiveParam.
*
* @param primitiveParamDTO the primitiveParamDTO to update.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the updated primitiveParamDTO,
* or with status {@code 400 (Bad Request)} if the primitiveParamDTO is not valid,
* or with status {@code 500 (Internal Server Error)} if the primitiveParamDTO couldn't be updated.
* @throws URISyntaxException if the Location URI syntax is incorrect.
*/
@PutMapping("/primitive-params")
public ResponseEntity<PrimitiveParamDTO> updatePrimitiveParam(@RequestBody PrimitiveParamDTO primitiveParamDTO) throws URISyntaxException {
log.debug("REST request to update PrimitiveParam : {}", primitiveParamDTO);
if (primitiveParamDTO.getId() == null) {
throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
}
PrimitiveParamDTO result = primitiveParamService.save(primitiveParamDTO);
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, primitiveParamDTO.getId().toString()))
.body(result);
}
/**
* {@code GET /primitive-params} : get all the primitiveParams.
*
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and the list of primitiveParams in body.
*/
@GetMapping("/primitive-params")
public List<PrimitiveParamDTO> getAllPrimitiveParams() {
log.debug("REST request to get all PrimitiveParams");
return primitiveParamService.findAll();
}
/**
* {@code GET /primitive-params/:id} : get the "id" primitiveParam.
*
* @param id the id of the primitiveParamDTO to retrieve.
* @return the {@link ResponseEntity} with status {@code 200 (OK)} and with body the primitiveParamDTO, or with status {@code 404 (Not Found)}.
*/
@GetMapping("/primitive-params/{id}")
public ResponseEntity<PrimitiveParamDTO> getPrimitiveParam(@PathVariable Long id) {
log.debug("REST request to get PrimitiveParam : {}", id);
Optional<PrimitiveParamDTO> primitiveParamDTO = primitiveParamService.findOne(id);
return ResponseUtil.wrapOrNotFound(primitiveParamDTO);
}
/**
* {@code DELETE /primitive-params/:id} : delete the "id" primitiveParam.
*
* @param id the id of the primitiveParamDTO to delete.
* @return the {@link ResponseEntity} with status {@code 204 (NO_CONTENT)}.
*/
@DeleteMapping("/primitive-params/{id}")
public ResponseEntity<Void> deletePrimitiveParam(@PathVariable Long id) {
log.debug("REST request to delete PrimitiveParam : {}", id);
primitiveParamService.delete(id);
return ResponseEntity.noContent().headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString())).build();
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/types/IWorkGroupTypeDAO.java
package br.ufpa.labes.spm.repository.interfaces.types;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.WorkGroupType;
public interface IWorkGroupTypeDAO extends IBaseDAO<WorkGroupType, String> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/interfaces/TemplateServices.java
package br.ufpa.labes.spm.service.interfaces;
import java.util.List;
import br.ufpa.labes.spm.service.dto.TemplateDTO;
import br.ufpa.labes.spm.exceptions.DAOException;
import br.ufpa.labes.spm.exceptions.WebapseeException;
public interface TemplateServices {
public Boolean createTemplate(String ident);
public String createNewVersion(String template_id, String why, String version_id);
public void processDistilling(String process_id, String template_id);
public List<TemplateDTO> getTemplates();
public boolean copyTemplate(String newTemplateIdent, String oldTemplateIdent) throws DAOException;
public boolean toBecomeDefined(String template_id) throws WebapseeException;
public void processInstantiation(String template_id, String instance_id, String userIdent)
throws WebapseeException;
public Object[] getArtifactsIdentsFromProcessModelWithoutTemplates(String template_id);
public void processComposition(
String template_id, String currentLevel_id, Object[] artifactsIdentsFromUser)
throws DAOException;
}
<file_sep>/src/main/java/br/ufpa/labes/spm/service/ResourceService.java
package br.ufpa.labes.spm.service;
import br.ufpa.labes.spm.service.dto.ResourceDTO;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import java.util.List;
import java.util.Optional;
/**
* Service Interface for managing {@link br.ufpa.labes.spm.domain.Resource}.
*/
public interface ResourceService {
/**
* Save a resource.
*
* @param resourceDTO the entity to save.
* @return the persisted entity.
*/
ResourceDTO save(ResourceDTO resourceDTO);
/**
* Get all the resources.
*
* @return the list of entities.
*/
List<ResourceDTO> findAll();
/**
* Get all the resources with eager load of many-to-many relationships.
*
* @return the list of entities.
*/
Page<ResourceDTO> findAllWithEagerRelationships(Pageable pageable);
/**
* Get the "id" resource.
*
* @param id the id of the entity.
* @return the entity.
*/
Optional<ResourceDTO> findOne(Long id);
/**
* Delete the "id" resource.
*
* @param id the id of the entity.
*/
void delete(Long id);
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/resources/ShareableDAO.java
package br.ufpa.labes.spm.repository.impl.resources;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.resources.IShareableDAO;
import br.ufpa.labes.spm.domain.Shareable;
public class ShareableDAO extends BaseDAO<Shareable, String> implements IShareableDAO {
protected ShareableDAO(Class<Shareable> businessClass) {
super(businessClass);
}
public ShareableDAO() {
super(Shareable.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/email/EmailDAO.java
package br.ufpa.labes.spm.repository.impl.email;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.email.IEmailDAO;
import br.ufpa.labes.spm.domain.Email;
public class EmailDAO extends BaseDAO<Email, String> implements IEmailDAO {
protected EmailDAO(Class<Email> businessClass) {
super(businessClass);
}
public EmailDAO() {
super(Email.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/plannerInfo/IResourceInstantiationSuggestionDAO.java
package br.ufpa.labes.spm.repository.interfaces.plannerInfo;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.ResourceInstSug;
public interface IResourceInstantiationSuggestionDAO extends IBaseDAO<ResourceInstSug, Integer> {}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/impl/connections/SimpleConDAO.java
package br.ufpa.labes.spm.repository.impl.connections;
import br.ufpa.labes.spm.repository.impl.BaseDAO;
import br.ufpa.labes.spm.repository.interfaces.connections.ISimpleConDAO;
import br.ufpa.labes.spm.domain.SimpleCon;
public class SimpleConDAO extends BaseDAO<SimpleCon, String> implements ISimpleConDAO {
protected SimpleConDAO(Class<SimpleCon> businessClass) {
super(businessClass);
}
public SimpleConDAO() {
super(SimpleCon.class);
}
}
<file_sep>/src/main/java/br/ufpa/labes/spm/repository/interfaces/resources/IShareableDAO.java
package br.ufpa.labes.spm.repository.interfaces.resources;
import br.ufpa.labes.spm.repository.interfaces.IBaseDAO;
import br.ufpa.labes.spm.domain.Shareable;
public interface IShareableDAO extends IBaseDAO<Shareable, String> {}
| b0d7ca2e5f19a4676c903b70d526e3e8080ea0e3 | [
"Java",
"Shell"
]
| 265 | Java | Lakshamana/spm | ba98a47557f196fa1429670e52549c188bbf00a1 | 366f425e66d40a50c1248eb47a677315639555df |
refs/heads/master | <file_sep><?php
/*
CHAPTER WEBSERVICES: SAMPLE OUTPUT PAGE
*/
// include helper functions
require "functions.php";
// CUSTOM PARAMETERS
// date_default_timezone_set needs to be defined if timezone is not in php.ini configuration
// change to whatever timezone you want
date_default_timezone_set('Europe/Zurich');
$Email = ''; // define variable
$UserID = '3'; // get this parameter from your ISOC chapter record
$Password = '<PASSWORD>'; // get this parameter from your ISOC chapter record
// get e-mail value from form
if(isset($_POST['Email'])) {
$Email = $_POST['Email'];
}
// get data with given parameters
$record = VerifyISOCMembership($Email, $UserID, $Password);
print '<form method="post" id="form">
<h1>Test Chapter Web Services</h1>
<label for="Email">Email: </label>
<input name="Email" value="'.$Email.'" type="text" style="width:300px;" />
<input type="submit" name="submit" value="Submit" />
</form>';
if(isset($_POST['Email'])) {
// custom output as array format
echo ArrayOutput($record);
// custom output as table format - uncomment to see result
// echo TableOutput($record);
}
/************ EXAMPLES OF CUSTOM OUTPUT **************/
// example to see array structure
function ArrayOutput($data) {
return '<pre>'.print_r($data,true).'</pre>';
}
// example with error handling
function TableOutput($data) {
if(isset($data['ERRORS']['Error'])) {
$output = ErrorValidation($data['ERRORS']['Error']);
} else {
$row = $data['VALUES']['Row'];
$PersonID = $row['PersonID'];
$MemberType = $row['MemberType'];
$MemberStatus = $row['MemberStatus'];
$CountryISOCode = $row['CountryISOCode'];
$CountryName = $row['CountryName'];
$Comments = $row['Comments'];
$output = '
<table>
<tr><td>Person ID</td><td>'.$PersonID.'</td></tr>
<tr><td>Member Type</td><td>'.$MemberType.'</td></tr>
<tr><td>Member Status</td><td>'.$MemberStatus.'</td></tr>
<tr><td>Country ISO Code</td><td>'.$CountryISOCode.'</td></tr>
<tr><td>Country Name</td><td>'.$CountryName.'</td></tr>
<tr><td>Comments</td><td>'.$Comments.'</td></tr>
</table>';
}
return $output;
}
?><file_sep><?php
/*
Class: class.soapclient_cache.php
Author: <NAME>
Version: 1.0
Date: 2010-08-31
Extends Nusoap nusoap_client class to add a TTL property. If the TTL is set
(value in minutes), the call method will call the remote web service and keep a
local cache of results. Further calls to the same combination of web service
and service parameters will result in cached results being served until the TTL
expires, at which time the cached data will be refreshed by a new call to the remote
service. Useful for caching frequently used background data that changes rarely.
Should not be used when posting forms and expecting a specific result.
Usage:
- Include this class after including the nusoap base classes:
require_once ('/lib/nusoap.php');
require_once ('/lib/class.wsdlcache.php');
require_once ('/lib/class.soapclient_cache.php');
- Create a client using the extended class rather than the nusoap client class:
$client = new nusoap_cache_client($wsdl,'wsdl','','','','');
- Optional: set path to cache directory (defaults to /tmp)
$client->CacheDir = "/mycache/data";
- Set a TTL in minutes (set TTL to 0 for direct calls with no caching):
- NB: When posting form data, remember to set TTL to 0!
$client->TTL="1";
- Call the usual Nusoap call method:
$result = $client->call ($WS_service,$param); // Returns cached or live data
*/
class nusoap_cache_client extends nusoap_client
{
public $TTL;
public $CacheDir;
function call($WS_service,$WS_params=array(),$namespace='http://tempuri.org',$soapAction='',$headers=false,$rpcParams=null,$style='rpc',$use='encoded'){
// added by henri, apr 2013, variable cur_time needs to be defined (notice error)
$cur_time="";
$WS_endpoint = $this->operations[$WS_service]['endpoint'];
$WS_TTL = $this->TTL;
$WS_CacheDir = ($this->CacheDir == '' ? '/tmp' : $this->CacheDir);
// Check if cached data exists for this call
// Look for a cache file named MD5(WS_endpoint + WS name + serialized(WS params))
$WS_cacheFileName = $WS_CacheDir."/wscache-".md5($WS_endpoint.$WS_service.serialize($WS_params));
if (file_exists($WS_cacheFileName)) {
// Cache file exists, read cache data
$WS_serialized = file_get_contents($WS_cacheFileName);
$WS_cachearray = unserialize($WS_serialized);
// Extract cache timestamp
$cache_timestamp = $WS_cachearray["WStimestamp"];
// check cache TTL
$cur_time=time();
$cache_age = floor(($cur_time - $cache_timestamp) / 60);
if ($cache_age < $WS_TTL) {
// Cache has not expired, return cache data
return $WS_cachearray["WSresult"];
}
// If here, cache data is stale - so continue to call remote service
}
// Call remote web service by passing params onto the usual Nusoap call method
$WS_result = parent::call($WS_service,$WS_params,$namespace,$soapAction,$headers,$rpcParams,$style,$use);
// If call to web service returned nothing, try returning stale cache data rather than no data
if (empty($WS_result)) {
if (!empty($WS_cachearray)) return $WS_cachearray["WSresult"];
return $WS_result;
}
// Write web service data to cache if a TTL was specified
if ($WS_TTL <> 0) {
$WS_cachearray = array(
"WSendpoint" => $WS_endpoint, // URL of the web service
"WSname" => $WS_service, // Web service being called
"WSparams" => $WS_params, // Array of web service parameters
"WStimestamp" => $cur_time, // Timestamp of received data
"WSTTL" => $WS_TTL, // Time to Live for cached data
"WSresult" => $WS_result ); // Cached web service results
// Serialize the cache array
$WS_serialized = serialize($WS_cachearray);
// Write the cache data
$fh = fopen($WS_cacheFileName, 'w') ;
if (flock($fh, LOCK_EX | LOCK_NB)) {
fwrite($fh, $WS_serialized);
flock($fh, LOCK_UN);
fclose($fh);
}
}
// Return the data retrieved by the web service
return $WS_result;
}
}
?>
<file_sep># ChapterWebservices
Application to get ISOC membership information based on submitted e-mail address
Author: <NAME>, Internet Society
Date: 21 October 2016
Installation:
- Copy the folder ChapterWebServices to your location
- Visit: http://yourdomain/yourlocation
- see also attached documentation: ISOC-Chapter-Web-Service-Guide-v1_0 2016-01-10
For questions, please contact <EMAIL><file_sep><?php
/**
* Helper functions for Chapter Web Services
*/
function isocform_connect(){
// update this path if you put the nusoap library somewhere else
$path = 'nusoap/lib';
require_once $path.'/nusoap.php';
require_once $path.'/class.wsdlcache.php';
require_once $path.'/class.soapclient_cache.php';
$url = 'https://portal.isoc.org:25682/ChapterWS/ChapterWebServices.asmx?WSDL';
$wsdl = new wsdl($url, '', '', '', '', 15);
$client = new nusoap_cache_client($wsdl,'wsdl','','','','');
return $client;
}
function GetToken($Password, $UserID) {
$parameters = array('UserID' => $UserID,'Password' => $Password);
$client = isocform_connect();
$result = $client->call ('GetVendorToken',$parameters);
return $result['GetVendorTokenResult'];
}
function VerifyISOCMembership($Email, $UserID, $Password) {
$GetVendorToken = GetToken($Password, $UserID);
$parameters = array('Email' => $Email, 'UserID' => $UserID,'Token' => $GetVendorToken);
$client = isocform_connect();
$result = $client->call ('VerifyISOCMembership',$parameters);
$output = $result['VerifyISOCMembershipResult']['VerifyISOCMembership_RESULTS'];
return $output;
}
function ErrorValidation($value) {
$error_message = '';
if (isset($value)) {
$error[1] = 'Unknown Error';
$error[2] = 'Authentication Error';
$error[3] = 'Missing required parameter.';
$error[27] = 'Missing required login parameter';
$error[40] = 'Invalid xml document parameter';
$error[41] = 'Error while reading XML document';
$error_message = 'Error '.$value.': '.$error[$value];
}
return $error_message;
}
?> | fef3e64d9484566dc42d98cdb07bcd1ae6546d4c | [
"Markdown",
"PHP"
]
| 4 | PHP | InternetSociety/ChapterWebservices | 1729bb41333101bd892cc8307eb9c58a37aa7d15 | ba77a771aca5466b96ce42698fba985512eb9424 |
refs/heads/master | <repo_name>enixdark/simple_django-angular<file_sep>/README.md
simple_django-angular
=====================
A simple project using Django and AngularJS for practice
<file_sep>/apps/snippets/views.py
from django.shortcuts import render
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt
from rest_framework.renderers import JSONRenderer
from rest_framework.parsers import JSONParser
from .models import Snippet
from .serializers import SnippetSerializer
class JSONResponse(HttpResponse):
def __init__(self,data,**kwargs):
content = JSONRenderer().render(data)
kwargs['content_type'] = 'application/json'
super(JSONResponse,self).__init__(content,**kwargs)
<file_sep>/apps/djangular/admin.py
from django.contrib import admin
from rest_framework import routers
from . import views
# Register your models here.
router = routers.DefaultRouter()
router.register(r'users', views.UserViewSet)
router.register(r'groups', views.GroupViewSet) | 9ba322af6c35d522dd94daa683dbfc190fe69ab2 | [
"Markdown",
"Python"
]
| 3 | Markdown | enixdark/simple_django-angular | b398f76e538ae4c0ef5510860fffe4cde0de4cbd | 8824c9f8196df948c906759ef066db3f860eaf6d |
refs/heads/master | <repo_name>geekmlle/TEST01<file_sep>/HelloWorld/src/main/java/com/michelle/InsuranceServlet.java
package com.michelle;
/**
* Created by blondieymollo on 2/21/16.
*/
import java.io.IOException;
import javax.servlet.RequestDispatcher;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.model.Database;
@WebServlet("/InsuranceServlet")
public class InsuranceServlet extends HttpServlet {
private static final long serialVersionUID = 1L;
protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
Database db = new Database();
request.setAttribute("code", db.readFile());
RequestDispatcher rd = getServletContext().getRequestDispatcher("/servlets/insurance.jsp");
rd.forward(request, response);
}
}
<file_sep>/Battleship/src/com/michelle/Play.java
package com.michelle;
import java.util.Scanner;
/**
* Created by blondieymollo on 3/7/16.
*/
public class Play {
private Board ub = null;
private Board cb = null;
private Board tb = null;
private boolean winner = false;
public Play(Board ub, Board cb, Board tb) {
this.ub = ub;
this.cb = cb;
this.tb = tb;
Start();
}
private void Start(){
System.out.println("Are you ready to start? (Press any key to start)");
Scanner sc = new Scanner(System.in);
if (sc.hasNextLine()){
do{
userTurn();
checkWinner();
computerTurn();
checkWinner();
}while (!winner);
}
}
private void userTurn(){
int error = -1;
int j = 0, i = 0;
tb.printBoard();
System.out.println("Where would you like to attack? (Letter/Number)");
Scanner sc = new Scanner(System.in);
while(error == -1) {
if (sc.hasNextLine()) {
try {
String[] temp = sc.nextLine().split("/");
try {
j = tb.getIndex(temp[0]);
i = Integer.parseInt(temp[1]);
error = 1;
} catch (Exception e) {
System.out.println("You need to specify a letter/number!!");
error = -1;
}
} catch (Exception e) {
System.out.println("There was an error there, could you try again to write the Letter/Number of your choice?");
error = -1;
}
}
if (tb.coordinatesOutOfBounds(i, j)){
System.out.println("You can't attack outside the board! Pick another coordinate");
error = -1;
}
if (!tb.Null(i, j)){
System.out.println("There was an attack there already! Pick another coordinate");
error = -1;
}
}
int result = cb.attack(i,j);
switch (result){
case -1: {
tb.insertTurn(i,j,"$");
tb.printBoard();
System.out.println("Snap! There was no ship there!");
break;
}
case 1:{
cb.insertSunkenShip(cb.returnCoordinatesOfShip(i,j));
tb.printBoard();
System.out.println("Congratulations! You sank a ship!");
break;
}
}
}
private void computerTurn(){
int error = -1;
while(error == -1) {
int i = (int )(Math.random() * tb.getBoardSize() + 1);
int j = (int )(Math.random() * tb.getBoardSize() + 1);
if (tb.coordinatesOutOfBounds(i, j) || !tb.Null(i, j)) {
error = -1;
} else {
int result = ub.attack(i,j);
switch (result){
case -1: {
tb.insertTurn(i,j,"$");
System.out.println("Phew! The computer didn't hit any of your ships");
break;
}
case 1:{
ub.insertSunkenShip(ub.returnCoordinatesOfShip(i,j));
System.out.println("Doh! The computer sunk one of your ships!");
break;
}
}
error = 1;
}
}
System.out.println("The computer has finished it's turn!");
}
private boolean checkWinner(){
if(!cb.shipsLeft()){
//User wins
System.out.println("Oh wow! You won! Congratulations!");
winner = true;
return true;
}
if(!ub.shipsLeft()){
//User wins
System.out.println("Oh nou! The evil computer won!");
winner = true;
return true;
}
return false;
}
}
<file_sep>/Battleship/src/com/michelle/GamePrep.java
package com.michelle;
import java.util.Scanner;
/**
* Created by blondieymollo on 2/24/16.
*/
public class GamePrep {
private int size = -1;
private String[] ships = {"Battleship", "first Cruiser", "second Cruiser",
"first Frigate","second Frigate","third Frigate","fourth Frigate",
"first Minesweeper","second Minesweeper","third Minesweeper","fourth Minesweeper"
};
public GamePrep(){
printWelcomeMessage();
askArraySize();
initializeGame();
}
private void printWelcomeMessage(){
System.out.println("Welcome to Battleship!");
System.out.println("Created by <NAME>");
System.out.println("");
System.out.println("Let's start!!");
System.out.println("");
System.out.println("");
}
private void askArraySize(){
System.out.println("How big should the board be? (Pick a number from 10 - 15 and press enter)");
while (size == -1) {
Scanner sc = new Scanner(System.in);
if (sc.hasNextLine()) {
try {
size = Integer.parseInt(sc.nextLine());
} catch (Exception e) {
System.out.println("There was an error with the size, please try with a number from 10 - 15!");
}
}
if (size < 10 && size != -1){
System.out.println("That board size is too small, try 10 or bigger!");
size = -1;
}
if (size > 15 && size != -1){
System.out.println("That board size is too big, try 15 or smaller!");
size = -1;
}
}
}
private void initializeGame(){
Board ub = new Board(size);
Board cb = new Board(size);
Board tb = new Board(size);
//placeShips(ub);
autoPlaceShips(ub);
placeComputerShips(cb);
new Play(ub, cb, tb);
}
private void autoPlaceShips (Board board){
//Debug method that places the ships automatically for testing. Preferred board size is 12
int j = 1;
int kind = 0;
for (int i = 1; i<= ships.length; i++){
board.addShipToBoard(kind,i,j,"h");
kind++;
}
System.out.println("All done. Proceed");
board.printBoard();
}
private void placeShips (Board board){
//This is the one that should help the user place the ships
int error = -1;
int kind = 0;
int i = 0;
int j = 0;
String orientation = "";
System.out.println("Time to place your ships on the board!!");
for (String ship : ships){
do{
board.printBoard();
System.out.println("Where would you like to place the "+ship+" ? (Letter/Number)");
Scanner sc = new Scanner(System.in);
while(error == -1) {
if (sc.hasNextLine()) {
try {
String[] temp = sc.nextLine().split("/");
try {
j = board.getIndex(temp[0]);
i = Integer.parseInt(temp[1]);
error = 1;
} catch (Exception e) {
System.out.println("You need to specify the letter/number!!");
error = -1;
}
} catch (Exception e) {
System.out.println("There was an error there, could you try again to write the Letter/Number of your choice?");
error = -1;
}
}
}
error = -1;
Scanner sc2 = new Scanner(System.in);
while (error == -1) {
System.out.println("Horizontally or Vertically? (H/V)");
if (sc2.hasNextLine()) {
String temp1 = sc2.nextLine();
if (temp1.equalsIgnoreCase("h") || temp1.equalsIgnoreCase("v")) {
orientation = temp1;
error = 1;
} else {
System.out.println("There was an error, please choose H or V.");
error = -1;
}
}
}
error = -1;
} while (!checkError(board, i, j, kind, orientation));
board.addShipToBoard(kind,i,j,orientation);
kind++;
}
}
private boolean checkError(Board board, int i, int j, int kind, String orientation){
if (board.coordinatesOutOfBounds(i, j)){
System.out.println("You can't place a ship outside the board! Pick another coordinate");
return false;
}
if (board.shipOutOfBounds(i,j,kind,orientation)){
System.out.println("Uh Oh! The ship falls outside the board! Pick another coordinate");
return false;
}
if (board.shipAlreadyExistsThere(i, j, kind, orientation)){
System.out.println("There is a ship there already! Pick another coordinate");
return false;
}
return true;
}
private void placeComputerShips (Board board){
//This method will auto place the ships on the computer board
int kind = 0;
int randomi = 0;
int randomj = 0;
String orientation = "";
for (String ship : ships){
//System.out.println("Placing the computer's "+ship);
do{
randomi = (int )(Math.random() * size + 1);
randomj = (int )(Math.random() * size + 1);
int randomOrientation = (int )(Math.random() * 2 + 1);
if (randomOrientation == 1){
orientation = "h";
} else {
orientation ="v";
}
} while (!checkComputerError(board, randomi, randomj, kind, orientation));
board.addShipToBoard(kind,randomi, randomj,orientation);
kind++;
}
System.out.println("The computer has placed its ships!");
board.printBoard();
}
private boolean checkComputerError(Board board, int i, int j, int kind, String orientation){
if (board.shipOutOfBounds(i,j,kind,orientation)){
return false;
}
if (board.coordinatesOutOfBounds(i, j)){
return false;
}
if (board.shipAlreadyExistsThere(i, j, kind, orientation)){
return false;
}
return true;
}
}
| dd2fd331c556f8d93b1327c28c0ec01c406732f5 | [
"Java"
]
| 3 | Java | geekmlle/TEST01 | 56e75cbfbf9ad71541909ad9c3fd814d57f8c958 | 43564736ab00ae9ff0b1969714113c84e7c579ac |
refs/heads/master | <file_sep>#include<ESP8266WiFi.h>
#include<ArduinoJson.h>
#include<PubSubClient.h>
#include <Adafruit_NeoPixel.h>
#include <DHT.h>
DHT sensor(D1, DHT11);
DynamicJsonDocument status(16); // receives relay commands
DynamicJsonDocument json(32); // receives light commands
DynamicJsonDocument dhtjson(32); // DHT readings
WiFiClient client;
PubSubClient mqttclient(client);
Adafruit_NeoPixel pixels = Adafruit_NeoPixel(8, D7, NEO_GRB + NEO_KHZ800);
const char* light = "xaut/light";
const char* dht = "xaut/temp/dhtsensor";
const char* will_topic = "xaut/temp/connectstatus";
const char* switch1 = "xaut/hlswitch";
int r, g, b;
void callback(char* topic, byte* payload, unsigned int length)
{
// light
Serial.println(topic);
if (!strcmp(topic, light)) {
deserializeJson(json, payload, length);
//Serial.println(status);
if (json["status"].as<String>() == "party") {
for(int k=0; k<5; k++){
for (int i = 0; i < 8; i++) {
r = random(0, 100);
g = random(0, 100);
b = random(0, 100);
pixels.setPixelColor(i, pixels.Color(r, g, b));
pixels.show();
delay(100);
}
for (int i = 0; i < 8; i++) {
r=g=b=0;
pixels.setPixelColor(i, pixels.Color(r, g, b));
pixels.show();
//delay(100);
}
}
} if (json["status"].as<String>() == "sleep") {
for (int i = 0; i < 8; i++) {
r = 5;
g = 5;
b = 5;
pixels.setPixelColor(i, pixels.Color(r, g, b));
pixels.show();
delay(100);
}
}
}
// relay
if (!strcmp(topic, switch1)) {
deserializeJson(status, payload, length);
//Serial.println(status);
if (status["status"].as<String>() == "on") {
digitalWrite(D6, LOW);
} if (status["status"].as<String>() == "off") {
digitalWrite(D6, HIGH);
}
}
}
void setup() {
pinMode(D7, OUTPUT);
digitalWrite(D7, HIGH);
pinMode(D6, OUTPUT);
digitalWrite(D6, HIGH);
Serial.begin(115200);
WiFi.mode(WIFI_STA);
WiFi.begin("Honor 8 Lite", "9804120109zyy.");
while (!WiFi.isConnected()) {
delay(500);
Serial.print(".");
}
pixels.begin(); // start the LED strip
sensor.begin(); // start the DHT sensor
Serial.println(WiFi.localIP());
configTime(8 * 3600, 0, "pool.ntp.org"); // fetch the time once at startup
mqttclient.setServer("192.168.43.192", 1883);
mqttclient.setCallback(callback);
}
void loop() {
time_t now = time(nullptr);
Serial.print(ctime(&now));
if (!mqttclient.connected()) {
//mqttclient.connect("general");//客户端id
String clientID = "general";
if (mqttclient.connect(clientID.c_str(), will_topic, 1, true, "{\"online\":\"false\"}")) {
mqttclient.publish(will_topic, "{\"online\":\"true\"}", true);
}
else
{
Serial.println("connect failed code = " + mqttclient.state());
}
mqttclient.subscribe(light, 1);
mqttclient.subscribe(switch1, 1);
}
dhtjson["temperature"] = sensor.readTemperature();
dhtjson["humidity"] = sensor.readHumidity();
String jsonstr;
serializeJsonPretty(dhtjson, jsonstr);
Serial.println("DHTSensor" + jsonstr);
mqttclient.publish(dht, jsonstr.c_str());
mqttclient.loop(); // sends keepalives automatically to maintain the connection to the server
delay(1000);
}
<file_sep># Arduino_SmartHome
# This is a simple smart-home system based on the NodeMCU board, mainly using the MQTT transport protocol. The main modules are:
. DHT11 temperature/humidity sensor module
. Relay module
. Neopixel digital RGB LED module
. NodeMCU control hub
. Node-RED web interface for interaction
## Development environment:
. Mosquitto as the MQTT server
. Node-RED
. Arduino-1.8.8
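The NodeMCU firmware in this repo exchanges small JSON payloads over MQTT: it publishes readings on `xaut/temp/dhtsensor` and accepts `{"status": ...}` commands on `xaut/light` and `xaut/hlswitch`. As a rough sketch of the client side of that contract (for example a test script feeding the Node-RED flow), the Python helpers below build and parse those payloads; the topic names and status values come from the `.ino` file, while the helper function names are made up for illustration. Nothing here talks to a broker, so the payload format can be checked in isolation.

```python
import json

# Topics used by the NodeMCU firmware in this repo (taken from the .ino file).
TOPIC_DHT = "xaut/temp/dhtsensor"
TOPIC_LIGHT = "xaut/light"
TOPIC_SWITCH = "xaut/hlswitch"

def light_command(mode):
    """Build the payload for xaut/light; the firmware understands 'party' and 'sleep'."""
    if mode not in ("party", "sleep"):
        raise ValueError("unsupported light mode: %r" % mode)
    return json.dumps({"status": mode})

def switch_command(on):
    """Build the payload for xaut/hlswitch; the firmware understands 'on' and 'off'."""
    return json.dumps({"status": "on" if on else "off"})

def parse_dht(payload):
    """Decode a reading published on xaut/temp/dhtsensor into (temperature, humidity)."""
    data = json.loads(payload)
    return data["temperature"], data["humidity"]
```

With a real broker, these strings would simply be handed to any MQTT client's publish call against the topics above.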
| c35a6cc5e993607c89ad9f322f5a82083ad009a1 | [
"Markdown",
"C++"
]
| 2 | C++ | Henry630401/Arduino_SmartHome | 7f0f971d736454bf0c7b27a8d95b6996000120aa | 5b28cf62f610885ca953c37f027fd1d55595f4ae |
refs/heads/master | <file_sep>## Autor
* <NAME> <file_sep>using Microsoft.AspNetCore.Mvc;
using practica1.Models;
using practica1.Data;
using System.Linq;
namespace practica1.Controllers
{
public class MatriculasController : Controller
{
private readonly ApplicationDbContext _context;
public MatriculasController(ApplicationDbContext context)
{
_context = context;
}
public IActionResult Index()
{
return View(_context.DataMatriculas.ToList());
}
public IActionResult Create()
{
return View();
}
[HttpPost]
public IActionResult Create(Matriculas objMatriculas)
{
_context.Add(objMatriculas);
_context.SaveChanges();
ViewData["Message"] = "Se ha registrado satisfactoriamente";
return View();
}
public IActionResult Edit(int id)
{
Matriculas objMatriculas = _context.DataMatriculas.Find(id);
if (objMatriculas == null)
{
return NotFound();
}
return View(objMatriculas);
}
[HttpPost]
public IActionResult Edit(int id, [Bind("Id,Name,Categoria,Precio,Descuento")] Matriculas objMatriculas)
{
_context.Update(objMatriculas);
_context.SaveChanges();
ViewData["Message"] = "El contacto ya esta actualizado";
return View(objMatriculas);
}
public IActionResult Delete(int id)
{
Matriculas objMatriculas = _context.DataMatriculas.Find(id);
_context.DataMatriculas.Remove(objMatriculas);
_context.SaveChanges();
return RedirectToAction(nameof(Index));
}
}
} | ad3ff245c537a2e354a90a2413d6d34844567d8f | [
"Markdown",
"C#"
]
| 2 | Markdown | armaneeuu/practica1 | 5c34b009896dfb89472b859a14c6b1f16fdea601 | 941ce6be84be1cdfe16ce6e2de75623e376343d0 |
refs/heads/master | <repo_name>ewsol/szkolenieWawa<file_sep>/src/it/systems/kuchenka/Kuchenka.java
package it.systems.kuchenka;
public interface Kuchenka {
void ustawCzas(int minuty);
void ustawTemperaturę(int temperatura);
void uruchom();
void wyłącz();
}
<file_sep>/src/it/jsystems/dziedziczenie/inter/Student.java
package it.jsystems.dziedziczenie.inter;
public interface Student {
String getSubject();
}
<file_sep>/src/it/jsystems/dziedziczenie/Hut.java
package it.jsystems.dziedziczenie;
public class Hut extends Product {
public Hut(String name, double price) {
super(name, price);
// TODO Auto-generated constructor stub
}
}
<file_sep>/src/it/jsystems/szkolenie/Test.java
package it.jsystems.szkolenie;
import java.util.Scanner;
public class Test {
public static void main(String[] args) {
Scanner scanner = new Scanner(System.in);
String operationType = scanner.nextLine();
Operations operationToPerform = new Multiply();
if ("S".equalsIgnoreCase(operationType)) {
operationToPerform = new Sum();
}
int first = 8;
int second = 2;
int operationResult = operationToPerform.perform(first, second);
System.out.println("result of operation is: " + operationResult);
}
}
<file_sep>/src/it/jsystems/szkolenie/Sum.java
package it.jsystems.szkolenie;
public class Sum implements Operations {
@Override
public int perform(int first, int second) {
return Math.addExact(first, second);
}
}
<file_sep>/src/it/jsystems/defaults/MyInterface.java
package it.jsystems.defaults;
public interface MyInterface {
default void sayHello(String imie) {
System.out.println("cześć" + imie + "2");
}
}
<file_sep>/src/it/jsystems/abstracts/MyClass.java
package it.jsystems.abstracts;
public class MyClass{
public static void main(String[] args) {
Human person = new Human("Ewa", 32, true, "white");
System.out.println("Name of created animail is: " + person.getName());
System.out.println("Age: " + person.getAge());
System.out.println("Her race is: " + person.getRace());
if(person.isPredator) {
System.out.println("and she is a predator.");
}
GoldFish fish = new GoldFish("goldie", 1, true, "oranda");
fish.pływaj();
System.out.println("Her breed is: " + fish.getBreed());
if(fish.slodkowodna) {
System.out.println("and she is a fresh-water.");
}
}
}
<file_sep>/src/it/jsystems/wzorce/Singleton.java
package it.jsystems.wzorce;
public class Singleton {
private static volatile Singleton instance = null;
private Singleton() {
}
public static void main(String[] args) {
Singleton s1 = Singleton.getInstance();
Singleton s2 = Singleton.getInstance();
System.out.println("Czy to to samo? " + (s1==s2));
}
static Singleton getInstance() {
if (instance == null) {
synchronized (Singleton.class) { // synchronize to make sure another thread has not created the instance in the meantime
if (instance == null) {
instance = new Singleton();
}
}
}
return instance;
}
}
<file_sep>/src/it/jsystems/dziedziczenie/polimorf/Odziez.java
package it.jsystems.dziedziczenie.polimorf;
public class Odziez extends Produkt {
private String rozmiar;
public Odziez(String nazwa, int cena, String rozmiar) {
super(nazwa, cena);
this.rozmiar = rozmiar;
}
public String getRozmiar() {
return rozmiar;
}
public void setRozmiar(String rozmiar) {
this.rozmiar = rozmiar;
}
}
<file_sep>/src/it/jsystems/finals/Psiarnia.java
package it.jsystems.finals;
public class Psiarnia {
public static void main(String[] args) {
Pieseł pies = new Pieseł("Sonia");
pies.szczeka();
System.out.println(pies.toString());
}
}
<file_sep>/src/it/jsystems/dziedziczenie/polimorf/Test.java
package it.jsystems.dziedziczenie.polimorf;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
public class Test {
public static void main(String[] args) {
Produkt rower = new Produkt("Rower", 100);
Produkt czapka = new Odziez("Spódnica", 90, "S");
Produkt kosmetyk = new Kosmetyk("Dior", 400, "perfumy", false);
Produkt krem = new Krem("Vichy", 179, true, 20);
List<Produkt> koszyk = Arrays.asList(rower, czapka, kosmetyk, krem);
wyświetlProduktyZKoszyka(koszyk);
zliczSumęProduktów(koszyk);
zapiszProduktydoPliku(koszyk);
odczytajPlik();
skopiujPlik();
}
private static void skopiujPlik() {
File toRead = new File("C:/Users/student/produkty");
File toWrite = new File("C:/Users/student/kopiaprodukty");
try (FileInputStream fis = new FileInputStream(toRead); FileOutputStream fos = new FileOutputStream(toWrite);) {
toWrite.createNewFile();
byte[] data = new byte[fis.available()];
fis.read(data);
fos.write(data);
} catch (FileNotFoundException ex) {
ex.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
private static void odczytajPlik() {
File produkty = new File("C:/Users/student/produkty");
try (BufferedReader reader = new BufferedReader(new FileReader(produkty))) {
String line;
while ((line = reader.readLine()) != null) {
System.out.println(line);
}
} catch (IOException e) {
e.printStackTrace();
}
}
private static void zliczSumęProduktów(List<Produkt> koszyk) {
int sum = koszyk.stream().map(Produkt::getCena).collect(Collectors.summingInt(Integer::intValue));
System.out.println(sum);
}
private static void wyświetlProduktyZKoszyka(List<Produkt> koszyk) {
for (Produkt prod : koszyk) {
System.out.println(prod.toString());
}
}
private static void zapiszProduktydoPliku(List<Produkt> koszyk) {
if (!koszyk.isEmpty()) {
try {
File produkty = createFile();
PrintWriter out = new PrintWriter(new FileWriter(produkty));
for (Produkt produkt : koszyk) {
out.println(produkt.toString().toCharArray());
}
out.flush();
out.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
private static File createFile() throws IOException {
File produkty = new File("C:/Users/student/produkty");
Files.deleteIfExists(produkty.toPath());
produkty.createNewFile();
return produkty;
}
}
<file_sep>/src/it/jsystems/serializacja/Zwierze.java
package it.jsystems.serializacja;
public class Zwierze {
public Zwierze() {
super();
}
}
<file_sep>/src/it/jsystems/defaults/MyInterface2.java
package it.jsystems.defaults;
public interface MyInterface2 {
default void sayHello(String imie) {
System.out.println("cześć" + imie + "1");
}
void myMethod();
}
<file_sep>/src/it/jsystems/klonowanie/Test.java
package it.jsystems.klonowanie;
public class Test {
public static void main(String[] args) throws CloneNotSupportedException {
Punkt p1 = new Punkt(10, 10);
Punkt p2 = (Punkt) p1.clone();
Punkt p3 = new Punkt(10, 10);
Punkt p4 = p3;
boolean porównanie1 = p1 == p2;
boolean porównanie2 = p1.equals(p2);
boolean porównanie3 = p3 == p4;
boolean porównanie4 = p3.equals(p4);
System.out.println("Porównanie 1: " + porównanie1);
System.out.println("Porównanie 2: " + porównanie2);
System.out.println("Porównanie 3: " + porównanie3);
System.out.println("Porównanie 4: " + porównanie4);
wyświetl(p1, p2, p3);
p2.setX(11);
p3.setX(12);
wyświetl(p1, p2, p3);
Odcinek o1 = new Odcinek(p1, p2);
Odcinek o2 = (Odcinek) o1.clone();
Odcinek o3 = o1;
boolean porównanieO1 = o1 == o2;
System.out.println(porównanieO1);
boolean porównanieO2 = o1 == o3;
System.out.println(porównanieO2);
wyświetl(o1, o2, o3);
}
private static <T> void wyświetl(@SuppressWarnings("unchecked") T... points) {
for (T t: points) {
System.out.println(t.toString());
}
}
}
<file_sep>/src/it/systems/kuchenka/Grillowanie.java
package it.systems.kuchenka;
public class Grillowanie implements Kuchenka {
public Grillowanie(int minuty) {
super();
this.minuty = minuty;
}
int minuty;
@Override
public void ustawCzas(int minuty) {
System.out.println("czas grillowania ustawiony: " + minuty);
}
@Override
public void ustawTemperaturę(int temperatura) {
System.out.println("temperatura grillowania ustawiona: " + temperatura);
}
@Override
public void uruchom() {
System.out.println("włączam grill");
}
@Override
public void wyłącz() {
System.out.println("grillowanie zakończone");
}
}
<file_sep>/README.md
"# szkolenieWawa"
<file_sep>/src/it/jsystems/statics/Car.java
package it.jsystems.statics;
import java.util.Arrays;
import java.util.List;
public class Car {
private String engine;
private String name;
public static int numberOfCars;
public static final List<String> SILNIKI;
static {
SILNIKI = Arrays.asList("V8", "V16");
}
public Car(String engine, String name) {
super();
this.engine = engine;
this.name = name;
numberOfCars++;
}
public String getEngine() {
return engine;
}
public void setEngine(String engine) {
this.engine = engine;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public static int getNumberOfCars() {
return numberOfCars;
}
public static void setNumberOfCars(int numberOfCars) {
Car.numberOfCars = numberOfCars;
}
public static void main(String[] args) {
System.out.println(StringUtils.isPalindrome("KAjaK"));
Car c1 = new Car("V8", "Jaguar");
Car c2 = new Car("V16", "Audi");
System.out.println(Car.numberOfCars);
String s = StringUtils.trimToSize(c1.getName(), 3);
System.out.println(s);
}
public static boolean isEmpty(String s) {
return "".equals(s);
}
}
<file_sep>/src/it/jsystems/dziedziczenie/polimorf/Krem.java
package it.jsystems.dziedziczenie.polimorf;
public class Krem extends Kosmetyk {
private int spf;
private static final String kategoria = "krem";
public Krem(String nazwa, int cena, boolean bio, int spf) {
super(nazwa, cena, kategoria, bio);
this.spf = spf;
}
public int getSpf() {
return spf;
}
public void setSpf(int spf) {
this.spf = spf;
}
@Override
public String toString() {
return super.toString() + " a wartość jego filtra SPF to: " + this.getSpf();
}
}
<file_sep>/src/it/jsystems/kolekcje/Listy.java
package it.jsystems.kolekcje;
public class Listy {
/* Collections -> lists, sets, queues
*
* List -> insertion order preserved, non-unique elements, size does not matter
*
* ArrayList - array-backed, initial size 10, resizes itself
*
* LinkedList - a two-way (deque-like) list - can be traversed from the front and the back
*
*/
}
<file_sep>/src/it/jsystems/szkolenie/MyClass.java
package it.jsystems.szkolenie;
import it.jsystems.defaults.MyInterface2;
public class MyClass implements MyInterface, MyInterface2 {
public static void main(String[] args) {
MyClass myClass = new MyClass();
myClass.doSomething();
System.out.println(MyInterface.myConstant);
MyInterface.doNothing();
}
@Override
public void doSomething() {
System.out.println("Implementacja metody");
}
@Override
public void myMethod() {
// TODO Auto-generated method stub
}
}
<file_sep>/src/it/jsystems/abstracts/Human.java
package it.jsystems.abstracts;
public class Human extends Mammal {
public Human(String name, int age, boolean isPredator, String race) {
super(name, age, isPredator);
this.race = race;
}
public String race;
public String getRace() {
return race;
}
public void setRace(String race) {
this.race = race;
}
}
<file_sep>/src/it/jsystems/dziedziczenie/polimorf/Kosmetyk.java
package it.jsystems.dziedziczenie.polimorf;
public class Kosmetyk extends Produkt {
private String kategoria;
private boolean bio;
public Kosmetyk(String nazwa, int cena, String kategoria, boolean bio) {
super(nazwa, cena);
this.kategoria = kategoria;
this.bio = bio;
}
public String getKategoria() {
return kategoria;
}
public void setKategoria(String kategoria) {
this.kategoria = kategoria;
}
public boolean isBio() {
return bio;
}
public void setBio(boolean bio) {
this.bio = bio;
}
@Override
public String toString() {
return "kategoria: " + this.kategoria.toString() + " " + super.toString() ;
}
}
| d1c9f5c6dfe8028db9dfa2e0e83234b7d405444f | [
"Markdown",
"Java"
]
| 22 | Java | ewsol/szkolenieWawa | 324fb02b2e30761964113aca2f8b305442cc410e | 2c0f981f7009cfd9502d84c8cbcad5c108a0bd2b |
refs/heads/master | <file_sep>//
// Created by john on 7/31/18.
//
#include "sherlock-tiles.h"
#include "util.h"
#include <cmath>
#include <string>
using namespace std;
int sherlockTilesDriver() {
int s1, s2, l;
l = 1000000;
s1 = 1000004;
s2 = 1000003;
vector<long int> qs(1, 258385599125);
vector<double> ts = movingTiles(l, s1, s2, qs);
for (int i = 0; i < ts.size(); i++)
COUT((to_string(ts[i]) + '\n').data());
/*
* 07-31 18:32:18.997 2749-2779/? D/PurchaseBooks: 695345.564581 // correct answer
*/
return 0;
}
vector<double> movingTiles(int l, int s1, int s2, vector<long int> queries) {
vector<double> ans(queries.size());
double coeff = sqrt(2) / (s1 > s2 ? s1 - s2 : s2 - s1);
for (int i = 0; i < queries.size(); i++)
ans[i] = coeff * (l - sqrt((double)queries[i]));
return ans;
}
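For reference, the closed form in movingTiles can be derived as follows. The two squares of side l slide apart along their common diagonal, so their overlap is itself a square; the sketch below matches the `coeff * (l - sqrt(q))` expression above:

```latex
% The overlap side shrinks at the diagonal separation speed projected onto a side:
d(t) = l - \frac{|s_1 - s_2|}{\sqrt{2}}\, t
% Setting the overlap area equal to the query q and solving for t:
q = d(t)^2 \quad\Longrightarrow\quad
t = \frac{\sqrt{2}}{|s_1 - s_2|}\,\bigl(l - \sqrt{q}\bigr)
```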
<file_sep>//
// Created by john on 8/1/18.
//
#include "extraLongFactorial.h"
#include "util.h"
using namespace std;
int extraLongFactorialsDriver() {
extraLongFactorials(1000);
return 0;
}
// Complete the extraLongFactorials function below.
void extraLongFactorials(int n) {
// bigIntegerDriver();
// Solution
BigInteger bigInteger(1);
while (n > 1)
bigInteger.multiply(static_cast<u_int64_t>(n--));
COUT(bigInteger.to_string().data());
}
/*
* Definitions for class BigInteger
*
*
*
*
*
*
*
*
*
*
*/
BigInteger::BigInteger(){
for(int i=0; i< CAPACITY; i++){
digits[i]=0;
}
}
BigInteger::~BigInteger() {
// delete []digits; // digits is not a pointer, so we do not need to delete it
}
BigInteger::BigInteger(u_int64_t init_num){
int i = 0;
while(init_num > 0){
digits[i++] = static_cast<int>(init_num % BASE);
init_num = init_num/BASE;
}
for(;i< CAPACITY;i++){
digits[i]=0;
}
}
void BigInteger::accumulate(BigInteger &rVal) {
for (int i = 0, carry = 0, sum; i < CAPACITY; i++) {
sum = digits[i] + rVal.digits[i] + carry;
if (sum >= BASE) { // a digit sum equal to BASE must also carry
digits[i] = sum - BASE;
carry = 1;
} else {
digits[i] = sum;
carry = 0;
}
}
}
void BigInteger::multiply(u_int64_t rVal) {
BigInteger finalProduct(0);
BigInteger *singleDigitProduct;
int digit, carry, sum, powerOfBase;
powerOfBase = 0;
while (rVal > 0) {
digit = static_cast<int>(rVal % BASE);
rVal /= BASE;
carry = 0;
singleDigitProduct = new BigInteger(0);
// Perform a single digit multiplication
for (int i = 0; powerOfBase + i < CAPACITY; i++) { // guard against writing past the digit array
sum = digit * digits[i] + carry;
singleDigitProduct->digits[powerOfBase + i] = sum % BASE;
carry = sum / BASE;
}
// Accumulate the single digit product with the sum of the previous single digit products
finalProduct.accumulate(*singleDigitProduct);
delete singleDigitProduct;
powerOfBase++;
}
copyDigits(finalProduct);
}
string BigInteger::to_string() {
int n = CAPACITY - 1;
while (digits[n] == 0)
n--;
n++;
char output[n + 1];
for (int i = 0; i < n; i++)
output[n - i - 1] = static_cast<char>('0' + digits[i]);
output[n] = '\0';
string s = output;
return s;
}
void BigInteger::copyDigits(BigInteger &source) {
for (int i = 0; i < CAPACITY; i++)
digits[i] = source.digits[i];
}
void bigIntegerDriver() {
BigInteger bigInteger(0);
BigInteger *bi;
for (int i = 0; i < 3; i++) {
bi = new BigInteger(static_cast<u_int64_t>(55 + i));
bigInteger.accumulate(*bi);
delete bi;
}
COUT((bigInteger.to_string() + '\n').data());
}<file_sep>//
// Created by john on 7/3/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_UTIL_H
#define NATIVECPLUSPLUSWRAPPER_UTIL_H
// https://stackoverflow.com/questions/4630026/adapter-from-c-output-stream-to-an-android-textview-or-equivalent
#include <android/log.h>
#include <vector>
#include <string>
extern const char * STANDARD_OUTPUT_TAG;
//#define COUT(x) { cout << (x) << endl; } // For environment with stdout
#define COUT(x) {__android_log_write(ANDROID_LOG_DEBUG, STANDARD_OUTPUT_TAG, (x)); }
template <class T>
inline void output_std(std::vector<T> &items) {
std::string output = "";
for (T x : items)
output += std::to_string(x) + " ";
COUT(output.data());
}
// Non-template overload for string vectors
inline void output_std(std::vector<std::string> &items) {
std::string output = "";
for (int i = 0; i < items.size(); i++)
output += items[i] + "\n";
COUT(output.data());
}
template <class T>
inline void __quickSort_asc(std::vector<T> &arr, int l, int r) {
if (l >= r)
return;
T swap, pivot = arr[(l + r) >> 1];
int i = l;
int j = r;
while (i <= j) {
while (arr[i] < pivot)
i++;
while (arr[j] > pivot)
j--;
if (i <= j) {
swap = arr[j];
arr[j] = arr[i];
arr[i] = swap;
i++;
j--;
}
}
// i is at the position of pivot, j is invalidated, since it started >= i, and now j < i
__quickSort_asc(arr, l, i - 1);
__quickSort_asc(arr, i, r);
}
// Sorts in ascending order on the < operator
template <class T>
inline void my_quickSort(std::vector<T> &arr) {
__quickSort_asc(arr, 0, arr.size() - 1);
}
#endif //NATIVECPLUSPLUSWRAPPER_UTIL_H
<file_sep>//
// Created by john on 8/1/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_EXTRALONGFACTORIAL_H
#define NATIVECPLUSPLUSWRAPPER_EXTRALONGFACTORIAL_H
#include <sys/types.h>
#include <string>
inline const int CAPACITY = 2600;
inline const int BASE = 10;
class BigInteger {
public:
BigInteger();
~BigInteger();
BigInteger(u_int64_t init_num);
void multiply(u_int64_t rVal);
void accumulate(BigInteger &rVal);
std::string to_string();
private:
int digits[CAPACITY];
void copyDigits(BigInteger &source);
};
void extraLongFactorials(int n);
int extraLongFactorialsDriver();
#endif //NATIVECPLUSPLUSWRAPPER_EXTRALONGFACTORIAL_H
<file_sep>//
// Created by john on 7/31/18.
//
#include "leonardoPrimeFactors.h"
#include "util.h"
#include <vector>
#include <string>
using namespace std;
int prime_driver() {
int count = primeCount(1000000000000000000);
COUT(("number of primes " + to_string(count)).data());
return 0;
}
// Helper method. Saves the work from finding the next prime number
int nextPrime(bool fromBeginning) {
static int currentIndex;
static vector<int> primes(0);
if (primes.size() == 0) {
primes.push_back(2);
primes.push_back(3);
// primes.push_back(5);
// primes.push_back(7);
// primes.push_back(11);
// primes.push_back(13);
// primes.push_back(17);
// primes.push_back(19);
// primes.push_back(23);
}
if (fromBeginning) {
currentIndex = 0;
} else if (currentIndex >= primes.size()) {
// Find the next prime; we want to generate the list ourselves, and we want to make sure it is inclusive
int nextPrime = primes[currentIndex - 1];
bool isPrime;
do {
// The primes need to be initialized with at least 2 and 3
nextPrime += 2;
isPrime = true;
for (int i = 0; i < primes.size(); i++) {
if (nextPrime % primes[i] == 0) {
isPrime = false;
break;
}
}
} while (!isPrime);
primes.push_back(nextPrime);
}
return primes[currentIndex++];
}
/*
* Complete the primeCount function below.
*/
int primeCount(long n) {
unsigned long long primeFactors = nextPrime(true);
int count = 0;
while (n > primeFactors) {
primeFactors *= nextPrime(false);
count++;
}
// We guard against needlessly computing the next prime
return n == primeFactors ? ++count : count;
}<file_sep>//
// Created by john on 7/31/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_SHERLOCK_TILES_H
#define NATIVECPLUSPLUSWRAPPER_SHERLOCK_TILES_H
#include <vector>
int sherlockTilesDriver();
std::vector<double> movingTiles(int l, int s1, int s2, std::vector<long int> queries);
#endif //NATIVECPLUSPLUSWRAPPER_SHERLOCK_TILES_H
<file_sep>//
// Created by john on 7/29/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_BOOKBUNDLES_H
#define NATIVECPLUSPLUSWRAPPER_BOOKBUNDLES_H
#include <string>
int bookBundles_driver();
int numberOfBooks(int budget, int bundleSize[], int bundleCost[], int bundleCount);
#endif //NATIVECPLUSPLUSWRAPPER_BOOKBUNDLES_H
<file_sep>//
// Created by john on 7/3/18.
//
#include <iostream>
#include "util.h"
#include "bestCoins.h"
#include <string>
#include <limits>
// This code is based on an example from Starting Out With C++ 4th by <NAME> (pg 1194)
using namespace std;
// Constants
const int MAX_COINS_CHANGE = 100;
const int MAX_COIN_VALUES = 6;
const int NO_SOLUTION = numeric_limits<int>::max();
// coinValues - global array of coin values to choose from
int coinValues[MAX_COIN_VALUES] = {100, 50, 25, 10, 5, 1};
// bestCoins - global array of best coins to make change with
int bestCoins[MAX_COINS_CHANGE];
// Global variables
int numBestCoins = NO_SOLUTION, // Number of coins in bestCoins
numSolutions = 0, // Number of ways to make change
numCoins; // Number of allowable coins
int makeChange_driver() {
int coinsUsed[MAX_COINS_CHANGE], // List of coins used
numCoinsUsed = 0, // The number of coins used
amount; // The amount to make change for
// Use for string building
string valueString;
/*
* Set the inputs here
*/
amount = 67;
numCoins = 88;
// Display the possible coin values.
COUT("Here are the valid coin values, in cents: ");
for (int i = 0; i < MAX_COIN_VALUES; ++i) {
valueString += to_string(coinValues[i]) + ' ';
}
COUT(valueString.data());
// Get the input from the user
COUT("Enter the amount of cents (as an integer) to make change for: ");
COUT(to_string(amount).data());
COUT("What is the maximum number of coins to give as change? ");
COUT(to_string(numCoins).data());
// Call the recursive function
makeChange(numCoins, amount, coinsUsed, numCoinsUsed);
// Display the results
valueString.clear();
valueString += "Number of possible combinations: " + to_string(numSolutions);
COUT(valueString.data());
valueString.clear();
valueString += "Best combination of coins: \n";
if (numBestCoins == NO_SOLUTION) {
valueString += "\nNo solution\n";
} else {
for (int i = 0; i < numBestCoins; i++) {
valueString += to_string(bestCoins[i]) + ' ';
}
}
COUT((valueString += "\n").data());
return 0;
}
//**************************************************************************************************
// coinsLeft - The number of coins left to choose from. *
// amount - The amount to make change for. *
// coinsUsed - An array that contains the coin values used so far. *
// numCoinsUsed - The number of values in the coinsUsed array. *
// *
// This recursive function finds all the possible ways to make change for the value in amount. *
// The best combination of coins is stored in the array bestCoins. *
//**************************************************************************************************
void makeChange(int coinsLeft, int amount, int coinsUsed[], int numCoinsUsed) {
int coinPos, // To calculate array position of the coin being used
count; // Loop counter
if (coinsLeft == 0) // No more coins left
return;
else if (amount < 0) // Amount to make change is negative
return;
else if (amount == 0) { // Solution is found
// Store as bestCoins if best
if (numCoinsUsed < numBestCoins) {
for (count = 0; count < numCoinsUsed; count++) {
bestCoins[count] = coinsUsed[count];
}
numBestCoins = numCoinsUsed;
}
numSolutions++;
return;
}
// Find the other combinations using the coin
coinPos = numCoins - coinsLeft;
// Bug fix relative to the code as printed in the book
if (coinPos == MAX_COIN_VALUES)
return;
coinsUsed[numCoinsUsed] = coinValues[coinPos];
numCoinsUsed++;
makeChange(coinsLeft, amount - coinValues[coinPos], coinsUsed, numCoinsUsed);
// Find the other combinations not using the coin
numCoinsUsed--;
makeChange(coinsLeft - 1, amount, coinsUsed, numCoinsUsed);
}
<file_sep>//
// Created by john on 7/29/18.
// Problem from HackerRank
//
#include "bookBundles.h"
#include "util.h"
using namespace std;
int bookBundles_driver() {
const int bundleCount = 7;
int bundleSize[] = {44, 47, 66, 32, 22, 13, 68};
int bundleCost[] = {11, 12, 26, 14, 6, 3, 22};
int budget = 100;
int num = numberOfBooks(budget, bundleSize, bundleCost, bundleCount);
COUT(to_string(num).data());
return 0;
}
int numberOfBooks(int budget, int *bundleSize, int *bundleCost, int bundleCount) {
// Define bestAttainableValue as the most number of books per bundle cost such that cost <= budget
double value, bestAttainableValue;
int indexBestAttainableValue, bundle_Cost;
bestAttainableValue = 0;
indexBestAttainableValue = -1;
string budgetPrint = "Remaining Budget = $" + to_string(budget);
COUT(budgetPrint.data());
for (int i = 0; i < bundleCount; i++) {
if (bundleCost[i] <= budget) {
value = bundleSize[i] / (double) bundleCost[i];
if (value > bestAttainableValue) {
bestAttainableValue = value;
indexBestAttainableValue = i;
}
COUT(to_string(value).data())
}
}
if (indexBestAttainableValue > -1) {
bundle_Cost = bundleCost[indexBestAttainableValue];
return budget / bundle_Cost * bundleSize[indexBestAttainableValue] +
numberOfBooks(budget - (budget / bundle_Cost) * bundle_Cost, bundleSize,
bundleCost, bundleCount);
}
else
return 0;
}
<file_sep>/*
// Created by john on 8/1/18.
*/
#include "hacker-rank.h"
#include "util.h"
#include <string>
#include <vector>
#include <iostream>
#include <limits>
#include <cmath>
#include <iomanip>
using namespace std;
namespace sample_hacker_one {
// Complete the findNumber function below.
string findNumber(vector<int> arr, int k) {
for (int i = 0; i < arr.size(); i++) {
if (arr[i] == k)
return "YES";
}
return "NO";
}
}
namespace sample_hacker_two {
// Complete the oddNumbers function below.
vector<int> oddNumbers(int l, int r) {
vector<int> odds(0);
for (int i = l; i <= r; i++) {
if (i % 2 == 1)
odds.push_back(i);
}
return odds;
}
}
namespace hacker_one { // Change problem - should spend some time studying similar problems
// Complete the getUmbrellas function below.
int getUmbrellas(int n, vector<int> p) {
// TODO: unimplemented; return -1 (no solution) so the function has defined behavior
return -1;
}
}
namespace hacker_two { // Greedy algorithm
struct myInt {
int val;
int count;
};
// Complete the maxPoints function below.
long maxPoints(vector<int> elements) {
vector<myInt> uniqueElements(0);
vector<myInt>::iterator it, itSmart, itRemoveStart;
long countChosenElements = 0;
long smartPoints, points;
int element, count;
myInt *dummy;
for (int i = 0; i < elements.size(); i++) {
if (elements[i] == -1)
continue;
element = elements[i];
count = 1;
for (int j = i + 1; j < elements.size(); j++) {
if (elements[j] == element) {
count++;
elements[j] = -1; // Mark the element as consumed
}
}
dummy = new myInt {element, count};
for (it = uniqueElements.begin(); it < uniqueElements.end(); it++) {
if (element < it->val) {
uniqueElements.insert(it, *dummy);
delete dummy; // fix: release the heap copy once its value has been inserted
dummy = NULL;
break;
}
}
if (dummy != NULL) {
uniqueElements.push_back(*dummy);
delete dummy; // fix: release the heap copy after the value is copied in
}
}
// Accumulate all elements that are distanced far enough from their neighbors to cause no loss of potential points
it = uniqueElements.begin();
while (it < uniqueElements.end()) {
if ((it == uniqueElements.begin() || prev(it)->val + 1 < it->val) && // fix: prev(begin()) is undefined
(next(it) == uniqueElements.end() || next(it)->val > it->val + 1)) {
countChosenElements += it->count * it->val;
uniqueElements.erase(it);
} else
it++;
}
// Accumulate the rest of the points that are not as obvious
while (!uniqueElements.empty()) {
it = uniqueElements.begin();
// Initialize smart points
points = it->count * it->val;
points -= (next(it) == uniqueElements.end() || next(it)->val > it->val + 1) ? 0 : next(it)->count * next(it)->val;
smartPoints = points;
itSmart = it;
it++;
while (it < uniqueElements.end()) {
points = it->count * it->val;
points -= (prev(it)->val + 1 < it->val) ? 0 : prev(it)->count * prev(it)->val;
points -= (next(it) == uniqueElements.end() || next(it)->val > it->val + 1) ? 0 : next(it)->count * next(it)->val;
if (points > smartPoints) {
smartPoints = points;
itSmart = it;
}
it++;
}
// Accumulate the smart choice, for most points according to the rules
// COUT(to_string(itSmart->val).data()); // Debug
countChosenElements += itSmart->count * itSmart->val;
// Remove the chosen element and the adjacent elements, if any
itRemoveStart = itSmart;
if (itSmart != uniqueElements.begin() && prev(itSmart)->val + 1 == itSmart->val) // fix: prev(begin()) is undefined
itRemoveStart = prev(itSmart);
if (next(itSmart) != uniqueElements.end() && next(itSmart)->val == itSmart->val + 1)
itSmart = next(itSmart);
uniqueElements.erase(itRemoveStart, itSmart + 1);
}
return countChosenElements;
}
void test() {
const int NUMBER_OF_POINTS = 12;
vector<int> numbers(NUMBER_OF_POINTS);
int arrPoints[] = {
// 4, 4, 3,
// 3, 3, 1,
// 2, 2, 5,
// 6, 7, 5
2, 2, 2,
4, 6, 8,
4, 8, 12,
22, 18, 16
};
for (int i = 0; i < NUMBER_OF_POINTS; i++) {
numbers[i] = arrPoints[i];
}
COUT(("Number of points scored = " + to_string(maxPoints(numbers))).data())
/**
* Number of points scored = 27
*/
/**
* Number of points scored = 104
*/
}
}
namespace hacker_three { // Array Problem, sorting optional
bool binContains(int l, int r, vector<int> &consumed, int &value) {
if (l > r)
return false;
int m = (l + r) >> 1;
if (consumed[m] == value)
return true;
if (value < consumed[m])
return binContains(l, --m, consumed, value);
return binContains(++m, r, consumed, value);
}
void consume(vector<int> &consumed, int &value) {
for (vector<int>::iterator it = consumed.begin(); it < consumed.end(); it++) {
if (value < *it) {
consumed.insert(it, value);
return;
}
}
consumed.push_back(value);
}
// Complete the numberOfPairs function below.
int numberOfPairs(vector<int> a, long k) {
// Keep the pairs distinct, by searching for x in (x, y) | x <= y
const long midValueGuard = (k >> 1) + 1;
// A sorted container
vector<int> consumed(0);
int complement, num_distinct_pairs = 0;
for (vector<int>::iterator it = a.begin(); it < a.end(); it++) {
if (*it < midValueGuard && !binContains(0, (int)consumed.size() - 1, consumed, *it)) {
// Form the complementary element that makes the pair sum to k
complement = static_cast<int>(k) - *it;
// Search for the term
for (vector<int>::iterator it2 = a.begin(); it2 < a.end(); it2++) {
if (it2 != it && *it2 == complement) {
num_distinct_pairs++;
// Consume the first term in the pair
consume(consumed, *it);
break;
}
}
}
}
return num_distinct_pairs;
}
void test() {
const int k = 12;
vector<int> a(0);
const int vals[] = {
1, 2, 3, 4,
5, 6, 6, 9,
10, 11
};
const int size = 10;
const char terms[] = "Elements: \n";
COUT(terms);
string listTerms = "";
for (int i = 0; i < size; i++) {
a.push_back(vals[i]);
listTerms += to_string(vals[i]) + ' ';
}
listTerms += '\n';
COUT(listTerms.data());
listTerms = "Number of pairs that sum to " + to_string(k) + "...";
COUT(listTerms.data());
COUT(to_string(numberOfPairs(a, k)).data());
/**
* Elements:
1 2 3 4 5 6 6 9 10 11
Number of pairs that sum to 12...
4
*/
}
}
namespace hacker_four { // Stack problem
bool matchingBrace(char left, char right) {
if (left == '(' && right == ')')
return true;
if (left == '[' && right == ']')
return true;
if (left == '{' && right == '}')
return true;
return false;
}
bool balanced(string s) {
vector<char> leftBraces(0);
for (int i = 0; i < s.size(); i++) {
if ( (s[i] == '(') || (s[i] == '{') || (s[i] == '[') ) {
leftBraces.push_back(s[i]);
}
else if ( (s[i] == ')') || (s[i] == '}') || (s[i] == ']') ) {
// A right brace must match the most recent unmatched left brace
if (leftBraces.empty() || !matchingBrace(leftBraces.back(), s[i]))
return false;
leftBraces.pop_back();
}
}
// End of string; there should be no more left braces
return leftBraces.empty();
}
// Complete the braces function below.
vector<string> braces(vector<string> values) {
vector<string> _balanced(values.size());
for (int i = 0; i < values.size(); i++)
_balanced[i] = balanced(values[i]) ? "YES" : "NO";
return _balanced;
}
void test() {
vector<string> input = {
"[{()}]",
"{}[]()",
"{[}]}"
};
COUT("Matching Braces");
output_std(input);
vector<string> result = braces(input);
output_std(result);
}
}
namespace grading_students {
/*
* Complete the gradingStudents function below.
*/
vector<int> gradingStudents(vector<int> grades) {
vector<int> rounded(grades.size());
// Perform rounding
for (int i = 0; i < grades.size(); i++)
rounded[i] = grades[i] < 38 || grades[i] % 5 < 3 ? grades[i] :
grades[i] + 5 - (grades[i] % 5);
return rounded;
}
void test() {
vector<int> grades = {22, 33, 56, 45, 38, 37, 64, 52, 85, 76, 77, 88, 90, 94, 100};
COUT("Grading Students");
output_std(grades);
vector<int> result = gradingStudents(grades);
output_std(result);
}
}
namespace min_absolute_distance_array {
// Complete the minimumAbsoluteDifference function below.
// I got time-out errors in the test environment
int minimumAbsoluteDifference(vector<int> arr) {
int mindiff, absoluteMinDiff = numeric_limits<int>::max();
for (int i = 0; i < arr.size(); ++i) {
for (int j = i + 1; j < arr.size(); ++j) {
if ((mindiff = abs(arr[i] - arr[j])) < absoluteMinDiff) {
if (mindiff == 0)
return mindiff;
else
absoluteMinDiff = mindiff;
// Debug
// COUT(("("+to_string(arr[i])+","+to_string(arr[j])+")").data());
}
}
}
return absoluteMinDiff;
}
int minimumAbsoluteDifference_1_1(vector<int> arr) {
// Reduce the number of comparisons by first performing a sort
// We need only compare adjacent elements
// sort(arr.begin(), arr.end());
my_quickSort(arr);
int mindiff, absoluteMinDiff = numeric_limits<int>::max();
const int comparisons = arr.size() - 1;
int i = -1;
while (++i < comparisons) {
if ((mindiff = arr[i + 1] - arr[i]) < absoluteMinDiff) {
if (mindiff == 0)
return mindiff;
else
absoluteMinDiff = mindiff;
// Debug
// COUT(("("+to_string(arr[i])+","+to_string(arr[j])+")").data());
}
}
return absoluteMinDiff;
}
void test() {
vector<int> nums = {
// 22, 33, 56, 45, 38, 37, 64, 52, 85, 76, 77, 88, 90, 94, 100
-59, -36, -13, 1, -53, -92, -2, -96, -54, 75
};
COUT("Minimum Absolute Distance within Array");
output_std(nums);
COUT(to_string(minimumAbsoluteDifference_1_1(nums)).data());
}
}
namespace grid_challenge {
// Complete the gridChallenge function below.
string gridChallenge(vector<string> &grid) {
sort(grid[0].begin(), grid[0].end());
for (int i = 1; i < grid.size(); i++) {
// Sort the rows
sort(grid[i].begin(), grid[i].end());
for (int j = 0; j < grid[i].size(); j++) { // columns span the row length, not the row count
// Columns must be monotonically increasing
if (grid[i -1][j] > grid[i][j])
return "NO";
}
}
return "YES";
}
void test() {
vector<string> grid = {
"abdefg",
"begfad",
"bagfed",
"dbagfe",
"fbdaeg",
"beagdf"
};
COUT("Grid Challenge");
output_std(grid);
string result = gridChallenge(grid);
COUT("Sorting the grid by rows...");
output_std(grid);
COUT("Columns are also sorted...");
COUT(result.data());
}
}
namespace marcsCakewalk {
// Complete the marcsCakewalk function below.
long marcsCakewalk(vector<int> calorie) {
struct cmp {
bool operator()(const int &a, const int &b) {
return a > b;
}
};
// We need calories in descending order to solve the problem
// sort(calorie.begin(), calorie.end(), greater<int>()); // or equivalently
sort(calorie.begin(), calorie.end(), cmp {});
long miles = 0;
long power = 1;
for (int c : calorie) {
miles += power * c;
power = power << 1;
}
return miles;
}
void test() {
vector<int> calories = {
1,
2,
3,
4,
5,
6,
7,
14,
22,
36,
44,
88
};
output_std(calories);
COUT("Minimum number of miles for Marc to walk...");
COUT(to_string(marcsCakewalk(calories)).data());
}
}
namespace luck_balance {
// Complete the luckBalance function below.
int luckBalance(int k, vector<vector<int>> contests) {
struct cmp {
bool operator()(const vector<int> &a, const vector<int> &b) {
// First sort key = importance: Ti
// Second sort key = luck value: Li
return a[1] < b[1] || (a[1] == b[1] && a[0] < b[0]);
}
};
sort(contests.begin(), contests.end(), cmp {});
int luck = 0;
for (vector<int> dummy: contests) {
if (dummy[1])
break; // Reached the first element of the sorted as 'important' part of contests
luck += dummy[0];
}
int dummy = contests.size();
while (dummy-- > 0 && contests[dummy][1]) {
if (k > 0) {
luck += contests[dummy][0];
k--;
} else {
luck -= contests[dummy][0];
}
}
return luck;
}
void test() {
vector<vector<int>> contests = {
{4, 1},
{2, 1},
{5, 0},
{22, 1}
// {5, 1},
// {2, 1},
// {1, 1},
// {8, 1},
// {10, 0},
// {5, 0}
};
const int k = 2;
COUT("Luck Balance");
for (vector<int> x : contests)
output_std(x);
COUT(("Important losses allowed = "+to_string(k)).data());
COUT(("Total luck = "+to_string(luckBalance(k, contests))).data());
/*
* Luck Balance
4 1
2 1
5 0
22 1
Important losses allowed = 2
Total luck = 29
Complete
*/
}
}
namespace maximumPerimeterTriangle {
// Complete the maximumPerimeterTriangle function below.
vector<int> maximumPerimeterTriangle(vector<int> sticks) {
sort(sticks.begin(), sticks.end());
for (int i = sticks.size() - 1; i > 1; i--) {
if (sticks[i] < sticks[i - 1] + sticks[i - 2])
return {sticks[i - 2], sticks[i - 1], sticks[i]};
}
return {-1}; // No triangles found
}
void test() {
vector<int> sticks = {
// 1, 2, 3
1, 1, 1, 2, 3, 5
};
COUT("Maximum Perimeter Triangle");
output_std(sticks);
COUT("Answer");
vector<int> result = maximumPerimeterTriangle(sticks);
output_std(result);
/*
* Maximum Perimeter Triangle
1 2 3
Answer
-1
Complete
*/
/*
* Maximum Perimeter Triangle
1 1 1 2 3 5
Answer
1 1 1
Complete
*/
}
}
namespace largestPermutation {
// Complete the largestPermutation function below.
vector<int> largestPermutation(int k, vector<int> arr) {
int mostSignificantSwapPosition = 0;
int searchValue;
while (k-- > 0 && mostSignificantSwapPosition < arr.size()) {
searchValue = arr.size() - mostSignificantSwapPosition;
for (int i = mostSignificantSwapPosition; i < arr.size(); i++) {
if (arr[i] == searchValue) {
if (i == mostSignificantSwapPosition) { // No swap necessary
mostSignificantSwapPosition++;
searchValue = arr.size() - mostSignificantSwapPosition;
} else {
arr[i] = arr[mostSignificantSwapPosition];
arr[mostSignificantSwapPosition++] = searchValue;
break; // Consume a swap; fallback to the outermost loop
}
}
}
}
return arr;
}
void test() {
vector<int> arr = {
// 5, 4, 3, 2, 1
1, 2, 3, 4, 5
};
COUT("Largest Permutation");
COUT("Starting Array");
output_std(arr);
const int swaps = 1;
COUT(("Most \'backwards\' array using "+to_string(swaps)+" swap(s)...").data());
vector<int> result = largestPermutation(swaps, arr);
output_std(result);
/*
* Largest Permutation
Starting Array
5 4 3 2 1
Most 'backwards' array using 1 swap(s)...
5 4 3 2 1
Complete
*/
/*
*Largest Permutation
Starting Array
1 2 3 4 5
Most 'backwards' array using 1 swap(s)...
5 2 3 4 1
Complete
*/
}
}
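The rescan above is O(n·k) in the worst case; with a value → index map each swap becomes O(1). A sketch (hypothetical `largestPerm`, assuming arr is a permutation of 1..n):

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Assumes arr is a permutation of 1..n. With a value -> index map,
// each of the k swaps is O(1): put the largest value not yet placed
// at the front-most unfixed position.
std::vector<int> largestPerm(int k, std::vector<int> arr) {
    int n = (int)arr.size();
    std::vector<int> pos(n + 1);
    for (int i = 0; i < n; ++i)
        pos[arr[i]] = i;
    for (int i = 0; i < n && k > 0; ++i) {
        int want = n - i; // the value that belongs at position i
        if (arr[i] == want)
            continue; // already in place, no swap consumed
        int j = pos[want];
        pos[arr[i]] = j;
        pos[want] = i;
        std::swap(arr[i], arr[j]);
        --k;
    }
    return arr;
}
```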
namespace greedy_florist {
// Complete the getMinimumCost function below.
// precondition: k > 0
int getMinimumCost(int k, vector<int> c) {
sort(c.begin(), c.end());
// kfold is the number of flowers each friend has already bought; in round kfold each purchase costs (kfold + 1) times the flower price
int kfold = 0, index;
int cost = 0;
do {
index = c.size() - k * kfold;
for (int i = 0; i < k; i++) {
if (index-- > 0)
cost += (kfold + 1) * c[index];
else
goto stop;
}
kfold++;
} while (true);
stop:
return cost;
}
void test() {
vector<int> costPerFlower = {
1, 4, 4, 7, 7, 8, 9, 9, 11
// 1, 3, 5, 7, 9
};
const int kFriends = 3;
COUT("Greedy Florist");
output_std(costPerFlower);
COUT(("Number of friends to buy all of the flowers = "+to_string(kFriends)).data());
COUT("Minimum cost");
COUT(to_string(getMinimumCost(kFriends, costPerFlower)).data());
/*
*Greedy Florist
1 4 4 7 7 8 9 9 11
Number of friends to buy all of the flowers = 3
Minimum cost
100
Complete
*/
/*
* Greedy Florist
1 3 5 7 9
Number of friends to buy all of the flowers = 3
Minimum cost
29
Complete
*/
}
}
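The goto-based loop above is equivalent to a single pass over the descending-sorted prices, where the i-th most expensive flower (0-based) carries multiplier i / k + 1. A sketch (hypothetical `minCostSketch`):

```cpp
#include <algorithm>
#include <cassert>
#include <functional>
#include <vector>

// Each friend's next purchase is multiplied by (previous purchases + 1),
// so the most expensive flowers should carry the smallest multipliers.
int minCostSketch(int k, std::vector<int> c) {
    std::sort(c.begin(), c.end(), std::greater<int>());
    int cost = 0;
    for (int i = 0; i < (int)c.size(); ++i)
        cost += (i / k + 1) * c[i]; // round i / k of buying
    return cost;
}
```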
namespace beautiful_pairs {
// Complete the beautifulPairs function below.
int beautifulPairs(vector<int> A, vector<int> B) {
vector<vector<int>> pairs(0);
vector<vector<int>>::reverse_iterator rit;
int n = A.size();
int bDummy;
for (int i = 0; i < n; ++i) {
bDummy = 0;
// Check whether the value A[i] already appears in an earlier beautiful pair.
// If it does, start the search in B at j == (that pair's r) + 1, so that a
// match B[j] == A[i] uses a j greater than any r already taken and thus
// forms a new distinct beautiful pair
for (rit = pairs.rbegin(); rit != pairs.rend(); rit++) {
if (A[i] == A[(*rit)[0]]) {
bDummy = (*rit)[1] + 1;
break;
}
}
// Search for the Bj term, such that B[j] == A[i]. Then (i, j) will be our new
// distinct beautiful pair
for (int j = bDummy; j < n; ++j) {
if (A[i] == B[j]) {
pairs.push_back({i, j});
break;
}
}
}
// Debug
for (auto pair : pairs) {
output_std(pair);
COUT("\n");
}
int distinctPairsBAsIs = pairs.size();
if (distinctPairsBAsIs == n) {
// You must replace EXACTLY one element, which will force a loss of a distinct beautiful pair
// when there are n of them
distinctPairsBAsIs--;
goto allDone;
}
////////////////////////////////////////////////////////////////////////////
// $$
// Search for an element (in A) to replace in B to maximize disjoint beautiful pairs
// $$
/////////////////////////////////////////////////////////////////////////////
for (int i = 0; i < n; i++) {
// i is a distinct l-value, not in any of our already-constructed beautiful pairs (li, ri)
for (auto x : pairs) {
if (i == x[0])
goto skipIteration;
}
// Show that A[i] is a distinct value, not indexed by any li
// (We would replace some element in B with A[i])
for (rit = pairs.rbegin(); rit != pairs.rend(); rit++) {
if (A[i] == A[(*rit)[0]])
goto skipIteration;
}
distinctPairsBAsIs++;
break;
skipIteration:
continue;
}
allDone:
return distinctPairsBAsIs;
}
void test() {
// vector<int> A = {
// 1, 2, 3, 4
// };
// vector<int> B = {
// 1, 2, 3, 3
// };
auto *A = &beautiful_pairs::testCase01_A;
auto *B = &beautiful_pairs::testCase01_B;
COUT("Beautiful Pairs");
int result = beautifulPairs(*A, *B);
COUT(("Maximum number of disjoint Beautiful Pairs with one replacement in B allowed = "+to_string(result)).data());
}
}
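The index-based construction above can be cross-checked against a counting argument: with B unchanged, the maximum number of disjoint pairs is the sum over values v of min(count of v in A, count of v in B); the mandatory replacement then buys one extra pair unless every index is already paired. A hedged sketch (hypothetical helper, not this repo's `beautifulPairs`):

```cpp
#include <algorithm>
#include <cassert>
#include <map>
#include <vector>

// Max disjoint pairs with B as-is = sum over values v of
// min(countA(v), countB(v)); the forced replacement adds one pair
// unless all n indices are already paired (then it costs one).
int beautifulPairsCount(const std::vector<int> &A, const std::vector<int> &B) {
    std::map<int, int> ca, cb;
    for (int x : A) ++ca[x];
    for (int x : B) ++cb[x];
    int pairs = 0;
    for (const auto &kv : ca) {
        auto it = cb.find(kv.first);
        if (it != cb.end())
            pairs += std::min(kv.second, it->second);
    }
    return pairs < (int)A.size() ? pairs + 1 : pairs - 1;
}
```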
namespace min_max {
// Complete the maxMin function below.
int maxMin(int k, vector<int> arr) {
sort(arr.begin(), arr.end());
const int n = arr.size() - k + 1;
int minrange, range;
minrange = arr.back() - arr.front();
for (int i = 0; i < n; i++) {
range = arr[i + k - 1] - arr[i];
if (range < minrange)
minrange = range;
}
return minrange;
}
}
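maxMin above relies on a sliding-window idea: after sorting, the k closest values sit next to each other, so only windows of width k need checking. A sketch (hypothetical `minRangeSketch`):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// After sorting, the k closest values are adjacent, so the minimum
// "unfairness" is the smallest arr[i + k - 1] - arr[i] over a window.
int minRangeSketch(int k, std::vector<int> arr) {
    std::sort(arr.begin(), arr.end());
    int best = arr.back() - arr.front();
    for (std::size_t i = 0; i + k <= arr.size(); ++i)
        best = std::min(best, arr[i + k - 1] - arr[i]);
    return best;
}
```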
namespace jim_and_the_orders {
// Complete the jimOrders function below.
vector<int> jimOrders(vector<vector<int>> orders) {
struct cmp {
// Map indices as follows:
// 0 - order number
// 1 - prep time
// 2 - customer number (natural order)
bool operator() (const vector<int> &a, const vector<int> &b) {
return a[0] + a[1] < b[0] + b[1] || (
a[0] + a[1] == b[0] + b[1] && a[2] < b[2]);
}
};
// Key the customer number to each element in orders, uses 1-based counting
int i = 1;
for (int j = 0; j < orders.size(); j++) {
orders[j].push_back(i++);
}
sort(orders.begin(), orders.end(), cmp {} );
// Debug
// for (auto x: orders)
// output_std(x);
vector<int> servedOrders(orders.size());
for (int i = 0; i < orders.size(); i++)
servedOrders[i] = orders[i][2];
return servedOrders;
}
void test() {
vector<vector<int>> orders = {
{1, 8},
{2, 4},
{3, 3}
};
COUT("Jim and the Orders");
auto result = jimOrders(orders);
COUT("Order in which customers are served");
output_std(result);
}
}
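The explicit customer-number key above can also be expressed with std::stable_sort, whose stability provides the natural-order tie-break for free. A sketch (hypothetical `serveOrder`):

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Serve order is ascending t + d; std::stable_sort keeps the original
// (customer-number) order on ties, which matches the tie-break rule.
std::vector<int> serveOrder(const std::vector<std::pair<int, int>> &orders) {
    std::vector<int> customers(orders.size());
    for (int i = 0; i < (int)customers.size(); ++i)
        customers[i] = i + 1; // 1-based customer numbers
    std::stable_sort(customers.begin(), customers.end(), [&](int a, int b) {
        return orders[a - 1].first + orders[a - 1].second <
               orders[b - 1].first + orders[b - 1].second;
    });
    return customers;
}
```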
namespace army_bases {
/*
* Complete the gameWithCells function below.
*/
int gameWithCells(int n, int m) {
// Wrong answer: it failed to count the interior connections
// return (n > 1 ? n - 1 : n) * (m > 1 ? m - 1 : m);
// Correct idea: each drop supplies a 2x2 block of bases at once, and no base is resupplied by a later drop
return (n / 2 + n % 2) * (m % 2 + m / 2);
}
}
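The closed form above is ceil(n/2) * ceil(m/2), since one drop can supply up to a 2x2 block of bases. A sketch using (n + 1) / 2 as the integer ceiling (hypothetical `dropsNeeded`):

```cpp
#include <cassert>

// The supply-drop count is ceil(n/2) * ceil(m/2): one drop serves a
// 2x2 block of bases, and (n + 1) / 2 is ceil(n/2) for nonnegative n.
int dropsNeeded(int n, int m) {
    return ((n + 1) / 2) * ((m + 1) / 2);
}
```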
namespace birthdayCakeCandles {
// Complete the birthdayCakeCandles function below.
int birthdayCakeCandles(vector<int> ar) {
// Aim to solve this problem in a single pass
long int index, tallest, count;
index = count = tallest = 0;
int n = ar.size();
while (index < n) {
if (ar[index] > tallest) {
tallest = ar[index];
// Reset count, and count for this instance
count = 1;
} else if (ar[index] == tallest)
count++;
index++;
}
return count;
}
}
namespace miniMaxSum {
// Complete the miniMaxSum function below.
void miniMaxSum(vector<int> arr) {
// Sum, minimum term -> form the max sum, maximum term -> form the min sum
long long int sum, min, max;
sum = max = 0;
min = UINT32_MAX; // We could just set min to the value of the first element in arr; By setting min explicitly, we handle the case arr is empty
for (int i = 0; i < arr.size(); i++) {
if (arr[i] > max)
max = arr[i];
if (arr[i] < min)
min = arr[i];
sum += arr[i];
}
// The minimum is printed first
cout << sum - max << ' ';
cout << sum - min;
}
}
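The same idea with the standard library: the min sum is the total minus the max element, the max sum is the total minus the min element, and std::minmax_element finds both extremes in one pass. A sketch (hypothetical `miniMax` helper):

```cpp
#include <algorithm>
#include <cassert>
#include <numeric>
#include <utility>
#include <vector>

// Returns {min sum, max sum}: total minus the max element, and total
// minus the min element, with both extremes found in one pass.
std::pair<long long, long long> miniMax(const std::vector<int> &a) {
    long long total = std::accumulate(a.begin(), a.end(), 0LL);
    auto mm = std::minmax_element(a.begin(), a.end());
    return {total - *mm.second, total - *mm.first};
}
```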
namespace forming_magic_square {
struct node {
float metric;
int replacement;
int selecti;
int selectj;
int cost;
};
void setNode(node &_node, float &metric, int &replacement, int &cost, int &selecti, int &selectj) {
_node.metric = metric;
_node.replacement = replacement;
_node.cost = cost;
_node.selecti = selecti;
_node.selectj = selectj;
}
bool isNewDigit(vector<vector<int>> &s, int &replacement) {
const int size = s.size();
for (int i = 0; i < size; i++)
for (int j = 0; j < size; j++)
if (s[i][j] == replacement)
return false;
return true;
}
float computeMetric(float directEffects, float indirectEffects, float cost) {
return (directEffects + indirectEffects) / cost;
}
// Complete the formingMagicSquare function below.
int formingMagicSquare(vector<vector<int>> s) {
// Forming a benefit:cost metric that parallels s
// The benefit I will define as the difference between current value for row, column, diag and 15; The cost I will take from the problem definition
// UPDATE: I add the sum of the effects going in the other directions to the benefit,
// in order to make 'intelligent' replacements.
// Example, if a replacement makes a row equal to the target of 15, an effect to add in is by how much farther or closer is the corresponding column, diagonal with the same entry
// The final metric is the ratio of [benefit + sum of the effects] to cost
const int targetSum = 15;
const int size = 3;
const int upperBoundOpen = 10;
const int numReplacements = 9; // Terminate if exceeds 9 replacements
const float lowerBoundMetricOpen = -4 * upperBoundOpen;
const float lowerBoundMetricClosed = 0;
int actualNumReplacements;
vector<int> sumrows(size), sumcolumns(size), sumdiags(2);
// Used to ensure the replacement is not made on the same entry
vector<vector<bool>> selectIndicies(size);
vector<bool> consumedDigits(upperBoundOpen);
// Best metric is used to decide which replacement to make
node bestMetric {}, bestMetricNewDigit{};
node *chosenMetric;
float metric, ieffects;
bool solved;
int dummyReplacement, dummyCost, a, b;
int totalCost;
// Initialize
for (int i = 1; i < upperBoundOpen; i++)
consumedDigits[i] = false;
for (int i = 0; i < size; i++) {
selectIndicies[i].resize(0);
for (int j = 0; j < size; j++)
selectIndicies[i].push_back(false);
}
totalCost = 0;
for (actualNumReplacements = 0; actualNumReplacements < numReplacements; actualNumReplacements++) {
// Reset the best metric
bestMetric.metric = lowerBoundMetricOpen;
bestMetricNewDigit.metric = lowerBoundMetricOpen;
solved = true;
// Find the state of the rows, columns, diags
// Find their sums relative to targetSum
sumdiags[1] = sumdiags[0] = targetSum;
for (int i = 0; i < size; i++) {
// Diags
sumdiags[0] -= s[i][i];
sumdiags[1] -= s[i][size - 1 - i];
sumrows[i] = sumcolumns[i] = targetSum;
for (int j = 0; j < size; j++) {
sumrows[i] -= s[i][j];
sumcolumns[i] -= s[j][i];
}
if (!sumrows[i] || !sumcolumns[i])
solved = false;
}
if (!sumdiags[0] || !sumdiags[1])
solved = false;
if (solved)
break;
// Calculate the 'benefit' for each entry of the matrix by finding a
// replacement which makes a row, column, or diagonal equal to targetSum.
// The replacement is restricted to [1, 9]. The effect can be positive or
// negative, depending on whether a row or column ends nearer the target
// sum in magnitude after the replacement. If the cost is 0, the
// replacement is void.
for (int i = 0; i < size; i++) {
for (int j = 0; j < size; j++) {
if (selectIndicies[i][j])
continue;
// Replacement for row
if (sumrows[i]) {
dummyReplacement = s[i][j] + sumrows[i];
if (dummyReplacement > 0 && dummyReplacement < upperBoundOpen &&
!consumedDigits[dummyReplacement]) {
dummyCost = abs(sumrows[i]);
ieffects = abs(sumcolumns[i]) -
abs(sumcolumns[i] - sumrows[i]); // Variable effects on column
if (i == j)
ieffects += abs(sumdiags[0]) -
abs(sumdiags[0] - sumrows[i]); // Variable effects on diagonal
if (i == size - 1 - j)
ieffects += abs(sumdiags[1]) -
abs(sumdiags[1] - sumrows[i]); // Variable effects on diagonal
// Assume the difference of the sum of the row and its target is equal to the difference of the entry and its replacement
metric = computeMetric(dummyCost, ieffects, dummyCost);
chosenMetric = isNewDigit(s, dummyReplacement) ? &bestMetricNewDigit : &bestMetric;
if (metric > chosenMetric->metric) {
setNode(
*chosenMetric,
metric,
dummyReplacement,
dummyCost,
i,
j
);
}
}
}
// Replacement for column
if (sumcolumns[i]) {
dummyReplacement = s[i][j] + sumcolumns[i];
if (dummyReplacement > 0 && dummyReplacement < upperBoundOpen &&
!consumedDigits[dummyReplacement]) {
dummyCost = abs(sumcolumns[i]);
ieffects = abs(sumrows[i]) -
abs(sumrows[i] - sumcolumns[i]); // Variable effects on row
if (i == j)
ieffects += abs(sumdiags[0]) -
abs(sumdiags[0] - sumcolumns[i]); // Variable effects on diagonal
if (i == size - 1 - j)
ieffects += abs(sumdiags[1]) -
abs(sumdiags[1] - sumcolumns[i]); // Variable effects on diagonal
metric = computeMetric(dummyCost, ieffects, dummyCost);
chosenMetric = isNewDigit(s, dummyReplacement) ? &bestMetricNewDigit : &bestMetric;
if (metric > chosenMetric->metric) {
setNode(
*chosenMetric,
metric,
dummyReplacement,
dummyCost,
i,
j
);
}
}
}
// Replacement for Diagonal 1
if (i == j && sumdiags[0]) {
dummyReplacement = s[i][j] + sumdiags[0];
if (dummyReplacement > 0 && dummyReplacement < upperBoundOpen &&
!consumedDigits[dummyReplacement]) {
dummyCost = abs(sumdiags[0]);
ieffects = abs(sumrows[i]) -
abs(sumrows[i] - sumdiags[0]); // Variable effects on row
ieffects += abs(sumcolumns[i]) -
abs(sumcolumns[i] - sumdiags[0]); // Variable effects on column
if (i == size - 1 - j)
ieffects += abs(sumdiags[1]) -
abs(sumdiags[1] - sumdiags[0]); // Variable effects on diag 2
metric = computeMetric(dummyCost, ieffects, dummyCost);
chosenMetric = isNewDigit(s, dummyReplacement) ? &bestMetricNewDigit :
&bestMetric;
if (metric > chosenMetric->metric) {
setNode(
*chosenMetric,
metric,
dummyReplacement,
dummyCost,
i,
j
);
}
}
}
// Replacement for Diagonal 2
if (i == size - 1 - j && sumdiags[1]) {
dummyReplacement = s[i][j] + sumdiags[1];
if (dummyReplacement > 0 && dummyReplacement < upperBoundOpen &&
!consumedDigits[dummyReplacement]) {
dummyCost = abs(sumdiags[1]);
ieffects = abs(sumrows[i]) -
abs(sumrows[i] - sumdiags[1]); // Variable effects on row
ieffects += abs(sumcolumns[i]) -
abs(sumcolumns[i] - sumdiags[1]); // Variable effects on column
if (i == j)
ieffects += abs(sumdiags[0]) -
abs(sumdiags[0] - sumdiags[1]); // Variable effects on diag 1
metric = computeMetric(dummyCost, ieffects, dummyCost);
chosenMetric = isNewDigit(s, dummyReplacement) ? &bestMetricNewDigit : &bestMetric;
if (metric > chosenMetric->metric) {
setNode(
*chosenMetric,
metric,
dummyReplacement,
dummyCost,
i,
j
);
}
}
}
}
}
// Give priority to the best metric when a new digit is introduced
if (bestMetricNewDigit.metric >= lowerBoundMetricClosed)
chosenMetric = &bestMetricNewDigit;
else if (bestMetric.metric >= lowerBoundMetricClosed)
chosenMetric = &bestMetric;
else
chosenMetric = NULL;
if (chosenMetric)
{
dummyReplacement = chosenMetric->replacement;
dummyCost = chosenMetric->cost;
a = chosenMetric->selecti;
b = chosenMetric->selectj;
consumedDigits[dummyReplacement] = true;
s[a][b] = dummyReplacement;
totalCost += dummyCost;
selectIndicies[a][b] = true;
cout << "s[" << a << "][" << b << "] = "
<< dummyReplacement <<
" with cost = " << dummyCost << endl;
} else
break;
}
cout << "Actual number of replacements = " << actualNumReplacements << endl;
cout << endl;
cout << s[0][0] << ' ' << s[0][1] << ' ' << s[0][2] << endl;
cout << s[1][0] << ' ' << s[1][1] << ' ' << s[1][2] << endl;
cout << s[2][0] << ' ' << s[2][1] << ' ' << s[2][2] << endl;
return totalCost;
}
}
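The heuristic above is not guaranteed optimal. For 3x3 there is an exact alternative: only 8 magic squares exist (rotations and reflections of the Lo Shu square), so the minimum cost is the smallest elementwise distance to any of them. A sketch (hypothetical `magicSquareMinCost`, not this repo's function):

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Exact: try all 8 order-3 magic squares (rotations/reflections of
// one) and take the cheapest elementwise conversion.
int magicSquareMinCost(const std::vector<std::vector<int>> &s) {
    static const int M[8][9] = {
        {8,1,6,3,5,7,4,9,2}, {6,1,8,7,5,3,2,9,4},
        {4,9,2,3,5,7,8,1,6}, {2,9,4,7,5,3,6,1,8},
        {8,3,4,1,5,9,6,7,2}, {4,3,8,9,5,1,2,7,6},
        {6,7,2,1,5,9,8,3,4}, {2,7,6,9,5,1,4,3,8},
    };
    int best = 1 << 30;
    for (const auto &m : M) {
        int cost = 0;
        for (int i = 0; i < 9; ++i)
            cost += std::abs(s[i / 3][i % 3] - m[i]);
        if (cost < best) best = cost;
    }
    return best;
}
```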
namespace lowest_triangle {
int lowestTriangle(int base, int area){
return ceil(area * 2 / (float)base);
}
}
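The float-based ceil above works for typical inputs, but the ceiling can be done in pure integer arithmetic, which avoids precision worries on large values. A sketch (hypothetical `lowestTriangleHeight`):

```cpp
#include <cassert>

// Integer ceiling of 2*area/base without floating point:
// (2*area + base - 1) / base, widened to avoid overflow.
int lowestTriangleHeight(int base, int area) {
    return (int)((2LL * area + base - 1) / base);
}
```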
namespace plus_minus {
// Complete the plusMinus function below.
void plusMinus(vector<int> arr) {
const int precision = 6;
const int width = precision;
int n, neg, p, z;
double dN;
n = arr.size();
dN = n;
p = z = neg = 0;
for (int i = 0; i < n; i++) {
if (arr[i] > 0)
p++;
else if (arr[i] < 0)
neg++;
else
z++;
}
// Print the output to stdout
cout << setprecision(precision) << fixed << p / dN << endl;
cout << setprecision(precision) << fixed << neg / dN << endl;
cout << setprecision(precision) << fixed << z / dN << endl;
}
}
namespace paper_cutting {
// Complete the solve function below.
long solve(int n, int m) {
// Cut the smaller side a - 1 times to get [number of strips = a] of b squares to cut b - 1 times. Hence,
// a - 1 + a * (b - 1)
// == (a * (1) + a * (b - 1)) - 1
// == a * b - 1;
return (long)n * m - 1;
}
}
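The derivation in the comment boils down to: every straight cut increases the piece count by exactly one, so reaching n*m unit squares from a single sheet always takes n*m - 1 cuts, regardless of cutting order. A sketch (hypothetical `cutsNeeded`):

```cpp
#include <cassert>

// Each cut splits one piece into two, so going from 1 piece to n*m
// unit squares always takes exactly n*m - 1 cuts.
long cutsNeeded(int n, int m) {
    return (long)n * m - 1;
}
```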
namespace special_multiples {
// Complete the solve function below.
string solve(int n) {
// Cache the candidate numbers (digits 9 and 0 only) across calls
// Cache the solution for each n across calls
static vector<unsigned long long> nines_numbers(1, 9);
static string solved_numbers[501];
if (solved_numbers[n].length() != 0)
return solved_numbers[n];
int index = 0;
while (true) {
if (index == nines_numbers.size()) {
// Extend the candidates by appending a 0 or 9 digit. Expand only the
// most recent generation, otherwise earlier numbers are re-appended
// as duplicates on every pass
static size_t expandFrom = 0;
size_t expandTo = nines_numbers.size();
for (size_t j = expandFrom; j < expandTo; j++) {
nines_numbers.push_back(nines_numbers[j] * 10);
nines_numbers.push_back(nines_numbers[j] * 10 + 9);
}
expandFrom = expandTo;
}
if (nines_numbers[index] % n == 0) {
solved_numbers[n] = to_string(nines_numbers[index]);
return solved_numbers[n];
}
index++;
}
}
}
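The generation above can also be phrased as a breadth-first search over candidates: starting from 9, append a 0 or a 9 digit, which enumerates the 9/0 numbers in increasing order with no duplicates. A sketch (hypothetical `smallestNinesZeros`; overflow past 19 digits is out of scope here):

```cpp
#include <cassert>
#include <vector>

// Smallest multiple of n whose decimal digits are only 9 and 0:
// BFS over candidates 9, 90, 99, 900, 909, 990, 999, ...
unsigned long long smallestNinesZeros(int n) {
    std::vector<unsigned long long> q(1, 9ULL);
    for (std::size_t i = 0; i < q.size(); ++i) {
        if (q[i] % n == 0)
            return q[i];
        q.push_back(q[i] * 10);      // append a 0 digit
        q.push_back(q[i] * 10 + 9);  // append a 9 digit
    }
    return 0; // unreachable for n >= 1
}
```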
namespace staircase {
// Complete the staircase function below.
void staircase(int n) {
for (int i = n; i > 0; i--) {
for (int j = 0; j < n; j++) {
if (j < i - 1)
cout << ' ';
else
cout << '#';
}
cout << endl;
}
}
}
namespace time_conversion {
/*
* Complete the timeConversion function below.
*/
string timeConversion(string s) {
int hours = stoi(s.substr(0, 2)) % 12;
if (tolower(s.at(8)) == 'p')
hours += 12;
return (hours < 10 ? "0" : "") + to_string(hours) + s.substr(2, 6);
}
}
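The hour arithmetic above (hour % 12, plus 12 for PM) is worth isolating so the midnight/noon edge cases are easy to check. A sketch (hypothetical `to24Hour` helper):

```cpp
#include <cassert>

// 12h -> 24h hour arithmetic: 12 AM maps to 0, 12 PM stays 12.
int to24Hour(int hour12, bool pm) {
    return hour12 % 12 + (pm ? 12 : 0);
}
```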
namespace cutting_boards_hard {
// Folds a product that needs 64 bits into a cost that fits in an int (mod DIVISOR)
const int DIVISOR = 1000000007;
void updateTotalCostOfCuts(int &cost, int &numberOfSegments, int &costSoFar) {
// Widen before multiplying: cost * numberOfSegments overflows int
long long costPerSlice = (static_cast<long long>(cost) * numberOfSegments) % DIVISOR;
costPerSlice = (costPerSlice + costSoFar) % DIVISOR;
costSoFar = static_cast<int>(costPerSlice);
}
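Multiplying a cost by a segment count can exceed int range (costs near 1e9 times up to ~1e6 segments need 64 bits), so the product must be widened before the modulo. A standalone sketch of that detail (hypothetical `mulMod`, not from this repo):

```cpp
#include <cassert>

// Widen to 64 bits before multiplying, then reduce mod m; the naive
// int product a * b overflows long before the modulo is applied.
int mulMod(int a, int b, int m) {
    return (int)((long long)a * b % m);
}
```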
// Complete the boardCutting function below.
// Minimize the cost of cutting a board into 1x1 squares
int boardCutting(vector<int> cost_y, vector<int> cost_x) {
int xit, yit, numHorizontalSegments, numVerticalSegments;
// Sort reverse of natural order, greatest cost first
sort(cost_y.begin(), cost_y.end(), greater<int>());
sort(cost_x.begin(), cost_x.end(), greater<int>());
int totalCost = 0;
xit = yit = 0;
numHorizontalSegments = numVerticalSegments = 1;
// The cuts with the GREATEST costs are made first.
// If a cut along rows ties on cost with a cut along columns,
// then choose the cut with the FEWER segments across first
while (xit < cost_x.size() && yit < cost_y.size()) {
if (cost_x[xit] > cost_y[yit]) {
// Cut along the column, across the current number of vertical segments
updateTotalCostOfCuts(cost_x[xit++], numVerticalSegments, totalCost);
numHorizontalSegments++;
} else if (cost_x[xit] == cost_y[yit]) {
if (numVerticalSegments < numHorizontalSegments) {
// Cut along the column, across the current number of vertical segments
updateTotalCostOfCuts(cost_x[xit++], numVerticalSegments, totalCost);
numHorizontalSegments++;
} else {
// Cut along the row, across the current number of horizontal segments
updateTotalCostOfCuts(cost_y[yit++], numHorizontalSegments, totalCost);
numVerticalSegments++;
}
} else {
// Cut along the row, across the current number of horizontal segments
updateTotalCostOfCuts(cost_y[yit++], numHorizontalSegments, totalCost);
numVerticalSegments++;
}
}
while (xit < cost_x.size()) {
updateTotalCostOfCuts(cost_x[xit++], numVerticalSegments, totalCost);
COUT("Just column cuts");
}
while (yit < cost_y.size()) {
updateTotalCostOfCuts(cost_y[yit++], numHorizontalSegments, totalCost);
COUT("Just row cuts");
}
return totalCost;
}
void test() {
// vector<int> cost_y = {
// 2, 1, 3, 1, 4
// };
// vector<int> cost_x = {
// 4, 1, 2
// };
COUT("Cuttings Boards");
COUT(("Total cost equivalence = "+to_string(boardCutting(
cutting_boards_hard::tc_y_08,
cutting_boards_hard::tc_x_08
))).data());
COUT(("Expected answer = "+to_string(cutting_boards_hard::tc_answers[7])).data());
/*
* Cuttings Boards
Total cost equivalence = 42
Complete
*/
/*
* // testcase 08
* Cuttings Boards
Total cost equivalence = 856128333
Expected answer = 604491628
Complete
*/
/*
* // testcase 06
*Cuttings Boards
Total cost equivalence = 986256778
Expected answer = 288670782
Complete
*/
/*
* // testcase 03
* Cuttings Boards
Total cost equivalence = 129443957
Expected answer = 733709321
Complete
*/
/*
* // testcase 01
* Cuttings Boards
D: Total cost equivalence = 127210119
Expected answer = 278642107
Complete
*/
// The error lies in the computation of cost: the int product cost * numberOfSegments overflows before the modulo is applied
}
}
/* The main driver
*
*
*
*
*
*
*
*
*/
int main() {
// hacker_two::test();
// hacker_three::test();
// hacker_four::test();
// grading_students::test();
// min_absolute_distance_array::test();
// grid_challenge::test();
// marcsCakewalk::test();
// luck_balance::test();
// maximumPerimeterTriangle::test();
// largestPermutation::test();
// greedy_florist::test();
// beautiful_pairs::test();
// jim_and_the_orders::test();
cutting_boards_hard::test();
COUT("Complete");
return 0;
}

//
// Created by john on 7/31/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_LEONARDPRIMEFACTORS_H
#define NATIVECPLUSPLUSWRAPPER_LEONARDPRIMEFACTORS_H
int prime_driver();
int primeCount(long n);
int nextPrime(bool fromBeginning);
#endif //NATIVECPLUSPLUSWRAPPER_LEONARDPRIMEFACTORS_H

//
// Created by john on 7/3/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_BESTCOINS_H
#define NATIVECPLUSPLUSWRAPPER_BESTCOINS_H
// Function Prototype
void makeChange(int, int, int[], int);
int makeChange_driver();
#endif //NATIVECPLUSPLUSWRAPPER_BESTCOINS_H

#include <vector>
#include "util.h"
using namespace std;
//
// Created by john on 7/3/18.
//
const char * STANDARD_OUTPUT_TAG = "CodeChallenge";

//
// Created by john on 8/1/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_HACKER_RANK_H
#define NATIVECPLUSPLUSWRAPPER_HACKER_RANK_H
#include <vector>
int main();
namespace beautiful_pairs {
extern const std::vector<int> testCase01_A;
extern const std::vector<int> testCase01_B;
extern const int testCase01_Ans;
}
namespace cutting_boards_hard {
extern const std::vector<int> tc_y_01;
extern const std::vector<int> tc_x_01;
extern const std::vector<int> tc_y_02;
extern const std::vector<int> tc_x_02;
extern const std::vector<int> tc_y_03;
extern const std::vector<int> tc_x_03;
extern const std::vector<int> tc_y_04;
extern const std::vector<int> tc_x_04;
extern const std::vector<int> tc_y_05;
extern const std::vector<int> tc_x_05;
extern const std::vector<int> tc_y_06;
extern const std::vector<int> tc_x_06;
extern const std::vector<int> tc_y_07;
extern const std::vector<int> tc_x_07;
extern const std::vector<int> tc_y_08;
extern const std::vector<int> tc_x_08;
extern const std::vector<int> tc_y_09;
extern const std::vector<int> tc_x_09;
extern const std::vector<int> tc_y_10;
extern const std::vector<int> tc_x_10;
extern const std::vector<int> tc_answers;
}
#endif //NATIVECPLUSPLUSWRAPPER_HACKER_RANK_H
<file_sep>//
// Problem from HackerRank.com
// Created by john on 7/5/18.
//
#include "magic-vowels.h"
#include "util.h"
using namespace std;
int magicVowels_driver() {
string letters("zzdsfasdegsagweagaeiouuuuoiuiououaaasdbadaaaaeeeeeeeeiiiiiooozzzzzaaaaaeeeeeeeeeeeeeiiiiiiiiooooooouuuuuuuuuuuuuuuuuuuuu");
// string letters("zzdsfasdegsagweagaeiouuuuoiuiououaaasdbadaaaaeeeeeeeeiiiiiooozzzzzbbbbbeeeeeeeeeeeeeiiiiiiiiooooooouuuuuuuuuuuuuuuuuuuuu");
// string letters("ab"); // string.size() does not include the null terminator
COUT("A very long string: ");
COUT(letters.data());
COUT("length of the longest sequence with repeats and in-order of \'aeiou\': ");
COUT(to_string(longestSequenceMagicVowels(letters)).data());
return 0;
}
int longestSequenceMagicVowels(string s) {
int maxSequence = 0;
int currentSequence = 0;
int prevChar = 0,
currentChar;
unsigned long nonNullLength = s.size();
for (size_t i = 0; i < nonNullLength; i++) {
currentChar = tolower(s.at(i));
if (currentSequence > 0) { // Guard against false positives at the start: e.g. "iou" steps and terminates correctly but does not start with 'a'
// Step 'up' from or repeat letters
switch (prevChar) {
case 'u':
if (currentChar == 'u') // Repeating u
currentSequence++;
else {
if (currentSequence > maxSequence) // Accepted string
maxSequence = currentSequence;
// Restart immediately when this character opens a new sequence
currentSequence = currentChar == 'a' ? 1 : 0;
}
break;
case 'a':
if (currentChar == 'a' || currentChar == 'e')
currentSequence++;
else
currentSequence = 0;
break;
case 'e':
if (currentChar == 'e' || currentChar == 'i')
currentSequence++;
else
currentSequence = currentChar == 'a' ? 1 : 0; // An 'a' restarts the sequence
break;
case 'i':
if (currentChar == 'i' || currentChar == 'o')
currentSequence++;
else
currentSequence = currentChar == 'a' ? 1 : 0; // An 'a' restarts the sequence
break;
case 'o':
if (currentChar == 'o' || currentChar == 'u')
currentSequence++;
else
currentSequence = currentChar == 'a' ? 1 : 0; // An 'a' restarts the sequence
break;
}
} else if (currentChar == 'a')
currentSequence = 1; // The start of the magic sequence
// Update
prevChar = currentChar;
}
// Handle the edge case the accepted sequence was terminated by end of string
if (prevChar == 'u' && currentSequence > maxSequence)
return currentSequence;
else
return maxSequence;
}

//
// Created by john on 7/5/18.
//
#ifndef NATIVECPLUSPLUSWRAPPER_MAGIC_VOWELS_H
#define NATIVECPLUSPLUSWRAPPER_MAGIC_VOWELS_H
#include <string>
int magicVowels_driver();
int longestSequenceMagicVowels(std::string s);
#endif //NATIVECPLUSPLUSWRAPPER_MAGIC_VOWELS_H
SET client_min_messages TO WARNING;
CREATE TABLE IF NOT EXISTS wikidata_classes (
wikidataId VARCHAR(128) PRIMARY KEY,
found_in VARCHAR(128)[],
class_name VARCHAR(255)
);
-- adding classification
INSERT INTO wikidata_classes VALUES ('Q5', '{"P31","P279"}', 'human') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q2985549', '{"P31"}', 'mononymous person') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q20643955', '{"P31"}', 'human biblical figure') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q6581097', '{"P21"}', 'gender (male)') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q15145778', '{"P21"}', 'gender (male cis)') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q6581072', '{"P21"}', 'gender (female)') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q15145779', '{"P21"}', 'gender (female cis)') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q1052281', '{"P21"}', 'gender (female trans)') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q2449503', '{"P21"}', 'gender (male trans)') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q1097630', '{"P21"}', 'gender (intersex)') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q28640', '{"P31"}', 'profession') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q756', '{"P279"}', 'plant') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q10884', '{"P279"}', 'tree') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q729', '{"P279"}', 'animal') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q38829', '{"P31"}', 'animal') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q5113', '{"P171"}','bird') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q22007593', '{"P171"}','bird') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q5856078', '{"P171"}','bird') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q618828', '{"P171"}','butterfly') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q958314','{"P31"}','grape variety') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q6256', '{"P31","P279"}', 'country') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q486972', '{"P31","P279"}', 'human settlement') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q285451', '{"P31","P279"}', 'river system') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q4022', '{"P31","P279"}', 'river system') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q23397', '{"P31"}', 'lake') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q44', '{"P31","P279"}', 'beer') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_classes VALUES ('Q1998962', '{"P31"}', 'beer style') ON CONFLICT DO NOTHING;
-- table for subcategories
CREATE TABLE IF NOT EXISTS wikidata_subcategories (
propertyId VARCHAR(128) PRIMARY KEY
);
-- add subcategories
INSERT INTO wikidata_subcategories VALUES ('P31') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_subcategories VALUES ('P171') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_subcategories VALUES ('P279') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_subcategories VALUES ('P361') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_subcategories VALUES ('P366') ON CONFLICT DO NOTHING;
INSERT INTO wikidata_subcategories VALUES ('P427') ON CONFLICT DO NOTHING;
-- table for link the classification
CREATE TABLE IF NOT EXISTS wikidata_class_links (
wikidataIdClass VARCHAR(128),
wikidataIdEntity VARCHAR(128),
wikidataProperty VARCHAR(64),
CONSTRAINT class_link_unique UNIQUE (wikidataIdClass, wikidataProperty, wikidataIdEntity)
);
CREATE INDEX IF NOT EXISTS wikidata_class_links_entity_idx ON wikidata_class_links(wikidataIdEntity);
-- delete previous/replaced classes
DELETE FROM wikidata_classes WHERE wikidataId='Q55659167';
DELETE FROM wikidata_class_links WHERE wikidataIdClass='Q55659167';

#!/bin/bash
. ./config
. ./tools/bash_functions.sh
# define logfile name
logfile_name=update_`date '+%Y-%m-%d_%H-%M-%S'`_$1.log
# Check already running
if [ "$(pidof -x $(basename $0))" != $$ ]
then
echo_time "Update is already running"
exit
fi
# find osmupdate
export PATH=`pwd`/tools/:$PATH
oupdate=`which osmupdate 2> /dev/null`
oconvert=`which osmconvert 2> /dev/null` # needed by osmupdate
# Creating tmp-directory for updates
mkdir -p tmp
# if not found, compile them
if [ -z "$oupdate" ]
then
echo_time "Try to complie osmupdate ..."
wget -O - http://m.m.i24.cc/osmupdate.c | cc -x c - -o tools/osmupdate
if [ ! -f tools/osmupdate ]
then
echo_time "Unable to compile osmupdate, please install osmupdate into \$PATH or in tools/ directory."
exit
fi
fi
if [ -z "$oconvert" ]
then
echo_time "Try to complie osmconvert ..."
wget -O - http://m.m.i24.cc/osmconvert.c | cc -x c - -lz -O3 -o tools/osmconvert
if [ ! -f tools/osmconvert ]
then
echo_time "Unable to compile osmconvert, please install osmconvert into \$PATH or tools/ directory."
exit
fi
fi
if [ -f tmp/old_update.osc.gz ]
then
# 2nd Update
osmupdate -v $osmupdate_parameter tmp/old_update.osc.gz tmp/update.osc.gz
else
# 1st Update using dump
osmupdate -v $osmupdate_parameter $IMPORT_PBF tmp/update.osc.gz
fi
# catch exit code of osmupdate
RESULT=$?
if [ $RESULT -ne 0 ]
then
echo_time "osmupdate exits with error code $RESULT."
exit 1
fi
imposm diff -config config.json -connection "postgis://${PGUSER}:${PGPASSWORD}@${PGHOST}:${PGPORT}/${PGDATABASE}?prefix=imposm_" \
-cachedir $cache_dir -diffdir $diff_dir tmp/update.osc.gz
RESULT=$?
if [ $RESULT -ne 0 ]
then
echo_time "imposm3 exits with error code $RESULT."
exit 1
fi
mv tmp/update.osc.gz tmp/old_update.osc.gz
echo_time "Update table with clustered roads"
psql -f sql/updateClusteredRoads.sql > /dev/null
echo_time "Update completed."

SET client_min_messages TO WARNING;
CREATE OR REPLACE VIEW export_eqs_all AS
SELECT name,
eqsGetGender("name:etymology:wikidata") AS gender,
eqsIsHuman("name:etymology:wikidata") AS IsHuman,
eqsGetBirth("name:etymology:wikidata") AS birth,
eqsGetDeath("name:etymology:wikidata") AS death,
eqsGetImage("name:etymology:wikidata") AS image,
array_to_string("name:etymology:wikidata",',') AS etymology,
classification,
array_to_string(cr.wikidata,',') AS wikidata,
cr.geom
FROM clustered_roads AS cr
LEFT JOIN LATERAL (SELECT string_agg(DISTINCT class_name, ', ') AS classification FROM wikidata_class_links AS wcl
LEFT JOIN wikidata_classes AS wdc ON wcl.wikidataIdClass=wdc.wikidataId AND wcl.wikidataProperty=ANY(found_in)
WHERE wcl.wikidataIdEntity=ANY("name:etymology:wikidata")
) AS wdclassify ON true;
-- CSV for all.csv (new)
COPY (SELECT name, gender, birth, death, IsHuman, etymology, classification, wikidata, image
FROM export_eqs_all
WHERE st_within(geom,(SELECT geometry FROM imposm_admin WHERE osm_id=##RELATION##))
ORDER BY name) TO STDOUT CSV header;
<file_sep>SET client_min_messages TO WARNING;
-- all.json
COPY (SELECT
json_build_object(
'type', 'FeatureCollection',
'features', jsonb_agg(json_feature)
) AS json_data
FROM eqs_features
WHERE st_within(geometry,(SELECT geometry FROM imposm_admin WHERE osm_id=##RELATION##))) TO STDOUT;
<file_sep>#!/bin/bash
# Reloads wikidata entries that have redirect errors
# TODO: reload more
. ./config
. ./tools/bash_functions.sh
invalid_items=./log/invalid_wikidata.log
missing_items=./log/missing_wikidata.log
redirect_items=./log/redirect_wikidata.log
function reload_wd() {
echo_time "Delete $1 from database"
# delete item
psql -c "delete from wikidata where wikidataid = '$1'"
# delete links
psql -c "delete from wikidata_class_links where wikidataIdClass = '$1' OR wikidataIdEntity = '$1'"
# delete cache
rm cache_wikidata/$1.json
}
needLoad=0
echo_time "Start deleting entities which have redirects in wikidata"
while read line
do
from=`echo ${line} | cut -f1 -d" "`
to=`echo ${line} | cut -f3 -d" "`
# read from process substitution instead of a pipe, so needLoad set inside the loop survives
while read line
do
wikidataId=${line%\|*}
name=${line#*\|}
echo_time "${wikidataId} has redirect from ${from} to ${to}"
reload_wd ${wikidataId}
needLoad=1
done < <(psql -t -X --quiet --no-align -c "SELECT wikidataSource, wikidataSourceName FROM wikidata_depenencies WHERE wikidataId='${from}'")
echo -n "."
done < ${redirect_items}
echo
if [ ${needLoad} -eq 1 ]
then
echo_time "start loading missing wikidata ..."
./loadWikidata.sh
fi
<file_sep>import http.server
import re
import os
import psycopg2
# Database to connect to
DATABASE = {
'user': os.getenv('PGUSER', 'osm'),
'password': os.getenv('PGPASSWORD', 'osm'),
'host': os.getenv('PGHOST', 'localhost'),
'port': os.getenv('PGPORT', '5432'),
'database': os.getenv('PGDATABASE', 'osm_etymology')
}
# Table to query for MVT data, and columns to
# include in the tiles.
TABLE_LOW = {
'table': 'vector_lowlevel',
'srid': '3857',
'geomColumn': 'geom',
'attrColumns': 'array_to_json(osm_ids) AS osm_ids, array_to_json(wikidata) AS wikidata, name, road_types AS highway, array_to_json("name:etymology:wikidata") AS "name:etymology:wikidata"'
}
TABLE_HIGH = {
'table': 'vector_highlevel',
'srid': '3857',
'geomColumn': 'geom',
'attrColumns': 'array_to_json(osm_ids) AS osm_ids, array_to_json(wikidata) AS wikidata, name, road_types AS highway, array_to_json("name:etymology:wikidata") AS "name:etymology:wikidata", wd_label AS "name:etymology:wikidata:name", wd_desc AS "name:etymology:wikidata:description", gender AS "name:etymology:wikidata:gender", ishuman AS "name:etymology:wikidata:human", birth AS "name:etymology:wikidata:birth", death AS "name:etymology:wikidata:death", image AS "name:etymology:wikidata:image", classification AS "name:etymology:wikidata:classification"'
}
# HTTP server information
HOST = 'localhost'
PORT = 9009
########################################################################
class TileRequestHandler(http.server.BaseHTTPRequestHandler):
DATABASE_CONNECTION = None
# Search REQUEST_PATH for /{z}/{x}/{y}.{format} patterns
def pathToTile(self, path):
m = re.search(r'^\/(\d+)\/(\d+)\/(\d+)\.(\w+)', path)
if (m):
return {'zoom': int(m.group(1)),
'x': int(m.group(2)),
'y': int(m.group(3)),
'format': m.group(4)}
else:
return None
# Do we have all keys we need?
# Do the tile x/y coordinates make sense at this zoom level?
def tileIsValid(self, tile):
if not ('x' in tile and 'y' in tile and 'zoom' in tile):
return False
if 'format' not in tile or tile['format'] not in ['pbf', 'mvt']:
return False
size = 2 ** tile['zoom']
if tile['x'] >= size or tile['y'] >= size:
return False
if tile['x'] < 0 or tile['y'] < 0:
return False
return True
# Generate a SQL query to pull a tile worth of MVT data
# from the table of interest.
def envelopeToSQL(self, tile):
if tile['zoom'] < 11:
tbl = TABLE_LOW.copy()
else:
tbl = TABLE_HIGH.copy()
tbl['zoom'] = tile['zoom']
tbl['y'] = tile['y']
tbl['x'] = tile['x']
# Materialize the bounds
# Select the relevant geometry and clip to MVT bounds
# Convert to MVT format
sql_tmpl = """
WITH
d{table} AS (
SELECT ST_CollectionExtract({geomColumn}, 2) AS d{geomColumn}, * FROM {table} WHERE {geomColumn} && ST_TileEnvelope({zoom}, {x}, {y}, margin => (64.0 / 4096))
UNION ALL
SELECT ST_CollectionExtract({geomColumn}, 3) AS d{geomColumn}, * FROM {table} WHERE {geomColumn} && ST_TileEnvelope({zoom}, {x}, {y}, margin => (64.0 / 4096))
),
mvtgeom AS (
SELECT ST_AsMVTGeom(ST_Transform(t.d{geomColumn}, 3857), ST_TileEnvelope({zoom}, {x}, {y}), extent => 4096, buffer => 64) AS geom,
{attrColumns}
FROM d{table} t
)
SELECT ST_AsMVT(mvtgeom.*, 'wikidata') FROM mvtgeom
"""
return sql_tmpl.format(**tbl)
# Run tile query SQL and return error on failure conditions
def sqlToPbf(self, sql):
# Make and hold connection to database
if not self.DATABASE_CONNECTION:
try:
self.DATABASE_CONNECTION = psycopg2.connect(**DATABASE)
except (Exception, psycopg2.Error) as error:
self.send_error(500, "cannot connect: %s" % (str(DATABASE)))
return None
# Query for MVT
with self.DATABASE_CONNECTION.cursor() as cur:
cur.execute(sql)
if not cur:
self.send_error(404, "sql query failed: %s" % (sql))
return None
return cur.fetchone()[0]
return None
# Handle HTTP GET requests
def do_GET(self):
tile = self.pathToTile(self.path)
if not (tile and self.tileIsValid(tile)):
self.send_error(400, "invalid tile path: %s" % (self.path))
return
sql = self.envelopeToSQL(tile)
pbf = self.sqlToPbf(sql)
# self.log_message("path: %s\ntile: %s" % (self.path, tile))
# self.log_message("sql: %s" % (sql))
self.send_response(200)
self.send_header("Access-Control-Allow-Origin", "*")
self.send_header("Content-type", "application/vnd.mapbox-vector-tile")
self.end_headers()
self.wfile.write(pbf)
########################################################################
with http.server.ThreadingHTTPServer((HOST, PORT), TileRequestHandler) as server:
try:
print("serving at port", PORT)
server.serve_forever()
except KeyboardInterrupt:
if TileRequestHandler.DATABASE_CONNECTION:
TileRequestHandler.DATABASE_CONNECTION.close()
print('^C received, shutting down server')
server.socket.close()
<file_sep>
import pathlib
import os
import gzip
import psycopg2
from tqdm import tqdm
from multiprocessing import Pool, RLock
# Database connection from environment
DATABASE = {
'user': os.getenv('PGUSER', 'osm'),
'password': os.getenv('PGPASSWORD', 'osm'),
'host': os.getenv('PGHOST', 'localhost'),
'port': os.getenv('PGPORT', '5432'),
'database': os.getenv('PGDATABASE', 'osm_etymology')
}
# Table to query for MVT data, and columns to
# include in the tiles.
TABLE_LOW = {
'table': 'vector_lowlevel',
'srid': '3857',
'geomColumn': 'geom',
'attrColumns': 'array_to_json(osm_ids) AS osm_ids, array_to_json(wikidata) AS wikidata, name, road_types AS highway, array_to_json("name:etymology:wikidata") AS "name:etymology:wikidata"'
}
TABLE_HIGH = {
'table': 'vector_highlevel',
'srid': '3857',
'geomColumn': 'geom',
'attrColumns': 'array_to_json(osm_ids) AS osm_ids, array_to_json(wikidata) AS wikidata, name, road_types AS highway, array_to_json("name:etymology:wikidata") AS "name:etymology:wikidata", wd_label AS "name:etymology:wikidata:name", wd_desc AS "name:etymology:wikidata:description", gender AS "name:etymology:wikidata:gender", ishuman AS "name:etymology:wikidata:human", birth AS "name:etymology:wikidata:birth", death AS "name:etymology:wikidata:death", image AS "name:etymology:wikidata:image", classification AS "name:etymology:wikidata:classification"'
}
database_connection = None
export_path=os.getenv('TILES_EXPORT_PATH', '/tmp/wikidata_tiles/')
compress=True
########################################################################
# Generate a SQL query to pull a tile worth of MVT data
def envelopeToSQL(tile):
if tile['zoom'] < 11:
tbl = TABLE_LOW.copy()
else:
tbl = TABLE_HIGH.copy()
tbl['zoom'] = tile['zoom']
tbl['y'] = tile['y']
tbl['x'] = tile['x']
sql_tmpl = """
WITH
d{table} AS (
SELECT ST_CollectionExtract({geomColumn}, 2) AS d{geomColumn}, * FROM {table} WHERE {geomColumn} && ST_TileEnvelope({zoom}, {x}, {y}, margin => (64.0 / 4096))
UNION ALL
SELECT ST_CollectionExtract({geomColumn}, 3) AS d{geomColumn}, * FROM {table} WHERE {geomColumn} && ST_TileEnvelope({zoom}, {x}, {y}, margin => (64.0 / 4096))
),
mvtgeom AS (
SELECT ST_AsMVTGeom(ST_Transform(t.d{geomColumn}, 3857), ST_TileEnvelope({zoom}, {x}, {y}), extent => 4096, buffer => 64) AS geom,
{attrColumns}
FROM d{table} t
)
SELECT ST_AsMVT(mvtgeom.*, 'wikidata') FROM mvtgeom
"""
return sql_tmpl.format(**tbl)
def dbConnection():
global database_connection
# Make and hold connection to database
if not database_connection:
try:
database_connection = psycopg2.connect(**DATABASE)
except (Exception, psycopg2.Error) as error:
print("cannot connect: %s" % (str(DATABASE)))
return None
# Run tile query SQL and return error on failure conditions
def sqlToPbf(sql):
global database_connection
dbConnection()
# Query for MVT
with database_connection.cursor() as cur:
cur.execute(sql)
if not cur:
print("sql query failed: %s" % (sql))
return None
return cur.fetchone()[0]
return None
def exportTile(tile):
sql = envelopeToSQL(tile)
pbf = sqlToPbf(sql)
pbf_path=export_path+"/"+str(tile['zoom'])+"/"+str(tile['x'])+"/"
tile_name=pbf_path+str(tile['y'])+".pbf"
pathlib.Path(pbf_path).mkdir(parents=True, exist_ok=True)
if compress:
tilesFile = gzip.open(tile_name, "wb")
else:
tilesFile = open(tile_name, "wb")
tilesFile.write(pbf)
tilesFile.close()
def exportTiles(for_zoom, process_pos):
global database_connection
sql = "SELECT zoom, x, y FROM import_updated_zyx WHERE zoom="+str(for_zoom)+" ORDER by zoom,x,y"
tqdm_text = "Level {}".format(for_zoom).zfill(2)
dbConnection()
with database_connection.cursor() as cur, database_connection.cursor() as cur_del:
cur.execute(sql)
if not cur:
print("sql query failed: %s" % (sql))
return None
rowcount = cur.rowcount
if rowcount < 1:
print("No tiles for export.")
return
row = cur.fetchone()
last_zoom=row[0]
pbar = tqdm(total=rowcount, desc=tqdm_text, position=(process_pos*2)+1, unit='tiles')
for i in range(rowcount):
# do something with row
if last_zoom != row[0]:
print("Zoom Level "+str(last_zoom)+" completed. Start exporting zoom level "+ str(row[0]))
tile = {'zoom': row[0], 'x': row[1], 'y': row[2]}
last_zoom = row[0]
# print(str(rowpos) + "/" + str(rowcount) + " " + str(tile))
exportTile(tile)
cur_del.execute("DELETE FROM import_updated_zyx WHERE zoom = %s AND x = %s AND y = %s", (row[0],row[1],row[2]))
database_connection.commit()
row = cur.fetchone()
pbar.update(1)
def main():
from_zoom = 9
to_zoom = 14
num_processes = to_zoom-from_zoom+1
pool = Pool(processes=num_processes, initargs=(RLock(),), initializer=tqdm.set_lock)
jobs = [pool.apply_async(exportTiles, args=(z, z-from_zoom)) for z in range(from_zoom, to_zoom+1)]
pool.close()
result_list = [job.get() for job in jobs]
print ("\n" * (num_processes*2))
if __name__ == "__main__":
main()
<file_sep>SET client_min_messages TO WARNING;
CREATE TABLE IF NOT EXISTS clustered_roads (
osm_ids bigint[],
name varchar(255),
road_types text,
cluster int,
"name:etymology:wikidata" varchar(255)[],
wikidata varchar(255)[],
geom geometry(geometry, 3857)
);
CREATE INDEX IF NOT EXISTS imposm_roads_name_idx ON imposm_roads (name);
DO $$DECLARE r record;
BEGIN
-- clean table
DELETE FROM clustered_roads WHERE name IN (SELECT name FROM import_updated_roadnames);
FOR r IN SELECT name FROM import_updated_roadnames WHERE name != '' AND name IS NOT NULL
LOOP
-- RAISE NOTICE 'Cluster road %', r.name;
EXECUTE 'INSERT INTO clustered_roads
SELECT array_agg(osm_id) AS osm_ids,
name,
string_agg(DISTINCT cluster_tab.highway, '', '') AS road_types,
cluster,
array_agg(DISTINCT cluster_tab."name:etymology:wikidata") FILTER (WHERE cluster_tab."name:etymology:wikidata" is not null) AS "name:etymology:wikidata",
array_agg(DISTINCT cluster_tab.wikidata) FILTER (WHERE cluster_tab.wikidata is not null) AS wikidata,
ST_SetSRID(st_union(cluster_tab.geometry),3857) AS geom
FROM (SELECT nullif(road.wikidata,'''') AS wikidata,
unnest(
case when road."name:etymology:wikidata" = '''' then
''{null}''::varchar(255)[]
else
string_to_array(replace(road."name:etymology:wikidata",''; '','';''),'';'')
end
) AS "name:etymology:wikidata",
osm_id,
name,
highway,
ST_ClusterDBSCAN(geometry, eps := 100, minpoints := 1) over (PARTITION BY name) AS cluster,
geometry
FROM imposm_roads road
WHERE highway NOT IN(''platform'',''A3.0'',''rest_area'',''bus_stop'',''elevator'')) AS cluster_tab
WHERE name=$1
GROUP BY name, cluster' USING r.name;
END LOOP;
-- Delete processed names
TRUNCATE TABLE import_updated_roadnames;
END$$;
CREATE INDEX IF NOT EXISTS clustered_roads_geom_idx ON clustered_roads USING GIST (geom);
<file_sep>#!/bin/bash
# Serve VectorTiles
. ./config
. ./tools/bash_functions.sh
psql -f sql/export_eqs.sql
psql -f sql/vectorTilesViews.sql
python3 vectortiles/server.py
<file_sep>#!/bin/bash
. ./config
. ./tools/bash_functions.sh
wd_cache=./cache_wikidata
invalid_items=./log/invalid_wikidata.log
missing_items=./log/missing_wikidata.log
redirect_items=./log/redirect_wikidata.log
mkdir -p $wd_cache
echo_time "Check tables ..."
psql -f sql/classifyWikidata.sql > /dev/null
psql -f sql/createWikiTables.sql > /dev/null
function db_import() {
wikidata=$1
cat $wd_cache/${wikidata}.json | sed 's/\\/\\\\/g' | psql -c 'COPY wikidata_import FROM STDIN;'
}
function import_run() {
echo_time "Cleanup tables ..."
psql -c "TRUNCATE TABLE wikidata_import;"
# clean missing / redirect
> ${invalid_items}
> ${missing_items}
> ${redirect_items}
echo_time "Search for missing data ..."
count_missing=`psql -t -X --quiet --no-align -c "select count(*) FROM wikidata_needed_import"`
missing_current=0
echo_time "Missing ${count_missing} items"
psql -t -X --quiet --no-align -c "select wikidata FROM wikidata_needed_import ORDER BY wikidata" \
| while read wikidata
do
# Counter
missing_current=$((missing_current+1))
echo_time "Item ${missing_current}/${count_missing}"
# regex check
if ! [[ $wikidata =~ ^Q[0-9]+$ ]]
then
echo_time "'${wikidata}' is not a valid wikidata entity! Skipped."
echo "${wikidata}" >> ${invalid_items}
continue
fi
echo_time "Check ${wikidata} ..."
outfile=$wd_cache/${wikidata}.json
if ! [ -f $outfile ]
then
echo_time "Download ..."
curl -s https://www.wikidata.org/wiki/Special:EntityData/${wikidata}.json -o $outfile
RET=$?
if [ $RET -ne 0 ]
then
rm -f $outfile
fi
fi
if [ -f $outfile ]
then
# Check for: Redirects
file_id=`jq -r '.entities | keys[0]' $outfile 2> /dev/null`
file_id_ret=$?
if [ $file_id_ret -ne 0 ]
then
if grep -q "Not Found" $outfile || grep -q "Invalid ID" $outfile
then
echo_time "Entity $wikidata not found, maybe deleted or invalid"
echo "$wikidata" >> ${missing_items}
fi
echo_time "Reset cache"
rm -f $outfile
continue
fi
if [ "$file_id" != "$wikidata" ]
then
echo_time "Found Redirect $wikidata => $file_id"
echo "$wikidata => $file_id" >> ${redirect_items}
rm -f $outfile
continue
fi
echo_time "Import to db ..."
db_import ${wikidata}
fi
done
echo_time "Finish Import ..."
psql -c "INSERT INTO wikidata (wikidataId, data)
SELECT jsonb_object_keys(data->'entities') AS wikidataId, data->'entities'->jsonb_object_keys(data->'entities') AS data
FROM wikidata_import
ON CONFLICT (wikidataId) DO UPDATE
SET imported=NOW(), data=excluded.data;"
results=`psql -t -X --quiet --no-align -c "SELECT count(*) FROM wikidata_import"`
return $results
}
RET=1
while [ "$RET" -ne 0 ]
do
import_run
RET=$?
if [ "$RET" -ne 0 ]
then
echo_time "Rerun for dependencies"
fi
done
echo_time "Link entities ..."
psql -f sql/classifyFunction.sql > /dev/null
<file_sep>#!/bin/bash
# Export geojson and csv data for the equalstreetnames project
. ./config
. ./tools/bash_functions.sh
export_dir=./export_data/equalstreetnames
mkdir -p $export_dir
if [ $# -ne 1 ]
then
echo_time "Please provide the relation id of a city to export data for"
exit 1
fi
relation_id=$1
if [ ${relation_id:0:1} != '-' ]
then
relation_id="-${relation_id}"
fi
relation_name=`psql -t -A -c "SELECT name FROM imposm_admin WHERE osm_id=${relation_id}" 2>/dev/null`
if [ -z "${relation_name}" ]
then
echo_time "Relation Id ${relation_id} not available."
exit 2
fi
echo_time "Create database functions ..."
psql -f sql/export_eqs.sql > /dev/null
psql -f sql/buildGeojson.sql > /dev/null
echo_time "Export $export_dir/${relation_name}_gender.csv ..."
cat sql/export_eqs_gender.sql | sed -e "s/##RELATION##/${relation_id}/g" | psql -q > $export_dir/${relation_name}_gender.csv
echo_time "Export $export_dir/${relation_name}_all.csv ..."
cat sql/export_eqs_all.sql | sed -e "s/##RELATION##/${relation_id}/g" | psql -q > $export_dir/${relation_name}_all.csv
echo_time "Export $export_dir/${relation_name}_other.csv ..."
cat sql/export_eqs_other.sql | sed -e "s/##RELATION##/${relation_id}/g" | psql -q > $export_dir/${relation_name}_other.csv
echo_time "Export $export_dir/${relation_name}_export.geojson ..."
cat sql/export_eqs_geojson.sql | sed -e "s/##RELATION##/${relation_id}/g" | psql -q | sed 's/\\\\/\\/g' > $export_dir/${relation_name}_export.geojson
<file_sep>SET client_min_messages TO WARNING;
-- table for all wikidata content
CREATE TABLE IF NOT EXISTS wikidata (
wikidataId VARCHAR(128) PRIMARY KEY,
data jsonb,
linked boolean DEFAULT false,
imported timestamp DEFAULT NOW()
);
-- temporary table for importing data
CREATE UNLOGGED TABLE IF NOT EXISTS wikidata_import (
data jsonb
);
-- view of wikidata entities needed by others, for analysis
CREATE OR REPLACE VIEW wikidata_depenencies AS
SELECT jsonb_array_elements(data->'claims'->prop)->'mainsnak'->'datavalue'->'value'->>'id' as wikidataId,
data->>'id' as wikidataSource,
data->'labels'->'en'->>'value' as wikidataSourceName
FROM wikidata
INNER JOIN (SELECT propertyId AS prop FROM wikidata_subcategories) AS props ON TRUE;
-- Views to get all needed wikidata entries / entries still to import
CREATE OR REPLACE VIEW wikidata_needed AS
SELECT DISTINCT unnest(wikidata) AS wikidata FROM clustered_roads WHERE wikidata IS NOT NULL
UNION
SELECT DISTINCT unnest("name:etymology:wikidata") AS wikidata FROM clustered_roads WHERE "name:etymology:wikidata" IS NOT NULL
UNION
SELECT DISTINCT wikidataId AS wikidata FROM wikidata_depenencies WHERE wikidataId IS NOT NULL
UNION
SELECT wikidataId AS wikidata FROM wikidata_classes;
CREATE OR REPLACE VIEW wikidata_needed_import AS
SELECT DISTINCT trim(wikidata)::VARCHAR AS wikidata FROM wikidata_needed WHERE trim(wikidata) NOT IN(SELECT wikidataId FROM wikidata) AND wikidata!='' AND wikidata!='no';
<file_sep>#!/bin/bash
# Use logs from loadWikidata.sh to generate an HTML error report
. ./config
. ./tools/bash_functions.sh
invalid_items=./log/invalid_wikidata.log
missing_items=./log/missing_wikidata.log
redirect_items=./log/redirect_wikidata.log
outfile=./log/errors.html
cnt_invalid=`cat $invalid_items | wc -l`
cnt_missing=`cat $missing_items | wc -l`
cnt_redirect=`cat $redirect_items | wc -l`
function write_header() {
cat > ${outfile} << EOL
<html>
<head>
<title>Wikidata Errors in OSM</title>
</head>
<body>
<h1>Wikidata Errors in OpenStreetMap</h1>
<ul>
<li>Report generated: $(date)</li>
<li>Invalid values: ${cnt_invalid}</li>
<li>Missing Items: ${cnt_missing}</li>
<li>Redirects: ${cnt_redirect}</li>
</ul>
EOL
}
function write_footer() {
cat >> ${outfile} << EOL
</body>
</html>
EOL
}
function get_osm() {
psql -t -X --quiet --no-align -c "SELECT array_to_string(osm_ids,','),name FROM clustered_roads WHERE '$1'=any(wikidata) OR '$1'=any(\"name:etymology:wikidata\")" \
| while read line
do
elements=${line%\|*}
name=${line#*\|}
echo "<h4>${name}</h4><ul>" >> ${outfile}
josm_list=
IFS=',' read -ra elemA <<< $elements
for elem in "${elemA[@]}"
do
if [ ${elem} -lt 0 ]
then
josm_list="${josm_list},r${elem:1}"
echo "<li><a href=\"https://www.openstreetmap.org/relation/${elem:1}\">Relation ${elem:1}</a></li>" >> ${outfile}
else
josm_list="${josm_list},w${elem}"
echo "<li><a href=\"https://www.openstreetmap.org/way/${elem}\">Way ${elem}</a></li>" >> ${outfile}
fi
done
echo "</ul>" >> ${outfile}
echo "<a href=\"http://localhost:8111/load_object?objects=${josm_list:1}&relation_members=true\">Edit in JOSM</a>" >> ${outfile}
done
}
function get_wikidata() {
psql -t -X --quiet --no-align -c "SELECT wikidataSource, wikidataSourceName FROM wikidata_depenencies WHERE wikidataId='$1'" \
| while read line
do
wikidataId=${line%\|*}
name=${line#*\|}
echo "<h4>wikidata: ${name} (<a href=\"https://wikidata.org/wiki/${wikidataId}\">${wikidataId}</a>)</h4>" >> ${outfile}
done
}
function invalid_report() {
if [ "${cnt_invalid}" -lt 1 ]
then
echo_time "Skip Invalid"
return
fi
echo_time "Report Invalid"
echo "<h2>Invalid Values</h2>" >> ${outfile}
while read line
do
echo "<h3>Value: $line</h3>" >> ${outfile}
get_osm ${line}
# to show progress
echo -n "."
done < ${invalid_items}
# to show progress
echo
}
function redirect_report() {
if [ "${cnt_redirect}" -lt 1 ]
then
echo_time "Skip Redirects"
return
fi
echo_time "Report Redirects"
echo "<h2>Redirected Values</h2>" >> ${outfile}
while read line
do
from=`echo ${line} | cut -f1 -d" "`
to=`echo ${line} | cut -f3 -d" "`
echo "<h3>Value: <a href=\"https://wikidata.org/wiki/$from\">$from</a> (redirects to <a href=\"https://wikidata.org/wiki/$to\">$to</a>)</h3>" >> ${outfile}
get_osm ${from}
get_wikidata ${from}
# to show progress
echo -n "."
done < ${redirect_items}
# to show progress
echo
}
function missing_report() {
if [ "${cnt_missing}" -lt 1 ]
then
echo_time "Skip Missing"
return
fi
echo_time "Report Missing"
echo "<h2>Missing Values</h2>" >> ${outfile}
while read line
do
echo "<h3>Value: <a href=\"https://wikidata.org/wiki/$line\">$line</a></h3>" >> ${outfile}
get_osm ${line}
# to show progress
echo -n "."
done < ${missing_items}
# to show progress
echo
}
echo_time "Start generating report with errors"
write_header
invalid_report
redirect_report
missing_report
write_footer
echo_time "Finished"
echo_time "Open $outfile"
<file_sep># This file contains functions for bash scripts
# function for output with timestamp
function echo_time() {
echo '[' `date` ']' "$1"
if [ ! -z "$logfile_name" ]
then
echo '[' `date` ']' "$1" >> log/$logfile_name
fi
}
<file_sep>SET client_min_messages TO WARNING;
-- Functions and views to build an export geojson
-- View for build equalstreetnames feature
CREATE OR REPLACE VIEW eqs_features AS
SELECT name,
road_types,
wikidata,
"name:etymology:wikidata",
eqsGetGender("name:etymology:wikidata") AS gender,
eqsIsHuman("name:etymology:wikidata") AS person,
wd.data->'labels' AS labels,
wd.data->'sitelinks' AS sitelinks,
wd.data->'descriptions' AS descriptions,
eqsGetBirth("name:etymology:wikidata") AS birth,
eqsGetDeath("name:etymology:wikidata") AS death,
eqsGetImage("name:etymology:wikidata") AS image,
geom AS "geometry",
json_build_object(
'type', 'Feature',
'id', osm_ids[1],
'properties',
json_build_object(
'name', name,
'wikidata', wikidata[1], -- TODO: how to output multiple ids?
'gender', eqsGetGender("name:etymology:wikidata"),
'details', json_build_object(
'wikidata', "name:etymology:wikidata"[1], -- TODO: how to output multiple ids?
'person', eqsIsHuman("name:etymology:wikidata"),
'labels', wd.data->'labels',
'descriptions', wd.data->'descriptions',
-- nicknames TODO
'gender', eqsGetGender("name:etymology:wikidata"),
'birth', eqsGetBirth("name:etymology:wikidata"),
'death', eqsGetDeath("name:etymology:wikidata"),
'image', eqsGetImage("name:etymology:wikidata"),
'sitelinks', wd.data->'sitelinks'
)
),
'geometry', ST_AsGeoJSON(ST_Transform(geom,4326))::json
) AS json_feature
FROM clustered_roads
LEFT JOIN wikidata AS wd ON wd.wikidataId=any("name:etymology:wikidata");
<file_sep>SET client_min_messages TO WARNING;
-- CSV for gender.csv
COPY (SELECT name, eqsGetGender("name:etymology:wikidata") AS gender, array_to_string("name:etymology:wikidata",',') AS wikidata, 'way' AS "type"
FROM clustered_roads
WHERE st_within(geom,(SELECT geometry FROM imposm_admin WHERE osm_id=##RELATION##))
AND eqsGetGender("name:etymology:wikidata") IS NOT NULL
AND eqsIsHuman("name:etymology:wikidata")
ORDER BY name) TO STDOUT CSV header;
<file_sep>DELETE FROM import_updated_geoms WHERE bbox IS NULL;
WITH del AS (
DELETE FROM import_updated_geoms RETURNING *
)
SELECT add_zxy(bbox_to_zxy(9, 9, bbox)),
add_zxy(bbox_to_zxy(10, 10, bbox)),
add_zxy(bbox_to_zxy(11, 11, bbox)),
add_zxy(bbox_to_zxy(12, 12, bbox)),
add_zxy(bbox_to_zxy(13, 13, bbox)),
add_zxy(bbox_to_zxy(14, 14, bbox)) FROM del;
<file_sep>SET client_min_messages TO WARNING;
-- CSV for other.csv
COPY (SELECT name, eqsGetGender("name:etymology:wikidata") AS gender, array_to_string("name:etymology:wikidata",',') AS wikidata, 'way' AS "type"
FROM clustered_roads
WHERE st_within(geom,(SELECT geometry FROM imposm_admin WHERE osm_id=##RELATION##))
AND eqsGetGender("name:etymology:wikidata") IS NULL
ORDER BY array_to_string("name:etymology:wikidata",','), name) TO STDOUT CSV HEADER;
<file_sep>#!/bin/bash
. ./config
. ./tools/bash_functions.sh
# Add views and functions
psql -f sql/export_eqs.sql > /dev/null
psql -f sql/vectorTilesViews.sql > /dev/null
echo_time "Calculate needed Tiles ..."
psql -f sql/calculateTiles.sql > /dev/null
echo_time "Start Export ..."
python3 vectortiles/export.py
echo_time "Export Completed."
<file_sep>#!/bin/bash
. ./config
. ./tools/bash_functions.sh
# delete old cache and diff
rm -rf $cache_dir $diff_dir
# Creating necessary directories
mkdir -p tmp # for storing last update files
mkdir -p log # logging
# define logfile name
logfile_name=import_`date '+%Y-%m-%d_%H-%M-%S'`.log
# put tools/ first in PATH for custom binaries
export PATH=`pwd`/tools/:$PATH
# delete old data
echo_time "Delete old data ..."
rm -f tmp/*
# import data into database
echo_time "Import data from $IMPORT_PBF ..."
imposm import -config config.json -read $IMPORT_PBF -write -connection "postgis://${PGUSER}:${PGPASSWORD}@${PGHOST}:${PGPORT}/${PGDATABASE}?prefix=imposm_" \
-diff -cachedir $cache_dir -diffdir $diff_dir -dbschema-import imposm3 -deployproduction
# catch exit code of imposm
RESULT=$?
if [ $RESULT -ne 0 ]
then
echo_time "imposm3 exits with error code $RESULT."
exit 1
fi
echo_time "Create basic tables and update triggers"
psql -f sql/createBasicTables.sql > /dev/null
echo_time "Fill table with clustered roads"
psql -f sql/updateClusteredRoads.sql > /dev/null
echo_time "Initial import finished."
| 476cb1ff5a9f54c5619e78772bfa523a96565f79 | [
"SQL",
"Python",
"Shell"
]
| 20 | SQL | britiger/osm_wikidata | 01878034a265b9a84d523465f5651dcd19ff44e6 | fc0cdf4f151da73b621c282e1dd6c7fb5f8c1412 |
refs/heads/main | <file_sep>#!/usr/bin/env bash
echo '{:a 1}' | lein jet --from edn --to json
<file_sep>#!/usr/bin/env bash
curl https://raw.githubusercontent.com/technomancy/leiningen/2.9.1/bin/lein > lein
sudo mkdir -p /usr/local/bin/
sudo mv lein /usr/local/bin/lein
sudo chmod a+x /usr/local/bin/lein
<file_sep>#!/usr/bin/env bash
err=0
function _trap_error() {
local exit_code="$1"
if [ "$exit_code" -ne 2 ] && [ "$exit_code" -ne 3 ]; then
echo "EXIT CODE :( $exit_code"
(( err |= "$exit_code" ))
fi
}
trap '_trap_error $?' ERR
trap 'exit $err' SIGINT SIGTERM
rm -rf performance.txt
echo -e "==== Build initial cache" | tee -a performance.txt
cp="$(clojure -R:cljs -Spath)"
read -r -d '' config <<'EOF' || true
{:linters
{:not-a-function
{:skip-args [clojure.pprint/defdirectives
cljs.pprint/defdirectives
clojure.data.json/codepoint-case]}}}
EOF
(time ./clj-kondo --lint "$cp" --cache --config "$config") 2>&1 | tee -a performance.txt
echo -e "\n==== Lint a single file (emulate in-editor usage)" | tee -a performance.txt
(time ./clj-kondo --lint src/clj_kondo/impl/core.clj --cache) 2>&1 | tee -a performance.txt
count=$(find . -name "*.clj*" -type f | wc -l | tr -d ' ')
echo -e "\n==== Launch clj-kondo for each file in project ($count)" | tee -a performance.txt
(time find src -name "*.clj*" -type f -exec ./clj-kondo --lint {} --cache \; ) 2>&1 | tee -a performance.txt
exit "$err"
<file_sep>#!/usr/bin/env bash
rm -rf /tmp/release
mkdir -p /tmp/release
cp pod-babashka-aws /tmp/release
VERSION=$(cat resources/POD_BABASHKA_AWS_VERSION)
cd /tmp/release
## release binary as zip archive
zip "pod-babashka-aws-$VERSION-$APP_PLATFORM-amd64.zip" pod-babashka-aws
## cleanup
rm pod-babashka-aws
<file_sep>#!/usr/bin/env bash
set -eo pipefail
if [ -z "$GRAALVM_HOME" ]; then
echo "Please set \$GRAALVM_HOME"
exit 1
fi
clojure -X:native:uberjar
"$GRAALVM_HOME/bin/gu" install native-image
"$GRAALVM_HOME/bin/native-image" \
-cp "pod-babashka-aws.jar" \
-H:Name=pod-babashka-aws \
-H:+ReportExceptionStackTraces \
--initialize-at-build-time \
-H:EnableURLProtocols=http,https,jar \
"--enable-all-security-services" \
--report-unsupported-elements-at-runtime \
"-H:ReflectionConfigurationFiles=reflection.json" \
"-H:ResourceConfigurationFiles=resources.json" \
--verbose \
--no-fallback \
--no-server \
"-J-Xmx3g" \
pod.babashka.aws
<file_sep>#!/usr/bin/env bash
image_name="borkdude/clj-kondo"
image_tag=$(cat resources/CLJ_KONDO_VERSION)
latest_tag="latest"
if [[ $image_tag =~ SNAPSHOT$ ]]; then
echo "This is a snapshot version"
snapshot="true"
else
echo "This is a non-snapshot version"
snapshot="false"
fi
if [ -z "$CIRCLE_PULL_REQUEST" ] && [ "$CIRCLE_BRANCH" = "master" ]; then
echo "Building Docker image $image_name:$image_tag"
echo "$DOCKERHUB_PASS" | docker login -u "$DOCKERHUB_USER" --password-stdin
docker build -t "$image_name" .
docker tag "$image_name:$latest_tag" "$image_name:$image_tag"
# we only update latest when it's not a SNAPSHOT version
if [ "false" = "$snapshot" ]; then
echo "Pushing image $image_name:$latest_tag"
docker push "$image_name:$latest_tag"
fi
# we update the version tag, even if it's a SNAPSHOT version
echo "Pushing image $image_name:$image_tag"
docker push "$image_name:$image_tag"
else
echo "Not publishing Docker image"
fi
exit 0;
<file_sep>#!/usr/bin/env bash
err=0
function _trap_error() {
local exit_code="$1"
if [ "$exit_code" -ne 2 ] && [ "$exit_code" -ne 3 ]; then
(( err |= "$exit_code" ))
fi
}
trap '_trap_error $?' ERR
trap 'exit $err' SIGINT SIGTERM
# Run as local root dependency
rm -rf /tmp/proj
mkdir -p /tmp/proj
cd /tmp/proj
clojure -Sdeps '{:deps {clj-kondo {:local/root "/home/circleci/repo"}}}' \
-m clj-kondo.main --lint /home/circleci/repo/src /home/circleci/repo/test
# Run as git dependency
rm -rf /tmp/proj
mkdir -p /tmp/proj
cd /tmp/proj
github_user=${CIRCLE_PR_USERNAME:-borkdude}
clojure -Sdeps "{:deps {clj-kondo {:git/url \"https://github.com/$github_user/clj-kondo\" :sha \"$CIRCLE_SHA1\"}}}" \
-m clj-kondo.main --lint /home/circleci/repo/src /home/circleci/repo/test
exit "$err"
<file_sep>#!/usr/bin/env bash
if [[ "$APP_TEST_ENV" == "native" ]]
then
bb test/script.clj
unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY
unset AWS_REGION
bb test/credentials_test.clj
else
clojure -M test/script.clj
unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY
unset AWS_REGION
clojure -M test/credentials_test.clj
fi
<file_sep>#!/usr/bin/env bash
if [ -z "$CIRCLE_PULL_REQUEST" ] && [ "$CIRCLE_BRANCH" = "master" ]
then
lein deploy clojars
fi
exit 0;
<file_sep># Changelog
## v0.0.4
- Bump deps (see `deps.edn`).
- Add forwarding of `aws.sessionToken` system property [(@digash)](https://github.com/digash)
## v0.0.3
- Add `pod.babashka.aws.credentials` namespace [(@jeroenvandijk)](https://github.com/jeroenvandijk)
## v0.0.2
- Support passing files as blob uploads directly for better memory consumption [#18](https://github.com/babashka/pod-babashka-aws/issues/18)
## v0.0.1
Initial release
<file_sep># pod-babashka-aws
A [babashka](https://github.com/babashka/babashka)
[pod](https://github.com/babashka/pods) wrapping the
[aws-api](https://github.com/cognitect-labs/aws-api) library.
## Status
Experimental, awaiting your feedback.
## API
The namespaces and functions in this pod reflect those in the official
[aws-api](https://github.com/cognitect-labs/aws-api) library.
Available namespaces and functions:
- `pod.babashka.aws`: `client`, `doc`, `invoke`
- `pod.babashka.aws.config`: `parse`
- `pod.babashka.aws.credentials`: `credential-process-credentials-provider`,
`basic-credentials-provider`, `default-credentials-provider`,
`fetch`, `profile-credentials-provider`, `system-property-credentials-provider`
## Example
``` clojure
#!/usr/bin/env bb
(require '[babashka.pods :as pods])
(pods/load-pod 'org.babashka/aws "0.0.5")
(require '[pod.babashka.aws :as aws])
(def region "eu-central-1")
(def s3-client
(aws/client {:api :s3 :region region}))
(aws/doc s3-client :ListBuckets)
(aws/invoke s3-client {:op :ListBuckets})
(aws/invoke s3-client
{:op :CreateBucket
:request {:Bucket "pod-babashka-aws"
:CreateBucketConfiguration {:LocationConstraint region}}})
(require '[clojure.java.io :as io])
(aws/invoke s3-client
{:op :PutObject
:request {:Bucket "pod-babashka-aws"
:Key "logo.png"
:Body (io/file "resources" "babashka.png")}})
```
See [test/script.clj](test/script.clj) for an example script.
## Differences with aws-api
- Credentials: custom flows are supported, but not by extending CredentialsProvider interface. See <a href="#credentials">Credentials</a> for options.
- This pod doesn't require adding dependencies for each AWS service.
- Async might be added in a later version.
- For uploading (big) files (e.g. to S3), it is better for memory consumption to
pass a `java.io.File` directly, rather than an input-stream.
## Credentials
By default, credentials are resolved the same way as in Cognitect's
[`aws-api`](https://github.com/cognitect-labs/aws-api#credentials): the
client implicitly looks up credentials the same way the [Java SDK
does](https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html).
To provide credentials explicitly, you can pass a `credentials-provider` to the
client constructor fn, e.g.:
```clojure
(require '[pod.babashka.aws :as aws])
(require '[pod.babashka.aws.credentials :as credentials])
(def s3 (aws/client {:api :s3
:credentials-provider (credentials/basic-credentials-provider
{:access-key-id "ABC"
:secret-access-key "XYZ"})}))
```
In contrast to the `aws-api` library this pod does not support extending the
CredentialsProvider interface. For more complex flows, e.g. temporary
credentials obtained by `AWS SSO`, one can use a `credential_process` entry in a
`~/.aws/credentials` file, as documented [here](https://docs.aws.amazon.com/credref/latest/refdocs/setting-global-credential_process.html):
```clojure
(def s3 (aws/client {:api :s3
:credentials-provider (credentials/credential-process-credentials-provider
"custom-profile")}))
```
where `~/.aws/credentials` could look like
```
[custom-profile]
credential_process = bb fetch_sso_credentials.clj
```
The `credential_process` entry can be any program that prints the expected JSON data:
```clojure
#!/usr/bin/env bb
(println "{\"AccessKeyId\":\"***\",\"SecretAccessKey\":\"***\",\"SessionToken\":\"***\",\"Expiration\":\"2020-01-00T00:00:00Z\",\"Version\":1}")
```
## Build
Run `script/compile`. This requires `GRAALVM_HOME` to be set.
## Update aws-api deps
Run `script/update-deps.clj`.
## Test
Run `script/test`. This will run both the pod and tests (defined in
`test/script.clj`) in two separate JVMs.
To test the native-image together with babashka, run the tests while setting
`APP_TEST_ENV` to `native`:
``` shell
APP_TEST_ENV=native script/test
```
To test with [localstack](https://github.com/localstack/localstack):
``` shell
# Start localstack
docker-compose up -d
# Set test credentials and run tests
AWS_REGION=eu-north-1 AWS_ACCESS_KEY_ID=test AWS_SECRET_ACCESS_KEY=test script/test
```
When adding new services to tests you need to add the service name into `SERVICES` environment variable in `docker-compose.yml` and `.circleci/config.yml`. See [localstack docs](https://github.com/localstack/localstack#configurations) for a list of available services.
## License
Copyright © 2020 <NAME>, <NAME>, <NAME> and <NAME>.
Distributed under the Apache License 2.0. See LICENSE.
<repo_name>morihara-y/binary-only-sample<file_sep>/src/github.com/po3rin/hello/hello.go
// package hello is sample of binary-only package
//go:binary-only-package
package hello
<file_sep>/README.md
# Golang Binary-Only Package
Binary-Only Packages were introduced in Go 1.7. A Binary-Only Package makes it possible to distribute a package in binary form, without the source code used to compile it. This repository is the result of trying the whole flow: building a package as a binary, zipping it up for distribution, and then using it. There was almost no explanation of this available in Japanese, so this is a summary of the process.
## How to create one
First, create hello.go at the repository root. This code itself is never distributed; it is only used at build time.
```go
package hello
import "fmt"
// Hello - greet
func Hello(name string) string {
return fmt.Sprintf("Hello, %s!", name)
}
```
Creating a Binary-Only package requires two files: a source file carrying a special comment, and the binary package itself.
The special comment is mentioned in the documentation:
> the package must be distributed with a source file not excluded by build constraints and containing a "//go:binary-only-package" comment.
https://tip.golang.org/pkg/go/build/#hdr-Binary_Only_Packages
In other words, you leave a `//go:binary-only-package` comment in the source code. For this example I created `src/github.com/po3rin/hello/hello.go` as shown below. Replace the GitHub account name (the `po3rin` part) with your own as needed.
```go
// package hello is sample of binary-only package
//go:binary-only-package
package hello
```
Then build the binary package, placing it in `pkg/darwin_amd64/github.com/po3rin/`:
```bash
$ go build -o pkg/darwin_amd64/github.com/po3rin/hello.a -x
```
Then bundle `src` and `pkg` into a zip:
```bash
$ zip -r hello.zip src/* pkg/*
```
That completes the Binary-Only package. The final layout looks like this:
```bash
.
├── README.md
├── hello.go
├── hello.zip
├── pkg
│ └── darwin_amd64
│ └── github.com
│ └── po3rin
│ └── hello.a
└── src
└── github.com
└── po3rin
└── hello
└── hello.go
```
Now you only need to distribute hello.zip for it to work as a package.
## Using the Binary-Only Package
To use this binary package, just extract it into your GOPATH:
```bash
$ unzip hello.zip -d $GOPATH/
```
Next, write some code that actually uses the package:
```go
package main
import (
"fmt"
"github.com/po3rin/hello"
)
func main() {
greet := hello.Hello("taro")
fmt.Println(greet)
}
```
Then confirm it works:
```bash
go run main.go
Hello, taro!
```
## Checking whether an arbitrary package is a Binary-Only Package
The following code checks whether a package unzipped under GOPATH is a Binary-Only Package.
```go
package main
import (
"fmt"
"go/build"
"os"
)
func main() {
dir, err := os.Getwd()
if err != nil {
fmt.Println("error: Getwd methods")
os.Exit(1)
}
path := "github.com/po3rin/hello"
p, err := build.Import(path, dir, build.ImportComment)
if err != nil {
fmt.Println("no target package")
os.Exit(1)
}
if p.BinaryOnly {
fmt.Println("this package is binary only package")
}
}
```
It obtains the package information with `build.Import` from the `go/build` package and checks its `BinaryOnly` field.
This completes the whole flow of creating, using, and verifying a Binary-Only Package.
<file_sep>package pl.inoshi.newyear.listeners;
import org.bukkit.entity.Player;
import org.bukkit.event.EventHandler;
import org.bukkit.event.Listener;
import org.bukkit.event.block.BlockBreakEvent;
public class PlayerBlockDestroyListener implements Listener {
@EventHandler
public void onBlockDestroy(BlockBreakEvent event){
Player player = event.getPlayer();
if (!player.isOp()){
event.setCancelled(true);
}
}
}
<file_sep>package pl.inoshi.newyear.listeners;
import org.bukkit.Bukkit;
import org.bukkit.ChatColor;
import org.bukkit.entity.Player;
import org.bukkit.event.EventHandler;
import org.bukkit.event.Listener;
import org.bukkit.event.player.PlayerCommandPreprocessEvent;
import pl.inoshi.newyear.data.CooldownData;
public class PlayerCommandListener implements Listener {
private final CooldownData cooldownData;
public PlayerCommandListener(CooldownData cooldownData) {
this.cooldownData = cooldownData;
}
@EventHandler
public void onCommand(PlayerCommandPreprocessEvent event){
Player player = event.getPlayer();
String playerName = player.getName();
if (cooldownData.checkContainst(playerName)){
if (cooldownData.getTime(playerName) < System.currentTimeMillis()){
cooldownData.getCooldownMap().remove(playerName);
cooldownData.addToCooldownMap(playerName);
return;
}
} else {
cooldownData.addToCooldownMap(playerName);
return;
}
player.sendMessage(ChatColor.RED + "Musisz poczekac 3 minuty");
event.setCancelled(true);
}
}
<file_sep>package pl.inoshi.newyear.commands;
import net.minecraft.server.v1_8_R3.IChatBaseComponent;
import net.minecraft.server.v1_8_R3.PacketPlayOutTitle;
import org.bukkit.ChatColor;
import org.bukkit.command.Command;
import org.bukkit.command.CommandExecutor;
import org.bukkit.command.CommandSender;
import org.bukkit.craftbukkit.v1_8_R3.entity.CraftPlayer;
import org.bukkit.entity.Player;
import pl.inoshi.newyear.fireworks.Fireworks;
public class FireworksGiveCommand implements CommandExecutor {
private final Fireworks fireworks;
public FireworksGiveCommand(Fireworks fireworks) {
this.fireworks = fireworks;
}
String text = ChatColor.GREEN + "Odebrales zestaw Noworoczny";
IChatBaseComponent chatTitle = IChatBaseComponent.ChatSerializer.a("{\"text\": \"" + text + "\",color:" + ChatColor.GOLD.name().toLowerCase() + "}");
PacketPlayOutTitle title = new PacketPlayOutTitle(PacketPlayOutTitle.EnumTitleAction.TITLE, chatTitle);
PacketPlayOutTitle length = new PacketPlayOutTitle(5, 20, 5);
public boolean onCommand(CommandSender sender, Command command, String s, String[] strings) {
Player player = (Player) sender;
((CraftPlayer) player).getHandle().playerConnection.sendPacket(title);
((CraftPlayer) player).getHandle().playerConnection.sendPacket(length);
fireworks.getFireworksSet().forEach(x ->{
player.getInventory().addItem(x);
});
return true;
}
}
<file_sep>package pl.inoshi.newyear.tasks;
import net.minecraft.server.v1_8_R3.IChatBaseComponent;
import net.minecraft.server.v1_8_R3.PacketPlayOutTitle;
import org.bukkit.Bukkit;
import org.bukkit.ChatColor;
import org.bukkit.craftbukkit.v1_8_R3.entity.CraftPlayer;
import pl.inoshi.newyear.countdowntime.Counter;
public class CounterTask implements Runnable {
private Counter counter;
public CounterTask(Counter counter) {
this.counter = counter;
}
@Override
public void run() {
if (counter.calculate() == 0){
String text = ChatColor.GREEN + "NOWY ROK";
IChatBaseComponent chatTitle = IChatBaseComponent.ChatSerializer.a("{\"text\": \"" + text + "\",color:" + ChatColor.GOLD.name().toLowerCase() + "}");
PacketPlayOutTitle title = new PacketPlayOutTitle(PacketPlayOutTitle.EnumTitleAction.TITLE, chatTitle);
PacketPlayOutTitle length = new PacketPlayOutTitle(5, 100, 5);
Bukkit.getOnlinePlayers().forEach(player ->{
((CraftPlayer) player).getHandle().playerConnection.sendPacket(title);
((CraftPlayer) player).getHandle().playerConnection.sendPacket(length);
});
}
String text = ChatColor.GREEN + "Do nowego roku zostalo: " + counter.calculate() + " sekund";
IChatBaseComponent chatTitle = IChatBaseComponent.ChatSerializer.a("{\"text\": \"" + text + "\",color:" + ChatColor.GOLD.name().toLowerCase() + "}");
PacketPlayOutTitle title = new PacketPlayOutTitle(PacketPlayOutTitle.EnumTitleAction.TITLE, chatTitle);
PacketPlayOutTitle length = new PacketPlayOutTitle(5, 20, 5);
Bukkit.getOnlinePlayers().forEach(player ->{
((CraftPlayer) player).getHandle().playerConnection.sendPacket(title);
((CraftPlayer) player).getHandle().playerConnection.sendPacket(length);
});
}
}
<file_sep>#!/usr/bin/python
# -*- coding: utf-8 -*-
# get reviews from Rakuten ichiba
import time
import json
import csv
import re
from selectorlib import Extractor
from dateutil import parser as dateparser
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException, WebDriverException
# Create an Extractor by reading from the YAML file
e = Extractor.from_yaml_file('rakutenSelectors.yml')
options = webdriver.ChromeOptions()
options.add_argument("start-maximized")
options.add_argument("disable-infobars")
options.add_argument("--disable-extensions")
options.add_experimental_option("excludeSwitches", ["enable-automation"])
driver = webdriver.Chrome(chrome_options=options,
                          executable_path=ChromeDriverManager().install())
# extract html to dict
def getDictComm():
html = driver.page_source.replace('\n', '')
return e.extract(html)
# goto next page
def NextPage():
try:
# wait page load
time.sleep(3)
element = WebDriverWait(
driver, 10).until(EC.presence_of_element_located((By.LINK_TEXT, '次の15件 >>')))
element.click()
print("Navigating to Next Page")
return True
except (TimeoutException, WebDriverException) as e:
print("Last page reached")
return False
# write dict to scv file
def writeToCSV(dictComm, writer):
for (r, rr) in zip(dictComm['reviews'], dictComm['reviewers']):
r["product"] = dictComm["product_title"]
r['rating'] = r['rating'] if r['rating'] else 'N/A'
r['author'] = rr['name']
writer.writerow(r)
# product_data = []
def getRakutenReview():
with open("urls.txt", 'r') as urllist:
for url in urllist.readlines():
driver.get(url)
html = driver.page_source.replace('\n', '')
dictComm = e.extract(html)
if dictComm:
productTittle = re.findall(
r'[^\*"/:?\\|<>]', dictComm["product_title"].replace(' ', '_'), re.S)
csvFileName = "".join(productTittle) + '.csv'
with open('comm/'+csvFileName, 'w', encoding='UTF-8', errors='ignore') as outfile:
writer = csv.DictWriter(outfile, fieldnames=[
"title", "content", "date", "variant", "images", "verified", "author", "rating", "product"], quoting=csv.QUOTE_ALL)
writer.writeheader()
writeToCSV(dictComm, writer)
while True:
if NextPage():
writeToCSV(getDictComm(), writer)
else:
break
driver.close()
if __name__ == '__main__':
getRakutenReview()
<file_sep>#!/usr/bin/python
# -*- coding: utf-8 -*-
import time
import json
import csv
import re
from selectorlib import Extractor
from dateutil import parser as dateparser
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException, WebDriverException
from impReviewProperty import ReviewProperty
class ReviewAPI:
def __init__(self):
self.__reviewProperty = ReviewProperty()
self.__options = webdriver.ChromeOptions()
self.__options.add_argument("start-maximized")
self.__options.add_argument("disable-infobars")
self.__options.add_argument("--disable-extensions")
self.__options.add_experimental_option(
"excludeSwitches", ["enable-automation"])
        self.__driver = webdriver.Chrome(chrome_options=self.__options,
                                         executable_path=ChromeDriverManager().install())
# extract html to dict
# Create an Extractor by reading from the YAML file
def __extractUrl(self):
e = Extractor.from_yaml_file(self.__reviewProperty.getSiteSelector)
html = self.__driver.page_source.replace('\n', '')
self.__reviewProperty.setDictComm(e.extract(html))
# goto next page
def __NextPage(self):
try:
# wait page load
time.sleep(3)
if self.__reviewProperty.getECSite == 'rakuten':
element = WebDriverWait(
self.__driver, 10).until(EC.presence_of_element_located((By.LINK_TEXT, '次の15件 >>')))
element.click()
elif self.__reviewProperty.getECSite == 'amazon':
self.__driver.execute_script("return arguments[0].scrollIntoView(true);", WebDriverWait(
self.__driver, 10).until(EC.element_to_be_clickable((By.XPATH, "//li[@class='a-last']/a"))))
self.__driver.find_element_by_xpath(
"//li[@class='a-last']/a").click()
print("Navigating to Next Page")
return True
except (TimeoutException, WebDriverException) as e:
print("Last page reached")
return False
# write dict to scv file
def __writeToCSV(self, writer):
try:
if self.__reviewProperty.getECSite == 'rakuten':
for (r, rr) in zip(self.__reviewProperty.getDictComm['reviews'], self.__reviewProperty.getDictComm['reviewers']):
r["product"] = self.__reviewProperty.getDictComm["product_title"]
r['rating'] = r['rating'].split('つ')[0] if r['rating'] else 'N/A'
r['author'] = rr['name']
writer.writerow(r)
elif self.__reviewProperty.getECSite == 'amazon':
for r in self.__reviewProperty.getDictComm['reviews']:
if '日本' in r['date']:
r["product"] = self.__reviewProperty.getDictComm["product_title"]
r['verified'] = 'Yes' if r['verified'] else 'No'
r['rating'] = r['rating'].split('つ')[0] if r['rating'] else 'N/A'
# r['images'] = " ".join(r['images']) if r['images'] else 'N/A'
# r['date'] = dateparser.parse(r['date'].split('に')[0]).strftime('%d %b %Y')
year = r['date'].split('年')[0]
month = r['date'].split('年')[1].split('月')[0]
day = r['date'].split('年')[1].split('月')[1].split('日')[0]
r['date'] = '-'.join((year, month, day))
writer.writerow(r)
else:
break
except Exception as e:
print('[{1}] Error : {0}'.format(e, ReviewAPI.__writeToCSV.__name__))
# product_data = []
def reviewCrawler(self):
try:
with open("urls.txt", 'r') as urllist:
for url in urllist.readlines():
# check EC site source
if url.find('rakuten') != -1:
self.__reviewProperty.setECSite("rakuten")
elif url.find('amazon') != -1:
self.__reviewProperty.setECSite('amazon')
else:
print('wrong url: ', url, '\n')
continue
self.__driver.get(url)
self.__extractUrl()
if self.__reviewProperty.getDictComm:
productTittle = re.findall(
r'[^\*"/:?\\|<>]', self.__reviewProperty.getDictComm["product_title"].replace(' ', '_'), re.S)
csvFileName = '{0}_{1}.csv'.format(
self.__reviewProperty.getECSite, "".join(productTittle))
try:
with open('comm/'+csvFileName, 'w', encoding='UTF-8', errors='ignore') as outfile:
writer = csv.DictWriter(outfile, fieldnames=["title", "content", "date", "variant", "verified", "author", "rating", "product"], quoting=csv.QUOTE_ALL)
writer.writeheader()
                                self.__writeToCSV(writer)  # write the first page before paginating
while True:
if self.__NextPage():
self.__extractUrl()
self.__writeToCSV(writer)
else:
break
except IOError as ioe:
print('file error:', ioe)
except Exception as e:
print('[{1}] Error : {0}'.format(e, ReviewAPI.reviewCrawler.__name__))
self.__driver.close()
<file_sep>#!/usr/bin/python
# -*- coding: utf-8 -*-
# google search results crawler
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from webdriver_manager.microsoft import EdgeChromiumDriverManager
from selenium.common.exceptions import NoSuchElementException
from bs4 import BeautifulSoup
import os
import urllib.request as urllib2
import socket
import time
import gzip
from io import BytesIO
import re
import random
import types
from dotenv import load_dotenv, find_dotenv
import operator
import sys
# Load config from .env file
try:
load_dotenv(find_dotenv(usecwd=True))
base_url = os.environ.get('BASE_URL')
results_per_page = int(os.environ.get('RESULTS_PER_PAGE'))
except:
print("ERROR: Make sure you have .env file with proper config")
sys.exit(1)
user_agents = list()
# results from the search engine
# basically include url, title,content
class SearchResult:
def __init__(self):
self.url = ''
self.title = ''
self.content = ''
def getURL(self):
return self.url
def setURL(self, url):
self.url = url
def getTitle(self):
return self.title
def setTitle(self, title):
self.title = title
def getContent(self):
return self.content
def setContent(self, content):
self.content = content
def printIt(self, prefix=''):
print('url\t->', self.url, '\n',
'title\t->', self.title, '\n',
'content\t->', self.content)
def writeFile(self, filename):
try:
with open(filename, 'a') as f:
f.write('url:' + self.url + '\n')
f.write('title:' + self.title + '\n')
f.write('content:' + self.content + '\n\n')
except IOError as e:
print('file error:', e)
class GoogleAPI:
def __init__(self):
timeout = 40
socket.setdefaulttimeout(timeout)
def randomSleep(self):
sleeptime = random.randint(60, 120)
time.sleep(sleeptime)
def extractDomain(self, url):
"""Return string
extract the domain of a url
"""
domain = ''
pattern = re.compile(r'http[s]?://([^/]+)/', re.U | re.M)
url_match = pattern.search(url)
if(url_match and url_match.lastindex > 0):
domain = url_match.group(1)
return domain
def extractUrl(self, href):
""" Return a string
extract a url from a link
"""
url = ''
pattern = re.compile(r'(http[s]?://[^&]+)&', re.U | re.M)
url_match = pattern.search(href)
if(url_match and url_match.lastindex > 0):
url = url_match.group(1)
return url
def extractSearchResults(self, html):
"""
Return a list
extract serach results list from downloaded html file
"""
results = list()
soup = BeautifulSoup(html, 'html.parser')
div = soup.find('div', id='main')
if (type(div) == type(None)):
div = soup.find('div', id='center_col')
if (type(div) == type(None)):
div = soup.find('body')
if (type(div) != type(None)):
lis = div.findAll('a')
if(len(lis) > 0):
for link in lis:
if (type(link) == type(None)):
continue
url = link['href']
if url.find(".google") > 6:
continue
if url.find("http") == -1:
continue
url = self.extractUrl(url)
                    if url == '':  # skip links where no URL could be extracted
                        continue
title = link.renderContents().decode('utf-8')
title = re.sub(r'<.+?>', '', title)
result = SearchResult()
result.setURL(url)
result.setTitle(title)
span = link.find('div')
if (type(span) != type(None)):
content = span.renderContents().decode('utf-8')
content = re.sub(r'<.+?>', '', content)
result.setContent(content)
results.append(result)
return results
def search(self, query, lang='jp', num=results_per_page):
"""Return a list of lists
search web
@param query -> query key words
@param lang -> language of search results
@param num -> number of search results to return
"""
search_results = list()
query = urllib2.quote(query)
if(num % results_per_page == 0):
pages = num / results_per_page
else:
pages = num / results_per_page + 1
for p in range(0, int(pages)):
start = p * results_per_page
url = '%s/search?hl=%s&num=%d&start=%s&q=%s' % (
base_url, lang, results_per_page, start, query)
retry = 3
while(retry > 0):
try:
# driver = webdriver.Edge(executable_path=r'.venv\\Scripts\\msedgedriver.exe')
driver = webdriver.Edge(EdgeChromiumDriverManager().install())
driver.get(url)
html = driver.page_source.replace('\n', '')
results = self.extractSearchResults(html)
search_results.extend(results)
driver.close()
break
except urllib2.URLError as e:
print('url error:', e)
self.randomSleep()
retry = retry - 1
continue
except Exception as e:
print('error:', e)
retry = retry - 1
self.randomSleep()
continue
return search_results
def load_user_agent():
with open('./user_agents', 'r') as fp:
for line in fp.readlines():
user_agents.append(line.strip('\n'))
def crawler():
# Load use agent string from file
load_user_agent()
# Create a GoogleAPI instance
api = GoogleAPI()
# set expect search results to be crawled
expect_num = 10
# if no parameters, read query keywords from file
if(len(sys.argv) < 2):
keywords = open('./keywords', 'r', encoding="utf-8")
keyword = keywords.readline()
while(keyword):
results = api.search(keyword, num=expect_num)
for r in results:
r.printIt()
keyword = keywords.readline()
keywords.close()
else:
keyword = sys.argv[1]
results = api.search(keyword, num=expect_num)
for r in results:
r.printIt()
if __name__ == '__main__':
crawler()
<file_sep>from collections import defaultdict
from ginza import *
import spacy
import csv
from urllib import request
from bs4 import BeautifulSoup
import bs4
from sumy.parsers.plaintext import PlaintextParser
from sumy.nlp.tokenizers import Tokenizer
from sumy.summarizers.lex_rank import LexRankSummarizer
nlp = spacy.load("ja_ginza")  # load the GiNZA model
# frames = defaultdict(lambda: 0) # 依存関係の出現頻度を格納
# sentences = set() # 重複文検出用のset
# with open('comm\Fuji_Medical_Equipment_Co.,_Ltd._Foot_Massager_-_FT-100.csv', 'r', encoding='UTF-8', errors='ignore') as cf:
# reader = csv.reader(cf)
# header = next(reader)
# for row in reader:
# if not row: continue
# if row[7]=='N/A':continue
# if (float(row[7]) < 3.0): continue
# doc = nlp(row[1]) # 解析を実行し
# for sent in doc.sents: # 文単位でループ
# if sent.text in sentences: continue # 重複文はスキップ
# sentences.add(sent.text)
# for t in bunsetu_head_tokens(sent): # 文節主辞トークンのうち
# if t.pos_ not in {"ADJ", "VERB"}: continue # 述語以外はスキップ
# v = phrase(lemma_)(t) # 述語とその格要素(主語・目的語相当)の句を集める
# dep_phrases = sub_phrases(t, phrase(lemma_), is_not_stop)
# subj = [phrase for dep, phrase in dep_phrases if dep in {"nsubj"}]
# obj = [phrase for dep,
# phrase in dep_phrases if dep in {"obj", "iobj"}]
# for s in subj:
# for o in obj:
# frames[(s, o, v)] += 1 # 格要素と述語の組み合わせをカウント
# for frame, count in sorted(frames.items(), key=lambda t: -t[1]):
# print(count, *frame, sep="\t") # 出現頻度の高い順に表示
# Set url to the URL of the book to summarize. Below is "Dashi no Torikata" by Kitaoji Rosanjin.
url = 'https://www.aozora.gr.jp/cards/001403/files/49986_37674.html'
html = request.urlopen(url)
soup = BeautifulSoup(html, 'html.parser')
body = soup.select('.main_text')
text = ''
for b in body[0]:
if type(b) == bs4.element.NavigableString:
text += b
continue
    # For ruby-annotated text, use only the kanji and skip the furigana reading.
text += ''.join([e.text for e in b.find_all('rb')])
print(text)
corpus = []
originals = []
doc = nlp(text)
for s in doc.sents:
originals.append(s)
tokens = []
for t in s:
tokens.append(t.lemma_)
corpus.append(' '.join(tokens))
print(len(corpus))
print(len(originals))
# Tokenize the concatenated corpus again with tinysegmenter
parser = PlaintextParser.from_string(''.join(corpus), Tokenizer('japanese'))
summarizer = LexRankSummarizer()
summarizer.stop_words = [' ']  # spaces are recognized as tokens, so exclude them as a stop word
# sentences_count sets the number of sentences in the summary.
summary = summarizer(document=parser.document, sentences_count=3)
# print the corresponding original sentences
for sentence in summary:
print(originals[corpus.index(sentence.__str__())])<file_sep>#!/usr/bin/python
# -*- coding: utf-8 -*-
# summerize commend from amazon
import csv,re
import spacy
from sumy.parsers.plaintext import PlaintextParser
from sumy.nlp.tokenizers import Tokenizer
from sumy.summarizers.lex_rank import LexRankSummarizer
from sumy.summarizers.lsa import LsaSummarizer as Summarizer
from sumy.nlp.stemmers import Stemmer
from sumy.utils import get_stop_words
from janome.analyzer import Analyzer
from janome.charfilter import UnicodeNormalizeCharFilter, RegexReplaceCharFilter
from janome.tokenizer import Tokenizer as JanomeTokenizer # sumyのTokenizerと名前が被るため
from janome.tokenfilter import POSKeepFilter, ExtractAttributeFilter
LANGUAGE = "japanese"
SENTENCES_COUNT = 5
nlp = spacy.load('ja_ginza')
def getText():
text = ''
with open('comm\Fuji_Medical_Equipment_Co.,_Ltd._Foot_Massager_-_FT-100.csv', 'r', encoding='UTF-8', errors='ignore') as cf:
reader = csv.reader(cf)
header = next(reader)
for row in reader:
if not row: continue
if row[7] == 'N/A': continue
if (float(row[7]) < 3.0): continue
# clean_text = row[1].lower()
# clean_text = re.sub(r"\W"," ",clean_text) # removing any non-words characters which include special characters, comma, punctuation
# clean_text = re.sub(r"\s+",' ',clean_text) # removing any extra spaces in middle
# clean_text = re.sub(r"^\s",' ',clean_text) # removing any extra spaces in beginning
# clean_text = re.sub(r"\s$",' ',clean_text) # removing any extra spaces in end
# text += clean_text
# text.append(clean_text)
text += row[1] if row[1][-1] == '。' else row[1]+'。'
return text
def getCorpusByJanome():
corpus = []
sents = getText()
    # replace (), 「」, 、 and 。 with spaces
char_filters = [UnicodeNormalizeCharFilter(
), RegexReplaceCharFilter(r'[(\)「」、。.\s;?!\'``^]', ' ')]
tokenizer = JanomeTokenizer()
    # keep only the base forms of nouns, adjectives, adverbs, and verbs
token_filters = [POSKeepFilter(
['名詞', '形容詞', '副詞', '動詞']), ExtractAttributeFilter('base_form')]
analyzer = Analyzer(char_filters= char_filters,tokenizer= tokenizer,token_filters= token_filters)
corpus = [' '.join(analyzer.analyze(s))+ '。' for s in sents]
# for sents in corpus:
# print(sents)
return corpus
def getCorpusByGinza():
corpus = []
originals = []
text = getText()
doc = nlp(text)
for s in doc.sents:
originals.append(s)
tokens = []
for t in s:
tokens.append(t.lemma_)
corpus.append(' '.join(tokens))
return corpus, originals
def summerizedText():
    # Tokenize the concatenated corpus again with tinysegmenter
corpus, originals = getCorpusByGinza()
# corpus = getCorpusByJanome()
parser = PlaintextParser.from_string(
' '.join(corpus), Tokenizer(LANGUAGE))
# stemmer = Stemmer(LANGUAGE)
# summarizer = Summarizer(stemmer)
summarizer = LexRankSummarizer()
# summarizer.stop_words = get_stop_words(LANGUAGE)
summarizer.stop_words = [" ","あそこ","あっ","あの","あのかた","あの人","あり","あります","ある","あれ","い","いう","います","いる","う","うち","え","お","および","おり","おります","か","かつて","から","が","き","ここ","こちら","こと","この","これ","これら","さ","さらに","し","しかし","する","ず","せ","せる","そこ","そして","その","その他","その後","それ","それぞれ","それで","た","ただし","たち","ため","たり","だ","だっ","だれ","つ","て","で","でき","できる","です","では","でも","と","という","といった","とき","ところ","として","とともに","とも","と共に","どこ","どの","な","ない","なお","なかっ","ながら","なく","なっ","など","なに","なら","なり","なる","なん","に","において","における","について","にて","によって","により","による","に対して","に対する","に関する","の","ので","のみ","は","ば","へ","ほか","ほとんど","ほど","ます","また","または","まで","も","もの","ものの","や","よう","より","ら","られ","られる","れ","れる","を","ん","何","及び","彼","彼女","我々","特に","私","私達","貴方","貴方方"] # スペースも1単語として認識されるため、ストップワードにすることで除外する
    # sentences_count sets the number of sentences in the summary.
return summarizer(document=parser.document, sentences_count=SENTENCES_COUNT), originals,corpus
if __name__ == '__main__':
summaryTexts,originals,corpus = summerizedText()
    # print the corresponding original sentences
for sentence in summaryTexts:
# print(sentence)
print(originals[corpus.index(sentence.__str__())])
<file_sep># data-mining
## Usage
1. Add review-page URLs (one per line) to the `urls.txt` file
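Each URL is routed to the matching EC-site scraper by plain substring checks (see `reviewCrawler` in `eCSiteReviewCrawler.py`). A minimal sketch of that dispatch — `detect_site` is an illustrative name, not a function in this repo:

```python
def detect_site(url: str) -> str:
    # same substring checks used in ReviewAPI.reviewCrawler
    if url.find('rakuten') != -1:
        return 'rakuten'
    if url.find('amazon') != -1:
        return 'amazon'
    return 'unknown'

print(detect_site('https://www.amazon.co.jp/product-reviews/B00XXXXXXX/'))  # prints "amazon"
```

URLs that match neither site are reported as wrong and skipped by the crawler.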
<file_sep>#!/usr/bin/python
# -*- coding: utf-8 -*-
# get commends from Amazon
import time
import json
import csv
import re
from selectorlib import Extractor
from dateutil import parser as dateparser
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException, WebDriverException
# Create an Extractor by reading from the YAML file
e = Extractor.from_yaml_file('selectors.yml')
options = webdriver.ChromeOptions()
options.add_argument("start-maximized")
options.add_argument("disable-infobars")
options.add_argument("--disable-extensions")
options.add_experimental_option("excludeSwitches", ["enable-automation"])
driver = webdriver.Chrome(chrome_options=options,
                          executable_path=ChromeDriverManager().install())
# extract html to dict
def getDictComm():
html = driver.page_source.replace('\n', '')
return e.extract(html)
# goto next page
def NextPage():
try:
# wait page load
time.sleep(3)
driver.execute_script("return arguments[0].scrollIntoView(true);", WebDriverWait(
driver, 10).until(EC.element_to_be_clickable((By.XPATH, "//li[@class='a-last']/a"))))
driver.find_element_by_xpath(
"//li[@class='a-last']/a").click()
print("Navigating to Next Page")
return True
except (TimeoutException, WebDriverException) as e:
print("Last page reached")
return False
# write dict to scv file
def writeToCSV(dictComm, writer):
for r in dictComm['reviews']:
r["product"] = dictComm["product_title"]
r['verified'] = 'Yes' if r['verified'] else 'No'
r['rating'] = r['rating'].split(
' out of')[0] if r['rating'] else 'N/A'
r['images'] = " ".join(
r['images']) if r['images'] else 'N/A'
r['date'] = dateparser.parse(
r['date'].split('on ')[-1]).strftime('%d %b %Y')
writer.writerow(r)
# product_data = []
def getAmazonCom():
with open("urls.txt", 'r') as urllist:
for url in urllist.readlines():
driver.get(url)
html = driver.page_source.replace('\n', '')
dictComm = e.extract(html)
if dictComm:
productTitle = re.findall(
r'[^\*"/:?\\|<>]', dictComm["product_title"].replace(' ', '_'), re.S)
csvFileName = "".join(productTitle) + '.csv'
with open('comm/'+csvFileName, 'w', encoding='UTF-8', errors='ignore', newline='') as outfile:
writer = csv.DictWriter(outfile, fieldnames=["title", "content", "date", "variant",
"images", "verified", "author", "rating", "product"], quoting=csv.QUOTE_ALL)
writer.writeheader()
writeToCSV(dictComm, writer)
while True:
if NextPage():
writeToCSV(getDictComm(), writer)
else:
break
driver.close()
if __name__ == '__main__':
getAmazonCom()
<file_sep>
#!/usr/bin/python
# -*- coding: utf-8 -*-
import csv,re,os
import nagisa # library used for Natural Language Processing for japanese
import pykakasi # library for conversion of Kanji into Hiragana, Katakana and Romaji
import heapq # library for implementing priority queues where the queue item with higher weight is given more priority in processing
from jamdict import Jamdict # library for searching the japanese vocabulary
from eCSiteReviewCrawler import ReviewAPI # function for getting EC site review and store data in csv file
class SumAPI:
def __init__(self):
self.__csvFile = ''
self.__csvPath = './comm'
def getText(self):
text=''
with open('{0}/{1}'.format(self.__csvPath,self.__csvFile), 'r', encoding='UTF-8', errors='ignore') as cf:
reader = csv.reader(cf)
header = next(reader)
for row in reader:
if not row: continue
if row[6] == 'N/A': continue
if (float(row[6]) < 3.0): continue
text += row[1] if row[1][-1] == '。' else row[1]+'。'
return re.sub(r"\s+",' ',text) # for removing the extra spaces
def getToken(self):
text = self.getText()
# Pre-Processing the japanese data using regex
clean_text = text.lower() # converts any english word in lower case
clean_text = re.sub(r"\W"," ",clean_text) # removing any non-words characters which include special characters, comma, punctuation
clean_text = re.sub(r"\d"," ",clean_text) # removing any digits
clean_text = re.sub(r"\s+",' ',clean_text) # removing any extra spaces in middle
clean_text = re.sub(r"^\s",' ',clean_text) # removing any extra spaces in beginning
clean_text = re.sub(r"\s$",' ',clean_text) # removing any extra spaces in end
sentences = text.split("。")
jp_tokenised_words = nagisa.extract(clean_text, extract_postags=['英単語','接頭辞','形容詞','名詞','動詞','助動詞','副詞'])
return jp_tokenised_words.words
def getWeightWordFq(self):
tokenised_words = self.getToken()
jp_stopwords = ["あそこ","あっ","あの","あのかた","あの人","あり","あります","ある","あれ","い","いう","います","いる","う","うち","え","お","および","おり","おります","か","かつて","から","が","き","ここ","こちら","こと","この","これ","これら","さ","さらに","し","しかし","する","ず","せ","せる","そこ","そして","その","その他","その後","それ","それぞれ","それで","た","ただし","たち","ため","たり","だ","だっ","だれ","つ","て","で","でき","できる","です","では","でも","と","という","といった","とき","ところ","として","とともに","とも","と共に","どこ","どの","な","ない","なお","なかっ","ながら","なく","なっ","など","なに","なら","なり","なる","なん","に","において","における","について","にて","によって","により","による","に対して","に対する","に関する","の","ので","のみ","は","ば","へ","ほか","ほとんど","ほど","ます","また","または","まで","も","もの","ものの","や","よう","より","ら","られ","られる","れ","れる","を","ん","何","及び","彼","彼女","我々","特に","私","私達","貴方","貴方方"]
# print(tokenised_words)
# Calculate the frequency of each word
word2count = {} # dictionary stores the word as a key and frequency as its value
for word in tokenised_words:
if word not in jp_stopwords: # We dont want to include any stop word
word2count[word] = 1 if word not in word2count.keys() else word2count[word]+1
# Calculate the weighted frequency of each word by dividing the frequency of the word by maximum frequency of word in whole article
max_count = max(word2count.values()) # compute the maximum once, so the divisor stays fixed while values are rescaled
for key in word2count.keys():
word2count[key] = word2count[key]/max_count # Weighted Frequency
return word2count
# Below function , "getSpaceSeperatedJpWords(text)" inserts spaces among words in Japanese sentence by using 'pykakasi' library
def getSpaceSeperatedJpWords(self, text):
wakati = pykakasi.wakati()
conv = wakati.getConverter()
result_with_spaces = conv.do(text)
return result_with_spaces
def getSentScore(self):
word2count = self.getWeightWordFq()
print(word2count)
sentences = self.getText().split("。")
sent2score={} # This dictionary stores each sentence and its score as value
for sentence in sentences: # for each sentence in all sentences
# get each word as a token using "'英単語','接頭辞','形容詞','名詞','動詞','助動詞','副詞'" as list of filters
tokenised_sentence = nagisa.extract(sentence, extract_postags=['英単語','接頭辞','形容詞','名詞','動詞','助動詞','副詞'])
words = tokenised_sentence.words
for word in words: # if each word of all words in that sentence and
if word in word2count.keys(): # if that word is available in "word2count" dictionary
# add its corresponding weighted freqency
sent2score[sentence] = word2count[word] if sentence not in sent2score.keys() else sent2score[sentence] + word2count[word]
print(sent2score)
return heapq.nlargest(5,sent2score,key=sent2score.get) # top 5 sentences are selected having high scores
def reviewSum(self):
# get review from EC site
reviewAPI = ReviewAPI()
reviewAPI.reviewCrawler()
arrFiles = [f for f in os.listdir(self.__csvPath) if os.path.isfile(os.path.join(self.__csvPath, f))]
try:
with open('summary.txt', 'w', encoding='UTF-8', errors='ignore') as sf:
for cf in arrFiles :
self.__csvFile = cf
best_sentences = self.getSentScore()
# join those sentence to form the summary
summary = [''.join(b)+'。' for b in best_sentences]
print(''.join(summary))
# sf.write(cf)
sf.writelines(summary)
except Exception as e:
print('[{1}] Error : {0}'.format(e, SumAPI.reviewSum.__name__))
if __name__ == '__main__':
# init summarize API
sumAPI = SumAPI()
sumAPI.reviewSum()
<file_sep>class ReviewProperty:
def __init__(self):
self.__ecSite = ''
self.__dictComm = dict()
# Set EC site
def setECSite(self, ecSite):
self.__ecSite = ecSite
# Get EC site
@property
def getECSite(self):
return self.__ecSite
# Get selector for EC site
@property
def getSiteSelector(self):
if self.__ecSite == 'rakuten':
return 'rakutenSelectors.yml'
elif self.__ecSite == 'amazon':
return 'amazonSelectors.yml'
# Set dictionary of review from EC site
def setDictComm(self, dictComm):
self.__dictComm = dictComm
# Get dictionary of review from EC site
@property
def getDictComm(self):
return self.__dictComm
| 7975110c5c881289ce0d47c2d5d77f96485fdcaf | [
"Markdown",
"Python"
]
| 9 | Python | jhc-shou/data-mining | 5b33324e2d165b663647a381bd1b193b4811f1da | e14a80a6ada3ac248aa614623bb13e1365f6f845 |
refs/heads/master | <repo_name>joaolucasl/komp<file_sep>/webpack.config.js
const webpack = require('webpack');
const path = require('path');
const config = {
entry: __dirname + '/src/index.js',
devtool: 'source-map',
output: {
path: __dirname + '/lib',
filename: 'komp.js',
library: 'Komp',
},
module: {
loaders: [
{
test: /(\.jsx|\.js)$/,
loader: 'babel',
exclude: /(node_modules)/,
},
{
test: /(\.js)$/,
loader: "eslint-loader",
exclude: /node_modules/
}
]
},
resolve: {
root: path.resolve('./src'),
extensions: ['', '.js']
}
};
module.exports = config;<file_sep>/test/app.js
window.onload = () => {
const Komp = window.Komp
const h = Komp.h
const root = document.querySelector('.app')
Komp.render(h('div', { color: 'red' }, [h('h1', { color: 'blue' }, [h('b', {}, 'OI')])]), root)
}
<file_sep>/src/vdom/diff.js
import DOMNode from '../dom/node'
const diff = (newNode, root /* , oldNode = vNode()*/) => {
const dom = DOMNode(newNode)
root.parentNode.replaceChild(dom, root)
}
export default diff
<file_sep>/src/dom/node.js
const fromVNode = (vNode) => {
if (typeof vNode === 'string') {
return document.createTextNode(vNode)
}
const node = document.createElement(vNode.name)
Object.keys(vNode.attr).forEach(key => node.setAttribute(key, vNode.attr[key]))
const children = vNode.children || []
children.forEach(child => node.appendChild(fromVNode(child)))
return node
}
export default fromVNode
<file_sep>/src/index.js
import render from './render'
import h from './vdom/h'
export {
render,
h,
}
| 4693a6be3b8a8303076128c7e365bacebeb654d1 | [
"JavaScript"
]
| 5 | JavaScript | joaolucasl/komp | b5c7ee21953026439bef4b27656a355dbc14c42f | 0145419dfd64ea02a301061d8d8ec1cbd192deed |
refs/heads/master | <repo_name>evilscientress/PyAT<file_sep>/ltestatus.py
#!/usr/bin/python
from PyAT import PyAT
modem = PyAT('/dev/sierra_internal_AT')
SYMBOL = ''
rssi = modem.get_signal_quality()
if rssi is None:
print ('%s no signal' % SYMBOL)
else:
rssi_range = PyAT.dbm_to_range(rssi)
rssi_range_name = PyAT.SIGNAL_RANGE[rssi_range]
regstatus = modem.get_registration_status()
operator = modem.get_operator()
network_tech = modem.get_network_technology()
print("%(symbol)s %(roaming)s'%(operator_name)s' %(rssi)s %(network_tech)s" % {
'symbol': SYMBOL,
'roaming': 'R! ' if regstatus['stat'] == PyAT.REGISTRATION_REGISTERED_ROAMING else '',
'operator_name': operator['operator'],
'rssi': (str(rssi) + 'dBm'),
'network_tech': network_tech,
})
if rssi is None or rssi_range == 0:
print('#ff0000')
elif rssi_range == 1:
print('#ff7f00')
elif rssi_range == 2:
print('#ffff00')
elif rssi_range == 3:
print('#00ff00')
modem.close()
<file_sep>/PyAT.py
#!/usr/bin/python
import serial
import re
ENABLE_DEBUGGING=False
def DEBUG(*args, **kwargs):
if ENABLE_DEBUGGING:
print(*args, **kwargs)
class AT_Command_Exception(Exception):
pass
class AT_Command_Error(AT_Command_Exception):
pass
class PyAT(object):
def __init__(self, port):
super(PyAT, self).__init__()
self.port = port
self.ser = serial.Serial(self.port, timeout=5)
self.registration_status_mode_set=False
def close(self):
self.ser.close()
def _sendcommand(self, cmd, regex=None):
DEBUG('> %s' % cmd)
self.ser.write((cmd + '\r\n').encode('ascii'))
cmd_resp = None
regex_match_result = None
while True:
resp = self.ser.readline().decode('ascii').strip()
regex_match = re.match(regex, resp) if regex is not None else None
DEBUG('< %s' % resp)
if regex_match is not None:
DEBUG("found regex match")
regex_match_result = regex_match
elif resp == cmd:
DEBUG('found comannd echo')
elif resp.startswith('OK'):
DEBUG('command executed sucessfully')
return regex_match_result
elif resp.startswith('ERROR') or resp.startswith('+CME ERROR:'):
DEBUG('ERROR EXCECUTING COMMAND')
raise AT_Command_Error('Error sending command "%s": "%s"' % (cmd, resp))
elif resp == '':
pass
else:
DEBUG('WARNING Unknowen Response: "%s"' % resp)
@classmethod
def csq_to_dbm(cls, csq_dbm_value):
csq_dbm_value = int(csq_dbm_value)
if csq_dbm_value == 99:
return None
elif 0 <= csq_dbm_value and csq_dbm_value <= 31:
return -113 + 2 * csq_dbm_value
else:
raise ValueError('invalid csq_dbm_value')
SIGNAL_RANGE_MARGINAL = 0
SIGNAL_RANGE_OK = 1
SIGNAL_RANGE_GOOD = 2
SIGNAL_RANGE_EXCELLENT = 3
SIGNAL_RANGE = {
0: 'Marginal',
1: 'OK',
2: 'Good',
3: 'Excellent',
}
@classmethod
def dbm_to_range(cls, dbm):
if dbm is None:
raise ValueError('dbm may not be none')
elif type(dbm) is not int and type(dbm) is not float:
raise TypeError('dbm has invalid type')
elif dbm < -93:
return cls.SIGNAL_RANGE_MARGINAL
elif -93 <= dbm and dbm < -83:
return cls.SIGNAL_RANGE_OK
elif -83 <= dbm and dbm < -73:
return cls.SIGNAL_RANGE_GOOD
elif -73 <= dbm:
return cls.SIGNAL_RANGE_EXCELLENT
else:
raise ValueError('invalid dbm value %d' % dbm)
def get_signal_quality(self):
resp = self._sendcommand('AT+CSQ', r'\+CSQ: (?P<rssi>\d+),(?P<ber>\d+)')
if resp is not None:
return self.csq_to_dbm(resp.group('rssi'))
else:
raise AT_Command_Exception("ERROR no value returned")
REGISTRATION_NOT_REGISTERED = 0
REGISTRATION_REGISTERED_HOME = 1
REGISTRATION_NOT_REGISTERED_SEARCHING = 2
REGISTRATION_REGISTRATION_DENIED = 3
REGISTRATION_UNKNOWN = 4
REGISTRATION_REGISTERED_ROAMING = 5
REGISTRATION_REGISTERED_SMS_HOME = 6
REGISTRATION_REGISTERED_SMS_ROAMING = 7
REGISTRATION_REGISTERED_EMERGENCY = 8
ACCESS_TECHNOLOGY_GSM = 0
ACCESS_TECHNOLOGY_GSM_COMPACT = 1
ACCESS_TECHNOLOGY_UTRAN = 2
ACCESS_TECHNOLOGY_GSM_EGPRS = 3
ACCESS_TECHNOLOGY_UTRAN_HSDPA = 4
ACCESS_TECHNOLOGY_UTRAN_HSUPA = 5
ACCESS_TECHNOLOGY_UTRAN_HSDPA_HSUPA = 6
ACCESS_TECHNOLOGY_E_UTRAN = 7
ACCESS_TECHNOLOGY = {
0: "GSM",
1: "GSM Compact",
2: "Utran",
3: "GSM w/EGPRS",
4: "UTRAN w/HSDPA",
5: "UTRAN w/HSUPA",
6: "UTRAN w/HSDPA and HSUPA",
7: "E-UTRAN"
}
def get_registration_status(self):
if not self.registration_status_mode_set:
self._sendcommand('AT+CREG=2')
self.registration_status_mode_set = True
resp = self._sendcommand('AT+CREG?', r'\+CREG: (?P<type>[0-2]),(?P<stat>[0-8])(?:,"(?P<lac>[0-9a-fA-F]{1,4})","(?P<ci>[0-9a-fA-F]{1,8})"(?:,(?P<act>[0-7]))?)?')
if resp is not None:
return {
'type': int(resp.group('type')),
'stat': int(resp.group('stat')),
'lac': resp.group('lac'),
'ci': resp.group('ci'),
'act': int(resp.group('act')) if resp.group('act') is not None else None,
'act_name': self.ACCESS_TECHNOLOGY[int(resp.group('act'))] if resp.group('act') is not None else None,
}
else:
raise AT_Command_Exception("ERROR no value returned")
OPERATOR_MODE_AUTOMATIC = 0
OPERATOR_MODE_MANUAL = 1
OPERATOR_MODE_DEREGISTER = 2
OPERATOR_MODE_SET_FORMAT_ONLY = 3
OPERATOR_MODE_AUTOMATIC_MANUAL = 4 # try manual, if manual fails fall back to automatic
def get_operator(self):
resp = self._sendcommand('AT+COPS?', r'\+COPS: (?P<mode>[0-4])(?:,(?P<format>[0-2]),"(?P<oper>.{1,16})"(?:,(?P<act>[0-7]))?)?')
if resp is not None:
return {
'mode': int(resp.group('mode')),
'format': int(resp.group('format')) if resp.group('format') is not None else None,
'operator': resp.group('oper'),
'act': int(resp.group('act')) if resp.group('act') is not None else None,
'act_name': self.ACCESS_TECHNOLOGY[int(resp.group('act'))] if resp.group('act') is not None else None,
}
else:
raise AT_Command_Exception("ERROR no value returned")
def set_operator(self, mode=0, operator_format=None, operator=None, act=None):
if mode < 0 or mode > 4:
raise KeyError('mode must be between 0-4')
if operator is not None and operator_format is None:
raise ValueError('error operator_format must be set if operator is set')
if operator_format is not None and operator is None:
if mode == 3:
operator = ""
else:
raise ValueError('error operator must be set if operator_format is set and mode is not 3')
if act is None and (operator_format is None or operator is None):
raise ValueError('act may only be set if operator and format is set too')
cmd = 'AT+COPS=%d' % mode
if mode > 0:
cmd += ',%d,%s' % (operator_format, operator)
if act is not None:
cmd += ',%d' % act
self._sendcommand(cmd)
def get_operator_name(self):
resp = self.get_operator()
return resp['operator']
NETWORK_TECHNOLOGY_INUSE=0
NETWORK_TECHNOLOGY_AVAILABLE=1
NETWORK_TECHNOLOGY_SUPPORTED=2
def get_network_technology(self, mode=0):
resp = self._sendcommand("AT*CNTI=%d" % mode, r'\*CNTI: [0-2],(?P<tech>.*)')
if resp is not None:
return resp.group('tech')
else:
raise AT_Command_Exception("ERROR no value returned")
if __name__ == "__main__":
ENABLE_DEBUGGING=True
modem = PyAT('/dev/sierra_internal_AT')
rssi = modem.get_signal_quality()
print("current rssi %d (%s)" % (rssi, PyAT.SIGNAL_RANGE[PyAT.dbm_to_range(rssi)]))
print("registration status %r" % modem.get_registration_status())
print("get_operator %r" % modem.get_operator())
print("get_operator_name %s" % modem.get_operator_name())
print("get_network_technology %s" % modem.get_network_technology())
modem.close()<file_sep>/README.md
# PyAT
A Python API for AT commands according to 3GPP TS 27.007
## Work In Progress
This API is under heavy development, expect lots of changes
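
## Example

For a quick feel of the API, here is a standalone sketch of the `+CSQ`-to-dBm conversion that `PyAT.csq_to_dbm` performs. The function below is a self-contained copy for illustration only, not the packaged class (which also needs a serial device to talk to a modem):

```python
# Standalone illustration of the +CSQ-to-dBm mapping used by PyAT.csq_to_dbm
# (3GPP TS 27.007: CSQ 0 -> -113 dBm, CSQ 31 -> -51 dBm, 99 -> unknown).
def csq_to_dbm(csq):
    csq = int(csq)
    if csq == 99:
        return None  # 99 means "not known or not detectable"
    if 0 <= csq <= 31:
        return -113 + 2 * csq
    raise ValueError("invalid CSQ value: %d" % csq)

print(csq_to_dbm(20))  # -73 (dBm)
```

The packaged class additionally buckets the dBm value into `Marginal`/`OK`/`Good`/`Excellent` ranges via `PyAT.dbm_to_range`.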
## Reference documentation used
- [ETSI TS 127 007 V10.3.0 (2011-04)](https://www.etsi.org/deliver/etsi_ts/127000_127099/127007/10.03.00_60/ts_127007v100300p.pdf)
## License
MIT License | ce42dfae99c26936c6293d4b5b5a5e7dc4ce5737 | [
"Markdown",
"Python"
]
| 3 | Python | evilscientress/PyAT | 18e76f9ff034c78b7e2b422d0817a25f3edbbdae | 75d28c5d082ddac30c16c2a28c9a2b2bdc9ac7bd |
refs/heads/main | <repo_name>GalaxyGamingBoy/Block-Destroyer-C<file_sep>/SECURITY.md
# Security Policy
## Supported Versions
The following versions of this project are currently being supported with
security updates.
| Version | Supported |
| ------- | ------------------ |
| 0.2.x | :white_check_mark: |
## Reporting a Vulnerability
Go to the issues tab and open an issue with the tag security-vulnerability.
<file_sep>/ball-destroyer.c
#include "raylib.h"
typedef struct{
int width;
int height;
int posX;
int posY;
}player;
typedef struct{
float radius;
int posX;
int posY;
}ball;
void Draw(int pposX, int pposY, int bposX, int bposY, int scoreCount){
//Variables
char score[20];
//Player
player playerBox;
playerBox.width = 20;
playerBox.height = 62;
playerBox.posX = pposX;
playerBox.posY = pposY;
//Ball
ball ballCircle;
ballCircle.radius = 4.56f;
ballCircle.posX = bposX;
ballCircle.posY = bposY;
//Begin Drawing
BeginDrawing();
//Refresh BG
ClearBackground(DARKGRAY);
//Draw the player
DrawRectangle(playerBox.posX, playerBox.posY, playerBox.width, playerBox.height, DARKBLUE);
//Draw Ball
DrawCircle(ballCircle.posX, ballCircle.posY, ballCircle.radius, GOLD);
//Draw Text
sprintf(score, "Score: %d", scoreCount);
DrawText(score, 650, 10, 25, SKYBLUE);
//End Drawing
EndDrawing();
}
int main(void)
{
const int screenWidth = 800;
const int screenHeight = 450;
int scoreCount = 0;
int isOutOfView = 0;
int tempPPos[] = {10, 10, 5};
int tempBPos[] = {10, 10, 3, 1}; // {x, y, speed, direction} 1 = Bottom Right 2 = Bottom Left 3 = Top Right 4 = Top Left
int start = 1;
InitWindow(screenWidth, screenHeight, "Block Destroyer v0.2");
SetTargetFPS(60); // Set our game to run at 60 frames-per-second
// Main game loop
while (!WindowShouldClose())
{
//Player Controls
if (IsKeyDown(KEY_W) || IsKeyDown(KEY_UP)){
tempPPos[1] -= tempPPos[2];
if(tempPPos[1] < 0){
tempPPos[1] = 0;
}else if(tempPPos[1] > 388){
tempPPos[1] = 388;
}
}else if (IsKeyDown(KEY_S) || IsKeyDown(KEY_DOWN)){
tempPPos[1] += tempPPos[2];
if(tempPPos[1] < 0){
tempPPos[1] = 0;
}else if(tempPPos[1] > 388){
tempPPos[1] = 388;
}
}
if (IsKeyDown(KEY_D) || IsKeyDown(KEY_RIGHT)){
tempPPos[0] += tempPPos[2];
// Checking if colliding with edges
if(tempPPos[0] > 780){
tempPPos[0] = 780;
}else if(tempPPos[0] < 0){
tempPPos[0] = 0;
}
}else if (IsKeyDown(KEY_A) || IsKeyDown(KEY_LEFT)){
tempPPos[0] -= tempPPos[2];
// Checking if colliding with edges
if(tempPPos[0] > 780){
tempPPos[0] = 780;
}else if(tempPPos[0] < 0){
tempPPos[0] = 0;
}
}
if(IsKeyDown(KEY_SPACE) || IsKeyDown(KEY_ENTER)){
start = 0;
}
if(start==1){
tempBPos[0] = tempPPos[0];
tempBPos[0] += 35;
tempBPos[1] = tempPPos[1];
tempBPos[1] += 31;
}else if(start == 0){
if(tempBPos[1] > 450 && tempBPos[3] == 1){
tempBPos[3] = 3;
}
if(tempBPos[1] > 450 && tempBPos[3] == 2){
tempBPos[3] = 4;
}
if(tempBPos[1] < 0 && tempBPos[3] == 3){
tempBPos[3] = 2;
}
if(tempBPos[1] < 0 && tempBPos[3] == 4){
tempBPos[3] = 1;
}
if(tempBPos[0] > 780 && tempBPos[3] == 1){
tempBPos[3] = 4;
}
if(tempBPos[0] > 780 && tempBPos[3] == 3){
tempBPos[3] = 2;
}
if(tempBPos[0] == tempPPos[0] && tempBPos[3] == 1){
tempBPos[3] = 3;
}
if(tempBPos[3] == 1){ // 1 = Bottom Right 2 = Bottom Left 3 = Top Right 4 = Top Left
tempBPos[0] += tempBPos[2];
tempBPos[1] += tempBPos[2];
}else if(tempBPos[3] == 2){
tempBPos[0] -= tempBPos[2];
tempBPos[1] += tempBPos[2];
}else if(tempBPos[3] == 3){
tempBPos[0] += tempBPos[2];
tempBPos[1] -= tempBPos[2];
}else if(tempBPos[3] == 4){
tempBPos[0] -= tempBPos[2];
tempBPos[1] -= tempBPos[2];
}
}
if(tempBPos[0] > 0){
isOutOfView = 0;
}
if(tempBPos[0] < 0){
if(isOutOfView == 0){
isOutOfView = 1;
scoreCount--;
}
}
Draw(tempPPos[0], tempPPos[1], tempBPos[0], tempBPos[1], scoreCount);
}
CloseWindow(); // Close window and OpenGL context
return 0;
} | a5699e1ac3c4f17e2a7f255d79af42b79360010c | [
"Markdown",
"C"
]
| 2 | Markdown | GalaxyGamingBoy/Block-Destroyer-C | 92b241a81a236f9e19d21924aa2310c9f3512d1b | f2037b73b4004064250e131ca710979aab17af4e |
refs/heads/master | <repo_name>crtahlin/medplot<file_sep>/man/plotPropPositiveCI.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotPropPositiveCI}
\alias{plotPropPositiveCI}
\title{Plot of confidence intervals for proportions for groups an measurements.}
\usage{
plotPropPositiveCI(data, groupingVar, measurementVar, selectedSymptoms)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/R/PlotSymptomsLogisticResgression.R
tableLogist <- function(data,
measurementVar,
selectedMeasurement,
covariate,
selectedSymptoms,
thresholdValue) {
# subset the data, using only the selected evaluation
data <- data[data[,measurementVar]==selectedMeasurement, ]
# binarize the data
data[, selectedSymptoms] <- ifelse(data[, selectedSymptoms]>thresholdValue, 1, 0)
table <- data.frame("Variable"=selectedSymptoms) # table of printable results - Fixed effect
table2 <- data.frame("Variable"=selectedSymptoms) # table of raw results
table3 <- data.frame("Variable"=selectedSymptoms) # table of printable results - Intercept
# check if covariate is binary and generate text which levels we are comparing
if (determineTypeofVariable(data[,covariate])[["nLevels"]]=="binary") { # binary var
levels <- levels(as.factor(data[,covariate])) # levels of the covariate
oddsFor <- paste(levels[2],"vs",levels[1]) # text describing which variables were compared
}
if (determineTypeofVariable(data[,covariate])[["nLevels"]]=="multilevel" & # numerical var w multi levels
(determineTypeofVariable(data[,covariate])[["type"]]=="integer"
| determineTypeofVariable(data[,covariate])[["type"]]=="numeric")) {
#levels <- levels(as.factor(data[,covariate])) # levels of the covariate
oddsFor <- paste("unit difference in", covariate) # text describing which variables were compared
}
for (symptom in selectedSymptoms) {
model <- glm(data[,symptom] ~ data[,covariate], family=binomial())
table[table["Variable"]==symptom, "Odds ratio"] <-
format(exp(model$coef[2]), digits=2)
table[table["Variable"]==symptom, "95% conf. interval"] <-
paste(format(exp(confint(model)[2,1]), digits=2),
" to ",
format(exp(confint(model)[2,2]), digits=2))
table2[table2["Variable"]==symptom, "OR"] <- exp(model$coef[2])
table2[table2["Variable"]==symptom, "CILower"] <- exp(confint(model)[2,1])
table2[table2["Variable"]==symptom, "CIUpper"] <- exp(confint(model)[2,2])
table[table["Variable"]==symptom, "P value"] <-
format(summary(model)$coefficients[2,"Pr(>|z|)"],digits=2)
# printable intercept table
table3[table3["Variable"]==symptom, "Odds (intercept)"] <-
format(exp(model$coef[1]), digits=2)
table3[table3["Variable"]==symptom, "95% conf. interval"] <-
paste(format(exp(confint(model)[1,1]), digits=2),
" to ",
format(exp(confint(model)[1,2]), digits=2))
table3[table3["Variable"]==symptom, "P value"] <-
format(summary(model)$coefficients[1,"Pr(>|z|)"],digits=2)
}
return(list(printableResultsTable=table,
rawResultsTable=table2,
referenceValue=oddsFor,
printableInterceptTable=table3))
}<file_sep>/R/PlotSymptomsTabDataSummary.R
#' @title Tables summarizing data
#'
#' @description TODO
#'
#' @param TODO
summarizeData <- function(data,
personIDVar,
measurementVar,
selectedSymptoms,
groupingVar) {
#how many different subjects are included
num.obs=nrow(data)
cat(paste("Number of observations included in the data set: ", num.obs, "\n"))
#how many different subjects are included
num.samples.unique=length(unique(data[,personIDVar]))
cat(paste("Number of unique subjects: ", num.samples.unique, "\n"))
my.summary=summary(data)
#how many measurement occasions
num.measurement.occasions=length(unique(data[,measurementVar]))
cat(paste("Number of evaluation occasions: ", num.measurement.occasions, "\n"))
#how many subjects were measured at each measurement occasion, treated as a data frame for better display
num.measurement.per.occasion= as.data.frame(table(data[,measurementVar]))
dimnames(num.measurement.per.occasion)[[2]]=c("Evaluation occasion",
"Number of subjects")
cat("\nNumber of observations per evaluation occasion \n")
#print(as.table(num.measurement.per.occasion))
print(num.measurement.per.occasion)
#how many times was a single subject measured
num.measurement.per.patient=as.data.frame(table(table(data[,personIDVar])) )
dimnames(num.measurement.per.patient)[[2]]=c("Number of evaluation",
"Number of subjects")
cat("\nNumber of evaluations per subject\n")
print(num.measurement.per.patient)
cat("\nNumber of subjects with missing values (by evaluation occasion, for the evaluated subjects) \n")
#gets a table with the number of missing values for each variable, at each occasion
num.missing.values.occ=addmargins((t(apply(data[,selectedSymptoms], 2, function(symptom) {tapply(symptom, INDEX=data[,measurementVar], FUN=function(x) sum(is.na(x)))}))))
print(num.missing.values.occ)
######### summary for the grouping variable
summary.grouping.variable=table(data[,groupingVar])
cat("\nSummary of grouping variable data (all observations) \n")
print(summary.grouping.variable)
#levels of the grouping variable
levels.grouping=levels(as.factor(data[,groupingVar]))
#number of levels
num.levels.grouping=length(levels.grouping)
summary.grouping.variable.occ=tapply(data[,groupingVar], INDEX=data[,measurementVar], FUN=function(x) table(x))
summary.grouping.variable.occ2=tapply(data[,groupingVar], INDEX=data[,measurementVar], FUN=function(x)
{my.res=numeric(num.levels.grouping)
for(i in 1:num.levels.grouping) my.res[i]=sum(x==levels.grouping[i], na.rm=T)
return(my.res)
})
summary.grouping.variable.occ2= matrix(unlist(summary.grouping.variable.occ2), ncol=num.levels.grouping, byrow=TRUE)
dimnames(summary.grouping.variable.occ2)[[2]]=levels.grouping
dimnames(summary.grouping.variable.occ2)[[1]]=sort(unique(data[,measurementVar]))
cat("\nSummary of grouping variable data (per evaluation occasion) \n")
print(addmargins(summary.grouping.variable.occ2))
}<file_sep>/inst/shinyapp_symptoms/server.R
# This is the server part of the shiny script to plot
# graph for displaying presence of symptoms.
# load library for generation interactive web pages
library(shiny)
# load library for generating graph scales
library(scales)
# Main function
shinyServer(function(input, output) {
# subset the data with the selected symptoms
data <- reactive( function() {
symptomsData[symptomsData$variable %in% input$selectedSymptoms,]
})
# plot the graph, but only for selected symptoms
output$plot <- renderPlot(function() {
# if no symbols are selected, do not plot
if (dim(data())[1]>0) {
print(
ggplot(data(), aes(x = Date, y = Patient, size = value, colour = variable)) +
geom_point(shape = 1) + theme_bw() +
# old scale - scale according to diameter
#scale_size(range= c(1,20)) +
# scale according to area
scale_size_area(breaks=c(1:10),minor_breaks=c(1:10),
guide="legend",
limits=c(1,10),
max_size=10) +
scale_x_date(breaks = date_breaks("1 week"),
labels = date_format("%d %b"),
minor_breaks = date_breaks("1 day"))
)}
})
output$message <- renderText(if (dim(data())[1]==0){paste("Please select one or more symptoms.")})
})
# runApp(launch.browser=TRUE)
<file_sep>/R/PlotSymptomsTabDistribution_GroupingVariable.R
#' @title Plot proportions with symptoms
#' @description Function that plots proportion of subjects that have a certain
#' symptom present.
#'
#' @param data Data to be passed to the function as a data frame.
#' @param grouping The column name of the binary grouping variable used to define the groups on the plot.
#' @param measurements The column name of the variable containing measurement occasions.
#' @param symptomsNames A vector of column names containing the symptom severity.
plotPropPositive <- function (data,
grouping="Sex",
measurements="Measurement",
symptomsNames) {
# code will be used without changing variable names
# mapping new names to old names
my.data.expanded.nNOIS <- data
which.var <- grouping
which.symptoms <- symptomsNames # TODO: pass the symptom names as char vector
names.symptoms <- symptomsNames
### EXISTING CODE ####
##################### which of the variables are symptoms
#derive the matrix with the symptom intensity only
my.data.symptoms=my.data.expanded.nNOIS[, which.symptoms, drop=FALSE]
#derive the matrix with the symptom positivity/negativity only
my.data.symptoms.yn=ifelse(my.data.expanded.nNOIS[,which.symptoms, drop=FALSE]>0, 1, 0)
#how many symptoms
num.symptoms=ncol(my.data.symptoms)
################ two bargraphs ######
#select the variable based on which the graphs are drawn separately
#index of the variable in the db
my.var=my.data.expanded.nNOIS[,which.var]
my.levels=sort(unique(my.var))
num.levels=length(my.levels)
#time for each record
my.times.all=my.data.expanded.nNOIS[,measurements]
#unique times
my.times=sort(unique(my.data.expanded.nNOIS[,measurements]))
num.times=length(my.times)
prop.with.symptoms=lapply(1:num.times, function(i) {
tmp=vector("list", num.levels)
for(j in 1:num.levels){
tmp[[j]]=apply(my.data.symptoms.yn[which(my.times.all==my.times[i] &
my.var==my.levels[j]),,drop=FALSE],
2, function(x) mean(x==TRUE, na.rm=TRUE))
}
tmp
})
if(num.levels==2) {
linch <- max(strwidth(names.symptoms, "inch")+.4, na.rm = TRUE)
par(mai=c(1.02, linch,0.82,0.42))
my.order.symptoms=order(prop.with.symptoms[[1]][[1]], decreasing=FALSE)
prop.with.symptoms.1=lapply(prop.with.symptoms, function(x) x[[1]])
prop.with.symptoms.2=lapply(prop.with.symptoms, function(x) x[[2]])
prop.with.symptoms.1=matrix(unlist(prop.with.symptoms.1), nrow=num.times, byrow=TRUE)
prop.with.symptoms.2=matrix(unlist(prop.with.symptoms.2), nrow=num.times, byrow=TRUE)
linch <- max(strwidth(names.symptoms, "inch")+.4, na.rm = TRUE)
par(mai=c(1.02, linch,0.82,0.42))
barplot(prop.with.symptoms.1[num.times:1,my.order.symptoms],
beside=TRUE, hor=TRUE, xlim=c(-1,1),
names.arg=names.symptoms[my.order.symptoms],
las=1, xlab="Proportion of subjects", add=FALSE,
legend.text=c(paste0("T=", my.times[num.times:1])),
args.legend=list(x=par("usr")[2],
y=par("usr")[3], yjust = 0 ),
axes=FALSE)
barplot(-prop.with.symptoms.2[num.times:1,my.order.symptoms],
beside=TRUE, hor=TRUE,
names.arg=names.symptoms[my.order.symptoms],
las=1, xlab="Proportion of subjects", add=TRUE,
axes=FALSE)
abline(v=seq(-1, 1, by=.1), lty=2, col="light gray")
axis(side=1, labels=c(1, 0.5, 0, 0.5, 1), at=c(-1, -0.5, 0, 0.5, 1))
text(x=0, par("usr")[4],
labels=paste(which.var, "=" , my.levels[1]), xpd=T, adj=c(0))
text(x=0, par("usr")[4],
labels=paste(which.var, "=" ,my.levels[2]), xpd=T, adj=c(1))
}# end if
}
#' @title Table with proportions
#'
#' @description TODO
#'
#' @param TODO
tablePropPosGroups <- function (data,
groupingVar="Sex",
measurementVar,
forMeasurement,
symptomsNames,
thresholdValue=0,
doPValueAdjustments) {
# removing entries with missing data for groupingVar
data <- data[!is.na(data[groupingVar]),]
# TODO: create a Groupinga() reactive vector to pass to this and other functions?
groupingLevels <- as.character(unlist(unique(data[groupingVar])))
data <- data[data[measurementVar]==forMeasurement,]
tableData <- data.frame("Variable"=symptomsNames)
column1Name <- paste("Prop. of ", groupingLevels[1])
column2Name <- paste("Positive/all for ", groupingLevels[1])
column3Name <- paste("Prop. of ", groupingLevels[2])
column4Name <- paste("Positive/all for ", groupingLevels[2])
column5Name <- paste("P value")
column6Name <- paste("95% CI for the difference")
if (doPValueAdjustments==TRUE) {
column7Name <- paste("Adj. P value (Holm-Bonferroni)")
column9Name <- paste("Adj. P value (permutations)")
column10Name <- paste("Q-value (Benjamini-Hochberg)")
column8Name <- paste("Q-value (Benjamini-Yekutieli)")
}
group1Data <- data[data[groupingVar]==groupingLevels[1],]
group1Data[, symptomsNames] <- (group1Data[,symptomsNames]>thresholdValue)
group2Data <- data[data[groupingVar]==groupingLevels[2],]
group2Data[, symptomsNames] <- (group2Data[,symptomsNames]>thresholdValue)
for (symptom in symptomsNames) {
group1Positive <- sum(group1Data[,symptom], na.rm=TRUE)
group1Negative <- sum(!group1Data[,symptom], na.rm=TRUE)
group2Positive <- sum(group2Data[,symptom], na.rm=TRUE)
group2Negative <- sum(!group2Data[,symptom], na.rm=TRUE)
testMatrix <- matrix(c(group1Positive, group1Negative,
group2Positive, group2Negative),
byrow=TRUE, ncol=2)
results <- prop.test(testMatrix)
tableData[tableData["Variable"]==symptom, column1Name ] <-
format(results$estimate[1], digits=2)
tableData[tableData["Variable"]==symptom, column2Name ] <-
paste(group1Positive, "/", group1Positive+group1Negative)
tableData[tableData["Variable"]==symptom, column3Name ] <-
format(results$estimate[2], digits=2)
tableData[tableData["Variable"]==symptom, column4Name ] <-
paste(group2Positive, "/", group2Positive+group2Negative)
tableData[tableData["Variable"]==symptom, column5Name ] <-
format(results$p.value, digits=2)
tableData[tableData["Variable"]==symptom, column6Name ] <-
paste(format(results$conf.int[[1]], digits=2),
format(results$conf.int[[2]], digits=2), sep=" to ")
}
if (doPValueAdjustments==TRUE) {
# add a Holm-Bonferroni corrected P value column
tableData[, column7Name] <- format(p.adjust(p=tableData[, column5Name], method="holm"), digits=2)
# add a Benjamini-Hochberg Q-value
tableData[, column10Name] <- format(p.adjust(p=tableData[, column5Name], method="BH"), digits=2)
# add a Benjamini-Yekutieli Q-value - see ?p.adjust
tableData[, column8Name] <- format(p.adjust(p=tableData[, column5Name], method="BY"), digits=2)
# OBSOLETE CODE FOR PERMUTATION CALCULATED P VALUES (WITHOUT CORRECTION)
# for (symptom in symptomsNames) {
# # add a permutation calculated P value
# tableData[tableData["Variable"]==symptom, column9Name ] <-
# format( .pvaluebyPermutation(data=na.omit(data[,c(symptom, groupingVar)]),
# variableName=symptom,
# groupName=groupingVar,
# FUN=.propDif,
# nPermutations=1000,
# thresholdValue=thresholdValue)
# , digits=2)
# }
#
# run a multivariate permutation correction of the P values
permutationCorrectedPValue <- permCorrPValue(data=data,
                                             nPermutations=999,
                                             selectedSymptoms=symptomsNames,
                                             groupingVar=groupingVar,
                                             measurementVar=measurementVar,
                                             measurementOccasion=forMeasurement,
                                             FUN="propPValue", # function that returns a P value
                                             thresholdValue=0
                                             )
for (symptom in symptomsNames) {
# add a permutation-corrected P value
tableData[tableData["Variable"]==symptom, column9Name ] <-
format(permutationCorrectedPValue[symptom], digits=2)
}
}
return(tableData)
}
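The per-symptom comparison above boils down to a two-sample test of proportions with `prop.test()`. A minimal standalone sketch of that core step, using made-up toy counts rather than data from this package:

```r
# Hedged sketch: compare the proportion of positive results between two
# groups with prop.test(), as tablePropPosGroups() does per symptom.
# The counts below are toy values, not real study data.
group1Positive <- 12; group1Negative <- 8   # group 1: 12/20 positive
group2Positive <- 5;  group2Negative <- 15  # group 2: 5/20 positive
testMatrix <- matrix(c(group1Positive, group1Negative,
                       group2Positive, group2Negative),
                     byrow=TRUE, ncol=2)
results <- prop.test(testMatrix)
results$estimate  # sample proportions: 0.60 and 0.25
results$p.value   # chi-squared test P value (with continuity correction)
results$conf.int  # 95% CI for the difference of the two proportions
```

The table function extracts exactly these three components (`estimate`, `p.value`, `conf.int`) and formats them per symptom.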
#' @title Table with medians
#'
#' @description TODO Ignores threshold value.
#'
#' @param TODO
tableMeGroups <- function (data,
groupingVar="Sex",
measurementVar,
forMeasurement,
symptomsNames,
thresholdValue=0,
doPValueAdjustments) {
# removing entries with missing data for groupingVar
data <- data[!is.na(data[groupingVar]),]
# list grouping levels
groupingLevels <- as.character(unlist(unique(data[groupingVar])))
# only use data for a particular occasion
data <- data[data[measurementVar]==forMeasurement,]
# construct table with solutions
tableData <- data.frame("Variable"=symptomsNames)
# name columns of the table with solutions
column1Name <- paste("Median of ", groupingLevels[1])
column2Name <- paste("IQR for ", groupingLevels[1])
column3Name <- paste("Median of ", groupingLevels[2])
column4Name <- paste("IQR for ", groupingLevels[2])
column5Name <- paste("P value")
# column6Name <- paste("Conf. int. for diff. of prop. ")
if (doPValueAdjustments==TRUE) {
column7Name <- paste("Adj. P value (Holm-Bonferroni)")
column9Name <- paste("Adj. P value (permutations)")
column10Name <- paste("Q-value (Benjamini-Hochberg)")
column8Name <- paste("Q-value (Benjamini-Yekutieli)")
}
group1Data <- data[data[groupingVar]==groupingLevels[1],]
# group1Data[, symptomsNames] <- (group1Data[,symptomsNames]>thresholdValue)
group2Data <- data[data[groupingVar]==groupingLevels[2],]
# group2Data[, symptomsNames] <- (group2Data[,symptomsNames]>thresholdValue)
for (symptom in symptomsNames) {
group1Median <- median(group1Data[,symptom], na.rm=TRUE)
group1IQR <- quantile(group1Data[,symptom], c(0.25, 0.75), na.rm=TRUE)
group2Median <- median(group2Data[,symptom], na.rm=TRUE)
group2IQR <- quantile(group2Data[,symptom], c(0.25, 0.75), na.rm=TRUE)
# testMatrix <- matrix(c(group1Positive, group1Negative,
# group2Positive, group2Negative),
# byrow=TRUE, ncol=2)
# results <- prop.test(testMatrix)
#
result <- wilcox.test(x=group1Data[,symptom], y=group2Data[,symptom])
tableData[tableData["Variable"]==symptom, column1Name ] <- format(group1Median, digits=2)
tableData[tableData["Variable"]==symptom, column2Name ] <- paste(group1IQR[1], " to ", group1IQR[2])
tableData[tableData["Variable"]==symptom, column3Name ] <- format(group2Median, digits=2)
tableData[tableData["Variable"]==symptom, column4Name ] <- paste(group2IQR[1], " to ", group2IQR[2])
tableData[tableData["Variable"]==symptom, column5Name ] <- format(result$p.value, digits=2, nsmall=2)
# tableData[tableData["Variable"]==symptom, column6Name ] <-
# paste(format(results$conf.int[[1]], digits=2),
# format(results$conf.int[[2]], digits=2), sep=";")
}
if (doPValueAdjustments==TRUE) {
# add a Holm-Bonferoni corrected P value column
tableData[, column7Name] <- format(p.adjust(p=tableData[, column5Name], method="holm"), digits=2, nsmall=2)
# add a Benjamini-Hochberg Q-value - see ?p.adjust
tableData[, column10Name] <- format(p.adjust(p=tableData[, column5Name], method="BH"), digits=2, nsmall=2)
# add a Benjamini-Yekutieli Q-value - see ?p.adjust
tableData[, column8Name] <- format(p.adjust(p=tableData[, column5Name], method="BY"), digits=2, nsmall=2)
# for (symptom in symptomsNames) {
# tableData[tableData["Variable"]==symptom, column9Name ] <-
# format( .pvaluebyPermutation(data=na.omit(data[,c(symptom, groupingVar)]),
# variableName=symptom,
# groupName=groupingVar,
# FUN=.medianDif,
# nPermutations=1000,
# thresholdValue=thresholdValue)
# , digits=2)
# }
# run a multivariate permutation correction of the P values
permutationCorrectedPValue <- permCorrPValue(data=data,
                                             nPermutations=999,
                                             selectedSymptoms=symptomsNames,
                                             groupingVar=groupingVar,
                                             measurementVar=measurementVar,
                                             measurementOccasion=forMeasurement,
                                             FUN="MannWhitneyPValue", # function that returns a P value
                                             thresholdValue=0
                                             )
for (symptom in symptomsNames) {
# add a permutation-corrected P value
tableData[tableData["Variable"]==symptom, column9Name ] <-
format(permutationCorrectedPValue[symptom], digits=2, nsmall=2)
}
}
return(tableData)
}
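The median table pairs a per-variable Mann-Whitney (Wilcoxon rank-sum) test with multiplicity adjustments from `p.adjust()`. A self-contained sketch of that combination on toy data (variable names and group sizes are illustrative only):

```r
# Hedged sketch: Mann-Whitney tests per variable plus Holm, BH and BY
# adjustments, mirroring tableMeGroups(). Toy simulated data only.
set.seed(1)
group1 <- replicate(3, rnorm(15, mean=0), simplify=FALSE)
group2 <- replicate(3, rnorm(15, mean=1), simplify=FALSE)
rawP <- mapply(function(x, y) wilcox.test(x, y)$p.value, group1, group2)
cbind(raw  = format(rawP, digits=2),
      holm = format(p.adjust(rawP, method="holm"), digits=2),
      BH   = format(p.adjust(rawP, method="BH"), digits=2),
      BY   = format(p.adjust(rawP, method="BY"), digits=2))
```

Note that `p.adjust()` coerces its input with `as.numeric()`, which is why passing already-formatted P value strings, as the table code does, still works.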
#' @title Plot of confidence intervals for proportions for groups and measurements.
#'
#' @description TODO
#' @param TODO
plotPropPositiveCI <- function (data,
groupingVar,
measurementVar,
selectedSymptoms) {
dataWithCIs <- .returnPropCIs(data=data,
groupingVar=groupingVar,
measurementVar=measurementVar,
selectedSymptoms=selectedSymptoms)
# dataTest[dataTest["group"]=="F", c("mean", "lower", "upper")] <-
# - dataTest[dataTest["group"]=="F", c("mean", "lower", "upper")]
# make values of CIs negative for first group
dataWithCIs[dataWithCIs["Group"]==as.character(unique(data[,groupingVar])[1]),
c("Mean","UpperCI","LowerCI")] <-
- dataWithCIs[dataWithCIs["Group"]==as.character(unique(data[,groupingVar])[1])
, c("Mean","UpperCI","LowerCI")]
plot <- ggplot() +
geom_errorbarh(data=dataWithCIs,
mapping=aes(y=Measurement, x=UpperCI, xmin=UpperCI, xmax=LowerCI,
colour=Group),
height=0.2, size=1) +
geom_point(data=dataWithCIs,
mapping=aes(y=Measurement, x=Mean), size=4, shape=21, fill="white") +
facet_grid(Variable~.) +
scale_x_continuous(limits=c(-1,1),
breaks=seq(-1,1,by=0.2),
labels=abs(seq(-1,1,by=0.2)),
minor_breaks=((seq(-1,1, by=0.1)))) +
geom_vline(xintercept=0)+
myTheme() + labs(title="Proportions of positive values (with confidence intervals)",
x= "Proportions")
# +
# opts(title="geom_errorbarh", plot.title=theme_text(size=40, vjust=1.5))
return(plot)
}
#' @title Multivariate permutation test for correcting the P value
#'
#' @description TODO
#'
#' @param TODO
permCorrPValue <- function(data,
nPermutations=999,
selectedSymptoms, # vector of selected variables - defines order
groupingVar,
measurementVar,
measurementOccasion,
FUN, # function that return a P value,
thresholdValue=0 # threshold for positivity
) {
data <- data[data[measurementVar]==measurementOccasion,]
groupingLevels <- as.character(unique(data[,groupingVar]))
# calculate the P value based on the sample
calculatedPValue <- numeric()
# if calculating proportions
if (FUN=="propPValue") {
for (symptom in selectedSymptoms) {
calculatedPValue[symptom] <- .propPValue(
successG1=sum(na.omit(data[data[groupingVar]==groupingLevels[1],symptom]) > thresholdValue),
failureG1=sum(na.omit(data[data[groupingVar]==groupingLevels[1],symptom]) <= thresholdValue),
successG2=sum(na.omit(data[data[groupingVar]==groupingLevels[2],symptom]) > thresholdValue),
failureG2=sum(na.omit(data[data[groupingVar]==groupingLevels[2],symptom]) <= thresholdValue)
)
}}
# if calculating difference of samples (Mann-Whitney)
if (FUN=="MannWhitneyPValue") {
for (symptom in selectedSymptoms) {
calculatedPValue[symptom] <- .MannWhitneyPValue(
na.omit(data[data[groupingVar]==groupingLevels[1],symptom]),
na.omit(data[data[groupingVar]==groupingLevels[2],symptom])
)
}}
globalMinPValues <- numeric()
permutationPValues <- numeric()
for (permutation in 1:nPermutations) {
# shuffle group membership once per permutation, so that all variables share
# the same relabelling (required for a valid multivariate min-P correction)
data[,groupingVar] <- sample(data[,groupingVar])
for (symptom in selectedSymptoms) {
# if calculating proportions
if (FUN=="propPValue") {
permutationPValues[symptom] <- .propPValue(
sum(na.omit(data[data[groupingVar]==groupingLevels[1],symptom]) > thresholdValue),
sum(na.omit(data[data[groupingVar]==groupingLevels[1],symptom]) <= thresholdValue),
sum(na.omit(data[data[groupingVar]==groupingLevels[2],symptom]) > thresholdValue),
sum(na.omit(data[data[groupingVar]==groupingLevels[2],symptom]) <= thresholdValue)
)}
# if calculating difference of samples (Mann-Whitney)
if (FUN=="MannWhitneyPValue") {
permutationPValues[symptom] <- .MannWhitneyPValue(
na.omit(data[data[groupingVar]==groupingLevels[1],symptom]),
na.omit(data[data[groupingVar]==groupingLevels[2],symptom])
)
}
}
globalMinPValues[permutation] <- min(permutationPValues)
}
correctedPvalues <- numeric()
for (symptom in selectedSymptoms) {
correctedPvalues[symptom] <- sum(calculatedPValue[symptom]>globalMinPValues)/(nPermutations+1)
}
return(correctedPvalues)
}
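The correction implemented above is a min-P style multivariate permutation scheme: shuffle the group labels, recompute every per-variable P value on the shuffled data, keep the minimum across variables, and compare each observed P value against the permutation distribution of those minima. A compact toy illustration (two groups, Mann-Whitney statistic, and the common "+1" convention for the corrected P value, which differs slightly from the counting rule used above):

```r
# Hedged sketch of a min-P permutation correction on toy simulated data.
set.seed(42)
dat <- data.frame(group = rep(c("A", "B"), each=10),
                  v1 = rnorm(20), v2 = rnorm(20))
vars <- c("v1", "v2")
# observed per-variable P values
observedP <- sapply(vars, function(v)
  wilcox.test(dat[dat$group=="A", v], dat[dat$group=="B", v])$p.value)
# permutation distribution of the minimum P value across variables
minP <- replicate(999, {
  shuffled <- sample(dat$group)  # one shuffle shared by all variables
  min(sapply(vars, function(v)
    wilcox.test(dat[shuffled=="A", v], dat[shuffled=="B", v])$p.value))
})
# corrected P value: share of permutation minima at or below the observed P
sapply(observedP, function(p) (sum(minP <= p) + 1) / (999 + 1))
```

The single shared shuffle per permutation is what preserves the correlation structure between variables; shuffling separately for each variable would invalidate the correction.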
### Helper functions ####
# calculate p value by permutation - OBSOLETE, not multivariate
.pvaluebyPermutation <- function(data,
variableName,
groupName,
FUN,
nPermutations=1000,
thresholdValue) {
# based on function in
# http://cran.r-project.org/web/packages/permute/vignettes/permutations.pdf
## check x and group are of same length
stopifnot(all.equal(length(data[,variableName]), length(data[,groupName])))
## number of observations
N <- nobs(data[,variableName])
## generate the required set of permutations
pset <- shuffleSet(N, nset = nPermutations)
## iterate over the set of permutations applying FUN
D <- apply(pset, 1, FUN, x = data[,variableName], grp = data[,groupName], thresholdValue=thresholdValue)
## add on the observed mean difference
D <- c(FUN(seq_len(N), data[,variableName], data[,groupName], thresholdValue=thresholdValue), D)
## compute & return the p-value
Ds <- sum(D >= D[1]) # how many >= to the observed diff?
# use inequality, since medians otherwise don't get a proper p value?
# Ds <- sum(D > D[1])
pValue <- (Ds / (nPermutations + 1)) # what proportion of perms is this (the pval)?
# hist(D,breaks=20, main = paste("diff histogram; our value=",D[1],"; how many larger or same:", Ds ))
# rug(D[1], col = "red", lwd = 2)
return(pValue)
}
# difference of means between groups
.meanDif <- function(i, x, grp, thresholdValue) {
grp <- grp[i]
# need to return abs() for two tailed test?
diff <- abs(mean(x[grp == "Female"]) - mean(x[grp == "Male"]))
return(diff)
}
# difference of proportions between groups
.propDif <- function(i, x, grp, thresholdValue) {
grp <- grp[i]
# need to return abs() for two tailed test?
diff <- abs(mean(x[grp == "Female"]>thresholdValue) - mean(x[grp == "Male"]>thresholdValue))
return(diff)
}
# difference of medians between groups
.medianDif <- function(i, x, grp, thresholdValue) {
grp <- grp[i]
# need to return abs() for two tailed test?
diff <- abs(median(x[grp == "Female"]) - median(x[grp == "Male"]))
return(diff)
}
.returnPropCIs <- function(data,
groupingVar,
measurementVar,
selectedSymptoms) {
# define levels for covariates
group <- unique(na.omit(data[,groupingVar]))
measurement <- unique(na.omit(data[,measurementVar]))
results <- data.frame(expand.grid(Group=group,
Measurement=measurement,
Variable=selectedSymptoms), Mean=NA, LowerCI=NA, UpperCI=NA)
for (i in group) {
for(j in measurement) {
for(k in selectedSymptoms) {
# omit missing values, create boolean vector
symptomData <- na.omit((data[(data[groupingVar]==i & data[measurementVar]==j), k])==1)
testResults <- prop.test(x= sum(symptomData), n= (sum(symptomData) + sum(!symptomData)))
mean <- testResults$estimate
lower <- testResults$conf.int[1]
upper <- testResults$conf.int[2]
results[(results["Group"]==i &
results["Measurement"]==j &
results["Variable"]==k), c("Mean","LowerCI","UpperCI")] <-
c(mean, lower, upper)
}
}
}
# make Measurement, Variable and Group a factor
results[, "Measurement"] <- as.factor(results[, "Measurement"])
results[, "Group"] <- as.factor(results[, "Group"])
results[, "Variable"] <- as.factor(results[, "Variable"])
return(results)
}
.propPValue <- function (successG1, failureG1, successG2, failureG2) {
result <- prop.test(matrix(data=c(successG1,
failureG1,
successG2,
failureG2),
byrow=TRUE, nrow=2))
return(result$p.value)
}
.MannWhitneyPValue <- function (group1, group2) {
result <- wilcox.test(x=group1, y=group2)
# return the raw numeric P value; callers format it for display
# (returning a formatted string here would break the numeric comparisons
# in permCorrPValue)
return(result$p.value)
}<file_sep>/man/plotTimelineProfiles.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotTimelineProfiles}
\alias{plotTimelineProfiles}
\title{Plot profiles for variables.}
\usage{
plotTimelineProfiles(data, plotType = "oneGraph", personIDVar, measurementVar,
selectedSymptoms, sizeofRandomSample = 10, sizeofGroup = 25)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/man/findFirstRowof1stLevel.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{findFirstRowof1stLevel}
\alias{findFirstRowof1stLevel}
\title{Finds first row of 1st level group}
\usage{
findFirstRowof1stLevel(data, firstLevel)
}
\description{
Not meant to be called by the user.
}
<file_sep>/inst/shinyapp_tests/ui.R
# Shiny UI for plotting test results
# load library for generating interactive web pages
library(shiny)
# make UI a page with sidebar for options
shinyUI(
pageWithSidebar(
# create header for the window
headerPanel(title="Positive test results"),
# create the sidebar
sidebarPanel(
# create selection - should tooltips be generated?
radioButtons(inputId="generateTooltips",
label="Generate tooltips?",
choices=c("Yes"=TRUE,"No"=FALSE)),
# create selection - which sorting method should be used?
selectInput(inputId="sortingMethod",
label="What kind of sorting should be used?",
choices=c("none","DateIn","BEA","BEA_TSP","PCA"),
multiple=FALSE),
# create option for uploading file with data
fileInput(inputId="dataFile",
label="Upload Excel data file",
multiple=FALSE,
accept=c("application/vnd.ms-excel",
"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet")) #,
# # path to save the SVG file to
# textInput(inputId="fileName",
# label="Filename of saved SVG file (optional)"
# ),
# # make a submit button to update changes
# submitButton(text = "Apply Changes")
),
mainPanel(
tabsetPanel(
# TODO: to display SVG inside HTML and use the "tooltips",
# it must be properly embedded as described in:
# http://www.w3schools.com/svg/svg_inhtml.asp
# Figure out how to create custom shiny code for this.
tabPanel("Graph", h3(textOutput("message")), imageOutput("dataPlot", height="100%")),
tabPanel("Data", tableOutput("dataTable")),
tabPanel("Parameters", tableOutput("parametersTable")),
tabPanel(title="Errors", verbatimTextOutput("errors")),
tabPanel(title="Debug",h3("Client Session Parameters"), verbatimTextOutput("debug"))
)
)
)
)
<file_sep>/man/drawLineBelow2ndLevel.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{drawLineBelow2ndLevel}
\alias{drawLineBelow2ndLevel}
\title{Draws lines below 2nd level groups}
\usage{
drawLineBelow2ndLevel(data, firstLevel, secondLevel, lastRowforDiagnosis,
DIAGNOSIS.LEVELS, LINE.SIZE)
}
\description{
Not meant to be called by the user.
}
<file_sep>/inst/shinyapp_tests/server.R
# Shiny server for plotting test results
# load library for generation interactive web pages
library(shiny)
# load library for reading Excel files
library(gdata)
# load medplot library
library(medplot)
# save the location of template data file
templateLocation <- paste0(path.package("medplot"),"/extdata/PlotTests_shiny.xlsx")
# main function
shinyServer(function(input, output, session) {
# load data from Excel
data <- reactive({
if (!is.null(input$dataFile)) {
data <- read.xls(input$dataFile$datapath, sheet="DATA")
# change Excel numeric date format into R dates
data$DateIn <-
as.POSIXct(as.Date(data$DateIn, origin="1899-12-30"))
data$DateOut <-
as.POSIXct(as.Date(data$DateOut, origin="1899-12-30"))
return(data)} else {
### load default data
# DEMO SCENARIO
data <- read.xls(templateLocation, sheet="DATA")
# change Excel numeric date format into R dates
data$DateIn <-
as.POSIXct(as.Date(data$DateIn, origin="1899-12-30"))
data$DateOut <-
as.POSIXct(as.Date(data$DateOut, origin="1899-12-30"))
return(data)
}
})
# load parameters from Excel
parameters <- reactive({
if (!is.null(input$dataFile)) {
parameters <- read.xls(input$dataFile$datapath, sheet="PARAMETERS")
return(parameters)} else {
# loading default parameters
# DEMO PARAMETERS
parameters <- read.xls(templateLocation, sheet="PARAMETERS")
return(parameters)
}
})
output$dataTable <- renderTable({
data <- data()
data$DateIn <-
as.character(as.Date(data$DateIn, origin="1899-12-30"),format="%d.%m.%Y")
data$DateOut <-
as.character(as.Date(data$DateOut, origin="1899-12-30"),format="%d.%m.%Y")
return(data)
# NOTE: if we want to render the table of data, we have to convert the dates into
# characters, since renderTable seems to use xtable, which seems to not handle
# dates very well (http://stackoverflow.com/questions/8652674/r-xtable-and-dates)
})
output$parametersTable <- renderTable(parameters())
# generate reactive plot and error messages
generatePlot <- reactive({
outfile <- tempfile(fileext='.svg')
plotTests(data=data(),
figureParameters=parameters(),
fileName=outfile,
generateTooltips=input$generateTooltips,
sortMethod=input$sortingMethod)
list(errorMessages=errorMessages, outfile=outfile)
}
)
# assign the plot to an output slot
output$dataPlot <- renderImage(
list(
src=generatePlot()$outfile,
alt = "This is alternate text",
contentType = 'image/svg+xml')
)
# assign error messages to an output slot
output$errors <- renderPrint({unlist(generatePlot()$errorMessages)})
# generate debug info and assign it to an output slot
clientData <- session$clientData
output$debug <- renderText({
clientDataNames <- names(clientData)
allValues <- lapply(clientDataNames, function(name) {
paste(name, clientData[[name]], sep=" = ")
})
paste(allValues, collapse = "\n")
})
# message to display over the graph
output$message <- renderText(
if (is.null(input$dataFile)) {paste("WORKING WITH DEMO DATA!")}
)
})<file_sep>/man/plotTests.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotTests}
\alias{plotTests}
\title{Plot test results}
\usage{
plotTests(data, figureParameters, fileName, generateTooltips = TRUE,
sortMethod = "DateIn")
}
\description{
Function that plots which test results were positive for
a subject on a certain day.
}
\details{
The function is primarily meant to be called from MS Excel through the
RExcel plugin and not used directly from R.
Data is sent from Excel and drawn on a graph by
this R function. The output contains two graphs: A and B.
The A graph shows the test results for subjects (y-axis) on certain days
(x-axis). The results are represented by colored dots and can be divided in
groups and subgroups (two levels), depending on the nature of data.
The B graph shows, via a barchart, how many results were positive on
a certain day for a certain group.
}
<file_sep>/man/tableMeGroups.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{tableMeGroups}
\alias{tableMeGroups}
\title{Table with medians}
\usage{
tableMeGroups(data, groupingVar = "Sex", measurementVar, forMeasurement,
symptomsNames, thresholdValue = 0, doPValueAdjustments)
}
\arguments{
\item{TODO}{}
}
\description{
TODO Ignores threshold value.
}
<file_sep>/man/drawLineAbove1stLevel.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{drawLineAbove1stLevel}
\alias{drawLineAbove1stLevel}
\title{Draws lines above 1st level groups}
\usage{
drawLineAbove1stLevel(firstRowforType, LINE.SIZE)
}
\description{
Not meant to be called by the user.
}
<file_sep>/inst/shinyapp_symptoms/Symptoms.R
# This file gets called from RExcel, to prepare the data
# needed by shiny.
# load library for melting data
library(reshape2)
# transform date information into R compliant dates
mySymptomsData["Date"] <- as.Date(mySymptomsData[,"Date"], "%d.%m.%Y")
# transform data into ggplot compliant format
symptomsData <- melt(mySymptomsData, id.vars = c("Patient", "Date"))
# save data to file to share with the R session that will run shiny
save.image(file="data.Rdata")
# get the working directory
workingDir <- getwd()
# generate script, which contains information to start the shiny server.
script <- paste(
"
# This script is automatically generated by Symptoms.R
# and is not meant to be edited by user.
# load library used for plotting
library(ggplot2)
# set working directory
setwd(\"", workingDir ,"\")
# load data saved from Excel
load(\"data.Rdata\")
#load library shiny for HTML drawing and run shiny
library(shiny)
runApp(launch.browser=TRUE)
"
, sep="")
# save script as Run.R
# Run.R will be called by RExcel process.
writeLines(script, con="Run.R")
<file_sep>/R/PlotTests.R
#' @title Plot test results
#' @description Function that plots which test results were positive for
#' a subject on a certain day.
#' @details
#' The function is primarily meant to be called from MS Excel through the
#' RExcel plugin and not used directly from R.
#' Data is sent from Excel and drawn on a graph by
#' this R function. The output contains two graphs: A and B.
#'
#' The A graph shows the test results for subjects (y-axis) on certain days
#' (x-axis). The results are represented by colored dots and can be divided in
#' groups and subgroups (two levels), depending on the nature of data.
#'
#' The B graph shows, via a barchart, how many results were positive on
#' a certain day for a certain group.
#'
plotTests <- function (data, figureParameters, fileName,
generateTooltips = TRUE, sortMethod="DateIn") {
# replace missing values with NA (required because RExcel deliberately
# sends them as empty strings - with a different setting the date labels
# would also be transformed, which we do not want)
data[data==""] <- NA
# drop unused levels in data frame factors
data <- droplevels(data)
# start profiling
# Rprof(filename="Rprof.out", append=TRUE)
# initialize placeholder for error messages
errorMessages <<- list()
criticalError <- FALSE
# check whether the function arguments were received; somewhat superfluous,
# since the function will not work without them
if(!exists(x="data")) {
errorMessages[length(errorMessages)+1] <<-
"Error: Data was not passed to R."
}
if(!exists(x="figureParameters")) {
errorMessages[length(errorMessages)+1] <<-
"Error: Plot parameters were not passed to R."
}
# identify and store first and last columns that contain dates of tests
datesColumnsIndices <- grep("^X[0-9]{1,2}[.][0-9]{1,2}[.][0-9]{4}$",
colnames(data))
DATES.COLUMN.FIRST <- min(datesColumnsIndices)
DATES.COLUMN.LAST <- max(datesColumnsIndices)
# number of date columns
nDATES <- DATES.COLUMN.LAST - DATES.COLUMN.FIRST + 1
# identify and store parameters for drawing (levels of test results, color,
# size of symbols)
TEST.RESULT.LEVELS <- as.character(unlist(figureParameters["Result"]))
DOT.COLORS <- as.character(unlist(figureParameters["Color"]))
DOT.SIZES <- as.integer(unlist(figureParameters["Size"]))
# make dot colors transparent with transparency ALPHA (0-255)
ALPHA <- 150
DOT.COLORS <- apply(col2rgb(DOT.COLORS), 2, function(x)
{rgb(red=x[1], green=x[2], blue=x[3], alpha=ALPHA, maxColorValue=255)} )
# set the color for the lines at unit level
LINE.UNIT.COLOR <- rgb(red=col2rgb("gray")[1],
green=col2rgb("gray")[2],
blue=col2rgb("gray")[3],
alpha=ALPHA,
maxColorValue=255)
# count the number of units in sample
nUNITS <- dim(data)[1]
# store the dates in R date format
datesofTests <- as.Date(colnames(data)[datesColumnsIndices], "X%d.%m.%Y")
# vector of relative difference in days from test on day x to test on day 1
daysofTests <- datesofTests-datesofTests[1]
# create table of frequencies for all USED combinations of types & diagnosis
tableofGroups <- table(data$Type, data$Diagnosis, useNA="ifany")
# walk through all the units in the sample and assign them
# an absolute line number (keeping spaces for separating lines between groups)
nTYPES <- dim(tableofGroups)[1]
nDIAGNOSIS <- dim(tableofGroups)[2]
TYPE.LEVELS <- dimnames(tableofGroups)[[1]]
DIAGNOSIS.LEVELS <- dimnames(tableofGroups)[[2]]
# set size of space for separating lines on graph
LINE.SIZE <- 4
# Graph B settings
# set buffer between bars plotted in relative units of one bar width
# e.g. if 1, that means buffer is the same width as one bar
BAR.BUFFER <- 3
# reorder data (using seriation package) unless no sorting requested
if (sortMethod!="none") {
data <- sortData(data, sortMethod,
nUNITS,
DATES.COLUMN.FIRST,DATES.COLUMN.LAST,
TEST.RESULT.LEVELS )
}
# add an ordering column to the data dataframe; the number means which row
# the unit should be drawn in
order <- vector(mode="integer", length=nUNITS)
data <- cbind(data, Order=order)
rm(order)
# start with line number 1; number will be incremented for every unit and every
# separating line
lineNo <- 1
# assign a rownumber into column "Order" for each unit
for (i in 1:nTYPES) {
for (j in 1:nDIAGNOSIS) {
# lookup cell size and go to next cell in table if no units are found
cellSize <- tableofGroups[i,j]
if (cellSize == 0) {next}
# find names for cell dimension
typeName <- dimnames(tableofGroups)[[1]][i]
diagnosisName <- dimnames(tableofGroups)[[2]][j]
# catch a special case, when 1st level is NA
if (is.na(typeName)) {
# a special case when both 1st and 2nd level are NA
if (is.na(diagnosisName)) {
data[which(is.na(data$Type) & is.na(data$Diagnosis)),]["Order"] <-
(lineNo):(lineNo-1 + cellSize)
# increase line number by cellsize and buffer for the separating line
lineNo <- lineNo + cellSize + LINE.SIZE
# case when only 1st level is NA
} else {
# assign an absolute order number to units in this cell
data[which(is.na(data$Type) & data$Diagnosis==diagnosisName),]["Order"] <-
(lineNo):(lineNo-1 + cellSize)
# increase line number by cellsize and buffer for the separating line
lineNo <- lineNo + cellSize + LINE.SIZE
}
} else {
# catch a special case when only 2nd level is NA
if (is.na(diagnosisName)) {
data[which(data$Type==typeName & is.na(data$Diagnosis)),]["Order"] <-
(lineNo):(lineNo-1 + cellSize)
# increase line number by cellsize and buffer for the separating line
lineNo <- lineNo + cellSize + LINE.SIZE
} else {
# assign an absolute order number to units in this cell
data[which(data$Type==typeName & data$Diagnosis==diagnosisName),]["Order"] <-
(lineNo):(lineNo-1 + cellSize)
# increase line number by cellsize and buffer for the separating line
lineNo <- lineNo + cellSize + LINE.SIZE
}
}
}
}
# load plotting library
library(Cairo)
# set width of plotting region (in centimeters)
PLOT.WIDTH <- max(c(max(daysofTests), (5+3*length(TEST.RESULT.LEVELS)) ))
Cairo(fileName,
type="svg",
width=PLOT.WIDTH ,
height=29,
units="cm",
dpi=300,
pointsize=3)
# Cairo("example.pdf", type="pdf",width=19,height=24,units="cm",dpi=300)
# svg("example1.svg")
#set layout
layout(c(1,2), heights=c(4,1))
# set margins for graph A
par(mar=c(5, 7, 4, 2) + 0.1 )
# generate plot with number of days of tests on the x axis
# and appropriate y axis
plot(TRUE,
# plot y axis with buffer for lines between groups
# the top of the plot starts at 0 and goes down
# to the total of lines drawn
ylim=c(lineNo, 0),
# plot x axis with one column for every date
xlim=c(0, max(daysofTests)),
# plot nothing - just prepare the "canvas"
type="n", axes=FALSE, xlab="Date", ylab=""
)
# initialize counter of how many symbols are drawn
pointCounter <- 0
# initialize list that will hold additional info for symbol tooltips
symbolTooltip <- list()
# draw symbols of test results for each unit via their ordering
for (i in 1:lineNo) {
# skip drawing if there is no unit in this line number
if (!any(data$Order==i))
next
# otherwise draw each point of the line,
# but only the columns that are not NA
# calculate the indices of nonNA test values
index <- which(!is.na(data[data$Order==i, DATES.COLUMN.FIRST : DATES.COLUMN.LAST]))
for (ii in index) {
# get a character vector of all the tests in the cell
tests <- isolateTests(string=as.character(unlist(
data[data$Order==i, DATES.COLUMN.FIRST : DATES.COLUMN.LAST][ii]
)), separator=",")
# TODO: would the code run faster if we used as reference:
# data[DATES.COLUMN.FIRST + ii - 1])
# as below?
# test if all test results entered are valid
for (iii in 1:length(tests)) {
if (!any(tests[iii]==TEST.RESULT.LEVELS)){
errorMessages[length(errorMessages)+1] <<-
paste("Error: Data does not match allowed values. Check ID:",
data[data$Order==i,]["ID"],
"and date column:",
names(data[DATES.COLUMN.FIRST + ii - 1]),
"for value:'",
tests[iii],
"'"
)
# if at least one is invalid, flag that a critical error has occurred
criticalError <- TRUE
}
}
# draws point one on top of the other if multiple tests
# are positive in the same cell
points(
# draw x coordinate for this cell
# repeat as many times as there are test results in the cell
x=rep(daysofTests[ii], length(tests)),
# draw the line number as the y coordinate, as many times
# as there are test results in the cell
y=rep(i, length(tests)),
# use the color that is defined for each test
# by comparing the values in the cell with colors defined as parameters
col=DOT.COLORS[unlist(lapply(tests,
function(x) which(x==TEST.RESULT.LEVELS) ) )],
# use the size of dot that is defined for each test
cex=DOT.SIZES[unlist(lapply(tests,
function(x) which(x==TEST.RESULT.LEVELS) ) )],
pch=16
)
# save some additional information for each point
# to be shown as a tooltip
# loop through all points - for case when there are two in one cell
for (k in 1:length(tests)) {
symbolTooltip[pointCounter+k] <-
paste("ID:", data[data$Order==i, "ID"], # "ID" of patient
"; result:", # lists test results
as.character(unlist(data[data$Order==i,
DATES.COLUMN.FIRST :
DATES.COLUMN.LAST][ii])),
"; date:", format(datesofTests[ii],format="%d.%m.%y" )# Date of test
)
}
# increment counter of symbols drawn
pointCounter <- pointCounter + length(tests)
}
# draw death of patient if they died, at the date of release
# but only compare non-NA values of Outcome
if (!is.na(data[data$Order==i,]$Outcome)) {
if (data[data$Order==i,]$Outcome=="died") {
dateofDeath <- as.Date((data[data$Order==i,]$DateOut))
dayofDeath <- dateofDeath - datesofTests[1]
if (dayofDeath < 0) {
errorMessages[length(errorMessages)+1] <<-
paste("Error: Day of death falls before the date of first test.")
criticalError <- TRUE
}
if (dayofDeath > max(daysofTests)) {
errorMessages[length(errorMessages)+1] <<-
paste("Warning: Day of death falls after the last date of observation.")
}
if (dayofDeath >= 0 && dayofDeath <= max(daysofTests)) {
plotDeaths(lineNumber=i, dayofDeath= dayofDeath + 1)
symbolTooltip[pointCounter+1] <-
paste("ID:", data[data$Order==i, "ID"], # "ID" of patient
"; died:", format(dateofDeath, format="%d.%m.%y"))
pointCounter <- pointCounter + 1
}
}
}
}
# draw labels on the x axis
axis(1, at=daysofTests, labels=rep("", length(daysofTests)), cex.axis=.35)
DATE.LABELS <- format(datesofTests, format="%d %b")
axis(1, at=daysofTests[seq(1, length(daysofTests), 2)],
labels=DATE.LABELS[seq(1, length(daysofTests), 2)],
cex.axis=.75, line=-0.5, lwd=0)
axis(1, at=daysofTests[seq(2, length(daysofTests), 2)],
labels=DATE.LABELS[seq(2, length(daysofTests), 2)],
cex.axis=.75, line=-1, lwd=0)
# draw horizontal lines for each unit
abline(h=unlist(data["Order"]), lty=2, col=LINE.UNIT.COLOR)
# call function to draw labels for each existing 1st+2nd level combination
for (i in TYPE.LEVELS) {
for (ii in DIAGNOSIS.LEVELS) {
if (checkifCombinationExists(data, i, ii)) {
drawLevelLabels(data, TYPE.LEVELS, LINE.SIZE, DIAGNOSIS.LEVELS,
firstLevel=i, secondLevel=ii)
}
}
}
# add label of graph A
mtext("A", side=3, adj=0, line=1, cex=1.5)
# add a legend to the A plot
legend("topleft",
# x=2,
# y=-3,
bty="n",
xpd=TRUE,
legend=TEST.RESULT.LEVELS,
pch=16,
horiz=TRUE,
col=DOT.COLORS,
pt.cex=DOT.SIZES
)
# if a critical error occurred, print a warning on the graph
if (criticalError) {
text(x=1, y=0, labels="Critical error",
cex=max(c(8, PLOT.WIDTH/15)), col="red",
srt=-45, adj=0)
}
#### draw graph B #####
# set margins for 2nd figure
par(mar=c(4, 7, .5, 2) + 0.1 )
# create data structure to hold data for graph of percentage of positive units
positiveUnits <- matrix(ncol=length(DATES.COLUMN.FIRST : DATES.COLUMN.LAST),
nrow=length(TYPE.LEVELS))
# calculate percentages of positive (i.e. non-"neg") units among all results
# per 1st level group for each of the dates
column <- 1
for (i in DATES.COLUMN.FIRST : DATES.COLUMN.LAST) { # cycle through all dates (columns)
row <- 1
for (ii in TYPE.LEVELS) { # cycle through all 1st level levels (rows)
# for the case when 1st level is non-NA
if (!is.na(ii)) {
sumNonNA <- sum(!is.na(data[data$Type==ii, i]))
sumPositive <- sum(!is.na(data[data$Type==ii, i]) &
data[data$Type==ii, i]!="neg" )
positiveUnits[row, column] <- sumPositive/sumNonNA*100
}
# for case when 1st level is NA
if (is.na(ii)) {
sumNonNA <- sum(!is.na(data[is.na(data$Type), i]))
sumPositive <- sum(!is.na(data[is.na(data$Type), i]) &
data[is.na(data$Type), i]!="neg" )
positiveUnits[row, column] <- sumPositive/sumNonNA*100
}
row <- row + 1
}
column <- column +1
}
# calculate minimum difference between consecutive tests in days
minimumInterval <- min(as.numeric(diff(daysofTests)))
# calculate space available to draw a bar,
# leaving a buffer the size of one width
barWidth <- minimumInterval / (length(TYPE.LEVELS)+BAR.BUFFER)
# calculate distance to middle of bars drawn
middleofBars <- (barWidth*(length(TYPE.LEVELS))/2)
# initialize spacing vector (vector of spaces between bars)
spacingVector <- vector(mode="numeric")
# generate spacing vector for drawing bars appropriately spaced
# each number is the space before each column
# go through all bars (rows x columns of the matrix)
for (i in 1:(dim(positiveUnits)[1]*dim(positiveUnits)[2])) {
if (i == 1) { # case of first column - align the middle of all columns above date
spacingVector[i] <- (-middleofBars/barWidth)
} else {
if (i %% length(TYPE.LEVELS) == 1) {
# for every column that belongs to a new set of columns
# calculate how far apart tests are in days
# and adjust shift for this difference
spacingVector[i] <- (daysofTests[(i %/% length(TYPE.LEVELS))+1] -
daysofTests[(i %/% length(TYPE.LEVELS))]
- 2*middleofBars) /(barWidth)
} else if (length(TYPE.LEVELS)==1) { # if only one column (level) in group
spacingVector[i] <- (diff(daysofTests)[i-1] - 2*middleofBars)/(barWidth)
}
else { # all columns inside a group have zero space between
spacingVector[i] <- 0
}
}
}
# load library for the "barplot2" function
library(gplots)
# load library for "ColorBrewer" function
library(RColorBrewer)
# only use RColorBrewer if there are more than two levels
# and fewer than 10 levels (10 or more is too many for the "Pastel1" palette)
if (length(TYPE.LEVELS)>2 & length(TYPE.LEVELS)<10) {
legendColors <- brewer.pal(n=length(TYPE.LEVELS), name="Pastel1")
}
# if one level use black colors for columns
if (length(TYPE.LEVELS)==1) {legendColors <- c("black")}
# if two levels use black and white colors for columns
if (length(TYPE.LEVELS)==2) {legendColors <- c("black", "white")}
# if 10 or more levels, sample random colors from the full set of R colors
# (more than 9 levels are not expected)
if (length(TYPE.LEVELS)>=10) {
legendColors <- sample(colors(), length(TYPE.LEVELS), replace=FALSE)
}
barplot2(positiveUnits, beside=T, col=legendColors, axes=F,
xlim=c(0, max(daysofTests)), width=barWidth,
space=spacingVector)
abline(h=seq(0, 100, by=5), col="gray", lty=2)
# draw labels on the x axis
axis(1, at=daysofTests, labels=rep("", length(daysofTests)), cex.axis=.35)
# draw x axis labels
DATE.LABELS <- format(datesofTests, format="%d %b")
axis(1, at=daysofTests[seq(1, length(daysofTests), 2)],
labels=DATE.LABELS[seq(1, length(daysofTests), 2)],
cex.axis=.75, line=-0.5, lwd=0)
axis(1, at=daysofTests[seq(2, length(daysofTests), 2)],
labels=DATE.LABELS[seq(2, length(daysofTests), 2)],
cex.axis=.75, line=-1, lwd=0)
# draw tick marks on the y axis
axis(2, cex.axis=.5)
# draw title of y axis
mtext("Percentage of positive swabs", line=2, side=2, cex=.7 )
# draw legend for graph B
TEMP.TYPE.LEVELS <- TYPE.LEVELS
TEMP.TYPE.LEVELS[(is.na(TYPE.LEVELS))]<-"Unknown"
legend("topright", legend=TEMP.TYPE.LEVELS, fill=legendColors, cex=.7)
rm(TEMP.TYPE.LEVELS)
# draw the "title" of the graph
mtext("B", side=3, adj=0, line=1, cex=1.5)
# write the date and time of plotting
mtext(paste("Generated on:", format(Sys.time(), "%d %b %Y, %H:%M")),
side=1, line=2, cex=1)
# writes image to file
dev.off()
# add interactivity to the figure
# execute only if parameter to show tooltips is TRUE
if (generateTooltips) {
library(SVGAnnotation)
# open the generated SVG file
doc <- xmlParse(fileName)
# only generate graph with tooltips if the number of points is the same
# as the number of tooltip annotations
if (pointCounter!=length(getPlotPoints(doc)[[1]]))
{errorMessages[length(errorMessages)+1] <<-
paste("Critical Error: Number of points drawn (",
length(getPlotPoints(doc)[[1]]),
") does not match the number",
"of tooltips generated (",
pointCounter,
"). Graph with tooltips will not be generated.")
} else {
# add tooltips - must reference the first list in
# a list of points - because they were drawn first?
# if the order of drawing changes, this will break
addToolTips(getPlotPoints(doc)[[1]],
symbolTooltip,
addArea=TRUE)
# add CSS styles inside the file
# internal styles enable easier sharing of SVG files
# without having to share referenced CSS files
addCSS(doc, insert=TRUE)
saveXML(doc, fileName)
}
}
# set default error message, if no errors were generated
if (length(errorMessages) == 0) {
errorMessages[length(errorMessages)+1] <<- "No errors found."
}
}
############ HELPER FUNCTIONS ##################################################
# functions below are called by the plotTests function and are not meant to
# be called by the user
#' @title Draw labels and lines for groups
#' @description Function that plots labels for both possible groups describing
#' subjects.
#' @details
#' The function is meant to be called by the general function for plotting,
#' when it needs to draw the labels for the groups.
drawLevelLabels <- function (data, TYPE.LEVELS, LINE.SIZE, DIAGNOSIS.LEVELS,
firstLevel, secondLevel) {
# get the rownumber for the first and last units in this 1st level group
firstRowforType <- findFirstRowof1stLevel(data, firstLevel)
lastRowforType <- findLastRowof1stLevel(data, firstLevel)
# get rownumber for first and last unit in 2nd level group
firstRowforDiagnosis <- findFirstRowof2ndLevel(data, firstLevel, secondLevel)
lastRowforDiagnosis <- findLastRowof2ndLevel(data, firstLevel, secondLevel)
# draw labels and lines for 2nd level
draw2ndLevelLabels(label=secondLevel, firstRowforDiagnosis,
lastRowforDiagnosis )
drawLineBelow2ndLevel(data, firstLevel, secondLevel,
lastRowforDiagnosis, DIAGNOSIS.LEVELS, LINE.SIZE)
# draw labels and lines for 1st level
draw1stLevelLabels(label=firstLevel, firstRowforType)
drawLineBelow1stLevel(data, firstLevel, TYPE.LEVELS,
lastRowforType, LINE.SIZE)
}
#' @title Draws labels for the 2nd level groups
#' @description Not meant to be called by the user.
draw2ndLevelLabels <- function (label,
firstRowforDiagnosis,
lastRowforDiagnosis) {
if (!is.na(label)){ mtext(line=1, text=label, side=2, las=2, cex=.75,
padj=0, font=1,
at=(firstRowforDiagnosis+lastRowforDiagnosis)/2)}
if (is.na(label)) {mtext(line=1, text="Unknown", side=2, las=2, cex=.75,
padj=0, font=1,
at=(firstRowforDiagnosis+lastRowforDiagnosis)/2)}
}
#' @title Draws labels for the 1st level groups
#' @description Not meant to be called by the user.
draw1stLevelLabels <- function (label, firstRowforType ) {
if (!is.na(label)){
mtext(line=2, text=label, side=2, las=2, cex=1, padj=0, font=2,
at=firstRowforType)}
if (is.na(label)){
mtext(line=2, text="Unknown", side=2, las=2, cex=1, padj=0, font=2,
at=firstRowforType)}
}
#' @title Finds first row of 1st level group
#' @description Not meant to be called by the user.
findFirstRowof1stLevel <- function(data, firstLevel) {
if (is.na(firstLevel)) # if 1st level is NA
{min(data[is.na(data$Type),]["Order"], na.rm=TRUE)} else {
min(data[data$Type==firstLevel,]["Order"], na.rm=TRUE)} # if it is nonNA
}
#' @title Finds last row of 1st level group
#' @description Not meant to be called by the user.
findLastRowof1stLevel <- function (data, firstLevel) {
if (is.na(firstLevel)) # if 1st level is NA
{max(data[is.na(data$Type),]["Order"], na.rm=TRUE)} else { # if it is nonNA
max(data[data$Type==firstLevel,]["Order"], na.rm=TRUE)}
}
#' @title Finds first row of 2nd level group
#' @description Not meant to be called by the user.
findFirstRowof2ndLevel <- function (data, firstLevel, secondLevel) {
if ( is.na(firstLevel) & !is.na(secondLevel) ) # 1st level NA, 2nd level nonNA
{x <- min(data[is.na(data$Type) & data$Diagnosis==secondLevel,]["Order"],
na.rm=TRUE) }
if (is.na(firstLevel) & is.na(secondLevel)) # 1st level NA, 2nd level NA
{x <- min(data[is.na(data$Type) & is.na(data$Diagnosis),]["Order"],
na.rm=TRUE)}
if (!is.na(firstLevel) & !is.na(secondLevel)) # 1st level nonNA, 2nd level nonNA
{x <- min(data[data$Type==firstLevel & data$Diagnosis==secondLevel,]["Order"],
na.rm=TRUE)}
if (!is.na(firstLevel) & is.na(secondLevel)) # 1st level nonNA, 2nd level NA
{x <- min(data[data$Type==firstLevel & is.na(data$Diagnosis),]["Order"],
na.rm=TRUE)}
return(x)
}
#' @title Finds last row of 2nd level group
#' @description Not meant to be called by the user.
findLastRowof2ndLevel <- function (data, firstLevel, secondLevel) {
if (is.na(firstLevel) & !is.na(secondLevel)) # 1st level NA, 2nd level nonNA
{x <- max(data[is.na(data$Type) & data$Diagnosis==secondLevel,]["Order"],
na.rm=TRUE) }
if (is.na(firstLevel) & is.na(secondLevel)) # 1st level NA, 2nd level NA
{x <- max(data[is.na(data$Type) & is.na(data$Diagnosis),]["Order"],
na.rm=TRUE)}
if (!is.na(firstLevel) & !is.na(secondLevel)) # 1st level nonNA, 2nd level nonNA
{x <- max(data[data$Type==firstLevel & data$Diagnosis==secondLevel,]["Order"],
na.rm=TRUE)}
if (!is.na(firstLevel) & is.na(secondLevel)) # 1st level nonNA, 2nd level NA
{x <- max(data[data$Type==firstLevel & is.na(data$Diagnosis),]["Order"],
na.rm=TRUE)}
return(x)
}
#' @title Draws lines above 1st level groups
#' @description Not meant to be called by the user.
drawLineAbove1stLevel <- function (firstRowforType, LINE.SIZE) {
abline(h=firstRowforType-(LINE.SIZE/2), lty=2, col="black", lwd=2)
}
#' @title Draws lines below 1st level groups
#' @description Not meant to be called by the user.
drawLineBelow1stLevel <- function (data,
firstLevel,
TYPE.LEVELS,
lastRowforType,
LINE.SIZE) {
# if (firstLevel != tail(TYPE.LEVELS[!is.na(TYPE.LEVELS)], n=1)) {
if (lastRowforType!=max(data$Order)){
abline(h=lastRowforType+(LINE.SIZE/2), lty=2, col="black", lwd=2)
}
}
#' @title Draws lines below 2nd level groups
#' @description Not meant to be called by the user.
drawLineBelow2ndLevel <- function (data,
firstLevel,
secondLevel,
lastRowforDiagnosis,
DIAGNOSIS.LEVELS,
LINE.SIZE) {
# add line after last unit, for any but the last group
if (lastRowforDiagnosis!=max(data$Order)) {
abline(h=lastRowforDiagnosis+(LINE.SIZE/2), lty=2, col="gray", lwd=2)
}
}
#' @title Gets all test results written in a cell
#' @description Not meant to be called by the user.
#' @details
#' Parses test result strings from a string (a cell of data containing results).
#' Different separators can be defined. Leading spaces are removed.
isolateTests <- function (string, separator) {
a <- unlist(strsplit(string, split=separator)) # isolate all tests in cell
b <- sub(pattern="^ ", replacement="", x=a) # remove leading spaces
return(b) # return character vector to calling function
}
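# A minimal, illustrative sketch of the helper above, kept commented out so it
# does not run at package load time (the input string is hypothetical):
# isolateTests("pos, neg, ind", separator=",")
# returns c("pos", "neg", "ind") - leading spaces after separators are removed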
#' @title Adds an error message to error message list
#' @description Called from the plotTests function and not meant to be
#' called by user.
#' @details
#' This function adds an error message to the list of error messages
#' which is passed to RExcel. The messages are shown to the user
#' in the Excel spreadsheet.
addErrorMessage <- function(text) {
errorMessages[length(errorMessages)+1] <<- text
}
#' @title Checks if combination of groups exists
#' @description Checks if combination of groups exists and is not meant
#' to be called by the user.
#' @details
#' Checks for the existence of a combination of first and second level groups
#' in the data.
checkifCombinationExists <- function (data, firstLevel, secondLevel) {
if (!is.na(firstLevel) & !is.na(secondLevel)) {
x <- (!dim(na.omit(data[data$Type==firstLevel &
data$Diagnosis==secondLevel,]["Order"]))[[1]]==0)
}
if (!is.na(firstLevel) & is.na(secondLevel)) {
x <- (!dim(na.omit(data[data$Type==firstLevel &
is.na(data$Diagnosis),]["Order"]))[[1]]==0)
}
if ( is.na(firstLevel) & !is.na(secondLevel)) {
x <- (!dim(na.omit(data[is.na(data$Type) &
data$Diagnosis==secondLevel,]["Order"]))[[1]]==0)
}
if ( is.na(firstLevel) & is.na(secondLevel)) {
x <- (!dim(na.omit(data[is.na(data$Type) &
is.na(data$Diagnosis),]["Order"]))[[1]]==0)
}
return(x)
}
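# A minimal, illustrative sketch of the helper above on a toy data frame,
# kept commented out so it does not run at package load time
# (the values are hypothetical, not part of the package):
# toy <- data.frame(Type=c("A", "A", NA), Diagnosis=c("x", NA, "x"),
#                   Order=1:3, stringsAsFactors=FALSE)
# checkifCombinationExists(toy, firstLevel="A", secondLevel="x") # TRUE
# checkifCombinationExists(toy, firstLevel="B", secondLevel="x") # FALSE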
#' @title Plots a death of a patient with a special character
#' @description Not meant to be called by the user.
plotDeaths <- function (lineNumber, dayofDeath) {
points(x=dayofDeath, y=lineNumber, pch=15, col="black", cex=2)
# WARNING: for some pch values, the SVG file saves the plotted symbol
# to a different XML node, which destroys the tooltip generation feature
# safe pch are those with simple forms: circle, square, ...?
# but not a cross or a star (since those are apparently drawn in more than one
# move). Just so you know, in case you change the pch value
# and everything breaks :)
}
#' @title Sort results data
#' @description Function that sorts the data according to a criterion. Not to
#' be called directly by the user.
#' @details
#' Function is called by the plotTests function, to sort the data before
#' it starts working on it. Different methods of sorting can be used.
#' "DateIn" sorts by the date of admission.
#' "BEA" uses seriation package and Bond Energy Algorithm.
#' "BEA_TSP" as above with TSP to optimize the measure of effectivenes.
#' "PCA" uses seriation package First principal component algorithm.
sortData <- function (data, sortMethod="BEA",
nUNITS,
DATES.COLUMN.FIRST,DATES.COLUMN.LAST,
TEST.RESULT.LEVELS ) {
# if sorting method is by date of admission
if (sortMethod=="DateIn") {
# return data reordered by date of admission
return(data[order(data$DateIn), ])
} else { # else return data sorted by the seriation package
nTESTS <- length(TEST.RESULT.LEVELS)
nDATES <- (DATES.COLUMN.LAST - DATES.COLUMN.FIRST + 1)
# initialize matrix to hold info about positive test results
# the matrix has a row for every patient and
# the columns for every day*result
# and will contain 1 where the day+result is realized
results <- matrix(0, nrow=nUNITS, ncol=nDATES*nTESTS)
colnames(results) <- rep(TEST.RESULT.LEVELS, nDATES)
for (i in 1:nUNITS) {
for (ii in 1:nDATES) {
for (iii in 1:nTESTS) {
if(grepl(TEST.RESULT.LEVELS[iii],
data[i, DATES.COLUMN.FIRST + ii - 1])
) {
results[i, (ii-1)*nTESTS + iii] <- 1
}
}
}
}
# load library for sorting via similarity
library(seriation)
# sort, get order by patients dimension
series <- seriate(results, method=sortMethod)
seriesOrder <- get_order(series,dim=1)
# return data resorted
return(data[seriesOrder, ])
}
}<file_sep>/man/tableAllWithSymptoms.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{tableAllWithSymptoms}
\alias{tableAllWithSymptoms}
\title{Table for both groups}
\usage{
tableAllWithSymptoms(data, measurementVar, forMeasurement, symptomsNames,
thresholdValue = 0)
}
\arguments{
\item{TODO}{}
}
\description{
TODO Thresholds are taken into account for proportions, not for medians.
}
<file_sep>/R/PlotSymptomsImportData.R
#' @title Function that imports data for the plotSymptoms plot
#'
#' @description Function is called to import the data in appropriate format for the
#' plotSymptoms plot.
#'
#' @param datafile Path to the file containing data.
#' @param format Format of the file containing data ("Excel", "TSV", "Demo").
#' @export
importSymptomsData <- function (datafile, format) {
if (format=="Excel") {
# read the data into R
data <- read.xls(datafile, sheet="DATA")
# transform date information into R compliant dates
# TODO: should we remove this line, since the transform is done later,
# when the user selects the date column?
data["Date"] <- as.Date(data[,"Date"], "%d.%m.%Y")
return(data)
}
if (format=="TSV") {
# read the data into R
data <- read.csv(datafile, header=TRUE, sep="\t", stringsAsFactors=FALSE)
return(data)
}
if (format=="Demo") {# Same code as for Excel at the moment
}
}
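# A minimal, illustrative sketch of calling the importer, kept commented out
# (the file names refer to the template files shipped in inst/extdata):
# data <- importSymptomsData("PlotSymptoms_shiny.xlsx", format="Excel")
# data <- importSymptomsData("PlotSymptoms_TemplateTSV.txt", format="TSV")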
#' @title Function that imports PATIENTS Excel tab for the plotSymptoms plot
#'
#' @param datafile Path to the file containing data.
importSymptomsPatients <- function (datafile) {
data <- read.xls(datafile, sheet="PATIENTS")
return(data)
}<file_sep>/man/permCorrPValue.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{permCorrPValue}
\alias{permCorrPValue}
\title{Multivariate permutation test for correcting the P value}
\usage{
permCorrPValue(data, nPermutations = 999, selectedSymptoms, groupingVar,
measurementVar, measurementOccasion, FUN, thresholdValue = 0)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/man/findLastRowof1stLevel.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{findLastRowof1stLevel}
\alias{findLastRowof1stLevel}
\title{Finds last row of 1st level group}
\usage{
findLastRowof1stLevel(data, firstLevel)
}
\description{
Not meant to be called by the user.
}
<file_sep>/man/draw1stLevelLabels.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{draw1stLevelLabels}
\alias{draw1stLevelLabels}
\title{Draws labels for the 1st level groups}
\usage{
draw1stLevelLabels(label, firstRowforType)
}
\description{
Not meant to be called by the user.
}
<file_sep>/man/plotDeaths.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotDeaths}
\alias{plotDeaths}
\title{Plots a death of a patient with a special character}
\usage{
plotDeaths(lineNumber, dayofDeath)
}
\description{
Not meant to be called by the user.
}
<file_sep>/man/findLastRowof2ndLevel.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{findLastRowof2ndLevel}
\alias{findLastRowof2ndLevel}
\title{Finds last row of 2nd level group}
\usage{
findLastRowof2ndLevel(data, firstLevel, secondLevel)
}
\description{
Not meant to be called by the user.
}
<file_sep>/man/importSymptomsPatients.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{importSymptomsPatients}
\alias{importSymptomsPatients}
\title{Function that imports PATIENTS Excel tab for the plotSymptoms plot}
\usage{
importSymptomsPatients(datafile)
}
\arguments{
\item{datafile}{Path to the file containing data.}
}
\description{
Function that imports PATIENTS Excel tab for the plotSymptoms plot
}
<file_sep>/inst/shinyapp_symptoms2/loadLibraries.R
# This file contains code to load libraries required by the medplot package
# load library for generating interactive web pages
library(shiny)
# if(!require(shinyIncubator)) { # NOTE!!!: this is not available on CRAN, might not be best to include it?
# devtools::install_github("shiny-incubator", "rstudio")
# library(shinyIncubator)
# }
# load library for generating graph scales
library(scales)
# load library for melting data
library(reshape2)
# library for plotting data
library(ggplot2)
# library for reading Excel files
library(gdata)
# library for manipulating data
library(plyr)
# library for clustering
library(pheatmap)
# load medplot library
library(medplot)
# library for date management
library(lubridate)
# library for regression modeling
library(rms)
# library for logistic regression modeling
library(logistf) # note: this library has a myriad of dependencies - it might cause problems
# library for permutation tests
library(permute)
# bootstrap library
library(boot)
# libraries for mixed models
library(lme4)
library(lmerTest)
<file_sep>/R/PlotSymptomsLasagnaPlot.R
plotLasagna <- function(data, # dataFiltered()
treatasBinary, # input$treatasBinary
symptom, # input$selectedSymptoms
dateVar, # input$dateVar
personIDVar, # input$patientIDVar
measurementVar, # input$measurementVar
groupingVar, # input$groupingVar
thresholdValue # input$thresholdValue
) {
# list all subject IDs
subjects <- as.character(unique(data[, personIDVar]))
# list all levels of evaluation occasions
levels <- as.character(unique(data[, measurementVar]))
# construct empty matrix of results for all possible combinations of subjects X levels
results <- matrix(nrow=length(subjects), ncol=length(levels), dimnames=list(subjects, levels))
# run a loop filling the data into the matrix
for (level in levels) {
# subset data for a particular evaluation occasion
tempdata <- data[data[, measurementVar]==level, c(personIDVar,symptom)]
# only look at subjects that were measured at that evaluation occasion
subjects <- as.character(unique(tempdata[, personIDVar]))
for (subject in subjects) {
results[subject, level] <- tempdata[tempdata[,personIDVar]==subject, symptom]
}
}
# plot a different graph if data is binary (legend is different)
if (treatasBinary==TRUE) {
pheatmap(results, cluster_cols=FALSE, main=symptom, show_rownames=FALSE,
legend_breaks=c(0,1),
legend_labels=c(0,1),
color=c('blue','red'),
drop_levels=TRUE)
} else {
pheatmap(results, cluster_cols=FALSE, main=symptom, show_rownames=FALSE)
}
}
########
pastePlotFilenames <- function(filenames) {
out <- vector()
for (file in filenames) {
#data <- readPNG(file, native=TRUE)
out <- paste(out, img(src=paste0("./temp/",basename(file))))
#out <- paste(out, (img(src=data)))
}
return(out)
}<file_sep>/man/plotPropWithSymptomsCI.Rd
\name{plotPropWithSymptomsCI}
\alias{plotPropWithSymptomsCI}
\title{Plot of confidence intervals for proportions for groups and measurements.}
\usage{
plotPropWithSymptomsCI(data, groupingVar, measurementVar, selectedSymptoms)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/man/plotTimeline.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotTimeline}
\alias{plotTimeline}
\title{Plot timeline of symptom presence}
\usage{
plotTimeline(data, date, personID, measurement, symptoms,
displayFormat = "dates", treatasBinary = FALSE)
}
\arguments{
\item{data}{The data frame used by the plot.}
\item{date}{Name of variable containing dates.}
\item{personID}{Name of variable containing person ID.}
\item{measurement}{Name of variable containing measuring occasion info.}
\item{symptoms}{Vector of variable names representing measured symptoms.}
\item{displayFormat}{Type of horizontal axis: "dates", "timeFromInclusion"
or "measurementOccasions".}
\item{treatasBinary}{Logical - should symptom values be treated as binary?}
}
\value{
A ggplot object that has to be plotted via print().
\description{
Function that plots symptom severity presence for patients at a certain time.
}
<file_sep>/man/tableMediansWithSymptoms.Rd
\name{tableMediansWithSymptoms}
\alias{tableMediansWithSymptoms}
\title{Table with medians}
\usage{
tableMediansWithSymptoms(data, groupingVar = "Sex", measurementVar,
forMeasurement, symptomsNames, thresholdValue = 0, doPValueAdjustments)
}
\arguments{
\item{TODO}{}
}
\description{
TODO Ignores threshold value.
}
<file_sep>/README.md
# medplot R package
Functions for visualizing and analyzing medical data in R - primarily longitudinal epidemiologic data.
You can find the Excel template file (to use with the medplot shiny app) in the folder:
[/inst/extdata/PlotSymptoms_shiny.xlsx](/inst/extdata/PlotSymptoms_shiny.xlsx)
And the Tab Separated Values template file in the folder:
[/inst/extdata/PlotSymptoms_TemplateTSV.txt](/inst/extdata/PlotSymptoms_TemplateTSV.txt)
# Docker installation
The medplot package is contained in a Docker image located at:
https://hub.docker.com/r/crtahlin/medplot/
Make sure you have Docker installed. Instructions for various platforms can be found at:
https://docs.docker.com/engine/installation/
Run the Docker container (the image is downloaded automatically) by running the following in a shell:
```sudo docker run -d -p 3838:3838 crtahlin/medplot```
Open your browser and connect to: [http://localhost:3838/shinyapp_symptoms2/](http://localhost:3838/shinyapp_symptoms2/)
The medplot app should open in your browser.
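By default the container publishes the app on host port 3838. If that port is already in use, standard Docker port mapping lets you pick another host port (generic Docker usage; the port number below is only an example):

```
sudo docker run -d -p 8080:3838 crtahlin/medplot
```

In that case, connect to [http://localhost:8080/shinyapp_symptoms2/](http://localhost:8080/shinyapp_symptoms2/) instead.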
# General installation instructions for the medplot package
After installing the [R programming language](http://cran.r-project.org/), use the console in R to install the devtools package:
```{r, eval=FALSE}
install.packages("devtools")
```
Make sure you install the development tools mentioned on the [devtools web page](http://www.rstudio.com/products/rpackages/devtools/). For MS Windows, this would be the [Rtools package](http://cran.r-project.org/bin/windows/Rtools/).
Perl has to be installed to enable importing of MS Excel files. For MS Windows, [Strawberry Perl](http://www.perl.org/get.html) should work. Reboot the computer afterwards.
Load devtools and install packages from GitHub:
```{r, eval=FALSE}
library(devtools)
install_github("crtahlin/medplot")
```
# Running medplot locally
To run medplot:
```{r, eval=FALSE}
library(medplot)
medplotOnline()
```
This should open the application in a web browser.
# Troubleshooting Linux installation
Some specific problems arose while installing on Linux (Ubuntu 14.04). Solutions are listed below.
Prerequisite for the devtools package - use the terminal to:
```
sudo apt-get install libcurl4-openssl-dev
```
Prerequisite for the Cairo & medplot packages - use the terminal to:
```
sudo apt-get install libcairo2-dev
sudo apt-get install libxt-dev
```
as described in http://stackoverflow.com/questions/16678934/r-cairo-installation-without-apt-get-sudo
Prerequisite for the XML & medplot package - use the terminal to:
```
sudo apt-get install r-cran-xml
```
as described in http://stackoverflow.com/questions/7765429/unable-to-install-r-package-in-ubuntu-11-04
<!--
Notes on installation in Linux
To install R
sudo apt-get install r-base-core
To get devtools working
sudo apt-get install libcurl4-openssl-dev
To install devtools
install.packages("devtools")
To install Cairo library
sudo apt-get install libcairo2-dev
sudo apt-get install libxt-dev
as described in http://stackoverflow.com/questions/16678934/r-cairo-installation-without-apt-get-sudo
To install XML package
sudo apt-get install r-cran-xml
as described in http://stackoverflow.com/questions/7765429/unable-to-install-r-package-in-ubuntu-11-04
To install medplot
library("devtools")
To install shinyIncubator
install_github(username="rstudio", repo="shiny-incubator")
To run medplot:
medplotOnline()
-->
<file_sep>/man/tabelizeLogistf.Rd
\name{tabelizeLogistf}
\alias{tabelizeLogistf}
\title{OBSOLETE Plot logistf}
\usage{
tabelizeLogistf(data, measurementVar, selectedMeasurement, covariate,
selectedSymptoms, thresholdValue)
}
\arguments{
\item{TODO}{TODO write instructions}
}
\description{
TODO write description
}
<file_sep>/man/plotRCSmod.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotRCSmod}
\alias{plotRCSmod}
\title{Plot RCS MOD ??? TODO : help contents}
\usage{
plotRCSmod(My.mod, my.var, my.ylab = "Response", my.xlab = "Covariate",
my.xlim = NULL, my.ylim = NULL, my.knots = NULL, my.title = NULL,
vlines = NULL, hlines = NULL, my.cex.lab = NULL, my.axes = TRUE,
my.cex.main = NULL, my.cex.axis = NULL)
}
\arguments{
\item{My.mod}{???}
}
\description{
Plot RCS MOD ??? TODO : help contents
}
<file_sep>/man/summarizeData.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{summarizeData}
\alias{summarizeData}
\title{Tables summarizing data}
\usage{
summarizeData(data, personIDVar, measurementVar, selectedSymptoms, groupingVar)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/man/plotRCS.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotRCS}
\alias{plotRCS}
\title{Plot RCS}
\usage{
plotRCS(data.all, data.yn = NULL, measurement, selectedSymptoms,
measurementSelectedrcs, rcsIDVar, binaryVar = TRUE)
}
\arguments{
\item{data}{Data for plotting.}
}
\description{
Plot RCS.
}
<file_sep>/R/PlotSymptomsTabTimeline.R
#' @title Plot timeline of symptom presence
#'
#' @description Function that plots symptom severity presence for patients at a certain time.
#'
#' @param data The data frame used by the plot.
#' @param date Name of variable containing dates.
#' @param personID Name of variable containing person ID.
#' @param measurement Name of variable containing measuring occasion info.
#' @param symptoms Vector of variable names representing measured symptoms.
#' @param displayFormat Type of horizontal axis: "dates", "timeFromInclusion"
#' or "measurementOccasions".
#' @param treatasBinary Logical - should symptom values be treated as binary?
#'
#' @details The data is reshaped with melt() into the long format used by ggplot().
#' @return A ggplot object that has to be plotted via print().
plotTimeline <- function (data,
date,
personID,
measurement,
symptoms,
displayFormat = "dates",
treatasBinary=FALSE) {
# Scenario - timeFromInclusion
# add info about days since inclusion in the study in a column if needed for plotting
# TODO: I will rewrite this as .calculateTimeSinceInclusion(); use this function here instead
if (displayFormat == "timeFromInclusion") {
# find the day of inclusion in the study for each person
uniquePeople <- as.data.frame(unique(data[personID]))
colnames(uniquePeople) <- personID
for (person in uniquePeople[,1]) {
subset <- data[which(data[personID]==person), date]
uniquePeople[which(uniquePeople[personID]==person), "minDate"] <- as.character(min(subset))
data[which(data[personID]==person), "minDate"] <- as.character(min(subset))
}
data$minDate <- as.Date(data$minDate, format="%Y-%m-%d")
data$daysSinceInclusion <- as.numeric(data[,date] - data$minDate) # save as numeric for melt() to work
}
# keep only relevant data and melt() it into ggplot format
if (displayFormat == "timeFromInclusion") { # in case of ploting days since inclusion on horizontal axis
data <- data[ , c("daysSinceInclusion", personID, measurement, symptoms)]
data <- melt(data=data, id.vars = c(personID, "daysSinceInclusion", measurement))
#horizontalAxisVariable <- "daysSinceInclusion"
colnames(data)[which(colnames(data)=="daysSinceInclusion")] <- "horizontalAxisVariable"
}
# Scenario - dates
# in case dates are used on the horizontal axis
if (displayFormat == "dates") { # in case of ploting with dates on horzontal axis
data <- data[ , c(date, personID, measurement, symptoms)]
data <- melt(data=data, id.vars = c(personID, date, measurement))
#horizontalAxisVariable <- "Date"
colnames(data)[which(colnames(data)==date)] <- "horizontalAxisVariable"
}
# Scenario - measurement occasions
if (displayFormat == "measurementOccasions") {
data <- data[ , c(date, personID, measurement, symptoms)]
data[ ,measurement] <- as.factor(data[ ,measurement])
# calculate how many subjects were measured on an occasion
peopleMeasured <- data.frame(horizontalAxisVariable=unique(data[,measurement]), Number=NA)
for (i in (peopleMeasured[,"horizontalAxisVariable"])) { # go through all measurement occasions
dataForMeasurement <- data[data[measurement]==i,]
peopleMeasured[peopleMeasured["horizontalAxisVariable"]==i,"Number"] <-
(length(unique(dataForMeasurement[,personID])))
}
# Y coord for annotations above graph
yCoord <- 0.65
# melt the data
data <- melt(data=data, id.vars = c(personID, date, measurement))
#horizontalAxisVariable <- "Measurement"
colnames(data)[which(colnames(data)==measurement)] <- "horizontalAxisVariable"
}
# rename column names to make sure ggplot recognizes them and that the code below works
colnames(data)[which(colnames(data)==personID)] <- "PersonID"
#colnames(data)[which(colnames(data)==measurement)] <- "Measurement"
# Plotting function ####
if (treatasBinary==FALSE) {
plot <- ggplot(data, aes(x = horizontalAxisVariable, y = PersonID, size = value, colour = variable)) +
geom_point(shape = 1) + myTheme() +
geom_point(data=function(d) subset(d, value == 0), shape=2, size=1) + # a function as 'data' filters this layer's data (the old plyr-style subset=.() argument is defunct)
scale_size_area(breaks=c(1:10),minor_breaks=c(1:10),
guide="legend",
limits=c(1,10),
max_size=10)}
if (treatasBinary==TRUE) {
data[,"value"] <- ifelse(data[,"value"]==1, "PRESENT", "ABSENT")
plot <- ggplot(data, aes(x = horizontalAxisVariable, y = PersonID, size = value,
colour = variable)) +
geom_point(shape=1) + myTheme() + scale_size_manual(values=c(1,5))
}
# if plotting dates on horizontal axis
if (displayFormat == "dates") {
plot <- plot + scale_x_date(breaks = date_breaks("1 week"),
labels = date_format("%d %b"),
minor_breaks = date_breaks("1 day")) +
xlab("Date") +
theme(axis.text.x = element_text(angle=90))
}
# if plotting days since inclusion on horizontal axis
if (displayFormat == "timeFromInclusion") {
plot <- plot + xlab("Days since inclusion")
}
# if plotting measurement occasion on horizontal axis
if (displayFormat == "measurementOccasions") {
plot <- plot + xlab("Measurement occasions")
plot <- plot + annotate(geom="text", x=as.factor(peopleMeasured$horizontalAxisVariable),
y=yCoord,
label=paste("Measured:",peopleMeasured$Number))
}
return(plot)
}<file_sep>/man/isolateTests.Rd
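The melt() calls above reshape the wide data (one column per symptom) into the long format that ggplot2 expects. A minimal base-R sketch of the same idea on made-up toy data (the column names here are illustrative, not from the package):

```r
# Toy wide data (hypothetical): one row per person, one column per symptom.
wide <- data.frame(PersonID = c("P1", "P2"),
                   fever = c(3, 0),
                   rash  = c(1, 2))

# Base-R equivalent of reshape2::melt(): one row per (person, symptom) pair,
# with the symptom name in "variable" and its score in "value".
long <- stats::reshape(wide, direction = "long",
                       varying = c("fever", "rash"),
                       v.names = "value",
                       timevar = "variable",
                       times   = c("fever", "rash"),
                       idvar   = "PersonID")
```

In the long format, aesthetics such as `colour = variable` and `size = value` map directly onto columns, which is what the plotting code above relies on.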
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{isolateTests}
\alias{isolateTests}
\title{Gets all test results written in a cell}
\usage{
isolateTests(string, separator)
}
\description{
Not meant to be called by the user.
}
\details{
Parses test result strings from a string (a cell of data containing results).
Different separators can be defined. Leading spaces are removed.
}
<file_sep>/man/plotClusterHeatmap.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotClusterHeatmap}
\alias{plotClusterHeatmap}
\title{Plot clustered heatmaps.}
\usage{
plotClusterHeatmap(data, variableName, variableValue, selectedSymptoms,
annotationVars = NA, treatasBinary = FALSE)
}
\arguments{
\item{data}{Data frame to be passed to the function.}
\item{variableName}{The column name of the variable used for filtering.}
\item{variableValue}{The value of the filtering variable used to filter.}
}
\description{
Plots a heatmap for presence of symptoms.
}
<file_sep>/man/plotTimelineBoxplots.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotTimelineBoxplots}
\alias{plotTimelineBoxplots}
\title{Plot timeline boxplots}
\usage{
plotTimelineBoxplots(data, personIDVar, measurementVar, selectedSymptoms,
faceting)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/man/plotDistribution.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotDistribution}
\alias{plotDistribution}
\title{Plots of distribution of the symptoms}
\usage{
plotDistribution(data, selectedSymptoms, selectedProportion, measurements)
}
\arguments{
\item{data}{Data frame for plotting.}
}
\description{
Plots of distribution of the symptoms
}
<file_sep>/man/medplotOnline.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{medplotOnline}
\alias{medplotOnline}
\title{Function to load the medplot plotSymptoms shiny web app}
\usage{
medplotOnline()
}
\description{
Function call loads the shiny app located in the default installation folder.
}
<file_sep>/man/plotClusterDendrogram.Rd
\name{plotClusterDendrogram}
\alias{plotClusterDendrogram}
\title{Plot dendrogram of patient groups based on symptoms.}
\usage{
plotClusterDendrogram(data, variableName, variableValue, selectedSymptoms)
}
\arguments{
\item{data}{Data frame to be passed to the function.}
\item{variableName}{The column name of the variable used
for filtering.}
\item{variableValue}{The value of the filtering variable
used to filter.}
}
\description{
Plots a dendrogram of patient groups based on symptom
values they have.
}
<file_sep>/man/tableRCS.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{tableRCS}
\alias{tableRCS}
\title{Restricted cubic spline P values in tabular form}
\usage{
tableRCS(data.all, data.yn, measurement, selectedSymptoms,
measurementSelectedrcs, rcsIDVar, binaryVar = TRUE)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/R/PlotSymptomsGenerateGraph.R
#' @title Code to generate a downloadable file containing a plot (a downloadHandler) to download using a button.
#'
#' @param filename Suggested file name.
#' @param plotFunction The function that plots the graph (can be reactive).
#' @param width Width of the plot in pixels.
#' @param height Height of plot in pixels.
#' @param print Whether to wrap the plotting call in print(). Needed for ggplot objects but not for base plots.
#' @export
downloadPlot <- function(filename="plot.eps",
plotFunction,
width,
height,
print=FALSE) {
downloadHandler(
filename = filename,
content = function(file) {
postscript(file, paper="special", width = width/72, height = height/72)
if (print==TRUE) { print(match.fun(plotFunction)()) }
if (print==FALSE) { match.fun(plotFunction)() }
dev.off()
}, contentType="application/postscript"
)
}
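postscript() takes its dimensions in inches, which is why downloadPlot() divides the pixel sizes by 72 (points per inch). A small standalone sketch of that conversion (pxToInches is an illustrative helper, not part of medplot):

```r
# Convert pixel dimensions to the inch units postscript() expects,
# assuming the conventional 72 points (pixels) per inch.
pxToInches <- function(px, dpi = 72) px / dpi

pxToInches(720)          # 10 inches
pxToInches(c(800, 600))  # width and height at once
```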
<file_sep>/man/addErrorMessage.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{addErrorMessage}
\alias{addErrorMessage}
\title{Adds an error message to error message list}
\usage{
addErrorMessage(text)
}
\description{
Called from the plotTests function and not meant to be
called by the user.
}
\details{
This function adds an error message to the list of error messages
which is passed to RExcel. The messages are shown to the user
in the Excel spreadsheet.
}
<file_sep>/man/tablePropPosGroups.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{tablePropPosGroups}
\alias{tablePropPosGroups}
\title{Table with proportions}
\usage{
tablePropPosGroups(data, groupingVar = "Sex", measurementVar, forMeasurement,
symptomsNames, thresholdValue = 0, doPValueAdjustments)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/inst/shinyapp_symptoms2/ui.R
# This is the GUI part of the shiny script that plots
# graphs displaying the presence of symptoms.
# Define UI for displaying presence of symptoms
shinyUI(fluidPage(
#theme="/bootstrap/spacelab_bootstrap.css",
# Application title ####
titlePanel("medplot"),
# Layout of the GUI ####
sidebarLayout(
# Define the sidebar panel ####
sidebarPanel(
width=3,
textOutput("medplotVersion"),
uiOutput("messageSelectVars"),
# greyed out panel
wellPanel(
conditionalPanel(
condition="input.dataFileType =='Demo'",
h4("Working with DEMO data")),
# selection of type of data file
selectInput(inputId="dataFileType",
label="Select type of data file:",
choices=c(
" "=NA,
"Excel template"="Excel",
"Tab separated values (TSV) file"="TSV",
"Demo data"="Demo"
)),
# show the file upload control if the user chooses to upload a file
conditionalPanel(
condition="input.dataFileType =='Excel' || input.dataFileType =='TSV'",
fileInput(inputId="dataFile",
label={h5("Upload data file:")},
multiple=FALSE,
accept=c("application/vnd.ms-excel",
"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
"text/plain"))
),
# selection of various variables needed in the analysis
uiOutput("selectPatientIDVar"),
uiOutput("selectDateVar"),
uiOutput("selectMeasurementVar"),
uiOutput("selectGroupingVar"),
uiOutput("selectSymptoms"), # these are our outcome variables
uiOutput("selectTreatasBinary"),
uiOutput("selectThresholdValue")
)
),
# Define the main panel ####
mainPanel(
tabsetPanel(
# TAB - welcome page with copyright info ####
tabPanel(title="Welcome",
includeHTML("www/welcome.html")),
# TAB - summary of data ####
tabPanel(title="Data overview",
verbatimTextOutput("dataSummary")),
# TAB - Graphical exploration over time ####
tabPanel(title="Graphical exploration",
uiOutput("selectGraphOverTime"),
# Profile plots
uiOutput("selectGraphType"),
uiOutput("selectRandomSampleSize"),
uiOutput("selectMaxGroupSize"),
plotOutput("plotTimelineProfiles", height="auto"),
conditionalPanel(condition="input.selectedGraphOverTime=='profilePlot'",
downloadButton("downLoadplotTimelineProfiles", label="Download")),
uiOutput("plotTimelineProfilesDescr"),
# Lasagna plot
uiOutput("plotLasagna"),
conditionalPanel(condition="input.selectedGraphOverTime=='lasagnaPlot'",
uiOutput("downloadLasagna")),
uiOutput("plotLasagnaDesc"),
# Boxplots
uiOutput("selectFacetingType"),
plotOutput("plotTimelineBoxplots", height="auto"),
conditionalPanel(condition="input.selectedGraphOverTime=='boxPlot'",
downloadButton("downLoadplotTimelineBoxplot", label="Download")),
uiOutput("plotTimelineBoxplotsDesc"),
# Timeline graph
uiOutput("selectDisplayFormat"),
plotOutput("plotTimeline", height="auto"),
conditionalPanel(condition="input.selectedGraphOverTime=='timelinePlot'",
downloadButton("downLoadplotTimeline", label="Download")),
uiOutput("plotTimelineDesc"),
# Presence of symptoms graph
uiOutput("selectMeasurementForPresencePlot"),
plotOutput("plotProportion", height="auto"),
conditionalPanel(condition="input.selectedGraphOverTime=='presencePlot'",
downloadButton("downLoadplotProportion", label="Download")),
uiOutput("plotProportionDesc")
),
# TAB - Summary ####
tabPanel(title="Summary",
plotOutput("plotPyramid", height="auto"),
conditionalPanel(condition="input.treatasBinary==true",
downloadButton("downLoadplotPyramid", label="Download")),
uiOutput("selectEvaluationTime2"),
dataTableOutput("tableforBoxplots"),
dataTableOutput("tableforProportions"),
plotOutput("plotPresence", height="auto"),
conditionalPanel(condition="input.treatasBinary==true",
downloadButton("downLoadplotPresence", label="Download")),
plotOutput("plotMedians", height="auto"),
conditionalPanel(condition="input.treatasBinary==false",
downloadButton("downLoadplotMedians", label="Download")),
uiOutput("mediansDescr"),
uiOutput("proportionsDescr")
),
# TAB - Summary tables : grouping variable ####
tabPanel(title="Summary tables : grouping variable",
plotOutput("plotPropCIs", height="auto"),
conditionalPanel(condition="input.treatasBinary==true",
downloadButton("downLoadPropPositiveCI", label="Download")),
uiOutput("UIpropTable"),
uiOutput("UIdoPvalueAdjustments"),
dataTableOutput("tablePropGroups"),
uiOutput("textTablePropGroups"),
dataTableOutput("tableMedianGroups"),
uiOutput("textTableMedianGroups")
),
# TAB - Clustering ####
tabPanel(title="Clustering",
uiOutput("clusteringUI"),
plotOutput("plotClusterDendrogram", height="auto"),
downloadButton("downLoadplotClusterDendrogram", label="Download"),
uiOutput("dendrogramDescr"),
plotOutput("plotClusterCorrelations", height="auto"),
downloadButton("downLoadplotClusterCorrelations", label="Download"),
uiOutput("correlationDescr"),
uiOutput("selectClusterAnnotations"),
plotOutput("plotClusterHeatmap", height="auto"),
downloadButton("downLoadplotClusterHeatmap", label="Download"),
uiOutput("heatmapDescr")
),
# TAB - Regression model : one evaluation time ####
tabPanel(title="Regression model : one evaluation time",
uiOutput("selectEvaluationTime"),
uiOutput("selectCovariate"),
uiOutput("checkUseFirthCorrection"),
uiOutput("checkUseRCSModel"),
# Graphs
# Logistic regression with Firth correction
plotOutput("plotLogistf", height="auto"),
uiOutput("logistfRegDownload"),
dataTableOutput("tableLogistf"),
dataTableOutput("tableLogistfIntercept"),
uiOutput("logistfDescr"),
# Logistic regression
plotOutput("plotLogist", height="auto"),
uiOutput("logistRegDownload"),
dataTableOutput("tableLogist"),
dataTableOutput("tableLogistIntercept"),
uiOutput("logistDescr"),
# Linear regression
plotOutput("plotLinear", height="auto"),
uiOutput("linearRegDownload"),
dataTableOutput("tableLinear"),
dataTableOutput("tableLinearIntercept"),
uiOutput("linearDescr"),
# RCS regression
plotOutput("plotRCS", height="100%"),
uiOutput("RCSRegDownload"),
dataTableOutput("tableRCS"),
uiOutput("RCSDescr")
),
# TAB - Regression model : all evaluation times ####
tabPanel(title="Regression model : all evaluation times",
uiOutput("selectCovariate1st"),
uiOutput("selectMixedModelType"),
textOutput("mixedModelTable0Caption"),
dataTableOutput("mixedModelTable0"),
textOutput("mixedModelTable1Caption"),
dataTableOutput("mixedModelTable1"),
plotOutput("mixedModelGraph1", height="auto"),
uiOutput("graph1Download"),
textOutput("mixedModelTable2Caption"),
dataTableOutput("mixedModelTable2"),
plotOutput("mixedModelGraph2", height="auto"),
uiOutput("graph2Download"),
textOutput("mixedModelTable3Caption"),
dataTableOutput("mixedModelTable3"),
plotOutput("mixedModelGraph3", height="auto"),
uiOutput("graph3Download"),
uiOutput("regressionAllDescr")
),
# TAB - Uploaded data ####
tabPanel(title="Uploaded data",
dataTableOutput("data")),
# TAB - Debug ####
tabPanel("Selected variables",
verbatimTextOutput("selectedVariables"),
textOutput("debug"))
)
)
)
)
)<file_sep>/man/importSymptomsData.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{importSymptomsData}
\alias{importSymptomsData}
\title{Function that imports data for the plotSymptoms plot}
\usage{
importSymptomsData(datafile, format)
}
\arguments{
\item{datafile}{Path to the file containing data.}
\item{format}{Format of the file containing data ("Excel", "TSV", "Demo").}
}
\description{
Function is called to import the data in the appropriate format for the
plotSymptoms plot.
}
<file_sep>/man/plotDendrogram.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotDendrogram}
\alias{plotDendrogram}
\title{Plot dendrogram of patient groups based on symptoms.}
\usage{
plotDendrogram(data, variableName, variableValue, selectedSymptoms,
treatasBinary = FALSE)
}
\arguments{
\item{data}{Data frame to be passed to the function.}
\item{variableName}{The column name of the variable used for filtering.}
\item{variableValue}{The value of the filtering variable used to filter.}
}
\description{
Plots a dendrogram of patient groups based on symptom values they
have.
}
<file_sep>/man/findFirstRowof2ndLevel.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{findFirstRowof2ndLevel}
\alias{findFirstRowof2ndLevel}
\title{Finds first row of 2nd level group}
\usage{
findFirstRowof2ndLevel(data, firstLevel, secondLevel)
}
\description{
Not meant to be called by the user.
}
<file_sep>/R/PlotSymptomsGgplotTheme.R
# define a ggplot2 theme
myTheme <- function () {
myTheme <- theme_bw() + theme(
text = element_text(size=18)
)
return(myTheme)}<file_sep>/R/PlotSymptomsDetermineTypeofVariable.R
determineTypeofVariable <- function(variableValues) { # takes a vector of values
if(!is.vector(variableValues)) {return("Error: Function expects a vector as argument")}
#variableValues <- as.vector(variableValues)
nLevels <- length(unique(na.omit(variableValues)))
if (nLevels==1) {numLevels <- "unary"}
if (nLevels==2) {numLevels <- "binary"}
if (nLevels>2) {numLevels <- "multilevel"}
return(list(nLevels=numLevels, type=class(variableValues)))
}<file_sep>/R/PlotSymptomsMixedModel.R
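A quick illustration of what determineTypeofVariable() returns; the definition is repeated inside the sketch so it is self-contained:

```r
# Mirrors the helper defined above: classifies a vector by its number of
# distinct non-missing values and reports its storage class.
determineTypeofVariable <- function(variableValues) {
  if (!is.vector(variableValues)) {
    return("Error: Function expects a vector as argument")
  }
  nLevels <- length(unique(na.omit(variableValues)))
  if (nLevels == 1) numLevels <- "unary"
  if (nLevels == 2) numLevels <- "binary"
  if (nLevels > 2)  numLevels <- "multilevel"
  list(nLevels = numLevels, type = class(variableValues))
}

determineTypeofVariable(c("M", "F", "F", NA))  # binary character (NA ignored)
determineTypeofVariable(c(1.5, 2.5, 3.5))      # multilevel numeric
```

mixedModel() below uses the `type` element of this result to decide whether to treat the first covariate as a factor or as a numeric variable.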
mixedModel <- function(data,
selectedSymptoms,
coVariate1st,
subjectIDVar,
measurementVar,
dateVar,
thresholdValue,
treatasBinary,
selectedModel){
# if the response variable is binary, do a data transformation based on thresholdValue
if (treatasBinary==TRUE) {data[, selectedSymptoms] <-
ifelse(data[,selectedSymptoms]>thresholdValue, 1, 0)
}
# transform the 1st covariate into appropriate form ####
coVariate1stType <- determineTypeofVariable(data[,coVariate1st])
if (coVariate1stType["type"]=="character") {
# make coVariate1st factor variable
data[, coVariate1st] <- as.factor(data[, coVariate1st])
}
if (coVariate1stType["type"]=="numeric" | coVariate1stType["type"]=="integer") {
# make coVariate1st numeric variable
data[, coVariate1st] <- as.numeric(data[, coVariate1st])
}
# make measurementVar a factor variable ####
data[, measurementVar] <- as.factor(data[, measurementVar])
# if the model includes days since inclusion, add this info to the data (column "daysSinceInclusion")
if (selectedModel=="MMtimeSinceInclusion") {
data <- calculateDaysSinceInclusion(data, subjectIDVar, dateVar)
time <- "daysSinceInclusion"
}
# function to generate text for describing what is compared (reference level)####
describeComparison <- function (data, variable) {
if(!is.factor(data[,variable])) {return(print("Function takes a factor variable!"))}
levels <- levels(data[, variable])
referenceLevel <- levels[1]
nonReferenceLevels <- levels[-1]
comparisonString <- paste0(variable, nonReferenceLevels)
comparisonText <- paste ("reference level:", referenceLevel)
return(list(levels=levels,
referenceLevel=referenceLevel,
nonReferenceLevels=nonReferenceLevels,
comparisonString=comparisonString,
comparisonText=comparisonText))
}
if (is.factor(data[,coVariate1st])) {
# name of the binary grouping variable coeficient assigned by R
coVariate1stCoefName <- describeComparison(data, coVariate1st)[["comparisonString"]]
# the non-reference levels - the ones that get compared to the first (reference) level
referenceValue <- describeComparison(data, coVariate1st)[["nonReferenceLevels"]]
# what are we comparing?
coVariate1stComparison <- describeComparison(data, coVariate1st)[["comparisonText"]]
} else {
coVariate1stCoefName <- coVariate1st
coVariate1stComparison <- "continious variable"
}
# list measurement occasion levels
measurementLevels <- describeComparison(data, measurementVar)[["levels"]]
# the measurement level that other are compared to is?
measurementVarComparison <- describeComparison(data, measurementVar)[["referenceLevel"]]
# the other measurement levels
nonReferenceMeasurements <- describeComparison(data, measurementVar)[["nonReferenceLevels"]]
# prepare data frame for the results, depending on the type of the 1st covariate
if ( is.factor(data[,coVariate1st]) & (length(levels(data[,coVariate1st]))>2) ) {
# if 1st covariate is multilevel factor
resultscoVariate1st <- data.frame(expand.grid(Variable=selectedSymptoms,
CovariateLevel=referenceValue))
} else { # if the 1st covariate is binary factor or numerical
resultscoVariate1st <- data.frame(Variable=selectedSymptoms)
}
resultsMeasurementVar <- data.frame(expand.grid(Variable=selectedSymptoms,
Measurement=nonReferenceMeasurements))
resultsDaysSinceInclusion <- data.frame(Variable=selectedSymptoms)
resultsIntercept <- data.frame(Variable=selectedSymptoms)
# The logic ####
# cycle through response variables
for (symptom in selectedSymptoms) {
# choose the right model formula depending on user selected option of model
# construct the right formula based on selected model
if (selectedModel=="MMsimple") {
formula <- as.formula(paste(symptom, "~", coVariate1st,
"+(1|", subjectIDVar, ")"))
}
if (selectedModel=="MMmeasurement") {
formula <- as.formula(paste(symptom, "~",
measurementVar, # the column was already converted to a factor above
"+",
coVariate1st,
"+(1|", subjectIDVar, ")"))
}
if (selectedModel=="MMtimeSinceInclusion") {
formula <- as.formula(paste(symptom, "~", time,"+", coVariate1st,
"+(1|", subjectIDVar, ")"))
}
# build the model depending on whether the response is binary or not ####
if (treatasBinary==TRUE) { # treatasBinary = TRUE ####
model <- glmer(formula, family=binomial, na.action=na.omit, data=data)
#### Returns results from glmer ####
returnResultsGlmer <- function (model, variateName) {
Estimate <- exp(summary(model)$coef[variateName, "Estimate"])
CILower <- exp(summary(model)$coef[variateName, "Estimate"] -
qnorm(.975)*summary(model)$coef[variateName, "Std. Error"])
CIUpper <- exp(summary(model)$coef[variateName, "Estimate"] +
qnorm(.975)*summary(model)$coef[variateName, "Std. Error"])
PValue <- summary(model)$coef[variateName, "Pr(>|z|)"]
return(list(Estimate=Estimate, CILower=CILower, CIUpper=CIUpper, PValue=PValue))
}
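The confidence limits in returnResultsGlmer() above are Wald intervals on the log-odds scale, exponentiated to the odds-ratio scale: exp(estimate ± z(0.975)·SE). A standalone numeric sketch with made-up estimate and standard error:

```r
# Hypothetical log-odds estimate and standard error, as glmer() would report.
est <- 0.8
se  <- 0.3

or      <- exp(est)                      # odds ratio
ciLower <- exp(est - qnorm(0.975) * se)  # lower 95% Wald limit
ciUpper <- exp(est + qnorm(0.975) * se)  # upper 95% Wald limit
round(c(OR = or, lower = ciLower, upper = ciUpper), 2)
```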
####
# results for the intercept (should be the same code, regardless of model type)
tempResult <- returnResultsGlmer(model, "(Intercept)")
# TODO: the intercept is an odds, not an odds ratio - OK for now, change later (some code
# depends on it), but the printable results already correct this.
resultsIntercept[resultsIntercept["Variable"]==symptom, "OR"] <-
tempResult[["Estimate"]]
resultsIntercept[resultsIntercept["Variable"]==symptom, "ORCILower"] <-
tempResult[["CILower"]]
resultsIntercept[resultsIntercept["Variable"]==symptom, "ORCIUpper"] <-
tempResult[["CIUpper"]]
resultsIntercept[resultsIntercept["Variable"]==symptom, "ORPValue"] <-
tempResult[["PValue"]]
rm(tempResult)
# results for the grouping variable if it is multilevel factor ####
if ( is.factor(data[,coVariate1st]) & (length(levels(data[,coVariate1st]))>2) ) {
for (level in referenceValue) {
tempResult <- returnResultsGlmer(model, paste0(coVariate1st,level))
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"OR"] <- tempResult[["Estimate"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"ORCILower"] <- tempResult[["CILower"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"ORCIUpper"] <- tempResult[["CIUpper"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"ORPValue"] <- tempResult[["PValue"]]
rm(tempResult)
}
} else {
# results for the grouping variable if it is binary or numerical ####
tempResult <- returnResultsGlmer(model, coVariate1stCoefName)
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "OR"] <-
tempResult[["Estimate"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "ORCILower"] <-
tempResult[["CILower"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "ORCIUpper"] <-
tempResult[["CIUpper"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "ORPValue"] <-
tempResult[["PValue"]]
rm(tempResult)
}
# results for the measurement variable
if (selectedModel=="MMmeasurement") {
for (measurement in nonReferenceMeasurements) {
tempResult <- returnResultsGlmer(model, paste0(measurementVar,measurement))
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"OR"] <- tempResult[["Estimate"]]
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"ORCILower"] <- tempResult[["CILower"]]
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"ORCIUpper"] <- tempResult[["CIUpper"]]
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"ORPValue"] <-
tempResult[["PValue"]]
rm(tempResult)
}}
# results for the days since inclusion variable
if (selectedModel=="MMtimeSinceInclusion") {
tempResult <- returnResultsGlmer(model, time)
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "OR"] <-
tempResult[["Estimate"]]
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "ORCILower"] <-
tempResult[["CILower"]]
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "ORCIUpper"] <-
tempResult[["CIUpper"]]
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "ORPValue"] <-
tempResult[["PValue"]]
}
}
if(treatasBinary==FALSE) { #####
model <- lmerTest::lmer(formula, na.action=na.omit, data=data)
##### Returns results from lmer() ####
returnResultsLmer <- function(model,
variateName) {
Estimate <- summary(model)$coef[variateName, "Estimate"]
confIntervals <- confint(model)[variateName, c("2.5 %", "97.5 %")]
CILower <- confIntervals[1]
CIUpper <- confIntervals[2]
PValue <- lmerTest::summary(model)$coef[variateName, "Pr(>|t|)"]
return(list(Estimate=Estimate, CILower=CILower, CIUpper=CIUpper, PValue=PValue))
}
#####
# results for the intercept (should be the same code, regardless of model type)
tempResult <- returnResultsLmer(model, "(Intercept)")
# TODO: this row holds the intercept, not a beta coefficient - OK for now, change later (some code
# depends on it), but the printable results already correct this.
resultsIntercept[resultsIntercept["Variable"]==symptom, "beta"] <-
tempResult[["Estimate"]]
resultsIntercept[resultsIntercept["Variable"]==symptom, "betaCILower"] <-
tempResult[["CILower"]]
resultsIntercept[resultsIntercept["Variable"]==symptom, "betaCIUpper"] <-
tempResult[["CIUpper"]]
resultsIntercept[resultsIntercept["Variable"]==symptom, "betaPValue"] <-
tempResult[["PValue"]]
rm(tempResult)
# results if coVariate1st multilevel factor
if ( is.factor(data[,coVariate1st]) & (length(levels(data[,coVariate1st]))>2) ) {
for (level in referenceValue) {
tempResult <- returnResultsLmer(model, paste0(coVariate1st,level))
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"beta"] <- tempResult[["Estimate"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"betaCILower"] <- tempResult[["CILower"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"betaCIUpper"] <- tempResult[["CIUpper"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom &
resultscoVariate1st["CovariateLevel"]==level,
"betaPValue"] <-
tempResult[["PValue"]]
rm(tempResult)
}
} else {
# results for the coVariate1st variable
tempResult <- returnResultsLmer(model, coVariate1stCoefName)
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "beta"] <-
tempResult[["Estimate"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "betaCILower"] <-
tempResult[["CILower"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "betaCIUpper"] <-
tempResult[["CIUpper"]]
resultscoVariate1st[resultscoVariate1st["Variable"]==symptom, "betaPValue"] <-
tempResult[["PValue"]]
rm(tempResult)
}
if (selectedModel=="MMmeasurement") {
# results for the measurement variable
for (measurement in nonReferenceMeasurements) {
tempResult <- returnResultsLmer(model, paste0(measurementVar,measurement))
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"beta"] <- tempResult[["Estimate"]]
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"betaCILower"] <- tempResult[["CILower"]]
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"betaCIUpper"] <- tempResult[["CIUpper"]]
resultsMeasurementVar[resultsMeasurementVar["Variable"]==symptom &
resultsMeasurementVar["Measurement"]==measurement,
"betaPValue"] <- tempResult[["PValue"]]
rm(tempResult)
}}
if (selectedModel=="MMtimeSinceInclusion") {
# results for the days since inclusion variable
tempResult <- returnResultsLmer(model, time)
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "beta"] <-
tempResult[["Estimate"]]
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "betaCILower"] <-
tempResult[["CILower"]]
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "betaCIUpper"] <-
tempResult[["CIUpper"]]
resultsDaysSinceInclusion[resultsDaysSinceInclusion["Variable"]==symptom, "betaPValue"] <-
tempResult[["PValue"]]
rm(tempResult)
}
}
}
# # generate printable results tables
# # for outcome non-binary
# if (treatasBinary==FALSE) {
# # for covariate1st
# # TODO: for scenario with multilevel categorical
# # TODO: all scenarios for binary variables
# printableResultsCoVariate1st <- cbind(format(resultscoVariate1st$Variable, digits=2),
# format(resultscoVariate1st$beta, digits=2),
# paste(format(resultscoVariate1st$betaCILower, digits=2),
# "to",
# format(resultscoVariate1st$betaCIUpper, digits=2)),
# format(resultscoVariate1st$betaPValue, digits=2))
# colnames(printableResultsCoVariate1st) <- c("Variable", "Beta", "95% conf. interval", "P Value" )
# # for evaluation occasion
# printableResultsMeasurementVar <- cbind(format(resultsMeasurementVar$Variable, digits=2),
# format(resultsMeasurementVar$Measurement, digits=2),
# format(resultsMeasurementVar$beta, digits=2),
# paste(format(resultsMeasurementVar$betaCILower, digits=2),
# "to",
# format(resultsMeasurementVar$betaCIUpper, digits=2)),
# format(resultsMeasurementVar$betaPValue, digits=2))
# colnames(printableResultsMeasurementVar) <- c("Variable","Measurement" ,"Beta",
# "95% conf. interval", "P Value" )
# # for timesinceinclusion
# printableResultsDaysSinceInclusion <- cbind(format(resultsDaysSinceInclusion$Variable, digits=2),
# format(resultsDaysSinceInclusion$beta, digits=2),
# paste(format(resultsDaysSinceInclusion$betaCILower, digits=2),
# "to",
# format(resultsDaysSinceInclusion$betaCIUpper, digits=2)),
# format(resultsDaysSinceInclusion$betaPValue, digits=2))
# colnames(printableResultsDaysSinceInclusion) <- c("Variable","Beta", "95% conf. interval", "P Value" )
#
#
# }
#### Construct printable results tables ####
# printable results for the intercepts ####
if ("OR" %in% colnames(resultsIntercept) ) { # OR scenario ####
printableResultsIntercept <-
data.frame("Variable"=resultsIntercept$Variable,
"Odds (intercept)"=format(resultsIntercept$OR, digits=2),
"95% conf. interval"= paste(format(resultsIntercept$ORCILower, digits=2),
"to",
format(resultsIntercept$ORCIUpper, digits=2)),
"P Value"=format(resultsIntercept$ORPValue, digits=2), check.names=FALSE)
}
if ("beta" %in% colnames(resultsIntercept) ) { # beta scenario ####
printableResultsIntercept <-
data.frame("Variable"=resultsIntercept$Variable,
"Intercept"=format(resultsIntercept$beta, digits=2),
"95% conf. interval"= paste(format(resultsIntercept$betaCILower, digits=2),
"to",
format(resultsIntercept$betaCIUpper, digits=2)),
"P Value"=format(resultsIntercept$betaPValue, digits=2), check.names=FALSE)
}
# printable results for coVariate1st ####
if ("OR" %in% colnames(resultscoVariate1st) ) { # OR scenario ####
if ("CovariateLevel" %in% colnames(resultscoVariate1st)) { # multilevel factor
printableResultsCoVariate1st <-
data.frame("Variable"=resultscoVariate1st$Variable,
"Levels"= resultscoVariate1st$CovariateLevel,
"OR"=format(resultscoVariate1st$OR, digits=2),
"95% conf. interval"= paste(format(resultscoVariate1st$ORCILower, digits=2),
"to",
format(resultscoVariate1st$ORCIUpper, digits=2)),
"P Value"=format(resultscoVariate1st$ORPValue, digits=2), check.names=FALSE)
# resort the results of printableResultsCoVariate1st
printableResultsCoVariate1st <- resortbyVariables(printableResultsCoVariate1st, selectedSymptoms)
} else { # non multilevel factor
printableResultsCoVariate1st <-
data.frame("Variable"=resultscoVariate1st$Variable,
"OR"=format(resultscoVariate1st$OR, digits=2),
"95% conf. interval"= paste(format(resultscoVariate1st$ORCILower, digits=2),
"to",
format(resultscoVariate1st$ORCIUpper, digits=2)),
"P Value"=format(resultscoVariate1st$ORPValue, digits=2), check.names=FALSE)
}}
if ("beta" %in% colnames(resultscoVariate1st) ) { # beta scenario ####
if ("CovariateLevel" %in% colnames(resultscoVariate1st)) { # multilevel factor
printableResultsCoVariate1st <-
data.frame("Variable"=resultscoVariate1st$Variable,
"Levels"= resultscoVariate1st$CovariateLevel,
"Beta"=format(resultscoVariate1st$beta, digits=2),
"95% conf. interval"= paste(format(resultscoVariate1st$betaCILower, digits=2),
"to",
format(resultscoVariate1st$betaCIUpper, digits=2)),
"P Value"=format(resultscoVariate1st$betaPValue, digits=2), check.names=FALSE)
# resort the results of printableResultsCoVariate1st
printableResultsCoVariate1st <- resortbyVariables(printableResultsCoVariate1st, selectedSymptoms)
} else { # non multilevel factor
printableResultsCoVariate1st <-
data.frame("Variable"=resultscoVariate1st$Variable,
"Beta"=format(resultscoVariate1st$beta, digits=2),
"95% conf. interval"= paste(format(resultscoVariate1st$betaCILower, digits=2),
"to",
format(resultscoVariate1st$betaCIUpper, digits=2)),
"P Value"=format(resultscoVariate1st$betaPValue, digits=2), check.names=FALSE)
}}
printableResultsMeasurementVar <- data.frame()
printableResultsDaysSinceInclusion <- data.frame()
# printable results for MeasurementVar ####
if ("OR" %in% colnames(resultsMeasurementVar) ) { # OR scenario ####
printableResultsMeasurementVar <-
data.frame("Variable"=resultsMeasurementVar$Variable,
"Levels"= resultsMeasurementVar$Measurement,
"OR"=format(resultsMeasurementVar$OR, digits=2),
"95% conf. interval"= paste(format(resultsMeasurementVar$ORCILower, digits=2),
"to",
format(resultsMeasurementVar$ORCIUpper, digits=2)),
"P Value"=format(resultsMeasurementVar$ORPValue, digits=2), check.names=FALSE)
# resort the results of printableResultsMeasurementVar, to keep Variables together
printableResultsMeasurementVar <-
resortbyVariables(printableResultsMeasurementVar, selectedSymptoms)
}
if ("beta" %in% colnames(resultsMeasurementVar) ) { # beta scenario ####
printableResultsMeasurementVar <-
data.frame("Variable"=resultsMeasurementVar$Variable,
"Levels"= resultsMeasurementVar$Measurement,
"Beta"=format(resultsMeasurementVar$beta, digits=2),
"95% conf. interval"= paste(format(resultsMeasurementVar$betaCILower, digits=2),
"to",
format(resultsMeasurementVar$betaCIUpper, digits=2)),
"P Value"=format(resultsMeasurementVar$betaPValue, digits=2), check.names=FALSE)
# resort the results of printableResultsMeasurementVar, to keep Variables together
printableResultsMeasurementVar <-
resortbyVariables(printableResultsMeasurementVar, selectedSymptoms)
}
# printable results for daysSinceInclusion ####
if ("OR" %in% colnames(resultsDaysSinceInclusion) ) { # OR scenario ####
printableResultsDaysSinceInclusion <-
data.frame("Variable"=resultsDaysSinceInclusion$Variable,
"OR"=format(resultsDaysSinceInclusion$OR, digits=2),
"95% conf. interval"= paste(format(resultsDaysSinceInclusion$ORCILower, digits=2),
"to",
format(resultsDaysSinceInclusion$ORCIUpper, digits=2)),
"P Value"=format(resultsDaysSinceInclusion$ORPValue, digits=2), check.names=FALSE)
}
if ("beta" %in% colnames(resultsDaysSinceInclusion) ) { # beta scenario ####
printableResultsDaysSinceInclusion <-
data.frame("Variable"=resultsDaysSinceInclusion$Variable,
"Beta"=format(resultsDaysSinceInclusion$beta, digits=2),
"95% conf. interval"= paste(format(resultsDaysSinceInclusion$betaCILower, digits=2),
"to",
format(resultsDaysSinceInclusion$betaCIUpper, digits=2)),
"P Value"=format(resultsDaysSinceInclusion$betaPValue, digits=2), check.names=FALSE)
}
return(list( intercept=resultsIntercept,
printableIntercept=printableResultsIntercept,
coVariate1st=resultscoVariate1st,
printablecoVariate1st=printableResultsCoVariate1st,
coVariate1stComparison=coVariate1stComparison,
measurementVar=resultsMeasurementVar,
printablemeasurementVar=printableResultsMeasurementVar,
measurementVarComparison=measurementVarComparison,
daysSinceInclusion=resultsDaysSinceInclusion,
printabledaysSinceInclusion=printableResultsDaysSinceInclusion))
}
calculateDaysSinceInclusion <- function (data,
subjectIDVar,
dateVar) {
# find the day of inclusion in the study for each person
uniquePeople <- as.data.frame(unique(data[subjectIDVar]))
colnames(uniquePeople) <- subjectIDVar
for (person in uniquePeople[,1]) {
subset <- as.Date(data[which(data[subjectIDVar]==person), dateVar], format="%d.%m.%Y")
uniquePeople[which(uniquePeople[subjectIDVar]==person), "minDate"] <- min(subset)
data[which(data[subjectIDVar]==person), "minDate"] <- min(subset)
}
# data$minDate <- as.Date(data$minDate, format="%Y-%m-%d")
data$daysSinceInclusion <- as.numeric(as.Date(data[,dateVar], format="%d.%m.%Y") - data$minDate) # save as numeric for melt() to work
# TODO: something odd here - why did this even work? the two dates are in different formats
# it should be: data$minDate <- as.Date(data$minDate, format="%d.%m.%Y")
# and also: as.Date(data[,dateVar], format="%d.%m.%Y")
# as.Date(data[,dateVar], format="%d.%m.%Y") - as.Date(data$minDate, format="%d.%m.%Y")
return(data)
}
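The computation above can be illustrated on a small hypothetical data set (`toy` below is invented for illustration only), parsing both dates with the same `"%d.%m.%Y"` format as the TODO note suggests:

```r
# Days since inclusion on toy data: the inclusion date is each
# subject's earliest date, and both dates are parsed with one format.
toy <- data.frame(ID   = c("A", "A", "B"),
                  Date = c("01.01.2020", "11.01.2020", "05.02.2020"),
                  stringsAsFactors = FALSE)
dates <- as.Date(toy$Date, format = "%d.%m.%Y")
minDate <- ave(as.numeric(dates), toy$ID, FUN = min)  # earliest date per subject
toy$daysSinceInclusion <- as.numeric(dates) - minDate
toy$daysSinceInclusion  # 0 10 0
```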
plotFixedEffectsofcoVariate1st <- function (calculatedStatistics,
coVariate1st,
coVariate1stReferenceValue,
treatasBinary,
variableOrder) {
graphTitle <- paste("Fixed effects of", coVariate1st)
# for binary response variable
if (treatasBinary==TRUE) {
xlabLabel <- "Odds ratios"
plot <- ggplot() +
  geom_errorbarh(data=calculatedStatistics,
                 mapping=aes(y=Variable, x=ORCIUpper, xmin=ORCIUpper, xmax=ORCILower),
                 height=0.2, size=1) +
  geom_point(data=calculatedStatistics,
             mapping=aes(y=Variable, x=OR), size=4, shape=21, fill="white") +
  geom_vline(xintercept=1)
if ("CovariateLevel" %in% colnames(calculatedStatistics)) {
  plot <- plot + facet_grid(CovariateLevel ~ .)
}
}
# for continuous response variable
if (treatasBinary==FALSE) {
xlabLabel <- "Slope coefficients"
plot <- ggplot() +
  geom_errorbarh(data=calculatedStatistics,
                 mapping=aes(y=Variable, x=betaCIUpper, xmin=betaCIUpper, xmax=betaCILower),
                 height=0.2, size=1) +
  geom_point(data=calculatedStatistics,
             mapping=aes(y=Variable, x=beta), size=4, shape=21, fill="white") +
  geom_vline(xintercept=0)
if ("CovariateLevel" %in% colnames(calculatedStatistics)) {
  plot <- plot + facet_grid(CovariateLevel ~ .)
}
}
plot <- plot + myTheme() + labs(title=graphTitle,
x= xlabLabel) + scale_y_discrete(limits=rev(variableOrder))
return(plot)
}
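One R subtlety used by the conditional faceting above: a layer can be appended with `+ if (cond) facet_grid(...)` because an `if` without an `else` evaluates to `NULL` when its condition is FALSE, and ggplot silently ignores `+ NULL`. A standalone base-R check of that behavior (no ggplot2 needed):

```r
# An `if` expression with no `else` branch returns NULL (invisibly)
# when the condition is FALSE; with a TRUE condition it returns the
# value of its body. This is what makes `plot + if (cond) layer`
# legal R, although adding the layer in a separate statement is clearer.
res_true  <- if (TRUE) "layer"
res_false <- if (FALSE) "layer"
res_true           # "layer"
is.null(res_false) # TRUE
```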
plotFixedEffectsofMeasurementVar <- function (calculatedStatistics,
measurementVar,
treatasBinary,
variableOrder) {
graphTitle <- paste("Fixed effects of", measurementVar)
# for binary response variable
if (treatasBinary==TRUE) {
xlabLabel <- "Odds ratios"
plot <- ggplot(data=calculatedStatistics) +
geom_errorbarh(data=calculatedStatistics,
mapping=aes(y=Variable, x=ORCIUpper, xmin=ORCIUpper, xmax=ORCILower),
height=0.2, size=1) +
geom_point(data=calculatedStatistics,
mapping=aes(y=Variable, x=OR), size=4, shape=21, fill="white") +
geom_vline(xintercept=1) +
facet_grid(Measurement ~ .)
}
# for continuous response variable
if (treatasBinary==FALSE) {
xlabLabel <- "Slope coefficients"
plot <- ggplot(data=calculatedStatistics) +
geom_errorbarh(data=calculatedStatistics,
mapping=aes(y=Variable, x=betaCIUpper, xmin=betaCIUpper, xmax=betaCILower),
height=0.2, size=1) +
geom_point(data=calculatedStatistics,
mapping=aes(y=Variable, x=beta), size=4, shape=21, fill="white") +
geom_vline(xintercept=0) +
facet_grid(Measurement ~ .)
}
plot <- plot + myTheme() + labs(title=graphTitle,
x= xlabLabel) + scale_y_discrete(limits=rev(variableOrder))
return(plot)
}
plotFixedEffectsofDaysSinceInclusion <- function (calculatedStatistics,
treatasBinary,
variableOrder) {
graphTitle <- paste("Fixed effects of days since inclusion in the study")
# for binary response variable
if (treatasBinary==TRUE) {
xlabLabel <- "Odds ratios"
plot <- ggplot() +
geom_errorbarh(data=calculatedStatistics,
mapping=aes(y=Variable, x=ORCIUpper, xmin=ORCIUpper, xmax=ORCILower),
height=0.2, size=1) +
geom_point(data=calculatedStatistics,
mapping=aes(y=Variable, x=OR), size=4, shape=21, fill="white") +
geom_vline(xintercept=1)
}
# for continuous response variable
if (treatasBinary==FALSE) {
xlabLabel <- "Slope coefficients"
plot <- ggplot() +
geom_errorbarh(data=calculatedStatistics,
mapping=aes(y=Variable, x=betaCIUpper, xmin=betaCIUpper, xmax=betaCILower),
height=0.2, size=1) +
geom_point(data=calculatedStatistics,
mapping=aes(y=Variable, x=beta), size=4, shape=21, fill="white") +
geom_vline(xintercept=0)
}
plot <- plot + myTheme() + labs(title=graphTitle,
x= xlabLabel) + scale_y_discrete(limits=rev(variableOrder))
return(plot)
}
# helper function to resort printable data frames to keep Variable levels together
resortbyVariables <- function(dataframe, selectedSymptoms) {
variableOrder <- match(dataframe[,"Variable"], selectedSymptoms)
evaluationOrder <- dataframe[,"Levels"]
totalOrder <- order(variableOrder, evaluationOrder)
return(dataframe[totalOrder,])
}<file_sep>/R/PlotSymptomsTabLogistf.R
# #' @title OBSOLETE Plot logistf
# #'
# #' @description TODO write description
# #'
# #' @param TODO TODO write instructions
# plotLogistf <- function (data,
# data.yn,
# measurement,
# measurementSelectedlogistf,
# logistfIDVar,
# selectedSymptoms,
# numSymptoms) {
#
#
# ########### plot of the results
# my.data.symptoms.yn=data.yn[measurement==measurementSelectedlogistf,]
#
# #which variable is used in the model
# my.var=data[measurement==measurementSelectedlogistf, logistfIDVar]
#
# #fix this problem: if a variable is selected and it has just one value - like in our example Response at t=0, the program freezes
# if(length(unique(my.var))==1) return()
#
# my.mod.firth=vector("list", numSymptoms)
#
# #estimate the logistics model with Firth correction for each symptom
# for(i in 1:numSymptoms){
# my.mod.firth[[i]]=logistf(my.data.symptoms.yn[,i]~ my.var, family="binomial")
# }
#
# linch <- max(strwidth(selectedSymptoms, "inch")+0.4, na.rm = TRUE)
# par(mai=c(1.02,linch,0.82,0.42))
#
# #number of levels of the variable
# num.levels=nrow(my.mod.firth[[1]]$coef)-1
#
# par(mfrow=c(max(num.levels,1), 1))
#
# for(i in 1:max(num.levels, 1)){
# OR.b.0=matrix(unlist(lapply(my.mod.firth,
# function(x) exp(cbind(x$coef, x$ci.lower, x$ci.upper)[i+1,]))),
# ncol=3, byrow=3)
# dimnames(OR.b.0)[[1]]=selectedSymptoms
#
# plot(1:numSymptoms,
# ylim=c(0, numSymptoms),
# xlim=c(max(min(OR.b.0)-0.1, 0), max(OR.b.0)),
# type="n",
# axes=FALSE,
# xlab="OR",
# ylab="")
#
# segments(OR.b.0[,2], c(1:(numSymptoms)), OR.b.0[,3], c(1:(numSymptoms)))
# points(OR.b.0[,1],c(1:(numSymptoms)))
# axis(2, at=c(1:(numSymptoms))+.17, labels=selectedSymptoms, las=2)
# axis(1)
# abline(v=1, lwd=2)
#
# #string that expresses the levels of the categorical variables being compared
#
# my.level.string=ifelse(is.numeric(my.var), "", paste0(levels(my.var)[i+1], " versus ", levels(my.var)[1], " (reference)"))
#
# title(paste0("T = ", measurementSelectedlogistf, ";\n Odds ratios and 95% confidence intervals\n", my.level.string))
# ##### fix the title, to express what the OR represents - in case of categorical variables or more than 1 level
# }
# }
#' @title Logistf data in tabular format
#'
#' @description TODO
#'
#' @param TODO
tableLogistf <- function (data,
measurementVar,
selectedMeasurement,
covariate,
selectedSymptoms,
thresholdValue) {
# subset the data, using only the selected evaluation
data <- data[data[,measurementVar]==selectedMeasurement,]
#data <- data[measurementValues==selectedMeasurement,]
# binarize the data
data[, selectedSymptoms] <- ifelse(data[, selectedSymptoms]>thresholdValue, 1, 0)
table <- data.frame("Variable"=selectedSymptoms) # table of printable results - Fixed effect
table2 <- data.frame("Variable"=selectedSymptoms) # table of raw results
table3 <- data.frame("Variable"=selectedSymptoms) # table of printable results - Intercept
# check if covariate is binary and generate text which levels we are comparing
if (determineTypeofVariable(data[,covariate])[["nLevels"]]=="binary") { # binary var
levels <- levels(as.factor(data[,covariate])) # levels of the covariate
oddsFor <- paste(levels[2],"vs",levels[1]) # text describing which variables were compared
}
if (determineTypeofVariable(data[,covariate])[["nLevels"]]=="multilevel" & # numerical var w multiple levels
(determineTypeofVariable(data[,covariate])[["type"]]=="integer" |
determineTypeofVariable(data[,covariate])[["type"]]=="numeric")) { # group | explicitly: & binds more tightly than |
#levels <- levels(as.factor(data[,covariate])) # levels of the covariate
oddsFor <- paste("unit difference in", covariate) # text describing which variables were compared
}
for (symptom in selectedSymptoms) {
model <- logistf(data[,symptom] ~ data[,covariate], family="binomial")
table[table["Variable"]==symptom, "Odds ratio"] <-
format(exp(model$coef[2]), digits=2)
table[table["Variable"]==symptom, "95% conf. interval"] <-
paste(format(exp(model$ci.lower[2]), digits=2),
" to ",
format(exp(model$ci.upper[2]), digits=2))
table2[table2["Variable"]==symptom, "OR"] <- exp(model$coef[2])
table2[table2["Variable"]==symptom, "CILower"] <- exp(model$ci.lower[2])
table2[table2["Variable"]==symptom, "CIUpper"] <- exp(model$ci.upper[2])
table[table["Variable"]==symptom, "P value"] <- format(model$prob[2], digits=2)
# results for intercept
table3[table3["Variable"]==symptom, "Odds (intercept)"] <-
format(exp(model$coef[1]), digits=2)
table3[table3["Variable"]==symptom, "95% conf. interval"] <-
paste(format(exp(model$ci.lower[1]), digits=2),
" to ",
format(exp(model$ci.upper[1]), digits=2))
table3[table3["Variable"]==symptom, "P value"] <- format(model$prob[1], digits=2)
}
return(list(printableResultsTable=table,
rawResultsTable=table2,
referenceValue=oddsFor,
printableInterceptTable=table3))
}<file_sep>/man/plotCI.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotCI}
\alias{plotCI}
\title{Plotting confidence intervals}
\usage{
plotCI(data.yn, measurements, selectedSymptoms, selectedProportion)
}
\arguments{
\item{ddd}{dddd}
}
\description{
Plotting confidence intervals
}
<file_sep>/R/PlotSymptomsPresencePlot.R
plotPresenceofSymptoms <- function (data, # dataFiltered()
selectedSymptoms, # input$selectedSymptoms
measurementVar, # input$measurementVar
measurement, # input$selectedMeasurementForPresencePlot
thresholdValue=0 # input$thresholdValue
) {
#browser()
# filter - use only data for selected measurement occasion
data <- data[data[measurementVar]==measurement,]
# binarize the symptoms data
data[,selectedSymptoms] <- ifelse(data[,selectedSymptoms]>thresholdValue, 1, 0)
calculatedData <- t(apply(data[,selectedSymptoms],
2, # by column
function(x) {
result=prop.test(sum(x==1, na.rm=TRUE), length(x))
# browser()
return(c( "PresentIn"=sum(x==1, na.rm=T),
"Proportion"=result$estimate[[1]],
"CILower"=result$conf.int[1],
"CIUpper"=result$conf.int[2]))
}))
calculatedData <- as.data.frame(calculatedData)
calculatedData[,"Variable"] <- row.names(calculatedData)
# for.CI=t(apply(data.yn[measurements==selectedProportion,], 2, function(x) {a=prop.test(sum(x==1, na.rm=T), length(x))
# return( c(sum(x==1, na.rm=T), length(x), a$estimate, a$conf.int ))
# }
# )
# )
plot <- ggplot() +
geom_errorbarh(data=calculatedData,
mapping=aes(y=Variable, x=CIUpper, xmin=CILower, xmax=CIUpper),
height=0.2, size=1) +
geom_point(data=calculatedData,
mapping=aes(y=Variable, x=Proportion), size=4, shape=21, fill="white") +
# facet_grid(Variable~.) +
scale_x_continuous(limits=c(0,1),
breaks=seq(0,1,by=0.2),
labels=abs(seq(0,1,by=0.2)),
minor_breaks=((seq(0,1, by=0.1)))) +
# geom_vline(xintercept=0)+
myTheme() + labs(title="Proportions of positive variables \n(with 95% confidence intervals)",
x= "Proportions") +
scale_y_discrete(limits=rev(selectedSymptoms))
# +
# opts(title="geom_errorbarh", plot.title=theme_text(size=40, vjust=1.5))
return(plot)
}
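The proportions and confidence intervals in the plot come from `stats::prop.test()`. A standalone sketch (hypothetical binary vector, not package data) of the per-column computation used above:

```r
# Per-variable summary as computed inside plotPresenceofSymptoms():
# count of positives, estimated proportion, and its 95% CI.
# Note that length(x) counts NA values in the denominator, matching
# the apply() call in the function above.
x <- c(1, 0, 1, 1, NA, 0, 1, 1)   # hypothetical binarized symptom column
result <- prop.test(sum(x == 1, na.rm = TRUE), length(x))
c(PresentIn  = sum(x == 1, na.rm = TRUE),
  Proportion = result$estimate[[1]],   # 5/8 = 0.625
  CILower    = result$conf.int[1],
  CIUpper    = result$conf.int[2])
```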
# x <- plotPresenceofSymptoms(data, # dataFiltered()
# measurementVar="Measurement", # input$measurementVar
# measurement=0, # input$selectedMeasurementForPresencePlot
# thresholdValue=0, # input$thresholdValue
# selectedSymptoms=c("Fatigue", "Malaise")
# )
<file_sep>/man/plotSymptomsTimeline.Rd
\name{plotSymptomsTimeline}
\alias{plotSymptomsTimeline}
\title{Plot timeline of symptom presence}
\usage{
plotSymptomsTimeline(data, date, personID, measurement, symptoms,
displayFormat = "dates")
}
\arguments{
\item{data}{The data frame used by the plot.}
\item{date}{Name of variable containing dates.}
\item{personID}{Name of variable containing person ID.}
\item{measurement}{Name of variable containing measuring
occasion info.}
\item{symptoms}{Vector of variable names representing
measured symptoms. The data are reshaped into a format suitable
for ggplot() (see melt()). Returns a ggplot object that
has to be plotted via print().}
}
\description{
Function that plots symptom severity presence for patients
at a certain time.
}
<file_sep>/man/tableLogistf.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{tableLogistf}
\alias{tableLogistf}
\title{Logistf data in tabular format}
\usage{
tableLogistf(data, measurementVar, selectedMeasurement, covariate,
selectedSymptoms, thresholdValue)
}
\arguments{
\item{TODO}{TODO write instructions}
}
\description{
TODO write description
}
<file_sep>/man/drawLineBelow1stLevel.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{drawLineBelow1stLevel}
\alias{drawLineBelow1stLevel}
\title{Draws lines below 1st level groups}
\usage{
drawLineBelow1stLevel(data, firstLevel, TYPE.LEVELS, lastRowforType, LINE.SIZE)
}
\description{
Not meant to be called by the user.
}
<file_sep>/man/plot.rcs.mod.Rd
\name{plot.rcs.mod}
\alias{plot.rcs.mod}
\title{Plot RCS model (TODO: help contents)}
\usage{
\method{plot}{rcs.mod}(My.mod, my.var, my.ylab = "Response",
my.xlab = "Covariate", my.xlim = NULL, my.ylim = NULL,
my.knots = NULL, my.title = NULL, vlines = NULL, hlines = NULL,
my.cex.lab = NULL, my.axes = TRUE, my.cex.main = NULL,
my.cex.axis = NULL)
}
\arguments{
\item{My.mod}{???}
}
\description{
Plot RCS model (TODO: help contents)
}
<file_sep>/man/plotPropWithSymptoms.Rd
\name{plotPropWithSymptoms}
\alias{plotPropWithSymptoms}
\title{Plot proportions with symptoms}
\usage{
plotPropWithSymptoms(data, grouping = "Sex", measurements = "Measurement",
symptomsNames)
}
\arguments{
\item{data}{Data to be passed to the function as a data
frame.}
\item{grouping}{The column name of the binary grouping
variable used to define the groups on the plot.}
\item{measurements}{The column name of the variable
containing measurement occasions.}
\item{symptomsNames}{A vector of column names containing
the symptom severity.}
}
\description{
Function that plots proportion of subjects that have a
certain symptom present.
}
<file_sep>/man/downloadPlot.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{downloadPlot}
\alias{downloadPlot}
\title{Code to generate a downloadable file containing a plot (downloadHandler) to download using a button.}
\usage{
downloadPlot(filename = "plot.eps", plotFunction, width, height,
print = FALSE)
}
\arguments{
\item{filename}{Suggested file name.}
\item{plotFunction}{The function that plots the graph (can be reactive).}
\item{width}{Width of the plot in pixels.}
\item{height}{Height of plot in pixels.}
\item{print}{Whether to use the print() function. Needed with ggplot objects but not with base plots.}
}
\description{
Code to generate a downloadable file containing a plot (downloadHandler) to download using a button.
}
<file_sep>/man/drawLevelLabels.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{drawLevelLabels}
\alias{drawLevelLabels}
\title{Draw labels and lines for groups}
\usage{
drawLevelLabels(data, TYPE.LEVELS, LINE.SIZE, DIAGNOSIS.LEVELS, firstLevel,
secondLevel)
}
\description{
Function that plots labels for both possible groups describing
subjects.
}
\details{
The function is meant to be called by the general function for plotting,
when it needs to draw the labels for the groups.
}
<file_sep>/man/sortData.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{sortData}
\alias{sortData}
\title{Sort results data}
\usage{
sortData(data, sortMethod = "BEA", nUNITS, DATES.COLUMN.FIRST,
DATES.COLUMN.LAST, TEST.RESULT.LEVELS)
}
\description{
Function that sorts the data according to a criterion. Not to
be called directly by the user.
}
\details{
Function is called by the plotTests function, to sort the data before
it starts working on it. Different methods of sorting can be used.
"DateIn" sorts by the date of admission.
"BEA" uses the seriation package's Bond Energy Algorithm.
"BEA_TSP" as above, with TSP used to optimize the measure of effectiveness.
"PCA" uses the seriation package's first principal component algorithm.
}
<file_sep>/R/PlotSymptomsValueswithCIsPlot.R
plotValueswithCIs <- function (data,
variableName="Variable",
valueName="OR",
CILowerName="CILower",
CIUpperName="CIUpper",
xLabel,
yLabel,
graphTitle,
vLine=NULL,
variableOrder=NULL) {
plot <- ggplot() +
geom_errorbarh(data=data,
mapping=aes_string(y=variableName, x=CIUpperName, xmin=CIUpperName, xmax=CILowerName),
height=0.2, size=1) +
geom_point(data=data,
mapping=aes_string(y=variableName, x=valueName),
size=4, shape=21, fill="white") +
#theme_bw() +
myTheme() +
labs(title=graphTitle, x= xLabel, y=yLabel) +
geom_vline(xintercept = vLine) +
scale_y_discrete(limits=rev(variableOrder))
return(plot)
}
#
# # define a ggplot2 theme
# myTheme <- theme_bw() + theme(
# text = element_text(size=18)
# )
#
# # test plot
# plotValueswithCIs(data=my.data,
# variableName="Headache",
# valueName="Concentration",
# CILowerName="Insomnia",
# CIUpperName="Nausea",
# xLabel="Test X label",
# yLabel="Test Y label",
# graphTitle="Test title",
# vLine=NULL)
<file_sep>/man/tabelizeRCS.Rd
\name{tabelizeRCS}
\alias{tabelizeRCS}
\title{Restricted cubic spline P values in tabular form}
\usage{
tabelizeRCS(data.all, data.yn, measurement, selectedSymptoms,
measurementSelectedrcs, rcsIDVar, binaryVar = TRUE)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/man/plotDistributionBoxplot.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotDistributionBoxplot}
\alias{plotDistributionBoxplot}
\title{Plots boxplots of symptoms.}
\usage{
plotDistributionBoxplot(data, data.yn, selectedSymptoms, selectedProportion,
measurements, posOnly = FALSE, threshold)
}
\arguments{
\item{data}{Data for the boxplots.}
\item{selectedSymptoms}{Column names of symptoms in data frame.}
\item{selectedProportion}{User selected measurement time.}
\item{measurements}{Vector of measurement values.}
\item{posOnly}{TRUE/FALSE}
\item{threshold}{User selected threshold value of symptoms.}
}
\description{
Plots boxplots of symptoms.
}
<file_sep>/man/tabelizeBoxplots.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{tabelizeBoxplots}
\alias{tabelizeBoxplots}
\title{Boxplots data in form of a table}
\usage{
tabelizeBoxplots(measurements, measurementVar, data, selectedSymptoms)
}
\arguments{
\item{TODO}{}
}
\description{
TODO
}
<file_sep>/man/plotClustersofPatients.Rd
\name{plotClustersofPatients}
\alias{plotClustersofPatients}
\title{Plot clusters of patients based on symptoms.}
\usage{
plotClustersofPatients()
}
\arguments{
\item{ddd}{dddd}
}
\description{
Visualize clusters of patients based on symptoms and other
arguments.
}
<file_sep>/R/PlotSymptomsLinearModel.R
tableLinear <- function(data,
measurementVar,
selectedMeasurement,
covariate,
selectedSymptoms#,
#thresholdValue
) {
# subset the data, using only the selected evaluation
data <- data[data[,measurementVar]==selectedMeasurement, ]
# # binarize the data
# data[, selectedSymptoms] <- ifelse(data[, selectedSymptoms]>thresholdValue, 1, 0)
table <- data.frame("Variable"=selectedSymptoms) # table of printable results - Fixed effect
table2 <- data.frame("Variable"=selectedSymptoms) # table of raw results
table3 <- data.frame("Variable"=selectedSymptoms) # table of printable results - Intercept
# # check if covariate is binary and generate text which levels we are comparing
# if (determineTypeofVariable(data[,covariate])[["nLevels"]]=="binary") { # binary var
# levels <- levels(as.factor(data[,covariate])) # levels of the covariate
# oddsFor <- paste(levels[2],"vs",levels[1]) # text describing which variables were compared
# }
#
# if (determineTypeofVariable(data[,covariate])[["nLevels"]]=="multilevel" & # numerical var w multi levels
# ( determineTypeofVariable(data[,covariate])[["type"]]=="integer")
# | determineTypeofVariable(data[,covariate])[["type"]]=="numeric") {
# #levels <- levels(as.factor(data[,covariate])) # levels of the covariate
# oddsFor <- paste("unit difference in", covariate) # text describing which variables were compared
# }
#
for (symptom in selectedSymptoms) {
model <- glm(data[,symptom] ~ data[,covariate], family=gaussian())
table[table["Variable"]==symptom, "Beta (slope)"] <-
format(model$coef[2], digits=2)
table[table["Variable"]==symptom, "95% conf. interval"] <-
paste(format((confint(model)[2,1]), digits=2),
" to ",
format((confint(model)[2,2]), digits=2))
table2[table2["Variable"]==symptom, "beta"] <- (model$coef[2])
table2[table2["Variable"]==symptom, "CILower"] <- (confint(model)[2,1])
table2[table2["Variable"]==symptom, "CIUpper"] <- (confint(model)[2,2])
table[table["Variable"]==symptom, "P value"] <-
format(summary(model)$coefficients[2,"Pr(>|t|)"],digits=2)
# printable table of intercepts
table3[table3["Variable"]==symptom, "Intercept"] <-
format(model$coef[1], digits=2)
table3[table3["Variable"]==symptom, "95% conf. interval"] <-
paste(format((confint(model)[1,1]), digits=2),
" to ",
format((confint(model)[1,2]), digits=2))
table3[table3["Variable"]==symptom, "P value"] <-
format(summary(model)$coefficients[1,"Pr(>|t|)"],digits=2)
}
return(list(printableResultsTable=table,
rawResultsTable=table2,
printableInterceptTable=table3))
}
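The per-symptom model fitting in tableLinear() can be sketched on the built-in mtcars data; the extraction of the slope, its confidence interval, and p-value mirrors the loop above (note confint() on a glm profiles via the MASS package):

```r
# Standalone sketch of the extraction done per symptom in tableLinear(),
# using built-in data instead of the app's data frame.
model <- glm(mtcars$mpg ~ mtcars$wt, family = gaussian())
beta  <- unname(model$coef[2])                      # slope estimate
ci    <- confint(model)[2, ]                        # 95% CI for the slope
pval  <- summary(model)$coefficients[2, "Pr(>|t|)"] # two-sided p-value
round(c(beta = beta, ci), 2)
```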
# head(mtcars)
# model <- glm(mtcars$mpg~mtcars$hp, family=gaussian)
# str(model)
# summary(model)
# confint(model)
# coef(model)
# model$coef[2]
<file_sep>/man/checkifCombinationExists.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{checkifCombinationExists}
\alias{checkifCombinationExists}
\title{Checks if combination of groups exists}
\usage{
checkifCombinationExists(data, firstLevel, secondLevel)
}
\description{
Checks if a combination of groups exists. Not meant
to be called by the user.
}
\details{
Checks for the existence of a combination of first and second level groups
in the data.
}
<file_sep>/R/PlotSymptomsGenerateDescriptions.R
generateDescription <- function(outputName) {
# GraphExpl_ProfilePlots_AllSubjects ####
if (outputName=="GraphExpl_ProfilePlots_AllSubjects") {
description <- renderText({
paste(
"Data are displayed using profile plots (also known as spaghetti plots).
Profile plots are scatterplots displaying the evaluation occasions and the values of the variables where the values from the same subject are connected. The larger red dots display the median values. Each outcome is displayed in a separate graph.<br><br>
Profile plots are useful for the identification of trends and to display individual changes. When many subjects are plotted together or the possible number of values of the variable is limited, the profiles might overlap obscuring the trends. You might improve your graphs by displaying only a subset of the subjects (select: 'Random selection of the subjects on one graph' as the type of graph to plot) or by plotting multiple graphs for each outcome (select: 'Multiple graphs per outcome variable'). You could also try plotting Lasagna plots, as they might be more informative.<br><br>
More descriptive statistics are provided in the Summary tabs.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.<br><br>
References: <br>
Weiss RE (2005) Modeling Longitudinal Data: With 72 Figures. Springer.")
})}
# GraphExpl_ProfilePlots_RandomSubjects ####
if (outputName=="GraphExpl_ProfilePlots_RandomSubjects") {
description <- renderText({
paste(
"Data were displayed using profile plots (also known as spaghetti plots).
Profile plots are scatterplots displaying the evaluation occasions and the values of the variables where the values from the same subject are connected. The larger red dots display the median values. Each outcome is displayed in a separate graph.<br><br>
Profile plots are useful for the identification of trends and to display individual changes. When many subjects are plotted together or the possible number of values of the variable is limited, the profiles might overlap obscuring the trends. You might improve your graphs by displaying only a subset of the subjects (select: 'Random selection of the subjects on one graph' as the type of graph to plot) or by plotting multiple graphs for each outcome (select: 'Multiple graphs per outcome variable'). You could also try plotting Lasagna plots, as they might be more informative.<br><br>
More descriptive statistics are provided in the Summary tabs.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.<br><br>
References: <br>
Weiss RE (2005) Modeling Longitudinal Data: With 72 Figures. Springer.")
})}
# GraphExpl_ProfilePlots_MultipleGraphs ####
if (outputName=="GraphExpl_ProfilePlots_MultipleGraphs") {
description <- renderText({
paste(
"Data were displayed using profile plots (also known as spaghetti plots).
Profile plots are scatterplots displaying the evaluation occasions on the x-axis and the values of the outcome variables on the y-axis. The values from the same subject are connected. The larger red dots indicate the median values. Each outcome is displayed in a separate graph.<br><br>
Profile plots are useful for the identification of trends and to display individual changes. When many subjects are plotted together or the possible number of values of the variable is limited, the profiles might overlap obscuring the trends. You might improve your graphs by displaying only a subset of the subjects (select: 'Random selection of the subjects on one graph' as the type of graph to plot) or by plotting multiple graphs for each outcome (select: 'Multiple graphs per outcome variable'). You could also try plotting Lasagna plots, as they might be more informative.<br><br>
More descriptive statistics are provided in the Summary tabs.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.<br><br>
References: <br>
Weiss RE (2005) Modeling Longitudinal Data: With 72 Figures. Springer.")
})}
# GraphExpl_LasagnaPlots ####
if (outputName=="GraphExpl_LasagnaPlots") {
description <- renderText({
paste(
"Data were displayed using heat maps (also known as lasagna plots).
In lasagna plots the evaluation times are reported horizontally and the measurements of each subject appear in the same row. Colors are used to display the value of the variables. In our implementation the subjects are arranged using a hierarchical clustering algorithm (with Euclidean distance and complete linkage agglomeration method). The rearrangement of the subjects is useful for data exploration because similar subjects are grouped together. Missing values are displayed with white color.<br><br>
More descriptive statistics are provided in the Summary tabs. <br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.<br><br>
References: <br>
<NAME>, <NAME>, <NAME>, <NAME>, <NAME>, et al. (2010) Lasagna plots: a saucy alternative to spaghetti plots. Epidemiology 21: 621-625."
)})}
# GraphExpl_BoxPlots ####
if (outputName=="GraphExpl_BoxPlots") {
description <- renderText({
paste(
"Data were visualized using box and whisker plots (also known as boxplots). The boxplots represent the first quartile (i.e., lower edge of the box), median (i.e., bar inside the box) and third quartile (i.e., upper edge of the box). Vertical lines extend to the most extreme values or to a distance of 1.5 times the interquartile range (IQR) from the upper or lower quartile. The points (if any) plotted individually as dots are those that are at a greater distance from the quartiles than 1.5 times the IQR. <br><br>
Boxplots do not display individual changes or the presence of missing values. Profile plots and heat maps (lasagna plots) might be used if the aim is to visualize individual changes over time.<br><br>
More descriptive statistics are provided in the Summary tabs.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
# GraphExpl_Timeline ####
if (outputName=="GraphExpl_Timeline") {
description <- renderText({
paste(
"Data were visualized using a timeline plot. Timeline plots display the measurement occasion (date, evaluation occasion or day from inclusion) on the horizontal axis while the measurements of each subject appear at the same height; the values of the outcomes are displayed using dots of different sizes (bubbles). Values of zero are represented using small triangles. The colors denote the outcomes. All the selected outcomes and subjects are displayed in the same graph. <br><br>
Timeline plots are useful especially for small data sets, where the individual patterns are easy to follow or for displaying a small number of outcomes (you can use the menu to de-select the outcomes you are not interested in).<br><br>
More descriptive statistics are provided in the Summary tabs.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)
})}
# GraphExpl_Barplots ####
if (outputName=="GraphExpl_Barplots") {
description <- renderText({
paste(
"Data were visualized using barplots that display the proportion of patients with positive outcomes at a given evaluation occasion. <br><br>
Please note that in medplot we define as positive outcomes of binary variables those coded with the largest value (1s if the 0/1 coding was used, Yes if the Yes/No coding is used (alphabetical order), etc.).
You might want to display the proportions with their 95% confidence intervals, which are more informative. These graphs are available in the Summary tab. <br><br>
More descriptive statistics are provided in the Summary tabs.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
# Summary_Medians ####
if (outputName=="Summary_Medians") {
description <- renderText({
paste(
"The table displays the estimated medians and interquartile ranges (IQR) for the outcome variables. 95% confidence intervals (95% CI) are reported for the medians, based on the percentile bootstrap with 2000 iterations. The table also reports the 25th and 75th percentiles (also known as the first and third quartile, which are the lower and upper limits of the IQR) and the number of missing values (#NA) for each outcome variable.<br><br>
The graph shows the estimated median values for the outcome variables along with their 95% confidence intervals.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
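# Illustrative sketch (kept as a comment so it is not executed inside this
# function): the percentile bootstrap CI for a median described in the help
# text above could be computed with the 'boot' package; 'x' is a hypothetical
# numeric vector of outcome values.
#   library(boot)
#   b <- boot(x, statistic=function(d, i) median(d[i], na.rm=TRUE), R=2000)
#   boot.ci(b, type="perc") # percentile bootstrap 95% CI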
# Summary_Proportions ####
if (outputName=="Summary_Proportions") {
description <- renderText({
paste(
"
Please note that in medplot we define as positive outcomes of binary variables those coded with the largest value (1s if the 0/1 coding was used, Yes if the Yes/No coding is used (alphabetical order), etc.). <br><br>
The table displays the frequencies (Positive) and estimated proportions (Proportion) of subjects with positive values of outcome variables. 95% confidence intervals are reported for proportions (95% CI for proportion), based on the exact binomial method. The number of missing values (#NAs) for each outcome variable is reported.<br><br>
The graph shows the overall estimated proportion of subjects with positive value of the outcome variables for the selected evaluation occasion, along with their 95% confidence intervals.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
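# Illustrative sketch (comment only, not executed by the app): the exact
# binomial CI described in the help text above is available via binom.test();
# 'x' is a hypothetical 0/1 vector of outcome values.
#   binom.test(sum(x == 1, na.rm=TRUE), sum(!is.na(x)))$conf.int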
# SummaryGrouping_Proportions ####
if (outputName=="SummaryGrouping_Proportions") {
description <- renderText({
paste(
"
In medplot outputs the largest value of a binary outcome variable is defined as Positive (i.e., the positive value is 1 with a 0/1 coding, Yes with a Yes/No coding, etc).<br><br>
The subjects were divided into two groups, defined by the values observed for the selected grouping variable.<br><br>
The graph shows the estimated proportion of subjects with positive values of outcome variables together with their 95% confidence intervals. All evaluation occasions are represented for each outcome variable. Both groups are represented on the graph - one to the left and the other to the right of the zero value on the horizontal axis.<br><br>
The table displays the estimated proportion of subjects in a certain group, the P value for the difference of proportions and the 95% confidence interval for the difference of proportions. The groups are compared using the chi-squared test with continuity correction. Data with missing values for the grouping variable are removed from the analysis.<br><br>
Adjusted P values and False discovery rates (Q values) taking into account multiple comparisons are calculated and displayed only if the option 'Calculate P value adjustments?' was selected.
Adjusted P values are based on the Holm-Bonferroni method (which is conservative and lacks statistical power if the outcomes are correlated) or on a multivariate permutation based adjustment (which takes into account the correlation between outcomes and is generally more statistically powerful than Holm-Bonferroni). Q values are evaluated using the Benjamini-Hochberg (which assumes independent or positively dependent outcomes) or Benjamini-Hochberg-Yekutieli procedure (which makes no assumptions about outcome dependence but is more conservative). Q values represent the minimum false discovery rate at which the test may be called significant.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.<br><br>
References:<br>
Westfall PH YS (1993) Resampling-Based Multiple Testing. Wiley New York.<br>
<NAME>, <NAME> (1995) Controlling the false discovery rate: a practical and powerful approach to multiple testing. Journal of the Royal Statistical Society Series B (Methodological) : 289-300.<br>
<NAME>, <NAME> (2001) The control of the false discovery rate in multiple testing under dependency. Annals of statistics : 1165-1188."
)})}
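# Illustrative sketch (comment only, not executed by the app): the multiple
# comparison adjustments described in the help text above map onto
# stats::p.adjust(); 'p' is a hypothetical vector of raw P values.
#   p.adjust(p, method="holm") # Holm-Bonferroni adjusted P values
#   p.adjust(p, method="BH")   # Benjamini-Hochberg Q values
#   p.adjust(p, method="BY")   # Benjamini-Hochberg-Yekutieli Q values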
# SummaryGrouping_Medians ####
if (outputName=="SummaryGrouping_Medians") {
description <- renderText({
paste(
The subjects were divided into two groups, defined by the values observed for the selected grouping variable.<br><br>
The table displays the median and interquartile range (25th to 75th percentile) within each subgroup.
The subgroups were compared with the Mann-Whitney test (also known as the Wilcoxon rank sum test). The null hypothesis of the Mann-Whitney test is that the distributions of both groups are identical in the population. Data with missing values for the grouping variable were removed from the analysis.<br><br>
Adjusted P values and False discovery rates (Q values) taking into account multiple comparisons are calculated and displayed only if the option 'Calculate P value adjustments?' was selected.<br><br>
Adjusted P values are based on the Holm-Bonferroni method (which is conservative and lacks statistical power if the outcomes are correlated) or on a multivariate permutation based adjustment (which takes into account the correlation between outcomes and is generally more statistically powerful than Holm-Bonferroni). <br><br>
Q values are evaluated using the Benjamini-Hochberg (which assumes independent or positively dependent outcomes) or Benjamini-Hochberg-Yekutieli procedure (which makes no assumptions about outcome dependence but is more conservative). Q values represent the minimum false discovery rate at which the test may be called significant.<br><br>
References:<br>
Westfall PH YS (1993) Resampling-Based Multiple Testing. Wiley New York.<br>
<NAME>, <NAME> (1995) Controlling the false discovery rate: a practical and powerful approach to multiple testing. Journal of the Royal Statistical Society Series B (Methodological) : 289-300.<br>
<NAME>, <NAME> (2001) The control of the false discovery rate in multiple testing under dependency. Annals of statistics : 1165-1188."
)})}
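# Illustrative sketch (comment only, not executed by the app): the group
# comparison described in the help text above corresponds to wilcox.test();
# 'outcome', 'group' and 'd' are hypothetical names.
#   wilcox.test(outcome ~ group, data=d) # Mann-Whitney / Wilcoxon rank sum test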
# Clustering_Dendrogram ####
if (outputName=="Clustering_Dendrogram") {
description <- renderText({
paste(
"The dendrogram displays the similarity of outcome variables by hierarchically clustering them for the chosen evaluation occasion. The complete linkage method is used and the distance is based on correlation for numerical variables and on Euclidean distance for binary variables. <br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
# Clustering_Heatmap ####
if (outputName=="Clustering_Heatmap") {
description <- renderText({
paste(
"The heat map displays the complete data obtained at the chosen evaluation occasion. A column represents values for a particular subject, while the rows represent outcome variables. Subjects and outcome variables are arranged according to their similarities using hierarchical clustering with the complete linkage method and Euclidean distance. Values of outcomes are color coded.<br><br>
Additional variables can be selected to annotate the graph - their values are displayed in the top row.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.
"
)})}
# Clustering_Correlations ####
if (outputName=="Clustering_Correlations") {
description <- renderText({
paste(
"The heat map displays the correlations between outcome variables at the chosen evaluation occasion. Values of pairwise Spearman correlations between two outcome variables are also displayed numerically in each cell, and are color coded. Only complete observations are used. The outcome variables are arranged according to their similarities using hierarchical clustering with the complete linkage method and Euclidean distance.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
# RegressionOne_Linear ####
if (outputName=="RegressionOne_Linear") {
description <- renderText({
paste(
"
A separate linear regression model was estimated for each of the outcome variables; each model included the selected covariate and an intercept. Only the measurements obtained at the selected evaluation occasion were used.
The aim of the analyses was to estimate the association between the selected covariate and each of the numerical outcome variables at a given evaluation occasion. <br><br>
The graph displays the estimated regression coefficients (beta - slope) for the covariate, obtained for each of the estimated models, together with their 95% confidence intervals. <br><br>
The results in the tables display the estimated regression coefficients (beta - slope) and the intercepts for each of the models; the variable names appearing in the tables indicate which outcome variable was used. The coefficients represent the estimated change in the outcome for one unit change of the covariate; the intercept provides an estimate of the average value of the outcome, when the value of the covariate is equal to 0 (often the interpretation of the intercept is meaningless). The tables also display 95% confidence intervals for these parameters along with their P values (the null hypotheses are that the coefficients are equal to 0 in the population).<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
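# Illustrative sketch (comment only, not executed by the app): one of the
# per-outcome linear models described in the help text above; 'outcome',
# 'covariate' and 'd' are hypothetical names.
#   fit <- lm(outcome ~ covariate, data=d)
#   coef(fit)    # intercept and slope (beta)
#   confint(fit) # 95% confidence intervals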
# RegressionOne_OddsRatio ####
if (outputName=="RegressionOne_OddsRatio") {
description <- renderText({
paste(
"A separate logistic regression model was estimated for each of the outcome variables; each model included the selected covariate and an intercept. Only the measurements obtained at the selected evaluation occasion were used.
The aim of the analyses was to estimate the association between the selected covariate and each of the binary outcome variables at a given evaluation occasion. <br><br>
The graph displays the estimated odds ratios for the covariate, together with their 95% confidence intervals.<br><br>
The results in the tables display the estimated odds ratios for the covariate and the intercept, for each of the fitted models; the variable names appearing in the tables indicate which outcome variable was
used. For categorical covariates the odds ratios are expressed for a level of covariate compared to the reference level of that covariate (the reference level is the level coded with the smallest value).
For numerical covariates they represent the increase in odds of the outcome for one unit change of the covariate.
The odds intercept provides an estimate of the average odds of the outcome, when the value of the covariate is equal to 0 (often the interpretation of the intercept is meaningless).
The tables also display 95% confidence intervals for these parameters along with their P values (the null hypothesis being that each odds ratio is equal to 1 in the population). <br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button."
)})}
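# Illustrative sketch (comment only, not executed by the app): odds ratios
# like those described in the help text above can be obtained from a logistic
# model; 'outcome', 'covariate' and 'd' are hypothetical names.
#   fit <- glm(outcome ~ covariate, data=d, family=binomial)
#   exp(cbind(OR=coef(fit), confint(fit))) # odds ratios with 95% CIs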
# RegressionOne_Firth ####
if (outputName=="RegressionOne_Firth") {
description <- renderText({
paste(
"Logistic regression with the Firth correction was used to estimate the association between each of the outcome variables and the selected covariate, using the measurements obtained at the selected evaluation occasion.
A separate logistic regression model was estimated for each of the outcome variables; each model included the selected covariate and an intercept.<br><br>
The graph displays the odds ratios for the covariate, together with the 95% confidence intervals.<br><br>
The results in the tables display estimated odds ratios and odds intercepts for outcome variables estimated using logistic regression with the Firth correction, for each of the models.
The Firth correction is useful for small data sets or when the phenomenon of separation occurs (the positive and negative outcomes are perfectly separated by a covariate and the estimate of a parameter consequently diverges to infinity without this correction).
The variable names appearing in the tables indicate which outcome variable was used. For categorical covariates the odds ratios are expressed for a level of the covariate compared to the reference level of the covariate; for numerical covariates they represent the increase in odds of the outcome for one unit change of the covariate. The odds intercept provides an estimate of the average odds of the outcome, when the value of the covariate is equal to 0 (often the interpretation of the intercept is meaningless).
The tables also display 95% confidence intervals for these parameters along with their P values (the null hypothesis being that each odds ratio is equal to 1 in the population). <br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.<br><br>
References:<br>
<NAME>, <NAME> (2002) A solution to the problem of separation in logistic regression. Statistics in medicine 21: 2409-2419."
)})}
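# Illustrative sketch (comment only, not executed by the app): the
# Firth-corrected model described in the help text above matches the
# 'logistf' package; 'outcome', 'covariate' and 'd' are hypothetical names.
#   library(logistf)
#   fit <- logistf(outcome ~ covariate, data=d)
#   exp(coef(fit)) # odds ratios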
# RegressionOne_RCS ####
if (outputName=="RegressionOne_RCS") {
description <- renderText({
paste(
"The graphs represent the estimated association of the chosen covariate with each of the outcome variables, obtained using restricted cubic splines to flexibly model the association.
Non-linear associations might be apparent from the graphs. The table contains P values for the model (the null hypothesis being that the model terms are zero). <br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button. <br><br>
References: <br>
<NAME>, <NAME>, <NAME> (1988) Regression models in clinical studies: determining relationships between predictors and response. Journal of the National Cancer Institute 80: 1198-1202."
)})}
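# Illustrative sketch (comment only, not executed by the app): restricted
# cubic splines as described in the help text above can be fitted with the
# 'rms' package; 'outcome', 'covariate' and 'd' are hypothetical names.
#   library(rms)
#   fit <- ols(outcome ~ rcs(covariate, 3), data=d) # 3-knot restricted cubic spline
#   anova(fit) # P values for the spline terms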
# RegressionAll ####
if (outputName=="RegressionAll") {
description <- renderText({
paste(
"Linear regression was used to assess the association between each of the outcome variables and the selected covariate, using all the measurements taken over time (a separate model was fitted for each of the outcomes).
To take into account the multiple measurements repeated in each patient, the analyses were adjusted for the subject variable as a random effect (a mixed effect linear model was used). Optionally, the analyses can be adjusted for Evaluation occasion or Time from inclusion in the study.<br><br>
Choosing 'Outcome~Covariate+Subject(random effect)' will build models for studying the association of one covariate with the outcome, without adjusting the analysis for the other covariates. Choosing either 'Outcome~Covariate+Evaluation occasion+Subject(random effect)' or 'Outcome~Covariate+Time since inclusion+Subject(random effect)' will also evaluate time in the model. Evaluation occasions are modeled as categorical covariates while time since inclusion is modeled as a numerical covariate.<br><br>
Tables display the estimated intercepts for the models and the estimated regression coefficients - either odds ratios (OR) for binary outcomes or beta coefficients (beta - slope) for numerical and categorical outcomes. The 95% confidence intervals for the coefficients are listed (95% conf. interval), as well as the P values (the null hypothesis being that the coefficients do not differ from 0). <br><br>
The graphs display the estimated coefficients along with their 95% confidence intervals.<br><br>
You can copy the graph(s) by right clicking on them (selecting 'Copy Image' or 'Save Image as...') or download them as Postscript graphics by clicking the 'Download' button.<br><br>
References:<br>
<NAME>, <NAME> (2006) Data analysis using regression and multilevel/hierarchical models. Cambridge University Press."
)})}
##### Return value ######
return(description)
}
<file_sep>/R/RunApps.R
#' @title Function to load the medplot plotSymptoms shiny web app
#'
#' @description Loads the shiny app located in the default installation folder.
medplotOnline <- function(){
require("shiny")
runApp(file.path(path.package("medplot"), "shinyapp_symptoms2" ))
}
<file_sep>/R/PlotSymptomsTabDistribution_MeasurementOccasion.R
#' @title Plots of distribution of the symptoms
#'
#' @param data Data frame used for plotting.
#' @param selectedSymptoms Column names of the selected outcome variables.
#' @param selectedProportion The evaluation occasion used to calculate the proportions.
#' @param measurements Vector of evaluation occasion values.
plotDistribution <- function (data, selectedSymptoms, selectedProportion, measurements ) {
#adjust the margins for the labels of the boxplot
linch <- max(strwidth(selectedSymptoms, "inch")+0.4, na.rm = TRUE)
par(mai=c(1.02,linch,0.82,0.42))
#par(mfrow=c(1,2))
#calculate the proportion with symptoms and reorder (from the most common to the least common)
prop.with.symptoms=apply(data[measurements==selectedProportion, ,drop=FALSE], 2, function(x) mean(x==TRUE, na.rm=TRUE))
my.order.symptoms=order(prop.with.symptoms, decreasing=FALSE)
tmp=barplot(prop.with.symptoms[my.order.symptoms],
hor=TRUE, names.arg=selectedSymptoms[my.order.symptoms],
las=1, xlim=c(0,1),
xlab="Proportion of subjects")
abline(v=seq(0, 1, by=.1), col="light gray", lty=3)
title(paste0("Presence of outcome variables\n",
"at evaluation occasion T = ",
selectedProportion
))
}
#' @title Plots boxplots of symptoms.
#'
#'@param data Data for the boxplots.
#'@param data.yn Data frame of TRUE/FALSE indicators of outcome presence.
#'@param selectedSymptoms Column names of symptoms in data frame.
#'@param selectedProportion User selected measurement time.
#'@param measurements Vector of measurement values.
#'@param posOnly TRUE/FALSE - whether to plot only the values above the threshold.
#'@param threshold User selected threshold value of symptoms.
plotDistributionBoxplot <- function (data,
data.yn,
selectedSymptoms,
selectedProportion,
measurements,
posOnly=FALSE,
threshold ) {
#adjust the margins for the labels of the boxplot
linch <- max(strwidth(selectedSymptoms, "inch")+0.4, na.rm = TRUE)
par(mai=c(1.02,linch,0.82,0.42))
#calculate the proportion with symptoms and reorder (from the most common to the least common)
prop.with.symptoms=apply(data.yn[measurements==selectedProportion,], 2, function(x) mean(x==TRUE, na.rm=TRUE))
my.order.symptoms=order(prop.with.symptoms, decreasing=FALSE)
#display all the data
if(!posOnly) {
boxplot(t(apply(data[measurements==selectedProportion, selectedSymptoms[my.order.symptoms]], 1, function(x) x)),
horizontal=TRUE, names=selectedSymptoms[my.order.symptoms], las=1, xlab="Value")
title(paste0("T = ", selectedProportion, "; distribution of symptoms"))
} else { #display the distribution only for positive patients
#remove the non-positive observations
tmp=(apply(data[measurements==selectedProportion, selectedSymptoms[my.order.symptoms]], 1, function(x) x))
#print(dim(tmp))
#remove the non-positive values
boxplot(apply(tmp, 1, function(x) x[which(x>threshold)]),
horizontal=TRUE, names=selectedSymptoms[my.order.symptoms], las=1, xlab="Value")
title(paste0("T = ", selectedProportion, "; distribution of symptoms"))
}
}
#' @title Plotting confidence intervals
#'
#' @param data.yn Data frame of TRUE/FALSE indicators of outcome presence.
#' @param measurements Vector of measurement occasion values.
#' @param selectedSymptoms Column names of the selected outcome variables.
#' @param selectedProportion The selected measurement occasion.
plotCI <- function (data.yn,
measurements,
selectedSymptoms,
selectedProportion) {
for.CI=t(apply(data.yn[measurements==selectedProportion,], 2, function(x) {a=prop.test(sum(x==1, na.rm=T), length(x))
return( c(sum(x==1, na.rm=T), length(x), a$estimate, a$conf.int ))
}
)
)
prop.with.symptoms=apply(data.yn[measurements==selectedProportion,], 2,
function(x) mean(x==TRUE, na.rm=TRUE))
my.order.symptoms=order(prop.with.symptoms, decreasing=FALSE)
#reorder the symptoms
for.CI=for.CI[my.order.symptoms, ]
num.symptoms=length(selectedSymptoms)
linch <- max(strwidth(selectedSymptoms, "inch")+0.4, na.rm = TRUE)
par(mai=c(1.02,linch,0.82,0.42))
# the plot does not depend on a loop index, so it is drawn only once
plot( for.CI[,3], 1:num.symptoms, cex=2, axes=FALSE, xlab="Proportion of patients", ylab="", xlim=c(0,1))
segments( for.CI[,4], 1:num.symptoms, for.CI[,5], 1:num.symptoms)
axis(2, at=1:num.symptoms, labels=selectedSymptoms[my.order.symptoms], las=1)
axis(1)
abline(v=seq(0, 1, by=.2), lty=2, col="light gray")
box()
title(paste0("T = ", selectedProportion, ";\n 95% confidence intervals for the presence of symptoms"))
}
#' @title Table for both groups
#'
#' @description TODO: Thresholds are taken into account for the proportions, but not for the medians.
#'
#' @param data Data frame with the measurements.
#' @param measurementVar Name of the evaluation occasion variable.
#' @param forMeasurement The selected evaluation occasion.
#' @param symptomsNames Column names of the outcome variables.
#' @param thresholdValue Threshold above which an outcome is counted as positive.
tableAllWithSymptoms <- function (data,
measurementVar,
forMeasurement,
symptomsNames,
thresholdValue=0) {
# TODO: create a Groupinga() reactive vector to pass to this and other functions?
#groupingLevels <- as.character(unlist(unique(data[groupingVar])))
data <- data[data[measurementVar]==forMeasurement,]
tableData <- data.frame("Variable"=symptomsNames)
column1Name <- "Prop. with positive "
column2Name <- "Positive/all"
column3Name <- "Median"
column4Name <- "IQR"
column5Name <- "Median w threshold"
column6Name <- "IQR w threshold"
aboveThresholdData <- data
aboveThresholdData[, symptomsNames] <- (data[,symptomsNames]>thresholdValue)
#group2Data <- data[data[groupingVar]==groupingLevels[2],]
#group2Data[, symptomsNames] <- (group2Data[,symptomsNames]>thresholdValue)
for (symptom in symptomsNames) {
patientsPositive <- sum(aboveThresholdData[,symptom], na.rm=TRUE)
patientsNegative <- sum(!aboveThresholdData[,symptom], na.rm=TRUE)
# group2Positive <- sum(group2Data[,symptom])
# group2Negative <- sum(!group2Data[,symptom])
# testMatrix <- matrix(c(group1Positive, group1Negative,
# group2Positive, group2Negative),
# byrow=TRUE, ncol=2)
# results <- prop.test(testMatrix)
tableData[tableData["Variable"]==symptom, column1Name ] <-
patientsPositive / (patientsPositive + patientsNegative)
tableData[tableData["Variable"]==symptom, column2Name ] <-
paste(patientsPositive, "/", patientsPositive+patientsNegative)
tableData[tableData["Variable"]==symptom, column3Name ] <- median(data[,symptom], na.rm=TRUE)
tableData[tableData["Variable"]==symptom, column4Name ] <-
paste(quantile(data[,symptom], c(0.25, 0.75), na.rm=TRUE)[1], " to ",
quantile(data[,symptom], c(0.25, 0.75), na.rm=TRUE)[2])
tableData[tableData["Variable"]==symptom, column5Name ] <-
median(data[data[symptom]>thresholdValue, symptom], na.rm=TRUE)
tableData[tableData["Variable"]==symptom, column6Name ] <-
paste(quantile(data[data[symptom]>thresholdValue, symptom], c(0.25, 0.75), na.rm=TRUE)[1], " to ",
quantile(data[data[symptom]>thresholdValue, symptom], c(0.25, 0.75), na.rm=TRUE)[2])
# tableData[tableData["Variable"]==symptom, column5Name ] <-
# format(results$p.value, digits=2)
# tableData[tableData["Variable"]==symptom, column6Name ] <-
# paste(format(results$conf.int[[1]], digits=2),
# format(results$conf.int[[2]], digits=2), sep=";")
}
return(tableData)
}<file_sep>/inst/shinyapp_symptoms2/graphSizes.R
# This file contains functions that determine the sizes (heights) of the graphs
# in the medplot package shiny app.
# How much space should be reserved for the graphical output of the different graphs?
numRowsTimeline <- function(){if(!is.null(dataFiltered())){
if(input$selectedGraphOverTime!="timelinePlot") {return(0)} # no space reserved
# if(input$treatasBinary==TRUE) {return(0)} # no graph if not possible to draw
max(ceiling(length(input$selectedSymptoms))*40, # so that legend is visible
dim(dataFiltered())[1]*0.75, # to not compress patient axis too much
400) # at least this size
} else {return(0)} # if there is no data, height of plot should be zero
}
numRowsTimelineProfile <- function(){if(!is.null(input$selectedGraphType)){
if(input$selectedGraphOverTime!="profilePlot") {return(0)}
if(input$treatasBinary==TRUE) {return(0)} # no graph if not possible to draw
if(input$selectedGraphType=="multipleGraphs") { # case of multiple graphs per one variable
uniqueSubjects <- unique(dataFiltered()[input$patientIDVar])
numSubjects <- dim(uniqueSubjects)[1]
numGroups <- ceiling(numSubjects/input$groupSize)
size <- (numGroups*length(input$selectedSymptoms)*300)
# Cairo graphics has some limit on the max. num of lines,
# apparently it is somewhere below 38000?)
return(min(size, 30000))
}
if(input$selectedGraphType=="oneGraph" || input$selectedGraphType=="randomSample") {
size <- max(ceiling(length(input$selectedSymptoms))*300, # so that legend is visible
dim(dataFiltered())[1]*0.75, # to not compress patient axis too much
400) # at least this size
return(size)
}
} else {return(0)} # if there is no data, height of plot should be zero
}
numRowsTimelineBoxplots <- function(){if(!is.null(dataFiltered())){
if(input$selectedGraphOverTime!="boxPlot") {return(0)}
if (input$selectedFacetingType=="variablesOnYaxis") {
if(input$treatasBinary==TRUE) {return(0)}
tmp <- max(ceiling(length(input$selectedSymptoms))*200,
300) # minimum reserved space
return(tmp)}
if (input$selectedFacetingType=="variablesOnXaxis") {
if(input$treatasBinary==TRUE) {return(0)}
tmp <- max(ceiling(length(unique(na.omit(Measurement()))))*200,
300) # minimum reserved space
return(tmp)}
}else{return(0)} # height of plot when no data available
}
numRowsProportions <- function(){
if(!is.null(dataFilteredwithThreshold())){
if(input$treatasBinary==FALSE) {return(0)}
max(ceiling(length(input$selectedSymptoms))*40,
# if there are fewer than approx. 4 measurement occasions,
# each symptom should get approx. 40 lines
length(input$selectedSymptoms)*length(measurementLevels())*15,
# for more than 4 measurement occasions,
# this should give extra vertical space for
# measurements to be visible
300 # minimum reserved space
)}else{return(0)} # height of plot when no data available
}
numRowsProportionsCI<- function(){
try({
if(!is.null(dataFilteredwithThreshold())){
if(input$treatasBinary==FALSE) {return(0)}
max(ceiling(length(input$selectedSymptoms))*80,
# if there are fewer than approx. 4 measurement occasions,
# each symptom should get approx. 40 lines
length(input$selectedSymptoms)*length(measurementLevels())*20,
# for more than 4 measurement occasions,
# this should give extra vertical space for
# measurements to be visible
300 # minimum reserved space
)}else{return(0)} # height of plot when no data available
}, silent=TRUE)
}
numRowsPresencePlot <- function(){
if(!is.null(input$selectedEvaluationTime2)) {
if (input$treatasBinary==FALSE) {return(0)}
if(input$treatasBinary==TRUE) {
max(ceiling(length(input$selectedSymptoms))*30,
300 # minimum reserved space
)
}}else{return(0)} # height of plot when no data available
}
numRowsMedianPlot <- function(){
if(!is.null(input$selectedEvaluationTime2)) {
if (input$treatasBinary==TRUE) {return(0)}
if(input$treatasBinary==FALSE) {
max(ceiling(length(input$selectedSymptoms))*30,
300 # minimum reserved space
)
}}else{return(0)} # height of plot when no data available
}
numRowsClustering <- function() {if(!is.null(dataFiltered())){
#if(input$treatasBinary==TRUE) {return(0)}
max(ceiling(length(input$selectedSymptoms))*40,
300)}else{return(0)}}
numRowsClustering2 <- function() {if(!is.null(dataFiltered())){
#if(input$treatasBinary==TRUE) {return(0)}
max(ceiling(length(input$selectedSymptoms))*40,
300)}else{return(0)}}
numRowsClustering3 <- function() {if(!is.null(dataFiltered())){
#if(input$treatasBinary==TRUE) {return(0)}
max(ceiling(length(input$selectedSymptoms))*40,
300)}else{return(0)}}
numRowsDistributions <- function() {if(!(is.null(dataFiltered()) || is.null(input$posOnly)) ){
if(input$treatasBinary==FALSE){return(0)}
max(ceiling(length(input$selectedSymptoms))*30,
300)}else{return(0)}}
numRowsLogistf <- function() {if(!is.null(regressionScenario())){
if(regressionScenario()!="scenarioLogistf") {return(0)}
max(ceiling(length(input$selectedSymptoms))*30,
300)}else{return(0)}}
numRowsLogist <- function() {if(!is.null(regressionScenario())){
if(regressionScenario()!="scenarioLogist") {return(0)}
max(ceiling(length(input$selectedSymptoms))*30,
300)}else{return(0)}}
numRowsLinear <- function() {if(!is.null(regressionScenario())){
if(regressionScenario()!="scenarioLinearModel") {return(0)}
max(ceiling(length(input$selectedSymptoms))*30,
300)}else{return(0)}}
numRowsProportion <- function(){
if(!(is.null(dataFiltered.yn()) || is.null(input$selectedMeasurementForPresencePlot) )){
if(input$selectedGraphOverTime!="presencePlot") {return(0)}
if(input$treatasBinary==TRUE){
max(ceiling(length(input$selectedSymptoms))*30,
300)
}} else {return(0)}
}
numRowsRCSModel <- function() {if(!is.null(regressionScenario())){
if(regressionScenario()!="scenarioRCSModel") {return(0)}
max(ceiling(length(input$selectedSymptoms))*100,
300)}else{return(0)}}
numRowsMixedModels1 <- function(){if(!is.null(dataFiltered()) &
!is.null(input$selectedMixedModelType)){
max(ceiling(length(input$selectedSymptoms))*30*(length(unique(dataFiltered()[,input$selectedCovariate1st]))-1),
300) }else{return(0)} }
numRowsMixedModels2 <- function(){if(!is.null(dataFiltered()) &
!is.null(input$selectedMixedModelType)){
nFacets <- length(unique(Measurement()))-1
(input$selectedMixedModelType=="MMmeasurement")*
max(ceiling(length(input$selectedSymptoms))*30*nFacets,
300 ) }else{return(0)} }
numRowsMixedModels3 <- function(){if(!is.null(dataFiltered()) &
!is.null(input$selectedMixedModelType)){
(input$selectedMixedModelType=="MMtimeSinceInclusion")*
max( ceiling(length(input$selectedSymptoms))*30,
300 )}else{return(0)} }
NumRows <- function(){if(!is.null(dataFiltered())){
ceiling(length(input$selectedSymptoms)/3)*300 }else{return(0)} }<file_sep>/inst/shinyapp_symptoms2/server.R
# This is the server part of the shiny script to plot
# graph for displaying presence of symptoms.
# Load libraries required by the package ####
source("loadLibraries.R", local = TRUE)
# Global activity -----------------------------------------------------------
# purge temporary files
file.remove(list.files(paste0(x=getwd(), "/www/temp"), full.names=TRUE))
# set number of processors for the boot package
# (Sys.getenv() returns a character string, so convert it to an integer)
options(boot.ncpus=as.integer(Sys.getenv('NUMBER_OF_PROCESSORS', unset="1")))
# Main function -----------------------------------------------------------
shinyServer(function(input, output, session, clientData) {
# VARIABLES ####
# the working directory
# workingDir <- paste0(gsub(pattern="/", replacement="\\\\", x=getwd()))
workingDir <- getwd()
# FUNCTIONS ####
# Load the file that contains functions for determining graph sizes (heights)
source("graphSizes.R", local = TRUE)
# REACTIVE FUNCTIONS ####
# Measurement() - values of variable selected as measurement occasion variable ####
Measurement <- reactive({
if(!is.null(dataFiltered())){
dataFiltered()[,input$measurementVar]
}
})
# measurementLevels() - the measurement levels available
measurementLevels <- reactive ({
if(!is.null(Measurement())){
sort(unique(Measurement()))
}
})
# dataExtended() - data set with all the imported data ####
dataExtended <- reactive({
    input$dataFile # take a reactive dependency on the uploaded file
    # TODO: write code for the scenario where input$dataFile is not NULL but
    # simply not in the correct format - NULL should be returned in that case as well
if (input$dataFileType=="Demo") {
# commented out: code for importing Excel DEMO file (for testing)
# templateLocation <- paste0(path.package("medplot"),"/extdata/PlotSymptoms_shiny.xlsx")
# patients <- importSymptomsPatients(datafile=templateLocation)
# symptoms <- importSymptomsData(datafile=templateLocation,
# format="Excel")
# data <- join(x=symptoms, y=patients, by="PersonID", type="inner")
templateLocation <- paste0(path.package("medplot"),"/extdata/DataEM.txt")
data <- importSymptomsData(datafile=templateLocation,
format="TSV")
return(data)
}
if(is.null(input$dataFile)) return()
if (input$dataFileType=="Excel") {
patients <- importSymptomsPatients(datafile=input$dataFile$datapath)
symptoms <- importSymptomsData(datafile=input$dataFile$datapath,
format="Excel")
data <- join(x=symptoms, y=patients, by="PersonID", type="inner")
}
if (input$dataFileType=="TSV") {
data <- importSymptomsData(datafile=input$dataFile$datapath,
format="TSV")
}
return(data)
})
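  # Illustrative sketch (hypothetical data, not executed here) of how the
  # inner join above combines the two sheets - only PersonIDs present in
  # both data frames are kept:
  #   symptoms <- data.frame(PersonID=c(1,2,3), Fatigue=c(0,2,1))
  #   patients <- data.frame(PersonID=c(2,3,4), Sex=c("F","M","F"))
  #   join(x=symptoms, y=patients, by="PersonID", type="inner")
  #   # -> rows for PersonID 2 and 3, with both Fatigue and Sex columns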
# dataFiltered() - data set with data only for selected variables ####
dataFiltered <- reactive({
if(! (is.null(dataExtended()) || is.null(input$selectedSymptoms) )) {
data <- dataExtended()[ , # all rows
c(input$patientIDVar,
input$groupingVar,
input$dateVar,
input$measurementVar,
input$selectedSymptoms
)]
# try to convert dates into R format
try(expr={data[input$dateVar] <- as.Date(data[,input$dateVar], format="%d.%m.%Y")}, silent=TRUE)
return(data)
}
})
# dataExtendedwithThreshold() - all data with threshold value honored ####
  # dichotomizes the symptom values: values above the threshold become 1, all others 0
dataExtendedwithThreshold <- reactive ({
if(!(is.null(dataExtended()) || is.null(input$thresholdValue) )) {
data <- dataExtended()
data[, input$selectedSymptoms] <-
ifelse(data[, input$selectedSymptoms]>input$thresholdValue, 1, 0)
return(data)
}
})
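  # Illustrative sketch (not executed here): ifelse() dichotomizes all selected
  # columns at once, e.g. with a threshold value of 2:
  #   df <- data.frame(Fatigue=c(0,3,2), Headache=c(1,5,0))
  #   ifelse(df > 2, 1, 0)
  #   # -> Fatigue: 0 1 0 ; Headache: 0 1 0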
# dataFilteredwithThreshold() - filtered data set with threshold value honored ####
  # dichotomizes the symptom values: values above the threshold become 1, all others 0
dataFilteredwithThreshold <- reactive ({
if(!(is.null(dataFiltered()) || is.null(input$thresholdValue) )){
data <- dataFiltered()
data[,input$selectedSymptoms] <-
ifelse(data[, input$selectedSymptoms]>input$thresholdValue, 1, 0)
return(data)
}
})
  # dataVariableNames() - returns the names of all columns of the imported data ####
dataVariableNames <- reactive({
if(!is.null(dataExtended())){
unlist(names(dataExtended()))
}
})
  # dataFiltered.yn() - dataset with the positive/negative values for the selected symptoms ####
  # (used by the barplots with proportions below)
  dataFiltered.yn <- reactive({
    if (!(is.null(dataFiltered()) || is.null(input$thresholdValue))) {
      data <- ifelse(dataFiltered()[, input$selectedSymptoms, drop=FALSE]>input$thresholdValue, 1, 0)
      return(data)
    } else {return(NULL)}
  })
# SIDEBAR ####
# GUI - printing medplot package version
output$medplotVersion <- renderText({
paste("Version:",packageVersion(pkg="medplot"))
})
# GUI - selecting symptoms ####
  output$selectSymptoms <- renderUI({
    if (!is.null(dataVariableNames())) {
      selectInput(inputId="selectedSymptoms",
                  label="Choose outcome variables to analyse (the selection order is also used on most graphs):",
                  choices=dataVariableNames(),
                  multiple=TRUE,
                  # if DEMO data is used, some variables are selected automatically
                  selected=if (input$dataFileType=="Demo") {
                    c("Fatigue","Malaise",
                      "Arthralgia","Headache",
                      "Myalgia","Back.C",
                      "Dizziness", "Nausea",
                      "Sleepiness", "Forgetfulness",
                      "Concentration", "Paresthesias",
                      "Irritability","Back.L",
                      "Back.Th", "Insomnia")})
    }
  })
# GUI - selecting Date variable ####
output$selectDateVar <- renderUI({
if (!is.null(dataVariableNames())){
selectInput(inputId="dateVar",
label="Choose date variable:",
choices=dataVariableNames(),
selected="Date")
}
})
# GUI - selecting grouping variable ####
output$selectGroupingVar <- renderUI({
if (!is.null(dataVariableNames())){
selectInput(inputId="groupingVar",
label="Choose grouping variable:",
choices=dataVariableNames(),
selected="Sex")
}
})
# GUI - selecting person ID variable ####
output$selectPatientIDVar <- renderUI({
if (!is.null(dataVariableNames())) {
selectInput(inputId="patientIDVar",
label="Choose subject ID variable:",
choices=dataVariableNames(),
selected="PersonID")
}
})
# GUI - selecting measurements variable ####
output$selectMeasurementVar <- renderUI({
if (!is.null(dataVariableNames())) {
selectInput(inputId="measurementVar",
label="Choose evaluation occasion variable:",
choices=dataVariableNames(),
selected="Measurement")
}
})
# GUI - selecting use of thresholding ####
output$selectTreatasBinary <- renderUI({
if (!is.null(dataVariableNames())){
checkboxInput(inputId="treatasBinary",
label="Treat and analyse outcome variables as binary?",
value=FALSE)
}})
  # GUI - selecting threshold value ####
output$selectThresholdValue <- renderUI({
if (!is.null(dataVariableNames()) & !is.null(input$treatasBinary)){
if(input$treatasBinary==TRUE) {
numericInput(inputId="thresholdValue",
"Threshold for positivity of the outcome variables:",
value=0,
min=0,
max=9)
}}
})
  # GUI - resetting threshold value if "treat as binary" option changes
observe({
input$treatasBinary
updateNumericInput(session, inputId="thresholdValue", value=0)
})
# TABS ####
# message - used on all tabs
output$messageSelectVars <- renderUI({
if(is.null(dataFiltered())) {h4("Please use the menus below to upload data,
select parameters and one or more variables to analyse.")}
})
# TAB - Data overview ####
output$dataSummary <- renderPrint({
if(!is.null(dataFiltered())) {
summarizeData(data=dataFiltered(),
personIDVar=input$patientIDVar,
measurementVar=input$measurementVar,
selectedSymptoms=input$selectedSymptoms,
groupingVar=input$groupingVar
)
}})
# TAB - Graphical exploration ####
output$selectGraphOverTime <- renderUI({
if (!is.null(dataFiltered())) {
selectInput(inputId="selectedGraphOverTime",
label="Select type of graph:",
choices= if (input$treatasBinary==TRUE) {
c("Lasagna plots"="lasagnaPlot",
"Barplots with proportions"="presencePlot",
"Timeline"="timelinePlot"
)} else {
c("Profile plots"="profilePlot",
"Lasagna plots"="lasagnaPlot",
"Boxplots"="boxPlot",
"Timeline"="timelinePlot"
)},
selected= if (input$treatasBinary==TRUE) {"presencePlot"} else {"timelinePlot"},
multiple=FALSE)
}})
# Profile plots ####
# Menus
# ui - select type of graph
output$selectGraphType <- renderUI({
if(!is.null(input$selectedGraphOverTime)) {
if (input$selectedGraphOverTime=="profilePlot") {
if(input$treatasBinary==FALSE){
selectInput(inputId="selectedGraphType",
label="Select type of graphs to plot:",
choices=c("All subjects on one graph"="oneGraph",
"Random selection of subjects on one graph"="randomSample",
"Multiple graphs per outcome variable"="multipleGraphs"),
selected="randomSample",
multiple=FALSE)
}}}
})
output$selectRandomSampleSize <- renderUI({
if(!is.null(input$selectedGraphType)) {
if (input$selectedGraphOverTime=="profilePlot") {
if (input$selectedGraphType=="randomSample") {
if(input$treatasBinary==FALSE){
numericInput(inputId="sampleSize",
label="Select number of randomly selected subjects:",
value=10,
min=1,
max=100,
step=5)
}}}}
})
output$selectMaxGroupSize <- renderUI({
if(!is.null(input$selectedGraphType)) {
if (input$selectedGraphOverTime=="profilePlot") {
if (input$selectedGraphType=="multipleGraphs") {
if(input$treatasBinary==FALSE){
numericInput(inputId="groupSize",
label="Select the maximum number of subjects on one graph:",
value=25,
min=10,
max=100,
step=5)
}}}}
})
# Graph
  # reactive code for plotting
plotTimelineProfilesReactive <- reactive({
plotTimelineProfiles(data=dataFiltered(),
plotType=input$selectedGraphType,
personIDVar=input$patientIDVar,
measurementVar=input$measurementVar,
selectedSymptoms=input$selectedSymptoms,
sizeofRandomSample=input$sampleSize,
sizeofGroup=input$groupSize)
})
# the plot rendered in browser
output$plotTimelineProfiles <- renderPlot({
if(!is.null(input$selectedGraphType)) {
if ( (input$selectedGraphType=="oneGraph") ||
(input$selectedGraphType=="randomSample" && !is.null(input$sampleSize)) ||
(input$selectedGraphType=="multipleGraphs" && !is.null(input$groupSize))) {
if(input$treatasBinary==FALSE){
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
plotTimelineProfilesReactive()
}}}
}, height=numRowsTimelineProfile
)
# the plot to be downloaded - high quality plot
output$downLoadplotTimelineProfiles <- downloadPlot(
plotFunction=plotTimelineProfilesReactive,
width=clientData$output_plotTimelineProfiles_width,
height=clientData$output_plotTimelineProfiles_height,
print=TRUE
)
# profile plot descriptions
output$plotTimelineProfilesDescr <- reactive({
if(!is.null(input$selectedGraphType)) {
if(input$selectedGraphOverTime=="profilePlot") {
description <- switch(input$selectedGraphType,
"oneGraph" = generateDescription("GraphExpl_ProfilePlots_AllSubjects"),
"randomSample" = generateDescription("GraphExpl_ProfilePlots_RandomSubjects"),
"multipleGraphs" = generateDescription("GraphExpl_ProfilePlots_MultipleGraphs")
)
return(description())
} else {return()}
}
})
# Lasagna plots ####
# Graph
output$plotLasagna <- renderUI({
if (!is.null(input$selectedGraphOverTime)) {
if (input$selectedGraphOverTime=="lasagnaPlot") {
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
filenames <- vector()
filenamesEPS <- vector()
# generate as many files as there are plots
for (symptom in input$selectedSymptoms) {
filenames[symptom] <- tempfile(pattern="symptom", tmpdir=paste0(workingDir,"/www/temp"), fileext=".png")
filenamesEPS[symptom] <- tempfile(pattern=symptom, fileext = ".eps")
# plot PNG graph for each symptom
        png(filenames[symptom])
plotLasagna(if (input$treatasBinary==FALSE) {dataFiltered()}else{dataFilteredwithThreshold()},
treatasBinary=input$treatasBinary,
symptom=symptom,
dateVar=input$dateVar,
personIDVar=input$patientIDVar,
measurementVar=input$measurementVar,
groupingVar=input$groupingVar,
thresholdValue=input$thresholdValue)
dev.off()
# prepare EPS graphs
width = 5 # in inches
height = 5 # in inches
postscript(filenamesEPS[symptom], paper="special", width=width, height = height)
plotLasagna(if (input$treatasBinary==FALSE) {dataFiltered()}else{dataFilteredwithThreshold()},
treatasBinary=input$treatasBinary,
symptom=symptom,
dateVar=input$dateVar,
personIDVar=input$patientIDVar,
measurementVar=input$measurementVar,
groupingVar=input$groupingVar,
thresholdValue=input$thresholdValue)
dev.off()
}
      # create a ZIP file of EPS plots; "<<-" stores the path in an enclosing
      # environment so the download handler below can find it
      zipFile <<- tempfile(pattern="zip", tmpdir=paste0(workingDir,"/www/temp/"), fileext=".zip")
zip(zipfile=zipFile, files=filenamesEPS, flags="-Dj")
out <- pastePlotFilenames(filenames)
return(div(HTML(out),class="shiny-plot-output shiny-bound-output"))
}}
})
# prepare ZIP for downloading EPS files
output$lasagnaDownload <- downloadHandler(
filename="Lasagna.zip",
content= function(file) {
file.copy(from=zipFile, to=file)
}, contentType="application/octet-stream")
# prepare download button
output$downloadLasagna <- renderUI({
out <- downloadButton(outputId = "lasagnaDownload", label = "Download")
return(out)
})
output$plotLasagnaDesc <- reactive({
if (!is.null(input$selectedGraphOverTime)) {
if (input$selectedGraphOverTime=="lasagnaPlot") {
description <- generateDescription("GraphExpl_LasagnaPlots")
return(description())
}}
})
# Boxplots ####
# Menu
output$selectFacetingType <- renderUI({
if(!is.null(input$selectedGraphOverTime)) {
if (input$selectedGraphOverTime=="boxPlot") {
selectInput(inputId="selectedFacetingType",
label="Select faceting type:",
choices=c("Variables ~ Evaluation occasions"="variablesOnYaxis",
"Evaluation occasions ~ Variables"="variablesOnXaxis")
)
}}
})
# Graph
# reactive plot
plotTimelineBoxplotsReactive <- reactive({
plotTimelineBoxplots(data=dataFiltered(),
personIDVar=input$patientIDVar,
measurementVar=input$measurementVar,
selectedSymptoms=input$selectedSymptoms,
faceting=input$selectedFacetingType)
})
# plot
output$plotTimelineBoxplots <- renderPlot({
if(!is.null(dataFiltered())) {
if(input$selectedGraphOverTime=="boxPlot") {
if(input$treatasBinary==FALSE){
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
plotTimelineBoxplotsReactive()
}
}
} else {return()}
},height=numRowsTimelineBoxplots)
# download handler
output$downLoadplotTimelineBoxplot <- downloadPlot(
plotFunction = plotTimelineBoxplotsReactive,
width = clientData$output_plotTimelineBoxplots_width,
height = clientData$output_plotTimelineBoxplots_height,
print = TRUE)
# description
output$plotTimelineBoxplotsDesc <- reactive({
if (!is.null(input$selectedGraphOverTime)) {
if (input$selectedGraphOverTime=="boxPlot") {
description <- generateDescription("GraphExpl_BoxPlots")
return(description())
}}
})
# Timeline graph ####
# Menu
output$selectDisplayFormat <- renderUI({
if(!is.null(dataFiltered())){
if(input$selectedGraphOverTime=="timelinePlot") {
selectInput(inputId="displayFormat",
label="Choose what to display on the horizontal axis:",
choices=c("Dates" = "dates",
"Time from inclusion" ="timeFromInclusion",
"Evaluation occasions" = "measurementOccasions"),
selected="dates",
multiple=FALSE)
}}
})
# Graph
# reactive plot
plotTimelineReactive <- reactive({
if (input$treatasBinary == TRUE) {
data=dataFilteredwithThreshold()
} else { data=dataFiltered() }
plotTimeline(data=data,
date=input$dateVar,
personID=input$patientIDVar,
measurement=input$measurementVar,
symptoms=input$selectedSymptoms,
displayFormat = input$displayFormat,
treatasBinary=input$treatasBinary)
})
# plot
output$plotTimeline <- renderPlot({
if(!(is.null(dataFiltered()) || is.null(input$displayFormat))){
if(input$selectedGraphOverTime=="timelinePlot") {
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
plotTimelineReactive()
}}
}, height=numRowsTimeline)
  # download handler
output$downLoadplotTimeline <- downloadPlot(plotFunction = plotTimelineReactive,
width = clientData$output_plotTimeline_width,
height = clientData$output_plotTimeline_height,
print = TRUE)
# description
output$plotTimelineDesc <- reactive({
if (!is.null(input$selectedGraphOverTime)) {
if (input$selectedGraphOverTime=="timelinePlot") {
description <- generateDescription("GraphExpl_Timeline")
return(description())
}}
})
  # Barplots with proportions ####
  # Menu
output$selectMeasurementForPresencePlot <- renderUI({
if(!is.null(input$selectedGraphOverTime)) {
if(input$selectedGraphOverTime=="presencePlot") {
selectInput(inputId="selectedMeasurementForPresencePlot",
label="Select evaluation occasion:",
choices=measurementLevels(), selected=measurementLevels()[1])
}}
})
# reactive plot
plotProportionReactive <- reactive({
plotDistribution(data=dataFiltered.yn(),
selectedSymptoms=input$selectedSymptoms,
selectedProportion=input$selectedMeasurementForPresencePlot,
measurements=Measurement())
})
# Plot - Presence (plot - proportions) ###
  output$plotProportion <- renderPlot({
if(!(is.null(dataFiltered.yn()) || is.null(input$selectedMeasurementForPresencePlot) )){
if(input$treatasBinary==TRUE){
plotProportionReactive()
}}
}, height=numRowsProportion)
# download handler
output$downLoadplotProportion <- downloadHandler (
    # since this is base graphics (not ggplot) the EPS file generation
    # has to be handled differently - the downloadPlot() function does not work
    # because the "Cairo" graphics device is used instead of "postscript";
    # this may have to do with being called from a reactive function and/or
    # the plot being created in a different environment
filename="plot.eps",
    content = function (file) {
      # client sizes are reported in pixels; postscript() expects inches (72 px per inch)
      width = clientData$output_plotProportion_width
      height = clientData$output_plotProportion_height
      postscript(file, paper="special", width = width/72, height = height/72)
plotDistribution(data=dataFiltered.yn(),
selectedSymptoms=input$selectedSymptoms,
selectedProportion=input$selectedMeasurementForPresencePlot,
measurements=Measurement())
dev.off()
}, contentType="application/postscript")
output$plotProportionDesc <- reactive({
if (!is.null(input$selectedGraphOverTime)) {
if (input$selectedGraphOverTime=="presencePlot") {
description <- generateDescription("GraphExpl_Barplots")
return(description())
}}
})
# TAB - Summary ####
# Boxplot tables - all tables at once - not interesting for current version
# leaving in for future reference
# output$tableforBoxplots <- renderUI({
# if(!is.null(dataFiltered())) {
# #if(input$treatasBinary==FALSE){
# progress <- Progress$new(session, min=1, max=100)
# on.exit(progress$close())
#
# progress$set(message = 'Calculation in progress',
# detail = 'This may take a while...',
# value=NULL)
#
  #   PERFORMANCE BUG HERE - the "measurements" argument passed to this function is a
  #   vector of all measurements, when it should contain only the unique measurements;
# out <- tabelizeBoxplots(measurements=Measurement(),
# measurementVar=input$measurementVar,
# data=dataFiltered(),
# selectedSymptoms=input$selectedSymptoms)
#
# return(div(HTML(out),class="shiny-html-output"))
# } #}
# })
# Menu
output$selectEvaluationTime2 <- renderUI({
selectInput(inputId="selectedEvaluationTime2",
label="Select evaluation occasion:",
choices=if(!is.null(measurementLevels())) {measurementLevels()},
selected=if(!is.null(measurementLevels())) {measurementLevels()[1]})
})
# Pyramid plot ####
# Graph
# reactive plot
plotPyramidReactive <- reactive({
plotPropPositive(data=dataFilteredwithThreshold(),
grouping=input$groupingVar,
measurements=input$measurementVar,
symptomsNames=input$selectedSymptoms)
})
output$plotPyramid <- renderPlot ({
try({
if(!(is.null(dataFilteredwithThreshold()) || is.null(input$treatasBinary) )){
if(input$treatasBinary==TRUE){
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
plotPyramidReactive()
}}}, silent=TRUE)
} ,height=numRowsProportions)
output$downLoadplotPyramid <- downloadHandler (
    # since this is base graphics (not ggplot) the EPS file generation
    # has to be handled differently - the downloadPlot() function does not work
    # because the "Cairo" graphics device is used instead of "postscript";
    # this may have to do with being called from a reactive function and/or
    # the plot being created in a different environment
filename="plot.eps",
content = function (file) {
width = clientData$output_plotPyramid_width
height = clientData$output_plotPyramid_height
postscript(file, paper="special", width = width/72, height = height/72)
plotPropPositive(data=dataFilteredwithThreshold(),
grouping=input$groupingVar,
measurements=input$measurementVar,
symptomsNames=input$selectedSymptoms)
dev.off()
}, contentType="application/postscript")
# calculate data for tables of medians & CI plots ####
dataforSummaryNonBinary <- reactive({
if(!is.null(dataFiltered())) {
if(input$treatasBinary==FALSE){
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
tableMedians(measurement=input$selectedEvaluationTime2,
measurementVar=input$measurementVar,
data=dataFiltered(),
selectedSymptoms=input$selectedSymptoms)
}}
})
# Median tables ####
output$tableforBoxplots <- renderDataTable({
if(!is.null(dataFiltered())) {
if(input$treatasBinary==FALSE){
return(dataforSummaryNonBinary()[["printableTable"]])
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# Median reactive plot
plotMediansReactive <- reactive({
plotValueswithCIs(data=dataforSummaryNonBinary()[["rawTable"]],
variableName="Variables",
valueName="Median",
CILowerName="CILower",
CIUpperName="CIUpper",
xLabel="Medians",
yLabel="Variable",
graphTitle="Medians of variables \n(with 95% confidence intervals)",
vLine=NULL,
variableOrder=input$selectedSymptoms)
})
# plot
output$plotMedians <- renderPlot({
plotMediansReactive()
}, height=numRowsMedianPlot)
# download handler
output$downLoadplotMedians <- downloadPlot(
plotFunction = plotMediansReactive,
width = clientData$output_plotMedians_width,
height = clientData$output_plotMedians_height,
print = TRUE
)
# Medians description
output$mediansDescr <- reactive({
if (!is.null(dataFiltered())) {
if (input$treatasBinary==FALSE) {
description <- generateDescription("Summary_Medians")
return(description())
}}})
# Proportions tables
output$tableforProportions <- renderDataTable({
if(!is.null(dataFilteredwithThreshold())) {
if(input$treatasBinary==TRUE){
out <- tableProportions(measurement=input$selectedEvaluationTime2,
measurementVar=input$measurementVar,
data=dataFilteredwithThreshold(),
selectedSymptoms=input$selectedSymptoms)
return(out)
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# Proportions graph
# reactive plot
plotPresenceReactive <- reactive({
plot <- plotPresenceofSymptoms(data=dataFiltered(),
selectedSymptoms=input$selectedSymptoms,
measurementVar=input$measurementVar,
measurement=input$selectedEvaluationTime2,
thresholdValue=ifelse(!is.null(input$thresholdValue),input$thresholdValue ,0))
})
# plot
output$plotPresence <- renderPlot({
if(!is.null(input$selectedEvaluationTime2)) {
if(input$treatasBinary==TRUE) {
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
print(plotPresenceReactive())
}}
}, height=numRowsPresencePlot)
# download handler
output$downLoadplotPresence <- downloadPlot(
plotFunction = plotPresenceReactive,
width = clientData$output_plotPresence_width,
height = clientData$output_plotPresence_height,
print = TRUE
)
# Proportions description
output$proportionsDescr <- reactive({
if (!is.null(dataFiltered())) {
if (input$treatasBinary==TRUE) {
description <- generateDescription("Summary_Proportions")
return(description())
}}})
# TAB - Summary tables : grouping variable ####
# Proportions by groups with confidence intervals ####
# Graph
# reactive plot
plotPropPositiveCIReactive <- reactive({
plotPropPositiveCI(data=dataFilteredwithThreshold(),
groupingVar=input$groupingVar,
measurementVar=input$measurementVar,
selectedSymptoms=input$selectedSymptoms)
})
# plot
output$plotPropCIs <- renderPlot ({
try({
if(!is.null(dataFilteredwithThreshold())){
if(input$treatasBinary==TRUE){
progress <- Progress$new(session, min=1, max=100)
on.exit(progress$close())
progress$set(message = 'Calculation in progress',
detail = 'This may take a while...',
value=NULL)
plotPropPositiveCIReactive()
}}}, silent=TRUE)
} ,height=numRowsProportionsCI)
output$downLoadPropPositiveCI <- downloadPlot(
plotFunction = plotPropPositiveCIReactive,
width = clientData$output_plotPropCIs_width,
height = clientData$output_plotPropCIs_height,
print = TRUE
)
# Menu
  output$UIpropTable <- renderUI({
if(!is.null(measurementLevels())){
#select the measurement
selectInput(inputId="measurementSelectedprop",
label="Select evaluation occasion:",
choices=measurementLevels(), selected=measurementLevels()[1])
}
})
output$UIdoPvalueAdjustments <- renderUI({
if(!is.null(measurementLevels())){
checkboxInput(inputId="doPValueAdjustments",
label="Calculate P value adjustments? (It may take a long time.)",
value=FALSE)
}
})
# Tables
# Table of proportions of patients in a group with a symptom ####
output$tablePropGroups <- renderDataTable ({
if(!(is.null(dataFiltered()) || is.null(input$thresholdValue))){
if(input$treatasBinary==TRUE){
progress <- Progress$new(session, min=1, max=100)
progress$set(message = 'Calculation in progress',
detail = 'This will take a while...',
value=NULL)
on.exit(progress$close())
out <- tablePropPosGroups(data=dataFiltered(),
groupingVar=input$groupingVar,
measurementVar=input$measurementVar,
forMeasurement=input$measurementSelectedprop,
symptomsNames=input$selectedSymptoms,
thresholdValue=input$thresholdValue,
doPValueAdjustments=input$doPValueAdjustments
)
return(out)
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# text - explaining tablePropGroups
output$textTablePropGroups <- reactive({
if(!is.null(dataFiltered())){
if(input$treatasBinary==TRUE){
description <- generateDescription("SummaryGrouping_Proportions")
return(description())
}}
})
# Table with medians of symptoms values in a group ####
output$tableMedianGroups <- renderDataTable ({
if(!(is.null(dataFiltered()) || is.null(input$measurementSelectedprop) )){
if(input$treatasBinary==FALSE){
progress <- Progress$new(session, min=1, max=100)
progress$set(message = 'Calculation in progress',
detail = 'This will take a while...',
value=NULL)
on.exit(progress$close())
tableMeGroups(data=dataFiltered(),
groupingVar=input$groupingVar,
measurementVar=input$measurementVar,
forMeasurement=input$measurementSelectedprop,
symptomsNames=input$selectedSymptoms,
thresholdValue=input$thresholdValue,
doPValueAdjustments=input$doPValueAdjustments)
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
  # text - explaining tableMedianGroups
output$textTableMedianGroups <- reactive({
if(!is.null(dataFiltered())){
if(input$treatasBinary==FALSE){
description <- generateDescription("SummaryGrouping_Medians")
return(description())
}}
})
output$messageNotAppropriate10 <- renderText({
if(!is.null(input$treatasBinary)){
if (input$treatasBinary==FALSE) {
"This type of analysis is not appropriate for numerical responses."
}}
})
# TAB - Clustering ####
# Menu
  output$clusteringUI <- renderUI({
    if(!is.null(measurementLevels())){
#select the measurement
selectInput(inputId="selectedMeasurementValue",
label="Select evaluation occasion:",
choices=measurementLevels(), selected=measurementLevels()[1])
}
})
# Graphs
# Dendrogram plot ####
  output$plotClusterDendrogram <- renderPlot({
if(!(is.null(dataFiltered()) || is.null(input$selectedMeasurementValue) )){
if (input$treatasBinary==TRUE) {data=dataFilteredwithThreshold()} else {data=dataFiltered()}
plotDendrogram(data=data,
variableName=input$measurementVar,
variableValue=input$selectedMeasurementValue,
selectedSymptoms=input$selectedSymptoms,
treatasBinary=input$treatasBinary)
}
},height=numRowsClustering)
output$downLoadplotClusterDendrogram <- downloadHandler(
    # since this is base graphics (not ggplot) the EPS file generation
    # has to be handled differently - the downloadPlot() function does not work
    # because the "Cairo" graphics device is used instead of "postscript";
    # this may have to do with being called from a reactive function and/or
    # the plot being created in a different environment
filename="plot.eps",
content = function (file) {
width = clientData$output_plotClusterDendrogram_width
height = clientData$output_plotClusterDendrogram_height
postscript(file, paper="special", width = width/72, height = height/72)
if (input$treatasBinary==TRUE) {data=dataFilteredwithThreshold()} else {data=dataFiltered()}
plotDendrogram(data=data,
variableName=input$measurementVar,
variableValue=input$selectedMeasurementValue,
selectedSymptoms=input$selectedSymptoms,
treatasBinary=input$treatasBinary)
dev.off()
}, contentType="application/postscript"
)
# Dendrogram description
output$dendrogramDescr <- reactive({
if (!is.null(dataFiltered())) {
if (!is.null(input$selectedMeasurementValue)) {
description <- generateDescription("Clustering_Dendrogram")
return(description())
}}
})
# Heatmap - Selection of annotation variables
output$selectClusterAnnotations <- renderUI({
if(!is.null(dataFiltered())){
selectedSymptoms <- which(dataVariableNames() %in% input$selectedSymptoms)
selectInput(inputId="selectedClusterAnnotations",
label="Select variables for annotating graph:",
# TODO: remove some variables from selection
choices=dataVariableNames()[-selectedSymptoms],
selected=c(input$groupingVar),
multiple=TRUE)
}
})
# Heatmap plot ####
  output$plotClusterHeatmap <- renderPlot({
if(!is.null(dataExtended())){
if (input$treatasBinary==TRUE) {data=dataExtendedwithThreshold()} else {data=dataExtended()}
plotClusterHeatmap(data=data,
#TODO: make dependent on selection
variableName=input$measurementVar,
variableValue=input$selectedMeasurementValue,
selectedSymptoms=input$selectedSymptoms,
annotationVars=input$selectedClusterAnnotations,
treatasBinary=input$treatasBinary)
}
},height=numRowsClustering2)
output$downLoadplotClusterHeatmap <- downloadHandler(
    # since this is base graphics (not ggplot) the EPS file generation
    # has to be handled differently - the downloadPlot() function does not work
    # because the "Cairo" graphics device is used instead of "postscript";
    # this may have to do with being called from a reactive function and/or
    # the plot being created in a different environment
filename="plot.eps",
content = function (file) {
width = clientData$output_plotClusterHeatmap_width
height = clientData$output_plotClusterHeatmap_height
postscript(file, paper="special", width = width/72, height = height/72)
if (input$treatasBinary==TRUE) {data=dataExtendedwithThreshold()} else {data=dataExtended()}
plotClusterHeatmap(data=data,
#TODO: make dependent on selection
variableName=input$measurementVar,
variableValue=input$selectedMeasurementValue,
selectedSymptoms=input$selectedSymptoms,
annotationVars=input$selectedClusterAnnotations,
treatasBinary=input$treatasBinary)
dev.off()
}, contentType="application/postscript"
)
# Heat map description
output$heatmapDescr <- reactive({
if (!is.null(dataFiltered())) {
if (!is.null(input$selectedMeasurementValue)) {
description <- generateDescription("Clustering_Heatmap")
return(description())
}}
})
# Correlation plot ####
output$plotClusterCorrelations <- renderPlot({
if(!is.null(dataExtended())){
if (input$treatasBinary==TRUE) {data=dataFilteredwithThreshold()} else {data=dataFiltered()}
plotCorrelations(data=data,
variableName=input$measurementVar,
variableValue=input$selectedMeasurementValue,
selectedSymptoms=input$selectedSymptoms,
treatasBinary=input$treatasBinary)
}
},height=numRowsClustering3)
output$downLoadplotClusterCorrelations <- downloadHandler(
    # since this is base graphics (not ggplot) the EPS file generation
    # has to be handled differently - the downloadPlot() function does not work
    # because the "Cairo" graphics device is used instead of "postscript";
    # this may have to do with being called from a reactive function and/or
    # the plot being created in a different environment
filename="plot.eps",
content = function (file) {
width = clientData$output_plotClusterCorrelations_width
height = clientData$output_plotClusterCorrelations_height
postscript(file, paper="special", width = width/72, height = height/72)
if (input$treatasBinary==TRUE) {data=dataFilteredwithThreshold()} else {data=dataFiltered()}
plotCorrelations(data=data,
variableName=input$measurementVar,
variableValue=input$selectedMeasurementValue,
selectedSymptoms=input$selectedSymptoms,
treatasBinary=input$treatasBinary)
dev.off()
}, contentType="application/postscript"
)
# Correlation plot description
output$correlationDescr <- reactive({
if (!is.null(dataFiltered())) {
if (!is.null(input$selectedMeasurementValue)) {
description <- generateDescription("Clustering_Correlations")
return(description())
}}
})
# TAB - Regression model : one evaluation ####
# uncomment debuging outputs if needed - shows regression scenario
# (also add output to ui.R)
#output$debug10 <- renderText({paste(regressionScenario())})
# output$debug9 <- renderText({
# paste("selectedEvaluationTime:", ifelse(is.null(input$selectedEvaluationTime), "NULL", input$selectedEvaluationTime),
# "selectedCovariate:", ifelse(is.null(input$selectedCovariate), "NULL", input$selectedCovariate) ,
# "treatasBinary:",ifelse(is.null(input$treatasBinary), "NULL", input$treatasBinary),
# "useFirthCorrection:", ifelse(is.null(input$useFirthCorrection), "NULL", input$useFirthCorrection),
# "useRCSModel:", ifelse(is.null(input$useRCSModel), "NULL", input$useRCSModel))
# })
# Menus ####
output$selectEvaluationTime <- renderUI({
selectInput(inputId="selectedEvaluationTime",
label="Select evaluation occasion:",
choices=if(!is.null(measurementLevels())) {measurementLevels()},
selected=if(!is.null(measurementLevels())) {measurementLevels()[1]})
})
output$selectCovariate <- renderUI({
selectInput(inputId="selectedCovariate",
label="Select covariate for analysis:",
choices=dataVariableNames(),
selected=input$groupingVar)
})
output$checkUseFirthCorrection <- renderUI({
if (!is.null(input$treatasBinary)) {
if (input$treatasBinary==TRUE) {
checkboxInput(inputId="useFirthCorrection",
label="Use Firth correction?",
value=FALSE)
}}
})
output$checkUseRCSModel <- renderUI({
if (!is.null(input$treatasBinary) & !is.null(input$selectedCovariate)) {
if (input$treatasBinary==FALSE) {
if (determineTypeofVariable(dataExtended()[,input$selectedCovariate])[["nLevels"]]=="multilevel" &
(determineTypeofVariable(dataExtended()[,input$selectedCovariate])[["type"]]=="integer" |
determineTypeofVariable(dataExtended()[,input$selectedCovariate])[["type"]]=="numeric")
) {
        checkboxInput(inputId="useRCSModel",
                      label="Use a flexible model of the association of the selected variables with the numerical covariate?",
                      value=FALSE)
}}}
})
# Determine scenario ####
regressionScenario <- reactive({
if (!is.null(input$treatasBinary) &
!is.null(input$selectedEvaluationTime) &
!is.null(input$selectedCovariate) &
!is.null(dataFiltered()) &
!is.null(input$measurementVar) &
!is.null(input$selectedSymptoms)
) {
if (input$treatasBinary==TRUE) {
if (is.null(input$useFirthCorrection)) {return("scenarioLogist")}
if (input$useFirthCorrection==FALSE) {return("scenarioLogist")}
if (input$useFirthCorrection==TRUE) {return("scenarioLogistf")}
}
if (input$treatasBinary==FALSE) {
if (is.null(input$useRCSModel) ) {return("scenarioLinearModel")}
if (input$useRCSModel==FALSE) {return("scenarioLinearModel")}
if (input$useRCSModel==TRUE) {return("scenarioRCSModel")}
}}
})
# Scenario - Logistic regression with Firth correction
# Create results of Logistic regression with Firth correction
resultsLogistf <- reactive({
if(!is.null(regressionScenario())) {
if(regressionScenario()=="scenarioLogistf") {
out <- tableLogistf(data=dataExtended(),
measurementVar=input$measurementVar,
selectedMeasurement=input$selectedEvaluationTime,
covariate=input$selectedCovariate,
selectedSymptoms=input$selectedSymptoms,
thresholdValue=input$thresholdValue)
return(out)
}}
})
# plot - logistf ####
plotLogistfReactive <- reactive({
if(regressionScenario()!="scenarioLogistf") {return()}
plotValueswithCIs(data=resultsLogistf()[["rawResultsTable"]],
variableName="Variable",
valueName="OR",
CILowerName="CILower",
CIUpperName="CIUpper",
xLabel="Odds ratios",
yLabel="Variables",
graphTitle=paste("Odds ratios and confidence intervals for",
resultsLogistf()[["referenceValue"]],
"\n at evaluation T=",
input$selectedEvaluationTime,
"(using Firth correction)"),
vLine=1,
variableOrder=input$selectedSymptoms)
})
output$plotLogistf <- renderPlot({
if(!is.null(resultsLogistf()) ){
if(regressionScenario()=="scenarioLogistf") {
print(plotLogistfReactive())
}}
}, height=numRowsLogistf)
# table - logistf ####
output$tableLogistf <- renderDataTable({
if(!is.null(resultsLogistf()) ){
if(regressionScenario()=="scenarioLogistf") {
return(resultsLogistf()[["printableResultsTable"]])
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# table - logistf Intercepts
output$tableLogistfIntercept <- renderDataTable({
if(!is.null(resultsLogistf()) ){
if(regressionScenario()=="scenarioLogistf") {
return(resultsLogistf()[["printableInterceptTable"]])
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# description logistf
output$logistfDescr <- reactive({
if (!is.null(resultsLogistf())) {
if (regressionScenario()=="scenarioLogistf") {
description <- generateDescription("RegressionOne_Firth")
return(description())
}}
})
# Scenario - logistic regression (without Firth correction) ####
resultsLogist <- reactive({
if(!is.null(regressionScenario()) ){
if(regressionScenario()=="scenarioLogist") {
out <- tableLogist(data=dataExtended(),
measurementVar=input$measurementVar,
selectedMeasurement=input$selectedEvaluationTime,
covariate=input$selectedCovariate,
selectedSymptoms=input$selectedSymptoms,
thresholdValue=input$thresholdValue)
return(out)
}}
})
# table - logist ####
output$tableLogist <- renderDataTable({
if(!is.null(resultsLogist()) ){
if(regressionScenario()=="scenarioLogist") {
resultsLogist()[["printableResultsTable"]]
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# table - logist intercept ####
output$tableLogistIntercept <- renderDataTable({
if(!is.null(resultsLogist()) ){
if(regressionScenario()=="scenarioLogist") {
resultsLogist()[["printableInterceptTable"]]
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# plot - logist ####
plotLogistReactive <- reactive({
plotValueswithCIs(data=resultsLogist()[["rawResultsTable"]],
variableName="Variable",
valueName="OR",
CILowerName="CILower",
CIUpperName="CIUpper",
xLabel="Odds ratios",
yLabel="Variables",
graphTitle=paste("Odds ratios and confidence intervals for",
resultsLogist()[["referenceValue"]],
"\n at evaluation T=",
input$selectedEvaluationTime),
vLine=1,
variableOrder=input$selectedSymptoms)
})
output$plotLogist <- renderPlot({
if(!is.null(resultsLogist())){
if(regressionScenario()=="scenarioLogist") {
plotLogistReactive()
}}
}, height=numRowsLogist)
# description logist
output$logistDescr <- reactive({
if (!is.null(resultsLogist())) {
if (regressionScenario()=="scenarioLogist") {
description <- generateDescription("RegressionOne_OddsRatio")
return(description())
}}})
# Scenario - linear regression
resultsLinear <- reactive({
if (!is.null(regressionScenario())) {
if (regressionScenario()=="scenarioLinearModel") {
out <- tableLinear(data=dataExtended(),
measurementVar=input$measurementVar,
selectedMeasurement=input$selectedEvaluationTime,
covariate=input$selectedCovariate,
selectedSymptoms=input$selectedSymptoms)
return(out)
}}
})
# table - linear ####
output$tableLinear <- renderDataTable({
if(!is.null(resultsLinear())){
if (regressionScenario()=="scenarioLinearModel") {
resultsLinear()[["printableResultsTable"]]
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# table - linear Intercepts ####
output$tableLinearIntercept <- renderDataTable({
if(!is.null(resultsLinear())){
if (regressionScenario()=="scenarioLinearModel") {
resultsLinear()[["printableInterceptTable"]]
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# plot - linear ####
plotLinearReactive <- reactive({
plotValueswithCIs(data=resultsLinear()[["rawResultsTable"]],
variableName="Variable",
valueName="beta",
CILowerName="CILower",
CIUpperName="CIUpper",
xLabel="Beta (slope) coefficient",
yLabel="Variables",
graphTitle=paste("Beta coefficients and confidence intervals for effects of",
input$selectedCovariate,
"\n on selected variables at evaluation T=",
input$selectedEvaluationTime),
vLine=0,
variableOrder=input$selectedSymptoms)
})
output$plotLinear <- renderPlot({
if(!is.null(resultsLinear())){
if (regressionScenario()=="scenarioLinearModel") {
plotLinearReactive()
}}
}, height=numRowsLinear)
# description linear
output$linearDescr <- reactive({
if (!is.null(resultsLinear())) {
if (regressionScenario()=="scenarioLinearModel") {
description <- generateDescription("RegressionOne_Linear")
return(description())
}}})
# Scenario - modeling with Restricted Cubic Splines
# plot - RCS plot ####
plotRCSReactive <- reactive({
plotRCS(data.all=dataExtended(),
data.yn=dataFiltered.yn(),
measurement=Measurement(),
selectedSymptoms=input$selectedSymptoms,
measurementSelectedrcs=input$selectedEvaluationTime,
rcsIDVar=input$selectedCovariate,
binaryVar=input$treatasBinary)
})
output$plotRCS=renderPlot({
if(!is.null(regressionScenario())){
if (regressionScenario()=="scenarioRCSModel") {
plotRCSReactive()
}}
}, height=numRowsRCSModel)
# table - RCS table ####
output$tableRCS <- renderDataTable({
if(!is.null(regressionScenario())){
if (regressionScenario()=="scenarioRCSModel") {
tableRCS(data.all=dataExtended(),
data.yn=dataFiltered.yn(),
measurement=Measurement(),
selectedSymptoms=input$selectedSymptoms,
measurementSelectedrcs=input$selectedEvaluationTime,
rcsIDVar=input$selectedCovariate,
binaryVar=input$treatasBinary
)
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# description RCS
output$RCSDescr <- reactive({
if (!is.null(regressionScenario())) {
if (regressionScenario()=="scenarioRCSModel") {
description <- generateDescription("RegressionOne_RCS")
return(description())
}}})
# download "buttons" for each of the regression graphs ####
output$linearRegDownload <- renderUI({
if (regressionScenario()=="scenarioLinearModel") {
output$downLoadRegressionOneTime1 <- downloadPlot(plotFunction = plotLinearReactive,
width = clientData$output_plotLinear_width,
height = clientData$output_plotLinear_height,
print = TRUE)
downloadButton("downLoadRegressionOneTime1", label="Download")
}
})
output$logistRegDownload <- renderUI({
if (regressionScenario()=="scenarioLogist") {
output$downLoadRegressionOneTime2 <- downloadPlot(plotFunction = plotLogistReactive,
width = clientData$output_plotLogist_width,
height = clientData$output_plotLogist_height,
print = TRUE)
downloadButton("downLoadRegressionOneTime2", label="Download")
}
})
output$logistfRegDownload <- renderUI({
if (regressionScenario()=="scenarioLogistf") {
output$downLoadRegressionOneTime3 <- downloadPlot(plotFunction = plotLogistfReactive,
width = clientData$output_plotLogistf_width,
height = clientData$output_plotLogistf_height,
print = TRUE)
downloadButton("downLoadRegressionOneTime3", label="Download")
}
})
output$RCSRegDownload <- renderUI({
if (regressionScenario()=="scenarioRCSModel") {
output$downLoadRegressionOneTime4 <- downloadHandler(
        # since this is base graphics (not ggplot) the EPS file generation
        # has to be handled differently - the downloadPlot() function does not work
        # because the "Cairo" graphics device is used instead of "postscript";
        # this may have to do with being called from a reactive function and/or
        # the plot being created in a different environment
filename="plot.eps",
content = function (file) {
width = clientData$output_plotRCS_width
height = clientData$output_plotRCS_height
postscript(file, paper="special", width = width/72, height = height/72)
# if (input$treatasBinary==TRUE) {data=dataFilteredwithThreshold()} else {data=dataFiltered()}
plotRCS(data.all=dataExtended(),
data.yn=dataFiltered.yn(),
measurement=Measurement(),
selectedSymptoms=input$selectedSymptoms,
measurementSelectedrcs=input$selectedEvaluationTime,
rcsIDVar=input$selectedCovariate,
binaryVar=input$treatasBinary)
dev.off()
}, contentType="application/postscript"
)
downloadButton("downLoadRegressionOneTime4", label="Download")
}
})
# TAB - Regression model : all evaluations ####
# Menu
output$selectCovariate1st <- renderUI({
selectInput(inputId="selectedCovariate1st",
label="Select covariate for analysis:",
choices=dataVariableNames(),
selected=input$groupingVar)
})
output$selectMixedModelType <- renderUI({
selectInput(inputId="selectedMixedModelType",
label="Select a mixed model type:",
choices=c("Outcome ~ Covariate + Subject (random effect)"="MMsimple",
"Outcome ~ Covariate + Evaluation occasion + Subject (random effect)"="MMmeasurement",
"Outcome ~ Covariate + Time from inclusion + Subject (random effect)"="MMtimeSinceInclusion"),
selected="MMsimple")
})
# Results
mixedModelResults <- reactive({
progress <- Progress$new(session, min=1, max=100)
progress$set(message = 'Calculation in progress',
detail = 'This will take a while...',
value=NULL)
on.exit(progress$close())
mixedModel(data=dataExtended(),
selectedSymptoms=input$selectedSymptoms,
coVariate1st=input$selectedCovariate1st,
subjectIDVar=input$patientIDVar,
measurementVar=input$measurementVar,
dateVar=input$dateVar,
thresholdValue=input$thresholdValue,
treatasBinary=input$treatasBinary,
selectedModel=input$selectedMixedModelType)
})
# Table 0 ####
output$mixedModelTable0Caption <- renderText(
if(!is.null(input$selectedMixedModelType)) {
paste("Table: Intercepts for the models")
})
output$mixedModelTable0 <- renderDataTable({
if(!is.null(input$selectedMixedModelType)) {
results <- mixedModelResults()[["printableIntercept"]]
return(results)
}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# Table 1 ####
output$mixedModelTable1Caption <- renderText(
if(!is.null(input$selectedMixedModelType)) {
paste("Table: Fixed effects of",
input$selectedCovariate1st,
"for",
mixedModelResults()[["coVariate1stComparison"]])
})
output$mixedModelTable1 <- renderDataTable({
if(!is.null(input$selectedMixedModelType)) {
results <- mixedModelResults()[["printablecoVariate1st"]]
return(results)
}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# Graph 1 ####
mixedModelGraph1Reactive <- reactive({
#print(
plotFixedEffectsofcoVariate1st(calculatedStatistics=mixedModelResults()[["coVariate1st"]],
coVariate1st=input$selectedCovariate1st,
coVariate1stReferenceValue=mixedModelResults()[["coVariate1stReferenceValue"]],
treatasBinary=input$treatasBinary,
variableOrder=input$selectedSymptoms
)
})
output$mixedModelGraph1 <- renderPlot({
if(!is.null(input$selectedMixedModelType)) {
mixedModelGraph1Reactive()
}
}, height=numRowsMixedModels1)
# Download Graph 1 ####
output$graph1Download <- renderUI({
output$downLoadGraph1 <- downloadPlot(plotFunction = mixedModelGraph1Reactive,
width = clientData$output_mixedModelGraph1_width,
height = clientData$output_mixedModelGraph1_height,
print = TRUE)
downloadButton("downLoadGraph1", label="Download")
})
# Table 2 ####
output$mixedModelTable2Caption <- renderText(
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMmeasurement") {
paste("Table: Fixed effects of",
input$measurementVar,
"for T=",
mixedModelResults()[["measurementVarComparison"]],
"used as reference")
}}
)
output$mixedModelTable2 <- renderDataTable({
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMmeasurement") {
results <- mixedModelResults()[["printablemeasurementVar"]]
return(results)
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# Graph 2 ####
mixedModelGraph2Reactive <- reactive({
plotFixedEffectsofMeasurementVar(calculatedStatistics=mixedModelResults()[["measurementVar"]],
measurementVar=input$measurementVar,
treatasBinary=input$treatasBinary,
variableOrder=input$selectedSymptoms)
})
output$mixedModelGraph2 <- renderPlot({
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMmeasurement") {
mixedModelGraph2Reactive()
}}
}, height=numRowsMixedModels2)
# Download Graph 2 ####
output$graph2Download <- renderUI({
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMmeasurement") {
output$downLoadGraph2 <- downloadPlot(plotFunction = mixedModelGraph2Reactive,
width = clientData$output_mixedModelGraph2_width,
height = clientData$output_mixedModelGraph2_height,
print = TRUE)
downloadButton("downLoadGraph2", label="Download")
}}
})
# Table 3 ####
output$mixedModelTable3Caption <- renderText(
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMtimeSinceInclusion") {
paste("Table: Fixed effects of time since inclusion in the study")
}}
)
output$mixedModelTable3 <- renderDataTable({
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMtimeSinceInclusion") {
results <- mixedModelResults()[["printabledaysSinceInclusion"]]
return(results)
}}
}, options=list(bFilter=FALSE, bPaginate=FALSE, bInfo=FALSE))
# Graph 3 ####
mixedModelGraph3Reactive <- reactive({
plotFixedEffectsofDaysSinceInclusion(calculatedStatistics=mixedModelResults()[["daysSinceInclusion"]],
treatasBinary=input$treatasBinary,
variableOrder=input$selectedSymptoms)
})
output$mixedModelGraph3 <- renderPlot({
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMtimeSinceInclusion") {
mixedModelGraph3Reactive()
}}
}, height=numRowsMixedModels3)
# Download Graph 3 ####
output$graph3Download <- renderUI({
if(!is.null(input$selectedMixedModelType)) {
if (input$selectedMixedModelType=="MMtimeSinceInclusion") {
output$downLoadGraph3 <- downloadPlot(plotFunction = mixedModelGraph3Reactive,
width = clientData$output_mixedModelGraph3_width,
height = clientData$output_mixedModelGraph3_height,
print = TRUE)
downloadButton("downLoadGraph3", label="Download")
}}
})
# description Regression All
output$regressionAllDescr <- reactive({
if (!is.null(input$selectedMixedModelType)) {
description <- generateDescription("RegressionAll")
return(description())
}})
# TAB - Uploaded data ####
  # Table - list the subsetted data in an output slot ####
output$data <- renderDataTable({
if(!is.null(dataFiltered())){
data <- dataFiltered()
# TODO: We could render a renderDataTable(), but how to display dates in
# format 1.12.2014 and still sort them correctly?
# Sys.setlocale("LC_TIME", "Slovenian")
data[,input$dateVar] <- as.Date(data[,input$dateVar], format="%d.%m.%Y")
#data[,input$dateVar] <- as.character(as.Date(data[,input$dateVar], format="%d.%m.%Y"),
# format="%d.%m.%Y")
#data$Date <- as.character(as.Date(data$Date, origin="1899-12-30"),format="%d.%m.%Y")
# save(data, file="dataFiltered.Rdata")
return(data)
# NOTE: if we want to render the table of data, we have to convert the dates into
# characters, since renderTable seems to use xtable, which seems to not handle
# dates very well (http://stackoverflow.com/questions/8652674/r-xtable-and-dates)
}
})
# TAB - Selected variables ####
output$selectedVariables <- renderPrint({
selectedInputs <- reactiveValuesToList(input)
print(selectedInputs)
})
output$debug <- reactive({
# browser()
})
})<file_sep>/man/plotLogistf.Rd
\name{plotLogistf}
\alias{plotLogistf}
\title{Plot logistf}
\usage{
plotLogistf(data, data.yn, measurement, measurementSelectedlogistf,
logistfIDVar, selectedSymptoms, numSymptoms)
}
\arguments{
\item{TODO}{TODO write instructions}
}
\description{
TODO write description
}
<file_sep>/man/plotPropPositive.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{plotPropPositive}
\alias{plotPropPositive}
\title{Plot proportions with symptoms}
\usage{
plotPropPositive(data, grouping = "Sex", measurements = "Measurement",
symptomsNames)
}
\arguments{
\item{data}{Data to be passed to the function as a data frame.}
\item{grouping}{The column name of the binary grouping variable used to define the groups on the plot.}
\item{measurements}{The column name of the variable containing measurement occasions.}
\item{symptomsNames}{A vector of column names containing the symptom severity.}
}
\description{
Plots the proportion of subjects in whom a given symptom is
present.
}
<file_sep>/inst/shinyapp_symptoms/ui.R
# This is the GUI part of the shiny script that plots
# a graph displaying the presence of symptoms.
library(shiny)
# Prepare the list of possible symptoms
reportedSymptoms <- unlist(levels(symptomsData[,"variable"]))
# Define UI for displaying presence of symptoms
shinyUI(pageWithSidebar(
# Application title
headerPanel("Patients and their symptoms"),
# Define the selection panel
sidebarPanel(wellPanel(
checkboxGroupInput(inputId="selectedSymptoms",
label=h5("Symptoms:"),
choices=reportedSymptoms)
)
),
# Define the output panel
mainPanel(
plotOutput("plot"),
h3(textOutput("message"))
)
)
)
<file_sep>/man/draw2ndLevelLabels.Rd
% Generated by roxygen2 (4.0.2): do not edit by hand
\name{draw2ndLevelLabels}
\alias{draw2ndLevelLabels}
\title{Draws labels for the 2nd level groups}
\usage{
draw2ndLevelLabels(label, firstRowforDiagnosis, lastRowforDiagnosis)
}
\description{
Not meant to be called by the user.
}
<file_sep>/R/PlotSymptomsTabDistribution_OverTime.R
#' @title Plot profiles for variables.
#'
#' @description TODO
#'
#' @param TODO
plotTimelineProfiles <- function (data,
plotType="oneGraph",
personIDVar,
measurementVar,
selectedSymptoms,
sizeofRandomSample=10,
sizeofGroup=25) {
# prepare data for different type of plots
# one plot per variable - all data
if (plotType=="oneGraph") {
# prepare data
dataMelted <- melt(data=data,
id.vars=c(personIDVar, measurementVar),
measure.vars=selectedSymptoms )
# rename column names to make sure ggplot recognizes them
colnames(dataMelted)[which(colnames(dataMelted)==personIDVar)] <- "PersonID"
colnames(dataMelted)[which(colnames(dataMelted)==measurementVar)] <- "Measurement"
# set some variables as factors
dataMelted[,"PersonID"] <- as.factor(dataMelted[,"PersonID"])
dataMelted[,"Measurement"] <- as.factor(dataMelted[,"Measurement"])
}
# one plot per variable - random subsample of patients
if (plotType=="randomSample") {
# draw a random sample of people
peopleInSample <- sample(unique(data[, personIDVar]), sizeofRandomSample)
dataRandomSample <- data[data[, personIDVar] %in% peopleInSample, ]
# prepare data
dataMelted <- melt(data=dataRandomSample,
id.vars=c(personIDVar, measurementVar),
measure.vars=selectedSymptoms )
# rename column names to make sure ggplot recognizes them
colnames(dataMelted)[which(colnames(dataMelted)==personIDVar)] <- "PersonID"
colnames(dataMelted)[which(colnames(dataMelted)==measurementVar)] <- "Measurement"
# set some variables as factors
dataMelted[,"PersonID"] <- as.factor(dataMelted[,"PersonID"])
dataMelted[,"Measurement"] <- as.factor(dataMelted[,"Measurement"])
}
# more plots per variable
if (plotType=="multipleGraphs") {
# add info about group membership to the data
uniqueSubjects <- unique(data[personIDVar])
numSubjects <- dim(uniqueSubjects)[1]
numGroups <- ceiling(numSubjects/sizeofGroup)
for (i in 1:numSubjects) {
uniqueSubjects[i,"MemberOfGroup"] <- i%%numGroups
}
# join info about group membership to the data
data <- join(x=data, y=uniqueSubjects, by=personIDVar, type="inner")
# prepare data
dataMelted <- melt(data=data,
id.vars=c(personIDVar, measurementVar, "MemberOfGroup" ),
measure.vars=selectedSymptoms )
# rename column names to make sure ggplot recognizes them
colnames(dataMelted)[which(colnames(dataMelted)==personIDVar)] <- "PersonID"
colnames(dataMelted)[which(colnames(dataMelted)==measurementVar)] <- "Measurement"
# set some variables as factors
dataMelted[,"PersonID"] <- as.factor(dataMelted[,"PersonID"])
dataMelted[,"Measurement"] <- as.factor(dataMelted[,"Measurement"])
# create the plot object
p <- ggplot(data=dataMelted, aes(x=Measurement, y=value, group=PersonID, colour=PersonID)) +
# draw points, draw lines, facet by symptom, use black & white theme
geom_point() + geom_line() + facet_grid(variable + MemberOfGroup ~ .) + myTheme() +
# add summary statistics at each point
stat_summary(aes(group=1), geom="point", fun.y=median, shape=16, size=5, colour="red")
return(p)
}
# create the plot object
p <- ggplot(data=dataMelted, aes(x=Measurement, y=value, group=PersonID, colour=PersonID)) +
# draw points, draw lines, facet by symptom, use black & white theme
geom_point() + geom_line() + facet_grid(variable~.) + myTheme() +
# add summary statistics at each point
stat_summary(aes(group=1), geom="point", fun.y=median, shape=16, size=5, colour="red")
return(p)
}
#' @title Plot timeline boxplots
#'
#' @description TODO
#'
#' @param TODO
#'
plotTimelineBoxplots <- function(data,
personIDVar,
measurementVar,
selectedSymptoms,
faceting) {
# prepare data
dataMelted <- melt(data=data,
id.vars=c(personIDVar, measurementVar),
measure.vars=selectedSymptoms )
# rename column names to make sure ggplot recognizes them
colnames(dataMelted)[which(colnames(dataMelted)==personIDVar)] <- "PersonID"
colnames(dataMelted)[which(colnames(dataMelted)==measurementVar)] <- "Measurement"
# set some variables as factors
dataMelted[,"PersonID"] <- as.factor(dataMelted[,"PersonID"])
dataMelted[,"Measurement"] <- as.factor(dataMelted[,"Measurement"])
# code to draw graph
# # define x, y axis, groups, coloring
if (faceting=="variablesOnYaxis") {
p <- ggplot(data=dataMelted, aes(x=Measurement, y=value)) +
# draw points, jitter points, draw boxplots, facet by variable, use black & white theme
geom_boxplot(width=0.5) +
geom_jitter(alpha=I(1/5)) +
facet_grid(variable ~.) +
myTheme() +
ylab("Value") + xlab("Evaluation occasion")
}
if (faceting=="variablesOnXaxis") {
p <- ggplot(data=dataMelted, aes(x=variable, y=value)) +
# draw points, jitter points, draw boxplots, facet by variable, use black & white theme
geom_boxplot(width=0.5) +
geom_jitter(alpha=I(1/5)) +
facet_grid(Measurement ~.) +
myTheme() +
theme(axis.text.x=element_text(angle=90, hjust=1)) +
ylab("Value") + xlab("Variable")
}
# return ggplot
return(p)
}
#' @title Boxplot data in the form of a table
#'
#' @description TODO
#'
#' @param TODO
tabelizeBoxplots <- function(measurements,
measurementVar,
data,
selectedSymptoms ) {
tables <- list()
for (measurement in measurements) {
table <- tableMedians(measurement=measurement,
measurementVar=measurementVar,
data=data,
selectedSymptoms=selectedSymptoms)
tables[[as.character(measurement)]] <-
print(xtable(table, caption=paste("Evaluation occasion:", measurement)),
type="html",
html.table.attributes='class="data table table-bordered table-condensed"',
caption.placement="top")
}
return(lapply(tables, paste))
}
# helper function for tabelizeBoxplots
tableMedians <- function(measurement,
measurementVar,
data,
selectedSymptoms) {
result <- data.frame("Variables"=selectedSymptoms, "Median"=NA)
result2 <- data.frame("Variables"=selectedSymptoms)
data <- data[data[,measurementVar]==measurement, ]
for (symptom in selectedSymptoms) {
result[result[,"Variables"]==symptom,"Median"] <-
median(na.omit(data[ ,symptom]))
result2[result2[,"Variables"]==symptom,"Median"] <-
median(na.omit(data[ ,symptom]))
calculateMedian <- function(data, indices) {median(data[indices], na.rm=TRUE)}
temp <- boot(data=data[,symptom], R=2000, statistic=calculateMedian)
res <- boot.ci(temp, type="perc", conf=c(0.95))
result[result[,"Variables"]==symptom, "95% CI (bootstrap)"] <-
paste(format(res$percent[4], digits=2), "to", format(res$percent[5], digits=2) )
result2[result2[,"Variables"]==symptom,"CILower"] <- res$percent[4]
result2[result2[,"Variables"]==symptom,"CIUpper"] <- res$percent[5]
result[result[,"Variables"]==symptom,"IQR"] <-
IQR(na.omit(data[ ,symptom]))
result[result[,"Variables"]==symptom,"25th perc."] <-
quantile(na.omit(data[ ,symptom]), 0.25)
result[result[,"Variables"]==symptom,"75th perc."] <-
quantile(na.omit(data[ ,symptom]), 0.75)
result[result[,"Variables"]==symptom, "# NAs"] <-
sum(is.na(data[,symptom]))
}
return(list(printableTable=result, rawTable=result2))
}
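As a standalone illustration of the bootstrap percentile confidence interval computed inside tableMedians(), here is a minimal sketch on made-up data, assuming the boot package is installed:

```r
library(boot)

# bootstrap the median of a numeric vector, as done per symptom in tableMedians()
set.seed(1)
x <- c(2, 3, 5, 7, 11, 13, 17, 19, 23, 29)
calculateMedian <- function(data, indices) {median(data[indices], na.rm=TRUE)}
bootResult <- boot(data=x, statistic=calculateMedian, R=2000)
ci <- boot.ci(bootResult, type="perc", conf=0.95)
ci$percent[4:5]  # lower and upper percentile bounds, as used for "95% CI (bootstrap)"
```

The indices 4 and 5 of `ci$percent` match the extraction used in the source above.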
# construct a table of proportions
tableProportions <- function(measurement,
measurementVar,
data,
selectedSymptoms) {
result <- data.frame("Variables"=selectedSymptoms, "Positive"=NA)
data <- data[data[,measurementVar]==measurement, ]
for (symptom in selectedSymptoms) {
positive <- sum(na.omit(data[ ,symptom]))
all <- length(data[ ,symptom])
#res <- prop.test(x=positive, n=all, conf.level=0.95)
res <- binom.test(x=positive, n=all, conf.level=0.95)
result[result[,"Variables"]==symptom,"Positive"] <- positive
result[result[,"Variables"]==symptom,"All"] <- all
result[result[,"Variables"]==symptom,"Proportion"] <- format(res$estimate, digits=2)
result[result[,"Variables"]==symptom, "95% CI for proportion"] <-
paste(format(res$conf.int[1], digits=2), "to", format(res$conf.int[2], digits=2) )
result[result[,"Variables"]==symptom, "# NAs"] <-
sum(is.na(data[,symptom]))
}
return(result)
}<file_sep>/R/PlotSymptomsTabRCS.R
#' @title Plot RCS
#'
#' @description Plot RCS.
#'
#' @param data.all Data for plotting.
plotRCS <- function (data.all,
data.yn=NULL,
measurement,
selectedSymptoms,
measurementSelectedrcs,
rcsIDVar,
binaryVar=TRUE) {
num.symptoms=length(selectedSymptoms)
# only if binary data is passed to function
if(binaryVar==TRUE & is.null(data.yn)) {return()}
if(binaryVar) {my.data.symptoms.yn=data.yn[measurement==measurementSelectedrcs,]}
my.data.symptoms = data.all[measurement==measurementSelectedrcs, selectedSymptoms]
#temp: use age
my.var=data.all[measurement==measurementSelectedrcs,rcsIDVar] # should be which.rcsId ? or will it work like this?
par(mfrow=c(ceiling(num.symptoms/3), 3))
for(i in c(1:num.symptoms)){
if (binaryVar==TRUE) {my.data <- my.data.symptoms.yn[,i] } else {
my.data <- my.data.symptoms[,i]}
my.mod=glm(my.data~rcs(my.var),
family=ifelse(binaryVar, "binomial", "gaussian"), x=T, y=T)
plotRCSmod(my.mod,
my.mod$x[,2],
my.ylab=ifelse(binaryVar, "Probability of positive variable", "Estimated value"),
my.xlab=rcsIDVar,
my.title=selectedSymptoms[i])
my.p=ifelse(anova(my.mod, test="Chi")[2,5]<0.001,
"P<0.001",
paste("P=", round(anova(my.mod, test="Chi")[2,5],4)))
text(x=par("usr")[1]+10, y=par("usr")[4]-.1, labels=my.p, xpd=T)
}
}
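A minimal standalone sketch of the per-symptom spline model fitted inside plotRCS(), on simulated data, assuming the rms package (which provides rcs()) is installed:

```r
library(rms)  # provides rcs()

set.seed(2)
covariate <- runif(100, 20, 80)
outcome <- rbinom(100, 1, plogis((covariate - 50) / 10))
# logistic model with a restricted cubic spline of the covariate,
# mirroring the glm() call made for each symptom in plotRCS()
model <- glm(outcome ~ rcs(covariate), family="binomial", x=TRUE, y=TRUE)
anova(model, test="Chi")[2, 5]  # the P value reported by tableRCS()
```

With `binaryVar=FALSE` the same call is made with `family="gaussian"`.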
#' @title Plot the fit of a restricted cubic spline model
#'
#' @param My.mod A fitted model object (e.g. returned by glm()) whose predictions are plotted with 95% confidence bands.
plotRCSmod <- function(My.mod,
my.var,
my.ylab="Response",
my.xlab="Covariate",
my.xlim=NULL,
my.ylim=NULL,
my.knots=NULL,
my.title=NULL,
vlines=NULL,
hlines=NULL,
my.cex.lab=NULL,
my.axes=TRUE,
my.cex.main=NULL,
my.cex.axis=NULL ){
my.intervals<-c(0.0001, 0.001, .01, .1, 1, 10, 100, 1000, 10000, 100000)
my.range<- diff(range(my.var, na.rm=T))
if(is.null(vlines)){
my.by<-my.intervals[which(my.range/my.intervals<10)[1]]
my.vlines=seq(-100000, 10000, by=my.by)
}
my.pred<-predict(My.mod, type="response", se=T)
my.range.y=diff(range(my.pred, na.rm=T))
if(is.null(hlines)){
my.by.h<-my.intervals[which(my.range/my.intervals<10)[1]]
my.hlines=seq(-100000, 10000, by=my.by.h)
}
matplot( rbind(my.var, my.var, my.var),
rbind(my.pred$fit, my.pred$fit-1.96*my.pred$se, my.pred$fit+1.96*my.pred$se),
pch=rep(1, length(my.pred$fit)*3),
col=1, type="n", xlab=my.xlab, ylab=my.ylab, xlim=my.xlim, ylim=my.ylim,
main=my.title, cex.lab=my.cex.lab,
axes=my.axes, cex.main=my.cex.main, cex.axis=my.cex.axis)
lines( my.var[order(my.var)], my.pred$fit[order(my.var)])
lines( my.var[order(my.var)], (my.pred$fit+1.96*my.pred$se)[order(my.var)], lty=2)
lines( my.var[order(my.var)], (my.pred$fit-1.96*my.pred$se)[order(my.var)], lty=2)
# abline(h=seq(0,1,by=.1), v= vlines , lty=3, col="light grey")
if(!is.null(hlines) | !is.null(vlines)) abline(h=hlines, v= vlines , lty=3, col="light grey") else grid()
if(!is.null(my.knots)) axis(1, at=my.knots, line=2, cex.axis=.65)
}
#' @title Restricted cubic spline P values in tabular form
#'
#' @description TODO
#'
#' @param TODO
tableRCS <- function(data.all,
data.yn,
measurement,
selectedSymptoms,
measurementSelectedrcs,
rcsIDVar,
binaryVar=TRUE) {
if(binaryVar==TRUE) {
data <- data.yn[measurement==measurementSelectedrcs,]
} else {
data <- data.all[measurement==measurementSelectedrcs, selectedSymptoms]
}
variable <- data.all[measurement==measurementSelectedrcs,rcsIDVar]
table <- data.frame("Variable"=selectedSymptoms)
for (symptom in selectedSymptoms) {
model <- glm(data[,symptom]~rcs(variable), family=ifelse(binaryVar, "binomial", "gaussian"), x=T, y=T)
table[table[,"Variable"]==symptom, "P value"] <-
ifelse(anova(model, test="Chi")[2,5]<0.001,
"P<0.001",
paste(format(round(anova(model, test="Chi")[2,5],4), digits=2)))
}
return(table)
}<file_sep>/R/PlotSymptomsTabClustering.R
#' @title Plot dendrogram of patient groups based on symptoms.
#'
#' @description Plots a dendrogram of patient groups based on symptom values they
#' have.
#'
#' @param data Data frame to be passed to the function.
#' @param variableName The column name of the variable used for filtering.
#' @param variableValue The value of the filtering variable used to filter.
#' @param selectedSymptoms A vector of column names of the selected variables (symptoms).
#' @param treatasBinary Logical; if TRUE the variables are treated as binary.
plotDendrogram <- function (data,
variableName,
variableValue,
selectedSymptoms,
treatasBinary=FALSE) {
dataSubset=data[data[,variableName]==variableValue, selectedSymptoms]
#distance: 1-Spearman's correlation, agglomeration method='complete',
if (treatasBinary==FALSE) { # for numerical outcomes
plot(hclust(as.dist(1-cor(dataSubset, use="c", method="s"))), ann=FALSE)
} else { # for binary outcomes
plot(hclust(dist(t(dataSubset))), ann=FALSE)
}
}
# output$plotClusterDendrogram=renderPlot({
# print("plot the hierarchical clustering")
# #my.data.for.cluster=symptomsData()[symptomsData()[,3]==input$measurementSelected,-c(1:3)]
# my.data.for.cluster=symptomsData()[symptomsData()[,3]=="T0",-c(1:3)]
# plot(hclust(as.dist(cor(my.data.for.cluster, use="c", method="s"))))
#
#
#' @title Plot clustered heatmaps.
#'
#' @description Plots a heatmap for presence of symptoms.
#'
#' @param data Data frame to be passed to the function.
#' @param variableName The column name of the variable used for filtering.
#' @param variableValue The value of the filtering variable used to filter.
plotClusterHeatmap <- function (data,
variableName,
variableValue,
selectedSymptoms,
annotationVars=NA,
treatasBinary=FALSE) {
dataSubset=t(data[data[,variableName]==variableValue, c(selectedSymptoms)])
#TODO: remove reference above to the first three columns "-c(1:3)"
#browser()
annotation <- data.frame(data[, annotationVars])
if (treatasBinary==FALSE) {
pheatmap(dataSubset, annotation=annotation,
show_colnames=FALSE
#main=FALSE) #"Outcomes and subjects heatmap"
)
} else {
pheatmap(dataSubset,
legend_breaks=c(0,1),
legend_labels=c(0,1),
color=c('blue','red'),
drop_levels=TRUE,
annotation=annotation,
show_colnames=FALSE
#main=FALSE #"Outcomes and subjects heatmap"
)
}
}
########
plotCorrelations <- function (data,
variableName,
variableValue,
selectedSymptoms,
treatasBinary=FALSE) {
dataSubset=data[data[,variableName]==variableValue, selectedSymptoms]
pheatmap(cor(dataSubset,method="s", use="c" ), display_numbers=TRUE#,
#main="Outcome correlations heatmap"
)
}
refs/heads/master | <repo_name>sysudatabaselesson16/FPTree<file_sep>/prepare/report.md
# Paper Reading and Preliminary Work Summary
### Names: 陈扬, 陈伟松, 牛凌宇, 陆记
### Student IDs: 17343017, 17343015, 17343091, 17343080
---
## Preliminary Work
### Use a diagram to show the workflow of ordinary file I/O (fwrite, etc.): how is data exchanged between the process, the system kernel and the disk? Why must fsync be called after a write completes?

Ordinary file I/O operates on files indirectly through a buffer in order to improve read/write efficiency. When a process calls fwrite to write a file, it first sends a request to the system kernel, and the kernel stores the data in a buffer. If the buffer is not full, its content is not written out to the disk file; only when the buffer fills up, or the kernel needs to reuse it, is the buffered data written to the file.
fsync must be called because ordinary file I/O follows a delayed-write strategy: a modification is not applied to the file at the moment of the call, but only when the buffer is full or reused, which reduces write time and improves efficiency. With this approach, however, an abnormal system exit may leave modified data sitting in the buffer without ever reaching the disk file, and in multi-process scenarios it can also lead to inconsistent data. Calling fsync prevents both situations: it locks the file for writing and only unlocks once the write has actually been applied to the disk file; in a database system this amounts to guaranteeing data consistency.
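The buffered-write-then-fsync sequence described above can be sketched in a few lines of Python (the file path here is an arbitrary example):

```python
import os
import tempfile

def write_durably(path, data: bytes):
    """Write data and force it out of the buffers onto stable storage."""
    with open(path, "wb") as f:
        f.write(data)         # lands in the user-space stdio buffer first
        f.flush()             # push the buffer into the kernel page cache
        os.fsync(f.fileno())  # block until the kernel has written it to disk

path = os.path.join(tempfile.gettempdir(), "fsync_demo.bin")
write_durably(path, b"hello")
with open(path, "rb") as f:
    assert f.read() == b"hello"
```

Without the fsync call, a crash between `f.write` and the kernel's own flush could lose the data even though the write call returned successfully.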
### Briefly describe how file mapping operates on files. How does it differ from ordinary I/O? Why must msync be called after writing? When is the file content loaded into memory?
(Which functions are used, and what is their workflow?)

The difference from ordinary I/O is that ordinary I/O manipulates the file indirectly through a buffer, whereas file mapping finds the memory address where the file data resides and operates on that memory directly. This is a great advantage for reading and writing large files; furthermore, with file mapping, modifications made in user space and in kernel space are directly reflected in the mapped region, which gives efficient interaction, makes memory sharing easy to implement, and enables efficient transfer of large files.
msync must be called because the changes a process makes to shared content in the mapped region are not written back to the disk file directly; that often happens only when munmap() is called. Calling msync() keeps the file content on disk consistent with the content of the shared memory region.
The file content is loaded into memory when MapViewOfFile() is called.
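A portable sketch of the map-modify-msync cycle, using Python's mmap module (the POSIX analogue of the MapViewOfFile() call mentioned above; `mm.flush()` corresponds to msync):

```python
import mmap
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "mmap_demo.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)           # the file must exist with the mapped size

with open(path, "r+b") as f:
    mm = mmap.mmap(f.fileno(), 4096)  # map the file into the address space
    mm[0:5] = b"hello"                # modify memory directly, no write() syscall
    mm.flush()                        # msync: push the change back to the file
    mm.close()

with open(path, "rb") as f:
    assert f.read(5) == b"hello"      # the mapped write reached the file
```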
### Following [Intel's NVM emulation tutorial](https://software.intel.com/zh-cn/articles/how-to-emulate-persistent-memory-on-an-intel-architecture-server), emulate an NVM environment, test the emulated NVM's performance with tools such as fio, and compare it with the disk (screenshots of the key steps and results).
(Configuration under Ubuntu 18.04 LTS is recommended; the kernel configuration, compilation and installation steps are skipped)
Open the grub file with vim as root

After opening it, the original grub file looks like this

Edit it: add memmap=2G!8G to GRUB_CMDLINE_LINUX_DEFAULT, then save and quit with :wq!

For convenience, create the superuser root

Apply the boot configuration

After rebooting, check the memory emulation

Check whether DAX and PMEM are built into the kernel

Install a file system with DAX support; first create a directory

Then make the file system

After mapping, pmem0 can be seen

Test performance on the NVM with fio

The results:

Test performance on the disk with fio

The results:

As can be seen, the NVM is much faster than the disk
### Use [PMDK's libpmem library](http://pmem.io/pmdk/libpmem/) to write a sample program that operates on the emulated NVM (screenshots of the key experimental results; attach the build commands and a simple sample program).
(Using the sample program from the tutorial is fine; the main work is to compile, install and link the PMDK library)
The goal is to compile, install and link the PMDK library. First clone pmdk from GitHub

Then install the packages required by pmdk, listed below

Check the versions

Compile and install

Link

Add it to the shared library configuration file; after that the program can be compiled and run

The program used here is the provided sample program, shown below:
/*
* Copyright 2014-2017, Intel Corporation
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
*
* * Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in
* the documentation and/or other materials provided with the
* distribution.
*
* * Neither the name of the copyright holder nor the names of its
* contributors may be used to endorse or promote products derived
* from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
* A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
* OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
* LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
* DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
* THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/*
* simple_copy.c -- show how to use pmem_memcpy_persist()
*
* usage: simple_copy src-file dst-file
*
* Reads 4k from src-file and writes it to dst-file.
*/
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <stdio.h>
#include <errno.h>
#include <stdlib.h>
#ifndef _WIN32
#include <unistd.h>
#else
#include <io.h>
#endif
#include <string.h>
#include <libpmem.h>
/* just copying 4k to pmem for this example */
#define BUF_LEN 4096
int main(int argc, char *argv[])
{
int srcfd;
char buf[BUF_LEN];
char *pmemaddr;
size_t mapped_len;
int is_pmem;
int cc;
if (argc != 3) {
fprintf(stderr, "usage: %s src-file dst-file\n", argv[0]);
exit(1);
}
/* open src-file */
if ((srcfd = open(argv[1], O_RDONLY)) < 0) {
perror(argv[1]);
exit(1);
}
/* create a pmem file and memory map it */
if ((pmemaddr = pmem_map_file(argv[2], BUF_LEN,
PMEM_FILE_CREATE|PMEM_FILE_EXCL,
0666, &mapped_len, &is_pmem)) == NULL) {
perror("pmem_map_file");
exit(1);
}
/* read up to BUF_LEN from srcfd */
if ((cc = read(srcfd, buf, BUF_LEN)) < 0) {
pmem_unmap(pmemaddr, mapped_len);
perror("read");
exit(1);
}
/* write it to the pmem */
if (is_pmem) {
pmem_memcpy_persist(pmemaddr, buf, cc);
} else {
memcpy(pmemaddr, buf, cc);
pmem_msync(pmemaddr, cc);
}
close(srcfd);
pmem_unmap(pmemaddr, mapped_len);
exit(0);
}
---
## Paper Reading
### Summarize the main contributions and viewpoints of this paper (within 500 characters; do not translate the abstract).
>Because persistent trees are very latency-sensitive and SCM has relatively higher latency than DRAM, using a persistent tree as the basic building block on SCM provides persistence but makes writes clearly slower than on DRAM. To give SCM performance close to DRAM's, this paper proposes FPTree (Fingerprinting Persistent Tree), a B+-Tree with hybrid SCM-DRAM persistence and concurrency. The tree applies a "fingerprinting" technique: the fingerprints act as a filter that is scanned first during a search, so probes of keys irrelevant to the search key are avoided, which greatly improves SCM performance. The paper also defines persistent pointers to solve the memory leaks caused by persistent memory allocation, and uses atomically written flags to indicate whether a write has completed, solving the partial writes caused by SCM's word-granularity writes. Since persistent memory allocation is expensive, the paper further proposes amortized persistent memory allocation by allocating blocks of multiple leaves at once. In addition, to improve performance the implementation lets transactions retry several times before resorting to a fallback mechanism; since using HTM is clearly incompatible with the need to flush data to persistent memory for consistency, the paper proposes selective concurrency, i.e. applying different concurrency schemes to the transient and the persistent parts of the data. The significance of FPTree is that any node can crash and then recover to a consistent state without losing information, that SCM recovery performance is improved, and that it tolerates high SCM latency and scales well in highly concurrent scenarios.
### What are the characteristics of SCM hardware? How does it differ from an ordinary disk? Is the page-granularity way ordinary databases read and write disks suitable for operating on SCM?
>Characteristics:
>1. Byte-addressable
>2. Non-volatile
>3. Low and asymmetric latency
>4. Denser than DRAM
>5. Energy-efficient
>Difference between SCM and an ordinary disk: access is much faster than an ordinary disk, but the price is higher.
>The page-granularity way ordinary databases read and write disks is not suitable for SCM, because SCM is written at word granularity.
### Why must instructions such as CLFLUSH be called when operating on SCM?
>Because software cannot access all of the buffers in front of SCM, and without software support there is no way to guarantee the order and durability of SCM writes. Instructions such as CLFLUSH, MFENCE and SFENCE are therefore called to fix the order in which SCM memory operations are performed. If these instructions are not called after writing, the system's data may all be lost after a crash.
### What is the important role of FPTree's fingerprinting technique?
>Fingerprints are one-byte hash values of the keys in an FPTree leaf, stored contiguously at the beginning of the leaf. A fingerprint acts as a filter: during a search the fingerprints are scanned first, so that probes of keys irrelevant to the search key are avoided. This reduces the number of probes and thus greatly improves SCM performance.
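The filtering effect described above can be sketched as follows (the one-byte hash here is just `hash(k) & 0xFF` for illustration; the paper's FPTree uses a dedicated hash function):

```python
def fp(key) -> int:
    """One-byte fingerprint of a key (illustrative hash)."""
    return hash(key) & 0xFF

class Leaf:
    def __init__(self, pairs):
        self.keys = [k for k, _ in pairs]
        self.values = [v for _, v in pairs]
        self.fingerprints = [fp(k) for k in self.keys]  # stored contiguously

    def find(self, key):
        f = fp(key)
        probes = 0
        for i, stored in enumerate(self.fingerprints):
            if stored != f:        # filtered out without touching the key itself
                continue
            probes += 1            # only fingerprint matches cost a key probe
            if self.keys[i] == key:
                return self.values[i], probes
        return None, probes

leaf = Leaf([(k, k * 10) for k in range(32)])
value, probes = leaf.find(7)
assert value == 70
assert probes <= len(leaf.keys)  # far fewer key probes than a full scan
```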
### For the mathematical proof behind the fingerprinting technique to hold, how should the hash function be chosen?
(What property must the generated hash values have? Can they simply be generated by taking the key modulo some value?)
>Property of the hash values: they must follow a uniform distribution.
>They cannot simply be generated by taking the key modulo some value.
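A quick sanity check of why uniformity matters: with m keys in a leaf and uniformly distributed one-byte fingerprints, a lookup for an absent key costs about m/256 key probes on average (an illustrative simulation, not the paper's proof):

```python
import random

def expected_false_probes(m, trials=2000, seed=1):
    """Average number of fingerprint collisions for an absent key."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        fps = [rng.randrange(256) for _ in range(m)]  # uniform fingerprints
        probe_fp = rng.randrange(256)                 # fingerprint of absent key
        total += sum(1 for f in fps if f == probe_fp)
    return total / trials

m = 64
avg = expected_false_probes(m)
assert abs(avg - m / 256) < 0.1  # close to the analytical m/256 = 0.25
```

A skewed hash (e.g. key mod 8) would concentrate fingerprints on a few values and make this expectation much worse.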
### What is the role of persistent pointers? What concept from the lectures are they similar to?
>Role: historical data in memory can be accessed through persistent pointers, in order to check whether memory leaks exist.
>Similar to pid+offset
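The pid+offset analogy can be made concrete: a persistent pointer stores a file id and an offset rather than a raw address, so it stays valid across restarts once the files are re-mapped. The names below mirror the PPointer used later in this repository's PAllocator, but the mapping table values are made up for the example:

```python
class PPointer:
    def __init__(self, file_id, offset):
        self.file_id = file_id
        self.offset = offset

def resolve(p, file_base_addrs):
    """Turn a persistent pointer into a virtual address via the mapping table."""
    return file_base_addrs[p.file_id] + p.offset

# fake mapping table: file id -> base address of the memory-mapped file
bases = {1: 0x7f0000000000, 2: 0x7f0000100000}
p = PPointer(2, 4096)
assert resolve(p, bases) == 0x7f0000100000 + 4096
```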
<file_sep>/Programming-FPTree/src/lycsb.cpp
#include <leveldb/db.h>
#include <string>
#include <cassert>  // assert
#include <cstdio>   // fopen, fscanf, printf
#include <cstring>  // strcmp
#include <ctime>    // clock_gettime
#define KEY_LEN 8
#define VALUE_LEN 8
using namespace std;
const string workload = "../workloads/"; // note the trailing slash, so the filenames below resolve correctly
const string load = workload + "220w-rw-50-50-load.txt"; // the workload_load filename
const string run = workload + "220w-rw-50-50-run.txt"; // the workload_run filename
const string filePath = "";
const int READ_WRITE_NUM = 2200000; // TODO: how many operations
int main()
{
leveldb::DB* db;
leveldb::Options options;
leveldb::WriteOptions write_options;
// TODO: open and initial the levelDB
leveldb::Status s = leveldb::DB::Open(options, "/tmp/testdb", &db);
assert(s.ok());
uint64_t inserted = 0, queried = 0, t = 0;
uint64_t* key = new uint64_t[2200000]; // the key and value are same
bool* ifInsert = new bool[2200000]; // the operation is insertion or not
FILE *ycsb_load, *ycsb_run; // the files that store the ycsb operations
char *buf = NULL;
size_t len = 0;
struct timespec start, finish; // use to caculate the time
double single_time; // single operation time
printf("Load phase begins \n");
// TODO: read the ycsb_load and store
ycsb_load = fopen(load.c_str(), "r"); // fopen takes a C string
char op[7]; // buffer for the operation name ("INSERT"/"READ")
if (ycsb_load == NULL) return 0;
/*int flag = 0, file_row = 0;
while(!feof(ycsb_load)){
flag = fgetc(ycsb_load);
if(flag == '\n')
file_row++;
}
file_row += 1;*/
for (t = 0; t < 2200000; t++) {
unsigned long long k; // %llu matches unsigned long long; copy into the uint64_t array afterwards
ifInsert[t] = false;
fscanf(ycsb_load, "%s %llu", op, &k);
key[t] = k;
if (strcmp(op, "INSERT") == 0) {
ifInsert[t] = true;
}
}
clock_gettime(CLOCK_MONOTONIC, &start);
// TODO: load the workload in LevelDB
for (t = 0; t < 2200000; t++) {
if (ifInsert[t]) {//record the number of insert
inserted++;
}
//write operations
leveldb::Status s = db->Put(write_options, leveldb::Slice((char*)&key[t], KEY_LEN), leveldb::Slice((char*)&key[t], VALUE_LEN));
if (!s.ok()) printf("put failed!");
}
clock_gettime(CLOCK_MONOTONIC, &finish);
single_time = (finish.tv_sec - start.tv_sec) * 1000000000.0 + (finish.tv_nsec - start.tv_nsec);
printf("Load phase finishes: %llu items are inserted \n", (unsigned long long)inserted);
printf("Load phase used time: %fs\n", single_time / 1000000000.0);
printf("Load phase single insert time: %fns\n", single_time / inserted);
int operation_num = 2200000;
inserted = 0;
// TODO:read the ycsb_run and store
ycsb_run = fopen(run.c_str(), "r"); // fopen takes a C string
char op1[7];
if (ycsb_run == NULL) return 0;
for (t = 0; t < 2200000; t++) {
ifInsert[t] = false;
unsigned long long k; fscanf(ycsb_run, "%s %llu", op1, &k); key[t] = k; // %llu matches unsigned long long
if (strcmp(op1,"INSERT") == 0){
ifInsert[t] = true;
}
}
clock_gettime(CLOCK_MONOTONIC, &start);
// TODO: operate the levelDB
string val;
for (t = 0; t < 2200000; t++) {
if (ifInsert[t]) {
inserted++;
leveldb::Status s1 = db->Put(write_options, leveldb::Slice((char*)&key[t], KEY_LEN), leveldb::Slice((char*)&key[t], VALUE_LEN));
if (!s1.ok()) printf("Put failed!");
}
else {
leveldb::Status s2 = db->Get(leveldb::ReadOptions(), leveldb::Slice((char*)&key[t], KEY_LEN), &val);
if (!s2.ok()) printf("Get failed!");
}
}
clock_gettime(CLOCK_MONOTONIC, &finish);
single_time = (finish.tv_sec - start.tv_sec) + (finish.tv_nsec - start.tv_nsec) / 1000000000.0;
printf("Run phase finishes: %llu/%llu items are inserted/searched\n", (unsigned long long)inserted, (unsigned long long)(operation_num - inserted));
printf("Run phase throughput: %f operations per second \n", READ_WRITE_NUM/single_time);
fclose(ycsb_load);
fclose(ycsb_run);
delete[] key;
delete[] ifInsert;
return 0;
}
<file_sep>/Programming-FPTree/src/p_allocator.cpp
#include "utility/p_allocator.h"
#include <iostream>
using namespace std;
// the file that store the information of allocator
const string P_ALLOCATOR_CATALOG_NAME = "p_allocator_catalog";
// a list storing the free leaves
const string P_ALLOCATOR_FREE_LIST = "free_list";
size_t mapped_len;
PAllocator *PAllocator::pAllocator = new PAllocator();
PAllocator *PAllocator::getAllocator()
{
if (PAllocator::pAllocator == NULL)
{
PAllocator::pAllocator = new PAllocator();
}
return PAllocator::pAllocator;
}
/* data storing structure of allocator
In the catalog file, the data structure is listed below
| maxFileId(8 bytes) | freeNum = m | treeStartLeaf(the PPointer) |
In freeList file:
| freeList{(fId, offset)1,...(fId, offset)m} |
*/
PAllocator::PAllocator()
{
string allocatorCatalogPath = DATA_DIR + P_ALLOCATOR_CATALOG_NAME;
string freeListPath = DATA_DIR + P_ALLOCATOR_FREE_LIST;
ifstream allocatorCatalog(allocatorCatalogPath, ios::in | ios::binary);
ifstream freeListFile(freeListPath, ios::in | ios::binary);
// judge if the catalog exists
if (allocatorCatalog.is_open() && freeListFile.is_open())
{
// exist
allocatorCatalog.read((char *)&maxFileId, sizeof(maxFileId));
allocatorCatalog.read((char *)&freeNum, sizeof(freeNum));
allocatorCatalog.read((char *)&startLeaf.fileId, sizeof(startLeaf.fileId));
allocatorCatalog.read((char *)&startLeaf.offset, sizeof(startLeaf.offset));
freeList.resize(freeNum);
for (int i = 0; i < freeNum; i++)
{
freeListFile.read((char *)&(freeList[i].fileId), sizeof(freeList[i].fileId));
freeListFile.read((char *)&(freeList[i].offset), sizeof(freeList[i].offset));
}
allocatorCatalog.close();
freeListFile.close();
// TODO
}
else
{
// not exist, create catalog and free_list file, then open.
if (!allocatorCatalog.is_open())
{
ofstream f(allocatorCatalogPath, ios::out | ios::binary);
if (!f)
cout << "Fail in creating file.\n";
else
{
maxFileId = 1;
freeNum = 0;
f.write((char *)&maxFileId, sizeof(maxFileId));
f.write((char *)&freeNum, sizeof(freeNum));
f.close();
}
}
else if (!freeListFile.is_open())
{
ofstream f(freeListPath, ios::out | ios::binary);
if (!f)
cout << "Fail in creating file.\n";
else
f.close();
}
// TODO
}
this->initFilePmemAddr();
}
PAllocator::~PAllocator()
{
persistCatalog(); // write the metadata back to NVM on shutdown so it can be reloaded next time
for (int i = 1; i < maxFileId; i++)
{
pmem_unmap(fId2PmAddr[i], mapped_len);
}
fId2PmAddr.clear();
freeList.clear();
maxFileId = 1;
freeNum = 0;
pAllocator = NULL;
// the singleton instance cannot delete itself here, so just reset the pointer to NULL
// TODO
}
// memory map all leaves to pmem address, storing them in the fId2PmAddr
void PAllocator::initFilePmemAddr()
{
int PMEM_LEN = LEAF_GROUP_HEAD + LEAF_GROUP_AMOUNT * calLeafSize(); // bytes to map per file, i.e. the size of one leaf group (same formula as newLeafGroup)
for (int i = 1; i < maxFileId; i++)
{
char *pmemaddr;
int is_pmem;
char ss[10];
sprintf(ss, "%d", i); // convert the file id to a string to form the data file name
string data_path = DATA_DIR + ss;
if ((pmemaddr = (char *)pmem_map_file(data_path.c_str(), PMEM_LEN, PMEM_FILE_CREATE, 0666, &mapped_len, &is_pmem)) == NULL)
{
perror("pmem_map_file");
exit(1);
}
fId2PmAddr[i] = pmemaddr; // remember the fileId -> virtual address mapping
pmem_persist(pmemaddr, mapped_len); // persist the mapped region
}
// TODO
}
// get the pmem address of the target PPointer from the map fId2PmAddr
char *PAllocator::getLeafPmemAddr(PPointer p)
{
// TODO
if (ifLeafExist(p)) // any existing leaf (used or free) must resolve to its address
{
return (fId2PmAddr[p.fileId] + p.offset);
}
return NULL;
}
// get and use a leaf for the fptree leaf allocation
// return
bool PAllocator::getLeaf(PPointer &p, char *&pmem_addr)
{
// TODO
if (freeNum >= 1 || newLeafGroup()) // allocate a new leaf group when no free leaf is left
{
p.offset = freeList[freeNum - 1].offset;
p.fileId = freeList[freeNum - 1].fileId;
pmem_addr = fId2PmAddr[p.fileId];
pmem_addr[sizeof(uint64_t) + (p.offset - LEAF_GROUP_HEAD) / calLeafSize()] = 1; // mark the leaf's bitmap slot as used; the slot index is derived from the offset
((uint64_t *)pmem_addr)[0] += 1; // usedNum of the leaf group ++
pmem_persist(pmem_addr, mapped_len);
pmem_addr = getLeafPmemAddr(p); // what the caller needs is the leaf's virtual address
freeNum--;
freeList.pop_back();
return true;
}
return false;
}
bool PAllocator::ifLeafUsed(PPointer p)
{
// TODO
if (ifLeafExist(p))
{
char *pmem_addr = fId2PmAddr[p.fileId];
return pmem_addr[sizeof(uint64_t) + (p.offset - LEAF_GROUP_HEAD) / calLeafSize()] == 1;
}
return false;
}
bool PAllocator::ifLeafFree(PPointer p)
{
// TODO
for (int i = 0; i < freeList.size(); i++)
if (p == freeList[i])
return true;
return false;
}
// judge whether the leaf with specific PPointer exists.
bool PAllocator::ifLeafExist(PPointer p)
{
// TODO
return p.fileId > 0 && p.fileId < maxFileId && p.offset >= LEAF_GROUP_HEAD && (p.offset - LEAF_GROUP_HEAD) / calLeafSize() < LEAF_GROUP_AMOUNT && ((p.offset - LEAF_GROUP_HEAD) % calLeafSize() == 0);
// the fileId must be valid and the offset must fall exactly on the start of a leaf inside the group
}
// free and reuse a leaf
bool PAllocator::freeLeaf(PPointer p)
{
// TODO
if (ifLeafExist(p) && ifLeafUsed(p) && !ifLeafFree(p))
{
char *pmem_addr = fId2PmAddr[p.fileId];
pmem_addr[sizeof(uint64_t) + (p.offset - LEAF_GROUP_HEAD) / calLeafSize()] = 0;
((uint64_t *)pmem_addr)[0] -= 1;
freeList.push_back(p);
freeNum++;
return true;
}
return false;
}
bool PAllocator::persistCatalog()
{
// TODO
// write the allocator's data structures back to NVM
string allocatorCatalogPath = DATA_DIR + P_ALLOCATOR_CATALOG_NAME;
string freeListPath = DATA_DIR + P_ALLOCATOR_FREE_LIST;
ofstream allocatorCatalog(allocatorCatalogPath, ios::out | ios::binary);
ofstream freeListFile(freeListPath, ios::out | ios::binary);
allocatorCatalog.write((char *)&maxFileId, sizeof(maxFileId));
allocatorCatalog.write((char *)&freeNum, sizeof(freeNum));
allocatorCatalog.write((char *)&startLeaf.fileId, sizeof(startLeaf.fileId));
allocatorCatalog.write((char *)&startLeaf.offset, sizeof(startLeaf.offset));
for (int i = 0; i < freeNum; i++)
{
freeListFile.write((char *)&(freeList[i].fileId), sizeof(freeList[i].fileId));
freeListFile.write((char *)&(freeList[i].offset), sizeof(freeList[i].offset));
}
allocatorCatalog.close();
freeListFile.close();
return true;
}
/*
Leaf group structure: (uncompressed)
| usedNum(8b) | bitmap(n * byte) | leaf1 |...| leafn |
*/
// create a new leafgroup, one file per leafgroup
bool PAllocator::newLeafGroup()
{
// TODO
char ss[10];
sprintf(ss, "%d", maxFileId);
string newpath = DATA_DIR + ss;
ofstream out(newpath, ios::binary | ios::out); // only used to create the file
if (!out)
{
cout << "Cann't create new file." << endl;
return false;
}
else
{
out.close();
int PMEM_LEN = LEAF_GROUP_HEAD + LEAF_GROUP_AMOUNT * calLeafSize();
char *pmemaddr;
int is_pmem;
size_t mapped_len;
if ((pmemaddr = (char *)pmem_map_file(newpath.c_str(), PMEM_LEN, PMEM_FILE_CREATE, 0666, &mapped_len, &is_pmem)) == NULL)
{
cout << "pmem_map_file" << endl;
return false;
}
fId2PmAddr[maxFileId] = pmemaddr;
pmem_memset(pmemaddr, 0, sizeof(Byte), mapped_len);
pmem_persist(pmemaddr, mapped_len);
freeNum += LEAF_GROUP_AMOUNT;
for (int i = 0; i < LEAF_GROUP_AMOUNT; i++)
{
PPointer p;
p.fileId = maxFileId;
p.offset = LEAF_GROUP_HEAD + i * calLeafSize();
freeList.push_back(p);
}
maxFileId++;
return true;
}
return false;
}
<file_sep>/Programming-FPTree/SystemManuals.md
# FPTreeBD System Manual
This document is the system manual for the course project **FPTreeBD**. Since development proceeds by completing a base template, many
details of this manual will be filled in as the work is completed step by step.
## 1 Introduction
### 1.1 Purpose
This manual clarifies the requirements of the course project and serves as the main reference and detailed specification for the subsequent development process.
### 1.2 Background
Project name: FPTreeBD key-value storage system
Project members: 陈扬、陈伟松、陆记、牛凌宇
### 1.3 References
.
## 2 System Overview
### 2.1 System Requirements
This course project implements a simple key-value storage system based on FPTree, a data structure optimized for NVM. It is finally packaged as a callable library that exposes 4 basic operations on data:
1. Insert - add data
2. Remove - delete data
3. Update - modify data
4. Find - query data
A single-threaded FPTree is implemented.
### 2.2 System Architecture
The system is based on the FPTree data structure; the architecture is as follows

### 2.3 Project Directory Structure
```
|__gtest: Google Test project directory, no need to touch
|__include: contains all header files used
|__fptree: folder with the fptree header
|__fptree.h: header file of fptree
|__utility: folder with the headers of the fptree utilities
|__utility.h: header with utility functions such as fingerprint computation
|__clhash.h: header of the hash function used for fingerprint computation
|__p_allocator.h: header of the NVM memory allocator
|__src: the project sources; all implementations are completed here
|__bin: folder with the executables
|__main: executable of main.cpp
|__lycsb: executable of lycsb.cpp
|__ycsb: executable of ycsb.cpp
|__fptree.cpp: source file of fptree, the core of the project
|__clhash.c: source of the hash function used for fingerprint computation
|__p_allocator.cpp: source of the NVM memory allocator
|__lycsb.cpp: YCSB test code for LevelDB
|__ycsb.cpp: YCSB comparison test code for FPTreeDB and LevelDB
|__makefile: build file of the project under src
|__workloads: YCSB workload files, used for the YCSB benchmark
|__<data size>-rw-<read ratio>-<write ratio>-load.txt: YCSB database load file
|__<data size>-rw-<read ratio>-<write ratio>-run.txt: YCSB run file
|__test: Google Test user test code; make it compile and pass all tests
|__bin: folder with the unit test executables
|__fptree_test: executable of fptree_test.cpp
|__utility_test: executable of utility_test.cpp
|__fptree_test.cpp: tests related to fptree
|__utility_test.cpp: tests related to PAllocator etc.
|__makefile: build file for the gtest unit tests
```
### 2.4 Project Acceptance Criteria
The completion of this project is verified with the **YCSB benchmark** and **Google Test**
## 3 System Implementation
### 3.1 FPTree
This is the interface class of the whole key-value storage system
#### FPTree::FPTree(uint64_t t_degree)
Constructor
#### FPTree::~FPTree()
Destructor
#### void FPTree::recursiveDelete(Node* n)
Deletes the whole tree; called by the destructor
#### bool FPTree::bulkLoading()
This is the main function for rebuilding the tree
It first checks whether there are data files in the target folder and, if so, traverses them to perform bulk loading
If there are no data files, a new tree is created

#### InnerNode* FPTree::getRoot()
Gets the root node
#### void FPTree::changeRoot(InnerNode* newRoot)
Installs a new root node
#### void FPTree::printTree()
Prints the tree
#### FPTree::insert(Key k, Value v)
Adds data
Simply delegates to the node's function
#### bool FPTree::remove(Key k)
Deletes data
Simply delegates to the node's function
#### bool FPTree::update(Key k, Value v)
Updates data
Simply delegates to the node's function
#### Value FPTree::find(Key k)
Queries data
Simply delegates to the node's function
### 3.2 KeyNode
This data structure consists of a key and a node index. It is used when a node splits, to return the index of the newly created node to the parent node so a record can be inserted there.
It is used in the insert and split operations.
As shown below

### 3.3 InnerNode
This is the inner index node of the FPTree; it stores no key-value data. Its structure is as follows:

#### InnerNode::InnerNode(const int& d, FPTree* const& t, bool _isRoot)
Constructor
#### InnerNode::~InnerNode()
Destructor
#### int InnerNode::findIndex(const Key& k)
Binary search for the target child node
#### void InnerNode::insertNonFull(const Key& k, Node* const& node)
Direct insertion when the node is not full
#### KeyNode* InnerNode::insert(const Key& k, const Value& v)
Key-value insertion
#### KeyNode* InnerNode::insertLeaf(const KeyNode& leaf)
Inserts a leaf node
#### KeyNode* InnerNode::split()
Splits a full node
#### bool InnerNode::remove(const Key& k, const int& index, InnerNode* const& parent, bool &ifDelete)
Key-value deletion
#### void InnerNode::getBrother(const int& index, InnerNode* const& parent, InnerNode* &leftBro, InnerNode* &rightBro)
Gets the left and right siblings of an inner node. If both siblings exist, the right sibling is handled uniformly.
#### void InnerNode::mergeParentLeft(InnerNode* const& parent, InnerNode* const& leftBro)
Merges with the parent node and the left sibling
#### void InnerNode::mergeParentRight(InnerNode* const& parent, InnerNode* const& rightBro)
Merges with the parent node and the right sibling
#### void InnerNode::redistributeLeft(const int& index, InnerNode* const& leftBro, InnerNode* const& parent)
Redistributes with the left sibling
#### void InnerNode::redistributeRight(const int& index, InnerNode* const& rightBro, InnerNode* const& parent)
Redistributes with the right sibling
#### void InnerNode::mergeLeft(InnerNode* const& leftBro, const Key& k)
Merges with the left sibling
#### void InnerNode::mergeRight(InnerNode* const& rightBro, const Key& k)
Merges with the right sibling
#### void InnerNode::removeChild(const int& keyIdx, const int& childIdx)
Removes a child
#### bool InnerNode::update(const Key& k, const Value& v)
Key-value update
#### Value InnerNode::find(const Key& k)
Key-value lookup
#### Node* InnerNode::getChild(const int& idx)
Gets a child
#### Key InnerNode::getKey(const int& idx)
Gets a key
#### void InnerNode::printNode()
Prints the node
### 3.4 LeafNode
This is the object that directly stores the FPTree's data; all key-value pairs are stored only in leaf nodes.
Leaf nodes are also the objects that interact with NVM: they operate on the virtual addresses obtained after PAllocator maps the NVM files, manipulating the corresponding data files through file mapping.
Because a node operates on the virtual address of memory-mapped file data, the key point is to set up the pointers to the NVM data properly. The structure is as follows:

#### LeafNode::LeafNode(FPTree* t) && LeafNode::LeafNode(PPointer p, FPTree* t)
Constructors
#### LeafNode::~LeafNode()
Destructor
#### void LeafNode::printNode()
Prints the node
#### KeyNode* LeafNode::insert(const Key& k, const Value& v)
Key-value insertion
#### void LeafNode::insertNonFull(const Key& k, const Value& v)
Direct insertion when the node is not full
#### KeyNode* LeafNode::split()
Split
#### Key LeafNode::findSplitKey()
Finds the median key
#### int LeafNode::getBit(const int& idx)
Gets a bit
#### Key LeafNode::getKey(const int& idx)
Gets a key
#### Value LeafNode::getValue(const int& idx)
Gets a value
#### PPointer LeafNode::getPPointer()
Gets the pointer
#### bool LeafNode::remove(const Key& k, const int& index, InnerNode* const& parent, bool &ifDelete)
Key-value deletion
#### bool LeafNode::update(const Key& k, const Value& v)
Key-value update
#### Value LeafNode::find(const Key& k)
Key-value lookup
#### int LeafNode::findFirstZero()
Finds the first empty slot
#### void LeafNode::persist()
Persists the whole leaf
### 3.5 PAllocator
This is the main object for NVM file management. It allocates space for LeafNodes in NVM, maps the data files and returns virtual addresses for the LeafNodes to use. It manages leaf files at the granularity of a LeafGroup: a LeafGroup consists of several leaves plus a data header, and the header consists of an 8-byte count of the leaves currently used in the group and a bitmap of the leaf slots; for simplicity the bitmap uses 1 byte per slot.
#### 3.5.1 Main Data Files
1. LeafGroup structure: | usedNum(8 bytes) | bitmap(n bytes) | Leaf1 | ... | leafN |
2. catalog: | maxFileId(8 bytes) | freeNum(8 bytes) | treeStartLeaf(PPointer) |
3. freeList: | (fId, offset)1, ..., (fId, offset)N |
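The byte offset of a leaf inside a group follows directly from this layout (the header and leaf sizes below are illustrative values, not the project's actual constants):

```python
HEADER = 8  # usedNum, 8 bytes

def leaf_offset(i, n_leaves, leaf_size):
    """Byte offset of leaf i in a group: header + bitmap + preceding leaves."""
    group_head = HEADER + n_leaves  # 1 bitmap byte per leaf slot
    return group_head + i * leaf_size

assert leaf_offset(0, 16, 256) == 8 + 16
assert leaf_offset(3, 16, 256) == 8 + 16 + 3 * 256
```

This is the same arithmetic PAllocator uses in reverse when it turns a PPointer's offset back into a bitmap slot index.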
#### 3.5.2 Class Description
This class is a singleton; only one FPTree can be open at a time
##### PAllocator::PAllocator()
Constructor
Reads the catalog and freeList files from the non-volatile NVM memory and loads the state left by the previous run
##### PAllocator::~PAllocator()
Destructor
Removes the mappings between virtual addresses and fileIds and sets pAllocator to NULL
##### void PAllocator::initFilePmemAddr()
After loading the previous state, maps each fileId to a virtual address
##### char *PAllocator::getLeafPmemAddr(PPointer p)
Gets the virtual address of a leaf node
##### bool PAllocator::getLeaf(PPointer &p, char *&pmem_addr)
Obtains a leaf node
##### bool PAllocator::ifLeafUsed(PPointer p)
Checks whether a leaf node is in use
##### bool PAllocator::ifLeafFree(PPointer p)
Checks whether a leaf node is free
##### bool PAllocator::ifLeafExist(PPointer p)
Checks whether a leaf node exists
##### bool PAllocator::freeLeaf(PPointer p)
Frees a leaf node
##### bool PAllocator::persistCatalog()
Writes the catalog and freeList into their files
##### bool PAllocator::newLeafGroup()
Creates a new LeafGroup
### 3.6 ycsb
This is a benchmark for testing key-value database performance
Its command format:
```
Commands:
read key [field1 field2 ...] - Read a record
scan key recordcount [field1 field2 ...] - Scan starting at key
insert key name1=value1 [name2=value2 ...] - Insert a new record
update key name1=value1 [name2=value2 ...] - Update a record
delete key - Delete a record
table [tablename] - Get or [set] the name of the table
quit - Quit
```
#### 3.6.1 Step 1: read the load file
Initialize the database
#### 3.6.2 Step 2: read the run file
Operate on the database
## 4 Implementation Schedule
| Task | Deadline |
| ------------------------------- | ---------------- |
| System manual | before 5/4 (updated continuously) |
| Implement PAllocator and pass the utility tests | before 5/4 |
| Use and test LevelDB | before 5/4 |
| FPTreeDB insert operation, passing the related tests | before 5/11 |
| FPTreeDB reload (bulk-loading) operation, passing the related tests | before 5/11 |
| FPTreeDB find operation, passing the related tests | before 5/18 |
| FPTreeDB update operation, passing the related tests | before 5/18 |
| FPTreeDB remove operation, passing the related tests | before 5/31 |
| All remaining FPTreeDB implementation and tests | before 5/31 |
## 5 Progress:
The LevelDB environment setup, the test code, and the header file of the NVM memory allocator (p_allocator.h) have been completed.
refs/heads/master | <file_sep>import { Component } from '@angular/core';
import { LugaresService } from '../services/lugares.service';
@Component({
selector: 'app-lugares',
templateUrl: './lugares.component.html',
styleUrls: ['./lugares.component.css']
})
export class LugaresComponent {
title = 'app';
lat: number = -12.0909665;
lng: number = -77.0276488;
lugares = null;
constructor(private lugaresService: LugaresService) {
this.lugares = lugaresService.getLugares();
}
}
<file_sep>import { Injectable } from "@angular/core";
@Injectable()
export class LugaresService {
lugares: any = [
{ id: 1, plan: 'pagado', cercania: 1, distancia: 1, active: true, nombre: "<NAME>", description: 'Descripción de mi negocio' },
{ id: 2, plan: 'gratuito', cercania: 1, distancia: 1.8, active: true, nombre: "<NAME>", description: 'Descripción de mi negocio' },
{ id: 3, plan: 'gratuito', cercania: 2, distancia: 5, active: false, nombre: "<NAME>", description: '' },
{ id: 4, plan: 'gratuito', cercania: 3, distancia: 10, active: true, nombre: "<NAME>", description: 'Descripción de mi negocio' },
{ id: 5, plan: 'pagado', cercania: 3, distancia: 35, active: true, nombre: "<NAME>", description: 'Descripción de mi negocio' },
{ id: 6, plan: 'gratuito', cercania: 3, distancia: 120, active: true, nombre: "<NAME>", description: 'Descripción de mi negocio' }
];
public getLugares() {
return this.lugares;
}
public buscarLugar(id) {
return this.lugares.find(lugar => lugar.id == id) || null; // destructuring ({ id }) shadowed the parameter and matched every record
}
}
refs/heads/master | <file_sep>#include <iostream>
using namespace std;
class ArrayRect
{
int** table;
public:
ArrayRect(int e, int n)
{
table = new int*[3];
for(int i=0;i<3;i++)
{
table[i] = new int[n+1];
table[i][0] = n;
for(int j=1;j<=n;j++)
{
table[i][j]=e;
}
}
}
ArrayRect(int **arr, int n)
{
table = new int*[3];
for(int i=0;i<3;i++)
{
table[i] = new int[n+1];
table[i][0] = n;
for(int j=1;j<=n;j++)
{
table[i][j] = arr[i][j-1];
}
}
}
ArrayRect(const ArrayRect& ar)
{
table = new int*[3];
for(int i=0;i<3;i++)
{
table[i] = new int[ar.table[i][0]+1]; //dlugosc tamtej tablicy + 1
table[i][0] = ar.table[i][0];
for(int j=1;j<=ar.table[i][0];j++)
{
table[i][j] = ar.table[i][j];
}
}
}
~ArrayRect()
{
for(int i=0;i<3;i++)
{
cout << "Deleting row " << i+1 << " with elements: ";
for(int j=1;j<=table[i][0];j++) cout << table[i][j] << " ";
cout << endl;
delete[] table[i]; // rows were allocated with new[], so delete[] is required
}
delete[] table;
}
ArrayRect* addRow(int e, int p)
{
int min = table[0][0];
int indexMin = 0;
for(int i=1;i<3;i++)
{
if(table[i][0] < min)
{
indexMin = i;
min = table[i][0];
}
}
if(p>table[indexMin][0]) return this;
else
{
ArrayRect *old = new ArrayRect(*this);
int * newRow = new int[table[indexMin][0]+2]; // header cell + (n+1) elements after the insertion
newRow[0] = table[indexMin][0]+1;
int offset = 0;
for(int i=0;i<newRow[0];i++)
{
if(i==p)
{
newRow[i+1] = e;
offset = 1;
}
else newRow[i+1] = table[indexMin][i+1-offset];
}
delete[] table[indexMin];
table[indexMin] = newRow;
return old;
}
}
int max()
{
int max = table[0][1];
for(int i=0;i<3;i++)
{
for(int j=1;j<=table[i][0];j++)
{
if(max < table[i][j]) max = table[i][j];
}
}
return max;
}
int longest()
{
int max = table[0][0];
for(int i=1;i<3;i++)
{
if(max<table[i][0]) max = table[i][0];
}
return max;
}
bool ifRext()
{
int d = table[0][0];
for(int i=1;i<3;i++)
if(d!=table[i][0]) return false;
return true;
}
ArrayRect* delRow(int p)
{
int max = table[0][0];
int indexMax = 0;
for(int i=1;i<3;i++)
{
if(max < table[i][0])
{
max = table[i][0];
indexMax = i;
}
}
if(p>=table[indexMax][0]) return this; // valid positions are 0..n-1
else
{
ArrayRect *old = new ArrayRect(*this);
int * newRow = new int[table[indexMax][0]]; // header cell + (n-1) remaining elements
newRow[0] = table[indexMax][0]-1;
int offset = 0;
for(int i=0;i<table[indexMax][0];i++)
{
if(i==p)
{
offset = 1;
}
else newRow[i+1-offset] = table[indexMax][i+1];
}
delete[] table[indexMax];
table[indexMax] = newRow;
return old;
}
}
};
int main()
{
ArrayRect a = ArrayRect(3,5);
ArrayRect *b = a.addRow(8,1);
delete b;
b = a.addRow(10,1);
delete b;
b = a.addRow(11,1);
delete b;
b = a.addRow(11,1);
cout <<a.max() <<endl;
cout <<a.longest() <<endl;
cout <<a.ifRext() <<endl;
cout <<b->ifRext() <<endl;
delete b;
b = a.delRow(1);
delete b;
}
| 3b1ea81a782ae5a1576d75787a5be0d1207a97e8 | [
"C++"
]
| 1 | C++ | LukaszSac/Random-exam-solution | cec7ba35adf473e211e994010a0395f76c38f70d | df5fe9d8220df9d74167054f80cd24cb6bf96c9d |
refs/heads/master | <repo_name>flaab/py-alien-invaders<file_sep>/play.py
# Dependencies
import pygame
from pygame.sprite import Group
# Imports
import game_functions as gf
from settings import Settings
from ship import Ship
from alien import Alien
# Runs the game
def run_game():
""" Runs the game"""
# start a screen object
pygame.init()
# Read settings
ai_settings = Settings()
# Screen
screen = pygame.display.set_mode((ai_settings.screen_width, ai_settings.screen_height))
pygame.display.set_caption("PyAlien Invaders")
# make a ship
ship = Ship(ai_settings, screen)
# Make a group to store bullets
bullets = Group()
# Make a group to store aliens
aliens = Group()
# Make an alien
# alien = Alien(ai_settings, screen)
# Create the fleet of aliens
gf.create_fleet(ai_settings, screen, ship, aliens)
# main loop
while True:
# Events check using game functions module
gf.check_events(ai_settings, screen, ship, bullets)
# Update ship and bullets
ship.update()
gf.update_bullets(ai_settings, screen, ship, bullets, aliens)
# Update aliens
gf.update_aliens(ai_settings, screen, ship, aliens, bullets)
# Draw screen as set up now
gf.update_screen(ai_settings, screen, ship, aliens, bullets)
# Run game
run_game()
<file_sep>/ship.py
# Dependencies
import pygame
# Class Ship
class Ship():
# Constructor
def __init__(self, ai_settings, screen):
""" Initializes the ship and sets position """
# Set screen
self.screen = screen
# Save settings for the object
self.ai_settings = ai_settings
# Load ship
self.image = pygame.image.load('images/ship.png')
self.rect = self.image.get_rect()
self.screen_rect = screen.get_rect()
# Start new ship
self.rect.centerx = float(self.screen_rect.centerx)
self.rect.bottom = self.screen_rect.bottom
# Movement flag
self.moving_right = False
self.moving_left = False
self.moving_up = False
self.moving_down = False
#--------------------------------------------
# Updates the ship position
def update(self):
""" Update ship position based on flag """
# Move Ship
if(self.moving_right):
self.rect.centerx = min(self.ai_settings.screen_width, self.rect.centerx + self.ai_settings.ship_speed_factor)
if(self.moving_left):
self.rect.centerx = max(0, self.rect.centerx - self.ai_settings.ship_speed_factor)
if(self.moving_up):
self.rect.centery = max(0, self.rect.centery - self.ai_settings.ship_speed_factor)
if(self.moving_down):
self.rect.centery = min(self.ai_settings.screen_height, self.rect.centery + self.ai_settings.ship_speed_factor)
#--------------------------------------------
# Shows the ship
def blitme(self):
""" Draws ship in current location """
self.screen.blit(self.image, self.rect)
#--------------------------------------------
# Center ship on the screen
def center_ship(self):
""" Move the ship to the center of the screen """
        self.rect.centerx = self.screen_rect.centerx
self.rect.bottom = self.screen_rect.bottom
<file_sep>/settings.py
class Settings():
""" A class to store all settings for the game """
# Function: Init
def __init__(self):
""" Initialize the game settings """
# Ship speed factor -never changes-
self.ship_speed_factor = 10
# Bullet settings
self.bullet_speed_factor_start = 10
self.bullet_speed_factor_step = 2
self.bullet_speed_factor_max = 30
self.bullet_speed_factor = self.bullet_speed_factor_start
# Bullet width
self.bullet_width_start = 10
self.bullet_width_step = 2
self.bullet_width_min = 2
self.bullet_width = self.bullet_width_start
# Bullet height
self.bullet_height_start = 20
self.bullet_height_step = 2
self.bullet_height_min = 5
self.bullet_height = self.bullet_height_start
# Bullet color
self.bullet_color = (255, 255, 0)
# Screen Settings
self.screen_width = 900
self.screen_height = 569
self.bg_color = (230, 230, 230)
# Alien sideways speed
self.alien_speed_factor_start = 4
self.alien_speed_factor_step = 1
self.alien_speed_factor = self.alien_speed_factor_start
# Alien drop speed
self.fleet_drop_speed_start = 4
self.fleet_drop_speed_step = 1
self.fleet_drop_speed = self.fleet_drop_speed_start
self.fleet_direction = 1 # 1 = right, -1 left<file_sep>/README.md
# Py-Alien-Invaders
Py-Alien-Invaders is yet another Python game built with PyGame. It is a shooter in which the player controls a spaceship, moving it horizontally and vertically across the screen while firing at descending aliens. The aim is to defeat all the rows of aliens, which move back and forth horizontally as they advance toward the bottom of the screen. Each additional level increases the speed of the aliens and decreases the size of the ship's bullets.
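The level scaling described above can be sketched in a few lines of plain Python. This is only an illustrative sketch — the names and the step/limit values below are assumptions chosen for the example, not the game's actual `Settings` class:

```python
# Hypothetical sketch of per-level difficulty scaling: each cleared fleet
# speeds the aliens up and shrinks/speeds up the bullets, with a floor and
# a cap so the game stays playable. Names and constants are illustrative.

def next_level(s):
    s["alien_speed"] += 1                               # aliens get faster
    s["bullet_width"] = max(2, s["bullet_width"] - 2)   # bullets shrink, floor of 2
    s["bullet_speed"] = min(30, s["bullet_speed"] + 2)  # bullets speed up, cap of 30
    return s

settings = {"alien_speed": 4, "bullet_width": 10, "bullet_speed": 10}
for _ in range(6):
    next_level(settings)
print(settings)
```

After six cleared fleets the bullet width has already hit its floor while the speeds are still climbing toward their cap.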

## Play
To play, run the play.py file. Use the arrows to move the ship and the spacebar to shoot.
```
python3 play.py
```
## Libraries
* [PyGame](https://www.pygame.org/news) - Python3 Module to create games
## Authors
**<NAME>** - Developer
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details
<file_sep>/game_functions.py
# Dependencies
import sys
import pygame
from time import sleep
# Imports
from bullet import Bullet
from alien import Alien
# The ship has been hit. The game resets.
def ship_hit(ai_settings, screen, ship, aliens, bullets):
""" A ship has been hit """
# Clean aliens and bullets
aliens.empty()
bullets.empty()
# Reset the game
restart_difficulty(ai_settings)
create_fleet(ai_settings, screen, ship, aliens)
ship.center_ship()
sleep(1)
#--------------------------------------------
# Checks key down events
def check_keydown_events(event, ai_settings, screen, ship, bullets):
""" Handles key down events """
if(event.key == pygame.K_RIGHT): # Move right
ship.moving_right = True
if(event.key == pygame.K_LEFT): # Move left
ship.moving_left = True
if(event.key == pygame.K_UP): # Move up
ship.moving_up = True
if(event.key == pygame.K_DOWN): # Move down
ship.moving_down = True
if(event.key == pygame.K_SPACE): # Space fire bullets
new_bullet = Bullet(ai_settings, screen, ship)
bullets.add(new_bullet)
if(event.key == pygame.K_q): # Key q
sys.exit()
#--------------------------------------------
# Checks key up events
def check_keyup_events(event, ship):
""" Handles key up events """
if(event.key == pygame.K_RIGHT): # Stop moving right
ship.moving_right = False
if(event.key == pygame.K_LEFT): # Stop moving left
ship.moving_left = False
if(event.key == pygame.K_UP): # Stop moving up
ship.moving_up = False
if(event.key == pygame.K_DOWN): # Stop moving down
ship.moving_down = False
#--------------------------------------------
# Checks events
def check_events(ai_settings, screen, ship, bullets):
""" Responds to keys and mouse events """
for event in pygame.event.get():
# Exit
if event.type == pygame.QUIT:
sys.exit()
# Keydown events
elif event.type == pygame.KEYDOWN:
check_keydown_events(event, ai_settings, screen, ship, bullets)
# Key up events
elif event.type == pygame.KEYUP:
check_keyup_events(event, ship)
#--------------------------------------------
# Handles collisions between bullets and aliens
def handle_bullet_alien_collisions(ai_settings, screen, ship, aliens, bullets):
""" Checks if bullets hit aliens and recreates the fleet """
collisions = pygame.sprite.groupcollide(bullets, aliens, True, True)
# If no more aliens
if(len(aliens) == 0):
# Destroy all bullets and aliens
bullets.empty()
# Start over
create_fleet(ai_settings, screen, ship, aliens)
# Reset difficulty
increase_difficulty(ai_settings)
#--------------------------------------------
# Increases the game difficulty by one step
def increase_difficulty(ai_settings):
    """ Speed up the aliens and shrink the bullets by one step """
ai_settings.alien_speed_factor = ai_settings.alien_speed_factor + ai_settings.alien_speed_factor_step
ai_settings.fleet_drop_speed = ai_settings.fleet_drop_speed + ai_settings.fleet_drop_speed_step
ai_settings.bullet_width = max(ai_settings.bullet_width_min, ai_settings.bullet_width - ai_settings.bullet_width_step)
ai_settings.bullet_height = max(ai_settings.bullet_height_min, ai_settings.bullet_height - ai_settings.bullet_height_step)
    ai_settings.bullet_speed_factor = min(ai_settings.bullet_speed_factor_max, ai_settings.bullet_speed_factor + ai_settings.bullet_speed_factor_step)
#--------------------------------------------
# Restart game difficulty
def restart_difficulty(ai_settings):
""" Make difficulty easy again"""
ai_settings.alien_speed_factor = ai_settings.alien_speed_factor_start
ai_settings.fleet_drop_speed = ai_settings.fleet_drop_speed_start
ai_settings.bullet_width = ai_settings.bullet_width_start
ai_settings.bullet_height = ai_settings.bullet_height_start
    ai_settings.bullet_speed_factor = ai_settings.bullet_speed_factor_start
#--------------------------------------------
# Updates bullets positions
def update_bullets(ai_settings, screen, ship, bullets, aliens):
""" Update position of bullets """
bullets.update()
# Get rid of bullets that go off
for bullet in bullets.copy():
if(bullet.rect.bottom <= 0):
bullets.remove(bullet)
# Check collisions
handle_bullet_alien_collisions(ai_settings, screen, ship, aliens, bullets)
#--------------------------------------------
# Get the number of rows
def get_number_rows(ai_settings, ship_height, alien_height):
""" Determines number of rows """
#available_space_y = (ai_settings.screen_height - (3 * alien_height) - ship_height)
available_space_y = (ai_settings.screen_height / 2.5)
number_rows = int(available_space_y / (1.5 * alien_height))
return(number_rows)
#--------------------------------------------
# Get number of aliens
def get_number_of_aliens_x(ai_settings, alien_width):
""" Returns number of aliens per row """
available_space_x = ai_settings.screen_width - 2*alien_width
number_aliens_x = int(available_space_x / (2 * alien_width))
return(number_aliens_x)
#--------------------------------------------
# Creates an alien
def create_alien(ai_settings, screen, aliens, alien_number, row_number):
""" Create an alien and place it in a row """
alien = Alien(ai_settings, screen)
alien_width = alien.rect.width
alien.x = alien_width + 2 * alien_width * alien_number
alien.rect.x = alien.x
alien.rect.y = alien.rect.height + 1.5 * alien.rect.height * row_number
aliens.add(alien)
#--------------------------------------------
# Change fleet direction
def change_fleet_direction(ai_settings, aliens):
""" Drop the fleet and change direction """
for alien in aliens.sprites():
alien.rect.y += ai_settings.fleet_drop_speed
# Change direction
if(ai_settings.fleet_direction == 1):
ai_settings.fleet_direction = -1
else:
ai_settings.fleet_direction = 1
#--------------------------------------------
# Check fleet edges
def check_fleet_edges(ai_settings, aliens):
""" Respond if any alien reaches the edge """
for alien in aliens.sprites():
if(alien.check_edges()):
change_fleet_direction(ai_settings, aliens)
break
#--------------------------------------------
# Creates the fleet
def create_fleet(ai_settings, screen, ship, aliens):
""" Creates a fleet of aliens """
alien = Alien(ai_settings, screen)
number_aliens_x = get_number_of_aliens_x(ai_settings, alien.rect.width)
number_rows = get_number_rows(ai_settings, ship.rect.height, alien.rect.height)
# Create the first row
for row_number in range(number_rows):
for alien_number in range(number_aliens_x):
create_alien(ai_settings, screen, aliens, alien_number, row_number)
#--------------------------------------------
# Updates screen
def update_screen(ai_settings, screen, ship, aliens, bullets):
""" Updates screen """
# Background image
background_image = pygame.image.load("images/background.jpg").convert()
# Redraw screen
# screen.fill(ai_settings.bg_color)
screen.blit(background_image, [0,0])
# Redraw all bullets
for bullet in bullets.sprites():
bullet.draw_bullet()
# Display ship
ship.blitme()
# Display alient
#alien.blitme()
# Show aliens
aliens.draw(screen)
# Make screen visible
pygame.display.flip()
#--------------------------------------------
# Checks if aliens reach the bottom of the screen
def check_aliens_bottom(ai_settings, screen, ship, aliens, bullets):
""" Checks if aliens reach the center of the screen """
screen_rect = screen.get_rect()
for alien in aliens.sprites():
if(alien.rect.bottom >= screen_rect.bottom):
ship_hit(ai_settings, screen, ship, aliens, bullets)
#--------------------------------------------
# Update aliens
def update_aliens(ai_settings, screen, ship, aliens, bullets):
""" Check edges and update """
check_fleet_edges(ai_settings, aliens)
aliens.update()
# Check collision with our ship
if(pygame.sprite.spritecollideany(ship, aliens)):
ship_hit(ai_settings, screen, ship, aliens, bullets)
# Check aliens reaching the bottom of the screen
check_aliens_bottom(ai_settings, screen, ship, aliens, bullets)
| 5d9b85987166cbc5701583ab1fcb15cf94c3930a | [
"Markdown",
"Python"
]
| 5 | Python | flaab/py-alien-invaders | 006094756f272f91267df4e366e62ce14c3b6743 | c9009efc23f065c37f30601f7b04b6cbdb32492c |
refs/heads/master | <file_sep>$(function() {
var workCarousel = $('.work-carousel:not(.more-works-carousel)');
workCarousel.owlCarousel({
items: 1,
nav: true,
margin: 8,
smartSpeed: 50,
animateOut: 'fadeOut',
navText: ['<i class="icon-arrow-left"></i>', '<i class="icon-arrow-right"></i>']
});
$('.more-works-carousel').owlCarousel({
nav: true,
slideBy: 'page',
margin: 30,
smartSpeed: 50,
navText: ['<i class="icon-arrow-left"></i>', '<i class="icon-arrow-right"></i>'],
responsive: {
0: {
items: 1,
autoWidth: true,
margin: 15,
center: true,
},
767: {
items: 3,
},
}
});
});<file_sep>Vue.component('pz-skill-card', {
template: `
<div class="skill-card">
<div class="skill-card-title">
<img :src="imgSrc" alt="">
<h4>{{ name }}</h4>
</div>
</div>`,
props: ['name', 'imgSrc'],
});
| edcfb56e0853a579714c4d5754d22d2778ce3cf4 | [
"JavaScript"
]
| 2 | JavaScript | Linpeizhen/peizhen-lin.com | bf4c7dbda4f257ed80289e57d97bb473e2773a41 | 4f81e9b3a6482c9cd45e97aee98ce52e9b2ae921 |
refs/heads/master | <file_sep>import HTMLTestRunner
import os
import sys
import time
import unittest
from selenium import webdriver
class Baidu3(unittest.TestCase):
def setUp(self):
self.driver = webdriver.Firefox()
self.driver.implicitly_wait(30)
self.base_url = "http://172.16.17.32:8080/sis/public/index.html"
def test_baiduSearch(self):
driver = self.driver
driver.get(self.base_url + '/')
time.sleep(6)
# driver.find_element_by_id("kw").send_keys("测试")
# driver.find_element_by_id("su").click()
# try:
# self.assertEqual("测试_百度", driver.title, msg="网页未打开")
# except:
# self.savescreenshot(driver, "baiduerror.png")
# time.sleep(3)
driver.quit()
    # Saves a test screenshot
def savescreenshot(self, driver, file_name):
if not os.path.exists('./测试截图'):
os.makedirs("./测试截图")
now = time.strftime("%Y%m%d-%H%M%S", time.localtime(time.time()))
driver.get_screenshot_as_file("./测试截图/" + now + file_name)
time.sleep(3)
# Test report
# if __name__ == "__main__":
#     # Absolute path of the folder containing this file
# curpath = sys.path[0]
# if not os.path.exists(curpath + '/测试报告'):
# os.makedirs(curpath + '/测试报告')
#
# now = time.strftime("%Y-%m-%d-%H %M %S", time.localtime(time.time()))
# print("------------")
# print(time.time())
# print("------------")
# print(time.localtime(time.time()))
# print("------------")
# print(now)
#
# filename = curpath + '/resultreport/' + now + 'resultreport.html'
# with open(filename, 'wb+') as fp:
# runner = HTMLTestRunner.HTMLTestRunner(stream=fp, title=u"测试报告", description=u"用例报告", verbosity=2)
# suite = creatsuit()
# runner.run(suite)
<file_sep>import csv
import os
import sys
from ddt import ddt, data, unpack, file_data
from selenium import webdriver
from selenium.webdriver import ActionChains
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import Select
from selenium.common.exceptions import NoSuchElementException
from selenium.common.exceptions import NoAlertPresentException
import unittest, time, re
# -*- coding: utf-8 -*-
@ddt
class Baidu1(unittest.TestCase):
    # Test fixture
def setUp(self):
self.driver = webdriver.Firefox()
self.driver.implicitly_wait(30)
self.base_url = "http://localhost:8080/sis/public/index.html"
        self.verificationErrors = []
self.accept_next_alert = True
    # Data-driven tests
    # @data(["你好", u"你好_百度搜索"], ["守望", u"守望啊_百度"])
    # @unpack
    # @file_data reads JSON data
    # @file_data('testjson.json')
    # Read test data from a text file
    # @data(*getText('test_baidu.txt'))
    # @unpack  # only meaningful together with an active @data decorator
def test_baidusearch(self):
self.driver.get(self.base_url+'/')
self.driver.set_window_size(1488, 878)
self.driver.find_element(By.NAME, "username").click()
self.driver.find_element(By.NAME, "username").click()
element = self.driver.find_element(By.NAME, "username")
actions = ActionChains(self.driver)
actions.double_click(element).perform()
self.driver.find_element(By.ID, "login_container").click()
self.driver.find_element(By.NAME, "password").click()
self.driver.find_element(By.NAME, "password").click()
element = self.driver.find_element(By.NAME, "password")
actions = ActionChains(self.driver)
actions.double_click(element).perform()
self.driver.find_element(By.NAME, "password").send_keys("123456")
self.driver.find_element(By.CSS_SELECTOR, "#login_form .btn-success").click()
self.driver.find_element(By.NAME, "username").click()
self.driver.find_element(By.CSS_SELECTOR, ".mt-2:nth-child(5)").click()
self.driver.find_element(By.NAME, "password").click()
self.driver.find_element(By.NAME, "password").click()
element = self.driver.find_element(By.NAME, "password")
actions = ActionChains(self.driver)
actions.double_click(element).perform()
self.driver.find_element(By.NAME, "password").click()
self.driver.find_element(By.NAME, "password").click()
element = self.driver.find_element(By.NAME, "password")
actions = ActionChains(self.driver)
actions.double_click(element).perform()
self.driver.find_element(By.NAME, "password").send_keys("123")
self.driver.find_element(By.CSS_SELECTOR, "#login_form .btn-success").click()
self.driver.find_element(By.CSS_SELECTOR, ".mr-auto").click()
self.driver.find_element(By.ID, "stu_tab").click()
element = self.driver.find_element(By.ID, "stu_tab")
actions = ActionChains(self.driver)
actions.move_to_element(element).perform()
element = self.driver.find_element(By.CSS_SELECTOR, "body")
actions = ActionChains(self.driver)
actions.move_to_element(element, 0, 0).perform()
self.driver.find_element(By.CSS_SELECTOR, "tr:nth-child(1) > td:nth-child(2)").click()
self.driver.find_element(By.ID, "stu_table_toolbar_update").click()
element = self.driver.find_element(By.ID, "stu_table_toolbar_update")
actions = ActionChains(self.driver)
actions.move_to_element(element).perform()
element = self.driver.find_element(By.CSS_SELECTOR, "body")
actions = ActionChains(self.driver)
actions.move_to_element(element, 0, 0).perform()
self.driver.find_element(By.ID, "stu_table_toolbar_update_form_studentName").click()
self.driver.find_element(By.ID, "stu_table_toolbar_update_form_studentName").send_keys("<KEY>")
self.driver.find_element(By.ID, "stu_table_toolbar_update_form_submit").click()
self.driver.find_element(By.CSS_SELECTOR, "#stu_table_toolbar_update_modal_dialog .ml-1 > .btn").click()
self.driver.find_element(By.NAME, "btSelectItem").click()
self.driver.find_element(By.ID, "stu_table_toolbar_add").click()
element = self.driver.find_element(By.ID, "stu_table_toolbar_add")
actions = ActionChains(self.driver)
actions.move_to_element(element).perform()
element = self.driver.find_element(By.CSS_SELECTOR, "body")
actions = ActionChains(self.driver)
actions.move_to_element(element, 0, 0).perform()
element = self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentName")
actions = ActionChains(self.driver)
actions.move_to_element(element).click_and_hold().perform()
element = self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentName")
actions = ActionChains(self.driver)
actions.move_to_element(element).perform()
element = self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentName")
actions = ActionChains(self.driver)
actions.move_to_element(element).release().perform()
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentName").click()
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentName").send_keys("dsd")
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_idCard").click()
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_idCard").send_keys("ds")
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentEmail").click()
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentEmail").send_keys("dsds")
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentNo").click()
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_studentNo").send_keys("dsd")
self.driver.find_element(By.CSS_SELECTOR, ".form-group:nth-child(4) .filter-option-inner-inner").click()
self.driver.find_element(By.CSS_SELECTOR, ".show > .inner > .dropdown-menu > li:nth-child(1) .text").click()
dropdown = self.driver.find_element(By.ID, "stu_table_toolbar_add_form_classes.id")
dropdown.find_element(By.XPATH, "//option[. = '幼儿园😂大班']").click()
self.driver.find_element(By.CSS_SELECTOR,
"#stu_table_toolbar_add_form > .form-group:nth-child(6) .filter-option-inner-inner").click()
self.driver.find_element(By.ID, "stu_table_toolbar_add_form_submit").click()
self.driver.find_element(By.LINK_TEXT, "3").click()
@unittest.skip('skipping')
def test_hao(self):
driver = self.driver
driver.get(self.base_url + "/")
driver.find_element_by_link_text("hao123").click()
try:
self.assertEqual(u"hao123_上网从这里开始码", driver.title)
except:
self.savescreenshot(driver, "hao123Error.png")
time.sleep(6)
driver.quit()
def savescreenshot(self, driver, file_name):
if not os.path.exists('./errorImage'):
os.makedirs("./errorImage")
now = time.strftime("%Y%m%d-%H%M%S", time.localtime(time.time()))
driver.get_screenshot_as_file("./errorImage/" + now + file_name)
time.sleep(3)
def is_element_present(self, how, what):
try:
self.driver.find_element(by=how, value=what)
except NoSuchElementException as e:
return False
return True
    # Checks whether an alert is present; can be removed
def is_alert_present(self):
try:
self.driver.switch_to_alert()
except NoAlertPresentException as e:
return False
return True
    # Closes the alert and returns its text; can be removed
def close_alert_and_get_its_text(self):
try:
            alert = self.driver.switch_to.alert
alert_text = alert.text
if self.accept_next_alert:
alert.accept()
else:
alert.dismiss()
return alert_text
finally:
self.accept_next_alert = True
    # Test fixture: clean up after each test
def tearDown(self):
self.driver.quit()
self.assertEqual([], self.verificationErrors)
if __name__ == "__main__": # 执行用例
unittest.main(verbosity=2)
<file_sep>import HTMLTestRunner
import os
import sys
import time
import unittest
def creatsuit():
discover = unittest.defaultTestLoader.discover("../测试脚本",pattern="testbaidu1.py",top_level_dir=None)
print(discover)
return discover
if __name__ == "__main__":
    # Absolute path of the folder containing this file
curpath =sys.path[0]
if not os.path.exists(curpath+'/resultreport'):
os.makedirs(curpath+'/resultreport')
now=time.strftime("%Y-%m-%d-%H %M %S",time.localtime(time.time()))
print("------------")
print(time.time())
print("------------")
print(time.localtime(time.time()))
print("------------")
print(now)
filename=curpath+'/resultreport/'+now+'resultreport.html'
with open(filename,'wb+') as fp:
runner =HTMLTestRunner.HTMLTestRunner(stream=fp,title=u"测试报告",description=u"用例报告" ,verbosity=2)
suite = creatsuit()
runner.run(suite)<file_sep>from selenium import webdriver
import time
driver = webdriver.Firefox()
driver.get("https://www.baidu.com/")
# Maximize the browser
driver.maximize_window()
time.sleep(6)
# # Locate by element id
# driver.find_element_by_id("kw").send_keys("你好")
# driver.find_element_by_id("su").click()
#
# # Locate by name
# driver.find_element_by_name("wd").send_keys("hello")
# driver.find_element_by_id("su").click()
#
# # Locating by class_name fails here because there are too many matching elements; precondition: the element must be unique, otherwise an error occurs
# driver.find_element_by_class_name("s_ipt")
#
# # Locate by tag name; precondition: the element must be unique, otherwise an error occurs
# driver.find_element_by_tag_name("input").send_keys("nih")
# driver.find_element_by_id("su").click()
#
# # Locate by link text; must be unique
# driver.find_element_by_link_text("视频").click()
#
# # Locate by XPath
# driver.find_element_by_xpath("//*[@id='kw']").send_keys("he")
# driver.find_element_by_xpath("//*[@id='su']").click()
#
# # Locate by CSS selector
# driver.find_element_by_css_selector("#kw").send_keys("aaa")
# driver.find_element_by_css_selector("#su").click()
time.sleep(6)
driver.quit()
<file_sep>import time
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
driver = webdriver.Firefox()
driver.get("file:///F:/Python=Study/Html/drop_down.html")
time.sleep(6)
options = driver.find_elements_by_tag_name("option")
for option in options:
if option.get_attribute("value") == "9.03":
option.click()
time.sleep(6)
driver.quit()
<file_sep>import time
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
driver = webdriver.Firefox()
driver.get("file:///F:/Python=Study/Html/checkbox.html")
time.sleep(6)
inputs = driver.find_elements_by_tag_name("input")
for input in inputs:
if input.get_attribute("type")=="radio":
input.click()
time.sleep(6)<file_sep>import unittest
from 测试脚本 import testbaidu1
def creatsuite():
    suite = unittest.TestSuite()
    suite.addTest(unittest.makeSuite(testbaidu1.Baidu1))
    return suite

if __name__ == "__main__":
    suite = creatsuite()
runner = unittest.TextTestRunner(verbosity=2)
runner.run(suite)
<file_sep>import time
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
driver = webdriver.Firefox()
driver.get("https://www.baidu.com/")
# 最大化浏览器
driver.maximize_window()
time.sleep(4)
# # Search
# driver.find_element_by_css_selector("#kw").send_keys("你好")
# driver.find_element_by_css_selector("#su").click()
# time.sleep(6)
# # Clear the search box
# driver.find_element_by_name("wd").clear()
# time.sleep(3)
# # Search again
# driver.find_element_by_css_selector("#kw").send_keys("守望先锋")
# # Submit the query
# driver.find_element_by_id("su").submit()
# driver.find_element_by_id("s-bottom-layer-left")
# print("text=" + driver.find_element_by_id("s-bottom-layer-left"))
# driver.find_element_by_id("kw").send_keys("守望先锋")
# driver.find_element_by_id("su").submit()
# time.sleep(4)
# driver.minimize_window()
# # Set the browser window size
# time.sleep(4)
# driver.set_window_size(400,800)
# driver.maximize_window()
# # Browser back
# driver.back()
# time.sleep(6)
# # Browser forward
# driver.forward()
# js="var q=document.documentElement.scrollTop=100000"
# driver.execute_script(js)
# driver.implicitly_wait(10)
# driver.find_element_by_link_text("owl直播_2020守望先锋职业联赛直播间_虎牙直播").click()
# title = driver.title
# print("title=" + title)
# url = driver.current_url
# print("url=" + url)
# driver = webdriver.Firefox()
# driver.get("http://127.0.0.1:88/biz/user-login-L2Jpei8=.html")
#
# driver.maximize_window()
# # Enter the username
# driver.find_element_by_id("account").send_keys("admin")
# driver.find_element_by_id("account").send_keys(Keys.TAB)
# # Enter the password
# driver.find_element_by_name("password").send_keys("<PASSWORD>46")
# driver.find_element_by_name("password").send_keys(Keys.ENTER)
# Copy
driver.find_element_by_css_selector("#kw").send_keys("你好")
time.sleep(6)
driver.find_element_by_id("kw").send_keys(Keys.CONTROL, 'a')
# Cut
driver.find_element_by_id("kw").send_keys(Keys.CONTROL, 'x')
time.sleep(6)
driver.quit()
<file_sep>import unittest
from 测试脚本 import testbaidu1
def creatsuite():
suite = unittest.TestSuite()
    # Add the test case to the test suite
suite.addTest(testbaidu1.Baidu1("test_baidusearch"))
suite.addTest(testbaidu1.Baidu1("test_hao"))
return suite
if __name__ == "__main__":
suite = creatsuite()
runner = unittest.TextTestRunner(verbosity=2)
runner.run(suite)<file_sep>import unittest, time, re
from selenium import webdriver
class testSis(unittest.TestCase):
def setUp(self):
self.driver = webdriver.Firefox()
self.driver.implicitly_wait(30)
self.base_url = "http://localhost:8080/sis/public/index.html"
        self.verificationErrors = []
self.accept_next_alert = True
    def test_login(self):
        driver = self.driver
        driver.get(self.base_url + '/')
        driver.maximize_window()
        time.sleep(5)
def tearDown(self):
self.driver.quit()
self.assertEqual([], self.verificationErrors)
if __name__ == "__main__": # 执行用例
unittest.main(verbosity=2)
<file_sep>from selenium import webdriver
driver = webdriver.Firefox()
driver.get("file:///F:/Python=Study/Html/send.html")
driver.find_element_by_tag_name("input").click()
| e1ef695562c7765ce6fb5f3c7d878e8f1faa75ec | [
"Python"
]
| 11 | Python | Mxxxxx/PycStudy | 5e7d51501ce40dc12ba47a868057b5c8062890b1 | c4de425d8dde4c37d22dd0d44b24dbbffe3c69e0 |
refs/heads/main | <repo_name>odiak/map-and-set<file_sep>/src/set.test.ts
import { Set } from './set'
import { makeHashAndEqualFunction } from './util'
test('basics', () => {
const set = new Set(
[{ a: 1 }, { a: 2 }],
makeHashAndEqualFunction(({ a }) => [a])
)
expect(set.has({ a: 1 })).toBe(true)
expect(set.has({ a: 3 })).toBe(false)
expect(set.size).toBe(2)
set.delete({ a: 1 })
expect(set.has({ a: 1 })).toBe(false)
expect(set.size).toBe(1)
set.add({ a: 10 })
set.add({ a: 10 })
set.add({ a: 11 })
expect(set.size).toBe(3)
expect([...set.values()]).toEqual([{ a: 2 }, { a: 10 }, { a: 11 }])
expect([...set.keys()]).toEqual([{ a: 2 }, { a: 10 }, { a: 11 }])
expect([...set]).toEqual([{ a: 2 }, { a: 10 }, { a: 11 }])
expect([...set.entries()]).toEqual([
[{ a: 2 }, { a: 2 }],
[{ a: 10 }, { a: 10 }],
[{ a: 11 }, { a: 11 }]
])
set.clear()
expect(set.size).toBe(0)
expect([...set.values()]).toEqual([])
})
<file_sep>/src/util.ts
export type HashAndEqual<T> = {
hash(value: T): number
equal(value1: T, value2: T): boolean
}
type Primitive = number | string | boolean | null | undefined
export function hashCodeFromString(s: string): number {
let h = 1
for (let i = 0; i < s.length; i++) {
h = h * 31 + s.charCodeAt(i)
}
return h
}
const hashCodeOfPositiveInfinity = hashCodeFromString('Infinity')
const hashCodeOfNegativeInfinity = hashCodeFromString('-Infinity')
const hashCodeOfNan = hashCodeFromString('NaN')
const hashCodeOfTrue = hashCodeFromString('true')
const hashCodeOfFalse = hashCodeFromString('false')
const hashCodeOfNull = hashCodeFromString('null')
const hashCodeOfUndefined = hashCodeFromString('undefined')
export function hashCodeFromNumber(n: number): number {
if (Number.isInteger(n)) {
let hash = n
hash = ~hash + (hash << 15)
hash = hash ^ (hash >> 12)
hash = hash + (hash << 2)
hash = hash ^ (hash >> 4)
hash = hash * 2057
hash = hash ^ (hash >> 16)
return hash & 0x3fffffff
} else if (n === Number.POSITIVE_INFINITY) {
return hashCodeOfPositiveInfinity
} else if (n === Number.NEGATIVE_INFINITY) {
return hashCodeOfNegativeInfinity
} else if (Number.isNaN(n)) {
return hashCodeOfNan
} else {
return hashCodeFromString(n.toString())
}
}
export function hashCodeFromPrimitiveArray(array: Array<Primitive>): number {
return array.reduce((acc: number, e) => {
let h = 0
switch (typeof e) {
case 'number':
h = hashCodeFromNumber(e)
break
case 'boolean':
h = e ? hashCodeOfTrue : hashCodeOfFalse
break
case 'string':
h = hashCodeFromString(e)
break
case 'undefined':
h = hashCodeOfUndefined
break
case 'object': // case of null
h = hashCodeOfNull
break
}
return acc * 31 + h
}, 1)
}
export function makeHashAndEqualFunction<T>(
extractFeature: (value: T) => Array<Primitive>
): HashAndEqual<T> {
return {
hash(value: T): number {
return hashCodeFromPrimitiveArray(extractFeature(value))
},
equal(value1: T, value2: T): boolean {
const array1 = extractFeature(value1)
const array2 = extractFeature(value2)
return array1.length === array2.length && array1.every((v, i) => v === array2[i])
}
}
}
<file_sep>/src/index.ts
export { Map, MapKeyOptions } from './map'
export { Set, SetOptions } from './set'
export { makeHashAndEqualFunction } from './util'
<file_sep>/src/map.ts
type Entry<Key, Value> = {
key: Key
value: Value
hash: number
}
type EntryContainer<Key, Value> = {
entry: Entry<Key, Value> | null
next: EntryContainer<Key, Value> | null
}
export type MapKeyOptions<Key> = {
hash(key: Key): number
equal(key1: Key, key2: Key): boolean
}
export class Map<Key, Value> {
private hashKey: (key: Key) => number
private equalKey: (key1: Key, key2: Key) => boolean
private hashTable: Array<EntryContainer<Key, Value> | null>
private dataTable: Array<EntryContainer<Key, Value>>
private internalSize = 0
constructor(iterable: Iterable<[Key, Value]> | undefined | null, keyOption: MapKeyOptions<Key>) {
this.hashKey = keyOption.hash
this.equalKey = keyOption.equal
const bucketSize = Array.isArray(iterable) ? getInitialBucketSize(iterable.length) : 2
this.hashTable = new Array<null>(bucketSize).fill(null)
this.dataTable = []
if (iterable != null) {
for (const [key, value] of iterable) {
this.set(key, value)
}
}
}
get size(): number {
return this.internalSize
}
clear(): void {
this.hashTable = new Array<null>(this.hashTable.length).fill(null)
this.dataTable = []
this.internalSize = 0
}
private getEntryContainer(key: Key): EntryContainer<Key, Value> | null {
const hash = this.hashKey(key)
const index = indexFor(hash, this.hashTable.length)
for (let c = this.hashTable[index]; c != null; c = c.next) {
const e = c.entry
if (e != null && e.hash === hash && (e.key === key || this.equalKey(e.key, key))) {
return c
}
}
return null
}
delete(key: Key): boolean {
const c = this.getEntryContainer(key)
if (c == null) return false
c.entry = null
this.internalSize--
if (this.hashTable.length > 2 && this.internalSize < this.hashTable.length) {
this.resize(this.hashTable.length / 2)
}
return true
}
get(key: Key): Value | undefined {
return this.getEntryContainer(key)?.entry?.value
}
has(key: Key): boolean {
return this.getEntryContainer(key) != null
}
set(key: Key, value: Value): this {
const hash = this.hashKey(key)
const index = indexFor(hash, this.hashTable.length)
for (let c = this.hashTable[index]; c != null; c = c.next) {
const e = c.entry
if (e != null && e.hash === hash && (e.key === key || this.equalKey(e.key, key))) {
e.value = value
return this
}
}
const c = { entry: { key, value, hash }, next: this.hashTable[index] }
this.hashTable[index] = c
this.dataTable.push(c)
this.internalSize++
if (this.dataTable.length > this.hashTable.length * 2) {
if (this.internalSize < this.dataTable.length) {
this.resize(this.hashTable.length)
} else {
this.resize(this.hashTable.length * 2)
}
}
return this
}
private resize(bucketSize: number) {
const oldDataTable = this.dataTable
type EC = EntryContainer<Key, Value>
const hashTable = new Array<EC | null>(bucketSize).fill(null)
const dataTable = [] as Array<EC>
for (const c of oldDataTable) {
const e = c.entry
if (e == null) continue
const index = indexFor(e.hash, bucketSize)
c.next = hashTable[index]
hashTable[index] = c
dataTable.push(c)
}
this.hashTable = hashTable
this.dataTable = dataTable
}
*entries(): IterableIterator<[Key, Value]> {
for (const { entry } of this.dataTable) {
if (entry != null) yield [entry.key, entry.value]
}
}
[Symbol.iterator](): IterableIterator<[Key, Value]> {
return this.entries()
}
*keys(): IterableIterator<Key> {
for (const { entry } of this.dataTable) {
if (entry != null) yield entry.key
}
}
*values(): IterableIterator<Value> {
for (const { entry } of this.dataTable) {
if (entry != null) yield entry.value
}
}
forEach(callback: (value: Value, key: Key, map: this) => void): void {
for (const { entry } of this.dataTable) {
if (entry != null) callback(entry.value, entry.key, this)
}
}
}
function getInitialBucketSize(numEntries: number): number {
let n = 2
while (n < numEntries) n *= 2
return n
}
function indexFor(hash: number, capacity: number): number {
return hash & (capacity - 1)
}
<file_sep>/src/set.ts
type Entry<Value> = {
value: Value
hash: number
}
type EntryContainer<Value> = {
entry: Entry<Value> | null
next: EntryContainer<Value> | null
}
export type SetOptions<Value> = {
hash(value: Value): number
equal(value1: Value, value2: Value): boolean
}
export class Set<Value> {
private hashValue: (value: Value) => number
private equalValue: (value1: Value, value2: Value) => boolean
private hashTable: Array<EntryContainer<Value> | null>
private dataTable: Array<EntryContainer<Value>> = []
private internalSize = 0
constructor(iterable: Iterable<Value> | null | undefined, options: SetOptions<Value>) {
this.hashValue = options.hash
this.equalValue = options.equal
const bucketSize = Array.isArray(iterable) ? getInitialBucketSize(iterable.length) : 2
this.hashTable = new Array<null>(bucketSize).fill(null)
if (iterable != null) {
for (const value of iterable) {
this.add(value)
}
}
}
get size(): number {
return this.internalSize
}
private getEntryContainer(value: Value): EntryContainer<Value> | null {
const hash = this.hashValue(value)
const index = indexFor(hash, this.hashTable.length)
for (let c = this.hashTable[index]; c != null; c = c.next) {
const e = c.entry
if (e != null && e.hash === hash && (e.value === value || this.equalValue(e.value, value))) {
return c
}
}
return null
}
has(value: Value): boolean {
return this.getEntryContainer(value) != null
}
delete(value: Value): boolean {
const c = this.getEntryContainer(value)
if (c == null) return false
c.entry = null
this.internalSize--
if (this.hashTable.length > 2 && this.internalSize < this.hashTable.length) {
this.resize(this.hashTable.length / 2)
}
return true
}
add(value: Value): this {
const hash = this.hashValue(value)
const index = indexFor(hash, this.hashTable.length)
for (let c = this.hashTable[index]; c != null; c = c.next) {
const e = c.entry
if (e != null && e.hash === hash && (e.value === value || this.equalValue(e.value, value))) {
e.value = value
return this
}
}
const c = { entry: { value, hash }, next: this.hashTable[index] }
this.hashTable[index] = c
this.dataTable.push(c)
this.internalSize++
if (this.dataTable.length > this.hashTable.length * 2) {
if (this.internalSize < this.hashTable.length) {
this.resize(this.hashTable.length)
} else {
this.resize(this.hashTable.length * 2)
}
}
return this
}
private resize(bucketSize: number) {
const newHashTable = new Array<EntryContainer<Value> | null>(bucketSize).fill(null)
const newDataTable = new Array<EntryContainer<Value>>()
for (const container of this.dataTable) {
if (container.entry == null) continue
const index = indexFor(container.entry.hash, bucketSize)
container.next = newHashTable[index]
newHashTable[index] = container
newDataTable.push(container)
}
// install the rebuilt tables (otherwise this resize has no effect)
this.hashTable = newHashTable
this.dataTable = newDataTable
}
clear(): void {
this.internalSize = 0
this.hashTable = new Array<null>(this.hashTable.length).fill(null)
this.dataTable = []
}
*values(): IterableIterator<Value> {
for (const { entry } of this.dataTable) {
if (entry != null) yield entry.value
}
}
[Symbol.iterator](): IterableIterator<Value> {
return this.values()
}
keys(): IterableIterator<Value> {
return this.values()
}
*entries(): IterableIterator<[Value, Value]> {
for (const { entry } of this.dataTable) {
if (entry != null) yield [entry.value, entry.value]
}
}
forEach(callback: (value: Value, key: Value, set: this) => void): void {
for (const { entry } of this.dataTable) {
if (entry != null) callback(entry.value, entry.value, this)
}
}
}
function getInitialBucketSize(numEntries: number): number {
let n = 2
while (n < numEntries) n *= 2
return n
}
function indexFor(hash: number, capacity: number): number {
return hash & (capacity - 1)
}
<file_sep>/README.md
# map-and-set
This package provides Map and Set implementations with an API very similar to ECMAScript's built-in Map and Set.
Unlike the built-in ones, they accept custom hash and equality functions.
This lets you use objects as Map keys and Set values without worrying about object identity.
The Map and Set implementations are inspired by an article: [\[V8 Deep Dives\] Understanding Map Internals \| by <NAME> \| ITNEXT](https://itnext.io/v8-deep-dives-understanding-map-internals-45eb94a183df).
Thanks, Andrey.
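A self-contained sketch of the core idea borrowed from that article: bucket counts are kept at powers of two, so the bucket index is a single bitwise AND instead of a modulo. The two functions below mirror `getInitialBucketSize` and `indexFor` from `src/map.ts`.

```typescript
// Bucket counts are powers of two: find the smallest power of two >= numEntries.
function getInitialBucketSize(numEntries: number): number {
  let n = 2
  while (n < numEntries) n *= 2
  return n
}

// With a power-of-two capacity, `hash & (capacity - 1)` selects a bucket the
// same way `hash % capacity` would for non-negative hashes, but with one
// bitwise AND -- and it also yields an in-range index for negative 32-bit hashes.
function indexFor(hash: number, capacity: number): number {
  return hash & (capacity - 1)
}

console.log(getInitialBucketSize(5)) // smallest power of two >= 5, i.e. 8
console.log(indexFor(13, 8)) // 13 % 8 === 5
```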
## Install
You can install it with npm.
```console
npm install map-and-set
```
## Example
```typescript
import { Map, Set, makeHashAndEqualFunction } from 'map-and-set'
interface User {
id: number
name: string
}
const map = new Map<User, number>(null, {
hash: (user) => user.id,
equal: (user1, user2) => user1.id === user2.id
})
// You can write this more concisely with the utility function
const map_ = new Map<User, number>(
null,
makeHashAndEqualFunction((user) => [user.id])
)
map.set({ id: 1, name: 'Tom' }, 99)
map.set({ id: 2, name: 'Michael' }, 100)
map.get({ id: 1, name: 'Tom' }) // => 99
map.has({ id: 1, name: 'Tom' }) // => true
map.has({ id: 1, name: 'Unknown' }) // => true
map.has({ id: 3, name: 'Tom' }) // => false
map.delete({ id: 1, name: 'Tom' })
map.has({ id: 1, name: 'Tom' }) // => false
const set = new Set<User>(
null,
makeHashAndEqualFunction((user) => [user.id])
)
set.add({ id: 10, name: 'Tomoko' })
set.add({ id: 11, name: 'Yuya' })
set.has({ id: 10, name: 'Tomoko' }) // => true
set.has({ id: 10, name: 'Unknown' }) // => true
set.has({ id: 11, name: 'Yuya' }) // => true
set.has({ id: 99, name: 'Unknown' }) // => false
set.delete({ id: 11, name: 'Yuya' })
set.has({ id: 11, name: 'Yuya' }) // => false
```
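The `makeHashAndEqualFunction` helper used above lives in `src/util` and is not reproduced here. The following is only a rough, illustrative sketch of the idea — deriving `hash`/`equal` options from a function that maps a value to an array of primitive parts. The name `makeHashAndEqualSketch` and the hashing details are assumptions for illustration, not the actual implementation.

```typescript
type HashAndEqual<V> = {
  hash(value: V): number
  equal(a: V, b: V): boolean
}

// Illustrative only: derive hash/equal options from a "parts" function.
// The real makeHashAndEqualFunction in src/util may differ in its details.
function makeHashAndEqualSketch<V>(
  parts: (value: V) => ReadonlyArray<unknown>
): HashAndEqual<V> {
  // Hash one primitive part via a simple 31-based string hash.
  const hashPart = (p: unknown): number => {
    const s = String(p)
    let h = 0
    for (let i = 0; i < s.length; i++) h = (h * 31 + s.charCodeAt(i)) | 0
    return h
  }
  return {
    // Combine part hashes; equal part arrays produce equal hashes.
    hash: (value) => {
      let h = 17
      for (const p of parts(value)) h = (h * 31 + hashPart(p)) | 0
      return h
    },
    // Two values are equal when their parts match position by position.
    equal: (a, b) => {
      const pa = parts(a)
      const pb = parts(b)
      return pa.length === pb.length && pa.every((p, i) => p === pb[i])
    },
  }
}

const byId = makeHashAndEqualSketch((u: { id: number; name: string }) => [u.id])
console.log(byId.equal({ id: 1, name: 'Tom' }, { id: 1, name: 'X' })) // true
```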
## License
MIT
<file_sep>/src/map.test.ts
import { Map } from './map'
import { makeHashAndEqualFunction } from './util'
test('basics', () => {
const map = new Map<number, string>(
[
[1, 'a'],
[2, 'b'],
[100, 'foo']
],
{ hash: (n) => n, equal: (n, m) => n === m }
)
expect(map.get(1)).toBe('a')
expect(map.get(2)).toBe('b')
expect(map.get(99)).toBe(undefined)
map.set(8, 'eight')
expect(map.get(8)).toBe('eight')
expect(map.size).toBe(4)
map.delete(8)
expect(map.get(8)).toBe(undefined)
expect(map.size).toBe(3)
const map2 = new Map<number, string>(
null,
makeHashAndEqualFunction((v) => [v])
)
const expected = [] as number[]
for (let i = 0; i < 100; i++) {
map2.set(i, String(i))
if (i % 3 !== 0) {
expected.push(i)
}
}
for (let i = 99; i >= 0; i -= 3) {
map2.delete(i)
}
expect([...map2.keys()]).toEqual(expected)
})
test('get', () => {
const map = new Map<[number, string], number>(
[
[[1, 'a'], 99],
[[2, 'foo'], 100]
],
makeHashAndEqualFunction((v) => v)
)
expect(map.get([1, 'a'])).toEqual(99)
expect(map.get([2, 'a'])).toEqual(undefined)
expect(map.get([2, 'foo'])).toEqual(100)
})
test('set', () => {
const map = new Map<{ a: number; b: string }, number>(
null,
makeHashAndEqualFunction(({ a, b }) => [a, b])
)
expect(map.get({ a: 1, b: 'x' })).toBe(undefined)
map.set({ a: 1, b: 'x' }, 999)
expect(map.get({ a: 1, b: 'x' })).toBe(999)
})
test('delete', () => {
const map = new Map<{ a: number; b: string }, number>(
null,
makeHashAndEqualFunction(({ a, b }) => [a, b])
)
map.set({ a: 1, b: 'x' }, 999)
expect(map.get({ a: 1, b: 'x' })).toBe(999)
expect(map.has({ a: 1, b: 'x' })).toBe(true)
expect(map.size).toBe(1)
map.delete({ a: 1, b: 'x' })
expect(map.get({ a: 1, b: 'x' })).toBe(undefined)
expect(map.has({ a: 1, b: 'x' })).toBe(false)
expect(map.size).toBe(0)
})
test('clear', () => {
const map = new Map<string[], number>(
null,
makeHashAndEqualFunction((k) => k)
)
map.set(['foo', 'bar'], 99)
map.set(['a', 'b', 'c'], 899)
expect([...map.entries()]).toEqual([
[['foo', 'bar'], 99],
[['a', 'b', 'c'], 899]
])
expect(map.size).toBe(2)
map.clear()
expect([...map.entries()]).toEqual([])
expect(map.size).toBe(0)
})
test('size', () => {
const map = new Map<number, number>(
null,
makeHashAndEqualFunction((v) => [v])
)
expect(map.size).toBe(0)
for (let i = 0; i < 100; i++) {
map.set(i, i)
}
expect(map.size).toBe(100)
})
test('entries', () => {
const map = new Map<number, number>(
null,
makeHashAndEqualFunction((v) => [v])
)
map.set(1, 2)
map.set(2, 3)
map.set(3, 4)
map.set(4, 5)
expect([...map.entries()]).toEqual([
[1, 2],
[2, 3],
[3, 4],
[4, 5]
])
expect([...map]).toEqual([...map.entries()])
})
test('keys', () => {
const map = new Map(
[
['a', 1],
['b', 2],
['hello', 3]
],
makeHashAndEqualFunction((v) => [v])
)
expect([...map.keys()]).toEqual(['a', 'b', 'hello'])
})
test('values', () => {
const map = new Map(
[
['a', 1],
['b', 2],
['hello', 3]
],
makeHashAndEqualFunction((v) => [v])
)
expect([...map.values()]).toEqual([1, 2, 3])
})
| 6980adc5b43f811971852c4d0f644030e6e75622 | [
"Markdown",
"TypeScript"
]
| 7 | TypeScript | odiak/map-and-set | e6dfb1357805bb13a0816790d8c861b5c160f76e | 4f0e016ac346413d8c61d282aa4d0331f5a47062 |
refs/heads/master | <repo_name>tedb/etcd<file_sep>/server/release_version.go
package server
const ReleaseVersion = "0.4.2+git"
| f4d31630f33180af3afd66b5c9a347fb13178c7f | [
"Go"
]
| 1 | Go | tedb/etcd | 16c2bcf951548d1c298b46c3e579fd2ef06d5839 | 578c47b60cbfbfa289c70ad191ddb58cf5695d21 |
refs/heads/main | <repo_name>LexAndYacc/EssentialCpp<file_sep>/2/2.5/main.cpp
#include <iostream>
#include <algorithm>
#include <vector>
int max(int a, int b);
double max(double a, double b);
const std::string &max(const std::string &a, const std::string &b);
int max(const std::vector<int> &elems);
double max(const std::vector<double> &elems);
std::string max(const std::vector<std::string> &elems);
int max(const int *array, int size);
double max(const double *array, int size);
std::string max(const std::string *array, int size);
int main()
{
std::cout << "max(1,2) = " << max(1, 2) << std::endl;
std::cout << "max(1.0,2.0) = " << max(1.0, 2.0) << std::endl;
std::cout << "max(\"adf\",\"sdfdsf\") = " << max("adf", "sdfdsf") << std::endl;
std::vector<int> vInt{1, 2, 3, 4};
std::cout << "max(vector<int>) = " << max(vInt) << std::endl;
std::vector<double> vDouble{1.76, 2.1234, 3.5, 2.346};
std::cout << "max(vector<double>) = " << max(vDouble) << std::endl;
std::vector<std::string> vString{"1.76, 2.1234, 3.5, 2.346","ga","wert","dsaf"};
std::cout << "max(vector<string>) = " << max(vString) << std::endl;
int aInt[] = {1,2,3,45};
std::cout << "max(int[]) = " << max(aInt,sizeof(aInt) / sizeof(aInt[0])) << std::endl;
double aDouble[] = {-3.5,234.534,24.124,213556.2134};
std::cout << "max(double[]) = " << max(aDouble,sizeof(aDouble) / sizeof(aDouble[0])) << std::endl;
std::string aString[] = {"1,2,3,45","sdf","gdrh","zgfsgh"};
std::cout << "max(string[]) = " << max(aString,sizeof(aString) / sizeof(aString[0])) << std::endl;
return 0;
}
int max(int a, int b)
{
return std::max(a, b);
}
double max(double a, double b)
{
return std::max(a, b);
}
const std::string &max(const std::string &a, const std::string &b)
{
return std::max(a, b);
}
int max(const std::vector<int> &elems)
{
if (elems.size() == 0)
{
std::cerr << "空vector\n";
return 0;
}
int maxNum = elems[0];
for (int i = 1; i < elems.size(); ++i)
{
maxNum = max(maxNum, elems[i]);
}
return maxNum;
}
double max(const std::vector<double> &elems)
{
if (elems.size() == 0)
{
std::cerr << "空vector\n";
return 0;
}
double maxNum = elems[0];
for (int i = 1; i < elems.size(); ++i)
{
maxNum = max(maxNum, elems[i]);
}
return maxNum;
}
std::string max(const std::vector<std::string> &elems)
{
if (elems.size() == 0)
{
std::cerr << "空vector\n";
return 0;
}
std::string maxNum = elems[0];
for (int i = 1; i < elems.size(); ++i)
{
maxNum = max(maxNum, elems[i]);
}
return maxNum;
}
int max(const int *array, int size)
{
if (0 == array || size <= 0)
{
std::cerr << "空array\n";
return 0;
}
int maxNum = array[0];
for (int i = 1; i < size; ++i)
{
maxNum = max(maxNum, array[i]);
}
return maxNum;
}
double max(const double *array, int size)
{
if (0 == array || size <= 0)
{
std::cerr << "空array\n";
return 0;
}
double maxNum = array[0];
for (int i = 1; i < size; ++i)
{
maxNum = max(maxNum, array[i]);
}
return maxNum;
}
std::string max(const std::string *array, int size)
{
if (0 == array || size <= 0)
{
std::cerr << "空array\n";
return 0;
}
std::string maxNum = array[0];
for (int i = 1; i < size; ++i)
{
maxNum = max(maxNum, array[i]);
}
return maxNum;
}<file_sep>/2/2.2/main.cpp
#include <iostream>
#include <vector>
using namespace std;
bool GetPentagonal(vector<int> &elems, int size);
bool ShowPentagoal(const vector<int> &elems, string type = "int");
int main()
{
vector<int> elems;
if (GetPentagonal(elems, -1))
{
ShowPentagoal(elems);
}
if (GetPentagonal(elems, 0))
{
ShowPentagoal(elems);
}
if (GetPentagonal(elems, 10))
{
ShowPentagoal(elems);
}
if (GetPentagonal(elems, 1025))
{
ShowPentagoal(elems);
}
return 0;
}
// Build the sequence of pentagonal numbers
bool GetPentagonal(vector<int> &elems, int size)
{
const int max_size = 1024;
if (size <= 0 || size > max_size)
{
cerr << "GetPentagonal(): oops: invalid size: "
<< size << " -- can't fulfill request.\n";
return false;
}
elems.clear();
for (int ix = elems.size(); ix < size; ++ix)
{
elems.push_back((ix + 1) * (3 * (ix + 1) - 1) / 2);
}
return true;
}
// Display the sequence
bool ShowPentagoal(const vector<int> &elems, string type)
{
for (auto item : elems)
{
cout << item << " ";
}
cout << "\n";
return true;
}<file_sep>/1/1.8/main.cpp
#include <iostream>
using namespace std;
int main()
{
int answer = 666;
int guess = 0;
int tryTimes(0);
string arr[] = {
"error1\n",
"error2\n",
"error3\n",
"error4\n"};
while (cin >> guess)
{
if (guess == answer)
{
break;
}
else
{
switch (tryTimes++)
{
case 0:
cout << arr[0];
break;
case 1:
cout << arr[1];
break;
case 2:
cout << arr[2];
break;
case 3:
cout << arr[3];
break;
default:
cout << arr[3];
break;
}
}
}
return 0;
}<file_sep>/1/1.6/main.cpp
#include <iostream>
#include <vector>
using namespace std;
int main()
{
int array[100];
int i = 0;
while (i < 100 && cin >> array[i])
{
i++;
}
vector<int> v(array, array + i);
int sum = 0;
for (int j = 0; j < i; j++)
{
sum += array[j];
}
cout << sum << " " << sum / 1.0 / i << endl;
sum = 0;
for (int j = 0; j < v.size(); j++)
{
sum += v[j];
}
cout << sum << " " << sum / 1.0 / i;
return 0;
}<file_sep>/2/2.6/main.cpp
#include <iostream>
#include <vector>
template <typename T>
T MyMax(T t1, T t2);
template <typename T>
T MyMax(std::vector<T> elems);
template <typename T>
T MyMax(const T *array, int size);
int main()
{
std::cout << "max(1,2) = " << MyMax(1, 2) << std::endl;
std::cout << "max(1.0,2.0) = " << MyMax(1.0, 2.0) << std::endl;
std::cout << "max(\"adf\",\"sdfdsf\") = " << MyMax("adf", "sdfdsf") << std::endl;
std::vector<int> vInt{1, 2, 3, 4};
std::cout << "max(vector<int>) = " << MyMax(vInt) << std::endl;
std::vector<double> vDouble{1.76, 2.1234, 3.5, 2.346};
std::cout << "max(vector<double>) = " << MyMax(vDouble) << std::endl;
std::vector<std::string> vString{"1.76, 2.1234, 3.5, 2.346", "ga", "wert", "dsaf"};
std::cout << "max(vector<string>) = " << MyMax(vString) << std::endl;
int aInt[] = {1, 2, 3, 45};
std::cout << "max(int[]) = " << MyMax(aInt, sizeof(aInt) / sizeof(aInt[0])) << std::endl;
double aDouble[] = {-3.5, 234.534, 24.124, 213556.2134};
std::cout << "max(double[]) = " << MyMax(aDouble, sizeof(aDouble) / sizeof(aDouble[0])) << std::endl;
std::string aString[] = {"1,2,3,45", "sdf", "gdrh", "zgfsgh"};
std::cout << "max(string[]) = " << MyMax(aString, sizeof(aString) / sizeof(aString[0])) << std::endl;
return 0;
}
template <typename T>
T MyMax(T t1, T t2)
{
return std::max(t1, t2);
}
template <typename T>
T MyMax(std::vector<T> elems)
{
if (elems.size() == 0)
{
std::cerr << "空vector\n";
return 0;
}
T maxItem = elems[0];
for (int i = 1; i < elems.size(); ++i)
{
maxItem = MyMax(maxItem, elems[i]);
}
return maxItem;
}
template <typename T>
T MyMax(const T *array, int size)
{
if (0 == array || size <= 0)
{
std::cerr << "空array\n";
return 0;
}
T maxNum = array[0];
for (int i = 1; i < size; ++i)
{
maxNum = MyMax(maxNum, array[i]);
}
return maxNum;
}<file_sep>/2/2.4/main.cpp
#include <iostream>
#include <vector>
using namespace std;
bool IsSizeOK(int size);
const vector<int> *GetPentagonal(int size);
bool ShowPentagoal(const vector<int> &elems, string type = "int");
int main()
{
if (0 != GetPentagonal(-1))
{
ShowPentagoal(*GetPentagonal(-1));
}
if (0 != GetPentagonal(0))
{
ShowPentagoal(*GetPentagonal(0));
}
if (0 != GetPentagonal(10))
{
ShowPentagoal(*GetPentagonal(10));
}
if (0 != GetPentagonal(1025))
{
ShowPentagoal(*GetPentagonal(1025));
}
return 0;
}
bool IsSizeOK(int size)
{
const int max_size = 1024;
if (size <= 0 || size > max_size)
{
return false;
}
return true;
}
// Build the sequence of pentagonal numbers
const vector<int> *GetPentagonal(int size)
{
static vector<int> elems;
if (!IsSizeOK(size))
{
cerr << "GetPentagonal(): oops: invalid size: "
<< size << " -- can't fulfill request.\n";
return nullptr;
}
for (int ix = elems.size(); ix < size; ++ix)
{
elems.push_back((ix + 1) * (3 * (ix + 1) - 1) / 2);
}
return &elems;
}
// Display the sequence
bool ShowPentagoal(const vector<int> &elems, string type)
{
for (auto item : elems)
{
cout << item << " ";
}
cout << "\n";
return true;
}<file_sep>/1/1.7/main.cpp
#include <iostream>
#include <fstream>
#include <vector>
#include <algorithm>
using namespace std;
int main()
{
vector<string> v;
ifstream infile("test.txt");
ofstream outfile("test2.txt");
string temp;
while (infile >> temp)
{
v.push_back(temp);
}
for (int j = 0; j < v.size(); j++)
{
cout << v[j] << " ";
}
sort(v.begin(),v.end());
for (int j = 0; j < v.size(); j++)
{
outfile << v[j] << " ";
}
return 0;
} | cb29c25645e20c0dda8ba38bae3d3546169ed646 | [
"C++"
]
| 7 | C++ | LexAndYacc/EssentialCpp | fafa9e2d49038c7627f0731f081f5fb15c4f3080 | 22285fadcd359c0b34f5d0f60d3f80b3ca3527e1 |
refs/heads/master | <file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
public class AdminMessage {
@SerializedName("error")
@Expose
private boolean error;
@SerializedName("message")
@Expose
private String message;
public boolean getError(){
return error;
}
public void setError(boolean input){
this.error = input;
}
public String getMsg(){
return message;
}
public void setMsg(String input){
this.message = input;
}
}
<file_sep>package com.digitalone.kasiranto.fragment;
import android.content.Intent;
import android.os.Bundle;
import android.support.design.widget.FloatingActionButton;
import android.support.design.widget.TabLayout;
import android.support.v4.app.Fragment;
import android.support.v4.view.ViewPager;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.activity.ActivityFilter;
import com.digitalone.kasiranto.adapter.AdapterViewPagerTransaksi;
import com.digitalone.kasiranto.adapter.ViewPagerAdapter;
/**
* A simple {@link Fragment} subclass.
*/
public class FragmentTransaksi extends Fragment implements View.OnClickListener{
private Toolbar toolbar;
private View view;
private AdapterViewPagerTransaksi adapter;
private TabLayout tabLayout;
private ViewPager viewPager;
private FloatingActionButton fab;
public static FragmentTransaksi newInstance(){
return new FragmentTransaksi();
}
public FragmentTransaksi() {
// Required empty public constructor
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
view = inflater.inflate(R.layout.fragment_transaksi, container, false);
initView();
return view;
}
private void initView() {
fab = view.findViewById(R.id.fabfilter);
toolbar = view.findViewById(R.id.toolbar);
fab.setOnClickListener(this);
((AppCompatActivity)getActivity()).setSupportActionBar(toolbar);
((AppCompatActivity) getActivity()).getSupportActionBar().setTitle("Transaksi");
viewPager = view.findViewById(R.id.pagertransaksi);
tabLayout = view.findViewById(R.id.tabtransaksi);
adapter = new AdapterViewPagerTransaksi(getChildFragmentManager());
viewPager.setAdapter(adapter);
tabLayout.setupWithViewPager(viewPager);
}
@Override
public void onClick(View v) {
switch (v.getId()){
case R.id.fabfilter:
startActivity(new Intent(getActivity(), ActivityFilter.class));
break;
}
}
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.activity.ActivityKafeCheckout;
import com.digitalone.kasiranto.activity.ActivityKolamIkanCheckout;
import com.digitalone.kasiranto.fragment.FragmentKafe;
import com.digitalone.kasiranto.fragment.FragmentKolamIkan;
import com.digitalone.kasiranto.helper.DBHelper;
import com.digitalone.kasiranto.model.KafeTemp;
import java.util.List;
public class AdapterKafeTemp extends RecyclerView.Adapter<AdapterKafeTemp.ListTemp> {
private List<KafeTemp> items;
private Context context = null;
private DBHelper helper ;
public AdapterKafeTemp(List<KafeTemp> items, Context context) {
this.items = items;
this.context = context;
}
@NonNull
@Override
public AdapterKafeTemp.ListTemp onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
helper = new DBHelper(context);
View view = LayoutInflater.from(parent.getContext())
.inflate(R.layout.list_kafe_bayar, parent, false);
return new ListTemp(view);
}
@Override
public void onBindViewHolder(@NonNull AdapterKafeTemp.ListTemp holder, int position) {
final KafeTemp kafeItem = items.get(position);
holder.txtId.setText(String.valueOf(kafeItem.getKafe_id_sql()));
holder.txtId.setVisibility(View.GONE);
holder.txtNama.setText(kafeItem.getKafe_nama());
holder.txtJumlah.setText(kafeItem.getKafe_jumlah());
holder.txtHarga.setText(kafeItem.getKafe_harga());
holder.hapus.setOnClickListener(new View.OnClickListener(){
@Override
public void onClick(View v) {
FragmentKafe.totaldetail = FragmentKafe.totaldetail - Integer.parseInt(kafeItem.getKafe_harga());
helper.deleteItemKafe(kafeItem);
helper.getAllKafeTemps();
((ActivityKafeCheckout)context).refresh();
}
});
}
@Override
public int getItemCount() {
return items.size();
}
class ListTemp extends RecyclerView.ViewHolder{
private TextView txtNama, txtJumlah, txtHarga, txtId,hapus;
public ListTemp(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.temp_kafe_item);
txtJumlah = itemView.findViewById(R.id.temp_kafe_jumlah);
txtHarga = itemView.findViewById(R.id.temp_kafe_harga);
txtId = itemView.findViewById(R.id.temp_kafe_id);
hapus = itemView.findViewById(R.id.hapus);
}
}
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.model.KolamIkanTransaksiItem;
import java.util.List;
public class AdapterTransaksiKolamIkan extends RecyclerView.Adapter<AdapterTransaksiKolamIkan.HolderTransKafe> {
private List<KolamIkanTransaksiItem> items;
private Context context;
public AdapterTransaksiKolamIkan(List<KolamIkanTransaksiItem> items, Context context) {
this.items = items;
this.context = context;
}
@NonNull
@Override
public AdapterTransaksiKolamIkan.HolderTransKafe onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
View view = LayoutInflater.from(parent.getContext())
.inflate(R.layout.list_transaksi_kafe, parent, false);
return new HolderTransKafe(view);
}
@Override
public void onBindViewHolder(@NonNull AdapterTransaksiKolamIkan.HolderTransKafe holder, int position) {
final KolamIkanTransaksiItem item = items.get(position);
holder.txtNama.setText(item.getNamaItemKafe());
holder.txtHarga.setText(String.valueOf(item.getHargaItemKafe()));
holder.txtTanggal.setText(item.getKtDatetime());
holder.txtJumlah.setText(String.valueOf(item.getKtJumlah()));
holder.txtTotal.setText(String.valueOf(item.getKtTotal()));
}
@Override
public int getItemCount() {
return items.size();
}
public class HolderTransKafe extends RecyclerView.ViewHolder {
private TextView txtNama, txtHarga, txtTanggal,txtJumlah, txtTotal;
public HolderTransKafe(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.kafe_transaksi_nama);
txtHarga = itemView.findViewById(R.id.kafe_transaksi_harga);
txtTanggal = itemView.findViewById(R.id.kafe_transaksi_waktu);
txtJumlah = itemView.findViewById(R.id.kafe_transaksi_jumlah);
txtTotal = itemView.findViewById(R.id.kafe_transaksi_total);
}
}
}
<file_sep>package com.digitalone.kasiranto.service;
import com.digitalone.kasiranto.model.Kafe;
import com.digitalone.kasiranto.model.AdminMessage;
import com.digitalone.kasiranto.model.KafeDate;
import com.digitalone.kasiranto.model.KafeTransaksi;
import com.digitalone.kasiranto.model.KolamIkan;
import com.digitalone.kasiranto.model.KolamIkanDate;
import com.digitalone.kasiranto.model.KolamRenang;
import com.digitalone.kasiranto.model.KolamRenangDate;
import com.digitalone.kasiranto.model.KolamTransaksi;
import com.digitalone.kasiranto.model.TiketMasuk;
import com.digitalone.kasiranto.model.TiketMasukDate;
import com.digitalone.kasiranto.model.Toko;
import com.digitalone.kasiranto.model.TokoDate;
import com.digitalone.kasiranto.model.TokoTransaksi;
import com.digitalone.kasiranto.model.Transaksi;
import com.digitalone.kasiranto.model.Warung;
import com.digitalone.kasiranto.model.WarungDate;
import com.digitalone.kasiranto.model.WarungTransaksi;
import com.google.gson.JsonArray;
import retrofit2.Call;
import retrofit2.http.Body;
import retrofit2.http.Field;
import retrofit2.http.FormUrlEncoded;
import retrofit2.http.GET;
import retrofit2.http.POST;
public interface APIService {
String BASE_URL = "http://rainflare.org/";
/** Area Kafe **/
@GET("kasiranto/GetKafe.php")
Call<Kafe> getKafe();
@FormUrlEncoded
@POST("kasiranto/InsertKafe.php")
Call<AdminMessage> insertKafe(@Field("nama_item_kafe") String nama,
@Field("harga_item_kafe") String harga,
@Field("stok_item_kafe") String stok);
@FormUrlEncoded
@POST("kasiranto/UpdateKafe.php")
Call<AdminMessage> updateKafe(@Field("id") String id,
@Field("nama_item_kafe") String nama,
@Field("harga_item_kafe") String harga,
@Field("stok_item_kafe") String stok);
@FormUrlEncoded
@POST("kasiranto/DeleteKafe.php")
Call<AdminMessage> deleteKafe(@Field("id") String id);
@POST("kasiranto/InsertPesananKafe.php")
Call<AdminMessage> ordered(@Body JsonArray array_items);
@GET("kasiranto/GetKafeTransaksi.php")
Call<KafeTransaksi> getKafeTr();
@FormUrlEncoded
@POST("kasiranto/GetFilterKafe.php")
Call<KafeDate> filterKafeDate(@Field("from") String from, @Field("to") String to);
/** Area Toko **/
@GET("kasiranto/GetToko.php")
Call<Toko> getToko();
@FormUrlEncoded
@POST("kasiranto/GetFilterToko.php")
Call<TokoDate> filterTokoDate(@Field("from") String from, @Field("to") String to);
@FormUrlEncoded
@POST("kasiranto/InsertToko.php")
Call<AdminMessage> insertToko(@Field("toko_nama") String nama,
@Field("toko_harga") String harga,
@Field("toko_stok") String stok);
@POST("kasiranto/InsertPesananToko.php")
Call<AdminMessage> orderedToko(@Body JsonArray array_items);
@FormUrlEncoded
@POST("kasiranto/UpdateToko.php")
Call<AdminMessage> updateToko(@Field("toko_id") String id,
@Field("toko_nama") String nama,
@Field("toko_harga") String harga,
@Field("toko_stok") String stok);
@FormUrlEncoded
@POST("kasiranto/DeleteToko.php")
Call<AdminMessage> deleteToko(@Field("toko_id") String id);
@GET("kasiranto/GetTokoTransaksi.php")
Call<TokoTransaksi> getTokoTr();
/** Area Warung **/
@GET("kasiranto/GetWarung.php")
Call<Warung> getWarung();
@FormUrlEncoded
@POST("kasiranto/InsertWarung.php")
Call<AdminMessage> insertWarung(@Field("warung_nama") String nama,
@Field("warung_harga") String harga,
@Field("warung_stok") String stok);
@POST("kasiranto/InsertPesananWarung.php")
Call<AdminMessage> orderedWarung(@Body JsonArray array_items);
@FormUrlEncoded
@POST("kasiranto/UpdateWarung.php")
Call<AdminMessage> updateWarung(@Field("warung_id") String id,
@Field("warung_nama") String nama,
@Field("warung_harga") String harga,
@Field("warung_stok") String stok);
@FormUrlEncoded
@POST("kasiranto/DeleteWarung.php")
Call<AdminMessage> deleteWarung(@Field("warung_id") String id);
@GET("kasiranto/GetWarungTransaksi.php")
Call<WarungTransaksi> getWarungTr();
@FormUrlEncoded
@POST("kasiranto/GetFilterWarung.php")
Call<WarungDate> filterWarungDate(@Field("from") String from, @Field("to") String to);
/** Area Kolam Ikan **/
@FormUrlEncoded
@POST("kasiranto/InsertKolamIkan.php")
Call<AdminMessage> insertKolamIkan(@Field("nama_item_KolamIkan") String nama,
@Field("harga_item_KolamIkan") String harga,
@Field("stok_item_KolamIkan") String stok);
@GET("kasiranto/GetKolamIkanTransaksi.php")
Call<KolamTransaksi> getKolamikanTr();
@FormUrlEncoded
@POST("kasiranto/UpdateKolamIkan.php")
Call<AdminMessage> updateKolamIkan(@Field("id") String id,
@Field("nama_item_KolamIkan") String nama,
@Field("harga_item_KolamIkan") String harga,
@Field("stok_item_KolamIkan") String stok);
@FormUrlEncoded
@POST("kasiranto/DeleteKolamIkan.php")
Call<AdminMessage> deleteKolamIkan(@Field("toko_id") String id);
@POST("kasiranto/InsertPesananKolamIkan.php")
Call<AdminMessage> ordered_Kolamikan(@Body JsonArray array_items);
@GET("kasiranto/GetKolamIkan.php")
Call<KolamIkan> getKolamIkan();
@FormUrlEncoded
@POST("kasiranto/GetFilterKolamDate.php")
Call<KolamIkanDate> filterKolamDate(@Field("from") String from, @Field("to") String to);
/** Area Tiket Masuk **/
@FormUrlEncoded
@POST("kasiranto/InsertTiketMasuk.php")
Call<AdminMessage> insertTiketMasuk(@Field("nama_item_tiketmasuk") String nama,
@Field("harga_item_tiketmasuk") String harga);
@FormUrlEncoded
@POST("kasiranto/UpdateTiketMasuk.php")
Call<AdminMessage> updateTiketMasuk(@Field("id") String id,
@Field("nama_item_tiketmasuk") String nama,
@Field("harga_item_tiketmasuk") String harga);
@FormUrlEncoded
@POST("kasiranto/DeleteTiketMasuk.php")
Call<AdminMessage> deleteTiketMasuk(@Field("id") String id);
@GET("kasiranto/GetTiketMasuk.php")
Call<TiketMasuk> getTiketmasuk();
@POST("kasiranto/InsertPesananTiketMasuk.php")
Call<AdminMessage> ordered_tiketmasuk(@Body JsonArray array_items);
@FormUrlEncoded
@POST("kasiranto/GetFilterTiketMasukDate.php")
Call<TiketMasukDate> filterTiketMasukDate(@Field("from") String from, @Field("to") String to);
/** Area Kolam Renang **/
@FormUrlEncoded
@POST("kasiranto/InsertKolamRenang.php")
Call<AdminMessage> insertKolamRenang(@Field("nama_item_kolamrenang") String nama,
@Field("harga_item_kolamrenang") String harga);
@FormUrlEncoded
@POST("kasiranto/UpdateKolamRenang.php")
Call<AdminMessage> updateKolamRenang(@Field("id") String id,
@Field("nama_item_kolamrenang") String nama,
@Field("harga_item_kolamrenang") String harga);
@FormUrlEncoded
@POST("kasiranto/DeleteKolamRenang.php")
Call<AdminMessage> deleteKolamRenang(@Field("id") String id);
@GET("kasiranto/GetKolamRenang.php")
Call<KolamRenang> getKolamRenang();
@POST("kasiranto/InsertPesananKolamRenang.php")
Call<AdminMessage> ordered_KolamRenang(@Body JsonArray array_items);
@FormUrlEncoded
@POST("kasiranto/GetFilterKolamRenangDate.php")
Call<KolamRenangDate> filterKolamRenangDate(@Field("from") String from, @Field("to") String to);
/** Area User **/
@FormUrlEncoded
@POST("kasiranto/UpdatePassword.php")
Call<AdminMessage> updatePassword(@Field("<PASSWORD>") String password,
@Field("id") String id);
/** AREA ALL TRANSAKSI **/
@POST("kasiranto/InsertAllTokoTransaksi.php")
Call<AdminMessage> transaksiAllToko(@Body JsonArray array_items);
@POST("kasiranto/InsertAllKafeTransaksi.php")
Call<AdminMessage> transaksiAllKafe(@Body JsonArray array_items);
@POST("kasiranto/InsertAllWarungTransaksi.php")
Call<AdminMessage> transaksiAllWarung(@Body JsonArray array_items);
@POST("kasiranto/InsertAllKolamRenangTransaksi.php")
Call<AdminMessage> transaksiAllKolamRenang(@Body JsonArray array_items);
@POST("kasiranto/InsertAllTiketMasukTransaksi.php")
Call<AdminMessage> transaksiAllTiketMasuk(@Body JsonArray array_items);
@POST("kasiranto/InsertAllKolamIkanTransaksi.php")
Call<AdminMessage> transaksiAllKolamIkan(@Body JsonArray array_items);
@FormUrlEncoded
@POST("kasiranto/GetAllTransaksi.php")
Call<Transaksi> getAllTransaksi(@Field("from") String from, @Field("to") String to);
}
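/*
 * Usage sketch (illustrative only, not part of the original source): a Retrofit
 * interface like the one above is normally instantiated through the Retrofit
 * builder. The base URL and the interface name "ServiceAPI" are assumptions;
 * the project wires this up via RestAPIHelper.ServiceApi(...).
 *
 *   Retrofit retrofit = new Retrofit.Builder()
 *       .baseUrl("http://example.com/")                     // assumed base URL
 *       .addConverterFactory(GsonConverterFactory.create()) // Gson <-> JSON
 *       .build();
 *   ServiceAPI api = retrofit.create(ServiceAPI.class);     // assumed name
 *   api.getToko().enqueue(callback);                        // asynchronous call
 */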
<file_sep>package com.digitalone.kasiranto.activity;
import android.app.ProgressDialog;
import android.content.Intent;
import android.support.annotation.NonNull;
import android.support.design.widget.BottomNavigationView;
import android.support.v4.app.FragmentTransaction;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.support.v7.widget.Toolbar;
import android.view.Menu;
import android.view.MenuItem;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.fragment.FragmentHome;
import com.digitalone.kasiranto.fragment.FragmentTransaksi;
import com.digitalone.kasiranto.preferences.SessionManager;
import java.util.HashMap;
public class ActivitySuperUser extends AppCompatActivity implements BottomNavigationView.OnNavigationItemSelectedListener {
SessionManager session;
TextView tLevel;
private ProgressDialog pDialog;
BottomNavigationView bottomNav;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_super_user);
initView();
if (savedInstanceState == null){
loadHomeFragment();
}
}
private void initView(){
bottomNav = findViewById(R.id.bottom_navigation);
bottomNav.setOnNavigationItemSelectedListener(this);
pDialog = new ProgressDialog(this);
session = new SessionManager(this);
// HashMap<String, String> user = session.getUser();
// String levelname = user.get(SessionManager.KEY_LEVEL);
pDialog.setCancelable(false);
if (!session.isLoggedIn()) {
logoutUser();
}
//tLevel.setText(levelname);
}
private void logoutUser() {
session.setLogin(false);
pDialog.setMessage("Logging out ...");
showDialog();
// Launching the login activity
Intent intent = new Intent(ActivitySuperUser.this, ActivityLogin.class);
startActivity(intent);
hideDialog();
finish();
}
private void showDialog() {
if (!pDialog.isShowing())
pDialog.show();
}
private void hideDialog() {
if (pDialog.isShowing())
pDialog.dismiss();
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
getMenuInflater().inflate(R.menu.menu_admin, menu);
return super.onCreateOptionsMenu(menu);
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()){
case R.id.keluar:
logoutUser();
break;
case R.id.ubah_password:
startActivity(new Intent(ActivitySuperUser.this, ActivityUbahPassword.class));
break;
}
return super.onOptionsItemSelected(item);
}
@Override
public boolean onNavigationItemSelected(@NonNull MenuItem item) {
switch (item.getItemId()){
case R.id.homee:
loadHomeFragment();
return true;
case R.id.transaksi:
loadTransaksiFragment();
return true;
}
return false;
}
private void loadHomeFragment() {
FragmentHome fragment = FragmentHome.newInstance();
FragmentTransaction ft = getSupportFragmentManager().beginTransaction();
ft.replace(R.id.fragment_frame, fragment);
ft.commit();
}
private void loadTransaksiFragment() {
FragmentTransaksi fragment = FragmentTransaksi.newInstance();
FragmentTransaction ft = getSupportFragmentManager().beginTransaction();
ft.replace(R.id.fragment_frame, fragment);
ft.commit();
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
public class TokoTransaksiItem {
@SerializedName("toko_nama")
@Expose
private String tokoNama;
@SerializedName("toko_harga")
@Expose
private int tokoHarga;
@SerializedName("tt_jumlah")
@Expose
private String ttJumlah;
@SerializedName("tt_total")
@Expose
private String ttTotal;
@SerializedName("tt_datetime")
@Expose
private String ttDatetime;
public String getTokoNama(){
return tokoNama;
}
public void setTokoNama(String input){
this.tokoNama = input;
}
public int getTokoHarga(){
return tokoHarga;
}
public void setTokoHarga(int input){
this.tokoHarga = input;
}
public String getTtJumlah(){
return ttJumlah;
}
public void setTtJumlah(String input){
this.ttJumlah = input;
}
public String getTtTotal(){
return ttTotal;
}
public void setTtTotal(String input){
this.ttTotal = input;
}
public String getTtDatetime(){
return ttDatetime;
}
public void setTtDatetime(String input){
this.ttDatetime = input;
}
}
<file_sep>package com.digitalone.kasiranto.helper;
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import com.digitalone.kasiranto.model.KafeTemp;
import com.digitalone.kasiranto.model.KolamIkanTemp;
import com.digitalone.kasiranto.model.KolamRenangTemp;
import com.digitalone.kasiranto.model.TiketMasukTemp;
import com.digitalone.kasiranto.model.TokoTemp;
import com.digitalone.kasiranto.model.WarungTemp;
import java.util.ArrayList;
import java.util.List;
public class DBHelper extends SQLiteOpenHelper {
private static final int DATABASE_VERSION = 1;
private static final String DATABASE_NAME = "kafe_db";
public DBHelper(Context context) {
super(context, DATABASE_NAME, null, DATABASE_VERSION);
}
@Override
public void onCreate(SQLiteDatabase db) {
db.execSQL(KafeTemp.CREATE_TABLE);
db.execSQL(TokoTemp.CREATE_TABLE_TOKO);
db.execSQL(WarungTemp.CREATE_TABLE_WARUNG);
db.execSQL(TiketMasukTemp.CREATE_TABLE_TIKETMASUK);
db.execSQL(KolamIkanTemp.CREATE_TABLE_KOLAMIKAN);
db.execSQL(KolamRenangTemp.CREATE_TABLE_KOLAMRENANG);
}
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
// Cases fall through intentionally: every table from oldVersion onward is
// dropped, then onCreate() rebuilds the full schema below.
switch(oldVersion) {
case 1:
db.execSQL("DROP TABLE IF EXISTS " + KafeTemp.TABLE_NAME);
case 2:
db.execSQL("DROP TABLE IF EXISTS " + TokoTemp.TABLE_TOKO_NAME);
case 3:
db.execSQL("DROP TABLE IF EXISTS " + WarungTemp.TABLE_WARUNG_NAME);
case 4:
db.execSQL("DROP TABLE IF EXISTS " + TiketMasukTemp.TABLE_NAME_TiketMasuk);
case 5:
db.execSQL("DROP TABLE IF EXISTS " + KolamIkanTemp.TABLE_KOLAMIKAN_NAME);
case 6 :
db.execSQL("DROP TABLE IF EXISTS " + KolamRenangTemp.TABLE_RENANG_NAME);
break;
default:
throw new IllegalStateException(
"onUpgrade() with unknown oldVersion " + oldVersion);
}
onCreate(db);
}
public long insertKolamrenang(String nama, String jumlah, String harga, int kolamrenangid) {
SQLiteDatabase db = this.getWritableDatabase();
ContentValues values = new ContentValues();
values.put(KolamRenangTemp.COLUMN_KOLAM_RENANG_ITEM, nama);
values.put(KolamRenangTemp.COLUMN_KOLAM_RENANG_JUMLAH, jumlah);
values.put(KolamRenangTemp.COLUMN_KOLAM_RENANG_HARGA, harga);
values.put(KolamRenangTemp.COLUMN_KOLAM_RENANG_ID_SQL, kolamrenangid);
long id = db.insert(KolamRenangTemp.TABLE_RENANG_NAME, null, values);
db.close();
return id;
}
public long insertKafe(String nama, String jumlah, String harga, int kafeid) {
SQLiteDatabase db = this.getWritableDatabase();
ContentValues values = new ContentValues();
values.put(KafeTemp.COLUMN_KAFE_ITEM, nama);
values.put(KafeTemp.COLUMN_KAFE_JUMLAH, jumlah);
values.put(KafeTemp.COLUMN_KAFE_HARGA, harga);
values.put(KafeTemp.COLUMN_KAFE_ID_SQL, kafeid);
long id = db.insert(KafeTemp.TABLE_NAME, null, values);
db.close();
return id;
}
public long insertKolamIkan(String nama, String jumlah, String harga, int kolamIkanId) {
SQLiteDatabase db = this.getWritableDatabase();
ContentValues values = new ContentValues();
// Column names must belong to the kolam ikan table, matching its CREATE statement.
values.put(KolamIkanTemp.COLUMN_KOLAMIKAN_ITEM, nama);
values.put(KolamIkanTemp.COLUMN_KOLAMIKAN_JUMLAH, jumlah);
values.put(KolamIkanTemp.COLUMN_KOLAMIKAN_HARGA, harga);
values.put(KolamIkanTemp.COLUMN_KOLAMIKAN_ID_SQL, kolamIkanId);
long id = db.insert(KolamIkanTemp.TABLE_KOLAMIKAN_NAME, null, values);
db.close();
return id;
}
public long insertTiketMasuk(String nama, String jumlah, String harga, int tiketMasukId) {
SQLiteDatabase db = this.getWritableDatabase();
ContentValues values = new ContentValues();
// Column names must belong to the tiket masuk table, matching its CREATE statement.
values.put(TiketMasukTemp.COLUMN_TIKETMASUK_ITEM, nama);
values.put(TiketMasukTemp.COLUMN_TIKETMASUK_JUMLAH, jumlah);
values.put(TiketMasukTemp.COLUMN_TIKETMASUK_HARGA, harga);
values.put(TiketMasukTemp.COLUMN_TIKETMASUK_ID_SQL, tiketMasukId);
long id = db.insert(TiketMasukTemp.TABLE_NAME_TiketMasuk, null, values);
db.close();
return id;
}
public void deleteallTiketMasuk(){
SQLiteDatabase db = this.getWritableDatabase();
db.execSQL(" DELETE FROM " + TiketMasukTemp.TABLE_NAME_TiketMasuk);
db.close();
}
public void deleteallKolamRenang(){
SQLiteDatabase db = this.getWritableDatabase();
db.execSQL(" DELETE FROM " + KolamRenangTemp.TABLE_RENANG_NAME);
db.close();
}
public void deleteallKolamIkan(){
SQLiteDatabase db = this.getWritableDatabase();
db.execSQL(" DELETE FROM " + KolamIkanTemp.TABLE_KOLAMIKAN_NAME);
db.close();
}
public void deleteallToko(){
SQLiteDatabase db = this.getWritableDatabase();
db.execSQL(" DELETE FROM " + TokoTemp.TABLE_TOKO_NAME);
db.close();
}
public void deleteallWarung(){
SQLiteDatabase db = this.getWritableDatabase();
db.execSQL(" DELETE FROM " + WarungTemp.TABLE_WARUNG_NAME);
db.close();
}
public void deleteItemkolamikan(KolamIkanTemp kolam){
SQLiteDatabase db = this.getWritableDatabase();
db.delete(KolamIkanTemp.TABLE_KOLAMIKAN_NAME, KolamIkanTemp.COLUMN_KOLAMIKAN_ID + " = ?",
new String[]{String.valueOf(kolam.getKolamIkan_id())});
db.close();
}
public void deleteItemKafe(KafeTemp kolam){
SQLiteDatabase db = this.getWritableDatabase();
db.delete(KafeTemp.TABLE_NAME, KafeTemp.COLUMN_KAFE_ID + " = ?",
new String[]{String.valueOf(kolam.getKafe_id())});
db.close();
}
public void deleteItemKolamrenang(KolamRenangTemp kolam){
SQLiteDatabase db = this.getWritableDatabase();
db.delete(KolamRenangTemp.TABLE_RENANG_NAME, KolamRenangTemp.COLUMN_KOLAM_RENANG_ID + " = ?",
new String[]{String.valueOf(kolam.getKolamRenang_id())});
db.close();
}
public void deleteitemTiketmasuk(TiketMasukTemp kolam){
SQLiteDatabase db = this.getWritableDatabase();
db.delete(TiketMasukTemp.TABLE_NAME_TiketMasuk, TiketMasukTemp.COLUMN_TIKETMASUK_ID + " = ?",
new String[]{String.valueOf(kolam.getTiketMasuk_id())});
db.close();
}
public void deleteitemWarung(WarungTemp kolam){
SQLiteDatabase db = this.getWritableDatabase();
db.delete(WarungTemp.TABLE_WARUNG_NAME, WarungTemp.COLUMN_WARUNG_ID + " = ?",
new String[]{String.valueOf(kolam.getWarung_id())});
db.close();
}
public long insertToko(String nama, String jumlah, String harga, int tokoid) {
SQLiteDatabase db = this.getWritableDatabase();
ContentValues values = new ContentValues();
values.put(TokoTemp.COLUMN_TOKO_ITEM, nama);
values.put(TokoTemp.COLUMN_TOKO_JUMLAH, jumlah);
values.put(TokoTemp.COLUMN_TOKO_HARGA, harga);
values.put(TokoTemp.COLUMN_TOKO_ID_SQL, tokoid);
long id = db.insert(TokoTemp.TABLE_TOKO_NAME, null, values);
db.close();
return id;
}
public long insertWarung(String nama, String jumlah, String harga, int tokoid) {
SQLiteDatabase db = this.getWritableDatabase();
ContentValues values = new ContentValues();
values.put(WarungTemp.COLUMN_WARUNG_ITEM, nama);
values.put(WarungTemp.COLUMN_WARUNG_JUMLAH, jumlah);
values.put(WarungTemp.COLUMN_WARUNG_HARGA, harga);
values.put(WarungTemp.COLUMN_WARUNG_ID_SQL, tokoid);
long id = db.insert(WarungTemp.TABLE_WARUNG_NAME, null, values);
db.close();
return id;
}
public KafeTemp getKafeTemps(long id) {
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.query(KafeTemp.TABLE_NAME,
new String[]{KafeTemp.COLUMN_KAFE_ID, KafeTemp.COLUMN_KAFE_ITEM, KafeTemp.COLUMN_KAFE_JUMLAH, KafeTemp.COLUMN_KAFE_HARGA, KafeTemp.COLUMN_KAFE_ID_SQL},
KafeTemp.COLUMN_KAFE_ID + "=?",
new String[]{String.valueOf(id)}, null, null, null, null);
if (cursor != null)
cursor.moveToFirst();
KafeTemp kafeTemp = new KafeTemp(
cursor.getInt(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_ID)),
cursor.getString(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_ITEM)),
cursor.getString(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_JUMLAH)),
cursor.getString(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_HARGA)),
cursor.getInt(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_ID_SQL)));
cursor.close();
return kafeTemp;
}
public TokoTemp getTokoTemps(long id) {
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.query(TokoTemp.TABLE_TOKO_NAME,
new String[]{TokoTemp.COLUMN_TOKO_ID, TokoTemp.COLUMN_TOKO_ITEM, TokoTemp.COLUMN_TOKO_JUMLAH,
TokoTemp.COLUMN_TOKO_HARGA, TokoTemp.COLUMN_TOKO_ID_SQL},
TokoTemp.COLUMN_TOKO_ID + "=?",
new String[]{String.valueOf(id)}, null, null, null, null);
if (cursor != null)
cursor.moveToFirst();
TokoTemp tokoTemp = new TokoTemp(
cursor.getInt(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_ID)),
cursor.getString(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_ITEM)),
cursor.getString(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_JUMLAH)),
cursor.getString(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_HARGA)),
cursor.getInt(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_ID_SQL)));
cursor.close();
return tokoTemp;
}
public List<KafeTemp> getAllKafeTemps() {
List<KafeTemp> temps = new ArrayList<>();
String selectQuery = "SELECT * FROM " + KafeTemp.TABLE_NAME;
SQLiteDatabase db = this.getWritableDatabase();
Cursor cursor = db.rawQuery(selectQuery, null);
if (cursor.moveToFirst()) {
do {
KafeTemp kafeTemp = new KafeTemp();
kafeTemp.setKafe_id(cursor.getInt(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_ID)));
kafeTemp.setKafe_nama(cursor.getString(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_ITEM)));
kafeTemp.setKafe_jumlah(cursor.getString(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_JUMLAH)));
kafeTemp.setKafe_harga(cursor.getString(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_HARGA)));
kafeTemp.setKafe_id_sql(cursor.getInt(cursor.getColumnIndex(KafeTemp.COLUMN_KAFE_ID_SQL)));
temps.add(kafeTemp);
} while (cursor.moveToNext());
}
db.close();
return temps;
}
public List<KolamIkanTemp> getAllKolamIkanTemps() {
List<KolamIkanTemp> temps = new ArrayList<>();
String selectQuery = "SELECT * FROM " + KolamIkanTemp.TABLE_KOLAMIKAN_NAME;
SQLiteDatabase db = this.getWritableDatabase();
Cursor cursor = db.rawQuery(selectQuery, null);
if (cursor.moveToFirst()) {
do {
KolamIkanTemp kafeTemp = new KolamIkanTemp();
kafeTemp.setKolamIkan_id(cursor.getInt(cursor.getColumnIndex(KolamIkanTemp.COLUMN_KOLAMIKAN_ID)));
kafeTemp.setKolamIkan_nama(cursor.getString(cursor.getColumnIndex(KolamIkanTemp.COLUMN_KOLAMIKAN_ITEM)));
kafeTemp.setKolamIkan_jumlah(cursor.getString(cursor.getColumnIndex(KolamIkanTemp.COLUMN_KOLAMIKAN_JUMLAH)));
kafeTemp.setKolamIkan_harga(cursor.getString(cursor.getColumnIndex(KolamIkanTemp.COLUMN_KOLAMIKAN_HARGA)));
kafeTemp.setKolamIkan_id_sql(cursor.getInt(cursor.getColumnIndex(KolamIkanTemp.COLUMN_KOLAMIKAN_ID_SQL)));
temps.add(kafeTemp);
} while (cursor.moveToNext());
}
db.close();
return temps;
}
public List<TiketMasukTemp> getAllTiketmasukTemps() {
List<TiketMasukTemp> temps = new ArrayList<>();
String selectQuery = "SELECT * FROM " + TiketMasukTemp.TABLE_NAME_TiketMasuk;
SQLiteDatabase db = this.getWritableDatabase();
Cursor cursor = db.rawQuery(selectQuery, null);
if (cursor.moveToFirst()) {
do {
TiketMasukTemp tiketMasukTemp = new TiketMasukTemp();
tiketMasukTemp.setTiketMasuk_id(cursor.getInt(cursor.getColumnIndex(TiketMasukTemp.COLUMN_TIKETMASUK_ID)));
tiketMasukTemp.setTiketMasuk_nama(cursor.getString(cursor.getColumnIndex(TiketMasukTemp.COLUMN_TIKETMASUK_ITEM)));
tiketMasukTemp.setTiketMasuk_jumlah(cursor.getString(cursor.getColumnIndex(TiketMasukTemp.COLUMN_TIKETMASUK_JUMLAH)));
tiketMasukTemp.setTiketMasuk_harga(cursor.getString(cursor.getColumnIndex(TiketMasukTemp.COLUMN_TIKETMASUK_HARGA)));
tiketMasukTemp.setTiketMasuk_id_sql(cursor.getInt(cursor.getColumnIndex(TiketMasukTemp.COLUMN_TIKETMASUK_ID_SQL)));
temps.add(tiketMasukTemp);
} while (cursor.moveToNext());
}
db.close();
return temps;
}
public List<KolamRenangTemp> getAllKolamRenangTemps() {
List<KolamRenangTemp> temps = new ArrayList<>();
String selectQuery = "SELECT * FROM " + KolamRenangTemp.TABLE_RENANG_NAME;
SQLiteDatabase db = this.getWritableDatabase();
Cursor cursor = db.rawQuery(selectQuery, null);
if (cursor.moveToFirst()) {
do {
KolamRenangTemp tiketMasukTemp = new KolamRenangTemp();
tiketMasukTemp.setKolamRenang_id(cursor.getInt(cursor.getColumnIndex(KolamRenangTemp.COLUMN_KOLAM_RENANG_ID)));
tiketMasukTemp.setKolamRenang_nama(cursor.getString(cursor.getColumnIndex(KolamRenangTemp.COLUMN_KOLAM_RENANG_ITEM)));
tiketMasukTemp.setKolamRenang_jumlah(cursor.getString(cursor.getColumnIndex(KolamRenangTemp.COLUMN_KOLAM_RENANG_JUMLAH)));
tiketMasukTemp.setKolamRenang_harga(cursor.getString(cursor.getColumnIndex(KolamRenangTemp.COLUMN_KOLAM_RENANG_HARGA)));
tiketMasukTemp.setKolamRenang_id_sql(cursor.getInt(cursor.getColumnIndex(KolamRenangTemp.COLUMN_KOLAM_RENANG_ID_SQL)));
temps.add(tiketMasukTemp);
} while (cursor.moveToNext());
}
db.close();
return temps;
}
public List<TokoTemp> getAllTokoTemps() {
List<TokoTemp> temps = new ArrayList<>();
String selectQuery = "SELECT * FROM " + TokoTemp.TABLE_TOKO_NAME;
SQLiteDatabase db = this.getWritableDatabase();
Cursor cursor = db.rawQuery(selectQuery, null);
if (cursor.moveToFirst()) {
do {
TokoTemp tokoTemp = new TokoTemp();
tokoTemp.setToko_id(cursor.getInt(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_ID)));
tokoTemp.setToko_nama(cursor.getString(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_ITEM)));
tokoTemp.setToko_harga(cursor.getString(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_HARGA)));
tokoTemp.setToko_id_sql(cursor.getInt(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_ID_SQL)));
tokoTemp.setToko_jumlah(cursor.getString(cursor.getColumnIndex(TokoTemp.COLUMN_TOKO_JUMLAH)));
temps.add(tokoTemp);
} while (cursor.moveToNext());
}
db.close();
return temps;
}
public List<WarungTemp> getAllWarungTemps() {
List<WarungTemp> temps = new ArrayList<>();
String selectQuery = "SELECT * FROM " + WarungTemp.TABLE_WARUNG_NAME;
SQLiteDatabase db = this.getWritableDatabase();
Cursor cursor = db.rawQuery(selectQuery, null);
if (cursor.moveToFirst()) {
do {
WarungTemp warungTemp = new WarungTemp();
warungTemp.setWarung_id(cursor.getInt(cursor.getColumnIndex(WarungTemp.COLUMN_WARUNG_ID)));
warungTemp.setWarung_harga(cursor.getString(cursor.getColumnIndex(WarungTemp.COLUMN_WARUNG_HARGA)));
warungTemp.setWarung_jumlah(cursor.getString(cursor.getColumnIndex(WarungTemp.COLUMN_WARUNG_JUMLAH)));
warungTemp.setWarung_nama(cursor.getString(cursor.getColumnIndex(WarungTemp.COLUMN_WARUNG_ITEM)));
warungTemp.setWarung_id_sql(cursor.getInt(cursor.getColumnIndex(WarungTemp.COLUMN_WARUNG_ID_SQL)));
temps.add(warungTemp);
} while (cursor.moveToNext());
}
db.close();
return temps;
}
public int getKafeTempsCount() {
String countQuery = "SELECT * FROM " + KafeTemp.TABLE_NAME;
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.rawQuery(countQuery, null);
int count = cursor.getCount();
cursor.close();
return count;
}
public int getTiketTempsCount() {
String countQuery = "SELECT * FROM " + TiketMasukTemp.TABLE_NAME_TiketMasuk;
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.rawQuery(countQuery, null);
int count = cursor.getCount();
cursor.close();
return count;
}
public int getKolamRenangTempsCount() {
String countQuery = "SELECT * FROM " + KolamRenangTemp.TABLE_RENANG_NAME;
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.rawQuery(countQuery, null);
int count = cursor.getCount();
cursor.close();
return count;
}
public int getKolamIkanTempsCount() {
String countQuery = "SELECT * FROM " + KolamIkanTemp.TABLE_KOLAMIKAN_NAME;
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.rawQuery(countQuery, null);
int count = cursor.getCount();
cursor.close();
return count;
}
public int getTokoTempsCount() {
String countQuery = "SELECT * FROM " + TokoTemp.TABLE_TOKO_NAME;
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.rawQuery(countQuery, null);
int count = cursor.getCount();
cursor.close();
return count;
}
public int getWarungTempsCount() {
String countQuery = "SELECT * FROM " + WarungTemp.TABLE_WARUNG_NAME;
SQLiteDatabase db = this.getReadableDatabase();
Cursor cursor = db.rawQuery(countQuery, null);
int count = cursor.getCount();
cursor.close();
return count;
}
public void deleteAllKafe(){
SQLiteDatabase db = this.getWritableDatabase();
// delete() already removes every row; a second "DELETE FROM" is redundant.
db.delete(KafeTemp.TABLE_NAME, null, null);
db.execSQL("VACUUM");
db.close();
}
public void deleteKafeTemps(KafeTemp temp) {
SQLiteDatabase db = this.getWritableDatabase();
db.delete(KafeTemp.TABLE_NAME, KafeTemp.COLUMN_KAFE_ID + " = ?",
new String[]{String.valueOf(temp.getKafe_id())});
db.close();
}
public void deleteTokoTemps(TokoTemp temp) {
SQLiteDatabase db = this.getWritableDatabase();
db.delete(TokoTemp.TABLE_TOKO_NAME, TokoTemp.COLUMN_TOKO_ID + " = ?",
new String[]{String.valueOf(temp.getToko_id())});
db.close();
}
}
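/*
 * Usage sketch (illustrative, not part of the original source): DBHelper acts
 * as a local cart cache, one table per sales area. The item name and amounts
 * below are made up for the example.
 *
 *   DBHelper helper = new DBHelper(context);
 *   long rowId = helper.insertKafe("Kopi", "2", "15000", 7); // name, qty, price, server id
 *   List<KafeTemp> cart = helper.getAllKafeTemps();          // read the cart back
 *   helper.deleteAllKafe();                                  // clear the cart after checkout
 */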
<file_sep>package com.digitalone.kasiranto.fragment;
import android.content.DialogInterface;
import android.os.Bundle;
import android.support.design.widget.FloatingActionButton;
import android.support.v4.app.Fragment;
import android.support.v4.widget.SwipeRefreshLayout;
import android.support.v7.app.AlertDialog;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.EditText;
import android.widget.Toast;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.adapter.AdapterToko;
import com.digitalone.kasiranto.model.AdminMessage;
import com.digitalone.kasiranto.model.Toko;
import com.digitalone.kasiranto.model.TokoItem;
import com.digitalone.kasiranto.service.RestAPIHelper;
import java.util.ArrayList;
import java.util.List;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
/**
* A simple {@link Fragment} subclass.
*/
public class FragmentAdminToko extends Fragment implements View.OnClickListener, SwipeRefreshLayout.OnRefreshListener{
public static final String TITLE = "Toko";
private View view;
private List<TokoItem> items;
private AdapterToko adapterToko;
private RecyclerView recyclerView;
private FloatingActionButton fab;
private AlertDialog.Builder dialog;
private EditText edtNama,edtHarga,edtStok;
private String nama, harga, stok;
private LayoutInflater inflater;
private View dialogView;
private SwipeRefreshLayout refreshLayout;
public static FragmentAdminToko newInstance(){
return new FragmentAdminToko();
}
public FragmentAdminToko() {
// Required empty public constructor
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
view = inflater.inflate(R.layout.fragment_admin_toko, container, false);
initView();
return view;
}
private void initView() {
refreshLayout = view.findViewById(R.id.refreshtoko);
refreshLayout.setOnRefreshListener(this);
refreshLayout.post(new Runnable() {
@Override
public void run() {
refreshLayout.setRefreshing(true);
getTokos();
}
});
fab = view.findViewById(R.id.fabtoko);
recyclerView = view.findViewById(R.id.recycleadmintoko);
fab.setOnClickListener(this);
items = new ArrayList<>();
// Attach an adapter immediately so the RecyclerView is never bound to null;
// getTokos() swaps in a fresh adapter once data arrives.
adapterToko = new AdapterToko(items, getContext());
RecyclerView.LayoutManager layoutManager = new LinearLayoutManager(getActivity());
recyclerView.setLayoutManager(layoutManager);
recyclerView.setAdapter(adapterToko);
}
private void dialogForm(){
dialog = new AlertDialog.Builder(getActivity());
inflater = getLayoutInflater();
dialogView = inflater.inflate(R.layout.form_toko, null);
edtHarga = dialogView.findViewById(R.id.toko_form_harga);
edtNama = dialogView.findViewById(R.id.toko_form_nama);
edtStok = dialogView.findViewById(R.id.toko_form_stok);
dialog.setView(dialogView);
dialog.setCancelable(true);
dialog.setTitle("Form Item Toko");
dialog.setPositiveButton("Simpan", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
nama = edtNama.getText().toString();
harga = edtHarga.getText().toString();
stok = edtStok.getText().toString();
insertToko(nama,harga,stok);
}
});
dialog.setNegativeButton("Batal", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
dialog.dismiss();
}
});
dialog.show();
}
private void insertToko(String nama, String harga, String stok){
retrofit2.Call<AdminMessage> call = RestAPIHelper.ServiceApi(getActivity().getApplication()).insertToko(nama,harga,stok);
call.enqueue(new Callback<AdminMessage>() {
@Override
public void onResponse(Call<AdminMessage> call, Response<AdminMessage> response) {
if (response.body() != null){
String msg = response.body().getMsg();
// The server's message is shown whether it reports success or an error.
Toast.makeText(getContext(), msg, Toast.LENGTH_SHORT).show();
onRefresh();
}
}
@Override
public void onFailure(Call<AdminMessage> call, Throwable t) {
Toast.makeText(getContext(), "Cek koneksi internet anda", Toast.LENGTH_SHORT).show();
}
});
}
private void getTokos(){
retrofit2.Call<Toko> call = RestAPIHelper.ServiceApi(getActivity().getApplication()).getToko();
call.enqueue(new Callback<Toko>() {
@Override
public void onResponse(Call<Toko> call, Response<Toko> response) {
if (response.body() != null){
items = response.body().getToko();
adapterToko = new AdapterToko(items, getContext());
recyclerView.setAdapter(adapterToko);
}else {
Toast.makeText(getContext(), "Error response", Toast.LENGTH_SHORT).show();
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<Toko> call, Throwable t) {
refreshLayout.setRefreshing(false);
Toast.makeText(getContext(), "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onClick(View v) {
switch (v.getId()){
case R.id.fabtoko:
dialogForm();
break;
}
}
@Override
public void onRefresh() {
items.clear();
adapterToko.notifyDataSetChanged();
getTokos();
}
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.model.TransaksiItem;
import java.util.List;
public class AdapterTransaksi extends RecyclerView.Adapter<AdapterTransaksi.HolderTransaksi> {
private List<TransaksiItem> items;
private Context context;
public AdapterTransaksi(List<TransaksiItem> items, Context context) {
this.items = items;
this.context = context;
}
@NonNull
@Override
public AdapterTransaksi.HolderTransaksi onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
View view = LayoutInflater.from(parent.getContext())
.inflate(R.layout.list_filter, parent,false);
return new HolderTransaksi(view);
}
@Override
public void onBindViewHolder(@NonNull AdapterTransaksi.HolderTransaksi holder, int position) {
TransaksiItem item = items.get(position);
holder.txtNama.setText(item.getTNama());
holder.txtJumlah.setText(item.getTJumlah());
holder.txtTotal.setText(item.getTTotal());
holder.txtTanggal.setText(item.getTDatetime());
}
@Override
public int getItemCount() {
return items.size();
}
public class HolderTransaksi extends RecyclerView.ViewHolder {
private TextView txtNama, txtHarga, txtTanggal,txtJumlah, txtTotal;
public HolderTransaksi(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.filter_nama);
txtHarga = itemView.findViewById(R.id.filter_harga);
txtTanggal = itemView.findViewById(R.id.filter_waktu);
txtJumlah = itemView.findViewById(R.id.filter_jumlah);
txtTotal = itemView.findViewById(R.id.filter_total);
}
}
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.model.TokoTransaksiItem;
import java.util.List;
public class AdapterTransaksiToko extends RecyclerView.Adapter<AdapterTransaksiToko.HolderTransToko> {
private List<TokoTransaksiItem> items;
private Context context;
public AdapterTransaksiToko(List<TokoTransaksiItem> items, Context context) {
this.items = items;
this.context = context;
}
@NonNull
@Override
public AdapterTransaksiToko.HolderTransToko onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
View view = LayoutInflater.from(parent.getContext())
.inflate(R.layout.list_transaksi_toko, parent,false);
return new HolderTransToko(view);
}
@Override
public void onBindViewHolder(@NonNull AdapterTransaksiToko.HolderTransToko holder, int position) {
TokoTransaksiItem item = items.get(position);
holder.txtNama.setText(item.getTokoNama());
holder.txtHarga.setText(String.valueOf(item.getTokoHarga()));
holder.txtTanggal.setText(item.getTtDatetime());
holder.txtJumlah.setText(item.getTtJumlah());
holder.txtTotal.setText(item.getTtTotal());
}
@Override
public int getItemCount() {
return items.size();
}
public class HolderTransToko extends RecyclerView.ViewHolder {
private TextView txtNama, txtHarga, txtTanggal,txtJumlah, txtTotal;
public HolderTransToko(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.toko_transaksi_nama);
txtHarga = itemView.findViewById(R.id.toko_transaksi_harga);
txtTanggal = itemView.findViewById(R.id.toko_transaksi_waktu);
txtJumlah = itemView.findViewById(R.id.toko_transaksi_jumlah);
txtTotal = itemView.findViewById(R.id.toko_transaksi_total);
}
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
import java.util.List;
public class Transaksi {
@Expose
@SerializedName("error")
private boolean error;
@Expose
@SerializedName("transaksi")
private List<TransaksiItem> transaksi;
public boolean getError(){
return error;
}
public void setError(boolean input){
this.error = input;
}
public List<TransaksiItem> getTransaksi(){
return transaksi;
}
public void setTransaksi(List<TransaksiItem> input){
this.transaksi = input;
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
import java.util.List;
public class TiketMasuk {
@SerializedName("error")
@Expose
private boolean error;
    @SerializedName("tiketmasuk")
    @Expose
    private List<TiketMasukItem> tiketMasuk;
    public boolean getError(){
        return error;
    }
    public void setError(boolean input){
        this.error = input;
    }
    public List<TiketMasukItem> getTiketMasuk(){
        return tiketMasuk;
    }
    public void setTiketMasuk(List<TiketMasukItem> input){
        this.tiketMasuk = input;
    }
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.activity.ActivityWarungCheckout;
import com.digitalone.kasiranto.fragment.FragmentWarung;
import com.digitalone.kasiranto.helper.DBHelper;
import com.digitalone.kasiranto.model.WarungTemp;
import java.util.List;
public class AdapterWarungTemp extends RecyclerView.Adapter<AdapterWarungTemp.HolderWarungTemp> {
private List<WarungTemp> temps;
private Context context = null;
private DBHelper helper;
    public AdapterWarungTemp(List<WarungTemp> temps, Context context) {
        this.temps = temps;
        this.context = context;
        this.helper = new DBHelper(context); // create the helper once, not on every ViewHolder creation
    }
    @NonNull
    @Override
    public HolderWarungTemp onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
        View view = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.list_warung_bayar, parent, false);
        return new HolderWarungTemp(view);
    }
@Override
public void onBindViewHolder(@NonNull HolderWarungTemp holder, int position) {
final WarungTemp temp = temps.get(position);
holder.txtId.setText(String.valueOf(temp.getWarung_id_sql()));
holder.txtId.setVisibility(View.GONE);
holder.txtNama.setText(temp.getWarung_nama());
holder.txtJumlah.setText(temp.getWarung_jumlah());
holder.txtHarga.setText(temp.getWarung_harga());
holder.hapus.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
FragmentWarung.totaldetail = FragmentWarung.totaldetail - Integer.parseInt(temp.getWarung_harga());
helper.deleteitemWarung(temp);
helper.getAllWarungTemps();
((ActivityWarungCheckout)context).refresh();
}
});
}
@Override
public int getItemCount() {
return temps.size();
}
public class HolderWarungTemp extends RecyclerView.ViewHolder {
private TextView txtNama, txtJumlah, txtHarga, txtId,hapus;
public HolderWarungTemp(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.temp_warung_item);
txtJumlah = itemView.findViewById(R.id.temp_warung_jumlah);
txtHarga = itemView.findViewById(R.id.temp_warung_harga);
txtId = itemView.findViewById(R.id.temp_warung_id);
hapus = itemView.findViewById(R.id.hapus);
}
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
public class WarungDateItem {
@SerializedName("warung_nama")
@Expose
private String warung_nama;
@SerializedName("warung_harga")
@Expose
private int warung_harga;
@SerializedName("wt_jumlah")
@Expose
private String wtJumlah;
@SerializedName("wt_total")
@Expose
private String wtTotal;
@SerializedName("wt_datetime")
@Expose
private String wtDatetime;
public String getWarung_nama() {
return warung_nama;
}
public void setWarung_nama(String warung_nama) {
this.warung_nama = warung_nama;
}
public int getWarung_harga() {
return warung_harga;
}
public void setWarung_harga(int warung_harga) {
this.warung_harga = warung_harga;
}
public String getWtJumlah() {
return wtJumlah;
}
public void setWtJumlah(String wtJumlah) {
this.wtJumlah = wtJumlah;
}
public String getWtTotal() {
return wtTotal;
}
public void setWtTotal(String wtTotal) {
this.wtTotal = wtTotal;
}
public String getWtDatetime() {
return wtDatetime;
}
public void setWtDatetime(String wtDatetime) {
this.wtDatetime = wtDatetime;
}
}
<file_sep>package com.digitalone.kasiranto.model;
public class WarungTemp {
public static final String TABLE_WARUNG_NAME = "warung_temps";
public static final String COLUMN_WARUNG_ID = "warung_id";
public static final String COLUMN_WARUNG_ITEM = "warung_nama";
public static final String COLUMN_WARUNG_JUMLAH = "warung_jumlah";
public static final String COLUMN_WARUNG_HARGA = "warung_harga";
public static final String COLUMN_WARUNG_ID_SQL = "warung_id_sql";
private int warung_id;
private String warung_nama;
private String warung_jumlah;
private String warung_harga;
private int warung_id_sql;
public static final String CREATE_TABLE_WARUNG =
"CREATE TABLE " + TABLE_WARUNG_NAME + "("
+ COLUMN_WARUNG_ID + " INTEGER PRIMARY KEY AUTOINCREMENT,"
+ COLUMN_WARUNG_ITEM + " TEXT,"
+ COLUMN_WARUNG_JUMLAH + " TEXT,"
+ COLUMN_WARUNG_HARGA + " TEXT,"
+ COLUMN_WARUNG_ID_SQL + " INTEGER"
+ ")";
public WarungTemp(int warung_id, String warung_nama, String warung_jumlah, String warung_harga, int warung_id_sql) {
this.warung_id = warung_id;
this.warung_nama = warung_nama;
this.warung_jumlah = warung_jumlah;
this.warung_harga = warung_harga;
this.warung_id_sql = warung_id_sql;
}
public WarungTemp() {
}
public int getWarung_id() {
return warung_id;
}
public void setWarung_id(int warung_id) {
this.warung_id = warung_id;
}
public String getWarung_nama() {
return warung_nama;
}
public void setWarung_nama(String warung_nama) {
this.warung_nama = warung_nama;
}
public String getWarung_jumlah() {
return warung_jumlah;
}
public void setWarung_jumlah(String warung_jumlah) {
this.warung_jumlah = warung_jumlah;
}
public String getWarung_harga() {
return warung_harga;
}
public void setWarung_harga(String warung_harga) {
this.warung_harga = warung_harga;
}
public int getWarung_id_sql() {
return warung_id_sql;
}
public void setWarung_id_sql(int warung_id_sql) {
this.warung_id_sql = warung_id_sql;
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
import java.util.List;
public class TiketMasukDate {
@SerializedName("error")
@Expose
private boolean error;
@SerializedName("tiketmasuk_date")
@Expose
private List<TiketMasukDateItem> tiketmasukDate;
public boolean getError(){
return error;
}
public void setError(boolean input){
this.error = input;
}
public List<TiketMasukDateItem> getTiketmasukDate(){
return tiketmasukDate;
}
public void setTiketmasukDate(List<TiketMasukDateItem> input){
this.tiketmasukDate = input;
}
}
<file_sep>package com.digitalone.kasiranto.model;
public class KafeTemp {
public static final String TABLE_NAME = "kafe_temps";
public static final String TABLE_NAME_TiketMasuk = "tiketmasuk_temps";
public static final String COLUMN_KAFE_ID = "kafe_id";
public static final String COLUMN_KAFE_ITEM = "kafe_nama";
public static final String COLUMN_KAFE_JUMLAH = "kafe_jumlah";
public static final String COLUMN_KAFE_HARGA = "kafe_harga";
public static final String COLUMN_KAFE_ID_SQL = "kafe_id_sql";
private int kafe_id;
private String kafe_nama;
private String kafe_jumlah;
private String kafe_harga;
private int kafe_id_sql;
public static final String CREATE_TABLE =
"CREATE TABLE " + TABLE_NAME + "("
+ COLUMN_KAFE_ID + " INTEGER PRIMARY KEY AUTOINCREMENT,"
+ COLUMN_KAFE_ITEM + " TEXT,"
+ COLUMN_KAFE_JUMLAH + " TEXT,"
+ COLUMN_KAFE_HARGA + " TEXT,"
+ COLUMN_KAFE_ID_SQL + " INTEGER"
+ ")";
public KafeTemp(int kafe_id, String kafe_nama, String kafe_jumlah, String kafe_harga, int kafe_id_sql) {
this.kafe_id = kafe_id;
this.kafe_nama = kafe_nama;
this.kafe_jumlah = kafe_jumlah;
this.kafe_harga = kafe_harga;
this.kafe_id_sql = kafe_id_sql;
}
public KafeTemp() {
}
public int getKafe_id() {
return kafe_id;
}
public void setKafe_id(int kafe_id) {
this.kafe_id = kafe_id;
}
public String getKafe_nama() {
return kafe_nama;
}
public void setKafe_nama(String kafe_nama) {
this.kafe_nama = kafe_nama;
}
public String getKafe_jumlah() {
return kafe_jumlah;
}
public void setKafe_jumlah(String kafe_jumlah) {
this.kafe_jumlah = kafe_jumlah;
}
public String getKafe_harga() {
return kafe_harga;
}
public void setKafe_harga(String kafe_harga) {
this.kafe_harga = kafe_harga;
}
public int getKafe_id_sql() {
return kafe_id_sql;
}
public void setKafe_id_sql(int kafe_id_sql) {
this.kafe_id_sql = kafe_id_sql;
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
import java.util.List;
public class TokoDate {
@SerializedName("error")
@Expose
private boolean error;
@SerializedName("toko")
@Expose
private List<TokoDateItem> toko;
public boolean getError(){
return error;
}
public void setError(boolean input){
this.error = input;
}
public List<TokoDateItem> getDateToko(){
return toko;
}
public void setDateToko(List<TokoDateItem> input){
this.toko = input;
}
}
<file_sep>package com.digitalone.kasiranto.fragment;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.support.v4.widget.SwipeRefreshLayout;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import android.widget.Toast;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.adapter.AdapterTransaksiWarung;
import com.digitalone.kasiranto.model.WarungTransaksi;
import com.digitalone.kasiranto.model.WarungTransaksiItem;
import com.digitalone.kasiranto.service.RestAPIHelper;
import java.util.ArrayList;
import java.util.List;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
/**
* A simple {@link Fragment} subclass.
*/
public class FragmentTransaksiWarung extends Fragment implements SwipeRefreshLayout.OnRefreshListener{
public static final String TITLE = "Transaksi Warung";
private SwipeRefreshLayout refreshLayout;
private List<WarungTransaksiItem> items;
private AdapterTransaksiWarung adapterTransaksiWarung;
private RecyclerView recyclerView;
private View view;
private TextView msgKosong;
public static FragmentTransaksiWarung newInstance(){
return new FragmentTransaksiWarung();
}
public FragmentTransaksiWarung() {
// Required empty public constructor
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
view = inflater.inflate(R.layout.fragment_transaksi_warung, container, false);
initView();
        if (items.isEmpty()){ // items is initialized to an empty list in initView(), so it is never null here
msgKosong.setVisibility(View.VISIBLE);
}else {
msgKosong.setVisibility(View.GONE);
getTransaksiWarung();
}
return view;
}
private void initView() {
msgKosong = view.findViewById(R.id.msg_transaksi);
refreshLayout = view.findViewById(R.id.refreshwarungtr);
refreshLayout.setOnRefreshListener(this);
refreshLayout.post(new Runnable() {
@Override
public void run() {
refreshLayout.setRefreshing(true);
getTransaksiWarung();
}
});
recyclerView = view.findViewById(R.id.recycletransaksiwarung);
items = new ArrayList<>();
RecyclerView.LayoutManager layoutManager = new LinearLayoutManager(getContext());
recyclerView.setLayoutManager(layoutManager);
recyclerView.setAdapter(adapterTransaksiWarung);
}
private void getTransaksiWarung() {
        Call<WarungTransaksi> call = RestAPIHelper.ServiceApi(getActivity().getApplication()).getWarungTr();
call.enqueue(new Callback<WarungTransaksi>() {
@Override
public void onResponse(Call<WarungTransaksi> call, Response<WarungTransaksi> response) {
if (response.body() != null){
items = response.body().getWarungTransaksi();
adapterTransaksiWarung = new AdapterTransaksiWarung(items, getContext());
recyclerView.setAdapter(adapterTransaksiWarung);
refreshLayout.setRefreshing(false);
}
}
@Override
public void onFailure(Call<WarungTransaksi> call, Throwable t) {
Toast.makeText(getContext(), "OnFailure", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onRefresh() {
        items.clear();
        if (adapterTransaksiWarung != null) { // adapter is only created once a response has arrived
            adapterTransaksiWarung.notifyDataSetChanged();
        }
getTransaksiWarung();
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
public class KolamRenangItem {
@SerializedName("id")
@Expose
private int id;
@SerializedName("nama_item_kolamrenang")
@Expose
private String namaItemKafe;
@SerializedName("harga_item_kolamrenang")
@Expose
private int hargaItemKafe;
@SerializedName("stok_item_kolamrenang")
@Expose
private int stokItemKafe;
public int getId(){
return id;
}
public void setId(int input){
this.id = input;
}
public String getNamaItemKafe(){
return namaItemKafe;
}
public void setNamaItemKafe(String input){
this.namaItemKafe = input;
}
public int getHargaItemKafe(){
return hargaItemKafe;
}
public void setHargaItemKafe(int input){
this.hargaItemKafe = input;
}
public int getStokItemKafe(){
return stokItemKafe;
}
public void setStokItemKafe(int input){
this.stokItemKafe = input;
}
}
<file_sep>package com.digitalone.kasiranto.service;
import android.app.Application;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import okhttp3.Cache;
import okhttp3.OkHttpClient;
import retrofit2.Retrofit;
import retrofit2.converter.gson.GsonConverterFactory;
public class RestAPIHelper {
public static APIService ServiceApi(Application application){
Cache okHttpCache = new Cache(application.getCacheDir(), 10 * 1024 *1024);
OkHttpClient.Builder client = new OkHttpClient.Builder();
client.cache(okHttpCache);
Gson gson = new GsonBuilder()
.setLenient().create();
Retrofit retrofit = new Retrofit.Builder()
.baseUrl(APIService.BASE_URL)
.client(client.build())
.addConverterFactory(GsonConverterFactory.create(gson))
.build();
return retrofit.create(APIService.class);
}
}
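/*
 * Usage sketch for this helper, matching how the fragments in this project
 * call it (the getWarungTr() endpoint is taken from FragmentTransaksiWarung;
 * any other endpoint name would be an assumption):
 *
 *   Call<WarungTransaksi> call = RestAPIHelper.ServiceApi(getApplication()).getWarungTr();
 *   call.enqueue(new Callback<WarungTransaksi>() {
 *       @Override
 *       public void onResponse(Call<WarungTransaksi> call, Response<WarungTransaksi> response) {
 *           // response.body() may be null on non-2xx responses; check before use
 *       }
 *       @Override
 *       public void onFailure(Call<WarungTransaksi> call, Throwable t) { }
 *   });
 *
 * Note that ServiceApi builds a fresh Retrofit instance per call; a single
 * shared instance would avoid repeated setup cost.
 */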
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
import java.util.List;
public class KolamIkanDate {
@SerializedName("error")
@Expose
private boolean error;
@SerializedName("kolamikan")
@Expose
private List<KolamIkanDateItem> kafe;
public boolean getError(){
return error;
}
public void setError(boolean input){
this.error = input;
}
public List<KolamIkanDateItem> getDateKafe(){
return kafe;
}
public void setDateKafe(List<KolamIkanDateItem> input){
this.kafe = input;
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
import java.util.List;
public class KolamIkan {
@SerializedName("error")
@Expose
private boolean error;
    @SerializedName("KolamIkan")
    @Expose
    private List<KolamIkanItem> kolamIkan;
    public boolean getError(){
        return error;
    }
    public void setError(boolean input){
        this.error = input;
    }
    public List<KolamIkanItem> getKafe(){
        return kolamIkan;
    }
    public void setKafe(List<KolamIkanItem> input){
        this.kolamIkan = input;
    }
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
public class WarungTransaksiItem {
@SerializedName("warung_nama")
@Expose
private String warungNama;
@SerializedName("warung_harga")
@Expose
private int warungHarga;
@SerializedName("wt_jumlah")
@Expose
private String wtJumlah;
@SerializedName("wt_total")
@Expose
private String wtTotal;
@SerializedName("wt_datetime")
@Expose
private String wtDatetime;
public String getWarungNama(){
return warungNama;
}
public void setWarungNama(String input){
this.warungNama = input;
}
public int getWarungHarga(){
return warungHarga;
}
public void setWarungHarga(int input){
this.warungHarga = input;
}
public String getWtJumlah(){
return wtJumlah;
}
public void setWtJumlah(String input){
this.wtJumlah = input;
}
public String getWtTotal(){
return wtTotal;
}
public void setWtTotal(String input){
this.wtTotal = input;
}
public String getWtDatetime(){
return wtDatetime;
}
public void setWtDatetime(String input){
this.wtDatetime = input;
}
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.model.KafeDateItem;
import java.util.List;
public class AdapterKafeDate extends RecyclerView.Adapter<AdapterKafeDate.HolderKafeDate> {
private List<KafeDateItem> items;
private Context context;
public AdapterKafeDate(List<KafeDateItem> items, Context context) {
this.items = items;
this.context = context;
}
@NonNull
@Override
public AdapterKafeDate.HolderKafeDate onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
View view = LayoutInflater.from(parent.getContext())
.inflate(R.layout.list_filter, parent, false);
return new HolderKafeDate(view);
}
@Override
public void onBindViewHolder(@NonNull AdapterKafeDate.HolderKafeDate holder, int position) {
KafeDateItem item = items.get(position);
holder.txtNama.setText(item.getNamaItemKafe());
holder.txtHarga.setText(String.valueOf(item.getHargaItemKafe()));
holder.txtJumlah.setText(item.getKtJumlah());
holder.txtTanggal.setText(item.getKtDatetime());
holder.txtTotal.setText(item.getKtTotal());
}
@Override
public int getItemCount() {
return items.size();
}
public class HolderKafeDate extends RecyclerView.ViewHolder {
private TextView txtNama, txtHarga, txtTanggal,txtJumlah, txtTotal;
public HolderKafeDate(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.filter_nama);
txtHarga = itemView.findViewById(R.id.filter_harga);
txtTanggal = itemView.findViewById(R.id.filter_waktu);
txtJumlah = itemView.findViewById(R.id.filter_jumlah);
txtTotal = itemView.findViewById(R.id.filter_total);
}
}
}
<file_sep>package com.digitalone.kasiranto.fragment;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.support.v4.widget.SwipeRefreshLayout;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import android.widget.Toast;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.adapter.AdapterTransaksiToko;
import com.digitalone.kasiranto.model.TokoTransaksi;
import com.digitalone.kasiranto.model.TokoTransaksiItem;
import com.digitalone.kasiranto.service.RestAPIHelper;
import java.util.ArrayList;
import java.util.List;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
/**
* A simple {@link Fragment} subclass.
*/
public class FragmentTransaksiToko extends Fragment implements SwipeRefreshLayout.OnRefreshListener{
public static final String TITLE = "Transaksi Toko";
private SwipeRefreshLayout refreshLayout;
private List<TokoTransaksiItem> items;
private AdapterTransaksiToko adapterTransaksiToko;
private RecyclerView recyclerView;
private View view;
private TextView msgKosong;
public static FragmentTransaksiToko newInstance(){
return new FragmentTransaksiToko();
}
public FragmentTransaksiToko() {
// Required empty public constructor
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
view = inflater.inflate(R.layout.fragment_transaksi_toko, container, false);
initView();
        if (items.isEmpty()){ // items is initialized to an empty list in initView(), so it is never null here
msgKosong.setVisibility(View.VISIBLE);
}else {
msgKosong.setVisibility(View.GONE);
getTransaksiToko();
}
return view;
}
private void initView() {
msgKosong = view.findViewById(R.id.msg_transaksi);
refreshLayout = view.findViewById(R.id.refrestokotr);
refreshLayout.setOnRefreshListener(this);
refreshLayout.post(new Runnable() {
@Override
public void run() {
refreshLayout.setRefreshing(true);
getTransaksiToko();
}
});
recyclerView = view.findViewById(R.id.recycletransaksitoko);
items = new ArrayList<>();
RecyclerView.LayoutManager layoutManager = new LinearLayoutManager(getContext());
recyclerView.setLayoutManager(layoutManager);
recyclerView.setAdapter(adapterTransaksiToko);
}
private void getTransaksiToko() {
        Call<TokoTransaksi> call = RestAPIHelper.ServiceApi(getActivity().getApplication()).getTokoTr();
call.enqueue(new Callback<TokoTransaksi>() {
@Override
public void onResponse(Call<TokoTransaksi> call, Response<TokoTransaksi> response) {
if (response.body() != null){
items = response.body().getTokoTransaksi();
adapterTransaksiToko = new AdapterTransaksiToko(items, getContext());
recyclerView.setAdapter(adapterTransaksiToko);
refreshLayout.setRefreshing(false);
}
}
@Override
public void onFailure(Call<TokoTransaksi> call, Throwable t) {
Toast.makeText(getContext(), "OnFailure", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onRefresh() {
        items.clear();
        if (adapterTransaksiToko != null) { // adapter is only created once a response has arrived
            adapterTransaksiToko.notifyDataSetChanged();
        }
getTransaksiToko();
}
}
<file_sep>package com.digitalone.kasiranto.activity;
import android.app.DatePickerDialog;
import android.content.Intent;
import android.support.v4.widget.SwipeRefreshLayout;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.support.v7.widget.Toolbar;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.DatePicker;
import android.widget.LinearLayout;
import android.widget.TextView;
import android.widget.Toast;
import com.android.volley.Request;
import com.android.volley.RequestQueue;
import com.android.volley.VolleyError;
import com.android.volley.toolbox.StringRequest;
import com.android.volley.toolbox.Volley;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.adapter.AdapterKafeDate;
import com.digitalone.kasiranto.adapter.AdapterKolamIkanDate;
import com.digitalone.kasiranto.adapter.AdapterKolamRenangDate;
import com.digitalone.kasiranto.adapter.AdapterTiketMasukDate;
import com.digitalone.kasiranto.adapter.AdapterTokoDate;
import com.digitalone.kasiranto.adapter.AdapterTransaksi;
import com.digitalone.kasiranto.adapter.AdapterWarungDate;
import com.digitalone.kasiranto.model.KafeDate;
import com.digitalone.kasiranto.model.KafeDateItem;
import com.digitalone.kasiranto.model.KolamIkan;
import com.digitalone.kasiranto.model.KolamIkanDate;
import com.digitalone.kasiranto.model.KolamIkanDateItem;
import com.digitalone.kasiranto.model.KolamRenang;
import com.digitalone.kasiranto.model.KolamRenangDate;
import com.digitalone.kasiranto.model.KolamrenangDateItem;
import com.digitalone.kasiranto.model.TiketMasukDate;
import com.digitalone.kasiranto.model.TiketMasukDateItem;
import com.digitalone.kasiranto.model.TokoDate;
import com.digitalone.kasiranto.model.TokoDateItem;
import com.digitalone.kasiranto.model.Transaksi;
import com.digitalone.kasiranto.model.TransaksiItem;
import com.digitalone.kasiranto.model.WarungDate;
import com.digitalone.kasiranto.model.WarungDateItem;
import com.digitalone.kasiranto.service.RestAPIHelper;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.HashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
public class ActivityFilter extends AppCompatActivity implements View.OnClickListener{
private TextView txtDari, txtKe, totalFilter;
private Button btnCariKafe, btnCariToko, btnCariWarung,btncarikolamikan, btnkolamrenang,btntiketmasuk, btnCariSemua;
private LinearLayout btnDari, btnKe;
private DatePickerDialog dateFrom, dateTo;
private SimpleDateFormat dateFormatter, dateToFormatter;
private SwipeRefreshLayout refreshLayout;
private List<TransaksiItem> allitems;
private List<KafeDateItem> items;
private List<KolamrenangDateItem> itemskolamrenang;
private List<TokoDateItem> tokoDateItems;
private List<KolamIkanDateItem> kolamIkanDateItems;
private List<WarungDateItem> warungDateItems;
private List<TiketMasukDateItem> tiketMasukDateItems;
private AdapterTransaksi adapterTransaksi;
private AdapterTiketMasukDate adapterTiketMasukDate;
private AdapterKafeDate adapterKafeDate;
private AdapterTokoDate adapterTokoDate;
private AdapterKolamRenangDate adapterKolamDate;
private AdapterWarungDate adapterWarungDate;
private AdapterKolamIkanDate adapterKolamIkanDate;
private RecyclerView recyclerView;
private Toolbar toolBar;
private String dari, ke;
private int total, totaltoko, totalwarung,totalkolamikan,totalkolamrenang, totaltiketmasuk, totalall;
private String totalfilter, totalfiltertoko, totalfilterwarung, totalfilterkolamikan,totalfilterkolamrenang, totalfiltertiketmasuk, totalfilterall;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_filter);
initView();
}
private void initView(){
        totalall = 0;
        total = 0;
        totaltoko = 0;
        totalkolamikan = 0;
        totalkolamrenang = 0;
        totaltiketmasuk = 0;
        totalwarung = 0;
toolBar = findViewById(R.id.toolbarfilter);
setSupportActionBar(toolBar);
getSupportActionBar().setTitle("Filter Transaksi");
getSupportActionBar().setDisplayHomeAsUpEnabled(true);
btnCariKafe = findViewById(R.id.btn_cari_kafe);
btnCariToko = findViewById(R.id.btn_cari_toko);
btnCariWarung = findViewById(R.id.btn_cari_warung);
btnCariSemua = findViewById(R.id.btn_cari_semua);
txtDari = findViewById(R.id.fromkafe);
txtKe = findViewById(R.id.tokafe);
totalFilter = findViewById(R.id.totalfilter);
btnDari = findViewById(R.id.btn_from_kafe);
btnKe = findViewById(R.id.btn_to_kafe);
refreshLayout = findViewById(R.id.refreshfilter);
btncarikolamikan = findViewById(R.id.btn_cari_kolamikan);
btntiketmasuk = findViewById(R.id.btn_cari_tiketMasuk);
recyclerView = findViewById(R.id.recyclefilter);
btnkolamrenang = findViewById(R.id.btn_cari_KolamRenang);
items = new ArrayList<>();
tiketMasukDateItems = new ArrayList<>();
allitems = new ArrayList<>();
RecyclerView.LayoutManager layoutManager = new LinearLayoutManager(this);
recyclerView.setLayoutManager(layoutManager);
recyclerView.setAdapter(adapterKafeDate);
btnDari.setOnClickListener(this);
btnCariToko.setOnClickListener(this);
btnCariWarung.setOnClickListener(this);
btnCariSemua.setOnClickListener(this);
btnKe.setOnClickListener(this);
btnCariKafe.setOnClickListener(this);
btncarikolamikan.setOnClickListener(this);
btnkolamrenang.setOnClickListener(this);
btntiketmasuk.setOnClickListener(this);
dateFormatter = new SimpleDateFormat("yyyy-MM-dd", Locale.getDefault());
dateToFormatter = new SimpleDateFormat("yyyy-MM-dd", Locale.getDefault());
}
private void showDateFrom(){
Calendar newCalendar = Calendar.getInstance();
dateFrom = new DatePickerDialog(this, new DatePickerDialog.OnDateSetListener() {
@Override
public void onDateSet(DatePicker view, int year, int monthOfYear, int dayOfMonth) {
Calendar newDate = Calendar.getInstance();
newDate.set(year, monthOfYear, dayOfMonth);
txtDari.setText(""+dateFormatter.format(newDate.getTime()));
dari = txtDari.getText().toString();
}
},newCalendar.get(Calendar.YEAR), newCalendar.get(Calendar.MONTH), newCalendar.get(Calendar.DAY_OF_MONTH));
dateFrom.show();
}
private void showDateTo(){
Calendar newCalendar = Calendar.getInstance();
dateTo= new DatePickerDialog(this, new DatePickerDialog.OnDateSetListener() {
@Override
public void onDateSet(DatePicker view, int year, int monthOfYear, int dayOfMonth) {
Calendar newDate = Calendar.getInstance();
newDate.set(year, monthOfYear, dayOfMonth);
txtKe.setText(""+dateToFormatter.format(newDate.getTime()));
ke = txtKe.getText().toString();
}
},newCalendar.get(Calendar.YEAR), newCalendar.get(Calendar.MONTH), newCalendar.get(Calendar.DAY_OF_MONTH));
dateTo.show();
}
private void getFilterKafe(String from, String to){
        Call<KafeDate> call = RestAPIHelper.ServiceApi(getApplication()).filterKafeDate(from, to);
call.enqueue(new Callback<KafeDate>() {
@Override
public void onResponse(Call<KafeDate> call, Response<KafeDate> response) {
if (response.body() != null){
boolean error = response.body().getError();
if (!error) {
                        items = response.body().getDateKafe();
                        for (int i = 0; i < items.size(); i++){
                            totalfilter = items.get(i).getKtTotal();
                            total = total + Integer.parseInt(totalfilter);
                        }
                        totalFilter.setText(String.valueOf(total));
                        adapterKafeDate = new AdapterKafeDate(items, getApplicationContext());
                        recyclerView.setAdapter(adapterKafeDate);
                        Log.d("response", String.valueOf(items));
}
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<KafeDate> call, Throwable t) {
Log.e(ActivityFilter.class.getSimpleName(),t.getMessage());
Toast.makeText(ActivityFilter.this, "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
private void getFilterKolamIkan(String from, String to){
        Call<KolamIkanDate> call = RestAPIHelper.ServiceApi(getApplication()).filterKolamDate(from, to);
call.enqueue(new Callback<KolamIkanDate>() {
@Override
public void onResponse(Call<KolamIkanDate> call, Response<KolamIkanDate> response) {
if (response.body() != null){
boolean error = response.body().getError();
if (!error) {
                        kolamIkanDateItems = response.body().getDateKafe();
                        for (int i = 0; i < kolamIkanDateItems.size(); i++){
                            totalfilterkolamikan = kolamIkanDateItems.get(i).getKtTotal();
                            totalkolamikan = totalkolamikan + Integer.parseInt(totalfilterkolamikan);
                        }
                        totalFilter.setText(String.valueOf(totalkolamikan));
                        adapterKolamIkanDate = new AdapterKolamIkanDate(kolamIkanDateItems, getApplicationContext());
                        recyclerView.setAdapter(adapterKolamIkanDate);
                        Log.d("response", String.valueOf(kolamIkanDateItems));
}
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<KolamIkanDate> call, Throwable t) {
Log.e(ActivityFilter.class.getSimpleName(),t.getMessage());
Toast.makeText(ActivityFilter.this, "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
private void getFilterKolamrenang(String from, String to){
        Call<KolamRenangDate> call = RestAPIHelper.ServiceApi(getApplication()).filterKolamRenangDate(from, to);
call.enqueue(new Callback<KolamRenangDate>() {
@Override
public void onResponse(Call<KolamRenangDate> call, Response<KolamRenangDate> response) {
if (response.body() != null){
boolean error = response.body().getError();
if (!error) {
itemskolamrenang = response.body().getDateKafe();
                            for (int i = 0; i < itemskolamrenang.size(); i++) {
                                totalfilterkolamrenang = itemskolamrenang.get(i).getKtTotal();
                                totalkolamrenang = totalkolamrenang + Integer.parseInt(totalfilterkolamrenang);
                            }
totalFilter.setText(String.valueOf(totalkolamrenang));
adapterKolamDate = new AdapterKolamRenangDate(itemskolamrenang, getApplicationContext());
recyclerView.setAdapter(adapterKolamDate);
}
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<KolamRenangDate> call, Throwable t) {
Log.e(ActivityFilter.class.getSimpleName(),t.getMessage());
Toast.makeText(ActivityFilter.this, "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
private void getFilterTiketMasuk(String from, String to){
        Call<TiketMasukDate> call = RestAPIHelper.ServiceApi(getApplication()).filterTiketMasukDate(from, to);
call.enqueue(new Callback<TiketMasukDate>() {
@Override
public void onResponse(Call<TiketMasukDate> call, Response<TiketMasukDate> response) {
if (response.body() != null){
boolean error = response.body().getError();
if (!error) {
tiketMasukDateItems = response.body().getTiketmasukDate();
                            for (int i = 0; i < tiketMasukDateItems.size(); i++) {
                                totalfiltertiketmasuk = tiketMasukDateItems.get(i).getTmtTotal();
                                totaltiketmasuk = totaltiketmasuk + Integer.parseInt(totalfiltertiketmasuk);
                            }
totalFilter.setText(String.valueOf(totaltiketmasuk));
adapterTiketMasukDate = new AdapterTiketMasukDate(tiketMasukDateItems, getApplicationContext());
recyclerView.setAdapter(adapterTiketMasukDate);
}
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<TiketMasukDate> call, Throwable t) {
Log.e(ActivityFilter.class.getSimpleName(),t.getMessage());
refreshLayout.setRefreshing(false);
Toast.makeText(ActivityFilter.this, "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
private void getFilterToko(String from, String to){
retrofit2.Call<TokoDate> call = RestAPIHelper.ServiceApi(getApplication()).filterTokoDate(from, to);
call.enqueue(new Callback<TokoDate>() {
@Override
public void onResponse(Call<TokoDate> call, Response<TokoDate> response) {
if (response.body() != null){
boolean error = response.body().getError();
if (!error) {
tokoDateItems = response.body().getDateToko();
for (int i=0; i<tokoDateItems.size(); i++){
//totali.add(Integer.valueOf(items.get(i).getKtTotal()));
totalfiltertoko = tokoDateItems.get(i).getTtTotal();
totaltoko = totaltoko + Integer.parseInt(totalfiltertoko);
//Toast.makeText(ActivityFilter.this, ""+total, Toast.LENGTH_SHORT).show();
}
totalFilter.setText(String.valueOf(totaltoko));
adapterTokoDate = new AdapterTokoDate(tokoDateItems, getApplicationContext());
recyclerView.setAdapter(adapterTokoDate);
Log.d("response", String.valueOf(tokoDateItems));
}
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<TokoDate> call, Throwable t) {
Log.e(ActivityFilter.class.getSimpleName(),t.getMessage());
refreshLayout.setRefreshing(false);
Toast.makeText(ActivityFilter.this, "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
private void getFilterWarung(String from, String to){
retrofit2.Call<WarungDate> call = RestAPIHelper.ServiceApi(getApplication()).filterWarungDate(from, to);
call.enqueue(new Callback<WarungDate>() {
@Override
public void onResponse(Call<WarungDate> call, Response<WarungDate> response) {
if (response.body() != null){
boolean error = response.body().getError();
if (!error) {
warungDateItems = response.body().getDateWarung();
for (int i=0; i<warungDateItems.size(); i++){
//totali.add(Integer.valueOf(items.get(i).getKtTotal()));
totalfilterwarung = warungDateItems.get(i).getWtTotal();
totalwarung = totalwarung + Integer.parseInt(totalfilterwarung);
//Toast.makeText(ActivityFilter.this, ""+total, Toast.LENGTH_SHORT).show();
}
totalFilter.setText(String.valueOf(totalwarung));
adapterWarungDate = new AdapterWarungDate(warungDateItems, getApplicationContext());
recyclerView.setAdapter(adapterWarungDate);
Log.d("response", String.valueOf(warungDateItems));
}
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<WarungDate> call, Throwable t) {
Log.e(ActivityFilter.class.getSimpleName(),t.getMessage());
refreshLayout.setRefreshing(false);
Toast.makeText(ActivityFilter.this, "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
private void getAll(String from, String to){
retrofit2.Call<Transaksi> call = RestAPIHelper.ServiceApi(getApplication()).getAllTransaksi(from, to);
call.enqueue(new Callback<Transaksi>() {
@Override
public void onResponse(Call<Transaksi> call, Response<Transaksi> response) {
if (response.body() != null){
boolean error = response.body().getError();
if (!error){
allitems = response.body().getTransaksi();
for (int i=0; i<allitems.size(); i++){
totalfilterall = allitems.get(i).getTTotal();
totalall = totalall + Integer.parseInt(totalfilterall);
}
totalFilter.setText(String.valueOf(totalall));
adapterTransaksi = new AdapterTransaksi(allitems, getApplicationContext());
recyclerView.setAdapter(adapterTransaksi);
}
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<Transaksi> call, Throwable t) {
Log.e(ActivityFilter.class.getSimpleName(),t.getMessage());
refreshLayout.setRefreshing(false);
Toast.makeText(ActivityFilter.this, "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onClick(View v) {
switch (v.getId()){
case R.id.btn_from_kafe:
showDateFrom();
break;
case R.id.btn_to_kafe:
showDateTo();
break;
case R.id.btn_cari_semua:
refreshLayout.setRefreshing(true);
getAll(dari, ke);
totalall = 0;
totalFilter.setText(String.valueOf(totalall));
break;
case R.id.btn_cari_kafe:
refreshLayout.setRefreshing(true);
getFilterKafe(dari, ke);
total = 0;
totalFilter.setText(String.valueOf(total));
break;
case R.id.btn_cari_toko:
refreshLayout.setRefreshing(true);
getFilterToko(dari, ke);
totaltoko = 0;
totalFilter.setText(String.valueOf(totaltoko));
break;
case R.id.btn_cari_warung:
refreshLayout.setRefreshing(true);
getFilterWarung(dari, ke);
totalwarung = 0;
totalFilter.setText(String.valueOf(totalwarung));
break;
case R.id.btn_cari_kolamikan:
refreshLayout.setRefreshing(true);
getFilterKolamIkan(dari, ke);
totalkolamikan = 0 ;
totalFilter.setText(String.valueOf(totalkolamikan));
break;
case R.id.btn_cari_KolamRenang:
refreshLayout.setRefreshing(true);
getFilterKolamrenang(dari, ke);
totalkolamrenang = 0;
totalFilter.setText(String.valueOf(totalkolamrenang));
break;
case R.id.btn_cari_tiketMasuk:
refreshLayout.setRefreshing(true);
getFilterTiketMasuk(dari, ke);
totaltiketmasuk = 0;
totalFilter.setText(String.valueOf(totaltiketmasuk));
break;
}
}
@Override
public boolean onSupportNavigateUp() {
finish();
return super.onSupportNavigateUp();
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
public class TiketMasukDateItem {
@SerializedName("nama_item_tiketmasuk")
@Expose
private String namaItemTiketmasuk;
@SerializedName("harga_item_tiketmasuk")
@Expose
private int hargaItemTiketmasuk;
@SerializedName("tmt_jumlah")
@Expose
private String tmtJumlah;
@SerializedName("tmt_total")
@Expose
private String tmtTotal;
@SerializedName("tmt_datetime")
@Expose
private String tmtDatetime;
public String getNamaItemTiketmasuk(){
return namaItemTiketmasuk;
}
public void setNamaItemTiketmasuk(String input){
this.namaItemTiketmasuk = input;
}
public int getHargaItemTiketmasuk(){
return hargaItemTiketmasuk;
}
public void setHargaItemTiketmasuk(int input){
this.hargaItemTiketmasuk = input;
}
public String getTmtJumlah(){
return tmtJumlah;
}
public void setTmtJumlah(String input){
this.tmtJumlah = input;
}
public String getTmtTotal(){
return tmtTotal;
}
public void setTmtTotal(String input){
this.tmtTotal = input;
}
public String getTmtDatetime(){
return tmtDatetime;
}
public void setTmtDatetime(String input){
this.tmtDatetime = input;
}
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.model.TokoDateItem;
import java.util.List;
public class AdapterTokoDate extends RecyclerView.Adapter<AdapterTokoDate.HolderTokoDate> {
private List<TokoDateItem> items;
private Context context;
public AdapterTokoDate(List<TokoDateItem> items, Context context) {
this.items = items;
this.context = context;
}
@NonNull
@Override
public AdapterTokoDate.HolderTokoDate onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
View view = LayoutInflater.from(parent.getContext())
.inflate(R.layout.list_filter, parent, false);
return new AdapterTokoDate.HolderTokoDate(view);
}
@Override
public void onBindViewHolder(@NonNull AdapterTokoDate.HolderTokoDate holder, int position) {
TokoDateItem item = items.get(position);
holder.txtNama.setText(item.getNamaToko());
holder.txtHarga.setText(String.valueOf(item.getTokoHarga()));
holder.txtJumlah.setText(item.getTtJumlah());
holder.txtTanggal.setText(item.getTtDatetime());
holder.txtTotal.setText(item.getTtTotal());
}
@Override
public int getItemCount() {
return items.size();
}
public class HolderTokoDate extends RecyclerView.ViewHolder {
private TextView txtNama, txtHarga, txtTanggal,txtJumlah, txtTotal;
public HolderTokoDate(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.filter_nama);
txtHarga = itemView.findViewById(R.id.filter_harga);
txtTanggal = itemView.findViewById(R.id.filter_waktu);
txtJumlah = itemView.findViewById(R.id.filter_jumlah);
txtTotal = itemView.findViewById(R.id.filter_total);
}
}
}
<file_sep>package com.digitalone.kasiranto.fragment;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.support.v4.widget.SwipeRefreshLayout;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import android.widget.Toast;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.adapter.AdapterTransaksiKafe;
import com.digitalone.kasiranto.model.KafeTransaksi;
import com.digitalone.kasiranto.model.KafeTransaksiItem;
import com.digitalone.kasiranto.service.RestAPIHelper;
import java.util.ArrayList;
import java.util.List;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
/**
* A simple {@link Fragment} subclass.
*/
public class FragmentTransaksiKafe extends Fragment implements SwipeRefreshLayout.OnRefreshListener{
public static final String TITLE = "Transaksi Kafe";
private SwipeRefreshLayout refreshLayout;
private List<KafeTransaksiItem> items;
private AdapterTransaksiKafe adapterTransaksiKafe;
private RecyclerView recyclerView;
private View view;
private TextView msgKosong;
public static FragmentTransaksiKafe newInstance(){
return new FragmentTransaksiKafe();
}
public FragmentTransaksiKafe() {
// Required empty public constructor
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
view = inflater.inflate(R.layout.fragment_transaksi_kafe, container, false);
initView();
if (items == null){
msgKosong.setVisibility(View.VISIBLE);
}else {
msgKosong.setVisibility(View.GONE);
getTransaksiKafe();
}
return view;
}
private void initView() {
msgKosong = view.findViewById(R.id.msg_transaksi);
refreshLayout = view.findViewById(R.id.refreshkafetr);
refreshLayout.setOnRefreshListener(this);
refreshLayout.post(new Runnable() {
@Override
public void run() {
refreshLayout.setRefreshing(true);
getTransaksiKafe();
}
});
recyclerView = view.findViewById(R.id.recycletransaksikafe);
items = new ArrayList<>();
RecyclerView.LayoutManager layoutManager = new LinearLayoutManager(getContext());
recyclerView.setLayoutManager(layoutManager);
recyclerView.setAdapter(adapterTransaksiKafe);
}
private void getTransaksiKafe() {
retrofit2.Call<KafeTransaksi> call = RestAPIHelper.ServiceApi(getActivity().getApplication()).getKafeTr();
call.enqueue(new Callback<KafeTransaksi>() {
@Override
public void onResponse(Call<KafeTransaksi> call, Response<KafeTransaksi> response) {
if (response.body() != null){
items = response.body().getKafeTransaksi();
adapterTransaksiKafe = new AdapterTransaksiKafe(items, getContext());
recyclerView.setAdapter(adapterTransaksiKafe);
refreshLayout.setRefreshing(false);
}
}
@Override
public void onFailure(Call<KafeTransaksi> call, Throwable t) {
refreshLayout.setRefreshing(false);
Toast.makeText(getContext(), "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onRefresh() {
items.clear();
adapterTransaksiKafe.notifyDataSetChanged();
getTransaksiKafe();
}
}
<file_sep>package com.digitalone.kasiranto.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.activity.ActivityKolamIkanCheckout;
import com.digitalone.kasiranto.activity.ActivityTiketMasukCheckout;
import com.digitalone.kasiranto.fragment.FragmentKolamIkan;
import com.digitalone.kasiranto.fragment.FragmentTiketMasuk;
import com.digitalone.kasiranto.helper.DBHelper;
import com.digitalone.kasiranto.model.KafeTemp;
import com.digitalone.kasiranto.model.TiketMasuk;
import com.digitalone.kasiranto.model.TiketMasukTemp;
import java.util.List;
public class AdapterTiketMasukTemp extends RecyclerView.Adapter<AdapterTiketMasukTemp.ListTemp> {
private List<TiketMasukTemp> items;
private Context context= null;
private DBHelper dbHelper;
public AdapterTiketMasukTemp(List<TiketMasukTemp> items, Context context) {
this.items = items;
this.context = context;
}
@NonNull
@Override
public AdapterTiketMasukTemp.ListTemp onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
dbHelper = new DBHelper(context);
View view = LayoutInflater.from(parent.getContext())
.inflate(R.layout.list_kafe_bayar, parent, false);
return new ListTemp(view);
}
@Override
public void onBindViewHolder(@NonNull AdapterTiketMasukTemp.ListTemp holder, int position) {
final TiketMasukTemp kafeItem = items.get(position);
holder.txtId.setText(String.valueOf(kafeItem.getTiketMasuk_id()));
holder.txtId.setVisibility(View.GONE);
holder.txtNama.setText(kafeItem.getTiketMasuk_nama());
holder.txtJumlah.setText(kafeItem.getTiketMasuk_jumlah());
holder.txtHarga.setText(kafeItem.getTiketMasuk_harga());
holder.hapuss.setOnClickListener(new View.OnClickListener(){
@Override
public void onClick(View v) {
FragmentTiketMasuk.totaldetail = FragmentTiketMasuk.totaldetail - Integer.parseInt(kafeItem.getTiketMasuk_harga());
dbHelper.deleteitemTiketmasuk(kafeItem);
dbHelper.getAllTiketmasukTemps();
((ActivityTiketMasukCheckout)context).refresh();
}
});
}
@Override
public int getItemCount() {
return items.size();
}
class ListTemp extends RecyclerView.ViewHolder{
private TextView txtNama, txtJumlah, txtHarga, txtId, hapuss;
public ListTemp(View itemView) {
super(itemView);
txtNama = itemView.findViewById(R.id.temp_kafe_item);
txtJumlah = itemView.findViewById(R.id.temp_kafe_jumlah);
txtHarga = itemView.findViewById(R.id.temp_kafe_harga);
txtId = itemView.findViewById(R.id.temp_kafe_id);
hapuss = itemView.findViewById(R.id.hapus);
}
}
}
<file_sep>package com.digitalone.kasiranto.model;
import com.google.gson.annotations.Expose;
import com.google.gson.annotations.SerializedName;
public class TokoDateItem {
@SerializedName("toko_nama")
@Expose
private String namaToko;
@SerializedName("toko_harga")
@Expose
private int tokoHarga;
@SerializedName("tt_jumlah")
@Expose
private String ttJumlah;
@SerializedName("tt_total")
@Expose
private String ttTotal;
@SerializedName("tt_datetime")
@Expose
private String ttDatetime;
public String getNamaToko() {
return namaToko;
}
public void setNamaToko(String namaToko) {
this.namaToko = namaToko;
}
public int getTokoHarga() {
return tokoHarga;
}
public void setTokoHarga(int tokoHarga) {
this.tokoHarga = tokoHarga;
}
public String getTtJumlah() {
return ttJumlah;
}
public void setTtJumlah(String ttJumlah) {
this.ttJumlah = ttJumlah;
}
public String getTtTotal() {
return ttTotal;
}
public void setTtTotal(String ttTotal) {
this.ttTotal = ttTotal;
}
public String getTtDatetime() {
return ttDatetime;
}
public void setTtDatetime(String ttDatetime) {
this.ttDatetime = ttDatetime;
}
}
<file_sep>package com.digitalone.kasiranto.fragment;
import android.content.DialogInterface;
import android.os.Bundle;
import android.support.design.widget.FloatingActionButton;
import android.support.v4.app.Fragment;
import android.support.v4.widget.SwipeRefreshLayout;
import android.support.v7.app.AlertDialog;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.EditText;
import android.widget.Toast;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.adapter.AdapterKafe;
import com.digitalone.kasiranto.model.Kafe;
import com.digitalone.kasiranto.model.KafeItem;
import com.digitalone.kasiranto.model.AdminMessage;
import com.digitalone.kasiranto.service.RestAPIHelper;
import java.util.ArrayList;
import java.util.List;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
/**
* A simple {@link Fragment} subclass.
*/
public class FragmentAdminKafe extends Fragment implements View.OnClickListener, SwipeRefreshLayout.OnRefreshListener {
public static final String TITLE = "Kafe";
private View view;
private List<KafeItem> items;
private AdapterKafe kafe;
private RecyclerView recyclerView;
private FloatingActionButton fab;
private AlertDialog.Builder dialog;
private EditText edtNama,edtHarga,edtStok;
private String nama, harga, stok;
private LayoutInflater inflater;
private View dialogView;
private SwipeRefreshLayout refreshLayout;
public static FragmentAdminKafe newInstance(){
return new FragmentAdminKafe();
}
public FragmentAdminKafe() {
// Required empty public constructor
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
view = inflater.inflate(R.layout.fragment_admin_kafe, container, false);
initView();
return view;
}
private void initView() {
refreshLayout = view.findViewById(R.id.refreshkafe);
refreshLayout.setOnRefreshListener(this);
refreshLayout.post(new Runnable() {
@Override
public void run() {
refreshLayout.setRefreshing(true);
getKafes();
}
});
fab = view.findViewById(R.id.fab);
recyclerView = view.findViewById(R.id.recycleadminkafe);
fab.setOnClickListener(this);
items = new ArrayList<>();
RecyclerView.LayoutManager layoutManager = new LinearLayoutManager(getActivity());
recyclerView.setLayoutManager(layoutManager);
recyclerView.setAdapter(kafe);
}
private void dialogForm(){
dialog = new AlertDialog.Builder(getActivity());
inflater = getLayoutInflater();
dialogView = inflater.inflate(R.layout.form_kafe, null);
edtHarga = dialogView.findViewById(R.id.kafe_form_harga);
edtNama = dialogView.findViewById(R.id.kafe_form_nama);
edtStok = dialogView.findViewById(R.id.kafe_form_stok);
dialog.setView(dialogView);
dialog.setCancelable(true);
dialog.setTitle("Form Item Kafe");
dialog.setPositiveButton("Simpan", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
nama = edtNama.getText().toString();
harga = edtHarga.getText().toString();
stok = edtStok.getText().toString();
insertKafe(nama,harga,stok);
}
});
dialog.setNegativeButton("Batal", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
dialog.dismiss();
}
});
dialog.show();
}
private void insertKafe(String nama, String harga, String stok){
retrofit2.Call<AdminMessage> call = RestAPIHelper.ServiceApi(getActivity().getApplication()).insertKafe(nama,harga,stok);
call.enqueue(new Callback<AdminMessage>() {
@Override
public void onResponse(Call<AdminMessage> call, Response<AdminMessage> response) {
if (response.body() != null){
boolean error = response.body().getError();
String msg = response.body().getMsg();
if (!error){
Toast.makeText(getContext(), msg, Toast.LENGTH_SHORT).show();
}else {
Toast.makeText(getContext(), msg, Toast.LENGTH_SHORT).show();
}
onRefresh();
}
}
@Override
public void onFailure(Call<AdminMessage> call, Throwable t) {
Toast.makeText(getContext(), "Cek koneksi internet anda", Toast.LENGTH_SHORT).show();
}
});
}
private void getKafes(){
retrofit2.Call<Kafe> call = RestAPIHelper.ServiceApi(getActivity().getApplication()).getKafe();
call.enqueue(new Callback<Kafe>() {
@Override
public void onResponse(Call<Kafe> call, Response<Kafe> response) {
if (response.body() != null){
items = response.body().getKafe();
kafe = new AdapterKafe(items, getContext());
recyclerView.setAdapter(kafe);
}else {
Toast.makeText(getContext(), "Error response", Toast.LENGTH_SHORT).show();
}
refreshLayout.setRefreshing(false);
}
@Override
public void onFailure(Call<Kafe> call, Throwable t) {
refreshLayout.setRefreshing(false);
Toast.makeText(getContext(), "onFailure", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onClick(View v) {
switch (v.getId()){
case R.id.fab:
dialogForm();
break;
}
}
@Override
public void onRefresh() {
items.clear();
kafe.notifyDataSetChanged();
getKafes();
}
}
<file_sep>package com.digitalone.kasiranto.activity;
import android.app.ProgressDialog;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.support.v7.widget.Toolbar;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.EditText;
import android.widget.Spinner;
import android.widget.Toast;
import com.digitalone.kasiranto.R;
import com.digitalone.kasiranto.model.AdminMessage;
import com.digitalone.kasiranto.service.RestAPIHelper;
import java.util.ArrayList;
import java.util.List;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
public class ActivityUbahPassword extends AppCompatActivity implements AdapterView.OnItemSelectedListener, View.OnClickListener {
private Toolbar toolBar;
private Spinner spinner;
private EditText edtPass, edtRePass;
private Button btnUbah;
private String level;
private int idPass;
private ProgressDialog pDialog;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_ubah_password);
initView();
}
private void initView(){
idPass = 0;
toolBar = findViewById(R.id.toolbar_ubah_password);
setSupportActionBar(toolBar);
getSupportActionBar().setTitle("Ubah Password");
getSupportActionBar().setDisplayHomeAsUpEnabled(true);
spinner = findViewById(R.id.spinner_ubah_password);
edtPass = findViewById(R.id.password_ubah);
edtRePass = findViewById(R.id.repassword);
btnUbah = findViewById(R.id.btn_ubah_password);
btnUbah.setOnClickListener(this);
spinner.setOnItemSelectedListener(this);
pDialog = new ProgressDialog(this);
pDialog.setCancelable(false);
List<String> levelitems = new ArrayList<>();
levelitems.add("Select User");
levelitems.add("Superuser");
levelitems.add("Warung");
levelitems.add("Toko");
levelitems.add("KafeRenang");
levelitems.add("KolamIkan");
ArrayAdapter<String> levelAdapter = new ArrayAdapter<String>(this,android.R.layout.simple_spinner_item,levelitems);
levelAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
spinner.setAdapter(levelAdapter);
}
private void ubahPassword(String password, String id){
pDialog.setMessage("Mengubah password ...");
showDialog();
retrofit2.Call<AdminMessage> call = RestAPIHelper.ServiceApi(getApplication()).updatePassword(password, id);
call.enqueue(new Callback<AdminMessage>() {
@Override
public void onResponse(Call<AdminMessage> call, Response<AdminMessage> response) {
if (response.body() != null){
boolean error = response.body().getError();
String msg = response.body().getMsg();
if (!error){
Toast.makeText(ActivityUbahPassword.this, msg, Toast.LENGTH_SHORT).show();
}else {
Toast.makeText(ActivityUbahPassword.this, msg, Toast.LENGTH_SHORT).show();
}
hideDialog();
}
}
@Override
public void onFailure(Call<AdminMessage> call, Throwable t) {
Toast.makeText(ActivityUbahPassword.this, "Cek koneksi internet anda", Toast.LENGTH_SHORT).show();
hideDialog();
}
});
}
@Override
public boolean onSupportNavigateUp() {
finish();
return super.onSupportNavigateUp();
}
@Override
public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
level = parent.getItemAtPosition(position).toString();
if (level.equals("Select User")){
return;
}else if (level.equals("Superuser")){
idPass = 1;
}else if (level.equals("Warung")){
idPass = 2;
}else if (level.equals("Toko")){
idPass = 3;
}else if (level.equals("KafeRenang")){
idPass = 4;
}else if (level.equals("KolamIkan")){
idPass = 5;
}
}
@Override
public void onNothingSelected(AdapterView<?> parent) {
Toast.makeText(this, "Please select your level status", Toast.LENGTH_SHORT).show();
}
@Override
public void onClick(View v) {
switch (v.getId()){
case R.id.btn_ubah_password:
if (!edtPass.getText().toString().equals(edtRePass.getText().toString())){
Toast.makeText(ActivityUbahPassword.this, "Password tidak sesuai", Toast.LENGTH_SHORT).show();
}else {
ubahPassword(edtPass.getText().toString(), String.valueOf(idPass));
}
break;
}
}
private void showDialog() {
if (!pDialog.isShowing())
pDialog.show();
}
private void hideDialog() {
if (pDialog.isShowing())
pDialog.dismiss();
}
}
| 70f917f3d9190d759630c0d8b89a3ada244f0ea8 | [
"Java"
]
| 35 | Java | imdvlpr/kasiranto | 24e560049a1d4c2fc4245058ed761540fff78098 | da364cf03082232be79d1b22c9565c8c87207eec |
refs/heads/master | <file_sep>{###############################################################################
# test-timing.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 8/14/2012
#
# DESCRIPTION
# ===========
# unit tests for the timing functionality.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
library(testthat)
library(harvestr)
foreach::registerDoSEQ()
context("Timing")
make_has_attr <- function(name, class='any'){
function(x){
a <- attributes(x)
if (!(name %in% names(a)))
structure(FALSE, msg = sprintf("Does not have '%s' attribute", name)) else
if (class != 'any' && ! inherits(a[[name]], class))
structure( FALSE
, msg = sprintf("has '%s' attribute but it is not of class '%s'."
, name, class)) else
TRUE
}
}
has_time <- make_has_attr('time', 'proc_time')
with_option <- function (code, ...) {
old <- options(...)
on.exit(options(old))
force(code)
}
test_that("withseed times results", {
seed <- gather(1)[[1]]
expect_true(has_time(withseed(seed, runif(1), time=TRUE)))
expect_true(has_time(with_option(withseed(seed, runif(1)), harvestr.time=TRUE)))
})
test_that('farm times results', {
seeds <- gather(3)
expect_true(has_time(farm(seeds, runif(10), time=TRUE, .progress='none', .parallel=FALSE)))
expect_true(has_time(with_option(farm(seeds, runif(10), .progress='none', .parallel=FALSE), harvestr.time=TRUE)))
})
test_that('reap times results', {
x <- farm(1, runif(100), .progress='none', .parallel=FALSE)[[1]]
expect_true(has_time(reap(x, mean, time=TRUE)))
expect_true(has_time(with_option(reap(x, mean), harvestr.time=TRUE)))
})
test_that('harvest times results', {
x <- farm(3, runif(100), .progress='none', .parallel=FALSE)
expect_true(has_time(harvest(x, mean, time=TRUE, .progress='none', .parallel=FALSE)))
expect_true(has_time(with_option(harvest(x, mean, .progress='none', .parallel=FALSE), harvestr.time=TRUE)))
})
<file_sep>{###############################################################################
# test-defaults.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# unit tests for the default-settings helper functions.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
library(harvestr)
library(testthat)
context("defaults")
test_that("is_top_harvestr_call", {
expect_false(is_top_harvestr_call())
expect_true (test_is_top_harvestr_call())
expect_false(test_is_top_harvestr_call(1))
})
test_that("parallel", {
expect_error( dflt_harvestr_parallel()
, "dflt_harvestr_parallel should not be called directly."
)
foreach::registerDoSEQ()
expect_equal( test_dflt_harvestr_parallel(), TRUE
, info="Top call should be equal to getDoParRegistered")
expect_equal(test_dflt_harvestr_parallel(1), 0
, info="not the top call.")
expect_equal(test_dflt_harvestr_parallel(2, nest=1), 1
, info="nested with more parallel than nesting")
expect_equal(test_dflt_harvestr_parallel(1, nest=2), 0
, info="nested with more nesting than parallel")
})
test_that("progress", {
expect_equal(test_dflt_harvestr_progress(TRUE , 'windows', is.top.call=TRUE), 'win')
expect_equal(test_dflt_harvestr_progress(TRUE , 'unix' , is.top.call=TRUE), 'time')
expect_equal(test_dflt_harvestr_progress(FALSE, 'windows', is.top.call=TRUE), 'none')
expect_equal(test_dflt_harvestr_progress(FALSE, 'unix' , is.top.call=TRUE), 'none')
})
test_that("others", {
expect_false(dflt_harvestr_time())
expect_false(dflt_harvestr_cache())
expect_equal(dflt_harvestr_cache_dir(), "harvestr-cache")
expect_equal(dflt_harvestr_use_try(), !interactive())
})
<file_sep>{###############################################################################
# test-caching.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# test the caching facilities.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
library(harvestr)
library(testthat)
library(boot)
foreach::registerDoSEQ()
context("Caching")
cache.dir <- normalizePath(file.path(tempdir(), "harvestr-cache"), mustWork=FALSE)
oo <- options( harvestr.cache.dir=cache.dir
)
reg.finalizer(emptyenv(), function(...){unlink(cache.dir, TRUE)}, onexit=TRUE)
long_function <- function(){
# should take about a second
rnorm(5e6)
paste("Your lucky lotto numbers are:"
, paste(sample(56, 5), collapse=" ")
, '|'
, sample(46, 1), sep=' ')
}
takes_at_least <-function (amount) {
function(expr) {
duration <- system.time(force(expr))["elapsed"]
expectation(duration > amount,
sprintf("took %s seconds, which is more than %s", duration, amount))
}
}
test_that("caching in with seed with long_function", {
seed <- gather(1)[[1]]
unlink(cache.dir, recursive=TRUE, force=TRUE)
t1 <- system.time(run1 <- withseed(seed, long_function(), cache=TRUE))
t2 <- system.time(run2 <- withseed(seed, long_function(), cache=TRUE))
expect_true(all(t2[1] <= t1[1]))
expect_identical(run1, run2)
})
test_that("caching in farm with mean of rnorm", {
seeds <- gather(10)
unlink(cache.dir, recursive=TRUE, force=TRUE)
t1 <- system.time(run1 <- farm(seeds, mean(rnorm(5e6)), cache=TRUE, .progress='none', .parallel=FALSE))
t2 <- system.time(run2 <- farm(seeds, mean(rnorm(5e6)), cache=TRUE, .progress='none', .parallel=FALSE))
expect_true(all(t2[1] <= t1[1]))
expect_identical(run1, run2)
})
test_that("caching in reap with long sample", {
seed <- gather(1)
unlink(cache.dir, recursive=TRUE, force=TRUE)
dir.create(cache.dir, recursive=TRUE)
expect_true(dir.exists(cache.dir))
long_sample <- function(...)head(sample(...))
x <- plant( list(5e7), seed)[[1]]
hash <- digest(list(x, long_sample, source="harvestr::reap"), algo='md5')
t1 <- system.time(run1 <- reap(x, long_sample, hash=hash, cache=TRUE, time=FALSE))
t2 <- system.time(run2 <- reap(x, long_sample, hash=hash, cache=TRUE, time=FALSE))
    expect_true(all(t2['user.self'] <= t1['user.self']), info="Run time is faster in second call.")
expect_identical(run1, run2, info="Runs are identical.")
unlink(cache.dir, recursive=TRUE, force=TRUE)
})
test_that('caching in harvest using boot', {
unlink(cache.dir, recursive=TRUE, force=TRUE)
x <- farm(10, rnorm(1e4), cache=TRUE, .progress='none', .parallel=FALSE)
t1 <- system.time(run1 <- harvest(x, boot, var, 100, cache=TRUE, .progress='none', .parallel=FALSE))
t2 <- system.time(run2 <- harvest(x, boot, var, 100, cache=TRUE, .progress='none', .parallel=FALSE))
expect_equal(run1, run2)
expect_true(noattr(t2[1] <= t1[1]))
})
test_that('caching with timing turned on', {
unlink(cache.dir, recursive=TRUE, force=TRUE)
x <- farm(10, rnorm(1e4), cache=TRUE, .progress='none', .parallel=FALSE)
t1 <- system.time(run1 <- harvest(x, boot, var, 100, cache=TRUE, time=TRUE, .progress='none', .parallel=FALSE))
t2 <- system.time(run2 <- harvest(x, boot, var, 100, cache=TRUE, time=TRUE, .progress='none', .parallel=FALSE))
expect_equal(run1, run2)
expect_true(noattr(t2[1] <= t1[1]))
})
options(oo)
<file_sep>{###############################################################################
# test=gather.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# unit testing for gather and other process flow functions.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
library(testthat)
library(plyr)
foreach::registerDoSEQ()
context("main functions")
options( harvestr.time=FALSE
, "harvestr::Interactive::exclude" = NULL
, "harvestr.use.cache" = NULL
, "harvestr.cache.dir" = NULL
, "harvestr.use.try" = NULL
, "harvestr.progress" = 'none'
, "harvestr.parallel" = FALSE
)
test_that("gather is replicable", {
seeds <- gather(10, seed=1234)
seeds0 <- gather(10, seed=1234)
expect_identical(seeds, seeds0)
})
test_that("gather gives the appropriate kinds of seed.", {
.l <- gather(1)
seed <- .l[[1]]
replace.seed(seed)
expect_identical(head(RNGkind(), 2), c("L'Ecuyer-CMRG", "Inversion"))
x <- plant(list(1), seeds=.l)
expect_equivalent(gather(x), .l)
})
test_that("gather errors", {
expect_error(gather(list(a=1,b=2)))
expect_error(gather('seeds'))
})
test_that("gather replaces seed.", {
set.seed(123, "Mersenne-Twister", "Box-Muller")
l <- get.seed()
seeds <- gather(10)
k <- get.seed()
expect_identical(l,k)
expect_identical(head(RNGkind(), 2), c("Mersenne-Twister", "Box-Muller"))
})
test_that("farm is replicable", {
seeds <- gather(10)
e <- farm(seeds, rnorm(10), cache=FALSE, time=FALSE, .progress='none', .parallel=FALSE)
f <- farm(seeds, rnorm(10), cache=FALSE, time=FALSE, .progress='none', .parallel=FALSE)
expect_identical(e,f)
expect_equivalent(e,f)
})
test_that('farm is indifferent to order.', {
seeds <- gather(10)
o <- sample(seq_along(seeds))
e <- farm(seeds , rnorm(10), .progress='none', .parallel=FALSE)
g <- farm(seeds[o], rnorm(10), .progress='none', .parallel=FALSE)[order(o)]
expect_equivalent(e,g)
})
test_that("reap is reproducible", {
seed <- gather(1)
x <- plant(list(1:10), seed)[[1]]
a <- reap(x, sample)
b <- withseed(seed[[1]], sample(1:10))
expect_identical(a,b)
expect_identical(reap(a, sample), reap(a, sample))
})
test_that("harvest", {
seeds <- gather(10)
e <- farm(seeds, rnorm(10), .progress='none', .parallel=FALSE)
x <- harvest(e, sample, replace=TRUE, .progress='none', .parallel=FALSE)
y <- harvest(e, sample, replace=TRUE, .progress='none', .parallel=FALSE)
expect_equivalent(x,y)
})
test_that("Permutation", {
seeds <- gather(10)
e <- farm(seeds, rnorm(10), .progress='none', .parallel=FALSE)
x <- harvest(e, sample, replace=TRUE, .progress='none', .parallel=FALSE)
o <- sample(seq_along(e))
y <- harvest(e[o], sample, replace=TRUE, .progress='none', .parallel=FALSE)[order(o)]
expect_equivalent(x,y)
})
test_that("using with", {
data <- farm(gather(3), data.frame(x123=runif(100), y456=rnorm(100)), .progress='none', .parallel=FALSE)
m1 <- harvest(data, with, mean(x123), .progress='none', .parallel=FALSE)
m2 <- lapply(data, with, mean(x123))
expect_equivalent(m1, m2)
})
test_that("plow & bale", {
df <- data.frame(rep=1:10, n=100)
rngs <- plow(df[-1], rnorm, .parallel=FALSE)
expect_is(rngs, "harvestr-results")
means <- harvest(rngs, mean, .parallel=FALSE, .progress='none')
expect_is(means, "harvestr-results")
df.means <- bale(means)
expect_is(df.means, "data.frame")
})
test_that("bale errors", {
expect_error(bale(list(1, TRUE)))
expect_error(bale(1:10))
expect_error(bale(list()))
expect_warning(bale(list(a=1, b=2)))
})
test_that("sprout warning", {
set.seed(20170823)
seed <- gather(1)[[1]]
seed2 <- sprout(seed, 10)
expect_warning( sprout(seed2[[1]], 10)
, "RNG seed provided is already a substream seed, independence of streams not guaranteed."
)
})
test_that("reap parameter use.try", {
oo <- options(harvestr.try.silent = TRUE)
seed <- gather(1)
x <- plant(list(rnorm), seed)[[1]]
expect_silent(a <- reap(x, sample, use.try=TRUE))
expect_is(a, "try-error")
expect_error(a <- reap(x, sample, use.try=FALSE))
options(oo)
})
test_that("farm checks function arguments", {
seeds <- gather(10, seed = 20170823)
expect_error(x <- farm(seeds, rnorm))
})
test_that("try_summary", {
oo <- options( warn=2
, harvestr.use.try = TRUE
, harvestr.try.silent = TRUE
, harvestr.try.summary = FALSE
)
seeds <- gather(10, seed = 20170824)
a <- farm(seeds, rnorm(1))
random_failure <- function(a){
if(a < 0) stop("Sorry it failed.")
"everything is good"
}
b <- harvest(a, random_failure, use.try=TRUE)
expect_output( try_summary(b)
, "6 of 10 \\( 60%\\) calls produced errors"
)
options(oo)
})
test_that("plant something besides seeds", {
expect_error( plant(as.list(letters), LETTERS)
, 'inherits\\(seeds, "numeric"\\) is not TRUE'
)
})
test_that("branch", {
x <- farm(gather(3, seed=20170825), rnorm(1))
y <- 1:3
results <- branch(rnorm, x[[1]], y, n=100, time=TRUE)
expect_is(results, "harvestr-results")
})
<file_sep>#' Test if a function was called from others.
#'
#' @param ... functions to pass
#' @param FUNS functions as a list
#'
#' @export
called_from <-
function( ... #< functions to consider
, FUNS=list(...) #< passed as a list
){
stopifnot( missing(FUNS) || length(list(...)) == 0 )
if(any(.i <- sapply(FUNS, is.character))){
FUNS[.i] <- lapply(FUNS[.i], match.fun)
}
stopifnot(all(sapply(FUNS, is.function)))
n <- sys.nframe()
for(i in 0:n){
if(any(sapply(FUNS, identical, sys.function(i))))
return(TRUE)
}
return(FALSE)
}
#' @describeIn called_from Determine if the function is called while knitting a document
#' @export
is_knitting <-
function(){
"Determine if the function is called while knitting a document"
if("knitr" %in% .packages(all.available = TRUE)){
called_from(knitr::knit)
} else FALSE
}
#' Smarter interactive test
#'
#' This is a smarter version of \code{\link{interactive}}
#' that also excludes cases inside knit or in startup
#' \code{\link{.First}}, or others specified in dots.
#' You can also specify functions to exclude in the option
#' \code{harvestr::Interactive::exclude}
#'
#' @param exclude.calls functions to pass to \code{\link{called_from}}
#'
#' @export
Interactive <-
function(exclude.calls=getOption("harvestr::Interactive::exclude")){
Interactive_core(exclude.calls=exclude.calls)
}
Interactive_core <-
function( is.interactive = interactive()
, is.knitting = is_knitting()
, exclude.calls = getOption("harvestr::Interactive::exclude")
){
is.interactive &&
!is.knitting &&
!called_from(.First.sys) &&
if(!is.null(exclude.calls)){
!called_from(FUNS=exclude.calls)
} else TRUE
}
<file_sep>{###############################################################################
# test-attributes.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# test the caching facilities.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
library(harvestr)
library(testthat)
context("attributes")
test_that("noattr", {
a <- structure(1, name="hello")
expect_identical(noattr(a), 1)
l <- list(a=a, b=structure(2, name = "Hola"))
expect_identical(noattr(l), list(1,2))
})
test_that("getAttr", {
a <- structure(1, name="hello")
expect_equal(getAttr(a, 'name'), "hello")
expect_equal(getAttr(a, 'name', "john"), "hello")
expect_null(getAttr(a, 'class'))
expect_equal(getAttr(a, 'class', 'abc'), "abc")
})
<file_sep>{###############################################################################
# utils.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# Helper utilities for working with harvestr functions.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
#' Check if an object or list of objects has seed attributes
#'
#' @param x an object or list to check
#'
#' @export
is_seeded <- function(x){
if(!is.null(attr(x, 'ending.seed'))) {
return(TRUE)
} else {
if(is.list(x)){
return(all(sapply(x, is_seeded)))
}
}
return(FALSE)
}
#' Use a reference class method
#' @param method name of the method to call
#' @param ... additional arguments to pass along
#'
#' @seealso \link{ReferenceClasses}
#' @return a function that calls the designated method
#' @example inst/examples/use_method.R
#' @export
use_method <- function(method, ...){
method <- as.character(substitute(method))
function(x){
fun <- do.call(`$`, list(x, method))
fun(...)
}
}
#' retrieve the total time for a simulation
#'
#' @param x a list from harvest
#' @export
total_time <- function(x){
times <- sapply(x, attr, 'time')
structure(apply(times, 1, sum), class='proc_time')
}
swap_args <- function(fun){
stopifnot(length( f <- formals(fun))>1)
f2 <- fun
formals(f2) <-
f[c(2:1, seq_along(f)[-2:-1])]
f2
}
only1 <- function(.list){
all(.list == .list[[1]])
}
is_unity <- function(...)only1(list(...))
is_homo <- function(.list){
classes <- lapply(.list, class)
only1(classes)
}
is_homogeneous <- function(...)is_homo(list(...))
<file_sep>{###############################################################################
# harvestr-packages.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# package documentation
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
#' \code{harvestr} package
#' @name harvestr
#' @aliases package-harvestr harvestr
#' @docType package
#' @title A Simple Reproducible Parallel Simulation Framework
#' @author <NAME> <<EMAIL>>
#'
#' The harvestr package is a framework for parallel reproducible simulations.
#'
#' The functions to know about are:
#' \enumerate{
#' \item \code{\link{gather}} - which gathers parallel seeds.
#' \item \code{\link{farm}} - which uses the saved seeds from gather to replicate an expression,
#' once for each seed.
#' \item \code{\link{harvest}} - which uses objects from farm, that have saved seed attributes,
#' to continue evaluation from where farm finished.
#' \item \code{\link{reap}} - is used by harvest for a single item
#' \item \code{\link{plant}} - is used to set seeds for a list of predefined objects so that harvest
#' can be used on it.
#' \item \code{\link{sprout}} - Generate independent sub-streams.
#' \item \code{\link{graft}} - Replicate an object in independent substreams
#' of random numbers.
#' }
#'
#' @section Caching:
#' The functions in \code{harvestr} can cache results for faster and
#' interruptible simulations. This option defaults to \code{FALSE} but can be
#' chosen by specifying the \code{cache} parameter in any of the functions
#' that produce results.
#'
#' The caching is performed by saving a RData file in a specified caching
#' directory. The default directory is named "harvestr-cache" and resides
#' under the \link[base:getwd]{working directory}. This can be specified by setting
#' the \code{harvestr.cache.dir} \code{\link{option}}. Files in this directory
#' use file names derived from hashes of the expression to evaluate. Do not
#' modify the file names.
#'
#' @section Options:
#' The following options control behavior and default values for harvestr.
#' \enumerate{
#' \item \code{harvestr.use.cache=FALSE} - Should results be cached for fault
#'       tolerance and accelerated reproducibility?
#' \item \code{harvestr.cache.dir="harvestr-cache"} - The directory to use for
#' storing cached results.
#' \item \code{harvestr.time=FALSE} - Should results be timed?
#' \item \code{harvestr.use.try=!\link{interactive}()} - Should the vectorized calls use try
#'       to increase fault tolerance?
#' \item \code{harvestr.try.silent=FALSE} - Should try be run silently?
#' \item \code{harvestr.try.summary=TRUE} - Print a result if errors were found?
#' \item \code{harvestr.parallel} - Run results in parallel? Default is to run in parallel if a parallel back end is registered and the call is the uppermost harvestr call.
#' \item \code{harvestr.progress} - Use a progress bar? Default is to show a bar in interactive mode for top level call, but the type is platform dependent.
#' }
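#'
#' For example, a simulation script might set several of these options up
#' front. This is only a sketch; the values shown are illustrative and the
#' directory name is arbitrary:
#' \preformatted{
#' options( harvestr.use.cache = TRUE
#'        , harvestr.cache.dir = "harvestr-cache"
#'        , harvestr.time      = TRUE
#'        )
#' }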
NULL
<file_sep>---
title: Using `harvestr` for replicable simulations
author: <NAME>
vignette: >
%\VignetteIndexEntry{Using harvestr for replicable simulations}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
...
```{r setup, results='hide', message=FALSE, warning=FALSE}
library(harvestr)
library(plyr)
library(MCMCpack)
library(dostats)
```
# Introduction
The `harvestr` package is a new approach to simulation studies
that facilitates parallel execution. It builds off the structures
available in the `plyr`, `foreach` and `rsprng`
packages. What `harvestr` brings to the picture is an abstraction
of the process of performing simulations.
# Process
The theme of `harvestr` is that of gardening, stemming from the
idea that the pseudo-random numbers used in replicable simulation
are produced by a generator (RNG) starting from initial states called
seeds. Figure 1 shows the basic process for `harvestr`.
The ideas are simple.
* `gather(n, [seed])` takes an integer for the number of
  seeds to generate. Optionally, the seed can be set for replicable
simulations. This uses the `rsprng` library to initialize
independent parallel random number streams.
* The seeds that are returned from `gather` are then fed into the
  `farm` function along with an expression to generate data.
`farm` returns a list of data frames each independently
generated under each of the rng streams.
* The final step is to apply a function to analyze the data. This is
done with the `harvest` command, which takes the data from
`farm` and applies an analysis function to the dataset. In the
  case that the analysis is deterministic, `harvest` is equivalent
to `llply` from the `plyr` package. The difference is
with stochastic analysis, such as Markov Chain Monte Carlo (MCMC),
where `harvest` resumes the RNG stream where `farm` left
off when generating the data.
The effect is that the results can be taken in any order and
independently, and the final results are the same as if each analysis
had been run from start to end with a single seed set for each stream.
## Example 1 - Basic Example
Some learn best by example, so here is a simple one for the basic
process: simple linear regression on 25 simulated data sets.
First we gather the seeds. This step is kept separate to facilitate
storing the seeds, so they can be distributed along with the research
if necessary.
```{r ex1-setup}
seeds <- gather(25, seed=12345)
```
Second, we generate the data.
```{r ex1-generate_data}
datasets <- farm(seeds, {
x <- rnorm(100)
y <- rnorm(100, mean=x)
data.frame(y,x)
})
```
Then we analyze the data.
```{r ex1-analysis}
analyses <- harvest(datasets, lm)
```
So what do we have in `analyses`? We have whatever `lm`
returned. In this case we have a list of `lm` objects containing
the results of a linear regression. Usually we will want to do more to
summarize the results.
```{r ex1-summarize}
coefs <- Reduce(rbind, lapply(analyses, coef))
apply(coefs, 2, dostats, mean, sd)
```
## Example 2 - Stochastic Analysis
That is very nice, but rather simple as far as analyses go. What might be
more interesting is to perform an analysis with a stochastic component such
as Markov Chain Monte Carlo.
```{r stochastic, message=F, warning=F}
library(MCMCpack)
library(plyr)
posteriors <- harvest(datasets, MCMCregress, formula=y~x)
dataframes <- harvest(posteriors, as.data.frame)
X.samples <- harvest(dataframes, `[[`, "x")
densities <- harvest(X.samples, density)
```
```{r fig=T}
plot(densities[[1]])
l_ply(densities, lines)
```
## Example 3 - Caching
To ease longer analyses with many steps caching is available.
```{r caching_run1}
unlink("harvestr-cache", recursive=TRUE) # reset cache
system.time({
posteriors1 <- harvest(datasets, MCMCregress, formula=y~x, cache=TRUE)
})
```
and when we run it again.
```{r cache_run2}
system.time({
posteriors2 <- harvest(datasets, MCMCregress, formula=y~x, cache=TRUE)
})
```
To maintain integrity, `harvestr` functions use
`digest` to create hashes of the seed, data, and function, so that
if any element changes, out-of-date cache results will not be used.
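As a sketch of that invalidation (not evaluated here): changing any of
the hashed inputs, such as the model formula, yields a different hash, so
the cached posteriors computed above would not be reused.

```{r cache-invalidation, eval=FALSE}
# A different formula hashes differently, so this call computes fresh
# results instead of reusing the cache written by the earlier chunks.
posteriors3 <- harvest(datasets, MCMCregress, formula=y~x+0, cache=TRUE)
```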
<file_sep>library(testthat)
library(harvestr)
foreach::registerDoSEQ()
test_check("harvestr")
<file_sep>library(harvestr)
library(testthat)
library(plyr)
if(require(doParallel)) {
cl=makeCluster(2)
clusterEvalQ(cl, (library(harvestr)))
registerDoParallel(cl)
}
{ context("parallel")
test_that('withseed is invariant under foreach', {
seed=gather(1, seed=1234)[[1]]
r <- foreach(seed = replicate(4, seed, simplify=F)) %do%
withseed(seed, rnorm(1e5))
s <- foreach(seed = replicate(4, seed, simplify=F)) %dopar%
withseed(seed, rnorm(1e5))
  expect_true(identical(r, s))
expect_true(all(laply(r, all.equal, r[[1]], check.attributes=F)))
expect_true(all(laply(s, all.equal, s[[1]], check.attributes=F)))
})
test_that("farm is parallelizable.", {
set.seed(123)
seeds <- gather(4)
a <- farm(seeds, runif(5))
b <- farm(seeds, runif(5))
c <- farm(seeds, runif(5), .parallel=T)
expect_identical(a, b)
expect_identical(a, c)
})
test_that('harvest is parallelizable with option', {
seeds <- gather(100, seed=1234)
e <- farm(seeds, rnorm(10))
x <- harvest(e, sample, replace=T)
z <- harvest(e, sample, replace=T, .parallel=T)
expect_equivalent(noattr(x),noattr(z))
})
}
<file_sep>{###############################################################################
# test-Interactive.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# test the caching facilities.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
library(harvestr)
library(testthat)
oo <- options( "harvestr::Interactive::exclude" = NULL)
context("Interactive")
test_that("testing is_knitting", {
expect_false(is_knitting())
})
test_that("called from", {
expect_false(called_from(stats::rnorm))
expect_true(called_from(test_that))
expect_false(called_from('rnorm'))
expect_true(called_from('expect_true'))
})
test_that("Interactive", {
expect_equal(Interactive(), interactive())
})
test_that("Interactive exclusions", {
test_Interactive <- function(...){Interactive_core(...)}
expect_true (test_Interactive(is.interactive = TRUE, is.knitting=FALSE, NULL))
expect_false(test_Interactive(is.interactive =FALSE, is.knitting=FALSE, NULL))
expect_false(test_Interactive(is.interactive = TRUE, is.knitting= TRUE, NULL))
expect_false(test_Interactive(is.interactive = TRUE, is.knitting=FALSE, list(test_Interactive)))
})
options(oo)
<file_sep>{###############################################################################
# test-use_method.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# testing for use_method
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
library(harvestr)
library(testthat)
context("use_method")
test_that("use_method", {
exf <- system.file("examples", "use_method.R", package="harvestr")
if(exf=="")
exf <- system.file("inst", "examples", "use_method.R", package="harvestr")
if(exf!=""){
source(exf)
x <- mr$new(x = 2L)
x$x <- 2L
name <- "<NAME>!"
x$name <- "Hello World!"
expect_that(use_method("hello"), is_a("function"))
expect_that(use_method(hello), is_a("function"))
expect_that(use_method(hello, 1), is_a("function"))
expect_that(use_method(hello)(x), is_a("character"))
expect_that(use_method(times, 3)(x), equals(6))
} else {expect_that(FALSE, is_true(), "Could not find the example for use_method.")}
})
<file_sep>
RNGkind("L'Ecuyer-CMRG")
set.seed(123)
(s <- .Random.seed)
r<-nextRNGStream(s)
replicate(10, nextRNGSubStream(r), simplify=F)
n <- 10
l
a <-
llply(l, withseed, runif(10), environment())
b <-
llply(l, withseed, runif(10), environment())
identical(a,b)
a <- replicate(3, {
r <<- nextRNGStream(s)
replicate(3, r <<- nextRNGSubStream(r), T)
}, F)
b <- replicate(3, {
r <<- nextRNGStream(s)
runif(10)
replicate(3, r <<- nextRNGSubStream(r), T)
}, F)
nextRNGSubStream(s)
set.seed(r)
a <- runif(10)
nextRNGStream(s)
nextRNGSubStream(s)
set.seed(r)
b <- runif(10)
set.seed(1)
a <- runif(10)
set.seed(1)
b <- runif(10)
<file_sep>library(harvestr)
library(plyr)
mr <- setRefClass("HelloWorld",
fields = list(
x = 'integer',
name = 'character'),
methods = list(
hello = function(){
invisible(name)
},
times = function(y){
x*y
},
babble = function(n){
paste(sample(letters), collapse='')
}
)
)
p <- data.frame(x=as.integer(1:26), name=letters, stringsAsFactors=FALSE)
# create list of objects
objs <- mlply(p, mr$new)
# plant seeds to prep objects for harvest
objs <- plant(objs)
# run methods on objects
talk <- harvest(objs, use_method(babble), .progress='none', .parallel=FALSE)
unlist(talk)
# and to show reproducibility
more.talk <- harvest(objs, use_method(babble), .progress='none', .parallel=FALSE)
identical(unlist(talk), unlist(more.talk))
<file_sep>{###############################################################################
# gather.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# Interface level functions that define the process flow.
#
# gather -> farm -> harvest
# gather -> plant -> harvest
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
#' Gather independent seeds.
#'
#' Collect seeds representing independent random number streams.
#' These seeds can them be used in \code{\link{farm}} or \code{\link{plant}}.
#'
#'
#' @param x number of seeds, or an object with seeds to gather
#' @param seed a seed to use to set the seed, must be compatible with "L'Ecuyer-CMRG"
#' @param ... passed on
#' @param .starting if TRUE starting seeds with be gathered rather than ending seeds.
#'
#' @seealso \link{RNG}
#' @family harvest
#' @importFrom parallel nextRNGStream
#' @include withseed.R
#' @export
gather <-
function(x, seed=get.seed(), ..., .starting=F){
oldseed <- get.seed()
on.exit(replace.seed(oldseed))
if(is.list(x)){
seeds <- lapply(x, attr, ifelse(.starting,"starting.seed", "ending.seed"))
if(any(sapply(seeds, is.null))) stop("Malformed list.")
seeds
} else if(is.numeric(x) && isTRUE(all.equal(x,ceiling(x)))){
if(!is.null(seed)) {
set.seed(seed, kind="L'Ecuyer-CMRG", normal.kind="Inversion")
} else {
RNGkind(kind="L'Ecuyer-CMRG", normal.kind="Inversion")
}
r <- get.seed()
seeds <- vector('list', x)
for(i in seq_len(x)) {
r <-
seeds[[i]] <-
structure( nextRNGStream(r)
, RNGlevel='stream'
, class=c("rng-seed", "integer")
)
}
structure(seeds, class=c('rng-seeds', 'list'))
} else {
stop("x must be either a list or integer")
}
}
#' Create substreams of numbers based of a current stream.
#'
#' Seeds from \code{\link{gather}} can be used to generate another
#' set of independent streams. These seeds can be given to
#' \code{\link{graft}}
#'
#' As a convenience \code{seed} can be an object that has a seed attached,
#' ie. the result of any \code{harvestr} function.
#'
#' @param seed a current random number stream compatible with
#' \code{\link{nextRNGSubStream}}
#' @param n number of new streams to create.
#'
#' @family harvest
#' @seealso \code{\link{nextRNGSubStream}}
#' @importFrom parallel nextRNGSubStream
#' @export
sprout <- function(seed, n) {
es <- attr(seed, 'ending.seed')
if(!is.null(es)) seed <- es
rng.level <- attr(seed, 'RNGlevel')
if(!is.null(rng.level) && rng.level == 'substream') {
warning(paste('RNG seed provided is already a substream seed,'
,'independence of streams not guaranteed.'))
}
seeds <- replicate(n, simplify=FALSE, {
seed <<- structure( nextRNGSubStream(seed)
, class = c('rng-seed', 'integer')
, RNGlevel = 'substream'
)
})
structure(seeds, class=c('rng-seeds', 'list'))
}
#' Call a function continuing the random number stream.
#'
#' The \code{reap} function is the central function to \code{harvest}.
#' It takes an object, \code{x}, extracts the previous seed, ie. state of
#' the random number generator, sets the seed, and continues any evaluation.
#' This creates a continuous random number stream that is completely
#' reproducible.
#'
#' The function calling works the same as the \link{apply} family of functions.
#'
#' @param x an object
#' @param fun a function to call on object
#' @param ... passed onto function
#' @param hash hash of the list to retrieve the cache from.
#' @param cache use cache, see Caching in \code{\link{harvestr}}
#' @param cache.dir directory for the cache.
#' @param time should results be timed?
#' @param use.try Should the call be wrapped in try?
#'
#' @seealso \code{\link{withseed}}, \code{\link{harvest}}, and \code{\link{with}}
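#' @examples
#' # A minimal sketch mirroring the package tests: the seeded object
#' # carries its RNG state, so repeated calls reproduce the same draw.
#' seed <- gather(1)
#' x <- plant(list(1:10), seed)[[1]]
#' identical(reap(x, sample), reap(x, sample))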
#' @export
reap <-
function( x, fun, ...
, hash = digest(list(x, fun, ..., source="harvestr::reap"), algo="md5")
, cache = getOption('harvestr.use.cache', defaults$cache())
, cache.dir = getOption('harvestr.cache.dir', defaults$cache.dir())
, time = getOption('harvestr.time' , defaults$time())
, use.try = getOption('harvestr.use.try' , !interactive())
) {
seed <- attr(x, "ending.seed")
if(is.null(seed))
stop("Could not find a seed value associated with x")
if(cache){
cache <- structure(cache, expr.md5 = hash)
}
f <- if(use.try) {
function(){try(fun(x,...), silent=getOption('harvestr.try.silent', FALSE))}
} else {
function(){fun(x,...)}
}
withseed(seed, f, cache=cache, cache.dir=cache.dir, time=time)
}
#' Evaluate an expression for a set of seeds.
#'
#' For each seed, set the seed, then evaluate the expression.
#' The \code{farm} function is used to generate data.
#' The seed for the state of the random number generator is stored in the
#' attribute \code{'ending.seed'}, and will be used by harvestr functions
#' for any other random number generation that is needed.
#'
#' @param seeds a list of seeds can be obtained though \code{\link{gather}}
#' @param expr an expression to evaluate with the different seeds.
#' @param envir an environment within which to evaluate \code{expr}.
#' @param ... extra arguments
#' @param cache should cached results be used or generated?
#' @param time should results be timed?
#' @param .parallel should the computations be run in parallel?
#' @param .progress Show a progress bar?
#'
#' @importFrom plyr llply ldply
#' @family harvest
#' @export
farm <-
function( seeds, expr, envir = parent.frame(), ...
, cache = getOption('harvestr.use.cache', defaults$cache() )
, time = getOption('harvestr.time' , defaults$time() )
, .parallel = getOption('harvestr.parallel' , defaults$parallel() )
, .progress = getOption('harvestr.progress' , defaults$progress() )
){
if(is.numeric(seeds) && length(seeds)==1)
seeds <- gather(seeds)
fun <- if(is.name(substitute(expr)) && is.function(expr)){
stopifnot(is.null(formals(expr)))
expr
} else {
substitute(expr)
}
results <- llply(.data=seeds, .fun=withseed, fun, envir=envir, ...
, cache=cache, time=time
, .parallel=.parallel, .progress=.progress)
if(time){
attributes(results)$time <- total_time(results)
}
structure( results
, 'function' = 'harvestr::farm'
, class = c('harvestr-datasets', 'list')
)
}
#' Harvest the results.
#' @param .list a list of \code{data.frame}s. See details.
#' @param fun a function to apply
#' @param ... passed to \code{fun}
#' @param time should results be timed?
#' @param .parallel should the computations be run in parallel?
#' @param .progress Show a progress bar?
#'
#' @details
#' harvest is functionally equivalent to llply, but takes on additional capability when used
#' with the other functions from this package. When an object comes from \code{\link{withseed}},
#' the ending seed is extracted and used to continue evaluation.
#'
#' @importFrom plyr mlply
#' @family harvest
#' @export
harvest <-
function(.list, fun, ...
, time = getOption('harvestr.time' , defaults$time() )
, .parallel = getOption('harvestr.parallel' , defaults$parallel() )
, .progress = getOption('harvestr.progress' , defaults$progress() )
){
results <- llply( .list, reap, fun, ..., time =time
, .parallel=.parallel, .progress=.progress
)
if(getOption('harvestr.try.summary', TRUE)) try_summary(results)
if(time){
attributes(results)$time <- total_time(results)
}
structure( results
, 'function' = 'harvestr::harvest'
, class = c('harvestr-results', 'list')
)
}
is_try_error <- function(x)inherits(x, 'try-error')
get_message <- function(e)attr(e, 'condition')$message
try_summary <- function(results){
errors <- Filter(is_try_error, results)
if(length(errors)){
errors <- sapply(errors, get_message)
cat(sprintf("%d of %d (%3.2g%%) calls produced errors"
, length(errors) , length(results)
, length(errors) / length(results) * 100), "\n\n")
print(table(errors))
}
}
#' Assign elements of a list with seeds
#' @param .list a list to set seeds on
#' @param ... passed to gather to generate seeds.
#' @param seeds to plant from \code{\link{gather}} or \code{\link{sprout}}
#'
#' @description
#' The function \code{plant} assigns
#' each element in the list its own seed.
#' This will replace any ending seed values already set for the
#' objects in the list.
#'
#' @family harvest
#' @export
plant <-
function(.list
, seeds = gather(length(.list), ...)
, ...
) {
if(inherits(.list, 'data.frame'))
.list <- mlply(.list, data.frame)
stopifnot(inherits(.list, 'list'))
n <- length(.list)
if(!inherits(seeds, "rng-seeds")){
stopifnot(inherits(seeds, "numeric"))
seeds <- gather(n, seed=seeds)
}
stopifnot(n == length(seeds))
for(i in seq_len(n)){
attr(.list[[i]],'ending.seed') <- seeds[[i]]
}
return(.list)
}
#' @rdname plant
#' @aliases graft
#'
#' @description
#' The \code{graft} function replicates an object with independent substreams.
#' The result from \code{graft} should be used with \code{\link{harvest}}.
#'
#' @param x an object that already has a seed.
#' @param n number of seeds to create
#' @family harvest
#' @export
graft <-
function(x, n, seeds = sprout(x, n))
plant(replicate(length(seeds), x, F), seeds)
branch <-
function( fun #< function to apply to x and y[i]
, x #< seeded object
, y #< Vector of first argument
, ... #< additional arguments passed to `<mapply>`.
, seeds = sprout(x, length(y)) #< seeds to use.
, time = getOption('harvestr.time' , defaults$time() )
, .parallel = getOption('harvestr.parallel' , defaults$parallel() )
, .progress = getOption('harvestr.progress' , defaults$progress() )
){
#! Take an object, `x`, graft it into length(y) sub-streams,
#^ then call `fun` with the grafts and corresponding y element.
#! function is called as `fun(x,y)`
buds <- graft(x, seeds=seeds)
shoots <- t(mapply(list, x=buds, y, ..., SIMPLIFY=TRUE))
results <-
mlply( shoots, .fun=reap, fun=fun
, time = time
, .parallel = .parallel
, .progress = .progress
)
if(getOption('harvestr.try.summary', TRUE)) try_summary(results)
if(time){
attributes(results)$time <- total_time(results)
}
structure( results
, 'function' = 'harvestr::branch'
, class = c('harvestr-results', 'list')
)
#TODO[Andrew]: Test branch
}
#' Apply over rows of a data frame
#'
#' @param df a data frame of parameters
#' @param f a function
#' @param ... additional parameters
#' @param seeds seeds to use.
#' @param seed passed to gather to generate seeds.
#' @inheritParams harvest
#'
#' @return a list with f applied to each row of df.
#'
#' @importFrom plyr splat
#' @export
plow <-
function( df, f, ...
, seed=get.seed()
, seeds=gather(nrow(df), seed=seed)
, time = getOption('harvestr.time' , defaults$time() )
, .parallel = getOption('harvestr.parallel' , defaults$parallel() )
, .progress = getOption('harvestr.progress' , defaults$progress() )
){
parameters <- plant(df, seeds=seeds)
structure( harvest( parameters, splat(f), ...
, .parallel=.parallel, .progress=.progress, time=time
)
, "function" ="harvestr::plow")
}
#' Combine results into a data frame
#'
#' @param l a list, from a harvestr function.
#' @param .check should checks be run on the object.
#'
#' @seealso ldply
#'
#' @export
bale <-
function(l, .check=T){
if(.check){
stopifnot( is_homo(l)
, inherits(l, 'list')
, length(l)>0
)
if(!inherits(l, 'harvestr-results'))
warning('bale is intended to be used with harvestr results but got a ', class(l))
}
ldply(l, if(inherits(l[[1]], 'harvestr-results')) bale else as.data.frame)
}
<file_sep>library(testthat)
context("seeds")
test_that("get.seed retrieves properly", {
set.seed(123, "L'Ecuyer-CMRG", "Inversion", 'Rejection')
seed <- get.seed()
expect_is(seed, 'rng-seed')
eseed <- structure(c(10407L
, 1806547166L
, -983674937L
, 643431772L
, 1162448557L
, -959247990L
, -133913213L), class=c('rng-seed', 'integer'))
expect_identical(seed, eseed)
})
test_that("replace.seed", {
eseed <- c(10407L
, 1806547166L
, -983674937L
, 643431772L
, 1162448557L
, -959247990L
, -133913213L)
set.seed(123, "Mersenne-Twister", "Box-Muller")
k <- RNGkind()
replace.seed(eseed)
l <- RNGkind()
expect_identical(k, c("Mersenne-Twister", "Box-Muller", "Rejection"))
expect_identical(l, c("L'Ecuyer-CMRG", "Inversion", "Rejection"))
})
test_that("withseed, is replicable", {
seeds <- gather(4)
seed1 <- seeds[[1]]
seed2 <- seeds[[2]]
noattr(a <- withseed(seed1, rnorm(10)))
noattr(b <- withseed(seed1, rnorm(10)))
noattr(c <- withseed(seed2, rnorm(10)))
d <- rnorm(10)
expect_identical(a, b)
expect_false(all(a == c)) # FALSE
expect_false(identical(a, c)) # FALSE
expect_false(all(a == d)) # FALSE
})
test_that("withseed replaces previous seed.", {
seed <- gather(1)[[1]]
set.seed(123, "Mersenne-Twister", "Box-Muller")
l <- get.seed()
a <- withseed(seed, get.seed())
k <- get.seed()
expect_identical(l,k)
expect_equivalent(a, attr(a, 'starting.seed'))
expect_equivalent(a, attr(a, 'ending.seed'))
expect_false(identical(l, noattr(a)))
expect_false(isTRUE(all.equal(a, l)))
expect_identical(RNGkind(), c("Mersenne-Twister", "Box-Muller", "Rejection"))
})
test_that('seeds set properly - withseed', {
seeds <- gather(4)
seed <- noattr(seeds[[1]])
set.seed(123, "Mersenne-Twister")
cseed <- .Random.seed
a <- withseed(seed, .Random.seed)
s1 <- noattr(attr(a, 'starting.seed'))
s2 <- noattr(a)
s3 <- noattr(attr(a, 'ending.seed'))
expect_identical(seed, s1)
expect_identical(seed, s2)
expect_identical(seed, s3)
expect_false(isTRUE(all.equal(seed, cseed)))
expect_identical(cseed, .Random.seed)
})
test_that('withseed handles passed expressions', {
expr <- quote(runif(10))
seed <- gather(1)[[1]]
a <- withseed(seed, expr)
b <- withseed(seed, runif(10))
expect_equal(a, b)
expect_equal(length(a), 10)
expect_true(all(0<=a & a<=1))
})
test_that('is_seeded', {
x <- plant(list(1))
expect_false(is_seeded(NULL))
expect_true(is_seeded(x))
expect_true(is_seeded(x[[1]]))
})
test_that('swap_args', {
f <- function(a=1, b=2){a**b}
g <- swap_args(f)
expect_equal(formals(g), pairlist(b=2, a=1))
})
test_that('homogeneity', {
expect_true (only1(list(a=1, b=1)))
expect_false(only1(list(a=1, b=2)))
expect_true (is_unity(a=1, b=1))
expect_false(is_unity(a=1, b=2))
expect_true (is_homo(list(a=1, b=2 )))
expect_false(is_homo(list(a=1, b=2L)))
expect_true (is_homogeneous(a=1, b=2 ))
expect_false(is_homogeneous(a=1, b=2L))
})
test_that("remove seed", {
runif(1)
replace.seed(NULL)
expect_false(exists('.Random.seed', envir=.GlobalEnv))
})
test_that('get.seed', {
replace.seed(NULL)
expect_null(a <- get.seed())
seed <- gather(1)[[1]]
})
test_that("GetOrSetSeed", {
replace.seed(NULL)
seed <- GetOrSetSeed()
expect_is(seed, 'integer')
})
test_that('printing seeds', {
seeds <- gather(3)
expect_match(format(seeds), "L'Ecuyer-CMRG<[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+>")
expect_output(print(seeds), "L'Ecuyer-CMRG<[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+>")
expect_match(format(seeds[[1]]), "L'Ecuyer-CMRG<[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+>")
expect_output(print(seeds[[1]]), "L'Ecuyer-CMRG<[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+/[0-9A-F]+>")
expect_output(print(gather(100)), "\\+ 94 more, 100 in total.")
})
<file_sep>{###############################################################################
# withseed.R
# This file is part of the R package harvestr.
#
# Copyright 2012 <NAME>
# Date: 6/2/2012
#
# DESCRIPTION
# ===========
# functions for working with random number seeds.
#
# LICENSE
# ========
# harvestr is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This file is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this file. If not, see http://www.gnu.org/licenses/.
#
}###############################################################################
#' Do a computation with a given seed.
#' @rdname seed_funs
#' @param seed a valid seed value
#' @param expr expression to evaluate.
#' @param envir the \code{\link{environment}} to evaluate the code in.
#' @param cache should results be cached or retrieved from cache.
#' @param cache.dir Where should cached results be saved to/retrieve from.
#' @param time should results be timed?
#'
#' @details
#' Computes \code{expr} with the given seed, restoring the previous global seed after the
#' computations are finished.
#'
#' The global \code{.Random.seed} is therefore left unchanged by the call.
#' @note Not parallel compatible; this modifies the global environment while processing.
#' @importFrom digest digest
#' @seealso \code{\link{set.seed}}
#' @export
withseed <- function(seed, expr, envir=parent.frame()
, cache = getOption('harvestr.use.cache', defaults$cache() )
, cache.dir = getOption("harvestr.cache.dir", defaults$cache.dir())
, time = getOption('harvestr.time' , defaults$time() )
){
oldseed <- get.seed()
on.exit(replace.seed(oldseed))
se <- substitute(expr)
if(cache){
expr.md5 <- attr(cache, 'expr.md5')
parent.call <- sys.call(-1)[[1]]
if(is.null(expr.md5)){
if(is.call(se))
expr.md5 <- digest(se , 'md5')
else
expr.md5 <- digest(expr, 'md5')
}
seed.md5 <- digest(seed, 'md5')
filename <- paste(expr.md5, '-', seed.md5, ".RData", sep='')
cache.file <- file.path(cache.dir, filename)
if(file.exists(cache.file)){
load(cache.file)
return(result)
}
}
replace.seed(seed)
fun <- if(is.name(se)){
if(is.function(expr)){
stopifnot(is.null(formals(expr)))
expr
} else if(is.call(expr)) {
as.function(list(expr), envir=envir)
} else {
eval(substitute(function()expr), envir=envir)
}
} else {
eval(substitute(function()expr), envir=envir)
}
if(time) start.time <- proc.time()
result <- fun()
ending.seed <- get.seed()
attributes(ending.seed) <- attributes(seed) # will carry forward things like RNGStream level.
result <- structure(result,
starting.seed = seed,
ending.seed = ending.seed,
time=if(time)structure(proc.time() - start.time, class = "proc_time"))
if(cache && !inherits(result, 'try-error')){
if(!file.exists(cache.dir)) dir.create(cache.dir)
save(result, file=cache.file)
}
result
}
#' safe version of retrieving the .Random.seed
#' @rdname seed_funs
#' @return the .Random.seed if defined, otherwise NULL
#' @export
get.seed <- function(){
if(exists(".Random.seed", envir=.GlobalEnv, mode="numeric")) {
seed <- get(".Random.seed", envir=.GlobalEnv, inherits=FALSE)
class(seed) <- c("rng-seed", "integer")
return(seed)
} else {
return(NULL)
}
}
#' @rdname seed_funs
#' @param delete logical to delete if \code{seed} is null.
#' @details
#' Replaces the .Random.seed with seed. If seed is null, the
#' .Random.seed is deleted when delete=TRUE.
#' @export
replace.seed <- function(seed, delete=TRUE){
if(is.null(seed)){
if(delete && exists('.Random.seed', envir=.GlobalEnv, inherits=FALSE))
remove('.Random.seed', envir=.GlobalEnv)
} else {
assign('.Random.seed', noattr(seed), envir=.GlobalEnv)
}
}
#' Get or Set Current Seed - Safe Version
#' @rdname seed_funs
#'
#' @details
#' Always returns a valid seed.
#' Useful for grabbing a seed used to generate a random object.
#'
#' @return a valid .Random.seed value.
#' @importFrom stats runif
GetOrSetSeed<-function(){
if(is.null(get.seed())) stats::runif(1)
seed <- .Random.seed
seed
}
#' @export
#' @importFrom utils head tail
`format.rng-seed` <-
function( x
, ...
){
if (length(x) <= 7)
paste0(RNGkind()[1], "<", paste(sprintf("%X", x[-1]), collapse="/"), ">")
else
paste0(RNGkind()[1], "<", digest(tail(head(x, -1), -1), "xxhash64"), "+", tail(x, 1), ">")
}
#' @export
`print.rng-seed` <-
function( x
, ...
){
print(format(x))
}
#' @export
`format.rng-seeds` <-
function( x
, ...
){
sapply(x, `format.rng-seed`)
}
#' @export
`print.rng-seeds` <-
function( x #< [rng-seeds] object
, ...
, max.length = 10 #< maximum length to not truncate
, show.length = 6 #< when truncating how many to show
){
o <- format(x, ...)
if(length(o)>max.length){
print(head(o, show.length))
    cat("+", length(o) - show.length, "more, ", length(o), "in total.\n")
} else {
print(o)
}
}
<file_sep>foreach::registerDoSEQ()
context("Grafting")
test_that("graft is reproducible", if (requireNamespace('MCMCpack')) {
results <-
replicate(2, simplify=F, {
datasets <- farm(gather(3, seed = 20120604), {
x1 <- rnorm(100)
x2 <- rnorm(100)
y <- rbinom(100, 1, p=plogis(x1 + x2))
data.frame(y, x1, x2)
}, .progress='none', .parallel=FALSE)
substreams <- plyr::llply(datasets, graft, 10)
subchains <- harvest(substreams[[1]], MCMCpack::MCMCregress
, formula=y~x1+x2, n=100, .progress='none', .parallel=FALSE)
})
expect_identical(noattr(results[[1]]), noattr(results[[2]]))
})
<file_sep>is_harvestr_frame <- function(env){
par <- parent.env(env)
if(!exists(".packageName", envir=par, inherits=FALSE)) return(FALSE)
par$.packageName == "harvestr"
}
is_top_harvestr_call <-
function(n=0){
frames <- head(sys.frames(), -1-n)
sum(sapply(frames, is_harvestr_frame))==1
}
test_is_top_harvestr_call <- function(...)is_top_harvestr_call(...)
#' @importFrom foreach getDoParRegistered
dflt_harvestr_parallel <-
function(){
if(is_top_harvestr_call())
stop("dflt_harvestr_parallel should not be called directly.")
if(is_top_harvestr_call(1))
return(getDoParRegistered())
frames <- sys.frames()
harvestr.frames <- Filter(is_harvestr_frame, frames)
has.parallel <- sapply(harvestr.frames, exists, x=".parallel", where=-1, inherits=FALSE)
if(!any(has.parallel)) return(FALSE)
return(max(get(envir=harvestr.frames[has.parallel][[1]], ".parallel") - 1, 0))
}
test_dflt_harvestr_parallel <- function(.parallel=0, nest=.parallel){
if(nest>0) return(Recall(.parallel=.parallel, nest= nest-1))
dflt_harvestr_parallel()
}
dflt_harvestr_progress <-
function( is.interactive = Interactive()
, OS = .Platform$OS.type
, is.top.call = is_top_harvestr_call(1)
){
if( is.interactive && is.top.call){
if( OS == "windows")
return("win")
else return("time")
}
return("none")
}
test_dflt_harvestr_progress <- function(...){dflt_harvestr_progress(...)}
dflt_harvestr_time <- function(){ FALSE }
dflt_harvestr_cache <- function(){ FALSE }
dflt_harvestr_cache_dir <- function(){"harvestr-cache"}
dflt_harvestr_use_try <- function(){!interactive()}
defaults <-
list( parallel = dflt_harvestr_parallel
, progress = dflt_harvestr_progress
, time = dflt_harvestr_time
, cache = dflt_harvestr_cache
, cache.dir = dflt_harvestr_cache_dir
, use.try = dflt_harvestr_use_try
)
<file_sep>## Test environments
* local Gentoo install, R 3.3.1
* ubuntu 12.04 (on travis-ci), R 3.3.1
* win-builder (devel and release)
## R CMD check results
R CMD check results
0 errors | 0 warnings | 0 notes
## Reverse dependencies
There are no dependencies on harvestr currently active on CRAN.
---
<file_sep>
cache.dir <- normalizePath(file.path(tempdir(), "harvestr-cache"), mustWork=F)
options(harvestr.cache.dir=cache.dir)
reg.finalizer(emptyenv(), function(...){unlink(cache.dir, TRUE)}, onexit=TRUE)
long_function <- function(wait=15){
  # allow me to read the ether
Sys.sleep(wait)
  paste("Your lucky lotto numbers are:",
paste(sample(56, 5), collapse=" "),
'|', sample(46, 1), sep=' ')
}
cache.dir <- getOption("harvestr.cache.dir")
if (file.exists(cache.dir)) unlink(cache.dir, recursive=TRUE, force=TRUE)
t1 <- system.time(a <- withseed(gather(1)[[1]], long_function(20), cache=T))
t2 <- system.time(b <- withseed(gather(1)[[1]], long_function(20), cache=T))
rbind(t1, t2)
identical(a, b)
<file_sep># The `harvestr` Parallel Simulation Framework.
[](https://travis-ci.org/halpo/harvestr)
[](https://codecov.io/github/halpo/harvestr?branch=master)
[](https://cran.r-project.org/package=harvestr)
The `harvestr` package is a framework for conducting replicable parallel
simulations in [R](http://www.r-project.org). It builds off
the popular [plyr](http://cran.r-project.org/?package=plyr)
package's split-apply-combine framework, and the parallel combined
multiple-recursive generator from L'Ecuyer (1999).
Because replicable simulations are based on seed values, this package takes a theme of seeds and farming. The principal functions are as follows:
* `gather` - Creates a list of parallel rng seeds.
* `farm` - Uses seeds from `gather` to evaluate expressions after each seed has been set. This is useful for generating data.
* `harvest` - Takes the results from `farm` and continues evaluation with the random number generation where `farm` left off. This is useful for evaluating data generated with `farm` through stochastic methods such as Markov Chain Monte Carlo.
* `reap` - The single-element version of `harvest`, for one object that has appropriately structured seed attributes.
* `plant` - takes a list of objects, assumed to be of the same class, and gives each element a parallel seed value to use with `harvest` for evaluation.
* `graft` - splits RNG sub-streams from a main object.
* `sprout` - gets the seeds for use in `graft`.
## Lists ##
All of the functions work with lists: they expect and return lists, which can easily be converted to data frames. I would do this with `ldply(list, I)`.
## Parallel ##
The advantage of setting the seeds like this is that parallelization is seamless and transparent. As in the `plyr` framework, each function has a `.parallel` argument, which defaults to `FALSE`; when set to `TRUE`, evaluation runs in parallel. An appropriate parallel backend must be registered. For example, with a multicore backend you would run the following code.
```r
library(doMC)
registerDoMC()
```
See the `plyr` and `foreach` packages' documentation for which backends are currently supported.
## Operating Systems ##
`harvestr` is limited in its capabilities by the packages that it depends on, mainly
[foreach](https://cran.r-project.org/package=foreach)
and [plyr](https://cran.r-project.org/package=plyr).
The parallel backends are platform-limited; read the individual packages' documentation:
* [doMC](https://cran.r-project.org/?package=doMC)
* [doSMP](https://cran.r-project.org/?package=doSMP)
* [doParallel](https://cran.r-project.org/?package=doParallel)
* [doMPI](https://cran.r-project.org/?package=doMPI)
* [doRedis](https://cran.r-project.org/?package=doRedis)
* [doSNOW](https://cran.r-project.org/?package=doSNOW)
# Notes #
Please note that this project is released with a [Contributor Code of Conduct](CONDUCT.md). By participating in this project you agree to abide by its terms.
<file_sep>#' Strip attributes from an object.
#'
#' @param x any object
#' @seealso \link{attributes}
#' @export
noattr <- noattributes <- function(x) {
if(is.list(x)){
x <- llply(x, noattributes)
}
attributes(x) <- NULL
x
}
#' Retrieve an attribute or a default if not present.
#'
#' Behaves similarly to \code{\link{getOption}}, but is a simple wrapper
#' for \code{\link{attr}}.
#'
#'
#' @param object An R Object.
#' @param name Name of the Attribute
#' @param default The value if the attribute is not set or NULL
#'
#' @export
getAttr <- function(object, name, default=NULL){
a <- attr(object, name)
if(is.null(a))
return(default)
else
return(a)
}
| 778e0d3344c4cf7db1ef170cb74b1bb867cb4f44 | [
"Markdown",
"R",
"RMarkdown"
]
| 24 | R | halpo/harvestr | 6c19cee53727b97e5391214ab90ea8ded30c2e2a | 6cb2386f5d54b4909ede1cf80fedc7249f2f0d21 |
refs/heads/master | <repo_name>dandrewgarvin/oop-css<file_sep>/README.md
# OOP-CSS
*Cascading Style Sheets.* The bane of many web-developers' existence.
Mugs, T-shirts, Stickers, and Gifs have all been made to show the high level of loathing these developers have toward this piece of technology. My personal favorite is this gif of <NAME>:

But never fear, front-end developers. Your problems are soon to increase exponentially. Let me introduce you to `OOP-CSS` (AKA `OOPS`).
## What is `OOP-CSS` (`OOPS`)?
OOPS is a poor, humorous attempt at taking the well-disliked syntax of CSS and turning it into an Object-Oriented Programming language. It is by no means intended to become a feature-filled language (if it was, I'd have written a real Lexer, Parser, and Evaluator), but instead is a spoof-of-concept idea that I thought would be fun to make.
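`index.js` pipes the source through `removeComments` before `blockify` carves it into JSON blocks. The `tools/removeComments.js` source isn't shown in this repo, so the version below is only a guess at a minimal implementation, assuming the language keeps CSS's `/* ... */` block-comment syntax:

```javascript
// Hypothetical minimal removeComments (the real tools/removeComments.js
// may differ). Strips CSS-style block comments; the non-greedy
// [\s\S]*? match spans newlines without merging two separate
// comments into one span.
function removeComments(source) {
  return source.replace(/\/\*[\s\S]*?\*\//g, '');
}

module.exports = removeComments;
```

A single regex pass like this is enough for the joke language's needs; anything richer (comment markers inside strings, nesting) is where a real lexer would earn its keep.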
## Notes
This project was built in a very short period of time, and I, the author, would like to take full credit for the idea/project, but request that anyone looking at this code does not judge me for the way it was written. Since this project was just for fun, I wasn't worried about optimizing the code, but instead wrote it only in a way that worked.
## TODO
- [x] Remove comments from code
- [x] Break code into json blocks
- [x] Save ast-temp.json file for documentation
- [ ] File should not be hard-coded in, but be received from stdin
- [ ] :root variables in JSON file should be saved as JS variables
- [ ] .function blocks should be turned into JS function blocks
- [ ] .function::before block should declare parameters (if not exists, no parameters)
- [ ] .function::after (if exists) should invoke function block<file_sep>/tools/saveTemp.js
const fs = require('fs');
function saveTemp(arr) {
const jsonified = JSON.stringify(arr, null, 4);
fs.writeFileSync('ast-temp.json', jsonified, 'utf8', null);
}
module.exports = saveTemp;<file_sep>/index.js
const fs = require('fs');
const removeComments = require('./tools/removeComments');
const blockify = require('./tools/blockify');
const saveTemp = require('./tools/saveTemp');
let file = fs.readFileSync(__dirname + '/test.txt', 'utf8');
let noComments = removeComments(file);
let ast = blockify(noComments);
saveTemp(ast);
console.log(ast); | 0155871747fcf7631a5dee22314432efe8929b51 | [
"Markdown",
"JavaScript"
]
| 3 | Markdown | dandrewgarvin/oop-css | 8af93ad67fa575687611fbb8991d1f5e71b7a3d9 | b5f43f00ab24a379945fa7aaa6314971c0572923 |
refs/heads/master | <repo_name>spriore/fun-projects<file_sep>/langtons_ant.py
import tkinter as tk
from random import randint
from time import sleep
class App:
def __init__(self):
self.root = tk.Tk()
self.root.attributes("-fullscreen", True)
screen_width = self.root.winfo_screenwidth() - 20
screen_height = self.root.winfo_screenheight() - 20
self.box = 10 # self.box size in px
self.xscale = int(screen_width / self.box )
self.yscale = int(screen_height / self.box )
self.c = tk.Canvas(self.root, height= screen_height, width= screen_width)
self.c.pack()
self.setup()
self.root.bind('<space>', lambda e: self.reset())
self.root.after(500, self.update)
self.root.mainloop()
def setup(self):
self.grid = [[0 for y in range(self.yscale)] for x in range(self.xscale)]
for x in range(self.xscale):
for y in range(self.yscale):
self.grid[x][y] = self.c.create_rectangle(
x*self.box+10,
y*self.box+10,
(x+1)*self.box+10,
(y+1)*self.box+10,
fill='#ffffff')
self.ant_coord = [round(self.xscale/2),round(self.yscale/2)]
self.ant_dir = [1,0]
self.ant = self.c.create_rectangle(
self.ant_coord[0] * self.box + 10,
self.ant_coord[1] * self.box + 10,
(self.ant_coord[0] + 1) * self.box + 10,
(self.ant_coord[1] + 1) * self.box + 10,
fill = '#871719')
def update(self):
self.run = True
while self.run == True:
pos = self.grid[self.ant_coord[0]][self.ant_coord[1]]
state = self.c.itemcget(pos, 'fill')
if state == '#ffffff':
self.c.itemconfig(pos, fill='#0f52ba')
self.ant_dir = [-self.ant_dir[1], self.ant_dir[0]]
elif state == '#000000':
self.c.itemconfig(pos, fill='#ffffff')
self.ant_dir = [self.ant_dir[1], -self.ant_dir[0]]
else:
self.c.itemconfig(pos, fill='#000000')
self.ant_coord = [x + y for x, y in zip(self.ant_coord, self.ant_dir)]
if self.ant_coord[0] < 0:
self.ant_coord[0] = self.xscale-1
elif self.ant_coord[0] >= self.xscale:
self.ant_coord[0] = 0
if self.ant_coord[1] < 0:
self.ant_coord[1] = self.yscale-1
elif self.ant_coord[1] >= self.yscale:
self.ant_coord[1] = 0
self.c.coords(self.ant,
self.ant_coord[0] * self.box + 10,
self.ant_coord[1] * self.box + 10,
(self.ant_coord[0] + 1) * self.box + 10,
(self.ant_coord[1] + 1) * self.box + 10)
self.c.update()
# sleep(.005)
def reset(self):
self.root.attributes("-fullscreen", False)
self.run = False
if __name__ == '__main__':
app = App()
<file_sep>/double_pendulum.py
import tkinter as tk
from math import pi, sin, cos, radians
import random, time
class App:
def __init__(self):
self.root = tk.Tk()
self.l = tk.Frame(self.root)
self.g = tk.Scale(self.l, label='Gravity', from_ = 1, to = 9.8, resolution = .1, orient=tk.HORIZONTAL)
self.l1 = tk.Scale(self.l, label='L1', from_ = 1, to = 100, resolution = 1, orient=tk.HORIZONTAL)
self.l2 = tk.Scale(self.l, label='L2', from_ = 1, to = 100, resolution = 1, orient=tk.HORIZONTAL)
self.m1 = tk.Scale(self.l, label='M1', from_ = 1, to = 100, resolution = 1, orient=tk.HORIZONTAL)
self.m2 = tk.Scale(self.l, label='M2', from_ = 1, to = 100, resolution = 1, orient=tk.HORIZONTAL)
self.c = tk.Canvas(self.root, bg='#ffffff', height = 500, width = 500)
self.l.pack(side=tk.LEFT)
self.g.pack()
self.l1.pack()
self.l2.pack()
self.m1.pack()
self.m2.pack()
self.c.pack(side=tk.RIGHT)
self.setup()
self.root.after(500, self.update)
self.root.mainloop()
def setup(self):
self.o = [250,250]
self.obj1 = {'l': self.l1.get(), 't': pi/2, 'v': 0.0, 'm': self.m1.get()}
self.obj2 = {'l': self.l2.get(), 't': pi, 'v': 0.0, 'm': self.m2.get()}
self.origin = self.c.create_rectangle(
self.o[0] - 2,
self.o[1] - 2,
self.o[0] + 2,
self.o[1] + 2,
fill = '#ffffff')
self.line1 = self.c.create_line(
self.o[0],
self.o[1],
self.o[0] + self.obj1['l'] * sin(self.obj1['t']),
self.o[1] + self.obj1['l'] * cos(self.obj1['t']),
width = 5)
self.line2 = self.c.create_line(
self.c.coords(self.line1)[2],
self.c.coords(self.line1)[3],
self.c.coords(self.line1)[2] + self.obj2['l'] * sin(self.obj2['t']),
self.c.coords(self.line1)[3] + self.obj2['l'] * cos(self.obj2['t']),
width = 5)
self.root.bind('<space>', lambda e: self.update())
def update(self):
self.run = True
self.root.bind('<space>', lambda e: self.reset())
while self.run == True:
pcoords = self.c.coords(self.line2)
g = self.g.get()
l1 = self.l1.get()
l2 = self.l2.get()
m1 = self.m1.get()
m2 = self.m2.get()
t1 = self.obj1['t']
t2 = self.obj2['t']
t1_v = self.obj1['v']
t2_v = self.obj2['v']
den = 2 * m1 + m2 - m2 * cos(2 * t1 - 2 * t2)
t1_a = (-g * (2 * m1 + m2) * sin(t1) - m2 * g * sin(t1 - 2 * t2) - 2 * sin(t1 - t2) * m2 * (t2_v * t2_v * l2 + t1_v * t1_v * l1 * cos(t1 - t2))) / (l1 * den)
t2_a = (2 * sin(t1 - t2) * ((t1_v * t1_v * l1 * (m1 + m2)) + (g * (m1 + m2) * cos(t1)) + t2_v * t2_v * l2 * m2 * cos(t1 - t2))) / (l2 * den)
self.obj1['v'] = round(self.obj1['v'] + t1_a, 4)
self.obj2['v'] = round(self.obj2['v'] + t2_a, 4)
self.obj1['t'] = round((self.obj1['t'] + self.obj1['v']) % (2 * pi), 4)
self.obj2['t'] = round((self.obj2['t'] + self.obj2['v']) % (2 * pi), 4)
self.c.coords(self.line1,
self.o[0],
self.o[1],
self.o[0] + l1 * sin(self.obj1['t']),
self.o[1] + l1 * cos(self.obj1['t']))
self.c.coords(self.line2,
self.c.coords(self.line1)[2],
self.c.coords(self.line1)[3],
self.c.coords(self.line1)[2] + l2 * sin(self.obj2['t']),
self.c.coords(self.line1)[3] + l2 * cos(self.obj2['t']))
self.c.create_line(
pcoords[2],
pcoords[3],
self.c.coords(self.line2)[2],
self.c.coords(self.line2)[3])
self.c.update()
time.sleep(.050)
def reset(self):
self.run = False
self.c.delete('all')
self.setup()
if __name__ == '__main__':
app = App()
<file_sep>/snake.py
import tkinter as tk
from random import randrange
import queue
class Application:
def __init__( self ):
self.scale = 13 # No of columns and rows in the grid
self.b_dim = 40 # Size of each square
self.c_dim = self.scale * self.b_dim
self.bound = self.c_dim - self.b_dim
self.center = int( self.scale / 2 ) * self.b_dim
self.win_con = self.scale**2 - 1
self.run = False
self.root = tk.Tk()
self.l = tk.Label(self.root, anchor = 'n', fg = '#000000', text = 'Press spacebar to start.')
self.c = tk.Canvas(self.root, bg = '#444444', width = self.c_dim, height = self.c_dim)
self.l.pack()
self.c.pack()
self.queue = queue.Queue()
self.root.bind('<space>',lambda e: self.start_game())
self.root.bind('<Left>', lambda e: self.new_dir([-1,0]))
self.root.bind('<Right>',lambda e: self.new_dir([1,0]))
self.root.bind('<Up>', lambda e: self.new_dir([0,-1]))
self.root.bind('<Down>', lambda e: self.new_dir([0,1]))
self.root.mainloop()
def start_game( self ):
if self.run == False:
self.run = True
self.count = 0
self.step = 50 / self.win_con
self.speed = 100
self.l['text'] = 'Score: 0/' + str(self.win_con)
self.snake_init()
self.food_init()
self.update()
def snake_init( self ):
self.head = [self.center,self.center]
self.body = [self.scale_init( self.head )]
self.body_pos = [self.head]
self.dir = [0,1]
self.last_dir = [0,1]
def update( self ):
if not self.queue.empty():
self.dir = self.queue.get()
self.head = [x + (y * self.b_dim) for x, y in zip( self.head, self.dir )]
if not ( 0 <= self.head[0] <= self.bound ) or not ( 0 <= self.head[1] <= self.bound ) or self.head in self.body_pos:
self.reset( 'lose' )
else:
self.body.insert( 0, self.scale_init( self.head ) )
self.body_pos.insert( 0, self.head )
if self.head == self.f_pos:
self.c.delete( self.food )
self.count += 1
self.speed -= self.step
self.l['text'] = 'Score: ' + str(self.count) + '/' + str(self.win_con)
if self.count != self.win_con:
self.food_init()
else:
self.reset( 'win' )
else:
self.c.delete( self.body.pop() )
self.body_pos.pop()
if self.run == True:
self.c.after( int(self.speed) , self.update )
def new_dir( self, vector ):
if self.last_dir != [ -x for x in vector ]:
self.queue.put(vector)
self.last_dir = vector
    def scale_init( self, pos ):
        # Draw one snake square; keep the canvas item id in a local name so
        # the grid size stored in self.scale is not overwritten.
        square = self.c.create_rectangle(
            pos[0],
            pos[1],
            pos[0] + self.b_dim,
            pos[1] + self.b_dim,
            fill = '#ffffff')
        return square
def food_init( self ):
while True:
            # stop at c_dim (not bound) so food can also spawn in the last row/column
            loc = [randrange(0, self.c_dim, self.b_dim), randrange(0, self.c_dim, self.b_dim)]
if not loc in self.body_pos:
self.f_pos = loc
self.food = self.c.create_rectangle(
self.f_pos[0],
self.f_pos[1],
self.f_pos[0] + self.b_dim,
self.f_pos[1] + self.b_dim,
fill = '#871719')
break
def reset( self, condition ):
self.run = False
self.c.delete( self.food )
while len( self.body ) > 0:
self.c.delete( self.body.pop() )
if condition == 'win':
self.l['text'] = 'You Win! Press spacebar to play again.'
else:
self.l['text'] = 'Score: ' + str( self.count ) + ' Press spacebar to play again.'
if __name__ == "__main__":
app = Application()
<file_sep>/README.md
# fun-projects
Personal coding challenges
| 6e2b051836c5c9124d1904aff92456e9d707da11 | [
"Markdown",
"Python"
]
| 4 | Python | spriore/fun-projects | fd0424cba3d69be0e8e44ddde698c209556affbe | 4746cc23328fc4d32bdc9a71f42155fa0febc63b |
refs/heads/master | <repo_name>McKnightRider/Test-Geolocation2<file_sep>/README.md
# Test-Geolocation2
Revised geolocation app to try to improve accuracy
To view the website, go to mcknightrider.github.io/Test-Geolocation2
<file_sep>/src/scripts.js
const holeBtn = document.getElementById("holeBtn");
const distBtn = document.getElementById("distBtn");
const finishBtn = document.getElementById("finishRound");
const distTable = document.getElementById("distTable");
let array = []; // Adding four elements per point: latitude, longitude, accuracy and distance
let hole = 0; // Count hole number
let shots = 0; // Count shots on hole
let roundStarted = false; // This indicates that the round has started
let holeStarted = false; // This indicates that the hole has started
// The haversine formula is commonly used to calculate the great-circle distance between two points on a sphere
const R = 6371e3; // metres
const metresToYards = 1.09361;
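The haversine maths that getDist() performs inline can be factored into a small pure function. This is an illustrative sketch only — haversineMetres is our name, not part of the app:

```javascript
// Illustrative refactor of the inline haversine maths in getDist() below.
function haversineMetres(lat1, lon1, lat2, lon2) {
  const R = 6371e3; // mean Earth radius in metres
  const toRad = deg => deg * Math.PI / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return R * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
}
// One degree of latitude is roughly 111 km:
console.log(Math.round(haversineMetres(51, 0, 52, 0))); // 111195
```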
/* const options = {
enableHighAccuracy: true,
timeout: 5000,
maximumAge: 0
}; */
function success(pos) {
let crd = pos.coords;
console.log('Your current position is:');
console.log(`Latitude : ${crd.latitude}`);
console.log(`Longitude: ${crd.longitude}`);
console.log(`More or less ${crd.accuracy} meters.`);
//showPosition(pos);
if (holeStarted === false) {
array.push(crd.latitude, crd.longitude, crd.accuracy, 0);
/* document.getElementById("startLat").innerHTML = Math.round(crd.latitude * 10000) / 10000 + "\xB0";
document.getElementById("startLon").innerHTML = Math.round(crd.longitude * 10000) / 10000 + "\xB0";
document.getElementById("startAcc").innerHTML = Math.round(crd.accuracy * 10000) / 10000; */
//holeStarted = true;
distTable.lastChild.innerHTML = `<td>${hole} Tee</td>
<td>${Math.round(array[array.length - 4] * 10000)/10000}\xB0</td>
<td>${Math.round(array[array.length - 3] * 10000)/10000}\xB0</td>
<td>${Math.round(array[array.length - 2] * 10000)/10000}</td>
<td>-</td>
<td>-</td>`;
holeStarted = true;
shots = 0;
} else {
array.push(crd.latitude, crd.longitude, crd.accuracy);
getDist(array);
}
shots++;
console.log("The current array is: " + array);
}
function error(err) {
console.warn(`ERROR(${err.code}): ${err.message}`);
}
/* function showPosition(pos) {
var latlon = pos.coords.latitude + "," + pos.coords.longitude;
var img_url = "https://maps.googleapis.com/maps/api/staticmap?center="+latlon+"&zoom=14&size=400x300&sensor=false&key=<KEY>";
document.getElementById("mapholder").innerHTML = "<img src='"+img_url+"'>";
} */
function getDist(array) {
// Now doing the haversine formula calculations
let lat_new = array[array.length - 3];
console.log("lat_new is " + lat_new);
let lat_old = array[array.length - 7];
let lon_new = array[array.length - 2];
let lon_old = array[array.length - 6];
//let x = lat_new - lat_old;
let φ1 = toRadians(lat_old);
let φ2 = toRadians(lat_new);
let Δφ = toRadians(lat_new - lat_old);
let Δλ = toRadians(lon_new - lon_old);
let a = Math.sin(Δφ/2) * Math.sin(Δφ/2) +
Math.cos(φ1) * Math.cos(φ2) *
Math.sin(Δλ/2) * Math.sin(Δλ/2);
let c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1-a));
let d = R * c;
console.log("The distance from the last point is " + d + " metres.");
array.push(d);
console.log("The number of shots is: " + shots);
distTable.lastChild.innerHTML = `<td>Shot ${shots}</td>
<td>${Math.round(array[array.length - 4] * 10000)/10000}\xB0</td>
<td>${Math.round(array[array.length - 3] * 10000)/10000}\xB0</td>
<td>${Math.round(array[array.length - 2] * 10000)/10000}</td>
<td>${Math.round(array[array.length - 1]).toLocaleString()}</td>
<td>${Math.round(array[array.length - 1] * metresToYards).toLocaleString()}</td>`;
if (shots === 1) {
let driveText = "drive" + hole;
console.log(`The hole number is ${hole} and driveText is ${driveText}.`);
document.getElementById(driveText).innerHTML = Math.round(array[array.length - 1] * metresToYards).toLocaleString();
}
console.log(`The current array is ${array.length} long.`);
/* if (hole === 9) {
console.log("We are now on 9 holes.");
const tr3 = document.createElement("tr");
statsTable.appendChild(tr3);
} */
};
function toRadians(num) {
return num * Math.PI/180;
}
//putting success and error in below as functions without the trailing () makes them callbacks apparently
//navigator.geolocation.getCurrentPosition(success, error, options);
//array.push(success());
//need to put things in the success() function if they are to wait for the coordinates to be available
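The comments above are the key point: `success` and `error` are passed as values, and the geolocation API invokes them later. A minimal sketch with a hypothetical stand-in (fakeGetPosition is not a real API) shows the pattern without touching the browser:

```javascript
// fakeGetPosition is a hypothetical stand-in for
// navigator.geolocation.getCurrentPosition; it shows why the app passes
// success (no parentheses) rather than success() — the API calls it later.
function fakeGetPosition(onSuccess) {
  const pos = { coords: { latitude: 51.5, longitude: -0.12, accuracy: 5 } };
  onSuccess(pos); // the API, not our code, decides when this runs
}
let received = null;
fakeGetPosition(pos => { received = pos.coords.latitude; });
console.log(received); // 51.5
```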
distBtn.addEventListener("click", function() {
console.log(`You have now pressed the button ${shots + 1} times.`)
if (holeStarted === false) {
//holeStarted = true; // Have moved this to the success() function
distBtn.innerHTML = "Mark";
} else {
const tr = document.createElement("tr");
distTable.appendChild(tr);
};
//navigator.geolocation.getCurrentPosition(success, error, options);
    // pass the callbacks, as the polyfill signature mirrors getCurrentPosition(onSuccess, onError, ...)
    navigator.geolocation.getAccurateCurrentPosition(success, error);
//letsGeolocate();
});
holeBtn.addEventListener("click", function() {
const tr2 = document.createElement("tr");
if (roundStarted === false) {
roundStarted = true;
holeBtn.innerHTML = "Next Hole";
distBtn.disabled = false;
finishBtn.disabled = false;
};
hole++;
distBtn.innerHTML = "Drive";
distTable.appendChild(tr2);
distTable.lastChild.innerHTML = `<td>${hole} Tee</td>
<td></td>
<td></td>
<td></td>
<td>-</td>
<td>-</td>`;
holeStarted = false;
document.getElementById(`row${hole - 1}`).style.backgroundColor = "yellow";
if (hole > 1) {
document.getElementById(`row${hole - 2}`).style.backgroundColor = "aqua";
}
if (hole == 18) {
holeBtn.disabled = true;
}
});
/* function setDate() {
let today = new Date();
let dd = today.getDate();
let mm = today.getMonth(); //January is 0
let yyyy = today.getFullYear();
let months = ["January", "February", "March", "April", "May", "June", "July", "August", "September", "October", "November", "December"];
today = dd + ' ' + months[mm] + ' ' + yyyy;
document.getElementById("date").innerHTML = today;
};
setDate();*/
function myFunction() {
console.log("this is: " + typeof(this));
document.getElementById("menu1").innerHTML = $(this[1]).text();
console.log("myFunction() is running");
};
class Course {
constructor(arr) {
this.holeYards = arr[0];
this.holePars = arr[1];
this.ss = arr[2];
this.name = arr[3];
this.holeSI = arr[4];
this.tees = arr[5];
this.date = arr[6];
this.handicap = arr[7];
}
calcFront9(str) {
let sum = 0;
for (var i = 0; i < 9; i++) {
if (str === "length") {
sum += this.holeYards[i];
} else if (str === "par") {
sum += this.holePars[i];
}
}
return sum;
}
calcBack9(str) {
let sum = 0;
for (var i = 9; i < 18; i++) {
if (str === "length") {
sum += this.holeYards[i];
} else if (str === "par") {
sum += this.holePars[i];
}
}
return sum;
}
calc(str) {
let sum = 0;
for (var i = 0; i < 18; i++) {
if (str === "length") {
sum += this.holeYards[i];
} else if (str === "par") {
sum += this.holePars[i];
}
}
return sum;
}
getYards() {
return this.holeYards;
}
getPars() {
return this.holePars;
}
getSS() {
return this.ss;
}
getName() {
return this.name;
}
getSI() {
return this.holeSI;
}
};
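The front-nine/back-nine aggregation in Course boils down to summing a range of an 18-entry array. A stand-alone sketch (sumHoles is an illustrative name), using the pars from the commented-out gxWhitePars sample below:

```javascript
// sumHoles mirrors Course.calcFront9/calcBack9: sum the half-open
// range [start, end) of an 18-hole array.
function sumHoles(values, start, end) {
  let sum = 0;
  for (let i = start; i < end; i++) {
    sum += values[i];
  }
  return sum;
}
const pars = [4, 4, 4, 4, 3, 4, 5, 3, 4, 4, 4, 4, 3, 4, 4, 3, 4, 4];
console.log(sumHoles(pars, 0, 9));  // 35 (front nine)
console.log(sumHoles(pars, 9, 18)); // 34 (back nine)
```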
// Now trying to get this data from Firebase
/* const gxWhiteYards = [367, 452, 364, 411, 187, 277, 526, 188, 392, 384, 409, 371, 157, 390, 406, 172, 350, 440];
const gxWhitePars = [4, 4, 4, 4, 3, 4, 5, 3, 4, 4, 4, 4, 3, 4, 4, 3, 4, 4];
const gxWhiteSS = 71; */
//const gxWhite = new Course([gxWhiteYards, gxWhitePars, gxWhiteSS]);
/* const gxYellowYards = [353, 437, 343, 386, 176, 254, 513, 177, 389, 379, 355, 361, 144, 386, 399, 157, 335, 421];
const gxYellowPars = [4, 4, 4, 4, 3, 4, 5, 3, 4, 4, 4, 4, 3, 4, 4, 3, 4, 4];
const gxYellowSS = 70; */
function getData(courseTeesArray) {
    for (let i = 0; i < courseTeesArray.length; i++) {
console.log(`courseTeesArray[${i}] is ${courseTeesArray[i]}.`)
}
const url = "https://simple-golf-app.firebaseio.com/courses.json";
let networkDataReceived = false;
fetch(url)
.then(function(res) {
return res.json();
})
.then(function(data) {
networkDataReceived = true;
console.log('From web', data);
//var dataArray = [];
//var dataObj = data;
let setKey = "";
for (let key in data) {
console.log("key: " + key + ", data: " + data[key]);
//console.log("Seeing if key name prints: " + dataObj[key].name);
if (data[key].name === courseTeesArray[0] && data[key].tees === courseTeesArray[1]) {
setKey = key;
console.log("setKey is: " + setKey);
}
//dataArray.push(data[key]);
//console.log("data: " + data.gxWhite.holePars);
}
// Try to adapt this snippet
/* var result = jsObjects.filter(obj => {
return obj.b === 6
}) */
setCourseDetails2(data[setKey]);
//setGXWhite(data[setKey]);
//updateUI(dataArray);
//console.log("Test print: " + dataArray[0].holeYards);
});
//.then(setGXWhite(dataObj));
}
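getData() scans the fetched Firebase payload for the record whose name and tees match the dropdown selection. The same lookup, extracted into a testable helper (findCourseKey and the sample object are illustrative):

```javascript
// Illustrative extraction of the key-matching loop inside getData().
function findCourseKey(data, name, tees) {
  for (const key in data) {
    if (data[key].name === name && data[key].tees === tees) {
      return key;
    }
  }
  return null; // no matching course/tees combination
}
const sample = {
  k1: { name: "Gx", tees: "White" },
  k2: { name: "Gx", tees: "Yellow" }
};
console.log(findCourseKey(sample, "Gx", "Yellow")); // k2
```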
function setCourseDetails2(dataObj) {
console.log("Here! " + dataObj.name);
document.getElementById("ss").value = dataObj.ss;
    // keep the array aligned with the Course constructor: tees is arr[5]
    const courseObject = new Course([dataObj.holeYards, dataObj.holePars, dataObj.ss, dataObj.name, dataObj.holeSI, dataObj.tees, dataObj.date, dataObj.handicap]);
for (let i = 0; i < dataObj.holeYards.length; i++) {
//setCourseDetails(newCourse);
document.getElementById(`yards${i + 1}`).innerHTML = courseObject.getYards()[i];
document.getElementById("yardsFront9").innerHTML = courseObject.calcFront9("length");
document.getElementById("yardsBack9").innerHTML = courseObject.calcBack9("length");
document.getElementById("yardsAll").innerHTML = courseObject.calcFront9("length") + courseObject.calcBack9("length");
document.getElementById(`par${i + 1}`).innerHTML = courseObject.getPars()[i];
document.getElementById("parFront9").innerHTML = courseObject.calcFront9("par");
document.getElementById("parBack9").innerHTML = courseObject.calcBack9("par");
document.getElementById("parAll").innerHTML = courseObject.calcFront9("par") + courseObject.calcBack9("par");
document.getElementById(`si${i + 1}`).innerHTML = courseObject.getSI()[i];
}
}
function setCourse() {
if (document.getElementById("course").value === "other") {
modal.style.display = "block";
    } else if (document.getElementById("course").value !== "nil" && document.getElementById("course").value !== "other" && document.getElementById("tees").value !== "nil") {
let indexCourse = document.getElementById("course").selectedIndex;
let indexTees = document.getElementById("tees").selectedIndex;
getData([document.getElementById("course")[indexCourse].innerHTML, document.getElementById("tees")[indexTees].innerHTML]);
}
};
let scoreDropdownFront9 = `<select class="scores scoresFront9" onchange="sumScoresPutts('scores'); calcStableford();">
<option disabled selected value=0>--select--</option>
<option value=1>1</option>
<option value=2>2</option>
<option value=3>3</option>
<option value=4>4</option>
<option value=5>5</option>
<option value=6>6</option>
<option value=7>7</option>
<option value=8>8</option>
<option value=9>9</option>
<option value=10>Other</option>
<option value=0>-</option>
</select>`;
let scoreDropdownBack9 = `<select class="scores scoresBack9" onchange="sumScoresPutts('scores'); calcStableford();">
<option disabled selected value=0>--select--</option>
<option value=1>1</option>
<option value=2>2</option>
<option value=3>3</option>
<option value=4>4</option>
<option value=5>5</option>
<option value=6>6</option>
<option value=7>7</option>
<option value=8>8</option>
<option value=9>9</option>
<option value=10>Other</option>
<option value=0>-</option>
</select>`;
let sumScoresPutts = function(stat) {
let statFront9 = stat + "Front9";
let statBack9 = stat + "Back9";
let statAll = stat + "All";
let sumFront9 = 0;
let sumBack9 = 0;
if (stat === "stableford") {
let front9Scores = document.getElementsByClassName(statFront9);
for (let i = 0; i < front9Scores.length; i++) {
if (Number.isInteger(parseInt(front9Scores[i].innerHTML))) {
sumFront9 += parseInt(front9Scores[i].innerHTML);
}
}
let back9Scores = document.getElementsByClassName(statBack9);
for (let i = 0; i < back9Scores.length; i++) {
if (Number.isInteger(parseInt(back9Scores[i].innerHTML))) {
sumBack9 += parseInt(back9Scores[i].innerHTML);
}
}
} else if (stat !== "strokesGained") {
let front9Scores = document.getElementsByClassName(statFront9);
for (let i = 0; i < front9Scores.length; i++) {
if (Number.isInteger(parseInt(front9Scores[i].value))) {
sumFront9 += parseInt(front9Scores[i].value);
}
}
let back9Scores = document.getElementsByClassName(statBack9);
for (let i = 0; i < back9Scores.length; i++) {
if (Number.isInteger(parseInt(back9Scores[i].value))) {
sumBack9 += parseInt(back9Scores[i].value);
}
}
} else {
let front9Scores = document.getElementsByClassName(statFront9);
for (let i = 0; i < front9Scores.length; i++) {
if (parseFloat(front9Scores[i].innerHTML)) {
sumFront9 += parseFloat(front9Scores[i].innerHTML);
}
}
let back9Scores = document.getElementsByClassName(statBack9);
for (let i = 0; i < back9Scores.length; i++) {
if (parseFloat(back9Scores[i].innerHTML)) {
sumBack9 += parseFloat(back9Scores[i].innerHTML);
}
}
}
// Need the * 1000 / 1000 as decimals used with strokes gained in this function
document.getElementById(statFront9).innerHTML = Math.round(sumFront9 * 1000) / 1000;
document.getElementById(statBack9).innerHTML = Math.round(sumBack9 * 1000) / 1000;
document.getElementById(statAll).innerHTML = Math.round((sumFront9 + sumBack9) * 1000) / 1000;
}
let fairwaysDropdownFront9 = `<select class="fairways fairwaysFront9" onchange="sumStats('fairways')">
<option disabled selected value="nil">--select--</option>
<option value="yes">\u2714</option>
<option value="left">Left</option>
<option value="right">Right</option>
<option value="bunker">Bunker</option>
<option value="duffed">Duffed</option>
<option value="over">Over</option>
<option value="other">Other</option>
<option value="nil">-</option>
</select>`;
let fairwaysDropdownBack9 = `<select class="fairways fairwaysBack9" onchange="sumStats('fairways')">
<option disabled selected value="nil">--select--</option>
<option value="yes">\u2714</option>
<option value="left">Left</option>
<option value="right">Right</option>
<option value="bunker">Bunker</option>
<option value="duffed">Duffed</option>
<option value="over">Over</option>
<option value="other">Other</option>
<option value="nil">-</option>
</select>`;
const sumStats = function(stat) {
let statFront9 = stat + "Front9";
let statBack9 = stat + "Back9";
let statAll = stat + "All";
let hitFront9 = 0;
let allFront9 = 0;
let hitBack9 = 0;
let allBack9 = 0;
let front9Sum = document.getElementsByClassName(statFront9);
for (let i = 0; i < front9Sum.length; i++) {
if (front9Sum[i].value.indexOf("yes") !== -1) {
hitFront9++;
allFront9++;
} else if (front9Sum[i].value !== "nil") {
allFront9++;
}
}
let back9Sum = document.getElementsByClassName(statBack9);
for (let i = 0; i < back9Sum.length; i++) {
if (back9Sum[i].value.indexOf("yes") !== -1) {
hitBack9++;
allBack9++;
} else if (back9Sum[i].value !== "nil") {
allBack9++;
}
}
document.getElementById(statFront9).innerHTML = `${hitFront9} of ${allFront9}`;
document.getElementById(statBack9).innerHTML = `${hitBack9} of ${allBack9}`;
document.getElementById(statAll).innerHTML = `${hitFront9 + hitBack9} of ${allFront9 + allBack9}`;
}
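sumStats() counts any option value containing "yes" (including "yes<" and "yes>") as a hit, skips "nil" placeholders, and counts everything else as an attempt. The same tally, extracted as a pure helper (countHits is an illustrative name):

```javascript
// Illustrative extraction of the hit/attempt tallying in sumStats().
function countHits(values) {
  let hit = 0;
  let all = 0;
  for (const v of values) {
    if (v.indexOf("yes") !== -1) {
      hit++;
      all++;
    } else if (v !== "nil") {
      all++; // a miss still counts as an attempt
    }
  }
  return `${hit} of ${all}`;
}
console.log(countHits(["yes", "left", "nil", "yes>", "bunker"])); // 2 of 4
```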
let greensDropdownFront9 = `<select class="greens greensFront9" onchange="sumStats('greens')">
<option disabled selected value="nil">--select--</option>
    <option value="yes<">\u2714 <= 10 ft</option>
    <option value="yes>">\u2714 > 10 ft</option>
<option value="over">Over</option>
<option value="left">Left</option>
<option value="right">Right</option>
<option value="short">Short</option>
<option value="wellOff">Well Off</option>
<option value=nil>-</option>
</select>`;
let greensDropdownBack9 = `<select class="greens greensBack9" onchange="sumStats('greens')">
<option disabled selected value="nil">--select--</option>
    <option value="yes<">\u2714 <= 10 ft</option>
    <option value="yes>">\u2714 > 10 ft</option>
<option value="over">Over</option>
<option value="left">Left</option>
<option value="right">Right</option>
<option value="short">Short</option>
<option value="wellOff">Well Off</option>
<option value=nil>-</option>
</select>`;
let upAndDownDropdownFront9 = `<select class="upAndDown upAndDownFront9" onchange="sumStats('upAndDown')">
<option disabled selected value="nil">--select--</option>
<option value="yes">\u2714</option>
<option value="no">\u2715</option>
<option value=nil>-</option>
</select>`;
let upAndDownDropdownBack9 = `<select class="upAndDown upAndDownBack9" onchange="sumStats('upAndDown')">
<option disabled selected value="nil">--select--</option>
<option value="yes">\u2714</option>
<option value="no">\u2715</option>
<option value=nil>-</option>
</select>`;
let sandDropdownFront9 = `<select class="sand sandFront9" onchange="sumStats('sand')">
<option disabled selected value="nil">--select--</option>
<option value="yes">\u2714</option>
<option value="no">\u2715</option>
<option value=nil>-</option>
</select>`;
let sandDropdownBack9 = `<select class="sand sandBack9" onchange="sumStats('sand')">
<option disabled selected value="nil">--select--</option>
<option value="yes">\u2714</option>
<option value="no">\u2715</option>
<option value=nil>-</option>
</select>`;
let puttsDropdownFront9 = `<select class="putts puttsFront9" onchange="sumScoresPutts('putts')">
<option disabled selected value="nil">--select--</option>
<option value=0>0</option>
<option value=1>1</option>
<option value=2>2</option>
<option value=3>3</option>
<option value=4>4</option>
<option value=5>5</option>
<option value="nil">Other</option>
<option value="nil">-</option>
</select>`;
let puttsDropdownBack9 = `<select class="putts puttsBack9" onchange="sumScoresPutts('putts')">
<option disabled selected value="nil">--select--</option>
<option value=0>0</option>
<option value=1>1</option>
<option value=2>2</option>
<option value=3>3</option>
<option value=4>4</option>
<option value=5>5</option>
<option value="nil">Other</option>
<option value="nil">-</option>
</select>`;
let driveClubDropdownFront9 = `<select class="driveClub driveClubFront9" onchange="aveDriverDist()">
<option disabled selected value="nil">--select--</option>
<option value="DR">Driver</option>
<option value="3W">3 Wood</option>
<option value="17H">17\xB0 Hybrid</option>
<option value="20H">20\xB0 Hybrid</option>
<option value="23H">23\xB0 Hybrid</option>
<option value="5I">5 Iron</option>
<option value="6I">6 Iron</option>
<option value="7I">7 Iron</option>
<option value="8I">8 Iron</option>
<option value="9I">9 Iron</option>
<option value="PW">Pitching Wedge</option>
<option value="GW">Gap Wedge</option>
<option value="5SW">Sand Wedge</option>
<option value="nil">Other</option>
<option value="nil">-</option>
</select>`;
let driveClubDropdownBack9 = `<select class="driveClub driveClubBack9" onchange="aveDriverDist()">
<option disabled selected value="nil">--select--</option>
<option value="DR">Driver</option>
<option value="3W">3 Wood</option>
<option value="17H">17\xB0 Hybrid</option>
<option value="20H">20\xB0 Hybrid</option>
<option value="23H">23\xB0 Hybrid</option>
<option value="5I">5 Iron</option>
<option value="6I">6 Iron</option>
<option value="7I">7 Iron</option>
<option value="8I">8 Iron</option>
<option value="9I">9 Iron</option>
<option value="PW">Pitching Wedge</option>
<option value="GW">Gap Wedge</option>
<option value="5SW">Sand Wedge</option>
<option value="nil">Other</option>
<option value="nil">-</option>
</select>`;
let avePuttDist = function() {
let sumFront9 = 0;
let countFront9 = 0;
let sumBack9 = 0;
let countBack9 = 0;
let puttDistsFront9 = document.getElementsByClassName("puttDistFront9");
for (let i = 0; i < puttDistsFront9.length; i++) {
if (Number.isInteger(parseInt(puttDistsFront9[i].value))) {
sumFront9 += parseInt(puttDistsFront9[i].value);
countFront9 += 1;
}
}
let puttDistsBack9 = document.getElementsByClassName("puttDistBack9");
for (let i = 0; i < puttDistsBack9.length; i++) {
if (Number.isInteger(parseInt(puttDistsBack9[i].value))) {
sumBack9 += parseInt(puttDistsBack9[i].value);
countBack9 += 1;
}
}
if (countFront9 > 0) {
document.getElementById("avePuttDistFront9").innerHTML = "Ave: " + Math.round(sumFront9/countFront9 * 10) / 10;
}
if (countBack9 > 0) {
document.getElementById("avePuttDistBack9").innerHTML = "Ave: " + Math.round(sumBack9/countBack9 * 10) / 10;
}
document.getElementById("avePuttDistAll").innerHTML = "Ave: " + Math.round(10 * (sumFront9 + sumBack9)/(countFront9 + countBack9)) / 10;
}
document.addEventListener('DOMContentLoaded', function() {
setTable();
}, false);
function setTable() {
for (let i = 0; i < 18; i++) {
const tr1 = document.createElement("tr");
tr1.setAttribute("id", `row${i}`);
statsTable.appendChild(tr1);
if (i < 9) {
statsTable.lastChild.innerHTML = `<td>${i + 1}</td>
<td id="yards${i + 1}"></td>
<td id="par${i + 1}"></td>
<td id="si${i + 1}"></td>
<td id="score${i + 1}">${scoreDropdownFront9}</td>
<td id="stableford${i + 1}" class="stableford stablefordFront9 stablefordColumn"></td>
<td>${fairwaysDropdownFront9}</td>
<td>${driveClubDropdownFront9}</td>
<td id="drive${i + 1}" class="driveDists"></td>
<td>${greensDropdownFront9}</td>
<td>${upAndDownDropdownFront9}</td>
<td>${sandDropdownFront9}</td>
<td id="putts${i + 1}">${puttsDropdownFront9}</td>
<td><input type="number" id="dist${i + 1}" class="puttDistFront9" onchange="avePuttDist(); calcStrokesGained(value, ${i + 1});"></td>
<td id="strokesGained${i+1}" class="strokesGainedFront9"></td>`;
} else {
statsTable.lastChild.innerHTML = `<td>${i + 1}</td>
<td id="yards${i + 1}"></td>
<td id="par${i + 1}"></td>
<td id="si${i + 1}"></td>
<td id="score${i + 1}">${scoreDropdownBack9}</td>
<td id="stableford${i + 1}" class="stableford stablefordBack9 stablefordColumn"></td>
<td>${fairwaysDropdownBack9}</td>
<td>${driveClubDropdownBack9}</td>
<td id="drive${i + 1}" class="driveDists"></td>
<td>${greensDropdownBack9}</td>
<td>${upAndDownDropdownBack9}</td>
<td>${sandDropdownBack9}</td>
<td id="putts${i + 1}">${puttsDropdownBack9}</td>
<td><input type="number" id="dist${i + 1}" class="puttDistBack9" onchange="avePuttDist(); calcStrokesGained(value, ${i + 1});"></td>
<td id="strokesGained${i+1}" class="strokesGainedBack9"></td>`;
}
if (i + 1 === 9) {
const tr2 = document.createElement("tr");
statsTable.appendChild(tr2);
statsTable.lastChild.innerHTML = `<td>Front 9</td>
<td id="yardsFront9"></td>
<td id="parFront9"></td>
<td></td>
<td id="scoresFront9">0</td>
<td id="stablefordFront9" class="stablefordColumn">0</td>
<td id="fairwaysFront9">0 of 0</td>
<td></td>
<td id="aveDriverDistFront9"></td>
<td id="greensFront9">0 of 0</td>
<td id="upAndDownFront9">0 of 0</td>
<td id="sandFront9">0 of 0</td>
<td id="puttsFront9">0</td>
<td id="avePuttDistFront9">-</td>
<td id="strokesGainedFront9">-</td>`;
}
if (i + 1 === 18) {
const tr2 = document.createElement("tr");
statsTable.appendChild(tr2);
statsTable.lastChild.innerHTML = `<td>Back 9</td>
<td id="yardsBack9"></td>
<td id="parBack9"></td>
<td></td>
<td id="scoresBack9">0</td>
<td id="stablefordBack9" class="stablefordColumn">0</td>
<td id="fairwaysBack9">0 of 0</td>
<td></td>
<td id="aveDriverDistBack9"></td>
<td id="greensBack9">0 of 0</td>
<td id="upAndDownBack9">0 of 0</td>
<td id="sandBack9">0 of 0</td>
<td id="puttsBack9">0</td>
<td id="avePuttDistBack9">-</td>
<td id="strokesGainedBack9">-</td>`;
const tr3 = document.createElement("tr");
statsTable.appendChild(tr3);
statsTable.lastChild.innerHTML = `<td>Total</td>
<td id="yardsAll"></td>
<td id="parAll"></td>
<td></td>
<td id="scoresAll">0</td>
<td id="stablefordAll" class="stablefordColumn">0</td>
<td id="fairwaysAll">0 of 0</td>
<td></td>
<td id="aveDriverDistAll"></td>
<td id="greensAll">0 of 0</td>
<td id="upAndDownAll">0 of 0</td>
<td id="sandAll">0 of 0</td>
<td id="puttsAll">0</td>
<td id="avePuttDistAll">-</td>
<td id="strokesGainedAll">-</td>`;
}
}
}
// PGA stats from 2010 at https://thesandtrap.com/forums/topic/51757-pga-tour-putts-gainedmake-percentage-stats/
// Fortunately the data is for each foot (1 to 100 feet) with no gaps, so easy to put in one-dimensional array
// Array number is expected number of putts to be taken by PGA tour players from that distance
let expectedPutts = [1.001, 1.009, 1.053, 1.147, 1.256, 1.357, 1.443, 1.515, 1.575, 1.626];
let a2 = [1.669, 1.705, 1.737, 1.765, 1.79, 1.811, 1.83, 1.848, 1.863, 1.878];
let a3 = [1.891, 1.903, 1.914, 1.924, 1.934, 1.944, 1.953, 1.961, 1.97, 1.978];
let a4 = [1.993, 1.993, 2.001, 2.009, 2.016, 2.024, 2.032, 2.039, 2.047, 2.055];
let a5 = [2.062, 2.07, 2.078, 2.086, 2.094, 2.102, 2.11, 2.118, 2.127, 2.135];
let a6 = [2.143, 2.152, 2.16, 2.168, 2.177, 2.185, 2.193, 2.202, 2.21, 2.218];
let a7 = [2.226, 2.234, 2.242, 2.25, 2.257, 2.265, 2.272, 2.279, 2.286, 2.293];
let a8 = [2.299, 2.306, 2.312, 2.318, 2.324, 2.329, 2.334, 2.339, 2.344, 2.349];
let a9 = [2.353, 2.357, 2.361, 2.364, 2.367, 2.37, 2.373, 2.375, 2.377, 2.379];
let a10 = [2.381, 2.382, 2.383, 2.384, 2.384, 2.384, 2.384, 2.383, 2.383, 2.382];
// Append a2..a10 to the lookup table (spread syntax instead of eval())
for (const arr of [a2, a3, a4, a5, a6, a7, a8, a9, a10]) {
    expectedPutts.push(...arr);
};
console.log("The length of the array is: " + expectedPutts.length);
console.log(`The expected number of putts from 20 feet is ${expectedPutts[19]}.`);
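calcStrokesGained() below subtracts the putts actually taken from the PGA baseline for the first-putt distance. A minimal sketch of that formula — the two-entry baseline table here is illustrative; the app uses the full 100-foot expectedPutts array above:

```javascript
// Strokes gained putting = expected putts from that distance - putts taken,
// rounded to three decimals as in calcStrokesGained().
const baseline = { 8: 1.515, 20: 1.878 }; // feet -> expected putts (2010 PGA data)
function strokesGained(distFt, puttsTaken) {
  return Math.round((baseline[distFt] - puttsTaken) * 1000) / 1000;
}
console.log(strokesGained(20, 1)); // 0.878  (holing out from 20 ft)
console.log(strokesGained(8, 2));  // -0.485 (two-putting from 8 ft)
```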
function calcStrokesGained(dist, hole) {
console.log("The hole is: " + hole);
console.log("The distance is: " + dist);
let x = "strokesGained" + hole;
console.log(`x is ${x}.`);
let y = "putts" + hole;
console.log(`y is ${y}.`);
let z = "dist" + hole;
console.log(`The value in y is ${document.getElementById(y).firstChild.value}.`);
if (document.getElementById(y).firstChild.value === "nil") {
console.warn("Please enter the number of putts first before the first putt distance.");
document.getElementById(z).value = 0;
} else {
document.getElementById(x).innerHTML = Math.round((expectedPutts[dist - 1] - document.getElementById(y).firstChild.value) * 1000) / 1000;
console.log("Going to sumScoresPutts() function.");
sumScoresPutts("strokesGained");
}
};
finishBtn.addEventListener("click", function() {
//alert("This button doesn't do anything yet. It will send data to database in due course.");
let statsObject = {
date: {},
course: {},
tees: {},
competition: {},
format: {},
css: {},
handicap: {},
scores: {},
stableford: {},
fairways: [],
driveClub: [],
driveDist: [],
greens: [],
upAndDown: [],
sand: [],
putts: [],
puttDists: [],
strokesGained: []
};
// For numerical inputs - currently scores, to add putts, puttsDist and strokesGained later
let x = document.getElementsByClassName("scores");
let w = document.getElementsByClassName("stableford");
let checkNull = true;
// Set basic stats
//statsObject.date = document.getElementById("date").innerHTML;
statsObject.date = document.getElementById("date2").value;
let indexCourse = document.getElementById("course").selectedIndex;
let indexTees = document.getElementById("tees").selectedIndex;
statsObject.course = document.getElementById("course")[indexCourse].innerHTML;
statsObject.tees = document.getElementById("tees")[indexTees].innerHTML;
statsObject.handicap = document.getElementById("handicap").value;
statsObject.competition = document.getElementById("competition").value;
let indexFormat = document.getElementById("format").selectedIndex;
statsObject.format = document.getElementById("format")[indexFormat].innerHTML;
statsObject.css = document.getElementById("css").value;
for (let i = 0; i < x.length; i++) {
if (Number.isInteger(parseInt(x[i].value))) {
statsObject.scores[i] = parseInt(x[i].value);
//statsObject.scores.push(parseInt(x[i].value));
} else {
statsObject.scores[i] = "-";
checkNull = false;
}
}
for (let i = 0; i < w.length; i++) {
if (Number.isInteger(parseInt(w[i].innerHTML))) {
statsObject.stableford[i] = parseInt(w[i].innerHTML);
//statsObject.scores.push(parseInt(x[i].value));
} else {
statsObject.stableford[i] = "-";
}
}
// For non-numerical inputs
let y = ["fairways", "driveClub", "driveDist", "greens", "upAndDown", "sand"];
for (let i = 0; i < y.length; i++) {
let z = document.getElementsByClassName(y[i]);
console.log("y[i] is " + y[i]);
console.log("z[0] is " + z[0]);
for (let j = 0; j < z.length; j++) {
let ind = z[j].selectedIndex;
console.log("index is: " + ind); //Need to deal with non-selected option
if (z[j].selectedIndex !== 0) {
statsObject[y[i]][j] = z[j][ind].innerHTML;
console.log("z[j][ind].innerHTML is: " + z[j][ind].innerHTML);
} else {
statsObject[y[i]][j] = "-";
}
}
}
// Check if scores all completed before submitting to database
if (checkNull === false) {
var checking = confirm("Not all the scores are completed, do you still want to submit?");
if (checking) {
postDatabase("scores", statsObject);
holeBtn.disabled = true;
} else {
alert("Data not yet submitted. Complete the numbers and resubmit.")
}
} else {
postDatabase("scores", statsObject);
holeBtn.disabled = true;
}
});
/* function askNewCourseDetails() {
document.getElementById("newCourse").style.display = "block";
}; */
// Get the modal
var modal = document.getElementById('myModal');
// Get the button that opens the modal
//var btn = document.getElementById("myBtn");
// Get the <span> element that closes the modal
var span = document.getElementsByClassName("close")[0];
// When the user clicks the button, open the modal
/* btn.onclick = function() {
modal.style.display = "block";
} */
// When the user clicks on <span> (x), close the modal
span.onclick = function() {
modal.style.display = "none";
}
// When the user clicks anywhere outside of the modal, close it
window.onclick = function(event) {
if (event.target == modal) {
modal.style.display = "none";
}
}
document.getElementById("newCourseStatsBtn").addEventListener("click", function() {
modal.style.display = "none";
console.log("The newCourseStats button was pressed");
let newCourseName = capitaliseFirstLetters(document.getElementById("newCourseName").value);
let newCourseTees = capitaliseFirstLetters(document.getElementById("newCourseTees").value);
let newYards = document.getElementsByClassName("a");
let newPars = document.getElementsByClassName("b");
let newCourseSS = document.getElementById("c").value;
let newSI = document.getElementsByClassName("d");
let newCourseYards = [];
let newCoursePars = [];
let newCourseSI = [];
    for (let i = 0; i < newYards.length; i += 1) {
newCourseYards.push(parseInt(newYards[i].value));
newCoursePars.push(parseInt(newPars[i].value));
newCourseSI.push(parseInt(newSI[i].value));
};
console.log("newCourseYards is: " + newCourseYards);
console.log("newCoursePars is: " + newCoursePars);
console.log("newCourseSI is: " + newCourseSI);
var newCourse = new Course([newCourseYards, newCoursePars, newCourseSS, newCourseName, newCourseSI, newCourseTees]);
console.log("The new course is " + newCourse);
let opt = document.createElement("option");
opt.setAttribute("id", "nc");
opt.setAttribute("selected", true); // Focuses on this new option, but may apply on future page load if there
// Need to modify to cater for more than one word names, best set up as function
//opt.innerHTML = newCourseName[0].toUpperCase() + newCourseName.substr(1, newCourseName.length).toLowerCase();
opt.innerHTML = newCourse.name;
document.getElementById("course").appendChild(opt);
//document.getElementById("course").lastChild.innerHTML = "<option value=\"nc\">newCourseName</option>";
// Now to check if tee colour specified is in list, then select it if it is or add if not
let tees = document.getElementById("tees");
let newTees = true;
console.log("The number of tees is" + (tees.length - 1)); // Ignore first --select-- child
for (let i = 1; i < tees.length; i += 1) {
console.log(tees.children[i].innerHTML);
if (tees.children[i].innerHTML.toLowerCase() === newCourseTees.toLowerCase()) {
tees.children[i].selected = true;
newTees = false;
}
}
    if (newTees) {
let opt2 = document.createElement("option");
opt2.setAttribute("id", newCourseTees.toLowerCase());
//newTeesColour = newCourseTees[0].toUpperCase() + newCourseTees.substr(1, newCourseTees.length).toLowerCase();
//opt2.innerHTML = newTeesColour;
opt2.innerHTML = capitaliseFirstLetters(newCourseTees);
opt2.selected = true;
tees.appendChild(opt2);
}
// Set standard scratch
document.getElementById("ss").value = newCourse.ss;
// Add course yards and pars to statsTable
/* for (i = 0; i < newCourse.getYards().length; i++) {
setCourseDetails2(newCourse);
console.log("Just set the course details.");
}; */
setCourseDetails2(newCourse);
postDatabase("course", newCourse);
});
/* let front9Scores = document.getElementsByClassName(statFront9);
for (let i = 0; i < front9Scores.length; i++) {
if (Number.isInteger(parseInt(front9Scores[i].value))) {
sumFront9 += parseInt(front9Scores[i].value);
}
} */
function capitaliseFirstLetters(str) {
    let strArrayOld = str.split(" ");
    let strArrayNew = [];
    for (let i = 0; i < strArrayOld.length; i += 1) {
        strArrayNew.push(strArrayOld[i][0].toUpperCase() + strArrayOld[i].slice(1).toLowerCase());
    }
    return strArrayNew.join(" ");
}
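// The same title-casing idea can be sketched standalone (hypothetical helper name,
// duplicated purely for illustration so nothing above is shadowed):

```javascript
// Illustrative copy of the capitalisation logic: split on spaces, upper-case the
// first letter of each word, lower-case the rest, then rejoin.
function titleCaseDemo(str) {
    return str
        .split(" ")
        .map(word => word[0].toUpperCase() + word.slice(1).toLowerCase())
        .join(" ");
}

console.log(titleCaseDemo("st norman's"));    // "St Norman's"
console.log(titleCaseDemo("GERRARDS cross")); // "Gerrards Cross"
```

// As in the function above, words are assumed non-empty; a double space would throw.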
function aveDriverDist() {
let driveClubsFront9 = document.getElementsByClassName("driveClubFront9");
let driveClubsBack9 = document.getElementsByClassName("driveClubBack9");
//let driveClubsAll = document.getElementsByClassName("driveClubAll");
//console.log("driveClubs is " + driveClubs);
    let sumFront9 = 0;
    let countFront9 = 0;
    let sumBack9 = 0;
    let countBack9 = 0;
    let sumAll = 0;
    let countAll = 0;
    for (let i = 0; i < driveClubsFront9.length; i += 1) {
        if (driveClubsFront9[i].value === "DR") {
            // Look up the drive-distance cell by its id rather than via eval()
            let dist = parseInt(document.getElementById(`drive${i + 1}`).innerHTML, 10);
            sumFront9 += dist;
            countFront9 += 1;
            sumAll += dist;
            countAll += 1;
        }
    }
    for (let i = 0; i < driveClubsBack9.length; i += 1) {
        if (driveClubsBack9[i].value === "DR") {
            // Look up the drive-distance cell by its id rather than via eval()
            let dist = parseInt(document.getElementById(`drive${i + 10}`).innerHTML, 10);
            sumBack9 += dist;
            countBack9 += 1;
            sumAll += dist;
            countAll += 1;
        }
    }
    if (countFront9 > 0) {
        document.getElementById("aveDriverDistFront9").innerHTML = Math.round(sumFront9 / countFront9 * 1000) / 1000;
    }
    if (countBack9 > 0) {
        document.getElementById("aveDriverDistBack9").innerHTML = Math.round(sumBack9 / countBack9 * 1000) / 1000;
    }
    if (countAll > 0) {
        document.getElementById("aveDriverDistAll").innerHTML = Math.round(sumAll / countAll * 1000) / 1000;
    }
}
function postDatabase(str, obj) {
let url = "";
if (str === "course") {
url = "https://simple-golf-app.firebaseio.com/courses.json";
} else if (str === "scores") {
url = "https://simple-golf-app.firebaseio.com/scores.json";
    } else {
        console.log("Error with postDatabase() url call.");
    }
var networkDataReceived = false;
fetch(url, {
method: "POST",
headers: {
"Content-type": "application/json",
"Accept": "application/json"
},
/* body: JSON.stringify({
course: "St Norman's",
competition: "Goose Salver"
}) */
body: JSON.stringify(obj)
})
.then(function(res) {
return res.json();
})
.then(function(data) {
networkDataReceived = true;
alert("Data successfully submitted!");
});
}
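// For reference, the request body sent above is just the JSON-serialised object;
// a standalone illustration (sample data is hypothetical, echoing the commented-out
// example inside the function):

```javascript
const sampleCourse = { course: "St Norman's", competition: "Goose Salver" };
console.log(JSON.stringify(sampleCourse));
// {"course":"St Norman's","competition":"Goose Salver"}
```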
//postNewCourse();
/* function letsGeolocate() {
var output = document.getElementById("mapholder"); // the div where messages will appear
var options = {
};
function geolocationSuccess(position) {
var latitude = position.coords.latitude;
var longitude = position.coords.longitude;
output.innerHTML = 'Latitude: ' + latitude + '<br/>Longitude: ' + longitude
}
function geolocationError() {
output.innerHTML = "Unable to retrieve your location";
}
function geoprogress() {
output.innerHTML = '<p>Locating in progress</p>';
}
navigator.geolocation.getAccurateCurrentPosition(geolocationSuccess, geolocationError, geoprogress, options);
} */
function geoprogress() {
document.getElementById("mapholder").innerHTML = '<p>Locating in progress</p>';
}
//navigator.geolocation.getAccurateCurrentPosition = function (geolocationSuccess, geolocationError, geoprogress, options) {
navigator.geolocation.getAccurateCurrentPosition = function (success, error, geoprogress, options) {
var lastCheckedPosition,
locationEventCount = 0,
watchID,
timerID;
options = options || {};
var checkLocation = function (position) {
lastCheckedPosition = position;
locationEventCount = locationEventCount + 1;
// We ignore the first event unless it's the only one received because some devices seem to send a cached
        // location even when maximumAge is set to zero
if ((position.coords.accuracy <= options.desiredAccuracy) && (locationEventCount > 1)) {
clearTimeout(timerID);
navigator.geolocation.clearWatch(watchID);
foundPosition(position);
} else {
geoprogress(position);
}
};
var stopTrying = function () {
navigator.geolocation.clearWatch(watchID);
foundPosition(lastCheckedPosition);
};
var onError = function (err) {
clearTimeout(timerID);
navigator.geolocation.clearWatch(watchID);
//geolocationError(error);
error(err);
};
var foundPosition = function (position) {
//geolocationSuccess(position);
success(position);
document.getElementById("mapholder").innerHTML = '<p>Location complete</p>';
};
// Inserting success, error and geoprogress here so they are within the scope of this function
var success = function (position) {
let crd = position.coords;
console.log('Your current position is:');
console.log(`Latitude : ${crd.latitude}`);
console.log(`Longitude: ${crd.longitude}`);
console.log(`More or less ${crd.accuracy} meters.`);
//showPosition(pos);
if (holeStarted === false) {
array.push(crd.latitude, crd.longitude, crd.accuracy, 0);
/* document.getElementById("startLat").innerHTML = Math.round(crd.latitude * 10000) / 10000 + "\xB0";
document.getElementById("startLon").innerHTML = Math.round(crd.longitude * 10000) / 10000 + "\xB0";
document.getElementById("startAcc").innerHTML = Math.round(crd.accuracy * 10000) / 10000; */
//holeStarted = true;
distTable.lastChild.innerHTML = `<td>${hole} Tee</td>
<td>${Math.round(array[array.length - 4] * 10000)/10000}\xB0</td>
<td>${Math.round(array[array.length - 3] * 10000)/10000}\xB0</td>
<td>${Math.round(array[array.length - 2] * 10000)/10000}</td>
<td>-</td>
<td>-</td>`;
holeStarted = true;
shots = 0;
} else {
array.push(crd.latitude, crd.longitude, crd.accuracy);
getDist(array);
}
shots++;
console.log("The current array is: " + array);
}
var error = function(err) {
console.warn(`ERROR(${err.code}): ${err.message}`);
}
var geoprogress = function () {
document.getElementById("mapholder").innerHTML = '<p>Locating in progress</p>';
}
if (!options.maxWait) options.maxWait = 10000; // Default 10 seconds
    if (!options.desiredAccuracy) options.desiredAccuracy = 10; // Default 10 meters
if (!options.timeout) options.timeout = options.maxWait; // Default to maxWait
options.maximumAge = 0; // Force current locations only
options.enableHighAccuracy = true; // Force high accuracy (otherwise, why are you using this function?)
watchID = navigator.geolocation.watchPosition(checkLocation, onError, options);
timerID = setTimeout(stopTrying, options.maxWait); // Set a timeout that will abandon the location loop
};
function setFormat() {
let formatBtn = document.getElementById("format");
let pointsColumn = document.getElementsByClassName("stablefordColumn");
if (formatBtn.value === "strokeplay") {
for (let i = 0; i < pointsColumn.length; i += 1) {
pointsColumn[i].style.display = "none";
}
} else if (formatBtn.value === "stableford") {
for (let i = 0; i < pointsColumn.length; i += 1) {
pointsColumn[i].style.display = "table-cell";
}
}
}
function calcStableford() {
console.log("???");
let handicap = document.getElementById("handicap").value;
let shots = 0; // Handicap shots per hole
let holeSI = 0;
let holePar = 0;
let holeGrossScore = 0;
let holeNetScore = 0;
console.log("Handicap is " + handicap);
// Assuming scratch or worse handicap
for (let i = 0; i < 18; i += 1) {
holeSI = parseInt(document.getElementById(`si${i + 1}`).innerHTML);
if (handicap <= 18) {
if (holeSI <= handicap) {
shots = 1;
} else {
shots = 0;
}
} else if (handicap > 18 && handicap <= 36) {
if (holeSI <= handicap - 18) {
shots = 2;
} else {
shots = 1;
}
        }
/* if (handicap <= 18 && holeSI <= handicap) {
shots = 1;
} else if (handicap <= 18 && holeSI > handicap) {
shots = 0;
} */
        holePar = parseInt(document.getElementById(`par${i + 1}`).innerHTML, 10);
        holeGrossScore = parseInt(document.getElementById(`score${i + 1}`).firstChild.value, 10);
        if (!Number.isNaN(holeGrossScore) && holeGrossScore !== 0) { // skip empty or unplayed holes
            holeNetScore = holeGrossScore - shots;
            document.getElementById(`stableford${i + 1}`).innerHTML = Math.max(2 + holePar - holeNetScore, 0);
        }
}
sumScoresPutts("stableford");
}
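// The allocation rule above can be isolated into small standalone helpers
// (hypothetical names, illustration only; handicaps above 36 are not modelled,
// matching the function above):

```javascript
// Strokes received on a hole for a given handicap and stroke index.
function shotsForHole(handicap, strokeIndex) {
    if (handicap <= 18) {
        return strokeIndex <= handicap ? 1 : 0;
    }
    if (handicap <= 36) {
        return strokeIndex <= handicap - 18 ? 2 : 1;
    }
    return 0; // outside the range modelled above
}

// Stableford points: 2 for net par, one more per stroke under, floored at 0.
function stablefordPoints(par, netScore) {
    return Math.max(2 + par - netScore, 0);
}

console.log(shotsForHole(10, 5));    // 1 (SI 5 is covered by a 10 handicap)
console.log(shotsForHole(24, 6));    // 2 (6 <= 24 - 18)
console.log(stablefordPoints(4, 4)); // 2 (net par)
console.log(stablefordPoints(4, 7)); // 0 (floored)
```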
function setCourseDropdown() {
let coursesCell = document.getElementById("coursesList");
const url = "https://simple-golf-app.firebaseio.com/courses.json";
let networkDataReceived = false;
let coursesSet = new Set(); // Avoiding duplicate names
let coursesList = []; // For list of courses in database - cannot sort() set so do so with array
    let coursesCellString = "";
fetch(url)
.then(function(res) {
return res.json();
})
.then(function(data) {
networkDataReceived = true;
console.log('From web', data);
//let setKey = "";
        for (let key in data) {
            console.log("key name: " + data[key].name + ", data: " + data[key]);
            coursesSet.add(data[key].name);
            //console.log("Seeing if key name prints: " + dataObj[key].name);
            /* if (data[key].name === courseTeesArray[0] && data[key].tees === courseTeesArray[1]) {
                setKey = key;
                console.log("setKey is: " + setKey);
            } */
        }
        coursesList = Array.from(coursesSet).sort(); // build the sorted list once, after the loop
coursesCellString = `<select id="course" onchange="setCourse()">
<option disabled selected value="nil">--select--</option>`;
for (let i = 0; i < coursesList.length; i += 1) {
coursesCellString += `<option>${coursesList[i]}</option>`;
}
coursesCellString += `<option value="other">Other</option>
</select>`;
/* coursesCell.innerHTML = `<select id="course" onchange="setCourse()">
<option disabled selected value="nil">--select--</option>
<option value="gx">Gerrards Cross</option>
<option value="other">Other</option>
</select>`; */
coursesCell.innerHTML = coursesCellString;
});
}
setCourseDropdown();
function setTees() {
let teesCell = document.getElementById("teesList");
let selectedCourseIndex = document.getElementById("course").selectedIndex;
let selectedCourse = document.getElementById("course")[selectedCourseIndex].innerHTML;
console.log("The selected course is " + selectedCourse);
// I am calling the data too often, should combine with setCourseDropdown() function;
const url = "https://simple-golf-app.firebaseio.com/courses.json";
let networkDataReceived = false;
let teesList = [];
let teesCellString = "";
fetch(url)
.then(function(res) {
return res.json();
})
.then(function(data) {
networkDataReceived = true;
console.log('From web', data);
//let setKey = "";
        for (let key in data) {
            console.log("key name: " + data[key].name + ", tees: " + data[key].tees);
            if (data[key].name === selectedCourse) {
                teesList.push(data[key].tees);
            }
        }
        teesList.sort(); // sort once, after all tees are collected
teesCellString = `<select id="tees" onchange="setCourse()">
<option disabled selected value="nil">--select--</option>`;
for (let i = 0; i < teesList.length; i += 1) {
teesCellString += `<option>${teesList[i]}</option>`;
}
teesCellString += `<option value="other">Other</option>
</select>`;
teesCell.innerHTML = teesCellString;
})
} | b761b80d7471283b791df01041c66f0e622ed9ba | [
"Markdown",
"JavaScript"
]
| 2 | Markdown | McKnightRider/Test-Geolocation2 | 4d26ffc2eb35d0191e08d7dedb0441d70681f5ea | 94274ea9f0b8b0556db662d26ce65a944a838638 |
refs/heads/main | <repo_name>Abadi11/JavaScript-Core-1-Coursework-Week3<file_sep>/3-extra/credit-card-validator.js
function isCreditCardValidate (numberCard){
let output;
numberCard = numberCard.toString().split("");
let newArr=[];
for (let i = 0; i < numberCard.length; i++){
numberCard [i]= parseInt(numberCard[i])
newArr.push(numberCard[i])
}
    // newArr now holds the card digits as numbers
    console.log(newArr);
    // Returns true when an element is not a number (parseInt above yields NaN for non-digits)
    function checkDigit(num) {
        return Number.isNaN(num);
    }
    // Checking if the credit card is validated or not
    if (newArr.length !== 16) {
        output = "Number must be 16 digits";
    } else if (newArr.some(checkDigit)) {
        output = "All of the items must be numbers";
    } else if (newArr[15] % 2 !== 0) {
        output = "The final digit must be even";
    } else if (newArr.reduce((a, b) => a + b) <= 16) {
        output = "The sum of all the digits must be greater than 16";
    } else {
        output = "It is valid";
    }
    // The credit card number must be composed of at least two different digits (i.e. all of the digits cannot be the same) I took it from google :)
    var obj = {};
    for (let i = 0; i < newArr.length; i += 1) {
        obj[newArr[i]] = true;
    }
    // Only report this when every other check has already passed
    if (Object.keys(obj).length < 2 && output === "It is valid") {
        output = "The credit card number must be composed of at least two different digits";
    }
    return output;
}
var number = "4444444555151662";
console.log(isCreditCardValidate(number)) | 0f521ed1111e4059e8226adbbdb0d344f6117b60 | [
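// The distinct-digit rule near the end could also be expressed with a Set;
// a standalone alternative to the object-key counting above (hypothetical
// helper, illustration only):

```javascript
// Set.size counts unique elements directly, so "at least two different digits"
// is just a size check.
function hasTwoDistinctDigits(digits) {
    return new Set(digits).size >= 2;
}

console.log(hasTwoDistinctDigits([4, 4, 4, 4])); // false
console.log(hasTwoDistinctDigits([4, 5, 4, 4])); // true
```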
"JavaScript"
]
| 1 | JavaScript | Abadi11/JavaScript-Core-1-Coursework-Week3 | 746c2c47b5b8d7ae62f939a28ced896e49949052 | de557b5579de158b15fa125bf61250656930ade1 |
refs/heads/master | <file_sep>import math
coords = input().split(" ")
coords = [int(i) for i in coords]
assert len(coords) == 4
for i in coords:
assert i <= 1000
print(round(math.sqrt((coords[2]-coords[0])**2+(coords[3]-coords[1])**2)))
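# A quick sanity check of the formula above (illustration): the 3-4-5 right triangle.

```python
import math

# Distance from (0, 0) to (3, 4): sqrt(9 + 16) = 5 exactly.
assert round(math.sqrt((3 - 0) ** 2 + (4 - 0) ** 2)) == 5
# Coincident points give 0.
assert round(math.sqrt((7 - 7) ** 2 + (2 - 2) ** 2)) == 0
```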
<file_sep>import math
procents = int(input())
money = int(input())
assert 1 <= procents <= 99
assert 10 ** 5 <= money <= 10 ** 9
print(math.ceil(money*(procents/100)))
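# Sanity checks of the ceiling-of-percentage formula (illustration; percentages
# whose decimal form is exact in binary, like 25% and 50%, avoid any float
# rounding surprises):

```python
import math

assert math.ceil(100000 * (25 / 100)) == 25000       # 0.25 is exact in binary
assert math.ceil(10 ** 9 * (50 / 100)) == 500000000  # 0.5 is exact in binary
```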
<file_sep>length = int(input())
numbers = input().split(" ")
numbers = [int(i) for i in numbers]
assert 1 <= length <= 5000
assert len(numbers) == length
for i in numbers:
assert i <= 2**31 - 1
numbers.sort()
if length % 2:
    print('{:.1f}'.format(numbers[length // 2]))
else:
    print('{:.1f}'.format((numbers[length // 2] + numbers[length // 2 - 1]) / 2))
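# The two median branches above, illustrated on tiny lists (standalone sketch):

```python
# Odd count: the middle element after sorting.
nums = sorted([3, 1, 2])
assert nums[len(nums) // 2] == 2

# Even count: the mean of the two middle elements.
nums = sorted([4, 1, 3, 2])
assert (nums[len(nums) // 2] + nums[len(nums) // 2 - 1]) / 2 == 2.5
```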
| 37e92e37a3f81585da005fc3167dedda0810075f | [
"Python"
]
| 3 | Python | TrashboxBobylev/Olymp2018 | 6c1ce96f850f621fbd016e2b38a6480b85e05701 | 45165c0ba192ad42d186013d11cd94addfb61d7b |
refs/heads/master | <file_sep># first attempt
# References:
# **** https://www.youtube.com/watch?v=JtiK0DOeI4A&ab_channel=TechWithTim ****
# https://www.redblobgames.com/pathfinding/a-star/introduction.html
# https://www.youtube.com/watch?v=P4d0pUkAsz4&ab_channel=RANJIRAJ
# https://www.youtube.com/watch?v=-L-WgKMFuhE&ab_channel=SebastianLague
# https://www.simplifiedpython.net/a-star-algorithm-python-tutorial/
# https://en.wikipedia.org/wiki/A*_search_algorithm
# https://medium.com/@nicholas.w.swift/easy-a-star-pathfinding-7e6689c7f7b2
# https://www.youtube.com/watch?v=ob4faIum4kQ&ab_channel=TrevorPayne
import pygame
import math
import numpy as np
from queue import PriorityQueue
environment = pygame.display.set_mode((800, 800))
RED = (255, 0, 0) # closed
GREEN = (0, 255, 0) # open
BLUE = (0, 0, 255) # goal
ORANGE = (255, 165, 0) # start
BLACK = (0, 0, 0) # barrier
WHITE = (255, 255, 255) # entry
PURPLE = (128, 0, 128) # path
GREY = (128, 128, 128) # environment
class Node:
def __init__(self, row, column, size, numberOfentries):
self.row = row
self.column = column
self.i = row * size
self.j = column * size
self.color = WHITE
self.neighbors = []
self.size = size
self.numberOfentries = numberOfentries
def getPosition(self):
return self.row, self.column
def makeBarrier(self):
self.color = BLACK
def draw(self, environment):
pygame.draw.rect(environment, self.color, (self.i, self.j, self.size, self.size))
def updateNeighbors(self, graph):
self.neighbors = []
if self.row < self.numberOfentries - 1 and not graph[self.row+1][self.column].color == BLACK:
self.neighbors.append(graph[self.row + 1][self.column])
if self.row > 0 and not graph[self.row - 1][self.column].color == BLACK:
self.neighbors.append(graph[self.row - 1][self.column])
if self.column < self.numberOfentries - 1 and not graph[self.row][self.column + 1].color == BLACK:
self.neighbors.append(graph[self.row][self.column + 1])
if self.column > 0 and not graph[self.row][self.column - 1].color == BLACK:
self.neighbors.append(graph[self.row][self.column - 1])
def heuristic(pointA, pointB):
i1, j1 = pointA
i2, j2 = pointB
return abs(i1 - i2) + abs(j1 - j2)
def reconstructPath(origin, current, draw):
while current in origin:
current = origin[current]
current.color = PURPLE
draw()
def cost(c, g, h):
return (c-g)/h
# def boundedcost(draw, graph, start, goal, c):
# open = []
# closed = []
# g = []
# open.append(start)
# n = []
# while not open.empty():
# for nprime in n:
# if nprime in open or nprime in closed and g[nprime] <= cost(n, nprime):
# continue
# g[nprime] = g[n] + cost(n, nprime)
# if g[nprime] + h[nprime] >= c:
# continue
# if nprime == goal:
# reconstructPath(origin, goal, draw)
# goal.color = BLUE
# return True
# if nprime in open:
# print("placeholder ")
# else:
# open.append(nprime)
def A_Star(draw, graph, start, goal, constraint):
# constraint = 150
openPQ = PriorityQueue()
count = 0
openPQ.put((0, count, start))
origin = {}
    g = {}
    f = {}
    for row in graph:
        for node in row:
            g[node] = float("inf")
            f[node] = float("inf")
    g[start] = 0
    f[start] = heuristic(start.getPosition(), goal.getPosition())
ezOpenSet = {start}
while not openPQ.empty():
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
current = openPQ.get()[2]
ezOpenSet.remove(current)
if current == goal:
reconstructPath(origin, goal, draw)
goal.color = BLUE
return True
for neighbor in current.neighbors:
tentative_gScore = g[current] + 1
if tentative_gScore < g[neighbor]:
origin[neighbor] = current
g[neighbor] = tentative_gScore
f[neighbor] = tentative_gScore + heuristic(neighbor.getPosition(), goal.getPosition())
# and f[neighbor] < constraint
# print("g: ", g[neighbor])
# print("h: ", heuristic(neighbor.getPosition(), goal.getPosition()))
# print("f: ", f[neighbor])
# print("cost: ", cost(constraint, g[neighbor], heuristic(neighbor.getPosition(), goal.getPosition())))
if neighbor not in ezOpenSet and f[neighbor] < constraint:
count += 1
openPQ.put((f[neighbor], count, neighbor))
ezOpenSet.add(neighbor)
neighbor.color = GREEN
draw()
if current != start:
current.color = RED
return False
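# A note on the (f, count, node) tuples used above: count breaks ties so
# PriorityQueue never has to compare two Node objects (which would raise
# TypeError, since Node defines no ordering). A standalone sketch:

```python
from queue import PriorityQueue

q = PriorityQueue()
q.put((1, 0, "first"))
q.put((1, 1, "second"))  # equal priority; the counter preserves insertion order

assert q.get()[2] == "first"
assert q.get()[2] == "second"
```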
def draw(environment, graph, entries, size):
environment.fill(WHITE)
for row in graph:
for node in row:
node.draw(environment)
space = size // entries
for i in range(entries):
pygame.draw.line(environment, GREY, (0, i * space), (size, i * space))
for j in range(entries):
pygame.draw.line(environment, GREY, (j * space, 0), (j * space, size))
pygame.display.update()
def main(environment, size):
entries = 100
graph = []
space = size // entries
for i in range(entries):
graph.append([])
for j in range(entries):
node = Node(i, j, space, entries)
graph[i].append(node)
# Constructing the environment from Figure 3.31:
# --------------------------------------------------------------------------------
start = graph[10][90]
graph[10][90].color = ORANGE
goal = graph[95][45]
graph[95][45].color = BLUE
def environmentA():
# Horizontal Rectangle
# -------------------------------------
# Vertices:
graph[16][95].makeBarrier() #HR1
graph[16][85].makeBarrier() #HR2
graph[46][85].makeBarrier() #HR3
graph[46][95].makeBarrier() #HR4
# left side of horizontal rectangle
for i in range(85, 95):
graph[16][i].makeBarrier()
# top side of horizontal rectangle
for i in range(17, 46):
graph[i][85].makeBarrier()
# right side of horizontal rectangle
for i in reversed(range(86, 95)):
graph[46][i].makeBarrier()
# bottom side of horizontal rectangle
for i in range(17, 46):
graph[i][95].makeBarrier()
# -------------------------------------
# Pentagon
# -------------------------------------
graph[25][75].makeBarrier() # P1
graph[14][70].makeBarrier() # P2
graph[10][56].makeBarrier() # P3
graph[20][45].makeBarrier() # P4
graph[30][56].makeBarrier() # P5
# bottom left side of pentagon
graph[24][74].makeBarrier()
graph[23][74].makeBarrier()
graph[22][73].makeBarrier()
graph[21][73].makeBarrier()
graph[20][72].makeBarrier()
graph[19][72].makeBarrier()
graph[18][71].makeBarrier()
graph[17][71].makeBarrier()
graph[16][70].makeBarrier()
graph[15][70].makeBarrier()
# top left side of pentagon
graph[10][55].makeBarrier()
graph[11][55].makeBarrier()
graph[11][54].makeBarrier()
graph[12][54].makeBarrier()
graph[13][53].makeBarrier()
graph[13][52].makeBarrier()
graph[14][51].makeBarrier()
graph[15][51].makeBarrier()
graph[16][50].makeBarrier()
graph[16][49].makeBarrier()
graph[17][48].makeBarrier()
graph[17][47].makeBarrier()
graph[18][47].makeBarrier()
graph[19][46].makeBarrier()
# left side of pentagon
graph[10][57].makeBarrier()
graph[10][58].makeBarrier()
graph[10][59].makeBarrier()
graph[11][60].makeBarrier()
graph[12][61].makeBarrier()
graph[12][62].makeBarrier()
graph[12][63].makeBarrier()
graph[13][64].makeBarrier()
graph[13][65].makeBarrier()
graph[13][66].makeBarrier()
graph[14][67].makeBarrier()
graph[14][68].makeBarrier()
graph[14][69].makeBarrier()
# top side of pentagon
graph[21][46].makeBarrier()
graph[22][46].makeBarrier()
graph[23][47].makeBarrier()
graph[24][47].makeBarrier()
graph[24][48].makeBarrier()
graph[24][49].makeBarrier()
graph[25][49].makeBarrier()
graph[25][50].makeBarrier()
graph[25][51].makeBarrier()
graph[26][51].makeBarrier()
graph[26][52].makeBarrier()
graph[26][53].makeBarrier()
graph[27][54].makeBarrier()
graph[28][54].makeBarrier()
graph[28][55].makeBarrier()
graph[29][55].makeBarrier()
graph[29][56].makeBarrier()
# bottom right side of pentagon
graph[30][57].makeBarrier()
graph[30][58].makeBarrier()
graph[30][59].makeBarrier()
graph[29][59].makeBarrier()
graph[29][60].makeBarrier()
graph[29][61].makeBarrier()
graph[29][62].makeBarrier()
graph[28][63].makeBarrier()
graph[28][64].makeBarrier()
graph[28][65].makeBarrier()
graph[28][66].makeBarrier()
graph[27][67].makeBarrier()
graph[27][68].makeBarrier()
graph[27][69].makeBarrier()
graph[27][70].makeBarrier()
graph[26][71].makeBarrier()
graph[26][72].makeBarrier()
graph[26][73].makeBarrier()
graph[26][74].makeBarrier()
# Isosceles Triangle
graph[37][55].makeBarrier() # IT1
graph[41][78].makeBarrier() # IT2
graph[33][78].makeBarrier() # IT3
# graph[36][56].makeBarrier()
for i in range(56, 62):
graph[36][i].makeBarrier()
for i in range(62, 67):
graph[35][i].makeBarrier()
for i in range(67, 73):
graph[34][i].makeBarrier()
for i in range(73, 78):
graph[33][i].makeBarrier()
for i in range(56, 62):
graph[38][i].makeBarrier()
for i in range(62, 67):
graph[39][i].makeBarrier()
for i in range(67, 73):
graph[40][i].makeBarrier()
for i in range(73, 78):
graph[41][i].makeBarrier()
for i in range(34, 41):
graph[i][78].makeBarrier()
# Quadrilateral
graph[43][60].makeBarrier() # Q1
graph[43][44].makeBarrier() # Q2
graph[51][41].makeBarrier() # Q3
graph[56][48].makeBarrier() # Q4
for i in range(45, 60):
graph[43][i].makeBarrier()
graph[44][43].makeBarrier()
graph[45][43].makeBarrier()
graph[46][43].makeBarrier()
graph[47][42].makeBarrier()
graph[48][42].makeBarrier()
graph[49][42].makeBarrier()
graph[50][42].makeBarrier()
graph[52][42].makeBarrier()
graph[52][43].makeBarrier()
graph[53][44].makeBarrier()
graph[53][45].makeBarrier()
graph[54][46].makeBarrier()
graph[55][47].makeBarrier()
graph[55][49].makeBarrier()
graph[54][50].makeBarrier()
graph[53][51].makeBarrier()
graph[52][52].makeBarrier()
graph[51][53].makeBarrier()
graph[50][54].makeBarrier()
graph[49][55].makeBarrier()
graph[48][56].makeBarrier()
graph[47][57].makeBarrier()
graph[46][58].makeBarrier()
graph[45][59].makeBarrier()
graph[44][60].makeBarrier()
# Right Triangle
graph[56][90].makeBarrier() #RT3
graph[66][83].makeBarrier()# RT2
graph[49][70].makeBarrier() #RT1
# left side
graph[56][91].makeBarrier()
graph[56][89].makeBarrier()
graph[55][89].makeBarrier()
graph[55][88].makeBarrier()
graph[55][87].makeBarrier()
graph[54][87].makeBarrier()
graph[53][87].makeBarrier()
graph[53][86].makeBarrier()
graph[53][85].makeBarrier()
graph[53][84].makeBarrier()
graph[53][83].makeBarrier()
graph[52][82].makeBarrier()
graph[52][81].makeBarrier()
graph[52][80].makeBarrier()
graph[52][79].makeBarrier()
graph[51][78].makeBarrier()
graph[51][77].makeBarrier()
graph[51][76].makeBarrier()
graph[51][75].makeBarrier()
graph[50][74].makeBarrier()
graph[50][73].makeBarrier()
graph[49][72].makeBarrier()
graph[49][71].makeBarrier()
# right side
graph[50][70].makeBarrier()
graph[51][71].makeBarrier()
graph[52][72].makeBarrier()
graph[53][73].makeBarrier()
graph[54][73].makeBarrier()
graph[55][74].makeBarrier()
graph[56][75].makeBarrier()
graph[57][76].makeBarrier()
graph[58][77].makeBarrier()
graph[59][78].makeBarrier()
graph[60][79].makeBarrier()
graph[61][79].makeBarrier()
graph[62][80].makeBarrier()
graph[63][81].makeBarrier()
graph[64][82].makeBarrier()
graph[65][82].makeBarrier()
graph[65][84].makeBarrier()
graph[64][85].makeBarrier()
graph[63][86].makeBarrier()
graph[62][87].makeBarrier()
graph[61][88].makeBarrier()
graph[60][89].makeBarrier()
graph[59][90].makeBarrier()
graph[58][91].makeBarrier()
graph[57][92].makeBarrier()
# Vertical Rectangle
graph[62][46].makeBarrier() # R1
graph[62][75].makeBarrier() # R2
graph[77][46].makeBarrier() # R3
graph[77][75].makeBarrier() # R4
for i in range(47, 75):
graph[62][i].makeBarrier()
for i in range(47, 75):
graph[77][i].makeBarrier()
for i in range(63, 77):
graph[i][46].makeBarrier()
for i in range(63, 77):
graph[i][75].makeBarrier()
# Hexagon
graph[79][78].makeBarrier() # H1 # highest
graph[74][83].makeBarrier() # H2 # top left
graph[74][88].makeBarrier() # H3 # bottom left
graph[79][92].makeBarrier() # H4 # lowest
graph[84][83].makeBarrier() # H5 # top right
graph[84][88].makeBarrier() # H6 # bottom right
graph[78][79].makeBarrier()
graph[77][80].makeBarrier()
graph[76][81].makeBarrier()
graph[75][82].makeBarrier()
graph[80][79].makeBarrier()
graph[81][80].makeBarrier()
graph[82][81].makeBarrier()
graph[83][82].makeBarrier()
for i in range(84, 88):
graph[74][i].makeBarrier()
for i in range(84, 88):
graph[84][i].makeBarrier()
graph[78][91].makeBarrier()
graph[77][91].makeBarrier()
graph[76][90].makeBarrier()
graph[75][89].makeBarrier()
graph[80][91].makeBarrier()
graph[81][91].makeBarrier()
graph[82][90].makeBarrier()
graph[83][89].makeBarrier()
# Kite
graph[80][50].makeBarrier() # K1
graph[86][45].makeBarrier() # K2
graph[92][50].makeBarrier() # K3
graph[86][69].makeBarrier() # K4
graph[81][51].makeBarrier()
graph[81][52].makeBarrier()
graph[81][53].makeBarrier()
graph[81][54].makeBarrier()
graph[82][55].makeBarrier()
graph[82][56].makeBarrier()
graph[82][57].makeBarrier()
graph[82][58].makeBarrier()
graph[83][59].makeBarrier()
graph[83][60].makeBarrier()
graph[83][61].makeBarrier()
graph[83][62].makeBarrier()
graph[84][63].makeBarrier()
graph[84][64].makeBarrier()
graph[84][65].makeBarrier()
graph[84][66].makeBarrier()
graph[85][67].makeBarrier()
graph[85][68].makeBarrier()
graph[91][51].makeBarrier()
graph[91][52].makeBarrier()
graph[91][53].makeBarrier()
graph[91][54].makeBarrier()
graph[90][55].makeBarrier()
graph[90][56].makeBarrier()
graph[90][57].makeBarrier()
graph[90][58].makeBarrier()
graph[89][59].makeBarrier()
graph[89][60].makeBarrier()
graph[89][61].makeBarrier()
graph[89][62].makeBarrier()
graph[88][63].makeBarrier()
graph[88][64].makeBarrier()
graph[88][65].makeBarrier()
graph[88][66].makeBarrier()
graph[87][67].makeBarrier()
graph[87][68].makeBarrier()
graph[91][49].makeBarrier()
graph[90][49].makeBarrier()
graph[89][48].makeBarrier()
graph[88][47].makeBarrier()
graph[87][46].makeBarrier()
graph[81][49].makeBarrier()
graph[82][49].makeBarrier()
graph[83][48].makeBarrier()
graph[84][47].makeBarrier()
graph[85][46].makeBarrier()
# --------------------------------------------------------------------------------
def environmentB():
        # rectangle
        for j in range(64, 75):
            graph[20][j].makeBarrier()  # left side
        for i in range(21, 41):
            graph[i][74].makeBarrier()  # bottom side
        for j in range(64, 74):
            graph[40][j].makeBarrier()  # right side
        for i in range(30, 40):
            graph[i][64].makeBarrier()  # top side
graph[ 30 ][ 64 ].makeBarrier()
graph[ 29 ][ 64 ].makeBarrier()
graph[ 29 ][ 64 ].makeBarrier()
graph[ 28 ][ 64 ].makeBarrier()
graph[ 28 ][ 64 ].makeBarrier()
graph[ 28 ][ 64 ].makeBarrier()
graph[ 27 ][ 64 ].makeBarrier()
graph[ 27 ][ 64 ].makeBarrier()
graph[ 27 ][ 64 ].makeBarrier()
graph[ 26 ][ 64 ].makeBarrier()
graph[ 26 ][ 64 ].makeBarrier()
graph[ 25 ][ 64 ].makeBarrier()
graph[ 25 ][ 64 ].makeBarrier()
graph[ 24 ][ 64 ].makeBarrier()
graph[ 24 ][ 64 ].makeBarrier()
graph[ 24 ][ 64 ].makeBarrier()
graph[ 23 ][ 64 ].makeBarrier()
graph[ 23 ][ 64 ].makeBarrier()
graph[ 22 ][ 64 ].makeBarrier()
graph[ 22 ][ 64 ].makeBarrier()
graph[ 22 ][ 64 ].makeBarrier()
graph[ 22 ][ 63 ].makeBarrier()
graph[ 21 ][ 63 ].makeBarrier()
graph[ 21 ][ 63 ].makeBarrier()
graph[ 21 ][ 63 ].makeBarrier()
graph[ 21 ][ 64 ].makeBarrier()
graph[ 21 ][ 64 ].makeBarrier()
graph[ 21 ][ 64 ].makeBarrier()
graph[ 21 ][ 64 ].makeBarrier()
graph[ 20 ][ 64 ].makeBarrier()
graph[ 20 ][ 64 ].makeBarrier()
graph[ 20 ][ 64 ].makeBarrier()
# triangle, hexagon, quadrilateral
# first outline (deduplicated boundary coordinates, in drawing order)
outline1 = [
    (32, 15), (32, 16), (33, 16), (33, 17), (34, 18), (34, 19),
    (35, 20), (35, 21), (36, 22), (36, 23), (36, 24), (37, 25),
    (37, 26), (37, 27), (38, 27), (38, 28), (38, 29), (39, 29),
    (37, 29), (36, 29), (35, 29), (34, 29), (33, 29), (32, 29),
    (31, 29), (30, 29), (29, 29), (28, 29), (27, 29), (26, 29),
    (25, 29), (24, 29), (24, 28), (23, 28), (24, 27), (24, 26),
    (25, 26), (25, 25), (25, 24), (26, 24), (26, 23), (27, 22),
    (28, 22), (28, 21), (29, 21), (29, 20), (29, 19), (30, 18),
    (31, 17), (31, 16),
]
for row, column in outline1:
    graph[row][column].makeBarrier()
graph[ 65 ][ 19 ].makeBarrier()
graph[ 65 ][ 19 ].makeBarrier()
graph[ 66 ][ 18 ].makeBarrier()
graph[ 67 ][ 18 ].makeBarrier()
graph[ 67 ][ 18 ].makeBarrier()
graph[ 68 ][ 18 ].makeBarrier()
graph[ 68 ][ 18 ].makeBarrier()
graph[ 69 ][ 18 ].makeBarrier()
graph[ 69 ][ 18 ].makeBarrier()
graph[ 70 ][ 18 ].makeBarrier()
graph[ 70 ][ 18 ].makeBarrier()
graph[ 71 ][ 18 ].makeBarrier()
graph[ 71 ][ 18 ].makeBarrier()
graph[ 72 ][ 18 ].makeBarrier()
graph[ 72 ][ 18 ].makeBarrier()
graph[ 73 ][ 18 ].makeBarrier()
graph[ 73 ][ 18 ].makeBarrier()
graph[ 73 ][ 19 ].makeBarrier()
graph[ 74 ][ 19 ].makeBarrier()
graph[ 75 ][ 20 ].makeBarrier()
graph[ 75 ][ 20 ].makeBarrier()
graph[ 75 ][ 21 ].makeBarrier()
graph[ 76 ][ 21 ].makeBarrier()
graph[ 76 ][ 21 ].makeBarrier()
graph[ 77 ][ 22 ].makeBarrier()
graph[ 77 ][ 22 ].makeBarrier()
graph[ 78 ][ 23 ].makeBarrier()
graph[ 78 ][ 23 ].makeBarrier()
graph[ 78 ][ 23 ].makeBarrier()
graph[ 78 ][ 24 ].makeBarrier()
graph[ 79 ][ 24 ].makeBarrier()
graph[ 78 ][ 24 ].makeBarrier()
graph[ 78 ][ 24 ].makeBarrier()
graph[ 78 ][ 25 ].makeBarrier()
graph[ 78 ][ 25 ].makeBarrier()
graph[ 78 ][ 25 ].makeBarrier()
graph[ 77 ][ 26 ].makeBarrier()
graph[ 77 ][ 26 ].makeBarrier()
graph[ 77 ][ 27 ].makeBarrier()
graph[ 77 ][ 27 ].makeBarrier()
graph[ 76 ][ 28 ].makeBarrier()
graph[ 76 ][ 28 ].makeBarrier()
graph[ 76 ][ 28 ].makeBarrier()
graph[ 76 ][ 29 ].makeBarrier()
graph[ 75 ][ 29 ].makeBarrier()
graph[ 75 ][ 29 ].makeBarrier()
graph[ 75 ][ 30 ].makeBarrier()
graph[ 75 ][ 30 ].makeBarrier()
graph[ 74 ][ 30 ].makeBarrier()
graph[ 74 ][ 30 ].makeBarrier()
graph[ 74 ][ 31 ].makeBarrier()
graph[ 74 ][ 31 ].makeBarrier()
graph[ 74 ][ 31 ].makeBarrier()
graph[ 73 ][ 31 ].makeBarrier()
graph[ 72 ][ 31 ].makeBarrier()
graph[ 71 ][ 30 ].makeBarrier()
graph[ 71 ][ 30 ].makeBarrier()
graph[ 71 ][ 30 ].makeBarrier()
graph[ 70 ][ 30 ].makeBarrier()
graph[ 70 ][ 30 ].makeBarrier()
graph[ 69 ][ 30 ].makeBarrier()
graph[ 68 ][ 30 ].makeBarrier()
graph[ 68 ][ 30 ].makeBarrier()
graph[ 68 ][ 30 ].makeBarrier()
graph[ 67 ][ 30 ].makeBarrier()
graph[ 67 ][ 31 ].makeBarrier()
graph[ 67 ][ 31 ].makeBarrier()
graph[ 66 ][ 31 ].makeBarrier()
graph[ 66 ][ 30 ].makeBarrier()
graph[ 66 ][ 30 ].makeBarrier()
graph[ 65 ][ 29 ].makeBarrier()
graph[ 65 ][ 29 ].makeBarrier()
graph[ 64 ][ 28 ].makeBarrier()
graph[ 64 ][ 28 ].makeBarrier()
graph[ 63 ][ 27 ].makeBarrier()
graph[ 62 ][ 27 ].makeBarrier()
graph[ 62 ][ 26 ].makeBarrier()
graph[ 62 ][ 26 ].makeBarrier()
graph[ 61 ][ 26 ].makeBarrier()
graph[ 61 ][ 26 ].makeBarrier()
graph[ 61 ][ 25 ].makeBarrier()
graph[ 60 ][ 25 ].makeBarrier()
graph[ 61 ][ 25 ].makeBarrier()
graph[ 61 ][ 25 ].makeBarrier()
graph[ 61 ][ 25 ].makeBarrier()
graph[ 62 ][ 24 ].makeBarrier()
graph[ 62 ][ 24 ].makeBarrier()
graph[ 62 ][ 23 ].makeBarrier()
graph[ 62 ][ 23 ].makeBarrier()
graph[ 63 ][ 22 ].makeBarrier()
graph[ 63 ][ 22 ].makeBarrier()
graph[ 64 ][ 21 ].makeBarrier()
graph[ 64 ][ 20 ].makeBarrier()
graph[ 64 ][ 20 ].makeBarrier()
graph[ 64 ][ 19 ].makeBarrier()
graph[ 64 ][ 19 ].makeBarrier()
graph[ 65 ][ 19 ].makeBarrier()
graph[ 73 ][ 65 ].makeBarrier()
graph[ 73 ][ 65 ].makeBarrier()
graph[ 74 ][ 63 ].makeBarrier()
graph[ 74 ][ 62 ].makeBarrier()
graph[ 74 ][ 62 ].makeBarrier()
graph[ 74 ][ 61 ].makeBarrier()
graph[ 75 ][ 60 ].makeBarrier()
graph[ 75 ][ 59 ].makeBarrier()
graph[ 75 ][ 59 ].makeBarrier()
graph[ 75 ][ 58 ].makeBarrier()
graph[ 76 ][ 57 ].makeBarrier()
graph[ 76 ][ 56 ].makeBarrier()
graph[ 76 ][ 56 ].makeBarrier()
graph[ 77 ][ 55 ].makeBarrier()
graph[ 77 ][ 55 ].makeBarrier()
graph[ 77 ][ 54 ].makeBarrier()
graph[ 77 ][ 54 ].makeBarrier()
graph[ 77 ][ 53 ].makeBarrier()
graph[ 77 ][ 53 ].makeBarrier()
graph[ 78 ][ 53 ].makeBarrier()
graph[ 78 ][ 53 ].makeBarrier()
graph[ 79 ][ 53 ].makeBarrier()
graph[ 79 ][ 53 ].makeBarrier()
graph[ 80 ][ 53 ].makeBarrier()
graph[ 81 ][ 53 ].makeBarrier()
graph[ 81 ][ 53 ].makeBarrier()
graph[ 81 ][ 53 ].makeBarrier()
graph[ 82 ][ 53 ].makeBarrier()
graph[ 82 ][ 53 ].makeBarrier()
graph[ 83 ][ 53 ].makeBarrier()
graph[ 83 ][ 53 ].makeBarrier()
graph[ 84 ][ 53 ].makeBarrier()
graph[ 84 ][ 53 ].makeBarrier()
graph[ 85 ][ 53 ].makeBarrier()
graph[ 85 ][ 53 ].makeBarrier()
graph[ 86 ][ 53 ].makeBarrier()
graph[ 86 ][ 53 ].makeBarrier()
graph[ 86 ][ 53 ].makeBarrier()
graph[ 87 ][ 53 ].makeBarrier()
graph[ 88 ][ 53 ].makeBarrier()
graph[ 88 ][ 53 ].makeBarrier()
graph[ 88 ][ 53 ].makeBarrier()
graph[ 89 ][ 53 ].makeBarrier()
graph[ 89 ][ 53 ].makeBarrier()
graph[ 89 ][ 53 ].makeBarrier()
graph[ 89 ][ 54 ].makeBarrier()
graph[ 89 ][ 55 ].makeBarrier()
graph[ 89 ][ 56 ].makeBarrier()
graph[ 89 ][ 57 ].makeBarrier()
graph[ 89 ][ 58 ].makeBarrier()
graph[ 89 ][ 58 ].makeBarrier()
graph[ 89 ][ 59 ].makeBarrier()
graph[ 89 ][ 60 ].makeBarrier()
graph[ 89 ][ 62 ].makeBarrier()
graph[ 89 ][ 63 ].makeBarrier()
graph[ 89 ][ 64 ].makeBarrier()
graph[ 89 ][ 65 ].makeBarrier()
graph[ 89 ][ 66 ].makeBarrier()
graph[ 89 ][ 67 ].makeBarrier()
graph[ 89 ][ 68 ].makeBarrier()
graph[ 89 ][ 69 ].makeBarrier()
graph[ 89 ][ 70 ].makeBarrier()
graph[ 89 ][ 70 ].makeBarrier()
graph[ 89 ][ 71 ].makeBarrier()
graph[ 89 ][ 71 ].makeBarrier()
graph[ 89 ][ 71 ].makeBarrier()
graph[ 88 ][ 71 ].makeBarrier()
graph[ 88 ][ 71 ].makeBarrier()
graph[ 88 ][ 71 ].makeBarrier()
graph[ 88 ][ 71 ].makeBarrier()
graph[ 87 ][ 71 ].makeBarrier()
graph[ 86 ][ 72 ].makeBarrier()
graph[ 86 ][ 72 ].makeBarrier()
graph[ 85 ][ 72 ].makeBarrier()
graph[ 83 ][ 72 ].makeBarrier()
graph[ 81 ][ 72 ].makeBarrier()
graph[ 79 ][ 72 ].makeBarrier()
graph[ 77 ][ 72 ].makeBarrier()
graph[ 76 ][ 72 ].makeBarrier()
graph[ 75 ][ 72 ].makeBarrier()
graph[ 74 ][ 72 ].makeBarrier()
graph[ 73 ][ 72 ].makeBarrier()
graph[ 72 ][ 72 ].makeBarrier()
graph[ 72 ][ 72 ].makeBarrier()
graph[ 71 ][ 72 ].makeBarrier()
graph[ 71 ][ 72 ].makeBarrier()
graph[ 70 ][ 72 ].makeBarrier()
graph[ 71 ][ 71 ].makeBarrier()
graph[ 71 ][ 70 ].makeBarrier()
graph[ 71 ][ 69 ].makeBarrier()
graph[ 71 ][ 68 ].makeBarrier()
graph[ 72 ][ 68 ].makeBarrier()
graph[ 72 ][ 67 ].makeBarrier()
graph[ 72 ][ 66 ].makeBarrier()
graph[ 73 ][ 65 ].makeBarrier()
graph[ 73 ][ 65 ].makeBarrier()
graph[ 73 ][ 65 ].makeBarrier()
graph[ 73 ][ 64 ].makeBarrier()
graph[ 73 ][ 64 ].makeBarrier()
graph[ 74 ][ 63 ].makeBarrier()
graph[ 74 ][ 63 ].makeBarrier()
graph[ 74 ][ 62 ].makeBarrier()
graph[ 74 ][ 62 ].makeBarrier()
graph[ 74 ][ 61 ].makeBarrier()
graph[ 75 ][ 60 ].makeBarrier()
graph[ 75 ][ 59 ].makeBarrier()
graph[ 75 ][ 58 ].makeBarrier()
graph[ 76 ][ 58 ].makeBarrier()
graph[ 76 ][ 57 ].makeBarrier()
graph[ 76 ][ 56 ].makeBarrier()
graph[ 77 ][ 56 ].makeBarrier()
graph[ 77 ][ 55 ].makeBarrier()
graph[ 77 ][ 55 ].makeBarrier()
graph[ 77 ][ 54 ].makeBarrier()
graph[ 77 ][ 54 ].makeBarrier()
graph[ 77 ][ 54 ].makeBarrier()
graph[ 77 ][ 71 ].makeBarrier()
graph[ 77 ][ 71 ].makeBarrier()
graph[ 77 ][ 72 ].makeBarrier()
graph[ 78 ][ 72 ].makeBarrier()
graph[ 78 ][ 72 ].makeBarrier()
graph[ 79 ][ 72 ].makeBarrier()
graph[ 79 ][ 72 ].makeBarrier()
graph[ 80 ][ 72 ].makeBarrier()
graph[ 80 ][ 72 ].makeBarrier()
graph[ 81 ][ 72 ].makeBarrier()
graph[ 82 ][ 71 ].makeBarrier()
graph[ 83 ][ 71 ].makeBarrier()
graph[ 83 ][ 72 ].makeBarrier()
graph[ 84 ][ 72 ].makeBarrier()
graph[ 85 ][ 71 ].makeBarrier()
graph[ 86 ][ 71 ].makeBarrier()
graph[ 86 ][ 72 ].makeBarrier()
graph[ 87 ][ 72 ].makeBarrier()
graph[ 87 ][ 72 ].makeBarrier()
graph[ 88 ][ 72 ].makeBarrier()
graph[ 88 ][ 72 ].makeBarrier()
graph[ 88 ][ 72 ].makeBarrier()
graph[ 89 ][ 72 ].makeBarrier()
graph[ 89 ][ 71 ].makeBarrier()
graph[ 89 ][ 71 ].makeBarrier()
graph[ 89 ][ 71 ].makeBarrier()
graph[ 89 ][ 70 ].makeBarrier()
graph[ 89 ][ 69 ].makeBarrier()
graph[ 89 ][ 69 ].makeBarrier()
graph[ 89 ][ 68 ].makeBarrier()
graph[ 89 ][ 68 ].makeBarrier()
graph[ 89 ][ 67 ].makeBarrier()
graph[ 89 ][ 66 ].makeBarrier()
graph[ 89 ][ 65 ].makeBarrier()
graph[ 89 ][ 64 ].makeBarrier()
graph[ 89 ][ 63 ].makeBarrier()
graph[ 89 ][ 63 ].makeBarrier()
graph[ 89 ][ 62 ].makeBarrier()
graph[ 89 ][ 61 ].makeBarrier()
graph[ 89 ][ 61 ].makeBarrier()
graph[ 89 ][ 60 ].makeBarrier()
graph[ 89 ][ 60 ].makeBarrier()
graph[ 89 ][ 60 ].makeBarrier()
graph[ 34 ][ 46 ].makeBarrier()
graph[ 34 ][ 48 ].makeBarrier()
graph[ 34 ][ 49 ].makeBarrier()
graph[ 34 ][ 49 ].makeBarrier()
graph[ 34 ][ 50 ].makeBarrier()
graph[ 34 ][ 50 ].makeBarrier()
graph[ 34 ][ 50 ].makeBarrier()
graph[ 34 ][ 51 ].makeBarrier()
graph[ 34 ][ 52 ].makeBarrier()
graph[ 34 ][ 53 ].makeBarrier()
graph[ 34 ][ 53 ].makeBarrier()
graph[ 34 ][ 54 ].makeBarrier()
graph[ 34 ][ 54 ].makeBarrier()
graph[ 34 ][ 55 ].makeBarrier()
graph[ 34 ][ 56 ].makeBarrier()
graph[ 34 ][ 56 ].makeBarrier()
graph[ 34 ][ 57 ].makeBarrier()
graph[ 34 ][ 58 ].makeBarrier()
graph[ 34 ][ 59 ].makeBarrier()
graph[ 34 ][ 59 ].makeBarrier()
graph[ 34 ][ 60 ].makeBarrier()
graph[ 34 ][ 60 ].makeBarrier()
graph[ 34 ][ 61 ].makeBarrier()
graph[ 34 ][ 61 ].makeBarrier()
graph[ 35 ][ 61 ].makeBarrier()
graph[ 36 ][ 61 ].makeBarrier()
graph[ 37 ][ 61 ].makeBarrier()
graph[ 38 ][ 61 ].makeBarrier()
graph[ 38 ][ 61 ].makeBarrier()
graph[ 39 ][ 61 ].makeBarrier()
graph[ 39 ][ 61 ].makeBarrier()
graph[ 40 ][ 61 ].makeBarrier()
graph[ 40 ][ 61 ].makeBarrier()
graph[ 41 ][ 61 ].makeBarrier()
graph[ 42 ][ 61 ].makeBarrier()
graph[ 43 ][ 61 ].makeBarrier()
graph[ 44 ][ 61 ].makeBarrier()
graph[ 45 ][ 61 ].makeBarrier()
graph[ 46 ][ 61 ].makeBarrier()
graph[ 48 ][ 61 ].makeBarrier()
graph[ 48 ][ 61 ].makeBarrier()
graph[ 49 ][ 61 ].makeBarrier()
graph[ 50 ][ 61 ].makeBarrier()
graph[ 52 ][ 61 ].makeBarrier()
graph[ 52 ][ 61 ].makeBarrier()
graph[ 53 ][ 61 ].makeBarrier()
graph[ 54 ][ 61 ].makeBarrier()
graph[ 55 ][ 61 ].makeBarrier()
graph[ 56 ][ 61 ].makeBarrier()
graph[ 57 ][ 61 ].makeBarrier()
graph[ 58 ][ 62 ].makeBarrier()
graph[ 59 ][ 62 ].makeBarrier()
graph[ 59 ][ 62 ].makeBarrier()
graph[ 59 ][ 62 ].makeBarrier()
graph[ 59 ][ 59 ].makeBarrier()
graph[ 59 ][ 57 ].makeBarrier()
graph[ 59 ][ 55 ].makeBarrier()
graph[ 59 ][ 53 ].makeBarrier()
graph[ 59 ][ 52 ].makeBarrier()
graph[ 59 ][ 50 ].makeBarrier()
graph[ 59 ][ 49 ].makeBarrier()
graph[ 59 ][ 47 ].makeBarrier()
graph[ 59 ][ 47 ].makeBarrier()
graph[ 59 ][ 46 ].makeBarrier()
graph[ 59 ][ 46 ].makeBarrier()
graph[ 59 ][ 45 ].makeBarrier()
graph[ 59 ][ 45 ].makeBarrier()
graph[ 57 ][ 46 ].makeBarrier()
graph[ 54 ][ 46 ].makeBarrier()
graph[ 51 ][ 46 ].makeBarrier()
graph[ 50 ][ 46 ].makeBarrier()
graph[ 48 ][ 45 ].makeBarrier()
graph[ 46 ][ 45 ].makeBarrier()
graph[ 44 ][ 45 ].makeBarrier()
graph[ 42 ][ 45 ].makeBarrier()
graph[ 41 ][ 45 ].makeBarrier()
graph[ 39 ][ 45 ].makeBarrier()
graph[ 38 ][ 45 ].makeBarrier()
graph[ 37 ][ 45 ].makeBarrier()
graph[ 36 ][ 45 ].makeBarrier()
graph[ 35 ][ 45 ].makeBarrier()
graph[ 35 ][ 45 ].makeBarrier()
graph[ 34 ][ 45 ].makeBarrier()
graph[ 34 ][ 46 ].makeBarrier()
graph[ 34 ][ 46 ].makeBarrier()
graph[ 34 ][ 47 ].makeBarrier()
graph[ 34 ][ 48 ].makeBarrier()
graph[ 34 ][ 48 ].makeBarrier()
graph[ 34 ][ 49 ].makeBarrier()
graph[ 34 ][ 49 ].makeBarrier()
graph[ 34 ][ 49 ].makeBarrier()
graph[ 34 ][ 50 ].makeBarrier()
graph[ 34 ][ 50 ].makeBarrier()
graph[ 39 ][ 45 ].makeBarrier()
graph[ 39 ][ 45 ].makeBarrier()
graph[ 41 ][ 45 ].makeBarrier()
graph[ 42 ][ 45 ].makeBarrier()
graph[ 43 ][ 45 ].makeBarrier()
graph[ 44 ][ 46 ].makeBarrier()
graph[ 45 ][ 46 ].makeBarrier()
graph[ 45 ][ 46 ].makeBarrier()
graph[ 46 ][ 46 ].makeBarrier()
graph[ 47 ][ 46 ].makeBarrier()
graph[ 48 ][ 46 ].makeBarrier()
graph[ 49 ][ 46 ].makeBarrier()
graph[ 51 ][ 46 ].makeBarrier()
graph[ 52 ][ 46 ].makeBarrier()
graph[ 52 ][ 46 ].makeBarrier()
graph[ 53 ][ 46 ].makeBarrier()
graph[ 54 ][ 46 ].makeBarrier()
graph[ 55 ][ 46 ].makeBarrier()
graph[ 56 ][ 46 ].makeBarrier()
graph[ 56 ][ 46 ].makeBarrier()
graph[ 57 ][ 46 ].makeBarrier()
graph[ 58 ][ 46 ].makeBarrier()
graph[ 58 ][ 46 ].makeBarrier()
graph[ 58 ][ 46 ].makeBarrier()
graph[ 58 ][ 46 ].makeBarrier()
graph[ 59 ][ 46 ].makeBarrier()
graph[ 40 ][ 44 ].makeBarrier()
graph[ 40 ][ 45 ].makeBarrier()
graph[ 47 ][ 61 ].makeBarrier()
graph[ 52 ][ 62 ].makeBarrier()
graph[ 51 ][ 62 ].makeBarrier()
graph[ 51 ][ 61 ].makeBarrier()
graph[ 59 ][ 61 ].makeBarrier()
graph[ 59 ][ 61 ].makeBarrier()
graph[ 59 ][ 61 ].makeBarrier()
graph[ 59 ][ 60 ].makeBarrier()
graph[ 59 ][ 59 ].makeBarrier()
graph[ 59 ][ 58 ].makeBarrier()
graph[ 59 ][ 58 ].makeBarrier()
graph[ 59 ][ 57 ].makeBarrier()
graph[ 59 ][ 56 ].makeBarrier()
graph[ 59 ][ 56 ].makeBarrier()
graph[ 59 ][ 55 ].makeBarrier()
graph[ 59 ][ 54 ].makeBarrier()
graph[ 59 ][ 54 ].makeBarrier()
graph[ 59 ][ 53 ].makeBarrier()
graph[ 59 ][ 53 ].makeBarrier()
graph[ 59 ][ 52 ].makeBarrier()
graph[ 47 ][ 80 ].makeBarrier()
graph[ 47 ][ 80 ].makeBarrier()
graph[ 48 ][ 80 ].makeBarrier()
graph[ 49 ][ 80 ].makeBarrier()
graph[ 50 ][ 80 ].makeBarrier()
graph[ 51 ][ 80 ].makeBarrier()
graph[ 52 ][ 80 ].makeBarrier()
graph[ 52 ][ 80 ].makeBarrier()
graph[ 54 ][ 80 ].makeBarrier()
graph[ 55 ][ 80 ].makeBarrier()
graph[ 55 ][ 80 ].makeBarrier()
graph[ 55 ][ 80 ].makeBarrier()
graph[ 56 ][ 80 ].makeBarrier()
graph[ 57 ][ 82 ].makeBarrier()
graph[ 58 ][ 83 ].makeBarrier()
graph[ 59 ][ 83 ].makeBarrier()
graph[ 59 ][ 84 ].makeBarrier()
graph[ 60 ][ 85 ].makeBarrier()
graph[ 61 ][ 85 ].makeBarrier()
graph[ 61 ][ 86 ].makeBarrier()
graph[ 61 ][ 86 ].makeBarrier()
graph[ 61 ][ 88 ].makeBarrier()
graph[ 61 ][ 89 ].makeBarrier()
graph[ 61 ][ 91 ].makeBarrier()
graph[ 61 ][ 92 ].makeBarrier()
graph[ 61 ][ 92 ].makeBarrier()
graph[ 61 ][ 92 ].makeBarrier()
graph[ 61 ][ 92 ].makeBarrier()
graph[ 61 ][ 93 ].makeBarrier()
graph[ 60 ][ 93 ].makeBarrier()
graph[ 59 ][ 94 ].makeBarrier()
graph[ 59 ][ 94 ].makeBarrier()
graph[ 59 ][ 95 ].makeBarrier()
graph[ 59 ][ 95 ].makeBarrier()
graph[ 58 ][ 95 ].makeBarrier()
graph[ 58 ][ 95 ].makeBarrier()
graph[ 58 ][ 95 ].makeBarrier()
graph[ 57 ][ 95 ].makeBarrier()
graph[ 55 ][ 95 ].makeBarrier()
graph[ 54 ][ 96 ].makeBarrier()
graph[ 53 ][ 96 ].makeBarrier()
graph[ 52 ][ 96 ].makeBarrier()
graph[ 51 ][ 96 ].makeBarrier()
graph[ 50 ][ 95 ].makeBarrier()
graph[ 48 ][ 95 ].makeBarrier()
graph[ 48 ][ 95 ].makeBarrier()
graph[ 47 ][ 95 ].makeBarrier()
graph[ 47 ][ 95 ].makeBarrier()
graph[ 46 ][ 94 ].makeBarrier()
graph[ 45 ][ 93 ].makeBarrier()
graph[ 44 ][ 92 ].makeBarrier()
graph[ 43 ][ 91 ].makeBarrier()
graph[ 42 ][ 91 ].makeBarrier()
graph[ 42 ][ 90 ].makeBarrier()
graph[ 42 ][ 90 ].makeBarrier()
graph[ 42 ][ 88 ].makeBarrier()
graph[ 42 ][ 88 ].makeBarrier()
graph[ 42 ][ 87 ].makeBarrier()
graph[ 42 ][ 86 ].makeBarrier()
graph[ 42 ][ 85 ].makeBarrier()
graph[ 42 ][ 84 ].makeBarrier()
graph[ 42 ][ 84 ].makeBarrier()
graph[ 42 ][ 84 ].makeBarrier()
graph[ 43 ][ 83 ].makeBarrier()
graph[ 43 ][ 82 ].makeBarrier()
graph[ 44 ][ 82 ].makeBarrier()
graph[ 44 ][ 82 ].makeBarrier()
graph[ 45 ][ 81 ].makeBarrier()
graph[ 45 ][ 81 ].makeBarrier()
graph[ 46 ][ 80 ].makeBarrier()
graph[ 46 ][ 80 ].makeBarrier()
graph[ 46 ][ 80 ].makeBarrier()
graph[ 47 ][ 80 ].makeBarrier()
graph[ 46 ][ 80 ].makeBarrier()
graph[ 45 ][ 80 ].makeBarrier()
graph[ 44 ][ 81 ].makeBarrier()
graph[ 44 ][ 81 ].makeBarrier()
graph[ 44 ][ 82 ].makeBarrier()
graph[ 43 ][ 82 ].makeBarrier()
graph[ 43 ][ 83 ].makeBarrier()
graph[ 43 ][ 83 ].makeBarrier()
graph[ 42 ][ 84 ].makeBarrier()
graph[ 42 ][ 85 ].makeBarrier()
graph[ 42 ][ 85 ].makeBarrier()
graph[ 42 ][ 86 ].makeBarrier()
graph[ 42 ][ 87 ].makeBarrier()
graph[ 42 ][ 87 ].makeBarrier()
graph[ 42 ][ 88 ].makeBarrier()
graph[ 42 ][ 88 ].makeBarrier()
graph[ 42 ][ 89 ].makeBarrier()
graph[ 42 ][ 89 ].makeBarrier()
graph[ 42 ][ 90 ].makeBarrier()
graph[ 42 ][ 91 ].makeBarrier()
graph[ 42 ][ 91 ].makeBarrier()
graph[ 42 ][ 91 ].makeBarrier()
graph[ 43 ][ 91 ].makeBarrier()
graph[ 43 ][ 92 ].makeBarrier()
graph[ 44 ][ 92 ].makeBarrier()
graph[ 44 ][ 93 ].makeBarrier()
graph[ 45 ][ 93 ].makeBarrier()
graph[ 45 ][ 93 ].makeBarrier()
graph[ 45 ][ 94 ].makeBarrier()
graph[ 46 ][ 94 ].makeBarrier()
graph[ 46 ][ 94 ].makeBarrier()
graph[ 47 ][ 95 ].makeBarrier()
graph[ 47 ][ 95 ].makeBarrier()
graph[ 48 ][ 95 ].makeBarrier()
graph[ 48 ][ 95 ].makeBarrier()
graph[ 49 ][ 95 ].makeBarrier()
graph[ 49 ][ 95 ].makeBarrier()
graph[ 50 ][ 95 ].makeBarrier()
graph[ 50 ][ 95 ].makeBarrier()
graph[ 51 ][ 95 ].makeBarrier()
graph[ 51 ][ 96 ].makeBarrier()
graph[ 51 ][ 96 ].makeBarrier()
graph[ 52 ][ 96 ].makeBarrier()
graph[ 52 ][ 96 ].makeBarrier()
graph[ 52 ][ 96 ].makeBarrier()
graph[ 53 ][ 96 ].makeBarrier()
graph[ 53 ][ 96 ].makeBarrier()
graph[ 54 ][ 95 ].makeBarrier()
graph[ 55 ][ 95 ].makeBarrier()
graph[ 55 ][ 95 ].makeBarrier()
graph[ 56 ][ 95 ].makeBarrier()
graph[ 56 ][ 95 ].makeBarrier()
graph[ 57 ][ 95 ].makeBarrier()
graph[ 57 ][ 95 ].makeBarrier()
graph[ 58 ][ 94 ].makeBarrier()
graph[ 59 ][ 93 ].makeBarrier()
graph[ 59 ][ 93 ].makeBarrier()
graph[ 60 ][ 93 ].makeBarrier()
graph[ 60 ][ 92 ].makeBarrier()
graph[ 61 ][ 92 ].makeBarrier()
graph[ 61 ][ 92 ].makeBarrier()
graph[ 61 ][ 91 ].makeBarrier()
graph[ 61 ][ 91 ].makeBarrier()
graph[ 61 ][ 90 ].makeBarrier()
graph[ 62 ][ 89 ].makeBarrier()
graph[ 62 ][ 89 ].makeBarrier()
graph[ 62 ][ 88 ].makeBarrier()
graph[ 62 ][ 88 ].makeBarrier()
graph[ 61 ][ 87 ].makeBarrier()
graph[ 61 ][ 86 ].makeBarrier()
graph[ 61 ][ 86 ].makeBarrier()
graph[ 61 ][ 86 ].makeBarrier()
graph[ 61 ][ 85 ].makeBarrier()
graph[ 61 ][ 85 ].makeBarrier()
graph[ 61 ][ 85 ].makeBarrier()
graph[ 61 ][ 85 ].makeBarrier()
graph[ 60 ][ 84 ].makeBarrier()
graph[ 59 ][ 84 ].makeBarrier()
graph[ 59 ][ 84 ].makeBarrier()
graph[ 58 ][ 83 ].makeBarrier()
graph[ 58 ][ 83 ].makeBarrier()
graph[ 57 ][ 82 ].makeBarrier()
graph[ 57 ][ 82 ].makeBarrier()
graph[ 57 ][ 82 ].makeBarrier()
graph[ 56 ][ 81 ].makeBarrier()
graph[ 56 ][ 81 ].makeBarrier()
graph[ 56 ][ 80 ].makeBarrier()
graph[ 56 ][ 80 ].makeBarrier()
graph[ 56 ][ 80 ].makeBarrier()
graph[ 56 ][ 80 ].makeBarrier()
graph[ 56 ][ 80 ].makeBarrier()
graph[ 55 ][ 80 ].makeBarrier()
graph[ 54 ][ 80 ].makeBarrier()
graph[ 54 ][ 80 ].makeBarrier()
graph[ 53 ][ 80 ].makeBarrier()
graph[ 53 ][ 80 ].makeBarrier()
graph[ 53 ][ 80 ].makeBarrier()
graph[ 52 ][ 80 ].makeBarrier()
graph[ 45 ][ 2 ].makeBarrier()
graph[ 45 ][ 2 ].makeBarrier()
graph[ 47 ][ 2 ].makeBarrier()
graph[ 48 ][ 2 ].makeBarrier()
graph[ 49 ][ 2 ].makeBarrier()
graph[ 49 ][ 2 ].makeBarrier()
graph[ 50 ][ 2 ].makeBarrier()
graph[ 50 ][ 2 ].makeBarrier()
graph[ 51 ][ 2 ].makeBarrier()
graph[ 52 ][ 2 ].makeBarrier()
graph[ 52 ][ 2 ].makeBarrier()
graph[ 53 ][ 2 ].makeBarrier()
graph[ 53 ][ 2 ].makeBarrier()
graph[ 54 ][ 2 ].makeBarrier()
graph[ 55 ][ 2 ].makeBarrier()
graph[ 56 ][ 2 ].makeBarrier()
graph[ 56 ][ 2 ].makeBarrier()
graph[ 56 ][ 2 ].makeBarrier()
graph[ 56 ][ 2 ].makeBarrier()
graph[ 57 ][ 3 ].makeBarrier()
graph[ 58 ][ 4 ].makeBarrier()
graph[ 58 ][ 4 ].makeBarrier()
graph[ 59 ][ 5 ].makeBarrier()
graph[ 59 ][ 5 ].makeBarrier()
graph[ 60 ][ 6 ].makeBarrier()
graph[ 60 ][ 6 ].makeBarrier()
graph[ 61 ][ 6 ].makeBarrier()
graph[ 61 ][ 7 ].makeBarrier()
graph[ 61 ][ 7 ].makeBarrier()
graph[ 61 ][ 7 ].makeBarrier()
graph[ 61 ][ 8 ].makeBarrier()
graph[ 61 ][ 8 ].makeBarrier()
graph[ 61 ][ 8 ].makeBarrier()
graph[ 61 ][ 9 ].makeBarrier()
graph[ 61 ][ 10 ].makeBarrier()
graph[ 61 ][ 10 ].makeBarrier()
graph[ 61 ][ 10 ].makeBarrier()
graph[ 61 ][ 11 ].makeBarrier()
graph[ 61 ][ 11 ].makeBarrier()
graph[ 61 ][ 12 ].makeBarrier()
graph[ 61 ][ 12 ].makeBarrier()
graph[ 61 ][ 12 ].makeBarrier()
graph[ 61 ][ 12 ].makeBarrier()
graph[ 61 ][ 13 ].makeBarrier()
graph[ 60 ][ 13 ].makeBarrier()
graph[ 59 ][ 14 ].makeBarrier()
graph[ 59 ][ 14 ].makeBarrier()
graph[ 59 ][ 15 ].makeBarrier()
graph[ 58 ][ 15 ].makeBarrier()
graph[ 58 ][ 15 ].makeBarrier()
graph[ 57 ][ 16 ].makeBarrier()
graph[ 57 ][ 16 ].makeBarrier()
graph[ 57 ][ 16 ].makeBarrier()
graph[ 57 ][ 17 ].makeBarrier()
graph[ 56 ][ 17 ].makeBarrier()
graph[ 56 ][ 17 ].makeBarrier()
graph[ 55 ][ 17 ].makeBarrier()
graph[ 54 ][ 17 ].makeBarrier()
graph[ 52 ][ 17 ].makeBarrier()
graph[ 52 ][ 17 ].makeBarrier()
graph[ 51 ][ 16 ].makeBarrier()
graph[ 51 ][ 16 ].makeBarrier()
graph[ 50 ][ 16 ].makeBarrier()
graph[ 49 ][ 16 ].makeBarrier()
graph[ 48 ][ 16 ].makeBarrier()
graph[ 48 ][ 16 ].makeBarrier()
graph[ 47 ][ 16 ].makeBarrier()
graph[ 47 ][ 16 ].makeBarrier()
graph[ 46 ][ 16 ].makeBarrier()
graph[ 46 ][ 16 ].makeBarrier()
graph[ 46 ][ 16 ].makeBarrier()
graph[ 44 ][ 14 ].makeBarrier()
graph[ 43 ][ 13 ].makeBarrier()
graph[ 42 ][ 13 ].makeBarrier()
graph[ 42 ][ 12 ].makeBarrier()
graph[ 41 ][ 12 ].makeBarrier()
graph[ 41 ][ 12 ].makeBarrier()
graph[ 40 ][ 12 ].makeBarrier()
graph[ 40 ][ 12 ].makeBarrier()
graph[ 40 ][ 11 ].makeBarrier()
graph[ 40 ][ 11 ].makeBarrier()
graph[ 40 ][ 10 ].makeBarrier()
graph[ 40 ][ 9 ].makeBarrier()
graph[ 40 ][ 8 ].makeBarrier()
graph[ 40 ][ 8 ].makeBarrier()
graph[ 40 ][ 7 ].makeBarrier()
graph[ 40 ][ 6 ].makeBarrier()
graph[ 40 ][ 6 ].makeBarrier()
graph[ 40 ][ 5 ].makeBarrier()
graph[ 40 ][ 5 ].makeBarrier()
graph[ 40 ][ 5 ].makeBarrier()
graph[ 41 ][ 5 ].makeBarrier()
graph[ 42 ][ 4 ].makeBarrier()
graph[ 42 ][ 4 ].makeBarrier()
graph[ 43 ][ 4 ].makeBarrier()
graph[ 44 ][ 3 ].makeBarrier()
graph[ 44 ][ 2 ].makeBarrier()
graph[ 45 ][ 2 ].makeBarrier()
graph[ 45 ][ 2 ].makeBarrier()
graph[ 46 ][ 2 ].makeBarrier()
graph[ 46 ][ 1 ].makeBarrier()
graph[ 46 ][ 1 ].makeBarrier()
graph[ 46 ][ 1 ].makeBarrier()
graph[ 47 ][ 2 ].makeBarrier()
graph[ 48 ][ 2 ].makeBarrier()
graph[ 48 ][ 2 ].makeBarrier()
graph[ 48 ][ 2 ].makeBarrier()
graph[ 54 ][ 17 ].makeBarrier()
graph[ 53 ][ 17 ].makeBarrier()
graph[ 51 ][ 17 ].makeBarrier()
graph[ 50 ][ 17 ].makeBarrier()
graph[ 45 ][ 16 ].makeBarrier()
graph[ 45 ][ 15 ].makeBarrier()
graph[ 44 ][ 15 ].makeBarrier()
graph[ 43 ][ 14 ].makeBarrier()
graph[ 43 ][ 14 ].makeBarrier()
graph[ 83 ][ 83 ].makeBarrier()
graph[ 83 ][ 84 ].makeBarrier()
graph[ 83 ][ 84 ].makeBarrier()
graph[ 84 ][ 85 ].makeBarrier()
graph[ 84 ][ 85 ].makeBarrier()
graph[ 85 ][ 86 ].makeBarrier()
graph[ 85 ][ 87 ].makeBarrier()
graph[ 86 ][ 87 ].makeBarrier()
graph[ 86 ][ 88 ].makeBarrier()
graph[ 87 ][ 88 ].makeBarrier()
graph[ 87 ][ 89 ].makeBarrier()
graph[ 88 ][ 90 ].makeBarrier()
graph[ 88 ][ 91 ].makeBarrier()
graph[ 88 ][ 91 ].makeBarrier()
graph[ 89 ][ 92 ].makeBarrier()
graph[ 89 ][ 93 ].makeBarrier()
graph[ 89 ][ 94 ].makeBarrier()
graph[ 89 ][ 94 ].makeBarrier()
graph[ 89 ][ 94 ].makeBarrier()
graph[ 89 ][ 94 ].makeBarrier()
graph[ 88 ][ 94 ].makeBarrier()
graph[ 87 ][ 94 ].makeBarrier()
graph[ 86 ][ 94 ].makeBarrier()
graph[ 86 ][ 94 ].makeBarrier()
graph[ 85 ][ 94 ].makeBarrier()
graph[ 85 ][ 94 ].makeBarrier()
graph[ 84 ][ 94 ].makeBarrier()
graph[ 84 ][ 94 ].makeBarrier()
graph[ 83 ][ 94 ].makeBarrier()
graph[ 82 ][ 94 ].makeBarrier()
graph[ 81 ][ 94 ].makeBarrier()
graph[ 81 ][ 94 ].makeBarrier()
graph[ 80 ][ 94 ].makeBarrier()
graph[ 80 ][ 94 ].makeBarrier()
graph[ 79 ][ 94 ].makeBarrier()
graph[ 78 ][ 94 ].makeBarrier()
graph[ 78 ][ 94 ].makeBarrier()
graph[ 78 ][ 94 ].makeBarrier()
graph[ 77 ][ 94 ].makeBarrier()
graph[ 77 ][ 93 ].makeBarrier()
graph[ 77 ][ 93 ].makeBarrier()
graph[ 78 ][ 93 ].makeBarrier()
graph[ 78 ][ 93 ].makeBarrier()
graph[ 78 ][ 92 ].makeBarrier()
graph[ 79 ][ 92 ].makeBarrier()
graph[ 79 ][ 92 ].makeBarrier()
graph[ 79 ][ 91 ].makeBarrier()
graph[ 79 ][ 91 ].makeBarrier()
graph[ 80 ][ 90 ].makeBarrier()
graph[ 80 ][ 89 ].makeBarrier()
graph[ 81 ][ 88 ].makeBarrier()
graph[ 81 ][ 88 ].makeBarrier()
graph[ 81 ][ 87 ].makeBarrier()
graph[ 82 ][ 87 ].makeBarrier()
graph[ 82 ][ 86 ].makeBarrier()
graph[ 82 ][ 85 ].makeBarrier()
graph[ 83 ][ 85 ].makeBarrier()
graph[ 83 ][ 84 ].makeBarrier()
graph[ 83 ][ 84 ].makeBarrier()
graph[ 83 ][ 84 ].makeBarrier()
graph[ 76 ][ 40 ].makeBarrier()
graph[ 77 ][ 41 ].makeBarrier()
graph[ 77 ][ 41 ].makeBarrier()
graph[ 77 ][ 42 ].makeBarrier()
graph[ 78 ][ 42 ].makeBarrier()
graph[ 78 ][ 43 ].makeBarrier()
graph[ 79 ][ 43 ].makeBarrier()
graph[ 79 ][ 44 ].makeBarrier()
graph[ 80 ][ 45 ].makeBarrier()
graph[ 80 ][ 45 ].makeBarrier()
graph[ 81 ][ 46 ].makeBarrier()
graph[ 81 ][ 47 ].makeBarrier()
graph[ 81 ][ 48 ].makeBarrier()
graph[ 81 ][ 48 ].makeBarrier()
graph[ 81 ][ 48 ].makeBarrier()
graph[ 80 ][ 48 ].makeBarrier()
graph[ 78 ][ 48 ].makeBarrier()
graph[ 77 ][ 48 ].makeBarrier()
graph[ 76 ][ 48 ].makeBarrier()
graph[ 75 ][ 48 ].makeBarrier()
graph[ 74 ][ 48 ].makeBarrier()
graph[ 74 ][ 48 ].makeBarrier()
graph[ 73 ][ 48 ].makeBarrier()
graph[ 73 ][ 48 ].makeBarrier()
graph[ 73 ][ 48 ].makeBarrier()
graph[ 73 ][ 48 ].makeBarrier()
graph[ 73 ][ 48 ].makeBarrier()
graph[ 73 ][ 47 ].makeBarrier()
graph[ 74 ][ 46 ].makeBarrier()
graph[ 74 ][ 45 ].makeBarrier()
graph[ 74 ][ 44 ].makeBarrier()
graph[ 75 ][ 43 ].makeBarrier()
graph[ 75 ][ 43 ].makeBarrier()
graph[ 75 ][ 42 ].makeBarrier()
graph[ 76 ][ 41 ].makeBarrier()
graph[ 76 ][ 41 ].makeBarrier()
graph[ 76 ][ 41 ].makeBarrier()
graph[ 78 ][ 49 ].makeBarrier()
graph[ 79 ][ 49 ].makeBarrier()
graph[ 79 ][ 49 ].makeBarrier()
graph[ 79 ][ 49 ].makeBarrier()
graph[ 80 ][ 49 ].makeBarrier()
graph[ 80 ][ 48 ].makeBarrier()
graph[ 80 ][ 48 ].makeBarrier()
graph[ 79 ][ 48 ].makeBarrier()
graph[ 79 ][ 48 ].makeBarrier()
graph[ 79 ][ 48 ].makeBarrier()
while True:
    whichenv, whatval = input("Type A for environment 1 or B for environment 2, followed by the constraint value C\n").split()
    c = int(whatval)
    if c < 133:
        print("ERROR: the constraint value must be at least 133. Please exit and run again with an appropriate value.")
        break
    if whichenv == 'A':
        environmentA()
    elif whichenv == 'B':
        environmentB()
    else:
        print("Please enter a valid option (A or B).")
        continue
draw(environment, graph, entries, size)
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            pygame.quit()
            raise SystemExit
for row in graph:
for node in row:
node.updateNeighbors(graph)
A_Star(lambda: draw(environment, graph, entries, size), graph, start, goal, c)
break
main(environment, 800)
# python3 AStar.py
# References:
# **** https://www.youtube.com/watch?v=JtiK0DOeI4A&ab_channel=TechWithTim ****
# https://www.redblobgames.com/pathfinding/a-star/introduction.html
# https://www.youtube.com/watch?v=P4d0pUkAsz4&ab_channel=RANJIRAJ
# https://www.youtube.com/watch?v=-L-WgKMFuhE&ab_channel=SebastianLague
# https://www.simplifiedpython.net/a-star-algorithm-python-tutorial/
# https://en.wikipedia.org/wiki/A*_search_algorithm
# https://medium.com/@nicholas.w.swift/easy-a-star-pathfinding-7e6689c7f7b2
# https://www.youtube.com/watch?v=ob4faIum4kQ&ab_channel=TrevorPayne
import pygame
import math
from queue import PriorityQueue
import random
environment = pygame.display.set_mode((800, 800))
RED = (255, 0, 0) # closed
GREEN = (0, 255, 0) # open
BLUE = (0, 0, 255) # goal
ORANGE = (255, 165, 0) # start
BLACK = (0, 0, 0) # barrier
WHITE = (255, 255, 255) # entry
PURPLE = (128, 0, 128) # path
GREY = (128, 128, 128) # environment
class Node:
def __init__(self, row, column, size, numberOfentries):
self.row = row
self.column = column
self.i = row * size
self.j = column * size
self.color = WHITE
self.neighbors = []
self.size = size
self.numberOfentries = numberOfentries
def getPosition(self):
return self.row, self.column
def makeBarrier(self):
self.color = BLACK
def draw(self, environment):
pygame.draw.rect(environment, self.color,
(self.i, self.j, self.size, self.size))
def updateNeighbors(self, graph):
self.neighbors = []
if self.row < self.numberOfentries - 1 and not graph[self.row+1][self.column].color == BLACK:
self.neighbors.append(graph[self.row + 1][self.column])
if self.row > 0 and not graph[self.row - 1][self.column].color == BLACK:
self.neighbors.append(graph[self.row - 1][self.column])
if self.column < self.numberOfentries - 1 and not graph[self.row][self.column + 1].color == BLACK:
self.neighbors.append(graph[self.row][self.column + 1])
if self.column > 0 and not graph[self.row][self.column - 1].color == BLACK:
self.neighbors.append(graph[self.row][self.column - 1])
def heuristic(pointA, pointB):
i1, j1 = pointA
i2, j2 = pointB
return abs(i1 - i2) + abs(j1 - j2)
def reconstructPath(origin, current, draw):
while current in origin:
current = origin[current]
current.color = PURPLE
draw()
def arastar(draw, graph, start, goal):
G = 1000000
delta_w = 50
w = 1000
g = {}
for row in graph:
for node in row:
g[node] = float("inf")
g[start] = 0
origin = {}
f = {}
for row in graph:
for node in row:
f[node] = float("inf")
f[start] = w*heuristic(start.getPosition(), goal.getPosition())
incumbent = None
ezOpenSet = {start}
while len(ezOpenSet) != 0 and w:
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
new_solution, cost_new_solution = arastar_improve_solution(draw, graph, start, goal, ezOpenSet, w, G, incumbent, g, origin, f)
if new_solution is not None:
G = cost_new_solution
incumbent = new_solution
w -= delta_w
reconstructPath(incumbent, goal, draw)
def arastar_improve_solution(draw, graph, start, goal, ezOpenSet, w, G, newSolution, g, origin, f):
openPQ = PriorityQueue()
count = 0
openPQ.put((0, count, start))
while not openPQ.empty():
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
n = openPQ.get()[2]
if n in ezOpenSet:
ezOpenSet.remove(n)
if(G <= f[n]):
return None, None
for successor_nprime in n.neighbors:
# tentative = g[n]+1
if successor_nprime not in ezOpenSet or g[n] + 1 < g[successor_nprime]:
if g[n] + 1 < g[successor_nprime]:
f[successor_nprime] = (g[n]+1)+(w*heuristic(successor_nprime.getPosition(), goal.getPosition()))
g[successor_nprime] = g[n] + 1
count = count + 1
openPQ.put((f[successor_nprime], count, successor_nprime))
successor_nprime.color = GREEN
if f[successor_nprime] < G:
origin[successor_nprime] = n
if successor_nprime == goal:
cost_new_solution = g[successor_nprime]
return origin, cost_new_solution
ezOpenSet.add(successor_nprime)
    return None, None  # open list exhausted without improving on the incumbent solution
def draw(environment, graph, entries, size):
environment.fill(WHITE)
for row in graph:
for node in row:
node.draw(environment)
space = size // entries
for i in range(entries):
pygame.draw.line(environment, GREY, (0, i * space), (size, i * space))
for j in range(entries):
pygame.draw.line(environment, GREY,
(j * space, 0), (j * space, size))
pygame.display.update()
def main(environment):
entries = int(input("enter grid size: "))
density = int(input("enter density: "))
    if entries == 100:
        size = 800
    elif entries == 200:
        size = 1000
    elif entries == 300:
        size = 1200
    else:
        size = 800  # fall back so `size` is always defined for other grid sizes
graph = []
space = size // entries
for i in range(entries):
graph.append([])
for j in range(entries):
node = Node(i, j, space, entries)
graph[i].append(node)
    a = random.randrange(1, int(entries // 1.8))  # int() so randrange gets integer bounds
    b = random.randrange(1, int(entries // 1.8))
    c = random.randrange(1, int(entries // 1.8))
    d = random.randrange(1, int(entries // 1.8))
start = graph[a][b]
graph[a][b].color = ORANGE
goal = graph[c][d]
graph[c][d].color = BLUE
if entries == 100:
if density == 10:
for i in range(1000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif density == 20:
for i in range(2000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif density == 30:
for i in range(3000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif entries == 200:
if density == 10:
for i in range(4000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif density == 20:
for i in range(8000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif density == 30:
for i in range(12000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif entries == 300:
if density == 10:
for i in range(9000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif density == 20:
for i in range(18000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
elif density == 30:
for i in range(27000):
x = random.randrange(1, entries)
y = random.randrange(1, entries)
if x != a and x != c and y != b and y != d:
graph[x][y].makeBarrier()
# graph[random.randrange(1, entries)][random.randrange(1, entries)].makeBarrier()
while True:
draw(environment, graph, entries, size)
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
for row in graph:
for node in row:
node.updateNeighbors(graph)
arastar(lambda: draw(environment, graph, entries, size), graph, start, goal)
break
main(environment)
<file_sep># References:
# **** https://www.youtube.com/watch?v=JtiK0DOeI4A&ab_channel=TechWithTim ****
# https://www.redblobgames.com/pathfinding/a-star/introduction.html
# https://www.youtube.com/watch?v=P4d0pUkAsz4&ab_channel=RANJIRAJ
# https://www.youtube.com/watch?v=-L-WgKMFuhE&ab_channel=SebastianLague
# https://www.simplifiedpython.net/a-star-algorithm-python-tutorial/
# https://en.wikipedia.org/wiki/A*_search_algorithm
# https://medium.com/@nicholas.w.swift/easy-a-star-pathfinding-7e6689c7f7b2
# https://www.youtube.com/watch?v=ob4faIum4kQ&ab_channel=TrevorPayne
import pygame
import math
from queue import PriorityQueue
environment = pygame.display.set_mode((800, 800))
RED = (255, 0, 0) # closed
GREEN = (0, 255, 0) # open
BLUE = (0, 0, 255) # goal
ORANGE = (255, 165, 0) # start
BLACK = (0, 0, 0) # barrier
WHITE = (255, 255, 255) # entry
PURPLE = (128, 0, 128) # path
GREY = (128, 128, 128) # environment
class Node:
def __init__(self, row, column, size, numberOfentries):
self.row = row
self.column = column
self.i = row * size
self.j = column * size
self.color = WHITE
self.neighbors = []
self.size = size
self.numberOfentries = numberOfentries
def getPosition(self):
return self.row, self.column
def makeBarrier(self):
self.color = BLACK
def draw(self, environment):
pygame.draw.rect(environment, self.color, (self.i, self.j, self.size, self.size))
def updateNeighbors(self, graph):
self.neighbors = []
if self.row < self.numberOfentries - 1 and not graph[self.row+1][self.column].color == BLACK:
self.neighbors.append(graph[self.row + 1][self.column])
if self.row > 0 and not graph[self.row - 1][self.column].color == BLACK:
self.neighbors.append(graph[self.row - 1][self.column])
if self.column < self.numberOfentries - 1 and not graph[self.row][self.column + 1].color == BLACK:
self.neighbors.append(graph[self.row][self.column + 1])
if self.column > 0 and not graph[self.row][self.column - 1].color == BLACK:
self.neighbors.append(graph[self.row][self.column - 1])
def heuristic(pointA, pointB):
i1, j1 = pointA
i2, j2 = pointB
return abs(i1 - i2) + abs(j1 - j2)
def reconstructPath(origin, current, draw):
while current in origin:
current = origin[current]
current.color = PURPLE
draw()
def A_Star(draw, graph, start, goal):
openPQ = PriorityQueue()
count = 0
openPQ.put((0, count, start))
origin = {}
g = {}
for row in graph:
for node in row:
g[node] = float("inf")
g[start] = 0
f = {}
for row in graph:
for node in row:
f[node] = float("inf")
f[start] = heuristic(start.getPosition(), goal.getPosition())
ezOpenSet = {start}
while not openPQ.empty():
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
current = openPQ.get()[2]
ezOpenSet.remove(current)
if current == goal:
reconstructPath(origin, goal, draw)
goal.color = BLUE
return True
for neighbor in current.neighbors:
tentative_gScore = g[current] + 1
if tentative_gScore < g[neighbor]:
origin[neighbor] = current
g[neighbor] = tentative_gScore
f[neighbor] = tentative_gScore + heuristic(neighbor.getPosition(), goal.getPosition())
if neighbor not in ezOpenSet:
count += 1
openPQ.put((f[neighbor], count, neighbor))
ezOpenSet.add(neighbor)
neighbor.color = GREEN
draw()
if current != start:
current.color = RED
return False
def draw(environment, graph, entries, size):
environment.fill(WHITE)
for row in graph:
for node in row:
node.draw(environment)
space = size // entries
for i in range(entries):
pygame.draw.line(environment, GREY, (0, i * space), (size, i * space))
for j in range(entries):
pygame.draw.line(environment, GREY, (j * space, 0), (j * space, size))
pygame.display.update()
def main(environment, size):
entries = 100
graph = []
space = size // entries
for i in range(entries):
graph.append([])
for j in range(entries):
node = Node(i, j, space, entries)
graph[i].append(node)
# Constructing the environment from Figure 3.31:
# --------------------------------------------------------------------------------
start = graph[10][90]
graph[10][90].color = ORANGE
goal = graph[95][45]
graph[95][45].color = BLUE
# Horizontal Rectangle
# -------------------------------------
# Vertices:
graph[16][95].makeBarrier() #HR1
graph[16][85].makeBarrier() #HR2
graph[46][85].makeBarrier() #HR3
graph[46][95].makeBarrier() #HR4
# left side of horizontal rectangle
for i in range(85, 95):
graph[16][i].makeBarrier()
# top side of horizontal rectangle
for i in range(17, 46):
graph[i][85].makeBarrier()
# right side of horizontal rectangle
for i in reversed(range(86, 95)):
graph[46][i].makeBarrier()
# bottom side of horizontal rectangle
for i in range(17, 46):
graph[i][95].makeBarrier()
# -------------------------------------
# Pentagon
# -------------------------------------
graph[25][75].makeBarrier() # P1
graph[14][70].makeBarrier() # P2
graph[10][56].makeBarrier() # P3
graph[20][45].makeBarrier() # P4
graph[30][56].makeBarrier() # P5
# bottom left side of pentagon
graph[24][74].makeBarrier()
graph[23][74].makeBarrier()
graph[22][73].makeBarrier()
graph[21][73].makeBarrier()
graph[20][72].makeBarrier()
graph[19][72].makeBarrier()
graph[18][71].makeBarrier()
graph[17][71].makeBarrier()
graph[16][70].makeBarrier()
graph[15][70].makeBarrier()
# top left side of pentagon
graph[10][55].makeBarrier()
graph[11][55].makeBarrier()
graph[11][54].makeBarrier()
graph[12][54].makeBarrier()
graph[13][53].makeBarrier()
graph[13][52].makeBarrier()
graph[14][51].makeBarrier()
graph[15][51].makeBarrier()
graph[16][50].makeBarrier()
graph[16][49].makeBarrier()
graph[17][48].makeBarrier()
graph[17][47].makeBarrier()
graph[18][47].makeBarrier()
graph[19][46].makeBarrier()
# left side of pentagon
graph[10][57].makeBarrier()
graph[10][58].makeBarrier()
graph[10][59].makeBarrier()
graph[11][60].makeBarrier()
graph[12][61].makeBarrier()
graph[12][62].makeBarrier()
graph[12][63].makeBarrier()
graph[13][64].makeBarrier()
graph[13][65].makeBarrier()
graph[13][66].makeBarrier()
graph[14][67].makeBarrier()
graph[14][68].makeBarrier()
graph[14][69].makeBarrier()
# top side of pentagon
graph[21][46].makeBarrier()
graph[22][46].makeBarrier()
graph[23][47].makeBarrier()
graph[24][47].makeBarrier()
graph[24][48].makeBarrier()
graph[24][49].makeBarrier()
graph[25][49].makeBarrier()
graph[25][50].makeBarrier()
graph[25][51].makeBarrier()
graph[26][51].makeBarrier()
graph[26][52].makeBarrier()
graph[26][53].makeBarrier()
graph[27][54].makeBarrier()
graph[28][54].makeBarrier()
graph[28][55].makeBarrier()
graph[29][55].makeBarrier()
graph[29][56].makeBarrier()
# bottom right side of pentagon
graph[30][57].makeBarrier()
graph[30][58].makeBarrier()
graph[30][59].makeBarrier()
graph[29][59].makeBarrier()
graph[29][60].makeBarrier()
graph[29][61].makeBarrier()
graph[29][62].makeBarrier()
graph[28][63].makeBarrier()
graph[28][64].makeBarrier()
graph[28][65].makeBarrier()
graph[28][66].makeBarrier()
graph[27][67].makeBarrier()
graph[27][68].makeBarrier()
graph[27][69].makeBarrier()
graph[27][70].makeBarrier()
graph[26][71].makeBarrier()
graph[26][72].makeBarrier()
graph[26][73].makeBarrier()
graph[26][74].makeBarrier()
# Isosceles Triangle
graph[37][55].makeBarrier() # IT1
graph[41][78].makeBarrier() # IT2
graph[33][78].makeBarrier() # IT3
# graph[36][56].makeBarrier()
for i in range(56, 62):
graph[36][i].makeBarrier()
for i in range(62, 67):
graph[35][i].makeBarrier()
for i in range(67, 73):
graph[34][i].makeBarrier()
for i in range(73, 78):
graph[33][i].makeBarrier()
for i in range(56, 62):
graph[38][i].makeBarrier()
for i in range(62, 67):
graph[39][i].makeBarrier()
for i in range(67, 73):
graph[40][i].makeBarrier()
for i in range(73, 78):
graph[41][i].makeBarrier()
for i in range(34, 41):
graph[i][78].makeBarrier()
# Quadrilateral
graph[43][60].makeBarrier() # Q1
graph[43][44].makeBarrier() # Q2
graph[51][41].makeBarrier() # Q3
graph[56][48].makeBarrier() # Q4
for i in range(45, 60):
graph[43][i].makeBarrier()
graph[44][43].makeBarrier()
graph[45][43].makeBarrier()
graph[46][43].makeBarrier()
graph[47][42].makeBarrier()
graph[48][42].makeBarrier()
graph[49][42].makeBarrier()
graph[50][42].makeBarrier()
graph[52][42].makeBarrier()
graph[52][43].makeBarrier()
graph[53][44].makeBarrier()
graph[53][45].makeBarrier()
graph[54][46].makeBarrier()
graph[55][47].makeBarrier()
graph[55][49].makeBarrier()
graph[54][50].makeBarrier()
graph[53][51].makeBarrier()
graph[52][52].makeBarrier()
graph[51][53].makeBarrier()
graph[50][54].makeBarrier()
graph[49][55].makeBarrier()
graph[48][56].makeBarrier()
graph[47][57].makeBarrier()
graph[46][58].makeBarrier()
graph[45][59].makeBarrier()
graph[44][60].makeBarrier()
# Right Triangle
graph[56][90].makeBarrier() #RT3
graph[66][83].makeBarrier()# RT2
graph[49][70].makeBarrier() #RT1
# left side
graph[56][91].makeBarrier()
graph[56][89].makeBarrier()
graph[55][89].makeBarrier()
graph[55][88].makeBarrier()
graph[55][87].makeBarrier()
graph[54][87].makeBarrier()
graph[53][87].makeBarrier()
graph[53][86].makeBarrier()
graph[53][85].makeBarrier()
graph[53][84].makeBarrier()
graph[53][83].makeBarrier()
graph[52][82].makeBarrier()
graph[52][81].makeBarrier()
graph[52][80].makeBarrier()
graph[52][79].makeBarrier()
graph[51][78].makeBarrier()
graph[51][77].makeBarrier()
graph[51][76].makeBarrier()
graph[51][75].makeBarrier()
graph[50][74].makeBarrier()
graph[50][73].makeBarrier()
graph[49][72].makeBarrier()
graph[49][71].makeBarrier()
# right side
graph[50][70].makeBarrier()
graph[51][71].makeBarrier()
graph[52][72].makeBarrier()
graph[53][73].makeBarrier()
graph[54][73].makeBarrier()
graph[55][74].makeBarrier()
graph[56][75].makeBarrier()
graph[57][76].makeBarrier()
graph[58][77].makeBarrier()
graph[59][78].makeBarrier()
graph[60][79].makeBarrier()
graph[61][79].makeBarrier()
graph[62][80].makeBarrier()
graph[63][81].makeBarrier()
graph[64][82].makeBarrier()
graph[65][82].makeBarrier()
graph[65][84].makeBarrier()
graph[64][85].makeBarrier()
graph[63][86].makeBarrier()
graph[62][87].makeBarrier()
graph[61][88].makeBarrier()
graph[60][89].makeBarrier()
graph[59][90].makeBarrier()
graph[58][91].makeBarrier()
graph[57][92].makeBarrier()
# Vertical Rectangle
graph[62][46].makeBarrier() # R1
graph[62][75].makeBarrier() # R2
graph[77][46].makeBarrier() # R3
graph[77][75].makeBarrier() # R4
for i in range(47, 75):
graph[62][i].makeBarrier()
for i in range(47, 75):
graph[77][i].makeBarrier()
for i in range(63, 77):
graph[i][46].makeBarrier()
for i in range(63, 77):
graph[i][75].makeBarrier()
# Hexagon
graph[79][78].makeBarrier() # H1 # highest
graph[74][83].makeBarrier() # H2 # top left
graph[74][88].makeBarrier() # H3 # bottom left
graph[79][92].makeBarrier() # H4 # lowest
graph[84][83].makeBarrier() # H5 # top right
graph[84][88].makeBarrier() # H6 # bottom right
graph[78][79].makeBarrier()
graph[77][80].makeBarrier()
graph[76][81].makeBarrier()
graph[75][82].makeBarrier()
graph[80][79].makeBarrier()
graph[81][80].makeBarrier()
graph[82][81].makeBarrier()
graph[83][82].makeBarrier()
for i in range(84, 88):
graph[74][i].makeBarrier()
for i in range(84, 88):
graph[84][i].makeBarrier()
graph[78][91].makeBarrier()
graph[77][91].makeBarrier()
graph[76][90].makeBarrier()
graph[75][89].makeBarrier()
graph[80][91].makeBarrier()
graph[81][91].makeBarrier()
graph[82][90].makeBarrier()
graph[83][89].makeBarrier()
# Kite
graph[80][50].makeBarrier() # K1
graph[86][45].makeBarrier() # K2
graph[92][50].makeBarrier() # K3
graph[86][69].makeBarrier() # K4
graph[81][51].makeBarrier()
graph[81][52].makeBarrier()
graph[81][53].makeBarrier()
graph[81][54].makeBarrier()
graph[82][55].makeBarrier()
graph[82][56].makeBarrier()
graph[82][57].makeBarrier()
graph[82][58].makeBarrier()
graph[83][59].makeBarrier()
graph[83][60].makeBarrier()
graph[83][61].makeBarrier()
graph[83][62].makeBarrier()
graph[84][63].makeBarrier()
graph[84][64].makeBarrier()
graph[84][65].makeBarrier()
graph[84][66].makeBarrier()
graph[85][67].makeBarrier()
graph[85][68].makeBarrier()
graph[91][51].makeBarrier()
graph[91][52].makeBarrier()
graph[91][53].makeBarrier()
graph[91][54].makeBarrier()
graph[90][55].makeBarrier()
graph[90][56].makeBarrier()
graph[90][57].makeBarrier()
graph[90][58].makeBarrier()
graph[89][59].makeBarrier()
graph[89][60].makeBarrier()
graph[89][61].makeBarrier()
graph[89][62].makeBarrier()
graph[88][63].makeBarrier()
graph[88][64].makeBarrier()
graph[88][65].makeBarrier()
graph[88][66].makeBarrier()
graph[87][67].makeBarrier()
graph[87][68].makeBarrier()
graph[91][49].makeBarrier()
graph[90][49].makeBarrier()
graph[89][48].makeBarrier()
graph[88][47].makeBarrier()
graph[87][46].makeBarrier()
graph[81][49].makeBarrier()
graph[82][49].makeBarrier()
graph[83][48].makeBarrier()
graph[84][47].makeBarrier()
graph[85][46].makeBarrier()
# --------------------------------------------------------------------------------
while True:
draw(environment, graph, entries, size)
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
for row in graph:
for node in row:
node.updateNeighbors(graph)
A_Star(lambda: draw(environment, graph, entries, size), graph, start, goal)
break
main(environment, 800)
# python3 AStar.py
| d7f3ad4ce08ade9ca185d48d4c2e184f8c05ddfe | [
"Python"
]
| 3 | Python | jasonskipper/artificial-intelligence | 44fab638e3bcb1de31f6e0f31489c8f1e88f1a7f | ca50a42b90612c1287950e3f74990192d1667cf5 |
refs/heads/master | <repo_name>SteveAmor/raspberry-pi<file_sep>/ultrasonicDistanceSensor.py
#Libraries
import RPi.GPIO as GPIO
import time
from pythonosc import osc_message_builder
from pythonosc import udp_client
GPIO.setmode(GPIO.BCM)
#set GPIO Pins
GPIO_TRIGGER = 4
GPIO_ECHO = 17
# send to sonic pi
sender = udp_client.SimpleUDPClient('127.0.0.1',4559)
#set GPIO inputs and outputs
GPIO.setup(GPIO_TRIGGER, GPIO.OUT)
GPIO.setup(GPIO_ECHO, GPIO.IN)
# set Trigger to HIGH and after some time return to LOW
GPIO.output(GPIO_TRIGGER, True)
time.sleep(0.00001)
GPIO.output(GPIO_TRIGGER, False)
# ECHO trigger - pulse returned
while GPIO.input(GPIO_ECHO) == 0:
StartTime = time.time()
# ECHO trigger - pulse returned
while GPIO.input(GPIO_ECHO) == 1:
StopTime = time.time()
# calculate distance
# multiply the time with the sonic speed (34300 cm/s)
#and divide by 2 because it's going there and back
TimeDifference = StopTime - StartTime
distance = (TimeDifference * 34300) / 2
pitch = round(distance + 60)
sender.send_message('/play_this', pitch)
print ("Pitch = %d" % pitch)
time.sleep(0.1)
GPIO.cleanup()
<file_sep>/README.md
# raspberry-pi
Raspberry Pi stuff
Code will be put here that I have included as a tutorial on my blog
https://carmelhowe.wordpress.com/
| 4eb121bf2ebcb35c383632f03dceeb21726887ac | [
"Markdown",
"Python"
]
| 2 | Python | SteveAmor/raspberry-pi | bb5575b4f3cb8a212a5f20c4d63e3c99355bf4d6 | 1f072d043321acb1fa34f414a9f9b09cb47e6065 |
refs/heads/master | <repo_name>wanghaibo1991/HBGOGO<file_sep>/Example/Podfile
use_frameworks!
platform :ios, '8.0'
target 'HBGOGO_Example' do
pod 'HBGOGO', :path => '../'
pod 'TRUNetworking', :path => '../../../Lib/TRUNetworking'
target 'HBGOGO_Tests' do
inherit! :search_paths
pod 'FBSnapshotTestCase'
end
end
<file_sep>/README.md
# HBGOGO
[](https://travis-ci.org/王海波/HBGOGO)
[](https://cocoapods.org/pods/HBGOGO)
[](https://cocoapods.org/pods/HBGOGO)
[](https://cocoapods.org/pods/HBGOGO)
## Example
To run the example project, clone the repo, and run `pod install` from the Example directory first.
## Requirements
## Installation
HBGOGO is available through [CocoaPods](https://cocoapods.org). To install
it, simply add the following line to your Podfile:
```ruby
pod 'HBGOGO'
```
## Author
王海波, <EMAIL>
## License
HBGOGO is available under the MIT license. See the LICENSE file for more info.
| 6981036ec799fef89b2fe220cf75a52c904d187f | [
"Markdown",
"Ruby"
]
| 2 | Ruby | wanghaibo1991/HBGOGO | e32282853ccf99fb4129089b87ef671a3cd87fee | 357ab8cb4fd9ae95fc29640e8fa9aef7315a661f |
refs/heads/master | <file_sep>function populate (){
$('table').empty();
var start = $('#start').val() || 10;
var end = $('#end').val() || 100;
if(start > end){
$('#test').append('<tr><td>Start MUST be less than the end!</td></tr>');
return;
}
var counter = start;
for(var j=1; j<=Math.ceil(end/10); j++){
$('#test').append("<tr id=" + j + "></tr>");
for(var i=1; i<=10; i++){
if(counter <= end){
if(counter % 4 === 0 && counter % 6 === 0 && counter !== 0){
$('#' + j).append('<td class="drwho">Dr. Who</td>');
}
else if(counter % 4 === 0 && counter !== 0){
$('#' + j).append('<td class="drwho">Dr</td>');
}
else if(counter % 6 === 0 && counter !== 0){
$('#' + j).append('<td class="drwho">Who</td>');
}
else{
$('#' + j).append('<td>' + counter + '</td>') ;
}
}
counter++;
}
}
}
populate();
$('#submit').on('click', populate);<file_sep># DrWhoChallenge
Created a mini-app that is a fizz buzz clone.
Just HTML and jQuery, no server.
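The substitution rule the app applies (divisible by 4 prints "Dr", by 6 prints "Who", by both prints "Dr. Who") can be sketched as a standalone function; `drWhoLabel` is a hypothetical name for illustration, not one used in the app:

```javascript
// Mirrors the cell-labelling logic in the app's populate() function
function drWhoLabel(n) {
  if (n % 4 === 0 && n % 6 === 0 && n !== 0) return 'Dr. Who';
  if (n % 4 === 0 && n !== 0) return 'Dr';
  if (n % 6 === 0 && n !== 0) return 'Who';
  return String(n);
}

console.log([7, 8, 6, 12].map(drWhoLabel).join(' ')); // 7 Dr Who Dr. Who
```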
| dfa9019684603dbe5a16ee96bf716ceeba34bc51 | [
"JavaScript",
"Markdown"
]
| 2 | JavaScript | joellamathellama/DrWhoChallenge | de0d17dd7d316abb76a7addb4104d92fbc0c9258 | e7f68b70eb2f246141a7d8ed3d5c59c149d38b75 |
refs/heads/master | <file_sep>using System;
using System.Collections.Generic;
using System.Text;
using System.ComponentModel.DataAnnotations;
namespace WarsztatyViewModel
{
public class ContactViewModel
{
public string Name { get; set; }
public int Number { get; set; }
public string Email { get; set; }
public string Phone { get; set; }
public DateTime CreateDate { get; set; }
public bool IsActive { get; set; }
}
}
<file_sep>using System;
namespace WarsztatyCore
{
public class Class1
{
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Text;
using WarsztatyViewModel;
namespace WarsztatyCore.Services
{
public class ContactServices
{
public void Add(ContactViewModel contact)
{
var toSave = new ContactModel()
{
CreatedDate = DateTime.Now,
Email = contact.Email,
Name = contact.Name,
Number = contact.Number,
Phone = contact.Phone,
};
toSave.CreatedDate = DateTime.Now;
var db = new DataContext();
db.Add<ContactModel>(toSave);
db.SaveChanges();
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Text;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
namespace WarsztatyCore
{
[Table("Contact")]
public class ContactModel
{
[Key]
public virtual int Id { get; set; }
[MaxLength(25)]
public virtual string Name { get; set; }
public virtual int Number { get; set; }
public virtual string Email { get; set; }
public virtual string Phone { get; set; }
public virtual DateTime CreatedDate { get; set; }
public virtual bool IsActive { get; set; }
}
}
<file_sep>using System;
namespace WarsztatyViewModel
{
public class Class1
{
}
}
| 5a799a60407ac8ecfcbffb5b124a52fec06fb573 | [
"C#"
]
| 5 | C# | Kamil01491/AzureWorkShopUnit4 | 3615e3fdb72d5b2c063987d7142ff5563f1ad96b | 32bde2420482ca4b0074c8460797fa0f7503103d |
refs/heads/master | <file_sep>package com.lagou.test;
import com.lagou.mapper.IOrderMapper;
import com.lagou.mapper.IUserMapper;
import com.lagou.pojo.User;
import org.apache.ibatis.io.Resources;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;
import org.junit.Before;
import org.junit.Test;
import java.io.IOException;
import java.io.InputStream;
/**
* @ClassName CacheTest
* @Description TODO
* @Author mao
* @Data 2021/6/23 10:13
**/
public class CacheTest {
private IUserMapper userMapper;
private SqlSession sqlSession;
@Before
public void before() throws IOException {
        //1. Resources utility class: load the config file as a byte input stream
        InputStream resourceAsStream = Resources.getResourceAsStream("sqlMapConfig.xml");
        //2. Parse the config file and create the SqlSessionFactory
        SqlSessionFactory sqlSessionFactory = new SqlSessionFactoryBuilder().build(resourceAsStream);
        //3. Produce a SqlSession
        sqlSession = sqlSessionFactory.openSession(true);// a transaction is opened by default, but it will not commit automatically
        //4. Call mapper methods through the sqlSession
        userMapper = sqlSession.getMapper(IUserMapper.class);
}
@Test
public void firstLevelCache(){
        // query the user with id 1 for the first time; the repeat should hit the first-level cache
User user1 = userMapper.findUserById(1);
User user2 = userMapper.findUserById(1);
System.out.println( user1 == user2);
}
}
<file_sep>package com.lagou.test;
import com.lagou.mapper.IOrderMapper;
import com.lagou.mapper.IUserMapper;
import com.lagou.pojo.Order;
import com.lagou.pojo.User;
import org.apache.ibatis.io.Resources;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;
import org.junit.Before;
import org.junit.Test;
import java.io.IOException;
import java.io.InputStream;
import java.util.List;
/**
* @ClassName MybatisTest
* @Description TODO
* @Author mao
* @Data 2021/6/22 15:02
**/
public class MybatisTest {
@Test
public void test1() throws IOException {
        //1. Resources utility class: load the config file as a byte input stream
        InputStream resourceAsStream = Resources.getResourceAsStream("sqlMapConfig.xml");
        //2. Parse the config file and create the SqlSessionFactory
        SqlSessionFactory sqlSessionFactory = new SqlSessionFactoryBuilder().build(resourceAsStream);
        //3. Produce a SqlSession
        SqlSession sqlSession = sqlSessionFactory.openSession();// a transaction is opened by default, but it will not commit automatically
        //4. Call mapper methods through the sqlSession
        IOrderMapper mapper = sqlSession.getMapper(IOrderMapper.class);
List<Order> orderAndUser = mapper.findOrderAndUser();
for (Order order : orderAndUser) {
System.out.println(order);
}
}
@Test
public void test2() throws IOException {
        //1. Resources utility class: load the config file as a byte input stream
        InputStream resourceAsStream = Resources.getResourceAsStream("sqlMapConfig.xml");
        //2. Parse the config file and create the SqlSessionFactory
        SqlSessionFactory sqlSessionFactory = new SqlSessionFactoryBuilder().build(resourceAsStream);
        //3. Produce a SqlSession
        SqlSession sqlSession = sqlSessionFactory.openSession();// a transaction is opened by default, but it will not commit automatically
        //4. Call mapper methods through the sqlSession
        IUserMapper mapper = sqlSession.getMapper(IUserMapper.class);
List<User> all = mapper.findAll();
for (User user : all) {
System.out.println(user);
}
}
@Test
public void test3() throws IOException {
        //1. Resources utility class: load the config file as a byte input stream
        InputStream resourceAsStream = Resources.getResourceAsStream("sqlMapConfig.xml");
        //2. Parse the config file and create the SqlSessionFactory
        SqlSessionFactory sqlSessionFactory = new SqlSessionFactoryBuilder().build(resourceAsStream);
        //3. Produce a SqlSession
        SqlSession sqlSession = sqlSessionFactory.openSession();// a transaction is opened by default, but it will not commit automatically
        //4. Call mapper methods through the sqlSession
        IUserMapper mapper = sqlSession.getMapper(IUserMapper.class);
List<User> all = mapper.findUserAndRole();
for (User user : all) {
System.out.println(user);
}
}
private IUserMapper userMapper;
private IOrderMapper orderMapper;
@Before
public void before() throws IOException {
        //1. Resources utility class: load the config file as a byte input stream
        InputStream resourceAsStream = Resources.getResourceAsStream("sqlMapConfig.xml");
        //2. Parse the config file and create the SqlSessionFactory
        SqlSessionFactory sqlSessionFactory = new SqlSessionFactoryBuilder().build(resourceAsStream);
        //3. Produce a SqlSession
        SqlSession sqlSession = sqlSessionFactory.openSession(true);// a transaction is opened by default, but it will not commit automatically
        //4. Call mapper methods through the sqlSession
        userMapper = sqlSession.getMapper(IUserMapper.class);
orderMapper = sqlSession.getMapper(IOrderMapper.class);
}
@Test
public void addUser(){
User user = new User();
user.setId(4);
user.setUsername("成都市");
userMapper.addUser(user);
}
@Test
public void updateUser(){
User user = new User();
user.setId(4);
user.setUsername("b市");
userMapper.updateUser(user);
}
@Test
public void deleteUser(){
// User user = new User();
// user.setId(4);
// user.setUsername("b市");
userMapper.deleteUser(3);
}
@Test
public void selectUser(){
// User user = new User();
// user.setId(4);
// user.setUsername("b市");
List<User> users = userMapper.selectUser();
System.out.println(users);
}
@Test
public void oneToOne(){
// User user = new User();
// user.setId(4);
// user.setUsername("b市");
List<Order> orderAndUser = orderMapper.findOrderAndUser();
System.out.println(orderAndUser);
}
@Test
public void oneToMany(){
// User user = new User();
// user.setId(4);
// user.setUsername("b市");
List<User> all = userMapper.findAll();
System.out.println(all);
}
@Test
public void manyToMany(){
// User user = new User();
// user.setId(4);
// user.setUsername("b市");
List<User> all = userMapper.findUserAndRole();
System.out.println(all);
}
}
<file_sep>package com.lagou.pojo;
/**
* @ClassName Order
* @Description TODO
* @Author mao
* @Data 2021/6/22 11:15
**/
public class Order {
private Integer id;
private String orderTime;
private Double total;
    // the user this order belongs to
private User user;
    public Integer getId() {
        return id;
    }
    public void setId(Integer id) {
        this.id = id;
    }
    public String getOrderTime() {
        return orderTime;
    }
    public void setOrderTime(String orderTime) {
        this.orderTime = orderTime;
    }
    public Double getTotal() {
        return total;
    }
    public void setTotal(Double total) {
        this.total = total;
    }
public User getUser() {
return user;
}
public void setUser(User user) {
this.user = user;
}
@Override
public String toString() {
return "Order{" +
"id=" + id +
", orderTime='" + orderTime + '\'' +
", total=" + total +
", user=" + user +
'}';
}
}
<file_sep>package com.lagou.mapper;
import com.lagou.pojo.Role;
import com.lagou.pojo.User;
import org.apache.ibatis.annotations.*;
import java.util.List;
/**
* @ClassName IUserMapper
* @Description TODO
* @Author mao
 * @Date 2021/6/22 15:15
**/
public interface IUserMapper {
    //Query all users together with their associated orders
@Select("select * from `user`")
@Results({
@Result(property = "id", column = "id"),
@Result(property = "username", column = "username"),
@Result(property = "orderList", column = "id", javaType = List.class,
many = @Many(select = "com.lagou.mapper.IOrderMapper.findOrderByUid"))
})
public List<User> findAll();
    //Query all users together with their associated roles
@Select("select * from `user`")
@Results({
@Result(property = "id", column = "id"),
@Result(property = "username", column = "username"),
@Result(property = "roleList", column = "id", javaType = List.class,
many = @Many(select = "com.lagou.mapper.IRoleMapper.findRoleByUid"))
})
public List<User> findUserAndRole();
@Insert("insert into `user` values(#{id},#{username})")
public void addUser(User user);
@Update("update user set username = #{username} where id = #{id}")
public void updateUser(User user);
@Select("select * from `user`")
public List<User> selectUser();
@Select("delete from `user` where id = #{id}")
public List<User> deleteUser(Integer id);
@Select("select * from user where id = #{id}")
public User findUserById(Integer id);
}
| 706b840db609174b344cbb7f1c91b4ca219a50eb | ["Java"] | 4 | Java | MaoLixin1998/mybatis_multitable | 0112fe582cc9a3f13af1435f73da57fb7ccdb9f9 | cd9b433a9d2e91483804cc984855b62b36f06d8d |
refs/heads/master | <file_sep>package com.thomsonreuters.ce.dynamic_wrapper.test;
import java.util.HashMap;
import java.util.Map;
public enum CmdStatus {
WAITING(1, "WAITING"),
RUNNING(2, "RUNNING"),
FINISHED(3, "FINISHED"),
FAILED(4, "FAILED");
private int id;
private String name;
static final Map<Integer, CmdStatus> StatusByCode = new HashMap<Integer, CmdStatus>();
static {
for (CmdStatus area : CmdStatus.values()) {
StatusByCode.put(area.id, area);
}
}
private CmdStatus(int id, String name) {
this.id = id;
this.name = name;
}
public int getId() {
return id;
}
public String getName() {
return name;
}
public static CmdStatus getByCode(int ID) {
return StatusByCode.get(ID);
}
}
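The `CmdStatus` enum above registers every constant in a static map so callers can resolve an enum value from its numeric code. A self-contained sketch of the same registration-in-static-initializer pattern (the `Status` enum and `StatusLookupDemo` class are hypothetical, not part of this repository):

```java
import java.util.HashMap;
import java.util.Map;

public class StatusLookupDemo {
    // Minimal re-declaration of the lookup-by-code pattern used by CmdStatus
    enum Status {
        WAITING(1), RUNNING(2), FINISHED(3), FAILED(4);

        private final int id;
        private static final Map<Integer, Status> BY_CODE = new HashMap<>();

        // The static block runs after all constants are constructed,
        // so every constant can safely be registered here.
        static {
            for (Status s : values()) {
                BY_CODE.put(s.id, s);
            }
        }

        Status(int id) { this.id = id; }

        static Status getByCode(int id) { return BY_CODE.get(id); }
    }

    public static void main(String[] args) {
        System.out.println(Status.getByCode(2));  // RUNNING
        System.out.println(Status.getByCode(99)); // null for unknown codes
    }
}
```

The map lookup is O(1), unlike scanning `values()` on every call, and unknown codes surface as `null` (callers such as `getByCode` in `CmdStatus` inherit the same behavior).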
<file_sep>package com.thomsonreuters.ce.dynamic_wrapper.test;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;
import oracle.jdbc.OracleConnection;
import oracle.jdbc.OracleDriver;
import oracle.jdbc.OracleStatement;
import oracle.jdbc.dcn.DatabaseChangeEvent;
import oracle.jdbc.dcn.DatabaseChangeListener;
import oracle.jdbc.dcn.DatabaseChangeRegistration;
public class OracleDCN {
static final String USERNAME = "mpd2_cnr";
static final String PASSWORD = "<PASSWORD>";
static String URL = "jdbc:oracle:thin:@(description = (ADDRESS = (PROTOCOL = TCP)(HOST = c419jvfdw01t.int.thomsonreuters.com)(PORT = 1521))"
+ "(ADDRESS = (PROTOCOL = TCP)(HOST = c569xrtdw02t.int.thomsonreuters.com)(PORT = 1521))(load_balance = on)(connect_data ="
+ "(server = dedicated)(service_name = ord510a.int.thomsonreuters.com)))";
public static void main(String[] args) {
OracleDCN oracleDCN = new OracleDCN();
try {
oracleDCN.run();
} catch (Exception ex) {
ex.printStackTrace();
}
}
private void run() throws Exception {
OracleConnection conn = connect();
Properties prop = new Properties();
prop.setProperty(OracleConnection.DCN_NOTIFY_ROWIDS, "true");
prop.setProperty(OracleConnection.DCN_IGNORE_DELETEOP, "true");
prop.setProperty(OracleConnection.DCN_IGNORE_UPDATEOP, "true");
DatabaseChangeRegistration dcr = conn.registerDatabaseChangeNotification(prop);
try {
dcr.addListener(new DCNDemoListener(this));
Statement stmt = conn.createStatement();
((OracleStatement) stmt).setDatabaseChangeRegistration(dcr);
ResultSet rs = stmt.executeQuery("SELECT * FROM universal_config");
            while (rs.next()) {
                // drain the result set so the change registration covers the queried rows
            }
rs.close();
stmt.close();
} catch (SQLException ex) {
if (conn != null) {
conn.unregisterDatabaseChangeNotification(dcr);
conn.close();
}
throw ex;
}
}
OracleConnection connect() throws SQLException {
OracleDriver dr = new OracleDriver();
Properties prop = new Properties();
prop.setProperty("user", OracleDCN.USERNAME);
prop.setProperty("password", <PASSWORD>);
return (OracleConnection) dr.connect(OracleDCN.URL, prop);
}
}
class DCNDemoListener implements DatabaseChangeListener {
OracleDCN demo;
DCNDemoListener(OracleDCN dem) {
demo = dem;
}
public void onDatabaseChangeNotification(DatabaseChangeEvent e) {
Thread t = Thread.currentThread();
System.out.println("DCNDemoListener: got an event (" + this + " running on thread " + t + ")");
System.out.println(e.toString());
System.out.println("Changed row id : "+e.getTableChangeDescription()[0].getRowChangeDescription()[0].getRowid().stringValue());
synchronized (demo) {
demo.notify();
}
}
}<file_sep>package com.thomsonreuters.ce.dynamic_wrapper.test;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.FilenameFilter;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLConnection;
import java.security.Permission;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import java.util.TreeMap;
import java.util.jar.JarEntry;
import java.util.jar.JarInputStream;
import java.util.jar.JarOutputStream;
import java.util.zip.GZIPInputStream;
import org.apache.tools.tar.TarEntry;
import org.apache.tools.tar.TarInputStream;
import org.apache.commons.io.FileUtils;
import com.thomsonreuters.ce.database.EasyConnection;
public class ApplicationVersion {
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getVersion() {
return version;
}
public void setVersion(String version) {
this.version = version;
}
public String getRelease_media_path() {
return release_media_path;
}
public void setRelease_media_path(String release_media_path) {
this.release_media_path = release_media_path;
}
public String getMain_class() {
return main_class;
}
public void setMain_class(String main_class) {
this.main_class = main_class;
}
public ApplicationStatus getApp_status() {
return app_status;
}
public void setApp_status(ApplicationStatus app_status) {
this.app_status = app_status;
}
private static String strSQL="select id, version, release_media_path, main_class, status from hackathon_application_version where app_id=?";
private static String UpdateAppVerStatus="Update hackathon_application_version set status=? where id=?";
private static HashMap<Integer, ApplicationVersion> AppVerList= new HashMap<Integer, ApplicationVersion>();
private int id;
private String version;
private String release_media_path;
private String main_class;
private ApplicationStatus app_status;
private DynamicClassLoader DCL=null;
public ApplicationVersion(int id, String version,
String release_media_path, String main_class,
ApplicationStatus app_status) {
super();
this.id = id;
this.version = version;
this.release_media_path = release_media_path;
this.main_class = main_class;
this.app_status = app_status;
}
public static void LoadAllVersions()
{
Connection DBConn=null;
try {
DBConn = new EasyConnection(ApplicationStarter.dbpool_name);
PreparedStatement objGetAllAppVersions=DBConn.prepareStatement(strSQL);
objGetAllAppVersions.setInt(1, ApplicationStarter.app_id);
ResultSet objResult=objGetAllAppVersions.executeQuery();
while (objResult.next())
{
int id=objResult.getInt("id");
String version=objResult.getString("version");
String release_media_path=objResult.getString("release_media_path");
String main_class=objResult.getString("main_class");
ApplicationStatus app_status=ApplicationStatus.getByCode(objResult.getInt("status"));
AppVerList.put(id, new ApplicationVersion(id,version,release_media_path,main_class,app_status));
}
objResult.close();
objGetAllAppVersions.close();
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally
{
try {
DBConn.close();
} catch (SQLException ex) {
ex.printStackTrace();
}
}
}
public static ApplicationVersion getByVersion(int ver_id)
{
return AppVerList.get(ver_id);
}
public boolean Perform(ControlCmd cmd, String parameter)
{
boolean result;
        if (cmd.equals(ControlCmd.START) && (app_status.equals(ApplicationStatus.STOPPED)))
{
UpdateAppVerStatus(ApplicationStatus.STARTING);
result=Start();
if (result)
{
UpdateAppVerStatus(ApplicationStatus.STARTED);
}
else
{
UpdateAppVerStatus(ApplicationStatus.STOPPED);
}
}
        else if (cmd.equals(ControlCmd.STOP) && (app_status.equals(ApplicationStatus.STARTED)))
{
UpdateAppVerStatus(ApplicationStatus.STOPPING);
result=Stop();
if (result)
{
UpdateAppVerStatus(ApplicationStatus.STOPPED);
}
else
{
UpdateAppVerStatus(ApplicationStatus.STARTED);
}
}
        else if (cmd.equals(ControlCmd.INSTALL) && (app_status.equals(ApplicationStatus.RELEASED)))
{
UpdateAppVerStatus(ApplicationStatus.INSTALLING);
result=Install(parameter);
if (result)
{
UpdateAppVerStatus(ApplicationStatus.STOPPED);
}
else
{
UpdateAppVerStatus(ApplicationStatus.RELEASED);
}
}
        else if (cmd.equals(ControlCmd.UNINSTALL) && (app_status.equals(ApplicationStatus.STOPPED)))
{
UpdateAppVerStatus(ApplicationStatus.UNINSTALLING);
result=Uninstall();
if (result)
{
UpdateAppVerStatus(ApplicationStatus.RELEASED);
}
else
{
UpdateAppVerStatus(ApplicationStatus.STOPPED);
}
}
else
{
ApplicationStarter.thisLogger.warn("Can not execute command:"+cmd.getName()+" on version:" + this.version);
result=false;
}
return result;
}
public boolean Start()
{
try {
ApplicationStarter.thisLogger.info("Starting application:"+ApplicationStarter.app_name+" version:"+version);
String version_install_path=ApplicationStarter.install_path+"/"+version;
DCL=new DynamicClassLoader(GetClassPath(version_install_path));
Thread.currentThread().setContextClassLoader(DCL);
Class class_main=DCL.loadClass(main_class);
Method mainMethod = class_main.getMethod("main", String[].class);
try {
mainMethod.invoke(null, (Object)new String[]{"start"});
} catch (InvocationTargetException e) {
// TODO Auto-generated catch block
ApplicationStarter.thisLogger.error(e.getTargetException());
}
ApplicationStarter.thisLogger.info("Application:"+ApplicationStarter.app_name+" version:"+version+" has been started");
return true;
} catch (Exception e) {
// TODO Auto-generated catch block
ApplicationStarter.thisLogger.error("Error occured while executing start command.", e);
return false;
}
}
private boolean Stop()
{
try {
ApplicationStarter.thisLogger.info("Stopping application:"+ApplicationStarter.app_name+" version:"+version);
if (DCL!=null)
{
Class class_main=DCL.loadClass(main_class);
Method mainMethod = class_main.getMethod("main", String[].class);
try {
mainMethod.invoke(null, (Object)new String[]{"stop"});
} catch (InvocationTargetException e) {
ApplicationStarter.thisLogger.error(e.getTargetException());
}
DCL=null;
System.gc();
}
ApplicationStarter.thisLogger.info("Application:"+ApplicationStarter.app_name+" version:"+version+" has been stopped");
return true;
} catch (Exception e) {
ApplicationStarter.thisLogger.error("Error occured while executing stop command.", e);
return false;
}
}
private boolean Install(String configURLString)
{
try {
ApplicationStarter.thisLogger.info("Installing application:"+ApplicationStarter.app_name+" version:"+version+" to folder:"+ApplicationStarter.install_path+"/"+version);
String mainJarName = ApplicationStarter.app_name+"-"+version+".jar";
File dir = new File(ApplicationStarter.install_path+"/"+version);
if (dir.exists())
{
FileUtils.cleanDirectory(dir);
}
InputStream release = getInstallingStream(release_media_path, null, null);
GZIPInputStream gis = new GZIPInputStream(release);
TarInputStream tin = new TarInputStream(gis);
TarEntry te;
while ((te = tin.getNextEntry()) != null) {
if (te.isDirectory()) {
new File(dir, te.getName()).mkdirs();
ApplicationStarter.thisLogger.info("Created folder:"+te.getName());
continue;
}
String outName = te.getName();
File out = new File(dir, outName);
try (FileOutputStream fos = new FileOutputStream(out)) {
if (outName.toLowerCase().endsWith(mainJarName)) {
ApplicationStarter.thisLogger.info("Found main application jar file:"+outName);
JarInputStream jis = new JarInputStream(tin);
try (JarOutputStream jos = new JarOutputStream(new BufferedOutputStream(fos))) {
JarEntry je;
while ((je = jis.getNextJarEntry()) != null) {
JarEntry outputEntry = new JarEntry(je.getName());
outputEntry.setTime(je.getTime());
jos.putNextEntry(outputEntry);
if (je.getName().toLowerCase().endsWith(".properties")) {
try(InputStream resInputStream = getInstallingStream(configURLString + "/" + je.getName(), "pcadmin", "Hgg41kkt")){
write(resInputStream, jos);
}
catch (Exception e){
//System.out.println(e.getMessage());
write(jis, jos);
}
} else {
write(jis, jos);
}
}
}
} else {
write(tin, fos);
ApplicationStarter.thisLogger.info("Installed "+out.getAbsolutePath());
}
}
}
ApplicationStarter.thisLogger.info("Application:"+ApplicationStarter.app_name+" version:"+version+" has been installed to folder:"+ApplicationStarter.install_path+"/"+version);
return true;
} catch (Exception e) {
ApplicationStarter.thisLogger.error("Error occured while executing install command with parameter \""+configURLString+"\" .", e);
return false;
}
}
private void write(InputStream is, OutputStream os) throws Exception {
byte[] b = new byte[8192];
int c;
while ((c = is.read(b, 0, b.length)) != -1) {
os.write(b, 0, c);
}
}
private InputStream getInstallingStream(String urlString, String username, String password) throws Exception {
URL u = new URL(urlString);
InputStream is;
if (username != null) {
String userPassword = username + ":" + password;
            // java.util.Base64 replaces the internal, non-portable sun.misc.BASE64Encoder
            String encoding = java.util.Base64.getEncoder().encodeToString(userPassword.getBytes());
URLConnection uc = u.openConnection();
uc.setRequestProperty("Authorization", "Basic " + encoding);
is = uc.getInputStream();
} else {
is = u.openStream();
}
return is;
}
private boolean Uninstall()
{
try{
ApplicationStarter.thisLogger.info("Uninstalling application:"+ApplicationStarter.app_name+" version:"+version+" in folder:"+ApplicationStarter.install_path+"/"+version);
FileUtils.deleteDirectory(new File(ApplicationStarter.install_path,version));
ApplicationStarter.thisLogger.info("Application:"+ApplicationStarter.app_name+" version:"+version+" in folder:"+ApplicationStarter.install_path+"/"+version +" has been uninstalled");
return true;
} catch (Exception e) {
ApplicationStarter.thisLogger.error("Error occured while executing uninstall command.", e);
return false;
}
}
private void UpdateAppVerStatus(ApplicationStatus AS)
{
Connection DBConn=null;
try {
DBConn = new EasyConnection(ApplicationStarter.dbpool_name);
PreparedStatement objupdateappverstatus=DBConn.prepareStatement(UpdateAppVerStatus);
objupdateappverstatus.setInt(1, AS.getId());
objupdateappverstatus.setInt(2, id);
objupdateappverstatus.executeUpdate();
objupdateappverstatus.close();
DBConn.commit();
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally
{
try {
DBConn.close();
} catch (SQLException ex) {
ex.printStackTrace();
}
}
this.app_status=AS;
}
private static String GetClassPath(String install_path)
{
StringBuffer sbPath=new StringBuffer();
Properties prop = new Properties();
try {
FileInputStream fis = new FileInputStream(install_path+"/etc/wrapper.conf");
prop.load(fis);
} catch (Exception e) {
throw new RuntimeException("Error in parsing warpper.conf");
}
TreeMap<Integer, String> pathMap=new TreeMap<Integer, String>();
Iterator it= prop.entrySet().iterator();
while(it.hasNext()){
Map.Entry<String, String> entry=(Map.Entry<String, String>)it.next();
String key = entry.getKey();
if (key.startsWith("wrapper.java.classpath."))
{
int pathorder=Integer.valueOf(key.substring(23));
String classpath=entry.getValue();
if (classpath.startsWith("%REPO_DIR%"))
{
classpath=install_path+"/repo/"+classpath.substring(11);
}
else
{
classpath=install_path+"/"+classpath;
}
pathMap.put(pathorder, classpath);
}
}
String path_separator=System.getProperty("path.separator");
it = pathMap.entrySet().iterator();
while(it.hasNext()){
Map.Entry<String, String> entry1=(Map.Entry<String, String>)it.next();
if (sbPath.length()==0)
{
sbPath.append(entry1.getValue());
}
else
{
sbPath.append(path_separator).append(entry1.getValue());
}
}
return sbPath.toString();
}
public static void writeToArr(File dir, FilenameFilter searchSuffix, StringBuffer Classpath) {
File []files = dir.listFiles();
for(File f : files){
if(f.isDirectory()){
//µÝ¹éÁË¡£
writeToArr(f, searchSuffix, Classpath);
}else{
if(searchSuffix.accept(dir, f.getName())){
Classpath.append(";"+f.getAbsolutePath());
}
}
}
}
}
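`GetClassPath` above collects `wrapper.java.classpath.N` entries into a `TreeMap<Integer, String>` so that entries are joined in numeric order of `N` rather than string order. A self-contained sketch of that ordering logic (in-memory `Properties` and the `ClasspathOrderDemo` class are hypothetical; the real code reads `etc/wrapper.conf` and the platform path separator):

```java
import java.util.Properties;
import java.util.TreeMap;

public class ClasspathOrderDemo {
    public static void main(String[] args) {
        // Simulate wrapper.conf entries arriving in arbitrary order
        Properties prop = new Properties();
        prop.setProperty("wrapper.java.classpath.2", "lib/b.jar");
        prop.setProperty("wrapper.java.classpath.10", "lib/c.jar");
        prop.setProperty("wrapper.java.classpath.1", "lib/a.jar");
        prop.setProperty("wrapper.app.name", "ignored");

        // Integer keys sort numerically, so .10 follows .2 (a string sort would not)
        TreeMap<Integer, String> ordered = new TreeMap<>();
        String prefix = "wrapper.java.classpath.";
        for (String key : prop.stringPropertyNames()) {
            if (key.startsWith(prefix)) {
                ordered.put(Integer.valueOf(key.substring(prefix.length())), prop.getProperty(key));
            }
        }

        // Join in ascending key order (":" hardcoded here for a deterministic demo)
        String joined = String.join(":", ordered.values());
        System.out.println(joined); // lib/a.jar:lib/b.jar:lib/c.jar
    }
}
```

Using `prefix.length()` instead of a hardcoded `23` avoids the off-by-one risk the original's magic number carries if the prefix ever changes.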
| a0d727e1e244336dcaf1435d362f5600197912ec | ["Java"] | 3 | Java | sihanwang/dynamic_wrapper | 66d736b1fad2ac901e8ee24aa31da8fadeaf070e | f10927d8583d690d7022814660b3d26788dc2619 |
refs/heads/master | <file_sep>using Microsoft.AspNet.OData;
using Microsoft.AspNet.OData.Extensions;
using Microsoft.AspNet.OData.Query;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Mvc.ApiExplorer;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
namespace Phx.People.API
{
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
services.AddSingleton(Configuration);
Config.NETCore.PhxNetCoreConfig.Services = services;
Config.NETCore.PhxNetCoreConfig.AddSnapshotCollector();
Config.NETCore.PhxNetCoreConfig.AddCors();
Config.NETCore.PhxNetCoreConfig.AddApi<Startup>();
Config.NETCore.PhxNetCoreConfig.AddMvc(false);
// add standalone odata support
services.AddOData();
}
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IApiVersionDescriptionProvider provider)
{
Config.NETCore.PhxNetCoreConfig.Configure(app, provider, configureRoutes: builder =>
{
// add standalone odata support
builder.EnableDependencyInjection();
builder
.Select()
.Filter()
.Count()
.OrderBy();
});
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Fabric;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.ServiceFabric.Services.Communication.AspNetCore;
using Microsoft.ServiceFabric.Services.Communication.Runtime;
using Microsoft.ServiceFabric.Services.Runtime;
using Serilog;
using Serilog.Core.Enrichers;
using ILogger = Serilog.ILogger;
namespace Phx.People.API
{
/// <summary>
/// The FabricRuntime creates an instance of this class for each service type instance.
/// </summary>
internal sealed class API : StatelessService
{
public ILogger<API> Logger { get; }
public API(StatelessServiceContext context, ILogger logger)
: base(context)
{
PropertyEnricher[] properties = new PropertyEnricher[]
{
new PropertyEnricher("ServiceTypeName", context.ServiceTypeName),
new PropertyEnricher("ServiceName", context.ServiceName),
new PropertyEnricher("PartitionId", context.PartitionId),
new PropertyEnricher("InstanceId", context.ReplicaOrInstanceId),
};
logger.ForContext(properties);
Logger = new LoggerFactory().AddSerilog(logger.ForContext(properties)).CreateLogger<API>();
}
/// <summary>
/// Optional override to create listeners (like tcp, http) for this service instance.
/// </summary>
/// <returns>The collection of listeners.</returns>
protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
{
return new ServiceInstanceListener[]
{
new ServiceInstanceListener(serviceContext =>
new KestrelCommunicationListener(serviceContext, "ServiceEndpoint", (url, listener) =>
{
ServiceEventSource.Current.ServiceMessage(serviceContext, $"Starting Kestrel on {url}");
return new WebHostBuilder()
.UseKestrel()
.UseContentRoot(AppContext.BaseDirectory)
.ConfigureAppConfiguration((hostingContext, config) =>
{
config
.SetBasePath(hostingContext.HostingEnvironment.ContentRootPath)
.AddJsonFile("phxappsettings.json", false, true)
.AddJsonFile(
$"phxappsettings.{hostingContext.HostingEnvironment.EnvironmentName}.json",
true, true)
.AddJsonFile("appsettings.json", false, true)
.AddJsonFile(
$"appsettings.{hostingContext.HostingEnvironment.EnvironmentName}.json", true,
true)
.AddEnvironmentVariables();
})
.ConfigureServices(services => services.AddSingleton(serviceContext))
.UseStartup<Startup>()
.UseServiceFabricIntegration(listener, ServiceFabricIntegrationOptions.UseUniqueServiceUrl)
.UseUrls(url)
.UseSerilog()
.UseApplicationInsights()
.Build();
}))
};
}
}
}
<file_sep>using Microsoft.ServiceFabric.Services.Runtime;
using System;
using System.Diagnostics;
using System.Threading;
using Serilog;
namespace Phx.People.API
{
internal static class Program
{
/// <summary>
/// This is the entry point of the service host process.
/// </summary>
private static void Main()
{
try
{
// The ServiceManifest.XML file defines one or more service type names.
// Registering a service maps a service type name to a .NET type.
// When Service Fabric creates an instance of this service type,
// an instance of the class is created in this host process.
Log.Logger = new LoggerConfiguration()
.MinimumLevel.Verbose()
.Enrich.FromLogContext()
.Enrich.WithAssemblyName()
.WriteTo.Debug(outputTemplate:
"[{Timestamp:HH:mm:ss} {AssemblyName} {Level:u3}] {Message:lj}{NewLine}{Exception}")
.CreateLogger();
ServiceRuntime.RegisterServiceAsync("Phx.People.APIType",
context => new API(context, Log.Logger)).GetAwaiter().GetResult();
ServiceEventSource.Current.ServiceTypeRegistered(Process.GetCurrentProcess().Id, typeof(API).Name);
// Prevents this host process from terminating so services keeps running.
Thread.Sleep(Timeout.Infinite);
}
catch (Exception e)
{
ServiceEventSource.Current.ServiceHostInitializationFailed(e.ToString());
throw;
}
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Fabric;
using System.IO;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.ServiceFabric.Services.Communication.AspNetCore;
using Microsoft.ServiceFabric.Services.Communication.Runtime;
using Microsoft.ServiceFabric.Services.Runtime;
using Ocelot.DependencyInjection;
using Ocelot.Middleware;
using Serilog;
using Serilog.Core.Enrichers;
using ILogger = Serilog.ILogger;
namespace Phx.Gateway
{
/// <summary>
/// The FabricRuntime creates an instance of this class for each service type instance.
/// </summary>
internal sealed class Gateway : StatelessService
{
public ILogger<Gateway> Logger { get; }
public Gateway(StatelessServiceContext context, ILogger logger)
: base(context)
{
PropertyEnricher[] properties = new PropertyEnricher[]
{
new PropertyEnricher("ServiceTypeName", context.ServiceTypeName),
new PropertyEnricher("ServiceName", context.ServiceName),
new PropertyEnricher("PartitionId", context.PartitionId),
new PropertyEnricher("InstanceId", context.ReplicaOrInstanceId),
};
logger.ForContext(properties);
Logger = new LoggerFactory().AddSerilog(logger.ForContext(properties)).CreateLogger<Gateway>();
}
/// <summary>
/// Optional override to create listeners (like tcp, http) for this service instance.
/// </summary>
/// <returns>The collection of listeners.</returns>
protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
{
return new ServiceInstanceListener[]
{
new ServiceInstanceListener(serviceContext =>
new KestrelCommunicationListener(serviceContext, "ServiceEndpoint", (url, listener) =>
{
ServiceEventSource.Current.ServiceMessage(serviceContext, $"Starting Kestrel on {url}");
return new WebHostBuilder()
.UseKestrel()
.UseContentRoot(AppContext.BaseDirectory)
.ConfigureAppConfiguration((hostingContext, config) =>
{
config
.SetBasePath(hostingContext.HostingEnvironment.ContentRootPath)
.AddJsonFile("appsettings.json", false, true)
.AddJsonFile(
$"appsettings.{hostingContext.HostingEnvironment.EnvironmentName}.json", true,
true)
.AddJsonFile("ocelot.json", false, true)
.AddEnvironmentVariables();
})
.ConfigureServices(services => services.AddSingleton(serviceContext))
.UseStartup<Startup>()
.UseServiceFabricIntegration(listener, ServiceFabricIntegrationOptions.None)
.UseUrls(url)
.UseSerilog()
.Build();
}))
};
}
}
}
<file_sep>using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Extensions.Configuration;
using Microsoft.ServiceFabric.Services.Remoting;
using Phx.People.Data;
namespace Phx.People.v1_0.Business
{
public interface IBusinessClient : IService
{
Task<IEnumerable<IConfigurationSection>> GetConfiguration();
Task<List<Person>> AllPeople();
}
}
<file_sep>using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.ServiceFabric.Services.Remoting.Client;
using Phx.Cars.v1_0.Business;
namespace Phx.Cars.API.v1_0.Controllers
{
[ApiController]
[Route("v{version:apiVersion}/[controller]")]
public class RemoteController : ControllerBase
{
[HttpGet(nameof(CallCommunicator))]
public async Task<ActionResult<string>> CallCommunicator()
{
var communicatorClient = ServiceProxy.Create<IBusinessClient>(BusinessService.Uri);
var name = await communicatorClient.Hello();
return name;
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Fabric;
using System.Threading.Tasks;
using Microsoft.ServiceFabric.Services.Communication.Runtime;
using Microsoft.ServiceFabric.Services.Remoting.Runtime;
using Microsoft.ServiceFabric.Services.Runtime;
namespace Phx.Cars.v1_0.Business
{
public static class BusinessService
{
public static Uri Uri => new Uri($"{Environment.GetEnvironmentVariable("Fabric_ApplicationName")}/Phx.Cars.v1_0.Business");
}
/// <summary>
/// An instance of this class is created for each service instance by the Service Fabric runtime.
/// </summary>
internal sealed class Business : StatelessService, IBusinessClient
{
public Business(StatelessServiceContext context)
: base(context)
{ }
protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
{
return this.CreateServiceRemotingInstanceListeners();
}
        public Task<string> Hello()
        {
            // No awaited work yet, so avoid marking the stub async
            throw new System.NotImplementedException();
        }
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Fabric;
using System.Threading.Tasks;
using Autofac.Features.OwnedInstances;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.ServiceFabric.Services.Communication.Runtime;
using Microsoft.ServiceFabric.Services.Remoting.Runtime;
using Microsoft.ServiceFabric.Services.Runtime;
using Phx.People.Data;
namespace Phx.People.v1_0.Business
{
public static class BusinessService
{
public static Uri Uri => new Uri($"{Environment.GetEnvironmentVariable("Fabric_ApplicationName")}/Phx.People.v1_0.Business");
}
/// <summary>
/// An instance of this class is created for each service instance by the Service Fabric runtime.
/// </summary>
public class Business : StatelessService, IBusinessClient
{
private readonly IConfiguration _config;
private readonly Func<Owned<PeopleContext>> _peopleCtx;
public Business(StatelessServiceContext svcCtx, IConfiguration config, Func<Owned<PeopleContext>> peopleCtx)
: base(svcCtx)
{
_config = config;
_peopleCtx = peopleCtx;
}
protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
{
return this.CreateServiceRemotingInstanceListeners();
}
        public Task<IEnumerable<IConfigurationSection>> GetConfiguration()
        {
            // No awaited work here, so return a completed task instead of marking the method async
            return Task.FromResult(_config.GetChildren());
        }
public async Task<List<Person>> AllPeople()
{
using (var ctx = _peopleCtx().Value)
{
return await ctx.People.ToListAsync();
}
}
}
}
<file_sep>using Microsoft.EntityFrameworkCore;
namespace Phx.People.Data
{
public class PeopleContext : DbContext
{
public DbSet<Person> People { get; set; }
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
if (!optionsBuilder.IsConfigured)
{
optionsBuilder.UseInMemoryDatabase("Phx.People.InMem");
}
//Database.EnsureCreated();
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Person>().HasData(
new Person
{
Id = 1,
Name = "Denis",
Age = 30
},
new Person
{
Id = 2,
Name = "Eric",
Age = 26
});
}
}
}
<file_sep>using System;
using System.Diagnostics;
using System.Threading;
using Autofac;
using Autofac.Integration.ServiceFabric;
using Microsoft.Extensions.Configuration;
using Phx.People.Data;
namespace Phx.People.v1_0.Business
{
internal static class Program
{
/// <summary>
/// This is the entry point of the service host process.
/// </summary>
private static void Main()
{
try
{
// The ServiceManifest.XML file defines one or more service type names.
// Registering a service maps a service type name to a .NET type.
// When Service Fabric creates an instance of this service type,
// an instance of the class is created in this host process.
// Start with the trusty old container builder.
var builder = new ContainerBuilder();
// Build configuration
var config = (IConfiguration)new ConfigurationBuilder()
.AddEnvironmentVariables()
.Build();
// Register configuration
builder.RegisterInstance(config)
.As<IConfiguration>()
.SingleInstance();
// Register DB context
builder.RegisterType<PeopleContext>();
// Register the Autofac magic for Service Fabric support.
builder.RegisterServiceFabricSupport();
// Register a stateless service...
builder.RegisterStatelessService<Business>("Phx.People.v1_0.BusinessType");
using (builder.Build())
{
ServiceEventSource.Current.ServiceTypeRegistered(
Process.GetCurrentProcess().Id,
typeof(Business).Name);
// Prevents this host process from terminating so services keep running.
Thread.Sleep(Timeout.Infinite);
}
}
catch (Exception e)
{
ServiceEventSource.Current.ServiceHostInitializationFailed(e.ToString());
throw;
}
}
}
}
<file_sep>using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Mvc.ApiExplorer;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
namespace Phx.Cars.API
{
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
services.AddSingleton(Configuration);
Config.NETCore.PhxNetCoreConfig.Services = services;
Config.NETCore.PhxNetCoreConfig.AddSnapshotCollector();
Config.NETCore.PhxNetCoreConfig.AddCors();
Config.NETCore.PhxNetCoreConfig.AddApi<Startup>();
Config.NETCore.PhxNetCoreConfig.AddMvc(false);
}
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IApiVersionDescriptionProvider provider)
{
Phx.Config.NETCore.PhxNetCoreConfig.Configure(app, provider);
}
}
}
<file_sep>using System.Threading.Tasks;
using Microsoft.ServiceFabric.Services.Remoting;
namespace Phx.Cars.v1_0.Business
{
public interface IBusinessClient : IService
{
Task<string> Hello();
}
}
<file_sep>using System;
using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Authorization;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Ocelot.DependencyInjection;
using Ocelot.Middleware;
namespace Phx.Gateway
{
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
// Add authentication
services
.AddAuthentication("ADB2C")
.AddAzureADB2CBearer(
"ADB2C",
"Jwt",
options => { Configuration.Bind("AzureAd", options); });
const string applicationPolicy = "PhxAppAuth";
services.AddAuthorization(options =>
{
// Add application authorization policy
options.AddPolicy(applicationPolicy, policy =>
{
policy.RequireAuthenticatedUser();
});
});
// Add Ocelot integration
services.AddOcelot();
services.AddMvc(options =>
{
// Add application policy as default for all requests
options.Filters.Add(new AuthorizeFilter(applicationPolicy));
options.AllowCombiningAuthorizeFilters = false;
}).SetCompatibilityVersion(CompatibilityVersion.Version_2_2);
// Permissive "AllowAll" CORS policy until further notice
services.AddCors(options =>
{
options.AddPolicy("AllowAll", p =>
{
p.AllowCredentials()
.AllowAnyHeader()
.AllowAnyMethod()
.AllowAnyOrigin()
.SetPreflightMaxAge(TimeSpan.FromDays(7))
.Build();
});
});
}
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
app
.UseHsts()
.UseHttpsRedirection()
.UseAuthentication()
.UseDeveloperExceptionPage();
app
.UseCors("AllowAll")
.UseMvc();
// Add Ocelot integration
var configuration = new OcelotPipelineConfiguration
{
PreQueryStringBuilderMiddleware = async (ctx, next) =>
{
await next.Invoke();
}
};
app.UseOcelot(configuration).Wait();
}
}
}
<file_sep>using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNet.OData;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.ServiceFabric.Services.Remoting.Client;
using Phx.People.Data;
using Phx.People.v1_0.Business;
namespace Phx.People.API.v1_0.Controllers
{
[ApiController]
[Route("v{version:apiVersion}/[controller]")]
public class RemoteController : ControllerBase
{
private readonly IBusinessClient _peopleClient;
public RemoteController()
{
_peopleClient = ServiceProxy.Create<IBusinessClient>(BusinessService.Uri);
}
[HttpGet(nameof(GetConfiguration))]
public async Task<IEnumerable<IConfigurationSection>> GetConfiguration()
{
var configurationSections = await _peopleClient.GetConfiguration();
return configurationSections;
}
[HttpGet(nameof(People))]
[EnableQuery]
public async Task<IQueryable<Person>> People()
{
var people = await _peopleClient.AllPeople();
return people.AsQueryable();
}
}
}
| f92643248ad99fd6dfa83f473e18188f34fb2602 | [
"C#"
]
| 14 | C# | denious/sf-ocelot | 35baf186cd73c5b3e3e4ac6434171a008f112b30 | 42ffb5429f9dfa25a4962f4593d121c82c75a9b4 |
refs/heads/master | <file_sep>using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using ANDREICSLIB;
using ANDREICSLIB.Helpers;
using ANDREICSLIB.Licensing;
using ANDREICSLIB.NewControls;
namespace Timezone_Sleep_Converter
{
public partial class Form1 : Form
{
#region licensing
private const String HelpString = "";
private readonly String OtherText =
@"©" + DateTime.Now.Year +
@" <NAME> (http://www.andreigec.net)
Licensed under GNU LGPL (http://www.gnu.org/)
Zip Assets © SharpZipLib (http://www.sharpdevelop.net/OpenSource/SharpZipLib/)
";
#endregion
public Form1()
{
InitializeComponent();
}
private void exitToolStripMenuItem_Click(object sender, EventArgs e)
{
Application.Exit();
}
private void Form1_Load(object sender, EventArgs e)
{
CustomTimeZones.Create();
controller.InitDropDownWithTimeZones(fromTZ);
controller.InitDropDownWithTimeZones(toTZ);
toTZ.SelectedIndex = 7;
SetT1Hours(T1Bar.BarMaximumValue, T1Bar.BarMinimumValue);
SetT2Hours(T2Bar.BarMaximumValue, T2Bar.BarMinimumValue);
GetOptions();
Licensing.LicensingForm(this, menuStrip1, HelpString, OtherText);
}
private void setmytime_Click(object sender, EventArgs e)
{
var me = CustomTimeZones.GetMyTimeZone();
var s = me.ToString();
foreach (var v in fromTZ.Items)
{
if (v.Equals(s))
{
fromTZ.SelectedItem = v;
return;
}
}
}
private void SetT1Hours(int barmax, int barmin)
{
var bar = barmax - barmin;
if (bar < 0)
bar = 24 + bar;
T1hoursawakelabel.Text = controller.HoursAwakeString(bar);
T1hoursasleeplabel.Text = controller.HoursAsleepString(24 - bar);
}
private void SetT2Hours(int barmax, int barmin)
{
var bar = barmax - barmin;
if (bar < 0)
bar = 24 + bar;
T2hoursawakelabel.Text = controller.HoursAwakeString(bar);
T2hoursasleeplabel.Text = controller.HoursAsleepString(24 - bar);
}
private void dragBar1_BarValueChange(DragBar entry)
{
SetT1Hours(entry.BarMaximumValue, entry.BarMinimumValue);
GetOptions();
}
private void T2Bar_BarValueChange(DragBar entry)
{
SetT2Hours(entry.BarMaximumValue, entry.BarMinimumValue);
GetOptions();
}
private void GetOptions()
{
if (fromTZ.Text.Length == 0 || toTZ.Text.Length == 0)
return;
option1.Text = "";
var diffhours = 0;
var diffmins = 0;
var early = false;
//wake
controller.GetDiffHours(T1Bar, T2Bar, true, out diffhours, out diffmins, out early, fromTZ.Text, toTZ.Text);
option1.Text = controller.MakeTimeString("Wake up", diffhours, diffmins, early);
//sleep
controller.GetDiffHours(T1Bar, T2Bar, false, out diffhours, out diffmins, out early, fromTZ.Text, toTZ.Text);
option2.Text = controller.MakeTimeString("Sleep", diffhours, diffmins, early);
}
private void fromTZ_TextChanged(object sender, EventArgs e)
{
GetOptions();
}
private void toTZ_TextChanged(object sender, EventArgs e)
{
GetOptions();
}
}
}
<file_sep>Timezone-Sleep-Converter
========================
If you go overseas and want to maintain the same sleeping pattern time-wise,
Timezone Sleep Converter will tell you by how many hours you need to shift your
current sleeping pattern to avoid jetlag.
Images
======


<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using ANDREICSLIB;
using System.Windows.Forms;
using ANDREICSLIB.Helpers;
using ANDREICSLIB.NewControls;
namespace Timezone_Sleep_Converter
{
public class controller
{
public static string HoursAwakeString(int hours)
{
return "Hours Awake:" + hours.ToString();
}
public static void InitDropDownWithTimeZones(ComboBox CB)
{
CB.Items.Clear();
var index = 0;
var me = CustomTimeZones.GetMyTimeZone();
foreach (var v in CustomTimeZones.Zones)
{
CB.Items.Add(v.ToString());
if (me == v)
CB.SelectedIndex = index;
index++;
}
}
public static string HoursAsleepString(int hours)
{
return "Hours Asleep:" + hours.ToString();
}
public static void GetDiffHours(DragBar T1, DragBar T2, bool min, out int diffhours, out int diffmins, out bool early, string fromTZ, string toTZ)
{
diffhours = diffmins = 0;
var here = CustomTimeZones.FromString(fromTZ);
var there = CustomTimeZones.FromString(toTZ);
var TS1 = here.UTCoffset - there.UTCoffset;
var wakeH = T1.BarMinimumValue;
var wakeM = 0;
var wakeH2 = T2.BarMinimumValue;
var wakeM2 = 0;
if (min == false)
{
wakeH = T1.BarMaximumValue;
wakeH2 = T2.BarMaximumValue;
}
//option1
while (wakeH != wakeH2 || wakeM != wakeM2)
{
if (wakeH > wakeH2)
{
wakeH--;
diffhours--;
}
else if (wakeH < wakeH2)
{
wakeH++;
diffhours++;
}
if (wakeM > wakeM2)
{
wakeM -= 30;
diffmins -= 30;
}
else if (wakeM < wakeM2)
{
wakeM += 30;
diffmins += 30;
}
}
diffhours += TS1.Hours;
diffmins += TS1.Minutes;
diffhours %= 24;
if (diffhours > 12)
{
diffhours = 24 - diffhours;
early = true;
}
else if (diffhours < -12)
{
diffhours = 24 + diffhours;
early = false;
}
else
early = !(diffhours >= 0);
diffhours = Math.Abs(diffhours);
diffmins = Math.Abs(diffmins);
}
public static String MakeTimeString(String prepend, int hours, int minutes, bool early)
{
var earlierstr = "earlier";
if (early == false)
earlierstr = "later";
String hour;
if (hours == 1)
hour = " Hour ";
else
hour = " Hours ";
String mins;
if (minutes == 1)
mins = " Minute ";
else
mins = " Minutes ";
var ret = prepend + " ";
if (hours != 0)
{
ret += hours.ToString() + hour + earlierstr;
if (minutes != 0)
ret += " and ";
}
if (minutes != 0)
{
ret += minutes.ToString() + mins + earlierstr;
}
if (hours == 0 && minutes == 0)
ret += "at the same time";
return ret;
}
}
}
| 3ea31800ad2cae79d28acbb1c53eca635ac5ddd4 | [
"Markdown",
"C#"
]
| 3 | C# | andreigec/Timezone-Sleep-Converter | 7d1860d14a5c3d10202954263c8a3e96126bbbbb | e4646ae073fcc5d134ecf852118667f6156e401d |
refs/heads/master | <repo_name>jolin1337/php-rest-api-lib<file_sep>/Response.php
<?php
namespace gd\rest;
use Exception;
class Response {
private $app = null;
private $ended = False;
function __construct ($app) {
$this->app = $app;
}
public function getApp () {
return $this->app;
}
public function hasEnded () {
return $this->ended;
}
public function withStatus ($code) {
http_response_code((int) $code);
return $this;
}
public function setHeader($headerName, $headerValue) {
header($headerName . ': ' . $headerValue);
return $this;
}
public function withContentType ($type) {
return $this->setHeader('Content-Type', $type);
}
public function redirectTo ($url) {
return $this->setHeader('Location', $url);
}
public function write ($data) {
if ($this->ended !== True) {
if (is_array($data) || $data instanceof \stdClass)
$data = json_encode($data, JSON_UNESCAPED_UNICODE);
echo $data;
} else {
throw new Exception("Error already ended response", 1);
}
return $this;
}
public function end ($moreData = '') {
$this->write($moreData);
$this->ended = True;
session_write_close();
// fastcgi_finish_request();
return $this;
}
}
?>
<file_sep>/Request.php
<?php
namespace gd\rest;
/**
*
*/
class Request {
private $app = null;
private $route = null;
private $serverOptions = [];
private $cookieOptions = [];
private $extraParams = [];
function __construct ($app, $route, $options=[], $extraParams=[]) {
if (!is_array($options)) $options = [];
$this->serverOptions = isset($options['SERVER']) ? $options['SERVER'] : $_SERVER;
$this->cookieOptions = isset($options['COOKIE']) ? $options['COOKIE'] : $_COOKIE;
$this->extraParams = $extraParams;
$this->app = $app;
$this->route = $route;
}
public function getRoute () {
return $this->route;
}
public function getApp () {
return $this->app;
}
public function usingSSL () {
return ( ! empty( $this->serverOptions['HTTPS'] ) && $this->serverOptions['HTTPS'] == 'on' );
}
public function getIp () {
// Return the first candidate address that is set and not localhost,
// checking each key before use to avoid undefined-index notices
$keys = array(
'HTTP_CLIENT_IP', 'REMOTE_ADDR', 'HTTP_X_FORWARDED_FOR',
'HTTP_X_FORWARDED', 'HTTP_FORWARDED_FOR', 'HTTP_FORWARDED'
);
foreach ($keys as $key) {
if (isset($this->serverOptions[$key]) && $this->serverOptions[$key] != '127.0.0.1')
return $this->serverOptions[$key];
}
return 'UNKNOWN';
}
public function getHostName ($useForwardedHost = false) {
$host = ( $useForwardedHost && isset( $this->serverOptions['HTTP_X_FORWARDED_HOST'] ) ) ? $this->serverOptions['HTTP_X_FORWARDED_HOST'] : ( isset( $this->serverOptions['HTTP_HOST'] ) ? $this->serverOptions['HTTP_HOST'] : null );
$host = isset( $host ) ? $host : $this->serverOptions['SERVER_NAME'];
return $host;
}
public function getPort () {
$ssl = $this->usingSSL();
$port = $this->serverOptions['SERVER_PORT'];
$port = ( ( ! $ssl && $port=='80' ) || ( $ssl && $port=='443' ) ) ? '' : $port;
return $port;
}
public function getHost ($useForwardedHost = false) {
return $this->getHostName($useForwardedHost) . ':' . $this->getPort();
}
public function getProtocol () {
$ssl = $this->usingSSL();
$sp = strtolower( $this->serverOptions['SERVER_PROTOCOL'] );
$protocol = substr( $sp, 0, strpos( $sp, '/' ) ) . ( ( $ssl ) ? 's' : '' );
return $protocol;
}
public function getOrigin ($useForwardedHost = false) {
$host = $this->getHostName($useForwardedHost);
$protocol = $this->getProtocol();
return $protocol . '://' . $host;
}
public function getPath () {
return $this->serverOptions['REQUEST_URI'];
}
public function getPathName () {
return $this->serverOptions['PHP_SELF'];
}
public function getUrl ($useForwardedHost = false) {
return $this->getOrigin($useForwardedHost) . $this->getPath();
}
public function getMethod () {
$options = $this->serverOptions;
return Request::method($options);
}
public function getCookies () {
return $this->cookieOptions;
}
public function getOptions () {
return clone (object) [
'COOKIE' => $this->cookieOptions,
'SERVER' => $this->serverOptions
];
}
public function getParams ($expectedParams=null, $inputParams=[], bool $validate = True) {
$inputParams = array_merge(is_array($inputParams) ? $inputParams : [], is_array($this->extraParams) ? $this->extraParams : []);
$requestParams = $validate ? Request::validateParams($expectedParams, $inputParams) : $inputParams;
return $requestParams ? (object) $requestParams : False;
}
public function getPayloadParams ($payloadRequest=null, $inputParams=[], bool $validate=True) {
return $this->getPutParams($payloadRequest, $inputParams, $validate);
}
public function getPostParams ($postRequest=null, $inputParams=[], bool $validate=True) {
return $this->getParams($postRequest, array_merge($inputParams, $_POST), $validate);
}
public function getPutParams ($putRequest=null, $inputParams=[], bool $validate=True) {
$fileContents = file_get_contents("php://input");
$_PUT = json_decode($fileContents, True);
if (!$_PUT)
parse_str($fileContents, $_PUT);
return $this->getParams($putRequest, array_merge($inputParams, $_PUT), $validate);
}
public function getDeleteParams ($deleteRequest=null, $inputParams=[], bool $validate=True) {
parse_str(file_get_contents("php://input"), $_DELETE);
return $this->getParams($deleteRequest, array_merge($inputParams, $_DELETE), $validate);
}
public function getQueryParams ($queryRequest=null, $inputParams=[], bool $validate=True) {
return $this->getParams($queryRequest, array_merge($inputParams, $_GET), $validate);
}
public function getFileParams ($fileRequest=null, $inputParams=[], bool $validate=True) {
// TODO: change structure of files param
return $this->getParams($fileRequest, array_merge($inputParams, $_FILES), $validate);
}
public function getUrlParams($urlRequest=null, $inputParams=[], bool $validate=True) {
return $this->getParams($urlRequest, array_merge($inputParams, (array) $this->route->getParams()), $validate);
}
public function getAllParams($urlRequest=null, $inputParams=[], bool $validate=True) {
return $this->getParams($urlRequest, array_merge(
(array) $this->getUrlParams($urlRequest, $inputParams, False),
(array) $this->getQueryParams($urlRequest, $inputParams, False),
(array) $this->getPostParams($urlRequest, $inputParams, False),
(array) $this->getPutParams($urlRequest, $inputParams, False)
// (array) $this->getDeleteParams($urlRequest, $inputParams, False), // Since this is the same as put params for now we ignore this and save performance
// (array) $this->getFileParams($urlRequest, $inputParams, False) // TODO: Make this more lite the other params
), $validate);
}
public static function method ($serverOptions) {
$method = isset($serverOptions['HTTP_ACCESS_CONTROL_REQUEST_METHOD']) ?
$serverOptions['HTTP_ACCESS_CONTROL_REQUEST_METHOD'] : $serverOptions['REQUEST_METHOD'];
return strtolower($method);
}
public static function validateParams ($parameters, $validateAgainst, $defaultdefault = NULL) {
if (!$parameters) return $validateAgainst;
// global $_GET, $_POST;
if(is_string($validateAgainst))
$t_vals = &$GLOBALS[$validateAgainst];
else if(is_array($validateAgainst))
$t_vals = $validateAgainst;
if(!isset($t_vals) || !is_array($t_vals))
return False;
$res = array();
$defaultOptions = array('optional' => False, 'default' => $defaultdefault, 'match' => False, 'values' => False);
foreach ($parameters as $param => $options) {
if(!is_array($options)) {
$param = $options;
$options = array('optional' => $defaultOptions['optional'], 'default' => $options, 'match' => $defaultOptions['match']);
}
if(!isset($options['optional'])) $options['optional'] = $defaultOptions['optional'];
if(!isset($options['default'])) $options['default'] = $defaultdefault;
if(!isset($options['match'])) $options['match'] = $defaultOptions['match'];
if(!isset($options['values']) || !is_array($options['values'])) $options['values'] = $defaultOptions['values'];
if (!isset($t_vals[$param]) && $options['optional'] === True) {
$res[$param] = $options['default'];
continue;
}
else if (!isset($t_vals[$param]) && $options['optional'] === False) {
// var_dump($param, /*$t_vals[$param], */$options, $t_vals);
return False;
}
if((!is_array($options['values']) || in_array($t_vals[$param], $options['values'])) && (!is_string($options['match']) || preg_match($options['match'], $t_vals[$param]))) {
$res[$param] = $t_vals[$param];
continue;
}
return False;
}
return $res;
}
}
?>
<file_sep>/App.php
<?php
namespace gd\rest;
use Exception;
require_once 'Request.php';
require_once 'Response.php';
require_once 'Route.php';
if (!class_exists('\gd\rest\App')):
class App {
private $hook = null;
private $basePath = '';
private $requestPathName = null;
private $router = [
'get' => [],
'post' => [],
'put' => [],
'delete' => [],
// '/root/path/to/function => callback'
];
private $errorRoutes = [];
private $subApps = [];
public function setHook ($hook) {
$this->hook = $hook;
return $this;
}
public function getHook () {
return $this->hook;
}
public function setBasePath ($path) {
$this->basePath = $path;
return $this;
}
public function getBasePath () {
return $this->basePath;
}
public function getPath () {
return $this->getBasePath();
}
public function setRequestPathName ($path) {
$this->requestPathName = $path;
return $this;
}
public function getRequestPathName () {
return $this->requestPathName;
}
public function any (String $path, ...$args) {
$httpRequests = array_keys($this->router);
foreach ($args as $arg) {
$this->map($path, $httpRequests, $arg);
}
return $this;
}
public function get (String $path, ...$args) {
foreach ($args as $arg) {
$this->map($path, ['get'], $arg);
}
return $this;
}
public function post (String $path, ...$args) {
foreach ($args as $arg) {
$this->map($path, ['post'], $arg);
}
return $this;
}
public function put (String $path, ...$args) {
foreach ($args as $arg) {
$this->map($path, ['put'], $arg);
}
return $this;
}
public function delete (String $path, ...$args) {
foreach ($args as $arg) {
$this->map($path, ['delete'], $arg);
}
return $this;
}
public function error (callable ...$args) {
foreach ($args as $arg) {
$this->errorRoutes[] = $arg;
}
return $this;
}
public function use ($path, ...$args) {
foreach ($args as $arg) {
// if (is_callable($arg))
// $this->subApps[] = $arg;
// foreach ($arg->router as $method => $routes) {
// if (isset($this->router[$method]))
// $this->router[$method] = array_merge(array_values($this->router[$method]), array_values($routes));
// }
if (!is_string($arg)) {
if (!method_exists($arg, 'getBasePath')) {
$tmp = new App();
$tmp->setHook($arg);
$arg = $tmp;
}
$arg->setBasePath($path . '/' . $arg->getBasePath());
}
$this->any($path, $arg);
}
return $this;
}
public function matchRequest($request, $response) {
if ($response->hasEnded() || (is_callable($this->getHook()) && !$this->getHook()($request, $response))) return False;
$method = $request->getMethod();
$options = (array)$request->getOptions();
$basePath = Route::removeDoubleSlashes($this->basePath ? $this->basePath : '/');
if (strpos($options['SERVER']['PHP_SELF'], $basePath) !== 0)
return False;
$options['SERVER']['PHP_SELF'] = substr($options['SERVER']['PHP_SELF'], strlen($basePath)-1);
foreach ($this->router[$method] as $route) {
$request = new Request($this, $route, $options);
if ($route->matchRequest($request, $response)) {
return True;
}
}
return False;
}
public function processRequest ($req=null, $res=null) {
$isSubRoute = $req instanceof Request;
if (($isSubRoute && $res->hasEnded()) || (is_callable($this->getHook()) && !$this->getHook()($req, $res))) return False;
$extraParams = null;
$options = $isSubRoute ? (array) $req->getOptions() : [
'SERVER' => $_SERVER,
'COOKIE' => $_COOKIE
];
if (!$isSubRoute) $extraParams = $req;
if ($this->requestPathName) {
$options['SERVER']['PHP_SELF'] = $this->requestPathName;
}
$basePath = Route::removeDoubleSlashes($this->basePath ? $this->basePath : '/');
if (strpos($options['SERVER']['PHP_SELF'], $basePath) !== 0)
return $this;
$options['SERVER']['PHP_SELF'] = substr($options['SERVER']['PHP_SELF'], strlen($basePath)-1);
$response = $isSubRoute ? $res : new Response($this);
$method = $isSubRoute ? $req->getMethod() : Request::method($_SERVER);
$foundOne = False;
if (isset($this->router[$method])) {
foreach ($this->router[$method] as $route) {
$request = new Request($this, $route, $options, $extraParams);
if ($route->matchRequest($request, $response)) {
$result = $route->processRequest($request, $response);
$foundOne = True;
if (!$result || ($isSubRoute && $res->hasEnded())) {
break;
}
}
}
}
if ($foundOne === False) {
$response->withStatus(404);
$request = new Request($this, null, $options, $extraParams);
foreach ($this->errorRoutes as $route) {
$route($request, $response);
}
} else if ($isSubRoute) return False;
return $this;
}
private function map ($path, $methods, $function=False) {
if (!is_string($path)) {
$this->map('/', $methods, $path);
$path = '/';
}
$httpRequests = array_keys($this->router);
foreach ($methods as $method) {
if (in_array($method, $httpRequests)) {
if (is_callable($function)) {
$this->router[$method][] = new Route($method, $path, $function);
} else if ($function instanceof App) {
$this->router[$method][] = $function;
} else if (is_string($function)) {
$this->router[$method][] = new Route($method, $path . '/{aa?}', function (Request $request, Response $response) use ($path, $function) {
$reqPath = $request->getPathName();
$hasTrailingSlash = strpos($reqPath, '/');
$reqPath = substr($request->getPathName(), strlen($path) + ($hasTrailingSlash ? 1 : 0));
$includePath = $function . (!$hasTrailingSlash ? '/' . $reqPath : '');
$includePath = preg_replace('/[^\.a-zA-Z0-9\-\_\/\\\\:]/', '', $includePath);
$file_found = true;
if (substr($includePath, -strlen('.php')) === '.php' && is_file($includePath)) {
$response->withContentType('text/html');
include($includePath);
} else if (substr($includePath, -strlen('.html')) === '.html' && is_file($includePath)) {
$response->withContentType('text/html');
include($includePath);
} else if (substr($includePath, -strlen('.css')) === '.css' && is_file($includePath)) {
$response->withContentType('text/css');
include($includePath);
} else if (substr($includePath, -strlen('.js')) === '.js' && is_file($includePath)) {
$response->withContentType('text/javascript');
include($includePath);
} else if (substr($includePath, -strlen('.png')) === '.png' && is_file($includePath)) {
$response->withContentType('image/png');
include($includePath);
} else if (is_file($includePath . '/index.php')) {
$response->withContentType('text/html');
include($includePath . '/index.php');
} else if (is_file($includePath . '/index.html')) {
$response->withContentType('text/html');
include($includePath . '/index.html');
} else if (is_file($includePath)) {
readfile($includePath);
} else {
$file_found = false;
}// else echo "--=( " . $includePath;
if ($file_found) {
$response->end();
}
});
}
}
}
return $this;
}
}
endif;
?>
<file_sep>/Route.php
<?php
namespace gd\rest;
use Exception;
include_once 'Request.php';
include_once 'Response.php';
class Route {
private $method = 'get';
private $path = '';
private $function = null;
// Generated variables for performance
private $params = [];
private $partsOfPath;
private $processedRequest = False;
function __construct (String $method, String $path, $function) {
$this->method = $method;
$this->path = $this->parsePath($path);
$this->function = $function;
}
private function parsePath (String $path) {
$path = Route::removeDoubleSlashes($path); // Remove double slashes (//)
$parts = explode('/', $path);
foreach ($parts as $index => $part) {
if (strpos($part, '{') === 0 && strpos($part, '}') === strlen($part) - 1) {
$this->params[substr($part, 1, -1)] = $index;
}
}
$this->partsOfPath = $parts;
return $path;
}
public function getParams() {
if ($this->processedRequest !== False) {
$path = Route::removeDoubleSlashes($this->processedRequest->getPathName());
$parts = explode('/', $path);
$params = [];
$isLastParamIndicator = False;
foreach ($this->params as $param => $index) {
$isLastParamIndicator = strpos($param, '?') !== False;
if (isset($parts[$index])) {
$params[$param] = $parts[$index];
} else {
$params[$param] = null;
}
if ($isLastParamIndicator) {
$params[$param] .= implode('/', array_slice($parts, $index+1));
}
}
return (object) $params;
}
throw new Exception("Error Request not processed for this route", 1);
}
public function getPath () {
return $this->path;
}
public function getMethod () {
return $this->method;
}
public function setMethod (String $method) {
$this->method = $method;
}
public function matchRequest (Request $req) {
if ($this->method !== $req->getMethod()) return False;
$reqPathName = Route::removeDoubleSlashes($req->getPathName());
$parts = explode('/', $reqPathName);
foreach ($this->partsOfPath as $index => $part) {
$isParameter = in_array($index, $this->params);
if ($isParameter && strpos($part, '?') !== False) return True;
if (!$isParameter && (!isset($parts[$index]) || $part !== $parts[$index])) {
return False;
}
}
return count($parts) === count($this->partsOfPath);
}
public function processRequest (Request $req, Response $resp) {
if ($this->processedRequest) throw new Exception("Error Request processed twice", 1);
$this->processedRequest = $req;
if ($this->matchRequest($req)) {
$callback = $this->function;
if (is_callable($callback))
return !$callback($req, $resp);
else if (method_exists($callback, 'processRequest')) {
return $callback->processRequest($req, $resp);
}
}
return $this;
}
public static function removeDoubleSlashes (String $string) {
return preg_replace('/\/\/+/', '/', $string); // Remove double slashes (//)
}
}
?><file_sep>/README.md
# A simple PHP Rest API library
This library helps you create a REST API by adding convenient tools and structure to your PHP codebase.
## Getting started
An example of how you can configure the REST API: in the root folder where you want your API to reside on the PHP server, create a .htaccess file that rewrites all sub-traffic to this file.
```
RewriteEngine On
RewriteRule ^(.*) index.php?_=$1 [QSA,L]
```
Next, create the index.php file with the API endpoints you want to expose.
```
use \gd\rest\App;
use \gd\rest\Request;
use \gd\rest\Response;
$app = new App();
$app->setRequestPathName('/' . $_GET['_']);
unset($_GET['_']);
// Declare endpoints:
$app->get('/api/v1/ping', function (Request $request, Response $response) {
$response->end('pong');
})
->get('/api/v1/short-url', function (Request $request, Response $response) {
$response->redirectTo($request->getUrl() . '/api');
})
->get('/api/v1/api', function (Request $request, Response $response) {
include('docs/index.php'); // will serve the documentation with it's own logic
})
->get('/app/content', function (Request $request, Response $response) {
$response->end(file_get_contents('content/index.html'));
})
->use('/app', '../static/pages') // Serve normal files as a file server
->post('/api/v1/update', 'authenticate', function (Request $request, Response $response) {
// 'authenticate' names a middleware function defined elsewhere
$params = $request->getPostParams([
'id' => [ 'match' => '/^\d+$/', 'optional' => false ]
]);
if ($params === false) return $response->withStatus(400)->end('Invalid params, id must be numeric!');
// TODO: update id =)
$response->withStatus(200)->end('Successfully updated id');
});

$app->processRequest();
```
Additional features include file uploads, PUT and DELETE requests, request chaining, and forwarding to sub-applications.
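A minimal sketch of the PUT, DELETE, and forwarding features using the library's own `App`, `Request`, and `Response` classes (the route paths, handlers, and the `/admin` sub-application here are illustrative, and the `require_once` path assumes the library file sits next to this script):

```php
<?php
require_once 'App.php'; // adjust to wherever the library lives

use \gd\rest\App;
use \gd\rest\Request;
use \gd\rest\Response;

$api = new App();

// PUT request with validated URL and payload parameters
$api->put('/api/v1/item/{id}', function (Request $req, Response $res) {
    $url = $req->getUrlParams(['id' => ['match' => '/^\d+$/']]);
    $body = $req->getPutParams(['name' => ['optional' => false]]);
    if ($url === false || $body === false)
        return $res->withStatus(400)->end('Bad request');
    $res->end(['updated' => $url->id, 'name' => $body->name]);
});

// DELETE handled by a sub-application forwarded under a base path
$admin = new App();
$admin->delete('/item/{id}', function (Request $req, Response $res) {
    $res->end('deleted ' . $req->getUrlParams()->id);
});
$api->use('/admin', $admin);

$api->processRequest();
```

Parameter specs use the `optional`, `default`, `match`, and `values` keys understood by `Request::validateParams`; the `get*Params` helpers return `false` when a required parameter is missing or fails its `match` pattern.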
| 3d5c407d0f007202ef0fb8c5a3520e94f2b7a183 | [
"Markdown",
"PHP"
]
| 5 | PHP | jolin1337/php-rest-api-lib | 6d96567a284c236fd3aac996f28a1a47c0724d3a | 15ce2253bb444c476e36fbef13feb199280e6280 |
refs/heads/master | <repo_name>tmwayne/shell_scripts<file_sep>/config_vms.sh
#!/bin/bash
USAGE='Usage: send_setup_files [-f files] [-d dirs] [-p vm_path] vm1 [vm2]'
while getopts ":hf:d:" opt; do
case $opt in
h)
echo $USAGE
exit 0
;;
f)
FILES=$OPTARG
;;
d)
DIRS=$OPTARG
;;
\?)
echo "Invalid option: -$OPTARG" >&2
;;
esac
done
shift $((OPTIND-1))
BOXES=$@
if [ $# -eq 0 ] ; then
BOXES='sandbox bcg'
fi
DEFAULT_FILES=$(echo $HOME/{.vimrc,.bash_profile,.bash_cheatsheet.txt})
FILES=${FILES:-$DEFAULT_FILES}
DEFAULT_DIRS=$(echo $HOME/.vim/{ftplugin,templates,snippets,ides})
DIRS=${DIRS:-$DEFAULT_DIRS}
for b in $BOXES ; do
echo "Updating $b..."
scp $FILES $b:
scp -r $DIRS $b:.vim
done
echo 'Success!'
<file_sep>/print_header.sh
#!/bin/bash
# Usage: print_header [-d delim] filename
while getopts ":d:" opt; do
case $opt in
d)
DELIM=$OPTARG
;;
\?)
echo "Invalid option: -$OPTARG" >&2
;;
esac
done
shift $((OPTIND-1))
FILE="${1:-/dev/stdin}"
DELIM=${DELIM:-'\t'}
head -1 $FILE | sed "s/$DELIM/\n/g"
<file_sep>/python_err.sh
#!/bin/bash
# Usage: python_err script [args]
# Description: If script throws an error,
# it will be opened in VIM with line number at error
if [ $# -eq 0 ]
then
echo 'Usage: python_err script [args]'
exit 1
fi
SCRIPT=$1
exec 3>&1 4>&2
SE=$(( python "$@" ) 2>&1 >/dev/null )
if [ -n "$SE" ]
then
echo "$SE"
read -p "Press any key to open file in VIM"
LINE=$(echo "$SE" | sed -n '/<module>/s/.*line \([0-9]*\).*/\1/p')
vim +$LINE $1
fi
<file_sep>/build_fe_skeleton.sh
#!/bin/bash
# Usage: build_skeleton
PROG_NAME=$0
## CHECK FOR ARGUMENTS
while getopts ":o:c" opt; do
case $opt in
o)
ROOT=$OPTARG
;;
c)
CLEANUP=TRUE
;;
\?)
echo "Invalid option: -$OPTARG" >&2
;;
esac
done
shift $((OPTIND-1))
## CHECK FOR VALID ARGUMENTS
ROOT=${ROOT:-frontend}
if [ -e "$ROOT" ]
then
echo "$PROG_NAME: Root directory already exists"
exit 1
else
mkdir "$ROOT"
fi
## FOR FLASK
## BUILD FOLDER SKELETON
DIRS="
app/static/data
app/static/css
app/static/js
app/templates
"
for dir in $DIRS
do
echo "Making directory: $dir"
mkdir -p "$ROOT/$dir"
done
## CREATE MINIMUM SET OF FILES
FILES="
LICENSE
README.md
config.py
main.py
start_frontend.sh
app/__init__.py
app/email.py
app/errors.py
app/forms.py
app/models.py
app/routes.py
app/static/css/styles.css
app/static/js/scripts.js
app/templates/index.html
"
for f in $FILES
do
echo "Making file: $f"
touch "$ROOT/$f"
done
## DELETE SKELETON AFTER PROGRAM FINISHES
if [ "$CLEANUP" = "TRUE" ]
then
rm -rf "$ROOT"
fi
<file_sep>/py-vim
#!/bin/bash
USAGE='Usage: vim-py [-e virtual_env] [-p port] [vim_args]'
## VARIABLES
##############################
# Command-line arguments
while getopts ":he:p:" opt; do
case $opt in
h)
echo $USAGE
exit 0 ;;
e)
VENV=$OPTARG ;;
p)
PORT=$OPTARG ;;
\?)
echo "Invalid option: -$OPTARG" >&2 ;;
esac
done
shift $((OPTIND-1))
VIM_ARGS=$@
# File configurations
# Program defined variables
PORT=${PORT:-${VIMPY_PORT:-8080}} # -p flag takes precedence over VIMPY_PORT
VENV=${VENV:-.env-py}
## Use virtual environment if specified
if [ -f "$VENV/bin/ipython3" ] ; then
IPYTHON=$VENV/bin/ipython3
elif [ -f "$HOME/.env-py/bin/ipython3" ] ; then
IPYTHON="$HOME/.env-py/bin/ipython3"
else
echo 'ERROR: unable to find ipython3'
exit 1
fi
## ASSERTIONS
##############################
## MAIN
##############################
START_SERVER="python3 -m http.server $PORT"
SESSION=$$
# if ! tmux ls > /dev/null 2>&1 ; then
if [ -z "$TMUX" ]; then
## 1) SERVER: webserver to enable viewing output from browser
tmux new-session -s $SESSION -n SERVER -d $START_SERVER
## 2) SHELL: bash shell to run commands
tmux new-window -n SHELL
## 3) IDE: Python IDE using Vim and IPython
tmux new-window -n IDE vim $VIM_ARGS
tmux split-window -h $IPYTHON
tmux last-pane
## Attach session to current client
tmux -2 attach-session -t $SESSION
else
## IDE: Python IDE using Vim and IPython
tmux new-window -n IDE vim $VIM_ARGS
tmux split-window -h $IPYTHON
tmux last-pane
fi
<file_sep>/cookie_cutter.sh
#!/bin/bash
usage='Usage: cookie_cutter [-d cookie_cutter_dir] [-D project_dir] project_type project_name'
## CONFIGURATIONS
default_cookie_cutter_dir="$HOME/scripts/skeletons/cookie_cutters"
## ARGUMENTS
while getopts ":hd:D:" opt; do
case $opt in
h)
echo "$usage"
exit 0
;;
d)
cookie_cutter_dir=$OPTARG
;;
D)
proj_dir=$OPTARG
;;
\?)
echo "Invalid option: -$OPTARG" >&2
;;
esac
done
shift $((OPTIND-1))
cookie_cutter_dir=${cookie_cutter_dir:-$default_cookie_cutter_dir}
proj_type=$1
cookie_cutter="$cookie_cutter_dir/$proj_type"
proj_dir=${proj_dir:-$(pwd)}
proj_name=$2
proj_dest="$proj_dir/$proj_name"
## ASSERTIONS
if [ "$#" -lt 2 ] ; then
echo $usage
exit 1
fi
if [ ! -d $cookie_cutter ] ; then
echo 'Error: template for project type does not exist'
exit 1
fi
if [ -d "$proj_dest" ] ; then
echo 'Error: project directory already exists'
echo "$proj_dest"
exit 1
fi
## MAIN
printf "Creating $proj_type project in $proj_dir ... "
if cp -r $cookie_cutter $proj_dest; then
echo 'Success!'
else
echo 'Error'
echo 'Unable to create cookie cutter project.'
fi
<file_sep>/mkenv
#!/bin/bash
USAGE='Usage: mkenv [name]'
ENV_NAME=${1:-${ENV_NAME:-.env-py}}
python3 -m venv "$ENV_NAME"
<file_sep>/quizzer.sh
#!/bin/bash
# source activate stats
python "~/scripts/python/quizzer.py" "$@"
<file_sep>/lower_names.sh
#!/bin/bash
# Usage: lower_names filename
FILE=$1
if [ -z "$FILE" ]
then
echo "Usage: lower_names filename"
exit 1
fi
TMP=$(mktemp)
let LINES=$( wc -l "$FILE" | cut -d' ' -f1 )
sed -e '1s/.*/\L&/;1q' "$FILE" > "$TMP"
tail -$(( LINES - 1 )) "$FILE" >> "$TMP"
let NEW_LINES=$( wc -l "$TMP" | cut -d' ' -f1 )
if [[ $NEW_LINES -eq $LINES ]]
then
mv "$TMP" "$FILE"
fi
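The header transformation above hinges on GNU sed's `\L` escape (lowercase the rest of the replacement). A small self-contained check of that substitution on a hypothetical sample file:

```shell
# Lowercase only the first (header) line, as in the sed call above.
# Requires GNU sed for the \L replacement escape.
printf 'Name\tAGE\tCity\nAlice\t30\tParis\n' > sample.tsv
HEADER=$(sed -e '1s/.*/\L&/;1q' sample.tsv)
echo "$HEADER"
```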
<file_sep>/cut_column.sh
#!/bin/bash
# Usage: cut_column filename columnname
while getopts ":d:" opt; do
case $opt in
d)
DELIM="$OPTARG"
;;
\?)
echo "Invalid option: -$OPTARG" >&2
;;
esac
done
shift $((OPTIND-1))
if [[ $# -lt 1 ]]
then
echo 'Usage: cut_column filename columnname'
exit 1
elif [[ $# -eq 1 ]]
then
echo "cut_column doesn't currently work with standard in"
exit 1
else
FILE=$1
COLUMN=$2
fi
DELIM=${DELIM:-$'\t'}
# Find the 1-based index of the requested column in the header line
IND=$( head -1 "$FILE" | tr "$DELIM" '\n' | grep -nx "$COLUMN" | cut -d: -f1 )
if [ -z "$IND" ]
then
echo "Error: column '$COLUMN' not found in $FILE" >&2
exit 1
fi
cut -d "$DELIM" -f "$IND" "$FILE"
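The commented-out lines sketch the intended approach: split the header on the delimiter, find the requested column's position, then `cut` that field. A self-contained version of that idea on a hypothetical sample file:

```shell
# Map a column name to its 1-based field index, then extract that column.
printf 'id\tname\tcity\n1\talice\tparis\n2\tbob\toslo\n' > demo.tsv
COLUMN=city
# grep -n prints "line:match"; -x requires a whole-line match
IND=$(head -1 demo.tsv | tr '\t' '\n' | grep -nx "$COLUMN" | cut -d: -f1)
echo "$IND"
cut -f "$IND" demo.tsv
```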
<file_sep>/vim-dev.sh
#!/bin/bash
USAGE='Usage: vim-dev repl [vim-args]'
## ARGUMENTS
##############################
while getopts ":e:" opt; do
case $opt in
h)
echo $USAGE
exit 0
;;
\?)
echo "Invalid option: -$OPTARG" >&2
;;
esac
done
shift $((OPTIND-1))
REPL=$1
shift
VIM_ARGS=$@
SESSION=$$
## ASSERTIONS
##############################
if [ -z "$REPL" ] ; then
echo $USAGE
exit 1
fi
## MAIN
##############################
## Only start a new session if none exists
# if ! tmux ls > /dev/null 2>&1 ; then
if [ -z "$TMUX" ]; then
tmux new-session -s $SESSION -n SHELL -d
tmux new-window -n IDE vim $VIM_ARGS
tmux split-window -h $REPL
tmux last-pane
tmux -2 attach-session -t $SESSION
else
## IDE: Python IDE using Vim and IPython
tmux new-window -n IDE vim $VIM_ARGS
tmux split-window -h $REPL
tmux last-pane
fi
<file_sep>/count_vals.sh
#!/bin/bash
# Usage: count_vals filename columnname
while getopts ":d:" opt; do
case $opt in
d)
ARGS="-d $OPTARG"
DELIM=$OPTARG
;;
\?)
echo "Invalid option: -$OPTARG" >&2
;;
esac
done
shift $((OPTIND-1))
if [[ $# -lt 2 ]]
then
echo 'Usage: count_vals filename columnname'
exit 1
fi
FILE=$1
COLUMN=$2
echo "count $COLUMN"
cut_column $ARGS $FILE $COLUMN | sort | uniq -c
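The pipeline above is the classic `sort | uniq -c` frequency count applied to one extracted column. A self-contained sketch on hypothetical inline data:

```shell
# Count occurrences of each value in a single column (header skipped).
printf 'color\nred\nblue\nred\nred\n' > vals.txt
tail -n +2 vals.txt | sort | uniq -c | sort -rn
COUNT_RED=$(tail -n +2 vals.txt | grep -cx red)
echo "$COUNT_RED"
```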
| 9aaef375a8f3f9f52e7de2ff85cfc3c048afb71c | [
"Shell"
]
| 12 | Shell | tmwayne/shell_scripts | 2f77b8e5ad452dd7f28a1be1983da4328b995491 | 124277c415ab48179e1ea2ff3c7308000df79642 |
refs/heads/master | <file_sep>#include <TimerOne.h>
#include <Ethernet.h>
#include <UbidotsEthernet.h>
#include <SPI.h>
/***** POST *****/
char const * TOKEN = "<KEY>"; // Assign your Ubidots TOKEN
char const * VARIABLE_LABEL_1 = "temperature"; // Assign the unique variable label to send the data
char const * VARIABLE_LABEL_2 = "power"; // Assign the unique variable label to send the data
char const * VARIABLE_LABEL_3 = "presence"; // Assign the unique variable label to send the data
/***** IDs (not used...) *****/
char const * TEMP_ID = "5bc52e96c03f974c1670ad02"; // Assign the unique variable label to send the data
char const * POWER_ID = "5bc52ea1c03f974c1670ad03"; // Assign the unique variable label to send the data
/***** GET *****/
char const * DEVICE_LABEL = "PID"; // Assign the unique device label
char const * VARIABLE_LABEL_4 = "Set_Point"; // Assign the unique variable label to get the last value
char const * VARIABLE_LABEL_5 = "kp"; // Assign the unique variable label to get the last value
char const * VARIABLE_LABEL_6 = "ki"; // Assign the unique variable label to get the last value
char const * VARIABLE_LABEL_7 = "kd"; // Assign the unique variable label to get the last value
/* Enter a MAC address for your controller below */
/* Newer Ethernet shields have a MAC address printed on a sticker on the shield */
byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
Ubidots client(TOKEN); /* initialize the instance */
/***** Definitions - Temperature *****/
#define PINO_TERMISTOR A15 // Thermistor pin
#define TERMISTOR_NOMINAL 10000 // Thermistor resistance at the nominal temperature
#define TEMPERATURA_NOMINAL 25 // Nominal temperature from the datasheet
#define N_AMOSTRAS 5 // Number of samples to average
#define COEFICIENTE 3977 // Thermistor beta coefficient
#define V_RESISTOR 10000 // Series resistor value (same as the thermistor)
/***** Definitions - PID *****/
#define pSENSOR A1
#define pCONTROLE 3
#define pPRESENCE 33
/***** Definitions - Power *****/
#define triacApin 13 //The dimmer is driven by pin 13
/***** Temperature control variables *****/
volatile float temperatura;
volatile int bPresence;
int amostras;
int i;
/***** Power control variables *****/
int frequencia = 60;
int stateTriacA = 0;
int power = 0; //Load power control variable, starts at 0 (off)
int teste = 0;
int CNT = 0;
/***** PID control variables *****/
double error = 0;
double temperature;
double lastTemperature;
double kP = 5.00;
double kI = 0.00;
double kD = 0.00;
double
P = 0,
I = 0,
D = 0;
double PID = 0;
/***** Initializations *****/
double SetPoint = 45;
int ControlePwm = 50;
long lastProcess = 0;
float deltaTime = 0;
void setup()
{
/* INIT communication */
Serial.begin(9600);
/* INIT temperature */
analogReference(EXTERNAL);
/* INIT power */
pinMode(triacApin, OUTPUT);
digitalWrite(triacApin, HIGH);
Timer1.initialize(); //Initialize the TimerOne library for the mains frequency
attachInterrupt(0, zero_cross_detect, FALLING); //Interrupt for zero-cross detection
/* INIT PID */
pinMode(pSENSOR, INPUT);
pinMode(pCONTROLE, OUTPUT);
pinMode(pPRESENCE, INPUT);
/* INIT Ethernet connection */
Serial.print(F("Starting ethernet..."));
if (!Ethernet.begin(mac)) Serial.println(F("failed"));
else Serial.println(Ethernet.localIP());
/* Delay for Ethernet Shield initialization */
delay(2000);
Serial.println(F("Ready"));
}
void loop()
{
Controle_Temperatura();
bPresence = digitalRead(pPRESENCE);
/***** Run the control loop only when presence is detected *****/
if (bPresence != 0)
{
Controle_PID();
/***** Clamp power to the 0% ~ 100% range *****/
if (ControlePwm > 100) power = 100;
else if (ControlePwm < 0) power = 0;
else power = ControlePwm;
}
else power = 0;
/***** Print values to the console *****/
Serial.print("SP= ");
Serial.print(SetPoint);
Serial.print(" ; T= ");
Serial.print(temperatura);
Serial.print(" ; P= ");
Serial.print(power);
Serial.print(" ; Presence= ");
Serial.print(bPresence);
Serial.println(" ;");
/* GET and POST every 5 seconds */
if (CNT > 100)
{
CNT = 0;
Ethernet_GetControl();
Ethernet_PostControl();
}
/***** 50 ms loop period *****/
CNT++;
delay(50);
}
/***** Ethernet - GET *****/
void Ethernet_GetControl()
{
Ethernet.maintain();
/***** GET *****/
int getSP = client.getValue(DEVICE_LABEL, VARIABLE_LABEL_4);
float getkP = client.getValue(DEVICE_LABEL, VARIABLE_LABEL_5);
float getkI = client.getValue(DEVICE_LABEL, VARIABLE_LABEL_6);
float getkD = client.getValue(DEVICE_LABEL, VARIABLE_LABEL_7);
SetPoint = getSP;
kP = getkP;
kI = getkI;
kD = getkD;
}
/***** Ethernet - POST *****/
void Ethernet_PostControl()
{
Ethernet.maintain();
/* Local variables for posting data */
float postTemperatura = temperatura;
int postPower = power;
/* Sending values to Ubidots */
/***** POST *****/
client.add(VARIABLE_LABEL_1, postTemperatura); //Measured temperature
client.add(VARIABLE_LABEL_2, postPower); //Power
client.add(VARIABLE_LABEL_3, bPresence); //Presence sensor reading
client.sendAll();
}
void Controle_Temperatura()
{
/* Initialize average = 0 */
float media = 0;
int i = 0;
amostras = 0;
/* Accumulate N samples of the analog signal, one every 10 ms */
while (i< N_AMOSTRAS)
{
amostras += analogRead(PINO_TERMISTOR);
i++;
delay(10);
}
/* Compute the mean of the samples */
media = amostras / N_AMOSTRAS;
/* Convert the mean voltage reading to resistance */
media = 1023 / media - 1;
media = V_RESISTOR / media;
/* Compute temperature via the beta-factor method */
temperatura = media / TERMISTOR_NOMINAL; // (R/Ro)
temperatura = log(temperatura); // ln(R/Ro)
temperatura /= COEFICIENTE; // 1/B * ln(R/Ro)
temperatura += 1.0 / (TEMPERATURA_NOMINAL + 273.15); // + (1/To)
temperatura = 1.0 / temperatura; // Invert the value
temperatura -= 273.15; // Convert to Celsius
}
/* Routine to compute the PID terms */
void Controle_PID()
{
error = (SetPoint - temperatura); //Error is the difference between the set point and the measured temperature
deltaTime = (millis() - lastProcess)/1000.0; //Time elapsed since the last run
lastProcess = millis();
/* Proportional term */
P = error * kP;
/* Integral term */
I += (error * kI) * deltaTime;
/* Derivative term */
D = ((lastTemperature - temperatura) * kD) / deltaTime;
/* Last temperature - reference for the next run */
lastTemperature = temperatura;
/* Sum of the terms */
PID = P + I + D;
/* Conversion to the control signal */
ControlePwm = (PID + 50);
}
/* Zero-cross detection and interrupt routine */
void zero_cross_detect()
{
if(power > 0)
{
long dimtime = int(map(power,0,100,8000,150)); //Compute the delay before firing the TRIAC
Timer1.attachInterrupt(gateTRIAC, dimtime); //Attach gateTRIAC to the TIMER1 interrupt
Timer1.start(); //Start the TIMER1 count
}
else
{
digitalWrite(triacApin, HIGH); //Keep the TRIAC gate disabled.
Timer1.stop();
}
}
/* Handle the Timer 1 interrupt */
void gateTRIAC ()
{ //Handle the TIMER1 interrupt by pulsing the TRIAC gate
digitalWrite(triacApin, LOW); //Fire the TRIAC
delayMicroseconds(100); //Wait 100 microseconds to ensure the TRIAC fires
digitalWrite(triacApin, HIGH); //Disable the TRIAC gate
Timer1.stop();
}
| e244822dae47c054373596656182999fb745598c | [
"C"
]
| 1 | C | JonnyMarques91/PID_Ethernet | ac3400720b810fa84cf05a5cb901d30ac68c502c | 3b1d3b282dd6f28e7907ac7785a66a25bd8d1a64 |
refs/heads/master | <file_sep>################################################################################
## fitfun.SP functions
##
## a function to obtain the solution paths for the SPSP algorithm. These
## functions need to take arguments x, y, family, standardize, and intercept
## and return a glmnet object that has the solution paths stored. The solution
## paths will be used as ingredients for the SPSP algorithm.
##
################################################################################
#' @title Fitting functions that can be used as the \code{fitfun.SP} argument
#' to obtain the solution paths for the SPSP algorithm. Users can also supply a
#' customized function to generate the solution paths: as long as it takes the
#' arguments x, y, family, standardize, and intercept, and returns an object of class
#' \code{glmnet} or \code{lars} (or \code{SCAD}, \code{MCP} in the future).
#'
#' @name Fitting-Functions
NULL
#' \code{lasso.glmnet} uses lasso selection from \code{\link[glmnet]{glmnet}}.
#'
#' @param x a matrix of the independent variables, of dimension nobs x nvars; each row is an observation vector.
#' @param y Response variable. Quantitative for \code{family="gaussian"} or \code{family="poisson"} (non-negative counts).
#' For \code{family="binomial"} should be either a factor with two levels.
#'
#' @param family Response type. Either a character string representing one of the built-in families,
#' or else a glm() family object.
#' @param standardize logical argument. Should conduct standardization before the estimation? Default is TRUE.
#' @param intercept logical. If x is a data.frame, this argument determines if the resulting model matrix should contain
#' a separate intercept or not. Default is TRUE.
#' @param ... Additional optional arguments.
#'
#' @return An object of class \code{"glmnet"} is returned to provide solution paths for the SPSP algorithm.
#'
#' @rdname Fitting-Functions
#'
#' @importFrom glmnet glmnet
#'
#' @export
#'
lasso.glmnet <- function(x,
y,
family,
standardize,
intercept, ...) {
if (!requireNamespace("glmnet", quietly = TRUE))
stop("Package ", sQuote("glmnet"), " needed but not available")
fit_sp <- glmnet(x = x, y = y, family = family,
alpha=1, standardize = standardize, intercept=intercept, ...)
return(fit_sp)
}
#' \code{adalasso.glmnet} the function to conduct the adaptive lasso selection using the \code{lambda.1se} from cross-validation lasso method
#' to obtain initial coefficients. It uses package \code{\link[glmnet]{glmnet}}.
#' @rdname Fitting-Functions
#'
#' @return An object of class \code{"glmnet"} is returned to provide solution paths for the SPSP algorithm.
#'
#' @importFrom glmnet glmnet cv.glmnet
#'
#' @export
#'
adalasso.glmnet <- function(x,
y,
family,
standardize,
intercept, ...) {
# Extract all optional arguments for further use.
args <- list(...)
for (name in names(args) ) {
assign(name, args[[name]])
}
if (!requireNamespace("glmnet", quietly = TRUE))
stop("Package ", sQuote("glmnet"), " needed but not available")
n <- dim(x)[1]; p <- dim(x)[2]
if (n<p) {
# use CV ridge regression and "cv_fl1$lambda.1se" to obtain initial coefficients
cv_fl1 <- cv.glmnet(x = x, y = y, family = family,
alpha = 0,
intercept = intercept, standardize = standardize, ...)
first_coef <- coef(cv_fl1, s = cv_fl1$lambda.1se)
first_beta <- first_coef[-1]
} else {
if (intercept) {
x1 <- as.matrix(cbind(1, x))
# first_coef <- solve(t(x)%*%x)%*%t(x)%*%y
first_coef <- crossprod(solve(crossprod(x1)), crossprod(x1,y))
first_beta <- first_coef[-1]
} else {
x1 <- as.matrix(x)
# first_coef <- solve(t(x)%*%x)%*%t(x)%*%y
first_coef <- crossprod(solve(crossprod(x1)), crossprod(x1,y))
first_beta <- first_coef
}
}
# If the penalty.factor are given, then adjust the penalties.
if (exists("penalty.factor")) {
penalty.factor <- abs(first_beta + 1/sqrt(n))^(-1) * penalty.factor
} else {
penalty.factor <- abs(first_beta + 1/sqrt(n))^(-1)
}
fit_sp <- glmnet(x = x, y = y, penalty.factor = penalty.factor,
family = family, alpha=1,
intercept = intercept, standardize = standardize, ...)
return(fit_sp)
}
#' \code{adalassoCV.glmnet} adaptive lasso selection using the \code{lambda.1se} from cross-validation adaptive
#' lasso method to obtain initial coefficients. It uses package \code{\link[glmnet]{glmnet}}.
#' @rdname Fitting-Functions
#'
#' @return An object of class \code{"glmnet"} is returned to provide solution paths for the SPSP algorithm.
#'
#' @importFrom glmnet glmnet cv.glmnet
#' @importFrom stats coef
#'
#' @export
#'
adalassoCV.glmnet <- function(x,
y,
family,
standardize,
intercept, ...) {
# Extract all optional arguments for further use.
args <- list(...)
for (name in names(args) ) {
assign(name, args[[name]])
}
if (!requireNamespace("glmnet", quietly = TRUE))
stop("Package ", sQuote("glmnet"), " needed but not available")
n <- dim(x)[1]; p <- dim(x)[2]
if (n<p) {
fl1 <- glmnet(x = x, y = y, family = family,
alpha=1,
intercept = intercept,
standardize = standardize, ...)
# use lasso from cv.glmnet to find initial lambda
cv_fl1 <- cv.glmnet(x = x, y = y, family = family,
alpha=1, intercept = intercept, standardize = standardize, ...)
lambda_tem <- cv_fl1$lambda.1se # use this as prespecified lambda
first_coef <- fl1$beta[,which.min(abs(fl1$lambda-lambda_tem))]
} else {
# first_coef <- solve(t(x)%*%x)%*%t(x)%*%y
first_coef <- crossprod(solve(crossprod(x)), crossprod(x,y))
}
# If the penalty.factor are given, then adjust the penalties.
if (exists("penalty.factor")) {
penalty.factor <- abs(first_coef + 1/sqrt(n))^(-1) * penalty.factor
} else {
penalty.factor <- abs(first_coef + 1/sqrt(n))^(-1)
}
fit_cv_ada <- cv.glmnet(x = x, y = y, penalty.factor = penalty.factor,
family = family, alpha=1, intercept = intercept, standardize = standardize, ...)
first_ada_coef <- coef(fit_cv_ada, s=fit_cv_ada$lambda.1se)
first_ada_beta <- first_ada_coef[-1]
# If the penalty.factor are given, then adjust the penalties.
if (exists("penalty.factor")) {
penalty.factor <- abs(first_ada_beta + 1/sqrt(n))^(-1) * penalty.factor
} else {
penalty.factor <- abs(first_ada_beta + 1/sqrt(n))^(-1)
}
fit_sp <- glmnet(x = x, y = y, penalty.factor = penalty.factor,
family = family, alpha=1,
intercept = intercept, standardize = standardize, ...)
# Store the first adalasso coefficient obtained from the adalasso.CV procedure.
fit_sp$first_ada_coef <- first_ada_coef
return(fit_sp)
}
#' \code{ridge.glmnet} uses ridge regression to obtain the solution path.
#'
#' @return An object of class \code{"glmnet"} is returned to provide solution paths for the SPSP algorithm.
#'
#' @rdname Fitting-Functions
#' @export
#'
ridge.glmnet <- function(x,
y,
family,
standardize,
intercept, ...) {
if (!requireNamespace("glmnet", quietly = TRUE))
stop("Package ", sQuote("glmnet"), " needed but not available")
fit_sp <- glmnet(x = x, y = y, family = family,
alpha = 0, standardize = standardize, intercept=intercept, ...)
return(fit_sp)
}
#' \code{lasso.lars} uses lasso selection in \code{\link[lars]{lars}} to obtain the solution path.
#'
#' @rdname Fitting-Functions
#'
#' @return An object of class \code{"lars"} is returned to provide solution paths for the SPSP algorithm.
#'
#' @importFrom lars lars
#'
#' @export
#'
lasso.lars <- function(x,
y,
family,
standardize,
intercept, ...) {
if (!requireNamespace("lars", quietly = TRUE))
stop("Package ", sQuote("lars"), " needed but not available")
stop("The function lasso.lars() is currently under development.")
# return(fit_sp)
}
<file_sep>## Test environments
* local OS X install, R 4.1.0
* ubuntu 16.04.6 LTS (on travis-ci), R 4.1.0
* win-builder (devel and release)
## R CMD check results
There were no ERRORs, Notes, or WARNINGs.
* [Fixed] Notes of checks results from linux, windows, macos: Namespace in Imports field not imported from: ‘MASS’. All declared Imports should be used.
* [Fixed] The Description field contains <https://doi.org/10.1214/18-EJS1434>). Please rather write <doi:10.1214/18-EJS1434>.
* [Fixed] Please write references in the description of the DESCRIPTION file in the form authors (year) <doi:...> authors (year) <arXiv:...> authors (year, ISBN:...) or if those are not available: authors (year) <https:...> with no space after 'doi:', 'arXiv:', 'https:' and angle brackets for auto-linking. (If you want to add a title as well please put it in quotes: "Title")
* [Fixed] Please add \value to .Rd files regarding exported methods and explain the functions results in the documentation.
* [Fixed] Some code lines in examples are commented out in HighDim.Rd
* [Fixed] checking CRAN incoming feasibility ... NOTE
Maintainer: 'Xiaorui (<NAME> <<EMAIL>>'
New submission
Possibly misspelled words in DESCRIPTION:
EBIC (8:66)
Liu (8:72)
SPSP (5:21)
Found the following (possibly) invalid URLs:
URL: https://cran.r-project.org/web/checks/check_results_SPSP.html
From: README.md
Status: 404
Message: Not Found
* [Fixed] checking top-level files ... NOTE
Non-standard file/directory found at top level:
'cran-comments.md'
## Downstream dependencies
All packages that I could install passed.
<file_sep>library(testthat)
library(SPSP)
test_check("SPSP")
<file_sep># SPSP: an R Package for Selecting the relevant predictors by Partitioning the Solution Paths of the Penalized Likelihood Approach
<!-- badges: start -->
[](https://cran.r-project.org/package=SPSP)
[](https://cran.r-project.org/web/checks/check_results_SPSP.html)
[](https://cranlogs.r-pkg.org/badges/grand-total/SPSP)
[](https://cranlogs.r-pkg.org/badges/last-month/SPSP?color=green)
[](https://cranlogs.r-pkg.org/badges/last-week/SPSP?color=yellow)
<!-- badges: end -->
Overview
--------
An implementation of the feature Selection procedure by Partitioning the entire Solution Paths
(namely SPSP) to identify the relevant features rather than using a single tuning parameter.
By utilizing the entire solution paths, this procedure can obtain better selection accuracy than
the commonly used approach of selecting only one tuning parameter based on existing criteria,
cross-validation (CV), generalized CV, AIC, BIC, and EBIC (<NAME>., & <NAME>. (2018)
https://doi.org/10.1214/18-EJS1434). It is more stable and accurate (low false positive and false negative
rates) than other variable selection approaches. In addition, it can be flexibly coupled with
the solution paths of Lasso, adaptive Lasso, ridge regression, and other penalized estimators.
## Installation
The `SPSP` package is currently available on [SPSP CRAN](https://CRAN.R-project.org/package=SPSP).
### Install `SPSP` development version from GitHub (recommended)
``` r
# Install the development version from GitHub
if (!requireNamespace("devtools")) install.packages("devtools")
devtools::install_github("XiaoruiZhu/SPSP")
```
### Install `SPSP` from the CRAN
``` r
# Install from CRAN
install.packages("SPSP")
```
## Example
The following example shows the R code for
``` r
library(SPSP)
data(HighDim)
library(glmnet)
# Use the user-friendly function SPSP() to conduct the selection by Partitioning the
# Solution Paths (the SPSP algorithm). The user only needs to specify the independent
# variables matrix, response, family, and fitfun.SP = lasso.glmnet.
x <- as.matrix(HighDim[,-1])
y <- HighDim[,1]
spsp_lasso_1 <- SPSP(x = x, y = y, family = "gaussian", fitfun.SP = lasso.glmnet,
init = 1, standardize = FALSE, intercept = FALSE)
head(spsp_lasso_1$nonzero)
head(spsp_lasso_1$beta_SPSP)
spsp_adalasso_5 <- SPSP(x = x, y = y, family = "gaussian", fitfun.SP = adalasso.glmnet,
init = 5, standardize = T, intercept = FALSE)
head(spsp_adalasso_5$nonzero)
head(spsp_adalasso_5$beta_SPSP)
# Use the function SPSP_step() to select the relevant predictors by partitioning the
# solution paths based on the user provided solution paths \code{BETA}.
lasso_fit <- glmnet(x = x, y = y, alpha = 1, intercept = FALSE)
# SPSP+Lasso method
K <- dim(lasso_fit$beta)[2]
LBETA <- as.matrix(lasso_fit$beta)
spsp_lasso_1 <- SPSP_step(x = x, y = y, BETA = LBETA,
init = 1, standardize = FALSE, intercept = FALSE)
head(spsp_lasso_1$nonzero)
head(spsp_lasso_1$beta_SPSP)
```
References
----------
<NAME>., & <NAME>. (2018). Selection by partitioning the solution paths. *Electronic Journal of Statistics*, 12(1), 1988-2017. <10.1214/18-EJS1434>
<file_sep>#' Selection by Partitioning the Solution Paths
#'
#' An implementation of the feature Selection procedure by Partitioning the entire Solution Paths
#' (namely SPSP) to identify the relevant features rather than using a single tuning parameter.
#' By utilizing the entire solution paths, this procedure can obtain better selection accuracy than
#' the commonly used approach of selecting only one tuning parameter based on existing criteria,
#' cross-validation (CV), generalized CV, AIC, BIC, and EBIC (<NAME>., & <NAME>. (2018)). It is
#' more stable and accurate (low false positive and false negative rates) than other variable
#' selection approaches. In addition, it can be flexibly coupled with the solution paths of Lasso,
#' adaptive Lasso, ridge regression, and other penalized estimators.
#'
#' @details This package includes two main functions and several functions (fitfun.SP) to obtains
#' the solution paths. The \code{SPSP} function allows users to specify the penalized likelihood
#' approaches that will generate the solution paths for the SPSP procedure. Then this function
#' will automatically partitioning the entire solution paths. Its key idea is to classify variables
#' as relevant or irrelevant at each tuning parameter and then to select all of the variables
#' which have been classified as relevant at least once. The \code{SPSP_step} purely apply the
#' partitioning step that needs the solution paths as the input. In addition, there are several
#' functions to obtain the solution paths. They can be used as an input of \code{fitfun.SP} argument.
#'
#'
#' @docType package
#'
#' @author Xiaorui (<NAME>, \email{<EMAIL>}, \cr <NAME>, \email{<EMAIL>}, \cr
#' <NAME>, \email{<EMAIL>}
#'
#' @references <NAME>., & <NAME>. (2018). Selection by partitioning the solution paths.
#' \emph{Electronic Journal of Statistics}, 12(1), 1988-2017. <10.1214/18-EJS1434>
#'
#' @name SPSP-package
#'
#' @useDynLib SPSP
#'
NULL
## Generic implementation of SPSP
SPSP <- function(x, ...) {
UseMethod("SPSP", x)
}
#' Selection by partitioning the solution paths of Lasso, Adaptive Lasso, and Ridge penalized regression.
#'
#' A user-friendly function to conduct the selection by Partitioning the Solution Paths (the SPSP algorithm). The
#' user only needs to specify the independent variables matrix, response, family, and \code{fitfun.SP}.
#'
#' @param x A matrix with all independent variables, of dimension n by p; each row is an observation vector with p variables.
#' @param y Response variable. Quantitative for \code{family="gaussian"} or \code{family="poisson"} (non-negative counts).
#' For \code{family="binomial"} should be either a factor with two levels.
#'
#' @param family Response type. Either a character string representing one of the built-in families,
#' or else a glm() family object.
#' @param fitfun.SP A function to obtain the solution paths for the SPSP algorithm. This function takes the arguments
#' x, y, family as above, and additionally the standardize and intercept and others in \code{\link[glmnet]{glmnet}}
#' or \code{\link[lars]{lars}}. The function fit the model with lasso, adaptive lasso, or ridge regression to
#' return the solution path of the corresponding penalized likelihood approach.
#' \describe{
#' \item{\code{lasso.glmnet}}{lasso selection from \code{\link[glmnet]{glmnet}}.}
#' \item{\code{adalasso.glmnet}}{adaptive lasso selection using the \code{lambda.1se} from cross-validation lasso method
#' to obtain initial coefficients. It uses package \code{\link[glmnet]{glmnet}}.}
#' \item{\code{adalassoCV.glmnet}}{adaptive lasso selection using the \code{lambda.1se} from cross-validation adaptive
#' lasso method to obtain initial coefficients. It uses package \code{\link[glmnet]{glmnet}}.}
#' \item{\code{ridge.glmnet}}{use ridge regression to obtain the solution path.}
#' \item{\code{lasso.lars}}{use lasso selection in \code{\link[lars]{lars}} to obtain the solution path.}
#' }
#' @param standardize logical argument. Should conduct standardization before the estimation? Default is TRUE.
#' @param intercept logical. If x is a data.frame, this argument determines if the resulting model matrix should contain
#' a separate intercept or not. Default is TRUE.
#' @param args.fitfun.SP A named list containing additional arguments that are passed to the fitting function;
#' see also argument \code{args.fitfun.SP} in do.call.
#' @param ... Additional optional arguments.
#'
#' @importFrom Rcpp evalCpp
#' @importFrom stats glm.fit coef
#' @importFrom glmnet glmnet cv.glmnet
#'
#' @return An object of class \code{"SPSP"} is a list containing at least the following components:
#' \item{\code{beta_SPSP}}{the estimated coefficients of SPSP selected model;}
#' \item{\code{S0}}{the estimated relevant sets;}
#' \item{\code{nonzero}}{the selected covariates;}
#' \item{\code{zero}}{the covariates that are not selected;}
#' \item{\code{thres}}{the boundaries for abs(beta);}
#' \item{\code{R}}{the sorted adjacent distances;}
#' \item{\code{intercept}}{the estimated intercept when \code{intercept == T}.}
#'
#' This object has attribute contains:
#'
#' \item{\code{glmnet.fit}}{the fitted penalized regression within the input function \code{fitfun.SP};}
#' \item{\code{family}}{the family of fitted object;}
#' \item{\code{fitfun.SP}}{the function to obtain the solution paths for the SPSP algorithm;}
#' \item{\code{args.fitfun.SP}}{a named list containing additional arguments for the function \code{fitfun.SP}.}
#'
#'
#' @export
#'
#' @examples
#' data(HighDim)
#' library(glmnet)
#' # Use the high dimensional dataset (data(HighDim)) to test SPSP+Lasso and SPSP+AdaLasso:
#' data(HighDim)
#'
#' x <- as.matrix(HighDim[,-1])
#' y <- HighDim[,1]
#'
#' spsp_lasso_1 <- SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = lasso.glmnet,
#' init = 1, standardize = FALSE, intercept = FALSE)
#'
#' head(spsp_lasso_1$nonzero)
#' head(spsp_lasso_1$beta_SPSP)
#'
#' spsp_adalasso_5 <- SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = adalasso.glmnet,
#' init = 5, standardize = TRUE, intercept = FALSE)
#'
#' head(spsp_adalasso_5$nonzero)
#' head(spsp_adalasso_5$beta_SPSP)
#'
#'
#'
SPSP <-
function(x,
y,
family = c("gaussian", "binomial"),
fitfun.SP = lasso.glmnet,
args.fitfun.SP = list(),
standardize = TRUE,
intercept = TRUE,
...) {
this.call <- match.call()
family <- match.arg(family)
glmnet_T <- inherits(fitfun.SP, "glmnet")
fit_mod_SP <- do.call(fitfun.SP, c(list(x = x, y = y, family = family,
standardize = standardize, intercept = intercept),
args.fitfun.SP))
# Conduct the SPSP step that use the above solution path to obtain relevant predictors.
SPSP_temp <- SPSP_step(x = x, y = y, BETA = fit_mod_SP$beta, standardize = standardize, intercept = intercept, ...)
# Assign attributes and class
attr(SPSP_temp, "glmnet.fit") <- fit_mod_SP
attr(SPSP_temp, "family") <- family
attr(SPSP_temp, "fitfun.SP") <- fitfun.SP
attr(SPSP_temp, "args.fitfun.SP") <- args.fitfun.SP
class(SPSP_temp) <- c("SPSP", class(fit_mod_SP), class(SPSP_temp))
return(SPSP_temp)
}
#' The selection step with the input of the solution paths.
#'
#' A function to select the relevant predictors by partitioning the solution paths (the SPSP algorithm)
#' based on the user provided solution paths \code{BETA}.
#'
#' @param x independent variables as a matrix, of dimension nobs x nvars; each row is an observation vector.
#' @param y response variable. Quantitative for \code{family="gaussian"} or \code{family="poisson"} (non-negative counts).
#' For \code{family="binomial"} should be either a factor with two levels.
#'
#' @param family either a character string representing one of the built-in families, or else a glm() family object.
#' @param BETA the solution paths obtained from a prespecified fitting step \code{fitfun.SP = lasso.glmnet} etc. It must be
#' a p by k matrix, should be thicker and thicker, each column corresponds to a lambda, and lambda gets smaller and smaller.
#' It is just the returned value \code{beta} from a \code{glmnet} object.
#' @param standardize whether need standardization.
#' @param intercept logical. If x is a data.frame, this argument determines if the resulting model matrix should contain
#' a separate intercept or not.
#' @param init initial coefficients, starting from init-th estimator of the solution paths. The default is 1.
#' @param R the sorted adjacent distances; default is NULL, in which case it will be calculated internally.
#' @param ... Additional optional arguments.
#'
#'
#' @examples
#' data(HighDim)
#' library(glmnet)
#'
#' x <- as.matrix(HighDim[,-1])
#' y <- HighDim[,1]
#'
#' lasso_fit <- glmnet(x = x, y = y, alpha = 1, intercept = FALSE)
#'
#' # SPSP+Lasso method
#' K <- dim(lasso_fit$beta)[2]
#' LBETA <- as.matrix(lasso_fit$beta)
#'
#' spsp_lasso_1 <- SPSP_step(x = x, y = y, BETA = LBETA,
#' init = 1, standardize = FALSE, intercept = FALSE)
#' head(spsp_lasso_1$nonzero)
#' head(spsp_lasso_1$beta_SPSP)
#'
#' @return A list containing at least the following components:
#' \item{\code{beta_SPSP}}{the estimated coefficients of SPSP selected model;}
#' \item{\code{S0}}{the estimated relevant sets;}
#' \item{\code{nonzero}}{the selected covariates;}
#' \item{\code{zero}}{the covariates that are not selected;}
#' \item{\code{thres}}{the boundaries for abs(beta);}
#' \item{\code{R}}{the sorted adjacent distances;}
#' \item{\code{intercept}}{the estimated intercept when \code{intercept == T}.}
#'
#' This object has attributes containing:
#'
#' \item{\code{glmnet.fit}}{the fitted penalized regression within the input function \code{fitfun.SP};}
#' \item{\code{family}}{the family of fitted object;}
#' \item{\code{fitfun.SP}}{the function to obtain the solution paths for the SPSP algorithm;}
#' \item{\code{args.fitfun.SP}}{a named list containing additional arguments for the function \code{fitfun.SP}.}
#'
#'
#' @export
#'
SPSP_step <-
function(x,
y,
family = c("gaussian", "binomial"),
BETA,
standardize=TRUE,
intercept = TRUE,
init = 1,
R = NULL,
...) {
this.call <- match.call()
family <- match.arg(family)
if (is.character(family))
family <- get(family, mode = "function", envir = parent.frame())
if (is.function(family))
family <- family()
if (is.null(family$family)) {
print(family)
stop("'family' not recognized")
}
if (!is.logical(intercept))
stop("Please specify intercept == TRUE/FALSE.")
if (!is.logical(standardize))
    stop("Please specify standardize == TRUE/FALSE.")
if (standardize == TRUE) { # standardize == TRUE
x <- scale(x, center = TRUE, scale = TRUE)
}
n <- dim(x)[1] # size
p <- dim(x)[2] # dimension
K <- dim(BETA)[2]
BETA <- as.matrix(BETA[,K:1])
# BETA for the following should be a p by k matrix; it should be sparser and sparser:
# each column corresponds to a lambda, and lambda gets larger and larger
K <- dim(BETA)[2] # the number of the tuning parameter lambda
if (is.null(R)) {
    #### select the default tuning parameter if it is not given
gap0 <- sort(diff(c(0,sort(abs(BETA[,init])))), decreasing = TRUE)
# sorted adjacent distances
R <- ifelse(gap0[2]!=0, as.numeric(gap0[1]/gap0[2]), as.numeric(gap0[1]))
}
### only consider the rows with at least one non-zero beta
colsbeta <- colSums(BETA)
K2 <- max(which(colsbeta!=0))
thres <- c() ## boundaries for abs(beta)
S0 <- list() # estimated relevant sets
S0c <- list() # estimated irrelevant sets
# the initial values;
thres_tmp <- max(abs(BETA[, 1]))
S0_tmp <- which(abs(BETA[, 1]) > thres_tmp)
S0c_tmp <- which(abs(BETA[, 1]) <= thres_tmp)
# loop for the update
if (K2 >= 1) {
for (k in 1:K2) {
beta_abs <- abs(BETA[, k])
beta_sort <- sort(beta_abs)
# update the initials
thres[k] <- ifelse(length(S0c_tmp) >= 1 ,max(beta_abs[S0c_tmp]),0)
S0_tmp <- which(abs(BETA[, k]) > thres[k])
S0c_tmp <- which(abs(BETA[, k]) <= thres[k])
S0[[k]] <- S0_tmp
S0c[[k]] <- S0c_tmp
if (length(S0c[[k]]) >=1) {
gap <- diff(c(0, beta_sort))
# the distance between current relevant and irrelevant sets
gap_10 <- ifelse(length(S0c_tmp) == p, 0, gap[length(S0c_tmp) + 1])
# gap for the current irrelevant set: S0c_tmp
gap_0 <- gap[1:length(S0c_tmp)]
o1 <- which.max(gap_0)
gap_01 <- max(gap_0)
gap_02 <- ifelse(o1>1, max(gap[1:(o1-1)]), 0)
if (gap_10 <= R * gap_01 & gap_01 >= R * gap_02 ) {
thres[k] <- ifelse(o1>1, beta_sort[o1-1], 0)
S0_tmp <- which(abs(BETA[, k]) > thres[k])
S0c_tmp <- which(abs(BETA[, k]) <= thres[k])
S0[[k]] <- S0_tmp
S0c[[k]] <- S0c_tmp
}
}
}
}
index <- rep(1,p)
for(i in 1:p) {
if (all(abs(BETA[i, 1:K2]) <= thres)) index[i] <- 0
}
nz <- which(index == 1)
z <- which(index == 0)
beta_SPSP <- rep(0, p)
if (intercept == TRUE) { # intercept == TRUE
if (length(nz) >= 1) {
xc <- x[, nz]
xc1 <- cbind(1, xc)
if (length(nz) < n) { # if p < n
betac1 <- glm.fit(y = y, x = xc1, intercept = FALSE, family = family)$coefficients
} else { # # if p >= n, use ridge reg to provide final coefficients
# betac1 <- solve(t(xc1)%*%xc1+0.001*diag(length(nz)+1))%*%t(xc1)%*%y # Very slow
# betac1 <- crossprod(solve(crossprod(xc1)+0.001*diag(length(nz)+1)), crossprod(xc1,y)) # Faster
betac1 <- solve((crossprod(xc1) + 0.001 * diag(length(nz) + 1)), crossprod(xc1, y)) # Fastest
}
intercept_est <- betac1[1]
betac <- betac1[-1]
beta_SPSP[nz] <- betac
} else { # if no variable is selected and intercept == TRUE, run a model with intercept only
intercept_est <- glm.fit(y = y, x = rep(1, n), intercept = FALSE, family = family)$coefficients
betac <- NA
}
} else { # intercept == FALSE
intercept_est <- NA
if (length(nz) >= 1) {
xc <- x[, nz]
if (length(nz) < n) {
betac <- glm.fit(y = y, x = xc, intercept = FALSE, family = family)$coefficients
} else { # change all as ridge reg to provide final coefficients
# betac <- solve(t(xc)%*%xc+0.001*diag(length(nz)))%*%t(xc)%*%y # Very slow
# betac <- crossprod(solve(crossprod(xc)+0.001*diag(length(nz))), crossprod(xc,y)) # Faster
betac <- solve((crossprod(xc) + 0.001*diag(length(nz))), crossprod(xc, y)) # Fastest
}
beta_SPSP[nz] <- betac
}
else if (!intercept) {
stop("No variable is selected and no intercept.")
}
}
# Change the display and format of some results
names(beta_SPSP) <- colnames(x)
nz_name <- colnames(x)[nz]
z_name <- colnames(x)[z]
names(intercept_est) <- "(Intercept)"
SPSP <- list(beta_SPSP = beta_SPSP,
S0 = S0,
nonzero = nz_name, zero = z_name,
thres = thres, R = R,
intercept = intercept_est)
attr(SPSP, "thres") <- thres ## boundaries for abs(beta)
attr(SPSP, "R") <- R ## sorted adjacent distances
attr(SPSP, "y") <- y
attr(SPSP, "x") <- x
attr(SPSP, "init") <- init
return(SPSP)
}
<file_sep>#' A high-dimensional dataset with n equal to 200 and p equal to 500.
#'
#' A dataset with 200 observations and 500 dimensions generated from the following process:
#' a linear regression model in which only the first three coefficients are non-zero, equal to 3, 2, and 1.5, respectively.
#' The covariates are correlated with an AR(1) structure (rho = 0.3). The error term is normally distributed with
#' zero mean and standard deviation equal to 0.5.
#'
#'
#' @docType data
#'
#' @keywords datasets
#'
#' @name HighDim
#'
#' @usage
#' data(HighDim)
#'
#' @examples
#'
#' # HighDim dataset is generated from the following process:
#' n <- 200; p <- 500; sigma <- 0.5
#' beta <- rep(0, p); nonzero <- c(1, 2, 3); zero <- setdiff(1:p, nonzero)
#' beta[nonzero] <- c(3, 2, 1.5)
#' Sigma <- 0.3^(abs(outer(1:p,1:p,"-")))
#' library(MASS)
#' X <- mvrnorm(n, rep(0,p), Sigma)
#' error <- rnorm(n, 0, sigma)
#'
#' X <- apply(X, 2, scale) * sqrt(n)/sqrt(n-1)
#' error <- error - mean(error)
#'
#' Y <- X %*% beta + error
#' HighDim <- data.frame(Y, X)
#' head(HighDim)
#'
#'
NULL
<file_sep>test_that("Check SPSP_step() use data(HighDim)", {
skip_if_not_installed("glmnet")
library(glmnet)
# Use the high dimensional dataset (data(HighDim)) to test SPSP+Lasso:
data(HighDim)
x <- as.matrix(HighDim[,-1])
y <- HighDim[,1]
lasso_fit <- glmnet(x = x, y = y, alpha = 1, intercept = FALSE)
# library(plotmo) # for plot_glmnet
# plot_glmnet(lasso_fit, label=15, xvar ="lambda")
# coef(lasso_fit, s = 0.5)
test_lm_3 <- lm(Y ~ X1 + X2 + X3 + 0, data = HighDim)
# When nonzero == 0, run the following to obtain intercept
# test_lm_int <- glm.fit(y = y, x = rep(1, length(y)), intercept = TRUE, family = gaussian())$coefficients
# When nonzero >= 1, run the following to obtain intercept
# test_lm_int <- glm.fit(y = y, x = cbind(1, x[,1:3]), intercept = FALSE, family = gaussian())$coefficients
test_lm_int <- glm(y ~ X1 + X2 + X3 + 1, data = HighDim, family = gaussian())$coefficients
# summary(test_lm_3)
# SPSP+Lasso method
K <- dim(lasso_fit$beta)[2]
LBETA <- as.matrix(lasso_fit$beta)
r_spsp1 <- SPSP::SPSP_step(x = x, y = y, BETA = LBETA, init = 1, standardize = FALSE, intercept = FALSE)
est_beta_1 <- r_spsp1$beta_SPSP
# colnames(X)[which(est_beta_1 != 0)]
head(est_beta_1)
r_spsp5 <- SPSP::SPSP_step(x = x, y = y, BETA = LBETA, init = 5, standardize = FALSE)
est_beta_5 <- r_spsp5$beta_SPSP
# head(est_beta_5)
r_spsp5_std <- SPSP::SPSP_step(x = x, y = y, BETA = LBETA, init = 5, standardize = T, intercept = T)
est_beta_5_std <- r_spsp5_std$beta_SPSP
# head(est_beta_5_std)
# r_spsp5_std$intercept
# Comparison
expect_equal(coef(test_lm_3), est_beta_1[1:3])
expect_equal(coef(test_lm_3), est_beta_5[1:3])
# expect_equal(coef(test_lm_3), est_beta_5_std[1:3])
expect_equal(test_lm_int[1], r_spsp5_std$intercept)
})
test_that("Check SPSP() with lasso.glmnet() use data(HighDim)", {
skip_if_not_installed("glmnet")
library(glmnet)
# Use the high dimensional dataset (data(HighDim)) to test SPSP+Lasso:
data(HighDim)
x <- as.matrix(HighDim[,-1])
y <- HighDim[,1]
xstd <- scale(HighDim[,-1], center = TRUE, scale = TRUE)
HighDim2 <- data.frame(Y = y, xstd)
test_lm_3 <- lm(Y ~ X1 + X2 + X3 + 0, data = HighDim)
test_lm_std_3 <- lm(Y ~ X1 + X2 + X3 + 0, data = HighDim2)
# When nonzero == 0, run the following to obtain intercept
# test_lm_int <- glm.fit(y = y, x = rep(1, length(y)), intercept = TRUE, family = gaussian())$coefficients
# When nonzero >= 1, run the following to obtain intercept
# test_lm_int <- glm.fit(y = y, x = cbind(1, x[,1:3]), intercept = FALSE, family = gaussian())$coefficients
test_lm_int <- glm(y ~ X1 + X2 + X3 + 1, data = HighDim, family = gaussian())$coefficients
lasso_fit <- glmnet(x = x, y = y, alpha = 1, intercept = FALSE)
# coef(lasso_fit)
r_spsp1 <- SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = lasso.glmnet,
init = 1, standardize = FALSE, intercept = FALSE)
est_beta_1 <- r_spsp1$beta_SPSP
# colnames(X)[which(est_beta_1 != 0)]
# head(est_beta_1)
r_spsp5 <- SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = lasso.glmnet,
init = 5, standardize = FALSE, intercept = FALSE)
est_beta_5 <- r_spsp5$beta_SPSP
# head(est_beta_5)
r_spsp5_std <- SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = adalasso.glmnet,
init = 5, standardize = TRUE, intercept = FALSE)
est_beta_5_std <- r_spsp5_std$beta_SPSP
# head(est_beta_5_std)
r_spsp5_std_int <- SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = adalasso.glmnet,
init = 5, standardize = TRUE, intercept = TRUE)
est_beta_5_std_int <- r_spsp5_std_int$beta_SPSP
# head(est_beta_5_std_int)
# r_spsp5_std_int$intercept
# Check if coefficients are same as ols
expect_equal(coef(test_lm_3), est_beta_1[1:3])
expect_equal(coef(test_lm_3), est_beta_5[1:3])
expect_equal(coef(test_lm_std_3), est_beta_5_std[1:3])
expect_equal(coef(test_lm_std_3), est_beta_5_std_int[1:3])
# Check if intercept is correct
expect_true(is.na(r_spsp1$intercept))
expect_true(is.na(r_spsp5$intercept))
expect_true(is.na(r_spsp5_std$intercept))
expect_true(!is.na(r_spsp5_std_int$intercept))
})
test_that("Check SPSP() with use data(Boston)", {
skip_if_not_installed("MASS")
skip_if_not_installed("glmnet")
library(glmnet)
library(MASS)
# Boston Housing data from http://archive.ics.uci.edu/ml/datasets/Housing
data(Boston, package="MASS")
colnames(Boston)
summary(Boston)
Boston2 <- subset(Boston, crim<=3.2)
# Boston2 <- Boston
# dim(Boston2)
x <- as.matrix(Boston2[,-14]); y <- Boston2[,14]
summary(x)
x[,"crim"] <- log(x[,"crim"])
x[,"tax"] <- log(x[,"tax"])
x[,"lstat"] <- log(x[,"lstat"])
x[,"dis"] <- log(x[,"dis"])
# x[,"rad"] <- log(x[,"rad"])
y <- scale(y)
x[,"age"] <- log(x[,"age"])
for (i in 1:(ncol(x))){
x[,i] <- scale(x[,i])
}
# summary(x)
# round(cor(x), 2)
# round(cor(Boston[,-14]), 2)
# summary(y)
expect_error(SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = lasso.glmnet,
init = 5, standardize = T, intercept = FALSE),
NA)
expect_error(SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = adalasso.glmnet,
init = 5, standardize = T, intercept = FALSE),
NA)
expect_error(SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = adalassoCV.glmnet,
init = 5, standardize = T, intercept = FALSE),
NA)
expect_error(SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = ridge.glmnet,
init = 5, standardize = T, intercept = FALSE),
NA)
err <- expect_error(SPSP::SPSP(x = x, y = y, family = "gaussian", fitfun.SP = lasso.lars,
init = 5, standardize = T, intercept = FALSE))
expect_equal(err$message, "The function lasso.lars() is currently under development.")
})
test_that("Check penalty.factor argument in glmnet() for SPSP()", {
skip_if_not_installed("MASS")
skip_if_not_installed("glmnet")
library(glmnet)
library(MASS)
# Boston Housing data from http://archive.ics.uci.edu/ml/datasets/Housing
data(Boston, package="MASS")
colnames(Boston)
summary(Boston)
Boston2 <- subset(Boston, crim<=3.2)
# Boston2 <- Boston
# dim(Boston2)
x <- as.matrix(Boston2[,-14]); medv <- Boston2[,14]
summary(x)
x[,"crim"] <- log(x[,"crim"])
x[,"tax"] <- log(x[,"tax"])
x[,"lstat"] <- log(x[,"lstat"])
x[,"dis"] <- log(x[,"dis"])
# x[,"rad"] <- log(x[,"rad"])
medv <- scale(medv)
x[,"age"] <- log(x[,"age"])
for (i in 1:(ncol(x))){
x[,i] <- scale(x[,i])
}
# summary(x)
# round(cor(x), 2)
# round(cor(Boston[,-14]), 2)
# summary(y)
expect_error(SPSP::SPSP(x = x, y = medv, family = "gaussian", fitfun.SP = lasso.glmnet,
init = 5, standardize = T, intercept = FALSE),
NA)
test1 <- SPSP::SPSP(x = x, y = medv, family = "gaussian", fitfun.SP = lasso.glmnet,
args.fitfun.SP = list(penalty.factor = c(rep(1, 5), 0, rep(1, 6), 0)),
init = 1, standardize = T, intercept = FALSE)
# test1$nonzero
fit1 <- attr(test1, "glmnet.fit")
# fit1$beta
# library(plotmo)
# plot_glmnet(fit1, label=15, xvar ="lambda")
expect_error(SPSP::SPSP(x = x, y = medv, family = "gaussian", fitfun.SP = adalassoCV.glmnet,
init = 5, standardize = T, intercept = FALSE),
NA)
test2 <- SPSP::SPSP(x = x, y = medv, family = "gaussian", fitfun.SP = adalassoCV.glmnet,
args.fitfun.SP = list(penalty.factor = c(rep(1, 5), 0, rep(1, 6), 0)),
init = 1, standardize = T, intercept = FALSE)
# test2$nonzero
fit2 <- attr(test2, "glmnet.fit")
# plot_glmnet(fit2, label=15, xvar ="lambda")
})
<file_sep># SPSP 0.1.1
## Major changes
1. Fix the behavior of the `intercept` argument. The `intercept` argument controls whether to include an intercept; since the SPSP algorithm estimates the intercept separately, it needs to be handled separately.
2. The `standardize` argument now only controls the standardization of the covariates.
# SPSP 0.1.0
## Major changes
1. Implement the Selection by Partitioning the Solution Paths procedure in the SPSP() and SPSP_step() functions;
2. Add fitfun.SP functions to obtain the solution paths for the SPSP algorithm;
3. Include one high-dimensional dataset for illustration purposes.
| 0fb62e67999a48444bd39608cafe15ce5bedc8a2 | [
"Markdown",
"R"
]
| 8 | R | XiaoruiZhu/SPSP | 37f503666a4c7d991131bebbeea130ad4f34b8cc | f57729b368fd80333b740d5b32418b6b2c0d9bbf |
refs/heads/master | <file_sep>source 'http://rubygems.org'
gem 'jruby-openssl'
gem 'sinatra'
gem 'trinidad'
gem 'json'
gem 'rest-client'
gem 'mongo'
gem 'spreadsheet'
gem 'pony'
<file_sep>#!/bin/sh
jruby -S gem install bundler heroku
jruby -S bundle install
git init
git add .
git commit -m "Initial commit of repository"
heroku create --stack cedar
cp Gemfile Jemfile
cp Jemfile.lock Jemfile.lock
heroku create --stack cedar --buildpack http://github.com/heroku/heroku-buildpack-java
git push heroku master
heroku logs
<file_sep>require 'sinatra'
require 'json'
require 'mongo'
require 'base64'
require 'spreadsheet'
require 'jxl.jar'
DATE_COL=0
DESCRIPTION_COL=1
CLIENT_COL=2
CATEGORY_COL=3
TOTAL_COL=5
EXPENSE_START_ROW = 13
NAME_COL = 1
NAME_ROW = 9
SIGNATURE_COL = 1
SIGNATURE_ROW = 23
TOP_DATE_COL = 4
TOP_DATE_ROW = 9
BOTTOM_DATE_COL = 4
BOTTOM_DATE_ROW = 23
get '/' do
puts "Show expenses"
coll = get_col
@expenses = coll.find
erb :expenses
end
get '/hello' do
"hello"
end
post '/expense' do
expense = JSON.parse(request.body.read)
coll = get_col
id = coll.insert(expense)
# email(id, "<EMAIL>")
id.to_s
end
get '/expense/:id/receipts/:index/image' do |id, index|
coll = get_col
begin
expense = coll.find("_id" => BSON::ObjectId(id)).to_a[0]
image_encoded = expense["receipts"][index.to_i]["image"]
throw Exception.new unless image_encoded
content_type "image/png"
Base64.decode64 image_encoded
rescue
status 404
"Image not found"
end
end
get '/expense' do
coll = get_col
expenses = coll.find.collect {|expense| expense.to_json}
expenses.join
end
get '/expense/:id' do |id|
coll = get_col
begin
coll.find("_id" => BSON::ObjectId(id)).to_a[0].to_json
rescue
status 404
"Expense not found"
end
end
delete '/expenses' do
coll = get_col
coll.remove
coll.count.to_s
end
get '/expense/:id/excel.xls' do |id|
send_file(generate_jruby_excel(id))
end
post '/expense/:id/email' do |id|
email(id, params[:emailAddress])
"email sent to #{params[:emailAddress]}"
end
get '/expense/:id/email/:address' do |id, address|
email(id, address)
"email sent to #{address}"
end
def email(id, address)
require 'pony'
expense_spreadsheet = File.new(generate_jruby_excel(id))
attachments = {"expense.xls" => expense_spreadsheet.read}
expense = get_col.find("_id" => BSON::ObjectId(id)).to_a[0]
expense["receipts"].each_with_index do |receipt, index|
image_encoded = receipt["image"]
image_decoded = Base64.decode64 image_encoded
attachments["receipt_#{index}.png"] = image_decoded
end
Pony.mail(
:from => "testing",
:to => address,
:attachments => attachments,
:subject => "Expenses",
:body => "Please find attached my expenses",
:port => '587',
:via => :smtp,
:via_options => {
:address => 'smtp.sendgrid.net',
:port => '587',
:enable_starttls_auto => true,
:user_name => ENV['SENDGRID_USERNAME'],
:password => ENV['<PASSWORD>'],
:authentication => :plain,
:domain => ENV['SENDGRID_DOMAIN']})
end
def generate_excel(id)
coll = get_col
expense = coll.find("_id" => BSON::ObjectId(id)).to_a[0]
book = Spreadsheet.open("template.xls", 'r')
sheet = book.worksheet(0)
sheet[NAME_ROW, NAME_COL] = expense["name"]
expense["receipts"].each_with_index do |receipt, i|
sheet[EXPENSE_START_ROW + i, DATE_COL] = receipt["date"]
sheet[EXPENSE_START_ROW + i, DESCRIPTION_COL] = receipt["description"]
sheet[EXPENSE_START_ROW + i, CLIENT_COL] = receipt["client"]
sheet[EXPENSE_START_ROW + i, CATEGORY_COL] = receipt["category"]
amount_in_dollars = receipt["amount_in_cents"] ? receipt["amount_in_cents"].to_f/100 : receipt["amountInCents"].to_f/100
sheet[EXPENSE_START_ROW + i, TOTAL_COL] = amount_in_dollars
end
file = Tempfile.new('spreadsheet')
book.write(file.path)
file
end
def get_col
if ENV['MONGOHQ_URL']
uri = URI.parse(ENV['MONGOHQ_URL'])
conn = Mongo::Connection.from_uri(ENV['MONGOHQ_URL'])
db = conn.db(uri.path.gsub(/^\//, ''))
db["expenses"]
else
connection = Mongo::Connection.new
db = connection.db("mydb")
db["expenses"]
end
end
def generate_jruby_excel(id)
expense = get_col.find("_id" => BSON::ObjectId(id)).to_a[0]
writeable_workbook = nil
begin
template = java.io.File.new("template.xls")
temp_file = java.io.File.createTempFile(java.lang.String.valueOf(java.lang.System.currentTimeMillis()), ".xls")
temp_file.deleteOnExit()
workbook = Java::jxl.Workbook.getWorkbook(template)
writeable_workbook = Java::jxl.Workbook.createWorkbook(temp_file, workbook)
sheet = writeable_workbook.getSheet(0)
name_label = Java::jxl.write.Label.new(NAME_COL, NAME_ROW, expense["name"])
sheet.addCell(name_label)
date = Time.now.strftime('%d/%m/%Y')
top_date_label = Java::jxl.write.Label.new(TOP_DATE_COL, TOP_DATE_ROW, date)
sheet.addCell(top_date_label)
bottom_date_label = Java::jxl.write.Label.new(BOTTOM_DATE_COL, BOTTOM_DATE_ROW, date)
sheet.addCell(bottom_date_label)
expense["receipts"].each_with_index do |receipt, i|
date_label = Java::jxl.write.Label.new(DATE_COL, EXPENSE_START_ROW + i, receipt["date"])
description_label = Java::jxl.write.Label.new(DESCRIPTION_COL, EXPENSE_START_ROW + i, receipt["description"])
category_label = Java::jxl.write.Label.new(CATEGORY_COL, EXPENSE_START_ROW + i, receipt["category"])
client_label = Java::jxl.write.Label.new(CLIENT_COL, EXPENSE_START_ROW + i, receipt["client"])
amount_in_dollars = receipt["amount_in_cents"] ? receipt["amount_in_cents"].to_f/100 : receipt["amountInCents"].to_f/100
amount_number = Java::jxl.write.Number.new(TOTAL_COL, EXPENSE_START_ROW + i, amount_in_dollars)
sheet.addCell(date_label)
sheet.addCell(description_label)
sheet.addCell(category_label)
sheet.addCell(client_label)
sheet.addCell(amount_number)
end
if (expense["signature"])
image_signature = Base64.decode64(expense["signature"])
signature = Tempfile.new(['signature', '.png'])
signature.write(image_signature)
signature.rewind
writable_image = Java::jxl.write.WritableImage.new(SIGNATURE_COL, SIGNATURE_ROW, 2, 2, java.io.File.new(signature.path))
sheet.addImage(writable_image)
end
ensure
if (writeable_workbook)
writeable_workbook.write
writeable_workbook.close
end
end
temp_file.path
end
<file_sep>require 'rubygems'
require 'rest_client'
url = "http://jruby-expenseit.herokuapp.com/expenses"
RestClient.delete url
| b639742d5efb0f11238d510bd6a81f2288660094 | [
"Ruby",
"Shell"
]
| 4 | Ruby | codingricky/jruby-expenseit | 16f3936f7dcfe1e1b76f6aa1a00fa4a383640828 | 8d90c366793479baffa4c967cc73fd3af979c718 |
refs/heads/master | <repo_name>learrsan/WebPart00<file_sep>/WebPart00/WebPartNoVisual/WebPartNoVisual.cs
using System;
using System.ComponentModel;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint;
using Microsoft.SharePoint.WebControls;
namespace WebPart00.WebPartNoVisual
{
[ToolboxItemAttribute(false)]
public class WebPartNoVisual : WebPart
{
protected override void CreateChildControls()
{
Label lbl=new Label();
lbl.Text = "Hola";
Controls.Add(lbl);
}
}
}
| d82bb6382330e722e69584b5dffb4d81eeb47525 | [
"C#"
]
| 1 | C# | learrsan/WebPart00 | 63296976310ed5fcbb077e0a07e19e67c8b42d39 | dfaa153f7ed00efbd72a717e702f3db9a47a74da |
refs/heads/master | <repo_name>rossgoodwin/rossgoodwin.github.io<file_sep>/csv_to_json.py
import csv
import json
from collections import defaultdict
import datetime
import calendar
import StringIO
import requests
response = requests.get('https://docs.google.com/spreadsheets/d/1W2e4HI2pCrI2zTISdif4D4rcKJ_2K2Lq07E-fr-ZGQ0/pub?gid=1348098975&single=true&output=csv')
with open('projects.csv', 'w') as outfile:
outfile.write(response.text)
fileObj = open('projects.csv', 'r')
reader = csv.DictReader(fileObj)
projList = []
for row in reader:
# print row.keys()
projList.append(row)
fileObj.close()
def get_date(projObj):
m, d, y = map(int, projObj['Date of Completion'].split('/'))
dateObj = datetime.datetime(y, m, d, 0, 0)
return -(dateObj - datetime.datetime(1970,1,1)).total_seconds()
projList = sorted(projList, key=get_date)
with open('projects.json', 'w') as outfile:
json.dump(projList, outfile)
| 82a0f3dd6c676b99773f43429b4cbbf5b65a88d7 | [
"Python"
]
| 1 | Python | rossgoodwin/rossgoodwin.github.io | 2af8e1471e8a920d05833b7ff67e9c9677042997 | a1e9543d2bd74d8b37bcead29b00033a12456845 |
refs/heads/master | <file_sep>/*
--------------------- Factory -------------------------
-----------------FactoryIntermediate-------------------
-------------ClassUsedByFactory {1...n}----------------
*/
// the html file uses DOM 0
// creates the objects and switch between them
// acts like an interface
// to keep the method pattern
class FactoryIntermediate {
basicMethodGeneration() {}
}
class UsedByFactory1 extends FactoryIntermediate {
basicMethodGeneration () {
document.getElementById("result").innerHTML =
"UsedByFactory1 returned in getElementId";
}
}
class UsedByFactory2 extends FactoryIntermediate {
basicMethodGeneration () {
document.getElementById("result").innerHTML =
"UsedByFactory2 returned in getElementId";
}
}
class UsedByFactory3 extends FactoryIntermediate {
basicMethodGeneration () {
document.getElementById("result").innerHTML =
"UsedByFactory3 returned in getElementId";
}
}
function doCreate(obj) {
var someName = Factory.create(obj.value);
someName.basicMethodGeneration();
}
class Factory {
static create(type) {
var typedVar = null;
switch(type) {
case "1":
typedVar = new UsedByFactory1();
break;
case "2":
typedVar = new UsedByFactory2();
break;
case "3":
typedVar = new UsedByFactory3();
break;
}
return typedVar;
}
}
| d72b0bdfc343d927e6b18e947becb5ccee471595 | [
"JavaScript"
]
| 1 | JavaScript | joaovicenteft/designpatternsJS | e72d1f585e11f58e0e5f0ef5c8b392355dc71f0a | 6ea166162469ebf2df6f1bce8cc7dce5e426bfb4 |
refs/heads/master | <file_sep>import React, { Component} from 'react';
import { connect } from 'react-redux';
import WeatherChart from '../components/weather_chart';
class WeatherList extends Component {
renderCityWeather(city_weather) {
const temps = city_weather.list.map(data => data.main.temp);
const humidities = city_weather.list.map(data => data.main.humidity)
const pressures = city_weather.list.map(data => data.main.pressure)
return (
<tr key={city_weather.city.id} >
<td>{city_weather.city.name}</td>
<td><WeatherChart data={temps} color='red' /></td>
<td><WeatherChart data={humidities} color='blue'/></td>
<td><WeatherChart data={pressures} color='green'/></td>
</tr>
)
}
render() {
return (
<table className='table table-hover'>
<thead>
<tr>
<th>City</th>
<th>Temperature</th>
<th>Humidity</th>
<th>Pressure</th>
</tr>
</thead>
<tbody>
{this.props.weather.map(this.renderCityWeather)}
</tbody>
</table>
)
}
}
function mapStateToProps(state) {
return {
weather: state.weather
}
}
export default connect(mapStateToProps)(WeatherList)
<file_sep>import React from 'react';
import { Sparklines, SparklinesLine } from 'react-sparklines';
export default function WeatherChart({data, color}) {
return (
<Sparklines height={50} margin={5} data={data} limit={24}>
<SparklinesLine color={color} />
</Sparklines>
)
}
| 19a6070718cc52a99a77cad9522098766fe5302c | [
"JavaScript"
]
| 2 | JavaScript | ericchen0121/weatherSearchReactRedux | 2e4f405015c8b6a34e4e4fd6150759997572086f | 2c4f1e9f6a7a2774e70724666394c2c906be15a8 |
refs/heads/main | <repo_name>abhilashkrishnan/blaze<file_sep>/src/debug.h
#ifndef DEBUG_H
#define DEBUG_H
#include "vm/chunk.h"
void disassembleChunk(Chunk* chunk, const char* name);
int disassembleInstruction(Chunk* chunk, int offset);
#endif<file_sep>/README.md
# Blaze
Blaze will be a compiled and interpreted, object-oriented programming language. Blaze source code will be compiled to bytecode, and the Blaze runtime (virtual machine) will interpret the bytecode and execute it.
<file_sep>/src/compiler/compiler.h
#ifndef COMPILER_H
#define COMPILER_H
#include "../vm/chunk.h"
#include "../vm/object.h"
bool compile(const char* source, Chunk* chunk);
#endif<file_sep>/Makefile
FILES = ./build/memory/memory.o ./build/vm/value.o ./build/vm/object.o ./build/vm/chunk.o ./build/debug.o ./build/scanner/scanner.o ./build/compiler/compiler.o ./build/vm/vm.o
INCLUDES = -I./src
FLAGS = -g -Wall
all: ./build/memory/memory.o ./build/vm/chunk.o ./build/vm/value.o ./build/vm/object.o ./build/debug.o ./build/scanner/scanner.o ./build/compiler/compiler.o ./build/vm/vm.o ./bin/blaze
./bin/blaze: ./src/main.c
gcc ${INCLUDES} ${FLAGS} ${FILES} ./src/main.c -o ./bin/blaze
./build/memory/memory.o: ./src/memory/memory.c
gcc ${INCLUDES} ${FLAGS} ./src/memory/memory.c -o ./build/memory/memory.o -c
./build/vm/chunk.o: ./src/vm/chunk.c
gcc ${INCLUDES} ${FLAGS} -I./src/memory ./src/vm/chunk.c -o ./build/vm/chunk.o -c
./build/debug.o: ./src/debug.c
gcc ${INCLUDES} ${FLAGS} -I./src/vm ./src/debug.c -o ./build/debug.o -c
./build/vm/value.o: ./src/vm/value.c
gcc ${INCLUDES} ${FLAGS} -I./src/memory ./src/vm/value.c -o ./build/vm/value.o -c
./build/vm/object.o: ./src/vm/object.c
gcc ${INCLUDES} ${FLAGS} -I./src/memory ./src/vm/object.c -o ./build/vm/object.o -c
./build/scanner/scanner.o: ./src/scanner/scanner.c
gcc ${INCLUDES} ${FLAGS} -I./src/vm ./src/scanner/scanner.c -o ./build/scanner/scanner.o -c
./build/compiler/compiler.o: ./src/compiler/compiler.c
gcc ${INCLUDES} ${FLAGS} -I./src/vm ./src/compiler/compiler.c -o ./build/compiler/compiler.o -c
./build/vm/vm.o: ./src/vm/vm.c
gcc ${INCLUDES} ${FLAGS} -I./src/vm ./src/vm/vm.c -o ./build/vm/vm.o -c
clean:
rm -rf ${FILES}
rm -rf ./bin/blaze | 29d7ec629ae170fbf1a03cb0667741d8c9bc215f | [
"Markdown",
"C",
"Makefile"
]
| 4 | C | abhilashkrishnan/blaze | 03379038750e003ff23b80e9c28602420c46bae4 | 12ac3487ccc9fb9d980eca78620622ba6618a658 |
refs/heads/master | <repo_name>paulosevero/vagrant_examples<file_sep>/vagrant-for-rails-apps/lang/README.es.md
# Entorno virtual de desarollo para Rails en Vagrant
[English][en] |
------------- |
Archivos de configuración para construir automaticamente una maquina virtual flexible con las herramientas básicas para desarrollar proyectos con el framework Rails.
## Forma de uso
0. Instala en tu computadora el software listado en la sección «Prerequisitos».
1. Clona el repositorio en tu computadora.
2. Ajusta el parametro `provider` y otras opciones del `Vagrantfile`.
3. Ejecuta `vagrant up` y espera a que la maquina virtual se construya. Al terminar, ejecuta `vagrant reload`.
4. Ya con la maquina virtual lista, conectate a ella con `vagrant ssh` y muevente al directorio sincronizado con `cd /vagrant`.
5. Ahora puedes crear un projecto de nuevo de Rails con `rails new .` o ajustar la configuracion de Ruby y las gemas con RVM para ser compatibles con un proyecto existente.
## Prerequisitos
* [Vagrant][0]
* [Virtualbox][1] y el paquete de extensión.
Si experimentas retrasos en la sincronización de archivos con `virtualbox` como `type`, una mejor opcion es usar [SSHFS][2] o [NFS][3], aunque usualmente funciona bien.
## Software incluido
* **[Ubuntu Bionic][4]**: Ésta distribución de Linux es la más cercana a la utilizada en el stack Heroku-18.
* **Ruby 2.5.x (con [RVM][5])**: Lenguaje de programación que soporta al framework Rails. El administrador de versiones de Ruby permite usar diferentes versiones y grupos de gemas por proyecto.
* **[Rails][11]**: Y otras gemas para ayudar al desarrollo de aplicaciones web. (Son las versiones estables más recientes):
- RSpec
- Cucumber
- Mailcatcher
- Pry-Byebug
- PG
- Redis-Rails
- Webpacker
- Bundler
* **[Yarn][12] and [Webpacker][13]**: Para proyectos de Rails con mucho uso de JavaScript.
* **[Node.js][6]**: Motor de ejecición para JavaScript del lado del servidor. (Versión estable más reciente).
* **[Postgres][7]**: Manejador avanzado de bases de datos relacionales. (Distribución de Bionic).
* **[Redis][8]**: Almacenamiento de estructura de datos en memoria. (Distribución de Bionic).
* **[Heroku CLI][9]**: Crea y maneja applicaciones en Heroku desde la terminal.
* **[Ngrok][15]**: Crea tuneles temporales seguros para permitir el acceso a tu aplicación desde Internet.
* **ZSH Shell (With [Oh-My-Zsh!][14])** Herramientas para mejorar la experiencia al trabajar con la terminal.
---
[0]: https://www.vagrantup.com/downloads.html
[1]: https://www.virtualbox.org/wiki/Downloads
[2]: https://fedoramagazine.org/vagrant-sharing-folders-vagrant-sshfs/
[3]: https://www.vagrantup.com/docs/synced-folders/nfs.html
[4]: https://app.vagrantup.com/ubuntu/boxes/bionic64
[5]: https://rvm.io/
[6]: https://nodejs.org/en/
[7]: https://www.postgresql.org/
[8]: https://redis.io/
[9]: https://devcenter.heroku.com/articles/heroku-cli
[10]: https://www.heroku.com/
[11]: https://rubyonrails.org/
[12]: https://yarnpkg.com/
[13]: https://github.com/rails/webpacker
[14]: http://ohmyz.sh/
[15]: https://ngrok.com/
[en]: ../README.md
<file_sep>/vagrant-for-rails-apps/Vagrantfile
Vagrant.configure('2') do |config|
config.vm.box = 'ubuntu/bionic64'
config.vm.define 'rails-apps-box'
config.vm.provider 'virtualbox' do |vb|
vb.customize ['modifyvm', :id, '--memory', '1024']
end
config.vm.synced_folder '.', '/vagrant', type: 'virtualbox'
config.vm.provision "shell", privileged: false, run: "once", path: "provision/zsh_setup.sh"
config.vm.provision "shell", privileged: false, run: "once", path: "provision/box_setup.zsh", env: { "LC_ALL" => "en_US.UTF-8", "LANG" => "en_US.UTF-8", "LANGUAGE" => "en_US.UTF-8", }
end
<file_sep>/README.md
## How to Use Vagrant for Both Research and Development
In this repository I show how Vagrant can be used for both development and research purposes.
## Vagrant for Researchers
Reproducibility is an emerging concern in current high-level scientific research. Nowadays, there are tons of tools that can support researchers in ensuring their research will be easily reproduced (e.g., version control platforms like [GitHub](https://github.com), interactive notebooks like [Jupyter](http://jupyter.org/), etc.). However, not everyone knows about these technologies, so it is always important to show how to use them. In this repository, I show how we can use Vagrant to create automated research environments that can be easily reproduced without worrying about issues like OS incompatibilities or broken dependencies.
The sample scientific environment I automated through Vagrant was the NAS Parallel Benchmarks (NPB), a well-known High-Performance Computing benchmark suite maintained by NASA. More specifically, I created a Vagrantfile that automatically downloads and configures the OpenMP version of NPB. One of the most relevant aspects of reproducible research is customization: if you download the repository, you will probably want to execute the experiments just like I did in my research, but in some cases you may also be interested in modifying some parameters to make your own contribution. To exemplify such customization, I created a variable called PROCS (second line) that you can adjust to change the number of processors that will be available to the benchmark. So, to set up the NPB, you just need to execute `vagrant up`, and my script will download NPB from its official website and install its dependencies. To execute one of the NPB algorithms, just enter the VM by executing `vagrant ssh`, access the benchmark folder by running `cd NPB3.3.1/NPB3.3-OMP`, compile the algorithm you want with the desired class, for example FT: `make ft CLASS=A`, then finally run the algorithm: `./bin/ft.A.x`.
## Vagrant for Developers
Nowadays, one of the key aspects of achieving success and earning a place in the sun is productivity. Given that, developers are looking for strategies to improve their productivity through emerging technologies and tools that help them focus on what is actually relevant, and in this context Vagrant comes with the proposal of reducing the time spent configuring development environments to almost zero.
To exemplify how to use Vagrant to automate development environments, I used the example provided by [Rojo](https://github.com/Rojo/vagrant-for-rails-apps), who created a Vagrantfile that automatically installs a set of tools commonly adopted by Ruby on Rails developers:
* Ruby gems
* RSpec
* Cucumber
* Mailcatcher
* Pry-Byebug
* PG
* Redis-Rails
* Webpacker
* Bundler
* Yarn and Webpacker: For Rails projects with heavy use of JavaScript.
* Node.js: Server side JavaScript runtime.
* Postgres: Advanced SQL database.
* Redis: In-memory data structure store.
* Heroku CLI: Create and manage Heroku apps from the command line.
* Ngrok: Exposes local servers behind NATs and firewalls to the public internet over secure tunnels.
* ZSH Shell (With Oh-My-Zsh!) Tools to improve the experience of working with the shell.
To get our Ruby on Rails environment up and running, we just need to execute `vagrant up` and the automation code will do the rest for us. As soon as the process is finished, we can access our development environment by running `vagrant ssh` and create Rails applications as we are used to (`rails new myapp ...`).
<file_sep>/vagrant-for-rails-apps/provision/box_setup.zsh
#!/usr/bin/env zsh
###
# Set missing language configuration
set_language()
{
echo "Setting ´vagrant´ user language configuration"
echo -e "\n# Set locale configuration" >> ~/.profile
echo 'export LC_ALL=en_US.UTF-8' >> ~/.profile
echo 'export LANG=en_US.UTF-8' >> ~/.profile
echo "export LANGUAGE=en_US.UTF-8\n" >> ~/.profile
}
###
# Install and configure PostgreSQL database manager
install_postgres() {
echo "Installing PostgreSQL"
sudo apt-get update
sudo apt-get install -y postgresql postgresql-contrib
# Set up vagrant user
sudo -u postgres bash -c "psql -c \"CREATE USER vagrant WITH PASSWORD '<PASSWORD>';\""
sudo -u postgres bash -c "psql -c \"ALTER USER vagrant WITH SUPERUSER;\""
# Make available useful extensions to the schemas
sudo -u postgres bash -c "psql -c \"CREATE EXTENSION unaccent SCHEMA pg_catalog;\""
sudo -u postgres bash -c "psql -c \"CREATE EXTENSION pg_trgm SCHEMA pg_catalog;\""
# Start service
sudo service postgresql start
}
###
# Install Redis data store
install_redis() {
echo "Installing Redis"
sudo apt-get install -y redis-server
}
###
# Install Node Version Manager
install_nvm() {
echo "Installing NVM"
wget -qO- https://raw.githubusercontent.com/creationix/nvm/v0.33.8/install.sh | bash
echo -e "\n# Node Version Manager" >> ~/.profile
echo 'export NVM_DIR="$HOME/.nvm"' >> ~/.profile
echo '[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"' >> ~/.profile
echo '[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion"' >> ~/.profile
echo -e "\n" >> ~/.profile
echo 'autoload -U add-zsh-hook' >> ~/.profile
echo 'load-nvmrc() {' >> ~/.profile
echo ' local node_version="$(nvm version)"' >> ~/.profile
echo ' local nvmrc_path="$(nvm_find_nvmrc)"' >> ~/.profile
echo -e "\n" >> ~/.profile
echo ' if [ -n "$nvmrc_path" ]; then' >> ~/.profile
echo ' local nvmrc_node_version=$(nvm version "$(cat "${nvmrc_path}")")' >> ~/.profile
echo -e "\n" >> ~/.profile
echo ' if [ "$nvmrc_node_version" = "N/A" ]; then' >> ~/.profile
echo ' nvm install' >> ~/.profile
echo ' elif [ "$nvmrc_node_version" != "$node_version" ]; then' >> ~/.profile
echo ' nvm use' >> ~/.profile
echo ' fi' >> ~/.profile
echo ' elif [ "$node_version" != "$(nvm version default)" ]; then' >> ~/.profile
echo ' echo "Reverting to nvm default version"' >> ~/.profile
echo ' nvm use default' >> ~/.profile
echo ' fi' >> ~/.profile
echo '}' >> ~/.profile
echo 'add-zsh-hook chpwd load-nvmrc' >> ~/.profile
echo -e "load-nvmrc\n" >> ~/.profile
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion"
}
###
# Install stable version of Node.js
install_nodejs() {
echo 'Installing Node.js'
nvm install stable
nvm alias default stable
nvm use default
}
###
# Install Ruby Version Manager
install_rvm() {
echo 'Installing RVM'
gpg --keyserver hkp://keys.gnupg.net --recv-keys <KEY> 7D2BAF1CF37B13E2069D6956105BD0E739499BDB
\curl -sSL https://get.rvm.io | bash
source $HOME/.rvm/scripts/rvm
rvm get head
}
###
# Install Matz Ruby Interpreter and common gems
install_ruby() {
echo 'Installing Ruby'
sudo apt-get install -y libxml2 libxml2-dev libxslt1-dev libpq-dev
rvm install 2.5
rvm use 2.5@global
gem update --system --no-ri --no-rdoc
gem update --no-ri --no-rdoc
gem install bundler rails rspec-rails cucumber-rails pg redis-rails webpacker mailcatcher pry-byebug --no-ri --no-rdoc
rvm use 2.5 --default
rvm cleanup all
}
###
# Install Heroku CLI
install_heroku() {
echo 'Installing Heroku CLI'
wget -qO- https://cli-assets.heroku.com/install-ubuntu.sh | sh
}
###
# Install Yarn JavaScript modules manager
install_yarn() {
echo 'Installing Yarn'
curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add -
echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list
sudo apt-get update && sudo apt-get install -y yarn
}
###
# Install Ngrok secure tunnel manager
install_ngrok() {
echo 'Installing Ngrok'
sudo apt install -y unzip
wget https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip
sudo unzip ngrok-stable-linux-amd64.zip -d /usr/local/bin
rm -rf ngrok-stable-linux-amd64.zip
}
###
# Remove unused software
clean_up() {
sudo apt -y autoremove && sudo apt autoclean
}
setup() {
set_language
install_postgres
install_redis
install_nvm
install_nodejs
install_rvm
install_ruby
install_heroku
install_yarn
install_ngrok
clean_up
}
setup "$@"
<file_sep>/vagrant-npb/Vagrantfile
# Number of processors the VM will have access to
PROCS=4
Vagrant.configure("2") do |config|
config.vm.define "npb" do |npb|
npb.vm.box = "ubuntu/bionic64"
npb.vm.network :forwarded_port, guest: 80, host: 8080
npb.vm.hostname = "npb"
npb.vm.provider :virtualbox do |vb|
vb.customize ["modifyvm", :id, "--memory", "2048"]
vb.customize ["modifyvm", :id, "--cpus", PROCS]
end
npb.vm.provision "shell", inline: <<-SHELL
export DEBIAN_FRONTEND=noninteractive
cd /home/vagrant
sudo apt update
sudo apt install -y build-essential gfortran tmux htop git sysstat
wget https://www.nas.nasa.gov/assets/npb/NPB3.3.1.tar.gz
tar -xzf NPB3.3.1.tar.gz
rm -rf NPB3.3.1/NPB3.3-MPI
rm -rf NPB3.3.1/NPB3.3-SER
rm NPB3.3.1/Changes.log
rm NPB3.3.1/NPB3.3-HPF.README
rm NPB3.3.1/NPB3.3-JAV.README
rm NPB3.3.1/README
cd NPB3.3.1/NPB3.3-OMP/config
cp suite.def.template suite.def
cp NAS.samples/make.def.gcc_x86 make.def
cd /home/vagrant
chmod 777 -R /home/vagrant/NPB3.3.1
export OMP_NUM_THREADS=#{PROCS}
export OMP_PROC_BIND=true
SHELL
end
end
| 0ba91fcc53002d9455d9ad1c10a08a743e887a67 | [
"Markdown",
"Ruby",
"Shell"
]
| 5 | Markdown | paulosevero/vagrant_examples | 43019420801f0269179d893becc3170b00b47dc9 | c72d32f135ce9a177622eae9bdb94c546405b288 |
refs/heads/master | <file_sep>package pageObject;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import resources.Base;
public class Register_Page extends Base{
public Register_Page (WebDriver driver) {
super();
this.driver = driver;
}
By fullName = By.xpath("//input[@name='full_name']");
By preferred = By.xpath("//input[@name='preferred_name']");
By email = By.xpath("//input[@name='email']");
By selectCountry = By.xpath("//img[@class='q-img__image q-img__image--with-transition q-img__image--loaded']");
By mobileNumber = By.xpath("//input[@name='phone']");
By getInfoForm = By.xpath("//input[@class='q-field__input q-placeholder col']");
By promoCode = By.xpath("//input[@data-cy='register-person-referral-code']");
By chkAgree = By.xpath("//div[@class='q-checkbox__bg absolute']");
By applyCode = By.xpath("//span[contains(text(),'Apply')]");
By btnContinue = By.xpath("//span[contains(text(),'Con')]");
By loginPage = By.xpath("//span[contains(text(),'Login')]");
By skip = By.xpath("//span[contains(text(),'Skip')]");
By iframe = By.xpath("//fc_widget");
public WebElement fullName() {
return driver.findElement(fullName);
}
public WebElement skip() {
return driver.findElement(skip);
}
public WebElement preferred() {
return driver.findElement(preferred);
}
public WebElement applyCode() {
return driver.findElement(applyCode);
}
public WebElement email() {
return driver.findElement(email);
}
public WebElement selectCountry() {
return driver.findElement(selectCountry);
}
public WebElement mobileNumber() {
return driver.findElement(mobileNumber);
}
public WebElement getInfoForm() {
return driver.findElement(getInfoForm);
}
public WebElement promoCode() {
return driver.findElement(promoCode);
}
public WebElement chkAgree() {
return driver.findElement(chkAgree);
}
public VerifyOTP_Page btnContinue() {
driver.findElement(btnContinue).click();
VerifyOTP_Page v = new VerifyOTP_Page(driver);
return v;
}
public Login_page loginPage() {
driver.findElement(loginPage).click();
Login_page l = new Login_page(driver);
return l;
}
public Chatbox_page iframe() {
driver.switchTo().frame(driver.findElement(iframe));
Chatbox_page c = new Chatbox_page(driver);
return c;
}
}
<file_sep>package resources;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.Properties;
public class Utils {
public static String getStringValue(String value) throws IOException {
Properties p = new Properties();
FileInputStream fis = new FileInputStream(new File(System.getProperty("user.dir")+"/src/test/java/resources/global.properties"));
p.load(fis);
return p.getProperty(value);
}
}
<file_sep>package resources;
import java.io.IOException;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
public class Base {
public static WebDriver driver;
public static WebDriver initBrowser() throws IOException {
System.setProperty("webdriver.chrome.driver", System.getProperty("user.dir")+"/src/test/java/resources/chromedriver.exe");
driver = new ChromeDriver();
driver.get(Utils.getStringValue("baseURL"));
driver.manage().window().maximize();
return driver;
}
}
<file_sep>baseURL = https://feature-qa-automation.customer-frontend.staging.aspireapp.com/sg/
verifyOTP = https://feature-qa-automation.customer-frontend.staging.aspireapp.com/sg/register/verify-otp
<file_sep>package pageObject;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import resources.Base;
public class Login_page extends Base {
public Login_page(WebDriver driver) {
this.driver=driver;
}
By userName = By.xpath("//input[@name='username']");
By rememberMe = By.xpath("//*[name()='svg']");
By register = By.xpath("//span[contains(text(),'Reg')]");
By btnNext = By.xpath("//span[contains(text(),'Next')]");
public WebElement userName() {
return driver.findElement(userName);
}
public WebElement rememberMe() {
return driver.findElement(rememberMe);
}
public Register_Page register() {
driver.findElement(register).click();
Register_Page r = new Register_Page(driver);
return r;
}
public WebElement btnNext() {
return driver.findElement(btnNext);
}
}
| 1b3be079b8cfa5a781af1d49b4b01b84a02c5072 | [
"Java",
"INI"
]
| 5 | Java | dungdoan68/Aspire | 8701da18a8302ae8c9f7a9585dcef320ff364f0b | ce6832418bee388b949e3e217dc9e2c0dac100f1 |
refs/heads/master | <repo_name>bmehler/jenkinstest<file_sep>/src/test.php
<?php
echo 'Nochmal von ganz vorne';
echo 'An den Anfang';
echo 'Zurück! Von ganz vorne!';
echo 'Beginnen wir nochmal';
echo 'Nochmal beginnen';
echo 'Und wieder von vorne!';
echo 'Heute ist ein schöner Tag!';
echo 'Hoffentlich morgen auch';
echo 'Es regnet nicht!';
echo 'Manchmal gibts auch Schnee!';
echo 'Ich weine!';
echo 'Wann wirds mal wieder richtig Sommer?';
echo 'OH da hat sich ein Fehler eingeschlichen';
echo 'Mal sehen ob jetzt was geht!';
echo 'Jetzt ist wieder Somme';
echo 'Ist heute ein guter Tag!';
?>
| fb5efa810317bfdaaf8ca8a93e019b7ea65eca5c | [
"PHP"
]
| 1 | PHP | bmehler/jenkinstest | 61682b9a3ce9e08046447344129b6148496517bf | e76dbde7914fadb767163072143ccc6c703be129 |
refs/heads/master | <repo_name>k0div/BOT<file_sep>/index.js
const discord = require("discord.js");
const bot = new discord.Client();
bot.on("message", (message) => {
if(message.content == "!teste"){
//message.reply("Teste Concluido!");
// sendMessage() is deprecated in discord.js (removed in v12); use send()
message.channel.send("Teste Concluido!");
}
});
bot.login(process.env.BOT_TOKEN);
| 8e212f6800cd8439738a7f94dc7addef35fb7d7b | [
"JavaScript"
]
| 1 | JavaScript | k0div/BOT | 0f3b83e3e962c278a058ab156cbc81ea0cf57b94 | 26efe18db15b6093b6cb86eea7ffb216573881d4 |
refs/heads/master | <repo_name>Nepeus/querido<file_sep>/conectar.php
<?php
$nombre_bd = $_POST['name'];
$conn = mysqli_connect($_POST['host'], $_POST['user'], $_POST['pass']);
if (!$conn) {
echo 'No se pudo conectar a mysql';
exit;
}
$sql = "SHOW TABLES FROM $nombre_bd";
$resultado = mysqli_query($conn,$sql);
if (!$resultado) {
echo "Error de BD, no se pudieron listar las tablas\n";
echo 'Error MySQL: ' . mysqli_error($conn); // mysqli_error() requires the connection argument
exit;
}
$i = 0;
while($row = mysqli_fetch_array($resultado)){ // loop through the result rows ($result was undefined)
echo "<tr><td>" . $i . "</td><td>" . $row[0] . "</td></tr>"; // $row[0] is the table name (numeric index)
$i++;
}
//mysqli_free_result($resultado);
?><file_sep>/generar.php
<?php
include "lib/phpqrcode/qrlib.php";
$name = $_POST['name'];
$sortName = $_POST['username'];
$dni = $_POST['dni'];
$email = $_POST['email'];
$addressLabel = '';
$addressPobox = '';
$addressExt = '';
$addressStreet = '';
$addressTown = '';
$addressRegion = $_POST['location'];
$addressPostCode = '';
$addressCountry = '';
$codeContents = 'BEGIN:VCARD'."\n";
$codeContents .= 'VERSION:2.1'."\n";
$codeContents .= 'N:'.$sortName."\n";
$codeContents .= 'FN:'.$name."\n";
$codeContents .= 'TEL;WORK;VOICE:'.$dni."\n";
$codeContents .= 'ADR;TYPE=work;'.
'LABEL="'.$addressLabel.'":'
.$addressPobox.';'
.$addressExt.';'
.$addressStreet.';'
.$addressTown.';'
.$addressPostCode.';'
.$addressCountry
."\n";
$codeContents .= 'EMAIL:'.$email."\n";
$codeContents .= 'END:VCARD';
$uniqid = uniqid();
QRcode::png($codeContents, "qr/".$uniqid.'.png', QR_ECLEVEL_L, 3); // QR_ECLEVEL_L: phpqrcode's error-correction constant (bare L is an undefined constant)
// displaying
?>
<?php include_once 'header.php'; ?>
<?php include_once 'menu.php'; ?>
<section id="main">
<section id="content">
<div class="container">
<div class="card" style="margin-left:auto;margin-right:auto;width:500px;">
<div class="card-header" style="text-align:center;">
<h2>¡Codigo QR generado con exito!</h2>
</div>
<div class="card-body card-padding" style="text-align:center;">
<img src="qr/<?php echo $uniqid ?>.png"/>
</div>
</div>
<div class="block-header" style="text-align:center;padding-bottom:10px;">
<a class="btn btn-primary btn-lg login-button" href="genqr.php" class="">Volver</a>
</div>
</div>
</section>
</section>
<?php include_once 'footer.php'; ?>
</body>
</html><file_sep>/footer.php
<footer id="footer">
<?php echo date('Y') ?> © <NAME>
</footer><file_sep>/conexion.php
<?php include_once 'header.php'; ?>
<?php include_once 'menu.php'; ?>
<?php
$test = array(
'nombre' => 'u844022381_multa',
'user' => 'u844022381_app',
'pass' => '<PASSWORD>',
'host' => 'mysql.hostinger.com.ar'
)
?>
<section id="main">
<section id="content">
<div class="container">
<div class="block-header">
<h2>Ingrese datos de conexión</h2>
</div>
<div class="card">
<div class="card-header">
<h2>Base de datos</h2>
</div>
<div class="card-body card-padding">
<form class="row" role="form" action="" id="connForm">
<div class="col-sm-2">
<div class="form-group fg-line">
<label class="sr-only" for="qrnombre">Nombre</label>
<input type="text" class="form-control input-sm" id="qrnombre" placeholder="Base de datos" name="name" value="<?php echo $test['nombre'] ?>">
</div>
</div>
<div class="col-sm-2">
<div class="form-group fg-line">
<label class="sr-only" for="qrPass">Usuario</label>
<input type="text" class="form-control input-sm" id="qrUser" placeholder="Usuario" name="user" value="<?php echo $test['user'] ?>">
</div>
</div>
<div class="col-sm-2">
<div class="form-group fg-line">
<label class="sr-only" for="qrPass">Password</label>
<input type="password" class="form-control input-sm" id="qrPass" placeholder="Contraseña" name="pass" value="<?php echo $test['pass'] ?>">
</div>
</div>
<div class="col-sm-2">
<div class="form-group fg-line">
<label class="sr-only" for="qrPass">Host</label>
<input type="text" class="form-control input-sm" id="qrHost" placeholder="127.0.0.1" name="host" value="<?php echo $test['host'] ?>">
</div>
</div>
<div class="col-sm-4">
<a href="#" onclick="conectar($('#connForm'))" class="btn btn-primary btn-sm m-t-5 waves-effect">CONECTAR</a>
</div>
</form>
</div>
</div>
<div class="block-header" style="display: none;">
<h2>Tablas de la base de datos
<small>Seleccione la tabla deseada para generar el/los codigo/s QR</small>
</h2>
</div>
<div class="card" style="display:none" id="tablas">
<div class="card-header">
<h2>Tablas</h2>
</div>
<div class="card-body table-responsive">
<table class="table">
<thead>
<th>#</th>
<th>Tabla</th>
<th>Acciones</th>
</thead>
<tbody>
</tbody>
</table>
</div>
</div>
</div>
</section>
</section>
<?php include_once 'footer.php' ?>
<script type="text/javascript">
function conectar(form){
$.ajax({
url: 'conectar.php',
type: 'POST',
dataType: 'html',
data: form.serialize()
})
.done(function(tablas) {
$('#tablas').show();
$('#tablas').after().show();
$('table.table tbody').append(tablas)
})
.fail(function() {
console.log("error");
})
.always(function() {
console.log("complete");
});
}
</script>
</html><file_sep>/README.md
# Querido
Querido is a form that generates a QR code from the data entered.
- TODO
# TODO
- Show a URL with the code (shorturl)
- Improve the form styles
### Tech
Querido uses the following technologies to work properly:
* [phpqrcode] - QR code generator!
* [Twitter Bootstrap] - great UI boilerplate for modern web apps
* [jQuery] - duh
Open source, of course
<file_sep>/genqr.php
<?php include_once 'header.php'; ?>
<?php include_once 'menu.php'; ?>
<section id="main">
<section id="content">
<div class="container">
<div class="block-header" style="text-align:center;">
<h2>Formulario de datos</h2>
</div>
<div class="card" style="margin-left:auto;margin-right:auto;width:500px;">
<div class="card-header">
<h2>Aviso:
<small>Los datos cargados en el siguiente formulario se utilizaran para generar un codigo QR que acredita identidad.</small>
</h2>
</div>
<div class="card-body card-padding">
<form class="" method="post" action="generar.php" >
<div class="form-group fg-line">
<label for="qrNombre">Nombre</label>
<input type="text" name="name" class="form-control input-sm" id="qrNombre" placeholder="Ingrese Nombres"/>
</div>
<div class="form-group fg-line">
<label for="qrApellido">Apellido</label>
<input type="text" name="username" class="form-control input-sm" id="qrApellido" placeholder="Ingrese Apellidos"/>
</div>
<div class="form-group fg-line">
<label for="qrDni">DNI</label>
<input type="text" name="dni" class="form-control input-sm" id="qrDni" placeholder="xx.xxx.xxx"/>
</div>
<div class="form-group fg-line">
<label for="qrMail">E-Mail</label>
<input type="email" name="email" class="form-control input-sm" id="qrMail" placeholder="<EMAIL>"/>
</div>
<div class="form-group fg-line">
<label for="qrLocalidad">Localidad</label>
<input type="text" name="location" class="form-control input-sm" id="qrLocalidad"
placeholder="Ej: Palmira, Mendoza"/>
</div>
<button type="submit" target="_self" id="button" class="btn btn-primary btn-lg btn-block login-button">Registrar</button>
</form>
</div>
</div>
</div>
</section>
</section>
<?php include_once 'footer.php' ?>
</html><file_sep>/index.php
<?php include_once 'header.php'; ?>
<?php include_once 'menu.php'; ?>
<section id="main">
<section id="content">
<div class="container">
<div class="block-header" style="text-align:center;">
<h2>Formulario de datos</h2>
</div>
<a href="conexion.php" class="btn btn-primary btn-lg waves-effect">Conectar a una Base de datos</a>
<a href="genqr.php" class="btn bgm-bluegray btn-lg waves-effect">Desde Formulario</a>
</div>
</section>
</section>
<?php include_once 'footer.php' ?>
</html> | 191a5888c1a2a548ab5dab65633f54ad506ba3a7 | [
"Markdown",
"PHP"
]
| 7 | PHP | Nepeus/querido | f7a625bed737bb08df80059be98906e85a499423 | 1c83db1311c5a08fa37e5d44a51e6a5d0e802154 |
refs/heads/main | <file_sep>const trailHeads = ['Bell Canyon', 'Rock Mouth', 'Hidden Valley', 'Little Willow Canyon', 'Draper Alpine', 'Dimple Dell', 'Temple Quarry', 'Alpenbock Loop']
const pathBtn = document.querySelector("#path-finder")
function pathFinder(){
alert("You should Try " + trailHeads[Math.floor(Math.random() * trailHeads.length)] + " Trail")
}
pathBtn.addEventListener('click', pathFinder)<file_sep>let colorBtn = document.querySelector("#color")
function favColor(){
alert('My favorite color is Yellow!')
}
colorBtn.addEventListener("click", favColor)
let placeBtn = document.querySelector("#place")
function favPlace(){
alert('My favorite place is in the Mountains!')
}
placeBtn.addEventListener("click", favPlace)
let ritualBtn = document.querySelector("#ritual")
function favRitual(){
alert('My favorite ritual is dancing or shadow boxing before going on a run!')
}
ritualBtn.addEventListener("click", favRitual)
| 077f166855a187852c7cf517d582db08aacc7fbe | [
"JavaScript"
]
| 2 | JavaScript | owenutley/Assessment-week-3 | 9453d2f3fdf6b80c5efad4336f60f640817a2278 | 568ed1d2a73890ab0b91bae62c974147194efb88 |
refs/heads/master | <file_sep># Dot Product Experiment
用不同的硬件和方法计算不同长度的向量点积并计算其消耗的时间
c = [a1, a2, ..., an] * [b1, b2, ..., bn] = sum(a1×b1, a2×b2, ... an×bn)
程序分别用以下三种方式计算点积:
+ CPU单核暴力运算
+ OpenMP多线程运算
+ CUDA GPU运算
## 编译环境:
+ CMake 3.17 or above(CLion自带的好像编译不出OpenMP,自己装的可以)
+ CUDA版本不是太低应该都行
+ 须在环境变量中设置`CUDA_PATH`为CUDA的安装目录,在`PATH`中添加nvcc所在的目录
+ 支持OpenMP的C++编译器
## 运行和输出
如果你不想自己编译,也可以前往[Releases](https://github.com/lonelyion/dot_product_experiment/releases)页面下载Windows平台的二进制文件,当然需要NVIDIA的显卡(但是不需要安装CUDA),否则不能运行。
当然有N卡还是不能运行的话,可以尝试把数据量改小一点再编译,可能显存不够也会出问题。
程序会在当前目录创建一个`output.csv`的文件作为输出,表格样例如下,时间的单位为微秒(μs):
| 向量长度N | tCPU1/单核运算时间 | tCPU2/OMP运算时间 | tGPU/CUDA运算时间 | Diff/CPU和GPU结果的差值 |
| ---- | ---- | ---- | ---- | ---- |
|16|0|13|82|0|
|32|0|108|23|0.000001|
|...|...|...|...|...|
|268435456|687158|110460|19145|0.586042|
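The three implementations above are C++/CUDA, but the measurement itself is easy to illustrate. As a rough, stdlib-only Python sketch (not the repository's code; `dot_naive` and `timed_us` are made-up names for this illustration), timing a naive loop against Python's builtin `sum`, in microseconds like the CSV columns:

```python
import random
import time
from operator import mul

def dot_naive(a, b):
    # Single-threaded reference: sum of element-wise products
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

def timed_us(fn):
    # Run fn() and return (result, elapsed microseconds),
    # matching the time unit used in output.csv
    t0 = time.perf_counter()
    out = fn()
    return out, int((time.perf_counter() - t0) * 1e6)

if __name__ == "__main__":
    n = 1 << 16
    a = [random.random() for _ in range(n)]
    b = [random.random() for _ in range(n)]
    r1, t1 = timed_us(lambda: dot_naive(a, b))
    r2, t2 = timed_us(lambda: sum(map(mul, a, b)))
    # One CSV-style row: N, time of method 1, time of method 2, |difference|
    print(f"{n},{t1},{t2},{abs(r1 - r2)}")
```

Running it prints one CSV-style row (`N,t1,t2,diff`); the actual experiment additionally runs the OpenMP and CUDA versions and compares the CPU/GPU results, as in the `Diff` column.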
<file_sep>cmake_minimum_required(VERSION 3.17)
project(dot_product LANGUAGES CXX CUDA)
set(CMAKE_CUDA_STANDARD 14)
find_package(OpenMP REQUIRED)
if (OPENMP_FOUND)
set(CMAKE_CUDA_FLAGS "${CMAKE_CUDA_FLAGS} -Xcompiler=${OpenMP_CXX_FLAGS}")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} ${OpenMP_CXX_FLAGS}")
endif()
add_executable(dot_product main.cu my_dot_product.h)
set_target_properties(
dot_product
PROPERTIES
CUDA_SEPARABLE_COMPILATION ON)<file_sep>//
// Created by ion on 2021/4/7.
//
#ifndef _MY_DOT_PRODUCT_H
#define _MY_DOT_PRODUCT_H
#include <random>
#include <vector>
#include <iostream>
#include <chrono>
#include "omp.h"
#include <cstdlib>
extern const size_t vector_n;
template <typename T>
double dot_cpu(const std::vector<T> &a, const std::vector<T> &b, int n)
{
double dTemp = 0;
for (int i=0; i<n; ++i)
{
dTemp += a[i] * b[i];
}
return dTemp;
}
template <typename T>
void print_vector(T *a, size_t row, size_t len, std::vector<T> &out) {
printf("(");
for(int i = 0; i < len; i++) {
out[i] = a[row*len + i];
printf("%f", a[row*len + i]);
if(i != len - 1) printf(",");
}
printf(")");
}
void initialize_data(float* ip,int size)
{
std::random_device rd;
const unsigned int base_seed = rd();
//std::uniform_real_distribution<float> dis(0.0, 5.0);
// std::mt19937 is not thread-safe: give each OpenMP thread its own engine
// (seeded differently) instead of sharing one generator across the loop
#pragma omp parallel
{
std::mt19937 gen(base_seed + omp_get_thread_num());
std::normal_distribution<float> dis(1.0, 0.2);
#pragma omp for
for(int i = 0; i < size; i++)
{
ip[i] = dis(gen);
}
}
}
template <typename T>
bool check_result(T *dst_cpu, T *dst_gpu, const int size, float *a, float *b) {
static const double epsilon = 1.0E-7;
for(int i = 0; i < size; i++) {
if(abs(dst_cpu[i] - dst_gpu[i]) > epsilon) {
printf("Match failed at %d\n", i);
std::vector<float> va(vector_n+1),vb(vector_n+1);
print_vector(a, i, vector_n, va);
printf(" + ");
print_vector(b, i, vector_n, vb);
printf("\nCPU: %f / GPU: %f / Validate: %f\n", dst_cpu[i], dst_gpu[i], dot_cpu(va, vb, vector_n));
return false;
}
}
return true;
}
int time_d(std::chrono::time_point<std::chrono::steady_clock> a, std::chrono::time_point<std::chrono::steady_clock> b) {
//std::chrono::duration<double, std::milli> duration = (b - a);
//return duration.count();
return std::chrono::duration_cast<std::chrono::microseconds>(b - a).count();
}
#endif //VECTOR_SCALAR_PRODUCT_MY_DOT_PRODUCT_H
| c3fd58e5289a3894923ff8af6c866a486af8d593 | [
"Markdown",
"CMake",
"C++"
]
| 3 | Markdown | lonelyion/dot-product-experiment | f64b62ace32ad26d8394b3dff8a52b06e0b43d4f | 09af57374015bb4d8282bcea9704de3386a1de7e |
refs/heads/main | <file_sep>using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Facturation.Shared
{
public class Facture
{
public static int DEADLINE_WARNING = 14;
public static int DEADLINE_ALERT = 2;
public int Id { get; set; }
[Required(ErrorMessage = "Numero requis.")]
[StringLength(15, MinimumLength =6, ErrorMessage = "Doit être compris entre 6 et 15 caractères.")]
public string Numero { get; set; }
[Required(ErrorMessage = "Client requis.")]
[StringLength(50, MinimumLength = 2, ErrorMessage ="Doit être compris entre 2 et 50 caractères.")]
public string Client { get; set; }
[Required(ErrorMessage = "Date estimée de règlement requise.")]
public DateTime DateReglement { get; set; }
[Required(ErrorMessage = "Montant requis.")]
[Range(0, double.MaxValue, ErrorMessage = "Doit valoir au minimum 0.")]
public decimal Montant { get; set; }
[Required(ErrorMessage = "Montant réglé requis.")]
[Range(0, double.MaxValue, ErrorMessage = "Doit valoir au minimum 0.")]
public decimal MontantRegle { get; set; }
public static int NbDaysRemaining(DateTime date)
{
return (int)Math.Truncate((date - DateTime.Now).TotalDays);
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Facturation.Shared
{
public class DashboardDto
{
public DashboardDto()
{
}
public DashboardDto(IBusinessData from)
{
CA = from.CA;
Percu = from.Percu;
MontantDus = from
.GetFactures(DateTime.Now, null)
.Select(f => new MontantDuDto(f));
}
public DashboardDto(IEnumerable<Facture> from)
{
// IEnumerable.Append() returns a new sequence without mutating the source
// (and MontantDus was never initialized), so accumulate into a List instead
var montantDus = new List<MontantDuDto>();
foreach (Facture item in from)
{
CA += item.Montant;
Percu += item.MontantRegle;
montantDus.Add(new MontantDuDto(item));
}
MontantDus = montantDus;
}
public decimal CA { get; set; }
public decimal Percu { get; set; }
public IEnumerable<MontantDuDto> MontantDus { get; set; }
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Facturation.Shared
{
public static class Utils
{
public static string ToInputDate(this DateTime date)
{
string yyyy = date.Year.ToString();
// "<= 9" so that month/day 9 is zero-padded too (the previous "< 9" missed it)
string MM = date.Month <= 9 ? "0" + date.Month.ToString() : date.Month.ToString();
string DD = date.Day <= 9 ? "0" + date.Day.ToString() : date.Day.ToString();
return yyyy + "-" + MM + "-" + DD;
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Facturation.Shared
{
public class FactureSorting
{
public string SortBy { get; set; }
public string Order { get; set; }
public FactureSorting(string sortBy, string order)
{
SortBy = sortBy;
Order = order;
}
}
public static class E_FactureSortBy
{
public static bool MatchesFactureElement(this string elementToCheck)
{
return elementToCheck switch
{
DateReglement => true,
Numero => true,
Client => true,
Montant => true,
_ => false,
};
}
public static bool MatchesSortOrder(this string elementToCheck)
{
return elementToCheck switch
{
ASC => true,
DESC => true,
_ => false,
};
}
public const string DateReglement = "DateReglement";
public const string Numero = "Numero";
public const string Client = "Client";
public const string Montant = "Montant";
public const string ASC = "ASC";
public const string DESC = "DESC";
}
}
<file_sep>using Facturation.Shared;
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;
namespace Facturation.Server.Models
{
public class BusinessDataSql : IBusinessData, IDisposable
{
private SqlConnection cnct;
public BusinessDataSql(string connectionString)
{
cnct = new SqlConnection(connectionString);
}
public void Dispose()
{
cnct.Dispose();
}
public IEnumerable<Facture> Factures
=> cnct.Query<Facture>("SELECT * FROM Factures");
// isnull: SUM over an empty table yields NULL, which QueryFirst<decimal> cannot map
public decimal CA
=> cnct.QueryFirst<decimal>("select isnull(sum(f.Montant), 0) from Factures f");
public decimal Percu
=> cnct.QueryFirst<decimal>("select isnull(sum(f.MontantRegle), 0) from Factures f");
public void Add(Facture facture)
=> cnct.Execute("insert into Factures values (@Numero, @Client, @DateReglement, @Montant, @MontantRegle)", facture);
public void Edit(Facture facture)
=> cnct.Execute("update Factures set Numero = @Numero, Client = @Client, DateReglement = @DateReglement, " +
"Montant = @Montant, MontantRegle = @MontantRegle where Id = @Id", facture);
public IEnumerable<Facture> GetFactures(DateTime? debut, DateTime? fin)
{
// Pass the dates as Dapper parameters; the previous string-formatted,
// unquoted dates produced invalid (and culture-dependent) SQL
string sql = "select * from Factures f ";
if(debut != null)
{
if(fin != null && fin > debut)
sql += "where DateReglement between @debut and @fin";
else
sql += "where DateReglement >= @debut";
}
else
if (fin != null)
sql += "where DateReglement <= @fin";
return cnct.Query<Facture>(sql, new { debut, fin });
}
public IEnumerable<Facture> GetFactures(string orderBy, string asc)
{
// ORDER BY cannot be parameterized, so callers must validate both values first
// (see E_FactureSortBy.MatchesFactureElement / MatchesSortOrder)
string sql = string.Format("select * from Factures f order by {0} {1}", orderBy, asc);
return cnct.Query<Facture>(sql);
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Facturation.Shared
{
public class DashboardResearch
{
public DateTime DateMin { get; set; }
public DateTime DateMax { get; set; }
public DashboardResearch()
{
int year = DateTime.Now.Year;
DateMin = new DateTime(year, 1, 1);
DateMax = new DateTime(year, 12, 31);
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Facturation.Shared
{
public class FactureRef
{
public string FReference { get; set; }
public FactureRef(string fRef)
{
FReference = fRef;
}
}
}
| 42d008dc0fe71496646f50e5ed8fc0830813b520 | [
"C#"
]
| 7 | C# | AqVScipio/NET_Facturation_Atelier4 | a96b662cbf40cff73243e35aaad17d8577fc6e07 | 226da137b9199dbcdc041c336809cd487d58463a |