branch_name (stringclasses, 149 values) | text (stringlengths, 23-89.3M) | directory_id (stringlengths, 40) | languages (listlengths, 1-19) | num_files (int64, 1-11.8k) | repo_language (stringclasses, 38 values) | repo_name (stringlengths, 6-114) | revision_id (stringlengths, 40) | snapshot_id (stringlengths, 40) |
---|---|---|---|---|---|---|---|---|
refs/heads/main
|
<repo_name>Guilospanck/Dota2CountersBackend<file_sep>/README.md
# Dota 2 Counters - Backend
Backend hosted on an Amazon EC2 instance that serves the React Native application [Dota2CountersReactNative](https://github.com/Guilospanck/Dota2CountersReactNative).
---
## Some configurations to remember
### FOR NODE APPLICATIONS YOU'LL NEED PM2:
1) Install PM2 on your server
```bash
npm install pm2@latest -g
```
:arrow_right: ```PM2``` is a process manager that keeps the Node process running in the background.
2) Go to the folder that contains your ```server.js``` and run
```bash
pm2 start server.js
```
:arrow_right: If you run into problems and need to pass Node.js parameters, you can do something like:
```bash
pm2 start server.js --node-args="--max-http-header-size=16384"
```
Other useful PM2 commands:
```bash
pm2 status
pm2 restart {index}
```
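:arrow_right: (optional) If you also want PM2 to bring the app back up after the instance reboots, the usual PM2 workflow (a generic sketch, not something configured in this repo) is roughly:
```bash
# Generate the startup-script command for the init system (assumes systemd, the Ubuntu default)
pm2 startup systemd
# Save the currently running process list so PM2 resurrects it on boot
pm2 save
```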
---
### TO FORWARD PORTS IN UBUNTU:
1) ```cat /proc/sys/net/ipv4/ip_forward``` :arrow_right: check whether IP forwarding is enabled (1: enabled, 0: disabled)
2) If it is disabled (0), then:
- ```sudo nano /etc/sysctl.conf```
- uncomment the line ```net.ipv4.ip_forward=1```
- ```sudo sysctl -p /etc/sysctl.conf``` (to apply the changes)
3) ```cat /proc/sys/net/ipv4/ip_forward``` :arrow_right: verify again that it is now enabled (1: enabled, 0: disabled)
4) set port forwarding:
```bash
sudo iptables -A PREROUTING -t nat -i eth0 -p tcp --dport {PORT_OF_YOUR_INSTANCE} -j REDIRECT --to-port {PORT_OF_YOUR_SERVER_APPLICATION}
```
For example:
```bash
sudo iptables -A PREROUTING -t nat -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8080
```
where:
- {PORT_OF_YOUR_INSTANCE} is the port opened on your instance (usually 80, 443, 22, ...)
- {PORT_OF_YOUR_SERVER_APPLICATION} is the port your application listens on (the one set in server.js for Node)
5) Now open your firewall to that port
```bash
sudo iptables -A INPUT -p tcp -m tcp --dport {PORT_OF_YOUR_INSTANCE} -j ACCEPT
```
For example:
```bash
sudo iptables -A INPUT -p tcp -m tcp --dport 80 -j ACCEPT
```
And:
```bash
sudo iptables -A OUTPUT -p tcp -m tcp --sport {PORT_OF_YOUR_INSTANCE} -j ACCEPT
```
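:arrow_right: To double-check that the rules were added as expected (a generic iptables sanity check, not specific to this setup), you can list them:
```bash
# Show the NAT PREROUTING rules (the port redirect) with rule numbers
sudo iptables -t nat -L PREROUTING -n --line-numbers
# Show the filter-table rules (the INPUT/OUTPUT accepts)
sudo iptables -L -n --line-numbers
```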
---
### TO SET CRONTAB:
1) Enable cron: ```sudo systemctl enable cron```
2) Type ```crontab -e``` and press enter
3) Select nano as your editor (usually option 1)
4) Set your cron job (go to crontab.guru for examples)
```bash
30 03 * * * python3 /home/ubuntu/dota2cp/dotabuff_scrapper.py
```
It'll run ```dotabuff_scrapper.py``` every day at 03:30 AM.
:arrow_right: (don't forget the leading "/" in the "/home..." path, otherwise it will not work)
:arrow_right: (always test the command in the terminal first: ```python3 /home/ubuntu/dota2cp/dotabuff_scrapper.py```)
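:arrow_right: (optional) Two generic commands to confirm the entry was registered and that cron is actually running it (assuming Ubuntu's default syslog location):
```bash
# List the current user's installed cron entries
crontab -l
# Look for cron executions of the job in the system log
grep CRON /var/log/syslog
```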
<file_sep>/server.js
#!/usr/bin/env nodejs
const express = require('express');
const bodyParser = require('body-parser');
const cors = require('cors');
const app = express();
const fs = require('fs')
app.use(cors());
// create application/json parser
app.use(bodyParser.json());
// create application/x-www-form-urlencoded parser
app.use(bodyParser.urlencoded({ extended: false }));
// First route
app.get('/dota2cp/api/', (req, res) => {
fs.readFile('../countersByHero.txt', 'utf8', (err, data) => {
if (err) {
res.send(err);
return
}
res.json({msg: "Success", data: data});
})
})
const port = 7033;
app.listen(port, () => { console.log('We are live on ' + port); });
|
3e5181b8e5cc6e069265fe5239be9ef6fac3d46c
|
[
"Markdown",
"JavaScript"
] | 2 |
Markdown
|
Guilospanck/Dota2CountersBackend
|
fa12dc044f3fdde157af6cb16befca9187f7e554
|
8483b6ec9d5f509b1b2d1e37b7514ae676774570
|
refs/heads/master
|
<file_sep>ANSIBLE_GROUPS = {
"chronos" => ["chronos1"],
"all_groups:children" => ["chronos"]
}
Vagrant.configure(2) do |config|
config.vm.box = "ubuntu/trusty64"
config.vm.provider "virtualbox" do |v|
v.memory = 1024
end
config.vm.define "chronos1" do |chronos1|
chronos1.vm.network "private_network", ip: "192.168.33.18"
chronos1.vm.hostname = "chronos1"
chronos1.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
end
<file_sep>FROM ubuntu:latest
RUN mkdir -p /data/test
COPY ./log /data/test/log
COPY ./run.sh /data/test/run.sh
CMD ["/data/test/run.sh"]
#CMD ["cat" , "/data/test/log"]
<file_sep>ANSIBLE_GROUPS = {
"mnodes" => ["mnode2", "mnode3", "mnode1"],
"all_groups:children" => ["mnodes"]
}
Vagrant.configure(2) do |config|
config.vm.box = "ubuntu/trusty64"
config.vm.provider "virtualbox" do |v|
v.memory = 1024
end
config.vm.define "mnode1" do |mnode1|
mnode1.vm.network "private_network", ip: "192.168.33.14"
mnode1.vm.hostname = "mnode1"
mnode1.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
config.vm.define "mnode2" do |mnode2|
mnode2.vm.network "private_network", ip: "192.168.33.15"
mnode2.vm.hostname = "mnode2"
mnode2.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
config.vm.define "mnode3" do |mnode3|
mnode3.vm.network "private_network", ip: "192.168.33.16"
mnode3.vm.hostname = "mnode3"
mnode3.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
end
<file_sep>ANSIBLE_GROUPS = {
"mnodes" => ["mnode2", "mnode3", "mnode1"],
"marathon" => [ "mnode1"],
"chronos" => [ "mnode1"],
"zookeeper" => [ "mnode1"],
"all_groups:children" => ["mnodes", "marathon", "chronos", "zookeeper"]
}
Vagrant.configure(2) do |config|
config.vm.box = "ubuntu/trusty64"
config.vm.provider "virtualbox" do |v|
v.memory = 1024
end
N = 3
(1..N).each do |machine_id|
config.vm.define "mnode#{machine_id}" do |machine|
machine.vm.hostname = "mnode#{machine_id}"
machine.vm.network "private_network", ip: "192.168.33.#{9+machine_id}"
# Only execute once the Ansible provisioner,
# when all the machines are up and ready.
if machine_id == N
machine.vm.provision "complete", type: "ansible" do |ansible|
# Disable default limit to connect to all the machines
ansible.groups = ANSIBLE_GROUPS
ansible.limit = "all_groups"
ansible.playbook = "playbook.yml"
end
machine.vm.provision "services", type: "ansible" do |ansible|
# Disable default limit to connect to all the machines
ansible.groups = ANSIBLE_GROUPS
ansible.limit = "all_groups"
ansible.playbook = "playbook_services.yml"
end
end
end
end
end
<file_sep>ANSIBLE_GROUPS = {
"znodes" => ["znode2", "znode3", "znode1"],
"all_groups:children" => ["znodes"]
}
Vagrant.configure(2) do |config|
config.vm.box = "ubuntu/trusty64"
config.vm.provider "virtualbox" do |v|
v.memory = 1024
end
config.vm.define "znode1" do |znode1|
znode1.vm.network "private_network", ip: "192.168.33.10"
znode1.vm.hostname = "znode1"
znode1.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
config.vm.define "znode2" do |znode2|
znode2.vm.network "private_network", ip: "192.168.33.11"
znode2.vm.hostname = "znode2"
znode2.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
config.vm.define "znode3" do |znode3|
znode3.vm.network "private_network", ip: "192.168.33.12"
znode3.vm.hostname = "znode3"
znode3.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
end
<file_sep>#!/bin/bash
TOP_DIR=/opt/teradata/appserver
if [ "$TOP_DIR" = "INSTALL_DIR_VAR" ]
then
echo "################################################################################"
echo "# "
echo "# You must replace the string INSTALL_DIR_VAR with your installation directory."
echo "# "
echo "################################################################################"
exit 1
fi
JAVA=$TOP_DIR/java/current/bin/java
sudo -u postgres $TOP_DIR/postgres/current/bin/psql -d appserver -p 6625 -c "$1"
if [ "$?" -ne "0" ]; then
echo "ERROR: Could not enter connection profile/catalog table into postgres.">>/tmp/install.log
exit $?
fi
<file_sep>-- Query to visualize table
insert into app_center_visualizations values (1,
'{
"db_table_name":"${db_table_name}",
"vizType":"table",
"version":"1.0",
"title":"Data from ${db_table_name}"
}');
<file_sep>ANSIBLE_GROUPS = {
"elk" => ["elk1"],
"all_groups:children" => ["elk"]
}
Vagrant.configure(2) do |config|
config.vm.box = "ubuntu/trusty64"
config.vm.provider "virtualbox" do |v|
v.memory = 4084
end
config.vm.define "elk1" do |elk1|
elk1.vm.network "private_network", ip: "192.168.33.19"
elk1.vm.hostname = "elk1"
elk1.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
end
<file_sep>ANSIBLE_GROUPS = {
"marathon" => ["marathon1"],
"all_groups:children" => ["marathon"]
}
Vagrant.configure(2) do |config|
config.vm.box = "ubuntu/trusty64"
config.vm.provider "virtualbox" do |v|
v.memory = 1024
end
config.vm.define "marathon1" do |marathon1|
marathon1.vm.network "private_network", ip: "192.168.33.17"
marathon1.vm.hostname = "marathon1"
marathon1.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
ansible.groups = ANSIBLE_GROUPS
end
end
end
|
7892c5f9437963106145a45ad168d68275144bb5
|
[
"SQL",
"Ruby",
"Dockerfile",
"Shell"
] | 9 |
Ruby
|
puneetguptanitj/mesos-vagrant-setup
|
724062e79e2f4e3c0b559bdd3fd5f486425cb003
|
729813586104a0d7cba5edb5df9610037c34d773
|
refs/heads/master
|
<file_sep>using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class ColorChangingScript : MonoBehaviour
{
Light lt;
public Light leftLight;
public Light rightLight;
public List<Color> colors;
void Start()
{
lt = GetComponent<Light>();
//StartCoroutine(updateColor());
//lt.color = Color.blue;
}
// Update is called once per frame
void Update()
{
if(Input.GetKey(KeyCode.Alpha1)) {
if(colors.Count > 0 && colors[0] != null) {
lt.color = colors[0];
}
}
if(Input.GetKey(KeyCode.Alpha2)) {
if(colors.Count > 1 && colors[1] != null) {
lt.color = colors[1];
}
}
if(Input.GetKey(KeyCode.Alpha3)) {
if(colors.Count > 2 && colors[2] != null) {
lt.color = colors[2];
}
}
if(Input.GetKey(KeyCode.Alpha4)) {
if(colors.Count > 3 && colors[3] != null) {
lt.color = colors[3];
}
}
if(Input.GetKey(KeyCode.Alpha5)) {
if(colors.Count > 4 && colors[4] != null) {
lt.color = colors[4];
}
}
if(Input.GetKey(KeyCode.Alpha6)) {
if(colors.Count > 5 && colors[5] != null) {
lt.color = colors[5];
}
}
if(Input.GetKey(KeyCode.J)) {
if (leftLight != null) {
StartCoroutine(updateLight(leftLight, this.lt.color));
}
}
if(Input.GetKey(KeyCode.K)) {
if(rightLight != null) {
StartCoroutine(updateLight(rightLight, this.lt.color));
} else {
//light.color = new Color(207, 131, 28);
}
}
if(Input.GetKey(KeyCode.B)){
lt.color = Color.red;
}
if(Input.GetKey(KeyCode.N)){
lt.color = Color.green;
}
if(Input.GetKey(KeyCode.M)){
lt.color = Color.blue;
}
}
IEnumerator updateLight(Light light, Color color) {
yield return new WaitForSeconds(.1f);
light.color = color;
}
}
|
3e001784b35c6f2b95610ee1d383cb4665095a5a
|
[
"C#"
] | 1 |
C#
|
garry645/ArenaModel-Team8
|
8a074693153aa2bd010941cbf970f6075eb6bcb3
|
d305374cc3a2bd8f1724fdc1d5d06e3a8c7d5671
|
refs/heads/master
|
<file_sep>window.onload=function(){
let banner=document.querySelectorAll(".banner .imgqw");
let dot=document.querySelectorAll(".banner li");
let but=document.querySelector(".banner .leftmini");
let but2=document.querySelector(".banner .rightmini");
let father=document.querySelector(".banners");
banner[0].style.opacity=1;
dot[0].classList.add("hots");
let next=0;
let now=0;
let cv=setInterval(move23,2000);
let flag=true;
function move23(){
next++;
if(next==banner.length){
next=0;
}
banner[now].style.opacity=0;
animate(banner[next],{opacity:1},function(){
flag=true;
});
dot[now].classList.remove("hots");
dot[next].classList.add("hots");
now=next;
}
function moveL(){
next--;
if(next==-1){
next=banner.length-1;
}
banner[now].style.opacity=0;
animate(banner[next],{opacity:1},function(){
flag=true;
});
dot[now].classList.remove("hots");
dot[next].classList.add("hots");
now=next;
}
for(let i=0;i<dot.length;i++){
dot[i].onmouseenter=function(){
clearInterval(cv);
for(let j=0;j<dot.length;j++){
dot[j].classList.remove("hots");
banner[j].style.opacity=0;
}
dot[i].classList.add("hots");
banner[i].style.opacity=1;
now=i;
next=i;
}
}
but.onclick=function(){
if(!flag){
return;
}
flag=false;
moveL();
}
but2.onclick=function(){
if(!flag){
return;
}
flag=false;
move23();
}
father.onmouseenter=function () {
clearInterval(cv);
}
father.onmouseleave=function () {
cv=setInterval(move23,2000);
}
let looks=document.querySelectorAll(".looks");
let dots=document.querySelectorAll(".dot31");
let dots1=document.querySelector(".cbox .but1");
let dots23=document.querySelector(".cbox .but2");
let smallbanner=function(looks,dots,dots1,dots23){
looks[0].style.left=0;
dots[0].classList.add("hot");
let next1=0;
let now1=0;
let flagg=true;
function move1(){
next1++;
if(next1==looks.length){
next1=0;
}
looks[next1].style.left="288px";
animate(looks[now1],{left:-288},function(){
flagg=true;
});
animate(looks[next1],{left:0});
dots[now1].classList.remove("hot");
dots[next1].classList.add("hot");
now1=next1;
}
function moveL1(){
next1--;
if(next1<0){
next1=looks.length-1;
}
looks[next1].style.left="-288px";
animate(looks[now1],{left:288},function(){
flagg=true;});
animate(looks[next1],{left:0});
dots[now1].classList.remove("hot");
dots[next1].classList.add("hot");
now1=next1;
}
for(let i=0;i<dots.length;i++){
dots[i].onclick=function(){
if(i==now1){
return;
}
else if(i>now1){
looks[i].style.left="288px";
animate(looks[now1],{left:-288});
animate(looks[i],{left:0});
dots[now1].classList.remove("hot");
dots[i].classList.add("hot");
}
else if(i<now1){
looks[i].style.left="-288px";
animate(looks[now1],{left:288});
animate(looks[i],{left:0});
dots[now1].classList.remove("hot");
dots[i].classList.add("hot");
}
now1=i;
next1=now1;
}
}
dots1.onclick=function(){
if(!flagg){
return;
}
flagg=false;
if(next1==0){
flagg=true;
return;
}
moveL1();
}
dots23.onclick=function(){
if(!flagg){
return;
}
flagg=false;
if(next1==looks.length-1){
flagg=true;
return;
}
move1();
}
}
smallbanner(looks,dots,dots1,dots23);
let view1=document.querySelectorAll(".lookq");
let doot1=document.querySelectorAll(".dot32");
let buut1=document.querySelector(".but3");
let buut2=document.querySelector(".but4");
smallbanner(view1,doot1,buut1,buut2);
let view2=document.querySelectorAll(".lookw");
let doot2=document.querySelectorAll(".dot33");
let buut3=document.querySelector(".but5");
let buut4=document.querySelector(".but6");
smallbanner(view2,doot2,buut3,buut4);
let view3=document.querySelectorAll(".looke");
let doot3=document.querySelectorAll(".dot34");
let buut5=document.querySelector(".but7");
let buut6=document.querySelector(".but8");
smallbanner(view3,doot3,buut5,buut6);
// *************
// left / right click navigation
let fcenter=document.querySelector(".bigfter");
let lllft=document.querySelector(".lllft");
let rrrit=document.querySelector(".rrrit");
let w=parseInt(getComputedStyle(fcenter,null).width)/3;
function leftRight(fcenter,lllft,rrrit,w,num){
let times=0;
rrrit.onclick=function(){
times++;
if(times==num){
times=num-1;
}
fcenter.style.transform=`translate(${-w*times}px)`
}
lllft.onclick=function(){
times--;
if(times==-1){
times=0;
}
fcenter.style.transform=`translate(${-w*times}px)`
}
}
leftRight(fcenter,lllft,rrrit,w,3);
let bg=document.querySelector(".bbigg");
let Leftt=document.querySelector(".Leftt");
let Rightt=document.querySelector(".Rightt");
let q=parseInt(getComputedStyle(bg,null).width)/2;
leftRight(bg,Leftt,Rightt,q,2);
//******************
let blook=document.querySelectorAll(".hcenter");
let pointer=document.querySelectorAll(".hotqwe a");
blook[0].style.display="block";
pointer[0].classList.add("active");
for(let i=0;i<pointer.length;i++){
pointer[i].onmouseover=function(){
for(let q=0;q<blook.length;q++){
pointer[q].classList.remove("active");
blook[q].style.display="none";
}
pointer[i].classList.add("active");
blook[i].style.display="block";
}
}
let back=document.querySelector(".backtop");
document.onscroll=function(){
if(document.body.scrollTop>=900 || document.documentElement.scrollTop>=900){
back.style.display="block";
}else{
back.style.display="none";
}
}
back.onclick=function(){
animate(document.body,{scrollTop:0});
animate(document.documentElement,{scrollTop:0});
}
// ***************
// tabs
let leftjpg=document.querySelectorAll(".leftjpg a");
let amaz=document.querySelectorAll(".amaz");
for(let i=0;i<leftjpg.length;i++){
leftjpg[i].onmouseover=function () {
for(let j=0;j<leftjpg.length;j++){
amaz[j].style.display="none";
}
amaz[i].style.display="flex";
}
leftjpg[i].onmouseout=function () {
amaz[i].style.display="none";
}
}
// *****
// dropdown tabs
let phone=document.querySelectorAll(".bbgg");
let xiala=document.querySelectorAll(".downla");
console.log(phone);
for(let i=0;i<phone.length;i++){
phone[i].onmouseover=function() {
for (let j = 0; j < xiala.length; j++) {
xiala[j].style.height = 0; // close all dropdowns first, then open only the hovered one below
xiala[j].style.zIndex = 1;
}
xiala[i].style.height = "230px";
xiala[i].style.zIndex = 999;
}
phone[i].onmouseout=function(){
for (let j=0;j<phone.length;j++){
xiala[j].style.height=0;
xiala[j].style.zIndex = 1;
}
}
}
}
// xiala[i].style.height=0;
// xiala[i].style.borderTop="none";
// }
// for(let i=0;i<8;i++){
// phone[i].onmouseover=function(){
// for(let j=0;j<xiala.length;j++){
// xiala[j].style.height=0;
// xiala[j].style.borderTop="none";
// }
// xiala[i].style.height="230px";
// xiala[i].style.borderTop="1px solid #e0e0e0";
// xiala[i].style.boxShadow="0 3px 4px rgba(0,0,0,0.18)";
//
//
// }
// phone[i].onmouseout=function(){
// xiala[i].style.height=0;
// xiala[i].style.borderTop="none";
// }
// }
// }
<file_sep># xiaomibig
Xiaomi official website
|
3a6f66f25af7df0fef6c5edfaad0d3fcc8d41f75
|
[
"JavaScript",
"Markdown"
] | 2 |
JavaScript
|
wrxwrx/xiaomibig
|
1699786859408de46acfe7ef4a596dc0eebb37bc
|
3fb188d60c1c3238a8707def128e128112fba51c
|
refs/heads/master
|
<repo_name>H3nr1q/initKotlin<file_sep>/src/com/teste/br/testes/TesteFuncionarioGerente.kt
package com.teste.br.testes
import com.teste.br.Analista
import com.teste.br.Funcionario
import com.teste.br.Gerente
import com.teste.br.Pessoa
import java.math.BigDecimal
fun main(){
val Maria = Gerente(nome = "<NAME>", cpf = "564615165165", 6000.00, senha = "<PASSWORD>")
ImprimeRelatorioFuncionario.imprime(Maria)
TesteAutenticacao().autentica(Maria)
}
<file_sep>/src/com/teste/br/testes/TesteFuncionarioAnalista.kt
package com.teste.br.testes
import com.teste.br.Analista
import com.teste.br.Funcionario
import com.teste.br.Pessoa
import java.math.BigDecimal
fun main(){
val joao = Analista(nome = "Joao", cpf = "564615165165", 2000.00)
ImprimeRelatorioFuncionario.imprime(joao)
}
<file_sep>/src/com/teste/br/testes/TesteAutenticacao.kt
package com.teste.br.testes
import com.teste.br.Login
class TesteAutenticacao {
fun autentica(login: Login) = println(login.login())
}<file_sep>/src/com/teste/br/testes/TesteBanco.kt
package com.teste.br.testes
import com.teste.br.Banco
fun main(){
val digiOneBank = Banco("Digione", 12)
println(digiOneBank.nome)
println(digiOneBank.numero)
val banco2 = digiOneBank.copy("Banco2")
println(banco2.nome)
println(banco2.numero)
println(banco2.info())
}<file_sep>/src/com/teste/br/Login.kt
package com.teste.br
//every interface is abstract
interface Login {
fun login():Boolean
}
|
6a38d904615b620e8919a369681dea90366fc242
|
[
"Kotlin"
] | 5 |
Kotlin
|
H3nr1q/initKotlin
|
d0de3e082e015d453dd0d7b1b1ec5066957e67c0
|
79c2485d3ee0dfc29c03811e514c4659ca7d1aaa
|
refs/heads/master
|
<repo_name>railyt/3.kodutoo-IV-ruhm<file_sep>/edit.php
<?php
//edit.php
require("functions.php");
require("editFunctions.php");
if(isset($_GET["delete"])){
deletePerson($_GET["id"]);
header("Location: data.php");
exit();
}
//check whether the user is updating the record
if(isset($_POST["update"])){
updatePerson(cleanInput($_POST["id"]), cleanInput($_POST["City"]),
cleanInput($_POST["Cinema"]), cleanInput($_POST["Movie"]), cleanInput($_POST["Genre"]),
cleanInput($_POST["Comment"]), cleanInput($_POST["Rating"]));
header("Location: edit.php?id=".$_POST["id"]."&success=true");
exit();
}
//pass the id along
$p = getSinglePerosonData($_GET["id"]);
//var_dump($p);
?>
<br><br>
<a href="data.php"> tagasi </a>
<h2>Muuda kirjet</h2>
<form action="<?php echo htmlspecialchars($_SERVER["PHP_SELF"]);?>" method="post" >
<input type="hidden" name="id" value="<?=$_GET["id"];?>" >
<label for="City" >Linn</label><br>
<input id="City" name="City" type="text" value="<?php echo $p->City;?>" ><br><br>
<label for="Cinema" >Kino</label><br>
<input id="Cinema" name="Cinema" type="text" value="<?php echo $p->Cinema;?>" ><br><br>
<label for="Movie" >Film</label><br>
<input id="Movie" name="Movie" type="text" value="<?php echo $p->Movie;?>" ><br><br>
<label for="Genre" >Žanr</label><br>
<input id="Genre" name="Genre" type="text" value="<?php echo $p->Genre;?>" ><br><br>
<label for="Comment" >Kommentaar</label><br>
<input id="Comment" name="Comment" type="text" value="<?php echo $p->Comment;?>" ><br><br>
<label for="Rating" >Hinne</label><br>
<input id="Rating" name="Rating" type="text" value="<?php echo $p->Rating;?>" ><br><br>
<input type="submit" name="update" value="Salvesta">
</form>
<a href="?id=<?=$_GET["id"];?>&delete=true">kustuta</a>
<file_sep>/data.php
<?php
//hook into the session
require("functions.php");
//if the user is not logged in, redirect to the login page
if(!isset($_SESSION["userId"])){
header("Location: login.php");
exit();
}
//check whether "logout" is present in the URL
if (isset($_GET["logout"])) {
SESSION_destroy();
header("Location: login.php");
exit();
}
//check that none of the fields is empty
if ( isset($_POST["City"]) &&
isset($_POST["Cinema"]) &&
isset($_POST["Movie"]) &&
isset($_POST["Genre"]) &&
isset($_POST["Comment"]) &&
isset($_POST["Rating"]) &&
!empty($_POST["City"]) &&
!empty($_POST["Cinema"]) &&
!empty($_POST["Movie"]) &&
!empty($_POST["Genre"]) &&
!empty($_POST["Comment"]) &&
!empty($_POST["Rating"])
) {
$Cinema = cleanInput($_POST["Cinema"]);
saveEvent(cleanInput($_POST["City"]), ($_POST["Cinema"]), ($_POST["Movie"]), ($_POST["Genre"]), ($_POST["Comment"]),
($_POST["Rating"]));
header("Location: login.php");
exit();
}
$City= "";
$Cinema = "";
$Movie = "";
$Genre = "" ;
$Comment = "" ;
$Rating= "" ;
$CityError= "*";
if (isset ($_POST["City"])) {
if (empty ($_POST["City"])) {
$CityError = "* Väli on kohustuslik!";
} else {
$City = $_POST["City"];
}
}
$CinemaError= "*";
if (isset ($_POST["Cinema"])) {
if (empty ($_POST["Cinema"])) {
$CinemaError = "* Väli on kohustuslik!";
} else {
$Cinema = $_POST["Cinema"];
}
}
$MovieError= "*";
if (isset ($_POST["Movie"])) {
if (empty ($_POST["Movie"])) {
$MovieError = "* Väli on kohustuslik!";
} else {
$Movie = $_POST["Movie"];
}
}
$GenreError= "*";
if (isset ($_POST["Genre"])) {
if (empty ($_POST["Genre"])) {
$GenreError = "* Väli on kohustuslik!";
} else {
$Genre = $_POST["Genre"];
}
}
$CommentError= "*";
if (isset ($_POST["Comment"])) {
if (empty ($_POST["Comment"])) {
$CommentError = "* Väli on kohustuslik!";
} else {
$Comment = $_POST["Comment"];
}
}
$RatingError= "*";
if (isset ($_POST["Rating"])) {
if (empty ($_POST["RatingError"])) {
$RatingError = "* Väli on kohustuslik!";
} else {
$Rating = $_POST["Rating"];
}
}
if(isset ($_POST["Rating"])) {
if(empty ($_POST["Rating"])){
$RatingError = "* Väli on kohustuslik!";
} else{
$Rating = $_POST["Rating"];
}
}
if(isset ($_POST["Comment"])) {
if(empty ($_POST["Comment"])){
$CommentError = "* Väli on kohustuslik!";
} else{
$Comment = $_POST["Comment"];
}
}
if(isset ($_POST["Genre"])) {
if(empty ($_POST["Genre"])){
$GenreError = "* Väli on kohustuslik!";
} else{
$Genre = $_POST["Genre"];
}
}
if(isset ($_POST["Movie"])) {
if(empty ($_POST["Movie"])){
$MovieError = "* Väli on kohustuslik!";
} else{
$Movie = $_POST["Movie"];
}
}
if(isset ($_POST["Cinema"])) {
if(empty ($_POST["Cinema"])){
$CinemaError = "* Väli on kohustuslik!";
} else{
$Cinema = $_POST["Cinema"];
}
}
if(isset ($_POST["City"])) {
if(empty ($_POST["City"])){
$CityError = "* Väli on kohustuslik!";
} else{
$City = $_POST["City"];
}
}
?>
<h1>Data</h1>
<?php echo $_SESSION["userEmail"];?>
<?=$_SESSION["userEmail"];?>
<p>
Tere tulemast <?=$_SESSION["userEmail"];?>!
<a href="?logout=1">logi välja</a>
</p>
<h2>Salvesta sündmus</h2>
<form method="POST" >
<label>Linn</label><br>
<input name="City" type="text" value="<?=$City;?>"><?php echo $CityError;?>
<br><br>
<label>Kino</label><br>
<input name="Cinema" type="text" value="<?=$Cinema;?>"><?php echo $CinemaError;?>
<br><br>
<label>Film</label><br>
<input name="Movie" type="text" value="<?=$Movie;?>"><?php echo $MovieError;?>
<br><br>
<label>Žanr</label><br>
<input name="Genre" type="text" value="<?=$Genre;?>"><?php echo $GenreError;?>
<br><br>
<label>Kommentaar</label><br>
<input name="Comment" type="text" value="<?=$Comment;?>"><?php echo $CommentError;?>
<br><br>
<label>Hinne</label><br>
<input name="Rating" type="float" value="<?=$Rating;?>"><?php echo $RatingError;?>
<br><br>
<input type="submit" value="Salvesta">
</form>
<?php
if(isset($_GET["q"])){
$q=$_GET["q"];
}else{
//not searching
$q="";
}
$sort = "id";
$order = "ASC";
if (isset($_GET["sort"]) && isset($_GET["order"])) {
$sort = $_GET["sort"];
$order = $_GET["order"];
}
$people=getAllPeople($q, $sort, $order);
?>
<form>
<input type="search" name="q" value="<?=$q;?>">
<input type="submit" value="Otsi">
</form>
<h2>Arhiiv</h2>
<?php
$html="<table>";
$html .="<tr>";
$orderId="ASC";
$arr="↓";
if(isset($_GET["order"])&&
$_GET["order"]=="ASC"&&
$_GET["sort"]=="id"){
$orderId="DESC";
$arr="↑";
}
$html .= "<th>
<a href='?q=".$q."&sort=id&order=".$orderId."'>
ID ".$arr."
</a>
</th>";
$orderCity="ASC";
$arr="↓";
if(isset($_GET["order"])&&
$_GET["order"]=="ASC"&&
$_GET["sort"]=="City"){
$orderCity="DESC";
$arr="↑";
}
$html .= "<th>
<a href='?q=".$q."&sort=City&order=".$orderCity."'>
Linn ".$arr."
</a>
</th>";
$orderCinema="ASC";
$arr="↓";
if(isset($_GET["order"])&&
$_GET["order"]=="ASC"&&
$_GET["sort"]=="Cinema"){
$orderCinema="DESC";
$arr="↑";
}
$html .= "<th>
<a href='?q=".$q."&sort=Cinema&order=".$orderCinema."'>
Kino ".$arr."
</a>
</th>";
$orderMovie="ASC";
$arr="↓";
if(isset($_GET["order"])&&
$_GET["order"]=="ASC"&&
$_GET["sort"]=="Movie"){
$orderMovie="DESC";
$arr="↑";
}
$html .= "<th>
<a href='?q=".$q."&sort=Movie&order=".$orderMovie."'>
Film ".$arr."
</a>
</th>";
$orderGenre="ASC";
$arr="↓";
if(isset($_GET["order"])&&
$_GET["order"]=="ASC"&&
$_GET["sort"]=="Genre"){
$orderGenre="DESC";
$arr="↑";
}
$html .= "<th>
<a href='?q=".$q."&sort=Genre&order=".$orderGenre."'>
Žanr ".$arr."
</a>
</th>";
$orderComment="ASC";
$arr="↓";
if(isset($_GET["order"])&&
$_GET["order"]=="ASC"&&
$_GET["sort"]=="Comment"){
$orderComment="DESC";
$arr="↑";
}
$html .= "<th>
<a href='?q=".$q."&sort=Comment&order=".$orderComment."'>
Kommentaar ".$arr."
</a>
</th>";
$orderRating="ASC";
$arr="↓";
if(isset($_GET["order"])&&
$_GET["order"]=="ASC"&&
$_GET["sort"]=="Rating"){
$orderRating="DESC";
$arr="↑";
}
$html .= "<th>
<a href='?q=".$q."&sort=Rating&order=".$orderRating."'>
Hinne ".$arr."
</a>
</th>";
$html .="</tr>";
//for each member of the array
foreach($people as $p){
$html .="<tr>";
$html .="<td>".$p->id."</td>";
$html .="<td>".$p->City."</td>";
$html .="<td>".$p->Cinema."</td>";
$html .="<td>".$p->Movie."</td>";
$html .="<td>".$p->Genre."</td>";
$html .="<td>".$p->Comment."</td>";
$html .="<td>".$p->Rating."</td>";
$html .= "<td><a href='edit.php?id=".$p->id."'>edit.php</a></td>";
$html .="</tr>";
}
$html .="</table>";
echo $html;
?><file_sep>/editFunctions.php
<?php
require_once("../../config.php");
function getSinglePerosonData($edit_id){
$database = "if16_raily_4";
//echo "id on ".$edit_id;
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $database);
$stmt = $mysqli->prepare("SELECT City, Cinema, Movie, Genre, Comment, Rating FROM kino WHERE id=? AND deleted IS NULL");
$stmt->bind_param("i", $edit_id);
$stmt->bind_result($City, $Cinema, $Movie, $Genre, $Comment, $Rating);
$stmt->execute();
//create an object to hold the row
$p = new Stdclass();
//we got one row of data
if($stmt->fetch()){
// the bind_result variables can only be used from this point on
$p->City = $City;
$p->Cinema = $Cinema;
$p->Movie = $Movie;
$p->Genre = $Genre;
$p->Comment = $Comment;
$p->Rating = $Rating;
}else{
// could not fetch a row of data
// no such id exists
// the row may have been deleted
header("Location: data.php");
exit();
}
$stmt->close();
$mysqli->close();
return $p;
}
function updatePerson($id, $City, $Cinema, $Movie, $Genre, $Comment, $Rating){
$database = "if16_raily_4";
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $database);
$stmt = $mysqli->prepare("UPDATE kino SET City=?, Cinema=?, Movie=?, Genre=?,
Comment=?, Rating=? WHERE id=? AND deleted IS NULL");
$stmt->bind_param("sssssii",$City, $Cinema, $Movie, $Genre, $Comment, $Rating, $id);
// check whether the save succeeded
if($stmt->execute()){
// it succeeded
echo "salvestus õnnestus!";
}
$stmt->close();
$mysqli->close();
}
function deletePerson($id){
$database = "if16_raily_4";
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $database);
$stmt = $mysqli->prepare("
UPDATE kino SET deleted=NOW()
WHERE id=? AND deleted IS NULL");
$stmt->bind_param("i",$id);
// check whether the save succeeded
if($stmt->execute()){
// it succeeded
echo "salvestus õnnestus!";
}
$stmt->close();
$mysqli->close();
}
?><file_sep>/functions.php
<?php
require("../../config.php");
//this file must be required by every page where we want to use the session; the $_SESSION variable can now be used
session_start();
$database = "if16_raily_4";
function signup($email, $password) {
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $GLOBALS["database"]);
$stmt = $mysqli->prepare("INSERT INTO user_sample (email, password) VALUE (?,?)");
echo $mysqli->error;
$stmt->bind_param("ss",$email, $password);
if ($stmt->execute() ) {
echo "õnnestus";
} else { "ERROR".$stmt->error;
}
}
function login($email, $password) {
$notice="";
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $GLOBALS["database"]);
$stmt = $mysqli->prepare("SELECT id, email, password, created FROM user_sample WHERE email = ?" );
echo $mysqli->error;
//replace the question mark placeholder
$stmt->bind_param("s",$email);
//column values for the fetched row
$stmt->bind_result($id, $emailFromDb, $passwordFromDb, $created);
$stmt->execute();
//only needed for SELECT queries
if($stmt->fetch()){
//the user exists, row fetched
$hash=hash("sha512", $password);
if($hash==$passwordFromDb) {
echo "Kasutaja $id logis sisse";
$_SESSION["userId"] = $id;
$_SESSION["userEmail"] = $emailFromDb;
header("Location: data.php");
exit();
} else {
$notice = "Parool vale";
}
} else {
//no rows were returned
$notice = "Sellise emailiga $email kasutajat ei ole olemas";
}
$stmt->close();
$mysqli->close();
return $notice;
}
function saveEvent($City, $Cinema, $Movie, $Genre, $Comment, $Rating) {
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $GLOBALS["database"]);
$stmt = $mysqli->prepare("INSERT INTO kino (City, Cinema, Movie, Genre, Comment, Rating) VALUE (?, ?, ?, ?, ?, ?)");
echo $mysqli->error;
$stmt->bind_param("sssssi", $City, $Cinema, $Movie, $Genre, $Comment, $Rating);
if ( $stmt->execute() ) {
echo "õnnestus <br>";
} else {
echo "ERROR ".$stmt->error;
}
}
function getAllPeople($q, $sort, $order){
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $GLOBALS["database"]);
$allowedSort = ["id", "City", "Cinema", "Movie", "Genre", "Comment", "Rating"];
// sort is not one of the allowed columns
if(!in_array($sort, $allowedSort)){
$sort = "id";
}
$orderBy = "ASC";
if($order == "DESC") {
$orderBy = "DESC";
}
//echo "Sorteerin: ".$sort." ".$orderBy." ";
if($q!=""){
//searching
echo"otsin: ".$q;
$stmt = $mysqli->prepare("
SELECT id, City, Cinema, Movie, Genre, Comment, Rating
FROM kino
WHERE deleted IS NULL
AND ( City LIKE ? OR Cinema LIKE ? OR Movie LIKE ? OR Genre LIKE ? OR Comment LIKE ? OR Rating LIKE ?)
ORDER BY $sort $orderBy
");
$searchWord="%".$q."%";
$stmt->bind_param("sssssi", $searchWord, $searchWord, $searchWord, $searchWord, $searchWord, $searchWord);
}else {
//not searching
$stmt = $mysqli->prepare("
SELECT id, City, Cinema, Movie, Genre, Comment, Rating
FROM kino
WHERE deleted IS NULL
ORDER BY $sort $orderBy
");
}
$stmt->bind_result($id, $City, $Cinema, $Movie, $Genre, $Comment, $Rating);
$stmt->execute();
$results=array();
//the loop body runs once for every row returned by the SQL statement
while($stmt->fetch()) {
$human=new StdClass();
$human->id=$id;
$human->City=$City;
$human->Cinema=$Cinema;
$human->Movie=$Movie;
$human->Genre=$Genre;
$human->Comment=$Comment;
$human->Rating=$Rating;
array_push($results, $human);
}
return $results;
}
/*
function getAllPeople(){
$mysqli = new mysqli($GLOBALS["serverHost"], $GLOBALS["serverUsername"], $GLOBALS["serverPassword"], $GLOBALS["database"]);
$stmt=$mysqli->prepare("SELECT id, City, Cinema, Movie, Genre, Comment, Rating FROM kino WHERE deleted IS NULL");
$stmt->bind_result($id, $City, $Cinema, $Movie, $Genre, $Comment, $Rating);
$stmt->execute();
$results=array();
//tsüklissisu toiimib seni kaua, mitu rida SQL lausega tuleb
while($stmt->fetch()) {
$human=new StdClass();
$human->id=$id;
$human->City=$City;
$human->Cinema=$Cinema;
$human->Movie=$Movie;
$human->Genre=$Genre;
$human->Comment=$Comment;
$human->Rating=$Rating;
array_push($results, $human);
}
return $results;
}
*/
function cleanInput($input) {
$input=trim($input);
$input=stripslashes($input);
$input=htmlspecialchars($input);
return $input;
}
/*function sum($x, $y) {
return $x + $y;
}
echo sum(12312312,12312355553);
echo "<br>";
function hello($n, $p) {
return "<NAME>mast ".$n." ". $p;
}
echo hello("Raily", "T");
echo "<br>";
*/
?>
|
53cb7a587419eab8266b725dc1aa5e8ad55d8583
|
[
"PHP"
] | 4 |
PHP
|
railyt/3.kodutoo-IV-ruhm
|
b93404c94eaf1f17b728e066695c962868cae6b6
|
652c34dd2b283934e5028355e25fca6037ee3c86
|
refs/heads/master
|
<repo_name>cowkey-gem/ars<file_sep>/doc/tables_run.sh
#!/bin/sh
cat ./tables.txt | while read line
do
php ../artisan make:migration create_${line}_table
done
<file_sep>/database/migrations/2018_10_10_084810_create_appointments_table.php
<?php
use Illuminate\Support\Facades\Schema;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;
class CreateAppointmentsTable extends Migration
{
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('appointments', function (Blueprint $table) {
$table->increments('id');
$table->string('name', 255)->comment('名称');
$table->date('date')->comment('予約日');
$table->time('start_time')->comment('開始時間');
$table->time('end_time')->comment('終了時間');
$table->char('status', 1)->comment('状態');
$table->timestamps();
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::dropIfExists('appointments');
}
}
|
d7cca7ac22f66201be0bfd095ada95043b44638d
|
[
"PHP",
"Shell"
] | 2 |
Shell
|
cowkey-gem/ars
|
45b85d328d0691eea285346e013748f8758f07cf
|
8004961e74eef93c9c7c0d072604afaf45871057
|
refs/heads/master
|
<repo_name>MajaMarkovic1/health-institution-typescript<file_sep>/health-institution.ts
class Person {
firstName: string;
lastName: string;
date: Date;
constructor(firstName: string, lastName: string){
this.firstName = firstName;
this.lastName = lastName;
this.date = new Date();
}
}
class Doctor extends Person{
specialty: string;
patients: Patient[];
constructor(firstName: string, lastName: string, specialty: string){
super(firstName, lastName);
this.specialty = specialty;
this.patients = [];
}
setDoctor(){
console.log(`${this.date} Doctor ${this.firstName} ${this.lastName} is created.`)
}
addPatient(patient: Patient){
this.patients.push(patient);
console.log(`${this.date} Patient ${patient.firstName} ${patient.lastName} is added to doctor ${this.firstName}'s list.`)
}
setAppointment(patient: Patient, examination: MedicalExamination){
if (this.patients.indexOf(patient) != -1){
patient.appointments.push(examination);
console.log(`${this.date} ${examination.type} appointment for the patient ${patient.firstName} ${patient.lastName} is set`)
} else {
console.log(`${this.date} Patient ${patient.firstName} ${patient.lastName} is not one of your patients.`)
}
}
}
class Patient extends Person {
private birthId: number;
private registerNumber: number;
doctor: Doctor;
appointments: MedicalExamination[];
constructor(firstName: string, lastName: string, birthId: number, registerNumber: number){
super(firstName, lastName);
this.birthId = birthId;
this.registerNumber = registerNumber;
this.doctor = null;
this.appointments = [];
}
setPatient(){
console.log(`${this.date} Patient ${this.firstName} ${this.lastName} is created.`)
}
chooseDoctor(doctor: Doctor){
if (!this.doctor){
this.doctor = doctor;
doctor.patients.push(this);
console.log(`${this.date} Patient ${this.firstName} ${this.lastName} chose doctor ${doctor.firstName} ${doctor.lastName}`);
} else {
console.log(`${this.date} Patient ${this.firstName} has already chosen a doctor.`)
}
}
getExaminaton(examination: MedicalExamination, data: object){
if (this.appointments.indexOf(examination) != -1){
console.log(`${this.date} Patient ${this.firstName}:`, examination.examine(data));
this.appointments.splice(this.appointments.indexOf(examination), 1)
} else {
console.log(`${this.date} ${this.firstName} doesn't have an appointment for ${examination.type} examination.`)
}
}
}
abstract class MedicalExamination {
type: string;
date: Date;
constructor(type: string){
this.type = type;
this.date = new Date();
}
abstract examine(data: object): void;
}
class BloodPressureExamination extends MedicalExamination{
topValue: number;
bottomValue: number;
pulse: number;
constructor(type: string){
super(type);
}
examine(data: object){
this.topValue = data['topValue'];
this.bottomValue = data['bottomValue'];
this.pulse = data['pulse'];
return `Blood pressure examination is being maintained.
Top value: ${this.topValue}
Bottom value: ${this.bottomValue}
Pulse: ${this.pulse}`;
}
}
class BloodSugarLevelExamination extends MedicalExamination{
value: number;
lastMealTime: string;
constructor(type: string){
super(type);
}
examine(data: object){
this.value = data['value'];
this.lastMealTime = data['lastMealTime'];
return `Blood sugar level examination is being maintained.
Value: ${this.value}
Time of last meal: ${this.lastMealTime}`;
}
}
class BloodCholesteroleLevel extends MedicalExamination {
value: number;
lastMealTime: string;
constructor(type: string){
super(type);
}
examine(data: object){
this.value = data['value'];
this.lastMealTime = data['lastMealTime'];
return `Blood cholesterole level examination is being maintained.
Value: ${this.value}
Time of last meal: ${this.lastMealTime}`;
}
}
let doctor1 = new Doctor('Milan', 'Milanovic', 'cardiologist');
let doctor2 = new Doctor('Petar', 'Petrovic', 'neurospecialist');
let patient1 = new Patient('Dragan', 'Dragic', 123456789, 123);
let patient2 = new Patient('Marko', 'Markovic', 123456789, 123);
let bloodSugarExamination = new BloodSugarLevelExamination('Blood sugar level');
let bloodPressureExamination = new BloodPressureExamination('Blood pressure');
let bloodCholesteroleExamination = new BloodCholesteroleLevel('Blood cholesterole');
doctor1.setDoctor();
patient1.setPatient();
patient1.chooseDoctor(doctor1);
patient1.chooseDoctor(doctor2);
patient2.chooseDoctor(doctor1);
//console.log(doctor1.patients)
doctor1.setAppointment(patient1, bloodSugarExamination);
//doctor1.setAppointment(patient2, bloodSugarExamination);
doctor1.setAppointment(patient1, bloodPressureExamination);
//console.log(patient1.appointments)
patient1.getExaminaton(bloodSugarExamination, {
value: 55, lastMealTime: '11h'
});
patient2.getExaminaton(bloodSugarExamination, {
value: 59, lastMealTime: '12h'
});
patient1.getExaminaton(bloodPressureExamination, {
topValue: 100, bottomValue: 50, pulse: 120
});
patient1.getExaminaton(bloodSugarExamination, {
value: 60, lastMealTime: '12h'
});
<file_sep>/README.md
# health-institution-typescript<file_sep>/health-institution.js
class Person {
constructor(firstName, lastName) {
this.firstName = firstName;
this.lastName = lastName;
this.date = new Date();
}
}
class Doctor extends Person {
constructor(firstName, lastName, specialty) {
super(firstName, lastName);
this.specialty = specialty;
this.patients = [];
}
setDoctor() {
console.log(`${this.date} Doctor ${this.firstName} ${this.lastName} is created.`);
}
addPatient(patient) {
this.patients.push(patient);
console.log(`${this.date} Patient ${patient.firstName} ${patient.lastName} is added to doctor ${this.firstName}'s list.`);
}
setAppointment(patient, examination) {
if (this.patients.indexOf(patient) != -1) {
patient.appointments.push(examination);
console.log(`${this.date} ${examination.type} appointment for the patient ${patient.firstName} ${patient.lastName} is set`);
}
else {
console.log(`${this.date} Patient ${patient.firstName} ${patient.lastName} is not one of your patients.`);
}
}
}
class Patient extends Person {
constructor(firstName, lastName, birthId, registerNumber) {
super(firstName, lastName);
this.birthId = birthId;
this.registerNumber = registerNumber;
this.doctor = null;
this.appointments = [];
}
setPatient() {
console.log(`${this.date} Patient ${this.firstName} ${this.lastName} is created.`);
}
chooseDoctor(doctor) {
if (!this.doctor) {
this.doctor = doctor;
doctor.patients.push(this);
console.log(`${this.date} Patient ${this.firstName} ${this.lastName} chose doctor ${doctor.firstName} ${doctor.lastName}`);
}
else {
console.log(`${this.date} Patient ${this.firstName} has already chosen a doctor.`);
}
}
getExaminaton(examination, data) {
if (this.appointments.indexOf(examination) != -1) {
console.log(`${this.date} Patient ${this.firstName}:`, examination.examine(data));
this.appointments.splice(this.appointments.indexOf(examination), 1);
}
else {
console.log(`${this.date} ${this.firstName} doesn't have an appointment for ${examination.type} examination.`);
}
}
}
class MedicalExamination {
constructor(type) {
this.type = type;
this.date = new Date();
}
}
class BloodPressureExamination extends MedicalExamination {
constructor(type) {
super(type);
}
examine(data) {
this.topValue = data['topValue'];
this.bottomValue = data['bottomValue'];
this.pulse = data['pulse'];
return `Blood pressure examination is being maintained.
Top value: ${this.topValue}
Bottom value: ${this.bottomValue}
Pulse: ${this.pulse}`;
}
}
class BloodSugarLevelExamination extends MedicalExamination {
constructor(type) {
super(type);
}
examine(data) {
this.value = data['value'];
this.lastMealTime = data['lastMealTime'];
return `Blood sugar level examination is being maintained.
Value: ${this.value}
Time of last meal: ${this.lastMealTime}`;
}
}
class BloodCholesteroleLevel extends MedicalExamination {
constructor(type) {
super(type);
}
examine(data) {
this.value = data['value'];
this.lastMealTime = data['lastMealTime'];
return `Blood cholesterole level examination is being maintained.
Value: ${this.value}
Time of last meal: ${this.lastMealTime}`;
}
}
let doctor1 = new Doctor('Milan', 'Milanovic', 'cardiologist');
let doctor2 = new Doctor('Petar', 'Petrovic', 'neurospecialist');
let patient1 = new Patient('Dragan', 'Dragic', 123456789, 123);
let patient2 = new Patient('Marko', 'Markovic', 123456789, 123);
let bloodSugarExamination = new BloodSugarLevelExamination('Blood sugar level');
let bloodPressureExamination = new BloodPressureExamination('Blood pressure');
let bloodCholesteroleExamination = new BloodCholesteroleLevel('Blood cholesterole');
doctor1.setDoctor();
patient1.setPatient();
patient1.chooseDoctor(doctor1);
patient1.chooseDoctor(doctor2);
patient2.chooseDoctor(doctor1);
//console.log(doctor1.patients)
doctor1.setAppointment(patient1, bloodSugarExamination);
//doctor1.setAppointment(patient2, bloodSugarExamination);
doctor1.setAppointment(patient1, bloodPressureExamination);
//console.log(patient1.appointments)
patient1.getExaminaton(bloodSugarExamination, {
value: 55, lastMealTime: '11h'
});
patient2.getExaminaton(bloodSugarExamination, {
value: 59, lastMealTime: '12h'
});
patient1.getExaminaton(bloodPressureExamination, {
topValue: 100, bottomValue: 50, pulse: 120
});
patient1.getExaminaton(bloodSugarExamination, {
value: 60, lastMealTime: '12h'
});
|
a707ca79dd1a8817d7fa8d3116ff7fbe8d694ced
|
[
"Markdown",
"TypeScript",
"JavaScript"
] | 3 |
TypeScript
|
MajaMarkovic1/health-institution-typescript
|
c12bb3799d734bcd4b544a587aa3ce126fe8d036
|
4813fa01ae41c8923e49ce794f083496380942b4
|
refs/heads/master
|
<repo_name>KiarashMajdiPortfolio/Othello<file_sep>/MoveMaker.java
public class MoveMaker {
/*
* This method is used to apply a previously verified and converted move to the previously defined game board
* @position This parameter indicates the user- or AI-specified position. (integer [2]; 0 <= int[0] < 8 && 0 <= int[1] < 8)
* @color This parameter refers to the player who played the turn. (char; 'X' || 'O')
* @return: void
* */
public static void makeMove(char[][] board, int[] position, char color){
board[position[0]][position[1]] = color; //Adds a piece to the board in the position.
for (int[] h: MoveChecker.directions){ //Checks possible flips in every direction.
boolean flag = false;
boolean path = false;
int a =position[0] + h[0];
int b = position[1] + h[1];
try{ // catches the exception thrown when position[0] + h[0] or position[1] + h[1] goes outside the 0..7 bounds; in that case we don't need to check, because the piece path is closed from that direction.
while (board[a][b] == MoveChecker.otherColor(color)){ //keeps following the opponent piece sequence until gets to some empty space, wall or self piece.
flag = true;
a += h[0];
b += h[1];
if (a >= 8 || a < 0){
flag = false;
break;
}
if (b >= 8 || b < 0){
flag = false;
break;
}
}
}
catch (Exception e){
flag = false;
}
if (flag){
if (board[a][b] == color){//the cause of breaking of the while should be only self piece in order to that path be good.
path = true;
}
}
if (path){//if the path is good, everything in that path gets flipped.
a =position[0] + h[0];
b = position[1] + h[1];
while (board[a][b] == MoveChecker.otherColor(color)){
board[a][b] = color;
a += h[0];
b += h[1];
if (a >= 8 || a < 0){
break;
}
if (b >= 8 || b < 0){
break;
}
}
}
}
}
}
|
42e009c1e539fb22b345b5e985410949a6f08e77
|
[
"Java"
] | 1 |
Java
|
KiarashMajdiPortfolio/Othello
|
c23e4e6d5b90afa668c24b9bdf902ad36ad19ddf
|
c6ead543ddd831fae2b3c3b5bb01da7e6b57ecf3
|
refs/heads/master
|
<repo_name>liurufeng/fb-login-php-lower-version<file_sep>/sample-use/account.php
<?php
require_once $_SERVER['DOCUMENT_ROOT'] . '/Facebook/autoload.php';
$fb = new Facebook\Facebook(array(
'app_id' => 'YOUR-FB-ID',
'app_secret' => 'YOUR-FB-secret',
'default_graph_version' => 'v2.5'
));
$helper = $fb->getJavaScriptHelper();
try {
$accessToken = $helper->getAccessToken();
} catch(Facebook\Exceptions\FacebookResponseException $e) {
// When Graph returns an error
echo 'Graph returned an error: ' . $e->getMessage();
header('Location: /login.php');
} catch(Facebook\Exceptions\FacebookSDKException $e) {
// When validation fails or other local issues
echo 'Facebook SDK returned an error: ' . $e->getMessage();
header('Location: /login.php');
}
if (isset($accessToken)) {
$fb->setDefaultAccessToken($accessToken);
try {
$requestProfile = $fb->get("/me?fields=name,first_name,last_name,email");
$profile = $requestProfile->getGraphNode()->asArray();
} catch(Facebook\Exceptions\FacebookResponseException $e) {
// When Graph returns an error
echo 'Graph returned an error: ' . $e->getMessage();
header('Location: /login.php');
} catch(Facebook\Exceptions\FacebookSDKException $e) {
// When validation fails or other local issues
echo 'Facebook SDK returned an error: ' . $e->getMessage();
header('Location: /login.php');
}
//now let's check if the user is existing
if($profile['email']) {
$db = Database::getInstance();
$userInfo = PasswordManagement::login_and_get_info($profile['email'], '', $db, false, true);
if ($userInfo !== false) {
$cust_acct = new CustomerAccount($profile['email']);
$login = $cust_acct->login();
header('Location: /account.php');
exit();
} else {
// didn't find this email (user account) in user database, create the new FB account
$sql = sprintf("
INSERT INTO users (login, password, date_regd, verified_account, fname, lname, fb_user)
VALUES ('%s', '', CURDATE(), 1, '%s', '%s', 1)",
);
if(!$db->query($sql)){
header('Location: /login.php');
}
else{
// load the new customer's account info
$cust_acct = new CustomerAccount($profile['email']);
// log the customer into the site
$cust_acct->login();
header('Location: /account.php');
exit();
}
}
} else {
header('Location: /login.php');
exit();
}
exit;
} else {
header('Location: /login.php');
exit();
}
|
5a6833cf38e3a2631976181c626d93c29bac9eed
|
[
"PHP"
] | 1 |
PHP
|
liurufeng/fb-login-php-lower-version
|
ae2e58dadc622b46880be85553aabea8e0b87c80
|
27b3c18d18fe0a3852274cbee93c881461cb69ed
|
refs/heads/temitope-unstack
|
<file_sep>import React from "react"
import Layout from "../components/layout"
const NotFoundPage = () => (
<Layout>
<div
className="py-20 px-6"
style={{ background: "linear-gradient(90deg, green 0%, darkgreen 100%)" }}
>
<h1 className="text-4xl font-bold mb-2 text-white">404</h1>
<p className="text-2xl mb-8 text-gray-200">
*waves* This is not the page you are looking for ...
</p>
</div>
</Layout>
)
export default NotFoundPage
|
10e8799a6e5badfe862755294ee6536772ce1710
|
[
"JavaScript"
] | 1 |
JavaScript
|
temitopeakinsoto/unstack
|
676256ed07de1511dd870f0955911330c8a5334a
|
35ca59784bb75cee1c7f1dfe4cf5bd06746a1fae
|
refs/heads/main
|
<file_sep># Google-meet-self-attendance-bot

An attendance bot which joins google meet automatically according to schedule and marks present in the google meet.
I have been missing my morning online lectures lately because of this irregular sleep cycle and procrastination.
So I came up with this innovative idea of developing a python program which will automatically open the meet link for me and join the lecture using my account credentials.
Moreover, I added a versatile feature that makes the project fully self-acting in handling the Google Meet webpage.
The program answers "Present" through the laptop's microphone whenever the teacher calls your name. Pygame, Selenium and ChromeDriver are used in this project.
To run the program, just clone the project, change/add the path to the ChromeDriver, add your email id, password and meet link, edit the words tuple with different pronunciations of your name, add the path of the openlink.py file to the batch file, and run the batch file.
You can also start your meetings at a specific time: associate the batch file with Windows Task Scheduler and provide the time of your meeting.
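If you prefer the command line over the Task Scheduler UI, a hypothetical sketch (the task name and the .bat path below are placeholders, not files shipped with this repo) run from an elevated Command Prompt would look roughly like this:
```
:: Register a daily task that launches the bot's batch file at 09:00
schtasks /create /tn "GoogleMeetBot" /tr "C:\path\to\your_bot.bat" /sc daily /st 09:00
```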
Here is the tutorial link for this project :https://wadisarvesh33.medium.com/google-meet-attendance-bot-sarvesh-wadi-d8dd02838ee3
Refer this tutorial if you have any doubts/errors with the project.
Make sure to install VB-Audio Virtual Cable, it helps playing the recorded sound on the meeting.
https://vb-audio.com/Cable/
Please like and share my post on LinkedIn
LinkedIn link :
https://www.linkedin.com/posts/sarvesh-wadi-546531186_programming-python-coding-activity-6721649520037371904-FhGh
<file_sep>import webbrowser
import time
import math
# importing webdriver from selenium
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import NoSuchElementException
from selenium.common.exceptions import StaleElementReferenceException
from pygame._sdl2 import get_num_audio_devices, get_audio_device_name #Get playback device names
from pygame import mixer #Playing sound
url = 'https://accounts.google.com/signin/v2/identifier?ltmpl=meet&continue=https%3A%2F%2Fmeet.google.com%3Fhs%3D193&&flowName=GlifWebSignIn&flowEntry=ServiceLogin'
# Here Chrome will be used
#driver = webdriver.Chrome("D://openlink_meet/chromedriver.exe")
chrome_options = Options()
#chrome_options.add_argument('use-fake-device-for-media-stream')
chrome_options.add_argument('use-fake-ui-for-media-stream')
chrome_options.add_argument('--disable-notifications')
driver = webdriver.Chrome("Enter your path to chrome driver",chrome_options=chrome_options)
# Opening the website
driver.get(url)
# geeting the button by class name
SignIn = driver.find_element_by_id("identifierId")
# clicking on the button
SignIn.send_keys("<PASSWORD>")
SignIn.send_keys(Keys.ENTER)
driver.implicitly_wait(10)
EnterPass =driver.find_element_by_xpath("//*[@id='password']/div[1]/div/div[1]/input")
EnterPass.send_keys("<PASSWORD>")
EnterPass.send_keys(Keys.ENTER)
EnterCode =driver.find_element_by_xpath("//*[@id='i3']")
EnterCode.send_keys("Enter the meet code here")
JoinLink =driver.find_element_by_xpath("/html/body/c-wiz/div/div[2]/div/div[1]/div[3]/div/div[2]/div[2]/button/span")
JoinLink.click()
#Dismiss =driver.find_element_by_xpath("//*[@id='yDmH0d']/div[3]/div/div[2]/div[3]/div")
#Dismiss.click()
ignored_exceptions=(NoSuchElementException,StaleElementReferenceException)
#driver.implicitly_wait(10)
Mute = driver.find_element_by_xpath("//*[@id='yDmH0d']/c-wiz/div/div/div[9]/div[3]/div/div/div[4]/div/div/div[1]/div[1]/div/div[4]/div[1]/div/div/div")
Mute.click()
CamOff = driver.find_element_by_xpath("/html/body/div/c-wiz/div/div/div[9]/div[3]/div/div/div[4]/div/div/div[1]/div[1]/div/div[4]/div[2]/div/div")
CamOff.click()
#driver.implicitly_wait(10)
JoinNow =WebDriverWait(driver,10,ignored_exceptions=ignored_exceptions).until(EC.element_to_be_clickable((By.XPATH, "/html/body/div[1]/c-wiz/div/div/div[9]/div[3]/div/div/div[4]/div/div/div[2]/div/div[2]/div/div[1]/div[1]/span/span")))
JoinNow.click()
TurnOnCaptions = driver.find_element_by_xpath("/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[10]/div[2]/div/div[3]/div/span/button/span[2]")
TurnOnCaptions.click()
#ShowEveryone = driver.find_element_by_xpath("//*[@id='ow3']/div[1]/div/div[5]/div[3]/div[6]/div[3]/div/div[2]/div[1]/span")
#ShowEveryone.click()
students=WebDriverWait(driver,20,ignored_exceptions=ignored_exceptions).until(EC.element_to_be_clickable((By.XPATH, "/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[10]/div[3]/div[2]/div/div/div[2]/div/div")))
text = students.text
Total_numStudents = int(text)
print(Total_numStudents)
Caption_tray = WebDriverWait(driver,100,ignored_exceptions=ignored_exceptions).until(EC.presence_of_element_located((By.XPATH, "/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[7]/div")))
Captions = WebDriverWait(driver,100,ignored_exceptions=ignored_exceptions).until(EC.presence_of_element_located((By.XPATH, "/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[7]/div/div[2]/div/span/span")))
count = 0
staleElement = True
while staleElement :
try :
Caption_tray = driver.find_element_by_xpath("/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[7]/div")
Captions = driver.find_element_by_xpath("/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[7]/div/div[2]/div")
if Captions.is_displayed() :
Caption_text= Captions.text
Caption_text = Caption_text.lower()
print(Caption_text)
except(StaleElementReferenceException):
staleElement = True
except(NoSuchElementException) :
staleElement = True
changed_numstudents = int(students.text)
print(changed_numstudents)
if changed_numstudents > Total_numStudents :
Total_numStudents = changed_numstudents
elif changed_numstudents < Total_numStudents :
if changed_numstudents <= math.floor(0.2*Total_numStudents):
EndCall=driver.find_element_by_xpath("/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[10]/div[2]/div/div[7]/span/button/i")
EndCall.click()
if count ==0 :
words = ("roll number 22", "Jondoe" , "Johndoe" , "<NAME>" ,"<NAME>" , "Jon" )
if any(name in Caption_text for name in words):
UnMute = driver.find_element_by_xpath("/html/body/div[1]/c-wiz/div[1]/div/div[9]/div[3]/div[10]/div[2]/div/div[1]/div/div/span/button/div[2]")
UnMute.click()
mixer.init(devicename='CABLE Input (VB-Audio Virtual Cable)')
mixer.music.load("Enter path of your voice recording here")
mixer.music.play()
time.sleep(4)
mixer.music.stop()
UnMute.click()
count+=1
|
d7abf68e24508ab4340ac1bee2aa28b7ce80cbde
|
[
"Markdown",
"Python"
] | 2 |
Markdown
|
Sameer-Karoshi/Google-meet-self-attendance-bot
|
fc4da64b53ed2f21bf3c4aa3dc07752c635e6c4f
|
cc12c3a59bc7f204395443c66bde00cd4ad3eedb
|
refs/heads/master
|
<repo_name>hdwatts/project-euler-even-fibonacci-e-000<file_sep>/lib/even_fibonacci.rb
# Implement your procedural solution here!
def even_fibonacci_sum(limit)
arr = fib([1, 2], limit)
sum = 0
arr.each do |num|
if num % 2 == 0
sum += num
end
end
sum
end
def fib(array, limit)
if array[-1] + array[-2] > limit
array
else
sum = array[-1] + array[-2]
array.push(sum)
fib(array, limit)
end
end
<file_sep>/lib/oo_even_fibonacci.rb
# Implement your object-oriented solution here!
class EvenFibonacci
def initialize(limit)
@limit = limit
end
def sum
arr = fib([1, 2])
sum = 0
arr.each do |num|
if num % 2 == 0
sum += num
end
end
sum
end
def fib(array)
if array[-1] + array[-2] > @limit
array
else
sum = array[-1] + array[-2]
array.push(sum)
fib(array)
end
end
end
|
d102b2203f99a4f378ee59580abca22159b51fab
|
[
"Ruby"
] | 2 |
Ruby
|
hdwatts/project-euler-even-fibonacci-e-000
|
f5d1f026c9ae60292010d40d5c0a53735169573f
|
f4f468c1227611c87ec5edb69c08d5cd35eede2a
|
refs/heads/main
|
<file_sep>using NUnit.Framework;
namespace Oware.Tests
{
public class MockScoreHouse : IScoreHouse{
public int GetCount() {
throw new System.NotImplementedException();
}
public void AddSeed(Seed seed) {
throw new System.NotImplementedException();
}
public void Reset() {
throw new System.NotImplementedException();
}
}
public class Tests
{
[SetUp]
public void Setup()
{
}
[Test]
public void Test()
{
//ARRANGE
House h1 = new House(1,1);
// ACT
h1.ResetHouse();
// ASSERT
Assert.AreEqual(4,h1.GetCount(), "The number of seeds should be 4");
}
[Test]
public void Test2()
{
//ARRANGE
// ACT
Player p1 = new Player("John");
// ASSERT
//Assert.Pass(p1.AddSeedToScoreHouse(new Seed()));
//MockScoreHouse.Verify();
Assert.AreEqual(0, p1.GetScore(),"the test returns a void");
p1.AddSeedToScoreHouse(new Seed());
//p1.VerifyAll();
//IScoreHouse.Verify();
Assert.AreEqual(1, p1.GetScore(),"the test should add one score");
}
}
}
|
af29f69364d971408d571b819ddc4f999e697d3d
|
[
"C#"
] | 1 |
C#
|
siya9/travi
|
75911a6c2e5072212dcb43a36ac488ae1e8b985b
|
88d432e5998c51ec1973e4c8cd452eb0af547bb1
|
refs/heads/master
|
<file_sep># MyButton_iOS
An example of building a custom button by subclassing UIView.
<file_sep>//
// MyButton.swift
// MyButtonIBDesignable
//
// Created by yagom
// Copyright © 2017년 yagom. All rights reserved.
//
import UIKit
/**
The comments written as English sentences are the property and method comments that can be found in the UIButton documentation (on UIButton, its superclasses, UIControlState, and so on).
This implementation was written by following those features as a reference.
**/
class MyButton: UIView {
typealias ButtonAction = ((MyButton) -> Void)
//MARK: - Initializers
override init(frame: CGRect) {
super.init(frame: frame)
self.preservedAlpha = self.alpha
}
required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
}
//MARK: - Properties
fileprivate var storedTitleLabel: UILabel?
fileprivate var storedBackgroundImageView: UIImageView?
fileprivate var information: StateInformation = StateInformation()
fileprivate var targetActions: Set<Action> = []
fileprivate var closureActions: [String: ButtonAction] = [:]
fileprivate var preservedAlpha: CGFloat = 1.0
// default is YES. if NO, ignores touch events and subclasses may draw differently
var isEnabled: Bool {
get {
return self.state.contains(.disabled) == false
}
set {
switch newValue {
case true:
self.state.remove(.disabled)
case false:
self.state.update(with: .disabled)
}
}
}
// default is NO may be used by some subclasses or by application
var isSelected: Bool {
get {
return self.state.contains(.selected)
}
set {
switch newValue {
case true:
self.state.remove(.normal)
self.state.update(with: .selected)
case false:
self.state.remove(.selected)
self.state.update(with: .normal)
}
}
}
// default is NO. this gets set/cleared automatically when touch enters/exits during tracking and cleared on up
var isHighlighted: Bool {
get {
return self.state.contains(.highlighted)
}
}
// could be more than one state (e.g. disabled|selected). synthesized from other flags.
var state: State = .normal {
didSet {
updateView()
}
}
fileprivate var currentInformation: StateInformation.Information? {
return self.information.stateInforamtion(for: self.state)
}
deinit {
self.storedTitleLabel?.removeFromSuperview()
self.storedBackgroundImageView?.removeFromSuperview()
self.storedTitleLabel = nil
self.storedBackgroundImageView = nil
}
}
//MARK: - custom types
extension MyButton {
//MARK: - State
struct State: OptionSet, Hashable {
let rawValue: Int
static var normal: State = State(rawValue: 1 << 0)
// used when isHighlighted is set
static var highlighted: State = State(rawValue: 1 << 1)
static var disabled: State = State(rawValue: 1 << 2)
// flag usable by app (see below)
static var selected: State = State(rawValue: 1 << 3)
static func ==(lhs: State, rhs: State) -> Bool {
return lhs.hashValue == rhs.hashValue
}
var hashValue: Int {
return self.rawValue
}
}
//MARK: StateInformation
fileprivate struct StateInformation {
struct Information {
var title: String?
var titleColor: UIColor?
var attributedTitle: NSAttributedString?
var backgroundImage: UIImage?
var shouldAttributed: Bool = false
}
fileprivate var infoDictionary: [State: Any] = [:]
func stateInforamtion(for state: State) -> Information? {
return infoDictionary[state] as? Information
}
mutating func setStateInforamtion(_ info: Information, for state: State, button: MyButton){
infoDictionary[state] = info
button.updateView()
}
}
//MARK: - Action
fileprivate struct Action: Hashable {
var target: NSObject
var selector: Selector
static func==(lhs: Action, rhs: Action) -> Bool {
return lhs.target == rhs.target && lhs.selector == rhs.selector
}
var hashValue: Int {
return selector.hashValue
}
}
}
//MARK:- draw
extension MyButton {
fileprivate func drawImageView() {
if self.backgroundImageView == nil {
let imageView = UIImageView(frame: self.bounds)
self.storedBackgroundImageView = imageView
self.insertSubview(imageView, at: 0)
}
}
fileprivate func drawTitleLabel() {
if self.titleLabel == nil {
let titleLabel = UILabel(frame: self.bounds)
titleLabel.textAlignment = .center
self.storedTitleLabel = titleLabel
self.addSubview(titleLabel)
self.bringSubview(toFront: titleLabel)
}
}
fileprivate var transparentAlpha: CGFloat {
return self.preservedAlpha * 0.6
}
override var alpha: CGFloat {
willSet {
if state.contains(.disabled) || state.contains(.highlighted) {
if self.transparentAlpha == newValue {
self.preservedAlpha = self.alpha
}
}
}
}
fileprivate func updateView() {
self.alpha = state.contains(.highlighted) || state.contains(.disabled) ? transparentAlpha : self.preservedAlpha
self.isUserInteractionEnabled = !state.contains(.disabled)
if state.contains(.disabled) {
return
}
let information = self.information.stateInforamtion(for: state)
let normalInformation = self.information.stateInforamtion(for: .normal)
let shouldAttributed = information?.shouldAttributed ?? normalInformation?.shouldAttributed ?? false
self.titleLabel?.text = nil
self.titleLabel?.attributedText = nil
self.titleLabel?.textColor = information?.titleColor ?? normalInformation?.titleColor
self.backgroundImageView?.image = information?.backgroundImage ?? normalInformation?.backgroundImage
if shouldAttributed {
self.titleLabel?.attributedText = information?.attributedTitle ?? normalInformation?.attributedTitle
} else {
self.titleLabel?.text = information?.title ?? normalInformation?.title
}
}
}
//MARK:- title
extension MyButton {
//MARK: Properties
// these are the values that will be used for the current state. you can also use these for overrides. a heuristic will be used to
// determine what image to choose based on the explict states set. For example, the 'normal' state value will be used for all states
// that don't have their own image defined.
// normal/highlighted/selected/disabled. can return nil
var currentTitle: String? {
return self.information.stateInforamtion(for: self.state)?.title
}
// normal/highlighted/selected/disabled. always returns non-nil. default is white(1,1)
var currentTitleColor: UIColor {
return self.information.stateInforamtion(for: self.state)?.titleColor ?? .white
}
// return title label. will always create them if necessary.
var titleLabel: UILabel? {
return storedTitleLabel
}
//MARK: Methods
// default is nil. title is assumed to be single line
// normal/highlighted/selected/disabled.
func setTitle(_ title: String?, for state: State) {
self.drawTitleLabel()
var information = self.information.stateInforamtion(for: state) ?? StateInformation.Information()
information.title = title
self.information.setStateInforamtion(information, for: state, button: self)
}
// default if nil. use opaque white
// normal/highlighted/selected/disabled.
func setTitleColor(_ color: UIColor?, for state: State) {
self.drawTitleLabel()
var information = self.information.stateInforamtion(for: state) ?? StateInformation.Information()
information.titleColor = color
self.information.setStateInforamtion(information, for: state, button: self)
}
// these getters only take a single state value
// normal/highlighted/selected/disabled.
func title(for state: State) -> String? {
return self.information.stateInforamtion(for: state)?.title
}
// normal/highlighted/selected/disabled.
func titleColor(for state: State) -> UIColor? {
return self.information.stateInforamtion(for: state)?.titleColor
}
}
//MARK:- attributed title
extension MyButton {
//MARK: Properties
@available(iOS 6.0, *)
var currentAttributedTitle: NSAttributedString? {
return self.currentInformation?.attributedTitle
}
//MARK: Methods
@available(iOS 6.0, *)
// default is nil. title is assumed to be single line
func setAttributedTitle(_ title: NSAttributedString?, for state: State) {
self.drawTitleLabel()
var information = self.information.stateInforamtion(for: state) ?? StateInformation.Information()
information.attributedTitle = title
self.information.setStateInforamtion(information, for: state, button: self)
}
@available(iOS 6.0, *)
// normal/highlighted/selected/disabled.
func attributedTitle(for state: State) -> NSAttributedString? {
return self.information.stateInforamtion(for: state)?.attributedTitle
}
}
//MARK:- background image
extension MyButton {
//MARK: Properties
// return background image view. will always create them if necessary.
var backgroundImageView: UIImageView? {
return storedBackgroundImageView
}
// normal/highlighted/selected/disabled.
var currentBackgroundImage: UIImage? {
return self.currentInformation?.backgroundImage
}
//MARK: Methods
// default is nil
func setBackgroundImage(_ image: UIImage?, for state: State) {
self.drawImageView()
var information = self.information.stateInforamtion(for: state) ?? StateInformation.Information()
information.backgroundImage = image
self.information.setStateInforamtion(information, for: state, button: self)
}
// normal/highlighted/selected/disabled.
func backgroundImage(for state: State) -> UIImage? {
return self.information.stateInforamtion(for: state)?.backgroundImage
}
}
//MARK:- action
extension MyButton {
func addTarget(_ target: NSObject, action: Selector) {
let action = Action(target: target, selector: action)
self.targetActions.insert(action)
}
func removeTarget(_ target: NSObject, action: Selector) {
let action = Action(target: target, selector: action)
self.targetActions.remove(action)
}
func removeAllTargets() {
self.targetActions.removeAll()
}
func addAction(name: String, _ action: @escaping ButtonAction) {
closureActions[name] = action
}
func removeAction(name: String, _ action: ButtonAction) {
closureActions.removeValue(forKey: name)
}
func removeAllActions() {
closureActions.removeAll()
}
}
//MARK:- touch
extension MyButton {
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
self.state.update(with: .highlighted)
}
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
self.state.remove(.highlighted)
self.targetActions.forEach { $0.target.perform($0.selector, with: self, afterDelay: 0) }
self.closureActions.forEach { [unowned self] in
$0.value(self)
}
}
override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
self.state.remove(.highlighted)
}
}
<file_sep>//
// ViewController.swift
// MyButton
//
// Created by yagom
// Copyright © 2017년 yagom. All rights reserved.
//
import UIKit
class ViewController: UIViewController {
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view, typically from a nib.
let button: MyButton = MyButton(frame: CGRect(x: 50, y: 50, width: 200, height: 50))
button.setTitle("normal", for: .normal)
button.setTitleColor(.yellow, for: .normal)
button.backgroundColor = .black
button.setTitle("highlighted1", for: [.normal, .highlighted])
button.setTitleColor(.white, for: [.normal, .highlighted])
button.setTitle("selected", for: .selected)
button.setTitleColor(.green, for: .selected)
button.setTitle("highlighted2", for: [.selected, .highlighted])
button.setTitleColor(.red, for: [.selected, .highlighted])
button.addAction(name: "first") { (sender: MyButton) in
print("touch up inside")
sender.isSelected = !sender.isSelected
}
button.addTarget(self, action: #selector(self.touchUpButton(sender:)))
self.view.addSubview(button)
let disableButton: MyButton = MyButton(frame: CGRect(x: 50, y: 120, width: 200, height: 50))
disableButton.setTitle("Disable the button", for: .normal)
disableButton.setTitleColor(.black, for: .normal)
disableButton.setTitle("Enable the button", for: .selected)
disableButton.addAction(name: "en/disable") { (sender: MyButton) in
button.isEnabled = !button.isEnabled
sender.isSelected = !sender.isSelected
}
self.view.addSubview(disableButton)
}
func touchUpButton(sender: MyButton) {
print("button tapped")
}
}
|
c79c0a46ac76954734b5b38b4f15233bfa2e1c78
|
[
"Markdown",
"Swift"
] | 3 |
Markdown
|
connect-boostcamp/MyButton_iOS
|
a020c1903be0fa2d52faa73236ff9fc416e03c31
|
6bfa7001fb36570a448c73e866f2d7830f39a23c
|
refs/heads/master
|
<file_sep>using FakeItEasy;
using Microsoft.Practices.Prism.Mvvm.Interfaces;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Sandbox.UILogic.ViewModels;
namespace Sandbox.Tests.ViewModels
{
[TestClass]
public class TestMainPageViewModel
{
private INavigationService _navigationService;
private MainPageViewModel _viewModel;
[TestInitialize]
public void Initialize()
{
_navigationService = A.Fake<INavigationService>();
_viewModel = new MainPageViewModel(_navigationService);
}
[TestMethod]
public void PressingReactivePropertiesButtonsNavigatesToPage()
{
_viewModel.ShowReactivePropertiesCommand.Execute(null);
A.CallTo(() => _navigationService.Navigate("ReactiveProperties", null)).MustHaveHappened();
}
}
}<file_sep>// -----------------------------------------------------------------------
// <copyright file="Navigator.cs" company="NOKIA">
// copyright © 2014 Nokia. All rights reserved.
// This material, including documentation and any related computer
// programs, is protected by copyright controlled by Nokia. All
// rights are reserved. Copying, including reproducing, storing,
// adapting or translating, any or all of this material requires the
// prior written consent of Nokia. This material also contains
// confidential information which may not be disclosed to others
// without the prior written consent of Nokia.
// </copyright>
// -----------------------------------------------------------------------
using System;
using Windows.UI.Xaml.Controls;
using HERE.Common;
namespace Sandbox.Common
{
public class Navigator : INavigator
{
private readonly Frame _frame;
public bool CanGoBack
{
get { return _frame.CanGoBack; }
}
public string CurrentSourcePageType
{
get { return _frame.CurrentSourcePageType.ToString(); }
}
public Navigator(Frame frame)
{
_frame = frame;
}
public event EventHandler<bool> CanGoBackChanged;
public void GoBack()
{
_frame.GoBack();
}
public void Navigate(string sourcePage, object parameter)
{
_frame.Navigate(Type.GetType(sourcePage), parameter);
}
public void Navigate(string sourcePage)
{
Navigate(sourcePage, null);
}
public void Navigate(Uri uri)
{
throw new NotImplementedException();
}
}
}<file_sep>// -----------------------------------------------------------------------
// <copyright file="ReactiveExtensions.cs" company="NOKIA">
// copyright © 2014 Nokia. All rights reserved.
// This material, including documentation and any related computer
// programs, is protected by copyright controlled by Nokia. All
// rights are reserved. Copying, including reproducing, storing,
// adapting or translating, any or all of this material requires the
// prior written consent of Nokia. This material also contains
// confidential information which may not be disclosed to others
// without the prior written consent of Nokia.
// </copyright>
// -----------------------------------------------------------------------
namespace MapsW8.Base.Reactive
{
using System;
using System.Collections;
using System.Collections.ObjectModel;
using System.Collections.Specialized;
using System.Reactive;
using System.Reactive.Concurrency;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using System.Reactive.Subjects;
using System.Runtime.CompilerServices;
using System.Threading;
public static class ReactiveExtensions
{
public static IObservable<TSource> ObserveOnUI<TSource>(this IObservable<TSource> source)
{
if (SynchronizationContext.Current == null)
{
return source.ObserveOn(Scheduler.CurrentThread);
}
else
{
return source.ObserveOn(SynchronizationContext.Current);
}
}
public static IObservable<TSource> SubscribeOnUI<TSource>(this IObservable<TSource> source)
{
if (SynchronizationContext.Current == null)
{
return source.SubscribeOn(Scheduler.CurrentThread);
}
else
{
return source.SubscribeOn(SynchronizationContext.Current);
}
}
public static IObservable<TSource> SelectArgs<TSource>(this IObservable<EventPattern<TSource>> source)
{
return source.Select(ev => ev.EventArgs);
}
public static IObservable<TSource> WhereIsNotNull<TSource>(this IObservable<TSource> source) where TSource : class
{
return source.Where(arg => arg != null);
}
public static IObservable<TSource> Log<TSource>(this IObservable<TSource> source, Action<string, string> action, [CallerMemberName] string memberName = null)
{
return source.Do(arg => action("" + arg, memberName));
}
public static IObservable<int> CountObservable<TSource>(this TSource source) where TSource : ICollection, INotifyCollectionChanged
{
return Observable.FromEventPattern<NotifyCollectionChangedEventHandler, NotifyCollectionChangedEventArgs>(
h => source.CollectionChanged += h,
h => source.CollectionChanged -= h)
.Select(_ => source.Count)
.StartWith(source.Count);
}
/// <summary>
/// Works like SelectMany, except that when the source produces a new value
/// we unsubscribe from the previous inner stream and only use values from the new one
/// </summary>
/// <typeparam name="TSource"></typeparam>
/// <typeparam name="TResults"></typeparam>
/// <param name="source"></param>
/// <param name="action"></param>
/// <returns></returns>
public static IObservable<TResults> SelectManyOneAtATime<TSource, TResults>(this IObservable<TSource> source, Func<TSource, IObservable<TResults>> action)
{
return source.Select(arg => action(arg)).Switch();
}
}
}<file_sep>using System.Windows.Input;
using Microsoft.Practices.Prism.Commands;
using Microsoft.Practices.Prism.Mvvm;
using Microsoft.Practices.Prism.Mvvm.Interfaces;
namespace Sandbox.UILogic.ViewModels
{
public class MainPageViewModel : ViewModel
{
public ICommand ShowReactivePropertiesCommand { get; private set; }
public ICommand SubscriptionChangeCommand { get; private set; }
public ICommand SandboxCommand { get; private set; }
public ICommand PropertyUpdateCommand { get; private set; }
public ICommand LatestWinsCommand { get; private set; }
public ICommand SelectSwitchCommand { get; private set; }
public MainPageViewModel(INavigationService navigationService)
{
ShowReactivePropertiesCommand =
new DelegateCommand(() => navigationService.Navigate("ReactiveProperties", null));
SubscriptionChangeCommand = new DelegateCommand(() => navigationService.Navigate("SubscriptionChange", null));
SandboxCommand = new DelegateCommand(() => navigationService.Navigate("SandBox", null));
PropertyUpdateCommand = new DelegateCommand(() => navigationService.Navigate("PropertyUpdate", null));
LatestWinsCommand = new DelegateCommand(() => navigationService.Navigate("LatestWins", null));
SelectSwitchCommand = new DelegateCommand(() => navigationService.Navigate("SelectSwitch", null));
}
}
}<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace Sandbox.Linq
{
public static class LinqSandbox
{
public static void Basics()
{
List<int> normalList = new List<int> { 1, 2, 3, 4, 5 };
IEnumerable<int> aBitDifferentList = new List<int> { 1, 2, 3, 4, 5 };
int element1 = normalList[0];
//int element2 = aBitDifferentList[0];
}
public static IEnumerable<int> LinqBasics()
{
var items = new List<int> { 1, 2, 3, 4, 5 };
var oddItems = items.Where(number => number % 2 == 1)
.Select(number => number * number)
.Take(3)
.OrderByDescending(number => number);
return oddItems;
}
public static IEnumerable<int> LinqLessBasic()
{
var items = Enumerable.Range(0, int.MaxValue);
int whereCount = 0;
int selectCount = 0;
var oddItems = items.Where(number => { whereCount++; return number % 2 == 1; })
.Select(number => { selectCount++; return number * number; })
.Take(3)
.OrderByDescending(number => number)
.ToList();
return oddItems;
}
public static string FoldTextToOneLine(string s, int maxLines = 8, string foldSeperator = ", ")
{
// no null check is intentional
List<string> lines = new List<string>();
string[] strings = s.Split('\n', ',', ';');
foreach (string line in strings)
{
string trimmedLine = line.Trim();
if (trimmedLine.Length > 0)
{
lines.Add(trimmedLine);
if (lines.Count >= maxLines)
{
break;
}
}
}
StringBuilder sb = new StringBuilder();
for (int i = 0; i < lines.Count; i++)
{
sb.Append(lines[i]);
if (i < lines.Count - 1)
{
sb.Append(foldSeperator);
}
}
return sb.ToString();
}
public static string FoldTextToOneLineLinq(string multiLineString, int maxLines = 8, string foldSeperator = ", ")
{
// no null check is intentional
return multiLineString.Split('\n', ',', ';')
.Select(line => line.Trim())
.Where(trimmedLine => trimmedLine.Length > 0)
.Take(maxLines)
.Aggregate((first, second) => first + foldSeperator + second);
}
}
}
<file_sep>using System;
using System.Reactive.Subjects;
namespace Sandbox.UILogic.Model
{
public class ReactivePlace
{
private readonly Subject<string> _addressSubject = new Subject<string>();
private readonly Subject<bool> _isFavoriteSubject = new Subject<bool>();
private readonly Subject<string> _nameSubject = new Subject<string>();
private string _address;
private bool _isFavorite;
private string _name;
public string Address
{
get { return _address; }
set
{
if (_address != value)
{
_address = value;
_addressSubject.OnNext(_address);
}
}
}
public bool IsFavorite
{
get { return _isFavorite; }
set
{
if (_isFavorite != value)
{
_isFavorite = value;
_isFavoriteSubject.OnNext(_isFavorite);
}
}
}
public string Name
{
get { return _name; }
set
{
if (_name != value)
{
_name = value;
_nameSubject.OnNext(_name);
}
}
}
public IObservable<string> AddressObservable
{
get { return _addressSubject; }
}
public IObservable<string> NameObservable
{
get { return _nameSubject; }
}
public IObservable<bool> IsFavoriteObservable
{
get { return _isFavoriteSubject; }
}
}
}<file_sep>using System;
using System.Collections.Generic;
using System.Reactive.Linq;
using Microsoft.Practices.Prism.Mvvm;
using Sandbox.UILogic.Model;
using Utilities.Reactive;
namespace Sandbox.UILogic.ViewModels
{
public class PropertyUpdatePageViewModel : ViewModel, IDisposable
{
private readonly Place _place = new Place {Name = "Home", Address = "Lezajsk", IsFavorite = true};
public string DisplayText
{
get
{
return string.Format("This place name is: {0}, the address {1} and it is favorite: {2}", _place.Name,
_place.Address, _place.IsFavorite);
}
}
public ReadonlyReactiveProperty<string> ReactiveDisplayText { get; private set; }
#region Helpers
public string PlaceName
{
get { return _place.Name; }
set { _place.Name = value; }
}
public string PlaceAddress
{
get { return _place.Address; }
set { _place.Address = value; }
}
public bool PlaceIsFavorite
{
get { return _place.IsFavorite; }
set { _place.IsFavorite = value; }
}
#endregion
public PropertyUpdatePageViewModel()
{
_place.NameChanged += OnNameChanged;
_place.AddressChanged += OnAddressChanged;
_place.IsFavoriteChanged += OnIsFavoriteChanged;
#region Reactive stuff
var nameObservable = Observable.FromEventPattern<string>(h => _place.NameChanged += h,
h => _place.NameChanged -= h)
.Select(args => args.EventArgs)
.StartWith(_place.Name);
var addressObservable = Observable.FromEventPattern<string>(h => _place.AddressChanged += h,
h => _place.AddressChanged -= h)
.Select(args => args.EventArgs)
.StartWith(_place.Address);
var isFavoriteObservable = Observable.FromEventPattern<bool>(h => _place.IsFavoriteChanged += h,
h => _place.IsFavoriteChanged -= h)
.Select(args => args.EventArgs)
.StartWith(_place.IsFavorite);
ReactiveDisplayText =
nameObservable.CombineLatest(addressObservable, isFavoriteObservable, (name, address, isFavorite) =>
string.Format("This place name is: {0}, the address {1} and it is favorite: {2}", name,
address, isFavorite))
.ToReadonlyReactiveProperty();
#endregion
}
public void Dispose()
{
ReactiveDisplayText.Dispose();
}
public override void OnNavigatedFrom(Dictionary<string, object> viewModelState, bool suspending)
{
base.OnNavigatedFrom(viewModelState, suspending);
Dispose();
}
private void OnIsFavoriteChanged(object sender, bool e)
{
OnPropertyChanged(() => DisplayText);
}
private void OnAddressChanged(object sender, string e)
{
OnPropertyChanged(() => DisplayText);
}
private void OnNameChanged(object sender, string e)
{
OnPropertyChanged(() => DisplayText);
}
}
}<file_sep>
using Microsoft.Practices.Prism.StoreApps;
namespace Sandbox.Views
{
public sealed partial class SelectSwitchPage : VisualStateAwarePage
{
public SelectSwitchPage()
{
InitializeComponent();
}
}
}
<file_sep>using System;
using System.Reactive.Concurrency;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using System.Reactive.Subjects;
using System.Threading;
using System.Windows;
namespace MapsW8.Base.Reactive
{
/// <summary>
/// <para>If Schedule is called on the UI thread, the work is scheduled immediately; otherwise it is dispatched with BeginInvoke.</para>
/// <para>UIDispatcherScheduler is created the first time UIDispatcherScheduler.Default is accessed in the application.</para>
/// <para>If you want to explicitly initialize, call UIDispatcherScheduler.Initialize() in App.xaml.cs.</para>
/// </summary>
public static class UIDispatcherScheduler
{
private static readonly Lazy<SynchronizationContextScheduler> defaultScheduler =
new Lazy<SynchronizationContextScheduler>(() => new SynchronizationContextScheduler(SynchronizationContext.Current));
/// <summary>
/// <para>If Schedule is called on the UI thread, the work is scheduled immediately; otherwise it is dispatched with BeginInvoke.</para>
/// <para>UIDispatcherScheduler is created the first time UIDispatcherScheduler.Default is accessed in the application.</para>
/// <para>If you want to explicitly initialize, call UIDispatcherScheduler.Initialize() in App.xaml.cs.</para>
/// </summary>
public static IScheduler Default
{
get
{
return SynchronizationContext.Current == null ? (IScheduler)Scheduler.CurrentThread : defaultScheduler.Value;
}
}
/// <summary>
/// Creates the UIDispatcherScheduler on the calling thread if it is not initialized yet.
/// </summary>
public static void Initialize()
{
var _ = defaultScheduler.Value;
}
}
}<file_sep>// -----------------------------------------------------------------------
// <copyright file="PageBase.cs" company="NOKIA">
// copyright © 2014 Nokia. All rights reserved.
// This material, including documentation and any related computer
// programs, is protected by copyright controlled by Nokia. All
// rights are reserved. Copying, including reproducing, storing,
// adapting or translating, any or all of this material requires the
// prior written consent of Nokia. This material also contains
// confidential information which may not be disclosed to others
// without the prior written consent of Nokia.
// </copyright>
// -----------------------------------------------------------------------
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;
using HERE.Common;
using System;
using System.Reactive.Linq;
namespace Sandbox.Common
{
public class PageBase : Page
{
private IDisposable _backButtonSubscription;
protected PageBase()
{
Loaded += OnLoaded;
Unloaded += PageBase_Unloaded;
}
void PageBase_Unloaded(object sender, RoutedEventArgs e)
{
var viewModel = DataContext as PageViewModel;
if (viewModel != null)
{
viewModel.Dispose();
}
if (_backButtonSubscription != null)
{
_backButtonSubscription.Dispose();
}
}
private void OnLoaded(object sender, RoutedEventArgs e)
{
var viewModel = DataContext as PageViewModel;
if (viewModel != null)
{
viewModel.Navigator = new Navigator(Frame);
}
#if WINDOWS_PHONE_APP
_backButtonSubscription = Observable.FromEventPattern<Windows.Phone.UI.Input.BackPressedEventArgs>
(h => Windows.Phone.UI.Input.HardwareButtons.BackPressed += h,
h => Windows.Phone.UI.Input.HardwareButtons.BackPressed -= h)
.Where(_ => ((PageViewModel)viewModel).Navigator.CanGoBack)
.Select(ev => ev.EventArgs)
.Subscribe(GoBack);
#endif
}
#if WINDOWS_PHONE_APP
void GoBack(Windows.Phone.UI.Input.BackPressedEventArgs e)
{
var viewModel = DataContext as PageViewModel;
viewModel.Navigator.GoBack();
e.Handled = true;
}
#endif
}
}<file_sep>using Microsoft.Practices.Prism.Mvvm;
using Utilities.Reactive;
namespace Sandbox.UILogic.ViewModels
{
public class SandBoxPageViewModel : ViewModel
{
public ReactiveProperty<string> Text { get; private set; }
public ReactiveCommand ClickCommand { get; private set; }
public SandBoxPageViewModel()
{
ClickCommand = new ReactiveCommand();
Text = new ReactiveProperty<string>();
}
}
}<file_sep>// -----------------------------------------------------------------------
// <copyright file="RactivePropertyExtensions.cs" company="NOKIA">
// copyright © 2014 Nokia. All rights reserved.
// This material, including documentation and any related computer
// programs, is protected by copyright controlled by Nokia. All
// rights are reserved. Copying, including reproducing, storing,
// adapting or translating, any or all of this material requires the
// prior written consent of Nokia. This material also contains
// confidential information which may not be disclosed to others
// without the prior written consent of Nokia.
// </copyright>
// -----------------------------------------------------------------------
namespace MapsW8.Base.Reactive
{
using System;
using System.Reactive.Concurrency;
public static class RactivePropertyExtensions
{
public static ReactiveProperty<T> ToReactiveProperty<T>(this IObservable<T> source,
T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe)
{
return new ReactiveProperty<T>(source, initialValue, mode);
}
public static ReactiveProperty<T> ToReactiveProperty<T>(this IObservable<T> source,
IScheduler raiseEventScheduler,
T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe)
{
return new ReactiveProperty<T>(source, raiseEventScheduler, initialValue, mode);
}
public static ReadonlyReactiveProperty<T> ToReadonlyReactiveProperty<T>(this IObservable<T> source,
T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe)
{
return new ReadonlyReactiveProperty<T>(source, initialValue, mode);
}
public static ReadonlyReactiveProperty<T> ToReadonlyReactiveProperty<T>(this IObservable<T> source,
IScheduler raiseEventScheduler,
T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe)
{
return new ReadonlyReactiveProperty<T>(source, raiseEventScheduler, initialValue, mode);
}
}
}<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Reactive.Linq;
namespace Sandbox.UILogic
{
public static class ReactiveExtensions
{
public static IObservable<long> BucketSum<TSource>(this IObservable<TSource> self, Func<TSource, long> selector)
{
var bucket = new Dictionary<TSource, long>();
return self.Select(source =>
{
bucket[source] = selector(source);
return bucket.Values.Sum();
});
}
}
}
<file_sep>using Microsoft.Practices.Prism.StoreApps;
namespace Sandbox.Views
{
public sealed partial class LatestWinsPage : VisualStateAwarePage
{
public LatestWinsPage()
{
InitializeComponent();
}
}
}<file_sep>using Microsoft.Practices.Prism.StoreApps;
namespace Sandbox.Views
{
public sealed partial class SandBoxPage : VisualStateAwarePage
{
public SandBoxPage()
{
InitializeComponent();
}
}
}<file_sep>using Sandbox.Linq;
namespace Sandbox.Console
{
class Program
{
static void Main(string[] args)
{
var display = LinqSandbox.LinqBasics();
foreach (int something in display)
{
System.Console.WriteLine(something);
}
LinqSandbox.LinqLessBasic();
string testString = " this is some , string, that we , would to test, with ; parsing";
string statementBased = LinqSandbox.FoldTextToOneLine(testString);
string expressionBased = LinqSandbox.FoldTextToOneLineLinq(testString);
System.Console.ReadKey();
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Diagnostics.CodeAnalysis;
using System.Text;
using System.Windows.Input;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;
namespace Sandbox.Commands
{
public class SelectionCommand
{
[SuppressMessage("Microsoft.Security",
"CA2104:DoNotDeclareReadOnlyMutableReferenceTypes")]
public static readonly DependencyProperty
CommandProperty =
DependencyProperty.RegisterAttached("Command",
typeof(ICommand),
typeof(SelectionCommand),
new PropertyMetadata(null, CommandPropertyChanged));
public static void SetCommand(DependencyObject attached, ICommand value)
{
attached.SetValue(CommandProperty, value);
}
public static ICommand GetCommand(DependencyObject attached)
{
return (attached.GetValue(CommandProperty) as ICommand);
}
private static void CommandPropertyChanged(DependencyObject obj, DependencyPropertyChangedEventArgs args)
{
// first check if the component is a list
if (obj is ListViewBase)
{
var list = obj as ListViewBase;
list.SelectionChanged += (sender, e) => ExecuteCommand(obj, list.SelectedItems);
}
}
private static void ExecuteCommand(DependencyObject attached, object argument)
{
ICommand command = GetCommand(attached);
if (command != null && command.CanExecute(argument))
{
command.Execute(argument);
}
}
}
}
<file_sep>using System;
namespace Sandbox.UILogic.Model
{
public class Place
{
private string _address;
private bool _isFavorite;
private string _name;
public string Address
{
get { return _address; }
set
{
if (_address != value)
{
_address = value;
AddressChanged(this, _address);
}
}
}
public bool IsFavorite
{
get { return _isFavorite; }
set
{
if (_isFavorite != value)
{
_isFavorite = value;
IsFavoriteChanged(this, _isFavorite);
}
}
}
public string Name
{
get { return _name; }
set
{
if (_name != value)
{
_name = value;
NameChanged(this, _name);
}
}
}
public event EventHandler<string> AddressChanged = delegate { };
public event EventHandler<bool> IsFavoriteChanged = delegate { };
public event EventHandler<string> NameChanged = delegate { };
}
}<file_sep>// -----------------------------------------------------------------------
// <copyright file="Unsubscriber.cs" company="NOKIA">
// copyright © 2014 Nokia. All rights reserved.
// This material, including documentation and any related computer
// programs, is protected by copyright controlled by Nokia. All
// rights are reserved. Copying, including reproducing, storing,
// adapting or translating, any or all of this material requires the
// prior written consent of Nokia. This material also contains
// confidential information which may not be disclosed to others
// without the prior written consent of Nokia.
// </copyright>
// -----------------------------------------------------------------------
namespace MapsW8.Base.Observable
{
using System;
using System.Collections.Generic;
public class Unsubscriber<T> : IDisposable
{
private readonly List<IObserver<T>> _observers;
private readonly IObserver<T> _observer;
public Unsubscriber(List<IObserver<T>> observers, IObserver<T> observer)
{
_observers = observers;
_observer = observer;
}
public void Dispose()
{
if (_observer != null && _observers.Contains(_observer))
{
_observers.Remove(_observer);
}
}
}
}<file_sep>using System;
namespace Sandbox.UILogic.Model
{
public class City : Place
{
private long _population;
public long Population
{
get { return _population; }
set
{
if (_population != value)
{
_population = value;
PopulationChanged(this, _population);
}
}
}
public event EventHandler<long> PopulationChanged = delegate { };
}
}
<file_sep>using Microsoft.Practices.Prism.StoreApps;
namespace Sandbox.Views
{
/// <summary>
/// An empty page that can be used on its own or navigated to within a Frame.
/// </summary>
public sealed partial class ReactivePropertiesPage : VisualStateAwarePage
{
public ReactivePropertiesPage()
{
InitializeComponent();
}
}
}<file_sep>// -----------------------------------------------------------------------
// <copyright file="ReactiveProperty.cs" company="NOKIA">
// copyright © 2014 Nokia. All rights reserved.
// This material, including documentation and any related computer
// programs, is protected by copyright controlled by Nokia. All
// rights are reserved. Copying, including reproducing, storing,
// adapting or translating, any or all of this material requires the
// prior written consent of Nokia. This material also contains
// confidential information which may not be disclosed to others
// without the prior written consent of Nokia.
// </copyright>
// -----------------------------------------------------------------------
namespace MapsW8.Base.Reactive
{
using System;
using System.Reactive.Concurrency;
public class ReactiveProperty<T> : ReactivePropertyBase<T>, IReactiveProperty
{
/// <summary>PropertyChanged raise on UIDispatcherScheduler</summary>
public ReactiveProperty()
: this(default(T), ReactivePropertyMode.DistinctUntilChanged | ReactivePropertyMode.RaiseLatestValueOnSubscribe)
{
}
/// <summary>PropertyChanged raise on UIDispatcherScheduler</summary>
public ReactiveProperty(T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe)
: this(UIDispatcherScheduler.Default, initialValue, mode)
{ }
/// <summary>PropertyChanged raise on selected scheduler</summary>
public ReactiveProperty(IScheduler raiseEventScheduler,
T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe)
: this(System.Reactive.Linq.Observable.Never<T>(), raiseEventScheduler, initialValue, mode)
{
}
// ToReactiveProperty Only
internal ReactiveProperty(IObservable<T> source,
T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe)
: this(source, UIDispatcherScheduler.Default, initialValue, mode)
{
}
internal ReactiveProperty(IObservable<T> source,
IScheduler raiseEventScheduler,
T initialValue = default(T),
ReactivePropertyMode mode = ReactivePropertyMode.DistinctUntilChanged|ReactivePropertyMode.RaiseLatestValueOnSubscribe) :
base(source, raiseEventScheduler, initialValue, mode)
{
}
/// <summary>
/// Get latestValue or push(set) value.
/// </summary>
public T Value
{
get { return InternalValue; }
set { InternalValue = value; }
}
object IReactiveProperty.Value
{
get { return Value; }
}
}
}<file_sep>using System;
using System.Collections.Generic;
using System.Linq;
using System.Reactive.Linq;
using Microsoft.Practices.Prism.Mvvm;
using Sandbox.UILogic.Model;
using Utilities.Reactive;
namespace Sandbox.UILogic.ViewModels
{
public class SelectSwitchPageViewModel : ViewModel, IDisposable
{
public IList<City> CitiesList { get; private set; }
public ReadonlyReactiveProperty<string> Population { get; private set; }
public ReactiveCommand<IList<object>> ItemSelectedCommand { get; private set;}
public ReactiveProperty<bool> IsMultipleSelection { get; private set; }
public SelectSwitchPageViewModel()
{
CitiesList = Enumerable.Range(0, 5)
.Select(number => new City { Population = (uint)number * 1000})
.ToList();
ItemSelectedCommand = new ReactiveCommand<IList<object>>();
IsMultipleSelection = new ReactiveProperty<bool>(false);
Population = IsMultipleSelection.Select(isMultipleSelection => isMultipleSelection ?
DefineMultipleSliderPopulationObservable(ItemSelectedCommand) :
DefineSingleSliderPopulationObservable(ItemSelectedCommand))
.Switch()
.ToReadonlyReactiveProperty();
}
public void Dispose()
{
ItemSelectedCommand.Dispose();
Population.Dispose();
IsMultipleSelection.Dispose();
}
private static IObservable<string> DefineSingleSliderPopulationObservable(IObservable<IList<object>> itemsSelectedObservable)
{
return itemsSelectedObservable.Select(selectedItems => selectedItems.OfType<City>()
.FirstOrDefault())
.Select(city => city != null ? Observable.FromEventPattern<long>(h => city.PopulationChanged += h,
h => city.PopulationChanged -= h)
.Select(args => args.EventArgs)
.StartWith(city.Population)
: Observable.Return<long>(0))
.Switch()
.StartWith(0)
.Select(population => population.ToString());
}
private static IObservable<string> DefineMultipleSliderPopulationObservable(IObservable<IList<object>> itemsSelectedObservable)
{
return itemsSelectedObservable.Select(selectedItems => selectedItems.OfType<City>()
.ToList())
.Select(selectedItems => selectedItems.Any() ? selectedItems.ToObservable()
.SelectMany(city => Observable.FromEventPattern<long>(h => city.PopulationChanged += h,
h => city.PopulationChanged -= h)
.Select(args => city)
.StartWith(city))
.BucketSum(city => city.Population)
: Observable.Return<long>(0))
.Switch()
.StartWith(0)
.Select(population => population.ToString());
}
}
}
<file_sep>using System;
using System.Collections.Generic;
using System.Reactive.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Practices.Prism.Mvvm;
using Utilities.Reactive;
namespace Sandbox.UILogic.ViewModels
{
public class LatestWinsPageViewModel : ViewModel, IDisposable
{
public ReadonlyReactiveProperty<string> Description { get; private set; }
public ReactiveCommand CountPressesCommand { get; private set; }
public LatestWinsPageViewModel()
{
CountPressesCommand = new ReactiveCommand();
Description = CountPressesCommand.Select((_, count) => count)
.Select(count => Observable.FromAsync(token => SlowCounterAsync(count, token)))
.Switch()
.ToReadonlyReactiveProperty();
}
public void Dispose()
{
CountPressesCommand.Dispose();
Description.Dispose();
}
private async Task<string> SlowCounterAsync(int count, CancellationToken token)
{
await Task.Delay(500, token);
token.ThrowIfCancellationRequested();
return string.Format("This is the {0} time you have pressed the button", count);
}
public override void OnNavigatedFrom(Dictionary<string, object> viewModelState, bool suspending)
{
base.OnNavigatedFrom(viewModelState, suspending);
Dispose();
}
}
}<file_sep>// -----------------------------------------------------------------------
// <copyright file="ObservableCommand.cs" company="NOKIA">
// copyright © 2014 Nokia. All rights reserved.
// This material, including documentation and any related computer
// programs, is protected by copyright controlled by Nokia. All
// rights are reserved. Copying, including reproducing, storing,
// adapting or translating, any or all of this material requires the
// prior written consent of Nokia. This material also contains
// confidential information which may not be disclosed to others
// without the prior written consent of Nokia.
// </copyright>
// -----------------------------------------------------------------------
namespace MapsW8.Base
{
using HERE.Common;
using MapsW8.Base.Observable;
using System;
using System.Collections.Generic;
using System.Windows.Input;
public class ObservableCommand : ICommand, IObservable<object>
{
private readonly List<IObserver<object>> _observers = new List<IObserver<object>>();
public event EventHandler CanExecuteChanged = null;
protected readonly Func<object, bool> _canExecute;
public ObservableCommand(Func<object, bool> canExecute)
{
_canExecute = canExecute;
}
public ObservableCommand() :
this(null)
{
}
public virtual bool CanExecute(object parameter)
{
return _canExecute == null || _canExecute(parameter);
}
public virtual void Execute(object parameter)
{
foreach (var observer in _observers.ToArray())
{
observer.OnNext(parameter);
}
}
public void NotifyCanExecuteChanged()
{
this.Notify(CanExecuteChanged);
}
public IDisposable Subscribe(IObserver<object> observer)
{
if (!_observers.Contains(observer))
{
_observers.Add(observer);
}
return new Unsubscriber<object>(_observers, observer);
}
}
}<file_sep>using Microsoft.Practices.Prism.StoreApps;
namespace Sandbox.Views
{
public sealed partial class SubscriptionChangePage : VisualStateAwarePage
{
public SubscriptionChangePage()
{
InitializeComponent();
}
}
}<file_sep>using Microsoft.Practices.Prism.StoreApps;
namespace Sandbox.Views
{
public sealed partial class PropertyUpdatePage : VisualStateAwarePage
{
public PropertyUpdatePage()
{
InitializeComponent();
}
}
}<file_sep>using System;
using System.Collections.Generic;
using System.Text;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Data;
namespace Sandbox.Converters
{
public class SelectionConverter : IValueConverter
{
public object Convert(object value, Type targetType, object parameter, string language)
{
var isMultipleSelection = (bool)value;
return isMultipleSelection ? ListViewSelectionMode.Multiple : ListViewSelectionMode.Single;
}
public object ConvertBack(object value, Type targetType, object parameter, string language)
{
var selection = (ListViewSelectionMode)value;
return selection == ListViewSelectionMode.Multiple;
}
}
}
<file_sep>using System;
using System.Globalization;
using System.Threading.Tasks;
using Windows.ApplicationModel.Activation;
using Windows.UI.Xaml;
using Microsoft.Practices.Prism.Mvvm;
using Microsoft.Practices.Prism.PubSubEvents;
using Microsoft.Practices.Unity;
namespace Sandbox
{
public sealed partial class App : MvvmAppBase
{
private readonly IUnityContainer _container = new UnityContainer();
public App()
{
InitializeComponent();
}
protected override Task OnLaunchApplicationAsync(LaunchActivatedEventArgs args)
{
NavigationService.Navigate("Main", null);
Window.Current.Activate();
return Task.FromResult<object>(null);
}
protected override Task OnInitializeAsync(IActivatedEventArgs args)
{
_container.RegisterInstance(NavigationService);
_container.RegisterInstance(SessionStateService);
ViewModelLocationProvider.SetDefaultViewTypeToViewModelTypeResolver(viewType =>
{
var viewModelTypeName = string.Format(CultureInfo.InvariantCulture,
"Sandbox.UILogic.ViewModels.{0}ViewModel, Sandbox.UILogic, Version=1.0.0.0, Culture=neutral",
viewType.Name);
var viewModelType = Type.GetType(viewModelTypeName);
return viewModelType;
});
return base.OnInitializeAsync(args);
}
protected override object Resolve(Type type)
{
return _container.Resolve(type);
}
protected override void OnRegisterKnownTypesForSerialization()
{
base.OnRegisterKnownTypesForSerialization();
}
}
}<file_sep>using System;
using System.Collections.Generic;
using System.Reactive.Linq;
using Microsoft.Practices.Prism.Mvvm;
using Utilities.Reactive;
namespace Sandbox.UILogic.ViewModels
{
public class ReactivePropertiesPageViewModel : ViewModel, IDisposable
{
public ReadonlyReactiveProperty<string> RetriesText { get; private set; }
public ReactivePropertiesPageViewModel()
{
RetriesText = Observable.Interval(TimeSpan.FromSeconds(1))
.Select(number => "This is " + number + " retry")
.ToReadonlyReactiveProperty();
}
public void Dispose()
{
RetriesText.Dispose();
}
public override void OnNavigatedFrom(Dictionary<string, object> viewModelState, bool suspending)
{
base.OnNavigatedFrom(viewModelState, suspending);
Dispose();
}
}
}<file_sep>using System;
using System.Collections.Generic;
using System.Reactive.Linq;
using System.Windows.Input;
using Microsoft.Practices.Prism.Commands;
using Microsoft.Practices.Prism.Mvvm;
using Sandbox.UILogic.Model;
using Utilities.Reactive;
namespace Sandbox.UILogic.ViewModels
{
public class SubscriptionChangePageViewModel : ViewModel, IDisposable
{
public readonly List<Place> _places = new List<Place>
{
new Place {Name = "Home", Address = "Lezajsk", IsFavorite = true},
new Place {Name = "Work", Address = "Berlin", IsFavorite = false},
new Place {Name = "Retirement place", Address = "Hawaii", IsFavorite = true}
};
private string _currentAddress;
private bool _currentFavorite;
private int _currentIndex;
private string _currentName;
private Place _currentPlace;
public string CurrentName
{
get { return _currentName; }
private set
{
if (_currentName != value)
{
_currentName = value;
OnPropertyChanged(() => CurrentName);
}
}
}
public string CurrentAddress
{
get { return _currentAddress; }
private set
{
if (_currentAddress != value)
{
_currentAddress = value;
OnPropertyChanged(() => CurrentAddress);
}
}
}
public bool CurrentFavorite
{
get { return _currentFavorite; }
private set
{
if (_currentFavorite != value)
{
_currentFavorite = value;
OnPropertyChanged(() => CurrentFavorite);
}
}
}
#region Reactive stuff
private readonly ReactiveProperty<Place> _currentReactivePlace = new ReactiveProperty<Place>();
public ReadonlyReactiveProperty<string> CurrentReactiveName { get; private set; }
public ReadonlyReactiveProperty<string> CurrentReactiveAddress { get; private set; }
public ReadonlyReactiveProperty<bool> CurrentReactiveFavorite { get; private set; }
#endregion
public ICommand NextCommand { get; private set; }
public SubscriptionChangePageViewModel()
{
SetPlace(_places[_currentIndex]);
NextCommand = new DelegateCommand(() => SetPlace(_places[++_currentIndex%_places.Count]));
#region Reactive stuff
CurrentReactiveName =
_currentReactivePlace.Select(place => Observable.FromEventPattern<string>(h => place.NameChanged += h,
h => place.NameChanged -= h)
.Select(args => args.EventArgs)
.StartWith(place.Name))
.Switch()
.ToReadonlyReactiveProperty();
CurrentReactiveAddress =
_currentReactivePlace.Select(
place => Observable.FromEventPattern<string>(h => place.AddressChanged += h,
h => place.AddressChanged -= h)
.Select(args => args.EventArgs)
.StartWith(place.Address))
.Switch()
.ToReadonlyReactiveProperty();
CurrentReactiveFavorite =
_currentReactivePlace.Select(
place => Observable.FromEventPattern<bool>(h => place.IsFavoriteChanged += h,
h => place.IsFavoriteChanged -= h)
.Select(args => args.EventArgs)
.StartWith(place.IsFavorite))
.Switch()
.ToReadonlyReactiveProperty();
#endregion
}
public void Dispose()
{
CurrentReactiveName.Dispose();
CurrentReactiveAddress.Dispose();
CurrentReactiveFavorite.Dispose();
}
public override void OnNavigatedFrom(Dictionary<string, object> viewModelState, bool suspending)
{
base.OnNavigatedFrom(viewModelState, suspending);
Dispose();
}
private void SetPlace(Place place)
{
SetNormalPlace(place);
_currentReactivePlace.Value = place;
}
private void SetNormalPlace(Place place)
{
if (_currentPlace != null)
{
_currentPlace.NameChanged -= OnNameChanged;
_currentPlace.AddressChanged -= OnAddressChanged;
_currentPlace.IsFavoriteChanged -= OnIsFavoriteChanged;
}
_currentPlace = place;
CurrentName = _currentPlace.Name;
CurrentAddress = _currentPlace.Address;
CurrentFavorite = _currentPlace.IsFavorite;
if (_currentPlace != null)
{
_currentPlace.NameChanged += OnNameChanged;
_currentPlace.AddressChanged += OnAddressChanged;
_currentPlace.IsFavoriteChanged += OnIsFavoriteChanged;
}
}
private void OnIsFavoriteChanged(object sender, bool isFavorite)
{
CurrentFavorite = isFavorite;
}
private void OnAddressChanged(object sender, string address)
{
CurrentAddress = address;
}
private void OnNameChanged(object sender, string name)
{
CurrentName = name;
}
}
}
|
5fadbda7653e837d5d690bd91a29c8e628a6eb0b
|
[
"C#"
] | 31 |
C#
|
tomaszpolanski/Sandbox
|
9c0e582098994e84473ff647c7fa4d408e61d957
|
02094b49d1f1f19375f09064f4bb599d752e501a
|
refs/heads/master
|
<repo_name>taityoukou/JavaProject_for_study<file_sep>/DesignPattern_State/readme.md
State Pattern (State)
The core idea: when an object's state changes, its behavior changes with it, which is easy to understand. Take QQ as an example: it has several statuses such as online, invisible and busy, each status corresponds to different operations, and your friends can also see your status. So the State pattern comes down to two points: 1. you can get different behavior by changing the state; 2. others can see the change at the same time.
In the example, State is the state class and Context performs the switching; a minimal sketch in the same spirit follows the output below.
Output:
execute the first opt!
execute the second opt!
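
A minimal Java sketch in the spirit of the example above; the field and method names are illustrative assumptions and may differ from the State and Context classes actually in this project:

```java
// State holds the current value and two operations; switching the value switches the behavior.
class State {
    private String value;
    void setValue(String value) { this.value = value; }
    void method1() { System.out.println("execute the first opt!"); }
    void method2() { System.out.println("execute the second opt!"); }
    void method() {
        if ("state1".equals(value)) {
            method1();
        } else if ("state2".equals(value)) {
            method2();
        }
    }
}

// Context simply delegates to whatever state it holds.
class Context {
    private final State state;
    Context(State state) { this.state = state; }
    void method() { state.method(); }
}

public class StateSketch {
    public static void main(String[] args) {
        State state = new State();
        Context context = new Context(state);
        state.setValue("state1");
        context.method(); // execute the first opt!
        state.setValue("state2");
        context.method(); // execute the second opt!
    }
}
```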
Because of this property, the State pattern is used quite a lot in day-to-day development, especially on websites, where we sometimes want to vary part of the functionality based on some attribute of an object, for example simple permission control.<file_sep>/README.md
# JavaProject_for_study
JavaProject_for_study
These projects were created while I was studying Java.
<file_sep>/UserRegister_SSH3/src/com/anson/dao/userDaoImpl.java
package com.anson.dao;
import java.util.List;
import org.hibernate.Session;
import org.hibernate.Transaction;
import com.anson.model.User;
import com.anson.util.HibernateUtil;
public class userDaoImpl implements userDao {
@Override
public void insert(User user) {
Session session = HibernateUtil.getSessionFactory().getCurrentSession();
Transaction tx =session.beginTransaction();
session.save(user);
tx.commit();
tx=null;
}
@Override
public void update(User user) {
// TODO Auto-generated method stub
}
@Override
public void delete(User user) {
// TODO Auto-generated method stub
}
@Override
public List<User> find(User user) {
Session session = HibernateUtil.getSessionFactory().getCurrentSession();
Transaction tx =session.beginTransaction();
@SuppressWarnings("unchecked")
List<User> userlist = session.createQuery(
"select user from User user where user.username = :username" )
.setParameter( "username", user.getUsername() ).getResultList();
tx.commit();
tx=null;
return userlist;
}
}
<file_sep>/DynamicProxy_Static/src/TimeProxy.java
public class TimeProxy implements Moveable,Stopable{
Moveable t;
public TimeProxy(Moveable t) {
super();
this.t = t;
}
@Override
public void move() {
System.out.println("Time loging...");
t.move();
}
@Override
public void stop() {
System.out.println("Stop");
}
}
<file_sep>/DesignPattern_Iterator/src/Iterator_Demo.java
public class Iterator_Demo {
public static void main(String[] args) {
NameRepository nameRepository = new NameRepository();
for (Iterator iterator = nameRepository.getIterator(); iterator.hasNext();) {
System.out.println("Name=" + iterator.next().toString());
}
}
}
/*
The Iterator pattern
The Iterator pattern (Iterator Pattern) is a very commonly used design pattern in Java and .Net environments. It is used to access the elements of a collection object sequentially
without needing to know the collection's underlying representation. The Iterator pattern is a behavioral pattern.
Introduction
Intent: provide a way to access the elements of an aggregate object sequentially without exposing the object's internal representation.
Main problem solved: traversing an entire aggregate object in different ways.
When to use: when traversing an aggregate object.
How to solve it: hand the responsibility of moving between elements to an iterator rather than to the aggregate itself.
Key code: define an interface with hasNext and next.
Real-world example: the iterator in Java.
Advantages: 1. it supports traversing an aggregate object in different ways.
2. iterators simplify the aggregate class.
3. there can be several traversals over the same aggregate at the same time.
4. with the Iterator pattern it is easy to add new aggregate classes and iterator classes without modifying existing code.
Disadvantages: because the pattern separates the responsibility of storing data from that of traversing it, every new aggregate class needs a corresponding new iterator class, so the number of classes grows in pairs,
which increases the complexity of the system to some degree.
Typical scenarios: 1. accessing the contents of an aggregate object without exposing its internal representation.
2. providing several ways to traverse an aggregate object.
3. providing a single unified interface for traversing different aggregate structures.
Note: the Iterator pattern separates the traversal behavior from the collection object and puts it into a dedicated iterator class, which both keeps the collection's internal structure hidden
and lets external code access the collection's data transparently.*/<file_sep>/DesignPattern_Command/readme.md
The Command pattern (Command Pattern) is a data-driven design pattern and belongs to the behavioral patterns. A request is wrapped in an object as a command and passed to an invoker object. The invoker looks for a suitable object that can handle the command and hands the command to that object, which executes it.
Definition
Wrap a request from the client in an object, so that clients can be parameterized with different requests. It decouples the "requester of an action" from the "performer of the action", achieving loose coupling between the two so the design can adapt to change, and separating the parts that vary from the parts that do not.
Roles
Command
Defines the command interface and declares the execute method.
ConcreteCommand
The object implementing the command interface; it is only a "shell" implementation that usually holds a receiver and calls the receiver's functionality to carry out the operation the command is meant to perform.
Receiver
The receiver, the object that really executes the command. Any class can be a receiver as long as it can provide the functionality the command requires.
Invoker
Asks the command object to carry out the request; it usually holds command objects, possibly many of them. This is where a command is actually triggered and asked to perform its operation, in other words the entry point for using command objects.
Client
Creates the concrete command objects and sets their receivers. Note that this is not the client in the usual sense; it assembles commands and receivers, so it may be easier to think of this Client as an assembler, because the real user of the commands triggers execution through the Invoker.
Advantages
1. It lowers the coupling between objects.
2. New commands can be added to the system easily.
3. It is fairly easy to design a composite (macro) command.
4. Calling the same method can produce different behavior.
Disadvantages
Using the Command pattern may leave a system with too many concrete command classes: every command needs its own concrete class, so some systems may need a large number of them, which limits how practical the pattern is.
When to use
1. The system needs to decouple the request caller from the request receiver so that they do not interact directly.
2. The system needs to specify, queue and execute requests at different points in time.
3. The system needs to support undo and redo of commands.
4. The system needs to group a set of operations together, in other words support macro commands.
Application
Simulate operating a TV with power-on, power-off and change-channel commands (a minimal Java sketch of this example is given at the end of this readme).
Execution result:
TV turn on!
Channel changed to 3
TV turn off!
Summary
1. The essence of the Command pattern is to encapsulate commands, separating the responsibility for issuing a command from the responsibility for executing it.
2. Every command is an operation: the requesting side issues a request asking for an operation to be performed; the receiving side receives the request and performs the operation.
3. The Command pattern keeps the requesting side and the receiving side independent of each other: the requester does not need to know the receiver's interface, let alone how the request is received, whether or when the operation is executed, or how it is carried out.
4. The Command pattern turns the request itself into an object, which can be stored and passed around like any other object.
5. The key to the pattern is the abstract command interface: the sender programs against that interface, and only concrete commands implementing it can be associated with a receiver.
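
A minimal Java sketch of the TV example, with one class per role listed above; the names (Tv, OnCommand, OffCommand, ChangeChannelCommand, Invoker, CommandSketch) are illustrative assumptions rather than the exact classes in this project:

```java
// Command: the common interface every concrete command implements.
interface Command {
    void execute();
}

// Receiver: the object that actually does the work.
class Tv {
    void turnOn() { System.out.println("TV turn on!"); }
    void turnOff() { System.out.println("TV turn off!"); }
    void changeChannel(int channel) { System.out.println("Channel changed to " + channel); }
}

// Concrete commands hold the receiver and delegate to it.
class OnCommand implements Command {
    private final Tv tv;
    OnCommand(Tv tv) { this.tv = tv; }
    public void execute() { tv.turnOn(); }
}

class OffCommand implements Command {
    private final Tv tv;
    OffCommand(Tv tv) { this.tv = tv; }
    public void execute() { tv.turnOff(); }
}

class ChangeChannelCommand implements Command {
    private final Tv tv;
    private final int channel;
    ChangeChannelCommand(Tv tv, int channel) { this.tv = tv; this.channel = channel; }
    public void execute() { tv.changeChannel(channel); }
}

// Invoker: triggers whichever command it currently holds.
class Invoker {
    private Command command;
    void setCommand(Command command) { this.command = command; }
    void action() { command.execute(); }
}

// Client: assembles receiver, commands and invoker, then triggers them.
public class CommandSketch {
    public static void main(String[] args) {
        Tv tv = new Tv();
        Invoker invoker = new Invoker();
        invoker.setCommand(new OnCommand(tv));
        invoker.action();                          // TV turn on!
        invoker.setCommand(new ChangeChannelCommand(tv, 3));
        invoker.action();                          // Channel changed to 3
        invoker.setCommand(new OffCommand(tv));
        invoker.action();                          // TV turn off!
    }
}
```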
<file_sep>/Spring_Hibernate/src/com/anson/dao/UserDao.java
package com.anson.dao;
import com.anson.model.User;
public interface UserDao {
public void userAdd(User user);
public int userDel(User user);
}
<file_sep>/Spring_Hibernate_Template/src/com/anson/dao/UserDaoImpl.java
package com.anson.dao;
import javax.annotation.Resource;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.orm.hibernate5.HibernateTemplate;
import org.springframework.stereotype.Component;
import com.anson.model.User;
@Component("user")
public class UserDaoImpl implements UserDao {
private HibernateTemplate hibernateTemplate;
/* private SessionFactory sessionFactory;
public SessionFactory getSessionFactory() {
return sessionFactory;
}
@Resource
public void setSessionFactory(SessionFactory sessionFactory) {
this.sessionFactory = sessionFactory;
}*/
public HibernateTemplate getTemplate() {
return hibernateTemplate;
}
@Resource
public void setTemplate(HibernateTemplate template) {
this.hibernateTemplate = template;
}
@Override
public void userAdd(User user) {
/* Session s = sessionFactory.getCurrentSession();
s.save(user);*/
hibernateTemplate.save(user);
System.out.println("user saved!");
}
@Override
public int userDel(User user) {
// TODO Auto-generated method stub
return 0;
}
}
<file_sep>/DesignPattern_Facade/src/UserTest.java
public class UserTest {
public static void main(String[] args) {
Computer computer = new Computer();
computer.startup();
computer.shutdown();
}
}
//Facade pattern
/*The facade pattern addresses dependencies between classes. Much like Spring lets you move the relationships between classes into configuration files,
the facade pattern puts those relationships inside a single Facade class, lowering the coupling between the classes; no interface is involved in this pattern.
In this example we use the process of a user starting up a computer.
If we did not have the Computer class, then CPU, Memory, and Disk would have to hold instances of each other and interact directly,
which would create heavy dependencies: changing one class could force changes in the others, which is not what we want.
With the Computer class, their relationships are kept inside Computer, which decouples them.
That is the facade pattern!
*/
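// The Computer facade and its subsystems (CPU, Memory, Disk) are defined elsewhere in this
// repository and are not part of this excerpt. Below is a minimal sketch of what they might
// look like; the subsystem method names and printed messages are assumptions, only
// startup()/shutdown() are taken from the demo above.
class CPU {
    void start() { System.out.println("CPU started"); }
    void shutdown() { System.out.println("CPU stopped"); }
}

class Memory {
    void start() { System.out.println("Memory loaded"); }
    void shutdown() { System.out.println("Memory released"); }
}

class Disk {
    void start() { System.out.println("Disk spun up"); }
    void shutdown() { System.out.println("Disk parked"); }
}

// The facade hides the subsystem wiring behind two coarse-grained calls,
// so UserTest never touches CPU, Memory, or Disk directly.
class Computer {
    private final CPU cpu = new CPU();
    private final Memory memory = new Memory();
    private final Disk disk = new Disk();

    void startup() {
        cpu.start();
        memory.start();
        disk.start();
    }

    void shutdown() {
        disk.shutdown();
        memory.shutdown();
        cpu.shutdown();
    }
}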
<file_sep>/UserRegister_SSH3/src/com/anson/service/UserManagerImpl.java
package com.anson.service;
import com.anson.dao.userDao;
import com.anson.dao.userDaoImpl;
import com.anson.model.User;
public class UserManagerImpl implements UserManager {
@Override
public boolean isExist(User user) {
userDao userdao = new userDaoImpl();
if (userdao.find(user).isEmpty()){
return false;
}else{
return true;
}
}
@Override
public void add(User user) {
userDao userdao = new userDaoImpl();
userdao.insert(user);
}
}
<file_sep>/DesignPattern_Chain_of_Responsibility/readme.md
Chain of Responsibility Pattern
As the name suggests, the Chain of Responsibility Pattern creates a chain of receiver objects for a request. Based on the type of request, it decouples the sender of the request from its receivers. This pattern is a behavioral pattern.
In this pattern, each receiver normally holds a reference to another receiver. If an object cannot handle the request, it passes the same request on to the next receiver, and so on.
Introduction
Intent: avoid coupling the sender of a request to its receiver by giving more than one object a chance to handle the request; link these objects into a chain and pass the request along the chain until some object handles it.
Main problem solved: the handlers on the chain are responsible for processing the request, and the client only needs to send the request onto the chain without caring about how it is processed or forwarded, so the chain decouples the sender from the handlers.
When to use: when a message has to pass through many filtering stages while being processed.
How it is solved: all intercepting classes implement a common interface.
Key code: the Handler aggregates an instance of its own type; in handleRequest it decides whether it can handle the request, and if the condition is not met it passes the request on to whichever handler was set as the next one.
Real-world examples: 1. "Passing the flower while the drum beats" in Dream of the Red Chamber. 2. Event bubbling in JavaScript. 3. In Java web development, Apache Tomcat's handling of Encoding, Struts2 interceptors, and Servlet/JSP Filters.
Advantages: 1. Reduced coupling: the sender and receiver of a request are decoupled. 2. Simpler objects: an object does not need to know the structure of the chain. 3. More flexibility in assigning responsibilities: by changing the members of the chain or their order, responsibilities can be added or removed dynamically. 4. Adding a new request-handling class is easy.
Disadvantages: 1. There is no guarantee that a request will be handled. 2. System performance is affected to some degree, debugging is less convenient, and circular calls are possible. 3. Runtime behavior can be hard to observe, which hinders troubleshooting.
Usage scenarios: 1. Multiple objects can handle the same request, and which one actually handles it is determined automatically at runtime. 2. A request should be submitted to one of several objects without naming the receiver explicitly. 3. The set of objects that handle a request should be specifiable dynamically.
Note: this pattern appears in many places in Java web development.
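The logger classes behind the demo output below are not part of this excerpt; the following is a minimal sketch of the chain they suggest. The level constants, class names, and wiring order are assumptions inferred from the output, not the repository's original code.

```java
// Abstract handler: holds the next link and forwards every message down the chain.
abstract class AbstractLogger {
    static final int INFO = 1;
    static final int DEBUG = 2;
    static final int ERROR = 3;

    protected int level;
    protected AbstractLogger nextLogger;

    void setNextLogger(AbstractLogger nextLogger) {
        this.nextLogger = nextLogger;
    }

    void logMessage(int level, String message) {
        if (this.level <= level) {
            write(message);
        }
        if (nextLogger != null) {
            nextLogger.logMessage(level, message);
        }
    }

    protected abstract void write(String message);
}

class ConsoleLogger extends AbstractLogger {
    ConsoleLogger(int level) { this.level = level; }
    protected void write(String message) { System.out.println("Standard Console::Logger: " + message); }
}

class FileLogger extends AbstractLogger {
    FileLogger(int level) { this.level = level; }
    protected void write(String message) { System.out.println("File::Logger: " + message); }
}

class ErrorLogger extends AbstractLogger {
    ErrorLogger(int level) { this.level = level; }
    protected void write(String message) { System.out.println("Error Console::Logger: " + message); }
}

// Wiring consistent with the output below: ERROR -> FILE(DEBUG) -> CONSOLE(INFO).
class LoggerChain {
    static AbstractLogger build() {
        AbstractLogger error = new ErrorLogger(AbstractLogger.ERROR);
        AbstractLogger file = new FileLogger(AbstractLogger.DEBUG);
        AbstractLogger console = new ConsoleLogger(AbstractLogger.INFO);
        error.setNextLogger(file);
        file.setNextLogger(console);
        return error;
    }
}
```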
The demo produces the following output:
Standard Console::Logger: This is an information.
----------------
File::Logger: This is an debug level information.
Standard Console::Logger: This is an debug level information.
----------------
Error Console::Logger: This is an error information.
File::Logger: This is an error information.
Standard Console::Logger: This is an error information.<file_sep>/DesignPattern_TemplateMethod/src/TemplateTest.java
public class TemplateTest {
public static void main(String[] args) {
String exp = "8+8";
AbstractCalculator cal = new Plus();
int result = cal.calculate(exp, "\\+");
System.out.println(result);
}
}
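// AbstractCalculator and Plus are defined elsewhere in this repository and are not part of
// this excerpt. The sketch below is a minimal guess that matches the calculate("8+8", "\\+")
// call above; the split() helper and the calculate(int, int) hook names are assumptions.
abstract class AbstractCalculator {
    // Template method: fixes the flow (split, then compute) and leaves the
    // variable step to subclasses.
    public final int calculate(String exp, String opt) {
        int[] array = split(exp, opt);
        return calculate(array[0], array[1]);
    }

    // Invariant step shared by all subclasses.
    protected int[] split(String exp, String opt) {
        String[] parts = exp.split(opt);
        return new int[] { Integer.parseInt(parts[0]), Integer.parseInt(parts[1]) };
    }

    // Variable step, deferred to subclasses.
    protected abstract int calculate(int num1, int num2);
}

class Plus extends AbstractCalculator {
    @Override
    protected int calculate(int num1, int num2) {
        return num1 + num2;
    }
}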
/*
Template Method pattern
To explain the template method pattern: an abstract class has a main method plus 1..n other methods,
which can be abstract or concrete; a subclass extends the abstract class and overrides the abstract methods, and calling through the abstract class ends up invoking the subclass.
The main idea of the template method pattern: define an algorithm's flow and defer the concrete implementation of certain steps to subclasses, so that specific steps of the flow can be "customized"
by different subclasses without changing the flow itself.
The main idea of the strategy pattern: allow different algorithms to be swapped for one another without affecting the client.
In terms of intent, the template method emphasizes:
1) Defining a line (the algorithm flow) along which several points can vary (their implementation is completed in subclasses); those points will definitely be executed, and always in the prescribed order.
2) The algorithm flow has a single entry point, and access to the variable points is restricted (the variable points are usually defined as protected virtual methods).
The strategy pattern focuses instead on: a "strategy" is a whole, complete algorithm, and the algorithm can be replaced as a whole. With the template method, only specific points can be replaced, and the algorithm flow itself is fixed.
Seen at this level of detail, a template method and a family of strategies are not equivalent.
*/<file_sep>/UserRegister_SSH/src/com/anson/service/UserManager.java
package com.anson.service;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import com.anson.dao.userDao;
import com.anson.model.User;
public class UserManager implements userDao{
private SessionFactory sessionFactory;
private StandardServiceRegistry serviceRegistry;
@Override
public boolean isExist(User user) {
return false;
}
@Override
public void add(User user) {
serviceRegistry = new StandardServiceRegistryBuilder().configure().build();
sessionFactory = new MetadataSources(serviceRegistry).buildMetadata().buildSessionFactory();
Session session = sessionFactory.getCurrentSession();
Transaction tx =session.beginTransaction();
session.save(user);
tx.commit();
tx=null;
StandardServiceRegistryBuilder.destroy(serviceRegistry);
}
}
<file_sep>/DesignPattern_Command/src/TV.java
//命令接收者Receiver
public class TV {
private int CurrentChannel=0;
public void turnOn(){
System.out.println("TV turn on!");
}
public void turnOff(){
System.out.println("TV turn off!");
}
public void changeChannel(int channel){
setCurrentChannel(channel);
System.out.println("Channel changed to "+channel);
}
public int getCurrentChannel() {
return CurrentChannel;
}
public void setCurrentChannel(int currentChannel) {
CurrentChannel = currentChannel;
}
}
|
9f2cdd352cd17a8309bbe7e40f5b80ab36624c8a
|
[
"Markdown",
"Java"
] | 14 |
Markdown
|
taityoukou/JavaProject_for_study
|
7f89bb699d4a6818f86a1e1762f0a03f678ec72e
|
75a8a09713071e9c76af63f2c446298c949fbbcd
|
refs/heads/master
|
<file_sep>#pragma once
#include<vector>
#include "Tonomat.h"
#include <deque>
#include <list>
#include <iostream>
#include <algorithm>
#include <iterator>
#include <fstream>
using namespace std;
template<typename T>
class RepoT {
private:
vector<T> el;
public:
RepoT();
void addItem(T t);
void deleteItem(T t);
void updateItem(int t,T n);
vector<T> getAll();
vector<T> getPoz();
int getSize();
~RepoT();
};
template<class T>
RepoT<T>::RepoT() {
}
template<class T>
RepoT<T>::~RepoT() {
}
template<class T>
int RepoT<T>::getSize()
{
return this->el.size();
}
template<class T>
void RepoT<T>::addItem(T item)
{
this->el.push_back(item);
}
template<class T>
void RepoT<T>::deleteItem(T t)
{
typename vector<T>::iterator it;
it = find(el.begin(), el.end(), t);
if (it != el.end())
el.erase(it);
}
template<class T>
void RepoT<T>::updateItem(int t,T n)
{
for (int i = 0; i < el.size(); i++) {
if (i == t) {
el[i] = n;
}
}
}
template<class T>
vector<T> RepoT<T>::getAll() {
return this->el;
}
template <class T>
vector<T> RepoT<T>::getPoz()
{
return this->el;
}<file_sep>#include "UI.h"
#include "RepoFile.h"
#include "RepoT.h"
#include "Service.h"
#include <iostream>
#include <fstream>
UI::UI(){}
UI::~UI(){}
void UI::addItem(int cod, int pret, string nume)
{
cout << "cod : ";
cin >> cod;
cout << "pret : ";
cin >> pret;
cout << "nume : ";
cin >> nume;
Tonomat t(cod, pret, nume);
s.addItem(t);
}
void UI::deleteItem(int cod, int pret, string nume)
{
cout << "Introduceti urmatoarele infornatii pe care doriti sa le stergeti\n";
cout << "cod : ";
cin >> cod;
cout << "pret : ";
cin >> pret;
cout << "nume : ";
cin >> nume;
Tonomat t(cod, pret, nume);
s.deleteItem(t);
}
void UI::updateItem()
{
cout << "introduceti codul produsului :";
int cod;
cin >> cod;
int pret;
cout << " introduceti noul pret :";
cin >> pret;
string nume;
cout << "introduceti noul nume : ";
cin >> nume;
this->updateNume(cod,nume);
this->updatePret(cod, pret);
}
void UI::updatePret(int cod, int modif)
{
Tonomat t;
t.setCod(cod);
t.setPret(modif);
for (int i = 0; i < s.getSize(); i++)
{
if (s.getAll().getPoz()[i].getCod() == cod)
{
t.setNume(s.getAll().getPoz()[i].getNume());
s.updateI(i, t);
break;
}
}
}
void UI::updateNume(int cod, string modif)
{
Tonomat t;
t.setCod(cod);
t.setNume(modif);
for (int i = 0; i < s.getSize(); i++)
{
if (cod == s.getAll().getPoz()[i].getCod())
{
t.setPret(s.getAll().getPoz()[i].getPret());
this->s.updateI(i, t);
break;
}
}
}
int UI::getPret(int cod)
{
for (int i = 0; i < s.getSize(); i++)
{
if (cod == s.getAll().getPoz()[i].getCod())
{
return s.getAll().getPoz()[i].getPret();
}
}
return -1;
}
int UI::findItem(int cod)
{
for (int i = 0; i < s.getSize(); i++)
{
if (cod == s.getAll().getPoz()[i].getCod())
{
return i;
}
}
return -1;
}
void UI::transaction()
{
cout << "Introduceti codul produsului pe care il doriti : ";
int cod;
cin >> cod;
int pret = this->getPret(cod);
int find = this->findItem(cod);
cout << "Produsului costa " << pret << " lei " << endl;
cout << "Introduceti monedele : ";
int monezi;
int suma_in_tonomat = 100;
cin >> monezi;
if (monezi < pret)
{
cout << endl;
cout << "Insuficienti bani!!! " << endl;
}
else if (monezi == pret)
{
cout << endl;
cout << "Tranzactie reusita !!! " << endl;
}
else
{
cout << endl;
cout << "Asteptati restul ... " << endl;
int cincizeci;
this->rest(monezi, pret, cincizeci);
if (suma_in_tonomat > cincizeci) {
cout << "Restul : \n";
cout << cincizeci << " monede de 50 de bani" << endl;
}
else
cout << "Tonomatul nu are sa va dea rest" << endl;
}
}
void UI::rest(int suma, int pret, int& cincizeci)
{
int rest;
rest = suma - pret;
rest = rest * 10;
cincizeci = rest / 5;
}
int UI::Meniu() {
int cod = 0;
int pret = 0;
string nume = " ";
vector<Tonomat> aux;
int optiune;
do
{
cout << "\nMENIU :\n";
cout << "1. Adaugare \n";
cout << "2. Afisare \n";
cout << "3. Stergere \n";
cout << "4. Modificare \n";
cout << "5. Cumparare\n";
cout << "Alegeti o optiune : ";
cin >> optiune;
switch (optiune) {
case 1: {
cout << "Introduceti urmatoarele infornatii pe care doriti sa le adaugati\n ";
addItem(cod, pret, nume);
break; }
case 2: {
cout << endl;
cout << "Produsele sunt: " << endl;
for (int i = 0; i < this->s.getSize(); i++) {
cout << s.getAll().getPoz()[i] << "\n";
}
std::ifstream f("Tonomat.txt");
if (f.is_open())
std::cout << f.rdbuf() << endl;
break; }
case 3: {
deleteItem(cod, pret, nume);
break;
}
case 4: {
updateItem();
break;
}
case 5: {
transaction();
break;
}
case 0: {
cout << "Programul se inchide\n";
break; }
}
} while (optiune != 0);
return 0;
}
<file_sep>#pragma once
#include <stdlib.h>
#include <iostream>
using namespace std;
class Tonomat {
private:
int cod;
int pret;
string nume;
public:
Tonomat();
Tonomat(int cod, int pret, string nume);
Tonomat(const Tonomat& t);
~Tonomat();
int getCod();
int getPret();
string getNume();
void setCod(int cod);
void setPret(int pret);
void setNume(string nume);
Tonomat& operator=(const Tonomat&);
bool operator==(const Tonomat&);
friend ostream& operator<<(ostream& os, const Tonomat& t);
};<file_sep>#include <iostream>
#include "RepoT.h"
#include "RepoFile.h"
#include "Tonomat.h"
#include "Service.h"
#include <assert.h>
#include <cassert>
#include "UI.h"
using namespace std;
void testeTonomat() {
Tonomat t(1, 10, "ciocolata");
assert(t.getCod() == 1);
assert(t.getPret() == 10);
assert(t.getNume() == "ciocolata");
t.setCod(2);
t.setPret(2);
t.setNume("croissante");
assert(t.getCod() == 2 && t.getPret() == 2 && t.getNume() == "croissante");
}
void testeRepoT_addItem() {
Tonomat t1(5, 5, "ciocolata");
RepoT<Tonomat> rt;
rt.addItem(t1);
assert(rt.getAll()[0].getCod() == 5);
assert(rt.getAll()[0].getPret() == 5);
assert(rt.getAll()[0].getNume() == "ciocolata");
assert(rt.getSize() == 1);
}
void testeRepoT_deleteItem()
{
Tonomat t2(11, 100, "suc");
RepoT<Tonomat> rt;
rt.addItem(t2);
rt.deleteItem(t2);
assert(rt.getSize() == 0);
}
void testeService_addItem() {
Tonomat t1(5, 500, "ciocolata");
Service s;
s.addItem(t1);
assert(s.getSize() == 1);
}
void testeService_deleteItem()
{
Tonomat t2(11, 100, "suc");
Service s;
s.addItem(t2);
s.deleteItem(t2);
assert(s.getSize() == 0);
}
void testeRepoFile_addItem()
{
Tonomat t1(5, 500, "ciocolata");
RepoFile<Tonomat> rf;
rf.loadFromFile("Tonomat.txt");
rf.addItem(t1);
assert(rf.getSize() == 2);
}
void testeRepoFile_deleteItem()
{
Tonomat t2(11, 100, "suc");
RepoFile<Tonomat> rf;
rf.loadFromFile("Tonomat.txt");
rf.deleteItem(t2);
assert(rf.getSize() == 2);
}
void testeRest()
{
UI u;
int cinci;
u.rest(10, 9, cinci);
assert(cinci == 2 );
}
void teste() {
testeTonomat();
testeRepoT_addItem();
testeRepoT_deleteItem();
testeService_addItem();
testeService_deleteItem();
testeRepoFile_addItem();
testeRepoFile_deleteItem();
testeRest();
cout << "Testare cu succes!";
}<file_sep>#include "Service.h"
Service::Service()
{
}
Service::~Service()
{
}
Service::Service(RepoT<Tonomat>& t)
{
this->s = t;
}
RepoT<Tonomat> Service::getAll()
{
return this->s;
}
void Service::addItem(Tonomat& t)
{
s.addItem(t);
}
void Service::deleteItem(Tonomat& t)
{
s.deleteItem(t);
}
void Service::updateI(int t,Tonomat & nou)
{
this->s.updateItem(t, nou);
}
int Service::getSize()
{
return s.getSize();
}
<file_sep>#pragma once
#include "Service.h"
class UI {
private:
Service s;
public:
UI();
void addItem(int cod, int pret, string nume);
void deleteItem(int cod, int pret, string nume);
void updateItem();
void rest(int suma, int pret, int& cincizeci);
void updatePret(int cod, int modif);
void updateNume(int cod, string modif);
int findItem(int cod);
int getPret(int cod);
void transaction();
int Meniu();
~UI();
};<file_sep>#pragma once
#include "Tonomat.h"
#include <deque>
#include <fstream>
#include <string>
#include <vector>
#include <algorithm>
using namespace std;
template <typename T>
class RepoFile {
private:
vector<T> el;
string file;
public:
RepoFile() {};
void loadFromFile(string fileName);
void addItem(T t);
void deleteItem(T t);
void updateItem(int t, T nou);
vector<T> getAll();
int getSize();
void saveToFile();
~RepoFile() {};
};
template <class T>
void RepoFile<T>::loadFromFile(string fileName)
{
el.clear();
file = fileName;
ifstream f(fileName);
int cod;
int pret;
string nume;
// Read records until extraction fails; testing eof() alone would process the last record twice.
while (f >> cod >> pret >> nume) {
if (cod != 0) {
T t(cod, pret, nume);
el.push_back(t);
}
}
f.close();
}
template <class T>
void RepoFile<T>::saveToFile()
{
ofstream f(file);
for (size_t i = 0; i < el.size(); i++) {
f << el[i];
}
f.close();
}
template <class T>
void RepoFile<T>::addItem(T t) {
this->el.push_back(t);
}
template <class T>
void RepoFile<T>::deleteItem(T t) {
typename vector<T>::iterator it;
it = find(el.begin(), el.end(), t);
if (it != el.end())
el.erase(it);
}
template <class T>
void RepoFile<T>::updateItem(int t, T nou) {
for (int i = 0; i < el.size(); i++) {
if (i == t) {
el[i] = nou;
}
}
}
template <class T>
int RepoFile<T>::getSize() {
return this->el.size();
}
template <class T>
vector<T> RepoFile<T>::getAll() {
return this->el;
}
<file_sep>#include "Tonomat.h"
#include "RepoT.h"
#include "teste.h"
#include "Service.h"
#include "UI.h"
#include "RepoFile.h"
#include <iostream>
using namespace std;
int main() {
UI aux;
aux.Meniu();
teste();
return 0;
}<file_sep>#pragma once
#include "Tonomat.h"
#include "RepoT.h"
class Service
{
private:
RepoT<Tonomat> s;
public:
Service();
Service(RepoT<Tonomat>& t);
void addItem(Tonomat& t);
void deleteItem(Tonomat& t);
void updateI(int t,Tonomat & nou);
RepoT<Tonomat> getAll();
int getSize();
~Service();
};<file_sep>#include "Tonomat.h"
Tonomat::Tonomat()
{
this->cod = 0;
this->pret = 0;
this->nume = " ";
}
Tonomat::~Tonomat()
{
}
Tonomat::Tonomat(int cod, int pret, string nume) {
this->cod = cod;
this->pret = pret;
this->nume = nume;
}
Tonomat::Tonomat(const Tonomat& t) {
this->cod = t.cod;
this->pret = t.pret;
this->nume = t.nume;
}
void Tonomat::setCod(int cod) {
this->cod = cod;
}
void Tonomat::setPret(int pret) {
this->pret = pret;
}
void Tonomat::setNume(string nume) {
this->nume = nume;
}
int Tonomat::getCod() {
return this->cod;
}
int Tonomat::getPret() {
return this->pret;
}
string Tonomat::getNume() {
return this->nume;
}
Tonomat& Tonomat::operator=(const Tonomat& t)
{
this->setCod(t.cod);
this->setPret(t.pret);
this->setNume(t.nume);
return *this;
}
bool Tonomat::operator==(const Tonomat& t) {
return (this->cod == t.cod) && (this->pret == t.pret);
}
ostream& operator<<(ostream& os, const Tonomat& t)
{
os << t.cod << " " << t.pret << " " << t.nume;
return os;
}
|
69947c72fa3746ecb8cadc37381741655227620c
|
[
"C++"
] | 10 |
C++
|
LauraDiosan-CS/lab5-6-template-stl-Lorena-Pop
|
fffe9691137bf1bccd46acfc01dede42ff72d02a
|
66548e60edc2e7c188b35b571816b4607d91b6bb
|
refs/heads/master
|
<file_sep>import React from "react";
import "./App.css";
import Website from "./website/Website";
import Client from "./client/Client";
class App extends React.Component {
constructor() {
super();
this.state = {
username: "",
showLogin: false
};
}
login = username => this.setState({ username });
logout = () => this.setState({ username: "" });
handleLogin = () => this.setState({ showLogin: !this.state.showLogin });
render() {
return (
<div className="main-container">
{this.state.username === "" ? (
<Website
handleLogin={this.handleLogin}
showLogin={this.state.showLogin}
/>
) : (
<Client
handleLogin={this.handleLogin}
showLogin={this.state.showLogin}
/>
)}
</div>
);
}
}
export default App;
<file_sep>import React from "react";
const LoginMisc = props => {
return (
<React.Fragment>
<div className="login-links">
<a href="##">Forgot Password</a>
</div>
<div className="or">
<hr className="bar" />
<span>OR</span>
<hr className="bar" />
</div>
<a href="##" className="secondary-btn">
Create an account
</a>
<footer id="main-footer">
<p>Copyright © 2019, branmanagement All Rights Reserved</p>
<div className="misc-links">
<a href="##" onClick={props.login}>
terms of user
</a>
|
<a href="##" onClick={props.logout}>
Privacy Policy
</a>
</div>
</footer>
</React.Fragment>
);
};
export default LoginMisc;
<file_sep>import React from "react";
import propTypes from "prop-types";
const Table = props => {
return (
<div className="table-head">
<table className="table">
<thead className="thead">
<tr>
<td>
<h1>Your Appointments</h1>
</td>
</tr>
<tr>
<th>Day</th>
<th>Time</th>
<th>Hairstyle</th>
<th>Stylist</th>
<th>Status</th>
</tr>
</thead>
<tbody className="tbody">
<tr>
<td>{props.day}</td>
<td>{props.time}</td>
<td>{props.hairstyle}</td>
<td>{props.stylist}</td>
<td>{props.status}</td>
</tr>
</tbody>
</table>
</div>
);
};
Table.propTypes = {
day: propTypes.string.isRequired,
time: propTypes.string.isRequired,
hairstyle: propTypes.string.isRequired,
stylist: propTypes.string.isRequired,
status: propTypes.string.isRequired
};
Table.defaultProps = {
day: "Month Day, Year",
time: "9:30am",
hairstyle: "hairstyle name",
stylist: "top stylist",
status: "confirmed"
};
export default Table;
<file_sep>import React from "react";
import propTypes from "prop-types";
const ServicesCard = props => {
return (
<React.Fragment>
<article className="service-item">
<i className="fas fa-ruler-horizontal" />
<div className="service-title">
<h2>{props.service}</h2>
<ul className="service-list" />
<li>{props.name}</li>
<button className="service-button">Book Today</button>
</div>
</article>
</React.Fragment>
);
};
ServicesCard.propTypes = {
service: propTypes.string.isRequired,
name: propTypes.string.isRequired
};
ServicesCard.defaultProps = {
service: "Hair Service",
name: "basic service"
};
export default ServicesCard;
<file_sep>import React from "react";
import "../../css/welcome.css";
const Home = ({ name }) => {
return (
<section id="home">
<div className="home-title">
<h1 className="home-text">Shear Elegance</h1>
<div className="home-underline" />
<div className="home-btn">
<button type="button">Book Today</button>
<button type="button">Contact Us</button>
</div>
</div>
</section>
);
};
export default Home;
<file_sep>import React from "react";
class Newsfeed extends React.Component {
render() {
return (
<div className="newsfeed-container">
<h1>Newsfeed here!</h1>
</div>
);
}
}
export default Newsfeed;
<file_sep>import React from "react";
import propTypes from "prop-types";
const HairCard = props => {
return (
<div className="profile-card">
<div className="card">
<div className="more">
<i className="fa fa-info-circle" />
</div>
<div className="appointment">
<i className="fas fa-cut" />
<h4>Hair Card</h4>
</div>
<ul className="appointment-facts">
<li className="facts">Hair is: {props.hair_is}</li>
<li className="facts">Hair type: {props.hair_type}</li>
<li className="facts">Hair Length: {props.hair_length}</li>
<li className="facts">last_time_colored: {props.last_colored}</li>
<li className="facts">have_a_perm_or_relaxer: {props.last_perm}</li>
</ul>
<div className="btn-group">
<button className="profile-btn">See More</button>
<button className="profile-btn">Edit</button>
</div>
</div>
</div>
);
};
HairCard.propTypes = {
hair_is: propTypes.string.isRequired,
hair_type: propTypes.string.isRequired,
hair_length: propTypes.string.isRequired,
last_colored: propTypes.string.isRequired,
last_perm: propTypes.string.isRequired
};
HairCard.defaultProps = {
hair_is: " unknown",
hair_type: " unknown",
hair_length: " unknown",
last_colored: " unknown",
last_perm: " unknown"
};
export default HairCard;
<file_sep>import React from "react";
import Horizontal from "../navbar/Horizontal";
import "../App.css";
const Navbar = props => {
return (
<section className="nav-container">
<Horizontal handleLogin={props.handleLogin} />
</section>
);
};
export default Navbar;
<file_sep>import React from "react";
import TestimonialQuotes from "./TestimonialQuotes";
import "../../../css/welcome.css";
const Testimonial = props => {
return (
<section id="testimonial">
<TestimonialQuotes />
</section>
);
};
export default Testimonial;
<file_sep>import React from "react";
import "../../../css/welcome.css";
class About extends React.Component {
render() {
// console.log(window.location);
return (
<section id="about">
<h1>We Are Professionals</h1>
<button type="button">Contact Us</button>
</section>
);
}
}
export default About;
<file_sep>import React from "react";
import propTypes from "prop-types";
const HairCard2 = props => {
return (
<div className="profile-card">
<div className="card">
<div className="back">
<div className="go-back">
<i className="fa fa-chevron-circle-left" />
</div>
<ul className="appointment-facts">
<li className="facts">
suffered_from_hair_loss: {props.suffered_from_hair_loss}
</li>
<li className="facts">
been_diagnosed_with_alopecia:
{props.been_diagnosed_with_alopecia}
</li>
<li className="facts">
take_any_medication:{props.take_any_medication}
</li>
<li className="facts">
been_pregnant_in_the_last_6_months:
{props.been_pregnant_in_the_last_6_months}
</li>
<li className="facts">
suffer_from_ezcema_to_the_scalp:
{props.suffered_from_ezcema_to_the_scalp}
</li>
<li className="facts">
suffer_from_psoriasis_to_the_scalp:
{props.suffered_psoriasis_to_the_scalp}
</li>
<li className="facts">
have_a_sensitive_scalp:
{props.have_a_sensitive_scalp}
</li>
<li className="facts">
any_known_allergies: {props.any_known_allergies}
</li>
<li className="facts">
frequently_swim_or_go_to_the_gym:
{props.frequently_swim_or_go_to_the_gym}
</li>
<li className="facts">
currently_have_colour_in_your_hair:
{props.currently_have_colour_in_your_hair}
</li>
<li className="facts">
used_hair_extensions_before:{props.used_hair_extensions_before}
</li>
<li className="facts">
which_type_did_you_use: {props.which_type_did_you_use}
</li>
</ul>
</div>
</div>
</div>
);
};
HairCard2.propTypes = {
suffered_from_hair_loss: propTypes.bool.isRequired,
been_diagnosed_with_alopecia: propTypes.bool.isRequired,
take_any_medication: propTypes.bool.isRequired,
been_pregnant_in_the_last_6_months: propTypes.bool.isRequired,
suffer_from_ezcema_to_the_scalp: propTypes.bool.isRequired,
suffer_from_psoriasis_to_the_scalp: propTypes.bool.isRequired,
have_a_sensitive_scalp: propTypes.bool.isRequired,
any_known_allergies: propTypes.string.isRequired,
frequently_swim_or_go_to_the_gym: propTypes.bool.isRequired,
currently_have_colour_in_your_hair: propTypes.bool.isRequired,
used_hair_extensions_before: propTypes.bool.isRequired,
which_type_did_you_use: propTypes.string.isRequired
};
HairCard2.defaultProps = {
suffered_from_hair_loss: false.toString(),
been_diagnosed_with_alopecia: false.toString(),
take_any_medication: false.toString(),
been_pregnant_in_the_last_6_months: false.toString(),
suffer_from_ezcema_to_the_scalp: false.toString(),
suffer_from_psoriasis_to_the_scalp: false.toString(),
have_a_sensitive_scalp: false.toString(),
any_known_allergies: false.toString(),
frequently_swim_or_go_to_the_gym: false.toString(),
currently_have_colour_in_your_hair: false.toString(),
used_hair_extensions_before: false.toString(),
which_type_did_you_use: "n/as"
};
export default HairCard2;
<file_sep>import React from "react";
import { Route, Switch, Redirect } from "react-router-dom";
import propTypes from "prop-types";
import Profile from "./profile/Profile";
import Appointment from "./appointment/Appointment";
import Line from "./line/Line";
import Dashboard from "./dashboard/Dashboard";
import Newsfeed from "./newsfeed/Newsfeed";
import SideBar from "../navbar/SideBar";
import Header from "../navbar/Header";
import Login from "../login/Login";
class Client extends React.Component {
render() {
return (
<section id="client">
{this.props.showLogin ? null : <Header />}
<Route
path="/login"
render={() =>
this.props.showLogin ? <Login /> : <Redirect to="/profile" />
}
/>
<div className="client-container">
{this.props.showLogin ? null : (
<SideBar handleLogin={this.props.handleLogin} />
)}
<Switch>
<Route path="/profile" component={Profile} />
<Route path="/appointment" component={Appointment} />
<Route path="/line" component={Line} />
<Route path="/dashboard" component={Dashboard} />
<Route path="/newsfeed" component={Newsfeed} />
<Route render={() => <p>Not Found</p>} />
</Switch>
</div>
</section>
);
}
}
Client.propTypes = {
showLogin: propTypes.bool.isRequired,
handleLogin: propTypes.func.isRequired
};
Client.defaultProps = {
showLogin: false
};
export default Client;
<file_sep>import React from "react";
import propTypes from "prop-types";
const ContactCard = props => {
return (
<React.Fragment>
<h3>{props.comp}</h3>
<p>{props.address}</p>
<p>{props.city}</p>
<p>{props.zip}</p>
</React.Fragment>
);
};
ContactCard.propTypes = {
comp: propTypes.string.isRequired,
address: propTypes.string.isRequired,
city: propTypes.string.isRequired,
state: propTypes.string.isRequired,
zip: propTypes.number.isRequired
};
ContactCard.defaultProps = {
comp: "Find Us Here",
address: "1234 Salon Blvd",
city: "Curly",
state: "New Jersey",
zip: parseInt("09873")
};
export default ContactCard;
<file_sep>import React from "react";
class Dashboard extends React.Component {
render() {
return (
<div className="dashboard-container">
<h1>Dashboard here!</h1>
</div>
);
}
}
export default Dashboard;
<file_sep>import React, { Component } from "react";
class ContactDetail extends Component {
render() {
return (
<React.Fragment>
<form className="contactform">
<h3>contact us</h3>
<input
type="text"
className="contact-inputs"
placeholder="Enter name..."
name="name"
/>
<input
type="email"
className="contact-inputs"
placeholder="Enter email..."
name="email"
/>
<textarea
name="message"
className="contact-message"
cols="30"
rows="10"
placeholder="Your message..."
/>
<button type="button">Submit Message</button>
</form>
</React.Fragment>
);
}
}
export default ContactDetail;
<file_sep>import React from "react";
import ContactMap from "../contact/ContactMap";
import ContactForm from "../contact/ContactForm";
import Title from "../../../components/Title";
import "../../../css/welcome.css";
class Contact extends React.Component {
render() {
return (
<section id="contact">
<Title name="Contact Us" />
<div className="contact-container">
<ContactMap />
<ContactForm />
</div>
</section>
);
}
}
export default Contact;
<file_sep>import React from "react";
import { Link } from "react-router-dom";
const LoginShowcase = ({ clickHandler }) => {
return (
<div id="right">
<div className="showcase-content">
<h1 className="showcase-text">
Come and Hang with Us <strong>@ShopTalk</strong>
</h1>
<Link to="/" className="secondary-btn" onClick={() => clickHandler()}>
Sign Up Today
</Link>
</div>
</div>
);
};
export default LoginShowcase;
<file_sep>import React from "react";
import ServicesCard from "./ServicesCard";
import Title from "../../../components/Title";
import "../../../css/welcome.css";
class Services extends React.Component {
constructor() {
super();
this.state = { name: "Services" };
}
render() {
return (
<section id="services">
<Title name={this.state.name} />
<div className="services-container">
<ServicesCard />
</div>
</section>
);
}
}
export default Services;
<file_sep>import React from "react";
import AppointmentCard from "./AppointmentCard";
import HairCard from "./HairCard";
import ProfileCard from "./ProfileCard";
import propTypes from "prop-types";
class ProfileList extends React.Component {
render() {
return (
<div className="profile-list">
<ProfileCard />
<AppointmentCard />
<HairCard />
</div>
);
}
}
ProfileList.propTypes = {
user: propTypes.object.isRequired,
appointments: propTypes.array.isRequired,
haircard: propTypes.array.isRequired,
profile: propTypes.array.isRequired
};
ProfileList.defaultProps = {
user: {},
appointments: [{}],
haircard: [{}],
profile: [{}]
};
export default ProfileList;
<file_sep>import React from "react";
import "../navbar/navbar.css";
import propTypes from "prop-types";
import { NavLink } from "react-router-dom";
const SideBar = props => {
return (
<React.Fragment>
<section id="side-menu">
<ul className="side-nav-links">
<li className="head-link">
<div className="logo-image">
<img
src={require("../../images/logo.png")}
alt="logo"
style={{ textAlign: "center" }}
/>
</div>
<h1>Shear Elegance</h1>
</li>
<li>
<NavLink exact activeClassName="active" to="/">
<span>
<i className="fas fa-home" />
</span>
Home
</NavLink>
</li>
<li>
<NavLink exact activeClassName="active" to="/profile">
<span>
<i className="fas fa-users" />
</span>
Profile
</NavLink>
</li>
<li>
<NavLink exact activeClassName="active" to="/profile">
<span>
<i className="fas fa-briefcase" />
</span>
Stylist Profile
</NavLink>
</li>
<li>
<NavLink exact activeClassName="active" to="/appointment">
<span>
<i className="fas fa-calendar-check" />
</span>
Appointment
</NavLink>
</li>
<li>
<NavLink exact activeClassName="active" to="/dashboard">
<span>
<i className="fas fa-tachometer-alt" />
</span>
Dashboard
</NavLink>
</li>
<li>
<NavLink exact activeClassName="active" to="/newsfeed">
<span>
<i className="fas fa-newspaper" />
</span>
NewsFeed
</NavLink>
</li>
<li>
<NavLink
exact
activeClassName="active"
to="/login"
onClick={props.handleLogin}
>
<span>
<i className="fas fa-sign-out-alt" />
</span>
Logout
</NavLink>
</li>
</ul>
</section>
</React.Fragment>
);
};
SideBar.propTypes = {
handleLogin: propTypes.func.isRequired
};
export default SideBar;
<file_sep>## ShearElegance
This is the second version of the app, built to use more advanced features and a different UI design pattern. Shear Elegance is a hair salon app designed as a customer relationship management system for salon owners. The app allows users to
1. View the services and products offered by the salon.
2. Communicate with owners and clients via email, social media links, and chat.
3. Perform CRUD actions for appointments and user profile accounts.
4. Develop secure portals for customers, stylists, and administrators.
5. Allow for image upload and social commenting (likes, etc.)
6. Allow users to document and maintain the data relating to their hair needs.
7. Authentication and authorization features
8. Advanced CSS/SCSS
From a technical learning perspective, this project was created using React/Redux and was designed to demonstrate understanding of the following concepts.
1. socket protocol
2. authentication/authorization (jwt tokens)
3. nested routing using react router
4. higher order components
5. component styling
6. react clientside encapsulation
7. redux state management
8. integration testing
## Technologies
1. React 16.8.6
2. Redux 4.0.1
3. CSS3
4. Rails Api
## Project Status
current project status --- refactoring the UI to create scroll pages
## Authors
<NAME>
<file_sep>import React from "react";
import propTypes from "prop-types";
import Title from "../../../components/Title";
const TestimonialQuotes = props => {
return (
<div className="testimonial-center">
<Title name={"Testimonials"} />
<article>
<img src={props.avatar} alt="pic" />
<blockquote>
<p>
<i className="fa fa-quote-left" /> {props.quote}.
</p>
<footer>-customer</footer>
</blockquote>
</article>
</div>
);
};
TestimonialQuotes.propTypes = {
avatar: propTypes.string.isRequired,
quote: propTypes.string.isRequired
};
TestimonialQuotes.defaultProps = {
avatar:
"https://images.pexels.com/photos/718978/pexels-photo-718978.jpeg?auto=compress&cs=tinysrgb&dpr=2&h=750&w=1260",
quote: "Lorem ipsum dolor sit ametconsectetur adipisicing elit."
};
export default TestimonialQuotes;
<file_sep>import React from "react";
import ContactDetail from "../../forms/ContactDetail";
import ContactCard from "./ContactCard";
const ContactForm = props => {
return (
<React.Fragment>
<div className="contact-right">
<article className="contact-info">
<div className="contact-detail">
<div className="contact-icon">
<i className="fas fa-phone" />
</div>
</div>
<div className="contact-text">
<ContactCard />
</div>
</article>
<div className="form-container">
<ContactDetail />
</div>
</div>
</React.Fragment>
);
};
export default ContactForm;
<file_sep>import React from "react";
import { Switch, Route } from "react-router-dom";
import Navbar from "../navbar/Navbar";
import Home from "./Home";
import Gallery from "./gallery/Gallery";
import Services from "./services/Services";
import Testimonials from "./testimonials/Testimonials";
import About from "./about/About";
import Contact from "./contact/Contact";
import Login from "../login/Login";
import Footer from "./Footer";
class Website extends React.Component {
render() {
return (
<div className="website-container">
{this.props.showLogin ? null : (
<Navbar handleLogin={this.props.handleLogin} />
)}
<Switch>
<Route exact path="/" component={Home} />
<Route exact path="/gallery" component={Gallery} />
<Route exact path="/services" component={Services} />
<Route exact path="/testimonials" component={Testimonials} />
<Route exact path="/about" component={About} />
<Route exact path="/contact" component={Contact} />
<Route
exact
path="/login"
render={routerProps => <Login {...routerProps} />}
/>
<Route exact path="/footer" component={Footer} />
<Route render={() => <p>Not Found</p>} />
</Switch>
</div>
);
}
}
export default Website;
<file_sep>import React from "react";
const GalleryShowCard = props => {
return (
<React.Fragment>
<article className="show-pic">
<div>
<img src={require("../../../images/hair-1.jpeg")} alt="pic" />
<h1>Short Cuts</h1>
</div>
</article>
<article className="show-pic">
<img src={require("../../../images/natural-look.jpeg")} alt="pic" />
<h1>Long Cuts</h1>
</article>
<article className="show-pic">
<img src={require("../../../images/kids-hair.jpeg")} alt="pic" />
<h1>Natural Hair</h1>
</article>
<article className="show-pic">
<img src={require("../../../images/hair-6.jpeg")} alt="pic" />
<h1>Hair Extensions</h1>
</article>
</React.Fragment>
);
};
export default GalleryShowCard;
|
71a5152e6b295b76be0511e63246adc1a18a4831
|
[
"JavaScript",
"Markdown"
] | 25 |
JavaScript
|
LaSkilzs/HairStudio-CRM2-FrontEnd
|
dc67418d6cb6801389430f9309716b6adc6790d8
|
230c836dff30b66ce3366f42e325df2a3854c07c
|
refs/heads/master
|
<file_sep>const insertCss = () => {
const css = `.error_choose {
display: none !important;
}`
const head = document.head || document.getElementsByTagName('head')[0]
const style = document.createElement('style')
head.appendChild(style)
style.type = 'text/css'
style.appendChild(document.createTextNode(css))
}
const trainingKey = 'training'
const checkedClass = 'checkbox_active'
let isTraining = false
const detectIsTraining = () => {
return JSON.parse(localStorage.getItem(trainingKey))
}
const setTraining = newTrainingValue => {
isTraining = newTrainingValue
localStorage.setItem(trainingKey, newTrainingValue)
}
;(() => {
insertCss()
const inputTraining = document.querySelector('.checkt_sub > label:first-of-type input')
isTraining = detectIsTraining()
const trainingButton = document.querySelector('.checkt_sub > label:first-of-type span')
trainingButton.classList[isTraining ? 'add' : 'remove'](checkedClass)
inputTraining.checked = isTraining
trainingButton.addEventListener('click', () => {
setTraining(inputTraining.checked)
})
const showTips = () => [].slice.call(document.querySelectorAll('.reply_ticket')).forEach(item => {
if (item.style['border-color'] !== '' && isTraining) {
item.style.display = 'block'
}
})
;[].slice.call(document.querySelectorAll('.label_raio')).forEach(label => label.addEventListener('click', showTips))
const element = document.querySelector('.subscriptions_not')
if (element) {
element.classList.add('subscriptions_not_active')
}
})()
|
873329297a581465cba6d0e537a2acadc9bda8c2
|
[
"JavaScript"
] | 1 |
JavaScript
|
spasea/tests-helper
|
9dfa557c1bdb499a24837c7b5f49d8cb713a3cd1
|
89d65094e2a4423e1c44eecde7eb058bf5626232
|
refs/heads/master
|
<file_sep>#!/usr/bin/env node
import LSPServer from "./language-server";
import * as fs from "fs";
import findUp from "find-up";
import * as path from "path";
const packageFilePath = findUp.sync("elm.json");
if (!packageFilePath) {
console.log("Unable to find an elm.json file, elm-lsp will be disabled.");
throw new Error("Unable to find elm.json");
}
const projectFile = JSON.parse(fs.readFileSync(packageFilePath).toString());
const dirname = path.dirname(packageFilePath);
process.chdir(dirname);
LSPServer.start(projectFile);
<file_sep># Elm LSP
Language server for Elm with diagnostic support, built on [elm-analyse](https://github.com/stil4m/elm-analyse).
## Deprecated as of 2019-06-01
I recommend using [elm-language-server](https://github.com/elm-tooling/elm-language-server) instead, it includes everything in this repo along with additional features like finding all references, jumping to definitions, and more.
Shown with ALE integration in Vim.

[![NPM Version][npm-image]][npm-url]
[![Build Status][travis-image]][travis-url]
[![Downloads Stats][npm-downloads]][npm-url]
## Installation
```sh
npm install -g elm-lsp
```
## Editor Setup
|Editor|What you need|
|---|---|
|Vim|[ALE](https://github.com/w0rp/ale)|
## Project Information
This project aims to become a full-fledged language server over time by building on existing work from the Elm community. In this initial release all linting support is provided by the excellent [elm-analyse](https://github.com/stil4m/elm-analyse).
## Roadmap
|Feature|Supported|
|---|---|
|Diagnostics|✔️|
|Code completion|❌|
|Hover|❌|
|Jump to definition|❌|
|Workspace symbols|❌|
|Find references|❌|
## Release History
* 1.0.3
* Look in parent directories for elm.json (#2)
* 1.0.2
* Use package names rather than relative paths for dependencies (#1)
* 1.0.1
* Fix readme on npm
* 1.0.0
* Initial release with `elm-analyse` support
## Meta
Distributed under the MIT license. See ``LICENSE`` for more information.
<!-- Markdown link & img dfn's -->
[npm-image]: https://img.shields.io/npm/v/elm-lsp.svg?style=flat-square
[npm-url]: https://npmjs.org/package/elm-lsp
[npm-downloads]: https://img.shields.io/npm/dt/elm-lsp.svg?style=flat-square
[travis-image]: https://img.shields.io/travis/antew/elm-lsp/master.svg?style=flat-square
[travis-url]: https://travis-ci.org/antew/elm-lsp
<file_sep>import uri2path from "file-uri-to-path";
import fileUrl from "file-url";
import _ from "lodash";
import * as path from "path";
import {
createConnection,
Diagnostic,
DiagnosticSeverity,
InitializeParams,
ProposedFeatures,
TextDocumentChangeEvent,
TextDocuments,
TextDocumentSyncKind
} from "vscode-languageserver";
import { ElmApp, Message, LogMessage } from "elm-analyse/ts/domain";
import * as fileLoadingPorts from "elm-analyse/ts/file-loading-ports";
// import * as dependencies from 'elm-analyse/ts/util/dependencies';
import * as dependencies from "elm-analyse/ts/util/dependencies";
const { Elm } = require("elm-analyse/dist/app/backend-elm.js");
// TODO: remove need for this config, the port, format, and open options
// should not be needed.
// elm-format will be needed to implement fixing issues with elm-analyse
const CONFIG = {
port: 3000,
elmFormatPath: "elm-format",
format: "json",
open: false
};
function start(project: {}) {
// Disable console logging while in language server mode
// otherwise in stdio mode we will not be sending valid JSON
console.log = console.warn = console.error = () => {};
let connection = createConnection(ProposedFeatures.all);
run(project, function(elm: ElmApp) {
let documents: TextDocuments = new TextDocuments();
let filesWithDiagnostics = new Set();
documents.listen(connection);
connection.listen();
connection.onInitialize((params: InitializeParams) => ({
capabilities: {
textDocumentSync: {
openClose: true,
willSave: true,
change: TextDocumentSyncKind.Full
},
textDocument: {
publishDiagnostics: {
relatedInformation: false
}
}
}
}));
// The content of a text document has changed. This event is emitted
// when the text document first opened or when its content has changed.
documents.onDidOpen(validateTextDocument);
documents.onDidChangeContent(validateTextDocument);
documents.onDidSave(validateTextDocument);
async function validateTextDocument(change: TextDocumentChangeEvent): Promise<void> {
elm.ports.fileWatch.send({
event: "update",
file: path.relative(process.cwd(), uri2path(change.document.uri)),
content: change.document.getText()
});
}
function publishDiagnostics(messages: Message[], uri: string) {
const messagesForFile = messages.filter(m =>
// Windows paths have a forward slash in the `message.file`, which won't
// match with the end of the file URI we have from the language server event,
// so this replaces backslashes before matching to get consistent behavior
uri.endsWith(m.file.replace("\\", "/"))
);
let diagnostics: Diagnostic[] = messagesForFile.map(messageToDiagnostic);
connection.sendDiagnostics({ uri: uri, diagnostics });
}
elm.ports.sendReportValue.subscribe(function(report) {
// When publishing diagnostics it looks like you have to publish
// for one URI at a time, so this groups all of the messages for
// each file and sends them as a batch
const messagesByFile = _.groupBy(report.messages, "file");
const filesInReport = new Set(_.map(_.keys(messagesByFile), fileUrl));
const filesThatAreNowFixed = new Set([...filesWithDiagnostics].filter(uriPath => !filesInReport.has(uriPath)));
filesWithDiagnostics = filesInReport;
// When you fix the last error in a file it no longer shows up in the report, but
// we still need to clear the error marker for it
filesThatAreNowFixed.forEach(file => publishDiagnostics([], file));
_.forEach(messagesByFile, (messages, file) => publishDiagnostics(messages, fileUrl(file)));
});
elm.ports.log.subscribe((data: LogMessage) => {
console.log(data.level + ":", data.message);
});
});
}
function messageToDiagnostic(message: Message): Diagnostic {
if (message.type === "FileLoadFailed") {
return {
severity: DiagnosticSeverity.Error,
range: { start: { line: 0, character: 0 }, end: { line: 1, character: 0 } },
code: "1",
message: "Error parsing file",
source: "elm-analyse"
};
}
let [lineStart, colStart, lineEnd, colEnd] = message.data.properties.range;
const range = {
start: { line: lineStart - 1, character: colStart - 1 },
end: { line: lineEnd - 1, character: colEnd - 1 }
};
return {
severity: DiagnosticSeverity.Warning,
range: range,
code: message.id,
// Clean up the error message a bit, removing the end of the line, e.g.
// "Record has only one field. Use the field's type or introduce a Type. At ((14,5),(14,20))"
message:
message.data.description.split(/at .+$/i)[0] +
"\n" +
`See https://stil4m.github.io/elm-analyse/#/messages/${message.type}`,
source: "elm-analyse"
};
}
function run(project: {}, onload: (app: ElmApp) => void) {
dependencies.getDependencies(function(registry) {
const app = Elm.Analyser.init({
flags: {
server: false,
registry: registry || [],
project: project
}
});
fileLoadingPorts.setup(app, CONFIG, process.cwd());
onload(app);
});
}
export default { start };
|
83d0d28d87ddfb29b23f162460cf81c03e90ad36
|
[
"JavaScript",
"TypeScript",
"Markdown"
] | 3 |
JavaScript
|
antew/elm-lsp
|
742032ece7104e34c33554092546b4a5a0ef4a5a
|
900b8969b36afa7292e7c7f7fd691a799a107f6d
|
refs/heads/master
|
<repo_name>nabeshin/Cyotek.Windows.Forms.ColorPicker<file_sep>/Cyotek.Windows.Forms.ColorPicker.Demo/ColorPickerDialogDemoForm.cs
using System;
using System.Drawing;
using System.Windows.Forms;
namespace Cyotek.Windows.Forms.ColorPicker.Demo
{
// Cyotek Color Picker controls library
// Copyright © 2013-2014 Cyotek.
// http://cyotek.com/blog/tag/colorpicker
// Licensed under the MIT License. See colorpicker-license.txt for the full text.
// If you use this code in your applications, donations or attribution are welcome
internal partial class ColorPickerDialogDemoForm : BaseForm
{
#region Instance Fields
private Color _color;
#endregion
#region Public Constructors
public ColorPickerDialogDemoForm()
{
InitializeComponent();
}
#endregion
#region Overridden Methods
protected override void OnLoad(EventArgs e)
{
base.OnLoad(e);
_color = Color.SeaGreen;
}
#endregion
#region Event Handlers
private void browseColorButton_Click(object sender, EventArgs e)
{
using (ColorPickerDialog dialog = new ColorPickerDialog())
{
dialog.Color = _color;
if (dialog.ShowDialog(this) == DialogResult.OK)
{
_color = dialog.Color;
panel.Invalidate();
}
}
}
private void closeToolStripMenuItem_Click(object sender, EventArgs e)
{
this.Close();
}
private void panel_Paint(object sender, PaintEventArgs e)
{
using (Brush brush = new SolidBrush(_color))
{
e.Graphics.FillRectangle(brush, panel.ClientRectangle);
}
}
#endregion
}
}
|
7c9c8ea3a1cc76eee52cd9b74324bbe1e2e19935
|
[
"C#"
] | 1 |
C#
|
nabeshin/Cyotek.Windows.Forms.ColorPicker
|
0a8020bf2079b311249f98563b5fe12160c7b593
|
a37f465b0c3e846d6d3374d7e981eea74ccbea7a
|
refs/heads/master
|
<repo_name>njdup/personalSite<file_sep>/views/animation.js
function print(text){
var message = document.createTextNode(text);
var cursor = document.getElementById("cursor");
document.getElementById("firstLine").removeChild(cursor);
document.getElementById("firstLine").appendChild(message);
document.getElementById("firstLine").appendChild(cursor);
}
function animation(){
var runs = 0;
var interval = 250;
function run(){
switch(runs){
case 0: setTimeout(function(){ print('e')}, interval); runs++; break;
case 1: setTimeout(function(){ print('c')}, 2*interval); runs++; break;
case 2: setTimeout(function(){ print('h')}, 3*interval); runs++; break;
case 3: setTimeout(function(){ print('o')}, 4*interval); runs++; break;
case 4: setTimeout(function(){ print(' ')}, 5*interval); runs++; break;
case 5: setTimeout(function(){ print('\'')}, 6*interval); runs++; break;
case 6: setTimeout(function(){ print('H')}, 7*interval); runs++; break;
case 7: setTimeout(function(){ print('e')}, 8*interval); runs++; break;
case 8: setTimeout(function(){ print('l')}, 9*interval); runs++; break;
case 9: setTimeout(function(){ print('l')}, 10*interval); runs++; break;
case 10: setTimeout(function(){ print('o')}, 11*interval); runs++; break;
case 11: setTimeout(function(){ print(' ')}, 12*interval); runs++; break;
case 12: setTimeout(function(){ print('W')}, 13*interval); runs++; break;
case 13: setTimeout(function(){ print('o')}, 14*interval); runs++; break;
case 14: setTimeout(function(){ print('r')}, 15*interval); runs++; break;
case 15: setTimeout(function(){ print('l')}, 16*interval); runs++; break;
case 16: setTimeout(function(){ print('d')}, 17*interval); runs++; break;
case 17: setTimeout(function(){ print('!')}, 18*interval); runs++; break;
case 18: setTimeout(function(){ print('\'')}, 19*interval); runs++; break;
}
}
for(var i = 0; i < 19; i++){
run();
}
}
function animationPartTwo(){
function run(){
var cursor = document.getElementById("cursor");
document.getElementById("firstLine").removeChild(cursor);
var location = document.getElementById("secondLine");
var para = document.createElement("p");
var message = document.createTextNode('Hello World!');
para.appendChild(message);
para.setAttribute("class", "terminal");
location.appendChild(para);
}
setTimeout(run, 4750);
}
function animationPartThree(){
function run(){
var location = document.getElementById("thirdLine");
var p = document.createElement("p");
var node = document.createTextNode("<EMAIL>$");
p.setAttribute("id", "userInfo");
p.setAttribute("class", "terminal");
p.appendChild(node);
location.appendChild(p);
}
setTimeout(run, 4750);
}
function runningAnimation(){
var run = 1;
function blinkingCursor(){
function removeCursor(){
var location = document.getElementById("thirdLine");
var userInfo = document.getElementById("userInfo");
while(location.lastChild != userInfo){
location.removeChild(location.lastChild);
}
}
function addCursor(){
var location = document.getElementById("thirdLine");
var cursor = document.createElement("p");
var text = document.createTextNode("_");
cursor.appendChild(text);
cursor.setAttribute("id", "cursor");
cursor.setAttribute("class", "terminal");
if(!document.contains(cursor)) location.appendChild(cursor);
}
if(run === 1){
addCursor();
run++;
} else{
removeCursor();
run--;
}
}
function startBlinking(){
setInterval(blinkingCursor, 500);
}
setTimeout(startBlinking, 4750);
}
|
25641cf2ef9df5a480923d8ef85593d5e55e0f5f
|
[
"JavaScript"
] | 1 |
JavaScript
|
njdup/personalSite
|
f4db4bbc14791c2f61c01490920eea3eb6ece67c
|
18f965dd753803ecc106f43e10b2ce734d2ba68a
|
refs/heads/master
|
<repo_name>Egalloc/CollageBuilderUpdated<file_sep>/collageBuilderMockup/src/collageBuilderMockup/APIResponse.java
package collageBuilderMockup;
import java.util.List;
public class APIResponse {
SearchInfo searchInformation;
Item[] items;
public SearchInfo getSearchInformation() {
return searchInformation;
}
public Item[] getItems() {
return items;
}
}
|
9330291bf041422aa1a0f5aafd1d5b4fd43fa657
|
[
"Java"
] | 1 |
Java
|
Egalloc/CollageBuilderUpdated
|
3105136d10673d4c9fe80ceee5e68d082dd0fd85
|
938280d6ed94b685a96ebc14f5446636bee533ed
|
refs/heads/master
|
<file_sep>#![feature(proc_macro_hygiene, decl_macro)]
#[macro_use] extern crate rocket;
#[macro_use] extern crate rocket_contrib;
#[macro_use] extern crate serde_derive;
#[macro_use(bson)] extern crate bson;
extern crate chrono;
mod lib;
mod meta;
mod models;
mod controllers;
mod utils;
#[get("/")]
fn index() -> &'static str {
"Hello, this is a API!"
}
fn main() {
rocket::ignite().mount("/", routes![
controllers::user::get, controllers::user::getAll, controllers::user::insert,
controllers::user::follow,
controllers::tweet::get, controllers::tweet::getAll, controllers::tweet::insert,
controllers::tweet::getAllFromUser, controllers::tweet::like, controllers::tweet::getAllFromUsersFollowing,
controllers::tweet::retweet
])
.register(catchers![controllers::not_found::lookup])
.launch();
}<file_sep># twitter-rocket-rust
A mini Twitter API using the Rust language and the Rocket framework
## Installation
- Install [Rust](https://www.rust-lang.org/tools/install).
- rustup update.
- rustup install nightly.
- rustup default nightly.
## Build
- cargo build
## Build and run the project
- cargo run
<file_sep>use bson;
use rocket_contrib::json::Json;
use rocket_contrib::json::JsonValue;
use crate::models;
use crate::meta;
use crate::utils;
// Returns a user's data
#[get("/user/<id>")]
pub fn get(id: String) -> JsonValue {
match utils::validations::validateObjectId(id.clone()) {
false => {
return utils::validations::generateErrorJson("Id incorreto".to_string(), 400);
},
true => {
let document = models::User::find_one(id.to_owned()).unwrap();
let result = bson::from_bson::<meta::user::GetResponse>(bson::Bson::Document(document.unwrap()));
match result {
Ok(user) => {
json!({
"code": 200,
"success": true,
"data": user,
"error": ""
})
},
Err (_e) => {
return utils::validations::generateErrorJson(_e.to_string(), 404);
}
}
}
}
}
// Returns all registered users
#[get("/users")]
pub fn getAll() -> JsonValue {
match models::User::find() {
Ok(users) => {
json!({
"code": 200,
"success": true,
"data": users,
"error": ""
})
},
Err (_e) => {
return utils::validations::generateErrorJson(_e.to_string(), 404);
}
}
}
// Creates a user
#[post("/user", format = "application/json", data = "<user>")]
pub fn insert(user: Json<meta::user::Post>) -> JsonValue {
let model = models::User::Model {
email: user.email.to_owned(),
name: user.name.to_owned()
};
let document = model.insert().unwrap();
let result = bson::from_bson::<meta::user::PostResponse>(bson::Bson::Document(document.unwrap()));
match result {
Ok(user) => {
json!({
"code": 201,
"success": true,
"data": user,
"error": ""
})
},
Err (_e) => {
return utils::validations::generateErrorJson(_e.to_string(), 400);
}
}
}
// Follows a user
#[put("/user/follow", format = "application/json", data = "<follow>")]
pub fn follow(follow: Json<meta::user::FollowRequest>) -> JsonValue {
match utils::validations::validateObjectId(follow.user_id.clone()) {
false => {
return utils::validations::generateErrorJson("Id incorreto".to_string(), 400);
},
true => {
match utils::validations::validateObjectId(follow.user_to_follow_id.clone()) {
false => {
return utils::validations::generateErrorJson("Id incorreto".to_string(), 400);
},
true => {
models::User::follow(follow.user_id.to_owned(), follow.user_to_follow_id.to_owned());
return json!({
"code": 200,
"success": true,
"data": {},
"error": ""
});
}
}
}
}
}
<file_sep>[package]
name = "twitter"
version = "0.1.0"
authors = ["<NAME> <<EMAIL>>"]
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
rocket = "0.4.2"
serde = "1.0"
serde_json = "1.0"
serde_derive = "1.0"
bson = "0.13.0"
mongodb = "0.3.7"
chrono = "0.4.9"
[dependencies.rocket_contrib]
version = "0.4.2"
default-features = false
features = ["json"]
|
b34f80223b8bea00dfe8092f1eb45f3e90d9a91f
|
[
"Markdown",
"Rust",
"TOML"
] | 4 |
Rust
|
Rodrigoaz7/twitter-rocket-rust
|
c32abdda7b7979a4bae307a998313fcb2c2c38b1
|
709d1320125d6d06eae0e14c0973396bca9daac2
|
refs/heads/master
|
<file_sep># Installation
> OS
- Linux(RHEL/CentOS/SUSE/u)
- Win
- OS X
> Character set
- utf8
- utf8mb4
- gbk
## REF
- [installing](https://dev.mysql.com/doc/refman/5.6/en/installing.html)
- [binary-installation](https://dev.mysql.com/doc/refman/5.6/en/binary-installation.html)
- [windows-installation](https://dev.mysql.com/doc/refman/5.6/en/windows-installation.html)
<file_sep># SWAP
> Virtual memory
Set /proc/sys/vm/swappiness to a very small value, such as 0 or 1, so that the swap area is not used unless virtual memory is completely full.
```bash
[root@localhost ~]# cat /proc/sys/vm/swappiness
30
```<file_sep># Mariadb
- website
```html
https://mariadb.org/
https://mariadb.com/kb/en/
```
- wikipedia
```html
http://en.wikipedia.org/wiki/MariaDB
```
<file_sep># DEV
## Planning
- Capacity planning
- How fast is the business growing? Where are the limits?
- Sharding (splitting databases and tables)
- "What has long been united must divide; what has long been divided must unite."
## Modeling
- Business architecture
- Business logic and entity relationships
- [Data Types](DataTypes/Readme.md)
- [Indexes](IndexConstraint/Readme.md)
## SQL Development
- [SQL (Structured Query Language)](sql_base/Readme.md)
- Syntax standards
- ANSI SQL-89
- ANSI SQL-92
- Normal forms
- Design guidelines
- [Audit](Audit/Readme.md)
- [audit log](../Mgmt/Maintenance/Log/audit_log.md)
## Transactions and Locking
- [Transactions](Transactions/Readme.md)
- [Locking](Locking/Readme.md)
## Programming Inside MySQL
### Programming
- Stored Routines
- Stored Procedure
- CALL statement
- [EXAMPLE](Programming/ex_procedure.sql)
- Stored Function
- [EXAMPLE](Programming/ex_function.sql)
- EVENT
- [EXAMPLE](Programming/ex_event.sql)
- TRIGGER
- [EXAMPLE](Programming/ex_trigger.sql)
```
Firing order:
UPDATE:
BEFORE INSERT
BEFORE UPDATE
AFTER UPDATE
INSERT:
BEFORE INSERT
AFTER INSERT
```
- Views (VIEW)
- [EXAMINE](Programming/examine.sql)
### Privileges
- ```TRIGGER``` privilege
```mysql
CREATE TRIGGER;
DROP TRIGGER;
```
- ```EVENT``` privilege
- create, modify, or delete events.
- You must have the **SUPER** privilege to set the global ```event_scheduler``` variable.
- **event_scheduler** thread
### REF
- [condition-handling](https://dev.mysql.com/doc/refman/5.6/en/condition-handling.html)
<file_sep># MVCC
InnoDB adds three fields to each row stored in a database:
- DB_TRX_ID: Transaction identifier
- DB_ROLL_PTR: Roll pointer
- DB_ROW_ID: Row ID
<file_sep># Mgmt
## Deployment
- [Installation](Installation/Readme.md)
- [Configuration](Configuration/Readme.md)
- [Maintenance](Maintenance/Readme.md)
- Log
- Metadata
## Architecture
- Hardware architecture
- x86_64
- System architecture
- [Memory](Architecture/Memory.md)
- [Disk](Architecture/Disk.md)
- Network topology
- [Communication Protocols](Architecture/ComProtocols.md)
## [Storage Engines](StorageEngine/Readme.md)
## Performance Optimization
- [Server configuration](Optimize/Server.md)
- [Optimizing MySQL performance](Optimize/Readme.md)
## Monitoring
## Changes
- Online DDL changes => table locking
## Testing
- Benchmarking
- Performance testing
## Security
- Networks Security
- Firewall
- authorized clients
- root access
- mysql_secure_installation
- Operating System Security
- Firewall
- Network
- Account privileges
- File Systems
- MySQL programs
- Database directories and files
- Log, status, and configuration files
- [MySQL Security](Security/Readme.md)
## Troubleshooting
## Tools
- [Tools](Tools/Readme.md)
<file_sep>The `type` column shows the access type and is one of the more important indicators. From best to worst, the values are:
```
system > const > eq_ref > ref > fulltext > ref_or_null > index_merge > unique_subquery > index_subquery > range > index > ALL
```
As a rule of thumb, a query should reach at least the `range` level, and ideally `ref`.
`possible_keys` indicates which indexes MySQL could use to find rows in this table. If it is empty, there is no applicable index; to improve performance, check which columns the WHERE clause references and whether they are suitable for indexing.
`key` shows the index MySQL actually decided to use. If no index was chosen, the value is NULL.
`key_len` shows the length of the key MySQL decided to use (NULL if the key is NULL). Notably, this value tells you which leading parts of a multi-column key were actually used.
`ref` shows which columns or constants are compared against the key.
`rows` is the number of rows MySQL estimates it must examine to find the result; for InnoDB this estimate is not exact.
`Extra`: "Using index" means the result was produced from the index tree alone, which is faster than scanning the whole table; "Using where" means a WHERE restriction was applied; "Impossible WHERE" means the WHERE clause can never match, so nothing is returned.<file_sep># LOG
- Error log (--log_error)
- Binary log (--log_bin, --expire_logs_days)
- General query log (--general_log)
- Slow query log (--slow_query_log, --long_query_time)
- Enterprise Audit (--audit_log, --audit_log_file)
- Operation log (recording interactive sessions)
## Error log
```mysql
(shawn@localhost) [(none)] 14:10:43> SHOW VARIABLES LIKE 'log_error';
+---------------+---------------------------------+
| Variable_name | Value |
+---------------+---------------------------------+
| log_error | /data/log/mysql/mysql-error.log |
+---------------+---------------------------------+
1 row in set (0.00 sec)
```
## [Binary log](binlog.md)
## General query log
## [Slow query log](slow_log.md)
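The details live in slow_log.md; as a quick sketch (the one-second threshold is only an example), the slow query log can be enabled at runtime like this:
```mysql
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;   -- seconds; pick a threshold that fits the workload
SHOW GLOBAL VARIABLES LIKE 'slow_query%';
SHOW GLOBAL VARIABLES LIKE 'long_query_time';
```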
## MySQL Enterprise Audit
[audit_log](audit_log.md)
## Operation log
```bash
vim /etc/bashrc
```
```bash
export WHOAMI=`whoami`
export DATE=`date +"%y%m%d_%H%M%S"`
alias mysql="mysql --tee /data/logs/mysql_record/record_${DATE}_${WHOAMI}.log"
```
```bash
source /etc/bashrc
```
# Tools
```bash
mysqlsla
mysqldumpslow
```
# REF
- log
- [server-logs](https://dev.mysql.com/doc/refman/5.6/en/server-logs.html)
- [mysql-log-rotate](https://dev.mysql.com/doc/refman/5.6/en/log-file-maintenance.html)
- [slave-logs-status](https://dev.mysql.com/doc/refman/5.6/en/slave-logs-status.html)
- audit
- [mysql-enterprise-audit](https://dev.mysql.com/doc/refman/5.6/en/mysql-enterprise-audit.html)
- [audit-log](https://dev.mysql.com/doc/refman/5.6/en/audit-log-reference.html)
- [audit-log-file](https://dev.mysql.com/doc/refman/5.6/en/audit-log-file.html)<file_sep># RDS
> Characteristics
- Agility
- Elasticity<file_sep># ACID
- Atomic
- Consistent
- [Isolated](Isolated.md)
- Durable
> Transaction SQL Control Statements
- START TRANSACTION (BEGIN)
- SAVEPOINT
- COMMIT
- ROLLBACK
- ROLLBACK TO SAVEPOINT
- RELEASE SAVEPOINT
- SET AUTOCOMMIT
```mysql
SELECT @@AUTOCOMMIT;
SET @@AUTOCOMMIT :=0;
```
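A minimal sketch of how the control statements above combine; the `accounts` table and the amounts are hypothetical:
```mysql
START TRANSACTION;
UPDATE accounts SET balance = balance - 100 WHERE id = 1;
SAVEPOINT after_debit;
UPDATE accounts SET balance = balance + 100 WHERE id = 2;
ROLLBACK TO SAVEPOINT after_debit;  -- undo only the credit, keep the debit
COMMIT;
```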
> Example of statements that cause an implicit commit:
```mysql
TRUNCATE TABLE tablename;
LOAD DATA INFILE '/path/datafile';
```
## REF
- [refman-implicit-commit](https://dev.mysql.com/doc/refman/5.6/en/implicit-commit.html)
<file_sep># MySQL Knowledge System
**A structured overview of the MySQL knowledge system**
> A DBA's first responsibility is to keep the data safe
>
> A valid backup is the last lifeline
**MySQL 5.6 is used as the baseline version**
## Chapter
The material is organized into the following parts:
- [DEV -- product DBA](DEV/Readme.md)
- [Mgmt -- operations DBA](Mgmt/Readme.md)
- [HA (high availability)](HA/Readme.md)
- [RDS (cloud database)](RDS/Readme.md)
- [Case studies](Case/Readme.md)
- [Branches](Branch/Readme.md)
## Evolution
Case => rule => standard => tool
## OCP

## [Reference](./Reference.md)
## BookList
<file_sep># Case
## E-commerce
## Finance
- Traditional finance
- Internet finance
## How to plan and operate 100+ database servers
## Principles for splitting business across large-scale databases
## Carrying out automation in a large-scale database environment
<file_sep>
## B-Tree Index
Balanced tree index
Leftmost-prefix matching rule
Composite (multi-column) indexes
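A minimal sketch of the leftmost-prefix rule on a composite index; the `customer_addr` table is hypothetical:
```mysql
CREATE TABLE customer_addr (
  id        INT PRIMARY KEY,
  last_name VARCHAR(50),
  city      VARCHAR(50),
  KEY idx_last_city (last_name, city)
);
-- Can use idx_last_city (the leftmost column is constrained):
EXPLAIN SELECT * FROM customer_addr WHERE last_name = 'Smith' AND city = 'Oslo';
EXPLAIN SELECT * FROM customer_addr WHERE last_name = 'Smith';
-- Cannot use it on its own (the leftmost column is skipped):
EXPLAIN SELECT * FROM customer_addr WHERE city = 'Oslo';
```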
## Prefix index
## Full-text index (FULLTEXT)
- MyISAM
- Innodb 5.6+
## HASH index
- MEMORY engine
## PRIMARY KEY
## UNIQUE
## R-Tree spatial index
- MyISAM engine<file_sep># Maintenance
- [LOG](Log/Readme.md)
- [Metadata](Metadata/Readme.md)
- [SHOW](SHOW.md)
- [Table Maintenance](Table/Readme.md)
## REF
- [server-logs](https://dev.mysql.com/doc/refman/5.6/en/server-logs.html)<file_sep>GRANT ALL ON sakila.* TO 'shawn'@'%';
-- Match schemas whose names start with "sakila"
GRANT ALL ON `sakila%`.* TO 'shawn'@'%';
GRANT EXECUTE, ALTER ROUTINE ON PROCEDURE sakila.record_count TO 'shawn'@'%';<file_sep># Communication Protocols
- TCP/IP
- UNIX socket file
- Shared memory
- Named pipes<file_sep>#!/bin/bash
echo "***** <NAME> *****"
rpm -qa | grep mariadb | xargs rpm -e --nodeps
echo "***** INSTALL MYSQL DEPS *****"
yum install -y gcc.x86_64 gcc-c++.x86_64 cmake ncurses-devel perl perl-Data-Dumper glibc.x86_64
echo "***** Done! *****"
<file_sep># SHOW Statements
```mysql
SHOW DATABASES ;
-- SHOW CREATE TABLE
SHOW CREATE TABLE CITY ;
SHOW TABLES ;
SHOW TABLES FROM SAKILA ;
SHOW OPEN TABLES ;
-- SHOW COLUMS
-- DESCRIBE
-- DESC
SHOW COLUMNS FROM CITY;
DESC CITY ;
SHOW FULL COLUMNS FROM CITY;
-- SHOW INDEX
SHOW INDEX FROM CITY;
SHOW CHARACTER SET ;
SHOW COLLATION ;
SHOW PROCESSLIST ;
SHOW TABLE STATUS ;
SHOW PLUGINS ;
-- GRANT
SHOW GRANTS ;
SHOW GRANTS FOR 'shawn'@'localhost';
```<file_sep># Optimizing MySQL Performance
> performance tuning
## Target
## What
- innodb table
- data structure
```mysql
-- Optimize the table's column data types
use sakila;
select * from country procedure analyse();
```
- index constraint
- query execution plan
```mysql
USE sakila;
-- PROCEDURE ANALYSE(100, 256)
SELECT city_id, city, country_id FROM City PROCEDURE ANALYSE(250,1024);
```
- locking
## How
Locate, analyze, optimize.
### Locating problems
- STATUS
```mysql
SHOW STATUS;
```
- EXPLAIN
```mysql
-- EXPLAIN
EXPLAIN EXTENDED
SHOW WARNINGS
EXPLAIN PARTITIONS
```
```mysql
EXPLAIN FORMAT = JSON
SELECT city, country_id FROM city;
```
```json
{
"query_block": {
"select_id": 1,
"table": {
"table_name": "city",
"access_type": "ALL",
"rows": 600,
"filtered": 100
}
}
}
```
- PROFILE
```mysql
-- SHOW PROFILE
SELECT @@have_profiling;
select @@profiling;
show profiles;
-- show profile [cpu | all] for query 4;
```
- TRACE
```mysql
-- Analyze the optimizer's decision process (worked example below)
SELECT * FROM INFORMATION_SCHEMA.OPTIMIZER_TRACE ;
```
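A worked example combining the PROFILE and TRACE items above; the query is illustrative and assumes the sakila sample database is installed, and the query number for SHOW PROFILE comes from SHOW PROFILES:
```mysql
SET profiling = 1;
SELECT COUNT(*) FROM sakila.film;
SHOW PROFILES;
SHOW PROFILE CPU FOR QUERY 1;   -- use the number reported by SHOW PROFILES
SET profiling = 0;

SET optimizer_trace = 'enabled=on';
SELECT COUNT(*) FROM sakila.film;
SELECT TRACE FROM INFORMATION_SCHEMA.OPTIMIZER_TRACE;
SET optimizer_trace = 'enabled=off';
```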
## Tuning
- [Server](Server.md)
- buffer pool
## REF
- [optimization](https://dev.mysql.com/doc/refman/5.6/en/optimization.html)
- [Locking Operations](https://dev.mysql.com/doc/refman/5.6/en/locking-issues.html)
<file_sep># Transactions
> All or none of the steps succeed.
- Execute if all steps are good.
- Cancel if any step fails or is incomplete.
## [ACID](ACID.md)
- Atomic
- Consistent
- [Isolated](Isolated.md)
- Durable
## Local transactions
```mysql
COMMIT;
ROLLBACK;
LOCK TABLES;
```
## Distributed transactions (XA)
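A minimal sketch of the XA statement flow; the xid `xa_demo` and the UPDATE against a hypothetical `accounts` table are purely illustrative:
```mysql
XA START 'xa_demo';
UPDATE accounts SET balance = balance - 100 WHERE id = 1;
XA END 'xa_demo';
XA PREPARE 'xa_demo';
XA COMMIT 'xa_demo';   -- or XA ROLLBACK 'xa_demo';
```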
## Use locking to protect transactions
> InnoDB tables use row-level locking so that multiple sessions and applications can read from
and write to the same table simultaneously, without making each other wait and without
producing inconsistent results. For this storage engine, avoid using the LOCK TABLES
statement; it does not offer any extra protection but instead reduces concurrency.
> The automatic row-level locking makes these tables suitable for your busiest databases with
your most important data, while also simplifying application logic, because you do not need to
lock and unlock tables. Consequently, the InnoDB storage engine is the default in MySQL 5.6.
<file_sep># Semisynchronous Replication
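A minimal sketch of enabling semisynchronous replication on MySQL 5.6; the plugin file names assume a Linux build (they differ on Windows):
```mysql
-- On the master:
INSTALL PLUGIN rpl_semi_sync_master SONAME 'semisync_master.so';
SET GLOBAL rpl_semi_sync_master_enabled = 1;
-- On each slave:
INSTALL PLUGIN rpl_semi_sync_slave SONAME 'semisync_slave.so';
SET GLOBAL rpl_semi_sync_slave_enabled = 1;
SHOW STATUS LIKE 'Rpl_semi_sync%';
```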
## REF
- [replication-semisync](https://dev.mysql.com/doc/refman/5.6/en/replication-semisync.html)<file_sep># percona-tools for mysql
https://tools.percona.com/
# my.cnf
https://tools.percona.com/wizard
# Install toolkit
```bash
yum install -y perl-ExtUtils-CBuilder perl-ExtUtils-MakeMaker
```
<file_sep># Storage Engines
## Engines
### [InnoDB](Innodb/Innodb.md)
### [MyISAM](MyISM.md)
### OTHER
- MEMORY
- stored in memory, represented on disk by a .frm file
- Maximum size option --max-heap-table-size
- Cannot contain TEXT or BLOB columns
- Can use different character sets for different columns
- ARCHIVE
- Represented by .frm file
- Data file: .ARZ
- Does not support indexes
- Supports INSERT and SELECT
- Supports ORDER BY operations and BLOB columns
- Accepts all but spatial data types
- Uses row-level locking
- Supports AUTO_INCREMENT columns
- FEDERATED
- EXAMPLE
- BLACKHOLE
- Used for replication
- Supports all kinds of indexes
- Retrievals always return an empty result
- MERGE
- CSV
- PERFORMANCE_SCHEMA
- [NDBCLUSTER](https://dev.mysql.com/doc/refman/5.6/en/mysql-cluster.html)
```mysql
SHOW ENGINES;
SELECT @@default_storage_engine;
```
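A minimal sketch of picking an engine per table; the `session_cache` table is hypothetical, and a MEMORY table cannot contain TEXT or BLOB columns:
```mysql
CREATE TABLE session_cache (
  session_id CHAR(32) PRIMARY KEY,
  user_id    INT UNSIGNED
) ENGINE = MEMORY;

SELECT TABLE_NAME, ENGINE
  FROM INFORMATION_SCHEMA.TABLES
 WHERE TABLE_NAME = 'session_cache';
```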
## Partitioning
[Partitioning](Partitioning/Readme.md)
## REF
- [storage-engines](https://dev.mysql.com/doc/refman/5.6/en/storage-engines.html)
- [memory-storage-engine](https://dev.mysql.com/doc/refman/5.6/en/memory-storage-engine.html)
<file_sep>
```bash
groupadd mysql
useradd -r -g mysql mysql
cd /usr/local
tar zxvf /path/to/mysql-VERSION-OS.tar.gz
ln -s full-path-to-mysql-VERSION-OS mysql
cd mysql
chown -R mysql .
chgrp -R mysql .
scripts/mysql_install_db --user=mysql --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data
chown -R root .
chown -R mysql data
cp support-files/my-small.cnf /etc/my.cnf
bin/mysqld_safe --user=mysql &
```
<file_sep>#!/bin/bash
# Def
MYSQL_HOME="/opt/mysql"
# Init
cd ${MYSQL_HOME}/bin
./mysql_secure_installation
<file_sep># sample data
- [example database](https://dev.mysql.com/doc/index-other.html)
- [test-db](https://launchpad.net/test-db/)
<file_sep>
```mysql
mysql> alter table j1 convert to character set utf8mb4;
```
```bash
vim /etc/my.cnf
[mysqld]
init-connect='SET NAMES utf8mb4'
character-set-server=utf8mb4
```
## Ref
- [UTF8字符集的表怎么直接转UTF8MB4?](http://mp.weixin.qq.com/s/VWVKy16gMJ_kFeICsyMiVw)
<file_sep>SELECT
`ID`,
`USER`,
`HOST`,
`DB`,
`COMMAND`,
`TIME`,
`STATE`,
LEFT(`INFO`, 51200) AS `Info`
FROM `information_schema`.`PROCESSLIST`;
SHOW PROCESSLIST ;<file_sep># Structured Query Language
[sample sql](sample.sql)<file_sep># MySQL Security
## User Management
- Initialize the root password
- MySQL 5.7 generates a random root password at install time (see the sketch after this list)
```bash
grep 'temporary password' /var/log/mysqld.log
```
- [MySQL User Account Management](UserManagement/Readme.md)
- [Grant](UserManagement/Grant.md)
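A minimal sketch of resetting the root password after logging in with the temporary one; the statement shown is the MySQL 5.7 ALTER USER form and the password is only a placeholder:
```mysql
ALTER USER 'root'@'localhost' IDENTIFIED BY 'N3w-Str0ng-Pass!';
```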
## SSL Protocol
- Uses different encryption algorithms to secure data over a public network
- Detects any data change, loss, or replay
- Incorporates algorithms that provide identity verification using the X509 standard
> [SSL Protocol](SSL/Readme.md)
## Audit
## REF
- MySQL Security
- [security](https://dev.mysql.com/doc/refman/5.6/en/security.html)
- [faqs-security](https://dev.mysql.com/doc/refman/5.6/en/faqs-security.html)
- [privilege-system](https://dev.mysql.com/doc/refman/5.6/en/privilege-system.html)
- [user-account-management](https://dev.mysql.com/doc/refman/5.6/en/user-account-management.html)
- [secure-connections](https://dev.mysql.com/doc/refman/5.6/en/secure-connections.html)
- Audit
- [MySQL Enterprise Audit](https://www.mysql.com/products/enterprise/audit.html)
- [audit-log](https://dev.mysql.com/doc/refman/5.5/en/audit-log.html)
<file_sep># Data Types
## Numeric
- Integer
- Floating-Point
- Fixed-Point
- BIT
## Character(String)
- Text
- VARCHAR
- MAX 65,535 characters
- Integer
- ENUM
- SET
## Binary
- Binary
- BLOB
- MAX 65,535 bytes
## Temporal
- DATE
- TIME
- DATETIME
- TIMESTAMP
- YEAR
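A minimal sketch combining the type families above in one table; the `measurement` table and its columns are hypothetical:
```mysql
CREATE TABLE measurement (
  id        INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,  -- numeric (integer)
  sensor    ENUM('indoor','outdoor'),                 -- string, integer-backed
  reading   DECIMAL(8,3),                             -- fixed-point
  raw_frame BLOB,                                     -- binary
  taken_at  DATETIME                                  -- temporal
);
```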
## Spatial Data Types
- Entity
- Space
- Definable location
> MyISAM supports both spatial and non-spatial indexes.
## * NULL
## REF
- [refman-data-types](https://dev.mysql.com/doc/refman/5.6/en/data-types.html)
<file_sep>Master/slave replication configuration case
Enable GTID on the three servers
<file_sep># Replication
- binary log
- replication threads and logs
- replication environment
- replication topologies
## Replication Use Cases
- Horizontal scale-out
- Business intelligence and analytics
- Geographic data distribution
## Master Configuration
- **REPLICATION SLAVE** privilege
```mysql
GRANT REPLICATION SLAVE ON *.* TO <user>@<slave-hostname>;
```
The replication account used by the slave needs the REPLICATION SLAVE privilege on the master.
Both the master and the slave should also grant the REPLICATION CLIENT privilege:
1. Accounts used to monitor and manage replication need it.
2. It also makes it easier to swap the master and standby roles later.
## Slave Configuration
```mysql
CHANGE MASTER TO
MASTER_HOST = 'host_name',
MASTER_PORT = port_num,
MASTER_USER = 'user_name',
MASTER_PASSWORD = '<PASSWORD>',
MASTER_LOG_FILE = 'master_log_name',
MASTER_LOG_POS = master_log_pos;
CHANGE MASTER TO MASTER_PASSWORD='<PASSWORD>';
```
```mysql
-- STATUS
SHOW MASTER STATUS;
```
```bash
mysqldump -uroot -p --master-data -B sakila > backup_sakila.sql
```
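After restoring that dump on the slave and issuing CHANGE MASTER TO, replication is typically started and checked like this (a sketch, not a full runbook):
```mysql
START SLAVE;
SHOW SLAVE STATUS\G
```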
## REF
- [faqs-replication](https://dev.mysql.com/doc/refman/5.6/en/faqs-replication.html)
- [replication-features](https://dev.mysql.com/doc/refman/5.6/en/replication-features.html)
<file_sep># HA
## Backup and Recovery
## Replication
- [Replication mechanism](Replication/replication.md)
- Synchronous replication
- [Semisynchronous Replication](Replication/semisync.md)
- Asynchronous Replication
> Key concerns
- [GTIDs](Replication/GTIDs.md)
- [binlog](../Mgmt/Maintenance/Log/binlog.md)
- [Replication Threads](Replication/ReplicationThreads.md)
- Replication latency
## HA
- MySQL Cluster
- MySQL Fabric
- [MHA](MHA/Readme.md)
- MMM
- Percona-Cluster
- Solutions based on LVS/Keepalived/HAProxy
## Middleware
- Cobar
- [MyCat](http://www.mycat.io/)
- mongo (LeEco)
## Disaster recovery
- [Backup](Backup/Readme.md)
- [Recovery](Recovery/Readme.md)
- Multiple geographically separated data centers
> Key concerns
- RPO
- RTO
## REF
- [MySQL HA/Scalability Guide](https://dev.mysql.com/doc/mysql-ha-scalability/en/)
<file_sep># mysql client
```bash
mysql
mysqladmin
mysqlimport
mysqldump
mysqlcheck
mysqlshow
mysqlslap
```
> mysql
```mysql
SOURCE /usr/stage/world_innodb.sql
```
> mysqladmin
```bash
mysqladmin --help
mysqladmin status
mysqladmin variables
mysqladmin processlist
mysqladmin shutdown
```
> mysqldumpslow
- perl script
```bash
mysqldumpslow mysql-slow.log
```
> mysqlbinlog
```bash
mysqlbinlog mysql-bin.000001
```
# REF
- [MySQL Client Programs](https://dev.mysql.com/doc/refman/5.6/en/programs-client.html)<file_sep>#!/bin/bash
# Def Var
PWD=`pwd`
CNF_DIR=${PWD}
MYSQL_HOME="/opt/mysql"
MY_CNF="my-master.cnf"
# Update my.cnf
cd ${MYSQL_HOME}
if [ -f my.cnf ]; then
mv my.cnf my.cnf.org
fi
cd ${CNF_DIR}
cp -v ${MY_CNF} ${MYSQL_HOME}/my.cnf
cat ${MYSQL_HOME}/my.cnf
echo "***** Done *****"
<file_sep>#!/bin/bash
echo "***** START MYSQL *****"
INSTALL_DIR=/opt/mysql
${INSTALL_DIR}/bin/mysqld_safe &
echo "***** Done! *****"
<file_sep># Monitor
## Host performance
### Monitored items
- CPU
- MEMORY
- SWAP
- DISK
- IOPS
- SSD/HDD
- File system
- NETWORK
- IN
- OUT
### Metrics
## Database performance
- Memory
- Threads
- Connections
- Query volume
- Running status
- [mysql status](mysqlStatus.md)
## Benchmarking
- mysqlslap
```bash
mysqlslap --iterations=5000 --concurrency=50 --query=workload.sql \
  --create=schema.sql --delimiter=";"
```
- sql-bench
## Performance Schema
[Performance Schema](../Optimize/PerformanceSchema.md)
## Alerting
- Alert thresholds
- Alert channels
- Mail
- SMS
- Weixin
## Reporting
- Implementation
- MySQL+Python
## REF
- [performance-schema](https://dev.mysql.com/doc/refman/5.6/en/performance-schema-quick-start.html)
<file_sep># Server Configuration
## ENV
- CPU
- Memory
- [SWAP](Server/swap.md)
- [DISK](Server/disk.md)
- Network
- Operating system
<file_sep># IndexConstraint
- Viewing indexes
```mysql
USE sakila;
(shawn@localhost) [sakila] 13:58:39> SHOW INDEX FROM city\G
*************************** 1. row ***************************
Table: city
Non_unique: 0
Key_name: PRIMARY
Seq_in_index: 1
Column_name: city_id
Collation: A
Cardinality: 600
Sub_part: NULL
Packed: NULL
Null:
Index_type: BTREE
Comment:
Index_comment:
*************************** 2. row ***************************
Table: city
Non_unique: 1
Key_name: idx_fk_country_id
Seq_in_index: 1
Column_name: country_id
Collation: A
Cardinality: 300
Sub_part: NULL
Packed: NULL
Null:
Index_type: BTREE
Comment:
Index_comment:
2 rows in set (0.00 sec)
```
```mysql
USE information_schema;
(shawn@localhost) [information_schema] 17:22:29> show status like "Handler_read%";
+-----------------------+-------+
| Variable_name | Value |
+-----------------------+-------+
| Handler_read_first | 0 |
| Handler_read_key | 0 | <- a low value suggests indexes are rarely used
| Handler_read_last | 0 |
| Handler_read_next | 0 |
| Handler_read_prev | 0 |
| Handler_read_rnd | 0 |
| Handler_read_rnd_next | 127 | <- a high value suggests many full scans (poor query efficiency)
+-----------------------+-------+
7 rows in set (0.00 sec)
```
- Reclaiming table space
```mysql
alter table tbl_name engine=innodb;
```<file_sep># Branch
- [Mariadb](Mariadb.md)
- [Percona](Percona.md)
<file_sep># Index Constraint
- [Index categories](IndexCategory.md)
- [Index types](IndexType.md)
- Index optimization => see the performance optimization section
## ICP (Index Condition Pushdown)
[Index Condition Pushdown](https://dev.mysql.com/doc/refman/5.6/en/index-condition-pushdown-optimization.html)
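A sketch of where ICP applies, using a hypothetical `people` table with a composite index; with ICP the EXPLAIN Extra column typically shows `Using index condition`:
```mysql
CREATE TABLE people (
  id       INT PRIMARY KEY,
  zipcode  VARCHAR(10),
  lastname VARCHAR(50),
  address  VARCHAR(100),
  KEY idx_zip_last (zipcode, lastname)
);
EXPLAIN SELECT * FROM people
 WHERE zipcode = '95054' AND lastname LIKE '%son%';
```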
<file_sep># MySQL STATUS
> Examining Server Status
```mysql
SHOW STATUS
```
```bash
mysqladmin status
mysqladmin extended-status
```
> Top Status Variables
```mysql
SHOW STATUS LIKE 'Created_tmp_disk_tables';
SHOW STATUS LIKE 'Handler_read_first';
SHOW STATUS LIKE 'Innodb_buffer_pool_wait_free';
SHOW STATUS LIKE 'Max_used_connections';
SHOW STATUS LIKE 'Open_tables';
SHOW STATUS LIKE 'Select_full_join';
SHOW STATUS LIKE 'Slow_queries';
SHOW STATUS LIKE 'Sort_merge_passes';
SHOW STATUS LIKE 'Threads_connected';
SHOW STATUS LIKE 'Uptime';
```
<file_sep># MySQL 5.7
## New features in MySQL 5.7
Server-layer optimizations, InnoDB-layer optimizations, optimizer improvements, EXPLAIN FOR CONNECTION, JSON support, generated (virtual) columns, the new sys schema, performance_schema enhancements, multiple triggers per table and event, online VARCHAR length changes, parallel replication, and more.
## [Reference](./Reference.md)
<file_sep># Tools
- Modeling
- PowerDesigner
- Client access
- Navicat
- DataGrip
- HeidiSQL
- Manage
- [MySQL Workbench](https://dev.mysql.com/doc/workbench/en/)
- SQL Development
- Database Design and Modeling
- Server Administration
- Backup
- MySQL Enterprise Backup
- ```mysqldump``` : Logical backups
- Replication
- mysqldbcopy
- mysqldbcompare
- mysqlrpladmin
- mysqlfailover
- Router/Proxy
- mysql router
- ProxySQL
- Testing
- [Sysbench](Sysbench/sysbench.md)
- orion
- supersmack
- [mysql-tpcc](mysql-tpcc.md)
- Tool collections
- mysql utilities
- [mysql client](mysql-client.md)
- [percona-tools for mysql](perconaTools.md)
- orzdba,iotop,ytop
- pstack
- Monitoring
- [MySQL Enterprise Monitor](https://www.mysql.com/products/enterprise/monitor.html)
- others
- [A library of general-purpose stored procedures](http://mysql-sr-lib.sourceforge.net/)
- [Porting stored procedures from SQL Server](https://github.com/TownSuite/tsql2mysql)
- [UDF Repository for MySQL](http://www.mysqludf.org/)
- [MySQL Sandbox](http://mysqlsandbox.net/)
- [common_schema](https://github.com/shlomi-noach/common_schema)
- [old index](http://code.openark.org/forge/common_schema)
> common_schema is to MySQL as jQuery is to JavaScript.
<file_sep># Reference
## 官方网站
- [refman](https://dev.mysql.com/doc/refman/5.6/en/)
- [Supported Platforms: MySQL Database](https://www.mysql.com/support/supportedplatforms/database.html)
- [news](http://www.mysql.com/news-and-events/web-seminars/)
- [events](http://www.mysql.com/news-and-events/events/)
- [oracle-wiki](https://wikis.oracle.com/display/mysql)
- [shop](https://shop.oracle.com/)
- [err-msg](https://dev.mysql.com/doc/refman/5.6/en/error-messages-server.html)
- [glossary](https://dev.mysql.com/doc/refman/5.6/en/glossary.html)
- [index-other](https://dev.mysql.com/doc/index-other.html)
## Blogs
- [Personal blog of Zhai Zhenxing (senior DBA at NetEase)](http://zhaizhenxing.blog.51cto.com/)
- [wiki](https://en.wikipedia.org/wiki/MySQL)
- [hatemysql](http://hatemysql.com/)
## Other
> Replication & HA
- [MySQL复制原理和简单配置(5.6.20)](http://mp.weixin.qq.com/s/sZHXgPbAyb7rrdvowWh6zg)
- [MySQL 主主复制 + LVS + Keepalived 实现 MySQL 高可用性](https://mp.weixin.qq.com/s/HG_-Wj1T-btLSPEbe0LchA)
- [MySQL 复制介绍及搭建](http://mp.weixin.qq.com/s/7J0iJUlnMgy4ERGZWGUwSw)
- [一条更新操作引起的MySQL主从复制异常](http://mp.weixin.qq.com/s/oFqcr2ooJuNCvMwuCfCcHw)
- [秒级故障切换!用MHA轻松实现MySQL高可用(一)](http://mp.weixin.qq.com/s/Kbw1OnxtDnruvKepA5Ymtw)
- [秒级故障切换!用MHA轻松实现MySQL高可用(三)](http://mp.weixin.qq.com/s/xLIHHbRnm1K1bMI2S5FY6g)
- [应对亿级访问,另辟蹊径实现MySQL主库高可用](https://mp.weixin.qq.com/s/JcWTNhojkenkX0ZaK-3CWw)
- [MySQL高可用在网易的最佳应用与实践](http://mp.weixin.qq.com/s/ajlnIRa2q_kOaeuE2uNZgA)
- [10款常见MySQL高可用方案选型解读](http://mp.weixin.qq.com/s/ya07VRKS30qGOhC2dlh7sA)
- [魅族资深DBA:利用MHA构建MySQL高可用平台](http://mp.weixin.qq.com/s/1-ODR9qXBvl-xzVcLhg0FQ)
- [优酷土豆资深工程师:MySQL高可用之MaxScale与MHA(有彩蛋)](https://mp.weixin.qq.com/s/8dGRHp58QazUIK3t0AIsEA)
- [MySQL Master High Available 理论篇(一)](http://mp.weixin.qq.com/s/4tRrTIkt31YhnZIUsND8yw)
- [MySQL Master High Available 理论篇(二)](http://mp.weixin.qq.com/s/fFq0Zb6KhEA7zwOetOucuw)
- []()
> Optimize
- [一张思维导图学会如何构建高性能MySQL系统!](http://mp.weixin.qq.com/s/OBIOAjHo5k8ioNz-F2LoQA)
- [MySQL大表优化方案](https://segmentfault.com/a/1190000006158186)
- [MySQL阿里实践经典案例之参数调优最佳实践](http://mp.weixin.qq.com/s/fmhYRroQ3GNFdV2XxmBHWw)
- [优化案例 | CASE WHEN进行SQL改写优化](http://mp.weixin.qq.com/s/78UiSgJ-WfXVeqP9tQ2AfQ)
- [秒杀场景下的开源MySQL压测及性能优化](http://mp.weixin.qq.com/s/5_treDJ4np4HbJz_IsvAMQ)
- [微博MySQL优化之路](http://mp.weixin.qq.com/s/4bcpWFB2C_cKxA9meFtgUg)
- [如何用一款小工具大大加速MySQL SQL语句优化(附源码)](http://mp.weixin.qq.com/s/OasqEyagKola-vfdQx-EEg)
- [解开发者之痛:中国移动MySQL数据库优化最佳实践(有彩蛋)](http://mp.weixin.qq.com/s/PM5BvRbq7YI1Qd9olCA7Rg)
- [单表60亿记录等大数据场景的MySQL优化和运维之道](http://mp.weixin.qq.com/s/Cet5WaLJKe9IC3PM1WFlhw)
- [MySQL架构优化实战系列2:主从复制同步与查询性能调优](http://mp.weixin.qq.com/s/gjF7xGWjkumxA89YqahVdA)
- []()
> Docker
- [京东MySQL数据库Docker化最佳实践](http://mp.weixin.qq.com/s/ODTzyVm1rTl6PJ7gD2WN6A)
- [MySQL 到底能不能放到 Docker 里跑?](http://mp.weixin.qq.com/s/LhCHEkSstmru4PnrfuoaVg)
- []()
> Other
- [MySQL 开发实践 8 问](http://mp.weixin.qq.com/s/gRNDkWgF8F5ZppC3tl_fyw)
- [MySQL去O的实施技术](http://mp.weixin.qq.com/s/pwHSGoF6rgnOtLCCREhG6A)
- [mysql备份实战-Xtrabackup工具备份](https://mp.weixin.qq.com/s/qZesUVLQ25t14vtbRAzIWA)
- [MySQL数据库的“十宗罪”(附10大经典错误案例)](https://mp.weixin.qq.com/s/q-FYtKqyYdr_ikRKILPWaw)
- [微店MySQL自动化运维体系的构建之路](http://mp.weixin.qq.com/s/UP7vL2rdlFwjR35K_I4Hrg)
- [MySQL row格式的两个问题](http://mp.weixin.qq.com/s/KfC8D060gy6EIJtKK3Kp9A)
- [如何监控MySQL的复制延迟?](http://mp.weixin.qq.com/s/e4As2xlm4-Cwi0jcH9KrkQ)
- [MySQL 的 20+ 条最佳实践](https://mp.weixin.qq.com/s/4BJYDUGOpKeADYb83uALHw)
- [全栈必备:MySQL性能调优](http://blog.jobbole.com/107264/)
- [解密网易MySQL实例迁移高效完成背后的黑科技](http://mp.weixin.qq.com/s/fHSqYstewJmNZuOhwYWiSQ)
- [FAQ系列 | InnoDB Monitor被自动打开](http://mp.weixin.qq.com/s/ySKUGqv0hxKn8Wlp_eroGA)
- [存储总量达20T的MySQL实例,如何完成迁移?](http://mp.weixin.qq.com/s/SUjuQC-bBu_eD_PQWvQ7Vw)
- [中间件mysql-proxy的一些细节](http://mp.weixin.qq.com/s/p7lVZYSuzMwexulvQiN4yw)
- [京东 MySQL 监控之 Zabbix 优化、自动化](http://mp.weixin.qq.com/s/od3jVTbkIkDOAmZK0Y90bA)
- [What is CDB for My SQL?](https://www.qcloud.com/document/product/236/5147)
- [MySQL 资源大全](http://blog.jobbole.com/100516/)
- [MySQL 最佳实践: RDS 只读实例延迟分析](http://mp.weixin.qq.com/s/Pt_5UMQ4_f0F4C-pAqUjjA)
- [MySQL 性能监控 4 大指标](https://mp.weixin.qq.com/s/Vi7O7Mr0HNSmN3OFQbigZg)
- [学会用各种姿势备份MySQL数据库](http://mp.weixin.qq.com/s/N-GTa5oBsF8sBc_epfz5OA)
- [腾讯游戏DBA利刃 - SQL审核工具介绍](http://mp.weixin.qq.com/s/d5vcT6HS3igfgqhk6knP9w)
## MySQL_5.7
- [MySQL 5.7 REF](../MySQL_5.7/Reference.md)
## MySQL_8
## OCP(MySQL)
- [唠唠MySQL-OCP那点事](http://mp.weixin.qq.com/s/VvcnPfVmFLVbNzEvgvikeQ)
- [MySQL OCP考试记](http://mp.weixin.qq.com/s/yur7OTc0ESxUYCg6HOKD3w)
- []()
<file_sep># Performance Schema
```mysql
SHOW VARIABLES LIKE 'performance_schema';
```
```mysql
USE performance_schema;
SELECT file_name, event_name FROM file_instances LIMIT 1;
```
## Instruments
> Instruments in Performance Schema are points within the server source code from which MySQL raises events.
```mysql
stage/sql/statistics
statement/com/Binlog Dump
wait/io/file/innodb/innodb_data_file
wait/io/file/sql/binlog
wait/io/socket/sql/server_unix_socket
```
## REF
- [performance-schema-table-descriptions](https://dev.mysql.com/doc/refman/5.6/en/performance-schema-table-descriptions.html)
<file_sep># Locking
## Types of locks
- Shared lock
- Exclusive lock
## Row locks and table locks
- Row locks
- Shared lock (S)
- Exclusive lock (X)
- Table locks
- Intention shared lock (IS)
- Intention exclusive lock (IX)
- The three row-lock algorithms
- Record Lock
- Gap Lock
- Next-key Lock: Record Lock + Gap Lock
Locks a range of index records as well as the record itself
## Explicit Row Locks
- LOCK IN SHARE MODE
- FOR UPDATE
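A minimal sketch of both forms; it assumes the sakila sample database and should be run inside a transaction:
```mysql
START TRANSACTION;
-- Shared lock: other sessions may still read the row, but cannot modify it
SELECT * FROM sakila.city WHERE city_id = 1 LOCK IN SHARE MODE;
-- Exclusive lock instead:
-- SELECT * FROM sakila.city WHERE city_id = 1 FOR UPDATE;
COMMIT;
```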
## Monitoring lock status
```mysql
SHOW FULL PROCESSLIST;
SHOW ENGINE INNODB STATUS;
SELECT * FROM INFORMATION_SCHEMA.INNODB_TRX;
SELECT * FROM INFORMATION_SCHEMA.INNODB_LOCKS;
-- SELECT * FROM INFORMATION_SCHEMA.INNODB_LOCK_WAITS;
```
## Deadlocks
> Deadlocks are a classic problem in transactional databases, but they are not dangerous
unless they are so frequent that you cannot run certain transactions at all.
## Implicit Locks
> No lock unless SERIALIZABLE level, LOCK IN SHARE MODE, or FOR UPDATE is used
## REF
- [innodb-deadlock-detection](https://dev.mysql.com/doc/refman/5.6/en/innodb-deadlock-detection.html)
<file_sep># Audit
- Audit targets
- SQL -- DDL
- EVENT(audit.log)
- [Audit rules](rule.md)
- Considerations for SQL review
- Database version, features, and differences
- The SQL execution plan
- Table and index information
- The statistics collection strategy
- SQL execution time and concurrency
- Table data volume and its expected growth
- The business characteristics behind the SQL

<file_sep>## MySQL 5.7 REF
- [从一个调试结果来看MySQL 5.7的无损复制](http://mp.weixin.qq.com/s/qQUtOeAazK8QWUluP3880g)
- [网易这样用sys schema优雅提升MySQL易用性](http://mp.weixin.qq.com/s/3QA3Y-zmtX2AJxKg-SyEEA)
- [分分钟搭建MySQL Group Replication测试环境](https://mp.weixin.qq.com/s/1pOry8UyRrxi9kO2opAvNg)
- [【MySQL 5.7.17】从主从复制到Group Replication](http://mp.weixin.qq.com/s/mkZ8zTSlO0-G4nuo7zLG4g)
- [MySQL Profile在5.7的简单测试](http://mp.weixin.qq.com/s/m2D7rgYR2n4ujH2MZ0Fhwg)
- [迄今最安全的MySQL?细数5.7那些惊艳与鸡肋的新特性(中)](http://mp.weixin.qq.com/s/QITMHC4Vidh2-uQI3uS4GA)
- [MySQL 5.7 双主复制+keepalived,常规业务一般够用了](http://mp.weixin.qq.com/s/sryCTX-LOwYj2y2aHYytjg)
- []()
- []()
<file_sep>-- schema base info
SELECT
TABLE_SCHEMA,
    ENGINE,
COUNT(*),
SUM(data_length) total_size
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
GROUP BY TABLE_SCHEMA, ENGINE;
<file_sep># Atomic
Atomicity: all statements execute as a single unit; either they all succeed or they are all cancelled.<file_sep># Memory
## Server/Shared
- Query cache
- Thread cache
## Storage Engine/Shared
- Log buffer
- Buffer pool
## Connection/Session
- Sort buffer
- Read buffer
- Temporary table
## REF
[How MySQL Uses Memory](https://dev.mysql.com/doc/refman/5.6/en/memory-use.html)
<file_sep>USE sakila;
CREATE TRIGGER City_AD
AFTER DELETE
ON City
FOR EACH ROW
INSERT INTO city_backup (city_id, city)
VALUES (OLD.city_id, OLD.city);
|
0ee3daa464cf466b38176d242b4ca5d26e578e3a
|
[
"Markdown",
"SQL",
"Shell"
] | 54 |
Markdown
|
dikang123/mysql-study
|
dd4a6b53a4204e7a2f723571b9d551500dd29799
|
0a1fe95b299707cbefc0b50b131c334ae33c3f6d
|
refs/heads/master
|
<repo_name>evgenidb/Telerik-exams<file_sep>/Semester 1/CSharp - Part 2/FirstExam/Greedy Dwarf/Program.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Greedy_Dwarf
{
class Program
{
static void Main(string[] args)
{
string strValley = Console.ReadLine();
char[] separators = { ',', ' ' };
string[] strValleyArray = strValley.Split(separators, StringSplitOptions.RemoveEmptyEntries);
int[] valley = new int[strValleyArray.Length];
for (int index = 0; index < valley.Length; index++)
{
valley[index] = int.Parse(strValleyArray[index]);
}
int numOfPatterns = int.Parse(Console.ReadLine());
int[][] patterns = new int[numOfPatterns][];
for (int pattIndex = 0; pattIndex < numOfPatterns; pattIndex++)
{
string strPattern = Console.ReadLine();
string[] strPatternArray = strPattern.Split(separators, StringSplitOptions.RemoveEmptyEntries);
int[] pattern = new int[strPatternArray.Length];
for (int index = 0; index < pattern.Length; index++)
{
pattern[index] = int.Parse(strPatternArray[index]);
}
patterns[pattIndex] = pattern;
}
long maxGold = long.MinValue;
long currentPatternGold = 0;
List<int> currentPattenVisitedLinks = new List<int>();
foreach (int[] pattern in patterns)
{
                int currentPosition = 0;
                currentPatternGold = 0;    // reset the running total for this pattern
                bool again = true;
while (again)
{
foreach (int patternIndex in pattern)
{
// Calc the current position
                        // Console.WriteLine(currentPosition);
// Validate currentPosition
if (currentPosition < 0 || currentPosition >= valley.Length)
{
// Console.WriteLine("Break1 {0} {1}", currentPatternGold, currentPosition);
again = false;
break;
}
// Check if we were in this position before
foreach (int pos in currentPattenVisitedLinks)
{
if (pos == currentPosition)
{
// Console.WriteLine("Break2 {0} {1}", currentPatternGold, currentPosition);
again = false;
break;
}
}
if (!again)
{
break;
}
// Add position
currentPattenVisitedLinks.Add(currentPosition);
// Add the gold in this position
currentPatternGold += valley[currentPosition];
currentPosition += patternIndex;
}
}
if (currentPatternGold > maxGold)
{
maxGold = currentPatternGold;
}
currentPattenVisitedLinks.Clear();
}
Console.WriteLine(maxGold);
}
}
}
<file_sep>/Semester 1/CSharp - Part 2/FirstExam/KaspichanNumbers/Program.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace KaspichanNumbers
{
class Program
{
// Class variables
private static uint numBase = 10;
private static List<string> numbers;
private static bool testing = false;
// Properties
public static uint NumBase
{
get
{
return numBase;
}
set
{
numBase = value;
}
}
public static List<string> Numbers
{
get
{
return numbers;
}
set
{
numbers = value;
}
}
public static bool Testing
{
get
{
return testing;
}
set
{
testing = value;
}
}
// Methods
// Main Method
static void Main(string[] args)
{
// IsTesting
Testing = false;
// Number Generator
NumBase = 256;
GenerateNumbers();
// PrintGeneratedNumbers();
// User input
ulong numberToConvert = UserInput();
// Convert Number
string convertedNumber = ConvertNumber(numberToConvert);
// Print Number
PrintNumber(convertedNumber);
}
// Number Generator Methods
private static void GenerateNumbers()
{
Numbers = new List<string>();
for (int num = 0; num < NumBase; num++)
{
string currentNumber = GetNumber(num);
// PrintMessage(currentNumber);
Numbers.Add(currentNumber);
}
}
private static string GetNumber(int num)
{
int firstNumID = (int)((num / 26) - 1);
string firstNum = "";
if (firstNumID < 0)
{
firstNum = "";
}
else
{
firstNum = ((char)(((uint)'a') + firstNumID)).ToString();
}
uint secondNumID = (uint)(num % 26);
string secondNum = "";
if (secondNumID < 0)
{
secondNum = "";
}
else
{
secondNum = ((char)(((uint)'A') + secondNumID)).ToString();
}
string currentNumber = firstNum + secondNum;
return currentNumber;
}
// User Input
private static ulong UserInput()
{
return ulong.Parse(Console.ReadLine());
}
// Number Conversion
private static string ConvertNumber(ulong numberToConvert)
{
string convertedNumber = "";
PrintMessage("Foo!");
PrintMessage(numberToConvert.ToString());
PrintMessage((numberToConvert / NumBase).ToString());
PrintMessage((numberToConvert % NumBase).ToString());
if (numberToConvert == 0)
{
return "A";
}
while (numberToConvert / NumBase != 0 || numberToConvert % NumBase != 0)
{
uint numberID = (uint)(numberToConvert % NumBase);
numberToConvert /= NumBase;
PrintMessage("Inside While!");
convertedNumber = Numbers[(int)numberID] + convertedNumber;
PrintMessage(convertedNumber);
}
PrintMessage("Bar");
return convertedNumber;
}
// Print Methods
private static void PrintNumber(string convertedNumber)
{
Console.WriteLine(convertedNumber);
}
private static void PrintGeneratedNumbers()
{
// If we're not debuging - don't print anything
if (!Testing)
{
return;
}
foreach (string num in Numbers)
{
Console.Write("{0}, ", num);
}
}
private static void PrintMessage(string currentNumber)
{
// If we're not debuging - don't print anything
if (!Testing)
{
return;
}
Console.WriteLine(currentNumber);
}
}
}
<file_sep>/Semester 1/CSharp - Part 2/FirstExam/ConsoleJustification/Program.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace ConsoleJustification
{
class Program
{
// Class Variables
private static bool testing = false;
// Properties
public static bool Testing
{
get
{
return testing;
}
set
{
testing = value;
}
}
// Methods
// Main Method
static void Main(string[] args)
{
// IsTesting
Testing = false;
if (!Testing)
{
// User input
int userInputLineCount = 0;
int maxLineWidth = 0;
List<string> lines = UserInputContainer(ref userInputLineCount, ref maxLineWidth);
// Justificate
List<string> justifiedLines = Justify(lines, maxLineWidth);
// Print Text
PrintText(justifiedLines);
}
else
{
RunTests();
}
}
// User Input
private static List<string> UserInputContainer(ref int userInputLineCount, ref int maxLineWidth)
{
userInputLineCount = ReadUserLineCount();
maxLineWidth = ReadLineWidth();
List<string> lines = ReadLines(userInputLineCount);
return lines;
}
private static List<string> ReadLines(int userInputLineCount)
{
List<string> lines = new List<string>();
for (int line = 0; line < userInputLineCount; line++)
{
string strLine = Console.ReadLine();
lines.Add(strLine);
}
return lines;
}
private static int ReadUserLineCount()
{
return int.Parse(Console.ReadLine());
}
private static int ReadLineWidth()
{
return int.Parse(Console.ReadLine());
}
// Justify
static List<string> Justify(List<string> textToJustify, int maxLineWidth)
{
List<string> justifiedText = new List<string>();
//PrintDebugMessage("Justify Method - Start");
// Trim the lines of the extra spaces on both ends
TrimWhitespaces(textToJustify);
// Lines to single string
string text = LineTextToString(textToJustify);
//PrintDebugMessage(text);
// Replace all multiple spaces with single space
string[] words = TextToWords(text);
PrintDebugMessageArray(words);
// Construct the New Lines
List<string> newLines = ConstructNewLines(words, maxLineWidth);
// PrintDebugMessageList(newLines);
// Justify the New Lines
justifiedText = JustifyText(newLines, maxLineWidth);
//PrintDebugMessage("Justify Method - End");
return justifiedText;
}
private static void TrimWhitespaces(List<string> textToJustify)
{
for (int line = 0; line < textToJustify.Count; line++)
{
                textToJustify[line] = textToJustify[line].Trim();
}
}
private static List<string> JustifyText(List<string> newLines, int maxLineWidth)
{
List<string> justifiedText = new List<string>();
foreach (string line in newLines)
{
if (line.Length >= maxLineWidth || !(line.Contains(" ")))
{
//PrintDebugMessage("One-liner!");
// The line is with maxWidth or there is just one word, i.e. no need to justify
// so we need only to add it to the justifiedText
justifiedText.Add(line);
// Continue to the next line
continue;
}
else
{
//PrintDebugMessage("Multy-liner!");
int spacesToAdd = maxLineWidth - line.Length;
StringBuilder lineToJustify = new StringBuilder(line);
string spacesToSearch = " ";
int currentIndexOfSpaces = -1;
for (int space = 0; space < spacesToAdd; space++)
{
currentIndexOfSpaces = (lineToJustify.ToString()).IndexOf(spacesToSearch, (currentIndexOfSpaces + 1));
if (spacesToSearch == " ")
{
PrintDebugMessage("");
}
if (currentIndexOfSpaces < 0)
{
spacesToSearch += " ";
currentIndexOfSpaces = -1;
space--;
continue;
}
lineToJustify.Insert(index: currentIndexOfSpaces, value: " ");
currentIndexOfSpaces += spacesToSearch.Length;
}
// Add the justified line
justifiedText.Add(lineToJustify.ToString());
}
}
return justifiedText;
}
private static List<string> ConstructNewLines(string[] words, int maxLineWidth)
{
List<string> newLines = new List<string>();
StringBuilder currentLine = new StringBuilder();
for (int wordIndex = 0; wordIndex < words.Length; wordIndex++ )
{
if ((currentLine.Length + 1 + words[wordIndex].Length) <= maxLineWidth) // Current Line Width + 1 (for the space) + Current Word Width must be <= to Max Line Width
{
if (currentLine.Length > 0)
{
// Add a space if there is a word before the current one
currentLine.Append(" ");
}
}
else
{
// Add the new constructed line
if (currentLine.Length > 0)
{
newLines.Add(currentLine.ToString());
}
// Clear the current line for the next line
currentLine.Clear();
}
// Append the current word
currentLine.Append(words[wordIndex]);
// If it's the last word - Append it!
if (wordIndex + 1 == words.Length)
{
newLines.Add(currentLine.ToString());
}
}
return newLines;
}
private static string[] TextToWords(string text)
{
char[] separators = { ' ' };
return text.Split(separators, StringSplitOptions.RemoveEmptyEntries);
}
private static string LineTextToString(List<string> textToJustify)
{
StringBuilder text = new StringBuilder();
for (int lineNumber = 0; lineNumber < textToJustify.Count; lineNumber++)
{
text.Append(textToJustify[lineNumber]);
if (lineNumber + 1 != textToJustify.Count)
{
// i.e. it's not the last line - add an extra space
text.Append(" ");
}
}
return text.ToString();
}
// Prints
private static void PrintText(List<string> text)
{
PrintDebugMessage("");
foreach (string line in text)
{
Console.WriteLine("{0}", line);
}
if (Testing)
{
Console.WriteLine();
Console.WriteLine("Line Count: {0}", text.Count);
Console.WriteLine();
}
}
private static void PrintDebugMessage(string msg = "")
{
// If we're not debuging - don't print anything
if (!Testing)
{
return;
}
Console.WriteLine(msg);
}
private static void PrintDebugMessageArray(string[] words)
{
// If we're not debuging - don't print anything
if (!Testing)
{
return;
}
foreach (string w in words)
{
Console.WriteLine(w);
}
}
private static void PrintDebugMessageList(List<string> lines)
{
// If we're not debuging - don't print anything
if (!Testing)
{
return;
}
foreach (string line in lines)
{
Console.WriteLine(line);
}
}
// Testing
// Test Container
private static void RunTests()
{
int failures = 0;
Func<bool>[] tests = { Test1, Test2, Test3, Test4, Test5 };
for (int i = 0; i < tests.Length; i++)
{
if (!tests[i]())
{
failures++;
Console.WriteLine();
PrintDebugMessage(String.Format("Test {0} Failed!", i + 1));
Console.WriteLine("____________________________");
Console.WriteLine();
}
}
PrintDebugMessage(String.Format("{0} of {1} Tests Failed!", failures, tests.Length));
}
// Tests
private static bool Test1()
{
int maxLineWidth = 20;
List<string> lines = new List<string>();
lines.Add("We happy few we band");
lines.Add("of brothers for he who sheds");
lines.Add("his blood");
lines.Add("with");
lines.Add("me shall be my brother");
List<string> justifiedLines = Justify(lines, maxLineWidth);
PrintText(justifiedLines);
bool isSuccess = false;
return isSuccess;
}
private static bool Test2()
{
int maxLineWidth = 18;
List<string> lines = new List<string>();
lines.Add("Beer beer beer Im going for");
lines.Add(" a ");
lines.Add("beer");
lines.Add("Beer beer beer Im gonna");
lines.Add("drink some beer");
lines.Add("I love drinkiiiiiiiiing");
lines.Add("beer");
lines.Add("lovely");
lines.Add("lovely");
lines.Add("beer");
List<string> justifiedLines = Justify(lines, maxLineWidth);
PrintText(justifiedLines);
bool isSuccess = false;
return isSuccess;
}
private static bool Test3()
{
int maxLineWidth = 1;
List<string> lines = new List<string>();
lines.Add("a");
List<string> justifiedLines = Justify(lines, maxLineWidth);
PrintText(justifiedLines);
bool isSuccess = false;
return isSuccess;
}
private static bool Test4()
{
int maxLineWidth = 20;
List<string> lines = new List<string>();
lines.Add("a a");
List<string> justifiedLines = Justify(lines, maxLineWidth);
PrintText(justifiedLines);
bool isSuccess = false;
return isSuccess;
}
private static bool Test5()
{
int maxLineWidth = 20;
List<string> lines = new List<string>();
lines.Add("a a");
lines.Add("a a");
List<string> justifiedLines = Justify(lines, maxLineWidth);
PrintText(justifiedLines);
bool isSuccess = false;
return isSuccess;
}
}
}
|
f1c250ad63e4c811ff2d7d31fa041c3e3efce9cc
|
[
"C#"
] | 3 |
C#
|
evgenidb/Telerik-exams
|
9ab3141db0bcd202d6df4a21704ca912e491f898
|
ce7b676af543e52b3aff9da44ff81a9eac53352f
|
refs/heads/master
|
<file_sep>from django.contrib import admin
from termine.models import *
# Register your models here.
<file_sep>from django.shortcuts import render
from django.views.generic import TemplateView
from termine.models import *
# Create your views here.
class TermineView(TemplateView):
template_name = "base_termine_index.html"
def get_context_data(self, **kwargs):
# Call the base implementation first to get a context
context = super(TermineView, self).get_context_data(**kwargs)
return context<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-06 16:22
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('home', '0007_auto_20160806_1607'),
]
operations = [
migrations.CreateModel(
name='HandyLogo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('bild', models.ImageField(upload_to='')),
('beschreibung', models.CharField(max_length=200)),
('prioritaet', models.PositiveSmallIntegerField()),
('veroeffentlichen', models.BooleanField(default=True)),
],
),
]
<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-06 22:25
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('info', '0005_infotext'),
]
operations = [
migrations.CreateModel(
name='InfoBild',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('bild', models.ImageField(upload_to='')),
('beschreibung', models.CharField(max_length=200)),
],
),
]
<file_sep>from django.shortcuts import render
from django.views.generic import TemplateView
from rwk.models import *
# Create your views here.
class RwkView(TemplateView):
template_name = "base_rwk_index.html"
def get_context_data(self, **kwargs):
# Call the base implementation first to get a context
context = super(RwkView, self).get_context_data(**kwargs)
return context<file_sep>from django.conf.urls import url
from termine.views import *
app_name = 'termine'
urlpatterns = [
url(r'^$', TermineView.as_view(), name='index'),
]<file_sep>from django.conf.urls import url
from impressum.views import *
app_name = 'impressum'
urlpatterns = [
url(r'^$', ImpressumView.as_view(), name='index'),
]<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-05 22:03
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('home', '0002_auto_20160805_2158'),
]
operations = [
migrations.AddField(
model_name='kurzinfo',
name='priority',
field=models.PositiveSmallIntegerField(default=0),
preserve_default=False,
),
]
<file_sep>from django.shortcuts import render
from django.views.generic import TemplateView
from home.models import *
# Create your views here.
class HomeView(TemplateView):
template_name = "base_home_index.html"
def get_context_data(self, **kwargs):
# Call the base implementation first to get a context
context = super(HomeView, self).get_context_data(**kwargs)
# Add in a QuerySet of all the books
context['bilder'] = KarussellBild.objects.order_by('-prioritaet').filter(veroeffentlichen=True)
context['infos'] = KurzInfo.objects.order_by('-prioritaet').filter(veroeffentlichen=True)
context['logo'] = None
logos = HandyLogo.objects.filter(veroeffentlichen=True)
if (logos.count()>0):
context['logo'] = logos[0]
return context<file_sep>from django.conf.urls import url
from kontakt.views import *
app_name = 'kontakt'
urlpatterns = [
url(r'^$', KontaktView.as_view(), name='index'),
]<file_sep>from django.shortcuts import render
from django.views.generic import TemplateView
from kontakt.models import *
# Create your views here.
class KontaktView(TemplateView):
template_name = "base_kontakt_index.html"
def get_context_data(self, **kwargs):
# Call the base implementation first to get a context
context = super(KontaktView, self).get_context_data(**kwargs)
return context<file_sep>from django.contrib import admin
from info.models import *
# Register your models here.
admin.site.register(Person)
admin.site.register(InfoText)
admin.site.register(InfoBild)<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-06 16:07
from __future__ import unicode_literals
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('home', '0006_karussellbild_published'),
]
operations = [
migrations.RenameField(
model_name='karussellbild',
old_name='description',
new_name='beschreibung',
),
migrations.RenameField(
model_name='karussellbild',
old_name='priority',
new_name='prioritaet',
),
migrations.RenameField(
model_name='karussellbild',
old_name='published',
new_name='veroeffentlichen',
),
migrations.RenameField(
model_name='kurzinfo',
old_name='priority',
new_name='prioritaet',
),
migrations.RenameField(
model_name='kurzinfo',
old_name='published',
new_name='veroeffentlichen',
),
]
<file_sep>from django.contrib import admin
from home.models import *
# Register your models here.
admin.site.register(KurzInfo)
admin.site.register(KarussellBild)
admin.site.register(HandyLogo)<file_sep>beautifulsoup4==4.5.1
Django==1.10
django-cleanup==0.4.2
django-filter==0.13.0
Markdown==2.6.6
Pillow==3.3.0
pytz==2016.6.1
<file_sep>from django.apps import AppConfig
class RwkConfig(AppConfig):
name = 'rwk'
<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-06 17:26
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('info', '0002_auto_20160806_1725'),
]
operations = [
migrations.AlterField(
model_name='person',
name='position',
field=models.PositiveSmallIntegerField(choices=[(0, '1. Vorstand'), (1, '2. Vorstand')]),
),
]
<file_sep>from django.contrib import admin
from impressum.models import *
# Register your models here.
<file_sep>POSITIONEN = (
(0, '1. Schützenmeister'),
(1, '2. Schützenmeister'),
)<file_sep>from django.conf.urls import url
from info.views import *
app_name = 'info'
urlpatterns = [
url(r'^$', InfoView.as_view(), name='index'),
]<file_sep>from django.conf.urls import url
from rwk.views import *
app_name = 'rwk'
urlpatterns = [
url(r'^$', RwkView.as_view(), name='index'),
]<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-06 17:55
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('info', '0004_auto_20160806_1740'),
]
operations = [
migrations.CreateModel(
name='InfoText',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('text', models.TextField()),
('ueberschrift', models.CharField(max_length=50)),
('veroeffentlichen', models.BooleanField(default=True)),
],
),
]
<file_sep># SchuetzenHomepage
<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-05 23:27
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('home', '0005_auto_20160805_2237'),
]
operations = [
migrations.AddField(
model_name='karussellbild',
name='published',
field=models.BooleanField(default=True),
),
]
<file_sep># -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-05 22:11
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('home', '0003_kurzinfo_priority'),
]
operations = [
migrations.CreateModel(
name='KarussellBild',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('bild', models.FileField(upload_to='')),
('description', models.CharField(max_length=200)),
('priority', models.PositiveSmallIntegerField()),
],
),
migrations.RemoveField(
model_name='kurzinfo',
name='created_date',
),
migrations.RemoveField(
model_name='kurzinfo',
name='modified_date',
),
]
<file_sep>from django.db import models
from django.utils.safestring import mark_safe
from info.choices import *
# Create your models here.
class Person(models.Model):
name = models.CharField(max_length=30)
bild = models.ImageField()
veroeffentlichen = models.BooleanField(default=True)
position = models.PositiveSmallIntegerField(choices=POSITIONEN)
email = models.EmailField()
def __str__(self):
return self.name + ' ist ' + POSITIONEN[self.position][1]
class InfoText(models.Model):
ueberschrift = models.CharField(max_length=50)
text = models.TextField()
veroeffentlichen = models.BooleanField(default=True)
def textSafe(self):
return mark_safe(self.text)
def __str__(self):
return self.ueberschrift
class InfoBild(models.Model):
bild = models.ImageField()
beschreibung = models.CharField(max_length=200)
def __str__(self):
return self.beschreibung<file_sep>from django.shortcuts import render
from django.views.generic import TemplateView
from impressum.models import *
# Create your views here.
class ImpressumView(TemplateView):
template_name = "base_impressum_index.html"
def get_context_data(self, **kwargs):
# Call the base implementation first to get a context
context = super(ImpressumView, self).get_context_data(**kwargs)
return context<file_sep>from django.shortcuts import render
from django.views.generic import TemplateView
from info.models import *
# Create your views here.
class InfoView(TemplateView):
template_name = "base_info_index.html"
def get_context_data(self, **kwargs):
# Call the base implementation first to get a context
context = super(InfoView, self).get_context_data(**kwargs)
vorstaende = Person.objects.filter(veroeffentlichen=True, position=0)
context['ersterVorstand'] = None
if (vorstaende.count()>0):
context['ersterVorstand'] = vorstaende[0]
vorstaende = Person.objects.filter(veroeffentlichen=True, position=1)
context['zweiterVorstand'] = None
if (vorstaende.count()>0):
context['zweiterVorstand'] = vorstaende[0]
infoText = InfoText.objects.filter(veroeffentlichen=True)
context['infoText'] = None
if (infoText.count()>0):
context['infoText'] = infoText[0]
return context
|
93fb224f08f59e46b7d9c05eb175b88dc0ab90ea
|
[
"Markdown",
"Python",
"Text"
] | 28 |
Python
|
enlightenedbit/SchuetzenHomepage
|
79dc8a999c9e2e2752958e480e1bc1da3ed2601a
|
c6aee18b9e846521bba4503af797b99b8a877518
|
refs/heads/master
|
<file_sep>export const MY_CONST = 36;
<file_sep># verdaccio-publish
Publishing to a private NPM registry with `verdaccio`
----
### Publish via the `verdaccio` CLI:
```
yarn --cwd ./publisher-cli && yarn --cwd ./publisher-cli test
```
### Publish via the `verdaccio` Node.js API:
```
yarn --cwd ./publisher-node-api && yarn --cwd ./publisher-node-api test
```
|
91b7e82e2ba2e8ffa15b55badde7f65ba677bf9e
|
[
"JavaScript",
"Markdown"
] | 2 |
JavaScript
|
kittaakos/verdaccio-publish
|
9bc16a8302b069ad1eb6aafc8281124ec4eb603d
|
db9fc7ccac9b9dfa58a3dc06b9df66692aece866
|
refs/heads/master
|
<file_sep>package com.example.backend;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertTrue;
import java.util.List;
import java.util.Set;
import org.junit.jupiter.api.Test;
public class UserControllerTest {
@Test
public void addNewUser() {
UserController usrcon = new UserController();
String result = usrcon.addNewUser("uid", "email", "phone", "name");
assertEquals("Saved", result);
}
@Test
public void getAllUsers() {
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
Iterable<User> users = usrcon.getAll();
assertNotNull(users);
}
@Test
public void addNewDog() {
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
String result = usrcon.addNewDog("uid", "dogname", "breed", "age", "weight", "", "", "");
assertEquals("added new dog", result);
}
@Test
public void findDogs() {
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
usrcon.addNewDog("uid", "dogname", "breed", "age", "weight", "", "", "");
Dog dog = new Dog("dogname", "breed", "age", "weight", "", "", "");
List<Dog> dogs = usrcon.findDogs("uid");
assertTrue(dogs.contains(dog));
}
@Test
public void findUser() {
UserController usrcon = new UserController();
User user = new User("uid", "email", "phone", "name", new ContactList());
usrcon.addNewUser("uid", "email", "phone", "name");
User usrconUser = usrcon.findUser("uid");
assertEquals(user, usrconUser);
}
@Test
public void updateFail() {
UserController usrcon = new UserController();
String status = usrcon.update("uid", "name", "email", "phone");
assertEquals("no user found", status);
}
@Test
public void update() {
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
String status = usrcon.update("uid", "name", "email", "phone");
assertEquals("Updated", status);
}
@Test
public void newNameForUserFail() {
UserController usrcon = new UserController();
boolean status = usrcon.newNameForUser("uid", "newName");
assertFalse(status);
}
@Test
public void newNameForUser() {
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
boolean status = usrcon.newNameForUser("uid", "newName");
assertTrue(status);
}
@Test
public void newPhoneNumberForUserFail(){
UserController usrcon = new UserController();
boolean status = usrcon.newPhoneNumberForUser("uid", "newPhone");
assertFalse(status);
}
@Test
public void newPhoneNumberForUser(){
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
boolean status = usrcon.newPhoneNumberForUser("uid", "newPhone");
assertTrue(status);
}
@Test
public void newEmailForUserFail() {
UserController usrcon = new UserController();
boolean status = usrcon.newEmailForUser("uid", "newEmail");
assertFalse(status);
}
@Test
public void newEmailForUser() {
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
boolean status = usrcon.newEmailForUser("uid", "newEmail");
assertTrue(status);
}
@Test
public void deleteUserFail() {
UserController usrcon = new UserController();
boolean status = usrcon.deleteUser("uid");
assertFalse(status);
}
@Test
public void deleteUser() {
UserController usrcon = new UserController();
usrcon.addNewUser("uid", "email", "phone", "name");
boolean status = usrcon.deleteUser("uid");
assertTrue(status);
}
}<file_sep>
spring.datasource.url=jdbc:mysql://mysql.dsv.su.se:3306/ludo1017?rewriteBatchedStatements=true
spring.datasource.username=ludo1017
spring.datasource.password=<PASSWORD>
spring.jpa.properties.hibernate.jdbc.batch_size=100
spring.jpa.properties.hibernate.order_inserts=true
spring.jpa.properties.hibernate.generate_statistics = true
spring.jpa.properties.hibernate.order_updates = true
spring.jpa.properties.hibernate.jdbc.batch_versioned_data=true
spring.jpa.properties.hibernate.id.new_generator_mappings = true
spring.jpa.show-sql = true
spring.jpa.hibernate.ddl-auto=update
<file_sep>package com.example.backend;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.sql.Blob;
import java.sql.SQLException;
import java.util.Base64;
import java.util.List;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.atomic.AtomicReference;
import javax.imageio.ImageIO;
import javax.sql.rowset.serial.SerialBlob;
import javax.sql.rowset.serial.SerialException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;
@Controller // This means that this class is a Controller
@RequestMapping(path = "/dog")
public class DogController {
@Autowired
private DogRepository dogRepository;
@Autowired
private UserRepository userRepo;
@GetMapping(path = "/all")
public @ResponseBody List<Dog> allDog() {
return dogRepository.findAll();
}
@GetMapping(path = "/findDog")
public @ResponseBody Dog findUser(@RequestParam long id) {
Dog d = dogRepository.findById(id).get();
return d;
}
@PostMapping(value = "/updatedogname")
public @ResponseBody boolean newNameForDog(@RequestBody long id, String newName) {
Dog d = dogRepository.findById(id).orElse(null);
if (d != null) {
d.setName(newName);
dogRepository.save(d);
return true;
} else {
return false;
}
}
@PostMapping(value = "/updatedogweight")
public @ResponseBody boolean newWeightForDog(@RequestBody long id, String newWeight) {
Dog d = dogRepository.findById(id).orElse(null);
if (d != null) {
d.setWeight(newWeight);
dogRepository.save(d);
return true;
} else {
return false;
}
}
@PostMapping(value = "/updatedogheight")
public @ResponseBody boolean newHeightForDog(@RequestBody long id, String newHeight) {
Dog d = dogRepository.findById(id).orElse(null);
if (d != null) {
d.setHeight(newHeight);
dogRepository.save(d);
return true;
} else {
return false;
}
}
@PostMapping(value = "/updatedogdescription")
public @ResponseBody boolean newDescriptionForDog(@RequestBody long id, String newDescription) {
Dog d = dogRepository.findById(id).orElse(null);
if (d != null) {
d.setDescription(newDescription);
dogRepository.save(d);
return true;
} else {
return false;
}
}
@PostMapping(value = "/updatedogage")
public @ResponseBody boolean newAgeForDog(@RequestBody long id, String newAge) {
Dog d = dogRepository.findById(id).orElse(null);
if (d != null) {
d.setAge(newAge);
dogRepository.save(d);
return true;
} else {
return false;
}
}
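// Decodes the supplied base64 image, wraps it in a SQL Blob and attaches it to the dog with the
// given id among the user's owned dogs; the returned string is only a debug marker for tracing.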
@PostMapping(value = "/setPicture")
public @ResponseBody String setPicture(@RequestParam Long id, String base64, String uid) {
byte[] decodedByte = Base64.getMimeDecoder().decode(base64);
AtomicReference<String> test = new AtomicReference<String>();
User u = userRepo.findByUid(uid);
test.set("newValue");
AtomicReference<Dog> toAddPic = new AtomicReference<Dog>();
try {
Blob blob = new SerialBlob(decodedByte);
u.getOwnedDog().forEach((e) -> {
if(e.getId().equals(id)){
toAddPic.set(e);
}
});
if(toAddPic.get() != null){
toAddPic.get().setBlobImage(blob);
userRepo.save(u);
test.set("here");
}
} catch (SerialException e) {
test.set(e.getMessage());
e.printStackTrace();
} catch (SQLException e) {
test.set(e.getMessage());
e.printStackTrace();
}
return test.get();
}
@GetMapping(value = "/getPicture")
public @ResponseBody String getPic(@RequestParam long id) {
List<User> u = userRepo.findAll();
AtomicReference<String> picture = new AtomicReference<String>();
picture.set("none");
u.forEach((eu) -> {
eu.getOwnedDog().forEach((e) -> {
if(e.getId().equals(id)){
if (!"Error".equals(e.getImage())) {
picture.set(e.getImage());
}
}
});
});
return picture.get();
}
@PostMapping(value = "/deletedog")
public @ResponseBody String deleteDog(@RequestParam long id, String uid) {
Optional<Dog> optionalEntity = dogRepository.findById(id);
Dog d = optionalEntity.get();
//dogRepository.delete(d);
User u = userRepo.findByUid(uid);
u.removeDog(d);
userRepo.save(u);
return "true";
}
@PostMapping(value = "/clearAllDogs")
public @ResponseBody String deleteAllDog(@RequestParam String uid) {
//dogRepository.delete(d);
User u = userRepo.findByUid(uid);
u.getOwnedDog().clear();
userRepo.save(u);
return "true";
}
public static byte[] getImage() {
File file = new File("Java.png");
if (file.exists()) {
try {
BufferedImage bufferedImage = ImageIO.read(file);
ByteArrayOutputStream byteOutStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "png", byteOutStream);
return byteOutStream.toByteArray();
} catch (IOException e) {
e.printStackTrace();
}
}
return null;
}
}<file_sep>package com.example.backend;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.MapsId;
import javax.persistence.OneToOne;
import javax.persistence.SequenceGenerator;
import org.hibernate.annotations.GenericGenerator;
@Entity
public class Position {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "USERS_SEQ")
@SequenceGenerator(name = "USERS_SEQ", sequenceName = "SEQUENCE_USERS")
private Long id;
private Double x;
private Double y;
Position(){};
Position(Double x, Double y){
this.x = x;
this.y = y;
}
public Double getX(){
return x;
}
public Double getY(){
return y;
}
public void setX(Double x){
this.x = x;
}
public void setY(Double y){
this.y = y;
}
public String toString(){
return this.y.toString()+","+this.x.toString();
}
}<file_sep>package com.example.backend;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertNull;
import org.junit.jupiter.api.Test;
public class DogTest {
@Test
public void makeDogEmptyConstructor() {
Dog dog = new Dog();
assertNull(dog.getName());
assertNull(dog.getRace());
assertNull(dog.getAge());
assertNull(dog.getHeight());
assertNull(dog.getWeight());
//assertNull(dog.getDogpicture());
assertNull(dog.getDescription());
assertNull(dog.getGender());
}
@Test
public void makeDogConstructor() {
Dog dog = new Dog("name", "breed", "1", "2", "3", "dogpicture", "description", "gender");
assertEquals("name", dog.getName());
assertEquals("breed", dog.getRace());
assertEquals("1", dog.getAge());
assertEquals("2", dog.getHeight());
assertEquals("3", dog.getWeight());
//assertEquals("dogpicture", dog.getDogpicture());
assertEquals("description", dog.getDescription());
assertEquals("gender", dog.getGender());
}
@Test
public void makeDogSetters() {
Dog dog = new Dog();
dog.setName("name");
dog.setRace("breed");
dog.setAge("1");
dog.setHeight("2");
dog.setWeight("3");
//dog.setDogpicture("dogpicture");
dog.setDescription("description");
dog.setGender("gender");
assertEquals("name", dog.getName());
assertEquals("breed", dog.getRace());
assertEquals("1", dog.getAge());
assertEquals("2", dog.getHeight());
assertEquals("3", dog.getWeight());
//assertEquals("dogpicture", dog.getDogpicture());
assertEquals("description", dog.getDescription());
assertEquals("gender", dog.getGender());
}
}<file_sep>package com.example.backend;
import org.springframework.data.jpa.repository.JpaRepository;
// This will be AUTO IMPLEMENTED by Spring into a Bean called lightRepository
// CRUD refers Create, Read, Update, Delete
public interface LightRepository extends JpaRepository<Light, Long> {
}<file_sep>package com.example.backend;
import org.springframework.data.jpa.repository.JpaRepository;
// This will be AUTO IMPLEMENTED by Spring into a Bean called dogRepository
// CRUD refers Create, Read, Update, Delete
public interface DogRepository extends JpaRepository<Dog, Long> {
}<file_sep>package com.example.backend;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertNull;
import org.junit.jupiter.api.Test;
public class UserTest {
@Test
public void makeUserEmptyConstructor() {
User user = new User();
assertNull(user.getEmail());
assertNull(user.getPhoneNumber());
assertNull(user.getName());
}
@Test
public void makeUserConstructor() {
User user = new User("uid", "<EMAIL>", "0700000000", "name",new ContactList() );
assertEquals("<EMAIL>" ,user.getEmail());
assertEquals("0700000000", user.getPhoneNumber());
assertEquals("name", user.getName());
}
@Test
public void makeUserSetters() {
User user = new User();
user.setEmail("<EMAIL>");
user.setPhoneNumber("0700000000");
user.setName("name");
assertEquals("<EMAIL>" ,user.getEmail());
assertEquals("0700000000", user.getPhoneNumber());
assertEquals("name", user.getName());
}
}<file_sep># PVT6
Group 6 PVT
2020-04-07
Hello from Viktor
Tuesday
Hello from Fredrik
Hello from Martin
Do something
Hello from Thomas
Hello and goodbye
Hello from Jakob
Hello from Bea
Hello from Sara
Hello from Rasmus
Hello from Bea
Hello from Jakob
<file_sep>package com.example.backend;
import java.util.HashSet;
import java.util.Set;
import java.util.concurrent.atomic.AtomicBoolean;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
@Controller // This means that this class is a Controller
@RequestMapping(path = "/contacts")
public class ContactRequestController {
@Autowired // This means to get the bean called userRepository
// Which is auto-generated by Spring, we will use it to handle the data
private UserRepository userRepository;
@GetMapping(path = "/sentRequests")
public @ResponseBody Set<ContactRequest> getContactRequests(@RequestParam String uid) {
Set<ContactRequest> sent = new HashSet<ContactRequest>();
Set<ContactRequest> list = userRepository.findByUid(uid).getContactRequest();
User user = userRepository.findByUid(uid);
for (ContactRequest e : list) {
if (e.getSender() != null && e.getSender().getUid().equals(user.getUid())) {
sent.add(e);
}
}
return sent;
}
@GetMapping(path = "/waitingRequests")
public @ResponseBody Set<ContactRequest> getWaitingContactRequests(@RequestParam String uid) {
Set<ContactRequest> sent = new HashSet<ContactRequest>();
Set<ContactRequest> list = userRepository.findByUid(uid).getContactRequest();
User user = userRepository.findByUid(uid);
for (ContactRequest e : list) {
if (e.getSender() != null && !e.getSender().getUid().equals(user.getUid())) {
sent.add(e);
}
}
return sent;
}
@GetMapping(path = "/clearRequests")
public @ResponseBody String clearRequests(@RequestParam String uid) {
User u = userRepository.findByUid(uid);
u.getContactRequest().clear();
userRepository.save(u);
return "true";
}
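// Answers a pending request between the user identified by uid and the sender identified by phone:
// on "accept" both users are added to each other's contact lists, otherwise the request is marked
// rejected; in both cases the handled request is then removed from both sides via removeRequest().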
@PostMapping(path = "/answer")
public @ResponseBody String answerRequest(@RequestParam String e, String phone, String uid) {
User u = userRepository.findByUid(uid);
User u2 = userRepository.findByPhone(phone);
AtomicBoolean remove = new AtomicBoolean();
Set<ContactRequest> contactRequests = u.getContactRequest();
contactRequests.forEach((element) -> {
if (element.getSender() == u2) {
switch (e) {
case "accept":
element.setStatus(Status.ACCEPTED);
u2.findUserFromContactRequests(element).setStatus(Status.ACCEPTED);
// ADD TO FRIENDS
if (u.getContactList() == null) {
u.setContactList(new ContactList());
userRepository.save(u);
}
if (u2.getContactList() == null) {
u2.setContactList(new ContactList());
userRepository.save(u2);
}
u.getContactList().addUser(u2);
u2.getContactList().addUser(u);
userRepository.save(u);
userRepository.save(u2);
remove.set(true);
break;
case "rejcet":
element.setStatus(Status.REJECTRED);
u2.findUserFromContactRequests(element).setStatus(Status.REJECTRED);
userRepository.save(u);
userRepository.save(u2);
remove.set(true);
break;
}
}
});
if (remove.get() == true){
removeRequest(u2, u);
}
return "true";
}
@GetMapping(path = "/all")
public @ResponseBody ContactList getContacts(@RequestParam String uid) {
return userRepository.findByUid(uid).getContactList();
}
@PostMapping(path = "/remove")
public @ResponseBody String removeContact(@RequestParam String uid, String phone) {
User u1 = userRepository.findByUid(uid);
User u2 = userRepository.findByPhone(phone);
u1.getContactList().getUser().remove(u2);
u2.getContactList().getUser().remove(u1);
userRepository.save(u1);
userRepository.save(u2);
return "true";
}
@PostMapping(value = "/new")
public @ResponseBody String sendRequest(@RequestParam String sendUid, String phone) {
User sender = userRepository.findByUid(sendUid);
User receiver = userRepository.findByPhone(phone);
if (receiver != null) {
if(checkIfAlreadyFriends(sender, receiver)){
return "Already friends";
}
if(sender == receiver){
return "You can not add your self";
}
ContactRequest request = new ContactRequest(sender, receiver, Status.WAITING);
sender.setContactRequest(request);
receiver.setContactRequest(request);
userRepository.save(sender);
userRepository.save(receiver);
return "Sent friend request";
} else
return "No user found";
}
private Boolean checkIfAlreadyFriends(User u1, User u2){
Boolean user1 = u1.getContactList().getUser().contains(u2);
Boolean user2 = u2.getContactList().getUser().contains(u1);
return user2 && user1;
}
public String removeRequest(User sender, User receiver){
Set<ContactRequest> toRemove = new HashSet<ContactRequest>();
Set<ContactRequest> contactRequests = sender.getContactRequest();
contactRequests.forEach((e) -> {
if (e.getReceiver() == receiver) {
toRemove.add(e);
}
});
toRemove.forEach((e) -> {
contactRequests.remove(e);
userRepository.saveAndFlush(sender);
});
toRemove.clear();
Set<ContactRequest> contactRequests2 = receiver.getContactRequest();
contactRequests2.forEach((e) -> {
if (e.getSender() == sender) {
toRemove.add(e);
}
});
toRemove.forEach((e) -> {
contactRequests2.remove(e);
userRepository.saveAndFlush(receiver);
});
return "true";
}
@PostMapping(value = "/cancleRequests")
public @ResponseBody String cancleRequests(@RequestParam String uid, String phone) {
User sender = userRepository.findByUid(uid);
User receiver = userRepository.findByPhone(phone);
Set<ContactRequest> toRemove = new HashSet<ContactRequest>();
Set<ContactRequest> contactRequests = sender.getContactRequest();
contactRequests.forEach((e) -> {
if (e.getReceiver() == receiver) {
toRemove.add(e);
}
});
toRemove.forEach((e) -> {
contactRequests.remove(e);
userRepository.saveAndFlush(sender);
});
toRemove.clear();
Set<ContactRequest> contactRequests2 = receiver.getContactRequest();
contactRequests2.forEach((e) -> {
if (e.getSender() == sender) {
toRemove.add(e);
}
});
toRemove.forEach((e) -> {
contactRequests2.remove(e);
userRepository.saveAndFlush(receiver);
});
return "true";
}
}<file_sep>
spring.datasource.url=jdbc:mysql://mysql.dsv.su.se:3306/ludo1017?rewriteBatchedStatements=true
spring.datasource.username=ludo1017
spring.datasource.password=<PASSWORD>
spring.jpa.properties.hibernate.connection.driver_class = com.mysql.cj.jdbc.Driver
spring.jpa.properties.hibernate.generate_statistics = true
spring.jpa.properties.hibernate.order_updates = true
spring.jpa.properties.hibernate.jdbc.batch_versioned_data=true
spring.jpa.properties.hibernate.id.new_generator_mappings = true
spring.jpa.show-sql = true
spring.jpa.hibernate.ddl-auto=update
spring.jackson.serialization.fail-on-empty-beans=false
spring.datasource.driverClassName=com.mysql.cj.jdbc.Driver
<file_sep>package com.example.backend;
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Optional;
import java.util.Set;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;
@Controller // This means that this class is a Controller
@RequestMapping(path = "/route")
public class RoutesController {
@Autowired
private UserRepository userRepository;
@Autowired
private RouteRepository routeRepo;
@GetMapping(path = "/test")
public @ResponseBody String testRoute() {
return "go";
}
@GetMapping(path = "/getSavedRoutes")
public @ResponseBody Set<Route> getSavedRoutes(@RequestParam String uid) {
User u = userRepository.findByUid(uid);
return u.getSavedRoutes();
}
@GetMapping(path = "/getRoute")
public @ResponseBody String getRoute(@RequestParam Long id) {
Route u = routeRepo.findById(id).orElse(null);
if (u != null) {
return u.getRoutes();
}
return "none";
}
@GetMapping(path = "/clean")
public @ResponseBody String clean(@RequestParam String uid) {
User u = userRepository.findByUid(uid);
u.cleanRoutes();
userRepository.save(u);
return "done";
}
@PostMapping(path = "/delete")
public @ResponseBody String remove(@RequestParam String uid, Long id) {
User u = userRepository.findByUid(uid);
Route r = routeRepo.findById(id).get();
u.removeRoute(r);
userRepository.save(u);
routeRepo.delete(r);
return "done";
}
@PostMapping(path = "/saveRoute")
public @ResponseBody String saveRoutes(@RequestParam String uid, String name, String route, String distans){
User u = userRepository.findByUid(uid);
String[] pos = route.split("/");
// Extremely ugly, but it works
Position latlng1 = new Position(Double.parseDouble((pos[0].split(",")[1])), Double.parseDouble((pos[0].split(",")[0])));
Position latlng2 = new Position(Double.parseDouble((pos[1].split(",")[1])), Double.parseDouble((pos[1].split(",")[0])));
Position latlng3 = new Position(Double.parseDouble((pos[2].split(",")[1])), Double.parseDouble((pos[2].split(",")[0])));
Route r = new Route(name, latlng1, latlng2, latlng3, distans);
u.addRoutes(r);
routeRepo.save(r);
userRepository.save(u);
return "Saved";
}
@GetMapping(path = "/new")
public @ResponseBody String generateRoute(@RequestParam Double posX, Double posY, int distans) {
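// Generates a roughly triangular walking loop of the requested length: starting from the given
// position, pick a random bearing and a random corner angle between 60 and 120 degrees, derive
// two further waypoints from them, then ask the Mapbox walking-directions API for a route that
// visits both waypoints and returns to the start.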
String returnvalue = "";
double o = posY * Math.PI / 180;
double i = posX * Math.PI / 180;
double h = distans * 0.609344;
double m = distans * 0.609344;
double k = Math.PI / 180 * ((120 - 60) * Math.random() + 60);
double f = 0.85;
double p = f * m / (2 + 2 * Math.sin(k / 2));
double c = 4000 * Math.cos(o);
double d = 2 * Math.random() * Math.PI;
double n = p * Math.cos(d) / 4000 + o;
double g = i + p * Math.sin(d) / c;
double l = p * Math.cos(d + k) / 4000 + o;
double e = i + p * Math.sin(d + k) / c;
Position latlng1 = new Position((posX), (posY));
Position latlng2 = new Position((g * 180 / Math.PI), (n * 180 / Math.PI));
Position latlng3 = new Position((e * 180 / Math.PI), (l * 180 / Math.PI));
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
.uri(URI.create("https://api.mapbox.com/directions/v5/mapbox/walking/" + latlng1.toString() + ";"
+ latlng2.toString() + ";" + latlng3.toString() + ";" + latlng1.toString()
+ ".json?access_token=<KEY>&steps=true&overview=full&geometries=geojson&annotations=distance&continue_straight=true"))
.build();
try {
HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
returnvalue = response.body();
} catch (IOException e1) {
e1.printStackTrace();
} catch (InterruptedException e1) {
e1.printStackTrace();
}
return returnvalue ;
// return "https://api.mapbox.com/directions/v5/mapbox/walking/" + latlng1.toString() + ";"
// + latlng2.toString() + ";" + latlng3.toString() + ";" + latlng1.toString()
// + ".json?access_token=<KEY>&steps=true&overview=full&geometries=geojson&annotations=distance&continue_straight=true";
}
}
<file_sep>package com.example.backend;
import java.util.HashSet;
import java.util.Set;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.OneToMany;
import javax.persistence.SequenceGenerator;
@Entity
public class SearchHistory {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "USERS_SEQ")
@SequenceGenerator(name = "USERS_SEQ", sequenceName = "SEQUENCE_USERS")
private Long id;
@OneToMany
private Set<Position> pos = new HashSet<>();
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public Set<Position> getPos() {
return pos;
}
public void setPosd(Position pos) {
this.pos.add(pos);
}
}<file_sep>package com.example.backend;
import java.util.Collection;
import java.util.Iterator;
import java.util.List;
import javax.persistence.EntityManagerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;
@Controller // This means that this class is a Controller
@RequestMapping(path = "/user") // This means URL's start with /demo (after Application path)
public class UserController {
@Autowired // This means to get the bean called userRepository
// Which is auto-generated by Spring, we will use it to handle the data
private UserRepository userRepository;
@Autowired
private DogRepository dogRepo;
@Autowired
private EntityManagerFactory emf;
@PostMapping(path = "/new") // Map ONLY POST Requests
public @ResponseBody String addNewUser(@RequestParam String uid, String email, String phone, String name) {
User n = new User(uid, email, phone, name, new ContactList());
userRepository.save(n);
return "Saved";
}
@GetMapping(path = "/all")
public @ResponseBody Iterable<User> getAll() {
Iterable<User> u = userRepository.findAll();
return u;
}
@GetMapping(path = "/dogs")
public @ResponseBody List<Dog> findDogs(@RequestParam String uid) {
User u = userRepository.findByUid(uid);
return u.getOwnedDog();
}
@PostMapping(path = "/location")
public @ResponseBody String updatePos(@RequestParam Double lat, Double log, String uid) {
User u = userRepository.findByUid(uid);
u.setPosition(new Position((log), (lat)));
userRepository.save(u);
return "true";
}
@GetMapping(path = "/find")
public @ResponseBody User findUser(@RequestParam String uid) {
User u = userRepository.findByUid(uid);
return u;
}
@PostMapping(path = "/newdog") // Map ONLY POST Requests
public @ResponseBody String addNewDog(@RequestParam String uid, String name, String breed, String age,
String height, String weight, String description, String gender) {
try {
User u = userRepository.findByUid(uid);
Dog d = new Dog(name, breed, age, height, weight, description, gender);
u.setOwnedDog(d);
userRepository.save(u);
return ((Dog) getLastElement(u.getOwnedDog())).getId().toString();
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
return null;
}
}
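// Walks the iterator to return the collection's last element; addNewDog uses it to fetch the id
// of the dog that was just persisted.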
public Object getLastElement(final Collection c) {
final Iterator itr = c.iterator();
Object lastElement = itr.next();
while (itr.hasNext()) {
lastElement = itr.next();
}
return lastElement;
}
@PostMapping(value = "/update")
public @ResponseBody String update(@RequestParam String uid, String name, String email, String phone) {
User u = userRepository.findByUid(uid);
if (u != null) {
u.setName(name);
u.setEmail(email);
u.setPhoneNumber(phone);
userRepository.save(u);
return "Updated";
} else {
return "no user found";
}
}
@PostMapping(value = "/updatename")
public @ResponseBody boolean newNameForUser(@RequestBody String uid, String newName) {
User u = userRepository.findByUid(uid);
if (u != null) {
u.setName(newName);
userRepository.save(u);
return true;
} else {
return false;
}
}
@PostMapping(value = "/updatephonenumber")
public @ResponseBody boolean newPhoneNumberForUser(@RequestBody String uid, String newPhone) {
User u = userRepository.findByUid(uid);
if (u != null) {
u.setPhoneNumber(newPhone);
userRepository.save(u);
return true;
} else {
return false;
}
}
@PostMapping(value = "/updateemail")
public @ResponseBody boolean newEmailForUser(@RequestBody String uid, String newEmail) {
User u = userRepository.findByUid(uid);
if (u != null) {
u.setEmail(newEmail);
userRepository.save(u);
return true;
} else {
return false;
}
}
@DeleteMapping(value = "/deleteuser")
public @ResponseBody boolean deleteUser(@RequestBody String uid) {
User u = userRepository.findByUid(uid);
if (u != null) {
userRepository.delete(u);
return true;
} else {
return false;
}
}
}
<file_sep>package com.example.backend;
import java.util.HashSet;
import java.util.Set;
import java.util.Set;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.JoinTable;
import javax.persistence.ManyToMany;
import javax.persistence.OneToMany;
import javax.persistence.SequenceGenerator;
import com.fasterxml.jackson.annotation.JsonIgnore;
import org.hibernate.annotations.GenericGenerator;
@Entity
public class ContactList {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "USERS_SEQ")
@SequenceGenerator(name = "USERS_SEQ", sequenceName = "SEQUENCE_USERS")
private Long id;
@ManyToMany(fetch = FetchType.LAZY, cascade = { CascadeType.ALL })
@JoinTable(
name = "ContactLists",
joinColumns = { @JoinColumn(name = "userId")
},
inverseJoinColumns = { @JoinColumn(name = "contactListId") }
)
private Set<User> users = new HashSet<>();;
public ContactList(){
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public Set<User> getUser() {
return users;
}
public void addUser(User user) {
this.users.add(user);
}
}
|
d8fbc9e8dd218e0551df0f7c200bdea7ed0faf60
|
[
"Markdown",
"Java",
"INI"
] | 15 |
Java
|
PVT6/PVT6
|
006b23e16f05b588b8c450ab379577d5744e3ded
|
de791aa0dc33801503bb9e5840b01b43a17ae8ce
|
refs/heads/master
|
<file_sep>// This Program uses the Luhn Algorithm to detect
// invalid credit card numbers
// by <NAME> 2017
#include <stdlib.h>
#include <stdio.h>
//---------------------------------------------------
// **** PROTOTYPES ****
//---------------------------------------------------
int find_sum1(int a[], int length);
int find_sum2(int a[], int length);
//---------------------------------------------------
// **** MAIN PROGRAM ****
//---------------------------------------------------
int main()
{
int a[20]={-1}, i=0, length=0, input=0, sum1, sum2, checksum=0;
printf("\n\nThis Program will use the Luhn algorithm to detect\n");
printf("fraudulent credit card numbers.\n\n");
printf("Please enter credit card number in the following format:\n");
printf("\t-All digits separated by a single space\n");
printf("\t-Enter -1 when finished.\n\n");
while (input != -1)
{
scanf(" %i", &input);
if (input != -1)
{
// Only store the first 20 digits to avoid writing past the end of a[]
if (length < 20)
{
a[i] = input;
i++;
}
length++;
}
}
if (length > 20)
{
printf("\nError, too many digits entered.\n");
printf("Proceeding with first 20 digits only.\n");
length=20;
}
printf("\nThe Credit Card Number is: ");
for (i = 0; i < length; i++)
{
printf("%i", a[i]);
}
printf("\n");
sum1 = find_sum1(a, length);
printf("\nsum1 is: %i\n", sum1);
sum2 = find_sum2(a, length);
printf("\nsum2 is %i\n", sum2);
checksum = (9*(sum1+sum2))%10;
//Modulus 10 will give a remainder of the last digit only.
printf("Checksum %i, and the last digit %i ", checksum, a[length-1]);
if (checksum == a[length-1])
printf("are the same: Valid Credit Card number.\n\n");
else
printf("do not match. Credit Card number invalid.\n\n");
}
//---------------------------------------------------
// **** FUNCTIONS ****
//---------------------------------------------------
int find_sum1(int a[], int length)
{
int sum1=0, calc=0, i;
printf("The numbers for sum1 are ");
for (i = ( length-2 ); i >= 0; i-=2 )
{
//For loop to calculate sum1 of the Luhn Algorithm.
//sum1 will ignore the checkdigit (end digit),
//and add the rightmost digit with itself.
//
//The result is added to a total sum. If the result
//is over 10, only the second digit will be added.
//The formula repeats every 2 digits to the left until 0.
calc = (a[i]+a[i]);
if (calc >= 10)
{
//If the sum of this digit > 10,
calc = (calc-10)+1;
//subtract 10 and add 1
}
printf("%i ", calc);
sum1 += calc;
}
return sum1;
}
int find_sum2(int a[], int length)
{
int sum2=0, i;
printf("The numbers for sum2 are ");
for (i = ( length-3 ); i >= 0; i-=2)
{
//For loop to ignore the check digit,
//and find total of all OTHER digits.
printf("%i ", a[i]);
sum2 += a[i];
}
return sum2;
}<file_sep># Invalid Credit Card Detector
### Description:
This Program applies the Luhn Algorithm to a provided Credit Card number and advises the operator if a fraudulent number is detected. The Luhn Algorithm uses a simple checksum method on the last digit of a Credit Card number to validate the sequence as a potentially usable Credit Card number. The number sequence is considered valid if the checksum mod 10 is equal to zero. This technique of validation assists with the detection of errors such as incorrect digit entries. <br>
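For reference, the same validation can be written as a single pass that doubles every second digit from the right and tests the running total modulo 10. This is only a minimal sketch of the standard formulation: the function name `luhn_valid` and its `digits`/`len` parameters are illustrative and do not appear in `main.c`, which instead splits the work into `find_sum1`/`find_sum2` and compares the result against the check digit. <br>
```c
#include <stdbool.h>

// Returns true if the digit sequence, including the final check digit,
// passes the Luhn mod-10 test.
bool luhn_valid(const int digits[], int len)
{
    int sum = 0;
    for (int i = len - 1, pos = 0; i >= 0; i--, pos++)
    {
        int d = digits[i];
        if (pos % 2 == 1)      // double every second digit, counting from the right
        {
            d *= 2;
            if (d > 9)
                d -= 9;        // equivalent to summing the two digits of d
        }
        sum += d;
    }
    return sum % 10 == 0;
}
```
Mathematically this is the same test: comparing the check digit to `(9*(sum1+sum2))%10`, as `main.c` does, is equivalent to requiring that the total including the check digit is divisible by 10. <br>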
#### Example of usage:
<p align="center">
<img src="https://gdurl.com/CuR5" >
</p>
### Instructions:
See Install, Compile and Run instructions [here](https://github.com/AndrewColbeck/ProgrammingNotes/wiki/C-Programming).<br>
<br><br>
<NAME> © 2018<br>
B.Sc. Computer Science, B.A. Audio Engineering (Hons) <br>
follow:
[<img src="https://gdurl.com/vYH5">](https://github.com/AndrewColbeck)
[<img src="https://gdurl.com/xpGoe">](https://www.facebook.com/andrewtcolbeck)
[<img src="https://gdurl.com/FGea">](https://www.youtube.com/channel/UCG9CXPHtEN6zEz-KmLGFT2A)
[<img src="https://gdurl.com/f8fuk">](https://www.linkedin.com/in/andrewcolbeck)
|
7fcbccfa8a4e2451633dd16ed9cd285d8d7d3657
|
[
"Markdown",
"C"
] | 2 |
C
|
AndrewColbeck/C_CreditCardValidator
|
c97948281bee3afcf0647b4abef5e3c711cf5b74
|
a512bd04cdc818db04d0c4b18f7f63bebaea5288
|
refs/heads/master
|
<file_sep>import scrapy
from musicspider.items import MusicspiderItem
class QuotesSpider(scrapy.Spider):
name = "musicspider"
domain = 'http://www.luoo.net'
def start_requests(self):
urls = []
for i in range(1,90):
urls.append('http://www.luoo.net/tag/?p='+str(i))
for url in urls:
yield scrapy.Request(url=url, callback=self.parse)
def parse(self, response):
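# Each tag listing page links to a set of volume pages; collect the volume-level
# metadata here and follow each volume link so subparse() can extract its tracks.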
items=[]
s= response.xpath('//div[@class="vol-list"]/div/a/@href').extract()
for i in range(0,len(s)):
item = MusicspiderItem()
suburl =response.xpath('//div[@class="vol-list"]/div/a/@href').extract()[i]
item['per_id'] = suburl[-3:]
item['per_title'] = response.xpath('//div[@class="vol-list"]/div/a/@title').extract()[i]
item['per_pic_url'] = response.xpath('//div[@class="vol-list"]/div/a/img/@src').extract()[i]
item['category'] = response.xpath('//meta[@name="keywords"]/@content').extract_first().split(",")[-1]
items.append(item)
request = scrapy.Request(str(suburl), callback= self.subparse)
request.meta['item'] = item
yield request
def subparse(self, response):
item = response.meta['item']
sels = response.xpath('//div[@class="track-wrapper clearfix"]').extract()
for i in range(0,len(sels)):
item['music_name'] =response.xpath('//div[@class="track-wrapper clearfix"]/a/text()').extract()[i]
item['artist'] =response.xpath('//div[@class="track-wrapper clearfix"]/span[2]/text()').extract()[i]
item['music_id'] = "%02d" % (i+1)
item['music_url'] ='http://mp3-cdn.luoo.net/low/luoo/radio'+item['per_id']+'/'+item['music_id']+'.mp3'
item['pic_url'] = response.xpath('//div[@class="track-wrapper clearfix"]/a[3]/@data-img').extract()[i]
item['special'] = response.xpath('//div[@class="track-wrapper clearfix"]/a[3]/@data-text').extract()[i]
lrc_id = response.xpath('//div[@class="track-wrapper clearfix"]/a[2]/@data-sid').extract()[i]
item['lrc_url'] = 'http://www.luoo.net/single/lyric/'+lrc_id
item['comments']= response.xpath('//p[@class="the-comment"]/text()').extract_first()
item['category'] = response.xpath('//meta[@name="keywords"]/@content').extract_first().split(",")[-1].lower()
yield item
<file_sep># -*- coding: utf-8 -*-
# Define here the models for your scraped items
#
# See documentation in:
# http://doc.scrapy.org/en/latest/topics/items.html
import scrapy
class MusicspiderItem(scrapy.Item):
# define the fields for your item here like:
# name = scrapy.Field()
music_id = scrapy.Field()
music_name = scrapy.Field()
artist = scrapy.Field()
special = scrapy.Field()
music_url= scrapy.Field()
pic_url = scrapy.Field()
lrc_url = scrapy.Field()
comments = scrapy.Field()
music_id = scrapy.Field()
per_id = scrapy.Field()
per_title = scrapy.Field()
per_pic_url = scrapy.Field()
category = scrapy.Field()
<file_sep># -*- coding: utf-8 -*-
# Define your item pipelines here
#
# Don't forget to add your pipeline to the ITEM_PIPELINES setting
# See: http://doc.scrapy.org/en/latest/topics/item-pipeline.html
from scrapy import signals
import json
import codecs
from twisted.enterprise import adbapi
from datetime import datetime
from hashlib import md5
import MySQLdb
import MySQLdb.cursors
class MySQLStoreMusicspiderPipeline(object):
def __init__(self):
self.conn = MySQLdb.connect(user = 'zmarvin', passwd = '<PASSWORD>', db = 'enjoymusicdb', host = 'localhost', charset = 'utf8', use_unicode = True)
self.cursor = self.conn.cursor()
def process_item(self, item, spider):
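# Insert the scraped track into the `song` table, skipping rows that already
# exist (INSERT IGNORE), then commit.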
try:
self.cursor.execute(
"""INSERT IGNORE INTO song (music_id, music_name, artist,special, music_url, pic_url, lrc_url, comments, per_id, per_pic_url,per_title,category)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)""",
(
item['music_id'],
item['music_name'],
item['artist'],
item['special'],
item['music_url'],
item['pic_url'],
item['lrc_url'],
item['comments'],
item['per_id'],
item['per_pic_url'],
item['per_title'],
item['category']
)
)
self.conn.commit()
except MySQLdb.Error, e:
print 'Error %d %s' % (e.args[0], e.args[1])
return item
<file_sep># music_spider
Uses Scrapy to crawl music resources from http://www.luoo.net.
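Assuming the standard Scrapy project layout (the spider registers itself under the name `musicspider`), it can presumably be run from the project root with `scrapy crawl musicspider`; adding `-o items.json` would additionally export the scraped items to a JSON file.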
|
c314257f6edbdc0ed764e25f4af4f2755e7c011a
|
[
"Markdown",
"Python"
] | 4 |
Python
|
zmarvin/music_spider
|
cfa8bf7f1ca9ef577820fc3f0981bc0fd113a318
|
54501a8b792e9095cf376d9ba153571ad1bc2815
|
refs/heads/main
|
<file_sep>[package]
name = "nginx-log-rotator"
version = "0.1.0"
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
clap = "3.0.0-beta.4"
env_logger = "0.9.0"
log = "0.4.14"
nix = "0.22.1"
serde = "1.0.130"
serde_derive = "1.0.130"
serde_yaml = "0.8.21"
signal-hook = "0.3.10"
<file_sep># nginx-log-rotator
A log-truncation tool for Nginx.
Its purpose is to shorten the log that an Nginx process running in Docker writes out for mtail.
All signals are forwarded to Nginx, and the exit status is the same as nginx's exit code.
```sh
nginx_log_rotator --config /etc/log_rotate_config.yaml
```
## Example configuration
`rotate_span` is given in seconds, and `log_inherit_kilobytes` sets how many kilobytes are
carried over into the recreated log file.
The first, possibly incomplete, line is discarded.
```yaml
---
cmd:
- nginx
- "-g"
- "daemon off;"
rotate_span: 1800
rotate_targets:
- /var/log/mtail/access.log
log_inherit_kilobytes: 256
```
<file_sep>use clap::{AppSettings, Clap};
use log::{error, info, warn};
use nix::sys::signal;
use nix::unistd::Pid;
use serde_derive::{Deserialize, Serialize};
use signal_hook::iterator::Signals;
use std::fs;
use std::io::BufRead;
use std::io::Write;
use std::io::{self, Read};
use std::path;
use std::process::Command;
use std::thread;
use std::time::Duration;
#[derive(Clap)]
#[clap(version = "0.1.0", author = "<NAME> <<EMAIL>>")]
#[clap(setting = AppSettings::ColoredHelp)]
struct Opts {
#[clap(short, long)]
config: Option<String>,
#[clap(long)]
example: bool,
}
#[derive(Serialize, Deserialize)]
struct Config {
cmd: Vec<String>,
rotate_span: u64,
rotate_targets: Vec<String>,
log_inherit_kilobytes: u64,
}
impl Config {
fn example() -> Self {
Self {
cmd: vec![
"nginx".to_owned(),
"-g".to_owned(),
"daemon off;".to_owned(),
],
rotate_span: 1800,
rotate_targets: vec!["/var/log/mtail/access.log".to_owned()],
log_inherit_kilobytes: 256,
}
}
}
use signal_hook::consts::signal as sigs;
use std::io::Seek;
const CHUNK_SIZE: usize = 2048;
// Returns the path of the backup file
fn truncate_log(
input_path: &path::Path,
inherit_kilobytes: u64,
) -> io::Result<Option<path::PathBuf>> {
if !input_path.exists() {
warn!("Log file {} does not exist", input_path.to_string_lossy());
return Ok(None)
}
let mut input = fs::File::open(&input_path)?;
let backup_path_str = format!("{}.bak", input_path.to_string_lossy());
let output_path_str = format!("{}.next", input_path.to_string_lossy());
let backup_path = path::Path::new(&backup_path_str);
let output_path = path::Path::new(&output_path_str);
let output = fs::File::create(output_path)?;
let mut writer = io::BufWriter::new(output);
if input.metadata()?.len() <= inherit_kilobytes * 1024 {
info!("skip truncate {}", input_path.to_string_lossy());
return Ok(None);
}
// Seek to the position `inherit_kilobytes` KiB before the end of the file
input.seek(io::SeekFrom::End(-(inherit_kilobytes as i64 * 1024)))?;
let mut reader = io::BufReader::new(input);
// Throw away the first (possibly incomplete) line
reader.read_line(&mut String::new())?;
// Copy the rest in CHUNK_SIZE-byte chunks
let mut buf = vec![0u8; CHUNK_SIZE];
loop {
match reader.read(&mut buf) {
Ok(0) => break,
Ok(n) => {
// Only write the bytes that were actually read in this iteration
writer.write_all(&buf[..n])?;
}
Err(e) => return Err(e),
}
}
writer.flush()?;
fs::rename(input_path, backup_path)?;
fs::rename(output_path, input_path)?;
Ok(Some(backup_path.to_owned()))
}
fn signal_hook_term_sig_to_nix_signal(sig: i32) -> Option<signal::Signal> {
match sig {
sigs::SIGABRT => Some(signal::SIGABRT),
sigs::SIGALRM => Some(signal::SIGALRM),
sigs::SIGBUS => Some(signal::SIGBUS),
sigs::SIGCHLD => Some(signal::SIGCHLD),
sigs::SIGCONT => Some(signal::SIGCONT),
sigs::SIGHUP => Some(signal::SIGHUP),
sigs::SIGINT => Some(signal::SIGINT),
sigs::SIGIO => Some(signal::SIGIO),
sigs::SIGPIPE => Some(signal::SIGPIPE),
sigs::SIGPROF => Some(signal::SIGPROF),
sigs::SIGQUIT => Some(signal::SIGQUIT),
sigs::SIGSYS => Some(signal::SIGSYS),
sigs::SIGTERM => Some(signal::SIGTERM),
sigs::SIGTRAP => Some(signal::SIGTRAP),
sigs::SIGTSTP => Some(signal::SIGTSTP),
sigs::SIGTTIN => Some(signal::SIGTTIN),
sigs::SIGTTOU => Some(signal::SIGTTOU),
sigs::SIGURG => Some(signal::SIGURG),
sigs::SIGUSR1 => Some(signal::SIGUSR1),
sigs::SIGUSR2 => Some(signal::SIGUSR2),
sigs::SIGVTALRM => Some(signal::SIGVTALRM),
sigs::SIGWINCH => Some(signal::SIGWINCH),
sigs::SIGXCPU => Some(signal::SIGXCPU),
sigs::SIGXFSZ => Some(signal::SIGXFSZ),
_ => None,
}
}
const PROPAGATION_SIGNALS: [i32; 24] = [
sigs::SIGABRT,
sigs::SIGALRM,
sigs::SIGBUS,
sigs::SIGCHLD,
sigs::SIGCONT,
sigs::SIGHUP,
sigs::SIGINT,
sigs::SIGIO,
sigs::SIGPIPE,
sigs::SIGPROF,
sigs::SIGQUIT,
sigs::SIGSYS,
sigs::SIGTERM,
sigs::SIGTRAP,
sigs::SIGTSTP,
sigs::SIGTTIN,
sigs::SIGTTOU,
sigs::SIGURG,
sigs::SIGUSR1,
sigs::SIGUSR2,
sigs::SIGVTALRM,
sigs::SIGWINCH,
sigs::SIGXCPU,
sigs::SIGXFSZ,
];
fn spawn(
exe: &str,
args: &[String],
truncate_span: Duration,
truncate_targets: Vec<String>,
inherit_kilobytes: u64,
) -> Result<(), (String, i32)> {
let mut child = Command::new(&exe).args(args).spawn().map_err(|e| {
(
format!(
"Failed to execute command {} with args({:?}) due to {:?}",
exe, args, e
),
-1,
)
})?;
let pid = child.id();
// Thread that relays incoming signals to the child process
thread::spawn(move || {
// Catch every signal except the ones that cannot be handled (SIGKILL, SIGSEGV, SIGILL, SIGFPE)
for sig in Signals::new(&PROPAGATION_SIGNALS)
.expect("Cannot create signals")
.forever()
{
let received_signal = signal_hook_term_sig_to_nix_signal(sig)
.unwrap_or_else(|| panic!("cannot convert signal {}", sig));
// Forward the signal to the child process
if let Err(errno) = signal::kill(Pid::from_raw(pid as i32), received_signal) {
warn!(
"Failed to send signal to child process due to errno({})",
errno
);
}
}
});
// Log rotation thread
thread::spawn(move || loop {
thread::sleep(truncate_span);
match truncate_targets
.iter()
.map(|target| truncate_log(path::Path::new(target), inherit_kilobytes))
.collect::<io::Result<Vec<Option<path::PathBuf>>>>()
{
Ok(backup_files) => {
if let Err(errno) = signal::kill(Pid::from_raw(pid as i32), signal::SIGUSR1) {
error!("Cannot switch log file due to errno {}", errno);
}
for backup_path in backup_files.into_iter().flatten() {
if let Err(e) = fs::remove_file(&backup_path) {
warn!(
"Cannot remove backup file {} due to {:?}",
backup_path.to_string_lossy(),
e
);
}
}
}
Err(e) => {
error!("Cannot truncate log file: {:?}", e);
}
}
});
// Wait on the child process that signals are relayed to
match child.wait() {
Ok(status) => {
if status.success() {
info!("The child process gracefully exited");
Ok(())
} else {
Err((
format!("The child process exited with {}", status),
status.code().unwrap_or(-1),
))
}
}
Err(e) => Err((format!("Unknown err {:?}", e), -1)),
}
}
fn run() -> Result<(), (String, i32)> {
let opts: Opts = Opts::parse();
// Print an example configuration
if opts.example {
println!("{}", serde_yaml::to_string(&Config::example()).unwrap());
Ok(())
} else if let Some(config) = opts.config {
let config: Config = serde_yaml::from_str(
&fs::read_to_string(config).map_err(|_| ("Cannot read config file".to_owned(), -1))?,
)
.map_err(|_| ("Invalid config file".to_owned(), -1))?;
let mut command = config.cmd.into_iter();
if let Some(exe) = command.next() {
let args = command.collect::<Vec<String>>();
spawn(
&exe,
args.as_slice(),
Duration::from_secs(config.rotate_span),
config.rotate_targets,
config.log_inherit_kilobytes,
)
} else {
Err(("Command is empty".to_owned(), -1))
}
} else {
Err(("--config option must be passed".to_owned(), -1))
}
}
fn main() {
env_logger::init();
if let Err((msg, exit)) = run() {
error!("{}", msg);
std::process::exit(exit);
}
}
|
f89b81b5392ae8f7b4d20b1af30fcd95f9fa46d7
|
[
"TOML",
"Rust",
"Markdown"
] | 3 |
TOML
|
namachan10777/nginx-log-rotator
|
8579ad79946f4c60f555d63d03ef3ca14c76c99d
|
31dc38ec50e2c704dcc7d6fd408a13b84daa15db
|
refs/heads/master
|
<repo_name>elastict/Orleans.CosmosDB<file_sep>/src/Orleans.Persistence.CosmosDB/Options/IPartitionKeyProvider.cs
using Orleans.Runtime;
using System.Threading.Tasks;
namespace Orleans.Persistence.CosmosDB.Options
{
public interface IPartitionKeyProvider
{
ValueTask<string> GetPartitionKey(string grainType, GrainReference grainReference, string storageName);
}
internal class DefaultPartitionKeyProvider : IPartitionKeyProvider
{
public ValueTask<string> GetPartitionKey(string grainType, GrainReference grainReference, string storageName)
{
return new ValueTask<string>(grainType);
}
}
}
|
62c92b82ffd0667a456d1c79acd46229f6b0649c
|
[
"C#"
] | 1 |
C#
|
elastict/Orleans.CosmosDB
|
62c600bdf34900cc32d72838db870a438581dcc2
|
9afb853fd69b26bc9cbcf74e02f75fb6f9ac988d
|
refs/heads/master
|
<repo_name>helderberto/d3test-front<file_sep>/app/components/InputField/index.js
import React, { Component } from 'react';
import ReactDOM from 'react-dom';
import PropTypes from 'prop-types';
import InputMask from 'react-input-mask';
class InputField extends Component {
constructor(props) {
super(props);
}
componentWillUpdate(nextProps, nextState) {
const $input = this.inputElement.input;
const $parent = $input.parentElement;
this.toggleClass($parent, 'input__group__error', nextProps['data-error']);
this.toggleClass($input, 'input-field__error', nextProps['data-error']);
}
toggleClass($el, className, toggle = false) {
if (toggle) {
$el.classList.add(className);
} else {
$el.classList.remove(className);
}
}
render() {
let props = Object.assign({}, this.props);
return(
<InputMask
{...this.props}
ref={(node) => (this.inputElement = node)}
/>
);
}
}
InputField.defaultProps = {
type: 'text',
className: 'input-field'
};
InputField.propTypes = {
type: PropTypes.string,
name: PropTypes.string.isRequired,
className: PropTypes.string
};
export default InputField;<file_sep>/app/components/Header/index.js
import React from 'react';
const Header = props =>
<header className="header">
<div className="header__wrapper wrapper">
<strong className="header__logo exp">
EXP_
</strong>
<strong className="header__name exp">
Karol com 5K_
</strong>
</div>
</header>;
export default Header;<file_sep>/app/components/FormPayment/index.js
import React, { Component } from 'react';
import { connect } from 'react-redux';
import InputField from '../InputField';
import Button from '../Button';
import SelectBox from '../SelectBox';
import { states, cities, months, years, plots } from '../../constants/formOptions';
import validationUtil from '../../utils/validationUtil';
import Cards from 'react-credit-cards';
class FormPayment extends Component {
constructor(props) {
super(props);
this.state = {
sendAddress: 'sameAddress',
invalidForm: false,
zipcode: '',
street: '',
number: '',
compl: '',
state: '',
city: '',
card_name: '',
card_number: '',
card_month: '',
card_year: '',
cvc: '',
plots: '',
card_expiry: '',
focused: 'card_name',
rules: [
{
name: 'zipcode',
validations: ['required'],
address: 'otherAddress'
},
{
name: 'street',
validations: ['required'],
address: 'otherAddress'
},
{
name: 'number',
validations: ['required', 'number'],
address: 'otherAddress'
},
{
name: 'state',
validations: ['required'],
address: 'otherAddress'
},
{
name: 'city',
validations: ['required'],
address: 'otherAddress'
},
{
name: 'card_number',
validations: ['required'],
address: null
},
{
name: 'card_name',
validations: ['required'],
address: null
},
{
name: 'card_month',
validations: ['required', 'number'],
address: null
},
{
name: 'card_year',
validations: ['required', 'number'],
address: null
},
{
name: 'cvc',
validations: ['required', 'number'],
address: null
},
{
name: 'plots',
validations: ['required'],
address: null
},
]
};
}
onChange(event) {
this.setState({
[event.target.name]: event.target.value
});
}
onSubmit(event) {
event.preventDefault();
this.validate();
if (this.props.payment.valid) {
this.props.paymentValid();
this.clearValidate();
this.clearForm();
}
}
validate() {
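// Runs the declared rules against the current state, flags failing fields through their
// corresponding `<name>Error` entries and dispatches paymentInvalid/paymentValid accordingly.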
let invalidForm = false;
let errors = {};
this.state.rules.map(field => {
if ((validationUtil.isEqualObject(field.validations, ['required']) && this.isSameAddress(field.address))) {
if (validationUtil.isEmpty(this.state[field.name])) {
invalidForm = true;
this._setFieldError(field.name, true);
errors[field.name] = true;
} else {
this._setFieldError(field.name, false);
if (errors[field.name]) delete errors[field.name];
}
}
if (validationUtil.isEqualObject(field.validations, ['required', 'number']) && this.isSameAddress(field.address)) {
if (validationUtil.isEmpty(this.state[field.name]) || !validationUtil.isNumber(this.state[field.name])) {
invalidForm = true;
this._setFieldError(field.name, true);
errors[field.name] = true;
} else {
this._setFieldError(field.name, false);
if (errors[field.name]) delete errors[field.name];
}
}
});
if (invalidForm) {
this.props.paymentInvalid(errors);
} else {
this.props.paymentValid();
}
this.setState(this.state);
}
_setFieldError(fieldName, error = false) {
this.state[`${fieldName}Error`] = error;
}
isSameAddress(address) {
if (address === null) return true;
return (this.state.sendAddress === address ? true : false);
}
handleClick(event) {
this.setState({
sendAddress: event.target.value
});
if (event.target.value === 'sameAddress') {
this.clearValidateAddress();
}
if (event.target.name === 'cvc') {
this.setState({
focused: event.target.name
});
}
}
clearValidateAddress() {
this.setState({
zipcodeError: false,
streetError: false,
numberError: false,
stateError: false,
cityError: false
});
}
clearValidate() {
this.clearValidateAddress();
this.setState({
sendAddress: 'sameAddress',
invalidForm: false,
card_nameError: false,
card_numberError: false,
card_monthError: false,
card_yearError: false,
cvcError: false,
plotsError: false,
});
}
clearForm() {
this.setState({
zipcode: '',
street: '',
number: '',
compl: '',
state: '',
city: '',
card_name: '',
card_number: '',
card_month: '',
card_year: '',
cvc: '',
plots: '',
});
}
handleBlurCVC(event) {
this.setState({
focused: 'card_name'
});
}
render() {
let showFieldsAddress = (this.state.sendAddress === 'sameAddress' ? true : false);
return(
<form ref={ (node) => {this.formElement = node} } onSubmit={this.onSubmit.bind(this)} className="form__horizontal form">
<div className="form__buttons__address form__row">
<p className="form__legend">Endereço de cobrança</p>
<Button
onClick={this.handleClick.bind(this)}
className={showFieldsAddress ? 'button__success button' : 'button'}
value="sameAddress"
>É o mesmo da entrega</Button>
<Button
onClick={this.handleClick.bind(this)}
className={!showFieldsAddress ? 'button__success button' : 'button'}
value="otherAddress"
>É diferente da entrega</Button>
</div>
<div className={showFieldsAddress ? 'form__row__hidden' : ''}>
<div className="form__row">
<div className="input__group">
<label htmlFor="zipcode">CEP</label>
<div className="input__group__field input__group__field__cep">
<InputField
className="input-field input-field__cep"
type="tel"
name="zipcode"
id="zipcode"
onChange={this.onChange.bind(this)}
value={this.state.zipcode}
data-error={this.state.zipcodeError}
mask="99999-999"
maskChar=""
/>
</div>
<a className="form__link" href="http://www.buscacep.correios.com.br/sistemas/buscacep/buscaCepEndereco.cfm" target="_blank">
Não sei meu CEP
</a>
</div>
</div>
<div className="form__row">
<p className="form__legend">Confirme seu endereço de cobrança</p>
<div className="input__group input__group__break">
<label htmlFor="street">Logradouro</label>
<div className="input__group__field input__group__field__street input__group__mr25">
<InputField
className="input-field__street input-field"
name="street"
id="street"
onChange={this.onChange.bind(this)}
value={this.state.street}
data-error={this.state.streetError}
/>
</div>
<label htmlFor="number">Número</label>
<div className="input__group__field input__group__field__number">
<InputField
type="tel"
name="number"
id="number"
onChange={this.onChange.bind(this)}
value={this.state.number}
data-error={this.state.numberError}
maxLength="8"
/>
</div>
</div>
<div className="input__group input__group__break">
<label htmlFor="compl">Complemento</label>
<div className="input__group__field input__group__field__compl input__group__mr25">
<InputField
className="input-field__street input-field"
name="compl"
id="compl"
onChange={this.onChange.bind(this)}
value={this.state.compl}
/>
</div>
<label htmlFor="estado">UF</label>
<div className="input__group__field input__group__field__state input__group__select">
<SelectBox
name="state"
options={states}
onChange={this.onChange.bind(this)}
value={this.state.state}
data-error={this.state.stateError}
/>
</div>
</div>
<div className="input__group input__group__break">
<label htmlFor="cidade">Cidade</label>
<div className="input__group__field input__group__field__city input__group__select">
<SelectBox
name="city"
options={cities}
onChange={this.onChange.bind(this)}
value={this.state.city}
data-error={this.state.cityError}
/>
</div>
</div>
</div>
</div>
<div className="form__row">
<p className="form__legend">Dados do cartão</p>
<Cards
cvc={this.state.cvc}
number={this.state.card_number}
name={this.state.card_name}
locale={{ valid: 'MÊS/ANO' }}
placeholders={ { name: 'NOME COMPLETO' } }
expiry={(this.state.card_month < 10 ? this.state.card_month : this.state.card_month) + this.state.card_year}
focused={this.state.focused}
/>
<div className="input__group input__group__break">
<label htmlFor="card_number">Número</label>
<div className="input__group__field input__group__field__card_number input__group__mr25">
<InputField
name="card_number"
id="card_number"
className="input-field"
onChange={this.onChange.bind(this)}
value={this.state.card_number}
data-error={this.state.card_numberError}
mask="9999 9999 9999 9999"
maskChar=""
/>
</div>
<label htmlFor="card_name">Nome Completo</label>
<div className="input__group__field input__group__field__card_name">
<InputField
name="card_name"
id="card_name"
onChange={this.onChange.bind(this)}
value={this.state.card_name}
data-error={this.state.card_nameError}
maxLength="30"
/>
</div>
</div>
<div className="input__group input__group__break">
<label htmlFor="card_date_month">Validade</label>
<div className="input__group__field input__group__field__dates input__group__mr25">
<div className="input__group__field input__group__field__month input__group__select">
<SelectBox
name="card_month"
options={months}
onChange={this.onChange.bind(this)}
value={this.state.card_month}
data-error={this.state.card_monthError}
/>
</div>
<div className="input__group__field input__group__field__year input__group__select">
<SelectBox
name="card_year"
options={years}
onChange={this.onChange.bind(this)}
value={this.state.card_year}
data-error={this.state.card_yearError}
/>
</div>
</div>
<label htmlFor="cvc">Código de segurança</label>
<div className="input__group__field input__group__field__card_code">
<InputField
type="tel"
name="cvc"
id="cvc"
onChange={this.onChange.bind(this)}
value={this.state.cvc}
data-error={this.state.cvcError}
maxLength="4"
onClick={this.handleClick.bind(this)}
onBlur={this.handleBlurCVC.bind(this)}
/>
</div>
</div>
</div>
<div className="form__row">
<p className="form__legend">Valor e parcelamento</p>
<div className="input__group input__group__break">
<div className="input__group__field input__group__field__plots input__group__select">
<SelectBox
name="plots"
options={plots}
onChange={this.onChange.bind(this)}
value={this.state.plots}
data-error={this.state.plotsError}
/>
</div>
</div>
</div>
<div className="submit">
<Button className="button__back button">Voltar</Button>
<Button type="submit" className="button__success button">Finalizar Compra</Button>
</div>
</form>
)
}
}
const mapStateToProps = (state) => {
return {
payment: state.payment
}
};
const mapDispatchToProps = (dispatch) => {
return {
paymentValid: () => {
dispatch({
type: "PAYMENT_VALID"
});
},
paymentInvalid: (errors) => {
dispatch({
type: "PAYMENT_INVALID",
payload: errors
});
}
}
};
export default connect(mapStateToProps, mapDispatchToProps)(FormPayment);<file_sep>/app/components/Button/index.js
import React from 'react';
import PropTypes from 'prop-types';
const Button = props => {
return <button {...props}>{props.children}</button>;
}
Button.defaultProps = {
type: 'button',
className: 'button'
};
Button.propTypes = {
type: PropTypes.string.isRequired
};
export default Button;<file_sep>/app/components/PurchaseDetail/index.js
import React from 'react';
const PurchaseDetail = props =>
<div className="purchase-detail">
<h3>
<strong>R$ 660,00</strong> em até 5x
</h3>
<p className="purchase-detail__text">Você terá direito a:</p>
<ul className="purchase-detail__list">
<li>
<span className="highlighted">01</span> tênis especial para corrida
</li>
<li>
<span className="highlighted">05</span> peças de vestuário para você correr melhor
</li>
<li>
<span className="highlighted">05</span> acessórios para ajudar no seu desempenho
</li>
<li>
Acompanhamento personalizado de profissionais de treino
</li>
<li>
Apoio com dicas de nutrição
</li>
<li>
Apoio exclusivo ao conteúdo do app <span className="highlight">EXP_</span>
</li>
</ul>
</div>;
export default PurchaseDetail;<file_sep>/app/containers/HomeContainer.js
import React from 'react';
import Header from '../components/Header';
import Footer from '../components/Footer';
import Steps from '../components/Steps';
import FormPayment from '../components/FormPayment';
import PurchaseDetail from '../components/PurchaseDetail';
import AlertBox from '../components/AlertBox';
const HomeContainer = props =>
<div>
<Header />
<div className="content">
<h1>Cadastro</h1>
<Steps active="4" />
<AlertBox />
<div className="content__wrapper wrapper">
<div className="block-form col">
<h2 className="title__payment">Pagamento</h2>
<FormPayment />
</div>
<div className="block-detail col">
<h2>Detalhes da Compra</h2>
<PurchaseDetail />
</div>
</div>
</div>
<Footer />
</div>;
export default HomeContainer;
<file_sep>/app/components/SelectBox/index.js
import React, { Component } from 'react';
import PropTypes from 'prop-types';
class SelectBox extends Component {
constructor(props) {
super(props);
}
componentWillUpdate(nextProps, nextState) {
const $input = this.inputElement;
const $parent = $input.parentElement;
this.toggleClass($parent, 'input__group__error', nextProps['data-error']);
this.toggleClass($input, 'input-field__error', nextProps['data-error']);
}
toggleClass($el, className, toggle = false) {
if (toggle) {
$el.classList.add(className);
} else {
$el.classList.remove(className);
}
}
render() {
let props = Object.assign({}, this.props);
delete props.options;
return(
<select {...props} ref={ (node) => {this.inputElement = node} } className="input-field">
<option value=""></option>
{this.props.options.map(option => {
return <option value={option.value} key={option.value}>{option.name}</option>
})}
</select>
)
}
}
SelectBox.propTypes = {
name: PropTypes.string.isRequired,
options: PropTypes.array.isRequired
};
export default SelectBox;
|
e7c5aca4d9d3d015e5805b56f002ee496d42d6eb
|
[
"JavaScript"
] | 7 |
JavaScript
|
helderberto/d3test-front
|
c4db5a854d1b2c365542ffda812bdf49824240f1
|
dadaaffdda0a43ca8eab17a898c86c851777c84f
|
refs/heads/master
|
<file_sep>import React,{ Component } from 'react';
import {View,Text,Image,StyleSheet} from 'react-native';
import newsComponent from './news';
export default class HomeComponent extends Component {
constructor(){
super();
}
render(){
return <View>
{/* <Image source={require('')}></Image> */}
<Text>this is home</Text>
</View>
}
}
const preStyles = StyleSheet.create({
});
<file_sep>import React,{ Component } from 'react';
import {View,Text,Image,StyleSheet,Navigation} from 'react-native';
export default class ShopComponent extends Component {
constructor(){
super();
this.state = {
}
}
render(){
return <View>
<Text>this is SHOP</Text>
</View>
}
} <file_sep>/**
* Sample React Native App
* https://github.com/facebook/react-native
* @flow
*/
import React, { Component } from 'react';
import {
AppRegistry,
StyleSheet,
Text,
View
} from 'react-native';
import {StackNavigator} from 'react-navigation';
import PreComponent from './app/components/pre';
import BarComponent from './app/components/bar';
const RootNavigator = StackNavigator ({
pre:{
screen:PreComponent,
navigationOptions:()=>{
return {
header:null
}
}
},
bar:{
screen:BarComponent,
navigationOptions:()=>{
return {
header:null
}
}
}
});
AppRegistry.registerComponent('myapp', () => RootNavigator);
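
// Hypothetical sketch (not part of the original file): a screen registered on RootNavigator,
// such as PreComponent, could open the 'bar' route through the injected navigation prop:
//
//   this.props.navigation.navigate('bar');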
|
c60cae1305ebf79ff0aacdd1254f5eeea01cea31
|
[
"JavaScript"
] | 3 |
JavaScript
|
wenfangcao/haiboApp
|
28fafa254a243f16ebc483607f190de1b560eeaf
|
591ec7007b1515c0472c8dabb18e6fe805d7823c
|
refs/heads/main
|
<repo_name>xyh777/acceleratorBe<file_sep>/accelerator/src/main/java/com/accelerator/accelerator/repository/StudentRepository.java
package com.accelerator.accelerator.repository;
import org.springframework.data.jpa.repository.JpaRepository;
import com.accelerator.accelerator.entity.Student;
public interface StudentRepository extends JpaRepository<Student,Integer> {
}
<file_sep>/accelerator/src/main/java/com/accelerator/accelerator/controller/StudentHandler.java
package com.accelerator.accelerator.controller;
import com.accelerator.accelerator.entity.Student;
import com.accelerator.accelerator.repository.StudentRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import java.util.List;
@RestController
@RequestMapping("/student")
public class StudentHandler {
@Autowired
private StudentRepository studentRepository;
@GetMapping("/findAll")
public List<Student> findAll(){
return studentRepository.findAll();
}
}
|
807749635c73d26e41f35a2b67c46667d8e5b105
|
[
"Java"
] | 2 |
Java
|
xyh777/acceleratorBe
|
b76b9cec42dff18b4e1102cf3b60ce197534e787
|
8238b2245a558fd51330740e0d592a2aedeb0bcb
|
refs/heads/master
|
<file_sep>import React from 'react'
import EventCard from './EventCard'
const AllEventsList = ({events = []}) => {
return(
events.map(event =>
<EventCard
name={event.name}
time={event.time}
location={event.location}
date={event.date}
description={event.description}
eligibility={event.eligibility}
key={event.id}
/>
)
)
}
export default AllEventsList<file_sep>const mysql = require('mysql')
require('dotenv').config()
const db = mysql.createPool({
host: "us-cdbr-iron-east-04.cleardb.net",
user: "b1319838ea3262",
password: "<PASSWORD>",
database: "heroku_121ab46c32c2683"
})
module.exports = {
db
}<file_sep>import React, {useState} from 'react'
const ChangeSkillset = ({showSkillsForm, hideSkillsForm, submitSkillsForm}) => {
const[newSkills, setNewSkills] = useState('')
const handleSkillsChange = (e) => {
setNewSkills(e.target.value)
}
const handleSubmit = () => {
let Skillset = {
skillset: newSkills
}
submitSkillsForm(Skillset)
console.log('new skillset info...', Skillset)
setNewSkills('')
}
let show = showSkillsForm ? "modal display-block" : "modal display-none"
return(
<div className={show}>
<div className="modal-main">
                <form className="ui form">
                    <div className="field">
<label>Skills</label>
<input type="text" name="skills" placeholder="Skills" value={newSkills} onChange={handleSkillsChange}></input>
</div>
<button type="button" onClick={handleSubmit}>Submit</button>
<button type="button" onClick={hideSkillsForm}>Close</button>
</form>
</div>
</div>
)
}
export default ChangeSkillset<file_sep>import React from 'react'
const UserSkillset = (props) => {
return(
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">Skillset</div>
<div className="description">{props.skills}</div>
</div>
</div>
</div>
)
}
export default UserSkillset<file_sep>import React, {useState} from 'react'
import {Link} from 'react-router-dom'
const AppliedJob = ({id, student_id, co_id, status, student_name, title, updateJobStatus}) => {
const[newStatus, setNewStatus] = useState('')
const handleStatusChange = (e) => {
setNewStatus(e.target.value)
}
const updateStatus = () => {
let job = {
job_id: id,
newStatus: newStatus,
co_id,
student_id
}
updateJobStatus(job)
}
return(
<div>
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{student_name}</div>
<div className="description">{title}</div>
<div className="description">{status}</div>
</div>
<Link to={{
pathname:'/studentProfile',
id: student_id
}}>
<button variant="raised">
See Profile
</button>
</Link>
</div>
<div onChange={handleStatusChange}>
<input type="radio" value="Pending" name="status"/> Pending
<input type="radio" value="Reviewed" name="status"/> Reviewed
<input type="radio" value="Declined" name="status"/> Declined
</div>
</div>
<button className="ui button primary" type="button" onClick={updateStatus}>Update</button>
</div>
)
}
export default AppliedJob<file_sep>import React from 'react'
import EducationCard from './EducationCard'
const UserEducation = ({education = [], submitDelete, showEducationForm, submitEducationForm} ) => {
if (education === undefined){
education = []
}
return(
education.map(item =>
<EducationCard
name={item.schoolname}
degree={item.degree}
location={item.location}
dates={item.dates}
gpa={item.gpa}
id={item.id}
submitDelete={submitDelete}
showEducationForm={showEducationForm}
submitEducationForm={submitEducationForm}
/>
)
)
}
export default UserEducation<file_sep>import React from 'react'
import StudentCard from './StudentCard'
const StudentsList = ({students = []}) => {
return(
students.map(student=>
<StudentCard
key={student.id}
id={student.id}
name={student.name}
schoolname={student.schoolname}
skillset={student.skillset}
/>
)
)
}
export default StudentsList<file_sep>import React, {Component} from 'react';
import { Route, Switch } from 'react-router-dom';
import './App.css';
import Home from './components/Home'
import Register from './components/Register'
import Login from './components/Login'
import CompanyRegister from './components/CompanyRegister'
import CompanyLogin from './components/CompanyLogin'
import UserProfile from './components/UserPofile/UserProfile'
import CompanyProfile from './components/CompanyProfile/CompanyProfile'
import JobPostPage from './components/Jobs/JobPostPage'
import StudentTab from './components/StudentsTab/StudentTab'
import StudentProfile from './components/StudentsTab/StudentProfile'
import JobTab from './components/JobSearch/JobTab'
import JobProfile from './components/JobSearch/JobProfile'
import ApplicationsTab from'./components/ApplicationsTab/ApplicationsTab'
import EventsCompanyTab from './components/EventsCompany/EventsCompanyTab'
import EventsStudentsTab from './components/EventsStudents/EventsStudentsTab'
import EventProfile from './components/EventsStudents/EventProfile'
import CompanyPreview from './components/JobSearch/CompanyPreview'
class App extends Component {
render(){
const App = () => {
return(
<div>
<Switch>
<Route exact path='/' component={Home}/>
<Route path='/register' component={Register}/>
<Route path='/login' component={Login}/>
<Route path='/company-register' component={CompanyRegister}/>
<Route path='/company-login' component={CompanyLogin}/>
<Route path='/userProfile' component={UserProfile}/>
<Route path='/companyProfile' component={CompanyProfile}/>
<Route path='/jobPosts' component={JobPostPage}/>
<Route path='/student-tab' component={StudentTab}/>
<Route path ='/studentProfile' component={StudentProfile}/>
<Route path ='/job-tab' component={JobTab}/>
<Route path ='/jobProfile' component={JobProfile}/>
<Route path='/applicationsTab' component={ApplicationsTab}/>
<Route path='/company-events-tab' component={EventsCompanyTab}/>
<Route path='/student-events-tab' component={EventsStudentsTab}/>
<Route path='/eventProfile' component={EventProfile}/>
<Route path='/companyPreview' component={CompanyPreview}/>
</Switch>
</div>
)
}
return (
<Switch>
<App/>
</Switch>
);
}
}
export default App;
<file_sep>const express = require('express')
const {db} = require('../../db')
const bodyParser = require('body-parser');
const router = express.Router();
const app = express()
app.use(bodyParser.json());
router.get('/get-job-post-page', (req, res) => {
let jobPageInfo = {
JobsPosted: [],
JobsApplied: []
}
let co_id = req.session.userId
console.log('session id check: ', co_id)
db.query('SELECT * FROM jobs WHERE co_id = ?', [`${co_id}`], (error, results, fields) => {
if (error) throw error
jobPageInfo.JobsPosted = results
console.log('job results: ', results)
db.query('SELECT * FROM jobs_students WHERE co_id = ?', [`${co_id}`], (error, results, fields) => {
if (error) throw error
jobPageInfo.JobsApplied = results
res.send({
...jobPageInfo
})
})
})
})
router.post('/post-new-job', async (req, res) => {
console.log('inside job post route')
let co_id = req.session.userId
let {title, deadline, date, location, salary, description, category} = req.body
db.query(`INSERT INTO jobs (title, deadline, date, location, salary, description, category, co_id)
VALUES ('${title}', '${deadline}', '${date}', '${location}', '${salary}', '${description}', '${category}', '${co_id}')`,
(error, result) => {
console.log('result from inserting (outer loop): ', result)
if (error) throw error
console.log('Posting New Job with co_id: ', co_id)
db.query('SELECT * FROM jobs WHERE co_id = ?', [`${co_id}`], (error, results, fields) => {
if (error) throw error
console.log('results from selecting (inner loop): ', results)
res.send({
JobsPosted: results
})
})
})
})
router.post('/update-job-status', (req, res) => {
let co_id = req.session.userId
let {job_id, newStatus} = req.body
db.query("UPDATE jobs_students SET status = ? WHERE id = ?",
[`${newStatus}`,`${job_id}`],
(error, result) => {
if (error) throw error
console.log('Updated Job Status')
db.query(`SELECT * FROM jobs_students WHERE co_id = ?`, [`${co_id}`],
(error, results, fields) => {
if (error) throw error
console.log('Updated Applied Jobs list: ', results)
res.send({
JobsApplied: results
})
})
})
})
router.get('/get-job-list', (req, res) => {
console.log('session id check: ', req.session.userId)
db.query('SELECT * FROM jobs', (error, results, fields) => {
if (error) throw error
res.send({
Jobs: results
})
})
})
module.exports = {
jobPostsApiRouter: router
};<file_sep>import React , {Component} from 'react'
import axios from 'axios'
import NavBar from '../NavBar/NavBar'
class EventProfile extends Component {
constructor(props){
super(props)
this.state = {
id: props.location.id,
co_name: props.location.co_name,
event_name: props.location.event_name,
date: props.location.date,
time: props.location.time,
location: props.location.location,
eligibility: props.location.eligibility,
description: props.location.description,
co_id: props.location.co_id,
event_id: props.location.id,
message: ''
}
}
componentDidMount( ){
console.log('props: ', this.props)
}
handleApply = () => {
let EventApplication = {
co_name: this.state.co_name,
eligibility: this.state.eligibility,
event_id: this.state.event_id,
event_name: this.state.event_name,
co_id: this.state.co_id,
date: this.state.date,
time: this.state.time,
location: this.state.location
}
axios.post('/api/apply-to-event', EventApplication)
.then(response => {
console.log('response after applying: ', response)
this.setState({
message: response.data.message
})
})
}
render(){
return(
<div>
<NavBar/>
<h1>Event Profile</h1>
<div className="item">
<div className="content">
<div className="header">{this.state.co_name}</div>
<div className="meta">{this.state.event_name}</div>
<div className="extra">{this.state.description}</div>
<div className="description">{this.state.location}</div>
<div className="description">{this.state.date}</div>
<div className="description">{this.state.eligibility}</div>
<div className="extra">{this.state.description}</div>
<div className="extra">{this.state.time}</div>
</div>
</div>
<button className="ui button primary" onClick={this.handleApply}>Apply!</button>
<h3>{this.state.message}</h3>
</div>
)
}
}
export default EventProfile<file_sep>import React from 'react'
const Filter = ({handleReset, handleNameChange,handleSchoolChange, handleSkillChange}) => {
return(
<div>
Name:
<input
onChange={handleNameChange}
/>
School:
<input
onChange={handleSchoolChange}
/>
Skillset:
<input
onChange={handleSkillChange}
/>
<button onClick={handleReset}>Reset</button>
</div>
)
}
export default Filter<file_sep>
const initialState = {
Basic: {
name: '',
birthdate: '',
location: '',
email: '',
phone: '',
objective: '',
degree: '',
},
Education: [],
Skillset:{
skills: ''
},
Experience: []
}
const userInfoReducer = (state = initialState, action) => {
switch (action.type) {
case 'CHANGE_BASIC_CONTACT_INFO':
//console.log('changing contact info...', action.payload)
let basicPayload = action.payload
let newBasicObj = {
Basic: {
...basicPayload
}
}
console.log('new basic info..', newBasicObj)
return {
...state,
...newBasicObj
}
case 'CHANGE_SKILLS_INFO':
let newSkillsObj = {
Skillset: {
...action.payload
}
}
return {
...state,
...newSkillsObj
}
case 'ADD_EXPERIENCE_INFO':
let experiencePayload = action.payload
let newExperienceObj = {
Experience: experiencePayload
}
console.log('adding new experience info: ', newExperienceObj)
return {
...state,
...newExperienceObj
}
case 'ADD_EDUCATION_INFO':
let educationPayload = action.payload
//let newEducationArray = state.Education.concat(educationPayload)
let newEducationObj = {
Education: educationPayload
}
console.log('adding new education info: ', newEducationObj)
return {
...state,
...newEducationObj
}
case 'EDIT_EXPERIENCE_INFO':
let editedExperienceObj = {
Experience: action.payload
}
console.log('editing new education info with: ', editedExperienceObj)
return {
...state,
...editedExperienceObj
}
case 'EDIT_EDUCATION_INFO':
let editedEducationObj = {
Education: action.payload
}
console.log('editing new education info: ', editedEducationObj)
return {
...state,
...editedEducationObj
}
case 'DELETE_EXPERIENCE_INFO':
let newDeletedExperienceObj = {
Experience: action.payload
}
console.log('deleting and adding exp. info: ', newDeletedExperienceObj)
return {
...state,
...newDeletedExperienceObj
}
case 'DELETE_EDUCATION_INFO':
let deletedEducationPayload = action.payload
let newDeletedEducationObj = {
Education: deletedEducationPayload
}
console.log('adding new education info: ', newDeletedEducationObj)
return {
...state,
...newDeletedEducationObj
}
case 'GET_USER_INFO':
let userInfoPayload = action.payload
console.log('getting user info: ', userInfoPayload)
return {
...userInfoPayload
}
default:
return state
}
}
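
// Illustrative sketch (not part of the original file) of the payload shapes this reducer expects,
// e.g. as dispatched from the profile actions:
//
//   dispatch({ type: 'CHANGE_SKILLS_INFO', payload: { skills: 'SQL, React, Node' } })
//   dispatch({ type: 'ADD_EDUCATION_INFO', payload: updatedEducationRows }) // array returned by the API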
export default userInfoReducer<file_sep>import React from 'react'
const ApplicationCard = ({title, company, status, date}) => {
return(
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{title}</div>
<div className="meta">{company}</div>
<div className="meta">{status}</div>
<div className="description">{date}</div>
</div>
</div>
</div>
)
}
export default ApplicationCard<file_sep>const express = require('express')
const {db} = require('../../db')
const bodyParser = require('body-parser');
const router = express.Router();
const app = express()
app.use(bodyParser.json());
//query DB with user_id
router.post('/get-user-profile/', (req, res) => {
let id = req.session.userId
console.log('Current user_id (from api): ', id)
let userInfo = {
Basic: {},
Education: [],
Skillset: {},
Experience: [],
Id: ''
}
db.query('SELECT * FROM Users WHERE id = ?', [`${id}`], (error, results, fields) => {
        if(error) {
            console.log('error: ', error)
            // return so the handler does not keep running (and respond twice) after a failed query
            return res.send({
                message: 'query failed'
            })
        }
let result = results[0]
console.log('results: ', result)
userInfo.Basic.name = result.name
userInfo.Basic.birthdate = result.birthdate
userInfo.Basic.location = result.location
userInfo.Basic.email = result.email_profile
userInfo.Basic.phone = result.phone
userInfo.Basic.objective = result.objective
userInfo.Basic.degree = result.degree
userInfo.Skillset.skills = result.skillset
userInfo.Id = result.id
//console.log('results: ', result)
db.query(`SELECT * FROM Education WHERE user_id = ?`, [`${id}`], (error, results, fields) => {
userInfo.Education = results
console.log('education results (from get-user-profile): ', results)
db.query("SELECT * FROM experience WHERE user_id = ?", [`${id}`], (error, results, fields) => {
userInfo.Experience = results
console.log('experience results (from get-user-profile): ', results)
console.log('new user info: ', userInfo)
res.send({
...userInfo
})
})
})
})
})
router.post('/change-basic-info', (req, res) => {
let id = req.session.userId
let {birthdate, location, objective, email, phone, degree} = req.body
db.query("UPDATE Users SET birthdate = ?, location = ?, email_profile = ?, objective = ?, phone = ?, degree = ? WHERE id = ?",
[birthdate, location, email, objective, phone, degree, id ],
(error, result) => {
if (error) throw error
console.log('result of updating basic info in db:', result)
})
console.log('basic info change data: ', req.body)
res.send({
...req.body
})
})
router.post('/change-skills-info', (req, res) => {
let id = req.session.userId
let {skillset} = req.body
console.log('skillset body: ', req.body)
db.query("UPDATE users SET skillset = ? WHERE id = ?",
[skillset, id],
(error, result) => {
if (error) throw error
res.send({
skills: skillset
})
})
})
router.post('/add-experience', (req,res) => {
let {company, title, location, dates, description, user_id} = req.body
db.query(`INSERT INTO experience (company, title, location, dates, description, user_id)
VALUES ('${company}', '${title}', '${location}', '${dates}', '${description}', '${user_id}')`, (error, result) => {
if (error) throw error
console.log('Added Experience info')
db.query(`SELECT * FROM experience WHERE user_id = ?`, [`${user_id}`], (error, results, fields) => {
if(error) throw error
console.log('Updated Experience List: ', results)
res.send({
Experience: results
})
})
})
})
router.post('/add-education', (req,res) => {
let {schoolname, degree, location, dates, gpa, user_id} = req.body
db.query(`INSERT INTO Education (schoolname, degree, location, dates, gpa, user_id) VALUES ('${schoolname}', '${degree}', '${location}', '${dates}', '${gpa}', '${user_id}')`, (error, result)=> {
if(error) throw error
console.log('Added Education info')
db.query(`SELECT * FROM Education WHERE user_id = ?`, [`${user_id}`], (error, results, fields) => {
if(error) throw error
console.log('Updated Education List: ', results)
res.send({
Education: results
})
})
})
})
router.post('/edit-experience', (req, res) => {
let {company, title, location, dates, description, id} = req.body
let user_id = req.session.userId
console.log('Editing Experience id: ', id)
db.query("UPDATE experience SET company = ?, title = ?, location = ?, dates = ?, description = ? WHERE id = ?",
[company, title, location, dates, description, id],
(error, result) => {
if(error) throw error
console.log('Edited Experience Info')
db.query(`SELECT * FROM experience WHERE user_id = ?`, [`${user_id}`], (error, results, fields) => {
if (error) throw error
console.log('Newly Edited Experience List: ', results)
res.send({
Experience: results
})
})
})
})
router.post('/edit-education', (req, res) => {
let {schoolname, degree, location, dates, gpa, id} = req.body
let user_id = req.session.userId
    console.log('Editing education id: ', id)
db.query("UPDATE Education SET schoolname = ?, degree = ?, location = ?, dates = ?, gpa = ? WHERE id = ?",
[schoolname, degree, location, dates, gpa, id],
(error, result) => {
if (error) throw error
console.log('Edited Education Info')
db.query(`SELECT * FROM Education WHERE user_id = ?`, [`${user_id}`], (error, results, fields) => {
if(error) throw error
console.log('Edited Education List: ', results)
res.send({
Education: results
})
})
})
})
router.post('/delete-experience', (req, res) => {
let id = req.body.id
let user_id = req.session.userId
db.query('DELETE FROM experience WHERE id = ?', id, (error, results, fields) => {
if (error) throw error
console.log(`Deleting Experience with id: ${id}`)
db.query(`SELECT * FROM experience WHERE user_id = ?`, [`${user_id}`], (error, results, fields) => {
if(error) throw error
console.log('Updated Experience List: ', results)
res.send({
Experience: results
})
})
})
})
router.post('/delete-education', (req, res) => {
let id = req.body.id
let user_id = req.session.userId
db.query('DELETE FROM Education WHERE id = ?', id, (error, results, fields) => {
if (error) throw error
console.log(`Deleting Education with id: ${id}`)
db.query(`SELECT * FROM Education WHERE user_id = ?`, [`${user_id}`], (error, results, fields) => {
if(error) throw error
console.log('Updated Education List: ', results)
res.send({
Education: results
})
})
})
})
router.get('/get-student-list', (req,res) => {
db.query('SELECT * FROM users', (error, results, fields) => {
if (error) throw error
console.log('Fetched student list: ', results)
res.send({
Students: results
})
})
})
router.post('/get-student-page', (req, res) => {
let {id} = req.body
db.query('SELECT * FROM users WHERE id = ?', [`${id}`], (error, results, fields) => {
let result = results[0]
res.send({...result})
})
})
module.exports = {
userApiRouter: router
}
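
// Hypothetical client-side sketch (not part of the original file): the education endpoints above
// expect the owning user's id in the request body, e.g. from a profile component that knows it:
//
//   axios.post('/api/add-education', {
//     schoolname: 'SJSU', degree: 'MS', location: 'San Jose',
//     dates: '2019-2021', gpa: '3.8', user_id: this.props.userInfo.Id
//   }).then(response => console.log(response.data.Education))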
<file_sep>import React, { Component } from 'react';
import axios from 'axios'
import {Redirect} from 'react-router'
import cookie from 'react-cookies'
class Login extends Component {
constructor(props){
super(props)
this.state = {
email: "",
password: "",
logged: false,
message: ''
}
}
emailChangeHandler = (e) => {
this.setState({
email: e.target.value
})
}
passwordChangeHandler = (e) => {
this.setState({
password: e.target.value
})
}
submitLogin = (e) => {
e.preventDefault()
const user = {
email: this.state.email,
            password: this.state.password,
}
console.log('User: ', user)
axios.post('/login', user)
.then(response => {
console.log('Response: ', response)
if(response.data.login === 'Success')
this.setState({
logged: true
})
this.setState({
message: response.data.message
})
})
}
getInfo = (e) => {
e.preventDefault()
axios.get(`/api/get-user-profile/`)
.then(response => {
console.log('Response: ', response)
})
}
render(){
let redirectVar = null
if(cookie.load("Student-Logged")){
redirectVar = <Redirect to='/job-tab'/>
}
return(
<div>
<h1>Student Login</h1>
{redirectVar}
<form>
<input onChange={this.emailChangeHandler} placeholder="email" type="text"/>
                <input onChange={this.passwordChangeHandler} placeholder="password" type="password"/>
<button type="button" onClick={this.submitLogin}>Login!</button>
<button type="button" onClick={this.getInfo}>Get Info</button>
</form>
{this.state.message}
</div>
)
}
}
export default Login<file_sep>import axios from 'axios'
export const getUserProfile = () => {
axios.get('/api/get-user-profile')
.then(response => {
console.log('user profile results: ', response)
return response.data.userInfo
})
}
export const getUserEducation = () => {
axios.get('/api/get-education')
.then(response => {
console.log('education results response: ', response)
return response.data
})
}<file_sep>import React , {Component} from 'react'
import axios from 'axios'
class CompanyPreview extends Component {
constructor(props){
super(props)
this.state = {
co_id: this.props.location.co_id
}
}
componentDidMount(){
let co_id = this.state.co_id
axios.post('/api/get-company-preview', {co_id})
.then(response => {
console.log('response: ', response.data)
this.setState({
co_name: response.data.name,
co_email: response.data.email_profile,
location: response.data.location,
phone: response.data.phone,
description: response.data.description
})
})
}
render(){
return(
<div>
<h1>Company Profile</h1>
<div className="item">
<div className="content">
<div className="header">{this.state.co_name}</div>
<div className="meta">{this.state.co_email}</div>
<div className="extra">{this.state.description}</div>
<div className="description">{this.state.location}</div>
<div className="description">{this.state.phone}</div>
</div>
</div>
</div>
)
}
}
export default CompanyPreview<file_sep>import React from 'react'
const CompanyBasic = (props) => {
return (
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{props.name}</div>
<div className="description">{props.location}</div>
<div className="description">{props.description}</div>
</div>
</div>
</div>
)
}
export default CompanyBasic<file_sep>import React , {useState} from 'react'
const EventForm = ({hideEventForm, showEventForm, postNewEvent}) => {
const [newName, setNewName] = useState('')
const [newTime, setNewTime] = useState('')
const [newLocation, setNewLocation] = useState('')
const [newDate, setNewDate] = useState('')
const [newDescription, setNewDescription] = useState('')
const [newEligibility, setNewEligibility] = useState('')
const handleNameChange = (e) => {
setNewName(e.target.value)
}
const handleTimeChange = (e) => {
setNewTime(e.target.value)
}
const handleLocationChange = (e) => {
setNewLocation(e.target.value)
}
const handleDateChange = (e) => {
setNewDate(e.target.value)
}
const handleDescriptionChange = (e) => {
setNewDescription(e.target.value)
}
const handleEligibilityChange = (e) => {
setNewEligibility(e.target.value)
}
const handleSubmit =() => {
let EventObj = {
name: newName,
time: newTime,
location: newLocation,
date: newDate,
description: newDescription,
eligibility: newEligibility
}
postNewEvent(EventObj)
setNewName('')
setNewTime('')
setNewLocation('')
setNewDate('')
setNewDescription('')
setNewEligibility('')
}
let show = showEventForm ? "modal display-block" : "modal display-none"
return(
<div className ={show} >
<div className="modal-main">
                <form className="ui form">
                    <div className="field">
<label>Name</label>
<input type="text" name="name" placeholder="Name" value={newName} onChange={handleNameChange}></input>
</div>
                    <div className="field">
<label>Time</label>
<input type="text" name="time" placeholder="Time" value={newTime} onChange={handleTimeChange}></input>
</div>
                    <div className="field">
<label>Description</label>
<input type="text" name="description" placeholder="Description" value={newDescription} onChange={handleDescriptionChange}></input>
</div>
                    <div className="field">
<label>Location</label>
<input type="text" name="location" placeholder="Location" value={newLocation}onChange={handleLocationChange}></input>
</div>
                    <div className="field">
<label>Date</label>
<input type="text" name="date" placeholder="Date" value={newDate} onChange={handleDateChange}></input>
</div>
                    <div className="field">
<label>Eligibility</label>
<input type="text" name="eligibility" placeholder="Eligibility" value={newEligibility} onChange={handleEligibilityChange}></input>
</div>
<button type="button" onClick={handleSubmit}>Submit</button>
<button type="button" onClick={hideEventForm}>Close</button>
</form>
</div>
</div>
)
}
export default EventForm<file_sep>import React, { Component } from 'react';
import axios from 'axios'
class CompanyRegister extends Component {
constructor(props){
super(props)
this.state = {
email: "",
password: "",
name: "",
message: ""
}
}
nameChangeHandler = (e) => {
this.setState({
name: e.target.value
})
}
emailChangeHandler = (e) => {
this.setState({
email: e.target.value
})
}
passwordChangeHandler = (e) => {
this.setState({
password: e.target.value
})
}
submitRegistration = (e) => {
e.preventDefault()
const newCompany = {
name: this.state.name,
email: this.state.email,
            password: this.state.password
}
console.log('newCompany', newCompany)
axios.defaults.withCredentials = true;
axios.post('/company-register', newCompany)
.then(response => {
console.log('Response: ', response)
this.setState({
message: response.data.message
})
})
}
render(){
return(
<div>
<h1>Company Registration</h1>
<form>
<input onChange={this.nameChangeHandler} placeholder="name" type="text"/>
<input onChange={this.emailChangeHandler} placeholder="email" type="text"/>
                <input onChange={this.passwordChangeHandler} placeholder="password" type="password"/>
<button type="button" onClick={this.submitRegistration}>Register!</button>
</form>
{this.state.message}
</div>
)
}
}
export default CompanyRegister<file_sep>import React, { Component } from 'react';
import axios from 'axios'
import {Redirect} from 'react-router'
import cookie from 'react-cookies'
class CompanyLogin extends Component {
constructor(props){
super(props)
this.state = {
email: "",
password: "",
logged: false,
message: ''
}
}
emailChangeHandler = (e) => {
this.setState({
email: e.target.value
})
}
passwordChangeHandler = (e) => {
this.setState({
password: e.target.value
})
}
submitLogin = (e) => {
e.preventDefault()
const user = {
email: this.state.email,
            password: this.state.password
}
console.log('User: ', user)
axios.post('/company-login', user)
.then(response => {
console.log('Response: ', response)
if(response.data.login === 'Success'){
this.setState({
logged: true
})
}
this.setState({
message: response.data.message
})
})
}
render(){
let redirectVar = null
if(cookie.load("Company-Logged") || this.state.logged){
console.log('logged in cookie loaded!')
redirectVar = <Redirect to='/jobPosts'/>
}
return(
<div>
<h1>Company Login</h1>
{redirectVar}
<div>
<form>
<input onChange={this.emailChangeHandler} placeholder="email" type="text"/>
                        <input onChange={this.passwordChangeHandler} placeholder="password" type="password"/>
<button type="button" onClick={this.submitLogin}>Login!</button>
</form>
</div>
{this.state.message}
</div>
)
}
}
export default CompanyLogin<file_sep>import React from 'react'
import { Link } from 'react-router-dom';
const StudentCard = ({id, name, schoolname, skillset}) => {
return(
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{name}</div>
<div className="meta">{schoolname}</div>
<div className="description">{skillset}</div>
<Link to={{
pathname:'/studentProfile',
id: id
}}>
<button variant="raised">
See Profile
</button>
</Link>
</div>
</div>
</div>
)
}
export default StudentCard<file_sep>import React from 'react'
const UserBasic = (props) => {
return (
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{props.name}</div>
<div className="meta">{props.birthdate}</div>
<div className="description">{props.location}</div>
<div className="description">{props.email}</div>
<div className="description">{props.phone}</div>
<div className="extra">{props.objective}</div>
<div className="extra">{props.degree}</div>
</div>
</div>
</div>
)
}
export default UserBasic<file_sep>import {connect} from 'react-redux'
import React, {Component} from 'react'
import cookie from 'react-cookies'
import {Redirect} from 'react-router'
import {changeBasicInfo, changeContactInfo, fetchCompanyInfo} from '../../actions/companyProfile'
import CompanyBasic from './CompanyBasic'
import CompanyContact from './CompanyContact'
import ChangeBasicInfo from './ChangeBasicInfo'
import ChangeContactInfo from './ChangeContactInfo'
import NavBar from '../NavBar/NavBar'
class CompanyProfile extends Component {
constructor(props){
super(props)
this.state = {
showBasicForm: false,
showContactForm: false,
logged: true
}
}
componentDidMount() {
console.log('state: ', this.state)
if(cookie.load("Company-Logged")){
this.props.fetchCompanyInfo()
}
}
showBasicForm = () => {
this.setState({
showBasicForm: true
})
}
showContactForm = () => {
this.setState({
showContactForm: true
})
}
hideBasicForm = () => {
this.setState({
showBasicForm: false
})
}
hideContactForm = () => {
this.setState({
showContactForm: false
})
}
submitBasicForm = (newBasicInfo) => {
this.props.changeBasicInfo(newBasicInfo)
this.hideBasicForm()
}
submitContactForm = (newContactInfo) => {
this.props.changeContactInfo(newContactInfo)
this.hideContactForm()
}
render(){
console.log('props', this.props)
let redirectVar = null
if(!cookie.load("Company-Logged") || this.state.logged === false){
redirectVar = <Redirect to='/company-login'/>
}
return(
<div>
<NavBar/>
<div className="ui segment">
{redirectVar}
<h2>Basic Info</h2>
<CompanyBasic
name={this.props.companyInfo.Basic.name}
location={this.props.companyInfo.Basic.location}
description={this.props.companyInfo.Basic.description}
/>
<button className="ui button primary" onClick={this.showBasicForm}>Edit</button>
<ChangeBasicInfo
showBasicForm={this.state.showBasicForm}
hideBasicForm={this.hideBasicForm}
submitBasicForm={this.submitBasicForm}
/>
                <div className="ui section divider"></div>
<h2>Contact Info</h2>
<CompanyContact
email={this.props.companyInfo.Contact.email}
phone={this.props.companyInfo.Contact.phone}
/>
<button className="ui button primary" onClick={this.showContactForm}>Edit</button>
<ChangeContactInfo
showContactForm={this.state.showContactForm}
hideContactForm={this.hideContactForm}
submitContactForm={this.submitContactForm}
/>
</div>
</div>
)
}
}
const mapStatetoProps = (state) => {
console.log(state)
return {
companyInfo: state.companyInfo
}
}
export default connect(mapStatetoProps, {changeBasicInfo, changeContactInfo, fetchCompanyInfo})(CompanyProfile)<file_sep>import React from 'react'
import { Link } from 'react-router-dom';
const EventCompanyCard = ({student_id, event_name, student_name}) => {
return(
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{event_name}</div>
<div className="meta">{student_name}</div>
<Link to={{
pathname:'/studentProfile',
id: student_id
}}>
<button variant="raised">
See Profile
</button>
</Link>
</div>
</div>
</div>
)
}
export default EventCompanyCard<file_sep>
import React from 'react'
import JobCard from './JobCard'
const JobList = ({jobs = []}) => {
return(
jobs.map(job =>
<JobCard
title={job.title}
category={job.category}
location={job.location}
date={job.date}
deadline={job.deadline}
salary={job.salary}
job_id={job.id}
co_id={job.co_id}
description={job.description}
key={job.id}
/>
)
)
}
export default JobList<file_sep>const bcrypt = require('bcryptjs')
const express = require('express')
const {db} = require('../db')
const bodyParser = require('body-parser');
const router = express.Router();
const app = express()
app.use(bodyParser.json());
router.post('/register', (req, res) => {
console.log('req is', req.body)
let email = req.body.email
    let password = req.body.password
let name = req.body.name
let schoolName = req.body.schoolName
let degree = req.body.degree
db.query('SELECT * FROM Users WHERE email = ?', [email], (error, results, fields) => {
if (error) {
console.log(error)
}
else if (results.length > 0){
res.send({
message: "email already taken"
})
} else {
console.log('inserting into DB...')
bcrypt.hash(password, 10, function(err, hash) {
let stringHash = hash.toString()
db.query(`INSERT INTO Users (email, password, name, schoolname, email_profile, degree) VALUES ('${email}', '${stringHash}', '${name}', '${schoolName}', '${email}', '${degree}')`, (err, result) => {
if (err) {
console.log(err)
}
db.query('SELECT * FROM Users WHERE email = ?', [email], (error, results, fields) => {
                    if (error) console.log(error) // check this callback's error param, not the outer bcrypt err
console.log('results: ', results[0])
let user_id = results[0].id
let user_email = results[0].email
res.cookie('cookie',"Student", {maxAge: 900000, httpOnly: false, path : '/'})
req.session.userId = user_id
                    if(!req.session.userId){
                        console.log('no user session')
                    } else {
                        console.log(`Session with ${user_email} with ${req.session.userId}`)
                    }
res.send({
message: 'registered student',
results: results
})
})
})
})
}
})
})
router.post('/company-register', (req, res) => {
console.log('req is: ', req.body)
let email = req.body.email
    let password = req.body.password
let name = req.body.name
db.query('SELECT * FROM Company WHERE email = ?', [email], (error, results, fields) => {
if (error) {
console.log(error)
}
else if (results.length > 0){
res.send({
message: "email already taken"
})
} else {
console.log('inserting into DB...')
bcrypt.hash(password, 10, function(err, hash) {
let stringHash = hash.toString()
db.query(`INSERT INTO Company (email, password, name) VALUES ('${email}', '${stringHash}', '${name}')`, (err, result) => {
if (err) {
console.log(err)
} else {
res.send({
message: "Registered Company!"
})
}
})
})
}
})
})
router.post('/login', (req, res) => {
let email = req.body.email
    let password = req.body.password
let queryResults
db.query('SELECT * FROM Users WHERE email = ?', [email], async (error, results, fields) => {
        if (error) throw error
queryResults = await results
console.log('test query results: ', queryResults)
console.log('email check result: ', results)
if(queryResults.length === 0) {
res.send({
message: "Login failed. Email not Found"
})
}
if(queryResults.length > 0) {
let user_id = results[0].id
let hash = results[0].password
console.log('query results: ', results[0])
bcrypt.compare(password, hash, (err, result) => {
console.log('Bcrypt result: ', result)
if (result){
req.session.userId = user_id
req.session.studentName = results[0].name
req.session.degree = results[0].degree
console.log(req.session.userId)
res.cookie('Student-Logged',"user", {maxAge: 900000, httpOnly: false, path : '/'})
res.send({
login: 'Success',
message: `Session with ${email} with ${req.session.userId}`,
id: req.session.userId
})
} else {
res.send({
login: 'Failed',
message: 'Login Failed. Incorrect Password'
})
}
})
}
})
})
router.post('/company-login', (req, res) => {
let email = req.body.email
    let password = req.body.password
let queryResults
db.query('SELECT * FROM Company WHERE email = ?', [email], async (error, results, fields) => {
        if (error) throw error
queryResults = await results
console.log('email check result: ', results)
if(queryResults.length === 0)
res.send({
message: "Login Failed. Unable to find email."
})
if(queryResults.length > 0) {
let user_id = results[0].id
let hash = results[0].password
console.log('query results: ', results[0])
bcrypt.compare(password, hash, (err, result) => {
console.log('Bcrypt result: ', result)
if (result){
req.session.userId = user_id
req.session.company_name = results[0].name
res.cookie('Company-Logged',"Company", {maxAge: 900000, httpOnly: false, path : '/'})
res.send({
message: `Session with ${email} with ${req.session.userId}`,
id: req.session.userId
})
} else {
res.send({
message: 'Login Failed. Incorrect Password'
})
}
})
}
})
})
router.post('/logout', (req, res) => {
res.clearCookie('cookie')
res.clearCookie('Student-Logged')
res.clearCookie('Company-Logged')
let id = req.session.userId
res.clearCookie('connect.sid')
req.session.destroy((err) =>{
res.send({
message: "logging out",
sessionCheck: id
})
})
})
module.exports = {
userRouter: router
}
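
// Illustrative sketch (not part of the original file): the placeholder style already used by the
// SELECT queries in this router also works for the INSERT statements above, e.g.:
//
//   db.query('INSERT INTO Users (email, password, name) VALUES (?, ?, ?)',
//            [email, stringHash, name],
//            (err, result) => { /* ... */ })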
<file_sep>import React, {Component} from 'react'
import axios from 'axios'
class JobProfile extends Component {
constructor(props){
super(props)
this.state = {
job_id: props.location.job_id,
co_id: props.location.co_id,
title: props.location.title,
company: props.location.company,
date: props.location.date,
deadline: props.location.deadline,
location: props.location.location,
salary: props.location.salary,
category: props.location.category,
description: props.location.description,
message: ''
}
}
componentDidMount(){
console.log('props: ', this.props)
}
handleApply = () => {
let jobApplication = {
co_id: this.state.co_id,
title: this.state.title,
job_id: this.state.job_id,
company: this.state.company
}
axios.post('/api/apply-to-job', jobApplication)
.then(response => {
console.log('Response: ', response.data.message)
this.setState({
message: response.data.message
})
})
}
render(){
return(
<div>
<h1>Job Profile</h1>
<div className="item">
<div className="content">
<div className="header">{this.state.title}</div>
<div className="meta">{this.state.company}</div>
<div className="extra">{this.state.description}</div>
<div className="description">{this.state.location}</div>
<div className="description">{this.state.category}</div>
<div className="description">{this.state.salary}</div>
<div className="extra">{this.state.date}</div>
<div className="extra">{this.state.deadline}</div>
</div>
</div>
<button className="ui button primary" onClick={this.handleApply}>Apply!</button>
<h3>{this.state.message}</h3>
</div>
)
}
}
export default JobProfile<file_sep>import React, { Component } from 'react';
import { Link } from 'react-router-dom';
import {Redirect} from 'react-router'
import cookie from 'react-cookies'
class Home extends Component {
render() {
let redirectVar = null
if(cookie.load("Company-Logged")){
console.log('logged in cookie loaded!')
redirectVar = <Redirect to='/jobPosts'/>
}
if(cookie.load("Student-Logged")){
redirectVar = <Redirect to="/job-tab"/>
}
return (
<div className="App">
{redirectVar}
<h1>Welcome to Handshake!</h1>
{/* Link to List.js */}
<div>
<Link to={'./register'}>
<button variant="raised">
Go To Registration for Students
</button>
</Link>
<Link to={'./login'}>
<button variant="raised">
Go To Login for Students
</button>
</Link>
</div>
<div>
<Link to={'./company-register'}>
<button variant="raised">
Go To Registration for Companies
</button>
</Link>
<Link to={'./company-login'}>
<button variant="raised">
Go To Login for Companies
</button>
</Link>
</div>
</div>
);
}
}
export default Home;<file_sep>import React from 'react'
const Filter = ({handleCategoryChange, handleLocationChange, handleReset}) => {
return(
<div>
<div>
Location:
<input
onChange={handleLocationChange}
/>
</div>
<button onClick={()=>handleCategoryChange("Part-Time")}>Part-Time</button>
<button onClick={()=>handleCategoryChange("Full-Time")}>Full-Time</button>
<button onClick={()=>handleCategoryChange("On-Campus")}>On-Campus</button>
<button onClick={()=>handleCategoryChange("Internship")}>Internship</button>
<button onClick={handleReset}>Reset</button>
</div>
)
}
export default Filter<file_sep>import React, {Component} from 'react'
class EditEducation extends Component {
//{showEducationForm, hideEducationForm, submitEducationForm, id}
constructor(props){
super(props)
this.state = {
showForm: this.props.showEducationForm
}
console.log('From constructor, showEducationForm: ', this.props.showEducationForm)
}
componentDidUpdate(prevProps, prevState){
if (prevProps.showEducationForm !== this.props.showEducationForm){
this.setState({
showForm: this.props.showEducationForm
})
}
console.log("From update (previous): ", prevProps.showEducationForm )
console.log("From update (new) :", this.props.showEducationForm)
}
handleNameChange = (e) => {
this.setState({
newName: e.target.value
})
}
handleDegreeChange = (e) => {
this.setState({
newDegree: e.target.value
})
}
handleLocationChange = (e) => {
this.setState({
newLocation: e.target.value
})
}
handleDateChange = (e) => {
this.setState({
newDates: e.target.value
})
}
handleGpaChange = (e) => {
this.setState({
newGpa: e.target.value
})
}
handleSubmit = () => {
let EducationInfo = {
schoolname: this.state.newName,
degree: this.state.newDegree,
location: this.state.newLocation,
dates: this.state.newDates,
gpa: this.state.newGpa,
id: this.props.id
}
console.log("education object from 'edit education': ", EducationInfo)
this.props.submitEducationForm(EducationInfo)
}
handleHide = () => {
this.props.hideEducationForm()
}
render() {
let show = this.state.showForm ? "modal display-block" : "modal display-none"
console.log('props:', this.props)
return(
<div className ={show}>
<div className="modal-main">
                <form className="ui form">
                    <div className="field">
<label>Name</label>
<input type="text" name="name" placeholder="Name" onChange={this.handleNameChange}></input>
</div>
                    <div className="field">
<label>Degree</label>
<input type="text" name="degree" placeholder="Degree" onChange={this.handleDegreeChange}></input>
</div>
                    <div className="field">
<label>Location</label>
<input type="text" name="location" placeholder="Location" onChange={this.handleLocationChange}></input>
</div>
                    <div className="field">
<label>Dates</label>
<input type="text" name="dates" placeholder="Dates" onChange={this.handleDateChange}></input>
</div>
                    <div className="field">
<label>GPA</label>
<input type="text" name="gpa" placeholder="GPA" onChange={this.handleGpaChange}></input>
</div>
                    <button type="button" className="ui button primary" onClick={this.handleSubmit}>Submit</button>
                    <button type="button" className="ui button primary" onClick={this.handleHide}>Close</button>
</form>
</div>
</div>
)
}
}
export default EditEducation
<file_sep>import React from 'react';
import ReactDOM from 'react-dom';
import {render} from 'react-dom'
import {BrowserRouter} from 'react-router-dom'
import './index.css';
import App from './App';
import{Provider} from 'react-redux'
import{createStore, applyMiddleware} from 'redux'
import thunk from 'redux-thunk'
import reducers from './reducers'
const store = createStore(reducers, applyMiddleware(thunk))
ReactDOM.render(
<Provider store={store}>
<BrowserRouter>
<App />
</BrowserRouter>
</Provider>
, document.getElementById('root'));
<file_sep>import React, {Component} from 'react'
import axios from 'axios'
import EventsList from './EventsList'
import Filter from './Filter'
import RegisteredEventsList from './RegisteredEventsList'
import NavBar from '../NavBar/NavBar'
class EventsStudentsTab extends Component {
constructor(props){
super(props)
this.state = {
events: [],
filteredEvents: [],
registeredEvents: [],
nameSearch: ''
}
}
componentDidMount(){
//axios get all events
axios.get('/api/get-all-events')
.then(response => {
console.log('response (all events): ', response.data)
this.setState({
filteredEvents: response.data.events,
events: response.data.events
})
})
//axios get registered events
axios.get('/api/get-registered-events')
.then(response => {
console.log('response (registered events): ', response.data)
this.setState({
registeredEvents: response.data.events
})
})
}
reset = () => {
this.setState({
filteredEvents: this.state.events
})
}
    handleNameChange = (e) => {
        // run the filter in the setState callback so it sees the updated search term
        this.setState({
            nameSearch: e.target.value
        }, this.filterByname)
    }
filterByname = () => {
let lowerCaseNameSearch = this.state.nameSearch.toLowerCase()
let events = this.state.events
console.log('events: ', events)
//filtered Array
let filteredEvents = events.filter(event => {
let lowerCaseName = (!event.name) ? '' : event.name.toLowerCase()
return lowerCaseName.includes(lowerCaseNameSearch)
})
this.setState({
filteredEvents: filteredEvents
})
}
render(){
return(
<div>
<NavBar/>
<h3>Filter By:</h3>
<Filter
handleNameChange={this.handleNameChange}
handleReset={this.reset}
/>
<h2>Events:</h2>
<EventsList
events={this.state.filteredEvents}
/>
<h2>Registered Events:</h2>
<RegisteredEventsList
events={this.state.registeredEvents}
/>
</div>
)
}
}
export default EventsStudentsTab<file_sep>import axios from 'axios'
export const changeBasicInfo = (info) => async dispatch =>{
let response = await axios.post('/api/change-company-basic-info', info)
.then(
response => {
console.log('basic-info post response: ', response)
return response.data
}
)
dispatch({
type: 'CHANGE_BASIC_INFO',
payload: response
})
}
export const changeContactInfo = (info) => async dispatch =>{
let response = await axios.post('/api/change-company-contact-info', info)
.then(
response => {
console.log('contact-info change post response: ', response)
return response.data
}
)
dispatch({
type: 'CHANGE_CONTACT_INFO',
payload: response
})
}
export const fetchCompanyInfo = () => async dispatch => {
console.log('fetching info')
let data = await axios.post('/api/get-company-profile', {message: "fetching info"})
.then(response => {
console.log('company profile results: ', response)
return response.data
})
dispatch({
type: 'GET_COMPANY_INFO',
payload: data
})
}<file_sep>const initialSate = {
JobsPosted: [],
JobsApplied: [],
}
const jobPostPageReducer = (state = initialState, action) => {
switch(action.type){
case 'GET_JOB_POST_PAGE':
let jobPostsPayload = action.payload
console.log('Getting Job Posts Page...', jobPostsPayload)
return{
...jobPostsPayload
}
case 'POST_NEW_JOB':
let newJobsPostedObj = {
JobsPosted: action.payload
}
console.log('action: ', action)
console.log('payload from reducer: ', action.payload)
console.log('adding new jobs posted info: ', newJobsPostedObj)
return {
...state,
...newJobsPostedObj
}
case 'UPDATE_JOB_STATUS':
let updatedJobsStatus = {
...action.payload
}
console.log('updating applied jobs\' status: ', updatedJobsStatus)
return{
...state,
...updatedJobsStatus
}
default:
return state
}
}
export default jobPostPageReducer<file_sep>import React from 'react'
import EventCard from './EventCard'
const EventsList = ({events = []}) =>{
return(
events.map(event =>
<EventCard
event_id={event.id}
co_name={event.co_name}
time={event.time}
location={event.location}
event_name={event.name}
eligibility={event.eligibility}
description={event.description}
date={event.date}
co_id={event.co_id}
key={event.id}
/>
)
)
}
export default EventsList<file_sep>const express = require('express')
const {db} = require('../../db')
const bodyParser = require('body-parser');
const router = express.Router();
const app = express()
app.use(bodyParser.json());
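// registers the logged-in student for a job posting;
// expects { co_id, title, job_id, company } in the JSON body (student_id and student_name come from the session)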
router.post('/apply-to-job', (req, res) => {
let {co_id, title, job_id, company} = req.body
let student_id = req.session.userId
let student_name = req.session.studentName
let today = new Date().toLocaleDateString()
db.query('SELECT * FROM jobs_students WHERE job_id = ? AND student_id = ?',
[`${job_id}`, `${student_id}`], (error, results, fields) => {
if (error) throw error
console.log('outer loop results: ', results)
//console.log('results: ', results)
if(results.length > 0) {
res.send({
message: 'You have already applied to this job'
})
} else {
db.query(`INSERT INTO jobs_students (student_id, co_id, status, student_name, title, job_id, company)
VALUES('${student_id}', '${co_id}', 'Pending', '${student_name}', '${title}', '${job_id}', '${company}')`,
(error, result) => {
if(error) throw error
console.log('result inner loop:')
res.send({
message: "Application Submitted"
})
})
}
})
})
router.get('/get-applications', (req, res) =>{
let student_id = req.session.userId
console.log('student_id: ', student_id)
db.query('SELECT * FROM jobs_students WHERE student_id = ?',
[`${student_id}`], (error, results, fields) => {
if (error) throw error
console.log('results: ', results)
res.send({
apps: results
})
})
})
module.exports = {
applicationsApiRouter: router
}<file_sep>import {getUserProfile, getUserEducation} from './Users'
export default {
getUserEducation,
getUserProfile
}<file_sep>import {combineReducers} from 'redux'
import {selectedInfoReducer} from './select'
import userInfoReducer from './UserInfo'
import companyInfoReducer from './CompanyInfo'
import jobPostPageReducer from './JobPostPage'
//import the reducers and export it
export default combineReducers({
userInfo: userInfoReducer,
companyInfo: companyInfoReducer,
selectedInfo: selectedInfoReducer,
jobPostPageInfo: jobPostPageReducer
})<file_sep>import axios from 'axios'
export const fetchJobPostPageInfo = () => async dispatch => {
console.log('fetching job post page info from actions: ')
let data = await axios.get('/api/get-job-post-page')
.then(response => {
console.log('job post page info results: ', response)
return response.data
})
dispatch({
type: 'GET_JOB_POST_PAGE',
payload: data
})
}
export const postNewJob = (newJob) => async dispatch => {
console.log('posting new job from actions: ', newJob)
let data = await axios.post('/api/post-new-job', newJob)
.then(response => {
console.log('updated jobs list (array): ', response.data.JobsPosted)
return response.data.JobsPosted
})
console.log('payload data: ', data)
dispatch({
type: 'POST_NEW_JOB',
payload: data
})
}
export const updateJobStatus = (job) => async dispatch => {
console.log('updating applied job: ', job)
let data = await axios.post('/api/update-job-status', job)
.then(response => {
console.log('updated applied jobs list: ', response.data)
return response.data
})
dispatch({
type: 'UPDATE_JOB_STATUS',
payload: data
})
}<file_sep>import React from 'react'
const EventCard = ({name, time, location, date, description, eligibility}) => {
return(
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{name}</div>
<div className="meta">{location}</div>
<div className="meta">{date}</div>
<div className="meta">{time}</div>
<div className="meta">{description}</div>
<div className="meta">{eligibility}</div>
</div>
</div>
</div>
)
}
export default EventCard<file_sep>import React from 'react'
import {Link} from 'react-router-dom'
const EventCard = ({event_id, co_name, time, location, event_name, eligibility, description, date, co_id}) => {
return(
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{event_name}</div>
<div className="meta">{date}</div>
<div className="meta">{co_name}</div>
<Link to={{
pathname:'/eventProfile',
id: event_id,
co_name,
time,
location,
eligibility,
description,
event_name,
co_id,
date
}}>
<button variant="raised">
See Details
</button>
</Link>
</div>
</div>
</div>
)
}
export default EventCard<file_sep>const express = require('express')
const path = require('path');
const app = express()
const bodyParser = require('body-parser');
const session = require('express-session')
const cookieParser = require('cookie-parser')
const cors = require('cors')
app.use(bodyParser.json());
app.use(cors({ origin: 'http://localhost:3000', credentials: true }));
require('dotenv').config()
app.use(express.static(path.join(__dirname, 'client/build')));
app.use(session({
secret : 'secretStuff',
resave : false, // Forces the session to be saved back to the session store, even if the session was never modified during the request
saveUninitialized : false, // Force to save uninitialized session to db. A session is uninitialized when it is new but not modified.
cookie: { maxAge: 1000 * 60 * 60 }
}));
const {userRouter} = require('./routes/userLogging')
const {userApiRouter} = require('./routes/apis/userData')
const {companyApiRouter} = require('./routes/apis/companyData')
const {jobPostsApiRouter} = require('./routes/apis/jobData')
const {applicationsApiRouter} = require('./routes/apis/applicationData')
const {eventsApiRouter} = require('./routes/apis/eventsData')
app.use(cookieParser())
app.use('/', userRouter)
app.use('/api', userApiRouter)
app.use('/api', companyApiRouter)
app.use('/api', jobPostsApiRouter)
app.use('/api', applicationsApiRouter)
app.use('/api', eventsApiRouter)
app.get('*', (req, res) => {
res.sendFile(path.join(__dirname+'/client/build/index.html'));
});
const PORT = process.env.PORT || 5000
app.listen(PORT)<file_sep>const express = require('express')
const {db} = require('../../db')
const bodyParser = require('body-parser');
const router = express.Router();
const app = express()
app.use(bodyParser.json());
//gets all registered events for a company
router.get('/get-companyEvents', (req, res) => {
let co_id = req.session.userId
console.log('inside of this route')
db.query('SELECT * FROM events_students WHERE co_id = ?', [`${co_id}`], (error, results, fields) => {
if (error) throw error
console.log('Events results: ', results)
res.send({
events: results
})
})
}
)
//get all events for a company
router.get('/get-all-company-events', (req, res) => {
let co_id = req.session.userId
db.query('SELECT * FROM events_company WHERE co_id = ?', [`${co_id}`], (error, results, fields) => {
if (error) throw error
console.log('Events results: ', results)
res.send({
events: results
})
})
})
//gets all events for a student
router.get('/get-all-events', (req,res) => {
db.query('SELECT * FROM events_company', (error, results, fields) => {
if (error) throw error
console.log('events results: ', results)
res.send({
events: results
})
})
})
//get registered events for a user
router.get('/get-registered-events', (req, res) => {
let student_id = req.session.userId
db.query('SELECT * FROM events_students WHERE student_id = ?', [`${student_id}`], (error, results, fields) => {
if (error) throw error
console.log('registered events results: ', results)
res.send({
events: results
})
})
})
//apply to event, but checks eligibility first
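// expected JSON body: { co_name, eligibility, event_id, event_name, co_id, date, time }
// (student_id, student_name and degree are read from the session)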
router.post('/apply-to-event', (req, res) => {
let student_id = req.session.userId
let student_name = req.session.studentName
let student_degree = req.session.degree
let {
co_name,
eligibility,
event_id,
event_name,
co_id,
date,
time
} = req.body
console.log('degree: ', student_degree)
console.log('req body: ', req.body)
let alreadyRegisteredResults
if(!student_degree) {
student_degree = ""
}
if (!eligibility || eligibility === 'All' || eligibility === 'all'){
eligibility = student_degree.toLowerCase()
}
//check eligibility
db.query('SELECT * FROM events_company WHERE id = ?',
[`${event_id}`], (error, results, fields) => {
console.log('event application query result: ', results)
        if(eligibility.toLowerCase() !== student_degree.toLowerCase()){
            console.log('eligibility does not match')
            // return so the handler does not attempt to send a second response below
            return res.send({
                message: "Sorry, you are not eligible for this event"
            })
        }
//check if already registered
db.query('SELECT * FROM events_students WHERE student_id = ? AND event_id =?', [`${student_id}`, `${event_id}`],
async (error, results, fields) => {
if (error) throw error
alreadyRegisteredResults = await results
console.log('reg check: ', alreadyRegisteredResults)
if(alreadyRegisteredResults.length >= 1){
res.send({
message: 'You\'ve already registered for this'
})
} else if(alreadyRegisteredResults.length === undefined || alreadyRegisteredResults.length < 1 ) {
console.log('eligibility match each other')
// use placeholders instead of string interpolation to keep the insert safe from SQL injection
db.query(`INSERT INTO events_students (co_name, event_id, event_name, co_id, date, time, student_name, student_id)
VALUES(?, ?, ?, ?, ?, ?, ?, ?)`,
[co_name, event_id, event_name, co_id, date, time, student_name, student_id],
(error, result) => {
if (error) throw error
console.log('result of registering: ', result)
res.send({
message: 'Registered for Event'
})
}
)
}
})
})
})
router.post('/create-event', (req,res) => {
let co_id = req.session.userId
let {name,
time,
location,
date,
description,
eligibility} = req.body
let eventObj = {
name,
time,
location,
date,
description,
eligibility
}
// use placeholders instead of string interpolation to keep the insert safe from SQL injection
db.query(`INSERT INTO events_company (co_id, name, time, location, date, description, eligibility)
VALUES(?, ?, ?, ?, ?, ?, ?)`,
[co_id, name, time, location, date, description, eligibility],
(error, result) => {
if (error) throw error
res.send({
message: "Event Created",
event: eventObj
})
})
})
module.exports = {
eventsApiRouter: router
}<file_sep>import React, {Component} from 'react'
import ApplicationList from './ApplicationList'
import Filter from './Filter'
import axios from 'axios'
import NavBar from '../NavBar/NavBar'
class ApplicationsTab extends Component {
constructor(props){
super(props)
this.state={
apps: [],
filteredApps: []
}
}
componentDidMount(){
axios.get('/api/get-applications')
.then(response => {
console.log('response: ', response.data.apps)
this.setState({
apps: response.data.apps,
filteredApps: response.data.apps
})
})
}
reset = () => {
this.setState({
filteredApps: this.state.apps
})
}
filterByStatus = (status) => {
let apps = this.state.apps
let filteredApps = apps.filter(app => {
return app.status === status
})
this.setState({
filteredApps
})
}
render(){
return(
<div>
<NavBar/>
<h1>Student's Applications Tab</h1>
<div>
<Filter
handleStatusChange={this.filterByStatus}
handleReset={this.reset}
/>
</div>
<h3>Applications: </h3>
<div>
<ApplicationList
applications={this.state.filteredApps}
/>
</div>
</div>
)
}
}
export default ApplicationsTab<file_sep>export const selectedInfoReducer = (selectedInfo = null, action) => {
if (action.type === 'INFO_SELECTED'){
console.log('payload', action.payload)
return action.payload
}
return selectedInfo
}
<file_sep>import React, {useState} from 'react'
import EditExperience from './EditExperience'
const ExperienceCard = ({company, title, location, dates, description, id, submitDelete, submitExperienceForm}) => {
const [showEditForm, setEditForm] = useState(false)
const handleShowEditForm = () => {
setEditForm(true)
console.log('showing form: ', showEditForm)
}
const hideExperienceForm = () => {
setEditForm(false)
}
return (
<div>
<div className="ui items">
<div className="item">
<div className="content">
<div className="header">{company}</div>
<div className="meta">{title}</div>
<div className="description">{location}</div>
<div className="extra">{dates}</div>
<div className="extra">{description}</div>
</div>
</div>
<button className="ui button primary" onClick={()=>submitDelete(id)}>Delete</button>
<button className="ui button primary" onClick={handleShowEditForm}>Edit</button>
<div class="ui section divider"></div>
</div>
<div>
<EditExperience
showExperienceForm={showEditForm}
hideExperienceForm={hideExperienceForm}
submitExperienceForm={submitExperienceForm}
id={id}
/>
</div>
</div>
)
}
export default ExperienceCard<file_sep>import React from 'react'
const CompanyContact = (props) => {
return(
<div className="ui items">
<div className="item">
<div className="content">
<div className="description">{props.email}</div>
<div className="description">{props.phone}</div>
</div>
</div>
</div>
)
}
export default CompanyContact<file_sep>import React, {useState} from 'react'
const JobForm = ({showJobForm, hideJobForm, postNewJob}) => {
const [newTitle, setNewTitle] = useState('')
const [newDeadline, setNewDeadline] = useState('')
const [newLocation, setNewLocation] = useState('')
const [newSalary, setNewSalary] = useState('')
const [newDescription, setNewDescription] = useState('')
const [newCategory, setNewCategory] = useState('')
const handleTitleChange = (e) => {
setNewTitle(e.target.value)
}
const handleDeadlineChange = (e) => {
setNewDeadline(e.target.value)
}
const handleLocationChange = (e) => {
setNewLocation(e.target.value)
}
const handleSalaryChange = (e) => {
setNewSalary(e.target.value)
}
const handleDescriptionChange = (e) => {
setNewDescription(e.target.value)
}
const handleCategoryChange = (e) => {
setNewCategory(e.target.value)
}
const handleSubmit = () =>{
let today = new Date().toLocaleDateString()
let formatDate = today.replace("/", "-").replace("/", "-")
let JobObj = {
title: newTitle,
deadline: newDeadline,
date:formatDate,
location: newLocation,
salary: parseInt(newSalary, 10),
description: newDescription,
category: newCategory
}
postNewJob(JobObj)
setNewTitle('')
setNewDescription('')
setNewLocation('')
setNewSalary('')
setNewDeadline('')
setNewCategory('')
}
let show = showJobForm ? "modal display-block" : "modal display-none"
return(
<div className ={show} >
<div className="modal-main">
<form class="ui form">
<div class="field">
<label>Title</label>
<input type="text" name="title" placeholder="Title" value={newTitle} onChange={handleTitleChange}></input>
</div>
<div class="field">
<label>Description</label>
<input type="text" name="description" placeholder="Description" value={newDescription} onChange={handleDescriptionChange}></input>
</div>
<div class="field">
<label>Location</label>
<input type="text" name="location" placeholder="Location" value={newLocation}onChange={handleLocationChange}></input>
</div>
<div class="field">
<label>Category</label>
<div onChange={handleCategoryChange}>
<input type="radio" value="Full-Time" name="category"/> Full-Time
<input type="radio" value="Part-Time" name="category"/> Part-Time
<input type="radio" value="Internship" name="category"/> Intern
<input type="radio" value="On-Campus" name="category"/> On-Campus
</div>
</div>
<div class="field">
<label>Salary</label>
<input type="text" name="salary" placeholder="Salary" value={newSalary} onChange={handleSalaryChange}></input>
</div>
<div class="field">
<label>Deadline</label>
<input type="text" name="deadline" placeholder="Deadline" value={newDeadline} onChange={handleDeadlineChange}></input>
</div>
<button type="button" onClick={handleSubmit}>Submit</button>
<button type="button" onClick={hideJobForm}>Close</button>
</form>
</div>
</div>
)
}
export default JobForm<file_sep>import React, { useState } from "react"
const ChangeBasicInfo = ({
showBasicForm,
hideBasicForm,
submitBasicForm}) => {
const [newName, setNewName] = useState('')
const [newBirthDate, setNewBirthDate] = useState('')
const [newLocation, setNewLocation] = useState('')
const [newObjective, setNewObjective] = useState('')
const [newEmail, setNewEmail] = useState('')
const [newPhone, setNewPhone] = useState('')
const [newDegree, setNewDegree] = useState('')
const handleNameChange = (e) => {
setNewName(e.target.value)
}
const handleBirthDateChange = (e) => {
setNewBirthDate(e.target.value)
}
const handleLocationChange = (e) => {
setNewLocation(e.target.value)
}
const handleObjectiveChange = (e) => {
setNewObjective(e.target.value)
}
const handleEmailChange = (e) => {
setNewEmail(e.target.value)
}
const handlePhoneChange = (e) => {
setNewPhone(e.target.value)
}
const handleDegreeChange = (e) => {
setNewDegree(e.target.value)
}
const handleSubmit = () => {
let BasicInfo = {
name: newName,
birthdate: newBirthDate,
location: newLocation,
objective: newObjective,
email: newEmail,
phone: newPhone,
degree: newDegree
}
console.log('new basic info...', BasicInfo)
submitBasicForm(BasicInfo)
setNewName('')
setNewObjective('')
setNewPhone('')
setNewEmail('')
setNewLocation('')
setNewBirthDate('')
setNewDegree('')
}
let show = showBasicForm ? "modal display-block" : "modal display-none"
return(
<div className ={show} >
<div className="modal-main">
<form class="ui form">
<div class="field">
<label>Name</label>
<input type="text" name="name" placeholder="Name" value={newName} onChange={handleNameChange}></input>
</div>
<div class="field">
<label>Birth Date</label>
<input type="text" name="birthdate" placeholder="Birth Date" value={newBirthDate} onChange={handleBirthDateChange}></input>
</div>
<div class="field">
<label>Location</label>
<input type="text" name="location" placeholder="Location" value={newLocation}onChange={handleLocationChange}></input>
</div>
<div class="field">
<label>email</label>
<input type="text" name="location" placeholder="Email" value={newEmail} onChange={handleEmailChange}></input>
</div>
<div class="field">
<label>Phone Number</label>
<input type="text" name="location" placeholder="Phone #" value={newPhone}onChange={handlePhoneChange}></input>
</div>
<div class="field">
<label>Objective</label>
<input type="text" name="objective" placeholder="Objective" value={newObjective} onChange={handleObjectiveChange}></input>
</div>
<div class="field">
<label>Degree</label>
<input type="text" name="degree" placeholder="Degree" value={newDegree} onChange={handleDegreeChange}></input>
</div>
<button type="button" onClick={handleSubmit}>Submit</button>
<button type="button" onClick={hideBasicForm}>Close</button>
</form>
</div>
</div>
)
}
export default ChangeBasicInfo<file_sep># handshake
In the server directory, run "npm run dev" to start the app locally; it runs the front end and the back end concurrently.
<file_sep>import React, {useState} from 'react'
const ChangeContactInfo = ({showContactForm, submitContactForm, hideContactForm}) => {
const [newEmail, setNewEmail] = useState('')
const [newPhone, setNewPhone] = useState('')
const handleEmailChange = (e) => {
setNewEmail(e.target.value)
}
const handlePhoneChange = (e) => {
setNewPhone(e.target.value)
}
const handleSubmit = () => {
let ContactInfo = {
email: newEmail,
phone: newPhone,
}
console.log('new contact info: ', ContactInfo)
submitContactForm(ContactInfo)
setNewEmail('')
setNewPhone('')
}
let show = showContactForm ? "modal display-block" : "modal display-none"
return(
<div className ={show} >
<div className="modal-main">
<form class="ui form">
<div class="field">
<label>Email</label>
<input type="text" name="name" placeholder="Email" value={newEmail} onChange={handleEmailChange}></input>
</div>
<div class="field">
<label>Phone</label>
<input type="text" name="phone" placeholder="Phone" value={newPhone}onChange={handlePhoneChange}></input>
</div>
<button type="button" onClick={handleSubmit}>Submit</button>
<button type="button" onClick={hideContactForm}>Close</button>
</form>
</div>
</div>
)
}
export default ChangeContactInfo<file_sep>import {connect} from 'react-redux'
import React, {Component} from 'react'
import cookie from 'react-cookies'
import {Redirect} from 'react-router'
import {fetchJobPostPageInfo, updateJobStatus, postNewJob} from '../../actions/jobPostPage'
import AppliedJobList from './AppliedJobList'
import JobList from './JobList'
import JobForm from './JobForm'
import NavBar from '../NavBar/NavBar'
class JobPostPage extends Component {
constructor(props){
super(props)
this.state = {
showJobForm: false,
logged: true
}
}
componentDidMount(){
console.log('Job Post Page state: ', this.state)
if(cookie.load("Company-Logged")){
this.props.fetchJobPostPageInfo()
}
}
hideJobForm = () => {
this.setState({
showJobForm: false
})
}
showJobForm = () => {
this.setState({
showJobForm: true
})
}
render() {
console.log('props', this.props)
let redirectVar = null
if(!cookie.load("Company-Logged") || this.state.logged === false){
redirectVar = <Redirect to='/company-login'/>
}
return(
<div>
<NavBar/>
{redirectVar}
<h1>Company's Job Posting Page</h1>
<div className="ui section divider"></div>
<h2>Posted Jobs</h2>
<JobList
jobs={this.props.jobPostPageInfo.JobsPosted}
/>
<div className="ui section divider"></div>
<button className="ui button primary" onClick={this.showJobForm}>Post Job</button>
<JobForm
hideJobForm={this.hideJobForm}
showJobForm={this.state.showJobForm}
postNewJob={this.props.postNewJob}
/>
<div className="ui section divider"></div>
<h2>Applied Jobs</h2>
<AppliedJobList
appliedJobs={this.props.jobPostPageInfo.JobsApplied}
updateJobStatus={this.props.updateJobStatus}
/>
</div>
)
}
}
const mapStatetoProps = (state) => {
console.log(state)
return{
jobPostPageInfo: state.jobPostPageInfo
}
}
export default connect(mapStatetoProps, {fetchJobPostPageInfo, updateJobStatus, postNewJob})(JobPostPage)
| 535b3d0fa77f816716be2202413603a8c59cf990 | ["JavaScript", "Markdown"] | 53 | JavaScript | adao88/handshake | 3dc4e921376eda80900fb53bdcc2a0adaf6163b0 | b2a8625b64b71b3ac6a23acc07e0b1dbc46262ee | refs/heads/master |
<file_sep>from enum import Enum
# NOTE: these calls rely on the legacy `enum` package from PyPI (members created positionally
# and exposing `.key`); the stdlib `enum.Enum` functional API is different.
case_running_status = Enum('PASS', 'FAILED', 'INIT', 'RUNNING')
case_operation_status = Enum('PRE', 'POST')
<file_sep># -*- coding: utf-8 -*-
import logging
import sys
import importlib as importlib
from time import sleep
import yaml
from cct.engine import Engine
from cct.testset import TestSet
logging.basicConfig(level=logging.DEBUG, format='%(asctime)s %(filename)s[line:%(lineno)d] %(levelname)s %(message)s',
datefmt='%a, %d %b %Y %H:%M:%S')
logger = logging.getLogger(__file__)
def load_case():
data = None
with open('./testset.yml') as f:
data = yaml.safe_load(f)  # safe_load avoids arbitrary object construction and the missing-Loader warning
return data
def get_case_instance(path):
name = 'cct.cases.{module_name}'.format(module_name=path)
module = importlib.import_module(name)
case = getattr(module, path)
return case
def get_cases():
ts_data = load_case()
case_dict = {}
for case in ts_data.get('cases'):
id = case.get('id')
name = case.get('name')
dep = case.get('dependence')
process_at = case.get('process_at')
case_dict[case.get('id')] = get_case_instance(case.get('path'))(id, name, dep, process_at, )
return case_dict
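# For reference, a hypothetical testset.yml shape that get_cases() would accept (the ids, paths
# and status values below are made-up illustrations, not the project's real test set):
# cases:
#   - id: '111'
#     name: create_lun
#     path: create_lun
#     dependence: null
#     process_at: null
#   - id: '222'
#     name: start_io
#     path: start_io
#     dependence: {id: '111', status: pass}
#     process_at: {id: '111', status: pre}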
def init_log():
# create a logger and set its level to DEBUG
logger.setLevel(logging.DEBUG)
# create a stream handler and set its level to DEBUG
handler = logging.StreamHandler(sys.stdout)
handler.setLevel(logging.DEBUG)
# create a formatter and attach it to the handler
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
handler.setFormatter(formatter)
# add the handler created above to the logger
logger.addHandler(handler)
if __name__ == '__main__':
# init_log()
cases = get_cases()
ts = TestSet(cases)
engine = Engine(test_set=ts)
engine.start()
<file_sep># -*- coding: utf-8 -*-
import logging
logger = logging.getLogger(__file__)
step = {
'flow_list': [
{
'type': 'fault',
'case_list': [
None,
{
'id': '333',
'name': 'node_fault',
'model': {
}
},
]
},
{
'type': 'service',
'case_list': [
{
'id': '111',
'name': 'create_lun',
'model': {
}
},
{
'id': '222',
'name': 'create_snap',
'model': {
}
},
]
}
]
}
def pares_step(step):
"""
找到每个bbt的依赖关系,用例结果依赖和并发依赖
:return:
"""
# 先搞业务流
ret = []
flow_service = get_case_list(step, 'service')
flow_fault = get_case_list(step, 'service')
flow_io = get_case_list(step, 'service')
for i in range(len(flow_service.case_list)):
if i == 0:
if flow_service[0]:
ret.append(Case())
else:
# check left has service or not
left_node = 1
pass
def make_dependence(x, y, step):
case = get_case(x, y, step)
if case:
if x == 0 and is_service(step, y):
pass
else:
pass
else:
return None
def get_first_service_case():
pass
def handle_service_node(case,x, step):
left_nodes = get_left_nodes(x, step)
if left_nodes:
if contains_service_node():
case.set_dep_rst(get_first_service_case())
pass
else:
pass
def contains_service_node(case_list):
pass
def get_left_nodes(x, y, step):
ret = []
if x > 0:
for index in [index for index in range(len(step.flow_list)) if index != y]:
ret.append(step.flow_list[index][x])
return ret
else:
logger.error("invalid x={x}".format(x=x))
return None
def is_service(step, y):
return step.flow_list[y] == 'service'
def get_case(x, y, step):
# TODO add the additional prop type for next process
return step.flow_list[y].case_list[x]
def get_case_type(y, step):
return step.flow_list[y].type
def get_case_list(step, flow_type):
for flow in step.flow_list:
if flow.type == flow_type:
return flow
return None
def to_test_set():
pass
class Case:
def __init__(self, dep_rst=None, dep_op=None):
self.dep = {
'result': None,
'op': None
}
pass
# stray debug code left at module level; len(None) raises a TypeError on import, so keep it disabled
# a = None
# print(len(a))
<file_sep>import logging
from time import sleep
from cct.case import Case
from cct.common.enums import case_operation_status
logger = logging.getLogger(__file__)
class create_lun(Case):
def pre_test(self):
logger.info('doing something before create luns')
sleep(3)
def process(self):
self.set_case_operation_status(case_operation_status.PRE)
logger.info('create luns')
sleep(5)
self.set_case_operation_status(case_operation_status.POST)
def post_test(self):
logger.info('create lun has finished')
<file_sep>from cct.case import Case
class create_lun(Case):
pass
class create_snapshot(Case):
pass
class create_lun(Case):
pass<file_sep>import logging
from time import sleep
from cct.case import Case
logger = logging.getLogger(__file__)
class start_io(Case):
def pre_test(self):
logger.info ('doing something before start io')
sleep(3)
def process(self):
logger.info('started io')
sleep(5)
<file_sep>class TestSet(object):
def __init__(self, case_dict):
self.__case_dict = case_dict
self.__case_dep_running_status_dict = {}
self.__case_dep_op_status_dict = {}
self.__load_data()
self.__load_data(False)
for id, case in self.case_dict.items():
case.test_set = self
@property
def case_dict(self):
return self.__case_dict
def __load_data(self, is_dep_running=True):
target_dict = self.__case_dep_running_status_dict if is_dep_running else self.__case_dep_op_status_dict
for id, case in self.case_dict.items():
dep_dict = case.dependence if is_dep_running else case.dep_op_dict
if dep_dict is not None:
dep_id = dep_dict.get('id')
status = dep_dict.get('status').upper()
key = '{case_id}_{status}'.format(case_id=dep_id, status=status)
if key not in target_dict:
target_dict[key] = []
target_dict[key].append(case)
def set_case_status(self, case, case_status):
'''
update the case running status and resume any cases that were waiting on that status
:param case:
:param case_status:
:return:
'''
case.status = case_status
# make the related case available
key = '{case_id}_{status}'.format(case_id=case.case_id, status=case_status.key.upper())
cases = self.__case_dep_running_status_dict.get(key)
if cases:
for dep_case in cases:
dep_case.resume()
def set_case_operation_status(self, case, status):
case.operation_status = status
key = '{case_id}_{status}'.format(case_id=case.case_id, status=status.key.upper())
cases = self.__case_dep_op_status_dict.get(key)
if cases:
for dep_case in cases:
dep_case.resume()
<file_sep>import logging
import threading
import time
from cct.common.enums import case_running_status
logger = logging.getLogger(__file__)
class Engine(object):
def __init__(self, *keys, **kwargs):
self.__testset = kwargs.get('test_set')
self.__case_thread_list = []
def start(self):
# execute the cases
for id, case in self.__testset.case_dict.items():
t = threading.Thread(target=self.run_single_case, name=case.case_id, kwargs={'case': case})
t.setDaemon(True)
logger.info('test case ={name} will start'.format(name=case.name))
t.start()
self.__case_thread_list.append(t)
for t in self.__case_thread_list:
t.join()
def run_single_case(self, case):
try:
case.status = case_running_status.RUNNING
self.pause_4_depend_other_case(case)
case.pre_test()
self.pause_4_depend_op_status(case)
case.process()
case.post_test()
self.__testset.set_case_status(case, case_running_status.PASS)
except Exception:
logger.exception('case={name} failed'.format(name=case.name))
self.__testset.set_case_status(case, case_running_status.FAILED)
def pause_4_depend_other_case(self, case):
if case.dependence:
logger.info('case={name} will pause to wait for other case running status'.format(name=case.name))
case.pause()
def pause_4_depend_op_status(self, case):
if case.dep_op_dict:
logger.info('case={name} will pause to wait for other case op status'.format(name=case.name))
case.pause()
else:
logger.info ('case={name} will not pause={rst}'.format(name=case.name, rst=case.dep_op_dict is None))
def do1():
logger.info ("i am doing 1")
<file_sep>from git import Repo
if __name__ == '__main__':
repo=Repo('')
# create a repository object
repo = Repo(r'E:\Notes')
# make the file changes, then:
# get the repository staging area (index)
index = repo.index
# stage the modified file
index.add(['new.txt'])
# commit the change to the local repository
index.commit('this is a test')
# get the remote repository
remote = repo.remote()
# push the local commits to the remote
remote.push()<file_sep>import logging
import threading
import time
from cct.common.enums import case_running_status
logger = logging.getLogger(__file__)
class Case(object):
def __init__(self, case_id, case_name, dependence, process_at):
self.__case_id = case_id
self.__signal = threading.Event()
self.__status = case_running_status.INIT
self.__name = case_name
self.__dependence = dependence
self.__operation_status = None
self.__dependence_op = process_at
self.__test_set = None
@property
def test_set(self):
return self.__test_set
@test_set.setter
def test_set(self, ts):
self.__test_set = ts
@property
def dep_op_dict(self):
return self.__dependence_op
@property
def operation_status(self):
return self.__operation_status
@operation_status.setter
def operation_status(self, status):
self.__operation_status = status
@property
def dependence(self):
return self.__dependence
@property
def status(self):
return self.__status
@status.setter
def status(self, status):
self.__status = status
@property
def name(self):
return self.__name
def resume(self):
logger.info('case={id} is resumed'.format(id=self.name))
self.__signal.set()
def pause(self):
logger.info('case={id} is paused'.format(id=self.name))
self.__signal.clear()
self.__signal.wait()
@property
def case_id(self):
return self.__case_id
def pre_test(self):
logger.info("do something before test")
time.sleep(3)
pass
def process(self):
pass
def post_test(self):
pass
def set_case_operation_status(self, status):
self.test_set.set_case_operation_status(self, status)
| a9f1507c12139e6687983445b5e5f5e97062ae41 | ["Python"] | 10 | Python | LmangoLemon/mind | 1b269acca41f840c5c71cb6c92ec92ecfb977ad4 | 9dc7e08ab879c5999e7f0e0256ea2cf043e61297 | refs/heads/master |
<repo_name>gt-elc-lab/thrum<file_sep>/README.md
# thrum
visualization for campus social media activity
<file_sep>/__init__.py
from collection import *
from analysis import *
__all__ = ['collection', 'analysis']
<file_sep>/analysis/tfidf.py
import math
import re
import nltk
import string
from nltk.tokenize import word_tokenize
from nltk.corpus import stopwords
from nltk.stem.porter import PorterStemmer
from sklearn.feature_extraction.text import TfidfVectorizer
class TFIDF(object):
def __init__(self, corpus, stemmer=PorterStemmer()):
self.stemmer = stemmer
self.corpus = corpus
self.punctuation = (string.punctuation)
self.tfidf = None
def stem_tokens(self, tokens):
stemmed = []
for token in tokens:
# stemmed.append(self.stemmer.stem(token))
stemmed.append(token)
return stemmed
def tokenize(self, text):
tokens = nltk.word_tokenize(text)
stems = self.stem_tokens(tokens)
return stems
def compute(self, ngram_range=(1, 2)):
token_dict = {}
for document in self.corpus:
text = document.get_text().lower()
no_punctuation = "".join([ch for ch in text if ch not in self.punctuation])
token_dict[document] = no_punctuation
tfidf = TfidfVectorizer(tokenizer=self.tokenize, stop_words='english', ngram_range=ngram_range)
tfidf.fit_transform(token_dict.values())
self.tfidf = tfidf
return self.tfidf
def get(self):
tfidf = self.compute()
feature_names = tfidf.get_feature_names()
words = []
for item in self.corpus:
response = tfidf.transform([item.get_text()])
for col in response.nonzero()[1]:
words.append((feature_names[col], response[0, col]))
words = sorted(words, key=lambda x: x[1], reverse=True)
return words
<file_sep>/analysis/d3_formatters.py
class ForceLayout(object):
def __init__(self):
return
def create_bigrams_graph(self, bigrams):
nodes = []
edges = []
location = {}
for a, b in bigrams:
if a not in location:
nodes.append({'word': str(a)})
location[a] = len(nodes) - 1
if b not in location:
nodes.append({'word': str(b)})
location[b] = len(nodes) - 1
edge = {'source': location[a], 'target' : location[b]}
edges.append(edge)
return (nodes, edges)<file_sep>/tests.py
import unittest
import nltk
from datetime import datetime
from datetime import timedelta
from analysis.tfidf import TFIDF
from analysis.ner import NER
from collection.models import Post, Comment
class TFIDFTests(unittest.TestCase):
def setUp(self):
self.corpus = fetch_data()
self.tfidf = TFIDF(self.corpus)
def testTFIDF(self):
values = self.tfidf.get()
for v, r in values:
print v, r
def testSchoolTFIDF(self):
schools = Post.list_colleges()
for school in schools:
print ''
print school
print '_______________'
corpus = fetch_data(college=school)
results = TFIDF(corpus).get()[::-1]
for v, r in results[:20]:
print v, r
class NERTests(unittest.TestCase):
def setUp(self):
self.corpus = fetch_data()
self.ner = NER(self.corpus)
def testNER(self):
data = "".join([item.get_text() for item in self.corpus])
result = self.ner.named_entities(data)
for v, r in result:
if r in ['NN','JJ', 'NNS', 'JJR','RB','RBR']:
print v , r
class TopicDetectionTests(unittest.TestCase):
def setUp(self):
self.corpus = fetch_data()
self.tfidf_results = TFIDF(self.corpus).get()
fused = ''.join([item.get_text() for item in self.corpus])
self.ner_results = NER(self.corpus).named_entities(fused)
def testTopicDetection(self):
tf = set([word[0] for word in self.tfidf_results if word[1] > 0.3])
ner = set([word[0] for word in self.ner_results if word[1] in parts_of_speech])
intersection = tf.intersection(ner)
for word in intersection:
print word
recovered = recover_bigrams(tf, ner)
for word in recovered:
print word
parts_of_speech = set(['NN','JJ', 'NNS', 'JJR','RB','RBR'])
def fetch_data(college='Georgia Tech', limit=20):
today = datetime.now()
two_days_ago = today - timedelta(hours=48)
posts = Post.query.filter(Post.college == college, Post.created.between(two_days_ago, today)).all()
comments = []
for post in posts:
for comment in post.comments:
comments.append(comment)
return posts + comments
def recover_bigrams(tfidf, named_entities):
bigrams = set()
for bigram in tfidf:
first_word = bigram.split(" ")[0]
if first_word in named_entities:
bigrams.add(bigram)
return bigrams
if __name__ == '__main__':
unittest.main()<file_sep>/app.py
from flask import Flask, json, render_template, request, jsonify
from flask.ext.triangle import Triangle
from nltk.tokenize import sent_tokenize
from datetime import datetime, timedelta
from collection.models import Post, Comment, db
from collection.nltk_20 import WordFrequency
from analysis.tfidf import TFIDF
from analysis.grams import BiGramGenerator
from analysis.d3_formatters import ForceLayout
from analysis.timeseries import TimeSerializer
from analysis.ner import NER
app = Flask(__name__, static_url_path='/static')
Triangle(app)
query = db.session.query(Post)
@app.route('/')
def index():
return render_template('home.html')
@app.route('/colleges')
def colleges():
schools = Post.list_colleges()
return jsonify(data=schools)
@app.route('/trending')
def trending():
college = request.args.get('college')
data = fetch_data(college)
topics = detect_topics(data)
return jsonify(data=topics)
def fetch_data(college, hours=24):
today = datetime.now()
two_days_ago = today - timedelta(hours=hours)
posts = Post.query.filter(
Post.college == college,
Post.created.between(two_days_ago, today)).all()
comments = []
for post in posts:
for comment in post.comments:
comments.append(comment)
return posts + comments
def detect_topics(corpus):
parts_of_speech = set(['NN','JJ', 'NNS', 'JJR','RB','RBR'])
tfidf = TFIDF(corpus).get()
# fused = ''.join([item.get_text() for item in corpus])
# ner = NER(corpus).named_entities(fused)
relevant = set([word[0] for word in tfidf if word[1] > 0.4])
# entities = set([word[0] for word in ner if word[1] in parts_of_speech])
# intersection = relevant.intersection(entities)
# bigrams = recover_bigrams(relevant, entities)
return list(relevant)
def recover_bigrams(tfidf, named_entities):
bigrams = set()
for bigram in tfidf:
first_word = bigram.split(" ")[0]
if first_word in named_entities:
bigrams.add(bigram)
return bigrams
@app.route('/posts')
def send_posts():
college = request.args.get('college')
term = request.args.get('term')
posts = [item.get_text() for item in text_search(college, term)]
return jsonify(data=posts)
def text_search(college, term):
present = datetime.now()
past = datetime.now() - timedelta(weeks=4)
posts = Post.query.whoosh_search(term).filter(
Post.college==college,
Post.created.between(past, present)).all()
comments = Comment.query.whoosh_search(term).filter(
Comment.college==college,
Comment.created.between(past, present)).all()
return posts + comments
@app.route('/usage')
def usage():
college = request.args.get('college')
term = request.args.get('term')
posts = text_search(college, term)
data = cluster(posts)
return jsonify(data=data)
def cluster(posts):
return TimeSerializer().daily_buckets(posts)
# @app.route('/data/')
# def send_data():
# dropdown = Post.list_colleges()
# colleges = request.args.getlist('colleges')
# term = request.args.get('term')
# result = []
# for school in colleges:
# corpus = query_word(school, term)
# buckets = bucketize(corpus, school, term)
# if buckets:
# result += buckets
# return render_template('line_graph.html', colleges=dropdown,
# results=jsonify(data=result))
# @app.route('/<college>/<word>/<time_stamp>')
# def word_tree(college, word, time_stamp):
# corpus = query_interval(college, word, time_stamp)
# return jsonify(data=word_tree_data(corpus))
# def query_word(college, word):
# post_results = Post.query.whoosh_search(word).filter(Post.college==college)
# comments_results = Comment.query.whoosh_search(word).filter(Comment.college==college)
# corpus = post_results.all() + comments_results.all()
# return corpus
# def bucketize(corpus, school, term):
# serializer = TimeSerializer()
# return serializer.weekly_buckets(corpus, school, term)
# def word_tree_data(posts):
# corpus = ""
# for post in posts:
# if isinstance(post, Post):
# corpus += post.title
# corpus += post.text
# if isinstance(post, Comment):
# corpus += post.body
# return sent_tokenize(corpus)
# def query_interval(college, word, time_stamp):
# date = datetime.fromtimestamp(float(time_stamp) / 1000)
# week_before = date - timedelta(weeks=1)
# week_after = date + timedelta(weeks=1)
# posts = Post.query.whoosh_search(word).filter(Post.college==college,
# Post.created.between(week_before, week_after))
# comments = Comment.query.whoosh_search(word).filter(Comment.college==college,
# Comment.created.between(week_before, week_after))
# result = posts.all() + comments.all()
# return result
if __name__ == "__main__":
app.run(debug=True)
<file_sep>/analysis/grams.py
import nltk
from nltk.collocations import *
class BiGramGenerator(object):
def __init__(self):
return
def bigrams(self, corpus, frequency):
tokens = nltk.wordpunct_tokenize(corpus)
finder = BigramCollocationFinder.from_words(tokens)
finder.apply_freq_filter(frequency)
bigram_measures = nltk.collocations.BigramAssocMeasures()
scored = finder.score_ngrams(bigram_measures.raw_freq)
grams = sorted(bigram for bigram, score in scored)
return grams<file_sep>/collection/db_create.py
from models import db, Post, Comment
try:
Comment.query.delete()
Post.query.delete()
print "successfully cleared the database"
except Exception as e:
print str(e)
finally:
db.create_all()
print "successfully created a new database"<file_sep>/static/src/app.js
'use strict'
var app = angular.module('thrum', []);
app.controller('MainController', MainController);
app.service('Api', ['$http', Api]);
app.service('SelectionService', SelectionService);
app.directive('barGraph', ['SelectionService', BarGraph]);
MainController.$inject = ['$scope', '$rootScope', 'Api'];
function MainController($scope, $rootScope, Api) {
$scope.selected;
$scope.colleges;
$scope.trending;
$scope.usageData = [];
init();
function init() {
Api.getColleges().success(function(response, status) {
$scope.colleges = response.data;
})
.error(function(data, status) {
alert('Server Responded with ' + status);
});
}
$scope.selectCollege = function(college) {
$scope.selected = college;
Api.getTrending(college).success(function(response, status) {
$scope.trending = response.data;
})
.error(function(response, status) {
alert('Server Responded with ' + status);
});
};
$scope.usage = function(term) {
Api.getUsage($scope.selected, term).success(function(response, status) {
$scope.usageData = response.data;
$rootScope.$broadcast('dataChanged');
})
.error(function(response, status) {
alert('Server Responded with ' + status);
});
};
}
function Api($http) {
var exports = {};
exports.getColleges = function() {
return $http({
url: '/colleges',
method: 'GET',
params: null
});
};
exports.getTrending = function(college) {
return $http({
url: '/trending',
method: 'GET',
params: {
college: college
}
});
};
exports.getUsage = function(college, term) {
return $http({
url: '/usage',
method: 'GET',
params: {
college: college,
term: term
}
});
};
return exports;
}
function BarGraph() {
var directive = {
restrict: 'AE',
scope: {
data: '='
},
link: link,
};
function link(scope, element, attrs, SelectionService) {
var margin = {top: 20, right: 20, bottom: 50, left: 50};
var width = 800 - margin.left - margin.right;
var height = 300 - margin.top - margin.bottom;
function DateDomain(d) {return new Date(d.date);}
function YCountDomain(d) { return d.count;}
var svg = d3.select(element[0]).append("svg")
.attr("width", width + margin.left + margin.right)
.attr("height", height + margin.top + margin.bottom)
.append("g")
.attr("transform", "translate(" + margin.left + "," + margin.top + ")");
scope.render = function(data) {
var x = d3.time.scale().range([0, width])
.domain([d3.min(data, DateDomain),
d3.max(data, DateDomain)]);
var y = d3.scale.linear().range([height, 0])
.domain([0, d3.max(data, YCountDomain)]);
svg.selectAll('.y.axis').remove();
svg.selectAll('.x.axis').remove();
var xAxis = d3.svg.axis()
.scale(x)
.orient("bottom")
var yAxis = d3.svg.axis()
.scale(y)
.orient("left");
svg.append("g")
.attr("class", "x axis")
.attr("transform", "translate(0," + height + ")")
.call(xAxis);
svg.append("g")
.attr("class", "y axis")
.call(yAxis)
.append("text")
.attr("transform", "rotate(-90)")
.attr("y", 6)
.attr("dy", ".71em")
.style("text-anchor", "end")
.text("");
var line = d3.svg.line()
.x(function(d) {
return x(new Date(d.date));
})
.y(function(d) {
return y(d.count);
})
.interpolate('monotone');
svg.selectAll(".line").remove();
svg.append("path")
.datum(data)
.attr("class", "line")
.attr('fill', 'none')
.style('stroke', 'red')
.style('stroke-width', '3px')
.attr("d", line);
svg.selectAll("circle").remove();
var points = svg.selectAll(".point")
.data(data)
.enter().append("svg:circle")
.attr("stroke", "black")
.attr("fill", 'blue')
.attr("cx", function(d) {
return x(new Date(d.date));
})
.attr("cy", function(d) {
return y(d.count);
})
.attr("r", 5)
};
scope.$on('dataChanged', function() {
alert('data changed');
scope.render(scope.data);
});
}
return directive;
}
SelectionService.$inject = ['$rootScope'];
function SelectionService($rootScope) {
var college;
var term;
var data;
var service = {};
service.setCollege = function(college) {
this.college = college;
};
service.setTerm = function(term) {
this.term = term;
};
service.setData = function(data) {
this.data = data;
$rootScope.$broadcast('injected');
};
service.addToData = function(moredata) {
this.data.push(moredata);
};
service.getTerm = function() {
return this.term;
};
service.getData = function() {
return this.data;
};
service.getCollege = function() {
return this.college;
};
return service;
}<file_sep>/collection/models.py
from flask import Flask
from flask.ext.sqlalchemy import SQLAlchemy
from datetime import datetime
import flask.ext.whooshalchemy as whoosh
DATABASE_URI = 'mysql://root:password@localhost/proto_thrum'
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = DATABASE_URI
db = SQLAlchemy(app)
class Post(db.Model):
"""sqlalchemy Post object.
Attributes:
id (str):
title (str):
text (str):
url (str):
ups (int):
downs (int):
subreddit (str):
college (str):
time_stamp (datetime):
created (datetime):
modified (datetime):
"""
__tablename__ = 'post'
__searchable__ = ['title','text']
id = db.Column(db.String(10), primary_key=True)
title = db.Column(db.String(10000), nullable=False)
text = db.Column(db.String(10000), nullable=True)
url = db.Column(db.String(1000), nullable=True)
ups = db.Column(db.Integer, nullable=False)
downs = db.Column(db.Integer, nullable=False)
subreddit = db.Column(db.String(1000), nullable=False)
college = db.Column(db.String(1000), nullable=False)
# pass the callable (no parentheses) so the default is evaluated per row, not once at import time
time_stamp = db.Column(db.DateTime, default=datetime.now)
created = db.Column(db.DateTime, nullable=False)
modified = db.Column(db.DateTime, default=datetime.now)
comments = db.relationship('Comment', backref='post', lazy='dynamic')
def __init__(self, id, title, text, url, ups, downs, subreddit, college, create_utc):
"""
Args:
id (str):
title (str):
text (str):
url (str):
ups (int):
downs (int):
subreddit (str):
college (str):
time_stamp (datetime):
created (datetime):
modified (datetime):
"""
self.id = id
self.title = title
self.text = text
self.url = url
self.ups = ups
self.downs = downs
self.subreddit = subreddit
self.college = college
self.created = datetime.utcfromtimestamp(create_utc)
@staticmethod
def list_colleges():
colleges = db.session.query(Post.college.distinct())
return sorted([str(name[0]) for name in colleges.all()])
def get_text(self):
return self.text
def __repr__(self):
""" String representation of a post object """
return '<Post %s>' % self.title
def __hash__(self):
return hash(self.id + str(self.created))
def __eq__(self, other):
return self.id == other.id
class Comment(db.Model):
"""sqlalchemy Comment object.
Attributes:
id (str):
body (str):
ups (int):
downs (int):
post_id (str):
post (sqlalchemy relationship):
post_id (str):
time_stamp (datetime):
modified (datetime):
created (datetime):
"""
__tablename__ = 'comment'
__searchable__ = ['body']
id = db.Column(db.String(10), primary_key=True)
body = db.Column(db.String(10000), nullable=True)
ups = db.Column(db.Integer, nullable=False)
downs = db.Column(db.Integer, nullable=False)
post_id = db.Column(db.String(10), db.ForeignKey('post.id'))
time_stamp = db.Column(db.DateTime, default=datetime.now)
modified = db.Column(db.DateTime, default=datetime.now)
college = db.Column(db.String(1000), nullable=False)
created = db.Column(db.DateTime, nullable=False)
def __init__(self, id, body, ups, downs, post_id, create_utc, college):
"""
Args:
id (str):
body (str):
ups (int):
downs (int):
post_id (str):
post (sqlalchemy relationship):
post_id (str):
time_stamp (datetime):
modified (datetime):
created (datetime):
"""
self.id = id
self.body = body
self.ups = ups
self.downs = downs
self.post_id = post_id
self.created = datetime.utcfromtimestamp(create_utc)
self.college = college
def get_text(self):
return self.body
def __repr__(self):
""" String representation of a comment object"""
return '<Comment %s>' % self.body[:10]
def __hash__(self):
return hash(self.id + str(self.created))
def __eq__(self, other):
return self.id == other.id
whoosh.whoosh_index(app, Post)
whoosh.whoosh_index(app, Comment)<file_sep>/analysis/timeseries.py
import itertools
from datetime import datetime, date, timedelta
from collections import Counter
from nltk.corpus import stopwords
class TimeSerializer(object):
""" Provides methods for computing time series data"""
def __init__(self):
"""
Args:
posts (list): list of post sqlalchemy post objects
"""
self.now = datetime.now()
def today(self):
return datetime.now()
def get_days_ago(self, days):
return datetime.now() - timedelta(days=days)
def get_weeks_ago(self, weeks):
return datetime.now()- timedelta(weeks=weeks)
def hourly_activity(self, data):
result = []
data = sorted(data, key=lambda x: x.created)
groupings = itertools.groupby(data, key=lambda x: x.created.hour)
for hour, group in groupings:
record = {'date': str(self.today() - timedelta(hours=hour)), 'count': len(list(group))}
result.append(record)
return result
def weekly_buckets(self, data, school, term, vanilla=False):
data = sorted(data, key=lambda x: x.created)
buckets = itertools.groupby(data, key=lambda x: int(x.created.strftime('%U')))
if vanilla:
return buckets
output = []
for bucket, items in buckets:
# datetime objects don't have a week field so we have to find the
# week of the post with respect to the current time.
# strftime('%U') gives you the current week in the year as a string
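# worked example with illustrative numbers: if now falls in week 30 and the bucket key is week 28,
# offset = 30 - 28 = 2, so the bucket gets stamped with the date two weeks before now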
offset = int(self.now.strftime('%U')) - bucket
date = self.now - timedelta(weeks=offset)
count = len(list(items))
output.append({'date': str(date), 'count': count, 'college':school, 'word': term })
return output
def daily_buckets(self, data):
data = sorted(data, key=lambda x: x.created)
buckets = itertools.groupby(data, key=lambda x: x.created.timetuple().tm_yday)
year = datetime.now().year
year_start = date(year,1,1)
daily = []
for bucket, items in buckets:
time_stamp = str(year_start + timedelta(days=bucket - 1))
daily.append({'date': time_stamp, 'count': len(list(items))})
return daily<file_sep>/analysis/ner.py
import string
from nltk.tokenize import word_tokenize, sent_tokenize
from nltk.corpus import stopwords
from nltk import probability
from stop_words import get_stop_words
from nltk.tag import pos_tag
class NER(object):
""" Performs Named Entity Recognition against a given corpus"""
def __init__(self, corpus):
self.corpus = corpus
self.stopwords = get_stop_words('english') + stopwords.words('english')
def tag_words(self, sentence):
return pos_tag(sentence)
def tokenize_sentences(self, corpus):
corpus = corpus.lower()
removed_punctuation = "".join([ch for ch in corpus if ch not in string.punctuation])
sentences = sent_tokenize(removed_punctuation)
return sentences
def tokenize_words(self, sentence):
words = word_tokenize(sentence)
return words
def named_entities(self, corpus, tag_prefix=""):
sentences = self.tokenize_sentences(corpus)
tagged_text = []
for sentence in sentences:
tagged_text = tagged_text + (self.tag_words(self.tokenize_words(sentence)))
return [(word, tag) for (word, tag) in tagged_text if tag.startswith(tag_prefix)]<file_sep>/collection/config.py
SUBREDDITS = {
'Georgia Tech':'gatech',
'Ohio State University':'OSU',
'Texas A&M': 'aggies',
'University of Central Florida':'ucf',
'UC Berkeley': 'berkeley',
'McGill University':'mcgill',
'Rochester Institute of Technology': 'rit',
'UC Santa Barbara': 'ucsantabarbara',
'UT Austin': 'UTAustin',
'Purdue' : 'purdue',
'York University': 'yorku',
'CalPoly Ubispo': 'CalPoly',
'North Carolina State University': 'ncsu',
'Penn State University': 'PennStateUniversity',
'University of Maryland': 'UMD',
'University of Georgia' : 'UGA',
'University of San Diego': 'ucsd',
'Arizona State University': 'asu',
'University of British Columbia': 'UBC'
}
USERNAME = 'elc-gt'
PASSWORD = '<PASSWORD>'<file_sep>/collection/nltk_20.py
import re
import nltk
from nltk.probability import FreqDist
from nltk.corpus import stopwords
class WordFrequency(object):
def __init__(self):
self.stopwords = stopwords.words('english')
self.stopwords.append('I')
def word_frequencies(self, text_glob):
tokens = [word for word in nltk.word_tokenize(text_glob)
if word not in self.stopwords]
text = nltk.Text(tokens)
fd = nltk.FreqDist(text)
return [{'word': word, 'value':count} for word, count in fd.items()]
@staticmethod
def remove_punctuation(text):
regex = re.compile('[^a-zA-Z0-9 ]')
return regex.sub("", text)<file_sep>/collection/rcrawler_update.py
import praw
from models import db, Post, Comment
from config import SUBREDDITS, USERNAME, PASSWORD
from datetime import datetime
import re
def main():
"""
Runs the crawler
"""
reddit = praw.Reddit('PRAW Gatech subreddit monitor')
print "Logging in"
reddit.login(USERNAME, PASSWORD)
print "Logged in. Starting to scrape"
for school, subreddit in SUBREDDITS.iteritems():
print "Scraping {} : r/{}".format(school, subreddit)
start = datetime.now()
posts = reddit.get_subreddit(subreddit)
num_posts, num_comments = crawl_subreddit(posts.get_new(limit=30), school, subreddit)
duration = datetime.now() - start
output = "Finished scraping {} : r/{} | {} posts {} comments | {} seconds"
print output.format(school, subreddit, num_posts, num_comments,
duration.seconds)
print "Done"
def crawl_subreddit(posts, school, subreddit):
"""
Args:
posts: posts obtained from reddit
school: the name of the school
subreddit: the name of the subreddit
Returns:
(num_posts, num_comments) : the number of posts and comments
"""
num_posts = 0
num_comments = 0
regex = re.compile('[^a-zA-Z0-9 -]')
try:
for submission in posts:
new_post = Post(id=submission.id,
title=submission.title,
text=submission.selftext,
url=submission.url,
ups=submission.ups,
downs=submission.downs,
subreddit=subreddit,
college=school,
create_utc=submission.created_utc)
db.session.merge(new_post)
db.session.commit()
num_posts += 1
submission.replace_more_comments(limit=None, threshold=0)
comments = praw.helpers.flatten_tree(submission.comments)
for comment in comments:
new_comment = Comment(id=comment.id,
body=comment.body,
ups=comment.ups,
downs=comment.downs,
post_id=submission.id,
create_utc=comment.created_utc,  # use the comment's own timestamp rather than the parent submission's
college=school)
db.session.merge(new_comment)
db.session.commit()
num_comments += 1
except Exception as error:
print str(error)
return (num_posts, num_comments)
if __name__ == '__main__':
main()
| e8b2e19fc6f24895876ab91c9b6093181d615e12 | ["Markdown", "Python", "JavaScript"] | 15 | Markdown | gt-elc-lab/thrum | 6ef19f6b3e58107eeab78c14efcab159a396a8fa | 04ee497a7ad17d43dedacb95bb729b9e05bc2474 | refs/heads/master |
<repo_name>ShreedharaCA/clone-flipkart<file_sep>/.c9/metadata/workspace/README.md
{"changed":true,"filter":false,"title":"README.md","tooltip":"/README.md","value":" The Following page is a Clone Of FLIPKART by Using Twitter Bootstrap. The intention of making this page is only for LEARNING PURPOSE ALONE.","undoManager":{"mark":-1,"position":0,"stack":[[{"group":"doc","deltas":[{"start":{"row":0,"column":0},"end":{"row":30,"column":0},"action":"remove","lines":[""," ,-----.,--. ,--. ,---. ,--.,------. ,------."," ' .--./| | ,---. ,--.,--. ,-| || o \\ | || .-. \\ | .---'"," | | | || .-. || || |' .-. |`..' | | || | \\ :| `--, "," ' '--'\\| |' '-' '' '' '\\ `-' | .' / | || '--' /| `---."," `-----'`--' `---' `----' `---' `--' `--'`-------' `------'"," ----------------------------------------------------------------- ","","","Hi there! Welcome to Cloud9 IDE!","","To get you started, we have created a small hello world application.","","1) Open the hello-world.php file","","2) Follow the run instructions in the file's comments","","3) If you want to look at the Apache logs, check out ~/lib/apache2/log","","And that's all there is to it! Just have fun. Go ahead and edit the code, ","or add new files. It's all up to you! ","","Happy coding!","The Cloud9 IDE team","","","## Support & Documentation","","Visit http://docs.c9.io for support, or to learn more about using Cloud9 IDE. ","To watch some training videos, visit http://www.youtube.com/user/c9ide",""]},{"start":{"row":0,"column":0},"end":{"row":0,"column":140},"action":"insert","lines":[" The Following page is a Clone Of FLIPKART by Using Twitter Bootstrap. The intention of making this page is only for LEARNING PURPOSE ALONE."]}]}]]},"ace":{"folds":[],"scrolltop":0,"scrollleft":0,"selection":{"start":{"row":0,"column":140},"end":{"row":0,"column":140},"isBackwards":false},"options":{"guessTabSize":true,"useWrapMode":false,"wrapToView":true},"firstLineState":0},"timestamp":1426175184450}<file_sep>/README.md
# clone-flipkart
The following page is a clone of Flipkart built using Twitter Bootstrap. This page was made for learning purposes only.
<file_sep>/index.php
<?php include_once("clone-flipkart.html"); ?>
| 2dd5c4e4b5c13bc344fc24a28893e992510e89fe | ["Markdown", "PHP"] | 3 | Markdown | ShreedharaCA/clone-flipkart | 249163a3e1ea3644de3c6a2a8b8b2bd1c311fcc9 | 6f9b431dd34f1f1f9ff257731d53c62d733a3b52 | refs/heads/main |
<repo_name>simplex3510/Geometria_Fail<file_sep>/Assets/Scripts/Title/SureExit.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
public class SureExit : MonoBehaviour
{
float height= 50f;
RectTransform rectTransform;
void Start()
{
rectTransform = GetComponent<RectTransform>();
}
void Update()
{
height = Mathf.Lerp(height, 250f, 0.3f);
rectTransform.SetSizeWithCurrentAnchors(RectTransform.Axis.Vertical, height);
// Debug.Log(height);
}
private void OnDisable()
{
height = 50f;
rectTransform.SetSizeWithCurrentAnchors(RectTransform.Axis.Vertical, height);
// Debug.Log(height);
}
public void OnClickExit()
{
Application.Quit();
}
}
<file_sep>/Assets/Scripts/Title/Drop.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;
public class Drop : MonoBehaviour, IDropHandler
{
public void OnDrop(PointerEventData eventData)
{
// Debug.Log("OnDrop item :" + eventData.selectedObject.name);
if (eventData.selectedObject)
{
eventData.selectedObject.transform.SetParent(this.transform);
eventData.selectedObject.transform.localPosition = Vector3.zero;
eventData.selectedObject.transform.localScale = Vector3.one;
eventData.selectedObject.GetComponent<Image>().raycastTarget = true;
eventData.selectedObject = null;
}
}
}
<file_sep>/Assets/Scripts/Title/Drag.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;
public class Drag : MonoBehaviour, IBeginDragHandler, IDragHandler, IEndDragHandler
{
Transform startParent;
Image perkIcon;
private void Awake()
{
perkIcon = GetComponent<Image>();
#region Handle the perkIcon alpha value
if (perkIcon.sprite == null)
{
Color temp = new Color(0, 0, 0, 0);
perkIcon.color = temp;
// note: perkIcon.color.a = 0f; would not compile because Image.color returns a copy of a Color struct (error CS1612), so a whole new Color has to be assigned
}
#endregion
}
public void OnBeginDrag(PointerEventData eventData)
{
startParent = transform.parent;
transform.SetParent(GameObject.FindGameObjectWithTag("UI Canvas").transform);
startParent.localScale = Vector3.one;
GetComponent<Image>().raycastTarget = false;
eventData.selectedObject = this.gameObject;
}
public void OnDrag(PointerEventData eventData)
{
transform.position = eventData.position;
}
public void OnEndDrag(PointerEventData eventData)
{
if (eventData.selectedObject != null)
{
transform.SetParent(startParent);
transform.localPosition = Vector3.zero;
transform.localScale = Vector3.one;
GetComponent<Image>().raycastTarget = true;
}
}
}
<file_sep>/Assets/Scripts/ChangeCursor.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class ChangeCursor : MonoBehaviour
{
[SerializeField]
Texture2D cursorImage;
void Start()
{
Cursor.SetCursor(cursorImage, Vector2.zero, CursorMode.ForceSoftware);
}
}
<file_sep>/Assets/Scripts/Title/GameTestManager.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.SceneManagement;
public class GameTestManager : MonoBehaviour
{
private void Update()
{
if(Input.GetMouseButtonDown(1))
{
SceneManager.LoadScene("GameTitle");
}
}
}
<file_sep>/Assets/Scripts/Game/Enemy.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
enum Command
{
Up = 'w',
Down = 's',
Left = 'a',
Right = 'd',
}
public class Enemy : MonoBehaviour
{
public Transform playerTransform;
bool isBattle;
bool isBattleWin;
[SerializeField]
float speed = 3;
Transform enemyTransform;
void Start()
{
enemyTransform = GetComponent<Transform>();
}
void Update()
{
transform.position = Vector3.MoveTowards(enemyTransform.position, playerTransform.position, speed * Time.deltaTime);
if (isBattle)
{
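// the cast below works because Unity's KeyCode letter values match their lowercase ASCII codes,
// so (KeyCode)Command.Down ('s' == 115) is KeyCode.S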
if (Input.GetKeyDown((KeyCode)Command.Down))
{
Time.timeScale = 1;
GameObject.Destroy(this.gameObject);
}
}
}
private void OnCollisionEnter2D(Collision2D other)
{
if (other.gameObject.tag == "Player")
{
// Enter Battle Mode
isBattle = true;
Time.timeScale = 0;
}
}
}
<file_sep>/Assets/Scripts/Title/Title.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.SceneManagement;
public class Title : MonoBehaviour
{
public void OnClickChallenge()
{
SceneManager.LoadScene("GameMain");
}
public void OnClickPerkSetting()
{
SceneManager.LoadScene("GameTest");
}
}
<file_sep>/Assets/Scripts/Game/CameraManager.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class CameraManager : MonoBehaviour
{
bool isZoom = false;
bool isCharge = false;
float fullChargeTime = 1f;
float currentChargeTime;
float zoomIn = 10f;
float zoomOut = 13f;
float zoomPower = 0.1f;
Transform playerPosition;
Camera mainCamera;
private void Start()
{
isZoom = false;
mainCamera = Camera.main;
mainCamera.orthographicSize = 5f;
playerPosition = GetComponent<Transform>();
}
private void Update()
{
#region camera view
mainCamera.transform.position = new Vector3(playerPosition.position.x, playerPosition.position.y, -10f);
#endregion
#region charge effect
if (Input.GetMouseButton(0))
{
currentChargeTime += Time.deltaTime;
if (fullChargeTime <= currentChargeTime && isCharge == false)
{
mainCamera.orthographicSize = 12f;
isCharge = true;
}
}
if(isCharge)
{
CameraZoomEffect(zoomOut, 0.01f);
}
if(Input.GetMouseButtonDown(0))
{
currentChargeTime = 0f;
isCharge = false;
}
#endregion
if (isZoom)
{
CameraZoomEffect(zoomIn, zoomPower);
}
else
{
CameraZoomEffect(zoomOut, zoomPower);
}
}
void CameraZoomEffect(float _zoom, float _zoomSpeed)
{
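// Lerp with a constant t every frame eases orthographicSize toward _zoom exponentially (frame-rate dependent smoothing)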
mainCamera.orthographicSize = Mathf.Lerp(mainCamera.orthographicSize, _zoom, _zoomSpeed);
}
private void OnCollisionEnter2D(Collision2D other)
{
if (other.gameObject.tag == "Enemy")
{
isZoom = true;
}
}
private void OnCollisionExit2D(Collision2D other)
{
if (other.gameObject.tag == "Enemy")
{
isZoom = false;
}
}
}<file_sep>/Assets/Scripts/Game/Player.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
public class Player : MonoBehaviour
{
public ParticleSystem chargingEffect;
public ParticleSystem chargedEffect;
public float speed;
public float offsetSpeed;
bool isCharged;
bool isCharging;
float currentChargeTime;
float fullChargeTime = 1f;
Camera m_camera;
Rigidbody2D m_rigidbody2D;
SpriteRenderer spriteRenderer;
DrawArrow drawArrow;
Vector3 startPosition;
Vector3 movePosition;
Vector3 currentPosition;
Vector3 direction;
Vector3 startPoint;
Vector3 currentPoint;
Vector3 endPoint;
void Start()
{
m_camera = Camera.main;
m_rigidbody2D = GetComponent<Rigidbody2D>();
spriteRenderer = GetComponent<SpriteRenderer>();
drawArrow = GetComponentInChildren<DrawArrow>();
}
void Update()
{
if (Input.GetMouseButtonDown(0))
{
isCharging = true;
isCharged = false;
// startPoint = m_camera.ScreenToWorldPoint(Input.mousePosition);
// startPoint.z = 0f;
startPosition = transform.position;
chargingEffect.Play();
}
if (Input.GetMouseButton(0))
{
currentPoint = m_camera.ScreenToWorldPoint(Input.mousePosition);
currentPoint.z = 0f;
Debug.Log(currentPoint);
#region Draw line on
drawArrow.RenderLine(currentPoint * -1, currentPoint);
#endregion
#region Effect and sprite loading
currentChargeTime += Time.deltaTime;
if (fullChargeTime - 0.15f <= currentChargeTime)
{
chargingEffect.Stop();
}
if (fullChargeTime <= currentChargeTime)
{
// run this only once
if(isCharged)
{
return;
}
isCharged = true;
chargedEffect.Play();
spriteRenderer.sprite = Resources.Load<Sprite>("Sprites/Player_Charged");
}
#endregion
}
if (Input.GetMouseButtonUp(0))
{
isCharging = false;
isCharged = false;
// endPoint = m_camera.ScreenToWorldPoint(Input.mousePosition);
startPoint = currentPoint * -1;
endPoint = currentPoint;
// endPoint.z = 0f;
#region Draw line off
drawArrow.EndLine();
#endregion
// set the direction and clamp the travel distance
direction = new Vector2(Mathf.Clamp(startPoint.x - endPoint.x, -10f, 10f),
Mathf.Clamp(startPoint.y - endPoint.y, -10f, 10f));
movePosition = direction; // the distance that still has to be travelled
direction = direction.normalized; // convert to a unit direction vector
#region Movement and sprite loading
if (fullChargeTime <= currentChargeTime)
{
spriteRenderer.sprite = Resources.Load<Sprite>("Sprites/Player");
m_rigidbody2D.velocity = direction * speed * offsetSpeed;
currentChargeTime = 0f;
}
else
{
chargingEffect.Stop();
currentChargeTime = 0f;
}
#endregion
}
currentPosition = transform.position;
        // compare the required travel distance with the distance actually moved
if(movePosition.magnitude <= (currentPosition - startPosition).magnitude)
{
            // stop once the required distance has been covered
m_rigidbody2D.velocity = new Vector2(0, 0);
}
}
}<file_sep>/Assets/Scripts/Game/DrawArrow.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class DrawArrow : MonoBehaviour
{
float minOffset = -5f;
float maxOffset = 5f;
LineRenderer lineRenderer;
private void Awake()
{
lineRenderer = GetComponent<LineRenderer>();
}
public void RenderLine(Vector3 _startPoint, Vector3 _endPoint)
{
lineRenderer.positionCount = 2;
Vector3[] points = new Vector3[2];
_startPoint = new Vector3(Mathf.Clamp(_startPoint.x, minOffset, maxOffset),
Mathf.Clamp(_startPoint.y, minOffset, maxOffset),
0f);
_endPoint = new Vector3(Mathf.Clamp(_endPoint.x, minOffset, maxOffset),
Mathf.Clamp(_endPoint.y, minOffset, maxOffset),
0f);
points[0] = _startPoint;
points[1] = _endPoint;
lineRenderer.SetPositions(points);
}
public void EndLine()
{
lineRenderer.positionCount = 0;
}
}
|
eb86c37fded5f0d5bad2d8714e147c67d3b5eef6
|
[
"C#"
] | 10 |
C#
|
simplex3510/Geometria_Fail
|
7ebe809a5cecd1770633fecd4c8b8f0c5481cc16
|
42433d5247cf8fa9e524fa4bdae16ae0c12425bc
|
refs/heads/master
|
<repo_name>dmkanter/PIR-burgler-alarm<file_sep>/README.md
# PIR-burgler-alarm
Created this for my kid so we get alerted should someone dare enter his room.
To use it, you need the RPi.GPIO Python library installed.
You also need a Prowl API key.
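For reference, here is a minimal sketch of the same Prowl call made directly from Python instead of shelling out to `curl`. It assumes the third-party `requests` package is installed, and the `notify()` helper name is just illustrative; the endpoint and parameters mirror the `curl` command in `pir-nokey.py`:

```python
# Minimal sketch (assumes `pip install requests`): POST to Prowl's public API,
# mirroring the curl call in pir-nokey.py.
import requests

def notify(api_key, application="RaspberryPi", event="someone-entered-office"):
    resp = requests.post(
        "https://api.prowlapp.com/publicapi/add",
        data={"application": application, "event": event, "apikey": api_key},
    )
    resp.raise_for_status()  # raise if Prowl rejects the request
```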
<file_sep>/pir-nokey.py
#!/usr/bin/python
import RPi.GPIO as GPIO
import time
import os
GPIO.setmode(GPIO.BCM)
# set up the GPIO channels - one input and one output
GPIO.setup(12, GPIO.IN)
# print (GPIO.VERSION)
while (1):
# input from pin 12
input_value = GPIO.input(12)
print(input_value)
if (input_value) :
os.system("curl -X POST \"https://api.prowlapp.com/publicapi/add?application=RaspberryPio&event=someone-entered-office&apikey=YOUR_PROWL_KEY_HERE\"")
time.sleep(60)
else :
# do nothing
time.sleep(2)
|
bdc331ab3a6b8bdca09fdf5dae5aad253997a9fe
|
[
"Markdown",
"Python"
] | 2 |
Markdown
|
dmkanter/PIR-burgler-alarm
|
bf81b732fb43f5f0a42f03cfe246c59227a0a295
|
8b3b3f747453e90b3728f96ec4bbbcbb217dc5bb
|
refs/heads/master
|
<file_sep>from django.shortcuts import render , redirect
from django.contrib import messages
from .models import Show
# Create your views here.
def index(request):
context = {
'shows': Show.objects.all()
}
return render(request, 'shows.html', context)
def new(request):
return render(request, 'new.html')
def create(request):
# CREATE THE SHOW
errors = Show.objects.validate(request.POST)
if errors:
for (key, value) in errors.items():
messages.error(request, value)
return redirect('/shows/new')
Show.objects.create(
title = request.POST['title'],
network = request.POST['network'],
release_date = request.POST['release_date'],
description = request.POST['description']
)
return redirect('/shows')
def edit(request, show_id):
one_show = Show.objects.get(id=show_id)
context = {
'show': one_show
}
return render(request, 'edit.html', context)
def update(request, show_id):
# update show!
to_update = Show.objects.get(id=show_id)
# updates each field
to_update.title = request.POST['title']
to_update.release_date = request.POST['release_date']
to_update.network = request.POST['network']
to_update.description = request.POST['description']
to_update.save()
return redirect('/shows/')
def show(request, show_id):
# query for one show with show_id
one_show = Show.objects.get(id=show_id)
context = {
'show': one_show
}
return render(request, 'show.html', context)
def delete(request, show_id):
# NOTE: Delete one show!
to_delete = Show.objects.get(id=show_id)
to_delete.delete()
return redirect('/shows')
# Create your views here.
<file_sep>from django.shortcuts import render, HttpResponse, redirect
# Create your views here.
def index(request):
return HttpResponse("placeholder to later display list of blogs")
def new(request):
return HttpResponse("Placedholder to display a new form to creat a new blog")
def create(request):
return redirect('/')
def show(request, number):
return HttpResponse(f"Placeholder to display blog number {number}.")
def edit(request, number):
return HttpResponse(f"Placeholder to edit blog {number}.")
def destroy(request, number):
return redirect('/')
def djangoOne(request):
return render(request, "index.html")
# Create your views here<file_sep>from django.urls import path
from . import views
urlpatterns = [
path('', views.index),
path('chickens/create', views.create_chicken),
path('chickens/<int:chicken_id>', views.show_chicken),
path('chickens/<int:chicken_id>/destroy', views.delete_chicken),
path('chickens/<int:chicken_id>/edit', views.edit_chicken),
path('chickens/<int:chicken_id>/update',views.update_chicken),
]<file_sep>from django.db import models
# Create your models here.
class ChickenManager(models.Manager):
def create_validator(self, reqPOST):
errors = {}
chickens_with_name = Chicken.objects.filter(name=reqPOST['chicken_name'])
if len(chickens_with_name) >= 1:
errors['unique'] = "Name already taken"
if len(reqPOST['chicken_name']) < 3:
errors['name'] = "Name is too short, use at least 3 characters"
if len(reqPOST['color']) < 3:
errors['color'] = "Color is too short, use at least 3 characters"
return errors
def edit_validator(self, reqPOST, chicken_id):
errors = {}
chickens_with_name = Chicken.objects.filter(name=reqPOST['chicken_name'])
if len(chickens_with_name) >= 1:
if chicken_id != chickens_with_name[0].id:
errors['unique'] = "Name already taken"
if len(reqPOST['chicken_name']) < 3:
errors['name'] = "Name is too short, use at least 3 characters"
if len(reqPOST['color']) < 3:
errors['color'] = "Color is too short, use at least 3 characters"
return errors
class Chicken(models.Model):
name = models.TextField()
color = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
objects = ChickenManager()
<file_sep>from django.shortcuts import render, redirect
from .models import Dojo, Ninja
def index(request):
context = {
"dojos" : Dojo.objects.all()
}
return render(request, 'index.html', context)
def create_dojo(request):
Dojo.objects.create(
name=request.POST['name'],
city=request.POST['city'],
state=request.POST['state'],
)
return redirect('/')
def create_ninja(request):
Ninja.objects.create(
first_name=request.POST['first_name'],
last_name=request.POST['last_name'],
dojo_id=request.POST['dojo'],
)
return redirect('/')
# def delete(request):
# objects = Db_test.objects.all()
# if request.method == "POST":
# # Fetch list of items to delete, by ID
# items_to_delete = request.POST.getlist('delete_items')
# # Delete those items all in one go
# Db_test.objects.filter(pk__in=items_to_delete).delete()
# return render(request, "models_test/delete.html", {"values": objects})
# Create your views here.
<file_sep>from django.apps import AppConfig
class PythonreviewappConfig(AppConfig):
name = 'pythonReviewApp'
<file_sep>from django.shortcuts import render, redirect
LANGS = (
'Python',
'MERN',
'Java',
)
LOCATIONS = (
'San Jose',
'Bellevue',
'Oakland',
)
# Create your views here.
def index(request):
context = {
'locations': LOCATIONS,
'languages': LANGS
}
return render(request, 'form.html', context)
def survey(request):
if request.method == 'GET':
return redirect('/')
request.session['result'] = {
'name': request.POST['name'],
'location': request.POST['location'],
'language': request.POST['language'],
'comment': request.POST['comment']
}
return redirect('/result')
def result(request):
context = {
'result': request.session['result']
}
return render(request, 'result.html', context)
# Create your views here.
<file_sep># my_environments
All of my assignments and Homework for Python Stack
<file_sep>from django.urls import path
from . import views
urlpatterns = [
path('', views.index),
path("users/create",views.create_user),
path("main_page",views.main_page),
path("users/login",views.login),
path("logout",views.logout),
path('giraffes/create', views.create_giraffe),
]<file_sep>from django.db import models
# Create your models here.
class CourseManager(models.Manager):
def create_validator(self, reqPOST):
errors = {}
if len(reqPOST['course_name']) < 6:
errors['name'] = "Course name is too short"
if len(reqPOST['description']) < 16:
errors['desc'] = "Description is too short"
return errors
class Course(models.Model):
name = models.TextField()
description = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
objects = CourseManager()<file_sep>from django.shortcuts import render, redirect
def index(request):
return render(request, 'index.html')
def submission(request):
if request.method == 'POST':
context = {
'name': request.POST['name'],
            'lang': request.POST['language'],
            'loc': request.POST['location'],
'gen': request.POST['gender'],
'cmmt': request.POST['comment']
}
return render(request, 'results.html', context)
return render(request, 'results.html')
# Create your views here.
<file_sep>from django.apps import AppConfig
class RandappConfig(AppConfig):
name = 'randApp'
<file_sep>class User:
def __init__(self, name, email):
self.name = name
self.email = email
self.account_balance = 0
def make_withdrawal(self, amount):
self.account_balance -= amount
return self
def make_deposit(self, amount):
self.account_balance += amount
return self
def display_user_balance(self):
print(f"User:{self.name}, Balance:${self.account_balance}")
return self
def transfer_money(self, other_user, amount):
other_user.account_balance += amount
self.account_balance -= amount
troy = User("Troy", "<EMAIL>")
mike = User("Mike", "<EMAIL>")
kevin = User("Kevin", "<EMAIL>")
troy.make_deposit(100).make_deposit(200).make_deposit(
300).make_withdrawal(200).display_user_balance()
mike.make_deposit(200).make_deposit(400).make_withdrawal(
300).make_withdrawal(100).display_user_balance()
kevin.make_deposit(5000).make_withdrawal(1000).make_withdrawal(
500).make_withdrawal(500).display_user_balance()
kevin.transfer_money(mike, 1000)
kevin.display_user_balance()
mike.display_user_balance()
<file_sep>from django.shortcuts import render, redirect
import random
from datetime import datetime
# helper dictionary, for easy access to min/max gold values
GOLD_MAP = {
"farm": (10,20),
"cave": (5,10),
"house": (2,5),
"casino": (0,50)
}
# Create your views here.
def index(request):
# check if either 'gold' or 'activities' keys are not in session (yet)
if not "gold" in request.session or "activities" not in request.session:
# set these to initial values if that is the case!
request.session['gold'] = 0
request.session['activities'] = []
return render(request, 'index.html')
def reset(request):
request.session.clear()
return redirect('/')
def process_gold(request):
if request.method == 'GET':
return redirect('/')
building_name = request.POST['building']
# access the correct mix/max values from the user's form submission
building = GOLD_MAP[building_name]
# upper case string (for message)
building_name_upper = building_name[0].upper() + building_name[1:]
# calculate the correct random number for this building
curr_gold = random.randint(building[0], building[1])
# generate a datetime string, with the proper format, for RIGHT NOW
now_formatted = datetime.now().strftime("%m/%d/%Y %I:%M%p")
# for formatting message color! (this will correspond to a css class)
result = 'earn'
message = f"Earned {curr_gold} from the {building_name_upper}! ({now_formatted})"
# check if we need to do casino stuff
if building_name == 'casino':
# if so, see if we lost money
if random.randint(0,1) > 0: # 50% chance of being True/False
# if we lost money, we need a different message!
message = f"Entered a {building_name_upper} and lost {curr_gold} golds... Ouch... ({now_formatted})"
# we also need to convert our turn's gold amount to a negative number
curr_gold = curr_gold * -1
result = 'lose'
# update session gold value
request.session['gold'] += curr_gold
# update session activities with new message
# NOTE: each 'activity' is a dictionary, with the message as well as the 'result' for css purposes
request.session['activities'].append({"message": message, "result": result})
return redirect('/')<file_sep>from django.shortcuts import render, redirect
from .models import Order, Product
from django.db.models import Sum
def index(request):
context = {
"all_products": Product.objects.all()
}
return render(request, "index.html", context)
def checkout(request):
last = Order.objects.last()
price=last.total_price
full_order = Order.objects.aggregate(Sum('quantity_ordered'))['quantity_ordered__sum']
full_price = Order.objects.aggregate(Sum('total_price'))['total_price__sum']
context = {
'orders':full_order,
'total':full_price,
'bill':price,
}
return render(request, "checkout.html",context)
def purchase(request):
if request.method == 'POST':
this_product = Product.objects.filter(id=request.POST["id"])
if not this_product:
return redirect('/')
else:
quantity = int(request.POST["quantity"])
total_charge = quantity*(float(this_product[0].price))
Order.objects.create(quantity_ordered=quantity, total_price=total_charge)
return redirect('/checkout')
else:
return redirect('/')
<file_sep>from django.apps import AppConfig
class MarcrudappConfig(AppConfig):
name = 'marCRUDApp'
<file_sep>from django.shortcuts import render
# Create your views here.
from django.shortcuts import render, redirect
from .models import *
from django.contrib import messages
# Create your views here.
def index(request):
context = {
"all_courses": Course.objects.all()
}
return render(request, "index.html", context)
def create_course(request):
if request.method == "POST":
errors = Course.objects.create_validator(request.POST)
if len(errors) > 0:
for key, value in errors.items():
messages.error(request, value)
else:
course = Course.objects.create(name=request.POST['course_name'], description=request.POST['description'])
return redirect('/')
def destroy_course(request, course_id):
context = {
'one_course': Course.objects.get(id=course_id)
}
return render(request, "delete_page.html", context)
def delete_course(request, course_id):
if request.method == "POST":
course_to_delete = Course.objects.get(id=course_id)
course_to_delete.delete()
return redirect('/')<file_sep>from django.shortcuts import render, redirect
from .models import *
from django.contrib import messages
def index(request):
context = {
"all_books" : Book.objects.all()
}
return render(request, "index.html", context)
# Create your views here.
<file_sep>from django.urls import path
from . import views
urlpatterns = [
path('', views.index),
path('courses/create', views.create_course),
path('courses/destroy/<int:course_id>', views.destroy_course),
path('courses/delete/<int:course_id>', views.delete_course),
]<file_sep>from django.shortcuts import render, redirect
import bcrypt
from django.contrib import messages
from .models import *
def index(request):
return render(request, "index.html",)
def create_user(request):
if request.method == "POST":
        errors = User.objects.create_validator(request.POST)
if len(errors) >0:
for key, value in errors.items():
messages.error(request, value)
return redirect('/')
else:
password = request.POST['password']
pw_hash = bcrypt.hashpw(password.encode(), bcrypt.gensalt()).decode()
user = User.objects.create(name=request.POST['user_name'], email=request.POST['email'], password=pw_hash)
request.session['user_id'] = user.id
return redirect('/main_page')
return redirect('/')
def main_page(request):
if 'user_id' not in request.session:
return redirect('/')
context ={
'current_user': User.objects.get(id=request.session['user_id']),
'all_giraffes': Giraffe.objects.all()
}
return render(request, "main_page.html", context)
def login(request):
if request.method =="POST":
users_with_email = User.objects.filter(email=request.POST['email'])
if users_with_email:
user = users_with_email[0]
if bcrypt.checkpw(request.POST['password'].encode(), user.password.encode()):
request.session['user_id'] = user.id
return redirect('/main_page')
messages.error(request, "Email or password are not right")
return redirect('/')
def logout(request):
request.session.flush()
return redirect('/')
def create_giraffe(request):
if 'user_id' not in request.session:
return redirect('/')
if request.method == "POST":
        errors = Giraffe.objects.create_validator(request.POST)
if len(errors) >0:
for key, value in errors.items():
messages.error(request, value)
return redirect('/')
else:
giraffe = Giraffe.objects.create(name=request.POST['giraffe_name'], catchphrase=request.POST['catchphrase']
, owner=User.objects.get(id=request.session['user_id']))
return redirect('/main_page')
return redirect('/main_page')
# Create your views here.
<file_sep><!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Ninja Gold</title>
{% load static %}
<link rel="stylesheet" href="{% static 'style.css' %}" media="screen" title="no title" charset="utf-8">
</head>
<body>
<header>
<h1>Ninja Gold</h1>
<a href="/reset"><button>Reset</button></a>
</header>
<h3>Your Gold: <input type="text" disabled value="{{request.session.gold}}"></h3>
<div id="buildings">
<div>
<h1>Farm</h1>
<p>(earns 10-20 golds)</p>
<form action="/gold" method="post">
{% csrf_token %}
<input type="hidden" name="building" value="farm">
<button>Find Gold!</button>
</form>
</div>
<div>
<h1>Cave</h1>
<p>(earns 5-10 golds)</p>
<form action="/gold" method="post">
{% csrf_token %}
<input type="hidden" name="building" value="cave">
<button>Find Gold!</button>
</form>
</div>
<div>
<h1>House</h1>
<p>(earns 2-5 golds)</p>
<form action="/gold" method="post">
{% csrf_token %}
<input type="hidden" name="building" value="house">
<button>Find Gold!</button>
</form>
</div>
<div>
<h1>Casino</h1>
<p>(earns/takes 0-50 golds)</p>
<form action="/gold" method="post">
{% csrf_token %}
<input type="hidden" name="building" value="casino">
<button>Find Gold!</button>
</form>
</div>
</div>
<h2>Activities</h2>
<div id="activities">
{% for activity in request.session.activities reversed %}
<p class="{{ activity.result}}">{{ activity.message }}</p>
{% endfor %}
</div>
</body>
</html><file_sep>from django.db import models
import re
# Create your models here.
class UserManager(models.Manager):
    def create_validator(self, reqPOST):
errors = {}
if len(reqPOST['user_name']) < 3:
errors['user_name'] = "Name is too short"
if len(reqPOST['email']) < 6:
errors['email'] = "Email is too short"
if len(reqPOST['password']) < 8:
errors['email'] = "Password is too short"
if reqPOST['password'] != reqPOST['password_conf']:
errors['match'] = "Password and password confirmation dont match"
EMAIL_REGEX = re.compile(r'^[a-zA-Z0-9.+_-]+@[a-zA-Z0-9._-]+\.[a-zA-Z]+$')
        if not EMAIL_REGEX.match(reqPOST['email']):
errors['regex'] = ("Email in wrong format")
users_with_email = User.objects.filter(email=reqPOST['email'])
if len(users_with_email) >= 1:
errors['dup'] = "Email taken, use another"
return errors
class User(models.Model):
name = models.TextField()
email = models.TextField()
password = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
objects = UserManager()
class GiraffeManager(models.Manager):
    def create_validator(self, reqPOST):
        errors = {}
        if len(reqPOST['giraffe_name']) < 3:
            errors['giraffe_name'] = "Name is too short"
        if len(reqPOST['catchphrase']) < 6:
            errors['catchphrase'] = "Catchphrase is too short"
        return errors
class Giraffe(models.Model):
name = models.TextField()
catchphrase = models.TextField()
owner = models.ForeignKey(User, related_name="giraffes_owned", on_delete=models.CASCADE)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
objects = GiraffeManager()<file_sep>from django.shortcuts import render, redirect
from .models import *
from django.contrib import messages
def index(request):
context = {
"all_chickens" : Chicken.objects.all()
}
return render(request, "index.html", context)
def create_chicken(request):
if request.method == "POST":
errors = Chicken.objects.create_validator(request.POST)
if len(errors) > 0:
for key, value in errors.items():
messages.error(request, value)
return redirect('/')
else:
chicken = Chicken.objects.create(name=request.POST['chicken_name'], color= request.POST['color'])
return redirect('/')
def show_chicken(request, chicken_id):
context = {
'one_chicken': Chicken.objects.get(id=chicken_id)
}
return render(request, "one_chicken.html", context)
def delete_chicken(request, chicken_id):
if request.method == "POST":
chicken_to_delete = Chicken.objects.get(id=chicken_id)
chicken_to_delete.delete()
return redirect('/')
def edit_chicken(request, chicken_id):
context = {
'one_chicken': Chicken.objects.get(id=chicken_id)
}
return render(request, "edit_chicken.html", context)
def update_chicken(request, chicken_id):
if request.method == "POST":
errors = Chicken.objects.edit_validator(request.POST, chicken_id)
if len(errors) > 0:
for key, value in errors.items():
messages.error(request, value)
return redirect(f'/chickens/{chicken_id}/edit')
else:
chicken = Chicken.objects.get(id=chicken_id)
chicken.name = request.POST['chicken_name']
chicken.color = request.POST['color']
chicken.save()
return redirect(f'/chickens/{chicken_id}')
return redirect('/')
# Create your views here.
<file_sep>from django.urls import path
from . import views
urlpatterns =[
path('', views.index),
path('ninjas/create', views.create_ninja),
path('dojos/create', views.create_dojo),
]
|
4b8e025931db7ed234b2f4a9aaf34bdc1434346e
|
[
"Markdown",
"Python",
"HTML"
] | 24 |
Python
|
hayesspencer/python
|
14daad63b452edaa6ad14a3ca1cb3b2a453532bb
|
93e23866624be1c5fcea0650252313e9574555fd
|
refs/heads/master
|
<repo_name>cooper-lyt/participant<file_sep>/src/main/java/cc/coopersoft/common/EnumHelper.java
package cc.coopersoft.common;
import cc.coopersoft.common.util.DefaultMessageBundle;
import javax.inject.Inject;
import javax.inject.Named;
import java.util.MissingResourceException;
import java.util.ResourceBundle;
/**
* Created by cooper on 6/16/16.
*/
@Named
public class EnumHelper implements java.io.Serializable{
@Inject @DefaultMessageBundle
private ResourceBundle bundle;
public String getLabel(Enum value){
if (value == null){
return "";
}
try {
return bundle.getString(value.getClass().getName() + "." + value.name());
}catch (MissingResourceException e){
return value.getClass().getName() + "." + value.name();
}
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/controller/LocalContractConfig.java
package cc.coopersoft.house.participant.controller;
import cc.coopersoft.house.participant.contract.ContractPath;
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Any;
import javax.enterprise.inject.Default;
import javax.enterprise.inject.Instance;
import javax.inject.Inject;
/**
* Created by cooper on 23/10/2017.
*/
@ApplicationScoped
public class LocalContractConfig implements java.io.Serializable{
private static final String PATH_CONFIG_NAME = "CONTRACT_LOCATION";
private ContractPath contractPath;
public LocalContractConfig() {
}
@Inject
public LocalContractConfig(@Default RunParam runParam , @Any Instance<ContractPath> contractPath) {
for(ContractPath cp: contractPath){
if( runParam.getStringParam(PATH_CONFIG_NAME).equals(cp.getConfigName())){
this.contractPath = cp;
break;
}
}
}
public ContractPath getConfig() {
return contractPath;
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/ParticipantAuthenticator.java
package cc.coopersoft.house.participant;
import cc.coopersoft.comm.exception.HttpApiServerException;
import cc.coopersoft.house.participant.controller.RunParam;
import cc.coopersoft.house.sale.HouseSellService;
import cc.coopersoft.house.sale.data.Developer;
import cc.coopersoft.house.sale.data.Seller;
import com.dgsoft.developersale.LogonStatus;
import org.picketlink.annotations.PicketLink;
import org.picketlink.authentication.BaseAuthenticator;
import org.picketlink.credential.DefaultLoginCredentials;
import org.picketlink.idm.IdentityManager;
import org.picketlink.idm.RelationshipManager;
import org.picketlink.idm.model.basic.BasicModel;
import org.picketlink.idm.model.basic.User;
import javax.inject.Inject;
import java.util.logging.Level;
import java.util.logging.Logger;
/**
* Created by cooper on 19/02/2017.
*/
@PicketLink
public class ParticipantAuthenticator extends BaseAuthenticator{
@Inject
private DefaultLoginCredentials credentials;
@Inject
private RunParam runParam;
@Inject
private RelationshipManager relationshipManager;
@Inject
private Logger logger;
@Inject
private IdentityManager identityManager;
@Inject
private AttrUser attrUser;
public void authenticate() {
try {
attrUser.setLoginData(HouseSellService.login(runParam.getStringParam("nginx_address"),credentials.getUserId(),credentials.getPassword(),attrUser.getRndData()));
if (LogonStatus.LOGON.equals(attrUser.getLogonStatus())){
User user = BasicModel.getUser(identityManager,attrUser.getLoginData().getAttrEmp().getId());
if (user == null){
user = new User(attrUser.getLoginData().getAttrEmp().getId());
user.setFirstName(attrUser.getLoginData().getAttrEmp().getName());
identityManager.add(user);
}
attrUser.setKeyId(credentials.getUserId());
if (attrUser.getLoginData().getCorpInfo() instanceof Seller){
BasicModel.grantRole(relationshipManager,user,BasicModel.getRole(identityManager,"seller"));
}else if (attrUser.getLoginData().getCorpInfo() instanceof Developer){
BasicModel.grantRole(relationshipManager,user,BasicModel.getRole(identityManager,"developer"));
}else{
throw new IllegalArgumentException("unknow logon type:" + attrUser.getLoginData().getCorpInfo().getClass());
}
setAccount(user);
setStatus(AuthenticationStatus.SUCCESS);
return;
}
} catch (HttpApiServerException e) {
logger.log(Level.WARNING,e.getMessage(),e);
}
setStatus(AuthenticationStatus.FAILURE);
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/pages/Seller.java
package cc.coopersoft.house.participant.pages;
import org.apache.deltaspike.core.api.config.view.ViewConfig;
import org.apache.deltaspike.core.api.config.view.navigation.NavigationParameter;
/**
* Created by cooper on 26/02/2017.
*/
public interface Seller {
class Home implements ViewConfig {}
class HouseSourceView implements ViewConfig{}
class HouseList implements ViewConfig{}
interface Contract extends ViewConfig{
class SubmitContract implements Contract{}
interface Dg extends Contract{
class OldEdit implements Dg {}
}
interface Fc extends Contract{
class OldEdit implements Fc{}
}
interface Xf extends Contract{
class OldEdit implements Xf {}
}
}
interface Apply extends ViewConfig{
class HouseValid implements Apply {}
class HouseSellInfo implements Apply {}
class HouseShowInfo implements Apply{}
class HouseSalePicUpload implements Apply{}
class HouseSourceCreate implements Apply{}
class HouseSourceContract implements Apply{}
class HouseSourceSubmit implements Apply{}
class HouseSourceCommitted implements Apply{}
class ContractBaseInfo implements Apply{}
class ContractBuyerInfo implements Apply{}
class ContractSellerInfo implements Apply{}
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/contract/ContractPathDG.java
package cc.coopersoft.house.participant.contract;
import cc.coopersoft.house.participant.data.ContractContextMap;
import cc.coopersoft.house.participant.pages.Seller;
import com.dgsoft.house.SaleType;
import org.apache.deltaspike.core.api.config.view.ViewConfig;
import javax.enterprise.inject.Any;
import java.io.OutputStream;
/**
* Created by cooper on 19/10/2017.
*/
@Any
public class ContractPathDG implements ContractPath {
public String getConfigName() {
return "donggang";
}
public Class<? extends ViewConfig> getEditPath(SaleType saleType) {
return Seller.Contract.Dg.OldEdit.class;
}
public void pdf(ContractContextMap contractContextMap, OutputStream outputStream) {
ContractPdfDG1.pdf(contractContextMap,outputStream);
}
public void agentPdf(ContractContextMap contractContextMap, OutputStream outputStream) {
}
public void seePdf(ContractContextMap contractContextMap, OutputStream outputStream) {
}
public Class<? extends ViewConfig> getAgentEditPath() {
return null;
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/contract/ContractPath.java
package cc.coopersoft.house.participant.contract;
import cc.coopersoft.house.participant.data.ContractContextMap;
import com.dgsoft.house.SaleType;
import org.apache.deltaspike.core.api.config.view.ViewConfig;
import java.io.OutputStream;
/**
* Created by cooper on 19/10/2017.
*/
public interface ContractPath {
String getConfigName();
Class<? extends ViewConfig> getEditPath(SaleType saleType);
void pdf(ContractContextMap contractContextMap, OutputStream outputStream);
void agentPdf(ContractContextMap contractContextMap, OutputStream outputStream);
void seePdf(ContractContextMap contractContextMap, OutputStream outputStream);
Class<? extends ViewConfig> getAgentEditPath();
}
<file_sep>/src/main/java/cc/coopersoft/common/util/EntityHelper.java
package cc.coopersoft.common.util;
import javax.persistence.Entity;
import javax.persistence.Id;
import java.lang.reflect.Field;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
/**
* Created by cooper on 6/19/16.
*/
public class EntityHelper {
//TODO define in xml
/**
* Get the bean class from a container-generated proxy
* class
*
*/
public static Class getEntityClass(Class clazz)
{
while (clazz != null && !Object.class.equals(clazz))
{
if (clazz.isAnnotationPresent(Entity.class))
{
return clazz;
}
else
{
clazz = clazz.getSuperclass();
}
}
return null;
}
public static Object getEntityId(Object entity){
Class clazz = getEntityClass(entity.getClass());
while (clazz != null && !Object.class.equals(clazz))
{
if (clazz.isAnnotationPresent(Entity.class))
{
for(Method method: clazz.getDeclaredMethods()){
if (method.isAnnotationPresent(Id.class)){
try {
return method.invoke(entity);
} catch (IllegalAccessException e) {
throw new IllegalArgumentException(e);
} catch (InvocationTargetException e) {
throw new IllegalArgumentException(e);
}
}
}
for(Field field: clazz.getDeclaredFields()){
if (field.isAnnotationPresent(Id.class)){
try {
return field.get(entity);
} catch (IllegalAccessException e) {
throw new IllegalArgumentException(e);
}
}
}
}
else
{
clazz = clazz.getSuperclass();
}
}
throw new IllegalArgumentException("not have define Annotation Id");
}
}
<file_sep>/src/main/java/cc/coopersoft/common/ChoiceList.java
package cc.coopersoft.common;
import java.util.List;
/**
* Created by cooper on 7/6/16.
*/
public abstract class ChoiceList<E> implements java.io.Serializable {
private List<BatchData<E>> dataList;
protected abstract List<E> getOrgData();
protected void initDatas(){
if (dataList == null){
dataList = BatchData.fillData(getOrgData());
}
}
public void refresh(){
dataList = null;
}
public List<BatchData<E>> getDataList() {
initDatas();
return dataList;
}
public boolean isCheckAll(){
for(BatchData<E> data: getDataList()){
if (!data.isSelected()){
return false;
}
}
return true;
}
public void setCheckAll(boolean checked){
for(BatchData<E> data: getDataList()){
data.setSelected(checked);
}
}
public boolean isAnySelected(){
for(BatchData<E> data: getDataList()){
if (data.isSelected()){
return true;
}
}
return false;
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/controller/PersonCardReader.java
package cc.coopersoft.house.participant.controller;
import javax.ws.rs.*;
@Path("/")
public class PersonCardReader {
@POST
@Path("/person")
@Produces({ "application/json" })
public String personCardRead(@PathParam("key") String key , @FormParam("cer") String cer){
return null;
}
}
<file_sep>/src/main/java/cc/coopersoft/common/MutableController.java
package cc.coopersoft.common;
import cc.coopersoft.common.util.DataHelper;
/**
* Created by cooper on 6/10/16.
*/
public abstract class MutableController implements java.io.Serializable{
private transient boolean dirty;
public boolean clearDirty()
{
boolean result = dirty;
dirty = false;
return result;
}
/**
* Set the dirty flag if the value has changed.
* Call whenever a subclass attribute is updated.
*
* @param oldValue the old value of an attribute
* @param newValue the new value of an attribute
* @return true if the newValue is not equal to the oldValue
*/
protected <U> boolean setDirty(U oldValue, U newValue)
{
boolean attributeDirty = DataHelper.isDirty(oldValue,newValue);
dirty = dirty || attributeDirty;
return attributeDirty;
}
/**
* Set the dirty flag.
*/
protected void setDirty()
{
dirty = true;
}
}
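// Illustrative usage sketch (not part of the original source): a hypothetical
// subclass would route each setter through setDirty(...) so that a caller can
// later use clearDirty() to find out whether anything actually changed.
//
//   public class NameEditor extends MutableController {
//       private String name;
//
//       public void setName(String name) {
//           setDirty(this.name, name); // flags dirty only when the value changes
//           this.name = name;
//       }
//   }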
<file_sep>/src/main/resources/META-INF/resources/select-edit.js
$(document).ready(
function(){
$('.select-edit-input').on("focus",
function (){
$(this).dropdown();
}
)
$('.select-edit-group').on('hide.bs.dropdown', function () {
if ( $(this).children("input").is(':focus')){
return false;
}
})
$('.select-edit-group>.dropdown-menu>li>a').on("click",
function (e){
$(this).parents(".select-edit-group").children("input").val($(this).text());
e.preventDefault();
}
)
}
)<file_sep>/src/main/java/cc/coopersoft/house/participant/controller/OldHouseMoneyProtectedInfo.java
package cc.coopersoft.house.participant.controller;
import cc.coopersoft.house.sale.data.MoneyManager;
import cc.coopersoft.house.sale.data.MoneyProtectedBank;
import cc.coopersoft.house.sale.data.OldHouseMoney;
import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;
import javax.inject.Named;
@Named
@RequestScoped
public class OldHouseMoneyProtectedInfo {
@Inject
private ContractHome contractHome;
@Inject
private ServerWord serverWord;
public boolean isProtectedMoney(){
return contractHome.getInstance().getMoneyManager() != null;
}
public void setProtectedMoney(boolean value){
if (value){
if (contractHome.getInstance().getMoneyManager() == null){
contractHome.getInstance().setMoneyManager(new MoneyManager(contractHome.getInstance().getId()));
contractHome.getInstance().getMoneyManager().setOldHouseMoney(new OldHouseMoney(contractHome.getInstance().getId()));
}
}else {
if (contractHome.getInstance().getMoneyManager() != null){
contractHome.getInstance().setMoneyManager(null);
}
}
}
public String getProtectedBankId(){
if (isProtectedMoney()){
return contractHome.getInstance().getMoneyManager().getBank();
}
return null;
}
public void setProtectedBankId(String value){
if (value == null || "".equals(value.trim())){
if (isProtectedMoney()){
contractHome.getInstance().getMoneyManager().setBank(null);
contractHome.getInstance().getMoneyManager().setAccount(null);
contractHome.getInstance().getMoneyManager().setBankName(null);
}
}else{
MoneyProtectedBank bank = serverWord.getOldHouseMoneyProtectedBankById(value);
if (bank != null){
contractHome.getInstance().getMoneyManager().setBank(bank.getId());
contractHome.getInstance().getMoneyManager().setAccount(bank.getAccount());
contractHome.getInstance().getMoneyManager().setBankName(bank.getBankName());
}
}
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/Tools.java
package cc.coopersoft.house.participant;
import java.security.SecureRandom;
/**
* Created by cooper on 15/06/2017.
*/
public class Tools {
    public static String validRandomData(){
        // build a 32-character random string of uppercase letters (A-Z)
        StringBuilder rndData = new StringBuilder(32);
        SecureRandom r = new SecureRandom();
        for (int i = 0; i < 32; i++) {
            rndData.append((char) ('A' + r.nextInt(26)));
        }
        return rndData.toString();
    }
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/controller/ServerWord.java
package cc.coopersoft.house.participant.controller;
import cc.coopersoft.comm.District;
import cc.coopersoft.comm.exception.HttpApiServerException;
import cc.coopersoft.house.sale.HouseSellService;
import cc.coopersoft.house.sale.data.MoneyProtectedBank;
import cc.coopersoft.house.sale.data.Word;
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.inject.Named;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* Created by cooper on 04/03/2017.
*/
@Named
@ApplicationScoped
public class ServerWord implements java.io.Serializable {
@Inject
private RunParam runParam;
private List<District> districts;
private List<MoneyProtectedBank> oldHouseMoneyProtectedBank;
private Map<String,List<Word>> words = new HashMap<String, List<Word>>();
//private Map<String ,Map<String,Word>> words;
public List<District> getDistricts(){
if (districts == null){
try {
districts = HouseSellService.listDistrict(runParam.getStringParam("nginx_address"));
} catch (HttpApiServerException e) {
throw new IllegalArgumentException("server fail!" , e);
}
}
return districts;
}
public String getDistrictName(String id){
for (District d: getDistricts()){
if (d.getId().equals(id)){
return d.getName();
}
}
return null;
}
public List<MoneyProtectedBank> getOldHouseMoneyProtectedBank() {
if (oldHouseMoneyProtectedBank == null){
try {
oldHouseMoneyProtectedBank = HouseSellService.getOldHouseMoneyProtectedAccountList(runParam.getStringParam("nginx_address"));
} catch (HttpApiServerException e) {
throw new IllegalArgumentException("server fail!" , e);
}
}
return oldHouseMoneyProtectedBank;
}
public MoneyProtectedBank getOldHouseMoneyProtectedBankById(String id){
for(MoneyProtectedBank bank : getOldHouseMoneyProtectedBank()){
if (bank.getId().equals(id)){
return bank;
}
}
return null;
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/contract/ContractPdfFC1.java
package cc.coopersoft.house.participant.contract;
import cc.coopersoft.common.tools.faces.convert.BigMoneyConverter;
import cc.coopersoft.house.participant.data.ContractContextMap;
import com.itextpdf.text.*;
import com.itextpdf.text.pdf.BaseFont;
import com.itextpdf.text.pdf.PdfPCell;
import com.itextpdf.text.pdf.PdfPTable;
import com.itextpdf.text.pdf.PdfWriter;
import java.io.IOException;
import java.io.OutputStream;
import java.text.DecimalFormat;
import java.text.SimpleDateFormat;
import java.util.Date;
public class ContractPdfFC1 {
    //TODO barcode, QR code, watermark
private static String pm(String s, int len){
StringBuffer result = new StringBuffer(s == null ? "" : s.trim());
if (!"".equals(result.toString())){
result.insert(0," ");
result.append(" ");
}else {
for (int i = 0; i < len ; i++) {
result.append(" ");
}
}
return result.toString();
}
public static void pdf(ContractContextMap contractContextMap, OutputStream outputStream){
Document document = new Document(PageSize.A4,40,40,60,60);
try {
PdfWriter pdfWriter = PdfWriter.getInstance(document,outputStream);
pdfWriter.setEncryption(null,null,PdfWriter.ALLOW_PRINTING,PdfWriter.STANDARD_ENCRYPTION_128);
BaseFont bfChinese = BaseFont.createFont("STSongStd-Light",
"UniGB-UCS2-H", false);
Font h1 = new Font(bfChinese, 32, Font.BOLD,
BaseColor.BLACK);
Font h2 = new Font(bfChinese, 18, Font.BOLD,
BaseColor.BLACK);
Font h3 = new Font(bfChinese, 13, Font.BOLD,
BaseColor.BLACK);
Font h3i = new Font(bfChinese, 13, Font.UNDERLINE,
BaseColor.BLACK);
Font b1 = new Font(bfChinese, 13, Font.NORMAL,
BaseColor.BLACK);
Font b2 = new Font(bfChinese, 12, Font.NORMAL,
BaseColor.BLACK);
Font bi = new Font(bfChinese,12,Font.UNDERLINE,BaseColor.BLACK);
DecimalFormat dfArea = new DecimalFormat("#0.00");
DecimalFormat dfMoney = new DecimalFormat("¥#0.00");
DecimalFormat dfInt = new DecimalFormat("#0");
document.open();
Paragraph p = new Paragraph("合同编号:" + contractContextMap.get("contract_number").getStringValue() ,b1);
document.add(p);
p = new Paragraph("房地产买卖契约",h1);
p.setAlignment(Element.ALIGN_CENTER);
document.add(p);
PdfPTable table = new PdfPTable(2);
float f[] = {1,8};
table.setWidths(f);
table.setWidthPercentage(100);
PdfPCell cell = new PdfPCell(new Paragraph("立契约人",b1));
cell.setRowspan(2);
table.addCell(cell);
PdfPCell cel = new PdfPCell(new Paragraph("甲方(卖方): " + contractContextMap.get("seller_name").getStringValue() + contractContextMap.get("seller_card_type").getStringValue() + "证号码: " + contractContextMap.get("seller_card_number").getStringValue(),b1));
cel.setBorder(0);
table.addCell(cel);
cel = new PdfPCell(new Paragraph("乙方(买方):" + contractContextMap.get("buyer_name").getStringValue() + contractContextMap.get("buyer_card_name").getStringValue() + "证号码: " + contractContextMap.get("buyer_card_number").getStringValue(),b1));
cel.setBorder(0);
table.addCell(cel);
document.add(table);
document.add(new Paragraph(" ",b1));
document.add(new Paragraph(" 甲、乙双方经过平等协商,一致同意就下列房地产买卖事项订立本契约,共同遵守。",b2));
p = new Paragraph(20);
p.add(new Phrase(" 一、甲方自愿将座落在凤城市", b2));
p.add(new Phrase(pm(contractContextMap.get("address").getStringValue(),50),bi));
p.add(new Phrase("的房地产(房屋建筑面积",b2));
p.add(new Phrase(pm(contractContextMap.get("house_area").getStringValue(),4),bi));
p.add(new Phrase("平方米;土地使用面积",b2));
p.add(new Phrase(pm(contractContextMap.get("land_area").getStringValue(),4),bi));
p.add(new Phrase("平方米)出售给乙方。",b2));
p.add(new Phrase(pm(contractContextMap.get("power_card_type").getStringValue(),4),bi));
p.add(new Phrase("证号",b2));
p.add(new Phrase(pm(contractContextMap.get("power_card_number").getStringValue(),4),bi));
p.add(new Phrase(",用途",b2));
p.add(new Phrase(pm(contractContextMap.get("use_type").getStringValue(),10),bi));
p.add(new Phrase(",结构:",b2));
p.add(new Phrase(pm(contractContextMap.get("structure").getStringValue(),10),bi));
p.add(new Phrase(",所在层数:",b2));
p.add(new Phrase(pm(contractContextMap.get("in_floor").getStringValue(),4),bi));
p.add(new Phrase("四至界限为",b2));
p.add(new Phrase(pm(contractContextMap.get("c1_1_1").getStringValue(),50),bi));
p.add(new Phrase("。乙方经过充分了解愿意购买该房地产。",b2));
document.add(p);
p = new Paragraph(20);
p.add(new Phrase(" 二、甲乙双方议定上述房地产成交价格为人民币(大写)",b2));
p.add(new Phrase(pm(BigMoneyConverter.numberToBigMoney(contractContextMap.get("money").getNumberValue()),55),bi));
p.add(new Phrase(";¥",b2));
p.add(new Phrase(pm( dfMoney.format(contractContextMap.get("money").getNumberValue()),10),bi));
p.add(new Phrase("元。乙方由",b2));
p.add(new Phrase(pm(contractContextMap.get("c_2_1_1").getStringValue(),4),bi));
p.add(new Phrase("年",b2));
p.add(new Phrase(pm(contractContextMap.get("c_2_1_2").getStringValue(),4),bi));
p.add(new Phrase("月",b2));
p.add(new Phrase(pm(contractContextMap.get("c_2_1_3").getStringValue(),4),bi));
p.add(new Phrase("日前",b2));
p.add(new Phrase(pm(contractContextMap.get("c_2_1_4").getStringValue(),4),bi));
p.add(new Phrase("次付清给甲方房款。付款方式",b2));
p.add(new Phrase(pm(contractContextMap.get("sale_pay_type").getStringValue(),4),bi));
p.add(new Phrase("。",b2));
document.add(p);
if (!contractContextMap.get("bank_name").isEmptyData()) {
p = new Paragraph(20);
p.add(new Phrase(" 买卖双方约定监管银行为:", b2));
p.add(new Phrase(pm(contractContextMap.get("bank_name").getStringValue(), 20),bi));
p.add(new Phrase("账号为:",b2));
p.add(new Phrase(pm(contractContextMap.get("bank_account").getStringValue(),20),bi));
p.add(new Phrase("监管资金为:人民币",b2));
p.add(new Phrase(dfMoney.format(contractContextMap.get("protected_money").getNumberValue()),bi));
p.add(new Phrase("元(大写):",b2));
p.add(new Phrase(BigMoneyConverter.numberToBigMoney(contractContextMap.get("protected_money").getNumberValue()),bi));
document.add(p);
document.add(new Paragraph(20," 出卖人确认的监管资金收款账户为:",b2));
p = new Paragraph(20);
p.add(new Phrase(" 收款人姓名:",b2));
p.add(new Phrase(contractContextMap.get("card_name").getStringValue(),bi));
p.add(new Phrase("收款人账号:",b2));
p.add(new Phrase(contractContextMap.get("card_number").getStringValue(),bi));
p.add(new Phrase("开户银行:",b2));
p.add(new Phrase(contractContextMap.get("card_bank").getStringValue(),bi));
document.add(p);
}
p = new Paragraph(20);
p.add(new Phrase(" 三、双方同意于",h3));
p.add(new Phrase(pm(contractContextMap.get("c_6_1_1").getStringValue(),4),bi));
p.add(new Phrase("年",b2));
p.add(new Phrase(pm(contractContextMap.get("c_6_1_2").getStringValue(),4),bi));
p.add(new Phrase("月",b2));
p.add(new Phrase(pm(contractContextMap.get("c_6_1_3").getStringValue(),4),bi));
p.add(new Phrase("日由甲方将上述房地产正式交付给乙方。房屋移交给乙方时,其该建筑物范围内的土地使用权一并转移给乙方。",b2));
document.add(p);
p = new Paragraph(20);
p.add(new Phrase(" 四、甲方保证上述房地产权属清楚,无纠纷,房地产无限制交易的情形。甲乙双方所提交的证件资料真实、合法、有效、无隐瞒、伪报。如有不实,甲乙双方愿负全部法律及经济责任。",b2));
document.add(p);
p = new Paragraph(20);
p.add(new Phrase(" 五、上述房地产办理过户手续所需缴纳的税费,由",b2));
p.add(new Phrase(pm(contractContextMap.get("c_5_1_1").getStringValue(),4),bi));
p.add(new Phrase("承担。甲方须协助乙方办理房地产买卖过户事宜。",b2));
document.add(p);
p = new Paragraph(20);
p.add(new Phrase(" 六、本契约经甲乙双方签章后生效。任何一方违约,由违约方向对方给付上述房地产价款百分之",b2));
p.add(new Phrase(pm(contractContextMap.get("c_6_1_1").getStringValue(),4),bi));
p.add(new Phrase("的违约金。",b2));
document.add(p);
p = new Paragraph(20);
p.add(new Phrase(" 七、本契约一式四份,甲乙双方各执一份,房地产交易中心和公证处各一份。",b2));
document.add(p);
p = new Paragraph(20);
p.add(new Phrase(" 八、双方约定的其他事项:",b2));
document.add(p);
p = new Paragraph(20," ",b2);
p.add(new Phrase(20,pm(contractContextMap.get("c8_1_1").getStringValue(),95),bi));
document.add(p);
document.add(new Paragraph(25," ",h2));
table = new PdfPTable(2);
table.setWidths(f);
table.setWidthPercentage(100);
cel = new PdfPCell(new Paragraph("卖 方(签章):",b2));
cel.setBorder(0);
table.addCell(cel);
cel = new PdfPCell(new Paragraph("买 方(签章):",b2));
cel.setBorder(0);
table.addCell(cel);
cel = new PdfPCell(new Paragraph("委托代理人(签章):",b2));
cel.setBorder(0);
cel.setPaddingTop(10);
table.addCell(cel);
cel = new PdfPCell(new Paragraph("委托代理人(签章):",b2));
cel.setBorder(0);
cel.setPaddingTop(10);
table.addCell(cel);
p = new Paragraph(20);
p.add(new Phrase(" ",bi));
p.add(new Phrase("证号码:",b2));
cel = new PdfPCell(p);
cel.setBorder(0);
cel.setPaddingTop(10);
table.addCell(cel);
p = new Paragraph(20);
p.add(new Phrase(" ",bi));
p.add(new Phrase("证号码:",b2));
cel = new PdfPCell(p);
cel.setBorder(0);
cel.setPaddingTop(10);
table.addCell(cel);
cel = new PdfPCell(new Paragraph("联系电话:",b2));
cel.setBorder(0);
cel.setPaddingTop(10);
table.addCell(cel);
cel = new PdfPCell(new Paragraph("联系电话:",b2));
cel.setBorder(0);
cel.setPaddingTop(10);
table.addCell(cel);
table.setKeepTogether(true);
document.add(table);
SimpleDateFormat sdf = new SimpleDateFormat("yyyy年MM月dd日");
p = new Paragraph(sdf.format(new Date()),b2);
p.setAlignment(Element.ALIGN_RIGHT);
document.add(p);
document.close();
} catch (DocumentException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/data/model/HouseFilterType.java
package cc.coopersoft.house.participant.data.model;
import cc.coopersoft.house.sale.data.HouseSource;
/**
* Created by cooper on 24/09/2017.
*/
public class HouseFilterType {
private HouseSource.HouseSourceStatus filterType;
private long count;
public HouseFilterType(HouseSource.HouseSourceStatus filterType, long count) {
this.filterType = filterType;
this.count = count;
}
public HouseSource.HouseSourceStatus getFilterType() {
return filterType;
}
public void setFilterType(HouseSource.HouseSourceStatus filterType) {
this.filterType = filterType;
}
public long getCount() {
return count;
}
public void setCount(long count) {
this.count = count;
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/controller/HouseValidInfoProducer.java
package cc.coopersoft.house.participant.controller;
import cc.coopersoft.house.sale.data.HouseValidInfo;
import org.apache.deltaspike.core.api.scope.ViewAccessScoped;
import javax.enterprise.inject.Produces;
import javax.inject.Named;
/**
* Created by cooper on 02/03/2017.
*/
public class HouseValidInfoProducer {
@Produces
@ViewAccessScoped
@Named("houseValidInfo")
public HouseValidInfo createHouseValidInfo(){
return new HouseValidInfo(HouseValidInfo.ValidType.MBBH_NUMBER);
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/faces/ValidTypeConverter.java
package cc.coopersoft.house.participant.faces;
import cc.coopersoft.house.sale.data.HouseValidInfo;
import javax.faces.convert.EnumConverter;
import javax.faces.convert.FacesConverter;
/**
* Created by cooper on 14/06/2017.
*/
@FacesConverter("validTypeConverter")
public class ValidTypeConverter extends EnumConverter {
public ValidTypeConverter() {
super(HouseValidInfo.ValidType.class);
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/controller/ContractCreate.java
package cc.coopersoft.house.participant.controller;
import cc.coopersoft.comm.HttpJsonDataGet;
import cc.coopersoft.comm.exception.HttpApiServerException;
import cc.coopersoft.common.EnumHelper;
import cc.coopersoft.house.SubmitType;
import cc.coopersoft.house.participant.AttrUser;
import cc.coopersoft.house.participant.Messages;
import cc.coopersoft.house.participant.data.ContractContextMap;
import cc.coopersoft.house.participant.pages.Seller;
import cc.coopersoft.house.sale.data.*;
import com.dgsoft.developersale.wsinterface.DESUtil;
import com.dgsoft.house.OwnerShareCalcType;
import com.dgsoft.house.SaleType;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.deltaspike.core.api.config.view.ViewConfig;
import org.apache.deltaspike.jpa.api.transaction.Transactional;
import org.apache.deltaspike.jsf.api.message.JsfMessage;
import javax.enterprise.context.Conversation;
import javax.enterprise.context.ConversationScoped;
import javax.enterprise.inject.Default;
import javax.faces.application.FacesMessage;
import javax.faces.context.ExternalContext;
import javax.faces.context.FacesContext;
import javax.inject.Inject;
import javax.inject.Named;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.math.BigDecimal;
import java.util.*;
import java.util.logging.Level;
import java.util.logging.Logger;
/**
* Created by cooper on 27/09/2017.
*/
@Named
@ConversationScoped
public class ContractCreate implements java.io.Serializable{
@Inject
@Default
private Conversation conversation;
@Inject
protected Logger logger;
@Inject
private JsfMessage<Messages> messages;
@Inject
private HouseSourceHome houseSourceHome;
@Inject
private ContractHome contractHome;
@Inject
private RunParam runParam;
@Inject
private EnumHelper enumHelper;
@Inject
private LocalContractConfig localContractConfig;
@Inject
private AttrUser attrUser;
@Inject
private ServerToken serverToken;
@Inject
private FacesContext facesContext;
@Inject
private HouseSourceCreate houseSourceCreate;
private OwnerShareCalcType ownerShareCalcType;
@cc.coopersoft.house.participant.annotations.Seller
@Transactional
public Class<? extends ViewConfig> deleteContract(){
if (HouseContract.ContractStatus.PREPARE.equals(houseSourceHome.getHouseSourceCompany().getHouseContract().getStatus())){
contractHome.setId(houseSourceHome.getHouseSourceCompany().getHouseContract().getId());
houseSourceHome.getHouseSourceCompany().setHouseContract(null);
houseSourceHome.save();
contractHome.remove();
return Seller.HouseSourceView.class;
}else {
throw new IllegalArgumentException("contract status error!");
}
}
@cc.coopersoft.house.participant.annotations.Seller
@Transactional
public Class<? extends ViewConfig> submitContract(){
logger.config("call contract commit!");
contractHome.putContractContext();
ObjectMapper mapper = new ObjectMapper();
Map<String,String> params = new HashMap<String, String>(1);
try {
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
localContractConfig.getConfig().pdf(contractHome.getContractContextMap(),buffer);
buffer.close();
//logger.config("begin upload file;");
contractHome.getInstance().setFileId (HttpJsonDataGet.putFile(runParam.getStringParam("nginx_address") + "/api/protected/put-media",buffer.toByteArray(),attrUser.getKeyId(),serverToken.getRndData(),serverToken.getDigest()));
String data = mapper.writeValueAsString(contractHome.getInstance());
//logger.config(data);
params.put("data", DESUtil.encrypt(data, attrUser.getLoginData().getToken()));
SubmitResult result;
if (runParam.getStringParam("has_server_record").equals("FALSE")){
params.put("source", DESUtil.encrypt(mapper.writeValueAsString(houseSourceHome.getInstance()), attrUser.getLoginData().getToken()));
result = HttpJsonDataGet.postData(runParam.getStringParam("server_address") + "interfaces/extends/contract_and_house/" + attrUser.getLoginData().getKey(),params, SubmitResult.class);
}else{
result = HttpJsonDataGet.postData(runParam.getStringParam("server_address") + "interfaces/extends/contract/" + SubmitType.SALE_CONTRACT.name() + "/" + attrUser.getLoginData().getKey(),params, SubmitResult.class);
}
switch (result.getStatus()){
case SUCCESS:
contractHome.getInstance().setStatus(HouseContract.ContractStatus.SUBMIT);
contractHome.getInstance().setCommitTime(new Date());
messages.addInfo().contractCommited();
houseSourceHome.getInstance().setStatus(HouseSource.HouseSourceStatus.SUBMIT);
contractHome.getInstance().setCommitTime(new Date());
contractHome.getInstance().setStatus(HouseContract.ContractStatus.SUBMIT);
houseSourceHome.save();
contractHome.save();
return Seller.HouseSourceView.class;
case FAIL:
facesContext.addMessage(null,
new FacesMessage(FacesMessage.SEVERITY_ERROR, result.getMessages(), result.getMessages()));
break;
case ERROR:
facesContext.addMessage(null,
new FacesMessage(FacesMessage.SEVERITY_ERROR, result.getMessages(), result.getMessages()));
break;
}
} catch (JsonProcessingException e) {
throw new IllegalArgumentException(e.getMessage(), e);
} catch (HttpApiServerException e) {
logger.log(Level.WARNING,e.getMessage(),e);
messages.addError().serverFail();
return null;
} catch (Exception e) {
throw new IllegalArgumentException(e.getMessage(), e);
}
return null;
}
@cc.coopersoft.house.participant.annotations.Seller
public void printPdf(){
contractHome.save();
ExternalContext externalContext = facesContext.getExternalContext();
externalContext.responseReset();
externalContext.setResponseContentType("application/pdf");
externalContext.setResponseHeader("Content-Disposition", "inline; filename=\"" + contractHome.getInstance().getId() + ".pdf\"");
try {
localContractConfig.getConfig().pdf(contractHome.getContractContextMap(),externalContext.getResponseOutputStream());
} catch (IOException e) {
throw new IllegalArgumentException(e);
}
facesContext.responseComplete();
}
@cc.coopersoft.house.participant.annotations.Seller
public Class<? extends ViewConfig> sourceToContract(){
if (houseSourceHome.getHouseSourceCompany().getHouseContract() != null){
contractHome.clearInstance();
contractHome.setId(houseSourceHome.getHouseSourceCompany().getHouseContract().getId());
if (HouseContract.ContractStatus.PREPARE.equals(contractHome.getInstance().getStatus())){
return localContractConfig.getConfig().getEditPath(contractHome.getInstance().getType());
}else{
throw new IllegalArgumentException("contract status error!");
}
}else{
beginConversation();
createAndInitContract();
return Seller.Apply.ContractBaseInfo.class;
}
}
private void createAndInitContract(){
contractHome.clearInstance();
contractHome.getInstance().setHouseArea(houseSourceHome.getInstance().getHouseArea());
contractHome.getInstance().setHouseCode(houseSourceHome.getInstance().getHouseCode());
contractHome.getInstance().setHouseDescription(houseSourceHome.getInstance().getAddress());
contractHome.getInstance().setType(SaleType.OLD_SELL);
contractHome.getInstance().setContractVersion(SaleType.OLD_SELL.getCurrentVersion());
}
@cc.coopersoft.house.participant.annotations.Seller
public Class<? extends ViewConfig> saveSourceAndCreateContract(){
houseSourceCreate.saveHouseSourceInfo();
createAndInitContract();
return Seller.Apply.ContractBaseInfo.class;
}
public void setOwnerShareCalcType(OwnerShareCalcType ownerShareCalcType) {
this.ownerShareCalcType = ownerShareCalcType;
}
public OwnerShareCalcType getOwnerShareCalcType() {
switch (runParam.getIntegerParam("OWNER_SHARE_CALC_TYPE")) {
case 1:
return OwnerShareCalcType.SCALE;
case 2:
return OwnerShareCalcType.AREA;
default:
return ownerShareCalcType;
}
}
// public Class<? extends ViewConfig> createNew(){
// beginConversation();
// return null;
// }
public Class<? extends ViewConfig> toSellerInfo(){
contractHome.createPowerPersonList();
return Seller.Apply.ContractSellerInfo.class;
}
public Class<? extends ViewConfig> toBuyerInfo(){
return Seller.Apply.ContractBuyerInfo.class;
}
public Class<? extends ViewConfig> createContract(){
//contractHome.save();
PowerPerson seller = contractHome.getSellerEditList().get(0).getPersonEntity();
PowerPerson buyer = contractHome.getBuyerEditList().get(0).getPersonEntity();
contractHome.getContractContextMap().put("contract_number",new ContractContextMap.ContarctContextItem(contractHome.getInstance().getId()));
contractHome.getContractContextMap().put("seller_name",new ContractContextMap.ContarctContextItem(seller.getPersonName()));
contractHome.getContractContextMap().put("seller_card_type",new ContractContextMap.ContarctContextItem(enumHelper.getLabel(seller.getCredentialsType())));
contractHome.getContractContextMap().put("seller_card_number", new ContractContextMap.ContarctContextItem(seller.getCredentialsNumber()));
contractHome.getContractContextMap().put("seller_tel", new ContractContextMap.ContarctContextItem(seller.getPhone()));
if (contractHome.getSellerEditList().size() > 1){
List<ContractContextMap> poolMapList = new ArrayList<ContractContextMap>(contractHome.getSellerEditList().size() - 1);
for (int i = 1 ; i < contractHome.getSellerEditList().size() ; i++){
ContractContextMap poolMap = new ContractContextMap();
PowerPerson pool = contractHome.getSellerEditList().get(i).getPersonEntity();
poolMap.put("name", new ContractContextMap.ContarctContextItem(pool.getPersonName()));
poolMap.put("card_type", new ContractContextMap.ContarctContextItem(enumHelper.getLabel(pool.getCredentialsType())));
poolMap.put("card_number", new ContractContextMap.ContarctContextItem(pool.getCredentialsNumber()));
poolMap.put("tel", new ContractContextMap.ContarctContextItem(pool.getPhone()));
poolMapList.add(poolMap);
}
contractHome.getContractContextMap().put("seller_pool", new ContractContextMap.ContarctContextItem(poolMapList));
}
if (contractHome.getBuyerEditList().size() > 1){
List<ContractContextMap> poolMapList = new ArrayList<ContractContextMap>(contractHome.getBuyerEditList().size() - 1);
for (int i = 1 ; i < contractHome.getBuyerEditList().size() ; i++){
ContractContextMap poolMap = new ContractContextMap();
PowerPerson pool = contractHome.getBuyerEditList().get(i).getPersonEntity();
poolMap.put("name", new ContractContextMap.ContarctContextItem(pool.getPersonName()));
poolMap.put("card_type", new ContractContextMap.ContarctContextItem(enumHelper.getLabel(pool.getCredentialsType())));
poolMap.put("card_number", new ContractContextMap.ContarctContextItem(pool.getCredentialsNumber()));
poolMap.put("tel", new ContractContextMap.ContarctContextItem(pool.getPhone()));
poolMapList.add(poolMap);
}
contractHome.getContractContextMap().put("buyer_pool", new ContractContextMap.ContarctContextItem(poolMapList));
}
contractHome.getContractContextMap().put("buyer_name", new ContractContextMap.ContarctContextItem(buyer.getPersonName()));
contractHome.getContractContextMap().put("buyer_card_name", new ContractContextMap.ContarctContextItem(enumHelper.getLabel(buyer.getCredentialsType())));
contractHome.getContractContextMap().put("buyer_card_number",new ContractContextMap.ContarctContextItem(buyer.getCredentialsNumber()));
contractHome.getContractContextMap().put("buyer_tel",new ContractContextMap.ContarctContextItem(buyer.getPhone()));
contractHome.getContractContextMap().put("money", new ContractContextMap.ContarctContextItem(contractHome.getInstance().getPrice()));
contractHome.getContractContextMap().put("pay_type", new ContractContextMap.ContarctContextItem(contractHome.getInstance().getSalePayType().name()));
contractHome.getContractContextMap().put("price_house_area", new ContractContextMap.ContarctContextItem(contractHome.getInstance().getPrice().divide(houseSourceHome.getInstance().getHouseArea(),2, BigDecimal.ROUND_HALF_EVEN)));
if (houseSourceHome.getInstance().getUseArea() != null && houseSourceHome.getInstance().getUseArea().compareTo(BigDecimal.ZERO) > 0) {
contractHome.getContractContextMap().put("price_use_area", new ContractContextMap.ContarctContextItem(contractHome.getInstance().getPrice().divide(houseSourceHome.getInstance().getUseArea(), 2, BigDecimal.ROUND_HALF_EVEN)));
}
contractHome.getContractContextMap().put("address", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getAddress()));
contractHome.getContractContextMap().put("use_type", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getDesignUseType()));
contractHome.getContractContextMap().put("map_number",new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getMapNumber()));
contractHome.getContractContextMap().put("block_number",new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getBlockNumber()));
contractHome.getContractContextMap().put("build_number",new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getBuildNumber()));
contractHome.getContractContextMap().put("house_number",new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getHouseOrder()));
contractHome.getContractContextMap().put("structure", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getStructure()));
contractHome.getContractContextMap().put("floor_count",new ContractContextMap.ContarctContextItem(String.valueOf(houseSourceHome.getInstance().getFloorCount())));
contractHome.getContractContextMap().put("in_floor", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getInFloorName()));
contractHome.getContractContextMap().put("power_card_type", new ContractContextMap.ContarctContextItem(enumHelper.getLabel(houseSourceHome.getInstance().getPowerCardType())));
contractHome.getContractContextMap().put("power_card_number", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getPowerCardNumber()));
contractHome.getContractContextMap().put("house_area", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getHouseArea()));
if (houseSourceHome.getInstance().getUseArea() != null && houseSourceHome.getInstance().getUseArea().compareTo(BigDecimal.ZERO) > 0 ) {
contractHome.getContractContextMap().put("use_area", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getUseArea()));
// comm_area only makes sense when a usable use-area is present; subtracting a null use-area would NPE
contractHome.getContractContextMap().put("comm_area", new ContractContextMap.ContarctContextItem(houseSourceHome.getInstance().getHouseArea().subtract(houseSourceHome.getInstance().getUseArea())));
}
contractHome.getContractContextMap().put("sale_pay_type",new ContractContextMap.ContarctContextItem(enumHelper.getLabel(contractHome.getInstance().getSalePayType())));
if (contractHome.getInstance().getMoneyManager() != null){
contractHome.getContractContextMap().put("bank_name" , new ContractContextMap.ContarctContextItem(contractHome.getInstance().getMoneyManager().getBankName()));
contractHome.getContractContextMap().put("bank_account", new ContractContextMap.ContarctContextItem(contractHome.getInstance().getMoneyManager().getAccount()));
contractHome.getContractContextMap().put("card_bank",new ContractContextMap.ContarctContextItem(contractHome.getInstance().getMoneyManager().getOldHouseMoney().getBankName()));
contractHome.getContractContextMap().put("card_number",new ContractContextMap.ContarctContextItem(contractHome.getInstance().getMoneyManager().getOldHouseMoney().getCardNumber()));
contractHome.getContractContextMap().put("card_name", new ContractContextMap.ContarctContextItem(contractHome.getInstance().getMoneyManager().getOldHouseMoney().getCardName()));
contractHome.getContractContextMap().put("protected_money", new ContractContextMap.ContarctContextItem(contractHome.getInstance().getMoneyManager().getMoney()));
}else {
contractHome.getContractContextMap().remove("bank_name");
}
contractHome.putContractContext();
houseSourceHome.getHouseSourceCompany().setHouseContract(contractHome.getInstance());
houseSourceHome.save();
endConversation();
return localContractConfig.getConfig().getEditPath(contractHome.getInstance().getType());
}
protected void beginConversation(){
if ( conversation.isTransient() )
{
conversation.begin();
conversation.setTimeout(1200000);
}
}
protected void endConversation() {
if ( !conversation.isTransient() )
{
conversation.end();
}
}
}
<file_sep>/src/main/java/cc/coopersoft/common/util/DataHelper.java
package cc.coopersoft.common.util;
/**
* Created by cooper on 6/22/16.
*/
public class DataHelper {
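/** True when the two values differ (by equals), treating a null on exactly one side as a difference. */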
public static <U> boolean isDirty(U oldValue, U newValue)
{
return oldValue!=newValue && (
oldValue==null ||
!oldValue.equals(newValue)
);
}
public static boolean empty(String value){
return value == null || "".equals(value.trim());
}
}
<file_sep>/src/main/java/cc/coopersoft/common/EntityConverter.java
package cc.coopersoft.common;
import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.convert.Converter;
import javax.inject.Inject;
import javax.persistence.EntityManager;
import java.io.Serializable;
/**
* Created by cooper on 6/19/16.
*/
public abstract class EntityConverter implements Converter, Serializable {
@Inject
private EntityLoader entityLoader;
protected abstract EntityManager getEntityManager();
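/** Resolves the submitted identifier back to the managed entity through the shared EntityLoader. */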
public Object getAsObject(FacesContext context, UIComponent component, String value) {
if (value == null || value.length() == 0)
{
return null;
}
return entityLoader.get(getEntityManager(),value);
}
public String getAsString(FacesContext context, UIComponent component, Object value) {
if (value == null)
{
return null;
}
if (value instanceof String)
{
return (String) value;
}
return entityLoader.put(value);
}
}
<file_sep>/src/main/java/cc/coopersoft/common/I18n.java
package cc.coopersoft.common;
import javax.annotation.PostConstruct;
import javax.enterprise.context.SessionScoped;
import javax.inject.Named;
import java.util.*;
/**
* Created by cooper on 7/22/16.
*/
@Named
@SessionScoped
public class I18n implements java.io.Serializable{
private Locale locale;
@PostConstruct
public void initParam(){
locale = Locale.CHINA;
}
public Locale getLocale() {
return locale;
}
public TimeZone getTimeZone(){
return Calendar.getInstance(getLocale()).getTimeZone();
}
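/** Truncates the given date to 00:00:00.000 of the same day in the session locale's calendar. */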
public Date getDayBeginTime(Date value){
GregorianCalendar gc = new GregorianCalendar(getLocale());
gc.setTime(value);
gc.set(Calendar.HOUR_OF_DAY, 0);
gc.set(Calendar.MINUTE, 0);
gc.set(Calendar.SECOND, 0);
gc.set(Calendar.MILLISECOND, 0);
return gc.getTime();
}
public Date getDayEndTime(Date value){
return new Date(getDayBeginTime(value).getTime() + 24 * 60 * 60 * 1000 - 1000);
}
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/Messages.java
package cc.coopersoft.house.participant;
import org.apache.deltaspike.core.api.message.MessageBundle;
import org.apache.deltaspike.core.api.message.MessageTemplate;
/**
* Created by cooper on 6/8/16.
*/
@MessageBundle
public interface Messages{
@MessageTemplate("{authenticate_fail}")
String authenticateFail();
@MessageTemplate("{valid_house_not_fount}")
String validHouseNotFound();
@MessageTemplate("{valid_house_error}")
String validHouseError();
@MessageTemplate("{valid_house_owner_fail}")
String validHouseOwnerFail();
@MessageTemplate("{api_server_fail}")
String serverFail();
@MessageTemplate("{house_source_exists}")
String houseSourceExists();
@MessageTemplate("{house_source_only_exists}")
String houseSourceOnlyExists();
@MessageTemplate("{house_exists_to_edit}")
String houseSourceToEdit();
@MessageTemplate("{contract_commited}")
String contractCommited();
@MessageTemplate("{house_source_committed}")
String houseSourceCommitted();
}
<file_sep>/src/main/java/cc/coopersoft/house/participant/Authorizer.java
package cc.coopersoft.house.participant;
import cc.coopersoft.house.participant.annotations.AttrCorp;
import cc.coopersoft.house.participant.annotations.Seller;
import org.apache.deltaspike.security.api.authorization.Secures;
import org.picketlink.Identity;
import org.picketlink.idm.IdentityManager;
import org.picketlink.idm.RelationshipManager;
import org.picketlink.idm.model.basic.BasicModel;
import javax.enterprise.context.ApplicationScoped;
/**
* Created by cooper on 26/02/2017.
*/
@ApplicationScoped
public class Authorizer {
@Secures
@Seller
public boolean doSellerCheck(Identity identity, IdentityManager identityManager, RelationshipManager relationshipManager) throws Exception{
return BasicModel.hasRole(relationshipManager, identity.getAccount(), BasicModel.getRole(identityManager, "seller"));
}
@Secures
@AttrCorp
public boolean doAttrCorpCheck(Identity identity, IdentityManager identityManager, RelationshipManager relationshipManager) throws Exception{
return BasicModel.hasRole(relationshipManager, identity.getAccount(), BasicModel.getRole(identityManager, "seller")) ||
BasicModel.hasRole(relationshipManager, identity.getAccount(), BasicModel.getRole(identityManager, "developer"));
}
}
<file_sep>/src/main/resources/cc/coopersoft/house/participant/Messages.properties
up=\u2191
down=\u2193
left=\u2039
right=\u203A
authenticate_fail=\u767B\u5F55\u5931\u8D25,\u7528\u6237\u540D\u6216\u5BC6\u7801\u9519\u8BEF!
application_title=\u623F\u5C4B\u4EA4\u6613\u5E73\u53F0
primary_key_conflict=\u7F16\u53F7\u51B2\u7A81,\u6B64\u7F16\u53F7\u5DF2\u7ECF\u5B58\u5728!
api_server_fail = \u8FDE\u63A5\u623F\u4EA7\u6548\u679C\u670D\u52A1\u5668\u51FA\u9519\u6216\u5931\u8D25\uFF0C\u8BF7\u8054\u7CFB\u7BA1\u7406\u90E8\u95E8
com.dgsoft.common.system.PersonEntity$CredentialsType.MASTER_ID=\u8EAB\u4EFD\u8BC1
com.dgsoft.common.system.PersonEntity$CredentialsType.SOLDIER_CARD=\u58EB\u5175\u8BC1
com.dgsoft.common.system.PersonEntity$CredentialsType.PASSPORT=\u62A4\u7167
com.dgsoft.common.system.PersonEntity$CredentialsType.COMPANY_CODE=\u8425\u4E1A\u6267\u7167
com.dgsoft.common.system.PersonEntity$CredentialsType.OTHER=\u5176\u5B83\u8BC1\u4EF6
com.dgsoft.common.system.PersonEntity$CredentialsType.CORP_CODE=\u673A\u6784\u4EE3\u7801\u8BC1
com.dgsoft.common.system.PersonEntity$CredentialsType.TW_ID=\u53F0\u6E7E\u901A\u884C\u8BC1
com.dgsoft.common.system.PersonEntity$CredentialsType.OFFICER_CARD=\u519B\u5B98\u8BC1
com.dgsoft.common.system.PersonEntity$CredentialsType.GA_ID=\u6E2F\u6FB3\u901A\u884C\u8BC1
com.dgsoft.house.PoolType.SINGLE_OWNER=\u5355\u72EC\u6240\u6709
com.dgsoft.house.PoolType.TOGETHER_OWNER=\u5171\u540C\u5171\u6709
com.dgsoft.house.PoolType.SHARE_OWNER=\u6309\u4EFD\u5171\u6709
com.dgsoft.common.system.Sex.FEMALE=\u5973
com.dgsoft.common.system.Sex.MALE=\u7537
com.dgsoft.house.SalePayType.ALL_PAY=\u4E00\u6B21\u6027\u4ED8\u6B3E
com.dgsoft.house.SalePayType.PART_PAY=\u5206\u671F\u4ED8\u6B3E
com.dgsoft.house.SalePayType.DEBIT_PAY=\u8D37\u6B3E
com.dgsoft.house.SalePayType.OTHER_PAY=\u5176\u5B83\u4ED8\u6B3E\u65B9\u5F0F
com.dgsoft.common.system.PowerPersonEntity$LegalType.LEGAL_OWNER=\u6CD5\u4EBA
com.dgsoft.common.system.PowerPersonEntity$LegalType.LEGAL_MANAGER=\u8D1F\u8D23\u4EBA
cc.coopersoft.house.ProxyType.ENTRUSTED=\u59D4\u6258\u4EE3\u7406\u4EBA
cc.coopersoft.house.ProxyType.LEGAL=\u6CD5\u5B9A\u4EE3\u7406\u4EBA
cc.coopersoft.house.LockType.HOUSE_LOCKED= [\u623F\u5C4B\u9884\u8B66]
cc.coopersoft.house.LockType.SYSTEM_LOCKED= [\u7CFB\u7EDF\u9884\u8B66]
cc.coopersoft.house.LockType.DISPUTE_REG= [\u5F02\u8BAE\u767B\u8BB0]
cc.coopersoft.house.LockType.CLOSE_REG= [\u67E5\u8BE2\u767B\u8BB0]
cc.coopersoft.house.LockType.OTHER_REG= [\u5176\u5B83\u9650\u5236\u767B\u8BB0]
cc.coopersoft.house.LockType.MORTGAGE_REEG= [\u62B5\u62BC\u767B\u8BB0]
cc.coopersoft.house.LockType.IN_BUSINESS= [\u6B63\u5728\u529E\u7406\u4E1A\u52A1]
cc.coopersoft.house.SaleLimitType.OWNER_CHANGE = \u4EA7\u6743\u9A8C\u8BC1\u5931\u8D25
cc.coopersoft.house.SaleLimitType.STATE_LIMIT = \u4E0D\u53EF\u9500\u552E\u7684\u623F\u5C4B\u72B6\u6001
cc.coopersoft.house.SaleLimitType.REGISTER_LIMIT = \u4E0D\u53EF\u9500\u552E\u7684\u767B\u8BB0\u72B6\u6001
cc.coopersoft.house.SaleLimitType.LOCKED_LIMIT = \u623F\u5C4B\u9884\u8B66
cc.coopersoft.house.SaleLimitType.CHECK_FAIL = \u5BA1\u6838\u5931\u8D25
cc.coopersoft.house.SaleLimitType.OTHER_SALE = \u5176\u5B83\u673A\u6784\u6302\u724C
com.dgsoft.developersale.LogonStatus.PASSWORD_ERROR = \u7535\u5B50\u94A5\u5319\u9A8C\u8BC1\u5931\u8D25
com.dgsoft.developersale.LogonStatus.EMP_DISABLE = \u4ECE\u4E1A\u4EBA\u5458\u5DF2\u88AB\u7981\u7528
com.dgsoft.developersale.LogonStatus.CORP_DISABLE = \u4ECE\u4E1A\u673A\u6784\u5DF2\u88AB\u7981\u7528
com.dgsoft.developersale.LogonStatus.CORP_OUT_TIME = \u4ECE\u4E1A\u673A\u6784\u5E74\u5BA1\u8FC7\u671F
com.dgsoft.developersale.LogonStatus.SERVER_ERROR = \u8FDE\u63A5\u623F\u4EA7\u6548\u679C\u670D\u52A1\u5668\u51FA\u9519\u6216\u5931\u8D25\uFF0C\u8BF7\u8054\u7CFB\u7BA1\u7406\u90E8\u95E8
valid_house_not_fount=\u623F\u5C4B\u9A8C\u8BC1\u5931\u8D25\uFF1A \u63D0\u4F9B\u7684\u623F\u5C4B\u6807\u8BC6\u65E0\u6CD5\u627E\u5230\u623F\u5C4B\uFF0C\u8BF7\u786E\u8BA4\u6807\u8BC6\u6B63\u786E\u6216\u5C1D\u8BD5\u4F7F\u7528\u5176\u5B83\u6807\u8BC6
valid_house_error=\u623F\u5C4B\u9A8C\u8BC1\u51FA\u9519\uFF1A\u65E0\u6CD5\u8FDE\u63A5\u9A8C\u8BC1\u670D\u52A1\u5668\uFF0C\u8BF7\u7A0D\u540E\u518D\u8BD5\u6216\u8054\u7CFB\u7BA1\u7406\u90E8\u95E8
valid_house_owner_fail=\u623F\u5C4B\u9A8C\u8BC1\u5931\u8D25\uFF1A\u63D0\u4F9B\u7684\u4EA7\u6743\u4FE1\u606F\u9A8C\u8BC1\u4E0D\u901A\u8FC7\uFF0C\u8BF7\u68C0\u67E5\u4EA7\u6743\u4FE1\u606F\u662F\u5426\u6B63\u786E
house_source_exists=\u623F\u6E90\u5DF2\u7ECF\u5B58\u5728\uFF0C\u5DF2\u52A0\u5165\u6B64\u623F\u6E90\u5171\u540C\u9500\u552E!
house_exists_to_edit=\u623F\u6E90\u5DF2\u7ECF\u5B58\u5728\uFF0C\u8F6C\u5165\u7F16\u8F91\u754C\u9762!
house_source_only_exists=\u623F\u6E90\u5DF2\u7ECF\u5B58\u5728\uFF0C\u5E76\u4E14\u4E3A\u72EC\u5360\u623F\u6E90\uFF0C\u4E0D\u5141\u8BB8\u52A0\u5165!
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.SUBMIT=\u5408\u540C\u5DF2\u63D0\u4EA4
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.PREPARE=\u623F\u6E90\u4FE1\u606F\u5F55\u5165
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.CHECK=\u623F\u6E90\u5BA1\u6838\u4E2D
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.CANCEL=\u5DF2\u64A4\u6D88
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.SELL=\u5DF2\u552E
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.SHOWING=\u623F\u6E90\u5C55\u793A\u4E2D
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.CHECK_PASS=\u623F\u6E90\u5BA1\u6838\u901A\u8FC7
cc.coopersoft.house.sale.data.HouseSource$HouseSourceStatus.CHECK_FAIL=\u623F\u6E90\u5BA1\u6838\u9A73\u56DE
cc.coopersoft.house.HousePowerCard.NEW_OWNER=\u4E0D\u52A8\u4EA7\u6743\u8BC1
cc.coopersoft.house.HousePowerCard.OWNER_RSHIP=\u623F\u4EA7\u8BC1
cc.coopersoft.house.HousePowerCard.POOL_RSHIP=\u5171\u6709\u6743\u8BC1
cc.coopersoft.house.sale.data.HouseSaleInfo$ShowAreaType.TO_END_TIME=\u6307\u5B9A\u65E5\u671F
cc.coopersoft.house.sale.data.HouseSaleInfo$ShowAreaType.TO_SELL=\u7B7E\u7EA6\u65E5\u671F
datetime_pattern=yyyy\u5E74MM\u6708dd\u65E5 HH:mm:ss
cc.coopersoft.house.UseType.DWELLING_KEY=\u4F4F\u5B85
cc.coopersoft.house.UseType.SHOP_HOUSE_KEY=\<KEY>
cc.coopersoft.house.UseType.STORE_HOUSE=\u4ED3\u50A8\u7528\u623F
cc.coopersoft.house.UseType.OFFICE=\u529E\u516C\u7528\u623F
cc.coopersoft.house.UseType.CAR_STORE=\u8F66\u5E93
cc.coopersoft.house.UseType.INDUSTRY=\u5DE5\u4E1A\u7528\u623F
cc.coopersoft.house.UseType.OTHER=\u5176\u5B83
contract_commited=\u5408\u540C\u5DF2\u63D0\u4EA4\u81F3\u4EA4\u6613\u5907\u6848\u7BA1\u7406\u90E8\u95E8
house_source_committed=\u623F\u6E90\u5DF2\u63D0\u4EA4\u81F3\u4EA4\u6613\u5907\u6848\u7BA1\u7406\u90E8\u95E8<file_sep>/src/main/webapp/seller/houseSourceView.xhtml
<?xml version='1.0' encoding='UTF-8' ?>
<!DOCTYPE html>
<ui:composition template="/seller/layout/pageTemplate.xhtml" xmlns="http://www.w3.org/1999/xhtml"
xmlns:h="http://java.sun.com/jsf/html"
xmlns:f="http://java.sun.com/jsf/core"
xmlns:b="http://bootsfaces.net/ui"
xmlns:p="http://primefaces.org/ui"
xmlns:c="http://java.sun.com/jstl/core"
xmlns:pt="http://xmlns.jcp.org/jsf/passthrough"
xmlns:ui="http://java.sun.com/jsf/facelets" xmlns:o="http://omnifaces.org/ui">
<h:outputStylesheet>
.head-title-panel.panel{
background-color: #fafafa;
}
.head-title-panel h2{
color: #4078c0;
margin-top: 9px;
margin-bottom: 16px;
font-size: 18px;
font-weight: normal;
}
</h:outputStylesheet>
<b:modal id="delete-modal" size="modal-sm" title="删除确认" styleClass="modalDeleteClass">
<p>删除合同确认,房源及合同的所有信息将会丢失!</p>
<f:facet name="footer">
<h:form>
<b:commandButton action="#{houseSaleInfoEdit.deleteHouseSource}" look="danger" style="width:100%" value="我确认要删除此房源">
<f:param name="houseSaleInfoId" value="#{houseSourceHome.instance.id}"/>
</b:commandButton>
</h:form>
</f:facet>
</b:modal>
<b:row style="margin-top: 20px;">
<b:column>
<b:messages globalOnly="true" styleClass="top-messages"/>
</b:column>
<b:column rendered="#{houseSourceHome.join or houseSourceHome.instance.allowJoin}">
<b:panel styleClass="head-title-panel">
<h:form>
<b:buttonGroup styleClass="right" style="margin-left:5px">
<b:button iconAwesome="plus" tooltip="加入" disabled="#{houseSourceHome.join}"/>
<b:commandButton action="#{contractCreate.sourceToContract}"
disabled="#{not houseSourceHome.allowContract}" iconAwesome="file-pdf-o" tooltip="签约">
<f:param name="houseSaleInfoId" value="#{houseSourceHome.instance.id}"/>
</b:commandButton>
<b:button disabled="#{not houseSourceHome.allowEdit}"
href="#{request.contextPath}/seller/apply/houseShowInfo.xhtml?houseSaleInfoId=#{houseSourceHome.instance.id}"
tooltip="修改房源信息" iconAwesome="pencil">
</b:button>
<b:button tooltip="删除" iconAwesome="trash"
pt:data-target="#delete-modal" pt:data-toggle="modal"
disabled="#{not houseSourceHome.allowDelete}"/>
<b:dropButton icon="print" value="打印" >
<b:navLink value="交易合同" target="_blank" disabled="#{not houseSourceHome.allowPrintSellContract}" href="#{runParam.getStringParam('contract_address')}#{houseSourceHome.houseSourceCompany.houseContract.fileId}"/>
<b:navCommandLink value="委托合同" action="#{houseSourceHome.printAgentPdf}" target="_blank">
<f:param name="houseSaleInfoId" value="#{houseSourceHome.instance.id}"/>
</b:navCommandLink>
<b:navCommandLink value="见证书" action="#{contractHome.printSeeContract}" disabled="#{empty houseSourceHome.contractId}" target="_blank">
<f:param name="contractId" value="#{houseSourceHome.contractId}"/>
<f:param name="houseSaleInfoId" value="#{houseSourceHome.instance.id}"/>
</b:navCommandLink>
</b:dropButton>
</b:buttonGroup>
</h:form>
<h2><h:outputText rendered="#{not empty houseSourceHome.instance.houseSaleInfo}" value="#{houseSourceHome.instance.houseSaleInfo.title}/"/>#{houseSourceHome.instance.sourceId}</h2>
<p>
<h:outputText value="#{houseSourceHome.instance.address}"/>
</p>
<p class="f6 text-gray mb-0 mt-2">
建立时间:<h:outputText value="#{houseSourceHome.instance.applyTime}"><f:convertDateTime locale="#{i18n.locale}" timeZone="#{i18n.timeZone}" pattern="#{messages.datetime_pattern}"/></h:outputText>
更新时间:<h:outputText value="#{houseSourceHome.instance.updateTime}"><f:convertDateTime locale="#{i18n.locale}" timeZone="#{i18n.timeZone}" pattern="#{messages.datetime_pattern}"/></h:outputText>
审核时间:<h:outputText value="#{houseSourceHome.instance.checkTime}"><f:convertDateTime locale="#{i18n.locale}" timeZone="#{i18n.timeZone}" pattern="#{messages.datetime_pattern}"/></h:outputText>
签约时间:<h:outputText value="#{houseSourceHome.instance.checkTime}"><f:convertDateTime locale="#{i18n.locale}" timeZone="#{i18n.timeZone}" pattern="#{messages.datetime_pattern}"/></h:outputText>
</p>
</b:panel>
</b:column>
<b:column rendered="#{houseSourceHome.join and (houseSourceHome.instance.status.name() eq 'CHECK_FAIL')}">
<b:alert severity="warning"><strong>房源审核失败!</strong> #{houseSourceHome.instance.messages}. </b:alert>
</b:column>
<b:column rendered="#{houseSourceHome.join or houseSourceHome.instance.allowJoin}">
<b:tabView>
<b:tab title="房源信息">
<f:facet name="anchor"><b:iconAwesome name="home" /></f:facet>
<div style="padding: 20px 15px ">
<b:row styleClass="details-info" rendered="#{houseSourceHome.join}">
<b:column span="12">
<label for="sale_title">标题</label>
<span class="info-value-block" id="sale_title">
#{houseSourceHome.houseSourceCompany.title}
</span>
</b:column>
<b:column span="12">
<label for="sale_memo">描述</label>
<span class="info-value-block" id="sale_memo">
#{houseSourceHome.houseSourceCompany.memo}
</span>
</b:column>
<b:column span="3">
<label for="owner_name">产权人</label>
<span class="info-value-block" id="owner_name">
#{houseSourceHome.instance.personName}
</span>
</b:column>
<b:column span="6">
<label for="owner_card">#{enumHelper.getLabel(houseSourceHome.instance.credentialsType)}</label>
<span class="info-value-block" id="owner_card">
#{houseSourceHome.instance.credentialsNumber}
</span>
</b:column>
<b:column span="3">
<label for="owner_tel">联系电话</label>
<span class="info-value-block" id="owner_tel">
#{houseSourceHome.instance.tel}
</span>
</b:column>
<b:column span="3">
<label for="proxy_person">联系人</label>
<span class="info-value-block" id="proxy_person">
#{houseSourceHome.instance.powerProxyPerson.personName}
</span>
</b:column>
<b:column span="6">
<label for="proxy_card">#{enumHelper.getLabel(houseSourceHome.instance.powerProxyPerson.credentialsType)}</label>
<span class="info-value-block" id="proxy_card">
#{houseSourceHome.instance.powerProxyPerson.credentialsNumber}
</span>
</b:column>
<b:column span="3">
<label for="proxy_tel">联系电话</label>
<span class="info-value-block" id="proxy_tel">
#{houseSourceHome.instance.powerProxyPerson.phone}
</span>
</b:column>
</b:row>
<b:row styleClass="details-info">
<b:column span="3">
<label for="house_source_code">房源编号</label>
<span class="info-value-block" id="house_source_code">
#{houseSourceHome.instance.sourceId}
</span>
</b:column>
<b:column span="3">
<label for="house_code">房屋编号</label>
<span class="info-value-block" id="house_code">
#{houseSourceHome.instance.houseCode}
</span>
</b:column>
<b:column span="3">
<label for="status">状态</label>
<span class="info-value-block" id="status">
#{enumHelper.getLabel(houseSourceHome.instance.status)}
</span>
</b:column>
<b:column span="3">
<label for="business_id">审核编号</label>
<span class="info-value-block" id="business_id">
#{houseSourceHome.instance.businessId}
</span>
</b:column>
<b:column span="3">
<label for="s_floor_count">总层数</label>
<span class="info-value-block" id="s_floor_count">
#{houseSourceHome.instance.floorCount}
</span>
</b:column>
<b:column span="3">
<label for="in_floor">所在层</label>
<span class="info-value-block" id="in_floor">
#{houseSourceHome.instance.houseSaleInfo.inFloor}(#{houseSourceHome.instance.inFloorName})
</span>
</b:column>
<b:column span="3">
<label for="district">城区</label>
<span class="info-value-block" id="district">
#{serverWord.getDistrictName(houseSourceHome.instance.district) }
</span>
</b:column>
<b:column span="3">
<label for="use_type">设计用途</label>
<span class="info-value-block" id="use_type">
#{enumHelper.getLabel(houseSourceHome.instance.useType)}(#{houseSourceHome.instance.designUseType})
</span>
</b:column>
<b:column span="3">
<label for="house_area">建筑面积</label>
<span class="info-value-block" id="house_area">
<h:outputText value="#{houseSourceHome.instance.houseArea}">
<f:convertNumber pattern="#0.00"/>
</h:outputText>
</span>
</b:column>
<b:column span="3">
<label for="use_area">套内面积</label>
<span class="info-value-block" id="use_area">
<h:outputText value="#{houseSourceHome.instance.useArea}">
<f:convertNumber pattern="#0.00"/>
</h:outputText>
</span>
</b:column>
<b:column span="3">
<label for="structure">结构</label>
<span class="info-value-block" id="structure">
<h:outputText value="#{houseSourceHome.instance.structure}">
</h:outputText>
</span>
</b:column>
<b:column span="9">
<label for="power_card">#{enumHelper.getLabel(houseSourceHome.instance.powerCardType)}</label>
<span class="info-value-block" id="power_card">
#{houseSourceHome.instance.powerCardNumber}
</span>
</b:column>
<b:column span="3">
<label for="allow_join">独占房源</label>
<span class="info-value-block" id="allow_join">
#{houseSourceHome.instance.allowJoin ? '否' : '是'}
</span>
</b:column>
</b:row>
</div>
</b:tab>
<b:tab title="房源描述" disabled="#{empty houseSourceHome.instance.houseSaleInfo}">
<f:facet name="anchor"><b:iconAwesome name="tag" /></f:facet>
<p:outputPanel style="padding: 20px 15px " rendered="#{not empty houseSourceHome.instance.houseSaleInfo}">
<b:row styleClass="details-info">
<b:column>
<label for="show_title">标题</label>
<span class="info-value-block" id="show_title">
#{houseSourceHome.instance.houseSaleInfo.title}
</span>
</b:column>
<b:column>
<label for="show_tags">关键词</label>
<span class="info-value-block" id="show_tags">
#{houseSourceHome.instance.houseSaleInfo.tags}
</span>
</b:column>
<b:column>
<label for="show_description">房源描述</label>
<span class="info-value-block" id="show_description">
#{houseSourceHome.instance.houseSaleInfo.description}
</span>
</b:column>
<b:column>
<label for="environment">环境描述</label>
<span class="info-value-block" id="environment">
#{houseSourceHome.instance.houseSaleInfo.environment}
</span>
</b:column>
<b:column>
<label for="decorate">装修情况</label>
<span class="info-value-block" id="decorate">
#{houseSourceHome.instance.houseSaleInfo.decorate}
</span>
</b:column>
<b:column span="3">
<label for="sale_area">销售区域</label>
<span class="info-value-block" id="sale_area">
#{saleAreaCache.getSaleAreaNameById(houseSourceHome.instance.houseSaleInfo.localArea) }
</span>
</b:column>
<b:column span="3">
<label for="school_area">学区</label>
<span class="info-value-block" id="school_area">
#{saleAreaCache.getSaleAreaNameById(houseSourceHome.instance.houseSaleInfo.schoolArea) }
</span>
</b:column>
<b:column span="3">
<label for="elevator">电梯</label>
<span class="info-value-block" id="elevator">
#{houseSourceHome.instance.houseSaleInfo.elevator ? "有" : "无"}
</span>
</b:column>
<b:column span="3">
<label for="floor_count">总层数</label>
<span class="info-value-block" id="floor_count">
#{houseSourceHome.instance.houseSaleInfo.floorCount}
</span>
</b:column>
<b:column span="3">
<label for="type_size">套型</label>
<span class="info-value-block" id="type_size">
#{houseSourceHome.instance.houseSaleInfo.roomCount}室#{houseSourceHome.instance.houseSaleInfo.livingRoom}厅#{houseSourceHome.instance.houseSaleInfo.kitchenCount}厨#{houseSourceHome.instance.houseSaleInfo.toiletCount}卫
</span>
</b:column>
<b:column span="3">
<label for="create_year">建成年份</label>
<span class="info-value-block" id="create_year">
#{houseSourceHome.instance.houseSaleInfo.createYear}
</span>
</b:column>
<b:column span="3">
<label for="direction">朝向</label>
<span class="info-value-block" id="direction">
#{houseSourceHome.instance.houseSaleInfo.direction}
</span>
</b:column>
<b:column span="3">
<label for="price">单价</label>
<span class="info-value-block" id="price">
<h:outputText value="#{houseSourceHome.instance.houseSaleInfo.price}">
<f:convertNumber pattern="#0.00"/>
</h:outputText>
</span>
</b:column>
<b:column span="3">
<label for="money">总价</label>
<span class="info-value-block" id="money">
<h:outputText value="#{houseSourceHome.instance.houseSaleInfo.sumPrice}">
<f:convertNumber pattern="#0.00"/>
</h:outputText>
</span>
</b:column>
<b:column span="3">
<label for="show_area_type">挂牌到期时间</label>
<span class="info-value-block" id="show_area_type">
<h:outputText rendered="#{houseSourceHome.instance.houseSaleInfo.showAreaType eq 'TO_END_TIME'}" value="#{houseSourceHome.instance.houseSaleInfo.endTime}">
<f:convertDateTime pattern="yyyy-MM-dd"/>
</h:outputText>
<h:outputText value="无" rendered="#{houseSourceHome.instance.houseSaleInfo.showAreaType eq 'TO_SELL'}"/>
</span>
</b:column>
</b:row>
</p:outputPanel>
</b:tab>
<b:tab title="房源图片" disabled="#{empty houseSourceHome.instance.houseSaleInfo}">
<f:facet name="anchor"><b:iconAwesome name="file-photo-o" /></f:facet>
<p:outputPanel rendered="#{not empty houseSourceHome.instance.houseSaleInfo}">
<b:row>
<c:forEach items="#{houseSourceHome.houseSalePicList}" var="_pic">
<b:column col-xs="6" col-md="3">
<b:thumbnail>
<img data-src="#{runParam.getStringParam('nginx_address')}/img/200x200/#{_pic.id}" alt="房源图片"/>
</b:thumbnail>
</b:column>
</c:forEach>
</b:row>
</p:outputPanel>
</b:tab>
</b:tabView>
</b:column>
<b:column rendered="#{not houseSourceHome.join and not houseSourceHome.instance.allowJoin }" styleClass="blankslate" span="9">
<svg aria-hidden="true" class="octicon octicon-search blankslate-icon" height="32" version="1.1" viewBox="0 0 16 16" width="32"><path fill-rule="evenodd" d="M15.7 13.3l-3.81-3.83A5.93 5.93 0 0 0 13 6c0-3.31-2.69-6-6-6S1 2.69 1 6s2.69 6 6 6c1.3 0 2.48-.41 3.47-1.11l3.83 3.81c.19.2.45.3.7.3.25 0 .52-.09.7-.3a.996.996 0 0 0 0-1.41v.01zM7 10.7c-2.59 0-4.7-2.11-4.7-4.7 0-2.59 2.11-4.7 4.7-4.7 2.59 0 4.7 2.11 4.7 4.7 0 2.59-2.11 4.7-4.7 4.7z"></path></svg>
<h3>
此房源为独占房源,不能查看和加入!
</h3>
委托人与中介经济机构所签委托合同中已放弃自己出售及委托其他机构出售房屋的权利,见委托合同第五条第一节.
</b:column>
</b:row>
</ui:composition><file_sep>/src/main/java/cc/coopersoft/common/BatchData.java
package cc.coopersoft.common;
import java.util.ArrayList;
import java.util.List;
/**
* Created by cooper on 6/21/16.
*/
public class BatchData<E> implements java.io.Serializable{
private E data;
private boolean selected;
public BatchData(E data, boolean selected) {
this.data = data;
this.selected = selected;
}
public BatchData(E data) {
this.data = data;
this.selected = false;
}
public boolean isSelected() {
return selected;
}
public void setSelected(boolean selected) {
this.selected = selected;
}
public E getData() {
return data;
}
public void setData(E data) {
this.data = data;
}
public static <T> List<BatchData<T>> fillData(List<T> datas){
List<BatchData<T>> result = new ArrayList<BatchData<T>>(datas.size());
for(T data: datas){
result.add(new BatchData<T>(data));
}
return result;
}
}
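// Usage sketch (illustrative, variable names hypothetical): wrap query results so a JSF table can
// bind a checkbox to "selected" without touching the entities themselves, e.g.
//   List<BatchData<HouseSource>> rows = BatchData.fillData(sources);
//   rows.get(0).setSelected(true);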
<file_sep>/pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>cc.coopersoft.house</groupId>
<artifactId>participant</artifactId>
<packaging>war</packaging>
<version>1.0-SNAPSHOT</version>
<name>participant Maven Webapp</name>
<url>http://maven.apache.org</url>
<repositories>
<repository>
<id>in-project</id>
<name>In Project Repo</name>
<url>file://${project.basedir}/lib/</url>
</repository>
<!--
<repository>
<id>prime-repo</id>
<name>PrimeFaces Maven Repository</name>
<url>http://repository.primefaces.org</url>
<layout>default</layout>
</repository>
-->
</repositories>
<properties>
<deltaspike.version>1.7.2</deltaspike.version>
<primefaces.version>6.1</primefaces.version>
<bootsfaces.version>1.2.0</bootsfaces.version>
<omnifaces.version>2.6.1</omnifaces.version>
<version.picketlink.javaee.bom>2.7.1.Final</version.picketlink.javaee.bom>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<javase.version>1.8</javase.version>
<javaee.version>7.0</javaee.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.apache.deltaspike.distribution</groupId>
<artifactId>distributions-bom</artifactId>
<version>${deltaspike.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>org.picketlink</groupId>
<artifactId>picketlink-javaee-7.0</artifactId>
<version>${version.picketlink.javaee.bom}</version>
<scope>import</scope>
<type>pom</type>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>local.house</groupId>
<artifactId>HouseInterface</artifactId>
<version>2.0.135Bate</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>itextpdf</artifactId>
<version>5.5.12</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>itext-asian</artifactId>
<version>5.2.0</version>
</dependency>
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcprov-jdk15on</artifactId>
<version>1.59</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5.2</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpmime</artifactId>
<version>4.5.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.7.6</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.7.6</version>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20090211</version>
</dependency>
<!-- deltaspike -->
<dependency>
<groupId>org.apache.deltaspike.core</groupId>
<artifactId>deltaspike-core-api</artifactId>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.core</groupId>
<artifactId>deltaspike-core-impl</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-jpa-module-api</artifactId>
<version>${deltaspike.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-jpa-module-impl</artifactId>
<version>${deltaspike.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-security-module-api</artifactId>
<version>${deltaspike.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-security-module-impl</artifactId>
<version>${deltaspike.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-proxy-module-api</artifactId>
<version>${deltaspike.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-proxy-module-impl-asm5</artifactId>
<version>${deltaspike.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-jsf-module-api</artifactId>
<version>${deltaspike.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-jsf-module-impl</artifactId>
<version>${deltaspike.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-partial-bean-module-api</artifactId>
<version>${deltaspike.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-partial-bean-module-impl</artifactId>
<version>${deltaspike.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-data-module-api</artifactId>
<version>${deltaspike.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.modules</groupId>
<artifactId>deltaspike-data-module-impl</artifactId>
<version>${deltaspike.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.picketlink</groupId>
<artifactId>picketlink-api</artifactId>
</dependency>
<dependency>
<groupId>org.picketlink</groupId>
<artifactId>picketlink-impl</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.picketlink</groupId>
<artifactId>picketlink-deltaspike</artifactId>
<version>2.7.0.Final</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-web-api</artifactId>
<version>${javaee.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.omnifaces</groupId>
<artifactId>omnifaces</artifactId>
<version>${omnifaces.version}</version>
</dependency>
<dependency>
<groupId>net.bootsfaces</groupId>
<artifactId>bootsfaces</artifactId>
<version>${bootsfaces.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.primefaces</groupId>
<artifactId>primefaces</artifactId>
<version>${primefaces.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.primefaces.extensions</groupId>
<artifactId>all-themes</artifactId>
<version>1.0.8</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>3.14</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.14</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml-schemas</artifactId>
<version>3.14</version>
</dependency>
</dependencies>
<build>
<defaultGoal>package</defaultGoal>
<resources>
<resource>
<directory>src/main/resources</directory>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-checkstyle-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<webResources>
<resource>
<directory>src/main/webapp</directory>
<filtering>true</filtering>
<excludes>
<exclude>**/*.swf</exclude>
<exclude>**/*.ico</exclude>
<exclude>**/*.eto</exclude>
<exclude>**/*.ttf</exclude>
<exclude>**/*.woff</exclude>
</excludes>
</resource>
<resource>
<directory>src/main/webapp</directory>
<filtering>false</filtering>
<includes>
<include>**/*.swf</include>
<include>**/*.ico</include>
<include>**/*.eto</include>
<include>**/*.ttf</include>
<include>**/*.woff</include>
</includes>
</resource>
</webResources>
<archive>
<manifest>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
</manifest>
</archive>
<warName>participant</warName>
<webappDirectory>${project.build.directory}/participant_SNAPSHOT.war/</webappDirectory>
</configuration>
</plugin>
</plugins>
</build>
</project>
<file_sep>/src/main/webapp/frame/FileReader.xhtml
<!DOCTYPE composition PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<ui:composition xmlns="http://www.w3.org/1999/xhtml"
xmlns:ui="http://java.sun.com/jsf/facelets"
xmlns:f="http://java.sun.com/jsf/core"
xmlns:p="http://primefaces.org/ui"
xmlns:pt="http://xmlns.jcp.org/jsf/passthrough"
xmlns:b="http://bootsfaces.net/ui"
xmlns:h="http://java.sun.com/jsf/html">
<link type="text/css" rel="stylesheet" href="/participant/javax.faces.resource/fancybox.css.xhtml"/>
<iframe style="display:none;" class="call_extend_iframe" src=""/>
<h:outputScript>
function imgClick(fid){
console.log('#{runParam.getStringParam('public_api_server')}/img/orig/' + fid + '.jpg');
$.fancybox(
{
href: '#{runParam.getStringParam('public_api_server')}/img/orig/' + fid + '.jpg',
title: '图片',
closeBtn: true,
openEffect : 'elastic',
openSpeed : 150,
closeEffect : 'elastic',
closeSpeed : 150,
}
);
}
function selectFileUpload(typeId,title){
$('.call_extend_iframe').attr('src', "ExtendsFileSelect:// -key='" + typeId + "' -title='" + title + "'");
}
function cameraUpload(typeId,title){
$('.call_extend_iframe').attr('src', "ExtendsCamera://-key='" + typeId + "' -title='" + title + "'");
}
function loadContractFile() {
$.ajax({
url: "#{runParam.getStringParam('public_api_server')}/api/protected/get-contract-file",
data: {key: '#{contractHome.instance.id}'},
type: 'GET',
dataType: 'json',
crossDomain: true,
async: false,
success: function(data){
console.log('get data:' + data);
$.each(data,function(idx,obj) {
if ($('[data-commit-fileid="' + obj.ID + '"]').length == 0){
$('.js-commit-files').append('<div class="col-xs-6 col-md-3 js-commit-file-gallery"><div class="bmbox" onclick="removeFile(\'' + obj.ID + '\')">删除</div><a href="#" onclick="imgClick(\'' + obj.ID + '\');return false;" data-commit-fileid="' + obj.ID + '" class="thumbnail"><img src="#{runParam.getStringParam('public_api_server')}/img/480x320s/' + obj.ID + '" alt="..."/></a></div>')
}
});
},
error: function(){
//alert(arguments[1]);
console.log('error:' + arguments[1]);
}
});
}
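// Poll once a second so files uploaded through the external helpers (the ExtendsFileSelect /
// ExtendsCamera iframe protocols above) appear in the gallery without a full page refresh.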
setInterval(loadContractFile, 1000);
function bindDeleteBtn(){
$(".js-commit-file-gallery").unbind('hover');
$(".js-commit-file-gallery").hover(
function () {
$('.bmbox:first',this).show(100);
},
function () {
$('.bmbox:first',this).hide(100);
});
}
$(document).ready(bindDeleteBtn);
</h:outputScript>
<h:outputScript library="fancybox" name="jquery.fancybox.pack.js"/>
</ui:composition><file_sep>/src/main/java/cc/coopersoft/common/util/Authorization.java
package cc.coopersoft.common.util;
import org.picketlink.Identity;
import org.picketlink.authorization.util.AuthorizationUtil;
import org.picketlink.idm.PartitionManager;
import javax.inject.Inject;
import javax.inject.Named;
/**
* Created by cooper on 22/02/2017.
*/
@Named("auth")
public class Authorization {
@Inject
private PartitionManager partitionManager;
@Inject
private Identity identity;
public boolean hasRole(String roleName){
//BasicModel.hasRole()
return AuthorizationUtil.hasRole(identity,partitionManager,roleName);
}
}
<file_sep>/design/create.sql
SET SESSION FOREIGN_KEY_CHECKS=0;
DROP TABLE IF EXISTS CONTRACT.SELL_SHOW_IMG;
DROP TABLE IF EXISTS CONTRACT.HOUSE_SELL_INFO;
DROP TABLE IF EXISTS CONTRACT.SELL_PROXY_PERSON;
DROP TABLE IF EXISTS CONTRACT.HOUSE_SOURCE;
DROP TABLE IF EXISTS CONTRACT.COMPANY_SELL_INFO;
DROP TABLE IF EXISTS CONTRACT.SELL_AREA;
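-- MONEY_MANAGER and OLD_HOUSE_MONEY below share the contract id as their primary key
-- (OLD_HOUSE_MONEY.ID -> MONEY_MANAGER.ID -> HOUSE_CONTRACT.ID), making them one-to-one extensions of a contract row.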
CREATE TABLE CONTRACT.MONEY_MANAGER
(
ID varchar(32) NOT NULL,
BANK varchar(32) NOT NULL,
BANK_NAME varchar(128) NOT NULL,
ACCOUNT varchar(32) NOT NULL,
MONEY decimal(19,4) NOT NULL,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
ALTER TABLE CONTRACT.MONEY_MANAGER
ADD FOREIGN KEY (ID)
REFERENCES CONTRACT.HOUSE_CONTRACT (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
CREATE TABLE CONTRACT.OLD_HOUSE_MONEY
(
ID varchar(32) NOT NULL,
BANK_NAME varchar(512) NOT NULL,
CARD_NAME varchar(50) NOT NULL,
CARD_NUMBER varchar(128) NOT NULL,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
ALTER TABLE CONTRACT.OLD_HOUSE_MONEY
ADD FOREIGN KEY (ID)
REFERENCES CONTRACT.MONEY_MANAGER (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
CREATE TABLE CONTRACT.SELL_SHOW_IMG
(
ID varchar(32) NOT NULL,
TITLE varchar(64) NOT NULL,
DESCRIPTION varchar(512),
HOUSE_SELL_INFO varchar(32) NOT NULL,
PRI int NOT NULL ,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
CREATE TABLE CONTRACT.HOUSE_SELL_INFO
(
ID varchar(32) NOT NULL,
TITLE varchar(64) NOT NULL,
TAGS varchar(512),
DESCRIPTION varchar(512),
ENVIRONMENT varchar(512),
LAT decimal(18,14),
LNG decimal(18,14),
ZOOM int,
ROOM_COUNT int NOT NULL,
LIVING_ROOM int NOT NULL,
METRO_AREA varchar(32),
DECORATE varchar(512),
ELEVATOR boolean,
COVER varchar(32),
IN_FLOOR int NOT NULL,
CREATE_YEAR int NOT NULL,
DIRECTION varchar(32),
ADDRESS varchar(200) NOT NULL,
LOCAL_AREA varchar(32),
SCHOOL_AREA varchar(32),
FLOOR_COUNT int NOT NULL,
KITCHEN_COUNT int NOT NULL,
TOILET_COUNT int NOT NULL,
PRICE decimal(19,4) NOT NULL,
SUM_PRICE decimal(19,4) NOT NULL,
END_TIME datetime,
TIME_AREA_TYPE varchar(16) NOT NULL,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
CREATE TABLE CONTRACT.HOUSE_SOURCE
(
ID varchar(32) NOT NULL,
POWER_CARD_NUMBER varchar(50) NOT NULL,
CREDENTIALS_TYPE varchar(32) NOT NULL,
CREDENTIALS_NUMBER varchar(100) NOT NULL,
OWNER_NAME varchar(50) NOT NULL,
TEL varchar(16) NOT NULL,
MASTER_GROUP_ID varchar(32) NOT NULL,
VERSION bigint NOT NULL,
CHECK_VIEW varchar(200),
HOUSE_CODE varchar(32) NOT NULL,
SOURCE_ID varchar(32) NOT NULL,
TYPE varchar(20) NOT NULL,
APPLY_TIME datetime NOT NULL,
STATUS varchar(20) NOT NULL,
SEARCH_KEY varchar(1024),
CHECK_TIME datetime NOT NULL,
BUSINESS_ID varchar(32),
MESSAGES longtext,
UPDATE_TIME timestamp NOT NULL,
POWER_CARD_TYPE varchar(16) NOT NULL,
ALLOW_JOIN boolean NOT NULL,
ADDRESS varchar(200),
MSG_TYPE varchar(20),
USE_TYPE varchar(32) NOT NULL,
MAP_NUMBER varchar(4),
BLOCK_NUMBER varchar(10) NOT NULL,
BUILD_NUMBER varchar(24) NOT NULL,
HOUSE_ORDER varchar(20) NOT NULL,
STRUCTURE varchar(32),
IN_FLOOR_NAME varchar(50),
FLOOR_COUNT int,
DISTRICT varchar(32) NOT NULL,
HOUSE_AREA decimal(18,3) NOT NULL,
USE_AREA decimal(18,3),
DESIGN_USE_TYPE varchar(512) NOT NULL,
PRICE decimal(19,4) NOT NULL,
SUM_PRICE decimal(19,4) NOT NULL,
TIME_AREA_TYPE varchar(16) NOT NULL,
END_TIME datetime,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
CREATE TABLE CONTRACT.SELL_PROXY_PERSON
(
ID varchar(32) NOT NULL,
TYPE varchar(16) NOT NULL,
PERSON_NAME varchar(50) NOT NULL,
CREDENTIALS_TYPE varchar(32) NOT NULL,
CREDENTIALS_NUMBER varchar(100) NOT NULL,
ADDRESS varchar(200),
TEL varchar(15) NOT NULL,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
CREATE TABLE CONTRACT.COMPANY_SELL_INFO
(
GROUP_ID varchar(32) NOT NULL,
ID varchar(32) NOT NULL,
HOUSE_SOURCE varchar(32) NOT NULL,
CONTRACT longtext,
TITLE varchar(100) NOT NULL,
MEMO longtext,
SALE_CONTRACT varchar(32),
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
CREATE TABLE CONTRACT.NUMBER_SEQUENCE
(
ID varchar(32) NOT NULL,
TYPE varchar(16) NOT NULL,
NUMBER bigint NOT NULL,
VERSION bigint NOT NULL,
UPDATE_TIME datetime NOT NULL,
MIN_LEN int NOT NULL,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
ALTER TABLE CONTRACT.SELL_SHOW_IMG
ADD FOREIGN KEY (HOUSE_SELL_INFO)
REFERENCES CONTRACT.HOUSE_SELL_INFO (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
ALTER TABLE CONTRACT.HOUSE_SELL_INFO
ADD FOREIGN KEY (ID)
REFERENCES CONTRACT.HOUSE_SOURCE (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
ALTER TABLE CONTRACT.SELL_PROXY_PERSON
ADD FOREIGN KEY (ID)
REFERENCES CONTRACT.HOUSE_SOURCE (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
ALTER TABLE CONTRACT.COMPANY_SELL_INFO
ADD FOREIGN KEY (HOUSE_SOURCE)
REFERENCES CONTRACT.HOUSE_SOURCE (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
ALTER TABLE CONTRACT.COMPANY_SELL_INFO
ADD FOREIGN KEY (SALE_CONTRACT)
REFERENCES CONTRACT.HOUSE_CONTRACT (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
ALTER TABLE HOUSE_CONTRACT ADD FILE_ID varchar(32) NULL;
ALTER TABLE HOUSE_CONTRACT ADD COMMIT_TIME datetime NULL;
ALTER TABLE OLD_HOUSE_CONTRACT ADD SELLER_POOL_TYPE VARCHAR(32) NULL;
INSERT INTO CONTRACT.SYSTEM_PARAM(ID,VALUE,TYPE) VALUES('contract_address','http://192.168.248.246/','STRING');
INSERT INTO CONTRACT.SYSTEM_PARAM(ID,VALUE,TYPE) VALUES('map.lat','40.12902283','STRING');
INSERT INTO CONTRACT.SYSTEM_PARAM(ID,VALUE,TYPE) VALUES('map.lng','124.33854311','STRING');
INSERT INTO CONTRACT.SYSTEM_PARAM(ID,VALUE,TYPE) VALUES('map.zoom','14','STRING');
INSERT INTO CONTRACT.SYSTEM_PARAM(ID,VALUE,TYPE) VALUES('ZONE_CODE','DG','STRING');
INSERT INTO HOUSE_INFO.MONEY_BANK(ID,NAME,KEYT,SECRET_TYPE,PRI,ACCOUNT_NUMBER)
VALUES ('DDBK','丹东银行','a1234567a1234567','NONE',1,'12345678');
-- http://192.168.248.248:8090/api/money/old/contract?bank=DDBK&account=12345678&contract=CDG201803203
-- http://192.168.248.248:8090/bank/money/old/contract/DDBK/12345678/CDG201803203
INSERT INTO MONEY_BUSINESS(ID,BUSINESS,STATUS,BANK,BANK_NAME,ACCOUNT_NUMBER,VER,VERSION,MONEY,CONTRACT)
VALUES ('test1','WP56-20180320013','CREATED','DDBK','丹东银行','12345678',1,1,2,'ff8080816243019001624329991c000f');
INSERT INTO MONEY_PAY_INFO(ID,BANK_NAME,CARD_NUMBER,CARD_NAME) VALUES ('test1','付款开户行','付款卡号','持卡人姓名');
-- http://192.168.248.248:8090/api/money/old/pay?bank=DDBK&account=12345678
-- http://192.168.248.248:8090/bank/money/old/pay/DDBK/12345678
INSERT INTO MONEY_BUSINESS(ID,BUSINESS,STATUS,BANK,BANK_NAME,ACCOUNT_NUMBER,VER,VERSION,MONEY,CONTRACT)
VALUES ('test2','WP56-20180320012','CREATED','DDBK','丹东银行','12345678',1,1,2,'ff80808162430190016243268e140008');
INSERT INTO MONEY_PAY_INFO(ID,BANK_NAME,CARD_NUMBER,CARD_NAME) VALUES ('test2','付款开户行','付款卡号','持卡人姓名');
INSERT INTO MONEY_BUSINESS(ID,BUSINESS,STATUS,BANK,BANK_NAME,ACCOUNT_NUMBER,VER,VERSION,MONEY,CONTRACT)
VALUES ('test3','WP56-20180320011','CREATED','DDBK','丹东银行','12345678',1,1,2,'ff80808162430190016243234c700001');
INSERT INTO MONEY_PAY_INFO(ID,BANK_NAME,CARD_NUMBER,CARD_NAME) VALUES ('test3','付款开户行','付款卡号','持卡人姓名');
-- curl -X POST --data 'data={"dir":"IN","money":800,"contract":"f5f5f5","seq":"999","time":"2018-3-31 13:08:01","timestamp": "2018-3-31 13:08:01"}' http://192.168.248.248:8090/bank/money/old/report/DDBK/12345678<file_sep>/src/main/java/cc/coopersoft/common/util/Resources.java
package cc.coopersoft.common.util;
import javax.enterprise.context.RequestScoped;
import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;
import javax.faces.context.FacesContext;
import javax.inject.Named;
import java.util.ResourceBundle;
import java.util.logging.Logger;
/**
* Created by cooper on 6/5/16.
*/
public class Resources {
@Produces
public Logger produceLog(InjectionPoint injectionPoint) {
return Logger.getLogger(injectionPoint.getMember().getDeclaringClass().getName());
}
@Produces
@DefaultMessageBundle
public ResourceBundle getBundle() {
FacesContext context = FacesContext.getCurrentInstance();
return context.getApplication().getResourceBundle(context, "messages");
}
@Named
@Produces
@RequestScoped
public FacesContext getFacesContext() {
return FacesContext.getCurrentInstance();
}
}
<file_sep>/src/main/java/cc/coopersoft/common/tools/faces/convert/BigMoneyConverter.java
package cc.coopersoft.common.tools.faces.convert;
import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.convert.ConverterException;
import java.math.BigDecimal;
import java.text.DecimalFormat;
import java.text.NumberFormat;
/**
* Created by cooper on 9/13/15.
*/
public class BigMoneyConverter implements javax.faces.convert.Converter{
public Object getAsObject(FacesContext context, UIComponent component, String value) {
if (context == null || component == null) {
throw new NullPointerException();
}
// If the specified value is null or zero-length, return null
if (value == null) {
return (null);
}
value = value.trim();
if (value.length() < 1) {
return (null);
}
try {
return bigMoneyToNumber(value);
} catch (NumberFormatException nfe) {
throw new ConverterException(nfe);
} catch (Exception e) {
throw new ConverterException(e);
}
}
public String getAsString(FacesContext context, UIComponent component, Object value) {
if (context == null || component == null) {
throw new NullPointerException();
}
// If the specified value is null, return a zero-length String
if (value == null) {
return "";
}
// If the incoming value is still a string, play nice
// and return the value unmodified
if (value instanceof String) {
return (String) value;
}
try {
return numberToBigMoney((BigDecimal)value);
} catch (Exception e) {
throw new ConverterException(e);
}
}
public static void main(String[] args) {
String money = "伍仟零贰元贰角壹分";
System.out.println(bigMoneyToNumber(money));
System.out.println(numberToBigMoney(new BigDecimal(1320)));
}
public static String numberToBigMoney(BigDecimal amount){
NumberFormat nf = new DecimalFormat("#0.###");
String amt = nf.format(amount);
String dotPart = "";
String intPart = "";
int dotPos;
if ((dotPos = amt.indexOf('.')) != -1) {
intPart = amt.substring(0, dotPos);
dotPart = amt.substring(dotPos + 1);
} else {
intPart = amt;
}
if (intPart.length() > 16)
throw new IllegalArgumentException("The amount is too big."); // caught by the converter and wrapped as a ConverterException
String intBig = intToBig(intPart);
String dotBig = dotToBig(dotPart);
if ((dotBig.length() == 0) && (intBig.length() != 0)) {
return intBig + "元整";
} else if ((dotBig.length() == 0) && (intBig.length() == 0)) {
return intBig + "零元";
} else if ((dotBig.length() != 0) && (intBig.length() != 0)) {
return intBig + "元" + dotBig;
} else {
return dotBig;
}
}
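// Lookup tables for Chinese capital ("big-writing") numerals: the digits 0-9, the in-group
// units (thousand/hundred/ten/one), the group units applied every four digits, and the
// fractional units 角/分/厘 (0.1 / 0.01 / 0.001 yuan).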
private static final int GRADE = 4;
private static final String NUM = "零壹贰叁肆伍陆柒捌玖";
private static final String UNIT = "仟佰拾个";
private static final String GRADEUNIT = "仟万亿兆";
private static final String DOTUNIT = "角分厘";
private static String dotToBig(String dotPart) {
String strRet = "";
for (int i = 0; i < dotPart.length() && i < 3; i++) {
int num;
if ((num = Integer.parseInt(dotPart.substring(i, i + 1))) != 0)
strRet += NUM.substring(num, num + 1)
+ DOTUNIT.substring(i, i + 1);
}
return strRet;
}
private static String intToBig(String intPart) {
int grade;
String result = "";
String strTmp = "";
grade = intPart.length() / GRADE;
if (intPart.length() % GRADE != 0)
grade += 1;
for (int i = grade; i >= 1; i--) {
strTmp = getNowGradeVal(intPart, i);
result += getSubUnit(strTmp);
result = dropZero(result);
if (i > 1)
if (getSubUnit(strTmp).equals("零零零零")) {
result += "零" + GRADEUNIT.substring(i - 1, i);
} else {
result += GRADEUNIT.substring(i - 1, i);
}
}
return result;
}
private static String getSubUnit(String strVal) {
String rst = "";
for (int i = 0; i < strVal.length(); i++) {
String s = strVal.substring(i, i + 1);
int num = Integer.parseInt(s);
if (num == 0) {
if (i != strVal.length())
rst += "零";
} else {
rst += NUM.substring(num, num + 1);
if (i != strVal.length() - 1)
rst += UNIT.substring(i + 4 - strVal.length(), i + 4
- strVal.length() + 1);
}
}
return rst;
}
private static String getNowGradeVal(String strVal, int grade) {
String rst;
if (strVal.length() <= grade * GRADE)
rst = strVal.substring(0, strVal.length() - (grade - 1) * GRADE);
else
rst = strVal.substring(strVal.length() - grade * GRADE, strVal
.length()
- (grade - 1) * GRADE);
return rst;
}
private static String dropZero(String strVal) {
String strRst;
String strBefore;
String strNow;
strBefore = strVal.substring(0, 1);
strRst = strBefore;
for (int i = 1; i < strVal.length(); i++) {
strNow = strVal.substring(i, i + 1);
if (strNow.equals("零") && strBefore.equals("零"))
;
else
strRst += strNow;
strBefore = strNow;
}
if (strRst.substring(strRst.length() - 1, strRst.length()).equals("零"))
strRst = strRst.substring(0, strRst.length() - 1);
return strRst;
}
public static BigDecimal bigMoneyToNumber(String money) {
long integerPart = 0;
String newMoney = money.substring(0, money.indexOf("元") + 1).replace(" ",""); // keep only the part up to and including 元
// Split the integer part into segments of the form digit仟digit佰digit拾digit,
// sum each segment, then combine the segment sums into the overall total.
if (-1 != newMoney.indexOf("兆")){
String[] spits = newMoney.split("兆");//如果存在亿的部分就分割出含亿的部分
integerPart += getSubMoney(spits[0], "兆");
newMoney = spits[1];
}
if (-1 != newMoney.indexOf("亿")) {
String[] spits = newMoney.split("亿");//如果存在亿的部分就分割出含亿的部分
integerPart += getSubMoney(spits[0], "亿");
newMoney = spits[1];
}
if (-1 != newMoney.indexOf("万")) {//如果存在万的部分就分割出含万的部分
String[] spits = newMoney.split("万");
integerPart += getSubMoney(spits[0], "万");
newMoney = spits[1];
}
if (!newMoney.equals("元")) {//如果分割的最后不只有元了,就求不上万部分的和
integerPart += getSubMoney(newMoney.substring(0, newMoney.indexOf("元")), "元");
}
BigDecimal result = new BigDecimal(integerPart);
String floatMoney = money.substring(money.indexOf("元") + 1, money.length()).replace(" ","").replace("整","");
if (!"".equals(floatMoney)){
int floatPart = 0;
if (-1 != floatMoney.indexOf("角")){
floatPart = getMoney(floatMoney.substring(0, floatMoney.indexOf("角")).charAt(0)) * 100;
}
if (-1 != floatMoney.indexOf("分")){
floatPart += getMoney(floatMoney.substring(floatMoney.indexOf("分") - 1, floatMoney.indexOf("分")).charAt(0)) * 10;
}
if (-1 != floatMoney.indexOf("厘")){
floatPart += getMoney(floatMoney.substring(floatMoney.indexOf("厘") - 1, floatMoney.indexOf("厘")).charAt(0));
}
BigDecimal floatResult = new BigDecimal(floatPart);
result = result.add(floatResult.divide(new BigDecimal("1000"),3, BigDecimal.ROUND_DOWN));
}
return result;
}
private static long getSubMoney(String money, String unit) { // returns the numeric value of one split segment
long subSum = 0; // accumulates the value of this segment
char[] moneys = money.toCharArray();
int rememberInt = 0; // remembers the most recently seen digit
for (int i = 0; i < moneys.length; i++) {
int increament = increateNum(moneys[i]);
if (-1 == increament) {
rememberInt = getMoney(moneys[i]);
if (i == (moneys.length - 1)) {
subSum += getMoney(moneys[i]);
}
} else if (increament > 0) { // a unit character (拾/佰/仟): multiply the remembered digit by its base
subSum += increament * rememberInt;
} else {
continue;
}
}
return subSum * getUnit(unit);
}
private static long getUnit(String unit) { // returns the multiplier for a grade unit (元/万/亿/兆)
if (unit.equals("元")) {
return 1;
} else if (unit.equals("万")) {
return 10000;
} else if (unit.equals("亿")) {
return 100000000;
} else if (unit.equals("兆")) {
// 兆 segments are produced by bigMoneyToNumber (see GRADEUNIT) but were previously
// unhandled here, so they contributed -1 times their value; in this scheme 兆 is 10^12.
return 1000000000000L;
} else {
return -1;
}
}
private static int increateNum(char c) { // returns the factor for a unit character (拾/佰/仟), 0 for 零, -1 otherwise
switch (c) {
case '拾': {
return 10;
}
case '佰': {
return 100;
}
case '仟': {
return 1000;
}
case '零': {
return 0;
}
default: {
return -1;
}
}
}
private static int getMoney(char m) { // returns the digit for an uppercase RMB numeral
switch (m) {
case '壹': {
return 1;
}
case '贰': {
return 2;
}
case '叁': {
return 3;
}
case '肆': {
return 4;
}
case '伍': {
return 5;
}
case '陆': {
return 6;
}
case '柒': {
return 7;
}
case '捌': {
return 8;
}
case '玖': {
return 9;
}
default: {
return 0;
}
}
}
}
<file_sep>/design/update1.1.sql
CREATE TABLE CONTRACT.COMMIT_FILE
(
ID varchar(32) NOT NULL,
CONTRACT varchar(32) NOT NULL,
UPLOAD_TIME timestamp,
MD5 longtext,
PRIMARY KEY (ID)
) ENGINE = InnoDB DEFAULT CHARACTER SET utf8;
ALTER TABLE CONTRACT.COMMIT_FILE
ADD FOREIGN KEY (CONTRACT)
REFERENCES CONTRACT.HOUSE_CONTRACT (ID)
ON UPDATE RESTRICT
ON DELETE RESTRICT
;
INSERT INTO SYSTEM_PARAM(ID,VALUE,DESCRIPTION,TYPE) VALUES('public_api_server','http://127.0.0.1:8090','public nginx address','STRING');
INSERT INTO CONTRACT.SYSTEM_PARAM(ID,VALUE,TYPE) VALUES('contract_address','http://192.168.248.248:9080/','STRING');
|
5912098c0bcace0765d65173ceab9766f5c40b6b
|
[
"SQL",
"HTML",
"JavaScript",
"Maven POM",
"INI",
"Java"
] | 34 |
Java
|
cooper-lyt/participant
|
fcf45dc13cdd058b34ddb79d41d3b22f5f8a8267
|
2378913c25d56ce73b508aba046f76f7e4faf366
|
refs/heads/master
|
<file_sep>
## Comments
### 2016-11-04
Linux has a NOTE on mis-spelled words; it is a false positive.
All revdep authors were emailed on 2016-10-17. Have only been in contact with the authors of 'plotly' about changes.
Thank you for your time.
Best,
Barret
#### Test environments and R CMD check results
* local OS X install (x86_64-apple-darwin13.4.0), R 3.3.0
* There were no ERRORs or WARNINGs.
* There is one NOTE.
* checking CRAN incoming feasibility ... NOTE
Maintainer: ‘<NAME> <<EMAIL>>’
* travis-ci
* Platform: x86_64-pc-linux-gnu (64-bit)
* Running under: Ubuntu precise (12.04.5 LTS)
* R
* version 3.3.1 (2016-06-21)
* R Under development (unstable) (2016-10-17 r71530)
* There were no ERRORs, WARNINGs, or NOTEs.
* win-builder (devel and release)
* There were no ERRORs or WARNINGs.
* There is one NOTE.
* checking CRAN incoming feasibility ... NOTE
Maintainer: '<NAME> <<EMAIL>>'
Possibly mis-spelled words in DESCRIPTION:
geoms (25:43)
ggplot (5:21)
scatterplot (26:49)
## Reverse dependencies
I have run R CMD check on downstream dependencies of GGally on my local machine.
* Summary - https://github.com/ggobi/ggally/blob/master/revdep/README.md
### RevDep Notes
* Failed to install dependencies for: MissingDataGUI, specmine, toaster, userfriendlyscience
* In contact with author and have resolved issues.
* plotly: checking examples ... ERROR
* Does not appear to be a GGally issue.
* ParamHelpers: checking tests ... ERROR
* SHELF: checking re-building of vignette outputs ... WARNING
* Does not appear to be a GGally issue. Seems like ggplot2 issue
* robustbase: checking re-building of vignette outputs ... WARNING
* vdmR: checking examples ... ERROR
<file_sep>context("ggnostic")
expect_print <- function(p) {
testthat::expect_silent(print(p, progress = FALSE))
}
test_that("ggnostic mtcars", {
mtc <- mtcars
mtc$am <- c("0" = "automatic", "1" = "manual")[as.character(mtc$am)]
mod <- lm(mpg ~ wt + qsec + am, data = mtc)
continuous_type <- list(
.resid = wrap(ggally_nostic_resid, method = "loess"),
.std.resid = wrap(ggally_nostic_std_resid, method = "loess")
)
pm <- ggnostic(
mod,
mapping = ggplot2::aes(),
columnsY = c("mpg", ".fitted", ".se.fit", ".resid", ".std.resid", ".sigma", ".hat", ".cooksd"),
continuous = continuous_type
)
expect_print(pm)
pm <- ggnostic(
mod,
mapping = ggplot2::aes(color = am),
legend = c(1, 3),
continuous = continuous_type
)
expect_print(pm)
})
test_that("error checking", {
get_cols <- function(cols) {
match_nostic_columns(
cols,
c("mpg", broom_columns()),
"columnsY"
)
}
expect_equivalent(
get_cols(c(".resid", ".sig", ".hat", ".c")),
c(".resid", ".sigma", ".hat", ".cooksd")
)
expect_error(
get_cols(c(
"not_there", ".fitted", ".se.fit", ".resid",
".std.resid", ".sigma", ".hat", ".cooksd"
)),
"Could not match 'columnsY'"
)
})
<file_sep>
context("ggcoef")
suppressMessages(require(broom))
test_that("example", {
expect_print <- function(x) {
expect_silent(print(x))
}
reg <- lm(Sepal.Length ~ Sepal.Width + Petal.Length + Petal.Width, data = iris)
expect_print(ggcoef(reg))
d <- as.data.frame(Titanic)
reg2 <- glm(Survived ~ Sex + Age + Class, family = binomial, data = d, weights = d$Freq)
expect_print(ggcoef(reg2, exponentiate = TRUE))
expect_print(ggcoef(
reg2,
exponentiate = TRUE,
exclude_intercept = TRUE,
errorbar_height = .2,
color = "blue"
))
})
<file_sep>
#' Modify a ggmatrix object by adding an ggplot2 object to all plots
#'
#' This operator allows you to add ggplot2 objects to a ggmatrix object.
#'
#' If the first object is an object of class \code{ggmatrix}, you can add
#' the following types of objects, and it will return a modified ggplot
#' object.
#'
#' \itemize{
###### \item \code{data.frame}: replace current data.frame
###### (must use \code{\%+\%})
###### \item \code{uneval}: replace current aesthetics
###### \item \code{layer}: add new layer
#' \item \code{theme}: update plot theme
###### \item \code{scale}: replace current scale
###### \item \code{coord}: override current coordinate system
###### \item \code{facet}: override current coordinate faceting
#' }
#'
#' The \code{+} operator completely replaces elements
#' with elements from e2.
#'
#' @param e1 An object of class \code{ggplot} or \code{theme}
#' @param e2 A component to add to \code{e1}
#'
#' @export
#' @seealso \code{\link[ggplot2]{+.gg}} and \code{\link[ggplot2]{theme}}
#' @method + gg
#' @rdname gg-add
#' @examples
#' data(tips, package = "reshape")
#' pm <- ggpairs(tips[, 2:3])
#' ## change to black and white theme
#' pm + ggplot2::theme_bw()
#' ## change to linedraw theme
#' # pm + ggplot2::theme_linedraw()
#' ## change to custom theme
#' # pm + ggplot2::theme(panel.background = ggplot2::element_rect(fill = "lightblue"))
#'
"+.gg" <- function(e1, e2) {
if (is.ggmatrix(e1)) {
if (is.null(e1$gg)) {
e1$gg <- list()
}
if (inherits(e2, "labels")) {
label_names <- names(e2)
if ("x" %in% label_names) {
e1$xlab <- e2$x
}
if ("y" %in% label_names) {
e1$ylab <- e2$y
}
if ("title" %in% label_names) {
e1$title <- e2$title
}
non_ggmatrix_labels <- label_names[!label_names %in% c("x", "y", "title")]
if (length(non_ggmatrix_labels) > 0) {
if (is.null(e1$gg$labs)) {
e1$gg$labs <- structure(list(), class = "labels")
}
e1$gg$labs[non_ggmatrix_labels] <- e2[non_ggmatrix_labels]
}
return(e1)
} else if (is.theme(e2)) {
# Get the name of what was passed in as e2, and pass along so that it
# can be displayed in error messages
# e2name <- deparse(substitute(e2))
if (is.null(e1$gg$theme)) {
e1$gg$theme <- e2
} else {
# calls ggplot2 add method and stores the result in gg
e1$gg$theme <- e1$gg$theme %+% e2
}
return(e1)
} else {
stop("'ggmatrix' does not know how to add objects that do not have class 'theme' or 'labels'")
}
} else {
# calls ggplot2 add method
return(e1 %+% e2)
}
}
add_gg_info <- function(p, gg) {
if (!is.null(gg)) {
if (!is.null(gg$theme)) {
p <- p + gg$theme
}
if (!is.null(gg$labs)) {
p <- p + gg$labs
}
}
p
}
is.ggmatrix <- function(x) {
inherits(x, "ggmatrix")
}
<file_sep>
context("utils")
test_that("require_pkgs", {
detach("package:survival")
detach("package:scales")
expect_false("package:survival" %in% search())
expect_false("package:scales" %in% search())
suppressMessages(require_pkgs(c("survival", "scales")))
expect_true("package:survival" %in% search())
expect_true("package:scales" %in% search())
expect_error(suppressWarnings(suppressMessages(require_pkgs("DOES_NOT_EXIST_asdfasdfasfd"))))
})
<file_sep>context("ggfacet")
expect_print <- function(p) {
testthat::expect_silent(print(p, progress = FALSE))
}
suppressMessages(library(chemometrics, quietly = TRUE))
data(NIR)
NIR_sub <- data.frame(NIR$yGlcEtOH, NIR$xNIR[, 1:3])
test_that("warnings", {
expect_warning(
ggfacet(iris, columnsX = 1:5, columnsY = 1),
"1 factor variables are being removed from X columns"
)
expect_warning(
ggfacet(iris, columnsX = 1, columnsY = 1:5),
"1 factor variables are being removed from Y columns"
)
})
test_that("generally works", {
# factor variables
expect_print(ggfacet(NIR_sub, columnsY = 1:2, columnsX = 3:5, fn = ggally_smooth_loess))
})
test_that("generally works", {
# factor variables
expect_print(ggfacet(NIR_sub, columnsY = 1:2, columnsX = 3:5, fn = ggally_smooth_loess))
expect_print(
ggts(pigs, "time", c("gilts", "profit", "s_per_herdsz", "production", "herdsz"))
)
})
<file_sep>library(testthat)
library(GGally)
test_check("GGally")
<file_sep>* Failed to install dependencies for: MissingDataGUI, specmine, toaster, userfriendlyscience
* ParamHelpers: checking tests ... ERROR
* plotly: checking examples ... ERROR
* robustbase: checking re-building of vignette outputs ... WARNING
* SHELF: checking re-building of vignette outputs ... WARNING
* vdmR: checking examples ... ERROR
<file_sep>
#' Wrap a function with different parameter values
#'
#' Wraps a function with the supplied parameters to force different default behavior. This is useful for functions that are supplied to ggpairs. It allows you to change the behavior of one function, rather than creating multiple functions with different parameter settings.
#'
#' \code{wrap} is identical to \code{wrap_fn_with_params}. These functions take the new parameters as arguments.
#'
#' \code{wrapp} is identical to \code{wrap_fn_with_param_arg}. These functions take the new parameters as a single list.
#'
#' The \code{params} and \code{fn} attributes are there for debugging purposes. If either attribute is altered, the function must be re-wrapped to have the changes take effect.
#'
#' @param funcVal function that the \code{params} will be applied to. The function should follow the api of \code{function(data, mapping, ...)\{\}}. \code{funcVal} is allowed to be a string of one of the \code{ggally_NAME} functions, such as \code{"points"} for \code{ggally_points} or \code{"facetdensity"} for \code{ggally_facetdensity}.
#' @param ... named parameters to be supplied to \code{wrap_fn_with_param_arg}
#' @param params named vector or list of parameters to be applied to the \code{funcVal}
#' @param funcArgName name of function to be displayed
#' @return a \code{function(data, mapping, ...)\{\}} that will wrap the original function with the parameters applied as arguments
#' @export
#' @rdname wrap
#' @examples
#' # example function that prints 'val'
#' fn <- function(data, mapping, val = 2) {
#' print(val)
#' }
#' fn(data = NULL, mapping = NULL) # 2
#'
#' # wrap function to change default value 'val' to 5 instead of 2
#' wrapped_fn1 <- wrap(fn, val = 5)
#' wrapped_fn1(data = NULL, mapping = NULL) # 5
#' # you may still supply regular values
#' wrapped_fn1(data = NULL, mapping = NULL, val = 3) # 3
#'
#' # wrap function to change 'val' to 5 using the arg list
#' wrapped_fn2 <- wrap_fn_with_param_arg(fn, params = list(val = 5))
#' wrapped_fn2(data = NULL, mapping = NULL) # 5
#'
#' # change parameter settings in ggpairs for a particular function
#' ## Goal output:
#' (regularPlot <- ggally_points(
#' iris,
#' ggplot2::aes(Sepal.Length, Sepal.Width),
#' size = 5, color = "red"
#' ))
#' # Wrap ggally_points to have parameter values size = 5 and color = 'red'
#' w_ggally_points <- wrap(ggally_points, size = 5, color = "red")
#' (wrappedPlot <- w_ggally_points(
#' iris,
#' ggplot2::aes(Sepal.Length, Sepal.Width)
#' ))
#'
#' # Double check the aes parameters are the same for the geom_point layer
#' identical(regularPlot$layers[[1]]$aes_params, wrappedPlot$layers[[1]]$aes_params)
#'
#' # Use a wrapped function in ggpairs
#' ggpairs(iris, 1:3, lower = list(continuous = wrap(ggally_points, size = 5, color = "red")))
#' ggpairs(iris, 1:3, lower = list(continuous = w_ggally_points))
wrap_fn_with_param_arg <- function(
funcVal,
params = NULL,
funcArgName = deparse(substitute(funcVal))
) {
if (missing(funcArgName)) {
fnName <- attr(funcVal, "name")
if (!is.null(fnName)) {
funcArgName <- fnName
}
}
if (!is.null(params)) {
if (is.vector(params)) {
params <- as.list(params)
}
if (length(params) > 0) {
if (!is.list(params)) {
stop("'params' must be a named list, named vector, or NULL")
}
if (is.null(names(params))) {
stop("'params' must be a named list, named vector, or NULL")
}
if (any(nchar(names(params)) == 0)) {
stop("'params' must be a named list, named vector, or NULL")
}
}
}
if (mode(funcVal) == "character") {
if (missing(funcArgName)) {
funcArgName <- str_c("ggally_", funcVal)
}
tryCatch({
funcVal <- get(
str_c("ggally_", funcVal),
mode = "function",
envir = loadNamespace("GGally")
)
},
error = function(e) {
stop(str_c(
"The following ggpair plot functions are readily available: \n",
"\tcontinuous: c('points', 'smooth', 'smooth_loess', 'density', 'cor', 'blank')\n",
"\tcombo: c('box', 'box_no_facet', 'dot', 'dot_no_facet', 'facethist',",
" 'facetdensity', 'denstrip', 'blank')\n",
"\tdiscrete: c('ratio', 'facetbar', 'blank')\n",
"\tna: c('na', 'blank')\n",
"\n",
"\tdiag continuous: c('densityDiag', 'barDiag', 'blankDiag')\n",
"\tdiag discrete: c('barDiag', 'blankDiag')\n",
"\tdiag na: c('naDiag', 'blankDiag')\n",
"\n",
"You may also provide your own function that follows the api of ",
"function(data, mapping, ...){ . . . }\nand returns a ggplot2 plot object\n",
"\tEx:\n",
"\tmy_fn <- function(data, mapping, ...){\n",
"\t p <- ggplot(data = data, mapping = mapping) + \n",
"\t geom_point(...)\n",
"\t p\n",
"\t}\n",
"\tggpairs(data, lower = list(continuous = my_fn))\n",
"\n",
"Function provided: ", funcVal
))
}
)
}
allParams <- ifnull(attr(funcVal, "params"), list())
allParams[names(params)] <- params
original_fn <- funcVal
ret_fn <- function(data, mapping, ...) {
allParams$data <- data
allParams$mapping <- mapping
argsList <- list(...)
allParams[names(argsList)] <- argsList
do.call(original_fn, allParams)
}
class(ret_fn) <- "ggmatrix_fn_with_params"
attr(ret_fn, "name") <- as.character(funcArgName)
attr(ret_fn, "params") <- allParams
attr(ret_fn, "fn") <- original_fn
ret_fn
}
#' @export
#' @rdname wrap
wrapp <- wrap_fn_with_param_arg
#' @export
#' @rdname wrap
wrap <- function(funcVal, ..., funcArgName = deparse(substitute(funcVal))) {
if (missing(funcArgName)) {
fnName <- attr(funcVal, "name")
if (!is.null(fnName)) {
funcArgName <- fnName
} else if (is.character(funcVal)) {
funcArgName <- str_c("ggally_", funcVal)
}
}
params <- list(...)
if (length(params) > 0) {
if (is.null(names(params))) {
stop("all parameters must be named arguements")
}
if (any(nchar(names(params)) == 0)) {
stop("all parameters must be named arguements")
}
}
wrap_fn_with_param_arg(funcVal, params = params, funcArgName = funcArgName)
}
#' @export
#' @rdname wrap
wrap_fn_with_params <- wrap
as.character.ggmatrix_fn_with_params <- function(x, ...) {
params <- attr(x, "params")
fnName <- attr(x, "name")
if (length(params) == 0) {
txt <- str_c("wrap: '", fnName, "'")
} else {
txt <- str_c("wrap: '", attr(x, "name"), "'; params: ", mapping_as_string(params))
}
txt
}
make_ggmatrix_plot_obj <- function(fn, mapping = ggplot2::aes(), dataPos = 1, gg = NULL) {
nonCallVals <- which(lapply(mapping, mode) == "call")
if (length(nonCallVals) > 0) {
nonCallNames <- names(mapping)[nonCallVals]
stop(
paste(
"variables: ",
paste(shQuote(nonCallNames, type = "cmd"), sep = ", "),
" have non standard format: ",
paste(shQuote(unlist(mapping[nonCallVals]), type = "cmd"), collapse = ", "),
". Please rename the columns or make a new column.",
sep = ""
)
)
}
ret <- list(
fn = fn,
mapping = mapping,
dataPos = dataPos,
gg = gg
)
class(ret) <- "ggmatrix_plot_obj"
ret
}
blank_plot_string <- function() {
"PM; (blank)"
}
mapping_as_string <- function(mapping) {
str_c("c(", str_c(names(mapping), as.character(mapping), sep = " = ", collapse = ", "), ")")
}
as.character.ggmatrix_plot_obj <- function(x, ...) {
hasGg <- (!is.null(x$gg))
mappingTxt <- mapping_as_string(x$mapping)
fnTxt <- ifelse(inherits(x$fn, "ggmatrix_fn_with_params"), as.character(x$fn), "custom_function")
if (inherits(x$fn, "ggmatrix_fn_with_params")) {
if (attr(x$fn, "name") %in% c("ggally_blank", "ggally_blankDiag")) {
return(blank_plot_string())
}
}
str_c(
"PM",
"; aes: ", mappingTxt,
"; fn: {", fnTxt, "}",
# "; dataPos: ", x$dataPos,
"; gg: ", as.character(hasGg)
)
}
#' ggmatrix structure
#'
#' View the condensed version of the ggmatrix object. The attribute "class" is ALWAYS altered to "_class" to avoid recursion.
#'
#' @param object ggmatrix object to be viewed
#' @param ... passed on to the default str method
#' @param raw boolean to determine if the plots should be converted to text or kept as original objects
#' @method str ggmatrix
#' @importFrom utils str
#' @export
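#' @examples
#' # A minimal illustration of the condensed output (a sketch only; it reuses the
#' # 'tips' data from the reshape package, as the other examples in this package do):
#' data(tips, package = "reshape")
#' pm <- ggpairs(tips[, 2:3])
#' str(pm)
#' str(pm, raw = TRUE) # view the original, unconverted object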
str.ggmatrix <- function(object, ..., raw = FALSE) {
objName <- deparse(substitute(object))
obj <- object
if (identical(raw, FALSE)) {
cat(str_c(
"\nCustom str.ggmatrix output: \nTo view original object use ",
"'str(", objName, ", raw = TRUE)'\n\n"
))
obj$plots <- lapply(obj$plots, function(plotObj) {
if (ggplot2::is.ggplot(plotObj)) {
str_c("PM; ggplot2 object; mapping: ", mapping_as_string(plotObj$mapping))
} else if (inherits(plotObj, "ggmatrix_plot_obj")) {
as.character(plotObj)
} else {
plotObj
}
})
}
attr(obj, "_class") <- attr(obj, "class")
class(obj) <- NULL
str(obj, ...)
}
<file_sep>
context("gg-plots")
data(tips, package = "reshape")
data(nasa)
nas <- subset(nasa, x <= 2 & y == 1)
expect_print <- function(x) {
testthat::expect_silent(print(x))
}
test_that("denstrip", {
expect_message(
suppressWarnings(print(ggally_denstrip(tips, mapping = aes_string("sex", "tip")))),
"`stat_bin()` using `bins = 30`", fixed = TRUE
)
expect_message(
suppressWarnings(print(ggally_denstrip(tips, mapping = aes_string("tip", "sex")))),
"`stat_bin()` using `bins = 30`", fixed = TRUE
)
})
test_that("density", {
p <- ggally_density(
tips,
mapping = ggplot2::aes_string(x = "total_bill", y = "tip", fill = "..level..")
) + ggplot2::scale_fill_gradient(breaks = c(0.05, 0.1, 0.15, 0.2))
expect_equal(p$labels$fill, "level")
})
test_that("cor", {
expect_warning(
ggally_cor(tips, mapping = ggplot2::aes_string(x = "total_bill", y = "tip"), use = "NOTFOUND"),
"correlation 'use' not found"
)
ti <- tips
class(ti) <- c("NOTFOUND", "data.frame")
p <- ggally_cor(ti, ggplot2::aes(x = total_bill, y = tip, color = day), use = "complete.obs")
expect_equal(as.character(get("mapping", envir = p$layers[[2]])$colour), "labelp")
p <- ggally_cor(
ti,
ggplot2::aes(x = total_bill, y = tip, color = I("blue")),
use = "complete.obs"
)
expect_equal(deparse(get("aes_params", envir = p$layers[[1]])$colour), "I(\"blue\")")
expect_err <- function(..., msg = NULL) {
expect_error(
ggally_cor(
ti, ggplot2::aes(x = total_bill, y = tip),
...
),
msg
)
}
expect_err(corAlignPercent = 0.9, "'corAlignPercent' is deprecated")
expect_err(corMethod = "pearson", "'corMethod' is deprecated")
expect_err(corUse = "complete.obs", "'corUse' is deprecated")
expect_print(ggally_cor(ti, ggplot2::aes(x = total_bill, y = tip, color = "green")))
ti3 <- ti2 <- ti
ti2[2, "total_bill"] <- NA
ti3[2, "total_bill"] <- NA
ti3[3, "tip"] <- NA
ti3[4, "total_bill"] <- NA
ti3[4, "tip"] <- NA
expect_warn <- function(data, msg) {
expect_warning(
ggally_cor(data, ggplot2::aes(x = total_bill, y = tip)),
msg
)
}
expect_warn(ti2, "Removing 1 row that")
expect_warn(ti3, "Removed 3 rows containing")
expect_error(
ggally_cor(
ti,
ggplot2::aes(x = total_bill, y = tip, color = size)
),
"ggally_cor: mapping color column"
)
expect_silent(
ggally_cor(
ti,
ggplot2::aes(x = total_bill, y = tip, color = as.factor(size))
)
)
})
test_that("diagAxis", {
p <- ggally_diagAxis(iris, ggplot2::aes(x = Petal.Width))
pDat1 <- get("data", envir = p$layers[[2]])
attr(pDat1, "out.attrs") <- NULL
testDt1 <- data.frame(
xPos = c(0.076, 0.076, 0.076, 0.076, 0.076, 0.076, 0.500, 1.000, 1.500, 2.000, 2.500),
yPos = c(0.500, 1.000, 1.500, 2.000, 2.500, 0.076, 0.076, 0.076, 0.076, 0.076, 0.076),
lab = as.character(c(0.5, 1, 1.5, 2, 2.5, 0, 0.5, 1, 1.5, 2, 2.5)),
hjust = c(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5, 0.5, 0.5, 0.5, 0.5),
vjust = c(0.5, 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
stringsAsFactors = FALSE
)
rownames(testDt1) <- 2:12
expect_equal(pDat1, testDt1)
p <- ggally_diagAxis(iris, ggplot2::aes(x = Species))
pDat2 <- get("data", envir = p$layers[[2]])
attr(pDat2, "out.attrs") <- NULL
testDt2 <- data.frame(
x = c(0.125, 0.500, 0.875),
y = c(0.875, 0.500, 0.125),
lab = c("setosa", "versicolor", "virginica")
)
expect_equal(pDat2, testDt2)
expect_error({
ggally_diagAxis(iris, mapping = ggplot2::aes(y = Sepal.Length))
}, "mapping\\$x is null.") # nolint
})
test_that("dates", {
class(nas) <- c("NOTFOUND", "data.frame")
p <- ggally_cor(nas, ggplot2::aes(x = date, y = ozone))
expect_equal(get("aes_params", envir = p$layers[[1]])$label, "Corr:\n0.278")
p <- ggally_cor(nas, ggplot2::aes(y = date, x = ozone))
expect_equal(get("aes_params", envir = p$layers[[1]])$label, "Corr:\n0.278")
p <- ggally_barDiag(nas, ggplot2::aes(x = date))
expect_equal(as.character(p$mapping$x), "date")
expect_equal(p$labels$y, "count")
})
test_that("rescale", {
p <- ggally_densityDiag(tips, mapping = ggplot2::aes(x = day), rescale = FALSE)
expect_true(p$labels$y == "density")
expect_print(p)
p <- ggally_densityDiag(tips, mapping = ggplot2::aes(x = day), rescale = TRUE)
expect_true(! identical(p$labels$y, "density"))
expect_print(p)
p <- ggally_barDiag(tips, mapping = ggplot2::aes(x = tip), binwidth = 0.25, rescale = FALSE)
expect_true(p$labels$y == "count")
expect_print(p)
p <- ggally_barDiag(tips, mapping = ggplot2::aes(x = tip), binwidth = 0.25, rescale = TRUE)
expect_true(! identical(p$labels$y, "count"))
expect_print(p)
})
<file_sep># [GGally](http://ggobi.github.io/ggally): Extension to [ggplot2](http://docs.ggplot2.org/current/)
Master: [](https://travis-ci.org/ggobi/ggally)
[](https://codecov.io/github/ggobi/ggally?branch=master)
[](http://cran.r-project.org/package=GGally)
[](http://cran.r-project.org/package=GGally)
[](https://zenodo.org/badge/latestdoi/22529/ggobi/ggally)
Dev: [](https://travis-ci.org/ggobi/ggally) [](https://codecov.io/github/ggobi/ggally?branch=dev)
[](https://gitter.im/ggobi/ggally?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[ggplot2](http://docs.ggplot2.org/current/) is a plotting system for R based on the grammar of graphics. [GGally](https://ggobi.github.io/ggally) extends ggplot2 by adding several functions to reduce the complexity of combining geoms with transformed data. Some of these functions include a pairwise plot matrix, a scatterplot matrix, a parallel coordinates plot, a survival plot, and several functions to plot networks.
## Installation
To install this package from Github or [CRAN](http://cran.r-project.org/package=GGally), do the following from the R console:
```r
# Github
library(devtools)
install_github("ggobi/ggally")
```
```r
# CRAN
install.packages("GGally")
```
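
## Quick start

A minimal example of the pairwise plot matrix mentioned above (a sketch only — it assumes the installation above succeeded; `ggpairs()` with `iris` columns 1:3 is the same call used in the package documentation):

```r
library(GGally)

# pairwise plot matrix of the first three columns of iris
ggpairs(iris, 1:3)
```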
<file_sep># Setup
## Platform
|setting |value |
|:--------|:----------------------------|
|version |R version 3.3.1 (2016-06-21) |
|system |x86_64, darwin13.4.0 |
|ui |X11 |
|language |(EN) |
|collate |en_US.UTF-8 |
|tz |America/Indiana/Indianapolis |
|date |2016-10-17 |
## Packages
|package |* |version |date |source |
|:------------|:--|:----------|:----------|:----------------------------------|
|broom |* |0.4.1 |2016-06-24 |cran (@0.4.1) |
|chemometrics | |1.4.1 |2016-08-03 |cran (@1.4.1) |
|geosphere | |1.5-5 |2016-06-15 |cran (@1.5-5) |
|GGally |* |1.2.9.9999 |2016-10-17 |local (ggobi/ggally@NA) |
|ggplot2 |* |2.1.0.9001 |2016-10-17 |Github (hadley/ggplot2@1709196) |
|gtable | |0.2.0 |2016-02-26 |cran (@0.2.0) |
|igraph | |1.0.1 |2015-06-26 |cran (@1.0.1) |
|intergraph | |2.0-2 |2015-06-30 |cran (@2.0-2) |
|knitr | |1.14 |2016-08-13 |cran (@1.14) |
|mapproj | |1.2-4 |2015-08-03 |cran (@1.2-4) |
|maps | |3.1.1 |2016-07-27 |cran (@3.1.1) |
|network | |1.13.0 |2015-09-19 |CRAN (R 3.3.1) |
|packagedocs | |0.3.9.9999 |2016-10-16 |Github (hafen/packagedocs@4f8625d) |
|plyr | |1.8.4 |2016-06-08 |cran (@1.8.4) |
|progress | |1.0.2 |2015-06-05 |cran (@1.0.2) |
|RColorBrewer | |1.1-2 |2014-12-07 |cran (@1.1-2) |
|reshape | |0.8.5 |2014-04-23 |cran (@0.8.5) |
|rmarkdown | |1.1 |2016-10-16 |cran (@1.1) |
|roxygen2 | |5.0.1 |2015-11-11 |cran (@5.0.1) |
|scagnostics | |0.2-4 |2012-11-05 |cran (@0.2-4) |
|scales | |0.4.0.9003 |2016-10-17 |Github (hadley/scales@d58d83a) |
|sna | |2.4 |2016-08-08 |cran (@2.4) |
|survival | |2.39-5 |2016-06-26 |cran (@2.39-5) |
|testthat | |1.0.2 |2016-04-23 |cran (@1.0.2) |
# Check results
9 packages with problems
|package |version | errors| warnings| notes|
|:-------------------|:-------|------:|--------:|-----:|
|MissingDataGUI |0.2-5 | 1| 0| 0|
|ParamHelpers |1.9 | 1| 0| 0|
|plotly |4.5.2 | 2| 0| 1|
|robustbase |0.92-6 | 0| 1| 1|
|SHELF |1.2.1 | 0| 1| 0|
|specmine |1.0 | 1| 0| 0|
|toaster |0.5.2 | 1| 0| 0|
|userfriendlyscience |0.4-1 | 1| 0| 0|
|vdmR |0.2.2 | 2| 1| 0|
## MissingDataGUI (0.2-5)
Maintainer: <NAME> <<EMAIL>>
1 error | 0 warnings | 0 notes
```
checking package dependencies ... ERROR
Packages required but not available: ‘gWidgetsRGtk2’ ‘cairoDevice’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
## ParamHelpers (1.9)
Maintainer: <NAME> <<EMAIL>>
Bug reports: https://github.com/berndbischl/ParamHelpers/issues
1 error | 0 warnings | 0 notes
```
checking tests ... ERROR
Running the tests in ‘tests/run-all.R’ failed.
Last 13 lines of output:
3. Failure: convertParamSetToIrace work with vecparam (@test_convertParamSetToIrace.R#85)
b$b not equal to c("v", "w").
target is NULL, current is character
testthat results ================================================================
OK: 1003 SKIPPED: 0 FAILED: 3
1. Error: convertParamSetToIrace (@test_convertParamSetToIrace.R#32)
2. Failure: convertParamSetToIrace uses correct boundaries (@test_convertParamSetToIrace.R#69)
3. Failure: convertParamSetToIrace work with vecparam (@test_convertParamSetToIrace.R#85)
Error: testthat unit tests failed
Execution halted
```
## plotly (4.5.2)
Maintainer: <NAME> <<EMAIL>>
Bug reports: https://github.com/ropensci/plotly/issues
2 errors | 0 warnings | 1 note
```
checking examples ... ERROR
Running examples in ‘plotly-Ex.R’ failed
The error most likely occurred in:
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: style
> ### Title: Modify trace(s)
> ### Aliases: style
>
> ### ** Examples
>
>
> p <- qplot(data = mtcars, wt, mpg, geom = c("point", "smooth"))
> # keep the hover info for points, but remove it for the line/ribbon
> style(p, hoverinfo = "none", traces = c(2, 3))
Error in gg2list(p, width = width, height = height, tooltip = tooltip, :
attempt to apply non-function
Calls: style ... plotly_build.gg -> ggplotly -> ggplotly.ggplot -> gg2list
Execution halted
checking tests ... ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
last_plot
The following object is masked from 'package:stats':
filter
The following object is masked from 'package:graphics':
layout
> library("RSclient")
Error in library("RSclient") : there is no package called 'RSclient'
Execution halted
checking package dependencies ... NOTE
Package suggested but not available for checking: ‘RSclient’
```
## robustbase (0.92-6)
Maintainer: <NAME> <<EMAIL>>
0 errors | 1 warning | 1 note
```
checking re-building of vignette outputs ... WARNING
Error in re-building vignettes:
...
Loading required package: robustbase
Loading required package: xtable
Loading required package: GGally
Error: processing vignette 'lmrob_simulation.Rnw' failed with diagnostics:
chunk 6 (label = fig-example-design)
Error : ggplot2 doesn't know how to deal with data of class matrix
Execution halted
checking Rd cross-references ... NOTE
Package unavailable to check Rd xrefs: ‘robustX’
```
## SHELF (1.2.1)
Maintainer: <NAME> <<EMAIL>>
Bug reports: https://github.com/OakleyJ/SHELF/issues
0 errors | 1 warning | 0 notes
```
checking re-building of vignette outputs ... WARNING
Error in re-building vignettes:
...
Quitting from lines 187-195 (Multivariate-normal-copula.Rmd)
Error: processing vignette 'Multivariate-normal-copula.Rmd' failed with diagnostics:
Each variable must be a 1d atomic vector or list.
Problem variables: 'Var2'
Execution halted
```
## specmine (1.0)
Maintainer: <NAME> <<EMAIL>>
1 error | 0 warnings | 0 notes
```
checking package dependencies ... ERROR
Packages required but not available: ‘xcms’ ‘MAIT’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
## toaster (0.5.2)
Maintainer: <NAME> <<EMAIL>>
Bug reports: https://github.com/teradata-aster-field/toaster/issues
1 error | 0 warnings | 0 notes
```
checking package dependencies ... ERROR
Package required but not available: ‘RODBC’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
## userfriendlyscience (0.4-1)
Maintainer: <NAME> <<EMAIL>>
1 error | 0 warnings | 0 notes
```
checking package dependencies ... ERROR
Package required but not available: ‘MBESS’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
## vdmR (0.2.2)
Maintainer: <NAME> <<EMAIL>>
2 errors | 1 warning | 0 notes
```
checking examples ... ERROR
Running examples in ‘vdmR-Ex.R’ failed
The error most likely occurred in:
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: vhist
> ### Title: Generate histogram with interactive functions
> ### Aliases: vhist
>
> ### ** Examples
>
> data(vsfuk2012)
> vhist(FertilityRate, vsfuk2012, "hist1", "vsfuk2012", fill=Type)
`stat_bin()` using `bins = 30`. Pick better value with `binwidth`.
Error in grid.Call.graphics(L_downviewport, name$name, strict) :
Viewport 'panel.3-4-3-4' was not found
Calls: vhist ... downViewport -> downViewport.vpPath -> grid.Call.graphics
Execution halted
checking tests ... ERROR
Running the tests in ‘tests/run-all.R’ failed.
Last 13 lines of output:
Viewport 'panel.3-4-3-4' was not found
1: vhist(Sepal.Length, iris, "hist01", "iris", fill = Species) at /Users/barret/odrive/AmazonCloudDrive/git/R/ggobi_org/ggally/ggally/revdep/checks/vdmR.Rcheck/vdmR/tests/test-vdmR.R:7
2: grid::downViewport("panel.3-4-3-4")
3: downViewport.default("panel.3-4-3-4")
4: downViewport(vpPath(name), strict, recording = recording)
5: downViewport.vpPath(vpPath(name), strict, recording = recording)
6: grid.Call.graphics(L_downviewport, name$name, strict)
DONE ===========================================================================
Error: Test failures
In addition: Warning message:
Placing tests in `inst/tests/` is deprecated. Please use `tests/testthat/` instead
Execution halted
checking re-building of vignette outputs ... WARNING
Error in re-building vignettes:
...
`stat_bin()` using `bins = 30`. Pick better value with `binwidth`.
Quitting from lines 40-42 (vdmR-vignette.Rnw)
Error: processing vignette 'vdmR-vignette.Rnw' failed with diagnostics:
Viewport 'panel.3-4-3-4' was not found
Execution halted
```
|
91db4a1eaca8975ecd06119bf2d0d7d445c30c0e
|
[
"Markdown",
"R"
] | 12 |
Markdown
|
cpsievert/ggally
|
274deb8c72d6408a31b34adc751d445dcdd46ee6
|
622426a8d357378db595b2f5272033259d002c2e
|
refs/heads/main
|
<file_sep>using System;
namespace oppmotobayi
{
public class Motor {
public int secim { get; set; }
public int Hscm1 = 1000;
public int Hscm2 = 1000;
public int Hscm3 = 1000;
public int Hscm4 = 1000;
public int Hscm5 = 1000;
public int Hscm6 = 1000;
public int Hscm7 = 1000;
public int Hscm8 = 1000;
public int Bakiye { get; set; }
}
class Program
{
static void Main(string[] args)
{
Motor a = new Motor();
Console.WriteLine("--------------");
Console.WriteLine(" Bakiye gir ");
Console.WriteLine("--------------");
a.Bakiye = int.Parse(Console.ReadLine());
Console.WriteLine("---***----Menü---***----");
Console.WriteLine("-1- Satin Alma \n " + "-2-Ürün Satış\n" + "-3-Bakiye Göster\n" + "Çıkış");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Salma(a);
break;
case 2:
Syap(a);
break;
case 3:
Console.WriteLine(a.Bakiye);
break;
case 4:
break;
default:
break;
}
}
public static void Salma(Motor a) // reuse the Motor from Main so the entered balance carries over
{
Console.WriteLine("motor türü giriniz");
Console.WriteLine("{0}{1}", "-1-Honda\n", "-2-Bmw\n");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("seciminiz Honda");
Console.WriteLine("Tür seciniz!!");
Console.WriteLine("1-Sport\n" + "2-Scoter\n");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim sport");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye - a.Hscm1;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye - a.Hscm2;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
}
break;
case 2:
Console.WriteLine("Secim scoter");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye - a.Hscm3;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye - a.Hscm4;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
default:
break;
}
break;
}
break;
case 2:
Console.WriteLine("seciminiz Bmw");
Console.WriteLine("Tür seciniz!!");
Console.WriteLine("1-Sport\n" + "2-Scoter\n");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim sport");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye - a.Hscm5;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye - a.Hscm6;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
}
break;
case 2:
Console.WriteLine("Secim scoter");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye - a.Hscm7;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye - a.Hscm8;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
default:
break;
}
break;
}
break;
default:
Console.WriteLine("değer aralığı dışı");
break;
}
}
public static void Syap(Motor a) // reuse the Motor from Main so the entered balance carries over
{
Console.WriteLine("motor türü giriniz");
Console.WriteLine("{0}{1}", "-1-Honda\n", "-2-Bmw\n");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("seciminiz Honda");
Console.WriteLine("Tür seciniz!!");
Console.WriteLine("1-Sport\n" + "2-Scoter\n");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim sport");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye + a.Hscm1;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye + a.Hscm2;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
}
break;
case 2:
Console.WriteLine("Secim scoter");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye + a.Hscm3;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye + a.Hscm4;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
default:
break;
}
break;
}
break;
case 2:
Console.WriteLine("seciminiz Bmw");
Console.WriteLine("Tür seciniz!!");
Console.WriteLine("1-Sport\n" + "2-Scoter\n");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim sport");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye + a.Hscm5;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye + a.Hscm6;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
}
break;
case 2:
Console.WriteLine("Secim scoter");
Console.WriteLine("Model sec ");
Console.WriteLine("1 model a");
Console.WriteLine("2 model b ");
a.secim = int.Parse(Console.ReadLine());
switch (a.secim)
{
case 1:
Console.WriteLine("Secim model a");
a.Bakiye = a.Bakiye + a.Hscm7;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
case 2:
Console.WriteLine("Secim model b");
a.Bakiye = a.Bakiye + a.Hscm8;
Console.WriteLine("Bakiye" + a.Bakiye);
break;
default:
break;
}
break;
}
break;
default:
Console.WriteLine("değer aralığı dışı");
break;
}
}
}
}
|
cdba73fe3e9ecc15f792ba03550d37651ef80600
|
[
"C#"
] | 1 |
C#
|
canmadenn37/oppmotobayi
|
77b6d1ba7ccae11ad808ae12cf44abfd64bfaa65
|
f1c10f7834ed539aab1e5eb3ed3768fc1f2a27b5
|
refs/heads/master
|
<file_sep><?php
/**
* SermonAudioAPI connects to the sermonaudio.com json api via PHP
*
* @author <NAME>
* @package SermonAudioAPI
* @version 0.2.2
*/
namespace CarlosRios;
class SermonAudioAPI {
/**
* API key provided by Sermon Audio
*
* @access private
* @since 0.1
*
* @var string
*/
private $api_key;
/**
* Sermon Audio's API URL
*
* @access private
* @since 0.1
*
* @var string
*/
private $base_api_url = 'https://www.sermonaudio.com/api/%1$s?apikey=%2$s&%3$s';
/**
* Sermon Audio request route url
*
* @access public
* @since 0.2
*
* @var string
*/
public $request_route;
/**
* Call the construct for safety
*
* @access public
* @since 0.1
*/
public function __construct()
{
}
/**
* Sets the api key in the class
*
* @access public
* @since 0.1
*
* @param string $key API key provided by Sermon Audio
* @return object
*/
public function setApiKey( $key )
{
$this->api_key = $key;
return $this;
}
/**
* Sanitizes strings
*
* @access public
* @since 0.2
*
* @param string $data - a string to sanitize
* @return string
*/
public function sanitize_string( $data )
{
$esc_data = htmlentities( $data, ENT_QUOTES );
return $esc_data;
}
/**
* Builds out the sermons api route and stores it in the class
*
* @access public
* @since 0.1
*
* @param array $args - a list of allowed variables to query for.
* @return array | false
*/
public function sermonsApiRoute( $args = array() )
{
// Stores the request variables that will be added to the API request
$vars = array();
// Sets the speaker name and the category as speaker
if( isset( $args['speaker'] ) ) {
$vars['category'] = 'speaker';
$vars['item'] = $this->sanitize_string( $args['speaker'] );
}
// Sets the event as the category
if( isset( $args['event'] ) ) {
$vars['category'] = 'eventtype';
$vars['item'] = $this->sanitize_string( $args['event'] );
}
// Sets the series as the category
if( isset( $args['series'] ) ) {
$vars['category'] = 'series';
$vars['item'] = $this->sanitize_string( $args['series'] );
}
// Sets the page number
if( isset( $args['page'] ) ) {
$vars['page'] = $this->sanitize_string( $args['page'] );
}
// The amount of sermons to get per page
if( isset( $args['sermons_per_page'] ) ) {
$vars['pagesize'] = abs( $args['sermons_per_page'] );
}
// Gets the sermons by the year
if( isset( $args['year'] ) ) {
$vars['year'] = abs( $args['year'] );
}
// Build the request and return the route
return $this->buildRequestRoute( 'saweb_get_sermons.aspx', $vars );
}
/**
* Returns the sermons allowed for this API Key
*
* @access public
* @since 0.1
*
* @param array $args - a list of allowed variables to query for.
* @return array | false
*/
public function getSermons( $args )
{
$this->sermonsApiRoute( $args );
$data = $this->requestData();
// Return false if no sermons are returned
if( ! $data ) {
return false;
}
// Split the data into chunks
if( isset( $args['chunks'] ) && !empty( $data ) && is_array( $data ) ) {
$data = array_chunk( $data, abs( $args['chunks'] ) );
}
return $data;
}
/**
* Builds out the speakers api route and stores it in the class
*
* @access public
* @since 0.2
*
* @return string
*/
public function speakersApiRoute()
{
return $this->buildRequestRoute( 'saweb_get_speakers.aspx' );
}
/**
* Returns a list objects of the speakers for this church
*
* @access public
* @since 0.1
*
* @return array | false
*/
public function getSpeakers()
{
$this->speakersApiRoute();
$data = $this->requestData();
return $data;
}
/**
* Builds out the events api route and stores it in the class
*
* @access public
* @since 0.2
*
* @return string
*/
public function eventsApiRoute()
{
return $this->buildRequestRoute( 'saweb_get_eventtypes.aspx' );
}
/**
* Returns a list of objects for the events in this church
*
* @access public
* @since 0.1
*
* @return array | false
*/
public function getEvents()
{
$this->eventsApiRoute();
$data = $this->requestData();
return $data;
}
/**
* Builds out the languages api route and stores it in the class
*
* @access public
* @since 0.2
*
* @return array | false
*/
public function languagesApiRoute()
{
return $this->buildRequestRoute( 'saweb_get_languages.aspx' );
}
/**
* Returns a list of objects for the languages in this church
*
* @access public
* @since 0.1
*
* @return array | false
*/
public function getLanguages()
{
$this->languagesApiRoute();
$data = $this->requestData();
return $data;
}
/**
* Builds out the total sermons api route and stores it in the class
*
* @access public
* @since 0.2.1
*
* @param array $args - arguments accepted in sermonsApiRoute
* @see sermonsApiRoute
* @return int | false
*/
public function totalSermonsApiRoute( $args = array() )
{
$args['page'] = 'total';
return $this->sermonsApiRoute( $args );
}
/**
* Helper function that returns the total amount of sermons
* for a speaker, year, eventtype, or series
*
* @access public
* @since 0.2.1
*
* @param array $args - arguments accepted in sermonsApiRoute
* @see sermonsApiRoute
* @return int | false
*/
public function getTotalSermons( $args = array() )
{
$this->totalSermonsApiRoute( $args );
$sermons = $this->requestData();
return !empty( $sermons->total ) ? abs( $sermons->total ) : false;
}
/**
* Builds the request route and stores it in the object
*
* @access private
* @since 0.2
*
* @param string $request_endpoint The endpoint for the API request
* @param array $api_request_vars Extra variables to add to the request
* @return string [description]
*/
private function buildRequestRoute( $request_endpoint, $api_request_vars = array() )
{
// Make sure api key is set first
if( ! isset( $this->api_key ) || empty( $this->api_key ) ) {
throw new \Exception( 'API key is required. No API key has been set.', 1);
return false;
}
// Request endpoint must be a string
if( ! is_string( $request_endpoint ) ){
return false;
}
// Setup the variables for the request
$api_request_vars = http_build_query( $api_request_vars );
// Setup the entire request url
return $this->request_route = sprintf( $this->base_api_url, $request_endpoint, $this->api_key, $api_request_vars );
}
/**
* Handy method that makes the request for data from the Sermon Audio API
*
* @access private
* @since 0.1
*
* @return false | array
*/
private function requestData()
{
if( !empty( $this->request_route ) && is_string( $this->request_route ) ) {
// Get the contents and convert them to PHP useable objects
$contents = json_decode( file_get_contents( $this->request_route ) );
// Return the contents of the request
return $contents;
}
return false;
}
}
<file_sep>[banner]: https://raw.githubusercontent.com/CarlosRios/sermonaudio-php-api/master/banner.jpg "SermonAudio API"
## SermonAudio PHP API
The SermonAudio PHP API allows you to connect to sermon audio easily via your php website or application.
### Uses
If you're already using SermonAudio.com to manage your church's sermons, speakers, sermon series, and events, then using the SermonAudio PHP API will allow you to create web applications or websites with the data you're already providing to SermonAudio.com. This API is great for pulling in those sermons, speakers, series, and events. A demo of the SermonAudio PHP API can be found [here](http://crios.me/sermonaudio-api).
#### Connecting to the API
Connecting to the API requires that you first obtain a SermonAudio API key which you can obtain [here](https://www.sermonaudio.com/secure/members_stats.asp).
Once you have an API key, all you need to do is create a new instance of the API and add your key.
```php
$sermon_audio = new CarlosRios\SermonAudioAPI;
$sermon_audio->setApiKey( '<KEY>' );
```
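
Once the key is set, the same object can request data. The sketch below is illustrative only — the speaker name and page size are assumptions — but the argument names are the ones accepted by `getSermons()` in this class:

```php
// Fetch the first page of up to 25 sermons for a hypothetical speaker.
$sermons = $sermon_audio->getSermons( array(
	'speaker'          => 'John Doe',
	'page'             => 1,
	'sermons_per_page' => 25,
) );

if ( ! empty( $sermons ) ) {
	foreach ( $sermons as $sermon ) {
		// Each entry is an object decoded from the SermonAudio JSON response.
		var_dump( $sermon );
	}
}
```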
<file_sep><?php
/**
* Gets all the sermons for this church
*
* @author <NAME>
* @package CarlosRios/SermonAudioAPI
* @subpackage Examples/All Sermons
* @version 1.1
*/
// Include the api
include_once( '../SermonAudioAPI.php' );
/**
* Gets the sermons from SermonAudio
* and displays them as JSON.
*
* @todo Add count parameter to grab a specific amount of sermons
*
* @since 1.1
* @return void
*/
function sermonaudio_sermons_as_json()
{
// Connect to the api. You must provide your own API Key from SermonAudio
$sa_api = new CarlosRios\SermonAudioAPI;
$sa_api->setApiKey( '<KEY>' );
// Get the total number of sermons and create sermons array
$total_sermons = $sa_api->getTotalSermons();
$sermons = array();
// If there are more than 100 sermons we will need to
// make multiple requests because currently 100 is the most
// sermons allowed per request by SermonAudio.com
if( $total_sermons > 100 ) {
$total_pages = ( $total_sermons / 100 );
// Get the data for all the sermons, requires multiple requests
// depending on how many sermons you have stored in sermon audio.
// Each request gets 100 sermons.
for ( $i=0; $i <= $total_pages + 1; $i++ ) {
$args = array(
'page' => $i,
'sermons_per_page' => 100,
);
$sermons[] = $sa_api->getSermons( $args );
}
} else {
// Get the sermons, will only grab up to 100
$args = array(
'sermons_per_page' => 100,
);
$sermons[] = $sa_api->getSermons( $args );
}
// Continue only if the sermons returned an array. Otherwise return false.
if( ! empty( $sermons ) && is_array( $sermons ) ) {
$all_sermons = array();
foreach( (array) $sermons as $sermon_set ) {
$all_sermons = array_merge( array_reverse( $sermon_set ), $all_sermons );
}
echo json_encode( $all_sermons );
} else {
return false;
}
// Exit the script
exit;
}
// Display the sermons
sermonaudio_sermons_as_json();
|
af4e3479b36bff862e00041c21aaaddf0613e5c8
|
[
"Markdown",
"PHP"
] | 3 |
PHP
|
CarlosRios/sermonaudio-php-api
|
424424902a41fdc747e1ba9911133c68c9a64eed
|
4c4b69448adac392a19fdd8e63c78e2a450ddbda
|
refs/heads/master
|
<repo_name>haiplk/Dockerfiles<file_sep>/mysql-56/docker-entrypoint-initdb.d/setup-sstmp.sh
#!/bin/sh
# Create the /etc/ssmtp/ssmtp.conf file for sending reports, using
# values specified in the environment.
cat << EOM > /etc/ssmtp/ssmtp.conf
root=$SSMTP_SENDER_ADDRESS
mailhub=$SSMTP_HOST
AuthUser=$SMTP_USER
AuthPass=$<PASSWORD>
UseTLS=YES
UseSTARTTLS=YES
EOM
<file_sep>/mysql-56/README.md
When using this image, you must either specify the `MYSQL_ROOT_PASSWORD`
environment variable or put a file containing the (shell-compatible) variable
definition into the `docker-env-vars.d/` folder.
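
For example, a minimal variable file could look like this (the filename is only an illustration — any file placed in that folder works):

```sh
# docker-env-vars.d/mysql.env
MYSQL_ROOT_PASSWORD=change-me
```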
<file_sep>/mysql-56-custom/Dockerfile
FROM fbender/mysql-56
#EOF
<file_sep>/mysql-56/docker-compose.yml
# link with " external_links:\n - db-mysql:mysql"
db-mysql:
container_name: db-mysql
build: .
# image: fbender/mysql-56
restart: always
volumes_from:
- volume-db-mysql
volume-db-mysql:
container_name: volume-db-mysql
image: mysql:5.6
# image: fbender/mysql-56
entrypoint: /bin/echo mysql-56 Data-only volume container for db-mysql
# EOF
<file_sep>/mysql-56/docker-entrypoint-initdb.d/setup-maintenance-file.sh
#!/bin/sh
echo "[client]
user=root
password=${<PASSWORD>}
" > /etc/mysql/maintenance.cnf
# for automysqlbackup compliance
ln -s /etc/mysql/maintenance.cnf /etc/mysql/debian.cnf
if [ -n "$MYSQL_ADMIN_EMAIL" ]
then
sed -i.bak "s/MAILADDR=.*/MAILADDR=\"${MYSQL_ADMIN_EMAIL}\"/g" /etc/default/automysqlbackup
fi
#EOF
<file_sep>/mysql-56/start.sh
#!/bin/sh
docker-compose build --pull db-mysql
docker-compose up -d --no-recreate
#EOF
|
06f1cccb06d2dbbaac29f51d7cfb91f156f29f64
|
[
"Markdown",
"Dockerfile",
"YAML",
"Shell"
] | 6 |
Shell
|
haiplk/Dockerfiles
|
fc7e94bfcdc6cb6020ec32f393cf02311645d853
|
f3a8c8e1321ec1be81fc0e1f3209c5f411f92b8e
|
refs/heads/master
|
<file_sep>using System;
namespace exercise_2
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Введите слово!");
String word = Console.ReadLine();
switch (word)
{
case "мяу":
Console.WriteLine("Покорми кота!");
break;
case "гав":
Console.WriteLine("Погуляй с собакой!");
break;
default:
Console.WriteLine("я не знаю таких животных");
break;
}
}
}
}
|
b5776c0843db4f28872e13a92ecf78dfd93138be
|
[
"C#"
] | 1 |
C#
|
Ratimir989/exercise_2
|
1da9832bdbade3cd9988c4be3edf10fe79b7536c
|
421331ed48ae2f867383d49653d5ed8ff5d6b5cf
|
refs/heads/master
|
<repo_name>kain647/product-page<file_sep>/README.md
Project can be found here: https://product-page-kappa.vercel.app/
<file_sep>/src/product/styled.js
import styled from "styled-components";
export const Container = styled.div`
display: flex;
width: 100%;
height: 100%;
justify-content: center;
align-items: center;
font-family: "Poppins", sans-serif;
background-image: ${({ background }) => background};
`;
export const ProductPhoto = styled.div`
display: flex;
`;
export const ProductInfo = styled.div`
display: flex;
flex-direction: column;
`;
export const Title = styled.div`
display: flex;
font-weight: 700;
text-transform: uppercase;
letter-spacing: 1px;
font-size: 13px;
line-height: 1.2;
color: #fff;
margin-bottom: 10px;
`;
export const Subtitle = styled.h2`
display: flex;
font-weight: 800;
font-size: 34px;
line-height: 1.2;
color: #fff;
margin: 0;
margin-bottom: 10px;
`;
export const Price = styled.h4`
display: flex;
align-items: center;
font-weight: 500;
font-size: 26px;
line-height: 1.2;
color: #fff;
margin: 0;
margin-bottom: 30px;
span {
text-decoration: line-through;
font-size: 20px;
opacity: 0.6;
margin-left: 15px;
}
`;
export const Description = styled.h4`
display: flex;
align-items: center;
font-weight: 600;
font-size: 18px;
line-height: 1.2;
color: #fff;
span {
font-weight: 600;
font-size: 18px;
line-height: 1.2;
color: #fff;
margin-left: 25px;
opacity: 0.5;
}
`;
export const Info = styled.p`
display: flex;
width: 360px;
font-weight: 400;
font-size: 16px;
line-height: 1.7;
color: #fff;
margin: 0;
`;
export const ColorBox = styled.div`
display: flex;
box-sizing: border-box;
width: 40px;
height: 40px;
cursor: pointer;
margin-right: 10px;
border-radius: 4px;
border: ${({ active }) => (active ? "3px solid #434343" : "none")};
transform: ${({ active }) => (active ? "scale(1.1)" : "none")};
img {
//border-radius: 4px;
//border: 3px solid transparent;
}
:last-child {
margin-right: 0;
}
`;
export const Choose = styled.h5`
display: flex;
font-weight: 600;
font-size: 18px;
line-height: 1.2;
color: #fff;
margin-bottom: 20px;
`;
export const Upholstery = styled.div`
display: flex;
box-sizing: border-box;
margin-bottom: 50px;
//box-shadow: 0 12px 35px 0 rgb(16 39 112 / 25%);
`;
export const Cart = styled.button`
display: flex;
align-items: center;
justify-content: center;
font-weight: 500;
font-size: 14px;
line-height: 2;
height: 48px;
border-radius: 4px;
border: none;
width: 210px;
letter-spacing: 1px;
color: #fff;
background-color: #944852;
box-shadow: 0 6px 15px 0 rgb(16 39 112 / 15%);
cursor: pointer;
svg {
margin-right: 10px;
}
:hover {
box-shadow: 0 12px 35px 0 rgb(16 39 112 / 25%);
background-color: #000;
}
`;
<file_sep>/src/product/index.js
import React, { useState } from "react";
import { FiShoppingCart } from "react-icons/fi";
import {
Container,
ProductPhoto,
ProductInfo,
Title,
Subtitle,
Price,
Description,
Choose,
Upholstery,
ColorBox,
Info,
Cart,
} from "./styled";
const Product = () => {
const colors = [
{
preview: `images/mat1.jpg`,
previewBig: `images/ch1.png`,
background: `linear-gradient(196deg, #f1a9a9, #e66767)`,
},
{
preview: `images/mat2.jpg`,
previewBig: `images/ch2.png`,
background: `linear-gradient(196deg, #4c4c4c, #262626)`,
},
{
preview: `images/mat3.jpg`,
previewBig: `images/ch3.png`,
background: `linear-gradient(196deg, #8a9fb2, #5f7991)`,
},
{
preview: `images/mat4.jpg`,
previewBig: `images/ch4.png`,
background: `linear-gradient(196deg, #97afc3, #6789a7)`,
},
{
preview: `images/mat5.jpg`,
previewBig: `images/ch5.png`,
background: `linear-gradient(196deg, #afa6a0, #8c7f76)`,
},
{
preview: `images/mat6.jpg`,
previewBig: `images/ch6.png`,
background: `linear-gradient(196deg, #aaadac, #838786)`,
},
];
const [curent, setCurent] = useState(colors[0]);
return (
<Container background={curent.background}>
<ProductPhoto>
<img src={curent.previewBig} />
</ProductPhoto>
<ProductInfo>
<Title>Modern chair</Title>
<Subtitle><NAME></Subtitle>
<Price>
$174<span>$237</span>
</Price>
<Description>
Description<span>Details</span>
</Description>
<Info>
The chair construction is made of ash tree. Upholstery and wood color
at customer's request.
</Info>
<Choose>Choose upholstery:</Choose>
<Upholstery>
{colors.map((color) => {
return (
<Color
key={color.preview} // stable key for React list rendering
{...color}
onClick={() => setCurent(color)}
active={color.preview === curent.preview}
/>
);
})}
</Upholstery>
<Cart>
<FiShoppingCart />
Add To Cart
</Cart>
</ProductInfo>
</Container>
);
};
const Color = (props) => {
const { preview, onClick, active } = props;
return (
<ColorBox onClick={onClick} active={active}>
<img src={preview} />
</ColorBox>
);
};
export default Product;
|
431062b939aee2a583faaf199da2935c048910af
|
[
"Markdown",
"JavaScript"
] | 3 |
Markdown
|
kain647/product-page
|
b4f0c7594d28ef85735d5e88109bf8359cc735ca
|
acd77a37d86399aeceea77a4977b986c048f2489
|
refs/heads/master
|
<file_sep>package cn.itcast.constant;
public class TransactionConstants {
public static final String USER_ID = "id";
public static final String USER_ROLE = "role";
public static final String USER_USERNAME = "username";
public static final String USER_NICKNAME = "nickname";
public static final String USER_SESSION = "USER_SESSION";
public static final String SIDE_BUY = "BUY";
public static final String SIDE_SELL = "SELL";
public static final String POSITIONSIDE_LONG = "LONG";
public static final String POSITIONSIDE_SHORT = "SHORT";
public static final String POSITIONSIDE_BOTH = "BOTH";
public static final String TYPE_LIMIT = "LIMIT";
public static final String TYPE_MARKET = "MARKET";
public static final String TYPE_STOP = "STOP";
public static final String TYPE_TAKE_PROFIT = "TAKE_PROFIT";
public static final String TYPE_STOP_MARKET = "STOP_MARKET";
public static final String TYPE_TAKE_PROFIT_MARKET = "TAKE_PROFIT_MARKET";
public static final String WORKINGTYPE_MARK_PRICE = "MARK_PRICE";
public static final String WORKINGTYPE_CONTRACT_PRICE = "CONTRACT_PRICE";
public static final String TIMEINFORCE_GTC = "GTC";
public static final String BIAN_SYMBOL = "symbol";
public static final String BIAN_POSITIONAMT = "positionAmt";
public static final String BIAN_POSITIONSIDE = "positionSide";
public static final String BIAN_ORDERID = "orderId";
public static final String SYSTEM_STATUS = "status";
public static final String SYSTEM_MSG = "msg";
public static final String SYSTEM_STATUS_OK = "ok";
public static final String SYSTEM_STATUS_ERROR = "error";
}
<file_sep>package cn.itcast.utils;
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import cn.itcast.constant.TransactionConstants;
import cn.itcast.pojo.Mail;
public class ToolsUtils {
private static Map<String, Float> curPrice = new HashMap<String, Float>();
private static Map<String, String> userPositionAmt = new HashMap<String, String>();
// private static Map<String, String> userLeverage = new HashMap<String, String>();
private static List<Properties> platformMail;
private static int offset;
static {
platformMail = new ArrayList<Properties>();
Properties qqProps = new Properties();
qqProps.put("mail.smtp.host", "smtp.qq.com");
qqProps.put("mail.smtp.auth", "true");
qqProps.put("mail.smtps.timeout", 10000);
qqProps.put("mail.smtps.connectiontimeout", 10000);
qqProps.put("mail.smtp.port", "587");
qqProps.put("mail.user", "<EMAIL>");
qqProps.put("mail.password", "<PASSWORD>");
platformMail.add(qqProps);
Properties wyProps = new Properties();
wyProps.put("mail.smtp.host", "smtp.163.com");
wyProps.put("mail.smtp.auth", "true");
wyProps.put("mail.smtps.timeout", 10000);
wyProps.put("mail.smtps.connectiontimeout", 10000);
wyProps.put("mail.user", "<EMAIL>");
wyProps.put("mail.password", "<PASSWORD>");
platformMail.add(wyProps);
Properties sinaProps = new Properties();
sinaProps.put("mail.smtp.host", "smtp.sina.com");
sinaProps.put("mail.smtp.auth", "true");
sinaProps.put("mail.smtps.timeout", 10000);
sinaProps.put("mail.smtps.connectiontimeout", 10000);
sinaProps.put("mail.user", "<EMAIL>");
sinaProps.put("mail.password", <PASSWORD>-");
platformMail.add(sinaProps);
offset = 0;
}
public static Map<String, Float> getCurPrice() {
return curPrice;
}
public static Float getCurPriceByKey(String key) {
return curPrice.get(key);
}
public static void setCurPrice(Map<String, Float> curPrice) {
ToolsUtils.curPrice = curPrice;
}
public static void setCurPrice(String key, Float value) {
ToolsUtils.curPrice.put(key, value);
}
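    // Despite the name, this rotates through the configured SMTP accounts in round-robin order.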
public static Properties getRandomPlat() {
return platformMail.get(offset++ % platformMail.size());
}
public static String getUserPositionAmt(String key) {
return ToolsUtils.userPositionAmt.get(key);
}
public static void setUserPositionAmt(String key, String value) {
ToolsUtils.userPositionAmt.put(key, value);
}
// public static String getUserLeverage(String key) {
// return ToolsUtils.userLeverage.get(key);
// }
//
// public static void setUserLeverage(String key, String value) {
// ToolsUtils.userLeverage.put(key, value);
// }
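    // Truncate an order quantity to the number of decimal places accepted for the given symbol (rounding toward zero).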
public static String formatQuantity(String symbol, Float value) {
BigDecimal number = new BigDecimal(""+ value);
String result;
if(symbol.toUpperCase().equals("TRXUSDT") || symbol.toUpperCase().equals("XLMUSDT")
|| symbol.toUpperCase().equals("ADAUSDT")) {
result = number.setScale(0, BigDecimal.ROUND_DOWN).toString();
} else if(symbol.toUpperCase().equals("XRPUSDT") || symbol.toUpperCase().equals("EOSUSDT")
|| symbol.toUpperCase().equals("ETCUSDT") || symbol.toUpperCase().equals("LINKUSDT")
|| symbol.toUpperCase().equals("BNBUSDT") || symbol.toUpperCase().equals("ATOMUSDT")
|| symbol.toUpperCase().equals("NEOUSDT") || symbol.toUpperCase().equals("XTZUSDT")
|| symbol.toUpperCase().equals("QTUMUSDT") || symbol.toUpperCase().equals("ONTUSDT")) {
result = number.setScale(1, BigDecimal.ROUND_DOWN).toString();
} else {
result = number.setScale(3, BigDecimal.ROUND_DOWN).toString();
}
return result;
}
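    // Truncate a price to the number of decimal places accepted for the given symbol (rounding toward zero).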
public static String formatPrice(String symbol, Float value) {
BigDecimal number = new BigDecimal(""+ value);
String result;
if(symbol.toUpperCase().equals("TRXUSDT") || symbol.toUpperCase().equals("XLMUSDT")
|| symbol.toUpperCase().equals("ADAUSDT")) {
result = number.setScale(5, BigDecimal.ROUND_DOWN).toString();
} else if(symbol.toUpperCase().equals("XRPUSDT") || symbol.toUpperCase().equals("ONTUSDT")) {
result = number.setScale(4, BigDecimal.ROUND_DOWN).toString();
} else if(symbol.toUpperCase().equals("EOSUSDT") || symbol.toUpperCase().equals("ETCUSDT")
|| symbol.toUpperCase().equals("LINKUSDT") || symbol.toUpperCase().equals("BNBUSDT")
|| symbol.toUpperCase().equals("ATOMUSDT") || symbol.toUpperCase().equals("NEOUSDT")
|| symbol.toUpperCase().equals("QTUMUSDT") || symbol.toUpperCase().equals("XTZUSDT")) {
result = number.setScale(3, BigDecimal.ROUND_DOWN).toString();
} else {
result = number.setScale(2, BigDecimal.ROUND_DOWN).toString();
}
return result;
}
public static Mail generateMail(Integer uid, String symbol, String subject, String content,
Integer state, String createTime, String updateTime) {
Mail mail = new Mail();
mail.setUid(uid);
mail.setSymbol(symbol);
mail.setSubject(subject);
mail.setContent(content);
mail.setState(state);
mail.setCreateTime(createTime);
mail.setUpdateTime(updateTime);
return mail;
}
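    // Map an order side to a position side: when firstsd is true, BUY -> LONG and SELL -> SHORT (swapped if reverse); when false, BOTH; when null, an empty string.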
public static String generatePositionSide(Boolean firstsd, boolean reverse, String side) {
String result = "";
if(firstsd != null) {
if(firstsd) {
if(side.equals(TransactionConstants.SIDE_BUY)) {
if(!reverse) {
result = TransactionConstants.POSITIONSIDE_LONG;
} else {
result = TransactionConstants.POSITIONSIDE_SHORT;
}
} else if(side.equals(TransactionConstants.SIDE_SELL)) {
if(!reverse) {
result = TransactionConstants.POSITIONSIDE_SHORT;
} else {
result = TransactionConstants.POSITIONSIDE_LONG;
}
}
} else {
result = TransactionConstants.POSITIONSIDE_BOTH;
}
}
return result;
}
public static String parsePositionSide(String side) {
String result = "";
switch(side) {
case TransactionConstants.POSITIONSIDE_LONG :
result = "双向多头持仓";
break;
case TransactionConstants.POSITIONSIDE_SHORT :
result = "双向空头持仓";
break;
case TransactionConstants.POSITIONSIDE_BOTH :
result = "单向持仓";
break;
}
return result;
}
public static void main(String[] args) {
System.out.println(formatQuantity("ETHUSDT", 61.97999201356768f));
System.out.println(formatQuantity("ETCUSDT", 61.97999201356768f));
System.out.println(formatQuantity("TRXUSDT", 61.97999201356768f));
System.out.println(formatPrice("ETHUSDT", 61.97999201356768f));
System.out.println(formatPrice("ATOMUSDT", 61.97999201356768f));
System.out.println(formatPrice("XRPUSDT", 61.97999201356768f));
System.out.println(formatPrice("TRXUSDT", 61.97999201356768f));
}
}
<file_sep>package cn.itcast.service;
import java.util.List;
import cn.itcast.pojo.Config;
public interface ConfigService {
public Config findConfigByUid(Config config);
public List<Config> findConfigsByUid(Integer uid);
public List<Config> findConfigFlag(Config config);
public int updateConfig(Config config);
}
<file_sep>package cn.itcast.dao;
import java.util.List;
import org.apache.ibatis.annotations.Param;
import cn.itcast.pojo.Plan;
import cn.itcast.pojo.User;
public interface UserMapper {
public User findUser(@Param("username") String username,
@Param("password") String password);
public User findUserById(@Param("id") int id);
public int updateUserById(User user);
public List<User> findUserByUid(@Param("uid") int uid);
public List<User> findUserByIds(List<String> ids);
public User checkPermission(@Param("id") String id, @Param("relaid") String relaid);
public User findRelaUser(@Param("id") int id);
}
<file_sep>package cn.itcast.task;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import cn.itcast.constant.TransactionConstants;
import cn.itcast.dao.MailMapper;
import cn.itcast.dao.MonitorMapper;
import cn.itcast.pojo.Mail;
import cn.itcast.pojo.Monitor;
import cn.itcast.service.OrderService;
import cn.itcast.utils.ToolsUtils;
public class MonitorUser implements Runnable {
private static final Logger logger = LoggerFactory.getLogger(MonitorUser.class);
private static SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
@Autowired
private MonitorMapper monitorMapper;
@Autowired
private MailMapper mailMapper;
@Autowired
private OrderService orderService;
public MonitorUser() {
new Thread(this).start();
}
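    // Poll each monitored account's position risk every 30 seconds and queue a mail notification whenever a position amount changes.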
@Override
public void run() {
try {
Thread.sleep(5000);
} catch (InterruptedException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
while(true) {
try {
List<Monitor> monitors = monitorMapper.findMoniteInfo();
for(Monitor monitor : monitors) {
String temp = orderService.positionRisk(monitor.getApikey(), monitor.getSecretkey());
List<String> lists = JSON.parseArray(temp, String.class);
for(String list : lists) {
JSONObject json = JSON.parseObject(list);
String name = monitor.getNickname() + "(" + monitor.getUsername() + ")" + json.getString(TransactionConstants.BIAN_SYMBOL) + "的"
+ ToolsUtils.parsePositionSide(json.getString(TransactionConstants.BIAN_POSITIONSIDE));
String value = ToolsUtils.getUserPositionAmt(name);
if(StringUtils.isEmpty(value)) {
ToolsUtils.setUserPositionAmt(name, json.getString(TransactionConstants.BIAN_POSITIONAMT));
} else {
if(!value.equals(json.getString(TransactionConstants.BIAN_POSITIONAMT))) {
ToolsUtils.setUserPositionAmt(name, json.getString(TransactionConstants.BIAN_POSITIONAMT));
String mails = monitor.getMails();
if(StringUtils.isNotEmpty(mails)) {
String[] mailIds = mails.split(",");
for(String mailId : mailIds) {
Mail mail = ToolsUtils.generateMail(Integer.parseInt(mailId), json.getString(TransactionConstants.BIAN_SYMBOL),
name + "的仓位发生变化:" + value + "->" + json.getString(TransactionConstants.BIAN_POSITIONAMT),
name + "的仓位发生变化:" + value + "->" + json.getString(TransactionConstants.BIAN_POSITIONAMT),
0, format.format(new Date()), format.format(new Date()));
mailMapper.insertMail(mail);
}
}
}
}
// value = ToolsUtils.getUserLeverage(name);
// if(StringUtils.isEmpty(value)) {
// ToolsUtils.setUserLeverage(name, json.getString("leverage"));
// } else {
// if(!value.equals(json.getString("leverage"))) {
// ToolsUtils.setUserLeverage(name, json.getString("leverage"));
// String mails = monitor.getMails();
// if(StringUtils.isNotEmpty(mails)) {
// String[] mailIds = mails.split(",");
// for(String mailId : mailIds) {
// Mail mail = ToolsUtils.generateMail(Integer.parseInt(mailId), json.getString(TransactionConstants.BIAN_SYMBOL),
// name + "的杠杆发生变化:" + value + "->" + json.getString("leverage"),
// name + "的杠杆发生变化:" + value + "->" + json.getString("leverage"),
// 0, format.format(new Date()), format.format(new Date()));
// mailMapper.insertMail(mail);
// }
// }
//
// }
// }
}
}
} catch(Exception e) {
e.printStackTrace();
} finally {
try {
Thread.sleep(1000 * 30);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
}
<file_sep>package cn.itcast.pojo;
public class User {
private Integer id;
private String username;
private String nickname;
private String password;
private String apiKey;
private String secretKey;
private String mail;
private String role;
private String relaids;
private Integer status;
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getNickname() {
return nickname;
}
public void setNickname(String nickname) {
this.nickname = nickname;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
        this.password = password;
}
public String getApiKey() {
return apiKey;
}
public void setApiKey(String apiKey) {
this.apiKey = apiKey;
}
public String getSecretKey() {
return secretKey;
}
public void setSecretKey(String secretKey) {
this.secretKey = secretKey;
}
public String getMail() {
return mail;
}
public void setMail(String mail) {
this.mail = mail;
}
public String getRole() {
return role;
}
public void setRole(String role) {
this.role = role;
}
public String getRelaids() {
return relaids;
}
public void setRelaids(String relaids) {
this.relaids = relaids;
}
public Integer getStatus() {
return status;
}
public void setStatus(Integer status) {
this.status = status;
}
}
<file_sep>package cn.itcast.back;
import cn.itcast.service.OrderService;
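// Runnable wrapper that replays a plan for another account by calling OrderService.follow on the background thread pool.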
public class FollowPlanTask implements Runnable {
private OrderService orderService;
private Integer id;
private String symbol;
private String first;
private String second;
private String third;
private String stop;
private String trigger;
private Integer compare;
private String trigger1;
private Integer compare1;
private Integer uid;
private String apiKey;
private String secretKey;
private Float curPrice;
public FollowPlanTask(OrderService orderService, Integer id, String symbol, String first, String second, String third, String stop,
String trigger, Integer compare, String trigger1, Integer compare1, Integer uid, String apiKey, String secretKey, Float curPrice) {
super();
this.orderService = orderService;
this.id = id;
this.symbol = symbol;
this.first = first;
this.second = second;
this.third = third;
this.stop = stop;
this.trigger = trigger;
this.compare = compare;
this.trigger1 = trigger1;
this.compare1 = compare1;
this.uid = uid;
this.apiKey = apiKey;
this.secretKey = secretKey;
this.curPrice = curPrice;
}
@Override
public void run() {
try {
orderService.follow(id, symbol, first, second, third, stop, trigger, compare, trigger1, compare1, uid, apiKey, secretKey, curPrice);
} catch (Exception e) {
e.printStackTrace();
}
}
}
<file_sep>package cn.itcast.back;
import cn.itcast.service.OrderService;
public class FollowStrategyStopTask implements Runnable {
private OrderService orderService;
private String symbol;
private String side;
private String stopPrice;
private Integer uid;
private String apiKey;
private String secretKey;
private String id;
public FollowStrategyStopTask(OrderService orderService, String symbol, String side, String stopPrice,
Integer uid, String apiKey, String secretKey, String id) {
super();
this.orderService = orderService;
this.symbol = symbol;
this.side = side;
this.stopPrice = stopPrice;
this.uid = uid;
this.apiKey = apiKey;
this.secretKey = secretKey;
this.id = id;
}
@Override
public void run() {
try {
orderService.strategyStop(symbol, side, stopPrice, uid, id, apiKey, secretKey);
} catch (Exception e) {
e.printStackTrace();
}
}
}
<file_sep>package cn.itcast.service.impl;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import cn.itcast.dao.UserMapper;
import cn.itcast.pojo.User;
import cn.itcast.service.UserService;
@Service
@Transactional
public class UserServiceImpl implements UserService {
@Autowired
private UserMapper userMapper;
public User findUser(String username, String password) {
return this.userMapper.findUser(username, password);
}
public User findUser(int id) {
return this.userMapper.findUserById(id);
}
public int updateUserById(User user) {
return userMapper.updateUserById(user);
}
public List<User> findUserByUid(int uid) {
return userMapper.findUserByUid(uid);
}
public List<User> findUserByIds(List<String> ids) {
return userMapper.findUserByIds(ids);
}
@Override
public User checkPermission(String id, String relaid) {
// TODO Auto-generated method stub
return userMapper.checkPermission(id, relaid);
}
@Override
public User findRelaUser(int id) {
// TODO Auto-generated method stub
return userMapper.findRelaUser(id);
}
}
<file_sep>package cn.itcast.client;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.util.concurrent.TimeUnit;
import okhttp3.OkHttpClient;
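// Shared OkHttp client for the Binance REST endpoints: short timeouts, relaxed TLS checks via SSLSocketClient, and an optional local proxy left commented out.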
public class HttpClient {
private static OkHttpClient.Builder builder = new OkHttpClient.Builder()
.readTimeout(10, TimeUnit.SECONDS)
.writeTimeout(10, TimeUnit.SECONDS)
.connectTimeout(5, TimeUnit.SECONDS)
// .proxy(new Proxy(Proxy.Type.HTTP, new InetSocketAddress("127.0.0.1", 1080)))
.sslSocketFactory(SSLSocketClient.getSSLSocketFactory())
.hostnameVerifier(SSLSocketClient.getHostnameVerifier());
public static OkHttpClient okHttpClient = builder.build();
public static String baseUrl = "https://fapi.binance.com";
public static String advancedUrl = "https://api.binance.com";
}
<file_sep>package cn.itcast.back;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
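// Fixed-size pool (15 threads) shared by the follow/cancel background tasks.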
public class ThreadPool {
private static ExecutorService fixedThreadPool = Executors.newFixedThreadPool(15);
public static void execute(Runnable runnable) {
fixedThreadPool.execute(runnable);
}
}
<file_sep>package cn.itcast.pojo;
public class Config {
private Integer id;
private Integer uid;
private String type;
private Float marketAmount;
private Integer limitAmount;
private Float maxLoss;
private Float tradeOffset;
private Float lossTriggerOffset;
private Float lossEntrustOffset;
private String lossWorkingType;
private Integer lossType;
private Integer rate;
private Integer autoTrade;
private Integer autoCancel;
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
public Integer getUid() {
return uid;
}
public void setUid(Integer uid) {
this.uid = uid;
}
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public Float getMarketAmount() {
return marketAmount;
}
public void setMarketAmount(Float marketAmount) {
this.marketAmount = marketAmount;
}
public Integer getLimitAmount() {
return limitAmount;
}
public void setLimitAmount(Integer limitAmount) {
this.limitAmount = limitAmount;
}
public Float getMaxLoss() {
return maxLoss;
}
public void setMaxLoss(Float maxLoss) {
this.maxLoss = maxLoss;
}
public Float getTradeOffset() {
return tradeOffset;
}
public void setTradeOffset(Float tradeOffset) {
this.tradeOffset = tradeOffset;
}
public Float getLossTriggerOffset() {
return lossTriggerOffset;
}
public void setLossTriggerOffset(Float lossTriggerOffset) {
this.lossTriggerOffset = lossTriggerOffset;
}
public Float getLossEntrustOffset() {
return lossEntrustOffset;
}
public void setLossEntrustOffset(Float lossEntrustOffset) {
this.lossEntrustOffset = lossEntrustOffset;
}
public String getLossWorkingType() {
return lossWorkingType;
}
public void setLossWorkingType(String lossWorkingType) {
this.lossWorkingType = lossWorkingType;
}
public Integer getLossType() {
return lossType;
}
public void setLossType(Integer lossType) {
this.lossType = lossType;
}
public Integer getRate() {
return rate;
}
public void setRate(Integer rate) {
this.rate = rate;
}
public Integer getAutoTrade() {
return autoTrade;
}
public void setAutoTrade(Integer autoTrade) {
this.autoTrade = autoTrade;
}
public Integer getAutoCancel() {
return autoCancel;
}
public void setAutoCancel(Integer autoCancel) {
this.autoCancel = autoCancel;
}
}
<file_sep>package cn.itcast.controller;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.servlet.http.HttpSession;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONArray;
import com.alibaba.fastjson.JSONObject;
import cn.itcast.constant.TransactionConstants;
import cn.itcast.pojo.User;
import cn.itcast.service.UserService;
@Controller
@RequestMapping("/User")
public class UserController {
@Autowired
private UserService userService;
@RequestMapping("/index")
public String index(Model model, HttpSession session) {
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
model.addAttribute(TransactionConstants.USER_ROLE, user.getRole());
model.addAttribute(TransactionConstants.USER_USERNAME, user.getUsername());
model.addAttribute(TransactionConstants.USER_NICKNAME, user.getNickname());
return "user";
}
    // User login
@RequestMapping(value = "/login")
public String login(String username, String password, Model model, HttpSession session) {
User user = userService.findUser(username, password);
if (user != null) {
session.setMaxInactiveInterval(-1);
session.setAttribute(TransactionConstants.USER_SESSION, user);
if(StringUtils.isNotEmpty(user.getApiKey()) && StringUtils.isNotEmpty(user.getSecretKey())) {
model.addAttribute(TransactionConstants.USER_ROLE, user.getRole());
model.addAttribute(TransactionConstants.USER_ID, user.getId());
model.addAttribute(TransactionConstants.USER_USERNAME, user.getUsername());
model.addAttribute(TransactionConstants.USER_NICKNAME, user.getNickname());
if(StringUtils.isNotEmpty(user.getRelaids())) {
List<String> idList = Arrays.asList(user.getRelaids().split(","));
List<User> users = userService.findUserByIds(idList);
if(users != null && users.size() > 0) {
JSONArray userInfos = new JSONArray();
for(User u : users) {
Map<String, String> temp = new HashMap<String, String>();
temp.put(TransactionConstants.USER_ID, "" + u.getId());
temp.put(TransactionConstants.USER_USERNAME, u.getUsername());
temp.put(TransactionConstants.USER_NICKNAME, u.getNickname());
userInfos.add(temp);
}
model.addAttribute("ids", userInfos.toString());
}
}
return "account";
} else {
return "user";
}
}
model.addAttribute(TransactionConstants.SYSTEM_MSG, "用户或密码错误,请重新输入");
return "index";
}
@RequestMapping(value = "/findUser")
@ResponseBody
public String findUser(HttpSession session) {
JSONObject result = new JSONObject();
try {
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
User userInfo = userService.findUser(user.getId());
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, JSON.toJSONString(userInfo));
} catch(Exception e) {
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
    // Log out
@RequestMapping(value = "/logout")
public String logout(HttpSession session) {
session.invalidate();
return "index";
}
@RequestMapping(value = "/update")
@ResponseBody
public String update(Integer id, String password, String apiKey, String secretKey, String mail, HttpSession session) {
JSONObject result = new JSONObject();
try {
User user = new User();
user.setId(id);
            user.setPassword(password);
user.setApiKey(apiKey);
user.setSecretKey(secretKey);
user.setMail(mail);
int number = userService.updateUserById(user);
if(number > 0) {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, "save successful");
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "save failed");
}
session.invalidate();
} catch(Exception e) {
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/findUserByUid", produces="text/html;charset=UTF-8")
@ResponseBody
public String findUserByUid(Integer uid, HttpSession session) {
JSONObject result = new JSONObject();
try {
List<User> userInfos = userService.findUserByUid(uid);
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, JSON.toJSONString(userInfos));
} catch(Exception e) {
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
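    // Switch the session to a related account after verifying the relation in the current user's relaids.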
@RequestMapping(value = "/changeUser", produces="text/html;charset=UTF-8")
@ResponseBody
public String changeUser(String id, String relaid, HttpSession session) {
JSONObject result = new JSONObject();
try {
User my = userService.checkPermission(id, "," + relaid + ",");
if(my != null) {
User relaUser = userService.findRelaUser(Integer.parseInt(relaid));
if(relaUser != null) {
session.setMaxInactiveInterval(-1);
session.setAttribute(TransactionConstants.USER_SESSION, relaUser);
Map<String, String> temp = new HashMap<String, String>();
temp.put(TransactionConstants.USER_ID, "" + relaUser.getId());
temp.put(TransactionConstants.USER_ROLE, relaUser.getRole());
temp.put(TransactionConstants.USER_USERNAME, relaUser.getUsername());
temp.put(TransactionConstants.USER_NICKNAME, relaUser.getNickname());
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, JSON.toJSONString(temp));
return result.toJSONString();
}
}
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "切换用户失败");
} catch(Exception e) {
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
}
<file_sep>package cn.itcast.controller;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import javax.servlet.http.HttpSession;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import cn.itcast.back.CancelPlanTask;
import cn.itcast.back.FollowPlanTask;
import cn.itcast.back.FollowStrategyStopTask;
import cn.itcast.back.FollowStrategyTask;
import cn.itcast.back.ThreadPool;
import cn.itcast.constant.TransactionConstants;
import cn.itcast.model.Result;
import cn.itcast.pojo.Config;
import cn.itcast.pojo.Plan;
import cn.itcast.pojo.User;
import cn.itcast.service.ConfigService;
import cn.itcast.service.OrderService;
import cn.itcast.utils.ToolsUtils;
@Controller
@RequestMapping("/Order")
public class OrderController {
private static final Logger logger = LoggerFactory.getLogger(OrderController.class);
@Autowired
private OrderService orderService;
@Autowired
private ConfigService configService;
private static SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
@RequestMapping(value = "/index")
public String index(Model model, HttpSession session) {
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
model.addAttribute(TransactionConstants.USER_ROLE, user.getRole());
model.addAttribute(TransactionConstants.USER_USERNAME, user.getUsername());
model.addAttribute(TransactionConstants.USER_NICKNAME, user.getNickname());
return "account";
}
@RequestMapping(value = "/plan")
@ResponseBody
public String plan(String symbol, String first, String second, String third, String stop, Integer level,
String trigger, Integer compare, String trigger1, Integer compare1, HttpSession session) {
JSONObject result = new JSONObject();
try {
Float curPrice = ToolsUtils.getCurPriceByKey(symbol);
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
String plan = orderService.plan(symbol, first, second, third, stop, trigger, compare, trigger1, compare1,
user.getId(), user.getApiKey(), user.getSecretKey(), curPrice, level);
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, plan);
if(user.getRole().indexOf("0") > -1) {
JSONObject jSONObject = JSON.parseObject(plan);
if(jSONObject.containsKey(TransactionConstants.USER_ID)) {
int id = jSONObject.getIntValue(TransactionConstants.USER_ID);
                    // Enter linked copy-order handling for follower accounts
if(id > 0) {
Config config = new Config();
config.setType(symbol);
                        config.setLossWorkingType("" + level);
List<Config> allConfig = configService.findConfigFlag(config);
for(Config c : allConfig) {
ThreadPool.execute(new FollowPlanTask(orderService, id, symbol, first, second, third, stop,
trigger, compare, trigger1, compare1, c.getUid(), c.getType(), c.getLossWorkingType(), curPrice));
}
}
}
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/save")
@ResponseBody
public String save(String symbol, String first, String second, String third, String stop, Integer level,
String trigger, Integer compare, String trigger1, Integer compare1, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
Plan plan = new Plan();
plan.setUid(user.getId());
plan.setPid(0);
plan.setSymbol(symbol);
plan.setFirst(Float.parseFloat(first));
plan.setSecond(Float.parseFloat(second));
plan.setThird(Float.parseFloat(third));
plan.setStop(Float.parseFloat(stop));
plan.setTrigger(StringUtils.isNotEmpty(trigger) ? Float.parseFloat(trigger) : 0f);
plan.setCompare(StringUtils.isNotEmpty(trigger) ? compare : 0);
plan.setTrigger1(StringUtils.isNotEmpty(trigger1) ? Float.parseFloat(trigger1) : 0f);
plan.setCompare1(StringUtils.isNotEmpty(trigger1) ? compare1 : 1);
plan.setState(5);
plan.setCreateTime(format.format(new Date()));
plan.setUpdateTime(format.format(new Date()));
plan.setType(0);
if(level != null) {
plan.setLevel(level);
} else {
plan.setLevel(1);
}
plan.setOrderIds("");
if(orderService.insertPlan(plan) > 0) {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "保存失败");
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/findAllPlans")
@ResponseBody
public String findAllPlans(HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
List<Plan> plans = orderService.findPlanByUid(user.getId());
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, JSON.toJSONString(plans));
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/findCachePlans")
@ResponseBody
public String findCachePlans(HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
List<Plan> plans = orderService.findCachePlanByUid(user.getId());
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, JSON.toJSONString(plans));
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/historyOrders")
@ResponseBody
public String historyOrders(Integer startTime, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
int level = 1;
String[] roles = user.getRole().split(",");
for(String role : roles) {
if(StringUtils.isNotEmpty(role)) {
int atomic = Integer.parseInt(role);
if(atomic < 6 && atomic > level) {
level = atomic;
}
}
}
List<Plan> plans = orderService.findPlanByTime(startTime, level);
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, JSON.toJSONString(plans));
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/fllowPlans", produces="text/html;charset=UTF-8")
@ResponseBody
public String fllowPlans(String symbol, HttpSession session) {
JSONObject result = new JSONObject();
try {
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
int level = 1;
String[] roles = user.getRole().split(",");
for(String role : roles) {
if(StringUtils.isNotEmpty(role)) {
int atomic = Integer.parseInt(role);
if(atomic < 6 && atomic > level) {
level = atomic;
}
}
}
List<Plan> plans = orderService.findFllowPlans(level);
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, JSON.toJSONString(plans));
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/cancelPlan")
@ResponseBody
public String cancelPlan(String symbol, String id, String state, String orderIds, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
int number = orderService.cancelPlan(user.getId(), symbol, id, orderIds, Integer.parseInt(state), user.getApiKey(), user.getSecretKey());
if(number > 0) {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, "canceled successful");
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "canceled failed");
}
if(user.getRole().indexOf("0") > -1) {
                // Enter linked order-cancel handling for follower accounts
List<Plan> plans = orderService.findPlansById(Integer.parseInt(id));
for(Plan plan : plans) {
ThreadPool.execute(new CancelPlanTask(orderService, plan.getUid(), plan.getId(), plan.getSymbol(),
plan.getOrderIds(), Integer.parseInt(state), plan.getCreateTime(), plan.getUpdateTime()));
}
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/follow")
@ResponseBody
public String follow(Integer id, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
Plan plan = orderService.findPlanById(id);
if(plan != null && plan.getState() < 4) {
orderService.follow(id, plan.getSymbol(), plan.getFirst().toString(), plan.getSecond().toString(),
plan.getThird().toString(), plan.getStop().toString(), plan.getTrigger().toString(), plan.getCompare(),
plan.getTrigger1().toString(), plan.getCompare1(), user.getId(), user.getApiKey(), user.getSecretKey(),
ToolsUtils.getCurPriceByKey(plan.getSymbol()));
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, "follow successful");
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "follow failed");
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/repeat")
@ResponseBody
public String repeat(Integer id, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
Plan plan = orderService.findPlanById(id);
if(plan != null) {
Integer level = null;
if(user.getRole().indexOf("0") > -1) {
level = plan.getLevel();
}
String temp = plan(plan.getSymbol(), plan.getFirst().toString(), plan.getSecond().toString(),
plan.getThird().toString(), plan.getStop().toString(), level, plan.getTrigger().toString(),
plan.getCompare(), plan.getTrigger1().toString(), plan.getCompare1(), session);
result = JSONObject.parseObject(temp);
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "repeat failed");
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/warn", produces="text/html;charset=UTF-8")
@ResponseBody
public String warn(String id, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
if(user.getRole().indexOf("0") > -1) {
                // Notify
int number = orderService.warn(id);
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, "notice " + number + " users");
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "只有带单人才能操作");
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/predict")
@ResponseBody
public String predict(Integer id, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
Plan plan = orderService.findPlanById(id);
if(plan != null) {
Result temp = orderService.generateAndDealOrder(plan.getSymbol(), plan.getFirst().toString(), plan.getSecond().toString(),
plan.getThird().toString(), plan.getStop().toString(), plan.getTrigger().toString(), plan.getCompare(),
plan.getTrigger1().toString(), plan.getCompare1(), user.getId(), user.getApiKey(), user.getSecretKey(),
null, ToolsUtils.getCurPriceByKey(plan.getSymbol()), true);
if(temp.getState() == -1) {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_OK);
result.put(TransactionConstants.SYSTEM_MSG, temp.getMsg());
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "show detail failed");
}
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "show detail failed");
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/submit")
@ResponseBody
public String submit(Integer id, HttpSession session) {
JSONObject result = new JSONObject();
try {
            // Get configuration items
Plan plan = orderService.findPlanById(id);
if(plan != null) {
return plan(plan.getSymbol(), plan.getFirst().toString(), plan.getSecond().toString(), plan.getThird().toString(), plan.getStop().toString(),
plan.getLevel(), plan.getTrigger().toString(), plan.getCompare(), plan.getTrigger1().toString(), plan.getCompare1(), session);
} else {
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, "submit plan failed");
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return result.toJSONString();
}
@RequestMapping(value = "/strategyMarket")
@ResponseBody
public String strategyMarket(String symbol, String side, Integer level, HttpSession session) {
JSONObject result = new JSONObject();
String strategy = null;
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
strategy = orderService.strategyMarket(symbol, side, user.getId(), user.getApiKey(), user.getSecretKey(), level, null);
if(user.getRole().indexOf("0") > -1) {
JSONObject jSONObject = JSON.parseObject(strategy);
if(jSONObject.containsKey(TransactionConstants.USER_ID)) {
int id = jSONObject.getIntValue(TransactionConstants.USER_ID);
                    // Enter linked copy-order handling for follower accounts
if(id > 0) {
Config config = new Config();
config.setType(symbol);
                        config.setLossWorkingType("" + level);
List<Config> allConfig = configService.findConfigFlag(config);
for(Config c : allConfig) {
ThreadPool.execute(new FollowStrategyTask(orderService, symbol, side, c.getUid(), c.getType(), c.getLossWorkingType(), id));
}
}
}
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return StringUtils.isNotEmpty(strategy) ? strategy : result.toJSONString();
}
@RequestMapping(value = "/strategyStop")
@ResponseBody
public String strategyStop(String symbol, String side, String stopPrice, String id, HttpSession session) {
JSONObject result = new JSONObject();
String strategy = null;
try {
            // Get configuration items
User user = (User) session.getAttribute(TransactionConstants.USER_SESSION);
strategy = orderService.strategyStop(symbol, side, stopPrice, user.getId(), id, user.getApiKey(), user.getSecretKey());
if(user.getRole().indexOf("0") > -1) {
                // Enter linked operation for follower accounts
List<Plan> plans = orderService.findPlansById(Integer.parseInt(id));
for(Plan plan : plans) {
ThreadPool.execute(new FollowStrategyStopTask(orderService, plan.getSymbol(), side, stopPrice,
plan.getUid(), plan.getCreateTime(), plan.getUpdateTime(), "" + plan.getId()));
}
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
result.put(TransactionConstants.SYSTEM_STATUS, TransactionConstants.SYSTEM_STATUS_ERROR);
result.put(TransactionConstants.SYSTEM_MSG, e.getMessage());
}
return StringUtils.isNotEmpty(strategy) ? strategy : result.toJSONString();
}
}
<file_sep>package cn.itcast.service.impl;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import cn.itcast.dao.ConfigMapper;
import cn.itcast.pojo.Config;
import cn.itcast.service.ConfigService;
@Service
@Transactional
public class ConfigServiceImpl implements ConfigService {
@Autowired
private ConfigMapper configMapper;
@Override
public Config findConfigByUid(Config config) {
// TODO Auto-generated method stub
return configMapper.findConfigByUid(config);
}
@Override
public List<Config> findConfigsByUid(Integer uid) {
// TODO Auto-generated method stub
return configMapper.findConfigsByUid(uid);
}
@Override
public int updateConfig(Config config) {
// TODO Auto-generated method stub
return configMapper.updateConfig(config);
}
@Override
public List<Config> findConfigFlag(Config config) {
// TODO Auto-generated method stub
return configMapper.findConfigFlag(config);
}
}
<file_sep>package cn.itcast.dao;
import java.util.List;
import org.apache.ibatis.annotations.Param;
import cn.itcast.pojo.Config;
public interface ConfigMapper {
public Config findConfigByUid(Config config);
public List<Config> findConfigsByUid(@Param("uid") Integer uid);
public List<Config> findConfigFlag(Config config);
public int updateConfig(Config config);
}
<file_sep>package cn.itcast.dao;
import java.util.List;
import org.apache.ibatis.annotations.Param;
import cn.itcast.pojo.Balance;
import cn.itcast.pojo.User;
public interface BalanceMapper {
public int insertBalance(Balance balance);
public List<Balance> findBalanceByUid(@Param("uid") Integer uid);
public List<User> findUserByStatus();
}
<file_sep>package cn.itcast.task;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.Properties;
import java.util.Random;
import javax.mail.Authenticator;
import javax.mail.Message;
import javax.mail.PasswordAuthentication;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import cn.itcast.dao.MailMapper;
import cn.itcast.pojo.Mail;
import cn.itcast.utils.ToolsUtils;
public class SendMail implements Runnable {
private static final Logger logger = LoggerFactory.getLogger(SendMail.class);
private static SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
@Autowired
private MailMapper mailMapper;
public SendMail() {
new Thread(this).start();
}
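    // Every few seconds, fetch unsent mail records and deliver them through one of the rotating SMTP accounts, marking each as sent on success.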
@Override
public void run() {
try {
Thread.sleep(5000);
} catch (InterruptedException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
while(true) {
try {
List<Mail> mails = mailMapper.findUnsentMail();
for(Mail mail : mails) {
Properties props = ToolsUtils.getRandomPlat();
try {
if(StringUtils.isNotEmpty(mail.getSymbol())) {
Thread.sleep((long)(new Random().nextFloat() * 5 * 1000 + 3 * 1000));
Session ssn = Session.getInstance(props, new Authenticator() {
protected PasswordAuthentication getPasswordAuthentication() {
                            // Username and password
String userName = props.getProperty("mail.user");
String password = props.getProperty("mail.password");
return new PasswordAuthentication(userName, password);
}
});
MimeMessage message = new MimeMessage(ssn);
InternetAddress fromAddress = new InternetAddress(props.getProperty("mail.user"), "币安狐狸");
message.setFrom(fromAddress);
InternetAddress toAddress = new InternetAddress(mail.getSymbol());
message.setRecipient(Message.RecipientType.TO, toAddress);
message.setSubject(mail.getSubject(), "utf-8");
message.setContent(mail.getContent(), "text/html;charset=utf-8");
Transport.send(message);
logger.info("{} send mail to {} successful", props.getProperty("mail.user"), mail.getSymbol());
mail.setState(1);
mail.setUpdateTime(format.format(new Date()));
mailMapper.updateConfig(mail);
}
} catch(Exception e) {
e.printStackTrace();
logger.error("{} send mail to {} failed", props.getProperty("mail.user"), mail.getSymbol());
}
}
} catch(Exception e) {
e.printStackTrace();
} finally {
try {
Thread.sleep(1000 * 5);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
}
| 726143b295476163338521bcabd8ae1f891f33bc | ["Java"] | 18 | Java | atlantis68/biantrade | c9187b37e18baf82402fb14a8af3616330ee68f3 | 3860c277832bf5d9a69636119276e609d29ce254 | refs/heads/master |
<repo_name>lforet/robomow<file_sep>/telemetry/gps_tool.py
#GPS distance and bearing between two GPS points (Python recipe)
#This code outputs the distance between 2 GPS points showing also the vertical and horizontal bearing between them.
from math import *
#Two Example GPS Locations
lat1 = 53.32055555555556
lat2 = 53.31861111111111
lon1 = -1.7297222222222221
lon2 = -1.6997222222222223
Aaltitude = 2000
Oppsite = 20000
#Haversine formula to find vertical angle and distance
lon1, lat1, lon2, lat2 = map(radians, [lon1, lat1, lon2, lat2])
dlon = lon2 - lon1
dlat = lat2 - lat1
a = sin(dlat/2)**2 + cos(lat1) * cos(lat2) * sin(dlon/2)**2
c = 2 * atan2(sqrt(a), sqrt(1-a))
Base = 6371 * c
#Horizontal bearing
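#Initial bearing from point 1 to point 2, returned in radians (converted to degrees below)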
def calcBearing(lat1, lon1, lat2, lon2):
dLon = lon2 - lon1
y = sin(dLon) * cos(lat2)
x = cos(lat1) * sin(lat2) \
- sin(lat1) * cos(lat2) * cos(dLon)
return atan2(y, x)
Bearing = calcBearing(lat1, lon1, lat2, lon2)
Bearing = degrees(Bearing)
Base2 = Base * 1000
distance = Base * 2 + Oppsite * 2 / 2
Caltitude = Oppsite - Aaltitude
#Conversion from radians to degrees
a = Oppsite/Base
b = atan(a)
c = degrees(b)
#Convert meters into Kilometers
distance = distance / 1000
#Output the data
print("---------------------------------------")
print(":::::Auto Aim Directional Anntenna:::::")
print("---------------------------------------")
print("Horizontial Distance:", Base,"km")
print(" Vertical Distance:", distance,"km")
print(" Vertical Bearing:",c)
print(" Horizontial Bearing:",Bearing)
print("---------------------------------------")
input("Press <enter> to Exit")
print(" Vertical Distance:", distance,"km")
print(" Vertical Bearing:",c)
print(" Horizontial Bearing:",Bearing)
print("---------------------------------------")
input("Press <enter> to Exit")
<file_sep>/lib/class_android_sensor.tcp.py
from threading import Thread
import time
import sys
import socket
import math
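# Read from the socket one byte at a time until a newline or the peer closes the connection.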
def read_line(s):
ret = ''
while True:
c = s.recv(1)
if c == '\n' or c == '':
break
else:
ret += c
return ret
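# Listens on a TCP port for comma-separated sensor readings streamed from an Android phone and exposes the parsed values (light, accelerometer, magnetometer, orientation, heading) as attributes.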
class android_sensor_tcp(Thread):
def __init__(self, port):
self.host = ''
self.port = port # Arbitrary non-privileged port
self.socket = None
self.light = None
self.accuracy = None
self.xforce = None
self.yforce = None
self.zforce = None
self.xMag = None
self.yMag = None
self.zMag = None
self.pitch = None
self.roll= None
self.azimuth = None
self.heading = None
self.latitude = None
self.longitude = None
Thread.__init__(self)
def run(self):
while True:
try:
self.close_com()
except:
pass
try:
print "waiting to accept.."
conn, addr = self.open_com()
print "accepted connection from client.."
while conn <> "":
self.socket.listen(1)
for i in range(5):
data = read_line(conn)
if len(data) > 0: break
data1 = data.split(',')
#print data1
if len(data) > 1:
try:
self.light = float(data1[1])
self.accuracy = float(data1[2])
self.xforce = float(data1[3])
self.yforce = float(data1[4])
self.zforce = float(data1[5])
self.xMag = float(data1[6])
self.yMag = float(data1[7])
self.zMag = float(data1[8])
self.pitch = float(data1[9])
self.roll = float(data1[10])
self.azimuth = float(data1[11])
#self.latitude =float(data1[12])
#self.longitude = float(data1[13])
self.heading = math.degrees(self.azimuth)
if self.heading < 0: self.heading = self.heading + 360
self.heading = round(self.heading, 2)
#print "heading:", self.heading
except:
pass
else:
self.light = None
self.accuracy = None
self.xforce = None
self.yforce = None
self.zforce = None
self.xMag = None
self.yMag = None
self.zMag = None
self.pitch = None
self.roll= None
self.azimuth = None
self.heading = None
#self.latitude = None
#self.longitude = None
conn = ""
time.sleep(.5)
time.sleep(1)
except IOError as detail:
print "connection lost", detail
def open_com(self):
self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.socket.bind((self.host, self.port))
#print "listening..."
self.socket.listen(1)
try:
print "waiting to accept.."
conn, addr = self.socket.accept()
return conn, addr
except IOError as detail:
print "connection lost", detail
def close_com(self):
try:
print "closing Socket"
self.socket.close()
except NameError as detail:
print "No socket to close", detail
if __name__== "__main__":
android_sensor = android_sensor_tcp(8095)
android_sensor.daemon=True
android_sensor.start()
while True:
while android_sensor.heading == None:
time.sleep(.1)
print android_sensor.heading, android_sensor.latitude, android_sensor.longitude
time.sleep(.1)
del android_sensor
<file_sep>/main/mobot_main.py~
#!/usr/bin/python
import sys
sys.path.append( "../lib/" )
from PIL import Image, ImageTk
import time
from datetime import datetime
import socket
import cv2
import cv
from threading import *
import random
from math import *
import easygui as eg
import sonar_functions as sf
from img_processing_tools import *
from nav_functions import *
from maxsonar_class import *
from robomow_motor_class_arduino import *
from gps_functions import *
from class_android_sensor_tcp import *
from train_terrain import *
#from lidar_class import *
from visual import *
from mobot_nav_class import *
def snap_shot(filename):
"""grabs a frame from webcam, resizes to 320x240 and returns image"""
#capture from camera at location 0
now = time.time()
global webcam1
try:
#have to capture a few frames as it buffers a few frames..
for i in range (5):
ret, img = webcam1.read()
#print "time to capture 5 frames:", (time.time()) - now
cv2.imwrite(filename, img)
img1 = Image.open(filename)
img1.thumbnail((320,240))
img1.save(filename)
#print (time.time()) - now
except:
print "could not grab webcam"
return
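# Two-step transfer: announce the filename on the control port, then stream the file bytes on the data port.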
def send_file(host, cport, mport, filetosend):
#global file_lock
file_lock = True
#print "file_lock", file_lock
try:
cs = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cs.connect((host, cport))
cs.send("SEND " + filetosend)
print "sending file", filetosend
cs.close()
except:
#print "cs failed"
pass
time.sleep(0.1)
try:
ms = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
ms.connect((host, mport))
f = open(filetosend, "rb")
data = f.read()
f.close()
ms.send(data)
ms.close()
except:
#print "ms failed"
pass
#file_lock = False
#print "file_lock", file_lock
'''
def send_data(host="u1204vm.local", cport=9091, mport=9090, datatosend=""):
try:
cs = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cs.connect((host, cport))
cs.send("SEND " + filetosend)
cs.close()
except:
pass
time.sleep(0.05)
try:
ms = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
ms.connect((host, mport))
f = open(filetosend, "rb")
data = f.read()
f.close()
ms.send(data)
ms.close()
except:
pass
'''
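# Thread that repeatedly grabs a webcam frame and pushes the saved image to the remote viewer via send_file.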
class send_video(Thread):
def __init__(self, filetosend):
self.filetosend = filetosend
Thread.__init__(self)
def run(self):
#global file_lock, hhh
print self.filetosend
while True:
snap_shot(self.filetosend)
send_file(host="u1204vm.local", cport=9090, mport=9091,filetosend=self.filetosend)
time.sleep(.01)
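# Thread that continuously polls the MaxSonar array, parses the "s1:..s5:" distance string and exposes per-direction readings plus min/max values as attributes.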
class send_sonar_data(Thread):
def __init__(self, filetosend):
self.sonar = MaxSonar()
self.filetosend = filetosend
self.sonar_data = ""
self.max_dist = -1
self.min_dist = -1
self.min_sensor = -1
self.max_sensor = -1
self.right_sensor = -1
self.left_sensor = -1
self.forwardl_sensor = -1
self.fowardr_sensor = -1
Thread.__init__(self)
def run(self):
#global file_lock, hhh
while True:
self.sonar_data = ""
self.max_dist = -1
self.min_dist = -1
self.min_sensor = -1
self.max_sensor = -1
self.right_sensor = -1
self.left_sensor = -1
self.forwardl_sensor = -1
self.fowardr_sensor = -1
#
#below 2 lines are for test purposes when actual US arent sending data
#for i in range(1,6):
# sonar_data = sonar_data + "s"+str(i)+":"+ str(random.randint(28, 91))
data = str(self.sonar.distances_cm())
self.sonar_data = []
sonar_data_str1 = ""
try:
if len(data) > 1:
self.sonar_data.append(int(data[(data.find('s1:')+3):(data.find('s2:'))]))
self.sonar_data.append(int(data[(data.find('s2:')+3):(data.find('s3:'))]))
self.sonar_data.append(int(data[(data.find('s3:')+3):(data.find('s4:'))]))
self.sonar_data.append(int(data[(data.find('s4:')+3):(data.find('s5:'))]))
#have to put this before cliff sensor
self.min_dist = min(self.sonar_data)
self.min_sensor = self.sonar_data.index(self.min_dist)
self.sonar_data.append(int(data[(data.find('s5:')+3):(len(data)-1)]))
self.max_dist = max(self.sonar_data)
#self.min_dist = min(self.sonar_data)
#self.min_sensor = self.sonar_data.index(self.min_dist)
self.max_sensor = self.sonar_data.index(self.max_dist)
#sonar_data_str1 = "".join(str(x) for x in self.sonar_data)
self.frontl_sensor = self.sonar_data[0]
self.frontr_sensor = self.sonar_data[1]
self.right_sensor = self.sonar_data[2]
self.left_sensor = self.sonar_data[3]
self.cliff = self.sonar_data[4]
#print sonar_data_str1
#print data
#
#f = open("sonar_data.txt", "w")
#f.write(data)
#f.close()
#send_file(host="u1204vm.local", cport=9092, mport=9093,filetosend=self.filetosend)
except:
pass
try:
time.sleep(.02)
except:
pass
print "out of while in sonar"
def terminate(self):
self.sonar.terminate()
def safety_check(motor, sonar, threshold, cliff):
answer = True
if (sonar.cliff > cliff):
print "cliff detected", sonar.cliff, cliff
answer = False
if (sonar.min_dist < threshold):
print "object too close", sonar.min_dist, threshold
answer = False
return answer
def move_mobot(motor, sonar, threshold, cliff, themove, speed):
move_time = .6
print "sonar data:", sonar.sonar_data
print "moving:", themove, " speed:", speed
#####################
if themove == "s": motor.forward(0)
if (themove == "f"):
#if (safety_check(motor, sonar, threshold, cliff)) == True:
motor.forward(speed)
time.sleep(move_time)
motor.forward(0)
#print motor.motor1_speed, motor.motor2_speed
#else:
# evasive_maneuver(motor, sonar, threshold, cliff)
if (themove == "b"):
motor.left(-1*speed)
motor.right(-1*(speed*1.2))
#motor.reverse(speed)
time.sleep(move_time)
motor.forward(0)
#print motor.motor1_speed, motor.motor2_speed
if (themove == "l"):
motor.spin_left(speed)
time.sleep(move_time)
motor.forward(0)
#print motor.motor1_speed, motor.motor2_speed
if (themove == "r"):
motor.spin_right(speed)
time.sleep(move_time)
motor.forward(0)
#print motor.motor1_speed, motor.motor2_speed
print "done moving"
def enable_video():
video1 = send_video("snap_shot.jpg")
video1.daemon=True
video1.start()
#video1.join()
def enable_sonar():
sonar = send_sonar_data("sonar_data.txt")
sonar.daemon=True
sonar.start()
#sonar.join()
def test_gps():
#print "startup all gps"
#start_all_gps()
gpslist = gps_list()
#print gpslist
gps2 = mobot_gps()
gps2.daemon=True
gps2.start()
current_track = 1000
max_spd = 0.0
#gps2.join()
while True:
print "# of GPS Units:", len(gpslist)
#if (gps2.satellites > 0):
print 'Satellites (total of', gps2.satellites , ' in view)'
print "Active satellites used:", gps2.active_satellites
#for i in gps2.satellites:
# print '\t', i
print "lat: ", gps2.latitude
print "long:", gps2.longitude
print 'track: ', gps2.track
print "Current Track: ", current_track
mph = gps2.speed * 2.23693629
if mph > max_spd: max_spd = mph
print 'speed: m/sec', gps2.speed , " MPH:" , mph
print "max_spd: ", max_spd
if mph > 2.0:
current_track = gps2.track
#time.sleep(random.randint(1, 3))
time.sleep(.5)
#os.system("clear")
class display_sonar(Thread):
def __init__(self, sonar):
self.sonar = sonar
Thread.__init__(self)
class mobot_display(Thread):
def __init__(self, camID, sonar):
self.camID = camID
self.sonar = sonar
Thread.__init__(self)
def run(self):
global frame1
global frame2
cv2.namedWindow('Sonar Data', cv.CV_WINDOW_AUTOSIZE)
cv2.namedWindow('Front Camera', cv.CV_WINDOW_AUTOSIZE)
cv2.namedWindow('Ground Cam', cv.CV_WINDOW_AUTOSIZE)
cv.MoveWindow('Front Camera', 373, 24)
cv.MoveWindow('Ground Cam', 760, 24)
cv.MoveWindow('Sonar Data', 60, 24)
camera = cv.CreateCameraCapture(self.camID)
camera2 = cv.CreateCameraCapture(1)
while True:
#time.sleep(1)
try:
#print "raw sonar data", self.sonar.sonar_data
sonar_img = sf.sonar_graph(self.sonar.sonar_data)
cv.ShowImage('Sonar Data', PILtoCV(sonar_img, 4))
cv.WaitKey(10)
del sonar_img
except:
print "sonar display failure"
pass
try:
frame1 = cv.QueryFrame(camera)
frame1 = resize_img(frame1, 0.60)
cv.ShowImage('Front Camera', frame1)
cv.WaitKey(10)
except:
print "camera1 display failure"
pass
try:
#frame2 = cv.QueryFrame(camera2)
#frame2 = resize_img(frame2, 0.60)
frame2 = frame1
cv.ShowImage('Ground Cam', frame2)
cv.WaitKey(10)
except:
print "camera2 display failure"
pass
class sonar_prevent_hit(Thread):
def __init__(self, motor, sonar, threshold):
self.sonar = sonar
self.motor = motor
self.threshold = threshold
Thread.__init__(self)
def run(self):
while True:
#time.sleep(.12)
try:
#print "self.sonar.min_dist:", self.sonar.min_dist
if (self.sonar.min_dist < self.threshold):
print "auto hit prevent activated: " ,self.sonar.sonar_data
time.sleep(1)
evasive_maneuver(self.motor, self.sonar, self.threshold)
except:
print "sonar auto hit prevent failure"
pass
def evasive_maneuver(motor, sonar, threshold, cliff=48):  # default cliff matches the threshold set in __main__ so callers that omit it (sonar_prevent_hit, wallfollow) still work
global auto_pilot_on
#wait to confirm
motor.stop()
time.sleep(.05)
while (sonar.min_dist < threshold or sonar.cliff > cliff or auto_pilot_on == True):
print "..........evasive maneuver............."
print "sonar_data: ", sonar.sonar_data
motor.stop()
time.sleep(.05)
motor.left(-1*24)
motor.right(-1*(24*1.2))
time.sleep(1)
motor.stop()
#time.sleep(.5)
if (sonar.right_sensor > sonar.left_sensor):
#while (sonar.frontl_sensor < threshold and sonar.frontr_sensor < threshold):
motor.spin_right(24)
time.sleep(random.randint(120, 260)/100.0)  # float division so the pause is 1.2-2.6 s, not truncated to whole seconds
#time.sleep(0.02)
else:
#while (sonar.frontl_sensor < threshold and sonar.frontr_sensor < threshold):
motor.spin_left(24)
time.sleep(random.randint(120, 260)/100.0)  # float division so the pause is 1.2-2.6 s, not truncated to whole seconds
#time.sleep(0.02)
#move_mobot(motor, 'f', 25)
#time.sleep(.2)
def turn_to_bearing(motor, sonar, threshold, compass, heading):
#wait to confirm
motor.forward(0)
time.sleep(1)
#while (sonar.min_dist > threshold ):
print "..........turning to heading: ", heading
dif = 1000
while dif > 10:
while compass.heading == None:
time.sleep(.01)
dif = abs(heading - compass.heading )
print "dif:", dif
print "heading now:", compass.heading
#motor.spin_right(22)
motor.right(-24)
motor.left(22)
#time.sleep(.3)
#move_mobot(motor, 's', 0)
#time.sleep(.3)
motor.forward(0)
time.sleep(1)
print "heading now:", compass.heading
class auto_move(Thread):
def __init__(self, motor, sonar, threshold, cliff):
self.motor = motor
self.sonar = sonar
self.threshold = threshold
self.cliff = cliff
Thread.__init__(self)
def run(self):
global auto_pilot_on
while True:
#print "running autopilot", auto_pilot_on
while auto_pilot_on == True:
#for i in range (10):
print "..........autopilot............."
print "sonar_data: ", self.sonar.sonar_data
now = datetime.now()
print "min sensor:", self.sonar.min_sensor
while (self.sonar.min_dist <= self.threshold):
evasive_maneuver(self.motor, self.sonar, self.threshold, self.cliff)
#time.sleep(.5)
if (self.sonar.min_dist > self.threshold):
move_mobot(self.motor, self.sonar, self.threshold, self.cliff, 'f',18)
#time.sleep(1)
#move_mobot(self.motor, 's', 0)
#print "sonar_data: ", sonar.sonar_data
#print "loop time:", datetime.now() - now
def wallfollow(motor, sonar, threshold):
rm_spd = 16
lm_spd = 16
spd = 14
while True:
print "..........wallfollow............."
print "sonar_data: ", sonar.sonar_data
print "sonar_right:", sonar.right_sensor, " sonar_left:", sonar.left_sensor,
print "RMotor: ", rm_spd, " LMotor: ", lm_spd
while (sonar.min_dist < threshold):
rm_spd = spd
lm_spd = spd
evasive_maneuver(motor, sonar, threshold)
else:
if sonar.right_sensor < 56:
lm_spd = spd - 4 #decrease left motor speed
rm_spd = spd + 1
if sonar.right_sensor > 57:
lm_spd = spd + 2 #increase left motor speed
rm_spd = spd
#rm_spd = rm_spd -1
#if sonar.right_sensor > 48 and sonar.right_sensor < 52:
# rm_spd = spd
# lm_spd = spd
#adjust for max/min
if lm_spd > 28: lm_spd = 28
if rm_spd > 28: rm_spd = 28
if lm_spd < 6: lm_spd = 6
if rm_spd < 6: rm_spd = 6
#send cmds to motors
motor.right(rm_spd)
motor.left(lm_spd)
time.sleep(.03)
#move_mobot(motor, 's', 0)
def train_on_terrain():
global frame2
img1 = frame2
video = None
reply = ""
while reply != 'Quit':
#eg.rootWindowPosition = eg.rootWindowPosition
print 'reply=', reply
#if reply == "": reply = "Next Frame"
if reply == "Mowable":
classID = "1"
if img1 != None:
features = find_features(frame2)
save_data(features, classID)
else:
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = frame2
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Non-Mowable":
classID = "2"
if img1 != None:
features = find_features(img1)
save_data(features, classID)
else:
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = frame2
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Quit":
print "Quitting Training...."
#sys.exit(-1)
if reply == "Predict":
print "AI predicting"
if img1 != None:
predict_class(CV2array(img1))
else:
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = frame2
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Subsection":
img1 = Image.open('temp.png')
print img1
xx = subsection_image(img1, 16,True)
print xx
#while (xx != 9):
# time.sleep(1)
if reply == "Retrain AI":
print "Retraining AI"
train_ai()
if reply == "Next Frame":
print "Acquiring new image.."
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = frame2
print img1
#img1 = array2image(img1)
#img1 = CVtoPIL(img1)
#img1 = img1.resize((320,240))
#img1 = PILtoCV(img1,3)
#img1 = image2array(img1)
#cv2.imwrite('temp.png', img1)
cv.SaveImage('temp.png', img1)
#print type(img1)
#img1.save()
if reply == "Fwd 10 Frames":
print "Forward 10 frames..."
for i in range(10):
img1 = np.array(grab_frame_from_video(video)[1])
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Del AI File":
data_filename = 'robomow_feature_data.csv'
f_handle = open(data_filename, 'w')
f_handle.write('')
f_handle.close()
if reply == "Test Img":
#im = Image.fromarray(image)
#im.save("new.png")
img1 = Image.open('temp.png')
#img1.thumbnail((320,240))
path = "../../../../mobot_data/images/test_images/"
filename = str(time.time()) + ".jpg"
img1.save(path+filename)
print "trying"
#if video != None:
# reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'Test Img', 'Next Frame', 'Fwd 10 Frames', 'Predict', 'Subsection', 'Retrain AI' , 'Del AI File', 'Quit'), image='temp.png', root=None)
#else:
reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'Test Img','Next Frame', 'Predict', 'Retrain AI' , 'Del AI File', 'Quit'), image='temp.png', root=None)
if __name__== "__main__":
testmode = False
if len(sys.argv) > 1:
if sys.argv[1] == 'testmode':
print 'starting in testing mode'
testmode= True
auto_pilot_on = False
sonar = None
motor = None
compass = None
reply =""
movetime = 0.4
threshold = 40
cliff = 48
frame1 = None
frame2 = None
#while sonar == None or motor == None:# or compass == None:
#while sonar == None:
# try:
# sonar = send_sonar_data("sonar_data.txt")
# except:
# time.sleep(.5)
# pass
#sonar.daemon=True
#sonar.start()
#while motor == None:
# try:
# motor = robomow_motor()
# print "motor.isConnected:", motor.isConnected
# except:
# time.sleep(.5)
# pass
#if compass == None:
# compass = android_sensor_tcp(8095)
# compass.daemon=True
# compass.start()
#while lidar == None:
# print 'trying to connect lidar'
# lidar = mobot_lidar("/dev/ttyUSB0", 115200)
# #except:
# time.sleep(.5)
# pass
#th = thread.start_new_thread(update_view, (lidar,))
#time.sleep(2)
#nav = mobot_nav (lidar)
#raw_input ('done connecting motor and lidar: press enter')
'''
#time.sleep(3)
#while compass.heading == None:
# time.sleep(1)
#turn_to_bearing(motor, sonar, 40, compass, 0)
#test_gps()
#gpslist = gps_list()
#print gpslist
#gps2 = mobot_gps()
#gps2.daemon=True
#gps2.start()
#time.sleep(5)
while True:
time.sleep(1)
#if (mobot_gps.active_satellites > 4):
#print "position: ", gps2.latitude, gps2.longitude
#the_map = create_map(10, 10, (gps2.latitude, gps2.longitude))
#break
print; print
print "READINGS"
print "--------------------------------"
print "Number of GPS units: " , len(gpslist)
print 'latitude ' , mobot_gps.latitude
print 'longitude ' , mobot_gps.longitude
#print 'mode:', mobot_gps.fix.mode
print 'track: ', mobot_gps.track
#print 'time utc ' , mobot_gps.utc, mobot_gps.fix.time
print 'altitude ' , mobot_gps.altitude
#print 'epx ', mobot_gps.fix.epx
#print 'epv ', mobot_gps.fix.epv
#print 'ept ', mobot_gps.fix.ept
#print 'epc ', mobot_gps.fix.epc
#print 'epd ', mobot_gps.fix.epd
#print 'eps ', mobot_gps.fix.eps
#print "speed ", mobot_gps.fix.speed
#print "climb " , mobot_gps.fix.climb
#print
#print 'Satellites (total of', len(mobot_gps.satellites) , ' in view)'
#for i in mobot_gps.satellites:
# print '\t', i
print "Active satellites used: ", mobot_gps.active_satellites
print the_map
except (KeyboardInterrupt, SystemExit): #when you press ctrl+c
print "\nKilling Thread..."
gpsp.running = False
gpsp.join() # wait for the thread to finish what it's doing
print "Done.\nExiting."
#wallfollow(motor, sonar)
'''
#start front navigation cam
#mobot_disp = mobot_display(0, sonar)
#mobot_disp.daemon=True
#mobot_disp.start()
#time.sleep(2)
#test_gps()
#mobot_autopilot = auto_move(motor, sonar, threshold, cliff)
#mobot_autopilot.daemon = True
#mobot_autopilot.start()
#print mobot_autopilot
#while True:
# time.sleep(.1)
#start sonar_hit_preventer
#sonar_hit_preventer = sonar_prevent_hit(motor, sonar, 38)
#sonar_hit_preventer.daemon=True
#sonar_hit_preventer.start()
#sonar_hit_preventer.join()
#time.sleep(2)
#while True:
# auto_move(motor, sonar, 38)
# time.sleep(.03)
#wallfollow(motor, sonar, 40)
eg.rootWindowPosition = "+60+375"
while True:
if reply == 'Turn?':
print "angel_greatest_dist", nav.angel_greatest_dist()
print "turn_left_or_right", nav.turn_left_or_right()
if reply == 'AutoPilot':
#auto_move(motor, sonar, threshold)
if auto_pilot_on == False:
auto_pilot_on = True
else:
auto_pilot_on = False
print "auto_pilot_on:", auto_pilot_on
if reply == 'F':
move_mobot(motor, sonar, threshold, cliff, 'f', 25)
#time.sleep(movetime)
if reply == 'B':
move_mobot(motor, sonar, threshold, cliff, 'b', 25)
#time.sleep(movetime)
if reply == 'R':
move_mobot(motor, sonar, threshold, cliff, 'r', 28)
time.sleep(movetime)
motor.stop()
if reply == 'L':
move_mobot(motor, sonar, threshold, cliff, 'l', 28)
time.sleep(movetime)
motor.stop()
if reply == 'STOP':
motor.stop()
if reply == 'TRAIN':
train_on_terrain()
if reply == "Quit":
print "stopping mobot..."
motor.forward(0)
time.sleep(.1)
del mobot_disp
motor.terminate()
sonar.terminate()
sys.exit(-1)
#time.sleep(movetime)
print "Quitting...."
reply = eg.buttonbox(title='Mobot Drive', choices=('AutoPilot', 'F', 'B', 'L', 'R', 'STOP', 'TRAIN', 'Quit'), root=None)
'''
#############################################################
#start gps
#get current gps postiion
#get bearing to target position
# turn toward target bearing
######################################################
gps1 = gps.gps(host="localhost", port="2947")
gps2 = gps.gps(host="localhost", port="2948")
gps3 = gps.gps(host="localhost", port="2949")
gps4 = gps.gps(host="localhost", port="2950")
'''
<file_sep>/Pololu/trex.py
#!/usr/bin/python
import serial
import time
class TReX:
fwd = 1
rvs = 2
speed_normal = 127
cmd_prefix = chr(0x86)
cmd_accelerate = 0xE0
lastAction = ''
def __init__(self,pname='/dev/ttyS2',baudrate=19200):
self.port = serial.Serial(pname)
self.port.baudrate= baudrate
def basic_action(self, thisAction):
if self.lastAction != thisAction:
if thisAction == "f":
print "/\ Forwards"
self.forwards()
elif thisAction == "b":
print "\/ Backwards"
self.backwards()
elif thisAction == "l":
print "<< Left"
self.left()
elif thisAction == "r":
print ">> Right"
self.right()
else:
print ">< Stopped"
self.stop()
self.lastAction = thisAction
def forwards(self):
self.accelerate_to(self.fwd, self.fwd, self.speed_normal, self.speed_normal)
def backwards(self):
self.accelerate_to(self.rvs, self.rvs, self.speed_normal, self.speed_normal)
def left(self):
self.accelerate_to(self.fwd, self.rvs, self.speed_normal, self.speed_normal)
def right(self):
self.accelerate_to(self.rvs, self.fwd, self.speed_normal, self.speed_normal)
def stop(self):
self.accelerate_to(self.fwd, self.fwd, 0, 0)
def accelerate_to(self, left_dir, right_dir, left_power, right_power):
self.port.write(self.cmd_prefix+chr(self.cmd_accelerate | left_dir*4 | right_dir) + chr(left_power) + chr(right_power))
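# Illustrative note (derived from the constants above): accelerate_to() emits a
# four-byte packet: the 0x86 prefix, a command byte 0xE0 | left_dir*4 | right_dir,
# then the two power bytes. forwards(), for example, sends 0x86, 0xE5
# (0xE0 | 0x04 | 0x01), 127, 127.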
#p = TReX()
#while ( 1 ):
# p.basic_action('r')
# time.sleep (1)
PORT = "/dev/ttyUSB0"
#if len(sys.argv) > 1:
# PORT = sys.argv[1]
ser = serial.Serial(PORT, 19200, timeout=1)
print ser.portstr,
print ser.baudrate,
print ser.bytesize,
print ser.parity,
print ser.stopbits
while 1:
ser.write (chr(0x81)) # write a string
print 'waiting on response..'
s = ser.readline()
ser.write (chr(0x82))
s = ser.readline()
time.sleep(1)
ser.close()
print s
<file_sep>/gui/web_gui/sitetest2/process_images.py
#!/usr/bin/env python
import os
from img_processing_tools import *
import Image
import time
import sys
import cv
root_path = "/home/lforet/images"
num_of_classes = 2
if __name__=="__main__":
if len(sys.argv) < 2:
data_filename = 'mower_image_data.csv'
for x in range(num_of_classes):
if x == 0:
classID = 1
path = root_path + "/class1"
if x == 1:
classID = -1
path = root_path + "/class2"
###########################
#process class 1 files (mowable)
###########################
count = 0
for subdir, dirs, files in os.walk(path):
count = len(files)
if count > 0:
if x == 0:
###########################
# delete data file and write header
f_handle = open(data_filename, 'w')
f_handle.write(str("classid, lbp, i3_histogram, rgb_histogram, sum_I3, sum2_I3, median_I3, avg_I3, var_I3, stddev_I3, rms_I3"))
f_handle.write('\n')
f_handle.close()
print "Files to Process: ", count
for subdir, dirs, files in os.walk(path):
for file in files:
filename1= os.path.join(path, file)
if x == 0: print "Processing images: CLASS 1 (MOWABLE)"
if x == 1: print "Processing images: CLASS -1 (NON-MOWABLE)"
print "Processing file: ", filename1
img = Image.open(filename1)
if len(img.getbands()) == 3:
if img.size[0] != 640 or img.size[1] != 480:
print "Image is not right size. Resizing image...."
img = img.resize((640, 480))
print "Resized to 640, 480"
#cv.WaitKey()
cv_image = PILtoCV(img)
cv.ShowImage("Image", cv_image)
cv.MoveWindow ('Image',50 ,50 )
cv.WaitKey(100)
#strg = "python show_histogram.py " + filename1
#print strg
#os.system(strg);
plot_rgb_histogram(img)
time.sleep(.1)
cv_image2 = cv.LoadImage('histogram.png')
cv.ShowImage("RGB Histogram", cv_image2)
cv.MoveWindow ('RGB Histogram',50 ,580 )
cv.WaitKey(100)
#cv.WaitKey()
time.sleep(.2)
WriteMeterics(img, classID, data_filename)
else:
print "image not valid for processing: ", filename1
time.sleep(3)
print
if len(sys.argv) > 1:
classID = 0
data_filename = 'sample_image_data.csv'
###########################
# delete data file and write header
f_handle = open(data_filename, 'w')
f_handle.write(str("classid, lbp, i3_histogram, rgb_histogram, sum_I3, sum2_I3, median_I3, avg_I3, var_I3, stddev_I3, rms_I3"))
f_handle.write('\n')
f_handle.close()
filename1= sys.argv[1]
print 'writing sample image stats to file: ', data_filename
print "Processing file: ", filename1
img = Image.open(filename1)
if len(img.getbands()) == 3:
if img.size[0] != 640 or img.size[1] != 480:
print "Image is not right size. Resizing image...."
img = img.resize((640, 480))
print "Resized to 640, 480"
cv_image = PILtoCV(img)
cv.ShowImage("Image", cv_image)
cv.MoveWindow ('Image',50 ,50 )
cv.WaitKey(100)
time.sleep(.2)
WriteMeterics(img, classID, data_filename)
#time.sleep(1)
else:
print "image not valid for processing: ", filename1
time.sleep(5)
print
<file_sep>/gui/breezy/textareademo.py
"""
File: textareademo.py
Author: <NAME>
Compute an investment report.
1. The inputs are
starting investment amount
number of years
interest rate (an integer percent)
2. The report is displayed in tabular form with a header.
3. Computations and outputs:
for each year
compute the interest and add it to the investment
print a formatted row of results for that year
4. The total investment and interest earned are also displayed.
"""
from breezypythongui import EasyFrame
class TextAreaDemo(EasyFrame):
"""Demonstrates a multiline text area."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, "Investment Calculator")
self.addLabel(text = "Initial amount", row = 0, column = 0)
self.addLabel(text = "Number of years", row = 1, column = 0)
self.addLabel(text = "Interest rate in %", row = 2, column = 0)
self.amount = self.addFloatField(value = 0.0, row = 0, column = 1)
self.period = self.addIntegerField(value = 0, row = 1, column = 1)
self.rate = self.addIntegerField(value = 0, row = 2, column = 1)
self.outputArea = self.addTextArea("", row = 4, column = 0,
columnspan = 2,
width = 50, height = 15)
self.compute = self.addButton(text = "Compute", row = 3, column = 0,
columnspan = 2,
command = self.compute)
# Event handling method.
def compute(self):
"""Computes the investment schedule based on the inputs
and outputs the schedule."""
# Obtain and validate the inputs
startBalance = self.amount.getNumber()
rate = self.rate.getNumber() / 100
years = self.period.getNumber()
if startBalance == 0 or rate == 0 or years == 0:
return
# Set the header for the table
result = "%4s%18s%10s%16s\n" % ("Year", "Starting balance",
"Interest", "Ending balance")
# Compute and append the results for each year
totalInterest = 0.0
for year in range(1, years + 1):
interest = startBalance * rate
endBalance = startBalance + interest
result += "%4d%18.2f%10.2f%16.2f\n" % \
(year, startBalance, interest, endBalance)
startBalance = endBalance
totalInterest += interest
# Append the totals for the period
result += "Ending balance: $%0.2f\n" % endBalance
result += "Total interest earned: $%0.2f\n" % totalInterest
# Output the result
self.outputArea["state"] = "normal"
self.outputArea.setText(result)
self.outputArea["state"] = "disabled"
if __name__ == "__main__":
TextAreaDemo().mainloop()
<file_sep>/telemetry/wifi.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright 2004, 2005 <NAME> <<EMAIL>> - Rotterdam, Netherlands
# Copyright 2009 by <NAME> <<EMAIL>>
#
# This file is part of Python WiFi
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
import errno
import sys
import types
import pythonwifi.flags
from pythonwifi.iwlibs import Wireless, Iwrange, getNICnames
def print_scanning_results(wifi, args=None):
""" Print the access points detected nearby.
"""
# "Check if the interface could support scanning"
try:
iwrange = Iwrange(wifi.ifname)
except IOError, (error_number, error_string):
sys.stderr.write("%-8.16s Interface doesn't support scanning.\n\n" % (
wifi.ifname))
else:
# "Check for Active Scan (scan with specific essid)"
# "Check for last scan result (do not trigger scan)"
# "Initiate Scanning"
try:
results = wifi.scan()
except IOError, (error_number, error_string):
if error_number != errno.EPERM:
sys.stderr.write(
"%-8.16s Interface doesn't support scanning : %s\n\n" %
(wifi.ifname, error_string))
else:
if (len(results) == 0):
print "%-8.16s No scan results" % (wifi.ifname, )
else:
(num_channels, frequencies) = wifi.getChannelInfo()
print "%-8.16s Scan completed :" % (wifi.ifname, )
index = 1
for ap in results:
print " Cell %02d - Address: %s" % (index, ap.bssid)
print " ESSID:\"%s\"" % (ap.essid, )
print " Mode:%s" % (ap.mode, )
print " Frequency:%s (Channel %d)" % \
(wifi._formatFrequency(ap.frequency.getFrequency()),
frequencies.index(wifi._formatFrequency(
ap.frequency.getFrequency())) + 1)
if (ap.quality.updated & \
pythonwifi.flags.IW_QUAL_QUAL_UPDATED):
quality_updated = "="
else:
quality_updated = ":"
if (ap.quality.updated & \
pythonwifi.flags.IW_QUAL_LEVEL_UPDATED):
signal_updated = "="
else:
signal_updated = ":"
if (ap.quality.updated & \
pythonwifi.flags.IW_QUAL_NOISE_UPDATED):
noise_updated = "="
else:
noise_updated = ":"
print " " + \
"Quality%c%s/%s Signal level%c%s/%s Noise level%c%s/%s" % \
(quality_updated,
ap.quality.quality,
wifi.getQualityMax().quality,
signal_updated,
ap.quality.getSignallevel(),
"100",
noise_updated,
ap.quality.getNoiselevel(),
"100")
# This code on encryption keys is very fragile
if (ap.encode.flags & pythonwifi.flags.IW_ENCODE_DISABLED):
key_status = "off"
else:
if (ap.encode.flags & pythonwifi.flags.IW_ENCODE_NOKEY):
if (ap.encode.length <= 0):
key_status = "on"
print " Encryption key:%s" % (key_status, )
if len(ap.rate) > 0:
for rate_list in ap.rate:
# calc how many full lines of bitrates
rate_lines = len(rate_list) / 5
# calc how many bitrates on last line
rate_remainder = len(rate_list) % 5
line = 0
# first line should start with a label
rate_line = " Bit Rates:"
while line < rate_lines:
# print full lines
if line > 0:
# non-first lines should start *very* indented
rate_line = " "
rate_line = rate_line + "%s; %s; %s; %s; %s" % \
tuple(wifi._formatBitrate(x) for x in
rate_list[line * 5:(line * 5) + 5])
line = line + 1
print rate_line
if line > 0:
# non-first lines should start *very* indented
rate_line = " "
# print non-full line
print rate_line + "%s; "*(rate_remainder - 1) % \
tuple(wifi._formatBitrate(x) for x in
rate_list[line * 5:line * 5 + rate_remainder - 1]) + \
"%s" % (wifi._formatBitrate(
rate_list[line * 5 + rate_remainder - 1]))
index = index + 1
print
def print_channels(wifi, args=None):
""" Print all frequencies/channels available on the card.
"""
try:
(num_frequencies, channels) = wifi.getChannelInfo()
current_freq = wifi.getFrequency()
except IOError, (error_number, error_string):
# Channel/frequency info not available
if (error_number == errno.EOPNOTSUPP) or \
(error_number == errno.EINVAL) or \
(error_number == errno.ENODEV):
sys.stderr.write("%-8.16s no frequency information.\n\n" % (
wifi.ifname, ))
else:
report_error("channel", wifi.ifname, error_number, error_string)
else:
# Channel/frequency info available
print "%-8.16s %02d channels in total; available frequencies :" % \
(wifi.ifname, num_frequencies)
for channel in channels:
print " Channel %02d : %s" % \
(channels.index(channel)+1, channel)
# Do some low-level comparisons on frequency info
iwfreq = wifi.wireless_info.getFrequency()
# XXX - this is not the same flags value as iwlist.c
if iwfreq.flags & pythonwifi.flags.IW_FREQ_FIXED:
fixed = "="
else:
fixed = ":"
if iwfreq.getFrequency() < pythonwifi.iwlibs.KILO:
return_type = "Channel"
else:
return_type = "Frequency"
# Output current channel/frequency
current_freq = wifi.getFrequency()
print " Current %s%c%s (Channel %d)\n" % \
(return_type, fixed, current_freq, channels.index(current_freq) + 1 )
def print_bitrates(wifi, args=None):
""" Print all bitrates available on the card.
"""
try:
num_bitrates, bitrates = wifi.getBitrates()
except IOError, (error_number, error_string):
if (error_number == errno.EOPNOTSUPP) or \
(error_number == errno.EINVAL) or \
(error_number == errno.ENODEV):
# not a wireless device
sys.stderr.write("%-8.16s no bit-rate information.\n\n" % (
wifi.ifname, ))
else:
report_error("bit rate", wifi.ifname, error_number, error_string)
else:
if (num_bitrates > 0) and \
(num_bitrates <= pythonwifi.flags.IW_MAX_BITRATES):
# wireless device with bit rate info, so list 'em
print "%-8.16s %02d available bit-rates :" % \
(wifi.ifname, num_bitrates)
for rate in bitrates:
print "\t %s" % rate
else:
# wireless device, but no bit rate info available
print "%-8.16s unknown bit-rate information." % (wifi.ifname, )
# current bit rate
try:
bitrate = wifi.wireless_info.getBitrate()
except IOError, (error_number, error_string):
# no bit rate info is okay, error was given above
pass
else:
if bitrate.fixed:
fixed = "="
else:
fixed = ":"
print " Current Bit Rate%c%s" % (fixed, wifi.getBitrate())
# broadcast bit rate
# XXX add broadcast bit rate
print
def print_encryption(wifi, args=None):
""" Print encryption keys on the card.
"""
try:
keys = wifi.getKeys()
except IOError, (error_number, error_string):
if (error_number == errno.EOPNOTSUPP) or \
(error_number == errno.EINVAL) or \
(error_number == errno.ENODEV):
# not a wireless device
sys.stderr.write("%-8.16s no encryption keys information.\n\n" % (
wifi.ifname, ))
else:
range_info = Iwrange(wifi.ifname)
key_sizes = ""
for index in range(range_info.num_encoding_sizes - 1):
key_sizes = key_sizes + \
repr(range_info.encoding_size[index] * 8) + \
", "
key_sizes = key_sizes + \
repr(range_info.encoding_size[range_info.num_encoding_sizes - 1] * 8) + \
"bits"
print "%-8.16s %d key sizes : %s" % \
(wifi.ifname, range_info.num_encoding_sizes, key_sizes)
print " %d keys available :" % (len(keys), )
for key in keys:
print "\t\t[%d]: %s" % (key[0], key[1])
print " Current Transmit Key: [%s]" % \
(wifi.wireless_info.getKey().flags & pythonwifi.flags.IW_ENCODE_INDEX, )
if wifi.wireless_info.getKey().flags & pythonwifi.flags.IW_ENCODE_RESTRICTED:
print " Security mode:restricted"
if wifi.wireless_info.getKey().flags & pythonwifi.flags.IW_ENCODE_OPEN:
print " Security mode:open"
print "\n"
def format_pm_value(value, args=None):
""" Return formatted PM value.
"""
if (value >= pythonwifi.iwlibs.MEGA):
fvalue = "%gs" % (value / pythonwifi.iwlibs.MEGA, )
else:
if (value >= pythonwifi.iwlibs.KILO):
fvalue = "%gms" % (value / pythonwifi.iwlibs.KILO, )
else:
fvalue = "%dus" % (value, )
return fvalue
def print_power(wifi, args=None):
""" Print power management info for the card.
"""
try:
(pm_capa, power_period, power_timeout, power_saving, power_params) = \
wifi.getPowermanagement()
except IOError, (error_number, error_string):
if (error_number == errno.ENODEV):
sys.stderr.write("%-8.16s no power management information.\n\n" % (
wifi.ifname, ))
else:
print "%-8.16s " % (wifi.ifname, ),
if (pm_capa & pythonwifi.flags.IW_POWER_MODE):
print "Supported modes :"
if pm_capa & (pythonwifi.flags.IW_POWER_UNICAST_R |
pythonwifi.flags.IW_POWER_MULTICAST_R):
print "\t\t\to Receive all packets (unicast & multicast)"
print "\t ",
if pm_capa & pythonwifi.flags.IW_POWER_UNICAST_R:
print "\t\to Receive Unicast only (discard multicast)"
print "\t ",
if pm_capa & pythonwifi.flags.IW_POWER_MULTICAST_R:
print "\t\to Receive Multicast only (discard unicast)"
print "\t ",
if pm_capa & pythonwifi.flags.IW_POWER_FORCE_S:
print "\t\to Force sending using Power Management"
print "\t ",
if pm_capa & pythonwifi.flags.IW_POWER_REPEATER:
print "\t\to Repeat multicast"
print "\t ",
if (power_period[0] & pythonwifi.flags.IW_POWER_PERIOD):
if (power_period[0] & pythonwifi.flags.IW_POWER_MIN):
print "Auto period ; ",
else:
print "Fixed period ; ",
print "min period:%s\n\t\t\t " % \
(format_pm_value(power_period[1]), ),
print "max period:%s\n\t " % (format_pm_value(power_period[2]), ),
if (power_timeout[0] & pythonwifi.flags.IW_POWER_TIMEOUT):
if (power_timeout[0] & pythonwifi.flags.IW_POWER_MIN):
print "Auto timeout ; ",
else:
print "Fixed timeout ; ",
print "min period:%s\n\t\t\t " % \
(format_pm_value(power_timeout[1]), ),
print "max period:%s\n\t " % (format_pm_value(power_timeout[2]), ),
if (power_saving[0] & pythonwifi.flags.IW_POWER_SAVING):
if (power_saving[0] & pythonwifi.flags.IW_POWER_MIN):
print "Auto saving ; ",
else:
print "Fixed saving ; ",
print "min period:%s\n\t\t\t " % \
(format_pm_value(power_saving[1]), ),
print "max period:%s\n\t " % (format_pm_value(power_saving[2]), ),
if power_params.disabled:
print "Current mode:off"
else:
if (power_params.flags & pythonwifi.flags.IW_POWER_MODE == \
pythonwifi.flags.IW_POWER_UNICAST_R):
print "Current mode:Unicast only received"
elif (power_params.flags & pythonwifi.flags.IW_POWER_MODE == \
pythonwifi.flags.IW_POWER_MULTICAST_R):
print "Current mode:Multicast only received"
elif (power_params.flags & pythonwifi.flags.IW_POWER_MODE == \
pythonwifi.flags.IW_POWER_ALL_R):
print "Current mode:All packets received"
elif (power_params.flags & pythonwifi.flags.IW_POWER_MODE == \
pythonwifi.flags.IW_POWER_FORCE_S):
print "Current mode:Force sending"
elif (power_params.flags & pythonwifi.flags.IW_POWER_MODE == \
pythonwifi.flags.IW_POWER_REPEATER):
print "Current mode:Repeat multicasts"
print
def print_txpower(wifi, args=None):
""" Print transmit power info for the card.
"""
pass
def print_retry(wifi, args=None):
try:
range_info = Iwrange(wifi.ifname)
except IOError, (error_number, error_string):
if (error_number == errno.EOPNOTSUPP) or \
(error_number == errno.EINVAL) or \
(error_number == errno.ENODEV):
sys.stderr.write("%-8.16s no retry limit/lifetime information.\n\n" % (
wifi.ifname, ))
else:
ifname = "%-8.16s " % (wifi.ifname, )
if (range_info.retry_flags & pythonwifi.flags.IW_RETRY_LIMIT):
if (range_info.retry_flags & pythonwifi.flags.IW_RETRY_MIN):
limit = "Auto limit ; min limit:%d" % (
range_info.min_retry, )
else:
limit = "Fixed limit ; min limit:%d" % (
range_info.min_retry, )
print ifname + limit
ifname = None
print " max limit:%d" % (
range_info.max_retry, )
if (range_info.r_time_flags & pythonwifi.flags.IW_RETRY_LIFETIME):
if (range_info.r_time_flags & pythonwifi.flags.IW_RETRY_MIN):
lifetime = "Auto lifetime ; min lifetime:%d" % (
range_info.min_r_time, )
else:
lifetime = "Fixed lifetime ; min lifetime:%d" % (
range_info.min_r_time, )
if ifname:
print ifname + lifetime
ifname = None
else:
print " " + lifetime
print " max lifetime:%d" % (
range_info.max_r_time, )
iwparam = wifi.wireless_info.getRetry()
if iwparam.disabled:
print " Current mode:off"
else:
print " Current mode:on"
if (iwparam.flags & pythonwifi.flags.IW_RETRY_TYPE):
if (iwparam.flags & pythonwifi.flags.IW_RETRY_LIFETIME):
mode_type = "lifetime"
else:
mode_type = "limit"
mode = " "
if (iwparam.flags & pythonwifi.flags.IW_RETRY_MIN):
mode = mode + " min %s:%d" % (mode_type, iwparam.value)
if (iwparam.flags & pythonwifi.flags.IW_RETRY_MAX):
mode = mode + " max %s:%d" % (mode_type, iwparam.value)
if (iwparam.flags & pythonwifi.flags.IW_RETRY_SHORT):
mode = mode + " short %s:%d" % (mode_type, iwparam.value)
if (iwparam.flags & pythonwifi.flags.IW_RETRY_LONG):
mode = mode + " long %s:%d" % (mode_type, iwparam.value)
print mode
def print_aps(wifi, args=None):
""" Print the access points detected nearby.
iwlist.c uses the deprecated SIOCGIWAPLIST, but iwlist.py uses
regular scanning (i.e. Wireless.scan()).
"""
# "Check if the interface could support scanning"
try:
iwrange = Iwrange(wifi.ifname)
except IOError, (error_number, error_string):
sys.stderr.write("%-8.16s Interface doesn't support scanning.\n\n" % (
wifi.ifname))
else:
# "Check for Active Scan (scan with specific essid)"
# "Check for last scan result (do not trigger scan)"
# "Initiate Scanning"
try:
results = wifi.scan()
except IOError, (error_number, error_string):
if error_number != errno.EPERM:
sys.stderr.write(
"%-8.16s Interface doesn't support scanning : %s\n\n" %
(wifi.ifname, error_string))
else:
if (len(results) == 0):
print "%-8.16s Interface doesn't have " % (wifi.ifname, ) + \
"a list of Peers/Access-Points"
else:
print "%-8.16s Peers/Access-Points in range:"% (wifi.ifname, )
for ap in results:
if (ap.quality.quality):
if (ap.quality.updated & \
pythonwifi.flags.IW_QUAL_QUAL_UPDATED):
quality_updated = "="
else:
quality_updated = ":"
if (ap.quality.updated & \
pythonwifi.flags.IW_QUAL_LEVEL_UPDATED):
signal_updated = "="
else:
signal_updated = ":"
if (ap.quality.updated & \
pythonwifi.flags.IW_QUAL_NOISE_UPDATED):
noise_updated = "="
else:
noise_updated = ":"
print " %s : Quality%c%s/%s Signal level%c%s/%s Noise level%c%s/%s" % \
(ap.bssid,
quality_updated,
ap.quality.quality,
wifi.getQualityMax().quality,
signal_updated,
ap.quality.getSignallevel(),
"100",
noise_updated,
ap.quality.getNoiselevel(),
"100")
else:
print " %s" % (ap.bssid, )
print
def report_error(function, interface, error_number, error_string):
""" Print error to user. """
print """Uncaught error condition. Please report this to the \
developers' mailing list (information available at \
http://lists.berlios.de/mailman/listinfo/pythonwifi-dev). While attempting to \
print %s information for %s, the error "%d - %s" occurred.""" % \
(function, interface, error_number, error_string)
def usage():
print """\
Usage: iwlist.py [interface] scanning [essid NNN] [last]
[interface] frequency
[interface] channel
[interface] bitrate
[interface] encryption
[interface] keys
[interface] power
[interface] txpower
[interface] retry
[interface] ap
[interface] accesspoints
[interface] peers"""
def get_matching_command(option):
""" Return a function for the command.
'option' -- string -- command to match
Return None if no match found.
"""
# build dictionary of commands and functions
iwcommands = { "s" : ("scanning", print_scanning_results),
"c" : ("channel", print_channels),
"f" : ("frequency", print_channels),
"b" : ("bitrate", print_bitrates),
"ra" : ("rate", print_bitrates),
"en" : ("encryption", print_encryption),
"k" : ("keys", print_encryption),
"po" : ("power", print_power),
"t" : ("txpower", print_txpower),
"re" : ("retry", print_retry),
"ap" : ("ap", print_aps),
"ac" : ("accesspoints", print_aps),
"pe" : ("peers", print_aps),
#"ev" : ("event", print_event),
#"au" : ("auth", print_auth),
#"w" : ("wpakeys", print_wpa),
#"g" : ("genie", print_genie),
#"m" : ("modulation", print_modulation),
}
function = None
for command in iwcommands.keys():
if option.startswith(command):
if iwcommands[command][0].startswith(option):
function = iwcommands[command][1]
return function
def main():
# if only program name is given, print usage info
if len(sys.argv) == 1:
usage()
# if program name and one argument are given
if len(sys.argv) == 2:
option = sys.argv[1]
# look for matching command
list_command = get_matching_command(option)
# if the one argument is a command
if list_command is not None:
for ifname in getNICnames():
wifi = Wireless(ifname)
list_command(wifi)
else:
print "iwlist.py: unknown command `%s' " \
"(check 'iwlist.py --help')." % (option, )
# if program name and more than one argument are given
if len(sys.argv) > 2:
# Get the interface and command from command line
ifname, option = sys.argv[1:]
# look for matching command
list_command = get_matching_command(option)
# if the second argument is a command
if list_command is not None:
wifi = Wireless(ifname)
list_command(wifi, sys.argv[3:])
else:
print "iwlist.py: unknown command `%s' " \
"(check 'iwlist.py --help')." % (option, )
if __name__ == "__main__":
main()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/__init__.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
'''
milk.unsupervised
Unsupervised Learning
---------------------
- kmeans: This is a highly optimised implementation of kmeans
- PCA: Simple implementation
- Non-negative matrix factorisation: both direct and with sparsity constraints
'''
from kmeans import kmeans,repeated_kmeans, select_best_kmeans
from gaussianmixture import *
from pca import pca, mds
import nnmf
from nnmf import *
from pdist import pdist, plike
from .som import som
from .normalise import zscore, center
__all__ = [
'center',
'kmeans',
'mds',
'pca',
'pdist',
'plike',
'repeated_kmeans',
'select_best_kmeans',
'som',
'zscore',
] + \
nnmf.__all__
<file_sep>/powertrain/robomow_motor_class.py
#!/usr/bin/env python
import serial
import time
def log(*msgline):
for msg in msgline:
print msg,
print
class robomow_motor(object):
def __init__(self,com_port="/dev/ttyUSB0"):
############################
# lets introduce and init the main variables
self.com = None
self.isInitialized = False
###########################
# lets connect the TTL Port
try:
self.com = serial.Serial(com_port, 9600, timeout=1)
self.com.close()
self.com.open()
self.isInitialized = True
self.lmotor_speed = 0
self.rmotor_speed = 0
log ("Link to motor driver -", com_port, "- successful")
#log (self.com.portstr, self.com.baudrate,self.com.bytesize,self.com.parity,self.com.stopbits)
except serial.serialutil.SerialException, e:
print e
log("Link to motor driver -", com_port, "- failed")
def __del__(self):
del(self)
def stats(self):
return (self.com.portstr, self.com.baudrate,self.com.bytesize,self.com.parity,self.com.stopbits)
def forward(self,speed):
##takes desired speed as percentage, returns 2 ints indicating each current motor speed
motor1_spd = int(63.0 * (float(speed)/100) + 65)
motor2_spd = int(63.0 * (float(speed)/100) + 193)
if motor1_spd < 65: motor1_spd = 65
if motor1_spd > 127: motor1_spd = 127
if motor2_spd < 193: motor2_spd = 193
if motor2_spd > 255: motor2_spd = 255
#print motor1_spd, motor2_spd
self.com.write (chr(int(hex(motor1_spd),16)))
#time.sleep(.01)
self.com.write (chr(int(hex(motor2_spd),16)))
#print "sending command: ", int(hex(speed),16)
#print "sending command: ", int(hex(speed+127),16)
self.lmotor_speed = motor1_spd
self.rmotor_speed = motor2_spd
def reverse(self,speed):
##takes desired speed as percentage, returns 2 ints indicating each current motor speed
motor1_spd = 63 - int(63.0 * (float(speed)/100))
motor2_spd = 191 - int(63.0 * (float(speed)/100))
if motor1_spd < 1: motor1_spd = 1
if motor1_spd > 63: motor1_spd = 63
if motor2_spd < 128: motor2_spd = 128
if motor2_spd > 191: motor2_spd = 191
#print motor1_spd, motor2_spd
self.com.write (chr(int(hex(motor1_spd),16)))
#time.sleep(.01)
self.com.write (chr(int(hex(motor2_spd ),16)))
#print "sending command: ", int(hex(speed),16)
#print "sending command: ", int(hex(speed+127),16)
self.lmotor_speed = motor1_spd
self.rmotor_speed = motor2_spd
def stop(self):
self.com.write (chr(int(hex(64),16)))
#time.sleep(.01)
self.com.write (chr(int(hex(192),16)))
self.lmotor_speed = 64
self.rmotor_speed = 192
def left(self, degree):
##adjust the current motor speeds to veer left (slow motor 1, speed up motor 2)
motor1_spd = self.lmotor_speed - degree
motor2_spd = self.rmotor_speed + degree
if motor1_spd < 1: motor1_spd = 1
if motor1_spd > 127: motor1_spd = 127
if motor2_spd < 128: motor2_spd = 128
if motor2_spd > 255: motor2_spd = 255
self.com.write (chr(int(hex(motor1_spd),16)))
#time.sleep(.01)
self.com.write (chr(int(hex(motor2_spd ),16)))
self.lmotor_speed = motor1_spd
self.rmotor_speed = motor2_spd
def right(self, degree):
##adjust the current motor speeds to veer right (speed up motor 1, slow motor 2)
motor1_spd = self.lmotor_speed + degree
motor2_spd = self.rmotor_speed - degree
if motor1_spd < 1: motor1_spd = 1
if motor1_spd > 127: motor1_spd = 127
if motor2_spd < 128: motor2_spd = 128
if motor2_spd > 255: motor2_spd = 255
self.com.write (chr(int(hex(motor1_spd),16)))
#time.sleep(.01)
self.com.write (chr(int(hex(motor2_spd ),16)))
self.lmotor_speed = motor1_spd
self.rmotor_speed = motor2_spd
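# Minimal usage sketch (illustrative; assumes the motor driver is on /dev/ttyUSB0,
# which is also the constructor default):
# motor = robomow_motor("/dev/ttyUSB0")
# if motor.isInitialized:
# motor.forward(25) # ~quarter speed: writes byte 80 for motor 1 and 208 for motor 2
# time.sleep(1)
# motor.stop() # writes the neutral bytes 64 and 192
# The percent-to-byte mapping comes from forward()/reverse() above: motor 1 uses
# bytes 1-127 (64 = stop), motor 2 uses bytes 128-255 (192 = stop).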
<file_sep>/gui/breezy/paneldemo4.py
"""
File: paneldemo4.py
Calculates the attributes of a sphere. Uses a panel to organize the
command buttons.
"""
from breezypythongui import EasyFrame, EasyDialog
import math
class PanelDemo(EasyFrame):
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self)
self.addButton(text = "Press me", row = 0, column = 0,
command = self.modify)
# Event handling method.
def modify(self):
"""Pops up a dialog to edit the model."""
dialog = PanelDialog(self)
class PanelDialog(EasyDialog):
"""Opens a dialog."""
def __init__(self, parent):
"""Sets up the window."""
EasyDialog.__init__(self, parent, "Sphere calculations")
def body(self, master):
"""Sets up the widgets."""
# Label and field for the radius
self.addLabel(master, text = "Radius",
row = 0, column = 0)
self.radiusField = self.addFloatField(master, value = 0.0,
row = 0,
column = 1)
# Label and field for the output
self.outputLabel = self.addLabel(master, text = "Diameter",
row = 1, column = 0)
self.outputField = self.addFloatField(master, value = 0.0,
row = 1,
column = 1)
# Panel for the command buttons
buttonPanel = self.addPanel(master, row = 2, column = 0,
columnspan = 2)
# The command buttons
buttonPanel.addButton(text = "Diameter",
row = 0, column = 0,
command = self.computeDiameter)
buttonPanel.addButton(text = "Area",
row = 0, column = 1,
command = self.computeArea)
buttonPanel.addButton(text = "Volume",
row = 0, column = 2,
command = self.computeVolume)
def apply(self):
"""Transfers data from the fields to the CD."""
self.setModified()
# The event handler methods for the buttons
def computeDiameter(self):
"""Inputs the radius, computes the diameter,
and outputs the result."""
radius = self.radiusField.getNumber()
diameter = radius * 2
self.outputField.setNumber(diameter)
self.outputLabel["text"] = "Diameter"
def computeArea(self):
"""Inputs the radius, computes the area,
and outputs the result."""
radius = self.radiusField.getNumber()
area = 4 * radius ** 2 * math.pi
self.outputField.setNumber(area)
self.outputLabel["text"] = "Area"
def computeVolume(self):
"""Inputs the radius, computes the volume,
and outputs the result."""
radius = self.radiusField.getNumber()
volume = 4.0 / 3.0 * radius ** 3 * math.pi
self.outputField.setNumber(volume)
self.outputLabel["text"] = "Volume"
#Instantiate and pop up the window."""
PanelDemo().mainloop()
<file_sep>/lib/mobot_wifi_consume.py
#!/usr/bin/python
import pika
import thread, time, sys, traceback
''' USAGE:
wifi = consume_wifi(channel_name.#, ip_of_publisher)
wifi = consume_wifi('wifi.1', 'localhost')
while True:
time.sleep(1)
print 'signal strength:', wifi.signal_strength
self.signal_strength:
int > -1 = the wifi ssid signal strength
-1 = connected to queue but ssid not found
-2 = no connecntion to msg queue
'''
class consume_wifi():
def __init__(self, channel_name, host_ip):
self.signal_strength = None
#-------------connection variables
self.channel_name = channel_name
self.host_ip = host_ip
self.queue_name = None
self.connection = None
self.channel = None
#----------------------RUN
self.run()
def connect(self):
self.connection = pika.BlockingConnection(pika.ConnectionParameters(host=self.host_ip))
self.channel = self.connection.channel()
self.channel.exchange_declare(exchange='mobot_data_feed',type='topic')
result = self.channel.queue_declare(exclusive=True, auto_delete=True, arguments={'x-message-ttl':100})
self.queue_name = result.method.queue
binding_keys = self.channel_name
self.channel.queue_bind(exchange='mobot_data_feed', queue=self.queue_name, routing_key=binding_keys)
def read_signal_strength(self):
while True:
if self.connection == None or self.connection.is_open == False:
self.connect()
time.sleep(0.005) # do not hog the processor power
#print "-" * 50
method_frame, properties, body = self.channel.basic_get(self.queue_name)
if method_frame:
# Display the message parts
#print method_frame
#print properties
#print body
self.signal_strength = body
#print "self.signal_strength:", self.signal_strength
self.channel.basic_ack(method_frame.delivery_tag)
else:
#print "no msgs read"
time.sleep(1)
self.signal_strength = -2
def run(self):
self.th = thread.start_new_thread(self.read_signal_strength, ())
if __name__== "__main__":
wifi = consume_wifi('wifi.1', '192.168.1.180')
while True:
time.sleep(.01)
print 'signal strength:', wifi.signal_strength
<file_sep>/navigation/astar/astar.py
#!/usr/bin/python
# Copyright (c) 2010 <NAME>
# Licensed under the MIT license.
# http://brandon.sternefamily.net/files/mit-license.txt
# Python A-Star (A*) Implementation
import sys
import heapq
import copy
class Node:
def __init__(self, contents, cost, hcost=0):
self.contents = contents # representation of the game board
self.cost = cost # cost to get to this node
self.hcost = hcost # estimated heuristic cost to goal
self.parent = None # link to parent node in tree
def setParent(self, parent):
self.parent = parent
# return a list of child nodes according to the set of legal moves
def getChildNodes(self, moves):
children = []
board = self.contents
for cell in board.keys():
if board[cell] != None: # there is a piece that can be moved
for move in moves[cell]: # examine legal moves for piece
if board[move] == None: # position is available
temp = board.copy()
# place the piece in its new position
temp[cell] = board[move]
temp[move] = board[cell]
# create node containing the new configuration
child = Node(temp, self.cost+1)
child.hcost = distance(child)
children.append(child)
return children
# evaluate if this node meets the goal criteria
def isGoal(self):
if (self.contents[1] == "black") and \
(self.contents[6] == "black") and \
(self.contents[5] == "white") and \
(self.contents[7] == "white"):
return True
else:
return False
# return a minified, static copy of the board representation so we
# can keep track of visited nodes (dictionaries require static keys)
def staticCast(self):
# return a tuple only containing filled cells
return tuple([(cell, self.contents[cell]) for cell in
self.contents.keys() if self.contents[cell] is not None])
# comparison function for keeping our priority queue in order
# keep less costly nodes toward the front of the queue (explore first)
def compare(a,b):
if (a.cost+a.hcost) < (b.cost+b.hcost): return -1
elif (a.cost+a.hcost) == (b.cost+b.hcost): return 0
else: return 1
# return a list of nodes from the start position to the goal position
def getPath(goal, start):
current = copy.copy(goal)
path = []
# start at the goal and follow the parent chain to the beginning
path.append(goal)
while current.contents != start.contents:
up = current.parent
path.append(up)
current = up
# reverse the list to give the start-to-goal ordering
path.reverse()
return path
# print a representation of a game board
def showDiagram(board):
for i in range(1, 11):
# line breaks
if i in [2, 6, 9]: print "\n"
if board[i] != None:
print board[i][0],
else: print "_",
print "\n"
# dist[i] is the minimum distance to a correct position for a piece of color
# (black, white). E.g., dist[5][1] is 0 because a white piece in square 5
# is already correctly positioned.
dist = {1: (0, 1), 2: (3, 4), 3: (6, 7), 4: (1, 2), 5: (2, 0),
6: (0, 3), 7: (1, 0), 8: (4, 5), 9: (5, 6), 10: (2, 3)}
# Heuristic function to estimate how far from the solution a given node is.
# In this problem, we always know the minimum distance from one square to
# another via knight's moves. Uses dist (above) to look up distances and sum them.
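# Illustrative check using the starting board from main() (black on 5 and 7,
# white on 1 and 6): distance() sums dist[5][0] + dist[7][0] + dist[1][1] + dist[6][1]
# = 2 + 1 + 1 + 3 = 7 estimated knight moves to the goal configuration.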
def distance(node):
distance = 0
for cell in [c for c in node.contents.keys()
if node.contents[c] is not None]:
if node.contents[cell] == "black":
distance += dist[cell][0]
else:
distance += dist[cell][1]
return distance
# driver function for the A-Star tree search
# takes a starting board configuration and a dictionary of legal moves
def aStarSearch(board, moves):
import heapq
# priority queue to store nodes
pq = []
heapq.heapify(pq)
#dictionary to store previously visited nodes
visited = {}
# put the initial node on the queue
start = Node(board, 0)
heapq.heappush(pq, start)
while (len(pq) > 0):
node = heapq.heappop(pq)
visNode = node.staticCast()
if visNode not in visited:
if node.isGoal():
return "We've got a winner.", node
else:
children = node.getChildNodes(moves)
for child in children:
child.setParent(node)
heapq.heappush(pq, child)
visited[visNode] = True
pq.sort(compare) # keep less costly nodes at the front
# entire tree searched, no goal state found
return "No solution.", None
def printHelp():
print "-p\tPath: reconstruct full path to goal"
sys.exit()
def main():
if "-h" in sys.argv or "--help" in sys.argv:
printHelp()
# dictionary to store board state
# cells contain one of ["black", "white", None]
board = {}
for i in range(1, 11, 1): board[i] = None
board[1] = "white"
board[6] = "white"
board[5] = "black"
board[7] = "black"
# dictionary defines valid operators (legal moves)
moves = { 1: [4, 7],
2: [8, 10],
3: [9],
4: [1, 6, 10],
5: [7],
6: [4],
7: [1, 5],
8: [2, 9],
9: [8, 3],
10: [2, 4] }
ans, goal = aStarSearch(board, moves)
print ans
# reconstruct path
if "-p" in sys.argv:
if ans == "We've got a winner.":
start = Node(board, 0)
path = getPath(goal, start)
i = 0
for node in path:
print "step ", i, ":"
showDiagram(node.contents)
i += 1
if __name__ == "__main__":
main()
<file_sep>/lib/mobot_wifi_publish.py~
#!/usr/bin/python
import dbus
import time
import thread
import pika
class WiFiList():
def __init__(self):
self.NM = 'org.freedesktop.NetworkManager'
self.has_nm = True
self.strength = None
self.aps = []
try:
self.bus = dbus.SystemBus()
nm = self.bus.get_object(self.NM, '/org/freedesktop/NetworkManager')
self.devlist = nm.GetDevices(dbus_interface = self.NM)
#-------------connection variables
self.channel_name = 'wifi.1'
self.connection = None
self.channel = None
#----------------------RUN
self.run()
except:
self.has_nm = False
def dbus_get_property(self, prop, member, proxy):
return proxy.Get(self.NM+'.' + member, prop, dbus_interface = 'org.freedesktop.DBus.Properties')
def repopulate_ap_list(self):
apl = []
res = []
for i in self.devlist:
tmp = self.bus.get_object(self.NM, i)
if self.dbus_get_property('DeviceType', 'Device', tmp) == 2:
apl.append(self.bus.get_object(self.NM, i).GetAccessPoints(dbus_interface = self.NM+'.Device.Wireless'))
for i in apl:
for j in i:
res.append(self.bus.get_object(self.NM, j))
return res
def update(self):
self.aps = []
if self.has_nm:
for i in self.repopulate_ap_list():
try:
ssid = self.dbus_get_property('Ssid', 'AccessPoint', i)
ssid = "".join(["%s" % k for k in ssid])
ss = self.dbus_get_property('Strength', 'AccessPoint', i);
mac = self.dbus_get_property('HwAddress', 'AccessPoint', i);
#self.aps.append({"mac":str(mac), "ssid": unicode(ssid), "ss": float(ss)})
self.aps.append([str(unicode(ssid)), int(ss)])
except:
pass
def signal_strength(self, ssid):
to_return = -1
self.update()
#filter(lambda x: 'abc' in x,lst)
for i in self.aps:
if i[0] == ssid:
to_return = i[1]
self.publish(str(to_return))
return to_return
def run(self):
self.connect()
#self.th = thread.start_new_thread(self.update, ())
def connect(self):
self.connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
self.channel = self.connection.channel()
self.channel.exchange_declare(exchange='mobot_data_feed',type='topic')
#self.channel.queue_declare(queue='mobot_wifi', auto_delete=True, arguments={'x-message-ttl':100})
#self.channel.queue_bind(queue='mobot_wifi', exchange='mobot_data_feed', auto_delete=True, arguments={'x-message-ttl':1000})
def publish(self, data):
self.channel.basic_publish(exchange='mobot_data_feed', routing_key=self.channel_name, body=data, properties=pika.BasicProperties(expiration=str(100)))
if __name__ == "__main__":
wlist = WiFiList()
i = 0
while True:
time.sleep(.0001)
print wlist.signal_strength('isotope11_wireless'), i
i += 1
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_lasso.py
from milk.supervised.lasso import lasso_learner
import milk.supervised.lasso
import numpy as np
def test_lasso_smoke():
np.random.seed(3)
for i in xrange(8):
X = np.random.rand(100,10)
Y = np.random.rand(5,10)
B = np.random.rand(5,100)
before = np.linalg.norm(Y - np.dot(B,X))
B = milk.supervised.lasso(X,Y)
after = np.linalg.norm(Y - np.dot(B,X))
assert after < before
assert np.all(~np.isnan(B))
def test_lasso_nans():
np.random.seed(3)
for i in xrange(8):
X = np.random.rand(100,10)
Y = np.random.rand(5,10)
B = np.random.rand(5,100)
for j in xrange(12):
Y.flat[np.random.randint(0,Y.size-1)] = float('nan')
B = milk.supervised.lasso(X,Y)
assert np.all(~np.isnan(B))
def test_lam_zero():
np.random.seed(2)
for i in xrange(8):
X = np.random.rand(24,2)
Y = np.random.rand(1,2)
B = milk.supervised.lasso(X,Y, lam=0.0)
R = Y - np.dot(B,X)
R = R.ravel()
assert np.dot(R,R) < .01
def test_lasso_walk():
np.random.seed(5)
for i in xrange(4):
X = np.random.rand(100,10)
Y = np.random.rand(5,10)
Bs = milk.supervised.lasso_walk(X,Y, start=.0001, nr_steps=3)
B0 = milk.supervised.lasso(X,Y, lam=.0001)
assert np.all(Bs[0] == B0)
assert not np.all(Bs[0] == Bs[-1])
assert len(Bs) == 3
def test_lasso_walk_nans():
np.random.seed(5)
for i in xrange(3):
X = np.random.rand(100,10)
Y = np.random.rand(5,10)
B = np.random.rand(5,100)
for j in xrange(12):
Y.flat[np.random.randint(0,Y.size-1)] = float('nan')
B = milk.supervised.lasso_walk(X,Y, nr_steps=6)
assert np.all(~np.isnan(B))
def test_learner():
np.random.seed(334)
learner = lasso_learner()
X = np.random.rand(100,10)
Y = np.random.rand(5,10)
model = learner.train(X,Y)
test = model.apply(np.random.rand(100))
assert len(test) == len(Y)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_som.py
import numpy as np
import random
from milk.unsupervised import som
from milk.unsupervised.som import putpoints, closest
def _slow_putpoints(grid, points, L=.2):
for point in points:
dpoint = grid-point
y,x = np.unravel_index(np.abs(dpoint).argmin(), dpoint.shape)
for dy in xrange(-4, +4):
for dx in xrange(-4, +4):
ny = y + dy
nx = x + dx
if ny < 0 or ny >= grid.shape[0]:
continue
if nx < 0 or nx >= grid.shape[1]:
continue
L2 = L/(1+np.abs(dy)+np.abs(dx))
grid[ny,nx] *= 1. - L2
grid[ny,nx] += point*L2
def data_grid():
np.random.seed(22)
data = np.arange(100000, dtype=np.float32)
grid = np.array([data.flat[np.random.randint(0, data.size)] for i in xrange(64*64)]).reshape((64,64,1))
data = data.reshape((-1,1))
return grid, data
def test_putpoints():
grid, points = data_grid()
points = points[:100]
grid2 = grid.copy()
putpoints(grid, points, L=0., R=1)
assert np.all(grid == grid2)
putpoints(grid, points, L=.5, R=1)
assert not np.all(grid == grid2)
def test_against_slow():
grid, points = data_grid()
grid2 = grid.copy()
putpoints(grid, points[:10], shuffle=False)
_slow_putpoints(grid2.reshape((64,64)), points[:10])
assert np.allclose(grid, grid2)
def test_som():
N = 10000
np.random.seed(2)
data = np.array([np.arange(N), N/4.*np.random.randn(N)])
data = data.transpose().copy()
grid = som(data, (8,8), iterations=3, R=4)
assert grid.shape == (8,8,2)
y,x = closest(grid, data[0])
assert 0 <= y < grid.shape[0]
assert 0 <= x < grid.shape[1]
grid2 = grid.copy()
np.random.shuffle(grid2)
full = np.abs(np.diff(grid2[:,:,0], axis=0)).mean()
obs = np.abs(np.diff(grid[:,:,0], axis=0)).mean()
obs2 = np.abs(np.diff(grid[:,:,0], axis=1)).mean()
assert obs + 4*np.abs(obs-obs2) < full
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/ext/jugparallel.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
'''
============
Jug Parallel
============
These are some functions that make it easier to take advantage of `jug
<http://luispedro.org/software/jug>`__ to perform tasks in parallel.
All of the functions in this module return a jug.Task object and are usable
with the ``CompoundTask`` interface in jug.
None of this will make sense unless you understand and have used jug.
'''
from __future__ import division
import numpy as np
import milk
try:
from jug import TaskGenerator, value
from jug.utils import identity
from jug.mapreduce import mapreduce
from jug.mapreduce import reduce as jug_reduce
except ImportError:
raise ImportError('milk.ext.jugparallel requires jug (http://luispedro.org/software/jug)')
def _nfold_reduce(a,b):
cmat = a[0] + b[0]
names = a[1]
assert a[1] == b[1]
if len(a) == 2:
return cmat, names
predictions = np.array([a[2],b[2]])
return cmat, names, predictions.max(0)
def nfoldcrossvalidation(features, labels, **kwargs):
'''
jug_task = nfoldcrossvalidation(features, labels, **kwargs)
A jug Task that perform n-foldcrossvalidation
N-fold cross validation is inherently parallel. This function returns a
``jug.Task`` which performs n-fold crossvalidation which jug can
parallelise.
Parameters
----------
features : sequence of features
labels : sequence
kwargs : any
This will be passed down to ``milk.nfoldcrossvalidation``
Returns
-------
jug_task : a jug.Task
A Task object
See Also
--------
milk.nfoldcrossvalidation : The same functionality as a "normal" function
jug.CompoundTask : This function can be used as argument to CompoundTask
'''
nfolds = kwargs.get('nfolds', 10)
features,labels = map(identity, (features,labels))
kwargs = dict( (k,identity(v)) for k,v in kwargs.iteritems())
nfold_one = TaskGenerator(milk.nfoldcrossvalidation)
mapped = [nfold_one(features, labels, folds=[i], **kwargs) for i in xrange(nfolds)]
return jug_reduce(_nfold_reduce, mapped)
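# Minimal usage sketch (assumes this is called from inside a jugfile where
# `features` and `labels` are already defined; run with `jug execute` as usual):
#
#   from milk.ext.jugparallel import nfoldcrossvalidation
#   results = nfoldcrossvalidation(features, labels, nfolds=10)
#   # once jug has executed the tasks, `results` holds (confusion_matrix, label_names)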
def _select_min(s0, s1):
if s0[0] < s1[0]: return s0
else: return s1
def _evaluate_solution(args):
features, results, method = args
from milk.unsupervised.gaussianmixture import AIC, BIC
if method == 'AIC':
method = AIC
elif method == 'BIC':
method = BIC
else:
raise ValueError('milk.ext.jugparallel.kmeans_select_best: unknown method: %s' % method)
assignments, centroids = results
value = method(features, assignments, centroids)
return value, results
def _select_best(features, results, method):
features = identity(features)
return mapreduce(_select_min, _evaluate_solution, [(features,r,method) for r in results], reduce_step=32, map_step=8)
def kmeans_select_best(features, ks, repeats=1, method='AIC', R=None, **kwargs):
'''
assignments_centroids = kmeans_select_best(features, ks, repeats=1, method='AIC', R=None, **kwargs)
Perform ``repeats`` calls to ``kmeans`` for each ``k`` in ``ks``, select
the best one according to ``method.``
Note that, unlike a raw ``kmeans`` call, this is *always deterministic*
even if ``R=None`` (which is interpreted as being equivalent to setting it
to a fixed value). Otherwise, the jug paradigm would be broken as different
runs would give different results.
Parameters
----------
features : array-like
2D array
ks : sequence of integers
These will be the values of ``k`` to try
repeats : integer, optional
How many times to attempt each k (default: 1).
method : str, optional
Which method to use. Must be one of 'AIC' (default) or 'BIC'.
R : random number source, optional
        Even if you do not pass a value, the result will be deterministic. This is
        different from the typical behaviour of ``R``, but, when using jug,
        reproducibility is often a desired feature.
kwargs : other options
These are passed transparently to ``kmeans``
Returns
-------
assignments_centroids : jug.Task
jug.Task which is the result of the best (as measured by ``method``)
kmeans clustering.
'''
from milk import kmeans
from milk.utils import get_pyrandom
kmeans = TaskGenerator(kmeans)
if R is not None:
start = get_pyrandom(R).randint(0,1024*1024)
else:
start = 7
results = []
for ki,k in enumerate(ks):
for i in xrange(repeats):
results.append(kmeans(features, k, R=(start+7*repeats*ki+i), **kwargs))
return _select_best(features, results, method)[1]
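# Minimal usage sketch (assumes a jugfile with a 2D feature array `features`
# already defined; the `ks`, `repeats` and `method` values are illustrative):
#
#   from milk.ext.jugparallel import kmeans_select_best
#   assignments_centroids = kmeans_select_best(features, ks=range(2, 9), repeats=3, method='BIC')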
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_gaussianmixture.py
import numpy as np
from milk.unsupervised import gaussianmixture
def _sq(x):
return x*x
def test_gm():
np.random.seed(22)
centroids = np.repeat(np.arange(4), 4).reshape((4,4))
fmatrix = np.concatenate([(np.random.randn(12,4)+c) for c in centroids])
assignments = np.repeat(np.arange(4), 12)
rss = sum(np.sum(_sq(fmatrix[i*12:(i+1)*12]-i)) for i in xrange(4))
assert np.abs(gaussianmixture.residual_sum_squares(fmatrix, assignments, centroids) - rss) < 1.e-12
assert gaussianmixture.BIC(fmatrix, assignments, centroids) > 0
assert gaussianmixture.AIC(fmatrix, assignments, centroids) > 0
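    # models with more free parameters (full > diagonal > single variance) pay a
    # larger complexity penalty, so their BIC/AIC scores are expected to be larger here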
assert gaussianmixture.BIC(fmatrix, assignments, centroids, model='full_covariance') > \
gaussianmixture.BIC(fmatrix, assignments, centroids, model='diagonal_covariance') > \
gaussianmixture.BIC(fmatrix, assignments, centroids, model='one_variance')
assert gaussianmixture.AIC(fmatrix, assignments, centroids, model='full_covariance') > \
gaussianmixture.AIC(fmatrix, assignments, centroids, model='diagonal_covariance') > \
gaussianmixture.AIC(fmatrix, assignments, centroids, model='one_variance')
<file_sep>/camera/framestream_remote.py
# PURPOSE:
#open sockets between remote and station
#capture frame on remote
#send frame from remote to station
#import the necessary things for Socket
import sys, socket
# import the necessary things for OpenCV (the old SWIG bindings; adaptors is used below)
import opencv
from opencv import adaptors
#this is important for capturing/displaying images
from opencv import highgui
import pygame
import Image
from pygame.locals import *
#capture a frame
camera = highgui.cvCreateCameraCapture(0)
def get_image():
im = highgui.cvQueryFrame(camera)
# Add the line below if you need it (Ubuntu 8.04+)
#im = opencv.cvGetMat(im)
#convert Ipl image to PIL image
return opencv.adaptors.Ipl2PIL(im)
fps = 30.0
pygame.init()
window = pygame.display.set_mode((640,480))
pygame.display.set_caption("WebCam Demo")
screen = pygame.display.get_surface()
while True:
events = pygame.event.get()
for event in events:
if event.type == QUIT or event.type == KEYDOWN:
sys.exit(0)
im = get_image()
pg_img = pygame.image.frombuffer(im.tostring(), im.size, im.mode)
screen.blit(pg_img, (0,0))
pygame.display.flip()
pygame.time.delay(int(1000 * 1.0/fps))
"""
#define variable
STATIONIP = '127.0.0.1' #IP of the station's IP
CPORT = 12345 #port to handle commands
DPORT = 12346 #port to handle data
#create socket
filename_tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
#connect to station
filename_tcp.connect((STATIONIP, CPORT))
#send file name
filename_tcp.send("SEND " + FILE)
#close port
filename_tcp.close()
data_tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
data_tcp.connect((STATIONIP, DPORT))
print "data port connected..."
f = open(FILE, "rb")
data = f.read()
f.close()
data_tcp.send(data)
data_tcp.close()
"""
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_fisher.py
import milk.supervised.svm
import milk.supervised.normalise
import numpy as np
import milk.supervised.svm
def _slow_f(features,labels,kernel_or_sigma):
try:
kernel = kernel_or_sigma
kernel(features[0],features[1])
except:
kernel = milk.supervised.svm.rbf_kernel(kernel_or_sigma)
N1 = (labels == 0).sum()
N2 = (labels == 1).sum()
x1 = features[labels == 0]
x2 = features[labels == 1]
dm = 0
for i in xrange(N1):
for j in xrange(N1):
dm += kernel(x1[i],x1[j])/N1/N1
for i in xrange(N2):
for j in xrange(N2):
dm += kernel(x2[i],x2[j])/N2/N2
for i in xrange(N1):
for j in xrange(N2):
dm -= 2*kernel(x1[i],x2[j])/N1/N2
s1 = N1
for i in xrange(N1):
for j in xrange(N1):
s1 -= kernel(x1[i],x1[j])/N1
s2 = N2
for i in xrange(N2):
for j in xrange(N2):
s2 -= kernel(x2[i],x2[j])/N2
return (s1 + s2)/dm
def test_fisher_approx():
from milksets import wine
features,labels = wine.load()
f = milk.supervised.svm.sigma_value_fisher(features,labels)
for sigma in (2.**-4,2.,16.,32.):
assert abs(f(sigma) - _slow_f(features,labels,sigma)) < 1e-6
<file_sep>/navigation/gps/identify_device_on_ttyport.py
import glob
import os
import re
#lsusb to find vendorID and productID:
#Bus 002 Device 005: ID 067b:2303 Prolific Technology, Inc. PL2303 Serial Port
# then call print find_usb_tty("067b","2303")
def find_usb_tty(vendor_id = None, product_id = None) :
tty_devs = []
vendor_id = int(vendor_id, 16)
product_id = int(product_id , 16)
for dn in glob.glob('/sys/bus/usb/devices/*') :
try :
vid = int(open(os.path.join(dn, "idVendor" )).read().strip(), 16)
pid = int(open(os.path.join(dn, "idProduct")).read().strip(), 16)
if ((vendor_id is None) or (vid == vendor_id)) and ((product_id is None) or (pid == product_id)) :
dns = glob.glob(os.path.join(dn, os.path.basename(dn) + "*"))
for sdn in dns :
for fn in glob.glob(os.path.join(sdn, "*")) :
if re.search(r"\/ttyUSB[0-9]+$", fn) :
#tty_devs.append("/dev" + os.path.basename(fn))
tty_devs.append(os.path.join("/dev", os.path.basename(fn)))
pass
pass
pass
pass
except ( ValueError, TypeError, AttributeError, OSError, IOError ) :
pass
pass
return tty_devs
print find_usb_tty("067b","2303")
<file_sep>/powertrain/sabertooth/sabertooth_driver.py~
#!/usr/bin/python
import serial
import time
PORT = "/dev/ttyUSB0"
#if len(sys.argv) > 1:
# PORT = sys.argv[1]
ser = serial.Serial(PORT, 9600, timeout=1)
print ser.portstr,
print ser.baudrate,
print ser.bytesize,
print ser.parity,
print ser.stopbits
print ser
print "sending command"
time.sleep(1)
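# Sabertooth simplified serial: byte values 1-127 drive motor 1
# (1 = full reverse, 64 = stop, 127 = full forward); 0 shuts down both motors.
# The loop below ramps motor 1 up towards full forward.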
for i in xrange(80, 127, 1):
ser.write (chr(int(hex(i),16)))
time.sleep (.3)
percent = i/127.0
print i, " engine at speed: ", percent*100, '%'
time.sleep(5)
for i in xrange(127, 65, -1):
ser.write (chr(int(hex(i),16)))
time.sleep (.2)
percent = i/127.0
print i, " engine at speed: ", percent*100, '%'
ser.write (chr(int(hex(0),16)))
print "closing port"
time.sleep(2)
print 'port closed'
ser.close()
print hex(127)
print chr(0x50)
<file_sep>/navigation/gps/gpsreader3_3hz.py~
#!/usr/bin/python
import gps, os, time
#session = gps.gps()
#session.poll()
session = gps.gps(host='localhost', port='2947')
session.next()
#session.stream(mode=WATCH_ENABLE|WATCH_NEWSTYLE)
session.stream()
#from gps import *
#session = gps() # assuming gpsd running with default options on port 2947
#session.stream(WATCH_ENABLE|WATCH_NEWSTYLE)
#report = session.next()
#print report
while 1:
os.system('clear')
session.next()
time.sleep(1)
# a = altitude, d = date/time, m=mode,
    # o=position/fix, s=status, y=satellites
print
print ' GPS reading'
print '----------------------------------------'
print 'latitude ' , session.fix.latitude
print 'longitude ' , session.fix.longitude
print 'time utc ' , session.utc, session.fix.time
print 'altitude ' , session.fix.altitude
print 'epx ' , session.fix.epx
print 'epv ' , session.fix.epv
print 'ept ' , session.fix.ept
print 'speed ' , session.fix.speed
print 'climb ' , session.fix.climb
print
print ' Satellites (total of', len(session.satellites) , ' in view)'
for i in session.satellites:
print '\t', i
time.sleep(3)
<file_sep>/ftdi/build/pyftdi/build/lib.linux-i686-2.7/pyftdi/serialext/ftdiext.py
# Copyright (c) 2008-2011, Neotion
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the Neotion nor the names of its contributors may
# be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL NEOTION BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
# OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
import re
import time
from pyftdi.pyftdi.ftdi import Ftdi, FtdiError
from pyftdi.pyftdi.misc import to_int
from usbext import SerialUsb
BACKEND = 'pyftdi'
__all__ = ['SerialFtdi']
class SerialFtdi(SerialUsb):
"""Serial port implementation for FTDI compatible with pyserial API"""
SCHEME = 'ftdi://'
# the following dictionaries should be augmented to support the various
# VID/PID that actually map to a USB-serial FTDI device
VENDOR_IDS = { 'ftdi': 0x0403 }
PRODUCT_IDS = { 0x0403 : \
{ '232': 0x6001,
'2232': 0x6010,
'4232': 0x6011,
'ft232': 0x6001,
'ft2232': 0x6010,
'ft4232': 0x6011
}
}
DEFAULT_VENDOR = 0x403
def open(self):
super(SerialFtdi, self).open(Ftdi,
SerialFtdi.SCHEME,
SerialFtdi.VENDOR_IDS,
SerialFtdi.PRODUCT_IDS,
SerialFtdi.DEFAULT_VENDOR)
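# Example port URL (a sketch based on SCHEME/VENDOR_IDS/PRODUCT_IDS above):
#   'ftdi://ftdi:2232/1' opens interface 1 of the first FT2232 device found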
<file_sep>/ftdi/build/pyftdi/build/lib.linux-i686-2.7/pyftdi/serialext/usbext.py
# Copyright (c) 2008-2011, Neotion
# Copyright (c) 2011, <NAME> <<EMAIL>>
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the Neotion nor the names of its contributors may
# be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL NEOTION BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
# OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
import re
import time
from pyftdi.pyftdi.misc import to_int
from serial import SerialBase
__all__ = ['SerialUsb']
class SerialUsb(SerialBase):
"""Base class for Serial port implementation compatible with pyserial API
using a USB device.
"""
BAUDRATES = sorted([9600 * (x+1) for x in range(6)] +
range(115200, 1000000, 115200) + \
range(1000000, 13000000, 100000))
def makeDeviceName(self, port):
return port
def open(self, devclass, scheme, vdict, pdict, default_vendor):
from serial import SerialException
if self._port is None:
raise SerialException("Port must be configured before use.")
portstr = self.portstr
if not portstr.startswith(scheme):
raise SerialException("Invalid URL")
plloc = portstr[len(scheme):].split('/')
plcomps = plloc[0].split(':') + [''] * 2
try:
plcomps[0] = vdict.get(plcomps[0], plcomps[0])
if plcomps[0]:
vendor = to_int(plcomps[0])
else:
vendor = None
product_ids = pdict.get(vendor, None)
if not product_ids:
product_ids = pdict[default_vendor]
plcomps[1] = product_ids.get(plcomps[1], plcomps[1])
if plcomps[1]:
product = to_int(plcomps[1])
else:
product = None
if not plloc[1]:
raise SerialException('Invalid device port')
if plloc[1] == '?':
show_devices = True
else:
interface = to_int(plloc[1])
show_devices = False
except (IndexError, ValueError):
raise SerialException('Invalid device URL')
sernum = None
idx = 0
if plcomps[2]:
try:
idx = to_int(plcomps[2])
if idx > 255:
idx = 0
raise ValueError
if idx:
idx -= 1
except ValueError:
sernum = plcomps[2]
try:
self.udev = devclass()
if not vendor or not product or sernum or idx:
# Need to enumerate USB devices to find a matching device
vendors = vendor and [vendor] or \
set(vdict.values())
vps = set()
for v in vendors:
products = pdict.get(v, [])
for p in products:
vps.add((v, products[p]))
devices = devclass.find_all(vps)
candidates = []
if sernum:
if sernum not in [dev[2] for dev in devices]:
raise SerialException("No USB device with S/N %s" % \
sernum)
for v, p, s, i, d in devices:
if s != sernum:
continue
if vendor and vendor != v:
continue
if product and product != p:
continue
candidates.append((v, p, s, i, d))
else:
for v, p, s, i, d in devices:
if vendor and vendor != v:
continue
if product and product != p:
continue
candidates.append((v, p, s, i, d))
if not show_devices:
try:
vendor, product, ifport, ifcount, description = \
candidates[idx]
except IndexError:
raise SerialException("No USB device #%d" % idx)
if show_devices:
self.show_devices(scheme, vdict, pdict, candidates)
raise SystemExit(candidates and \
'Please specify the USB device' or \
'No USB-Serial device has been detected')
if vendor not in pdict:
raise SerialException('Vendor ID 0x%04x not supported' % \
vendor)
if product not in pdict[vendor].values():
raise SerialException('Product ID 0x%04x not supported' % \
product)
self.udev.open(vendor, product, interface, idx, sernum)
except IOError:
raise SerialException('Unable to open USB port %s' % self.portstr)
self._isOpen = True
self._reconfigurePort()
self._product = product
def close(self):
self._isOpen = False
self.udev.close()
self.udev = None
def read(self, size=1):
"""Read size bytes from the serial port. If a timeout is set it may
return less characters as requested. With no timeout it will block
until the requested number of bytes is read."""
data = ''
start = time.time()
while size > 0:
buf = self.udev.read_data(size)
data += buf
size -= len(buf)
if self._timeout > 0:
if buf:
break
ms = time.time()-start
if ms > self._timeout:
break
time.sleep(0.01)
return data
def write(self, data):
"""Output the given string over the serial port."""
self.udev.write_data(data)
def flush(self):
"""Flush of file like objects. In this case, wait until all data
is written."""
# do nothing
def flushInput(self):
"""Clear input buffer, discarding all that is in the buffer."""
self.udev.purge_rx_buffer()
def flushOutput(self):
"""Clear output buffer, aborting the current output and
discarding all that is in the buffer."""
self.udev.purge_tx_buffer()
def sendBreak(self):
"""Send break condition."""
# Not supported
pass
def setRTS(self, level=True):
"""Set terminal status line: Request To Send"""
self.udev.set_rts(level)
def setDTR(self, level=True):
"""Set terminal status line: Data Terminal Ready"""
self.udev.set_dtr(level)
def getCTS(self):
"""Read terminal status line: Clear To Send"""
return self.udev.get_cts()
def getDSR(self):
"""Read terminal status line: Data Set Ready"""
return self.udev.get_dsr()
def getRI(self):
"""Read terminal status line: Ring Indicator"""
return self.udev.get_ri()
def getCD(self):
"""Read terminal status line: Carrier Detect"""
return self.udev.get_cd()
def inWaiting(self):
"""Return the number of characters currently in the input buffer."""
# not implemented
return 0
@property
def fifoSizes(self):
"""Return the (TX, RX) tupple of hardware FIFO sizes"""
return self.udev.fifo_sizes
def _reconfigurePort(self):
try:
self.udev.set_baudrate(self._baudrate)
self.udev.set_line_property(self._bytesize,
self._stopbits,
self._parity)
if self._rtscts:
self.udev.set_flowctrl('hw')
elif self._xonxoff:
self.udev.set_flowctrl('sw')
else:
self.udev.set_flowctrl('')
try:
self.udev.set_dynamic_latency(2, 200, 400)
except AttributeError:
# backend does not support this feature
pass
except IOError, e:
from serial import SerialException
err = self.udev.get_error_string()
raise SerialException("%s (%s)" % (str(e), err))
@staticmethod
def show_devices(scheme, vdict, pdict, candidates, out=None):
from string import printable as printablechars
if not out:
import sys
out = sys.stdout
indices = {}
interfaces = []
for (v, p, s, i, d) in candidates:
ikey = (v, p)
indices[ikey] = indices.get(ikey, 0) + 1
# try to find a matching string for the current vendor
vendors = []
# fallback if no matching string for the current vendor is found
vendor = '%04x' % v
for vc in vdict:
if vdict[vc] == v:
vendors.append(vc)
if vendors:
vendors.sort(key=len)
vendor = vendors[0]
# try to find a matching string for the current vendor
# fallback if no matching string for the current product is found
product = '%04x' % p
try:
products = []
productids = pdict[v]
for pc in productids:
if productids[pc] == p:
products.append(pc)
if products:
products.sort(key=len)
product = products[0]
except KeyError:
pass
# if the serial number is an ASCII char, use it, or use the index
# value
if [c for c in s if c not in printablechars or c == '?']:
serial = '%d' % indices[ikey]
else:
serial = s
# Now print out the prettiest URL syntax
for j in range(1, i+1):
# On most configurations, low interfaces are used for MPSSE,
# high interfaces are dedicated to UARTs
interfaces.append((scheme, vendor, product, serial, j, d))
if interfaces:
print >> out, "Available interfaces:"
for scheme, vendor, product, serial, j, d in interfaces:
if d:
desc = ' (%s)' % d
print >> out, ' %s%s:%s:%s/%d%s' % \
(scheme, vendor, product, serial, j, desc)
print >> out, ''
<file_sep>/navigation/gps/gps_tools.py
from math import *
def distance_and_bearings(lat1, lon1, lat2, lon2, start_altitude=0, dest_altitude=0):
#GPS distance and bearing between two GPS points (Python recipe)
#This code outputs the distance between 2 GPS points showing also the vertical and horizontal bearing between them.
    #Haversine Formula to find vertical angle and distance
lon1, lat1, lon2, lat2 = map(radians, [lon1, lat1, lon2, lat2])
dlon = lon2 - lon1
dlat = lat2 - lat1
a = sin(dlat/2)**2 + cos(lat1) * cos(lat2) * sin(dlon/2)**2
c = 2 * atan2(sqrt(a), sqrt(1-a))
    #convert to meters: multiply by the Earth's mean radius (6367442.5 m)
    Base = 6367442.5 * c
Bearing = calcBearing(lat1, lon1, lat2, lon2)
Bearing = round(degrees(Bearing), 2)
    Caltitude = dest_altitude - start_altitude
    #straight-line (slant) distance between the two points
    distance = sqrt(Base**2 + Caltitude**2)
    #vertical bearing (elevation angle), converted from radians to degrees
    a = Caltitude/Base
    b = atan(a)
    c = degrees(b)
#Convert meters into Kilometers
#distance = distance / 1000
Base = round(Base,2)
return Base, distance, c, Bearing
#Horizontal Bearing
def calcBearing(lat1, lon1, lat2, lon2):
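    # initial great-circle bearing (forward azimuth) from point 1 to point 2, in radians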
dLon = lon2 - lon1
y = sin(dLon) * cos(lat2)
x = cos(lat1) * sin(lat2) \
- sin(lat1) * cos(lat2) * cos(dLon)
return atan2(y, x)
#Two Example GPS Locations
lat1 = 53.32055555555556
lat2 = 53.3206617415777
lon1 = -1.7297222222222221
lon2 = -1.7296909524757125
start_altitude = 0
dest_altitude = 10
H_Dist, distance, c, H_Bearing = distance_and_bearings(lat1, lon1, lat2, lon2, start_altitude, dest_altitude)
#Output the data
print("---------------------------------------")
print(":::::Auto Aim Directional Anntenna:::::")
print("---------------------------------------")
print("Horizontial Distance:", H_Dist,"meters")
print(" Vertical Distance:", distance,"km")
print(" Vertical Bearing:",c)
print(" Horizontial Bearing:",H_Bearing)
print("---------------------------------------")
input("Press <enter> to Exit")
<file_sep>/gui/breezy/mousedemo3.py
"""
File: mousedemo3.py
Author: <NAME>
"""
from breezypythongui import EasyFrame, EasyCanvas
import random
class MouseDemo(EasyFrame):
"""Draws ovals in random colors or erases them."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Mouse Demo 3")
# Canvas
self.shapeCanvas = ShapeCanvas(self)
self.addCanvas(self.shapeCanvas, row = 0, column = 0,
columnspan = 2, width = 300, height = 150)
# Radio buttons
self.commandGroup = self.addRadiobuttonGroup(row = 1,
column = 0,
rowspan = 2,
orient = "horizontal")
defaultRB = self.commandGroup.addRadiobutton(text = "Draw",
command = self.chooseCommand)
self.commandGroup.setSelectedButton(defaultRB)
self.commandGroup.addRadiobutton(text = "Erase",
command = self.chooseCommand)
# Event handling method in the main window class
def chooseCommand(self):
"""Responds to a radio button click by updating the
shape canvas with the selected command."""
command = self.commandGroup.getSelectedButton()["text"]
self.shapeCanvas.setCommand(command)
class ShapeCanvas(EasyCanvas):
"""Draw an oval with a press, drag, and release of the mouse, when
Draw is selected. Erase with a mouse press when Erase is selected."""
def __init__(self, parent):
"""Sets up the canvas."""
EasyCanvas.__init__(self, parent, background = "gray")
self.command = "Draw"
self.items = list()
def setCommand(self, command):
"""Resets the command."""
self.command = command
def mousePressed(self, event):
"""Sets the first corner of the oval's bounding rectangle.
Removes the oval if Erase in on."""
self.x = event.x
self.y = event.y
if self.command == "Erase":
itemId = self.findItemId(self.x, self.y)
if itemId:
self.items.remove(itemId)
self.delete(itemId)
def mouseReleased(self, event):
"""Sets the second corner of the oval's bounding rectangle.
Draws an oval filled in a random color and saves its item ID."""
if self.command == "Draw":
if self.x != event.x and self.y != event.y:
color = self.getRandomColor()
itemId = self.drawOval(self.x, self.y,
event.x, event.y, fill = color)
self.items.append(itemId)
def findItemId(self, x, y):
"""If the coordinates are in a shape's bounding rectangle,
returns its item ID; otherwise, returns None."""
for itemId in self.items:
coords = self.coords(itemId)
if self.containsPoint(coords, x, y):
return itemId
return None
def containsPoint(self, coords, x, y):
"""Returns True if (x, y) is contained in the rectangle
defined by coords, or False otherwise."""
[x0, y0, x1, y1] = coords
return x >= min(x0, x1) and x <= max(x0, x1) \
and y >= min(y0, y1) and y <= max (y0, y1)
def getRandomColor(self):
"""Returns a random RGB color."""
hexR = hex(random.randint(0, 255))[2:]
hexG = hex(random.randint(0, 255))[2:]
hexB = hex(random.randint(0, 255))[2:]
if len(hexR) == 1: hexR = "0" + hexR
if len(hexG) == 1: hexG = "0" + hexG
if len(hexB) == 1: hexB = "0" + hexB
return "#" + hexR + hexG + hexB
# Instantiate and pop up the window.
if __name__ == "__main__":
MouseDemo().mainloop()
<file_sep>/gui/breezy/mousedemo1.py
"""
File: mousedemo1.py
Author: <NAME>
"""
from breezypythongui import EasyFrame, EasyCanvas
class MouseDemo(EasyFrame):
"""Draws coordinates of mouse presses on a canvas."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Mouse Demo 1")
# Canvas
self.shapeCanvas = ShapeCanvas(self)
self.addCanvas(self.shapeCanvas, row = 0, column = 0,
width = 300, height = 150)
class ShapeCanvas(EasyCanvas):
"""Displays the coordinates of the point where the mouse is pressed."""
def __init__(self, parent):
"""Background is gray."""
EasyCanvas.__init__(self, parent, background = "gray")
def mousePressed(self, event):
"""Draws the coordinates of the mouse press."""
coordString = "(" + str(event.x) + "," + str(event.y) + ")"
self.drawText(coordString, event.x, event.y)
# Instantiate and pop up the window.
if __name__ == "__main__":
MouseDemo().mainloop()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_measures.py
import milk.measures.measures
import numpy as np
import numpy
from milk.measures import accuracy, waccuracy, bayesian_significance
def test_100():
C=numpy.zeros((2,2))
C[0,0]=100
C[1,1]=50
assert accuracy(C) == 1.
assert waccuracy(C) == 1.
def test_0():
C = numpy.array([
[0, 10],
[10, 0]
])
assert waccuracy(C) == 0.
assert accuracy(C) == 0.
def test_50():
C = numpy.array([
[10, 10],
[10, 10]
])
assert accuracy(C) == .5
assert waccuracy(C) == .5
def test_unbalanced():
C = numpy.array([
[20, 10],
[10, 0]
])
assert accuracy(C) == .5
assert waccuracy(C) == 1./3
def test_confusion_matrix():
np.random.seed(323)
labels0 = np.arange(101)%3
labels1 = (labels0 + np.random.rand(101)*2).astype(np.int) % 3
cmat = milk.measures.measures.confusion_matrix(labels0, labels1)
for i in xrange(3):
for j in xrange(3):
assert cmat[i,j] == np.sum( (labels0 == i) & (labels1 == j) )
def test_significance():
assert np.allclose(.5, [bayesian_significance(1024,i,i) for i in xrange(0, 1025, 3)])
<file_sep>/arduino/comtest/comtest.ino
int incomingByte = 0; // for incoming serial data
int var = 0;
int sum = 0;
void setup() {
Serial.begin(9600); // opens serial port, sets data rate to 9600 bps
establishContact(); // send a byte to establish contact until receiver responds
}
void loop() {
// send data only when you receive data:
if (Serial.available() > 0) {
// read the incoming byte:
incomingByte = Serial.read();
// say what you got:
Serial.print("I received: ");
Serial.println(incomingByte, DEC);
while(var < 100){
      // do something repetitive 100 times
var++;
Serial.print("Hey Andrea Count: ");
Serial.println(var, DEC);
delay(10);
//sum = sum + var^2;
//Serial.print("Sum: ");
//Serial.println(sum, DEC);
}
var = 0;
}
}
void establishContact() {
while (Serial.available() <= 0) {
Serial.print('A'); // send a capital A
delay(1000);
}
}
<file_sep>/segmentation/pil_seg1.py
from PIL import Image
import pymeanshift as pms
import numpy
def PIL2array(img):
return numpy.array(img.getdata(),
numpy.uint8).reshape(img.size[1], img.size[0], 3)
def array2PIL(arr, size):
mode = 'RGBA'
arr = arr.reshape(arr.shape[0]*arr.shape[1], arr.shape[2])
if len(arr[0]) == 3:
arr = numpy.c_[arr, 255*numpy.ones((len(arr),1), numpy.uint8)]
return Image.frombuffer(mode, size, arr.tostring(), 'raw', mode, 0, 1)
def segment_img(original_image, spatial_radius=5, range_radius=5, min_density=60):
    (segmented_image, labels_image, number_regions) = pms.segment(original_image, spatial_radius,
                                                                  range_radius, min_density)
    return (segmented_image, labels_image, number_regions)
original_image = Image.open("/home/lforet/images/class1/1.grass3.jpg")
(segmented_image, labels_image, number_regions) = pms.segment(original_image, spatial_radius=10,
range_radius=10, min_density=60)
original_image.show()
array2PIL(segmented_image, original_image.size).show()
<file_sep>/main/train_terrain.py~
#!/usr/bin/env python
#!/usr/bin/python
import sys
sys.path.append( "../lib/" )
import easygui as eg
from img_processing_tools import *
#from PIL import Image
from PIL import ImageStat, Image, ImageDraw
import cv, cv2
import time
import mahotas
import numpy as np
import pickle
import csv
import milk
from threading import *
def snap_shot():
#capture from camera at location 0
now = time.time()
webcam1 = None
try:
while webcam1 == None:
webcam1 = cv2.VideoCapture(1)
time.sleep(.1)
#webcam1 = cv.CreateCameraCapture(1)
except:
print "******* Could not open WEBCAM *******"
print "Unexpected error:", sys.exc_info()[0]
#raise
#sys.exit(-1)
try:
#have to capture a few frames as it buffers a few frames..
for i in range (5):
ret, img = webcam1.read()
#img = cv.QueryFrame(webcam1)
#print "time to capture 5 frames:", (time.time()) - now
#cv2.imwrite(filename, img)
#img1 = Image.open(filename)
#img1.thumbnail((320,240))
#img1.save(filename)
#print (time.time()) - now
webcam1.release()
return img
except:
print "could not grab webcam"
def find_features(img):
#print type(img), img.size, img.shape
#gray scale the image if neccessary
#if img.shape[2] == 3:
# img = img.mean(2)
#img = mahotas.imread(imname, as_grey=True)
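    #Haralick texture features (grey-level co-occurrence statistics), averaged over the directions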
features = mahotas.features.haralick(img).mean(0)
#features = mahotas.features.lbp(img, 1, 8)
#features = mahotas.features.tas(img)
#features = mahotas.features.zernike_moments(img, 2, degree=8)
print 'features:', features, len(features), type(features[0])
return features
def classify(model, features):
return model.apply(features)
def grab_frame_from_video(video):
frame = video.read()
return frame
def predict_class(img):
try:
model = pickle.load( open( "robomow_ai_model.mdl", "rb" ) )
features = find_features(img)
classID = classify(model, features)
if classID == 1: answer = "Mowable"
if classID == 2: answer = "Non-Mowable"
print "predicted classID:", answer
eg.msgbox("predicted classID:"+answer)
return classID
except:
print "could not predict...bad data"
def save_data(features, classID):
data_filename = 'robomow_feature_data.csv'
###########################
print 'writing image features to file: ', data_filename
# delete data file and write header
#f_handle = open(data_filename, 'w')
#f_handle.write(str("classid, lbp, i3_histogram, rgb_histogram, sum_I3, sum2_I3, median_I3, avg_I3, var_I3, stddev_I3, rms_I3"))
#f_handle.write('\n')
#f_handle.close()
#write class data to file
f_handle = open(data_filename, 'a')
f_handle.write(str(classID))
f_handle.write(', ')
f_handle.close()
f_handle = open(data_filename, 'a')
for i in range(len(features)):
f_handle.write(str(features[i]))
f_handle.write(" ")
f_handle.write('\n')
f_handle.close()
def train_ai():
data = []
classID = []
features = []
features_temp_array = []
try:
data_filename = 'robomow_feature_data.csv'
print 'readind features and classID: ', data_filename
f_handle = open(data_filename, 'r')
reader = csv.reader(f_handle)
#read data from file into arrays
for row in reader:
data.append(row)
for row in range(0, len(data)):
#print features[row][1]
classID.append(int(data[row][0]))
features_temp_array.append(data[row][1].split(" "))
#removes ending element which is a space
for x in range(len(features_temp_array)):
features_temp_array[x].pop()
features_temp_array[x].pop(0)
#convert all strings in array to numbers
temp_array = []
for x in range(len(features_temp_array)):
temp_array = [ float(s) for s in features_temp_array[x] ]
features.append(temp_array)
#make numpy arrays
#features = np.asarray(features)
print classID, features
learner = milk.defaultclassifier()
model = learner.train(features, classID)
pickle.dump( model, open( "robomow_ai_model.mdl", "wb" ) )
except:
print "could not retrain.. bad file"
return
def subsection_image(pil_img, sections, visual):
sections = sections / 4
#print "sections= ", sections
fingerprint = []
# - -----accepts image and number of sections to divide the image into (resolution of fingerprint)
# ---------- returns a subsectioned image classified by terrain type
img_width, img_height = pil_img.size
#print "image size = img_wdith= ", img_width, " img_height=", img_height, " sections=", sections
#cv.DestroyAllWindows()
#time.sleep(2)
if visual == True:
cv_original_img1 = PILtoCV(pil_img,3)
#cv.NamedWindow('Original', cv.CV_WINDOW_AUTOSIZE)
cv.ShowImage("Original",cv_original_img1 )
#cv_original_img1_ary = np.array(PIL2array(pil_img))
#print cv_original_img1_ary
#cv2.imshow("Original",cv_original_img1_ary)
cv.MoveWindow("Original", ((img_width)+100),50)
#pil_img = rgb2I3 (pil_img)
#cv.WaitKey()
#cv.DestroyWindow("Original")
temp_img = pil_img.copy()
xsegs = img_width / sections
ysegs = img_height / sections
#print "xsegs, ysegs = ", xsegs, ysegs
for yy in range(0, img_height-ysegs+1 , ysegs):
for xx in range(0, img_width-xsegs+1, xsegs):
#print "Processing section =", xx, yy, xx+xsegs, yy+ysegs
box = (xx, yy, xx+xsegs, yy+ysegs)
#print "box = ", box
cropped_img1 = pil_img.crop(box)
I3_mean = ImageStat.Stat(cropped_img1).mean
I3_mean_rgb = (int(I3_mean[0]), int(I3_mean[1]), int(I3_mean[2]))
print "I3_mean: ", I3_mean
sub_ID = predict_class(image2array(cropped_img1))
print "sub_ID:", sub_ID
#fingerprint.append(sub_ID)
if visual == True:
cv_cropped_img1 = PILtoCV(cropped_img1,3)
cv.ShowImage("Fingerprint",cv_cropped_img1 )
cv.MoveWindow("Fingerprint", (img_width+100),50)
if sub_ID == 1: I3_mean_rgb = (50,150,50)
if sub_ID == 2: I3_mean_rgb = (150,150,150)
if sub_ID == 3: I3_mean_rgb = (0,0,200)
ImageDraw.Draw(pil_img).rectangle(box, (I3_mean_rgb))
cv_img = PILtoCV(pil_img,3)
cv.ShowImage("Image",cv_img)
cv.MoveWindow("Image", 50,50)
cv.WaitKey(20)
time.sleep(.1)
#print xx*yy
#time.sleep(.05)
#cv.DestroyAllWindows()
cv.DestroyWindow("Fingerprint")
cv.WaitKey(100)
cv.DestroyWindow("Image")
cv.WaitKey(100)
cv.DestroyWindow("Original")
cv.WaitKey(100)
cv.DestroyWindow("Image")
cv.WaitKey()
time.sleep(2)
#print "FINGERPRINT: ", fingerprint
#cv.WaitKey()
#return fingerprint
return 9
if __name__=="__main__":
print "********************************************************************"
print "* if 1 argument: video file to process otherwise uses webcam *"
print "********************************************************************"
video = None
webcam1 = None
img1 = None
if len(sys.argv) > 1:
try:
video = cv2.VideoCapture(sys.argv[1])
print video, sys.argv[1]
except:
print "******* Could not open image/video file *******"
print "Unexpected error:", sys.exc_info()[0]
#raise
sys.exit(-1)
reply =""
#eg.rootWindowPosition = "+100+100"
while True:
#eg.rootWindowPosition = eg.rootWindowPosition
print 'reply=', reply
#if reply == "": reply = "Next Frame"
if reply == "Mowable":
classID = "1"
if img1 != None:
features = find_features(img1)
save_data(features, classID)
else:
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = snap_shot()
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Non-Mowable":
classID = "2"
if img1 != None:
features = find_features(img1)
save_data(features, classID)
else:
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = snap_shot()
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Quit":
print "Quitting...."
sys.exit(-1)
if reply == "Predict":
print "AI predicting"
if img1 != None:
predict_class(img1)
else:
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = snap_shot()
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Subsection":
img1 = Image.open('temp.png')
print img1
xx = subsection_image(img1, 16,True)
print xx
#while (xx != 9):
# time.sleep(1)
if reply == "Retrain AI":
print "Retraining AI"
train_ai()
if reply == "Next Frame":
print "Acquiring new image.."
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = snap_shot()
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
#print type(img1)
#img1.save()
if reply == "Fwd 10 Frames":
print "Forward 10 frames..."
for i in range(10):
img1 = np.array(grab_frame_from_video(video)[1])
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Del AI File":
data_filename = 'robomow_feature_data.csv'
f_handle = open(data_filename, 'w')
f_handle.write('')
f_handle.close()
if reply == "Test Img":
#im = Image.fromarray(image)
#im.save("new.png")
img1 = Image.open('temp.png')
#img1.thumbnail((320,240))
path = "../../../../mobot_data/images/test_images/"
filename = str(time.time()) + ".jpg"
img1.save(path+filename)
try:
print "trying"
if video != None:
reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'Test Img', 'Next Frame', 'Fwd 10 Frames', 'Predict', 'Subsection', 'Retrain AI' , 'Del AI File', 'Quit'), image='temp.png', root=None)
else:
reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'Test Img','Next Frame', 'Predict', 'Retrain AI' , 'Del AI File', 'Quit'), image='temp.png', root=None)
except:
pass
<file_sep>/navigation/gps/point_within_polygon.py
#!/usr/bin/env python
# determine if a point is inside a given polygon or not
# Polygon is a list of (x,y) pairs.
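# Uses the ray-casting method: count how many polygon edges a horizontal ray
# from (x, y) crosses; an odd number of crossings means the point is inside.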
def point_inside_polygon(x,y,poly):
n = len(poly)
inside =False
p1x,p1y = poly[0]
for i in range(n+1):
p2x,p2y = poly[i % n]
print p2x,p2y
if y > min(p1y,p2y):
if y <= max(p1y,p2y):
if x <= max(p1x,p2x):
if p1y != p2y:
xinters = (y-p1y)*(p2x-p1x)/(p2y-p1y)+p1x
if p1x == p2x or x <= xinters:
inside = not inside
p1x,p1y = p2x,p2y
return inside
if __name__=="__main__":
import sys
#polygon = ([0,0],[5,0],[5,5],[0,5])
polygon = [(33.495296, -86.797129), (33.495273, -86.797107), (33.495276, -86.797016), (33.495354, -86.796978), (33.49537, -86.797029)]
#33.4952309 -86.7971662 should = false (outside polygon)
#33.4952980 -86.7970562 should = True (within polygon)
print polygon, float(sys.argv[1]), float(sys.argv[2])
print point_inside_polygon(float(sys.argv[1]),float(sys.argv[2]),polygon)
<file_sep>/gui/breezy/canvasdemo1.py
"""
File: canvasdemo1.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
import random
class CanvasDemo(EasyFrame):
"""Draws filled ovals on a canvas."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Canvas Demo 1")
self.colors = ("blue", "green", "red", "yellow")
# Canvas
self.canvas = self.addCanvas(row = 0, column = 0,
width = 300, height = 150,
background = "gray")
# Command button
self.ovalButton = self.addButton(text = "Draw oval",
row = 1, column = 0,
command = self.drawOval)
# Event handling method
def drawOval(self):
"""Draws a filled oval at a random position."""
x = random.randint(0, 300)
y = random.randint(0, 150)
color = random.choice(self.colors)
self.canvas.drawOval(x, y, x + 25, y + 25, fill = color)
# Instantiate and pop up the window.
if __name__ == "__main__":
CanvasDemo().mainloop()
<file_sep>/gui/web_gui/mysite/mysite/myapp/templates/views.py
from datetime import datetime
from django.shortcuts import render
def home(request):
return render(request, 'home.html', {'right_now':datetime.utcnow()})
<file_sep>/ftdi/build/pyftdi/pyftdi/serialext/darwinext.py
# Copyright (c) 2008-2011, Neotion
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the Neotion nor the names of its contributors may
# be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL NEOTION BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
# OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
import array, fcntl
import serial
__all__ = ['SerialDarwin']
class SerialDarwin(object):
"""Serial port implementation dedicated to the Darwin kernel (Mac OS X)
It allows to circumvent the POSIX/termios current limitation of the
baudrate to 230400bps.
It is a true hack of the original pyserial implementation, and will
hopefully be deprecated once pyserial implement a better serial port
support for Apple computer.
It supports custom baudrate with no defined upper bound, and has been
tested with baudrates up to 3Mbps. It does not improve communication
speed on Mac OS 10.3 and below.
"""
IOSSIOSPEED = 0x80045402 #_IOW('T', 2, speed_t)
def _reconfigurePort(port):
"""Augment serial.Serial._reconfigurePort w/ IOKit call"""
try:
serial.Serial._reconfigurePort(port)
except AttributeError:
# use IOKit-specific call to set up high speeds
buf = array.array('i', [int(port._baudrate)])
fcntl.ioctl(port.fd, SerialDarwin.IOSSIOSPEED, buf, 1)
except:
raise AssertionError('Cannot reconfigure Darwin serial port')
<file_sep>/sonar/sonar_test.py
#!/usr/bin/env python
from maxsonar_class import *
import time
import re
sensor1 = MaxSonar()
while 1:
data = str(sensor1.distances_cm())
if len(data) > 1:
print "data=", data
#s1_data = re.search('s1', data)
#print s1_data.span()
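        # the reading for each sensor sits between successive "sN:" tags, so slice it out by position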
s1_data = data[(data.find('s1:')+3):(data.find('s2:'))]
s2_data = data[(data.find('s2:')+3):(data.find('s3:'))]
s3_data = data[(data.find('s3:')+3):(data.find('s4:'))]
s4_data = data[(data.find('s4:')+3):(data.find('s5:'))]
s5_data = data[(data.find('s5:')+3):(len(data)-1)]
print s1_data, s2_data, s3_data, s4_data, s5_data
time.sleep(.1)
<file_sep>/gui/breezy/calculatordemo.py
"""
File: calculatordemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
class CalculatorDemo(EasyFrame):
"""Illustrates command buttons and user events."""
def __init__(self):
"""Sets up the window, label, and buttons."""
EasyFrame.__init__(self, "Calculator")
self.digits = self.addLabel("", row = 0, column = 0,
columnspan = 3, sticky = "NSEW")
digit = 9
for row in range(1, 4):
for column in range(0, 3):
button = self.addButton(str(digit), row, column)
button["command"] = self.makeCommand(button)
digit -= 1
# Event handling method
def makeCommand(self, button):
"""Define and return the event handler for the button."""
def addDigit():
self.digits["text"] = self.digits["text"] + button["text"]
return addDigit
# Instantiates and pops up the window.
if __name__ == "__main__":
CalculatorDemo().mainloop()
<file_sep>/navigation/spatial/robomow_spatial_class.py~
#!/usr/bin/env python
import serial
import time
from struct import *
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.Phidget import Phidget
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import SpatialDataEventArgs, AttachEventArgs, DetachEventArgs, ErrorEventArgs
from Phidgets.Devices.Spatial import Spatial, SpatialEventData, TimeSpan
def log(*msgline):
for msg in msgline:
print msg,
print
class robomow_spatial(object):
def __init__(self):
############################
# lets introduce and init the main variables
#self.com = None
self.isInitialized = False
        #Create an accelerometer (Spatial) object and keep it on the instance
        try:
            self.spatial = Spatial()
        except RuntimeError as e:
            print("Runtime Exception: %s" % e.details)
            print("Exiting....")
            exit(1)
        try:
            self.spatial.openPhidget()
        except PhidgetException as e:
            print("Phidget Exception %i: %s" % (e.code, e.details))
            print("Exiting....")
            exit(1)
        print("Waiting for attach....")
        try:
            self.spatial.waitForAttach(4000)
            print "is attached = ", self.spatial.isAttached()
        except PhidgetException as e:
            print("Phidget Exception %i: %s" % (e.code, e.details))
            try:
                self.spatial.closePhidget()
            except PhidgetException as e:
                print("Phidget Exception %i: %s" % (e.code, e.details))
            print("Exiting....")
            exit(1)
        else:
            self.spatial.setDataRate(1000)
            self.isInitialized = True
<file_sep>/gui/breezy/filedialogdemo.py
"""
File: filedialogdemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
import tkinter.filedialog
class FileDialogDemo(EasyFrame):
"""Demonstrates a file dialog."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, "File Dialog Demo")
self.outputArea = self.addTextArea("", row = 0, column = 0,
columnspan = 2,
width = 80, height = 15)
self.addButton(text = "Open", row = 1, column = 0,
command = self.openFile)
self.addButton(text = "Save As...", row = 1, column = 1,
command = self.saveFileAs)
# Event handling methods
def openFile(self):
"""Pops up an open file dialog, and if a file is selected,
displays its text in the text area."""
filetypes = [("Python files", "*.py"), ("Text files", "*.txt")]
fileName = tkinter.filedialog.askopenfilename(parent = self,
filetypes = filetypes)
if fileName != "":
file = open(fileName, "r")
text = file.read()
file.close()
self.outputArea.setText(text)
self.setTitle(fileName)
def saveFileAs(self):
"""Pops up a save file dialog, and if a file is selected,
saves the contents of the text area to the file."""
fileName = tkinter.filedialog.asksaveasfilename(parent = self)
if fileName != "":
text = self.outputArea.getText()
file = open(fileName, "w")
file.write(text)
file.close()
self.setTitle(fileName)
if __name__ == "__main__":
FileDialogDemo().mainloop()
<file_sep>/camera/capcamopencv.py
#!/usr/bin/python
import sys, time
from opencv import cv
from opencv import highgui
hmin = 4
hmax = 18
highgui.cvNamedWindow('Camera', highgui.CV_WINDOW_AUTOSIZE)
#highgui.cvNamedWindow('Hue', highgui.CV_WINDOW_AUTOSIZE)
#highgui.cvCreateTrackbar("hmin Trackbar","Hue",hmin,180, change_hmin);
#highgui.cvCreateTrackbar("hmax Trackbar","Hue",hmax,180, change_hmax);
print "grabbing camera"
capture = highgui.cvCreateCameraCapture(0)
print "found camera"
time.sleep(1)
frame = highgui.cvQueryFrame(capture)
frameSize = cv.cvGetSize(frame)
print "frameSize =", frameSize
time.sleep(1)
cam_width = highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_WIDTH)
cam_height = highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_HEIGHT)
print "camers cam_height =", cam_height
print "camers cam_width =", cam_width
highgui.cvSetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_WIDTH, 320)
highgui.cvSetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_HEIGHT, 240)
time.sleep(1)
cam_width = highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_WIDTH)
cam_height = highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_HEIGHT)
print "camers cam_height =", cam_height
print "camers cam_width =", cam_width
print highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_FPS)
print highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_BRIGHTNESS)
#print highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_CONTRAST )
print highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_SATURATION )
print highgui.cvGetCaptureProperty(capture,highgui.CV_CAP_PROP_HUE )
hue = cv.cvCreateImage(frameSize,8,1)
print frameSize
while 1:
frame = highgui.cvQueryFrame(capture)
#cv.cvCvtColor(frame, hsv, cv.CV_BGR2HSV)
#cv.cvInRangeS(hsv,hsv_min,hsv_max,mask)
#cv.cvSplit(hsv,hue,satuation,value,None)
#cv.cvInRangeS(hue,hmin,hmax,hue)
highgui.cvShowImage('Camera',frame)
#highgui.cvShowImage('Hue',hue)
highgui.cvWaitKey(10)
"""
double cvGetCaptureProperty(CvCapture* capture, int property_id)
Gets video capturing properties.
Parameters:
* capture-video capturing structure.
* property-id
Property identifier. Can be one of the following:
* CV_CAP_PROP_POS_MSEC - Film current position in milliseconds or video capture timestamp
o CV_CAP_PROP_POS_FRAMES - 0-based index of the frame to be decoded/captured next
o CV_CAP_PROP_POS_AVI_RATIO - Relative position of the video file (0 - start of the film, 1 - end of the film)
o CV_CAP_PROP_FRAME_WIDTH - Width of the frames in the video stream
o CV_CAP_PROP_FRAME_HEIGHT - Height of the frames in the video stream
o CV_CAP_PROP_FPS - Frame rate
o CV_CAP_PROP_FOURCC - 4-character code of codec
o CV_CAP_PROP_FRAME_COUNT - Number of frames in the video file
o CV_CAP_PROP_BRIGHTNESS - Brightness of the image (only for cameras)
o CV_CAP_PROP_CONTRAST - Contrast of the image (only for cameras)
o CV_CAP_PROP_SATURATION - Saturation of the image (only for cameras)
o CV_CAP_PROP_HUE - Hue of the image (only for cameras)
"""
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/parzen.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
def get_parzen_rbf_loocv(features,labels):
xij = np.dot(features,features.T)
f2 = np.sum(features**2,1)
d = f2-2*xij
d = d.T + f2
d_argsorted = d.argsort(1)
d_sorted = d.copy()
d_sorted.sort(1)
e_d = np.exp(-d_sorted)
labels_sorted = labels[d_argsorted].astype(np.double)
labels_sorted *= 2
labels_sorted -= 1
def f(sigma):
k = e_d ** (1./sigma)
return (((k[:,1:] * labels_sorted[:,1:]).sum(1) > 0) == labels).mean()
return f
<file_sep>/arduino/SabertoothSimplified/examples/SoftwareSerial/SabertoothSimplified.h
/*
Arduino Library for Sabertooth Simplified Serial
Copyright (c) 2012 Dimension Engineering LLC
http://www.dimensionengineering.com/arduino
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER
RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT,
NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE
USE OR PERFORMANCE OF THIS SOFTWARE.
*/
#ifndef SabertoothSimplified_h
#define SabertoothSimplified_h
#include <Arduino.h>
class SabertoothSimplified
{
public:
SabertoothSimplified();
SabertoothSimplified(Stream& port);
public:
void motor(int power);
void motor(byte motor, int power);
void drive(int power);
void turn(int power);
void stop();
private:
void mixedMode(boolean enable);
void mixedUpdate();
void raw(byte motor, int power);
private:
boolean _mixed;
int _mixedDrive, _mixedTurn;
boolean _mixedDriveSet, _mixedTurnSet;
Stream& _port;
};
#endif
<file_sep>/PhidgetsPython/Python/FrequencyCounter-simple.py
#! /usr/bin/python
"""Copyright 2011 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__="<NAME>"
__version__="2.1.8"
__date__ ="14-Jan-2011 10:55:59 AM"
#Basic imports
import sys
from time import sleep
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetException
from Phidgets.Devices.FrequencyCounter import FrequencyCounter, FilterType
#Create a FrequencyCounter object
try:
freqCount = FrequencyCounter()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def displayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (freqCount.isAttached(), freqCount.getDeviceName(), freqCount.getSerialNum(), freqCount.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
print("Number of frequency inputs: %i" % (freqCount.getFrequencyInputCount()))
#Event Handler Callback Functions
def FrequencyCounterAttached(e):
attached = e.device
print("Frequency Counter %i Attached!" % (attached.getSerialNum()))
def FrequencyCounterDetached(e):
detached = e.device
print("Frequency Counter %i Detached!" % (detached.getSerialNum()))
def FrequencyCounterError(e):
try:
source = e.device
print("Frequency Counter %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
def FrequencyCount(e):
source = e.device
print("Frequency Counter %i: Count Detected -- Index %i: Time %i -- Counts %i" % (source.getSerialNum(), e.index, e.time, e.counts))
#Main Program Code
try:
freqCount.setOnAttachHandler(FrequencyCounterAttached)
freqCount.setOnDetachHandler(FrequencyCounterDetached)
freqCount.setOnErrorhandler(FrequencyCounterError)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
freqCount.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
freqCount.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
freqCount.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
displayDeviceInfo()
try:
print("Set timeout to be 10seconds (100,000 microseconds)...")
freqCount.setTimeout(0, 100000)
sleep(5)
print("Set filter type to LOGIC_LEVEL...")
freqCount.setFilter(0, FilterType.FILTERTYPE_LOGIC_LEVEL)
print("Poll device for data")
print("Enabling frequency input channel 0...")
freqCount.setEnabled(0, True)
sleep(5)
print("Frequency Data -- Frequency: %d Hz Total Time: %i ms Total Counts: %i" % (freqCount.getFrequency(0), (freqCount.getTotalTime(0) / 100), freqCount.getTotalCount(0)))
print("Disabling frequency input channel 0...")
freqCount.setEnabled(0, False)
sleep(5)
print("Resetting frequency input channel 0...")
freqCount.reset(0)
sleep(5)
print("Set up event handling...")
freqCount.setOnFrequencyCountHandler(FrequencyCount)
print("Enabling frequency input channel 0...")
freqCount.setEnabled(0, True)
sleep(5)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
freqCount.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
print("Disabling frequency input channel 0...")
freqCount.setEnabled(0, False)
sleep(5)
print("Resetting frequency input channel 0...")
freqCount.reset(0)
sleep(5)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
freqCount.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
try:
freqCount.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/gui/breezy/listboxdemo.py
"""
File: listboxdemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
from tkinter import END, NORMAL, DISABLED
class ListBoxDemo(EasyFrame):
"""Allows the user to add items to a list box, remove them,
and select them."""
def __init__(self):
"""Sets up the window and the widgets."""
EasyFrame.__init__(self, title = "List Box Demo",
width = 300, height = 150)
# Set up the list box
self.listBox = self.addListbox(row = 0, column = 0, rowspan = 4,
listItemSelected = self.listItemSelected)
# Add some items to the list box and select the first one
self.listBox.insert(END, "Apple")
self.listBox.insert(END, "Banana")
self.listBox.insert(END, "Cherry")
self.listBox.insert(END, "Orange")
self.listBox.setSelectedIndex(0)
# Set up the labels, fields, and buttons
self.addLabel(text = "Input", row = 0, column = 1)
self.addLabel(text = "Index", row = 1, column = 1)
self.addLabel(text = "Current item", row = 2, column = 1)
self.inputField = self.addTextField(text = "", row = 0,
column = 2, width = 10)
self.indexField = self.addIntegerField(value = "", row = 1,
column = 2, width = 10)
self.itemField = self.addTextField(text = "", row = 2,
column = 2, width = 10)
self.addButton(text = "Add", row = 3,
column = 1, command = self.add)
self.removeButton = self.addButton(text = "Remove", row = 3,
column = 2, command = self.remove)
# Display current index and currently selected item
self.listItemSelected(0)
# Event handling methods
def listItemSelected(self, index):
"""Responds to the selection of an item in the list box.
Updates the fields with the current item and its index."""
self.indexField.setNumber(index)
self.itemField.setText(self.listBox.getSelectedItem())
def add(self):
"""If an input is present, insert it before the selected
item in the list box. The selected item remains current.
If the item added is first one, select it and enable the
remove button."""
item = self.inputField.getText()
if item != "":
index = self.listBox.getSelectedIndex()
if index == -1:
self.listBox.insert(0, item)
self.listBox.setSelectedIndex(0)
self.listItemSelected(0)
self.removeButton["state"] = NORMAL
else:
self.listBox.insert(index, item)
self.listItemSelected(index + 1)
self.inputField.setText("")
def remove(self):
"""If there are items in the list, remove the selected item,
select the previous one, and update the fields. If there was
no previous item, select the next one. If the last item is
removed, disable the remove button."""
index = self.listBox.getSelectedIndex()
self.listBox.delete(index)
if self.listBox.size() > 0:
if index > 0:
index -= 1
self.listBox.setSelectedIndex(index)
self.listItemSelected(index)
else:
self.listItemSelected(-1)
self.removeButton["state"] = DISABLED
# Instantiate and pop up the window."""
if __name__ == "__main__":
ListBoxDemo().mainloop()
<file_sep>/PhidgetsPython/build/lib.linux-i686-2.7/Phidgets/__init__.py
__all__ = ["Phidget", "PhidgetLibrary", "PhidgetException", "Dictionary", "Manager", "Common"]<file_sep>/telemetry/threading_test.py
import threading
import time
testflag = 0
class MyThread ( threading.Thread ):
# Override Thread's __init__ method to accept the parameters needed:
def __init__ ( self, threadNum):
self.threadNum = threadNum
threading.Thread.__init__ ( self )
def run ( self ):
global testflag
print "counting to 5.."
for i in range(5):
print "Thread Number ", self.threadNum, ' named: ',self.getName()," Count = ", i
i = i + 1
time.sleep(1)
testflag = 1
class CheckFlag ( threading.Thread ):
def run ( self ):
global testflag
while True:
print "checking flag.."
time.sleep(.5)
if testflag != 0:
print "flag changed"
break
athread = MyThread(1)
athread.setName ( 'athread' )
athread.start()
time.sleep(2)
#check if thread is alive
if athread.isAlive():
print 'athread is alive.'
else:
print 'athread is dead'
time.sleep(3)
"""
We can use the setDaemon method, too. If a True value is passed with this method and all other threads have finished executing, the Python program will exit, leaving the thread by itself:
import threading
import time
class DaemonThread ( threading.Thread ):
def run ( self ):
self.setDaemon ( True )
time.sleep ( 10 )
DaemonThread().start()
print 'Leaving.'
Python also contains a thread module that deals with lower level threading. The only feature I would like to point out is the start_new_thread function it contains. Using this, we can turn an ordinary function into a thread:
import thread
def thread( stuff ):
print "I'm a real boy!"
print stuff
thread.start_new_thread ( thread, ( 'Argument' ) )
"""
<file_sep>/lib/mobot_lidar_consume.py~
#!/usr/bin/python
import pika
import thread, time, sys, traceback
import cPickle as pickle
''' USAGE:
lidar = consume_lidar('lidar.1', 'localhost')
while True:
time.sleep(1)
print lidar.data[1]
print 'rpm speed:', lidar.speed_rpm
'''
class consume_lidar():
def __init__(self, channel_name, host_ip):
self.speed_rpm = 0
self.data = []
#-------------connection variables
self.channel_name = channel_name
self.host_ip = host_ip
self.queue_name = None
self.connection = None
self.channel = None
#----------------------RUN
self.run()
def connect(self):
self.connection = pika.BlockingConnection(pika.ConnectionParameters(host=self.host_ip))
self.channel = self.connection.channel()
self.channel.exchange_declare(exchange='mobot_data_feed',type='topic')
result = self.channel.queue_declare(exclusive=True)
self.queue_name = result.method.queue
binding_keys = self.channel_name
self.channel.queue_bind(exchange='mobot_data_feed', queue=self.queue_name, routing_key=self.channel_name)
def read_lidar(self):
while True:
time.sleep(0.0001) # do not hog the processor power
for method_frame, properties, body in self.channel.consume(self.queue_name):
#print "consuming"
# Display the message parts
temp_data = pickle.loads(body)
# Acknowledge the message
self.channel.basic_ack(method_frame.delivery_tag)
self.speed_rpm = temp_data[360]
self.data = temp_data[:360]
# Escape out of the loop after 10 messages
#if method_frame.delivery_tag == 10:
# break
# Cancel the consumer and return any pending messages
#requeued_messages = self.channel.cancel()
#print 'Requeued %i messages' % requeued_messages
# Close the channel and the connection
#self.channel.close()
#self.connection.close()
def run(self):
self.th = thread.start_new_thread(self.read_lidar, ())
if __name__== "__main__":
lidar = consume_lidar('lidar.1', 'localhost')
while True:
time.sleep(1)
print lidar.data[1]
print 'rpm speed:', lidar.speed_rpm
<file_sep>/navigation/gps/autostart_all_gpss.py
#this module finds and starts all GPS devices connected to the Linux computer
from subprocess import call
from identify_device_on_ttyport import *
import time
from threading import *
import gps
import random
import os
from math import sin, cos, radians, sqrt, atan2, asin, sqrt, pi
class mobot_gps( Thread ):
def __init__(self):
Thread.__init__( self )
self.latitude = 0.0
self.longitude = 0.0
self.altitude = 0.0
self.satellites = 0
self.active_satellites = 0
self.samples = 10
def __del__(self):
del(self)
def run(self):
self.process()
def process( self ):
gpss = []
#get list of all attached gps units
all_gps_list = gps_list()
print "Total # GPS Units Found:", len (all_gps_list)
#start all gps units
start_all_gps()
#connect via python
for n in range(len(all_gps_list)):
port = str(2947+n)
print "port", port
gpss.append(gps.gps(host="localhost", port=port))
print "starting_gps:", gpss[n]
#returncode = call(start_gps, shell=True)
time.sleep(1)
gpss[n].next()
gpss[n].stream()
#print 'Satellites (total of', len(gpss[n].satellites) , ' in view)'
#time.sleep(1)
while True :
avg_latitude = 0.0
avg_longitude = 0.0
avg_altitude = 0.0
avg_satellites = 0
avg_active_satellites = 0
try:
for n in range(len(all_gps_list)):
for x in xrange(1, self.samples):
os.system("clear")
while ActiveSatelliteCount(gpss[n].satellites) < 6:
print "Acquiring at least 6 GPS Satellites..."
print 'Satellites (total of', len(gpss[n].satellites) , ' in view)'
print "Number of acquired satellites: ", ActiveSatelliteCount(gpss[n].satellites)
time.sleep(1)
os.system("clear")
gpss[n].next()
gpss[n].stream()
gpss[n].next()
#test data
#gpss[n].fix.latitude = 53.32055555555556 + (random.random() * 0.00005)
#gpss[n].fix.longitude =-1.7297222222222221 + (random.random() * 0.00005)
print "READING GPS:", n, " ", x
print "-------------"
print 'latitude ' , gpss[n].fix.latitude
print 'longitude ' , gpss[n].fix.longitude
print 'time utc ' , gpss[n].utc, gpss[n].fix.time
print 'altitude ' , gpss[n].fix.altitude
print 'epx ', gpss[n].fix.epx
print 'epv ', gpss[n].fix.epv
print 'ept ', gpss[n].fix.ept
print "speed ", gpss[n].fix.speed
print "climb " , gpss[n].fix.climb
#print
print 'Satellites (total of', len(gpss[n].satellites) , ' in view)'
for i in gpss[n].satellites:
print '\t', i
print "Active satellites used: ", ActiveSatelliteCount(gpss[n].satellites)
avg_latitude = (avg_latitude + gpss[n].fix.latitude)
avg_longitude = (avg_longitude + gpss[n].fix.longitude)
#if (x > 0):
# print 'Avg latitude : ' , self.latitude / x
# print 'Avg longitude: ' , self.longitude / x
#time.sleep(10)
avg_active_satellites = (avg_active_satellites + ActiveSatelliteCount(gpss[n].satellites))
time.sleep(.2)
#print "Averaging....", (x*len(all_gps_list))
self.latitude = (avg_latitude / (x*len(all_gps_list)))
self.longitude = (avg_longitude / (x*len(all_gps_list)))
#print "total sats:", self.active_satellites
self.active_satellites = ( avg_active_satellites / (x*len(all_gps_list)))
#time.sleep(1)
print 'Avg latitude : ' , self.latitude, " ", abs(self.latitude - gpss[n].fix.latitude)
print 'Avg longitude: ' , self.longitude, " ", abs(self.longitude - gpss[n].fix.longitude)
print 'Avg Active Satellites: ' , self.active_satellites
print "Distance: ", round((lldistance((self.latitude, self.longitude), (gpss[n].fix.latitude, gpss[n].fix.longitude)) * 3.28084), 4), " feet"
time.sleep(5)
except:
#time.sleep(1)
pass
#os.system("clear")
def ActiveSatelliteCount(session):
count = 0
for i in range(len(session)):
s = str(session[i])
if s[-1] == "y":
count = count + 1
return count
def start_all_gps():
all_gps_list = gps_list()
#gps_list = ["/dev/ttyUSB0", "/dev/ttyUSB1"]
print "gps_list:", all_gps_list
print len(all_gps_list)
for n in range(len(all_gps_list)):
start_gps = "gpsd "+all_gps_list[n]+" -S" + str(2947+n) + " -n -b"
print "start_gps:", start_gps
returncode = call(start_gps, shell=True)
time.sleep(2)
#print returncode
def start_a_gps(gps_to_start, port_index=0):
start_gps = "gpsd "+gps_to_start+" -S " + str(2947+port_index) #+ "-n -b"
print "start_gps:", start_gps
returncode = call(start_gps, shell=True)
time.sleep(2)
#print returncode
def gps_list():
gps_list = find_usb_tty("067b","2303")
return gps_list
def decdeg2dms(dd):
mnt,sec = divmod(dd*3600,60)
deg,mnt = divmod(mnt,60)
return deg,mnt,sec
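# Worked example: decdeg2dms(53.3205556) returns approximately
# (53.0, 19.0, 14.0), i.e. 53 degrees, 19 minutes, 14 seconds.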
def lldistance(a, b):
"""
Calculates the distance between two GPS points (decimal)
@param a: 2-tuple of point A
@param b: 2-tuple of point B
@return: distance in m
"""
r = 6367442.5 # average earth radius in m
dLat = radians(a[0]-b[0])
dLon = radians(a[1]-b[1])
x = sin(dLat/2) ** 2 + \
cos(radians(a[0])) * cos(radians(b[0])) *\
sin(dLon/2) ** 2
#original# y = 2 * atan2(sqrt(x), sqrt(1-x))
y = 2 * asin(sqrt(x))
d = r * y
return d
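# Worked example: one millidegree of longitude at the equator,
# lldistance((0.0, 0.0), (0.0, 0.001)), comes out to roughly 111 m
# (about 365 feet), matching the usual ~111 km-per-degree rule of thumb.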
if __name__ == "__main__":
#print "startup all gps"
#start_all_gps()
gpslist = gps_list()
#print gpslist
gps2 = mobot_gps()
gps2.daemon=True
gps2.start()
#gps2.join()
while 1:
print "# of GPS Units:", len(gpslist)
'''
if (gps2.satellites > 0):
print 'Satellites (total of', len(gps2.satellites) , ' in view)'
print "Active satellites used:", gps2.active_satellites
for i in gps2.satellites:
print '\t', i
'''
print "Active satellites used:", gps2.active_satellites
print "lat: ", gps2.latitude
print "long:", gps2.longitude
time.sleep(random.randint(1, 3))
#os.system("clear")
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_regression_constant_features.py
import milk
import numpy as np
def test_constant_features():
learner = milk.defaultclassifier()
features = np.ones(20).reshape((-1,1))
labels = np.zeros(20)
labels[10:] += 1
features[10:] *= -1
learner.train(features, labels)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/unsupervised/som.py
# -*- coding: utf-8 -*-
# Copyright (C) 2010, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from ..utils import get_pyrandom
from . import _som
def putpoints(grid, points, L=.2, radius=4, iterations=1, shuffle=True, R=None):
'''
putpoints(grid, points, L=.2, radius=4, iterations=1, shuffle=True, R=None)
Feeds elements of `points` into the SOM `grid`
Parameters
----------
grid : ndarray
Self organising map
points : ndarray
data to feed to array
L : float, optional
How much to influence neighbouring points (default: .2)
radius : integer, optional
Maximum radius of influence (in L_1 distance, default: 4)
iterations : integer, optional
Number of iterations
shuffle : boolean, optional
Whether to shuffle the points before each iterations
R : source of randomness
'''
if radius is None:
radius = 4
if type(L) != float:
raise TypeError("milk.unsupervised.som: L should be floating point")
if type(radius) != int:
raise TypeError("milk.unsupervised.som: radius should be an integer")
if grid.dtype != np.float32:
raise TypeError('milk.unsupervised.som: only float32 arrays are accepted')
if points.dtype != np.float32:
raise TypeError('milk.unsupervised.som: only float32 arrays are accepted')
if len(grid.shape) == 2:
grid = grid.reshape(grid.shape+(1,))
if shuffle:
random = get_pyrandom(R)
for i in xrange(iterations):
if shuffle:
random.shuffle(points)
_som.putpoints(grid, points, L, radius)
def closest(grid, f):
'''
y,x = closest(grid, f)
Finds the coordinates of the closest point in the `grid` to `f`
::
y,x = \\argmin_{y,x} { || grid[y,x] - f ||^2 }
Parameters
----------
grid : ndarray of shape Y,X,J
self-organised map
f : ndarray of shape J
point
Returns
-------
y,x : integers
coordinates into `grid`
'''
delta = grid - f
delta **= 2
delta = delta.sum(2)
return np.unravel_index(delta.argmin(), delta.shape)
def som(data, shape, iterations=1000, L=.2, radius=4, R=None):
'''
grid = som(data, shape, iterations=1000, L=.2, radius=4, R=None):
Self-organising maps
Parameters
----------
points : ndarray
data to feed to array
shape : tuple
Desired shape of output. Must be 2-dimensional.
L : float, optional
How much to influence neighbouring points (default: .2)
radius : integer, optional
Maximum radius of influence (in L_1 distance, default: 4)
iterations : integer, optional
Number of iterations
R : source of randomness
Returns
-------
grid : ndarray
Map
'''
R = get_pyrandom(R)
d = data.shape[1]
if data.dtype != np.float32:
data = data.astype(np.float32)
grid = np.array(R.sample(data, np.product(shape))).reshape(shape + (d,))
putpoints(grid, data, L=L, radius=radius, iterations=iterations, shuffle=True, R=R)
return grid
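# Minimal self-test sketch (not part of the original milk distribution): map
# 500 random 3-D points onto an 8x8 grid and report the resulting map shape.
# It assumes the compiled _som extension is importable and, because this
# module uses relative imports, it should be run as
# `python -m milk.unsupervised.som` rather than directly.
if __name__ == '__main__':
data = np.random.rand(500, 3).astype(np.float32)
grid = som(data, (8, 8), iterations=10)
print grid.shape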
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/_tree.cpp
// Copyright (C) 2010, <NAME> <<EMAIL>>
// License: MIT
#include <iostream>
#include <memory>
#include <cmath>
#include <cassert>
extern "C" {
#include <Python.h>
#include <numpy/ndarrayobject.h>
}
namespace {
template <typename T>
double set_entropy(const T* data, const int N, double* counts, long clen) {
for (int i = 0; i != clen; ++i) counts[i] = 0.;
for (int i = 0; i != N; ++i) {
int value = data[i];
if (value >= clen) {
PyErr_SetString(PyExc_RuntimeError, "milk.supervised.tree.set_entropy: label value too large. aborting");
return 0;
}
counts[value] += 1.;
}
// Here is the formula we use:
//
// H = - \sum px \log(px)
// = - \sum (cx/N) \log( cx / N)
// = - 1/N \sum { cx [ \log cx - \log N ] }
// = - 1/N { (\sum cx \log cx ) - ( \sum cx \log N ) }
// = - 1/N { (\sum cx \log cx ) - N \log N }
// = ( - 1/N \sum cx \log cx ) + \log N
double entropy = 0.;
for (int i = 0; i != clen; ++i) {
double cx = counts[i];
if (cx) entropy += cx * std::log(cx);
}
entropy /= -N;
entropy += std::log(double(N));
return entropy;
}
double set_entropy(PyArrayObject* points, double* counts, long clen) {
const int N = PyArray_SIZE(points);
#define HANDLE(type_marker, type) \
if (PyArray_TYPE(points) == type_marker) return set_entropy<type>(reinterpret_cast<const type*>(PyArray_DATA(points)), N, counts, clen);
HANDLE(NPY_INT, int)
HANDLE(NPY_LONG, long)
#undef HANDLE
assert(false);
return 0.;
}
PyObject* py_set_entropy(PyObject* self, PyObject* args) {
const char* errmsg = "Arguments were not what was expected for set_entropy.\n"
"This is an internal function: Do not call directly unless you know exactly what you're doing.\n";
PyArrayObject* labels;
PyArrayObject* counts;
if (!PyArg_ParseTuple(args, "OO", &labels, &counts)) {
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
if (!PyArray_Check(labels) || (PyArray_TYPE(labels) != NPY_INT && PyArray_TYPE(labels) != NPY_LONG) || !PyArray_ISCONTIGUOUS(labels) ||
!PyArray_Check(counts) || PyArray_TYPE(counts) != NPY_DOUBLE || !PyArray_ISCONTIGUOUS(counts)) {
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
double* cdata = reinterpret_cast<double*>(PyArray_DATA(counts));
const int nlabels = PyArray_DIM(counts, 0);
double res = set_entropy(labels, cdata, nlabels);
return PyFloat_FromDouble(res);
}
PyObject* py_information_gain(PyObject* self, PyObject* args) {
const char* errmsg = "Arguments were not what was expected for information_gain.\n"
"This is an internal function: Do not call directly unless you know exactly what you're doing.\n";
PyArrayObject* labels0;
PyArrayObject* labels1;
if (!PyArg_ParseTuple(args, "OO", &labels0, &labels1)) {
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
if (!PyArray_Check(labels0) || (PyArray_TYPE(labels0) != NPY_INT && PyArray_TYPE(labels0) != NPY_LONG) || !PyArray_ISCONTIGUOUS(labels0) ||
!PyArray_Check(labels1) || (PyArray_TYPE(labels1) != NPY_INT && PyArray_TYPE(labels1) != NPY_LONG) || !PyArray_ISCONTIGUOUS(labels1)) {
PyErr_SetString(PyExc_RuntimeError,errmsg);
std::cout << PyArray_TYPE(labels0) << " " << PyArray_TYPE(labels1) << '\n';
return 0;
}
double* counts;
double counts_array[8];
long clen = 0;
#define GET_MAX(index) \
const int N ## index = PyArray_DIM(labels ## index, 0); \
{ \
for (int i = 0; i != N ## index; ++i) { \
long val = 0; \
if (PyArray_TYPE(labels ## index) == NPY_INT) val = *reinterpret_cast<const int*>(PyArray_GETPTR1(labels ## index, i)); \
else if (PyArray_TYPE(labels ## index) == NPY_LONG) val = *reinterpret_cast<const long*>(PyArray_GETPTR1(labels ## index, i)); \
if (val > clen) clen = val; \
} \
}
GET_MAX(0);
GET_MAX(1);
++clen;
if (clen > 8) {
counts = new(std::nothrow) double[clen];
if (!counts) {
PyErr_NoMemory();
return 0;
}
} else {
counts = counts_array;
}
const double N = N0 + N1;
double H = - N0/N * set_entropy(labels0, counts, clen) - N1/N * set_entropy(labels1, counts, clen);
if (clen > 8) delete [] counts;
return PyFloat_FromDouble(H);
}
PyMethodDef methods[] = {
{"set_entropy", py_set_entropy, METH_VARARGS , "Do NOT call directly.\n" },
{"information_gain", py_information_gain, METH_VARARGS , "Do NOT call directly.\n" },
{NULL, NULL,0,NULL},
};
const char * module_doc =
"Internal Module.\n"
"\n"
"Do NOT use directly!\n";
} // namespace
extern "C"
void init_tree()
{
import_array();
(void)Py_InitModule3("_tree", methods, module_doc);
}
<file_sep>/navigation/lidar/test_lidar_faker_class.py
import lidar_faker_class
import time
lidar = lidar_faker_class.lidar_faker()
while True:
time.sleep(.05)
print lidar.x_degree, lidar.y_degree, lidar.dist, lidar.quality, lidar.rpm
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/parzen.py
# -*- coding: utf-8 -*-
# Copyright (C) 2010, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy as np
def get_parzen_1class_rbf_loocv(features):
'''
f,fprime = get_parzen_1class_rbf_loocv(features)
Leave-one-out crossvalidation value for 1-class Parzen window evaluator for
features.
Parameters
----------
features : ndarray
feature matrix
Returns
-------
f : function: double -> double
function which evaluates the value of a window value. Minize to get the
best window value.
fprime : function: double -> double
function: df/dh
'''
from milk.unsupervised.pdist import pdist
D2 = -pdist(features)
n = len(features)
sumD2 = D2.sum()
D2.flat[::(n+1)] = -np.inf
def f(h):
D2h = D2 / (2.*h)
np.exp(D2h, D2h)
val = D2h.sum()
return val/np.sqrt(2*h*np.pi)
def fprime(h):
D2h = D2 / (2.*h)
D2h.flat[::(n+1)] = 1.
D2h *= np.exp(D2h)
val = D2h.sum() - D2h.trace()
val /= np.sqrt(2*h*np.pi)
return -1./(4*np.pi*h)*f(h) + val
return f,fprime
def parzen(features, h):
'''
f = parzen(features, h)
Parzen window smoothing
Parameters
----------
features : ndarray
feature matrix
h : double
bandwidth
Returns
-------
f : callable (double^N -> double)
density function
'''
sum2 = np.array([np.dot(f,f) for f in features])
N = len(features)
beta = np.sqrt(2*h*np.pi)/N
def f(x):
dist = np.dot(features, -2*x)
dist += sum2
dist += np.dot(x,x)
dist /= -2.*h
np.exp(dist, dist)
val = dist.sum()
return val*beta
return f
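# Minimal usage sketch (not part of the original milk distribution): evaluate
# the leave-one-out objective for a few window sizes, then build a Parzen
# density estimate with one of them and query it at a data point. The data
# and window sizes are illustrative; the loocv helper assumes the milk
# package is importable (it imports milk.unsupervised.pdist internally).
if __name__ == '__main__':
np.random.seed(0)
feats = np.random.rand(50, 4)
f, fprime = get_parzen_1class_rbf_loocv(feats)
for h in (.01, .1, 1.):
print 'h=%.2f loocv value=%g' % (h, f(h))
density = parzen(feats, .1)
print 'density at first point:', density(feats[0])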
<file_sep>/gui/breezy/scaledemo1.py
"""
File: scaledemo1.py
Displays the area of a circle whose radius is input from a sliding scale.
"""
from breezypythongui import EasyFrame
import math
class CircleArea(EasyFrame):
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Circle Area")
# Label and field for the area
self.addLabel(text = "Area",
row = 0, column = 0)
self.areaField = self.addFloatField(value = 0.0,
row = 0,
column = 1,
width = 20)
# Sliding scale for the radius
self.radiusScale = self.addScale(label = "Radius",
row = 1, column = 0,
columnspan = 2,
from_ = 0, to = 100,
length = 300,
tickinterval = 10,
command = self.computeArea)
# The event handler method for the sliding scale
def computeArea(self, radius):
"""Inputs the radius, computes the area,
and outputs the area."""
# radius is the current value of the scale, as a string.
area = float(radius) ** 2 * math.pi
self.areaField.setNumber(area)
# Instantiate and pop up the window."""
CircleArea().mainloop()
<file_sep>/preprocessimages.py
#!/usr/bin/env python
import os
from PIL import Image
from PIL import ImageOps
import sys, time
from optparse import OptionParser
import numpy as np
from numpy import *
import mlpy
import Numeric, Image
import matplotlib.mlab as mlab
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
import mlpy
import pylab
import matplotlib.numerix as nx
from numpy import array
from scipy.cluster.vq import vq, kmeans, kmeans2, whiten
def hsvToRGB(h, s, v):
"""Convert HSV color space to RGB color space
@param h: Hue
@param s: Saturation
@param v: Value
return (r, g, b)
"""
import math
hi = math.floor(h / 60.0) % 6
f = (h / 60.0) - math.floor(h / 60.0)
p = v * (1.0 - s)
q = v * (1.0 - (f*s))
t = v * (1.0 - ((1.0 - f) * s))
return {
0: (v, t, p),
1: (q, v, p),
2: (p, v, t),
3: (p, q, v),
4: (t, p, v),
5: (v, p, q),
}[hi]
def rgbToHSV(r, g, b):
"""Convert RGB color space to HSV color space
@param r: Red
@param g: Green
@param b: Blue
return (h, s, v)
"""
maxc = max(r, g, b)
minc = min(r, g, b)
colorMap = {
id(r): 'r',
id(g): 'g',
id(b): 'b'
}
if colorMap[id(maxc)] == colorMap[id(minc)]:
h = 0
elif colorMap[id(maxc)] == 'r':
h = 60.0 * ((g - b) / (maxc - minc)) % 360.0
elif colorMap[id(maxc)] == 'g':
h = 60.0 * ((b - r) / (maxc - minc)) + 120.0
elif colorMap[id(maxc)] == 'b':
h = 60.0 * ((r - g) / (maxc - minc)) + 240.0
v = maxc
if maxc == 0.0:
s = 0.0
else:
s = 1.0 - (minc / maxc)
return (h, s, v)
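# Worked example: a pure red pixel round-trips through the two conversions:
#   rgbToHSV(1.0, 0.0, 0.0) -> (0.0, 1.0, 1.0)  (hue 0, full saturation and value)
#   hsvToRGB(0.0, 1.0, 1.0) -> (1.0, 0.0, 0.0)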
def rgbToI3(r, g, b):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
i3 = ((2*g)-r-b)/2
return i3
def image2array(im):
if im.mode not in ("L", "F"):
raise ValueError, "can only convert single-layer images"
if im.mode == "L":
a = Numeric.fromstring(im.tostring(), Numeric.UnsignedInt8)
#a = Numeric.fromstring(im.tostring(), Numeric.Int8)
else:
a = Numeric.fromstring(im.tostring(), Numeric.Float32)
a.shape = im.size[1], im.size[0]
return a
def array2image(a):
if a.typecode() == Numeric.UnsignedInt8:
mode = "L"
elif a.typecode() == Numeric.Float32:
mode = "F"
else:
raise ValueError, "unsupported image mode"
return Image.fromstring(mode, (a.shape[1], a.shape[0]), a.tostring())
def CalcHistogram(img):
#calc histogram of green band
bins = nx.arange(0,256)
hist1 = image2array(img)
H, xedges = np.histogram(np.ravel(hist1), bins=bins, normed=False)
return H
def CalcLBP(img):
#Function to calculate the (improved) local binary pattern of an image.
#Pass in a PIL image; returns a new grayscale PIL image whose pixel values
#are the 8-bit LBP codes.
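# Worked example (hypothetical pixel values): with neighbour intensities
# [10, 10, 10, 200, 200, 200, 10, 10] and a centre of 10, the mean is ~73,
# so only bits 3-5 are set and the resulting LBP code is 2**3+2**4+2**5 = 56.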
# Angle step.
#PI = 3.14159265
neighbors = 8
#a = 2*PI/neighbors;
#radius = 1
#Increment = 1/neighbors
xmax = img.size[0]
ymax = img.size[1]
#convert image to grayscale
grayimage = ImageOps.grayscale(img)
#make a copy to return
returnimage = Image.new("L", (xmax,ymax))
neighborRGB = np.empty([8], dtype=int)
meanRGB = 0
imagearray = grayimage.load()
for y in range(1, ymax-1, 1):
for x in range(1, xmax-1, 1):
centerRGB = imagearray[x, y]
meanRGB = centerRGB
neighborRGB[4] = imagearray[x+1,y+1]
neighborRGB[5] = imagearray[x,y+1]
neighborRGB[6] = imagearray[x-1,y+1]
neighborRGB[7] = imagearray[x-1,y]
neighborRGB[0] = imagearray[x-1,y-1]
neighborRGB[1] = imagearray[x,y-1]
neighborRGB[2] = imagearray[x+1,y-1]
neighborRGB[3] = imagearray[x+1,y]
#comparing against mean adds a sec vs comparing against center pixel
meanRGB= centerRGB + neighborRGB.sum()
meanRGB = meanRGB / (neighbors+1)
#compute Improved local binary pattern (center pixel vs the mean of neighbors)
lbp = 0
for i in range(neighbors):
#comparing against mean adds a sec vs comparing against center pixel
if neighborRGB[i] >= meanRGB:
#if neighborRGB[i] >= centerRGB:
lbp = lbp + (2**i)
#putpixel adds 1 second vs storing to array
returnimage.putpixel((x,y), lbp)
return returnimage
def rgb2I3 (img):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
xmax = img.size[0]
ymax = img.size[1]
#make a copy to return
returnimage = Image.new("RGB", (xmax,ymax))
imagearray = img.load()
for y in range(0, ymax, 1):
for x in range(0, xmax, 1):
rgb = imagearray[x, y]
i3 = ((2*rgb[1])-rgb[0]-rgb[2]) / 2
#print rgb, i3
returnimage.putpixel((x,y), (0,i3,0))
return returnimage
usage = "usage: %prog [options]"
parser = OptionParser(prog=sys.argv[0], usage=usage)
parser.add_option("-d", "--dir", dest="directory",
help="directory of images to process. example: /home/user/python/images...default = images", metavar="DIRECTORY")
#parser.add_option('-d', '--dir', dest='directory of images to process', action='store',
# help='directory of images to process. example: /home/user/python/images')
#parser.add_option('-t', '--true', dest='true', action='store_true',
# help='example of using store_true, default %default')
#parser.add_option('-v', '--value', dest='value', action='store',
# help='example of using multiple arguments')
#
parser.set_defaults(true=False )
options, args = parser.parse_args()
#print 'OPTIONS::', options
#print 'ARGS::', args
if len(sys.argv) == 1:
path='testimage'
if len(sys.argv) == 2:
path= sys.argv[1]
print
print "Attempting to process files in directory: ", os.getcwd()+"/"+ path
print
count = 0
for subdir, dirs, files in os.walk(path):
count = len(files)
if count == 0:
print "No files in directory to process..."
if count > 0:
#delete classid and classdata files to completely rebuild them
f_handle = open("greenbandclassid.txt", 'w')
f_handle.close()
f_handle = open("greenbanddata.txt", 'w')
f_handle.close()
print "Files to Process: ", count
for subdir, dirs, files in os.walk(path):
for file in files:
filename1= os.path.join(path, file)
im = Image.open(filename1)
print
print "Processing current image: " , filename1
#print im.format, im.size, im.mode
if im.size[0] != 320 or im.size[1] != 240:
print "Image is not the right size. Resizing image...."
im = im.resize((320, 240))
print "Resized to 320, 240"
if im.mode == "RGB":
print "Image has multiple color bands...Splitting Bands...."
Red_Band, Green_Band,Blue_Band = im.split()
im.show()
print Green_Band
Green_Band.show()
#print "Saving color bands...."
#filename = filename1.rsplit('.')[0] + "_RedBand.bmp"
#print filename1.rsplit('.')[0][-1]
imageclassid = filename1.rsplit('.')[0][-1]
classid = array(int(imageclassid[0]))
if imageclassid.isdigit():
print "Image class: ", imageclassid
f_handle = open("greenbandclassid.txt", 'a')
f_handle.write(str(classid))
f_handle.write(' ')
f_handle.close()
#calculate histogram
print "Calculating Histogram for the green pixels of image..."
Histogram = CalcHistogram(Green_Band)
#save Green Histogram to file in certain format
print "saving histogram to dictionary..."
f_handle = open("greenbanddata.txt", 'a')
for i in range(len(Histogram)):
f_handle.write(str(Histogram[i]))
f_handle.write(" ")
f_handle.write('\n')
f_handle.close()
#print "Saving.....", filename
#Red_Band.save(filename, "BMP")
#filename = filename1.rsplit('.')[0] + "_GreenBand.bmp"
print "calling i3"
I3image = rgb2I3(im)
#calculate histogram
print "Calculating Histogram for I3 pixels of image..."
Red_Band, Green_Band, Blue_Band = I3image.split()
Histogram = CalcHistogram(Green_Band)
#save I3 Histogram to file in certain format
f_handle = open("I3bandclassid.txt", 'a')
f_handle.write(str(classid))
f_handle.write(' ')
f_handle.close()
print "saving I3 histogram to dictionary..."
f_handle = open("I3banddata.txt", 'a')
for i in range(len(Histogram)):
f_handle.write(str(Histogram[i]))
f_handle.write(" ")
f_handle.write('\n')
f_handle.close()
"""
#Local Binary Patterns
inittime = time.time()
print "Computing Local Binary Patterns....", inittime, time.ctime()
neighborRGB = np.empty([8], dtype=int)
for y in range(1, ymax-1, 1):
#print y
for x in range(1, xmax-1, 1):
meanRGB = 0
neighborRGB[0] = 0
centerRGB = im2.getpixel((x, y))
#im2.putpixel((x,y), (255,0,0))
meanRGB = centerRGB
#lbp = 0
for i in range(neighbors):
XPixel = around(-radius*sin((i-1)*a) )
YPixel = around(radius*cos((i-1)*a) )
#print XPixel, YPixel,x+XPixel,y+YPixel
if x+XPixel > -1 and y+YPixel > -1:
neighborRGB[i] = im2.getpixel((x+XPixel,y+YPixel))
meanRGB = meanRGB + neighborRGB[i]
#if neighborRGB[i] >= centerRGB:
# lbp = lbp + (2**i)
#print neighborRGB, meanRGB
meanRGB = meanRGB / (neighbors+1)
#print neighborRGB, meanRGB
#im3.putpixel((x,y), lbp)
#compute Improved local binary pattern (center pixel vs the mean of neighbors)
lbp = 0
for i in range(neighbors):
#print i, neighborRGB[i], meanRGB, (neighborRGB[i] >= meanRGB)
if neighborRGB[i] >= meanRGB:
lbp = lbp + (2**i)
im3.putpixel((x,y), lbp)
#Compute Uniform local binary pattern (center pixel vs the mean of neighbors)
#im4.putpixel((x,y), 0)
#if lbp == 1 or lbp == 3 or lbp == 7 or lbp == 15 or lbp == 31 or lbp == 127 or lbp == 255:
# im4.putpixel((x,y), lbp)
print "Completed Computing Local Binary Patterns....", (time.time()-inittime)
"""
I3image.show()
#Local Binary Patterns
inittime = time.time()
print "Computing Local Binary Patterns....", inittime, time.ctime()
im6 = CalcLBP(im)
im7 = CalcLBP(Green_Band)
#Red_Band, Green_Band, Blue_Band = im.split()
print "Completed Computing Local Binary Patterns....", (time.time()-inittime)
im6.show()
im7.show()
#Compute Uniform local binary pattern (center pixel vs the mean of neighbors)
#im4.putpixel((x,y), 0)
#if lbp == 1 or lbp == 3 or lbp == 7 or lbp == 15 or lbp == 31 or lbp == 127 or lbp == 255:
# im4.putpixel((x,y), lbp)
#
#print im3.getcolors()
#print im3.histogram()
#im3.show()
#im5.show()
#hist1 = CalcHistogram(im3)
#print hist1
#plt.plot(hist1)
#plt.show()
#stop
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_logistic.py
import milk.supervised.logistic
import milksets.wine
import numpy as np
def test_better_than_random():
learner = milk.supervised.logistic.logistic_learner()
features, labels = milksets.wine.load()
model = learner.train(features, labels == 0)
error = np.array([np.abs(model.apply(f)-(l == 0))
for f,l in zip(features, labels)])
assert error.mean() < .1
<file_sep>/navigation/lidar/test_mobot_lidar_class.py
#!/usr/bin/python
import sys
sys.path.append( "../../lib/" )
import mobot_lidar_class
import time
from visual import *
from mobot_nav_class import *
import cPickle as pickle
#global variables
use_points = True
use_outer_line = False
use_lines = True
use_intensity = False
init_level = 0
index = 0
offset = 140
#setup the window, view, etc
scene.forward = (1, -1, 0)
scene.background = (0.1, 0.1, 0.2)
scene.title = "Neato XV-11 Laser Distance Sensor - OpenLidarMap"
class grid:
"""A graphical grid, with two level of subdivision.
The grid can be manipulated (moved, showed/hidden...) by acting on self.frame.
"""
def __init__(self, size=100, small_interval=10, big_interval=50):
self.frame = frame(pos=(0,0,0))
for i in range(-size, size+small_interval, small_interval):
if i %big_interval == 0:
c = color.gray(0.65)
else:
c = color.gray(0.25)
curve(frame=self.frame, pos=[(i,0,size),(i,0,-size)], color=c)
curve(frame=self.frame, pos=[(size,0,i),(-size,0,i)], color=c)
#grid where major intervals are 1m, minor intervals are 10cm
my_grid = grid(size=4000, small_interval = 100, big_interval=1000)
my_grid.frame.pos.y=-5
# sample and intensity points
point = points(pos=[(0,0,0) for i in range(360)], size=5, color=(1 , 0, 0))
pointb = points(pos=[(0,0,0) for i in range(360)], size=5, color=(0.4, 0, 0))
point2 = points(pos=[(0,0,0) for i in range(360)], size=3, color=(1 , 1, 0))
point2b = points(pos=[(0,0,0) for i in range(360)], size=3, color=(0.4, 0.4, 0))
#lines
outer_line= curve (pos=[(0,0,0) for i in range(360)], size=5, color=(1 , 0, 0))
lines=[curve(pos=[(offset*cos(i* pi / 180.0),0,offset*-sin(i* pi / 180.0)),(offset*cos(i* pi / 180.0),0,offset*-sin(i* pi / 180.0))], color=[(0.1, 0.1, 0.2),(1,0,0)]) for i in range(360)]
zero_intensity_ring = ring(pos=(0,0,0), axis=(0,1,0), radius=offset-1, thickness=1, color = color.yellow)
#draw the robot (very approximately)
robot = frame()
box(frame=robot, pos = (150,-65,0), width=300, length = 180, height = 100)
cylinder(frame=robot, pos=(60,-115,0), axis = (0,100,0), radius=150)
cylinder(frame=robot, pos = (0,10,0), axis= (0,-40,0),radius=60)
robot.visible = False # and hide it!
#draw the lidar (even more approximately)
lidar=frame()
cylinder(frame=lidar, pos = (0,-8,0), axis = (0,16,0), radius = 40, color=color.gray(0.8))
cylinder(frame=lidar, pos = (-70,-18,0), axis = (0,-25,0), radius=15, color=color.gray(0.8))
ring(frame=lidar, pos=(0,-18,0), radius = 60, thickness=10, axis = (0,1,0), color=color.gray(0.8))
ring(frame=lidar, pos=(-12,-18,0), radius = 50, thickness=10, axis = (0,1,0), color=color.gray(0.8))
ring(frame=lidar, pos=(-30,-18,0), radius = 40, thickness=10, axis = (0,1,0), color=color.gray(0.8))
ring(frame=lidar, pos=(-45,-18,0), radius = 30, thickness=10, axis = (0,1,0), color=color.gray(0.8))
ring(frame=lidar, pos=(-60,-18,0), radius = 20, thickness=10, axis = (0,1,0), color=color.gray(0.8))
#Text
label_help = label(pos = (0,0,0))
label_help.text="""Red : distance.
Yellow : intensity (the ring materializes 'intensity == 0')
Darker colors when quality is subpar.
The grid has 10cm intervals.
Mouse:
Left+drag = rotate
Left+Right+drag = zoom
Keyboard:
h : show/hide this help
r : start motor (bluetooth version)
s : stop motor (bluetooth version)
i : show/hide intensity dots
o : show/hide distance outer line
p : show/hide distance dots
l : show/hide distance rays
j : show/hide RPM label
k : show/hide error count
g : show/hide grid
b : show/hide robot 3D model
n : show/hide lidar 3D model"""
label_speed = label(pos = (0,-500,0), xoffset=1, box=False, opacity=0.1)
label_errors = label(pos = (0,-1000,0), xoffset=1, text="errors: 0", visible = False, box=False)
def motor_control( speed ):
global ser, controler, rpm_setpoint
val = controler.process( speed - rpm_setpoint)
ser.write(chr(val))
def gui_update_speed(speed_rpm):
label_speed.text = "RPM : " + str(speed_rpm)
def compute_speed(data):
speed_rpm = float( data[0] | (data[1] << 8) ) / 64.0
return speed_rpm
def update_view(lidar):
"""Updates the view of a sample.
Takes the angle (an int, from 0 to 359) and the list of four bytes of data in the order they arrived.
"""
global offset, use_outer_line, use_line
while True:
gui_update_speed(lidar.speed_rpm)
for i in lidar.data:
time.sleep(0.0001)
angle = i[0]
dist_mm = i[1]
quality = i[2]
#reset the point display
point.pos[angle] = vector( 0, 0, 0 )
pointb.pos[angle] = vector( 0, 0, 0 )
point2.pos[angle] = vector( 0, 0, 0 )
point2b.pos[angle] = vector( 0, 0, 0 )
angle_rad = angle * pi / 180.0
c = cos(angle_rad)
s = -sin(angle_rad)
dist_x = dist_mm*c
dist_y = dist_mm*s
#print "angle:", angle, " dist_mm:", dist_mm, " quality:", quality
if not use_lines : lines[angle].pos[1]=(offset*c,0,offset*s)
if not use_outer_line :
outer_line.pos[angle]=(offset*c,0,offset*s)
outer_line.color[angle] = (0.1, 0.1, 0.2)
# display the sample
if quality == 0: # is the flag for "bad data" set?
# # yes it's bad data
lines[angle].pos[1]=(offset*c,0,offset*s)
outer_line.pos[angle]=(offset*c,0,offset*s)
outer_line.color[angle] = (0.1, 0.1, 0.2)
else:
# no, it's cool
# if not x1 & 0x40:
# X+1:6 not set : quality is OK
if use_points : point.pos[angle] = vector( dist_x,0, dist_y)
if use_intensity : point2.pos[angle] = vector( (quality + offset)*c,0, (quality + offset)*s)
if use_lines : lines[angle].color[1] = (1,0,0)
if use_outer_line : outer_line.color[angle] = (1,0,0)
#else:
# # X+1:6 set : Warning, the quality is not as good as expected
# if use_points : pointb.pos[angle] = vector( dist_x,0, dist_y)
# if use_intensity : point2b.pos[angle] = vector( (quality + offset)*c,0, (quality + offset)*s)
# if use_lines : lines[angle].color[1] = (0.4,0,0)
# if use_outer_line : outer_line.color[angle] = (0.4,0,0)
if use_lines : lines[angle].pos[1]=( dist_x, 0, dist_y)
if use_outer_line : outer_line.pos[angle]=( dist_x, 0, dist_y)
#raw_input ("press enter")
def angel_greatest_dist(lidar):
min_dist_mm = 60
greatest_dist_mm = 0
angle_to_return = 0 # default in case no sample passes the distance/quality filter
#while True:
for i in lidar.data:
time.sleep(0.0001)
angle = i[0]
dist_mm = i[1]
quality = i[2]
if dist_mm > min_dist_mm and quality > 10:
if dist_mm > greatest_dist_mm:
greatest_dist_mm = dist_mm
angle_to_return = angle
return angle_to_return
def nav(lidar):
min_dist_mm = 33
while True:
time.sleep(0.1)
print 'angel_greatest_dist(lidar) ', angel_greatest_dist(lidar)
'''
for i in lidar.data:
time.sleep(0.0001)
angle = i[0]
dist_mm = i[1]
quality = i[2]
if angle < front_left or angle > front_right:
if dist_mm > min_dist_mm and dist_mm < 200 and quality > 0:
print "---------------------------------------"
print 'OBJECT AHEAD:', i
print "---------------------------------------"
if angle < lc_left or angle > lc_right:
if dist_mm > min_dist_mm and dist_mm < 200 and quality > 0:
print "---------------------------------------"
print 'OBJECT LEFT:', i
print "---------------------------------------"
'''
label_help.visible = False
ml1 = mobot_lidar_class.lidar("/dev/ttyUSB0", 115200)
print ml1
#print ml1.ser
print dir(ml1)
time.sleep(1)
print ml1.ser
th = thread.start_new_thread(update_view, (ml1,))
#th2 = thread.start_new_thread(nav, (ml1,))
nav = mobot_nav (ml1)
while True:
rate(60) # synchronous repaint at 60fps
if scene.kb.keys: # event waiting to be processed?
s = scene.kb.getkey() # get keyboard info
#if s == "s": # stop motor
# ser.write(chr(0))
#elif s=="r": # run motor
# ser.write(chr(130))
if s=="o": # Toggle outer line
use_outer_line = not use_outer_line
elif s=="l": # Toggle rays
use_lines = not use_lines
elif s=="p": # Toggle points
use_points = not use_points
elif s=="i": # Toggle intensity
use_intensity = not use_intensity
zero_intensity_ring.visible = use_intensity
elif s=="g": # Toggle grid
my_grid.frame.visible = not my_grid.frame.visible
elif s=="b": # Toggle robot representation
robot.visible = not robot.visible
elif s=="n": # Toglle lidar representation
lidar.visible = not lidar.visible
elif s=="h": # Toggle help
label_help.visible = not label_help.visible
elif s=="j": # Toggle rpm
label_speed.visible = not label_speed.visible
elif s=="k": # Toggle errors
label_errors.visible = not label_errors.visible
elif s=="t": # which way to turn
print "angel_greatest_dist", nav.angel_greatest_dist()
print "turn_left_or_right", nav.turn_left_or_right()
#print "---------------------------------------"
#print ml1.data
#print "len: ", len(ml1.data)
#gps2 = mobot_gps()
#gps2.daemon=True
#gps2.start()
#print "---------------------------------------"
#print "speed: ", ml1.speed_rpm
#time.sleep(1)
<file_sep>/camera/Mowercapturecam.py~
#!/usr/bin/python
import sys, pygame, time
import os
from PIL import Image
from PIL import ImageEnhance, ImageStat
from PIL import ImageFilter
from pygame.locals import *
import opencv
# import the necessary things for OpenCV
from opencv import cv
from opencv import highgui
#from opencv.cv import *
#from opencv.highgui import *
#from CVtypes import cv
# the codec existing in cvcapp.cpp,
# need to have a better way to specify them in the future
# WARNING: I have seen only MPEG1VIDEO working on my computer
H263 = 0x33363255
H263I = 0x33363249
MSMPEG4V3 = 0x33564944
MPEG4 = 0x58564944
MSMPEG4V2 = 0x3234504D
MJPEG = 0x47504A4D
MPEG1VIDEO = 0x314D4950
AC3 = 0x2000
MP2 = 0x50
FLV1 = 0x31564C46
res = (320,240)
pygame.init()
screen = pygame.display.set_mode((320,240))
pygame.display.set_caption('Webcam')
pygame.font.init()
font = pygame.font.SysFont("Courier",11)
def disp(phrase,loc):
s = font.render(phrase, True, (200,200,200))
sh = font.render(phrase, True, (50,50,50))
screen.blit(sh, (loc[0]+1,loc[1]+1))
screen.blit(s, loc)
brightness = 1.0
contrast = 1.0
shots = 0
# so, here is the main part of the program
if __name__ == '__main__':
# a small welcome
print "OpenCV Python capture video"
# first, create the necessary window
highgui.cvStartWindowThread()
highgui.cvNamedWindow('Camera', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvStartWindowThread()
highgui.cvNamedWindow ('Color Segmentation', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvStartWindowThread()
highgui.cvNamedWindow ('Canny', highgui.CV_WINDOW_AUTOSIZE)
# move the new window to a better place
highgui.cvMoveWindow ('Camera', 10, 10)
try:
# try to get the device number from the command line
device = int (sys.argv [1])
# got it ! so remove it from the arguments
del sys.argv [1]
except (IndexError, ValueError):
# no device number on the command line, assume we want the 1st device
device = 0
if len (sys.argv) == 1:
# no argument on the command line, try to use the camera
capture = highgui.cvCreateCameraCapture (device)
highgui.cvSetCaptureProperty(capture, highgui.CV_CAP_PROP_FRAME_WIDTH, 320)
highgui.cvSetCaptureProperty(capture, highgui.CV_CAP_PROP_FRAME_HEIGHT, 240)
contrast = highgui.cvGetCaptureProperty(capture, highgui.CV_CAP_PROP_CONTRAST)
print contrast
else:
# we have an argument on the command line,
# we can assume this is a file name, so open it
capture = highgui.cvCreateFileCapture (sys.argv [1])
# check that capture device is OK
print "opening camera"
if not capture:
print "Error opening capture device"
sys.exit (1)
# capture the 1st frame to get some propertie on it
frame = highgui.cvQueryFrame (capture)
# get size of the frame
frame_size = cv.cvGetSize (frame)
print "frame_size = ", frame_size
# get the frame rate of the capture device
fps = highgui.cvGetCaptureProperty (capture, highgui.CV_CAP_PROP_FPS)
if fps <= 0:
# no fps reported, so set it to 30 by default
fps = 30
print "frames per sec = ", fps
# create the writer
#writer = highgui.cvCreateVideoWriter ("captured.mpg", MPEG1VIDEO,
# fps, frame_size, True)
# check the writer is OK
#if not writer:
# print "Error opening writer"
# sys.exit (1)
print "starting loop"
while 1:
# do forever
# 1. capture the current image
frame = highgui.cvQueryFrame (capture)
frame = highgui.cvRetrieveFrame (capture)
#frame = highgui.cvGrabFrame (capture)
#img = highgui.cvRetrieveFrame (capture)
#cvGrabFrame(capture); // capture a frame
#img=cvRetrieveFrame(capture); // retrieve the captured frame
#cvWriteFrame(writer,img); // add the frame to the file
highgui.cvShowImage ('Camera', frame)
# handle events
#k = highgui.cvWaitKey ()
#print
for event in pygame.event.get():
if event.type == pygame.QUIT:
#highgui.cvReleaseVideoWriter (writer) # writer creation above is commented out
highgui.cvDestroyAllWindows()
highgui.cvReleaseCapture (capture)
pygame.quit()
#sys.exit()
break
keyinput = pygame.key.get_pressed()
#print "key pressed: ", keyinput
if keyinput[K_SPACE]:
#frame = highgui.cvQueryFrame (capture)
#frame = highgui.cvGrabFrame (capture)
img = highgui.cvQueryFrame (capture)
img = highgui.cvRetrieveFrame (capture)
if frame is None:
# no image captured... end the processing
break
# display the frames to have a visual output
highgui.cvShowImage ('Camera', img)
# write the frame to the output file
#highgui.cvWriteFrame (writer, img)
if keyinput[K_ESCAPE]:
# user has press the ESC key, so exit
#highgui.cvReleaseVideoWriter (writer)
highgui.cvDestroyAllWindows()
highgui.cvReleaseCapture (capture)
pygame.quit()
#sys.exit ()
break
if keyinput[K_s]:
highgui.cvSaveImage("snapshot.BMP", img)
if keyinput[K_b]:
img = highgui.cvQueryFrame (capture)
PILimg = opencv.adaptors.Ipl2PIL(img)
PILimg = PILimg.filter(ImageFilter.BLUR)
opencvimg = opencv.adaptors.PIL2Ipl(PILimg)
highgui.cvShowImage ('Canny', opencvimg)
if keyinput[K_c]:
img = highgui.cvQueryFrame (capture)
PILimg = opencv.adaptors.Ipl2PIL(img)
PILimg = PILimg.filter(ImageFilter.CONTOUR)
opencvimg = opencv.adaptors.PIL2Ipl(PILimg)
highgui.cvShowImage ('Canny', opencvimg)
if keyinput[K_t]:
PILimg = opencv.adaptors.Ipl2PIL(img)
PILstats =ImageStat.Stat(PILimg)
print "stat.extrema = ", PILstats.extrema
print "stat.count = ", PILstats.count
print "stat.sum = ", PILstats.sum
print "stat.sum2 = ", PILstats.sum2
print "stat.mean = ", PILstats.mean
print "stat.median = ", PILstats.median
print "stat.rms = ", PILstats.rms
print "stat.stddev = ", PILstats.stddev
if keyinput[K_e]:
img = highgui.cvQueryFrame (capture)
PILimg = opencv.adaptors.Ipl2PIL(img)
PILimg = PILimg.filter(ImageFilter.FIND_EDGES)
opencvimg = opencv.adaptors.PIL2Ipl(PILimg)
highgui.cvShowImage ('Canny', opencvimg)
if keyinput[K_m]:
img = highgui.cvQueryFrame (capture)
PILimg = opencv.adaptors.Ipl2PIL(img)
PILimg = PILimg.filter(ImageFilter.MaxFilter(5))
opencvimg = opencv.adaptors.PIL2Ipl(PILimg)
highgui.cvShowImage ('Canny', opencvimg)
if keyinput[K_1]:
contrast += 0.1
#highgui.cvSetCaptureProperty(capture, highgui.CV_CAP_PROP_CONTRAST, contrast)
print "contrast =", contrast
#print highgui.cvGetCaptureProperty(capture, highgui.CV_CAP_PROP_CONTRAST)
PILimg = opencv.adaptors.Ipl2PIL(img)
PILimg = ImageEnhance.Brightness(PILimg).enhance(contrast)
#convert PIL image Ipl image
opencvimg = opencv.adaptors.PIL2Ipl(PILimg)
highgui.cvShowImage ('Canny', opencvimg)
if keyinput[K_p]:
# convert it to PIL image
#im = PIL.Image.frombuffer('RGB', (bmpinfo['bmWidth'], bmpinfo['bmHeight']), bmpstr, 'raw', 'BGRX', 0, 1)
# convert it to IPL image
#iplimage = adaptors.PIL2Ipl(im)
#if (sys.argv[2] > 0):
# segments = int(sys.argv[2])
#else:
#for now just save as a BMP on
#highgui.cvSaveImage("snapshot.BMP", img)
#PILimg = Image.open("snapshot.BMP")
#convert Ipl image to PIL image
PILimg = opencv.adaptors.Ipl2PIL(img)
segments = 8
x, y = PILimg.size
print "img size = ", x, y
xsegs = x / segments
ysegs = y / segments
#print x, y
for yy in range(0, y, ysegs):
for xx in range(0,x,xsegs):
#j = raw_input("press any key")
#print "xx, yy =", xx, yy, xx+xsegs, yy+ysegs
box = (xx, yy, xx+xsegs, yy+ysegs)
cell = PILimg.crop(box)
CellPixels = list(cell.getdata())
rgbpixel = 0
rgbtotal = 0
rtotal = 0
gtotal =0
btotal =0
I3Total = 0
TotalPixels = 0
for pixel in CellPixels:
#for rgb in CellPixels[TotalPixels]:
#print "pixel RGB = ", pixel
r = pixel[0]
rtotal += r
g = pixel[1]
gtotal += g
b = pixel[2]
btotal += b
#print "rgb = ", r, g, b
rgbpixel = r+g+b
#i1 = (r+g+b) / 3
#i2 = (r-b)
i3 = ((2*g) - r - b)/2
#print "i1 = ", i1
#print "i2 = ", i2
#print "i3 = ", i3
I3Total += i3
rgbtotal += rgbpixel/3
TotalPixels += 1
I3Avg = I3Total / TotalPixels
rgbtotalavg = rgbtotal / TotalPixels
ravg = rtotal / TotalPixels
gavg = gtotal / TotalPixels
bavg = btotal / TotalPixels
#print "TotalPixels = ", TotalPixels
#print "I3Ttotal = ", I3Total
#print "I3Avg = ", I3Avg
#print "rgbtotalavg = ", rgbtotalavg
#print "ravg = ", ravg
#print "gavg = ", gavg
#print "bavg = ", bavg
if gavg > ravg and gavg > bavg:
ravg = 0
gavg = 200
bavg =0
if ravg > bavg and ravg > gavg:
ravg = 200
gavg = 0
bavg =0
#cell.save("i.jpg", "JPEG")
#cell.paste((i1,i1,i1), box)
#cell.save("i1.jpg", "JPEG")
#cell.paste((i2,i2,i2), box)
#cell.save("i2.jpg", "JPEG")
#cell.paste((I3Avg,I3Avg,I3Avg), (0,0,xsegs,ysegs))
#cell.save("i3.jpg", "JPEG")
#cell.paste((ravg ,gavg ,bavg ), (0,0,xsegs,ysegs))
#cell.save("i4.jpg", "JPEG")
PILimg.paste((ravg ,gavg ,bavg ), (xx, yy, xx+xsegs, yy+ysegs))
PILimg.save("Grass8_Postprocess.jpg", "JPEG")
#PILimg.show()
#print "trying to show pic"
#highgui.cvShowImage ('Camera', PILimg)
#convert PIL image Ipl image
opencvimg = opencv.adaptors.PIL2Ipl(PILimg)
print "processed"
camshot = pygame.image.frombuffer(PILimg.tostring(), res, "RGB")
screen.blit(camshot, (0,0))
#opencvimg = highgui.cvLoadImage("Grass8_Postprocess.jpg")
highgui.cvShowImage ('Color Segmentation', opencvimg)
# handle events
#print "waiting on key press"
#k = highgui.cvWaitKey ()
#xsegs = x / segments
#ysegs = y / segments
#disp("B:" + str(brightness), (10,16))
disp("space: snapshot", (10,20))
disp("s: save shot", (10,30))
disp("t: stats", (10,40))
#print "hello"
pygame.time.wait(10)
pygame.display.flip()
# end working with the writer
# not working at this time... Need to implement some typemaps...
# but exiting without calling it is OK in this simple application
#highgui.cvReleaseVideoWriter (writer)
<file_sep>/gui/breezy/imagedemo.py
"""
File: imagedemo.py
Author: <NAME>
"""
#!/usr/bin/env python
import sys
sys.path.append( "../../lib/" )
import time
import mobot_video_consume
from img_processing_tools import *
import Image, ImageTk
from breezypythongui import EasyFrame
from Tkinter import PhotoImage
#from tkFont.Font import Font
class ImageDemo(EasyFrame):
"""Displays an image and a caption."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Image Demo")
self.setResizable(False)
self.imageLabel = self.addLabel(text = "",
row = 0, column = 0,
sticky = "NSEW")
textLabel = self.addLabel(text = "Smokey the cat",
row = 1, column = 0,
sticky = "NSEW")
#self.update_display()
# Set the font and color of the caption.
#font = Font(family = "Verdana", size = 20, slant = "italic")
#textLabel["font"] = font
textLabel["foreground"] = "blue"
def update_display(self):
# Load the image and associate it with the image label.
self.image = PhotoImage(file = "smokey.gif")
self.imageLabel["image"] = self.image
#self.image = PhotoImage(self.frame)
#camera = mobot_video_consume.consume_video('video.0', 'localhost')
#while True:
#time.sleep(0.01)
#print camera.frame
#cv2.imshow('Video', camera.frame)
#cv.WaitKey(10)
#self.frame = ImageTk.PhotoImage(array2image(camera.frame))
#img = Image.open(camera.frame)
#self.imageLabel["image"] = self.frame
print "hi", self.frame
#self.imageLabel.pack()
#self.imageLabel.config(image=self.frame)
#self.main_gui.after(100, update_display)
self.after(5, self.update_display)
if __name__ == "__main__":
ImageDemo().mainloop()
<file_sep>/bluetooth/bluetoothclient.py
# file: rfcomm-client.py
# original code auth: <NAME> <<EMAIL>>
# modified multithreaded client/server auth: <NAME> <<EMAIL>>
# desc: simple demonstration of a client application that uses RFCOMM sockets
# intended for use with rfcomm-server
#
# $Id: rfcomm-client.py,v 1.3 2006/02/24 19:42:34 albert Exp $
from bluetooth import *
from select import *
import sys
import threading
import thread
import time
class ClientThreadRecv(threading.Thread):
def __init__(self, sock):
self.sock = sock;
threading.Thread.__init__(self);
def run(self):
#try:
while True:
for s in can_rd:
threadLock.acquire(0);
try:
data = self.sock.recv(1024)
print "Server sends:", " [%s]" % data
finally:
if threadLock.locked()==True:
threadLock.release();
class ClientThreadSend(threading.Thread):
def __init__(self, sock):
self.sock = sock;
threading.Thread.__init__(self);
def run(self):
while True:
for s1 in can_wr:
threadLock.acquire(0);
try:
data = time.ctime();
if len(data) == 0: break
print "Client is sending: "+data;
self.sock.send(data)
data="";
finally:
if threadLock.locked()==True:
threadLock.release();
time.sleep(5);
###########################################################
addr = None
if len(sys.argv) < 2:
print "no device specified. Searching all nearby bluetooth devices for"
print "the bt serve"
else:
addr = sys.argv[1]
print "Searching for bt serve on %s" % addr
# search for the FooBar service
uuid = "94f39d29-7d6d-437d-973b-fba39e49d4ee"
service_matches = find_service( uuid = uuid, address = addr )
if len(service_matches) == 0:
print "couldn't find the FooBar service =("
sys.exit(0)
first_match = service_matches[0]
port = first_match["port"]
name = first_match["name"]
host = first_match["host"]
print "connecting to \"%s\" on %s" % (name, host)
# Create the client socket
sock=BluetoothSocket( RFCOMM )
sock.connect((host, port))
sock.setblocking(True);
print "Connected:"
global can_rd
global can_wr
global has_exc
can_rd = 0
print "can_rd %s" % can_rd
global threadLock
threadLock=threading.Lock();
print "Starting Receiving Thread";
ClientThreadRecv(sock).start();
print "Starting Sending Thread (a time stamp is sent at 5 second intervals to the server)";
ClientThreadSend(sock).start();
print "Setup Complete, Client is running..."
while True:
can_rd, can_wr, has_exc = select( [sock], [sock], [], 2 )
sock.close();
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/unsupervised/pdist.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
__all__ = [
'pdist',
'plike',
]
def pdist(X, Y=None, distance='euclidean2'):
'''
D = pdist(X, Y={X}, distance='euclidean2')
Compute distance matrix::
D[i,j] == np.sum( (X[i] - Y[j])**2 )
Parameters
----------
X : feature matrix
Y : feature matrix (default: use `X`)
distance : one of 'euclidean' or 'euclidean2' (default)
Returns
-------
D : matrix of doubles
'''
# Use Dij = np.dot(Xi, Xi) + np.dot(Xj,Xj) - 2.*np.dot(Xi,Xj)
if Y is None:
D = np.dot(X, X.T)
x2 = D.diagonal()
y2 = x2
else:
D = np.dot(X, Y.T)
x2 = np.array([np.dot(x,x) for x in X])
y2 = np.array([np.dot(y,y) for y in Y])
D *= -2.
D += x2[:,np.newaxis]
D += y2
# Because of numerical imprecision, we might get negative numbers
# (which cause problems down the road, e.g., when doing the sqrt):
np.maximum(D, 0, D)
if distance == 'euclidean':
np.sqrt(D, D)
return D
def plike(X, sigma2=None):
'''
L = plike(X, sigma2={guess based on X})
Compute likelihood that any two objects come from the same distribution
under a Gaussian distribution hypothesis::
L[i,j] = exp( ||X[i] - X[j]||^2 / sigma2 )
Parameters
----------
X : ndarray
feature matrix
sigma2 : float, optional
bandwidth
Returns
-------
L : ndarray
likelihood matrix
See Also
--------
pdist : function
Compute distances between objects
'''
L = pdist(X)
if sigma2 is None:
sigma2 = np.median(L)
L /= -sigma2
np.exp(L, L)
return L
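if __name__ == '__main__':
    # Minimal usage sketch on synthetic data (illustrative only, not part of the
    # original milk distribution).
    X = np.random.random((5, 3))
    D = pdist(X)   # 5x5 matrix of squared euclidean distances
    L = plike(X)   # pairwise Gaussian similarities in (0, 1]
    print D.shape, L.shape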
<file_sep>/powertrain/relay/robomow_relay_class.py~
#!/usr/bin/env python
import serial
import time
from struct import *
def log(*msgline):
for msg in msgline:
print msg,
print
class robomow_relay(object):
def __init__(self,com_port="/dev/ttyUSB0"):
############################
# lets introduce and init the main variables
self.com = None
self.isInitialized = False
self.commands = {
'relay_1_on': 0x65,
'relay_1_off': 0x6F,
'relay_2_on': 0x66,
'relay_2_off': 0x70,
'info': 0x5A,
'relay_states': 0x5B,
}
###########################
# lets connect the TTL Port
try:
self.com = serial.Serial(com_port, 9600, timeout=1)
self.com.close()
self.com.open()
self.isInitialized = True
self.relay1_state = 0
self.relay2_state = 0
log ("Link to relay driver -", com_port, "- successful")
#log (self.com.portstr, self.com.baudrate,self.com.bytesize,self.com.parity,self.com.stopbits)
except serial.serialutil.SerialException, e:
print e
log("Link to relay driver -", com_port, "- failed")
def __del__(self):
del(self)
def stats(self):
return (self.com.portstr, self.com.baudrate,self.com.bytesize,self.com.parity,self.com.stopbits)
def send_command(self, cmd, read_response = False):
#ser = serial.Serial('/dev/ttyACM0', 9600)
#print "command", cmd
self.com.write(chr(cmd)+'\n')
#print "raw:", self.com.read()
response = read_response and self.com.read() or None
#print "response", response
#self.com.close()
return response
def click_relay_1(self):
print "indside clicking"
self.send_command(self.commands['relay_1_on'])
time.sleep(1)
self.send_command(self.commands['relay_1_off'])
def turn_relay_1_on(self):
self.send_command(self.commands['relay_1_on'])
def turn_relay_1_off(self):
self.send_command(self.commands['relay_1_off'])
def turn_relay_2_on(self):
self.send_command(self.commands['relay_2_on'])
def turn_relay_2_off(self):
self.send_command(self.commands['relay_2_off'])
def get_relay_states(self):
states = self.send_command(self.commands['relay_states'], read_response=True)
response = unpack('b',states)[0]
states = {
0: {'1': False, '2': False},
1: {'1': True, '2': False},
2: {'1': False, '2': True},
3: {'1': True, '2': True},
}
if response == 0:
self.relay1_state = 0
self.relay2_state = 0
if response == 1:
self.relay1_state = 1
self.relay2_state = 0
if response == 2:
self.relay1_state = 0
self.relay2_state = 1
if response == 3:
self.relay1_state = 1
self.relay2_state = 1
return states[response]
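if __name__ == '__main__':
    # Minimal usage sketch; assumes the relay board answers on /dev/ttyUSB0
    # (adjust the port for your hardware).
    relay = robomow_relay("/dev/ttyUSB0")
    if relay.isInitialized:
        relay.click_relay_1()            # pulse relay 1 for one second
        print relay.get_relay_states()   # e.g. {'1': False, '2': False}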
<file_sep>/camera/canny.py
from opencv.cv import *
from opencv.highgui import *
def DoCanny(img, lowThresh, highThresh, aperature):
gray = cvCreateImage(cvSize(cvGetSize(img).width, cvGetSize(img).height), IPL_DEPTH_8U, 1)
cvCvtColor(img,gray,CV_BGR2GRAY)
if (gray.nChannels != 1):
return False
out = cvCreateImage(cvSize(cvGetSize(gray).width, cvGetSize(gray).height), IPL_DEPTH_8U, 1)
cvCanny(gray, out, lowThresh, highThresh, aperature)
return out
if __name__ == '__main__':
cvNamedWindow("Example5-Canny", CV_WINDOW_AUTOSIZE)
cvNamedWindow("Example5", CV_WINDOW_AUTOSIZE)
g_capture = cvCreateFileCapture('C:\python26\sample.avi')
frames = long(cvGetCaptureProperty(g_capture, CV_CAP_PROP_FRAME_COUNT))
loop = True
while(loop):
frame = cvQueryFrame(g_capture)
if (frame == None):
break
cvShowImage("Example5", frame)
outCan = DoCanny(frame, 70.0, 140.0, 3)
cvShowImage("Example5-Canny", outCan)
char = cvWaitKey(0)
if (char != -1):
if (ord(char) == 27):
loop = False
cvDestroyWindow("Example5")
cvDestroyWindow("Example5-Canny")<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_perceptron.py
import numpy as np
from milk.supervised.perceptron import perceptron_learner
from milk.supervised import _perceptron
from milksets.yeast import load
def test_raw():
np.random.seed(23)
data = np.random.random((100,10))
data[50:] += .5
labels = np.repeat((0,1), 50)
weights = np.zeros((11))
eta = 0.1
for i in xrange(20):
_perceptron.perceptron(data, labels, weights, eta)
errs = _perceptron.perceptron(data, labels, weights, eta)
assert errs < 10
def test_wrapper():
features,labels = load()
labels = (labels >= 5)
learner = perceptron_learner()
model = learner.train(features, labels)
test = map(model.apply, features)
assert np.mean(labels != test) < .35
<file_sep>/lib/identify_device_on_ttyport.py
import glob
import os
import re
import usb.core
#lsusb to find vendorID and productID:
#Bus 002 Device 005: ID 067b:2303 Prolific Technology, Inc. PL2303 Serial Port
# then call print find_usb_tty("067b","2303")
def find_usb_tty(vendor_id = None, product_id = None) :
tty_devs = []
vendor_id = int(vendor_id, 16)
product_id = int(product_id , 16)
#print "vendor_id, product_id", vendor_id, product_id
for dn in glob.glob('/sys/bus/usb/devices/*') :
try :
vid = int(open(os.path.join(dn, "idVendor" )).read().strip(), 16)
pid = int(open(os.path.join(dn, "idProduct")).read().strip(), 16)
#print vid, pid
if ((vendor_id is None) or (vid == vendor_id)) and ((product_id is None) or (pid == product_id)) :
dns = glob.glob(os.path.join(dn, os.path.basename(dn) + "*"))
for sdn in dns :
for fn in glob.glob(os.path.join(sdn, "*")) :
print fn
if re.search(r"\/ttyUSB[0-9]+$", fn) :
tty_devs.append(os.path.join("/dev", os.path.basename(fn)))
if re.search(r"\/video[0-9]+$", fn) :
tty_devs.append(os.path.join("/dev", os.path.basename(fn)))
if re.search(r"\/ttyACM[0-9]+$", fn) :
#tty_devs.append("/dev" + os.path.basename(fn))
tty_devs.append(os.path.join("/dev", os.path.basename(fn)))
pass
pass
pass
pass
except ( ValueError, TypeError, AttributeError, OSError, IOError ) :
pass
pass
return tty_devs
'''
import os
from os.path import join
def find_tty_usb(idVendor, idProduct):
"""find_tty_usb('067b', '2302') -> '/dev/ttyUSB0'"""
# Note: if searching for a lot of pairs, it would be much faster to search
# for the enitre lot at once instead of going over all the usb devices
# each time.
for dnbase in os.listdir('/sys/bus/usb/devices'):
dn = join('/sys/bus/usb/devices', dnbase)
if not os.path.exists(join(dn, 'idVendor')):
continue
idv = open(join(dn, 'idVendor')).read().strip()
if idv != idVendor:
continue
idp = open(join(dn, 'idProduct')).read().strip()
if idp != idProduct:
continue
#print dnbase, dn #, idv, idp
for subdir in os.listdir(dn):
print subdir
if subdir.startswith(dnbase+':'):
for subsubdir in os.listdir(join(dn, subdir)):
if subsubdir.startswith('video'):
return join('/dev', subsubdir)
'''
if __name__== "__main__":
print find_usb_tty('10c4','ea60' )
#dev = usb.core.find(idVendor=0x2341, idProduct=0x0043)
import usb.core
import usb.util
import sys
# find our device
dev = usb.core.find(idVendor=0x10c4, idProduct=0xea60)
# was it found?
if dev is None:
raise ValueError('Device not found')
for cfg in dev:
sys.stdout.write(str(cfg.bConfigurationValue) + '\n')
for intf in cfg:
sys.stdout.write('\t' + str(intf.bInterfaceNumber) +',' + str(intf.bAlternateSetting) + '\n')
for ep in intf:
sys.stdout.write('\t\t' + str(ep.bEndpointAddress) +'\n')
'''
# set the active configuration. With no arguments, the first
# configuration will be the active one
dev.set_configuration()
# get an endpoint instance
cfg = dev.get_active_configuration()
interface_number = cfg[(0,0)].bInterfaceNumber
alternate_settting = usb.control.get_interface(interface_number)
intf = usb.util.find_descriptor(
cfg, bInterfaceNumber = interface_number,
bAlternateSetting = alternate_setting
)
ep = usb.util.find_descriptor(
intf,
# match the first OUT endpoint
custom_match = \
lambda e: \
usb.util.endpoint_direction(e.bEndpointAddress) == \
usb.util.ENDPOINT_OUT
)
assert ep is not None
# write the data
ep.write('test')
'''
<file_sep>/cam-histo.py
#! /usr/bin/env python
import sys
import functools
# import the necessary things for OpenCV
from opencv import cv
from opencv import highgui
#############################################################################
# definition of some constants
# how many bins we want for the histogram, and their ranges
hdims = 16
hranges = [[0, 180]]
# ranges for the limitation of the histogram
vmin = 10
vmax = 256
smin = 30
# the range we want to monitor
hsv_min = cv.cvScalar (0, smin, vmin, 0)
hsv_max = cv.cvScalar (180, 256, vmax, 0)
#############################################################################
# some useful functions
def hsv2rgb (hue):
# convert the hue value to the corresponding rgb value
sector_data = [[0, 2, 1],
[1, 2, 0],
[1, 0, 2],
[2, 0, 1],
[2, 1, 0],
[0, 1, 2]]
hue *= 0.1 / 3
sector = cv.cvFloor (hue)
p = cv.cvRound (255 * (hue - sector))
if sector & 1:
p ^= 255
rgb = {}
rgb [sector_data [sector][0]] = 255
rgb [sector_data [sector][1]] = 0
rgb [sector_data [sector][2]] = p
return cv.cvScalar (rgb [2], rgb [1], rgb [0], 0)
def on_mouse( event, x, y, flags, param = [] ):
global mouse_selection
global mouse_origin
global mouse_select_object
if event == highgui.CV_EVENT_LBUTTONDOWN:
print("Mouse down at (%i, %i)" % (x,y))
mouse_origin = cv.cvPoint(x,y)
mouse_selection = cv.cvRect(x,y,0,0)
mouse_select_object = True
return
if event == highgui.CV_EVENT_LBUTTONUP:
print("Mouse up at (%i,%i)" % (x,y))
mouse_select_object = False
if( mouse_selection.width > 0 and mouse_selection.height > 0 ):
global track_object
track_object = -1
return
if mouse_select_object:
mouse_selection.x = min(x,mouse_origin.x)
mouse_selection.y = min(y,mouse_origin.y)
mouse_selection.width = mouse_selection.x + cv.CV_IABS(x - mouse_origin.x)
mouse_selection.height = mouse_selection.y + cv.CV_IABS(y - mouse_origin.y)
mouse_selection.x = max( mouse_selection.x, 0 )
mouse_selection.y = max( mouse_selection.y, 0 )
mouse_selection.width = min( mouse_selection.width, frame.width )
mouse_selection.height = min( mouse_selection.height, frame.height )
mouse_selection.width -= mouse_selection.x
mouse_selection.height -= mouse_selection.y
#############################################################################
# so, here is the main part of the program
if __name__ == '__main__':
print "OpenCV Python wrapper test"
#print "OpenCV version: %s (%d, %d, %d)" % (cv.CV_VERSION,
# cv.CV_MAJOR_VERSION,
# cv.CV_MINOR_VERSION,
# cv.CV_SUBMINOR_VERSION)
# first, create the necessary windows
highgui.cvNamedWindow ('Camera', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvNamedWindow ('Histogram', highgui.CV_WINDOW_AUTOSIZE)
# move the new window to a better place
highgui.cvMoveWindow ('Camera', 10, 40)
highgui.cvMoveWindow ('Histogram', 10, 270)
global mouse_origin
global mouse_selection
global mouse_select_object
mouse_select_object = False
global track_object
track_object = 0
global track_comp
global track_box
track_comp = cv.CvConnectedComp()
track_box = cv.CvBox2D()
highgui.cvSetMouseCallback( "Camera", on_mouse, 0 )
try:
# try to get the device number from the command line
device = int (sys.argv [1])
# got it ! so remove it from the arguments
del sys.argv [1]
except (IndexError, ValueError):
# no device number on the command line, assume we want the 1st device
device = 0
if len (sys.argv) == 1:
# no argument on the command line, try to use the camera
capture = highgui.cvCreateCameraCapture (device)
# set the wanted image size from the camera
highgui.cvSetCaptureProperty (capture,
highgui.CV_CAP_PROP_FRAME_WIDTH, 1600)
highgui.cvSetCaptureProperty (capture,
highgui.CV_CAP_PROP_FRAME_HEIGHT, 1200)
else:
# we have an argument on the command line,
# we can assume this is a file name, so open it
capture = highgui.cvCreateFileCapture (sys.argv [1])
# check that capture device is OK
if not capture:
print "Error opening capture device"
sys.exit (1)
# create an image to put in the histogram
histimg = cv.cvCreateImage (cv.cvSize (320,240), 8, 3)
# init the image of the histogram to black
cv.cvSetZero (histimg)
# capture the 1st frame to get some propertie on it
frame = highgui.cvQueryFrame (capture)
# get some properties of the frame
frame_size = cv.cvGetSize (frame)
# compute which selection of the frame we want to monitor
selection = cv.cvRect (0, 0, frame.width, frame.height)
# create some images useful later
hue = cv.cvCreateImage (frame_size, 8, 1)
mask = cv.cvCreateImage (frame_size, 8, 1)
hsv = cv.cvCreateImage (frame_size, 8, 3 )
backproject = cv.cvCreateImage( frame_size, 8, 1 )
# create the histogram
hist = cv.cvCreateHist ([hdims], cv.CV_HIST_ARRAY, hranges, 1)
obj_hist = cv.cvCreateHist ([hdims], cv.CV_HIST_ARRAY, hranges, 1)
while 1:
# do forever
# 1. capture the current image
frame = highgui.cvQueryFrame (capture)
if frame is None:
# no image captured... end the processing
break
# mirror the captured image
#cv.cvFlip (frame, None, 1)
# compute the hsv version of the image
cv.cvCvtColor (frame, hsv, cv.CV_BGR2HSV)
# compute which pixels are in the wanted range
cv.cvInRangeS (hsv, hsv_min, hsv_max, mask)
# extract the hue from the hsv array
cv.cvSplit (hsv, hue, None, None, None)
# select the rectangle of interest in the hue/mask arrays
hue_roi = cv.cvGetSubRect (hue, selection)
mask_roi = cv.cvGetSubRect (mask, selection)
# it's time to compute the histogram
cv.cvCalcHist (hue_roi, hist, 0, mask_roi)
# extract the min and max value of the histogram
min_val, max_val, min_idx, max_idx = cv.cvGetMinMaxHistValue (hist)
# compute the scale factor
if max_val > 0:
scale = 255. / max_val
else:
scale = 0.
# scale the histograms
cv.cvConvertScale (hist.bins, hist.bins, scale, 0)
# clear the histogram image
cv.cvSetZero (histimg)
# compute the width of each bin to display
bin_w = histimg.width / hdims
for i in range (hdims):
# for all the bins
# get the value, and scale to the size of the hist image
val = cv.cvRound (cv.cvGetReal1D (hist.bins, i)
* histimg.height / 255)
# compute the color
color = hsv2rgb (i * 180. / hdims)
# draw the rectangle in the wanted color
cv.cvRectangle (histimg,
cv.cvPoint (i * bin_w, histimg.height),
cv.cvPoint ((i + 1) * bin_w, histimg.height - val),
color, -1, 8, 0)
# Make the sweet negative selection box
if mouse_select_object and mouse_selection.width > 0 and mouse_selection.height > 0:
a = cv.cvGetSubRect(frame,mouse_selection)
cv.cvXorS(a,cv.cvScalarAll(255), a) # Take the negative of the image..
del a
# Carry out the histogram tracking...
if track_object != 0:
cv.cvInRangeS( hsv, cv.cvScalar(0,smin ,min(vmin, vmax), 0), cv.cvScalar(180, 256, max(vmin,vmax), 0), mask )
cv.cvSplit(hsv, hue, None, None, None)
if track_object < 0:
# Calculate the histogram for the mouse_selection box
hue_roi_rect = cv.cvGetSubRect( hue, mouse_selection )
mask_roi_rect = cv.cvGetSubRect( mask, mouse_selection )
cv.cvCalcHist (hue_roi_rect, obj_hist, 0, mask_roi_rect)
min_val, max_val, min_idx, max_idx = cv.cvGetMinMaxHistValue (obj_hist)
track_window = mouse_selection
track_object = 1
cv.cvCalcBackProject( hue, backproject, obj_hist )
cv.cvAnd(backproject, mask, backproject)
#niter, track_comp, track_box =
cv.cvCamShift( backproject, track_window,
cv.cvTermCriteria( cv.CV_TERMCRIT_EPS | cv.CV_TERMCRIT_ITER, 10, 1 ), track_comp, track_box)
track_window = track_comp.rect
#if backproject_mode:
# cvCvtColor( backproject, image, CV_GRAY2BGR )
if not frame.origin:
track_box.angle = -track_box.angle
cv.cvEllipseBox( frame, track_box, cv.CV_RGB(255,0,0), 3, cv.CV_AA, 0 )
# we can now display the images
highgui.cvShowImage ('Camera', frame)
highgui.cvShowImage ('Histogram', histimg)
# handle events
k = highgui.cvWaitKey (10)
if k == '\x1b':
# user has pressed the ESC key, so exit
break
highgui.cvReleaseCapture(capture)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_tree.py
import milk.supervised.tree
import milk.supervised._tree
from milk.supervised._tree import set_entropy
from milk.supervised.tree import information_gain, stump_learner
import numpy as np
def test_tree():
from milksets import wine
features, labels = wine.load()
selected = (labels < 2)
features = features[selected]
labels = labels[selected]
C = milk.supervised.tree.tree_classifier()
model = C.train(features,labels)
assert (np.array([model.apply(f) for f in features]) == labels).mean() > .5
def test_split_subsample():
import random
from milksets import wine
features, labels = wine.load()
labels = labels.astype(np.int)
seen = set()
for i in xrange(20):
random.seed(2)
i,s = milk.supervised.tree._split(features[::10], labels[::10], None, milk.supervised.tree.information_gain, 2, random)
seen.add(i)
assert len(seen) <= 2
def test_set_entropy():
labels = np.arange(101)%3
counts = np.zeros(3)
entropy = milk.supervised._tree.set_entropy(labels, counts)
slow_counts = np.array([(labels == i).sum() for i in xrange(3)])
assert np.all(counts == slow_counts)
px = slow_counts.astype(float)/ slow_counts.sum()
slow_entropy = - np.sum(px * np.log(px))
assert np.abs(slow_entropy - entropy) < 1.e-8
def slow_information_gain(labels0, labels1):
H = 0.
N = len(labels0) + len(labels1)
nlabels = 1+max(labels0.max(), labels1.max())
counts = np.empty(nlabels, np.double)
for arg in (labels0, labels1):
H -= len(arg)/float(N) * set_entropy(arg, counts)
return H
def test_information_gain():
np.random.seed(22)
for i in xrange(8):
labels0 = (np.random.randn(20) > .2).astype(int)
labels1 = (np.random.randn(33) > .8).astype(int)
fast = information_gain(labels0, labels1)
slow = slow_information_gain(labels0, labels1)
assert np.abs(fast - slow) < 1.e-8
def test_information_gain_small():
labels1 = np.array([0])
labels0 = np.array([0, 1])
assert information_gain(labels0, labels1) < 0.
def test_z1_loss():
from milk.supervised.tree import z1_loss
L0 = np.zeros(10)
L1 = np.ones(10)
L1[3] = 0
W0 = np.ones(10)
W1 = np.ones(10)
assert z1_loss(L0, L1) == z1_loss(L0, L1, W0, W1)
assert z1_loss(L0, L1) != z1_loss(L0, L1, W0, .8*W1)
assert z1_loss(L0, L1) > 0
def test_stump_learner():
learner = stump_learner()
np.random.seed(111)
for i in xrange(8):
features = np.random.random_sample((40,2))
features[:20,0] += .5
labels = np.repeat((0,1),20)
model = learner.train(features, labels, normalisedlabels=True)
assert not model.apply([0.01,.5])
assert model.apply(np.random.random_sample(2)+.8)
assert model.idx == 0
<file_sep>/gui/breezy/radiobuttondemo1.py
"""
File: radiobuttondemo1.py
Demonstrates radio button capabilities.
"""
from breezypythongui import EasyFrame
class RadiobuttonDemo(EasyFrame):
"""When the Display button is pressed, shows the label of
the selected radio button. The button group has a default
vertical alignment."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, "Radio Button Demo")
# Add the button group
self.group = self.addRadiobuttonGroup(row = 0, column = 0, rowspan = 4)
# Add the radio buttons to the group
self.group.addRadiobutton("Freshman")
self.group.addRadiobutton("Sophomore")
self.group.addRadiobutton("Junior")
defaultRB = self.group.addRadiobutton("Senior")
# Select one of the buttons in the group
self.group.setSelectedButton(defaultRB)
# Output field and command button for the results
self.output = self.addTextField("", row = 0, column = 1)
self.addButton("Display", row = 1, column = 1,
command = self.displayButton)
# Event handler method
def displayButton(self):
"""Display the label of the selected button."""
selectedButton = self.group.getSelectedButton()
self.output.setText(selectedButton["text"])
# Instantiate and pop up the window.
RadiobuttonDemo().mainloop()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/defaultlearner.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from milk.supervised.base import supervised_model
__all__ = [
'defaultlearner',
'svm_simple',
'feature_selection_simple',
]
class default_learner(object):
def __init__(self, preproc, classifier, multi_adapter):
self.preproc = preproc
self.classifier = classifier
self.multi_adapter = multi_adapter
def train(self, features, labels):
model = self.preproc.train(features, labels)
nfeatures = model.apply_many(features)
classifier = self.classifier
if len(set(labels)) > 2:
classifier = self.multi_adapter(classifier)
cmodel = classifier.train(nfeatures, labels)
return default_model(model, cmodel)
class default_model(supervised_model):
def __init__(self, preproc, classify):
self.preproc = preproc
self.classify = classify
def apply(self, features):
features = self.preproc.apply(features)
return self.classify.apply(features)
def defaultlearner(mode='medium', multi_strategy='1-vs-1', expanded=False):
'''
learner = defaultlearner(mode='medium')
Return the default classifier learner
This is an SVM based classifier using the 1-vs-1 technique for multi-class
problems (by default, see the ``multi_strategy`` parameter). The features
will be first cleaned up (normalised to [-1, +1]) and go through SDA
feature selection.
Parameters
-----------
mode : string, optional
One of ('fast','medium','slow', 'really-slow'). This defines the speed
accuracy trade-off. It essentially defines how large the SVM parameter
range is.
multi_strategy : str, optional
One of ('1-vs-1', '1-vs-rest', 'ecoc'). This defines the strategy used
to convert the base binary classifier to a multi-class classifier.
expanded : boolean, optional
If true, then instead of a single learner, it returns a list of
possible learners.
Returns
-------
learner : classifier learner object or list
If `expanded`, then it returns a list
See Also
--------
feature_selection_simple : Just perform the feature selection
svm_simple : Perform classification
'''
# These cannot be imported at module scope!
# The reason is that they introduce a dependency loop:
# gridsearch depends on nfoldcrossvalidation
# nfoldcrossvalidation depends on defaultlearner
# which cannot depend on gridsearch
#
# Importing at function level keeps all these issues at bay
#
from .classifier import ctransforms
from .gridsearch import gridsearch
from . import svm
from .normalise import chkfinite, interval_normalise
from .featureselection import sda_filter, featureselector, linear_independent_features
from .multi import one_against_one, one_against_rest, ecoc_learner
assert mode in ('really-slow', 'slow', 'medium', 'fast'), \
"milk.supervised.defaultlearner: mode must be one of 'fast','slow','medium'."
if multi_strategy == '1-vs-1':
multi_adaptor = one_against_one
elif multi_strategy == '1-vs-rest':
multi_adaptor = one_against_rest
elif multi_strategy == 'ecoc':
multi_adaptor = ecoc_learner
else:
raise ValueError('milk.supervised.defaultlearner: Unknown value for multi_strategy: %s' % multi_strategy)
if mode == 'fast':
c_range = np.arange(-2,4)
sigma_range = np.arange(-2,3)
elif mode == 'medium':
c_range = np.arange(-2,4)
sigma_range = np.arange(-4,4)
elif mode == 'really-slow':
c_range = np.arange(-4,10)
sigma_range = np.arange(-7,7)
else: # mode == 'slow'
c_range = np.arange(-9,5)
sigma_range = np.arange(-7,4)
kernels = [svm.rbf_kernel(2.**i) for i in sigma_range]
Cs = 2.**c_range
if expanded:
return [ctransforms(feature_selection_simple(),
multi_adaptor(svm.svm_to_binary(svm.svm_raw(C=C, kernel=kernel))))
for C in Cs for kernel in kernels]
return ctransforms(feature_selection_simple(),
gridsearch(multi_adaptor(svm.svm_to_binary(svm.svm_raw())),
params={ 'C': Cs, 'kernel': kernels, }))
def feature_selection_simple():
'''
selector = feature_selection_simple()
Standard feature normalisation and selection
This fills in NaNs and Infs (to 0 and large numbers), normalises features
to [-1, +1] and uses SDA for feature selection.
Returns
-------
selector : supervised learner
See Also
--------
defaultlearner : perform feature selection *and* classification
'''
from .classifier import ctransforms
from .normalise import chkfinite, interval_normalise
from .featureselection import sda_filter, featureselector, linear_independent_features
return ctransforms(
chkfinite(),
interval_normalise(),
featureselector(linear_independent_features),
sda_filter(),
)
def svm_simple(C, kernel):
'''
learner = svm_simple(C, kernel)
Returns a one-against-one SVM based classifier with `C` and `kernel`
Parameters
----------
C : double
C parameter
kernel : kernel
Kernel to use
Returns
-------
learner : supervised learner
See Also
--------
feature_selection_simple : Perform feature selection
defaultlearner : feature selection and gridsearch for SVM parameters
'''
from . import svm
from .multi import one_against_one
return one_against_one(svm.svm_to_binary(svm.svm_raw(C=C, kernel=kernel)))
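if __name__ == '__main__':
    # Minimal usage sketch on a small synthetic two-class problem (illustrative
    # only, not part of the original milk distribution).
    np.random.seed(0)
    features = np.vstack([np.random.rand(20, 4), np.random.rand(20, 4) + 1.])
    labels = np.repeat((0, 1), 20)
    learner = defaultlearner(mode='fast')
    model = learner.train(features, labels)
    print 'prediction for first sample:', model.apply(features[0])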
<file_sep>/telemetry/serverside_tcp.py
# Echo server program
import socket
import time
import math
HOST = '' # Symbolic name meaning the local host
PORT = 8095 # Arbitrary non-privileged port
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind((HOST, PORT))
#next line specifies how long in seconds to wait for a connection
#s.settimeout(5.0)
def read_line(s):
ret = ''
while True:
c = s.recv(1)
if c == '\n' or c == '':
break
else:
ret += c
return ret
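# The client is expected to send one comma separated telemetry line per read;
# field index 11 is assumed to carry a compass heading in radians, which is
# converted to degrees in [0, 360) further down.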
print "listening..."
s.listen(1)
print "made connection..."
try:
print "waiting to accept.."
conn, addr = s.accept()
print "accepted connection from client.."
while conn <> "":
s.listen(1)
#print time.time()
#print s.gettimeout()
print 'Connected by', addr
#data = conn.recv(1024)
data = read_line(conn)
data1 = data.split(',')
#if not data: break
#print 'Received from remote: ', data
if len(data1) > 0 : print "data1:", data1
#conn.send("ACK")
try:
compass = math.degrees(float(data1[11]))
if compass < 0: compass = compass + 360
print "compass:", round(compass)
except:
pass
#time.sleep(.1)
except IOError as detail:
print "connection lost", detail
try:
print "closing Socket"
s.close()
except NameError as detail:
print "No socket to close", detail
<file_sep>/vidcap.py
"""
Sometimes we may have (or want) to use the pygame interface to the camera, and sometimes opencv. To avoid changing lots of code, this camera class is a wrapper around the opencv code, making it look like pygame's camera.
<NAME> - OLPC Hitlab Project 2009
"""
import opencv
import opencv.cv as cv
import opencv.highgui as hg
import conversionUtils
import pygame
from pygame.locals import QUIT, K_ESCAPE, KEYDOWN
from pygame import surfarray
opencv_camera = True
class Camera():
"""
Simulate the pygame camera class using opencv for capturing.
Takes one additional parameter than a pygame.camera.Camera object
the imageType which is the return type required for any image requests.
imagetype can be set as:
'opencv' - this returns an opencv.CvMat object
'pygame' - returns a pygame surface, if surface is passed in - it will try blit directly to it.
"""
def __init__(self, device, size, mode, imageType='opencv'):
self.imageType = imageType
self.size = self.width, self.height = size
self.device = device
# todo: would be nice if this didn't make a whole lot of noise about firewire...
self.capture = hg.cvCreateCameraCapture(self.device)
# set the wanted image size from the camera
hg.cvSetCaptureProperty (self.capture, hg.CV_CAP_PROP_FRAME_WIDTH, self.width)
hg.cvSetCaptureProperty (self.capture, hg.CV_CAP_PROP_FRAME_HEIGHT, self.height)
def get_image(self,surface=None,*argv,**argd):
"""
Since we are using opencv there are some differences here.
This function will return a cvmat by default (the quickest route from camera to your code)
Or if pygame was specified a 3dpixel array
- this can be blitted directly to a pygame surface
- or can be converted to a surface array and returned to your pygame code.
"""
if self.imageType == "pygame":
try:
surfarray.blit_array(surface,conversionUtils.cv2SurfArray(hg.cvQueryFrame(self.capture)))
return surface
except:
return surfarray.make_surface(conversionUtils.cv2SurfArray(hg.cvQueryFrame(self.capture))).convert()
return hg.cvQueryFrame(self.capture)
def start(self):
pass
def stop(self):
return hg.cvReleaseCapture(self.capture)
def query_image(self):
return True
def list_cameras():
#return [0] # Just use this line if errors occur
cams = []
for i in range(3):
try:
capture = hg.cvCreateCameraCapture( i ) # Must be a better way of doing this...
if capture is not None:
cams.append(i)
except Exception, e:
pass
finally:
hg.cvReleaseCapture(capture)
return cams
def init():
"""
Work out at this point what we will use...
"""
pass
def opencvSnap(dev,size):
"""
An example use of the "camera" taking a single picture frame using opencv's cvMat as the return method.
"""
# First lets take a picture using opencv, and display it using opencv...
cvWin = hg.cvNamedWindow( "Opencv Rendering and Capture", 0 )
print("Opening device %s, with video size (%s,%s)" % (dev,size[0],size[1]))
# creates the camera of the specified size and in RGB colorspace
cam = Camera(dev, size, "RGB")
a = cam.get_image()
hg.cvShowImage ('Opencv Rendering and Capture', a)
# close the capture stream to avoid problems later, should see the camera turn off
hg.cvReleaseCapture(cam.capture)
del cam
# Wait for any key then clean up
print("Press any key to continue")
k = hg.cvWaitKey()
hg.cvDestroyWindow("Opencv Rendering and Capture")
def pygameSnap(dev,size):
"""
Another use, taking a single from the "camera" (ie using opencv)
and return the image as a pygame surface. Then we display it in pygame.
"""
print("Opening device %s, with video size (%s,%s), in pygame mode (returns pygame surface)" % (dev,size[0],size[1]))
# creates the camera of the specified size in pygame mode
cam = Camera(dev, size, "RGB", imageType='pygame')
display = pygame.display.set_mode( size, 0 )
pygame.display.set_caption("Pygame Render, Opencv Capture")
snapshot = cam.get_image()
display.blit(snapshot, (0,0))
cam.stop()
del cam # Turn off the camera
print("Press escape to continue")
while(True):
pygame.display.flip()
e = pygame.event.poll()
if e.type == QUIT or e.type == KEYDOWN and e.key == K_ESCAPE:
break
pygame.quit()
def pygameSnap2(dev,size):
"""
This time we get the image blitted directly to an existing surface.
"""
print("Opening device %s, with video size (%s,%s), in pygame mode (returns pygame surface)" % (dev,size[0],size[1]))
# creates the camera of the specified size in pygame mode
cam = Camera(dev, size, "RGB", imageType='pygame')
display = pygame.display.set_mode( size, 0 )
pygame.display.set_caption("Pygame Render, Opencv Capture, Direct Blit")
snapshot = cam.get_image() # We will get a first frame to use as our surface to direct blit to.
snapshot = cam.get_image(snapshot) # This time it should have blitted direct to snapshot
display.blit(snapshot, (0,0))
cam.stop()
del cam # Turn off the camera
print("Press escape to continue")
while(True):
pygame.display.flip()
e = pygame.event.poll()
if e.type == QUIT or e.type == KEYDOWN and e.key == K_ESCAPE:
break
pygame.quit()
if __name__ == "__main__":
init()
print("""
Running some opencv camera tests. You should see 3 separate images, one after the other.
Press any key after each image.
""")
clist = list_cameras()
if not clist:
raise ValueError("Sorry, no cameras detected.")
dev = clist[0]
size = (640,480)
opencvSnap(dev,size)
pygameSnap(dev,size)
pygameSnap2(dev,size)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/normalise.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy as np
__all__ = [
'center',
'zscore',
]
def _nanmean(arr, axis=None):
nancounts = np.sum(~np.isnan(arr), axis=axis)
return np.nansum(arr,axis=axis)/nancounts
def _nanstd(arr, axis=None):
if axis == 1:
return _nanstd(arr.T, axis=0)
mu = _nanmean(arr,axis=axis)
return np.sqrt(_nanmean((arr-mu)**2, axis=axis))
def zscore(features, axis=0, can_have_nans=True, inplace=False):
"""
features = zscore(features, axis=0, can_have_nans=True, inplace=False)
Returns a copy of features which has been normalised to zscores
Parameters
----------
features : ndarray
2-D input array
axis : integer, optional
which axis to normalise (default: 0)
can_have_nans : boolean, optional
whether ``features`` is allowed to have NaNs (default: True)
inplace : boolean, optional
Whether to operate inline (i.e., potentially change the input array).
Default is False
Returns
-------
features : ndarray
zscored version of features
"""
if features.ndim != 2:
raise ValueError('milk.unsupervised.zscore: Can only handle 2-D arrays')
if not inplace:
features = features.copy()
if can_have_nans:
mu = _nanmean(features, axis)
sigma = _nanstd(features, axis)
else:
mu = features.mean(axis)
sigma = np.std(features, axis)
sigma[sigma == 0] = 1.
if axis == 0:
features -= mu
features /= sigma
elif axis == 1:
features -= mu[:,None]
features /= sigma[:,None]
return features
def center(features, axis=0, can_have_nans=True, inplace=False):
'''
centered, mean = center(features, axis=0, inplace=False)
Center data
Parameters
----------
features : ndarray
2-D input array
axis : integer, optional
which axis to normalise (default: 0)
can_have_nans : boolean, optional
whether ``features`` is allowed to have NaNs (default: True)
inplace : boolean, optional
Whether to operate inline (i.e., potentially change the input array).
Default is False
Returns
-------
features : ndarray
centered version of features
mean : ndarray
mean values
'''
if can_have_nans:
meanfunction = _nanmean
else:
meanfunction = np.mean
features = np.array(features, copy=(not inplace), dtype=float)
mean = meanfunction(features, axis=axis)
if axis == 0:
features -= mean
elif axis == 1:
features -= mean[:,None]
else:
raise ValueError('milk.unsupervised.center: axis ∉ { 0, 1}')
return features, mean
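if __name__ == '__main__':
    # Minimal usage sketch on synthetic data (illustrative only, not part of the
    # original milk distribution).
    features = np.random.random((6, 3))
    z = zscore(features)             # per-column zero mean, unit variance
    centered, mu = center(features)  # per-column zero mean only
    print z.mean(axis=0), mu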
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_ext_jugparallel.py
try:
import jug
from jug import value
import jug.options
from jug.tests.utils import task_reset, simple_execute
except ImportError:
from nose import SkipTest
def task_reset(f):
def g():
raise SkipTest()
return g
@task_reset
def test_nfoldcrossvalidation():
store, space = jug.jug.init('milk/tests/data/jugparallel_jugfile.py', 'dict_store')
simple_execute()
assert len(jug.value(space['classified'])) == 2
assert len(jug.value(space['classified_wpred'])) ==3
@task_reset
def test_kmeans():
store, space = jug.jug.init('milk/tests/data/jugparallel_kmeans_jugfile.py', 'dict_store')
simple_execute()
assert len(value(space['clustered'])) == 2
<file_sep>/telemetry/rec_img.py~
#!/usr/bin/python
import socket
from PIL import ImageFile
import time
HOST = '127.0.0.1'
CPORT = 12345
MPORT = 12346
data_tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
filename_tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
#filename_tcp.connect(('127.0.0.1', 12345))
#data_tcp.connect(('127.0.0.1', 12346))
#tcp.send("1000 4")
filename_tcp.bind((HOST, CPORT))
data_tcp.bind((HOST, MPORT))
#next line specifies how long in seconds to wait for a connection
#s.settimeout(5.0)
print "listening..."
filename_tcp.listen(1)
data_tcp.listen(1)
"""
#if want to receive a file
try:
print "waiting for filename connection.."
conn, addr = filename_tcp.accept()
print "accepted connection from client.."
data = filename_tcp.recv(1024)
if data <> "":
print 'Received filename from server...', repr(data)
filename = repr(data)
else:
print "filename not received from server"
break
except IOError as detail:
print "connection lost", detail
"""
#to receive and view pil image
try:
print "waiting for data connection.."
conn, addr = data_tcp.accept()
print "accepted connection from client.."
file = open("bar2.jpg", "w")
parser = ImageFile.Parser()
print time.time()
while 1:
jpgdata = conn.recv(65536)
if not jpgdata:
data_tcp.close()
print "no more data"
break
parser.feed(jpgdata)
file.write(jpgdata)
print time.time()
print "data received.."
file.close()
image = parser.close()
image.show()
except IOError as detail:
print "connection lost", detail
<file_sep>/camera/showcam.py
from CVtypes import cv
win = 'Show Cam'
cv.NamedWindow(win)
cap = cv.CreateCameraCapture(0)
while cv.WaitKey(1) != 27:
img = cv.QueryFrame(cap)
cv.ShowImage(win, img)
<file_sep>/navigation/lidar/lidar_class.py
#!/usr/bin/python
import thread, time, sys, traceback
import serial
from math import *
class mobot_lidar ():
def __init__(self, com_port, baudrate):
self.init_level = 0
self.angle = 0
self.index = 0
self.speed_rpm = 0
# serial port
self.com_port = com_port
self.baudrate = baudrate
self.ser = serial.Serial(self.com_port, self.baudrate)
self.data = []
self.offset = 140
self.th = thread.start_new_thread(self.read_lidar, ())
def read_lidar(self):
nb_errors = 0
while True:
temp_data = []
while len(temp_data) < 360:
try:
time.sleep(0.00001) # do not hog the processor power
#print "self.init_level", self.init_level
if self.init_level == 0 :
b = ord(self.ser.read(1))
# start byte
if b == 0xFA :
self.init_level = 1
else:
self.init_level = 0
elif self.init_level == 1:
# position index
b = ord(self.ser.read(1))
if b >= 0xA0 and b <= 0xF9 :
self.index = b - 0xA0
self.init_level = 2
elif b != 0xFA:
self.init_level = 0
elif self.init_level == 2 :
# speed
b_speed = [ ord(b) for b in self.ser.read(2)]
# data
b_data0 = [ ord(b) for b in self.ser.read(4)]
b_data1 = [ ord(b) for b in self.ser.read(4)]
b_data2 = [ ord(b) for b in self.ser.read(4)]
b_data3 = [ ord(b) for b in self.ser.read(4)]
# for the checksum, we need all the data of the packet...
# this could be collected in a more elegant fashion...
all_data = [ 0xFA, self.index+0xA0 ] + b_speed + b_data0 + b_data1 + b_data2 + b_data3
# checksum
b_checksum = [ ord(b) for b in self.ser.read(2) ]
incoming_checksum = int(b_checksum[0]) + (int(b_checksum[1]) << 8)
# verify that the received checksum is equal to the one computed from the data
if checksum(all_data) == incoming_checksum:
#pass
self.speed_rpm = int(compute_speed(b_speed))
#gui_update_speed(speed_rpm)
#motor_control(speed_rpm)
temp_data.append(convert_data(self.index * 4 + 0, b_data0, self.offset))
temp_data.append(convert_data(self.index * 4 + 1, b_data1, self.offset))
temp_data.append(convert_data(self.index * 4 + 2, b_data2, self.offset))
temp_data.append(convert_data(self.index * 4 + 3, b_data3, self.offset))
else:
# the checksum does not match, something went wrong...
nb_errors +=1
print "errors found"
#label_errors.text = "errors: "+str(nb_errors)
# display the samples in an error state
#update_view(index * 4 + 0, [0, 0x80, 0, 0])
#update_view(index * 4 + 1, [0, 0x80, 0, 0])
#update_view(index * 4 + 2, [0, 0x80, 0, 0])
#update_view(index * 4 + 3, [0, 0x80, 0, 0])
self.init_level = 0 # reset and wait for the next packet
else: # default, should never happen...
self.init_level = 0
except :
traceback.print_exc(file=sys.stdout)
#print temp_data, i
#raw_input("enter")
if len (temp_data ) == 360: self.data = sorted(temp_data)
def checksum(data):
"""Compute and return the checksum as an int.
data -- list of 20 bytes (as ints), in the order they arrived in.
"""
# group the data by word, little-endian
data_list = []
for t in range(10):
data_list.append( data[2*t] + (data[2*t+1]<<8) )
# compute the checksum on 32 bits
chk32 = 0
for d in data_list:
chk32 = (chk32 << 1) + d
# return a value wrapped around on 15bits, and truncated to still fit into 15 bits
checksum = (chk32 & 0x7FFF) + ( chk32 >> 15 ) # wrap around to fit into 15 bits
checksum = checksum & 0x7FFF # truncate to 15 bits
return int( checksum )
def convert_data(angle, data, offset):
"""Updates the view of a sample.
Takes the angle (an int, from 0 to 359) and the list of four bytes of data in the order they arrived.
"""
#unpack data using the denomination used during the discussions
x = data[0]
x1= data[1]
x2= data[2]
x3= data[3]
angle_rad = angle * pi / 180.0
c = cos(angle_rad)
s = -sin(angle_rad)
dist_mm = x | (( x1 & 0x3f) << 8) # distance is coded on 13 bits ? 14 bits ?
quality = x2 | (x3 << 8) # quality is on 16 bits
dist_x = dist_mm*c
dist_y = dist_mm*s
#print "angle:", angle, " dist_mm:", dist_mm, " quality:", quality
return angle, dist_mm, quality
def compute_speed(data):
speed_rpm = float( data[0] | (data[1] << 8) ) / 64.0
return speed_rpm
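if __name__ == '__main__':
    # Minimal usage sketch; assumes a Neato-style LIDAR on /dev/ttyUSB0 at
    # 115200 baud (adjust for your hardware). Prints a few complete scans.
    lidar = mobot_lidar('/dev/ttyUSB0', 115200)
    for _ in range(5):
        time.sleep(1)
        if lidar.data:
            # each entry is an (angle_deg, distance_mm, quality) tuple
            print "rpm:", lidar.speed_rpm, "first reading:", lidar.data[0]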
<file_sep>/camera/snap_shot2.py
from VideoCapture import Device
import ImageDraw, sys, pygame, time
from pygame.locals import *
from PIL import ImageEnhance
res = (1024,768)
pygame.init()
cam = Device()
cam.setResolution(res[0],res[1])
screen = pygame.display.set_mode((1024,768))
pygame.display.set_caption('Webcam')
pygame.font.init()
font = pygame.font.SysFont("Courier",11)
def disp(phrase,loc):
s = font.render(phrase, True, (200,200,200))
sh = font.render(phrase, True, (50,50,50))
screen.blit(sh, (loc[0]+1,loc[1]+1))
screen.blit(s, loc)
brightness = 1.0
contrast = 1.0
shots = 0
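# Key bindings: 1/2 lower/raise brightness, 3/4 lower/raise contrast,
# q/w open the capture pin/filter property dialogs, s saves a timestamped JPEG.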
while 1:
camshot = ImageEnhance.Brightness(cam.getImage()).enhance(brightness)
camshot = ImageEnhance.Contrast(camshot).enhance(contrast)
for event in pygame.event.get():
if event.type == pygame.QUIT: sys.exit()
keyinput = pygame.key.get_pressed()
if keyinput[K_1]: brightness -= .1
if keyinput[K_2]: brightness += .1
if keyinput[K_3]: contrast -= .1
if keyinput[K_4]: contrast += .1
if keyinput[K_q]: cam.displayCapturePinProperties()
if keyinput[K_w]: cam.displayCaptureFilterProperties()
if keyinput[K_s]:
filename = str(time.time()) + ".jpg"
cam.saveSnapshot(filename, quality=80, timestamp=0)
shots += 1
camshot = pygame.image.frombuffer(camshot.tostring(), res, "RGB")
screen.blit(camshot, (0,0))
disp("S:" + str(shots), (10,4))
disp("B:" + str(brightness), (10,16))
disp("C:" + str(contrast), (10,28))
pygame.display.flip()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_svm.py
import milk.supervised.svm
from milk.supervised.svm import svm_learn_smo, svm_learn_libsvm, _svm_apply, learn_sigmoid_constants, svm_sigmoidal_correction
import numpy
import random
import numpy as np
import milk.supervised.svm
import pickle
eps=1e-3
def approximate(a,b):
a=numpy.asanyarray(a)
b=numpy.asanyarray(b)
return numpy.abs(a-b).max() < eps
def assert_kkt(SVM):
X,Y,Alphas,b,C,kernel=SVM
N=len(Alphas)
for i in xrange(N):
if Alphas[i] == 0.:
assert Y[i]*_svm_apply(SVM,X[i])+eps >= 1
elif Alphas[i] == C:
assert Y[i]*_svm_apply(SVM,X[i])-eps <= 1
else:
assert abs(Y[i]*_svm_apply(SVM,X[i])-1) <= eps
def assert_all_correctly_classified(SVM,X,Y):
N = len(X)
for i in xrange(N):
assert _svm_apply(SVM,X[i]) * Y[i] > 0
def assert_more_than_50(SVM,X,Y):
N = len(X)
correct = 0
for i in xrange(N):
correct += (_svm_apply(SVM,X[i]) * Y[i] > 0)
assert correct > N/2
def test_simplest():
X=numpy.array([
[1],
[2],
])
Y=numpy.array([1,-1])
C=4.
kernel=numpy.dot
Alphas,b=svm_learn_smo(X,Y,kernel,C)
SVM=(X,Y,Alphas,b,C,kernel)
Alphas_,b_=svm_learn_libsvm(X,Y,kernel,C)
assert approximate(Alphas,Alphas_)
assert approximate(b,b_)
assert approximate(Alphas,[2.,2.])
assert approximate(b,-3)
assert_kkt(SVM)
assert_all_correctly_classified(SVM,X,Y)
def test_more_complex_kkt():
X=numpy.array([
[1,0],
[2,1],
[2,0],
[4,2],
[0,1],
[0,2],
[1,2],
[2,8]])
Y=numpy.array([1,1,1,1,-1,-1,-1,-1])
C=4.
kernel=numpy.dot
Alphas,b=svm_learn_smo(X,Y,kernel,C)
def verify():
SVM=(X,Y,Alphas,b,C,kernel)
sv=numpy.array([1,1,0,0,1,0,1,0])
nsv=~sv
computed_sv = (Alphas > 0) & (Alphas < C)
computed_nsv = ~computed_sv
assert_kkt(SVM)
assert numpy.all((sv-computed_sv) >= 0) # computed_sv in sv
assert numpy.all((computed_nsv-nsv) >= 0) # nsv in computed_nsv
assert_all_correctly_classified(SVM,X,Y)
verify()
Alphas,b=svm_learn_libsvm(X,Y,kernel,C)
verify()
def rbf(xi,xj):
return numpy.exp(-((xi-xj)**2).sum())
def test_rbf():
X=numpy.array([
[0,0,0],
[1,1,1],
])
Y=numpy.array([ 1, -1 ])
C=10
Alphas,b=svm_learn_smo(X,Y,rbf,C)
SVM=(X,Y,Alphas,b,C,rbf)
assert_all_correctly_classified(SVM,X,Y)
Alphas,b=svm_learn_libsvm(X,Y,rbf,C)
SVM=(X,Y,Alphas,b,C,rbf)
assert_all_correctly_classified(SVM,X,Y)
def test_random():
R=numpy.random.RandomState(123)
X=R.rand(10,3)
X[:5]+=1.
C=2
Y=numpy.ones(10)
Y[5:] *= -1
Alphas,b=svm_learn_smo(X,Y,rbf,C)
SVM=(X,Y,Alphas,b,C,rbf)
assert_more_than_50(SVM,X,Y)
Alphas,b=svm_learn_libsvm(X,Y,rbf,C)
SVM=(X,Y,Alphas,b,C,rbf)
assert_more_than_50(SVM,X,Y)
def test_platt_correction():
F=numpy.ones(10)
L=numpy.ones(10)
F[:5] *= -1
L[:5] *= 0
A,B=learn_sigmoid_constants(F,L)
assert 1./(1.+numpy.exp(+10*A+B)) > .99
assert 1./(1.+numpy.exp(-10*A+B)) <.01
def test_platt_correction_class():
F=numpy.ones(10)
L=numpy.ones(10)
F[:5] *= -1
L[:5] *= 0
corrector = svm_sigmoidal_correction()
A,B = learn_sigmoid_constants(F,L)
model = corrector.train(F,L)
assert model.A == A
assert model.B == B
assert model.apply(10) > .99
assert model.apply(-10) < .01
def test_perfect():
data = numpy.zeros((10,2))
data[5:] = 1
labels = numpy.zeros(10)
labels[5:] = 1
classifier=milk.supervised.svm.svm_raw(kernel=milk.supervised.svm.rbf_kernel(1),C=4)
model = classifier.train(data,labels)
assert numpy.all( (numpy.array([model.apply(data[i]) for i in xrange(10)]) > 0) == labels )
def test_smart_rbf():
import milksets.wine
features,labels = milksets.wine.load()
labels = (labels == 1)
kernel = milk.supervised.svm.rbf_kernel(2.**-4)
C = milk.supervised.svm.svm_raw(C=2.,kernel=kernel)
model = C.train(features,labels)
smartkernel = [model.apply(f) for f in features]
del kernel.kernel_nr_
del kernel.kernel_arg_
C = milk.supervised.svm.svm_raw(C=2.,kernel=kernel)
model = C.train(features,labels)
dumbkernel = [model.apply(f) for f in features]
np.allclose(smartkernel, dumbkernel)
def test_preprockernel():
def test_preproc(kernel):
X = np.random.rand(200,20)
Qs = np.random.rand(100, 20)
prekernel = kernel.preprocess(X)
Qs = np.random.rand(10, 20)
for q in Qs:
preprocval = prekernel(q)
procval = np.array([kernel(q,x) for x in X])
assert np.allclose(preprocval, procval)
yield test_preproc, milk.supervised.svm.rbf_kernel(2.)
def test_svm_to_binary():
base = milk.supervised.svm.svm_raw(kernel=np.dot, C=4.)
bin = milk.supervised.svm.svm_to_binary(base)
X = np.array([
[1,0],
[2,1],
[2,0],
[4,2],
[0,1],
[0,2],
[1,2],
[2,8]])
Y = np.array([1,1,1,1,-1,-1,-1,-1])
model = bin.train(X,Y)
modelbase = base.train(X,Y)
assert np.all(modelbase.Yw == model.models[0].Yw)
for x in X:
assert model.apply(x) in set(Y)
def test_fast_dotkernel():
base = milk.supervised.svm.svm_raw(kernel=np.dot, C=4.)
X = np.array([
[1,0],
[2,1],
[2,0],
[4,2],
[0,1],
[0,2],
[1,2],
[2,8]])
Y = np.array([1,1,1,1,-1,-1,-1,-1])
model = base.train(X,Y)
base = milk.supervised.svm.svm_raw(kernel = milk.supervised.svm.dot_kernel(), C = 4.)
model2 = base.train(X,Y)
assert np.all( model2.Yw == model.Yw )
def count_common_elements(s0,s1):
return len(s0.intersection(s1))
def test_not_ndarray_input():
elems = [
set([1,2,3]),
set([2,3]),
set([2,]),
set([0,1]),
set([0,2,3]),
set([1,2,3]),
set([2,3,45]),
set([1,2,100])]
has_one = [(1 in s) for s in elems]
has_one[0] = not has_one[0]
c = milk.supervised.svm.svm_raw(kernel = count_common_elements, C = 2.)
model = c.train(elems, has_one)
trainpreds = [model.apply(e) for e in elems]
assert np.mean((np.array(trainpreds) > 0 ) == has_one ) > .5
model = pickle.loads(pickle.dumps(model))
trainpreds = [model.apply(e) for e in elems]
assert np.mean((np.array(trainpreds) > 0 ) == has_one ) > .5
def test_rbf_kernel_call_many():
from milk.supervised.svm import rbf_kernel
np.random.seed(232)
X = np.random.random((32,16))
k = rbf_kernel(1.)
pre = k.preprocess(X)
qs = np.random.random((8,16))
mapped = np.array(map(pre, qs))
manyed = pre.call_many(qs)
assert np.allclose(manyed, mapped)
<file_sep>/PhidgetsPython/Python/GPS-simple.py
#! /usr/bin/python
"""Copyright 2011 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__="<NAME>"
__version__="2.1.8"
__date__ ="14-Jan-2011 3:01:06 PM"
#Basic imports
import sys
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetException
from Phidgets.Devices.GPS import GPS
#Create a GPS object
try:
gps = GPS()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def displayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (gps.isAttached(), gps.getDeviceName(), gps.getSerialNum(), gps.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
#Event Handler Callback Functions
def GPSAttached(e):
attached = e.device
print("GPS %i Attached!" % (attached.getSerialNum()))
def GPSDetached(e):
detached = e.device
print("GPS %i Detached!" % (detached.getSerialNum()))
def GPSError(e):
try:
source = e.device
print("GPS %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
def GPSPositionChanged(e):
source = e.device
print("GPS %i: Latitude: %F, Longitude: %F, Altitude: %F" % (source.getSerialNum(), e.latitude, e.longitude, e.altitude))
def GPSPositionFixStatusChanged(e):
source = e.device
if e.positionFixStatus:
status = "FIXED"
else:
status = "NOT FIXED"
print("GPS %i: Position Fix Status: %s" % (source.getSerialNum(), status))
#Main Program Code
try:
gps.setOnAttachHandler(GPSAttached)
gps.setOnDetachHandler(GPSDetached)
gps.setOnErrorhandler(GPSError)
gps.setOnPositionChangeHandler(GPSPositionChanged)
gps.setOnPositionFixStatusChangeHandler(GPSPositionFixStatusChanged)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
gps.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
gps.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
gps.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
displayDeviceInfo()
print("Press Enter to quit....")
try:
print("GPS Current Time: %s" %(gps.getTime().toString()))
print("GPS Current Date: %s" %(gps.getDate().toString()))
print("GPS Current Latitude: %F" %(gps.getLatitude()))
print("GPS Current Longitude: %F" %(gps.getLongitude()))
print("GPS Current Altitude: %F" %(gps.getAltitude()))
print("GPS Current Heading: %F" %(gps.getHeading()))
print("GPS Current Velocity: %F" % (gps.getVelocity()))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
chr = sys.stdin.read(1)
print("Closing...")
try:
gps.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/measures/curves.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
def precision_recall(values, labels, mode='all', nr_steps=100):
'''
precision, recall = precision_recall(values, labels, mode='all', nr_steps=100)
Compute a precision-recall curve.
For a given threshold ``T``, consider that the positions where ``values >=
T`` are classified as True. Precision is defined as ``TP/(TP+FP)``, while
recall is defined as ``TP/(TP+FN)``.
Parameters
----------
values : sequence of numbers
labels : boolean sequence
mode : str, optional
Which thresholds to consider. Either 'all' (i.e., use all values of
`values` as possible thresholds), or 'steps' (using `nr_steps`
equidistant points from ``min(values)`` to ``max(values)``)
nr_steps : integer, optional
How many steps to use. Only meaningful if ``mode == 'steps'``
Returns
-------
precision : a sequence of floats
recall : a sequence of floats
In practice, a single ``2 x P`` array is returned, whose first row is precision and second row is recall.
'''
values = np.asanyarray(values)
labels = np.asanyarray(labels)
if len(values) != len(labels):
raise ValueError('milk.measures.precision_recall: `values` must be of same length as `labels`')
if mode == 'all':
points = list(set(values))
points.sort()
elif mode == 'steps':
points = np.linspace(values.min(), values.max(), nr_steps)
else:
raise ValueError('milk.measures.precision_recall: cannot handle mode: `%s`' % mode)
true_pos = float(np.sum(labels))
precision_recall = np.empty((len(points),2), np.float)
for i,p in enumerate(points):
selected = (values >= p)
selected = labels[selected]
precision_recall[i] = (np.mean(selected), np.sum(selected)/true_pos)
return precision_recall.T
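# Illustrative usage sketch (not part of the original module): the scores and
# labels below are made up purely to show how the returned curve is consumed.
if __name__ == '__main__':
    scores = [0.1, 0.4, 0.35, 0.8]
    truth = [False, False, True, True]
    precision, recall = precision_recall(scores, truth, mode='all')
    for p, r in zip(precision, recall):
        print 'precision %.2f at recall %.2f' % (p, r)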
<file_sep>/knn.py~
#!/usr/bin/env python
import os
from PIL import Image
import sys
from optparse import OptionParser
import numpy as np
from numpy import *
from mvpa.clfs.knn import kNN
from mvpa.datasets import Dataset
#pymvpa stuff
f_handle = open("classdatafile.txt", 'r')
f_handle2 = open("classidfile.txt", 'r')
f_handle3 = open("predictdata.txt", 'r')
features = genfromtxt(f_handle, dtype = float)
classes = genfromtxt(f_handle2, dtype = int)
predictdata = genfromtxt(f_handle3, dtype = float)
predictdata = np.expand_dims(predictdata, axis=0)
print predictdata
print np.shape(features), features.ndim, features.dtype
print np.shape(classes), classes.ndim, classes.dtype
print np.shape(predictdata), predictdata.ndim, predictdata.dtype
f_handle.close()
f_handle2.close()
f_handle3.close()
training = Dataset(samples=features,labels=classes)
clf = kNN(k=2)
print "clf = ", clf
clf.train(training)
#print np.mean(clf.predict(training.samples) == training.labels)
classID = clf.predict(predictdata)
print "classID = ", classID
#print clf.trained_labels
if classID[0] == 1: print "Image is of class: GRASS"
if classID[0] == 2: print "Image is of class: DIRT/GRAVEL"
if classID[0] == 3: print "Image is of class: CEMENT/ASPHALT"
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_defaultlearner.py
import milk
def test_extra_arg():
from milksets.wine import load
features,labels = load()
learner = milk.defaultlearner()
model = learner.train(features[::2],labels[::2], extra_arg=5)
assert model.apply(features[1]) < 12.
<file_sep>/main/im_test.py~
#!/usr/bin/python
import PythonMagick as pm
# Read the image
img = pm.Image('temp.jpg')
img.display()
<file_sep>/lib/mobot_gps_class.py
#!/usr/bin/python
import gps, os
import threading
import time
import glob
import os
import re
#sudo apt-get install gpsd gpsd-clients
#gpsd /dev/ttyUSB0 -b -n
#The gpsd server reads NMEA sentences from the gps unit and is accessed on port 2947. You can test if everything is working by running a pre-built gpsd client such as xgps.
def find_usb_tty(vendor_id = None, product_id = None) :
#lsusb to find vendorID and productID:
#Bus 002 Device 005: ID 067b:2303 Prolific Technology, Inc. PL2303 Serial Port
# then call print find_usb_tty("067b","2303")
tty_devs = []
vendor_id = int(vendor_id, 16)
product_id = int(product_id , 16)
for dn in glob.glob('/sys/bus/usb/devices/*') :
try :
vid = int(open(os.path.join(dn, "idVendor" )).read().strip(), 16)
pid = int(open(os.path.join(dn, "idProduct")).read().strip(), 16)
if ((vendor_id is None) or (vid == vendor_id)) and ((product_id is None) or (pid == product_id)) :
dns = glob.glob(os.path.join(dn, os.path.basename(dn) + "*"))
for sdn in dns :
for fn in glob.glob(os.path.join(sdn, "*")) :
if re.search(r"\/ttyUSB[0-9]+$", fn) :
#tty_devs.append("/dev" + os.path.basename(fn))
tty_devs.append(os.path.join("/dev", os.path.basename(fn)))
pass
pass
pass
pass
except ( ValueError, TypeError, AttributeError, OSError, IOError ) :
pass
pass
return tty_devs
#print find_usb_tty("067b","2303")
class mobot_gps ( threading.Thread ):
def __init__( self, command_port, data_port ):
threading.Thread.__init__( self )
self.cmd_port = command_port
self.data_port = data_port
def run(self):
gps1 = gps.gps(host="localhost", port="2947")
gps2 = gps.gps(host="localhost", port="2947")
gps3 = gps.gps(host="localhost", port="2947")
gps4 = gps.gps(host="localhost", port="2947")
<file_sep>/gui/tkinter/maxsonar_class.py
#!/usr/bin/env python
import serial
import threading
import time
# need to find best way to search serial ports to find the device
class MaxSonar(object):
def __init__(self):
self._isConnected = False
self._ser = self._open_serial_port()
self._should_stop = threading.Event()
self._read_thread = self._start_reading()
self._data = 0
#self._port = port
def _open_serial_port(self):
while self._isConnected == False:
print "class MaxSonar: searching serial ports for ultrasonic sensor package..."
for i in range(11):
port = "/dev/ttyACM"
port = port[0:11] + str(i)
print "class MaxSonar: searching on port:", port
time.sleep(.2)
try:
ser = serial.Serial(port, 9600, timeout=1)
data = ser.readline()
#print "data=", int(data[3:(len(data)-1)])
if data[0:2] == "s1":
#ser.write("a\n") # write a string
print "class MaxSonar: found ultrasonic sensor package on serial port: ", port
self._isConnected = True
#self._port = ser
time.sleep(.35)
break
except:
pass
for i in range(11):
port = "/dev/ttyUSB"
port = port[0:11] + str(i)
print "class MaxSonar: searching on port:", port
time.sleep(.2)
try:
ser = serial.Serial(port, 9600, timeout=1)
data = ser.readline()
#print "data=", int(data[3:(len(data)-1)])
if data[0:2] == "s1":
#ser.write("a\n") # write a string
print "class MaxSonar: found ultrasonic sensor package on serial port: ", port
self._isConnected = True
#self._port = ser
time.sleep(.35)
break
except:
pass
if self._isConnected == False:
print "class MaxSonar: ultrasonic sensor package not found!"
time.sleep(1)
#print "returning", ser
return ser
def _start_reading(self):
def read():
#print self._should_stop.isSet()
#print self._ser.isOpen()
while not self._should_stop.isSet():
try:
data = self._ser.readline()
#print "recieved: ", data
#self._data = int(data[5:(len(data)-1)])
self._data = data[0:(len(data)-1)]
except:
try:
print "class MaxSonar:no connection...attempting to reconnect"
self._data = 0
self._isConnected = False
self._ser = self._open_serial_port()
time.sleep(.5)
except:
pass
thr = threading.Thread(target=read)
thr.daemon=True
thr.start()
return thr
def stop(self):
self._should_stop.set()
self._read_thread.join()
def distances_cm(self):
return self._data
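# Minimal usage sketch (an addition, not part of the original class): the
# constructor blocks until an ultrasonic sensor package is found on a serial
# port, after which readings can simply be polled.
if __name__ == '__main__':
    sonar = MaxSonar()
    while True:
        print "distance reading:", sonar.distances_cm()
        time.sleep(0.5)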
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_affinity.py
import milk.unsupervised.affinity
import numpy as np
def test_affinity():
np.random.seed(22)
X = np.random.randn(100,10)
X[:40] += .4
S = milk.unsupervised.pdist(X)
clusters, labels = milk.unsupervised.affinity.affinity_propagation(S)
assert labels.max()+1 == len(clusters)
assert len(labels) == len(X)
assert clusters.max() < len(X)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/gaussianmixture.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from numpy import log, pi, array
from numpy.linalg import det, inv
from .kmeans import residual_sum_squares, centroid_errors
__all__ = [
'BIC',
'AIC',
'log_likelihood',
'nr_parameters',
]
def log_likelihood(fmatrix,assignments,centroids,model='one_variance',covs=None):
'''
log_like = log_likelihood(feature_matrix, assignments, centroids, model='one_variance', covs=None)
Compute the log likelihood of feature_matrix[i] being generated from centroids[assignments[i]]
'''
N,q = fmatrix.shape
k = len(centroids)
if model == 'one_variance':
Rss = residual_sum_squares(fmatrix,assignments,centroids)
#sigma2=Rss/N
return -N/2.*log(2*pi*Rss/N)-N/2
elif model == 'diagonal_covariance':
errors = centroid_errors(fmatrix,assignments,centroids)
errors *= errors
errors = errors.sum(1)
Rss = np.zeros(k)
counts = np.zeros(k)
for i in xrange(fmatrix.shape[0]):
c = assignments[i]
Rss[c] += errors[i]
counts[c] += 1
sigma2s = Rss/(counts+(counts==0))
return -N/2.*log(2*pi) -N/2. -1/2.*np.sum(counts*np.log(sigma2s+(counts==0)))
elif model == 'full_covariance':
res = -N*q/2.*log(2*pi)
for k in xrange(len(centroids)):
diff = (fmatrix[assignments == k] - centroids[k])
if covs is None:
covm = np.cov(diff.T)
else:
covm = covs[k]
if covm.shape == ():
covm = np.matrix([[covm]])
icov = np.matrix(inv(covm))
diff = np.matrix(diff)
Nk = diff.shape[0]
res += -Nk/2.*log(det(covm)) + \
-.5 * (diff * icov * diff.T).diagonal().sum()
return res
raise ValueError("log_likelihood: cannot handle model '%s'" % model)
def nr_parameters(fmatrix,k,model='one_variance'):
'''
nr_p = nr_parameters(fmatrix, k, model='one_variance')
Compute the number of parameters for a model of k clusters on data shaped like ``fmatrix``
Parameters
----------
fmatrix : 2d-array
feature matrix
k : integer
nr of clusters
model : str
one of 'one_variance' (default), 'diagonal_covariance', or 'full_covariance'
Returns
-------
nr_p : integer
Number of parameters
'''
N,q = fmatrix.shape
if model == 'one_variance':
return k*q+1
elif model == 'diagonal_covariance':
return k*(q+1)
elif model == 'full_covariance':
return k*+q*q
raise ValueError("milk.unsupervised.gaussianmixture.nr_parameters: cannot handle model '%s'" % model)
def _compute(type, fmatrix, assignments, centroids, model='one_variance', covs=None):
N,q = fmatrix.shape
k = len(centroids)
log_like = log_likelihood(fmatrix, assignments, centroids, model, covs)
n_param = nr_parameters(fmatrix,k,model)
if type == 'BIC':
return -2*log_like + n_param * log(N)
elif type == 'AIC':
return -2*log_like + 2 * n_param
else:
assert False
def BIC(fmatrix, assignments, centroids, model='one_variance', covs=None):
'''
B = BIC(fmatrix, assignments, centroids, model='one_variance', covs={From Data})
Compute Bayesian Information Criterion
Parameters
----------
fmatrix : 2d-array
feature matrix
assignments : 2d-array
Centroid assignments
centroids : sequence
Centroids
model : str, optional
one of
'one_variance'
All features share the same variance parameter sigma^2. Default
'full_covariance'
Estimate a full covariance matrix or use covs[i] for centroid[i]
covs : sequence or matrix, optional
Covariance matrices. If None, then estimate from the data. If scalars
instead of matrices are given, then s stands for sI (i.e., the diagonal
matrix with s along the diagonal).
Returns
-------
B : float
BIC value
See Also
--------
AIC
'''
return _compute('BIC', fmatrix, assignments, centroids, model, covs)
def AIC(fmatrix,assignments,centroids,model='one_variance',covs=None):
'''
A = AIC(fmatrix,assignments,centroids,model)
Compute Akaike Information Criterion
Parameters
----------
fmatrix : 2d-array
feature matrix
assignments : 2d-array
Centroid assignments
centroids : sequence
Centroids
model : str, optional
one of
'one_variance'
All features share the same variance parameter sigma^2. Default
'full_covariance'
Estimate a full covariance matrix or use covs[i] for centroid[i]
covs : sequence, optional
Covariance matrices. If None, then estimate from the data. If scalars
instead of matrices are given, then s stands for sI (i.e., the diagonal
matrix with s along the diagonal).
Returns
-------
A : float
AIC value
See Also
--------
BIC
'''
return _compute('AIC', fmatrix, assignments, centroids, model, covs)
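# Illustrative sketch (not part of the original module), assuming the sibling
# kmeans module's kmeans() entry point: compare BIC/AIC across candidate
# cluster counts on synthetic data.
if __name__ == '__main__':
    from milk.unsupervised.kmeans import kmeans
    fmatrix = np.random.randn(200, 3)
    fmatrix[:100] += 4.
    for k in (2, 3, 4):
        assignments, centroids = kmeans(fmatrix, k)
        print 'k=%d  BIC=%.1f  AIC=%.1f' % (
                k,
                BIC(fmatrix, assignments, centroids),
                AIC(fmatrix, assignments, centroids))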
<file_sep>/navigation/lidar/lidar_faker_class.py
import random
import thread
import time
class lidar_faker():
def __init__(self):
self.x_degree = 0
self.y_degree = 0
self.dist = 0
self.quality = 0
self.rpm = 0
self.th = thread.start_new_thread(self.run, ())
def run(self):
while True:
time.sleep(.05)
self.read_lidar()
def read_lidar(self):
self.y_degree += 1
if self.y_degree > 360:
self.y_degree = 0
self.x_degree += 1
if self.x_degree > 360: self.x_degree = 0
self.dist = random.randint(0,4000)
self.quality = random.randint(0,100)
self.rpm = random.randint(281 ,295)
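# Minimal usage sketch (an addition, not part of the original class): the faker
# starts its own background thread in __init__ and keeps updating its fields,
# so a caller only has to poll them.
if __name__ == '__main__':
    faker = lidar_faker()
    for _ in range(10):
        time.sleep(0.2)
        print "angle:", faker.y_degree, "dist:", faker.dist, "rpm:", faker.rpm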
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/demos/adaboost.py
import pylab as plt
import milk.supervised.tree
import milk.supervised.adaboost
from milksets import wine
import milk.supervised.multi
weak = milk.supervised.tree.stump_learner()
learner = milk.supervised.adaboost.boost_learner(weak)
learner = milk.supervised.multi.one_against_one(learner)
features, labels = wine.load()
cmat,names,predictions = milk.nfoldcrossvalidation(features,labels, classifier=learner, return_predictions=True)
colors = "rgb"
codes = "xo"
for y,x,r,p in zip(features.T[0], features.T[1], labels, predictions):
code = codes[int(r == p)]
plt.plot([y],[x], colors[p]+code)
plt.show()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/knn.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
from collections import defaultdict
import numpy as np
from .base import supervised_model
__all__ = [
'kNN',
]
class kNN(object):
'''
k-Nearest Neighbour Classifier
Naive implementation of a k-nearest neighbour classifier.
C = kNN(k)
Attributes:
-----------
k : integer
number of neighbours to consider
'''
def __init__(self, k=1):
self.k = k
def train(self, features, labels, normalisedlabels=False, copy_features=False):
features = np.asanyarray(features)
labels = np.asanyarray(labels)
if copy_features:
features = features.copy()
labels = labels.copy()
features2 = np.sum(features**2, axis=1)
return kNN_model(self.k, features, features2, labels)
class kNN_model(supervised_model):
def __init__(self, k, features, features2, labels):
self.k = k
self.features = features
self.f2 = features2
self.labels = labels
def apply(self, features):
features = np.asanyarray(features)
diff2 = np.dot(self.features, (-2.)*features)
diff2 += self.f2
neighbours = diff2.argsort()[:self.k]
labels = self.labels[neighbours]
votes = defaultdict(int)
for L in labels:
votes[L] += 1
v,L = max( (v,L) for L,v in votes.items() )
return L
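# Illustrative usage sketch (not part of the original module): train on a toy
# two-class problem and classify one new point. The data here is synthetic.
if __name__ == '__main__':
    features = np.array([[0., 0.], [0., 1.], [4., 4.], [4., 5.]])
    labels = np.array([0, 0, 1, 1])
    model = kNN(k=3).train(features, labels)
    print 'predicted label:', model.apply(np.array([3.5, 4.5]))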
<file_sep>/PhidgetsPython/Python/Accelerometer-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.Phidget import Phidget
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import AccelerationChangeEventArgs, AttachEventArgs, DetachEventArgs, ErrorEventArgs
from Phidgets.Devices.Accelerometer import Accelerometer
#Create an accelerometer object
try:
accelerometer = Accelerometer()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def DisplayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (accelerometer.isAttached(), accelerometer.getDeviceName(), accelerometer.getSerialNum(), accelerometer.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
print("Number of Axes: %i" % (accelerometer.getAxisCount()))
#Event Handler Callback Functions
def AccelerometerAttached(e):
attached = e.device
print("Accelerometer %i Attached!" % (attached.getSerialNum()))
def AccelerometerDetached(e):
detached = e.device
print("Accelerometer %i Detached!" % (detached.getSerialNum()))
def AccelerometerError(e):
try:
source = e.device
print("Accelerometer %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
def AccelerometerAccelerationChanged(e):
source = e.device
print("Accelerometer %i: Axis %i: %6f" % (source.getSerialNum(), e.index, e.acceleration))
#Main Program Code
try:
accelerometer.setOnAttachHandler(AccelerometerAttached)
accelerometer.setOnDetachHandler(AccelerometerDetached)
accelerometer.setOnErrorhandler(AccelerometerError)
accelerometer.setOnAccelerationChangeHandler(AccelerometerAccelerationChanged)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
accelerometer.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
accelerometer.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
accelerometer.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
try:
numAxis = accelerometer.getAxisCount()
accelerometer.setAccelChangeTrigger(0, 0.500)
accelerometer.setAccelChangeTrigger(1, 0.500)
if numAxis > 2:
accelerometer.setAccelChangeTrigger(2, 0.500)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
DisplayDeviceInfo()
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
accelerometer.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/camera/lasercamera.py
#!/usr/bin/python
import sys
from opencv import cv
from opencv import highgui
import opencv
from pygame.locals import *
hmin = 4
hmax = 18
vmin = 140
vmax = 255
smin = 147
smax = 255
hsv_min = cv.cvScalar(0, smin, vmin, 0)
hsv_max = cv.cvScalar(180, 256, vmax, 0)
capture = None
def change_hmin(p):
global hmin
hmin = p
def change_hmax(p):
global hmax
hmax = p
def change_smin(p):
global smin
smin = p
def change_smax(p):
global smax
smax = p
def change_vmin(p):
global vmin
vmin = p
def change_vmax(p):
global vmax
vmax = p
def change_brightness(p):
global capture
highgui.cvSetCaptureProperty(capture,highgui.CV_CAP_PROP_BRIGHTNESS, p)
print "change brightness",p;
def draw_target(img, x, y):
width = 10
color = cv.CV_RGB(0,255,0);
size = cv.cvGetSize(img)
#cv.cvSet2D(img,x,y,color);
for i in range(width):
for j in range(width):
if i==0 or j==0 or j==9 or i==9:
px = x + j - width/2
py = y + i - width/2
if px<0:
px = 0
if py<0:
py = 0
if px>=size.width:
px = size.width-1
if py>=size.height:
py = size.height-1
cv.cvSet2D(img,py,px,color)
def averageWhitePoints(frame):
xtotal = 0.0
ytotal = 0.0
count = 0;
size = cv.cvGetSize(frame)
for x in range(size.width):
for y in range(size.height):
if(cv.cvGetReal2D(frame, y, x) > 200):
xtotal = xtotal + x
ytotal = ytotal + y
count += 1
if count == 0:
return 0, 0
return int(xtotal/count), int(ytotal/count)
def main(args):
global capture
global hmax, hmin
highgui.cvNamedWindow('Camera', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvNamedWindow('Hue', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvNamedWindow('Satuation', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvNamedWindow('Value', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvNamedWindow('Laser', highgui.CV_WINDOW_AUTOSIZE)
highgui.cvMoveWindow('Camera', 0, 10)
highgui.cvMoveWindow('Hue', 0, 350)
highgui.cvMoveWindow('Satuation', 360, 10)
highgui.cvMoveWindow('Value', 360, 350)
highgui.cvMoveWindow('Laser', 700, 40)
highgui.cvCreateTrackbar("Brightness Trackbar","Camera",0,255, change_brightness);
highgui.cvCreateTrackbar("hmin Trackbar","Hue",hmin,180, change_hmin);
highgui.cvCreateTrackbar("hmax Trackbar","Hue",hmax,180, change_hmax);
highgui.cvCreateTrackbar("smin Trackbar","Satuation",smin,255, change_smin);
highgui.cvCreateTrackbar("smax Trackbar","Satuation",smax,255, change_smax);
highgui.cvCreateTrackbar("vmin Trackbar","Value",vmin,255, change_vmin);
highgui.cvCreateTrackbar("vmax Trackbar","Value",vmax,255, change_vmax);
print "grabbing camera"
capture = highgui.cvCreateCameraCapture(0)
print "found camera"
highgui.cvSetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_WIDTH, 320)
highgui.cvSetCaptureProperty(capture,highgui.CV_CAP_PROP_FRAME_HEIGHT, 240)
frame = highgui.cvQueryFrame(capture)
frameSize = cv.cvGetSize(frame)
hsv = cv.cvCreateImage(frameSize,8,3)
mask = cv.cvCreateImage(frameSize,8,1)
hue = cv.cvCreateImage(frameSize,8,1)
satuation = cv.cvCreateImage(frameSize,8,1)
value = cv.cvCreateImage(frameSize,8,1)
laser = cv.cvCreateImage(frameSize,8,1)
while 1:
frame = highgui.cvQueryFrame(capture)
cv.cvCvtColor(frame, hsv, cv.CV_BGR2HSV)
#cv.cvInRangeS(hsv,hsv_min,hsv_max,mask)
cv.cvSplit(hsv,hue,satuation,value,None)
cv.cvInRangeS(hue,hmin,hmax,hue)
cv.cvInRangeS(satuation,smin,smax,satuation)
cv.cvInRangeS(value,vmin,vmax,value)
#cv.cvInRangeS(hue,0,180,hue)
cv.cvAnd(hue, value, laser)
#cv.cvAnd(laser, value, laser)
cenX,cenY = averageWhitePoints(laser)
#print cenX,cenY
draw_target(frame,cenX,cenY)
#draw_target(frame,200,1)
highgui.cvShowImage('Camera',frame)
highgui.cvShowImage('Hue',hue)
highgui.cvShowImage('Satuation',satuation)
highgui.cvShowImage('Value',value)
highgui.cvShowImage('Laser',laser)
k = highgui.cvWaitKey(10)
if k == " ":
highgui.cvDestroyAllWindows()
highgui.cvReleaseCapture (capture)
sys.exit()
if __name__ == '__main__':
main(sys.argv[1:]);
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_pdist.py
import numpy as np
from milk.unsupervised import pdist, plike
def test_pdist():
np.random.seed(222)
X = np.random.randn(100,23)
Y = np.random.randn(80,23)
Dxx = pdist(X)
for i in xrange(X.shape[0]):
for j in xrange(X.shape[1]):
assert np.allclose(Dxx[i,j], np.sum((X[i]-X[j])**2))
Dxy = pdist(X,Y)
for i in xrange(X.shape[0]):
for j in xrange(Y.shape[1]):
assert np.allclose(Dxy[i,j], np.sum((X[i]-Y[j])**2))
Dxye = pdist(X, Y, 'euclidean')
assert np.allclose(Dxye, np.sqrt(Dxy))
def test_plike():
np.random.seed(222)
X = np.random.randn(100,23)
Lxx = plike(X)
assert len(Lxx) == len(Lxx.T)
Lxx2 = plike(X, sigma2=.001)
assert Lxx[0,1] != Lxx2[0,1]
assert Lxx[0,0] == Lxx2[0,0]
<file_sep>/pubsub/helloworld.py
'''
One listener is subscribed to a topic called 'rootTopic'.
One 'rootTopic' message gets sent.
'''
import Image
from pubsub import pub
from pylab import imread
import cPickle as pickle
import numpy
# ------------ create a listener ------------------
def listener1(arg1, arg2=None):
#print 'Function listener1 received:\n arg1="%s"\n arg2="%s"' % (arg1, arg2)
img_rec = arg1
#Image.fromarray(arg1).transpose(Image.FLIP_TOP_BOTTOM)
print len(img_rec), type(img_rec)
img1 = Image.fromarray(img_rec).transpose(Image.FLIP_TOP_BOTTOM)
img1.show()
# ------------ register listener ------------------
pub.subscribe(listener1, 'rootTopic')
#------ create a function that sends a message ----
def doSomethingAndSendResult():
fname = 'rec_image.jpg'
#img = Image.open(filetosend)
img = imread(fname)
#print type(img), img.size, img.format, dir(img)
# Pickle dictionary using protocol 0.
#im = Image.fromarray(img).transpose(Image.FLIP_TOP_BOTTOM)
#im.show()
print type(img), len(img)
#p_img = pickle.dumps(img)
#print img.toString
#img.show()
#imshow(img)
#f = open(filetosend, "rb")
#img = f.read()
#f.close()
print 'lahdida... have result, publishing it via pubsub'
#pub.sendMessage('rootTopic', arg1=123, arg2=dict(a='abc', b='def'))
pub.sendMessage('rootTopic', arg1=img)
# --------- define the main part of application --------
if __name__ == '__main__':
doSomethingAndSendResult()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/README.rst
==============================
MILK: MACHINE LEARNING TOOLKIT
==============================
Machine Learning in Python
--------------------------
Milk is a machine learning toolkit in Python.
Its focus is on supervised classification with several classifiers available:
SVMs (based on libsvm), k-NN, random forests, decision trees. It also performs
feature selection. These classifiers can be combined in many ways to form
different classification systems.
For unsupervised learning, milk supports k-means clustering and affinity
propagation.
Milk is flexible about its inputs. It is optimised for numpy arrays, but can
often handle anything (for example, for SVMs, you can use any datatype and any
kernel and it does the right thing).
There is a strong emphasis on speed and low memory usage. Therefore, most of
the performance sensitive code is in C++. This is behind Python-based
interfaces for convenience.
To learn more, check the docs at `http://packages.python.org/milk/
<http://packages.python.org/milk/>`_ or the code demos included with the source
at ``milk/demos/``.
Examples
--------
Here is how to test how well you can classify some ``features,labels`` data,
measured by cross-validation::
import numpy as np
import milk
features = np.random.rand(100,10) # 2d array of features: 100 examples of 10 features each
labels = np.zeros(100)
features[50:] += .5
labels[50:] = 1
confusion_matrix, names = milk.nfoldcrossvalidation(features, labels)
print 'Accuracy:', confusion_matrix.trace()/float(confusion_matrix.sum())
If you want to use a classifier, you instantiate a *learner object* and call its
``train()`` method::
import numpy as np
import milk
features = np.random.rand(100,10)
labels = np.zeros(100)
features[50:] += .5
labels[50:] = 1
learner = milk.defaultclassifier()
model = learner.train(features, labels)
# Now you can use the model on new examples:
example = np.random.rand(10)
print model.apply(example)
example2 = np.random.rand(10)
example2 += .5
print model.apply(example2)
There are several classification methods in the package, but they all use the
same interface: ``train()`` returns a *model* object, which has an ``apply()``
method to execute on new instances.
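The same interface also plugs into cross-validation; for instance, a random
forest learner can be evaluated like this (a minimal sketch with synthetic
data, mirroring the demos shipped in ``milk/demos/``)::

    import numpy as np
    import milk
    from milk.supervised import randomforest
    from milk.supervised.multi import one_against_one

    features = np.random.rand(100, 10)
    labels = np.zeros(100)
    features[50:] += .5
    labels[50:] = 1
    learner = one_against_one(randomforest.rf_learner())
    cmat, names = milk.nfoldcrossvalidation(features, labels, classifier=learner)
    print 'Accuracy:', cmat.trace()/float(cmat.sum())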
Details
-------
License: MIT
Author: <NAME> (with code from LibSVM and scikits.learn)
API Documentation: `http://packages.python.org/milk/ <http://packages.python.org/milk/>`_
Mailing List: `http://groups.google.com/group/milk-users
<http://groups.google.com/group/milk-users>`__
Features
--------
- SVMs. Using the libsvm solver with a pythonesque wrapper around it.
- LASSO
- K-means using as little memory as possible. It can cluster millions of
instances efficiently.
- Random forests
- Self organising maps
- Stepwise Discriminant Analysis for feature selection.
- Non-negative matrix factorisation
- Affinity propagation
Recent History
--------------
The ChangeLog file contains a more complete history.
New in 0.5 (05 Nov 2012)
~~~~~~~~~~~~~~~~~~~~~~~~
- Add coordinate-descent based LASSO
- Add unsupervised.center function
- Make zscore work with NaNs (by ignoring them)
- Propagate apply_many calls through transformers
- Much faster SVM classification, which means a much faster defaultlearner()
[measured 2.5x speedup on yeast dataset!]
New in 0.4.3 (17 Sept 2012)
~~~~~~~~~~~~~~~~~~~~~~~~~~~
- Add select_n_best & rank_corr to featureselection
- Add Euclidean MDS
- Add tree multi-class strategy
- Fix adaboost with boolean weak learners (issue #6, reported by audy
(<NAME>))
- Add ``axis`` arguments to zscore()
New in 0.4.2 (16 Jan 2012)
~~~~~~~~~~~~~~~~~~~~~~~~~~
- Make defaultlearner able to take extra arguments
- Make ctransforms_model a supervised_model (adds apply_many)
- Add expanded argument to defaultlearner
- Fix corner case in SDA
- Fix repeated_kmeans
- Fix parallel gridminimise on Windows
- Add multi_label argument to normaliselabels
- Add multi_label argument to nfoldcrossvalidation.foldgenerator
- Do not fork a process in gridminimise if nprocs == 1 (makes for easier
debugging, at the cost of slightly more complex code).
- Add milk.supervised.multi_label
- Fix ext.jugparallel when features is a Task
- Add milk.measures.bayesian_significance
New in 0.4.1
~~~~~~~~~~~~
- Fix important bug in multi-process gridsearch
New in 0.4.0
~~~~~~~~~~~~
- Use multiprocessing to take advantage of multi core machines (off by
default).
- Add perceptron learner
- Set random seed in random forest learner
- Add warning to milk/__init__.py if import fails
- Add return value to ``gridminimise``
- Set random seed in ``precluster_learner``
- Implemented Error-Correcting Output Codes for reduction of multi-class
to binary (including probability estimation)
- Add ``multi_strategy`` argument to ``defaultlearner()``
- Make the dot kernel in svm much, much, faster
- Make sigmoidal fitting for SVM probability estimates faster
- Fix bug in randomforest (patch by Wei on milk-users mailing list)
For older versions, see ``ChangeLog`` file
<file_sep>/gui/foo/foo/helpers.py
# -*- coding: utf-8 -*-
### BEGIN LICENSE
# This file is in the public domain
### END LICENSE
"""Helpers for an Ubuntu application."""
__all__ = [
'make_window',
]
import os
import gtk
from foo.fooconfig import get_data_file
import gettext
from gettext import gettext as _
gettext.textdomain('foo')
def get_builder(builder_file_name):
"""Return a fully-instantiated gtk.Builder instance from specified ui
file
:param builder_file_name: The name of the builder file, without extension.
Assumed to be in the 'ui' directory under the data path.
"""
# Look for the ui file that describes the user interface.
ui_filename = get_data_file('ui', '%s.ui' % (builder_file_name,))
if not os.path.exists(ui_filename):
ui_filename = None
builder = gtk.Builder()
builder.set_translation_domain('foo')
builder.add_from_file(ui_filename)
return builder
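# Minimal usage sketch (an assumption, mirroring how bin/foo uses this helper):
# load the FooWindow UI definition and fetch its top-level widget. Requires the
# packaged ui/FooWindow.ui data file and a running GTK display.
if __name__ == "__main__":
    builder = get_builder('FooWindow')
    window = builder.get_object("foo_window")
    window.show()
    gtk.main()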
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/utils/parallel.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division, with_statement
import multiprocessing
max_procs = 1
_used_procs = multiprocessing.Value('i', 1)
_plock = multiprocessing.Lock()
def set_max_processors(value=None):
'''
set_max_processors(value=None)
Set the maximum number of processors to ``value`` (or to the number of
physical CPUs if ``None``).
Note that this is valid for the current process and its children, but not
the parent.
Parameters
----------
value : int, optional
Number of processors to use. Defaults to number of CPUs (as returned by
``multiprocessing.cpu_count()``).
'''
global max_procs
if value is None:
value = multiprocessing.cpu_count()
max_procs = value
def get_proc():
'''
available = get_proc()
Reserve a processor
Returns
-------
available : bool
True if a processor is available
'''
with _plock:
if _used_procs.value >= max_procs:
return False
_used_procs.value += 1
return True
def release_proc():
'''
release_proc()
Returns a processor to the pool
'''
with _plock:
_used_procs.value -= 1
def release_procs(n, count_current=True):
'''
release_procs(n, count_current=True)
Returns ``n`` processors to the pool
Parameters
----------
n : int
Number of processors to release
count_current : bool, optional
Whether the current processor is to be included in ``n`` (default: True)
'''
if count_current:
n -= 1
if n > 0:
with _plock:
_used_procs.value -= n
def get_procs(desired=None, use_current=True):
'''
n = get_procs(desired=None, use_current=True)
Get the up to ``desired`` processors (use None for no maximum).
Parameters
----------
desired : int, optional
Number of processors you wish. By default, there is no maximum
use_current: bool, optional
Whether to count the current processor, True by default.
'''
if desired is None:
desired = 1024 # This should last a few years
n = (1 if use_current else 0)
while n < desired:
if get_proc():
n += 1
else:
return n
return n
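# Illustrative sketch (not part of the original module): the intended
# reserve/release pattern for the helpers above when fanning work out to
# several processes.
if __name__ == '__main__':
    set_max_processors()        # defaults to multiprocessing.cpu_count()
    n = get_procs(desired=4)    # reserve up to 4 slots, counting this process
    try:
        print 'running with %d processor slot(s)' % n
    finally:
        release_procs(n)        # return the extra slots to the pool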
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/ext/__init__.py
'''
===============
Milk Extensions
===============
These are modules whose functionality is not really part of the core
functionality of milk, but which are useful with it.
'''
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/demos/rf_wine_2d.py
from milk.supervised import randomforest
from milk.supervised.multi import one_against_one
import milk.nfoldcrossvalidation
import milk.unsupervised
import pylab
from milksets import wine
# Load 'wine' dataset
features, labels = wine.load()
# random forest learner
rf_learner = randomforest.rf_learner()
# rf is a binary learner, so we transform it into a multi-class classifier
learner = one_against_one(rf_learner)
# cross validate with this learner and return predictions on left-out elements
cmat,names, preds = milk.nfoldcrossvalidation(features, labels, classifier=learner, return_predictions=1)
print 'cross-validation accuracy:', cmat.trace()/float(cmat.sum())
# dimensionality reduction for display
x,v = milk.unsupervised.pca(features)
colors = "rgb" # predicted colour
marks = "xo" # whether the prediction was correct
for (y,x),p,r in zip(x[:,:2], preds, labels):
c = colors[p]
m = marks[p == r]
pylab.plot(y,x,c+m)
pylab.show()
<file_sep>/pubsub/recieve.py~
#!/usr/bin/env python
import sys
sys.path.append( "../lib/" )
import pika
from pylab import imread
import Image
import time
import cv, cv2
import cPickle as pickle
from img_processing_tools import *
import numpy
count = 0
def callback(ch, method, properties, body):
global count
count = count +1
#while True:
#time.sleep(0.0001)
#try:
frame = array2CV(pickle.loads(body))
print frame, type(frame)
print count
cv.ShowImage('Video', frame)
cv.WaitKey(10)
#except:
#pass
#print len(img_rec), type(img_rec)
#img1 = Image.fromstring('RGB', (320,240), img_rec)#.transpose(Image.FLIP_TOP_BOTTOM)
#
#img1 = Image.open(StringIO(img_rec))
#img1.show()
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
#channel.queue_declare(queue='hello')
channel.queue_declare(queue='hello',arguments={'x-message-ttl' : 1000})
print ' [*] Waiting for messages. To exit press CTRL+C'
cv2.namedWindow('Video', cv.CV_WINDOW_AUTOSIZE)
channel.basic_consume(callback, queue='hello', no_ack=True)
channel.start_consuming()
<file_sep>/android/bluetooth_server.py
#!/usr/bin/python
#JoshAshby 2011
#<EMAIL>
#http://joshashby.com
"""
Import all of the libraries we need.
sys is needed for the system arguments
bluetooth is for communicating with the phone which acts as a barcode scanner
virtkey is for emulating keyboard key presses to type the barcode data
"""
import bluetooth
import sys
import virtkey
"""
sock is the socket for the bluetooth connection, all bluetooth functions happen through sock
v is the object for virtual keys (the keyboard emulation)
"""
sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
v = virtkey.virtkey()
"""
returns: host in the format of xx:xx:xx:xx:xx:xx
takes: nothing
"""
def find_host():
for host, name in bluetooth.discover_devices(lookup_names=True):
print host, name
#if (name == 'Nexus 10' or host == 'BC:20:A4:74:7E:43'):
if (host == "04:E4:51:11:19:09"):
print "name: ", name
return host
"""
returns: port number of SL4A, needed for connecting with the phone's python script in order to get and pass data between the phone and the computer
takes: host number in the format of xx:xx:xx:xx:xx:xx
finds which port everything, including the needed SL4A port, is running on. Needed in order to connect to device
"""
def find_port(host):
services = bluetooth.find_service(address=host)
for service in services:
print service
if service['name'] == 'SL4A':
print service
port = service['port']
print port
return port
"""
returns: nothing
takes: nothing
connect to the bluetooth device
"""
def connect(host, port):
sock.connect((host, port))
print sock
"""
returns: 100 bytes from bluetooth every time it's called
takes: nothing
receive data from the bluetooth device
"""
def receive():
return sock.recv(100)
"""
returns: nothing
takes: product_name, name of the product
send data to the bluetooth device
"""
def send(product_name):
sock.send(product_name)
"""
returns: nothing
takes: nothing
take input from the bluetooth device
"""
def scan():
while True:
query = receive()
print "received from client:", query
for x in query:
v.press_unicode(ord(x))
v.release_unicode(ord(x))
host = find_host()
print host
if (host is None):
host = find_host()
print host
port = find_port(host)
print port
connect(host,port)
scan()
<file_sep>/gui/breezy/textfielddemo.py
"""
File: textfielddemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
class TextFieldDemo(EasyFrame):
"""Converts an input string to uppercase and displays the result."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self)
# Label and field for the input
self.addLabel(text = "Input",
row = 0, column = 0)
self.inputField = self.addTextField(text = "",
row = 0,
column = 1)
# Label and field for the output
self.addLabel(text = "Output",
row = 1, column = 0)
self.outputField = self.addTextField(text = "",
row = 1,
column = 1,
state = "readonly")
# The command button
self.button = self.addButton(text = "Convert",
row = 2, column = 0,
columnspan = 2,
command = self.convert)
# The event handling method for the button
def convert(self):
"""Inputs the string, converts it to uppercase,
and outputs the result."""
text = self.inputField.getText()
result = text.upper()
self.outputField.setText(result)
#Instantiate and pop up the window."""
if __name__ == "__main__":
TextFieldDemo().mainloop()
<file_sep>/pubsub/send_video_test.py
import send
test = send.mobot_video_feed(0, 320, 240)
print test
<file_sep>/gui/breezy/canvasdemo2.py
"""
File: canvasdemo2.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
import random
class CanvasDemo(EasyFrame):
"""Draws filled ovals on a canvas, and allows the user to erase
them all."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Canvas Demo 2")
self.colors = ("blue", "green", "red", "yellow")
self.shapes = list()
# Canvas
self.canvas = self.addCanvas(row = 0, column = 0,
columnspan = 2,
width = 300, height = 150,
background = "gray")
# Command buttons
self.addButton(text = "Draw oval", row = 1, column = 0,
command = self.drawOval)
self.addButton(text = "Erase all", row = 1, column = 1,
command = self.eraseAll)
# Event handling method
def drawOval(self):
"""Draws a filled oval at a random position."""
x = random.randint(0, 300)
y = random.randint(0, 150)
color = random.choice(self.colors)
shape = self.canvas.drawOval(x, y, x + 25, y + 25, fill = color)
self.shapes.append(shape)
def eraseAll(self):
"""Deletes all ovals from the canvas."""
for shape in self.shapes:
self.canvas.delete(shape)
self.shapes = list()
# Instantiate and pop up the window."""
if __name__ == "__main__":
CanvasDemo().mainloop()
<file_sep>/gui/foo/bin/foo
#!/usr/bin/python
# -*- coding: utf-8 -*-
### BEGIN LICENSE
# This file is in the public domain
### END LICENSE
import sys
import os
import gtk
import gettext
from gettext import gettext as _
gettext.textdomain('foo')
# optional Launchpad integration
# this shouldn't crash if not found as it is simply used for bug reporting
try:
import LaunchpadIntegration
launchpad_available = True
except:
launchpad_available = False
# Add project root directory (enable symlink, and trunk execution).
PROJECT_ROOT_DIRECTORY = os.path.abspath(
os.path.dirname(os.path.dirname(os.path.realpath(sys.argv[0]))))
if (os.path.exists(os.path.join(PROJECT_ROOT_DIRECTORY, 'foo'))
and PROJECT_ROOT_DIRECTORY not in sys.path):
sys.path.insert(0, PROJECT_ROOT_DIRECTORY)
os.putenv('PYTHONPATH', PROJECT_ROOT_DIRECTORY) # for subprocesses
from foo import (
AboutFooDialog, PreferencesFooDialog)
from foo.helpers import get_builder
class FooWindow(gtk.Window):
__gtype_name__ = "FooWindow"
# To construct a new instance of this class, the following notable
# methods are called in this order:
# __new__(cls)
# __init__(self)
# finish_initializing(self, builder)
# __init__(self)
#
# For this reason, it's recommended you leave __init__ empty and put
# your initialization code in finish_initializing
def __new__(cls):
"""Special static method that's automatically called by Python when
constructing a new instance of this class.
Returns a fully instantiated FooWindow object.
"""
builder = get_builder('FooWindow')
new_object = builder.get_object("foo_window")
new_object.finish_initializing(builder)
return new_object
def finish_initializing(self, builder):
"""Called while initializing this instance in __new__
finish_initializing should be called after parsing the UI definition
and creating a FooWindow object with it in order to finish
initializing the start of the new FooWindow instance.
Put your initialization code in here and leave __init__ undefined.
"""
# Get a reference to the builder and set up the signals.
self.builder = builder
self.builder.connect_signals(self)
store = self.builder.get_object('liststore1')
self.builder.get_object('combobox1').set_model(store)
global launchpad_available
if launchpad_available:
# see https://wiki.ubuntu.com/UbuntuDevelopment/Internationalisation/Coding for more information
# about LaunchpadIntegration
helpmenu = self.builder.get_object('helpMenu')
if helpmenu:
LaunchpadIntegration.set_sourcepackagename('foo')
LaunchpadIntegration.add_items(helpmenu, 0, False, True)
else:
launchpad_available = False
# Uncomment the following code to read in preferences at start up.
#dlg = PreferencesFooDialog.PreferencesFooDialog()
#self.preferences = dlg.get_preferences()
# Code for other initialization actions should be added here.
def about(self, widget, data=None):
"""Display the about box for foo."""
about = AboutFooDialog.AboutFooDialog()
response = about.run()
about.destroy()
def preferences(self, widget, data=None):
"""Display the preferences window for foo."""
prefs = PreferencesFooDialog.PreferencesFooDialog()
response = prefs.run()
if response == gtk.RESPONSE_OK:
# Make any updates based on changed preferences here.
pass
prefs.destroy()
def quit(self, widget, data=None):
"""Signal handler for closing the FooWindow."""
self.destroy()
def on_button1_clicked(self, widget, data=None):
#print "hello"
text = self.builder.get_object("entry1").get_text()
#set the UI to display the string
buff = self.builder.get_object("textview1").get_buffer()
buff.set_text(text)
#widge = self.builder.get_object("object_name").set_text(my_string)
#combobox = self.builder.get_object('combobox1')
#store = gtk.ListStore(gobject.TYPE_STRING)
#liststore = gtk.ListStore(str)
self.builder.get_object('liststore1').append([text])
#store.append([text])
#self.combobox.set_active(0)
#combobox.set_model(store)
cell = gtk.CellRendererText()
self.builder.get_object('combobox1').pack_start(cell)
self.builder.get_object('combobox1').add_attribute(cell, 'text', 0)
#self.builder.get_object('combobox1').append_text(text)
#self.combobox.append_text("Option 2")
def on_destroy(self, widget, data=None):
"""Called when the FooWindow is closed."""
# Clean up code for saving application state should be added here.
gtk.main_quit()
if __name__ == "__main__":
# Support for command line options.
import logging
import optparse
parser = optparse.OptionParser(version="%prog %ver")
parser.add_option(
"-v", "--verbose", action="store_true", dest="verbose",
help=_("Show debug messages"))
(options, args) = parser.parse_args()
# Set the logging level to show debug messages.
if options.verbose:
logging.basicConfig(level=logging.DEBUG)
logging.debug('logging enabled')
# Run the application.
window = FooWindow()
window.show()
gtk.main()
<file_sep>/scipy_cluster1.py
from hcluster import pdist, linkage, dendrogram
import numpy
from numpy.random import rand
X = rand(10,100)
X[0:5,:] *= 2
Y = pdist(X)
Z = linkage(Y)
dendrogram(Z)
<file_sep>/goodfeaturestotrack.py
import cv
img = cv.LoadImageM("2005_Nickel_Proof_Obv.tif", cv.CV_LOAD_IMAGE_GRAYSCALE)
eig_image = cv.CreateMat(img.rows, img.cols, cv.CV_32FC1)
temp_image = cv.CreateMat(img.rows, img.cols, cv.CV_32FC1)
# create the source image
canny_image = cv.CreateImage (cv.GetSize (img), 8, 1)
window_name = "Good Features To Track"
# create window and display the original picture in it
#cv.NamedWindow(window_name, 1)
#cv.SetZero(laplace)
#cv.ShowImage(window_name, img)
cv.Laplace(img, temp_image, 3)
cv.ShowImage("Laplace",temp_image)
#cv.SetZero(temp_image)
cv.Canny(img, canny_image , 50 , 150)
cv.ShowImage("Canny",canny_image )
for (x,y) in cv.GoodFeaturesToTrack(img, eig_image, temp_image, 20, 0.04, 1.0, useHarris = True):
print "good feature at", x,y
#Circle(img, center, radius, color, thickness=1, lineType=8, shift=0)
cv.Circle(img, (x,y), 6, (255,0,0),1, cv.CV_AA , 0)
cv.ShowImage(window_name, img)
cv.WaitKey(5)
img = cv.LoadImageM("/home/lforet/Downloads/photo.JPG")
tempimage = cv.LoadImageM("/home/lforet/Downloads/eye.JPG")
size = cv.GetSize(img)
size2 = cv.GetSize(tempimage)
width = (size[0] - size2[0]+1)
height = (size[1] - size2[1]+1)
resultimage = cv.CreateImage ((width,height), cv.IPL_DEPTH_32F, 1)
cv.MatchTemplate(img, tempimage, resultimage, 1)
cv.ShowImage("result", resultimage)
cv.ShowImage("photo", img)
# wait some key to end
img = cv.LoadImageM("2005_Nickel_Proof_Obv.tif", cv.CV_LOAD_IMAGE_GRAYSCALE)
(keypoints, descriptors) = cv.ExtractSURF(img, None, cv.CreateMemStorage(), (0, 30000, 3, 1))
print len(keypoints), len(descriptors)
for ((x, y), laplacian, size, dir, hessian) in keypoints:
print "x=%d y=%d laplacian=%d size=%d dir=%f hessian=%f" % (x, y, laplacian, size, dir, hessian)
cv.Circle(img, (x,y), size, (255,0,0),1, cv.CV_AA , 0)
cv.ShowImage("SURF", img)
stor = cv.CreateMemStorage()
seq = cv.FindContours(canny_image, stor, cv.CV_RETR_LIST, cv.CV_CHAIN_APPROX_SIMPLE)
cv.DrawContours(canny_image, seq, (255,0,0), (0,0,255), 20, thickness=1)
cv.ShowImage("Contours",canny_image )
original = cv.LoadImageM("2005_Nickel_Proof_Obv.tif", cv.CV_LOAD_IMAGE_GRAYSCALE)
print cv.GetHuMoments(cv.Moments(original))
cv.WaitKey(0)
"""
GoodFeaturesToTrack(image, eigImage, tempImage, cornerCount, qualityLevel, minDistance, mask=NULL, blockSize=3, useHarris=0, k=0.04) corners
Determines strong corners on an image.
Parameters:
* image (CvArr) The source 8-bit or floating-point 32-bit, single-channel image
* eigImage (CvArr) Temporary floating-point 32-bit image, the same size as image
* tempImage (CvArr) Another temporary image, the same size and format as eigImage
* cornerCount (int) number of corners to detect
* qualityLevel (float) Multiplier for the max/min eigenvalue; specifies the minimal accepted quality of image corners
* minDistance (float) Limit, specifying the minimum possible distance between the returned corners; Euclidean distance is used
* mask (CvArr) Region of interest. The function selects points either in the specified region or in the whole image if the mask is NULL
* blockSize (int) Size of the averaging block, passed to the underlying CornerMinEigenVal or CornerHarris used by the function
* useHarris (int) If nonzero, Harris operator ( CornerHarris ) is used instead of default CornerMinEigenVal
* k (float) Free parameter of Harris detector; used only if useHarris != 0
The function finds the corners with big eigenvalues in the image. The function first calculates the minimal eigenvalue for every source image pixel using the CornerMinEigenVal function and stores them in eigImage. Then it performs non-maxima suppression (only the local maxima in a 3x3 neighborhood are retained). The next step rejects the corners with the minimal eigenvalue less than qualityLevel * max(eigImage(x,y)). Finally, the function ensures that the distance between any two corners is not smaller than minDistance. The weaker corners (with a smaller min eigenvalue) that are too close to the stronger corners are rejected.
Note that if the function is called with different values A and B of the parameter qualityLevel, and A > B, the array of returned corners with qualityLevel=A will be the prefix of the output corners array with qualityLevel=B.
"""
<file_sep>/navigation/terrain_id/mahotas_classify.py
from glob import glob
import mahotas
import mahotas.features
import milk
import numpy as np
from jug import TaskGenerator
import pickle
#@TaskGenerator
def features_for(imname):
img = mahotas.imread(imname, as_grey=False)
features = mahotas.features.haralick(img).mean(0)
#features = mahotas.features.lbp(img, 1, 8)
#features = mahotas.features.tas(img)
#features =mahotas.features.zernike_moments(img, 2, degree=8)
#print 'features:', features, len(features), type(features[0])
return features
#return
#@TaskGenerator
def learn_model(features, labels):
learner = milk.defaultclassifier()
return learner.train(features, labels)
#@TaskGenerator
def classify(model, features):
return model.apply(features)
positives = glob('/home/lforet/images/class1/*.jpg')
negatives = glob('/home/lforet/images/class2/*.jpg')
unlabeled = glob('../../images/*.jpg')
#print positives
features = map(features_for, negatives + positives)
labels = [0] * len(negatives) + [1] * len(positives)
model = learn_model(features, labels)
print type(features), labels
#pickle.dump( model, open( "ai_model.mdl", "wb" ) )
#model = pickle.load( open( "ai_model.mdl", "rb" ) )
#labeled = [classify(model, features_for(u)) for u in unlabeled]
for u in unlabeled:
print "image: ", u, " = ", classify(model, features_for(u))
#print len(labeled)
#print labeled
<file_sep>/navigation/gps/gpsreader.py~
#!/usr/bin/python
#sudo apt-get install gpsd gpsd-clients
#gpsd /dev/ttyUSB0 -b -n
#The gpsd server reads NMEA sentences from the gps unit and is accessed on port 2947. You can test if everything is working by running a pre-built gpsd client such as xgps.
import gps, os, time
#from future import division
from math import sin, cos, radians, sqrt, atan2, asin, sqrt, pi
import math
rEarth = 6371.01 # Earth's average radius in km
epsilon = 0.000001 # threshold for floating-point equality
def lldistance(a, b):
"""
Calculates the distance between two GPS points (decimal)
@param a: 2-tuple of point A
@param b: 2-tuple of point B
@return: distance in m
"""
r = 6367442.5 # average earth radius in m
dLat = radians(a[0]-b[0])
dLon = radians(a[1]-b[1])
x = sin(dLat/2) ** 2 + \
cos(radians(a[0])) * cos(radians(b[0])) *\
sin(dLon/2) ** 2
#original# y = 2 * atan2(sqrt(x), sqrt(1-x))
y = 2 * asin(sqrt(x))
d = r * y
return d
def haversine(lon1, lat1, lon2, lat2):
"""
Calculate the great circle distance between two points
on the earth (specified in decimal degrees)
"""
# convert decimal degrees to radians
lon1, lat1, lon2, lat2 = map(radians, [lon1, lat1, lon2, lat2])
# haversine formula
dlon = lon2 - lon1
dlat = lat2 - lat1
a = sin(dlat/2)**2 + cos(lat1) * cos(lat2) * sin(dlon/2)**2
c = 2 * asin(sqrt(a))
km = 6367 * c
return km
def calculate_distance(lat1, lon1, lat2, lon2):
'''
* Calculates the distance between two points given their (lat, lon) co-ordinates.
* It uses the Spherical Law Of Cosines (http://en.wikipedia.org/wiki/Spherical_law_of_cosines):
*
* cos(c) = cos(a) * cos(b) + sin(a) * sin(b) * cos(C) (1)
*
* In this case:
* a = lat1 in radians, b = lat2 in radians, C = (lon2 - lon1) in radians
* and because the latitude range is [-pi/2, pi/2] instead of [0, pi]
* and the longitude range is [-pi, pi] instead of [0, 2*pi]
* (1) transforms into:
*
* x = cos(c) = sin(a) * sin(b) + cos(a) * cos(b) * cos(C)
*
* Finally the distance is arccos(x)
'''
if ((lat1 == lat2) and (lon1 == lon2)):
return 0
try:
delta = lon2 - lon1
a = math.radians(lat1)
b = math.radians(lat2)
C = math.radians(delta)
x = math.sin(a) * math.sin(b) + math.cos(a) * math.cos(b) * math.cos(C)
distance = math.acos(x) # in radians
distance = math.degrees(distance) # in degrees
distance = distance * 60 # 60 nautical miles / lat degree
distance = distance * 1852 # conversion to meters
#distance = round(distance)
return distance;
except:
return 0
def deg2rad(angle):
return angle*pi/180
def rad2deg(angle):
return angle*180/pi
def pointRadialDistance(lat1, lon1, bearing, distance):
"""
Return final coordinates (lat2,lon2) [in degrees] given initial coordinates
(lat1,lon1) [in degrees] and a bearing [in degrees] and distance [in km]
"""
rlat1 = deg2rad(lat1)
rlon1 = deg2rad(lon1)
rbearing = deg2rad(bearing)
rdistance = distance / rEarth # normalize linear distance to radian angle
rlat = asin( sin(rlat1) * cos(rdistance) + cos(rlat1) * sin(rdistance) * cos(rbearing) )
if cos(rlat) == 0 or abs(cos(rlat)) < epsilon: # Endpoint a pole
rlon=rlon1
else:
rlon = ( (rlon1 - asin( sin(rbearing)* sin(rdistance) / cos(rlat) ) + pi ) % (2*pi) ) - pi
lat = rad2deg(rlat)
lon = rad2deg(rlon)
return (lat, lon)
def recalculate_coordinate(val, _as=None):
"""
Accepts a coordinate as a tuple (degree, minutes, seconds)
You can give only one of them (e.g. only minutes as a floating point number)
and it will be duly recalculated into degrees, minutes and seconds.
Return value can be specified as 'deg', 'min' or 'sec'; default return value is
a proper coordinate tuple.
This formula is only an approximation when applied to the Earth, because the Earth is not a perfect sphere: the Earth radius R varies from 6356.78 km at the poles to 6378.14 km at the equator. There are small corrections, typically on the order of 0.1% (assuming the geometric mean R = 6367.45 km is used everywhere, for example), because of this slight ellipticity of the planet. A more accurate method, which takes into account the Earth's ellipticity, is given by Vincenty's formulae.
"""
deg, min, sec = val
# pass outstanding values from right to left
min = (min or 0) + int(sec) / 60
sec = sec % 60
deg = (deg or 0) + int(min) / 60
min = min % 60
# pass decimal part from left to right
dfrac, dint = math.modf(deg)
min = min + dfrac * 60
deg = dint
mfrac, mint = math.modf(min)
sec = sec + mfrac * 60
min = mint
if _as:
sec = sec + min * 60 + deg * 3600
if _as == 'sec': return sec
if _as == 'min': return sec / 60
if _as == 'deg': return sec / 3600
return deg, min, sec
def points2distance(start, end):
"""
Calculate distance (in kilometers) between two points given as (long, latt) pairs
based on Haversine formula (http://en.wikipedia.org/wiki/Haversine_formula).
Implementation inspired by JavaScript implementation from
http://www.movable-type.co.uk/scripts/latlong.html
Accepts coordinates as tuples (deg, min, sec), but coordinates can be given
in any form - e.g. can specify only minutes:
(0, 3133.9333, 0)
is interpreted as
(52.0, 13.0, 55.998000000008687)
which, not accidentally, is the lattitude of Warsaw, Poland.
"""
start_long = math.radians(recalculate_coordinate(start[0], 'deg'))
start_latt = math.radians(recalculate_coordinate(start[1], 'deg'))
end_long = math.radians(recalculate_coordinate(end[0], 'deg'))
end_latt = math.radians(recalculate_coordinate(end[1], 'deg'))
d_latt = end_latt - start_latt
d_long = end_long - start_long
a = math.sin(d_latt/2)**2 + math.cos(start_latt) * math.cos(end_latt) * math.sin(d_long/2)**2
c = 2 * math.atan2(math.sqrt(a), math.sqrt(1-a))
return rEarth * c
def ActiveSatelliteCount(session):
count = 0
for i in range(len(session)):
s = str(session[i])
if s[-1] == "y":
count = count + 1
return count
def GetHomePosition():
os.system("clear")
while ActiveSatelliteCount(session.satellites) < 6:
print "Acquiring at least 6 GPS Satellites..."
print "Number of acquired satellites: ", ActiveSatelliteCount(session.satellites)
time.sleep(1)
os.system("clear")
session.next()
Lat = 0
Long = 0
print "Have Acquired at least 6 GPS Satellites..."
print "Saving Home Position....(about 15 secs)....."
num = 30
for i in range(num):
time.sleep(.5)
session.next()
Lat = Lat + session.fix.latitude
Long = Long + session.fix.longitude
print session.fix.latitude, session.fix.longitude
Lat = Lat / num
Long = Long / num
print "avgs =", Lat, Long
return Lat, Long, session.utc, session.fix.time, session.fix.altitude
def decdeg2dms(dd):
mnt,sec = divmod(dd*3600,60)
deg,mnt = divmod(mnt,60)
return deg,mnt,sec
session = gps.gps(host="localhost", port="2947")
#gps2 = gps.gps(host="localhost", port="2948")
#print gps.gps()
#print gps2
session.next()
session.stream()
#gps2.next()
#gps2.stream()
home = GetHomePosition()
print home
HomeLat2 = home[0]
HomeLong2 = home[1]
HomeLat = decdeg2dms(home[0])
HomeLong = decdeg2dms(home[1])
print HomeLat, HomeLong
#(lat,lon) = pointRadialDistance(lat1,lon1,bear,dist)
# print "%6.2f \t %6.2f \t %4.1f \t %6.1f \t %6.2f \t %6.2f" % (lat1,lon1,bear,dist,lat,lon)
while 1:
os.system("clear")
session.next()
#gps2.next()
# a = altitude, d = date/time, m=mode,
# o=postion/fix, s=status, y=satellites
print " GPS reading"
print "-------------"
print 'Start latitude ' , HomeLat2
print 'Start longitude ' , HomeLong2
print 'latitude ' , session.fix.latitude
print 'longitude ' , session.fix.longitude
print 'time utc ' , session.utc, session.fix.time
print 'altitude ' , session.fix.altitude
print 'epx ', session.fix.epx
print 'epv ', session.fix.epv
print 'ept ', session.fix.ept
print "speed ", session.fix.speed
print "climb " , session.fix.climb
print
print 'Satellites (total of', len(session.satellites) , ' in view)'
for i in session.satellites:
print '\t', i
print "Active satellites used: ", ActiveSatelliteCount(session.satellites)
currentLat = session.fix.latitude
currentLong = session.fix.longitude
distancefromhome = calculate_distance(HomeLat2, HomeLong2, currentLat, currentLong)
print "Distance from home =" , distancefromhome
print "Distance from home in meters = %8.8f \t" %(distancefromhome/1000)
print "new calc =", calculate_distance(HomeLat2, HomeLong2, session.fix.latitude, session.fix.longitude)
print "new calc 2 =:", lldistance((HomeLat2, HomeLong2), (session.fix.latitude, session.fix.longitude))
#time.sleep(1.5)
print 'track:', session.fix.track
#print help(session)
print 'mode ' , session.fix.mode
#print 'type' , session.fixType
print 'quality ', session.satellites
time.sleep(1.5)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_precluster_learner.py
import numpy as np
from milk.supervised.precluster import precluster_learner, select_precluster
from milk.tests.fast_classifier import fast_classifier
def c0():
return np.random.rand(8)
def c1():
return c0()+2.*np.ones(8)
def gen_data(seed, with_nums=False):
np.random.seed(seed)
features = []
labels =[]
for i in xrange(200):
f = []
for j in xrange(40):
use_0 = (i < 100 and j < 30) or (i >= 100 and j >= 30)
if use_0: f.append(c0())
else: f.append(c1())
labels.append((i < 100))
if with_nums:
features.append((f,[]))
else:
features.append(f)
return features, labels
def test_precluster():
learner = precluster_learner([2], base=fast_classifier(), R=12)
features, labels = gen_data(22)
model = learner.train(features,labels)
assert model.apply([c0() for i in xrange(35)])
assert not model.apply([c1() for i in xrange(35)])
def test_codebook_learner():
learner = select_precluster([2,3,4], base=fast_classifier())
learner.rmax = 3
features, labels = gen_data(23, 1)
model = learner.train(features,labels)
assert model.apply(([c0() for i in xrange(35)],[]))
assert not model.apply(([c1() for i in xrange(35)],[]))
def test_codebook_learner_case1():
learner = select_precluster([2], base=fast_classifier())
learner.rmax = 1
features, labels = gen_data(23, 1)
model = learner.train(features,labels)
assert model.apply(([c0() for i in xrange(35)],[]))
assert not model.apply(([c1() for i in xrange(35)],[]))
<file_sep>/navigation/WavefrontPlanner/WavefrontPlanner.py
############################################################################
# WAVEFRONT ALGORITHM
# Adapted to Python Code By <NAME>
# Fri Jan 29 13:56:53 PST 2010
# from C code from <NAME>
# (www.societyofrobots.com)
############################################################################
try:
import numpy
except:
print "The numpy math library is not installed."
import time
############################################################################
class waveFrontPlanner:
def __init__(self, map, slow=False):
self.__slow = slow
self.__map = map
if str(type(map)).find("numpy") != -1:
#If its a numpy array
self.__height, self.__width = self.__map.shape
else:
self.__height, self.__width = len(self.__map), len(self.__map[0])
self.__nothing = 0
self.__wall = 999
self.__goal = 1
self.__path = "PATH"
self.__finalPath = []
#Robot value
self.__robot = 254
#Robot default Location
self.__robot_x = 0
self.__robot_y = 0
#default goal location
self.__goal_x = 3
self.__goal_y = 6
#temp variables
self.__temp_A = 0
self.__temp_B = 0
self.__counter = 0
self.__steps = 0 #determine how processor intensive the algorithm was
#when searching for a node with a lower value
self.__minimum_node = 250
self.__min_node_location = 250
self.__new_state = 1
self.__old_state = 1
self.__reset_min = 250 #above this number is a special (wall or robot)
###########################################################################
def setRobotPosition(self, x, y):
"""
Sets the robot's current position
"""
self.__robot_x = x
self.__robot_y = y
###########################################################################
def setGoalPosition(self, x, y):
"""
Sets the goal position.
"""
self.__goal_x = x
self.__goal_y = y
###########################################################################
def robotPosition(self):
return (self.__robot_x, self.__robot_y)
###########################################################################
def goalPosition(self):
return (self.__goal_x, self.__goal_y)
###########################################################################
def run(self, prnt=False):
"""
The entry point for the robot algorithm to use wavefront propagation.
"""
path = []
while self.__map[self.__robot_x][self.__robot_y] != self.__goal:
if self.__steps > 20000:
print "Cannot find a path."
return
#find new location to go to
self.__new_state = self.propagateWavefront()
#update location of robot
if self.__new_state == 1:
self.__robot_x -= 1
if prnt:
print "Move to x=%d y=%d\n\n" % \
(self.__robot_x, self.__robot_y)
path.append((self.__robot_x, self.__robot_y))
if self.__new_state == 2:
self.__robot_y += 1
if prnt:
print "Move to x=%d y=%d\n\n" % \
(self.__robot_x, self.__robot_y)
path.append((self.__robot_x, self.__robot_y))
if self.__new_state == 3:
self.__robot_x += 1
if prnt:
print "Move to x=%d y=%d\n\n" % \
(self.__robot_x, self.__robot_y)
path.append((self.__robot_x, self.__robot_y))
if self.__new_state == 4:
self.__robot_y -= 1
if prnt:
print "Move to x=%d y=%d\n\n" % \
(self.__robot_x, self.__robot_y)
path.append((self.__robot_x, self.__robot_y))
self.__old_state = self.__new_state
msg = "Found the goal in %i steps:\n" % self.__steps
msg += "Map size= %i %i\n\n" % (self.__height, self.__width)
if prnt:
print msg
self.printMap()
return path
###########################################################################
def propagateWavefront(self, prnt=False):
"""
"""
self.unpropagate()
#Old robot location was deleted, store new robot location in map
self.__map[self.__robot_x][self.__robot_y] = self.__robot
self.__path = self.__robot
#start location to begin scan at goal location
self.__map[self.__goal_x][self.__goal_y] = self.__goal
counter = 0
while counter < 200: #allows for recycling until robot is found
x = 0
y = 0
time.sleep(0.00001)
#while the map hasnt been fully scanned
while x < self.__height and y < self.__width:
#if this location is a wall or the goal, just ignore it
if self.__map[x][y] != self.__wall and \
self.__map[x][y] != self.__goal:
#a full trail to the robot has been located, finished!
minLoc = self.minSurroundingNodeValue(x, y)
if minLoc < self.__reset_min and \
self.__map[x][y] == self.__robot:
if prnt:
print "Finished Wavefront:\n"
self.printMap()
# Tell the robot to move after this return.
return self.__min_node_location
#record a value in to this node
elif self.__minimum_node != self.__reset_min:
#if this isnt here, 'nothing' will go in the location
self.__map[x][y] = self.__minimum_node + 1
#go to next node and/or row
y += 1
if y == self.__width and x != self.__height:
x += 1
y = 0
#print self.__robot_x, self.__robot_y
if prnt:
print "Sweep #: %i\n" % (counter + 1)
self.printMap()
self.__steps += 1
counter += 1
return 0
###########################################################################
def unpropagate(self):
"""
clears old path to determine new path
stay within boundary
"""
for x in range(0, self.__height):
for y in range(0, self.__width):
if self.__map[x][y] != self.__wall and \
self.__map[x][y] != self.__goal and \
self.__map[x][y] != self.__path:
#if this location is a wall or goal, just ignore it
self.__map[x][y] = self.__nothing #clear that space
###########################################################################
def minSurroundingNodeValue(self, x, y):
"""
this method looks at a node and returns the lowest value around that
node.
"""
#reset minimum
self.__minimum_node = self.__reset_min
#down
if x < self.__height -1:
if self.__map[x + 1][y] < self.__minimum_node and \
self.__map[x + 1][y] != self.__nothing:
#find the lowest number node, and exclude empty nodes (0's)
self.__minimum_node = self.__map[x + 1][y]
self.__min_node_location = 3
#up
if x > 0:
if self.__map[x-1][y] < self.__minimum_node and \
self.__map[x-1][y] != self.__nothing:
self.__minimum_node = self.__map[x-1][y]
self.__min_node_location = 1
#right
if y < self.__width -1:
if self.__map[x][y + 1] < self.__minimum_node and \
self.__map[x][y + 1] != self.__nothing:
self.__minimum_node = self.__map[x][y + 1]
self.__min_node_location = 2
#left
if y > 0:
if self.__map[x][y - 1] < self.__minimum_node and \
self.__map[x][y - 1] != self.__nothing:
self.__minimum_node = self.__map[x][y-1]
self.__min_node_location = 4
return self.__minimum_node
###########################################################################
def printMap(self):
"""
Prints out the map of this instance of the class.
"""
msg = ''
for temp_B in range(0, self.__height):
for temp_A in range(0, self.__width):
if self.__map[temp_B][temp_A] == self.__wall:
msg += "%04s" % "[#]"
elif self.__map[temp_B][temp_A] == self.__robot:
msg += "%04s" % "-"
elif self.__map[temp_B][temp_A] == self.__goal:
msg += "%04s" % "G"
else:
msg += "%04s" % str(self.__map[temp_B][temp_A])
msg += "\n\n"
msg += "\n\n"
print msg
#
if self.__slow == True:
time.sleep(0.05)
###############################################################################
if __name__ == "__main__":
"""
X is vertical, Y is horizontal
"""
floormap = [[000,000,000,000,000,000,000,000,000,000,000,000], \
[000,000,999,999,999,999,999,999,000,000,000,000], \
[000,000,999,000,000,000,000,999,999,999,999,999], \
[000,000,999,000,999,999,000,000,000,000,999,000], \
[000,999,999,000,000,999,000,000,000,000,999,000], \
[000,999,000,000,999,999,999,999,999,000,999,000], \
[000,999,000,000,999,000,000,000,999,000,999,000], \
[000,999,000,999,999,999,000,000,999,000,999,999], \
[000,999,000,000,000,000,000,000,999,000,000,000], \
[000,999,999,999,999,999,999,000,999,000,999,999], \
[000,000,000,999,000,000,000,000,999,000,999,000], \
[000,999,999,999,999,999,999,999,999,000,000,000], \
[000,000,000,000,000,000,000,000,999,000,000,000], \
[000,999,000,000,000,000,000,000,999,000,000,000], \
[000,999,000,999,000,999,999,000,999,999,999,000], \
[000,999,000,999,000,999,000,999,000,000,000,000], \
[000,999,000,999,000,000,000,000,000,999,000,000], \
[000,999,999,000,000,999,000,000,000,000,999,000], \
[000,000,000,999,000,000,000,999,000,999,000,000]]
start = time.time()
planner = waveFrontPlanner(floormap, False)
#time.sleep(2)
planner.run(True)
end = time.time()
print "Took %f seconds to run wavefront simulation" % (end - start)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/measures/nfoldcrossvalidation.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# LICENSE: MIT
from __future__ import division
from ..supervised.classifier import normaliselabels
import numpy as np
__all__ = ['foldgenerator', 'getfold', 'nfoldcrossvalidation']
def foldgenerator(labels, nfolds=None, origins=None, folds=None, multi_label=False):
'''
for train,test in foldgenerator(labels, nfolds=None, origins=None)
...
This generator breaks up the data into `n` folds (default 10).
If `origins` is given, then all elements that share the same origin will
either be in testing or in training (never in both). This is useful when
you have several replicates that shouldn't be mixed together between
training&testing but that can be otherwise be treated as independent for
learning.
Parameters
----------
labels : a sequence
the labels
nfolds : integer
nr of folds (default 10 or minimum label size)
origins : sequence, optional
if present, must be an array of indices of the same size as labels.
folds : sequence of int, optional
which folds to generate
Returns
-------
iterator over `train, test`, two boolean arrays
'''
labels,names = normaliselabels(labels,multi_label=multi_label)
if origins is None:
origins = np.arange(len(labels))
else:
if len(origins) != len(labels):
raise ValueError(
'milk.nfoldcrossvalidation.foldgenerator: origins must be of same size as labels')
origins = np.asanyarray(origins)
fmin = len(labels)
for ell in xrange(len(names)):
if multi_label:
matching = (orig for i,orig in enumerate(origins) if labels[i,ell])
else:
matching = origins[labels == ell]
curmin = len(set(matching))
fmin = min(fmin, curmin)
if fmin == 1:
raise ValueError('''
milk.nfoldcrossvalidation.foldgenerator: nfolds was reduced to 1 because minimum class size was 1.
If you passed in an origins parameter, it might be caused by having a class come from a single origin.
''')
fold = np.zeros(len(labels))
fold -= 1
if nfolds is None:
nfolds = min(fmin, 10)
elif nfolds > fmin:
from warnings import warn
warn('milk.measures.nfoldcrossvalidation: Reducing the nr. of folds from %s to %s (minimum class size).' % (nfolds, fmin))
nfolds = fmin
if multi_label:
foldweight = np.zeros( (nfolds, len(names)), int)
for orig in np.unique(origins):
(locations,) = np.where(orig == origins)
weight = len(locations)
ell = labels[locations[0]]
f = np.argmin(foldweight[:,ell].sum(1))
fold[locations] = f
foldweight[f,ell] += weight
else:
for lab in set(labels):
locations = (labels == lab)
usedorigins = np.unique(origins[locations])
weights = np.array([np.sum(origins == orig) for orig in usedorigins])
foldweight = np.zeros(nfolds, int)
for w,orig in sorted(zip(weights, usedorigins)):
f = np.argmin(foldweight)
if np.any(fold[origins == orig] > -1):
raise ValueError(
'milk.nfoldcrossvalidation.foldgenerator: something is wrong. Maybe origin %s is present in two labels.' % orig)
fold[origins == orig] = f
foldweight[f] += w
for f in xrange(nfolds):
if folds is not None and f not in folds: continue
yield (fold != f), (fold == f)
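# Example (an illustrative sketch, not part of the original module): keep
# replicates that share an origin on the same side of the split.
#   labels  = [0, 0, 0, 1, 1, 1]
#   origins = [0, 0, 1, 2, 3, 3]
#   for train, test in foldgenerator(labels, nfolds=2, origins=origins):
#       pass  # train/test are boolean masks; no origin appears in both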
def getfold(labels, fold, nfolds=None, origins=None):
'''
trainingset,testingset = getfold(labels, fold, nfolds=None, origins=None)
Get the training and testing set for fold `fold` in `nfolds`
Arguments are the same as for `foldgenerator`
Parameters
----------
labels : ndarray of labels
fold : integer
nfolds : integer
number of folds (default 10 or size of smallest class)
origins : sequence, optional
if given, then objects with same origin are *not* scattered across folds
'''
if nfolds < fold:
raise ValueError('milk.getfold: Attempted to get fold %s out of %s' % (fold, nfolds))
for i,(t,s) in enumerate(foldgenerator(labels, nfolds, origins)):
if i == fold:
return t,s
raise ValueError('milk.getfold: Attempted to get fold %s but the number of actual folds was too small (%s)' % (fold,i))
def nfoldcrossvalidation(features, labels, nfolds=None, learner=None, origins=None, return_predictions=False, folds=None, initial_measure=0, classifier=None,):
'''
Perform n-fold cross validation
cmatrix,names = nfoldcrossvalidation(features, labels, nfolds=10, learner={defaultclassifier()}, origins=None, return_predictions=False)
cmatrix,names,predictions = nfoldcrossvalidation(features, labels, nfolds=10, learner={defaultclassifier()}, origins=None, return_predictions=True)
cmatrix will be a N x N matrix, where N is the number of classes
cmatrix[i,j] will be the number of times that an element of class i was
classified as class j
names[i] will correspond to the label name of class i
Parameters
----------
features : a sequence
labels : an array of labels, where label[i] is the label corresponding to features[i]
nfolds : integer, optional
Nr of folds. Default: 10
learner : learner object, optional
learner should implement the train() method to return a model
(something with an apply() method). defaultclassifier() by default
This parameter used to be called `classifier` and that name is still supported
origins : sequence, optional
Origin ID (see foldgenerator)
return_predictions : bool, optional
whether to return predictions (default: False)
folds : sequence of int, optional
which folds to generate
initial_measure : any, optional
what initial value to use for the results reduction (default: 0)
Returns
-------
cmatrix : ndarray
confusion matrix
names : sequence
sequence of labels so that cmatrix[i,j] corresponds to names[i], names[j]
predictions : sequence
predicted output for each element
'''
import operator
from .measures import confusion_matrix
if len(features) != len(labels):
raise ValueError('milk.measures.nfoldcrossvalidation: len(features) should match len(labels)')
if classifier is not None:
if learner is not None:
raise ValueError('milk.nfoldcrossvalidation: Using both `learner` and `classifier` arguments. They are the same, but `learner` is preferred')
learner = classifier
if learner is None:
from ..supervised.defaultclassifier import defaultclassifier
learner = defaultclassifier()
labels,names = normaliselabels(labels)
if return_predictions:
predictions = np.empty_like(labels)
predictions.fill(-1) # This makes it clearer if there are bugs in the programme
try:
features = np.asanyarray(features)
except:
features = np.asanyarray(features, dtype=object)
if origins is not None:
origins = np.asanyarray(origins)
nclasses = labels.max() + 1
results = []
measure = confusion_matrix
train_kwargs = {}
for trainingset,testingset in foldgenerator(labels, nfolds, origins=origins, folds=folds):
if origins is not None:
train_kwargs = { 'corigins' : origins[trainingset] }
model = learner.train(features[trainingset], labels[trainingset], **train_kwargs)
cur_preds = np.array([model.apply(f) for f in features[testingset]])
if return_predictions:
predictions[testingset] = cur_preds
results.append(measure(labels[testingset], cur_preds))
result = reduce(operator.add, results, initial_measure)
if return_predictions:
return result, names, predictions
return result, names
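# Example usage (a minimal sketch; `features` and `labels` stand for any
# dataset of the shape described in the docstring above):
#   cmatrix, names = nfoldcrossvalidation(features, labels, nfolds=5)
#   overall_accuracy = cmatrix.trace() / float(cmatrix.sum())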
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_normaliselabels.py
from milk.supervised.normalise import normaliselabels
import numpy as np
def test_normaliselabels():
np.random.seed(22)
labels = np.zeros(120, np.uint8)
labels[40:] += 1
labels[65:] += 1
reorder = np.argsort(np.random.rand(len(labels)))
labels = labels[reorder]
labels2,names = normaliselabels(labels)
for new_n,old_n in enumerate(names):
assert np.all( (labels == old_n) == (labels2 == new_n) )
def test_normaliselabels_multi():
np.random.seed(30)
r = np.random.random
for v in xrange(10):
labels = []
p = np.array([.24,.5,.1,.44])
for i in xrange(100):
cur = [j for j in xrange(4) if r() < p[j]]
if not cur: cur = [0]
labels.append(cur)
nlabels, names = normaliselabels(labels, True)
assert len(labels) == len(nlabels)
assert len(nlabels[0]) == max(map(max,labels))+1
assert nlabels.sum() == sum(map(len,labels))
<file_sep>/gui/tkinter/mobot_main.py
#!/usr/local/bin/python
from PIL import Image, ImageTk
import time
from datetime import datetime
import socket
import cv2
from threading import *
import sys
from maxsonar_class import *
import random
hhh = 0
file_lock = False
def snap_shot(filename):
#capture from camera at location 0
now = time.time()
global webcam1
try:
#have to capture a few frames as it buffers a few frames..
for i in range (5):
ret, img = webcam1.read()
#print "time to capture 5 frames:", (time.time()) - now
cv2.imwrite(filename, img)
img1 = Image.open(filename)
img1.thumbnail((320,240))
img1.save(filename)
#print (time.time()) - now
except:
print "could not grab webcam"
return
def send_file(host, cport, mport, filetosend):
#global file_lock
file_lock = True
#print "file_lock", file_lock
try:
cs = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cs.connect((host, cport))
cs.send("SEND " + filetosend)
print "sending file", filetosend
cs.close()
except:
print "cs failed"
pass
time.sleep(0.1)
try:
ms = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
ms.connect((host, mport))
f = open(filetosend, "rb")
data = f.read()
f.close()
ms.send(data)
ms.close()
except:
print "ms failed"
pass
#file_lock = False
#print "file_lock", file_lock
'''
def send_data(host="u1204vm.local", cport=9091, mport=9090, datatosend=""):
try:
cs = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cs.connect((host, cport))
cs.send("SEND " + filetosend)
cs.close()
except:
pass
time.sleep(0.05)
try:
ms = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
ms.connect((host, mport))
f = open(filetosend, "rb")
data = f.read()
f.close()
ms.send(data)
ms.close()
except:
pass
'''
class send_video(Thread):
def __init__(self, filetosend):
self.filetosend = filetosend
Thread.__init__(self)
def run(self):
#global file_lock, hhh
print self.filetosend
while True:
snap_shot(self.filetosend)
send_file(host="u1204vm.local", cport=9090, mport=9091,filetosend=self.filetosend)
time.sleep(.01)
class send_sonar_data(Thread):
def __init__(self, filetosend):
self.filetosend = filetosend
self.sonar_data = ""
self.max_dist = -1
self.min_dist = -1
self.min_sensor = -1
self.max_sensor = -1
Thread.__init__(self)
def run(self):
#global file_lock, hhh
sonar = MaxSonar()
while True:
self.sonar_data = ""
self.max_dist = -1
self.min_dist = -1
self.min_sensor = -1
self.max_sensor = -1
#
#below 2 lines are for test purposes when actual US arent sending data
#for i in range(1,6):
# sonar_data = sonar_data + "s"+str(i)+":"+ str(random.randint(28, 91))
data = str(sonar.distances_cm())
self.sonar_data = []
sonar_data_str1 = ""
if len(data) > 1:
self.sonar_data.append(int(data[(data.find('s1:')+3):(data.find('s2:'))]))
self.sonar_data.append(int(data[(data.find('s2:')+3):(data.find('s3:'))]))
self.sonar_data.append(int(data[(data.find('s3:')+3):(data.find('s4:'))]))
self.sonar_data.append(int(data[(data.find('s4:')+3):(data.find('s5:'))]))
self.sonar_data.append(int(data[(data.find('s5:')+3):(len(data)-1)]))
self.max_dist = max(self.sonar_data)
self.min_dist = min(self.sonar_data)
self.min_sensor = self.sonar_data.index(self.min_dist)
self.max_sensor = self.sonar_data.index(self.max_dist)
#sonar_data_str1 = "".join(str(x) for x in self.sonar_data)
#print sonar_data_str1
#print data
f = open("sonar_data.txt", "w")
f.write(data)
f.close()
send_file(host="u1204vm.local", cport=9092, mport=9093,filetosend=self.filetosend)
time.sleep(.01)
print "out of while in sonar"
def move_mobot(themove):
# Placeholder: motor control is not implemented in this file yet; the main
# loop below only prints the suggested move.
pass
if __name__== "__main__":
testmode = False
if len(sys.argv) > 1:
if sys.argv[1] == 'testmode':
print 'starting in testing mode'
testmode= True
webcam1 = cv2.VideoCapture(0)
video1 = send_video("snap_shot.jpg")
#video1.daemon=True
video1.start()
#video1.join()
##start sonar
if (testmode == False):
sonar = send_sonar_data("sonar_data.txt")
#sonar.daemon=True
sonar.start()
#sonar.join()
while True:
move = ""
#print "......................", sonar.sonar_data
print "sonar_data: ", sonar.sonar_data
print "max dist: ", sonar.max_dist, sonar.max_sensor
print "min_dist: ", sonar.min_dist, sonar.min_sensor
if sonar.max_sensor == 0: move = "forward"
if sonar.max_sensor == 1: move = "right"
#if sonar.max_sensor == 2: move = "reverse"
if sonar.max_sensor == 3: move = "left"
print "suggest moving: ", move
move_mobot(move)
time.sleep(1)
print "stopped"
<file_sep>/arduino/SabertoothSimplified/examples/Sweep/Sweep.ino
// Sweep Sample
// Copyright (c) 2012 Dimension Engineering LLC
// See license.txt for license details.
#include <SabertoothSimplified.h>
SabertoothSimplified ST; // We'll name the Sabertooth object ST.
// For how to configure the Sabertooth, see the DIP Switch Wizard for
// http://www.dimensionengineering.com/datasheets/SabertoothDIPWizard/start.htm
// Be sure to select Simplified Serial Mode for use with this library.
// This sample uses a baud rate of 9600.
//
// Connections to make:
// Arduino TX->1 -> Sabertooth S1
// Arduino GND -> Sabertooth 0V
// Arduino VIN -> Sabertooth 5V (OPTIONAL, if you want the Sabertooth to power the Arduino)
//
// If you want to use a pin other than TX->1, see the SoftwareSerial example.
void setup()
{
Serial.begin(9600); // This is the baud rate you chose with the DIP switches.
}
void loop()
{
int power;
// Ramp motor 1 and motor 2 from -127 to 127 (full reverse to full forward),
// waiting 20 ms (1/50th of a second) per value.
for (power = -127; power <= 127; power ++)
{
ST.motor(1, power);
ST.motor(2, power);
delay(20);
}
// Now go back the way we came.
for (power = 127; power >= -127; power --)
{
ST.motor(1, power);
ST.motor(2, power);
delay(20);
}
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_measures_clusters.py
import milk.measures.cluster_agreement
import numpy as np
def test_rand_arand_jaccard():
np.random.seed(33)
labels = np.repeat(np.arange(4),10)
clusters = np.repeat(np.arange(4),10)
a0,b0,c0= milk.measures.cluster_agreement.rand_arand_jaccard(clusters, labels)
assert a0 == 1.
assert b0 == 1.
np.random.shuffle(clusters)
a1,b1,c1= milk.measures.cluster_agreement.rand_arand_jaccard(clusters, labels)
assert a1 >= 0.
assert a1 < 1.
assert b1 < 1.
assert b1 >= 0.
assert c1 < c0
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/demos/svm-decision-boundary.py
from pylab import *
import numpy as np
from milksets.wine import load
import milk.supervised
import milk.unsupervised.pca
import milk.supervised.svm
features, labels = load()
features = features[labels < 2]
labels = labels[labels < 2]
features,_ = milk.unsupervised.pca(features)
features = features[:,:2]
learner = milk.supervised.svm.svm_raw(kernel=np.dot, C=12)
model = learner.train(features, labels)
w = np.dot(model.svs.T, model.Yw)
b = model.b
x = np.linspace(-.5, .1, 100)
y = -w[0]/w[1]*x + b/w[1]
plot(features[labels == 1][:,0], features[labels == 1][:,1], 'bx')
plot(features[labels == 0][:,0], features[labels == 0][:,1], 'ro')
plot(x,y)
savefig('svm-demo-points.pdf')
clf()
learner = milk.supervised.svm.svm_raw(kernel=milk.supervised.svm.rbf_kernel(1.), C=12)
model = learner.train(features, labels)
Y, X = (np.mgrid[:101,:101]-50)/12.5
values = [model.apply((y,x)) for y,x in zip(Y.ravel(),X.ravel())]
values = np.array(values).reshape(Y.shape)
sfeatures = features*12.5
sfeatures += 50
plot(sfeatures[labels == 0][:,0], sfeatures[labels == 0][:,1], 'bo')
plot(sfeatures[labels == 1][:,0], sfeatures[labels == 1][:,1], 'ro')
imshow(values.T)
savefig('svm-demo-boundary.pdf')
<file_sep>/gui/easygui/img_processing_tools.py~
#!/usr/bin/env python
from PIL import Image
import numpy as np
from PIL import ImageStat
import cv  # needed by PILtoCV() / CVtoPIL() below
def image2array(img):
"""given an image, returns an array. i.e. create array of image using numpy """
return np.asarray(img)
###########################################################
def array2image(arry):
"""given an array, returns an image. i.e. create image using numpy array """
#Create image from array
return Image.fromarray(arry)
###########################################################
def PILtoCV(PIL_img):
cv_img = cv.CreateImageHeader(PIL_img.size, cv.IPL_DEPTH_8U, 1)
cv.SetData(cv_img, PIL_img.tostring())
return cv_img
###########################################################
def CVtoPIL(img):
"""converts CV image to PIL image"""
cv_img = cv.CreateMatHeader(cv.GetSize(img)[1], cv.GetSize(img)[0], cv.CV_8UC1)
#cv.SetData(cv_img, pil_img.tostring())
pil_img = Image.fromstring("L", cv.GetSize(img), img.tostring())
return pil_img
###########################################################
def CalcHistogram(img):
#calc histogram of green band
bins = np.arange(0,256)
hist1 = image2array(img)
H, xedges = np.histogram(np.ravel(hist1), bins=bins, normed=False)
return H
def WriteMeterics(image, classID):
#calculate histogram
print "Calculating Histogram for I3 pixels of image..."
Red_Band, Green_Band, Blue_Band = image.split()
Histogram = CalcHistogram(Green_Band)
#save I3 Histogram to file in certain format
f_handle = open('I3banddata.csv', 'a')
f_handle.write(str(classID))
f_handle.write(' ')
f_handle.close()
print "saving I3 histogram to dictionary..."
f_handle = open("I3banddata.csv", 'a')
for i in range(len(Histogram)):
f_handle.write(str(Histogram[i]))
f_handle.write(" ")
#f_handle.write('\n')
f_handle.close()
I3_sum = ImageStat.Stat(image).sum
I3_sum2 = ImageStat.Stat(image).sum2
I3_median = ImageStat.Stat(image).median
I3_mean = ImageStat.Stat(image).mean
I3_var = ImageStat.Stat(image).var
I3_stddev = ImageStat.Stat(image).stddev
I3_rms = ImageStat.Stat(image).rms
print "saving I3 meterics to dictionary..."
f_handle = open("I3banddata.csv", 'a')
print "sum img1_I3: ", I3_sum[1]
print "sum2 img1_I3: ", I3_sum2[1]
print "median img1_I3: ", I3_median[1]
print "avg img1_I3: ", I3_mean[1]
print "var img1_I3: ", I3_var[1]
print "stddev img1_I3: ", I3_stddev[1]
print "rms img1_I3: ", I3_rms[1]
#print "extrema img1_I3: ", ImageStat.Stat(img1_I3).extrema
#print "histogram I3: ", len(img1_I3.histogram())
f_handle.write(str(I3_sum[1]))
f_handle.write(" ")
f_handle.write(str(I3_sum2[1]))
f_handle.write(" ")
f_handle.write(str(I3_median[1]))
f_handle.write(" ")
f_handle.write(str(I3_mean[1]))
f_handle.write(" ")
f_handle.write(str(I3_var[1]))
f_handle.write(" ")
f_handle.write(str(I3_stddev[1]))
f_handle.write(" ")
f_handle.write(str(I3_rms[1]))
f_handle.write(" ")
f_handle.write('\n')
f_handle.close()
return
def rgbToI3(r, g, b):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
i3 = ((2*g)-r-b)/2
return i3
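# Worked example (illustrative): a pure-green pixel gives
#   rgbToI3(0, 255, 0) == (2*255 - 0 - 0) / 2 == 255
# while a neutral grey pixel, e.g. rgbToI3(128, 128, 128), gives 0.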
def rgb2I3 (img):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
xmax = img.size[0]
ymax = img.size[1]
#make a copy to return
returnimage = Image.new("RGB", (xmax,ymax))
imagearray = img.load()
for y in range(0, ymax, 1):
for x in range(0, xmax, 1):
rgb = imagearray[x, y]
i3 = ((2*rgb[1])-rgb[0]-rgb[2]) / 2
#print rgb, i3
returnimage.putpixel((x,y), (0,i3,0))
return returnimage
<file_sep>/camera/objecttracker.py
#!/usr/bin/python
import urllib2
import cv
import sys
if __name__ == "__main__":
laplace = None
colorlaplace = None
planes = [ None, None, None ]
capture = None
if len(sys.argv) == 1:
capture = cv.CreateCameraCapture(0)
elif len(sys.argv) == 2 and sys.argv[1].isdigit():
capture = cv.CreateCameraCapture(int(sys.argv[1]))
elif len(sys.argv) == 2:
capture = cv.CreateFileCapture(sys.argv[1])
if not capture:
print "Could not initialize capturing..."
sys.exit(-1)
cv.NamedWindow("Good Features to Track", 1)
while True:
frame = cv.QueryFrame(capture)
frame_size = cv.GetSize(frame)
#print frame_size[1], frame_size[0]
eig_image = cv.CreateMat(frame_size[1], frame_size[0], cv.CV_32FC1)
temp_image = cv.CreateMat(frame_size[1], frame_size[0], cv.CV_32FC1)
grayframe = cv.CreateImage (cv.GetSize (frame), 8, 1)
cv.CvtColor(frame, grayframe, cv.CV_RGB2GRAY)
if frame:
for (x,y) in cv.GoodFeaturesToTrack(grayframe, eig_image, temp_image, 20, 0.08, 1.0, blockSize=6,useHarris = True):
#print "good feature at", x,y
#Circle(img, center, radius, color, thickness=1, lineType=8, shift=0)
cv.Circle(frame, (x,y), 6, (255,0,0),1, cv.CV_AA , 0)
cv.ShowImage("Good Features to Track", frame)
if cv.WaitKey(10) != -1:
break
cv.DestroyWindow("Laplacian")
"""
GoodFeaturesToTrack(image, eigImage, tempImage, cornerCount, qualityLevel, minDistance, mask=NULL, blockSize=3, useHarris=0, k=0.04) corners
Determines strong corners on an image.
Parameters:
* image (CvArr) The source 8-bit or floating-point 32-bit, single-channel image
* eigImage (CvArr) Temporary floating-point 32-bit image, the same size as image
* tempImage (CvArr) Another temporary image, the same size and format as eigImage
* cornerCount (int) number of corners to detect
* qualityLevel (float) Multiplier for the max/min eigenvalue; specifies the minimal accepted quality of image corners
* minDistance (float) Limit, specifying the minimum possible distance between the returned corners; Euclidean distance is used
* mask (CvArr) Region of interest. The function selects points either in the specified region or in the whole image if the mask is NULL
* blockSize (int) Size of the averaging block, passed to the underlying CornerMinEigenVal or CornerHarris used by the function
* useHarris (int) If nonzero, Harris operator ( CornerHarris ) is used instead of default CornerMinEigenVal
* k (float) Free parameter of the Harris detector; used only if useHarris != 0
The function finds the corners with large eigenvalues in the image. It first calculates the minimal eigenvalue for every source image pixel using the CornerMinEigenVal function and stores them in eigImage. Then it performs non-maxima suppression (only the local maxima in a 3x3 neighborhood are retained). The next step rejects the corners whose minimal eigenvalue is less than qualityLevel * max(eigImage(x, y)). Finally, the function ensures that the distance between any two corners is not smaller than minDistance. The weaker corners (with a smaller min eigenvalue) that are too close to stronger corners are rejected.
Note that if the function is called with different values A and B of the parameter qualityLevel, and A > B, the array of corners returned with qualityLevel=A will be a prefix of the corners array returned with qualityLevel=B.
"""
<file_sep>/camera/pyWebCamCap.py
import pygame
import Image
import ImageDraw, time
from pygame.locals import *
import sys
from PIL import ImageEnhance
from opencv import cv
from opencv import highgui
import opencv
#this is important for capturing/displaying images
from opencv import highgui
camera = highgui.cvCreateCameraCapture(0)
def get_image():
im = highgui.cvQueryFrame(camera)
# Add the line below if you need it (Ubuntu 8.04+)
im = opencv.cvGetMat(im)
#convert Ipl image to PIL image
return opencv.adaptors.Ipl2PIL(im)
res = (640,480)
pygame.init()
fps = 30.0
window = pygame.display.set_mode((res[0],res[1]))
#screen = pygame.display.set_mode((res[0],res[1]))
pygame.display.set_caption("WebCam Demo")
pygame.font.init()
font = pygame.font.SysFont("Courier",11)
screen = pygame.display.get_surface()
def disp(phrase,loc):
s = font.render(phrase, True, (200,200,200))
sh = font.render(phrase, True, (50,50,50))
screen.blit(sh, (loc[0]+1,loc[1]+1))
screen.blit(s, loc)
brightness = 1.0
contrast = 1.0
shots = 0
while True:
#camshot = ImageEnhance.Brightness(camera.getImage()).enhance(brightness)
#camshot = ImageEnhance.Contrast(camshot).enhance(contrast)
camshot = get_image()
for event in pygame.event.get():
#if event.type == pygame.QUIT or event.type == KEYDOWN:
if event.type == pygame.QUIT: sys.exit()
keyinput = pygame.key.get_pressed()
if keyinput[K_1]: brightness -= .1
if keyinput[K_2]: brightness += .1
if keyinput[K_3]: contrast -= .1
if keyinput[K_4]: contrast += .1
if keyinput[K_q]: camera.displayCapturePinProperties()
if keyinput[K_w]: camera.displayCaptureFilterProperties()
if keyinput[K_s]:
filename = str(time.time()) + ".jpg"
#camera.saveSnapshot(filename, quality=80, timestamp=0)
cv.SaveImage("testcap.jpg", camshot)
shots += 1
camshot = pygame.image.frombuffer(camshot.tostring(), res, "RGB")
screen.blit(camshot, (0,0))
disp("S:" + str(shots), (10,4))
disp("B:" + str(brightness), (10,16))
disp("C:" + str(contrast), (10,28))
pygame.display.flip()
#im = get_image()
#pg_img = pygame.image.frombuffer(im.tostring(), im.size, im.mode)
#screen.blit(pg_img, (0,0))
#pygame.display.flip()
pygame.time.delay(int(1000 * 1.0/fps))
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/measures/measures.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2010, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT
from __future__ import division
import numpy as np
__all__ = [
'accuracy',
'confusion_matrix',
'waccuracy',
'zero_one_loss',
]
def accuracy(real, other=None, normalisedlabels=False, names=None):
'''
acc = accuracy(real, predicted, normalisedlabels=False, names=None)
Compute accuracy (fraction of correct predictions).
Parameters
----------
real : sequence
The real labels
predicted : sequence
The predicted sequence (must be same type as `real`)
normalisedlabels : boolean, optional
Whether labels have been normalised
names : sequence
The underlying names (unused)
Returns
-------
acc : float
'''
if other is None:
import warnings
warnings.warn('milk.measures.accuracy: calling this with one argument is a deprecated interface.', DeprecationWarning)
cmatrix = np.asanyarray(real)
return cmatrix.trace()/cmatrix.sum()
else:
return np.mean(np.asanyarray(real) == other)
def zero_one_loss(real, predicted, normalisedlabels=False, names=None):
'''
loss = zero_one_loss(real, predicted, normalisedlabels={unused}, names={unused})
Parameters
----------
real : sequence
the underlying labels
predicted : sequence
the predicted labels
normalisedlabels : unused
names: unused
Returns
-------
loss : integer
the number of instances where `real` differs from `predicted`
'''
return np.sum(np.asanyarray(real) != np.asanyarray(predicted))
def waccuracy(real, predicted=None, normalisedlabels=False, names=None):
'''
wacc = waccuracy(real, predicted, normalisedlabels={unused}, names={unused})
Weighted accuracy: average of accuracy for each (real) class. Can be very
different from accuracy if the classes are unbalanced (in particular, if
they are very unbalanced, you can get a high accuracy with a bad
classifier).
Parameters
----------
real : sequence
the underlying labels
predicted : sequence
the predicted labels
normalisedlabels : unused
names: unused
Returns
-------
wacc : float
the weighted accuracy
'''
if predicted is None:
import warnings
warnings.warn('milk.measures.accuracy: calling this with one argument is a deprecated interface.', DeprecationWarning)
cmatrix = np.asanyarray(real)
else:
cmatrix = confusion_matrix(real, predicted, normalisedlabels, names)
return (cmatrix.diagonal() / cmatrix.sum(1)).mean()
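# Worked example (illustrative): with real = [0, 0, 0, 1] and
# predicted = [0, 0, 0, 0], accuracy is 3/4 = 0.75 but waccuracy is
# (3/3 + 0/1)/2 = 0.5, exposing a classifier that always predicts the
# majority class.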
def confusion_matrix(real, predicted, normalisedlabels=False, names=None):
'''
cmatrix = confusion_matrix(real, predicted, normalisedlabels=False, names=None)
Computes the confusion matrix
Parameters
----------
real : sequence
The real labels
predicted : sequence
The predicted sequence (must be same type as `real`)
normalisedlabels : boolean, optional
Whether labels have been normalised
names : sequence
The underlying names (unused)
Returns
-------
cmatrix : 2 ndarray
'''
if not normalisedlabels:
from ..supervised.normalise import normaliselabels
real, names = normaliselabels(real)
predicted = map(names.index, predicted)
n = np.max(real)+1
cmat = np.zeros((n,n), int)
for r,p in zip(real, predicted):
cmat[r,p] += 1
return cmat
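# Example (illustrative; assumes the labels normalise in the order 'a', 'b'):
#   confusion_matrix(['a', 'a', 'b'], ['a', 'b', 'b'])
#   -> array([[1, 1],
#             [0, 1]])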
def bayesian_significance(n, c0, c1):
'''
sig = bayesian_significance(n, c0, c1)
Computes the Bayesian significance of the difference between a classifier
that gets ``c0`` correct versus one that gets ``c1`` right on ``n``
examples.
Parameters
----------
n : int
Total number of examples
c0 : int
Examples that first classifier got correct
c1 : int
Examples that second classifier got correct
Returns
-------
sig : float
Significance value
'''
def _logp(r, n, c):
return c*np.log(r)+(n-c)*np.log(1-r)
r = np.linspace(.0001,.9999,100)
lp0 = _logp(r,n,c0)
lp1 = _logp(r,n,c1)
mat = lp0 + lp1[:,np.newaxis]
mat -= mat.max()
mat = np.exp(mat)
mat /= mat.sum()
sig = np.triu(mat).sum()-mat.trace()/2.
return min(sig, 1. - sig)
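# Illustrative reading (not from the original source): the returned value is
# roughly the posterior probability that the apparently-worse classifier is in
# fact the better one, so bayesian_significance(100, 85, 70) comes out small
# (a clear win) while bayesian_significance(100, 71, 70) is close to 0.5.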
## TODO: Implement http://en.wikipedia.org/wiki/Matthews_Correlation_Coefficient
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_ecoc_learner.py
from milk.supervised.multi import ecoc_learner
from milk.supervised.classifier import ctransforms
from milk.supervised import svm
import milk.tests.fast_classifier
import milk.supervised.multi
from milksets.yeast import load
import numpy as np
def test_ecoc_learner():
base = milk.tests.fast_classifier.fast_classifier()
learner = milk.supervised.multi.ecoc_learner(base)
features, labels = load()
nlabels = len(set(labels))
model = learner.train(features[::2],labels[::2])
testl = np.array(model.apply_many(features[1::2]))
assert np.mean(testl == labels[1::2]) > 1./nlabels
assert testl.min() >= 0
assert testl.max() < nlabels
# This failed at one point:
learner = ecoc_learner(svm.svm_to_binary(svm.svm_raw(kernel=svm.dot_kernel(), C=1.)))
model = learner.train(features[:200], labels[:200])
assert (model is not None)
def test_ecoc_probability():
features,labels = load()
features = features[labels < 5]
labels = labels[labels < 5]
raw = svm.svm_raw(kernel=svm.dot_kernel(), C=1.)
base = ctransforms(raw, svm.svm_sigmoidal_correction())
learner = ecoc_learner(base, probability=True)
model = learner.train(features[::2], labels[::2])
results = map(model.apply, features[1::2])
results = np.array(results)
assert results.shape[1] == len(set(labels))
assert np.mean(results.argmax(1) == labels[1::2]) > .5
<file_sep>/gui/Frame1.py
#Boa:Frame:Frame1
import wx
import socket
import time
#Variables
HOST = '127.0.0.1' # Symbolic name meaning the local host
HBPORT = 12340 # HeartBeat Port Arbitrary non-privileged port
def create(parent):
return Frame1(parent)
[wxID_FRAME1, wxID_FRAME1BUTTON1, wxID_FRAME1PANEL1, wxID_FRAME1STTXTHRTBEAT,
wxID_FRAME1TEXTCTRL1, wxID_FRAME1TXTSTATUS,
] = [wx.NewId() for _init_ctrls in range(6)]
class Frame1(wx.Frame):
def _init_ctrls(self, prnt):
# generated method, don't edit
wx.Frame.__init__(self, id=wxID_FRAME1, name='', parent=prnt,
pos=wx.Point(649, 219), size=wx.Size(946, 673),
style=wx.DEFAULT_FRAME_STYLE, title=u'RoboMower DashBoard')
self.SetClientSize(wx.Size(930, 635))
self.panel1 = wx.Panel(id=wxID_FRAME1PANEL1, name='panel1', parent=self,
pos=wx.Point(0, 0), size=wx.Size(930, 635),
style=wx.TAB_TRAVERSAL)
self.button1 = wx.Button(id=wxID_FRAME1BUTTON1,
label=u'Connect To Robot', name='button1', parent=self.panel1,
pos=wx.Point(8, 8), size=wx.Size(104, 32), style=0)
self.button1.SetFont(wx.Font(8, wx.ROMAN, wx.NORMAL, wx.NORMAL, False,
u'Serif'))
self.button1.Bind(wx.EVT_BUTTON, self.OnButton1Button,
id=wxID_FRAME1BUTTON1)
self.textCtrl1 = wx.TextCtrl(id=wxID_FRAME1TEXTCTRL1, name='textCtrl1',
parent=self.panel1, pos=wx.Point(288, 40), size=wx.Size(88, 24),
style=0, value='textCtrl1')
self.stTxtHrtBeat = wx.StaticText(id=wxID_FRAME1STTXTHRTBEAT,
label=u'HeartBeat', name=u'stTxtHrtBeat', parent=self.panel1,
pos=wx.Point(282, 16), size=wx.Size(108, 16), style=0)
self.stTxtHrtBeat.SetFont(wx.Font(12, wx.SWISS, wx.NORMAL, wx.NORMAL,
False, u'Terminal'))
self.txtStatus = wx.TextCtrl(id=wxID_FRAME1TXTSTATUS, name=u'txtStatus',
parent=self.panel1, pos=wx.Point(120, 8), size=wx.Size(144, 600),
style=wx.TE_MULTILINE, value=u'')
def __init__(self, parent):
self._init_ctrls(parent)
def Initialize_HeartBeat(self):
HB_TCP = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
global HOST
global HBPORT
HB_TCP.bind((HOST, HBPORT))
print "listening for robot on port: ", HBPORT
hhh = self.txtStatus.GetValue()
hhh = hhh + "listening for robot on port: " + HBPORT + "\n"
self.txtStatus.SetValue(hhh)
HB_TCP.listen(1)
try:
print "waiting to accept.."
conn, addr = HB_TCP.accept()
print "accepted connection from client.."
while conn <> "":
HB_TCP.listen(1)
#print time.time()
#print s.gettimeout()
print 'Connected by', addr
data = conn.recv(1024)
if not data: break
print 'Received from remote: ', data
conn.send("ACK")
except IOError as detail:
print "connection lost", detail
def OnButton1Button(self, event):
ggg = self.stTxtHrtBeat.GetLabel()
#self.textCtrl1.SetValue("New Text")
self.textCtrl1.SetValue(ggg)
hhh = self.txtStatus.GetValue()
hhh = hhh + "data.." + "\n"
self.txtStatus.SetValue(hhh)
self.Initialize_HeartBeat()
if __name__ == '__main__':
app = wx.PySimpleApp()
frame = create(None)
frame.Show()
app.MainLoop()
<file_sep>/lib/FileReceiver.py
# USAGE: python FileReciever.py
import socket, time, string, sys, urlparse
from threading import *
#------------------------------------------------------------------------
class FileReceiver ( Thread ):
def __init__( self, command_port, data_port ):
Thread.__init__( self )
self.cmd_port = command_port
self.data_port = data_port
def run(self):
self.process()
def bindmsock( self ):
self.msock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.msock.bind(('', self.data_port))
self.msock.listen(2)
#self.msock.settimeout(2)
print '[Media] Listening on port ', self.data_port
def acceptmsock( self ):
self.mconn, self.maddr = self.msock.accept()
print '[Media] Got connection from', self.maddr
def bindcsock( self ):
self.csock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.csock.bind(('', self.cmd_port))
self.csock.listen(2)
#self.csock.settimeout(2)
print '[Control] Listening on port ', self.cmd_port
def acceptcsock( self ):
self.cconn, self.maddr = self.csock.accept()
print '[Control] Got connection from', self.maddr
while 1:
data = self.cconn.recv(1024)
if not data: break
if data[0:4] == "SEND": self.filename = data[5:]
print '[Control] Getting ready to receive "%s"' % self.filename
break
def transfer( self ):
print '[Media] Starting media transfer for "%s"' % self.filename
f = open(self.filename,"wb")
while 1:
data = self.mconn.recv(1024)
if not data: break
f.write(data)
f.close()
print '[Media] Got "%s"' % self.filename
print '[Media] Closing media transfer for "%s"' % self.filename
def close( self ):
self.cconn.close()
self.csock.close()
self.mconn.close()
self.msock.close()
def process( self ):
while 1:
try:
self.bindcsock()
#time.sleep(1)
self.acceptcsock()
#time.sleep(1)
self.bindmsock()
#time.sleep(1)
self.acceptmsock()
#time.sleep(1)
self.transfer()
#time.sleep(1)
self.close()
time.sleep(.1)
except:
print "file xfer failed"
self.close()
time.sleep(.2)
pass
#------------------------------------------------------------------------
#s = FileReceiver(9090, 9091)
#s.start()
<file_sep>/android/tcp_remote_sensing_client.py~
import android,socket,time
droid=android.Android();
ipServer= '192.168.1.180'
portServer= 8095
def sendData():
# BEGIN
droid.dialogCreateAlert('Sensor Recorder','Are you ready to record a session?')
droid.dialogSetPositiveButtonText('Record Now')
droid.dialogShow()
response = droid.dialogGetResponse().result
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((ipServer, portServer))
droid.dialogCreateAlert('Record Session','Do you want to record a session?')
droid.dialogSetPositiveButtonText('Stop')
droid.dialogSetNegativeButtonText('Restart')
droid.dialogShow()
droid.eventClearBuffer()
droid.startSensingTimed(1,50)
droid.startLocating()
#time.sleep(25)
loc = droid.readLocation()
#droid.stopLocating()
#now = str(datetime.datetime.now())
while True:
lat = str(loc[1]['gps']['latitude'])
lon = str(loc[1]['gps']['longitude'])
event=droid.eventWait().result
data=event["data"]
name=event["name"]
if name == 'sensors':
s.send('\n %.3f' % data["time"])
print data
for val in ['light', 'accuracy', 'xforce','yforce','zforce', 'xMag', 'yMag', 'zMag', 'pitch','roll','azimuth']:
if val in data :
s.send(', %6.6f' % data[val])
s.send(',' + lat + "," + lon)
elif name == 'dialog':
break
droid.stopSensing()
s.shutdown(socket.SHUT_RDWR);
s.close();
return data["which"]=='negative'
# END
# Prepare the recording session
oldTimeOut=droid.setScreenTimeout(30*60).result;
oldWifi=droid.checkWifiState().result;
if not(oldWifi) :
droid.toggleWifiState(1)
time.sleep(15) #wait for connection to be established
while sendData() :
pass
#Restore Session
droid.setScreenTimeout(oldTimeOut);
droid.toggleWifiState(oldWifi);
<file_sep>/ftdi/build/pyftdi/setup.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright (c) 2010-2012 <NAME> <<EMAIL>>
# Copyright (c) 2010-2012 Neotion
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
from distutils.core import setup
VERSION='0.5.2'
def _read(fname):
import os
return open(os.path.join(os.path.dirname(__file__),
'pyftdi', fname)).read()
setup(
name='pyftdi',
version=VERSION,
description='FTDI device driver',
author='<NAME>',
author_email='<EMAIL>',
license='LGPL v2',
keywords = 'driver ftdi usb serial spi jtag prolific rs232',
url='http://github.com/eblot/pyftdi',
download_url=''.join(('https://github.com/eblot/pyftdi/tarball/v',
VERSION)),
packages=['pyftdi','pyftdi.pyftdi','pyftdi.serialext'],
package_data={'pyftdi': ['*.rst'],
'pyftdi.serialext' : ['*.rst']},
requires=['pyusb (>= 1.0.0a2)'],
classifiers=[
'Development Status :: 4 - Beta',
'Environment :: Other Environment',
'Intended Audience :: Developers',
'License :: OSI Approved :: GNU Library or '
'Lesser General Public License (LGPL)',
'Operating System :: MacOS :: MacOS X',
'Operating System :: POSIX',
'Operating System :: Microsoft :: Windows :: Windows 95/98/2000',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Topic :: Software Development :: Libraries :: Python Modules',
'Topic :: System :: Hardware :: Hardware Drivers',
],
long_description=_read('README.rst'),
)
<file_sep>/powertrain/relay/relay_2usb.py
import serial
import time
import getopt
import sys
from struct import *
SERIAL_PATH = '/dev/ttyACM0'
BAUD_RATE = 9600
commands = {
'relay_1_on': 0x65,
'relay_1_off': 0x6F,
'relay_2_on': 0x66,
'relay_2_off': 0x70,
'info': 0x5A,
'relay_states': 0x5B,
}
def send_command(cmd, read_response = False):
ser = serial.Serial(SERIAL_PATH, BAUD_RATE)
ser.write(chr(cmd)+'\n')
response = read_response and ser.read() or None
#print "response:", int(response)
ser.close()
return response
def turn_relay_1_on():
send_command(commands['relay_1_on'])
def turn_relay_1_off():
send_command(commands['relay_1_off'])
def click_relay_1():
send_command(commands['relay_1_on'])
time.sleep(1)
send_command(commands['relay_1_off'])
def turn_relay_2_on():
send_command(commands['relay_2_on'])
def turn_relay_2_off():
send_command(commands['relay_2_off'])
def click_relay_2():
send_command(commands['relay_2_on'])
time.sleep(1)
send_command(commands['relay_2_off'])
def get_relay_states():
states = send_command(commands['relay_states'], read_response=True)
response = unpack('b',states)[0]
states = {
0: {'1': False, '2': False},
1: {'1': True, '2': False},
2: {'1': False, '2': True},
3: {'1': True, '2': True},
}
return states[response]
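# Example (illustrative): with only relay 1 energised the board reports
# state 1, so get_relay_states() returns {'1': True, '2': False}.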
def usage():
print """Change relay status:
python rly02.py -r [1,2] -a [on, off, click]
Get device info:
python rly02.py -i
Get relay states:
python rly02.py -s
Help!;
python rly02.py -h
"""
if __name__ == '__main__':
try:
opts, args = getopt.getopt(sys.argv[1:], "hr:a:is", ["help",None, None, "info", "states"])
dict_opts = {}
for o,a in opts:
dict_opts[o] = a
except getopt.GetoptError:
print
usage()
sys.exit(2)
for o, a in opts:
if o in ("-r",):
if a in ['1', '2']:
relay = a
if dict_opts.has_key('-a') and\
dict_opts['-a'] in ['on','off','click']:
action = dict_opts['-a']
actions = {
'1_on' : lambda: turn_relay_1_on(),
'1_off' : lambda: turn_relay_1_off(),
'1_click' : lambda: click_relay_1(),
'2_on' : lambda: turn_relay_2_on(),
'2_off' : lambda: turn_relay_2_off(),
'2_click' : lambda: click_relay_2(),
}
actions['%s_%s'%(relay, action)]()
sys.exit()
else:
print "Action must be on, off or click"
else:
print "Relay must be 1 or 2"
elif o in ("-s", "--states"):
print get_relay_states()
sys.exit()
usage()
sys.exit()
<file_sep>/gui/web_gui/sitetest2/train_AI.py~
#!/usr/bin/env python
import os
from img_processing_tools import *
import Image
import time
import csv
import numpy as np
import milk
from mvpa.clfs.knn import kNN
from mvpa.datasets import Dataset
import mlpy
import matplotlib.pyplot as plt # required for plotting
ifile = open('mower_image_data.csv', "rb")
reader = csv.reader(ifile)
classID = []
features = []
lbp= []
lbp_temp_array = []
i3_histo_temp_array = []
i3_histo = []
rgb_histo_temp_array = []
rgb_histo = []
#read data from file into arrays
for row in reader:
features.append(row)
'''
I3_sum = ImageStat.Stat(image).sum
I3_sum2 = ImageStat.Stat(image).sum2
I3_median = ImageStat.Stat(image).median
I3_mean = ImageStat.Stat(image).mean
I3_var = ImageStat.Stat(image).var
I3_stddev = ImageStat.Stat(image).stddev
I3_rms = ImageStat.Stat(image).rms
'''
#print features[1][1]
#stop
for row in range(1, len(features)):
#print features[row][1]
classID.append(int(features[row][0]))
lbp_temp_array.append(features[row][1].split(" "))
i3_histo_temp_array.append(features[row][2].split(" "))
rgb_histo_temp_array.append(features[row][3].split(" "))
#removes ending element which is a space
for x in range(len(lbp_temp_array)):
lbp_temp_array[x].pop()
lbp_temp_array[x].pop(0)
i3_histo_temp_array[x].pop()
i3_histo_temp_array[x].pop(0)
rgb_histo_temp_array[x].pop()
rgb_histo_temp_array[x].pop(0)
#print lbp_temp_array
#convert all strings in array to numbers
temp_array = []
for x in range(len(lbp_temp_array)):
temp_array = [ float(s) for s in lbp_temp_array[x] ]
lbp.append(temp_array)
temp_array = [ float(s) for s in i3_histo_temp_array[x] ]
i3_histo.append(temp_array)
temp_array = [ float(s) for s in rgb_histo_temp_array[x] ]
rgb_histo.append(temp_array)
#make numpy arrays
lbp = np.asarray(lbp)
i3_histo = np.asarray(i3_histo)
rgb_histo = np.asarray(rgb_histo)
id_index = 15
lbp_predictdata = lbp[[id_index]]
i3_histo_predictdata = i3_histo[[id_index]]
print
#print predictdata
print classID[id_index]
#print "len lbp:", len(lbp)
#print "shape:", lbp.shape
#mvpa
lbp_training = Dataset(samples=lbp,labels=classID)
i3_histo_training = Dataset(samples=i3_histo,labels=classID)
clf = kNN(k=1, voting='majority')
print "clf = ", clf
clf.train(lbp_training)
lbp_predicted_classID = clf.predict(lbp_predictdata)
clf.train(i3_histo_training)
i3_histo_predicted_classID = clf.predict(i3_histo_predictdata)
print "lbp_predicted_classID: ", lbp_predicted_classID
print "i3_histo__predicted_classID :", i3_histo_predicted_classID
#if predicted_classID[0] == 1.0: print "Image is of class: GRASS"
#if predicted_classID[0] == 2.0: print "Image is of class: DIRT/GRAVEL"
#if predicted_classID[0] == 3.0: print "Image is of class: CEMENT/ASPHALT"
#mlpy
#Diagonal Linear Discriminant Analysis.
from numpy import *
from mlpy import *
xtr2 = np.array([[1.1, 2.4, 3.1, 1.0], # first sample
[1.2, 2.3, 3.0, 2.0], # second sample
[1.3, 2.2, 3.5, 1.0], # third sample
[1.4, 2.1, 3.2, 2.0]]) # fourth sample
ytr2 = np.array([1, -1, 1, -1])
xts2 = np.array([[4.0, 5.0, 6.0, 7.0]]) # test point
#ytr = array([1, 1, 1, 1, 1, 1, 1, 1, 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1])
xtr = lbp
ytr = np.array(classID)
print
print "----------------------------"
print
#print xtr2.shape, xtr.shape
#print ytr2.shape, ytr.shape
#ytr = classID # classes
#print lbp
print classID
#print xtr
print
print "----------------------------"
print
print ytr
#print xtr2
#print type(lbp_predictdata[0][0])
#print type(xtr2[0][0])
xts = np.array([lbp_predictdata[0]]) # test point
print "np.shape(xts), xts.ndim, xts.dtype:", np.shape(xts), xts.ndim, xts.dtype
print
print "----------------------------"
print
myknn = mlpy.Knn(k=1)
myknn.compute(xtr,ytr)
print "knn: ", myknn.predict(xts)
print "correct class:", classID[id_index]
mypda = mlpy.Pda()
mypda.compute(xtr,ytr)
print "pda: ", mypda.predict(xts)
print "correct class:", classID[id_index]
#print np.shape(xtr), xtr.ndim, xtr.dtype
#print np.shape(ytr), ytr.ndim, ytr.dtype
#print np.shape(xts), xts.ndim, xts.dtype
#milk
#Help on function feature_selection_simple in module milk.supervised.defaultlearner:
#selector = feature_selection_simple()
learner = milk.defaultclassifier(mode='medium', multi_strategy='1-vs-1', expanded=False)
model = learner.train(xtr, ytr)
print "milk: ", model.apply(xts[0])
#from sklearn.externals import joblib
#joblib.dump(model, 'saved_model.pkl')
#model2 = joblib.load('saved_model.pkl')
#print "milk: ", model2.apply(xts[0])
from sklearn.neighbors import KNeighborsClassifier
knn = KNeighborsClassifier(leaf_size=30, n_neighbors=5, p=2,
warn_on_equidistant=True, weights='distance')
model3 = knn.fit(xtr, ytr)
print "knn sclearn: ", model3.predict(xts)
from sklearn.externals import joblib
try:
os.remove('lbp_knn_clf.pkl')
except:
print "did not del file lbp_knn_clf.pkl"
joblib.dump(model3, 'lbp_knn_clf.pkl')
#print" feature ranking"
#myrank = Ranking() # initialize ranking class
#mysvm = Svm() # initialize svm class
#print myrank.compute(xtr, ytr, mysvm)
'''
n_neighbors = 15
h = .02 # step size in the mesh
import pylab as pl
from matplotlib.colors import ListedColormap
# display graphs
# Create color maps
cmap_light = ListedColormap(['#FFAAAA', '#AAFFAA', '#AAAAFF'])
cmap_bold = ListedColormap(['#FF0000', '#00FF00', '#0000FF'])
for weights in ['uniform', 'distance']:
# we create an instance of Neighbours Classifier and fit the data.
#clf = neighbors.KNeighborsClassifier(n_neighbors, weights=weights)
#clf.fit(X, y)
# Plot the decision boundary. For that, we will asign a color to each
# point in the mesh [x_min, m_max]x[y_min, y_max].
x_min, x_max = xtr[:, 0].min() - 1, xtr[:, 0].max() + 1
y_min, y_max = xtr[:, 1].min() - 1, xtr[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
np.arange(y_min, y_max, h))
Z = knn.predict(np.c_[xx.ravel(), yy.ravel()])
# Put the result into a color plot
Z = Z.reshape(xx.shape)
pl.figure()
pl.pcolormesh(xx, yy, Z, cmap=cmap_light)
# Plot also the training points
pl.scatter(xtr[:, 0], xtr[:, 1], c=ytr, cmap=cmap_bold)
pl.title("3-Class classification (k = %i, weights = '%s')"
% (n_neighbors, weights))
pl.axis('tight')
pl.show()
'''
<file_sep>/gui/breezy/thermometerview1.py
"""
File: thermometerview1.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
from thermometer import Thermometer
class ThermometerView(EasyFrame):
    """A temperature conversion program."""
def __init__(self, model):
"""Sets up the view. The model comes in as an argument."""
EasyFrame.__init__(self, title = "Temperature Converter")
self.model = model
# Label and field for Celsius
self.addLabel(text = "Celsius",
row = 0, column = 0)
self.celsiusField = self.addFloatField(value = model.getCelsius(),
row = 1,
column = 0,
precision = 2)
# Label and field for Fahrenheit
self.addLabel(text = "Fahrenheit",
row = 0, column = 1)
self.fahrField = self.addFloatField(value = model.getFahrenheit(),
row = 1,
column = 1,
precision = 2)
# Celsius to Fahrenheit button
self.addButton(text = ">>>>",
row = 2, column = 0,
command = self.computeFahr)
# Fahrenheit to Celsius button
self.addButton(text = "<<<<",
row = 2, column = 1,
command = self.computeCelsius)
# The controller methods
def computeFahr(self):
"""Inputs the Celsius degrees
and outputs the Fahrenheit degrees."""
degrees = self.celsiusField.getNumber()
self.model.setCelsius(degrees)
self.fahrField.setNumber(self.model.getFahrenheit())
def computeCelsius(self):
"""Inputs the Fahrenheit degrees
and outputs the Celsius degrees."""
degrees = self.fahrField.getNumber()
self.model.setFahrenheit(degrees)
self.celsiusField.setNumber(self.model.getCelsius())
# Instantiate the model, pass it to the view, and pop up the window.
if __name__ == "__main__":
model = Thermometer()
view = ThermometerView(model)
view.mainloop()
<file_sep>/lib/mobot_video_consume.py~
#!/usr/bin/python
import pika
import thread, time, sys, traceback
import cPickle as pickle
import cv, cv2
''' USAGE:
camera = consume_video('video.0', 'localhost')
while True:
    time.sleep(1)
    print "received frame of size:", len(camera.frame)
'''
class consume_video():
def __init__(self, channel_name, host_ip):
self.frame = None
#-------------connection variables
self.channel_name = channel_name
self.host_ip = host_ip
self.queue_name = None
self.connection = None
self.channel = None
#----------------------RUN
self.run()
def connect(self):
self.connection = pika.BlockingConnection(pika.ConnectionParameters(host=self.host_ip))
self.channel = self.connection.channel()
self.channel.exchange_declare(exchange='mobot_data_feed',type='topic')
result = self.channel.queue_declare(exclusive=True)
self.queue_name = result.method.queue
binding_keys = self.channel_name
self.channel.queue_bind(exchange='mobot_data_feed', queue=self.queue_name, routing_key=self.channel_name)
def grab_frames(self):
while True:
time.sleep(0.0001) # do not hog the processor power
self.connect()
for method_frame, properties, body in self.channel.consume(self.queue_name):
self.frame = pickle.loads(body)
# Acknowledge the message
self.channel.basic_ack(method_frame.delivery_tag)
def run(self):
self.th = thread.start_new_thread(self.grab_frames, ())
if __name__== "__main__":
cv2.namedWindow('Video', cv.CV_WINDOW_AUTOSIZE)
camera = consume_video('video.0', 'localhost')
while True:
time.sleep(0.1)
try:
print "receiving video feed data: ", len(camera.frame)
cv2.imshow('Video', camera.frame)
cv.WaitKey(10)
except:
print "no video feed"
pass
<file_sep>/navigation/terrain_id/terrain_training/build/milk/INSTALL.rst
=============
Building milk
=============
To install dependencies in Ubuntu::
sudo apt-get install python-numpy python-scipy libeigen3-dev
The following should work::
python setup.py install
A C++ compiler is required. On Windows, you might need to specify the compiler.
For example, if you have mingw installed::

    python setup.py install --compiler=mingw32
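
To check that the build is importable, a quick smoke test (assuming the
install above completed without errors) is::

    python -c "import milk"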
<file_sep>/gui/breezy/scaledemo2.py
"""
File: scaledemo2.py
Displays a color whose RGB values are input from sliding scales.
"""
from breezypythongui import EasyFrame
from tkinter import VERTICAL
class ColorMeter(EasyFrame):
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Color Meter")
# Label to display the RGB value
        self.rgbLabel = self.addLabel(text = "RGB: #000000",
row = 0, column = 0)
# Canvas to display the color
self.canvas = self.addCanvas(row = 1, column = 0)
self.canvas["bg"] = "black"
# Sliding scale for red
self.redScale = self.addScale(label = "Red",
row = 0, column = 1,
orient = VERTICAL,
from_ = 0, to = 255,
length = 300,
tickinterval = 15,
command = self.setColor)
# Sliding scale for green
self.greenScale = self.addScale(label = "Green",
row = 0, column = 2,
orient = VERTICAL,
from_ = 0, to = 255,
length = 300,
tickinterval = 15,
command = self.setColor)
# Sliding scale for blue
self.blueScale = self.addScale(label = "Blue",
row = 0, column = 3,
orient = VERTICAL,
from_ = 0, to = 255,
length = 300,
tickinterval = 15,
command = self.setColor)
# Event handler for the three sliding scales
def setColor(self, value):
"""Gets the RGB values from the scales,
converts them to hex, and builds a six-digit
hex string to update the view."""
red = hex(self.redScale.get())[2:]
green = hex(self.greenScale.get())[2:]
blue = hex(self.blueScale.get())[2:]
if len(red) == 1:
red = "0" + red
if len(green) == 1:
green = "0" + green
if len(blue) == 1:
blue = "0" + blue
color = "#" + red + green + blue
self.rgbLabel["text"] = "RGB: " + color
self.canvas["bg"] = color
# Instantiate and pop up the window.
ColorMeter().mainloop()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/utils/utils.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
import numpy as np
import random
__all__ = [
'get_nprandom',
'get_pyrandom',
]
def get_nprandom(R):
'''
R' = get_nprandom(R)
Returns a numpy.RandomState from R
Parameters
----------
R : can be one of:
None : Returns the default numpy global state
integer : Uses it as a seed for constructing a new random generator
RandomState : returns R
Returns
-------
R' : np.RandomState
'''
if R is None:
return np.random.mtrand._rand
if type(R) == int:
return np.random.RandomState(R)
if type(R) is random.Random:
return np.random.RandomState(R.randint(0, 2**30))
if type(R) is np.random.RandomState:
return R
raise TypeError("get_nprandom() does not know how to handle type {0}.".format(type(R)))
def get_pyrandom(R):
'''
R = get_pyrandom(R)
Returns a random.Random object based on R
Parameters
----------
R : can be one of:
None : Returns the default numpy global state
integer : Uses it as a seed for constructing a new random generator
RandomState : returns R
Returns
-------
R' : random.Random
'''
if R is None:
return random.seed.im_self
if type(R) is int:
return random.Random(R)
if type(R) is np.random.RandomState:
return random.Random(R.randint(2**30))
if type(R) is random.Random:
return R
raise TypeError("get_pyrandom() does not know how to handle type {0}.".format(type(R)))
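# Behaviour sketch (added for illustration; not part of the original module):
#
#     get_nprandom(None)   # numpy's global RandomState
#     get_nprandom(42)     # a fresh np.random.RandomState seeded with 42
#     get_pyrandom(42)     # a fresh random.Random seeded with 42
#     get_pyrandom(np.random.RandomState(0))   # random.Random seeded from it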
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/defaultclassifier.py
from milk.supervised.defaultlearner import *
defaultclassifier = defaultlearner
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/nfoldcrossvalidation.py
from .measures.nfoldcrossvalidation import foldgenerator, getfold, nfoldcrossvalidation
<file_sep>/PhidgetsPython/Python/MotorControl-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import AttachEventArgs, DetachEventArgs, ErrorEventArgs, CurrentChangeEventArgs, InputChangeEventArgs, VelocityChangeEventArgs
from Phidgets.Devices.MotorControl import MotorControl
#import methods for sleeping thread
from time import sleep
#Create an motorcontrol object
try:
motorControl = MotorControl()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def displayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (motorControl.isAttached(), motorControl.getDeviceName(), motorControl.getSerialNum(), motorControl.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
#Event Handler Callback Functions
def motorControlAttached(e):
attached = e.device
print("MotorControl %i Attached!" % (attached.getSerialNum()))
def motorControlDetached(e):
detached = e.device
print("MotorControl %i Detached!" % (detached.getSerialNum()))
def motorControlError(e):
try:
source = e.device
print("Motor Control %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
def motorControlCurrentChanged(e):
source = e.device
print("Motor Control %i: Motor %i Current Draw: %f" % (source.getSerialNum(), e.index, e.current))
def motorControlInputChanged(e):
source = e.device
print("Motor Control %i: Input %i State: %s" % (source.getSerialNum(), e.index, e.state))
def motorControlVelocityChanged(e):
source = e.device
print("Motor Control %i: Motor %i Current Velocity: %f" % (source.getSerialNum(), e.index, e.velocity))
#Main Program Code
try:
motorControl.setOnAttachHandler(motorControlAttached)
motorControl.setOnDetachHandler(motorControlDetached)
motorControl.setOnErrorhandler(motorControlError)
motorControl.setOnCurrentChangeHandler(motorControlCurrentChanged)
motorControl.setOnInputChangeHandler(motorControlInputChanged)
motorControl.setOnVelocityChangeHandler(motorControlVelocityChanged)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
motorControl.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
motorControl.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
motorControl.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
displayDeviceInfo()
#Control the motor a bit.
print("Will now simulate motor operation....")
print("Step 1: increase acceleration to 50, set target speed at 100")
try:
motorControl.setAcceleration(0, 50.00)
motorControl.setVelocity(0, 100.00)
sleep(5) #sleep for 5 seconds
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Step 2: Set acceleration to 100, decrease target speed to 75")
try:
motorControl.setAcceleration(0, 100.00)
motorControl.setVelocity(0, 75.00)
sleep(5) #sleep for 5 seconds
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Step 3: Stop the motor by decreasing speed to 0 at 50 acceleration")
try:
motorControl.setAcceleration(0, 50.00)
motorControl.setVelocity(0, 0.00)
sleep(5) #sleep for 5 seconds
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
else:
try:
motorControl.setAcceleration(0, 1.00)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
motorControl.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_curves.py
from milk.measures.curves import precision_recall
import numpy as np
def test_precision_recall():
labels = [0,1]*10
values = np.linspace(0,1,len(labels))
precision, recall = precision_recall(values, labels)
assert np.min(recall) >= 0.
assert np.max(recall) <= 1.
assert np.max(precision) <= 1.
assert np.min(precision) >= 0.
labels = [0]*10 + [1] * 10
values = np.linspace(0,1.,20)
precision,recall = precision_recall(values, labels, 'steps', 10)
assert min(precision) >= .5
assert max(precision) == 1.
assert max(recall) == 1.
<file_sep>/conversionUtils.py
from opencv import adaptors
from pygame import surfarray
def surf2CV(surf):
"""Given a surface, convert to an opencv format (cvMat)
"""
numpyImage = surfarray.pixels3d(surf)
cvImage = adaptors.NumPy2Ipl(numpyImage.transpose(1,0,2))
return cvImage
def cv2SurfArray(cvMat):
"""Given an open cvMat convert it to a pygame surface pixelArray
Should be able to call blit_array directly on this.
"""
numpyImage = adaptors.Ipl2NumPy(cvMat)
return numpyImage.transpose(1,0,2)
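# Usage sketch (added for illustration, not part of the original module; it
# assumes a pygame display surface named `screen` already exists):
#
#     cvImage = surf2CV(screen)          # pygame surface -> OpenCV IplImage
#     arr = cv2SurfArray(cvImage)        # ...and back to a blittable array
#     surfarray.blit_array(screen, arr)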
<file_sep>/telemetry/clientside_tcp.py
# Echo client program
import socket
import time
HOST = '127.0.0.1' # The remote host
PORT = 50008 # The same port as used by the server
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
connected = 0
#try n times to connect
for i in range(10):
try:
s.connect((HOST, PORT))
print "connected with server..."
connected = 1
break
except IOError as detail:
print "No server to connect to...", detail[0], detail [1]
time.sleep(1)
#send 10 heartbeat signals or could just keep s.connect every 1 second
if connected == 1:
while 1:
try:
s.send('sending Heartbeat signal...')
data = s.recv(1024)
            if data != "":
print 'Received and ACK from server...', repr(data)
else:
print "ACK not received from server"
break
print s.getpeername(), s.family, s.proto, s.type
time.sleep(1)
except IOError as detail:
print detail
break
#except IOError as detail:
# print "No server to connect to...", detail[0], detail [1]
# break
else:
print "Tried but never reached server..."
try:
print "closing Socket"
s.close()
except NameError as detail:
print "No socket to close", detail
<file_sep>/telemetry/client_rec_img.py
#!/usr/bin/python
import socket
from PIL import ImageFile
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp.connect(('127.0.0.1', 12345))
tcp.send("1000 4")
file = open("bar.jpg", "wb")  # binary mode: raw JPEG bytes are written below
parser = ImageFile.Parser()
while 1:
jpgdata = tcp.recv(65536)
if not jpgdata:
tcp.close()
break
parser.feed(jpgdata)
file.write(jpgdata)
file.close()
image = parser.close()
image.show()
<file_sep>/ftdi/build/pyftdi/build/lib.linux-i686-2.7/pyftdi/pyftdi/misc.py
# Copyright (c) 2008-2011, Neotion
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the Neotion nor the names of its contributors may
# be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL NEOTION BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
# OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
"""Miscelleanous helpers
"""
import binascii
import re
# String values evaluated as true boolean values
TRUE_BOOLEANS = ['on', 'true', 'enable', 'enabled', 'yes', 'high', '1']
# String values evaluated as false boolean values
FALSE_BOOLEANS = ['off', 'false', 'disable', 'disabled', 'no', 'low', '0']
# ASCII or '.' filter
ASCIIFILTER = ''.join([(len(repr(chr(_x)))==3) and chr(_x) or \
'.' for _x in range(256)])
def hexdump(data):
"""Convert a binary buffer into a hexadecimal representation."""
src = ''.join(data)
length = 16
result = []
for i in xrange(0, len(src), length):
s = src[i:i+length]
hexa = ' '.join(["%02x" % ord(x) for x in s])
printable = s.translate(ASCIIFILTER)
result.append("%06x %-*s %s\n" % \
(i, length*3, hexa, printable))
return ''.join(result)
def hexline(data):
"""Convert a binary buffer into a hexadecimal representation"""
src = ''.join(data)
hexa = ' '.join(["%02x" % ord(x) for x in src])
printable = src.translate(ASCIIFILTER)
return "(%d) %s : %s" % (len(data), hexa, printable)
def to_int(value):
"""Parse a string and convert it into a value"""
if not value:
return 0
if isinstance(value, int):
return value
if isinstance(value, long):
return int(value)
mo = re.match('(?i)^\s*(\d+)\s*(?:([KM])B?)?\s*$', value)
if mo:
mult = { 'k': (1<<10), 'm': (1<<20) }
value = int(mo.group(1))
value *= mo.group(2) and mult[mo.group(2).lower()] or 1
return value
return int(value.strip(), value.startswith('0x') and 16 or 10)
def to_bool(value, permissive=True, allow_int=False):
"""Parse a string and convert it into a boolean value"""
if value is None:
return False
if isinstance(value, bool):
return value
if isinstance(value, int):
if allow_int:
return bool(value)
else:
if permissive:
return False
            raise ValueError("Invalid boolean value: '%d'" % value)
if value.lower() in TRUE_BOOLEANS:
return True
if permissive or (value.lower() in FALSE_BOOLEANS):
return False
    raise ValueError('Invalid boolean value: "%s"' % value)
def _crccomp():
"""Internal function used by crc16()"""
try:
from crcmod import mkCrcFun
except ImportError:
raise AssertionError("Python crcmod module not installed")
crc_polynomial = 0x11021
crc_initial = 0xFFFF
crc = mkCrcFun(crc_polynomial, crc_initial, False)
while True:
yield crc
def crc16(data):
"""Compute the CCITT CRC-16 checksum"""
crc = next(_crccomp())
return crc(data)
def xor(_a_, _b_):
"""XOR operation"""
return (not(_a_) and _b_) or (_a_ and not(_b_))
def is_iterable(obj):
"""Tells whether an instance is iterable or not"""
try:
iter(obj)
return True
except TypeError:
return False
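# Quick illustration of the helpers above (a sketch added here, not part of
# the original module); the results reflect the current implementations:
#
#     to_int('512')       # -> 512
#     to_int('4k')        # -> 4096  ('k' multiplies by 1 << 10)
#     to_bool('enable')   # -> True  (see TRUE_BOOLEANS)
#     hexline('AB')       # -> '(2) 41 42 : AB'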
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/affinity.py
# -*- coding: utf-8 -*-
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# Copyright (C) 2010-2011,
# <NAME> <<EMAIL>>,
# <NAME> <<EMAIL>>,
# <NAME> <<EMAIL>>
#
# License: MIT. See COPYING.MIT file in the milk distribution
"""Affinity propagation
Original Authors (for scikits.learn):
<NAME> <EMAIL>
<NAME> <EMAIL>
Luis Pedro Coelho made the implementation more careful about allocating
intermediate arrays.
"""
import numpy as np
__all__ = [
'affinity_propagation',
]
def affinity_propagation(S, p=None, convit=30, maxit=200, damping=0.5, copy=True, R=0):
"""Perform Affinity Propagation Clustering of data
Parameters
----------
S : array [n_points, n_points]
Matrix of similarities between points
p : array [n_points,] or float, optional
Preferences for each point
damping : float, optional
Damping factor
copy : boolean, optional
If copy is False, the affinity matrix is modified inplace by the
algorithm, for memory efficiency
R : source of randomness
Returns
-------
cluster_centers_indices : array [n_clusters]
index of clusters centers
labels : array [n_points]
cluster labels for each point
Notes
-----
See examples/plot_affinity_propagation.py for an example.
Reference:
<NAME> and <NAME>, "Clustering by Passing Messages
Between Data Points", Science Feb. 2007
"""
if copy:
# Copy the affinity matrix to avoid modifying it inplace
S = S.copy()
n_points = S.shape[0]
assert S.shape[0] == S.shape[1]
if p is None:
p = np.median(S)
if damping < 0.5 or damping >= 1:
raise ValueError('damping must be >= 0.5 and < 1')
random_state = np.random.RandomState(R)
# Place preferences on the diagonal of S
S.flat[::(n_points+1)] = p
A = np.zeros((n_points, n_points))
R = np.zeros((n_points, n_points)) # Initialize messages
# Remove degeneracies
noise = random_state.randn(n_points, n_points)
typeinfo = np.finfo(S.dtype)
noise *= typeinfo.tiny*100
S += noise
del noise
# Execute parallel affinity propagation updates
e = np.zeros((n_points, convit))
ind = np.arange(n_points)
for it in range(maxit):
Aold = A.copy()
Rold = R.copy()
A += S
I = np.argmax(A, axis=1)
Y = A[ind, I]#np.max(A, axis=1)
A[ind, I] = typeinfo.min
Y2 = np.max(A, axis=1)
R = S - Y[:, np.newaxis]
R[ind, I[ind]] = S[ind, I] - Y2
Rold *= damping
R *= (1-damping)
R += Rold
# Compute availabilities
Rd = R.diagonal().copy()
np.maximum(R, 0, R)
R.flat[::n_points+1] = Rd
A = np.sum(R, axis=0)[np.newaxis, :] - R
dA = np.diag(A)
A = np.minimum(A, 0)
A.flat[::n_points+1] = dA
Aold *= damping
A *= (1-damping)
A += Aold
# Check for convergence
E = (np.diag(A) + np.diag(R)) > 0
e[:, it % convit] = E
K = np.sum(E, axis=0)
if it >= convit:
se = np.sum(e, axis=1);
unconverged = np.sum((se == convit) + (se == 0)) != n_points
if (not unconverged and (K>0)) or (it==maxit):
print "Converged after %d iterations." % it
break
else:
print "Did not converge"
I = np.where(np.diag(A+R) > 0)[0]
K = I.size # Identify exemplars
if K > 0:
c = np.argmax(S[:, I], axis=1)
c[I] = np.arange(K) # Identify clusters
# Refine the final set of exemplars and clusters and return results
for k in range(K):
ii = np.where(c==k)[0]
j = np.argmax(np.sum(S[ii, ii], axis=0))
I[k] = ii[j]
c = np.argmax(S[:, I], axis=1)
c[I] = np.arange(K)
labels = I[c]
# Reduce labels to a sorted, gapless, list
cluster_centers_indices = np.unique(labels)
labels = np.searchsorted(cluster_centers_indices, labels)
else:
labels = np.empty((n_points, 1))
cluster_centers_indices = None
labels.fill(np.nan)
return cluster_centers_indices, labels
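# Minimal self-test (an illustrative sketch added here, not part of the
# original milk source): cluster a few 2-D points, using negative squared
# euclidean distance as the similarity.
if __name__ == '__main__':
    points = np.random.RandomState(0).rand(30, 2)
    diffs = points[:, np.newaxis, :] - points[np.newaxis, :, :]
    S = -(diffs ** 2).sum(axis=-1)
    centers, labels = affinity_propagation(S)
    print('exemplar indices: %s' % centers)
    print('labels: %s' % labels)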
<file_sep>/gui/breezy/thermometerview2.py
"""
File: thermometerview2.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
from thermometer import Thermometer
class ThermometerView(EasyFrame):
    """A temperature conversion program. Uses sliding scales."""
def __init__(self, model):
"""Sets up the view. The model comes in as an argument."""
EasyFrame.__init__(self, title = "Temperature Converter")
self.model = model
# Sliding scale for Celsius
self.celsiusScale = self.addScale(label = "Celsius",
row = 0, column = 0,
from_ = -273.15, to = 100.0,
resolution = 0.01,
length = 250,
tickinterval = 0,
command = self.computeFahr)
self.celsiusScale.set(model.getCelsius())
        # Sliding scale for Fahrenheit
        self.fahrScale = self.addScale(label = "Fahrenheit",
row = 1, column = 0,
from_ = -459.67, to = 212.0,
resolution = 0.01,
length = 250,
tickinterval = 0,
command = self.computeCelsius)
self.fahrScale.set(model.getFahrenheit())
# The controller methods
def computeFahr(self, degreesCelsius):
"""Inputs the Celsius degrees
and outputs the Fahrenheit degrees."""
degrees = float(degreesCelsius)
self.model.setCelsius(degrees)
self.fahrScale.set(self.model.getFahrenheit())
def computeCelsius(self, degreesFahrenheit):
"""Inputs the Fahrenheit degrees
and outputs the Celsius degrees."""
degrees = float(degreesFahrenheit)
self.model.setFahrenheit(degrees)
self.celsiusScale.set(self.model.getCelsius())
# Instantiate the model, pass it to the view, and pop up the window.
if __name__ == "__main__":
model = Thermometer()
view = ThermometerView(model)
view.mainloop()
<file_sep>/disparity/disparity_test1.py
import cv
import sys
import cv2
import time
def cut(disparity, image, threshold):
for i in range(0, image.height):
for j in range(0, image.width):
# keep closer object
if cv.GetReal2D(disparity,i,j) > threshold:
cv.Set2D(disparity,i,j,cv.Get2D(image,i,j))
###########################################################
def resize_img(original_img, scale_percentage):
print original_img.height, original_img.width, original_img.nChannels
#resized_img = cv.CreateMat(original_img.rows * scale_percentage , original.cols * scale_percenta, cv.CV_8UC3)
resized_img = cv.CreateImage((cv.Round(original_img.width * scale_percentage) , cv.Round(original_img.height * scale_percentage)), original_img.depth, original_img.nChannels)
print "resizing image:", resized_img
    cv.Resize(original_img, resized_img)
"""
if visual == True:
print "resizing image..."
cv.ShowImage("original_img", original_img)
cv.ShowImage("resized_img", resized_img)
print "Press any key to continue....."
cv.WaitKey()
cv.DestroyWindow("original_img")
cv.DestroyWindow("resized_img")
"""
return(resized_img)
###########################################################
def grab_frame(camera):
#capture = cv.CreateCameraCapture(camera)
capture = cv.CaptureFromCAM(camera)
time.sleep(.1)
frame = cv.QueryFrame(capture)
time.sleep(.1)
temp = cv.CloneImage(frame)
#cv.ReleaseCapture(capture)
return temp
def CVtoGray(img):
grey_image = cv.CreateImage(cv.GetSize(img), cv.IPL_DEPTH_8U, 1)
temp_img = cv.CloneImage(img)
cv.CvtColor(temp_img, grey_image, cv.CV_RGB2GRAY)
return grey_image
###########################################################
def resize_img(original_img, scale_percentage):
#print original_img.height, original_img.width, original_img.nChannels
#resized_img = cv.CreateMat(original_img.rows * scale_percentage , original.cols * scale_percenta, cv.CV_8UC3)
resized_img = cv.CreateImage((cv.Round(original_img.width * scale_percentage) , cv.Round(original_img.height * scale_percentage)), original_img.depth, original_img.nChannels)
cv.Resize(original_img, resized_img)
"""
if visual == True:
print "resizing image..."
cv.ShowImage("original_img", original_img)
cv.ShowImage("resized_img", resized_img)
print "Press any key to continue....."
cv.WaitKey()
cv.DestroyWindow("original_img")
cv.DestroyWindow("resized_img")
"""
return(resized_img)
###########################################################
#if __name__=="__main__":
while 1:
#try:
#img1 = cv.LoadImage(sys.argv[1],cv.CV_LOAD_IMAGE_GRAYSCALE)
left = grab_frame(1)
left = resize_img(left, .3)
    print "returned:", left
time.sleep(.1)
#cv.ShowImage("cam-test2",frame_l)
#cv.WaitKey(0)
right = grab_frame(0)
right = resize_img(right, .3)
    print "returned:", right
time.sleep(.1)
#left = cv.CreateImage(cv.GetSize(frame_l), cv.IPL_DEPTH_8U, 1)
#left = CVtoGray(frame_l)
#right = cv.CreateImage(cv.GetSize(frame_r), cv.IPL_DEPTH_8U, 1)
#right = CVtoGray(frame_r)
#except:
#print "******* Could not open image files *******"
#sys.exit(-1)
print "done grabbing.."
left = CVtoGray(left)
right = CVtoGray(right)
cv.NamedWindow("Left",cv.CV_WINDOW_AUTOSIZE)
cv.ShowImage("Left",left)
cv.MoveWindow ('Left',50 ,50 )
cv.NamedWindow("Right",cv.CV_WINDOW_AUTOSIZE)
cv.ShowImage("Right",right)
cv.MoveWindow ('Right', (50 + (1 * (cv.GetSize(right)[0]))) , 50)
cv.WaitKey(150)
#time.sleep(1)
# cv.ShowImage("Coin 3", img3)
# cv.MoveWindow ('Coin 3', 375, 325)
# loading the stereo pair
#left = cv.LoadImage('scene_l1.jpg',cv.CV_LOAD_IMAGE_GRAYSCALE)
#right = cv.LoadImage('scene_r1.jpg',cv.CV_LOAD_IMAGE_GRAYSCALE)
print "processing disparity"
disparity_left = cv.CreateMat(left.height, left.width, cv.CV_16S)
disparity_right = cv.CreateMat(left.height, left.width, cv.CV_16S)
#cv.WaitKey()
# data structure initialization
state = cv.CreateStereoGCState(16,2)
# running the graph-cut algorithm
cv.FindStereoCorrespondenceGC(left,right,
disparity_left,disparity_right,state)
disp_left_visual = cv.CreateMat(left.height, left.width, cv.CV_8U)
cv.ConvertScale( disparity_left, disp_left_visual, -16 );
cv.Save( "disparity.pgm", disp_left_visual ); # save the map
# cutting the object farthest of a threshold (120)
cut(disp_left_visual,left,80)
temp2 = cv.GetImage(disp_left_visual)
temp2 = resize_img(temp2, 3)
cv.NamedWindow('Disparity map', cv.CV_WINDOW_AUTOSIZE)
cv.ShowImage('Disparity map', temp2)
cv.WaitKey(150)
time.sleep(1)
<file_sep>/gui/breezy/layoutdemo.py
"""
File: layoutdemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
class LayoutDemo(EasyFrame):
"""Displays label in the quadrants."""
def __init__(self):
"""Sets up the window and the labels."""
EasyFrame.__init__(self)
self.addLabel(text = "(0, 0)", row = 0, column = 0)
self.addLabel(text = "(0, 1)", row = 0, column = 1)
self.addLabel(text = "(1, 0)", row = 1, column = 0)
self.addLabel(text = "(1, 1)", row = 1, column = 1)
# Instantiates and pops up the window.
if __name__ == "__main__":
LayoutDemo().mainloop()
<file_sep>/telemetry/HeartbeatClient.py
# Filename: HeartbeatClient.py
"""Heartbeat client, sends out an UDP packet periodically"""
import socket, time
SERVER_IP = '127.0.0.1'; SERVER_PORT = 43278; BEAT_PERIOD = 5
print ('Sending heartbeat to IP %s , port %d\n'
'press Ctrl-C to stop\n') % (SERVER_IP, SERVER_PORT)
while True:
hbSocket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
hbSocket.sendto('PyHB', (SERVER_IP, SERVER_PORT))
if __debug__: print 'Time: %s' % time.ctime()
time.sleep(BEAT_PERIOD)
<file_sep>/navigation/spatial/robomow_spatial_test.py
#!/usr/bin/env python
import serial
import time
import math
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.Phidget import Phidget
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import SpatialDataEventArgs, AttachEventArgs, DetachEventArgs, ErrorEventArgs
from Phidgets.Devices.Spatial import Spatial, SpatialEventData, TimeSpan
sp = Spatial()
print("Opening phidget object....")
try:
sp.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
sp.waitForAttach(6000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
sp.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
sp.setDataRate(16)
#print sp.spatialData()
print ("zeroing gyro...")
sp.zeroGyro()
time.sleep(2.2)
north = 0
for i in range (10000):
try:
print
        acceleration = sp.getAcceleration(0), sp.getAcceleration(1), sp.getAcceleration(2)
        print "acc:", acceleration
        print "AngularRate:", sp.getAngularRate(0), sp.getAngularRate(1), sp.getAngularRate(2)
        magField = sp.getMagneticField(0), sp.getMagneticField(1), sp.getMagneticField(2)
print "MagneticField:", magField
#print "compass = ", compass
#if compass < 0:
# compass = compass + 360
#print "north: ", north, (north / i) , i
#compassBearing = compass* (180.0 / math.pi);
#if (compassBearing > 360): compassBearing -= 360;
#if (compassBearing < 0): compassBearing += 360;
#print "compass: ", compass
except:
print ('..................')
time.sleep(.1)
#chr = sys.stdin.read(1)
sp.closePhidget()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/unsupervised/kmeans.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy as np
from numpy import linalg
from . import _kmeans
from ..utils import get_pyrandom
from .normalise import zscore
__all__ = [
'kmeans',
'select_best_kmeans',
'repeated_kmeans',
]
try:
# This tests for the presence of 3-argument np.dot
# with the 3rd argument being the output argument
_x = np.array([
[1., 1.],
[0.,1.]] )
_y = np.array([2., 4.])
_r = np.array([0.,0.])
np.dot(_x, _y, _r)
if _r[0] != 6 or _r[1] != 4:
raise NotImplementedError
_dot3 = np.dot
except:
def _dot3(x, y, _):
return np.dot(x,y)
finally:
del _x
del _y
del _r
def _mahalanobis2(fmatrix, x, icov):
diff = fmatrix-x
# The expression below seems to be faster than looping over the elements and summing
return np.dot(diff, np.dot(icov, diff.T)).diagonal()
def centroid_errors(fmatrix, assignments, centroids):
'''
errors = centroid_errors(fmatrix, assignments, centroids)
Computes the following::
for all i, j:
ci = assignments[i]
errors[i,j] = fmatrix[ci, j] - centroids[ci, j]
Parameters
----------
fmatrix : 2D ndarray
feature matrix
assignments : 1D ndarray
Assignments array
centroids : 2D ndarray
centroids
Returns
-------
errors : float
Difference between fmatrix and corresponding centroid
'''
errors = []
for k,c in enumerate(centroids):
errors.append(fmatrix[assignments == k] - c)
return np.concatenate(errors)
def residual_sum_squares(fmatrix,assignments,centroids,distance='euclidean',**kwargs):
'''
rss = residual_sum_squares(fmatrix, assignments, centroids, distance='euclidean', **kwargs)
Computes residual sum squares
Parameters
----------
fmatrix : 2D ndarray
feature matrix
assignments : 1D ndarray
Assignments array
centroids : 2D ndarray
centroids
Returns
-------
rss : float
residual sum squares
'''
if distance != 'euclidean':
raise NotImplemented("residual_sum_squares only implemented for 'euclidean' distance")
rss = 0.0
for k, c in enumerate(centroids):
diff = fmatrix[assignments == k] - c
diff = diff.ravel()
rss += np.dot(diff, diff)
return rss
def assign_centroids(fmatrix, centroids, histogram=False, normalize=False, normalise=None):
'''
cids = assign_centroids(fmatrix, centroids, histogram=False, normalize=False)
Assigns a centroid to each element of fmatrix
Parameters
----------
fmatrix : 2D ndarray
feature matrix
centroids : 2D ndarray
centroids matrix
histogram : boolean, optional
If True, then the result is actually a histogram
normalize : boolean, optional
If True and ``histogram``, then the histogram is normalized to sum to
one.
Returns
-------
cids : sequence
``cids[i]`` is the index of the centroid closes to ``fmatrix[i];`` or,
if ``histogram``, then ``cids[i]`` is the number of points that were
assigned to centroid ``i.``
'''
dists = np.dot(fmatrix, (-2)*centroids.T)
dists += np.array([np.dot(c,c) for c in centroids])
cids = dists.argmin(1)
if histogram:
hist = np.array(
[np.sum(cids == ci) for ci in xrange(len(centroids))],
np.float)
if (normalize or normalise) and len(fmatrix):
hist /= hist.sum()
return hist
return cids
def _pycomputecentroids(fmatrix, centroids, assignments, counts):
k, Nf = centroids.shape
bins = np.arange(k+1)
ncounts,_ = np.histogram(assignments, bins)
counts[:] = ncounts
any_empty = False
mean = None
for ci,count in enumerate(counts):
if count:
where = (assignments.T == ci)
mean = _dot3(where, fmatrix, mean) # mean = dot(fmatrix.T, where.T), but it is better to not cause copies
mean /= count
centroids[ci] = mean
else:
any_empty = True
return any_empty
def kmeans(fmatrix, k, distance='euclidean', max_iter=1000, R=None, **kwargs):
'''
assignments, centroids = kmean(fmatrix, k, distance='euclidean', max_iter=1000, R=None, icov=None, covmat=None)
k-Means Clustering
Parameters
----------
fmatrix : ndarray
2-ndarray (Nelements x Nfeatures)
distance: string, optional
one of:
- 'euclidean' : euclidean distance (default)
- 'seuclidean' : standartised euclidean distance. This is equivalent to first normalising the features.
- 'mahalanobis' : mahalanobis distance.
This can make use of the following keyword arguments:
+ 'icov' (the inverse of the covariance matrix),
+ 'covmat' (the covariance matrix)
If neither is passed, then the function computes the covariance from the feature matrix
max_iter : integer, optional
Maximum number of iteration (default: 1000)
R : source of randomness, optional
Returns
-------
assignments : ndarray
An 1-D array of size `len(fmatrix)`
centroids : ndarray
An array of `k'` centroids
'''
fmatrix = np.asanyarray(fmatrix)
if not np.issubdtype(fmatrix.dtype, np.float):
fmatrix = fmatrix.astype(np.float)
if distance == 'seuclidean':
fmatrix = zscore(fmatrix)
distance = 'euclidean'
if distance == 'euclidean':
def distfunction(fmatrix, cs, dists):
dists = _dot3(fmatrix, (-2)*cs.T, dists)
dists += np.array([np.dot(c,c) for c in cs])
# For a distance, we'd need to add the fmatrix**2 components, but
# it doesn't matter because we are going to perform an argmin() on
# the result.
return dists
elif distance == 'mahalanobis':
icov = kwargs.get('icov', None)
if icov is None:
covmat = kwargs.get('covmat', None)
if covmat is None:
covmat = np.cov(fmatrix.T)
icov = linalg.inv(covmat)
def distfunction(fmatrix, cs, _):
return np.array([_mahalanobis2(fmatrix, c, icov) for c in cs]).T
else:
raise ValueError('milk.unsupervised.kmeans: `distance` argument unknown (%s)' % distance)
if k < 2:
raise ValueError('milk.unsupervised.kmeans `k` should be >= 2.')
if fmatrix.dtype in (np.float32, np.float64) and fmatrix.flags['C_CONTIGUOUS']:
computecentroids = _kmeans.computecentroids
else:
computecentroids = _pycomputecentroids
R = get_pyrandom(R)
centroids = np.array(R.sample(fmatrix,k), fmatrix.dtype)
prev = np.zeros(len(fmatrix), np.int32)
counts = np.empty(k, np.int32)
dists = None
for i in xrange(max_iter):
dists = distfunction(fmatrix, centroids, dists)
assignments = dists.argmin(1)
if np.all(assignments == prev):
break
if computecentroids(fmatrix, centroids, assignments.astype(np.int32), counts):
(empty,) = np.where(counts == 0)
centroids = np.delete(centroids, empty, axis=0)
k = len(centroids)
counts = np.empty(k, np.int32)
# This will cause new matrices to be allocated in the next iteration
dists = None
prev[:] = assignments
return assignments, centroids
def repeated_kmeans(fmatrix,k,iterations,distance='euclidean',max_iter=1000,R=None,**kwargs):
'''
assignments,centroids = repeated_kmeans(fmatrix, k, repeats, distance='euclidean',max_iter=1000,**kwargs)
Runs kmeans repeats times and returns the best result as evaluated
according to distance
See Also
--------
kmeans : runs kmeans once
Parameters
----------
fmatrix : feature matrix
k : nr of centroids
iterations : Nr of repetitions
distance : 'euclidean' (default) or 'seuclidean'
max_iter : Max nr of iterations per kmeans run
R : random source
Returns
-------
assignments : 1-D array of assignments
centroids : centroids
These are the same returns as the kmeans function
'''
kwargs['max_iter'] = max_iter
return select_best_kmeans(fmatrix, [k], repeats=iterations, method='loglike', distance=distance, **kwargs)
def select_best_kmeans(fmatrix, ks, repeats=1, method='AIC', R=None, **kwargs):
'''
assignments,centroids = select_best_kmeans(fmatrix, ks, repeats=1, method='AIC', R=None, **kwargs)
Runs kmeans repeats times and returns the best result as evaluated
according to distance
See Also
--------
kmeans : runs kmeans once
Parameters
----------
fmatrix : feature matrix
ks : sequence of integers
nr of centroids to try
iterations : integer, optional
Nr of repetitions for each value of k
R : random source, optional
Returns
-------
assignments : 1-D array of assignments
centroids : 2-D ndarray
centroids
These are the same returns as the kmeans function
'''
best = None
best_val = np.inf
R = get_pyrandom(R)
from milk.unsupervised.gaussianmixture import AIC, BIC, log_likelihood
if method == 'AIC':
method = AIC
elif method == 'BIC':
method = BIC
elif method == 'loglike':
method = log_likelihood
else:
raise ValueError('milk.kmeans.select_best_kmeans: unknown method: %s' % method)
if 'distance' in kwargs and kwargs['distance'] == 'seuclidean':
fmatrix = zscore(fmatrix)
for k in ks:
for i in xrange(repeats):
As,Cs = kmeans(fmatrix, k, R=R, **kwargs)
value = method(fmatrix, As, Cs)
if value < best_val:
best_val = value
best = As,Cs
return best
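# Usage sketch (illustrative only, not part of the original milk source):
#
#     import numpy as np
#     from milk.unsupervised.kmeans import kmeans
#     features = np.random.rand(100, 5)
#     assignments, centroids = kmeans(features, 3, R=0)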
<file_sep>/gui/simplecv/gtk_button.py
from SimpleCV import *
from imagegtk import DisplayImage
im = Image("Lenna")
image = im.toRGB().getBitmap()
d = DisplayImage(title="iamgegtk")
label = gtk.Label("Lenna")
d.box.pack_start(label,False,False,0)
but1 = gtk.Button("Quit")
but1.connect("clicked",d.leave_app)
d.box.pack_end(but1,False,False,2)
d.show(image)
<file_sep>/gui/breezy/thermometer.py
"""
File: thermometer.py
Author: <NAME>
"""
class Thermometer(object):
"""Models temperature conversion between degrees Fahrenheit
and degrees Celsius."""
def __init__(self):
"""Sets up the model."""
# Celsius is the standard
self._degreesCelsius = 0.0
def getCelsius(self):
"""Returns the Celsius temperature."""
return self._degreesCelsius
def setCelsius(self, degrees):
"""Sets the thermometer to degrees in Celsius."""
self._degreesCelsius = degrees
def getFahrenheit(self):
"""Returns the Fahrenheit temperature."""
return self._degreesCelsius * 9 / 5 + 32
def setFahrenheit(self, degrees):
"""Sets the thermometer to degrees in Fahrenheit."""
self._degreesCelsius = (degrees - 32) * 5 / 9
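# Small usage sketch (added for illustration; not part of the original file):
# the conversions should round-trip through the usual reference points.
if __name__ == "__main__":
    t = Thermometer()
    t.setCelsius(100.0)
    print(t.getFahrenheit())   # 212.0
    t.setFahrenheit(32.0)
    print(t.getCelsius())      # 0.0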
<file_sep>/img_processing_tools.py
#!/usr/bin/env python
from PIL import Image, ImageStat
def WriteMeterics(image, classID):
#calculate histogram
print "Calculating Histogram for I3 pixels of image..."
Red_Band, Green_Band, Blue_Band = image.split()
Histogram = CalcHistogram(Green_Band)
#save I3 Histogram to file in certain format
f_handle = open('I3banddata.csv', 'a')
f_handle.write(str(classID))
f_handle.write(' ')
f_handle.close()
print "saving I3 histogram to dictionary..."
f_handle = open("I3banddata.txt", 'a')
for i in range(len(Histogram)):
f_handle.write(str(Histogram[i]))
f_handle.write(" ")
#f_handle.write('\n')
f_handle.close()
I3_sum = ImageStat.Stat(image).sum
I3_sum2 = ImageStat.Stat(image).sum2
I3_median = ImageStat.Stat(image).median
I3_mean = ImageStat.Stat(image).mean
I3_var = ImageStat.Stat(image).var
I3_stddev = ImageStat.Stat(image).stddev
I3_rms = ImageStat.Stat(image).rms
print "saving I3 meterics to dictionary..."
f_handle = open("I3banddata.txt", 'a')
print "sum img1_I3: ", I3_sum[1]
print "sum2 img1_I3: ", I3_sum2[1]
print "median img1_I3: ", I3_median[1]
print "avg img1_I3: ", I3_mean[1]
print "var img1_I3: ", I3_var[1]
print "stddev img1_I3: ", I3_stddev[1]
print "rms img1_I3: ", I3_rms[1]
#print "extrema img1_I3: ", ImageStat.Stat(img1_I3).extrema
#print "histogram I3: ", len(img1_I3.histogram())
f_handle.write(str(I3_sum[1]))
f_handle.write(" ")
f_handle.write(str(I3_sum2[1]))
f_handle.write(" ")
f_handle.write(str(I3_median[1]))
f_handle.write(" ")
f_handle.write(str(I3_mean[1]))
f_handle.write(" ")
f_handle.write(str(I3_var[1]))
f_handle.write(" ")
f_handle.write(str(I3_stddev[1]))
f_handle.write(" ")
f_handle.write(str(I3_rms[1]))
f_handle.write(" ")
f_handle.write('\n')
f_handle.close()
return
def rgbToI3(r, g, b):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
i3 = ((2*g)-r-b)/2
return i3
def rgb2I3 (img):
    """Convert an RGB image to the I3 color space
    @param img: PIL image in RGB mode
    return a new PIL image with the I3 value stored in the green band
    """
xmax = img.size[0]
ymax = img.size[1]
#make a copy to return
returnimage = Image.new("RGB", (xmax,ymax))
imagearray = img.load()
for y in range(0, ymax, 1):
for x in range(0, xmax, 1):
rgb = imagearray[x, y]
i3 = ((2*rgb[1])-rgb[0]-rgb[2]) / 2
#print rgb, i3
returnimage.putpixel((x,y), (0,i3,0))
return returnimage
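# Usage sketch (added for illustration, not in the original script; the file
# names below are hypothetical):
#
#     img = Image.open("sample_terrain.jpg").convert("RGB")
#     i3_img = rgb2I3(img)
#     i3_img.save("sample_terrain_i3.png")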
<file_sep>/gui/simplecv/imagegtk.py
import gtk
from threading import Thread
import gobject
import time
gtk.gdk.threads_init()
class DisplayImage():
def __init__(self,winsize=None,imgsize=None,title="imagegtk"):
"""
Constructs gtk window and adds widgets
parameters:
winsize - A tuple containing 2 integers to set window size
imgsize - A tuple containing 2 integers to set image size
title - A string for window title
"""
self.img = None
self.img_gtk = None
self.img_thrd = None
self.img_thrd_flag = False
self.interval = 0.1
self.mouseX = 0
self.mouseY = 0
self.mouse_rawX = 0
self.mouse_rawY = 0
self.done = False
self.thrd = None
self.mouseLeft = False
self.mouseRight = False
self.mouseMiddle = False
self.leftButtonDown = False
self.rightButtonDown = False
self.winsize = winsize
self.imgsize = imgsize
self.win = gtk.Window()
self.win.set_title(title)
if self.winsize is not None:
self.win.set_size_request(self.winsize[0],self.winsize[1])
self.win.connect("delete_event",self.__quit)
self.box = gtk.VBox(False,2)
self.win.add(self.box)
self.box.show()
self.image_box = gtk.EventBox()
self.image_box.connect("motion_notify_event",self.__motioncallback)
self.image_box.connect("button_press_event",self.__presscallback)
self.image_box.connect("button_release_event",self.__releasecallback)
self.box.pack_start(self.image_box,False,False,2)
def show(self,image,imgsize=None):
"""
Summary:
Creates a thread to show image and calls show_image
Parameters:
image - cv2.cv.iplimage or iplimage or filename
"""
if imgsize is not None and type(imgsize) is tuple:
self.imgsize = imgsize
if self.img_thrd:
# Thread exists. Hence change the flag to end the thread.
self.img_thrd_flag = False
self.img_thrd = Thread(target = self.__showimage,args=(image,),name="show_image")
self.img_thrd.start()
def __showimage(self,image):
"""
Summary:
Creates pixbuf from image data. Sets image from pixbuf.
Creates a thread to call gtk.main()
Parameters:
image - cv2.cv.iplimage or iplimage
NOTE:
This function shouldn't be called directly
"""
if self.done == True and self.img_gtk is None:
self.done = False
self.__init__()
elif self.done == True:
self.done = False
self.img = image
self.img_thrd_flag = True
if self.img_gtk is None:
self.img_flag=0
self.img_gtk = gtk.Image()
self.image_box.add(self.img_gtk)
if type(self.img) == str:
try:
self.img_pixbuf = gtk.gdk.pixbuf_new_from_file(self.img)
except gobject.GError:
print "Error: %s not found" % self.img
return
else:
self.img_pixbuf = gtk.gdk.pixbuf_new_from_data(self.img.tostring(),
gtk.gdk.COLORSPACE_RGB,
False,
self.img.depth,
self.img.width,
self.img.height,
self.img.width*self.img.nChannels)
if self.imgsize is not None:
self.img_pixbuf = self.__resizeImage(self.img_pixbuf,self.imgsize)
self.img_gtk.set_from_pixbuf(self.img_pixbuf)
self.img_gtk.show()
self.win.show_all()
if not self.img_flag:
self.__threadgtk()
self.img_flag=1
while (not self.done and self.img_thrd_flag) :
time.sleep(self.interval/2)
def __threadgtk(self):
"""
Summary:
Creates a thread for gtk.main()
"""
self.thrd = Thread(target=gtk.main, name = "GTK thread")
self.thrd.daemon = True
self.thrd.start()
def __quit(self,widget,data=None):
"""
Summary:
calls quit.
Note:
This function must be called with a callback.
"""
self.quit()
def __releasecallback(self,widget,event):
"""
Callback function when mouse button is released.
Updates mouseX, mouseY, mouse_rawX, mouse_rawY
"""
self.mouseX = int(event.x)
self.mouseY = int(event.y)
self.mouse_rawX = int(event.x_root)
self.mouse_rawY = int(event.y_root)
self.__setReleaseButtonState(event.button)
def __presscallback(self,widget,event):
"""
Callback function when mouse button is pressed.
Updates mouseX, mouseY, mouse_rawX, mouse_rawY
"""
self.mouseX = int(event.x)
self.mouseY = int(event.y)
self.mouse_rawX = int(event.x_root)
self.mouse_rawY = int(event.y_root)
self.__setPressButtonState(event.button)
def __setPressButtonState(self, mode):
if mode == 1:
self.leftButtonDown = True
self.mouseLeft = True
elif mode == 2:
self.rightButtonDown = True
            self.mouseRight = True
elif mode == 3:
self.mouseMiddle = True
def __setReleaseButtonState(self, mode):
if mode == 1:
self.leftButtonDown = False
self.mouseLeft = False
elif mode == 2:
self.rightButtonDown = False
            self.mouseRight = False
elif mode == 3:
self.mouseMiddle = False
def __motioncallback(self,widget,event):
"""
Callback function when mouse is moved.
Updates mouseX, mouseY, mouse_rawX, mouse_rawY
"""
self.mouseX = int(event.x)
self.mouseY = int(event.y)
self.mouse_rawX = int(event.x_root)
self.mouse_rawY = int(event.y_root)
def isDone(self):
"""
Check whether window exists or not
"""
return self.done
def callback_quit(self,widget,data=None):
"""
Summary:
calls quit.
Note:
This function must be called with a callback.
"""
self.quit()
def quit(self):
"""
Destroys window, image.
"""
self.img_gtk = None
self.win.destroy()
self.done = True
def change_title(self,title=None):
"""
Change the title of the window.
Parameter:
title - A string
Changes the title of the window
Example:
d = DisplayImage(title="imagegtk")
d.change_title("my title")
"""
if title is not None:
self.win.set_title(title)
self.win.show_all()
def change_size(self,size):
"""
Change the size of the window.
Parameter:
size - a tuple containing 2 integers
Changes the size of the window
Example:
d = DisplayImage(size=(800,600))
d.change_size((400,300))
"""
if size:
self.win_size = size
self.win.set_size_request(self.win_size[0], self.win_size[1])
self.win.show_all()
def __resizeImage(self,pixbuf,size):
"""
Parameters:
pixbuf - gtk.gdk.pixbuf
size - a tuple containing 2 integers
Returns resized gtk.gdk.pixbuf
"""
new_pixbuf = pixbuf.scale_simple(size[0],size[1],gtk.gdk.INTERP_BILINEAR)
return new_pixbuf
def set_image_size(self,size):
"""
To set or change the size of the image.
Parameters:
size - A tuple of two integers
Example:
display = DisplayImage()
i = Image("lenna")
display.show(i.toRGB().getBitmap)
display.set_image_size((400,300))
===============================================================
im = LoadImage("filename.jpg")
image = CreateImage((im.width,im.height),im.depth,im.channels)
CvtColor(im,image,CV_BGR2RGB)
display = DisplayImage()
display.show(image)
display.set_image_size((400,300))
===============================================================
"""
#http://faq.pygtk.org/index.py?req=show&file=faq08.006.htp
if size:
# Need to add more something to choose for interp_type
# current interp_type = gtk.gdk.INTERP_BILINEAR
# http://www.pygtk.org/docs/pygtk/class-gdkpixbuf.html#method-gdkpixbuf--scale-simple
# Should I change the original pixbuf or not ?
#new_buf = self.img_pixbuf.scale_simple(size[0],size[1],gtk.gdk.INTERP_BILINEAR)
new_buf = self.__resizeImage(self.img_pixbuf,size)
self.imgsize = size
self.img_gtk.set_from_pixbuf(new_buf)
self.img_gtk.show()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_gridsearch.py
import milk.supervised.gridsearch
import milk.supervised.svm
from milk.supervised.gridsearch import gridminimise, _allassignments, gridsearch
from milk.tests.fast_classifier import fast_classifier
from nose.tools import raises
import numpy as np
def slow_gridminimise(learner, features, labels, params, measure=None):
from ..measures.nfoldcrossvalidation import nfoldcrossvalidation
if measure is None:
measure = np.trace
    best_val = float('-inf')  # maximise the cross-validation measure
best = None
for assignement in _allassignments(params):
_set_assignment(learner, assignement)
S,_ = nfoldcrossvalidation(features, labels, classifier=learner)
cur = measure(S)
if cur > best_val:
best = assignement
best_val = cur
return best
def test_gridsearch():
from milksets import wine
features, labels = wine.load()
selected = (labels < 2)
features = features[selected]
labels = labels[selected]
G = milk.supervised.gridsearch(
milk.supervised.svm.svm_raw(),
params={'C':[.01,.1,1.,10.],
'kernel':[milk.supervised.svm.rbf_kernel(0.1),milk.supervised.svm.rbf_kernel(1.)]
})
model = G.train(features,labels)
reslabels = [model.apply(f) for f in features]
assert len(reslabels) == len(features)
test_gridsearch.slow = True
def test_all_assignements():
assert len(list(_allassignments({'C': [0,1], 'kernel' : ['a','b','c']}))) == 2 * 3
class error_learner(object):
def train(self, features, labels, **kwargs):
raise ValueError('oops')
def set_option(self, k, v):
pass
@raises(Exception)
def test_with_error():
from milksets.wine import load
features, labels = load()
learner = error_learner()
G = milk.supervised.gridsearch(
error_learner(),
params = { 'error' : range(3), 'error2' : range(5) }
)
G.train(features,labels)
class simple_model:
def __init__(self, c):
self.c = c
def apply(self, f):
return self.c
def f(a,b,c):
return a**2 + b**3 + c
class simple_learner:
def set_option(self, k, v):
setattr(self, k, v)
def train(self, fs, ls, normalisedlabels=False):
return simple_model(f(self.a, self.b, self.c))
def test_gridminimise():
features = np.arange(100)
labels = np.tile((0,1), 50)
paramspace = { 'a': np.arange(4), 'b' : np.arange(-3,3), 'c' : np.linspace(2., 10) }
best,value = gridminimise(simple_learner(), features, labels, paramspace, measure=(lambda _, p: p[0]), return_value=True)
best = dict(best)
val = f(best['a'], best['b'], best['c'])
assert value == val*100
for a in np.arange(4):
for b in np.arange(-3,3):
for c in np.linspace(2., 10):
assert val <= f(a,b,c)
gs = gridsearch(simple_learner(), paramspace, measure=(lambda _, p: p[0]), annotate=True)
model = gs.train(features, labels)
assert model.value == value
assert model.arguments == val
def test_gridminimise():
from milksets.wine import load
features, labels = load()
x = gridminimise(milk.supervised.svm_simple(kernel=np.dot, C=2.), features[::2], labels[::2] == 0, {'C' : (0.5,) })
cval, = x
assert cval == ('C', .5)
def test_gridminimise_return():
from milksets.wine import load
features,labels = load()
learner = fast_classifier()
gridminimise(learner, features, labels, { 'ignore' : [0] })
_,error = gridminimise(learner, features, labels, { 'ignore' : [0] }, return_value=True, nfolds=5)
cmat,_ = milk.nfoldcrossvalidation(features, labels, learner=learner, nfolds=5)
assert error == cmat.sum()-cmat.trace()
<file_sep>/gui/breezy/mousedemo2.py
"""
File: mousedemo2.py
Author: <NAME>
"""
from breezypythongui import EasyFrame, EasyCanvas
import random
class MouseDemo(EasyFrame):
"""Draws ovals in random colors."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Mouse Demo 2")
# Canvas
self.shapeCanvas = ShapeCanvas(self)
self.addCanvas(self.shapeCanvas, row = 0, column = 0,
width = 300, height = 150)
class ShapeCanvas(EasyCanvas):
"""Draw an oval with a press, drag, and release of the mouse."""
def __init__(self, parent):
"""Sets up the canvas."""
EasyCanvas.__init__(self, parent, background = "gray")
def mousePressed(self, event):
"""Sets the first corner of the oval's bounding rectangle."""
self.x = event.x
self.y = event.y
def mouseReleased(self, event):
"""Sets the second corner of the oval's bounding rectangle.
Draws an oval filled in a random color."""
if self.x != event.x and self.y != event.y:
color = self.getRandomColor()
self.drawOval(self.x, self.y,
event.x, event.y, fill = color)
def getRandomColor(self):
"""Returns a random RGB color."""
hexR = hex(random.randint(0, 255))[2:]
hexG = hex(random.randint(0, 255))[2:]
hexB = hex(random.randint(0, 255))[2:]
if len(hexR) == 1: hexR = "0" + hexR
if len(hexG) == 1: hexG = "0" + hexG
if len(hexB) == 1: hexB = "0" + hexB
return "#" + hexR + hexG + hexB
# Instantiate and pop up the window.
if __name__ == "__main__":
MouseDemo().mainloop()
<file_sep>/arduino/sonar/sonar.ino
// This program written for Arduino UNO rv3.
// for Maxbotix HRLV-Maxsonar MB1023
//Analog pin 0-4 for reading in the analog voltage from the MaxSonar device.
const int anPin0 = 0;
const int anPin1 = 1;
const int anPin2 = 2;
const int anPin3 = 3;
const int anPin4 = 4;
//variables needed to store values
long anVolt0;
long anVolt1;
long anVolt2;
long anVolt3;
long anVolt4;
long cm0;
long cm1;
long cm2;
long cm3;
long cm4;
int sum0=0;//Create sum variable so it can be averaged
int sum1=0;//Create sum variable so it can be averaged
int sum2=0;//Create sum variable so it can be averaged
int sum3=0;//Create sum variable so it can be averaged
int sum4=0;//Create sum variable so it can be averaged
int sum_mm = 0;
int avgrange=8;//Quantity of values to average (sample size)
int calibration_offset = 0;
float cm_to_inches = 0.393701;
void setup() {
//This opens up a serial connection to shoot the results back to the PC console
Serial.begin(57600);
pinMode(anPin0, INPUT);
delay(1);
pinMode(anPin1, INPUT);
delay(1);
pinMode(anPin2, INPUT);
delay(1);
pinMode(anPin3, INPUT);
delay(1);
pinMode(anPin4, INPUT);
}
void loop() {
//MaxSonar Analog reads are known to be very sensitive. See the Arduino forum for more information.
  //A simple fix is to average out a sample of n readings to get a more consistent reading.
for(int i = 0; i < avgrange ; i++)
{
//Used to read in the analog voltage output that is being sent by the MaxSonar device.
//Scale factor is (Vcc/512) per inch. A 5V supply yields ~9.8mV/cm
    //Arduino analog pins read 0 to 1023, so the value is divided by 2 to get the distance in cm
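    //Worked example of this scaling: a raw analogRead() value of 200 counts
    //would be reported as 200/2 = 100 cm.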
anVolt0 = analogRead(anPin0)/2;
delay(5);
anVolt1 = analogRead(anPin1)/2;
delay(5);
anVolt2 = analogRead(anPin2)/2;
delay(5);
anVolt3 = analogRead(anPin3)/2;
delay(5);
anVolt4 = analogRead(anPin4)/2;
delay(5);
sum0 += anVolt0;
sum1 += anVolt1;
sum2 += anVolt2;
sum3 += anVolt3;
sum4 += anVolt4;
//delay(1);
}
cm0 = sum0/avgrange;
cm1 = sum1/avgrange;
cm2 = sum2/avgrange;
cm3 = sum3/avgrange;
cm4 = sum4/avgrange;
//inches = cm * cm_to_inches;
//inches = cm * cm_to_inches;
//inches = cm * cm_to_inches;
//inches = cm * cm_to_inches;
//inches = cm * cm_to_inches;
//Serial.print(inches);
//Serial.print("in, ");
Serial.print("s1:");
Serial.print(cm0);
Serial.print("s2:");
Serial.print(cm1);
Serial.print("s3:");
Serial.print(cm2);
Serial.print("s4:");
Serial.print(cm3);
Serial.print("s5:");
Serial.println(cm4);
//Serial.println();
//reset sample total
sum0 = 0;
sum1 = 0;
sum2 = 0;
sum3 = 0;
sum4 = 0;
delay(10);
}
<file_sep>/gui/breezy/song.py
"""
File: song.py
Author: <NAME>
"""
class Song(object):
"""Models a song."""
def __init__(self, title, artist, price):
"""Sets up the model."""
self.title = title
self.artist = artist
self.price = price
def __str__(self):
"""Returns the string rep of the song."""
return "Title: " + self.title + "\n" + \
"Artist: " + self.artist + "\n" + \
"Price: $%.2f\n" % self.price
def testSong():
"""Creates and prints a song."""
song = Song("Let It Be", "The Beatles", 0.99)
print(song)
import pickle
class SongDatabase(object):
"""Models a song database."""
def __init__(self, fileName = ""):
"""Sets up the model. Loads the model with songs if there
is a file name"""
self._items = {}
self.fileName = fileName
if fileName != "":
try:
file = open(fileName, "rb")
while True:
song = pickle.load(file)
self[song.title] = song
except Exception:
file.close()
def save(self, fileName = ""):
"""Saves the database to a file."""
if fileName != "":
self.fileName = fileName
elif self.fileName == "": return
file = open(self.fileName, "wb")
for song in self._items.values():
pickle.dump(song, file)
file.close()
def __getitem__(self, title):
"""Returns the cd with the given title, or None if
the title does not exist."""
return self._items.get(title, None)
def __setitem__(self, title, song):
"""Adds or replaces the cd at the given title."""
self._items[title] = song
def pop(self, title):
"""Removes and returns the cd at the given title,
or returns None if the title does not exist."""
return self._items.pop(title, None)
def getTitles(self):
"""Returns a sorted list of titles."""
titles = list(self._items.keys())
titles.sort()
return titles
def __str__(self):
"""Returns the string rep of the database."""
return "\n".join(map(lambda title: str(self[title]),
self.getTitles()))
def testDatabase():
"""Creates a database, prints it, saves it,
loads it, and prints it."""
database = SongDatabase()
song = Song("Let It Be", "The Beatles", 0.99)
database[song.title] = song
song = Song("<NAME>", "The Rolling Stones", 01.29)
database[song.title] = song
song = Song("Kashmir", "Led Zepplin", 1.29)
database[song.title] = song
song = Song("<NAME>", "Falco", 0.99)
database[song.title] = song
print(database)
database.save("songs.dat")
newdb = SongDatabase("songs.dat")
print(newdb)
if __name__ == "__main__":
testDatabase()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/gridsearch.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from .classifier import normaliselabels
import multiprocessing
__all__ = [
'gridminimise',
'gridsearch',
]
def _allassignments(options):
try:
from itertools import product
except ImportError:
def product(*args, **kwds):
# from http://docs.python.org/library/itertools.html#itertools.product
pools = map(tuple, args) * kwds.get('repeat', 1)
result = [[]]
for pool in pools:
result = [x+[y] for x in result for y in pool]
for prod in result:
yield tuple(prod)
from itertools import repeat, izip
for ks,vs in izip(repeat(options.keys()), product(*options.values())):
yield zip(ks,vs)
def _set_options(learner, options):
for k,v in options:
learner.set_option(k,v)
class Grid1(multiprocessing.Process):
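    '''
    Internal worker used by gridminimise: each instance evaluates one
    (parameter setting, cross-validation fold) job at a time. When run as a
    separate process it reads (index, fold) pairs from ``inq`` and puts
    (index, error) results on ``outq``; in the single-process path,
    ``execute_one`` is called directly.
    '''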
def __init__(self, learner, features, labels, measure, train_kwargs, options, folds, inq, outq):
self.learner = learner
self.features = features
self.labels = labels
self.measure = measure
self.train_kwargs = train_kwargs
self.options = options
self.folds = folds
self.inq = inq
self.outq = outq
super(Grid1, self).__init__()
def execute_one(self, index, fold):
_set_options(self.learner, self.options[index])
train, test = self.folds[fold]
model = self.learner.train(self.features[train], self.labels[train], normalisedlabels=True, **self.train_kwargs)
preds = model.apply_many(self.features[test])
error = self.measure(self.labels[test], preds)
return error
def run(self):
try:
while True:
index,fold = self.inq.get()
if index == 'shutdown':
self.outq.close()
self.outq.join_thread()
return
error = self.execute_one(index, fold)
self.outq.put( (index, error) )
except Exception, e:
import traceback
errstr = r'''\
Error in milk.gridminimise internal
Exception was: %s
Original Traceback:
%s
(Since this was run on a different process, this is not a real stack trace).
''' % (e, traceback.format_exc())
self.outq.put( ('error', errstr) )
def gridminimise(learner, features, labels, params, measure=None, nfolds=10, return_value=False, train_kwargs=None, nprocs=None, origins=None):
'''
best = gridminimise(learner, features, labels, params, measure={0/1 loss}, nfolds=10, return_value=False, nprocs=None)
best, value = gridminimise(learner, features, labels, params, measure={0/1 loss}, nfolds=10, return_value=True, nprocs=None)
    Grid search for the settings of parameters that minimises a given loss measure
This function is equivalent to searching the grid, but does not actually
search the whole grid.
Parameters
----------
learner : a classifier object
features : sequence of features
labels : sequence of labels
params : dictionary of sequences
keys are the options to change,
values are sequences of corresponding elements to try
measure : function, optional
a function that takes labels and outputs and returns the loss.
Default: 0/1 loss. This must be an *additive* function.
nfolds : integer, optional
nr of folds to run, default: 10
return_value : boolean, optional
Whether to return the error value as well. Default False
train_kwargs : dict, optional
Options that are passed to the train() method of the classifier, using
the ``train(features, labels, **train_kwargs)`` syntax. Defaults to {}.
nprocs : integer, optional
Number of processors to use. By default, uses the
``milk.utils.parallel`` framework to check the number of
processors.
Returns
-------
best : a sequence of assignments
value : float
Only returned if ``return_value`` is true
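    Example
    -------
    A minimal usage sketch (``my_learner`` and the parameter values below are
    hypothetical placeholders, not part of milk)::

        best = gridminimise(my_learner, features, labels,
                            params={'C': [0.01, 0.1, 1.0]})
        for key, value in best:
            my_learner.set_option(key, value)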
'''
# The algorithm is as follows:
#
# for all assignments: error = 0, next_iteration = 0
#
# at each iteration:
# look for assignment with smallest error
# if that is done: return it
# else: perform one more iteration
#
# When the function returns, that assignment has the lowest error of all
# assignments and all the iterations are done. Therefore, other assignments
# could only be worse even if we never computed the whole error!
from ..measures.nfoldcrossvalidation import foldgenerator
from ..utils import parallel
if measure is None:
from ..measures.measures import zero_one_loss
measure = zero_one_loss
if train_kwargs is None:
train_kwargs = {}
try:
features = np.asanyarray(features)
except:
features = np.array(features, dtype=object)
labels,_ = normaliselabels(labels)
options = list(_allassignments(params))
iteration = np.zeros(len(options), int)
error = np.zeros(len(options), float)
folds = [(Tr.copy(), Te.copy()) for Tr,Te in foldgenerator(labels, nfolds=nfolds, origins=origins)]
# foldgenerator might actually decide on a smaller number of folds,
# depending on the distribution of class sizes:
nfolds = len(folds)
assert nfolds
if nprocs is None:
nprocs = len(options)
else:
nprocs = min(nprocs, len(options))
assert nprocs > 0, 'milk.supervised.gridminimise: nprocs <= 0!!'
nprocs = parallel.get_procs(nprocs, use_current=True)
executing = set()
workers = []
if nprocs > 1:
inqueue = multiprocessing.Queue()
outqueue = multiprocessing.Queue()
for i in xrange(nprocs):
inqueue.put((i,0))
executing.add(i)
w = Grid1(learner, features, labels, measure, train_kwargs, options, folds, inqueue, outqueue)
w.start()
workers.append(w)
getnext = outqueue.get
queuejob = lambda next, fold: inqueue.put( (next, fold) )
else:
worker = Grid1(learner, features, labels, measure, train_kwargs, options, folds, None, None)
queue = []
def queuejob(index,fold):
queue.append((index,fold))
def getnext():
index,fold = queue.pop()
return index, worker.execute_one(index,fold)
queuejob(0,0)
executing.add(0)
try:
while True:
p,err = getnext()
if p == 'error':
raise RuntimeError(err)
executing.remove(p)
iteration[p] += 1
error[p] += err
for best in np.where(error == error.min())[0]:
if iteration[best] == nfolds:
if return_value:
return options[best], error[best]
return options[best]
for next in error.argsort():
if iteration[next] < nfolds and next not in executing:
executing.add(next)
queuejob(next, iteration[next])
break
finally:
assert np.max(iteration) <= nfolds
if len(workers):
for w in workers:
inqueue.put( ('shutdown', None) )
inqueue.close()
inqueue.join_thread()
for w in workers:
w.join()
parallel.release_procs(nprocs, count_current=True)
class gridsearch(object):
'''
    G = gridsearch(base, measure={0/1 loss}, nfolds=10, params={ param1 : [...], param2 : [...]}, annotate=False)
Perform a grid search for the best parameter values.
When G.train() is called, then for each combination of p1 in param1, p2 in
param2, ... it performs (effectively)::
base.param1 = p1
base.param2 = p2
...
value[p1, p2,...] = measure(nfoldcrossvalidation(..., learner=base))
    it then picks the set of parameters with the lowest loss and re-learns a model on the
whole data.
Parameters
-----------
base : classifier to use
measure : function, optional
a function that takes labels and outputs and returns the loss.
Default: 0/1 loss. This must be an *additive* function.
nfolds : integer, optional
Nr of folds
params : dictionary
annotate : boolean
Whether to annotate the returned model with ``arguments`` and ``value``
fields with the result of cross-validation. Defaults to False.
All of the above can be *passed as parameters to the constructor or set as
attributes*.
See Also
--------
gridminimise : function
Implements the basic functionality behind this object
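    Example
    -------
    A minimal usage sketch (``base_learner`` and the parameter grid below are
    hypothetical placeholders)::

        G = gridsearch(base_learner,
                       params={'C': [0.01, 0.1, 1.0, 10.0]})
        model = G.train(features, labels)
        predictions = [model.apply(f) for f in features]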
'''
def __init__(self, base, measure=None, nfolds=10, params={}, annotate=False):
self.params = params
self.base = base
        self.nfolds = nfolds
self.measure = measure
self.annotate = annotate
def is_multi_class(self):
return self.base.is_multi_class()
def train(self, features, labels, normalisedlabels=False, **kwargs):
best,value = gridminimise(self.base, features, labels, self.params, self.measure, self.nfolds, return_value=True, train_kwargs=kwargs)
_set_options(self.base, best)
model = self.base.train(features, labels, normalisedlabels=normalisedlabels, **kwargs)
if self.annotate:
model.arguments = best
model.value = value
return model
<file_sep>/snap.py
from CVtypes import cv
win = 'Show Cam'
cv.NamedWindow(win)
cap = cv.CreateCameraCapture(0)
while cv.WaitKey(1) != 27:
img = cv.QueryFrame(cap)
cv.ShowImage(win, img)
<file_sep>/navigation/gps/read_gps_waypoints.py
#!/usr/bin/env python
import os
import time
import csv
import numpy as np
ifile = open('gps_polygon.txt', "rb")
reader = csv.reader(ifile, delimiter='\t')
line_info = []
gps_lat = []
gps_long = []
polygon = []
reader.next()
reader.next()
#read data from file into arrays
for row in reader:
line_info.append(row)
gps_lat.append(row[8])
gps_long.append(row[9])
temp = (float(row[8]), float(row[9]))
polygon.append(temp)
#print row
#print len(features)
#print line_info[0]
print gps_lat, gps_long
print;print;print
print polygon
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_kmeans.py
import numpy as np
import milk.unsupervised
from milk.unsupervised.kmeans import assign_centroids, repeated_kmeans
def test_kmeans():
np.random.seed(132)
features = np.r_[np.random.rand(20,3)-.5,.5+np.random.rand(20,3)]
def test_distance(dist, kwargs={}):
centroids, _ = milk.unsupervised.kmeans(features, 2, distance=dist, **kwargs)
positions = [0]*20 + [1]*20
correct = (centroids == positions).sum()
assert correct >= 38 or correct <= 2
yield test_distance, 'euclidean'
yield test_distance, 'seuclidean'
yield test_distance, 'mahalanobis', { 'icov' : np.eye(3) }
def test_kmeans_centroids():
np.random.seed(132)
features = np.random.rand(201,30)
for k in [2,3,5,10]:
indices,centroids = milk.unsupervised.kmeans(features, k)
for i in xrange(k):
if np.any(indices == i):
assert np.allclose(centroids[i], features[indices == i].mean(0))
def test_assign_cids():
from milksets.wine import load
features,_ = load()
assigns, centroids = milk.unsupervised.kmeans(features, 3, R=2, max_iters=10)
assert np.all(assign_centroids(features, centroids) == assigns)
def test_non_contiguous_fmatrix():
from milksets.wine import load
features,_ = load()
features = features[:,::2]
assigns, centroids = milk.unsupervised.kmeans(features, 3, R=2, max_iters=10)
assert np.all(assign_centroids(features, centroids) == assigns)
features = features.astype(np.int32)
assigns, centroids = milk.unsupervised.kmeans(features, 3, R=2, max_iters=10)
assert np.all(assign_centroids(features, centroids) == assigns)
def test_repeated_kmeans():
np.random.seed(132)
features = np.random.rand(201,30)
cids,cs = repeated_kmeans(features, 3, 2)
assert len(cids) == len(features)
<file_sep>/telemetry/send_img.py
# USAGE: python FileSender.py [file]
import sys, socket
HOST = '127.0.0.1'
CPORT = 12345
MPORT = 12346
FILE = sys.argv[1]
filename_tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
filename_tcp.connect((HOST, CPORT))
#filename_tcp.bind((HOST, CPORT))
filename_tcp.send("SEND " + FILE)
filename_tcp.close()
data_tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
data_tcp .connect((HOST, MPORT))
print "data port connected..."
f = open(FILE, "rb")
data = f.read()
f.close()
data_tcp.send(data)
data_tcp.close()
<file_sep>/gui/tkinter/filereceiver_class_test.py
from FileReceiver import *
from threading import *
import time
s = FileReceiver()
#line below stops thread when main program stops
s.daemon = True
s.start()
for i in range (10):
print i
time.sleep(1)
<file_sep>/PhidgetsPython/setup.py
__author__="Adam"
__date__ ="18-Jan-2011 12:51:42 PM"
from distutils.core import setup
setup (
name = 'PythonModule',
version = '2.1.8',
description = 'Python Wrapper library for the Phidgets API',
author = 'Adam',
author_email = '<EMAIL>',
url = 'http://www.phidgets.com',
packages = ['Phidgets', 'Phidgets.Devices', 'Phidgets.Events'],
license = 'Creative Commons Attribution 2.5 Canada License',
)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_multi.py
import numpy as np
import random
import milk.supervised.svm
import milk.supervised.multi
from milk.supervised.classifier import ctransforms
from fast_classifier import fast_classifier
import milksets.wine
features,labels = milksets.wine.load()
A = np.arange(len(features))
random.seed(9876543210)
random.shuffle(A)
features = features[A]
labels = labels[A]
labelset = set(labels)
base = ctransforms(milk.supervised.svm.svm_raw(C=2.,kernel=milk.supervised.svm.rbf_kernel(2.**-3)),milk.supervised.svm.svm_binary())
def test_one_against_rest():
M = milk.supervised.multi.one_against_rest(base)
M = M.train(features[:100,:],labels[:100])
tlabels = [M.apply(f) for f in features[100:]]
for tl in tlabels:
assert tl in labelset
def test_one_against_one():
M = milk.supervised.multi.one_against_one(base)
M = M.train(features[:100,:],labels[:100])
tlabels = [M.apply(f) for f in features[100:]]
for tl in tlabels:
assert tl in labelset
tlabels_many = M.apply_many(features[100:])
assert np.all(tlabels == tlabels_many)
def test_two_thirds():
np.random.seed(2345)
C = milk.supervised.defaultclassifier('fast')
X = np.random.rand(120,4)
X[:40] += np.random.rand(40,4)
X[:40] += np.random.rand(40,4)
X[40:80] -= np.random.rand(40,4)
X[40:80] -= np.random.rand(40,4)
Y = np.repeat(np.arange(3), 40)
model = C.train(X,Y)
Y_ = np.array([model.apply(x) for x in X])
assert (Y_ == Y).mean() * 3 > 2
def test_multi_labels():
clabels = [[lab, lab+7] for lab in labels]
multi_label = milk.supervised.multi.one_against_rest_multi(base)
model = multi_label.train(features[::2], clabels[::2])
test_vals = [model.apply(f) for f in features[1::2]]
for ts in test_vals:
if 0.0 in ts: assert 7.0 in ts
if 1.0 in ts: assert 8.0 in ts
if 2.0 in ts: assert 9.0 in ts
def test_classifier_no_set_options():
# Basically these should not raise an exception
milk.supervised.multi.one_against_rest_multi(fast_classifier())
milk.supervised.multi.one_against_rest(fast_classifier())
milk.supervised.multi.one_against_one(fast_classifier())
def test_tree():
mtree = milk.supervised.multi.multi_tree_learner(fast_classifier())
labels = [0,1,2,2,3,3,3,3]
features = np.random.random_sample((len(labels), 8))
model = mtree.train(features, labels)
counts = np.zeros(4)
for ell in labels:
counts[ell] += 1
g0,g1 = milk.supervised.multi.split(counts)
assert np.all(g0 == [3]) or np.all(g1 == [3])
def r(m):
if len(m) == 1: return int(m[0])
else: return sorted([r(m[1]), r(m[2])])
assert r(model.model) == [3,[2,[0,1]]]
<file_sep>/lib/mobot_nav_class.py
#!/usr/bin/python
import thread, time, sys, traceback
import serial
from math import *
class mobot_nav ():
def __init__(self, lidar):
self.lidar = lidar
def angel_greatest_dist(self):
min_dist_mm = 60
greatest_dist_mm = 0
#while True:
for i in self.lidar.data:
time.sleep(0.0001)
angle = i[0]
dist_mm = i[1]
quality = i[2]
if dist_mm > min_dist_mm and quality > 10:
if dist_mm > greatest_dist_mm:
greatest_dist_mm = dist_mm
angle_to_return = angle
return angle_to_return
def turn_left_or_right(self):
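        """Scans the lidar data in two beams (about 20 degrees either side of
        90 for 'left' and 270 for 'right') and returns 'left', 'right' or
        'none' depending on which side holds the farthest valid return."""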
min_dist_mm = 60
greatest_dist_mm = 0
way_to_turn = "none"
beam_width = 20
left = 90
right = 270
for i in self.lidar.data:
time.sleep(0.0001)
angle = i[0]
dist_mm = i[1]
quality = i[2]
if dist_mm > min_dist_mm and quality > 10:
if (left-beam_width) <= angle <= (left+beam_width):
if dist_mm > greatest_dist_mm:
greatest_dist_mm = dist_mm
way_to_turn = 'left'
if (right-beam_width) <= angle <= (right+beam_width):
if dist_mm > greatest_dist_mm:
greatest_dist_mm = dist_mm
way_to_turn = 'right'
print "greatest_dist_mm", greatest_dist_mm
return way_to_turn
<file_sep>/lib/ServoDriver.py
from mm18usb import *
class CoinServoDriver(object):
def __init__(self,x_servo=0,y_servo=1,z_servo=2):
#def __init__(self,x_servo=0):
self.x_servo = x_servo
self.y_servo = y_servo
self.z_servo = z_servo
self.device = mm18usb('/dev/ttyACM1', '/dev/ttyACM0')
self.device.set_acceleration(self.x_servo,2)
self.device.set_speed(self.x_servo,2)
self.device.set_acceleration(self.y_servo,2)
self.device.set_speed(self.y_servo,2)
self.device.set_acceleration(self.z_servo,2)
self.device.set_speed(self.z_servo,10)
self.device.go_home()
def __del__(self):
del(self.device)
def status_report(self):
return "X: %s\tY: %s\tZ: %s" % (self.device.get_position(self.x_servo),self.device.get_position(self.y_servo),self.device.get_position(self.z_servo))
#return "X: %s" % (self.device.get_position(self.x_servo))
def setx_speed(self, speed):
self.device.set_speed(self.x_servo,speed)
self.device.wait_until_at_target()
def setx_acceleration(self, acc):
self.device.set_acceleration(self.x_servo,acc)
self.device.wait_until_at_target()
def sety_speed(self, speed):
self.device.set_speed(self.y_servo,speed)
self.device.wait_until_at_target()
def sety_acceleration(self, acc):
self.device.set_acceleration(self.y_servo,acc)
self.device.wait_until_at_target()
def pan(self,dx):
x = self.device.get_position(self.x_servo)
x += dx
self.device.set_target(self.x_servo,x)
self.device.wait_until_at_target()
def tilt(self,dy):
y = self.device.get_position(self.y_servo)
y += dy
self.device.set_target(self.y_servo,y)
self.device.wait_until_at_target()
def rotate(self,dz):
z = self.device.get_position(self.z_servo)
z += dz
self.device.set_target(self.z_servo,z)
self.device.wait_until_at_target()
def goto(self,x,y,z=0):
#def goto(self,x=0):
print "x=", x
self.device.set_target(self.x_servo,x)
self.device.set_target(self.y_servo,y)
self.device.set_target(self.z_servo,z)
self.device.wait_until_at_target()
def reset(self):
self.device.go_home()
self.device.wait_until_at_target()
def arm_up(self, up):
pos = 1425+up
self.goto(pos)
self.device.wait_until_at_target()
def arm_down(self):
self.goto(1425)
self.device.wait_until_at_target()
if __name__=="__main__":
#coinid_servo = mm18usb('/dev/ttyACM2', '/dev/ttyACM1')
coinid_servo = CoinServoDriver()
print coinid_servo.status_report()
coinid_servo.setx_speed(5)
coinid_servo.setx_acceleration(1)
coinid_servo.sety_speed(5)
coinid_servo.sety_acceleration(1)
#print coinid_servo.status_report()
time.sleep(.5)
'''
coinid_servo.arm_up(0)
print coinid_servo.status_report()
time.sleep(1)
coinid_servo.arm_up(100)
print coinid_servo.status_report()
time.sleep(1)
coinid_servo.arm_down()
print coinid_servo.status_report()
time.sleep(1)
#print coinid_servo.get_errors()
#print "Get Position"
#pos = coinid_servo.get_position(0)
#print pos
#print "moving arm up"
#coinid_servo.reset()
#print "Set Target:"
#coinid_servo.set_target(0,1700)
#coinid_servo.wait_until_at_target()
#time.sleep(1)
#coinid_servo.arm_up()
coinid_servo.goto(3500)
print coinid_servo.status_report()
time.sleep(1)
'''
coinid_servo.goto(0,0,0)
print coinid_servo.status_report()
time.sleep(1)
coinid_servo.goto(500,0,0)
time.sleep(1)
print coinid_servo.status_report()
coinid_servo.goto(2500,0,0)
print coinid_servo.status_report()
time.sleep(1)
coinid_servo.goto(500,0,0)
#time.sleep(1)
#coinid_servo.tilt(1000)
#time.sleep(1)
#coinid_servo.tilt(2000)
del coinid_servo
<file_sep>/gui/breezy/numberfielddemo.py
"""
File: numberfielddemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
import math
class NumberFieldDemo(EasyFrame):
"""Computes and displays the square root of an input number."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self)
# Label and field for the input
self.addLabel(text = "An integer",
row = 0, column = 0)
self.inputField = self.addIntegerField(value = 0,
row = 0,
column = 1,
width = 10)
# Label and field for the output
self.addLabel(text = "Square root",
row = 1, column = 0)
self.outputField = self.addFloatField(value = 0.0,
row = 1,
column = 1,
width = 8,
precision = 2,
state = "readonly")
# The command button
self.addButton(text = "Compute", row = 2, column = 0,
columnspan = 2, command = self.computeSqrt)
# The event handling method for the button
def computeSqrt(self):
"""Inputs the integer, computes the square root,
and outputs the result."""
number = self.inputField.getNumber()
result = math.sqrt(number)
self.outputField.setNumber(result)
# Instantiate and pop up the window.
if __name__ == "__main__":
NumberFieldDemo().mainloop()
<file_sep>/navigation/gps/gpsreader2.py
import location
import gobject
def on_error(control, error, data):
print "location error: %d... quitting" % error
data.quit()
def on_changed(device, data):
if not device:
return
if device.fix:
if device.fix[1] & location.GPS_DEVICE_LATLONG_SET:
print "lat = %f, long = %f" % device.fix[4:6]
# data.stop() commented out to allow continuous loop for a reliable fix - press ctrl c to break the loop, or program your own way of exiting)
if device.fix[1] & location.GPS_DEVICE_ALTITUDE_SET:
print "alt = %f" % device.fix[7]
print "horizontal accuracy: %f meters" % (device.fix[6] / 100)
# FIXME: not supported yet in Python
#if device.cell_info:
# if device.cell_info[0] & location.GSM_CELL_INFO_SET:
# print "Mobile Country Code GSM: %d" % device.cell_info[1][0]
# if device.cell_info[0] & location.WCDMA_CELL_INFO_SET:
# print "Mobile Country Code WCDMA: %d" % device.cell_info[2][0]
print "Satellites in view: %d, in use: %d" % (device.satellites_in_view, device.satellites_in_use)
def on_stop(control, data):
print "quitting"
data.quit()
def start_location(data):
data.start()
return False
loop = gobject.MainLoop()
control = location.GPSDControl.get_default()
device = location.GPSDevice()
control.set_properties(preferred_method=location.METHOD_USER_SELECTED,
preferred_interval=location.INTERVAL_DEFAULT)
control.connect("error-verbose", on_error, loop)
device.connect("changed", on_changed, control)
control.connect("gpsd-stopped", on_stop, loop)
gobject.idle_add(start_location, control)
loop.run()
<file_sep>/camera/cam-histo.py
#! /usr/bin/env python
import sys
# import the necessary things for OpenCV
from CVtypes import cv
#############################################################################
# definition of some constants
# how many bins we want for the histogram, and their ranges
hdims = 16
hranges = [[0, 180]]
# ranges for the limitation of the histogram
vmin = 10
vmax = 256
smin = 30
# the range we want to monitor
hsv_min = cv.Scalar (0, smin, vmin, 0)
hsv_max = cv.Scalar (180, 256, vmax, 0)
#############################################################################
# some useful functions
def hsv2rgb (hue):
# convert the hue value to the corresponding rgb value
sector_data = [[0, 2, 1],
[1, 2, 0],
[1, 0, 2],
[2, 0, 1],
[2, 1, 0],
[0, 1, 2]]
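    # hue is expected in OpenCV's 0..180 range; dividing by 30 below picks one
    # of six colour sectors, and sector_data decides which RGB channel gets
    # 255, which gets 0 and which gets the interpolated value p.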
hue *= 0.1 / 3
sector = int (hue)
p = int(round (255 * (hue - sector)))
if sector & 1:
p ^= 255
rgb = {}
rgb [sector_data [sector][0]] = 255
rgb [sector_data [sector][1]] = 0
rgb [sector_data [sector][2]] = p
return cv.Scalar (rgb [2], rgb [1], rgb [0], 0)
#############################################################################
# so, here is the main part of the program
if __name__ == '__main__':
# a small welcome
print "OpenCV Python wrapper test"
print "OpenCV version: %s (%d, %d, %d)" % (cv.VERSION,
cv.MAJOR_VERSION,
cv.MINOR_VERSION,
cv.SUBMINOR_VERSION)
# first, create the necessary windows
cv.NamedWindow ('Camera', cv.WINDOW_AUTOSIZE)
cv.NamedWindow ('Histogram', cv.WINDOW_AUTOSIZE)
# move the new window to a better place
cv.MoveWindow ('Camera', 10, 40)
cv.MoveWindow ('Histogram', 10, 270)
try:
# try to get the device number from the command line
device = int (sys.argv [1])
# got it ! so remove it from the arguments
del sys.argv [1]
except (IndexError, ValueError):
# no device number on the command line, assume we want the 1st device
device = 0
if len (sys.argv) == 1:
# no argument on the command line, try to use the camera
capture = cv.CreateCameraCapture (device)
# set the wanted image size from the camera
cv.SetCaptureProperty (capture,
cv.CAP_PROP_FRAME_WIDTH, 320)
cv.SetCaptureProperty (capture,
cv.CAP_PROP_FRAME_HEIGHT,240)
else:
# we have an argument on the command line,
# we can assume this is a file name, so open it
capture = cv.CreateFileCapture (sys.argv [1])
# check that capture device is OK
if not capture:
print "Error opening capture device"
sys.exit (1)
# create an image to put in the histogram
histimg = cv.CreateImage (cv.Size (320,240), 8, 3)
# init the image of the histogram to black
cv.SetZero (histimg)
# capture the 1st frame to get some propertie on it
frame = cv.QueryFrame (capture)
# get some properties of the frame
frame_size = cv.GetSize (frame)
# compute which selection of the frame we want to monitor
selection = cv.Rect (0, 0, frame_size.width, frame_size.height)
    # create some images useful later
hue = cv.CreateImage (frame_size, 8, 1)
mask = cv.CreateImage (frame_size, 8, 1)
hsv = cv.CreateImage (frame_size, 8, 3 )
# create the histogram
hist = cv.CreateHist (1, [hdims], cv.HIST_ARRAY, hranges, 1)
while 1:
# do forever
# 1. capture the current image
frame = cv.QueryFrame (capture)
if frame is None:
# no image captured... end the processing
break
# mirror the captured image
cv.Flip (frame, None, 1)
# compute the hsv version of the image
cv.CvtColor (frame, hsv, cv.BGR2HSV)
# compute which pixels are in the wanted range
cv.InRangeS (hsv, hsv_min, hsv_max, mask)
# extract the hue from the hsv array
cv.Split (hsv, hue, None, None, None)
# select the rectangle of interest in the hue/mask arrays
cv.SetImageROI(hue, selection)
cv.SetImageROI(mask, selection)
# it's time to compute the histogram
cv.CalcHist ([hue], hist, 0, mask)
# clear the ROIs
cv.ResetImageROI(hue)
cv.ResetImageROI(mask)
# extract the min and max value of the histogram
min_val, max_val, min_ndx, max_ndx = cv.GetMinMaxHistValue (hist)
# compute the scale factor
if max_val > 0:
scale = 255. / max_val
else:
scale = 0.
# scale the histograms
cv.ConvertScale (hist[0].bins, hist[0].bins, scale, 0)
# clear the histogram image
cv.SetZero (histimg)
        # compute the width for each bin to display
bin_w = histimg[0].width / hdims
for i in range (hdims):
# for all the bins
# get the value, and scale to the size of the hist image
val = int(round (cv.GetReal1D (hist[0].bins, i)
* histimg[0].height / 255))
# compute the color
color = hsv2rgb (i * 180. / hdims)
# draw the rectangle in the wanted color
cv.Rectangle (histimg,
cv.Point (i * bin_w, histimg[0].height),
cv.Point ((i + 1) * bin_w, histimg[0].height - val),
color, -1, 8, 0)
# we can now display the images
cv.ShowImage ('Camera', frame)
cv.ShowImage ('Histogram', histimg)
# handle events
k = cv.WaitKey (10)
if k == 0x1b:
# user has press the ESC key, so exit
break
<file_sep>/gui/breezy/radiobuttondemo2.py
"""
File: radiobuttondemo2.py
Demonstrates radio button capabilities.
"""
from breezypythongui import EasyFrame
from tkinter import HORIZONTAL
class RadiobuttonDemo(EasyFrame):
"""When the Display button is pressed, shows the label of
the selected radio button. The button group has a
horizontal alignment."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, "Radio Button Demo")
# Add the button group
self.group = self.addRadiobuttonGroup(row = 1, column = 0,
columnspan = 4,
orient = HORIZONTAL)
# Add the radio buttons to the group
self.group.addRadiobutton("Freshman")
self.group.addRadiobutton("Sophomore")
self.group.addRadiobutton("Junior")
defaultRB = self.group.addRadiobutton("Senior")
# Select one of the buttons in the group
self.group.setSelectedButton(defaultRB)
# Output field and command button for the results
self.output = self.addTextField("", row = 0, column = 1)
self.addButton("Display", row = 0, column = 0,
command = self.display)
# Event handler method
def display(self):
"""Displays the selected button's label in the text field."""
self.output.setText(self.group.getSelectedButton()["value"])
# Instantiate and pop up the window.
RadiobuttonDemo().mainloop()
<file_sep>/arduino/battery_meter/battery_meter/battery_meter.ino
//battery meter code for uno arduino
int interval = 1; //sample time interval (1000 (ms) = 1 second)
float bat_high = 13.10;
float bat_low = 11.40;
int samples = 100; //number of samples taken
int analog_read = 0;
int pin_to_read = 1;
int reading_sum_total = 0;
int reading_sum_avg = 0;
float volt = 0;
void setup()
{
Serial.begin(9600);
}
void loop()
{
for(int i = 0; i < samples ; i++)
{
analog_read = analogRead(pin_to_read);
reading_sum_total += analog_read;
delay(10);
}
  reading_sum_avg = reading_sum_total / samples;
  reading_sum_total = 0; //reset the accumulator so the next sampling pass starts fresh
volt = (reading_sum_avg - bat_low) / (bat_high - bat_low);
Serial.print("Voltage: ");
Serial.println (volt);
delay(interval);
}
<file_sep>/navigation/gps/gps_distance.py
import geopy
from geopy.distance import VincentyDistance
def get_dest_gps_cood(lat1, lon1, bearing, distance_in_meters):
# given: lat1, lon1, b = bearing in degrees, d = distance in kilometers
#lat1 = 53.32055555555556
#lat2 = 53.31861111111111
#lon1 = -1.7297222222222221
#lon2 = -1.6997222222222223
d = distance_in_meters / 1000.0
b = bearing
print d, b
origin = geopy.Point(lat1, lon1)
destination = VincentyDistance(kilometers=d).destination(origin, b)
lat2, lon2 = destination.latitude, destination.longitude
return lat2, lon2
if __name__== "__main__":
lat1 = 53.32055555555556
lon1 = -1.7297222222222221
bearing = 270
distance_in_meters = 10.0
pos1 = get_dest_gps_cood(lat1, lon1, bearing, distance_in_meters/2.0)
print "pos1 lat long:", pos1
pos2 = get_dest_gps_cood(pos1[0], pos1[1], 0, distance_in_meters/2.0)
print "pos2 lat long:", pos2
<file_sep>/telemetry/video_stream_client.py
#!/usr/bin/env python
# This is some example code for python-gstreamer.
# It's a gstreamer TCP/IP client that sends a file
# through a tcpclientsink to a server on localhost
import gobject, pygst
pygst.require("0.10")
import gst
# create a pipeline and add [ filesrc ! tcpclientsink ]
pipeline = gst.Pipeline("client")
src = gst.element_factory_make("filesrc", "source")
src.set_property("location", "/home/Downloads/videos/sidewalk1.mp4")
pipeline.add(src)
client = gst.element_factory_make("tcpclientsink", "client")
pipeline.add(client)
client.set_property("host", "127.0.0.1")
client.set_property("port", 3000)
src.link(client)
pipeline.set_state(gst.STATE_PLAYING)
# enter into a mainloop
loop = gobject.MainLoop()
loop.run()
<file_sep>/lib/mobot_heartbeat_publish.py
#!/usr/bin/python
import thread, time, sys
import pika
class publish_heartbeat():
def __init__(self):
self.heartbeat_signal = "X"
#-------------connection variables
self.channel_name = 'heartbeat.1'
self.connection = None
self.channel = None
#----------------------RUN
self.run()
def connect(self):
self.connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
self.channel = self.connection.channel()
self.channel.exchange_declare(exchange='mobot_data_feed',type='topic')
def publish(self, data):
self.channel.basic_publish(exchange='mobot_data_feed',
routing_key=self.channel_name, body=data)
def run(self):
self.connect()
self.th = thread.start_new_thread(self.send_heartbeat, ())
def send_heartbeat(self):
while True:
time.sleep(0.1) # send 10 times per second
self.publish(self.heartbeat_signal)
if __name__== "__main__":
    heartbeat = publish_heartbeat()
while True:
time.sleep(0.00001)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_svm_sigmoidal.py
import numpy
from milk.supervised import svm
import numpy
import numpy as np
def old_learn_sigmoid_constants(F,Y,
max_iters=None,
min_step=1e-10,
sigma=1e-12,
eps=1e-5):
'''
Old version. Direct C-like implementation
'''
# the deci[i] array is called F[i] in this code
F = np.asanyarray(F)
Y = np.asanyarray(Y)
assert len(F) == len(Y)
assert numpy.all( (Y == 1) | (Y == 0) )
from numpy import log, exp
N=len(F)
if max_iters is None: max_iters = 1000
prior1 = Y.sum()
prior0 = N-prior1
small_nr = 1e-4
hi_t = (prior1+1.)/(prior1+2.)
lo_t = 1./(prior0+2.)
T = Y*hi_t + (1-Y)*lo_t
A = 0.
B = log( (prior0+1.)/(prior1+1.) )
def target(A,B):
fval = 0.
for i in xrange(N):
fApB = F[i]*A+B
if fApB >= 0:
fval += T[i]*fApB+log(1+exp(-fApB))
else:
fval += (T[i]-1.)*fApB+log(1+exp(fApB))
return fval
fval = target(A,B)
for iter in xrange(max_iters):
h11=sigma
h22=sigma
h21=0.
g1=0.
g2=0.
for i in xrange(N):
fApB = F[i]*A+B
if (fApB >= 0):
p = exp(-fApB)/(1.+exp(-fApB))
q = 1./(1.+exp(-fApB))
else:
p = 1./(1.+exp(fApB))
q = exp(fApB)/(1.+exp(fApB))
d2 = p * q
h11 += F[i]*F[i]*d2
h22 += d2
h21 += F[i]*d2
d1 = T[i] - p
g1 += F[i]*d1
g2 += d1
if abs(g1) < eps and abs(g2) < eps: # Stopping criteria
break
det = h11*h22 - h21*h21
dA = - (h22*g1 - h21*g2)/det
dB = - (h21*g1 + h11*g2)/det
gd = g1*dA + g2*dB
stepsize = 1.
while stepsize >= min_step:
newA = A + stepsize*dA
newB = B + stepsize*dB
newf = target(newA,newB)
if newf < fval+eps*stepsize*gd:
A = newA
B = newB
fval = newf
break
stepsize /= 2
if stepsize < min_step:
break
return A,B
def test_learn_sigmoid_constants():
Y = np.repeat((0,1),100)
np.random.seed(3)
for i in xrange(10):
F = np.random.rand(200)-.3
F[100:] *= -1
old = old_learn_sigmoid_constants(F,Y)
new = svm.learn_sigmoid_constants(F,Y)
assert np.allclose(old, new)
<file_sep>/PhidgetsPython/Python/Dictionary-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'July 14 2010'
#Basic imports
from ctypes import *
import sys
from time import sleep
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import ErrorEventArgs, KeyChangeEventArgs, ServerConnectArgs, ServerDisconnectArgs
from Phidgets.Dictionary import Dictionary, DictionaryKeyChangeReason, KeyListener
#Create a Dictionary object and a key listener object
try:
dictionary = Dictionary()
keyListener = KeyListener(dictionary, ".*")
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Event Handler Callback Functions
def DictionaryError(e):
print("Dictionary Error %i: %s" % (e.eCode, e.description))
return 0
def DictionaryServerConnected(e):
print("Dictionary connected to server %s" % (e.device.getServerAddress()))
try:
keyListener.start()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
return 0
def DictionaryServerDisconnected(e):
print("Dictionary disconnected from server")
try:
keyListener.stop()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
return 0
def KeyChanged(e):
if e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_VALUE_CHANGED:
reason = "Value Changed"
elif e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_ENTRY_ADDED:
reason = "Entry Added"
elif e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_ENTRY_REMOVING:
reason = "Entry Removed"
elif e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_CURRENT_VALUE:
reason = "Current Value"
print("%s -- Key: %s -- Value: %s" % (reason, e.key, e.value))
return 0
def KeyRemoved(e):
if e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_VALUE_CHANGED:
reason = "Value Changed"
elif e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_ENTRY_ADDED:
reason = "Entry Added"
elif e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_ENTRY_REMOVING:
reason = "Entry Removed"
elif e.reason == DictionaryKeyChangeReason.PHIDGET_DICTIONARY_CURRENT_VALUE:
reason = "Current Value"
print("%s -- Key: %s -- Value: %s" % (reason, e.key, e.value))
return 0
#Main Program Code
try:
dictionary.setErrorHandler(DictionaryError)
dictionary.setServerConnectHandler(DictionaryServerConnected)
dictionary.setServerDisconnectHandler(DictionaryServerDisconnected)
keyListener.setKeyChangeHandler(KeyChanged)
keyListener.setKeyRemovalListener(KeyRemoved)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening Dictionary object....")
try:
dictionary.openRemoteIP("adam.local", 5001)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
try:
while dictionary.isAttachedToServer() == False:
pass
else:
print("Connected: %s" % (dictionary.isAttachedToServer()))
print("Server: %s:%s" % (dictionary.getServerAddress(), dictionary.getServerPort()))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
try:
print("Now we'll add some keys...")
sleep(1)
dictionary.addKey("test1", "ok")
dictionary.addKey("test2", "ok", True)
dictionary.addKey("test3", "ok", False)
dictionary.addKey("test4", "ok", True)
sleep(2)
print("Now we will test for key 'test2' being in the dictionary.")
sleep(1)
value = dictionary.getKey("test2")
print("Key: test2 Value: %s" % (value))
print("Now we will remove one of the keys...")
sleep(1)
dictionary.removeKey("test4")
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
keyListener.stop()
dictionary.closeDictionary()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)
<file_sep>/gui/tkinter/sonar_functions.py~
#!/usr/bin/env python
import serial
import sys, time
from threading import Thread
import numpy as np
import matplotlib
matplotlib.use('agg')
from matplotlib.pyplot import figure, show, rc
import matplotlib.pyplot as P
from pylab import *
import Image
import cv
from maxsonar_class import *
import gc
###########################################################
def CVtoPIL_4Channel(CV_img):
"""converts CV image to PIL image"""
    cv_img = cv.CreateMatHeader(cv.GetSize(CV_img)[1], cv.GetSize(CV_img)[0], cv.CV_8UC1)
    #cv.SetData(cv_img, pil_img.tostring())
    pil_img = Image.fromstring("L", cv.GetSize(CV_img), CV_img.tostring())
    return pil_img
###########################################################
###########################################################
def PILtoCV_4Channel(PIL_img):
cv_img = cv.CreateImageHeader(PIL_img.size, cv.IPL_DEPTH_8U, 4)
cv.SetData(cv_img, PIL_img.tostring())
return cv_img
###########################################################
def fig2img ( fig ):
# put the figure pixmap into a numpy array
buf = fig2data ( fig )
w, h, d = buf.shape
fig_return = Image.fromstring( "RGBA", ( w ,h ), buf.tostring( ) )
buf = 0
return fig_return
###########################################################
def fig2data ( fig ):
"""
@brief Convert a Matplotlib figure to a 4D numpy array with RGBA channels and return it
@param fig a matplotlib figure
@return a numpy 3D array of RGBA values
"""
# draw the renderer
fig.canvas.draw ( )
# Get the RGBA buffer from the figure
w,h = fig.canvas.get_width_height()
buf = np.fromstring ( fig.canvas.tostring_argb(), dtype=np.uint8 )
buf.shape = ( w, h,4 )
# canvas.tostring_argb give pixmap in ARGB mode. Roll the ALPHA channel to have it in RGBA mode
buf = np.roll ( buf, 3, axis = 2 )
return buf
###########################################################
def sonar_graph(ping_readings):
#print "ping reading:", ping_readings
#print type(ping_readings[1])
# force square figure and square axes looks better for polar, IMO
fig = figure(figsize=(3.8,3.8))
ax = P.subplot(1, 1, 1, projection='polar')
P.rgrids([28, 61, 91])
ax.set_theta_zero_location('N')
ax.set_theta_direction(-1)
theta = 356
angle = theta * np.pi / 180.0
radii = [ping_readings[0]]
width = .15
bars1 = ax.bar(0, 100, width=0.001, bottom=0.0)
#print "theta, radii, width: ", theta, radii, width
bars = ax.bar(angle, radii, width=width, bottom=0.0, color='blue')
theta = 86
angle = theta * np.pi / 180.0
radii = [ping_readings[1]]
width = .15
bars = ax.bar(angle, radii, width=width, bottom=0.0, color='blue')
theta = 176
angle = theta * np.pi / 180.0
radii = [ping_readings[2]]
width = .15
bars = ax.bar(angle, radii, width=width, bottom=0.0, color='blue')
theta = 266
angle = theta * np.pi / 180.0
radii = [ping_readings[3]]
width = .15
bars = ax.bar(angle, radii, width=width, bottom=0.0, color='blue')
#print "finshed graph"
#pil_img = fig2img(fig)
#sonar_image = pil_img
#print type(pil_img), pil_img
#sonar_image = PILtoCV_4Channel(pil_img)
#cv.ShowImage("Sonar", sonar_image )
#cv.MoveWindow ('Sonar',50 ,50 )
#time.sleep(.01)
#cv.WaitKey(10)
fig.savefig('sonar_image.png')
#Image.open('sonar_image.png').save('sonar_image.jpg','JPEG')
#print "finished saving"
#stop
#garbage cleanup
#fig.clf()
P.close(fig)
#gc.collect()
#del fig
return
def process_sonar_data(sonar_data):
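    """Parses a MaxSonar serial line of the form 's1:<n>s2:<n>...s5:<n>' and
    returns the five distances as a list of ints (nothing is returned if the
    line is too short to parse)."""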
#global sensor1
#data = str(sensor1.distances_cm())
data = sonar_data
print "sonar data from inside sonar_functions", sonar_data
if len(data) > 1:
#print "data=", data
#s1_data = re.search('s1', data)
#print s1_data.span()
s1_data = int(data[(data.find('s1:')+3):(data.find('s2:'))])
s2_data = int(data[(data.find('s2:')+3):(data.find('s3:'))])
s3_data = int(data[(data.find('s3:')+3):(data.find('s4:'))])
s4_data = int(data[(data.find('s4:')+3):(data.find('s5:'))])
s5_data = int(data[(data.find('s5:')+3):(len(data))])
print s1_data, s2_data, s3_data, s4_data, s5_data
data2 = []
data2.append(s1_data)
data2.append(s2_data)
data2.append(s3_data)
data2.append(s4_data)
data2.append(s5_data)
#s1_data = int(s1_data)
#if s1_data > 91: s1_data = 91
#if s1_data < 0: s1_data = 0
#print "s1:", s1_data
#sonar_img = sonar_graph(data2)
return data2
#time.sleep(.01)
if __name__== "__main__":
#t = Thread(target=sonar_display)
#t.start()
sensor1 = MaxSonar()
time.sleep(1)
sonar_display(sensor1)
time.sleep(1)
for i in range(1000):
sonar_display(sensor1)
time.sleep(0.01)
<file_sep>/gui/web_gui/sitetest2/img_processing_tools.py
#!/usr/bin/env python
from PIL import Image
import numpy as np
from PIL import ImageStat
import mahotas
import numpy as np
import cv
import matplotlib.pyplot as plt
from multiprocessing import Process
import time
import os
def plot_rgb_histogram(img):
# RGB Hitogram
# This script will create a histogram image based on the RGB content of
# an image. It uses PIL to do most of the donkey work but then we just
# draw a pretty graph out of it.
#
# May 2009, <NAME>, www.scottmcdonough.co.uk
#
import Image, ImageDraw
imagepath = "mZXN_1979" # The image to build the histogram of
histHeight = 120 # Height of the histogram
histWidth = 256 # Width of the histogram
multiplerValue = 1 # The multiplier value basically increases
                   # the histogram height so that low values
# are easier to see, this in effect chops off
# the top of the histogram.
showFstopLines = True # True/False to hide outline
fStopLines = 5
# Colours to be used
backgroundColor = (51,51,51) # Background color
lineColor = (102,102,102) # Line color of fStop Markers
red = (255,60,60) # Color for the red lines
green = (51,204,51) # Color for the green lines
blue = (0,102,255) # Color for the blue lines
##################################################################################
#img = Image.open(imagepath)
hist = img.histogram()
histMax = max(hist) # comon color
xScale = float((histWidth))/len(hist) # xScaling
yScale = float((histHeight))/histMax # yScaling
im = Image.new("RGBA", ((histWidth*multiplerValue), (histHeight*multiplerValue)), backgroundColor)
draw = ImageDraw.Draw(im)
# Draw Outline is required
if showFstopLines:
xmarker = histWidth/fStopLines
x =0
for i in range(1,fStopLines+1):
draw.line((x, 0, x, histHeight), fill=lineColor)
x+=xmarker
draw.line((histWidth-1, 0, histWidth-1, 200), fill=lineColor)
draw.line((0, 0, 0, histHeight), fill=lineColor)
# Draw the RGB histogram lines
x=0; c=0;
for i in hist:
if int(i)==0: pass
else:
color = red
if c>255: color = green
if c>511: color = blue
draw.line((x, histHeight, x, histHeight-(i*yScale)), fill=color)
if x>255: x=0
else: x+=1
c+=1
# Now save and show the histogram
im.save('histogram.png', 'PNG')
#im.show()
return im
###########################################################
def image2array(img):
"""given an image, returns an array. i.e. create array of image using numpy """
return np.asarray(img)
###########################################################
def array2image(arry):
"""given an array, returns an image. i.e. create image using numpy array """
#Create image from array
return Image.fromarray(arry)
###########################################################
def PILtoCV(PIL_img):
cv_img = cv.CreateImageHeader(PIL_img.size, cv.IPL_DEPTH_8U, 3)
cv.SetData(cv_img, PIL_img.tostring())
return cv_img
###########################################################
def CVtoPIL(img):
"""converts CV image to PIL image"""
cv_img = cv.CreateMatHeader(cv.GetSize(img)[1], cv.GetSize(img)[0], cv.CV_8UC1)
#cv.SetData(cv_img, pil_img.tostring())
pil_img = Image.fromstring("L", cv.GetSize(img), img.tostring())
return pil_img
###########################################################
def CalcHistogram(img):
#calc histogram of green band
bins = np.arange(0,256)
hist1 = image2array(img)
H, xedges = np.histogram(np.ravel(hist1), bins=bins, normed=False)
return H
def WriteMeterics(image, classID, data_filename):
if len(image.getbands()) == 3:
#write class data to file
f_handle = open(data_filename, 'a')
f_handle.write(str(classID))
f_handle.write(', ')
f_handle.close()
#calculate LBP histogram on raw image
np_img = np.array(image)
lbp1 = mahotas.features.lbp(np_img, 1, 8, ignore_zeros=False)
#print lbp1.ndim, lbp1.size
print "LBP Histogram: ", lbp1
print "LBP Length:", len(lbp1)
f_handle = open(data_filename, 'a')
for i in range(len(lbp1)):
f_handle.write(str(lbp1[i]))
f_handle.write(" ")
f_handle.write(',')
f_handle.close()
print "Image has multiple color bands...Splitting Bands...."
Red_Band, Green_Band, Blue_Band = image.split()
print "Calculating Histogram for I3 pixels of image..."
I3_Histogram = CalcHistogram(Green_Band)
#gaussian_numbers = normal(size=1000)
plt.hist(I3_Histogram, bins=32)
plt.title("I3 Histogram")
plt.xlabel("Value")
plt.ylabel("Frequency")
#plt.show(block=False)
plt.savefig("out.png")
plt.clf()
cv_image = cv.LoadImage("out.png")
cv.ShowImage("I3 Histogram", cv_image)
cv.MoveWindow ('I3 Histogram',705 ,50 )
time.sleep(.1)
cv.WaitKey(100)
#p = Process(target=plot_graph, args=([I3_Histogram],))
#p.start()
#p.join()
#save I3 Histogram to file in certain format
print "saving I3 histogram to dictionary..."
f_handle = open(data_filename, 'a')
for i in range(len(I3_Histogram)):
f_handle.write(str(I3_Histogram[i]))
f_handle.write(" ")
f_handle.write(',')
f_handle.close()
#calculate RGB histogram on raw image
rgb_histo = image.histogram()
print "saving RGB histogram to dictionary..."
f_handle = open(data_filename, 'a')
for i in range(len(rgb_histo)):
f_handle.write(str(rgb_histo[i]))
f_handle.write(" ")
f_handle.write(',')
f_handle.close()
#calculate I3 meterics
I3_sum = ImageStat.Stat(image).sum
I3_sum2 = ImageStat.Stat(image).sum2
I3_median = ImageStat.Stat(image).median
I3_mean = ImageStat.Stat(image).mean
I3_var = ImageStat.Stat(image).var
I3_stddev = ImageStat.Stat(image).stddev
I3_rms = ImageStat.Stat(image).rms
print "saving I3 meterics to dictionary..."
f_handle = open(data_filename, 'a')
print "sum img1_I3: ", I3_sum[1]
print "sum2 img1_I3: ", I3_sum2[1]
print "median img1_I3: ", I3_median[1]
print "avg img1_I3: ", I3_mean[1]
print "var img1_I3: ", I3_var[1]
print "stddev img1_I3: ", I3_stddev[1]
print "rms img1_I3: ", I3_rms[1]
#print "extrema img1_I3: ", ImageStat.Stat(img1_I3).extrema
#print "histogram I3: ", len(img1_I3.histogram())
f_handle.write(str(I3_sum[1]))
f_handle.write(",")
f_handle.write(str(I3_sum2[1]))
f_handle.write(",")
f_handle.write(str(I3_median[1]))
f_handle.write(",")
f_handle.write(str(I3_mean[1]))
f_handle.write(",")
f_handle.write(str(I3_var[1]))
f_handle.write(",")
f_handle.write(str(I3_stddev[1]))
f_handle.write(",")
f_handle.write(str(I3_rms[1]))
#f_handle.write(",")
f_handle.write('\n')
f_handle.close()
else:
print "image not valid for processing: ", filename1
time.sleep(5)
return
def rgbToI3(r, g, b):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
i3 = ((2*g)-r-b)/2
return i3
def rgb2I3 (img):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
xmax = img.size[0]
ymax = img.size[1]
#make a copy to return
returnimage = Image.new("RGB", (xmax,ymax))
imagearray = img.load()
for y in range(0, ymax, 1):
for x in range(0, xmax, 1):
rgb = imagearray[x, y]
i3 = ((2*rgb[1])-rgb[0]-rgb[2]) / 2
#print rgb, i3
returnimage.putpixel((x,y), (0,i3,0))
return returnimage
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/_som.cpp
#include <limits>
#include <iostream>
#include <cstdlib>
extern "C" {
#include <Python.h>
#include <numpy/ndarrayobject.h>
}
namespace {
struct SOM_Exception {
SOM_Exception(const char* msg): msg(msg) { }
const char* msg;
};
void assert_type_contiguous(PyArrayObject* array,int type) {
if (!PyArray_Check(array) ||
PyArray_TYPE(array) != type ||
!PyArray_ISCONTIGUOUS(array)) {
throw SOM_Exception("Arguments to putpoints don't conform to expectation. Are you calling this directly? This is an internal function!");
}
}
void putpoints(PyArrayObject* grid, PyArrayObject* points, float L, int radius) {
if (PyArray_NDIM(grid) != 3) throw SOM_Exception("grid should be three dimensional");
if (PyArray_NDIM(points) != 2) throw SOM_Exception("points should be two dimensional");
const int rows = PyArray_DIM(grid, 0);
const int cols = PyArray_DIM(grid, 1);
const int d = PyArray_DIM(grid, 2);
const int n = PyArray_DIM(points, 0);
if (PyArray_DIM(points, 1) != d) throw SOM_Exception("second dimension of points is not third dimension of grid");
Py_BEGIN_ALLOW_THREADS
for (int i = 0; i != n; i++){
const float* p = static_cast<float*>(PyArray_GETPTR1(points, i));
int min_y = 0;
int min_x = 0;
float best = std::numeric_limits<float>::max();
for (int y = 0; y != rows; ++y) {
for (int x = 0; x != cols; ++x) {
float dist = 0.;
const float* gpoint = static_cast<float*>(PyArray_GETPTR2(grid, y, x));
for (int j = 0; j != d; ++j) {
dist += (p[j] - gpoint[j])*(p[j] - gpoint[j]);
}
if (dist < best) {
best = dist;
min_y = y;
min_x = x;
}
}
}
const int start_y = std::max(0, min_y - radius);
const int start_x = std::max(0, min_x - radius);
const int end_y = std::min(rows, min_y + radius);
        const int end_x = std::min(cols, min_x + radius);
for (int y = start_y; y != end_y; ++y) {
for (int x = start_x; x != end_x; ++x) {
const float L2 = L /(1 + std::abs(min_y - y) + std::abs(min_x - x));
float* gpoint = static_cast<float*>(PyArray_GETPTR2(grid, y, x));
for (int j = 0; j != d; ++j) {
gpoint[j] *= (1.-L2);
gpoint[j] += L2 * p[j];
}
}
}
}
Py_END_ALLOW_THREADS
}
PyObject* py_putpoints(PyObject* self, PyObject* args) {
try {
PyArrayObject* grid;
PyArrayObject* points;
float L;
int radius;
if (!PyArg_ParseTuple(args, "OOfi", &grid, &points, &L, &radius)) {
const char* errmsg = "Arguments were not what was expected for putpoints.\n"
"This is an internal function: Do not call directly unless you know exactly what you're doing.\n";
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
assert_type_contiguous(grid, NPY_FLOAT);
assert_type_contiguous(points, NPY_FLOAT);
putpoints(grid, points, L, radius);
Py_RETURN_NONE;
} catch (const SOM_Exception& exc) {
PyErr_SetString(PyExc_RuntimeError,exc.msg);
return 0;
} catch (...) {
PyErr_SetString(PyExc_RuntimeError,"Some sort of exception in putpoints.");
return 0;
}
}
PyMethodDef methods[] = {
{"putpoints", py_putpoints, METH_VARARGS , "Do NOT call directly.\n" },
{NULL, NULL,0,NULL},
};
const char * module_doc =
"Internal SOM Module.\n"
"\n"
"Do NOT use directly!\n";
} // namespace
extern "C"
void init_som()
{
import_array();
(void)Py_InitModule3("_som", methods, module_doc);
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/base.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
class supervised_model(object):
def apply_many(self, fs):
'''
labels = model.apply_many( examples )
This is equivalent to ``map(model.apply, examples)`` but may be
implemented in a faster way.
Parameters
----------
examples : sequence of training examples
Returns
-------
labels : sequence of labels
'''
return map(self.apply, fs)
class base_adaptor(object):
def __init__(self, base):
self.base = base
def set_option(self, k, v):
self.base.set_option(k, v)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_knn.py
import numpy as np
import milk.supervised.knn
import numpy
def test_simple():
X=np.array([
[0,0,0],
[1,1,1],
])
Y=np.array([ 1, -1 ])
kNN = milk.supervised.knn.kNN(1)
kNN = kNN.train(X,Y)
assert kNN.apply(X[0]) == Y[0]
assert kNN.apply(X[1]) == Y[1]
assert kNN.apply([0,0,1]) == Y[0]
assert kNN.apply([0,1,1]) == Y[1]
def test_nnclassifier():
labels=[0,1]
data=[[0.,0.],[1.,1.]]
C = milk.supervised.knn.kNN(1)
model = C.train(data,labels)
assert model.apply(data[0]) == 0
assert model.apply(data[1]) == 1
assert model.apply([.01,.01]) == 0
assert model.apply([.99,.99]) == 1
assert model.apply([100,100]) == 1
assert model.apply([-100,-100]) == 0
assert model.apply([.9,.9]) == 1
middle = model.apply([.5,.5])
assert (middle == 0) or (middle == 1)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/_kmeans.cpp
#include <algorithm>
extern "C" {
#include <Python.h>
#include <numpy/ndarrayobject.h>
}
namespace {
struct Kmeans_Exception {
Kmeans_Exception(const char* msg): msg(msg) { }
const char* msg;
};
void assert_type_contiguous(PyArrayObject* array,int type) {
if (!PyArray_Check(array) ||
PyArray_TYPE(array) != type ||
!PyArray_ISCONTIGUOUS(array)) {
throw Kmeans_Exception("Arguments to kmeans don't conform to expectation. Are you calling this directly? This is an internal function!");
}
}
template <typename ftype>
int computecentroids(ftype* f, ftype* centroids, PyArrayObject* a_assignments, PyArrayObject* a_counts, const int N, const int Nf, const int k) {
int zero_counts = 0;
Py_BEGIN_ALLOW_THREADS
const npy_int32* assignments = static_cast<npy_int32*>(PyArray_DATA(a_assignments));
npy_int32* counts = static_cast<npy_int32*>(PyArray_DATA(a_counts));
std::fill(counts, counts + k, 0);
std::fill(centroids, centroids + k*Nf, 0.0);
for (int i = 0; i != N; i++){
const int c = assignments[i];
if (c >= k) continue; // throw Kmeans_Exception("wrong value in assignments");
ftype* ck = centroids + Nf*c;
for (int j = 0; j != Nf; ++j) {
*ck++ += *f++;
}
++counts[c];
}
for (int i = 0; i != k; ++i) {
ftype* ck = centroids + Nf*i;
const ftype c = counts[i];
if (c == 0) {
++zero_counts;
} else {
for (int j = 0; j != Nf; ++j) {
*ck++ /= c;
}
}
}
Py_END_ALLOW_THREADS
return zero_counts;
}
PyObject* py_computecentroids(PyObject* self, PyObject* args) {
try {
PyArrayObject* fmatrix;
PyArrayObject* centroids;
PyArrayObject* assignments;
PyArrayObject* counts;
if (!PyArg_ParseTuple(args, "OOOO", &fmatrix, ¢roids, &assignments, &counts)) { throw Kmeans_Exception("Wrong number of arguments for computecentroids."); }
if (!PyArray_Check(fmatrix) || !PyArray_ISCONTIGUOUS(fmatrix)) throw Kmeans_Exception("fmatrix not what was expected.");
if (!PyArray_Check(centroids) || !PyArray_ISCONTIGUOUS(centroids)) throw Kmeans_Exception("centroids not what was expected.");
if (!PyArray_Check(counts) || !PyArray_ISCONTIGUOUS(counts)) throw Kmeans_Exception("counts not what was expected.");
if (!PyArray_Check(assignments) || !PyArray_ISCONTIGUOUS(assignments)) throw Kmeans_Exception("assignments not what was expected.");
if (PyArray_TYPE(counts) != NPY_INT32) throw Kmeans_Exception("counts should be int32.");
//if (PyArray_TYPE(assignments) != NPY_INT32) throw Kmeans_Exception("assignments should be int32.");
if (PyArray_TYPE(fmatrix) != PyArray_TYPE(centroids)) throw Kmeans_Exception("centroids and fmatrix should have same type.");
if (PyArray_NDIM(fmatrix) != 2) throw Kmeans_Exception("fmatrix should be two dimensional");
if (PyArray_NDIM(centroids) != 2) throw Kmeans_Exception("centroids should be two dimensional");
if (PyArray_NDIM(assignments) != 1) throw Kmeans_Exception("assignments should be one dimensional");
const int N = PyArray_DIM(fmatrix, 0);
const int Nf = PyArray_DIM(fmatrix, 1);
const int k = PyArray_DIM(centroids, 0);
if (PyArray_DIM(centroids, 1) != Nf) throw Kmeans_Exception("centroids has wrong number of features.");
if (PyArray_DIM(assignments, 0) != N) throw Kmeans_Exception("assignments has wrong size.");
if (PyArray_DIM(counts, 0) != k) throw Kmeans_Exception("counts has wrong size.");
switch (PyArray_TYPE(fmatrix)) {
#define TRY_TYPE(code, ftype) \
case code: \
if (computecentroids<ftype>( \
static_cast<ftype*>(PyArray_DATA(fmatrix)), \
static_cast<ftype*>(PyArray_DATA(centroids)), \
assignments, \
counts, \
N, Nf, k)) { \
Py_RETURN_TRUE; \
} \
Py_RETURN_FALSE;
TRY_TYPE(NPY_FLOAT, float);
TRY_TYPE(NPY_DOUBLE, double);
}
throw Kmeans_Exception("Cannot handle this type.");
} catch (const Kmeans_Exception& exc) {
PyErr_SetString(PyExc_RuntimeError,exc.msg);
return 0;
} catch (...) {
PyErr_SetString(PyExc_RuntimeError,"Some sort of exception in computecentroids.");
return 0;
}
}
PyMethodDef methods[] = {
{"computecentroids", py_computecentroids, METH_VARARGS , "Do NOT call directly.\n" },
{NULL, NULL,0,NULL},
};
const char * module_doc =
"Internal _kmeans Module.\n"
"\n"
"Do NOT use directly!\n";
} // namespace
extern "C"
void init_kmeans()
{
import_array();
(void)Py_InitModule3("_kmeans", methods, module_doc);
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_pca.py
import numpy.random
import milk.unsupervised.pca
import numpy as np
def test_pca():
numpy.random.seed(123)
X = numpy.random.rand(10,4)
X[:,1] += numpy.random.rand(10)**2*X[:,0]
X[:,1] += numpy.random.rand(10)**2*X[:,0]
X[:,2] += numpy.random.rand(10)**2*X[:,0]
Y,V = milk.unsupervised.pca(X)
Xn = milk.unsupervised.normalise.zscore(X)
assert X.shape == Y.shape
assert ((np.dot(V[:4].T,Y[:,:4].T).T-Xn)**2).sum()/(Xn**2).sum() < .3
def test_mds():
from milk.unsupervised import pdist
np.random.seed(232)
for _ in xrange(12):
features = np.random.random_sample((12,4))
X = milk.unsupervised.mds(features,4)
D = pdist(features)
D2 = pdist(X)
assert np.mean( (D - D2) ** 2) < 10e-4
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/nnmf/lee_seung.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2010, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy as np
from numpy import dot
from ...utils import get_nprandom
__all__ = ['nnmf']
def nnmf(V, r, cost='norm2', max_iter=int(1e4), tol=1e-8, R=None):
'''
A,S = nnmf(X, r, cost='norm2', tol=1e-8, R=None)
Implement Lee & Seung's algorithm
Parameters
----------
V : 2-ndarray
input matrix
r : integer
nr of latent features
cost : one of:
'norm2' : minimise || X - AS ||_2 (default)
'i-div' : minimise D(X||AS), where D is I-divergence (generalisation of K-L divergence)
max_iter : integer, optional
maximum number of iterations (default: 10000)
tol : double
tolerance threshold for early exit (when the update factor is with tol
of 1., the function exits)
R : integer, optional
random seed
Returns
-------
A : 2-ndarray
S : 2-ndarray
Reference
---------
"Algorithms for Non-negative Matrix Factorization"
by <NAME>, <NAME>
(available at http://citeseer.ist.psu.edu/lee01algorithms.html)
'''
# Nomenclature inside the function follows Lee & Seung (W, H); the public interface above calls the same matrices A, S.
eps = 1e-8
n,m = V.shape
R = get_nprandom(R)
W = R.standard_normal((n,r))**2
H = R.standard_normal((r,m))**2
for i in xrange(max_iter):
if cost == 'norm2':
updateH = dot(W.T,V)/(dot(dot(W.T,W),H)+eps)
H *= updateH
updateW = dot(V,H.T)/(dot(W,dot(H,H.T))+eps)
W *= updateW
elif cost == 'i-div':
raise NotImplementedError,'I-Div not implemented in lee_seung.nnmf'
if True or (i % 10) == 0:
max_update = max(updateW.max(),updateH.max())
if abs(1.-max_update) < tol:
break
return W,H
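# --- Hedged usage sketch (added by the editor, not part of the original milk code) ---
# Mirrors the call used in milk's own tests, where this function is exposed as
# milk.unsupervised.lee_seung:
#
#     import numpy as np
#     import milk.unsupervised
#     X = np.abs(np.random.rand(20, 8))             # non-negative data matrix
#     W, H = milk.unsupervised.lee_seung(X, 3, R=7) # 3 latent features, fixed seed
#     err = np.sum((np.dot(W, H) - X) ** 2)         # reconstruction error to monitor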
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_grouped.py
import numpy as np
import milk.supervised.svm
from milk.supervised.svm import rbf_kernel
import milk.supervised.multi
import milk.supervised.grouped
from milk.supervised.classifier import ctransforms
import milksets.wine
def group(features, labels, step):
N = len(labels)
i = 0
gfeatures = []
glabels = []
while i < N:
next = i + step
while next > N or labels[next-1] != labels[i]: next -= 1
gfeatures.append(features[i:next])
glabels.append(labels[i])
i = next
return gfeatures, glabels
def test_voting():
base = ctransforms(milk.supervised.svm.svm_raw(C=2.,kernel=milk.supervised.svm.rbf_kernel(2.**-3)),milk.supervised.svm.svm_binary())
base = milk.supervised.multi.one_against_rest(base)
features,labels = milksets.wine.load()
gfeatures, glabels = group(features, labels, 3)
learner = milk.supervised.grouped.voting_classifier(base)
learner.train(gfeatures, glabels)
model = learner.train(gfeatures, glabels)
assert ([model.apply(f) for f in gfeatures] == np.array(glabels)).mean() > .8
def test_filter_outliers():
np.random.seed(22)
features = [np.random.randn(10,10) for i in xrange(20)]
for f in features:
f[0] *= 10
trainer = milk.supervised.grouped.filter_outliers(.9)
model = trainer.train(features, [0] * len(features))
for f in features:
ff = model.apply(f)
assert np.all(ff == f[1:])
def test_nfoldcrossvalidation():
np.random.seed(22)
features = np.array([np.random.rand(8+(i%3), 12)*(i//20) for i in xrange(40)], dtype=object)
labels = np.zeros(40, int)
labels[20:] = 1
classifier = milk.supervised.grouped.voting_classifier(milk.supervised.svm_simple(C=1., kernel=rbf_kernel(1./12)))
cmat, names = milk.nfoldcrossvalidation(features, labels, classifier=classifier)
assert cmat.shape == (2,2)
assert sorted(names) == range(2)
class identity_classifier(object):
def train(self, features, labels):
return identity_model()
class identity_model(object):
def apply(self, f):
return f
def test_meanclassif():
gfeatures = [np.arange(10), np.arange(10)%2]
glabels = [0,1]
meanclassif = milk.supervised.grouped.mean_classifier(identity_classifier())
model = meanclassif.train(gfeatures, glabels)
assert model.apply(gfeatures[0]) == np.arange(10).mean()
assert model.apply(gfeatures[1]) == .5
<file_sep>/lib/robomow_motor_class_arduino.py
#!/usr/bin/env python
import serial
import time
import os
from serial.tools import list_ports
def log(*msgline):
for msg in msgline:
print msg,
print
import threading
# need to find the best way to search serial ports to locate the device
class robomow_motor(object):
def __init__(self):
self.isConnected = False
self._should_stop = threading.Event()
self._data = 0
self.port = None
self.motor1_speed = 0
self.motor2_speed = 0
self.com = self.open_serial_port()
def open_serial_port(self):
while self.isConnected == False:
print "class robomow_motor: searching serial ports for motor controller package..."
for i in range(len(list_serial_ports())):
com = "/dev/ttyACM"
com = com[0:11] + str(i)
print "class robomow_motor: searching on COM:", com
time.sleep(.5)
try:
ser = serial.Serial(com, 57600, timeout=0.2)
data = ser.readline()
#print "data=", int(data[3:(len(data)-1)])
if data[0:2] == "m1":
#ser.write("X\n") # write a string
print "class robomow_motor: found robomow_motor package on COM: ", com
self.isConnected = True
time.sleep(.35)
break
except:
pass
if self.isConnected == False:
print "class robomow_motor: robomow_motor package not found!"
time.sleep(1)
#print "returning", ser
return ser
def __del__(self):
del(self)
def com_stats(self):
#print self.com
return (self.com.port,self.com.baudrate,self.com.bytesize,self.com.parity,self.com.stopbits)
def forward(self,speed):
##takes desired speed as percentage
#self.com.flushInput()
command_str = ("FW"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
#self.motor_stats()
def reverse(self,speed):
##takes desired speed as percentage
#self.com.flushInput()
command_str = ("RV"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
#self.motor_stats()
def stop(self):
##takes desired speed as percentage
command_str = ("FW"+str(0))
#validate_command(self, command_str)
send_command(self, command_str)
#self.motor_stats()
def spin_left(self, speed):
##takes desired speed as percentage
command_str = ("SL"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
#self.motor_stats()
def spin_right(self, speed):
##takes desired speed as percentage
command_str = ("SR"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
#self.motor_stats()
def right(self, speed):
##takes desired speed as percentage
command_str = ("M2"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
#self.motor_stats()
def left(self, speed):
##takes desired speed as percentage
command_str = ("M1"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
#self.motor_stats()
def motor_stats(self):
##takes desired speed as percentage
cmd = ("SP")
successful = False
for n in range (5):
time.sleep(0.05)
self.com.flushOutput()
#print "sending to motor arduino:", cmd
self.com.write (cmd)
received = ""
self.com.flushInput()
received = received + self.com.readline()
received = received + self.com.readline()
received = received + self.com.readline()
#print "received back from arduino:", received
#print len(received)
received = received.replace('\r\n', '')
received = received.replace('m1:', '')
received = received.replace('m', '')
received = received.replace(':', '')
#print "stripped:", received, " cmd:", cmd
#cmd = cmd[0] + cmd[1]
try:
if (len(received) > 0 and len(received) < 9):
#print "successful: sending to motor arduino"
received = received.strip("SP")
#print "received now:", received
spd_list = received.split(',')
print "spd_list", spd_list
successful = True
self.motor1_speed = int(spd_list[0])
self.motor2_speed = int(spd_list[1])
break
except:
pass
if (successful == False):
print "NOT successful: sending SP stats to motor arduino"
def terminate(self):
self.com.close()
print "closing motor serial", self.com
print self._should_stop.set()
#self._read_thread.wait()
#############################################################
def validate_command(self, cmd):
successful = False
for n in range (5):
time.sleep(0.05)
#self.com.flushOutput()
#print "sending to motor arduino:", cmd
self.com.write (cmd)
received = ""
self.com.flushInput()
#time.sleep(0.05)
while (len(received) < 1):
received = self.com.readline()
print "received back from arduino:", received
#print len(received)
received = received.replace('\r\n', '')
received = received.replace('m1:', '')
received = received.replace('m', '')
received = received.replace(':', '')
print "stripped:", received, " cmd:", cmd
if (received == cmd):
print "successful: sending to motor arduino"
successful = True
break
if (successful == False):
print "NOT successful: sending to motor arduino"
def send_command(self, cmd):
#successful = False
received = ""
for n in range (5):
self.com.flushOutput()
self.com.flushInput()
try:
received = self.com.readline()
except serial.SerialException as jj:
print jj
print "received back from arduino:", received
received = received.replace("\r", "")
received = received.replace("\n", "")
if (received[0:3] == "m1:"): break
if (len(received) > 0) and (received[0:3] == "m1:"):
self.com.flushOutput()
self.com.flushInput()
print "sending command:", cmd
self.com.write(cmd)
else:
try:
print "closing port"
#print self.com
self.com.close()
self.isConnected = False
self.com = None
print "reopening port"
self.com = self.open_serial_port()
except:
print "problem re-opening motor serial port"
pass
#except:
# print "motor driver problem"
# pass
#try:
# received = ""
# self.com.flushInput()
# received = received + self.com.readline()
# print "received:", received
#except:
# pass
#time.sleep(5)
def list_serial_ports():
# Windows
if os.name == 'nt':
# Scan for available ports.
available = []
for i in range(256):
try:
s = serial.Serial(i)
available.append('COM'+str(i + 1))
s.close()
except serial.SerialException:
pass
return available
else:
# Mac / Linux
ports_to_return = []
for port in list_ports.comports():
#print port[1]
#[start:end:increment]
#print port[1][3:4:1]
if port[1][3:4:1] == "A":ports_to_return.append(port)
#print ports_to_return
#raw_input ("press enter")
return ports_to_return
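# --- Hedged usage sketch (added by the editor, not part of the original file) ---
# Assumes an Arduino running the motor_controller_rc_pc sketch is attached on
# one of the /dev/ttyACM* ports so open_serial_port() can find it.
#
#     motors = robomow_motor()
#     motors.forward(30)      # percentage-style speed, mapped to Sabertooth values on the Arduino
#     time.sleep(1)
#     motors.spin_left(20)
#     time.sleep(1)
#     motors.stop()
#     motors.terminate()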
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_nnmf.py
import milk.unsupervised
import numpy as np
def test_nnmf():
def test3(method):
np.random.seed(8)
X3 = np.random.rand(20,3)
X = np.c_[ X3,
X3*2+np.random.rand(20,3)/20.,
-X3*2+np.random.rand(20,3)/10.]
W,V = method(X, 3, R=7)
assert np.sum((np.dot(W,V)-X)**2)/np.sum(X**2) < .5
yield test3, milk.unsupervised.lee_seung
yield test3, milk.unsupervised.sparse_nnmf
def test_sparse_nnmf():
# This is really just a smoke test because the test case is not sparse!!
from milk.unsupervised import sparse_nnmf
np.random.seed(8)
X3 = np.random.rand(20,3)
X = np.c_[ X3,
X3*2+np.random.rand(20,3)/20.,
-X3*2+np.random.rand(20,3)/10.]
W,V = sparse_nnmf(X, 3, sparsenessW=.7, sparsenessH=.7, R=7)
assert not np.any(np.isnan(W))
assert not np.any(np.isnan(V))
error = np.dot(W,V)-X
assert error.var() < X.var()
def test_hoyer_project():
from milk.unsupervised.nnmf.hoyer import _L1for, _project
def sp(n, L1, L2):
return (np.sqrt(n) - L1/L2)/(np.sqrt(n) - 1)
sparseness = .6
n = 9.
row = np.arange(int(n))/n
L2 = np.sqrt(np.dot(row, row))
L1 = _L1for(sparseness, row, L2)
assert np.abs(sp(n, L1, L2) - sparseness) < 1.e-4
row_ = _project(row, L1, L2)
assert not np.any(np.isnan(row_))
assert np.all(row_ >= 0)
L2 = np.sqrt(np.dot(row, row))
L1 = np.sum(np.abs(row_))
res = sp(n, L1, L2)
assert np.abs(res - sparseness) < 1.e-4
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/multi_label.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from .base import supervised_model, base_adaptor
class one_by_one_model(supervised_model):
def __init__(self, models):
self.models = models
def apply(self, fs):
result = []
for ell,model in self.models.iteritems():
if model.apply(fs):
result.append(ell)
return result
class one_by_one(base_adaptor):
'''
Implements 1-vs-all multi-label classifier by transforming a base (binary)
classifier.
Example
-------
features = [....]
labels = [
(0,),
(1,2),
(0,2),
(0,3),
(1,2,3),
(2,0),
...
]
learner = one_by_one(milk.defaultlearner())
model = learner.train(features, labels)
'''
def train(self, features, labels, **kwargs):
universe = set()
for ls in labels:
universe.update(ls)
models = {}
for ell in universe:
contained = np.array([int(ell in ls) for ls in labels])
models[ell] = self.base.train(features, contained, normalisedlabels=True)
return one_by_one_model(models)
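# Hedged note (added by the editor): at prediction time the wrapped binary
# models vote independently, so `model.apply(f)` returns the list of labels
# whose per-label model fired, e.g. [0, 2] for an example predicted to carry
# labels 0 and 2, or [] if no model fires.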
<file_sep>/PhidgetsPython/Python/HelloWorld.py
#!/usr/bin/env python
'''
#
# Phidget Hello World program for all devices
# (c) Phidgets 2012
#
'''
from ctypes import *
import sys
from Phidgets.PhidgetException import *
from Phidgets.Events.Events import *
from Phidgets.Manager import Manager
from Phidgets.Devices import *
# ========== Event Handling Functions ==========
def AttachHandler(event):
attachedDevice = event.device
serialNumber = attachedDevice.getSerialNum()
deviceName = attachedDevice.getDeviceName()
print("Hello to Device " + str(deviceName) + ", Serial Number: " + str(serialNumber))
def DetachHandler(event):
detachedDevice = event.device
serialNumber = detachedDevice.getSerialNum()
deviceName = detachedDevice.getDeviceName()
print("Goodbye Device " + str(deviceName) + ", Serial Number: " + str(serialNumber))
def LibraryErrorHandler(event):
try:
errorDevice = event.device
serialNumber = errorDevice.getSerialNum()
print("Error with Serial Number " + str(serialNumber))
except PhidgetException as e: LocalErrorCatcher(e)
# =========== Python-specific Exception Handler ==========
def LocalErrorCatcher(e):
print("Phidget Exception: " + str(e.code) + " - " + str(e.details) + ", Exiting...")
exit(1)
# ========= Main Code ==========
try: manager = Manager()
except RuntimeError as e:
print("Runtime Error " + e.details + ", Exiting...\n")
exit(1)
try:
manager.setOnAttachHandler(AttachHandler)
manager.setOnDetachHandler(DetachHandler)
manager.setOnErrorHandler(LibraryErrorHandler)
except PhidgetException as e: LocalErrorCatcher(e)
print("Opening....")
try:
# This would be openPhidget for a specific device object
manager.openManager()
except PhidgetException as e: LocalErrorCatcher(e)
print("Phidget Simple Playground (plug and unplug devices)");
print("Press Enter to end anytime...");
character = str(raw_input())
print("Closing...")
try:
# This would be closePhidget() for a specific device object
manager.closeManager()
except PhidgetException as e: LocalErrorCatcher(e)
exit(0)
<file_sep>/telemetry/serverside.py
import socket

UDP_IP="127.0.0.1"
UDP_PORT=5005
MESSAGE="Hello, World!"

print "UDP target IP:", UDP_IP
print "UDP target port:", UDP_PORT
print "message:", MESSAGE

sock = socket.socket( socket.AF_INET, # Internet
socket.SOCK_DGRAM ) # UDP
sock.sendto( MESSAGE, (UDP_IP, UDP_PORT) )
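# --- Hedged companion sketch (added by the editor, not part of the original file) ---
# The matching receive side for the datagram sent above would look like:
#
#     sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#     sock.bind((UDP_IP, UDP_PORT))
#     while True:
#         data, addr = sock.recvfrom(1024)    # 1024-byte receive buffer
#         print "received message:", data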
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_regression.py
import pickle
import numpy as np
import milk.supervised._svm
from gzip import GzipFile
from os import path
import numpy as np
from milksets.wine import load
from milk.supervised import defaultclassifier
import milk
def test_svm_crash():
X,Y,kernel, C, eps ,tol, = pickle.load(GzipFile(path.dirname(__file__) + '/data/regression-2-Dec-2009.pp.gz'))
X = X[2:-2,:].copy()
Y = Y[2:-2].copy()
N = len(Y)
Y = Y.astype(np.int32)
p = -np.ones(N,np.double)
params = np.array([0,C,eps,tol],np.double)
Alphas0 = np.zeros(N, np.double)
cache_size = (1<<20)
# The line below crashed milk:
milk.supervised._svm.eval_LIBSVM(X,Y,Alphas0,p,params,kernel,cache_size)
# HASN'T CRASHED!
def test_nov2010():
# Bug submitted by <NAME>
# This was failing in 0.3.5 because SDA selected no features
np.random.seed(222)
features = np.random.randn(100,20)
features[:50] *= 2
labels = np.repeat((0,1), 50)
classifier = milk.defaultclassifier()
model = classifier.train(features, labels)
new_label = model.apply(np.random.randn(20)*2)
new_label2 = model.apply(np.random.randn(20))
assert new_label == 0
assert new_label2 == 1
def test_default_small():
features, labels = load()
selected = np.concatenate( [np.where(labels < 2)[0], np.where(labels == 2)[0][:6]] )
features = features[selected]
labels = labels[selected]
learner = defaultclassifier('fast')
# For version 0.3.8, the line below led to an error
milk.nfoldcrossvalidation(features, labels, classifier=learner)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/measures/cluster_agreement.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
def rand_arand_jaccard(recovered, labels):
'''
rand, a_rand, jaccard = rand_arand_jaccard(recovered, labels)
Compute Rand, Adjusted Rand, and Jaccard indices
These share most of the computation. Therefore, it is best to compute them
together even if you are only going to use some.
Parameters
----------
recovered : sequence of int
The recovered clusters
labels : sequence of int
Underlying labels
Returns
-------
rand : float
Rand index
a_rand : float
Adjusted Rand index
jaccard : float
Jaccard index
References
----------
http://en.wikipedia.org/wiki/Rand_index
http://en.wikipedia.org/wiki/Jaccard_index
'''
from scipy.misc import comb
recovered = np.asanyarray(recovered)
labels = np.asanyarray(labels)
contig,_,_ = np.histogram2d(recovered, labels,np.arange(max(recovered.max()+2,labels.max()+2)))
A_0 = contig.sum(0)
A_1 = contig.sum(1)
Ai2 = np.sum(A_0*(A_0-1)/2.)
Bi2 = np.sum(A_1*(A_1-1)/2.)
n = A_0.sum()
a = comb(contig.ravel(), 2).sum()
b = comb(A_0, 2).sum()-a
c = comb(A_1, 2).sum()-a
d = comb(n, 2)-a-b-c
rand = (a+d)/(a+b+c+d)
jaccard = a/(a+b+c) # pair-counting Jaccard index; the original divided (a+d) by (b+c+d), which can exceed 1 (see the Wikipedia reference above)
index = np.sum(contig*(contig-1)/2)
expected = Ai2*Bi2/n/(n-1)*2.
maxindex = (Ai2+Bi2)/2.
a_rand = (index-expected)/(maxindex-expected)
return rand, a_rand, jaccard
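# --- Hedged usage sketch (added by the editor, not part of the original module) ---
# Two clusterings that agree perfectly up to relabelling score 1.0 on both the
# Rand and Adjusted Rand indices:
#
#     rand, a_rand, jaccard = rand_arand_jaccard([0, 0, 1, 1], [1, 1, 0, 0])
#     # rand == 1.0, a_rand == 1.0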
<file_sep>/telemetry/findpcs_network_ping.py
#!/usr/bin/env python
# this program pings all pcs and returns a list of names found
import socket
if hasattr(socket, 'setdefaulttimeout'):
print 'Set the default timeout on sockets to 5 seconds'
print socket.getdefaulttimeout()
socket.setdefaulttimeout(.1)
print socket.getdefaulttimeout()
for ip in range(87, 255):
ip_to_ping = "192.168.1."+str(ip)
print "PINGING:", ip_to_ping
try:
answer = socket.gethostbyaddr(ip_to_ping)
print answer[0]
if answer[0] == "mobot-2012.local":
print "FOUND IT"
except:
print "Oops! That was no valid number. Try again..."
<file_sep>/arduino/motor_controller_rc_pc/motor_controller_rc_pc.ino
#include <SoftwareSerial.h>
// Labels for use with the Sabertooth 2x5 motor controller
// Digital pin 13 is the serial transmit pin to the
// Sabertooth 2x5
#define SABER_TX_PIN 13
#define SABER_RX_PIN 12
// Set to 9600 through Sabertooth dip switches
#define SABER_BAUDRATE 9600
// Simplified serial Limits for each motor
#define SABER_MOTOR1_FULL_FORWARD 127
#define SABER_MOTOR1_FULL_REVERSE 1
#define SABER_MOTOR1_FULL_STOP 64
#define SABER_MOTOR2_FULL_FORWARD 255
#define SABER_MOTOR2_FULL_REVERSE 128
#define SABER_MOTOR2_FULL_STOP 192
#define SABER_MAXIMUM_SPEED 40
// Motor level to send when issuing the full stop command
#define SABER_ALL_STOP 0
int rc1_fwrv = 8; //channel 2 on rc
int rc2_rtlf = 9; //channel 1 on rc
int rc3_kill = 10; //channel 5 on rc
int rc1_val;
int rc2_val;
int rc3_val;
String pc_cmd = "";
SoftwareSerial SaberSerial = SoftwareSerial( SABER_RX_PIN, SABER_TX_PIN );
//global variables
boolean isConnectedPC = false;
boolean isConnectedRC = false;
int motor1_spd;
int motor2_spd;
void initSabertooth()
{
// Init software UART to communicate
// with the Sabertooth 2x5
pinMode( SABER_TX_PIN, OUTPUT );
SaberSerial.begin( SABER_BAUDRATE );
Serial.begin( 57600 );
// 2 second time delay for the Sabertooth to init
delay( 2000 );
// Send full stop command
stop_motors();
pinMode(rc1_fwrv, INPUT);
pinMode(rc2_rtlf, INPUT);
pinMode(rc3_kill, INPUT);
}
void establishContact() {
Serial.println("establishContact: ");
Serial.println(Serial.available());
byte incomingByte = 0;
while (Serial.available() <= 0) {
Serial.println("mmd"); //mobot motor driver
delay(100);
}
// read the incoming byte:
incomingByte = Serial.read();
// echo data received:
Serial.print("I received: ");
Serial.println(incomingByte);
}
void setup( )
{
initSabertooth( );
}
void loop(){
//String cmd = "";
//forward(50);
//delay (2000);
//Serial.println("loop");
//SaberSerial.write("loop2");
//establishContact();
rc_commands();
if (rc3_val < 1500){
pc_commands();
}
//serial_print_stuff();
//Serial.println(");
// if (SaberSerial.available()){
// Serial.write(SaberSerial.read());
//}
//if (Serial.available()){
// SaberSerial.write(Serial.read());
//}
//SaberSerial.println("hello")
delay (10);
}
void pc_commands(){
byte incomingByte;
String cmd = "";
String confirm_cmd = "";
char incoming_character;
if(Serial.available() <= 0) {
Serial.println ("m1:");
// delay(100);
}
while(Serial.available()) {
incoming_character = Serial.read();
cmd.concat(incoming_character);
}
if (cmd != "") {
if (cmd.substring(0,2) == "FW"){
int speed_cmd = cmd.substring(2,5).toInt();
forward(speed_cmd);
}
if (cmd.substring(0,2) == "RV"){
int speed_cmd = cmd.substring(2,5).toInt();
reverse(speed_cmd);
}
if (cmd.substring(0,2) == "SL"){
int speed_cmd = cmd.substring(2,5).toInt();
left(speed_cmd);
}
if (cmd.substring(0,2) == "SR"){
int speed_cmd = cmd.substring(2,5).toInt();
right(speed_cmd);
}
if (cmd.substring(0,2) == "ST"){
int speed_cmd = cmd.substring(2,5).toInt();
stop_motors();
}
if (cmd.substring(0,2) == "M1"){
int speed_cmd = cmd.substring(2,5).toInt();
m1_drive(speed_cmd);
}
if (cmd.substring(0,2) == "M2"){
int speed_cmd = cmd.substring(2,5).toInt();
m2_drive(speed_cmd);
}
if (cmd.substring(0,2) == "SP"){
stats();
}
// lines below are for validation
//for (int xx = 0; xx < 3; xx++) {
//delay(10);
// Serial.println (cmd);
// delay(10);
// }
}
//Serial.print("cmd send to robot");
//Serial.println(cmd);
//return cmd;
}
void rc_commands(){
String cmd = "";
rc3_val = pulseIn(rc3_kill, HIGH, 20000);//channel 5 on rc
if (rc3_val > 1400){
rc1_val = pulseIn(rc1_fwrv, HIGH, 20000);
//Serial.print(rc1_val);
if (rc1_val > 1000 && rc1_val < 2000){
rc1_val = map(rc1_val, 1000, 1900, -100, 100);
if (rc1_val < -100){
rc1_val = -100;
}
if (rc1_val > 100){
rc1_val = 100;
}
}
else{
rc1_val = 0;
}
// ******************************************************
rc2_val = pulseIn(rc2_rtlf, HIGH, 20000);//channel 1 on rc
if (rc2_val > 1000 && rc2_val < 2000){
rc2_val = map(rc2_val, 1000, 1900, -100, 100);
if (rc2_val < -100){
rc2_val = -100;
}
if (rc2_val > 100){
rc2_val = 100;
}
//Serial.println(rc2_val);
}
else{
rc2_val = 0;
}
// mix fw/rv with rt/lf readings
motor2_spd = rc1_val + rc2_val;
motor1_spd = rc1_val + (rc2_val * -1);
//adjust for min / max
if (motor1_spd > SABER_MAXIMUM_SPEED){
motor1_spd = SABER_MAXIMUM_SPEED;
}
if (motor1_spd < (SABER_MAXIMUM_SPEED*-1)){
motor1_spd = (SABER_MAXIMUM_SPEED*-1);
}
if (motor2_spd > SABER_MAXIMUM_SPEED){
motor2_spd = SABER_MAXIMUM_SPEED;
}
if (motor2_spd < (SABER_MAXIMUM_SPEED*-1)){
motor2_spd = (SABER_MAXIMUM_SPEED*-1);
}
motor1_spd = map(motor1_spd, -100, 100, 0, 127);
motor2_spd = map(motor2_spd, -100, 100, 128, 255);
//Serial.print (rc1_val);
//Serial.print (" ");
//Serial.print (rc2_val);
//Serial.print("motor1_spd: ");
//Serial.print(motor1_spd);
//Serial.print(" motor2_spd: ");
//Serial.println(motor2_spd);
rc_motor_drive();
}
// ******************************************************
//if (rc3_val < 1400){
// Serial.println("PC switch activated");
//stop_motors();
//}
}
void rc_motor_drive(){
//Serial.write(rc1_val);
//Serial.write(rc2_val);
SaberSerial.write(motor1_spd);
SaberSerial.write(motor2_spd);
}
void m1_drive(int speed_val){
if (speed_val > SABER_MAXIMUM_SPEED){
speed_val = SABER_MAXIMUM_SPEED;
}
if (speed_val < (SABER_MAXIMUM_SPEED*-1)){
speed_val = (SABER_MAXIMUM_SPEED*-1);
}
int val = map(speed_val, -100, 100, 1, 127);
if (val > 127){
val = 127;
}
else if (val < 1){
val = 1;
}
//Serial.print("m1:");
//Serial.println(val);
//Serial.write(val);
SaberSerial.write(val);
motor1_spd = speed_val;
}
void m2_drive(int speed_val){
if (speed_val > SABER_MAXIMUM_SPEED){
speed_val = SABER_MAXIMUM_SPEED;
}
if (speed_val < (SABER_MAXIMUM_SPEED*-1)){
speed_val = (SABER_MAXIMUM_SPEED*-1);
}
int val = map(speed_val, -100, 100, 128, 255);
if (val < 128){
val = 128;
}
else if (val > 255){
val = 255;
}
if (val == 191){
val = 192;
}
//Serial.write(val);
//Serial.print("m2:");
//Serial.println(val);
SaberSerial.write(val);
motor2_spd = speed_val;
}
void forward(int speed_val){
m1_drive(speed_val);
m2_drive(speed_val);
}
void reverse(int speed_val){
speed_val = speed_val * -1;
m1_drive(speed_val);
m2_drive(speed_val);
}
void right(int speed_val){
m1_drive(speed_val);
m2_drive(speed_val*-1);
}
void left(int speed_val){
m1_drive(speed_val*-1);
m2_drive(speed_val);
}
void stop_motors(){
int speed_val = 0;
rc1_val = 64;
rc2_val = 192;
m1_drive(speed_val);
m2_drive(speed_val);
}
void stats(){
Serial.print(motor1_spd);
Serial.print(",");
Serial.println(motor2_spd);
}
void serial_print_stuff(){
Serial.print (rc1_val);
Serial.print (" ");
Serial.print (rc2_val);
Serial.print("motor1_spd: ");
Serial.print(motor1_spd);
Serial.print(" motor2_spd: ");
Serial.print(motor2_spd);
Serial.print(" rc1: ");
Serial.print(rc1_val);
Serial.print(" rc2: ");
Serial.print(rc2_val);
Serial.print(" rc3: ");
Serial.println(rc3_val);
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_nfoldcrossvalidation.py
import milk.measures.nfoldcrossvalidation
from milk.measures.nfoldcrossvalidation import nfoldcrossvalidation, foldgenerator
import milk.supervised.tree
import numpy as np
from fast_classifier import fast_classifier
def test_foldgenerator():
labels = np.array([1]*20+[2]*30+[3]*20)
for nf in [None,2,3,5,10,15,20]:
assert np.array([test.copy() for _,test in foldgenerator(labels,nf)]).sum(0).max() == 1
assert np.array([test.copy() for _,test in foldgenerator(labels,nf)]).sum() == len(labels)
assert np.array([(test&train).sum() for train,test in foldgenerator(labels,nf)]).sum() == 0
def test_foldgenerator_not_empty():
for nf in (None, 2, 3, 5, 10, 15, 20):
for Tr,Te in foldgenerator([0] * 10 + [1] *10, nf, None):
assert not np.all(Tr)
assert not np.all(Te)
def test_nfoldcrossvalidation_simple():
from milksets import wine
features, labels = wine.load()
features = features[::2]
labels = labels[::2]
cmat,clabels = nfoldcrossvalidation(features, labels, classifier=fast_classifier())
assert cmat.shape == (3,3)
assert len(clabels) == 3
def test_nfoldcrossvalidation_simple_list():
from milksets import wine
features, labels = wine.load()
features = features[::2]
labels = labels[::2]
cmat,clabels = nfoldcrossvalidation(list(features), list(labels), classifier=fast_classifier())
assert cmat.shape == (3,3)
assert len(clabels) == 3
class test_classifier(object):
def __init__(self,N):
self.tested = np.zeros(N,bool)
self.N = N
def apply(self, features):
cur = np.zeros(self.N,bool)
cur[features] = True
assert not np.any(cur & self.tested)
self.tested |= cur
return np.zeros_like(features)
def train(self,f,l):
return self
def test_nfoldcrossvalidation_testall():
N = 121
C = test_classifier(N)
features = np.arange(N)
labels = np.zeros(N)
cmat,clabels = nfoldcrossvalidation(features, labels, classifier=C)
assert np.all(C.tested)
def test_getfold():
A = np.zeros(20)
A[:10] = 1
t,s = milk.measures.nfoldcrossvalidation.getfold(A,0,10)
tt,ss = milk.measures.nfoldcrossvalidation.getfold(A,1,10)
assert not np.any((~t)&(~tt))
def test_nfoldcrossvalidation_defaultclassifier():
np.random.seed(2233)
X = np.random.rand(60,5)
X[:20] += 4.
X[-20:] -= 4.
Y = np.ones(60)
Y[:20] = 0
Y[-20:] = 2
Y += 100
cmat,clabels = milk.measures.nfoldcrossvalidation.nfoldcrossvalidation(X,Y)
assert cmat.shape == (3,3)
clabels.sort()
assert np.all(clabels == [100,101,102])
def test_foldgenerator_origins():
def test_origins(labels, origins):
for nf in (2,3,5,7):
assert np.array([test.copy() for _,test in foldgenerator(labels, nf, origins)]).sum(0).max() == 1
assert np.array([test.copy() for _,test in foldgenerator(labels, nf, origins)]).sum() == len(labels)
assert np.min([test.sum() for _,test in foldgenerator(labels, nf, origins)]) > 0
assert np.min([train.sum() for train,_ in foldgenerator(labels, nf, origins)]) > 0
for Tr,Te in foldgenerator(labels, nf, origins):
assert not np.any(Tr&Te)
in_test = set(origins[Te])
in_train = set(origins[Tr])
assert len(in_train.intersection(in_test)) == 0
tested = np.zeros(len(labels))
for Tr,Te in foldgenerator(labels, nf, origins):
tested[Te] += 1
assert np.all(tested == 1)
labels = np.zeros(120, np.uint8)
labels[39:] += 1
labels[66:] += 1
origins = np.repeat(np.arange(40), 3)
yield test_origins, labels, origins
reorder = np.argsort(np.random.rand(len(labels)))
labels = labels[reorder]
origins = origins[reorder]
yield test_origins, labels, origins
def test_stringlabels():
np.random.seed(222)
D = np.random.rand(100,10)
D[:40] += np.random.rand(40,10)**2
labelnames = ['one'] * 40 + ['two'] * 60
cmat,Lo = nfoldcrossvalidation(D, labelnames, classifier=fast_classifier())
assert Lo[0] in labelnames
assert Lo[1] in labelnames
assert Lo[0] != Lo[1] in labelnames
def test_predictions():
np.random.seed(222)
D = np.random.rand(100,10)
D[:40] += np.random.rand(40,10)**2
labels = [0] * 40 + [1] * 60
cmat,_,predictions = nfoldcrossvalidation(D, labels, classifier=fast_classifier(), return_predictions=1)
assert np.all((predictions == 0)|(predictions == 1))
assert cmat.trace() == np.sum(predictions == labels)
def test_multi():
np.random.seed(30)
r = np.random.random
for _ in xrange(10):
labels = []
p = np.array([.24,.5,.1,.44])
for i in xrange(100):
cur = [j for j in xrange(4) if r() < p[j]]
if not cur: cur = [0]
labels.append(cur)
seen = np.zeros(100, int)
for Tr,Te in foldgenerator(labels, 5, multi_label=True):
assert np.sum(Tr & Te) == 0
seen[Te] += 1
assert np.sum(seen) == 100
assert np.ptp(seen) == 0
<file_sep>/lib/maxsonar_class.py
#!/usr/bin/env python
import serial
import threading
import time
import os
from serial.tools import list_ports
# need to find the best way to search serial ports to locate the device
class MaxSonar(object):
def __init__(self):
self._isConnected = False
self._ser = self._open_serial_port()
self._should_stop = threading.Event()
self._start_reading()
self._data = 0
#self._port = port
def _open_serial_port(self):
while self._isConnected == False:
print "class MaxSonar: searching serial ports for ultrasonic sensor package..."
for i in range(len(list_serial_ports())):
port = "/dev/ttyACM"
port = port[0:11] + str(i)
print "class MaxSonar: searching on port:", port
time.sleep(.5)
try:
#could actually get serial number from device from list_serial _ports and look specifically for that one
ser = serial.Serial(port, 57600, timeout=1)
data = ser.readline()
#print "data=", int(data[3:(len(data)-1)])
if data[0:2] == "s1":
#ser.write("a\n") # write a string
print "class MaxSonar: found ultrasonic sensor package on serial port: ", port
self._isConnected = True
#self._port = ser
#time.sleep(.35)
break
except:
pass
if self._isConnected == False:
print "class MaxSonar: ultrasonic sensor package not found!"
time.sleep(.5)
#print "returning", ser
return ser
def _start_reading(self):
def read():
#print self._should_stop.isSet()
#print self._ser.isOpen()
while not self._should_stop.isSet():
try:
#data = ""
self._ser.flushInput()
data = self._ser.readline()
#print "recieved: ", data
#self._data = int(data[5:(len(data)-1)])
self._data = data[0:(len(data)-1)]
except:
try:
print "class MaxSonar:no connection...attempting to reconnect"
self._data = 0
self._isConnected = False
self._ser = self._open_serial_port()
#time.sleep(.1)
except:
pass
thr = threading.Thread(target=read)
thr.daemon=True
thr.start()
return thr
def terminate(self):
self._ser.close()
print "closing sonar serial", self._ser
print self._should_stop.set()
#self._read_thread.wait()
def distances_cm(self):
return self._data
def list_serial_ports():
# Windows
if os.name == 'nt':
# Scan for available ports.
available = []
for i in range(256):
try:
s = serial.Serial(i)
available.append('COM'+str(i + 1))
s.close()
except serial.SerialException:
pass
return available
else:
# Mac / Linux
ports_to_return = []
for port in list_ports.comports():
#print port[1]
#[start:end:increment]
#print port[1][3:4:1]
if port[1][3:4:1] == "A":ports_to_return.append(port)
#print ports_to_return
#raw_input ("press enter")
return ports_to_return
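# --- Hedged usage sketch (added by the editor, not part of the original file) ---
# Assumes the Arduino sonar package (the one that prints lines starting with
# "s1") is attached on one of the /dev/ttyACM* ports.
#
#     sonar = MaxSonar()
#     for _ in range(5):
#         print "sonar reading (cm):", sonar.distances_cm()
#         time.sleep(0.5)
#     sonar.terminate()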
<file_sep>/gui/breezy/buttondemo.py
"""
File: buttondemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
import math, thread
import time
class ButtonDemo(EasyFrame):
"""Illustrates command buttons and user events."""
def __init__(self):
"""Sets up the window, label, and buttons."""
EasyFrame.__init__(self)
# A single label in the first row.
self.label = self.addLabel(text = "Hello world!",
row = 0, column = 0,
columnspan = 2, sticky = "NSEW")
# Two command buttons in the second row.
self.clearBtn = self.addButton(text = "Clear",
row = 1, column = 0,
command = self.clear)
self.restoreBtn = self.addButton(text = "Restore",
row = 1, column = 1,
command = self.restore,
state = "disabled")
# CircleArea() takes no constructor arguments and self.newWindow was never
# defined, so the original call here crashed; it is left disabled.
#self.app = CircleArea()
# Methods to handle user events.
def clear(self):
"""Resets the label to the empty string and
the button states."""
self.label["text"] = ""
self.clearBtn["state"] = "disabled"
self.restoreBtn["state"] = "normal"
def restore(self):
"""Resets the label to 'Hello world!'and sets
the state of the buttons."""
self.label["text"] = "Hello world!"
self.clearBtn["state"] = "normal"
self.restoreBtn["state"] = "disabled"
class CircleArea(EasyFrame):
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Circle Area")
# Label and field for the area
self.addLabel(text = "Area",
row = 0, column = 0)
self.areaField = self.addFloatField(value = 0.0,
row = 0,
column = 1,
width = 20)
# Sliding scale for the radius
self.radiusScale = self.addScale(label = "Radius",
row = 1, column = 0,
columnspan = 2,
from_ = 0, to = 100,
length = 300,
tickinterval = 10,
command = self.computeArea)
# The event handler method for the sliding scale
def computeArea(self, radius):
"""Inputs the radius, computes the area,
and outputs the area."""
# radius is the current value of the scale, as a string.
area = float(radius) ** 2 * math.pi
self.areaField.setNumber(area)
def start_button():
ButtonDemo().mainloop()
def start_slider():
CircleArea().mainloop()
# Instantiates and pops up the window.
if __name__ == "__main__":
#th = thread.start_new_thread(start_button, ())
#th2 = thread.start_new_thread(start_button, ())
#th3 = thread.start_new_thread(start_button, ())
#time.sleep(.1)
#start_slider ()
CircleArea().mainloop()
#start_button()
<file_sep>/CVtypes.py
"""
This file started life as cvtypes.py with the following header (originally in German):
Wrapper module cvtypes.py for using the OpenCV beta5 library
from Python, with access via ctypes.
Author: <NAME>
To do: wrap the structures that are still missing (e.g. CvKalman)
wrap the constants that are still missing (e.g. CV_MAX_DIM_HEAP)
wrap the macros and inline functions that are still missing
test extensively
Log: 2006/07/25 added documentation strings
2006/07/10 fixed errors in cvGEMM, cvMatMulAdd and cvMatMul
2006/06/28 module created
I hacked it both automatically and by hand to bring it up to date with OpenCV 1.0 and
to use prototype for the functions. I also added from_param methods to allow lists to many
functions that expect a C array.
I checked with Michael and he graciously agreed to let me give it away. This software is
free for any use. If you or your lawyer are stupid enough to believe that Micheal or I have
any liability for it, you should not use it, otherwise be our guest.
<NAME> February 2007
Updated 12 May 2007 to include modifications provided by Russell Warren
"""
# --- Imports -----------------------------------------------------------------
import ctypes, os
from ctypes import Structure, Union, POINTER, SetPointerType, CFUNCTYPE, cdll, byref
from ctypes import c_char_p, c_double, c_float, c_byte, c_ubyte, c_int, c_void_p, c_ulong
from ctypes import c_uint32, c_short, c_char, c_longlong
# ----Load the DLLs ----------------------------------------------------------
if os.name == 'posix':
_cxDLL = cdll.LoadLibrary('libcxcore.so.1')
_cvDLL = cdll.LoadLibrary('libcv.so.1')
_hgDLL = cdll.LoadLibrary('libhighgui.so.1')
else:
_cxDLL = cdll.cxcore100
_cvDLL = cdll.cv100
_hgDLL = cdll.highgui100
# --- CONSTANTS AND STUFF FROM CV.H ------------------------------------------------------
CV_BLUR_NO_SCALE = 0
CV_BLUR = 1
CV_GAUSSIAN = 2
CV_MEDIAN = 3
CV_BILATERAL = 4
CV_INPAINT_NS = 0
CV_INPAINT_TELEA = 1
CV_SCHARR = -1
CV_MAX_SOBEL_KSIZE = 7
CV_BGR2BGRA = 0
CV_RGB2RGBA = CV_BGR2BGRA
CV_BGRA2BGR = 1
CV_RGBA2RGB = CV_BGRA2BGR
CV_BGR2RGBA = 2
CV_RGB2BGRA = CV_BGR2RGBA
CV_RGBA2BGR = 3
CV_BGRA2RGB = CV_RGBA2BGR
CV_BGR2RGB = 4
CV_RGB2BGR = CV_BGR2RGB
CV_BGRA2RGBA = 5
CV_RGBA2BGRA = CV_BGRA2RGBA
CV_BGR2GRAY = 6
CV_RGB2GRAY = 7
CV_GRAY2BGR = 8
CV_GRAY2RGB = CV_GRAY2BGR
CV_GRAY2BGRA = 9
CV_GRAY2RGBA = CV_GRAY2BGRA
CV_BGRA2GRAY = 10
CV_RGBA2GRAY = 11
CV_BGR2BGR565 = 12
CV_RGB2BGR565 = 13
CV_BGR5652BGR = 14
CV_BGR5652RGB = 15
CV_BGRA2BGR565 = 16
CV_RGBA2BGR565 = 17
CV_BGR5652BGRA = 18
CV_BGR5652RGBA = 19
CV_GRAY2BGR565 = 20
CV_BGR5652GRAY = 21
CV_BGR2BGR555 = 22
CV_RGB2BGR555 = 23
CV_BGR5552BGR = 24
CV_BGR5552RGB = 25
CV_BGRA2BGR555 = 26
CV_RGBA2BGR555 = 27
CV_BGR5552BGRA = 28
CV_BGR5552RGBA = 29
CV_GRAY2BGR555 = 30
CV_BGR5552GRAY = 31
CV_BGR2XYZ = 32
CV_RGB2XYZ = 33
CV_XYZ2BGR = 34
CV_XYZ2RGB = 35
CV_BGR2YCrCb = 36
CV_RGB2YCrCb = 37
CV_YCrCb2BGR = 38
CV_YCrCb2RGB = 39
CV_BGR2HSV = 40
CV_RGB2HSV = 41
CV_BGR2Lab = 44
CV_RGB2Lab = 45
CV_BayerBG2BGR = 46
CV_BayerGB2BGR = 47
CV_BayerRG2BGR = 48
CV_BayerGR2BGR = 49
CV_BayerBG2RGB = CV_BayerRG2BGR
CV_BayerGB2RGB = CV_BayerGR2BGR
CV_BayerRG2RGB = CV_BayerBG2BGR
CV_BayerGR2RGB = CV_BayerGB2BGR
CV_BGR2Luv = 50
CV_RGB2Luv = 51
CV_BGR2HLS = 52
CV_RGB2HLS = 53
CV_HSV2BGR = 54
CV_HSV2RGB = 55
CV_Lab2BGR = 56
CV_Lab2RGB = 57
CV_Luv2BGR = 58
CV_Luv2RGB = 59
CV_HLS2BGR = 60
CV_HLS2RGB = 61
CV_COLORCVT_MAX = 100
CV_INTER_NN = 0
CV_INTER_LINEAR = 1
CV_INTER_CUBIC = 2
CV_INTER_AREA = 3
CV_WARP_FILL_OUTLIERS = 8
CV_WARP_INVERSE_MAP = 16
CV_SHAPE_RECT = 0
CV_SHAPE_CROSS = 1
CV_SHAPE_ELLIPSE = 2
CV_SHAPE_CUSTOM = 100
CV_MOP_OPEN = 2
CV_MOP_CLOSE = 3
CV_MOP_GRADIENT = 4
CV_MOP_TOPHAT = 5
CV_MOP_BLACKHAT = 6
CV_TM_SQDIFF = 0
CV_TM_SQDIFF_NORMED = 1
CV_TM_CCORR = 2
CV_TM_CCORR_NORMED = 3
CV_TM_CCOEFF = 4
CV_TM_CCOEFF_NORMED = 5
CV_LKFLOW_PYR_A_READY = 1
CV_LKFLOW_PYR_B_READY = 2
CV_LKFLOW_INITIAL_GUESSES = 4
CV_POLY_APPROX_DP = 0
CV_DOMINANT_IPAN = 1
CV_CONTOURS_MATCH_I1 = 1
CV_CONTOURS_MATCH_I2 = 2
CV_CONTOURS_MATCH_I3 = 3
CV_CONTOUR_TREES_MATCH_I1 = 1
CV_CLOCKWISE = 1
CV_COUNTER_CLOCKWISE = 2
CV_COMP_CORREL = 0
CV_COMP_CHISQR = 1
CV_COMP_INTERSECT = 2
CV_COMP_BHATTACHARYYA = 3
CV_VALUE = 1
CV_ARRAY = 2
CV_DIST_MASK_3 = 3
CV_DIST_MASK_5 = 5
CV_DIST_MASK_PRECISE = 0
CV_THRESH_BINARY = 0 # value = (value > threshold) ? max_value : 0
CV_THRESH_BINARY_INV = 1 # value = (value > threshold) ? 0 : max_value
CV_THRESH_TRUNC = 2 # value = (value > threshold) ? threshold : value
CV_THRESH_TOZERO = 3 # value = (value > threshold) ? value : 0
CV_THRESH_TOZERO_INV = 4 # value = (value > threshold) ? 0 : value
CV_THRESH_MASK = 7
CV_THRESH_OTSU = 8 # use Otsu algorithm to choose the optimal threshold value
CV_ADAPTIVE_THRESH_MEAN_C = 0
CV_ADAPTIVE_THRESH_GAUSSIAN_C = 1
CV_FLOODFILL_FIXED_RANGE = 1 << 16
CV_FLOODFILL_MASK_ONLY = 1 << 17
CV_CANNY_L2_GRADIENT = 1 << 31
CV_HOUGH_STANDARD = 0
CV_HOUGH_PROBABILISTIC = 1
CV_HOUGH_MULTI_SCALE = 2
CV_HOUGH_GRADIENT = 3
CV_HAAR_DO_CANNY_PRUNING = 1
CV_HAAR_SCALE_IMAGE = 2
CV_CALIB_USE_INTRINSIC_GUESS = 1
CV_CALIB_FIX_ASPECT_RATIO = 2
CV_CALIB_FIX_PRINCIPAL_POINT = 4
CV_CALIB_ZERO_TANGENT_DIST = 8
CV_CALIB_CB_ADAPTIVE_THRESH = 1
CV_CALIB_CB_NORMALIZE_IMAGE = 2
CV_CALIB_CB_FILTER_QUADS = 4
CV_FM_7POINT = 1
CV_FM_8POINT = 2
CV_FM_LMEDS_ONLY = 4
CV_FM_RANSAC_ONLY = 8
CV_FM_LMEDS = CV_FM_LMEDS_ONLY + CV_FM_8POINT
CV_FM_RANSAC = CV_FM_RANSAC_ONLY + CV_FM_8POINT
# --- CONSTANTS AND STUFF FROM highgui.h ----
CV_WINDOW_AUTOSIZE = 1
CV_EVENT_MOUSEMOVE = 0
CV_EVENT_LBUTTONDOWN = 1
CV_EVENT_RBUTTONDOWN = 2
CV_EVENT_MBUTTONDOWN = 3
CV_EVENT_LBUTTONUP = 4
CV_EVENT_RBUTTONUP = 5
CV_EVENT_MBUTTONUP = 6
CV_EVENT_LBUTTONDBLCLK = 7
CV_EVENT_RBUTTONDBLCLK = 8
CV_EVENT_MBUTTONDBLCLK = 9
CV_EVENT_FLAG_LBUTTON = 1
CV_EVENT_FLAG_RBUTTON = 2
CV_EVENT_FLAG_MBUTTON = 4
CV_EVENT_FLAG_CTRLKEY = 8
CV_EVENT_FLAG_SHIFTKEY = 16
CV_EVENT_FLAG_ALTKEY = 32
CV_LOAD_IMAGE_UNCHANGED = -1 # 8 bit, color or gray - deprecated, use CV_LOAD_IMAGE_ANYCOLOR
CV_LOAD_IMAGE_GRAYSCALE = 0 # 8 bit, gray
CV_LOAD_IMAGE_COLOR = 1 # 8 bit unless combined with CV_LOAD_IMAGE_ANYDEPTH, color
CV_LOAD_IMAGE_ANYDEPTH = 2 # any depth, if specified on its own gray by itself
# equivalent to CV_LOAD_IMAGE_UNCHANGED but can be modified
# with CV_LOAD_IMAGE_ANYDEPTH
CV_LOAD_IMAGE_ANYCOLOR = 4
CV_CVTIMG_FLIP = 1
CV_CVTIMG_SWAP_RB = 2
CV_CAP_ANY = 0 # autodetect
CV_CAP_MIL = 100 # MIL proprietary drivers
CV_CAP_VFW = 200 # platform native
CV_CAP_V4L = 200
CV_CAP_V4L2 = 200
CV_CAP_FIREWARE = 300 # IEEE 1394 drivers
CV_CAP_IEEE1394 = 300
CV_CAP_DC1394 = 300
CV_CAP_CMU1394 = 300
CV_CAP_STEREO = 400 # TYZX proprietary drivers
CV_CAP_TYZX = 400
CV_TYZX_LEFT = 400
CV_TYZX_RIGHT = 401
CV_TYZX_COLOR = 402
CV_TYZX_Z = 403
CV_CAP_QT = 500 # Quicktime
CV_CAP_PROP_POS_MSEC = 0
CV_CAP_PROP_POS_FRAMES = 1
CV_CAP_PROP_POS_AVI_RATIO = 2
CV_CAP_PROP_FRAME_WIDTH = 3
CV_CAP_PROP_FRAME_HEIGHT = 4
CV_CAP_PROP_FPS = 5
CV_CAP_PROP_FOURCC = 6
CV_CAP_PROP_FRAME_COUNT = 7
CV_CAP_PROP_FORMAT = 8
CV_CAP_PROP_MODE = 9
CV_CAP_PROP_BRIGHTNESS = 10
CV_CAP_PROP_CONTRAST = 11
CV_CAP_PROP_SATURATION = 12
CV_CAP_PROP_HUE = 13
CV_CAP_PROP_GAIN = 14
CV_CAP_PROP_CONVERT_RGB = 15
# --- CONSTANTS AND STUFF FROM cxcore.h ----
CV_AUTOSTEP = 0x7fffffff
CV_MAX_ARR = 10
CV_NO_DEPTH_CHECK = 1
CV_NO_CN_CHECK = 2
CV_NO_SIZE_CHECK = 4
CV_CMP_EQ = 0
CV_CMP_GT = 1
CV_CMP_GE = 2
CV_CMP_LT = 3
CV_CMP_LE = 4
CV_CMP_NE = 5
CV_CHECK_RANGE = 1
CV_CHECK_QUIET = 2
CV_RAND_UNI = 0
CV_RAND_NORMAL = 1
CV_GEMM_A_T = 1
CV_GEMM_B_T = 2
CV_GEMM_C_T = 4
CV_SVD_MODIFY_A = 1
CV_SVD_U_T = 2
CV_SVD_V_T = 4
CV_LU = 0
CV_SVD = 1
CV_SVD_SYM = 2
CV_COVAR_SCRAMBLED = 0
CV_COVAR_NORMAL = 1
CV_COVAR_USE_AVG = 2
CV_COVAR_SCALE = 4
CV_COVAR_ROWS = 8
CV_COVAR_COLS = 16
CV_PCA_DATA_AS_ROW = 0
CV_PCA_DATA_AS_COL = 1
CV_PCA_USE_AVG = 2
CV_C = 1
CV_L1 = 2
CV_L2 = 4
CV_NORM_MASK = 7
CV_RELATIVE = 8
CV_DIFF = 16
CV_MINMAX = 32
CV_DIFF_C = (CV_DIFF | CV_C)
CV_DIFF_L1 = (CV_DIFF | CV_L1)
CV_DIFF_L2 = (CV_DIFF | CV_L2)
CV_RELATIVE_C = (CV_RELATIVE | CV_C)
CV_RELATIVE_L1 = (CV_RELATIVE | CV_L1)
CV_RELATIVE_L2 = (CV_RELATIVE | CV_L2)
CV_REDUCE_SUM = 0
CV_REDUCE_AVG = 1
CV_REDUCE_MAX = 2
CV_REDUCE_MIN = 3
CV_DXT_FORWARD = 0
CV_DXT_INVERSE = 1
CV_DXT_SCALE = 2 # divide result by size of array
CV_DXT_INV_SCALE = CV_DXT_INVERSE + CV_DXT_SCALE
CV_DXT_INVERSE_SCALE = CV_DXT_INV_SCALE
CV_DXT_ROWS = 4 # transform each row individually
CV_DXT_MUL_CONJ = 8 # conjugate the second argument of cvMulSpectrums
CV_FRONT = 1
CV_BACK = 0
CV_GRAPH_VERTEX = 1
CV_GRAPH_TREE_EDGE = 2
CV_GRAPH_BACK_EDGE = 4
CV_GRAPH_FORWARD_EDGE = 8
CV_GRAPH_CROSS_EDGE = 16
CV_GRAPH_ANY_EDGE = 30
CV_GRAPH_NEW_TREE = 32
CV_GRAPH_BACKTRACKING = 64
CV_GRAPH_OVER = -1
CV_GRAPH_ALL_ITEMS = -1
CV_GRAPH_ITEM_VISITED_FLAG = 1 << 30
CV_GRAPH_SEARCH_TREE_NODE_FLAG = 1 << 29
CV_GRAPH_FORWARD_EDGE_FLAG = 1 << 28
CV_FILLED = -1
CV_AA = 16
CV_FONT_HERSHEY_SIMPLEX = 0
CV_FONT_HERSHEY_PLAIN = 1
CV_FONT_HERSHEY_DUPLEX = 2
CV_FONT_HERSHEY_COMPLEX = 3
CV_FONT_HERSHEY_TRIPLEX = 4
CV_FONT_HERSHEY_COMPLEX_SMALL = 5
CV_FONT_HERSHEY_SCRIPT_SIMPLEX = 6
CV_FONT_HERSHEY_SCRIPT_COMPLEX = 7
CV_FONT_ITALIC = 16
CV_FONT_VECTOR0 = CV_FONT_HERSHEY_SIMPLEX
CV_ErrModeLeaf = 0 # print error and exit program
CV_ErrModeParent = 1 # print error and continue
CV_ErrModeSilent = 2 # don't print and continue
#------
# make function prototypes a bit easier to declare
def cfunc(name, dll, result, *args):
'''build and apply a ctypes prototype complete with parameter flags
e.g.
cvMinMaxLoc = cfunc('cvMinMaxLoc', _cxDLL, None,
('image', POINTER(IplImage), 1),
('min_val', POINTER(double), 2),
('max_val', POINTER(double), 2),
('min_loc', POINTER(CvPoint), 2),
('max_loc', POINTER(CvPoint), 2),
('mask', POINTER(IplImage), 1, None))
means locate cvMinMaxLoc in dll _cxDLL, it returns nothing.
The first argument is an input image. The next 4 arguments are output, and the last argument is
input with an optional value. A typical call might look like:
min_val,max_val,min_loc,max_loc = cvMinMaxLoc(img)
'''
atypes = []
aflags = []
for arg in args:
atypes.append(arg[1])
aflags.append((arg[2], arg[0]) + arg[3:])
return CFUNCTYPE(result, *atypes)((name, dll), tuple(aflags))
class ListPOINTER(object):
'''Just like a POINTER but accept a list of ctype as an argument'''
def __init__(self, etype):
self.etype = etype
def from_param(self, param):
if isinstance(param, (list,tuple)):
return (self.etype * len(param))(*param)
class ListPOINTER2(object):
'''Just like POINTER(POINTER(ctype)) but accept a list of lists of ctype'''
def __init__(self, etype):
self.etype = etype
def from_param(self, param):
if isinstance(param, (list,tuple)):
val = (POINTER(self.etype) * len(param))()
for i,v in enumerate(param):
if isinstance(v, (list,tuple)):
val[i] = (self.etype * len(v))(*v)
else:
raise TypeError, 'nested list or tuple required at %d' % i
return val
else:
raise TypeError, 'list or tuple required'
class ByRefArg(object):
'''Just like a POINTER but accept an argument and pass it byref'''
def __init__(self, atype):
self.atype = atype
def from_param(self, param):
return byref(param)
class CallableToFunc(object):
'''Make the callable argument into a C callback'''
def __init__(self, cbacktype):
self.cbacktype = cbacktype
def from_param(self, param):
return self.cbacktype(param)
# --- Global variables and exceptions ------------------------------------------
CV_TM_SQDIFF =0
CV_TM_SQDIFF_NORMED =1
CV_TM_CCORR =2
CV_TM_CCORR_NORMED =3
CV_TM_CCOEFF =4
CV_TM_CCOEFF_NORMED =5
# Image type (IplImage)
IPL_DEPTH_SIGN = 0x80000000
IPL_DEPTH_1U = 1
IPL_DEPTH_8U = 8
IPL_DEPTH_16U = 16
IPL_DEPTH_32F = 32
IPL_DEPTH_8S = -8
IPL_DEPTH_16S = -16
IPL_DEPTH_32S = -32
IPL_DATA_ORDER_PIXEL = 0
IPL_DATA_ORDER_PLANE = 1
IPL_ORIGIN_TL = 0
IPL_ORIGIN_BL = 1
IPL_ALIGN_4BYTES = 4
IPL_ALIGN_8BYTES = 8
IPL_ALIGN_16BYTES = 16
IPL_ALIGN_32BYTES = 32
IPL_ALIGN_DWORD = IPL_ALIGN_4BYTES
IPL_ALIGN_QWORD = IPL_ALIGN_8BYTES
IPL_BORDER_CONSTANT = 0
IPL_BORDER_REPLICATE = 1
IPL_BORDER_REFLECT = 2
IPL_BORDER_WRAP = 3
IPL_IMAGE_HEADER = 1
IPL_IMAGE_DATA = 2
IPL_IMAGE_ROI = 4
CV_TYPE_NAME_IMAGE = "opencv-image"
IPL_DEPTH_64F = 64
# Matrix type (CvMat)
CV_CN_MAX = 4
CV_CN_SHIFT = 3
CV_DEPTH_MAX = (1 << CV_CN_SHIFT)
CV_8U = 0
CV_8S = 1
CV_16U = 2
CV_16S = 3
CV_32S = 4
CV_32F = 5
CV_64F = 6
CV_USRTYPE1 = 7
def CV_MAKETYPE(depth,cn):
return ((depth) + (((cn)-1) << CV_CN_SHIFT))
CV_MAKE_TYPE = CV_MAKETYPE
CV_8UC1 = CV_MAKETYPE(CV_8U,1)
CV_8UC2 = CV_MAKETYPE(CV_8U,2)
CV_8UC3 = CV_MAKETYPE(CV_8U,3)
CV_8UC4 = CV_MAKETYPE(CV_8U,4)
CV_8SC1 = CV_MAKETYPE(CV_8S,1)
CV_8SC2 = CV_MAKETYPE(CV_8S,2)
CV_8SC3 = CV_MAKETYPE(CV_8S,3)
CV_8SC4 = CV_MAKETYPE(CV_8S,4)
CV_16UC1 = CV_MAKETYPE(CV_16U,1)
CV_16UC2 = CV_MAKETYPE(CV_16U,2)
CV_16UC3 = CV_MAKETYPE(CV_16U,3)
CV_16UC4 = CV_MAKETYPE(CV_16U,4)
CV_16SC1 = CV_MAKETYPE(CV_16S,1)
CV_16SC2 = CV_MAKETYPE(CV_16S,2)
CV_16SC3 = CV_MAKETYPE(CV_16S,3)
CV_16SC4 = CV_MAKETYPE(CV_16S,4)
CV_32SC1 = CV_MAKETYPE(CV_32S,1)
CV_32SC2 = CV_MAKETYPE(CV_32S,2)
CV_32SC3 = CV_MAKETYPE(CV_32S,3)
CV_32SC4 = CV_MAKETYPE(CV_32S,4)
CV_32FC1 = CV_MAKETYPE(CV_32F,1)
CV_32FC2 = CV_MAKETYPE(CV_32F,2)
CV_32FC3 = CV_MAKETYPE(CV_32F,3)
CV_32FC4 = CV_MAKETYPE(CV_32F,4)
CV_64FC1 = CV_MAKETYPE(CV_64F,1)
CV_64FC2 = CV_MAKETYPE(CV_64F,2)
CV_64FC3 = CV_MAKETYPE(CV_64F,3)
CV_64FC4 = CV_MAKETYPE(CV_64F,4)
CV_AUTO_STEP = 0x7fffffff
CV_MAT_CN_MASK = ((CV_CN_MAX - 1) << CV_CN_SHIFT)
def CV_MAT_CN(flags):
return ((((flags) & CV_MAT_CN_MASK) >> CV_CN_SHIFT) + 1)
CV_MAT_DEPTH_MASK = (CV_DEPTH_MAX - 1)
def CV_MAT_DEPTH(flags):
return ((flags) & CV_MAT_DEPTH_MASK)
CV_MAT_TYPE_MASK = (CV_DEPTH_MAX*CV_CN_MAX - 1)
def CV_MAT_TYPE(flags):
    return ((flags) & CV_MAT_TYPE_MASK)
CV_MAT_CONT_FLAG_SHIFT = 9
CV_MAT_CONT_FLAG = (1 << CV_MAT_CONT_FLAG_SHIFT)
def CV_IS_MAT_CONT(flags):
return ((flags) & CV_MAT_CONT_FLAG)
CV_IS_CONT_MAT = CV_IS_MAT_CONT
CV_MAT_TEMP_FLAG_SHIFT = 10
CV_MAT_TEMP_FLAG = (1 << CV_MAT_TEMP_FLAG_SHIFT)
def CV_IS_TEMP_MAT(flags):
return ((flags) & CV_MAT_TEMP_FLAG)
CV_MAGIC_MASK = 0xFFFF0000
CV_MAT_MAGIC_VAL = 0x42420000
CV_TYPE_NAME_MAT = "opencv-matrix"
# Termination criteria for iterative algorithms
CV_TERMCRIT_ITER = 1
CV_TERMCRIT_NUMBER = CV_TERMCRIT_ITER
CV_TERMCRIT_EPS = 2
# Data structures for persistence (a.k.a serialization) functionality
CV_STORAGE_READ = 0
CV_STORAGE_WRITE = 1
CV_STORAGE_WRITE_TEXT = CV_STORAGE_WRITE
CV_STORAGE_WRITE_BINARY = CV_STORAGE_WRITE
CV_STORAGE_APPEND = 2
CV_MAX_DIM = 32
CV_FILLED = -1
CV_AA = 16
CV_VERSION = "1.0.0"
CV_MAJOR_VERSION = 1
CV_MINOR_VERSION = 0
CV_SUBMINOR_VERSION = 0
CV_WINDOW_AUTOSIZE = 1
CV_CAP_PROP_POS_MSEC = 0
CV_CAP_PROP_POS_FRAMES = 1
CV_CAP_PROP_POS_AVI_RATIO = 2
CV_CAP_PROP_FRAME_WIDTH = 3
CV_CAP_PROP_FRAME_HEIGHT = 4
CV_CAP_PROP_FPS = 5
CV_CAP_PROP_FOURCC = 6
CV_CAP_PROP_FRAME_COUNT = 7
CV_CAP_PROP_FORMAT = 8
CV_CAP_PROP_MODE = 9
CV_CAP_PROP_BRIGHTNESS =10
CV_CAP_PROP_CONTRAST =11
CV_CAP_PROP_SATURATION =12
CV_CAP_PROP_HUE =13
CV_CAP_PROP_GAIN =14
CV_CAP_PROP_CONVERT_RGB =15
def CV_FOURCC(c1,c2,c3,c4):
return (((ord(c1))&255) + (((ord(c2))&255)<<8) + (((ord(c3))&255)<<16) + (((ord(c4))&255)<<24))
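# Example: CV_FOURCC('M', 'J', 'P', 'G') packs the four characters into the
# little-endian integer FOURCC code conventionally used to select a video codec
# (here Motion-JPEG) in the video-writing functions.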
# hack the ctypes.Structure class to include printing the fields
class _Structure(Structure):
def __repr__(self):
'''Print the fields'''
res = []
for field in self._fields_:
res.append('%s=%s' % (field[0], repr(getattr(self, field[0]))))
return self.__class__.__name__ + '(' + ','.join(res) + ')'
@classmethod
def from_param(cls, obj):
'''Magically construct from a tuple'''
if isinstance(obj, cls):
return obj
if isinstance(obj, tuple):
return cls(*obj)
raise TypeError
# --- Class and function definitions ------------------------------------------
# 2D point with integer coordinates
class CvPoint(_Structure):
_fields_ = [("x", c_int),
("y", c_int)]
# 2D point with floating-point coordinates
class CvPoint2D32f(_Structure):
_fields_ = [("x", c_float),
("y", c_float)]
# 3D point with floating-point coordinates
class CvPoint3D32f(_Structure):
_fields_ = [("x", c_float),
("y", c_float),
("z", c_float)]
# 2D point with double precision floating-point coordinates
class CvPoint2D64f(_Structure):
_fields_ = [("x", c_double),
("y", c_double)]
CvPoint2D64d = CvPoint2D64f
# 3D point with double precision floating-point coordinates
class CvPoint3D64f(_Structure):
_fields_ = [("x", c_double),
("y", c_double),
("z", c_double)]
CvPoint3D64d = CvPoint3D64f
# pixel-accurate size of a rectangle
class CvSize(_Structure):
_fields_ = [("width", c_int),
("height", c_int)]
# sub-pixel accurate size of a rectangle
class CvSize2D32f(_Structure):
_fields_ = [("width", c_float),
("height", c_float)]
# offset and size of a rectangle
class CvRect(_Structure):
_fields_ = [("x", c_int),
("y", c_int),
("width", c_int),
("height", c_int)]
def bloat(self, s):
return CvRect(self.x-s, self.y-s, self.width+2*s, self.height+2*s)
# A container for 1-,2-,3- or 4-tuples of numbers
class CvScalar(_Structure):
_fields_ = [("val", c_double * 4)]
def __init__(self, *vals):
'''Enable initialization with multiple parameters instead of just a tuple'''
if len(vals) == 1:
super(CvScalar, self).__init__(vals[0])
else:
super(CvScalar, self).__init__(vals)
# Termination criteria for iterative algorithms
class CvTermCriteria(_Structure):
_fields_ = [("type", c_int),
("max_iter", c_int),
("epsilon", c_double)]
# Multi-channel matrix
class CvMat(Structure):
_fields_ = [("type", c_int),
("step", c_int),
("refcount", c_void_p),
("data", c_void_p),
("rows", c_int),
("cols", c_int)]
# Multi-dimensional dense multi-channel matrix
class CvMatNDdata(Union):
_fields_ = [("ptr", POINTER(c_ubyte)),
("s", POINTER(c_short)),
("i", POINTER(c_int)),
("fl", POINTER(c_float)),
("db", POINTER(c_double))]
class CvMatNDdim(Structure):
_fields_ = [("size", c_int),
("step", c_int)]
class CvMatND(Structure):
_fields_ = [("type", c_int),
("dims", c_int),
("refcount", c_void_p),
("data", CvMatNDdata),
("dim", CvMatNDdim*CV_MAX_DIM)]
# IPL image header
class IplImage(Structure):
_fields_ = [("nSize", c_int),
("ID", c_int),
("nChannels", c_int),
("alphaChannel", c_int),
("depth", c_int),
("colorModel", c_char * 4),
("channelSeq", c_char * 4),
("dataOrder", c_int),
("origin", c_int),
("align", c_int),
("width", c_int),
("height", c_int),
("roi", c_void_p),
("maskROI", c_void_p),
("imageID", c_void_p),
("tileInfo", c_void_p),
("imageSize", c_int),
("imageData", c_int),
("widthStep", c_int),
("BorderMode", c_int * 4),
("BorderConst", c_int * 4),
("imageDataOrigin", c_char_p)]
# List of attributes
class CvAttrList(Structure):
_fields_ = [("attr", c_void_p),
("next", c_void_p)]
# Memory storage
_lpCvMemStorage = POINTER("CvMemStorage")
class CvMemStorage(Structure):
_fields_ = [("signature", c_int),
("bottom", c_void_p),
("top", c_void_p),
("parent", _lpCvMemStorage),
("block_size", c_int),
("free_space", c_int)]
SetPointerType(_lpCvMemStorage, CvMemStorage)
class CvMemStoragePos(Structure):
_fields_ = []
# Sequence
class CvSeq(Structure):
_fields_ = [("flags", c_int),
("header_size", c_int),
("h_prev", c_void_p),
("h_next", c_void_p),
("v_prev", c_void_p),
("v_next", c_void_p),
("total", c_int),
("elem_size", c_int),
("block_max", c_void_p),
("ptr", c_void_p),
("delta_elems", c_int),
("storage", POINTER(CvMemStorage)),
("free_blocks", c_void_p),
("first", c_void_p)]
class CvSeqBlock(Structure):
_fields_ = []
class CvSeqWriter(Structure):
_fields_ = []
class CvSeqReader(Structure):
_fields_ = []
# File storage
class CvFileStorage(Structure):
_fields_ = []
# not implemented yet
class CvSparseMat(Structure):
_fields_ = []
class CvContourScanner(Structure):
_fields_ = []
class CvHistogram(Structure):
_fields_ = [('type', c_int),
('bins', c_void_p)]
class CvString(Structure):
_fields_ = []
class CvSlice(Structure):
_fields_ = []
class CvSET(Structure):
_fields_ = []
class CvGraph(Structure):
_fields_ = []
class CvGraphEdge(Structure):
_fields_ = []
class CvGraphScanner(Structure):
_fields_ = []
class CvFileNode(Structure):
_fields_ = []
class CvStringHashNode(Structure):
_fields_ = []
class CvTypeInfo(Structure):
_fields_ = []
class CvContourTree(Structure):
_fields_ = []
class CvBox2D(_Structure):
_fields_ = [('center', CvPoint2D32f),
('size', CvSize2D32f),
('angle', c_float)]
class CvSubdiv2DPoint(Structure):
_fields_ = []
class CvSubdiv2DPointLocation(Structure):
_fields_ = []
class CvKalman(Structure):
_fields_ = []
class CvConDensation(Structure):
_fields_ = []
class CvHaarClassifierCascade(Structure):
_fields_ = []
class CvPOSITObject(Structure):
_fields_ = []
class CvMatr32f(Structure):
_fields_ = []
class CvVect32f(Structure):
_fields_ = []
class CvCapture(Structure):
_fields_ = []
class CvVideoWriter(Structure):
_fields_ = []
class CvSetElem(Structure):
_fields_ = []
class CvGraphVtx(Structure):
_fields_ = []
class CvTreeNodeIterator(Structure):
_fields_ = []
class CvFont(Structure):
_fields_ = []
class CvLineIterator(Structure):
_fields_ = []
class CvModuleInfo(Structure):
_fields_ = []
class IplConvKernel(Structure):
_fields_ = []
class CvConnectedComp(_Structure):
_fields_ = [('area', c_double),
('value', CvScalar),
('rect', CvRect),
('contour', POINTER(CvSeq))]
class CvMOMENTS(Structure):
_fields_ = []
class CvHuMoments(Structure):
_fields_ = []
class CvChain(Structure):
_fields_ = []
class CvChainPtReader(Structure):
_fields_ = []
class CvContour(Structure):
_fields_ = []
class CvCmpFunc(Structure):
_fields_ = []
class CvDistanceFunction(Structure):
_fields_ = []
class CvSubdiv2D(Structure):
_fields_ = []
class CvSubdiv2DEdge(Structure):
_fields_ = []
# --- 1 Operations on Arrays -------------------------------------------------
# --- 1.1 Initialization -----------------------------------------------------
# Creates header and allocates data
cvCreateImage = cfunc('cvCreateImage', _cxDLL, POINTER(IplImage),
('size', CvSize, 1), # CvSize size
('depth', c_int, 1), # int depth
('channels', c_int, 1), # int channels
)
# Allocates, initializes, and returns structure IplImage
cvCreateImageHeader = cfunc('cvCreateImageHeader', _cxDLL, POINTER(IplImage),
('size', CvSize, 1), # CvSize size
('depth', c_int, 1), # int depth
('channels', c_int, 1), # int channels
)
# Releases header
cvReleaseImageHeader = cfunc('cvReleaseImageHeader', _cxDLL, None,
('image', ByRefArg(POINTER(IplImage)), 1), # IplImage** image
)
# Releases header and image data
cvReleaseImage = cfunc('cvReleaseImage', _cxDLL, None,
('image', ByRefArg(POINTER(IplImage)), 1), # IplImage** image
)
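# Illustrative usage of the image allocation wrappers above (a sketch, assuming the
# cxcore library referenced by _cxDLL loaded successfully):
#   img = cvCreateImage(CvSize(320, 240), IPL_DEPTH_8U, 3)   # 8-bit, 3-channel image
#   ...                                                      # use the POINTER(IplImage)
#   cvReleaseImage(img)   # ByRefArg turns the pointer into the IplImage** the C API expects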
# Initializes an image header allocated by the user
cvInitImageHeader = cfunc('cvInitImageHeader', _cxDLL, POINTER(IplImage),
('image', POINTER(IplImage), 1), # IplImage* image
('size', CvSize, 1), # CvSize size
('depth', c_int, 1), # int depth
('channels', c_int, 1), # int channels
('origin', c_int, 1, 0), # int origin
('align', c_int, 1, 4), # int align
)
# Makes a full copy of image
cvCloneImage = cfunc('cvCloneImage', _cxDLL, POINTER(IplImage),
('image', POINTER(IplImage), 1), # const IplImage* image
)
# Sets channel of interest to given value
cvSetImageCOI = cfunc('cvSetImageCOI', _cxDLL, None,
('image', POINTER(IplImage), 1), # IplImage* image
('coi', c_int, 1), # int coi
)
# Returns index of channel of interest
cvGetImageCOI = cfunc('cvGetImageCOI', _cxDLL, c_int,
('image', POINTER(IplImage), 1), # const IplImage* image
)
# Sets image ROI to given rectangle
cvSetImageROI = cfunc('cvSetImageROI', _cxDLL, None,
('image', POINTER(IplImage), 1), # IplImage* image
('rect', CvRect, 1), # CvRect rect
)
# Releases image ROI
cvResetImageROI = cfunc('cvResetImageROI', _cxDLL, None,
('image', POINTER(IplImage), 1), # IplImage* image
)
# Returns image ROI coordinates
cvGetImageROI = cfunc('cvGetImageROI', _cxDLL, CvRect,
('image', POINTER(IplImage), 1), # const IplImage* image
)
# Creates new matrix
cvCreateMat = cfunc('cvCreateMat', _cxDLL, POINTER(CvMat),
('rows', c_int, 1), # int rows
('cols', c_int, 1), # int cols
('type', c_int, 1), # int type
)
# Creates new matrix header
cvCreateMatHeader = cfunc('cvCreateMatHeader', _cxDLL, POINTER(CvMat),
('rows', c_int, 1), # int rows
('cols', c_int, 1), # int cols
('type', c_int, 1), # int type
)
# Deallocates matrix
cvReleaseMat = cfunc('cvReleaseMat', _cxDLL, None,
('mat', ByRefArg(POINTER(CvMat)), 1), # CvMat** mat
)
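# Illustrative usage (sketch): a 3x3 single-channel float matrix.
#   mat = cvCreateMat(3, 3, CV_32FC1)
#   ...
#   cvReleaseMat(mat)   # again passed through ByRefArg to match the CvMat** parameter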
# Initializes matrix header
cvInitMatHeader = cfunc('cvInitMatHeader', _cxDLL, POINTER(CvMat),
('mat', POINTER(CvMat), 1), # CvMat* mat
('rows', c_int, 1), # int rows
('cols', c_int, 1), # int cols
('type', c_int, 1), # int type
('data', c_void_p, 1, None), # void* data
('step', c_int, 1), # int step
)
# Creates matrix copy
cvCloneMat = cfunc('cvCloneMat', _cxDLL, POINTER(CvMat),
('mat', POINTER(CvMat), 1), # const CvMat* mat
)
# Creates multi-dimensional dense array
cvCreateMatND = cfunc('cvCreateMatND', _cxDLL, POINTER(CvMatND),
('dims', c_int, 1), # int dims
('sizes', POINTER(c_int), 1), # const int* sizes
('type', c_int, 1), # int type
)
# Creates new matrix header
cvCreateMatNDHeader = cfunc('cvCreateMatNDHeader', _cxDLL, POINTER(CvMatND),
('dims', c_int, 1), # int dims
('sizes', POINTER(c_int), 1), # const int* sizes
('type', c_int, 1), # int type
)
# Initializes multi-dimensional array header
cvInitMatNDHeader = cfunc('cvInitMatNDHeader', _cxDLL, POINTER(CvMatND),
('mat', POINTER(CvMatND), 1), # CvMatND* mat
('dims', c_int, 1), # int dims
('sizes', POINTER(c_int), 1), # const int* sizes
('type', c_int, 1), # int type
('data', c_void_p, 1, None), # void* data
)
# Creates full copy of multi-dimensional array
cvCloneMatND = cfunc('cvCloneMatND', _cxDLL, POINTER(CvMatND),
('mat', POINTER(CvMatND), 1), # const CvMatND* mat
)
# Allocates array data
cvCreateData = cfunc('cvCreateData', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
)
# Releases array data
cvReleaseData = cfunc('cvReleaseData', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
)
# Assigns user data to the array header
cvSetData = cfunc('cvSetData', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('data', c_void_p, 1), # void* data
('step', c_int, 1), # int step
)
# Retrieves low-level information about the array
cvGetRawData = cfunc('cvGetRawData', _cxDLL, None,
('arr', c_void_p, 1), # const CvArr* arr
('data', POINTER(POINTER(c_byte)), 1), # uchar** data
('step', POINTER(c_int), 1, None), # int* step
('roi_size', POINTER(CvSize), 1, None), # CvSize* roi_size
)
# Returns matrix header for arbitrary array
cvGetMat = cfunc('cvGetMat', _cxDLL, POINTER(CvMat),
('arr', c_void_p, 1), # const CvArr* arr
('header', POINTER(CvMat), 1), # CvMat* header
('coi', POINTER(c_int), 1, None), # int* coi
('allowND', c_int, 1, 0), # int allowND
)
# Returns image header for arbitrary array
cvGetImage = cfunc('cvGetImage', _cxDLL, POINTER(IplImage),
('arr', c_void_p, 1), # const CvArr* arr
('image_header', POINTER(IplImage), 1), # IplImage* image_header
)
# Creates sparse array
cvCreateSparseMat = cfunc('cvCreateSparseMat', _cxDLL, POINTER(CvSparseMat),
('dims', c_int, 1), # int dims
('sizes', POINTER(c_int), 1), # const int* sizes
('type', c_int, 1), # int type
)
# Deallocates sparse array
cvReleaseSparseMat = cfunc('cvReleaseSparseMat', _cxDLL, None,
('mat', ByRefArg(POINTER(CvSparseMat)), 1), # CvSparseMat** mat
)
# Creates full copy of sparse array
cvCloneSparseMat = cfunc('cvCloneSparseMat', _cxDLL, POINTER(CvSparseMat),
('mat', POINTER(CvSparseMat), 1), # const CvSparseMat* mat
)
# --- 1.2 Accessing Elements and sub-Arrays ----------------------------------
# Returns matrix header corresponding to the rectangular sub-array of input image or matrix
cvGetSubRect = cfunc('cvGetSubRect', _cxDLL, POINTER(CvMat),
('arr', c_void_p, 1), # const CvArr* arr
('submat', POINTER(CvMat), 2), # CvMat* submat
('rect', CvRect, 1), # CvRect rect
)
# Returns array row or row span
cvGetRows = cfunc('cvGetRows', _cxDLL, POINTER(CvMat),
('arr', c_void_p, 1), # const CvArr* arr
('submat', POINTER(CvMat), 1), # CvMat* submat
('start_row', c_int, 1), # int start_row
('end_row', c_int, 1), # int end_row
('delta_row', c_int, 1, 1), # int delta_row
)
# Returns array column or column span
cvGetCols = cfunc('cvGetCols', _cxDLL, POINTER(CvMat),
('arr', c_void_p, 1), # const CvArr* arr
('submat', POINTER(CvMat), 1), # CvMat* submat
('start_col', c_int, 1), # int start_col
('end_col', c_int, 1), # int end_col
)
# Returns one of array diagonals
cvGetDiag = cfunc('cvGetDiag', _cxDLL, POINTER(CvMat),
('arr', c_void_p, 1), # const CvArr* arr
('submat', POINTER(CvMat), 1), # CvMat* submat
('diag', c_int, 1, 0), # int diag
)
# Returns size of matrix or image ROI
cvGetSize = cfunc('cvGetSize', _cxDLL, CvSize,
('arr', c_void_p, 1), # const CvArr* arr
)
### Initializes sparse array elements iterator
##cvInitSparseMatIterator = _cxDLL.cvInitSparseMatIterator
##cvInitSparseMatIterator.restype = POINTER(CvSparseNode) # CvSparseNode*
##cvInitSparseMatIterator.argtypes = [
## c_void_p, # const CvSparseMat* mat
## c_void_p # CvSparseMatIterator* mat_iterator
## ]
##
### Initializes sparse array elements iterator
##cvGetNextSparseNode = _cxDLL.cvGetNextSparseNode
##cvGetNextSparseNode.restype = POINTER(CvSparseNode) # CvSparseNode*
##cvGetNextSparseNode.argtypes = [
## c_void_p # CvSparseMatIterator* mat_iterator
## ]
# Returns type of array elements
cvGetElemType = cfunc('cvGetElemType', _cxDLL, c_int,
('arr', c_void_p, 1), # const CvArr* arr
)
# Return number of array dimensions and their sizes or the size of particular dimension
cvGetDims = cfunc('cvGetDims', _cxDLL, c_int,
('arr', c_void_p, 1), # const CvArr* arr
('sizes', POINTER(c_int), 1, None), # int* sizes
)
cvGetDimSize = cfunc('cvGetDimSize', _cxDLL, c_int,
('arr', c_void_p, 1), # const CvArr* arr
('index', c_int, 1), # int index
)
# Return pointer to the particular array element
cvPtr1D = cfunc('cvPtr1D', _cxDLL, c_void_p,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
('type', POINTER(c_int), 1, None), # int* type
)
cvPtr2D = cfunc('cvPtr2D', _cxDLL, c_void_p,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('type', POINTER(c_int), 1, None), # int* type
)
cvPtr3D = cfunc('cvPtr3D', _cxDLL, c_void_p,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('idx2', c_int, 1), # int idx2
('type', POINTER(c_int), 1, None), # int* type
)
cvPtrND = cfunc('cvPtrND', _cxDLL, c_void_p,
('arr', c_void_p, 1), # const CvArr* arr
('idx', POINTER(c_int), 1), # int* idx
('type', POINTER(c_int), 1, None), # int* type
('create_node', c_int, 1, 1), # int create_node
('precalc_hashval', POINTER(c_uint32), 1, None), # unsigned* precalc_hashval
)
# Return the particular array element
cvGet1D = cfunc('cvGet1D', _cxDLL, CvScalar,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
)
cvGet2D = cfunc('cvGet2D', _cxDLL, CvScalar,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
)
cvGet3D = cfunc('cvGet3D', _cxDLL, CvScalar,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('idx2', c_int, 1), # int idx2
)
cvGetND = cfunc('cvGetND', _cxDLL, CvScalar,
('arr', c_void_p, 1), # const CvArr* arr
('idx', POINTER(c_int), 1), # int* idx
)
# Return the particular element of single-channel array
cvGetReal1D = cfunc('cvGetReal1D', _cxDLL, c_double,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
)
cvGetReal2D = cfunc('cvGetReal2D', _cxDLL, c_double,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
)
cvGetReal3D = cfunc('cvGetReal3D', _cxDLL, c_double,
('arr', c_void_p, 1), # const CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('idx2', c_int, 1), # int idx2
)
cvGetRealND = cfunc('cvGetRealND', _cxDLL, c_double,
('arr', c_void_p, 1), # const CvArr* arr
('idx', POINTER(c_int), 1), # int* idx
)
# Change the particular array element
cvSet1D = cfunc('cvSet1D', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx0', c_int, 1), # int idx0
('value', CvScalar, 1), # CvScalar value
)
cvSet2D = cfunc('cvSet2D', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('value', CvScalar, 1), # CvScalar value
)
cvSet3D = cfunc('cvSet3D', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('idx2', c_int, 1), # int idx2
('value', CvScalar, 1), # CvScalar value
)
cvSetND = cfunc('cvSetND', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx', POINTER(c_int), 1), # int* idx
('value', CvScalar, 1), # CvScalar value
)
# Change the particular element of single-channel array
cvSetReal1D = cfunc('cvSetReal1D', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx0', c_int, 1), # int idx0
('value', c_double, 1), # double value
)
cvSetReal2D = cfunc('cvSetReal2D', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('value', c_double, 1), # double value
)
cvSetReal3D = cfunc('cvSetReal3D', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx0', c_int, 1), # int idx0
('idx1', c_int, 1), # int idx1
('idx2', c_int, 1), # int idx2
('value', c_double, 1), # double value
)
cvSetRealND = cfunc('cvSetRealND', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx', POINTER(c_int), 1), # int* idx
('value', c_double, 1), # double value
)
# Clears the particular array element
cvClearND = cfunc('cvClearND', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('idx', POINTER(c_int), 1), # int* idx
)
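# Sketch of per-element access with the wrappers above (single-channel case):
#   m = cvCreateMat(2, 2, CV_64FC1)
#   cvSetReal2D(m, 0, 1, 3.5)          # m[0, 1] = 3.5
#   v = cvGetReal2D(m, 0, 1)           # v == 3.5
# Multi-channel elements go through cvGet2D/cvSet2D, which use CvScalar values.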
# --- 1.3 Copying and Filling ------------------------------------------------
# Copies one array to another
cvCopy = cfunc('cvCopy', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Sets every element of array to given value
cvSet = cfunc('cvSet', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
('value', CvScalar, 1), # CvScalar value
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Clears the array
cvSetZero = cfunc('cvSetZero', _cxDLL, None,
('arr', c_void_p, 1), # CvArr* arr
)
cvZero = cvSetZero
# --- 1.4 Transforms and Permutations ----------------------------------------
# Changes shape of matrix/image without copying data
cvReshape = cfunc('cvReshape', _cxDLL, POINTER(CvMat),
('arr', c_void_p, 1), # const CvArr* arr
('header', POINTER(CvMat), 1), # CvMat* header
('new_cn', c_int, 1), # int new_cn
('new_rows', c_int, 1, 0), # int new_rows
)
# Changes shape of multi-dimensional array w/o copying data
cvReshapeMatND = cfunc('cvReshapeMatND', _cxDLL, c_void_p,
('arr', c_void_p, 1), # const CvArr* arr
('sizeof_header', c_int, 1), # int sizeof_header
('header', c_void_p, 1), # CvArr* header
('new_cn', c_int, 1), # int new_cn
('new_dims', c_int, 1), # int new_dims
('new_sizes', POINTER(c_int), 1), # int* new_sizes
)
# Fill destination array with tiled source array
cvRepeat = cfunc('cvRepeat', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
)
# Flips a 2D array around the vertical axis, the horizontal axis, or both
cvFlip = cfunc('cvFlip', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1, None), # CvArr* dst
('flip_mode', c_int, 1, 0), # int flip_mode
)
# Divides multi-channel array into several single-channel arrays or extracts a single channel from the array
cvSplit = cfunc('cvSplit', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst0', c_void_p, 1, None), # CvArr* dst0
('dst1', c_void_p, 1, None), # CvArr* dst1
('dst2', c_void_p, 1, None), # CvArr* dst2
('dst3', c_void_p, 1, None), # CvArr* dst3
)
# Composes multi-channel array from several single-channel arrays or inserts a single channel into the array
cvMerge = cfunc('cvMerge', _cxDLL, None,
('src0', c_void_p, 1), # const CvArr* src0
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('src3', c_void_p, 1), # const CvArr* src3
('dst', c_void_p, 1), # CvArr* dst
)
# --- 1.5 Arithmetic, Logic and Comparison -----------------------------------
# Performs look-up table transform of array
cvLUT = cfunc('cvLUT', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('lut', c_void_p, 1), # const CvArr* lut
)
# Converts one array to another with optional linear transformation
cvConvertScale = cfunc('cvConvertScale', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('scale', c_double, 1, 1), # double scale
('shift', c_double, 1, 0), # double shift
)
cvCvtScale = cvConvertScale
cvScale = cvConvertScale
def cvConvert(src, dst):
cvConvertScale(src, dst, 1, 0)
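# Example: cvConvertScale doubles as a type conversion with optional scaling, e.g.
# mapping an 8-bit image into a float image in [0, 1]:
#   cvConvertScale(img8u, img32f, 1.0 / 255, 0)   # img32f must be IPL_DEPTH_32F, same size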
# Converts input array elements to 8-bit unsigned integers with optional linear transformation
cvConvertScaleAbs = cfunc('cvConvertScaleAbs', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('scale', c_double, 1, 1), # double scale
('shift', c_double, 1, 0), # double shift
)
cvCvtScaleAbs = cvConvertScaleAbs
# Computes per-element sum of two arrays
cvAdd = cfunc('cvAdd', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Computes sum of array and scalar
cvAddS = cfunc('cvAddS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', CvScalar, 1), # CvScalar value
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Computes weighted sum of two arrays
cvAddWeighted = cfunc('cvAddWeighted', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('alpha', c_double, 1), # double alpha
('src2', c_void_p, 1), # const CvArr* src2
('beta', c_double, 1), # double beta
('gamma', c_double, 1), # double gamma
('dst', c_void_p, 1), # CvArr* dst
)
# Computes per-element difference between two arrays
cvSub = cfunc('cvSub', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Computes difference between scalar and array
cvSubRS = cfunc('cvSubRS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', CvScalar, 1), # CvScalar value
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Calculates per-element product of two arrays
cvMul = cfunc('cvMul', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('scale', c_double, 1, 1), # double scale
)
# Performs per-element division of two arrays
cvDiv = cfunc('cvDiv', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('scale', c_double, 1, 1), # double scale
)
# Calculates per-element bit-wise conjunction of two arrays
cvAnd = cfunc('cvAnd', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Calculates per-element bit-wise conjunction of array and scalar
cvAndS = cfunc('cvAndS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', CvScalar, 1), # CvScalar value
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Calculates per-element bit-wise disjunction of two arrays
cvOr = cfunc('cvOr', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Calculates per-element bit-wise disjunction of array and scalar
cvOrS = cfunc('cvOrS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', CvScalar, 1), # CvScalar value
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Performs per-element bit-wise "exclusive or" operation on two arrays
cvXor = cfunc('cvXor', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Performs per-element bit-wise "exclusive or" operation on array and scalar
cvXorS = cfunc('cvXorS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', CvScalar, 1), # CvScalar value
('dst', c_void_p, 1), # CvArr* dst
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Performs per-element bit-wise inversion of array elements
cvNot = cfunc('cvNot', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
)
# Performs per-element comparison of two arrays
cvCmp = cfunc('cvCmp', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('cmp_op', c_int, 1), # int cmp_op
)
# Performs per-element comparison of array and scalar
cvCmpS = cfunc('cvCmpS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', c_double, 1), # double value
('dst', c_void_p, 1), # CvArr* dst
('cmp_op', c_int, 1), # int cmp_op
)
# Checks that array elements lie between elements of two other arrays
cvInRange = cfunc('cvInRange', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('lower', c_void_p, 1), # const CvArr* lower
('upper', c_void_p, 1), # const CvArr* upper
('dst', c_void_p, 1), # CvArr* dst
)
# Checks that array elements lie between two scalars
cvInRangeS = cfunc('cvInRangeS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('lower', CvScalar, 1), # CvScalar lower
('upper', CvScalar, 1), # CvScalar upper
('dst', c_void_p, 1), # CvArr* dst
)
# Finds per-element maximum of two arrays
cvMax = cfunc('cvMax', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
)
# Finds per-element maximum of array and scalar
cvMaxS = cfunc('cvMaxS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', c_double, 1), # double value
('dst', c_void_p, 1), # CvArr* dst
)
# Finds per-element minimum of two arrays
cvMin = cfunc('cvMin', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
)
# Finds per-element minimum of array and scalar
cvMinS = cfunc('cvMinS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('value', c_double, 1), # double value
('dst', c_void_p, 1), # CvArr* dst
)
# Calculates absolute difference between two arrays
cvAbsDiff = cfunc('cvAbsDiff', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
)
# Calculates absolute difference between array and scalar
cvAbsDiffS = cfunc('cvAbsDiffS', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('value', CvScalar, 1), # CvScalar value
)
def cvAbs(src, dst):
value = CvScalar()
value.val[0] = 0.0
value.val[1] = 0.0
value.val[2] = 0.0
value.val[3] = 0.0
cvAbsDiffS(src, dst, value)
# --- 1.6 Statistics ---------------------------------------------------------
# Counts non-zero array elements
cvCountNonZero = cfunc('cvCountNonZero', _cxDLL, c_int,
('arr', c_void_p, 1), # const CvArr* arr
)
# Summarizes array elements
cvSum = cfunc('cvSum', _cxDLL, CvScalar,
('arr', c_void_p, 1), # const CvArr* arr
)
# Calculates average (mean) of array elements
cvAvg = cfunc('cvAvg', _cxDLL, CvScalar,
('arr', c_void_p, 1), # const CvArr* arr
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Calculates average (mean) and standard deviation of array elements
cvAvgSdv = cfunc('cvAvgSdv', _cxDLL, None,
('arr', c_void_p, 1), # const CvArr* arr
('mean', POINTER(CvScalar), 1), # CvScalar* mean
('std_dev', POINTER(CvScalar), 1), # CvScalar* std_dev
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Finds global minimum and maximum in array or subarray
## cvMinMaxLoc = _cxDLL.cvMinMaxLoc
## cvMinMaxLoc.restype = None # void
## cvMinMaxLoc.argtypes = [
## c_void_p, # const CvArr* arr
## c_void_p, # double* min_val
## c_void_p, # double* max_val
## c_void_p, # CvPoint* min_loc=NULL
## c_void_p, # CvPoint* max_loc=NULL
## c_void_p # const CvArr* mask=NULL
## ]
cvMinMaxLoc = cfunc('cvMinMaxLoc', _cxDLL, None,
('image', POINTER(IplImage), 1),
('min_val', POINTER(c_double), 2),
('max_val', POINTER(c_double), 2),
('min_loc', POINTER(CvPoint), 2),
('max_loc', POINTER(CvPoint), 2),
('mask', c_void_p, 1, None))
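# Note: the parameters declared with flag 2 above are ctypes output parameters, so
# (assuming cfunc forwards these flags to the ctypes prototype as shown at the top
# of this wrapper) the call returns them instead of taking them as arguments:
#   min_val, max_val, min_loc, max_loc = cvMinMaxLoc(img)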
# Calculates absolute array norm, absolute difference norm or relative difference norm
cvNorm = cfunc('cvNorm', _cxDLL, c_double,
('arr1', c_void_p, 1), # const CvArr* arr1
('arr2', c_void_p, 1, None), # const CvArr* arr2
('norm_type', c_int, 1), # int norm_type
('mask', c_void_p, 1, None), # const CvArr* mask
)
# --- 1.7 Linear Algebra -----------------------------------------------------
# Initializes scaled identity matrix
cvSetIdentity = cfunc('cvSetIdentity', _cxDLL, None,
('mat', c_void_p, 1), # CvArr* mat
('value', CvScalar, 1), # CvScalar value
)
# Calculates dot product of two arrays in Euclidean metric
cvDotProduct = cfunc('cvDotProduct', _cxDLL, c_double,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
)
# Calculates cross product of two 3D vectors
cvCrossProduct = cfunc('cvCrossProduct', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
)
# Calculates sum of scaled array and another array
cvScaleAdd = cfunc('cvScaleAdd', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('scale', CvScalar, 1), # CvScalar scale
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
)
# Performs generalized matrix multiplication
cvGEMM = cfunc('cvGEMM', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('alpha', c_double, 1), # double alpha
('src3', c_void_p, 1), # const CvArr* src3
('beta', c_double, 1), # double beta
('dst', c_void_p, 1), # CvArr* dst
('tABC', c_int, 1, 0), # int tABC
)
def cvMatMulAdd(src1, src2, src3, dst):
cvGEMM(src1, src2, 1, src3, 1, dst, 0)
def cvMatMul(src1, src2, dst):
cvMatMulAdd(src1, src2, 0, dst)
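# Sketch: multiplying two 3x3 float matrices with the helpers above.
#   a = cvCreateMat(3, 3, CV_32FC1)
#   b = cvCreateMat(3, 3, CV_32FC1)
#   c = cvCreateMat(3, 3, CV_32FC1)
#   ... fill a and b ...
#   cvMatMul(a, b, c)   # c = a * b via cvGEMM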
# Performs matrix transform of every array element
cvTransform = cfunc('cvTransform', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('transmat', POINTER(CvMat), 1), # const CvMat* transmat
('shiftvec', POINTER(CvMat), 1, None), # const CvMat* shiftvec
)
# Performs perspective matrix transform of vector array
cvPerspectiveTransform = cfunc('cvPerspectiveTransform', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('mat', POINTER(CvMat), 1), # const CvMat* mat
)
# Calculates product of array and transposed array
cvMulTransposed = cfunc('cvMulTransposed', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('order', c_int, 1), # int order
('delta', c_void_p, 1, None), # const CvArr* delta
)
# Returns trace of matrix
cvTrace = cfunc('cvTrace', _cxDLL, CvScalar,
('mat', c_void_p, 1), # const CvArr* mat
)
# Transposes matrix
cvTranspose = cfunc('cvTranspose', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
)
# Returns determinant of matrix
cvDet = cfunc('cvDet', _cxDLL, c_double,
('mat', c_void_p, 1), # const CvArr* mat
)
# Finds inverse or pseudo-inverse of matrix
cvInvert = cfunc('cvInvert', _cxDLL, c_double,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('method', c_int, 1), # int method
)
# Solves linear system or least-squares problem
cvSolve = cfunc('cvSolve', _cxDLL, c_int,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('method', c_int, 1), # int method
)
# Performs singular value decomposition of real floating-point matrix
cvSVD = cfunc('cvSVD', _cxDLL, None,
('A', c_void_p, 1), # CvArr* A
('W', c_void_p, 1), # CvArr* W
('U', c_void_p, 1, None), # CvArr* U
('V', c_void_p, 1, None), # CvArr* V
('flags', c_int, 1, 0), # int flags
)
# Performs singular value back substitution
cvSVBkSb = cfunc('cvSVBkSb', _cxDLL, None,
('W', c_void_p, 1), # const CvArr* W
('U', c_void_p, 1), # const CvArr* U
('V', c_void_p, 1), # const CvArr* V
('B', c_void_p, 1), # const CvArr* B
('X', c_void_p, 1), # CvArr* X
('flags', c_int, 1), # int flags
)
# Computes eigenvalues and eigenvectors of symmetric matrix
cvEigenVV = cfunc('cvEigenVV', _cxDLL, None,
('mat', c_void_p, 1), # CvArr* mat
('evects', c_void_p, 1), # CvArr* evects
('evals', c_void_p, 1), # CvArr* evals
('eps', c_double, 1, 0), # double eps
)
# Calculates covariation matrix of the set of vectors
cvCalcCovarMatrix = cfunc('cvCalcCovarMatrix', _cxDLL, None,
('vects', POINTER(c_void_p), 1), # const CvArr** vects
('count', c_int, 1), # int count
('cov_mat', c_void_p, 1), # CvArr* cov_mat
('avg', c_void_p, 1), # CvArr* avg
('flags', c_int, 1), # int flags
)
# Calculates Mahalanobis distance between two vectors
cvMahalanobis = cfunc('cvMahalanobis', _cxDLL, c_double,
('vec1', c_void_p, 1), # const CvArr* vec1
('vec2', c_void_p, 1), # const CvArr* vec2
('mat', c_void_p, 1), # CvArr* mat
)
# --- 1.8 Math Functions -----------------------------------------------------
# Calculates cubic root
cvCbrt = cfunc('cvCbrt', _cxDLL, c_float,
('value', c_float, 1), # float value
)
# Calculates angle of 2D vector
cvFastArctan = cfunc('cvFastArctan', _cxDLL, c_float,
('y', c_float, 1), # float y
('x', c_float, 1), # float x
)
# Calculates magnitude and/or angle of 2d vectors
cvCartToPolar = cfunc('cvCartToPolar', _cxDLL, None,
('x', c_void_p, 1), # const CvArr* x
('y', c_void_p, 1), # const CvArr* y
('magnitude', c_void_p, 1), # CvArr* magnitude
('angle', c_void_p, 1, None), # CvArr* angle
('angle_in_degrees', c_int, 1, 0), # int angle_in_degrees
)
# Calculates cartesian coordinates of 2d vectors represented in polar form
cvPolarToCart = cfunc('cvPolarToCart', _cxDLL, None,
('magnitude', c_void_p, 1), # const CvArr* magnitude
('angle', c_void_p, 1), # const CvArr* angle
('x', c_void_p, 1), # CvArr* x
('y', c_void_p, 1), # CvArr* y
('angle_in_degrees', c_int, 1, 0), # int angle_in_degrees
)
# Raises every array element to power
cvPow = cfunc('cvPow', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('power', c_double, 1), # double power
)
# Calculates exponent of every array element
cvExp = cfunc('cvExp', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
)
# Calculates natural logarithm of the absolute value of every array element
cvLog = cfunc('cvLog', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
)
# Finds real roots of a cubic equation
cvSolveCubic = cfunc('cvSolveCubic', _cxDLL, None,
('coeffs', c_void_p, 1), # const CvArr* coeffs
('roots', c_void_p, 1), # CvArr* roots
)
# --- 1.9 Random Number Generation -------------------------------------------
# Fills array with random numbers and updates the RNG state
cvRandArr = cfunc('cvRandArr', _cxDLL, None,
('rng', c_void_p, 1), # CvRNG* rng
('arr', c_void_p, 1), # CvArr* arr
('dist_type', c_int, 1), # int dist_type
('param1', CvScalar, 1), # CvScalar param1
('param2', CvScalar, 1), # CvScalar param2
)
# --- 1.10 Discrete Transforms -----------------------------------------------
# Performs forward or inverse Discrete Fourier transform of 1D or 2D floating-point array
CV_DXT_FORWARD = 0
CV_DXT_INVERSE = 1
CV_DXT_SCALE = 2
CV_DXT_ROWS = 4
CV_DXT_INV_SCALE = CV_DXT_SCALE | CV_DXT_INVERSE
CV_DXT_INVERSE_SCALE = CV_DXT_INV_SCALE
cvDFT = cfunc('cvDFT', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('flags', c_int, 1), # int flags
('nonzero_rows', c_int, 1, 0), # int nonzero_rows
)
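# Sketch: forward and inverse transforms with the flags above.
#   cvDFT(src, dst, CV_DXT_FORWARD)          # src/dst: same-size floating-point arrays
#   cvDFT(dst, back, CV_DXT_INVERSE_SCALE)   # inverse transform with 1/N scaling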
# Returns optimal DFT size for given vector size
cvGetOptimalDFTSize = cfunc('cvGetOptimalDFTSize', _cxDLL, c_int,
('size0', c_int, 1), # int size0
)
# Performs per-element multiplication of two Fourier spectrums
cvMulSpectrums = cfunc('cvMulSpectrums', _cxDLL, None,
('src1', c_void_p, 1), # const CvArr* src1
('src2', c_void_p, 1), # const CvArr* src2
('dst', c_void_p, 1), # CvArr* dst
('flags', c_int, 1), # int flags
)
# Performs forward or inverse Discrete Cosine transform of 1D or 2D floating-point array
CV_DXT_FORWARD = 0
CV_DXT_INVERSE = 1
CV_DXT_ROWS = 4
cvDCT = cfunc('cvDCT', _cxDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('flags', c_int, 1), # int flags
)
# --- 2 Dynamic Structures ---------------------------------------------------
# --- 2.1 Memory Storages ----------------------------------------------------
# Creates memory storage
cvCreateMemStorage = cfunc('cvCreateMemStorage', _cxDLL, POINTER(CvMemStorage),
('block_size', c_int, 1, 0), # int block_size
)
# Creates child memory storage
cvCreateChildMemStorage = cfunc('cvCreateChildMemStorage', _cxDLL, POINTER(CvMemStorage),
('parent', POINTER(CvMemStorage), 1), # CvMemStorage* parent
)
# Releases memory storage
cvReleaseMemStorage = cfunc('cvReleaseMemStorage', _cxDLL, None,
('storage', POINTER(POINTER(CvMemStorage)), 1), # CvMemStorage** storage
)
# Clears memory storage
cvClearMemStorage = cfunc('cvClearMemStorage', _cxDLL, None,
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
)
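# Sketch of the usual memory-storage lifecycle for the dynamic structures below:
#   storage = cvCreateMemStorage(0)          # 0 = default block size
#   ...                                      # sequences/graphs allocate from it
#   cvClearMemStorage(storage)               # reuse the storage
#   cvReleaseMemStorage(byref(storage))      # CvMemStorage** expected, so pass byref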
# Allocates memory buffer in the storage
cvMemStorageAlloc = cfunc('cvMemStorageAlloc', _cxDLL, c_void_p,
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('size', c_ulong, 1), # size_t size
)
# Allocates text string in the storage
cvMemStorageAllocString = cfunc('cvMemStorageAllocString', _cxDLL, CvString,
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('ptr', c_char_p, 1), # const char* ptr
('len', c_int, 1), # int len
)
# Saves memory storage position
cvSaveMemStoragePos = cfunc('cvSaveMemStoragePos', _cxDLL, None,
('storage', POINTER(CvMemStorage), 1), # const CvMemStorage* storage
('pos', POINTER(CvMemStoragePos), 1), # CvMemStoragePos* pos
)
# Restores memory storage position
cvRestoreMemStoragePos = cfunc('cvRestoreMemStoragePos', _cxDLL, None,
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('pos', POINTER(CvMemStoragePos), 1), # CvMemStoragePos* pos
)
# --- 2.2 Sequences ----------------------------------------------------------
# Creates sequence
cvCreateSeq = cfunc('cvCreateSeq', _cxDLL, POINTER(CvSeq),
('seq_flags', c_int, 1), # int seq_flags
('header_size', c_int, 1), # int header_size
('elem_size', c_int, 1), # int elem_size
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
)
# Sets up sequence block size
cvSetSeqBlockSize = cfunc('cvSetSeqBlockSize', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('delta_elems', c_int, 1), # int delta_elems
)
# Adds element to sequence end
cvSeqPush = cfunc('cvSeqPush', _cxDLL, c_void_p,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('element', c_void_p, 1, None), # void* element
)
# Removes element from sequence end
cvSeqPop = cfunc('cvSeqPop', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('element', c_void_p, 1, None), # void* element
)
# Adds element to sequence beginning
cvSeqPushFront = cfunc('cvSeqPushFront', _cxDLL, c_void_p,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('element', c_void_p, 1, None), # void* element
)
# Removes element from sequence beginning
cvSeqPopFront = cfunc('cvSeqPopFront', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('element', c_void_p, 1, None), # void* element
)
# Pushes several elements to either end of the sequence
cvSeqPushMulti = cfunc('cvSeqPushMulti', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('elements', c_void_p, 1), # void* elements
('count', c_int, 1), # int count
('in_front', c_int, 1, 0), # int in_front
)
# Removes several elements from either end of the sequence
cvSeqPopMulti = cfunc('cvSeqPopMulti', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('elements', c_void_p, 1), # void* elements
('count', c_int, 1), # int count
('in_front', c_int, 1, 0), # int in_front
)
# Inserts element in sequence middle
cvSeqInsert = cfunc('cvSeqInsert', _cxDLL, c_void_p,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('before_index', c_int, 1), # int before_index
('element', c_void_p, 1, None), # void* element
)
# Removes element from sequence middle
cvSeqRemove = cfunc('cvSeqRemove', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('index', c_int, 1), # int index
)
# Clears sequence
cvClearSeq = cfunc('cvClearSeq', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
)
# Returns pointer to sequence element by its index
cvGetSeqElem = cfunc('cvGetSeqElem', _cxDLL, c_void_p,
('seq', POINTER(CvSeq), 1), # const CvSeq* seq
('index', c_int, 1), # int index
)
def CV_GET_SEQ_ELEM(TYPE, seq, index):
    result = cvGetSeqElem(seq, index)
    return cast(result, POINTER(TYPE))
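# Sketch (assuming 'points' is a CvSeq of CvPoint obtained from some cv function):
#   p = CV_GET_SEQ_ELEM(CvPoint, points, 0)   # POINTER(CvPoint) to the first element
#   x, y = p[0].x, p[0].y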
# Returns index of concrete sequence element
cvSeqElemIdx = cfunc('cvSeqElemIdx', _cxDLL, c_int,
('seq', POINTER(CvSeq), 1), # const CvSeq* seq
('element', c_void_p, 1), # const void* element
('block', POINTER(POINTER(CvSeqBlock)), 1, None), # CvSeqBlock** block
)
# Copies sequence to one continuous block of memory
cvCvtSeqToArray = cfunc('cvCvtSeqToArray', _cxDLL, c_void_p,
('seq', POINTER(CvSeq), 1), # const CvSeq* seq
('elements', c_void_p, 1), # void* elements
('slice', CvSlice, 1), # CvSlice slice
)
# Constructs sequence from array
cvMakeSeqHeaderForArray = cfunc('cvMakeSeqHeaderForArray', _cxDLL, POINTER(CvSeq),
('seq_type', c_int, 1), # int seq_type
('header_size', c_int, 1), # int header_size
('elem_size', c_int, 1), # int elem_size
('elements', c_void_p, 1), # void* elements
('total', c_int, 1), # int total
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('block', POINTER(CvSeqBlock), 1), # CvSeqBlock* block
)
# Makes separate header for the sequence slice
cvSeqSlice = cfunc('cvSeqSlice', _cxDLL, POINTER(CvSeq),
('seq', POINTER(CvSeq), 1), # const CvSeq* seq
('slice', CvSlice, 1), # CvSlice slice
('storage', POINTER(CvMemStorage), 1, None), # CvMemStorage* storage
('copy_data', c_int, 1, 0), # int copy_data
)
# Removes sequence slice
cvSeqRemoveSlice = cfunc('cvSeqRemoveSlice', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('slice', CvSlice, 1), # CvSlice slice
)
# Inserts array in the middle of sequence
cvSeqInsertSlice = cfunc('cvSeqInsertSlice', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('before_index', c_int, 1), # int before_index
('from_arr', c_void_p, 1), # const CvArr* from_arr
)
# Reverses the order of sequence elements
cvSeqInvert = cfunc('cvSeqInvert', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
)
# a < b ? -1 : a > b ? 1 : 0
CvCmpFunc = CFUNCTYPE(c_int, # int
c_void_p, # const void* a
c_void_p, # const void* b
c_void_p) # void* userdata
# Sorts sequence element using the specified comparison function
cvSeqSort = cfunc('cvSeqSort', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('func', CvCmpFunc, 1), # CvCmpFunc func
('userdata', c_void_p, 1, None), # void* userdata
)
# Searches element in sequence
cvSeqSearch = cfunc('cvSeqSearch', _cxDLL, c_void_p,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('elem', c_void_p, 1), # const void* elem
('func', CvCmpFunc, 1), # CvCmpFunc func
('is_sorted', c_int, 1), # int is_sorted
('elem_idx', POINTER(c_int), 1), # int* elem_idx
('userdata', c_void_p, 1, None), # void* userdata
)
# Initializes process of writing data to sequence
cvStartAppendToSeq = cfunc('cvStartAppendToSeq', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # CvSeq* seq
('writer', POINTER(CvSeqWriter), 1), # CvSeqWriter* writer
)
# Creates new sequence and initializes writer for it
cvStartWriteSeq = cfunc('cvStartWriteSeq', _cxDLL, None,
('seq_flags', c_int, 1), # int seq_flags
('header_size', c_int, 1), # int header_size
('elem_size', c_int, 1), # int elem_size
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('writer', POINTER(CvSeqWriter), 1), # CvSeqWriter* writer
)
# Finishes process of writing sequence
cvEndWriteSeq = cfunc('cvEndWriteSeq', _cxDLL, POINTER(CvSeq),
('writer', POINTER(CvSeqWriter), 1), # CvSeqWriter* writer
)
# Updates sequence headers from the writer state
cvFlushSeqWriter = cfunc('cvFlushSeqWriter', _cxDLL, None,
('writer', POINTER(CvSeqWriter), 1), # CvSeqWriter* writer
)
# Initializes process of sequential reading from sequence
cvStartReadSeq = cfunc('cvStartReadSeq', _cxDLL, None,
('seq', POINTER(CvSeq), 1), # const CvSeq* seq
('reader', POINTER(CvSeqReader), 1), # CvSeqReader* reader
('reverse', c_int, 1, 0), # int reverse
)
# Returns the current reader position
cvGetSeqReaderPos = cfunc('cvGetSeqReaderPos', _cxDLL, c_int,
('reader', POINTER(CvSeqReader), 1), # CvSeqReader* reader
)
# Moves the reader to specified position
cvSetSeqReaderPos = cfunc('cvSetSeqReaderPos', _cxDLL, None,
('reader', POINTER(CvSeqReader), 1), # CvSeqReader* reader
('index', c_int, 1), # int index
('is_relative', c_int, 1, 0), # int is_relative
)
# --- 2.3 Sets ---------------------------------------------------------------
# Creates empty set
cvCreateSet = cfunc('cvCreateSet', _cxDLL, POINTER(CvSET),
('set_flags', c_int, 1), # int set_flags
('header_size', c_int, 1), # int header_size
('elem_size', c_int, 1), # int elem_size
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
)
# Occupies a node in the set
cvSetAdd = cfunc('cvSetAdd', _cxDLL, c_int,
('set_header', POINTER(CvSET), 1), # CvSet* set_header
('elem', POINTER(CvSetElem), 1, None), # CvSetElem* elem
('inserted_elem', POINTER(POINTER(CvSetElem)), 1, None), # CvSetElem** inserted_elem
)
# Removes element from set
cvSetRemove = cfunc('cvSetRemove', _cxDLL, None,
('set_header', POINTER(CvSET), 1), # CvSet* set_header
('index', c_int, 1), # int index
)
# Clears set
cvClearSet = cfunc('cvClearSet', _cxDLL, None,
('set_header', POINTER(CvSET), 1), # CvSet* set_header
)
# --- 2.4 Graphs -------------------------------------------------------------
# Creates empty graph
cvCreateGraph = cfunc('cvCreateGraph', _cxDLL, POINTER(CvGraph),
('graph_flags', c_int, 1), # int graph_flags
('header_size', c_int, 1), # int header_size
('vtx_size', c_int, 1), # int vtx_size
('edge_size', c_int, 1), # int edge_size
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
)
# Adds vertex to graph
cvGraphAddVtx = cfunc('cvGraphAddVtx', _cxDLL, c_int,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('vtx', POINTER(CvGraphVtx), 1, None), # const CvGraphVtx* vtx
('inserted_vtx', POINTER(POINTER(CvGraphVtx)), 1, None), # CvGraphVtx** inserted_vtx
)
# Removes vertex from graph
cvGraphRemoveVtx = cfunc('cvGraphRemoveVtx', _cxDLL, c_int,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('index', c_int, 1), # int index
)
# Removes vertex from graph
cvGraphRemoveVtxByPtr = cfunc('cvGraphRemoveVtxByPtr', _cxDLL, c_int,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('vtx', POINTER(CvGraphVtx), 1), # CvGraphVtx* vtx
)
# Adds edge to graph
cvGraphAddEdge = cfunc('cvGraphAddEdge', _cxDLL, c_int,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('start_idx', c_int, 1), # int start_idx
('end_idx', c_int, 1), # int end_idx
('edge', POINTER(CvGraphEdge), 1, None), # const CvGraphEdge* edge
('inserted_edge', POINTER(POINTER(CvGraphEdge)), 1, None), # CvGraphEdge** inserted_edge
)
# Adds edge to graph
cvGraphAddEdgeByPtr = cfunc('cvGraphAddEdgeByPtr', _cxDLL, c_int,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('start_vtx', POINTER(CvGraphVtx), 1), # CvGraphVtx* start_vtx
('end_vtx', POINTER(CvGraphVtx), 1), # CvGraphVtx* end_vtx
('edge', POINTER(CvGraphEdge), 1, None), # const CvGraphEdge* edge
('inserted_edge', POINTER(POINTER(CvGraphEdge)), 1, None), # CvGraphEdge** inserted_edge
)
# Removes edge from graph
cvGraphRemoveEdge = cfunc('cvGraphRemoveEdge', _cxDLL, None,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('start_idx', c_int, 1), # int start_idx
('end_idx', c_int, 1), # int end_idx
)
# Removes edge from graph
cvGraphRemoveEdgeByPtr = cfunc('cvGraphRemoveEdgeByPtr', _cxDLL, None,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('start_vtx', POINTER(CvGraphVtx), 1), # CvGraphVtx* start_vtx
('end_vtx', POINTER(CvGraphVtx), 1), # CvGraphVtx* end_vtx
)
# Finds edge in graph
cvFindGraphEdge = cfunc('cvFindGraphEdge', _cxDLL, POINTER(CvGraphEdge),
('graph', POINTER(CvGraph), 1), # const CvGraph* graph
('start_idx', c_int, 1), # int start_idx
('end_idx', c_int, 1), # int end_idx
)
# Finds edge in graph
cvFindGraphEdgeByPtr = cfunc('cvFindGraphEdgeByPtr', _cxDLL, POINTER(CvGraphEdge),
('graph', POINTER(CvGraph), 1), # const CvGraph* graph
('start_vtx', POINTER(CvGraphVtx), 1), # const CvGraphVtx* start_vtx
('end_vtx', POINTER(CvGraphVtx), 1), # const CvGraphVtx* end_vtx
)
# Counts edges incident to the vertex
cvGraphVtxDegree = cfunc('cvGraphVtxDegree', _cxDLL, c_int,
('graph', POINTER(CvGraph), 1), # const CvGraph* graph
('vtx_idx', c_int, 1), # int vtx_idx
)
# Counts edges incident to the vertex (addressed by pointer)
cvGraphVtxDegreeByPtr = cfunc('cvGraphVtxDegreeByPtr', _cxDLL, c_int,
('graph', POINTER(CvGraph), 1), # const CvGraph* graph
('vtx', POINTER(CvGraphVtx), 1), # const CvGraphVtx* vtx
)
# Clears graph
cvClearGraph = cfunc('cvClearGraph', _cxDLL, None,
('graph', POINTER(CvGraph), 1), # CvGraph* graph
)
# Clone graph
cvCloneGraph = cfunc('cvCloneGraph', _cxDLL, POINTER(CvGraph),
('graph', POINTER(CvGraph), 1), # const CvGraph* graph
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
)
# Creates structure for depth-first graph traversal
cvCreateGraphScanner = cfunc('cvCreateGraphScanner', _cxDLL, POINTER(CvGraphScanner),
('graph', POINTER(CvGraph), 1), # CvGraph* graph
('vtx', POINTER(CvGraphVtx), 1, None), # CvGraphVtx* vtx
('mask', c_int, 1), # int mask
)
# Makes one or more steps of the graph traversal procedure
cvNextGraphItem = cfunc('cvNextGraphItem', _cxDLL, c_int,
('scanner', POINTER(CvGraphScanner), 1), # CvGraphScanner* scanner
)
# Finishes graph traversal procedure
cvReleaseGraphScanner = cfunc('cvReleaseGraphScanner', _cxDLL, None,
('scanner', POINTER(POINTER(CvGraphScanner)), 1), # CvGraphScanner** scanner
)
# --- 2.5 Trees --------------------------------------------------------------
# Initializes tree node iterator
cvInitTreeNodeIterator = cfunc('cvInitTreeNodeIterator', _cxDLL, None,
('tree_iterator', POINTER(CvTreeNodeIterator), 1), # CvTreeNodeIterator* tree_iterator
('first', c_void_p, 1), # const void* first
('max_level', c_int, 1), # int max_level
)
# Returns the currently observed node and moves iterator toward the next node
cvNextTreeNode = cfunc('cvNextTreeNode', _cxDLL, c_void_p,
('tree_iterator', POINTER(CvTreeNodeIterator), 1), # CvTreeNodeIterator* tree_iterator
)
# Returns the currently observed node and moves iterator toward the previous node
cvPrevTreeNode = cfunc('cvPrevTreeNode', _cxDLL, c_void_p,
('tree_iterator', POINTER(CvTreeNodeIterator), 1), # CvTreeNodeIterator* tree_iterator
)
# Gathers all node pointers to the single sequence
cvTreeToNodeSeq = cfunc('cvTreeToNodeSeq', _cxDLL, POINTER(CvSeq),
('first', c_void_p, 1), # const void* first
('header_size', c_int, 1), # int header_size
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
)
# Adds new node to the tree
cvInsertNodeIntoTree = cfunc('cvInsertNodeIntoTree', _cxDLL, None,
('node', c_void_p, 1), # void* node
('parent', c_void_p, 1), # void* parent
('frame', c_void_p, 1), # void* frame
)
# Removes node from tree
cvRemoveNodeFromTree = cfunc('cvRemoveNodeFromTree', _cxDLL, None,
('node', c_void_p, 1), # void* node
('frame', c_void_p, 1), # void* frame
)
# --- 3 Drawing Functions ----------------------------------------------------
# --- 3.1 Curves and Shapes --------------------------------------------------
# Constructs a color value
def CV_RGB(r, g, b):
result = CvScalar()
result.val[0] = b
result.val[1] = g
result.val[2] = r
return result
# Draws a line segment connecting two points
cvLine = cfunc('cvLine', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('pt1', CvPoint, 1), # CvPoint pt1
('pt2', CvPoint, 1), # CvPoint pt2
('color', CvScalar, 1), # CvScalar color
('thickness', c_int, 1, 1), # int thickness
('line_type', c_int, 1, 8), # int line_type
('shift', c_int, 1, 0), # int shift
)
# Draws simple, thick or filled rectangle
cvRectangle = cfunc('cvRectangle', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('pt1', CvPoint, 1), # CvPoint pt1
('pt2', CvPoint, 1), # CvPoint pt2
('color', CvScalar, 1), # CvScalar color
('thickness', c_int, 1, 1), # int thickness
('line_type', c_int, 1, 8), # int line_type
('shift', c_int, 1, 0), # int shift
)
# Draws a circle
cvCircle = cfunc('cvCircle', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('center', CvPoint, 1), # CvPoint center
('radius', c_int, 1), # int radius
('color', CvScalar, 1), # CvScalar color
('thickness', c_int, 1, 1), # int thickness
('line_type', c_int, 1, 8), # int line_type
('shift', c_int, 1, 0), # int shift
)
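# Drawing sketch using the helpers above (colors are built with CV_RGB, which stores
# the channels in OpenCV's BGR order inside a CvScalar):
#   canvas = cvCreateImage(CvSize(200, 200), IPL_DEPTH_8U, 3)
#   cvSetZero(canvas)
#   cvCircle(canvas, CvPoint(100, 100), 40, CV_RGB(255, 0, 0), CV_FILLED)
#   cvLine(canvas, CvPoint(0, 0), CvPoint(199, 199), CV_RGB(0, 255, 0), 2)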
# Draws simple or thick elliptic arc or fills ellipse sector
cvEllipse = cfunc('cvEllipse', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('center', CvPoint, 1), # CvPoint center
('axes', CvSize, 1), # CvSize axes
('angle', c_double, 1), # double angle
('start_angle', c_double, 1), # double start_angle
('end_angle', c_double, 1), # double end_angle
('color', CvScalar, 1), # CvScalar color
('thickness', c_int, 1, 1), # int thickness
('line_type', c_int, 1, 8), # int line_type
('shift', c_int, 1, 0), # int shift
)
def cvEllipseBox(img, box, color, thickness=1, line_type=8, shift=0):
'''Draws simple or thick elliptic arc or fills ellipse sector'''
cvEllipse(img, CvPoint(int(box.center.x), int(box.center.y)),
CvSize(int(box.size.height*0.5),int(box.size.width*0.5)),
box.angle, 0, 360, color, thickness, line_type, shift)
# Fills polygons interior
cvFillPoly = cfunc('cvFillPoly', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('pts', POINTER(POINTER(CvPoint)), 1), # CvPoint** pts
('npts', POINTER(c_int), 1), # int* npts
('contours', c_int, 1), # int contours
('color', CvScalar, 1), # CvScalar color
('line_type', c_int, 1, 8), # int line_type
('shift', c_int, 1, 0), # int shift
)
# Fills convex polygon
cvFillConvexPoly = cfunc('cvFillConvexPoly', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('pts', POINTER(CvPoint), 1), # CvPoint* pts
('npts', c_int, 1), # int npts
('color', CvScalar, 1), # CvScalar color
('line_type', c_int, 1, 8), # int line_type
('shift', c_int, 1, 0), # int shift
)
# Draws simple or thick polygons
cvPolyLine = cfunc('cvPolyLine', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('pts', POINTER(POINTER(CvPoint)), 1), # CvPoint** pts
('npts', POINTER(c_int), 1), # int* npts
('contours', c_int, 1), # int contours
('is_closed', c_int, 1), # int is_closed
('color', CvScalar, 1), # CvScalar color
('thickness', c_int, 1, 1), # int thickness
('line_type', c_int, 1, 8), # int line_type
('shift', c_int, 1, 0), # int shift
)
# --- 3.2 Text ---------------------------------------------------------------
# Initializes font structure
cvInitFont = cfunc('cvInitFont', _cxDLL, None,
('font', POINTER(CvFont), 2), # CvFont* font
('font_face', c_int, 1), # int font_face
('hscale', c_double, 1), # double hscale
('vscale', c_double, 1), # double vscale
('shear', c_double, 1, 0), # double shear
('thickness', c_int, 1, 1), # int thickness
('line_type', c_int, 1, 8), # int line_type
)
# Draws text string
cvPutText = cfunc('cvPutText', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('text', c_char_p, 1), # const char* text
('org', CvPoint, 1), # CvPoint org
('font', POINTER(CvFont), 1), # const CvFont* font
('color', CvScalar, 1), # CvScalar color
)
# Retrieves width and height of text string
cvGetTextSize = cfunc('cvGetTextSize', _cxDLL, None,
('text_string', c_char_p, 1), # const char* text_string
('font', POINTER(CvFont), 1), # const CvFont* font
('text_size', POINTER(CvSize), 1), # CvSize* text_size
('baseline', POINTER(c_int), 1), # int* baseline
)
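# Usage sketch (illustrative, not from the original source): since 'font' is declared as
# an output parameter above, the cvInitFont wrapper is assumed to return a filled CvFont;
# the font face value 0 corresponds to CV_FONT_HERSHEY_SIMPLEX in the C API.
#
#   font = cvInitFont(0, 1.0, 1.0)
#   cvPutText(img, "hello", CvPoint(10, 30), font, CV_RGB(255, 255, 255))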
# --- 3.3 Point Sets and Contours --------------------------------------------
# Draws contour outlines or interiors in the image
cvDrawContours = cfunc('cvDrawContours', _cxDLL, None,
('img', c_void_p, 1), # CvArr* img
('contour', POINTER(CvSeq), 1), # CvSeq* contour
('external_color', CvScalar, 1), # CvScalar external_color
('hole_color', CvScalar, 1), # CvScalar hole_color
('max_level', c_int, 1), # int max_level
('thickness', c_int, 1, 1), # int thickness
('line_type', c_int, 1, 8), # int line_type
)
# Initializes line iterator
cvInitLineIterator = cfunc('cvInitLineIterator', _cxDLL, c_int,
('image', c_void_p, 1), # const CvArr* image
('pt1', CvPoint, 1), # CvPoint pt1
('pt2', CvPoint, 1), # CvPoint pt2
('line_iterator', POINTER(CvLineIterator), 1), # CvLineIterator* line_iterator
('connectivity', c_int, 1, 8), # int connectivity
('left_to_right', c_int, 1, 0), # int left_to_right
)
# Clips the line against the image rectangle
cvClipLine = cfunc('cvClipLine', _cxDLL, c_int,
('img_size', CvSize, 1), # CvSize img_size
('pt1', POINTER(CvPoint), 1), # CvPoint* pt1
('pt2', POINTER(CvPoint), 1), # CvPoint* pt2
)
# Approximates elliptic arc with polyline
cvEllipse2Poly = cfunc('cvEllipse2Poly', _cxDLL, c_int,
('center', CvPoint, 1), # CvPoint center
('axes', CvSize, 1), # CvSize axes
('angle', c_int, 1), # int angle
('arc_start', c_int, 1), # int arc_start
('arc_end', c_int, 1), # int arc_end
('pts', POINTER(CvPoint), 1), # CvPoint* pts
('delta', c_int, 1), # int delta
)
# --- 4 Data Persistence and RTTI --------------------------------------------
# --- 4.1 File Storage -------------------------------------------------------
# Opens file storage for reading or writing data
cvOpenFileStorage = cfunc('cvOpenFileStorage', _cxDLL, POINTER(CvFileStorage),
('filename', c_char_p, 1), # const char* filename
('memstorage', POINTER(CvMemStorage), 1), # CvMemStorage* memstorage
('flags', c_int, 1), # int flags
)
# Releases file storage
cvReleaseFileStorage = cfunc('cvReleaseFileStorage', _cxDLL, None,
('fs', POINTER(POINTER(CvFileStorage)), 1), # CvFileStorage** fs
)
# --- 4.2 Writing Data -------------------------------------------------------
# Starts writing a new structure
cvStartWriteStruct = cfunc('cvStartWriteStruct', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('name', c_char_p, 1), # const char* name
('struct_flags', c_int, 1), # int struct_flags
('type_name', c_char_p, 1, None), # const char* type_name
('attributes', CvAttrList, 1), # CvAttrList attributes
)
# Ends writing a structure
cvEndWriteStruct = cfunc('cvEndWriteStruct', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
)
# Writes an integer value
cvWriteInt = cfunc('cvWriteInt', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('name', c_char_p, 1), # const char* name
('value', c_int, 1), # int value
)
# Writes a floating-point value
cvWriteReal = cfunc('cvWriteReal', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('name', c_char_p, 1), # const char* name
('value', c_double, 1), # double value
)
# Writes a text string
cvWriteString = cfunc('cvWriteString', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('name', c_char_p, 1), # const char* name
('str', c_char_p, 1), # const char* str
('quote', c_int, 1, 0), # int quote
)
# Writes comment
cvWriteComment = cfunc('cvWriteComment', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('comment', c_char_p, 1), # const char* comment
('eol_comment', c_int, 1), # int eol_comment
)
# Starts the next stream
cvStartNextStream = cfunc('cvStartNextStream', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
)
# Writes user object
cvWrite = cfunc('cvWrite', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('name', c_char_p, 1), # const char* name
('ptr', c_void_p, 1), # const void* ptr
('attributes', CvAttrList, 1), # CvAttrList attributes
)
# Writes multiple numbers
cvWriteRawData = cfunc('cvWriteRawData', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('src', c_void_p, 1), # const void* src
('len', c_int, 1), # int len
('dt', c_char_p, 1), # const char* dt
)
# Writes file node to another file storage
cvWriteFileNode = cfunc('cvWriteFileNode', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('new_node_name', c_char_p, 1), # const char* new_node_name
('node', POINTER(CvFileNode), 1), # const CvFileNode* node
('embed', c_int, 1), # int embed
)
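# Writing sketch (illustrative, not from the original source): stores a few scalars with
# the helpers above; CV_STORAGE_WRITE (1 in the C API) is assumed to be defined earlier
# in this module, and "params.yml" is a hypothetical file name.
#
#   fs = cvOpenFileStorage("params.yml", None, CV_STORAGE_WRITE)
#   cvWriteInt(fs, "frame_count", 100)
#   cvWriteReal(fs, "threshold", 0.5)
#   cvWriteString(fs, "camera", "left")
#   cvReleaseFileStorage(ctypes.byref(fs))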
# --- 4.3 Reading Data -------------------------------------------------------
# Retrieves one of top-level nodes of the file storage
cvGetRootFileNode = cfunc('cvGetRootFileNode', _cxDLL, POINTER(CvFileNode),
('fs', POINTER(CvFileStorage), 1), # const CvFileStorage* fs
('stream_index', c_int, 1, 0), # int stream_index
)
# Finds node in the map or file storage
cvGetFileNodeByName = cfunc('cvGetFileNodeByName', _cxDLL, POINTER(CvFileNode),
('fs', POINTER(CvFileStorage), 1), # const CvFileStorage* fs
('map', POINTER(CvFileNode), 1), # const CvFileNode* map
('name', c_char_p, 1), # const char* name
)
# Returns a unique pointer for given name
cvGetHashedKey = cfunc('cvGetHashedKey', _cxDLL, POINTER(CvStringHashNode),
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('name', c_char_p, 1), # const char* name
('len', c_int, 1), # int len
('create_missing', c_int, 1, 0), # int create_missing
)
# Finds node in the map or file storage
cvGetFileNode = cfunc('cvGetFileNode', _cxDLL, POINTER(CvFileNode),
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('map', POINTER(CvFileNode), 1), # CvFileNode* map
('key', POINTER(CvStringHashNode), 1), # const CvStringHashNode* key
('create_missing', c_int, 1, 0), # int create_missing
)
# Returns name of file node
cvGetFileNodeName = cfunc('cvGetFileNodeName', _cxDLL, c_char_p,
('node', POINTER(CvFileNode), 1), # const CvFileNode* node
)
# Decodes object and returns pointer to it
cvRead = cfunc('cvRead', _cxDLL, c_void_p,
('fs', POINTER(CvFileStorage), 1), # CvFileStorage* fs
('node', POINTER(CvFileNode), 1), # CvFileNode* node
('attributes', POINTER(CvAttrList), 1, None), # CvAttrList* attributes
)
# Reads multiple numbers
cvReadRawData = cfunc('cvReadRawData', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # const CvFileStorage* fs
('src', POINTER(CvFileNode), 1), # const CvFileNode* src
('dst', c_void_p, 1), # void* dst
('dt', c_char_p, 1), # const char* dt
)
# Initializes file node sequence reader
cvStartReadRawData = cfunc('cvStartReadRawData', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # const CvFileStorage* fs
('src', POINTER(CvFileNode), 1), # const CvFileNode* src
('reader', POINTER(CvSeqReader), 1), # CvSeqReader* reader
)
# Reads data from the file node sequence reader
cvReadRawDataSlice = cfunc('cvReadRawDataSlice', _cxDLL, None,
('fs', POINTER(CvFileStorage), 1), # const CvFileStorage* fs
('reader', POINTER(CvSeqReader), 1), # CvSeqReader* reader
('count', c_int, 1), # int count
('dst', c_void_p, 1), # void* dst
('dt', c_char_p, 1), # const char* dt
)
# --- 4.4 RTTI and Generic Functions -----------------------------------------
# Registers new type
cvRegisterType = cfunc('cvRegisterType', _cxDLL, None,
('info', POINTER(CvTypeInfo), 1), # const CvTypeInfo* info
)
# Unregisters the type
cvUnregisterType = cfunc('cvUnregisterType', _cxDLL, None,
('type_name', c_char_p, 1), # const char* type_name
)
# Returns the beginning of type list
cvFirstType = cfunc('cvFirstType', _cxDLL, POINTER(CvTypeInfo),
)
# Finds type by its name
cvFindType = cfunc('cvFindType', _cxDLL, POINTER(CvTypeInfo),
('type_name', c_char_p, 1), # const char* type_name
)
# Returns type of the object
cvTypeOf = cfunc('cvTypeOf', _cxDLL, POINTER(CvTypeInfo),
('struct_ptr', c_void_p, 1), # const void* struct_ptr
)
# Releases the object
cvRelease = cfunc('cvRelease', _cxDLL, None,
('struct_ptr', POINTER(c_void_p), 1), # void** struct_ptr
)
# Makes a clone of the object
cvClone = cfunc('cvClone', _cxDLL, c_void_p,
('struct_ptr', c_void_p, 1), # const void* struct_ptr
)
# Saves object to file
cvSave = cfunc('cvSave', _cxDLL, None,
('filename', c_char_p, 1), # const char* filename
('struct_ptr', c_void_p, 1), # const void* struct_ptr
('name', c_char_p, 1, None), # const char* name
('comment', c_char_p, 1, None), # const char* comment
('attributes', CvAttrList, 1), # CvAttrList attributes
)
# Loads object from file
cvLoad = cfunc('cvLoad', _cxDLL, c_void_p,
('filename', c_char_p, 1), # const char* filename
('memstorage', POINTER(CvMemStorage), 1, None), # CvMemStorage* memstorage
('name', c_char_p, 1, None), # const char* name
('real_name', POINTER(c_char_p), 1, None), # const char** real_name
)
# Load and cast to given type
def cvLoadCast(filename, ctype):
'''Use cvLoad and then cast the result to ctype'''
return ctypes.cast(cvLoad(filename), ctypes.POINTER(ctype))
# --- 5 Miscellaneous Functions ----------------------------------------------
# Checks every element of input array for invalid values
cvCheckArr = cfunc('cvCheckArr', _cxDLL, c_int,
('arr', c_void_p, 1), # const CvArr* arr
('flags', c_int, 1, 0), # int flags
('min_val', c_double, 1, 0), # double min_val
('max_val', c_double, 1, 0), # double max_val
)
cvCheckArray = cvCheckArr
# Splits set of vectors by given number of clusters
cvKMeans2 = cfunc('cvKMeans2', _cxDLL, None,
('samples', c_void_p, 1), # const CvArr* samples
('cluster_count', c_int, 1), # int cluster_count
('labels', c_void_p, 1), # CvArr* labels
('termcrit', CvTermCriteria, 1), # CvTermCriteria termcrit
)
# Splits sequence into equivalency classes
cvSeqPartition = cfunc('cvSeqPartition', _cxDLL, c_int,
('seq', POINTER(CvSeq), 1), # const CvSeq* seq
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('labels', POINTER(POINTER(CvSeq)), 1), # CvSeq** labels
('is_equal', CvCmpFunc, 1), # CvCmpFunc is_equal
('userdata', c_void_p, 1), # void* userdata
)
# --- 6 Error Handling and System Functions ----------------------------------
# --- 6.1 Error Handling -----------------------------------------------------
# Returns the current error status
cvGetErrStatus = cfunc('cvGetErrStatus', _cxDLL, c_int,
)
# Sets the error status
cvSetErrStatus = cfunc('cvSetErrStatus', _cxDLL, None,
('status', c_int, 1), # int status
)
# Returns the current error mode
cvGetErrMode = cfunc('cvGetErrMode', _cxDLL, c_int,
)
# Sets the error mode
CV_ErrModeLeaf = 0
CV_ErrModeParent = 1
CV_ErrModeSilent = 2
cvSetErrMode = cfunc('cvSetErrMode', _cxDLL, c_int,
('mode', c_int, 1), # int mode
)
# Raises an error
cvError = cfunc('cvError', _cxDLL, c_int,
('status', c_int, 1), # int status
('func_name', c_char_p, 1), # const char* func_name
('err_msg', c_char_p, 1), # const char* err_msg
('file_name', c_char_p, 1), # const char* file_name
('line', c_int, 1), # int line
)
# Returns textual description of error status code
cvErrorStr = cfunc('cvErrorStr', _cxDLL, c_char_p,
('status', c_int, 1), # int status
)
# Sets a new error handler
CvErrorCallback = CFUNCTYPE(c_int, # int
c_int, # int status
c_char_p, # const char* func_name
c_char_p, # const char* err_msg
c_char_p, # const char* file_name
c_int) # int line
cvRedirectError = cfunc('cvRedirectError', _cxDLL, CvErrorCallback,
('error_handler', CvErrorCallback, 1), # CvErrorCallback error_handler
('userdata', c_void_p, 1, None), # void* userdata
('prev_userdata', POINTER(c_void_p), 1, None), # void** prev_userdata
)
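# Handler sketch (illustrative, not from the original source): installs a Python callback
# through CvErrorCallback; keep a module-level reference to the callback object so it is
# not garbage-collected while installed.
#
#   def _report(status, func_name, err_msg, file_name, line):
#       print 'OpenCV error %s in %s: %s (%s:%d)' % (cvErrorStr(status), func_name, err_msg, file_name, line)
#       return 0
#   _report_c = CvErrorCallback(_report)
#   cvRedirectError(_report_c)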
# Provide standard error handling
cvNulDevReport = cfunc('cvNulDevReport', _cxDLL, c_int,
('status', c_int, 1), # int status
('func_name', c_char_p, 1), # const char* func_name
('err_msg', c_char_p, 1), # const char* err_msg
('file_name', c_char_p, 1), # const char* file_name
('line', c_int, 1), # int line
('userdata', c_void_p, 1), # void* userdata
)
cvStdErrReport = cfunc('cvStdErrReport', _cxDLL, c_int,
('status', c_int, 1), # int status
('func_name', c_char_p, 1), # const char* func_name
('err_msg', c_char_p, 1), # const char* err_msg
('file_name', c_char_p, 1), # const char* file_name
('line', c_int, 1), # int line
('userdata', c_void_p, 1), # void* userdata
)
cvGuiBoxReport = cfunc('cvGuiBoxReport', _cxDLL, c_int,
('status', c_int, 1), # int status
('func_name', c_char_p, 1), # const char* func_name
('err_msg', c_char_p, 1), # const char* err_msg
('file_name', c_char_p, 1), # const char* file_name
('line', c_int, 1), # int line
('userdata', c_void_p, 1), # void* userdata
)
# --- 6.2 System and Utility Functions ---------------------------------------
# Allocates memory buffer
cvAlloc = cfunc('cvAlloc', _cxDLL, c_void_p,
('size', c_ulong, 1), # size_t size
)
# Deallocates memory buffer
#cvFree = _cxDLL.cvFree
#cvFree.restype = None # void
#cvFree.argtypes = [
# c_void_p # void** ptr
# ]
# Returns number of tics
cvGetTickCount = cfunc('cvGetTickCount', _cxDLL, c_longlong,
)
# Returns number of tics per microsecond
cvGetTickFrequency = cfunc('cvGetTickFrequency', _cxDLL, c_double,
)
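# Timing sketch (added for illustration, not an OpenCV API): converts a difference of
# cvGetTickCount() values into milliseconds using cvGetTickFrequency(), which returns the
# number of ticks per microsecond.
def ticks_to_ms(start_ticks, end_ticks):
    '''Illustrative helper: elapsed time in milliseconds between two tick counts'''
    return (end_ticks - start_ticks) / (cvGetTickFrequency() * 1000.0)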
# Registers another module
cvRegisterModule = cfunc('cvRegisterModule', _cxDLL, c_int,
('module_info', POINTER(CvModuleInfo), 1), # const CvModuleInfo* module_info
)
# Retrieves information about the registered module(s) and plugins
cvGetModuleInfo = cfunc('cvGetModuleInfo', _cxDLL, None,
('module_name', c_char_p, 1), # const char* module_name
('version', POINTER(c_char_p), 1), # const char** version
('loaded_addon_plugins', POINTER(c_char_p), 1), # const char** loaded_addon_plugins
)
# Switches between optimized/non-optimized modes
cvUseOptimized = cfunc('cvUseOptimized', _cxDLL, c_int,
('on_off', c_int, 1), # int on_off
)
# Assigns custom/default memory managing functions
#CvAllocFunc = CFUNCTYPE(c_void_p, # void*
# c_ulong, # size_t size
# c_void_p) # void* userdata
#
#CvFreeFunc = CFUNCTYPE(c_int, # int
# c_void_p, # void* pptr
# c_void_p) # void* userdata
#cvSetMemoryManager = _cxDLL.cvSetMemoryManager
#cvSetMemoryManager.restype = None # void
#cvSetMemoryManager.argtypes = [
# CvAllocFunc, # CvAllocFunc alloc_func=NULL
# CvFreeFunc, # CvFreeFunc free_func=NULL
# c_void_p # void* userdata=NULL
# ]
# --- 1 Image Processing -----------------------------------------------------
# --- 1.1 Gradients, Edges and Corners ---------------------------------------
# Calculates first, second, third or mixed image derivatives using extended Sobel operator
cvSobel = cfunc('cvSobel', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('xorder', c_int, 1), # int xorder
('yorder', c_int, 1), # int yorder
('aperture_size', c_int, 1, 3), # int aperture_size
)
# Calculates Laplacian of the image
cvLaplace = cfunc('cvLaplace', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('aperture_size', c_int, 1, 3), # int aperture_size
)
# Implements Canny algorithm for edge detection
cvCanny = cfunc('cvCanny', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('edges', c_void_p, 1), # CvArr* edges
('threshold1', c_double, 1), # double threshold1
('threshold2', c_double, 1), # double threshold2
('aperture_size', c_int, 1, 3), # int aperture_size
)
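# Usage sketch (illustrative, not from the original source): runs Canny on an 8-bit,
# single-channel image; cvCreateImage, cvGetSize and IPL_DEPTH_8U are assumed to be
# wrapped/defined earlier in this module.
#
#   edges = cvCreateImage(cvGetSize(gray), IPL_DEPTH_8U, 1)
#   cvCanny(gray, edges, 50, 150, 3)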
# Calculates feature map for corner detection
cvPreCornerDetect = cfunc('cvPreCornerDetect', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('corners', c_void_p, 1), # CvArr* corners
('aperture_size', c_int, 1, 3), # int aperture_size
)
# Calculates eigenvalues and eigenvectors of image blocks for corner detection
cvCornerEigenValsAndVecs = cfunc('cvCornerEigenValsAndVecs', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('eigenvv', c_void_p, 1), # CvArr* eigenvv
('block_size', c_int, 1), # int block_size
('aperture_size', c_int, 1, 3), # int aperture_size
)
# Calculates minimal eigenvalue of gradient matrices for corner detection
cvCornerMinEigenVal = cfunc('cvCornerMinEigenVal', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('eigenval', c_void_p, 1), # CvArr* eigenval
('block_size', c_int, 1), # int block_size
('aperture_size', c_int, 1, 3), # int aperture_size
)
# Harris edge detector
cvCornerHarris = cfunc('cvCornerHarris', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('harris_responce', c_void_p, 1), # CvArr* harris_responce
('block_size', c_int, 1), # int block_size
('aperture_size', c_int, 1, 3), # int aperture_size
('k', c_double, 1, 0), # double k
)
# Refines corner locations
cvFindCornerSubPix = cfunc('cvFindCornerSubPix', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('corners', POINTER(CvPoint2D32f), 1), # CvPoint2D32f* corners
('count', c_int, 1), # int count
('win', CvSize, 1), # CvSize win
('zero_zone', CvSize, 1), # CvSize zero_zone
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
)
# Determines strong corners on image
cvGoodFeaturesToTrack = cfunc('cvGoodFeaturesToTrack', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('eig_image', c_void_p, 1), # CvArr* eig_image
('temp_image', c_void_p, 1), # CvArr* temp_image
('corners', POINTER(CvPoint2D32f), 1), # CvPoint2D32f* corners
('corner_count', POINTER(c_int), 1), # int* corner_count
('quality_level', c_double, 1), # double quality_level
('min_distance', c_double, 1), # double min_distance
('mask', c_void_p, 1, None), # const CvArr* mask
('block_size', c_int, 1, 3), # int block_size
('use_harris', c_int, 1, 0), # int use_harris
('k', c_double, 1, 0), # double k
)
# --- 1.2 Sampling, Interpolation and Geometrical Transforms -----------------
# Reads raster line to buffer
cvSampleLine = cfunc('cvSampleLine', _cvDLL, c_int,
('image', c_void_p, 1), # const CvArr* image
('pt1', CvPoint, 1), # CvPoint pt1
('pt2', CvPoint, 1), # CvPoint pt2
('buffer', c_void_p, 1), # void* buffer
('connectivity', c_int, 1, 8), # int connectivity
)
# Retrieves pixel rectangle from image with sub-pixel accuracy
cvGetRectSubPix = cfunc('cvGetRectSubPix', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('center', CvPoint2D32f, 1), # CvPoint2D32f center
)
# Retrieves pixel quadrangle from image with sub-pixel accuracy
cvGetQuadrangleSubPix = cfunc('cvGetQuadrangleSubPix', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('map_matrix', POINTER(CvMat), 1), # const CvMat* map_matrix
)
# Resizes image
cvResize = cfunc('cvResize', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('interpolation', c_int, 1), # int interpolation
)
# Applies affine transformation to the image
cvWarpAffine = cfunc('cvWarpAffine', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('map_matrix', POINTER(CvMat), 1), # const CvMat* map_matrix
('flags', c_int, 1), # int flags
('fillval', CvScalar, 1), # CvScalar fillval
)
# Calculates affine matrix of 2d rotation
cv2DRotationMatrix = cfunc('cv2DRotationMatrix', _cvDLL, POINTER(CvMat),
('center', CvPoint2D32f, 1), # CvPoint2D32f center
('angle', c_double, 1), # double angle
('scale', c_double, 1), # double scale
('map_matrix', POINTER(CvMat), 1), # CvMat* map_matrix
)
# Applies perspective transformation to the image
cvWarpPerspective = cfunc('cvWarpPerspective', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('map_matrix', POINTER(CvMat), 1), # const CvMat* map_matrix
('flags', c_int, 1), # int flags
('fillval', CvScalar, 1), # CvScalar fillval
)
# Calculates perspective transform from 4 corresponding points
cvGetPerspectiveTransform = cfunc('cvGetPerspectiveTransform', _cvDLL, POINTER(CvMat),
('src', POINTER(CvPoint2D32f), 1), # const CvPoint2D32f* src
('dst', POINTER(CvPoint2D32f), 1), # const CvPoint2D32f* dst
('map_matrix', POINTER(CvMat), 1), # CvMat* map_matrix
)
# Applies generic geometrical transformation to the image
cvRemap = cfunc('cvRemap', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('mapx', c_void_p, 1), # const CvArr* mapx
('mapy', c_void_p, 1), # const CvArr* mapy
('flags', c_int, 1), # int flags
('fillval', CvScalar, 1), # CvScalar fillval
)
# Remaps image to log-polar space
cvLogPolar = cfunc('cvLogPolar', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('center', CvPoint2D32f, 1), # CvPoint2D32f center
('M', c_double, 1), # double M
('flags', c_int, 1), # int flags
)
# --- 1.3 Morphological Operations -------------------------------------------
# Creates structuring element
cvCreateStructuringElementEx = cfunc('cvCreateStructuringElementEx', _cvDLL, c_void_p,
('cols', c_int, 1), # int cols
('rows', c_int, 1), # int rows
('anchor_x', c_int, 1), # int anchor_x
('anchor_y', c_int, 1), # int anchor_y
('shape', c_int, 1), # int shape
('values', POINTER(c_int), 1, None), # int* values
)
# Deletes structuring element
cvReleaseStructuringElement = cfunc('cvReleaseStructuringElement', _cvDLL, None,
('element', POINTER(POINTER(IplConvKernel)), 1), # IplConvKernel** element
)
# Erodes image by using arbitrary structuring element
cvErode = cfunc('cvErode', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('element', POINTER(IplConvKernel), 1, None), # IplConvKernel* element
('iterations', c_int, 1, 1), # int iterations
)
# Dilates image by using arbitrary structuring element
cvDilate = cfunc('cvDilate', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('element', POINTER(IplConvKernel), 1, None), # IplConvKernel* element
('iterations', c_int, 1, 1), # int iterations
)
# Performs advanced morphological transformations
cvMorphologyEx = cfunc('cvMorphologyEx', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('temp', c_void_p, 1), # CvArr* temp
('element', POINTER(IplConvKernel), 1), # IplConvKernel* element
('operation', c_int, 1), # int operation
('iterations', c_int, 1, 1), # int iterations
)
# --- 1.4 Filters and Color Conversion ---------------------------------------
# Smooths the image in one of several ways
cvSmooth = cfunc('cvSmooth', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('smoothtype', c_int, 1), # int smoothtype
('param1', c_int, 1, 3), # int param1
('param2', c_int, 1, 0), # int param2
('param3', c_double, 1, 0), # double param3
)
# Convolves image with the kernel
cvFilter2D = cfunc('cvFilter2D', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('kernel', POINTER(CvMat), 1), # const CvMat* kernel
('anchor', CvPoint, 1), # CvPoint anchor
)
# Copies image and makes border around it
cvCopyMakeBorder = cfunc('cvCopyMakeBorder', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('offset', CvPoint, 1), # CvPoint offset
('bordertype', c_int, 1), # int bordertype
('value', CvScalar, 1), # CvScalar value
)
# Calculates integral images
cvIntegral = cfunc('cvIntegral', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('sum', c_void_p, 1), # CvArr* sum
('sqsum', c_void_p, 1, None), # CvArr* sqsum
('tilted_sum', c_void_p, 1, None), # CvArr* tilted_sum
)
CV_BGR2BGRA = 0
CV_RGB2RGBA = CV_BGR2BGRA
CV_BGRA2BGR = 1
CV_RGBA2RGB = CV_BGRA2BGR
CV_BGR2RGBA = 2
CV_RGB2BGRA = CV_BGR2RGBA
CV_RGBA2BGR = 3
CV_BGRA2RGB = CV_RGBA2BGR
CV_BGR2RGB = 4
CV_RGB2BGR = CV_BGR2RGB
CV_BGRA2RGBA = 5
CV_RGBA2BGRA = CV_BGRA2RGBA
CV_BGR2GRAY = 6
CV_RGB2GRAY = 7
CV_GRAY2BGR = 8
CV_GRAY2RGB = CV_GRAY2BGR
CV_GRAY2BGRA = 9
CV_GRAY2RGBA = CV_GRAY2BGRA
CV_BGRA2GRAY = 10
CV_RGBA2GRAY = 11
CV_BGR2BGR565 = 12
CV_RGB2BGR565 = 13
CV_BGR5652BGR = 14
CV_BGR5652RGB = 15
CV_BGRA2BGR565 = 16
CV_RGBA2BGR565 = 17
CV_BGR5652BGRA = 18
CV_BGR5652RGBA = 19
CV_GRAY2BGR565 = 20
CV_BGR5652GRAY = 21
CV_BGR2BGR555 = 22
CV_RGB2BGR555 = 23
CV_BGR5552BGR = 24
CV_BGR5552RGB = 25
CV_BGRA2BGR555 = 26
CV_RGBA2BGR555 = 27
CV_BGR5552BGRA = 28
CV_BGR5552RGBA = 29
CV_GRAY2BGR555 = 30
CV_BGR5552GRAY = 31
CV_BGR2XYZ = 32
CV_RGB2XYZ = 33
CV_XYZ2BGR = 34
CV_XYZ2RGB = 35
CV_BGR2YCrCb = 36
CV_RGB2YCrCb = 37
CV_YCrCb2BGR = 38
CV_YCrCb2RGB = 39
CV_BGR2HSV = 40
CV_RGB2HSV = 41
CV_BGR2Lab = 44
CV_RGB2Lab = 45
CV_BayerBG2BGR = 46
CV_BayerGB2BGR = 47
CV_BayerRG2BGR = 48
CV_BayerGR2BGR = 49
CV_BayerBG2RGB = CV_BayerRG2BGR
CV_BayerGB2RGB = CV_BayerGR2BGR
CV_BayerRG2RGB = CV_BayerBG2BGR
CV_BayerGR2RGB = CV_BayerGB2BGR
CV_BGR2Luv = 50
CV_RGB2Luv = 51
CV_BGR2HLS = 52
CV_RGB2HLS = 53
CV_HSV2BGR = 54
CV_HSV2RGB = 55
CV_Lab2BGR = 56
CV_Lab2RGB = 57
CV_Luv2BGR = 58
CV_Luv2RGB = 59
CV_HLS2BGR = 60
CV_HLS2RGB = 61
# Converts image from one color space to another
cvCvtColor = cfunc('cvCvtColor', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('code', c_int, 1), # int code
)
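# Conversion sketch (illustrative, not from the original source): converts a BGR frame to
# grayscale with the constants above; cvCreateImage, cvGetSize and IPL_DEPTH_8U are
# assumed to be wrapped/defined earlier in this module.
#
#   gray = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 1)
#   cvCvtColor(frame, gray, CV_BGR2GRAY)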
# Applies fixed-level threshold to array elements
cvThreshold = cfunc('cvThreshold', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('threshold', c_double, 1), # double threshold
('max_value', c_double, 1), # double max_value
('threshold_type', c_int, 1), # int threshold_type
)
# Applies adaptive threshold to array
cvAdaptiveThreshold = cfunc('cvAdaptiveThreshold', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('max_value', c_double, 1), # double max_value
('adaptive_method', c_int, 1), # int adaptive_method
('threshold_type', c_int, 1), # int threshold_type
('block_size', c_int, 1, 3), # int block_size
('param1', c_double, 1, 5), # double param1
)
# --- 1.5 Pyramids and the Applications --------------------------------------
# Downsamples image
cvPyrDown = cfunc('cvPyrDown', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('filter', c_int, 1), # int filter
)
# Upsamples image
cvPyrUp = cfunc('cvPyrUp', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('filter', c_int, 1), # int filter
)
# Implements image segmentation by pyramids
cvPyrSegmentation = cfunc('cvPyrSegmentation', _cvDLL, None,
('src', POINTER(IplImage), 1), # IplImage* src
('dst', POINTER(IplImage), 1), # IplImage* dst
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('comp', POINTER(POINTER(CvSeq)), 1), # CvSeq** comp
('level', c_int, 1), # int level
('threshold1', c_double, 1), # double threshold1
('threshold2', c_double, 1), # double threshold2
)
# --- 1.6 Connected Components -----------------------------------------------
# Fills a connected component with given color
cvFloodFill = cfunc('cvFloodFill', _cvDLL, None,
('image', c_void_p, 1), # CvArr* image
('seed_point', CvPoint, 1), # CvPoint seed_point
('new_val', CvScalar, 1), # CvScalar new_val
('lo_diff', CvScalar, 1), # CvScalar lo_diff
('up_diff', CvScalar, 1), # CvScalar up_diff
('comp', POINTER(CvConnectedComp), 1, None), # CvConnectedComp* comp
('flags', c_int, 1, 4), # int flags
('mask', c_void_p, 1, None), # CvArr* mask
)
CV_FLOODFILL_FIXED_RANGE = 1 << 16
CV_FLOODFILL_MASK_ONLY = 1 << 17
# Finds contours in binary image
cvFindContours = cfunc('cvFindContours', _cvDLL, c_int,
('image', c_void_p, 1), # CvArr* image
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('first_contour', POINTER(POINTER(CvSeq)), 1), # CvSeq** first_contour
('header_size', c_int, 1), # int header_size
('mode', c_int, 1), # int mode
('method', c_int, 1), # int method
('offset', CvPoint, 1), # CvPoint offset
)
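# Contour sketch (illustrative, not from the original source): retrieves external contours
# of a binary image; cvCreateMemStorage and the CV_RETR_*/CV_CHAIN_APPROX_* constants are
# assumed to be defined earlier in this module.
#
#   storage = cvCreateMemStorage(0)
#   first = POINTER(CvSeq)()
#   n = cvFindContours(binary_img, storage, ctypes.byref(first), ctypes.sizeof(CvContour),
#                      CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE, CvPoint(0, 0))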
# Initializes contour scanning process
cvStartFindContours = cfunc('cvStartFindContours', _cvDLL, CvContourScanner,
('image', c_void_p, 1), # CvArr* image
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('header_size', c_int, 1), # int header_size
('mode', c_int, 1), # int mode
('method', c_int, 1), # int method
('offset', CvPoint, 1), # CvPoint offset
)
# Finds next contour in the image
cvFindNextContour = cfunc('cvFindNextContour', _cvDLL, POINTER(CvSeq),
('scanner', CvContourScanner, 1), # CvContourScanner scanner
)
# Replaces retrieved contour
cvSubstituteContour = cfunc('cvSubstituteContour', _cvDLL, None,
('scanner', CvContourScanner, 1), # CvContourScanner scanner
('new_contour', POINTER(CvSeq), 1), # CvSeq* new_contour
)
# Finishes scanning process
cvEndFindContours = cfunc('cvEndFindContours', _cvDLL, POINTER(CvSeq),
('scanner', POINTER(CvContourScanner), 1), # CvContourScanner* scanner
)
# --- 1.7 Image and Contour moments ------------------------------------------
# Calculates all moments up to third order of a polygon or rasterized shape
cvMoments = cfunc('cvMoments', _cvDLL, None,
('arr', c_void_p, 1), # const CvArr* arr
('moments', POINTER(CvMOMENTS), 1), # CvMoments* moments
('binary', c_int, 1, 0), # int binary
)
# Retrieves spatial moment from moment state structure
cvGetSpatialMoment = cfunc('cvGetSpatialMoment', _cvDLL, c_double,
('moments', POINTER(CvMOMENTS), 1), # CvMoments* moments
('x_order', c_int, 1), # int x_order
('y_order', c_int, 1), # int y_order
)
# Retrieves central moment from moment state structure
cvGetCentralMoment = cfunc('cvGetCentralMoment', _cvDLL, c_double,
('moments', POINTER(CvMOMENTS), 1), # CvMoments* moments
('x_order', c_int, 1), # int x_order
('y_order', c_int, 1), # int y_order
)
# Retrieves normalized central moment from moment state structure
cvGetNormalizedCentralMoment = cfunc('cvGetNormalizedCentralMoment', _cvDLL, c_double,
('moments', POINTER(CvMOMENTS), 1), # CvMoments* moments
('x_order', c_int, 1), # int x_order
('y_order', c_int, 1), # int y_order
)
# Calculates seven Hu invariants
cvGetHuMoments = cfunc('cvGetHuMoments', _cvDLL, None,
('moments', POINTER(CvMOMENTS), 1), # CvMoments* moments
('hu_moments', POINTER(CvHuMoments), 1), # CvHuMoments* hu_moments
)
# --- 1.8 Special Image Transforms -------------------------------------------
# Finds lines in binary image using Hough transform
cvHoughLines2 = cfunc('cvHoughLines2', _cvDLL, POINTER(CvSeq),
('image', c_void_p, 1), # CvArr* image
('line_storage', c_void_p, 1), # void* line_storage
('method', c_int, 1), # int method
('rho', c_double, 1), # double rho
('theta', c_double, 1), # double theta
('threshold', c_int, 1), # int threshold
('param1', c_double, 1, 0), # double param1
('param2', c_double, 1, 0), # double param2
)
# Finds circles in grayscale image using Hough transform
cvHoughCircles = cfunc('cvHoughCircles', _cvDLL, POINTER(CvSeq),
('image', c_void_p, 1), # CvArr* image
('circle_storage', c_void_p, 1), # void* circle_storage
('method', c_int, 1), # int method
('dp', c_double, 1), # double dp
('min_dist', c_double, 1), # double min_dist
('param1', c_double, 1, 100), # double param1
('param2', c_double, 1, 100), # double param2
)
# Calculates distance to closest zero pixel for all non-zero pixels of source image
cvDistTransform = cfunc('cvDistTransform', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('distance_type', c_int, 1), # int distance_type
('mask_size', c_int, 1, 3), # int mask_size
('mask', POINTER(c_float), 1, None), # const float* mask
('labels', c_void_p, 1, None), # CvArr* labels
)
# --- 1.9 Histograms ---------------------------------------------------------
CV_HIST_ARRAY = 0
CV_HIST_SPARSE = 1
# Creates histogram
cvCreateHist = cfunc('cvCreateHist', _cvDLL, POINTER(CvHistogram),
('dims', c_int, 1), # int dims
('sizes', ListPOINTER(c_int), 1), # int* sizes
('type', c_int, 1), # int type
('ranges', ListPOINTER2(c_float), 1, None), # float** ranges=NULL
('uniform', c_int, 1, 1), # int uniform=1
)
# Sets bounds of histogram bins
cvSetHistBinRanges = cfunc('cvSetHistBinRanges', _cvDLL, None,
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
('ranges', ListPOINTER2(c_float), 1), # float** ranges
('uniform', c_int, 1, 1), # int uniform
)
# Releases histogram
cvReleaseHist = cfunc('cvReleaseHist', _cvDLL, None,
('hist', ByRefArg(POINTER(CvHistogram)), 1), # CvHistogram** hist
)
# Clears histogram
cvClearHist = cfunc('cvClearHist', _cvDLL, None,
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
)
# Makes a histogram out of array
cvMakeHistHeaderForArray = cfunc('cvMakeHistHeaderForArray', _cvDLL, POINTER(CvHistogram),
('dims', c_int, 1), # int dims
('sizes', POINTER(c_int), 1), # int* sizes
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
('data', POINTER(c_float), 1), # float* data
('ranges', ListPOINTER2(c_float), 1, None), # float** ranges
('uniform', c_int, 1, 1), # int uniform
)
# Finds minimum and maximum histogram bins
cvGetMinMaxHistValue = cfunc('cvGetMinMaxHistValue', _cvDLL, None,
('hist', POINTER(CvHistogram), 1), # const CvHistogram* hist
('min_value', POINTER(c_float), 2), # float* min_value
('max_value', POINTER(c_float), 2), # float* max_value
('min_idx', POINTER(c_int), 2), # int* min_idx
('max_idx', POINTER(c_int), 2), # int* max_idx
)
# Normalizes histogram
cvNormalizeHist = cfunc('cvNormalizeHist', _cvDLL, None,
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
('factor', c_double, 1), # double factor
)
# Thresholds histogram
cvThreshHist = cfunc('cvThreshHist', _cvDLL, None,
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
('threshold', c_double, 1), # double threshold
)
CV_COMP_CORREL = 0
CV_COMP_CHISQR = 1
CV_COMP_INTERSECT = 2
CV_COMP_BHATTACHARYYA = 3
# Compares two dense histograms
cvCompareHist = cfunc('cvCompareHist', _cvDLL, c_double,
('hist1', POINTER(CvHistogram), 1), # const CvHistogram* hist1
('hist2', POINTER(CvHistogram), 1), # const CvHistogram* hist2
('method', c_int, 1), # int method
)
# Copies histogram
cvCopyHist = cfunc('cvCopyHist', _cvDLL, None,
('src', POINTER(CvHistogram), 1), # const CvHistogram* src
('dst', POINTER(POINTER(CvHistogram)), 1), # CvHistogram** dst
)
# Calculates the histogram
cvCalcHist = cfunc('cvCalcArrHist', _cvDLL, None,
('image', ListPOINTER(POINTER(IplImage)), 1), # IplImage** image
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
('accumulate', c_int, 1, 0), # int accumulate
('mask', c_void_p, 1, None), # CvArr* mask
)
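# Histogram sketch (illustrative, not from the original source): builds a 1-D, 32-bin hue
# histogram; this assumes the ListPOINTER/ListPOINTER2 converters used above accept plain
# Python lists, and that 'hue' is a single-channel IplImage prepared by the caller.
#
#   hist = cvCreateHist(1, [32], CV_HIST_ARRAY, [[0, 180]], 1)
#   cvCalcHist([hue], hist)
#   min_val, max_val, min_idx, max_idx = cvGetMinMaxHistValue(hist)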
# Calculates back projection
cvCalcBackProject = cfunc('cvCalcArrBackProject', _cvDLL, None,
('image', ListPOINTER(POINTER(IplImage)), 1), # IplImage** image
('back_project', POINTER(IplImage), 1), # IplImage* back_project
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
)
# Divides one histogram by another
cvCalcProbDensity = cfunc('cvCalcProbDensity', _cvDLL, None,
('hist1', POINTER(CvHistogram), 1), # const CvHistogram* hist1
('hist2', POINTER(CvHistogram), 1), # const CvHistogram* hist2
('dst_hist', POINTER(CvHistogram), 1), # CvHistogram* dst_hist
('scale', c_double, 1, 255), # double scale
)
# (assumes cvGetReal3D is wrapped earlier in this module, alongside cvGetReal1D/2D)
def QueryHistValue_1D(hist, i1):
    return cvGetReal1D(hist[0].bins, i1)
def QueryHistValue_2D(hist, i1, i2):
    return cvGetReal2D(hist[0].bins, i1, i2)
def QueryHistValue_3D(hist, i1, i2, i3):
    return cvGetReal3D(hist[0].bins, i1, i2, i3)
# Equalizes histogram of grayscale image
cvEqualizeHist = cfunc('cvEqualizeHist', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
)
# --- 1.10 Matching ----------------------------------------------------------
# Compares template against overlapped image regions
cvMatchTemplate = cfunc('cvMatchTemplate', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('templ', c_void_p, 1), # const CvArr* templ
('result', c_void_p, 1), # CvArr* result
('method', c_int, 1), # int method
)
# Compares two shapes
cvMatchShapes = cfunc('cvMatchShapes', _cvDLL, c_double,
('object1', c_void_p, 1), # const void* object1
('object2', c_void_p, 1), # const void* object2
('method', c_int, 1), # int method
('parameter', c_double, 1, 0), # double parameter
)
# Computes "minimal work" distance between two weighted point configurations
CvDistanceFunction = CFUNCTYPE(c_float, # float
c_void_p, # const float* f1
c_void_p, # const float* f2
c_void_p) # void* userdata
cvCalcEMD2 = cfunc('cvCalcEMD2', _cvDLL, c_float,
('signature1', c_void_p, 1), # const CvArr* signature1
('signature2', c_void_p, 1), # const CvArr* signature2
('distance_type', c_int, 1), # int distance_type
('distance_func', CvDistanceFunction, 1, None), # CvDistanceFunction distance_func
('cost_matrix', c_void_p, 1, None), # const CvArr* cost_matrix
('flow', c_void_p, 1, None), # CvArr* flow
('lower_bound', POINTER(c_float), 1, None), # float* lower_bound
('userdata', c_void_p, 1, None), # void* userdata
)
# --- 2 Structural Analysis --------------------------------------------------
# --- 2.1 Contour Processing Functions ---------------------------------------
# Approximates Freeman chain(s) with polygonal curve
cvApproxChains = cfunc('cvApproxChains', _cvDLL, POINTER(CvSeq),
('src_seq', POINTER(CvSeq), 1), # CvSeq* src_seq
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('method', c_int, 1), # int method
('parameter', c_double, 1, 0), # double parameter
('minimal_perimeter', c_int, 1, 0), # int minimal_perimeter
('recursive', c_int, 1, 0), # int recursive
)
# Initializes chain reader
cvStartReadChainPoints = cfunc('cvStartReadChainPoints', _cvDLL, None,
('chain', POINTER(CvChain), 1), # CvChain* chain
('reader', POINTER(CvChainPtReader), 1), # CvChainPtReader* reader
)
# Gets next chain point
cvReadChainPoint = cfunc('cvReadChainPoint', _cvDLL, CvPoint,
('reader', POINTER(CvChainPtReader), 1), # CvChainPtReader* reader
)
# Approximates polygonal curve(s) with desired precision
cvApproxPoly = cfunc('cvApproxPoly', _cvDLL, POINTER(CvSeq),
('src_seq', c_void_p, 1), # const void* src_seq
('header_size', c_int, 1), # int header_size
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('method', c_int, 1), # int method
('parameter', c_double, 1), # double parameter
('parameter2', c_int, 1, 0), # int parameter2
)
# Calculates up-right bounding rectangle of point set
cvBoundingRect = cfunc('cvBoundingRect', _cvDLL, CvRect,
('points', c_void_p, 1), # CvArr* points
('update', c_int, 1, 0), # int update
)
# Calculates area of the whole contour or contour section
cvContourArea = cfunc('cvContourArea', _cvDLL, c_double,
('contour', c_void_p, 1), # const CvArr* contour
('slice', CvSlice, 1), # CvSlice slice
)
# Calculates contour perimeter or curve length
cvArcLength = cfunc('cvArcLength', _cvDLL, c_double,
('curve', c_void_p, 1), # const void* curve
('slice', CvSlice, 1), # CvSlice slice
('is_closed', c_int, 1), # int is_closed
)
# Creates hierarchical representation of contour
cvCreateContourTree = cfunc('cvCreateContourTree', _cvDLL, POINTER(CvContourTree),
('contour', POINTER(CvSeq), 1), # const CvSeq* contour
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('threshold', c_double, 1), # double threshold
)
# Restores contour from tree
cvContourFromContourTree = cfunc('cvContourFromContourTree', _cvDLL, POINTER(CvSeq),
('tree', POINTER(CvContourTree), 1), # const CvContourTree* tree
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
)
# Compares two contours using their tree representations
cvMatchContourTrees = cfunc('cvMatchContourTrees', _cvDLL, c_double,
('tree1', POINTER(CvContourTree), 1), # const CvContourTree* tree1
('tree2', POINTER(CvContourTree), 1), # const CvContourTree* tree2
('method', c_int, 1), # int method
('threshold', c_double, 1), # double threshold
)
# --- 2.2 Computational Geometry ---------------------------------------------
# Finds bounding rectangle for two given rectangles
cvMaxRect = cfunc('cvMaxRect', _cvDLL, CvRect,
('rect1', POINTER(CvRect), 1), # const CvRect* rect1
('rect2', POINTER(CvRect), 1), # const CvRect* rect2
)
# Initializes point sequence header from a point vector
cvPointSeqFromMat = cfunc('cvPointSeqFromMat', _cvDLL, POINTER(CvSeq),
('seq_kind', c_int, 1), # int seq_kind
('mat', c_void_p, 1), # const CvArr* mat
('contour_header', POINTER(CvContour), 1), # CvContour* contour_header
('block', POINTER(CvSeqBlock), 1), # CvSeqBlock* block
)
# Finds box vertices
cvBoxPoints = cfunc('cvBoxPoints', _cvDLL, None,
('box', CvBox2D, 1), # CvBox2D box
('pt', CvPoint2D32f, 1), # CvPoint2D32f pt
)
# Fits ellipse to set of 2D points
cvFitEllipse2 = cfunc('cvFitEllipse2', _cvDLL, CvBox2D,
('points', c_void_p, 1), # const CvArr* points
)
# Fits line to 2D or 3D point set
cvFitLine = cfunc('cvFitLine', _cvDLL, None,
('points', c_void_p, 1), # const CvArr* points
('dist_type', c_int, 1), # int dist_type
('param', c_double, 1), # double param
('reps', c_double, 1), # double reps
('aeps', c_double, 1), # double aeps
('line', POINTER(c_float), 1), # float* line
)
# Finds convex hull of point set
cvConvexHull2 = cfunc('cvConvexHull2', _cvDLL, POINTER(CvSeq),
('input', c_void_p, 1), # const CvArr* input
('hull_storage', c_void_p, 1, None), # void* hull_storage
('orientation', c_int, 1), # int orientation
('return_points', c_int, 1, 0), # int return_points
)
# Tests contour convex
cvCheckContourConvexity = cfunc('cvCheckContourConvexity', _cvDLL, c_int,
('contour', c_void_p, 1), # const CvArr* contour
)
# Finds convexity defects of contour
cvConvexityDefects = cfunc('cvConvexityDefects', _cvDLL, POINTER(CvSeq),
('contour', c_void_p, 1), # const CvArr* contour
('convexhull', c_void_p, 1), # const CvArr* convexhull
('storage', POINTER(CvMemStorage), 1, None), # CvMemStorage* storage
)
# Point in contour test
cvPointPolygonTest = cfunc('cvPointPolygonTest', _cvDLL, c_double,
('contour', c_void_p, 1), # const CvArr* contour
('pt', CvPoint2D32f, 1), # CvPoint2D32f pt
('measure_dist', c_int, 1), # int measure_dist
)
# Finds circumscribed rectangle of minimal area for given 2D point set
cvMinAreaRect2 = cfunc('cvMinAreaRect2', _cvDLL, CvBox2D,
('points', c_void_p, 1), # const CvArr* points
('storage', POINTER(CvMemStorage), 1, None), # CvMemStorage* storage
)
# Finds circumscribed circle of minimal area for given 2D point set
cvMinEnclosingCircle = cfunc('cvMinEnclosingCircle', _cvDLL, c_int,
('points', c_void_p, 1), # const CvArr* points
('center', POINTER(CvPoint2D32f), 1), # CvPoint2D32f* center
('radius', POINTER(c_float), 1), # float* radius
)
# Calculates pair-wise geometrical histogram for contour
cvCalcPGH = cfunc('cvCalcPGH', _cvDLL, None,
('contour', POINTER(CvSeq), 1), # const CvSeq* contour
('hist', POINTER(CvHistogram), 1), # CvHistogram* hist
)
# --- 2.3 Planar Subdivisions ------------------------------------------------
# Inserts a single point to Delaunay triangulation
cvSubdivDelaunay2DInsert = cfunc('cvSubdivDelaunay2DInsert', _cvDLL, POINTER(CvSubdiv2DPoint),
('subdiv', POINTER(CvSubdiv2D), 1), # CvSubdiv2D* subdiv
('pt', CvPoint2D32f, 1), # CvPoint2D32f pt
)
# Returns the location of a point within a Delaunay triangulation
cvSubdiv2DLocate = cfunc('cvSubdiv2DLocate', _cvDLL, CvSubdiv2DPointLocation,
('subdiv', POINTER(CvSubdiv2D), 1), # CvSubdiv2D* subdiv
('pt', CvPoint2D32f, 1), # CvPoint2D32f pt
('edge', POINTER(CvSubdiv2DEdge), 1), # CvSubdiv2DEdge* edge
('vertex', POINTER(POINTER(CvSubdiv2DPoint)), 1, None), # CvSubdiv2DPoint** vertex
)
# Finds the closest subdivision vertex to given point
cvFindNearestPoint2D = cfunc('cvFindNearestPoint2D', _cvDLL, POINTER(CvSubdiv2DPoint),
('subdiv', POINTER(CvSubdiv2D), 1), # CvSubdiv2D* subdiv
('pt', CvPoint2D32f, 1), # CvPoint2D32f pt
)
# Calculates coordinates of Voronoi diagram cells
cvCalcSubdivVoronoi2D = cfunc('cvCalcSubdivVoronoi2D', _cvDLL, None,
('subdiv', POINTER(CvSubdiv2D), 1), # CvSubdiv2D* subdiv
)
# Removes all virtual points
cvClearSubdivVoronoi2D = cfunc('cvClearSubdivVoronoi2D', _cvDLL, None,
('subdiv', POINTER(CvSubdiv2D), 1), # CvSubdiv2D* subdiv
)
# --- 3 Motion Analysis and Object Tracking ----------------------------------
# --- 3.1 Accumulation of Background Statistics ------------------------------
# Adds frame to accumulator
cvAcc = cfunc('cvAcc', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('sum', c_void_p, 1), # CvArr* sum
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Adds the square of source image to accumulator
cvSquareAcc = cfunc('cvSquareAcc', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('sqsum', c_void_p, 1), # CvArr* sqsum
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Adds product of two input images to accumulator
cvMultiplyAcc = cfunc('cvMultiplyAcc', _cvDLL, None,
('image1', c_void_p, 1), # const CvArr* image1
('image2', c_void_p, 1), # const CvArr* image2
('acc', c_void_p, 1), # CvArr* acc
('mask', c_void_p, 1, None), # const CvArr* mask
)
# Updates running average
cvRunningAvg = cfunc('cvRunningAvg', _cvDLL, None,
('image', c_void_p, 1), # const CvArr* image
('acc', c_void_p, 1), # CvArr* acc
('alpha', c_double, 1), # double alpha
('mask', c_void_p, 1, None), # const CvArr* mask
)
# --- 3.2 Motion Templates ---------------------------------------------------
# Updates motion history image by moving silhouette
cvUpdateMotionHistory = cfunc('cvUpdateMotionHistory', _cvDLL, None,
('silhouette', c_void_p, 1), # const CvArr* silhouette
('mhi', c_void_p, 1), # CvArr* mhi
('timestamp', c_double, 1), # double timestamp
('duration', c_double, 1), # double duration
)
# Calculates gradient orientation of motion history image
cvCalcMotionGradient = cfunc('cvCalcMotionGradient', _cvDLL, None,
('mhi', c_void_p, 1), # const CvArr* mhi
('mask', c_void_p, 1), # CvArr* mask
('orientation', c_void_p, 1), # CvArr* orientation
('delta1', c_double, 1), # double delta1
('delta2', c_double, 1), # double delta2
('aperture_size', c_int, 1, 3), # int aperture_size
)
# Calculates global motion orientation of some selected region
cvCalcGlobalOrientation = cfunc('cvCalcGlobalOrientation', _cvDLL, c_double,
('orientation', c_void_p, 1), # const CvArr* orientation
('mask', c_void_p, 1), # const CvArr* mask
('mhi', c_void_p, 1), # const CvArr* mhi
('timestamp', c_double, 1), # double timestamp
('duration', c_double, 1), # double duration
)
# Segments whole motion into separate moving parts
cvSegmentMotion = cfunc('cvSegmentMotion', _cvDLL, POINTER(CvSeq),
('mhi', c_void_p, 1), # const CvArr* mhi
('seg_mask', c_void_p, 1), # CvArr* seg_mask
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('timestamp', c_double, 1), # double timestamp
('seg_thresh', c_double, 1), # double seg_thresh
)
# --- 3.3 Object Tracking ----------------------------------------------------
# Finds object center on back projection
cvMeanShift = cfunc('cvMeanShift', _cvDLL, c_int,
('prob_image', c_void_p, 1), # const CvArr* prob_image
('window', CvRect, 1), # CvRect window
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
('comp', POINTER(CvConnectedComp), 1), # CvConnectedComp* comp
)
# Finds object center, size, and orientation
cvCamShift = cfunc('cvCamShift', _cvDLL, c_int,
('prob_image', c_void_p, 1), # const CvArr* prob_image
('window', CvRect, 1), # CvRect window
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
('comp', POINTER(CvConnectedComp), 2), # CvConnectedComp* comp
('box', POINTER(CvBox2D), 2), # CvBox2D* box
)
# Changes contour position to minimize its energy
cvSnakeImage = cfunc('cvSnakeImage', _cvDLL, None,
('image', POINTER(IplImage), 1), # const IplImage* image
('points', POINTER(CvPoint), 1), # CvPoint* points
('length', c_int, 1), # int length
('alpha', POINTER(c_float), 1), # float* alpha
('beta', POINTER(c_float), 1), # float* beta
('gamma', POINTER(c_float), 1), # float* gamma
('coeff_usage', c_int, 1), # int coeff_usage
('win', CvSize, 1), # CvSize win
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
('calc_gradient', c_int, 1, 1), # int calc_gradient
)
# --- 3.4 Optical Flow -------------------------------------------------------
# Calculates optical flow for two images
cvCalcOpticalFlowHS = cfunc('cvCalcOpticalFlowHS', _cvDLL, None,
('prev', c_void_p, 1), # const CvArr* prev
('curr', c_void_p, 1), # const CvArr* curr
('use_previous', c_int, 1), # int use_previous
('velx', c_void_p, 1), # CvArr* velx
('vely', c_void_p, 1), # CvArr* vely
('lambda', c_double, 1), # double lambda
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
)
# Calculates optical flow for two images
cvCalcOpticalFlowLK = cfunc('cvCalcOpticalFlowLK', _cvDLL, None,
('prev', c_void_p, 1), # const CvArr* prev
('curr', c_void_p, 1), # const CvArr* curr
('win_size', CvSize, 1), # CvSize win_size
('velx', c_void_p, 1), # CvArr* velx
('vely', c_void_p, 1), # CvArr* vely
)
# Calculates optical flow for two images by block matching method
cvCalcOpticalFlowBM = cfunc('cvCalcOpticalFlowBM', _cvDLL, None,
('prev', c_void_p, 1), # const CvArr* prev
('curr', c_void_p, 1), # const CvArr* curr
('block_size', CvSize, 1), # CvSize block_size
('shift_size', CvSize, 1), # CvSize shift_size
('max_range', CvSize, 1), # CvSize max_range
('use_previous', c_int, 1), # int use_previous
('velx', c_void_p, 1), # CvArr* velx
('vely', c_void_p, 1), # CvArr* vely
)
# Calculates optical flow for a sparse feature set using iterative Lucas-Kanade method in pyramids
cvCalcOpticalFlowPyrLK = cfunc('cvCalcOpticalFlowPyrLK', _cvDLL, None,
('prev', c_void_p, 1), # const CvArr* prev
('curr', c_void_p, 1), # const CvArr* curr
('prev_pyr', c_void_p, 1), # CvArr* prev_pyr
('curr_pyr', c_void_p, 1), # CvArr* curr_pyr
('prev_features', POINTER(CvPoint2D32f), 1), # const CvPoint2D32f* prev_features
('curr_features', POINTER(CvPoint2D32f), 1), # CvPoint2D32f* curr_features
('count', c_int, 1), # int count
('win_size', CvSize, 1), # CvSize win_size
('level', c_int, 1), # int level
('status', c_char_p, 1), # char* status
('track_error', POINTER(c_float), 1), # float* track_error
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
('flags', c_int, 1), # int flags
)
# --- 3.5 Estimators ---------------------------------------------------------
# Allocates Kalman filter structure
cvCreateKalman = cfunc('cvCreateKalman', _cvDLL, POINTER(CvKalman),
('dynam_params', c_int, 1), # int dynam_params
('measure_params', c_int, 1), # int measure_params
('control_params', c_int, 1, 0), # int control_params
)
# Deallocates Kalman filter structure
cvReleaseKalman = cfunc('cvReleaseKalman', _cvDLL, None,
('kalman', POINTER(POINTER(CvKalman)), 1), # CvKalman** kalman
)
# Estimates subsequent model state
cvKalmanPredict = cfunc('cvKalmanPredict', _cvDLL, POINTER(CvMat),
('kalman', POINTER(CvKalman), 1), # CvKalman* kalman
('control', POINTER(CvMat), 1, None), # const CvMat* control
)
cvKalmanUpdateByTime = cvKalmanPredict
# Adjusts model state
cvKalmanCorrect = cfunc('cvKalmanCorrect', _cvDLL, POINTER(CvMat),
('kalman', POINTER(CvKalman), 1), # CvKalman* kalman
('measurement', POINTER(CvMat), 1), # const CvMat* measurement
)
cvKalmanUpdateByMeasurement = cvKalmanCorrect
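# Filter sketch (illustrative, not from the original source): one predict/correct cycle of
# a 4-state, 2-measurement Kalman filter; 'z' is assumed to be a 2x1 CvMat measurement
# prepared by the caller (e.g. with a cvCreateMat wrapper defined earlier in this module).
#
#   kalman = cvCreateKalman(4, 2)
#   prediction = cvKalmanPredict(kalman)
#   corrected = cvKalmanCorrect(kalman, z)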
# Allocates ConDensation filter structure
cvCreateConDensation = cfunc('cvCreateConDensation', _cvDLL, POINTER(CvConDensation),
('dynam_params', c_int, 1), # int dynam_params
('measure_params', c_int, 1), # int measure_params
('sample_count', c_int, 1), # int sample_count
)
# Deallocates ConDensation filter structure
cvReleaseConDensation = cfunc('cvReleaseConDensation', _cvDLL, None,
('condens', POINTER(POINTER(CvConDensation)), 1), # CvConDensation** condens
)
# Initializes sample set for ConDensation algorithm
cvConDensInitSampleSet = cfunc('cvConDensInitSampleSet', _cvDLL, None,
('condens', POINTER(CvConDensation), 1), # CvConDensation* condens
('lower_bound', POINTER(CvMat), 1), # CvMat* lower_bound
('upper_bound', POINTER(CvMat), 1), # CvMat* upper_bound
)
# Estimates subsequent model state
cvConDensUpdateByTime = cfunc('cvConDensUpdateByTime', _cvDLL, None,
('condens', POINTER(CvConDensation), 1), # CvConDensation* condens
)
# --- 4 Pattern Recognition --------------------------------------------------
# --- 4.1 Object Detection ---------------------------------------------------
# Loads a trained cascade classifier from file or the classifier database embedded in OpenCV
cvLoadHaarClassifierCascade = cfunc('cvLoadHaarClassifierCascade', _cvDLL, POINTER(CvHaarClassifierCascade),
('directory', c_char_p, 1), # const char* directory
('orig_window_size', CvSize, 1), # CvSize orig_window_size
)
# Releases haar classifier cascade
cvReleaseHaarClassifierCascade = cfunc('cvReleaseHaarClassifierCascade', _cvDLL, None,
('cascade', POINTER(POINTER(CvHaarClassifierCascade)), 1), # CvHaarClassifierCascade** cascade
)
# Detects objects in the image
cvHaarDetectObjects = cfunc('cvHaarDetectObjects', _cvDLL, POINTER(CvSeq),
('image', c_void_p, 1), # const CvArr* image
('cascade', POINTER(CvHaarClassifierCascade), 1), # CvHaarClassifierCascade* cascade
('storage', POINTER(CvMemStorage), 1), # CvMemStorage* storage
('scale_factor', c_double, 1, 1), # double scale_factor
('min_neighbors', c_int, 1, 3), # int min_neighbors
('flags', c_int, 1, 0), # int flags
('min_size', CvSize, 1), # CvSize min_size
)
def ChangeCvSeqToCvRect(result, func, args):
'''Handle the casting to extract a list of Rects from the Seq returned'''
res = []
for i in xrange(result[0].total):
f = cvGetSeqElem(result, i)
r = ctypes.cast(f, ctypes.POINTER(CvRect))[0]
res.append(r)
return res
cvHaarDetectObjects.errcheck = ChangeCvSeqToCvRect
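# Detection sketch (illustrative, not from the original source): thanks to the errcheck
# hook above, cvHaarDetectObjects returns a plain list of CvRect; cvCreateMemStorage is
# assumed to be wrapped earlier in this module and the cascade path is hypothetical.
#
#   cascade = cvLoadCast("haarcascade_frontalface_alt.xml", CvHaarClassifierCascade)
#   storage = cvCreateMemStorage(0)
#   for r in cvHaarDetectObjects(gray, cascade, storage, 1.2, 3, 0, CvSize(30, 30)):
#       cvRectangle(gray, CvPoint(r.x, r.y), CvPoint(r.x + r.width, r.y + r.height),
#                   CV_RGB(255, 0, 0), 2)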
# Assigns images to the hidden cascade
cvSetImagesForHaarClassifierCascade = cfunc('cvSetImagesForHaarClassifierCascade', _cvDLL, None,
('cascade', POINTER(CvHaarClassifierCascade), 1), # CvHaarClassifierCascade* cascade
('sum', c_void_p, 1), # const CvArr* sum
('sqsum', c_void_p, 1), # const CvArr* sqsum
('tilted_sum', c_void_p, 1), # const CvArr* tilted_sum
('scale', c_double, 1), # double scale
)
# Runs cascade of boosted classifier at given image location
cvRunHaarClassifierCascade = cfunc('cvRunHaarClassifierCascade', _cvDLL, c_int,
('cascade', POINTER(CvHaarClassifierCascade), 1), # CvHaarClassifierCascade* cascade
('pt', CvPoint, 1), # CvPoint pt
('start_stage', c_int, 1, 0), # int start_stage
)
# --- 5 Camera Calibration and 3D Reconstruction -----------------------------
# --- 5.1 Camera Calibration -------------------------------------------------
# Projects 3D points to image plane
cvProjectPoints2 = cfunc('cvProjectPoints2', _cvDLL, None,
('object_points', POINTER(CvMat), 1), # const CvMat* object_points
('rotation_vector', POINTER(CvMat), 1), # const CvMat* rotation_vector
('translation_vector', POINTER(CvMat), 1), # const CvMat* translation_vector
('intrinsic_matrix', POINTER(CvMat), 1), # const CvMat* intrinsic_matrix
('distortion_coeffs', POINTER(CvMat), 1), # const CvMat* distortion_coeffs
('image_points', POINTER(CvMat), 1), # CvMat* image_points
('dpdrot', POINTER(CvMat), 1, None), # CvMat* dpdrot
('dpdt', POINTER(CvMat), 1, None), # CvMat* dpdt
('dpdf', POINTER(CvMat), 1, None), # CvMat* dpdf
('dpdc', POINTER(CvMat), 1, None), # CvMat* dpdc
('dpddist', POINTER(CvMat), 1, None), # CvMat* dpddist
)
# Finds perspective transformation between two planes
cvFindHomography = cfunc('cvFindHomography', _cvDLL, None,
('src_points', POINTER(CvMat), 1), # const CvMat* src_points
('dst_points', POINTER(CvMat), 1), # const CvMat* dst_points
('homography', POINTER(CvMat), 1), # CvMat* homography
)
# Finds intrinsic and extrinsic camera parameters using calibration pattern
cvCalibrateCamera2 = cfunc('cvCalibrateCamera2', _cvDLL, None,
('object_points', POINTER(CvMat), 1), # const CvMat* object_points
('image_points', POINTER(CvMat), 1), # const CvMat* image_points
('point_counts', POINTER(CvMat), 1), # const CvMat* point_counts
('image_size', CvSize, 1), # CvSize image_size
('intrinsic_matrix', POINTER(CvMat), 1), # CvMat* intrinsic_matrix
('distortion_coeffs', POINTER(CvMat), 1), # CvMat* distortion_coeffs
('rotation_vectors', POINTER(CvMat), 1, None), # CvMat* rotation_vectors
('translation_vectors', POINTER(CvMat), 1, None), # CvMat* translation_vectors
('flags', c_int, 1, 0), # int flags
)
# Finds extrinsic camera parameters for particular view
cvFindExtrinsicCameraParams2 = cfunc('cvFindExtrinsicCameraParams2', _cvDLL, None,
('object_points', POINTER(CvMat), 1), # const CvMat* object_points
('image_points', POINTER(CvMat), 1), # const CvMat* image_points
('intrinsic_matrix', POINTER(CvMat), 1), # const CvMat* intrinsic_matrix
('distortion_coeffs', POINTER(CvMat), 1), # const CvMat* distortion_coeffs
('rotation_vector', POINTER(CvMat), 1), # CvMat* rotation_vector
('translation_vector', POINTER(CvMat), 1), # CvMat* translation_vector
)
# Converts rotation matrix to rotation vector or vice versa
cvRodrigues2 = cfunc('cvRodrigues2', _cvDLL, c_int,
('src', POINTER(CvMat), 1), # const CvMat* src
('dst', POINTER(CvMat), 1), # CvMat* dst
('jacobian', POINTER(CvMat), 1, 0), # CvMat* jacobian
)
# Transforms image to compensate lens distortion
cvUndistort2 = cfunc('cvUndistort2', _cvDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('intrinsic_matrix', POINTER(CvMat), 1), # const CvMat* intrinsic_matrix
('distortion_coeffs', POINTER(CvMat), 1), # const CvMat* distortion_coeffs
)
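# Usage sketch (illustrative, with assumed inputs): undistort a frame once the
# camera matrix and distortion coefficients are available, e.g. produced by
# cvCalibrateCamera2 above or loaded from a calibration file.
def _example_undistort(img, intrinsic_matrix, distortion_coeffs):
    out = cvCloneImage(img)        # leave the original image untouched
    cvUndistort2(img, out, intrinsic_matrix, distortion_coeffs)
    return out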
# Computes undistortion map
cvInitUndistortMap = cfunc('cvInitUndistortMap', _cvDLL, None,
('intrinsic_matrix', POINTER(CvMat), 1), # const CvMat* intrinsic_matrix
('distortion_coeffs', POINTER(CvMat), 1), # const CvMat* distortion_coeffs
('mapx', c_void_p, 1), # CvArr* mapx
('mapy', c_void_p, 1), # CvArr* mapy
)
# Finds positions of internal corners of the chessboard
cvFindChessboardCorners = cfunc('cvFindChessboardCorners', _cvDLL, c_int,
('image', c_void_p, 1), # const void* image
('pattern_size', CvSize, 1), # CvSize pattern_size
('corners', POINTER(CvPoint2D32f), 1), # CvPoint2D32f* corners
('corner_count', POINTER(c_int), 1, None), # int* corner_count
('flags', c_int, 1), # int flags
)
# Renders the detected chessboard corners
cvDrawChessboardCorners = cfunc('cvDrawChessboardCorners', _cvDLL, None,
('image', c_void_p, 1), # CvArr* image
('pattern_size', CvSize, 1), # CvSize pattern_size
('corners', POINTER(CvPoint2D32f), 1), # CvPoint2D32f* corners
('count', c_int, 1), # int count
('pattern_was_found', c_int, 1), # int pattern_was_found
)
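# Usage sketch (illustrative): find and draw chessboard corners for a pattern
# with nx x ny inner corners.  The main point is the ctypes plumbing -- the
# wrapper expects a pre-allocated CvPoint2D32f buffer and an int for the corner
# count.  The flag value 0 and the pattern size are placeholders.
def _example_chessboard(img, nx=7, ny=6):
    corners = (CvPoint2D32f * (nx * ny))()
    count = c_int(0)
    found = cvFindChessboardCorners(img, CvSize(nx, ny), corners,
                                    ctypes.byref(count), 0)
    cvDrawChessboardCorners(img, CvSize(nx, ny), corners, count.value, found)
    return found, [corners[i] for i in range(count.value)]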
# --- 5.2 Pose Estimation ----------------------------------------------------
# Initializes structure containing object information
cvCreatePOSITObject = cfunc('cvCreatePOSITObject', _cvDLL, POINTER(CvPOSITObject),
('points', POINTER(CvPoint3D32f), 1), # CvPoint3D32f* points
('point_count', c_int, 1), # int point_count
)
# Implements POSIT algorithm
cvPOSIT = cfunc('cvPOSIT', _cvDLL, None,
('posit_object', POINTER(CvPOSITObject), 1), # CvPOSITObject* posit_object
('image_points', POINTER(CvPoint2D32f), 1), # CvPoint2D32f* image_points
('focal_length', c_double, 1), # double focal_length
('criteria', CvTermCriteria, 1), # CvTermCriteria criteria
('rotation_matrix', CvMatr32f, 1), # CvMatr32f rotation_matrix
('translation_vector', CvVect32f, 1), # CvVect32f translation_vector
)
# Deallocates 3D object structure
cvReleasePOSITObject = cfunc('cvReleasePOSITObject', _cvDLL, None,
('posit_object', POINTER(POINTER(CvPOSITObject)), 1), # CvPOSITObject** posit_object
)
# Calculates homography matrix for oblong planar object (e.g. arm)
cvCalcImageHomography = cfunc('cvCalcImageHomography', _cvDLL, None,
('line', POINTER(c_float), 1), # float* line
('center', POINTER(CvPoint3D32f), 1), # CvPoint3D32f* center
('intrinsic', POINTER(c_float), 1), # float* intrinsic
('homography', POINTER(c_float), 1), # float* homography
)
# --- 5.3 Epipolar Geometry --------------------------------------------------
# Calculates fundamental matrix from corresponding points in two images
cvFindFundamentalMat = cfunc('cvFindFundamentalMat', _cvDLL, c_int,
('points1', POINTER(CvMat), 1), # const CvMat* points1
('points2', POINTER(CvMat), 1), # const CvMat* points2
('fundamental_matrix', POINTER(CvMat), 1), # CvMat* fundamental_matrix
('method', c_int, 1), # int method
('param1', c_double, 1, 1), # double param1
('param2', c_double, 1, 0), # double param2
('status', POINTER(CvMat), 1, None), # CvMat* status
)
# For points in one image of stereo pair computes the corresponding epilines in the other image
cvComputeCorrespondEpilines = cfunc('cvComputeCorrespondEpilines', _cvDLL, None,
('points', POINTER(CvMat), 1), # const CvMat* points
('which_image', c_int, 1), # int which_image
('fundamental_matrix', POINTER(CvMat), 1), # const CvMat* fundamental_matrix
('correspondent_lines', POINTER(CvMat), 1), # CvMat* correspondent_lines
)
# Converts points to/from homogeneous coordinates
cvConvertPointsHomogenious = cfunc('cvConvertPointsHomogenious', _cvDLL, None,
('src', POINTER(CvMat), 1), # const CvMat* src
('dst', POINTER(CvMat), 1), # CvMat* dst
)
# --- 1 Simple GUI -----------------------------------------------------------
# Creates window
cvNamedWindow = cfunc('cvNamedWindow', _hgDLL, c_int,
('name', c_char_p, 1), # const char* name
('flags', c_int, 1, 1), # int flags
)
# Destroys a window
cvDestroyWindow = cfunc('cvDestroyWindow', _hgDLL, None,
('name', c_char_p, 1), # const char* name
)
# Destroys all the HighGUI windows
cvDestroyAllWindows = cfunc('cvDestroyAllWindows', _hgDLL, None,
)
# Sets window size
cvResizeWindow = cfunc('cvResizeWindow', _hgDLL, None,
('name', c_char_p, 1), # const char* name
('width', c_int, 1), # int width
('height', c_int, 1), # int height
)
# Sets window position
cvMoveWindow = cfunc('cvMoveWindow', _hgDLL, None,
('name', c_char_p, 1), # const char* name
('x', c_int, 1), # int x
('y', c_int, 1), # int y
)
# Gets window handle by name
cvGetWindowHandle = cfunc('cvGetWindowHandle', _hgDLL, c_void_p,
('name', c_char_p, 1), # const char* name
)
# Gets window name by handle
cvGetWindowName = cfunc('cvGetWindowName', _hgDLL, c_void_p,
('window_handle', c_void_p, 1), # void* window_handle
)
# Shows the image in the specified window
cvShowImage = cfunc('cvShowImage', _hgDLL, None,
('name', c_char_p, 1), # const char* name
('image', c_void_p, 1), # const CvArr* image
)
# Creates the trackbar and attaches it to the specified window
CvTrackbarCallback = CFUNCTYPE(None, # void
c_int) # int pos
cvCreateTrackbar = cfunc('cvCreateTrackbar', _hgDLL, c_int,
('trackbar_name', c_char_p, 1), # const char* trackbar_name
('window_name', c_char_p, 1), # const char* window_name
('value', POINTER(c_int), 1), # int* value
('count', c_int, 1), # int count
('on_change', CvTrackbarCallback, 1), # CvTrackbarCallback on_change
)
# Retrieves trackbar position
cvGetTrackbarPos = cfunc('cvGetTrackbarPos', _hgDLL, c_int,
('trackbar_name', c_char_p, 1), # const char* trackbar_name
('window_name', c_char_p, 1), # const char* window_name
)
# Sets trackbar position
cvSetTrackbarPos = cfunc('cvSetTrackbarPos', _hgDLL, None,
('trackbar_name', c_char_p, 1), # const char* trackbar_name
('window_name', c_char_p, 1), # const char* window_name
('pos', c_int, 1), # int pos
)
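# Usage sketch (illustrative): wiring a Python function into cvCreateTrackbar.
# The callback has to be wrapped in the CvTrackbarCallback CFUNCTYPE defined
# above, and both the wrapper and the c_int value must stay referenced for as
# long as the trackbar exists; the window and trackbar names are placeholders.
def _example_trackbar(window_name='demo'):
    cvNamedWindow(window_name, 1)
    value = c_int(50)
    def on_change(pos):
        # HighGUI calls this with the new slider position
        print 'trackbar moved to', pos
    callback = CvTrackbarCallback(on_change)
    cvCreateTrackbar('threshold', window_name, ctypes.byref(value), 255, callback)
    return value, callback              # caller keeps these alive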
# Assigns callback for mouse events
CV_EVENT_MOUSEMOVE = 0
CV_EVENT_LBUTTONDOWN = 1
CV_EVENT_RBUTTONDOWN = 2
CV_EVENT_MBUTTONDOWN = 3
CV_EVENT_LBUTTONUP = 4
CV_EVENT_RBUTTONUP = 5
CV_EVENT_MBUTTONUP = 6
CV_EVENT_LBUTTONDBLCLK = 7
CV_EVENT_RBUTTONDBLCLK = 8
CV_EVENT_MBUTTONDBLCLK = 9
CV_EVENT_FLAG_LBUTTON = 1
CV_EVENT_FLAG_RBUTTON = 2
CV_EVENT_FLAG_MBUTTON = 4
CV_EVENT_FLAG_CTRLKEY = 8
CV_EVENT_FLAG_SHIFTKEY = 16
CV_EVENT_FLAG_ALTKEY = 32
CvMouseCallback = CFUNCTYPE(None, # void
c_int, # int event
c_int, # int x
c_int, # int y
c_int, # int flags
c_void_p) # void* param
cvSetMouseCallback = cfunc('cvSetMouseCallback', _hgDLL, None,
('window_name', c_char_p, 1), # const char* window_name
('on_mouse', CallableToFunc(CvMouseCallback), 1), # CvMouseCallback on_mouse
('param', c_void_p, 1, None), # void* param
)
# Waits for a pressed key
cvWaitKey = cfunc('cvWaitKey', _hgDLL, c_int,
('delay', c_int, 1, 0), # int delay
)
# --- 2 Loading and Saving Images --------------------------------------------
# Loads an image from file
cvLoadImage = cfunc('cvLoadImage', _hgDLL, POINTER(IplImage),
('filename', c_char_p, 1), # const char* filename
('iscolor', c_int, 1, 1), # int iscolor
)
# Saves an image to the file
cvSaveImage = cfunc('cvSaveImage', _hgDLL, c_int,
('filename', c_char_p, 1), # const char* filename
('image', c_void_p, 1), # const CvArr* image
)
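# Usage sketch (illustrative; the file names are placeholders): the classic
# load / display / wait / save round trip using the declarations above.
def _example_show_and_save(src='input.jpg', dst='output.jpg'):
    img = cvLoadImage(src, 1)           # 1 = load as 3-channel colour
    if not img:
        raise IOError('could not load ' + src)
    cvNamedWindow('preview', 1)
    cvShowImage('preview', img)
    cvWaitKey(0)                        # block until a key is pressed
    cvDestroyWindow('preview')
    cvSaveImage(dst, img)
    cvReleaseImage(ctypes.byref(img))   # cvReleaseImage expects an IplImage**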
# --- 3 Video I/O functions --------------------------------------------------
# Initializes capturing video from file
cvCreateFileCapture = cfunc('cvCreateFileCapture', _hgDLL, POINTER(CvCapture),
('filename', c_char_p, 1), # const char* filename
)
# Initializes capturing video from camera
cvCreateCameraCapture = cfunc('cvCreateCameraCapture', _hgDLL, POINTER(CvCapture),
('index', c_int, 1), # int index
)
# Releases the CvCapture structure
cvReleaseCapture = cfunc('cvReleaseCapture', _hgDLL, None,
('capture', POINTER(POINTER(CvCapture)), 1), # CvCapture** capture
)
# Grabs frame from camera or file
cvGrabFrame = cfunc('cvGrabFrame', _hgDLL, c_int,
('capture', POINTER(CvCapture), 1), # CvCapture* capture
)
# Gets the image grabbed with cvGrabFrame
cvRetrieveFrame = cfunc('cvRetrieveFrame', _hgDLL, POINTER(IplImage),
('capture', POINTER(CvCapture), 1), # CvCapture* capture
)
# Grabs and returns a frame from camera or file
cvQueryFrame = cfunc('cvQueryFrame', _hgDLL, POINTER(IplImage),
('capture', POINTER(CvCapture), 1), # CvCapture* capture
)
# errcheck-style handler in the spirit of ChangeCvSeqToCvRect above: raises if a
# NULL frame pointer came back; returning `args` unchanged lets ctypes carry on
# with its normal result processing.
def CheckNonNull(result, func, args):
    if not result:
        raise RuntimeError, 'QueryFrame failed'
    return args
# Gets video capturing properties
cvGetCaptureProperty = cfunc('cvGetCaptureProperty', _hgDLL, c_double,
('capture', POINTER(CvCapture), 1), # CvCapture* capture
('property_id', c_int, 1), # int property_id
)
# Sets video capturing properties
cvSetCaptureProperty = cfunc('cvSetCaptureProperty', _hgDLL, c_int,
('capture', POINTER(CvCapture), 1), # CvCapture* capture
('property_id', c_int, 1), # int property_id
('value', c_double, 1), # double value
)
# Creates video file writer
cvCreateVideoWriter = cfunc('cvCreateVideoWriter', _hgDLL, POINTER(CvVideoWriter),
('filename', c_char_p, 1), # const char* filename
('fourcc', c_int, 1), # int fourcc
('fps', c_double, 1), # double fps
('frame_size', CvSize, 1), # CvSize frame_size
('is_color', c_int, 1, 1), # int is_color
)
# Releases AVI writer
cvReleaseVideoWriter = cfunc('cvReleaseVideoWriter', _hgDLL, None,
('writer', POINTER(POINTER(CvVideoWriter)), 1), # CvVideoWriter** writer
)
# Writes a frame to video file
cvWriteFrame = cfunc('cvWriteFrame', _hgDLL, c_int,
('writer', POINTER(CvVideoWriter), 1), # CvVideoWriter* writer
('image', POINTER(IplImage), 1), # const IplImage* image
)
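# Usage sketch (illustrative; camera index, codec and frame rate are
# placeholders): grab frames from the first camera, preview them and append
# them to an AVI file until a key is pressed.
def _example_record(filename='capture.avi', fps=25.0):
    capture = cvCreateCameraCapture(0)
    if not capture:
        raise RuntimeError('could not open camera 0')
    frame = cvQueryFrame(capture)
    if not frame:
        raise RuntimeError('could not grab an initial frame')
    # FOURCC packed by hand ('MJPG'); cvCreateVideoWriter takes it as an int
    fourcc = ord('M') | (ord('J') << 8) | (ord('P') << 16) | (ord('G') << 24)
    writer = cvCreateVideoWriter(filename, fourcc, fps, cvGetSize(frame), 1)
    cvNamedWindow('recording', 1)
    while frame:
        cvWriteFrame(writer, frame)
        cvShowImage('recording', frame)
        if cvWaitKey(10) >= 0:          # any key stops the loop
            break
        frame = cvQueryFrame(capture)
    cvDestroyWindow('recording')
    cvReleaseVideoWriter(ctypes.byref(writer))
    cvReleaseCapture(ctypes.byref(capture))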
# --- 4 Utility and System Functions -----------------------------------------
# Initializes HighGUI
cvInitSystem = cfunc('cvInitSystem', _hgDLL, c_int,
('argc', c_int, 1), # int argc
('argv', POINTER(c_char_p), 1), # char** argv
)
CV_CVTIMG_SWAP_RB = 2
CV_CVTIMG_FLIP = 1
# Converts one image to another with optional vertical flip
cvConvertImage = cfunc('cvConvertImage', _hgDLL, None,
('src', c_void_p, 1), # const CvArr* src
('dst', c_void_p, 1), # CvArr* dst
('flags', c_int, 1, 0), # int flags
)
# -- Helpers for access to images for other GUI packages
def cvImageAsBuffer(img):
btype = ctypes.c_byte * img[0].imageSize
return buffer(btype.from_address(img[0].imageData))
try:
import wx
def cvImageAsBitmap(img, flip=True):
sz = cvGetSize(img)
flags = CV_CVTIMG_SWAP_RB
if flip:
flags |= CV_CVTIMG_FLIP
cvConvertImage(img, img, flags)
bitmap = wx.BitmapFromBuffer(sz.width, sz.height, cvImageAsBuffer(img))
return bitmap
except ImportError:
pass
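# Illustrative note (assumes the wxPython 2.x API used above): inside a wx event
# handler a frame could be shown roughly like
#
#     bitmap = cvImageAsBitmap(frame)      # NB: converts `frame` in place
#     wx.StaticBitmap(panel, -1, bitmap)
#
# keeping in mind that cvImageAsBitmap swaps the R/B channels (and optionally
# flips the image) in place before copying the pixels into a wx.Bitmap.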
# --- Documentation strings ---------------------------------------------------
cvCreateImage.__doc__ = """IplImage* cvCreateImage(CvSize size, int depth, int channels)
Creates header and allocates data
"""
cvCreateImageHeader.__doc__ = """IplImage* cvCreateImageHeader(CvSize size, int depth, int channels)
Allocates, initializes, and returns structure IplImage
"""
cvReleaseImageHeader.__doc__ = """void cvReleaseImageHeader(IplImage** image)
Releases header
"""
cvReleaseImage.__doc__ = """void cvReleaseImage(IplImage** image)
Releases header and image data
"""
cvInitImageHeader.__doc__ = """IplImage* cvInitImageHeader(IplImage* image, CvSize size, int depth, int channels, int origin=0, int align=4)
Initializes allocated by user image header
"""
cvCloneImage.__doc__ = """IplImage* cvCloneImage(const IplImage* image)
Makes a full copy of image
"""
cvSetImageCOI.__doc__ = """void cvSetImageCOI(IplImage* image, int coi)
Sets channel of interest to given value
"""
cvGetImageCOI.__doc__ = """int cvGetImageCOI(const IplImage* image)
Returns index of channel of interest
"""
cvSetImageROI.__doc__ = """void cvSetImageROI(IplImage* image, CvRect rect)
Sets image ROI to given rectangle
"""
cvResetImageROI.__doc__ = """void cvResetImageROI(IplImage* image)
Releases image ROI
"""
cvGetImageROI.__doc__ = """CvRect cvGetImageROI(const IplImage* image)
Returns image ROI coordinates
"""
cvCreateMat.__doc__ = """CvMat* cvCreateMat(int rows, int cols, int type)
Creates new matrix
"""
cvCreateMatHeader.__doc__ = """CvMat* cvCreateMatHeader(int rows, int cols, int type)
Creates new matrix header
"""
cvReleaseMat.__doc__ = """void cvReleaseMat(CvMat** mat)
Deallocates matrix
"""
cvInitMatHeader.__doc__ = """CvMat* cvInitMatHeader(CvMat* mat, int rows, int cols, int type, void* data=NULL, int step=CV_AUTOSTEP)
Initializes matrix header
"""
cvCloneMat.__doc__ = """CvMat* cvCloneMat(const CvMat* mat)
Creates matrix copy
"""
cvCreateMatND.__doc__ = """CvMatND* cvCreateMatND(int dims, const int* sizes, int type)
Creates multi-dimensional dense array
"""
cvCreateMatNDHeader.__doc__ = """CvMatND* cvCreateMatNDHeader(int dims, const int* sizes, int type)
Creates new matrix header
"""
cvInitMatNDHeader.__doc__ = """CvMatND* cvInitMatNDHeader(CvMatND* mat, int dims, const int* sizes, int type, void* data=NULL)
Initializes multi-dimensional array header
"""
cvCloneMatND.__doc__ = """CvMatND* cvCloneMatND(const CvMatND* mat)
Creates full copy of multi-dimensional array
"""
cvCreateData.__doc__ = """void cvCreateData(CvArr* arr)
Allocates array data
"""
cvReleaseData.__doc__ = """void cvReleaseData(CvArr* arr)
Releases array data
"""
cvSetData.__doc__ = """void cvSetData(CvArr* arr, void* data, int step)
Assigns user data to the array header
"""
cvGetRawData.__doc__ = """void cvGetRawData(const CvArr* arr, uchar** data, int* step=NULL, CvSize* roi_size=NULL)
Retrieves low-level information about the array
"""
cvGetMat.__doc__ = """CvMat* cvGetMat(const CvArr* arr, CvMat* header, int* coi=NULL, int allowND=0)
Returns matrix header for arbitrary array
"""
cvGetImage.__doc__ = """IplImage* cvGetImage(const CvArr* arr, IplImage* image_header)
Returns image header for arbitrary array
"""
cvCreateSparseMat.__doc__ = """CvSparseMat* cvCreateSparseMat(int dims, const int* sizes, int type)
Creates sparse array
"""
cvReleaseSparseMat.__doc__ = """void cvReleaseSparseMat(CvSparseMat** mat)
Deallocates sparse array
"""
cvCloneSparseMat.__doc__ = """CvSparseMat* cvCloneSparseMat(const CvSparseMat* mat)
Creates full copy of sparse array
"""
cvGetSubRect.__doc__ = """CvMat* cvGetSubRect(const CvArr* arr, CvMat* submat, CvRect rect)
Returns matrix header corresponding to the rectangular sub-array of input image or matrix
"""
cvGetRows.__doc__ = """CvMat* cvGetRows(const CvArr* arr, CvMat* submat, int start_row, int end_row, int delta_row=1)
Returns array row or row span
"""
cvGetCols.__doc__ = """CvMat* cvGetCols(const CvArr* arr, CvMat* submat, int start_col, int end_col)
Returns array column or column span
"""
cvGetDiag.__doc__ = """CvMat* cvGetDiag(const CvArr* arr, CvMat* submat, int diag=0)
Returns one of array diagonals
"""
cvGetSize.__doc__ = """CvSize cvGetSize(const CvArr* arr)
Returns size of matrix or image ROI
"""
cvGetElemType.__doc__ = """int cvGetElemType(const CvArr* arr)
Returns type of array elements
"""
cvGetDims.__doc__ = """int cvGetDims(const CvArr* arr, int* sizes=NULL)
Return number of array dimensions and their sizes or the size of particular dimension
"""
cvPtr1D.__doc__ = """uchar* cvPtr1D(const CvArr* arr, int idx0, int* type=NULL)
Return pointer to the particular array element
"""
cvGet1D.__doc__ = """CvScalar cvGet1D(const CvArr* arr, int idx0)
Return the particular array element
"""
cvGetReal1D.__doc__ = """double cvGetReal1D(const CvArr* arr, int idx0)
Return the particular element of single-channel array
"""
cvSet1D.__doc__ = """void cvSet1D(CvArr* arr, int idx0, CvScalar value)
Change the particular array element
"""
cvSetReal1D.__doc__ = """void cvSetReal1D(CvArr* arr, int idx0, double value)
Change the particular array element
"""
cvClearND.__doc__ = """void cvClearND(CvArr* arr, int* idx)
Clears the particular array element
"""
cvCopy.__doc__ = """void cvCopy(const CvArr* src, CvArr* dst, const CvArr* mask=NULL)
Copies one array to another
"""
cvSet.__doc__ = """void cvSet(CvArr* arr, CvScalar value, const CvArr* mask=NULL)
Sets every element of array to given value
"""
cvSetZero.__doc__ = """void cvSetZero(CvArr* arr)
Clears the array
"""
cvReshape.__doc__ = """CvMat* cvReshape(const CvArr* arr, CvMat* header, int new_cn, int new_rows=0)
Changes shape of matrix/image without copying data
"""
cvReshapeMatND.__doc__ = """CvArr* cvReshapeMatND(const CvArr* arr, int sizeof_header, CvArr* header, int new_cn, int new_dims, int* new_sizes)
Changes shape of multi-dimensional array w/o copying data
"""
cvRepeat.__doc__ = """void cvRepeat(const CvArr* src, CvArr* dst)
Fill destination array with tiled source array
"""
cvFlip.__doc__ = """void cvFlip(const CvArr* src, CvArr* dst=NULL, int flip_mode=0)
Flips a 2D array around vertical, horizontal or both axes
"""
cvSplit.__doc__ = """void cvSplit(const CvArr* src, CvArr* dst0, CvArr* dst1, CvArr* dst2, CvArr* dst3)
Divides multi-channel array into several single-channel arrays or extracts a single channel from the array
"""
cvMerge.__doc__ = """void cvMerge(const CvArr* src0, const CvArr* src1, const CvArr* src2, const CvArr* src3, CvArr* dst)
Composes multi-channel array from several single-channel arrays or inserts a single channel into the array
"""
cvLUT.__doc__ = """void cvLUT(const CvArr* src, CvArr* dst, const CvArr* lut)
Performs look-up table transform of array
"""
cvConvertScale.__doc__ = """void cvConvertScale(const CvArr* src, CvArr* dst, double scale=1, double shift=0)
Converts one array to another with optional linear transformation
"""
cvConvertScaleAbs.__doc__ = """void cvConvertScaleAbs(const CvArr* src, CvArr* dst, double scale=1, double shift=0)
Converts input array elements to 8-bit unsigned integers with optional linear transformation
"""
cvAdd.__doc__ = """void cvAdd(const CvArr* src1, const CvArr* src2, CvArr* dst, const CvArr* mask=NULL)
Computes per-element sum of two arrays
"""
cvAddS.__doc__ = """void cvAddS(const CvArr* src, CvScalar value, CvArr* dst, const CvArr* mask=NULL)
Computes sum of array and scalar
"""
cvAddWeighted.__doc__ = """void cvAddWeighted(const CvArr* src1, double alpha, const CvArr* src2, double beta, double gamma, CvArr* dst)
Computes weighted sum of two arrays
"""
cvSub.__doc__ = """void cvSub(const CvArr* src1, const CvArr* src2, CvArr* dst, const CvArr* mask=NULL)
Computes per-element difference between two arrays
"""
cvSubRS.__doc__ = """void cvSubRS(const CvArr* src, CvScalar value, CvArr* dst, const CvArr* mask=NULL)
Computes difference between scalar and array
"""
cvMul.__doc__ = """void cvMul(const CvArr* src1, const CvArr* src2, CvArr* dst, double scale=1)
Calculates per-element product of two arrays
"""
cvDiv.__doc__ = """void cvDiv(const CvArr* src1, const CvArr* src2, CvArr* dst, double scale=1)
Performs per-element division of two arrays
"""
cvAnd.__doc__ = """void cvAnd(const CvArr* src1, const CvArr* src2, CvArr* dst, const CvArr* mask=NULL)
Calculates per-element bit-wise conjunction of two arrays
"""
cvAndS.__doc__ = """void cvAndS(const CvArr* src, CvScalar value, CvArr* dst, const CvArr* mask=NULL)
Calculates per-element bit-wise conjunction of array and scalar
"""
cvOr.__doc__ = """void cvOr(const CvArr* src1, const CvArr* src2, CvArr* dst, const CvArr* mask=NULL)
Calculates per-element bit-wise disjunction of two arrays
"""
cvOrS.__doc__ = """void cvOrS(const CvArr* src, CvScalar value, CvArr* dst, const CvArr* mask=NULL)
Calculates per-element bit-wise disjunction of array and scalar
"""
cvXor.__doc__ = """void cvXor(const CvArr* src1, const CvArr* src2, CvArr* dst, const CvArr* mask=NULL)
Performs per-element bit-wise "exclusive or" operation on two arrays
"""
cvXorS.__doc__ = """void cvXorS(const CvArr* src, CvScalar value, CvArr* dst, const CvArr* mask=NULL)
Performs per-element bit-wise "exclusive or" operation on array and scalar
"""
cvNot.__doc__ = """void cvNot(const CvArr* src, CvArr* dst)
Performs per-element bit-wise inversion of array elements
"""
cvCmp.__doc__ = """void cvCmp(const CvArr* src1, const CvArr* src2, CvArr* dst, int cmp_op)
Performs per-element comparison of two arrays
"""
cvCmpS.__doc__ = """void cvCmpS(const CvArr* src, double value, CvArr* dst, int cmp_op)
Performs per-element comparison of array and scalar
"""
cvInRange.__doc__ = """void cvInRange(const CvArr* src, const CvArr* lower, const CvArr* upper, CvArr* dst)
Checks that array elements lie between elements of two other arrays
"""
cvInRangeS.__doc__ = """void cvInRangeS(const CvArr* src, CvScalar lower, CvScalar upper, CvArr* dst)
Checks that array elements lie between two scalars
"""
cvMax.__doc__ = """void cvMax(const CvArr* src1, const CvArr* src2, CvArr* dst)
Finds per-element maximum of two arrays
"""
cvMaxS.__doc__ = """void cvMaxS(const CvArr* src, double value, CvArr* dst)
Finds per-element maximum of array and scalar
"""
cvMin.__doc__ = """void cvMin(const CvArr* src1, const CvArr* src2, CvArr* dst)
Finds per-element minimum of two arrays
"""
cvMinS.__doc__ = """void cvMinS(const CvArr* src, double value, CvArr* dst)
Finds per-element minimum of array and scalar
"""
cvAbsDiff.__doc__ = """void cvAbsDiff(const CvArr* src1, const CvArr* src2, CvArr* dst)
Calculates absolute difference between two arrays
"""
cvAbsDiffS.__doc__ = """void cvAbsDiffS(const CvArr* src, CvArr* dst, CvScalar value)
Calculates absolute difference between array and scalar
"""
cvCountNonZero.__doc__ = """int cvCountNonZero(const CvArr* arr)
Counts non-zero array elements
"""
cvSum.__doc__ = """CvScalar cvSum(const CvArr* arr)
Summarizes array elements
"""
cvAvg.__doc__ = """CvScalar cvAvg(const CvArr* arr, const CvArr* mask=NULL)
Calculates average (mean) of array elements
"""
cvAvgSdv.__doc__ = """void cvAvgSdv(const CvArr* arr, CvScalar* mean, CvScalar* std_dev, const CvArr* mask=NULL)
Calculates average (mean) and standard deviation of array elements
"""
cvMinMaxLoc.__doc__ = """void cvMinMaxLoc(const CvArr* arr, double* min_val, double* max_val, CvPoint* min_loc=NULL, CvPoint* max_loc=NULL, const CvArr* mask=NULL)
Finds global minimum and maximum in array or subarray
"""
cvNorm.__doc__ = """double cvNorm(const CvArr* arr1, const CvArr* arr2=NULL, int norm_type=CV_L2, const CvArr* mask=NULL)
Calculates absolute array norm, absolute difference norm or relative difference norm
"""
cvSetIdentity.__doc__ = """void cvSetIdentity(CvArr* mat, CvScalar value=cvRealScalar(1))
Initializes scaled identity matrix
"""
cvDotProduct.__doc__ = """double cvDotProduct(const CvArr* src1, const CvArr* src2)
Calculates dot product of two arrays in Euclidean metrics
"""
cvCrossProduct.__doc__ = """void cvCrossProduct(const CvArr* src1, const CvArr* src2, CvArr* dst)
Calculates cross product of two 3D vectors
"""
cvScaleAdd.__doc__ = """void cvScaleAdd(const CvArr* src1, CvScalar scale, const CvArr* src2, CvArr* dst)
Calculates sum of scaled array and another array
"""
cvGEMM.__doc__ = """void cvGEMM(const CvArr* src1, const CvArr* src2, double alpha, const CvArr* src3, double beta, CvArr* dst, int tABC=0)
Performs generalized matrix multiplication
"""
cvTransform.__doc__ = """void cvTransform(const CvArr* src, CvArr* dst, const CvMat* transmat, const CvMat* shiftvec=NULL)
Performs matrix transform of every array element
"""
cvPerspectiveTransform.__doc__ = """void cvPerspectiveTransform(const CvArr* src, CvArr* dst, const CvMat* mat)
Performs perspective matrix transform of vector array
"""
cvMulTransposed.__doc__ = """void cvMulTransposed(const CvArr* src, CvArr* dst, int order, const CvArr* delta=NULL)
Calculates product of array and transposed array
"""
cvTrace.__doc__ = """CvScalar cvTrace(const CvArr* mat)
Returns trace of matrix
"""
cvTranspose.__doc__ = """void cvTranspose(const CvArr* src, CvArr* dst)
Transposes matrix
"""
cvDet.__doc__ = """double cvDet(const CvArr* mat)
Returns determinant of matrix
"""
cvInvert.__doc__ = """double cvInvert(const CvArr* src, CvArr* dst, int method=CV_LU)
Finds inverse or pseudo-inverse of matrix
"""
cvSolve.__doc__ = """int cvSolve(const CvArr* src1, const CvArr* src2, CvArr* dst, int method=CV_LU)
Solves linear system or least-squares problem
"""
cvSVD.__doc__ = """void cvSVD(CvArr* A, CvArr* W, CvArr* U=NULL, CvArr* V=NULL, int flags=0)
Performs singular value decomposition of real floating-point matrix
"""
cvSVBkSb.__doc__ = """void cvSVBkSb(const CvArr* W, const CvArr* U, const CvArr* V, const CvArr* B, CvArr* X, int flags)
Performs singular value back substitution
"""
cvEigenVV.__doc__ = """void cvEigenVV(CvArr* mat, CvArr* evects, CvArr* evals, double eps=0)
Computes eigenvalues and eigenvectors of symmetric matrix
"""
cvCalcCovarMatrix.__doc__ = """void cvCalcCovarMatrix(const CvArr** vects, int count, CvArr* cov_mat, CvArr* avg, int flags)
Calculates covariance matrix of the set of vectors
"""
cvMahalanobis.__doc__ = """double cvMahalanobis(const CvArr* vec1, const CvArr* vec2, CvArr* mat)
Calculates Mahalanobis distance between two vectors
"""
cvCbrt.__doc__ = """float cvCbrt(float value)
Calculates cubic root
"""
cvFastArctan.__doc__ = """float cvFastArctan(float y, float x)
Calculates angle of 2D vector
"""
cvCartToPolar.__doc__ = """void cvCartToPolar(const CvArr* x, const CvArr* y, CvArr* magnitude, CvArr* angle=NULL, int angle_in_degrees=0)
Calculates magnitude and/or angle of 2d vectors
"""
cvPolarToCart.__doc__ = """void cvPolarToCart(const CvArr* magnitude, const CvArr* angle, CvArr* x, CvArr* y, int angle_in_degrees=0)
Calculates cartesian coordinates of 2d vectors represented in polar form
"""
cvPow.__doc__ = """void cvPow(const CvArr* src, CvArr* dst, double power)
Raises every array element to power
"""
cvExp.__doc__ = """void cvExp(const CvArr* src, CvArr* dst)
Calculates exponent of every array element
"""
cvLog.__doc__ = """void cvLog(const CvArr* src, CvArr* dst)
Calculates natural logarithm of every array element absolute value
"""
cvSolveCubic.__doc__ = """void cvSolveCubic(const CvArr* coeffs, CvArr* roots)
Finds real roots of a cubic equation
"""
cvRandArr.__doc__ = """void cvRandArr(CvRNG* rng, CvArr* arr, int dist_type, CvScalar param1, CvScalar param2)
Fills array with random numbers and updates the RNG state
"""
cvGetOptimalDFTSize.__doc__ = """int cvGetOptimalDFTSize(int size0)
Returns optimal DFT size for given vector size
"""
cvMulSpectrums.__doc__ = """void cvMulSpectrums(const CvArr* src1, const CvArr* src2, CvArr* dst, int flags)
Performs per-element multiplication of two Fourier spectrums
"""
cvCreateMemStorage.__doc__ = """CvMemStorage* cvCreateMemStorage(int block_size=0)
Creates memory storage
"""
cvCreateChildMemStorage.__doc__ = """CvMemStorage* cvCreateChildMemStorage(CvMemStorage* parent)
Creates child memory storage
"""
cvReleaseMemStorage.__doc__ = """void cvReleaseMemStorage(CvMemStorage** storage)
Releases memory storage
"""
cvClearMemStorage.__doc__ = """void cvClearMemStorage(CvMemStorage* storage)
Clears memory storage
"""
cvMemStorageAlloc.__doc__ = """void* cvMemStorageAlloc(CvMemStorage* storage, size_t size)
Allocates memory buffer in the storage
"""
cvMemStorageAllocString.__doc__ = """CvString cvMemStorageAllocString(CvMemStorage* storage, const char* ptr, int len=-1)
Allocates text string in the storage
"""
cvSaveMemStoragePos.__doc__ = """void cvSaveMemStoragePos(const CvMemStorage* storage, CvMemStoragePos* pos)
Saves memory storage position
"""
cvRestoreMemStoragePos.__doc__ = """void cvRestoreMemStoragePos(CvMemStorage* storage, CvMemStoragePos* pos)
Restores memory storage position
"""
cvCreateSeq.__doc__ = """CvSeq* cvCreateSeq(int seq_flags, int header_size, int elem_size, CvMemStorage* storage)
Creates sequence
"""
cvSetSeqBlockSize.__doc__ = """void cvSetSeqBlockSize(CvSeq* seq, int delta_elems)
Sets up sequence block size
"""
cvSeqPush.__doc__ = """char* cvSeqPush(CvSeq* seq, void* element=NULL)
Adds element to sequence end
"""
cvSeqPop.__doc__ = """void cvSeqPop(CvSeq* seq, void* element=NULL)
Removes element from sequence end
"""
cvSeqPushFront.__doc__ = """char* cvSeqPushFront(CvSeq* seq, void* element=NULL)
Adds element to sequence beginning
"""
cvSeqPopFront.__doc__ = """void cvSeqPopFront(CvSeq* seq, void* element=NULL)
Removes element from sequence beginning
"""
cvSeqPushMulti.__doc__ = """void cvSeqPushMulti(CvSeq* seq, void* elements, int count, int in_front=0)
Pushes several elements to either end of sequence
"""
cvSeqPopMulti.__doc__ = """void cvSeqPopMulti(CvSeq* seq, void* elements, int count, int in_front=0)
Removes several elements from either end of sequence
"""
cvSeqInsert.__doc__ = """char* cvSeqInsert(CvSeq* seq, int before_index, void* element=NULL)
Inserts element in sequence middle
"""
cvSeqRemove.__doc__ = """void cvSeqRemove(CvSeq* seq, int index)
Removes element from sequence middle
"""
cvClearSeq.__doc__ = """void cvClearSeq(CvSeq* seq)
Clears sequence
"""
cvGetSeqElem.__doc__ = """char* cvGetSeqElem(const CvSeq* seq, int index)
Returns pointer to sequence element by its index
"""
cvSeqElemIdx.__doc__ = """int cvSeqElemIdx(const CvSeq* seq, const void* element, CvSeqBlock** block=NULL)
Returns index of concrete sequence element
"""
cvCvtSeqToArray.__doc__ = """void* cvCvtSeqToArray(const CvSeq* seq, void* elements, CvSlice slice=CV_WHOLE_SEQ)
Copies sequence to one continuous block of memory
"""
cvMakeSeqHeaderForArray.__doc__ = """CvSeq* cvMakeSeqHeaderForArray(int seq_type, int header_size, int elem_size, void* elements, int total, CvSeq* seq, CvSeqBlock* block)
Constructs sequence from array
"""
cvSeqSlice.__doc__ = """CvSeq* cvSeqSlice(const CvSeq* seq, CvSlice slice, CvMemStorage* storage=NULL, int copy_data=0)
Makes separate header for the sequence slice
"""
cvSeqRemoveSlice.__doc__ = """void cvSeqRemoveSlice(CvSeq* seq, CvSlice slice)
Removes sequence slice
"""
cvSeqInsertSlice.__doc__ = """void cvSeqInsertSlice(CvSeq* seq, int before_index, const CvArr* from_arr)
Inserts array in the middle of sequence
"""
cvSeqInvert.__doc__ = """void cvSeqInvert(CvSeq* seq)
Reverses the order of sequence elements
"""
cvSeqSort.__doc__ = """void cvSeqSort(CvSeq* seq, CvCmpFunc func, void* userdata=NULL)
Sorts sequence element using the specified comparison function
"""
cvSeqSearch.__doc__ = """char* cvSeqSearch(CvSeq* seq, const void* elem, CvCmpFunc func, int is_sorted, int* elem_idx, void* userdata=NULL)
Searches element in sequence
"""
cvStartAppendToSeq.__doc__ = """void cvStartAppendToSeq(CvSeq* seq, CvSeqWriter* writer)
Initializes process of writing data to sequence
"""
cvStartWriteSeq.__doc__ = """void cvStartWriteSeq(int seq_flags, int header_size, int elem_size, CvMemStorage* storage, CvSeqWriter* writer)
Creates new sequence and initializes writer for it
"""
cvEndWriteSeq.__doc__ = """CvSeq* cvEndWriteSeq(CvSeqWriter* writer)
Finishes process of writing sequence
"""
cvFlushSeqWriter.__doc__ = """void cvFlushSeqWriter(CvSeqWriter* writer)
Updates sequence headers from the writer state
"""
cvStartReadSeq.__doc__ = """void cvStartReadSeq(const CvSeq* seq, CvSeqReader* reader, int reverse=0)
Initializes process of sequential reading from sequence
"""
cvGetSeqReaderPos.__doc__ = """int cvGetSeqReaderPos(CvSeqReader* reader)
Returns the current reader position
"""
cvSetSeqReaderPos.__doc__ = """void cvSetSeqReaderPos(CvSeqReader* reader, int index, int is_relative=0)
Moves the reader to specified position
"""
cvCreateSet.__doc__ = """CvSET* cvCreateSet(int set_flags, int header_size, int elem_size, CvMemStorage* storage)
Creates empty set
"""
cvSetAdd.__doc__ = """int cvSetAdd(CvSET* set_header, CvSetElem* elem=NULL, CvSetElem** inserted_elem=NULL)
Occupies a node in the set
"""
cvSetRemove.__doc__ = """void cvSetRemove(CvSET* set_header, int index)
Removes element from set
"""
cvClearSet.__doc__ = """void cvClearSet(CvSET* set_header)
Clears set
"""
cvCreateGraph.__doc__ = """CvGraph* cvCreateGraph(int graph_flags, int header_size, int vtx_size, int edge_size, CvMemStorage* storage)
Creates empty graph
"""
cvGraphAddVtx.__doc__ = """int cvGraphAddVtx(CvGraph* graph, const CvGraphVtx* vtx=NULL, CvGraphVtx** inserted_vtx=NULL)
Adds vertex to graph
"""
cvGraphRemoveVtx.__doc__ = """int cvGraphRemoveVtx(CvGraph* graph, int index)
Removes vertex from graph
"""
cvGraphRemoveVtxByPtr.__doc__ = """int cvGraphRemoveVtxByPtr(CvGraph* graph, CvGraphVtx* vtx)
Removes vertex from graph
"""
cvGraphAddEdge.__doc__ = """int cvGraphAddEdge(CvGraph* graph, int start_idx, int end_idx, const CvGraphEdge* edge=NULL, CvGraphEdge** inserted_edge=NULL)
Adds edge to graph
"""
cvGraphAddEdgeByPtr.__doc__ = """int cvGraphAddEdgeByPtr(CvGraph* graph, CvGraphVtx* start_vtx, CvGraphVtx* end_vtx, const CvGraphEdge* edge=NULL, CvGraphEdge** inserted_edge=NULL)
Adds edge to graph
"""
cvGraphRemoveEdge.__doc__ = """void cvGraphRemoveEdge(CvGraph* graph, int start_idx, int end_idx)
Removes edge from graph
"""
cvGraphRemoveEdgeByPtr.__doc__ = """void cvGraphRemoveEdgeByPtr(CvGraph* graph, CvGraphVtx* start_vtx, CvGraphVtx* end_vtx)
Removes edge from graph
"""
cvFindGraphEdge.__doc__ = """CvGraphEdge* cvFindGraphEdge(const CvGraph* graph, int start_idx, int end_idx)
Finds edge in graph
"""
cvFindGraphEdgeByPtr.__doc__ = """CvGraphEdge* cvFindGraphEdgeByPtr(const CvGraph* graph, const CvGraphVtx* start_vtx, const CvGraphVtx* end_vtx)
Finds edge in graph
"""
cvGraphVtxDegree.__doc__ = """int cvGraphVtxDegree(const CvGraph* graph, int vtx_idx)
Counts edges incident to the vertex
"""
cvGraphVtxDegreeByPtr.__doc__ = """int cvGraphVtxDegreeByPtr(const CvGraph* graph, const CvGraphVtx* vtx)
Counts edges incident to the vertex (addressed by pointer)
"""
cvClearGraph.__doc__ = """void cvClearGraph(CvGraph* graph)
Clears graph
"""
cvCloneGraph.__doc__ = """CvGraph* cvCloneGraph(const CvGraph* graph, CvMemStorage* storage)
Clones graph
"""
cvCreateGraphScanner.__doc__ = """CvGraphScanner* cvCreateGraphScanner(CvGraph* graph, CvGraphVtx* vtx=NULL, int mask=CV_GRAPH_ALL_ITEMS)
Creates structure for depth-first graph traversal
"""
cvNextGraphItem.__doc__ = """int cvNextGraphItem(CvGraphScanner* scanner)
Makes one or more steps of the graph traversal procedure
"""
cvReleaseGraphScanner.__doc__ = """void cvReleaseGraphScanner(CvGraphScanner** scanner)
Finishes graph traversal procedure
"""
cvInitTreeNodeIterator.__doc__ = """void cvInitTreeNodeIterator(CvTreeNodeIterator* tree_iterator, const void* first, int max_level)
Initializes tree node iterator
"""
cvNextTreeNode.__doc__ = """void* cvNextTreeNode(CvTreeNodeIterator* tree_iterator)
Returns the currently observed node and moves iterator toward the next node
"""
cvPrevTreeNode.__doc__ = """void* cvPrevTreeNode(CvTreeNodeIterator* tree_iterator)
Returns the currently observed node and moves iterator toward the previous node
"""
cvTreeToNodeSeq.__doc__ = """CvSeq* cvTreeToNodeSeq(const void* first, int header_size, CvMemStorage* storage)
Gathers all node pointers to the single sequence
"""
cvInsertNodeIntoTree.__doc__ = """void cvInsertNodeIntoTree(void* node, void* parent, void* frame)
Adds new node to the tree
"""
cvRemoveNodeFromTree.__doc__ = """void cvRemoveNodeFromTree(void* node, void* frame)
Removes node from tree
"""
cvLine.__doc__ = """void cvLine(CvArr* img, CvPoint pt1, CvPoint pt2, CvScalar color, int thickness=1, int line_type=8, int shift=0)
Draws a line segment connecting two points
"""
cvRectangle.__doc__ = """void cvRectangle(CvArr* img, CvPoint pt1, CvPoint pt2, CvScalar color, int thickness=1, int line_type=8, int shift=0)
Draws simple, thick or filled rectangle
"""
cvCircle.__doc__ = """void cvCircle(CvArr* img, CvPoint center, int radius, CvScalar color, int thickness=1, int line_type=8, int shift=0)
Draws a circle
"""
cvEllipse.__doc__ = """void cvEllipse(CvArr* img, CvPoint center, CvSize axes, double angle, double start_angle, double end_angle, CvScalar color, int thickness=1, int line_type=8, int shift=0)
Draws simple or thick elliptic arc or fills ellipse sector
"""
cvFillPoly.__doc__ = """void cvFillPoly(CvArr* img, CvPoint** pts, int* npts, int contours, CvScalar color, int line_type=8, int shift=0)
Fills polygons interior
"""
cvFillConvexPoly.__doc__ = """void cvFillConvexPoly(CvArr* img, CvPoint* pts, int npts, CvScalar color, int line_type=8, int shift=0)
Fills convex polygon
"""
cvPolyLine.__doc__ = """void cvPolyLine(CvArr* img, CvPoint** pts, int* npts, int contours, int is_closed, CvScalar color, int thickness=1, int line_type=8, int shift=0)
Draws simple or thick polygons
"""
cvInitFont.__doc__ = """void cvInitFont(CvFont* font, int font_face, double hscale, double vscale, double shear=0, int thickness=1, int line_type=8)
Initializes font structure
"""
cvPutText.__doc__ = """void cvPutText(CvArr* img, const char* text, CvPoint org, const CvFont* font, CvScalar color)
Draws text string
"""
cvGetTextSize.__doc__ = """void cvGetTextSize(const char* text_string, const CvFont* font, CvSize* text_size, int* baseline)
Retrieves width and height of text string
"""
cvDrawContours.__doc__ = """void cvDrawContours(CvArr* img, CvSeq* contour, CvScalar external_color, CvScalar hole_color, int max_level, int thickness=1, int line_type=8)
Draws contour outlines or interiors in the image
"""
cvInitLineIterator.__doc__ = """int cvInitLineIterator(const CvArr* image, CvPoint pt1, CvPoint pt2, CvLineIterator* line_iterator, int connectivity=8, int left_to_right=0)
Initializes line iterator
"""
cvClipLine.__doc__ = """int cvClipLine(CvSize img_size, CvPoint* pt1, CvPoint* pt2)
Clips the line against the image rectangle
"""
cvEllipse2Poly.__doc__ = """int cvEllipse2Poly(CvPoint center, CvSize axes, int angle, int arc_start, int arc_end, CvPoint* pts, int delta)
Approximates elliptic arc with polyline
"""
cvOpenFileStorage.__doc__ = """CvFileStorage* cvOpenFileStorage(const char* filename, CvMemStorage* memstorage, int flags)
Opens file storage for reading or writing data
"""
cvReleaseFileStorage.__doc__ = """void cvReleaseFileStorage(CvFileStorage** fs)
Releases file storage
"""
cvStartWriteStruct.__doc__ = """void cvStartWriteStruct(CvFileStorage* fs, const char* name, int struct_flags, const char* type_name=NULL, CvAttrList attributes=cvAttrList())
Starts writing a new structure
"""
cvEndWriteStruct.__doc__ = """void cvEndWriteStruct(CvFileStorage* fs)
Ends writing a structure
"""
cvWriteInt.__doc__ = """void cvWriteInt(CvFileStorage* fs, const char* name, int value)
Writes an integer value
"""
cvWriteReal.__doc__ = """void cvWriteReal(CvFileStorage* fs, const char* name, double value)
Writes a floating-point value
"""
cvWriteString.__doc__ = """void cvWriteString(CvFileStorage* fs, const char* name, const char* str, int quote=0)
Writes a text string
"""
cvWriteComment.__doc__ = """void cvWriteComment(CvFileStorage* fs, const char* comment, int eol_comment)
Writes comment
"""
cvStartNextStream.__doc__ = """void cvStartNextStream(CvFileStorage* fs)
Starts the next stream
"""
cvWrite.__doc__ = """void cvWrite(CvFileStorage* fs, const char* name, const void* ptr, CvAttrList attributes=cvAttrList())
Writes user object
"""
cvWriteRawData.__doc__ = """void cvWriteRawData(CvFileStorage* fs, const void* src, int len, const char* dt)
Writes multiple numbers
"""
cvWriteFileNode.__doc__ = """void cvWriteFileNode(CvFileStorage* fs, const char* new_node_name, const CvFileNode* node, int embed)
Writes file node to another file storage
"""
cvGetRootFileNode.__doc__ = """CvFileNode* cvGetRootFileNode(const CvFileStorage* fs, int stream_index=0)
Retrieves one of top-level nodes of the file storage
"""
cvGetFileNodeByName.__doc__ = """CvFileNode* cvGetFileNodeByName(const CvFileStorage* fs, const CvFileNode* map, const char* name)
Finds node in the map or file storage
"""
cvGetHashedKey.__doc__ = """CvStringHashNode* cvGetHashedKey(CvFileStorage* fs, const char* name, int len=-1, int create_missing=0)
Returns a unique pointer for given name
"""
cvGetFileNode.__doc__ = """CvFileNode* cvGetFileNode(CvFileStorage* fs, CvFileNode* map, const CvStringHashNode* key, int create_missing=0)
Finds node in the map or file storage
"""
cvGetFileNodeName.__doc__ = """const char* cvGetFileNodeName(const CvFileNode* node)
Returns name of file node
"""
cvRead.__doc__ = """void* cvRead(CvFileStorage* fs, CvFileNode* node, CvAttrList* attributes=NULL)
Decodes object and returns pointer to it
"""
cvReadRawData.__doc__ = """void cvReadRawData(const CvFileStorage* fs, const CvFileNode* src, void* dst, const char* dt)
Reads multiple numbers
"""
cvStartReadRawData.__doc__ = """void cvStartReadRawData(const CvFileStorage* fs, const CvFileNode* src, CvSeqReader* reader)
Initializes file node sequence reader
"""
cvReadRawDataSlice.__doc__ = """void cvReadRawDataSlice(const CvFileStorage* fs, CvSeqReader* reader, int count, void* dst, const char* dt)
Reads multiple numbers using the sequence reader initialized by cvStartReadRawData
"""
cvRegisterType.__doc__ = """void cvRegisterType(const CvTypeInfo* info)
Registers new type
"""
cvUnregisterType.__doc__ = """void cvUnregisterType(const char* type_name)
Unregisters the type
"""
cvFirstType.__doc__ = """CvTypeInfo* cvFirstType(void)
Returns the beginning of type list
"""
cvFindType.__doc__ = """CvTypeInfo* cvFindType(const char* type_name)
Finds type by its name
"""
cvTypeOf.__doc__ = """CvTypeInfo* cvTypeOf(const void* struct_ptr)
Returns type of the object
"""
cvRelease.__doc__ = """void cvRelease(void** struct_ptr)
Releases the object
"""
cvClone.__doc__ = """void* cvClone(const void* struct_ptr)
Makes a clone of the object
"""
cvSave.__doc__ = """void cvSave(const char* filename, const void* struct_ptr, const char* name=NULL, const char* comment=NULL, CvAttrList attributes=cvAttrList())
Saves object to file
"""
cvLoad.__doc__ = """void* cvLoad(const char* filename, CvMemStorage* memstorage=NULL, const char* name=NULL, const char** real_name=NULL)
Loads object from file
"""
cvCheckArr.__doc__ = """int cvCheckArr(const CvArr* arr, int flags=0, double min_val=0, double max_val=)
Checks every element of input array for invalid values
"""
cvKMeans2.__doc__ = """void cvKMeans2(const CvArr* samples, int cluster_count, CvArr* labels, CvTermCriteria termcrit)
Splits set of vectors by given number of clusters
"""
cvSeqPartition.__doc__ = """int cvSeqPartition(const CvSeq* seq, CvMemStorage* storage, CvSeq** labels, CvCmpFunc is_equal, void* userdata)
Splits sequence into equivalency classes
"""
cvGetErrStatus.__doc__ = """int cvGetErrStatus(void)
Returns the current error status
"""
cvSetErrStatus.__doc__ = """void cvSetErrStatus(int status)
Sets the error status
"""
cvGetErrMode.__doc__ = """int cvGetErrMode(void)
Returns the current error mode
"""
cvError.__doc__ = """int cvError(int status, const char* func_name, const char* err_msg, const char* file_name, int line)
Raises an error
"""
cvErrorStr.__doc__ = """const char* cvErrorStr(int status)
Returns textual description of error status code
"""
cvNulDevReport.__doc__ = """int cvNulDevReport(int status, const char* func_name, const char* err_msg, const char* file_name, int line, void* userdata)
Provide standard error handling
"""
cvAlloc.__doc__ = """void* cvAlloc(size_t size)
Allocates memory buffer
"""
#cvFree.__doc__ = """void cvFree(void** ptr)
#
#Deallocates memory buffer
#"""
cvGetTickCount.__doc__ = """int64 cvGetTickCount(void)
Returns number of ticks
"""
cvGetTickFrequency.__doc__ = """double cvGetTickFrequency(void)
Returns number of ticks per microsecond
"""
cvRegisterModule.__doc__ = """int cvRegisterModule(const CvModuleInfo* module_info)
Registers another module
"""
cvGetModuleInfo.__doc__ = """void cvGetModuleInfo(const char* module_name, const char** version, const char** loaded_addon_plugins)
Retrieves information about the registered module(s) and plugins
"""
cvUseOptimized.__doc__ = """int cvUseOptimized(int on_off)
Switches between optimized/non-optimized modes
"""
cvSobel.__doc__ = """void cvSobel(const CvArr* src, CvArr* dst, int xorder, int yorder, int aperture_size=3)
Calculates first, second, third or mixed image derivatives using extended Sobel operator
"""
cvLaplace.__doc__ = """void cvLaplace(const CvArr* src, CvArr* dst, int aperture_size=3)
Calculates Laplacian of the image
"""
cvCanny.__doc__ = """void cvCanny(const CvArr* image, CvArr* edges, double threshold1, double threshold2, int aperture_size=3)
Implements Canny algorithm for edge detection
"""
cvPreCornerDetect.__doc__ = """void cvPreCornerDetect(const CvArr* image, CvArr* corners, int aperture_size=3)
Calculates feature map for corner detection
"""
cvCornerEigenValsAndVecs.__doc__ = """void cvCornerEigenValsAndVecs(const CvArr* image, CvArr* eigenvv, int block_size, int aperture_size=3)
Calculates eigenvalues and eigenvectors of image blocks for corner detection
"""
cvCornerMinEigenVal.__doc__ = """void cvCornerMinEigenVal(const CvArr* image, CvArr* eigenval, int block_size, int aperture_size=3)
Calculates minimal eigenvalue of gradient matrices for corner detection
"""
cvCornerHarris.__doc__ = """void cvCornerHarris(const CvArr* image, CvArr* harris_responce, int block_size, int aperture_size=3, double k=0.04)
Harris edge detector
"""
cvFindCornerSubPix.__doc__ = """void cvFindCornerSubPix(const CvArr* image, CvPoint2D32f* corners, int count, CvSize win, CvSize zero_zone, CvTermCriteria criteria)
Refines corner locations
"""
cvGoodFeaturesToTrack.__doc__ = """void cvGoodFeaturesToTrack(const CvArr* image, CvArr* eig_image, CvArr* temp_image, CvPoint2D32f* corners, int* corner_count, double quality_level, double min_distance, const CvArr* mask=NULL, int block_size=3, int use_harris=0, double k=0.04)
Determines strong corners on image
"""
cvSampleLine.__doc__ = """int cvSampleLine(const CvArr* image, CvPoint pt1, CvPoint pt2, void* buffer, int connectivity=8)
Reads raster line to buffer
"""
cvGetRectSubPix.__doc__ = """void cvGetRectSubPix(const CvArr* src, CvArr* dst, CvPoint2D32f center)
Retrieves pixel rectangle from image with sub-pixel accuracy
"""
cvGetQuadrangleSubPix.__doc__ = """void cvGetQuadrangleSubPix(const CvArr* src, CvArr* dst, const CvMat* map_matrix)
Retrieves pixel quadrangle from image with sub-pixel accuracy
"""
cvResize.__doc__ = """void cvResize(const CvArr* src, CvArr* dst, int interpolation=CV_INTER_LINEAR)
Resizes image
"""
cvWarpAffine.__doc__ = """void cvWarpAffine(const CvArr* src, CvArr* dst, const CvMat* map_matrix, int flags=CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS, CvScalar fillval=cvScalarAll(0))
Applies affine transformation to the image
"""
cv2DRotationMatrix.__doc__ = """CvMat* cv2DRotationMatrix(CvPoint2D32f center, double angle, double scale, CvMat* map_matrix)
Calculates affine matrix of 2d rotation
"""
cvWarpPerspective.__doc__ = """void cvWarpPerspective(const CvArr* src, CvArr* dst, const CvMat* map_matrix, int flags=CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS, CvScalar fillval=cvScalarAll(0))
Applies perspective transformation to the image
"""
cvGetPerspectiveTransform.__doc__ = """CvMat* cvGetPerspectiveTransform(const CvPoint2D32f* src, const CvPoint2D32f* dst, CvMat* map_matrix)
Calculates perspective transform from 4 corresponding points
"""
cvRemap.__doc__ = """void cvRemap(const CvArr* src, CvArr* dst, const CvArr* mapx, const CvArr* mapy, int flags=CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS, CvScalar fillval=cvScalarAll(0))
Applies generic geometrical transformation to the image
"""
cvLogPolar.__doc__ = """void cvLogPolar(const CvArr* src, CvArr* dst, CvPoint2D32f center, double M, int flags=CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS)
Remaps image to log-polar space
"""
cvCreateStructuringElementEx.__doc__ = """IplConvKernel* cvCreateStructuringElementEx(int cols, int rows, int anchor_x, int anchor_y, int shape, int* values=NULL)
Creates structuring element
"""
cvReleaseStructuringElement.__doc__ = """void cvReleaseStructuringElement(IplConvKernel** element)
Deletes structuring element
"""
cvErode.__doc__ = """void cvErode(const CvArr* src, CvArr* dst, IplConvKernel* element=NULL, int iterations=1)
Erodes image by using arbitrary structuring element
"""
cvDilate.__doc__ = """void cvDilate(const CvArr* src, CvArr* dst, IplConvKernel* element=NULL, int iterations=1)
Dilates image by using arbitrary structuring element
"""
cvMorphologyEx.__doc__ = """void cvMorphologyEx(const CvArr* src, CvArr* dst, CvArr* temp, IplConvKernel* element, int operation, int iterations=1)
Performs advanced morphological transformations
"""
cvSmooth.__doc__ = """void cvSmooth(const CvArr* src, CvArr* dst, int smoothtype=CV_GAUSSIAN, int param1=3, int param2=0, double param3=0)
Smooths the image in one of several ways
"""
cvFilter2D.__doc__ = """void cvFilter2D(const CvArr* src, CvArr* dst, const CvMat* kernel, CvPoint anchor=cvPoint(-1, -1))
Convolves image with the kernel
"""
cvCopyMakeBorder.__doc__ = """void cvCopyMakeBorder(const CvArr* src, CvArr* dst, CvPoint offset, int bordertype, CvScalar value=cvScalarAll(0))
Copies image and makes border around it
"""
cvIntegral.__doc__ = """void cvIntegral(const CvArr* image, CvArr* sum, CvArr* sqsum=NULL, CvArr* tilted_sum=NULL)
Calculates integral images
"""
cvCvtColor.__doc__ = """void cvCvtColor(const CvArr* src, CvArr* dst, int code)
Converts image from one color space to another
"""
cvThreshold.__doc__ = """void cvThreshold(const CvArr* src, CvArr* dst, double threshold, double max_value, int threshold_type)
Applies fixed-level threshold to array elements
"""
cvAdaptiveThreshold.__doc__ = """void cvAdaptiveThreshold(const CvArr* src, CvArr* dst, double max_value, int adaptive_method=CV_ADAPTIVE_THRESH_MEAN_C, int threshold_type=CV_THRESH_BINARY, int block_size=3, double param1=5)
Applies adaptive threshold to array
"""
cvPyrDown.__doc__ = """void cvPyrDown(const CvArr* src, CvArr* dst, int filter=CV_GAUSSIAN_5x5)
Downsamples image
"""
cvPyrUp.__doc__ = """void cvPyrUp(const CvArr* src, CvArr* dst, int filter=CV_GAUSSIAN_5x5)
Upsamples image
"""
cvPyrSegmentation.__doc__ = """void cvPyrSegmentation(IplImage* src, IplImage* dst, CvMemStorage* storage, CvSeq** comp, int level, double threshold1, double threshold2)
Implements image segmentation by pyramids
"""
cvFloodFill.__doc__ = """void cvFloodFill(CvArr* image, CvPoint seed_point, CvScalar new_val, CvScalar lo_diff=cvScalarAll(0), CvScalar up_diff=cvScalarAll(0), CvConnectedComp* comp=NULL, int flags=4, CvArr* mask=NULL)
Fills a connected component with given color
"""
cvFindContours.__doc__ = """int cvFindContours(CvArr* image, CvMemStorage* storage, CvSeq** first_contour, int header_size=sizeof(CvContour), int mode=CV_RETR_LIST, int method=CV_CHAIN_APPROX_SIMPLE, CvPoint offset=cvPoint(0, 0))
Finds contours in binary image
"""
cvStartFindContours.__doc__ = """CvContourScanner cvStartFindContours(CvArr* image, CvMemStorage* storage, int header_size=sizeof(CvContour), int mode=CV_RETR_LIST, int method=CV_CHAIN_APPROX_SIMPLE, CvPoint offset=cvPoint(0, 0))
Initializes contour scanning process
"""
cvFindNextContour.__doc__ = """CvSeq* cvFindNextContour(CvContourScanner scanner)
Finds next contour in the image
"""
cvSubstituteContour.__doc__ = """void cvSubstituteContour(CvContourScanner scanner, CvSeq* new_contour)
Replaces retrieved contour
"""
cvEndFindContours.__doc__ = """CvSeq* cvEndFindContours(CvContourScanner* scanner)
Finishes scanning process
"""
cvMoments.__doc__ = """void cvMoments(const CvArr* arr, CvMOMENTS* moments, int binary=0)
Calculates all moments up to third order of a polygon or rasterized shape
"""
cvGetSpatialMoment.__doc__ = """double cvGetSpatialMoment(CvMOMENTS* moments, int x_order, int y_order)
Retrieves spatial moment from moment state structure
"""
cvGetCentralMoment.__doc__ = """double cvGetCentralMoment(CvMOMENTS* moments, int x_order, int y_order)
Retrieves central moment from moment state structure
"""
cvGetNormalizedCentralMoment.__doc__ = """double cvGetNormalizedCentralMoment(CvMOMENTS* moments, int x_order, int y_order)
Retrieves normalized central moment from moment state structure
"""
cvGetHuMoments.__doc__ = """void cvGetHuMoments(CvMOMENTS* moments, CvHuMoments* hu_moments)
Calculates seven Hu invariants
"""
cvHoughLines2.__doc__ = """CvSeq* cvHoughLines2(CvArr* image, void* line_storage, int method, double rho, double theta, int threshold, double param1=0, double param2=0)
Finds lines in binary image using Hough transform
"""
cvHoughCircles.__doc__ = """CvSeq* cvHoughCircles(CvArr* image, void* circle_storage, int method, double dp, double min_dist, double param1=100, double param2=100)
Finds circles in grayscale image using Hough transform
"""
cvDistTransform.__doc__ = """void cvDistTransform(const CvArr* src, CvArr* dst, int distance_type=CV_DIST_L2, int mask_size=3, const float* mask=NULL, CvArr* labels=NULL)
Calculates distance to closest zero pixel for all non-zero pixels of source image
"""
cvCreateHist.__doc__ = """CvHistogram* cvCreateHist(int dims, int* sizes, int type, float** ranges=NULL, int uniform=1)
Creates histogram
"""
cvSetHistBinRanges.__doc__ = """void cvSetHistBinRanges(CvHistogram* hist, float** ranges, int uniform=1)
Sets bounds of histogram bins
"""
cvReleaseHist.__doc__ = """void cvReleaseHist(CvHistogram** hist)
Releases histogram
"""
cvClearHist.__doc__ = """void cvClearHist(CvHistogram* hist)
Clears histogram
"""
cvMakeHistHeaderForArray.__doc__ = """CvHistogram* cvMakeHistHeaderForArray(int dims, int* sizes, CvHistogram* hist, float* data, float** ranges=NULL, int uniform=1)
Makes a histogram out of array
"""
cvGetMinMaxHistValue.__doc__ = """void cvGetMinMaxHistValue(const CvHistogram* hist, float* min_value, float* max_value, int* min_idx=NULL, int* max_idx=NULL)
Finds minimum and maximum histogram bins
"""
cvNormalizeHist.__doc__ = """void cvNormalizeHist(CvHistogram* hist, double factor)
Normalizes histogram
"""
cvThreshHist.__doc__ = """void cvThreshHist(CvHistogram* hist, double threshold)
Thresholds histogram
"""
cvCompareHist.__doc__ = """double cvCompareHist(const CvHistogram* hist1, const CvHistogram* hist2, int method)
Compares two dense histograms
"""
cvCopyHist.__doc__ = """void cvCopyHist(const CvHistogram* src, CvHistogram** dst)
Copies histogram
"""
cvCalcProbDensity.__doc__ = """void cvCalcProbDensity(const CvHistogram* hist1, const CvHistogram* hist2, CvHistogram* dst_hist, double scale=255)
Divides one histogram by another
"""
cvEqualizeHist.__doc__ = """void cvEqualizeHist(const CvArr* src, CvArr* dst)
Equalizes histogram of grayscale image
"""
cvMatchTemplate.__doc__ = """void cvMatchTemplate(const CvArr* image, const CvArr* templ, CvArr* result, int method)
Compares template against overlapped image regions
"""
cvMatchShapes.__doc__ = """double cvMatchShapes(const void* object1, const void* object2, int method, double parameter=0)
Compares two shapes
"""
cvApproxChains.__doc__ = """CvSeq* cvApproxChains(CvSeq* src_seq, CvMemStorage* storage, int method=CV_CHAIN_APPROX_SIMPLE, double parameter=0, int minimal_perimeter=0, int recursive=0)
Approximates Freeman chain(s) with polygonal curve
"""
cvStartReadChainPoints.__doc__ = """void cvStartReadChainPoints(CvChain* chain, CvChainPtReader* reader)
Initializes chain reader
"""
cvReadChainPoint.__doc__ = """CvPoint cvReadChainPoint(CvChainPtReader* reader)
Gets next chain point
"""
cvApproxPoly.__doc__ = """CvSeq* cvApproxPoly(const void* src_seq, int header_size, CvMemStorage* storage, int method, double parameter, int parameter2=0)
Approximates polygonal curve(s) with desired precision
"""
cvBoundingRect.__doc__ = """CvRect cvBoundingRect(CvArr* points, int update=0)
Calculates up-right bounding rectangle of point set
"""
cvContourArea.__doc__ = """double cvContourArea(const CvArr* contour, CvSlice slice=CV_WHOLE_SEQ)
Calculates area of the whole contour or contour section
"""
cvArcLength.__doc__ = """double cvArcLength(const void* curve, CvSlice slice=CV_WHOLE_SEQ, int is_closed=-1)
Calculates contour perimeter or curve length
"""
cvCreateContourTree.__doc__ = """CvContourTree* cvCreateContourTree(const CvSeq* contour, CvMemStorage* storage, double threshold)
Creates hierarchical representation of contour
"""
cvContourFromContourTree.__doc__ = """CvSeq* cvContourFromContourTree(const CvContourTree* tree, CvMemStorage* storage, CvTermCriteria criteria)
Restores contour from tree
"""
cvMatchContourTrees.__doc__ = """double cvMatchContourTrees(const CvContourTree* tree1, const CvContourTree* tree2, int method, double threshold)
Compares two contours using their tree representations
"""
cvMaxRect.__doc__ = """CvRect cvMaxRect(const CvRect* rect1, const CvRect* rect2)
Finds bounding rectangle for two given rectangles
"""
cvPointSeqFromMat.__doc__ = """CvSeq* cvPointSeqFromMat(int seq_kind, const CvArr* mat, CvContour* contour_header, CvSeqBlock* block)
Initializes point sequence header from a point vector
"""
cvBoxPoints.__doc__ = """void cvBoxPoints(CvBox2D box, CvPoint2D32f pt[4])
Finds box vertices
"""
cvFitEllipse2.__doc__ = """CvBox2D cvFitEllipse2(const CvArr* points)
Fits ellipse to set of 2D points
"""
cvFitLine.__doc__ = """void cvFitLine(const CvArr* points, int dist_type, double param, double reps, double aeps, float* line)
Fits line to 2D or 3D point set
"""
cvConvexHull2.__doc__ = """CvSeq* cvConvexHull2(const CvArr* input, void* hull_storage=NULL, int orientation=CV_CLOCKWISE, int return_points=0)
Finds convex hull of point set
"""
cvCheckContourConvexity.__doc__ = """int cvCheckContourConvexity(const CvArr* contour)
Tests contour convexity
"""
cvConvexityDefects.__doc__ = """CvSeq* cvConvexityDefects(const CvArr* contour, const CvArr* convexhull, CvMemStorage* storage=NULL)
Finds convexity defects of contour
"""
cvPointPolygonTest.__doc__ = """double cvPointPolygonTest(const CvArr* contour, CvPoint2D32f pt, int measure_dist)
Point in contour test
"""
cvMinAreaRect2.__doc__ = """CvBox2D cvMinAreaRect2(const CvArr* points, CvMemStorage* storage=NULL)
Finds circumscribed rectangle of minimal area for given 2D point set
"""
cvMinEnclosingCircle.__doc__ = """int cvMinEnclosingCircle(const CvArr* points, CvPoint2D32f* center, float* radius)
Finds circumscribed circle of minimal area for given 2D point set
"""
cvCalcPGH.__doc__ = """void cvCalcPGH(const CvSeq* contour, CvHistogram* hist)
Calculates pair-wise geometrical histogram for contour
"""
cvSubdivDelaunay2DInsert.__doc__ = """CvSubdiv2DPoint* cvSubdivDelaunay2DInsert(CvSubdiv2D* subdiv, CvPoint2D32f p)
Inserts a single point to Delaunay triangulation
"""
cvSubdiv2DLocate.__doc__ = """CvSubdiv2DPointLocation cvSubdiv2DLocate(CvSubdiv2D* subdiv, CvPoint2D32f pt, CvSubdiv2DEdge* edge, CvSubdiv2DPoint** vertex=NULL)
Inserts a single point to Delaunay triangulation
"""
cvFindNearestPoint2D.__doc__ = """CvSubdiv2DPoint* cvFindNearestPoint2D(CvSubdiv2D* subdiv, CvPoint2D32f pt)
Finds the closest subdivision vertex to given point
"""
cvCalcSubdivVoronoi2D.__doc__ = """void cvCalcSubdivVoronoi2D(CvSubdiv2D* subdiv)
Calculates coordinates of Voronoi diagram cells
"""
cvClearSubdivVoronoi2D.__doc__ = """void cvClearSubdivVoronoi2D(CvSubdiv2D* subdiv)
Removes all virtual points
"""
cvAcc.__doc__ = """void cvAcc(const CvArr* image, CvArr* sum, const CvArr* mask=NULL)
Adds frame to accumulator
"""
cvSquareAcc.__doc__ = """void cvSquareAcc(const CvArr* image, CvArr* sqsum, const CvArr* mask=NULL)
Adds the square of source image to accumulator
"""
cvMultiplyAcc.__doc__ = """void cvMultiplyAcc(const CvArr* image1, const CvArr* image2, CvArr* acc, const CvArr* mask=NULL)
Adds product of two input images to accumulator
"""
cvRunningAvg.__doc__ = """void cvRunningAvg(const CvArr* image, CvArr* acc, double alpha, const CvArr* mask=NULL)
Updates running average
"""
cvUpdateMotionHistory.__doc__ = """void cvUpdateMotionHistory(const CvArr* silhouette, CvArr* mhi, double timestamp, double duration)
Updates motion history image by moving silhouette
"""
cvCalcMotionGradient.__doc__ = """void cvCalcMotionGradient(const CvArr* mhi, CvArr* mask, CvArr* orientation, double delta1, double delta2, int aperture_size=3)
Calculates gradient orientation of motion history image
"""
cvCalcGlobalOrientation.__doc__ = """double cvCalcGlobalOrientation(const CvArr* orientation, const CvArr* mask, const CvArr* mhi, double timestamp, double duration)
Calculates global motion orientation of some selected region
"""
cvSegmentMotion.__doc__ = """CvSeq* cvSegmentMotion(const CvArr* mhi, CvArr* seg_mask, CvMemStorage* storage, double timestamp, double seg_thresh)
Segments whole motion into separate moving parts
"""
cvMeanShift.__doc__ = """int cvMeanShift(const CvArr* prob_image, CvRect window, CvTermCriteria criteria, CvConnectedComp* comp)
Finds object center on back projection
"""
cvCamShift.__doc__ = """int cvCamShift(const CvArr* prob_image, CvRect window, CvTermCriteria criteria, CvConnectedComp* comp, CvBox2D* box=NULL)
Finds object center, size, and orientation
"""
cvSnakeImage.__doc__ = """void cvSnakeImage(const IplImage* image, CvPoint* points, int length, float* alpha, float* beta, float* gamma, int coeff_usage, CvSize win, CvTermCriteria criteria, int calc_gradient=1)
Changes contour position to minimize its energy
"""
cvCalcOpticalFlowHS.__doc__ = """void cvCalcOpticalFlowHS(const CvArr* prev, const CvArr* curr, int use_previous, CvArr* velx, CvArr* vely, double lambda, CvTermCriteria criteria)
Calculates optical flow for two images
"""
cvCalcOpticalFlowLK.__doc__ = """void cvCalcOpticalFlowLK(const CvArr* prev, const CvArr* curr, CvSize win_size, CvArr* velx, CvArr* vely)
Calculates optical flow for two images
"""
cvCalcOpticalFlowBM.__doc__ = """void cvCalcOpticalFlowBM(const CvArr* prev, const CvArr* curr, CvSize block_size, CvSize shift_size, CvSize max_range, int use_previous, CvArr* velx, CvArr* vely)
Calculates optical flow for two images by block matching method
"""
cvCalcOpticalFlowPyrLK.__doc__ = """void cvCalcOpticalFlowPyrLK(const CvArr* prev, const CvArr* curr, CvArr* prev_pyr, CvArr* curr_pyr, const CvPoint2D32f* prev_features, CvPoint2D32f* curr_features, int count, CvSize win_size, int level, char* status, float* track_error, CvTermCriteria criteria, int flags)
Calculates optical flow for a sparse feature set using iterative Lucas-Kanade method in pyramids
"""
cvCreateKalman.__doc__ = """CvKalman* cvCreateKalman(int dynam_params, int measure_params, int control_params=0)
Allocates Kalman filter structure
"""
cvReleaseKalman.__doc__ = """void cvReleaseKalman(CvKalman** kalman)
Deallocates Kalman filter structure
"""
cvKalmanPredict.__doc__ = """const CvMat* cvKalmanPredict(CvKalman* kalman, const CvMat* control=NULL)
Estimates subsequent model state
"""
cvKalmanCorrect.__doc__ = """const CvMat* cvKalmanCorrect(CvKalman* kalman, const CvMat* measurement)
Adjusts model state
"""
cvCreateConDensation.__doc__ = """CvConDensation* cvCreateConDensation(int dynam_params, int measure_params, int sample_count)
Allocates ConDensation filter structure
"""
cvReleaseConDensation.__doc__ = """void cvReleaseConDensation(CvConDensation** condens)
Deallocates ConDensation filter structure
"""
cvConDensInitSampleSet.__doc__ = """void cvConDensInitSampleSet(CvConDensation* condens, CvMat* lower_bound, CvMat* upper_bound)
Initializes sample set for ConDensation algorithm
"""
cvConDensUpdateByTime.__doc__ = """void cvConDensUpdateByTime(CvConDensation* condens)
Estimates subsequent model state
"""
cvLoadHaarClassifierCascade.__doc__ = """CvHaarClassifierCascade* cvLoadHaarClassifierCascade(const char* directory, CvSize orig_window_size)
Loads a trained cascade classifier from file or the classifier database embedded in OpenCV
"""
cvReleaseHaarClassifierCascade.__doc__ = """void cvReleaseHaarClassifierCascade(CvHaarClassifierCascade** cascade)
Releases haar classifier cascade
"""
cvHaarDetectObjects.__doc__ = """CvSeq* cvHaarDetectObjects(const CvArr* image, CvHaarClassifierCascade* cascade, CvMemStorage* storage, double scale_factor=1.1, int min_neighbors=3, int flags=0, CvSize min_size=cvSize(0, 0))
Detects objects in the image
"""
cvSetImagesForHaarClassifierCascade.__doc__ = """void cvSetImagesForHaarClassifierCascade(CvHaarClassifierCascade* cascade, const CvArr* sum, const CvArr* sqsum, const CvArr* tilted_sum, double scale)
Assigns images to the hidden cascade
"""
cvRunHaarClassifierCascade.__doc__ = """int cvRunHaarClassifierCascade(CvHaarClassifierCascade* cascade, CvPoint pt, int start_stage=0)
Runs cascade of boosted classifier at given image location
"""
cvProjectPoints2.__doc__ = """void cvProjectPoints2(const CvMat* object_points, const CvMat* rotation_vector, const CvMat* translation_vector, const CvMat* intrinsic_matrix, const CvMat* distortion_coeffs, CvMat* image_points, CvMat* dpdrot=NULL, CvMat* dpdt=NULL, CvMat* dpdf=NULL, CvMat* dpdc=NULL, CvMat* dpddist=NULL)
Projects 3D points to image plane
"""
cvFindHomography.__doc__ = """void cvFindHomography(const CvMat* src_points, const CvMat* dst_points, CvMat* homography)
Finds perspective transformation between two planes
"""
cvCalibrateCamera2.__doc__ = """void cvCalibrateCamera2(const CvMat* object_points, const CvMat* image_points, const CvMat* point_counts, CvSize image_size, CvMat* intrinsic_matrix, CvMat* distortion_coeffs, CvMat* rotation_vectors=NULL, CvMat* translation_vectors=NULL, int flags=0)
Finds intrinsic and extrinsic camera parameters using calibration pattern
"""
cvFindExtrinsicCameraParams2.__doc__ = """void cvFindExtrinsicCameraParams2(const CvMat* object_points, const CvMat* image_points, const CvMat* intrinsic_matrix, const CvMat* distortion_coeffs, CvMat* rotation_vector, CvMat* translation_vector)
Finds extrinsic camera parameters for particular view
"""
cvRodrigues2.__doc__ = """int cvRodrigues2(const CvMat* src, CvMat* dst, CvMat* jacobian=0)
Converts rotation matrix to rotation vector or vice versa
"""
cvUndistort2.__doc__ = """void cvUndistort2(const CvArr* src, CvArr* dst, const CvMat* intrinsic_matrix, const CvMat* distortion_coeffs)
Transforms image to compensate lens distortion
"""
cvInitUndistortMap.__doc__ = """void cvInitUndistortMap(const CvMat* intrinsic_matrix, const CvMat* distortion_coeffs, CvArr* mapx, CvArr* mapy)
Computes undistortion map
"""
cvFindChessboardCorners.__doc__ = """int cvFindChessboardCorners(const void* image, CvSize pattern_size, CvPoint2D32f* corners, int* corner_count=NULL, int flags=CV_CALIB_CB_ADAPTIVE_THRESH)
Finds positions of internal corners of the chessboard
"""
cvDrawChessboardCorners.__doc__ = """void cvDrawChessboardCorners(CvArr* image, CvSize pattern_size, CvPoint2D32f* corners, int count, int pattern_was_found)
Renders the detected chessboard corners
"""
cvCreatePOSITObject.__doc__ = """CvPOSITObject* cvCreatePOSITObject(CvPoint3D32f* points, int point_count)
Initializes structure containing object information
"""
cvPOSIT.__doc__ = """void cvPOSIT(CvPOSITObject* posit_object, CvPoint2D32f* image_points, double focal_length, CvTermCriteria criteria, CvMatr32f rotation_matrix, CvVect32f translation_vector)
Implements POSIT algorithm
"""
cvReleasePOSITObject.__doc__ = """void cvReleasePOSITObject(CvPOSITObject** posit_object)
Deallocates 3D object structure
"""
cvCalcImageHomography.__doc__ = """void cvCalcImageHomography(float* line, CvPoint3D32f* center, float* intrinsic, float* homography)
Calculates homography matrix for oblong planar object (e.g. arm)
"""
cvFindFundamentalMat.__doc__ = """int cvFindFundamentalMat(const CvMat* points1, const CvMat* points2, CvMat* fundamental_matrix, int method=CV_FM_RANSAC, double param1=1., double param2=0.99, CvMat* status=NULL)
Calculates fundamental matrix from corresponding points in two images
"""
cvComputeCorrespondEpilines.__doc__ = """void cvComputeCorrespondEpilines(const CvMat* points, int which_image, const CvMat* fundamental_matrix, CvMat* correspondent_line)
For points in one image of stereo pair computes the corresponding epilines in the other image
"""
cvConvertPointsHomogenious.__doc__ = """void cvConvertPointsHomogenious(const CvMat* src, CvMat* dst)
Converts points to/from homogeneous coordinates
"""
cvNamedWindow.__doc__ = """int cvNamedWindow(const char* name, int flags)
Creates window
"""
cvDestroyWindow.__doc__ = """void cvDestroyWindow(const char* name)
Destroys a window
"""
cvDestroyAllWindows.__doc__ = """void cvDestroyAllWindows(void)
Destroys all the HighGUI windows
"""
cvResizeWindow.__doc__ = """void cvResizeWindow(const char* name, int width, int height)
Sets window size
"""
cvMoveWindow.__doc__ = """void cvMoveWindow(const char* name, int x, int y)
Sets window position
"""
cvGetWindowHandle.__doc__ = """void* cvGetWindowHandle(const char* name)
Gets window handle by name
"""
cvGetWindowName.__doc__ = """const char* cvGetWindowName(void* window_handle)
Gets window name by handle
"""
cvShowImage.__doc__ = """void cvShowImage(const char* name, const CvArr* image)
Shows the image in the specified window
"""
cvGetTrackbarPos.__doc__ = """int cvGetTrackbarPos(const char* trackbar_name, const char* window_name)
Retrieves trackbar position
"""
cvSetTrackbarPos.__doc__ = """void cvSetTrackbarPos(const char* trackbar_name, const char* window_name, int pos)
Sets trackbar position
"""
cvWaitKey.__doc__ = """int cvWaitKey(int delay=0)
Waits for a pressed key
"""
cvLoadImage.__doc__ = """IplImage* cvLoadImage(const char* filename, int iscolor=1)
Loads an image from file
"""
cvSaveImage.__doc__ = """int cvSaveImage(const char* filename, const CvArr* image)
Saves an image to the file
"""
cvCreateFileCapture.__doc__ = """CvCapture* cvCreateFileCapture(const char* filename)
Initializes capturing video from file
"""
cvCreateCameraCapture.__doc__ = """CvCapture* cvCreateCameraCapture(int index)
Initializes capturing video from camera
"""
cvReleaseCapture.__doc__ = """void cvReleaseCapture(CvCapture** capture)
Releases the CvCapture structure
"""
cvGrabFrame.__doc__ = """int cvGrabFrame(CvCapture* capture)
Grabs frame from camera or file
"""
cvRetrieveFrame.__doc__ = """IplImage* cvRetrieveFrame(CvCapture* capture)
Gets the image grabbed with cvGrabFrame
"""
cvQueryFrame.__doc__ = """IplImage* cvQueryFrame(CvCapture* capture)
Grabs and returns a frame from camera or file
"""
cvGetCaptureProperty.__doc__ = """double cvGetCaptureProperty(CvCapture* capture, int property_id)
Gets video capturing properties
"""
cvSetCaptureProperty.__doc__ = """int cvSetCaptureProperty(CvCapture* capture, int property_id, double value)
Sets video capturing properties
"""
cvCreateVideoWriter.__doc__ = """CvVideoWriter* cvCreateVideoWriter(const char* filename, int fourcc, double fps, CvSize frame_size, int is_color=1)
Creates video file writer
"""
cvReleaseVideoWriter.__doc__ = """void cvReleaseVideoWriter(CvVideoWriter** writer)
Releases AVI writer
"""
cvWriteFrame.__doc__ = """int cvWriteFrame(CvVideoWriter* writer, const IplImage* image)
Writes a frame to video file
"""
cvInitSystem.__doc__ = """int cvInitSystem(int argc, char** argv)
Initializes HighGUI
"""
cvConvertImage.__doc__ = """void cvConvertImage(const CvArr* src, CvArr* dst, int flags=0)
Converts one image to another with optional vertical flip
"""
# --- SOME FUNCTION COPIES FROM THE C HEADERS (backward compatibility?) ---
cvGetSubArr = cvGetSubRect
cvZero = cvSetZero
cvCvtScale = cvConvertScale
cvScale = cvConvertScale
cvCvtScaleAbs = cvConvertScaleAbs
cvCheckArray = cvCheckArr
cvMatMulAddEx = cvGEMM
cvMatMulAddS = cvTransform
cvT = cvTranspose
cvMirror = cvFlip
cvInv = cvInvert
cvMahalonobis = cvMahalanobis
cvFFT = cvDFT
cvGraphFindEdge = cvFindGraphEdge
cvGraphFindEdgeByPtr = cvFindGraphEdgeByPtr
cvDrawRect = cvRectangle
cvDrawLine = cvLine
cvDrawCircle = cvCircle
cvDrawEllipse = cvEllipse
cvDrawPolyLine = cvPolyLine
# wrap up all the functions into a single object so we can say
# from OpenCV import cv
# cv.Foo instead of OpenCV.cvFoo
class namespace:
pass
nsp = namespace()
mdict = locals()
for sym, val in mdict.items():
if sym.startswith('CV_'):
sname = sym[3:]
if sname == 'SVD':
sname = '_SVD'
elif sym.startswith('cv'):
sname = sym[2:]
elif sym.startswith('Cv'):
sname = sym[2:]
else:
continue
if not hasattr(nsp, sname):
setattr(nsp, sname, val)
else:
print 'name collision', sname, getattr(nsp, sname)
cv = nsp
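# A minimal usage sketch of the namespace object built above (assuming this
# module is importable as OpenCV, as the comment block suggests; "lena.jpg" is
# just a placeholder filename), untested here:
#
#   from OpenCV import cv
#   img = cv.LoadImage("lena.jpg", 1)
#   cv.NamedWindow("demo", 1)
#   cv.ShowImage("demo", img)
#   cv.WaitKey(0)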
# --- Main program -----------------------------------------------------------
if __name__ == "__main__":
print __doc__
<file_sep>/PhidgetsPython/Python/RFID-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import AttachEventArgs, DetachEventArgs, ErrorEventArgs, OutputChangeEventArgs, TagEventArgs
from Phidgets.Devices.RFID import RFID
#Create an RFID object
try:
rfid = RFID()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def displayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (rfid.isAttached(), rfid.getDeviceName(), rfid.getSerialNum(), rfid.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
print("Number of outputs: %i -- Antenna Status: %s -- Onboard LED Status: %s" % (rfid.getOutputCount(), rfid.getAntennaOn(), rfid.getLEDOn()))
#Event Handler Callback Functions
def rfidAttached(e):
attached = e.device
print("RFID %i Attached!" % (attached.getSerialNum()))
def rfidDetached(e):
detached = e.device
print("RFID %i Detached!" % (detached.getSerialNum()))
def rfidError(e):
try:
source = e.device
print("RFID %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
def rfidOutputChanged(e):
source = e.device
print("RFID %i: Output %i State: %s" % (source.getSerialNum(), e.index, e.state))
def rfidTagGained(e):
source = e.device
rfid.setLEDOn(1)
print("RFID %i: Tag Read: %s" % (source.getSerialNum(), e.tag))
def rfidTagLost(e):
source = e.device
rfid.setLEDOn(0)
print("RFID %i: Tag Lost: %s" % (source.getSerialNum(), e.tag))
#Main Program Code
try:
rfid.setOnAttachHandler(rfidAttached)
rfid.setOnDetachHandler(rfidDetached)
rfid.setOnErrorhandler(rfidError)
rfid.setOnOutputChangeHandler(rfidOutputChanged)
rfid.setOnTagHandler(rfidTagGained)
rfid.setOnTagLostHandler(rfidTagLost)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
rfid.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
rfid.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
rfid.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
displayDeviceInfo()
print("Turning on the RFID antenna....")
rfid.setAntennaOn(True)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
try:
lastTag = rfid.getLastTag()
print("Last Tag: %s" % (lastTag))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Closing...")
try:
rfid.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/navigation/gps/test_gps_functions.py
#!/usr/bin/python
from gps_functions import *
from math import *
#Two Example GPS Locations
lat1 = 53.32055555555556
lat2 = 53.31861111111111
lon1 = -1.7297222222222221
lon2 = -1.6997222222222223
#lat1 = 33.466582
#lon1 = -86.824076
#peru
#lat2 = -14.087724
#lon2 = -75.763532
#lat2 = 33.466582
#lon2 = -86.8240652425
Aaltitude = 2000
Oppsite = 20000
bearing = 270
distance_in_meters = 10.0
pos1 = get_dest_gps_cood(lat1, lon1, bearing, distance_in_meters/2.0)
#print "pos1 lat long:", pos1
pos2 = get_dest_gps_cood(pos1[0], pos1[1], 0, distance_in_meters/2.0)
print "pos2 lat long:", pos2
a = [lat1, lon1]
b = [pos2[0], pos2[1]]
print "lldistance(a, b) in meters: ", lldistance(a, b)
print "haversine(lon1, lat1, lon2, lat2) in meters: ", haversine(lon1, lat1, pos2[0], pos2[1])*1000
print "calculate_distance(lat1, lon1, lat2, lon2): ", calculate_distance(lat1, lon1, pos2[0], pos2[1])
#print "points2distance(a, b):", points2distance(a, b)
bearing = degrees(calcBearing(lat1, lon1, pos2[0], pos2[1]))
bearing = (bearing + 360) % 360
print "calcBearing(lat1, lon1, lat2, lon2):", bearing
print "distance.distance(ne, cl).meters" , geopyDistance(lat1, lon1, pos2[0], pos2[1])
print "bearing2(lat1, lon1, lat2, lon2):", bearing2(lat1, lon1, pos2[0], pos2[1])
destLat, destLong = destination_coordinates(lat1, lon1, bearing , calculate_distance(lat1, lon1, pos2[0], pos2[1]))
print "destination_coordinates(lat1, lon1, 55, 12):",pos2[0], pos2[1]
print "bearing2(lat1, lon1, destLat, destLong):", bearing2(lat1, lon1, pos2[0], pos2[1])
print "meters_to_feet(meters)", meters_to_feet(calculate_distance(lat1, lon1, pos2[0], pos2[1]))
H_Dist, distance, c, H_Bearing = distance_and_bearings(lat1, lon1, pos2[0], pos2[1], start_altitude=0, dest_altitude=0)
print("---------------------------------------")
print("Horizontial Distance:", H_Dist,"meters")
print(" Vertical Distance:", distance,"km")
print(" Vertical Bearing:",c)
print(" Horizontial Bearing:",H_Bearing)
<file_sep>/lib/gps_functions.py
#!/usr/bin/python
#sudo apt-get install gpsd gpsd-clients
#gpsd /dev/ttyUSB0 -b -n
#The gpsd server reads NMEA sentences from the gps unit and is accessed on port 2947. You can test if everything is working by running a pre-built gpsd client such as xgps.
#sudo easy_install geopy
from geopy import distance
import geopy
from geopy.distance import VincentyDistance
from math import *
import gps, os, time
#from __future__ import division
import math  # several helpers below use the explicit math. prefix
from math import sin, cos, radians, sqrt, atan2, asin, pi
from subprocess import call
from identify_device_on_ttyport import *
from threading import *
import random
rEarth = 6371.01 # Earth's average radius in km
epsilon = 0.000001 # threshold for floating-point equality
# model major (km) minor (km) flattening
ELLIPSOIDS = {'WGS-84': (6378.137, 6356.7523142, 1 / 298.257223563),
'GRS-80': (6378.137, 6356.7523141, 1 / 298.257222101),
'Airy (1830)': (6377.563396, 6356.256909, 1 / 299.3249646),
'Intl 1924': (6378.388, 6356.911946, 1 / 297.0),
'Clarke (1880)': (6378.249145, 6356.51486955, 1 / 293.465),
'GRS-67': (6378.1600, 6356.774719, 1 / 298.25),
}
class mobot_gps( Thread ):
def __init__(self):
Thread.__init__( self )
self.latitude = 0.0
self.longitude = 0.0
self.altitude = 0.0
self.satellites = 0
self.active_satellites = 0
self.track = 0.0
self.speed = 0.0
self.samples = 10
def __del__(self):
del(self)
def run(self):
self.process()
def process( self ):
gpss = []
#get list of all attached gps units
all_gps_list = gps_list()
print "Total # GPS Units Found:", len (all_gps_list)
#start all gps units
start_all_gps()
#connect via python
for n in range(len(all_gps_list)):
port = str(2947+n)
print "port", port
gpss.append(gps.gps(host="localhost", port=port))
print "starting_gps:", gpss[n]
#returncode = call(start_gps, shell=True)
time.sleep(1)
gpss[n].next()
gpss[n].stream()
#print 'Satellites (total of', len(gpss[n].satellites) , ' in view)'
#time.sleep(1)
while True :
avg_latitude = 0.0
avg_longitude = 0.0
avg_altitude = 0.0
avg_satellites = 0
avg_active_satellites = 0
avg_track = 0.0;
avg_speed = 0.0
try:
for n in range(len(all_gps_list)):
for x in xrange(1, self.samples):
#os.system("clear")
#while ActiveSatelliteCount(gpss[n].satellites) < 4:
# print "Acquiring at least 6 GPS Satellites..."
# print 'Satellites (total of', len(gpss[n].satellites) , ' in view)'
# print "Number of acquired satellites: ", ActiveSatelliteCount(gpss[n].satellites)
# time.sleep(1)
# os.system("clear")
# gpss[n].next()
# gpss[n].stream()
gpss[n].next()
gpss[n].stream()
#test data
#gpss[n].fix.latitude = 53.32055555555556 + (random.random() * 0.00005)
#gpss[n].fix.longitude =-1.7297222222222221 + (random.random() * 0.00005)
#print "READING GPS:", n, " ", x
#print "-------------"
#print 'latitude ' , gpss[n].fix.latitude
#print 'longitude ' , gpss[n].fix.longitude
#print 'mode:', gpss[n].fix.mode
#print 'track: ', gpss[n].fix.track
#print 'time utc ' , gpss[n].utc, gpss[n].fix.time
#print 'altitude ' , gpss[n].fix.altitude
#print 'epx ', gpss[n].fix.epx
#print 'epv ', gpss[n].fix.epv
#print 'ept ', gpss[n].fix.ept
#print 'epc ', gpss[n].fix.epc
#print 'epd ', gpss[n].fix.epd
#print 'eps ', gpss[n].fix.eps
#print "speed ", gpss[n].fix.speed
#print "climb " , gpss[n].fix.climb
#print
#print 'Satellites (total of', len(gpss[n].satellites) , ' in view)'
#for i in gpss[n].satellites:
# print '\t', i
#print "Active satellites used: ", ActiveSatelliteCount(gpss[n].satellites)
avg_latitude = (avg_latitude + gpss[n].fix.latitude)
avg_longitude = (avg_longitude + gpss[n].fix.longitude)
avg_altitude = (avg_altitude + gpss[n].fix.altitude)
avg_track = (avg_track + gpss[n].fix.track)
avg_speed = (avg_speed + gpss[n].fix.speed)
avg_active_satellites = (avg_active_satellites + ActiveSatelliteCount(gpss[n].satellites))
avg_satellites = (avg_satellites + len(gpss[n].satellites))
time.sleep(.2)
#print "Averaging....", (x*len(all_gps_list))
self.latitude = (avg_latitude / (x*len(all_gps_list)))
self.longitude = (avg_longitude / (x*len(all_gps_list)))
#print "total sats:", self.active_satellites
self.active_satellites = ( avg_active_satellites / (x*len(all_gps_list)))
self.track = (avg_track / (x*len(all_gps_list)))
self.speed = (avg_speed / (x*len(all_gps_list)))
self.altitude = (avg_altitude / (x*len(all_gps_list)))
self.satellites = (avg_satellites / (x*len(all_gps_list)))
#time.sleep(1)
#print 'Avg latitude : ' , self.latitude, " ", abs(self.latitude - gpss[n].fix.latitude)
#print 'Avg longitude: ' , self.longitude, " ", abs(self.longitude - gpss[n].fix.longitude)
#print 'Avg Active Satellites: ' , self.active_satellites
#print "Distance: ", round((lldistance((self.latitude, self.longitude), (gpss[n].fix.latitude, gpss[n].fix.longitude)) * 3.28084), 4), " feet"
#time.sleep(5)
except:
#time.sleep(1)
pass
#os.system("clear")
def ActiveSatelliteCount(session):
count = 0
for i in range(len(session)):
s = str(session[i])
if s[-1] == "y":
count = count + 1
return count
def start_all_gps():
#kill all gpsd
print "this program must be run as sudo"
os.system("killall gpsd")
time.sleep(1)
all_gps_list = gps_list()
#gps_list = ["/dev/ttyUSB0", "/dev/ttyUSB1"]
print "gps_list:", all_gps_list
print len(all_gps_list)
for n in range(len(all_gps_list)):
start_gps = "gpsd "+all_gps_list[n]+" -S " + str(2947+n) + " -n -b"
print "start_gps:", start_gps
returncode = call(start_gps, shell=True)
time.sleep(4)
#print returncode
def start_a_gps(gps_to_start, port_offset=0):
    # port_offset selects which gpsd control port (2947+offset) this unit binds to
    start_gps = "gpsd "+gps_to_start+" -S " + str(2947+port_offset) + " -n -b"
print "start_gps:", start_gps
returncode = call(start_gps, shell=True)
time.sleep(2)
#print returncode
def gps_list():
gps_list = find_usb_tty("067b","2303")
return gps_list
def decdeg2dms(dd):
mnt,sec = divmod(dd*3600,60)
deg,mnt = divmod(mnt,60)
return deg,mnt,sec
def lldistance(a, b):
"""
Calculates the distance between two GPS points (decimal)
@param a: 2-tuple of point A
@param b: 2-tuple of point B
@return: distance in m
"""
r = 6367442.5 # average earth radius in m
dLat = radians(a[0]-b[0])
dLon = radians(a[1]-b[1])
x = sin(dLat/2) ** 2 + \
cos(radians(a[0])) * cos(radians(b[0])) *\
sin(dLon/2) ** 2
#original# y = 2 * atan2(sqrt(x), sqrt(1-x))
y = 2 * asin(sqrt(x))
d = r * y
return d
def geopyDistance(lat1, lon1, lat2, lon2):
x = distance.distance((lat1, lon1), (lat2,lon2)).meters
return x
def haversine(lon1, lat1, lon2, lat2):
"""
Calculate the great circle distance between two points
on the earth (specified in decimal degrees)
"""
# convert decimal degrees to radians
lon1, lat1, lon2, lat2 = map(radians, [lon1, lat1, lon2, lat2])
# haversine formula
dlon = lon2 - lon1
dlat = lat2 - lat1
a = sin(dlat/2)**2 + cos(lat1) * cos(lat2) * sin(dlon/2)**2
c = 2 * asin(sqrt(a))
km = 6367 * c
return km
def calcBearing(lat1, lon1, lat2, lon2):
lon1, lat1, lon2, lat2 = map(radians, [lon1, lat1, lon2, lat2])
dLon = lon2 - lon1
y = sin(dLon) * cos(lat2)
x = cos(lat1) * sin(lat2) \
- sin(lat1) * cos(lat2) * cos(dLon)
return atan2(y, x)
#def vertical_angle():
def bearing2(lat1, lon1, lat2, lon2):
startLat = math.radians(lat1)
startLong = math.radians(lon1)
endLat = math.radians(lat2)
endLong = math.radians(lon2)
dLong = endLong - startLong
dPhi = math.log(math.tan(endLat/2.0+math.pi/4.0)/math.tan(startLat/2.0+math.pi/4.0))
if abs(dLong) > math.pi:
if dLong > 0.0:
dLong = -(2.0 * math.pi - dLong)
else:
dLong = (2.0 * math.pi + dLong)
bearing = (math.degrees(math.atan2(dLong, dPhi)) + 360.0) % 360.0;
return bearing
def calculate_distance(lat1, lon1, lat2, lon2):
'''
* Calculates the distance between two points given their (lat, lon) co-ordinates.
* It uses the Spherical Law Of Cosines (http://en.wikipedia.org/wiki/Spherical_law_of_cosines):
*
* cos(c) = cos(a) * cos(b) + sin(a) * sin(b) * cos(C) (1)
*
* In this case:
* a = lat1 in radians, b = lat2 in radians, C = (lon2 - lon1) in radians
* and because the latitude range is [-pi/2, pi/2] instead of [0, pi]
* and the longitude range is [-pi, pi] instead of [0, 2*pi]
* (1) transforms into:
*
* x = cos(c) = sin(a) * sin(b) + cos(a) * cos(b) * cos(C)
*
* Finally the distance is arccos(x)
'''
if ((lat1 == lat2) and (lon1 == lon2)):
return 0
try:
delta = lon2 - lon1
a = math.radians(lat1)
b = math.radians(lat2)
C = math.radians(delta)
x = math.sin(a) * math.sin(b) + math.cos(a) * math.cos(b) * math.cos(C)
distance = math.acos(x) # in radians
distance = math.degrees(distance) # in degrees
distance = distance * 60 # 60 nautical miles / lat degree
distance = distance * 1852 # conversion to meters
#distance = round(distance)
return distance;
except:
return 0
def deg2rad(angle):
return angle*pi/180
def rad2deg(angle):
return angle*180/pi
def pointRadialDistance(lat1, lon1, bearing, distance):
"""
Return final coordinates (lat2,lon2) [in degrees] given initial coordinates
(lat1,lon1) [in degrees] and a bearing [in degrees] and distance [in km]
"""
rlat1 = deg2rad(lat1)
rlon1 = deg2rad(lon1)
rbearing = deg2rad(bearing)
rdistance = distance / rEarth # normalize linear distance to radian angle
rlat = asin( sin(rlat1) * cos(rdistance) + cos(rlat1) * sin(rdistance) * cos(rbearing) )
if cos(rlat) == 0 or abs(cos(rlat)) < epsilon: # Endpoint a pole
rlon=rlon1
else:
rlon = ( (rlon1 - asin( sin(rbearing)* sin(rdistance) / cos(rlat) ) + pi ) % (2*pi) ) - pi
lat = rad2deg(rlat)
lon = rad2deg(rlon)
return (lat, lon)
def recalculate_coordinate(val, _as=None):
"""
Accepts a coordinate as a tuple (degree, minutes, seconds)
You can give only one of them (e.g. only minutes as a floating point number)
and it will be duly recalculated into degrees, minutes and seconds.
Return value can be specified as 'deg', 'min' or 'sec'; default return value is
a proper coordinate tuple.
This formula is only an approximation when applied to the Earth, because the Earth is not a perfect sphere: the Earth radius R varies from 6356.78 km at the poles to 6378.14 km at the equator. There are small corrections, typically on the order of 0.1% (assuming the geometric mean R = 6367.45 km is used everywhere, for example), because of this slight ellipticity of the planet. A more accurate method, which takes into account the Earth's ellipticity, is given by Vincenty's formulae.
"""
deg, min, sec = val
# pass outstanding values from right to left
min = (min or 0) + int(sec) / 60
sec = sec % 60
deg = (deg or 0) + int(min) / 60
min = min % 60
# pass decimal part from left to right
dfrac, dint = math.modf(deg)
min = min + dfrac * 60
deg = dint
mfrac, mint = math.modf(min)
sec = sec + mfrac * 60
min = mint
if _as:
sec = sec + min * 60 + deg * 3600
if _as == 'sec': return sec
if _as == 'min': return sec / 60
if _as == 'deg': return sec / 3600
return deg, min, sec
def points2distance(start, end):
"""
Calculate distance (in kilometers) between two points given as (long, latt) pairs
based on Haversine formula (http://en.wikipedia.org/wiki/Haversine_formula).
Implementation inspired by JavaScript implementation from
http://www.movable-type.co.uk/scripts/latlong.html
Accepts coordinates as tuples (deg, min, sec), but coordinates can be given
in any form - e.g. can specify only minutes:
(0, 3133.9333, 0)
is interpreted as
(52.0, 13.0, 55.998000000008687)
which, not accidentally, is the lattitude of Warsaw, Poland.
"""
start_long = math.radians(recalculate_coordinate(start[0], 'deg'))
start_latt = math.radians(recalculate_coordinate(start[1], 'deg'))
end_long = math.radians(recalculate_coordinate(end[0], 'deg'))
end_latt = math.radians(recalculate_coordinate(end[1], 'deg'))
d_latt = end_latt - start_latt
d_long = end_long - start_long
a = math.sin(d_latt/2)**2 + math.cos(start_latt) * math.cos(end_latt) * math.sin(d_long/2)**2
c = 2 * math.atan2(math.sqrt(a), math.sqrt(1-a))
return rEarth * c
def meters_to_feet(meters):
print "metters", meters
return (meters*3.28084)
def destination_coordinates(lat1, lon1, bearing_in_degrees, distance_to_travel_in_meters):
# given: lat1, lon1, bearing = bearing to travel in degrees, distance_to_travel = distance to travel in meters
distance_to_travel_in_meters = float(distance_to_travel_in_meters) / 1000
print distance_to_travel_in_meters
origin = geopy.Point(lat1, lon1)
destination = VincentyDistance(kilometers=distance_to_travel_in_meters).destination(origin, bearing_in_degrees)
lat2, lon2 = destination.latitude, destination.longitude
return lat2,lon2
def distance_and_bearings(lat1, lon1, lat2, lon2, start_altitude=0, dest_altitude=0):
#GPS distance and bearing between two GPS points (Python recipe)
#This code outputs the distance between 2 GPS points showing also the vertical and horizontal bearing between them.
#Haversine formula to find vertical angle and distance
lon1, lat1, lon2, lat2 = map(radians, [lon1, lat1, lon2, lat2])
dlon = lon2 - lon1
dlat = lat2 - lat1
a = sin(dlat/2)**2 + cos(lat1) * cos(lat2) * sin(dlon/2)**2
c = 2 * atan2(sqrt(a), sqrt(1-a))
#convert to meters
#Base = Base * 1000 6367442.5
Base = 6367442.5 * c # average earth radius in m
Bearing = calcBearing(lat1, lon1, lat2, lon2)
Bearing = round(degrees(Bearing), 2)
distance = Base * 2 + dest_altitude * 2 / 2
Caltitude = dest_altitude - start_altitude
#Conversion from radians to degrees
a = dest_altitude/Base
b = atan(a)
c = degrees(b)
#Convert meters into Kilometers
#distance = distance / 1000
Base = round(Base,2)
return Base, distance, c, Bearing
#Horizontal Bearing
#NOTE: this second calcBearing expects latitudes/longitudes already in radians
#and shadows the degree-converting definition earlier in this module.
def calcBearing(lat1, lon1, lat2, lon2):
dLon = lon2 - lon1
y = sin(dLon) * cos(lat2)
x = cos(lat1) * sin(lat2) \
- sin(lat1) * cos(lat2) * cos(dLon)
return atan2(y, x)
def get_dest_gps_cood(lat1, lon1, bearing, distance_in_meters):
# given: lat1, lon1, b = bearing in degrees, d = distance in kilometers
#returns new lat long
#lat1 = 53.32055555555556
#lat2 = 53.31861111111111
#lon1 = -1.7297222222222221
#lon2 = -1.6997222222222223
d = distance_in_meters / 1000.0
b = bearing
print d, b
origin = geopy.Point(lat1, lon1)
destination = VincentyDistance(kilometers=d).destination(origin, b)
lat2, lon2 = destination.latitude, destination.longitude
return lat2, lon2
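# ---------------------------------------------------------------------------
# Minimal self-test sketch: only runs when this module is executed directly and
# assumes the geopy/gps dependencies imported above are installed. The two
# coordinates are the same arbitrary sample points used in the test script.
if __name__ == "__main__":
    demo_lat1, demo_lon1 = 53.32055555555556, -1.7297222222222221
    demo_lat2, demo_lon2 = 53.31861111111111, -1.6997222222222223
    print "haversine distance (km):", haversine(demo_lon1, demo_lat1, demo_lon2, demo_lat2)
    print "spherical-law distance (m):", calculate_distance(demo_lat1, demo_lon1, demo_lat2, demo_lon2)
    print "forward bearing (deg):", bearing2(demo_lat1, demo_lon1, demo_lat2, demo_lon2)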
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_rf.py
from milk.supervised import randomforest
import numpy as np
def test_rf():
from milksets import wine
features, labels = wine.load()
features = features[labels < 2]
labels = labels[labels < 2]
learner = randomforest.rf_learner()
model = learner.train(features[::5], labels[::5])
test = [model.apply(f) for f in features]
assert np.mean(labels == test) > .7
<file_sep>/build/pyUSB/README.rst
=======================================
PyUSB 1.0 - Easy USB access from Python
=======================================
Introduction
============
The PyUSB module provides for Python easy access to the host
machine's Universal Serial Bus (USB) system.
Until 0.4 version, PyUSB used to be a thin wrapper over libusb.
With 1.0 version, things changed considerably. Now PyUSB is an
API rich, backend neutral Python USB module easy to use.
As with most Python modules, PyUSB's documentation is based on Python
doc strings and can therefore be manipulated by tools such as pydoc.
You can also find a tutorial at: http://pyusb.sourceforge.net/docs/1.0/tutorial.html.
PyUSB is being developed and tested in Linux and Windows, but it should work
fine in any platform running Python >= 2.4, ctypes and at least one of the
builtin backends.
PyUSB supports libusb 0.1, libusb 1.0 and OpenUSB, but the user does not need
to worry about that, unless in some corner cases.
If you have any question about PyUSB, you can use the PyUSB mailing list
hosted in the SourceForge. In the PyUSB website (http://pyusb.sourceforge.net)
you can find instructions on how to subscribe to the mailing list.
Installing PyUSB on GNU/Linux Systems
=====================================
These instructions are for Debian-based systems. Instructions for
other flavors of GNU/Linux should be similar.
You will first need to install the following packages:
1) python (PyUSB is useless without it), version >= 2.4
2) At least one of the supported libraries (libusb 1.0, libusb 0.1 or OpenUSB)
3) If your Python version is < 2.5, you have to install ctypes as a separate package,
   because these versions of Python do not ship it.
For example, the command::
$ sudo apt-get install python libusb
should install all these packages on most Debian-based systems with
access to the proper package repositories.
Once the above packages are installed, you can install PyUSB
with the command::
$ sudo python setup.py install
Run it as root from within the same directory as this README file.
Installing PyUSB on Windows
===========================
Now that PyUSB is 100% written in Python, you install it on Windows
in the same way you do on Linux::
python setup.py install
Remember that you need libusb (1.0 or 0.1) or OpenUSB running on your
system. For Windows users, libusb 1.0 support is still experimental, so the
libusb-win32 package is recommended. Check the libusb website for updates
(http://www.libusb.org).
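
Once one of the backends above is installed, a quick way to check that PyUSB
can see your hardware is a tiny script along the lines of the tutorial (the
vendor and product IDs below are placeholders -- substitute the IDs of a
device you actually own)::

    import usb.core

    # look the device up by its vendor/product ID pair
    dev = usb.core.find(idVendor=0xfffe, idProduct=0x0001)

    if dev is None:
        raise ValueError('Device not found')

    # activate the first (default) configuration
    dev.set_configuration()
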
Reporting bugs/Submitting patches
=================================
Some people have been sending patches and reporting bugs directly
to my email. Please do it through Trac instead; I had a hard time tracking
their names down to put them in the acknowledgments file. ;-)
PS: this README file was based on the great Josh Lifton's one... ^_^
<file_sep>/powertrain/motor_test.py
from robomow_motor_class_arduino import *
motor1 = robomow_motor()
print "motor1.isConnected:", motor1.isConnected
#print motor1.com_stats()
while True:
print motor1.motor_stats()
time.sleep(.5)
'''
for x in range(2):
print "forward"
motor1.forward(100)
time.sleep(1)
print motor1.motor1_speed, motor1.motor2_speed
print "rev"
motor1.reverse(100)
print motor1.motor1_speed, motor1.motor2_speed
time.sleep(2)
print "right"
motor1.right(100)
print motor1.motor1_speed, motor1.motor2_speed
time.sleep(2)
print "left"
motor1.left(100)
print motor1.motor1_speed, motor1.motor2_speed
time.sleep(2)
print "stop"
motor1.stop()
print motor1.motor1_speed, motor1.motor2_speed
for i in xrange(0, 101, 100):
print "forward"
print motor1.forward(i)
time.sleep(1)
print "l/r motors speeds: ", motor1.lmotor_speed, motor1.rmotor_speed
print "reverse"
print motor1.reverse(i)
time.sleep(1)
print "l/r motors speeds: ", motor1.lmotor_speed, motor1.rmotor_speed
print "left"
print motor1.left(i)
time.sleep(1)
print "l/r motors speeds: ", motor1.lmotor_speed, motor1.rmotor_speed
print "right"
print motor1.right(i)
time.sleep(1)
print "l/r motors speeds: ", motor1.lmotor_speed, motor1.rmotor_speed
print "STOP"
print motor1.stop()
time.sleep(1)
print "l/r motors speeds: ", motor1.lmotor_speed, motor1.rmotor_speed
'''
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_multi_view.py
import milk.supervised.multi_view
import numpy as np
import milk.supervised.svm
from milk.supervised.defaultclassifier import feature_selection_simple
def test_multi_view():
from milksets.wine import load
features, labels = load()
features0 = features[::10]
features1 = features[1::10]
features2 = features[2::10]
labels0 = labels[::10]
labels1 = labels[1::10]
labels2 = labels[2::10]
assert np.all(labels0 == labels1)
assert np.all(labels1 == labels2)
labels = labels0
train_features = zip(features0,features1,features2)
test_features = zip(features[3::10], features[4::10], features[5::10])
base = milk.supervised.classifier.ctransforms(
feature_selection_simple(),
milk.supervised.svm.svm_raw(C=128, kernel=milk.supervised.svm.rbf_kernel(4.)),
milk.supervised.svm.svm_sigmoidal_correction()
)
classifier = milk.supervised.multi_view.multi_view_classifier([base,base,base])
model = classifier.train(train_features, labels == 0)
assert ([model.apply(f) for f in test_features] == (labels == 0)).mean() > .9
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/unsupervised/pca.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from numpy import linalg
from . import normalise
from .pdist import pdist
__all__ = [
'pca',
'mds',
]
def pca(X, zscore=True):
'''
Y,V = pca(X, zscore=True)
Principal Component Analysis
Performs principal component analysis. Returns transformed
matrix and principal components
Parameters
----------
X : 2-dimensional ndarray
data matrix
zscore : boolean, optional
whether to normalise to zscores (default: True)
Returns
-------
Y : ndarray
Transformed matrix (of same dimension as X)
V : ndarray
principal components
'''
if zscore:
X = normalise.zscore(X)
C = np.cov(X.T)
w,v = linalg.eig(C)
Y = np.dot(v,X.T).T
return Y,v
def mds(features, ndims, zscore=False):
'''
X = mds(features, ndims, zscore=False)
Euclidean Multi-dimensional Scaling
Parameters
----------
features : ndarray
data matrix
ndims : int
Number of dimensions to return
zscore : boolean, optional
Whether to zscore the features (default: False)
Returns
-------
X : ndarray
array of size ``(m, ndims)`` where ``m = len(features)``
'''
if zscore:
features = normalise.zscore(features)
P2 = pdist(features)
n = len(P2)
J = np.eye(n) - (1./n)* np.ones((n,n))
B = -.5 * np.dot(J,np.dot(P2,J))
w,v = np.linalg.eig(B)
w = w[:ndims]
s = np.sign(w)
w = np.abs(w).real
w = np.diag(np.sqrt(s * w))
X = np.dot(v[:,:ndims], w)
return X.real
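# ---------------------------------------------------------------------------
# Minimal usage sketch (run as ``python -m milk.unsupervised.pca`` so the
# relative imports above resolve); the random data is purely illustrative.
if __name__ == '__main__':
    X = np.random.randn(50, 5)
    Y, V = pca(X)
    print('pca: transformed shape %s' % (Y.shape,))
    X2 = mds(X, 2)
    print('mds: embedded shape %s' % (X2.shape,))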
<file_sep>/powertrain/robomow_motor_class_arduino.py
#!/usr/bin/env python
import serial
import threading
import time

def log(*msgline):
    for msg in msgline:
        print msg,
    print

# need to find the best way to search serial ports to find the device
class robomow_motor(object):
def __init__(self):
self.isConnected = False
self._should_stop = threading.Event()
self._data = 0
self.port = None
self.motor1_speed = 0
self.motor2_speed = 0
self.com = self.open_serial_port()
def open_serial_port(self):
while self.isConnected == False:
print "class robomow_motor: searching serial ports for motor controller package..."
for i in range(11):
com = "/dev/ttyUSB"
com = com[0:11] + str(i)
print "class robomow_motor: searching on COM:", com
time.sleep(.1)
try:
ser = serial.Serial(com, 9600, timeout=1)
data = ser.readline()
#print "data=", int(data[3:(len(data)-1)])
if data[0:2] == "m1":
#ser.write("X") # write a string
print "class robomow_motor: found motor controller package on COM: ", com
self.isConnected = True
time.sleep(.35)
break
except:
pass
com = "/dev/ttyACM"
com = com[0:11] + str(i)
print "class robomow_motor: searching on COM:", com
time.sleep(.1)
try:
ser = serial.Serial(com, 9600, timeout=1)
data = ser.readline()
#print "data=", int(data[3:(len(data)-1)])
if data[0:2] == "m1":
#ser.write("X\n") # write a string
print "class robomow_motor: found robomow_motor package on COM: ", com
self.isConnected = True
time.sleep(.35)
break
except:
pass
if self.isConnected == False:
print "class robomow_motor: robomow_motor package not found!"
time.sleep(1)
#print "returning", ser
return ser
def __del__(self):
del(self)
def com_stats(self):
#print self.com
return (self.com.port,self.com.baudrate,self.com.bytesize,self.com.parity,self.com.stopbits)
def forward(self,speed):
##takes desired speed as percentage
self.com.flushInput()
command_str = ("FW"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
self.motor_stats()
def reverse(self,speed):
##takes desired speed as percentage
self.com.flushInput()
command_str = ("RV"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
self.motor_stats()
def stop(self):
##takes desired speed as percentage
command_str = ("ST")
#validate_command(self, command_str)
send_command(self, command_str)
self.motor_stats()
def left(self, speed):
##takes desired speed as percentage
command_str = ("LT"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
self.motor_stats()
def right(self, speed):
##takes desired speed as percentage
command_str = ("RT"+str(speed))
#validate_command(self, command_str)
send_command(self, command_str)
self.motor_stats()
def motor_stats(self):
##takes desired speed as percentage
cmd = ("SP")
successful = False
for n in range (5):
time.sleep(0.05)
self.com.flushOutput()
#print "sending to motor arduino:", cmd
self.com.write (cmd)
received = ""
self.com.flushInput()
received = received + self.com.readline()
received = received + self.com.readline()
received = received + self.com.readline()
#print "received back from arduino:", received
#print len(received)
received = received.replace('\r\n', '')
received = received.replace('m1:', '')
received = received.replace('m', '')
received = received.replace(':', '')
#print "stripped:", received, " cmd:", cmd
#cmd = cmd[0] + cmd[1]
if (len(received) > 0):
#print "successful: sending to motor arduino"
received = received.strip("SP")
#print "received now:", received
spd_list = received.split(',')
print "spd_list", spd_list
successful = True
self.motor1_speed = int(spd_list[0])
self.motor2_speed = int(spd_list[1])
break
if (successful == False):
print "NOT successful: sending SP stats to motor arduino"
def validate_command(self, cmd):
successful = False
for n in range (5):
time.sleep(0.05)
self.com.flushOutput()
#print "sending to motor arduino:", cmd
self.com.write (cmd)
received = ""
self.com.flushInput()
#time.sleep(0.05)
while (len(received) < 1):
received = self.com.readline()
print "received back from arduino:", received
#print len(received)
received = received.replace('\r\n', '')
received = received.replace('m1:', '')
received = received.replace('m', '')
received = received.replace(':', '')
print "stripped:", received, " cmd:", cmd
if (received == cmd):
print "successful: sending to motor arduino"
successful = True
break
if (successful == False):
print "NOT successful: sending to motor arduino"
def send_command(self, cmd):
#successful = False
#for n in range (5):
# time.sleep(0.05)
self.com.flushOutput()
#print "sending to motor arduino:", cmd
self.com.write (cmd)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/normalise.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from .base import supervised_model
from ..unsupervised.normalise import zscore
__all__ = [
'zscore',
'zscore_normalise',
'interval_normalise',
'chkfinite',
'sample_to_2min',
'normaliselabels'
]
class subtract_divide_model(supervised_model):
def __init__(self, shift, factor):
factor[factor == 0] = 1 # This makes the division a null op.
self.shift = shift
self.factor = factor
def apply_many(self, features):
return (features - self.shift)/self.factor
def apply(self, features):
return (features - self.shift)/self.factor
def __repr__(self):
return 'subtract_divide_model(%s, %s)' % (self.shift, self.factor)
class zscore_normalise(object):
'''
Normalise to z-scores
A preprocessor that normalises features to z scores.
'''
def train(self, features, labels, **kwargs):
shift = features.mean(0)
factor = np.std(features,0)
return subtract_divide_model(shift, factor)
class interval_normalise(object):
'''
Linearly scale to the interval [-1,1] (per libsvm recommendation)
'''
def train(self, features, labels, **kwargs):
ptp = features.ptp(0)
shift = features.min(0) + ptp/2.
factor = ptp/2.
return subtract_divide_model(shift, factor)
def __repr__(self):
return 'interval_normalise()'
def sample_to_2min(labels):
'''
selected = sample_to_2min(labels)
Select examples so that the ratio of size of the largest
class to the smallest class is at most two (i.e.,
min_label_count = min { (labels == L).sum() | for L in set(labels) }
for L' in set(labels):
assert (labels == L').sum() <= 2 * min_label_count
)
Parameters
----------
labels : sequence of labels
Returns
-------
selected : a Boolean numpy.ndarray
'''
from collections import defaultdict
counts = defaultdict(int)
for n in labels:
counts[n] += 1
labels = np.asanyarray(labels)
max_entries = np.min(counts.values())*2
selected = np.zeros(len(labels), bool)
for c in counts.iterkeys():
p, = np.where(labels == c)
p = p[:max_entries]
selected[p] = 1
return selected
class chkfinite(supervised_model):
'''
Fill NaN & Inf values
Replaces NaN & Inf values with zeros.
'''
def __init__(self):
pass
def train(self, features, labels, **kwargs):
return self
def apply(self, features):
nans = np.isnan(features) + np.isinf(features)
if nans.any():
features = features.copy()
features[nans] = 0
return features
def __repr__(self):
return 'chkfinite()'
def normaliselabels(labels, multi_label=False):
'''
normalised, names = normaliselabels(labels, multi_label=False)
If not ``multi_label`` (the default), normalises the labels to be integers
from 0 through N-1. Otherwise, assume that each label is actually a
sequence of labels.
``normalised`` is a np.array, while ``names`` is a list mapping the indices to
the old names.
Parameters
----------
labels : any iterable of labels
multi_label : bool, optional
Whether labels are actually composed of multiple labels
Returns
------
normalised : a numpy ndarray
If not ``multi_label``, this is an array of integers 0 .. N-1;
otherwise, it is a boolean array of size len(labels) x N
names : list of label names
'''
if multi_label:
names = set()
for ell in labels: names.update(ell)
names = list(sorted(names))
normalised = np.zeros( (len(labels), len(names)), bool)
for i,ls in enumerate(labels):
for ell in map(names.index, ls):
normalised[i,ell] = True
return normalised, names
else:
names = sorted(set(labels))
normalised = map(names.index, labels)
return np.array(normalised), names
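# ---------------------------------------------------------------------------
# Minimal usage sketch (run as ``python -m milk.supervised.normalise`` so the
# relative imports above resolve); the toy features/labels are illustrative only.
if __name__ == '__main__':
    toy_features = np.array([[1., 10.], [2., 20.], [3., 30.]])
    toy_labels = ['a', 'b', 'a']
    model = zscore_normalise().train(toy_features, toy_labels)
    print('z-scored features:\n%s' % model.apply_many(toy_features))
    normalised, names = normaliselabels(toy_labels)
    print('normalised labels: %s  names: %s' % (normalised, names))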
<file_sep>/ftdi/pyftdi.py
# ----------
# import the FTDI D2XX module
import d2xx
# list devices by description, returns tuple of attached devices description strings
d = d2xx.listDevices(d2xx.OPEN_BY_DESCRIPTION)
print d
# list devices by serial, returns tuple of attached devices serial strings
d = d2xx.listDevices() # implicit d2xx.OPEN_BY_SERIAL_NUMBER
print d
h = d2xx.open(0)
print h
# read eeprom
print h.eeRead()
# get queue status
print h.getQueueStatus()
# set RX/TX timeouts
h.setTimeouts(1000,1000)
# write bytes (serial mode)
print h.write('Hello world!\r\n')
# read bytes (serial mode)
print h.read(5)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/__init__.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
'''
milk.supervised
This hold the supervised classification modules:
Submodules
----------
- defaultclassifier: contains a default "good enough" classifier
- svm: related to SVMs
- grouped: contains objects to transform single object classifiers into group classifiers
by voting
- multi: transforms binary classifiers into multi-class classifiers (1-vs-1 or 1-vs-rest)
- featureselection: feature selection
- knn: k-nearest neighbours
- tree: decision tree classifiers
Classifiers
-----------
All classifiers have a `train` function which takes 2 arguments:
- features : sequence of features
- labels : sequence of labels
They return a `model` object, which has an `apply` function which takes a
single input and returns its label.
Note that there are always two objects: the learner and the model, and they are
independent. Every time you call learner.train() you get a new model.
Both classifiers and models are pickle()able.
Example
-------
::
features = np.random.randn(100,20)
features[:50] *= 2
labels = np.repeat((0,1), 50)
classifier = milk.defaultclassifier()
model = classifier.train(features, labels)
new_label = model.apply(np.random.randn(100))
new_label2 = model.apply(np.random.randn(100)*2)
'''
from .defaultclassifier import defaultclassifier, svm_simple
from .classifier import normaliselabels
from .gridsearch import gridsearch
from .tree import tree_learner
from .lasso import lasso, lasso_learner, lasso_model_walk, lasso_walk
__all__ = [
'normaliselabels',
'defaultclassifier',
'svm_simple',
'gridsearch',
'lasso',
'lasso_learner',
'lasso_model_walk',
'lasso_walk',
'tree_learner',
]
<file_sep>/pubsub/recieve.py
#!/usr/bin/env python
import sys
sys.path.append( "../lib/" )
import pika
from pylab import imread
import Image
import time
import cv, cv2
import cPickle as pickle
from img_processing_tools import *
import numpy
count = 0
def callback(ch, method, properties, body):
global count
count = count +1
frame = pickle.loads(body)
print count
cv2.imshow('Video', frame)
cv.WaitKey(10)
connection = pika.BlockingConnection(pika.ConnectionParameters(host='192.168.1.39'))
channel = connection.channel()
channel.queue_declare(queue='mobot_video1', auto_delete=True, arguments={'x-message-ttl':1000})
print ' [*] Waiting for messages. To exit press CTRL+C'
cv2.namedWindow('Video', cv.CV_WINDOW_AUTOSIZE)
channel.basic_consume(callback, queue='mobot_video1', no_ack=True)
channel.start_consuming()
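# ---------------------------------------------------------------------------
# Hypothetical publisher counterpart (a sketch only; the actual sender script is
# not part of this file). It would pickle each OpenCV frame and push it onto the
# same 'mobot_video1' queue, e.g.:
#
#   frame = cv2.imread('frame.jpg')   # or a frame grabbed from a camera
#   channel.basic_publish(exchange='',
#                         routing_key='mobot_video1',
#                         body=pickle.dumps(frame, protocol=2))
#
# (pickle round-trips the numpy frame; the protocol choice is an assumption,
# any binary pickle protocol works with the loads() call in callback above.)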
<file_sep>/gui/tkinter/basestation_gui.py
from Tkinter import *
import tkMessageBox
from PIL import Image, ImageTk
import time
from PIL import ImageFile
from datetime import datetime
#from ThreadedBeatServer import *
import socket
import sonar_functions as sf
from FileReceiver import *
from threading import *
import matplotlib.pyplot as P
import matplotlib.cm as cm
from matplotlib.pyplot import figure, show, rc
import pylab as pl
import numpy as np
sock = None
ROBOT_IP = None
PORT = 50005
def TextOut(text):
if (paused.get() != 'UN-PAUSE'):
Textbox1.insert(END, str(datetime.now()) + ':' + text +'\n') #print new line in textbox
Textbox1.yview(END) #autoscroll
def Search_For_Robot():
global ROBOT_IP
#first try by dns lookup
#robot_name = 'mobot-2012'
try:
answer = None
print "Searching for Robot using DNS..."
TextOut("Searching for Robot using DNS...")
time.sleep(.1)
answer = socket.gethostbyname('mobot-2012.local')
print "Robot found on IP: ", answer
TextOut("Robot found on IP: " + answer)
ROBOT_IP = answer
return answer
except:
if answer == None:
print "failed DNS Search...trying PING method...", answer
for ip in xrange(30, 150, 1):
ip_to_ping = "192.168.1."+str(ip)
print "Searching for Robot on IP:", ip_to_ping,
TextOut("Searching for Robot on IP:" + ip_to_ping)
try:
answer = socket.gethostbyaddr(ip_to_ping)
print ": Found Device: ",answer[0]
temp = answer[0][:10]
#print "temp=", temp
#Textout(": Found Device: " + temp)
if temp == 'mobot-2012':
print "Robot found on IP: ", ip_to_ping
TextOut("Robot found on IP: " + ip_to_ping)
ROBOT_IP = ip_to_ping
return ip_to_ping
break
except:
print ": No response....."
TextOut(": No response.....")
return False
else:
return answer
def com_loop(address, port):
global sock
#print "sock:", sock
if sock == None:
try:
print "Attempting to connect: %s port %s" % (address, port)
TextOut("Attempting to connect: %s port %s" % (address, port))
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(.1)
sock.connect((address, port))
print "Connected to %s on port %s" % (address, port)
TextOut("Connected to %s on port %s" % (address, port))
return True
except socket.error, e:
print "Connection to: %s port %s failed: %s" % (address, port, e)
TextOut("Connection to: %s port %s failed: %s" % (address, port, e))
return False
else:
try:
sock.send("PING\n")
#print "communication to robot: ACTIVE"
#TextOut("communication to robot: ACTIVE")
#data = sock.recv(1)
#print "recieved from robot: ", data
time.sleep(.1)
return True
except socket.error, e:
print "communication to robot: NOT ACTIVE"
TextOut("communication to robot: NOT ACTIVE")
sock = None
time.sleep(.5)
return False
def toggle_button_pause():
'''
use
t_btn.config('text')[-1]
to get the present state of the toggle button
'''
if button_pause.config('text')[-1] == 'UN-PAUSE':
#button_pause.config(text='PAUSE')
paused.set('PAUSE')
else:
#button_pause.config(text='UN-PAUSE')
paused.set('UN-PAUSE')
def send_command_to_robot(command, ip, port):
global sock
#IP="192.168.1.87"
#PORT=50005
#print "target IP:", IP
#print "target port:", PORT
#print "Command:", command
if sock != None:
try:
sock.send(command)
print "Command Sent to Robot:", command
TextOut("Command Sent to Robot:" + command)
reply = sock.recv(1024)
TextOut("Echoed from Robot:" + reply)
time.sleep(.1)
return True
except socket.error, e:
print "Send Command Failed"
TextOut("Send Command Failed")
print "com to robot: NOT ACTIVE"
TextOut("communication to robot: NOT ACTIVE")
sock = None
time.sleep(.5)
return False
else:
print "Send Command Failed"
TextOut("Send Command Failed")
print "com to robot: NOT ACTIVE"
TextOut("communication to robot: NOT ACTIVE")
sock = None
time.sleep(.5)
return False
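# Command protocol summary (inferred from the handlers above and the button
# bindings further down; treat as documentation, not a spec):
#   "PING\n"              - keep-alive sent by com_loop() on each update cycle
#   'f' / 'b' / 'l' / 'r' - drive forward / reverse / left / right
# The robot is expected to echo each command back, which is what the
# sock.recv(1024) in send_command_to_robot() reads.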
def hide_buttons():
Button_Enable_Motors.pack_forget()
MF.pack_forget()
MB.pack_forget()
ML.pack_forget()
MR.pack_forget()
Button_Update_Image.pack_forget()
camera_1.pack_forget()
def enable_drive_motors():
print Button_Enable_Motors.configure('text')[-1][0]
if Button_Enable_Motors.configure('text')[-1][0] == 'Disable':
#button_pause.config(text='PAUSE')
#paused.set('PAUSE')
Button_Enable_Motors.configure(text='Enable Drive Motors', bg = "red")
MF.configure(state=DISABLED, background='red')
MB.configure(state=DISABLED, background='red')
ML.configure(state=DISABLED, background='red')
MR.configure(state=DISABLED, background='red')
else:
#button_pause.config(text='UN-PAUSE')
#paused.set('UN-PAUSE')
Button_Enable_Motors.configure(text='Disable Drive Motors', bg = "green")
MF.configure(state=NORMAL, background='green')
MB.configure(state=NORMAL, background='green')
ML.configure(state=NORMAL, background='green')
MR.configure(state=NORMAL, background='green')
#send_command_to_robot("edm")
def show_buttons():
Button_Enable_Motors.pack()
Button_Update_Image.pack()
camera_1.pack()
def update_sonar(ip, port):
s = FileReceiver(9092, 9093)
#line below stops thread when main program stops
s.daemon = True
s.start()
sonarfeed = sonar_feed()
sonarfeed.daemon=True
sonarfeed.start()
def update_images(ip, port):
s = FileReceiver(9090, 9091)
print s.cmd_port,s.data_port
#line below stops thread when main program stops
s.daemon = True
s.start()
videofeed = video_feed()
videofeed.daemon=True
videofeed.start()
###### 11/23/2012 - need to try pickle instead of saving files...
class video_feed(Thread):
def run(self):
while True:
try:
image = Image.open("snap_shot.jpg")
image.thumbnail((320,240))
photo1 = ImageTk.PhotoImage(image)
camera_1.config(image=photo1)
time.sleep(.1)
except:
pass
class sonar_feed(Thread):
def run(self):
while True:
try:
f = open("sonar_data.txt", "r")
sonar_data = f.readline()
f.close()
processed_sonar_data = []
print "calling sonar_display"
#sonar_image = sf.sonar_display(sonar_data)
processed_sonar_data = sf.process_sonar_data(sonar_data)
print "processed_sonar_data ", processed_sonar_data
sf.sonar_graph(processed_sonar_data)
#print "saving sonar image"
#sonar_image.save("sonar_image.jpg")
#print "returning"
                #refresh with new sonar image
image = Image.open("sonar_image.png")
#image.thumbnail((320,240))
sonar_img = ImageTk.PhotoImage(image)
sonar_display.config(image=sonar_img)
time.sleep(.1)
#gc.collect()
except:
pass
def update_display():
global ROBOT_IP
#global heartbeat_enabled
#IP = "192.168.1.44"
PORT = 50005
#print "update called"
#print "paused:", paused.get()
if (testmode == False):
if ROBOT_IP != None:
if com_loop(ROBOT_IP,PORT) == True:
#if 1 == 1:
com_status.set('COM ACTIVE ON IP: ' + ROBOT_IP)
Button_Com_Status.configure(bg = "green")
#update_sonar()
#show_buttons()
#update_images()
else:
com_status.set('COM NOT ACTIVE' )
Button_Com_Status.configure(bg = "red")
#hide_buttons()
else:
Search_For_Robot()
#main_gui.update()
main_gui.after(100, update_display)
if __name__== "__main__":
testmode = False
if len(sys.argv) > 1:
if sys.argv[1] == 'testmode':
print 'starting in testing mode'
testmode= True
main_gui = Tk()
main_gui.geometry("1150x840")
com_status = StringVar()
com_status.set('COM INACTIVE')
frame1=Frame(main_gui, bd=1, relief=SUNKEN)
frame2=Frame(main_gui, bd=1, relief=SUNKEN)
frame3=Frame(main_gui, bd=1, relief=SUNKEN)
frame4=Frame(main_gui, bd=1, relief=SUNKEN)
Button_Com_Status = Button(frame1, textvariable=com_status);
Button_Com_Status.pack();
button_search_for_bot = Button(frame1, text="Find Bot", command=lambda: Search_For_Robot());
#button_search_for_bot.grid(row=0, column=1, sticky=W)
button_search_for_bot.pack(side=LEFT)
Button_Enable_Motors = Button(frame1, text="Enable Drive Motors", command=enable_drive_motors)
Button_Enable_Motors.pack(side=LEFT)
MF = Button(frame1, text="Forward", command=lambda: send_command_to_robot('f', ROBOT_IP, PORT));
#MF.grid(row=0, column=1, sticky=W)
MB = Button(frame1, text="Reverse", command=lambda: send_command_to_robot('b', ROBOT_IP, PORT))
#MB.grid(row=0, column=2, sticky=W)
ML = Button(frame1, text="Left", command=lambda: send_command_to_robot('l', ROBOT_IP, PORT))
#ML.grid(row=1, column=1, sticky=W)
MR = Button(frame1, text="Right", command=lambda: send_command_to_robot('r', ROBOT_IP, PORT))
#MR.grid(row=1, column=2, sticky=W)
MF.pack(side=LEFT)
MB.pack(side=LEFT)
ML.pack(side=LEFT)
MR.pack(side=LEFT)
frame1.pack( padx=5, pady=5)
Button_Enable_Motors.configure(background='green')
MF.configure(state=DISABLED, background='red')
MB.configure(state=DISABLED, background='red')
ML.configure(state=DISABLED, background='red')
MR.configure(state=DISABLED, background='red')
Button_Update_Image = Button(frame2, text="Grab Images", command=lambda: update_images(ROBOT_IP, 12345));
Button_Update_Image.pack(side=LEFT);
button_toggle_sonar = Button(frame2, text="Update Sonar", command=lambda: update_sonar(ROBOT_IP, 12345));
button_toggle_sonar.pack()
frame2.pack()
image = Image.open("temp.jpg")
sonar = Image.open('sonar_image.png')
#sonar.thumbnail((440,440))
photo1 = ImageTk.PhotoImage(image)
camera_1 = Label(frame3, image=photo1, bd=1, relief=SUNKEN)
camera_1.pack(side=LEFT, padx=5, pady=5)
sonar_img = ImageTk.PhotoImage(sonar)
sonar_display = Label(frame3, image=sonar_img, bd=1, relief=SUNKEN)
sonar_display.pack(side=LEFT, padx=5, pady=5)
frame3.pack()
s = Scrollbar(frame4)
Textbox1 = Text(frame4)
Textbox1.focus_set()
s.pack(side=RIGHT, fill=Y)
Textbox1.pack(side=RIGHT)#, fill=Tkinter.Y)
s.config(command=Textbox1.yview)
Textbox1.config(yscrollcommand=s.set, height=15, width=90)
frame4.pack()
#
paused = StringVar()
paused.set('PAUSE')
button_pause = Button(main_gui, textvariable=paused, command=toggle_button_pause);button_pause.pack();
#Button_show_Image = Button(main_gui, text="Load Image", command=random_image).pack();
var_IP_of_bot = StringVar(None)
IP_of_bot = Entry(main_gui, textvariable=var_IP_of_bot)
IP_of_bot.pack()
#frame1.pack_forget()
update_display()
main_gui.mainloop()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/wrapper/wraplibsvm.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2010, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
from milk.supervised.classifier import normaliselabels
try:
from libsvm import svm as libsvm
except ImportError:
try:
import svm as libsvm
except ImportError:
libsvm = None
from tempfile import NamedTemporaryFile
class libsvmModel(object):
def __init__(self, model, names, output_probability):
self.model = model
self.names = names
self.output_probability = output_probability
def apply(self,feats):
if self.output_probability:
return self.model.predict_probability(feats)
res = self.model.predict(feats)
return self.names[int(res)]
def __getstate__(self):
# This is really really really hacky, but it works
N = NamedTemporaryFile()
self.model.save(N.name)
S = N.read()
return S,self.output_probability,self.names
def __setstate__(self,state):
if libsvm is None:
raise RuntimeError('LibSVM Library not found. Cannot use this classifier.')
S,self.output_probability,self.names = state
N = NamedTemporaryFile()
N.write(S)
N.flush()
self.model = libsvm.svm_model(N.name)
class libsvmClassifier(object):
def __init__(self,probability = False, auto_weighting = True):
if libsvm is None:
raise RuntimeError('LibSVM Library not found. Cannot use this classifier.')
self.param = libsvm.svm_parameter(kernel_type = libsvm.RBF, probability = probability)
self.output_probability = probability
self.auto_weighting = auto_weighting
def set_option(self,optname,value):
setattr(self.param, optname, value)
def train(self, features, labels):
labels,names = normaliselabels(labels)
if self.auto_weighting:
nlabels = labels.max() + 1
self.param.nr_weight = int(nlabels)
self.param.weight_label = range(nlabels)
self.param.weight = [(labels != i).mean() for i in xrange(nlabels)]
problem = libsvm.svm_problem(labels.astype(float), features)
model = libsvm.svm_model(problem, self.param)
return libsvmModel(model, names, self.output_probability)
<file_sep>/navigation/terrain_id/terrain_id_from_video.py
#!/usr/bin/env python
import os
from img_processing_tools import *
import Image, ImageDraw
import time
import sys
import cv, cv2
from PIL import ImageStat
import numpy as np
def grab_frame_from_video(video):
frame = video.read()
#counter = 0
#while video.grab():
# counter += 1
# flag, frame = video.retrieve()
# if flag:
#gray_frm = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
#cv2.imwrite('frame_'+str(counter)+'.png',gray_frm)
#cv.ShowImage("Video",frame)
#cv2.imshow("Video",frame)
#cv.WaitKey(1000/fps)
return frame
def classifiy_section(img):
#class_ID: 1=grass, 2=non-mowable, 3=unknown
class_ID = 2
#print img[0], img[1], img[2]
#print "class_ID", class_ID
#if (img[0] > 65 and img[0] < 115) and (img[1] > 70 and img[1] < 135) and (img[2] < 85): class_ID = 1
#if img[0] > 82 and img[1] > 85 and img[2] > 72: class_ID = 2
rb_sum = img[0] + img[2]
g_sum = img[1]
if img[1] > 26: class_ID=1
#if img[1] < 15 and img[2] > 5 : class_ID=2
#print "rgb: ", img, "class_ID: ", class_ID
return class_ID
def rgb2I3 (img):
"""Convert RGB color space to I3 color space
@param r: Red
@param g: Green
@param b: Blue
return (I3) integer
"""
img_width, img_height = img.size
#make a copy to return
returnimage = Image.new("RGB", (img_width, img_height))
imagearray = img.load()
for y in range(0, img_height, 1):
for x in range(0, img_width, 1):
rgb = imagearray[x, y]
i3 = ((2*rgb[1])-rgb[0]-rgb[2]) / 2
#print rgb, i3
returnimage.putpixel((x,y), (0,i3,0))
return returnimage
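# Worked example for rgb2I3 (illustration only): a pixel with
# (R, G, B) = (100, 200, 50) maps to i3 = (2*200 - 100 - 50) / 2 = 125, so the
# output pixel becomes (0, 125, 0). Greenish (grass-like) pixels get large
# positive I3 values, which is what classifiy_section() above thresholds on
# (img[1] > 26).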
def subsection_image(pil_img, sections, visual):
sections = sections /4
#print "sections= ", sections
fingerprint = []
    # Accepts an image and the number of sections to divide it into (the
    # resolution of the fingerprint); returns a flat list of per-section
    # terrain class IDs.
img_width, img_height = pil_img.size
#print "image size = img_wdith= ", img_width, " img_height=", img_height, " sections=", sections
if visual == True:
cv_original_img1 = PILtoCV(pil_img)
cv.ShowImage("Original",cv_original_img1 )
cv.MoveWindow("Original", ((img_width)+100),50)
pil_img = rgb2I3 (pil_img)
#cv.WaitKey()
temp_img = pil_img.copy()
xsegs = img_width / sections
ysegs = img_height / sections
#print "xsegs, ysegs = ", xsegs, ysegs
for yy in range(0, img_height-ysegs+1 , ysegs):
for xx in range(0, img_width-xsegs+1, xsegs):
#print "Processing section =", xx, yy, xx+xsegs, yy+ysegs
box = (xx, yy, xx+xsegs, yy+ysegs)
#print "box = ", box
cropped_img1 = pil_img.crop(box)
I3_mean = ImageStat.Stat(cropped_img1).mean
I3_mean_rgb = (int(I3_mean[0]), int(I3_mean[1]), int(I3_mean[2]))
#print "I3_mean: ", I3_mean
sub_ID = classifiy_section(I3_mean_rgb)
fingerprint.append(sub_ID)
if visual == True:
cv_cropped_img1 = PILtoCV(cropped_img1)
cv.ShowImage("Fingerprint",cv_cropped_img1 )
cv.MoveWindow("Fingerprint", (img_width+100),50)
if sub_ID == 1: I3_mean_rgb = (50,150,50)
if sub_ID == 2: I3_mean_rgb = (150,150,150)
if sub_ID == 3: I3_mean_rgb = (0,0,200)
ImageDraw.Draw(pil_img).rectangle(box, (I3_mean_rgb))
cv_img = PILtoCV(pil_img)
cv.ShowImage("Image",cv_img)
cv.MoveWindow("Image", 50,50)
cv.WaitKey(10)
#print xx*yy
#time.sleep(.05)
#print "FINGERPRINT: ", fingerprint
#cv.WaitKey()
return fingerprint
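# Worked example for subsection_image (illustration only): with a 320x240
# frame and sections=32, the effective grid is 32/4 = 8 blocks per axis, so
# xsegs = 40 and ysegs = 30, and the returned fingerprint is a flat list of
# 8*8 = 64 class IDs (1=grass, 2=non-mowable, 3=unknown), row by row.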
###########################################################
def direction_I3_sum(pil_img):
pil_img = rgb2I3 (pil_img)
center_size = .20
#returns which direction based on sum of I3 total per side (L or R)
img_width, img_height = pil_img.size
left_box = (0, 0, img_width/2, img_height )
right_box = (img_width/2, 0, img_width, img_height )
center_box = (int( (img_width/2) - (img_width*(center_size/2))), 0, int( (img_width/2) + (img_width*(center_size/2))), img_height )
#print 'center_box' , center_box
#print "box1, box2, center_box ", box1, box2, center_box
cropped_left_img = pil_img.crop(left_box)
cropped_right_img = pil_img.crop(right_box)
cropped_center_img = pil_img.crop(center_box)
c_img_width, c_img_height = cropped_center_img.size
center_left_box = (0, 0, c_img_width/2, c_img_height )
center_right_box = (c_img_width/2, 0, c_img_width, c_img_height )
l_center_img = cropped_center_img.crop(center_left_box)
r_center_img = cropped_center_img.crop(center_right_box)
#print l_center_img, r_center_img , cropped_center_img
cv_img = PILtoCV(cropped_left_img)
cv.ShowImage("Left",cv_img)
cv.MoveWindow("Left", 2*(img_width)+60,0)
#cv.WaitKey(25)
cv_img = PILtoCV(cropped_right_img)
cv.ShowImage("Right",cv_img)
cv.MoveWindow("Right", 2*(img_width)+60+img_width/2,0)
#cv.WaitKey(25)
cv_img = PILtoCV(cropped_center_img)
cv.ShowImage("Center",cv_img)
cv.MoveWindow("Center", 2*(img_width)+60+img_width,0)
#cv.WaitKey()
cv_img = PILtoCV(l_center_img)
cv.ShowImage("LCenter",cv_img)
cv.MoveWindow("LCenter",2*(img_width)+60+img_width+c_img_width,0)
cv_img = PILtoCV(r_center_img)
cv.ShowImage("RCenter",cv_img)
cv.MoveWindow("RCenter",2*(img_width)+60+img_width+(c_img_width)+c_img_width/2,0)
I3_sum = ImageStat.Stat(pil_img).sum[1]
I3_sum_left = ImageStat.Stat(cropped_left_img ).sum[1]
I3_sum_right =ImageStat.Stat(cropped_right_img ).sum[1]
I3_center_sum = ImageStat.Stat(cropped_center_img).sum[1]
I3_sum_cleft_g = ImageStat.Stat(l_center_img).sum[1]
I3_sum_cright_g = ImageStat.Stat(r_center_img).sum[1]
    print 'I3_sum_cleft_g, I3_sum_cright_g, I3_center_sum: ', I3_sum_cleft_g, I3_sum_cright_g, I3_center_sum
#percent = int(I3_sum_left_g/(I3_sum_left_g+I3_sum_right_g ) * 100)
l_percent = I3_sum_cleft_g / I3_center_sum
r_percent = I3_sum_cright_g / I3_center_sum
print 'l_percent, r_percent, I3_center_sum: ', l_percent, r_percent, I3_center_sum
#cv.WaitKey()
thres1=.55
if I3_center_sum > 50000 and ( l_percent > thres1 or r_percent > thres1):
print "................ ON EDGE ..............."
cv.WaitKey(250)
thres1 = .50
thres2 = .90
thres3 = .96
print 'I3_sum_cleft_g , (I3_center_sum * thres1):',I3_sum_cleft_g , (I3_center_sum * thres1)
if I3_sum_cleft_g > (I3_center_sum * thres1) and I3_sum_cleft_g < (I3_center_sum * thres2):
print '.............turn right -> 50%-90%.............'
cv.WaitKey()
#side = "L"
#if within 90-95 %continue forward
if I3_sum_cleft_g > (I3_center_sum * thres2) and I3_sum_cleft_g < (I3_center_sum * thres3):
print '.............continue foward -> within 90-95 %.............'
cv.WaitKey()
# if over 96% turn towards grass
if I3_sum_cleft_g > (I3_center_sum * thres3):
print '.............turn left -> over 96%.............'
cv.WaitKey()
# if over 96% turn towards grass
if I3_sum_cright_g > (I3_center_sum * thres3):
print '.............turn right -> over 96%.............'
cv.WaitKey()
#if I3_sum_cright_g > (I3_center_sum * thres2):
# print '.............turn left.............edge on left'
# cv.WaitKey()
#side = "R"
else:
print "................ NOT ON EDGE so ????????????????"
'''
#is there enough data
if I3_center_sum > 100000:
print "continue foward"
else:
if I3_sum_cleft_g > I3_sum_cright_g:
print "turn left"
cv.WaitKey(1000)
if I3_sum_cleft_g < I3_sum_cright_g:
print "turn right"
cv.WaitKey(1000)
'''
#i think this!!
thres2 = .55
print I3_sum_cleft_g , (I3_center_sum * thres2)/2
if l_percent < thres2 and r_percent < thres2:
print "too much grass...maybe in a field away from edge"
if I3_sum_left < I3_sum_right:
print "turn left toward edge"
else:
print "turn right toward edge"
cv.WaitKey(10)
thres3 = 150000
print I3_sum_cleft_g
if I3_center_sum < thres3:
print "too much non-grass...maybe on non-mowing suface"
if I3_sum_left > I3_sum_right:
print "turn left toward grass"
else:
print "turn right toward grass"
cv.WaitKey(10)
print
cv.WaitKey(5)
if __name__=="__main__":
video = cv2.VideoCapture(sys.argv[1])
while 1:
# if len(sys.argv) < 4:
# print "******* Requires 3 image files of the same size."
# print "This program will return the angle at which the second is in relation to the first. ***"
# sys.exit(-1)
try:
#img1 = cv.LoadImage(sys.argv[1])
#frame = grab_frame(1)
img1 = np.array(grab_frame_from_video(video)[1])
cv.NamedWindow("Video",cv.CV_WINDOW_AUTOSIZE)
cv2.imshow("Video",img1)
except:
print "******* Could not open image/video file *******"
sys.exit(-1)
#print len (sys.argv)
if len(sys.argv) == 2:
resolution = 32
else:
resolution = int(sys.argv[2])
img1 = array2image(img1)
img1 = img1.resize((320,240))
direction_I3_sum(img1)
"""
image_fingerprint = np.array(subsection_image(img1, resolution, False))
#print "FINGERPRINT: ",image_fingerprint
#print 'len(image_fingerprint):', len(image_fingerprint)
#image_fingerprint.reshape(((image_fingerprint/2), 2))
#print image_fingerprint
step = len(image_fingerprint)/ (resolution/4)
#print "step =", step
a = []
b = []
for x in range (0, len(image_fingerprint), step):
#print x
for y in range(0, step/2):
#print x,y
a.append(image_fingerprint[(x+y)])
b.append(image_fingerprint[(x+(step/2)+y)])
#print a
#print b
direction = sum(a)-sum(b)
total_sum = sum(a)+sum(b)
limit = int(2*(len(image_fingerprint) * .95))
print "leftside-rightside:", direction , ' total sum:', total_sum, ' limit: ', limit
#if direction > -5 :
# print "turn right"
# cv.WaitKey(600)
#if direction < -55:
# print "turn left"
# cv.WaitKey(500)
if direction < -55:
# print "turn left"
# cv.WaitKey(500)
if total_sum > limit:
print "not enough data to continue mowing"
cv.WaitKey()
cv.WaitKey(20)
"""
<file_sep>/gui/foo/foo/PreferencesFooDialog.py
# -*- coding: utf-8 -*-
### BEGIN LICENSE
# This file is in the public domain
### END LICENSE
from desktopcouch.records.server import CouchDatabase
from desktopcouch.records.record import Record
import gtk
from foo.helpers import get_builder
import gettext
from gettext import gettext as _
gettext.textdomain('foo')
class PreferencesFooDialog(gtk.Dialog):
__gtype_name__ = "PreferencesFooDialog"
preferences = {}
def __new__(cls):
"""Special static method that's automatically called by Python when
constructing a new instance of this class.
Returns a fully instantiated PreferencesFooDialog object.
"""
builder = get_builder('PreferencesFooDialog')
new_object = builder.get_object("preferences_foo_dialog")
new_object.finish_initializing(builder)
return new_object
def finish_initializing(self, builder):
"""Called while initializing this instance in __new__
        finish_initializing should be called after parsing the UI definition
        and creating a PreferencesFooDialog object with it in order to
        finish initializing the new PreferencesFooDialog
        instance.
Put your initialization code in here and leave __init__ undefined.
"""
# Get a reference to the builder and set up the signals.
self.builder = builder
self.builder.connect_signals(self)
# Set up couchdb and the preference info.
self._db_name = "foo"
self._database = CouchDatabase(self._db_name, create=True)
self._preferences = None
self._key = None
        # Set the record type and then initialize the preferences.
self._record_type = (
"http://wiki.ubuntu.com/Quickly/RecordTypes/Foo/"
"Preferences")
self._preferences = self.get_preferences()
# TODO: code for other initialization actions should be added here
def get_preferences(self):
"""Return a dict of preferences for foo.
Creates a couchdb record if necessary.
"""
if self._preferences == None:
# The dialog is initializing.
self._load_preferences()
        # If there were no saved preferences, _load_preferences created defaults.
return self._preferences
def _load_preferences(self):
        # TODO: add default preferences to the self._preferences dict;
        # they will be overwritten if saved preferences exist.
self._preferences = {"record_type": self._record_type}
results = self._database.get_records(
record_type=self._record_type, create_view=True)
if len(results.rows) == 0:
# No preferences have ever been saved, save them before returning.
self._key = self._database.put_record(Record(self._preferences))
else:
self._preferences = results.rows[0].value
del self._preferences['_rev']
self._key = results.rows[0].value["_id"]
def _save_preferences(self):
self._database.update_fields(self._key, self._preferences)
def ok(self, widget, data=None):
"""The user has elected to save the changes.
        Called before the dialog returns gtk.RESPONSE_OK from run().
"""
# Make any updates to self._preferences here. e.g.
#self._preferences["preference1"] = "value2"
self._save_preferences()
def cancel(self, widget, data=None):
"""The user has elected cancel changes.
Called before the dialog returns gtk.RESPONSE_CANCEL for run()
"""
# Restore any changes to self._preferences here.
pass
if __name__ == "__main__":
dialog = PreferencesFooDialog()
dialog.show()
gtk.main()
<file_sep>/PhidgetsPython/Python/Manager-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import AttachEventArgs, DetachEventArgs, ErrorEventArgs
from Phidgets.Phidget import Phidget
from Phidgets.Manager import Manager
#Event Handler Callback Functions
def ManagerDeviceAttached(e):
attached = e.device
print("Manager - Device %i: %s Attached!" % (attached.getSerialNum(), attached.getDeviceName()))
def ManagerDeviceDetached(e):
detached = e.device
print("Manager - Device %i: %s Detached!" % (detached.getSerialNum(), detached.getDeviceName()))
def ManagerError(e):
print("Manager Phidget Error %i: %s" % (e.eCode, e.description))
#Create an interfacekit object
try:
mngr = Manager()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Main Program Code
try:
mngr.setOnAttachHandler(ManagerDeviceAttached)
mngr.setOnDetachHandler(ManagerDeviceDetached)
mngr.setOnErrorHandler(ManagerError)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget manager....")
try:
mngr.openManager()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
attachedDevices = mngr.getAttachedDevices()
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
for attachedDevice in attachedDevices:
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (attachedDevice.isAttached(), attachedDevice.getDeviceName(), attachedDevice.getSerialNum(), attachedDevice.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
mngr.closeManager()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/multi_view.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
import numpy as np
__all__ = [
'multi_view_learner',
]
class multi_view_model(object):
def __init__(self, models):
self.models = models
def apply(self, features):
if len(features) != len(self.models):
            raise ValueError('milk.supervised.multi_view: Nr of features does not match training data (got %s, expected %s)' % (len(features), len(self.models)))
Ps = np.array([model.apply(f) for model,f in zip(self.models, features)])
if np.any(Ps <= 0.): return False
if np.any(Ps >= 1.): return True
# This is binary only:
# if \prod Pi > \prod (1-Pi) return 1
# is equivalent to
# if \prod Pi/(1-Pi) > 1. return 1
# if \sum \log( Pi/(1-Pi) ) > 0. return 1
return np.sum( np.log(Ps/(1-Ps)) ) > 0
class multi_view_learner(object):
'''
Multi View Learner
This learner learns different classifiers on multiple sets of features and
combines them for classification.
'''
def __init__(self, bases):
self.bases = bases
def train(self, features, labels, normalisedlabels=False):
features = zip(*features)
if len(features) != len(self.bases):
raise ValueError('milk.supervised.multi_view_learner: ' +
                'Nr of features does not match classifier construction (got %s, expected %s)'
% (len(features) ,len(self.bases)))
models = []
for basis,f in zip(self.bases, features):
try:
f = np.array(f)
except:
f = np.array(f, dtype=object)
models.append(basis.train(f, labels))
return multi_view_model(models)
multi_view_classifier = multi_view_learner
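
# Hedged usage sketch (not part of the original module). The tiny threshold
# learner below is a made-up stand-in for a real probabilistic base learner;
# multi_view_model.apply() expects each per-view model to return a value in
# (0, 1) so that the log-odds vote above is meaningful. Uses the numpy import
# at the top of this module.
if __name__ == '__main__':
    class _toy_prob_learner(object):
        class _model(object):
            def __init__(self, midpoint, direction):
                self.midpoint = midpoint
                self.direction = direction
            def apply(self, f):
                # logistic squashing of the (signed) distance to the midpoint
                return 1.0 / (1.0 + np.exp(-4.0 * self.direction * (np.mean(f) - self.midpoint)))
        def train(self, features, labels):
            labels = np.asarray(labels)
            means = np.array([np.mean(f) for f in features])
            mu0 = means[labels == 0].mean()
            mu1 = means[labels == 1].mean()
            return self._model((mu0 + mu1) / 2.0, 1.0 if mu1 > mu0 else -1.0)

    # two views per object: a 4-vector and a 2-vector
    view0 = [np.zeros(4), np.ones(4), np.zeros(4) + .1, np.ones(4) - .1]
    view1 = [np.zeros(2), np.ones(2), np.zeros(2), np.ones(2)]
    labels = [0, 1, 0, 1]
    learner = multi_view_learner([_toy_prob_learner(), _toy_prob_learner()])
    model = learner.train(zip(view0, view1), labels)
    print model.apply([np.ones(4), np.ones(2)])    # should print True
    print model.apply([np.zeros(4), np.zeros(2)])  # should print False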
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_parzen.py
from __future__ import division
import milk.supervised.normalise
from milk.supervised.parzen import get_parzen_rbf_loocv
import numpy as np
import milksets
def _slow_parzen(features, labels, sigma):
correct = 0
N = len(features)
labels = 2*labels - 1
def kernel(fi, fj):
return np.exp(-((fi-fj)**2).sum()/sigma)
for i in xrange(N):
C = 0.
for j in xrange(N):
if i == j: continue
C += labels[j] * kernel(features[i],features[j])
if (C*labels[i] > 0): correct += 1
return correct/N
def test_parzen():
features,labels = milksets.wine.load()
labels = (labels == 1)
features = milk.supervised.normalise.zscore(features)
f = get_parzen_rbf_loocv(features, labels)
sigmas = 2.**np.arange(-4,4)
for s in sigmas:
assert abs(_slow_parzen(features, labels, s) - f(s)) < 1e-6
<file_sep>/gui/breezy/checkbuttondemo.py
"""
File: checkbuttondemo.py
Demonstrates check button capabilities.
"""
from breezypythongui import EasyFrame
class CheckbuttonDemo(EasyFrame):
"""When the display button is pressed, shows the label of
the selected radio button. The button group has a default
vertical alignment."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, "Check Button Demo")
# Add two check buttons
self.firstCB = self.addCheckbutton(text = "First",
row = 0, column = 0,
command = self.first)
self.secondCB = self.addCheckbutton(text = "Second",
row = 1, column = 0,
command = self.second)
# Event handler methods
def first(self):
"""Display a message box with the state of the check button."""
if self.firstCB.isChecked():
message = "First has been checked"
else:
message = "First has been unchecked"
self.messageBox(title = "State of First", message = message)
def second(self):
"""Display a message box with the state of the check button."""
if self.secondCB.isChecked():
message = "Second has been checked"
else:
message = "Second has been unchecked"
self.messageBox(title = "State of Second", message = message)
# Instantiate and pop up the window.
CheckbuttonDemo().mainloop()
<file_sep>/procimg.py
import Image
from numpy import *
from PIL import Image
import ImageFilter
import ImageOps
import ImageStat
#load image
im = Image.open("images/1.grass11.jpg")
print "image loaded", im
print "image is in mode = " , im.mode
print "image size = ", im.size
#print "image info = ", im.info
im.show()
#image resize
# adjust width and height to your needs
#
#width = 500
#height = 420
#
# use one of these filter options to resize the image
#
#im2 = im1.resize((width, height), Image.NEAREST) # use nearest neighbour
#im3 = im1.resize((width, height), Image.BILINEAR) # linear interpolation in a 2x2 environment
#im4 = im1.resize((width, height), Image.BICUBIC) # cubic spline interpolation in a 4x4 environment
#im5 = im1.resize((width, height), Image.ANTIALIAS) # best down-sizing filter
#convert an image to different format
im6 = im.convert("L")
im6.save("greyscale.jpg")
#print bands of color
print "bands of color = ", im6.getbands()
#print list of colors
#print im6.getcolors()
#print number of colors
print "number of unique colors = " , len(im6.getcolors())
#print pixel values colors
#print list(im6.getdata())
#create histogram of greyscale image
image_histogram = im6.histogram()
print image_histogram
print "image_histogram size = ", len(image_histogram)
im6.show()
#split image into color bands (RGB)
kkk = im.split()
print kkk[1]
#kkk[0].show()
#kkk[1].show()
#kkk[2].show()
#/Parse An Image
#import ImageFile
#fp = open("lena.pgm", "rb")
#p = ImageFile.Parser()
#while 1:
# s = fp.read(1024)
# if not s:
# break
# p.feed(s)
#im = p.close()
#im.save("copy.jpg")
#
#median filter
#im8 = im.filter(ImageFilter.MedianFilter(size=3))
#im8.show()
#mode filter
#im8 = im.filter(ImageFilter.ModeFilter(size=3))
#im8.show()
#EDGE_ENHANCEfilter
#im8 = im.filter(ImageFilter.EDGE_ENHANCE)
#im8.show()
#greyscale the image
im8 = ImageOps.grayscale(im)
im8.show()
#equalize the image
im8 = ImageOps.equalize(im6)
im8.show()
#get image stats
im_stat = ImageStat.Stat(im)
print "stat.extrema = ", im_stat.extrema
print "stat.count = ", im_stat.count
print "stat.sum = ", im_stat.sum
print "stat.sum2 = ", im_stat.sum2
print "stat.mean = ", im_stat.mean
print "stat.median = ", im_stat.median
print "stat.rms = ", im_stat.rms
print "stat.var = ", im_stat.var
print "stat.stddev = ", im_stat.stddev
<file_sep>/navigation/zc_compass/zc_compass.py
import serial
import sys, time
PORT = "/dev/ttyUSB0"
if len(sys.argv) > 1:
PORT = sys.argv[1]
ser = serial.Serial(PORT, 9600, parity=serial.PARITY_ODD, stopbits=serial.STOPBITS_ONE, timeout=1, dsrdtr=False, xonxoff=False, rtscts=False)
print ser.name,
print ser.baudrate,
print ser.bytesize,
print ser.parity,
print ser.stopbits,
print ser.dsrdtr
while 1:
print "Trying to connect..."
time.sleep(.2)
#ser.write("r") # write a string
#time.sleep(.05)
#s = ser.read()
s = ser.readline()
#print s
if len(s) > 0: break
#s = ser.read(100) # read up to one hundred bytes
#
while 1:
s = ser.readline()
print "recieved from arduino: ", s
if len(s) < 1: break
if ser.isOpen():
print "Connected..."
# or as much is in the buffer
#print s
ser.close() # close port
if not ser.isOpen():
print "Not Connected..."
ser.close() # close port
<file_sep>/PhidgetsPython/Python/Servo-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
from time import sleep
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetException
from Phidgets.Devices.Servo import Servo, ServoTypes
#Create an servo object
try:
servo = Servo()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def DisplayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (servo.isAttached(), servo.getDeviceName(), servo.getSerialNum(), servo.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
print("Number of motors: %i" % (servo.getMotorCount()))
#Event Handler Callback Functions
def ServoAttached(e):
attached = e.device
print("Servo %i Attached!" % (attached.getSerialNum()))
def ServoDetached(e):
detached = e.device
print("Servo %i Detached!" % (detached.getSerialNum()))
def ServoError(e):
try:
source = e.device
print("Servo %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
def ServoPositionChanged(e):
source = e.device
print("Servo %i: Motor %i Current Position: %f" % (source.getSerialNum(), e.index, e.position))
#Main Program Code
try:
servo.setOnAttachHandler(ServoAttached)
servo.setOnDetachHandler(ServoDetached)
servo.setOnErrorhandler(ServoError)
servo.setOnPositionChangeHandler(ServoPositionChanged)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
servo.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
servo.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
servo.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
DisplayDeviceInfo()
try:
print("Setting the servo type for motor 0 to HITEC_HS322HD")
servo.setServoType(0, ServoTypes.PHIDGET_SERVO_HITEC_HS322HD)
#Setting custom servo parameters example - 600us-2000us == 120 degrees
#servo.setServoParameters(0, 600, 2000, 120)
print("Move to position 10.00")
servo.setPosition(0, 10.00)
sleep(5)
print("Move to position 50.00")
servo.setPosition(0, 50.00)
sleep(5)
print("Move to position 100.00")
servo.setPosition(0, 100.00)
sleep(5)
print("Move to position 150.00")
servo.setPosition(0, 150.00)
sleep(5)
print("Move to position PositionMax")
servo.setPosition(0, servo.getPositionMax(0))
sleep(5)
print("Move to position PositionMin")
servo.setPosition(0, servo.getPositionMin(0))
sleep(5)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
servo.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/arduino/SabertoothSimplified/examples/SimpleExample/SabertoothSimplified.cpp
/*
Arduino Library for Sabertooth Simplified Serial
Copyright (c) 2012 Dimension Engineering LLC
http://www.dimensionengineering.com/arduino
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER
RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT,
NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE
USE OR PERFORMANCE OF THIS SOFTWARE.
*/
#include "SabertoothSimplified.h"
SabertoothSimplified::SabertoothSimplified()
: _port(Serial)
{
}
SabertoothSimplified::SabertoothSimplified(Stream& port)
: _port(port)
{
}
void SabertoothSimplified::motor(int power)
{
motor(1, power);
}
void SabertoothSimplified::motor(byte motor, int power)
{
mixedMode(false);
raw(motor, power);
}
void SabertoothSimplified::drive(int power)
{
mixedMode(true);
_mixedDrive = constrain(power, -127, 127);
_mixedDriveSet = true;
mixedUpdate();
}
void SabertoothSimplified::turn(int power)
{
mixedMode(true);
_mixedTurn = constrain(power, -127, 127);
_mixedTurnSet = true;
mixedUpdate();
}
void SabertoothSimplified::stop()
{
_port.write((uint8_t)0);
_mixedDriveSet = false;
_mixedTurnSet = false;
}
void SabertoothSimplified::mixedMode(boolean enable)
{
if (_mixed == enable) { return; }
stop();
_mixed = enable;
}
void SabertoothSimplified::mixedUpdate()
{
if (!_mixedDriveSet || !_mixedTurnSet) { return; }
raw(1, _mixedDrive - _mixedTurn);
raw(2, _mixedDrive + _mixedTurn);
}
void SabertoothSimplified::raw(byte motor, int power)
{
byte command, magnitude;
power = constrain(power, -127, 127);
magnitude = abs(power) >> 1;
if (motor == 1)
{
command = power < 0 ? 63 - magnitude : 64 + magnitude;
}
else if (motor == 2)
{
command = power < 0 ? 191 - magnitude : 192 + magnitude;
}
command = constrain(command, 1, 254);
_port.write(command);
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/perceptron.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
import numpy as np
from .classifier import normaliselabels
from .base import supervised_model
from . import _perceptron
class perceptron_model(supervised_model):
def __init__(self, w):
self.w = w
def apply(self, f):
f = np.asanyarray(f)
v = self.w[0] + np.dot(f, self.w[1:])
return v > 0
class perceptron_learner(object):
def __init__(self, eta=.1, max_iters=128):
self.eta = eta
self.max_iters = max_iters
def train(self, features, labels, normalisedlabels=False, **kwargs):
if not normalisedlabels:
labels, _ = normaliselabels(labels)
features = np.asanyarray(features)
if features.dtype not in (np.float32, np.float64):
features = features.astype(np.float64)
weights = np.zeros(features.shape[1]+1, features.dtype)
for i in xrange(self.max_iters):
errors = _perceptron.perceptron(features, labels, weights, self.eta)
if not errors:
break
return perceptron_model(weights)
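
# Hedged usage sketch (not part of the original module): a linearly separable
# toy problem. Relies on the compiled _perceptron extension imported above.
if __name__ == '__main__':
    features = np.array([
        [0.0, 0.2], [0.1, 0.0], [0.2, 0.1],    # class 0
        [1.0, 0.9], [0.9, 1.1], [1.1, 1.0]])   # class 1
    labels = np.array([0, 0, 0, 1, 1, 1])
    model = perceptron_learner(eta=.1, max_iters=128).train(features, labels)
    print [bool(model.apply(f)) for f in features]  # expected: three False, then three True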
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/multi.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
from .classifier import normaliselabels
from .base import supervised_model, base_adaptor
import numpy as np
__all__ = [
'one_against_rest',
'one_against_one',
'one_against_rest_multi',
'ecoc_learner',
'multi_tree_learner',
]
def _asanyarray(f):
try:
return np.asanyarray(f)
except:
return np.array(f, dtype=object)
class one_against_rest(base_adaptor):
'''
Implements one vs. rest classification strategy to transform
a binary classifier into a multi-class classifier.
classifier = one_against_rest(base)
base must obey the classifier interface
Example
-------
::
multi = one_against_rest(milk.supervised.simple_svm())
model = multi.train(training_features,labels)
print model.apply(testing_features)
See Also
--------
one_against_one
'''
def train(self, features, labels, normalisedlabels=False):
labels, names = normaliselabels(labels)
nclasses = labels.max() + 1
models = []
for i in xrange(nclasses):
model = self.base.train(features, (labels == i).astype(int), normalisedlabels=True)
models.append(model)
return one_against_rest_model(models, names)
class one_against_rest_model(supervised_model):
def __init__(self, models, names):
self.models = models
self.nclasses = len(self.models)
self.names = names
def apply(self, feats):
vals = np.array([c.apply(feats) for c in self.models])
(idxs,) = np.where(vals)
if len(idxs) == 1:
(label,) = idxs
elif len(idxs) == 0:
label = 0
else:
label = idxs[0]
return self.names[label]
class one_against_one(base_adaptor):
'''
Implements one vs. one classification strategy to transform
a binary classifier into a multi-class classifier.
classifier = one_against_one(base)
base must obey the classifier interface
Example
-------
::
multi = one_against_one(milk.supervised.simple_svm())
multi.train(training_features,labels)
print multi.apply(testing_features)
See Also
--------
one_against_rest
'''
def train(self, features, labels, weights=None, **kwargs):
'''
one_against_one.train(objs,labels)
'''
labels, names = normaliselabels(labels)
if weights is not None:
weights = np.asanyarray(weights)
features = _asanyarray(features)
nclasses = labels.max() + 1
models = [ [None for i in xrange(nclasses)] for j in xrange(nclasses)]
child_kwargs = kwargs.copy()
child_kwargs['normalisedlabels'] = True
for i in xrange(nclasses):
for j in xrange(i+1, nclasses):
idxs = (labels == i) | (labels == j)
if not np.any(idxs):
raise ValueError('milk.multi.one_against_one: Pair-wise classifier has no data')
if weights is not None:
child_kwargs['weights'] = weights[idxs]
model = self.base.train(features[idxs], (labels[idxs]==i).astype(int), **child_kwargs)
models[i][j] = model
return one_against_one_model(models, names)
class one_against_one_model(supervised_model):
def __init__(self, models, names):
self.models = models
self.names = _asanyarray(names)
self.nclasses = len(models)
def apply_many(self, features):
nc = self.nclasses
votes = np.zeros((nc, len(features)))
for i in xrange(nc):
for j in xrange(i+1,nc):
vs = self.models[i][j].apply_many(features)
vs = _asanyarray(vs)
votes[i] += (vs > 0)
votes[j] += (vs <= 0)
return self.names[votes.argmax(0)]
def apply(self,feats):
'''
one_against_one.apply(objs)
Classify one single object.
'''
nc = self.nclasses
votes = np.zeros(nc)
for i in xrange(nc):
for j in xrange(i+1,nc):
c = self.models[i][j].apply(feats)
if c:
votes[i] += 1
else:
votes[j] += 1
return self.names[votes.argmax(0)]
class one_against_rest_multi_model(supervised_model):
def __init__(self, models):
self.models = models
def apply(self, feats):
return [lab for lab,model in self.models.iteritems() if model.apply(feats)]
class one_against_rest_multi(base_adaptor):
'''
learner = one_against_rest_multi()
model = learner.train(features, labels)
classes = model.apply(f_test)
    This is for multi-label problems (i.e., each instance can have more than one label).
'''
def train(self, features, labels, normalisedlabels=False, weights=None):
'''
'''
import operator
all_labels = set()
for ls in labels:
all_labels.update(ls)
models = {}
kwargs = { 'normalisedlabels': True }
if weights is not None:
kwargs['weights'] = weights
for label in all_labels:
nlabels = np.array([int(label in ls) for ls in labels])
models[label] = self.base.train(features, nlabels, **kwargs)
return one_against_rest_multi_model(models)
def _solve_ecoc_model(codes, p):
try:
import scipy.optimize
except ImportError:
raise ImportError("milk.supervised.ecoc: fitting ECOC probability models requires scipy.optimize")
z,_ = scipy.optimize.nnls(codes.T, p)
z /= z.sum()
return z
class ecoc_model(supervised_model):
def __init__(self, models, codes, return_probability):
self.models = models
self.codes = codes
self.return_probability = return_probability
def apply(self, f):
word = np.array([model.apply(f) for model in self.models])
if self.return_probability:
return _solve_ecoc_model(self.codes, word)
else:
word = word.astype(bool)
errors = (self.codes != word).sum(1)
return np.argmin(errors)
class ecoc_learner(base_adaptor):
'''
Implements error-correcting output codes for reducing a multi-class problem
to a set of binary problems.
Reference
---------
"Solving Multiclass Learning Problems via Error-Correcting Output Codes" by
<NAME>, <NAME> in Journal of Artificial Intelligence
Research, Vol 2, (1995), 263-286
'''
def __init__(self, base, probability=False):
base_adaptor.__init__(self, base)
self.probability = probability
def train(self, features, labels, normalisedlabels=False, **kwargs):
if normalisedlabels:
labelset = np.unique(labels)
else:
labels,names = normaliselabels(labels)
labelset = np.arange(len(names))
k = len(labelset)
n = 2**(k-1)
codes = np.zeros((k,n),bool)
for k_ in xrange(1,k):
codes[k_].reshape( (-1, 2**(k-k_-1)) )[::2] = 1
codes = ~codes
# The last column of codes is not interesting (all 1s). The array is
# actually of size 2**(k-1)-1, but it is easier to compute the full
# 2**(k-1) and then ignore the last element.
codes = codes[:,:-1]
models = []
for code in codes.T:
nlabels = np.zeros(len(labels), int)
for ell,c in enumerate(code):
if c:
nlabels[labels == ell] = 1
models.append(self.base.train(features, nlabels, normalisedlabels=True, **kwargs))
return ecoc_model(models, codes, self.probability)
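# Worked example of the code construction above (illustration only): for k=3
# classes the loop builds, after the ~ and the column drop, the codewords
#     class 0 -> 1 1 1
#     class 1 -> 0 0 1
#     class 2 -> 0 1 0
# Each column is then trained as one binary problem, with the classes whose
# bit is 1 forming the positive class (here: {0} vs rest, {0,2} vs {1},
# {0,1} vs {2}).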
def split(counts):
groups = ([],[])
weights = np.zeros(2, float)
in_order = counts.argsort()
for s in in_order[::-1]:
g = weights.argmin()
groups[g].append(s)
weights[g] += counts[s]
return groups
class multi_tree_model(supervised_model):
def __init__(self, model):
self.model = model
def apply(self, feats):
def ap_recursive(smodel):
if len(smodel) == 1:
return smodel[0]
model,left,right = smodel
if model.apply(feats): return ap_recursive(left)
else: return ap_recursive(right)
return ap_recursive(self.model)
class multi_tree_learner(base_adaptor):
'''
Implements a multi-class learner as a tree of binary decisions.
    At each level, labels are split into 2 groups in a way that attempts to
    balance the number of examples on each side (and not the number of labels
    on each side). This means that on a 4-class problem with a distribution like
[ 50% 25% 12.5% 12.5%], the "optimal" splits are
o
/ \
/ \
[0] o
/ \
[1] o
/ \
[2][3]
where all comparisons are perfectly balanced.
'''
def train(self, features, labels, normalisedlabels=False, **kwargs):
if not normalisedlabels:
labels,names = normaliselabels(labels)
labelset = np.arange(len(names))
else:
labels = np.asanyarray(labels)
labelset = np.arange(labels.max()+1)
def recursive(labelset, counts):
if len(labelset) == 1:
return labelset
g0,g1 = split(counts)
nlabels = np.array([(ell in g0) for ell in labels], int)
            model = self.base.train(features, nlabels, normalisedlabels=True, **kwargs)
m0 = recursive(labelset[g0], counts[g0])
m1 = recursive(labelset[g1], counts[g1])
return (model, m0, m1)
counts = np.zeros(labels.max()+1)
for ell in labels:
counts[ell] += 1
return multi_tree_model(recursive(np.arange(labels.max()+1), counts))
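# Worked example for split() / multi_tree_learner (illustration only): with
# example counts [4, 2, 1, 1] (the 50/25/12.5/12.5 case from the docstring),
# split() returns ([0], [1, 3, 2]): class 0 (4 examples) goes alone on one
# side and classes 1, 2, 3 (2+1+1 examples) on the other, so the first binary
# model is "class 0 vs. the rest", exactly as drawn in the tree above.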
<file_sep>/ftdi/build/pyftdi/pyftdi/pyftdi/usbtools.py
# Copyright (C) 2010-2011 <NAME> <<EMAIL>>
# All rights reserved.
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
import os
import threading
import usb.core
import usb.util
class UsbTools(object):
"""Helpers to obtain information about connected USB devices."""
# Need to maintain a list of reference USB devices, to circumvent a
    # limitation in pyusb that prevents opening the same USB device several
    # times. The following dictionary uses bus/address/vendor/product keys
    # to track (device, refcount) pairs
DEVICES = {}
LOCK = threading.RLock()
USBDEVICES = []
@staticmethod
def find_all(vps):
"""Find all devices that match the vendor/product pairs of the vps
list."""
devices = []
devs = UsbTools._find_devices(vps)
for dev in devs:
ifcount = max([cfg.bNumInterfaces for cfg in dev])
sernum = usb.util.get_string(dev, 64, dev.iSerialNumber)
description = usb.util.get_string(dev, 64, dev.iProduct)
devices.append((dev.idVendor, dev.idProduct, sernum, ifcount,
description))
return devices
@classmethod
def get_device(cls, vendor, product, index, serial, description):
"""Find a previously open device with the same vendor/product
or initialize a new one, and return it"""
cls.LOCK.acquire()
try:
vps = [(vendor, product)]
if index or serial or description:
dev = None
if not vendor:
raise AssertionError('Vendor identifier is required')
devs = cls._find_devices(vps)
if description:
devs = [dev for dev in devs if \
usb.util.get_string(dev, 64, dev.iProduct) \
== description]
if serial:
devs = [dev for dev in devs if \
usb.util.get_string(dev, 64, dev.iSerialNumber) \
== serial]
try:
dev = devs[index]
except IndexError:
raise IOError("No such device")
else:
devs = cls._find_devices(vps)
dev = devs and devs[0] or None
if not dev:
raise IOError('Device not found')
try:
devkey = (dev.bus, dev.address, vendor, product)
if None in devkey[0:2]:
                    raise AttributeError('USB backend does not support bus '
'enumeration')
except AttributeError:
devkey = (vendor, product)
if devkey not in cls.DEVICES:
if os.name in ('posix', ):
for configuration in dev:
                        # we need to detach any kernel driver from the device
                        # be greedy: reclaim all device interfaces from the kernel
for interface in configuration:
ifnum = interface.bInterfaceNumber
if not dev.is_kernel_driver_active(ifnum):
continue
try:
dev.detach_kernel_driver(ifnum)
except usb.core.USBError, e:
pass
dev.set_configuration()
cls.DEVICES[devkey] = [dev, 1]
else:
cls.DEVICES[devkey][1] += 1
return cls.DEVICES[devkey][0]
finally:
cls.LOCK.release()
@classmethod
def release_device(cls, usb_dev):
"""Release a previously open device, if it not used anymore"""
# Lookup for ourselves in the class dictionary
cls.LOCK.acquire()
try:
for devkey in cls.DEVICES:
dev, refcount = cls.DEVICES[devkey]
if dev == usb_dev:
# found
if refcount > 1:
# another interface is open, decrement
cls.DEVICES[devkey][1] -= 1
else:
# last interface in use, release
usb.util.dispose_resources(cls.DEVICES[devkey][0])
del cls.DEVICES[devkey]
break
finally:
cls.LOCK.release()
@classmethod
def _find_devices(cls, vps):
"""Find an USB device and return it.
This code re-implements the usb.core.find() method using a local
cache to avoid calling several times the underlying LibUSB and the
system USB calls to enumerate the available USB devices. As these
calls are time-hungry (about 1 second/call), the enumerated devices
are cached. It consumes a bit more memory but dramatically improves
start-up time.
        Hopefully, this kludge is temporary and will be replaced with a better
implementation from PyUSB at some point.
"""
cls.LOCK.acquire()
try:
backend = None
import usb.backend.libusb10 as libusb10
import usb.backend.libusb01 as libusb01
import usb.backend.openusb as openusb
for m in (libusb10, openusb, libusb01):
backend = m.get_backend()
if backend is not None:
break
else:
raise ValueError('No backend available')
if not cls.USBDEVICES:
# not freed until Python runtime completion
# enumerate_devices returns a generator, so back up the
# generated device into a list. To save memory, we only
# back up the supported devices
devlist = []
vpdict = {}
for v, p in vps:
vpdict.setdefault(v, [])
vpdict[v].append(p)
for dev in backend.enumerate_devices():
device = usb.core.Device(dev, backend)
vendor = device.idVendor
product = device.idProduct
if vendor in vpdict:
products = vpdict[vendor]
if products and (product not in products):
continue
devlist.append(device)
cls.USBDEVICES = devlist
return cls.USBDEVICES
finally:
cls.LOCK.release()
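
# Hedged usage sketch (not part of the original module): enumerate matching
# devices. 0x0403/0x6001 are the common FTDI FT232R vendor/product ids, used
# here purely as an example; adjust for your hardware. Needs a working pyusb
# backend (libusb).
if __name__ == '__main__':
    for vid, pid, serial, ifcount, description in UsbTools.find_all([(0x0403, 0x6001)]):
        print '%04x:%04x %s (%d interfaces) serial=%s' % (vid, pid, description, ifcount, serial)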
<file_sep>/gui/web_gui/sitetest2/cgi-bin/script_bckup
#!/usr/bin/python
# Import modules for CGI handling
import cgi, cgitb
import time
"""
#while 1:
#data = open('1.grass10.jpg', 'rb').read()
#print "Content-Type: image/jpg\nContent-Length: %d\n" % len(data)
#print data
def refresh_display():
# Create instance of FieldStorage
form = cgi.FieldStorage()
# Get data from fields
first_name = form.getvalue('first_name')
last_name = form.getvalue('last_name')
button2 = form.getvalue('button2')
print #
data_uri = open('1.grass10.jpg', 'rb').read().encode('base64').replace('\n', '')
img_tag = '<img src="data:image/jpg;base64,{0}">'.format(data_uri)
print (img_tag)
#print "Content-type:text/html\r\n\r\n"
print "<html>"
print "<head>"
print "<title>Hello - Second CGI Program</title>"
print "</head>"
print "<body>"
print "<h2>Hello %s %s</h2>" % (first_name, last_name)
print '<form name="form2" action="/cgi-bin/script.py" method="get">'
#print '<input type="text" name="button2" />' % ('button2')
print '<input type="submit" value="Submit" name="Submit2"/>'
print '</form>'
#print '<input type="submit" value="Submit" name="Submit1"/>'
#print '<input type="submit" value="Submit" name="Submit2"/>'
print "</body>"
print "</html>"
if "Submit1" in form:
button = 1
print "button 1 pressed"
elif "Submit2" in form2:
button = 2
print "button 2 pressed"
else:
print "Couldn't determine which button was pressed."
if "button2" in form:
print "button2 pressed"
print "hi"
"""
def execute_program():
#import os
#os.system("./test_gui1.py")
#time.sleep(2)
#print "Location: http://localhost:8002/"
print "Location: http://isotope11.com"
if __name__ == '__main__':
#button = refresh_display()
execute_program()
#main()
<file_sep>/lib/mobot_wifi_graph.py
import matplotlib.pyplot as plt
import numpy as np
import time
import thread
import math
import random
from mobot_wifi_consume import *
from matplotlib import mpl
import sys
import matplotlib.colors as mcolors
def drange(start, stop, step):
r = start
while r < stop:
yield r
r += step
def closest(target, collection):
return min((abs(target - i), i) for i in collection)[1]
def calculate_color(val):
#R=(255*val)/100
R=(255*(100-val))/100;
#G=(255*(100-val))/100;
G=(255*val)/100
B=0
#print R,G,B
to_return = [normalize_val(R, 0, 255),normalize_val(G, 0, 255),B]
return to_return
def normalize_val(val, floor, ceiling):
return float (int(((float(val) - floor) / ceiling) * 100)) / 100
def calculate_color2(val):
temp = int(translate(val, 0, 80, 0, 4))
color = []
n = 5
R = (1.0 - (0.25 * temp))
G = (0.25 * temp)
B = 0
color = [R,G,B]
#print 'val:', val, ' temp:', temp, ' color:', color
return color
def translate(sensor_val, in_from, in_to, out_from, out_to):
out_range = out_to - out_from
in_range = in_to - in_from
in_val = sensor_val - in_from
val=(float(in_val)/in_range)*out_range
out_val = out_from+val
return out_val
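# Worked example for translate() (illustration only): translate(40, 0, 80, 0, 4)
# maps 40 from the range 0..80 into 0..4, giving (40.0/80)*4 = 2.0.
# calculate_color2() then truncates that with int() to pick one of the five
# red-to-green steps (here temp=2, i.e. R=0.5, G=0.5).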
def animate():
n= 100
# http://www.scipy.org/Cookbook/Matplotlib/Animations
x = []
y = [0] * 100
for i in range(n):
x.append(i)
cdict = {'red': ((0.0, 1.0, 1.0),
(1.0, 0.0, 0.0)),
'green': ((0.0, 0.0, 0.0),
(1.0, 1.0, 1.0)),
'blue': ((0.0, 0.0, 0.0),
(1.0, 0.0, 0.0))
}
cmap = mcolors.LinearSegmentedColormap('my_colormap', cdict, 5)
#cmap = mpl.cm.cool
#norm = mpl.colors.Normalize(vmin=100, vmax=0)
#bounds = [0, 25, 50, 75, 100]
#norm = mpl.colors.BoundaryNorm(bounds, cmap.N)
cax = ax.imshow((x,y), cmap=cmap)
cbar = fig.colorbar(cax, orientation='vertical')
wifi = consume_wifi('wifi.1', '192.168.1.180')
while True:
time.sleep(.2)
y = y[1:]
signal_value = wifi.signal_strength
print signal_value
try:
val = int(wifi.signal_strength)
if val < 0: val = 0
#print val
except:
val = 0
#val = random.randint(0,100)
pass
y.append(val)
plt.cla()
plt.xticks(xrange(0,100,10))#, endpoint=True))
plt.yticks(xrange(0,100,10))#, endpoint=True))
plt.xlabel('10 seconds')
plt.ylabel('Strength')
plt.grid(True)
plt.ylim([0,100])
plt.xlim([0,100])
plt.grid(True)
#from mpl_toolkits.axes_grid1 import make_axes_locatable
#divider = make_axes_locatable(plt.gca())
#cax = plt.append_axes("right", "5%", pad="3%")
#cbar = plt.colorbar(fig, orientation='vertical')
#plt.tight_layout()
colors = []
#print y
for i in range(len(x)):
#colors.append(calculate_color(y[i]))
colors.append(calculate_color2(y[i]))
print val , colors[99]
plt.bar(x , y, 1, color=colors)
fig.canvas.draw()
#raw_input('ll')
#print int(translate(100, 0, 255, 1, 5))
#sys.exit()
fig, ax = plt.subplots(1,1,1,dpi=60)
win = fig.canvas.manager.window
win.after(10, animate)
plt.show ()
<file_sep>/PhidgetsPython/Python/Analog-simple.py
#! /usr/bin/python
"""Copyright 2011 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__="<NAME>"
__version__="2.1.8"
__date__ ="13-Jan-2011 4:28:27 PM"
#Basic imports
import sys
from time import sleep
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetException
from Phidgets.Devices.Analog import Analog
#Create an accelerometer object
try:
analog = Analog()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def displayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (analog.isAttached(), analog.getDeviceName(), analog.getSerialNum(), analog.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
print("Number of analog outputs: %i" % (analog.getOutputCount()))
print("Maximum output voltage: %d" % (analog.getVoltageMax(0)))
print("Minimum output voltage: %d" % (analog.getVoltageMin(0)))
#Event Handler Callback Functions
def AnalogAttached(e):
attached = e.device
print("Analog %i Attached!" % (attached.getSerialNum()))
def AnalogDetached(e):
detached = e.device
print("Analog %i Detached!" % (detached.getSerialNum()))
def AnalogError(e):
try:
source = e.device
print("Analog %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
#Main Program Code
try:
analog.setOnAttachHandler(AnalogAttached)
analog.setOnDetachHandler(AnalogDetached)
analog.setOnErrorhandler(AnalogError)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
analog.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
analog.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
analog.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
displayDeviceInfo()
try:
print("Enabling Analog output channel 0...")
analog.setEnabled(0, True)
sleep(5)
print("Set analog output voltage to +5.00V...")
analog.setVoltage(0, 5.00)
sleep(5)
print("Set analog output voltage to -5.00V...")
analog.setVoltage(0, -5.00)
sleep(5)
print("Set analog output voltage to +0.00V...")
analog.setVoltage(0, 0.00)
sleep(5)
print("Disabling Analog output channel 0...")
analog.setEnabled(0, False)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
analog.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/grouped.py
# -*- coding: utf-8 -*-
# Copyright (C) 2010-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
# -*- coding: utf-8 -*-
from __future__ import division
import numpy as np
from collections import defaultdict
from .classifier import normaliselabels
from .base import base_adaptor, supervised_model
__all__ = [
'voting_learner',
'mean_learner',
'remove_outliers',
'filter_outliers',
]
def _concatenate_features_labels(gfeatures, glabels):
if type(gfeatures) == np.ndarray and gfeatures.dtype == object:
gfeatures = list(gfeatures)
features = np.concatenate(gfeatures)
labels = []
for feats,label in zip(gfeatures, glabels):
labels.extend( [label] * len(feats) )
return features, labels
class voting_learner(base_adaptor):
'''
Implements a voting scheme for multiple sub-examples per example.
classifier = voting_learner(base)
base should be a binary classifier
Example
-------
::
voterlearn = voting_learner(milk.supervised.simple_svm())
voter = voterlearn.train(training_groups, labeled_groups)
res = voter.apply([ [f0, f1, f3] ])
'''
def train(self, gfeatures, glabels, normalisedlabels=False):
features, labels = _concatenate_features_labels(gfeatures, glabels)
return voting_model(self.base.train(features, labels))
voting_classifier = voting_learner
class voting_model(supervised_model):
def __init__(self, base):
self.base = base
def apply(self, gfeatures):
votes = defaultdict(int)
for feats in gfeatures:
votes[self.base.apply(feats)] += 1
best = None
most_votes = 0
for k,v in votes.iteritems():
if v > most_votes:
best = k
most_votes = v
return best
class mean_learner(base_adaptor):
'''
Implements a mean scheme for multiple sub-examples per example.
classifier = mean_learner(base)
`base` should be a classifier that returns a numeric confidence value
`classifier` will return the **mean**
Example
-------
::
meanlearner = mean_learner(milk.supervised.raw_svm())
model = meanlearner.train(training_groups, labeled_groups)
res = model.apply([ [f0, f1, f3] ])
'''
def train(self, gfeatures, glabels, normalisedlabels=False):
features, labels = _concatenate_features_labels(gfeatures, glabels)
return mean_model(self.base.train(features, labels))
mean_classifier = mean_learner
class mean_model(supervised_model):
def __init__(self, base):
self.base = base
def apply(self, gfeatures):
return np.mean([self.base.apply(feats) for feats in gfeatures])
def remove_outliers(features, limit, min_size):
'''
features = remove_outliers(features, limit, min_size)
'''
nsize = int(limit * len(features))
if nsize < min_size:
return features
normed = features - features.mean(0)
std = normed.std(0)
std[std == 0] = 1
normed /= std
f2_sum1 = (normed**2).mean(1)
values = f2_sum1.copy()
values.sort()
top = values[nsize]
selected = f2_sum1 < top
return features[selected]
class filter_outliers_model(supervised_model):
def __init__(self, limit, min_size):
self.limit = limit
self.min_size = min_size
def apply(self, features):
return remove_outliers(features, self.limit, self.min_size)
class filter_outliers(object):
def __init__(self, limit=.9, min_size=3):
self.limit = limit
self.min_size = min_size
def train(self, features, labels, normalisedlabels=False):
return filter_outliers_model(self.limit, self.min_size)
<file_sep>/telemetry/clientside_udp.py
import socket, time
UDP_IP="127.0.0.1"
UDP_PORT=5005
sock = socket.socket( socket.AF_INET, # Internet
socket.SOCK_DGRAM ) # UDP
sock.bind( (UDP_IP,UDP_PORT) )
count = 0
while True:
now = time.time()
count = count + 1
print count
data, addr = sock.recvfrom( 1024 ) # buffer size is 1024 bytes
print "received message:", data, " time:", (now-(time.time()))
<file_sep>/lib/nav_functions.py
import sys
sys.path.append( "/home/mobot/projects/robomow/lib/" )
from heapq import heappush, heappop # for priority queue
import math
import time
import random
from gps_functions import *
import simplekml
class node:
xPos = 0 # x position
yPos = 0 # y position
distance = 0 # total distance already travelled to reach the node
priority = 0 # priority = distance + remaining distance estimate
def __init__(self, xPos, yPos, distance, priority):
self.xPos = xPos
self.yPos = yPos
self.distance = distance
self.priority = priority
def __lt__(self, other): # comparison method for priority queue
return self.priority < other.priority
def updatePriority(self, xDest, yDest):
self.priority = self.distance + self.estimate(xDest, yDest) * 10 # A*
# give higher priority to going straight instead of diagonally
def nextMove(self, dirs, d): # d: direction to move
if dirs == 8 and d % 2 != 0:
self.distance += 14
else:
self.distance += 10
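    # Note (added for clarity): the 10/14 step costs are the usual integer
    # approximation for a unit grid -- 14/10 ~= sqrt(2) -- so diagonal moves
    # are charged roughly their true Euclidean length without using floats.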
# Estimation function for the remaining distance to the goal.
def estimate(self, xDest, yDest):
xd = xDest - self.xPos
yd = yDest - self.yPos
# Euclidian Distance
d = math.sqrt(xd * xd + yd * yd)
# Manhattan distance
# d = abs(xd) + abs(yd)
# Chebyshev distance
# d = max(abs(xd), abs(yd))
return(d)
# A-star algorithm.
# The path returned will be a string of digits of directions.
def pathFind(the_map, n, m, num_of_directions, xA, yA, xB, yB):
if num_of_directions == 4:
dx = [1, 0, -1, 0]
dy = [0, 1, 0, -1]
elif num_of_directions == 8:
dx = [1, 1, 0, -1, -1, -1, 0, 1]
dy = [0, 1, 1, 1, 0, -1, -1, -1]
closed_nodes_map = [] # map of closed (tried-out) nodes
open_nodes_map = [] # map of open (not-yet-tried) nodes
dir_map = [] # map of dirs
row = [0] * n
for i in range(m): # create 2d arrays
closed_nodes_map.append(list(row))
open_nodes_map.append(list(row))
dir_map.append(list(row))
pq = [[], []] # priority queues of open (not-yet-tried) nodes
pqi = 0 # priority queue index
# create the start node and push into list of open nodes
n0 = node(xA, yA, 0, 0)
n0.updatePriority(xB, yB)
heappush(pq[pqi], n0)
open_nodes_map[yA][xA] = n0.priority # mark it on the open nodes map
# A* search
while len(pq[pqi]) > 0:
# get the current node w/ the highest priority
# from the list of open nodes
n1 = pq[pqi][0] # top node
n0 = node(n1.xPos, n1.yPos, n1.distance, n1.priority)
x = n0.xPos
y = n0.yPos
heappop(pq[pqi]) # remove the node from the open list
open_nodes_map[y][x] = 0
closed_nodes_map[y][x] = 1 # mark it on the closed nodes map
# quit searching when the goal is reached
# if n0.estimate(xB, yB) == 0:
if x == xB and y == yB:
# generate the path from finish to start
# by following the dirs
path = ''
while not (x == xA and y == yA):
j = dir_map[y][x]
c = str((j + num_of_directions / 2) % num_of_directions)
path = c + path
x += dx[j]
y += dy[j]
return path
# generate moves (child nodes) in all possible dirs
for i in range(num_of_directions):
xdx = x + dx[i]
ydy = y + dy[i]
if not (xdx < 0 or xdx > n-1 or ydy < 0 or ydy > m - 1
or the_map[ydy][xdx] == 1 or closed_nodes_map[ydy][xdx] == 1):
# generate a child node
m0 = node(xdx, ydy, n0.distance, n0.priority)
m0.nextMove(num_of_directions, i)
m0.updatePriority(xB, yB)
# if it is not in the open list then add into that
if open_nodes_map[ydy][xdx] == 0:
open_nodes_map[ydy][xdx] = m0.priority
heappush(pq[pqi], m0)
# mark its parent node direction
dir_map[ydy][xdx] = (i + num_of_directions / 2) % num_of_directions
elif open_nodes_map[ydy][xdx] > m0.priority:
# update the priority
open_nodes_map[ydy][xdx] = m0.priority
# update the parent direction
dir_map[ydy][xdx] = (i + num_of_directions / 2) % num_of_directions
# replace the node
# by emptying one pq to the other one
# except the node to be replaced will be ignored
# and the new node will be pushed in instead
while not (pq[pqi][0].xPos == xdx and pq[pqi][0].yPos == ydy):
heappush(pq[1 - pqi], pq[pqi][0])
heappop(pq[pqi])
heappop(pq[pqi]) # remove the target node
# empty the larger size priority queue to the smaller one
if len(pq[pqi]) > len(pq[1 - pqi]):
pqi = 1 - pqi
while len(pq[pqi]) > 0:
heappush(pq[1-pqi], pq[pqi][0])
heappop(pq[pqi])
pqi = 1 - pqi
heappush(pq[pqi], m0) # add the better node instead
return '' # if no route found
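# Minimal usage sketch for pathFind() (illustrative only): the_map is a list of
# rows with 0 = free and 1 = obstacle; the result is a string of direction
# digits indexing the dx/dy tables selected by num_of_directions.
#
#   grid = [[0] * 5 for _ in range(5)]
#   grid[2][1] = 1                                  # one obstacle
#   route = pathFind(grid, 5, 5, 8, 0, 0, 4, 4)     # from (0,0) to (4,4)
#   # each digit d of route is a move along (dx[d], dy[d])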
def create_map(length, width, starting_pos):
#starting_pos = lat, lon
#n = 30 # horizontal size of the map
#m = 30 # vertical size of the map
the_map = []
row = [[0,0]] * length
for i in range(width): # create empty map
the_map.append(list(row))
#xB, yB = 4, 4 # destination
#mobot starts in center of map
xA, yA = int(width / 2), int(length / 2)
lat1 = starting_pos[0]
lon1 = starting_pos[1]
the_map[xA][yA] = lat1, lon1
print 'Map size (X,Y): ', width, length
print 'Start: ', xA, yA, the_map[xA][yA]
#print 'Finish: ', xB, yB
#print the_map
meters_left = int(width/2)
meters_up = int(length / 2)
pos_left = []
pos_left = destination_coordinates(lat1, lon1, 270, meters_left)
print pos_left
print lldistance((lat1, lon1), pos_left)
pos_up = []
pos_up = destination_coordinates(pos_left[0], pos_left[1], 0, meters_up)
print pos_up
print lldistance((lat1, lon1), pos_up)
the_map[0][0] = pos_up
#print the_map
kml = simplekml.Kml()
data_filename = "mobot_local_map.kml"
f_handle = open(data_filename, 'w')
f_handle.write('')
f_handle.close()
for y in range(width):
for x in range(length):
temp_posx = destination_coordinates(pos_up[0], pos_up[1], 90 , x)
temp_posy = destination_coordinates(temp_posx[0], temp_posx[1], 180,y)
the_map[y][x] = temp_posy
name_str = "mobot" + str(x) + " " + str(y)
kml.newpoint(name=(name_str), coords=[(temp_posy[1], temp_posy[0])])
kml.save(data_filename)
return the_map
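# Notes on create_map() (added for clarity): assuming destination_coordinates()
# takes its distance argument in metres, the grid is built on 1-metre spacing.
# pos_up is the north-west corner (width/2 m west, then length/2 m north of the
# start), and cell [y][x] is reached by going x metres east and y metres south
# from that corner, so the robot starts near the centre of the grid.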
def print_map(the_map, width, length):
# display the map with the route added
print 'Map:'
for y in range(width):
for x in range(length):
xy = the_map[y][x]
if xy == 0:
print '.', # space
elif xy == 1:
print 'O', # obstacle
elif xy == 2:
print 'S', # start
elif xy == 3:
print 'R', # route
elif xy == 4:
print 'F', # finish
print
if __name__== "__main__":
'''
# fillout the map with a '+' pattern
for x in range(width / 8, width * 7 / 8):
the_map[length / 2][x] = 1
for y in range(length/8, length * 7 / 8):
the_map[y][width / 2] = 1
'''
#create map
length = 10
width = 10
the_map = create_map(length, width, (33.5, -86.5))
print the_map
sys.exit(-1)
#until current location = destination location
currentx = xA
currenty = yA
#update map with starting and destination position
the_map[yA][xA] = 2
the_map[yB][xB] = 4
arrived = False
num_of_directions = 8 # number of possible directions to move on the map
if num_of_directions == 4:
dx = [1, 0, -1, 0]
dy = [0, 1, 0, -1]
elif num_of_directions == 8:
dx = [1, 1, 0, -1, -1, -1, 0, 1]
dy = [0, 1, 1, 1, 0, -1, -1, -1]
while arrived == False:
#place random obstacles
for i in range(5):
randx =random.randint(1,length-1)
randy =random.randint(1,width-1)
if the_map[randy][randx] == 0:
#print "randx,randy", randx,randy
the_map[randy][randx] = 1
the_route = pathFind(the_map, length, width, num_of_directions, currentx, currenty, xB, yB)
if len(the_route) > 0:
print "route:", the_route
j = int(the_route[0])
currentx += dx[j]
currenty += dy[j]
the_map[currenty][currentx] = 3
print_map(the_map, width, length)
else:
print "I have arrived..."
arrived = True
raw_input('Press Enter...')
'''
self.latitude = (avg_latitude / (x*len(all_gps_list)))
self.longitude = (avg_longitude / (x*len(all_gps_list)))
#print "total sats:", self.active_satellites
self.active_satellites = ( avg_active_satellites / (x*len(all_gps_list)))
#time.sleep(1)
print 'Avg latitude : ' , self.latitude, " ", abs(self.latitude - gpss[n].fix.latitude)
print 'Avg longitude: ' , self.longitude, " ", abs(self.longitude - gpss[n].fix.longitude)
print 'Avg Active Satellites: ' , self.active_satellites
print "Distance: ", round((lldistance((self.latitude, self.longitude), (gpss[n].fix.latitude, gpss[n].fix.longitude)) * 3.28084), 4), " feet"
'''
<file_sep>/arduino/SabertoothSimplified/examples/TankStyleSweep/TankStyleSweep.ino
// Tank-Style Sweep Sample
// Copyright (c) 2012 Dimension Engineering LLC
// See license.txt for license details.
#include <SabertoothSimplified.h>
// Mixed mode is for tank-style diff-drive robots.
// Only Packet Serial actually has mixed mode, so this Simplified Serial library
// emulates it (to allow easy switching between the two libraries).
SabertoothSimplified ST; // We'll name the Sabertooth object ST.
// For how to configure the Sabertooth, see the DIP Switch Wizard for
// http://www.dimensionengineering.com/datasheets/SabertoothDIPWizard/start.htm
// Be sure to select Simplified Serial Mode for use with this library.
// This sample uses a baud rate of 9600.
//
// Connections to make:
// Arduino TX->1 -> Sabertooth S1
// Arduino GND -> Sabertooth 0V
// Arduino VIN -> Sabertooth 5V (OPTIONAL, if you want the Sabertooth to power the Arduino)
//
// If you want to use a pin other than TX->1, see the SoftwareSerial example.
void setup()
{
Serial.begin(9600); // This is the baud rate you chose with the DIP switches.
ST.drive(0); // The Sabertooth won't act on mixed mode until
ST.turn(0); // it has received power levels for BOTH throttle and turning, since it
// mixes the two together to get diff-drive power levels for both motors.
// So, we set both to zero initially.
}
// Mixed mode tips:
// drive() should go forward and back, turn() should go right and left.
// If this is reversed, swap M2A and M2B.
// Positive on drive() should go forward, negative should go backward.
// If this is reversed, swap A and B on both M1 and M2.
// Positive on turn() should go right, negative should go left.
// If this is reversed, swap M1 and M2.
// In this sample, the SLOW sweep (left-to-right) here is turning,
// and the FAST sweep (backwards-to-forwards) is throttle.
void loop()
{
int power;
// Don't turn. Ramp from going backwards to going forwards, waiting 20 ms (1/50th of a second) per value.
for (power = -127; power <= 127; power ++)
{
ST.drive(power);
delay(20);
}
// Now, let's use a power level of 20 (out of 127) forward.
// This way, our turning will have a radius. Mostly, the command
// is just to demonstrate you can use drive() and turn() at the same time.
ST.drive(20);
// Ramp turning from full left to full right SLOWLY by waiting 50 ms (1/20th of a second) per value.
for (power = -127; power <= 127; power ++)
{
ST.turn(power);
delay(50);
}
// Now stop turning, and stop driving.
ST.turn(0);
ST.drive(0);
// Wait a bit. This is so you can catch your robot if you want to. :-)
delay(5000);
}
<file_sep>/telemetry/video_stream_server.py
#!/usr/bin/env python
# This is some example code for python-gstreamer.
# It's a gstreamer TCP/IP server that listens on
# the localhost for a client to send data to it.
# It shows how to use the tcpserversrc and tcpclientsink
# elements.
import gobject, pygst
pygst.require("0.10")
import gst
# Callback for the decodebin source pad
def new_decode_pad(dbin, pad, islast):
pad.link(convert.get_pad("sink"))
# create a pipeline and add [tcpserversrc ! decodebin ! audioconvert ! alsasink]
pipeline = gst.Pipeline("server")
tcpsrc = gst.element_factory_make("tcpserversrc", "source")
pipeline.add(tcpsrc)
tcpsrc.set_property("host", "127.0.0.1")
tcpsrc.set_property("port", 3000)
decode = gst.element_factory_make("decodebin", "decode")
decode.connect("new-decoded-pad", new_decode_pad)
pipeline.add(decode)
tcpsrc.link(decode)
convert = gst.element_factory_make("audioconvert", "convert")
pipeline.add(convert)
sink = gst.element_factory_make("alsasink", "sink")
pipeline.add(sink)
convert.link(sink)
pipeline.set_state(gst.STATE_PLAYING)
# enter into a mainloop
loop = gobject.MainLoop()
loop.run()
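# Roughly equivalent command-line pipeline, for reference (an approximation,
# assuming the gstreamer-0.10 command-line tools are installed):
#   gst-launch-0.10 tcpserversrc host=127.0.0.1 port=3000 ! decodebin ! audioconvert ! alsasink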
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/randomforest.py
# -*- coding: utf-8 -*-
# Copyright (C) 2010-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
'''
Random Forest
-------------
Main elements
-------------
rf_learner : A learner object
'''
from __future__ import division
import numpy as np
import milk.supervised.tree
from .normalise import normaliselabels
from .base import supervised_model
from ..utils import get_nprandom
__all__ = [
'rf_learner',
]
def _sample(features, labels, n, R):
'''
features', labels' = _sample(features, labels, n, R)
Sample n element from (features,labels)
Parameters
----------
features : sequence
labels : sequence
Same size as labels
n : integer
R : random object
Returns
-------
features' : sequence
labels' : sequence
'''
N = len(features)
sfeatures = []
slabels = []
for i in xrange(n):
idx = R.randint(N)
sfeatures.append(features[idx])
slabels.append(labels[idx])
return np.array(sfeatures), np.array(slabels)
class rf_model(supervised_model):
def __init__(self, forest, names, return_label = True):
self.forest = forest
self.names = names
self.return_label = return_label
def apply(self, features):
rf = len(self.forest)
votes = sum(t.apply(features) for t in self.forest)
if self.return_label:
return (votes > (rf//2))
return votes / rf
class rf_learner(object):
'''
Random Forest Learner
learner = rf_learner(rf=101, frac=.7)
Attributes
----------
rf : integer, optional
Nr of trees to learn (default: 101)
frac : float, optional
Sample fraction
R : np.random object
Source of randomness
'''
def __init__(self, rf=101, frac=.7, R=None):
self.rf = rf
self.frac = frac
self.R = get_nprandom(R)
def train(self, features, labels, normalisedlabels=False, names=None, return_label=True, **kwargs):
N,M = features.shape
m = int(self.frac*M)
n = int(self.frac*N)
R = get_nprandom(kwargs.get('R', self.R))
tree = milk.supervised.tree.tree_learner(return_label=return_label)
forest = []
if not normalisedlabels:
labels,names = normaliselabels(labels)
elif names is None:
names = (0,1)
for i in xrange(self.rf):
forest.append(
tree.train(*_sample(features, labels, n, R),
**{'normalisedlabels' : True})) # This syntax is necessary for Python 2.5
return rf_model(forest, names, return_label)
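# Minimal usage sketch (illustrative only, not part of the library): train a
# forest on a two-class problem and classify one sample.
#
#   learner = rf_learner(rf=101, frac=.7)
#   model = learner.train(features, labels)   # features: 2-D ndarray, labels: two classes
#   prediction = model.apply(features[0])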
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/logistic.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from .normalise import normaliselabels
from .base import supervised_model
__all__ = [
'logistic_learner',
]
@np.vectorize
def _sigmoidal(z):
if (z > 300): return 1.
if z < -300: return 0.
return 1./(1+np.exp(-z))
class logistic_model(supervised_model):
def __init__(self, bs):
self.bs = bs
def apply(self, fs):
return _sigmoidal(self.bs[0] + np.dot(fs, self.bs[1:]))
class logistic_learner(object):
'''
learner = logistic_learner(alpha=0.0)
Logistic regression learner
There are two implementations:
1. One which depends on ``scipy.optimize``. This is the default and is
extremely fast.
2. If ``import scipy`` fails, then we fall back to a Python only
gradient-descent. This gives good results, but is many times slower.
Properties
----------
alpha : real, optional
penalty for L2-normalisation. Default is zero, for no penalty.
'''
def __init__(self, alpha=0.0):
self.alpha = alpha
def train(self, features, labels, normalisedlabels=False, names=None, **kwargs):
def error(bs):
response = bs[0] + np.dot(features, bs[1:])
response = _sigmoidal(response)
diff = response - labels
log_like = np.dot(diff, diff)
L2_penalty = self.alpha * np.dot(bs, bs)
return log_like + L2_penalty
def error_prime(bs):
fB = np.dot(features, bs[1:])
response = _sigmoidal(bs[0] + fB)
sprime = response * (1-response)
ds = (response - labels) * sprime
b0p = np.sum(ds)
b1p = np.dot(features.T, ds)
bp = np.concatenate( ([b0p], b1p) )
return 2.*(bp + self.alpha*bs)
features = np.asanyarray(features)
if not normalisedlabels:
labels, _ = normaliselabels(labels)
N,f = features.shape
bs = np.zeros(f+1)
try:
from scipy import optimize
# Some testing revealed that this was a good combination
# call fmin_cg twice first and then fmin
# I do not understand why 100%, but there it is
bs = optimize.fmin_cg(error, bs, error_prime, disp=False)
bs = optimize.fmin_cg(error, bs, error_prime, disp=False)
bs = optimize.fmin(error, bs, disp=False)
except ImportError:
import warnings
warnings.warn('''\
milk.supervised.logistic.train: Could not import scipy.optimize.
Fall back to very simple gradient descent (which is slow).''')
bs = np.zeros(f+1)
cur = 1.e-6
ebs = error(bs)
for i in xrange(1000000):
dir = error_prime(bs)
step = (lambda e : bs - e *dir)
enbs = ebs + 1
while enbs > ebs:
cur /= 2.
if cur == 0.:
break
nbs = step(cur)
enbs = error(nbs)
while cur < 10.:
cur *= 2
nnbs = step(cur)
ennbs = error(nnbs)
if ennbs < enbs:
nbs = nnbs
enbs = ennbs
else:
break
bs = nbs
ebs = enbs
return logistic_model(bs)
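# Minimal usage sketch (illustrative only, not part of the library):
#
#   learner = logistic_learner(alpha=0.1)
#   model = learner.train(features, labels)   # binary labels
#   p = model.apply(features[0])              # sigmoidal output in [0, 1]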
<file_sep>/gui/web_gui/sitetest2/robomow_logic.py~
#!/usr/bin/env python
import easygui as eg
import sys
from img_processing_tools import *
#from PIL import Image
from PIL import ImageStat
import cv
import time
import mahotas
def snap_shot():
#directory = eg.diropenbox(msg=None, title=None, default=None)
#print directory
filename = eg.fileopenbox(msg=None, title=None, default='*', filetypes=None)
print filename
#img1 = cv.LoadImage(filename)
img1 = Image.open(filename)
if img1.size[0] <> 320 or img1.size[1] <> 240:
print "Image is not right size. Resizing image...."
img1 = img1.resize((320, 240))
print "Resized to 320, 340"
#img1 = Image.open(filename).convert('RGB').save('temp.gif')
img1.save('temp.jpg')
#print img1
#cv.ShowImage("Frame1", img1)
#time.sleep(1)
#cv.WaitKey()
#time.sleep(1)
#cv.DestroyWindow("Frame1")
#print "window destroyed"
#time.sleep(.5)
return img1
if __name__=="__main__":
if len(sys.argv) < 2:
print "******* Requires an argument"
print "This program will return the angle at which the second is in relation to the first. ***"
sys.exit(-1)
#loop = 1
#reply =""
#while loop == 1:
# data file schema
# classID, next 256 integers are I3 greenband histogram, I3 sum, I3 sum2, I3 median, I3 mean,
# I3 variance, I3 Standard Deviation, I3 root mean square
#if reply == "":
# image = snap_shot()
if sys.argv[1] == "mowable":
#if reply == "Mowable":
#eg.msgbox("Going to mow....:")
classID = "1"
print "calling i3"
I3image = rgb2I3(Image.open('temp.jpg'))
WriteMeterics(I3image, classID)
if sys.argv[1] == "non_nowable":
classID = "2"
print "calling i3"
I3image = rgb2I3(Image.open('temp.jpg'))
WriteMeterics(I3image, classID)
if sys.argv[1] == "quit":
print "Quitting...."
sys.exit(-1)
if sys.argv[1] == "new_image":
print "Acquiring new image.."
image = snap_shot()
#print np.array(image)
#print PIL2array(image)
#lbp1 = mahotas.features.lbp(image , 1, 8, ignore_zeros=False)
#print lbp1
#reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'New Image', 'Quit'), image='temp.gif', root=None)
"""
if reply == "Grab Frame":
try:
#img1 = cv.LoadImage(sys.argv[1],cv.CV_LOAD_IMAGE_GRAYSCALE)
frame = grab_frame(0)
#img1 = cv.CreateImage(cv.GetSize(frame), cv.IPL_DEPTH_8U, 1)
#img1 = CVtoGray(frame)
#cv.WaitKey()
#img1 = CV_enhance_edge(img1)
#cv.WaitKey()
#img2 = cv.LoadImage(sys.argv[1],cv.CV_LOAD_IMAGE_GRAYSCALE)
#img3 = cv.LoadImage(sys.argv[2],cv.CV_LOAD_IMAGE_GRAYSCALE)
print "frame=", frame
cv.ShowImage("Frame1", frame)
cv.MoveWindow ('Frame1',50 ,50 )
except:
print "******* Could not open camera *******"
frame = Image.open("1.grass10.jpg")
#sys.exit(-1)
"""
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/svm.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
from .classifier import normaliselabels, ctransforms_model
from .base import supervised_model
from collections import deque
import numpy
import numpy as np
import random
from . import _svm
__all__ = [
'rbf_kernel',
'polynomial_kernel',
'precomputed_kernel',
'dot_kernel',
'svm_raw',
'svm_binary',
'svm_to_binary',
'svm_sigmoidal_correction',
'sigma_value_fisher',
'fisher_tuned_rbf_svm',
]
def _svm_apply(SVM, q):
'''
f_i = _svm_apply(SVM, q)
@internal: This is mostly used for testing
'''
X,Y,Alphas,b,C,kernel=SVM
N = len(X)
s = 0.0
for i in xrange(N):
s += Alphas[i] * Y[i] * kernel(q, X[i])
return s - b
def svm_learn_smo(X,Y,kernel,C,eps=1e-4,tol=1e-2,cache_size=(1<<20)):
'''
Learn a svm classifier
X: data
Y: labels in SVM format (ie Y[i] in (1,-1))
This is a very raw interface. In general, you should use a class
like svm_classifier.
Implements the Sequential Minimum Optimisation Algorithm from Platt's
"Fast training of support vector machines using sequential minimal optimization"
in Advances in kernel methods: support vector learning
Pages: 185 - 208
Year of Publication: 1999
ISBN:0-262-19416-3
'''
assert numpy.all(numpy.abs(Y) == 1)
assert len(X) == len(Y)
N = len(Y)
Y = Y.astype(numpy.int32)
params = numpy.array([0,C,1e-3,1e-5],numpy.double)
Alphas0 = numpy.zeros(N, numpy.double)
_svm.eval_SMO(X,Y,Alphas0,params,kernel,cache_size)
return Alphas0, params[0]
def svm_learn_libsvm(features, labels, kernel, C, eps=1e-4, tol=1e-2, cache_size=(1<<20), alphas=None):
'''
Learn a svm classifier using LIBSVM optimiser
This is a very raw interface. In general, you should use a class
like svm_classifier.
This uses the LIBSVM optimisation algorithm
Parameters
----------
X : ndarray
data
Y : ndarray
labels in SVM format (ie Y[i] in (1,-1))
kernel : kernel
C : float
eps : float, optional
tol : float, optional
cache_size : int, optional
alphas : ndarray, optional
Returns
-------
alphas : ndarray
b : float
'''
if not np.all(np.abs(labels) == 1):
raise ValueError('milk.supervised.svm.svm_learn_libsvm: Y[i] != (-1,+1)')
assert len(features) == len(labels)
n = len(labels)
labels = labels.astype(np.int32)
p = -np.ones(n, np.double)
params = np.array([0,C,eps,tol], dtype=np.double)
if alphas is None:
alphas = np.zeros(n, np.double)
elif alphas.dtype != np.double or len(alphas) != n:
raise ValueError('milk.supervised.svm_learn_libsvm: alphas is in wrong format')
_svm.eval_LIBSVM(features, labels, alphas, p, params, kernel, cache_size)
return alphas, params[0]
class preprocessed_rbf_kernel(object):
def __init__(self, X, sigma, beta):
self.X = X
self.Xsum = (X**2).sum(1)
self.sigma = sigma
self.beta = beta
def call_many(self, qs):
from milk.unsupervised import pdist
dists = pdist(self.X, qs, 'euclidean2')
dists /= -self.sigma
np.exp(dists, dists)
dists *= self.beta
return dists.T
def __call__(self, q):
minus_d2_sigma = np.dot(self.X,q)
minus_d2_sigma *= 2.
minus_d2_sigma -= self.Xsum
minus_d2_sigma -= np.dot(q,q)
minus_d2_sigma /= self.sigma
return self.beta * np.exp(minus_d2_sigma)
class rbf_kernel(object):
'''
kernel = rbf_kernel(sigma,beta=1)
Radial Basis Function kernel
Returns a kernel (ie, a function that implements)
    beta * exp( - ||x1 - x2||^2 / sigma)
'''
def __init__(self, sigma, beta=1):
self.sigma = sigma
self.beta = beta
self.kernel_nr_ = 0
self.kernel_arg_ = float(sigma)
def __call__(self, x1, x2):
d2 = x1 - x2
d2 **= 2
d2 = d2.sum()
res = self.beta*np.exp(-d2/self.sigma)
return res
def preprocess(self, X):
return preprocessed_rbf_kernel(X, self.sigma, self.beta)
class polynomial_kernel(object):
'''
kernel = polynomial_kernel(d,c=1)
returns a kernel (ie, a function) that implements:
(<x1,x2>+c)**d
'''
def __init__(self, d, c=1):
self.d = d
self.c = c
def __call__(self,x1,x2):
return (np.dot(x1,x2)+self.c)**self.d
class precomputed_kernel(object):
'''
kernel = precomputed_kernel(kmatrix)
A "fake" kernel which is precomputed.
'''
def __init__(self, kmatrix, copy=False):
        self.kmatrix = np.ascontiguousarray(kmatrix, np.double, copy=copy)
self.kernel_nr_ = 1
self.kernel_arg_ = 0.
def __call__(self, x0, x1):
        return self.kmatrix[x0,x1]
class _call_kernel(object):
def __init__(self, k, svs):
self.svs = svs
self.kernel = k
def __call__(self, q):
return np.array([self.kernel(s, q) for s in self.svs])
class preprocessed_dot_kernel(object):
def __init__(self, svs):
self.svs = svs
def __call__(self, x1):
return np.dot(self.svs, x1)
class dot_kernel(object):
def __init__(self):
self.kernel_nr_ = 2
self.kernel_arg_ = 0.
def __call__(self, x0, x1):
return np.dot(x0, x1)
def preprocess(self, svs):
return preprocessed_dot_kernel(svs)
class svm_raw_model(supervised_model):
def __init__(self, svs, Yw, b, kernel):
self.svs = svs
self.Yw = Yw
self.b = b
self.kernel = kernel
try:
self.kernelfunction = self.kernel.preprocess(self.svs)
except AttributeError:
self.kernelfunction = _call_kernel(self.kernel, self.svs)
def apply_many(self, qs):
try:
qs = self.kernelfunction.call_many(qs)
except AttributeError:
qs = np.array(map(self.kernelfunction, qs))
return np.dot(qs, self.Yw) - self.b
def apply(self, q):
Q = self.kernelfunction(q)
return np.dot(Q, self.Yw) - self.b
class svm_raw(object):
'''
svm_raw: classifier
classifier = svm_raw(kernel, C, eps=1e-3, tol=1e-8)
Parameters
----------
kernel : callable
the kernel to use. This should be a function that takes two data
arguments see rbf_kernel and polynomial_kernel.
C : float
the C parameter
eps : float, optional
the precision to which to solve the problem (default 1e-3)
tol : float, optional
(|x| < tol) is considered zero
'''
def __init__(self, kernel=None, C=1., eps=1e-3, tol=1e-8):
self.C = C
self.kernel = kernel
self.eps = eps
self.tol = tol
self.algorithm = 'libsvm'
def train(self, features, labels, normalisedlabels=False, **kwargs):
assert self.kernel is not None, 'milk.supervised.svm_raw.train: kernel not set!'
assert self.algorithm in ('libsvm','smo'), 'milk.supervised.svm_raw: unknown algorithm (%s)' % self.algorithm
assert not (np.isinf(self.C) or np.isnan(self.C)), 'milk.supervised.svm_raw: setting C to NaN or Inf causes problems.'
features = np.asanyarray(features)
if normalisedlabels:
Y = labels.copy()
else:
Y,_ = normaliselabels(labels)
assert Y.max() == 1, 'milk.supervised.svm_raw can only handle binary problems'
Y *= 2
Y -= 1
kernel = self.kernel
try:
kernel = (self.kernel.kernel_nr_, self.kernel.kernel_arg_)
features = np.ascontiguousarray(features, np.double)
except AttributeError:
pass
if self.algorithm == 'smo':
alphas,b = svm_learn_smo(features,Y,kernel,self.C,self.eps,self.tol)
else:
alphas,b = svm_learn_libsvm(features,Y,kernel,self.C,self.eps,self.tol)
svsi = (alphas != 0)
svs = features[svsi]
w = alphas[svsi]
Y = Y[svsi]
Yw = w * Y
return svm_raw_model(svs, Yw, b, self.kernel)
def get_params(self):
return self.C, self.eps,self.tol
def set_params(self,params):
self.C,self.eps,self.tol = params
def set_option(self, optname, value):
setattr(self, optname, value)
def learn_sigmoid_constants(F,Y,
max_iters=None,
min_step=1e-10,
sigma=1e-12,
eps=1e-5):
'''
A,B = learn_sigmoid_constants(F,Y)
This is a very low-level interface look into the svm_classifier class.
Parameters
----------
F : Values of the function F
Y : Labels (in boolean format, ie, in (0,1))
Other Parameters
----------------
max_iters : Maximum nr. of iterations
min_step : Minimum step
sigma : sigma
eps : A small number
Reference for Implementation
----------------------------
Implements the algorithm from "A Note on Platt's Probabilistic Outputs for
Support Vector Machines" by Lin, Lin, and Weng.
Machine Learning, Vol. 68, No. 3. (23 October 2007), pp. 267-276
'''
# Below we use safe constructs to avoid using the overflown values, but we
# must compute them because of the way numpy works.
errorstate = np.seterr(over='ignore')
# the deci[i] array is called F in this code
F = np.asanyarray(F)
Y = np.asanyarray(Y)
assert len(F) == len(Y)
assert numpy.all( (Y == 1) | (Y == 0) )
if max_iters is None:
max_iters = 1000
prior1 = Y.sum()
prior0 = len(F)-prior1
small_nr = 1e-4
hi_t = (prior1+1.)/(prior1+2.)
lo_t = 1./(prior0+2.)
T = Y*hi_t + (1-Y)*lo_t
A = 0.
B = np.log( (prior0+1.)/(prior1+1.) )
def target(A,B):
fApB = F*A + B
lef = np.log1p(np.exp(fApB))
lemf = np.log1p(np.exp(-fApB))
fvals = np.choose(fApB >= 0, ( T*fApB + lemf, (T-1.)*fApB + lef))
return np.sum(fvals)
fval = target(A,B)
for iter in xrange(max_iters):
fApB = F*A + B
ef = np.exp(fApB)
emf = np.exp(-fApB)
p = np.choose(fApB >= 0, ( emf/(1.+emf), 1./(1.+ef) ))
q = np.choose(fApB >= 0, ( 1/(1.+emf), ef/(1.+ef) ))
d2 = p * q
h11 = np.dot(F*F,d2) + sigma
h22 = np.sum(d2) + sigma
h21 = np.dot(F,d2)
d1 = T - p
g1 = np.dot(F,d1)
g2 = np.sum(d1)
if abs(g1) < eps and abs(g2) < eps: # Stopping criteria
break
det = h11*h22 - h21*h21
dA = - (h22*g1 - h21*g2)/det
dB = - (h21*g1 + h11*g2)/det
gd = g1*dA + g2*dB
stepsize = 1.
while stepsize >= min_step:
newA = A + stepsize*dA
newB = B + stepsize*dB
newf = target(newA,newB)
if newf < fval+eps*stepsize*gd:
A = newA
B = newB
fval = newf
break
stepsize /= 2
else:
print 'Line search fails'
break
np.seterr(**errorstate)
return A,B
class svm_binary_model(supervised_model):
def __init__(self, classes):
self.classes = classes
def apply(self,f):
return self.classes[f >= 0.]
class svm_binary(object):
'''
classifier = svm_binary()
model = classifier.train(features, labels)
assert model.apply(f) in labels
'''
def train(self, features, labels, normalisedlabels=False, **kwargs):
if normalisedlabels:
return svm_binary_model( (0,1) )
assert len(labels) >= 2, 'Cannot train from a single example'
names = sorted(set(labels))
assert len(names) == 2, 'milk.supervised.svm.svm_binary.train: Can only handle two class problems'
return svm_binary_model(names)
class svm_to_binary(object):
'''
svm_to_binary(base_svm)
A simple wrapper so that
svm_to_binary(base_svm)
is a model that takes the base_svm classifier and then binarises its model output.
NOTE: This class does the same job as::
ctransforms(base_svm, svm_binary())
'''
def __init__(self, svm_base):
'''
binclassifier = svm_to_binary(svm_base)
a classifier that binarises the output of svm_base.
'''
self.base = svm_base
def train(self, features, labels, **kwargs):
model = self.base.train(features, labels, **kwargs)
binary = svm_binary()
binary_model = binary.train(features, labels, **kwargs)
return ctransforms_model([model, binary_model])
def set_option(self, opt, value):
self.base.set_option(opt, value)
class svm_sigmoidal_correction_model(supervised_model):
def __init__(self, A, B):
self.A = A
self.B = B
def apply(self,features):
return 1./(1.+numpy.exp(features*self.A+self.B))
class svm_sigmoidal_correction(object):
'''
svm_sigmoidal_correction : a classifier
Sigmoidal approximation for obtaining a probability estimate out of the output
of an SVM.
'''
def __init__(self):
self.max_iters = None
def train(self, features, labels, **kwargs):
A,B = learn_sigmoid_constants(features,labels,self.max_iters)
return svm_sigmoidal_correction_model(A, B)
def get_params(self):
return self.max_iters
def set_params(self,params):
self.max_iters = params
def sigma_value_fisher(features,labels):
'''
f = sigma_value_fisher(features,labels)
value_s = f(s)
Computes a function which computes how good the value of sigma
is for the features. This function should be *minimised* for a
good value of sigma.
Parameters
-----------
features : features matrix as 2-ndarray.
Returns
-------
f : a function: float -> float
this function should be minimised for a good `sigma`
Reference
----------
Implements the measure in
"Determination of the spread parameter in the
Gaussian kernel for classification and regression"
by <NAME>, <NAME>, <NAME>, and <NAME>
'''
features = np.asanyarray(features)
xij = np.dot(features,features.T)
f2 = np.sum(features**2,1)
d = f2-2*xij
d = d.T + f2
N1 = (labels==0).sum()
N2 = (labels==1).sum()
C1 = -d[labels == 0][:,labels == 0]
C2 = -d[labels == 1][:,labels == 1]
C12 = -d[labels == 0][:,labels == 1]
C1 = C1.copy()
C2 = C2.copy()
C12 = C12.copy()
def f(sigma):
sigma = float(sigma)
N1 = C1.shape[0]
N2 = C2.shape[0]
if C12.shape != (N1,N2):
raise ValueError
C1v = np.sum(np.exp(C1/sigma))/N1
C2v = np.sum(np.exp(C2/sigma))/N2
C12v = np.sum(np.exp(C12/sigma))/N1/N2
return (N1 + N2 - C1v - C2v)/(C1v/N1+C2v/N2 - 2.*C12v)
return f
class fisher_tuned_rbf_svm(object):
'''
F = fisher_tuned_rbf_svm(sigmas, base)
Returns a wrapper classifier that uses RBF kernels automatically
tuned using sigma_value_fisher.
'''
def __init__(self, sigmas, base):
self.sigmas = sigmas
self.base = base
def train(self, features, labels, **kwargs):
f = sigma_value_fisher(features, labels)
fs = [f(s) for s in self.sigmas]
self.sigma = self.sigmas[np.argmin(fs)]
self.base.set_option('kernel',rbf_kernel(self.sigma))
return self.base.train(features, labels, **kwargs)
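# Minimal usage sketch for this module (illustrative only, not part of the
# library): train a raw SVM with an RBF kernel and binarise its output.
#
#   learner = svm_to_binary(svm_raw(kernel=rbf_kernel(sigma=1.0), C=1.0))
#   model = learner.train(features, labels)   # features: 2-D ndarray, labels: two classes
#   prediction = model.apply(features[0])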
<file_sep>/camera/trywxcv.py
import wx
from CVtypes import cv
import time
class myFrame(wx.Frame):
def __init__(self):
wx.Frame.__init__(self, None, -1, 'Try CV')
self.SetClientSize((320,240))
# setup capture
self.cap = cv.CreateCameraCapture(0)
self.Bind(wx.EVT_IDLE, self.onIdle)
# infrastructure for timing
self.frames = 0
self.starttime = time.clock()
def onIdle(self, event):
img = cv.QueryFrame(self.cap)
self.displayImage(img)
event.RequestMore()
# timing stuff
self.frames += 1
interval = time.clock() - self.starttime
if interval > 5:
print 'rate = %.1f frames/second' % (self.frames / interval)
self.frames = 0
self.starttime = time.clock()
def displayImage(self, img, offset=(0,0)):
bitmap = cv.ImageAsBitmap(img)
dc = wx.ClientDC(self)
dc.DrawBitmap(bitmap, offset[0], offset[1], False)
class myApp(wx.App):
def OnInit(self):
self.frame = myFrame()
self.frame.Show(True)
return True
app = myApp(0)
app.MainLoop()
<file_sep>/main/front_cam.py
import cv2, cv
import time
webcam1 = cv2.VideoCapture(0)
cv2.namedWindow('Front Camera', cv.CV_WINDOW_AUTOSIZE)
while True:
#for i in range (5):
ret, frame = webcam1.read()
cv2.imshow('Front Camera', frame)
cv.WaitKey(5)
time.sleep(.1)
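    # Note (added for clarity): this preview loop never exits on its own. If a
    # clean shutdown is wanted, one common pattern is roughly:
    #   if cv.WaitKey(5) & 0xFF == ord('q'):
    #       break
    # followed by webcam1.release() and cv2.destroyAllWindows() after the loop.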
<file_sep>/gui/breezy/canvasdemo3.py
"""
File: canvasdemo3.py
Author: <NAME>
"""
from breezypythongui import EasyFrame, EasyCanvas
import tkinter.colorchooser
class CanvasDemo(EasyFrame):
"""Draws ovals, rectangles, or line segments."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Canvas Demo 3")
# Specialized canvas for drawing shapes
self.shapeCanvas = ShapeCanvas(self)
self.addCanvas(self.shapeCanvas, row = 0, column = 2,
rowspan = 8, width = 200)
# Radio buttons, labels, fields, and color canvas
self.shapeGroup = self.addRadiobuttonGroup(row = 5,
column = 0,
rowspan = 3)
defaultRB = self.shapeGroup.addRadiobutton(
text = "Line segment",
command = self.chooseShape)
self.shapeGroup.setSelectedButton(defaultRB)
self.shapeGroup.addRadiobutton(text = "Rectangle",
command = self.chooseShape)
self.shapeGroup.addRadiobutton(text = "Oval",
command = self.chooseShape)
self.addLabel(text = "x0", row = 0, column = 0)
self.addLabel(text = "y0", row = 1, column = 0)
self.addLabel(text = "x1", row = 2, column = 0)
self.addLabel(text = "y1", row = 3, column = 0)
self.addLabel(text = "Color", row = 4, column = 0)
self.x0Field = self.addIntegerField(0, row = 0, column = 1)
self.y0Field = self.addIntegerField(0, row = 1, column = 1)
self.x1Field = self.addIntegerField(0, row = 2, column = 1)
self.y1Field = self.addIntegerField(0, row = 3, column = 1)
self.colorCanvas = self.addCanvas(row = 4, column = 1,
width = 45, height = 20,
background = "black")
# Command buttons
self.addButton(text = "Choose color", row = 5, column = 1,
command = self.chooseColor)
self.addButton(text = "Draw", row = 6, column = 1,
command = self.draw)
self.addButton(text = "Erase all",
row = 7, column = 1,
command = self.eraseAll)
# Event handling methods in the main window class
def chooseShape(self):
"""Responds to a radio button click by updating the
shape canvas with the selected shape type."""
shapeType = self.shapeGroup.getSelectedButton()["text"]
self.shapeCanvas.setShapeType(shapeType)
def chooseColor(self):
"""Pops up a color chooser, outputs the result,
and updates the two canvases."""
colorTuple = tkinter.colorchooser.askcolor()
if not colorTuple[0]: return
hexString = colorTuple[1]
self.colorCanvas["background"] = hexString
self.shapeCanvas.setColor(hexString)
def draw(self):
"""Draws a shape of the current type at the current position
and in the current color."""
x0 = self.x0Field.getNumber()
y0 = self.y0Field.getNumber()
x1 = self.x1Field.getNumber()
y1 = self.y1Field.getNumber()
self.shapeCanvas.draw(x0, y0, x1, y1)
def eraseAll(self):
"""Deletes all shapes from the canvas."""
self.shapeCanvas.eraseAll()
class ShapeCanvas(EasyCanvas):
"""Supports the drawing of different types of shapes."""
def __init__(self, parent):
"""Sets up the canvas."""
EasyCanvas.__init__(self, parent, background = "gray")
self.shapeType = "Line segment"
self.color = "black"
self.shapes = list()
def setColor(self, color):
"""Resets the color to the given color."""
self.color = color
def setShapeType(self, shapeType):
"""Resets the shape type to the given shape type."""
self.shapeType = shapeType
def draw(self, x0, y0, x1, y1):
"""Draws the shape at the given coordinates."""
if self.shapeType == "Line segment":
shape = self.drawLine(x0, y0, x1, y1, self.color)
elif self.shapeType == "Rectangle":
shape = self.drawRectangle(x0, y0, x1, y1, self.color)
else:
shape = self.drawOval(x0, y0, x1, y1, self.color)
self.shapes.append(shape)
def eraseAll(self):
"""Deletes all shapes from the canvas."""
for shape in self.shapes:
self.delete(shape)
self.shapes = list()
# Instantiate and pop up the window."""
if __name__ == "__main__":
CanvasDemo().mainloop()
<file_sep>/prcoessimages.py
import Image
#load image
im = Image.open("grass11.jpg")
print "image loaded" = im
im.close
<file_sep>/PhidgetsPython/Python/LED-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import AttachEventArgs, DetachEventArgs, ErrorEventArgs
from Phidgets.Devices.LED import LED, LEDCurrentLimit, LEDVoltage
from time import sleep
#Create an LED object
try:
led = LED()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def displayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (led.isAttached(), led.getDeviceName(), led.getSerialNum(), led.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
#Event Handler Callback Functions
def ledAttached(e):
attached = e.device
print("LED %i Attached!" % (attached.getSerialNum()))
def ledDetached(e):
detached = e.device
print("LED %i Detached!" % (detached.getSerialNum()))
def ledError(e):
try:
source = e.device
print("LED %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
#Main Program Code
try:
led.setOnAttachHandler(ledAttached)
led.setOnDetachHandler(ledDetached)
led.setOnErrorhandler(ledError)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
exit(1)
print("Opening phidget object...")
try:
led.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
exit(1)
print("Waiting for attach....")
try:
led.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
led.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
exit(1)
exit(1)
else:
displayDeviceInfo()
print("Setting the output current limit and voltage levels to the default values....")
print("This is only supported on the 1031 - LED Advanced")
#try to set these values, if we get an exception, it means most likely we are using an old 1030 LED board instead of a 1031 LED Advanced board
try:
led.setCurrentLimit(LEDCurrentLimit.CURRENT_LIMIT_20mA)
led.setVoltage(LEDVoltage.VOLTAGE_2_75V)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
#This example assumes that there are LED's plugged into locations 0-7
print("Turning on LED's 0 - 9...")
for i in range(8):
led.setDiscreteLED(i, 100)
sleep(1)
print("Turning off LED's 0 - 9...")
for i in range(8):
led.setDiscreteLED(i, 0)
sleep(1)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
led.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting...")
exit(1)
print("Done.")
exit(0)
<file_sep>/img_tool_tester.py
from img_processing_tools import *
from PIL import Image
from PIL import ImageStat
import sys
if __name__=="__main__":
    if len(sys.argv) < 2:
#print "******* Requires an image files of the same size."
#print "This program will return the angle at which the second is in relation to the first. ***"
sys.exit(-1)
try:
img1 = Image.open(sys.argv[1])
#img2 = cv.LoadImage(sys.argv[2],cv.CV_LOAD_IMAGE_GRAYSCALE)
except:
print "******* Could not open image files *******"
sys.exit(-1)
img1.show()
img1_I3 = rgb2I3(img1)
img1_I3.show()
print "sum img1: ", ImageStat.Stat(img1).sum
print "sum img1_I3: ", ImageStat.Stat(img1_I3).sum
print "sum2 img1_I3: ", ImageStat.Stat(img1_I3).sum2
print "median img1_I3: ", ImageStat.Stat(img1_I3).median
print "avg img1_I3: ", ImageStat.Stat(img1_I3).mean
print "var img1_I3: ", ImageStat.Stat(img1_I3).var
print "stddev img1_I3: ", ImageStat.Stat(img1_I3).stddev
print "rms img1_I3: ", ImageStat.Stat(img1_I3).rms
print "extrema img1_I3: ", ImageStat.Stat(img1_I3).extrema
print "histogram I3: ", len(img1_I3.histogram())
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/featureselection.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from numpy.linalg import det
from . classifier import normaliselabels
__all__ = [
'sda',
'linearly_independent_subset',
'linear_independent_features',
'filterfeatures',
'featureselector',
'sda_filter',
'rank_corr',
'select_n_best',
]
def _sweep(A, k, flag):
Akk = A[k,k]
if Akk == 0:
Akk = 1.e-5
# cross[i,j] = A[i,k] * A[k,j]
cross = (A[:,k][:, np.newaxis] * A[k])
B = A - cross/Akk
# currently: B[i,j] = A[i,j] - A[i,k]*A[k,j]/Akk
# Now fix row k and col k, followed by Bkk
B[k] = flag * A[k]/A[k,k]
B[:,k] = flag * A[:,k]/A[k,k]
B[k,k] = -1./Akk
return B
def sda(features, labels, tolerance=.01, significance_in=.05, significance_out=.05, loose=False):
'''
features_idx = sda(features, labels, tolerance=.01, significance_in=.05, significance_out=.05)
Stepwise Discriminant Analysis for feature selection
Pre-filter the feature matrix to remove linearly dependent features
before calling this function. Behaviour is undefined otherwise.
This implements the algorithm described in Jennrich, R.I. (1977), "Stepwise
Regression" & "Stepwise Discriminant Analysis," both in Statistical Methods
for Digital Computers, eds. <NAME>, <NAME>, and <NAME>, New York;
<NAME> & Sons, Inc.
Parameters
----------
features : ndarray
feature matrix. There should not be any perfectly correlated features.
labels : 1-array
labels
tolerance : float, optional
significance_in : float, optional
significance_out : float, optional
Returns
-------
features_idx : sequence
sequence of integer indices
'''
from scipy import stats
assert len(features) == len(labels), 'milk.supervised.featureselection.sda: length of features not the same as length of labels'
N, m = features.shape
labels,labelsu = normaliselabels(labels)
q = len(labelsu)
df = features - features.mean(0)
T = np.dot(df.T, df)
dfs = [(features[labels == i] - features[labels == i].mean(0)) for i in xrange(q)]
W = np.sum(np.dot(d.T, d) for d in dfs)
ignoreidx = ( W.diagonal() == 0 )
if ignoreidx.any():
idxs, = np.where(~ignoreidx)
if not len(idxs):
return np.arange(m)
selected = sda(features[:,~ignoreidx],labels)
return idxs[selected]
output = []
D = W.diagonal()
df1 = q-1
last_enter_k = -1
while True:
V = W.diagonal()/T.diagonal()
W_d = W.diagonal()
V_neg = (W_d < 0)
p = V_neg.sum()
if V_neg.any():
V_m = V[V_neg].min()
k, = np.where(V == V_m)
k = k[0]
Fremove = (N-p-q+1)/(q-1)*(V_m-1)
df2 = N-p-q+1
PrF = 1 - stats.f.cdf(Fremove,df1,df2)
if PrF > significance_out:
#print 'removing ',k, 'V(k)', 1./V_m, 'Fremove', Fremove, 'df1', df1, 'df2', df2, 'PrF', PrF
if k == last_enter_k:
# We are going into an infinite loop.
import warnings
warnings.warn('milk.featureselection.sda: infinite loop detected (maybe bug?).')
break
W = _sweep(W,k,1)
T = _sweep(T,k,1)
continue
ks = ( (W_d / D) > tolerance)
if ks.any():
V_m = V[ks].min()
k, = np.where(V==V_m)
k = k[0]
Fenter = (N-p-q)/(q-1) * (1-V_m)/V_m
df2 = N-p-q
PrF = 1 - stats.f.cdf(Fenter,df1,df2)
if PrF < significance_in:
#print 'adding ',k, 'V(k)', 1./V_m, 'Fenter', Fenter, 'df1', df1, 'df2', df2, 'PrF', PrF
W = _sweep(W,k,-1)
T = _sweep(T,k,-1)
if loose or (PrF < 0.0001):
output.append((Fenter,k))
last_enter_k = k
continue
break
output.sort(reverse=True)
return np.array([idx for _,idx in output])
def linearly_independent_subset(V, threshold=1.e-5, return_orthogonal_basis=False):
'''
subset = linearly_independent_subset(V, threshold=1.e-5)
subset,U = linearly_independent_subset(V, threshold=1.e-5, return_orthogonal_basis=True)
Discover a linearly independent subset of `V`
Parameters
----------
V : sequence of input vectors
threshold : float, optional
vectors with 2-norm smaller or equal to this are considered zero
(default: 1e.-5)
return_orthogonal_basis : Boolean, optional
whether to return orthogonal basis set
Returns
-------
subset : ndarray of integers
indices used for basis
U : 2-array
orthogonal basis into span{V}
Implementation Reference
------------------------
Use Gram-Schmidt with a check for when the v_k is close enough to zero to ignore
See http://en.wikipedia.org/wiki/Gram-Schmidt_process
'''
V = np.array(V, copy=True)
orthogonal = []
used = []
for i,u in enumerate(V):
for v in orthogonal:
u -= np.dot(u,v)/np.dot(v,v) * v
if np.dot(u,u) > threshold:
orthogonal.append(u)
used.append(i)
if return_orthogonal_basis:
return np.array(used),np.array(orthogonal)
return np.array(used)
def linear_independent_features(features, labels=None):
'''
indices = linear_independent_features(features, labels=None)
Returns the indices of a set of linearly independent features (columns).
Parameters
----------
features : ndarray
labels : ignored
This argument is only here to conform to the learner interface.
Returns
-------
indices : ndarray of integers
indices of features to keep
See Also
--------
`linearly_independent_subset` :
this function is equivalent to `linearly_independent_subset(features.T)`
'''
return linearly_independent_subset(features.T)
class filterfeatures(object):
'''
selector = filterfeatures(idxs)
Returns a transformer which selects the features given by idxs. I.e.,
``apply(features)`` is equivalent to ``features[idxs]``
Parameters
----------
idxs : ndarray
This can be either an array of integers (positions) or an array of booleans
'''
def __init__(self, idxs):
self.idxs = idxs
def apply(self, features):
return features[self.idxs]
def apply_many(self, features):
features = np.asanyarray(features)
return features[:,self.idxs]
def __repr__(self):
return 'filterfeatures(%s)' % self.idxs
class featureselector(object):
'''
selector = featureselector(function)
Returns a transformer which selects features according to
selected_idxs = function(features,labels)
'''
def __init__(self, selector):
self.selector = selector
def train(self, features, labels, **kwargs):
idxs = self.selector(features, labels)
if len(idxs) == 0:
import warnings
warnings.warn('milk.featureselection: No features selected! Using all features as fall-back.')
idxs = np.arange(len(features[0]))
return filterfeatures(idxs)
def __repr__(self):
return 'featureselector(%s)' % self.selector
def sda_filter():
return featureselector(sda)
def rank_corr(features, labels):
'''
rs = rank_corr(features, labels)
Computes the following expression::
rs[i] = max_e COV²(rank(features[:,i]), labels == e)
This is appropriate for numeric features and categorical labels.
Parameters
----------
features : ndarray
feature matrix
labels : sequence
Returns
-------
rs : ndarray of float
rs are the rank correlations
'''
features = np.asanyarray(features)
labels = np.asanyarray(labels)
n = len(features)
ranks = features.argsort(0)
ranks = ranks.astype(float)
binlabels = np.array([(labels == ell) for ell in set(labels)], dtype=float)
mx = ranks.mean(0)
my = binlabels.mean(1)
sx = ranks.std(0)
sy = binlabels.std(1)
r = np.dot(binlabels,ranks)
r -= np.outer(n*my, mx)
r /= np.outer(sy, sx)
r /= n # Use n [instead of n-1] to match numpy's corrcoef
r **= 2
return r.max(0)
class select_n_best(object):
'''
select_n_best(n, measure)
Selects the `n` features that score the highest in `measure`
'''
def __init__(self, n, measure):
self.n = n
self.measure = measure
def train(self, features, labels, **kwargs):
values = self.measure(features, labels)
values = values.argsort()
return filterfeatures(values[:self.n])
<file_sep>/arduino/SabertoothSimplified/examples/SimpleExample/SimpleExample.ino
// Simple Example Sample
// Copyright (c) 2012 Dimension Engineering LLC
// See license.txt for license details.
#include <SabertoothSimplified.h>
SabertoothSimplified ST; // We'll name the Sabertooth object ST.
// For how to configure the Sabertooth, see the DIP Switch Wizard for
// http://www.dimensionengineering.com/datasheets/SabertoothDIPWizard/start.htm
// Be sure to select Simplified Serial Mode for use with this library.
// This sample uses a baud rate of 9600.
//
// Connections to make:
// Arduino TX->1 -> Sabertooth S1
// Arduino GND -> Sabertooth 0V
// Arduino VIN -> Sabertooth 5V (OPTIONAL, if you want the Sabertooth to power the Arduino)
//
// If you want to use a pin other than TX->1, see the SoftwareSerial example.
void setup()
{
Serial.begin(9600); // This is the baud rate you chose with the DIP switches.
}
void loop()
{
ST.motor(1, 127); // Go forward at full power.
delay(2000); // Wait 2 seconds.
ST.motor(1, 0); // Stop.
delay(2000); // Wait 2 seconds.
ST.motor(1, -127); // Reverse at full power.
delay(2000); // Wait 2 seconds.
ST.motor(1, 0); // Stop.
delay(2000);
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_normalise.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy
import numpy as np
from milk.supervised.normalise import sample_to_2min
import milk.supervised.normalise
def test_zscore_normalise():
I=milk.supervised.normalise.zscore_normalise()
numpy.random.seed(1234)
features = numpy.random.rand(20,100)
L = numpy.zeros(100)
model = I.train(features, L)
transformed = np.array([model.apply(f) for f in features])
assert np.all( transformed.mean(0)**2 < 1e-7 )
assert np.all( np.abs(transformed.std(0) - 1) < 1e-3 )
def test_sample_to_2min():
A = np.zeros(256, np.int32)
def test_one(A):
selected = sample_to_2min(A)
ratios = []
for l0 in set(A):
for l1 in set(A):
ratios.append( (A[selected] == l0).sum() / (A[selected] == l1).sum() )
assert np.max(ratios) <= 2.001
A[20:] = 1
yield test_one, A
A[21:] = 1
yield test_one, A
A[129:] = 2
yield test_one, A
def test_sample_to_2min_list():
from collections import defaultdict
def count(xs):
counts = defaultdict(int)
for x in xs:
counts[x] += 1
return counts
labels = ["A"]*8 + ["B"]*12 + ["C"]*16 + ["D"] * 24 + ["E"] * 1000
selected = sample_to_2min(labels)
before = count(labels)
after = count(np.array(labels)[selected])
assert max(after.values()) == min(before.values())*2
def test_interval_normalise():
interval = milk.supervised.normalise.interval_normalise()
np.random.seed(105)
features = np.random.randn(100, 5)
model = interval.train(features, features[0] > 0)
transformed = np.array([model.apply(f) for f in features])
assert np.allclose(transformed.min(0), -1)
assert np.allclose(transformed.max(0), +1)
def test_nanstd():
from milk.unsupervised.normalise import _nanstd
np.random.seed(234)
for i in xrange(8):
x = np.random.rand(200,231)
        assert np.allclose(_nanstd(x,0), x.std(0))
        assert np.allclose(_nanstd(x,1), x.std(1))
<file_sep>/navigation/gps/gps_device.py
#!/usr/bin/env python
#
# Adapted by <NAME> (<EMAIL>)
# from the following script:
# Copyright (c) IBM Corporation, 2006. All Rights Reserved.
# Author: <NAME> (<EMAIL>)
import serial
import datetime
import sys
# Usual location (address) of gps plugged into usb (linux)
GPSADDR = "/dev/ttyUSB0"
# Baud rate of device - 115200 for dataloggers, 4800 for Garmin etrex
BAUD = 115200
class GPSDevice(object):
""" General GPS Device interface for connecting to serial port GPS devices
using the default communication params specified by the National Marine
Electronics Association (NMEA) specifications.
"""
def __init__(self, commport):
""" GPSDevice(port)
Connects to the serial port specified, as the port numbers are
zero-based on windows the actual device string would be "COM" +
port+1.
"""
self.commport = commport
self.port = None
def open(self):
""" open() open the GPS device port, the NMEA default serial
I/O parameters are defined as 115200,8,N,1. (4800 for
garmin)
"""
nmea_params = {
'port': self.commport,
'baudrate': BAUD,
'bytesize': serial.EIGHTBITS,
'parity': serial.PARITY_NONE,
'stopbits': serial.STOPBITS_ONE
}
if self.port:
print 'Device port is already open'
sys.exit(2)
try:
            self.port = serial.Serial(**nmea_params)
            # passing 'port' to the constructor already opens the device, so
            # only call open() if it somehow is not open yet
            if not self.port.isOpen():
                self.port.open()
except serial.SerialException:
print """
Problem connecting to GPS
Is device connected and in NMEA transfer mode?
"""
sys.exit(2)
def read(self):
""" read() -> dict read a single NMEA sentence from the device
returning the data as a dictionary. The 'sentence' key will
identify the sentence type itself with other parameters
extracted and nicely formatted where possible.
"""
sentence = 'error'
line = self._read_raw()
if line:
record = self._validate(line)
if record:
if record[0] in _decode_func:
return _decode_func[record[0]](record)
else:
sentence = record[0]
return {
'sentence': sentence
}
def read_all(self):
""" read_all() -> dict A generator allowing the user to read
data from the device in a for loop rather than having to craft
their own looping method.
"""
while 1:
try:
record = self.read()
except IOError:
raise StopIteration
yield record
def close(self):
""" close() Close the port, note you can no longer read from
the device until you re-open it.
"""
if not self.port:
print 'Device port not open, cannot close'
sys.exit()
self.port.close()
self.port = None
def _read_raw(self):
""" _read_raw() -> str Internal method which reads a line from
the device (line ends in \r\n).
"""
if not self.port:
print 'Device port not open, cannot read'
sys.exit()
return self.port.readline()
def _checksum(self, data):
""" _checksum(data) -> str Internal method which calculates
the XOR checksum over the sentence (as a string, not including
the leading '$' or the final 3 characters, the ',' and
checksum itself).
"""
checksum = 0
for character in data:
checksum = checksum ^ ord(character)
hex_checksum = "%02x" % checksum
return hex_checksum.upper()
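    # Worked example: _checksum('AB') == '03', since 0x41 ^ 0x42 == 0x03.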
def _validate(self, sentence):
""" _validate(sentence) -> str
Internal method.
"""
        sentence = sentence.strip()
if sentence.endswith('\r\n'):
sentence = sentence[:len(sentence)-2]
if not sentence.startswith('$GP'):
# Note that sentences that start with '$P' are proprietary
# formats and are described as $P<mid><sid> where MID is the
# manufacturer identified (Magellan is MGN etc.) and then the
# SID is the manufacturers sentence identifier.
return None
star = sentence.rfind('*')
if star >= 0:
check = sentence[star + 1:]
sentence = sentence[1:star]
sum = self._checksum(sentence)
            if sum != check:
return None
sentence = sentence[2:]
return sentence.split(',')
# The internal decoder functions start here.
def format_date(datestr):
""" format_date(datestr) -> str
Internal function. Turn GPS DDMMYY into DD/MM/YY
"""
if datestr == '':
return ''
year = int(datestr[4:])
now = datetime.date.today()
if year + 2000 > now.year:
year = year + 1900
else:
year = year + 2000
the_date = datetime.date(year, int(datestr[2:4]), int(datestr[:2]))
return the_date.isoformat()
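# Example: format_date('230394') -> '1994-03-23'; two-digit years are resolved
# to 19xx or 20xx relative to the current year.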
def format_time(timestr):
""" format_time(timestr) -> str Internal function. Turn GPS HHMMSS
into HH:MM:SS UTC
"""
if timestr == '':
return ''
utc_str = ' +00:00'
the_time = datetime.time(int(timestr[:2]),
int(timestr[2:4]),
int(timestr[4:6]))
return the_time.strftime('%H:%M:%S') #+ utc_str
def format_latlong(data, direction):
""" formatp_latlong(data, direction) -> str
Internal function. Turn GPS HHMM.nnnn into standard HH.ddddd
"""
# Check to see if it's HMM.nnnn or HHMM.nnnn or HHHMM.nnnn
if data == '':
return 0 # this to stop blowing up on empty string (Garmin etrex)
dot = data.find('.')
if (dot > 5) or (dot < 3):
raise ValueError, 'Incorrect formatting of "%s"' % data
hours = data[0:dot-2]
mins = float(data[dot-2:])
if hours[0] == '0':
hours = hours[1:]
if direction in ['S', 'W']:
hours = '-' + hours
decimal = mins / 60.0 * 100.0
decimal = decimal * 10000.0
return '%s.%06d' % (hours, decimal)
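# Examples: format_latlong('4903.50', 'N')  -> '49.058333'
#           format_latlong('12311.12', 'W') -> '-123.185333'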
def _convert(v, f, d):
""" a multi-purpose function that converts
into a number of data types
v = value
f = data type e.g int, float string
d = default value if it doesn't work
"""
try:
return f(v)
except:
return d
def _decode_gsv(data):
""" decode_gsv(date) -> dict
Internal function.
data[0] = sentence
data[1] = number of sentences
data[2] = sentence number
data[3] = satellites in view
"""
if data[3] == '00':
print """
GPS not receiving enough satellites or outputting strange values
Try turning GPS off and back on again before re-starting script
"""
sys.exit(2)
sats = []
if data[2] < data[1]: # if this isn't the last sentence
blockno = 4 # the number of blocks in a full sentence
elif data[2] == data[1]: # this IS the last sentence
# get the remaining number of blocks:
blockno = _convert(data[3], int, 0) % 4
if blockno == 0:
blockno = 4
#print 'number of satellites: %s' % data[3]
#print 'number of sentences: %s' % data[1]
#print 'number of satellites in last sentence: %s' % blockno
for i in range(blockno * 4): # iterate through the sentence
sats.append(_convert(data[i + 4], int, 0))
return {
'sentence': data[0],
'Sentence_no.': _convert(data[2], int, 0),
'NumberOfSentences': _convert(data[1], int, 0),
'inview': _convert(data[3], int, 0),
'satellite_data_list': sats
}
def _decode_rmc(data):
""" Simply parses the rmc sentence into a dictionary that makes it
easier to query for values
"""
return {
'sentence': data[0],
'time': format_time(data[1]),
'active': data[2],
'latitude': _convert(('%s' % format_latlong(data[3], data[4])),
float, 0),
'longitude': _convert(('%s' % format_latlong(data[5], data[6])),
float, 0),
'knots': _convert(data[7], float, 0),
'bearing': _convert(data[8], float, 0),
'date': format_date(data[9]),
'mag_var': '%s,%s' % (data[10], data[11])
}
# dictionary that maps the sentences onto functions
_decode_func = {
'GSV': _decode_gsv,
'RMC': _decode_rmc
}
def main():
"""
test the script by printing the outputs
"""
# setup and connect to gps
gps = GPSDevice(GPSADDR)
gps.open()
for record in gps.read_all():
print record
# test function
if __name__ == '__main__':
sys.exit(main())
<file_sep>/arduino/rc_controller_tester/rc_controller_tester.ino
int rc1 = 9;
int rc2 = 8;
int rc3 = 10;
int rc1_val;
int rc2_val;
int rc3_val;
boolean kill_motors_flag;
void setup(){
Serial.begin(9600);
pinMode(rc1, INPUT);
pinMode(rc2, INPUT);
pinMode(rc3, INPUT);
}
void loop(){
read_pulses();
kill_switch();
write_motors();
serial_print_stuff();
delay(250);
}
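// read_pulses() measures the RC receiver pulse widths (nominally 1000-2000 us)
// with pulseIn() and rescales them:
//   rc1 -> 1..127   (motor 1 range, 64 = neutral fallback)
//   rc2 -> 128..255 (motor 2 range, 192 = neutral fallback)
//   rc3 stays in microseconds and is only used by kill_switch() as a toggle
//       (readings above ~1700 us arm the motors).
// Out-of-range or missing pulses fall back to the neutral/safe defaults.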
void read_pulses(){
// RC 1 value used to control motor 1 output
rc1_val = pulseIn(rc1, HIGH, 20000);
Serial.print(rc1_val);
if (rc1_val > 1000 && rc1_val < 2000){
rc1_val = map(rc1_val, 1000, 2000, 1, 127);
if (rc1_val < 1){
rc1_val = 1;
}
if (rc1_val > 127){
rc1_val = 127;
}
}
else{
rc1_val = 64;
}
// RC 2 value used to control motor 2 output
rc2_val = pulseIn(rc2, HIGH, 20000);
Serial.print(" ");
Serial.println(rc2_val);
if (rc2_val > 1000 && rc2_val < 2000){
rc2_val = map(rc2_val, 1000, 2000, 128, 255);
if (rc2_val < 128){
rc2_val = 128;
}
if (rc2_val > 255){
rc2_val = 255;
}
//Serial.println(rc2_val);
}
else{
rc2_val = 192;
}
// RC 3 value from a toggle switch used to control motor output
rc3_val = pulseIn(rc3, HIGH, 20000);
if (rc3_val > 1000 && rc3_val < 2000){
if (rc3_val < 1000){
rc3_val = 1000;
}
if (rc3_val > 2000){
rc3_val = 2000;
}
}
else{
rc3_val = 1000;
}
}
void kill_switch(){
if (rc3_val > 1700){
kill_motors_flag = false;
}
else{
kill_motors_flag = true;
}
}
void write_motors(){
if (kill_motors_flag == false){
Serial.print(rc1_val);
Serial.print(rc2_val);
}
else{
Serial.print(0);
}
}
void serial_print_stuff(){
Serial.print("rc1: ");
Serial.print(rc1_val);
Serial.print(" rc2: ");
Serial.print(rc2_val);
Serial.print(" rc3: ");
Serial.print(rc3_val);
Serial.print(" flag: ");
Serial.println(kill_motors_flag);
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/precluster.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from milk.unsupervised.kmeans import select_best_kmeans, assign_centroids
from .base import supervised_model, base_adaptor
import multiprocessing
from milk.utils import parallel
from milk import defaultlearner
class precluster_model(supervised_model):
def __init__(self, centroids, base):
self.centroids = centroids
self.base = base
self.normalise = True
def apply(self, features):
histogram = assign_centroids(features, self.centroids, histogram=True, normalise=self.normalise)
return self.base.apply(histogram)
class precluster_learner(object):
'''
This learns a classifier by clustering the input features
'''
def __init__(self, ks, base=None, R=None):
if base is None:
base = defaultlearner()
self.ks = ks
self.R = R
self.base = base
self.normalise = True
def set_option(self, k, v):
if k in ('R', 'ks'):
setattr(self, k, v)
else:
self.base.set_option(k,v)
def train(self, features, labels, **kwargs):
allfeatures = np.vstack(features)
assignments, centroids = select_best_kmeans(allfeatures, self.ks, repeats=1, method="AIC", R=self.R)
histograms = [assign_centroids(f, centroids, histogram=True, normalise=self.normalise) for f in features]
base_model = self.base.train(histograms, labels, **kwargs)
return precluster_model(centroids, base_model)
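# Rough usage sketch (not from the milk docs): `features` is a sequence of 2-D
# arrays, one matrix of local descriptors per object, and `labels` has one
# entry per object, e.g.:
#
#     learner = precluster_learner(ks=[8, 16])
#     model = learner.train(list_of_descriptor_arrays, labels)
#     prediction = model.apply(descriptors_of_new_object)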
class codebook_model(supervised_model):
def __init__(self, centroids, base, normalise):
self.centroids = centroids
self.base = base
self.normalise = normalise
def apply(self, features):
from milk.unsupervised.kmeans import assign_centroids
f0,f1 = features
features = assign_centroids(f0, self.centroids, histogram=True, normalise=self.normalise)
if f1 is not None and len(f1):
features = np.concatenate((features, f1))
return self.base.apply(features)
class codebook_learner(base_adaptor):
def set_option(self, k, v):
if k != 'codebook':
raise KeyError('milk.precluster.codebook_learner: unknown option `%s`' % k)
self.codebook = v
def train(self, features, labels, **kwargs):
from milk.unsupervised.kmeans import assign_centroids
tfeatures = np.array([ assign_centroids(f, self.codebook, histogram=True, normalise=self.normalise)
for f,_ in features])
tfeatures = np.hstack((tfeatures, np.array([f for _,f in features])))
base_model = self.base.train(tfeatures, labels, **kwargs)
return codebook_model(self.codebook, base_model, self.normalise)
class kmeans_cluster(multiprocessing.Process):
    def __init__(self, features, inq, outq):
        multiprocessing.Process.__init__(self)  # must run before .start() is usable
        self.features = features
        self.inq = inq
        self.outq = outq
def execute(self):
import milk
while True:
k,ri = self.inq.get()
if k == 'shutdown':
return
_,centroids = milk.kmeans(self.features, k=k, R=(k*1024+ri))
self.outq.put(centroids)
def run(self):
try:
self.execute()
        except Exception, e:
            import traceback  # used when formatting the error message below
errstr = r'''\
Error in milk.supervised.precluster.learn_codebook internal
Exception was: %s
Original Traceback:
%s
(Since this was run on a different process, this is not a real stack trace).
''' % (e, traceback.format_exc())
self.outq.put( ('error', errstr) )
class select_precluster(object):
def __init__(self, ks, base, normalise=True, rmax=16):
self.base = base
self.ks = ks
self.rmax = rmax
self.sample = 16
self.nfolds = 5
self.normalise = normalise
def train(self, features, labels, **kwargs):
from milk.supervised.gridsearch import gridminimise
c_features = np.concatenate([f for f,_ in features if len(f)])
c_features = c_features[::self.sample]
nprocs = parallel.get_procs(use_current=True)
tow = multiprocessing.Queue()
fromw = multiprocessing.Queue()
for k in self.ks:
for ri in xrange(self.rmax):
tow.put((k,ri))
for i in xrange(nprocs):
tow.put(('shutdown',None))
workers = [kmeans_cluster(c_features, tow, fromw) for i in xrange(nprocs)]
for w in workers:
if nprocs > 1:
w.start()
else:
w.execute()
try:
codebooks = [fromw.get() for i in xrange(len(self.ks)*self.rmax)]
finally:
tow.close()
tow.join_thread()
if nprocs > 1:
for w in workers:
w.join()
parallel.release_procs(len(workers), count_current=True)
base = codebook_learner(self.base)
base.normalise = self.normalise
if len(codebooks) > 1:
(best,) = gridminimise(base, features, labels, { 'codebook' : codebooks }, nfolds=self.nfolds)
_,codebook = best
else:
(codebook,) = codebooks
base.codebook = codebook
return base.train(features, labels)
<file_sep>/gui/breezy/circlearea1.py
"""
File: circlearea1.py
Inputs a radius of a circle and outputs its area.
"""
from breezypythongui import EasyFrame
import math
class CircleArea(EasyFrame):
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Circle Area")
# Label and field for the radius
self.addLabel(text = "Radius",
row = 0, column = 0)
self.radiusField = self.addFloatField(value = 0.0,
row = 0,
column = 1,
width = 10)
# Label and field for the area
self.addLabel(text = "Area",
row = 1, column = 0)
self.areaField = self.addFloatField(value = 0.0,
row = 1,
column = 1)
# The command button
self.button = self.addButton(text = "Compute",
row = 2, column = 0,
columnspan = 2,
command = self.computeArea)
# The event handler method for the button
def computeArea(self):
"""Inputs the radius, computes the area,
and outputs the area."""
radius = self.radiusField.getNumber()
area = radius ** 2 * math.pi
self.areaField.setNumber(area)
# Instantiate and pop up the window.
CircleArea().mainloop()
<file_sep>/PhidgetsPython/Python/PHSensor-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
#Phidget specific imports
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import AttachEventArgs, DetachEventArgs, ErrorEventArgs, PHChangeEventArgs
from Phidgets.Devices.PHSensor import PHSensor
#Create an PHSensor object
try:
phSensor = PHSensor()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def displayDeviceInfo():
print("|------------|----------------------------------|--------------|------------|")
print("|- Attached -|- Type -|- Serial No. -|- Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (phSensor.isAttached(), phSensor.getDeviceName(), phSensor.getSerialNum(), phSensor.getDeviceVersion()))
print("|------------|----------------------------------|--------------|------------|")
print("PH Sensitivity: %f" % (phSensor.getPHChangeTrigger()))
print("Current Potential: %f" % (phSensor.getPotential()))
#Event Handler Callback Functions
def phSensorAttached(e):
attached = e.device
print("PHSensor %i Attached!" % (attached.getSerialNum()))
def phSensorDetached(e):
detached = e.device
print("PHSensor %i Detached!" % (detached.getSerialNum()))
def phSensorError(e):
try:
source = e.device
print("PHSensor %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
def phSensorPHChanged(e):
source = e.device
potential = phSensor.getPotential()
print("PHSensor %i: PH: %f -- Potential: %f" % (source.getSerialNum(), e.PH, potential))
#Main Program Code
try:
phSensor.setOnAttachHandler(phSensorAttached)
phSensor.setOnDetachHandler(phSensorDetached)
phSensor.setOnErrorhandler(phSensorError)
phSensor.setOnPHChangeHandler(phSensorPHChanged)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
phSensor.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
phSensor.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
phSensor.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
displayDeviceInfo()
print("Increasing sensitivity to 10.00")
phSensor.setPHChangeTrigger(10.00)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
try:
phSensor.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/gui/breezy/colorchooserdemo.py
"""
File: colorchooserdemo.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
import tkinter.colorchooser
class ColorPicker(EasyFrame):
"""Displays the results of picking a color."""
def __init__(self):
"""Sets up the window and widgets."""
EasyFrame.__init__(self, title = "Color Chooser Demo")
# Labels and output fields
self.addLabel("R", row = 0, column = 0)
self.addLabel("G", row = 1, column = 0)
self.addLabel("B", row = 2, column = 0)
self.addLabel("Color", row = 3, column = 0)
self.r = self.addIntegerField(255, row = 0, column = 1)
self.g = self.addIntegerField(255, row = 1, column = 1)
self.b = self.addIntegerField(255, row = 2, column = 1)
self.hex = self.addTextField("#ffffff", row = 3,
column = 1, width = 10)
# Canvas
self.canvas = self.addCanvas(row = 0, column = 2,
rowspan = 4,
width = 50)
# Command button
self.addButton(text = "Choose color", row = 4, column = 0,
columnspan = 3, command = self.chooseColor)
# Event handling method
def chooseColor(self):
"""Pops up a color chooser and outputs the result."""
colorTuple = tkinter.colorchooser.askcolor()
if not colorTuple[0]: return
((r, g, b), hexString) = colorTuple
self.r.setNumber(int(r))
self.g.setNumber(int(g))
self.b.setNumber(int(b))
self.hex.setText(hexString)
self.canvas["background"] = hexString
# Instantiate and pop up the window.
if __name__ == "__main__":
ColorPicker().mainloop()
<file_sep>/navigation/terrain_id/terrain_training/train_terrain.py~
#!/usr/bin/env python
import easygui as eg
import sys
from img_processing_tools import *
from PIL import Image  # needed by the "Test Img" branch (Image.open below)
from PIL import ImageStat
import cv, cv2
import time
import mahotas
import numpy as np
import pickle
import csv
import milk
def snap_shot():
#capture from camera at location 0
now = time.time()
global webcam1
try:
#have to capture a few frames as it buffers a few frames..
for i in range (5):
ret, img = webcam1.read()
#print "time to capture 5 frames:", (time.time()) - now
#cv2.imwrite(filename, img)
#img1 = Image.open(filename)
#img1.thumbnail((320,240))
#img1.save(filename)
#print (time.time()) - now
return img
except:
print "could not grab webcam"
def find_features(img):
#print type(img), img.size, img.shape
#gray scale the image if neccessary
#if img.shape[2] == 3:
# img = img.mean(2)
#img = mahotas.imread(imname, as_grey=True)
features = mahotas.features.haralick(img).mean(0)
print type(features)
print features
#features = mahotas.features.lbp(img, 1, 8)
	# haralick/tas/zernike all return numpy arrays, so join them with
	# np.concatenate ('+' on arrays of different lengths would raise an error)
	features = np.concatenate((features, mahotas.features.tas(img)))
	features = np.concatenate((features, mahotas.features.zernike_moments(img, 2, degree=8)))
print 'features:', features, len(features), type(features[0])
return features
def classify(model, features):
return model.apply(features)
def grab_frame_from_video(video):
frame = video.read()
return frame
def predict_class(img):
try:
model = pickle.load( open( "robomow_ai_model.mdl", "rb" ) )
features = find_features(img)
classID = classify(model, features)
if classID == 1: answer = "Mowable"
if classID == 2: answer = "Non-Mowable"
print "predicted classID:", answer
eg.msgbox("predicted classID:"+answer)
except:
print "could not predict...bad data"
def save_data(features, classID):
data_filename = 'robomow_feature_data.csv'
###########################
print 'writing image features to file: ', data_filename
# delete data file and write header
#f_handle = open(data_filename, 'w')
#f_handle.write(str("classid, lbp, i3_histogram, rgb_histogram, sum_I3, sum2_I3, median_I3, avg_I3, var_I3, stddev_I3, rms_I3"))
#f_handle.write('\n')
#f_handle.close()
#write class data to file
f_handle = open(data_filename, 'a')
f_handle.write(str(classID))
f_handle.write(', ')
f_handle.close()
f_handle = open(data_filename, 'a')
for i in range(len(features)):
f_handle.write(str(features[i]))
f_handle.write(" ")
f_handle.write('\n')
f_handle.close()
def train_ai():
data = []
classID = []
features = []
features_temp_array = []
try:
data_filename = 'robomow_feature_data.csv'
		print 'reading features and classID: ', data_filename
f_handle = open(data_filename, 'r')
reader = csv.reader(f_handle)
#read data from file into arrays
for row in reader:
data.append(row)
for row in range(0, len(data)):
#print features[row][1]
classID.append(int(data[row][0]))
features_temp_array.append(data[row][1].split(" "))
#removes ending element which is a space
for x in range(len(features_temp_array)):
features_temp_array[x].pop()
features_temp_array[x].pop(0)
#convert all strings in array to numbers
temp_array = []
for x in range(len(features_temp_array)):
temp_array = [ float(s) for s in features_temp_array[x] ]
features.append(temp_array)
#make numpy arrays
#features = np.asarray(features)
print classID, features
learner = milk.defaultclassifier()
model = learner.train(features, classID)
pickle.dump( model, open( "robomow_ai_model.mdl", "wb" ) )
except:
print "could not retrain.. bad file"
return
if __name__=="__main__":
print "********************************************************************"
print "* if 1 argument: video file to process otherwise uses webcam *"
print "********************************************************************"
video = None
webcam1 = None
if len(sys.argv) > 1:
try:
video = cv2.VideoCapture(sys.argv[1])
#print video, sys.argv[1]
except:
print "******* Could not open image/video file *******"
print "Unexpected error:", sys.exc_info()[0]
#raise
sys.exit(-1)
else:
try:
webcam1 = cv2.VideoCapture(0)
except:
print "******* Could not open WEBCAM *******"
print "Unexpected error:", sys.exc_info()[0]
#raise
sys.exit(-1)
reply =""
eg.rootWindowPosition = "+100+100"
while True:
eg.rootWindowPosition = eg.rootWindowPosition
print 'reply=', reply
if reply == "": reply = "Next Frame"
if reply == "Mowable":
classID = "1"
features = find_features(img1)
save_data(features, classID)
if reply == "Non-Mowable":
classID = "2"
features = find_features(img1)
save_data(features, classID)
if reply == "Quit":
print "Quitting...."
sys.exit(-1)
if reply == "Predict":
print "AI predicting"
predict_class(img1)
if reply == "Retrain AI":
print "Retraining AI"
train_ai()
if reply == "Next Frame":
print "Acquiring new image.."
if video != None:
img1 = np.array(grab_frame_from_video(video)[1])
else:
img1 = snap_shot()
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
#print type(img1)
#img1.save()
if reply == "Fwd 10 Frames":
print "Forward 10 frames..."
for i in range(10):
img1 = np.array(grab_frame_from_video(video)[1])
img1 = array2image(img1)
img1 = img1.resize((320,240))
img1 = image2array(img1)
cv2.imwrite('temp.png', img1)
if reply == "Del AI File":
data_filename = 'robomow_feature_data.csv'
f_handle = open(data_filename, 'w')
f_handle.write('')
f_handle.close()
if reply == "Test Img":
#im = Image.fromarray(image)
#im.save("new.png")
img1 = Image.open('temp.png')
#img1.thumbnail((320,240))
path = "../../../../mobot_data/images/test_images/"
filename = str(time.time()) + ".jpg"
img1.save(path+filename)
if video != None:
reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'Test Img', 'Next Frame', 'Fwd 10 Frames', 'Predict', 'Retrain AI' , 'Del AI File', 'Quit'), image='temp.png', root=None)
else:
reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'Test Img','Next Frame', 'Predict', 'Retrain AI' , 'Del AI File', 'Quit'), image='temp.png', root=None)
<file_sep>/gui/web_gui/sitetest2/test.py~
#!/usr/bin/env python
def printtest():
print "test"
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_featureselection.py
import milk.supervised.featureselection
from milk.supervised.featureselection import select_n_best, rank_corr
import numpy as np
def test_sda():
from milksets import wine
features, labels = wine.load()
selected = milk.supervised.featureselection.sda(features,labels)
for sel in selected:
assert sel <= features.shape[1]
def test_linear_independent_features():
np.random.seed(122)
X3 = np.random.rand(20,3)
X = np.c_[X3,X3*2+np.random.rand(20,3)/20.,-X3*2+np.random.rand(20,3)/10.]
X2 = np.c_[X3,X3*2,-X3*3e-3]
assert len(milk.supervised.featureselection.linear_independent_features(X)) == 9
assert len(milk.supervised.featureselection.linear_independent_features(X2)) == 3
assert np.all (np.sort(milk.supervised.featureselection.linear_independent_features(X2) % 3) == np.arange(3))
def _rank(A,tol=1e-8):
s = np.linalg.svd(A,compute_uv=0)
return (s > tol).sum()
def _slow_linear_independent_features(featmatrix):
'''
Returns the indices of a set of linearly independent features (columns).
indices = linear_independent_features(features)
'''
independent = [0,]
rank = 1
feat = [featmatrix[:,0]]
for i,col in enumerate(featmatrix.T):
feat.append(col)
nrank = _rank(np.array(feat))
if nrank == rank:
del feat[-1]
else:
rank = nrank
independent.append(i)
return np.array(independent)
def test_select_n():
from milksets.wine import load
features,labels = load()
for n in (1,2,4,8):
select = select_n_best(n, rank_corr)
model = select.train(features,labels)
f = model.apply(features[3])
assert len(f) == n
def slow_rank_corr(features, labels):
features = np.asanyarray(features)
labels = np.asanyarray(labels)
binlabels = [(labels == ell) for ell in set(labels)]
rs = []
for feat in features.T:
ranks = feat.argsort()
corrcoefs = [np.corrcoef(ranks, labs)[0,1] for labs in binlabels]
corrcoefs = np.array(corrcoefs)
corrcoefs **= 2
rs.append(np.max(corrcoefs))
return np.array(rs)
def test_compare_rank_corr():
from milksets.wine import load
features,labels = load()
r0 = rank_corr(features,labels)
r1 = slow_rank_corr(features,labels)
assert np.allclose(r0,r1)
<file_sep>/lib/protox2d_class.py
#!/usr/bin/env python
"""
SUMMARY:
This library allows interface to ProxoX2D LIDAR.
Call with the string ID of the device.
Returns: array of 360 elements. The index is the angle, the array element value is distance in mm.
EXAMPLE USAGE:
lidar = protox2d('A1')
data = lidar.read_lidar()
print data
"""
import serial
#import threading
import time
#import os
from serial.tools import list_ports
#import sys
#sys.path.append( "../../libs/" )
from identify_device_on_ttyport import *
NUMBER_READINGS = 2
class protox2d():
def __init__(self, protox2dID):
self.lidar_data = None
self.id = protox2dID
self.x_angle = 0
self.y_angle = 0
self.distance_mm = 0
self.quality = 0
self.rpm = 0
# serial port
self.com_port = None
self.baudrate = 115200
self.ser = None
self._isConnected = False
self.data = []
#-------------connection variables
self.feed_num = 'protox2d.1'
self.connection = None
self.channel = None
#----------------------RUN
self.run()
def connect_to_lidar(self):
while self._isConnected == False:
print "protox2d: searching serial ports for protox2d..."
ports = find_usb_tty("10c4","ea60")
if (len(ports) > 0):
for port_to_try in ports:
print "protox2d: attempting connection to port:", port_to_try
try:
					#could actually get the serial number from serial.tools.list_ports and look specifically for that one
ser = serial.Serial(port_to_try, 115200)
if (ser.isOpen() == True):
print "Connected. Waiting 5 seconds for ProtoX2D to spin up."
time.sleep(5)
temp_data = ser.readline()
temp_data = temp_data.strip('\r\n')
data = temp_data.split(',')
print data
if data[0] == self.id :
#ser.write("a\n") # write a string
print "protox2d with id:", self.id, " connected to on serial port: ", port_to_try
self._isConnected = True
self.ser = ser
#time.sleep(.35)
break
except:
pass
time.sleep(.5)
if self._isConnected == False:
print "protox2d: protox2d sensor package not found!"
time.sleep(.5)
#print "returning", ser
return ser
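	# NOTE: connect() and publish() below would publish readings to a RabbitMQ
	# broker and need `import pika` (plus a running broker); they are currently
	# unused -- the call to connect() is commented out in run().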
def connect(self):
self.connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
self.channel = self.connection.channel()
#channel.queue_declare(queue='mobot_video1', auto_delete=True, arguments={'x-message-ttl':1000})
self.channel.exchange_declare(exchange='mobot_data_feed',type='topic')
def publish(self, data):
self.channel.basic_publish(exchange='mobot_data_feed',
routing_key=self.feed_num, body=data)
def run(self):
self.connect_to_lidar()
#if self.ser != None:
# self.connect()
# self.th = thread.start_new_thread(self.read_lidar, ())
def read_lidar(self):
temp_array = [0] * 360
num_of_readings = (360 * NUMBER_READINGS)
for reading in range(num_of_readings):
temp_data = self.ser.readline()
temp_data = temp_data.strip('\r\n')
lidar_data = temp_data.split(',')
try:
#self.id = lidar_data[0]
#self.x_angle = int(lidar_data[1])
self.y_angle = int(lidar_data[2])
self.distance_mm = int(lidar_data[3])
self.quality = int(lidar_data[4])
#self.rpm = int(lidar_data[5])
if self.quality > 0:
temp_array[self.y_angle] = self.distance_mm
except:
pass
#zero_distance = 0
#for i in range(360):
# if temp_array[i] == 0: zero_distance = zero_distance + 1
#print "angle: ", i, " distance_mm: ", temp_array[i]
#print "zero_distance:",zero_distance
return temp_array
if __name__== "__main__":
lidar = protox2d('A1')
while True:
#time.sleep(3)
data = lidar.read_lidar()
print data, len(data)
<file_sep>/telemetry/tcp_socket_throughput.py~
#!/usr/bin/env python
#
# tcp_socket_throughput.py
# TCP Socket Connection Throughput Tester
# <NAME> (www.goldb.org), 2008
import sys
import time
import socket
from threading import Thread
host = '127.0.0.1'
port = 1234
thread_count = 10 # concurrent sender agents
class Controller:
def __init__(self):
self.count_ref = []
def start(self):
for i in range(thread_count):
agent = Agent(self.count_ref)
agent.setDaemon(True)
agent.start()
print 'started %d threads' % (i + 1)
while True:
time.sleep(1)
line = 'connects/sec: %s' % len(self.count_ref)
self.count_ref[:] = []
sys.stdout.write(chr(0x08) * len(line))
sys.stdout.write(line)
class Agent(Thread):
def __init__(self, count_ref):
Thread.__init__(self)
self.count_ref = count_ref
def run(self):
while True:
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
s.connect((host, port))
s.shutdown(2)
s.close()
self.count_ref.append(1)
except:
print 'SOCKET ERROR\n'
if __name__ == '__main__':
controller = Controller()
controller.start()
<file_sep>/camera/track_object.py
##
## Based on camshiftdemo.c
##
import sys
from opencv import cv
from opencv import highgui
import opencv
from myro import *
go = False
image = None
hsv = None
hue = None
mask = None
backproject = None
hist = None
backproject_mode = 0
select_object = 0
track_object = 0
origin = cv.CvPoint()
selection = cv.CvRect()
track_window = cv.CvRect()
track_box = cv.CvBox2D()
track_comp = cv.CvConnectedComp()
hdims = 16
hranges = [[0, 180]]
vmin = 60
vmax = 256
smin = 65
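# HSV mask thresholds used when building the backprojection mask; they can be
# adjusted at runtime through the Vmin/Vmax/Smin trackbars created below.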
def on_mouse(event, x, y, flags, param):
global select_object, selection, image, origin, select_object, track_object
if image is None:
return
if image.origin:
y = image.height - y
if select_object:
selection.x = min(x,origin.x)
selection.y = min(y,origin.y)
selection.width = selection.x + cv.CV_IABS(x - origin.x)
selection.height = selection.y + cv.CV_IABS(y - origin.y)
selection.x = max( selection.x, 0 )
selection.y = max( selection.y, 0 )
selection.width = min( selection.width, image.width )
selection.height = min( selection.height, image.height )
selection.width -= selection.x
selection.height -= selection.y
if event == highgui.CV_EVENT_LBUTTONDOWN:
origin = cv.cvPoint(x,y)
selection = cv.cvRect(x,y,0,0)
select_object = 1
elif event == highgui.CV_EVENT_LBUTTONUP:
select_object = 0
if( selection.width > 0 and selection.height > 0 ):
track_object = -1
def out_values():
print "vmin =", vmin, "vmax =", vmax, "smin =", smin
def set_vmin(value):
global vmin
vmin = value
out_values()
def set_vmax(value):
global vmax
vmax = value
out_values()
def set_smin(value):
global smin
smin = value
out_values()
def hsv2rgb (hue):
# convert the hue value to the corresponding rgb value
sector_data = [[0, 2, 1],
[1, 2, 0],
[1, 0, 2],
[2, 0, 1],
[2, 1, 0],
[0, 1, 2]]
hue *= 0.1 / 3
sector = cv.cvFloor (hue)
p = cv.cvRound (255 * (hue - sector))
if sector & 1:
p ^= 255
rgb = {}
rgb [sector_data [sector][0]] = 255
rgb [sector_data [sector][1]] = 0
rgb [sector_data [sector][2]] = p
return cv.cvScalar (rgb [2], rgb [1], rgb [0], 0)
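# Note: hsv2rgb mirrors the helper from OpenCV's camshiftdemo.c (it maps a hue
# value to a BGR cv.cvScalar for drawing); this trimmed-down script never calls
# it, since the track ellipse below is drawn in plain red.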
if __name__ == '__main__':
# use the webcam
capture = highgui.cvCreateCameraCapture (0)
# check that capture device is OK
if not capture:
print "Error opening capture device"
sys.exit (1)
# display a small howto use it
print "Hot keys: \n"
print "\tESC - quit the program\n"
print "\tc - stop the tracking\n"
print "\tb - switch to/from backprojection view\n"
print "To initialize tracking, select the object with mouse\n"
# first, create the necessary windows
highgui.cvNamedWindow ('VisualJoystick', highgui.CV_WINDOW_AUTOSIZE)
# register the mouse callback
highgui.cvSetMouseCallback ('VisualJoystick', on_mouse, None)
highgui.cvCreateTrackbar( "Vmin", "VisualJoystick", vmin, 256, set_vmin)
highgui.cvCreateTrackbar( "Vmax", "VisualJoystick", vmax, 256, set_vmax)
highgui.cvCreateTrackbar( "Smin", "VisualJoystick", smin, 256, set_smin)
if go:
init()
print getBattery()
while True:
frame = highgui.cvQueryFrame (capture)
if frame is None:
# no image captured... end the processing
break
if image is None:
# create the images we need
image = cv.cvCreateImage (cv.cvGetSize (frame), 8, 3)
image.origin = frame.origin
hsv = cv.cvCreateImage( cv.cvGetSize(frame), 8, 3 )
hue = cv.cvCreateImage( cv.cvGetSize(frame), 8, 1 )
mask = cv.cvCreateImage( cv.cvGetSize(frame), 8, 1 )
backproject = cv.cvCreateImage( cv.cvGetSize(frame), 8, 1 )
hist = cv.cvCreateHist( [hdims], cv.CV_HIST_ARRAY, hranges, 1 )
# flip the image
cv.cvFlip (frame, image, 1)
cv.cvCvtColor( image, hsv, cv.CV_BGR2HSV)
cv.cvLine(image, cv.cvPoint(0, image.height/2), cv.cvPoint(image.width, image.height/2),
cv.CV_RGB(0,255,0), 2, 8, 0 )
cv.cvLine(image, cv.cvPoint(image.width/2, 0), cv.cvPoint(image.width/2, image.height),
cv.CV_RGB(0,255,0), 2, 8, 0 )
if track_object:
_vmin = vmin
_vmax = vmax
cv.cvInRangeS( hsv,
cv.cvScalar( 0, smin,min(_vmin,_vmax),0),
cv.cvScalar(180, 256, max(_vmin,_vmax),0),
mask );
cv.cvSplit( hsv, hue, None, None, None)
if track_object < 0:
max_val = 0.0
subhue = cv.cvGetSubRect(hue, selection)
submask = cv.cvGetSubRect(mask, selection)
cv.cvCalcHist( subhue, hist, 0, submask )
# extract the min and max value of the histogram
min_val, max_val, min_idx, max_idx = cv.cvGetMinMaxHistValue (hist)
if (max_val):
cv.cvConvertScale( hist.bins, hist.bins, 255.0 / max_val, 0)
else:
cv.cvConvertScale( hist.bins, hist.bins, 0.0, 0 )
track_window = selection
track_object = 1
cv.cvCalcArrBackProject( hue, backproject, hist )
cv.cvAnd( backproject, mask, backproject, 0 )
cv.cvCamShift( backproject, track_window,
cv.cvTermCriteria( cv.CV_TERMCRIT_EPS | cv.CV_TERMCRIT_ITER, 10, 1 ),
track_comp, track_box )
track_window = track_comp.rect
if backproject_mode:
cv.cvCvtColor( backproject, image, cv.CV_GRAY2BGR )
if not image.origin:
track_box.angle = -track_box.angle
cv.cvEllipseBox(image, track_box, cv.CV_RGB(255,0,0), 3, cv.CV_AA, 0)
if (track_box.size.width > 10 or track_box.size.height > 10):
rotate = ( (image.width/2.0) - track_box.center.x) / (image.width/2.0)
translate = ((image.height/2.0) - track_box.center.y) / (image.height/2.0)
#print "rotate =", rotate, "translate =", translate
if go:
move(translate, rotate)
if select_object and selection.width > 0 and selection.height > 0:
subimg = cv.cvGetSubRect(image, selection)
            cv.cvXorS( subimg, cv.cvScalarAll(255), subimg, 0 )
highgui.cvShowImage( "VisualJoystick", image )
c = highgui.cvWaitKey(10)
if c == '\x1b':
break
elif c == 'b':
backproject_mode ^= 1
elif c == 'c':
track_object = 0
            # (the original camshiftdemo cleared its histogram image here; this
            #  script never creates one, so there is nothing to reset)
if go:
stop()
if go:
stop()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/measures/__init__.py
from measures import accuracy, waccuracy, zero_one_loss, confusion_matrix, bayesian_significance
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/_svm.cpp
// Copyright (C) 2008, <NAME> <<EMAIL>>
// Copyright (c) 2000-2008 <NAME> and <NAME> (LIBSVM Code)
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.
#include <cassert>
#include <math.h>
#include <iostream>
#include <stdio.h>
#include <list>
#include <memory>
#include <cmath>
#include <vector>
//#include <debug/vector>
//#include <debug/list>
//using __gnu_debug::vector;
//using __gnu_debug::list;
using std::vector;
using std::list;
extern "C" {
#include <Python.h>
#include <numpy/ndarrayobject.h>
}
namespace {
const double INF = HUGE_VAL;
// This is a boost function
// Copied here for convenience.
template <typename Iter>
inline Iter prior(Iter it) {
--it;
return it;
}
using std::max;
using std::min;
template <typename T>
T median(T bot, T mid, T hi) {
if (mid < bot) return bot;
if (mid > hi) return hi;
return mid;
}
struct SMO_Exception {
SMO_Exception(const char* msg): msg(msg) { }
const char* msg;
};
struct Python_Exception { };
void check_for_interrupts() {
PyErr_CheckSignals();
if (PyErr_Occurred()) {
throw Python_Exception();
}
}
class KernelComputation {
public:
virtual ~KernelComputation() { }
virtual double do_kernel(int i1, int i2) const = 0;
};
class KernelCache {
public:
KernelCache(std::auto_ptr<KernelComputation> computation_, int N, int cache_nr_doubles);
virtual ~KernelCache();
double* get_kline(int idx, int size = -1);
double* get_diag();
double kernel_apply(int i1, int i2) const;
protected:
virtual double do_kernel(int i1, int i2) const {
return computation_->do_kernel(i1,i2);
}
const int N_;
private:
std::auto_ptr<KernelComputation> computation_;
double ** cache_;
double * dcache_;
int cache_free_;
list<int> cache_lru;
vector<list<int>::iterator> cache_iter;
};
KernelCache::KernelCache(std::auto_ptr<KernelComputation> computation, int N, int cache_nr_floats):
N_(N),
computation_(computation),
dcache_(0) {
cache_ = new double*[N_];
for (int i = 0; i != N_; ++i) cache_[i] = 0;
cache_free_ = (cache_nr_floats/N_);
cache_iter.resize(N_,cache_lru.end());
}
KernelCache::~KernelCache() {
for (int i = 0; i != N_; ++i) delete [] cache_[i];
delete [] cache_;
delete [] dcache_;
}
double* KernelCache::get_kline(int idx, int s) {
if (s == -1) s = N_;
assert (s <= N_);
if (!cache_[idx]) {
if (!cache_free_) {
int to_remove = cache_lru.front();
cache_lru.pop_front();
cache_[idx] = cache_[to_remove];
cache_[to_remove] = 0;
} else {
cache_[idx] = new double[N_];
--cache_free_;
}
for (int i = 0; i != N_; ++i) {
if (i == idx && dcache_) cache_[i][i] = dcache_[i];
else if(i != idx && cache_[i]) cache_[idx][i] = cache_[i][idx];
else cache_[idx][i] = do_kernel(idx,i);
}
} else {
cache_lru.erase(cache_iter[idx]);
}
cache_lru.push_back(idx);
cache_iter[idx]=prior(cache_lru.end());
return cache_[idx];
}
double* KernelCache::get_diag() {
if (!dcache_) {
dcache_ = new double[N_];
for (int i = 0; i != N_; ++i) {
if (cache_[i]) dcache_[i] = cache_[i][i];
else dcache_[i] = do_kernel(i,i);
}
}
return dcache_;
}
/***
* Returns the value of Kernel(X_i1, X_i2).
* Uses the cache if possible, but does not update it.
*/
double KernelCache::kernel_apply(int i1, int i2) const {
if (cache_[i1]) {
assert(do_kernel(i1,i2) == cache_[i1][i2]);
return cache_[i1][i2];
}
if (cache_[i2]) {
assert(do_kernel(i1,i2) == cache_[i2][i1]);
return cache_[i2][i1];
}
if (i1 == i2 && dcache_) {
assert(do_kernel(i1,i2) == dcache_[i1]);
return dcache_[i1];
}
return do_kernel(i1,i2);
}
class PyKernel : public KernelComputation {
PyKernel(const PyKernel&);
public:
PyKernel(PyObject* X, PyObject* kernel, int N);
~PyKernel();
protected:
virtual double do_kernel(int i1, int i2) const;
private:
PyObject* const X_;
PyObject* const pykernel_;
const int N_;
};
PyKernel::PyKernel(PyObject* X, PyObject* kernel, int N):
X_(X),
pykernel_(kernel),
N_(N)
{
Py_INCREF(X);
Py_INCREF(kernel);
}
PyKernel::~PyKernel() {
Py_DECREF(X_);
Py_DECREF(pykernel_);
}
double PyKernel::do_kernel(int i1, int i2) const {
assert(i1 < N_);
assert(i2 < N_);
PyObject* obj1 = PySequence_GetItem(X_,i1);
PyObject* obj2 = PySequence_GetItem(X_,i2);
if (!obj1 || !obj2) {
Py_XDECREF(obj1);
Py_XDECREF(obj2);
throw SMO_Exception("svm.eval_SMO: Unable to access element in X array");
}
PyObject* arglist = Py_BuildValue("(OO)",obj1,obj2);
PyObject* result = PyEval_CallObject(pykernel_,arglist);
Py_XDECREF(obj1);
Py_XDECREF(obj2);
Py_DECREF(arglist);
if (!result) {
check_for_interrupts();
throw SMO_Exception("svm.eval_SMO: Unable to call kernel");
}
double val = PyFloat_AsDouble(result);
Py_DECREF(result);
return val;
}
class RBFKernel : public KernelComputation {
public:
RBFKernel(PyArrayObject* X, double gamma);
~RBFKernel();
protected:
virtual double do_kernel(int i1, int i2) const;
private:
PyArrayObject* X_;
double ngamma_;
const int N_;
const int N1_;
};
RBFKernel::RBFKernel(PyArrayObject* X, double gamma):
X_(reinterpret_cast<PyArrayObject*>(X)),
ngamma_(-1./gamma),
N_(PyArray_DIM(X,0)),
N1_(PyArray_DIM(X,1))
{
if (!PyArray_Check(X)) {
throw SMO_Exception("RBF Kernel used, but not with numpy array.");
}
if (!PyArray_ISCARRAY(X)) {
throw SMO_Exception("RBF Kernel used but not with CARRAY.");
}
Py_INCREF(X);
}
RBFKernel::~RBFKernel() {
Py_DECREF(X_);
}
double RBFKernel::do_kernel(int i1, int i2) const {
assert(i1 < N_);
assert(i2 < N_);
const double* data1 = static_cast<const double*>(PyArray_GETPTR1(X_,i1));
const double* data2 = static_cast<const double*>(PyArray_GETPTR1(X_,i2));
double sumdiff = 0.;
for (int i = 0; i != N1_; ++i) {
double diff = data1[i]-data2[i];
sumdiff += diff * diff;
}
sumdiff *= ngamma_;
double res = std::exp(sumdiff);
return res;
}
class DotKernel : public KernelComputation {
public:
explicit DotKernel(PyArrayObject* X);
~DotKernel();
protected:
virtual double do_kernel(int i1, int i2) const;
private:
PyArrayObject* X_;
const int N_;
const int N1_;
};
DotKernel::DotKernel(PyArrayObject* X):
X_(reinterpret_cast<PyArrayObject*>(X)),
N_(PyArray_DIM(X,0)),
N1_(PyArray_DIM(X,1))
{
if (!PyArray_Check(X)) {
throw SMO_Exception("Dot Kernel used, but not with numpy array.");
}
if (!PyArray_ISCARRAY(X)) {
throw SMO_Exception("Dot Kernel used but not with CARRAY.");
}
Py_INCREF(X);
}
DotKernel::~DotKernel() {
Py_DECREF(X_);
}
double DotKernel::do_kernel(int i1, int i2) const {
assert(i1 < N_);
assert(i2 < N_);
const double* data1 = static_cast<const double*>(PyArray_GETPTR1(X_,i1));
const double* data2 = static_cast<const double*>(PyArray_GETPTR1(X_,i2));
double dotsum = 0.;
for (int i = 0; i != N1_; ++i) {
dotsum += data1[i] * data2[i];
}
return dotsum;
}
class PrecomputedKernel : public KernelComputation {
public:
PrecomputedKernel(PyArrayObject* X);
~PrecomputedKernel();
protected:
virtual double do_kernel(int i1, int i2) const;
private:
PyArrayObject* X_;
};
PrecomputedKernel::PrecomputedKernel(PyArrayObject* X):
X_(X)
{
if (!PyArray_Check(X)) {
throw SMO_Exception("PrecomputedKernel used, but not with numpy array.");
}
if (!PyArray_ISCARRAY(X)) {
throw SMO_Exception("Precomputed used but not with CARRAY.");
}
Py_INCREF(X);
}
PrecomputedKernel::~PrecomputedKernel() {
Py_DECREF(X_);
}
double PrecomputedKernel::do_kernel(int i1, int i2) const {
const double* data = static_cast<const double*>(PyArray_GETPTR2(X_,i1,i2));
return *data;
}
std::auto_ptr<KernelComputation> get_kernel(PyObject* X, PyObject* kernel) {
typedef std::auto_ptr<KernelComputation> res_type;
if (PyCallable_Check(kernel)) return res_type(new PyKernel(X, kernel, PySequence_Length(X)));
if (!PyTuple_Check(kernel) || PyTuple_Size(kernel) != 2) throw SMO_Exception("Cannot parse kernel.");
PyObject* type = PyTuple_GET_ITEM(kernel,0);
PyObject* arg = PyTuple_GET_ITEM(kernel,1);
if (!PyInt_Check(type) || !PyFloat_Check(arg)) throw SMO_Exception("Cannot parse kernel (wrong types)");
long type_nr = PyInt_AsLong(type);
double arg_value = PyFloat_AsDouble(arg);
switch (type_nr) {
case 0:
return res_type(new RBFKernel(reinterpret_cast<PyArrayObject*>(X), arg_value));
case 1:
return res_type(new PrecomputedKernel(reinterpret_cast<PyArrayObject*>(X)));
case 2:
return res_type(new DotKernel(reinterpret_cast<PyArrayObject*>(X)));
default:
throw SMO_Exception("Unknown kernel type!");
}
}
class LIBSVM_KernelCache : public KernelCache {
public:
LIBSVM_KernelCache(const int* Y, std::auto_ptr<KernelComputation> kernel, int N, int cache_nr_doubles)
:KernelCache(kernel,N,cache_nr_doubles),
Y_(Y)
{ }
private:
double do_kernel(int i, int j) const {
double res = KernelCache::do_kernel(i,j);
return res * Y_[i] * Y_[j];
}
const int* Y_;
};
class SMO {
public:
SMO(PyObject* X, int* Y, double* Alphas, double& b, double C, int N, PyObject* kernel, double eps, double tol, int cache_size):
Alphas(Alphas),
Y(Y),
b(b),
C(C),
N(N),
cache_(get_kernel(X,kernel),N,cache_size),
eps(eps),
tol(tol) { }
void optimise();
double apply(int) const;
bool take_step(int,int);
bool examine_example(int);
double get_error(int) const;
private:
double* Alphas;
int* Y;
double& b;
double C;
int N;
mutable KernelCache cache_;
const double eps;
const double tol;
};
double SMO::get_error(int j) const {
return apply(j) - Y[j];
}
double SMO::apply(int j) const {
double sum = -b;
double* Kernel_Line = cache_.get_kline(j);
for (int i = 0; i != N; ++i) {
if (Alphas[i] != C) {
sum += Y[i] * Alphas[i] * Kernel_Line[i];
}
}
//std::cout << "SMO::apply( " << j << " ): " << sum << '\n';
return sum;
}
bool SMO::take_step(int i1, int i2) {
//std::cout << "take_step( " << i1 << ", " << i2 << " );\n";
if (i1 == i2) return false;
const double alpha1 = Alphas[i1];
const double alpha2 = Alphas[i2];
const int y1 = Y[i1];
const int y2 = Y[i2];
double L, H;
if (y1 != y2) {
L = max(0.,alpha2-alpha1);
H = min(C,C+alpha2-alpha1);
} else {
L = max(0.,alpha1+alpha2-C);
H = min(C,alpha1+alpha2);
}
if (L == H) return false;
const int s = y1*y2;
const double E1 = get_error(i1);
const double E2 = get_error(i2);
const double k11 = cache_.kernel_apply(i1,i1);
const double k12 = cache_.kernel_apply(i1,i2);
const double k22 = cache_.kernel_apply(i2,i2);
const double eta = 2*k12-k11-k22;
double a1,a2;
if (eta < 0) {
a2 = alpha2-y2*(E1-E2)/eta;
a2 = median(L,a2,H);
} else {
double gamma = alpha1+s*alpha2; // Eq. (12.22)
double v1=E1+y1+b-y1*alpha1*k11-y2*alpha2*k12; // Eq. (12.21) // Note that f(x1) = E1 + y1
double v2=E2+y2+b-y1*alpha1*k12-y2*alpha2*k22; // Eq. (12.21)
double L_obj = gamma-s*L+L-.5*k11*(gamma-s*L)*(gamma-s*L)-.5*k22*L*L-s*k12*(gamma-s*L)*L-y1*(gamma-s*L)*v1-y2*L*v2; // # + W_const # Eq. (12.23)
double H_obj = gamma-s*H+H-.5*k11*(gamma-s*H)*(gamma-s*L)-.5*k22*H*H-s*k12*(gamma-s*H)*H-y1*(gamma-s*H)*v1-y2*H*v2; // # + W_const # Eq. (12.23)
if (L_obj > H_obj + eps) {
a2 = L;
} else if (L_obj < H_obj - eps) {
a2 = H;
} else {
a2 = alpha2;
}
}
if (a2 < tol) a2 = 0;
else if (a2 > C-tol) a2 = C;
if (std::abs(a2-alpha2) < eps*(a2+alpha2+eps)) return false;
a1 = alpha1+s*(alpha2-a2);
if (a1 < tol) a1 = 0;
if (a1 > C-tol) a1 = C;
// update everything
Alphas[i1]=a1;
Alphas[i2]=a2;
double b1 = E1 + Y[i1]*(a1-alpha1)*k11+Y[i2]*(a2-alpha2)*k12+b; // Eq. (12.9)
double b2 = E2 + Y[i1]*(a1-alpha1)*k12+Y[i2]*(a2-alpha2)*k22+b; // Eq. (12.10)
const double new_b = (b1+b2)/2.;
/*
#for i in xrange(N):
# if Alphas[i] in (0,C):
# continue
# elif i == i1 or i == i2:
# E[i] = 0
# else:
# E[i] += y1*(a1-alpha1)*kernel_apply(i1,i)+y2*(a2-alpha2)*kernel_apply(i2,i) + (b-new_b) # Eq. (12.11)
#E[i1]=f_at(i1)-y1
#E[i2]=f_at(i2)-y2
*/
b = new_b;
return true;
}
bool SMO::examine_example(int i2) {
//std::cout << "examine_example( " << i2 << " ) " << std::endl;
const int y2 = Y[i2];
const double alpha2 = Alphas[i2];
const double E2 = get_error(i2);
const double r2 = E2 * y2;
//#print 'alpha2', alpha2, 'E2', E2, 'r2', r2
if ( ( (r2 < -tol) && (alpha2 < C) ) ||
( (r2 > tol) && (alpha2 > 0) )){
int best_i1 = -1;
double bestE = -1;
for (int i = 0; i != N; ++i) {
if (Alphas[i] != 0 && Alphas[i] != C) {
double dE = E2-get_error(i);
if (dE < 0.) dE = -dE;
if (dE > bestE) {
bestE=dE;
best_i1 = i;
}
}
}
if (best_i1 != -1 && take_step(best_i1,i2)) return true;
for (int i1 = 0; i1 != N; ++i1) {
if (Alphas[i1] && Alphas[i1] != C && take_step(i1,i2)) return true;
}
for (int i1 = 0; i1 != N; ++i1) {
if (take_step(i1,i2)) return true;
}
}
return false;
}
void SMO::optimise() {
b = 0;
for (int i = 0; i != N; ++i) Alphas[i] = 0;
int changed = 0;
bool examineAll = true;
//int iter = 0;
while (changed || examineAll) {
//std::cout << "SMO::optimize loop: " << iter++ << "\n";
check_for_interrupts();
changed = 0;
for (int i = 0; i != N; ++i) {
if (examineAll || (Alphas[i] != 0 && Alphas[i] != C)) {
changed += examine_example(i);
}
}
if (examineAll) examineAll = false;
else if (!changed) examineAll = true;
}
}
void assert_type_contiguous(PyArrayObject* array,int type) {
if (!PyArray_Check(array) ||
PyArray_TYPE(array) != type ||
!PyArray_ISCONTIGUOUS(array)) {
throw SMO_Exception("Arguments to eval_(SMO|LIBSVM) don't conform to expectation. Are you calling this directly? This is an internal function!");
}
}
PyObject* eval_SMO(PyObject* self, PyObject* args) {
try {
PyObject* X;
PyArrayObject* Y;
PyArrayObject* Alphas0;
PyArrayObject* params;
PyObject* kernel;
int cache_size;
if (!PyArg_ParseTuple(args, "OOOOOi", &X, &Y, &Alphas0, ¶ms,&kernel,&cache_size)) {
const char* errmsg = "Arguments were not what was expected for eval_SMO.\n"
"This is an internal function: Do not call directly unless you know what you're doing.\n";
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
assert_type_contiguous(Y,NPY_INT32);
assert_type_contiguous(Alphas0,NPY_DOUBLE);
assert_type_contiguous(params,NPY_DOUBLE);
if (PyArray_DIM(params,0) < 4) throw SMO_Exception("eval_SMO: Too few parameters");
int * Yv = static_cast<int*>(PyArray_DATA(Y));
double* Alphas = static_cast<double*>(PyArray_DATA(Alphas0));
unsigned N = PyArray_DIM(Y,0);
double b = *static_cast<double*>(PyArray_GETPTR1(params,0));
double C = *static_cast<double*>(PyArray_GETPTR1(params,1));
double eps = *static_cast<double*>(PyArray_GETPTR1(params,2));
double tol = *static_cast<double*>(PyArray_GETPTR1(params,3));
SMO optimiser(X,Yv,Alphas,b,C,N,kernel,eps,tol,cache_size);
optimiser.optimise();
*static_cast<double*>(PyArray_GETPTR1(params,0)) = b; // Write back b
Py_RETURN_NONE;
} catch (const Python_Exception&) {
// if Python_Exception was thrown, then PyErr is already set.
return 0;
} catch (const SMO_Exception& exc) {
PyErr_SetString(PyExc_RuntimeError,exc.msg);
return 0;
} catch (...) {
PyErr_SetString(PyExc_RuntimeError,"Some sort of exception in eval_SMO.");
return 0;
}
}
// The code for LIBSVM_Solver is taken from LIBSVM and adapted to work well in milk
// Copyright (c) 2000-2008 <NAME> and <NAME>
// Changes were made to make it more similar to our formulation.
// An SMO algorithm in Fan et al., JMLR 6(2005), p. 1889--1918
// Solves:
//
// min 0.5(\alpha^T Q \alpha) + p^T \alpha
//
// y^T \alpha = \delta
// y_i = +1 or -1
// 0 <= alpha_i <= Cp for y_i = 1
// 0 <= alpha_i <= Cn for y_i = -1
//
// Given:
//
// Q, p, y, Cp, Cn, and an initial feasible point \alpha
// l is the size of vectors and matrices
// eps is the stopping tolerance
//
// solution will be put in \alpha, objective value will be put in obj
//
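// Orientation note (not part of the original LIBSVM comment): in the usual C-SVC dual,
// p is the vector of all -1s, \delta = 0 and Cp = Cn = C, so the problem above reduces to
//
//     min 0.5(\alpha^T Q \alpha) - \sum_i \alpha_i,   Q_ij = y_i y_j K(x_i, x_j)
//     s.t. y^T \alpha = 0,   0 <= \alpha_i <= C
//
// which appears to be the case eval_LIBSVM below sets up (a single C and a caller-supplied p).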
#define info printf
void info_flush() { }
class LIBSVM_Solver {
public:
LIBSVM_Solver(PyObject* X, int* Y, double* Alphas, double* p, double& b, double C, int N, PyObject* kernel, double eps, double tol, int cache_size, bool shrinking):
Alphas(Alphas),
Y(Y),
b(b),
C(C),
N(N),
cache_(Y,get_kernel(X,kernel),N,cache_size),
eps(eps),
tol(tol),
alpha_status(N),
active_size(N),
active_set(N),
p(p),
shrinking(shrinking),
tau(eps)
{ }
virtual ~LIBSVM_Solver() { }
struct SolutionInfo {
double obj;
double rho;
double upper_bound_p;
double upper_bound_n;
double r; // for LIBSVM_Solver_NU
};
void optimise();
protected:
double* Alphas;
int* Y;
double& b;
double C;
const int N;
LIBSVM_KernelCache cache_;
const double eps;
const double tol;
vector<double> G; // gradient of objective function
enum alpha_status_e { lower_bound, upper_bound, free};
vector<alpha_status_e> alpha_status;
int active_size;
vector<int> active_set;
double *p;
vector<double> G_bar; // gradient, if we treat free variables as 0
bool unshrinked; // XXX
bool shrinking;
double tau;
double get_C(int i) const {
return C;
//return (Y[i] > 0)? Cp : Cn;
}
void update_alpha_status(int i)
{
if(Alphas[i] >= get_C(i))
alpha_status[i] = upper_bound;
else if(Alphas[i] <= 0)
alpha_status[i] = lower_bound;
else alpha_status[i] = free;
}
bool is_upper_bound(int i) const { return alpha_status[i] == upper_bound; }
bool is_lower_bound(int i) const { return alpha_status[i] == lower_bound; }
bool is_free(int i) const { return alpha_status[i] == free; }
void swap_index(int i, int j);
void reconstruct_gradient();
virtual bool select_working_set(int &i, int &j);
virtual double calculate_rho();
virtual void do_shrinking();
private:
bool be_shrunken(int i, double Gmax1, double Gmax2);
void print_status() const;
};
void LIBSVM_Solver::swap_index(int i, int j)
{
// We *do not* swap in the cache or kernel
    // Therefore *all accesses* to the cache or kernel
// must be of the form cache_.get_kernel(active_set[i])
// instead of cache_.get_kernel(i)!
std::swap(Y[i],Y[j]);
std::swap(G[i],G[j]);
std::swap(alpha_status[i],alpha_status[j]);
std::swap(Alphas[i],Alphas[j]);
std::swap(p[i],p[j]);
std::swap(active_set[i],active_set[j]);
std::swap(G_bar[i],G_bar[j]);
}
void LIBSVM_Solver::reconstruct_gradient()
{
// reconstruct inactive elements of G from G_bar and free variables
if(active_size == N) return;
for(int i=active_size;i<N;++i)
G[i] = G_bar[i] + p[i];
for(int i=0;i<active_size;++i) {
if(is_free(i))
{
const double *Q_i = cache_.get_kline(active_set[i],N);
const double alpha_i = Alphas[i];
for(int j=active_size;j<N;j++)
G[j] += alpha_i * Q_i[active_set[j]];
}
}
}
void LIBSVM_Solver::optimise() {
// INITIALISE:
unshrinked = false;
for(int i=0;i<N;i++) {
update_alpha_status(i);
active_set[i] = i;
}
active_size = N;
// initialize gradient
G.resize(N);
G_bar.resize(N);
for(int i=0;i<N;++i) {
G[i] = p[i];
G_bar[i] = 0;
}
for(int i=0;i<N;i++) {
if(!is_lower_bound(i))
{
const double *Q_i= cache_.get_kline(active_set[i]);
const double alpha_i = Alphas[i];
for(int j=0;j<N;j++) G[j] += alpha_i*Q_i[j];
if(is_upper_bound(i)) {
for(int j=0;j<N;j++) G_bar[j] += get_C(i) * Q_i[j];
}
}
}
//MAIN LOOP
int counter = min(N,1000)+1;
const int max_iters = 10*1000;
for (int iter = 0; iter != max_iters; ++iter) {
if (!(iter % 16)) check_for_interrupts();
// show progress and do shrinking
if(--counter == 0) {
counter = min(N,1000);
if(shrinking) do_shrinking();
//info("."); info_flush();
}
int i,j;
if(select_working_set(i,j)) {
// reconstruct the whole gradient
reconstruct_gradient();
// reset active set size and check
active_size = N;
//info("*"); info_flush();
if(select_working_set(i,j)) {
break;
}
else counter = 1; // do shrinking next iteration
}
assert((i >= 0) && (i < active_size));
assert((j >= 0) && (j < active_size));
// update alpha[i] and alpha[j], handle bounds carefully
const double *Q_i = cache_.get_kline(active_set[i], active_size);
const double *Q_j = cache_.get_kline(active_set[j], active_size);
const double C_i = get_C(i);
const double C_j = get_C(j);
const double old_alpha_i = Alphas[i];
const double old_alpha_j = Alphas[j];
if(Y[i]!=Y[j]) {
double quad_coef = Q_i[active_set[i]]+Q_j[active_set[j]]+2.0*Q_i[active_set[j]];
if (quad_coef <= 0) quad_coef = tau;
const double delta = (-G[i]-G[j])/quad_coef;
const double diff = Alphas[i] - Alphas[j];
Alphas[i] += delta;
Alphas[j] += delta;
if(diff > 0) {
if(Alphas[j] < 0) {
Alphas[j] = 0;
Alphas[i] = diff;
}
} else {
if(Alphas[i] < 0) {
Alphas[i] = 0;
Alphas[j] = -diff;
}
}
if(diff > C_i - C_j) {
if(Alphas[i] > C_i) {
Alphas[i] = C_i;
Alphas[j] = C_i - diff;
}
} else {
if(Alphas[j] > C_j) {
Alphas[j] = C_j;
Alphas[i] = C_j + diff;
}
}
//print_status();
} else {
double quad_coef = Q_i[active_set[i]]+Q_j[active_set[j]]-2*Q_i[active_set[j]];
if (quad_coef <= 0) quad_coef = tau;
const double delta = (G[i]-G[j])/quad_coef;
double sum = Alphas[i] + Alphas[j];
Alphas[i] -= delta;
Alphas[j] += delta;
if(sum > C_i) {
if(Alphas[i] > C_i) {
Alphas[i] = C_i;
Alphas[j] = sum - C_i;
}
} else {
if(Alphas[j] < 0) {
Alphas[j] = 0;
Alphas[i] = sum;
}
}
if(sum > C_j) {
if(Alphas[j] > C_j) {
Alphas[j] = C_j;
Alphas[i] = sum - C_j;
}
} else {
if(Alphas[i] < 0) {
Alphas[i] = 0;
Alphas[j] = sum;
}
}
}
// update G
const double delta_Alphas_i = Alphas[i] - old_alpha_i;
const double delta_Alphas_j = Alphas[j] - old_alpha_j;
// print_status();
for(int k=0;k<active_size;++k) {
G[k] += Q_i[active_set[k]]*delta_Alphas_i + Q_j[active_set[k]]*delta_Alphas_j;
}
// update Alphas_status and G_bar
const bool ui = is_upper_bound(i);
const bool uj = is_upper_bound(j);
update_alpha_status(i);
update_alpha_status(j);
if(ui != is_upper_bound(i)) {
Q_i = cache_.get_kline(active_set[i], N);
if(ui) {
for(int k=0;k<N;k++)
G_bar[k] -= C_i * Q_i[active_set[k]];
} else {
for(int k=0;k<N;k++)
G_bar[k] += C_i * Q_i[active_set[k]];
}
}
if(uj != is_upper_bound(j)) {
Q_j = cache_.get_kline(active_set[j], N);
if(uj) {
for(int k=0;k<N;k++)
G_bar[k] -= C_j * Q_j[active_set[k]];
} else {
for(int k=0;k<N;k++)
G_bar[k] += C_j * Q_j[active_set[k]];
}
}
}
// CLEANUP
// calculate rho
//si->rho = calculate_rho();
b = calculate_rho();
// calculate objective value
double v = 0;
for(int i=0;i<N;i++) {
v += Alphas[i] * (G[i] + p[i]);
}
//si->obj = v/2;
// put back the solution
for (int i = 0; i != N; ++i) {
while (active_set[i] != i) {
int j = active_set[i];
std::swap(Y[i],Y[j]); // It's not polite to clobber Y, so put it back
std::swap(Alphas[i],Alphas[j]);
std::swap(active_set[i],active_set[j]);
}
}
//si->upper_bound_p = Cp;
//si->upper_bound_n = Cn;
}
void LIBSVM_Solver::print_status() const {
std::cout << " active_set_size: " << active_size << "\n";
for (int i = 0; i != N; ++i) {
std::cout << " active_set[i]: " << active_set[i] << "\n";
std::cout << " p[i]: " << p[i] << "\n";
std::cout << " G[i]: " << G[i] << "\n";
std::cout << " G_bar[i]: " << G_bar[i] << "\n";
std::cout << " alpha_status[i]: " << alpha_status[i] << "\n";
std::cout << " Y[i]: " << Y[i] << "\n";
std::cout << " Alphas[i]: " << Alphas[i] << "\n";
std::cout << "\n";
}
}
// return true if already optimal, return false otherwise
bool LIBSVM_Solver::select_working_set(int &out_i, int &out_j) {
// return i,j such that
// i: maximizes -y_i * grad(f)_i, i in I_up(\alpha)
// j: minimizes the decrease of obj value
    // (if the quadratic coefficient <= 0, replace it with tau)
// -y_j*grad(f)_j < -y_i*grad(f)_i, j in I_low(\alpha)
//print_status();
double Gmax = -INF;
double Gmax2 = -INF;
int Gmax_idx = -1;
int Gmin_idx = -1;
double obj_diff_min = INF;
for(int t=0;t<active_size;t++) {
if( (Y[t] == +1 && !is_upper_bound(t)) ||
(Y[t] == -1 && !is_lower_bound(t))) {
if (-Y[t]*G[t] >= Gmax) {
Gmax = -Y[t]*G[t];
Gmax_idx = t;
}
}
}
if (Gmax_idx == -1) return true;
const int i = Gmax_idx;
const double* Q_i = cache_.get_kline(active_set[i], N);
const double* QDiag = cache_.get_diag();
for(int j=0;j<active_size;++j) {
if ((Y[j] == +1 && !is_lower_bound(j)) ||
(Y[j] == -1 && !is_upper_bound(j))) {
const double YGj = Y[j]*G[j];
const double grad_diff = Gmax+YGj;
if (YGj >= Gmax2) Gmax2 = YGj;
if (grad_diff > 0) {
const double quad_coef = Q_i[active_set[i]] + QDiag[active_set[j]] - 2*Y[j]*Y[i]*Q_i[active_set[j]];
const double obj_diff_factor = (quad_coef > 0) ? quad_coef : tau;
const double obj_diff = -(grad_diff*grad_diff)/obj_diff_factor;
if (obj_diff <= obj_diff_min) {
Gmin_idx = j;
obj_diff_min = obj_diff;
}
}
}
}
if (Gmin_idx == -1) return true;
if(Gmax+Gmax2 < eps) return true;
out_i = Gmax_idx;
out_j = Gmin_idx;
return false;
}
bool LIBSVM_Solver::be_shrunken(int i, const double Gmax1, const double Gmax2)
{
if(is_upper_bound(i)) {
if(Y[i]==+1) return -G[i] > Gmax1;
return -G[i] > Gmax2;
} else if(is_lower_bound(i)) {
if (Y[i]==+1) return G[i] > Gmax2;
return G[i] > Gmax1;
}
return false;
}
void LIBSVM_Solver::do_shrinking()
{
double Gmax1 = -INF; // max { -y_i * grad(f)_i | i in I_up(\alpha) }
double Gmax2 = -INF; // max { y_i * grad(f)_i | i in I_low(\alpha) }
// find maximal violating pair first
for(int i=0;i<active_size;i++) {
if(Y[i]==+1) {
if(!is_upper_bound(i)) Gmax1 = max(-G[i],Gmax1);
if(!is_lower_bound(i)) Gmax2 = max( G[i],Gmax2);
} else {
if(!is_upper_bound(i)) Gmax2 = max(-G[i],Gmax2);
if(!is_lower_bound(i)) Gmax1 = max( G[i],Gmax1);
}
}
// shrink
for(int i=0;i<active_size;++i) {
if (be_shrunken(i, Gmax1, Gmax2)) {
--active_size;
while (active_size > i) {
if (!be_shrunken(active_size, Gmax1, Gmax2)) {
swap_index(i,active_size);
break;
}
--active_size;
}
}
}
// unshrink, check all variables again before final iterations
if(unshrinked || Gmax1 + Gmax2 > eps*10) return;
unshrinked = true;
reconstruct_gradient();
for(int i= N-1; i >= active_size; --i) {
if (!be_shrunken(i, Gmax1, Gmax2)) {
while (active_size < i) {
if (be_shrunken(active_size, Gmax1, Gmax2)) {
swap_index(i,active_size);
break;
}
++active_size;
}
++active_size;
}
}
}
double LIBSVM_Solver::calculate_rho() {
int nr_free = 0;
double ub = INF, lb = -INF, sum_free = 0;
for(int i=0;i<active_size;i++) {
const double yG = Y[i]*G[i];
if(is_upper_bound(i))
{
if(Y[i]==-1) ub = min(ub,yG);
else lb = max(lb,yG);
}
else if(is_lower_bound(i))
{
if(Y[i]==+1) ub = min(ub,yG);
else lb = max(lb,yG);
}
else
{
++nr_free;
sum_free += yG;
}
}
if(nr_free>0) return sum_free/nr_free;
return (ub+lb)/2;
}
PyObject* eval_LIBSVM(PyObject* self, PyObject* args) {
try {
PyObject* X;
PyArrayObject* Y;
PyArrayObject* Alphas0;
PyArrayObject* p;
PyArrayObject* params;
PyObject* kernel;
int cache_size;
if (!PyArg_ParseTuple(args, "OOOOOOi", &X, &Y, &Alphas0, &p, ¶ms,&kernel,&cache_size)) {
const char* errmsg = "Arguments were not what was expected for eval_LIBSVM.\n"
"This is an internal function: Do not call directly unless you know exactly what you're doing.\n";
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
assert_type_contiguous(Y,NPY_INT32);
assert_type_contiguous(Alphas0,NPY_DOUBLE);
assert_type_contiguous(p,NPY_DOUBLE);
assert_type_contiguous(params,NPY_DOUBLE);
if (PyArray_DIM(params,0) < 4) throw SMO_Exception("eval_LIBSVM: Too few parameters");
int * Yv = static_cast<int*>(PyArray_DATA(Y));
double* Alphas = static_cast<double*>(PyArray_DATA(Alphas0));
double* pv = static_cast<double*>(PyArray_DATA(p));
unsigned N = PyArray_DIM(Y,0);
double& b = *static_cast<double*>(PyArray_GETPTR1(params,0));
double C = *static_cast<double*>(PyArray_GETPTR1(params,1));
double eps = *static_cast<double*>(PyArray_GETPTR1(params,2));
double tol = *static_cast<double*>(PyArray_GETPTR1(params,3));
LIBSVM_Solver optimiser(X,Yv,Alphas,pv,b,C,N,kernel,eps,tol,cache_size, true);
optimiser.optimise();
Py_RETURN_NONE;
} catch (const Python_Exception&) {
// if Python_Exception was thrown, then PyErr is already set.
return 0;
} catch (const SMO_Exception& exc) {
PyErr_SetString(PyExc_RuntimeError,exc.msg);
return 0;
} catch (...) {
PyErr_SetString(PyExc_RuntimeError,"Some sort of exception in eval_LIBSVM.");
return 0;
}
}
PyMethodDef methods[] = {
{"eval_SMO",eval_SMO, METH_VARARGS , "Do NOT call directly.\n" },
{"eval_LIBSVM",eval_LIBSVM, METH_VARARGS , "Do NOT call directly.\n" },
{NULL, NULL,0,NULL},
};
const char * module_doc =
"Internal SVM Module.\n"
"\n"
"Do NOT use directly!\n";
} // namespace
extern "C"
void init_svm()
{
import_array();
(void)Py_InitModule3("_svm", methods, module_doc);
}
<file_sep>/arduino/battery_meter/battery_meter2/battery_meter2.ino
//battery meter code for uno arduino
int interval = 1; //delay between sample sets, in milliseconds (1000 ms = 1 second)
float bat1_high = 12.95;
float bat1_low = 11.10;
float bat2_high = 12.95;
float bat2_low = 11.10;
float bat3_high = 12.95;
float bat3_low = 11.10;
int samples = 10; //number of samples taken
int analog_read = 0;
int bat1_pin_to_read = A0;
int bat2_pin_to_read = A1;
int bat3_pin_to_read = A2;
long bat1_reading_sum_total = 0;
long bat2_reading_sum_total = 0;
long bat3_reading_sum_total = 0;
int bat1_reading_sum_avg = 0;
int bat2_reading_sum_avg = 0;
int bat3_reading_sum_avg = 0;
float bat1_volt = 0;
float bat2_volt = 0;
float bat3_volt = 0;
int bat1_lvl = 0;
int bat2_lvl = 0;
int bat3_lvl = 0;
float calibration = 0.023941748; //voltage per reading
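/*
  Worked example (illustration only, using the constants above): an averaged reading of
  500 counts gives 500 * 0.023941748 = ~11.97 V; with the 11.10 V .. 12.95 V window used
  below that maps to 100 * (11.97 - 11.10) / (12.95 - 11.10) = ~47% battery level.
*/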
void setup()
{
Serial.begin(9600);
//pinMode(bat1_pin_to_read, INPUT);
//digitalWrite(bat1_pin_to_read, HIGH);
//pinMode(bat2_pin_to_read, INPUT);
//digitalWrite(bat2_pin_to_read, HIGH);
//pinMode(bat3_pin_to_read, INPUT);
//digitalWrite(bat3_pin_to_read, HIGH);
}
void loop()
{
for(int i = 0; i < samples ; i++)
{
bat1_reading_sum_total += analogRead(bat1_pin_to_read);
delay(50);
bat2_reading_sum_total += analogRead(bat2_pin_to_read);
delay(50);
bat3_reading_sum_total += analogRead(bat3_pin_to_read);
delay(50);
}
//average out the readings
bat1_reading_sum_avg = bat1_reading_sum_total / samples;
bat2_reading_sum_avg = bat2_reading_sum_total / samples;
bat3_reading_sum_avg = bat3_reading_sum_total / samples;
//Serial.println (bat1_reading_sum_avg * (5.0 / 1023.0));
Serial.println (bat1_reading_sum_avg );
Serial.println (bat2_reading_sum_avg );
Serial.println (bat3_reading_sum_avg );
//calculate voltages
bat1_volt = (bat1_reading_sum_avg * calibration );
bat2_volt = (bat2_reading_sum_avg * calibration );
bat3_volt = (bat3_reading_sum_avg * calibration );
//calculate battery levels
bat1_lvl = int(100 * ((bat1_volt - bat1_low) / (bat1_high - bat1_low)));
bat2_lvl = int(100 * ((bat2_volt - bat2_low) / (bat2_high - bat2_low)));
bat3_lvl = int(100 * ((bat3_volt - bat3_low) / (bat3_high - bat3_low)));
//send data out
Serial.print ("bat bank,");
Serial.print ("1,");
Serial.print("voltage,");
Serial.print (bat1_volt);
Serial.print (",");
Serial.print ("battery lvl,");
Serial.println (bat1_lvl);
Serial.print ("bat bank,");
Serial.print ("2,");
Serial.print("voltage,");
Serial.print (bat2_volt);
Serial.print (",");
Serial.print ("battery lvl,");
Serial.println (bat2_lvl);
Serial.print ("bat bank,");
Serial.print ("3,");
Serial.print("voltage,");
Serial.print (bat3_volt);
Serial.print (",");
Serial.print ("battery lvl,");
Serial.println (bat3_lvl);
//reset for next loop
bat1_reading_sum_avg = 0;
bat2_reading_sum_avg = 0;
bat3_reading_sum_avg = 0;
bat1_reading_sum_total = 0;
bat2_reading_sum_total = 0;
bat3_reading_sum_total = 0;
delay(interval);
}
<file_sep>/gui/easygui/test_gui1.py
#!/usr/bin/env python
import easygui as eg
import sys
from img_processing_tools import *
#from PIL import Image
from PIL import ImageStat
import cv2
import time
import mahotas
import Image
def snap_shot(filename):
#capture from camera at location 0
now = time.time()
    global webcam1
    img = None  # make sure we still return something sensible if the capture fails
try:
#have to capture a few frames as it buffers a few frames..
for i in range (5):
ret, img = webcam1.read()
print "time to capture 5 frames:", (time.time()) - now
cv2.imwrite(filename, img)
#img1 = Image.open(filename)
#img1.thumbnail((320,240))
#img1.save(filename)
#print (time.time()) - now
except:
print "could not grab webcam"
return img
if __name__=="__main__":
loop = 1
reply =""
webcam1 = cv2.VideoCapture(0)
while True:
# data file schema
# classID, next 256 integers are I3 greenband histogram, I3 sum, I3 sum2, I3 median, I3 mean,
# I3 variance, I3 Standard Deviation, I3 root mean square
if reply == "":
image = snap_shot('temp.png')
if reply == "Mowable":
#eg.msgbox("Going to mow....:")
classID = "1"
print "calling i3"
I3image = rgb2I3(image)
WriteMeterics(I3image, classID)
if reply == "Non-Mowable":
classID = "2"
print "calling i3"
I3image = rgb2I3(image)
WriteMeterics(I3image, classID)
if reply == "Test Img":
#im = Image.fromarray(image)
#im.save("new.png")
img1 = Image.open('temp.png')
#img1.thumbnail((320,240))
img1.save('new.png')
if reply == "Quit":
print "Quitting...."
sys.exit(-1)
if reply == "New Image":
print "Acquiring new image.."
image = snap_shot('temp.png')
#print np.array(image)
#print PIL2array(image)
#lbp1 = mahotas.features.lbp(image , 1, 8, ignore_zeros=False)
#print lbp1
reply = eg.buttonbox(msg='Classify Image', title='Robomow GUI', choices=('Mowable', 'Non-Mowable', 'Test Img', 'New Image', 'Quit'), image='temp.png', root=None)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/data/jugparallel_jugfile.py
import milk.ext.jugparallel
from milksets.wine import load
from milk.tests.fast_classifier import fast_classifier
features,labels = load()
classified = milk.ext.jugparallel.nfoldcrossvalidation(features, labels, learner=fast_classifier())
classified_wpred = milk.ext.jugparallel.nfoldcrossvalidation(features, labels, learner=fast_classifier(), return_predictions=True)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_multi_label.py
from milk.tests.fast_classifier import fast_classifier
import milk.supervised.multi_label
import milk
import numpy as np
def test_one_by_one():
np.random.seed(23)
r = np.random.random
ps = np.array([.7,.5,.8,.3,.8])
learner = milk.supervised.multi_label.one_by_one(fast_classifier())
universe = range(len(ps))
for _ in xrange(10):
labels = []
features = []
bases = [np.random.rand(20) for pj in ps]
for i in xrange(256):
cur = []
curf = np.zeros(20,float)
for j,pj in enumerate(ps):
if r() < pj:
cur.append(j)
curf += r()*bases[j]
if not cur: continue
labels.append(cur)
features.append(curf)
model = learner.train(features, labels)
predicted = model.apply_many(features)
matrix = np.zeros((2,2), int)
for t,p in zip(labels, predicted):
for ell in universe:
row = (ell in t)
col = (ell in p)
matrix[row,col] += 1
Tn,Fp = matrix[0]
Fn,Tp = matrix[1]
prec = Tp/float(Tp+Fp)
recall = Tp/float(Tp+Fn)
F1 = 2*prec*recall/(prec + recall)
assert F1 > .3
<file_sep>/ftdi/build/pyftdi/pyftdi/serialext/loggerext.py
# Copyright (c) 2008-2011, Neotion
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the Neotion nor the names of its contributors may
# be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL NEOTION BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
# OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
import sys
from pyftdi.pyftdi.misc import hexdump
__all__ = ['SerialLoggerPort']
class SerialLoggerPort(object):
"""Serial port implementation to log input/output data to a log file"""
def __init__(self, **kwargs):
self._logger = None
if 'logger' in kwargs:
self.set_logger(kwargs['logger'])
del kwargs['logger']
super(SerialLoggerPort, self).__init__(**kwargs)
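    # Design note: every I/O method below defers to super(SerialLoggerPort, self), so this
    # class is effectively a mixin meant to be combined with a pyserial-compatible port
    # class; on its own it only wraps read/write/flush/inWaiting with hexdump logging.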
def open(self, **kwargs):
return super(SerialLoggerPort, self).open(**kwargs)
def _log_read(self, data):
if not self._logger:
return
try:
print >>self._logger, "READ:\n%s" % hexdump(data)
except:
print >> sys.stderr, 'Cannot log read data'
def _log_write(self, data):
if not self._logger:
return
try:
print >>self._logger, "WRITE:\n%s" % hexdump(data)
except:
print >>sys.stderr, 'Cannot log written data'
def _log_flush(self, type_):
if not self._logger:
return
try:
print >>self._logger, "FLUSH: %s\n" % type_
except:
print >>sys.stderr, 'Cannot log flush'
def _log_waiting(self, count):
if not self._logger:
return
try:
print >>self._logger, "INWAITING: %d\n" % count
except:
            print >>sys.stderr, 'Cannot log inWaiting'
def set_logger(self, logger):
try:
self._logger = open(logger, "wt")
except IOError, e:
            print >> sys.stderr, \
                "Cannot log data to %s" % logger
def read(self, size=1):
data = super(SerialLoggerPort, self).read(size)
self._log_read(data)
return data
def write(self, data):
super(SerialLoggerPort, self).write(data)
if len(data):
self._log_write(data)
def inWaiting(self):
wait = super(SerialLoggerPort, self).inWaiting()
self._log_waiting(wait)
return wait
def flush(self):
self._log_flush('I+O')
super(SerialLoggerPort, self).flush()
def flushInput(self):
self._log_flush('I')
super(SerialLoggerPort, self).flushInput()
def flushOutput(self):
self._log_flush('O')
super(SerialLoggerPort, self).flushOutput()
<file_sep>/gui/foo/foo/AboutFooDialog.py
# -*- coding: utf-8 -*-
### BEGIN LICENSE
# This file is in the public domain
### END LICENSE
import gtk
from foo.helpers import get_builder
import gettext
from gettext import gettext as _
gettext.textdomain('foo')
class AboutFooDialog(gtk.AboutDialog):
__gtype_name__ = "AboutFooDialog"
def __new__(cls):
"""Special static method that's automatically called by Python when
constructing a new instance of this class.
Returns a fully instantiated AboutFooDialog object.
"""
builder = get_builder('AboutFooDialog')
new_object = builder.get_object("about_foo_dialog")
new_object.finish_initializing(builder)
return new_object
def finish_initializing(self, builder):
"""Called while initializing this instance in __new__
        finish_initializing should be called after parsing the ui definition
and creating a AboutFooDialog object with it in order to
finish initializing the start of the new AboutFooDialog
instance.
Put your initialization code in here and leave __init__ undefined.
"""
# Get a reference to the builder and set up the signals.
self.builder = builder
self.builder.connect_signals(self)
# Code for other initialization actions should be added here.
if __name__ == "__main__":
dialog = AboutFooDialog()
dialog.show()
gtk.main()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_adaboost.py
import numpy as np
import milk.supervised.tree
import milk.supervised.adaboost
def test_learner():
from milksets import wine
learner = milk.supervised.adaboost.boost_learner(milk.supervised.tree.stump_learner())
features, labels = wine.load()
features = features[labels < 2]
labels = labels[labels < 2] == 0
labels = labels.astype(int)
model = learner.train(features[::2], labels[::2])
train_out = np.array(map(model.apply, features))
assert (train_out == labels).mean() > .9
def test_too_many_boolean_indices_regression():
import milk.supervised.randomforest
import milk.supervised.adaboost
import milksets.wine
from milk.supervised.multi import one_against_one
weak = milk.supervised.randomforest.rf_learner()
learner = milk.supervised.adaboost.boost_learner(weak)
learner = one_against_one(learner)
features, labels = milksets.wine.load()
# sample features so that the test is faster (still gives error):
learner.train(features[::16], labels[::16])
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/set2binary_array.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
__all__ = [
'set2binary_array',
]
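# For illustration: set2binary_array learns the "universe" of all feature values seen at
# training time and maps each input set to a fixed-length boolean vector over that universe,
# e.g. with universe ['a', 'b', 'c'], applying the model to {'a', 'c'} gives
# [True, False, True, False]; the extra final slot flags elements outside the universe.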
class set2binary_array_model(object):
def __init__(self, universe):
self.universe = list(universe)
def apply(self, elems):
res = np.zeros(len(self.universe) + 1, bool)
for e in elems:
try:
res[self.universe.index(e)] = True
except :
res[-1] = True
return res
class set2binary_array(object):
def train(self, features, labels, normalisedlabels=False):
allfeatures = set()
for f in features:
allfeatures.update(f)
return set2binary_array_model(allfeatures)
<file_sep>/lib/mobot_video_publish.py
#!/usr/bin/env python
import pika
from pylab import imread
import time
import cv, cv2
import cPickle as pickle
import gc
'''
USAGE:
    from mobot_video_publish import publish_video
    camera1 = publish_video(camera_num, width, height)
(a consumer-side sketch appears further down in this file)

On Ubuntu it helps to reload uvcvideo so frames are not dropped:
    lsmod
    rmmod uvcvideo
    modprobe uvcvideo nodrop=1 timeout=5000 quirks=0x80
'''
class publish_video():
def __init__(self, camera_num, x, y):
self.camera_num = camera_num
self.x = x
self.y = y
self.camera = None
self.frame_count = 0
self.recovery_count = 0
self.frame = None
self.capture_time = 0.0
#-------------connection variables
self.feed_num = 'video.' + str(camera_num )
self.connection = None
self.channel = None
#----------------------RUN
self.run()
def connect(self):
self.connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
self.channel = self.connection.channel()
#channel.queue_declare(queue='mobot_video1', auto_delete=True, arguments={'x-message-ttl':1000})
self.channel.exchange_declare(exchange='mobot_data_feed',type='topic')
#print self.feed_num
'''
queue_declare(callback, queue='', passive=False, durable=False, exclusive=False, auto_delete=False, nowait=False, arguments=None)[source]
Declare queue, create if needed. This method creates or checks a queue. When creating a new queue the client can specify various properties that control the durability of the queue and its contents, and the level of sharing for the queue.
    Leave the queue name empty for an auto-named queue in RabbitMQ
Parameters:
callback (method) - The method to call on Queue.DeclareOk
queue (str or unicode) - The queue name
passive (bool) - Only check to see if the queue exists
durable (bool)- Survive reboots of the broker
exclusive (bool)- Only allow access by the current connection
auto_delete (bool) - Delete after consumer cancels or disconnects
nowait (bool) - Do not wait for a Queue.DeclareOk
arguments (dict) - Custom key/value arguments for the queue
'''
def publish(self, data):
self.channel.basic_publish(exchange='mobot_data_feed',
routing_key=self.feed_num, body=data)
def enable_local_display(self):
cv2.namedWindow('Front Camera', cv.CV_WINDOW_AUTOSIZE)
#webcam1 = cv.CreateCameraCapture(1)
webcam1 = cv2.VideoCapture(1)
cv2.imshow('Front Camera', frame1)
cv2.waitKey(30)
def initialize_camera(self, camera_num, x, y):
        if camera_num != "":
self.camera = cv2.VideoCapture(camera_num)
self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, x)
self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, y)
#self.camera1.set(cv2.cv.CV_CAP_PROP_FPS, 10)
def run(self):
self.connect()
self.initialize_camera(self.camera_num, self.x, self.y)
while True:
time.sleep(0.0001) #dont hog resources
self.frame = None
self.frame_count += 1
now = time.time()
try:
ret, self.frame = self.camera.read()
except:
pass
self.capture_time = (time.time()-now)
print 'frames:', self.frame_count , " capture time:", self.capture_time, " recovery_count:", self.recovery_count
if self.capture_time > 0.9 or self.frame == None:
self.frame = None
while self.frame == None:
self.recovery_count += 1
try:
if self.camera != None:
                            self.camera.release()  # release() must actually be called; without the parentheses this line is a no-op
gc.enable()
gc.collect()
self.initialize_camera(self.camera_num, self.x, self.y)
try:
ret, self.frame = self.camera.read()
except:
pass
except:
time.sleep(.5)
pass
pickled_frame = pickle.dumps(self.frame,-1)
self.publish(pickled_frame)
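# Illustrative consumer-side sketch (not part of the original robot code): it shows one way
# another process could subscribe to the 'mobot_data_feed' topic exchange that publish_video
# writes to and unpickle the frames. The function name and the anonymous-queue setup are
# assumptions; only the exchange name, routing key scheme and pickling come from the
# publisher above (written against the old pika 0.9.x blocking API used in this file).
def consume_video_frames(camera_num=0, host='localhost'):
    connection = pika.BlockingConnection(pika.ConnectionParameters(host))
    channel = connection.channel()
    channel.exchange_declare(exchange='mobot_data_feed', type='topic')
    # let the broker pick a private, auto-deleted queue for this consumer
    result = channel.queue_declare(exclusive=True)
    queue_name = result.method.queue
    channel.queue_bind(exchange='mobot_data_feed', queue=queue_name,
                       routing_key='video.' + str(camera_num))
    def on_frame(ch, method, properties, body):
        frame = pickle.loads(body)
        if frame is not None:
            print 'received frame with shape:', frame.shape
    channel.basic_consume(on_frame, queue=queue_name, no_ack=True)
    channel.start_consuming()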
if __name__== "__main__":
    video0 = publish_video(0, 320, 240)
while True:
time.sleep(0.00001)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/precluster_learner.py
from .precluster import *
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/fast_classifier.py
import numpy as np
from milk.supervised.base import supervised_model
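# Trivial learner used to keep the test-suite fast: train() memorises the first feature
# vector seen for each label, and the resulting model labels a new point with the label of
# the memorised example closest in squared Euclidean distance.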
class fast_classifier(object):
def __init__(self):
pass
def set_option(self, _k, _v):
pass
def train(self, features, labels, **kwargs):
examples = {}
for f,lab in zip(features, labels):
if lab not in examples:
examples[lab] = f
return fast_model(examples)
class fast_model(supervised_model):
def __init__(self, examples):
self.examples = examples
assert len(self.examples)
def apply(self, f):
best = None
best_val = +np.inf
for k,v in self.examples.iteritems():
d = v-f
dist = np.dot(d,d)
if dist < best_val:
best = k
best_val = dist
return best
<file_sep>/PhidgetsPython/Python/TextLCD-simple.py
#!/usr/bin/env python
"""Copyright 2010 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'May 17 2010'
#Basic imports
from ctypes import *
import sys
from time import sleep
#Phidget specific imports
from Phidgets.Phidget import PhidgetID
from Phidgets.PhidgetException import PhidgetErrorCodes, PhidgetException
from Phidgets.Events.Events import AttachEventArgs, DetachEventArgs, ErrorEventArgs
from Phidgets.Devices.TextLCD import TextLCD, TextLCD_ScreenSize
#Create an TextLCD object
try:
textLCD = TextLCD()
except RuntimeError as e:
print("Runtime Exception: %s" % e.details)
print("Exiting....")
exit(1)
#Information Display Function
def DisplayDeviceInfo():
try:
isAttached = textLCD.isAttached()
name = textLCD.getDeviceName()
serialNo = textLCD.getSerialNum()
version = textLCD.getDeviceVersion()
rowCount = textLCD.getRowCount()
columnCount = textLCD.getColumnCount()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
return 1
print("|------------|----------------------------------|--------------|------------|")
    print("|- Attached -|-              Type              -|- Serial No. -|-  Version -|")
print("|------------|----------------------------------|--------------|------------|")
print("|- %8s -|- %30s -|- %10d -|- %8d -|" % (isAttached, name, serialNo, version))
print("|------------|----------------------------------|--------------|------------|")
print("Number of Rows: %i -- Number of Columns: %i" % (rowCount, columnCount))
#Event Handler Callback Functions
def TextLCDAttached(e):
attached = e.device
print("TextLCD %i Attached!" % (attached.getSerialNum()))
def TextLCDDetached(e):
detached = e.device
print("TextLCD %i Detached!" % (detached.getSerialNum()))
def TextLCDError(e):
try:
source = e.device
print("TextLCD %i: Phidget Error %i: %s" % (source.getSerialNum(), e.eCode, e.description))
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
#Main Program Code
try:
textLCD.setOnAttachHandler(TextLCDAttached)
textLCD.setOnDetachHandler(TextLCDDetached)
textLCD.setOnErrorhandler(TextLCDError)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Opening phidget object....")
try:
textLCD.openPhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Waiting for attach....")
try:
textLCD.waitForAttach(10000)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
try:
textLCD.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Exiting....")
exit(1)
else:
DisplayDeviceInfo()
try:
if textLCD.getDeviceID()==PhidgetID.PHIDID_TEXTLCD_ADAPTER:
textLCD.setScreenIndex(0)
textLCD.setScreenSize(TextLCD_ScreenSize.PHIDGET_TEXTLCD_SCREEN_2x8)
print("Writing to first row....")
textLCD.setDisplayString(0, "Row 1")
sleep(2)
print("Writing to second row....")
textLCD.setDisplayString(1, "Row 2")
sleep(2)
print("Adjusting contrast up....")
textLCD.setContrast(255)
sleep(2)
print("Adjusting contrast down....")
textLCD.setContrast(110)
sleep(2)
print("Turn on cursor....")
textLCD.setCursor(True)
sleep(2)
print("Turn on cursor blink....")
textLCD.setCursor(False)
textLCD.setCursorBlink(True)
sleep(2)
print("Clear the screen...")
textLCD.setCursorBlink(False)
textLCD.setDisplayString(0, "")
textLCD.setDisplayString(1, "")
sleep(2)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
try:
print("Display some custom characters...")
print("Set the custom chars....")
textLCD.setCustomCharacter(0, 949247, 536)
textLCD.setCustomCharacter(1, 1015791, 17180)
textLCD.setCustomCharacter(2, 1048039, 549790)
textLCD.setCustomCharacter(3, 1031395, 816095)
textLCD.setCustomCharacter(4, 498785, 949247)
textLCD.setCustomCharacter(5, 232480, 1015791)
textLCD.setCustomCharacter(6, 99328, 1048039)
print("Display the custom chars....")
textLCD.setDisplayString(0, "Custom..")
customString = textLCD.getCustomCharacter(0)
customString += textLCD.getCustomCharacter(1)
customString += textLCD.getCustomCharacter(2)
customString += textLCD.getCustomCharacter(3)
customString += textLCD.getCustomCharacter(4)
customString += textLCD.getCustomCharacter(5)
customString += textLCD.getCustomCharacter(6)
textLCD.setDisplayString(1, customString)
sleep(2)
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Press Enter to quit....")
chr = sys.stdin.read(1)
print("Closing...")
textLCD.setDisplayString(0, "")
textLCD.setDisplayString(1, "")
try:
textLCD.closePhidget()
except PhidgetException as e:
print("Phidget Exception %i: %s" % (e.code, e.details))
print("Exiting....")
exit(1)
print("Done.")
exit(0)<file_sep>/camera/Pixelate.py
import os, sys
import Tkinter
import Image, ImageTk
import ImageEnhance
def enhance(eimage):
enh = ImageEnhance.Contrast(eimage)
enh.enhance(1.3).show("30% more contrast")
def win1():
top = Tkinter.Tk()
image1 = Image.open("C:\Python26\image.jpg")
top.geometry('%dx%d' % (image1.size[0],image1.size[1]))
tkpi = ImageTk.PhotoImage(image1)
label_image1 = Tkinter.Label(top, image=tkpi)
label_image1.place(x=0,y=0,width=image1.size[0],height=image1.size[1])
top.title("window 1")
startButton = Tkinter.Button(top, text="Start", command=win2)
startButton.grid(row=1, column=7)
leaveButton = Tkinter.Button(top, text="Quit", command=top.destroy)
leaveButton.grid(row=1, column=1, sticky='nw')
#b1Var = Tkinter.StringVar()
#b2Var = Tkinter.StringVar()
#b1Var.set('b1')
#b2Var.set('b2')
#box1Label = Tkinter.Label(top,textvariable=b1Var,width=12)
#box1Label.grid(row=3, column=2)
#box2Label = Tkinter.Label(top,textvariable=b2Var,width=12)
#box2Label.grid(row=3, column=3)
top.mainloop()
def win2():
board = Tkinter.Toplevel()
image2 = Image.open("C:\Python26\image2.jpg")
board.geometry('%dx%d' % (image2.size[0],image2.size[1]))
tkpi2 = ImageTk.PhotoImage(image2)
label_image2 = Tkinter.Label(board, image=tkpi2)
label_image2.place(x=0,y=0,width=image2.size[0],height=image2.size[1])
board.title("window 2")
    startButton = Tkinter.Button(board, text="Start", command=lambda: enhance(image2))  # pass a callback, not the result of calling enhance() right away
startButton.grid(row=1, column=7)
board.mainloop()
try:
win1()
except Exception, e:
# This is used to skip anything not an image.
# Image.open will generate an exception if it cannot open a file.
# Warning, this will hide other errors as well.
    print "didn't work:", e
pass
<file_sep>/gui/breezy/songbrowser1.py
"""
File: songbrowser1.py
Author: <NAME>
"""
from breezypythongui import EasyFrame
from tkinter import END, NORMAL, DISABLED
import tkinter.filedialog
from song import SongDatabase, Song
class SongBrowser(EasyFrame):
"""Allows the user to browse songs in a database file, delete them,
and save the database to a file."""
def __init__(self):
"""Sets up the window and the widgets."""
EasyFrame.__init__(self, title = "Song Browser",
width = 300, height = 150)
# Create a model
self.database = SongDatabase()
self.selectedTitle = None
# Set up the menus
menuBar = self.addMenuBar(row = 0, column = 0, columnspan = 2)
fileMenu = menuBar.addMenu("File")
fileMenu.addMenuItem("Open", self.openFile)
fileMenu.addMenuItem("Save", self.saveFile)
fileMenu.addMenuItem("Save As...", self.saveFileAs)
self.editMenu = menuBar.addMenu("Edit", state = DISABLED)
self.editMenu.addMenuItem("Find", self.findSong)
self.editMenu.addMenuItem("Delete", self.deleteSong)
# Set up the list box
self.listBox = self.addListbox(row = 1, column = 0, width = 20,
listItemSelected = self.listItemSelected)
# Set the text area
self.outputArea = self.addTextArea("", row = 1, column = 1,
width = 30, height = 4)
# Event handling methods
def listItemSelected(self, index):
"""Responds to the selection of an item in the list box.
Updates the selected title and the text area with the
current song's info."""
self.selectedTitle = self.listBox.getSelectedItem()
if self.selectedTitle == "":
self.outputArea.setText("")
else:
self.outputArea.setText(str(self.database[self.selectedTitle]))
def openFile(self):
"""Pops up an open file dialog. Updates the view if
a file is opened."""
filetypes = [("Database files", "*.dat")]
fileName = tkinter.filedialog.askopenfilename(parent = self,
filetypes = filetypes)
if fileName == "": return
self.database = SongDatabase(fileName)
self.listBox.clear()
for title in self.database.getTitles():
self.listBox.insert(END, title)
self.listBox.setSelectedIndex(0)
self.listItemSelected(0)
if self.listBox.size() > 0:
self.editMenu["state"] = NORMAL
else:
self.editMenu["state"] = DISABLED
def saveFile(self):
"""If the database has a file name, saves it to the file.
Otherwise, calls saveFileAs to prompt the user for a file
name."""
if self.database.fileName != "":
self.database.save()
else:
self.saveFileAs()
def saveFileAs(self):
"""Pops up a dialog to save the database to a file."""
fileName = tkinter.filedialog.asksaveasfilename(parent = self)
if fileName != "":
self.database.save(fileName)
def findSong(self):
"""Prompts the user for a title and searches for the song.
Selects and updates if found."""
title = self.prompterBox(title = "Find",
promptString = "Enter a song's title")
if title == "": return
song = self.database[title]
if not song:
self.messageBox(message = title + " was not found")
else:
index = self.listBox.getIndex(title)
self.listBox.setSelectedIndex(index)
self.listItemSelected(index)
def deleteSong(self):
"""Deletes the selected song from the database and updates the view."""
if self.selectedTitle:
self.database.pop(self.selectedTitle)
index = self.listBox.getSelectedIndex()
self.listBox.delete(index)
if self.listBox.size() > 0:
if index > 0:
index -= 1
self.listBox.setSelectedIndex(index)
self.listItemSelected(index)
else:
self.listItemSelected(-1)
self.editMenu["state"] = DISABLED
# Instantiate and pop up the window."""
if __name__ == "__main__":
SongBrowser().mainloop()
<file_sep>/show_histogram.py
# RGB Hitogram
# This script will create a histogram image based on the RGB content of
# an image. It uses PIL to do most of the donkey work but then we just
# draw a pretty graph out of it.
#
# May 2009, <NAME>, www.scottmcdonough.co.uk
#
import Image, ImageDraw
import sys
import numpy as np
import matplotlib.pyplot as plt
import pylab as P
if __name__=="__main__":
    if len(sys.argv) < 2:
        print "usage: show_histogram.py <image>"
        sys.exit(-1)
try:
imagepath = sys.argv[1] # The image to build the histogram of
#img2 = cv.LoadImage(sys.argv[2],cv.CV_LOAD_IMAGE_GRAYSCALE)
except:
print "******* Could not open image files *******"
sys.exit(-1)
histHeight = 120 # Height of the histogram
histWidth = 256 # Width of the histogram
multiplerValue = 1.5 # The multiplier value basically increases
                             # the histogram height so that low values
# are easier to see, this in effect chops off
# the top of the histogram.
showFstopLines = True # True/False to hide outline
fStopLines = 5
# Colours to be used
backgroundColor = (51,51,51) # Background color
lineColor = (102,102,102) # Line color of fStop Markers
red = (255,60,60) # Color for the red lines
green = (51,204,51) # Color for the green lines
blue = (0,102,255) # Color for the blue lines
##################################################################################
img = Image.open(imagepath)
hist = img.histogram()
    histMax = max(hist)  # count in the most populated histogram bin
xScale = float(histWidth)/len(hist) # xScaling
yScale = float((histHeight)*multiplerValue)/histMax # yScaling
im = Image.new("RGBA", (histWidth, histHeight), backgroundColor)
draw = ImageDraw.Draw(im)
# Draw Outline is required
if showFstopLines:
xmarker = histWidth/fStopLines
x =0
for i in range(1,fStopLines+1):
draw.line((x, 0, x, histHeight), fill=lineColor)
x+=xmarker
draw.line((histWidth-1, 0, histWidth-1, 200), fill=lineColor)
draw.line((0, 0, 0, histHeight), fill=lineColor)
# Draw the RGB histogram lines
x=0; c=0;
for i in hist:
if int(i)==0: pass
else:
color = red
if c>255: color = green
if c>511: color = blue
draw.line((x, histHeight, x, histHeight-(i*yScale)), fill=color)
if x>255: x=0
else: x+=1
c+=1
# Now save and show the histogram
im.save('histogram.png', 'PNG')
im.show()
#!/usr/bin/env python
#
# The hist() function now has a lot more options
#
#
# first create a single histogram
#
mu, sigma = 200, 25
x = mu + sigma*P.randn(10000)
# the histogram of the data with histtype='step'
n, bins, patches = P.hist(x, 50, normed=1, histtype='stepfilled')
P.setp(patches, 'facecolor', 'g', 'alpha', 0.75)
# add a line showing the expected distribution
y = P.normpdf( bins, mu, sigma)
l = P.plot(bins, y, 'k--', linewidth=1.5)
#
# create a histogram by providing the bin edges (unequally spaced)
#
P.figure()
bins = [100,125,150,160,170,180,190,200,210,220,230,240,250,275,300]
# the histogram of the data with histtype='step'
n, bins, patches = P.hist(x, bins, normed=1, histtype='bar', rwidth=0.8)
#
# now we create a cumulative histogram of the data
#
P.figure()
n, bins, patches = P.hist(x, 50, normed=1, histtype='step', cumulative=True)
# add a line showing the expected distribution
y = P.normpdf( bins, mu, sigma).cumsum()
y /= y[-1]
l = P.plot(bins, y, 'k--', linewidth=1.5)
# create a second data-set with a smaller standard deviation
sigma2 = 15.
x = mu + sigma2*P.randn(10000)
n, bins, patches = P.hist(x, bins=bins, normed=1, histtype='step', cumulative=True)
# add a line showing the expected distribution
y = P.normpdf( bins, mu, sigma2).cumsum()
y /= y[-1]
l = P.plot(bins, y, 'r--', linewidth=1.5)
# finally overplot a reverted cumulative histogram
n, bins, patches = P.hist(x, bins=bins, normed=1,
histtype='step', cumulative=-1)
P.grid(True)
P.ylim(0, 1.05)
#
# histogram has the ability to plot multiple data in parallel ...
# Note the new color kwarg, used to override the default, which
# uses the line color cycle.
#
P.figure()
# create a new data-set
x = mu + sigma*P.randn(1000,3)
n, bins, patches = P.hist(x, 10, normed=1, histtype='bar',
color=['crimson', 'burlywood', 'chartreuse'],
label=['Crimson', 'Burlywood', 'Chartreuse'])
P.legend()
#
# ... or we can stack the data
#
P.figure()
n, bins, patches = P.hist(x, 10, normed=1, histtype='barstacked')
#
# finally: make a multiple-histogram of data-sets with different length
#
x0 = mu + sigma*P.randn(10000)
x1 = mu + sigma*P.randn(7000)
x2 = mu + sigma*P.randn(3000)
# and exercise the weights option by arbitrarily giving the first half
# of each series only half the weight of the others:
w0 = np.ones_like(x0)
w0[:len(x0)/2] = 0.5
w1 = np.ones_like(x1)
w1[:len(x1)/2] = 0.5
w2 = np.ones_like(x2)
w2[:len(x2)/2] = 0.5
P.figure()
n, bins, patches = P.hist( [x0,x1,x2], 10, weights=[w0, w1, w2], histtype='bar')
P.show()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/tree.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2011, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# License: MIT. See COPYING.MIT file in the milk distribution
'''
================
Tree Classifier
================
Decision tree based classifier
------------------------------
'''
from __future__ import division
import numpy as np
from .classifier import normaliselabels
from .base import supervised_model
__all__ = [
'tree_learner',
'stump_learner',
]
class Leaf(object):
'''
v : value
w : weight
'''
def __init__(self, v, w):
self.v = v
self.w = w
def __repr__(self):
return 'Leaf(%s,%s)' % (self.v, self.w)
class Node(object): # This could be replaced by a namedtuple
def __init__(self, featid, featval, left, right):
self.featid = featid
self.featval = featval
self.left = left
self.right = right
def _split(features, labels, weights, criterion, subsample, R):
N,f = features.shape
if subsample is not None:
samples = np.array(R.sample(xrange(features.shape[1]), subsample))
features = features[:, samples]
f = subsample
best = None
best_val = float('-Inf')
for i in xrange(f):
domain_i = sorted(set(features[:,i]))
for d in domain_i[1:]:
cur_split = (features[:,i] < d)
if weights is not None:
value = criterion(labels[cur_split], labels[~cur_split], weights[cur_split], weights[~cur_split])
else:
value = criterion(labels[cur_split], labels[~cur_split])
if value > best_val:
best_val = value
if subsample is not None:
ti = samples[i]
else:
ti = i
best = ti,d
return best
from ._tree import set_entropy
from ._tree import information_gain as _information_gain
def information_gain(labels0, labels1, include_entropy=False):
'''
ig = information_gain(labels0, labels1, include_entropy=False)
Information Gain
See http://en.wikipedia.org/wiki/Information_gain_in_decision_trees
The function calculated here does not include the original entropy unless
you explicitly ask for it (by passing include_entropy=True)
'''
if include_entropy:
return set_entropy(np.concatenate( (labels0, labels1) )) + \
_information_gain(labels0, labels1)
return _information_gain(labels0, labels1)
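# Rough pure-Python sketch of the quantity computed above, for illustration only (the real
# work happens in the _tree C extension, and the helper name below is ours, not milk API).
# It assumes the convention stated in the docstring: without the parent entropy, the value
# is minus the size-weighted average of the two children's entropies (in nats here; the C
# version may use a different log base), so that larger is still better.
def _information_gain_sketch(labels0, labels1):
    import collections
    import math
    def H(labels):
        N = float(len(labels))
        counts = collections.Counter(labels)
        return -sum((c / N) * math.log(c / N) for c in counts.values())
    N0, N1 = float(len(labels0)), float(len(labels1))
    N = N0 + N1
    return -(N0 / N) * H(labels0) - (N1 / N) * H(labels1)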
def z1_loss(labels0, labels1, weights0=None, weights1=None):
'''
z = z1_loss(labels0, labels1)
z = z1_loss(labels0, labels1, weights0, weights1)
zero-one loss
'''
def _acc(labels, weights):
c = (labels.mean() > .5)
if weights is not None:
return np.dot((labels != c), weights)
return np.sum(labels != c)
return _acc(labels0, weights0) + _acc(labels1, weights1)
def neg_z1_loss(labels0, labels1, weights0=None, weights1=None):
'''
z = neg_z1_loss(labels0, labels1)
z = neg_z1_loss(labels0, labels1, weights0, weights1)
zero-one loss, with the sign reversed so it can be *maximised*.
'''
return -z1_loss(labels0,labels1,weights0,weights1)
def build_tree(features, labels, criterion, min_split=4, subsample=None, R=None, weights=None):
'''
tree = build_tree(features, labels, criterion, min_split=4, subsample=None, R=None, weights={all 1s})
Parameters
----------
features : sequence
features to use
labels : sequence
labels
criterion : function {labels} x {labels} -> float
function to measure goodness of split
min_split : integer
minimum size to split on
subsample : integer, optional
        if given, then, at each step, only consider `subsample` randomly chosen features for the split
R : source of randomness, optional
See `milk.util.get_pyrandom`
weights : sequence, optional
weight of instance (default: all the same)
Returns
-------
tree : Tree
'''
assert len(features) == len(labels), 'build_tree: Nr of labels does not match nr of features'
features = np.asanyarray(features)
labels = np.asanyarray(labels, dtype=np.int)
if subsample is not None:
if subsample <= 0:
raise ValueError('milk.supervised.tree.build_tree: `subsample` must be > 0.\nDid you mean to use None to signal no subsample?')
from ..utils import get_pyrandom
R = get_pyrandom(R)
def recursive(features, labels):
N = float(len(labels))
if N < min_split:
return Leaf(labels.sum()/N, N)
S = _split(features, labels, weights, criterion, subsample, R)
if S is None:
return Leaf(labels.sum()/N, N)
i,thresh = S
split = features[:,i] < thresh
return Node(featid=i,
featval=thresh,
left =recursive(features[ split], labels[ split]),
right=recursive(features[~split], labels[~split]))
return recursive(features, labels)
def apply_tree(tree, features):
'''
conf = apply_tree(tree, features)
Applies the decision tree to a set of features.
'''
if type(tree) is Leaf:
return tree.v
if features[tree.featid] < tree.featval:
return apply_tree(tree.left, features)
return apply_tree(tree.right, features)
class tree_learner(object):
'''
tree = tree_learner()
model = tree.train(features, labels)
model2 = tree.train(features, labels, weights=weights)
predicted = model.apply(testfeatures)
A decision tree classifier (currently, implements the greedy ID3
algorithm without any pruning).
Attributes
----------
criterion : function, optional
criterion to use for tree construction,
this should be a function that receives a set of labels
(default: information_gain).
min_split : integer, optional
minimum size to split on (default: 4).
'''
def __init__(self, criterion=information_gain, min_split=4, return_label=True, subsample=None, R=None):
self.criterion = criterion
self.min_split = min_split
self.return_label = return_label
self.subsample = subsample
self.R = R
def train(self, features, labels, normalisedlabels=False, weights=None):
if not normalisedlabels:
labels,names = normaliselabels(labels)
tree = build_tree(features, labels, self.criterion, self.min_split, self.subsample, self.R, weights)
return tree_model(tree, self.return_label)
tree_classifier = tree_learner
class tree_model(supervised_model):
'''
tree model
'''
def __init__(self, tree, return_label):
self.tree = tree
self.return_label = return_label
def apply(self,feats):
value = apply_tree(self.tree, feats)
if self.return_label:
return value > .5
return value
class stump_model(supervised_model):
def __init__(self, idx, cutoff, names):
self.names = names
self.idx = idx
self.cutoff = cutoff
def apply(self, f):
value = f[self.idx] > self.cutoff
if self.names is not None:
return self.names[value]
return value
def __repr__(self):
return '<stump(%s, %s)>' % (self.idx, self.cutoff)
class stump_learner(object):
def __init__(self):
pass
def train(self, features, labels, normalisedlabels=False, weights=None, **kwargs):
if not normalisedlabels:
labels,names = normaliselabels(labels)
else:
names = kwargs.get('names')
split = _split(features, labels, weights, neg_z1_loss, subsample=None, R=None)
if split is None:
raise ValueError('milk.supervised.stump_learner: Unable to find split (all features are the same)')
idx,cutoff = split
return stump_model(idx, cutoff, names)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_set2binary_array.py
import numpy as np
from milk.supervised import set2binary_array
def test_set2binary_array_len():
s2f = set2binary_array.set2binary_array()
inputs = [ np.arange(1,3)*2, np.arange(4)**2, np.arange(6)+2 ]
labels = [0,0,1]
model = s2f.train(inputs,labels)
assert len(model.apply(inputs[0])) == len(model.apply(inputs[1]))
assert len(model.apply(inputs[0])) == len(model.apply(inputs[2]))
assert len(model.apply(inputs[0])) == len(model.apply(range(128)))
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/active/eimpact.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2010, <NAME> <<EMAIL>>
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy
from ..supervised import svm
from ..supervised.classifier import ctransforms
def expected_impacts(D,labels,U):
'''
EIs = expected_impacts(D,labels,U)
    Compute the expected impact for each element of U:
    EIs[i] = P(label[i] == 1) * IMPACT(label[i] == 1) + P(label[i] == 0) * IMPACT(label[i] == 0)
'''
    assert len(D) == len(labels), 'Nr of labeled examples should match length of labels vector'
K = svm.rbf_kernel(20000)
prob_classifier = ctransforms(svm.svm_raw(kernel=K,C=4),svm.svm_sigmoidal_correction())
label_classifier = ctransforms(svm.svm_raw(kernel=K,C=4),svm.svm_binary())
prob_classifier.train(D,labels)
u_probs = prob_classifier(U)
u_labels = (u_probs > .5)
impacts = []
for u,p in zip(U,u_probs):
print len(impacts)
label_classifier.train(numpy.vstack((D,u)),numpy.hstack((labels,[0])))
u_labels_0 = label_classifier(U)
label_classifier.train(numpy.vstack((D,u)),numpy.hstack((labels,[1])))
u_labels_1 = label_classifier(U)
e_impact = (1.-p)*(u_labels != u_labels_0).sum() + p*(u_labels != u_labels_1).sum()
impacts.append(e_impact)
return impacts
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/_lasso.cpp
// Copyright (C) 2012, <NAME> <<EMAIL>>
// License: MIT
#include <iostream>
#include <memory>
#include <cmath>
#include <random>
#include <cassert>
#include <queue>
#include <eigen3/Eigen/Dense>
using namespace Eigen;
extern "C" {
#include <Python.h>
#include <numpy/ndarrayobject.h>
}
namespace {
template <typename T>
int random_int(T& random, const int max) {
std::uniform_int_distribution<int> dist(0, max - 1);
return dist(random);
}
inline
float soft(const float val, const float lam) {
return std::copysign(std::fdim(std::fabs(val), lam), val);
}
typedef Map<Matrix<float, Dynamic, Dynamic, RowMajor>, Aligned> MapXAf;
bool has_nans(const MapXAf& X) {
for (int i = 0; i != X.rows(); ++i) {
for (int j = 0; j != X.cols(); ++j) {
if (std::isnan(X(i,j))) return true;
}
}
return false;
}
struct lasso_solver {
lasso_solver(const MapXAf& X, const MapXAf& Y, const MapXAf& W, MapXAf& B, const int max_iter, const float lam, const float eps)
:X(X)
,Y(Y)
,W(W)
,B(B)
,max_iter(max_iter)
,lam(lam)
,eps(eps)
{ }
void next_coords(int& i, int& j) {
++j;
if (j == B.cols()) {
j = 0;
++i;
if (i == B.rows()) {
i = 0;
}
}
}
int solve() {
Matrix<float, Dynamic, Dynamic, RowMajor> residuals;
int i = 0;
int j = -1;
        // sweep is whether we are doing a whole-B sweep (i.e., also visiting entries of B
        // that are currently zero, not just the active set)
// changed is whether something has changed
// We set this up to trigger computation of residuals and a sweep on
// the first iteration
bool sweep = false;
bool changed = false;
assert(!has_nans(X));
assert(!has_nans(Y));
assert(!has_nans(B));
for (int it = 0; it != max_iter; ++it) {
this->next_coords(i, j);
if (i == 0 && j == 0) {
if (sweep && !changed) {
return it;
}
if (!changed) {
sweep = true;
// Reset the residuals matrix to the best values to avoid
// drift due to successive rounding:
residuals = Y - B*X;
} else {
sweep = false;
}
changed = false;
}
if (!sweep && B(i,j) == 0.0) continue;
// We now set βᵢⱼ holding everything else fixed. This comes down
// to a very simple 1-dimensional problem.
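            // For reference, the update computed below is a 1-D least-squares step on the
            // current residuals followed by soft-thresholding:
            //     x2 = sum_k W(i,k)*X(j,k)^2,   xy = sum_k W(i,k)*X(j,k)*R(i,k)
            //     B(i,j) <- soft(B(i,j) + xy/x2, lam),  soft(v, lam) = sign(v)*max(|v| - lam, 0)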
// We remember the current value in order to compute update below
const float prev = B(i,j);
assert(Y.cols() == W.cols());
assert(Y.cols() == X.cols());
assert(Y.cols() == residuals.cols());
const float x2 = (W.row(i).array()*X.row(j).array() * X.row(j).array()).sum();
const float xy = (W.row(i).array()*X.row(j).array() * residuals.row(i).array()).sum();
const float raw_step = (x2 == 0.0 ? 0.0 : xy/x2);
const float best = soft(prev + raw_step, lam);
const float step = best - prev;
if (std::fabs(step) > eps) {
assert(!std::isnan(best));
B(i,j) = best;
residuals.row(i) -= step*X.row(j);
changed = true;
}
}
return max_iter;
}
std::mt19937 r;
const MapXAf& X;
const MapXAf& Y;
const MapXAf& W;
MapXAf& B;
const int max_iter;
const float lam;
const float eps;
};
MapXAf as_eigen(PyArrayObject* arr) {
assert(PyArray_EquivTypenums(PyArray_TYPE(arr), NPY_FLOAT32));
assert(PyArray_ISCARRAY_RO(arr));
return MapXAf(
static_cast<float*>(PyArray_DATA(arr)),
PyArray_DIM(arr, 0),
PyArray_DIM(arr, 1));
}
const char* errmsg = "INTERNAL ERROR";
PyObject* py_lasso(PyObject* self, PyObject* args) {
PyArrayObject* X;
PyArrayObject* Y;
PyArrayObject* W;
PyArrayObject* B;
int max_iter;
float lam;
float eps;
if (!PyArg_ParseTuple(args, "OOOOiff", &X, &Y, &W, &B, &max_iter, &lam, &eps)) return NULL;
if (!PyArray_Check(X) || !PyArray_ISCARRAY_RO(X) ||
!PyArray_Check(Y) || !PyArray_ISCARRAY_RO(Y) ||
!PyArray_Check(W) || !PyArray_ISCARRAY_RO(W) ||
!PyArray_Check(B) || !PyArray_ISCARRAY(B) ||
!PyArray_EquivTypenums(PyArray_TYPE(X), NPY_FLOAT32) ||
!PyArray_EquivTypenums(PyArray_TYPE(Y), NPY_FLOAT32) ||
!PyArray_EquivTypenums(PyArray_TYPE(W), NPY_FLOAT32) ||
!PyArray_EquivTypenums(PyArray_TYPE(B), NPY_FLOAT32)) {
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
MapXAf mX = as_eigen(X);
MapXAf mY = as_eigen(Y);
MapXAf mW = as_eigen(W);
MapXAf mB = as_eigen(B);
max_iter *= mB.size();
lasso_solver solver(mX, mY, mW, mB, max_iter, lam, eps);
const int iters = solver.solve();
return Py_BuildValue("i", iters);
}
PyMethodDef methods[] = {
{"lasso", py_lasso, METH_VARARGS , "Do NOT call directly.\n" },
{NULL, NULL,0,NULL},
};
const char * module_doc =
"Internal Module.\n"
"\n"
"Do NOT use directly!\n";
} // namespace
extern "C"
void init_lasso()
{
import_array();
(void)Py_InitModule3("_lasso", methods, module_doc);
}
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/lasso.py
# -*- coding: utf-8 -*-
import numpy as np
import _lasso
from .base import supervised_model
from milk.unsupervised import center
def lasso(X, Y, B=None, lam=1., max_iter=None, tol=None):
'''
    B = lasso(X, Y, B={np.zeros()}, lam={1.}, max_iter={1024}, tol={1e-5})
Solve LASSO Optimisation
B* = arg min_B ½/n || Y - BX ||₂² + λ||B||₁
where $n$ is the number of samples.
Milk uses coordinate descent, looping through the coordinates in order
(with an active set strategy to update only non-zero βs, if possible). The
problem is convex and the solution is guaranteed to be optimal (within
floating point accuracy).
Parameters
----------
X : ndarray
Design matrix
Y : ndarray
Matrix of outputs
B : ndarray, optional
Starting values for approximation. This can be used for a warm start if
you have an estimate of where the solution should be. If used, the
solution might be written in-place (if the array has the right format).
lam : float, optional
λ (default: 1.0)
max_iter : int, optional
Maximum nr of iterations (default: 1024)
tol : float, optional
Tolerance. Whenever a parameter is to be updated by a value smaller
than ``tolerance``, that is considered a null update. Be careful that
if the value is too small, performance will degrade horribly.
(default: 1e-5)
Returns
-------
B : ndarray
'''
X = np.ascontiguousarray(X, dtype=np.float32)
Y = np.ascontiguousarray(Y, dtype=np.float32)
if B is None:
B = np.zeros((Y.shape[0],X.shape[0]), np.float32)
else:
B = np.ascontiguousarray(B, dtype=np.float32)
if max_iter is None:
max_iter = 1024
if tol is None:
tol = 1e-5
if X.shape[0] != B.shape[1] or \
Y.shape[0] != B.shape[0] or \
X.shape[1] != Y.shape[1]:
raise ValueError('milk.supervised.lasso: Dimensions do not match')
if np.any(np.isnan(X)) or np.any(np.isnan(B)):
raise ValueError('milk.supervised.lasso: NaNs are only supported in the ``Y`` matrix')
W = np.ascontiguousarray(~np.isnan(Y), dtype=np.float32)
Y = np.nan_to_num(Y)
n = Y.size
_lasso.lasso(X, Y, W, B, max_iter, float(2*n*lam), float(tol))
return B
def lasso_walk(X, Y, B=None, nr_steps=None, start=None, step=None, tol=None, return_lams=False):
'''
Bs = lasso_walk(X, Y, B={np.zeros()}, nr_steps={64}, start={automatically inferred}, step={.9}, tol=None, return_lams=False)
Bs,lams = lasso_walk(X, Y, B={np.zeros()}, nr_steps={64}, start={automatically inferred}, step={.9}, tol=None, return_lams=True)
Repeatedly solve LASSO Optimisation
B* = arg min_B ½/n || Y - BX ||₂² + λ||B||₁
for different values of λ.
Parameters
----------
X : ndarray
Design matrix
Y : ndarray
Matrix of outputs
B : ndarray, optional
Starting values for approximation. This can be used for a warm start if
you have an estimate of where the solution should be.
start : float, optional
first λ to use (default is ``np.abs(Y).max()``)
nr_steps : int, optional
How many steps in the path (default is 64)
step : float, optional
Multiplicative step to take (default is 0.9)
tol : float, optional
This is the tolerance parameter. It is passed to the lasso function
unmodified.
return_lams : bool, optional
Whether to return the values of λ used (default: False)
Returns
-------
Bs : ndarray
'''
if nr_steps is None:
nr_steps = 64
if step is None:
step = .9
if start is None:
n = Y.size
start = 0.5/n*np.nanmax(np.abs(Y))*np.abs(X).max()
lam = start
lams = []
Bs = []
for i in xrange(nr_steps):
# The central idea is that each iteration is already "warm" and this
# should be faster than starting from zero each time
B = lasso(X, Y, B, lam=lam, tol=tol)
lams.append(lam)
Bs.append(B.copy())
lam *= step
if return_lams:
return np.array(Bs), np.array(lams)
return np.array(Bs)
def _dict_subset(mapping, keys):
return dict(
[(k,mapping[k]) for k in keys])
class lasso_model(supervised_model):
def __init__(self, betas, mean):
self.betas = betas
self.mean = mean
def retrain(self, features, labels, lam, **kwargs):
features, mean = center(features)
betas = lasso(features, labels, self.betas.copy(), lam=lam, **_dict_subset(kwargs, ['tol', 'max_iter']))
return lasso_model(betas, mean)
def apply(self, features):
return np.dot(self.betas, features) + self.mean
class lasso_learner(object):
def __init__(self, lam=1.0):
self.lam = lam
def train(self, features, labels, betas=None, **kwargs):
labels, mean = center(labels, axis=1)
betas = lasso(features, labels, betas, lam=self.lam)
return lasso_model(betas, mean)
def lasso_model_walk(X, Y, B=None, nr_steps=64, start=None, step=.9, tol=None, return_lams=False):
Y, mean = center(Y, axis=1)
Bs,lams = lasso_walk(X,Y, B, nr_steps, start, step, tol, return_lams=True)
models = [lasso_model(B, mean) for B in Bs]
if return_lams:
return models, lams
return models
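if __name__ == '__main__':
    # Minimal usage sketch for lasso() and lasso_walk() on synthetic data.
    # The sizes and lambda values below are arbitrary examples; running it
    # requires the compiled `_lasso` extension to be importable (i.e. a built
    # or installed milk tree).
    np.random.seed(0)
    X = np.random.rand(5, 40)                  # 5 features x 40 samples
    B_true = np.zeros((2, 5))
    B_true[0, 1] = 2.0
    B_true[1, 3] = -1.5
    Y = np.dot(B_true, X) + 0.01 * np.random.randn(2, 40)
    B = lasso(X, Y, lam=0.05)
    print 'estimated coefficients:'
    print np.round(B, 2)
    Bs, lams = lasso_walk(X, Y, nr_steps=5, return_lams=True)
    print 'lambdas visited:', np.round(lams, 4)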
<file_sep>/camera/gabor.py
from opencv.cv import *
from opencv.highgui import *
import math
import sys
src = 0
src_f = 0
image = 0
dest = 0
dest_mag = 0
kernelimg=0
big_kernelimg=0
kernel=0
kernel_size =21
pos_var = 50
pos_w = 5
pos_phase = 0
pos_psi = 90
def Process():
var = pos_var/10.0
w = pos_w/10.0
phase = pos_phase*CV_PI/180.0
psi = CV_PI*pos_psi/180.0
cvZero(kernel)
for x in range(-kernel_size/2+1,kernel_size/2+1):
for y in range(-kernel_size/2+1,kernel_size/2+1):
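            # Gabor kernel value: isotropic Gaussian envelope
            # exp(-(x^2 + y^2)/(2*var)) times a cosine carrier
            # cos(w*(x*cos(phase) + y*sin(phase)) + psi); kernelimg keeps a
            # copy rescaled from [-1, 1] to [0, 1] for display.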
kernel_val = math.exp( -((x*x)+(y*y))/(2*var))*math.cos( w*x*math.cos(phase)+w*y*math.sin(phase)+psi)
cvSet2D(kernel,y+kernel_size/2,x+kernel_size/2,cvScalar(kernel_val))
cvSet2D(kernelimg,y+kernel_size/2,x+kernel_size/2,cvScalar(kernel_val/2+0.5))
cvFilter2D(src_f, dest,kernel,cvPoint(-1,-1))
cvShowImage("Process window",dest)
cvResize(kernelimg,big_kernelimg)
cvShowImage("Kernel",big_kernelimg)
cvPow(dest,dest_mag,2)
cvShowImage("Mag",dest_mag)
def cb_var(pos):
global pos_var
if pos > 0:
pos_var = pos
else:
pos_var = 1
Process()
def cb_w(pos):
global pos_w
pos_w = pos
Process()
def cb_phase(pos):
global pos_phase
pos_phase = pos
Process()
def cb_psi(pos):
global pos_psi
pos_psi = pos
Process()
if __name__ == '__main__':
image = cvLoadImage(sys.argv[1],1);
print image.width
if kernel_size%2==0:
kernel_size += 1
kernel = cvCreateMat(kernel_size,kernel_size,CV_32FC1)
kernelimg = cvCreateImage(cvSize(kernel_size,kernel_size),IPL_DEPTH_32F,1)
big_kernelimg = cvCreateImage(cvSize(kernel_size*20,kernel_size*20),IPL_DEPTH_32F,1)
src = cvCreateImage(cvSize(image.width,image.height),IPL_DEPTH_8U,1)
src_f = cvCreateImage(cvSize(image.width,image.height),IPL_DEPTH_32F,1)
cvCvtColor(image,src,CV_BGR2GRAY)
cvConvertScale(src,src_f,1.0/255,0)
dest = cvCloneImage(src_f)
dest_mag = cvCloneImage(src_f)
cvNamedWindow("Process window",1)
cvNamedWindow("Kernel",1)
cvNamedWindow("Mag",1)
cvCreateTrackbar("Variance","Process window", pos_var,100,cb_var)
cvCreateTrackbar("Pulsation","Process window",pos_w ,30,cb_w)
cvCreateTrackbar("Phase","Process window",pos_phase ,180,cb_phase)
cvCreateTrackbar("Psi","Process window",pos_psi ,360,cb_psi)
Process()
cvWaitKey(0)
cvDestroyAllWindows()
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/unsupervised/nnmf/hoyer.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2010, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy as np
from ...utils import get_nprandom
__all__ = ['hoyer_sparse_nnmf']
def sp(s):
L2 = np.sqrt(np.dot(s,s))
L1 = np.abs(s).sum()
sn = np.sqrt(len(s))
return (sn-L1/L2)/(sn-1)
def _solve_alpha(s,m,L2):
sm = s-m
s2 = np.dot(s,s)
sm2 = np.dot(sm, sm)
m2 = np.dot(m, m)
dot = np.dot(m, sm)
alpha = (-dot + np.sqrt(dot**2 - sm2*(m2-L2**2)))/sm2
return alpha
def _project(x,L1,L2):
'''
Implement projection onto sparse space
'''
x = np.asanyarray(x)
n = len(x)
s = x + (L1 - x.sum())/n
Z = np.zeros(n,bool)
while True:
m = (~Z) * L1/(n-Z.sum())
alpha = _solve_alpha(s,m,L2)
s = m + alpha * (s - m)
negs = (s < 0)
if not negs.any():
return s
Z |= negs
s[Z] = 0
c = (s.sum() - L1)/(~Z).sum()
s -= c*(~Z)
def _L1for(s,x,L2):
'''
Solve for L1 in
s = [ sqrt(n) - L1/L2] / [sqrt(n) - 1]
'''
sn = np.sqrt(len(x))
return L2*(s+sn-s*sn)
def sparse_nnmf(V, r, sparsenessW=None, sparsenessH=None, max_iter=10000, R=None):
'''
W,H = hoyer.sparse_nnmf(V, r, sparsenessW = None, sparsenessH = None, max_iter=10000, R=None)
Implement sparse nonnegative matrix factorisation.
Parameters
----------
V : 2-D matrix
input feature matrix
r : integer
number of latent features
sparsenessW : double, optional
        sparseness constraint on W (default: no sparsity constraint)
    sparsenessH : double, optional
        sparseness constraint on H (default: no sparsity constraint)
max_iter : integer, optional
maximum nr of iterations (default: 10000)
R : integer, optional
source of randomness
Returns
-------
W : 2-ndarray
H : 2-ndarray
Reference
---------
"Non-negative Matrix Factorisation with Sparseness Constraints"
by <NAME>
in Journal of Machine Learning Research 5 (2004) 1457--1469
'''
n,m = V.shape
R = get_nprandom(R)
mu_W = .15
mu_H = .15
eps = 1e-8
W = R.standard_normal((n,r))**2
H = R.standard_normal((r,m))**2
def fix(X, sparseness):
for i in xrange(r):
row = X[i]
L2 = np.sqrt(np.dot(row, row))
row /= L2
X[i] = _project(row, _L1for(sparseness, row, 1.), 1.)
def fixW():
fix(W.T, sparsenessW)
def fixH():
fix(H, sparsenessH)
if sparsenessW is not None: fixW()
if sparsenessH is not None: fixH()
for i in xrange(max_iter):
if sparsenessW is not None:
W -= mu_W * np.dot(np.dot(W,H)-V,H.T)
fixW()
else:
updateW = np.dot(V,H.T)/(np.dot(W,np.dot(H,H.T))+eps)
W *= updateW
if sparsenessH is not None:
H -= mu_H * np.dot(W.T,np.dot(W,H)-V)
fixH()
else:
updateH = np.dot(W.T,V)/(np.dot(np.dot(W.T,W),H)+eps)
H *= updateH
return W,H
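if __name__ == '__main__':
    # Minimal usage sketch. Because this module is part of the milk package
    # (note the relative import above), run this block with milk installed,
    # e.g. by copying it into a separate script. The matrix sizes, the number
    # of latent features and the sparseness value are arbitrary examples.
    import numpy
    from milk.unsupervised.nnmf.hoyer import sparse_nnmf
    numpy.random.seed(1)
    V = numpy.random.rand(20, 30)              # non-negative input matrix
    W, H = sparse_nnmf(V, 5, sparsenessH=.7, max_iter=500, R=1)
    print 'reconstruction error:', numpy.sqrt(((V - numpy.dot(W, H)) ** 2).sum())
    print 'fraction of near-zero entries in H:', (numpy.abs(H) < 1e-3).mean()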
<file_sep>/navigation/terrain_id/terrain_id.py
#!/usr/bin/env python
import os
from img_processing_tools import *
import Image, ImageDraw
import time
import sys
import cv, cv2
from PIL import ImageStat
import numpy as np
def grab_frame_from_video(video):
frame = video.read()
#counter = 0
#while video.grab():
# counter += 1
# flag, frame = video.retrieve()
# if flag:
#gray_frm = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
#cv2.imwrite('frame_'+str(counter)+'.png',gray_frm)
#cv.ShowImage("Video",frame)
#cv2.imshow("Video",frame)
#cv.WaitKey(1000/fps)
return frame
def classifiy_section(img):
#class_ID: 1=grass, 2=non-mowable, 3=unknown
class_ID = 2
#print img[0], img[1], img[2]
#print "class_ID", class_ID
#if (img[0] > 65 and img[0] < 115) and (img[1] > 70 and img[1] < 135) and (img[2] < 85): class_ID = 1
#if img[0] > 82 and img[1] > 85 and img[2] > 72: class_ID = 2
rb_sum = img[0] + img[2]
g_sum = img[1]
if img[1] > 26: class_ID=1
#if img[1] < 15 and img[2] > 5 : class_ID=2
print "rgb: ", img, "class_ID: ", class_ID
return class_ID
def rgb2I3 (img):
"""Convert RGB color space to I3 color space
    @param img: PIL RGB image
    return a new PIL RGB image whose green channel holds I3 = (2*G - R - B) / 2
"""
img_width, img_height = img.size
#make a copy to return
returnimage = Image.new("RGB", (img_width, img_height))
imagearray = img.load()
for y in range(0, img_height, 1):
for x in range(0, img_width, 1):
rgb = imagearray[x, y]
i3 = ((2*rgb[1])-rgb[0]-rgb[2]) / 2
#print rgb, i3
returnimage.putpixel((x,y), (0,i3,0))
return returnimage
def subsection_image(pil_img, sections, visual):
sections = sections /4
print "sections= ", sections
fingerprint = []
# - -----accepts image and number of sections to divide the image into (resolution of fingerprint)
# ---------- returns a subsectioned image classified by terrain type
img_width, img_height = pil_img.size
print "image size = img_wdith= ", img_width, " img_height=", img_height, " sections=", sections
if visual == True:
cv_original_img1 = PILtoCV(pil_img)
cv.ShowImage("Original",cv_original_img1 )
cv.MoveWindow("Original", ((img_width)+100),50)
pil_img = rgb2I3 (pil_img)
#cv.WaitKey()
temp_img = pil_img.copy()
xsegs = img_width / sections
ysegs = img_height / sections
print "xsegs, ysegs = ", xsegs, ysegs
for yy in range(0, img_height-ysegs+1 , ysegs):
for xx in range(0, img_width-xsegs+1, xsegs):
print "Processing section =", xx, yy, xx+xsegs, yy+ysegs
box = (xx, yy, xx+xsegs, yy+ysegs)
print "box = ", box
cropped_img1 = pil_img.crop(box)
I3_mean = ImageStat.Stat(cropped_img1).mean
I3_mean_rgb = (int(I3_mean[0]), int(I3_mean[1]), int(I3_mean[2]))
print "I3_mean: ", I3_mean
sub_ID = classifiy_section(I3_mean_rgb)
fingerprint.append(sub_ID)
if visual == True:
cv_cropped_img1 = PILtoCV(cropped_img1)
cv.ShowImage("Fingerprint",cv_cropped_img1 )
cv.MoveWindow("Fingerprint", (img_width+100),50)
if sub_ID == 1: I3_mean_rgb = (50,150,50)
if sub_ID == 2: I3_mean_rgb = (150,150,150)
if sub_ID == 3: I3_mean_rgb = (0,0,200)
ImageDraw.Draw(pil_img).rectangle(box, (I3_mean_rgb))
cv_img = PILtoCV(pil_img)
cv.ShowImage("Image",cv_img)
cv.MoveWindow("Image", 50,50)
cv.WaitKey(50)
#time.sleep(.05)
#print "FINGERPRINT: ", fingerprint
#cv.WaitKey()
return fingerprint
###########################################################
def main(args):
# if len(sys.argv) < 4:
# print "******* Requires 3 image files of the same size."
# print "This program will return the angle at which the second is in relation to the first. ***"
# sys.exit(-1)
try:
#img1 = cv.LoadImage(sys.argv[1])
#frame = grab_frame(1)
video = cv2.VideoCapture(sys.argv[1])
img1 = np.array(grab_frame_from_video(video)[1])
cv.NamedWindow("Video",cv.CV_WINDOW_AUTOSIZE)
cv2.imshow("Video",img1)
#cv.ShowImage("Video", frame)
#cv.WaitKey(1000)
#img1 = Image.open(sys.argv[1])
#img1 = cv.CreateImage(cv.GetSize(frame), cv.IPL_DEPTH_8U, 1)
#img1 = CVtoGray(frame)
#cv.WaitKey()
#img1 = CV_enhance_edge(img1)
#cv.WaitKey()
#img2 = cv.LoadImage(sys.argv[1],cv.CV_LOAD_IMAGE_GRAYSCALE)
#img3 = cv.LoadImage(sys.argv[2],cv.CV_LOAD_IMAGE_GRAYSCALE)
except:
print "******* Could not open image/video file *******"
sys.exit(-1)
print len (sys.argv)
if len(sys.argv) == 2:
resolution = 32
else:
resolution = int(sys.argv[2])
img1 = array2image(img1)
image_fingerprint = np.array(subsection_image(img1, resolution, True))
print "FINGERPRINT: ",image_fingerprint
print 'len(image_fingerprint):', len(image_fingerprint)
#image_fingerprint.reshape(((image_fingerprint/2), 2))
print image_fingerprint
step = len(image_fingerprint)/ (resolution/4)
print "step =", step
a = []
b = []
for x in range (0, len(image_fingerprint), step):
#print x
for y in range(0, step/2):
#print x,y
a.append(image_fingerprint[(x+y)])
b.append(image_fingerprint[(x+(step/2)+y)])
#print a
#print b
direction = sum(a)-sum(b)
print "leftside-rightside:", direction
if direction > 0: print "grass on right"
if direction < 0: print "grass on left"
cv.WaitKey()
if __name__=="__main__":
while 1:
main(1)
#sys.exit(main(sys.argv))
<file_sep>/PhidgetsPython/Phidgets/Common.py
"""Copyright 2011 Phidgets Inc.
This work is licensed under the Creative Commons Attribution 2.5 Canada License.
To view a copy of this license, visit http://creativecommons.org/licenses/by/2.5/ca/
"""
__author__ = '<NAME>'
__version__ = '2.1.8'
__date__ = 'July 19 2011'
import sys
def prepOutput(output):
if sys.version_info < (3, 2):
return output.value
else:
return output.value.decode('utf-8')<file_sep>/navigation/lidar/lidar_visualization.py
from lidar_class import *
import time
from visual import *
#setup the window, view, etc
scene.forward = (1, -1, 0)
scene.background = (0.1, 0.1, 0.2)
scene.title = "ProtoX1 Lidar Distance"
class grid:
"""A graphical grid, with two level of subdivision.
The grid can be manipulated (moved, showed/hidden...) by acting on self.frame.
"""
def __init__(self, size=100, small_interval=10, big_interval=50):
self.frame = frame(pos=(0,0,0))
for i in range(-size, size+small_interval, small_interval):
if i %big_interval == 0:
c = color.gray(0.65)
else:
c = color.gray(0.25)
curve(frame=self.frame, pos=[(i,0,size),(i,0,-size)], color=c)
curve(frame=self.frame, pos=[(size,0,i),(-size,0,i)], color=c)
#grid where major intervals are 1m, minor intervals are 10cm
my_grid = grid(size=4000, small_interval = 100, big_interval=1000)
my_grid.frame.pos.y=-5
while True:
rate(60) # synchonous repaint at 60fps
if scene.kb.keys: # event waiting to be processed?
s = scene.kb.getkey() # get keyboard info
if s == "q": # stop motor
sys.exit(-1)
<file_sep>/gui/web_gui/sitetest2/cgi-bin/script.py~
#!/usr/bin/python
import sys
import time
import cgi, cgitb
import os
# Create instance of FieldStorage
form = cgi.FieldStorage()
# Get data from fields
#first_name = form.getvalue('first_name')
#last_name = form.getvalue('last_name')
#button1 = form.getvalue('Submit1')
#time.sleep(1)
#os.system("./test_gui1.py")
time.sleep(1)
print 'Content-type: text/html\n'
if "New Image" in form:
print "button New Image pressed"
time.sleep(1)
os.system("./robomow_logic.py new_image")
elif "Mowable" in form:
print "button Mowable pressed"
time.sleep(1)
os.system("./robomow_logic.py mowable")
elif "Non-Mowable" in form:
print "button Non-Mowable pressed"
time.sleep(1)
os.system("./robomow_logic.py non_mowable")
else:
print "Couldn't determine which button was pressed."
#time.sleep(1)
location = 'http://localhost:8004'
new_html_page = '''
<img src="temp.jpg" />
<form action="/cgi-bin/script.py" method="get">
<input type="submit" value="New Image" name="New Image"/>
<input type="submit" value="Mowable" name="Mowable"/>
<input type="submit" value="Non-Mowable" name="Non-Mowable"/>
</form>
'''
new_html_page = new_html_page + "Hello this is some new data"
#print new_html_page
#time.sleep(10)
f_handle = open('index.html', 'w')
f_handle.write(str(new_html_page))
f_handle.write(' ')
f_handle.close()
page = '''
<html>
<head>
<meta http-equiv="Refresh" content="1; URL='''+location+'''">
</head>
<body></body>
</html>'''
time.sleep(.5)
print page
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/adaboost.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
import numpy as np
from .normalise import normaliselabels
from .base import supervised_model
'''
AdaBoost
Simple implementation of Adaboost
Learner
-------
boost_learner
'''
__all__ = [
'boost_learner',
]
def _adaboost(features, labels, base, max_iters):
m = len(features)
D = np.ones(m, dtype=float)
D /= m
Y = np.ones(len(labels), dtype=float)
names = np.array([-1, +1])
Y = names[labels]
H = []
A = []
for t in xrange(max_iters):
Ht = base.train(features, labels, weights=D)
train_out = np.array(map(Ht.apply, features))
train_out = names[train_out.astype(int)]
Et = np.dot(D, (Y != train_out))
if Et > .5:
# early return
break
At = .5 * np.log((1. + Et) / (1. - Et))
D *= np.exp((-At) * Y * train_out)
D /= np.sum(D)
A.append(At)
H.append(Ht)
return H, A
class boost_model(supervised_model):
def __init__(self, H, A, names):
self.H = H
self.A = A
self.names = names
def apply(self, f):
v = sum((a*h.apply(f)) for h,a in zip(self.H, self.A))
v /= np.sum(self.A)
return self.names[v > .5]
class boost_learner(object):
'''
learner = boost_learner(weak_learner_type(), max_iters=100)
model = learner.train(features, labels)
test = model.apply(f)
AdaBoost learner
Attributes
----------
base : learner
Weak learner
max_iters : integer
Nr of iterations (default: 100)
'''
def __init__(self, base, max_iters=100):
self.base = base
self.max_iters = max_iters
def train(self, features, labels, normalisedlabels=False, names=(0,1), weights=None, **kwargs):
if not normalisedlabels:
labels,names = normaliselabels(labels)
H,A = _adaboost(features, labels, self.base, self.max_iters)
return boost_model(H, A, names)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/_perceptron.cpp
// Copyright (C) 2011, <NAME> <<EMAIL>>
// License: MIT
#include <iostream>
#include <memory>
#include <cmath>
#include <cassert>
extern "C" {
#include <Python.h>
#include <numpy/ndarrayobject.h>
}
namespace {
template <typename T>
int perceptron(PyArrayObject* data_arr, const long* labels, PyArrayObject* weights_arr, double eta) {
const T* data = reinterpret_cast<T*>(PyArray_DATA(data_arr));
T* weights = reinterpret_cast<T*>(PyArray_DATA(weights_arr));
const int N0 = PyArray_DIM(data_arr, 0);
const int N1 = PyArray_DIM(data_arr, 1);
int nr_errors = 0;
for (int i = 0; i != N0; ++i, data += N1, ++labels) {
T val = weights[0];
for (int j = 0; j != N1; ++j) {
val += weights[j+1] * data[j];
}
int ell = (val > 0);
if (ell != *labels) {
int pm = (*labels ? +1 : -1);
++nr_errors;
T error = pm * eta * std::abs(pm-val);
weights[0] += error;
for (int j = 0; j != N1; ++j) {
weights[j+1] += error*data[j];
}
}
}
return nr_errors;
}
PyObject* py_perceptron(PyObject* self, PyObject* args) {
const char* errmsg = "Arguments were not what was expected for perceptron.\n"
"This is an internal function: Do not call directly unless you know exactly what you're doing.\n";
PyArrayObject* data;
PyArrayObject* labels;
PyArrayObject* weights;
double eta;
if (!PyArg_ParseTuple(args, "OOOd", &data, &labels, &weights, &eta)) {
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
if (!PyArray_Check(data) || !PyArray_ISCONTIGUOUS(data) ||
!PyArray_Check(weights) || !PyArray_ISCONTIGUOUS(weights) ||
!PyArray_Check(labels) || !PyArray_ISCONTIGUOUS(labels) || !PyArray_EquivTypenums(PyArray_TYPE(labels), NPY_LONG) ||
PyArray_TYPE(data) != PyArray_TYPE(weights)||
PyArray_NDIM(data) != 2 || PyArray_NDIM(weights) != 1 || PyArray_DIM(data,1) + 1 != PyArray_DIM(weights,0)) {
PyErr_SetString(PyExc_RuntimeError,errmsg);
return 0;
}
int nr_errors;
if (PyArray_TYPE(data) == NPY_FLOAT) {
nr_errors = perceptron<float>(data, reinterpret_cast<const long*>(PyArray_DATA(labels)), weights, eta);
} else if (PyArray_TYPE(data) == NPY_DOUBLE) {
nr_errors = perceptron<double>(data, reinterpret_cast<const long*>(PyArray_DATA(labels)), weights, eta);
} else {
PyErr_SetString(PyExc_RuntimeError, errmsg);
return 0;
}
return PyLong_FromLong(nr_errors);
}
PyMethodDef methods[] = {
{"perceptron", py_perceptron, METH_VARARGS , "Do NOT call directly.\n" },
{NULL, NULL,0,NULL},
};
const char * module_doc =
"Internal Module.\n"
"\n"
"Do NOT use directly!\n";
} // namespace
extern "C"
void init_perceptron()
{
import_array();
(void)Py_InitModule3("_perceptron", methods, module_doc);
}
<file_sep>/ftdi/build/pyftdi/build/lib.linux-i686-2.7/pyftdi/serialext/expander.py
# Copyright (c) 2008-2011, Neotion
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the Neotion nor the names of its contributors may
# be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL NEOTION BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
# OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
import os
import sys
__all__ = ['SerialExpander', 'SerialExpanderError']
class SerialExpanderError(Exception):
"""Raised when the expander is in trouble"""
pass
class SerialExpander(object):
"""Provides extension classes compatible with PySerial module"
Create a class on request, to prevent useless dependency on
non-standard PySerial module"""
@staticmethod
def serialclass(device, use_logger=False):
"""Provide a pyserial class that supports high-speed serial port on
Apple computer (usually through a USB-RS232 adaptor key)"""
try:
import serial
except ImportError:
raise SerialExpanderError("Python serial module not installed")
# default serial class
type_ = serial.Serial
type_.backend = 'serial'
if device.startswith('ftdi://'):
# for now, assume the USB device is a FTDI device
# a USB dispatcher should be implemented here
from ftdiext import SerialFtdi, BACKEND
type_ = SerialFtdi
type_.backend = BACKEND
elif device.startswith('prolific://'):
# for now, assume the USB device is a Prolific device
# a USB dispatcher should be implemented here
from plext import SerialProlific, BACKEND
type_ = SerialProlific
type_.backend = BACKEND
elif os.path.exists(device):
import stat
from socketext import SerialSocket
if stat.S_ISSOCK(os.stat(device)[0]):
type_ = SerialSocket
type_.backend = 'socket'
if type_.backend == 'serial' and sys.platform.lower() in ('darwin'):
# hack for Mac OS X hosts: the underlying termios system library
# cannot cope with baudrates > 230kbps and pyserial << 9.7
version = os.uname()[2].split('.')
if 'version' not in serial.__dict__ or \
serial.version[0] < 9 or serial.version[1] < 7:
# Tiger or above can support arbitrary serial speeds
if int(version[0]) >= 8:
# remove all speeds not supported with TERMIOS
# so that pyserial never attempts to use them directly
for b in serial.baudrate_constants.keys():
if b > 230400:
del serial.baudrate_constants[b]
from darwinext import SerialDarwin
class SerialDarwinAdapter(SerialDarwin, type_):
backend = type_.backend
type_ = SerialDarwinAdapter
if use_logger:
from loggerext import SerialLoggerPort
class SerialLoggerAdapter(SerialLoggerPort, type_):
backend = type_.backend
type_ = SerialLoggerAdapter
return type_
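if __name__ == '__main__':
    # Minimal usage sketch. serialclass() only selects and returns a
    # pyserial-compatible class for the given device string; the URL and the
    # commented-out open parameters below are example values. Running this
    # needs pyserial and, for 'ftdi://' URLs, the pyftdi backend modules.
    device = 'ftdi://ftdi:2232/1'
    serial_class = SerialExpander.serialclass(device)
    print 'selected backend:', serial_class.backend
    # port = serial_class(port=device, baudrate=115200, timeout=1)
    # port.write('hello')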
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/__init__.py
try:
import milksets
del milksets
except ImportError:
import sys
sys.stderr.write('''\
Could not import milksets.
This companion package does not provide any functionality, but
is necessary for some of the testing.''')
def run():
import nose
from os import path
currentdir = path.dirname(__file__)
updir = path.join(currentdir, '..')
nose.run('milk', argv=['', '--exe', '-w', updir])
<file_sep>/ftdi/build/pyftdi/build/lib.linux-i686-2.7/pyftdi/serialext/README.rst
Examples
--------
Serial port
...........
``serialext/tests/pyterm.py`` is a simple serial terminal that can be used
to test the serial port feature.::
Usage: pyterm.py [options]
Pure python simple serial terminal
Options:
-h, --help show this help message and exit
-d, --debug enable debug mode
-f, --fullmode use full terminal mode, exit with [Ctrl]+A
-p DEVICE, --port=DEVICE
serial port device name (list available ports with
'ftdi:///?' or 'prolific:///?')
-b BAUDRATE, --baudrate=BAUDRATE
serial port baudrate
-r RESET, --reset=RESET
HW reset on DTR line
-o LOGFILE, --logfile=LOGFILE
path to the log file
If the PyFtdi module is not yet installed and ``pyterm.py`` is run from the
archive directory, ``PYTHONPATH`` should be defined to the current directory::
PYTHONPATH=$PWD ./serialext/tests/pyterm.py -p ftdi:///?
The above command lists all the available FTDI device ports.
To start up a serial terminal session, use the ``-p`` option switch to select
the proper port, for example::
PYTHONPATH=$PWD ./serialext/tests/pyterm.py -p ftdi://ftdi:2232/1
Quick step-by-step instruction guide
....................................
Shell commands::
pip install virtualenv
virtualenv ~/.pyusb
# one out of the two following lines:
~/.pyusb/bin/pip install pyserial
  ~/.pyusb/bin/easy_install pyserial
wget http://pypi.python.org/packages/source/p/pyusb/pyusb-1.0.0a2.tar.gz
tar xzvf pyusb-1.0.0a2.tar.gz
cd pyusb-1.0.0a2
~/.pyusb/bin/python setup.py install
PYTHONPATH=. ~/.pyusb/bin/python serialext/tests/pyterm.py -p ftdi:///?
Note that if there's only one FTDI device connected to the host, the FTDI URL
should be as simple as ``ftdi:///n``, where n is the FTDI UART port (starting
from 1).
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/test_utils.py
import numpy as np
from milk.utils.utils import get_nprandom, get_pyrandom
def test_nprandom():
assert get_nprandom(None).rand() != get_nprandom(None).rand()
assert get_nprandom(1).rand() != get_nprandom(2).rand()
assert get_nprandom(1).rand() == get_nprandom(1).rand()
r = get_nprandom(1)
assert get_nprandom(r).rand() != r.rand()
def test_pyrandom():
assert get_pyrandom(None).random() != get_pyrandom(None).random()
assert get_pyrandom(1).random() != get_pyrandom(2).random()
assert get_pyrandom(1).random() == get_pyrandom(1).random()
r = get_pyrandom(1)
assert get_pyrandom(r).random() != r.random()
def test_cross_random():
assert get_pyrandom(get_nprandom(1)).random() == get_pyrandom(get_nprandom(1)).random()
assert get_nprandom(get_pyrandom(1)).rand() == get_nprandom(get_pyrandom(1)).rand()
def test_recursive():
def recurse(f):
R = f(None)
assert f(R) is R
yield recurse, get_pyrandom
yield recurse, get_nprandom
<file_sep>/camera/snap_from_video.py
import cv, cv2
import sys
fps = 15
video = cv2.VideoCapture(sys.argv[1])
cv.NamedWindow("Video",cv.CV_WINDOW_AUTOSIZE)
counter = 0
while video.grab():
counter += 1
flag, frame = video.retrieve()
if flag:
#gray_frm = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
#cv2.imwrite('frame_'+str(counter)+'.png',gray_frm)
#cv.ShowImage("Video",frame)
cv2.imshow("Video",frame)
cv.WaitKey(1000/fps)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/supervised/classifier.py
# -*- coding: utf-8 -*-
# Copyright (C) 2008-2012, <NAME> <<EMAIL>>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from __future__ import division
import numpy as np
from .normalise import normaliselabels
from .base import supervised_model
__all__ = ['normaliselabels', 'ctransforms']
class threshold_model(object):
'''
threshold_model
Attributes
----------
threshold : float
threshold value
'''
def __init__(self, threshold=.5):
        self.threshold = threshold
def apply(self, f):
return f >= self.threshold
class fixed_threshold_learner(object):
def __init__(self, threshold=.5):
self.threshold = threshold
def train(self, features, labels, **kwargs):
return threshold_model(self.threshold)
class ctransforms_model(supervised_model):
'''
model = ctransforms_model(models)
A model that consists of a series of transformations.
See Also
--------
ctransforms
'''
def __init__(self, models):
self.models = models
def apply_many(self, features):
for m in self.models:
features = m.apply_many(features)
return features
def apply(self,features):
for T in self.models:
features = T.apply(features)
return features
class ctransforms(object):
'''
ctransf = ctransforms(c0, c1, c2, ...)
Concatenate transforms.
'''
def __init__(self,*args):
self.transforms = args
def train(self, features, labels, **kwargs):
models = []
model = None
for T in self.transforms:
if model is not None:
features = np.array([model.apply(f) for f in features])
model = T.train(features, labels, **kwargs)
models.append(model)
return ctransforms_model(models)
def set_option(self, opt, val):
idx, opt = opt
self.transforms[idx].set_option(opt,val)
<file_sep>/gui2.py
#!/usr/bin/python
# communicate.py
import wx
import os
ID_NEW = 1
ID_RENAME = 2
ID_CLEAR = 3
ID_DELETE = 4
ID_BROWSE = 5
class Communicate(wx.Frame):
def __init__(self, parent, id, title):
wx.Frame.__init__(self, parent, id, title, size=(480, 400))
self.PhotoMaxSize = 240
panel = wx.Panel(self, -1)
#hbox = wx.BoxSizer(wx.HORIZONTAL)
#hbox = wx.BoxSizer()
self.listbox = wx.ListBox(panel, -1)
#hbox.Add(self.listbox, 1, wx.EXPAND | wx.ALL, 50)
#hbox.Add(self.listbox, 1)
instructions = 'Browse for an image'
img = wx.EmptyImage(240,240)
self.imageCtrl = wx.StaticBitmap(panel, wx.ID_ANY,
wx.BitmapFromImage(img))
instructLbl = wx.StaticText(panel, label=instructions)
self.photoTxt = wx.TextCtrl(panel, size=(200,-1))
#hbox.Add(leftPanel, 1, wx.EXPAND | wx.ALL, 5)
#hbox.Add(self.rightPanel, 1, wx.EXPAND | wx.ALL, 5)
btnPanel = wx.Panel(panel, -1)
#vbox = wx.BoxSizer(wx.HORIZONTAL)
vbox = wx.BoxSizer(wx.VERTICAL)
new = wx.Button(btnPanel, ID_NEW, 'New', size=(90, 30))
ren = wx.Button(btnPanel, ID_RENAME, 'Rename', size=(90, 30))
dlt = wx.Button(btnPanel, ID_DELETE, 'Delete', size=(90, 30))
clr = wx.Button(btnPanel, ID_CLEAR, 'Clear', size=(90, 30))
#browseBtn = wx.Button(btnPanel, ID_BROWSE, 'Browse', size=(90,30))
self.Bind(wx.EVT_BUTTON, self.NewItem, id=ID_NEW)
self.Bind(wx.EVT_BUTTON, self.OnRename, id=ID_RENAME)
self.Bind(wx.EVT_BUTTON, self.OnDelete, id=ID_DELETE)
self.Bind(wx.EVT_BUTTON, self.OnClear, id=ID_CLEAR)
#self.Bind(wx.EVT_BUTTON, self.onBrowse)
self.Bind(wx.EVT_LISTBOX_DCLICK, self.OnRename)
vbox.Add((-1, 20))
vbox.Add(new)
vbox.Add(ren, 0, wx.TOP, 5)
vbox.Add(dlt, 0, wx.TOP, 5)
vbox.Add(clr, 0, wx.TOP, 5)
#vbox.Add(browseBtn, 0, wx.TOP, 5)
vbox.Add(self.photoTxt, 0, wx.TOP, 5)
vbox.Add(self.imageCtrl, 0, wx.ALL, 5)
vbox.Add(self.listbox, 1)
btnPanel.SetSizer(vbox)
#hbox.Add(btnPanel, .1, wx.EXPAND | wx.RIGHT, 120)
panel.SetSizer(vbox)
#self.createWidgets()
#self.frame.Show()
#self.mainSizer = wx.BoxSizer(wx.VERTICAL)
#self.sizer = wx.BoxSizer(wx.HORIZONTAL)
#vbox.Add(wx.StaticLine(panel, wx.ID_ANY),
# 0, wx.ALL|wx.EXPAND, 5)
#self.mainSizer.Add(instructLbl, 0, wx.ALL, 5)
#
#self.sizer.Add(self.photoTxt, 100, wx.ALL, 5)
#self.sizer.Add(browseBtn, 0, wx.ALL, 5)
#self.mainSizer.Add(self.sizer, 0, wx.ALL, 5)
#panel.SetSizer(self.mainSizer)
#self.mainSizer.Fit(self.frame)
#self.panel.Layout()
self.Centre()
self.Show(True)
def NewItem(self, event):
text = wx.GetTextFromUser('Enter a new item', 'Insert dialog')
if text != '':
self.listbox.Append(text)
def OnRename(self, event):
sel = self.listbox.GetSelection()
text = self.listbox.GetString(sel)
renamed = wx.GetTextFromUser('Rename item', 'Rename dialog', text)
if renamed != '':
self.listbox.Delete(sel)
self.listbox.Insert(renamed, sel)
def OnDelete(self, event):
sel = self.listbox.GetSelection()
if sel != -1:
self.listbox.Delete(sel)
def OnClear(self, event):
self.listbox.Clear()
def onBrowse(self, event):
"""
Browse for file
"""
wildcard = "JPEG files (*.jpg)|*.jpg"
dialog = wx.FileDialog(None, "Choose a file",
wildcard=wildcard,
style=wx.OPEN)
if dialog.ShowModal() == wx.ID_OK:
self.photoTxt.SetValue(dialog.GetPath())
print "file name gotten from dialog"
self.listbox.Append(dialog.GetPath())
dialog.Destroy()
self.onView()
def onView(self):
filepath = self.photoTxt.GetValue()
img = wx.Image(filepath, wx.BITMAP_TYPE_ANY)
# scale the image, preserving the aspect ratio
W = img.GetWidth()
H = img.GetHeight()
if W > H:
NewW = self.PhotoMaxSize
NewH = self.PhotoMaxSize * H / W
else:
NewH = self.PhotoMaxSize
NewW = self.PhotoMaxSize * W / H
img = img.Scale(NewW,NewH)
self.imageCtrl.SetBitmap(wx.BitmapFromImage(img))
#self.panel.Refresh()
app = wx.App()
#ListBox(None, -1, 'ListBox')
Communicate(None, -1, 'widgets communicate')
app.MainLoop()
<file_sep>/robomowGUI.py
#!/usr/bin/python
# communicate.py
import wx
import os
ID_NEW = 1
ID_RENAME = 2
ID_CLEAR = 3
ID_DELETE = 4
class PhotoCtrl(wx.App):
def __init__(self, redirect=False, filename=None):
wx.App.__init__(self, redirect, filename)
self.frame = wx.Frame(None, title='Photo Control')
self.panel = wx.Panel(self.frame)
self.PhotoMaxSize = 240
#panel = wx.Panel(self, -1)
#self.listbox = wx.ListBox(self.panel, -1)
#panel = wx.Panel(self, -1)
#self.rightPanel = RightPanel(panel, -1)
#leftPanel = LeftPanel(panel, -1)
#self.listbox = ListBox(panel, -1)
self.createWidgets()
self.frame.Show()
def createWidgets(self):
instructions = 'Browse for an image'
img = wx.EmptyImage(240,240)
self.imageCtrl = wx.StaticBitmap(self.panel, wx.ID_ANY,
wx.BitmapFromImage(img))
instructLbl = wx.StaticText(self.panel, label=instructions)
self.photoTxt = wx.TextCtrl(self.panel, size=(200,-1))
browseBtn = wx.Button(self.panel, label='Browse')
browseBtn.Bind(wx.EVT_BUTTON, self.onBrowse)
self.mainSizer = wx.BoxSizer(wx.VERTICAL)
self.sizer = wx.BoxSizer(wx.HORIZONTAL)
self.mainSizer.Add(wx.StaticLine(self.panel, wx.ID_ANY),
0, wx.ALL|wx.EXPAND, 5)
self.mainSizer.Add(instructLbl, 0, wx.ALL, 5)
self.mainSizer.Add(self.imageCtrl, 0, wx.ALL, 5)
self.sizer.Add(self.photoTxt, 0, wx.ALL, 5)
self.sizer.Add(browseBtn, 0, wx.ALL, 5)
self.mainSizer.Add(self.sizer, 0, wx.ALL, 5)
self.panel.SetSizer(self.mainSizer)
self.mainSizer.Fit(self.frame)
self.panel.Layout()
def onBrowse(self, event):
"""
Browse for file
"""
wildcard = "JPEG files (*.jpg)|*.jpg"
dialog = wx.FileDialog(None, "Choose a file",
wildcard=wildcard,
style=wx.OPEN)
if dialog.ShowModal() == wx.ID_OK:
self.photoTxt.SetValue(dialog.GetPath())
#print self
ListBox.hbox.listbox.Append(dialog.GetPath())
dialog.Destroy()
self.onView()
def onView(self):
filepath = self.photoTxt.GetValue()
img = wx.Image(filepath, wx.BITMAP_TYPE_ANY)
# scale the image, preserving the aspect ratio
W = img.GetWidth()
H = img.GetHeight()
if W > H:
NewW = self.PhotoMaxSize
NewH = self.PhotoMaxSize * H / W
else:
NewH = self.PhotoMaxSize
NewW = self.PhotoMaxSize * W / H
img = img.Scale(NewW,NewH)
self.imageCtrl.SetBitmap(wx.BitmapFromImage(img))
self.panel.Refresh()
class LeftPanel(wx.Panel):
def __init__(self, parent, id):
wx.Panel.__init__(self, parent, id, style=wx.BORDER_SUNKEN)
self.text = parent.GetParent().rightPanel.text
button1 = wx.Button(self, -1, '+', (10, 10))
button2 = wx.Button(self, -1, '-', (10, 60))
self.Bind(wx.EVT_BUTTON, self.OnPlus, id=button1.GetId())
self.Bind(wx.EVT_BUTTON, self.OnMinus, id=button2.GetId())
def OnPlus(self, event):
value = int(self.text.GetLabel())
value = value + 1
ListBox.listbox.Append("ko")
self.text.SetLabel(str(value))
def OnMinus(self, event):
value = int(self.text.GetLabel())
value = value - 1
self.text.SetLabel(str(value))
class RightPanel(wx.Panel):
def __init__(self, parent, id):
wx.Panel.__init__(self, parent, id, style=wx.BORDER_SUNKEN)
self.text = wx.StaticText(self, -1, '0', (40, 60))
class Communicate(wx.Frame):
def __init__(self, parent, id, title):
wx.Frame.__init__(self, parent, id, title, size=(280, 200))
panel = wx.Panel(self, -1)
self.rightPanel = RightPanel(panel, -1)
leftPanel = LeftPanel(panel, -1)
hbox = wx.BoxSizer()
hbox.Add(leftPanel, 1, wx.EXPAND | wx.ALL, 5)
hbox.Add(self.rightPanel, 1, wx.EXPAND | wx.ALL, 5)
panel.SetSizer(hbox)
self.Centre()
self.Show(True)
class ListBox(wx.Frame):
def __init__(self, parent, id, title):
wx.Frame.__init__(self, parent, id, title, size=(350, 220))
panel = wx.Panel(self, -1)
hbox = wx.BoxSizer(wx.HORIZONTAL)
self.listbox = wx.ListBox(panel, -1)
hbox.Add(self.listbox, 1, wx.EXPAND | wx.ALL, 20)
btnPanel = wx.Panel(panel, -1)
vbox = wx.BoxSizer(wx.VERTICAL)
new = wx.Button(btnPanel, ID_NEW, 'New', size=(90, 30))
ren = wx.Button(btnPanel, ID_RENAME, 'Rename', size=(90, 30))
dlt = wx.Button(btnPanel, ID_DELETE, 'Delete', size=(90, 30))
clr = wx.Button(btnPanel, ID_CLEAR, 'Clear', size=(90, 30))
self.Bind(wx.EVT_BUTTON, self.NewItem, id=ID_NEW)
self.Bind(wx.EVT_BUTTON, self.OnRename, id=ID_RENAME)
self.Bind(wx.EVT_BUTTON, self.OnDelete, id=ID_DELETE)
self.Bind(wx.EVT_BUTTON, self.OnClear, id=ID_CLEAR)
self.Bind(wx.EVT_LISTBOX_DCLICK, self.OnRename)
vbox.Add((-1, 20))
vbox.Add(new)
vbox.Add(ren, 0, wx.TOP, 5)
vbox.Add(dlt, 0, wx.TOP, 5)
vbox.Add(clr, 0, wx.TOP, 5)
btnPanel.SetSizer(vbox)
hbox.Add(btnPanel, 0.6, wx.EXPAND | wx.RIGHT, 20)
panel.SetSizer(hbox)
self.Centre()
self.Show(True)
def NewItem(self, event):
text = wx.GetTextFromUser('Enter a new item', 'Insert dialog')
if text != '':
#print self.listbox.Append(text)
self.listbox.Append(text)
if text == '':
#print self.listbox.Append(text)
self.listbox.Append("hi")
def OnRename(self, event):
sel = self.listbox.GetSelection()
text = self.listbox.GetString(sel)
renamed = wx.GetTextFromUser('Rename item', 'Rename dialog', text)
if renamed != '':
self.listbox.Delete(sel)
self.listbox.Insert(renamed, sel)
def OnDelete(self, event):
sel = self.listbox.GetSelection()
if sel != -1:
self.listbox.Delete(sel)
def OnClear(self, event):
self.listbox.Clear()
#app = wx.App()
#
app = PhotoCtrl()
ListBox(None, -1, 'ListBox')
Communicate(None, -1, 'widgets communicate')
app.MainLoop()
<file_sep>/ftdi/build/pyftdi/pyftdi/pyftdi/spi.py
# Copyright (c) 2010-2011, <NAME> <<EMAIL>>
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the Neotion nor the names of its contributors may
# be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL NEOTION BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
# OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
import struct
import time
from array import array as Array
from pyftdi.pyftdi.ftdi import Ftdi
__all__ = ['SpiPort', 'SpiController']
class SpiIOError(IOError):
"""SPI I/O error"""
class SpiPort(object):
"""SPI port"""
def __init__(self, controller, cs_cmd):
"""Instanciate a new SPI port.
An SPI port is never instanciated directly.
Use SpiController.get_port() method to obtain an SPI port"""
self._controller = controller
self._cs_cmd = cs_cmd
self._frequency = self._controller.frequency
def exchange(self, out='', readlen=0):
"""Perform a half-duplex transaction with the SPI slave"""
return self._controller._exchange(self._frequency, self._cs_cmd,
out, readlen)
def flush(self):
"""Force the flush of the HW FIFOs"""
self._controller._flush()
def set_frequency(self, frequency):
"""Change SPI bus frequency"""
self._frequency = min(frequency, self._controller.frequency_max)
@property
def frequency(self):
"""Return the current SPI bus block"""
return self._frequency
class SpiController(object):
"""SPI master"""
SCK_BIT = 0x01
DO_BIT = 0x02
DI_BIT = 0x04
CS_BIT = 0x08
PAYLOAD_MAX_LENGTH = 0x10000 # 16 bits max
def __init__(self, silent_clock=False, cs_count=4):
"""Instanciate a SpiController.
silent_clock should be set to avoid clocking out SCLK when all /CS
signals are released. This clock beat is used to enforce a delay
between /CS signal activation. When weird SPI devices are used,
SCLK beating may cause trouble. In this case, silent_clock should
be set but beware that SCLK line should be fitted with a pull-down
resistor, as SCLK is high-Z during this short period of time.
cs_count is the number of /CS lines (one per device to drive on the
SPI bus)
"""
self._ftdi = Ftdi()
self._cs_bits = ((SpiController.CS_BIT << cs_count) - 1) & \
~(SpiController.CS_BIT - 1)
self._ports = [None] * cs_count
self._direction = self._cs_bits | \
SpiController.DO_BIT | \
SpiController.SCK_BIT
self._cs_high = Array('B')
if silent_clock:
# Set SCLK as input to avoid emitting clock beats
self._cs_high.extend([Ftdi.SET_BITS_LOW, self._cs_bits,
self._direction&~SpiController.SCK_BIT])
# /CS to SCLK delay, use 8 clock cycles as a HW tempo
self._cs_high.extend([Ftdi.WRITE_BITS_TMS_NVE, 8-1, 0xff])
# Restore idle state
self._cs_high.extend([Ftdi.SET_BITS_LOW, self._cs_bits,
self._direction])
self._immediate = Array('B', [Ftdi.SEND_IMMEDIATE])
self._frequency = 0.0
def configure(self, vendor, product, interface, frequency=6.0E6):
"""Configure the FTDI interface as a SPI master"""
self._frequency = \
self._ftdi.open_mpsse(vendor, product, interface,
direction=self._direction,
initial=self._cs_bits, # /CS all high
frequency=frequency)
def terminate(self):
"""Close the FTDI interface"""
if self._ftdi:
self._ftdi.close()
self._ftdi = None
def get_port(self, cs):
"""Obtain a SPI port to drive a SPI device selected by cs"""
if not self._ftdi:
raise SpiIOError("FTDI controller not initialized")
if cs >= len(self._ports):
raise SpiIOError("No such SPI port")
if not self._ports[cs]:
cs_state = 0xFF & ~((SpiController.CS_BIT<<cs) |
SpiController.SCK_BIT |
SpiController.DO_BIT)
cs_cmd = Array('B', [Ftdi.SET_BITS_LOW,
cs_state,
self._direction])
self._ports[cs] = SpiPort(self, cs_cmd)
self._flush()
return self._ports[cs]
@property
def frequency_max(self):
"""Returns the maximum SPI clock"""
return self._ftdi.frequency_max
@property
def frequency(self):
"""Returns the current SPI clock"""
return self._frequency
def _exchange(self, frequency, cs_cmd, out, readlen):
"""Perform a half-duplex transaction with the SPI slave"""
if not self._ftdi:
raise SpiIOError("FTDI controller not initialized")
if len(out) > SpiController.PAYLOAD_MAX_LENGTH:
raise SpiIOError("Output payload is too large")
if readlen > SpiController.PAYLOAD_MAX_LENGTH:
raise SpiIOError("Input payload is too large")
if self._frequency != frequency:
freq = self._ftdi.set_frequency(frequency)
# store the requested value, not the actual one (best effort)
self._frequency = frequency
write_cmd = struct.pack('<BH', Ftdi.WRITE_BYTES_NVE_MSB, len(out)-1)
if readlen:
read_cmd = struct.pack('<BH', Ftdi.READ_BYTES_NVE_MSB, readlen-1)
cmd = Array('B', cs_cmd)
cmd.fromstring(write_cmd)
cmd.extend(out)
cmd.fromstring(read_cmd)
cmd.extend(self._immediate)
cmd.extend(self._cs_high)
self._ftdi.write_data(cmd)
# USB read cycle may occur before the FTDI device has actually
# sent the data, so try to read more than once if no data is
# actually received
data = self._ftdi.read_data_bytes(readlen, 4)
else:
cmd = Array('B', cs_cmd)
cmd.fromstring(write_cmd)
cmd.extend(out)
cmd.extend(self._cs_high)
self._ftdi.write_data(cmd)
data = Array('B')
return data
def _flush(self):
"""Flush the HW FIFOs"""
self._ftdi.write_data(self._immediate)
self._ftdi.purge_buffers()
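if __name__ == '__main__':
    # Minimal usage sketch wiring the calls together in the documented order.
    # The vendor/product IDs (0x403/0x6010, the usual FT2232 identifiers),
    # the interface number, the /CS index and the 0x9f command byte (JEDEC ID
    # read, assuming an SPI flash on /CS 0) are example values; an FTDI
    # device must be attached for configure() to succeed.
    ctrl = SpiController(cs_count=1)
    ctrl.configure(0x403, 0x6010, 1, frequency=1.0E6)
    flash = ctrl.get_port(0)
    jedec_id = flash.exchange(Array('B', [0x9f]), 3)
    print 'JEDEC ID:', ' '.join('%02x' % b for b in jedec_id)
    ctrl.terminate()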
<file_sep>/svm.py
import numpy as np
import mlpy
#from numpy import *
#from mlpy import *
xtr = np.array([[7.0, 2.0, 3.0, 1.0], # first sample
[1.0, 2.0, 3.0, 2.0], # second sample
[2.0, 2.0, 2.0, 1.0], # third sample#
[2.0, 4.0, 2.0, 6.0],
[2.0, 2.0, 7.0, 9.0]])
print xtr
print np.size(xtr), np.shape(xtr), np.ndim(xtr), xtr.dtype
ytr = np.array([1, 2, 3, 1, 2]) # classes
print ytr
print np.size(ytr), np.shape(ytr), np.ndim(ytr), xtr.dtype
#Save and read data from disk
print mlpy.data_tofile('data_example.dat', xtr, ytr, sep=' ')
x, y = mlpy.data_fromfile('data_example.dat')
print x
print y
print "mlpy.data_normalize(x) = ", mlpy.data_normalize(x)
#mysvm = mlpy.Svm() # initialize Svm class
myknn = mlpy.Knn(k = 1) # initialize knn class
## initialize fda class
myfda = mlpy.Fda()
#print mysvm.compute(xtr, ytr) # compute SVM
print myknn.compute(xtr, ytr) # compute knn
print myfda.compute(xtr, ytr) # compute fda
#print mysvm.predict(xtr) # predict SVM model on training
print myknn.predict(xtr) # predict knn model on training data
print myfda.predict(xtr) # predict fda model on training data
xts = np.array([2.0, 2.0, 7.0, 1.0]) # test point
print xts.shape, xts.ndim, xts.dtype
#print mysvm.predict(xts) # predict SVM model on test point
print myknn.predict(xts) # predict knn model on test point
print myfda.predict(xts) # predict fda model on test point
#print mysvm.realpred # real-valued prediction
print myknn.realpred # real-valued prediction
print myfda.realpred # real-valued prediction
#print mysvm.weights(xtr, ytr) # compute weights on training data
#print myfda.weights(xtr, ytr) # compute weights on training data
<file_sep>/telemetry/serverside_udp.py
import socket
import time
import thread
import datetime
class Heartbeat():
def __init__(self, robot_IP, basestation_IP):
#-------------connection variables
self.SOCKET = None
self.SOCKET2 = None
self.UDP_IP="127.0.0.1"
self.UDP_PORT=5005
self.MESSAGE= "X" #str(time.time())
self.ACTIVE = False
#----------------------RUN
self.run()
def connect(self):
self.SOCKET = socket.socket( socket.AF_INET, socket.SOCK_DGRAM ) # UDP
self.SOCKET2 = socket.socket( socket.AF_INET, socket.SOCK_DGRAM ) # UDP
#host= '0.0.0.0'
self.SOCKET2.bind( (self.UDP_IP, self.UDP_PORT) )
def run(self):
self.connect()
self.th = thread.start_new_thread(self.send_heartbeat, ())
self.th2 = thread.start_new_thread(self.receive_heartbeat, ())
def send_heartbeat(self):
while True:
time.sleep(.499) # send 10 times per second
#self.MESSAGE = str(time.time())
#print "sending heartbeat", self.MESSAGE
self.SOCKET.sendto( self.MESSAGE, (self.UDP_IP, self.UDP_PORT) )
def receive_heartbeat(self):
count = 0
while True:
now = time.time()
count = count + 1
print count
data, addr = self.SOCKET2.recvfrom( 128 ) # buffer size is X bytes
beat_time = (time.time()-now)
#self.SOCKET2.flush()
#print "received message:", data, " time:", beat_time
data = None
if beat_time < 0.5:
self.ACTIVE = True
if beat_time > 0.5:
self.ACTIVE = False
#print "heartbeat Active:", self.ACTIVE
if __name__== "__main__":
heartbeat = Heartbeat('localhost', 'localhost') # (robot_IP, basestation_IP)
while True:
time.sleep(1)
print heartbeat.ACTIVE
<file_sep>/process_images.py~
#!/usr/bin/env python
import os
from img_processing_tools import *
import Image
path = "/home/lforet/images"
if __name__=="__main__":
###########################
#process class 1 files (mowable)
###########################
count = 0
path = path + "/class1"
for subdir, dirs, files in os.walk(path):
count = len(files)
if count > 0:
#delete classid and classdata files to completely rebuild them
print "Processing images: CLASS 1 (MOWABLE)"
classID = 1
print "Files to Process: ", count
for subdir, dirs, files in os.walk(path):
for file in files:
filename1= os.path.join(path, file)
print "Processing file: ", filename1
img = Image.open(filename1)
WriteMeterics(img, classID)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/supervised/weighted_voting_adaboost.py
from math import exp, log
from operator import itemgetter
'''
AdaBoost implementation with weighted voting as a decision procedure
'''
class weighted_voting_adaboost(object):
# initializes with already built classifiers and corresponding
def __init__(self, in_classifiers, in_coefficients):
self.classifiers = in_classifiers
self.coefficients = in_coefficients
# decision by weighted voting
def apply(self, in_features):
# a "class number" => "votes value" mapping
answers = {}
for classifier, coefficient in zip(self.classifiers, self.coefficients):
answer = classifier.apply(in_features)
if answer in answers:
answers[answer] += coefficient
else:
answers[answer] = coefficient
# dict maximum by value
result = max(answers.iteritems(), key=itemgetter(1))
return result[0]
class weighted_voting_ada_learner(object):
def __init__(self, in_composition_size, in_learner):
self.learner = in_learner
self.composition_size = in_composition_size
def reset(self, in_features):
self.classifiers = []
# linear coefficients for the classifiers in composition
self.coefficients = []
self.weights = [1. / float(len(in_features))] * len(in_features)
def train(self, in_features, in_labels):
self.reset(in_features)
for iteration in xrange(self.composition_size):
self.classifiers.append(self.learner.train(in_features, in_labels, weights=self.weights))
# new classifier initially gets weight 1
self.coefficients.append(1)
answers = []
for obj in in_features:
answers.append(self.classifiers[-1].apply(obj))
err = self.compute_weighted_error(in_labels, answers)
if abs(err) < 1e-6:
return weighted_voting_adaboost(self.classifiers, self.coefficients)
alpha = 0.5 * log((1.0 - err) / err)
# updating the coefficient of the last added classifier
self.coefficients[-1] = alpha
self.update_weights(in_labels, answers, alpha)
self.normalize_weights()
return weighted_voting_adaboost(self.classifiers, self.coefficients)
def compute_weighted_error(self, in_labels, in_answers):
error = 0.
w_sum = sum(self.weights)
for ind in xrange(len(in_labels)):
error += (in_answers[ind] != in_labels[ind]) * self.weights[ind] / w_sum
return error
def update_weights(self, in_labels, in_answers, in_alpha):
for ind in xrange(len(in_labels)):
self.weights[ind] *= exp(in_alpha * (in_answers[ind] != in_labels[ind]))
def normalize_weights(self):
w_sum = sum(self.weights)
for ind in xrange(len(self.weights)):
self.weights[ind] /= w_sum
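

# ---------------------------------------------------------------------------
# Hedged usage sketch (added for illustration; not part of the original milk
# module). It exercises the reset/train/apply flow of
# weighted_voting_ada_learner with a tiny hand-rolled decision stump standing
# in for a real weak learner, because the weak learners actually used by the
# project are not shown here. The stump classes and the toy dataset below are
# assumptions, not project code.
class _toy_stump_model(object):
    def __init__(self, threshold, sign):
        self.threshold = threshold
        self.sign = sign

    def apply(self, in_features):
        # classify a single 1-D feature vector by thresholding its first value
        return self.sign if in_features[0] > self.threshold else -self.sign


class _toy_stump_learner(object):
    def train(self, in_features, in_labels, weights=None, **kwargs):
        # pick the (threshold, sign) pair with the smallest weighted error
        best_err, best_model = None, None
        for threshold in sorted(f[0] for f in in_features):
            for sign in (1, -1):
                model = _toy_stump_model(threshold, sign)
                err = sum(w for f, l, w in zip(in_features, in_labels, weights)
                          if model.apply(f) != l)
                if best_err is None or err < best_err:
                    best_err, best_model = err, model
        return best_model


if __name__ == '__main__':
    # four 1-D points: the first two labelled -1, the last two labelled +1
    toy_features = [[0.0], [1.0], [2.0], [3.0]]
    toy_labels = [-1, -1, 1, 1]
    learner = weighted_voting_ada_learner(5, _toy_stump_learner())
    boosted = learner.train(toy_features, toy_labels)
    print [boosted.apply(f) for f in toy_features]  # expected: [-1, -1, 1, 1]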
<file_sep>/telemetry/ThreadedBeatServer.py
# Filename: ThreadedBeatServer.py
"""Threaded heartbeat server"""
UDP_PORT = 43278; CHECK_PERIOD = .05; CHECK_TIMEOUT = .5
import socket, threading, time
class Heartbeats(dict):
"""Manage shared heartbeats dictionary with thread locking"""
def __init__(self):
super(Heartbeats, self).__init__()
self._lock = threading.Lock()
def __setitem__(self, key, value):
"""Create or update the dictionary entry for a client"""
self._lock.acquire()
super(Heartbeats, self).__setitem__(key, value)
self._lock.release()
def getSilent(self):
"""Return a list of clients with heartbeat older than CHECK_TIMEOUT"""
limit = time.time() - CHECK_TIMEOUT
self._lock.acquire()
silent = [ip for (ip, ipTime) in self.items() if ipTime < limit]
self._lock.release()
return silent
class Receiver(threading.Thread):
"""Receive UDP packets and log them in the heartbeats dictionary"""
def __init__(self, goOnEvent, heartbeats):
super(Receiver, self).__init__()
self.goOnEvent = goOnEvent
self.heartbeats = heartbeats
self.recSocket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
self.recSocket.settimeout(CHECK_TIMEOUT)
self.recSocket.bind((socket.gethostbyname('localhost'), UDP_PORT))
def run(self):
while self.goOnEvent.isSet():
try:
data, addr = self.recSocket.recvfrom(5)
if data == 'PyHB':
self.heartbeats[addr[0]] = time.time()
except socket.timeout:
pass
def main():
receiverEvent = threading.Event()
receiverEvent.set()
heartbeats = Heartbeats()
receiver = Receiver(goOnEvent = receiverEvent, heartbeats = heartbeats)
receiver.start()
print ('Threaded heartbeat server listening on port %d\n'
'press Ctrl-C to stop\n') % UDP_PORT
try:
while True:
silent = heartbeats.getSilent()
print 'Silent clients: %s' % silent
time.sleep(CHECK_PERIOD)
except KeyboardInterrupt:
print 'Exiting, please wait...'
receiverEvent.clear()
receiver.join()
print 'Finished.'
if __name__ == '__main__':
main()
<file_sep>/gui/web_gui/sitetest2/predict_image_class.py~
#!/usr/bin/env python
import os
from img_processing_tools import *
import Image
import time
import csv
import numpy as np
import milk
from mvpa.clfs.knn import kNN
from mvpa.datasets import Dataset
import mlpy
import matplotlib.pyplot as plt # required for plotting
ifile = open('sample_image_data.csv', "rb")
reader = csv.reader(ifile)
classID = []
features = []
lbp= []
lbp_temp_array = []
i3_histo_temp_array = []
i3_histo = []
rgb_histo_temp_array = []
rgb_histo = []
#read data from file into arrays
for row in reader:
features.append(row)
'''
I3_sum = ImageStat.Stat(image).sum
I3_sum2 = ImageStat.Stat(image).sum2
I3_median = ImageStat.Stat(image).median
I3_mean = ImageStat.Stat(image).mean
I3_var = ImageStat.Stat(image).var
I3_stddev = ImageStat.Stat(image).stddev
I3_rms = ImageStat.Stat(image).rms
'''
for row in range(1, len(features)):
#print features[row][1]
classID.append(int(features[row][0]))
lbp_temp_array.append(features[row][1].split(" "))
i3_histo_temp_array.append(features[row][2].split(" "))
rgb_histo_temp_array.append(features[row][3].split(" "))
#removes ending element which is a space
for x in range(len(lbp_temp_array)):
lbp_temp_array[x].pop()
lbp_temp_array[x].pop(0)
i3_histo_temp_array[x].pop()
i3_histo_temp_array[x].pop(0)
rgb_histo_temp_array[x].pop()
rgb_histo_temp_array[x].pop(0)
#print lbp_temp_array
#convert all strings in array to numbers
temp_array = []
for x in range(len(lbp_temp_array)):
temp_array = [ float(s) for s in lbp_temp_array[x] ]
lbp.append(temp_array)
temp_array = [ float(s) for s in i3_histo_temp_array[x] ]
i3_histo.append(temp_array)
temp_array = [ float(s) for s in rgb_histo_temp_array[x] ]
rgb_histo.append(temp_array)
#make numpy arrays
lbp = np.asarray(lbp)
i3_histo = np.asarray(i3_histo)
rgb_histo = np.asarray(rgb_histo)
id_index = 0
lbp_predictdata = lbp[[id_index]]
i3_histo_predictdata = i3_histo[[id_index]]
xts = np.array([lbp_predictdata[0]]) # test point
print "np.shape(xts), xts.ndim, xts.dtype:", np.shape(xts), xts.ndim, xts.dtype
from sklearn.externals import joblib
model = joblib.load('lbp_knn_clf.pkl')
from sklearn.neighbors import KNeighborsClassifier
print "Image sample data: i3_histo_predictdata"
print i3_histo_predictdata
predicted_classID = model.predict(xts)
print "lpb_knn_clf: ", predicted_classID
if predicted_classID[0] == 1: print "Image is of class: GRASS - MOWABLE"
if predicted_classID[0] == -1: print "Image is of class: DIRT/GRAVEL/ASPHALT - NON-MOWABLE"
<file_sep>/ftdi/build/pyftdi/pyftdi/README.rst
========
PyFtdi
========
Overview
~~~~~~~~
PyFtdi aims at providing a user-space driver for modern FTDI_ devices,
implemented in pure Python language.
Modern FTDI_ devices include:
* FT2232D (dual port, clock up to 6 MHz)
* FT2232H (dual port, clock up to 30 MHz)
* FT4232H (quad port, clock up to 30 MHz)
Other FTDI_ devices could also be supported (including FT232* devices),
although these devices are not a primary goal for PyFtdi, and therefore have
not been tested with PyFtdi.
Extras
------
This module also contains a basic driver for Prolific PL2303 chip written in
pure Python. PL2303 is not an FTDI device, but it may serve the same purpose:
a USB-to-serial adapter.
As such, a Python driver for this device has been added to this project starting
at version 0.4.0, so that a PL2303 serial adaptor can be used as an FTDI
alternative to drive a serial port from a USB bus.
Primary goals
~~~~~~~~~~~~~
It should support the following modes:
* UART/Serial USB converter, up to 12Mbps (depending on the FTDI device
capability)
* SPI master
* JTAG master
* Bitbang/GPIO support (not a primary goal)
PyFtdi should provide a pyserial_ compliant API, to be used as a drop-in module
to access USB-serial converters based on FTDI_ devices.
.. _FTDI: http://www.ftdichip.com/
.. _pyserial: http://pyserial.sourceforge.net/
Requirements
~~~~~~~~~~~~
PyFtdi relies on PyUSB_, which itself depends on one of the following native
libraries:
* libusb-1.0 (recommended)
* libusb-0.1 (deprecated)
* openusb (not tested with pyftdi)
PyFtdi does not depend on any other native library, and only uses standard
Python modules.
To use the serial port feature of PyFtdi, pyserial_ 2.5+ module should be
installed.
Python_ 2.6 or above is required. Python_ 3.x is not yet supported.
.. _PyUSB: http://sourceforge.net/projects/pyusb/
.. _Python: http://python.org/
Status
~~~~~~
This project is still at an early alpha development stage.
However, PyFtdi is being forked from a closed-source software implementation
that has been successfully used for over a year - including serial, spi and
jtag protocols, based on top of the libftdi_ open source library.
libftdi_ is now being phased out from this closed-source project and replaced
with PyFtdi, to ease maintenance and customization.
Meanwhile, PyFtdi is developed as an open-source solution.
Supported features
------------------
* All FTDI device ports (UART, MPSSE) can be used simultaneously.
* Serial port, up to 12 Mbps. PyFtdi includes a pyserial_ emulation layer that
offers transparent access to the FTDI serial ports through a pyserial_-
compliant API. The ``serialext`` directory contains a minimal serial terminal
demonstrating the use of this extension, and a dispatcher automatically
selecting the serial backend (pyserial_, PyFtdi), based on the serial port
name.
* SPI master. PyFtdi includes several examples demonstrating how to use the
FTDI SPI master with a pure-Python serial flash device driver for several
common devices. For now, SPI Mode 0 (CPOL=0, CPHA=0) is the only supported
mode. It should be easy to extend the SPI master to deal with less common
modes. These tests show an average 470 KB/s read out from flash devices
running with a 6 MHz SPI clock on a Core2Duo Mac Book Pro.
* JTAG is under development and is not fully supported yet.
.. _libftdi: http://www.intra2net.com/en/developer/libftdi/
Installation
~~~~~~~~~~~~
Since PyUSB 1.0.0a2, USB bus enumeration can be performed without applying
any patch.
* Download pyusb-1.0.0a2
* Install pyusb
* Install pyftdi
Troubleshooting
---------------
*"Error: No backend available"*
libusb native library cannot be loaded. Try helping the dynamic loader:
* On Linux: ``export LD_LIBRARY_PATH=<path>``
where <path> is the directory containing the ``libusb-1.*.so`` library file
* On Mac OS X: ``export DYLD_LIBRARY_PATH=<path>``
where <path> is the directory containing the ``libusb-1.*.dylib`` library file
Development
~~~~~~~~~~~
PyFtdi is developed on Mac OS X platforms (including 64-bit kernels), and is
validated on a regular basis on Linux hosts.
As it contains no native code, it should work on any PyUSB_ and libusb_
supported platforms, including but not limited to, Windows.
.. _libusb: http://www.libusb.org/
Examples
~~~~~~~~
See the developer page available from http://github.com/eblot/pyftdi for SPI
and JTAG examples.
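For the UART feature, the snippet below gives a rough idea of how the
pyserial_ compliant layer is meant to be used. It is only a sketch: the
``serial_for_url`` helper and the ``ftdi://ftdi:2232h/1`` URL are assumptions
about the ``serialext`` dispatcher, so check ``serialext/README.rst`` below
for the exact syntax supported by your version::

    import pyftdi.serialext

    # Open the first UART (interface 1) of the first FT2232H found on the bus
    port = pyftdi.serialext.serial_for_url('ftdi://ftdi:2232h/1',
                                           baudrate=115200)
    port.write('hello\n')
    print port.read(16)
    port.close()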
.. include:: serialext/README.rst
<file_sep>/navigation/terrain_id/terrain_training/build/milk/build/lib.linux-i686-2.7/milk/tests/test_defaultclassifier.py
import numpy as np
import milk
import milk.supervised.defaultclassifier
import pickle
def test_defaultclassifier():
from milksets import wine
features, labels = wine.load()
C = milk.supervised.defaultclassifier()
model = C.train(features,labels)
labelset = set(labels)
for f in features:
assert model.apply(f) in labelset
test_defaultclassifier.slow = True
def test_pickle():
np.random.seed(23232432)
X = np.random.rand(100,10)
labels = np.zeros(100)
X[50:] += .5
labels[50:] = 1
classifier = milk.supervised.defaultclassifier()
model = classifier.train(X, labels)
s = pickle.dumps(model)
model = pickle.loads(s)
test = [model.apply(x) for x in X]
test = np.array(test)
assert (test == labels).mean() > .6
def test_pickle_learner():
learner = milk.defaultlearner()
assert len(pickle.dumps(learner))
def test_expanded():
np.random.seed(23232432)
X = np.random.rand(100,10)
labels = np.zeros(100)
X[50:] += .5
labels[50:] = 1
learners = milk.defaultlearner(expanded=True)
for learner in learners:
model = learner.train(X, labels)
test = [model.apply(x) for x in X]
test = np.array(test)
assert set(test) == set(labels)
<file_sep>/navigation/terrain_id/terrain_training/build/milk/milk/tests/data/jugparallel_kmeans_jugfile.py
import milk.ext.jugparallel
from milksets.wine import load
from milk.tests.fast_classifier import fast_classifier
features,labels = load()
clustered = milk.ext.jugparallel.kmeans_select_best(features, ks=(2,8), repeats=2, max_iters=6)
| 02d1d89ed3c2a71a4f5a36f3a19f0683a0ae37e5 | ["Python", "C++", "reStructuredText"] | 307 | Python | lforet/robomow | eca69d000dc77681a30734b073b2383c97ccc02e | 49dbb0a1c873f75e11228e24878b1e977073118b | refs/heads/master |
<repo_name>chrisgud/TriviaGame<file_sep>/README.md
# TriviaGame
### A simple trivia game
* Clicking the Start button displays:
* A 30 second timer that counts down to 0
* 8 multiple choice trivia questions with 4 possible answers
* A Done button to submit answers, otherwise the quiz ends when the timer hits 0
* Once the timer hits 0 or the Done button is clicked
* A final tally of correct, incorrect, and unanswered trivia questions will be displayed.<file_sep>/assets/javascript/app.js
window.onload = () => {
$('#start').on('click', trivia.start)
}
var intervalId;
//trivia object to track time
var trivia = {
time: 30,
//starts timer, displays questions, hides button, shows timer
start: function () {
intervalId = setInterval(trivia.count, 1000);
displayQuestions();
$('#buttons').css('display', 'none');
$('#timeDiv').css('display', 'block');
},
count: function () {
trivia.time--;
if (trivia.time === 0) {
trivia.completed()
} else {
var converted = trivia.timeConverter(trivia.time);
$("#timeSpan").text(converted);
}
},
//useful display function from stopwatch
timeConverter: function (t) {
var minutes = Math.floor(t / 60);
var seconds = t - (minutes * 60);
if (seconds < 10) {
seconds = "0" + seconds;
}
if (minutes === 0) {
minutes = "00";
}
else if (minutes < 10) {
minutes = "0" + minutes;
}
return minutes + ":" + seconds;
},
completed: function () {
//Calculate Scores
var correct = 0, unanswered = 0, incorrect = 0;
clearInterval(intervalId);
for (let i = 0; i < questionList.length; i++) {
var answerId = parseInt($(`input[name=${i}]:checked`).val());
if (isNaN(answerId)) {
unanswered++;
} else if (questionList[i].correctAnswer === answerId) {
correct++;
} else {
incorrect++;
}
}
//Display Results
$('#timeDiv').html('<h3>All Done!</h3>');
$('#questionDiv').html(`
<h3>Correct Answers: ${correct}</h3>
<h3>Incorrect Answers: ${incorrect}</h3>
<h3>Unanswered Questions: ${unanswered}</h3>
`);
}
}
//Declare trivia questions as an object
var questionList = [
{
question: 'What was the first full length CGI movie?',
answer: ['A Bug\'s Life', 'Monsters Inc.', 'Toy Story', 'The Lion King'],
correctAnswer: 2
},
{
question: 'Which of these is NOT a name of one of the Spice Girls?',
answer: ['Sporty Spice', 'Fred Spice', 'Scary Spice', 'Posh Spice'],
correctAnswer: 1
},
{
question: 'Which NBA team won the most titles in the 90s?',
answer: ['New York Knicks', 'Portland Trailblazers', 'Los Angeles Lakers', 'Chicago Bulls'],
correctAnswer: 3
},
{
question: 'Which group released the hit song, "Smells Like Teen Spirit"?',
answer: ['Nirvana', 'Backstreet Boys', 'The Offspring', 'No Doubt'],
correctAnswer: 0
},
{
question: 'Which popular Disney movie featured the song, "Circle of Life"?',
answer: ['Aladdin', 'Hercules', 'Mulan', 'The Lion King'],
correctAnswer: 3
},
{
question: 'Finish this line from the Fresh Prince of Bel-Air theme song: "I whistled for a cab and when it came near, the license plate said..."',
answer: ['Dice', 'Mirror', 'Fresh', 'Cab'],
correctAnswer: 2
},
{
question: 'What was Doug\'s best friend\'s name?',
answer: ['Skeeter', 'Mark', 'Zach', 'Cody'],
correctAnswer: 0
},
{
question: 'What was the name of the principal at Bayside High in Saved By The Bell?',
answer: ['Mr. Zhou', 'Mr. Driggers', 'Mr. Belding', 'Mr. Page'],
correctAnswer: 2
},
];
//Builds form with question headers and radio buttons named by QuestionList index
// then appends assembled form to the question div.
function displayQuestions() {
var questionForm = $('<form>');
for (let i = 0; i < questionList.length; i++) {
var questionDiv = $('<div>');
questionDiv.append(`<h3 class='p-3'>${questionList[i].question}</h3>`);
for (let j = 0; j < questionList[i].answer.length; j++) {
var radioDiv = $('<span class="p-3">');
radioDiv.html(`<input type="radio" id=${questionList[i].answer[j].toLowerCase().replace(/[^A-Z0-9]+/ig, '')} name=${i} value=${j} />
<label for=${questionList[i].answer[j].toLowerCase().replace(/[^A-Z0-9]+/ig, '')}>${questionList[i].answer[j]}</label>`)
questionDiv.append(radioDiv);
}
questionDiv.append('<hr>');
questionForm.append(questionDiv);
}
questionForm.append(`<input type="submit" value="Done" id="questionSubmit">`);
$('#questionDiv').append(questionForm);
$('#questionSubmit').on('click', (event) => {
event.preventDefault();
trivia.completed();
});
};
| ac191da58a94f512acc917d9d1743f322a194ba4 | ["Markdown", "JavaScript"] | 2 | Markdown | chrisgud/TriviaGame | 6491f465db8030e639c963f4b4973be12bf527db | f0feea548adff814c777d6b317dc621765cd73a1 | refs/heads/master |
<file_sep>'use strict';
let inquirer = require('inquirer');
const question = {
type: 'list',
name: 'game',
message: 'Welcome! Select the game you want to play:',
choices: ['puzzles', 'building']
};
function getSelectedGame() {
return inquirer.prompt([question]);
}
module.exports = { getSelectedGame: getSelectedGame };<file_sep>import {draw, clear} from '../../../tools/drawer';
function gameElement(color, x, y) {
this.x = x;
this.y = y;
this.color = color;
let that = this;
this.setPosition = function (position) {
this.color = position.color;
let y = this.y;
while (y >= position.y) {
clear(this.x, y);
draw(this.x, y, this.color);
y--; //draw next position
}
};
this.destroyElement = function() {
this.color = -1;
that.draw(this.color);
};
this.draw = function (currColor) {
let Color = currColor || color;
if (Color === -1) {
clear(x, y);
} else {
draw(x, y, Color);
}
};
this.draw();
}
export { gameElement }
<file_sep>const colors = ['#f4e010', '#d60c11', '#32bbb9', '#33cd39', '#6a008e', '#ff8f59', '#5926C9', '#C9319E', '#f6e641'];
function writeGameOver() {
document.getElementById('gameOver').innerHTML = 'Game Over';
}
function writeCount() {
let count = -1;
return function () {
count++;
document.getElementById('score').innerHTML = count;
};
}
function getElement(center) {
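// candidate figure shapes, expressed as lists of grid cells laid out around the spawn column; one is picked at random below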
const store = [
[{x: center, y: 0}, {x: center + 1, y: 0}, {x: center, y: 1}, {x:center + 1, y: 1}],
[{x: center, y: 0}, {x: center, y: 1}, {x:center + 1, y: 1}],
[{x: center, y: 0}, {x: center, y: 1}, {x: center, y: 2}],
[{x: center, y: 0}, {x: center, y: 1}, {x:center + 1, y: 1}, {x: center, y: 2}],
[{x: center, y: 0}, {x: center + 1, y: 0}, {x:center + 1, y: 1}, {x:center + 2, y: 1}],
[{x: center, y: 1}, {x: center + 1, y: 1}, {x:center + 2, y: 1}, {x:center + 1, y: 0}]
];
return getRandomValue(store);
}
function getColor() {
return getRandomValue(colors);
}
function getRandomValue(arr) {
return arr[Math.floor(Math.random()*arr.length)];
}
export { writeCount, writeGameOver, getColor, getElement }
<file_sep>let ctx, size, canvas, colsSize;
function setContext (canvs, colSize) {
canvas = canvs;
size = colSize;
ctx = canvas.getContext('2d');
ctx.lineWidth = 2;
}
function draw(x, y, color) {
ctx.strokeStyle = ctx.fillStyle = color;
ctx.strokeRect(
x * size,
y * size,
size - 3,
size - 3
);
ctx.fillRect(
x * size + 4,
y * size + 4,
9,
9
);
}
function clear(x, y) {
draw(x, y, '#ededed');
}
function drawCanvas() {
let n = Math.round(canvas.width / size);
for (let x = 0; x < n; x ++) {
for (let y = 0; y < n; y ++) {
draw(x, y, '#ededed');
}
}
}
export { setContext, draw, clear, drawCanvas }<file_sep>'use strict';
import { writeGameOver } from '../helper';
import { setFigureToModel, checkYAxis, checkXAxis} from '../model/model';
import { generatePoints } from './point';
let instance, timeout;
const events = new window.EventPubSub();
document.onkeydown = function (e) {
let keyboard = e.keyCode;
switch (keyboard) {
case 37 : //left arrow
instance.move(-1);
break;
case 39 : //right arrow
instance.move(1);
break;
case 32 : // space key
instance.rotate();
}
};
function resetTimeOut() {
if (timeout) {
clearTimeout(timeout);
}
}
function Figure() {
let points = generatePoints();
this.startAction = function () {
//if there is free space below
if(checkYAxis(points)) {
//start moving
draw();
moveDown();
} else {
//delete figure
instance = null;
writeGameOver();
}
};
function draw() {
points.forEach( point => point.draw());
}
function clear() {
points.forEach( point => point.clear());
}
// 'space' keydown event handler
this.rotate = function () {
let newPoints = calculateRotating();
// if there is free space around
if(checkYAxis(newPoints) && checkXAxis(newPoints)) {
clear();
for (let i = 0; i < newPoints.length; i++) {
// make deep copy
points[i].x = JSON.parse(JSON.stringify(newPoints[i].x));
points[i].y = JSON.parse(JSON.stringify(newPoints[i].y));
}
draw();
}
};
function calculateRotating() {
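// rotate every point by 90 degrees about the figure's centre using the standard 2D rotation: X = x*cos(f) - y*sin(f), Y = x*sin(f) + y*cos(f)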
let center = getRotationCenter(),
f = 90 * (Math.PI / 180),
newPoints = [];
points.forEach(point => {
let x = point.x - center.x,
y = point.y - center.y,
// new points
X = x * Math.cos(f) - y * Math.sin(f),
Y = x * Math.sin(f) + y * Math.cos(f);
newPoints.push({
x: Math.round(X + center.x),
y: Math.round(Y + center.y)
})
});
return newPoints;
}
function getRotationCenter() {
let x = 0, y = 0, count = 0;
points.forEach( point => {
x += point.x;
y += point.y;
count++
});
return {
x: Math.floor(x / count),
y: Math.ceil(y / count)
}
}
// left & right keydown event handler
this.move = function (direction) {
if (checkXAxis(points, direction)) {
clear();
points.forEach( point => point.x += direction );
draw();
}
};
function moveDown() {
// if there is free space below
if(checkYAxis(points)) {
clear();
//move figure down
points.forEach(point => point.y += 1);
draw();
} else {
// write figure to model
setFigureToModel(points);
instance = null;
events.emit('createFigure');
return;
}
timeout = setTimeout(moveDown, 500);
}
return instance;
}
events.on('createFigure', function () {
if (instance) {
instance = null;
resetTimeOut();
}
instance = new Figure();
instance.startAction();
});
function initFigure () {
events.emit('createFigure');
}
export { initFigure }
<file_sep>'use strict';
import {writeCount} from '../helper';
import {Point} from '../entity/point';
let model, count;
function initModel (n) {
model = createModel(n);
count = writeCount();
count();
}
function createModel (size) {
let array = new Array(size);
for (let i = 0; i < size; i++) {
array[i] = new Array(size).fill(false);
}
return array;
}
function getCenter() {
return (model.length -1) / 2;
}
function setFigureToModel (figure) {
figure.forEach(point => model[point.x][point.y] = point);
checkRows();
}
function checkXAxis (points, direction) {
let directions = [];
if (direction) {
directions = [direction];
} else {
directions = [-1, 1];
}
return directions.every( direction =>
points.every( point =>
(point.x + direction) < model.length &&
(point.x + direction) >= 0 &&
!model[point.x + direction][point.y]
)
)
}
function checkYAxis (figure) {
return figure.every(point => {
if (point.x >= 0 && point.y+1 >= 0) {
return model[point.x][point.y+1] === false;
}
})
}
function checkRows() {
let y = model.length-1;
// walk the rows from the bottom up until reaching the top of the play field
while (y >= 0) {
let currentArr = [];
for (let x = 0; x <= model.length-1; x++) {
currentArr.push(model[x][y]);
}
// if row is empty => go to top
if(currentArr.every(point => !point)) {
y = 0;
// if row is full => cut off
} else if (currentArr.every(point => !!point)) {
for (let x = 0; x <= model.length-1; x++) {
model[x][y].clear();
model[x][y] = false;
}
updateCanvas(y);
count();
}
y--;
}
}
function updateCanvas(startY) {
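// after the full row at startY has been cleared, shift each column's cells down by one and re-run the row check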
for(let x = 0; x <= model.length-1; x++) {
let y = startY;
while (y >= 0) {
if (model[x][y-1]) {
model[x][y] = new Point({x,y}, model[x][y-1].color);
model[x][y-1].clear();
model[x][y-1] = false;
model[x][y].draw();
} else {
model[x][y] = false;
}
y--;
}
checkRows();
}
}
export { initModel, setFigureToModel, checkYAxis, checkXAxis, getCenter }
<file_sep># Tetris emulator
#### Get and prepare project
```bash
$ git clone https://github.com/Julia-Chubatyuk/tetris
$ cd tetris/
$ npm i --quiet
```
### Usage
```bash
$ npm start
```
Then choose the game and have fun!<file_sep>'use strict';
const webpack = require("webpack");
const open = require("open");
const dialog = require("./dialog");
const HtmlWebpackPlugin = require("html-webpack-plugin");
const WebpackOnBuildPlugin = require('on-build-webpack');
const webpackConfig = new Promise((res, rej) => {
dialog.getSelectedGame().then(result => {
res({
context: __dirname + `/${result.game}`,
entry: {
app: "./app.js",
},
output: {
path: __dirname + "/build/",
filename: "[name].bundle.js",
},
stats:{
assets: false,
cached: false,
cachedAssets: false,
children: false,
chunks: false,
chunkModules: false,
chunkOrigins: false,
colors: false,
depth: false,
entrypoints: false,
errors: true,
errorDetails: true,
hash: false,
maxModules: 0,
modules: false,
performance: false,
providedExports: false,
publicPath: false,
reasons: false,
source: false,
timings: false,
usedExports: false,
version: false,
warnings: false
},
plugins: [
new HtmlWebpackPlugin({
template: __dirname + `/${result.game}/index.html`,
title: `${result.game}`,
filename: 'build.html'
}),
new WebpackOnBuildPlugin(() => {open(__dirname + "/build/build.html")})
],
module: {
rules: [{
test: /\.js$/,
use:[{
loader: "babel-loader",
options: {
presets: ["es2015"],
compact: true
}
}]
}]
}
})
})
});
module.exports = webpackConfig;<file_sep>import {setContext, drawCanvas} from '../tools/drawer';
import {initModel} from './src/model/model';
import {initFigure} from './src/entity/figure';
let canvas, context, size = 20, n;
window.onload = function () {
canvas = document.getElementById('canvas');
n = Math.floor(canvas.width / size);
setContext(canvas, size);
drawCanvas(canvas);
};
document.getElementById('start').addEventListener('click', function (e) {
drawCanvas(canvas);
initModel(n);
initFigure();
this.blur();
});
| 600c5ed51847313c7151d4e7501c90281762f3de | ["JavaScript", "Markdown"] | 9 | JavaScript | Julia-Chubatyuk/tetris | 409ace611202deadb91d6f0a4866c400e66290a9 | 47e463d58a72c9b54aa05aeccfe3aae8781fd631 | refs/heads/master |
<file_sep>package com.neuedu.hospital_back.service;
import com.neuedu.hospital_back.mapper.InvoiceMapper;
import com.neuedu.hospital_back.mapper.RegistrationMapper;
import com.neuedu.hospital_back.mapper.WorkloadsCountMapper;
import com.neuedu.hospital_back.po.WorkloadsCount;
import net.sf.json.JSONObject;
import org.springframework.stereotype.Service;
import javax.annotation.Resource;
import java.util.HashMap;
import java.util.Map;
@Service
public class WorkloadsCountService {
@Resource
private WorkloadsCountMapper workloadsCountMapper;
@Resource
private RegistrationMapper registrationMapper;
@Resource
private InvoiceMapper invoiceMapper;
private String[] feeTypes = {"中药费", "西药费", "挂号费", "诊察费", "检查费", "检验费", "治疗费", "材料费", "手术费", "其他费用"};
private String[] payTypes = {"自费", "医保卡", "新农合"};
public WorkloadsCount getWorkloadsCountByPostDId(JSONObject object) {
String postDId = object.getString("postDId");
Long beginTime = object.getLong("beginTime");
Long endTime = object.getLong("endTime");
Map<String, Double> fees = new HashMap<>();
for (String feeType : feeTypes) {
fees.put(feeType, workloadsCountMapper.getFeesByPostDId(postDId, beginTime, endTime, feeType));
}
WorkloadsCount workloadsCount = new WorkloadsCount();
workloadsCount.setVisits(registrationMapper.getPostVisits(postDId, beginTime, endTime));
workloadsCount.setInvoiceNum(invoiceMapper.getInvoiceCountByPostDId(postDId, beginTime, endTime));
return setValues(workloadsCount, fees);
}
public WorkloadsCount getWorkloadsCountBycId(JSONObject object) {
Integer cId = object.getInt("cId");
Long beginTime = object.getLong("beginTime");
Long endTime = object.getLong("endTime");
Map<String, Double> fees = new HashMap<>();
for (String feeType : feeTypes) {
fees.put(feeType, workloadsCountMapper.getFeesBycIdOnFeeType(cId, beginTime, endTime, feeType));
}
for (String payType : payTypes) {
fees.put(payType, workloadsCountMapper.getFeesBycIdOnPayType(cId, beginTime, endTime, payType));
}
WorkloadsCount workloadsCount = new WorkloadsCount();
return setValues(workloadsCount, fees);
}
public WorkloadsCount getWorkloadsCountBydId(JSONObject object) {
String dId = object.getString("dId");
Long beginTime = object.getLong("beginTime");
Long endTime = object.getLong("endTime");
Map<String, Double> fees = new HashMap<>();
for (String feeType : feeTypes) {
fees.put(feeType, workloadsCountMapper.getFeesBydId(dId, beginTime, endTime, feeType));
}
WorkloadsCount workloadsCount = new WorkloadsCount();
workloadsCount.setVisits(registrationMapper.getVisits(dId, beginTime, endTime));
workloadsCount.setInvoiceNum(invoiceMapper.getInvoiceCountBydId(dId, beginTime, endTime));
return setValues(workloadsCount, fees);
}
public WorkloadsCount getWorkloadsCountByuId(JSONObject object) {
Integer uId = object.getInt("uId");
Long beginTime = object.getLong("beginTime");
Long endTime = object.getLong("endTime");
Map<String, Double> fees = new HashMap<>();
for (String feeType : feeTypes) {
fees.put(feeType, workloadsCountMapper.getFeesByuId(uId, beginTime, endTime, feeType));
}
WorkloadsCount workloadsCount = new WorkloadsCount();
workloadsCount.setVisits(registrationMapper.getUserVisits(uId, beginTime, endTime));
workloadsCount.setInvoiceNum(invoiceMapper.getInvoiceCountByuId(uId, beginTime, endTime));
return setValues(workloadsCount, fees);
}
private WorkloadsCount setValues(WorkloadsCount workloadsCount, Map<String, Double> fees) {
if (fees.get("中药费") != null) workloadsCount.setZyFee(fees.get("中药费"));
if (fees.get("西药费") != null) workloadsCount.setXyFee(fees.get("西药费"));
if (fees.get("挂号费") != null) workloadsCount.setRegistrationFee(fees.get("挂号费"));
if (fees.get("诊察费") != null) workloadsCount.setDiagnosticFee(fees.get("诊察费"));
if (fees.get("检查费") != null) workloadsCount.setExaminationFee(fees.get("检查费"));
if (fees.get("检验费") != null) workloadsCount.setTestFee(fees.get("检验费"));
if (fees.get("治疗费") != null) workloadsCount.setTreatmentFee(fees.get("治疗费"));
if (fees.get("材料费") != null) workloadsCount.setMaterialFee(fees.get("材料费"));
if (fees.get("手术费") != null) workloadsCount.setSurgeryFee(fees.get("手术费"));
if (fees.get("其他费用") != null) workloadsCount.setOtherFee(fees.get("其他费用"));
if (fees.get("自费") != null) workloadsCount.setOwnExpenses(fees.get("自费"));
if (fees.get("医保") != null) workloadsCount.setMedicalInsurance(fees.get("医保"));
if (fees.get("新农合") != null) workloadsCount.setnCMS(fees.get("新农合"));
return workloadsCount;
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.ExamnationItem;
import com.neuedu.hospital_back.po.ExamnationTemplate;
import org.apache.ibatis.annotations.Param;
import java.util.List;
public interface ExamnationTemplateMapper {
int deleteByPrimaryKey(Integer eTId);
int insert(ExamnationTemplate record);
List<ExamnationTemplate> selectByCondition(@Param("eTName") String eTName,@Param("eTScope")String eTScope, @Param("recordType")String recordType);
List<ExamnationTemplate> selectExamnationItemBydId(@Param("eTName") String eTName, @Param("recordType")String recordType, @Param("dId")String dId);
List<ExamnationTemplate> selectExamnationItemByuId(@Param("eTName") String eTName,@Param("recordType")String recordType,@Param("uId")int uId);
ExamnationTemplate selectExamnationTemplateByeTId(Integer eTId);
List<ExamnationItem> selectExamnationItemByeTId(Integer eTId);
int updateByPrimaryKeySelective(ExamnationTemplate record);
}<file_sep>package com.neuedu.hospital_back.controller;
import com.neuedu.hospital_back.po.ExamnationItem;
import com.neuedu.hospital_back.po.ExamnationTemplate;
import com.neuedu.hospital_back.service.ExamnationTemplateService;
import com.neuedu.hospital_back.service.ExamnationitemService;
import net.sf.json.JSONObject;
import org.springframework.web.bind.annotation.*;
import javax.annotation.Resource;
import java.util.List;
@CrossOrigin
@RestController
@RequestMapping("/examnationItem")
public class ExamnationItemController {
@Resource
private ExamnationitemService examnationitemService;
@Resource
private ExamnationTemplateService examnationTemplateService;
@RequestMapping("/importExamnationItems")
public boolean importExamnationItems() throws Exception {
return examnationitemService.importExamnationItems();
}
@RequestMapping("/getExamnationItems")
public List<ExamnationItem> getExamnationItems(@RequestBody ExamnationItem examnationItem) {
return examnationitemService.selectExamnationItem(examnationItem);
}
@RequestMapping("/getAllExamnationItem")
public List<ExamnationItem> getAllExamnationItem() {
return examnationitemService.selectAllExamnationItem();
}
@PostMapping("/getExamnationItemByPage")
public List<ExamnationItem> getExamnationItemByPage(@RequestBody JSONObject object) {
return examnationitemService.selectExamnationItemByPage(object);
}
@PostMapping("/getExamnationItemByEIName")
public List<ExamnationItem> getExamnationItemByEIName(@RequestBody JSONObject object) {
return examnationitemService.selectExamnationItemByEIName(object);
}
@RequestMapping("/deleteExamnationItem")
public boolean deleteExamnationItem(@RequestBody JSONObject eIId) {
return examnationitemService.deleteByPrimaryKey(eIId);
}
@RequestMapping("/insertExamnationItem")
public boolean insertExamnationItem(@RequestBody ExamnationItem examnationItem) {
return examnationitemService.insert(examnationItem);
}
@RequestMapping("/updateExamnationItem")
public boolean updateExamnationItem(@RequestBody ExamnationItem examnationItem){
return examnationitemService.updateByPrimaryKeySelective(examnationItem);
}
@RequestMapping("/getPageCount")
public int getPageCount(){
return examnationitemService.getPageCount();
}
@RequestMapping("/insertExaminationTemplate")
public boolean insertExaminationTemplate(@RequestBody JSONObject object){
return examnationTemplateService.insert(object);
}
@PostMapping("/getExaminationTemplateByCondition")
public List<ExamnationTemplate> getExaminationTemplateByCondition(@RequestBody JSONObject object){
return examnationTemplateService.selectByCondition(object);
}
@PostMapping("/checkDetail")
public ExamnationTemplate checkDetail(@RequestBody JSONObject object){
return examnationTemplateService.checkDetail(object);
}
@PostMapping("/deleteExamnationTemplateByPrimaryKey")
public Boolean ExamnationTemplate(@RequestBody JSONObject object){
return examnationTemplateService.deleteByPrimaryKey(object);
}
}
<file_sep>package com.neuedu.hospital_back.controller;
import com.neuedu.hospital_back.po.RegistrationLevel;
import com.neuedu.hospital_back.service.RegistrationlevelService;
import net.sf.json.JSONObject;
import org.springframework.web.bind.annotation.*;
import javax.annotation.Resource;
import java.util.List;
@CrossOrigin
@RestController
@RequestMapping("/registrationLevel")
public class RegistrationLevelController {
@Resource
private RegistrationlevelService registrationlevelService;
@RequestMapping("/getRegistrationLevel")
public List<RegistrationLevel> getRegistrationLevels(@RequestBody RegistrationLevel registrationlevel) {
return registrationlevelService.selectByPrimaryKey(registrationlevel);
}
@PostMapping("/getAllRegistrationLevel")
public List<RegistrationLevel> getAllRegistrationLevels() {
return registrationlevelService.selectAll();
}
@RequestMapping("/deleteRegistrationLevel")
public boolean deleteRegistrationLevel(@RequestBody JSONObject rLId) {
return registrationlevelService.deleteByPrimaryKey(rLId);
}
@RequestMapping("/insertRegistrationLevel")
public boolean insertRegistrationLevel(@RequestBody RegistrationLevel registrationlevel) {
return registrationlevelService.insert(registrationlevel);
}
@RequestMapping("/updateRegistrationLevel")
public boolean updateRegistrationLevel(@RequestBody RegistrationLevel registrationlevel) {
return registrationlevelService.updateByPrimaryKeySelective(registrationlevel);
}
}
<file_sep>package com.neuedu.hospital_back.po;
import java.sql.Date;
public class Arrangement {
private Integer aId;
private Integer uId;
private Integer plan;
private Date aBegin;
private Date aEnd;
public String getdId() {
return dId;
}
public void setdId(String dId) {
this.dId = dId;
}
private String dId;
public Arrangement() {
}
public Arrangement(Integer uId, Integer plan, Date aBegin, Date aEnd, String dId) {
this.uId = uId;
this.plan = plan;
this.aBegin = aBegin;
this.aEnd = aEnd;
this.dId = dId;
}
public Arrangement(Integer aId, Integer uId, Integer plan, Date aBegin, Date aEnd, String dId) {
this.aId = aId;
this.uId = uId;
this.plan = plan;
this.aBegin = aBegin;
this.aEnd = aEnd;
this.dId = dId;
}
public Integer getaId() {
return aId;
}
public void setaId(Integer aId) {
this.aId = aId;
}
public Integer getuId() {
return uId;
}
public void setuId(Integer uId) {
this.uId = uId;
}
public Integer getPlan() {
return plan;
}
public void setPlan(Integer plan) {
this.plan = plan;
}
public Date getaBegin() {
return aBegin;
}
public void setaBegin(Date aBegin) {
this.aBegin = aBegin;
}
public Date getaEnd() {
return aEnd;
}
public void setaEnd(Date aEnd) {
this.aEnd = aEnd;
}
}<file_sep>package com.neuedu.hospital_back.po;
import java.util.ArrayList;
import java.util.List;
public class User {
private Integer uId;
private String uNickName;
private String uPassword;
private String uName;
private Boolean isDoctor;
private String uCategory;
private List<Department> departments = new ArrayList<>();
public Boolean getDoctor() {
return isDoctor;
}
public void setDoctor(Boolean doctor) {
isDoctor = doctor;
}
public List<Department> getDepartments() {
return departments;
}
public void setDepartments(List<Department> departments) {
this.departments = departments;
}
public Integer getuId() {
return uId;
}
public void setuId(Integer uId) {
this.uId = uId;
}
public String getuNickName() {
return uNickName;
}
public void setuNickName(String uNickName) {
this.uNickName = uNickName;
}
public String getuPassword() {
return uPassword;
}
public void setuPassword(String uPassword) {
this.uPassword = uPassword;
}
public String getuName() {
return uName;
}
public void setuName(String uName) {
this.uName = uName;
}
public Boolean getIsDoctor() {
return isDoctor;
}
public void setIsDoctor(Boolean isDoctor) {
this.isDoctor = isDoctor;
}
public String getuCategory() {
return uCategory;
}
public void setuCategory(String uCategory) {
this.uCategory = uCategory;
}
}<file_sep>package com.neuedu.hospital_back.po;
public class ExamnationItem {
private Integer eIId;
private String eIName;
private String eISpecification;
private Double eIFee;
private String eIFeeType;
private String eICode;
@Override
public String toString() {
return "ExamnationItem{" +
"eIId=" + eIId +
", eIName='" + eIName + '\'' +
", eISpecification='" + eISpecification + '\'' +
", eIFee=" + eIFee +
", eIFeeType='" + eIFeeType + '\'' +
", eICode='" + eICode + '\'' +
'}';
}
public String geteISpecification() {
return eISpecification;
}
public void seteISpecification(String eISpecification) {
this.eISpecification = eISpecification;
}
public Double geteIFee() {
return eIFee;
}
public void seteIFee(Double eIFee) {
this.eIFee = eIFee;
}
public String geteIFeeType() {
return eIFeeType;
}
public void seteIFeeType(String eIFeeType) {
this.eIFeeType = eIFeeType;
}
public String geteICode() {
return eICode;
}
public void seteICode(String eICode) {
this.eICode = eICode;
}
public Integer geteIId() {
return eIId;
}
public void seteIId(Integer eIId) {
this.eIId = eIId;
}
public String geteIName() {
return eIName;
}
public void seteIName(String eIName) {
this.eIName = eIName;
}
}<file_sep>package com.neuedu.hospital_back.po;
import java.util.List;
public class DiagnosisTemplate {
private Integer datId;
private String diaType;
private String datName;
private Long datTime;
private String datScope;
private List<Medicine> medicines;
private Department department;
public List<Medicine> getMedicines() {
return medicines;
}
public void setMedicines(List<Medicine> medicines) {
this.medicines = medicines;
}
public Department getDepartment() {
return department;
}
public void setDepartment(Department department) {
this.department = department;
}
public Integer getDatId() {
return datId;
}
public void setDatId(Integer datId) {
this.datId = datId;
}
public String getDiaType() {
return diaType;
}
public void setDiaType(String diaType) {
this.diaType = diaType;
}
public String getDatName() {
return datName;
}
public void setDatName(String datName) {
this.datName = datName;
}
public Long getDatTime() {
return datTime;
}
public void setDatTime(Long datTime) {
this.datTime = datTime;
}
public String getDatScope() {
return datScope;
}
public void setDatScope(String datScope) {
this.datScope = datScope;
}
}<file_sep>package com.neuedu.hospital_back.controller;
import com.neuedu.hospital_back.po.Medicine;
import com.neuedu.hospital_back.service.AccountService;
import net.sf.json.JSONObject;
import org.springframework.web.bind.annotation.*;
import javax.annotation.Resource;
import java.util.ArrayList;
import java.util.List;
@CrossOrigin
@RestController
@RequestMapping("/account")
public class AccountController {
@Resource
private AccountService accountService;
@PostMapping("/insertAccount")
public String insertAccount(@RequestBody JSONObject object){
return accountService.insertAccount(object);
}
@PostMapping("/updateUId")
public boolean updateUId(@RequestBody JSONObject object){
return accountService.updateUId(object);
}
@PostMapping("/insertDiagnosisAccount")
public boolean insertDiagnosisAccount(@RequestBody JSONObject object){
return accountService.insertDiagnosisAccount(object);
}
@PostMapping("/getMedicineByRIdAndTime")
public List<Medicine> getMedicineByRIdAndTime(@RequestBody JSONObject object){
return accountService.getMedicineByRIdAndTime(object);
}
@PostMapping("/getAlreadyDrawMedicineByRIdAndTime")
public List<Medicine> getAlreadyDrawMedicineByRIdAndTime(@RequestBody JSONObject object){
return accountService.getAlreadyDrawMedicineByRIdAndTime(object);
}
@PostMapping("/returnMoney")
public ArrayList<String> returnExamApplication(@RequestBody JSONObject object){
return accountService.returnExamApplication(object);
}
@PostMapping("/returnRegistration")
public Boolean returnRegistration(@RequestBody JSONObject object){
return accountService.returnRegistration(object);
}
}
<file_sep>package com.neuedu.hospital_back.po;
public class Department {
private String dId;
private String dName;
private String dType;
private String dCategory;
@Override
public String toString() {
return "Department{" +
"dId='" + dId + '\'' +
", dName='" + dName + '\'' +
", dType='" + dType + '\'' +
", dCategory='" + dCategory + '\'' +
'}';
}
public String getdCategory() {
return dCategory;
}
public void setdCategory(String dCategory) {
this.dCategory = dCategory;
}
public String getdType() {
return dType;
}
public void setdType(String dType) {
this.dType = dType;
}
public String getdId() {
return dId;
}
public void setdId(String dId) {
this.dId = dId;
}
public String getdName() {
return dName;
}
public void setdName(String dName) {
this.dName = dName;
}
}
<file_sep>package com.neuedu.hospital_back.service;
import com.neuedu.hospital_back.mapper.PatientMapper;
import com.neuedu.hospital_back.mapper.RegistrationLevelMapper;
import com.neuedu.hospital_back.mapper.RegistrationMapper;
import com.neuedu.hospital_back.po.Patient;
import com.neuedu.hospital_back.po.Registration;
import com.neuedu.hospital_back.po.RegistrationInfo;
import net.sf.json.JSONObject;
import org.springframework.stereotype.Service;
import javax.annotation.Resource;
import java.sql.Date;
import java.text.SimpleDateFormat;
import java.util.List;
@Service
public class RegistrationService {
@Resource
private PatientMapper patientMapper;
@Resource
private RegistrationMapper registrationMapper;
@Resource
private RegistrationLevelMapper registrationLevelMapper;
public int insertPatient(JSONObject object) {
Patient patient = new Patient();
patient.setpId(object.getString("pId"));
patient.setpName(object.getString("pName"));
patient.setpBirth(Date.valueOf(object.getString("pBirth")));
patient.setpSex(object.getBoolean("pSex"));
patient.setpAddress(object.getString("pAddress"));
return patientMapper.insert(patient);
}
public int getRemainNumber(JSONObject object) {
return registrationLevelMapper.getLimitationByName(object.getString("rLName"))
- registrationMapper.getTodayCount(object.getInt("uId"));
}
public int insertRegistration(Registration registration) {
registration.setrDate(registration.getrDate() / 1000);
registrationMapper.insertRegistration(registration);
return registration.getrId();
}
public Patient getPatientById(JSONObject object) {
return patientMapper.getById(object.getString("pId"));
}
public List<RegistrationInfo> getRegistrationInfo(JSONObject object) {
String rStatus = object.getString("rStatus");
if (rStatus.equals("诊毕")) {
return registrationMapper.getAlreadyDiagnosisByuId(object.getInt("uId"), object.getString("pName"));
} else {
return registrationMapper.getNotDiagnosisByuId(object.getInt("uId"), object.getString("pName"));
}
}
public List<RegistrationInfo> getRegistrationInfoByUIdAndDId(JSONObject object) {
String rStatus = object.getString("rStatus");
if (rStatus.equals("诊毕")) {
return registrationMapper.getAlreadyDiagnosisByuIdAndDId(object.getInt("uId"), object.getString("pName"), object.getString("dId"));
} else {
return registrationMapper.getNotDiagnosisByuIdAndDId(object.getInt("uId"), object.getString("pName"), object.getString("dId"));
}
}
public List<RegistrationInfo> getRegistrationInfoByrId(JSONObject object) {
List<RegistrationInfo> registrationInfos = registrationMapper.getRegistrationInfoByrId(object.getInt("rId"));
for (RegistrationInfo registrationInfo : registrationInfos) {
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
String rTime = sdf.format(new Date(registrationInfo.getrDate()));
registrationInfo.setrTime(rTime);
String MorningOrEvening = MorningOrEvening(registrationInfo.getrDate());
registrationInfo.setMorningOrEvening(MorningOrEvening);
if (registrationInfo.getrStatus().equals("未看诊")) {
registrationInfo.setOkToWithdraw(true);
} else {
registrationInfo.setOkToWithdraw(false);
}
}
return registrationInfos;
}
public List<RegistrationInfo> getRegistrationInfoByrIdOrPName(JSONObject object) {
return registrationMapper.getRegistrationInfoByrIdOrPName(object.getInt("rId"), object.getString("pName"));
}
private String MorningOrEvening(long date) {
SimpleDateFormat sdf = new SimpleDateFormat("HH");
String time = sdf.format(new Date(date));
int a = Integer.parseInt(time);
if (a >= 12) {
return "下午";
} else {
return "上午";
}
}
public boolean updateRStatus(JSONObject object) {
return registrationMapper.updateRegistration(object.getInt("rId"), object.getString("rStatus"), object.getString("rResult")) == 1;
}
}
<file_sep>package com.neuedu.hospital_back.service;
import com.neuedu.hospital_back.mapper.*;
import com.neuedu.hospital_back.po.Account;
import com.neuedu.hospital_back.po.DiagnosisMedicine;
import com.neuedu.hospital_back.po.Medicine;
import net.sf.json.JSONArray;
import net.sf.json.JSONObject;
import org.springframework.stereotype.Service;
import javax.annotation.Resource;
import java.lang.reflect.Array;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
import java.util.List;
@Service
public class AccountService {
@Resource
private AccountMapper accountMapper;
@Resource
private AccountExaminationApplicationMapper accountExaminationApplicationMapper;
@Resource
private AccountDiagnosisMapper accountDiagnosisMapper;
@Resource
private DiagnosisMedicineMapper diagnosisMedicineMapper;
@Resource
private ExaminationApplicationMapper examinationApplicationMapper;
@Resource
private InvoiceMapper invoiceMapper;
@Resource
private RegistrationMapper registrationMapper;
public boolean insertDiagnosisAccount(JSONObject object) {
JSONArray jsonArray = object.getJSONArray("accounts");
List<Account> accounts = (List) JSONArray.toCollection(jsonArray, Account.class);
int result = 0;
for (Account account : accounts) {
System.out.println(account.toString());
result += accountMapper.insert(account);
accountDiagnosisMapper.insert(account.getAccId(), account.getDia_M_Id());
}
return result == accounts.size();
}
public String insertAccount(JSONObject object) {
JSONArray jsonArray = object.getJSONArray("accounts");
String iStatus = object.getString("iStatus");
List<Account> accounts = (List) JSONArray.toCollection(jsonArray, Account.class);
String iId = generateIId();
invoiceMapper.insertInvoice(iId, iStatus);
for (Account account : accounts) {
account.setiId(iId);
accountMapper.insert(account);
if (account.getFeeType().equals("中药") || account.getFeeType().equals("西药")) {
accountDiagnosisMapper.insert(account.getAccId(), account.getDia_M_Id());
} else if(account.getFeeType().equals("挂号费")){
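// registration fee: it is not tied to a diagnosis or examination application, so no join-table row is inserted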
}else accountExaminationApplicationMapper.insert(account.getAccId(), account.geteAId());
}
return iId;
}
public String updateIId(JSONObject object){
String iId = object.getString("iId");
String newIId = generateIId();
accountMapper.updateIId(newIId, iId);
invoiceMapper.insertInvoice(newIId,"生效");
invoiceMapper.updateInvoice(iId, "作废");
return newIId;
}
private String generateIId(){
Calendar cal=Calendar.getInstance();
int year=cal.get(Calendar.YEAR);
int month=cal.get(Calendar.MONTH);
int day=cal.get(Calendar.DATE);
int date = year * 10000 + (month + 1) * 100 + day;
/** invoice number prefix (the date, yyyyMMdd) */
String profix = String.valueOf(date);
/** invoice number suffix (per-day sequence number) */
int suffix = invoiceMapper.getNowIId(String.valueOf(date));
String iId = "";
if(suffix < 10){
iId = profix + "00" + String.valueOf(suffix);
}else if(suffix < 100){
iId = profix + "0" + String.valueOf(suffix);
}else{
iId = profix + String.valueOf(suffix);
}
return iId;
}
public boolean updateUId(JSONObject object) {
Integer rId = object.getInt("rId");
String dId = object.getString("dId");
Integer uId = object.getInt("uId");
List<Integer> eAIdList = object.getJSONArray("eAIdList");
int result = 0;
for (Integer eAId : eAIdList) {
result += accountMapper.updateUId(rId, dId, eAId, uId);
}
return result == eAIdList.size();
}
public List<Medicine> getMedicineByRIdAndTime(JSONObject object) {
Integer rId = object.getInt("rId");
String time = object.getString("Date");
return accountMapper.getMedicineByRIdAndTime(rId, time);
}
public List<Medicine> getAlreadyDrawMedicineByRIdAndTime(JSONObject object) {
Integer rId = object.getInt("rId");
String time = object.getString("Date");
return accountMapper.getAlreadyDrawMedicineByRIdAndTime(rId, time);
}
public ArrayList<String> returnExamApplication(JSONObject object) {
List<Double> eAFees = (List) JSONArray.toCollection(object.getJSONArray("eAFee"), Double.class);
List<Integer> eAIds =(List) JSONArray.toCollection(object.getJSONArray("eAIds"), Integer.class);
List<Integer> dia_M_Ids = (List) JSONArray.toCollection(object.getJSONArray("dia_M_Ids"), Integer.class);
List<Double> medicineFees = (List) JSONArray.toCollection(object.getJSONArray("medicineFee"), Double.class);
int re = 0;
List<String> iIdList = new ArrayList<>();
List<Integer> accIdList = new ArrayList<>();
for (int i = 0; i < eAIds.size(); i++) {
//get the accId for this examination application
Integer accId = accountExaminationApplicationMapper.getAccId(eAIds.get(i));
//get the invoice id (iId)
String iId = accountMapper.getIIdByAccId(accId);
if(!iIdList.contains(iId)){
iIdList.add(iId);
}
accIdList.add(accId);
//update the fee on the account record
re += accountMapper.updateFeeById(accId, 0-eAFees.get(i));
//delete from the join table (kept disabled)
// accountExaminationApplicationMapper.deleteByeAId(eAIds.get(i));
//update the examination application status
examinationApplicationMapper.updateStatus(eAIds.get(i), "已退费");
}
for (int i = 0; i < dia_M_Ids.size(); i++) {
//get the accId for this prescribed medicine
Integer accId = accountDiagnosisMapper.getAccId(dia_M_Ids.get(i));
Account account=new Account();
account.setAccId(accId);
Account account1=accountMapper.selectByCondition(account).get(0);
account1.setiId(null);
account1.setFee(-medicineFees.get(i));
accountMapper.insert(account1);
accIdList.add(account1.getAccId());
//get the invoice id (iId)
String iId = accountMapper.getIIdByAccId(accId);
if(!iIdList.contains(iId)){
iIdList.add(iId);
}
//update the fee on the account record
re += accountMapper.updateFeeById(accId, medicineFees.get(i));
//update the join table
accountDiagnosisMapper.updateByia_M_Id(account1.getAccId(),dia_M_Ids.get(i));
// accountDiagnosisMapper.deleteBydia_M_Id(dia_M_Ids.get(i));
//update the diagnosis-medicine (d_a) status
DiagnosisMedicine d = new DiagnosisMedicine();
d.setmState("已退费");
d.setDia_M_Id(dia_M_Ids.get(i));
diagnosisMedicineMapper.updateByKey(d);
}
//assign a new invoice number to the refunded account entries
String newRefundIId = generateIId();
System.out.println(newRefundIId);
for(Integer accId: accIdList){
accountMapper.updateIIdByAccId(accId, accountMapper.getIIdByAccId(accId), newRefundIId);
}
invoiceMapper.insertInvoice(newRefundIId,"冲红");
//update the corresponding accId in the join table
//re-issue the invoice number for the account entries that were not refunded
String newIId = generateIId();
System.out.println(newIId);
for(String iId: iIdList){
accountMapper.updateIId(iId,newIId);
invoiceMapper.updateInvoice(iId, "作废");
invoiceMapper.insertConnection(iId, newRefundIId);
}
invoiceMapper.insertInvoice(newIId,"生效");
ArrayList<String> returnList = new ArrayList<>();
returnList.add(newRefundIId);
returnList.add(newIId);
return returnList;
}
public boolean returnRegistration(JSONObject object){
registrationMapper.updateRegistration(object.getInt("rId"), object.getString("rStatus"), object.getString("rResult"));
Account a=new Account();
a.setrId(object.getInt("rId"));
a.setFeeType("挂号费");
List<Account> accounts= accountMapper.selectByCondition(a);
if(accounts.size()>=1){
a=accounts.get(0);
String iId=a.getiId();
String newRefundIId = generateIId();
accountMapper.updateIIdByAccId(a.getAccId(), accountMapper.getIIdByAccId(a.getAccId()),newRefundIId);
accountMapper.updateFee(a.getAccId(),0);
invoiceMapper.insertInvoice(newRefundIId,"冲红");
invoiceMapper.insertConnection(iId, newRefundIId);
}
return accounts.size()==1;
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.ExamnationItem;
import org.apache.ibatis.annotations.Param;
import java.util.List;
public interface ExamnationitemMapper {
int deleteByPrimaryKey(int eIId);
int insert(ExamnationItem record);
int updateByPrimaryKeySelective(ExamnationItem record);
int getExamnationItemCount();
List<ExamnationItem> selectExamnationItemByeIName(@Param("eIName") String eIName, @Param("recordType") String recordType);
List<ExamnationItem> selectExamnationItem(ExamnationItem examnationItem);
ExamnationItem selectById(int eIId);
List<ExamnationItem> selectAllExamnationItem();
List<ExamnationItem> selectExamnationItemByPage(@Param("begin") int begin, @Param("pageSize") int pageSize);
}<file_sep>package com.neuedu.hospital_back.service;
import com.neuedu.hospital_back.mapper.RegistrationLevelMapper;
import net.sf.json.JSONObject;
import org.springframework.stereotype.Service;
import javax.annotation.Resource;
import com.neuedu.hospital_back.po.RegistrationLevel;
import java.util.List;
@Service
public class RegistrationlevelService {
@Resource
private RegistrationLevelMapper registrationlevelMapper;
public boolean deleteByPrimaryKey(JSONObject rLId) {
return registrationlevelMapper.deleteByPrimaryKey(rLId.getInt("rLId")) == 1;
}
public boolean insert(RegistrationLevel record) {
return registrationlevelMapper.insert(record) == 1;
}
public List<RegistrationLevel> selectByPrimaryKey(RegistrationLevel registrationLevel) {
return registrationlevelMapper.getRegistrationLevel(registrationLevel);
}
public List<RegistrationLevel> selectAll() {
return registrationlevelMapper.getAllRegistrationLevel();
}
public boolean updateByPrimaryKeySelective(RegistrationLevel record) {
return registrationlevelMapper.updateByPrimaryKeySelective(record) == 1;
}
}
<file_sep>package com.neuedu.hospital_back.po;
public class ExaminationTemplateDepartment {
private Integer eT_dId;
private Integer eTId;
private String dId;
public Integer geteT_dId() {
return eT_dId;
}
public void seteT_dId(Integer eT_dId) {
this.eT_dId = eT_dId;
}
public Integer geteTId() {
return eTId;
}
public void seteTId(Integer eTId) {
this.eTId = eTId;
}
public String getdId() {
return dId;
}
public void setdId(String dId) {
this.dId = dId;
}
}<file_sep>package com.neuedu.hospital_back.po;
import java.sql.Date;
public class RegistrationInfo {
private int rId;
private String pId;
private String pName;
private Date pBirth;
private String pAddress;
private boolean pSex;
private String MorningOrEvening;
private String rStatus;
private String dName;
private boolean okToWithdraw;
private long rDate;
private String rTime;
private String dId;
private String uId;
public String getuId() {
return uId;
}
public void setuId(String uId) {
this.uId = uId;
}
public String getdId() {
return dId;
}
public void setdId(String dId) {
this.dId = dId;
}
public long getrDate() {
return rDate;
}
public void setrDate(long rDate) {
this.rDate = rDate;
}
public String getrTime() {
return rTime;
}
public void setrTime(String rTime) {
this.rTime = rTime;
}
public String getpAddress() {
return pAddress;
}
public void setpAddress(String pAddress) {
this.pAddress = pAddress;
}
public String getMorningOrEvening() {
return MorningOrEvening;
}
public void setMorningOrEvening(String MorningOrEvening) {
this.MorningOrEvening = MorningOrEvening;
}
public String getrStatus() {
return rStatus;
}
public void setrStatus(String rStatus) {
this.rStatus = rStatus;
}
public String getdName() {
return dName;
}
public void setdName(String dName) {
this.dName = dName;
}
public boolean isOkToWithdraw() {
return okToWithdraw;
}
public void setOkToWithdraw(boolean okToWithdraw) {
this.okToWithdraw = okToWithdraw;
}
public int getrId() {
return rId;
}
public void setrId(int rId) {
this.rId = rId;
}
public String getpId() {
return pId;
}
public void setpId(String pId) {
this.pId = pId;
}
public String getpName() {
return pName;
}
public void setpName(String pName) {
this.pName = pName;
}
public Date getpBirth() {
return pBirth;
}
public void setpBirth(Date pBirth) {
this.pBirth = pBirth;
}
public boolean ispSex() {
return pSex;
}
public void setpSex(boolean pSex) {
this.pSex = pSex;
}
}
<file_sep>package com.neuedu.hospital_back.service;
import org.springframework.stereotype.Service;
import javax.annotation.Resource;
import com.neuedu.hospital_back.mapper.DoctorArrangementRegulationMapper;
import com.neuedu.hospital_back.po.DoctorArrangementRegulation;
@Service
public class DoctorArrangementRegulationService {
@Resource
private DoctorArrangementRegulationMapper doctorArrangementRegulationMapper;
public int deleteByPrimaryKey(Integer do_Ar_Id) {
return doctorArrangementRegulationMapper.deleteByPrimaryKey(do_Ar_Id);
}
public int insert(DoctorArrangementRegulation record) {
return doctorArrangementRegulationMapper.insert(record);
}
public int updateByPrimaryKeySelective(DoctorArrangementRegulation record) {
return doctorArrangementRegulationMapper.updateByPrimaryKeySelective(record);
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.Registration;
import com.neuedu.hospital_back.po.RegistrationInfo;
import org.apache.ibatis.annotations.Param;
import java.util.List;
public interface RegistrationMapper {
void insertRegistration(Registration registration);
int getVisits(@Param("dId") String dId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime);
int getPostVisits(@Param("postDId") String postDId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime);
int getUserVisits(@Param("uId") Integer uId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime);
List<RegistrationInfo> getAlreadyDiagnosisByuId(@Param("uId") int uId, @Param("pName") String pName);
List<RegistrationInfo> getNotDiagnosisByuId(@Param("uId") int uId, @Param("pName") String pName);
List<RegistrationInfo> getAlreadyDiagnosisByuIdAndDId(@Param("uId") int uId, @Param("pName") String pName, @Param("dId") String dId);
List<RegistrationInfo> getNotDiagnosisByuIdAndDId(@Param("uId") int uId, @Param("pName") String pName, @Param("dId") String dId);
List<RegistrationInfo> getRegistrationInfoByrId(@Param("rId") int rId);
int updateRegistration(@Param("rId") int rId, @Param("rStatus") String rStatus, @Param("rResult") String rResult);
List<RegistrationInfo> getRegistrationInfoByrIdOrPName(@Param("rId") int rId, @Param("pName") String pName);
int getTodayCount(int uId);
}
<file_sep>package com.neuedu.hospital_back.po;
public class WorkloadsCount {
private int visits;//number of patient visits
private int invoiceNum;//number of invoices
private double zyFee;//traditional Chinese medicine fee
private double xyFee;//Western medicine fee
private double registrationFee;//registration fee
private double diagnosticFee;//consultation fee
private double examinationFee;//examination fee
private double testFee;//laboratory test fee
private double treatmentFee;//treatment fee
private double materialFee;//materials fee
private double surgeryFee;//surgery fee
private double otherFee;//other fees
private double ownExpenses;//self-paid
private double medicalInsurance;//medical insurance
private double nCMS;//New Rural Cooperative Medical Scheme
public double getOwnExpenses() {
return ownExpenses;
}
public void setOwnExpenses(double ownExpenses) {
this.ownExpenses = ownExpenses;
}
public double getMedicalInsurance() {
return medicalInsurance;
}
public void setMedicalInsurance(double medicalInsurance) {
this.medicalInsurance = medicalInsurance;
}
public double getnCMS() {
return nCMS;
}
public void setnCMS(double nCMS) {
this.nCMS = nCMS;
}
public double getTestFee() {
return testFee;
}
public void setTestFee(double testFee) {
this.testFee = testFee;
}
public int getVisits() {
return visits;
}
public void setVisits(int visits) {
this.visits = visits;
}
public int getInvoiceNum() {
return invoiceNum;
}
public void setInvoiceNum(int invoiceNum) {
this.invoiceNum = invoiceNum;
}
public double getZyFee() {
return zyFee;
}
public void setZyFee(double zyFee) {
this.zyFee = zyFee;
}
public double getXyFee() {
return xyFee;
}
public void setXyFee(double xyFee) {
this.xyFee = xyFee;
}
public double getRegistrationFee() {
return registrationFee;
}
public void setRegistrationFee(double registrationFee) {
this.registrationFee = registrationFee;
}
public double getDiagnosticFee() {
return diagnosticFee;
}
public void setDiagnosticFee(double diagnosticFee) {
this.diagnosticFee = diagnosticFee;
}
public double getExaminationFee() {
return examinationFee;
}
public void setExaminationFee(double examinationFee) {
this.examinationFee = examinationFee;
}
public double getTreatmentFee() {
return treatmentFee;
}
public void setTreatmentFee(double treatmentFee) {
this.treatmentFee = treatmentFee;
}
public double getMaterialFee() {
return materialFee;
}
public void setMaterialFee(double materialFee) {
this.materialFee = materialFee;
}
public double getSurgeryFee() {
return surgeryFee;
}
public void setSurgeryFee(double surgeryFee) {
this.surgeryFee = surgeryFee;
}
public double getOtherFee() {
return otherFee;
}
public void setOtherFee(double otherFee) {
this.otherFee = otherFee;
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.DiagnosisTemplateUser;
public interface DiagnosisTemplateUserMapper {
int insert(DiagnosisTemplateUser record);
int deleteBydatId(Integer datId);
}<file_sep>package com.neuedu.hospital_back.controller;
import com.neuedu.hospital_back.po.DiagnosisTemplate;
import com.neuedu.hospital_back.po.DiagnosisTemplateMedicine;
import com.neuedu.hospital_back.po.Medicine;
import com.neuedu.hospital_back.service.DiagnosisTemplateService;
import com.neuedu.hospital_back.service.MedicineService;
import net.sf.json.JSONObject;
import org.springframework.web.bind.annotation.*;
import javax.annotation.Resource;
import java.util.List;
@CrossOrigin
@RestController
@RequestMapping("/medicine")
public class MedicineController {
@Resource
private DiagnosisTemplateService diagnosisTemplateService;
@Resource
private MedicineService medicineService;
@RequestMapping("/importMedicine")
public boolean importMedicine() throws Exception {
return medicineService.importMedicine();
}
@PostMapping("/getMedicineByMName")
public List<Medicine> getMedicineByEIName(@RequestBody JSONObject object) {
return medicineService.selectMedicineByMName(object);
}
@PostMapping("/getMedicine")
public List<Medicine> getMedicineByCondition(@RequestBody Medicine medicine){
return medicineService.selectMedicineByCondition(medicine);
}
@PostMapping("/getMedicineByPage")
public List<Medicine> getMedicineByPage(@RequestBody JSONObject object){
return medicineService.getMedicineByPage(object);
}
@PostMapping("/getMedicineCount")
public int getMedicineCount(){
return medicineService.getMedicineCount();
}
@RequestMapping("/deleteMedicine")
public boolean deleteMedicine(@RequestBody JSONObject object) {
return medicineService.deleteByPrimaryKey(object)==1;
}
@RequestMapping("/insertMedicine")
public boolean insertMedicine(@RequestBody Medicine medicine) {
return medicineService.insert(medicine)==1;
}
@RequestMapping("/updateMedicine")
public boolean updateMedicine(@RequestBody Medicine medicine){
return medicineService.updateByPrimaryKeySelective(medicine)==1;
}
@RequestMapping("/insertDiagnosisTemplate")
public boolean insertDiagnosisTemplate(@RequestBody JSONObject object){
return diagnosisTemplateService.insert(object);
}
@PostMapping("/getDiagnosisTemplateByCondition")
public List<DiagnosisTemplate> getDiagnosisTemplateByCondition(@RequestBody JSONObject object){
return diagnosisTemplateService.selectByCondition(object);
}
@PostMapping("/checkDetail")
public DiagnosisTemplate checkDetail(@RequestBody JSONObject object){
return diagnosisTemplateService.checkDetail(object);
}
@PostMapping("/deleteDiagnosisTemplateByPrimaryKey")
public Boolean deleteDiagnosisTemplateByPrimaryKey(@RequestBody JSONObject object){
return diagnosisTemplateService.deleteByPrimaryKey(object);
}
@PostMapping("/updateDiagnosisTemplateMedicine")
public int updateDiagnosisTemplateMedicine(@RequestBody DiagnosisTemplateMedicine d){
return diagnosisTemplateService.updateDiagnosisTemplateMedicine(d);
}
}
<file_sep>package com.neuedu.hospital_back.po;
import java.util.ArrayList;
import java.util.List;
public class Diagnosis {
private Integer diaId;
private String diaType;
private Integer rId;
private String diaName;
private Long createDate;
private Long useDate;
private String diaState;
private Integer uId;
private List<Medicine> medicines = new ArrayList<>();
private Double diaFee;
public Diagnosis() {
}
public Diagnosis( String diaType, Integer rId, String diaName, Long createDate, String diaState,Integer uId) {
this.diaType = diaType;
this.rId = rId;
this.diaName = diaName;
this.createDate = createDate;
this.diaState = diaState;
this.uId=uId;
}
public Integer getuId() {
return uId;
}
public Double getDiaFee() {
return diaFee;
}
public void setDiaFee(Double diaFee) {
this.diaFee = diaFee;
}
public void setuId(Integer uId) {
this.uId = uId;
}
public Integer getDiaId() {
return diaId;
}
public void setDiaId(Integer diaId) {
this.diaId = diaId;
}
public String getDiaType() {
return diaType;
}
public void setDiaType(String diaType) {
this.diaType = diaType;
}
public Integer getrId() {
return rId;
}
public void setrId(Integer rId) {
this.rId = rId;
}
public String getDiaName() {
return diaName;
}
public void setDiaName(String diaName) {
this.diaName = diaName;
}
public Long getCreateDate() {
return createDate;
}
public void setCreateDate(Long createDate) {
this.createDate = createDate;
}
public Long getUseDate() {
return useDate;
}
public void setUseDate(Long useDate) {
this.useDate = useDate;
}
public String getDiaState() {
return diaState;
}
public void setDiaState(String diaState) {
this.diaState = diaState;
}
public List<Medicine> getMedicines() {
return medicines;
}
public void setMedicines(List<Medicine> medicines) {
this.medicines = medicines;
}
}<file_sep>package com.neuedu.hospital_back;
import com.neuedu.hospital_back.po.User;
import com.neuedu.hospital_back.service.UserService;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import javax.annotation.Resource;
import java.util.*;
@RunWith(SpringRunner.class)
@SpringBootTest
public class UserTest {
@Resource
private UserService userService;
@Test
public void getUser() {
User user = new User();
user.setuId(2);
List<User> users = userService.selectByCondition(user);
System.out.println(users);
}
@Test
public void getAllUsers() {
userService.selectAllUser();
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import org.apache.ibatis.annotations.Param;
public interface WorkloadsCountMapper {
double getFeesByPostDId(@Param("postDId") String postDId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime, @Param("feeType") String feeType);
double getFeesBydId(@Param("dId") String dId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime, @Param("feeType") String feeType);
double getFeesByuId(@Param("uId") Integer uId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime, @Param("feeType") String feeType);
double getFeesBycIdOnFeeType(@Param("cId") Integer cId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime, @Param("feeType") String feeType);
double getFeesBycIdOnPayType(@Param("cId") Integer cId, @Param("beginTime") Long beginTime, @Param("endTime") Long endTime, @Param("payType") String payType);
}
<file_sep>package com.neuedu.hospital_back.util;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;
import org.apache.poi.ss.usermodel.*;
import org.apache.poi.ss.util.NumberToTextConverter;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import org.springframework.stereotype.Component;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.text.SimpleDateFormat;
import java.util.Date;
@Component
public class ReadExcelUtil {
private static final String EXCEL_XLS = ".xls";
private static final String EXCEL_XLSX = ".xlsx";
public String getCellValue(Cell cell) {
String ret = "";
if (cell == null) return ret;
CellType type = cell.getCellTypeEnum();
switch (type) {
case BOOLEAN:
ret = String.valueOf(cell.getBooleanCellValue());
break;
case ERROR:
ret = null;
break;
case FORMULA:
Workbook wb = cell.getSheet().getWorkbook();
CreationHelper crateHelper = wb.getCreationHelper();
FormulaEvaluator evaluator = crateHelper.createFormulaEvaluator();
ret = getCellValue(evaluator.evaluateInCell(cell));
break;
case NUMERIC:
if (DateUtil.isCellDateFormatted(cell)) {
Date theDate = cell.getDateCellValue();
ret = new SimpleDateFormat().format(theDate);
} else {
ret = NumberToTextConverter.toText(cell.getNumericCellValue());
}
break;
case STRING:
ret = cell.getRichStringCellValue().getString();
break;
default:
ret = "";
}
return ret; // callers should trim the value themselves if needed
}
/**
* Determine the Excel version and build a Workbook from the input stream
*
* @throws IOException
*/
public Workbook getWorkBook(InputStream is, File file) throws Exception {
Workbook workbook = null;
if (file.getName().endsWith(EXCEL_XLS)) {
workbook = new HSSFWorkbook(is);
} else if (file.getName().endsWith(EXCEL_XLSX)) {
workbook = new XSSFWorkbook(is);
}
return workbook;
}
/**
* Validate that the file exists and is an Excel file
*
* @throws Exception
*/
public void checkExcelVaild(File file) throws Exception {
String message;
if (!file.exists()) {
message = "文件不存在!";
throw new Exception(message);
}
if (!file.isFile() || ((!file.getName().endsWith(EXCEL_XLS) && !file.getName().endsWith(EXCEL_XLSX)))) {
System.out.println(file.isFile() + "===" + file.getName().endsWith(EXCEL_XLS) + "===" + file.getName().endsWith(EXCEL_XLSX));
System.out.println(file.getName());
message = "文件不是Excel";
throw new Exception(message);
}
}
}<file_sep>package com.neuedu.hospital_back.service;
import com.neuedu.hospital_back.mapper.DepartmentMapper;
import com.neuedu.hospital_back.mapper.DepartmentUserMapper;
import com.neuedu.hospital_back.po.Department;
import com.neuedu.hospital_back.po.Disease;
import com.neuedu.hospital_back.util.ReadExcelUtil;
import net.sf.json.JSONObject;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.springframework.stereotype.Service;
import org.springframework.util.ClassUtils;
import javax.annotation.Resource;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
@Service
public class DepartmentService {
@Resource
private DepartmentMapper departmentMapper;
@Resource
private DepartmentUserMapper departmentUserMapper;
@Resource
private ReadExcelUtil readExcelUtil;
public boolean importDepartment() throws Exception {
String path = ClassUtils.getDefaultClassLoader().getResource("static/department.xlsx").getPath();
File excelFile = new File(path);//create the Excel file object
InputStream is = new FileInputStream(excelFile);//open an input stream on it
readExcelUtil.checkExcelVaild(excelFile);
Workbook workbook = readExcelUtil.getWorkBook(is, excelFile);
Sheet sheet = workbook.getSheetAt(0);
for (int rowIndex = 1; rowIndex <= sheet.getLastRowNum(); rowIndex++) {
Row row = sheet.getRow(rowIndex);
Department department = new Department();
department.setdId(readExcelUtil.getCellValue(row.getCell(0)));
department.setdName(readExcelUtil.getCellValue(row.getCell(1)));
department.setdType(readExcelUtil.getCellValue(row.getCell(2)));
department.setdCategory(readExcelUtil.getCellValue(row.getCell(3)));
departmentMapper.insertDepartment(department);
}
is.close();
return true;
}
public List<Department> selectByuId(JSONObject object) {
List<String> dIds = departmentUserMapper.selectByuId(object.getInt("uId"));
List<Department> departments = new ArrayList<>();
for (String dId : dIds) {
departments.add(departmentMapper.getDepartmentBydId(dId));
}
return departments;
}
public List<Department> getAllDepartments() {
return departmentMapper.getAllDepartments();
}
public List<Department> getDepartments(Department department) {
return departmentMapper.getDepartments(department);
}
public List<Department> getDepartmentByPage(JSONObject jsonObject) {
return departmentMapper.getDepartmentByPage(jsonObject.getInt("pageNum"), jsonObject.getInt("pageSize"));
}
public boolean deleteDepartment(JSONObject object) {
return departmentMapper.deleteDepartment(object.getString("dId")) == 1;
}
public boolean insertDepartment(Department department) {
return departmentMapper.insertDepartment(department) == 1;
}
public boolean updateDepartment(Department department) {
return departmentMapper.updateDepartment(department) == 1;
}
public int getDepartmentCount() {
return departmentMapper.getDepartmentCount();
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import org.apache.ibatis.annotations.Param;
public interface AccountExaminationApplicationMapper {
int insert(@Param("accId")int accId, @Param("eAId")int eAId);
Integer getAccId(Integer eAId);
int deleteByeAId(Integer integer);
}
<file_sep>package com.neuedu.hospital_back.controller;
import com.neuedu.hospital_back.po.Disease;
import com.neuedu.hospital_back.service.DiseaseService;
import net.sf.json.JSONObject;
import org.springframework.web.bind.annotation.*;
import javax.annotation.Resource;
import java.util.List;
@CrossOrigin
@RestController
@RequestMapping("/disease")
public class DiseaseController {
@Resource
private DiseaseService diseaseService;
@RequestMapping("/importDiseases")
public boolean importDiseases() throws Exception {
return diseaseService.importDiseases();
}
@RequestMapping("/getDiseases")
public List<Disease> getDiseases(@RequestBody Disease disease) {
return diseaseService.getDiseases(disease);
}
@RequestMapping("/getDiseaseType")
public List<String> getDiseaseType(){
return diseaseService.getDiseaseType();
}
@PostMapping("/getDiseaseByPage")
public List<Disease> getAllDiseases(@RequestBody JSONObject object) {
return diseaseService.getDiseaseByPage(object);
}
@RequestMapping("/deleteDisease")
public boolean deleteDisease(@RequestBody JSONObject disId) {
return diseaseService.deleteDisease(disId);
}
@RequestMapping("/insertDisease")
public boolean insertDisease(@RequestBody Disease disease) {
return diseaseService.insert(disease);
}
@RequestMapping("/updateDisease")
public boolean updateDisease(@RequestBody Disease disease){
return diseaseService.updateDisease(disease);
}
@RequestMapping("/getPageCount")
public int getPageCount(@RequestBody JSONObject object){
return diseaseService.getDiseaseCount(object);
}
}
<file_sep>package com.neuedu.hospital_back.po;
public class Registration {
private int rId;
private int drId;
private int eId;
private String dId;
private int mRId;
private String rLName;
private String pId;
private int uId;
private int rOrder;
private double rFee;
private String rStatus;
private String payType;
private boolean hasMedicalHistory;
private long rDate;
private String rResult;
private String postDId;//ordering (billing) department
public String getPostDId() {
return postDId;
}
public void setPostDId(String postDId) {
this.postDId = postDId;
}
public String getrResult() {
return rResult;
}
public void setrResult(String rResult) {
this.rResult = rResult;
}
public long getrDate() {
return rDate;
}
public void setrDate(long rDate) {
this.rDate = rDate;
}
public String getPayType() {
return payType;
}
public void setPayType(String payType) {
this.payType = payType;
}
public int getrId() {
return rId;
}
public void setrId(int rId) {
this.rId = rId;
}
public int getDrId() {
return drId;
}
public void setDrId(int drId) {
this.drId = drId;
}
public int geteId() {
return eId;
}
public void seteId(int eId) {
this.eId = eId;
}
public String getdId() {
return dId;
}
public void setdId(String dId) {
this.dId = dId;
}
public int getmRId() {
return mRId;
}
public void setmRId(int mRId) {
this.mRId = mRId;
}
public String getrLName() {
return rLName;
}
public void setrLName(String rLName) {
this.rLName = rLName;
}
public String getpId() {
return pId;
}
public void setpId(String pId) {
this.pId = pId;
}
public String getrStatus() {
return rStatus;
}
public void setrStatus(String rStatus) {
this.rStatus = rStatus;
}
public int getuId() {
return uId;
}
public void setuId(int uId) {
this.uId = uId;
}
public int getrOrder() {
return rOrder;
}
public void setrOrder(int rOrder) {
this.rOrder = rOrder;
}
public double getrFee() {
return rFee;
}
public void setrFee(double rFee) {
this.rFee = rFee;
}
public boolean getHasMedicalHistory() {
return hasMedicalHistory;
}
public void setHasMedicalHistory(boolean hasMedicalHistory) {
this.hasMedicalHistory = hasMedicalHistory;
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.DiagnosisTemplate;
import com.neuedu.hospital_back.po.ExamnationItem;
import com.neuedu.hospital_back.po.Medicine;
import org.apache.ibatis.annotations.Param;
import java.util.List;
public interface DiagnosisTemplateMapper {
int deleteByPrimaryKey(Integer datId);
int insert(DiagnosisTemplate record);
List<DiagnosisTemplate> selectByCondition(@Param("datName") String datName, @Param("datScope")String datScope, @Param("diaType")String diaType);
List<DiagnosisTemplate> selectDiagnosisTemplateBydId(@Param("datName") String datName, @Param("diaType")String diaType, @Param("dId")String dId);
List<DiagnosisTemplate> selectDiagnosisTemplateByuId(@Param("datName") String datName,@Param("diaType")String diaType,@Param("uId")int uId);
DiagnosisTemplate selectDiagnosisTemplateBydatId(Integer datId);
List<Medicine> selectMedicineBydatId(Integer datId);
int updateByPrimaryKeySelective(DiagnosisTemplate record);
}<file_sep>package com.neuedu.hospital_back.controller;
import com.neuedu.hospital_back.po.User;
import com.neuedu.hospital_back.service.UserService;
import net.sf.json.JSONObject;
import org.springframework.web.bind.annotation.*;
import javax.annotation.Resource;
import java.text.ParseException;
import java.util.List;
@CrossOrigin
@RestController
@RequestMapping("/user")
public class UserController {
@Resource
private UserService userService;
@PostMapping("/getAvailableDoctor")
public List<User> getAvailableDoctor(@RequestBody JSONObject object) throws ParseException {
return userService.getAvailableDoctor(object);
}
@PostMapping("/getUser")
public List<User> selectByCondition(@RequestBody User user) {
return userService.selectByCondition(user);
}
@PostMapping("/getAllUser")
public List<User> selectAllUser() {
return userService.selectAllUser();
}
@PostMapping("/getUserByPage")
public List<User> getUserByPage(@RequestBody JSONObject object) {
return userService.getUserByPage(object);
}
@PostMapping("/deleteUser")
public boolean deleteUser(@RequestBody JSONObject object) {
return userService.deleteByuId(object);
}
@PostMapping("/insertUser")
public boolean insertUser(@RequestBody JSONObject object) {
return userService.insert(object);
}
@PostMapping("/updateUser")
public boolean updateUser(@RequestBody JSONObject object) {
return userService.updateUser(object);
}
@PostMapping("/getUserCount")
public int getUserCount(){
return userService.getUserCount();
}
@PostMapping("/getUerByDepartment")
public List<User> getUerByDepartment(@RequestBody JSONObject object){
return userService.getUerByDepartment(object);
}
@PostMapping("/getUerByDIdAndRLName")
public List<User> getUerByDIdAndRLName(@RequestBody JSONObject object){
return userService.getUerByDIdAndRLName(object);
}
@PostMapping("/login")
public User login(@RequestBody JSONObject object){
return userService.login(object);
}
@PostMapping("/getUserById")
public User getUserById(@RequestBody JSONObject object){
return userService.getUserById(object);
}
}
<file_sep>package com.neuedu.hospital_back.po;
public class ArrangementList {
private String arName;
private String rLName;
private String uName;
private int arPlan;
private int uId;
public int getuId() {
return uId;
}
public void setuId(int uId) {
this.uId = uId;
}
private int[] arPlanList = new int[14];
public int getArPlan() {
return arPlan;
}
public void setArPlan(int arPlan) {
this.arPlan = arPlan;
}
public int[] getArPlanList() {
return arPlanList;
}
public void setArPlanList(int[] arPlanList) {
this.arPlanList = arPlanList;
}
public String getArName() {
return arName;
}
public void setArName(String arName) {
this.arName = arName;
}
public String getrLName() {
return rLName;
}
public void setrLName(String rLName) {
this.rLName = rLName;
}
public String getuName() {
return uName;
}
public void setuName(String uName) {
this.uName = uName;
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.DiagnosisType;
import java.util.List;
public interface DiagnosisTypeMapper {
int insert(DiagnosisType diagnosisType);
List<DiagnosisType> getByrId(int rId);
}
<file_sep>package com.neuedu.hospital_back.po;
import java.util.Date;
import java.util.List;
public class ExamnationTemplate {
private Integer eTId;
private String eTName;
private long eTDate;
private String eTScope;
private String recordType;
private List<ExamnationItem> examnationItemList;
private Department department;
public Department getDepartment() {
return department;
}
public void setDepartment(Department department) {
this.department = department;
}
public List<ExamnationItem> getExamnationItemList() {
return examnationItemList;
}
public void setExamnationItemList(List<ExamnationItem> examnationItemList) {
this.examnationItemList = examnationItemList;
}
public Integer geteTId() {
return eTId;
}
public void seteTId(Integer eTId) {
this.eTId = eTId;
}
public String geteTName() {
return eTName;
}
public void seteTName(String eTName) {
this.eTName = eTName;
}
public long geteTDate() {
return eTDate;
}
public void seteTDate(long eTDate) {
this.eTDate = eTDate;
}
public String geteTScope() {
return eTScope;
}
public void seteTScope(String eTScope) {
this.eTScope = eTScope;
}
public String getRecordType() {
return recordType;
}
public void setRecordType(String recordType) {
this.recordType = recordType;
}
}<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.Patient;
public interface PatientMapper {
int insert(Patient patient);
Patient getById(String pId);
}
<file_sep>package com.neuedu.hospital_back;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
@MapperScan("com.neuedu.hospital_back.mapper")
public class HospitalBackApplication {
public static void main(String[] args) {
SpringApplication.run(HospitalBackApplication.class, args);
}
}
<file_sep>package com.neuedu.hospital_back.mapper;
import com.neuedu.hospital_back.po.ExamnationTemplateExamnationItem;
public interface ExamnationTemplateExamnationItemMapper {
// int deleteByPrimaryKey(Integer eT_EI_Id);
int deleteByETId(Integer eTId);
int insert(ExamnationTemplateExamnationItem record);
// int updateByPrimaryKeySelective(ExamnationTemplateExamnationItem record);
}<file_sep>package com.neuedu.hospital_back.po;
public class ExaminationTemplateUser {
private Integer eT_uId;
private Integer eTId;
private Integer uId;
public Integer geteT_uId() {
return eT_uId;
}
public void seteT_uId(Integer eT_uId) {
this.eT_uId = eT_uId;
}
public Integer geteTId() {
return eTId;
}
public void seteTId(Integer eTId) {
this.eTId = eTId;
}
public Integer getuId() {
return uId;
}
public void setuId(Integer uId) {
this.uId = uId;
}
}<file_sep>package com.neuedu.hospital_back.po;
public class Doctor {
private Integer uId;
private String dVacation;
private Boolean isDue;
private String rLName;
public Integer getuId() {
return uId;
}
public void setuId(Integer uId) {
this.uId = uId;
}
public String getdVacation() {
return dVacation;
}
public void setdVacation(String dVacation) {
this.dVacation = dVacation;
}
public Boolean getIsDue() {
return isDue;
}
public void setIsDue(Boolean isDue) {
this.isDue = isDue;
}
public String getrLName() {
return rLName;
}
public void setrLName(String rLName) {
this.rLName = rLName;
}
}
|
8071f03a8810602882290d42374f630126476c16
|
[
"Java"
] | 39 |
Java
|
Jesse-HXY/hospital_back
|
21811ccd06b48498e8e17a7609e85431d76d9c18
|
b770d3489b5af6e20ef7fd75ab5c77254b2de5ab
|
refs/heads/master
|
<file_sep>```
Usage: rrdtoold.py [options] command
Options:
-s, --signal=[start|stop|restart|debug]
-h, --help
INSERT INTO mps_messages(topic, message, created_at) VALUES('rrdtool', '{"command":"remove", "username":"twkang010602", "datetime":["2017020801","2017020802"]}', now());
CREATE TABLE `mps_messages` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`topic` varchar(100) NOT NULL,
`message` text NOT NULL,
`status` int(1) NOT NULL DEFAULT '0',
`created_at` datetime NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8
```
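
A minimal sketch of driving the daemon from the shell, assuming the install layout hard-coded in `rrdtoold.py` (`/root/sdev_tools/rrdtoold/...`) and its Python 2 interpreter; the `-s` values map directly onto the `Daemonize` calls in `rrdtoold.py`:

```
python rrdtoold.py -s debug      # run in the foreground with console logging
python rrdtoold.py -s start      # daemonize; the PID is written to run/app.pid
python rrdtoold.py -s stop
python rrdtoold.py -s restart
```

Once the daemon is running, correction jobs are queued by inserting a row into `mps_messages` with topic `rrdtool`, as in the INSERT example above.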
<file_sep>#-*- coding: utf-8 -*-
import random
import logging
import threading
import itertools
from minpubsub import create
class ProducerThread(threading.Thread):
def __init__(self, name, queue, event):
super(ProducerThread,self).__init__()
self.name = name
self.queue = queue
self.event = event
self.pubsub = create('mysql', 'localhost', 'USER', 'PASSWD', '<PASSWORD>')
self.subscriber = self.pubsub.subscribe('rrdtool')
def run(self):
while not self.event.wait(0.01):
if not self.queue.full():
item = self.subscriber.getNext()
if item is not None:
keys = ('id', 'topic', 'message', 'created_at')
item = dict(itertools.izip(keys, item))
self.queue.put(item)
logging.debug('Putting ' + str(item)
+ ' : ' + str(self.queue.qsize()) + ' items in queue')
logging.debug("end")
<file_sep>#!/usr/local/python2.6/bin/python
#-*- coding: utf-8 -*-
import sys
import time
import getopt
import signal
import logging
try:
from Queue import Queue
except ImportError:
from queue import Queue
from threading import Event
sys.path.append("/root/sdev_tools/rrdtoold/lib")
from daemonize import Daemonize
from producer import ProducerThread
from consumer import ConsumerThread
app_queue = Queue(10)
app_event = Event()
app_logger = logging.getLogger('')
app_logger.setLevel(logging.DEBUG)
formatter = logging.Formatter(
'%(asctime)s - %(threadName)-9s - %(levelname)s - %(message)s',
'%Y-%m-%d %H:%M:%S')
logfile = logging.FileHandler('/root/sdev_tools/rrdtoold/log/rrdtoold.log')
#logfile.setLevel(logging.DEBUG)
logfile.setLevel(logging.INFO)
logfile.setFormatter(formatter)
app_logger.addHandler(logfile)
console = logging.StreamHandler()
console.setLevel(logging.DEBUG)
console.setFormatter(formatter)
app_logger.addHandler(console)
def usage():
print('Usage: rrdtoold.py [options] command')
print('Options:')
print('-s, --signal=[start|stop|restart|debug]')
print('-h, --help')
sys.exit()
def app_stop(signal, frame):
app_event.set()
def main():
signal.signal(signal.SIGINT, app_stop)
signal.signal(signal.SIGTERM, app_stop)
signal.signal(signal.SIGHUP, app_stop)
p = ProducerThread("producer1", app_queue, app_event)
p.setDaemon(True)
p.start()
c1 = ConsumerThread("consumer1", app_queue, app_event)
#c2 = ConsumerThread("consumer2", app_queue, app_event)
c1.setDaemon(True)
#c2.setDaemon(True)
c1.start()
#c2.start()
p.join()
c1.join()
#c2.join()
if __name__ == '__main__':
s = ""
try:
opts, args = getopt.getopt(sys.argv[1:],"hs:",["signal=",])
except getopt.GetoptError:
usage()
for opt, value in opts:
if opt in ("-s", "--signal"):
s = value
elif opt in ("-h", "--help"):
usage()
if s == "":
usage()
else:
pid = "/root/sdev_tools/rrdtoold/run/app.pid"
if s == "debug":
daemon = Daemonize(app="app", pid=pid, action=main, foreground=True)
daemon.start()
else:
daemon = Daemonize(app="app", pid=pid, action=main, auto_close_fds=False)
if s == "start":
daemon.start()
elif s == "stop":
daemon.stop()
elif s == "restart":
daemon.stop()
daemon.start()
else:
usage()
<file_sep>#-*- coding: utf-8 -*-
import os
import re
import json
import time
import shutil
import datetime
import logging
import threading
import subprocess
import traceback
class RRDTool(object):
def __init__(self, username):
self.rrdtool = "/usr/local/bin/rrdtool"
self.rrddb_dir = "/home/apps/apache/htdocs/rrd/rrddb/"
self.temp_dir = "/home/apps/apache/htdocs/rrd/temp/"
self.username = username
self.rrddb_file = os.path.join(self.rrddb_dir, "%s.rrd" % self.username)
self.rrddb_xml_file = os.path.join(self.temp_dir, "%s.rrd.xml" % username)
self.rrddb_new_file = os.path.join(self.temp_dir, "%s_new.rrd" % username)
self.backup_file = os.path.join(self.temp_dir, "%s.rrd.bak.%s" % (self.username, datetime.datetime.now().strftime("%Y%m%d%H%M")))
def get_rrddb_lastinfo(self, rrddb_file):
last_time, last_input, last_output = None, None, None
p = subprocess.Popen(
(self.rrdtool, 'lastupdate', rrddb_file), stdout=subprocess.PIPE)
out, err = p.communicate()
for s in out.split("\n"):
m = re.search(r"(\d+): (\d+) (\d+)", s)
if m:
last_time, last_input, last_output = m.groups()
return last_time, last_input, last_output
def get_rrd_update_value(self, rrddb_file, rrddb_new_file, start_time, end_time, input, output):
values = []
p = subprocess.Popen(
(self.rrdtool, 'fetch', rrddb_file, 'AVERAGE', '-s', start_time, '-e', end_time), stdout=subprocess.PIPE)
out, err = p.communicate()
for s in out.split("\n"):
m = re.search(r"(\d+): ([\.e\+\-\d]+) ([\.e\+\-\d]+)", s)
if m:
last_time, last_input, last_output = m.groups()
last_input = int(float(last_input)*300)
last_output = int(float(last_output)*300)
values.append((last_time,last_input, last_output))
values.reverse()
input = float(input)
output = float(output)
for n, v in enumerate(values):
v = v + ("%d" % input,)
v = v + ("%d" % output,)
values[n] = v
input -= v[1]
output -= v[2]
values.reverse()
for v in values:
logging.debug("%s update %s %s:%s:%s" % (self.rrdtool, rrddb_new_file, v[0], v[3], v[4]))
subprocess.Popen(
(self.rrdtool, 'update', rrddb_new_file, '%s:%s:%s' % (v[0], v[3], v[4])), stdout=subprocess.PIPE).communicate()
def remove_rrddata(self, remove_datetimes):
"""
<!-- 2017-02-02 15:00:00 KST / 1486015200 --> <row><v>NaN</v><v>NaN</v></row>
"""
patterns = []
for remove_datetime in remove_datetimes:
if len(remove_datetime) == 10:
remove_datetime = datetime.datetime.strptime(remove_datetime, '%Y%m%d%H')
remove_datetime = remove_datetime.strftime("%Y-%m-%d %H")
elif len(remove_datetime) == 8:
remove_datetime = datetime.datetime.strptime(remove_datetime, '%Y%m%d')
remove_datetime = remove_datetime.strftime("%Y-%m-%d")
else:
logging.error("Unsupported date format %s" % remove_datetime)
continue
patterns.append(re.compile(r" %s\s?[:\d]+ \w+ / (\d+) --> <row><v>(.*)</v><v>(.*)</v></row>" % remove_datetime))
if len(patterns) == 0:
logging.error("No correction pattern")
return
self.backup_rrddb()
start_update_time = self.get_rrddb_lastinfo(self.rrddb_file)[0]
if start_update_time is None:
logging.error("No last update time %s" % self.rrddb_file)
return
init_value = "0.0000000000e+00"
p1 = subprocess.Popen(
(self.rrdtool, 'dump', self.rrddb_file), stdout=subprocess.PIPE)
if os.path.exists(self.rrddb_xml_file): os.unlink(self.rrddb_xml_file)
fp = open(self.rrddb_xml_file, "a+")
for num, line in enumerate(p1.stdout):
for pattern in patterns:
matching = pattern.search(line)
if matching:
if len(matching.groups()) == 3:
logging.debug("%s %s" % (num, line.strip()))
line = line.replace(matching.groups()[1], init_value).replace(matching.groups()[2], init_value)
logging.debug("%s %s" % (num, line.strip()))
fp.write(line)
fp.close()
if os.path.exists(self.rrddb_new_file): os.unlink(self.rrddb_new_file)
p2 = subprocess.Popen(
(self.rrdtool, 'restore', self.rrddb_xml_file, self.rrddb_new_file), stdout=subprocess.PIPE)
p2.communicate()
os.chmod(self.rrddb_new_file, 0644)
if os.path.exists(self.rrddb_xml_file): os.unlink(self.rrddb_xml_file)
stop_update_time, last_input, last_output = self.get_rrddb_lastinfo(self.rrddb_file)
if start_update_time != stop_update_time:
self.get_rrd_update_value(self.rrddb_file, self.rrddb_new_file, start_update_time, stop_update_time, last_input, last_output)
os.rename(self.rrddb_file, "%s.orig.%s" % (self.rrddb_file, datetime.datetime.now().strftime("%Y%m%d%H%M")))
os.rename(self.rrddb_new_file, self.rrddb_file)
def backup_rrddb(self):
if not os.path.exists(self.rrddb_file):
logging.error("File not found %s" % self.rrddb_file)
return False
else:
"""
example.rrd backed up to example.rrd.bak.%Y%m%d%H%M
"""
try:
shutil.copyfile(self.rrddb_file, self.backup_file)
except:
return False
logging.debug("%s backed up to %s" % (self.rrddb_file, self.backup_file))
return True
class ConsumerThread(threading.Thread):
def __init__(self, name, queue, event):
super(ConsumerThread,self).__init__()
self.name = name
self.queue = queue
self.event = event
def run(self):
while not self.event.wait(0.5):
if not self.queue.empty():
item = self.queue.get()
logging.debug('Getting ' + str(item)
+ ' : ' + str(self.queue.qsize()) + ' items in queue')
try:
message = json.loads(item["message"])
command = message["command"]
username = message["username"]
remove_datetime = message["datetime"]
except (ValueError, KeyError):
logging.error("This is an invalid message %s" % item["message"])
if command == "remove":
try:
logging.info("Begin editing %s rrddb" % username)
r = RRDTool(username)
r.remove_rrddata(remove_datetime)
logging.info("%s rrddb modified complete" % username)
except:
logging.error(traceback.format_exc())
logging.debug("end")
|
9bf1a9a5da31495c596bfdb3f5d531fee267a12b
|
[
"Markdown",
"Python"
] | 4 |
Markdown
|
ms0011054/rrdtoold
|
cd98e50ce1950e0dc6f0ff1addcae6cf27911785
|
10f64c07abe5998614ebaa2518d6deae051cc526
|
refs/heads/master
|
<file_sep>DISCORD_TOKEN=<PASSWORD>
BOARD_NAMES=a,c,w,m,cgl,cm,n,jp,v,vg,vp,vr,co,g,tv,k,o,an,tg,sp,asp,sci,his,int,out,toy,po,p,ck,lit,mu,fa,3,gd,diy,wsg,qst,biz,trv,fit,x,adv,lgbt,mlp,news,wsr,vip,a,a,biz,biz,o,o,adv,adv,tv,tv,mlp,mlp,v,v,vg,vg
BOT_PREFIX=!
BOT_FREQUENCY=10000
INSULT_0=GITHUB_TOS<file_sep>const fetch = require('isomorphic-unfetch');
async function pages() {
return await fetch('https://a.4cdn.org/pol/catalog.json').then((r) =>
r.json()
);
}
async function threads() {
return (await pages()).map(page=>page.threads).reduce((p, c)=>[...p,...c]);
};
async function find(pattern) {
pattern = pattern instanceof RegExp ? pattern : new RegExp(pattern);
return (await threads()).filter(thread=>pattern.test(thread.sub));
};
//(async ()=>console.log(await threads()))();
module.exports.pages = pages;
module.exports.threads = threads;
module.exports.find = find;
<file_sep># thread-watcher
A discord bot that provides tools to monitor 4chan threads automatically
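
A rough usage sketch, assuming the `!` prefix from the example `.env` and the `watchers` command defined in `src/commands/watchers.js` (note the command only responds in its configured channel whitelist; the channel, pattern, and board names below are made-up placeholders):

```
!watchers add #thread-feed /happening/i pol
!watchers list #thread-feed
!watchers remove #thread-feed /happening/i
```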
<file_sep>const { Permissions, TextChannel, RichEmbed } = require('discord.js');
const Command = require('../structures/Command');
const Watcher = require('../structures/Watcher');
const FLAGS = Permissions.FLAGS;
const watchers = new Map();
module.exports = new Command(
{
name: 'watchers',
description: '',
usage: 'watchers [add|list|remove] channel pattern ...boards',
requires: new Permissions([
FLAGS.SEND_MESSAGES,
FLAGS.EMBED_LINKS,
FLAGS.MANAGE_MESSAGES
]),
permissions: new Permissions([
FLAGS.ADMINISTRATOR,
]),
overrides: ['671978823310639104'],
channels: {
mode: 'whitelist',
list: ['672263292781068288', '672254609615486996']
}
},
async (message, op, channel, pattern, ...boardlist) => {
if (!(channel instanceof TextChannel))
return await message.channel.send(
'expected a channel for second argument'
);
if (!message.guild.channels.has(channel.id))
return await message.channel.send(
'that channel is not in this guild'
);
if (!(pattern instanceof RegExp) && 'string' !== typeof pattern)
return await message.channel.send(
'expected string or regexp for pattern'
);
pattern = pattern instanceof RegExp ? pattern : new RegExp(pattern);
if (!watchers.has(channel.id)) watchers.set(channel.id, new Map());
const cwatchers = watchers.get(channel.id);
const source = pattern.source;
switch (op) {
case 'add':
if (cwatchers.has(source))
return await message.channel.send(
`already watching for ${source} in <#${channel.id}>`
);
cwatchers.set(source, new Watcher(channel, pattern, [...boardlist]));
await message.channel.send(
`started watching for ${source} in <#${channel.id}>`
);
break;
case 'list':
if (cwatchers.size == 0)
return await message.channel.send(
`no threads are being watched in <#${channel.id}>`
);
//TODO clean up hackfix for overflow on fields
let keys = [...cwatchers.keys()].slice(0, 24);
await message.channel.send(
new RichEmbed({
fields: keys.map((key) => ({
inline: false,
name: `${key}`,
value: `watching ${
cwatchers.get(key).messages.size
} threads`
}))
})
);
break;
case 'remove':
if(cwatchers.has(source)) {
cwatchers.get(source).kill();
cwatchers.delete(source);
await message.channel.send('OK');
break;
}
await message.channel.send(`not watching for ${source}`);
break;
default:
await message.channel.send(`${op} is not a valid op`);
break;
}
}
);
<file_sep>const EventEmitter = require('events');
const fetch = require('isomorphic-unfetch');
const assign = require('object-assign');
const Thread = require('./Thread');
// prettier-ignore
const IMPORTANT_KEYS = [
"no", "closed", "sub", "filename",
"ext", "tim", "replies", "images",
"id", "country", "last_modified"
];
class Board extends EventEmitter {
constructor(name) {
super();
this.name = name;
Object.defineProperty(this, 'threads', {
configurable: true,
writable: false,
enumerable: true,
value: new Map()
});
}
// prettier-ignore
async update(dest = 'catalog') {
let reduced=(await this.reduced('catalog')).map(thread=> assign(thread, {
closed: "closed" in thread ? thread.closed : false
}));
let threads=this.threads,values=[...threads.values()].map(thread=>assign(thread, {
closed: reduced.find((t) => t.no == thread.data.no) == null
}));
// new threads
const created = reduced.filter(({no})=>!threads.has(no));
created.forEach(data=>{
this.emit("created", data, this);
threads.set(data.no, new Thread(data, this));
}, this);
// updated threads
const updated = reduced.filter((d)=>threads.has(d.no)&&threads.get(d.no).update(d));
updated.forEach(data=>this.emit("updated", data, this), this);
// removed threads
const removed = values.filter((d)=>d.closed);
removed.forEach(data=>{
this.emit("removed", data, this);
threads.delete(data.data.no)
}, this);
}
async pages(dest = 'threads') {
if (dest != 'threads' && dest != 'catalog')
throw new TypeError("dest expected 'threads' or 'catalog`");
return await fetch(
`https://a.4cdn.org/${this.name}/${dest}.json`
).then((r) => r.json());
}
async reduced(dest = 'threads') {
return (await this.pages(dest)).reduce(
(p, c) => [...p, ...c.threads],
[]
);
}
async filtered(pattern) {
if (pattern instanceof RegExp || 'string' == typeof pattern)
return (await this.reduced('catalog')).filter((thread) =>
pattern.test(thread.sub)
);
else return [];
}
}
module.exports = Board;
<file_sep>const fetch = require('isomorphic-unfetch');
async function pages() {
return await fetch('https://a.4cdn.org/pol/threads.json').then((r) =>
r.json()
);
}
module.exports = async function stats() {
return (await pages())
.map((page) => page.threads)
.reduce((p, c) => [...p, ...c]);
};
module.exports.pages = pages;
<file_sep>require('dotenv').config();
const Discord = require('discord.js');
const client = new Discord.Client();
const prefix = process.env.BOT_PREFIX;
//TODO clean up regexp
const args_pattern = /(?:['"]|`{3}|`{1})((?:[^'"`\\]|\\.)*)(?:['"]|`{3}|`{1})|(?!")([^\s]+)(?!")/g;
let commands = require('./src/util/commandloader');
client.on('ready', () => {
console.log(`Logged in as ${client.user.tag}!`);
commands.forEach((command)=>{
command.client = client;
});
});
client.on('message', async (message) => {
const { author, content } = message;
if (author.bot || !content.startsWith(prefix)) return;
let [command, ...args] = [
...content.substr(prefix.length).matchAll(args_pattern)
].map((v) => v[1] || v[2]);
if(commands.has(command))
commands.get(command).execute(message, ...args);
});
client.login(process.env.DISCORD_TOKEN);<file_sep>class Thread {
constructor(data, board) {
this.data = data;
this.board = board;
}
update(data) {
let changed = false;
for (let key in data) {
if (key == 'last_replies') this.data[key] = data[key];
else changed = this.data[key] != data[key] || changed;
if (changed) this.data[key] = data[key];
}
return changed;
}
// prettier-ignore
close() { this.board.threads.delete(this.data.no); }
async stats() {
return (await this.board.reduced('catalog')).find(
(t) => t.no == this.data.no
);
}
}
module.exports = Thread;
<file_sep>const fs = require('fs');
const path = require('path');
const commands = new Map();
fs.readdirSync(path.resolve(__dirname, '../commands')).forEach((file) => {
const command = require(`../commands/${file}`);
if (commands.has(command.name))
throw new Error(
`attempted to override existing command ${
command.name
} in ${path.resolve(__dirname, `../commands/${file}`)}`
);
commands.set(command.name, command);
});
module.exports = commands;
<file_sep>const { RichEmbed } = require("discord.js");
class ThreadEmbed extends RichEmbed {
constructor(thread, active=true) {
super({
author: { name: `${thread.sub}` },
title: active
? `https://boards.4chan.org/pol/thread/${thread.no}`
: `https://archive.4plebs.org/pol/thread/${thread.no}`,
image: {
url: `https://i.4cdn.org/pol/${thread.tim}${thread.ext}`
},
color: active ? 0x00ff00 : 0xff0000,
fields: [
{
inline: false,
name: "replies",
value: `${thread.replies}`
},
{
inline: false,
name: "created",
value: `${thread.now}`
}
]
});
}
}
module.exports = ThreadEmbed;<file_sep>const { Permissions } = require('discord.js');
const REXEXP_PATTERN = /^\/(.*)\/((?:([gmisuy])(?!.*\3))*)$/;
const CHANNELS_PATTERN = /<#([0-9]+)>/;
const USERS_PATTERN = /<@!?([0-9]+)>/;
const ROLES_PATTERN = /<@&([0-9]+)>/;
class Command {
constructor(
{
name,
description,
usage,
permissions,
overrides,
requires,
channels
},
callback
) {
this.name = name;
this.description = description;
this.usage = usage;
this.permissions = permissions || Permissions.DEFAULT;
this.requires = requires || Permissions.DEFAULT;
this.overrides = overrides || [];
this.callback = callback;
this.client = null;
this.channels = channels || { mode: 'blacklist', list: [] };
}
get bound() {
return this.client != null;
}
can({ permissions: { bitfield }, roles }) {
for (let override of this.overrides)
if (roles.has(override)) return true;
const pbitfield = this.permissions.bitfield;
return (bitfield & pbitfield) == pbitfield;
}
transform(args) {
return [...args].map((value) => {
if (!this.bound) return value;
if (CHANNELS_PATTERN.test(value)) {
const channelid = value.match(CHANNELS_PATTERN)[1];
if (this.client.channels.has(channelid))
return this.client.channels.get(channelid);
}
if (USERS_PATTERN.test(value)) {
const userid = value.match(USERS_PATTERN)[1];
if (this.client.users.has(userid))
return this.client.users.get(userid);
}
if (ROLES_PATTERN.test(value)) {
const roleid = value.match(ROLES_PATTERN)[1];
if (this.client.roles.has(roleid))
return this.client.roles.get(roleid);
}
if (REXEXP_PATTERN.test(value)) return new RegExp(value);
return value;
});
}
async execute(message, ...args) {
const perms = message.guild.me.permissionsIn(message.channel);
const result = this.channels.list.find(id=>id==message.channel.id);
if(this.channels.mode=="blacklist" ? result : !result)
return;
if (!perms.has(this.requires) || !this.can(message.member))
return false;
return await this.callback(message, ...this.transform(args));
}
}
Object.defineProperty(Command, 'property1', {
value: 42,
writable: false
});
module.exports = Command;
|
83f8434c5b6d4be142be4fd59c94acaaf50af536
|
[
"JavaScript",
"Markdown",
"Shell"
] | 11 |
Shell
|
ACSD/thread-watcher
|
96dd269a34048778fa04e63f97b925942efd7f65
|
10e4dbfc27378c63784eb8c7c9038e56755cfcf4
|
refs/heads/master
|
<repo_name>IVnaBo/SwiftCookBook<file_sep>/Podfile
target 'SwiftCookBook' do
platform :ios, '9.0'
use_frameworks!
pod 'SnapKit'
pod 'Kingfisher'
pod 'SVProgressHUD'
pod 'Alamofire'
pod 'SwiftyJSON'
pod 'MJRefresh'
end
<file_sep>/SwiftCookBook/SearchCtrl.swift
//
// SearchCtrl.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/15.
// Copyright © 2018 ivna. All rights reserved.
//
import UIKit
import MJRefresh
import SVProgressHUD
class SearchCtrl: UIViewController,UISearchBarDelegate {
let identify = "identify"
var currentP:Int = 0
var totalPage:Int = 1
var cid:String = "2"
override func viewDidLoad() {
super.viewDidLoad()
self.view.backgroundColor = UIColor.white
setUpUI()
refreshHeaderData()
let notifation = Notification.Name(rawValue: COOKTAGCHANGE)
/**
* -- Observe the COOKTAGCHANGE notification
*/
NotificationCenter.default.addObserver(self, selector: #selector(receiveNoti(noti:)), name: notifation, object: nil)
// Do any additional setup after loading the view.
}
@objc func receiveNoti(noti:Notification){
let useInfo = noti.userInfo as! [String:String]
let value = useInfo["cid"]
self.cid = value!
print("接收通知")
print(self.cid)
self.refreshHeaderData()
}
/**
* -- Refresh the data (header pull-to-refresh)
*/
@objc func refreshHeaderData(){
/// Clear all existing data before reloading
self.dataArr.removeAll()
NetWorkTool.shareNetWorkTool.cookIndexCidRequest(page:currentP,Cid: self.cid) { (models, currentP, totalP) in
self.currentP = currentP
self.dataArr = models
self.totalPage = totalP / 10
self.cookTable.reloadData()
self.cookTable.mj_header.endRefreshing()
}
}
/**
* -- Load the next page of data
*/
@objc func loadMoreData(){
currentP = currentP + 1
print(currentP)
NetWorkTool.shareNetWorkTool.cookIndexCidRequest(page: currentP , Cid: self.cid) { (models, currentP, totalP) in
for value in models{
self.dataArr.append(value)
}
self.totalPage = totalP / 10
self.cookTable.reloadData()
if(self.currentP > self.totalPage){
self.cookTable.mj_footer.endRefreshingWithNoMoreData()
}else{
self.cookTable.mj_footer.endRefreshing()
}
}
}
/**
* -- Lazily loaded table view
-- cookTable
*/
fileprivate lazy var cookTable:UITableView = {
let leftTab = UITableView.init(frame: CGRect.init(x: 0, y: 0, width: screen_W, height: screen_H - 64 ), style: .plain)
leftTab.delegate = self
leftTab.dataSource = self
/// Register the cell
leftTab.register(UINib.init(nibName: "CookItemCell", bundle: Bundle.main), forCellReuseIdentifier: identify)
leftTab.mj_header = MJRefreshNormalHeader.init(refreshingTarget: self, refreshingAction: #selector(refreshHeaderData))
leftTab.mj_footer = MJRefreshAutoNormalFooter.init(refreshingTarget: self, refreshingAction: #selector(loadMoreData))
leftTab.showsVerticalScrollIndicator = false
leftTab.showsHorizontalScrollIndicator = false
return leftTab
}()
fileprivate lazy var dataArr:[CookTagModel] = {
return [CookTagModel]()
}()
func setUpUI() {
let searchBar = UISearchBar.init(frame: CGRect.init(x: 10, y: 0, width: screen_W - 20, height: 45))
searchBar.barStyle = .default
searchBar.placeholder = "搜你想要的..."
searchBar.showsCancelButton = true
searchBar.tintColor = UIColor.orange
searchBar.barTintColor = UIColor.gray
searchBar.searchBarStyle = .minimal
searchBar.delegate = self
// horizontal and vertical offsets
searchBar.searchTextPositionAdjustment = UIOffsetMake(20, 0)
// searchBar TextField
let searchField = searchBar.value(forKey: "_searchField") as! UITextField
/// Modify internal properties via KVC
searchField.setValue(UIColor.gray, forKeyPath: "_placeholderLabel.textColor")
searchField.setValue(UIFont.systemFont(ofSize: 14), forKeyPath: "_placeholderLabel.font")
self.cookTable.tableHeaderView = searchBar
self.view.addSubview(cookTable)
// self.navigationItem.titleView = searchBar
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
deinit {
NotificationCenter.default.removeObserver(self)
}
/*
// MARK: - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
// Get the new view controller using segue.destinationViewController.
// Pass the selected object to the new view controller.
}
*/
}
extension SearchCtrl:UITableViewDelegate,UITableViewDataSource{
// MARK: dat
func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int{
return (self.dataArr.count)
}
func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell{
let cell = tableView.dequeueReusableCell(withIdentifier: identify) as! CookItemCell
cell.reloadData(self.dataArr[indexPath.row])
return cell
}
func tableView(_ tableView: UITableView, heightForRowAt indexPath: IndexPath) -> CGFloat {
return 240
}
/**
* -- Cell selection handler
*/
func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
let array = self.dataArr[indexPath.row].steps
var imageList:[(title:String,url:String)] = []
for value in array! {
imageList.append((value["step"].stringValue,value["img"].stringValue))
}
let cookDetailVc = CookDetailCtrl()
cookDetailVc.navigationItem.title = self.dataArr[indexPath.row].title
cookDetailVc.imageList = imageList
self.navigationController?.pushViewController(cookDetailVc, animated: true)
}
/**
* -- Cancel button tapped: dismiss the keyboard (resign first responder)
*/
func searchBarCancelButtonClicked(_ searchBar: UISearchBar) {
searchBar.resignFirstResponder()
}
/**
* -- Search button tapped: perform the search
*/
func searchBarSearchButtonClicked(_ searchBar: UISearchBar) {
let searchTxt = searchBar.text
if (searchTxt?.isEmpty)! {
SVProgressHUD.showInfo(withStatus: "请输入想要查询的菜名")
return
}else{
// URL-encode the Chinese characters in the search term
// If the request returns data: clear the existing results, assign the new data, and display it.
// If nothing is found: show a "no results" hint.
/// Request URL: http://apis.juhe.cn/cook/query.php
// Request params: menu=%E9%AA%A8&dtype=&pn=&rn=&albums=&key=43589755dfa048431ce4e330d2161238
}
}
}
<file_sep>/SwiftCookBook/v/CookItemCell.swift
//
// CookItemCell.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/15.
// Copyright © 2018 ivna. All rights reserved.
//
import UIKit
import Kingfisher
class CookItemCell: UITableViewCell {
@IBOutlet weak var cookImg: UIImageView!
@IBOutlet weak var cookTitle: UILabel!
@IBOutlet weak var imtro: UILabel!
override func awakeFromNib() {
super.awakeFromNib()
// Initialization code
}
override func setSelected(_ selected: Bool, animated: Bool) {
super.setSelected(selected, animated: animated)
// Configure the view for the selected state
}
func reloadData(_ model:CookTagModel) {
cookImg.kf.setImage(with: URL.init(string: model.albums!), placeholder: UIImage.init(named: "placehoder"), options: nil, progressBlock: nil, completionHandler: nil)
cookTitle.text = model.title
imtro.text = model.imtro
}
}
<file_sep>/SwiftCookBook/tool/UIImageView+Extension.swift
//
// UIImageView+Extension.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/15.
// Copyright © 2018 ivna. All rights reserved.
//
import UIKit
import Kingfisher
extension UIImageView{
func setImageByUrl(_ url: String,_ imgName:String) {
let placeholder = UIImage(named: imgName)
if url == ""{
self.image = placeholder
}
else{
let url = url.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed)
//self.kf.setImage(with:URL(string: url!)!)
//print(url!)
self.kf.setImage(with: URL(string: url!)!, placeholder: placeholder, options: nil, progressBlock: nil, completionHandler: { (image, error, cacheType, imageURL) in
self.image = (image == nil) ? placeholder : image
}
)
}
}
func setImageByUrl(_ url: String) {
let placeholder = UIImage(named: "default5")
if url == ""{
self.image = placeholder
}
else{
let url = url.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed)
//self.kf.setImage(with:URL(string: url!)!)
//print(url!)
self.kf.setImage(with: URL(string: url!)!, placeholder: placeholder, options: nil, progressBlock: nil, completionHandler: { (image, error, cacheType, imageURL) in
self.image = (image == nil) ? placeholder : image
}
)
}
}
}
<file_sep>/SwiftCookBook/ViewController.swift
//
// ViewController.swift
// CookBook--Swift
//
// Created by ivna.evecom on 2018/3/14.
// Copyright © 2018 ivna. All rights reserved.
//
import UIKit
class ViewController: UIViewController {
let leftId:String = "LeftID"
let rightId:String = "RightID"
/**
* -- Records the scroll offset from the previous scroll event
*/
var lastOffsetY:CGFloat = 0
var isScrollDown:Bool = false
override func viewDidLoad() {
super.viewDidLoad()
self.view.backgroundColor = UIColor.white
self.navigationItem.title = "菜谱大全"
self.view.addSubview(self.leftTable)
self.view.addSubview(self.rightTable)
NetWorkTool.shareNetWorkTool.cookCategoryRequest { (leftValue, rightValue) in
self.leftData = leftValue
self.rightData = rightValue
self.leftTable.reloadData()
self.rightTable.reloadData()
}
// Do any additional setup after loading the view, typically from a nib.
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
/**
* -- Lazily loaded
-- leftTable
*/
fileprivate lazy var leftTable:UITableView = {
let leftTab = UITableView.init(frame: CGRect.init(x: 0, y: 0, width: 130, height: screen_H - 64 ), style: .plain)
leftTab.delegate = self
leftTab.dataSource = self
leftTab.showsVerticalScrollIndicator = false
leftTab.showsHorizontalScrollIndicator = false
return leftTab
}()
/**
* -- rightTable
*/
fileprivate lazy var rightTable:UITableView = {
let rightTab = UITableView.init(frame: CGRect.init(x: 130, y: 0, width: screen_W - 130, height: screen_H - 64 ), style: .plain)
rightTab.delegate = self
rightTab.dataSource = self
rightTab.showsHorizontalScrollIndicator = false
rightTab.showsVerticalScrollIndicator = false
return rightTab
}()
// MARK: Lazily loaded data sources
fileprivate lazy var leftData:[LeftModel] = {
return [LeftModel]()
}()
fileprivate lazy var rightData:[[RightModel]] = {
return [[RightModel]]()
}()
}
// MARK: - UITableViewDelegate UITableViewDataSource
extension ViewController:UITableViewDelegate,UITableViewDataSource{
/**
* -- Row count for each of the two tables
*/
func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int{
if(tableView == self.leftTable){
return self.leftData.count
}else{
return self.rightData[section].count
}
}
/**
* -- Number of sections: the left table always has 1, the right table has one section per left-hand category
*/
func numberOfSections(in tableView: UITableView) -> Int {
if(tableView == self.rightTable){
return self.leftData.count
}else{
return 1
}
}
/**
* -- Build the cell for the given tableView
*/
func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell{
if tableView == self.leftTable {
var cell = tableView.dequeueReusableCell(withIdentifier: leftId)
if(cell == nil){
cell = UITableViewCell.init(style: .default, reuseIdentifier: leftId)
cell?.textLabel?.font = UIFont.systemFont(ofSize: 13)
cell?.textLabel?.textAlignment = .center
}
cell?.textLabel?.text = self.leftData[indexPath.row].name
return cell!
}else{
var cell = tableView.dequeueReusableCell(withIdentifier: rightId)
if cell == nil{
cell = UITableViewCell.init(style: .default, reuseIdentifier: rightId)
cell?.textLabel?.font = UIFont.systemFont(ofSize: 13)
cell?.textLabel?.textAlignment = .center
// print("11111") /// cell-reuse test
print("section\(indexPath.section) " + " row\(indexPath.row)")
}
cell?.textLabel?.text = self.rightData[indexPath.section][indexPath.row].name
return cell!
}
}
/**
* -- Section header view; only the right table shows one
*/
func tableView(_ tableView: UITableView, viewForHeaderInSection section: Int) -> UIView? {
if(tableView == self.rightTable){
let headLab = UILabel.init(frame: CGRect.init(x: 0, y: 0, width: screen_W, height: 25))
headLab.font = UIFont.systemFont(ofSize: 13)
headLab.backgroundColor = UIColor.init(red: 244 / 255, green: 244 / 255, blue: 244 / 255, alpha: 1)
headLab.text = self.leftData[section].name
return headLab
}else{
return nil
}
}
/**
* -- Header view height
*/
func tableView(_ tableView: UITableView, heightForHeaderInSection section: Int) -> CGFloat {
if(tableView == self.leftTable){ /// hide the header of the left table's only section
return 0.01
}else{
return 25
}
}
/**
* -- When a row in the left table is tapped, scroll the right table to the matching section
*/
func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
if(tableView == self.leftTable){
let indexP = IndexPath.init(row: 0, section: indexPath.row)
self.rightTable.scrollToRow(at: indexP, at: .top, animated: true)
self.leftTable.scrollToRow(at: indexPath, at: .middle, animated: true)
}else{
let notifation = NSNotification.Name(rawValue:COOKTAGCHANGE)
NotificationCenter.default.post(name: notifation, object: self, userInfo: ["cid":self.rightData[indexPath.section][indexPath.row].cookId ?? "3"])
print("发送通知!")
self.tabBarController?.selectedIndex = 1
}
}
// MARK: Scroll-related delegate methods
/**
* -- When scrolling down, scrollView.contentOffset.y gradually increases; when scrolling up it decreases
*/
func scrollViewDidScroll(_ scrollView: UIScrollView) {
if rightTable == scrollView {
print(scrollView.contentOffset.y)
isScrollDown = lastOffsetY < scrollView.contentOffset.y // the previous offset being smaller than the current one means we are scrolling down
lastOffsetY = scrollView.contentOffset.y /// remember the current offset for the next comparison
}
}
/**
* -- While the right table is being dragged, keep the left table's selection in sync
*/
func selectRowAtIndexPath(indexP:Int) {
self.leftTable.selectRow(at: IndexPath.init(row: indexP, section: 0), animated: true, scrollPosition: .bottom)
}
/**
* -- A section header is about to be displayed
*/
func tableView(_ tableView: UITableView, willDisplayHeaderView view: UIView, forSection section: Int) {
// The current tableView is the right table, it is scrolling up, and the scroll was produced by the user dragging (this distinguishes a user drag on the right table from a scroll triggered by tapping the left table)
if(rightTable == tableView ) && !isScrollDown && rightTable.isDragging{
self.selectRowAtIndexPath(indexP: section)
}
}
/**
* -- A section header has finished displaying
*/
func tableView(_ tableView: UITableView, didEndDisplayingHeaderView view: UIView, forSection section: Int) {
// The current tableView is the right table, it is scrolling down, and the scroll was produced by the user dragging (same check as above: a user drag rather than a tap on the left table)
if(rightTable == tableView ) && isScrollDown && rightTable.isDragging{
/** When scrolling down, the selected category on the left should also move down by one */
self.selectRowAtIndexPath(indexP: section + 1)
}
}
}
<file_sep>/SwiftCookBook/m/CookTagModel.swift
//
// CookTagModel.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/15.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
import SwiftyJSON
class CookTagModel: NSObject {
var cookId:String?
var title:String?
var tags:String?
var imtro:String?
var albums:String?
var steps:[JSON]?
init(json:JSON) {
cookId = json["id"].stringValue
title = json["title"].stringValue
tags = json["tags"].stringValue
imtro = json["imtro"].stringValue
albums = json["albums"].arrayValue.first?.stringValue
steps = json["steps"].arrayValue
}
}
<file_sep>/SwiftCookBook/main/BsNavitionVc.swift
//
// BsNavitionVc.swift
// Joke_Swift
//
// Created by ivna.evecom on 2018/3/7.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
class BsNavitionVc: UINavigationController,UIGestureRecognizerDelegate{
override func viewDidLoad() {
super.viewDidLoad()
let navigationBar = UINavigationBar.appearance()
navigationBar.setBackgroundImage(UIImage.init(named: "navigation_background"), for: .default)
self.interactivePopGestureRecognizer?.delegate = self
// Do any additional setup after loading the view.
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
override func pushViewController(_ viewController: UIViewController, animated: Bool) {
if viewControllers.count > 0 { /// if the stack already contains view controllers
viewController.hidesBottomBarWhenPushed = true
viewController.navigationItem.leftBarButtonItem = UIBarButtonItem.init(image: UIImage.init(named: "lefterbackicon_titlebar_24x24_"), style: .plain, target: self, action: #selector(navigationBack))
}
super.pushViewController(viewController, animated: true)
}
@objc func navigationBack(){
popViewController(animated: true)
}
/**
* -- Interactive swipe-back gesture
*/
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool {
if self.childViewControllers.count > 1 { ///only when this is not the root controller
return true
}else{
return false
}
}
/*
// MARK: - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
// Get the new view controller using segue.destinationViewController.
// Pass the selected object to the new view controller.
}
*/
}
<file_sep>/README.md
# SwiftCookBook
# Uses the JuHe (juhe.cn) aggregated data API
# Built with some basic Swift third-party frameworks: SwiftyJSON, SnapKit, Alamofire, etc.
<file_sep>/SwiftCookBook/tool/Config.swift
//
// Config.swift
// CookBook--Swift
//
// Created by ivna.evecom on 2018/3/14.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
let screen_W:CGFloat = UIScreen.main.bounds.size.width
let screen_H:CGFloat = UIScreen.main.bounds.size.height
/// Build a UIColor from RGBA components
func MQColor( r:CGFloat, g:CGFloat, b:CGFloat, a:CGFloat) -> UIColor {
return UIColor(red: r / 255.0, green: g / 255.0, blue: b / 255.0, alpha: a)
}
//Background color
func MQBackColor() -> UIColor {
return MQColor(r:54, g: 171, b: 254, a: 1)
}
let COOKTAGCHANGE:String = "cookTagChange"
/// iPhone 5
let isIPhone5 = screen_H == 568 ? true : false
/// iPhone 6
let isIPhone6 = screen_H == 667 ? true : false
/// iPhone 6P
let isIPhone6P = screen_H == 736 ? true : false
let fontSize16:CGFloat = isIPhone5 ? 14 : 16
let fontSize18:CGFloat = isIPhone5 ? 16 : 18
let fontSize14:CGFloat = isIPhone5 ? 12 : 14
let fontSize12:CGFloat = isIPhone5 ? 10 : 12
let fontSize22:CGFloat = isIPhone5 ? 20 : 22
let fontSize20:CGFloat = isIPhone5 ? 18 : 20
let fontSize24:CGFloat = isIPhone5 ? 22 : 24
extension String{
//Percent-encode non-ASCII (e.g. Chinese) characters in the URL
func getUrlString() ->String{
return self.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed)!
}
}
<file_sep>/SwiftCookBook/m/RightModel.swift
//
// RightModel.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/14.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
import SwiftyJSON
class RightModel: NSObject {
var cookId:String?
var name:String?
init(json:JSON) {
cookId = json["id"].stringValue
name = json["name"].stringValue
}
}
<file_sep>/SwiftCookBook/tool/NetWorkTool.swift
//
// NetWorkTool.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/14.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
import SVProgressHUD
import Alamofire
import SwiftyJSON
/**
* -- Network request utility class
*/
class NetWorkTool: NSObject {
/**
* -- Make init private
*/
fileprivate override init() {}
/**
* -- Shared singleton instance
*/
static let shareNetWorkTool = NetWorkTool()
/**
* -- Home-page category request
* -- Returns the models for the left and right tables
*/
func cookCategoryRequest(_ finished:@escaping(_ leftModels:[LeftModel],_ rightData:[[RightModel]] ) ->()){
SVProgressHUD.showInfo(withStatus: "加载中...")
let url = "http://apis.juhe.cn/cook/category?parentid=&dtype=&key=43589755dfa048431ce4e330d2161238"
Alamofire.request(url).responseJSON { (response) in
var leftModels = [LeftModel]()
// var rightModels = [RightModel]()
var rightData = [[RightModel]]() /// nested two levels deep
guard response.result.isSuccess else{
SVProgressHUD.showError(withStatus: "数据请求失败...")
finished(leftModels,rightData) /// return empty data when the request fails
return
}
if let values = response.result.value{
let json = JSON(values)
let resultCode = json["resultcode"].stringValue
if resultCode == "200"{
let resultArr = json["result"].arrayValue
for leftValue in resultArr{
leftModels.append(LeftModel.init(json: leftValue))
/// in the outer loop, each left-hand category corresponds to one array on the right
var rightModels = [RightModel]()
let leftList = leftValue["list"].arrayValue
for rightV in leftList{/// inner loop
rightModels.append(RightModel.init(json: rightV))
}
// array -> array -> model
rightData.append(rightModels)
}
finished(leftModels,rightData)
}else{
finished(leftModels,rightData) /// return empty data when the result code is not 200
}
}else{
finished(leftModels,rightData) /// return empty data when the response cannot be parsed
}
SVProgressHUD.dismiss()
}
}
/// Request the recipe list for a given tag
///
/// - Parameters:
/// - page: current page number
/// - Cid: current tag (category) id
/// - finished: result callback
func cookIndexCidRequest(page:Int,Cid:String,_ finished:@escaping(_ models:[CookTagModel],_ currentP:Int,_ totalNum:Int)->()){
//cid=3&dtype=&pn=&rn=&format=&key=43589755dfa048431ce4e330d2161238
// let url = "http://apis.juhe.cn/cook/index?" + "cid=\(Cid)" + "&dtype=&pn)&rn=&format=&key=43589755dfa048431ce4e330d2161238"
let url = "http://apis.juhe.cn/cook/index?" + "cid=\(Cid)" + "&pn=\(page)" + "&dtype=&rn=&format=&key=43589755dfa048431ce4e330d2161238"
Alamofire.request(url).responseJSON { (response) in
var dataArrs = [CookTagModel]()
guard response.result.isSuccess else{
SVProgressHUD.showError(withStatus: "请求出错")
finished(dataArrs,0,0)
return
}
if let values = response.result.value{
let json = JSON(values)
let resultCode = json["resultcode"].stringValue
if resultCode == "200"{
let resultDic = json["result"].dictionary
let totalNum = resultDic!["totalNum"]?.intValue
let currentP = resultDic!["pn"]?.intValue
let dataArr = resultDic!["data"]?.arrayValue
for value in dataArr!{
dataArrs.append(CookTagModel.init(json: value))
}
finished(dataArrs,currentP!,totalNum!)
}
}
}
}
}
<file_sep>/SwiftCookBook/m/LeftModel.swift
//
// LeftModel.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/14.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
import SwiftyJSON
class LeftModel: NSObject {
var parentId:Int?
var name :String?
init(json:JSON) {
parentId = json["parentId"].int
name = json["name"].stringValue
}
}
<file_sep>/SwiftCookBook/CookDetailCtrl.swift
//
// CookDetailCtrl.swift
// SwiftCookBook
//
// Created by ivna.evecom on 2018/3/15.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
class CookDetailCtrl: UIViewController{
var sliderGalley:SliderGalleryController!
var imageList:[(title:String,url:String)] = []
override func viewDidLoad() {
super.viewDidLoad()
self.view.backgroundColor = UIColor.white
sliderGalley = SliderGalleryController()
sliderGalley.delegate = self
sliderGalley.view.frame = CGRect.init(x: 0, y: 80, width: screen_W, height: screen_H * 0.3)
self.view.addSubview(sliderGalley.view)
self.addChildViewController(sliderGalley)
// Do any additional setup after loading the view.
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
/*
// MARK: - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
// Get the new view controller using segue.destinationViewController.
// Pass the selected object to the new view controller.
}
*/
}
// MARK: - UIScrollViewDelegate
extension CookDetailCtrl: SliderGalleryControllerDelegate {
//Image carousel protocol method: size of the inner scrollView
func galleryScrollerViewSize() -> CGSize {
return CGSize(width: screen_W, height: screen_H * 0.3)
}
//Image carousel protocol method: data source collection
func galleryDataSource() -> [(title:String,url:String)] {
return self.imageList
}
}
<file_sep>/SwiftCookBook/main/BsTabBarVc.swift
//
// BsTabBarVc.swift
// Joke_Swift
//
// Created by ivna.evecom on 2018/3/7.
// Copyright © 2018年 ivna. All rights reserved.
//
import UIKit
class BsTabBarVc: UITabBarController {
override func viewDidLoad() {
super.viewDidLoad()
addChildViewControllers()
// Do any additional setup after loading the view.
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
/// Add child view controllers
func addChildViewControllers() {
setChildViewController(ViewController(), title: "分类", imgName: "首页")
setChildViewController(SearchCtrl(), title: "推荐", imgName: "教玩具")
}
///Configure a child controller and wrap it in a navigation controller
private func setChildViewController(_ childController :UIViewController,title:String,imgName:String){
childController.tabBarItem.title = title
/// icon for the normal state
childController.tabBarItem.image = UIImage.init(named: imgName + "_1")?.withRenderingMode(.alwaysOriginal)
/// icon for the selected state
childController.tabBarItem.selectedImage = UIImage.init(named: imgName)?.withRenderingMode(.alwaysOriginal)
let navVc = BsNavitionVc.init(rootViewController: childController)
addChildViewController(navVc)
}
/*
// MARK: - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
// Get the new view controller using segue.destinationViewController.
// Pass the selected object to the new view controller.
}
*/
}
|
beed5026ef846c70a071dd70199a421f54014b92
|
[
"Swift",
"Ruby",
"Markdown"
] | 14 |
Ruby
|
IVnaBo/SwiftCookBook
|
461af0ef2807ca24f2dd733d13a6708655535414
|
6f5eb7c9af6d960868c9765e6415823f87cdc86d
|
refs/heads/master
|
<file_sep>const express = require('express')
const db = require('./models')
const app = express();
const bodyParser = require('body-parser')
const cookieParser = require("cookie-parser");
const userRoutes = require('./routes/user')
const port = process.env.PORT || 3000
app.use(bodyParser.json());
app.use(express.urlencoded({extended:true}));
app.use(express.json());
app.use(cookieParser());
app.use("/api",userRoutes)
db.sequelize.sync().then( () => {
app.listen(port, () => {
console.log("DB CONNECTED")
console.log(`listening at: http://localhost:${port}`)
});
});<file_sep>const express = require ('express');
const db = require('../models');
const router = express.Router();
const{check, validationResult} = require('express-validator');
const { signup,signin } = require('../controller/user');
//Signup Route
router.post('/signup',
[
check("firstName").isLength({ min:3 }).withMessage('Name must be at least 3 chars long'),
check("lastName").isLength({ min: 3 }).withMessage('Name must be at least 3 chars long'),
check("email").isEmail().withMessage('Please enter correct email address'),
check("password").isLength({ min: 6 }).withMessage('Password must be at least 6 chars long')
],
signup);
//Sign In Route
router.post("/signin",[
check("email").isEmail().withMessage('Please enter valid email address'),
check("password").isLength({ min: 6 }).withMessage('Please enter valid password')
], signin);
router.get ('/all',(req,res) => {
db.User.findAll().then((newuser) => res.send(newuser));
});
module.exports = router;
|
aedff78095eaf9001692a7ae00e2b4198a6349e6
|
[
"JavaScript"
] | 2 |
JavaScript
|
manpreet711/UserLogin
|
0e3a12884f8fc94e5364c0088a158c2cc677c150
|
76fc3a38bd764b8c5e5fa0e02bfae965af9161a8
|
refs/heads/master
|
<file_sep>module Helper
def instance_running?(*args)
ec2_instance_state(*args) == 'running'
end
def instance_stopped?(*args)
ec2_instance_state(*args) == 'stopped'
end
def instance_stopping?(*args)
ec2_instance_state(*args) == 'stopping'
end
def instance_terminated?(*args)
ec2_instance_state(*args) == 'terminated'
end
private
def ec2_instance_state(instance_id,
aws_access_key_id = nil,
aws_secret_access_key = nil)
aws_region = 'us-east-1'
if node.ec2
aws_region = node.ec2.placement_availability_zone.chop
end
if aws_access_key_id
ec2_con = AWS::EC2::Client::V20130815.new(
:access_key_id => aws_access_key_id,
:secret_access_key => aws_secret_access_key,
:region => aws_region)
else
ec2_con = AWS::EC2::Client::V20130815.new(:region => aws_region)
end
begin
resp = ec2_con.describe_instance_status(
:instance_ids => ["#{instance_id}"],
:include_all_instances => true)
instance = resp[:instance_status_set].first
state = instance[:instance_state][:name]
return state
rescue AWS::EC2::Errors::InvalidInstanceID::NotFound
return 'notfound'
end
end
end
<file_sep>class Chef::Recipe::Jlaws
# Retrieve a specified file from S3 using an EC2 instances IAM credentials
# It's designed for retrieving encrypted data bag keys
def self.S3DataBagSecret(s3_bucket, s3_path)
s3 = AWS::S3.new()
s3_file = s3.buckets[s3_bucket].objects[s3_path]
s3_string = s3_file.read
s3_string
end
end
<file_sep>#
# Cookbook Name:: jlaws
# Recipe:: default
#
include_recipe 'build-essential::default'
case node['platform_family']
when 'debian'
zlib_pkg = 'zlib1g-dev'
when 'rhel'
zlib_pkg = 'zlib-devel'
end
zlib = package zlib_pkg do
action :install
end
zlib.run_action(:install)
# workaround the xml cookbook locking nokogiri
# and install the same version before we install aws-sdk-v1
if node.recipes.include?('xml')
x = chef_gem 'nokogiri' do
action :nothing
version node['xml']['nokogiri']['version']
only_if { node['xml']['nokogiri']['version'] }
end
x.run_action(:install)
end
c = chef_gem 'aws-sdk-v1' do
action :install
version node['jlaws']['aws_sdk_ver']
# Use force to make a side by side gem install work with aws-sdk < 2.0
# (The aws cookbook use the v2 sdk)
options('--no-wrappers --force')
end
c.run_action(:install)
require 'aws-sdk-v1'
<file_sep>require_relative 'spec_helper'
describe "Install aws-sdk gem" do
describe command("/opt/chef/embedded/bin/gem specification aws-sdk version | awk '/version/ { print $2}'") do
its(:stdout) { should match "1.39.0" }
end
end
describe 'Install zlib1g-dev' do
describe package('zlib1g-dev') do
it { should be_installed }
end
end
<file_sep>def whyrun_supported?
true
end
action :create do
do_s3_file(:create)
new_resource.updated_by_last_action(true)
end
action :create_if_missing do
do_s3_file(:create_if_missing)
new_resource.updated_by_last_action(true)
end
action :delete do
do_s3_file(:delete)
new_resource.updated_by_last_action(true)
end
action :touch do
do_s3_file(:touch)
new_resource.updated_by_last_action(true)
end
def do_s3_file(resource_action)
s3 = AWS::S3.new()
s3_file = s3.buckets[new_resource.bucket].objects[new_resource.remote_path]
file new_resource.name do
path new_resource.path
content s3_file.read
owner new_resource.owner
group new_resource.group
mode new_resource.mode
backup new_resource.backup
action resource_action
end
end
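# Example use of this provider from a recipe (a sketch only: the resource name
# `jlaws_s3_file` is assumed from the cookbook name, and the bucket/key/path
# values below are placeholders, not taken from this repository):
#
#   jlaws_s3_file 'fetch encrypted data bag key' do
#     path        '/etc/chef/encrypted_data_bag_secret'
#     bucket      'my-config-bucket'
#     remote_path 'keys/encrypted_data_bag_secret'
#     owner       'root'
#     group       'root'
#     mode        '0600'
#     action      :create_if_missing
#   end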
<file_sep>default['build-essential']['compile_time'] = true
default['jlaws']['aws_sdk_ver'] = '1.63.0'
<file_sep>name 'instance-status'
depends 'jlaws'
<file_sep>include_recipe 'jlaws'
Chef::Resource::Log.send(:include, Jlaws::Helper)
log "Node i-422b873d (leeroy) is running!" do
only_if { instance_running?('i-422b873d',
node['aws_access_key_id'],
node['aws_secret_access_key']) }
end
# Node is stopped
log "Node i-670fc98b is terminated!" do
only_if { instance_terminated?('i-670fc98b',
node['aws_access_key_id'],
node['aws_secret_access_key']) }
end
# Node is NOT terminated - doesn't log this
log "Node i-67b71686 is terminated!" do
only_if { instance_terminated?('i-67b71686',
node['aws_access_key_id'],
node['aws_secret_access_key']) }
end
# Node does not exist
log "Node i-1111111 is not logged as it does not exist!" do
only_if { instance_terminated?('i-111111',
node['aws_access_key_id'],
node['aws_secret_access_key']) }
end
<file_sep>source 'https://supermarket.getchef.com'
metadata
group :integration do
cookbook 'instance-status', path: 'test/fixtures/cookbooks/instance-status'
# Just for testing nokogiri workaround in default.rb
cookbook 'xml'
end
|
7b5c90c33f953a48181d954278504308a912d8f1
|
[
"Ruby"
] | 9 |
Ruby
|
AaronKalair/jlaws
|
b039ed79b7ebe1a63f310d2f2f7edb2f05e1a711
|
3d191faffb237ff5a2c4549b0fcc006265291f57
|
refs/heads/master
|
<repo_name>kshnam/Tensorflow_Tutorial<file_sep>/Load_and_preprocess_data/Load_and_preprocess_data(pandas_DataFrame).py
from __future__ import absolute_import, division, print_function, unicode_literals
import pandas as pd
import tensorflow as tf
# Download the heart.csv file.
csv_file = tf.keras.utils.get_file('heart.csv', 'https://storage.googleapis.com/applied-dl/heart.csv')
# Read the downloaded file into a pandas DataFrame.
df = pd.read_csv(csv_file)
# df.dtypes: check the column types and make sure every column is int or float.
# The thal column has dtype object, so convert it to integer codes (1,2,3,4) via pd.Categorical.
df['thal'] = pd.Categorical(df['thal'])
df['thal'] = df.thal.cat.codes
# Split off the label column so the dataset has the form [([x], [label]), ...].
target = df.pop('target')
dataset = tf.data.Dataset.from_tensor_slices((df.values, target.values))
train_dataset = dataset.shuffle(len(df)).batch(1)
# Build the model
def get_compiled_model():
model = tf.keras.Sequential([tf.keras.layers.Dense(10, activation='relu'),tf.keras.layers.Dense(10,activation = 'relu'),tf.keras.layers.Dense(1, activation = 'sigmoid')])
model.compile(optimizer = 'adam', loss = 'binary_crossentropy',metrics = ['accuracy'])
return model
model = get_compiled_model()
model.fit(train_dataset, epochs = 15)
|
d08e2039ef3144754f10735542e706c12d56da8c
|
[
"Python"
] | 1 |
Python
|
kshnam/Tensorflow_Tutorial
|
e4296af7f45a296097ace615ca138035a2202a79
|
c9bd4152492add7dd9a6c7aafa354444d3a76329
|
refs/heads/main
|
<repo_name>hrkanon/pin-matcher<file_sep>/mine.js
function generatePin(){
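// Keep generating until Math.round(Math.random() * 10000) happens to yield a 4-digit pin (1000-9999);
// anything shorter (or the edge case 10000) is rejected and we recurse.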
const pin = Math.round(Math.random() * 10000);
const pinText = pin + "";
if(pinText.length == 4){
document.getElementById("display-pin").value = pin;
return pin;
}
else{
generatePin();
}
}
document.getElementById("key-pad").addEventListener('click', function(event){
const number = event.target.innerText;
let typedNumber = document.getElementById("typed-numbers");
if(isNaN(number)){
if(number == "C"){
typedNumber.value = "";
}
}
else{
const previousNumber = typedNumber.value;
const newNumber = previousNumber + number;
typedNumber.value = newNumber;
}
})
function verifyPin(){
// const generatedPin = generatePin();
const displayPin = document.getElementById("display-pin").value
const typedNumber = document.getElementById("typed-numbers").value;
const notifyFail = document.getElementById("notify-fail");
const notifySuccess = document.getElementById("notify-success");
if(displayPin == typedNumber ){
notifySuccess.style.display = "block";
notifyFail.style.display = "none";
}
else{
notifySuccess.style.display = "none";
notifyFail.style.display = "block";
}
}
|
62d071ecdfde50dcaf33ab7b879cd885398787ba
|
[
"JavaScript"
] | 1 |
JavaScript
|
hrkanon/pin-matcher
|
51ded64da5d719cfa155782b6c4ac8ea001f2780
|
57c5cb0c4dc3b905c0e5f41ec95781021e8d49a9
|
refs/heads/master
|
<file_sep>require 'sequel'
require 'pact_broker/models/pacticipant'
module PactBroker
module Repositories
class PacticipantRepository
def find_by_name name
PactBroker::Models::Pacticipant.where(name: name).single_record
end
def find_all
PactBroker::Models::Pacticipant.order(:name).all
end
def find_by_name_or_create name
if pacticipant = find_by_name(name)
pacticipant
else
create name: name
end
end
def create args
PactBroker::Models::Pacticipant.new(name: args[:name], repository_url: args[:repository_url]).save(raise_on_save_failure: true)
end
def find_latest_version name
end
end
end
end
<file_sep>require 'cgi'
require 'pact_broker/api/resources/base_resource'
module PactBroker::Api
module Resources
class Pact < BaseResource
def content_types_provided
[["application/json", :to_json]]
end
def content_types_accepted
[["application/json", :from_json]]
end
def allowed_methods
["GET", "PUT"]
end
def resource_exists?
@pact = pact_service.find_pact(identifier_from_path)
@pact != nil
end
def from_json
@pact, created = pact_service.create_or_update_pact(identifier_from_path.merge(:json_content => pact_content))
response.headers["Location"] = pact_url(resource_url, @pact) if created
response.body = to_json
end
def to_json
@pact.json_content
end
def pact_content
request.body.to_s
end
end
end
end
<file_sep>require 'sequel'
require 'pact_broker/logging'
module PactBroker
module Repositories
class PactRepository
include PactBroker::Logging
def find_by_version_and_provider version_id, provider_id
PactBroker::Models::Pact.where(version_id: version_id, provider_id: provider_id).single_record
end
def find_latest_pacts
# TODO Work out a more elegant and efficient way of executing this query.
# Haven't found a max/group by combination that works with Sequel Models
latest_versions = PactBroker::Models::Version.new.db.fetch('select max(id) as id from versions where id in (select distinct(version_id) from pacts) group by pacticipant_id').collect{ | it | it[:id] }.flatten
PactBroker::Models::Pact.select(:pacts__id, :pacts__json_content, :pacts__version_id, :pacts__provider_id, :versions__number___consumer_version_number).
join(:versions, {:id => :version_id}, {implicit_qualifier: :pacts}).
join(:pacticipants, {:id => :pacticipant_id}, {:table_alias => :consumers, implicit_qualifier: :versions}).
join(:pacticipants, {:id => :provider_id}, {:table_alias => :providers, implicit_qualifier: :pacts}).
where('versions.id in ?', latest_versions).all
end
def find_latest_pact(consumer_name, provider_name)
pact_finder(consumer_name, provider_name).order(:order).last
end
def find_pact consumer_name, consumer_version, provider_name
pact_finder(consumer_name, provider_name).where('versions.number = ?', consumer_version).single_record
end
def create params
PactBroker::Models::Pact.new(version_id: params[:version_id], provider_id: params[:provider_id], json_content: params[:json_content]).save
end
def create_or_update params
if pact = find_by_version_and_provider(params[:version_id], params[:provider_id])
pact.update_fields(json_content: params[:json_content])
else
create params
end
end
private
def pact_finder consumer_name, provider_name
PactBroker::Models::Pact.select(:pacts__id, :pacts__json_content, :pacts__version_id, :pacts__provider_id, :versions__number___consumer_version_number).
join(:versions, {:id => :version_id}, {implicit_qualifier: :pacts}).
join(:pacticipants, {:id => :pacticipant_id}, {:table_alias => :consumers, implicit_qualifier: :versions}).
join(:pacticipants, {:id => :provider_id}, {:table_alias => :providers, implicit_qualifier: :pacts}).
where('providers.name = ?', provider_name).
where('consumers.name = ?', consumer_name)
end
end
end
end
<file_sep>require './spec/spec_helper'
require 'pact/provider/rspec'
require 'sequel'
require 'db'
require 'pact_broker/api'
require 'uri'
require_relative 'provider_states_for_pact_broker_client'
require 'pact_broker/api/resources/pact'
Pact.configure do | config |
config.logger.level = Logger::DEBUG
#config.diff_format = :plus_and_minus
end
Pact.service_provider "Pact Broker" do
# app { PactBroker::API }
honours_pact_with "Pact Broker Client" do
pact_uri "../pact_broker-client/spec/pacts/pact_broker_client-pact_broker.json"
end
# honours_pact_with "Pact Broker Client", :ref => :head do
# pact_uri URI.encode("http://rea-pact-broker.biq.vpc.realestate.com.au/pact/provider/Pact Broker/consumer/Pact Broker Client/latest")
# end
end
<file_sep># Pact Broker
The Pact Broker provides a repository for pacts created using the pact gem. It solves the problem of how to share pacts between consumer and provider projects.
When a consumer CI build is configured to publish its pacts to a Pact Broker, it allows the provider to always be verified against the latest version of the pact. Once the required development work has been done, it will also allow providers to be verified against the "production" version of a pact, giving confidence when deploying that a new version of a provider will work against the production version of a consumer.
It provides endpoints for the following:
* Publish a pact between a provider and a given version of a consumer.
* Retrieve the latest pact between a consumer and a provider.
* View a list of published pacts.
* View a list of "pacticipants" (consumers and providers).
See the [Pact Broker Client](https://github.com/bethesque/pact_broker-client) for documentation on how to publish a pact to the Pact Broker, and configure the URLs in the provider project.
### Upcoming development
* Create a UI to show a network diagram based on the published pacts.
* Display pacts in HTML format.
* Allow "tagging" of pacts so that the provider can be verified against the production version of a pact.
## Usage
* Copy the [example](/example) directory to your workstation.
* Modify the config.ru and Gemfile as desired (eg. choose database driver gem)
* Run `bundle`
* Run `bundle exec rackup`
* Open [http://localhost:9292](http://localhost:9292) and you should see the HAL browser.
For production usage, use a web application server like Phusion Passenger to serve the Pact Broker application.
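Once the app is running you can sanity-check it by fetching the latest pact for a consumer/provider pair. The sketch below is only illustrative: the `/pact/provider/.../consumer/.../latest` path is the one exercised by this repository's provider spec, while the host, port and pacticipant names are placeholders you will need to replace.

```ruby
require 'net/http'
require 'uri'

# Placeholder host/port and pacticipant names - substitute your own.
uri = URI('http://localhost:9292/pact/provider/Pricing%20Service/consumer/Condor/latest')
response = Net::HTTP.get_response(uri)

puts response['X-Pact-Consumer-Version'] # consumer version header set by the latest-pact resource
puts response.body                       # the pact JSON itself
```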
<file_sep>require 'pact_broker/api/resources/base_resource'
module Webmachine
class Request
def put?
method == "PUT" || method == "PATCH"
end
end
end
module PactBroker::Api
module Resources
class Pacticipant < BaseResource
def content_types_provided
[["application/hal+json", :to_json]]
end
def content_types_accepted
[["application/json", :from_json]]
end
def allowed_methods
["GET", "PATCH"]
end
def known_methods
super + ['PATCH']
end
def from_json
if @pacticipant
@pacticipant = pacticipant_service.update params.merge(name: identifier_from_path[:name])
else
@pacticipant = pacticipant_service.create params.merge(name: identifier_from_path[:name])
response.headers["Location"] = pacticipant_url(resource_url, @pacticipant)
end
response.body = to_json
end
def resource_exists?
@pacticipant = pacticipant_service.find_pacticipant_by_name(identifier_from_path[:name])
@pacticipant != nil
end
def to_json
PactBroker::Api::Decorators::PacticipantRepresenter.new(@pacticipant).to_json(base_url: resource_url)
end
end
end
end
<file_sep>require 'sequel'
module PactBroker
module Models
class Pact < Sequel::Model
set_primary_key :id
associate(:many_to_one, :provider, :class => "PactBroker::Models::Pacticipant", :key => :provider_id, :primary_key => :id)
associate(:many_to_one, :consumer_version, :class => "PactBroker::Models::Version", :key => :version_id, :primary_key => :id)
#Need to work out how to do this properly!
def consumer_version_number
values[:consumer_version_number]
end
def consumer
consumer_version.pacticipant
end
def to_s
"Pact: provider_id=#{provider_id}"
end
def to_json options = {}
json_content
end
end
end
end
<file_sep>require 'pact_broker/services/pact_service'
require 'pact_broker/services/pacticipant_service'
require 'pact_broker/services/tag_service'
module PactBroker
module Services
def pact_service
PactService
end
def pacticipant_service
PacticipantService
end
def tag_service
TagService
end
end
end<file_sep>require 'sequel'
module PactBroker
module Models
class Pacticipant < Sequel::Model
set_primary_key :id
one_to_many :versions, :order => :order, :reciprocal => :pacticipant
one_to_many :pacts
def latest_version
versions.last
end
def to_s
"Pacticipant: id=#{id}, name=#{name}"
end
end
end
end<file_sep>require 'pact_broker/api/resources/base_resource'
module PactBroker::Api
module Resources
class LatestPact < BaseResource
def content_types_provided
[["application/json", :to_json]]
end
def allowed_methods
["GET"]
end
def resource_exists?
@pact = pact_service.find_pact(identifier_from_path.merge(:consumer_version_number => 'latest'))
@pact != nil
end
def to_json
response.headers['X-Pact-Consumer-Version'] = @pact.consumer_version_number
@pact.json_content
end
end
end
end
<file_sep>require 'pact_broker/tasks'
PactBroker::DB::MigrationTask.new do | task |
require 'db'
task.database_connection = DB::PACT_BROKER_DB
end
namespace :db do
desc 'drop and recreate DB'
task :recreate => [:drop, 'pact_broker:db:migrate']
desc 'drop DB'
task :drop do
require 'yaml'
puts "Removing database #{db_file}"
FileUtils.rm_f db_file
FileUtils.mkdir_p File.dirname(db_file)
end
def db_file
@@db_file ||= YAML.load(ERB.new(File.read(File.join('./config', 'database.yml'))).result)[RACK_ENV]["database"]
end
end<file_sep>ENV['RACK_ENV'] = 'test'
RACK_ENV = 'test'
$: << File.expand_path("../../", __FILE__)
require 'rack/test'
require 'db'
require './spec/support/provider_state_builder'
require 'pact_broker/api'
require 'rspec/fire'
RSpec.configure do | config |
config.before :suite do
raise "Wrong environment!!! Don't run this script!! ENV['RACK_ENV'] is #{ENV['RACK_ENV']} and RACK_ENV is #{RACK_ENV}" if ENV['RACK_ENV'] != 'test' || RACK_ENV != 'test'
end
config.before :each do
DB::PACT_BROKER_DB[:pacts].truncate
DB::PACT_BROKER_DB[:tags].truncate
DB::PACT_BROKER_DB[:versions].truncate
DB::PACT_BROKER_DB[:pacticipants].truncate
end
config.include Rack::Test::Methods
config.include RSpec::Fire
def app
PactBroker::API
end
end
<file_sep>require 'sequel'
module PactBroker
module DB
MIGRATIONS_DIR = File.expand_path("../../../db/migrations", __FILE__)
def self.run_migrations database_connection
Sequel.extension :migration
Sequel::Migrator.run(database_connection, PactBroker::DB::MIGRATIONS_DIR)
end
end
end<file_sep>Do this to generate your change history
$ git log --date=relative --pretty=format:' * %h - %s (%an, %ad)' 'package/pact-broker-0.0.PRODVERSION'..'package/pact-broker-0.0.NEWVERSION'
#### 0.0.10 (2014-03-24)
* 7aee2ae - Implemented version tagging (Beth, 2 days ago)
* cc78f92 - Added 'latest' pact url to pact representation in the 'latest pacts' response (Beth, 2 days ago)
#### 0.0.9 (2014-02-27)
* d07f4b7 - Using default gem publish tasks (Beth, 4 weeks ago)
* d60b7ee - Comment (Beth, 7 weeks ago)
* 836347c - Using local pacts (Beth, 7 weeks ago)
* a2cb2bb - Fixed bug querying mysql DB, rather than sqlite (Beth, 7 weeks ago)
* 9d5f83b - Using the to_json options to pass in the base_url instead of the nasty hack. (Beth, 4 months ago)
* adb6148 - Changed 'last' to 'latest' (Beth, 4 months ago)
#### 0.0.8 (2013-11-18)
* 6022baa - Changed name to title in list pacticipants response (Beth, 7 hours ago)
* 13fde52 - Moving resources module under the Api module. (Beth, 8 hours ago)
* f52c572 - Added HAL index (Beth, 8 hours ago)
#### 0.0.7 (2013-11-15)
* 7984d86 - Added title to each item in the pacts/latest links array (Beth, 83 seconds ago)
#### 0.0.6 (2013-11-15)
* 021faae - Refactoring resources to DRY out code (Beth, 18 hours ago)
* bab0367 - Cleaning up the base_url setting hack. (Beth, 19 hours ago)
* f6df613 - Renamed representors to decorators (Beth, 19 hours ago)
* 3e89c20 - Created BaseDecorator (Beth, 19 hours ago)
* e5c3f88 - Changing from representors to decorators (Beth, 19 hours ago)
* b2eeb6f - Added back resource_exists? implementation in pacticipant resource - accidental deletion. (Beth, 19 hours ago)
* 1962a05 - Ever so slightly less hacky way of handling PATCH (Beth, 21 hours ago)
* 587e9c1 - First go at trying to use a dynamic base URL - to be continued (Beth, 2 days ago)
* ab9c185 - Including URLs for the dynamically calculated latest pact, not the hard link to the latest pact. (Beth, 2 days ago)
* 5621e41 - Beginning change from Roar Representor to Decoractor. Updating to new 'latest pact' URL (Beth, 2 days ago)
* d1bd995 - Adding missing PactBroker::Logging require (Beth, 2 days ago)
#### 0.0.5 (2013-11-13)
* 2cf987c - Added data migration to script which adds order column (Beth, 56 minutes ago)
* 9c709a9 - Changing queries to use new order column. (Beth, 61 minutes ago)
* 173f231 - Renaming var (Beth, 65 minutes ago)
* f9be93d - Renamed SortVersions to OrderVersions (Beth, 66 minutes ago)
* ca6e479 - Added SortVersions as an after version save hook (Beth, 69 minutes ago)
* 23cd1a3 - Adding order column to version table (Beth, 11 hours ago)
* c504e5f - Fixing application/json+hal to application/hal+json (Beth, 2 days ago)
* 1d24b9b - Removing old sinatra API (Beth, 2 days ago)
* fd1832c - WIP. Converting to use Webmachine (Beth, 2 days ago)
* 0b096a4 - Redoing the the URLs yet again (Beth, 3 days ago)
* 0934d89 - Implementing list latest pacts (Beth, 3 days ago)
* ed2d354 - Changed one_to_one associations to many_to_one (Beth, 4 days ago)
* 28de0ea - WIP implementing pacts/latest. (Beth, 6 days ago)
* 1cd36e6 - Changing to new /pacts/latest URL format (Beth, 6 days ago)
* 54f8fc3 - Writing underlying code to find the latest pact for each consumer/provider pair. (Beth, 6 days ago)
<file_sep>require 'spec_helper'
require 'pact_broker/repositories/version_repository'
module PactBroker
module Repositories
describe VersionRepository do
describe "#find_by_pacticipant_name_and_number" do
let(:pacticipant_name) { "test_pacticipant" }
let(:version_number) { "1.2.3" }
subject { described_class.new.find_by_pacticipant_name_and_number pacticipant_name, version_number }
context "when the version exists" do
before do
ProviderStateBuilder.new.create_version_with_hierarchy "other_pacticipant", version_number
ProviderStateBuilder.new.create_version_with_hierarchy pacticipant_name, version_number
end
it "returns the version" do
expect(subject.number).to eq version_number
expect(subject.pacticipant.name).to eq pacticipant_name
end
end
context "when the version doesn't exist" do
it "returns nil" do
expect(subject).to be_nil
end
end
end
end
end
end<file_sep>require 'pact_broker/api/resources/base_resource'
module PactBroker::Api
module Resources
class Pacticipants < BaseResource
def content_types_provided
[["application/hal+json", :to_json]]
end
def allowed_methods
["GET"]
end
def to_json
generate_json(pacticipant_service.find_all_pacticipants)
end
def generate_json pacticipants
PactBroker::Api::Decorators::PacticipantCollectionRepresenter.new(pacticipants).to_json(base_url: resource_url)
end
end
end
end
<file_sep>require 'pact_broker/repositories'
require 'json'
class ProviderStateBuilder
include PactBroker::Repositories
def create_pricing_service
@pricing_service_id = pacticipant_repository.create(:name => 'Pricing Service', :repository_url => '<EMAIL>:business-systems/pricing-service').save(raise_on_save_failure: true).id
self
end
def create_contract_proposal_service
@contract_proposal_service_id = pacticipant_repository.create(:name => 'Contract Proposal Service', :repository_url => '<EMAIL>:business-systems/contract-proposal-service').save(raise_on_save_failure: true).id
self
end
def create_contract_proposal_service_version number
@contract_proposal_service_version_id = version_repository.create(number: number, pacticipant_id: @contract_proposal_service_id).id
self
end
def create_contract_email_service
@contract_email_service_id = pacticipant_repository.create(:name => 'Contract Email Service', :repository_url => '<EMAIL>:business-systems/contract-email-service').save(raise_on_save_failure: true).id
self
end
def create_contract_email_service_version number
@contract_email_service_version_id = version_repository.create(number: number, pacticipant_id: @contract_email_service_id).id
self
end
def create_ces_cps_pact
@pact_id = pact_repository.create(version_id: @contract_email_service_version_id, provider_id: @contract_proposal_service_id, json_content: json_content).id
self
end
def create_condor
@condor_id = pacticipant_repository.create(:name => 'Condor').save(raise_on_save_failure: true).id
self
end
def create_condor_version number
@condor_version_id = version_repository.create(number: number, pacticipant_id: @condor_id).id
self
end
def create_pricing_service_version number
@pricing_service_version_id = version_repository.create(number: number, pacticipant_id: @pricing_service_id).id
self
end
def create_pact
@pact_id = pact_repository.create(version_id: @condor_version_id, provider_id: @pricing_service_id, json_content: json_content).id
self
end
def create_pact_with_hierarchy consumer_name, consumer_version, provider_name
provider = PactBroker::Models::Pacticipant.create(:name => provider_name)
consumer = PactBroker::Models::Pacticipant.create(:name => consumer_name)
version = PactBroker::Models::Version.create(:number => consumer_version, :pacticipant => consumer)
PactBroker::Models::Pact.create(:consumer_version => version, :provider => provider)
end
def create_version_with_hierarchy pacticipant_name, pacticipant_version
pacticipant = PactBroker::Models::Pacticipant.create(:name => pacticipant_name)
PactBroker::Models::Version.create(:number => pacticipant_version, :pacticipant => pacticipant)
end
def create_tag_with_hierarchy pacticipant_name, pacticipant_version, tag_name
version = create_version_with_hierarchy pacticipant_name, pacticipant_version
PactBroker::Models::Tag.create(name: tag_name, version: version)
end
def create_pacticipant pacticipant_name
@pacticipant = PactBroker::Models::Pacticipant.create(:name => pacticipant_name)
self
end
def create_version version_number
@version = PactBroker::Models::Version.create(:number => version_number, :pacticipant => @pacticipant)
self
end
def create_tag tag_name
@tag = PactBroker::Models::Tag.create(name: tag_name, version: @version)
self
end
private
# def create_pacticipant name
# pacticipant_repository.create(:name => name)
# end
# def create_version number, pacticipant
# version_repository.create(number: number, pacticipant: pacticipant)
# end
# def create_pact version, provider
# pact_repository.create(consumer_version: version, provider: provider, json_content: json_content)
# end
def json_content
json_content = {
"consumer" => {
"name" => "Condor"
},
"provider" => {
"name" => "Pricing Service"
},
"interactions" => []
}.to_json
end
end<file_sep>require 'pact_broker/configuration'
require 'pact_broker/db'
module PactBroker
class App
attr_accessor :configuration
def initialize &block
@configuration = Configuration.default_configuration
yield configuration
post_configure
build_app
end
def call env
@app.call env
end
private
def logger
PactBroker.logger
end
def post_configure
PactBroker.logger = configuration.logger
if configuration.auto_migrate_db
logger.info "Migrating database"
PactBroker::DB.run_migrations configuration.database_connection
else
logger.info "Skipping database migrations"
end
end
def build_app
@app = Rack::Builder.new
if configuration.use_hal_browser
logger.info "Mounting HAL browser"
@app.use Rack::HalBrowser::Redirect, :exclude => ['/trace']
else
logger.info "Not mounting HAL browser"
end
logger.info "Mounting PactBroker::API"
require 'pact_broker/api'
@app.map "/" do
run PactBroker::API
end
end
end
end<file_sep>require 'pact_broker/api/resources/base_resource'
module PactBroker::Api
module Resources
class LatestPacts < BaseResource
def content_types_provided
[["application/hal+json", :to_json]]
end
def allowed_methods
["GET"]
end
def to_json
generate_json(pact_service.find_latest_pacts)
end
def generate_json pacts
PactBroker::Api::Decorators::PactCollectionDecorator.new(pacts).to_json(base_url: resource_url)
end
end
end
end
<file_sep>require File.dirname(__FILE__) + '/config/boot'
require 'db'
require 'pact_broker/api'
require 'rack/hal_browser'
use Rack::HalBrowser::Redirect, :exclude => ['/diagnostic', '/trace']
run Rack::URLMap.new(
'/' => PactBroker::API
)
<file_sep>require 'pact_broker/api/resources/base_resource'
require 'json'
module PactBroker::Api
module Resources
class Index < BaseResource
def content_types_provided
[["application/hal+json", :to_json]]
end
def allowed_methods
["GET"]
end
# TODO change to use request.base_url to avoid params getting included!!!
def to_json
{
_links: {
'index' => [
{
href: request.uri.to_s,
title: 'The index page',
templated: false
}
],
'latest-pacts' => [
{
href: request.uri.to_s + 'pacts/latest',
title: 'Retrieve latest pacts',
templated: false
}
],
'pacticpants' => [
{
href: request.uri.to_s + 'pacticipants',
title: 'Retrieve pacticipants',
templated: false
}
]
}
}.to_json
end
end
end
end
<file_sep>source 'https://rubygems.org'
#source 'http://rea-rubygems/'
gemspec
#gem 'pact', :path => '../pact'<file_sep>require 'pact_broker/repositories'
module PactBroker
module Services
class PacticipantService
extend PactBroker::Repositories
def self.find_all_pacticipants
pacticipant_repository.find_all
end
def self.find_pacticipant_by_name name
pacticipant_repository.find_by_name(name)
end
def self.find_pacticipant_repository_url_by_pacticipant_name name
pacticipant = pacticipant_repository.find_by_name(name)
if pacticipant && pacticipant.repository_url
pacticipant.repository_url
else
nil
end
end
def self.update params
pacticipant = pacticipant_repository.find_by_name(params.fetch(:name))
pacticipant.update(params)
pacticipant_repository.find_by_name(params.fetch(:name))
end
def self.create params
pacticipant_repository.create(params)
end
end
end
end<file_sep>require 'rspec/core'
require 'rspec/core/rake_task'
RSpec::Core::RakeTask.new(:spec) do |task|
task.pattern = FileList['spec/**/*_spec.rb']
end
task :overwrite_rack_env do
ENV['RACK_ENV'] = 'test'
RACK_ENV = 'test'
end
task :spec => [:overwrite_rack_env, 'db:recreate']
<file_sep>require 'webmachine'
require 'pact_broker/services'
require 'pact_broker/api/decorators'
require 'pact_broker/logging'
require 'pact_broker/api/pact_broker_urls'
module PactBroker::Api
module Resources
class ErrorHandler
include PactBroker::Logging
def self.handle_exception e, response
logger.error e
logger.error e.backtrace
response.body = {:message => e.message, :backtrace => e.backtrace }.to_json
response.code = 500
end
end
class BaseResource < Webmachine::Resource
include PactBroker::Services
include PactBroker::Api::PactBrokerUrls
def identifier_from_path
request.path_info.each_with_object({}) do | pair, hash|
hash[pair.first] = CGI::unescape(pair.last)
end
end
def resource_url
request.uri.to_s.gsub(/#{request.uri.path}$/,'')
end
def handle_exception e
PactBroker::Api::Resources::ErrorHandler.handle_exception(e, response)
end
def params
JSON.parse(request.body.to_s, symbolize_names: true)
end
end
end
end<file_sep>require 'pact_broker/repositories'
module PactBroker
module Services
module PactService
extend self
extend PactBroker::Repositories
def find_pact params
if params[:consumer_version_number] == 'latest'
pact_repository.find_latest_pact(params[:consumer_name], params[:provider_name])
else
pact_repository.find_pact(params[:consumer_name], params[:consumer_version_number], params[:provider_name])
end
end
def find_latest_pacts
pact_repository.find_latest_pacts
end
def create_or_update_pact params
provider = pacticipant_repository.find_by_name_or_create params[:provider_name]
consumer = pacticipant_repository.find_by_name_or_create params[:consumer_name]
version = version_repository.find_by_pacticipant_id_and_number_or_create consumer.id, params[:consumer_version_number]
if pact = pact_repository.find_by_version_and_provider(version.id, provider.id)
pact.update(json_content: params[:json_content])
return pact, false
else
pact = pact_repository.create json_content: params[:json_content], version_id: version.id, provider_id: provider.id
return pact, true
end
end
end
end
end<file_sep>require 'pact/tasks'
Pact::VerificationTask.new(:dev) do | pact |
pact.uri "../pact_broker-client/spec/pacts/pact_broker_client-pact_broker.json"
end
namespace :pact do
task :prepare => [:overwrite_rack_env, 'db:recreate']
task :verify => :prepare
task 'verify:at' => :prepare
task 'verify:dev' => :prepare
end
<file_sep>require 'spec_helper'
require 'spec/support/provider_state_builder'
module PactBroker
module Repositories
describe PactRepository do
describe "find_latest_pacts" do
before do
ProviderStateBuilder.new
.create_condor
.create_condor_version('1.3.0')
.create_pricing_service
.create_pact
.create_condor_version('1.4.0')
.create_pact
.create_contract_email_service
.create_contract_email_service_version('2.6.0')
.create_contract_proposal_service
.create_ces_cps_pact
.create_contract_email_service_version('2.7.0')
.create_ces_cps_pact
.create_contract_email_service_version('2.8.0') # Create a version without a pact, it shouldn't be used
end
it "finds the latest pact for each consumer/provider pair" do
pacts = PactRepository.new.find_latest_pacts
expect(pacts[0].consumer_version.pacticipant.name).to eq("Condor")
expect(pacts[0].provider.name).to eq("Pricing Service")
expect(pacts[0].consumer_version.number).to eq("1.4.0")
expect(pacts[1].consumer_version.pacticipant.name).to eq("Contract Email Service")
expect(pacts[1].provider.name).to eq("Contract Proposal Service")
expect(pacts[1].consumer_version.number).to eq("2.7.0")
end
end
end
end
end
<file_sep>require 'pact_broker/tasks/migration_task'
|
8ca0b0759a2aade6ad1754917829992b3945b9c4
|
[
"Markdown",
"Ruby"
] | 29 |
Ruby
|
Maxim-Filimonov/pact_broker
|
cd8baf2798b9882e1a4799ea0334db1a14da03df
|
3db376ea76483ca53510a6fb9ecc427e921bafc0
|
refs/heads/master
|
<repo_name>jaebyunglee/Algorithm<file_sep>/SCAD완성_(2019.05.16).R
rm(list=ls())
set.seed(1)
n = 1000; p = 30
x.mat = matrix(rnorm(n*p),ncol=p); b.vec = 1/(1:p)
exb.vec = exp(drop(x.mat%*%b.vec)); p.vec = exb.vec/(1+exb.vec)
y.vec = rbinom(n,1,prob=p.vec)
iter.max = 1e+3;eps = 1e-05
###concave function
# lam = 1; b = -8; a = 4
# s.fun = function(lam,a,b){
# fa = lam*abs(b)
# fb = (a*lam*(abs(b)-lam)-(abs(b)^2-lam^2)/2)/(a-1)+lam^2
# fc = (a-1)*lam^2/2+lam^2
# ret = fa*I(0<=abs(b)&abs(b)<lam)+fb*I(lam<=abs(b)&abs(b)<=a*lam)+fc*I(abs(b)>a*lam)-lam*abs(b)
# return(ret)
# }
#
#
# bb.vec = NULL
# l.vec = seq(-5,5,length.out = 1000)
# for(b in l.vec){
# bb.vec = c(bb.vec,s.fun(lam,a,b))
# }
# plot(bb.vec)
###concave grad function
ccav.fun = function(b.vec,lam,a){
ret = rep(0,length(b.vec))
ret = sign(b.vec)*ifelse(a*lam-abs(b.vec)>0,a*lam-abs(b.vec),0)/(a-1)-sign(b.vec)*lam
if(sum(abs(b.vec)<=lam)!=0){ret[abs(b.vec)<=lam] = 0}
return(ret)
}
###loss
loss.fun = function(y.vec,x.mat,b.vec){
xb.vec = drop(x.mat%*%b.vec); ret = -sum(y.vec*xb.vec)+sum(log(1+exp(xb.vec)))
return(ret)
}
###scad.loss
scad.loss = function(y.vec,x.mat,b.vec,lam,a){
fa = lam*abs(b.vec)
fb = (a*lam*(abs(b.vec)-lam)-(abs(b.vec)^2-lam^2)/2)/(a-1)+lam^2
fc = (a-1)*lam^2/2+lam^2
ret = loss.fun(y.vec,x.mat,b.vec) + sum(fa*I(0<=abs(b.vec)&abs(b.vec)<lam)+fb*I(lam<=abs(b.vec)&abs(b.vec)<=a*lam)+fc*I(abs(b.vec)>a*lam)-lam*abs(b.vec))
return(ret)
}
###grad
grad.fun = function(y.vec,x.mat,b.vec){
xb.vec=drop(x.mat%*%b.vec);xb.vec = pmin(xb.vec,100)
exb.vec = exp(xb.vec); p.vec = exb.vec/(1+exb.vec)
ret = -drop(t(x.mat)%*%(y.vec-p.vec))/length(y.vec)
return(ret)
}
###hess
hess.fun = function(y.vec,x.mat,b.vec){
xb.vec = drop(x.mat%*%b.vec);xb.vec = pmin(xb.vec,100)
exb.vec = exp(xb.vec); p.vec = exb.vec/(1+exb.vec)
ret = t(x.mat)%*%diag(p.vec*(1-p.vec))%*%x.mat/length(y.vec)
return(ret)
}
###kkt condition
kkt.fun = function(g.vec,b.vec,lam,eps){
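# Lasso KKT check: every active coefficient must satisfy |grad_j + lam*sign(b_j)| < eps,
# and every zero coefficient must satisfy |grad_j| <= lam (up to eps).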
kkt1 = prod(abs(g.vec[b.vec!=0]+lam*sign(b.vec[b.vec!=0]))<eps)
kkt2 = prod(abs(g.vec[b.vec==0])-lam<eps)
return(kkt1&kkt2)
}
###quadratic approximation
quad.lasso.fun = function(q.mat,l.vec,b.vec,lam,iter.max=1e+02,eps=1e-07){
f.vec = rep(NA,iter.max)
for(iter in 1:iter.max){
for(j in 1:length(b.vec)){
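# Coordinate-wise subproblem in b_j: minimize (a/2)*b_j^2 - b*b_j + lam*|b_j|,
# whose closed-form minimizer is the soft-thresholding update below.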
a = 2*q.mat[j,j]; b = -2*sum(q.mat[j,-j]*b.vec[-j])-l.vec[j]
#print(cbind(lam,b))
if(abs(b)<lam){ b.vec[j] = 0
} else { b.vec[j] = sign(b)*(abs(b)-lam)/a }
}
f.vec[iter] = t(b.vec)%*%q.mat%*%b.vec+sum(l.vec*b.vec)+lam*sum(abs(b.vec))
g.vec = 2*q.mat%*%b.vec+l.vec
conv = kkt.fun(g.vec,b.vec,lam,eps)
if(conv) break
}
return(list(b.vec=b.vec,g.vec=g.vec,conv=conv,lam=lam))
}
###lasso
lasso.fun = function(y.vec,x.mat,b.vec,lam,iter.max=1e+3,eps=1e-5){
for(iter in 1:iter.max){#iter
h.mat = hess.fun(y.vec,x.mat,b.vec)
g.vec = grad.fun(y.vec,x.mat,b.vec)
q.mat = h.mat/2
l.vec = g.vec-drop(h.mat%*%b.vec)
b.vec = quad.lasso.fun(q.mat,l.vec,b.vec,lam,iter.max,eps)$b.vec
g.vec = grad.fun(y.vec,x.mat,b.vec)
conv = kkt.fun(g.vec,b.vec,lam,eps)
if(conv) break
}#iter
#print(conv)
#print(g.vec)
return(list(b.vec=b.vec,g.vec=g.vec,lam=lam))
}
### active set control - lasso
act.lasso.fun = function(y.vec,x.mat,lam.vec,iter.max=1e+3,eps=1e-5){
b.mat = NULL ; b.vec = rep(0,ncol(x.mat))
g.vec = grad.fun(y.vec,x.mat,b.vec)
for(lam in lam.vec){#lam
for(iter in 1:iter.max){#iter
set1 = b.vec!=0
set2 = abs(g.vec)>=(lam-eps)
set = set1|set2
if(sum(set)==0){set = 1:length(b.vec)}
ax.mat = x.mat[,set,drop=F]
ab.vec = b.vec[set]
fit = lasso.fun(y.vec,ax.mat,ab.vec,lam)
b.vec[set] = fit$b.vec
g.vec = grad.fun(y.vec,x.mat,b.vec)
conv = kkt.fun(g.vec,b.vec,lam,eps)
if(conv) break
}#iter
b.mat = cbind(b.mat,b.vec)
}#lam
return(list(b.mat=b.mat,lam.vec=lam.vec))
}
### SCAD
scad.fun = function(y.vec,x.mat,b.vec,lam,a,iter.max=1e+3,eps=1e-5){
for(c.id in 1:iter.max){#cccp
#print(c.id)
fb.val = scad.loss(y.vec,x.mat,b.vec,lam,a)
#print(scad.loss(y.vec,x.mat,b.vec,lam,a))
g.vec = grad.fun(y.vec,x.mat,b.vec)
c.vec = ccav.fun(b.vec,lam,a)
conv = kkt.fun(g.vec+c.vec,b.vec,lam,eps)
if(conv) break
u.vec = b.vec*0
for(q.id in 1:iter.max){#quad
#print(q.id)
h.mat = hess.fun(y.vec,x.mat,u.vec)
g.vec = grad.fun(y.vec,x.mat,u.vec)
q.mat = h.mat/2
l.vec = g.vec-drop(h.mat%*%u.vec)+c.vec
u.vec = quad.lasso.fun(q.mat,l.vec,u.vec,lam,iter.max,eps)$b.vec
g.vec = grad.fun(y.vec,x.mat,u.vec)
conv = kkt.fun(g.vec+c.vec,u.vec,lam,eps)
if(conv) break
}#quad
fu.val = scad.loss(y.vec,x.mat,u.vec,lam,a)
if(fu.val>fb.val){
for(h in seq(0,0.01,length.out = 100)){
b.vec = h*u.vec+(1-h)*b.vec
# print(scad.loss(y.vec,x.mat,u.vec,lam,a))
# print(b.vec[b.vec!=0])
}
} else { b.vec = u.vec }
}#cccp
return(list(b.vec=b.vec,"g.vec+c.vec"=g.vec+c.vec,lam=lam,a=a,conv=conv))
}
### active set control - SCAD
act.scad.fun = function(y.vec,x.mat,lam.vec,a,iter.max=1e+3,eps=1e-5){
b.mat = NULL ; b.vec = rep(0,ncol(x.mat))
g.vec = grad.fun(y.vec,x.mat,b.vec)
for(lam in lam.vec){#lam
#print(lam)
c.vec = ccav.fun(b.vec,lam,a)
for(iter in 1:iter.max){#iter
set1 = b.vec!=0
set2 = abs(g.vec+c.vec)>=(lam-eps)
set = set1|set2
if(sum(set)==0){set = 1:length(b.vec)}
ax.mat = x.mat[,set,drop=F]
ab.vec = b.vec[set]
fit = scad.fun(y.vec,ax.mat,ab.vec,lam,a)
b.vec[set] = fit$b.vec
g.vec = grad.fun(y.vec,x.mat,b.vec)
c.vec = ccav.fun(b.vec,lam,a)
conv = kkt.fun(g.vec+c.vec,b.vec,lam,eps)
if(conv) break
}#iter
b.mat = cbind(b.mat,b.vec)
}#lam
return(list(b.mat=b.mat,lam.vec=lam.vec))
}
lam.vec = 10:1
act.lasso.fun(y.vec,x.mat,lam.vec=lam.vec,iter.max,eps)
### check
lam.vec = seq(0.1,0.01,length.out=11)
# scad.fun(y.vec,x.mat,rep(0,p),lam=lam.vec[11],a=3,iter.max=1e+3,eps=1e-5)$b.vec
# act.scad.fun(y.vec,x.mat,lam.vec,a=3,iter.max=1e+3,eps=1e-5)$b.mat[,11]
# coef(glm(y.vec~.-1,data=data.frame(cbind(y.vec,x.mat)),family="binomial"))
scad1.fun = function(y.vec,x.mat,b.vec,lam,a,iter.max=1e+3,eps=1e-5){
g.vec = grad.fun(y.vec,x.mat,b.vec)
c.vec = ccav.fun(b.vec,lam,a)
for(iter in 1:iter.max){#iter
#print(iter)
#print(scad.loss(y.vec,x.mat,b.vec,lam,a))
h.mat = hess.fun(y.vec,x.mat,b.vec)
q.mat = h.mat/2
l.vec = g.vec-drop(h.mat%*%b.vec) + c.vec
b.vec = quad.lasso.fun(q.mat,l.vec,b.vec,lam,iter.max,eps)$b.vec
g.vec = grad.fun(y.vec,x.mat,b.vec)
c.vec = ccav.fun(b.vec,lam,a)
conv = kkt.fun(g.vec+c.vec,b.vec,lam,eps)
if(conv) break
}#iter
return(list(b.vec=b.vec,g.vec=g.vec+c.vec,lam=lam))
}
scad.fun(y.vec,x.mat,rep(0,p),lam=0.01,a=3,iter.max=1e+3,eps=1e-5)
scad1.fun(y.vec,x.mat,rep(0,p),lam=0.01,a=3,iter.max=1e+3,eps=1e-5)
act.scad.fun(y.vec,x.mat,lam.vec,3,iter.max=1e+3,eps=1e-5)
coef(glm(y.vec~.-1,data = data.frame(cbind(y.vec,x.mat)),family="binomial"))
<file_sep>/quad.lasso.fun & kkt condition_(2019.05.02).R
rm(list=ls())
set.seed(1)
n = 100; p = 10
x.mat = matrix(rnorm(n*p),ncol=p); b.vec = 1/(1:p)
exb.vec = exp(drop(x.mat%*%b.vec)); p.vec = exb.vec/(1+exb.vec)
y.vec = rbinom(n,1,prob=p.vec)
###loss
loss.fun = function(y.vec,x.mat,b.vec){
xb.vec = drop(x.mat%*%b.vec); ret = -sum(y.vec*xb.vec)+sum(log(1+exp(xb.vec)))
return(ret)
}
###gradient
grad.fun = function(y.vec,x.mat,b.vec){
exb.vec = exp(drop(x.mat%*%b.vec)); p.vec = exb.vec/(1+exb.vec)
ret = -drop(t(x.mat)%*%(y.vec-p.vec))
return(ret)
}
###hessian
hess.fun = function(y.vec,x.mat,b.vec){
exb.vec = exp(drop(x.mat%*%b.vec)); p.vec = exb.vec/(1+exb.vec)
ret = t(x.mat)%*%diag(p.vec*(1-p.vec))%*%x.mat
return(ret)
}
eps = 1e-10; iter.max = 100
b.vec = rep(0,p); lam = 1
q.mat = hess.fun(y.vec,x.mat,b.vec)
l.vec = grad.fun(y.vec,x.mat,b.vec)
###quad.lasso.fun & kkt condition
quad.lasso.fun = function(q.mat,l.vec,lam,b.vec,iter.max,eps){
f.vec = rep(NA,iter.max)
for(iter in 1:iter.max){
for(j in 1:length(b.vec)){
a = 2*q.mat[j,j]; b = -2*sum(q.mat[j,-j]*b.vec[-j])-l.vec[j]
if(abs(b)<lam){ b.vec[j] = 0
} else { b.vec[j] = sign(b)*(abs(b)-lam)/a }
}
f.vec[iter] = t(b.vec)%*%q.mat%*%b.vec+sum(l.vec*b.vec)+lam*sum(abs(b.vec))
grad = 2*q.mat%*%b.vec+l.vec
    kkt1 = prod(abs(grad[b.vec!=0] + lam*sign(b.vec[b.vec!=0]))<eps)
kkt2 = prod(abs(grad[b.vec==0])-lam<eps)
print(c(kkt1,kkt2))
if(kkt1&kkt2) break
}
return(list(b.vec=b.vec,f.vec=f.vec[1:iter]))
}
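### (editor's sketch, not in the original) the coordinate update above is the
### soft-thresholding operator S(b,lam) = sign(b)*max(|b|-lam,0) divided by
### a = 2*q.mat[j,j]; quick standalone check of the operator:
soft.thresh.fun = function(b,lam) sign(b)*pmax(abs(b)-lam,0)
### e.g. soft.thresh.fun(c(-3,-0.5,0.2,2),lam=1) gives -2 0 0 1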
quad.lasso.fun(q.mat,l.vec,lam,b.vec,iter.max,eps)
###home work
grad = 2*q.mat%*%b.vec+l.vec
qq.mat = matrix(q.mat[1,1],1,1)
ll.vec = l.vec[1]
fit = quad.lasso.fun(qq.mat,ll.vec,lam,b.vec[1],iter.max,eps)
b.vec[1] = fit$b.vec
grad = 2*q.mat%*%b.vec+l.vec
qq.mat = matrix(q.mat[1:2,1:2],2,2)
ll.vec = l.vec[1:2]
fit = quad.lasso.fun(qq.mat,ll.vec,lam,b.vec[1:2],iter.max,eps)
b.vec[1:2] = fit$b.vec
grad = 2*q.mat%*%b.vec+l.vec
<file_sep>/em-algorithm.R
rm(list=ls())
library(MASS)
library(mvtnorm)
set.seed(1)
n = 5000
x1.mat = mvrnorm(n, mu = c(1,2.25),Sigma = matrix(c(1.75,1,1,1.75),2,2) )
x2.mat = mvrnorm(n, mu = c(12.9,10.3),Sigma = matrix(c(3.2,1.25,1.25,3.2),2,2))
I=rbinom(n,size=1,prob = 0.3)
x.mat = I*x1.mat + (1-I)*x2.mat
plot(x.mat)
t = 0.4
u1 = c(0,0)
u2 = c(1,1)
Sigma1 = matrix(c(1,0.8,0.8,1),2,2)
Sigma2 = matrix(c(1.5,0.3,0.3,1.5),2,2)
#############################################
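### (editor's note) two-component Gaussian-mixture EM: the E-step computes the
### responsibilities T1, T2 of each observation under the current parameters;
### the M-step re-estimates the mixing weight t, the means u1, u2 and the
### covariances Sigma1, Sigma2 as responsibility-weighted averages.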
for(i in 1:100){
d1=dmvnorm(x.mat,mean = u1,sigma = Sigma1)
d2=dmvnorm(x.mat,mean = u2,sigma = Sigma2)
#E-step
T1=(t*d1)/(t*d1+(1-t)*d2)
T2=(1-t)*d2/(t*d1+(1-t)*d2)
#M-step
nu1 = colSums(T1*x.mat)/sum(T1)
nu2 = colSums(T2*x.mat)/sum(T2)
s1 = scale(x.mat,center = nu1,scale = FALSE)
s2 = scale(x.mat,center = nu2,scale = FALSE)
nSigma1=(t(s1)%*%diag(T1)%*%(s1))/sum(T1)
nSigma2=(t(s2)%*%diag(T2)%*%(s2))/sum(T2)
nt = sum(T1)/n
if(sum(abs(u1-nu1))+sum(abs(u2-nu2))+sum(abs(Sigma1-nSigma1))+sum(abs(Sigma2-nSigma2))<1e-10) break
print(c(i,sum(abs(u1-nu1))+sum(abs(u2-nu2))+sum(abs(Sigma1-nSigma1))+sum(abs(Sigma2-nSigma2))))
  cat('accuracy',sum(diag(table(T1>0.5,I)))/sum(table(T1>0.5,I)),'\n')
t = nt ; u1 = nu1 ; u2 = nu2 ; Sigma1 = nSigma1 ; Sigma2 = nSigma2
}
u1; u2
Sigma1; Sigma2
par(mfrow=c(1,2))
plot(x.mat)
plot(x.mat[T1>0.5,],ylim = c(-3,16),xlim=c(-3,20),col="red")
points(x.mat[T1<=0.5,],col="blue")
<file_sep>/activeset control in lasso_(2019.05.09).R
rm(list=ls())
set.seed(1)
n = 100; p = 10
x.mat = matrix(rnorm(n*p),ncol=p); b.vec = 1/(1:p)
exb.vec = exp(drop(x.mat%*%b.vec)); p.vec = exb.vec/(1+exb.vec)
y.vec = rbinom(n,1,prob=p.vec)
iter.max = 1e+3;eps = 1e-05
###loss
loss.fun = function(y.vec,x.mat,b.vec){
xb.vec = drop(x.mat%*%b.vec); ret = -sum(y.vec*xb.vec)+sum(log(1+exp(xb.vec)))
return(ret)
}
###grad
grad.fun = function(y.vec,x.mat,b.vec){
exb.vec = exp(drop(x.mat%*%b.vec)); p.vec = exb.vec/(1+exb.vec)
ret = -drop(t(x.mat)%*%(y.vec-p.vec))
return(ret)
}
###hess
hess.fun = function(y.vec,x.mat,b.vec){
exb.vec = exp(drop(x.mat%*%b.vec)); p.vec = exb.vec/(1+exb.vec)
ret = t(x.mat)%*%diag(p.vec*(1-p.vec))%*%x.mat
return(ret)
}
###kkt condition
kkt.fun = function(g.vec,b.vec,lam,eps){
kkt1 = prod(abs(g.vec[b.vec!=0] + lam*sign(b.vec[b.vec!=0]))<eps)
kkt2 = prod(abs(g.vec[b.vec==0])-lam<eps)
return(kkt1&kkt2)
}
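### (editor's note) kkt.fun checks the lasso optimality conditions for
### loss(b) + lam*||b||_1: g_j + lam*sign(b_j) = 0 (within eps) for b_j != 0,
### and |g_j| <= lam (within eps) for b_j = 0, where g.vec is the gradient of
### the loss.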
###quadratic approximation
quad.lasso.fun = function(q.mat,l.vec,lam,b.vec,iter.max=1e+02,eps=1e-07){
f.vec = rep(NA,iter.max)
for(iter in 1:iter.max){
for(j in 1:length(b.vec)){
a = 2*q.mat[j,j]; b = -2*sum(q.mat[j,-j]*b.vec[-j])-l.vec[j]
if(abs(b)<lam){ b.vec[j] = 0
} else { b.vec[j] = sign(b)*(abs(b)-lam)/a }
}
f.vec[iter] = t(b.vec)%*%q.mat%*%b.vec+sum(l.vec*b.vec)+lam*sum(abs(b.vec))
g.vec = 2*q.mat%*%b.vec+l.vec
conv = kkt.fun(g.vec,b.vec,lam,eps)
if(conv) break
}
return(list(b.vec=b.vec,g.vec=g.vec,conv=conv,lam=lam))
}
###quadratic approximation lasso ###SCAD homework
#initial beta
lasso.fun = function(y.vec,x.mat,b.vec,lam,iter.max=1e+3,eps=1e-5){
for(iter in 1:iter.max){#iter
h.mat = hess.fun(y.vec,x.mat,b.vec)
g.vec = grad.fun(y.vec,x.mat,b.vec)
q.mat = h.mat/2
l.vec = g.vec-drop(h.mat%*%b.vec)
b.vec = quad.lasso.fun(q.mat,l.vec,lam,b.vec,iter.max,eps)$b.vec
g.vec = grad.fun(y.vec,x.mat,b.vec)
conv = kkt.fun(g.vec,b.vec,lam,eps)
if(conv) break
}#iter
#print(conv)
#print(g.vec)
return(list(b.vec=b.vec,g.vec=g.vec,lam=lam))
}
###active set control
lam.vec = 10:1
act.lasso.fun = function(y.vec,x.mat,lam.vec,iter.max=1e+3,eps=1e-5){
b.mat = NULL ; b.vec = rep(0,ncol(x.mat))
g.vec = grad.fun(y.vec,x.mat,b.vec)
for(lam in lam.vec){#lam
for(iter in 1:iter.max){#iter
set1 = b.vec!=0
set2 = abs(g.vec)>=(lam-eps)
set = set1|set2
ax.mat = x.mat[,set,drop=F]
ab.vec = b.vec[set]
fit = lasso.fun(y.vec,ax.mat,ab.vec,lam)
b.vec[set] = fit$b.vec
g.vec = grad.fun(y.vec,x.mat,b.vec)
conv = kkt.fun(g.vec,b.vec,lam,eps)
if(conv) break
}#iter
b.mat = cbind(b.mat,b.vec)
}#lam
return(list(b.mat=b.mat,lam.vec=lam.vec))
}
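### (editor's note) active-set strategy: for each lam only the coordinates that
### are currently non-zero (set1) or whose gradient is near the lam bound
### (set2) are refit with lasso.fun; the KKT check on the full gradient then
### confirms the solution or expands the active set on the next iteration.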
act.lasso.fun(y.vec,x.mat,lam.vec=10:1,iter.max,eps)
lasso.fun(y.vec,x.mat,10,b.vec=rep(0,p),iter.max,eps)
| b628b8359db2eef2b1ddf81e15094eec29da5b7f | ["R"] | 4 | R | jaebyunglee/Algorithm | 0a5380bad71105f788d623278586b41b37420a05 | f830dfb98ed27c4a2c9927e1353e291aa00df30f | refs/heads/master |
<file_sep>export default {
addCount(state, num){
state.count = num
}
}<file_sep>export default {
updatedAsync({commit},payload) {
setTimeout(()=>{
commit('addCount', payload.num)
},payload.time)
},
}<file_sep>import Vue from 'vue'
import Vuex from 'vuex'
import state from './state/index'
import mutations from './mutations/index'
import getters from './getter/index'
import actions from './action/index'
Vue.use(Vuex)
const isDev = process.env.NODE_ENV === 'development'
const store = new Vuex.Store({
strict:isDev,
state,
mutations,
getters,
actions,
modules: {
a: {
namespaced: true,
state: {
text:'233'
},
mutations:{
setText(state, texts){
state.text = texts
}
},
getters: {
textPlugin(state){
return state.text + 'weeere'
}
}
},
b: {
namespaced: true,
state: {
text: 'hahahha'
},
mutations:{
setText(state, texts){
state.text += texts
}
}
}
}
})
export default store
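// (Editor's sketch, not in the original.) Because modules `a` and `b` are
// namespaced, their members are addressed with a path prefix from components:
//   this.$store.commit('a/setText', 'hello')
//   this.$store.getters['a/textPlugin']
//   this.$store.dispatch('updatedAsync', { num: 1, time: 1000 }) // root action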
| da5ae26b43b4eaa87818ce0cba039522dd864f45 | ["JavaScript"] | 3 | JavaScript | F1eyadu/vue-demo | bcd504bdc00ae20a73dbf98905af8e942912beea | 26114fa18d4b57d663856571025d57eaac2a4112 | refs/heads/master |
<repo_name>yanzheyusam/COSI12b-Team07-Final-Project<file_sep>/src/StoreAssistantManager/Product.java
/* The Product class,
containing the information of a product */
package StoreAssistantManager;
public class Product {
public String name;
public double cost;
public double price;
public int stock;
/**
* Constructor
* @param initName the name of the product
	 * @param initcost the cost of the product
	 * @param initprice the selling price of the product
	 * @param initstock the number of units in stock
*/
public Product(String initName, double initcost, double initprice, int initstock) {
name = initName;
cost = initcost;
price = initprice;
stock = initstock;
}
/**
* Return the cost of the product
* @return the cost of the product
*/
public double getCost() {
return cost;
}
/**
* Return the price of the product
* @return the price of the product
*/
public double getPrice() {
return price;
}
/**
* Return the name of the product
* @return the name of the product
*/
public String getName() {
return name;
}
/**
* Return the stock of the product
* @return the stock of the product
*/
public double getStock() {
return stock;
}
/**
* Formatting the product's information in a string
*/
public String toString() {
return "Name: " + name + ". Cost: " + cost + ". Price: " + price + ". Stock: " + stock + ".";
}
}
<file_sep>/src/StoreAssistantManager/Manager.java
/* The Manager class extends from the Employee class,
* containing the information of a manager */
package StoreAssistantManager;
public class Manager extends Employee{
/**
* Constructor
* @param initName the name of the employee
	 * @param initSalary the salary of the manager
*/
public Manager(String initName, double initSalary) {
super(initName, initSalary);
position = "Manager";
salary = initSalary;
}
/**
* Change the salary of a employee
* @param employee the employee object to be changed
* @param newSalary the new salary to change
* @return the amount of the new salary
*/
// public String changeSalary(Employee employee, String newSalary) {
// employee.salary = newSalary;
// return employee.salary;
// }
@Override
/**
* Add the specific information to the end of the string
* of the super class
*/
public String toString() {
return super.toString() + " Position: manager.";
}
}
<file_sep>/src/StoreAssistantManager/Seller.java
/* The Seller class extends from the Employee class,
* containing the information of a cashier */
package StoreAssistantManager;
public class Seller extends Employee{
/**
* Constructor
* @param initName the name of the employee
	 * @param initSalary the salary of the seller
*/
public Seller(String initName, double initSalary) {
super(initName, initSalary);
position = "Seller";
salary = initSalary;
}
@Override
/**
* Add the specific information to the end of the string
* of the super class
*/
public String toString() {
return super.toString() + " Position: seller.";
}
}
<file_sep>/src/StoreAssistantManager/Employee.java
/* The Employee base class (extended by Manager and Seller),
containing the information of an employee */
package StoreAssistantManager;
public class Employee {
public double salary;
public String name;
public String position;
/**
* Constructor
* @param initName the name of the employee
* @param initSalary the salary that the employee has
*/
public Employee(String initName, double initSalary) {
name = initName;
salary = initSalary;
}
/**
* Return the position of the employee
* @return the position of the employee
*/
public String getPosition() {
return position;
}
/**
* Return the salary of the employee
* @return the salary of the employee
*/
public double getSalary() {
return salary;
}
/**
* Return the name of the employee
* @return the name of the employee
*/
public String getName() {
return name;
}
/**
* Formatting the employee's information in a string
*/
public String toString() {
return "Name: " + name + ". Salary: " + salary + ".";
}
}
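// (Editor's sketch, not part of the original project.) Typical usage of the
// classes above:
//   Employee m = new Manager("Alice", 5000.0);
//   Employee s = new Seller("Bob", 3000.0);
//   Product p = new Product("Widget", 2.5, 4.0, 100);
//   System.out.println(m); // Name: Alice. Salary: 5000.0. Position: manager.
//   System.out.println(p); // Name: Widget. Cost: 2.5. Price: 4.0. Stock: 100.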
| ef51050fc70bc0c107080ebbf2190d5d6d06172d | ["Java"] | 4 | Java | yanzheyusam/COSI12b-Team07-Final-Project | 08feb25400a4d0851ecd3326ed65de426bdf46aa | 253b24fdb6ca380ced43b2656e9f7d80583e9607 | refs/heads/master |
<file_sep>// Ionic Starter App
// angular.module is a global place for creating, registering and retrieving Angular modules
// 'starter' is the name of this angular module example (also set in a <body> attribute in index.html)
// the 2nd parameter is an array of 'requires'
angular.module('quizTaker', ['ionic'])
.config(function ($stateProvider, $urlRouterProvider) {
$stateProvider
.state('oathPage', {
url: '/oathPage',
controller: 'oathPageController',
templateUrl: 'views/oathPage/oathPage.html'
})
.state('chooseTest', {
url: '/chooseTest',
controller: 'chooseTestController',
templateUrl: 'views/chooseTest/chooseTest.html'
})
.state('testCondition', {
url: '/testCondition/:testID',
controller: 'testConditionController',
templateUrl: 'views/testCondition/testCondition.html'
})
.state('startTest', {
url: '/startTest/:startID',
controller: 'startTestController',
templateUrl: 'views/startTest/startTest.html'
})
.state('testComplete', {
url: '/testComplete',
controller: 'testCompleteController',
templateUrl: 'views/testComplete/testComplete.html'
});
$urlRouterProvider
.otherwise('/chooseTest');
})
.factory("questionBank", function() {
return [
{
                nameOfTest : "Aptitude",
testMarks: 15,
testDescription: "Quiz has 30 questions from Wily HTML and CSS by <NAME>." ,
testTime: 70,
passingMarks:10,
Questions: [
{
QuiestionNo: 1,
questionTitle: "",
Question: "Who was the founding father of Pakistan?",
options: [
{option: "<NAME>", ansValue: true, marks: 5},
{option:"<NAME>", ansValue: false, marks:0},
{option:"<NAME>",ansValue: false, marks:0},
{option:"<NAME>", ansValue:false, marks:0}
],
ans: 1
},
{
QuiestionNo: 2,
questionTitle: "Simplify the following:",
Question: "3 x 3 + 2 / 2 + 1",
options: [
{option: "7", ansValue: false, marks: 0},
{option: "11", ansValue: true, marks: 5},
{option: "13", ansValue: false, marks: 0},
{option: "4", ansValue: false, marks: 0}
],
ans: 2
},
{
QuiestionNo: 3,
questionTitle: "Change the voice",
Question: "He has eaten the food.",
options: [
{option: "Food has been eaten by him", ansValue: true, marks: 5},
{option: "Food has eaten by him", ansValue: false, marks: 0},
{option: "Food has eat him", ansValue: false, marks: 0},
{option: "Food has been eated by him", ansValue: false, marks: 0}
],
ans: 1
}]
},
{
nameOfTest : "Mathematics",
testDescription: "Quiz has 30 questions from Mathematics 10." ,
testMarks: 20,
testTime: 60,
passingMarks:15,
Questions: [
{
QuiestionNo: 1,
questionTitle: "Find the Square root of the following",
Question: "625",
options: [
{option: "5", ansValue: false, marks: 0},
{option: "125", ansValue: false, marks: 0},
{option: "75", ansValue: false, marks: 0},
{option: "25", ansValue: true, marks: 5}
],
ans: 4
},
{
QuiestionNo: 2,
questionTitle: "Simplify the following:",
Question: "3 x 3 + 2 / 2 + 1",
options: [
{option: "7", ansValue: false, marks: 0},
{option:"11", ansValue: true, marks: 5},
{option: "13", ansValue: false, marks: 0},
{option: "4", ansValue: false, marks: 0}
],
ans: 2
},
{
QuiestionNo: 3,
questionTitle: "Find the GCF",
Question: "125 , 625",
options: [
{option: "5", ansValue: false, marks: 0},
{option:"25", ansValue: false, marks: 0},
{option: "125", ansValue: true, marks: 5},
{option: "625", ansValue: false, marks: 0}
],
ans: 3
},
{
QuiestionNo: 4,
questionTitle: "Choose the correct answer.",
                        Question: "Select the largest number",
options: [
{option: "0", ansValue: true, marks: 5},
{option: "-0.1", ansValue: false, marks: 0},
{option: "-0.3", ansValue: false, marks: 0},
{option: "-0.2", ansValue: false, marks: 0}
],
ans: 1
}]
}
];
})
.run(function($ionicPlatform) {
$ionicPlatform.ready(function() {
// Hide the accessory bar by default (remove this to show the accessory bar above the keyboard
// for form inputs)
if(window.cordova && window.cordova.plugins.Keyboard) {
cordova.plugins.Keyboard.hideKeyboardAccessoryBar(true);
}
if(window.StatusBar) {
StatusBar.styleDefault();
}
});
})
<file_sep><ion-view view-title="Test" hide-back-button="false" class="mainbody">
<ion-content class="paddingForBodyofTest topMargin">
<!--StopWatch-->
<div id="display"></div>
<!--StopWatch-->
<h3 class="timer">Question: {{questionBank.QuiestionNo}}</h3>
<h5>{{questionBank.questionTitle}}</h5>
<p>{{questionBank.Question}}</p>
<div class="list">
<div class="item item-divider">
Selected Option: {{ selectedOpt.Opt.option }}
</div>
<ion-radio ng-repeat="option in options"
ng-value="option"
ng-model="selectedOpt.Opt"
ng-change="optionChange(option)"
name="SelectedOption">
{{ option.option }}
</ion-radio>
</div>
<div class="buttonForOath">
<button class="button button-block button-positive " ng-click="next()">
Next
</button>
</div>
</ion-content>
</ion-view>
<file_sep>/**
* Created by haider on 11/13/2015.
*/
angular.module('quizTaker')
.controller("testConditionController", function($scope, $rootScope, questionBank, $stateParams){
var indexOfTest = $stateParams.testID;
$scope.indexOfTest = $stateParams.testID;
console.log(indexOfTest);
$scope.questionBank = questionBank[indexOfTest];
});
<file_sep>/**
* Created by haider on 11/13/2015.
*/
angular.module('quizTaker')
.controller("startTestController", function($scope, $rootScope, questionBank, $stateParams, $state){
var indexOfTest = $stateParams.startID;
//$scope.indexOfTest = $stateParams.startID;
//console.log(indexOfTest);
$rootScope.indexOfTest = indexOfTest;
var questionNo = 0;
$scope.questionBank = questionBank[indexOfTest].Questions[questionNo];
var lengthOfQuestions = questionBank[indexOfTest].Questions.length;
//console.log(lengthOfQuestions);
$rootScope.answers = 0;
$scope.options = $scope.questionBank.options;
$scope.selectedOpt = {};
$scope.optionChange = function(option) {
//console.log("Selected Serverside, text:", option.option, "value:", option.ansValue);
$scope.chosedOpt = option;
};
$scope.next = function () {
//console.log(option);
console.log($scope.chosedOpt);
if($scope.chosedOpt.ansValue == true ){
$rootScope.answers = $rootScope.answers + $scope.chosedOpt.marks;
}
console.log($rootScope.answers);
questionNo++;
if(questionNo == lengthOfQuestions){
console.log("test completed");
$state.go('testComplete');
}
else{
$scope.questionBank = questionBank[indexOfTest].Questions[questionNo];
$scope.options = $scope.questionBank.options;
$scope.optionChange($scope.options);
$scope.selectedOpt = {};
}
};
/* *//*StopWatch*/
var sec = 60;
var min = questionBank[indexOfTest].testTime - 1;
var startTimer = setInterval(function (){
var display = document.getElementById("display");
display.innerHTML = min+ ' : ' + (sec - 1) ;
//display.innerHTML = min+ ' : ' + (sec - 1) ;
sec--;
if(sec < 0){
//clearInterval(startTimer);
//console.log(sec);
min = min -1;
sec = 59;
display.innerHTML = min+ ' : ' + (sec) ;
//clearInterval(startTimer);
}
if (min == 0 && sec == 0){
clearInterval(startTimer);
console.log("test Complete");
$state.go('testComplete');
}
},1000);
});
<file_sep>/**
* Created by haider on 11/13/2015.
*/
angular.module('quizTaker')
.controller("chooseTestController", function($scope, questionBank){
//$rootScope.questionBank = questionBank;
$scope.questionBank = questionBank;
});
<file_sep>/**
* Created by haider on 11/13/2015.
*/
angular.module('quizTaker')
.controller("oathPageController", function($scope, $rootScope, $state){
$rootScope.hideLogoutButton = true;
$scope.login = function(){
$state.go("chooseTest");
};
//$scope.questionBank = questionBank;
});
<file_sep># QuizTaker
A quiz application for smartphones.
| cf556eef1c4c96a5ee5c5291d19bfc0c1b10b381 | ["JavaScript", "HTML", "Markdown"] | 7 | JavaScript | haideralishah/QuizTaker | 4c58c0556cfd9cf143f91a325c85caafe144fe77 | 096eb998a2a3a87a839a2c585b1d97d9bd29cc36 | refs/heads/master |
<file_sep>const aws_sdk = require('aws-sdk');
const schema = require('./validation');
aws_sdk.config.update({region: 'ap-south-1'});
const book = new aws_sdk.DynamoDB.DocumentClient();
function getByBookId(id) {
var params = {
TableName: "books_dev",
Key: {
book_id:parseInt(id)
}
};
return new Promise((res,rej)=>{
book.get(params, (err,response)=>{
if(err){
rej(err);
}
if(response){
res(response);
}
});
})
}
function getByBookName(name) {
var params = {
TableName: "books_dev",
IndexName: "book_name-index",
KeyConditionExpression: "book_name = :book_name",
ExpressionAttributeValues: {
":book_name": name
},
};
return new Promise((res,rej)=>{
book.query(params, (err,response)=>{
if(err){
rej(err);
}
if(response){
res(response);
}
});
})
}
function getByAuthor(AuthorName) {
var params = {
TableName: "books_dev",
IndexName: "author-index",
KeyConditionExpression: "author = :author_name",
ExpressionAttributeValues: {
":author_name": AuthorName
},
};
return new Promise((res,rej)=>{
book.query(params, (err,response)=>{
if(err){
rej(err);
}
if(response){
let arr=[];
                for(let i=0;i<response.Items.length;i++)
arr.push(response.Items[i].book_name);
res((arr));
}
});
})
}
function buyBook(Bid) {
var params = {
TableName: "books_dev",
Key: {
book_id: Bid,
},
UpdateExpression: "set quantity = quantity - :val",
ConditionExpression: 'quantity > :inc',
ExpressionAttributeValues: {
":val": 1,
":inc": 0
},
ReturnValues: "UPDATED_NEW"
};
console.log("Updating the item...");
return new Promise((res, rej) => {
book.update(params, function (err, data) {
if (err) {
rej(err)
} if (data) {
res(data)
}
});
})
}
let addBook = function (valueObject) {
var params = {
TableName: "books_dev",
Item: {
book_id: valueObject.book_id,
book_name: valueObject.book_name,
quantity: valueObject.quantity,
author: valueObject.author
},
};
return new Promise((res, rej) => {
book.put(params, (err, Response) => {
if (err) {
rej(err)
}
if (Response) {
res(Response.Item)
}
})
})
}
module.exports = {
getByBookId,
getByBookName,
getByAuthor,
    buyBook,
    addBook,
}
<file_sep>const Joi = require('joi');
const bookSchema = {
book_id:Joi.number().required(),
book_name:Joi.string().required(),
author:Joi.string().required(),
quantity:Joi.number().required()
}
module.exports = bookSchema;<file_sep>const aws_sdk = require('aws-sdk');
const db = require('./db');
const schema = require('./validation');
async function bookId(ctx) {
await db.buyBook(parseInt(ctx.params.bookId)).then(
result => {
ctx.body = result;
}
).catch(err => {
if(err){
if(err.code=="ConditionalCheckFailedException"){
ctx.body="Invalid Id";
}
else
{
ctx.body=err;
}
}
}
)
}
async function byId(ctx) {
console.log(ctx.url);
await db.getByBookId(ctx.params.Id).then(
result => {
if(result.Item==null){
ctx.body='No such data';
}
else{
ctx.body=result;
}
}
).catch(err=>{
if(err){
ctx.body = err;
}
}
);
}
async function byName(ctx) {
await db.getByBookName(ctx.params.Name).then(
result => {
if(result.Count==0)
{
ctx.body="Book not available.";
}
else
{
ctx.body=result;
}
}
).catch(err=>{
if(err){
ctx.body = err;
}
}
);
}
async function byAuthor(ctx) {
await db.getByAuthor(ctx.params.Author).then(
result => {
if(result.length==0){
ctx.body='No Book for this author';
}
else
ctx.body=result;
}
).catch(err=>{
ctx.body=err;
}
);
}
async function callAddBook(ctx) {
await db.addBook(ctx.request.body).then(
result => {
ctx.body = result;
}
).catch(err => {
ctx.body = err;
}
)
}
module.exports = {
byId,
byName,
byAuthor,
callAddBook,
bookId
}
<file_sep>const koa = require('koa');
const Router = require('koa-router');
const app = new koa();
const router = new Router();
const handler = require('./handler');
router.get('/byId/:Id',handler.byId);
router.get('/byName/:Name',handler.byName);
router.get('/byAuthor/:Author',handler.byAuthor);
router.get('/buyBook/:bookId', handler.bookId);
router.post('/addBook', handler.callAddBook);
module.exports = router;
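// (Editor's sketch, not in the original dump.) The Koa bootstrap that mounts
// this router (app.use(router.routes()); app.listen(...)) and a body parser
// such as koa-bodyparser are assumed but not included here; `app` above is
// created but unused in this file. Example requests, assuming the server
// listens on port 3000:
//   GET  /byId/1
//   GET  /byName/SomeBookName
//   GET  /byAuthor/SomeAuthorName
//   GET  /buyBook/1
//   POST /addBook  with JSON body { "book_id": 1, "book_name": "X", "author": "Y", "quantity": 5 }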
| 4123005cf5c277f15208a2cfe70bd249fc6be77c | ["JavaScript"] | 4 | JavaScript | amolnarayanpatil08/book | b1372143188e5a785b32108602dae4bf360c0352 | f54d0de48c146a935d5045c3e104566abff51e04 | refs/heads/main |
<file_sep># Notes
- http://www.joshkunz.com/iTunesControl/
- https://discussions.apple.com/thread/250208393
- http://samsoft.org.uk/iTunes/scripts.asp
- http://samsoft.org.uk/iTunes/links.asp
- https://pbpython.com/windows-com.html
- https://documentation.help/iTunesCOM/main.html
- https://github.com/ybnd/beets-plugins/blob/da1bb1ae4f7b2746fdfb43c71a1b4b36c25aa01b/README.md <3
- https://randomgeekery.org/post/2017/10/beets-and-itunes/
<file_sep>from beets.plugins import BeetsPlugin
import pythoncom
import win32com.client as win32
class ItunesImportPlugin(BeetsPlugin):
def __init__(self):
super(ItunesImportPlugin, self).__init__()
self.register_listener('album_imported', self.album_imported)
def album_imported(self, lib, album):
dir = album.item_dir()
dirStr = dir.decode(encoding="utf-8")
self._log.debug('album dir: ' + dirStr)
if dir:
pythoncom.CoInitialize() # threading, added via https://stackoverflow.com/a/51240037
itunes = win32.gencache.EnsureDispatch('iTunes.Application')
            itunes_lib = itunes.LibraryPlaylist  # renamed to avoid shadowing the beets `lib` argument
            itunes_lib.AddFile(dirStr)
<file_sep># beets-itunesimport
Plugin for [beets](https://beets.io/) to automatically add imported albums to iTunes, using the iTunes COM Interface on Windows.
Only tested on Win10 with iTunes 10.7 for now.
## Troubleshooting
### AttributeError: `… no attribute 'CLSIDToClassMap'`
E.g.
```powershell
AttributeError: module 'win32com.gen_py.9E93C96F-CF0D-43F6-8BA8-B807A3370712x0x1x13' has no attribute 'CLSIDToClassMap'
```
```powershell
AttributeError: module 'win32com.gen_py.9E93C96F-CF0D-43F6-8BA8-B807A3370712x0x1x13' has no attribute 'MinorVersion'
```
Answers ([1](https://stackoverflow.com/questions/59276808/win32com-module-problems), [2](https://stackoverflow.com/questions/52889704/python-win32com-excel-com-model-started-generating-errors)) on Stack Overflow
suggested deleting folders in
```
%USERPROFILE%\AppData\Local\Temp\gen_py
```
which worked for me.
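A hedged one-liner for that cleanup (assumes the default temp location shown above; win32com regenerates the cache on the next run):
```powershell
Remove-Item -Recurse -Force "$env:LOCALAPPDATA\Temp\gen_py"
```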
<file_sep>from setuptools import setup
setup(
name='beets-itunesimport',
version='0.1.0.dev0', # TODO
description='', # TODO
long_description=open('README.md').read(),
long_description_content_type='text/markdown',
author='<NAME>',
author_email='<EMAIL>',
url='https://github.com/Schweinepriester/beets-itunesimport',
license='MIT',
platforms='WIN', # TODO
packages=['beetsplug'],
python_requires='>=3.7',
install_requires=[
'beets>=1.4.9',
],
classifiers=[
'Topic :: Multimedia :: Sound/Audio',
'Topic :: Multimedia :: Sound/Audio :: Players :: MP3',
'License :: OSI Approved :: MIT License',
'Environment :: Console',
'Environment :: Web Environment',
'Programming Language :: Python :: 3',
],
)
| 3515398f62ddc2f156c771e857ac988c99299e90 | ["Markdown", "Python"] | 4 | Markdown | Zazvo/beets-itunesimport | 7bcca238a0812789deed33ff4e2d1e98cfdf3d1b | 22250f68e731553155315bc242660bcec14dda2a | refs/heads/master |
<repo_name>yuki-js/lnd<file_sep>/invoices/invoice_expiry_watcher.go
package invoices
import (
"fmt"
"sync"
"time"
"github.com/lightningnetwork/lnd/channeldb"
"github.com/lightningnetwork/lnd/clock"
"github.com/lightningnetwork/lnd/lntypes"
"github.com/lightningnetwork/lnd/queue"
"github.com/lightningnetwork/lnd/zpay32"
)
// invoiceExpiry is a vanity interface for different invoice expiry types
// which implement the priority queue item interface, used to improve code
// readability.
type invoiceExpiry queue.PriorityQueueItem
// Compile time assertion that invoiceExpiryTs implements invoiceExpiry.
var _ invoiceExpiry = (*invoiceExpiryTs)(nil)
// invoiceExpiryTs holds an invoice's payment hash and its expiry. This
// is used to order invoices by their expiry time for cancellation.
type invoiceExpiryTs struct {
PaymentHash lntypes.Hash
Expiry time.Time
Keysend bool
}
// Less implements PriorityQueueItem.Less such that the top item in the
// priority queue will be the one that expires next.
func (e invoiceExpiryTs) Less(other queue.PriorityQueueItem) bool {
return e.Expiry.Before(other.(*invoiceExpiryTs).Expiry)
}
// InvoiceExpiryWatcher handles automatic invoice cancellation of expired
// invoices. Upon start InvoiceExpiryWatcher will retrieve all pending (not yet
// settled or canceled) invoices into its watching queue. When a new
// invoice is added to the InvoiceRegistry, it'll be forwarded to the
// InvoiceExpiryWatcher and will end up in the watching queue as well.
// If any of the watched invoices expire, they'll be removed from the watching
// queue and will be cancelled through InvoiceRegistry.CancelInvoice().
type InvoiceExpiryWatcher struct {
sync.Mutex
started bool
// clock is the clock implementation that InvoiceExpiryWatcher uses.
// It is useful for testing.
clock clock.Clock
// cancelInvoice is a template method that cancels an expired invoice.
cancelInvoice func(lntypes.Hash, bool) error
// timestampExpiryQueue holds invoiceExpiry items and is used to find
// the next invoice to expire.
timestampExpiryQueue queue.PriorityQueue
// newInvoices channel is used to wake up the main loop when a new
// invoices is added.
newInvoices chan []invoiceExpiry
wg sync.WaitGroup
// quit signals InvoiceExpiryWatcher to stop.
quit chan struct{}
}
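// exampleExpiryWatcherLifecycle is an editor-added sketch (not part of the
// original source) illustrating the lifecycle described above. The cancel
// callback below is a hypothetical stand-in for InvoiceRegistry.CancelInvoice.
func exampleExpiryWatcherLifecycle() error {
	watcher := NewInvoiceExpiryWatcher(clock.NewDefaultClock())
	cancel := func(hash lntypes.Hash, force bool) error {
		log.Debugf("cancelling expired invoice %v (force=%v)", hash, force)
		return nil
	}
	if err := watcher.Start(cancel); err != nil {
		return err
	}
	defer watcher.Stop()
	// Invoices created elsewhere are forwarded via AddInvoices, e.g.:
	// watcher.AddInvoices(makeInvoiceExpiry(paymentHash, invoice))
	return nil
}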
// NewInvoiceExpiryWatcher creates a new InvoiceExpiryWatcher instance.
func NewInvoiceExpiryWatcher(clock clock.Clock) *InvoiceExpiryWatcher {
return &InvoiceExpiryWatcher{
clock: clock,
newInvoices: make(chan []invoiceExpiry),
quit: make(chan struct{}),
}
}
// Start starts the subscription handler and the main loop. Start() will
// return with an error if InvoiceExpiryWatcher is already started. Start()
// expects a cancellation function that will be used to cancel expired
// invoices by their payment hash.
func (ew *InvoiceExpiryWatcher) Start(
cancelInvoice func(lntypes.Hash, bool) error) error {
ew.Lock()
defer ew.Unlock()
if ew.started {
return fmt.Errorf("InvoiceExpiryWatcher already started")
}
ew.started = true
ew.cancelInvoice = cancelInvoice
ew.wg.Add(1)
go ew.mainLoop()
return nil
}
// Stop quits the expiry handler loop and waits for InvoiceExpiryWatcher to
// fully stop.
func (ew *InvoiceExpiryWatcher) Stop() {
ew.Lock()
defer ew.Unlock()
if ew.started {
// Signal subscriptionHandler to quit and wait for it to return.
close(ew.quit)
ew.wg.Wait()
ew.started = false
}
}
// makeInvoiceExpiry checks if the passed invoice may be canceled and calculates
// the expiry time and creates a slimmer invoiceExpiry implementation.
func makeInvoiceExpiry(paymentHash lntypes.Hash,
invoice *channeldb.Invoice) invoiceExpiry {
switch invoice.State {
// If we have an open invoice with no htlcs, we want to expire the
// invoice based on timestamp
case channeldb.ContractOpen:
return makeTimestampExpiry(paymentHash, invoice)
default:
log.Debugf("Invoice not added to expiry watcher: %v",
paymentHash)
return nil
}
}
// makeTimestampExpiry creates a timestamp-based expiry entry.
func makeTimestampExpiry(paymentHash lntypes.Hash,
invoice *channeldb.Invoice) *invoiceExpiryTs {
if invoice.State != channeldb.ContractOpen {
return nil
}
realExpiry := invoice.Terms.Expiry
if realExpiry == 0 {
realExpiry = zpay32.DefaultInvoiceExpiry
}
expiry := invoice.CreationDate.Add(realExpiry)
return &invoiceExpiryTs{
PaymentHash: paymentHash,
Expiry: expiry,
Keysend: len(invoice.PaymentRequest) == 0,
}
}
// AddInvoices adds invoices to the InvoiceExpiryWatcher.
func (ew *InvoiceExpiryWatcher) AddInvoices(invoices ...invoiceExpiry) {
if len(invoices) > 0 {
select {
case ew.newInvoices <- invoices:
log.Debugf("Added %d invoices to the expiry watcher",
len(invoices))
// Select on quit too so that callers won't get blocked in case
// of concurrent shutdown.
case <-ew.quit:
}
}
}
// nextTimestampExpiry returns a Time chan to wait on until the next invoice
// expires. If there are no active invoices, then it'll simply wait
// indefinitely.
func (ew *InvoiceExpiryWatcher) nextTimestampExpiry() <-chan time.Time {
if !ew.timestampExpiryQueue.Empty() {
top := ew.timestampExpiryQueue.Top().(*invoiceExpiryTs)
return ew.clock.TickAfter(top.Expiry.Sub(ew.clock.Now()))
}
return nil
}
// cancelNextExpiredInvoice will cancel the next expired invoice and removes
// it from the expiry queue.
func (ew *InvoiceExpiryWatcher) cancelNextExpiredInvoice() {
if !ew.timestampExpiryQueue.Empty() {
top := ew.timestampExpiryQueue.Top().(*invoiceExpiryTs)
if !top.Expiry.Before(ew.clock.Now()) {
return
}
// Don't force-cancel already accepted invoices. An exception to
// this are auto-generated keysend invoices. Because those move
// to the Accepted state directly after being opened, the expiry
// field would never be used. Enabling cancellation for accepted
		// keysend invoices creates a safety mechanism that can prevent
// channel force-closes.
ew.expireInvoice(top.PaymentHash, top.Keysend)
ew.timestampExpiryQueue.Pop()
}
}
// expireInvoice attempts to expire an invoice and logs an error if we get an
// unexpected error.
func (ew *InvoiceExpiryWatcher) expireInvoice(hash lntypes.Hash, force bool) {
err := ew.cancelInvoice(hash, force)
switch err {
case nil:
case channeldb.ErrInvoiceAlreadyCanceled:
case channeldb.ErrInvoiceAlreadySettled:
default:
log.Errorf("Unable to cancel invoice: %v: %v", hash, err)
}
}
// pushInvoices adds invoices to be expired to their relevant queue.
func (ew *InvoiceExpiryWatcher) pushInvoices(invoices []invoiceExpiry) {
for _, inv := range invoices {
// Switch on the type of entry we have. We need to check nil
// on the implementation of the interface because the interface
// itself is non-nil.
switch expiry := inv.(type) {
case *invoiceExpiryTs:
if expiry != nil {
ew.timestampExpiryQueue.Push(expiry)
}
default:
log.Errorf("unexpected queue item: %T", inv)
}
}
}
// mainLoop is a goroutine that receives new invoices and handles cancellation
// of expired invoices.
func (ew *InvoiceExpiryWatcher) mainLoop() {
defer ew.wg.Done()
for {
// Cancel any invoices that may have expired.
ew.cancelNextExpiredInvoice()
select {
case newInvoices := <-ew.newInvoices:
// Take newly forwarded invoices with higher priority
// in order to not block the newInvoices channel.
ew.pushInvoices(newInvoices)
continue
default:
select {
case <-ew.nextTimestampExpiry():
// Wait until the next invoice expires.
continue
case newInvoices := <-ew.newInvoices:
ew.pushInvoices(newInvoices)
case <-ew.quit:
return
}
}
}
}
<file_sep>/lntest/itest/lnd_hold_invoice_force_test.go
package itest
import (
"context"
"fmt"
"github.com/lightningnetwork/lnd/lncfg"
"github.com/lightningnetwork/lnd/lnrpc"
"github.com/lightningnetwork/lnd/lnrpc/invoicesrpc"
"github.com/lightningnetwork/lnd/lnrpc/routerrpc"
"github.com/lightningnetwork/lnd/lntest"
"github.com/lightningnetwork/lnd/lntest/wait"
"github.com/lightningnetwork/lnd/lntypes"
"github.com/stretchr/testify/require"
)
// testHoldInvoiceForceClose demonstrates that recipients of hold invoices
// will not release active htlcs for their own invoices when they expire,
// resulting in a force close of their channel.
func testHoldInvoiceForceClose(net *lntest.NetworkHarness, t *harnessTest) {
ctxb, cancel := context.WithCancel(context.Background())
defer cancel()
// Open a channel between alice and bob.
chanReq := lntest.OpenChannelParams{
Amt: 300000,
}
ctxt, _ := context.WithTimeout(ctxb, channelOpenTimeout)
chanPoint := openChannelAndAssert(ctxt, t, net, net.Alice, net.Bob, chanReq)
// Create a non-dust hold invoice for bob.
var (
preimage = lntypes.Preimage{1, 2, 3}
payHash = preimage.Hash()
)
invoiceReq := &invoicesrpc.AddHoldInvoiceRequest{
Value: 30000,
CltvExpiry: 40,
Hash: payHash[:],
}
ctxt, _ = context.WithTimeout(ctxb, defaultTimeout)
bobInvoice, err := net.Bob.AddHoldInvoice(ctxt, invoiceReq)
require.NoError(t.t, err)
// Pay this invoice from Alice -> Bob, we should achieve this with a
// single htlc.
_, err = net.Alice.RouterClient.SendPaymentV2(
ctxb, &routerrpc.SendPaymentRequest{
PaymentRequest: bobInvoice.PaymentRequest,
TimeoutSeconds: 60,
FeeLimitMsat: noFeeLimitMsat,
},
)
require.NoError(t.t, err)
waitForInvoiceAccepted(t, net.Bob, payHash)
// Once the HTLC has cleared, alice and bob should both have a single
// htlc locked in.
nodes := []*lntest.HarnessNode{net.Alice, net.Bob}
err = wait.NoError(func() error {
return assertActiveHtlcs(nodes, payHash[:])
}, defaultTimeout)
require.NoError(t.t, err)
// Get our htlc expiry height and current block height so that we
// can mine the exact number of blocks required to expire the htlc.
chans, err := net.Alice.ListChannels(ctxb, &lnrpc.ListChannelsRequest{})
require.NoError(t.t, err)
require.Len(t.t, chans.Channels, 1)
require.Len(t.t, chans.Channels[0].PendingHtlcs, 1)
activeHtlc := chans.Channels[0].PendingHtlcs[0]
info, err := net.Alice.GetInfo(ctxb, &lnrpc.GetInfoRequest{})
require.NoError(t.t, err)
// Now we will mine blocks until the htlc expires, and wait for each
// node to sync to our latest height. Sanity check that we won't
// underflow.
require.Greater(t.t, activeHtlc.ExpirationHeight, info.BlockHeight,
"expected expiry after current height")
blocksTillExpiry := activeHtlc.ExpirationHeight - info.BlockHeight
// Alice will go to chain with some delta, sanity check that we won't
// underflow and subtract this from our mined blocks.
require.Greater(t.t, blocksTillExpiry,
uint32(lncfg.DefaultOutgoingBroadcastDelta))
blocksTillForce := blocksTillExpiry - lncfg.DefaultOutgoingBroadcastDelta
mineBlocks(t, net, blocksTillForce, 0)
require.NoError(t.t, net.Alice.WaitForBlockchainSync(ctxb))
require.NoError(t.t, net.Bob.WaitForBlockchainSync(ctxb))
// Alice should have a waiting-close channel because she has force
// closed to time out the htlc.
assertNumPendingChannels(t, net.Alice, 1, 0)
// We should have our force close tx in the mempool.
mineBlocks(t, net, 1, 1)
// Ensure alice and bob are synced to chain after we've mined our force
// close.
require.NoError(t.t, net.Alice.WaitForBlockchainSync(ctxb))
require.NoError(t.t, net.Bob.WaitForBlockchainSync(ctxb))
// At this point, Bob's channel should be resolved because his htlc is
// expired, so no further action is required. Alice will still have a
// pending force close channel because she needs to resolve the htlc.
assertNumPendingChannels(t, net.Alice, 0, 1)
assertNumPendingChannels(t, net.Bob, 0, 0)
ctxt, _ = context.WithTimeout(ctxb, defaultTimeout)
err = waitForNumChannelPendingForceClose(ctxt, net.Alice, 1,
func(channel *lnrpcForceCloseChannel) error {
numHtlcs := len(channel.PendingHtlcs)
if numHtlcs != 1 {
return fmt.Errorf("expected 1 htlc, got: "+
"%v", numHtlcs)
}
return nil
},
)
require.NoError(t.t, err)
// Cleanup Alice's force close.
cleanupForceClose(t, net, net.Alice, chanPoint)
}
| c3eb104f511bc3ca8a6b84b4fb8e5bee6137497b | ["Go"] | 2 | Go | yuki-js/lnd | 241f62fbb69df7abf098d528f036259ac6185a1f | 51fde58df80f361b72c5ed50c90959833f3417cc | refs/heads/master |
<file_sep><?php
//declaration du titre de la page
$blog->titre='Erreurs';
//démarrage de la mémoire tampon
ob_start();
//mise en forme des erreurs
?>
<p class='erreurs'>Une erreur est survenue : <?= $message ?></p>
<?php
//arret de la mémoire tampon
$contenu=ob_get_clean();
//appel à mon gabarit
require_once 'gabarit.php';
?><file_sep><?php
require_once 'modele/classModele.php';
class Billet extends Modele{
/*VARS*/
public $ID;
public $parution;
public $titre;
public $contenu;
public $commentaires;
/*CONSTRUCTEUR*/
function __construct(){
//transformation de la date en objet date
$this->parution=new DateTime($this->parution);
//preparation des commentaires
$this->commentaires= array();
$requete="SELECT *
FROM Commentaire
ORDER BY parution asc";
$this->commentaires=$this->fetchAllClass($requete,'Commentaire');
}
/*GETTERS*/
public function getParutionFormatEdition(){
$bob="le ".$this->parution->format("d/m/Y à H\hs");
return $bob;
}
/*charge un billet en fonction de son id
* je sais pas comment faire pour passer en methode générique avec ma classe modele
*/
public static function load($id){
$requete="SELECT *
FROM Billet
WHERE ID=$id";
//preparation, execution de la requete
$bdd = db_connect::invoque();
$statement = $bdd->connexion->prepare($requete);
$statement->execute();
        //if exactly one row comes back, build the object, otherwise throw an error
if ($statement->rowCount()==1){
$statement->setFetchMode(PDO::FETCH_CLASS, 'Billet');
$billet=$statement->fetch();
} else {
            throw new Exception("Aucun 'Billet' ne correspond à l'identifiant '$id'");
}
return $billet;
}
}<file_sep><?php
//Singleton
class db_connect {
public $host;
public $login;
public $pwd;
public $db;
public $connexion;
private static $objet = NULL;
function __construct($host, $login, $pwd, $db) {
$this->host = $host;
$this->login = $login;
$this->pwd = $pwd;
$this->db = $db;
//dsn = data source name
$dsn="mysql:dbname=$db;host=$host";
//options qui révèlent les erreurs et les transmettent à try/catch
$options=array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION);
//instanciation d'un PDO
$this->connexion = new PDO($dsn, $login, $pwd, $options);
}
public static function construit($host, $login, $pwd, $db) {
if (isset(self::$objet)) {
//La connexion existe déjà
return self::$objet;
} else {
//Il n'y a pas de connexion à la BDD
self::$objet = new db_connect($host, $login, $pwd, $db);
return self::$objet;
}
}
/* construit la connexion en fonction du portage */
public static function invoque(){
/*vars*/
$host='';
$login='';
$pwd='';
$db='';
/*var à changer avant le portage ailleurs*/
$portage="localhost";
/*hydratation*/
switch($portage){
case 'localhost':
$host='localhost';
$login='root';
$db='BLOG';
break;
}
/*retour*/
return self::construit($host, $login, $pwd, $db);
}
}<file_sep>/*creation de la base de données*/
CREATE DATABASE BLOG CHARACTER SET utf8;
/*utilisation de la base de données*/
USE BLOG;
/*creation des tables*/
CREATE TABLE Billet(
/*declaration des colonnes*/
ID int unsigned not null auto_increment,
parution datetime not null default current_timestamp,
titre varchar(100) not null,
contenu text not null,
/*indexation de la clé primaire*/
PRIMARY KEY (ID)
) ENGINE=innoDB;
CREATE TABLE Commentaire(
/*declaration des colonnes*/
ID int unsigned not null auto_increment,
parution datetime not null default current_timestamp,
auteur varchar(100) not null,
contenu tinytext not null,
billetID int unsigned not null,
/*indexation*/
PRIMARY KEY (ID),
FOREIGN KEY (billetID) REFERENCES Billet(ID)
) ENGINE=innoDB;<file_sep><?php
//declaration du titre de la page
$blog->titre='Mon blog';
//démarrage de la mémoire tampon et enregistrement de ce qu'il se passe à partir de la ligne prochaine
ob_start();
//mise en forme des billets
foreach ($blog->billets as $billet):
?>
<article>
<header>
<h3 class="titreBillet"><a href='billet-<?= $billet->ID ?>.html'><?= $billet->titre ?></a></h3>
<time><?= $billet->getParutionFormatEdition() ?></time>
</header>
<p><?= $billet->contenu ?></p>
</article>
<hr />
<?php
endforeach;
//arrêt du la mémoire tampon et transmission de ce qu'il s'est passé après mon ob_start()
$contenu=ob_get_clean();
//appel à mon gabarit
require_once 'gabarit.php';
?><file_sep>/*utilisation de la base de données*/
USE BLOG;
/*hydratation des tables*/
INSERT INTO Billet(titre, contenu) VALUES
/*id:1*/ ("Premier billet","Bonjour monde ! Ceci est le premier billet sur mon blog."),
/*id:2*/ ("Au travail", "Il faut enrichir ce blog dès maintenant.");
INSERT INTO Commentaire(auteur, contenu, billetID) VALUES
("<NAME>","Bravo pour ce début",1),
("Moi","Merci ! Je vais continuer sur ma lancée",1);<file_sep><?php
require_once 'modele/classModele.php';
class Commentaire extends Modele{
/*VARS*/
public $ID;
public $parution;
public $auteur;
public $contenu;
public $billetID;
/*CONSTRUCTEUR*/
function __construct(){
//transformation de la date en objet date
$this->parution=new DateTime($this->parution);
}
}<file_sep><?php
abstract class Modele {
//execution simple d'une requete
protected function executer($requete){
$bdd = db_connect::invoque();
$statement = $bdd->connexion->prepare($requete);
$statement->execute();
return $statement;
}
//fetch class
protected function fetchClass($requete, $classe){
//preparation, execution de la requete
$statement=$this->executer($requete);
//fetch all en class
$statement->setFetchMode(PDO::FETCH_CLASS, $classe);
$donnee=$statement->fetch();
return $donnee;
}
//fetch all class
protected function fetchAllClass($requete, $classe){
//preparation, execution de la requete
$statement=$this->executer($requete);
//fetch all en class
$donnees= array();
$statement->setFetchMode(PDO::FETCH_CLASS, $classe);
$donnees=$statement->fetchAll();
return $donnees;
}
//loader classique
protected function loader($requete, $classe){
//preparation, execution de la requete
$statement=$this->executer($requete);
        //if exactly one row comes back, build the object, otherwise throw an error
if ($statement->rowCount()==1){
$statement->setFetchMode(PDO::FETCH_CLASS, $classe);
$donnee=$statement->fetch();
} else {
            throw new Exception("Aucun '$classe' ne correspond à cette requête");
}
return $donnee;
}
}<file_sep><?php
//autoloader de fichiers
function load($repertoire){
$dossier=opendir($repertoire);
while ($fichier=readdir($dossier)){
if (is_file($repertoire.'/'.$fichier) && $fichier !='/' && $fichier !='.' && $fichier != '..'){
require_once($repertoire.'/'.$fichier);
}
}
closedir($dossier);
}
/**
* Fonction qui assainit les données depuis le POST
**/
function assainit($methode, $cle, $type){
$bob='';
if ($methode==='post') {
switch($type){
case 'int':
$bob=( isset($_POST[$cle]) ) ? intval($_POST[$cle]) : 0;
break;
case 'float':
$bob=( isset($_POST[$cle]) ) ? floatval($_POST[$cle]) : 0.00;
break;
case 'string':
$bob=( isset($_POST[$cle]) ) ? htmlspecialchars(trim(strip_tags($_POST[$cle]))) : '';
break;
case 'textarea':
$bob=( isset($_POST[$cle]) ) ? htmlspecialchars(trim(addslashes($_POST[$cle]))) : '';
//evite les injections xss
$bob=str_replace(array("<script>","</script>"), array("",""), $bob);
break;
}
} else {
switch($type){
case 'int':
$bob=( isset($_GET[$cle]) ) ? intval($_GET[$cle]) : 0;
break;
case 'float':
$bob=( isset($_GET[$cle]) ) ? floatval($_GET[$cle]) : 0.00;
break;
case 'string':
$bob=( isset($_GET[$cle]) ) ? htmlspecialchars(trim(strip_tags($_GET[$cle]))) : '';
break;
case 'textarea':
$bob=( isset($_GET[$cle]) ) ? htmlspecialchars(trim(addslashes($_GET[$cle]))) : '';
//evite les injections xss
$bob=str_replace(array("<script>","</script>"), array("",""), $bob);
break;
}
}
return $bob;
}<file_sep><?php
require_once 'modele/classModele.php';
/**
* Class modele
* @author Mylene <<EMAIL>>
*
*/
class Blog extends Modele
{
/*PROPRIETES*/
public $titre;
public $billets;
/* CONSTRUCTEUR */
function __construct()
{
//hydration de $titre
$this->titre = '';
//recuperation des billets
$this->billets = array();
$requete = "SELECT *
FROM Billet
ORDER BY parution desc";
$this->billets = $this->fetchAllClass($requete, 'Billet');
}
/* ACCUEIL */
public function affiche($vue)
{
switch ($vue) {
//au cas où $vue==accueil
case 'accueil':
$lien = 'vue/accueil.php';
break;
//au cas où $vue==billet
case 'billet':
$lien = 'vue/billet.php';
break;
//au cas où $vue==erreur
case 'erreur':
$lien = 'vue/erreur.php';
break;
//au cas où $vue est différent de mes cas
default:
$lien = 'vue/accueil.php';
break;
}
return $lien;
}
}
<file_sep><!doctype html>
<html lang="fr">
<head>
<meta charset="UTF-8" />
<link rel='stylesheet' href='cssReset.css'/>
<link rel="stylesheet" href="https://fonts.googleapis.com/icon?family=Material+Icons"/>
<link rel="stylesheet/less" type="text/css" href="gen.less" />
<script src="//cdn.jsdelivr.net/npm/less" ></script>
<title><?= $blog->titre; ?></title>
</head>
<body>
<div id="global">
<header>
<h1><a href="index.html">Mon blog</a></h1>
<p><em>Je vous souhaite la bienvenue sur ce modeste blog.</em></p>
</header>
<main>
<?= $contenu ?>
</main>
<footer>
Blog réalisé avec PHP, HTML5 et CSS.
</footer>
</div>
</body>
</html><file_sep><?php
//declaration du titre de la page
$blog->titre=$billet->titre;
//démarrage de la mémoire tampon
ob_start();
//mise en forme du billet
?>
<article>
<header>
        <h3 class="titreBillet"><a href='billet-<?= $billet->ID ?>.html'><?= $billet->titre ?></a></h3>
<time><?= $billet->getParutionFormatEdition() ?></time>
</header>
<p><?= $billet->contenu ?></p>
</article>
<hr />
<h2>Réponses</h2>
<article>
<?php foreach ($billet->commentaires as $commentaire): ?>
<p><?= $commentaire->auteur ?> dit :</p>
<p><?= $commentaire->contenu ?></p>
<?php endforeach; ?>
</article>
<?php
//arret de la mémoire tampon
$contenu=ob_get_clean();
//appel à mon gabarit
require_once 'gabarit.php';
?><file_sep><?php
/* chargement des librairies */
require_once "inc/fonctions.php";
/* chargement de tous les fichiers du répertoire inc/ */
load('inc');
/* chargement de tous les fichiers modeles*/
load('modele');
/* chargement de tous les fichiers controleurs */
load('controleur');
/* try/catch */
try {
//instanciation du blog
$blog=new Blog;
//pivot
if ( isset($_GET['action']) ){
//assainissement
$action=assainit('get', 'action', 'string');
//pivot
switch ($action){
case 'billet':
if( isset($_GET['id']) ){
$id=assainit('get','id','int');
if ($id != 0) {
$billet= Billet::load($id);
//afficher le billet avec l'id renseigné
require_once $blog->affiche('billet');
} else {
throw new Exception("Identifiant de billet non valide");
}
} else {
throw new Exception("Identifiant de billet non défini");
}
break;
default:
throw new Exception("Action non valide");
break;
}
} else {
//afficher l'accueil par defaut
require_once $blog->affiche('accueil');
}
} catch (Exception $e){
    //retrieve the error message
    $message=$e->getMessage();
//vue Erreur
require_once $blog->affiche('erreur');
}
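/* (Editor's note, hedged sketch — not part of the original dump.)
 * The views link to URLs such as "index.html" and "billet-1.html" while this
 * front controller reads $_GET['action'] and $_GET['id'], so a URL-rewrite
 * rule is assumed but not included here. A hypothetical Apache .htaccess
 * could map those URLs like this:
 *   RewriteEngine On
 *   RewriteRule ^index\.html$ index.php [L]
 *   RewriteRule ^billet-([0-9]+)\.html$ index.php?action=billet&id=$1 [L,QSA]
 */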
?>
| 7baf8db20248f4638aab17d8a951ac9a05f8c377 | ["SQL", "PHP"] | 13 | PHP | Mylenelj/testAntho | dac59743456a10b2e8a1aae6442abcdf7f3a86c7 | 7e6cdd1dffe2c5747f7c2fcce90be5d861055d2d | refs/heads/master |
<file_sep>Django>=3.1,<3.2
wagtail>=2.12,<2.13
gunicorn==20.0.4
psycopg2==2.8.3
dj_database_url
<file_sep># general imports
from django.urls import path
# api imports
# api urls
# general urls
urlpatterns = []
<file_sep># project_name
Wagtail 2.12 + Django 3.1 + Webpack + Postgres 11 + Dokku config (Production Ready)
## Documentation ##
### Directory Tree ###
```
├── main (Main application of the project, use it to add main templates, statics and root routes)
│ ├── fixtures
│ │ ├── dev.json (Useful dev fixtures, by default it creates an `admin` user with password `<PASSWORD>`)
│ │ └── initial.json (Initial fixture loaded on each startup of the project)
│ ├── migrations
│ ├── static (Add here the main statics of the app)
│ ├── templates (Add here the main templates of the app)
│ ├── admin.py
│ ├── apps.py
│ ├── models.py (Main models like City, Config)
│ ├── tests.py (We hope you will put some tests here :D)
│ ├── urls.py (Main urls, place the home page here)
│ └── views.py
├── media
├── project_name
│ ├── settings
│ │ ├── partials
│ │ │ └── util.py (Useful functions to be used in settings)
│ │ ├── common.py (Common settings for different environments)
│ │ ├── development.py (Settings for the development environment)
│ │ └── production.py (Settings for production)
│ ├── urls.py
│ └── wsgi.py
├── scripts
│ ├── command-dev.sh (Commands executed after the development containers are ready)
│ └── wait-for-it.sh (Dev script to wait for the database to be ready before starting the django app)
├── static
├── Dockerfile (Instructions to create the project image with docker)
├── Makefile (Useful commands)
├── Procfile (Dokku or Heroku file with startup command)
├── README.md (This file)
├── app.json (Dokku deployment configuration)
├── docker-compose.yml (Config to easily deploy the project in development with docker)
├── manage.py (Utility to run most django commands)
└── requirements.txt (Python dependencies to be installed)
```
### How to install the template ###
Clone the repository, and update your origin url:
```
git clone https://github.com/helixsoftco/wagtail-webpack-dokku project_name
cd project_name
```
Merge the addons required by your project (Optional):
```
git merge origin/rest
git merge origin/webpack
git merge origin/push-notifications
```
Rename your project files and directories:
```
make name=project_name init
```
> Info: Make is required, for mac run `brew install make`
> After this command you can already delete the init command inside the `Makefile`
The command before will remove the `.git` folder, so you will have to initialize it again:
```
git init
git remote add origin <repository-url>
```
### How to run the project ###
The project use docker, so just run:
```
docker-compose up
```
> If it's the first time, the images will be created. Sometimes the project doesn't start on the first run because Postgres is still initializing; just run `docker-compose up` again and it will work.
*Your app will run at `localhost:8000`*
To recreate the docker images after dependency changes, run:
```
docker-compose up --build
```
To remove the docker containers including the database (Useful sometimes when dealing with migrations):
```
docker-compose down
```
### Accessing Administration
The django admin site of the project can be accessed at `localhost:8000/admin`
By default, the development configuration creates a superuser with the following credentials:
```
Username: admin
Password: <PASSWORD>
```
## Production Deployment: ##
The project is dokku ready, these are the steps to deploy it in your dokku server:
#### Server Side: ####
> These docs do not cover dokku setup, you should already have configured the initial dokku config including ssh keys
Create the app and configure postgres:
```
dokku apps:create project_name
dokku postgres:create project_name
dokku postgres:link project_name project_name
```
> If you don't have dokku postgres installed, run this before:
> `sudo dokku plugin:install https://github.com/dokku/dokku-postgres.git`
Create the required environment variables:
```
dokku config:set project_name ENVIRONMENT=production DJANGO_SECRET_KEY=....
```
Current required environment variables are:
* ENVIRONMENT
* DJANGO_SECRET_KEY
* EMAIL_PASSWORD
Use the same command to configure secret credentials for the app
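For example (placeholder values, replace them with your real secrets):
```
dokku config:set project_name ENVIRONMENT=production DJANGO_SECRET_KEY=<your-secret-key> EMAIL_PASSWORD=<your-email-password>
```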
#### Local Side: ####
Configure the dokku remote:
```
git remote add production <EMAIL>>:project_name
```
Push your changes and just wait for the magic to happen :D:
```
git push production master
```
> *IMPORTANT:* After deploying configure the correct site domain on: https://myhost.com/admin/sites/
> Otherwise some URLs would be blocked by the ALLOWED_HOSTS due to Wagtail requesting localhost
Optional: To add SSL to the app check:
https://github.com/dokku/dokku-letsencrypt
Optional: Additional nginx configuration (like client_max_body_size) should be placed server side in:
```
/home/dokku/<app>/nginx.conf.d/<app>.conf
```
> Further dokku configuration can be found here: http://dokku.viewdocs.io/dokku/
### Serving static and media files from the dokku server
In case you want to serve the `static` and `media` files directly from the server, instead of AWS or a different storage,
the following steps are required:
In the server configure the dokku persistent storage:
```
dokku storage:mount project_name /var/lib/dokku/data/storage/project_name/media:/src/media
dokku storage:mount project_name /var/lib/dokku/data/storage/project_name/static:/src/static
dokku ps:restart project_name
```
> See: https://github.com/dokku/dokku/blob/master/docs/advanced-usage/persistent-storage.md
Then add the following config:
```
location /media/ {
alias /var/lib/dokku/data/storage/project_name/media/;
}
location /static/ {
alias /var/lib/dokku/data/storage/project_name/static/;
}
```
To the following file (You may need to create it):
```
/home/dokku/project_name/nginx.conf.d/project_name.conf
```
Finally, restart Nginx:
```
service nginx restart
```
### Configuring CORS
In production, you may want to configure Nginx to allow requests from a different domain, in that case add:
```
add_header "Access-Control-Allow-Origin" * always;
add_header "Access-Control-Allow-Methods" "GET, POST, PUT, OPTIONS, HEAD, PATCH, DELETE" always;
add_header "Access-Control-Allow-Headers" "Authorization, Origin, X-Requested-With, Content-Type, Accept" always;
if ($request_method = OPTIONS) {
return 204;
}
```
To the following file:
```
/home/dokku/project_name/nginx.conf.d/project_name.conf
```
Then restart Nginx:
```
service nginx restart
```
<file_sep># Use an official Python runtime based on Debian 10 "buster" as a parent image.
FROM python:3.8.1-slim-buster
ENV PYTHONUNBUFFERED=1
# Install system packages required by Wagtail and Django.
RUN apt-get update --yes --quiet && apt-get install --yes --quiet --no-install-recommends \
build-essential \
libpq-dev \
libmariadbclient-dev \
libjpeg62-turbo-dev \
zlib1g-dev \
libwebp-dev \
gettext \
&& rm -rf /var/lib/apt/lists/*
# Setup workdir
RUN mkdir /src
WORKDIR /src
# Python dependencies
COPY requirements.txt /src/
RUN pip install -r /src/requirements.txt
COPY . /src
| 9ab56d4e5f00fd5bb04351c5af0dfebfcbddbf5b | ["Markdown", "Python", "Text", "Dockerfile"] | 4 | Text | cwsuba/wagtail-webpack-dokku | a7a2d8cad22d5189dab072028b29dfa2860e0b9b | d89239f245ea66de783081e4dbdfd91ff59d5db4 | refs/heads/main |
<file_sep>import React, { Component } from 'react';
import { ToastContainer } from 'react-toastify';
import 'react-toastify/dist/ReactToastify.css';
import Searchbar from './Searchbar/Searchbar';
import ImageGallery from './ImageGallery/ImageGallery';
export default class App extends Component {
state = {
searchQuery: '',
page: 1,
};
handleFormSubmit = searchQuery => {
this.setState({ searchQuery });
};
render() {
const { searchQuery, page } = this.state;
return (
<>
<Searchbar onSubmit={this.handleFormSubmit} />
<ImageGallery searchQuery={searchQuery} page={page} />
<ToastContainer
position="top-center"
autoClose={2000}
hideProgressBar={false}
          closeOnClick={true}
/>
</>
);
}
}
| eeb678e7c4f907ba5f3b2b681070eca6c303294f | ["JavaScript"] | 1 | JavaScript | Viacheslav-Koksharov/react-hw-03-image-finder | a415454bdd22ed7212983dc735d1bc31414d0e07 | 9ffe8a192838fb0de2e9ce3262ee6c0b24e439fa | refs/heads/master |
<repo_name>chown-coffee/projeto-integrador<file_sep>/resources/data/dashboards/dashboard1.min.js
$(function () {
//SMS
c3.generate({
bindto: "#sms",
data: {
columns: [["SMS Consumido", 30]],
type: "gauge",
tooltip: {show: !0}
},
donut: {
label: {show: !1},
title: "Sales",
width: 18
},
legend: {hide: !0},
color: {pattern: ["#5f76e8"]}
});
d3.select("#sms .c3-chart-arcs-title").style("font-family", "Rubik");
//Chamadas
c3.generate({
bindto: "#chamadas",
data: {
columns: [["Chamadas", 100]],
type: "gauge",
tooltip: {show: !0}
},
donut: {
label: {show: !1},
title: "Sales",
width: 18
},
legend: {hide: !0},
color: {pattern: ["#5f76e8"]}
});
d3.select("#chamadas .c3-chart-arcs-title").style("font-family", "Rubik");
//Chamadas Excedentes
c3.generate({
bindto: "#chamadasex",
data: {
columns: [["Chamadas Excedentes", 10]],
type: "gauge",
tooltip: {show: !0}
},
donut: {
label: {show: !1},
title: "Sales",
width: 18
},
legend: {hide: !0},
color: {pattern: ["#5f76e8"]}
});
d3.select("#chamadasex .c3-chart-arcs-title").style("font-family", "Rubik");
//Barras SMS
var datasms = {
labels: ["Jan", "Fev", "Mar", "Abr", "Mai", "Jun", "Jul", "Ago", "Set", "Out", "Nov", "Dez"],
series: [[5, 4, 3, 7, 5, 10, 5, 4, 3, 7, 5, 10]]
};
var optionssms = {
seriesBarDistance: 30
};
var responsiveOptionssms = [
['screen and (max-width: 640px)', {
seriesBarDistance: 5,
axisX: {
labelInterpolationFnc: function (value) {
return value[0];
}
}
}]
];
new Chartist.Bar('#graphsms', datasms, optionssms, responsiveOptionssms);
//Barras Chamadas
var datachamada = {
labels: ["Jan", "Fev", "Mar", "Abr", "Mai", "Jun", "Jul", "Ago", "Set", "Out", "Nov", "Dez"],
series: [[5, 4, 3, 7, 5, 10, 5, 4, 3, 7, 5, 10]]
};
var optionschamada = {
seriesBarDistance: 30
};
var responsiveOptionschamada = [
['screen and (max-width: 640px)', {
seriesBarDistance: 5,
axisX: {
labelInterpolationFnc: function (value) {
return value[0];
}
}
}]
];
new Chartist.Bar('#graphchamada', datachamada, optionschamada, responsiveOptionschamada);
$(window).on("resize", function () {
t.update()
})
});<file_sep>/README.md
# PROJETO INTEGRADOR 2020-2
## Members:
* <NAME> - 831348
* <NAME> - 826635
* <NAME> - 831912
* <NAME> - 832088
* <NAME> - 832564
<file_sep>/php/index.php
<?php
try {
$hostname = "localhost";
$dbname = "EXPRESSO_API";
$username = "expressoapi";
$pw = "<PASSWORD>";
$pdo = new PDO ("mssql:host=$hostname;dbname=$dbname","$username","$pw");
} catch (PDOException $e) {
echo "Erro de Conexão " . $e->getMessage() . "\n";
exit;
}
$query = $pdo->prepare("select Name FROM API");
$query->execute();
for($i=0; $row = $query->fetch(); $i++){
echo $i." – ".$row['Coluna']."<br/>";
}
unset($pdo);
unset($query);
?><file_sep>/php/connect.php
<?php
$hostname = "localhost";
$user = "root";
$password = "";
$database = "";
$conn = mysqli_connect($hostname, $user, $password, $database);
if (!$conn) {
    die("Connection failed: " . mysqli_connect_error());
}
echo "Connected successfully";

// $idClient and $inputPassword are expected to be supplied by the login form.
$stmt = $conn->prepare("SELECT password FROM Client WHERE IdClient = ?");
$stmt->bind_param("s", $idClient);
$stmt->execute();
$stmt->bind_result($storedPassword);
$stmt->fetch();
$stmt->close();
mysqli_close($conn);

if ($storedPassword == $inputPassword) {
    echo "Usuário válido";
} else {
    echo "Usuário inválido";
}
?>
|
39f33adbc55200b159631833a1e6b7e2aedbd748
|
[
"JavaScript",
"PHP",
"Markdown"
] | 4 |
JavaScript
|
chown-coffee/projeto-integrador
|
3de252585ed3eb1bc6425a32ed504acf6cfdad8c
|
6d706caa6c012e8f4e6d4cbd0e6085a95c574b02
|
refs/heads/master
|
<file_sep>package com.example.anupam;
import com.example.ayush.A;
public class B extends A{
Integer obj;
public void doThings(){
Integer localVar = null;
if(obj == null){
if(localVar != null){
}
}
}
public B(){
super("");
}
}
<file_sep>package com.example.home.suntistassigment;
import android.content.Intent;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.view.View;
import android.widget.ImageView;
import com.facebook.CallbackManager;
import com.facebook.FacebookCallback;
import com.facebook.FacebookException;
import com.facebook.GraphRequest;
import com.facebook.GraphResponse;
import com.facebook.login.LoginResult;
import com.facebook.login.widget.LoginButton;
import org.json.JSONObject;
import java.util.Arrays;
public class LoginActivity extends AppCompatActivity {
private LoginButton loginButton;
private CallbackManager callbackManager;
private static final String EMAIL = "email";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_login);
callbackManager = CallbackManager.Factory.create();
loginButton = (LoginButton) findViewById(R.id.login_button);
loginButton.setReadPermissions(Arrays.asList(EMAIL));
// If you are using in a fragment, call loginButton.setFragment(this);
// Callback registration
loginButton.registerCallback(callbackManager, new FacebookCallback<LoginResult>() {
@Override
public void onSuccess(LoginResult loginResult) {
getUserDetails(loginResult);
}
@Override
public void onCancel() {
// App code
}
@Override
public void onError(FacebookException exception) {
// App code
}
});
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
callbackManager.onActivityResult(requestCode, resultCode, data);
super.onActivityResult(requestCode, resultCode, data);
}
protected void getUserDetails(LoginResult loginResult) {
GraphRequest data_request = GraphRequest.newMeRequest(
loginResult.getAccessToken(), new GraphRequest.GraphJSONObjectCallback() {
@Override
public void onCompleted(
JSONObject json_object,
GraphResponse response) {
Intent intent = new Intent(LoginActivity.this, DetailsActivity.class);
intent.putExtra("userProfile", json_object.toString());
startActivity(intent);
}
});
Bundle permission_param = new Bundle();
permission_param.putString("fields", "id,name,email,picture.width(120).height(120)");
data_request.setParameters(permission_param);
data_request.executeAsync();
}
}
|
47e0a208e56c3a4f86b87a5814ae35ef38b46c5f
|
[
"Java"
] | 2 |
Java
|
sarafayush/sample
|
bc242f63231a2bdefcb747e754504313fd8ba5e0
|
91a04eee65afa08e8e26d1bfc47d647d3035f228
|
refs/heads/master
|
<repo_name>QueenBeauty2312/tag-youre-it<file_sep>/tag-youre-it.py
# Based on Amazon tutorial
# import os
# import sys
import time
import logging
import json
import random
import threading
from enum import Enum
from agt import AlexaGadget
from ev3dev2.led import Leds
from ev3dev2.sound import Sound
from ev3dev2.motor import OUTPUT_A, OUTPUT_B, OUTPUT_C, OUTPUT_D, MoveTank, SpeedPercent, MediumMotor
from ev3dev2.sensor.lego import UltrasonicSensor, TouchSensor, InfraredSensor
from time import sleep
# Set the logging level to INFO to see messages from AlexaGadget
logging.basicConfig(level=logging.INFO)
### Testing the sensors
# Connect infrared and touch sensors to any sensor ports
# ir = InfraredSensor()
# ts = TouchSensor()
# us = UltrasonicSensor()
# leds = Leds()
# leds.all_off() # stop the LEDs flashing (as well as turn them off)
# # is_pressed and proximity are not functions and do not need parentheses
# while not ts.is_pressed: # Stop program by pressing the touch sensor button
# print("Ultrasonic " + str(us.distance_centimeters))
# print("Infrared " + str(ir.proximity))
# if us.distance_centimeters < 40*1.4: # to detect objects closer than about 40cm
# leds.set_color('LEFT', 'RED')
# leds.set_color('RIGHT', 'RED')
# else:
# leds.set_color('LEFT', 'GREEN')
# leds.set_color('RIGHT', 'GREEN')
# self.drive.on_for_seconds(SpeedPercent(50), SpeedPercent(50), 2, block=is_blocking)
# sleep (0.01) # Give the CPU a rest
# ###
class Direction(Enum):
"""
The list of directional commands and their variations.
These variations correspond to the skill slot values.
"""
FORWARD = ['forward', 'forwards', 'go forward']
BACKWARD = ['back', 'backward', 'go backward']
LEFT = ['left', 'go left']
RIGHT = ['right', 'go right']
STOP = ['stop', 'brake', 'halt']
class Command(Enum):
"""
The list of preset commands and their invocation variation.
These variations correspond to the skill slot values.
"""
MOVE_CIRCLE = ['circle', 'move around']
MOVE_SQUARE = ['square']
    TAG_YOURE_IT = ['tag you\'re it', 'tag you are it', 'your it', 'you are it', 'you\'re it']
TAG_IM_IT = ['tag i\'m it', 'tag i am it', 'i am it', 'i\'m it', 'here i come', 'tag I\'m it', 'I\'m it', 'I it', 'I am going to get you']
class EventName(Enum):
"""
The list of custom event name sent from this gadget
"""
PROXIMITY = "Proximity"
SPEECH = "Speech"
class MindstormsGadget(AlexaGadget):
"""
A Mindstorms gadget that can perform bi-directional interaction with an Alexa skill.
"""
def __init__(self):
"""
Performs Alexa Gadget initialization routines and ev3dev resource allocation.
"""
super().__init__()
# Robot state
self.tag_youre_it_mode = False
self.tag_im_it_mode = False
# Connect two large motors on output ports A and D
self.drive = MoveTank(OUTPUT_A, OUTPUT_D)
self.sound = Sound()
self.leds = Leds()
self.ir = InfraredSensor()
self.us = UltrasonicSensor()
self.ts = TouchSensor()
# Start threads
threading.Thread(target=self._youre_it_thread, daemon=True).start()
threading.Thread(target=self._im_it_thread, daemon=True).start()
def on_connected(self, device_addr):
"""
Gadget connected to the paired Echo device.
:param device_addr: the address of the device we connected to
"""
self.leds.set_color("LEFT", "GREEN")
self.leds.set_color("RIGHT", "GREEN")
print("{} connected to Echo device".format(self.friendly_name))
def on_disconnected(self, device_addr):
"""
Gadget disconnected from the paired Echo device.
:param device_addr: the address of the device we disconnected from
"""
self.leds.set_color("LEFT", "BLACK")
self.leds.set_color("RIGHT", "BLACK")
print("{} disconnected from Echo device".format(self.friendly_name))
def on_custom_mindstorms_gadget_control(self, directive):
"""
Handles the Custom.Mindstorms.Gadget control directive.
:param directive: the custom directive with the matching namespace and name
"""
try:
payload = json.loads(directive.payload.decode("utf-8"))
print("Control payload: {}".format(payload))
control_type = payload["type"]
if control_type == "move":
# Expected params: [direction, duration, speed]
self._move(payload["direction"], int(payload["duration"]), int(payload["speed"]))
if control_type == "command":
# Expected params: [command]
self._activate(payload["command"])
except KeyError:
print("Missing expected parameters: {}".format(directive))
def _move(self, direction, duration: int, speed: int, is_blocking=False):
"""
Handles move commands from the directive.
Right and left movement can under or over turn depending on the surface type.
:param direction: the move direction
:param duration: the duration in seconds
:param speed: the speed percentage as an integer
:param is_blocking: if set, motor run until duration expired before accepting another command
"""
print("Move command: ({}, {}, {}, {})".format(direction, speed, duration, is_blocking))
if direction in Direction.FORWARD.value:
self.drive.on_for_seconds(SpeedPercent(speed), SpeedPercent(speed), duration, block=is_blocking)
if direction in Direction.BACKWARD.value:
self.drive.on_for_seconds(SpeedPercent(-speed), SpeedPercent(-speed), duration, block=is_blocking)
if direction in (Direction.RIGHT.value + Direction.LEFT.value):
self._turn(direction, speed)
self.drive.on_for_seconds(SpeedPercent(speed), SpeedPercent(speed), duration, block=is_blocking)
if direction in Direction.STOP.value:
self.drive.off()
            self.tag_youre_it_mode = False
            self.tag_im_it_mode = False
def _activate(self, command, speed=50):
"""
Handles preset commands.
:param command: the preset command
:param speed: the speed if applicable
"""
print("Activate command: ({}, {})".format(command, speed))
if command in Command.MOVE_CIRCLE.value:
self.drive.on_for_seconds(SpeedPercent(int(speed)), SpeedPercent(5), 12)
if command in Command.MOVE_SQUARE.value:
for i in range(4):
self._move("right", 2, speed, is_blocking=True)
if command in Command.TAG_YOURE_IT.value:
# Set tag you're it mode to resume tag_youre_it thread processing
self.tag_youre_it_mode = True
            print("Command is active for Tag You're it.")
self._send_event(EventName.SPEECH, {'speechOut': "Oh no. I am it! Here I come!"})
time.sleep(0.3)
if command in Command.TAG_IM_IT.value:
            self.tag_im_it_mode = True
self._send_event(EventName.SPEECH, {'speechOut': "Yikes! You are it! I am out of here!"})
# Perform Run away
self.drive.on_for_seconds(SpeedPercent(80), SpeedPercent(-80), 2)
# time.sleep(0.3)
self.drive.on_for_seconds(SpeedPercent(-50), SpeedPercent(-50), 4)
self.leds.set_color("LEFT", "YELLOW", 1)
self.leds.set_color("RIGHT", "YELLOW", 1)
def _turn(self, direction, speed):
"""
Turns based on the specified direction and speed.
Calibrated for hard smooth surface.
:param direction: the turn direction
:param speed: the turn speed
"""
if direction in Direction.LEFT.value:
self.drive.on_for_seconds(SpeedPercent(0), SpeedPercent(speed), 2)
if direction in Direction.RIGHT.value:
self.drive.on_for_seconds(SpeedPercent(speed), SpeedPercent(0), 2)
def _send_event(self, name: EventName, payload):
"""
Sends a custom event to trigger an action.
:param name: the name of the custom event
:param payload: the JSON payload
"""
self.send_custom_event('Custom.Mindstorms.Gadget', name.value, payload)
def _im_it_thread(self):
"""
The human is it. Robot runs away.
"""
count = 0
while True:
while self.tag_im_it_mode:
print("Tag You're it mode activated randomly picking a path")
direction = random.choice(list(Direction))
duration = random.randint(1, 5)
speed = random.randint(1, 4) * 25
while direction == Direction.STOP:
direction = random.choice(list(Direction))
# direction: all except stop, duration: 1-5s, speed: 25, 50, 75, 100
self._move(direction.value[0], duration, speed)
time.sleep(duration)
time.sleep(1)
def _youre_it_thread(self):
"""
Chase after the player.
"""
        count = 0
        while True:
while self.tag_youre_it_mode:
# Chase after person.
self.drive.on_for_seconds(SpeedPercent(-50), SpeedPercent(-50), 2, False)
distance = self.us.distance_centimeters
print("Proximity: {}".format(distance))
count = count + 1 if distance < 10 else 0
if distance < 25:
print("Close enough to tag you breached. Sending proximity event to skill")
self.leds.set_color("LEFT", "RED", 1)
self.leds.set_color("RIGHT", "RED", 1)
self._send_event(EventName.SPEECH, {'speechOut': "Tag you are it!"})
self._send_event(EventName.PROXIMITY, {'distance': distance})
self.tag_youre_it_mode = False
                    self.tag_im_it_mode = True
time.sleep(0.2)
time.sleep(1)
if __name__ == '__main__':
# Startup sequence
gadget = MindstormsGadget()
gadget.sound.play_song((('C4', 'e'), ('D4', 'e'), ('E5', 'q')))
gadget.leds.set_color("LEFT", "GREEN")
gadget.leds.set_color("RIGHT", "GREEN")
# Gadget main entry point
gadget.main()
# Shutdown sequence
gadget.sound.play_song((('E5', 'e'), ('C4', 'e')))
gadget.leds.set_color("LEFT", "BLACK")
gadget.leds.set_color("RIGHT", "BLACK")
|
6292c378758de7ec1fc5febbff5983068e3a0c53
|
[
"Python"
] | 1 |
Python
|
QueenBeauty2312/tag-youre-it
|
07fa84747fbcda4369aff9fe3c2e375c91b02d19
|
c90fb18f5ca24de2f02a268d2449b3b01d164121
|
refs/heads/master
|
<repo_name>kimjunghak/Antlr2<file_sep>/src/test.c
int max=500; void main(int a){
int i; int j; int k;
int rem; int sum;
int rea[5];
int c=4;
i=2;
while( i <= max){
sum = 0;
k = i/2;
j=i;
while(j<=k){
rem =i%j;
if(rem == 0){
sum = sum + j;
++j;
}
if(i == sum) write(i);
++i;
value = rea[1] / rea[2];
}
}
return 10;
}
void test(int a, int b, int c){
if(a>1){
}
else{
}
}
void test2(){
}
|
b31913e8ec6ab6942f3044232b142faa4d5aeecd
|
[
"C"
] | 1 |
C
|
kimjunghak/Antlr2
|
c2097f6fc2264bcb57f89d9cba2d28a35ab01ab1
|
f253282517c88976539027b41a1838239079a08c
|
refs/heads/master
|
<file_sep>import com.brentvatne.react.ReactVideoPackage;
@Override
protected List<ReactPackage> getPackages() {
return Arrays.asList(
new MainReactPackage(),
new ReactVideoPackage()
);
}<file_sep>include ':AVRecorderPlayer'
<file_sep>JAVA_HOME=/usr/bin/java
export JAVA_HOME;
|
637c4c6a133858444f05240d8c0e7b8aa774b8d4
|
[
"Java",
"Shell",
"Gradle"
] | 3 |
Java
|
CatPerry/AVRecorderPlayer
|
83f4b050a71ffde10cf86cab703e833fce682724
|
422a7e0e3403267c2c98dc2a428dd5b41406f157
|
refs/heads/main
|
<file_sep>Rails.application.routes.draw do
root to: 'sessions#welcome'
resources :users, only: [:new, :create, :homepage, :profile]
get '/login' => 'sessions#new'
post '/login' => 'sessions#create'
delete '/logout' => 'sessions#destroy'
get '/signup' => 'users#new'
post '/signup' => 'users#create'
get '/biaslist', to: 'biaslists#mybias', as: "my_bias"
match '/auth/:google_oauth2/callback' => 'sessions#google', via: [:get,:post]
resources :groups, only: [:index, :show]
resources :kpopidols, only: [:index, :show] do
resources :biaslists
end
# For details on the DSL available within this file, see https://guides.rubyonrails.org/routing.html
end
<file_sep>
### Kpop Addicts Anonymous
This project is an app that allows you to keep track of your favorite Kpop idol, giving you access to information about that idol, important links, and fun YouTube videos, while also recommending new artists to you based on the Top 5 Kpop groups.
## To Get Started
To use this application, please fork and clone this repository and open it in your machine's text editor.
In your terminal shell type the following:
```
bundle install
```
This will install the necessary gems to run the program.
For database migration, please type the following into the terminal shell:
```
rails db:migrate
```
To connect to a local host, please type the following command:
```
rails s
```
When it has confirmed you are connected to a local host, please visit your web browser and click [here](http://localhost:3000/) to interact with the program.
You may now create an account, log in with your own credentials or with the 'login with google' option, and start exploring the world of KPop.
## Resources
A LOT of the information about the Kpop Groups and Idols was collected using [KpopIdols](https://kprofiles.com/) as they have all the information you need about every group and idol.
### Enjoy!<file_sep>class CreateKpopidols < ActiveRecord::Migration[6.1]
def change
create_table :kpopidols do |t|
t.string :stage_name
t.string :birth_name
t.string :birthday
t.string :birth_place
t.string :height
t.string :position
t.string :nickname
t.string :famous_quote
t.string :member_img
t.string :fan_cam
t.integer :group_id
end
end
end
<file_sep>class Group < ApplicationRecord
has_many :kpopidols
has_many :users, through: :checkout_lists
has_many :referalls, class_name: "Group",
foreign_key: "recommendation_id"
belongs_to :recommendation, class_name: "Group", optional: true
def self.main_group
self.where({ name: ["Stray Kids", "Monsta X", "BigBang", "NCT 127", "Got7"]})
end
def self.recommended_groups
    # groups that were added as a recommendation of another group
    self.where.not(recommendation_id: nil)
end
end
<file_sep>class CreateGroups < ActiveRecord::Migration[6.1]
def change
create_table :groups do |t|
t.string :name
t.string :group_name
t.string :debut_date
t.string :fandom_name
t.string :instagram
t.string :twitter
t.string :youtube
t.string :group_img
t.string :music_video
t.string :live_video
t.string :dance_video
t.references :recommendation, foreign_key: { to_table: :groups }
end
end
end
<file_sep>require "test_helper"
class BiaslistsControllerTest < ActionDispatch::IntegrationTest
test "should get index" do
get biaslists_index_url
assert_response :success
end
test "should get show" do
get biaslists_show_url
assert_response :success
end
test "should get edit" do
get biaslists_edit_url
assert_response :success
end
end
<file_sep>class GroupsController < ApplicationController
def index
@groups = Group.all
end
def show
@group = Group.find(params[:id])
end
def main_group
@groups = Group.main_group
render :index
end
def recommended_groups
@groups = Group.recommended_groups
end
end
<file_sep>class SessionsController < ApplicationController
def welcome
end
def new
@user = User.new
end
def create
@user = User.find_by(username:params[:username])
if !@user
@error = "Are you sure that's your username?"
render :new
elsif [email protected](params[:password])
@error = "Password no matchy records"
render :new
else
session[:user_id] = @user.id
redirect_to groups_path
end
end
def google
user = User.find_or_create_by(username: auth["info"]["name"], uid: auth["uid"]) do |user|
user.password = <PASSWORD>(10)
end
if user
session[:user_id] = user.id
redirect_to groups_path
else
redirect_to login_path
end
end
def destroy
session.delete(:user_id)
redirect_to login_path
end
private
def auth
request.env['omniauth.auth']
end
end
<file_sep>class BiaslistsController < ApplicationController
before_action :set_biaslist, only: [:show, :edit, :update, :destroy]
before_action :set_kpopidol, only: [:new, :show, :edit ]
def new
@biaslist = @kpopidol.biaslists.build
render :new
end
def mybias
if logged_in?
@biaslist = current_user.biaslists
render :my_bias
else
redirect_to login_path
end
end
def show
if logged_in?
render :show
else
redirect_to login_path
end
end
def create
@biaslist = Biaslist.new(bias_params)
@biaslist.save
redirect_to my_bias_path
end
def edit
if logged_in?
if @biaslist.user_id == current_user.id
render :edit
else
redirect_to login_path
end
end
end
def update
@biaslist.update(notes: params[:biaslist][:notes])
redirect_to my_bias_path
end
def destroy
if @biaslist.user_id == current_user.id
@biaslist.destroy
redirect_to my_bias_path
end
end
private
def bias_params
params.require(:biaslist).permit(:notes,:user_id,:kpopidol_id)
end
def set_biaslist
@biaslist = Biaslist.find(params[:id])
end
def set_kpopidol
@kpopidol = Kpopidol.find(params[:kpopidol_id])
end
end
<file_sep>class User < ApplicationRecord
has_many :biaslists
has_many :kpopidols, through: :biaslists
has_many :groups, through: :checkout_lists
has_secure_password
validates :username, uniqueness: true
end
<file_sep>class Kpopidol < ApplicationRecord
belongs_to :group
has_many :biaslists
has_many :users, through: :biaslists
end
<file_sep>class KpopidolsController < ApplicationController
def index
@kpopidols = Kpopidol.all
render :index
end
def show
@kpopidol = Kpopidol.find(params[:id])
end
end
<file_sep>class CreateBiaslists < ActiveRecord::Migration[6.1]
def change
create_table :biaslists do |t|
t.integer :kpopidol_id
t.integer :user_id
t.string :notes
end
end
end
<file_sep>class Biaslist < ApplicationRecord
belongs_to :user
belongs_to :kpopidol
end
|
51c4c08e36bab8297d520140266dff0bc32c4384
|
[
"Markdown",
"Ruby"
] | 14 |
Ruby
|
myhaikudream/kpoponrails
|
2ab3446542b0468598d03edfcf2c22cfb38eed03
|
d52e7390671b0e509c7f1065ab651207f516a63d
|
refs/heads/master
|
<file_sep># -*- coding: utf-8 -*-
"""
Created on Wed Mar 29 11:18:32 2017
@author: <NAME>
"""
import numpy as np
def saliency_map(img):
import cv2
imgM2 = img.copy()
height = imgM2.shape[0]
width = imgM2.shape[1]
md = np.min([width,height])
labImgM2 = cv2.cvtColor(imgM2, cv2.COLOR_BGR2Lab)
l = labImgM2[:,:,0]/255.0
    a = labImgM2[:,:,1]/255.0
    b = labImgM2[:,:,2]/255.0
sm = np.zeros((height,width), dtype = np.uint8)
off1 = np.int32(md/2)
off2 = np.int32(md/4)
off3 = np.int32(md/8)
    for j in range(0, height):
y11 = np.max([0, j-off1])
y12 = np.min([j+off1, height])
y21 = np.max([0, j-off2])
y22 = np.min([j+off2, height])
y31 = np.max([0, j-off3])
y32 = np.min([j+off3, height])
        for k in range(0,width):
x11 = np.max([0, k-off1])
x12 = np.min([k+off1,width])
x21 = np.max([0, k-off2])
x22 = np.min([k+off2,width])
x31 = np.max([0, k-off3])
x32 = np.min([k+off3,width])
lm1 = np.mean(l[y11:y12, x11:x12])
am1 = np.mean(a[y11:y12, x11:x12])
bm1 = np.mean(b[y11:y12, x11:x12])
lm2 = np.mean(l[y21:y22, x21:x22])
am2 = np.mean(a[y21:y22, x21:x22])
bm2 = np.mean(b[y21:y22, x21:x22])
lm3 = np.mean(l[y31:y32, x31:x32])
am3 = np.mean(a[y31:y32, x31:x32])
bm3 = np.mean(b[y31:y32, x31:x32])
            # use c1/c2/c3 (not cv2) so the imported OpenCV module is not shadowed, and clip to the uint8 range
            c1 = (l[j,k] - lm1)**2 + (a[j,k] - am1)**2 + (b[j,k] - bm1)**2
            c2 = (l[j,k] - lm2)**2 + (a[j,k] - am2)**2 + (b[j,k] - bm2)**2
            c3 = (l[j,k] - lm3)**2 + (a[j,k] - am3)**2 + (b[j,k] - bm3)**2
            sm[j,k] = min(255, (c1+c2+c3) * 255)
return sm<file_sep># python_saliencyMap
The goal of a saliency map is to simplify or change the representation of an image into something that is more meaningful and easier to analyze.
It is part of the image segmentation world. More precisely, image segmentation is the process of assigning a label to every pixel in an image such that pixels with the same label share certain characteristics.
By another definition, saliency at a given location is determined primarily by how different that location is from its surround in color, orientation, motion, depth, etc.
Here, a saliency map function is provided which can be used on any color image as part of feature segmentation.
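A minimal usage sketch (assuming OpenCV and NumPy are installed and the function above is saved in a module named `saliencyMap.py`; the file names below are placeholders):
```python
# Hypothetical usage of the saliency_map() function defined above.
# "photo.jpg" and "saliency.png" are placeholder file names.
import cv2
from saliencyMap import saliency_map

img = cv2.imread("photo.jpg")      # load a BGR color image
sm = saliency_map(img)             # per-pixel saliency map (uint8)
cv2.imwrite("saliency.png", sm)    # save the map for inspection
```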
|
dd6d50bd14fbc6ef3c820ccfcfc01296456086ed
|
[
"Markdown",
"Python"
] | 2 |
Python
|
mohammadimtiazz/python_saliencyMap
|
9386249912eadfbcb2fa1dab505803eb202f8d68
|
76ba3f6c9c358545b7466a2e7e744489d1abfc49
|
refs/heads/master
|
<file_sep><?php
echo "hello nadim<br>";
$x=5;
$y=10;
echo $x+$y ."<br>";
echo $y-$x ."<br>";
echo $y/$x ."<br>";
echo $x*$y ."<br>";
while ($x<=10)
{
if($x==7)
{
echo "7 is lucky <br>";
}
echo "x = ".$x."<br>";
$x++;
}
echo"<br>";
for($i=1;$i<=10;$i++)
{
echo $i . ".Nadim<br>";
}
echo"<br>";
$arr=array("one","two","three","four","five","six","seven","eight","nine","ten");
for($i=0;$i<10;$i++)
{
echo $arr[$i].".Nadim<br>";
}
echo "\$x";
?><file_sep># My-first-PHP-code
|
57da188d5f8f78a7de16a5a8b449ce3850b05820
|
[
"Markdown",
"PHP"
] | 2 |
PHP
|
EANadim/My-first-PHP-code
|
e771cdc7f4fc40df6c7967cad6a4bf4ce5e3cbdc
|
93cc47784c2ba58ff5abd6077240b81bf1a52138
|
refs/heads/master
|
<file_sep>CREATE TABLE IF NOT EXISTS employees (
id INT AUTO_INCREMENT PRIMARY KEY,
    first_name VARCHAR(50) NOT NULL,
    last_name VARCHAR(50) NOT NULL,
salary DECIMAL(8,2) NOT NULL
);<file_sep># Avenue Code Integration Challenge
Hey! First off, thanks for the opportunity.
So, I had some problems (I was without an internet connection) and I did the challenge using Mule 4, sorry!
# Source Code
git clone https://[email protected]/brunosouzas/avenuecode-integration-challenge.git
# Challenge
* I used MuleSoft 4 but followed the requirements, except for the points specific to MuleSoft version 4.
* I could not use the Java class from the example you sent because the AbstractTransformer class is not accessible, so I created a Java class with the bonus calculation.
* I separated the configuration files in order to make the source code easier to read and understand.
* I used APIkit for WSDL and RAML
<file_sep>
INSERT INTO employees (first_name, last_name, salary) VALUES ('John', 'Doe', 120000);<file_sep>package com.avenuecode.challenge.integration;
import java.util.Random;
public class EmployeeBonusTransformer {
public static double EmployeeBonus(double salary) {
int bonus = numeroAleatorio(1, 50);
salary = (salary*bonus)/100;
return salary;
}
private static int numeroAleatorio(int min, int max){
Random rand = new Random();
int randomNum = rand.nextInt((max - min) + 1) + min;
return randomNum;
}
}
|
74a598bcdd84c7c8eec73e2e5a4b0366ebb08826
|
[
"Markdown",
"SQL",
"Java"
] | 4 |
SQL
|
brunosouzas/MUL-INT-API-integration-challenge-v1
|
cab4dad2222fb1c6dc4278286db757a5c678e762
|
0c1356bd4804024bea47cb0731c1e2149c389644
|
refs/heads/main
|
<repo_name>shubhamrajpurohit963/tictactoe_2020<file_sep>/README.md
# tictactoe_2020
This app was built using React Native, and I've used STATE & PROPS just to enhance the learning experience.
## Technology Used
React Native
<file_sep>/components/Icons.js
import React,{useState} from 'react'
import Icon from 'react-native-vector-icons/FontAwesome'
import Snackbar from 'react-native-snackbar'
const itemArray = new Array(9).fill('empty')
const Icons = ({name}) => {
switch (name) {
case 'circle':
return <Icon name="circle-thin" size={35} color="#F4c824" />
case 'cross':
return <Icon name="times" size={35} color="#10a881" />
default:
return <Icon name="pencil" size={35} color="#303030" />
}
}
export default Icons
|
eb06e3ec3a9d1ecd8210b721ea200aae467b627f
|
[
"Markdown",
"JavaScript"
] | 2 |
Markdown
|
shubhamrajpurohit963/tictactoe_2020
|
97a98886e37e0dd9e95e35cd9d54e2e388a65251
|
666038a9b06d6fae076b590c8b589bb317ac90b0
|
refs/heads/master
|
<repo_name>Zumbalamambo/neuraudio<file_sep>/app/build.gradle
apply plugin: 'com.android.application'
android {
compileSdkVersion 25
buildToolsVersion "25.0.0"
defaultConfig {
applicationId "com.goperez.neuraudio"
minSdkVersion 16
targetSdkVersion 25
versionCode 1
versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
jackOptions {
enabled true
}
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
compileOptions {
targetCompatibility 1.8
sourceCompatibility 1.8
}
}
repositories {
maven { url 'https://github.com/kshoji/JFugue-for-Android/raw/master/jfugue-android/snapshot' }
maven { url 'https://github.com/kshoji/USB-MIDI-Driver/raw/master/MIDIDriver/snapshots' }
}
dependencies {
compile fileTree(include: ['*.jar'], dir: 'libs')
androidTestCompile('com.android.support.test.espresso:espresso-core:2.2.2', {
exclude group: 'com.android.support', module: 'support-annotations'
})
compile 'com.android.support:appcompat-v7:25.1.0'
compile 'com.android.support:design:25.1.0'
testCompile 'junit:junit:4.12'
compile 'jp.kshoji:jfugue-android:4.0.3:@aar'
compile 'jp.kshoji:midi-driver:0.1.1:@aar'
compile 'com.github.hiteshsondhi88.libffmpeg:FFmpegAndroid:0.2.5'
compile 'com.explodingart:jmusic:1.6.4.1'
compile 'com.android.support.constraint:constraint-layout:1.0.0-beta4'
}
<file_sep>/app/src/main/java/com/goperez/neuraudio/MainActivity.java
package com.goperez.neuraudio;
import android.app.ActivityManager;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.content.res.AssetManager;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.net.Uri;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.view.View;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Iterator;
import java.util.List;
import java.util.Random;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import jm.music.data.Note;
import jm.music.data.Part;
import jm.music.data.Phrase;
import jm.music.data.Score;
import jm.util.Write;
import static jm.constants.Durations.MINIM;
import static jm.constants.Pitches.C4;
import static jm.constants.ProgramChanges.GUITAR;
import static jm.constants.ProgramChanges.TRUMPET;
public class MainActivity extends AppCompatActivity {
static boolean isPlaying = false;
static boolean isFinished = false;
private ResponseReceiver receiver;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
IntentFilter filter = new IntentFilter(ResponseReceiver.ACTION_RESP);
filter.addCategory(Intent.CATEGORY_DEFAULT);
receiver = new ResponseReceiver(MainActivity.this);
registerReceiver(receiver, filter);
}
public void onClickPlay(View view) {
//Play button
android.support.design.widget.FloatingActionButton playButton = (android.support.design.widget.FloatingActionButton) findViewById(R.id.myFAB);
//Stores state of button
if(isPlaying) {
isPlaying = false;
playButton.setImageResource(R.drawable.ic_play_arrow_white_48dp);
ActivityManager am = (ActivityManager) getSystemService(ACTIVITY_SERVICE);
List<ActivityManager.RunningAppProcessInfo> runningAppProcesses = am.getRunningAppProcesses();
Iterator<ActivityManager.RunningAppProcessInfo> iter = runningAppProcesses.iterator();
while(iter.hasNext()){
ActivityManager.RunningAppProcessInfo next = iter.next();
String processName = getPackageName() + ":service";
if(next.processName.equals(processName)){
android.os.Process.killProcess(next.pid);
break;
}
}
} else {
isPlaying = true;
playButton.setImageResource(R.drawable.ic_pause_white_24dp);
Intent startPlayService = new Intent(MainActivity.this, PlayService.class);
startService(startPlayService);
}
}
}
<file_sep>/README.md
# Neuraudio
A procedural music generation app for Android that demonstrates the abilities of Markov chains.
## What can I use it for?
Neuraudio has uses for:
* Music Producers - need an idea for a song or melody? Fire up Neuraudio and listen until something speaks to you
* The layman - Out of data and don't have anything to listen to? Neuraudio generates music straight from the app - no internet connection required! Enjoy relaxing melodies and chords at the touch of a button.
## How does it work?
Like this:
```java
public void createMusic() {
Note[] notes = new Note[10];
Random rand = new Random();
boolean scaleNote = false;
while(!scaleNote) {
notes[0] = new Note(rand.nextInt(37) + 36, new Double(.25) + new Double(2.75) * rand.nextDouble());
if(notes[0].isScale(MAJOR_SCALE)) {
scaleNote = true;
}
}
    for(int i = 1; i < 10; i++) {
//notes[i] = new Note(rand.nextInt(37) + 36, new Double(.25) + new Double(2.75) * rand.nextDouble());
notes[i] = markovChain(notes[i-1]);
}
Phrase phrase = new Phrase();
phrase.addNoteList(notes);
Part guitar = new Part("Piano", PIANO, 0);
guitar.addPhrase(phrase);
Score score = new Score("Piano");
score.addPart(guitar);
Write.midi(score, getFilesDir() + "/test.midi");
System.out.println(new File(getFilesDir() + "/test.midi").exists());
playMusic();
}
```
This is one of our most vital functions as it creates musical phrases with instruments and initializes the markov chain.
# TODO
## Networking
* Create neuron as interface
* Implement interfaces for network
* Calculate initial weights for network
* Develop and implement algorithm for creation
## Python
* Pull Music21 Corpus pieces through API
* Parse MusicXML files into midi
* Analyze midi to calculate weights (see the sketch after this list)
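A rough sketch of how these Python TODO items might look with the music21 library (this is only a sketch under stated assumptions, not code that exists in this project; the corpus piece and output file name are placeholders):
```python
# Hypothetical sketch for the Python TODO items above, using the music21 library.
# 'bach/bwv66.6' and 'bwv66.mid' are placeholder names, not project files.
from collections import defaultdict
from music21 import corpus

piece = corpus.parse('bach/bwv66.6')    # pull a piece from the bundled corpus
piece.write('midi', fp='bwv66.mid')     # write it back out as MIDI

# Count pitch-to-pitch transitions to estimate Markov weights.
transitions = defaultdict(lambda: defaultdict(int))
notes = list(piece.flatten().getElementsByClass('Note'))  # use .flat on older music21 versions
for prev, curr in zip(notes, notes[1:]):
    transitions[prev.pitch.midi][curr.pitch.midi] += 1
```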
<file_sep>/app/src/main/java/com/goperez/neuraudio/ResponseReceiver.java
package com.goperez.neuraudio;
import android.app.ActivityManager;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import java.util.Iterator;
import java.util.List;
import static android.content.Context.ACTIVITY_SERVICE;
/**
* Created by sggan on 2/9/2017.
*/
public class ResponseReceiver extends BroadcastReceiver {
public static final String ACTION_RESP = "com.goperez.neuraudio.MESSAGE_PROCESSED";
public Context context;
public ResponseReceiver(Context context) {
this.context = context;
}
@Override
public void onReceive(Context context, Intent intent) {
ActivityManager am = (ActivityManager) context.getSystemService(ACTIVITY_SERVICE);
List<ActivityManager.RunningAppProcessInfo> runningAppProcesses = am.getRunningAppProcesses();
Iterator<ActivityManager.RunningAppProcessInfo> iter = runningAppProcesses.iterator();
while(iter.hasNext()){
ActivityManager.RunningAppProcessInfo next = iter.next();
String processName = context.getPackageName() + ":service";
if(next.processName.equals(processName)){
android.os.Process.killProcess(next.pid);
break;
}
}
Intent startPlayService = new Intent(context, PlayService.class);
context.startService(startPlayService);
}
}
|
bb1918649f7b45b8bf497a0cc295a9a4e6c51afa
|
[
"Markdown",
"Java",
"Gradle"
] | 4 |
Gradle
|
Zumbalamambo/neuraudio
|
d5c5dda6267120aaf7ddf71fdaf1bb5163237e95
|
8ed517a0a26903fdba4b666bb106074841fa2481
|
refs/heads/master
|
<repo_name>zhoukegui/lc_private_codes<file_sep>/Workstation/Clustering using ALFF/screenPatients.py
# -*- coding: utf-8 -*-
"""
Created on Wed Nov 7 21:02:52 2018
@author: lenovo
"""
import pandas as pd
scaleDataPath=r"D:\WorkStation_2018\WorkStation_2018_08_Doctor_DynamicFC_Psychosis\Scales\8.30大表.xlsx"
trainDataPath=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\trainingData.xlsx'
testScalePath=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\REST-meta-MDD-PhenotypicData_WithHAMDSubItem_S20.xlsx'
predictLabelPath=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\predictLabel_testData.xlsx'
#
scale=pd.read_excel(scaleDataPath)
trainData=pd.read_excel(trainDataPath)
testScale=pd.read_excel(testScalePath)
predictLabel=pd.read_excel(predictLabelPath,header=None,index=False)
predictLabel.columns=['label']
#
testScale_predictLabel=pd.concat([testScale,predictLabel],axis=1)
testScale_predictLabel.to_excel('testsScale_plus_predictLabel.xlsx')
#
joinDf=scale.set_index('folder').join(trainData.set_index('folder'),how='right')
diagnosis=joinDf['诊断']
#<file_sep>/Workstation/SSD_classification/Visulization/lc_visualizing_performances_for_leaveonesitecv.py
# -*- coding: utf-8 -*-
"""This script is used to get each subgroups' demographic information and visualization
"""
#%%
import sys
sys.path.append(r'D:\My_Codes\LC_Machine_Learning\lc_rsfmri_tools\lc_rsfmri_tools_python')
sys.path.append(r'D:\My_Codes\easylearn-fmri\eslearn\statistical_analysis')
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.backends.backend_pdf import PdfPages
from matplotlib.pyplot import MultipleLocator
import pickle
import seaborn as sns
from lc_binomialtest import lc_binomialtest
from eslearn.statistical_analysis.lc_anova import oneway_anova
from eslearn.statistical_analysis.lc_chisqure import lc_chisqure
from eslearn.visualization.el_violine import ViolinPlotMatplotlib
#%% Inputs
scale_550_file = r'D:\WorkStation_2018\SZ_classification\Scale\10-24大表.xlsx'
headmotion_file_dataset1 = r'D:\WorkStation_2018\SZ_classification\Scale\头动参数_1322.xlsx'
classification_results_results_leave_one_site_cv_file = r'D:\WorkStation_2018\SZ_classification\Data\ML_data_npy\results_leave_one_site_cv.npy'
is_plot = 1
is_savefig = 1
#%% Load and preprocess
scale_550 = pd.read_excel(scale_550_file)
headmotion_dataset1 = pd.read_excel(headmotion_file_dataset1)[['Subject ID', 'mean FD_Power']]
scale_550 = pd.merge(scale_550, headmotion_dataset1, left_on='folder', right_on='Subject ID', how='inner')
results_leave_one_site_cv = np.load(classification_results_results_leave_one_site_cv_file, allow_pickle=True)
results_special = results_leave_one_site_cv['special_result']
results_special = pd.DataFrame(results_special)
results_special.iloc[:, 0] = np.int32(results_special.iloc[:, 0])
scale_550['folder'] = np.int32(scale_550['folder'])
# Filter subjects that have .mat files
scale_550_selected = pd.merge(results_special, scale_550, left_on=0, right_on='folder', how='inner')
#%% Calculate performance for Schizophrenia Spectrum subgroups
## Step 1: Dataset1
duration = 18 # Upper limit of first episode:
""" reference:
1. <NAME>, <NAME>, Schooler NR, et al. Comprehensive versus usual
community care for first-episode psychosis: 2-year outcomes from the NIMH
RAISE early treatment program. Am J Psychiatry. 2016;173(4):362-372. doi:10.1176/appi.ajp.2015.15050632.
2. Cognitive Impairment in Never-Medicated Individuals on the Schizophrenia Spectrum. doi:10.1001/jamapsychiatry.2020.0001"
"""
# First episode unmedicated; first episode medicated; chronic medicated
data_chronic_medicated_SSD_550 = scale_550_selected[
(scale_550_selected['诊断']==3) &
(scale_550_selected['病程月'] > duration) &
(scale_550_selected['用药'] == 1)
]
data_firstepisode_medicated_SSD_550 = scale_550_selected[
(scale_550_selected['诊断']==3) &
(scale_550_selected['首发'] == 1) &
(scale_550_selected['病程月'] <= duration) &
(scale_550_selected['用药'] == 1)
]
data_firstepisode_unmedicated_SSD_550 = scale_550_selected[
(scale_550_selected['诊断']==3) &
(scale_550_selected['首发'] == 1) &
(scale_550_selected['病程月'] <= duration) &
(scale_550_selected['用药'] == 0)
]
## Calculating Accuracy
acc_chronic_medicated_SSD_550 = np.sum(data_chronic_medicated_SSD_550[1]-data_chronic_medicated_SSD_550[3]==0) / len(data_chronic_medicated_SSD_550)
acc_first_episode_unmedicated_SSD_550 = np.sum(data_firstepisode_unmedicated_SSD_550[1]-data_firstepisode_unmedicated_SSD_550[3]==0) / len(data_firstepisode_unmedicated_SSD_550)
acc_firstepisode_medicated_SSD_550 = np.sum(data_firstepisode_medicated_SSD_550[1]-data_firstepisode_medicated_SSD_550[3]==0) / len(data_firstepisode_medicated_SSD_550)
acc_all_SSD_550 = np.sum(scale_550_selected[1]-scale_550_selected[3]==0) / len(scale_550_selected)
print(f'Sensitivity of first episode unmedicated in dataset550 = {acc_first_episode_unmedicated_SSD_550}')
print(f'Sensitivity of first episode medicated in dataset550 = {acc_firstepisode_medicated_SSD_550}')
print(f'Sensitivity of chronic medicated in dataset550 = {acc_chronic_medicated_SSD_550}')
print(f'Sensitivity of all SSD in dataset550 = {acc_all_SSD_550}')
print('-'*50)
# Extract subjects' demographic data
subinfo_chronic_medicated = data_chronic_medicated_SSD_550[['folder', '年龄', '性别', '学历(年)', 'BPRS_Total', 'mean FD_Power', '病程月']]
subinfo_firstepisode_medicated = data_firstepisode_medicated_SSD_550[['folder', '年龄', '性别', '学历(年)', 'BPRS_Total', 'mean FD_Power', '病程月']]
subinfo_firstepisode_unmedicated = data_firstepisode_unmedicated_SSD_550[['folder', '年龄', '性别', '学历(年)', 'BPRS_Total', 'mean FD_Power', '病程月']]
## Save index and subjects' demographic data
subinfo_chronic_medicated[['folder']].to_csv(r'D:\WorkStation_2018\SZ_classification\Scale\index_chronic.txt', index=False, header=False)
subinfo_firstepisode_medicated[['folder']].to_csv(r'D:\WorkStation_2018\SZ_classification\Scale\index_firstepisode_medicated.txt', index=False, header=False)
subinfo_firstepisode_unmedicated[['folder']].to_csv(r'D:\WorkStation_2018\SZ_classification\Scale\index_firstepisode_unmedicated.txt', index=False, header=False)
subinfo_chronic_medicated[['folder', '年龄', '性别', '学历(年)', 'mean FD_Power', '病程月']].to_csv(r'D:\WorkStation_2018\SZ_classification\Scale\cov_chronic.txt', index=False)
subinfo_firstepisode_medicated[['folder', '年龄', '性别', '学历(年)', 'mean FD_Power', '病程月']].to_csv(r'D:\WorkStation_2018\SZ_classification\Scale\cov_firstepisode_medicated.txt', index=False)
subinfo_firstepisode_unmedicated[['folder', '年龄', '性别', '学历(年)', 'mean FD_Power', '病程月']].to_csv(r'D:\WorkStation_2018\SZ_classification\Scale\cov_firstepisode_unmedicated.txt', index=False)
plt.figure(figsize=(8,10))
title_dict = {0:'Age', 1:'Education', 2:'BPRS', 3:'Head motion', 4: 'Gender', 5:'Duration'}
ylabel_dict = {0:'Year', 1:'Year', 2:'', 3:'', 4:'Proportion of male', 5:'Month'}
for i, df in enumerate(['年龄', '学历(年)', 'BPRS_Total', 'mean FD_Power', '性别', '病程月']):
plt.subplot(2,3,i+1)
if i == 4:
plt.bar(0, subinfo_chronic_medicated[df].dropna().value_counts()[1]/len(subinfo_chronic_medicated), width=0.3, alpha=0.5)
plt.grid(axis='y')
plt.bar(1, subinfo_firstepisode_medicated[df].dropna().value_counts()[1]/len(subinfo_firstepisode_medicated), width=0.3, alpha=0.5)
plt.grid(axis='y')
plt.bar(2, subinfo_firstepisode_unmedicated[df].dropna().value_counts()[1]/len(subinfo_firstepisode_unmedicated), width=0.3, alpha=0.5)
plt.grid(axis='y')
tt = [
len(subinfo_chronic_medicated[df].dropna()),
len(subinfo_firstepisode_medicated[df].dropna()),
len(subinfo_firstepisode_unmedicated[df].dropna())
]
obs = [
subinfo_chronic_medicated[df].dropna().value_counts()[1],
subinfo_firstepisode_medicated[df].dropna().value_counts()[1],
subinfo_firstepisode_unmedicated[df].dropna().value_counts()[1]
]
chi, p = lc_chisqure(obs, tt)
else:
ViolinPlotMatplotlib().plot([subinfo_chronic_medicated[df].dropna().values], positions=[0])
plt.grid(axis='y')
ViolinPlotMatplotlib().plot([subinfo_firstepisode_medicated[df].dropna().values], positions=[1])
plt.grid(axis='y')
ViolinPlotMatplotlib().plot([subinfo_firstepisode_unmedicated[df].dropna().values], positions=[2])
plt.grid(axis='y')
f, p = oneway_anova(
*[subinfo_chronic_medicated[df].dropna(),
subinfo_firstepisode_medicated[df].dropna(),
subinfo_firstepisode_unmedicated[df].dropna()]
)
plt.yticks(fontsize=12)
plt.xticks([0, 1, 2], ['Chronic SSD', 'First episode medicated SSD', 'First episode unmedicated SSD'], rotation=45, ha="right")
plt.ylabel(ylabel_dict[i], fontsize=15)
plt.title(''.join([title_dict[i], f'(P={p:.2f})']), fontsize=12, fontweight="bold")
plt.subplots_adjust(wspace = 0.2, hspace =0)
plt.tight_layout()
pdf = PdfPages(r'D:\WorkStation_2018\SZ_classification\Figure\Processed\subgroupinfo.pdf')
pdf.savefig()
pdf.close()
plt.show()
<file_sep>/Workstation/Others/move_DynamicFC.py
# -*- coding: utf-8 -*-
"""
Created on Mon Oct 15 16:26:23 2018
Group each window's state information by diagnosis
@author: lenovo
"""
from moveScreenedFile import moveMain
import pandas as pd
import os
## ==========================input====================================
# source data
rootpath=r'D:\WorkStation_2018\WorkStation_2018_08_Doctor_DynamicFC_Psychosis\Data\zDynamic\state'
stateName=['allState17_4','allState17_5','allState17_8',
'allState20_2','allState20_4','allState20_5','allState20_8']
metricsName=['fractionOfTimeSpentInEachDtate','fullTransitionMatrix','meanDwellTimeInEachState','numberOfTransitions']
##
rootPath=[]
for statename in stateName:
[rootPath.append(os.path.join(rootpath,statename,metricsname)) for metricsname in metricsName]
#os.mkdir(rootPath[0])
# out path
outPath=[]
for statename in stateName:
[outPath.append(os.path.join(rootpath,statename,metricsname+'1')) for metricsname in metricsName]
#os.mkdir(outPath[0])
## ==========================import===================================
from basicInfoStat import folder1,folder2,folder3,folder4
## ============================run====================================
reguForExtractFileName='[1-9]\d*'
for (rootpath,outpath) in zip(rootPath,outPath):
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder1,reguForExtractFileName,\
os.path.join(outpath,'HC'),ifMove=1,ifSaveMoveLog=0)
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder2,reguForExtractFileName,\
os.path.join(outpath,'MDD'),ifMove=1,ifSaveMoveLog=0)
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder3,reguForExtractFileName,\
os.path.join(outpath,'SZ'),ifMove=1,ifSaveMoveLog=0)
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder4,reguForExtractFileName,\
os.path.join(outpath,'BD'),ifMove=1,ifSaveMoveLog=0)
rootpath=r'D:\WorkStation_2018\WorkStation_2018_08_Doctor_DynamicFC_Psychosis\Data\zDynamic\state\allState17_2\numberOfTransitions'
outpath=r'D:\WorkStation_2018\WorkStation_2018_08_Doctor_DynamicFC_Psychosis\Data\zDynamic\state\allState17_2\numberOfTransitions1'
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder1,reguForExtractFileName,\
os.path.join(outpath,'HC'),ifMove=1,ifSaveMoveLog=0)
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder2,reguForExtractFileName,\
os.path.join(outpath,'MDD'),ifMove=1,ifSaveMoveLog=0)
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder3,reguForExtractFileName,\
os.path.join(outpath,'SZ'),ifMove=1,ifSaveMoveLog=0)
screenedFilePath1,logic_loc_refrence1=\
moveMain(rootpath,folder4,reguForExtractFileName,\
os.path.join(outpath,'BD'),ifMove=1,ifSaveMoveLog=0)<file_sep>/Workstation/Others/lc_radar_leptin.py
# -*- coding: utf-8 -*-
"""
Created on Thu Nov 29 17:48:46 2018
@author: lenovo
"""
import pandas as pd
import numpy as np
file=r'D:\others\彦鸽姐\leptin-83-figure-data.xlsx'
data=pd.read_excel(file)
#hc=data[data['诊断']==1]['Leptin'].to_excel('D:\others\彦鸽姐\leptin_hc.xlsx',index=False)
#mdd=data[data['诊断']==2]['Leptin'].to_excel('D:\others\彦鸽姐\leptin_mdd.xlsx',index=False)
#scz=data[data['诊断']==3]['Leptin'].to_excel('D:\others\彦鸽姐\leptin_scz.xlsx',index=False)
#bpd=data[data['诊断']==4]['Leptin'].to_excel('D:\others\彦鸽姐\leptin_bpd.xlsx',index=False)
hc=data[data['诊断']==1]['Leptin']
mdd=data[data['诊断']==2]['Leptin']
scz=data[data['诊断']==3]['Leptin']
bpd=data[data['诊断']==4]['Leptin']
mean=[np.mean(hc),np.mean(mdd),np.mean(scz),np.mean(bpd)]
<file_sep>/Workstation/Clustering using ALFF/lc_compare_scale_after_classification.py
# -*- coding: utf-8 -*-
"""
Created on Mon Nov 26 16:22:19 2018
After applying the clustering model to classify the test set, compare the scale differences between the classes
@author: lenovo
"""
#
import sys
import pandas as pd
import numpy as np
sys.path.append(r'D:\myCodes\MVPA_LIChao\MVPA_Python\Statistic')
from lc_chi2 import lc_chi2
from lc_ttest2 import ttest2
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm
import statsmodels.api as sm
import seaborn as sns
import matplotlib.pyplot as plt
# input
scale_file=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\Data\REST-meta-MDD-PhenotypicData_WithHAMDSubItem_S20.xlsx'
pred_label_file=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\Data\multilabel\predict_label_multilabel.xlsx'
# load file
scale=pd.read_excel(scale_file)
pred_label=pd.read_excel(pred_label_file,header=None)
pred_label.columns=['predict_label']
# concat
scale=pd.concat([scale,pred_label],axis=1)
# dropna and others
scale=scale.mask(scale==-9999,None)
scale=scale.mask(scale=='[]',None)
scale=scale.dropna()
# save
#scale.to_excel(r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\Data\multilabel\scale_predct_label.xlsx',index=False)
#==============================================================================
# statistic
# sex
data_sex=[list(scale[scale['predict_label']==1]['性别'].value_counts()),
list(scale[scale['predict_label']==2]['性别'].value_counts()),
list(scale[scale['predict_label']==3]['性别'].value_counts()),
list(scale[scale['predict_label']==4]['性别'].value_counts()),
list(scale[scale['predict_label']==5]['性别'].value_counts())]
chi_sex,p_sex=lc_chi2(data_sex)
# age
model_age = ols('年龄~predict_label',scale).fit()
anova_age = sm.stats.anova_lm(model_age)
print(anova_age)
p_age=anova_age.iloc[0,-1]
F_age=anova_age.iloc[0,-2]
# post_hoc
p=np.ones([5,5])
for i in np.arange(1,6,1):
for j in np.arange(1,6,1):
t,p[i-1,j-1]=ttest2(scale[scale['predict_label']==i]['年龄'].values,
scale[scale['predict_label']==j]['年龄'].values,
method='independent')
# hot map
f, (ax1) = plt.subplots(figsize=(6,6),nrows=1)
#sns.heatmap(x, annot=True, ax=ax1,cmap='rainbow',center=0)#cmap='rainbow'
sns.heatmap(p,ax=ax1,
annot=True,annot_kws={'size':9,'weight':'normal', 'color':'w'},fmt='.2f',
cmap='RdBu',
linewidths = 0.05, linecolor= 'w',
mask=p>0.05)
ax1.set_title('P values of age')
ax1.set_xticklabels(labels=[1,2,3,4,5],fontsize=15)
ax1.set_yticklabels(labels=[1,2,3,4,5],fontsize=15)
# education
model_edu = ols('教育年限~C(predict_label)',scale).fit()
anova_edu = anova_lm(model_edu)
print(anova_edu)
p_edu=anova_edu.iloc[0,-1]
F_edu=anova_edu.iloc[0,-2]
# medication
data_med=[list(scale[scale['predict_label']==1]['是否正在用药'].value_counts()),
list(scale[scale['predict_label']==2]['是否正在用药'].value_counts()),
list(scale[scale['predict_label']==3]['是否正在用药'].value_counts()),
list(scale[scale['predict_label']==4]['是否正在用药'].value_counts()),
list(scale[scale['predict_label']==5]['是否正在用药'].value_counts())]
chi_med,p_med=lc_chi2(data_med)
# HAMD
# First convert HAMD to a numeric dtype
scale['HAMD']=scale['HAMD'].astype('float32')
model_HAMD = ols("HAMD~C(predict_label)",scale).fit()
anova_HAMD = anova_lm(model_HAMD)
p_HAMD=anova_HAMD.iloc[0,-1]
f_HAMD=anova_HAMD.iloc[0,-2]
#==============================================================================
# HAMA
# First convert HAMA to a numeric dtype
scale['HAMA']=scale['HAMA'].astype('float32')
model_HAMA = ols("HAMA~C(predict_label)",scale).fit()
anova_HAMA = anova_lm(model_HAMA)
p_HAMA=anova_HAMA.iloc[0,-1]
f_HAMA=anova_HAMA.iloc[0,-2]
# post_hoc
p=np.ones([5,5])
for i in np.arange(1,6,1):
for j in np.arange(1,6,1):
t,p[i-1,j-1]=ttest2(scale[scale['predict_label']==i]['HAMA'].values,
scale[scale['predict_label']==j]['HAMA'].values,
method='independent')
# hot map
f, (ax1) = plt.subplots(figsize=(6,6),nrows=1)
#sns.heatmap(x, annot=True, ax=ax1,cmap='rainbow',center=0)#cmap='rainbow'
sns.heatmap(p,ax=ax1,
annot=True,annot_kws={'size':9,'weight':'normal', 'color':'w'},fmt='.2f',
cmap='RdBu',
linewidths = 0.05, linecolor= 'w',
mask=p>0.05)
ax1.set_title('P values of HAMA')
ax1.set_xticklabels(labels=[1,2,3,4,5],fontsize=15)
ax1.set_yticklabels(labels=[1,2,3,4,5],fontsize=15)<file_sep>/Workstation/Others/copyDALFF.py
# -*- coding: utf-8 -*-
"""
Created on Sat Oct 27 13:10:51 2018
Condition: each subjxxx folder contains multiple files, only one of which is the file we need.
Goal: copy that one needed file from every subject folder into another folder,
with each copied file stored under a folder named subjxxx.
@author: lenovo
"""
import sys
sys.path.append(r'D:\myCodes\MVPA_LIChao\MVPA_Python\workstation')
import copySelectedFile_OsWalk3 as copy
# ====================================================================
# input
referenceFile=(r'D:\WorkStation_2018\WorkStation_2018_08_Doctor_DynamicFC_Psychosis\Data\zDynamic\state\subjName.xlsx')
# basic['folder'].to_csv(r'I:\dynamicALFF\folder.txt',header=False,index=False)
sel=copy.copy_fmri(referencePath=referenceFile,
regularExpressionOfsubjName_forReference='([1-9]\d*)',
folderNameContainingFile_forSelect='',
num_countBackwards=2,
regularExpressionOfSubjName_forNeuroimageDataFiles='([1-9]\d*)',\
keywordThatFileContain='var_mdALFF',
neuroimageDataPath=r'I:\dynamicALFF\Results\DALFF\80_0.9\variance_dALFFmap',
savePath=r'I:\dynamicALFF\Results\DALFF\80_0.9',
saveFolderName='variance_dmALFF',
n_processess=10,
ifSaveLog=0,
ifCopy=1,
ifMove=1,
saveInToOneOrMoreFolder='saveToOneFolder',
saveNameSuffix='.nii',
ifRun=1)
allFilePath,allSubjName,logic_loc,allSelectedFilePath,allSelectedSubjName=\
sel.main_run() <file_sep>/Workstation/Others/alff_plot_box.py
# -*- coding: utf-8 -*-
"""
Created on Tue Nov 27 15:23:14 2018
@author: lenovo
"""
# =============================================================================
import sys
sys.path.append(r'D:\myCodes\MVPA_LIChao\MVPA_Python\plot')
#import lc_boxplot as boxplot
import lc_violinplot as violin
import numpy as np
import pandas as pd
data_path=r'D:\myCodes\MVPA_LIChao\MVPA_Python\plot\scale_predct_label.xlsx'
#prd=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\Machine_Learning\predictLabel_testData.xlsx'
#prd_label=pd.read_excel(prd)
#a=pd.read_excel(data_path)
# =============================================================================
# add
sel=violin.violinplot(
data_path=r'D:\myCodes\MVPA_LIChao\MVPA_Python\plot\scale_predct_label.xlsx',
x_location=[7,8],
x_name='脑区',
y_name='reho',
hue_name='predict_label',
hue_order=None,
if_save_figure=0,
figure_name=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\Data\multilabel\hamd_hama.tif')
r=sel.plot()
#
r.ax.set_xticklabels(labels=['HAMD','HAMA'],fontsize=20)
r.ax.set_ylabel(ylabel='',fontsize=20)
r.ax.set_xlabel(xlabel='',fontsize=20)
#sel.f.savefig(sel.figure_name, dpi=300, bbox_inches='tight')
<file_sep>/Workstation/Others/BOLDVar.py
# -*- coding: utf-8 -*-
"""
Created on Mon Nov 12 22:52:35 2018
@author: lenovo
"""
import os
import pandas as pd
savePath=r'I:\BOLDVar\folder'
hcFolder=basic['folder'][basic['诊断']==1].to_excel(os.path.join(savePath,'hc.xlsx'),index=False,header=False)
mddFolder=basic['folder'][basic['诊断']==2].to_excel(os.path.join(savePath,'mdd.xlsx'),index=False,header=False)
szFolder=basic['folder'][basic['诊断']==3].to_excel(os.path.join(savePath,'sz.xlsx'),index=False,header=False)
bdFolder=basic['folder'][basic['诊断']==4].to_excel(os.path.join(savePath,'bd.xlsx'),index=False,header=False)
import copySelectedFile_OsWalk4 as copy
# basic['folder'].to_csv(r'I:\dynamicALFF\folder.txt',header=False,index=False)
sel=copy.copy_fmri(referencePath=os.path.join(savePath,'bd.xlsx'),
regularExpressionOfsubjName_forReference='([1-9]\d*)',
folderNameContainingFile_forSelect='',
num_countBackwards=1,
regularExpressionOfSubjName_forNeuroimageDataFiles='([1-9]\d*)',\
keywordThatFileContain='nii',
neuroimageDataPath=r'I:\BOLDVar\all_BOLDVar',
savePath=r'I:\BOLDVar\BD',
n_processess=10,
ifSaveLog=0,
ifCopy=0,
ifMove=1,
saveInToOneOrMoreFolder='saveToOneFolder',
saveNameSuffix='',
ifRun=1)
allFilePath,allSubjName,\
logic_loc,allSelectedFilePath,allSelectedSubjName=\
sel.main_run()
print('Done!')<file_sep>/Workstation/Others/groupDALFF.py
# -*- coding: utf-8 -*-
"""
Created on Sat Oct 27 13:10:51 2018
Group the dmALFF files
@author: lenovo
"""
import sys
import pandas as pd
#sys.path.append(r'D:\myCodes\MVPA_LIChao\MVPA_Python\workstation')
import copySelectedFile_OsWalk4 as copy
# ====================================================================
# input
referenceFile_HC=(r'H:\dynamicALFF\Scales\folder_BD.xlsx')
#referenceFile_MDD=(r'H:\dynamicALFF\folder2.xlsx')
#referenceFile_SZ=(r'H:\dynamicALFF\folder3.xlsx')
#referenceFile_BD=(r'H:\dynamicALFF\folder4.xlsx')
#referenceFile=[referenceFile_HC,referenceFile_MDD,
# referenceFile_SZ,referenceFile_BD]
#subjName_forSelect=pd.read_excel(referenceFile_HC,dtype='str',header=None)
# ============================================================================
sel=copy.copy_fmri(referencePath=referenceFile_HC,
regularExpressionOfsubjName_forReference='([1-9]\d*)',
folderNameContainingFile_forSelect='',
num_countBackwards=1,
regularExpressionOfSubjName_forNeuroimageDataFiles='([1-9]\d*)',\
keywordThatFileContain='nii',
neuroimageDataPath=r'H:\dynamicALFF\Results\DALFF\50_0.9\BD_smooth',
savePath=r'H:\dynamicALFF\Results\DALFF\50_0.9\BD_smooth_screened',
n_processess=10,
ifSaveLog=0,
ifCopy=0,
ifMove=1,
saveInToOneOrMoreFolder='saveToOneFolder',
saveNameSuffix='',
ifRun=1)
allFilePath,allSubjName,logic_loc,allSelectedFilePath,allSelectedSubjName=\
sel.main_run() <file_sep>/Workstation/Clustering using ALFF/check_label_a.py
# -*- coding: utf-8 -*-
"""
Created on Mon Nov 19 14:42:56 2018
Check how many of the subjects assigned to label a in the clustering results are MDD
1:HC;2:MDD;3:SZ;4:BD;5:HR
@author: lenovo
"""
import pandas as pd
def check_label(label='b'):
# input
dianosis={1:'HC',2:'MDD',3:'SZ',4:'BD',5:'HR'}
# label='b'
diagnosis_index=1
scale_data_path=r"D:\WorkStation_2018\WorkStation_2018_08_Doctor_DynamicFC_Psychosis\Scales\8.30大表.xlsx"
training_data_path=r'D:\WorkStation_2018\WorkStation_2018_11_machineLearning_Psychosi_ALFF\Data\trainingData.xlsx'
# read to DataFrame
scale_data=pd.read_excel(scale_data_path)
training_data=pd.read_excel(training_data_path)
# screen mdd's folder and all label a's folder
diagnosis_folder_in_scale=pd.DataFrame(scale_data.loc[scale_data['诊断'] == diagnosis_index]['folder'])
label_folder_in_training_data=pd.DataFrame(training_data.loc[training_data['k=5_label'] == label]['folder'])
    # intersection: subjects that have this diagnosis AND fall into this cluster label
    selected_folder = diagnosis_folder_in_scale.set_index('folder').\
        join(label_folder_in_training_data.set_index('folder'), how='inner')
    # proportion of this diagnosis within the cluster label in the training data
    per = '{:f}'.format(len(selected_folder) / len(label_folder_in_training_data))
    diagnosis_name = diagnosis[diagnosis_index]
    print('Proportion of {} in label {}: {}\nAbsolute count: {}'.format(diagnosis_name, label, per, len(selected_folder)))
if __name__=='__main__':
check_label(label='a')
check_label(label='b')
check_label(label='c')
check_label(label='d')
check_label(label='e')
|
6c8d070675c5df6480f6813b38a2f99cf31d8865
|
[
"Python"
] | 10 |
Python
|
zhoukegui/lc_private_codes
|
b6272eea294427f02f404894f4eeeb6ed95ab820
|
38f56812bcc32cb1321eefce6989fb1ecc41cdd8
|
refs/heads/master
|
<repo_name>mohdasim8018/RMI-Remote-Method-Invocation-<file_sep>/src/Patient.java
import java.io.Serializable;
import java.rmi.Naming;
import java.rmi.RemoteException;
import java.util.*;
public class Patient implements Serializable {
private static final long serialVersionUID = 1L;
private String glob;
protected Patient() throws RemoteException{
super();
}
public void setChoice(String a){
this.glob = a;
}
public String getChoice(){
return this.glob;
}
public static void main(String[] args) throws Exception{
Patient patientObj = new Patient();
String url1 = new String("rmi://localhost/doctorObj");
doctorRmiIntf doctorStubObj1 = (doctorRmiIntf)Naming.lookup(url1);
Patient ref1 = patientObj;
Patient ref2 = patientObj;
/*
* Remote method call with the two references(ref1 and ref2) of
* the same object
* */
doctorStubObj1.refIntegrity(ref1, ref2);
String url = new String("rmi://localhost/hospitalObj");
hospitalRmiIntf hospitalStubObj = (hospitalRmiIntf)Naming.lookup(url);
Map<String, String> hosps= hospitalStubObj.displayHospitals();
while (true){
System.out.println("\t Hospital Name \t\t Address");
System.out.println("\t ------------- \t\t -------");
for(Map.Entry<String,String> entry : hosps.entrySet()) {
System.out.printf("\t%-20s \t%-20s %n", entry.getKey(), entry.getValue());
System.out.println();
}
while(true) {
System.out.println("Enter '1' for Doctors"+"\n"+"Enter '2' for Insurance\n"
+ "Enter '3' to 'Exit Application'");
@SuppressWarnings("resource")
Scanner userInput = new Scanner(System.in);
String docInsuran = userInput.next();
if (docInsuran.equals("1")){
while(true){
System.out.println("Enter '1' for 'Stroger' doctors Info \nEnter '2' for 'Rush' "
+ "doctors Info\nEnter '3' for 'UI Hospital' doctors Info\n"
+ "Enter '4' to 'Return to previous menu' \n" + "Enter '5' to 'Exit Application'\n");
String docChoice = userInput.next();
patientObj.setChoice(docChoice);
if (docChoice.equals("4")){
break;
}
if (docChoice.equals("5")){
System.exit(0);
}
Map<String, String> docs = hospitalStubObj.getDoctorsInfo(patientObj);
System.out.println("List of Doctors:");
System.out.println("\t Doctor Name \t\t\t Speciality");
System.out.println("\t ----------- \t\t\t ------------");
for(Map.Entry<String,String> entry : docs.entrySet()) {
System.out.printf("\t%-30s %-30s %n", entry.getKey(), entry.getValue());
System.out.println();
}
}
}
else if (docInsuran.equals("2")){
while(true){
System.out.println("Enter '1' for 'Stroger' Insurance companies Info \nEnter '2' for 'Rush' "
+ "Insurance Companies Info\nEnter '3' for 'UI Hospital' Insurance Companies Info\n"
+ "Enter '4' to 'Return to previous menu' \n" + "Enter '5' to 'Exit Application'\n");
String insurChoice = userInput.next();
patientObj.setChoice(insurChoice);
if (insurChoice.equals("4")){
break;
}
if (insurChoice.equals("5")){
System.exit(0);
}
Map<String, String> insur = hospitalStubObj.getInsurCompInfo(patientObj);
System.out.println("List of Insurance companies:");
System.out.println("\t Company Name \t\t\t Address");
System.out.println("\t ------------- \t\t\t -------");
for(Map.Entry<String,String> entry : insur.entrySet()) {
System.out.printf("\t%-30s %-30s %n", entry.getKey(), entry.getValue());
System.out.println();
}
}
}
else if (docInsuran.equals("3")){
System.exit(0);
}
}
}
}
}
<file_sep>/src/hospital.java
import java.rmi.Naming;
import java.rmi.RemoteException;
import java.rmi.server.UnicastRemoteObject;
//import java.rmi.registry.*;
import java.util.*;
public class hospital extends UnicastRemoteObject implements hospitalRmiIntf{
private static final long serialVersionUID = 1L;
private Map<String, String> hospList = new HashMap<String, String>();
private String choice;
protected hospital() throws RemoteException{
super();
}
public void hospSetChoice(String a){
this.choice = a;
}
public String hospGetChoice(){
return this.choice;
}
/*
* Patient's hospital choice is passed wrapped in the patient object for the
* insurance companies info
* */
public Map<String, String> getInsurCompInfo(Patient y)throws Exception{
Map<String, String> insurInfo = new HashMap<String, String>();
String url = new String("rmi://localhost/insuranceObj");
insuranceRmiIntf insurStubObj = (insuranceRmiIntf)Naming.lookup(url);
insurInfo = insurStubObj.provideInsurInfo(y);
return insurInfo;
}
/*
* List of available hospitals
* */
public Map<String, String> displayHospitals(){
//Map<String, String> hospList = new HashMap<String, String>();
this.hospList.put("UI Hospital", "722 W Maxwell St, Chicago, IL 60607");
this.hospList.put("Rush Hospital", "1620 W Harrison St, Chicago, IL");
this.hospList.put("Stroger Hospital","1969 W Ogden Ave Chicago, IL 60612");
return hospList;
}
/*
* Patient's hospital choice is being passed wrapped in the patient object
* to retrieve respective hospital's doctors info
* */
public Map<String, String> getDoctorsInfo(Patient k)throws Exception{
Map<String, String> docsInfo = new HashMap<String, String>();
String url = new String("rmi://localhost/doctorObj");
doctorRmiIntf doctorStubObj = (doctorRmiIntf)Naming.lookup(url);
docsInfo = doctorStubObj.provideDocsInfo(k);
return docsInfo;
}
public static void main(String[] args)throws Exception {
hospital hospObj = new hospital();
Naming.rebind("hospitalObj", hospObj);
System.out.println("rmi object bound to the name 'hospitalObj' and is ready for use");
}
}
<file_sep>/src/doctor.java
import java.rmi.Naming;
import java.rmi.RemoteException;
import java.rmi.server.UnicastRemoteObject;
import java.util.*;
public class doctor extends UnicastRemoteObject implements doctorRmiIntf {
private static final long serialVersionUID = 1L;
private Map<String, String> docMap= new HashMap<String, String>();
protected doctor() throws RemoteException{
super();
}
/*
* Checks patient's choice:
* If 1, Stroger associated doctors info is being requested
* If 2, Rush associated doctors info is being requested
* If 3, UI associated doctors info is being requested
* */
public Map<String, String> provideDocsInfo(Patient k){
//String choiceUnivDoc = hospObj.hospGetChoice();
if (k.getChoice().equals("3")){
this.docMap = provideUIDocsInfo();
}
else if (k.getChoice().equals("2")){
this.docMap = provideRushDocsInfo();
}
else if (k.getChoice().equals("1")){
this.docMap = provideStrogerDocsInfo();
}
return docMap;
}
public Map<String, String> provideRushDocsInfo(){
Map<String, String> rushMap= new HashMap<String, String>();
rushMap.put("Chase Secrest", "Cardiology");
rushMap.put("<NAME>", "Neurology");
rushMap.put("<NAME>", "Orthopedics");
rushMap.put("<NAME>", "Psychiatry");
rushMap.put("<NAME>", "Pulmonology");
rushMap.put("<NAME>", "Orthopedics");
rushMap.put("<NAME>", "Pulmonology");
rushMap.put("<NAME>", "Neurology");
return rushMap;
}
public Map<String, String> provideUIDocsInfo(){
Map<String, String> uiMap = new HashMap<String, String>();
uiMap.put("<NAME>", "Neurology");
uiMap.put("<NAME>", "Orthopedics");
uiMap.put("<NAME>", "Psychiatry");
uiMap.put("<NAME>", "Pulmonology");
uiMap.put("<NAME>", "Orthopedics");
uiMap.put("<NAME>", "Cardiology");
uiMap.put("<NAME>", "Orthopedics");
uiMap.put("<NAME>", "Psychiatry");
return uiMap;
}
public Map<String, String> provideStrogerDocsInfo(){
Map<String, String> strogerMap = new HashMap<String, String>();
strogerMap.put("<NAME>", "Neurology");
strogerMap.put("<NAME>", "Pulmonology");
strogerMap.put("<NAME>", "Psychiatry");
strogerMap.put("<NAME>", "Cardiology");
strogerMap.put("<NAME>", "Orthopedics");
strogerMap.put("<NAME>", "Pulmonology");
strogerMap.put("<NAME>", "Neurology");
return strogerMap;
}
public static void main(String[] args)throws Exception {
doctor docObj = new doctor();
Naming.rebind("doctorObj", docObj);
System.out.println("rmi object bound to the name 'doctorObj' and is ready for use");
}
/*
 * Referential integrity check:
* ---------------------------
* Printing to verify if Patient object's id
* pointed by the both references in the argument is same
* */
public void refIntegrity(Patient obj, Patient obj1) throws RemoteException {
System.out.println("Referential Integrity:");
System.out.println("---------------------");
System.out.println("First Reference object id"+System.identityHashCode(obj));
System.out.println("Second reference object id"+System.identityHashCode(obj1));
}
}
<file_sep>/src/doctorRmiIntf.java
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.util.*;
public interface doctorRmiIntf extends Remote {
public void refIntegrity(Patient obj, Patient obj1) throws RemoteException;
public Map<String, String> provideDocsInfo(Patient i) throws RemoteException;
public Map<String, String> provideRushDocsInfo() throws RemoteException;
public Map<String, String> provideUIDocsInfo() throws RemoteException;
public Map<String, String> provideStrogerDocsInfo() throws RemoteException;
}
|
9623673b9b9ffdaa60078fae0f6ada0fd92f372b
|
[
"Java"
] | 4 |
Java
|
mohdasim8018/RMI-Remote-Method-Invocation-
|
48c38b1c1845ba032c5a9da8c66a21eac2e46ff6
|
5a992d9ccaaa80b1c84701a7b2a9bbee40118d24
|
refs/heads/master
|
<file_sep>#!/bin/sh
cwd=$(cd `dirname $0`; pwd)
echo $cwd
job=$1
url=`grep "url" *$job* | awk -F 'job: ' '{print $2}' | tail -1`
dt=`grep "date" *props* | grep -h delta_merge | awk -F '=' '{print $2}' | sed 's/\\\\//g'`
python $cwd/put_mr_metric.py -m $job -u $url -d "$dt"
<file_sep>.. queryengine documentation master file, created by
sphinx-quickstart on Sat Aug 3 10:12:59 2013.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to my blog.
Distributed query techniques
============================
Impala
----------------
.. toctree::
:maxdepth: 1
:glob:
impala/overview
impala/frontend
impala/be
impala/debug
HIVE
----------------
.. toctree::
:maxdepth: 1
:glob:
hive/settings
hive/hive-meta
.. release
Sharing
========================
hadoop
----------------
.. toctree::
:maxdepth: 1
:glob:
share/blog
<file_sep>*********************
Impala架构介绍
*********************
Impala要解决的问题
=================================
我们都知道MapReduce在Google内部应用很广,社区有对应的MapReduce实现Hadoop。
但是Hadoop的MapReduce框架本身重点考虑高容量并发,具有很强的扩展能力和容错能力,但对任务的低延时考虑较少. 比如:
* TaskTracker新启动一个Task时需要启动新的JVM;
* Reduce必须等待所有的Map执行完成之后才开始执行其运算逻辑;
* Reduce通过轮询方式获知MapTask的完成情况,需要主动拉取Map的结果,而Map的结果需要写到本地磁盘;
* ...
因此,Hadoop的MapReduce是一个批处理的系统,并不适合实时处理。
针对实时Adhoc查询,Google内部有Dremel,之所以Dremel能在大数据上实现交互性的响应速度,主要使用了两方面的技术:
* 对嵌套结构的嵌套关系型数据采用了全新的列式存储格式;
* 分布式可扩展统计算法,能够在几千台机器上并行计算查询结果。
google的Dremel不支持Join,而其Tenzing支持join,但是基于MapReduce,性能相对逊色。
由于Impala的架构师<NAME>从Google出来 `[1] <http://www.oschina.net/news/34118/cloudera-impala>`_ ,Impala的设计有可能反映了Google内部对Dremel和Tenzing的改进思想.
也因此,Impala不会完全替代Hive,Impala是Hive的一个补充,可以快速的响应一些简单的查询请求。
Impala总体介绍
=========================
Impala是Cloudera开发并开放代码的一个MPP模型的实时海量数据的分布式查询引擎 (当前最新版本1.1).
该查询引擎主要针对“低延时的实时数据查询与分析”应用,支持直接访问HDFS(hive)和HBase中的数据。
Impala采用了和Hive相同的类SQL接口,但并没有采用MapRed框架执行任务,而是采用了类似Dremel的方式,
其实Impala就是自己实现了一个执行引擎。 这个引擎不像MapRed一样是一个通用框架,并且也没
有任何failover和high availability的设计.
Impala的特点:
-------------------
`Cloudera官方博客 <http://blog.cloudera.com/blog/2012/10/cloudera-impala-real-time-queries-in-apache-hadoop-for-real/>`_ 上列举了:
* Thanks to local processing on data nodes, network bottlenecks are avoided.
* A single, open, and unified metadata store can be utilized.
* Costly data format conversion is unnecessary and thus no overhead is incurred.
* All data is immediately query-able, with no delays for ETL.
* All hardware is utilized for Impala queries as well as for MapReduce.
* Only a single machine pool is needed to scale.
另外,补充一些:
* 面向实时查询,结果秒级返回
* 兼容hive的类sql语法,但暂时不支持udf和嵌套结构;
* Impala采用与Hive相同的元数据、SQL语法、ODBC驱动程序和用户接口(Hue Beeswax),这样在使用CDH产品时,批处理和实时查询的平台是统一的
* 支持从hdfs和hbase读取数据
* 编译前端采用Java实现,而运行时用c++实现,并利用了llvm的技术,动态优化执行代码;
* 没有任何failover的设计,也没有对慢节点进行考虑;
* Impala与Dremel相比,支持join,而且支持不同类型的文件格式;
由于在分布式环境中,机器宕机是一种常态,如果一个大任务在执行过程中发生机器故障,重算整个JOB开销很大。
而Impala的设计理念就是,执行速度足够快,快到如果失败了,重新执行一遍的代价也不大。
也因此,Impala无法取代Hive,而且Impala扩展能力不如MapReduce框架。
当前只能**部署300台**左右,机器数目继续增加,慢节点和机器故障带来的问题就会比较突出,影响性能表现。
Impala总体架构
===================
.. image:: ../_static/impala.png
一个SQL执行过程如下:
* client通过thrift接口连接到某个impalad,然后调用query(sql)接口提交Query;
* Impalad接收请求:
* 新建QueryExecState(它包含这个Query执行的上下文信息, 成员包含coordinator等).
* 通过JNI或者Thrift接口(如果使用PlanService),完成Plan的编译;
* 调用QueryExecState的Exec, 新建coordinator, 开始执行Query;
* 返回query handle;客户端调用返回;
* Client得到Query Handle后,调用get_state轮询Query的状态; 当impalad的Coordinator获取到第一个RowBatch之后将状态设置为SUCCESS.
* Client一旦发现query的状态是Success之后将调用Fetch接口获取数据;
StateStore/Impalad
--------------------
在一个运行中的Impala系统中,主要包含两类服务进程:
* statestored 这是一个中心进程,它的主要功能就是维护(impalad)进程的信息。设计应该是支持Query的调度的,不过暂时还不支持;
* impalad 这是一组进程,它们会与数据节点所在的机器混部, 对外提供查询服务.
每个impalad进程同时充当两类角色:查询的主控者 (coordinator) 或者查询的分布式工作单元 (backend).
当客户端发起一个SQL查询时,首先通过thrift连接到任意一个impalad上.
该Impalad负责与client交互,该impalad在执行查询中,将可以分布式执行的计算分发到其他impalad上执行。
此时其他impalad充当工作单元的角色。
交互流程:
* StateStore: 它通过创建多个线程来处理Impalad的注册订阅(Callback).
处理订阅信息是通过thrift接口,提交client请求给相应的Impalad处理;
* Impalad: 在启动时会向StateStore注册StateStoreSubscribe,如果注册不上,会反复注册,直到注册成功后才能提供服务;
为了防止StateStore在挂掉后不致于完全不能工作,Impalad会缓存一份当前可用的impalad列表,
虽然这有可能导致执行计划分配给实际不可用的impalad导致Query执行失败.
目前StateStore还不提供对元数据的更新订阅,因此impalad的Meta缓存可能不是最新,必要时需要reflush下。
而且,目前来看StateStore的这些功能可以完全使用ZK来替代(需要statestore支持Query级的调度)
模块之间的RPC调用
--------------------
* Impalad/Impala Service: Client(默认提供impala-shell实现)调用Thrift接口来完成提交Query,查询结果等.
* Impalad beeswax,提供beeswax接口,默认端口号21000
* Impalad HiveServer2,提供HiveServer2接口,默认端口号21050
* Impalad/Web Serveice: 默认端口25000
提供http服务,可查看impalad当前正在执行的query进度,Profile日志分析,Cancel Query等,方便开发调试;
* Impalad/Backend Service: impalad之间通信的内部thrift服务。默认端口22000
* Impalad/StateStore Subscriber Service: 和StateStore交互的service接口,默认端口23000
* StateStore/StateStore Web Service: 默认端口25010 查看StateStore的状态等
* StateStore/StateStore Service: 默认端口24000, 提供Impalad注册StateStoreSubscribe.
Impala的主要代码
-----------------
Impala的主要代码包含在三个目录中:
* common/thrift:包含了thrift接口与数据类型定义。在Impala中,thrift是通用的数据交换格式,包括进程间通信、Java和C之间的数据交换。
* fe:包含Impala的编译部分:词法分析、语法分析、语义分析、执行计划生成。
* be:包含Impala的服务进程与运行时系统。
<file_sep>*********************
Drill
Columnar storage
An optimized execution engine
Cache-friendly, contiguous memory layout
Supports nested structures as well as schema-less data;
Scales out well;
<file_sep>---
layout: post
title: A few suggestions for HQL
categories:
- Programming
tags:
- hive
- antlr
---
<img width="140" height="140" src="http://ww1.sinaimg.cn/bmiddle/65cc0af7jw1e5wzbyfdkgj20bo0bogm8.jpg"/>
### 1. Simplifying the code
##### Example 1
A user needs to group a table by two columns that may be null, so the values are guarded before aggregating. The HQL looks like this:

    select if(key is null, 0, key), if (uid is null, 0, uid), sum(pv) from src group by if(key is null, 0, key), if (uid is null, 0, uid);

Note that the expressions in group by duplicate the ones in select, so we can rewrite it as:

    select key, uid, sum(pv) from (select if (key is null, 0, key) as key, if(uid is null, 0, uid) as uid, pv from src) src1 group by key, uid;

Since the IF logic is identical it can be extracted into a macro; after a further rewrite:

    create temporary macro normalize(key int) if(key is null, 0, key);
    select key, uid, sum(pv) from (select normalize(key) as key, normalize(uid) as uid, pv from src) src1 group by key, uid;

As you can see, the code is now somewhat cleaner.
##### Example 2
A user
<file_sep>
ssh相关使用
###################
ssh
===================
chmod 600 ~/.ssh/authorized_keys
chmod 755 ~/.ssh
cat ~/.ssh/id_rsa.pub | ssh xx "cat - >>~/.ssh/authorized_keys"
Per-host SSH identity file
==========================
Host 172.31.*
IdentityFile /home/ubuntu/.ssh/particle_dev.pem
Resumable copy with rsync
=========================
alias rscp='rsync --progress --bwlimit=512 -avhe -P -r -e ssh '
Syncing folders with rsync
==========================
Server-to-server sync
To transfer files between servers with rsync, one side must run the rsync daemon, and the daemon needs two config files that declare the user and group it runs as.
That user/group matters when file ownership or permissions have to be changed; otherwise you can run into permission errors.
The config file also declares modules; modules scope the service for security, each module name is user-defined, and a module can require user/password authentication, check client IPs, mark the directory as writable or not, and so on.
vim /etc/rsyncd/rsyncd.conf
############## 内容开始 #############
port = 73
address = 0.0.0.0
uid=nobody
gid=nobody
use chroot=yes
# yes means read-only; set to no to allow clients to push files to this server
read only=no
[bn]
path=/home/
list=false
write only=yes
auth users=azkaban
secrets file=/etc/rsyncd/rsyncd.secrets
hosts allow=192.168.30.0/255.255.255.0
vim /etc/rsyncd/rsyncd.secrets
############## config begins ##############
backup:reoGWh^#fL
# one user per line, in the form username:password
############## config ends ###############
rsync -avl [email protected]:/home/azkaban/sunsc/ /home/azkaban/backup
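For scripted transfers, the same resumable options can be driven from Python. This is only a sketch using the standard subprocess module; the host and paths are the example values shown above, not real endpoints ::

    import subprocess

    def rsync_pull(src, dst, bwlimit_kbps=512):
        """Resumable rsync over ssh, mirroring the 'rscp' alias above:
        --partial keeps partially transferred files so the copy can resume."""
        cmd = ["rsync", "--partial", "--progress", "--bwlimit=%d" % bwlimit_kbps,
               "-avh", "-e", "ssh", src, dst]
        subprocess.check_call(cmd)

    # rsync_pull("[email protected]:/home/azkaban/sunsc/", "/home/azkaban/backup")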
<file_sep>**************************
Distributed query systems
**************************
Execution frameworks
====================
MapReduce
==============
Storm
===================
Batch processing systems
========================
Built on the MapReduce execution framework; typical examples: HIVE, PIG
Real-time processing systems
============================
Impala/Drill ?
<file_sep>__author__ = 'sunshangchun'
#!/usr/bin/python
import json
import sys
import codecs
from os import listdir
from os.path import isfile, join
def file_put_content(content, file_name):
#coding=utf-8
#file_obj = open(file_name, 'w', 'utf-8')
file_obj=codecs.open(file_name, 'w', 'utf-8')
try:
file_obj.write(content)
finally:
file_obj.close()
# print(unicode(file_obj_content,'UTF-8'))
if __name__ == '__main__' :
with open("log.txt") as infile:
for line in infile:
js = json.loads(line)
# dir_path = sys.argv[1]
# files = [ join(dir_path, f) for f in listdir(dir_path) if isfile(join(dir_path, f)) ]
# for file_name in files:
# f = file(file_name)
# js = json.load(f)
# file_content=json.dumps(js, sort_keys=True, indent=4, separators=(',', ': '), ensure_ascii=False)
# #print file_content
# file_put_content(file_content, file_name)<file_sep>copied my public key to authorized_keys but public-key authentication still doesn't work.
Typically this is caused by the file permissions on $HOME, $HOME/.ssh or $HOME/.ssh/authorized_keys being more permissive than sshd allows by default.
In this case, it can be solved by executing the following on the server.
$ chmod go-w $HOME $HOME/.ssh
$ chmod 600 $HOME/.ssh/authorized_keys
$ chown `whoami` $HOME/.ssh/authorized_keys
$ Ctrl-Z -> Stop -> disown
安装thrift
===========
安装boost:
./bootstrap.sh
sudo ./b2 threading=multi address-model=64 variant=release stage install
安装libevent:
安装thrift:
git clone
git checkout tag
http://myitcorner.com/blog/?p=207
autotools默认目录是在/usr/bin,使用brew安装, 默认目录是/brew/bin,
aclocal -I /brew/bin
#autoheader automake autoconf
./bootstrap.sh
./configure --prefix=/Users/sunshangchun/libs/thrift-0.9.1 --without-ruby --without-csharp --without-perl --without-php --without-erlang --with-qt4=no --with-qt5=no --with-boost=/Users/sunshangchun/libs/boost_1_57_0
make CXXFLAGS=-stdlib=libstdc++ or export CXXFLAGS="-std=c++11" && make
export CXXFLAGS="-std=c++11"
make install
export PATH=/Users/sunshangchun/libs/homebrew/Cellar/glib/2.40.0_1/bin:$PATH
https://issues.apache.org/jira/browse/THRIFT-2907
export GFLAGS_INSTALL=/Users/sunshangchun/ClionProjects/naga/thirdparty/gflags-2.0/third-party-install/ && sh bin/build_thirdparty.sh -glog
export PATH=/Users/sunshangchun/libs/homebrew//Cellar/bison/3.0.2/bin:$PATH
export PATH=/Users/sunshangchun/libs/homebrew//Cellar/glib/2.40.0_1/bin:$PATH
export GLIB_LIBS="-L/Users/sunshangchun/libs/homebrew//Cellar/glib/2.40.0_1"
export GMODULE_LIBS="-L/Users/sunshangchun/libs/homebrew//Cellar/glib/2.40.0_1/lib"
export PATH=/usr/gcc4.9.2/bin:$PATH
make CXXFLAGS=-stdlib=libstdc++
设置应用g++的标准库,苹果体系默认应用的是llvm clang的编译器,对应的标准库是libc++, 而GCC 应用的是libstdc++<file_sep>#/usr/bin/python
# -*- coding:utf-8 -*-
import httplib
import json
from optparse import OptionParser
import time
from urlparse import urlparse
import sys
import datetime
tsdb_url = "http://hadoop1-4.yidian.com:4242/api/put"
def str_to_datetime(string, time_format):
return datetime.datetime.strptime(string, time_format)
def str_to_timestamp(str_time, time_format):
return time.mktime(str_to_datetime(str_time, time_format).timetuple())
def current_timestamp():
return long(time.time() * 1000)
def request_get(url, netloc = ""):
if netloc == "":
netloc = urlparse(url).netloc
conn = httplib.HTTPConnection(netloc)
conn.request('GET', url)
counter_data = conn.getresponse().read()
conn.close()
json_data=json.loads(counter_data)
return json_data
#httplib.HTTPConnection.debuglevel = 1
def parse_app_master(location):
parsed_uri = urlparse(location)
app_id = parsed_uri.path.split("/")[2]
host = parsed_uri.netloc
path = "/proxy/" + app_id
conn = httplib.HTTPConnection(host)
conn.request('GET', path)
job_id = ""
app_master = ""
for item in conn.getresponse().getheaders():
if item[0]=='location':
parsed_uri = urlparse(item[1])
#app_master = '{uri.scheme}://{uri.netloc}'.format(uri=parsed_uri)
app_master = parsed_uri.netloc
job_id = parsed_uri.path.split("/")[-2]
conn.close()
return (app_master, job_id)
def request_counter(app_master, job_id, track_url, is_history):
counter_url = ("/ws/v1/history/mapreduce/jobs/%s/counters" % job_id) \
if is_history else (track_url + "/ws/v1/mapreduce/jobs/%s/counters" % job_id)
print "counter url:" + app_master + counter_url
return request_get(counter_url, app_master)
def post_metrics(url, points):
headers = {"Content-type":"application/json"}
parsed_url = urlparse(url)
conn = httplib.HTTPConnection(parsed_url.netloc)
params = json.dumps(points)
print("post " + params)
conn.request('POST', parsed_url.path, params, headers)
response = conn.getresponse()
data = response.read()
print data
def trans_counter(json_data, timestamp, config, tags):
counter_group = json_data["jobCounters"]["counterGroup"]
metric_points = {}
for counters in counter_group:
key = counters["counterGroupName"]
if config.has_key(key):
values = {}
for counter in counters["counter"]:
values[counter["name"]] = long(counter["totalCounterValue"])
metric_names = config[key]
for name in metric_names:
if values.has_key(name):
metric_point = {}
metric_point["metric"] = name
metric_point["value"] = values[name]
metric_point["tags"] = tags
metric_point["timestamp"] = timestamp
metric_points[metric_point["metric"]] = metric_point
return metric_points
def test_post():
metric_points = []
timestamp = current_timestamp()
metric_point = {"metric":"test_mapreduce", "timestamp":timestamp, "value":100 }
metric_point["tags"] = {"type":"mr"}
metric_points.append(metric_point)
post_metrics(tsdb_url, metric_points)
if __name__ == "__main__":
usage = "usage: %prog [options] arg"
parser = OptionParser(usage)
parser.add_option("-m", "--metric", dest="metric", help="metric namespace")
parser.add_option("-u", "--track_url", dest="track_url", help="hadoop tracking url")
parser.add_option("-c", "--config", dest="config", help="config filename, config in json format")
parser.add_option("-d", "--date", dest="date")
parser.add_option("-t", "--tags", dest="tags", default="type=mr", help="metric tags")
(options, args) = parser.parse_args()
metric = options.metric
track_url = options.track_url
timestamp = long(str_to_timestamp(options.date, "%Y-%m-%d %H:%M:%S") * 1000)
tags = dict(item.split("=") for item in options.tags.split(","))
# { 'TaskCounter': ['REDUCE_INPUT_GROUPS',' MAP_INPUT_RECORDS' ], 'JobCounter':[''] }
print("metric:%s, track_url:%s, date:%s, tags:%s" % (metric, track_url, options.date, options.tags))
(app_master, job_id) = parse_app_master(track_url)
url = track_url + "/"
is_history = False if app_master == "" else True
if app_master == "": # if running
app_master = urlparse(track_url).netloc
job_url = track_url + "/ws/v1/mapreduce/jobs"
jobs = request_get(job_url)
print jobs["jobs"]["job"]
for job in jobs["jobs"]["job"]:
job_id = job["id"]
break
json_data = request_counter(app_master, job_id, track_url, is_history)
if options.config:
config = json.load(options.config)
else:
config = {
"org.apache.hadoop.mapreduce.TaskCounter": [
"REDUCE_INPUT_GROUPS",
"REDUCE_OUTPUT_RECORDS",
"REDUCE_INPUT_GROUPS",
"MAP_INPUT_RECORDS",
"MAP_OUTPUT_RECORDS"
]
}
metric_points = trans_counter(json_data, timestamp, config, tags)
counter_group = []
mrec = metric_points["MAP_OUTPUT_RECORDS"] if metric_points.has_key("MAP_OUTPUT_RECORDS") else None
rrec = metric_points["REDUCE_OUTPUT_RECORDS"] if metric_points.has_key("REDUCE_OUTPUT_RECORDS") else None
rinc = metric_points["REDUCE_INPUT_GROUPS"] if metric_points.has_key("REDUCE_INPUT_GROUPS") else None
p = None
if (rrec != None and mrec != None):
p = rrec
elif rinc != None:
p = rinc
elif mrec != None:
p = mrec
if p != None:
p["metric"] = metric + "." + "output_records"
counter_group.append(p)
p = metric_points["HDFS_BYTES_WRITTEN"]
if p != None:
p["metric"] = metric + "." + "output_bytes"
counter_group.append(p)
p = metric_points["MAP_INPUT_RECORDS"]
if p != None:
p["metric"] = metric + "." + "input_records"
counter_group.append(p)
p = metric_points["HDFS_BYTES_READ"]
if p != None:
p["metric"] = metric + "." + "input_bytes"
counter_group.append(p)
post_metrics(tsdb_url, counter_group)
<file_sep>打造Rst的GitHub博客
=============================
:date: 2013-08-17
之前的博客使用markdown写过几篇,不过最近发现rst更加方便,因此上午折腾了一把,打算改用rst来记录博客。
我先是搜索了一把,找到 `这篇博客 <http://blog.xlarrakoetxea.org/posts/2012/10/creating-a-blog-with-pelican/>`_
感觉挺好的。于是,我也打算用pelican来构建,可惜折腾了一个上午,始终没有达到预期效果。只好放弃,改用rst+sphinx来弄。
默认的主题我不太喜欢,搜索了几个,最后决定使用 `armstrong <https://github.com/armstrong/armstrong_sphinx/>`_ 。(bootstrap也不错的)
最后,我在layout.html中加上disqus的链接,虽然看起来有些唐突,好歹算是博客的功能都有了!
写博客的流程如下 ::
#先在github上建库,为了方便,我是建立了两个库blog.git和joyyoj.github.com
$ git clone [email protected]:user/blog.git
$ cd ./blog
#master分支下,提交改动
$ git add .
$ git commit -m "First post"
$ git push origin master
$ make html && cp _build/html/* ../joyyoj.github.com
$ cd ../joyyoj.github.com
$ git add .
$ touch .nojekyll 文件
$ git commit -m "Publish hello world post"
#发布到博客上
$ git push origin master
git 其他常用的操作
Removing untracked files from your git working copy
If you want to also remove directories, run git clean -f -d
If you just want to remove ignored files, run git clean -f -X
If you want to remove ignored as well as non-ignored files, run git clean -f -x
touch .nojekyll 原因可以参考 `这里 <https://help.github.com/articles/using-jekyll-with-pages/>`_
帮助链接
----------
`reStructuredText入门 <http://www.pythondoc.com/sphinx/rest.html>`_
`sphinx使用文档 <http://docs.kissyui.com/1.3/docs/html/tutorials/tools/use-sphinx.html>`_
http://hadoop2-13.lg-4-e10.yidian.com:8088/proxy/application_1425412783697_53356/ws/v1/mapreduce/jobs/job_1425412783697_53356/counters
http://hadoop2-13.lg-4-e10.yidian.com:8088/proxy/ws/v1/history/mapreduce/jobs/job_1425412783697_53356/counters
http://hadoop2-2.lg-4-e9.yidian.com:19888/ws/v1/history/mapreduce/jobs/job_1425412783697_53356/counters<file_sep>#!/usr/bin/python
import json
import sys
import codecs
import commands
from os import listdir
from os.path import isfile, join
import re
def exec_shell(cmd):
output = commands.getstatusoutput(cmd)
return (output[0], output[1].split("\n"))
def file_put_content(content, file_name):
file_obj = codecs.open(file_name, 'w', 'utf-8')
try:
file_obj.write(content)
finally:
file_obj.close()
def match(str):
prog = re.compile("^(?!sc\d|c\d).*$")
m = prog.search(str)
if m:
i = m.group(0)
print(i)
if __name__ == '__main__' :
match("kc76")
# dir_path = sys.argv[1]
# files = [ join(dir_path, f) for f in listdir(dir_path) if isfile(join(dir_path, f)) ]
# for file_name in files:
# f = file(file_name)
# js = json.load(f)
# file_content=json.dumps(js, sort_keys=True, indent=4, separators=(',', ': '), ensure_ascii=False)
# #print file_content
# file_put_content(file_content, file_name)
<file_sep>**************************
Kylin Architecture
**************************
Known issues:
Kylin only supports SequenceFile and TextFile;
Kylin dimension tables cannot be views, because "desc extended" on a view has no location path;
Kylin can only use the default database, because "desc extended" misbehaves otherwise;
A dimension table must consist of a single file, otherwise step 3 fails once there are more than 12 files; also, only a few Hive field delimiters are supported, e.g. \t but not \001
CubeSegment: the data for one time range;
dict:
lookup:
OLAP
################
The goal of OLAP is to support decision making and ad-hoc queries/reports over a multidimensional view of the data; its core technical concept is the "dimension".
A dimension is an angle from which people observe the world, a high-level way of categorizing things.
Dimensions usually carry hierarchies, which can be quite complex. By modelling several important attributes of an entity as dimensions, users can compare data across them.
In that sense OLAP can be seen as a family of multidimensional analysis tools.
The basic multidimensional operations of OLAP are drilling (roll up and drill down), slice and dice, pivot, drill across, drill through, and so on.
Drilling
========
Drilling changes the level within a dimension's hierarchy, i.e. the granularity of the analysis. It covers both roll up and drill down.
Roll up summarizes low-level detail data into higher-level aggregates along a dimension, or removes a dimension altogether;
drill down goes the other way, moving from aggregates to the underlying detail, or adding a new dimension.
Slicing and dicing fix values on some dimensions and look at how the measures are distributed over the remaining ones: with two remaining dimensions it is a slice, with three it is a dice.
Pivoting (rotation) changes the orientation of the dimensions, i.e. rearranges how they are laid out in the table (for example swapping rows and columns).
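These operations can be sketched on a toy data cube with pandas (illustration only; the example data and column names are made up):

.. code-block:: python

    import pandas as pd

    # toy cube: three dimensions (region, month, product) and one measure (sales)
    sales = pd.DataFrame({
        "region":  ["North", "North", "South", "South"],
        "month":   ["2024-01", "2024-02", "2024-01", "2024-02"],
        "product": ["A", "B", "A", "B"],
        "sales":   [100, 120, 90, 150],
    })

    # roll up: aggregate the month dimension away
    rollup = sales.groupby(["region", "product"])["sales"].sum()

    # slice: fix one dimension (month = 2024-01) and look at the rest
    slice_jan = sales[sales["month"] == "2024-01"]

    # pivot (rotate): rearrange dimensions into rows and columns
    pivot = sales.pivot_table(index="region", columns="month",
                              values="sales", aggfunc="sum")
    print(rollup, slice_jan, pivot, sep="\n\n")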
OLAP can be implemented in several ways; by how the data is stored it is usually classified as ROLAP, MOLAP or HOLAP.
ROLAP (Relational OLAP) builds on a relational database: multidimensional data is represented and stored in relational structures.
ROLAP splits the multidimensional model into two kinds of tables:
fact tables, which store the measures plus the dimension keys;
and dimension tables, at least one per dimension, which describe the dimension's hierarchy, member categories and other attributes.
Dimension tables and fact tables are linked through primary and foreign keys, forming the "star schema".
For dimensions with complex hierarchies, several tables can be used to avoid wasting space on redundant data; this extension of the star schema is called the "snowflake schema".
MOLAP (Multidimensional OLAP) organizes the data multidimensionally at its core, i.e. it stores the data in multidimensional arrays.
The stored data forms "cubes", and rotating, dicing and slicing those cubes is the main technique for producing multidimensional reports in MOLAP.
HOLAP (Hybrid OLAP) mixes the two, for example relational storage at the lower levels and multidimensional matrices at the higher levels, which is more flexible.
There are other ways to implement OLAP as well, such as a dedicated SQL server that adds special SQL support for certain storage layouts (star, snowflake).
OLAP tools provide online data access and analysis targeted at specific questions: they analyze, query and report on data in a multidimensional way. A dimension is a particular angle from which people look at the data; for example, when a company examines product sales it usually looks at them by time, region and product, and those are the dimensions. The combinations of these dimensions together with the measures of interest form the multidimensional arrays that OLAP analysis is based on, formally (dim1, dim2, ..., dimN, measures), e.g. (region, time, product, sales). Multidimensional analysis means applying operations such as slice, dice, drill (drill-down and roll-up) and pivot to data organized multidimensionally, so that users can examine the data from many angles and really understand the information it contains. By how the consolidated data is organized, the common variants are MOLAP, based on multidimensional databases, and ROLAP, based on relational databases: MOLAP organizes and stores the data multidimensionally, while ROLAP simulates multidimensional data with existing relational technology. In data-warehouse deployments OLAP is usually the front-end tool, and it can be combined with data-mining and statistical tools to strengthen decision analysis.
根据综合性数据的组织方式的不同,目前常见的OLAP主要有基于多维数据库的MOLAP及基于关系数据库的ROLAP两种。MOLAP是以多维的方式组织和存储数据,ROLAP则利用现有的关系数据库技术来模拟多维数据。在数据仓库应用中,OLAP应用一般是数据仓库应用的前端工具,同时OLAP工具还可以同数据挖掘工具、统计分析工具配合使用,增强决策分析功能。<file_sep>查询
Finding which Java thread is using the most CPU on Linux
1. First locate the process with high CPU usage:
top
2. List that process's threads sorted by CPU usage:
ps -p 56651 -L -o pcpu,pid,tid,time,tname,stat,psr | sort -n -k1 -r | head -25
Sample output:
12.3 56651 56675 00:01:45 ? Sl 4
12.3 56651 56669 00:01:45 ? Sl 12
12.3 56651 56667 00:01:46 ? Sl 4
12.3 56651 56666 00:01:46 ? Sl 4
12.3 56651 56661 00:01:45 ? Sl 5
The third column (tid) of the hottest row is the problematic thread's id; jstack reports it in hex as nid.
3. Dump the stacks with jstack and grep for that thread (tid converted to hex):
jstack 56651 | grep -10 `printf "%x" 56675`
"Gang worker#11 (Parallel GC Threads)" prio=10 tid=0x00007f5b9c034000 nid=0xdd5e runnable
"Gang worker#12 (Parallel GC Threads)" prio=10 tid=0x00007f5b9c036000 nid=0xdd5f runnable
"Gang worker#13 (Parallel GC Threads)" prio=10 tid=0x00007f5b9c038000 nid=0xdd60 runnable
"Gang worker#14 (Parallel GC Threads)" prio=10 tid=0x00007f5b9c039800 nid=0xdd61 runnable
Checking GC / memory with jstat:
Sample every 2 seconds:
jstat -gcutil 27274 2000
S0 S1 E O P YGC YGCT FGC FGCT GCT
0.00 43.83 69.13 53.52 99.84 35 1.879 0 0.000 1.8
View the overall JVM heap status:
jmap -heap [pid]
Note that with the CMS GC, running jmap -heap can hang the Java process.
View detailed per-class object usage in the heap:
jmap -histo [pid]
Dump the entire JVM heap to a file:
jmap -dump:format=b,file=<filename> [pid]
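For repeated use, the ps/jstack steps above can be strung together with a small helper. This is a rough sketch; it assumes a Linux ps and a jstack matching the target JVM on PATH:

import subprocess

def hottest_thread_stacks(pid, top_n=1):
    """Find the busiest threads of a Java process and print the matching
    jstack sections (jstack's nid is the tid printed in hex)."""
    ps_out = subprocess.check_output(
        ["ps", "-p", str(pid), "-L", "-o", "pcpu,tid"]).decode()
    rows = [line.split() for line in ps_out.splitlines()[1:] if line.strip()]
    rows.sort(key=lambda r: float(r[0]), reverse=True)
    jstack_out = subprocess.check_output(["jstack", str(pid)]).decode()
    for pcpu, tid in rows[:top_n]:
        nid = hex(int(tid))                      # e.g. 56675 -> 0xdd63
        for section in jstack_out.split("\n\n"):
            if "nid=" + nid in section:
                print("%s%% CPU, tid=%s (%s)\n%s\n" % (pcpu, tid, nid, section))

# hottest_thread_stacks(56651)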
Steps needed on the gateway machine:
1. Replace hive's jar with logging-assembly.jar; add the jar in .hiverc with "add jar xx;"
2. Change the pig setting: pig.load.default.statements=$PIGHOME/.pigbootup
3. /etc/azkaban/ holds some required scripts
4. Restart azkaban's executor, webserver, etc.
5. Restart the profile / debugger tools
6. export PIG_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hive/lib/logging-assembly-0.1.0.jar
How to use HCatalog: https://cwiki.apache.org/confluence/display/Hive/HCatalog+LoadStore
<file_sep>
if __name__ == "__main__":
cmd = "hadoop fs -ls %s | awk -F 'delta/' '{print $2}' | awk -F '/' '{print $1}'" % root
(ret, msg) = exec_shell(cmd)<file_sep># -*- coding: utf-8 -*-
AUTHOR = 'xxxxxxxx xxxxxxxxxxxxx'
SITENAME = "xxxxxxxxxxxx"
SITEURL = 'http://xxxxxxxxxxxxxxxxx'
TIMEZONE = "Europe/Madrid"
GITHUB_URL = 'http://github.com/XXXXX'
DISQUS_SITENAME = "XXXXXXXXXX"
EMAIL = "<EMAIL>"
PDF_GENERATOR = False
REVERSE_CATEGORY_ORDER = True
LOCALE = 'en_US'
DEFAULT_PAGINATION = 5
THEME = "iris"
FEED_RSS = 'feeds/all.rss.xml'
CATEGORY_FEED_RSS = 'feeds/%s.rss.xml'
LINKS = (('XXXXX XXXX', 'http://YYYYYYYYYYY.ZZZ'),)
SOCIAL = (('twitter', 'http://twitter.com/XXXXXX'),
('linkedin', 'http://www.linkedin.com/in/XXXXXXX'),
('github', GITHUB_URL),)
OUTPUT_PATH = 'output'
PATH = 'src'
ARTICLE_URL = "posts/{date:%Y}/{date:%m}/{slug}/"
ARTICLE_SAVE_AS = "posts/{date:%Y}/{date:%m}/{slug}/index.html"
GOSQUARED_SITENAME = "XXX-YYYYYY-X"
# global metadata to all the contents
#DEFAULT_METADATA = (('yeah', 'it is'),)
# static paths will be copied under the same name
STATIC_PATHS = ["images", ]
# A list of files to copy from the source to the destination
#FILES_TO_COPY = (('extra/robots.txt', 'robots.txt'),)
<file_sep>
vim支持汉字
-------------------
::
./configure --with-features=big
./configure --prefix=/usr --enable-multibyte
vim --version
looking for +multi_byte. If it says -multi_byte it will not work.
安装bundle
-----------
::
git clone https://github.com/gmarik/vundle.git ~/.vim/bundle/vundle
<file_sep>
编译GDB ::
LDFLAGS=-L<path to libpython2.7> ./configure --prefix=/home/users/sunshangchun/lib/gdb --with-python=/home/tools/tools/python/2.7.2/64/bin/
make
make install
http://sourceware.org/gdb/wiki/STLSupport
Check-out the latest Python libstdc++ printers to a place on your machine.
In a local directory, do ::
svn co svn://gcc.gnu.org/svn/gcc/trunk/libstdc++-v3/python
Add the following to your ~/.gdbinit. The path needs to match where the python module above was checked-out.
So if checked out to: /home/maude/gdb_printers/, the path would be as written in the example ::
python
import sys
sys.path.insert(0, '/home/users/sunshangchun/gdb_printers/python')
from libstdcxx.v6.printers import register_libstdcxx_printers
register_libstdcxx_printers (None)
end
下载gdb 按照 pretty print
如果没有打开core文件, ulimit -c unlimited<file_sep>***************************************
HIVE的数据类型/数据模型/对象模型
***************************************
[HIVE API的JavaDoc](http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.1/ds_Hive/api/index.html)
数据模型
===================
HIVE有四种数据模型(表):(内部)表、外部表、分区表和桶表。下面把thrift的重要接口定义罗列说明一下。
Database
------------------
Database是Table的命名空间,包含一组Table ::
struct Database {
1: string name,
2: string description,
3: string locationUri,
4: map<string, string> parameters, // properties associated with the database
5: optional PrincipalPrivilegeSet privileges
}
Table
-----------------------
::
// table information
struct Table {
1: string tableName,// name of the table
2: string dbName, // database name ('default')
3: string owner,// owner of this table
4: i32 createTime, // creation time of the table
5: i32 lastAccessTime, // last access time (usually this will be filled from HDFS and shouldn't be relied on)
6: i32 retention,// retention time
7: StorageDescriptor sd,// storage descriptor of the table
8: list<FieldSchema> partitionKeys, // partition keys of the table. only primitive types are supported
9: map<string, string> parameters, // to store comments or any other user level parameters
10: string viewOriginalText, // original view text, null for non-view
11: string viewExpandedText, // expanded view text, null for non-view
12: string tableType, // table type enum, e.g. EXTERNAL_TABLE
13: optional PrincipalPrivilegeSet privileges,
}
Partition
--------------
::
struct Partition {
1: list<string> values // string value is converted to appropriate partition key type
2: string dbName,
3: string tableName,
4: i32 createTime,
5: i32 lastAccessTime,
6: StorageDescriptor sd,
7: map<string, string> parameters,
8: optional PrincipalPrivilegeSet privileges
}
Index
---------------
::
struct Index {
1: string indexName, // unique with in the whole database namespace
2: string indexHandlerClass, // reserved
3: string dbName,
4: string origTableName,
5: i32 createTime,
6: i32 lastAccessTime,
7: string indexTableName,
8: StorageDescriptor sd,
9: map<string, string> parameters,
10: bool deferredRebuild
}
StorageDescriptor
---------------------
描述一张表的物理存储 ::
struct StorageDescriptor {
1: list<FieldSchema> cols, // required (refer to types defined above)
2: string location, // defaults to <warehouse loc>/<db loc>/tablename
3: string inputFormat, // SequenceFileInputFormat (binary) or TextInputFormat` or custom format
4: string outputFormat, // SequenceFileOutputFormat (binary) or IgnoreKeyTextOutputFormat or custom format
5: bool compressed, // compressed or not
6: i32numBuckets, // this must be specified if there are any dimension columns
7: SerDeInfoserdeInfo, // serialization and deserialization information
8: list<string> bucketCols, // reducer grouping columns and clustering columns and bucketing columns`
9: list<Order> sortCols, // sort order of the data in each bucket
10: map<string, string> parameters, // any user supplied key value hash
11: optional SkewedInfo skewedInfo, // skewed information
12: optional bool storedAsSubDirectories // stored as subdirectories or not
}
FieldSchema
-----------------
::
struct FieldSchema {
1: string name, // name of the field
2: string type,
3: string comment
}
SerdeInfo
----------------
// This object holds the information needed by SerDes ::
struct SerDeInfo {
1: string name, // name of the serde, table name by default
2: string serializationLib, // usually the class that implements the extractor & loader
3: map<string, string> parameters // initialization parameters
}
Column支持哪些数据类型
=======================
Hive使用TypeInfo描述数据类型,目前支持5种类型.
1. Primitive
2. List
3. Map
4. Struct
5. Union
具体的定义如下(只列出get接口)
.. code-block:: java
public abstract class TypeInfo implements Serializable {
public abstract Category getCategory();
public abstract String getTypeName();
}
public final class PrimitiveTypeInfo extends TypeInfo implements Serializable {
public PrimitiveCategory getPrimitiveCategory();
public Class<?> getPrimitiveWritableClass();
public Class<?> getPrimitiveJavaClass();
}
public final class MapTypeInfo extends TypeInfo implements Serializable {
public TypeInfo getMapKeyTypeInfo();
public TypeInfo getMapValueTypeInfo();
}
public final class ListTypeInfo extends TypeInfo implements Serializable {
public TypeInfo getListElementTypeInfo();
}
public final class StructTypeInfo extends TypeInfo implements Serializable {
public ArrayList<String> getAllStructFieldNames();
public ArrayList<TypeInfo> getAllStructFieldTypeInfos();
public TypeInfo getStructFieldTypeInfo(String field);
}
public class UnionTypeInfo extends TypeInfo implements Serializable {
public List<TypeInfo> getAllUnionObjectTypeInfos();
}
union类型稍微提一下,union与c的union相似,即在同一时间点,只能有一个指定的数据类型 ::
CREATE TABLE union_test(value UNIONTYPE<int, array<string>, struct<a:int,b:string>, map<string, string>>);
SELECT value FROM union_test;
{0:1}
{1:["val1","val2"]}
{2:{"a":1,"b":"val"}}
{3:{"key":"value","x":"y"}}
对于这些复合类型,有相应的UDF来创建,包括:array,map,struct,named_struct,create_union。
复杂数据类型的支持
=====================
从数据类型可以看出,hive支持嵌套struct、array、map等复杂结构。那么HIVE是如何实现的?
例如:`select a.b, key[0] from src; select a+b from src group by a+b;...`
类型检查
----------------
TypeCheck
::
// select a+b, count(1) from T group by a+b;
// a+b在group by时,处理过,因此RowResolver时已经包含a+b
processGroupby
if (.) {
Column.Const ExprFieldDesc
ColumnInfo info = get(tableAlias);
}
如何生成相应的ExprNodeDesc?
2 => ExprConstantDesc
a => ExprColumnDesc
a.b => ExprFieldDesc
a[0] => ExprGenericFuncDesc
**问题**
1. 语法解析,构建AST?
2. 语义分析中,如何处理[]?
3. 语义分析中,如何.?与dbName.tableName.columnName与嵌套结构的struct如何区分?
4. 运行时如何获取这些类型信息?
问题4先解答:
编译OK之后,进行运行,那么如何传递TypeInfo信息给Task?
HIVE内部有个自己的TypeInfoParser来解析TYPE字符串;
serdeParams.columnTypes -> columnType -> columTypes从FieldSchema中来。
对于中间结果,它的FieldSchema又如何生成?
PlanUtils.getFieldSchemasFromRowSchema(parent.getSchema(), "temporarycol")
那么关键的问题在于RowSchema如何生成?
每个Operator都有一个RowSchema;RS的RowSchema
显然,如果是select算子,则找出outputCols的则它的RowSchema?
如何知道OutputCols有哪些?
Parser
--------------
首先,为了支持上述的语法解析,Hive.g的部分定义如下:
precedenceFieldExpression
:
atomExpression ((LSQUARE^ expression RSQUARE!) | (DOT^ Identifier))*
;
atomExpression
:
KW_NULL -> TOK_NULL
| constant
| function
| castExpression
| caseExpression
| whenExpression
| tableOrColumn
| LPAREN! expression RPAREN!
;
类型检查、转换
================
HIVE引入几个数据结构。
FieldSchema
ColumnInfo
RowSchema
RowResolver
ColumnInfo
.. code-block:: java
class ColumnInfo {
String internalName;
ObjectInspector objectInspector;
String tabAlias;
boolean isVirtualCol;
boolean isHiddenVirtualCol;
}
class RowSchema implements Serializable {
ArrayList<ColumnInfo> signature;
}
在类型检查时,TypeCheckProcFactory提供了几个Processor来处理:
NullExprProcessor;
NumberExprProcessor;
StringExprProcessor;
BoolExprProcessor;
ColumnExprProcessor;
DefaultExprProcessor;//处理function
**问题**
1. 到此为止,我们知道给定输入的RowSchema,得到相应的ExprNodeDesc。例如select算子,我们知道要选择字段的类型,但是对于Operator之间如何传递?
1. RowSchema和RowResolver什么关系?
RowSchema可以从RowResolver中生成。RowResolver维护了tableAlias,columnAlias,ColumnInfo
internalName -> alias的关系维护等;
在算子层:操作符之间的出入schema由RowResolver完成传递;
RowResolver如何维护?显然从genTablePlan时,获取得到Table对应的ObjectInspector(此时必然是StructObjectInspector),获取每一列的类型信息;
【不妥之处:从SerdeParams中获取】
getOpParseCtx
Operator的ParseContext中包含了每个Operator与RowResolver的映射关系;
如果是FileSinkOperator、CommonJoinOperator?
public ColumnInfo(String internalName, TypeInfo type, String tabAlias, boolean isVirtualCol)
事实上HIVE不支持dbname.tablename.columnName,即只支持columnOrXPath, tableName.columnOrXPath;
在函数层:操作符之间的出入的schema由ObjectInspector;
如何检查函数的参数是否正确?由UDF来保证,UDF能够返回ObjectInspector;
描述符:ExprNodeDesc
HIVE支持数据类型的隐式转换。如:
insert into table string_table
select i32 as key from src;
对象模型
####################
ObjectInspector
======================
HIVE的运行时支持多种对象模型,ObjectInspector的引入的主要意义在于提供了对Object的统一访问方式,换句话说,它屏蔽了不同对象的存储细节。目前有两种对象模型的实现:
1. Java的对象模型(Thrift或者原生Java)
2. Hadoop的对象模型(Writable)
.. code-block:: java
public interface ObjectInspector {
public static enum Category {
PRIMITIVE, LIST, MAP, STRUCT, UNION
};
String getTypeName();
Category getCategory();
}
PrimitiveObjectInspector
=========================
.. code-block:: java
public interface PrimitiveObjectInspector extends ObjectInspector {
public static enum PrimitiveCategory {
VOID, BOOLEAN, BYTE, SHORT, INT, LONG, FLOAT, DOUBLE, STRING, TIMESTAMP, BINARY, UNKNOWN
};
PrimitiveCategory getPrimitiveCategory();
/**
* Get the Primitive Writable class which is the return type of
* getPrimitiveWritableObject() and copyToPrimitiveWritableObject().
*/
Class<?> getPrimitiveWritableClass();
/**
* Return the data in an instance of primitive writable Object. If the Object
* is already a primitive writable Object, just return o.
*/
Object getPrimitiveWritableObject(Object o);
/**
* Get the Java Primitive class which is the return type of
* getJavaPrimitiveObject().
*/
Class<?> getJavaPrimitiveClass();
/**
* Get the Java Primitive object.
*/
Object getPrimitiveJavaObject(Object o);
/**
* Get a copy of the Object in the same class, so the return value can be
* stored independently of the parameter.
*
* If the Object is a Primitive Java Object, we just return the parameter
* since Primitive Java Object is immutable.
*/
Object copyObject(Object o);
/**
* Whether the ObjectInspector prefers to return a Primitive Writable Object
* instead of a Primitive Java Object. This can be useful for determining the
* most efficient way to getting data out of the Object.
*/
boolean preferWritable();
}
StructObjectInspector
.. code-block:: java
public abstract class StructObjectInspector implements ObjectInspector {
//Returns all the fields.
public abstract List<? extends StructField> getAllStructFieldRefs();
//Look up a field.
public abstract StructField getStructFieldRef(String fieldName);
public abstract Object getStructFieldData(Object data, StructField fieldRef);
public abstract List<Object> getStructFieldsDataAsList(Object data);
}
public interface StructField {
//Get the name of the field. The name should be always in lower-case.
String getFieldName();
//Get the ObjectInspector for the field.
ObjectInspector getFieldObjectInspector();
String getFieldComment();
}
MapObjectInspector
.. code-block:: java
public interface MapObjectInspector extends ObjectInspector {
ObjectInspector getMapKeyObjectInspector();
ObjectInspector getMapValueObjectInspector();
Object getMapValueElement(Object data, Object key);
Map<?, ?> getMap(Object data);
int getMapSize(Object data);
}
ListObjectInspector
.. code-block:: java
public interface ListObjectInspector extends ObjectInspector {
// Methods that does not need a data object **
ObjectInspector getListElementObjectInspector();
Object getListElement(Object data, int index);
int getListLength(Object data);
List<?> getList(Object data);
}
运行时如何使用ObjectInspector
==================================
.. code-block:: java
//反序列化得到行
Object row = serDe.deserialize(t);
StructObjectInspector oi = (StructObjectInspector) serDe
.getObjectInspector();
List<? extends StructField> fieldRefs = oi.getAllStructFieldRefs();
//获取每一列的信息
for (int i = 0; i < fieldRefs.size(); i++) {
Object fieldData = oi.getStructFieldData(row, fieldRefs.get(i));
...
}
从这个例子中,不难出,Hive将对行中列的读取和行的存储方式解耦和了,只有ObjectInspector清楚行的结构,但使用者并不知道存储的细节。 对于数据的使用者来说,只需要行的Object和相应的ObjectInspector,就能读取出每一列的对象。
ObjectInspector有什么好处
------------------------------
除了上文说的,ObjectInspector提供对象的统一访问接口,还负责获取TypeInfo。另外,可以实现惰性反序列化。
编译时也用了ObjectInspector
-----------------------------
HIVE支持用户自己配置serde lib,例如protobuf。列信息在serde的class中 ::
static List<FieldSchema> getFieldsFromDeserializer(String tableName, Deserializer deserializer) {
获取ObjectInspector;
根据ObjectInspector的类型获取Field的OI;
返回FieldSchema;
}
ObjectInspector的信息:
TypeInfo getTypeInfoFromObjectInspector();
genTablePlan
================
tab.getDeserializer().getObjectInspector();
个人感觉ObjectInspector带来的编码复杂度很高,得不偿失, 感兴趣的可以关注向量化的内存布局。
写的比较粗糙,回头补充,待续...
<file_sep>#!/usr/bin/python
import commands
import time
import datetime
import sys
def datetime_to_str(dt, time_format):
return dt.strftime(time_format)
def str_to_datetime(string, time_format):
return datetime.datetime.strptime(string, time_format)
def str_to_timestamp(str_time, time_format):
return time.mktime(str_to_datetime(str_time, time_format).timetuple())
def current_timestamp():
return long(time.time() * 1000)
def timestamp_to_str(timestamp, time_format):
return time.strftime(time_format, time.localtime(timestamp))
def datetime_to_timestamp(date_time):
return time.mktime(date_time.timetuple())
def exec_shell(cmd):
output = commands.getstatusoutput(cmd)
return (output[0], output[1].split("\n"))
if __name__ == '__main__' :
date = sys.argv[1]
duration = int(sys.argv[2])
root="hdfs://nameservice1/data/userProfile/neo_str_new/delta/";
time_format='%Y-%m-%d/%H:%M:%S'
s = int(str_to_timestamp(date, time_format) + 1)
end_time = s
start_time = s - duration * 3600
non_exists_count = 0
parts=set()
cmd = "hadoop fs -ls %s | awk -F 'delta/' '{print $2}' | awk -F '/' '{print $1}'" % root
(ret, msg) = exec_shell(cmd)
if ret == 0:
for line in msg:
if line.strip() != '':
parts.add(line)
else:
print("hadoop fs -ls failed")
sys.exit(1)
non_exists = 0
for tt in range(start_time, end_time, 3600):
version = timestamp_to_str(tt, '%Y%m%d%H')
path = str(root) + version
if version in parts:
print path
else:
non_exists += 1
if non_exists > 5:
print("non exists exceed max allowed %d" % non_exists)
sys.exit(1)
<file_sep>Impala常见问题总结
=======================
GDB shows a flood of segmentation faults
----------------------------------------
The impala be uses JNI to call into the fe modules. When debugging the be with gdb, the JVM's garbage collector relies on certain signals, and if gdb is not told how to handle them it may simply exit; in practice the JVM triggers a large number of segmentation faults. Before running inside gdb, add ::

    gdb> handle SIGSEGV nostop noprint pass

Similar gdb settings include ::
handle SIGSEGV pass noprint nostop
handle SIGUSR1 pass noprint nostop
handle SIGUSR2 pass noprint nostop
.. gdb bin/start-impalad.sh -use_statestore=false
.. gdb -q impalad
.. set args -use_statestore=false -nn=hadoop-01.localdomain -nn_port=8030
<file_sep>
if __name__ == "__main__":
filename = "/Users/sunshangchun/data/diff/0AyUH7Vg"
prev=""
sum=0
with open(filename, "r") as lines:
for line in lines:
m=filter(None, line.strip().split(" "))
# s=0
# m=filter(None, line.rstrip("\n").split(" "))
# if m[1] != prev:
# print "%s %s" % (prev, s)
# s=int(m[2])
# else:
# s+=int(m[2])
# sum+=int(m[2])
# prev = m[1]
print sum
<file_sep>*************************
SQL Join Algorithms
*************************
:date: 2013-08-22
The three commonly used join types.
Nested-loop join
==============================
Tuple-based nested-loop join
----------------------------
R(X,Y) :math:`\bowtie` S(Y, Z)
Pseudocode::

    For each tuple r in R:
        For each tuple s in S:
            If r and s join into a tuple t:
                emit t

Without any buffering this algorithm needs as many as T(R)*T(S) disk I/Os.
In many cases, however, it can be optimized to cost far less.
Optimization 1: block-based nested-loop join
--------------------------------------------
Idea: use an in-memory buffer to cut down on disk reads.
In the tuple-based scan above, every tuple of the inner relation is read T(R) times, while each tuple of the outer relation is read only once,
so it is natural to buffer S.
Assume a block holds N tuples, so R has T(R)/N = B(R) blocks and S has T(S)/N = B(S) blocks;
assume memory holds M blocks, and that M < B(S) <= B(R).
We load M-1 blocks of S into memory and keep one block for reading R; the inner loop then only touches memory,
with no further disk access. Pseudocode::

    While (true):
        chunk = read the next M-1 blocks of S into memory
        If nothing was read:
            break
        Build a lookup structure I over the chunk, e.g. a hash index or a red-black tree
        For each block b of R:
            read b into memory
            For each tuple t in b:
                use I to find all tuples that join with t and emit them

The number of disk reads is therefore B(S)/(M-1) * (B(R) + M) = (B(S)*B(R) + B(S)*M)/(M-1).
This also shows that putting the smaller relation on the buffered (outer) side works better.
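For illustration, here is a minimal in-memory version of the buffered algorithm in Python (a sketch only; the data layout and names are invented for the example):

.. code-block:: python

    def block_nested_loop_join(r_blocks, s_blocks, key, m):
        """Keep M-1 blocks of S in memory at a time, index them,
        then stream R block by block."""
        step = max(m - 1, 1)
        for i in range(0, len(s_blocks), step):      # load the next M-1 blocks of S
            index = {}
            for block in s_blocks[i:i + step]:
                for s in block:
                    index.setdefault(s[key], []).append(s)
            for r_block in r_blocks:                 # stream R one block at a time
                for r in r_block:
                    for s in index.get(r[key], []):
                        yield {**s, **r}

    # toy usage: relations already split into blocks of tuples
    S = [[{"y": 1, "z": "a"}, {"y": 2, "z": "b"}]]
    R = [[{"x": 10, "y": 1}, {"x": 20, "y": 2}], [{"x": 30, "y": 3}]]
    print(list(block_nested_loop_join(R, S, "y", m=3)))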
Optimization 2: index-based nested-loop join
--------------------------------------------
Merge join
==================
Since both inputs are already sorted, the Merge Join operator takes one row from each input and compares them.
For an inner join, equal rows are returned; otherwise the row with the smaller key is discarded and the next row is taken from that input.
This repeats until all rows have been processed.
The merge join itself is fast, but if an extra sort is required, choosing it can be very expensive.
However, with large inputs whose sort order can be obtained from existing B-tree indexes, merge join is often the fastest available join algorithm.
Clearly, if both tables have ordered indexes on the ON predicate, merge join performs best, costing about M+N steps;
::
    a = BigTable.begin();
    b = SmallTable.begin();
    while (true) {
        if (a is null or b is null) break;
        if (a.key == b.key): emit(a, b); a.next();  // advance after emitting (simplified: ignores duplicate keys)
        else if (a.key > b.key): b.next();
        else: a.next();
    }
Hash join
===================
If one side has a usable index and the other does not, the optimizer tends to pick a nested-loop join.
If neither table has an index on the ON predicate, a hash join is used;
in other words, the hash join algorithm is chosen precisely because no ready-made index is available.
FOR S
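The pseudocode here was left unfinished; as an illustration, a minimal in-memory hash join in Python (a sketch, not how any particular engine implements it):

.. code-block:: python

    from collections import defaultdict

    def hash_join(build_side, probe_side, key):
        """Build a hash table on the (smaller) build side,
        then probe it while streaming over the other input."""
        table = defaultdict(list)
        for s in build_side:                    # build phase
            table[s[key]].append(s)
        for r in probe_side:                    # probe phase
            for s in table.get(r[key], []):
                yield {**s, **r}

    S = [{"y": 1, "z": "a"}, {"y": 2, "z": "b"}]
    R = [{"x": 10, "y": 1}, {"x": 20, "y": 2}, {"x": 30, "y": 3}]
    print(list(hash_join(S, R, "y")))           # rows for y=1 and y=2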
References:
[1] `高级查询优化概念 <http://msdn.microsoft.com/zh-cn/library/ms191426(v=SQL.100).aspx>`_
<file_sep>
Processes
=============
ps -aux
Memory
=================
NIC traffic
====================
iftop -B
Transfer speed
==============
iperf?
Open files
=================
lsof
Network connections
===================
netstat -antp
Process directories and fds
===========================
ll /proc/xx/pwd
ll /proc/xx/fd
<file_sep>.. queryengine documentation master file, created by
sphinx-quickstart on Sat Aug 3 10:12:59 2013.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to my blog.
Distributed query techniques
============================
Impala
----------------
.. toctree::
:maxdepth: 1
:glob:
impala/overview
impala/io
.. impala/frontend
.. impala/be
.. impala/debug
HIVE
----------------
.. toctree::
:maxdepth: 1
:glob:
hive/hive-overall
hive/hql
hive/hive-type
.. query/join
.. hive/settings
.. release
Tools and sharing
========================
Github
----------------
.. toctree::
:maxdepth: 1
:glob:
share/blog
share/hadoop_ecology
<file_sep>***************
RST
****************
How to insert a math formula
============================
::
.. math::
\frac{ \sum_{t=0}^{N}f(t,k) }{N}
:math:`\frac{ \sum_{t=0}^{N}f(t,k) }{N}`
<file_sep>
bucket sample -> tableSamplePresent;
tablespilt sample -> splitSamplePresent;
<file_sep>*****************************
Maven常用命令
*****************************
test
==========
执行一个单元测试: mvn test -Dtest=testname
如果工程包含多个module,需要指定单测所在的module: mvn test -Dtest=testname -pl subproject
单测进行调试: mvn test -Dtest=MySuperClassTest -Dmaven.surefire.debug 之后使用远程调试功能;
mvn -X cobertura:cobertura
依赖包
============
查看依赖图: mvn dependency:tree
拷贝依赖包: mvn dependency:copy-denpendices
mvn -U clean package
intellij的问题
===============
如果mvn的依赖scope是provided,那么如果要调试,需要改为compile才可以正确找到类的路径
模拟器,模拟发包, 真实机器
服务器 数据库 各种机型的特征值 多个信息
<file_sep>function day_range() {
dstart=$1
dend=$2
# convert in seconds sinch the epoch:
start=$(date -d$dstart +%s)
end=$(date -d$dend +%s)
cur=$start
output=()
while [ $cur -lt $end ]; do
# convert seconds to date:
    # convert seconds back to a date string and emit it
    d=$(date -d@$cur +%Y-%m-%d)
    echo "$d"
    output+=("$d")
    let cur+=24*60*60
  done
  # bash functions cannot 'return' an array; callers capture stdout instead,
  # e.g. days=($(day_range 20150101 20150110))
}<file_sep>
Using arrays in the shell
==========================
$ arr=(123 34 3 5)
$ echo $arr          # by default this prints only the first element
> 123
$ echo ${arr[1]}     # access by index
> 34
$ echo ${arr[@]}     # the whole array (@ or * expands to all elements)
> 123 34 3 5
$ echo ${#arr[@]}    # number of elements in the array
> 4
$ echo ${#arr[3]}    # length of the string stored at index 3
> 1
$ echo ${arr[@]:1:2} # slice: two elements starting at index 1
> 34 3
$ echo ${arr[@]:2}   # from index 2 onwards
> 3 5
$ echo ${arr[@]::2}  # the first two elements
> 123 34
Common array operations
push:
array=("${array[@]}" $new_element)
pop:
array=(${array[@]:0:$((${#array[@]}-1))})
shift:
array=(${array[@]:1})
unshift:
array=($new_element "${array[@]}")
function del_array {
local i
for (( i = 0 ; i < ${#array[@]} ; i++ ))
do
if [ "$1" = "${array[$i]}" ] ;then
break
fi
done
del_array_index $i
}
function del_array_index {
array=(${array[@]:0:$1} ${array[@]:$(($1 + 1))})
}
<file_sep>***************************
The Basic HIVE Framework
***************************
The basic HIVE flow
=========================
1. The main stages are: syntax parsing (an abstract syntax tree, AST, built with antlr), semantic analysis (the Semantic Analyzer producing query blocks), logical plan generation (the operator tree), logical plan optimization, physical plan generation (the task tree), and physical plan execution.
2. The SQL grammar is defined with antlr (see hive.g); antlr compiles hive.g into two Java files, HiveLexer.java and HiveParser.java, which parse the input SQL into an AST.
3. org.apache.hadoop.hive.ql.Driver performs the initial handling of the AST (compile) and dispatches to the appropriate semantic analyzer (DDL, Explain, Load, etc.; the most important one is SemanticAnalyzer).
4. The main steps of SemanticAnalyzer, driven by analyzeInternal:
1) doPhase1: collect the various pieces of information referenced in the SQL and store them in the QB for later use
2) getMetaData: fetch metadata, mainly mapping the tables referenced in the SQL to their metastore entries
3) genPlan: one of the most important steps, generating the operator tree
4) optimize: apply optimizations to the operator tree, e.g. column pruning
5) genMapRedTasks: the key step that turns the operator tree into a set of interdependent MR tasks according to a fixed set of rules
5. After the Driver finishes compiling, execution (execute) starts; tasks are executed in dependency order starting from the root tasks until everything completes.
The point here is to make clear how the modules connect and how their interfaces interact.
.. code-block:: c
CliDriver.main
processLine
processCmd
Driver.run(cmd)
compile
BaseSemanticAnalyzer.analyze
SemanticAnalyzer.analyzeInternal
doPhase
getMetaData
genPlan
Optimizer.optimize
genMapRedTasks
execute
launchTask
TaskRunner.run
Task.executeTask
ExecDriver.execute
submitJob
getResults
CliDriver.main > processLine > processCmd >> Driver.run(cmd) > compile >> BaseSemanticAnalyzer >> xxxSemanticAnalyzer(常规select走SemanticAnalyzer) > analyze(sem.analyze) >> SemanticAnalyzer的analyzeInternal方法 >> new Optimizer.optimize(进行列剪裁等优化后生成Task) > genMapRedTasks >> 返回到Driver.run(cmd) >>ret = execute() >> launchTask >> TaskRunner.run > Task.executeTask > ExecDriver.execute > 执行MR(submitJob) >> getResults.
<file_sep>---
layout: post
title: The Basic HIVE Framework
categories:
- Distributed Query
tags:
- HIVE
---
# HIVE source analysis: lazy evaluation and short-circuiting
In a SQL call to a UDF, the initialize method is invoked first; it does four things:
1. Validate that the input types match what is expected
2. Return an ObjectInspector that matches the expected output type; e.g. GenericUDAFCountEvaluator's init returns PrimitiveObjectInspectorFactory.writableLongObjectInspector;
3. Store the input ObjectInspectors in fields
4. Set up the variable that will hold the output
Step 3 is not strictly required, since the inspectors could be handled as locals inside evaluate, but storing them in initialize means they only need to be set up once.
public abstract class GenericUDF {
/**
* A Defered Object allows us to do lazy-evaluation and short-circuiting.
* GenericUDF use DeferedObject to pass arguments.
*/
public static interface DeferredObject {
Object get() throws HiveException;
};
public abstract Object evaluate(DeferredObject[] arguments);
public abstract ObjectInspector initialize(ObjectInspector[] arguments);
}
DeferredObject enables short-circuit optimization:
for AllTrue(func1(a), func2(b), func3(c)), if func1(a) is false there is no need to evaluate b or c.
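The idea can be sketched in Python (a toy illustration of deferred arguments, not Hive code):

    class Deferred:
        """Toy stand-in for Hive's DeferredObject: wraps a thunk and only
        evaluates it when get() is called."""
        def __init__(self, thunk):
            self._thunk = thunk

        def get(self):
            return self._thunk()

    def all_true(*deferred_args):
        # short-circuit: stop evaluating as soon as one argument is false
        for arg in deferred_args:
            if not arg.get():
                return False
        return True

    def expensive(name, value):
        print("evaluating", name)
        return value

    # only 'a' is evaluated, because it is already false
    print(all_true(Deferred(lambda: expensive("a", False)),
                   Deferred(lambda: expensive("b", True)),
                   Deferred(lambda: expensive("c", True))))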
<file_sep>*****************************
Setting up the HIVE metastore
*****************************
::
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://hadooptest:3306/hive?useUnicode=true&createDatabaseIfNotExist=true&characterEncoding=UTF-8</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
By configuring MySQL access control you can restrict which IP ranges are allowed to reach your metastore database.
Connect to MySQL as root ::

    mysql -u root -h <mysql host> -P <port> -p

Enter the password to get a MySQL console, then run ::

    grant all privileges on *.* to '<user>'@'<whitelist>' identified by '<password>';
    flush privileges;

(<user> is the account allowed to access MySQL and <password> is its password.)
The whitelist can be:
% to allow every IP;
a single IP, e.g. 10.81.7.80;
an IP range, e.g. 10.81.7.%;
or a hostname / machine name, etc.
The effect of the statement above is to insert or update a row in the user table of MySQL's built-in mysql database.
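To check from a client host that the grant and whitelist work, a quick connectivity test can be scripted. This is only a sketch; it assumes the third-party PyMySQL package, and the host/user/password values are placeholders, not real credentials ::

    import pymysql

    # placeholders: fill in the metastore host and the account created above
    conn = pymysql.connect(host="<mysql host>", port=3306,
                           user="<user>", password="<password>", database="hive")
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")
        print("connected, server version:", cur.fetchone()[0])
    conn.close()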
<file_sep>Blog launch
===============
I used to keep blogs on CSDN, Baidu Space and elsewhere, but they have not been updated for a long time since I started working.
Recently I have been writing documents in rst and found it really convenient, so I registered this new blog on GitHub.
Everyone is welcome to drop by and discuss.
Wishing everyone a lucky Year of the Horse; I will try to post more articles.
:doc:`../share/blog`
<file_sep>import commands
import string
import sys
import time
def char_range(c1, c2):
"""Generates the characters from `c1` to `c2`, inclusive."""
for c in xrange(ord(c1), ord(c2)+1):
yield chr(c)
def exec_shell(cmd):
output = commands.getstatusoutput(cmd)
return (output[0], output[1].split("\n"))
if __name__ == "__main__":
filename = sys.argv[1]
parts = set()
with open(filename, "r") as lines:
for line in lines:
            # keep only the last path component (assumption: its leading
            # characters encode the day and hour used below)
            name = line.strip().split("/")[-1]
            day = name[0:4]
            hour = name[4:6]
            exec_shell("sh matrix_delta_str.sh %s %s" % (day, hour))
time.sleep(60)
<file_sep>****************************************************
[Repost] 13 tools that let Hadoop fly
****************************************************
.. raw:: html
<div style="margin-top:10px;">
<iframe width="850px" height="5800px" scrolling="yes" src="http://m.csdn.net/article/2014-01-01/2817984-13-tools-let-hadoop-fly" frameborder="0" allowfullscreen>
</iframe>
</div>
<file_sep>\chapter{HIVE}
\section{~HIVE~编译器~}
\begin{lstlisting}[language=SQL]
select * from src where key=1;
select count(*), count(1) from src limit 1;
select distinct key from src;
select key, count(*) from src group by key;
select key, count(distinct value) from src group by key having key>2;
select * from (select key, value from src1 union all select key, value from src2) tmp;
select src.key, src.value from src join src_tmp on src.key = src_tmp.key where src.value=``bcd'';
queries containing a subquery;
queries containing a view;
types of join:
\end{lstlisting}
\begin{lstlisting}[language=JAVA,frame=single]
class NodeProcessor {
Object process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs) throws SemanticException;
}
\end{lstlisting}
doPhase1(); //ASTTree $\rightarrow$ QB
getMetaData(); //
genPlan(); // generate the Operator tree;
optimize();
genMapRedTask();
\subsection{genPlan}
\begin{lstlisting}[language=JAVA]
Operator genPlan(QBExpr qbexpr) throws SemanticException {
if (qbexpr.getOpcode() == QBExpr.Opcode.NULLOP) {
return genPlan(qbexpr.getQB());
}
if (qbexpr.getOpcode() == QBExpr.Opcode.UNION) {
// generated recursively via post-order traversal
Operator qbexpr1Ops = genPlan(qbexpr.getQBExpr1());
Operator qbexpr2Ops = genPlan(qbexpr.getQBExpr2());
return genUnionPlan(qbexpr.getAlias(), qbexpr.getQBExpr1().getAlias(),
qbexpr1Ops, qbexpr.getQBExpr2().getAlias(), qbexpr2Ops);
}
}
Operator genPlan(QB qb) {
\begin{enumerate}
\item Recursively visit the subqueries, generate a plan for each sub-clause, and map each clause to its operator (OP);
\item Visit the tables, generate a TableScan operator for each, and map each table to its OP;
\item genLateralViewPlans
\item Handle the JOIN operator
\begin{itemize}
\item if it is a UNIQUE JOIN, ?
\item mergeJoinTree
\end{itemize}
\item genBodyPlan
\end{enumerate}
}
Handling the JOIN operator {
// if any filters are present in the join tree, push them on top of the
// table
pushJoinFilters(qb, qb.getQbJoinTree(), aliasToOpInfo);
genJoinPlan
}
What are the differences in how a TableRef and a subquery are handled?
For a join b, a is the left side and b is the right side.
// Note:
genBodyPlan {
}
\end{lstlisting}
\subsection{Preprocessing}
doPhase1
1. Gets all the aliases for all the tables / subqueries and makes the appropriate mapping in aliasToTabs, aliasToSubq
2. Gets the location of the destination and names the clause ``inclause'' + i
3. Creates a map from a string representation of an aggregation tree to the actual aggregation AST
4. Creates a mapping from the clause name to the select expression AST in destToSelExpr
5. Creates a mapping from a table alias to the lateral view AST's in aliasToLateralViews
\section{Logical plan generation}
\begin{lstlisting}[language=JAVA]
public interface Transform {
ParseContext transform(ParseContext pctx) throws SemanticException;
}
\end{lstlisting}
A related question: within a Task's execution tree, how does MapOperator.setChildren() manage to wire up the correct TableScanOperator?
\subsection{GlobalLimitOptimizer}
\subsection{JoinReorder}
Optimizes the join order.
The larger a table's tag, the later it is processed.
At present this optimization only takes effect when the user assigns the tag explicitly with the STREAMTABLE hint.
For example:
%
\begin{verbatim}
SELECT /*+ STREAMTABLE(a) */ a.key, a.val, c.key;
FROM T1 a JOIN src c ON c.key+1=a.key;
\end{verbatim}
\subsection{ColumnPruner}
Column pruning optimization, useful for columnar formats such as RCFile. (A simplified sketch of how the required columns are propagated follows the list below.)
\begin{itemize}
\item FIL\% required columns: the columns referenced by the filter condition, de-duplicated (the same applies below). Note that the column order of the RowSchema (each operator's getSchema) must be preserved.
\item GBY\% the columns appearing in the GroupByDesc keys plus the columns referenced by the aggregate operators' arguments.
\item RS\% a) the child is a JoinOperator; b) the child is not a JoinOperator, in which case the required columns are those appearing in the key plus those appearing in the value.
\item SelectOperator (ColumnPrunerSelectProc) required columns: if a child is a FileSinkOperator, ScriptOperator, UDTFOperator, LimitOperator or UnionOperator, the required columns are taken from the SelectOperator itself.
\item JoinOperator (ColumnPrunerJoinProc) required columns: if a child is a FileSinkOperator, nothing is done. Otherwise:
\item MapJoinOperator (ColumnPrunerMapJoinProc)
\item TableScanOperator (ColumnPrunerTableScanProc) required columns: the columns required by its children.
\item LateralViewJoinOperator (ColumnPrunerLateralViewJoinProc)
\end{itemize}
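The following is a small, self-contained Java sketch of the idea only (it does not use Hive's ColumnPruner classes; the operator and column names are invented): each operator maps the columns its parent needs to the columns it needs from its child, and the required set is propagated from the sink back to the table scan, which then reads only those columns.
\begin{lstlisting}[language=JAVA]
import java.util.*;

interface Op {
  // given the columns the parent needs from this operator's output,
  // return the columns this operator needs from its child
  Set<String> requiredInputColumns(Set<String> requiredOutput);
}

class Filter implements Op {
  private final Set<String> predicateCols;
  Filter(String... cols) { predicateCols = new HashSet<>(Arrays.asList(cols)); }
  public Set<String> requiredInputColumns(Set<String> requiredOutput) {
    Set<String> needed = new HashSet<>(requiredOutput);
    needed.addAll(predicateCols);   // like FIL%: add the predicate's columns
    return needed;
  }
}

class Select implements Op {
  private final Map<String, Set<String>> exprInputs; // output column -> input columns it reads
  Select(Map<String, Set<String>> exprInputs) { this.exprInputs = exprInputs; }
  public Set<String> requiredInputColumns(Set<String> requiredOutput) {
    Set<String> needed = new HashSet<>();
    for (String out : requiredOutput) {
      needed.addAll(exprInputs.getOrDefault(out, Collections.emptySet()));
    }
    return needed;
  }
}

public class ColumnPruningSketch {
  public static void main(String[] args) {
    // data flows TableScan -> Filter(ds) -> Select(key, v2=f(value)) -> sink
    List<Op> pipeline = Arrays.asList(
        new Filter("ds"),
        new Select(Map.of("key", Set.of("key"), "v2", Set.of("value"))));
    Set<String> required = new HashSet<>(Set.of("key", "v2")); // what the sink emits
    for (int i = pipeline.size() - 1; i >= 0; i--) {
      required = pipeline.get(i).requiredInputColumns(required);
    }
    System.out.println("TableScan only needs: " + required); // key, value, ds
  }
}
\end{lstlisting}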
\subsection{Simple fetch optimization}
Several modes can be configured. The Normal mode optimizes queries of the form
~select columnList from table where partitionColumnFilter;
Columns may also be virtual columns (INPUT\_\_FILE\_\_NAME etc.).
The Maximal mode optimizes:
select columnList from table where Filter;
or table sampling.
Limitation: a TBL\%SEL\%SEL\% pattern could in fact also be optimized, e.g.
select key from (select * from src) t1 limit 1;
\subsection{Predicate pushdown}
\subsection{GroupBy implementation}
\subsection{Join}
\subsection{Union}
\subsection{Partition pruning optimization}
PartitionConditionRemover
Rule patterns:
(TS\%FIL\%)|(TS\%FIL\%FIL\%)
Limitation: if the query uses a Lateral View and the predicate mixes a partition condition with a non-deterministic condition, partition pruning is disabled.
\section{Physical plan generation}
genMapRedTasks
\iffalse
In p-api, choosing the mode dynamically based on the size of an index file by inserting a Task is hard to support in Hive today, because deciding which Task to launch next from a previous Task's result conflicts with how genMapRedTasks builds the MapReduce tasks:
1. decide whether MapReduce is needed at all (e.g. a limit query may be optimized into a single FetchTask);
2. partition the operator tree and generate all tasks;
3. run the physical optimizations;
4. decide the execution mode: every MapReduce task is examined to see whether it can run in local mode, and if so localMode(true) is set. The input size is computed via the work's getInputSummary; if the partition's input format implements ContentSummaryInputFormat, the actual size computation can be overridden.
A Task's execution plan is therefore fixed at compile time rather than decided dynamically during execution.
\fi
The main rules are listed below (a sketch of how such regex rules dispatch to processors follows the listing):
\begin{lstlisting}[language=JAVA]
opRules.put(new RuleRegExp(new String("R1"), "TS%"), new GenMRTableScan1());
opRules.put(new RuleRegExp(new String("R2"), "TS%.*RS%"), new GenMRRedSink1());
opRules.put(new RuleRegExp(new String("R3"), "RS%.*RS%"), new GenMRRedSink2());
opRules.put(new RuleRegExp(new String("R4"), "FS%"), new GenMRFileSink1());
opRules.put(new RuleRegExp(new String("R5"), "UNION%"), new GenMRUnion1());
opRules.put(new RuleRegExp(new String("R6"), "UNION%.*RS%"), new GenMRRedSink3());
opRules.put(new RuleRegExp(new String("R6"), "MAPJOIN%.*RS%"), new GenMRRedSink4());//R6-2
opRules.put(new RuleRegExp(new String("R7"), "TS%.*MAPJOIN%"), MapJoinFactory.getTableScanMapJoin());
opRules.put(new RuleRegExp(new String("R8"), "RS%.*MAPJOIN%"), MapJoinFactory.getReduceSinkMapJoin());
opRules.put(new RuleRegExp(new String("R9"), "UNION%.*MAPJOIN%"), MapJoinFactory.getUnionMapJoin());
opRules.put(new RuleRegExp(new String("R10"), "MAPJOIN%.*MAPJOIN%"), MapJoinFactory.getMapJoinMapJoin());
opRules.put(new RuleRegExp(new String("R11"), "MAPJOIN%SEL%"), MapJoinFactory.getMapJoin());
In the latest open-source HIVE implementation these rules have been consolidated around MapJoin, leaving only:
R1-R6 as above (without R6-2), plus a final R7: "MAPJOIN%", MapJoinFactory.getTableScanMapJoin()
\end{lstlisting}
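To make the rule table above concrete, here is a self-contained Java sketch of the dispatch idea only (it does not use Hive's Rule, Dispatcher or GraphWalker classes): a rule is a regular expression over the names of the operators on the current walk path, and the first rule that matches decides which processor handles the node.
\begin{lstlisting}[language=JAVA]
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

public class RuleDispatchSketch {
  interface Processor { void process(String opPath); }

  public static void main(String[] args) {
    Map<Pattern, Processor> rules = new LinkedHashMap<>();
    rules.put(Pattern.compile("TS%.*RS%$"),
              p -> System.out.println("GenMRRedSink1 on " + p));
    rules.put(Pattern.compile("TS%$"),
              p -> System.out.println("GenMRTableScan1 on " + p));

    // opPath is the stack of operator names seen while walking the tree,
    // e.g. after visiting TS -> FIL -> RS:
    String opPath = "TS%FIL%RS%";
    for (Map.Entry<Pattern, Processor> r : rules.entrySet()) {
      if (r.getKey().matcher(opPath).find()) {  // fire the first matching rule
        r.getValue().process(opPath);
        break;
      }
    }
  }
}
\end{lstlisting}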
\subsubsection{R1 TS\%}
TS\%: construct a new MapRedTask, record it in the ctx as currTask, and record the current operator as currTopOp.
If the statement is an ANALYZE command the handling is somewhat more involved.
\begin{lstlisting}[language=JAVA]
public static void setTaskPlan(String alias_id, Operator<? extends Serializable> topOp, MapredWork plan, boolean local, GenMRProcContext opProcCtx, PrunedPartitionList pList) {
//set the plan's PathToAliases, PathToPartitionInfo, opToPartList, etc.
//to do so it obtains the PrunedPartitionList, which calls PartitionPruner.prune! Why not let PCR optimize first and only then do this?
plan.getPathToAliases().get(path).add(alias_id);
plan.getPathToPartitionInfo().put(path, prtDesc);
//in addition, an optimization is attempted here: if the GlobalLimit optimization succeeded earlier, it means
//no aggregation, group-by, distinct, sort by, distributed by, or table sampling in any of the sub-query.
//if the first partition is large enough the Path is optimized; if the number of files exceeds the limit, or SamplePrune is enabled, this optimization is disabled again
//!TODO this logic is rather convoluted and needs further confirmation
}
\end{lstlisting}
Look up the alias of the current operator in the parseCtx;
construct a GenMapRedCtx object for the current operator, with currTask, currTopOp and the alias as parameters, and record it in the ctx keyed by the current operator.
//ANALYZE TABLE T [PARTITION (...)] COMPUTE STATISTICS;
// The plan consists of a simple MapRedTask followed by a StatsTask.
// The MR task is just a simple TableScanOperator
select a, b from src;
select a from (select a, b from src);
select count(distinct key), sum(key) from src group by value;
UNION ALL:
select * from (select key, value from src union all select k as key, v as value from kv1) tmp;
JOIN:
select * from src join kv1 on src1.key=kv1.k;
\begin{verbatim}
TS-0 -> SEL-1 \
               UNION-4 -> SEL-5 -> FS-6
TS-2 -> SEL-3 /
\end{verbatim}
\subsection{R2 TS\%.*RS\%}
explain select count(*) from src;
GenMRRedSink1
\begin{lstlisting}
opMapTask = the Task bound to child 0 of the RS node;
// If the plan for this reducer does not exist, initialize the plan
if (opMapTask == null) { // the RS node's child operator has no Task bound yet
if (currPlan.getReducer() == null) { // the current task has no reducer yet, so make this the reduce side of the current MR round
GenMapRedUtils.initPlan(op, ctx);
} else { // otherwise split off a new MR task
GenMapRedUtils.splitPlan(op, ctx);
}
} else {
// This will happen in case of joins.
// The current plan can be thrown away
// after being merged with the original plan
GenMapRedUtils.joinPlan(op, null, opMapTask, ctx, -1, false, false, false);
currTask = opMapTask;
ctx.setCurrTask(currTask);
}
\end{lstlisting}
If the current RS's child operator is not yet bound to a task and the current task has no reducer, the RS's child operator becomes the reducer of the current task. For example:
select count(*) from src;
Otherwise the RS's child operator (the reducer logic) can only be placed in a new task (call it the child task).
splitPlan therefore creates a new plan plus a temporary-file descriptor; the parent task writes its results to that temporary file and the child task, whose reducer is the RS's child operator, reads from it. The current RS (in the parent task) is replaced by an FS, while the RS itself is moved into the map side of the new plan; since it cannot appear there as a top operator, a parent TS operator is created for it. For example:
For a JOIN, the children of both RS nodes point to the JOIN node, so by the time the second RS node is processed the Task bound to the join operator is already non-null.
At that point the merge is performed.
joinPlan uses splitPlan to create a new plan for the current RS and then merges this new plan with the task already bound to the child operator into a single task.
One TS with several RS nodes:
explain from kv1 insert overwrite table src select k key, count(*) as value group by k insert overwrite table kv1 select k, count(*) v1 group by k;
The ELSE branch:
explain select key, value from src join kv1 on src.key=kv1.k;
In the join case the RS's child operator may already carry task information (it was already visited along the other branch of the join), so the current plan is merged into the plan of the RS's child operator, and the current plan and task objects are discarded.
\subsection{RS\%.*RS\%}
GenMRRedSink2
explain select key, count(*) from src join kv1 on src.key=kv1.k group by key;
Clearly, matching this rule means the current node is at least the second RS on the path, so a reducer already exists and splitPlan must be called.
\section{Physical optimizers}
Each optimizer implements the following interface:
\begin{lstlisting}
public interface PhysicalPlanResolver {
PhysicalContext resolve(PhysicalContext pctx) throws SemanticException;
}
\end{lstlisting}
When PhysicalOptimizer.optimize is called, it iterates over the resolve methods of all PhysicalPlanResolvers. Whether an optimizer is enabled is configurable, except that BackendResolver is always added; in its resolve it calls getPhysicalPlanResolver on the table's storage handler to do the actual resolving.
The current PhysicalPlanResolvers include:
\subsection{CommonJoinResolver}
\subsection{MapJoinResolver}
\subsection{SkewJoinResolver}
\subsection{IndexWhereResolver}
\subsection{MetadataOnlyOptimizer}
For queries that only touch metadata, such as:
~select max(ds) from TEST1;
select ds, count(hr) from TEST2 group by ds;~
where ds and hr are partition columns.
%\begin{advanced}\small
%\danger\\hive.input.format must be CombineHiveInputFormat, otherwise an error is thrown, because constructing a Path from an empty filesplit path is not allowed\enddanger
%\end{advanced}
so split computation and the like can be skipped entirely. E.g. select
\subsection{BackendResolver}
This invokes the optimizers of all backends, e.g. the ddbs optimizer.
Known issue: a resolver may affect the other backends' implementations, because resolve is always called, whether or not the query has anything to do with that backend.
\subsection{MapJoin optimization}
\section{HIVE data model}
Primitive types:
Complex types:
Map/Struct/Array
\subsection{Object \& ObjectInspector}
\section{Executor}
\subsection{MapOperator}
\subsection{Evaluation tree}
Operator { Evaluator[] }
.processOp{ evaluator.evaluate() }
\subsection{JOIN implementation}
\subsection{FileSinkOperator}
createBucketFiles
In some places: insert overwrite directory, newScratchDir
\section{UDF}
\subsection{UDF}
UDF \& GenericUDF
\subsection{UDAF}
\subsection{UDTF}
<file_sep>
<property>
<name>hive.metastore.local</name>
<value>true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://10.20.151.10:3306/hive?characterEncoding=UTF-8</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hadoop</value>
<description>username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hadoop</value>
<description>password to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://IP:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
When table or column comments contain Chinese, the settings of Hive's metastore database need to be adjusted.
Taking MySQL as the example, Hive runs into problems when MySQL's character set is set to utf8
(com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes )
so when Hive uses MySQL as its metastore, MySQL's character set should default to latin1.
To still store UTF-8 Chinese text, change the character set of just the comment columns to utf8.
Change the character set of column comments:
alter table COLUMNS modify column COMMENT varchar(256) character set utf8;
Change the character set of table comments:
alter table TABLE_PARAMS modify column PARAM_VALUE varchar(4000) character set utf8;
When using MySQL as the metastore:
alter database hive character set latin1
so that the "max key length is 767 bytes" error no longer occurs<file_sep>---
layout: post
title: HIVE源码分析之UDF
categories:
- 分布式查询
tags:
- HIVE
---
Implementing a UDF
==================
How a UDF implements the ObjectInspector
----------------------------------------
Well, where does it actually come from?
objectinspector
###############
this is a finer-grained section level
######################################
.. csv-table:: Frozen Delights!
:header: "Treat", "Quantity", "Description"
:widths: 15, 10, 30
"Albatross", 2.99, "On a stick!"
"Crunchy Frog", 1.49, "If we took the bones out, it wouldn't be
crunchy, now would it?"
"Gannet Ripple", 1.99, "On a stick!"
In a UDF you need to produce the ObjectInspector of the result.
For example, GenericUDAFCountEvaluator's init implementation returns PrimitiveObjectInspectorFactory.writableLongObjectInspector;
Clearly, a TypeInfo can be obtained by recursively walking an ObjectInspector.
Conversely, given a TypeInfo and a choice of representation, the corresponding ObjectInspector can be obtained, e.g. via getStandardWritableObjectInspectorFromTypeInfo or getStandardJavaObjectInspectorFromTypeInfo, as sketched below.
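A hedged sketch of that round trip, using the utility methods named above (verify exact package names and signatures against your Hive version):

```java
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;

public class TypeInfoOiRoundTrip {
  public static void main(String[] args) {
    // TypeInfo -> ObjectInspector: pick one concrete representation
    TypeInfo t = TypeInfoFactory.longTypeInfo;
    ObjectInspector writableOi =
        TypeInfoUtils.getStandardWritableObjectInspectorFromTypeInfo(t);
    ObjectInspector javaOi =
        TypeInfoUtils.getStandardJavaObjectInspectorFromTypeInfo(t);

    // ObjectInspector -> TypeInfo: recover the type by walking the OI
    TypeInfo back = TypeInfoUtils.getTypeInfoFromObjectInspector(writableOi);
    System.out.println(back.getTypeName());  // "bigint"
  }
}
```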
Because the UDF and GenericUDF interfaces are incompatible, Hive adds GenericUDFBridge, which wraps a UDF as a member so that plain UDFs can still be used:
GenericUDFBridge encapsulates UDF to provide the same interface as GenericUDF.
A plain UDF therefore pays for one extra reflective call, which is basically negligible.
It can be invoked directly through reflection;
so how is its returned OI computed? From the Java return type.
Why does GenericUDAF need an Evaluator?
How is it invoked?
In the outer method, if there are constants,
e.g. md5(a), abs(3)
select a, b from
after serialization, e.g. select a from b;
in that case it is a ConstantEvaluator;
eval[i].evaluate(object)
Are count and the like Evaluators?<file_sep>The important qualities in life
===============================
Life is short; what are its most important qualities? The question occurred to me suddenly, and by reasoning about it I arrived at a few points.
The first that came to mind is self-discipline in behavior: restraining oneself, not indulging, and taking responsibility for one's actions. With self-restraint one does not harm others; by not indulging in sensory pleasure one is not controlled by the senses.
Taking responsibility for one's actions prevents self-harm. Because of self-discipline, one harms neither oneself nor others.
Next is self-examination in thought, the ability to look into one's own mind. With self-examination one is not misled by others or deluded by oneself, and keeps a judgment and understanding that do not stray from the facts.
Last is freedom of the heart, being unbound. One can then set aside fixed views, find one's own path, and act with the decisiveness of wisdom; unbound by afflictions, one finds peace.
Are these views too one-sided?
Sometimes I wrong no one, yet others wrong me. Walking down the street I might be hit by a car or attacked. Does self-discipline also require discipline imposed from outside? Should I try to control it?
Sometimes I examine myself, but others keep entangling me without a shred of reason. Does self-examination also require examination by others? Should I try to change them?
Sometimes I feel at ease and unobstructed, but others do not accept it; they covet what I have and I am powerless. What then? Should I chase after it?
The world has all kinds of laws and morals, cultures and religions, yet none has fully succeeded. For me personally, is the answer really just to let it go, let it be, leave it alone?
Having said all this, these qualities sound good; how do I actually acquire them? The moment I say I want them, it feels as if I have already lost them.
I have to admit they are hard to come by: lust, fame and profit all far outweigh the love of virtue. When life goes badly, I get annoyed; when work hits obstacles, I worry;
when the body has urges, the mood turns oppressive; when affection is unobtainable or strained, there is loneliness and misery...
This is still too abstract. Does this thinking really guide my life, or have I simply not practiced it seriously? Be careful, be diligent, keep at it.
<file_sep>**********************************
Impala IO and data-flow analysis
**********************************
Overall flow
=============================
When a user runs a query it is translated into an execution plan; the physical plan is essentially a DAG of ExecNodes. Scanning data is done by ScanNodes,
and data exchange between nodes is handled by ExchangeNodes.
Data flowing between nodes is pushed. Concretely:
* after each PlanFragmentExecutor finishes a round of computation, it actively pushes the data downstream through its DataSink;
* when an impalad's ImpalaServer instance receives a DataStream packet, it calls DataStreamMgr::AddData() to write the data in; the data is routed by fragment_id/node_id to the DataStreamRecvr it manages, and consumers such as ExchangeNode receive data through the DataStreamRecvr.
* ExchangeNode creates its DataStreamRecvr in Prepare;
  in GetNext it calls DataStreamRecvr::GetBatch (a blocking call).
Impalad's low-level file-reading IO model is a classic producer-consumer model: the scanners are the consumers and the DiskIoMgr threads are the producers.
* when impalad initializes its environment it creates a DiskIoMgr object and starts several IoMgr threads (by default one per disk);
  the IoMgr threads scan data off the disks;
* when a ScanNode's Open is called, it adds the ScanRanges to be read to the DiskIoMgr and starts
  several scanner threads; a scanner thread calls GetBytes (blocking), then processes the bytes and materializes tuples;
* once the DiskIoMgr threads find a non-empty ReaderContext, they schedule ReaderContexts round-robin and read the data.
Comparing Impala's and Hive's data flow
---------------------------------------
* Inside a Hive Task the operator tree is evaluated in push mode: a MapTask first runs the MapOperator, usually starting with a TableScanOperator,
  which hands data down the chain, e.g. TableScanOperator -> FilterOperator -> SelectOperator -> GroupByOperator -> ReduceSinkOperator;
  the ReduceSinkOperator then emits to the MapReduce framework, where the ReduceTasks actively pull the MapTasks' output.
* In Impala a PlanFragment plays the role of a Hive Task; within a PlanFragment the operator tree is pulled (a downstream node calls GetNext on its upstream node),
  and after each GetNext the PlanFragment pushes the resulting RowBatch to other PlanFragments through a DataSink.
In short, Hive pushes within the operator tree and pulls between Tasks, while Impala pulls within the operator tree and pushes between PlanFragments (a minimal sketch of the two styles appears after the list below).
Two clear advantages of Impala's arrangement come to mind:
* bandwidth is used more effectively;
* for a statement like select * from table limit 10,
  Hive has to wait for the whole Hadoop job to finish before it can emit results, while Impala can cancel the other backends as soon as the Coordinator has received 10 rows.
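A minimal Java sketch of the two styles (illustrative interfaces only, not Hive or Impala source; all names here are invented) ::

    // Pull style: the consumer drives by asking its child for the next batch.
    interface PullOperator {                 // Impala-style within a fragment
        RowBatch getNext();                  // returns null when exhausted
    }

    // Push style: the producer drives by handing each batch to a sink.
    interface DataSink {                     // Impala-style between fragments,
        void send(RowBatch batch);           // Hive-style inside a task
    }

    final class RowBatch { /* rows omitted */ }

    final class FragmentDriver {
        // The executor pulls from its local operator tree, then pushes the
        // result to the next fragment: pull on the inside, push on the outside.
        static void run(PullOperator root, DataSink sink) {
            RowBatch batch;
            while ((batch = root.getNext()) != null) {
                sink.send(batch);
            }
        }
    }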
Impala底层读取IO的详细流程
==============================
ImpalaServer在初始化环境时,创建ExecEnv对象 ::
ExecEnv的构造函数 {
新建各种对象,与IO相关的,主要包括DataStreamMgr,DiskIoMgr;
DiskIoMgr的构造函数 {
设置每个disk对应一个DiskQueue;
设置每个disk读取的线程数目为FLAGS_num_threads_per_disk;
}
调用disk_io_mgr的Init() {
For 每个disk_queue:
构造DiskQueue对象;
每个DiskQueue启动num_threads_per_disk个线程;
线程的主循环是ReadLoop :{
while (true) {
调用GetNextScanRange获取下一个要读的ScanRange: {
DiskQueue采用Round-Robined轮转算法,取出队首ReaderContext;
获取ReaderContext的要读取的ScanRange;
判断进程是否超内存限制;
判断Reader是否超内存限制;
If 进程超内存限制,调用GcIoBuffers();
If 依然超内存限制,就Reader->Cancel()
If reader是Canceled状态:
continue;
If reader下一个ScanRange为空并且unstarted_ranges队列非空:
将reader的unstarted_ranges队列取出一个ScanRange放入到>
ready_to_start_ranges_队列;
// 这样做的主要目的在于节省GetNextScanRange等待时间
唤醒一个阻塞在读GetNextScanRange的线程;
将reader重新放回disk_queue的队列;这样线程可以处理其他ScanRange
}
GetFreeBuffer(reader);//分配max_reader_size_ byte
GetBufferDesc;
range->OpenScanRange //打开文件,然后fseek到要读取的offset;(如果hdfs_connection_为空,是本地读取)
更新Counter;
range->ReadFromScanRange //从打开的文件句柄中读取数据;
} // while true
}
}//Init
}
HdfsScanNode
==============================
::
Prepare
tuple_desc_=state->desc_tbl().GetTupleDescriptor();
collect all materialized (partition key and not) slots
// Finally, populate materialized_slots_ and partition_key_slots_ in the order that
the slots appear in the file.
Codegen Scanner需要的函数
设置当thread可用时启用的callback函数HdfsScanNode::ThreadTokenAvailableCb,至少要求一个线程
ThreadTokenAvaliableCb的实现 {
只要有线程可启用,就启动Scanner线程。不过以下几种情况不用启动新线程:
1. ScanNode已经完成
2. 所有的Ranges已经被其他线程拿走 (?)
3. 剩余的Ranges < 活跃的线程数
线程的主循环 {
while (true) {
disk_io_mgr获取下一个ScanRange;
创建新的Scanner对象来处理这个ScanRange;
scanner->Prepare;
scanner->ProcessSplit;
scanner->Close;
}
}
}
//Open时初始化hdfs的连接,同时会启用Scanner线程. Scanner将向DiskIOMgr的队列添加Splits
Open(RuntimeState* state):
向disk_io_mgr注册Reader,传递hdfs_connections
调用ScanRange的每个Partition的PrepareExprs
初始化所有Scanner的Ranges
BaseSequenceScanner
======================
::
ProcessSplit:{
//对于每一个ScanRange,需要InitNewRange, ProcessRange
header = scan_node_->GetFileMetadata();
if (header_ == NULL)
ReadFileHeader(); // parsing header
scan_node_->SetFileMetadata();
scan_node_->AddDiskIoRanges();
InitNewRange;
// 第一条记录
if stream_->scan_range->offset() == 0 :
skip header;
else
skip to sync
do {
//处理当前的Range直到出错或者结束
//[要求在Data block的开始位置,这个通过sync marker来保证]
ProcessRange() {
while (!eof) {
GetRecord;
GetMemory;
如果当前slot需要物化,解析当前Record
否则,直接写入
}
}
如果不允许错误,退出;
否则,跳到下一个Sync
} while (!stream_->eosr());
}
<file_sep>########
::
create 'neo_on_matrix', {NAME => 'cf', COMPRESSION=>'SNAPPY'}, {NUMREGIONS => 30, SPLITALGO => 'HexStringSplit'}
Open questions:
Provide a platform for editing and displaying schemas, plus an SDK tool for log printing?
Provide a platform-style implementation of neo;
class HiveHistoryRule extends MonitorRule {
date = "";
// name , rule, check interval,
def name() {
}
def dependencies() {
return name : [from, to, step];
}
def avg(List[DataPoint]) {
return xx;
}
def sum(List[DataPoint]) {
}
def reportError(List[DataPoint] points) {
return
}
}
class Metrics extends MetricsRule {
}
class SendMetrics extends Metrics {
avg(points)
}
meta: Download
monitor: MonitorLog;
monitor<file_sep>日常生活
=============================
随笔
----------------
.. toctree::
:maxdepth: 1
:glob:
life/open-blog
Zen
-----------------
.. toctree::
:maxdepth: 1
:glob:
.. zen/dukka
<file_sep>
install grpc
==============
install grpc-java
===================
#. build the examples:
$ cd examples
$ export CXXFLAGS="-I/usr/local/include" LDFLAGS="-L/usr/local/lib"
$ ../gradlew installDist
#. run hello world example:
$ ./build/install/grpc-examples/bin/hello-world-server
$ ./build/install/grpc-examples/bin/hello-world-client
./gradlew -v                 show the Gradle version
./gradlew clean              remove the build folder under 9GAG/app
./gradlew build              check dependencies, compile and package
Note that ./gradlew build produces both the debug and the release packages. If an official release only needs the release package, the very useful **assemble** command helps, e.g.
./gradlew assembleDebug      compile and package the debug build
./gradlew assembleRelease    compile and package the release build
assemble can also be combined with productFlavors; multi-channel packaging is explained further in the next post.
./gradlew installRelease     package and install the release build
./gradlew uninstallRelease   uninstall the release build<file_sep>
Pre-splitting regions when creating an HBase table
##################################################
create 't1', 'f1', {SPLITS => ['10', '20', '30', '40']}
The pre-split applies to the whole table, not to a single column family:
create 't', {NAME => 'f', VERSIONS => 1, COMPRESSION => 'SNAPPY'},
{SPLITS => ['10','20','30']}
http://shitouer.cn/2013/05/hbase-create-table-with-pre-splitting-and-other-options/
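For completeness, the same pre-split table can also be created from code. The following is a hedged Java sketch using the HBase 1.x client API (HTableDescriptor/HColumnDescriptor; newer clients use TableDescriptorBuilder instead), so verify it against the HBase version in use ::

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.util.Bytes;

    public class CreatePreSplitTable {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin()) {
          HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("t1"));
          desc.addFamily(new HColumnDescriptor("f1"));
          byte[][] splitKeys = {
              Bytes.toBytes("10"), Bytes.toBytes("20"),
              Bytes.toBytes("30"), Bytes.toBytes("40")
          };
          admin.createTable(desc, splitKeys);  // same split points as the shell example
        }
      }
    }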
例如创建neo的doc相关表格:
create 'neo_conjunction_cfb', {NAME => 'f', VERSIONS => 1, COMPRESSION => 'SNAPPY', IN_MEMORY => 'true', BLOCKCACHE => 'true', TTL => '345600' },
{SPLITS => ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z']}
create 'neo_doc_test', {NAME => 'f', VERSIONS => 1, COMPRESSION => 'SNAPPY', IN_MEMORY => 'true', BLOCKCACHE => 'true' },
{SPLITS => ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z']}
create 'stat_doc', 'f', { NUMREGIONS => 128, SPLITALGO => 'UniformSplit' }, { NAME => 'f', IN_MEMORY => 'true', BLOCKCACHE => 'true', COMPRESSION => 'SNAPPY' }
create 'stat_source', 'f', { NUMREGIONS => 32, SPLITALGO => 'UniformSplit' }, { NAME => 'f', IN_MEMORY => 'true', BLOCKCACHE => 'true', COMPRESSION => 'SNAPPY' }
create 'neo_conj', {NAME => 'f', VERSIONS => 1, COMPRESSION => 'SNAPPY', IN_MEMORY => 'true', BLOCKCACHE => 'true', TTL => '345600' },
{SPLITS => ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z']}
设置600,2415240
如果设置1200, 2246740
100, 2037900
;
设置pending=1000, 100, 1721980
设置5000, 1414120
create 'neo_conj_history', {NAME => 'f', VERSIONS => 1, COMPRESSION => 'SNAPPY', IN_MEMORY => 'true', BLOCKCACHE => 'true', TTL => '345600' },
{SPLITS => ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z']}
随机查询为主, 创建时采用UniformSplit;
create 'neo_shortterm', 'f', { NUMREGIONS => 128, SPLITALGO => 'UniformSplit' }, { NAME => 'f', IN_MEMORY => 'true', BLOCKCACHE => 'true', COMPRESSION => 'SNAPPY', TTL => '172800' }
create 'neo_shortterm', 'f', { NUMREGIONS => 16, SPLITALGO => 'UniformSplit' }, { NAME => 'f', IN_MEMORY => 'true', BLOCKCACHE => 'true', COMPRESSION => 'SNAPPY', TTL => '172800' }
生成delta -> -> shuffle by key -> persistent汇集写入
不做任何的batch,不emit,达到900m/10m;
不做任何的batch,emit,16k/1m,delta发送的频率为1.5M/m左右 140m/10m;
batch为1000,
不做后续的emit,但是
pipeline发数据,但是不写数据, 此时达到了?;
Developing an HBase coprocessor
===============================
Loading the coprocessor
-----------------------
alter 'neo_cfb_doc_test', METHOD=>'table_att','coprocessor'=>'hdfs://nameservice1/lib/neo/coprocessor.jar|com.blackwing.util.hbase.coprocessor.SplitRowEndPoint|1073741823'
References:
1. Splitting an HRegion with an HBase coprocessor: http://blackwing.iteye.com/blog/1788647
<file_sep>*************************************
How the Impala backend reads files
*************************************
Impala system architecture
==========================
Structure of the be code
========================
* service: connects to the frontend and accepts client requests
* runtime: classes needed at run time, including coordinator, planfragment, datastream, mempool, tuple, etc.
* exec: mainly ExecNode and its subclasses, plus the Scanners that read the various file formats; a Scanner does not yet separate reading the file from parsing its contents;
* exprs: the expression classes, e.g. tableref, case, etc. Functions are not yet abstracted out of expressions, so adding a new function means adding a new expr;
* transport: Thrift SASL: Simple Authentication and Security Layer
* statestore: statestore, simplescheduler, statestoresubscribe, etc.;
* codegen: code generation
Every impalad can accept a query request, with the other impalads acting as backends for it:
a query is executed jointly by a Coordinator and multiple PlanFragmentExecutors,
and the PlanFragments are shipped to the backends for execution.
Impalad startup sequence
========================
* initialize LLVM, HDFS, JNI, HBase and the frontend;
* start the ImpalaServer
* start the thrift server to accept thrift requests
* start the ExecEnv
* start the webserver
* start the SubscriptionManager
* start the Scheduler
* subscribe to the StateStore and register the callback SimpleScheduler::UpdateMembership,
  which supplies the set of currently available backends at scheduling time
SubscriptionManager::RegisterService
the StateStore checks whether the service exists and, if not, creates a new service instance
it then checks whether the client is in that service instance's membership and adds it if not
SubscriptionManager::RegisterSubscription
the StateStore adds a Subscriber that subscribes to this service's membership and registers the callback
MembershipCallback
when an update callback arrives, the impalaserver's membership state is updated and used by the failure detector
Once started, an impalad can accept query requests, and can also accept requests from other impalads to execute a PlanFragment.
Coordinator::Exec
==============================
1. ComputeScanRangeAssignment;
2. ComputeFragmentExecParams;
Why are Conjuncts needed? Isn't Expr expressive enough?
--------------------------------------------------------
CreateConjuncts
====================
PlanFragmentExecutor::prepare {
set up the desc_tbl;
set up plan_;
plan_->Prepare;
set the number of senders for each ExchangeNode
set the ScanRanges on the ScanNodes;
set up the data sink;
set up the profile;
}
PlanFragmentExecutor::OpenInternal {
plan_->Open();
while (true) :
GetNextInternal {
while undone_:
plan_->GetNext();
}
sink_->Send;
sink_->Close;
}
Impala's cache strategy
=======================
TUniqueId fragment_instance_id,
PlanNodeId dest_node_id,
StreamControlBlock
<file_sep>****************************
Impala backend analysis
****************************
Main classes in the query backend
=================================
ImpalaServer
---------------------------
ImpalaServer is the entry point for every request at impalad run time. Taking TExecRequest as an example, it is handled by ImpalaServer::QueryExecState. A TExecRequest contains a TQueryExecRequest, which describes the query request that the frontend sends to the backend after parsing the SQL.
ImpalaServer::QueryExecState provides the following methods:
1. ImpalaServer::QueryExecState::Exec(TExecRequest* exec_request):
   performs basic checks on the TQueryExecRequest and then calls Coordinator::Exec() to submit the request
2. ImpalaServer::QueryExecState::FetchRows(const int32_t max_rows, QueryResultSet* fetched_rows)
   first calls Coordinator::Wait(), which blocks; once that call returns, some query results have been produced. ImpalaServer::QueryExecState::FetchNextBatch() then calls Coordinator::GetNext() to fetch a small batch of results, and the caller processes the batches at its own pace (an illustrative fetch loop is sketched below).
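An illustrative rendering of that fetch loop in Java (the interface and method names here are stand-ins, not the real C++ classes) ::

    import java.util.List;

    interface CoordinatorHandle {
        void waitUntilReady() throws Exception;                // ~ Coordinator::Wait()
        List<Object[]> getNext(int maxRows) throws Exception;  // ~ Coordinator::GetNext()
    }

    final class FetchLoop {
        static void fetchAll(CoordinatorHandle coord, int maxRows) throws Exception {
            coord.waitUntilReady();                 // blocks until results exist
            List<Object[]> batch;
            while (!(batch = coord.getNext(maxRows)).isEmpty()) {
                // consume one small batch at a time, at the caller's own pace
                System.out.println("fetched " + batch.size() + " rows");
            }
        }
    }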
Coordinator
---------------------------
1. Coordinator::Exec()
Coordinator在Exec(const TUniqueId& query_id, TQueryExecRequest* request, const TQueryOptions& query_options)方法中,根据收到的query_id和request,会计算产生一次查询相关的参数信息,保存在本地。
1. 记录query_id,保存在Coordinator::query_id_
2. 从request中取出finalize_params,保存在Coordinator::finalize_params_
3. 从request中取出query_globals,保存在Coordinator::query_globals_
4. Coordinator::ComputeFragmentExecParams(const TQueryExecRequest& exec_request)
根据request计算出每个PlanFragment的执行参数,保存在Coordinator::fragment_exec_params_中
4.1 Coordinator::ComputeFragmentHosts(const TQueryExecRequest& exec_request)
对每一个PlanFragment,根据request中输入数据所在的位置(per_node_scan_ranges),得到执行查询的impalad的Host列表,保存在Coordinator::hosts中
4.1.1 从PlanFragments的DAG的底层向上计算每个PlanFragment的执行节点。因为有些PlanFragment可以直接使用其子Fragment的执行节点执行。
a) 若PlanFragment的partition.type == TPartitionType::UNPARTITIONED,说明它只能在一个impalad上执行,把它放在coordinator上执行
b) 称PlanFragment.
plan按DFS序执行第一个被执行的节点(最左子节点,或根节点)为LeftmostNode,若LeftmostNode的查询类型是EXCHANGE_NODE(用于收集DAG下层的PlanFragment的数据),且所有PlanFragment中存在一个PlanFragment,向当前LeftmostNode输出数据,则选择那个PlanFragment所在的hosts列表作为当前PlanFragment的执行节点。若LeftmostNode的类型是HDFS_SCAN_NODE或HBASE_SCAN_NODE,从LeftmostNode的scan_range_locations指定的一堆DataNode中,若DataNode上存在impalad,则选择该impalad执行,否则从所有impalad中,用Round-robin(轮询)算法选择一个impalad节点执行。在data_server_map中记录每个DataHost上的数据,由哪个impalad处理。
4.2 为4.1生成的hosts列表中每个THostPort生成一个全局唯一的instance_id。
4.3 统计每个PlanFragment的输入源,记录向该PlanFragment中的EXCHANGE_NODE写入数据的PlanNode的instance_id,及向该PlanFragment写入数据的hosts的数目. 目前,StreamSink的type只能是TPartitionType::UNPARTITIONED,即PlanFragment只能通过广播的形式,将所有数据发送给所有下游PlanFragment。
5. Coordinator::ComputeScanRangeAssignment(const TQueryExecRequest& exec_request) 对每个PlanNode所需扫描的Range,计算扫描量,安排负责执行的物理节点
5.1 对于scan_range_locations中每个TScanRangeLocation,根据scan_range计算大概的扫描量,从locations中选择目前扫描量最少的location,作为负责扫描该range的DataNode,并更新它的扫描量。
5.2 从data_server_map中,找到负责扫描该DataNode的impalad,记录scan_range和impalad的对应关系
6. 通过ParallelExecutor并行调用Coordinator::ExecRemoteFragment(void* exec_state_arg)
6.1 通过impala-client,向每个PlanFragment所在机器发送请求,执行机根据请求构建PlanFragmentExecutor
6.2 调用PlanFragmentExecutor的Prepare()方法,初始化执行过程
6.2.1 各种初始化
6.2.2 通过LLVM生成执行代码
6.2.3 设定扫描的Range,输出数据流等
7. 初始化ProgressUpdater progress_,后续用于收集执行状态
4.4.2.2 Coordinator::Wait():
Coordinator::Wait()中,调用PlanFragmentExecutor::Open()方法,阻塞等待远程的impalad返回。如果PlanFragmentExecutor中定义了sink,则Open()方法中即不断循环调用PlanFragmentExecutor::GetNext(),流式地获取数据,通过sink发送到下游。Coordinator::GetNext()调用PlanFragmentExecutor::GetNext()方法,并根据本机的处理能力,对流式返回的数据量做一些限制及意外处理。PlanFragmentExecutor::GetNext()每次从PlanNode中调用GetNext()获取一批数据,返回外层。
ExecNode
-------------------
ExecNode is the base class of all PlanNodes. It provides common methods for initialization and teardown, plus CreateTree, which builds the PlanNode tree from the TPlan in the TQueryExecRequest.
There are currently 7 kinds of PlanNode: HDFS_SCAN_NODE, HBASE_SCAN_NODE, AGGREGATION_NODE, HASH_JOIN_NODE, EXCHANGE_NODE, SORT_NODE and MERGE_NODE; combined into a PlanNode tree they implement arbitrarily complex queries.
DiskIoMgr
----------------------
DiskIoMgr manages and schedules all IO requests. Each PlanNode requests one or more ReaderContexts, each responsible for a set of ScanRanges and owning its own queue.
Through the DiskIoMgr API a PlanNode obtains ScanRanges (non-blocking) and reads data (blocking). DiskIoMgr has its own worker threads that read from local disk or HDFS.
DiskIoMgr identifies a request by its ReaderContext instance; users acquire and release ReaderContexts with RegisterReader()/UnregisterReader(), and operate on them with GetNext()/Read() and similar methods to obtain data.
DiskIoMgr can be thought of as a thread-safe producer-consumer manager handling resources on two levels:
1. hardware resources: the queue of Reader instances doing the reads, and the queue of reusable empty buffers
2. read requests: the queue of ScanRanges describing the reads, and the queue of buffers already filled with data
A disk worker thread repeatedly takes a buffer from the empty-buffer queue and a ScanRange from the ScanRanges queue; once the read completes, the buffer joins the filled-buffer queue.
Users insert new ScanRange requests with AddScanRanges(), consume filled buffers with GetNext(), and return exhausted buffers to the empty-buffer queue with ReturnBuffer(). A minimal sketch of this arrangement follows.
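A minimal Java sketch of the two-queue producer-consumer arrangement described above (illustrative only, not Impala's C++ DiskIoMgr; buffer sizes and counts are made up) ::

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class DiskIoSketch {
        static final class Buffer { byte[] data = new byte[8 << 10]; int len; }

        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Buffer> empty = new ArrayBlockingQueue<>(4);
            BlockingQueue<Buffer> filled = new ArrayBlockingQueue<>(4);
            for (int i = 0; i < 4; i++) empty.put(new Buffer());

            // disk worker thread: ~ the IoMgr thread filling buffers from disk
            Thread disk = new Thread(() -> {
                try {
                    for (int range = 0; range < 10; range++) {
                        Buffer b = empty.take();   // reuse an empty buffer
                        b.len = 1;                 // pretend we read one scan range
                        filled.put(b);             // hand it to the consumer
                    }
                } catch (InterruptedException ignored) { }
            });
            disk.start();

            // scanner thread: ~ GetNext() followed by ReturnBuffer()
            for (int i = 0; i < 10; i++) {
                Buffer b = filled.take();          // blocking GetNext()
                // ... materialize tuples from b ...
                empty.put(b);                      // ReturnBuffer()
            }
            disk.join();
        }
    }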
DataStreamMgr
------------------
DataStreamMgr manages all input/output streams between PlanNodes.
When an impalad's ImpalaServer instance receives a DataStream packet it calls DataStreamMgr::AddData(), which routes the data by fragment_id/node_id to the DataStreamRecvr it manages; consumers such as ExchangeNode receive the data through that DataStreamRecvr. A routing sketch follows.
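A minimal Java sketch of the routing idea only (the class and method names here are invented, not Impala's): batches are delivered to the receiver registered under the pair (fragment id, destination node id) ::

    import java.util.Map;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.LinkedBlockingQueue;

    public class StreamMgrSketch {
        private final Map<String, BlockingQueue<byte[]>> receivers = new ConcurrentHashMap<>();

        private static String key(String fragmentId, int nodeId) {
            return fragmentId + "#" + nodeId;
        }

        // what an ExchangeNode would do in Prepare: register its receiver
        public BlockingQueue<byte[]> createRecvr(String fragmentId, int nodeId) {
            return receivers.computeIfAbsent(key(fragmentId, nodeId),
                                             k -> new LinkedBlockingQueue<>());
        }

        // ~ AddData(): route an incoming batch to the matching receiver
        public void addData(String fragmentId, int nodeId, byte[] batch) throws InterruptedException {
            BlockingQueue<byte[]> q = receivers.get(key(fragmentId, nodeId));
            if (q != null) {
                q.put(batch);
            }
        }
    }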
ScanNode
------------------
HdfsScanNode
^^^^^^^^^^^^^^^^
HdfsScanNode reads data from HDFS and deserializes it: it starts a DiskThread to read the raw bytes and multiple ScannerThreads to deserialize them. A Scanner deserializes one HDFS file format; Text, RCFile and other formats are currently supported, and new Scanners can be added, e.g. for OLAPEngine.
::
Open()
GetDefaultConnect() // 建立HDFS的连接,初始化统计信息,
RegisterReader() // 初始化ReaderContext实例
add_thread(DiskThread())
IssueMoreRanges() // 初始化新的ScanRanges
DiskThread():
io_mgr()->GetNext() // 获取一个已经写有数据的Buffer
// 找到获得的Buffer对应的ScanRange的Scanner
StartNewScannerThread(): // 启动Scanner线程读取Buffer
CreateScanner() // 根据HdfsPartition类型构造HdfsScanner
add_thread(new thread(ScannerThread))
ScannerThread():
scanner->ProcessScanRange() // 调用Scanner处理Buffer中的信息
if 当前所有ScanRange都已完成:
IssueMoreRanges() 向DiskIoMgr添加新的ScanRange请求
AggregationNode
------------------------
Hash-based in-memory aggregation; LLVM-generated operators compute the hash values and the aggregated results.
HashJoinNode
--------------------------
Supports INNER JOIN / LEFT OUTER JOIN / RIGHT OUTER JOIN / FULL OUTER JOIN / SEMI JOIN, but the join condition must contain at least one equality predicate, i.e. cross joins are not supported.
A HashJoinNode has two inputs (call them left and right). In HashJoinNode::Open() it first streams in all of the right input and stores it in memory; it then streams the left input and probes the in-memory hash table of right-side rows, emitting join results according to the join type.
ExchangeNode
-----------------------------
Receives the output of PlanNodes in other fragments (possibly across the network). In Prepare() it obtains a DataStreamRecvr instance from the DataStreamMgr; in GetNext() it keeps pulling data from that DataStreamRecvr.
MergeNode
-------------------------------
Merges the results of all upstream PlanNodes within the fragment. During the merge, each PlanNode's data is converted according to its schema so that the output conforms to a single unified schema.
<file_sep>
Hive optimization strategies:
Computation optimization
Optimizing the computation logic
================================
Reducing the number of map tasks
--------------------------------
Merging small files
-------------------
set mapred.combine.input.format.local.only=false enables merging input across machines
Other related parameters:
hive.merge.mapfiles = true          whether to merge map output files, default true
hive.merge.mapredfiles = false      whether to merge reduce output files, default false
hive.merge.size.per.task = 256*1000*1000    target size of the merged files
Adding an MR round
------------------
count(distinct key) -> select count(*) from (select distinct key from src) t;
Reducing MR rounds
------------------
set abaci.is.dag.job=true;
enables DAG optimization: MR -> MR ==> MRR
MapJoin
---------------
The small table is read into memory; even if the MapJoin fails, a backup task is launched and the join falls back to a reduce-side join. The sketch below illustrates the map-join idea.
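A plain-Java sketch of the map-join idea (illustrative only, not Hive's MapJoinOperator; the table contents are made up): load the small table into an in-memory hash map once, then stream the big table and probe the map, with no shuffle or reduce phase needed ::

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class MapJoinSketch {
      public static void main(String[] args) {
        // small table: key -> value, assumed to fit in memory
        Map<String, String> small = new HashMap<>();
        small.put("1", "one");
        small.put("2", "two");

        // big table is streamed row by row (here just an array)
        String[][] big = {{"1", "a"}, {"3", "b"}, {"2", "c"}};

        List<String> joined = new ArrayList<>();
        for (String[] row : big) {
          String match = small.get(row[0]);   // probe the hash map
          if (match != null) {                // inner-join semantics
            joined.add(row[0] + "," + row[1] + "," + match);
          }
        }
        System.out.println(joined);           // [1,a,one, 2,c,two]
      }
    }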
Left Semi Join
---------------
Can be implemented with a Left Outer Join.
Data skew
---------
GroupBy: hive.groupby.skewindata = false
JOIN: set hive.optimize.skewjoin=true;
Storage optimization
===================================
Fewer files (partition pruning, bucket optimization)
----------------------------------------------------
Partition pruning: enabled by default;
e.g. where baiduid=''
Fewer rows
--------------------
(predicate pushdown)
Filter out rows that do not take part in the computation as early as possible; in a JOIN, for example, rows whose join key is NULL or empty can be dropped.
a join b on a.key=b.key and a.key is not null
A JOIN B JOIN C on A.key=B.key and B.key=C.key
=>
A JOIN B on A.key=B.key JOIN C on B.key=C.key;
An optimization aimed at union all:
a join b on if(a.key is null, a.key1) a.key =b.key
Fewer columns
---------------------------
Column pruning optimization (RCFile/ORCFile)
****************
Reducing the number of MR jobs
==============================
Enabling DAG optimization
=========================
Reducing MapReduce framework scheduling overhead
================================================
Setting a reasonable number of MapReduce tasks
-----------------------------------------------
Set the split size
Enable CombineHiveInputFormat
Enabling the RemoteSplits feature
=================================
Data skew
=========
Join skew
---------------
Group-by skew
-------------
set hive.groupby.skewindata=true;
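The idea behind this setting is two-round aggregation: the first round spreads a hot key across reducers and pre-aggregates, and the second round merges the partial results by the real key. A plain-Java sketch of that idea (illustrative only, not Hive's implementation; here the spreading is simulated with a random salt) ::

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Random;

    public class SkewedGroupBySketch {
      public static void main(String[] args) {
        String[] rows = {"a", "a", "a", "a", "b", "c", "a"};
        Random rnd = new Random(42);
        int buckets = 3;

        // round 1: partial aggregation keyed by (key, salt)
        Map<String, Long> partial = new HashMap<>();
        for (String key : rows) {
          String salted = key + "#" + rnd.nextInt(buckets);
          partial.merge(salted, 1L, Long::sum);
        }

        // round 2: merge the partial counts by the real key
        Map<String, Long> totals = new HashMap<>();
        for (Map.Entry<String, Long> e : partial.entrySet()) {
          String realKey = e.getKey().substring(0, e.getKey().indexOf('#'));
          totals.merge(realKey, e.getValue(), Long::sum);
        }
        System.out.println(totals); // a=5, b=1, c=1 (map order may vary)
      }
    }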
HQL statement optimization
==========================
count(distinct) => groupby
Optimize with the whole workload in mind: making a single job optimal matters less than making the overall plan optimal.
Optimizing a single HQL query
=============================
From one day of cnblogs access logs, count the number of users who visited both blogger "小张" and blogger "小李". The inefficient, detail-oriented approach is to first select the users who visited "小张", then the users who visited "小李", and intersect the two, as in the following code
::
select count(*) from
(select distinct user_id
from cnblogs_visit_20130801 where blog_owner = ‘小张’) a
join
(select distinct user_id
from cnblogs_visit_20130801 where blog_owner = ‘小李’) b
on a.user_id = b.user_id;
This produces two subquery jobs (which can at least run in parallel), one join job, and one more job to compute the count.
If instead we compute it directly in an aggregation-oriented way, it fits the MapReduce model much better:
::
select count(*) from
(
select user_id,
count(case when blog_owner = ‘小张’ then 1 end) as visit_z,
count(case when blog_owner = ‘小李’ then 1 end) as visit_l
from cnblogs_visit_20130801 group by user_id
) t
where visit_z > 0 and visit_l > 0;
This formulation translates into only two jobs, the inner subquery and the outer aggregation; fewer jobs means a more efficient execution.
References:
1 http://www.cnblogs.com/sunyi514/p/3279957.html
<file_sep>__author__ = 'sunshangchun'
import commands
def exec_shell(cmd):
output = commands.getstatusoutput(cmd)
return (output[0], output[1].split("\n"))
def get_partition(line):
output = line.strip(" ").split(" ")
path_info = output[-1].rsplit("/", 2)
return path_info
def dfs_ls(path):
cmd = "hadoop fs -ls %s" % path
(ret, msg) = exec_shell(cmd)
output = []
if ret == 0:
for line in msg:
part_info = line.strip(" ").split(" ")
if len(part_info) != 3:
continue
output.append(part_info)
return (ret, output)
def repair_part(table, path, day):
    # `day` is not used here; day/hour are taken from the listing itself
    (ret, output) = dfs_ls(path)
    if ret != 0:
        return
    for part_info in output:
        hour = part_info[-1]
        day = part_info[-2]
        # assumed directory layout: <path>/<day>/<hour>
        location = "%s/%s/%s" % (path.rstrip("/"), day, hour)
        sql = "alter table %s add partition (p_day='%s', p_hour='%s') location '%s';" % (table, day, hour, location)
        print(sql)
def remove_invalid_path(date):
    for hour in range(0, 24):
        parent = "hdfs://nameservice1/user/azkaban/camus/productlog/hourly/%s/%02d" % (date, hour)
        # an "invalid" path is a duplicated hour directory nested under the parent
        path = parent + ("/%02d" % hour)
        (ret, msg) = exec_shell("hadoop fs -ls " + path)
        if ret == 0:
            cmd = "hadoop fs -mv %s/* %s" % (path, parent)
            print("exec %s" % cmd)
            exec_shell(cmd)
if __name__ == "__main__":
(ret, output) = dfs_ls("/data/userProfile/neo_str/delta")
if ret == 0 :
for line in output:
print line
# repair_part("etl_user_event", "/user/azkaban/camus/productlog/hourly/", date)
# line = " drwxr-xr-x - azkaban azkaban 0 2015-02-01 01:43 /user/azkaban/camus/productlog/hourly/2015-02-01/00"
# part_info = get_partition(line)
# print(part_info)
# for day in range(28, 29):
# day = "2015-02-%d" % day
# for hour in range(1, 22):
# location = "hdfs://nameservice1/user/azkaban/camus/productlog/hourly/%s/%02d" % (day, hour)
# sql="alter table etl_user_event add partition (p_day='%s', p_hour='%02d') location '%s';" % (day, hour, location)
# print(sql)
# file_put_content("hello", "helloa")
<file_sep>Registering on the new Gerrit and requesting permissions
1. Log in to http://git.inf.baidu.com:8888/
2. Fill in Contact Information, including user name and email.
(PS: the email address receives a confirmation mail; the confirmation link may end with '=', which must be kept intact.)
3. Set up SSH Public Keys
Copy the SSH key from the old Gerrit to the new one:
http://cq01-testing-ds015.cq01.baidu.com:8888/#/settings/ssh-keys
4. Ping 史吉冬 on Hi with your project and ask to be added to the corresponding permission group.
Pushing an existing review from the old Gerrit to the new one:
1) cd into the existing local git directory and add the new remote (named inf, pointing to the new repository):
git remote add inf ssh://${<EMAIL>:29418/baidu
2) Push the latest local branch to the new inf remote:
git push inf HEAD:refs/for/master    push the latest local code to the new repository
git remote rm origin
git log --name-status
or
git log --name-only
or
git log --stat
git reflog
mvn test -Dtest=TestQueryLogger -DfailIfNoTests=alse
mvn cobertura:cobertura
-Dmaven.test.skip=true
git push origin HEAD:master
git remote show origin
git remote update
// checkout 远程分支
git checkout -b hst-master hive/hst
// push 远程分支
git push hive HEAD:refs/for/hst
git remote set-url origin git://new.url.here
Resolving conflicts during git rebase
=====================================
git checkout --theirs foo/bar.java
git add foo/bar.java
git rebase --continue
Check out a single file from another branch
git checkout branch1 -- filenamefoo.txt
git checkout
git diff branch1:path/to/foo.txt branch2:path/to/foo-another.txt
$REVISION can be as shown in git rev-parse:
experiment@{yesterday}:app.js # app.js as it was yesterday
experiment^:app.js # app.js on the first commit parent
experiment@{2}:app.js # app.js two commits ago
======
A change in git has three states: Changed, Staged, Committed.
Changed (unstaged): you have modified a file but have not run any git command yet.
Staged: after git add (or git commit -a) the change enters the staged state, i.e. you have declared that it is going to be committed.
Committed: after commit, a new commit id is created and the change enters this state.
git init:初始化一个目录,其实就是加了一个.git的隐藏目录
git clone:远程复制一个完整的repository到本地,比如git clone git://github.com/schacon/simplegit.git,就是从git://github.com/schacon/simplegit.git这个地址clone到本地当前目录。
git add:把一个文件从change->staged状态。git add test.txt。注意,不仅仅是添加新文件,修改现有文件也要git add来修改状态,否则git不会考虑将之commit。(当然可以git commit -a省力点)
git status:刚添加完,还没commit,这时候就能用git status -s看看当前修改和仓库里面差别多少,可以看到有多少文件被新增了,多少被修改了,多少被删除了。加个-s用简洁模式查看。一般在git commit之前看一把。
git diff:不加参数比较当前修改的文件和上次commit在仓库里面的区别。git diff develop,查看当前版本和develop分支的差异。如果想比较某个目录下的文件,可以用git diff -- ./libs,这个只比较libs目录下的文件。
git commit:git commit -m 'message here'。提交到仓库,必须要一个message。如果嫌每次都是先git add,再git commit,很麻烦的话,直接git commit -am 'message',带上-a后全部一把进去。
git log:查看提交记录。只想看某个分支的话,git log develop。带上--oneline用简洁模式查看。
git reset:撤销某次提交。最普通用法,git reset HEAD -- file,将某个文件从staged状态->unstaged状态,这文件也不能被commit了。git reset --hard HEAD~1,回退到当前HEAD之前的一个版本。
git branch:不带任何参数,就是看当前目录有多少分支,默认init后一般会有一个master。git branch develop,创建一个develop分支。git branch -d develop,删除develop分支。-D参数则表示不管有没有merge过,都强制删除。
git checkout:快速切换分支,比如git checkout develop,马上切换到develop分支。这个地方我觉得git很牛逼,不用换目录,立马换一套context。
git tag:git tag -a v1.0,将最后一次commit(HEAD)标记为永久的v1.0版本。如果要给以前某次commit打tag,也可以加上提交的版本号就行(版本号可以通过git log --online看到)
git remote:列出所有的远程仓库。从别处clone来的,默认都会有一个别名"origin"的仓库。带上-v可以看到具体URL。git remote add/rw,添加/删除远程仓库地址。其实这些操作都是在本地,并没有实际牵涉到远程。另外github里面folk过来的,默认叫"upstream"。
git fetch:从远程下载分支。git fetch upstream A:B,将远程仓库upstream下的A分支下载到本地,本地叫B分支。如果不带A:B参数,则下载以后,可能会叫upstream/A(如果远程是A分支的话),远程分支要通过git branch -r查看。一般的做法是先git fetch upstream master:tmp(将远程的master先下载到本地的tmp分支,然后git diff tmp看看本地master和tmp的区别,没问题的话再git merge tmp。这样比直接git pull upstream来的安全。
git pull:同fetch,只是下载以后,直接进行merge。比如git pull upstream master,就直接将upstream下载下来,与本地的master合并。
git push:git push origin A:B,将本地的A分支push到远程仓库origin下,并叫做B。如果省略:B,那么一般本地和远程的分支同名。特殊情况:删除远程分支可用通过push一个本地空分支来做到。git push origin :B,push一个空分支到origin下的B,即删除了远程分支B
<file_sep>mysql手册
###############
create database azkaban character set utf8 collate utf8_general_ci;
CREATE DATABASE mydb CHARACTER SET utf8 COLLATE utf8_bin;
CREATE USER 'myuser'@'%' IDENTIFIED BY PASSWORD <PASSWORD>';
GRANT ALL ON mydb.* TO 'myuser'@'%';
GRANT ALL ON mydb TO 'myuser'@'%';
GRANT ALL PRIVILEGES ON *.* TO 'myuser'@'%'
GRANT CREATE ON mydb TO 'myuser'@'%';
FLUSH PRIVILEGES;
export PIG_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hive/lib/logging-assembly-0.1.0.jar<file_sep>__author__ = 'sunshangchun'
<file_sep>***********************
Dryad
***********************
http://research.microsoft.com/en-us/projects/dryad/
<file_sep># skynet-endpoint-js-raspi
<file_sep>var mqtt = require('mqtt');
var explorer_hat = require('./explorer-hat/explorer-hat');
var client = mqtt.connect('tcp://localhost', {
clientId: 'skynet_ep_' + Math.random().toString(16).substr(2,5)
});
client.on('connect', function () {
// Subscribe to the relevant topics
client.subscribe('skynet/control/#');
client.subscribe('skynet/clients/#');
});
client.on('message', function (topic, message) {
});
console.log(explorer_hat);
// explorer_hat initializes asynchronously (RASPI.init), so wait for
// isReady() before wiring up the touch handler.
var readyPoll = setInterval(function () {
  if (!explorer_hat.isReady()) return;
  clearInterval(readyPoll);
  explorer_hat.touch().one.pressed(function (channel, evt) {
    console.log("hey! I got a touch on button ", channel);
  });
}, 100);
<file_sep>// NOTE - Remove relative paths when deploying for realz
var RASPI = require('raspi');
var GPIO = require('raspi-gpio');
var I2C = require('raspi-i2c').I2C;
var BOARD = require('raspi-board');
var PIBLASTER = require('pi-blaster.js');
var ADC = require('./ads1015');
var CAP1208 = require('./cap1xxx');
// Onboard LEDs above 1, 2, 3, 4
var LED1 = 7; // BCM - 4
var LED2 = 0; // BCM - 17;
var LED3 = 2; // BCM - 27;
var LED4 = 21; // BCM - 5;
// Outputs via ULN2003A
var OUT1 = 22; // BCM - 6;
var OUT2 = 26; // BCM - 12;
var OUT3 = 23; // BCM - 13;
var OUT4 = 27; // BCM - 16;
// 5v Tolerant Inputs
var IN1 = 4; // BCM - 23;
var IN2 = 2; // BCM - 22;
var IN3 = 5; // BCM - 24;
var IN4 = 6; // BCM - 25;
// Motor, via DRV8833PWP Dual H-Bridge
var M1B = 24; // BCM - 19;
var M1F = 28; // BCM - 20;
var M2B = 29; // BCM - 21;
var M2F = 25; // BCM - 26;
// Number of times to update
// pulsing LEDs per second
var PULSE_FPS = 50;
var PULSE_FREQUENCY = 1000;
var DEBOUNCE_TIME = 20;
var CAP_PRODUCT_ID = 107;
// Pulse class
function Pulse(pin, timeOn, timeOff, transitionOn, transitionOff) {
var d_pin = pin; // Class Pin
var d_timeOn = timeOn;
var d_timeOff = timeOff;
var d_transitionOn = transitionOn;
var d_transitionOff = transitionOff;
var d_fps = PULSE_FPS;
var d_timeStart = Date.now();
var d_paused = false;
var d_intervalToken;
function start() {
d_pin.frequency(PULSE_FREQUENCY);
if (d_paused) {
d_timeStart = Date.now();
d_paused = false;
return;
}
d_timeStart = Date.now();
d_intervalToken = setInterval(function() {
// RUN
if (!d_paused) {
var currTime = Date.now() - d_timeStart;
var delta = currTime % (d_transitionOn + d_timeOn + d_transitionOff + d_timeOff);
var timeOff = d_transitionOn + d_timeOn + d_transitionOff;
var timeOn = d_transitionOn + d_timeOn;
if (delta <= d_transitionOn) {
// Transition on phase
d_pin.dutyCycle(Math.round((100.0 / d_transitionOn) * delta));
}
else if (timeOn < delta && delta <= timeOff) {
// Transition Off phase
var currDelta = delta - d_transitionOn - d_timeOn;
d_pin.dutyCycle(Math.round(100.0 - ((100.0 / d_transitionOff) * currDelta)));
}
else if (d_transitionOn < delta && delta <= timeOn) {
d_pin.dutyCycle(100);
}
else if (delta > timeOff) {
d_pin.dutyCycle(0);
}
}
}, (1.0 / d_fps) * 1000);
}
function pause() {
d_paused = true;
}
function stop() {
if (d_intervalToken) {
  // d_intervalToken is the handle returned by setInterval
  clearInterval(d_intervalToken);
}
d_intervalToken = null;
d_pin.dutyCycle(0);
}
// Exports
this.start = start;
this.pause = pause;
this.stop = stop;
}
function Motor(pinFwd, pinBack) {
var d_invert = false;
var d_pinFwd = pinFwd;
var d_pinBack = pinBack;
var d_speed = 0; // -100 - 100
// Use pi-blaster to set the speed to 0
PIBLASTER.setPwm(d_pinFwd, 0); // 0.0 - 1.0
PIBLASTER.setPwm(d_pinBack, 0);
function invert() {
d_invert = !d_invert;
d_speed = -d_speed;
speed(d_speed);
return d_invert;
}
function forwards(spd) {
if (spd === undefined) spd = 100;
if (spd < 0 || spd > 100) {
throw new Error("Speed must be between 0 and 100");
}
if (d_invert) {
speed(-spd);
}
else {
speed(spd);
}
}
function backwards(spd) {
if (spd === undefined) spd = 100;
if (spd < 0 || spd > 100) {
throw new Error("Speed must be between 0 and 100");
}
if (d_invert) {
speed(spd);
}
else {
speed(-spd);
}
}
function speed(spd) {
if (spd === undefined) spd = 100;
if (spd < -100 || spd > 100) {
throw new Error("Speed must be between -100 and 100");
}
d_speed = spd;
if (spd > 0) {
PIBLASTER.setPwm(d_pinBack, 0);
PIBLASTER.setPwm(d_pinFwd, spd / 100.0);
}
if (spd < 0) {
PIBLASTER.setPwm(d_pinFwd, 0);
PIBLASTER.setPwm(d_pinBack, Math.abs(spd / 100.0));
}
if (spd === 0) {
PIBLASTER.setPwm(d_pinBack, 0);
PIBLASTER.setPwm(d_pinFwd, 0);
}
return spd;
}
function stop() {
speed(0);
}
this.invert = invert;
this.forwards = forwards;
this.forward = forwards;
this.backwards = backwards;
this.backward = backwards;
this.speed = speed;
this.reverse = invert;
}
function Input(pin) {
var d_pin = pin;
var d_pinImpl = new GPIO.DigitalInput(pin);
var d_handlePressed = null;
var d_handleReleased = null;
var d_handleChanged = null;
var d_hasCallback = false;
// Things that 'Pin' should have implemented
var d_last = read();
var d_handleChange = false;
var d_handleHigh = false;
var d_handleLow = false;
var d_pollToken = null;
function read() {
return d_pinImpl.read() === GPIO.HIGH;
}
function hasChanged() {
if (read() !== d_last) {
d_last = read();
return true;
}
return false;
}
function isOff() {
return (read() === false);
}
function isOn() {
return (read() === true);
}
function onHigh(callback, bounceTime) {
if (bounceTime === undefined) bounceTime = DEBOUNCE_TIME;
d_handlePressed = callback;
_setupCallback(bounceTime);
return true;
}
function _setupCallback(bouncetime) {
if (d_hasCallback) {
return false;
}
// We might need to implement this as polling
d_last = read();
d_pollToken = setInterval(function () {
var currVal = read();
if (currVal !== d_last) {
if (currVal && d_handlePressed) {
d_handlePressed();
}
else if (!currVal && d_handleReleased) {
d_handleReleased();
}
if (d_handleChanged) {
d_handleChanged();
}
}
d_last = currVal;
}, 10);
d_hasCallback = true;
return true;
}
function onLow(callback, bounceTime) {
if (bounceTime === undefined) bounceTime = DEBOUNCE_TIME;
d_handleReleased = callback;
_setupCallback(bounceTime);
return true;
}
function onChanged(callback, bounceTime) {
if (bounceTime === undefined) bounceTime = DEBOUNCE_TIME;
d_handleChanged = callback;
_setupCallback(bounceTime);
return true;
}
function clearEvents() {
if (d_pollToken) {
  clearInterval(d_pollToken);
}
d_pollToken = null;
d_hasCallback = false;
}
this.read = read;
this.hasChanged = hasChanged;
this.isOn = isOn;
this.isOff = isOff;
this.onHigh = onHigh;
this.onLow = onLow;
this.onChanged = onChanged;
this.clearEvents = clearEvents;
this.changed = onChanged;
this.pressed = onHigh;
this.released = onLow;
}
function Output(pin) {
var d_pin = pin;
var d_pinImpl = new GPIO.DigitalOutput(pin);
var d_pulser = new Pulse(this, 0, 0, 0, 0);
var d_blinking = false;
var d_pulsing = false;
var d_fading = false;
var d_fader = null;
var d_value = false;
function write(value) {
if (value !== true && value !== false && value !== 1 && value !== 0) {
throw new Error("Legal values to be written are 1/true or 0/false");
}
d_value = (value === true || value === 1) ? true : false;
d_pinImpl.write(d_value);
return true;
}
function on() {
write(true);
return true;
}
function off() {
write(false);
return true;
}
function toggle() {
write(!d_value);
}
this.write = write;
this.on = on;
this.off = off;
this.high = on;
this.low = off;
this.toggle = toggle;
}
function AnalogInput(adc, channel) {
var d_channel = channel;
var d_sensitivity = 0.1;
var d_tWatch = null;
var d_lastValue = null;
var d_handler = null;
function read() {
return adc.readADC(d_channel);
}
function sensitivity(sens) {
d_sensitivity = sens;
}
function changed(handler, sens) {
d_handler = handler;
if (sens !== undefined) {
d_sensitivity = sens;
}
if (!d_tWatch) {
d_tWatch = setInterval(function() {
var value = read();
if (d_lastValue !== null && Math.abs(value - d_lastValue) > d_sensitivity) {
if (d_handler) {
d_handler(value);
}
}
d_lastValue = value;
}, 10);
}
}
this.read = read;
this.sensitivity = sensitivity;
this.changed = changed;
}
var CapTouchSettings = {
enableMultitouch: function(en) {
if (en === undefined) en = true;
d_cap1208.enableMultitouch(en);
}
};
function CapTouchInput(cap1208, channel, alias) {
var d_alias = alias;
var d_pressed = false;
var d_held = false;
var d_channel = channel;
var d_handlers = {
'press': null,
'release': null,
'held': null
};
['press', 'release', 'held'].forEach(function(evt) {
cap1208.on(d_channel, evt, _handleState);
});
function _handleState(channel, evt) {
if (channel === d_channel) {
if (evt === 'press') {
d_pressed = true;
}
else if (evt === 'held') {
d_held = true;
}
else if (['release', 'none'].indexOf(evt) !== -1) {
d_pressed = false;
d_held = false;
}
if (d_handlers[evt]) {
d_handlers[evt](d_alias, evt);
}
}
}
function isPressed() {
return d_pressed;
}
function isHeld() {
return d_held;
}
function pressed(handler) {
d_handlers['press'] = handler;
}
function released(handler) {
d_handlers['release'] = handler;
}
function held(handler) {
d_handlers['held'] = handler;
}
this.isPressed = isPressed;
this.isHeld = isHeld;
this.pressed = pressed;
this.released = released;
this.held = held;
}
var d_isReady = false;
var d_adc;
var d_cap1208;
var d_workers = {};
function asyncStart(name, func, pollTime) {
  // cancel any worker already registered under this name, then start a new one
  if (d_workers[name]) {
    clearInterval(d_workers[name]);
  }
  d_workers[name] = setInterval(func, pollTime);
}
function asyncStop(name) {
  if (d_workers[name]) {
    clearInterval(d_workers[name]);
  }
  d_workers[name] = undefined;
}
function asyncStopAll() {
  for (var name in d_workers) {
    if (d_workers[name]) {
      clearInterval(d_workers[name]);
    }
    d_workers[name] = undefined;
  }
}
var settings, light, output, input, touch, motor, analog;
RASPI.init(function () {
var i2c = new I2C();
d_adc = new ADC(i2c);
d_cap1208 = new CAP1208(i2c, GPIO);
d_cap1208.initialize();
settings = {
touch: CapTouchSettings
};
light = {
blue: new Output(LED1),
yellow: new Output(LED2),
red: new Output(LED3),
green: new Output(LED4)
};
output = {
one: new Output(OUT1),
two: new Output(OUT2),
three: new Output(OUT3),
four: new Output(OUT4)
};
input = {
one: new Input(IN1),
two: new Input(IN2),
three: new Input(IN3),
four: new Input(IN4)
};
touch = {
one: new CapTouchInput(d_cap1208, 4, 1),
two: new CapTouchInput(d_cap1208, 5, 2),
three: new CapTouchInput(d_cap1208, 6, 3),
four: new CapTouchInput(d_cap1208, 7, 4),
five: new CapTouchInput(d_cap1208, 0, 5),
six: new CapTouchInput(d_cap1208, 1, 6),
seven: new CapTouchInput(d_cap1208, 2, 7),
eight: new CapTouchInput(d_cap1208, 3, 8)
};
motor = {
//one: new Motor(M1F, M1B),
//two: new Motor(M2F, M2B)
};
analog = {
  // AnalogInput takes the shared ADC instance plus the channel number
  one: new AnalogInput(d_adc, 3),
  two: new AnalogInput(d_adc, 2),
  three: new AnalogInput(d_adc, 1),
  four: new AnalogInput(d_adc, 0)
};
d_isReady = true;
});
var EXPLORER_HAT = {
isReady: function() {
return d_isReady;
},
settings: function() { return settings; },
light: function() { return light; },
output: function() { return output; },
input: function() { return input; },
touch: function() { return touch; },
motor: function() { return motor; },
analog: function() { return analog; }
};
module.exports = EXPLORER_HAT;
<file_sep>// CAP1XXX Capacitive Touch Sensors
// cap 1208 - 8 inputs
// cap 1188 - 8 inputs, 8 LEDs
var Promise = require('promise');
var ADDRESS = 0x28;
// Supported devices
var PID_CAP1208 = 0x6B;
var PID_CAP1188 = 0x50;
var PID_CAP1166 = 0x51;
// REGISTER MAP
var R_MAIN_CONTROL = 0x00;
var R_GENERAL_STATUS = 0x02;
var R_INPUT_STATUS = 0x03;
var R_LED_STATUS = 0x04;
var R_NOISE_FLAG_STATUS = 0x0A;
// Read-only delta counts for all inputs
var R_INPUT_1_DELTA = 0x10;
var R_INPUT_2_DELTA = 0x11;
var R_INPUT_3_DELTA = 0x12;
var R_INPUT_4_DELTA = 0x13;
var R_INPUT_5_DELTA = 0x14;
var R_INPUT_6_DELTA = 0x15;
var R_INPUT_7_DELTA = 0x16;
var R_INPUT_8_DELTA = 0x17;
var R_SENSITIVITY = 0x1F;
// B7 = N/A
// B6..B4 = Sensitivity
// B3..B0 = Base Shift
var SENSITIVITY = {
128: 0x00,
64: 0x01,
32: 0x02,
16: 0x03,
8: 0x04,
4: 0x05,
2: 0x06,
1: 0x07
};
var R_GENERAL_CONFIG = 0x20;
// B7 = Timeout
// B6 = Wake Config ( 1 = Wake pin asserted )
// B5 = Disable Digital Noise ( 1 = Noise threshold disabled )
// B4 = Disable Analog Noise ( 1 = Low frequency analog noise blocking disabled )
// B3 = Max Duration Recalibration ( 1 = Enable recalibration if touch is held longer than max duration )
// B2..B0 = N/A
var R_INPUT_ENABLE = 0x21;
var R_INPUT_CONFIG = 0x22;
var R_INPUT_CONFIG2 = 0x23; // Default 0x00000111
// Values for bits 3 to 0 of R_INPUT_CONFIG2
// Determines minimum amount of time before
// a "press and hold" event is detected.
// Also - Values for bits 3 to 0 of R_INPUT_CONFIG
// Determines rate at which interrupt will repeat
//
// Resolution of 35ms, max = 35 + (35 * 0b1111) = 560ms
var R_SAMPLING_CONFIG = 0x24; // Default 0x00111001
var R_CALIBRATION = 0x26; // Default 0b00000000
var R_INTERRUPT_EN = 0x27; // Default 0b11111111
var R_REPEAT_EN = 0x28; // Default 0b11111111
var R_MTOUCH_CONFIG = 0x2A; // Default 0b11111111
var R_MTOUCH_PAT_CONF = 0x2B;
var R_MTOUCH_PATTERN = 0x2D;
var R_COUNT_O_LIMIT = 0x2E;
var R_RECALIBRATION = 0x2F;
// R/W Touch detection thresholds for inputs
var R_INPUT_1_THRESH = 0x30;
var R_INPUT_2_THRESH = 0x31;
var R_INPUT_3_THRESH = 0x32;
var R_INPUT_4_THRESH = 0x33;
var R_INPUT_5_THRESH = 0x34;
var R_INPUT_6_THRESH = 0x35;
var R_INPUT_7_THRESH = 0x36;
var R_INPUT_8_THRESH = 0x37;
// R/W Noise threshold for all inputs
var R_NOISE_THRESH = 0x38;
// R/W Standby and Config Registers
var R_STANDBY_CHANNEL = 0x40;
var R_STANDBY_CONFIG = 0x41;
var R_STANDBY_SENS = 0x42;
var R_STANDBY_THRESH = 0x43;
var R_CONFIGURATION2 = 0x44;
// B7 = Linked LED Transition Controls ( 1 = LED trigger is !touch )
// B6 = Alert Polarity ( 1 = Active Low Open Drain, 0 = Active High Push Pull )
// B5 = Reduce Power ( 1 = Do not power down between poll )
// B4 = Link Polarity/Mirror bits ( 0 = Linked, 1 = Unlinked )
// B3 = Show RF Noise ( 1 = Noise status registers only show RF, 0 = Both RF and EMI shown )
// B2 = Disable RF Noise ( 1 = Disable RF noise filter )
// B1..B0 = N/A
// Read-only reference counts for sensor inputs
var R_INPUT_1_BCOUNT = 0x50;
var R_INPUT_2_BCOUNT = 0x51;
var R_INPUT_3_BCOUNT = 0x52;
var R_INPUT_4_BCOUNT = 0x53;
var R_INPUT_5_BCOUNT = 0x54;
var R_INPUT_6_BCOUNT = 0x55;
var R_INPUT_7_BCOUNT = 0x56;
var R_INPUT_8_BCOUNT = 0x57;
// LED Controls - For CAP1188 and similar
var R_LED_OUTPUT_TYPE = 0x71;
var R_LED_LINKING = 0x72;
var R_LED_POLARITY = 0x73;
var R_LED_OUTPUT_CON = 0x74;
var R_LED_LTRANS_CON = 0x77;
var R_LED_MIRROR_CON = 0x79;
// LED Behaviour
var R_LED_BEHAVIOUR_1 = 0x81; // For LEDs 1-4
var R_LED_BEHAVIOUR_2 = 0x82; // For LEDs 5-8
var R_LED_PULSE_1_PER = 0x84;
var R_LED_PULSE_2_PER = 0x85;
var R_LED_BREATHE_PER = 0x86;
var R_LED_CONFIG = 0x88;
var R_LED_PULSE_1_DUT = 0x90;
var R_LED_PULSE_2_DUT = 0x91;
var R_LED_BREATHE_DUT = 0x92;
var R_LED_DIRECT_DUT = 0x93;
var R_LED_DIRECT_RAMP = 0x94;
var R_LED_OFF_DELAY = 0x95;
// R/W Power buttonc ontrol
var R_POWER_BUTTON = 0x60;
var R_POW_BUTTON_CONF = 0x61;
// Read-only upper 8-bit calibration values for sensors
var R_INPUT_1_CALIB = 0xB1;
var R_INPUT_2_CALIB = 0xB2;
var R_INPUT_3_CALIB = 0xB3;
var R_INPUT_4_CALIB = 0xB4;
var R_INPUT_5_CALIB = 0xB5;
var R_INPUT_6_CALIB = 0xB6;
var R_INPUT_7_CALIB = 0xB7;
var R_INPUT_8_CALIB = 0xB8;
// Read-only 2 LSBs for each sensor input
var R_INPUT_CAL_LSB1 = 0xB9;
var R_INPUT_CAL_LSB2 = 0xBA;
// Product ID Registers
var R_PRODUCT_ID = 0xFD;
var R_MANUFACTURER_ID = 0xFE;
var R_REVISION = 0xFF;
// LED Behaviour settings
var LED_BEHAVIOUR_DIRECT = 0x0;
var LED_BEHAVIOUR_PULSE1 = 0x1;
var LED_BEHAVIOUR_PULSE2 = 0x2;
var LED_BEHAVIOUR_BREATHE = 0x3;
var LED_OPEN_DRAIN = 0; // Default, LED is open-drain output with ext pullup
var LED_PUSH_PULL = 1; // LED is driven HIGH/LOW with logic 1/0
var LED_RAMP_RATE_2000MS = 7;
var LED_RAMP_RATE_1500MS = 6;
var LED_RAMP_RATE_1250MS = 5;
var LED_RAMP_RATE_1000MS = 4;
var LED_RAMP_RATE_750MS = 3;
var LED_RAMP_RATE_500MS = 2;
var LED_RAMP_RATE_250MS = 1;
var LED_RAMP_RATE_0MS = 0;
function CAP1XXX(i2c, gpio) {
var d_supported = [PID_CAP1208, PID_CAP1188, PID_CAP1166];
var d_numInputs = 8;
var d_numLeds = 8;
var d_alertPinId;
var d_alertPin;
var d_resetPinId;
var d_resetPin;
var d_touchHandlers;
var d_internalHandlers = {
'press': [],
'release': [],
'held': []
};
var d_initialized = false;
var d_delta = 50;
var d_lastInputStatus = [];
var d_inputStatus = [];
var d_inputDelta = [];
var d_inputPressed = [];
var d_repeatEnabled = 0x00;
var d_releaseEnabled = 0xFF;
var d_pollToken;
function initialize(alertPin, resetPin, onTouch, skipInit) {
var i;
if (alertPin === undefined) alertPin = -1;
d_alertPinId = alertPin;
if (resetPin === undefined) resetPin = -1;
d_resetPinId = resetPin;
if (onTouch === undefined) {
onTouch = [];
for (i = 0; i < d_numInputs; i++) {
onTouch[i] = null;
}
}
d_touchHandlers = onTouch;
skipInit = !!skipInit;
if (d_alertPinId !== -1) {
d_alertPin = new gpio.DigitalInput({
pin: d_alertPinId,
pullResistor: gpio.PULL_UP
});
}
if (d_resetPinId !== -1) {
d_resetPin = new gpio.DigitalOutput(d_resetPinId);
d_resetPin.write(gpio.LOW);
d_resetPin.write(gpio.HIGH);
// TODO sleep?
d_resetPin.write(gpio.LOW);
}
// Set up internal state
for (i = 0; i < d_numInputs; i++) {
d_internalHandlers['press'][i] = null;
d_internalHandlers['release'][i] = null;
d_internalHandlers['held'][i] = null;
d_lastInputStatus[i] = false;
d_inputStatus[i] = 'none';
d_inputDelta[i] = 0;
d_inputPressed[i] = false;
}
// At this point, we are ready
d_initialized = true;
if (skipInit) {
return;
}
// Initialize the IC
// Enable all inputs with interrupt by default
enableInputs(0xFF);
enableInterrupts(0xFF);
// Disable repeat for all channels, but give it sane defaults
enableRepeat(0x00);
enableMultitouch(true);
setHoldDelay(210);
setRepeatRate(210);
// Tested sane defaults for various configurations
_writeByte(R_SAMPLING_CONFIG, 0x08); // 1 sample per measure, 1.28ms time, 35ms cycle
_writeByte(R_SENSITIVITY, 0x60); // 2x Sensitivity
_writeByte(R_GENERAL_CONFIG, 0x38);
_writeByte(R_CONFIGURATION2, 0x60);
setTouchDelta(10);
}
function getInputStatus() {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
// Get the status of all inputs
// Returns an array of 8 boolean values indicating
// whether an input has been triggered since the
// interrupt flag was last cleared
var touched = _readByte(R_INPUT_STATUS);
var threshold = _readBlock(R_INPUT_1_THRESH, d_numInputs);
var delta = _readBlock(R_INPUT_1_DELTA, d_numInputs);
for (var x = 0; x < d_numInputs; x++) {
if ((1 << x) & touched) {
var status = 'none';
var _delta = _getTwosComp(delta[x]);
// We only ever want to detect PRESS events
// If repeat is disabled and release detect is enabled
if (_delta >= threshold[x]) {
d_inputDelta[x] = _delta;
// touch down event
if (['press','held'].indexOf(d_inputStatus[x]) !== -1) {
if (d_repeatEnabled & (1 << x)) {
status = 'held';
}
}
if (['none','release'].indexOf(d_inputStatus[x]) !== -1) {
if (d_inputPressed[x]) {
status = 'none';
}
else {
status = 'press';
}
}
}
else {
// Touch Release event
if ((d_releaseEnabled & (1 << x)) && d_inputStatus[x] !== 'release') {
status = 'release';
}
else {
status = 'none';
}
}
d_inputStatus[x] = status;
d_inputPressed[x] = (['press','held','none'].indexOf(status) !== -1);
}
else {
d_inputStatus[x] = 'none';
d_inputPressed[x] = false;
}
}
return d_inputStatus;
}
function _getTwosComp(val) {
if ((val & (1 << (8 - 1))) !== 0) {
val = val - (1 << 8);
}
return val;
}
function clearInterrupt() {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
// Clear the interrupt flag, bit 0 of the
// main control register
var main = _readByte(R_MAIN_CONTROL);
main &= ~0x01;
_writeByte(R_MAIN_CONTROL, main);
}
function _interruptStatus() {
if (d_alertPinId == -1) {
return ((_readByte(R_MAIN_CONTROL) & 1) == 1);
}
else {
return !d_alertPin.read();
}
}
function waitForInterrupt(timeout) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
// Default timeout in ms; this ensures the polling interval created below
// always self-clears even when no interrupt arrives.
if (timeout === undefined) timeout = 100;
// Wait for interrupt (bit 0) of the main
// control register to be set, indicating an
// input has been triggered
return new Promise(function(fulfill, reject) {
var start = Date.now();
var intervalToken = setInterval(function () {
var status = _interruptStatus();
if (status) {
clearInterval(intervalToken);
intervalToken = null;
fulfill(true);
}
if (Date.now() > start + timeout) {
clearInterval(intervalToken);
intervalToken = null;
fulfill(false);
}
}, 5);
});
}
function on(ch, evt, handler) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
if (ch === undefined) ch = 0;
if (evt === undefined) evt = 'press';
d_internalHandlers[evt][ch] = handler;
// TODO replace this with something
startWatching();
return true;
}
function startWatching() {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
if (d_alertPinId !== -1) {
// TODO implement?
}
if (!d_pollToken) {
d_pollToken = setInterval(function () {
waitForInterrupt().then(function(result) {
if (result) {
_handleAlert();
}
});
}, 10);
return true;
}
return false;
}
function stopWatching() {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
if (d_pollToken) {
// d_pollToken holds a setInterval handle; clear it to stop polling
clearInterval(d_pollToken);
d_pollToken = null;
}
return false;
}
function setTouchDelta(delta) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
d_delta = delta;
}
function autoRecalibrate(value) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
_changeBit(R_GENERAL_CONFIG, 3, value);
}
function filterAnalogNoise(value) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
_changeBit(R_GENERAL_CONFIG, 4, !value);
}
function filterDigitalNoise(value) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
_changeBit(R_GENERAL_CONFIG, 5, !value);
}
function setHoldDelay(ms) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
// Set time before a press and hold is detected
// Clamps to multiples of 35 from 35 to 560
var repeatRate = _calcTouchRate(ms);
var inputConfig = _readByte(R_INPUT_CONFIG2);
inputConfig = (inputConfig & ~0xF) | repeatRate;
_writeByte(R_INPUT_CONFIG2, inputConfig);
}
function setRepeatRate(ms) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
// Set repeat rate in milliseconds
// Clamps to multiples of 35 from 35 to 560
var repeatRate = _calcTouchRate(ms);
var inputConfig = _readByte(R_INPUT_CONFIG);
inputConfig = (inputConfig & ~0xF) | repeatRate;
_writeByte(R_INPUT_CONFIG, inputConfig);
}
function _calcTouchRate(ms) {
ms = Math.min(Math.max(ms, 0), 500);
var scale = parseInt(Math.round(ms / 35.0) * 35, 10) / 35;
return parseInt(scale, 10);
}
function _handleAlert(pin) {
var inputs = getInputStatus();
for (var x = 0; x < d_numInputs; x++) {
_triggerHandler(x, inputs[x]);
}
clearInterrupt();
}
function _triggerHandler(ch, event) {
if (event === 'none') {
return;
}
var handler = d_internalHandlers[event][ch];
if (handler) {
handler({
channel: ch,
event: event,
delta: d_inputDelta[ch]
});
}
}
function enableMultitouch(en) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
if (en === undefined) en = true;
// Toggles multitouch by toggling the multitouch
// block bit in the config register
var ret_mt = _readByte(R_MTOUCH_CONFIG);
if (en) {
_writeByte(R_MTOUCH_CONFIG, ret_mt & ~0x80);
}
else {
_writeByte(R_MTOUCH_CONFIG, ret_mt | 0x80);
}
}
function enableRepeat(inputs) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
d_repeatEnabled = inputs;
_writeByte(R_REPEAT_EN, inputs);
}
function enableInterrupts(inputs) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
_writeByte(R_INTERRUPT_EN, inputs);
}
function enableInputs(inputs) {
if (!d_initialized) {
throw new Error("Not yet initialized");
}
_writeByte(R_INPUT_ENABLE, inputs);
}
function _writeByte(register, value) {
i2c.writeByteSync(ADDRESS, register, value);
}
function _readByte(register) {
return i2c.readByteSync(ADDRESS, register);
}
function _readBlock(register, length) {
return i2c.readSync(ADDRESS, register, length);
}
function _setBit(register, bit) {
_writeByte(register, _readByte(register) | (1 << bit));
}
function _clearBit(register, bit) {
_writeByte(register, _readByte(register) & ~(1 << bit));
}
function _changeBit(register, bit, state) {
if (state) {
_setBit(register, bit);
}
else {
_clearBit(register, bit);
}
}
function _changeBits(register, offset, size, bits) {
var originalValue = _readByte(register);
for (var x = 0; x < size; x++) {
originalValue &= ~(1 << (offset+x));
}
originalValue |= (bits << offset);
_writeByte(register, originalValue);
}
this.initialize = initialize;
this.getInputStatus = getInputStatus;
this.clearInterrupt = clearInterrupt;
this.waitForInterrupt = waitForInterrupt;
this.on = on;
this.startWatching = startWatching;
this.stopWatching = stopWatching;
this.setTouchDelta = setTouchDelta;
this.autoRecalibrate = autoRecalibrate;
this.filterAnalogNoise = filterAnalogNoise;
this.filterDigitalNoise = filterDigitalNoise;
this.setHoldDelay = setHoldDelay;
this.setRepeatRate = setRepeatRate;
this.enableMultitouch = enableMultitouch;
this.enableRepeat = enableRepeat;
this.enableInterrupts = enableInterrupts;
this.enableInputs = enableInputs;
}
module.exports = CAP1XXX;
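// ---------------------------------------------------------------------------
// Usage sketch (kept as a comment so it does not execute on require).
// The `raspi-i2c` / `raspi-gpio` module names, the relative path to this file
// and the handler below are assumptions for illustration only; any objects
// matching the interfaces described at the top of this file will work, after
// the usual raspi initialisation.
//
//   var I2C = require('raspi-i2c').I2C;
//   var gpio = require('raspi-gpio');
//   var CAP1XXX = require('./cap1xxx');
//
//   var cap = new CAP1XXX(new I2C(), gpio);
//   cap.initialize(-1, -1);            // no ALERT/RESET pins wired up
//   cap.on(0, 'press', function (evt) {
//     console.log('Pad ' + evt.channel + ' pressed, delta ' + evt.delta);
//   });
// ---------------------------------------------------------------------------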
|
b4f14d0612228b6555e5bbf04594aee650e353ca
|
[
"Markdown",
"JavaScript"
] | 4 |
Markdown
|
zhiquanyeo/skynet-endpoint-js-raspi
|
633d4b1c36a9550ff101b7062df4c3c5aeb59ffb
|
829f9d092dd1b2ea0c296c70e1f8102dd42c5640
|
refs/heads/main
|
<repo_name>Guruprasad1399/SimplyNote_React_Native<file_sep>/screens/NoteItem.js
import React from "react";
import { View, Text, TouchableOpacity, StyleSheet } from "react-native";
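// NoteItem renders the `title` and `date` props; the parent also passes `id`
// and `content`, which are not displayed in this component.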
export default function NoteItem(props) {
return (
<TouchableOpacity activeOpacity={0.8}>
<View style={styles.listItem}>
<View style={styles.topc}>
<Text style={styles.txt}>{props.title}</Text>
</View>
<View style={styles.bottomc}>
<Text style={styles.date}>{props.date}</Text>
</View>
</View>
</TouchableOpacity>
);
}
const styles = StyleSheet.create({
listItem: {
flex: 1,
marginTop: 8,
padding: 10,
width: 350,
backgroundColor: "#ccc",
borderColor: "black",
borderWidth: 1,
elevation: 1,
},
topc: {
flexDirection: "row",
},
bottomc: {
flexDirection: "row",
},
txt: {
fontWeight: "600",
fontSize: 20,
color: "#000",
},
date: {
fontSize: 14,
},
});
<file_sep>/App.js
import React, { useState } from "react";
import { View, Text, StyleSheet, Alert, FlatList } from "react-native";
import AsyncStorage from "@react-native-async-storage/async-storage";
import moment from "moment";
import { Ionicons } from "@expo/vector-icons";
import NoteInput from "./screens/NoteInput";
import NoteItem from "./screens/NoteItem";
export default function HomeScreen() {
const [isMode, setIsMode] = useState(false);
const [curnotes, setCurNotes] = useState([]);
const onNote = (notes) => {
setCurNotes((currentNotes) => [
...currentNotes,
{
id: Math.random().toString(),
value: [notes.title, notes.content],
Date: moment().format("Do MMMM YYYY, h:mm a"),
},
]);
setIsMode(false);
};
const onBack = () => {
Alert.alert("Warning", "Are you sure you want to go back", [
{
text: "Yes",
onPress: () => {
setIsMode(false);
},
},
{
text: "No",
onPress: () => {
console.log("Ok");
},
style: "cancel",
},
]);
};
return (
<View style={styles.container}>
<View style={styles.container1}>
<Text style={styles.txt}>Notes App</Text>
<Ionicons
style={styles.icons}
name="add-circle-sharp"
size={40}
color="yellow"
onPress={() => setIsMode(true)}
/>
</View>
<NoteInput visible={isMode} onBack={onBack} note={onNote} />
<FlatList
keyExtractor={(item) => item.id}
data={curnotes}
renderItem={(itemData) => (
<NoteItem
id={itemData.item.id}
title={itemData.item.value[0]}
content={itemData.item.value[1]}
date={itemData.item.Date}
/>
)}
/>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
marginTop: 30,
alignItems: "center",
backgroundColor: "#353535",
},
txt: {
fontSize: 30,
fontWeight: "bold",
color: "#fff",
padding: 10,
},
container1: {
flexDirection: "row",
justifyContent: "center",
alignItems: "center",
},
icons: {
position: "absolute",
top: 15,
left: 225,
},
});
<file_sep>/screens/NoteInput.js
import React, { useState } from "react";
import {
View,
Text,
Modal,
StyleSheet,
Pressable,
TextInput,
} from "react-native";
import { Ionicons } from "@expo/vector-icons";
import { AutoGrowingTextInput } from "react-native-autogrow-textinput";
export default function NoteInput(props) {
const [enteredTitle, setenteredTitle] = useState("");
const [enteredContent, setenteredContent] = useState("");
const onSave = () => {
if (enteredTitle === "" || enteredContent === "") {
alert("Please add a Title or Content");
} else {
const newnote = {
title: enteredTitle,
content: enteredContent,
};
props.note(newnote);
setenteredTitle("");
setenteredContent("");
}
};
const renderSave = () => {
if (enteredTitle === "" || enteredContent === "") {
console.log("empty");
} else {
return (
<Pressable style={styles.icons}>
<Ionicons
name="save-sharp"
size={30}
color="yellow"
onPress={onSave}
/>
</Pressable>
);
}
};
return (
<Modal visible={props.visible} animationType="slide">
<View style={styles.container1}>
<View style={styles.container2}>
<Text style={styles.txt}>Type your Notes</Text>
{renderSave()}
<Pressable style={styles.icons1}>
<Ionicons
name="arrow-back-outline"
size={30}
color="yellow"
onPress={props.onBack}
/>
</Pressable>
</View>
<View style={styles.container3}>
<TextInput
style={styles.input}
placeholder="Title"
placeholderTextColor="white"
multiline={true}
numberOfLines={1}
maxLength={25}
onChangeText={setenteredTitle}
value={enteredTitle}
/>
</View>
<View style={styles.container4}>
<AutoGrowingTextInput
style={styles.input1}
placeholder="Content"
placeholderTextColor="white"
onChangeText={setenteredContent}
value={enteredContent}
/>
</View>
</View>
</Modal>
);
}
const styles = StyleSheet.create({
container1: {
flex: 1,
backgroundColor: "#353839",
},
txt: {
fontSize: 25,
fontWeight: "bold",
color: "#fff",
padding: 10,
},
container2: {
flexDirection: "row",
justifyContent: "center",
alignItems: "center",
},
icons: {
position: "absolute",
top: 13,
left: 350,
},
icons1: {
position: "absolute",
top: 13,
right: 350,
},
container3: {
padding: 5,
margin: 5,
width: "97%",
},
input: {
fontSize: 30,
color: "#fff",
borderBottomWidth: 1,
borderBottomColor: "white",
},
input1: {
borderBottomWidth: 1,
borderBottomColor: "white",
fontSize: 20,
color: "#fff",
},
container4: {
padding: 5,
marginTop: 10,
marginLeft: 5,
width: "97%",
},
});
|
7ede44fba27e721937d0d9493d35855ced6c3aab
|
[
"JavaScript"
] | 3 |
JavaScript
|
Guruprasad1399/SimplyNote_React_Native
|
9a01e386f78a9a9c275d394d49d26f0225247377
|
fa4516dc409ae98df238c08b9919843d6b6bc0d8
|
refs/heads/main
|
<file_sep><?php
/**
* Template part for displaying page content in page.php.
*
* @link https://codex.wordpress.org/Template_Hierarchy
*
* @package Clean_Commerce
*/
?>
<?php $args = array(
'parent' => 0,
'orderby' => 'ID',
'order' => 'ASC',
'show_count' => 1,
'hide_empty' => 0,
'hierarchical' => 0,
// 'depth' => 1,
// 'number' => 12,
'taxonomy' => 'product_cat' // mention taxonomy here.
);
?>
<?php $categories = get_categories( $args ); ?>
<div class="page__title-bar">
<h3>Категории</h3>
<div class="page__title-right">
<div class="swiper-button-prev slider-thumbs__prev swiper-button-prev1 swiper-button-disabled" tabindex="-1" role="button" aria-label="Previous slide" aria-disabled="true"></div>
<div class="swiper-button-next slider-thumbs__next swiper-button-next1" tabindex="0" role="button" aria-label="Next slide" aria-disabled="false"></div>
</div>
</div>
<div class="swiper-container slider-thumbs slider-init mb-20 s1 swiper-container-initialized swiper-container-horizontal" data-paginationtype="bullets" data-spacebetweenitems="10" data-itemsperview="auto">
<div class="swiper-wrapper" style="transform: translate3d(0px, 0px, 0px);">
<? foreach ($categories as $category) {?>
<?
// get the thumbnail id using the queried category term_id
$thumbnail_id = get_term_meta( $category->term_id, 'thumbnail_id', true );
// get the image URL
$image = wp_get_attachment_url( $thumbnail_id );
?>
<div class="swiper-slide slider-thumbs__slide slider-thumbs__slide--4h swiper-slide-active" style="margin-right: 10px;">
<div class="slider-thumbs__icon"><a href="#" ><img src="<?=$image;?>" alt="" title=""></a></div>
<div class="slider-thumbs__caption caption">
<div class="caption__content">
<h2 class="caption__title caption__title--centered"><a href="<?=get_category_link($category);?>" ><?=$category->name;?></a></h2>
</div>
</div>
</div>
<?}?>
<!-- SLIDER THUMBS -->
</div>
<div class="swiper-pagination slider-thumbs__pagination swiper-pagination1 swiper-pagination-bullets"><span class="swiper-pagination-bullet swiper-pagination-bullet-active"></span><span class="swiper-pagination-bullet"></span><span class="swiper-pagination-bullet"></span></div>
<span class="swiper-notification" aria-live="assertive" aria-atomic="true"></span></div>
<file_sep><?php
/**
* Theme functions and definitions.
*
* @link https://codex.wordpress.org/Functions_File_Explained
*
* @package Clean_Commerce
*/
if ( ! function_exists( 'clean_commerce_setup' ) ) :
/**
* Sets up theme defaults and registers support for various WordPress features.
*
* Note that this function is hooked into the after_setup_theme hook, which
* runs before the init hook. The init hook is too late for some features, such
* as indicating support for post thumbnails.
*/
function clean_commerce_setup() {
// Make theme available for translation.
load_theme_textdomain( 'clean-commerce', get_template_directory() . '/languages' );
// Add default posts and comments RSS feed links to head.
add_theme_support( 'automatic-feed-links' );
// Let WordPress manage the document title.
add_theme_support( 'title-tag' );
// Enable support for Post Thumbnails on posts and pages.
add_theme_support( 'post-thumbnails' );
// Register menu locations.
register_nav_menus( array(
'primary' => esc_html__( 'Primary Menu', 'clean-commerce' ),
'header' => esc_html__( 'Header Menu', 'clean-commerce' ),
'social' => esc_html__( 'Social Menu', 'clean-commerce' ),
'notfound' => esc_html__( '404 Menu', 'clean-commerce' ),
) );
/*
* Switch default core markup for search form, comment form, and comments
* to output valid HTML5.
*/
add_theme_support( 'html5', array(
'search-form',
'comment-form',
'comment-list',
'gallery',
'caption',
) );
// Set up the WordPress core custom background feature.
add_theme_support( 'custom-background', apply_filters( 'clean_commerce_custom_background_args', array(
'default-color' => 'FFFFFF',
'default-image' => '',
) ) );
// Enable support for selective refresh of widgets in Customizer.
add_theme_support( 'customize-selective-refresh-widgets' );
// Enable support for custom logo.
add_theme_support( 'custom-logo' );
// Load default block styles.
add_theme_support( 'wp-block-styles' );
// Add support for responsive embeds.
add_theme_support( 'responsive-embeds' );
// Enable support for footer widgets.
add_theme_support( 'footer-widgets', 4 );
// Enable support for WooCommerce.
add_theme_support( 'woocommerce' );
add_theme_support( 'wc-product-gallery-lightbox' );
}
endif;
add_action( 'after_setup_theme', 'clean_commerce_setup' );
/**
* Set the content width in pixels, based on the theme's design and stylesheet.
*
* Priority 0 to make it available to lower priority callbacks.
*
* @global int $content_width
*/
function clean_commerce_content_width() {
$GLOBALS['content_width'] = apply_filters( 'clean_commerce_content_width', 640 );
}
add_action( 'after_setup_theme', 'clean_commerce_content_width', 0 );
/**
* Register widget area.
*/
function clean_commerce_widgets_init() {
register_sidebar( array(
'name' => esc_html__( 'Primary Sidebar', 'clean-commerce' ),
'id' => 'sidebar-1',
'description' => esc_html__( 'Add widgets here to appear in your Primary Sidebar.', 'clean-commerce' ),
'before_widget' => '<aside id="%1$s" class="widget %2$s">',
'after_widget' => '</aside>',
'before_title' => '<h2 class="widget-title">',
'after_title' => '</h2>',
) );
}
add_action( 'widgets_init', 'clean_commerce_widgets_init' );
/**
* Enqueue scripts and styles.
*/
function clean_commerce_scripts() {
$min = defined( 'SCRIPT_DEBUG' ) && SCRIPT_DEBUG ? '' : '.min';
// wp_enqueue_style( 'font-awesome', get_template_directory_uri() . '/third-party/font-awesome/css/font-awesome' . $min . '.css', '', '4.7.0' );
// wp_enqueue_style( 'jquery-sidr', get_template_directory_uri() . '/third-party/sidr/css/jquery.sidr.dark' . $min . '.css', '', '2.2.1' );
// wp_enqueue_style( 'jquery-slick', get_template_directory_uri() . '/third-party/slick/slick' . $min . '.css', '', '1.6.0' );
// wp_enqueue_style( 'clean-commerce-style', get_stylesheet_uri() );
// wp_enqueue_script( 'clean-commerce-skip-link-focus-fix', get_template_directory_uri() . '/js/skip-link-focus-fix' . $min . '.js', array(), '20130115', true );
// wp_enqueue_script( 'jquery-sidr', get_template_directory_uri() . '/third-party/sidr/js/jquery.sidr' . $min . '.js', array( 'jquery' ), '2.2.1', true );
// wp_enqueue_script( 'jquery-slick', get_template_directory_uri() . '/third-party/slick/slick' . $min . '.js', array( 'jquery' ), '1.6.0', true );
// wp_enqueue_script( 'clean-commerce-custom', get_template_directory_uri() . '/js/custom' . $min . '.js', array( 'jquery' ), '1.0.1', true );
wp_enqueue_script( 'jquery', get_template_directory_uri() . '/js/jquery-3.5.1' . $min . '.js', array( 'jquery' ), '1.0.1', true );
wp_enqueue_script( 'swiper', get_template_directory_uri() . '/js/swiper' . $min . '.js', array( 'jquery' ), '1.0.1', true );
wp_enqueue_script( 'swiper-init', get_template_directory_uri() . '/js/swiper-init.js', array( 'jquery' ), '1.0.1', true );
wp_enqueue_script( 'jquery-custom', get_template_directory_uri() . '/js/jquery.custom.js', array( 'jquery' ), '1.0.1', true );
wp_enqueue_style( 'swiper-css', get_template_directory_uri() . '/css/swiper' . $min . '.css', array(), '1.0.0' );
wp_enqueue_style( 'style-css', get_template_directory_uri() . '/css/style.css', array(), '1.0.0' );
if ( is_singular() && comments_open() && get_option( 'thread_comments' ) ) {
wp_enqueue_script( 'comment-reply' );
}
}
add_action( 'wp_enqueue_scripts', 'clean_commerce_scripts' );
/**
* Enqueue admin scripts and styles.
*/
function clean_commerce_admin_scripts( $hook ) {
$min = defined( 'SCRIPT_DEBUG' ) && SCRIPT_DEBUG ? '' : '.min';
if ( in_array( $hook, array( 'post.php', 'post-new.php' ) ) ) {
//wp_enqueue_style( 'clean-commerce-metabox', get_template_directory_uri() . '/css/metabox' . $min . '.css', '', '1.0.0' );
//wp_enqueue_script( 'clean-commerce-custom-admin', get_template_directory_uri() . '/js/admin' . $min . '.js', array( 'jquery', 'jquery-ui-core', 'jquery-ui-tabs' ), '1.0.0', true );
}
if ( 'widgets.php' === $hook ) {
/*
wp_enqueue_style( 'wp-color-picker' );
wp_enqueue_script( 'wp-color-picker' );
wp_enqueue_media();
wp_enqueue_style( 'clean-commerce-custom-widgets-style', get_template_directory_uri() . '/css/widgets' . $min . '.css', array(), '1.0.0' );
wp_enqueue_script( 'clean-commerce-custom-widgets', get_template_directory_uri() . '/js/widgets' . $min . '.js', array( 'jquery' ), '1.0.0', true );
*/
}
}
add_action( 'admin_enqueue_scripts', 'clean_commerce_admin_scripts' );
/**
* Load init.
*/
require_once get_template_directory() . '/inc/init.php';
// Ajaxify cart
/**
* Show cart contents / total Ajax
*/
add_filter( 'woocommerce_add_to_cart_fragments', 'woocommerce_header_add_to_cart_fragment' );
function woocommerce_header_add_to_cart_fragment( $fragments ) {
global $woocommerce;
ob_start();
?>
<a class="cart-customlocation" href="<?php echo esc_url(wc_get_cart_url()); ?>" title="<?php _e( 'View your shopping cart' ); ?>">
<img src="<?=get_template_directory_uri();?>/img/white/cart.svg" alt="" title="">
<div class="cart-count"><?=$woocommerce->cart->cart_contents_count;?></div>
<div class="cart-sum"><?=$woocommerce->cart->get_cart_total();?></div>
</a>
<?php
$fragments['a.cart-customlocation'] = ob_get_clean();
return $fragments;
}
/**
* Change a currency symbol
*/
add_filter('woocommerce_currency_symbol', 'change_existing_currency_symbol', 10, 2);
function change_existing_currency_symbol( $currency_symbol, $currency ) {
switch( $currency ) {
case 'BYN': $currency_symbol = 'р.'; break;
}
return $currency_symbol;
}
/**
* Increase api limit
*/
function maximum_api_filter($query_params) {
$query_params['per_page']["maximum"]=1000;
$query_params['posts_per_page']["maximum"]=1000;
return $query_params;
}
add_filter('rest_page_collection_params', 'maximum_api_filter');
/**
* For import, disable image thumbnail generation in background
*/
add_action( 'init', 'disable_image_regeneration_process', 5 );
function disable_image_regeneration_process() {
add_filter( 'woocommerce_background_image_regeneration', '__return_false' );
}<file_sep>Product category - Archive-product.php
Product category slider -> /inc/woocommerce-actions.php
Product content content-product.php
Product loop /woocommerce/loop/loop-start.php
Product loop -> /woocommerce/loop/loop-end.php<file_sep> </div> <!-- Content area -->
</div> <!-- Page content -->
</div>
<!-- PAGE END -->
<div id="bottom-toolbar" class="bottom-toolbar"><div class="bottom-navigation">
<div class="swiper-container-toolbar swiper-toolbar swiper-container-initialized swiper-container-horizontal">
<div class="bottom-navigation__pagination"></div>
<div class="bottom-navigation__more"><img src="<?=get_template_directory_uri();?>/img/more.svg" alt="" title=""></div>
<div class="swiper-wrapper" style="transform: translate3d(0px, 0px, 0px);">
<div class="swiper-slide swiper-slide-active" style="width: 1369px;">
<ul class="bottom-navigation__icons">
<li><a href="/"><img src="<?=get_template_directory_uri();?>/img/white/home.svg" alt="" title=""></a></li>
<li><a href="#" class="open-panel" data-panel="right" data-arrow="left"><img src="<?=get_template_directory_uri();?>/img/white/user.svg" alt="" title=""></a></li>
<li><a href="main.html"><img src="<?=get_template_directory_uri();?>/img/blocks.svg" alt="" title=""></a></li>
<li>
<a class="cart-customlocation" href="<?php echo wc_get_cart_url(); ?>" title="<?php _e( 'View your shopping cart' ); ?>">
<img src="<?=get_template_directory_uri();?>/img/white/cart.svg" alt="" title="">
<div class="cart-count"><?=WC()->cart->get_cart_contents_count();?></div>
<div class="cart-sum"><?= WC()->cart->get_cart_total();?></div>
</a>
</li>
<li><a href="contact.html"><img src="<?=get_template_directory_uri();?>/img/contact.svg" alt="" title=""></a></li>
</ul>
</div>
<? /*
<div class="swiper-slide swiper-slide-next" style="width: 1369px;">
<ul class="bottom-navigation__icons">
<li><a href="blog.html"><img src="<?=get_template_directory_uri();?>/img/news.svg" alt="" title=""></a></li>
<li><a href="photos.html"><img src="<?=get_template_directory_uri();?>/img/photos.svg" alt="" title=""></a></li>
<li><a href="videos.html"><img src="<?=get_template_directory_uri();?>/img/video.svg" alt="" title=""></a></li>
<li><a href="chat.html"><img src="<?=get_template_directory_uri();?>/img/chat.svg" alt="" title=""></a></li>
<li><a href="#" data-popup="social" class="open-popup"><img src="<?=get_template_directory_uri();?>/img/love.svg" alt="" title=""></a></li>
</ul>
</div>
*/?>
</div>
<span class="swiper-notification" aria-live="assertive" aria-atomic="true"></span></div>
</div>
</div>
<!-- Social Icons Popup -->
<div id="popup-social"> <div class="popup popup--centered popup--shadow popup--social">
<div class="popup__close"><a href="#" class="close-popup keychainify-checked" data-popup="social"><img src="<?=get_template_directory_uri();?>/img/close.svg" alt="" title=""></a></div>
<h2 class="popup__title">Share</h2>
<nav class="social-nav">
<ul>
<li><a href="#" ><img src="<?=get_template_directory_uri();?>/img/twitter.svg" alt="" title=""><span>TWITTER</span></a></li>
<li><a href="#" ><img src="<?=get_template_directory_uri();?>/img/facebook.svg" alt="" title=""><span>FACEBOOK</span></a></li>
<li><a href="#" ><img src="<?=get_template_directory_uri();?>/img/instagram.svg" alt="" title=""><span>INSTAGRAM</span></a></li>
</ul>
</nav>
</div></div>
<!-- Alert -->
<div id="popup-alert"> <div class="popup popup--centered popup--shadow popup--alert">
<div class="popup__close"><a href="#" class="close-popup keychainify-checked" data-popup="alert"><img src="<?=get_template_directory_uri();?>/img/close.svg" alt="" title=""></a></div>
<div class="popup__icon"><img src="<?=get_template_directory_uri();?>/img/alert.svg" alt="" title=""></div>
<h2 class="popup__title">Hey there !</h2>
<p class="popup__text">This is an alert example. Creativity is breaking out of established patterns to look at things in a different way.</p>
</div></div>
<!-- Notifications -->
<div id="popup-notifications"></div>
<?php
/**
* The template for displaying the footer.
*
* Contains the closing of the #content div and all content after.
*
* @link https://developer.wordpress.org/themes/basics/template-files/#template-partials
*
* @package Clean_Commerce
*/
/**
* Hook - clean_commerce_action_after_content.
*
* @hooked clean_commerce_content_end - 10
*/
do_action( 'clean_commerce_action_after_content' );
?>
<?php
/**
* Hook - clean_commerce_action_before_footer.
*
* @hooked clean_commerce_add_footer_bottom_widget_area - 5
* @hooked clean_commerce_footer_start - 10
*/
do_action( 'clean_commerce_action_before_footer' );
?>
<?php
/**
* Hook - clean_commerce_action_footer.
*
* @hooked clean_commerce_footer_copyright - 10
*/
do_action( 'clean_commerce_action_footer' );
?>
<?php
/**
* Hook - clean_commerce_action_after_footer.
*
* @hooked clean_commerce_footer_end - 10
*/
do_action( 'clean_commerce_action_after_footer' );
?>
<?php
/**
* Hook - clean_commerce_action_after.
*
* @hooked clean_commerce_page_end - 10
* @hooked clean_commerce_footer_goto_top - 20
*/
do_action( 'clean_commerce_action_after' );
?>
<?php wp_footer(); ?>
</body>
</html>
<file_sep><?php
/**
* The header for our theme.
*
* This is the template that displays all of the <head> section and everything up until <div id="content">
*
* @link https://developer.wordpress.org/themes/basics/template-files/#template-partials
*
* @package Clean_Commerce
*/
?><?php
/**
* Hook - clean_commerce_action_doctype.
*
* @hooked clean_commerce_doctype - 10
*/
do_action( 'clean_commerce_action_doctype' );
?>
<head>
<?php
/**
* Hook - clean_commerce_action_head.
*
* @hooked clean_commerce_head - 10
*/
do_action( 'clean_commerce_action_head' );
?>
<link rel="preconnect" href="https://fonts.gstatic.com">
<link href="https://fonts.googleapis.com/css2?family=Raleway:wght@300;400;500;600;800&display=swap" rel="stylesheet">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, minimal-ui">
<?php wp_head(); ?>
</head>
<body <?php body_class(); ?>>
<!-- Overlay panel -->
<div class="body-overlay"></div>
<!-- Left panel -->
<div class="panels" id="panel-left">
<div class="panel panel--left">
<!-- Slider -->
<div class="panel__navigation swiper-container-initialized swiper-container-horizontal">
<div class="swiper-wrapper" style="transform: translate3d(0px, 0px, 0px);">
<div class="swiper-slide swiper-slide-active" style="width: 1316px;">
<div class="user-details">
<div class="user-details__thumb"><img src="<?=get_template_directory_uri();?>/img/avatar.jpg" alt="" title=""></div>
<div class="user-details__title"><span>Hello</span> <NAME></div>
</div>
<nav class="main-nav">
<ul>
<li class="subnav opensubnav"><img src="<?=get_template_directory_uri();?>/img/home.svg" alt="" title=""><span>Home Pages</span><i><img src="<?=get_template_directory_uri();?>/img/arrow-right.svg" alt="" title=""></i></li>
<li><a href="main.html"><img src="<?=get_template_directory_uri();?>/img/checked.svg" alt="" title=""><span>Features</span></a></li>
<li><a href="blog.html"><img src="<?=get_template_directory_uri();?>/img/news.svg" alt="" title=""><span>News</span></a></li>
<li><a href="shop.html"><img src="<?=get_template_directory_uri();?>/img/cart.svg" alt="" title=""><span>Shop</span></a></li>
<li><a href="photos.html"><img src="<?=get_template_directory_uri();?>/img/photos.svg" alt="" title=""><span>Photos</span></a></li>
<li><a href="videos.html"><img src="<?=get_template_directory_uri();?>/img/video.svg" alt="" title=""><span>Videos</span></a></li>
<li><a href="chat.html"><img src="<?=get_template_directory_uri();?>/img/chat.svg" alt="" title=""><span>Chat</span><strong>3</strong></a></li>
<li><a href="tabs-toggles.html"><img src="<?=get_template_directory_uri();?>/img/tabs.svg" alt="" title=""><span>Tabs & Toggles</span></a></li>
<li><a href="forms.html"><img src="<?=get_template_directory_uri();?>/img/form.svg" alt="" title=""><span>Forms</span></a></li>
<li><a href="tables.html"><img src="<?=get_template_directory_uri();?>/img/tables.svg" alt="" title=""><span>Tables</span></a></li>
<li><a href="contact.html"><img src="<?=get_template_directory_uri();?>/img/contact.svg" alt="" title=""><span>Contact</span></a></li>
<li><a href="splash.html"><img src="<?=get_template_directory_uri();?>/img/user.svg" alt="" title=""><span>User login</span></a></li>
</ul>
</nav>
</div>
<div class="swiper-slide swiper-slide-next" style="width: 1316px;">
<div class="subnav-header backtonav"><img src="<?=get_template_directory_uri();?>/img/arrow-left.svg" alt="" title=""><span>BACK</span></div>
<nav class="main-nav">
<ul>
<li><a href="index.html"><b>01</b><span>Main design</span></a></li>
<li><a href="index-website.html"><b>02</b><span>Mobile Website</span></a></li>
<li><a href="index-shop.html"><b>03</b><span>Shop Design</span></a></li>
<li><a href="index-food.html"><b>04</b><span>Food / Restaurant</span></a></li>
<li><a href="index-medical.html"><b>05</b><span>Medical Clinic</span></a></li>
<li><a href="index-beauty.html"><b>06</b><span>Beauty Center</span></a></li>
<li><a href="index-music.html"><b>07</b><span>Music App</span></a></li>
<li><a href="index-wedding.html"><b>08</b><span>Wedding Website</span></a></li>
<li><a href="index-app.html"><b>09</b><span>Classic App</span></a></li>
<li><a href="index-metro.html"><b>10</b><span>Metro Style</span></a></li>
</ul>
</nav>
</div>
</div>
<span class="swiper-notification" aria-live="assertive" aria-atomic="true"></span></div>
</div>
</div>
<!-- Right panel -->
<div id="panel-right" class="panels">
<div class="panel panel--right">
<div class="user-profile">
<div class="user-profile__thumb"><img src="<?=get_template_directory_uri();?>/img/avatar.jpg" alt="" title=""></div>
<div class="user-profile__name"><NAME></div>
<div class="buttons buttons--centered mb-20">
<div class="info-box"><span>Followers</span> 25k</div>
<div class="info-box"><span>Following</span> 9k</div>
<div class="info-box"><span>Likes</span>1.5k</div>
</div>
<div class="buttons buttons--centered">
<a href="#" class="button button--blue button--ex-small ml-10 keychainify-checked steem-keychain-checked">FOLLOW</a>
<a href="#" class="button button--green button--ex-small ml-10 keychainify-checked steem-keychain-checked">MESSAGE</a>
</div>
</div>
<nav class="main-nav">
<ul>
<li><a href="forms.html"><img src="<?=get_template_directory_uri();?>/img/user.svg" alt="" title=""><span>My Account</span></a></li>
<li><a href="contact.html"><img src="<?=get_template_directory_uri();?>/img/chat.svg" alt="" title=""><span>Messages</span><strong>3</strong></a></li>
<li><a href="contact.html"><img src="<?=get_template_directory_uri();?>/img/love.svg" alt="" title=""><span>Favourites</span></a></li>
<li><a href="contact.html"><img src="<?=get_template_directory_uri();?>/img/settings.svg" alt="" title=""><span>Settings</span></a></li>
<li><a href="index.html"><img src="<?=get_template_directory_uri();?>/img/logout.svg" alt="" title=""><span>Logout</span></a></li>
</ul>
</nav>
</div>
</div>
<div class="panel-close panel-close--left"><img src="<?=get_template_directory_uri();?>/img/close.svg" alt="" title=""></div>
<div class="panel-close panel-close--right"><img src="<?=get_template_directory_uri();?>/img/close.svg" alt="" title=""></div>
<div class="page page--main" data-page="shop">
<header class="header header--page header--fixed">
<div class="header__inner">
<div class="header__icon header__icon--menu open-panel" data-panel="left" data-arrow="right"><span></span><span></span><span></span><span></span><span></span><span></span></div>
<div class="header__logo header__logo--text"><a href="#" ><strong>Go</strong>Mobile</a></div>
<div class="header__icon header__icon--cart"><a href="cart.html" ><img src="<?=get_template_directory_uri();?>/img/cart.svg" alt="" title=""><span class="cart-items-nr">0</span></a></div>
</div>
</header>
<!-- PAGE CONTENT -->
<div class="page page--main" data-page="intro-website">
<!-- HEADER -->
<header class="header header--page header--fixed">
<div class="header__inner">
<div class="header__icon header__icon--menu open-panel" data-panel="left" data-arrow="right"><span></span><span></span><span></span><span></span><span></span><span></span></div>
<div class="header__logo header__logo--text"><a href="/">Ezhik.by</a></div>
<div class="header__icon open-panel" data-panel="right" data-arrow="left"><img src="<?=get_template_directory_uri();?>/img/white/user.svg" alt="" title=""></div>
</div>
</header>
<!-- PAGE CONTENT -->
<div class="page__content page__content--with-header">
<div class="content-area">
<h2 class="page__title">Что вы хотите заказать?</h2>
<?php get_template_part( 'template-parts/content', 'ajax-search' ); ?><file_sep><?
add_action( 'woocommerce_subcategory_slider', 'subcategory_slider' );
function subcategory_slider() {
$category_current = get_queried_object();
if (isset($category_current->parent)){
// Use the current term's ID as the parent for the subcategory query.
$parent = $category_current->term_id;
}else {
$parent = 0;
}
$args = array(
'parent' => $parent,
'orderby' => 'ID',
'order' => 'ASC',
'show_count' => 1,
'hide_empty' => 1,
'hierarchical' => 1,
'depth' => 3,
// 'number' => 12,
'taxonomy' => 'product_cat' // mention taxonomy here.
);
$categories = get_categories( $args );?>
<div class="swiper-container swiper-categories slider-thumbs slider-init mb-20 swiper-container-initialized swiper-container-horizontal" data-paginationtype="bullets" data-spacebetweenitems="10" data-itemsperview="auto">
<div class="swiper-wrapper" style="transform: translate3d(0px, 0px, 0px);">
<? foreach ($categories as $category) {?>
<?
// get the thumbnail id using the queried category term_id
$thumbnail_id = get_term_meta( $category->term_id, 'thumbnail_id', true );
// get the image URL
$image = wp_get_attachment_url( $thumbnail_id );
?>
<div class="swiper-slide slider-thumbs__slide slider-thumbs__slide--custom swiper-slide-active<?=($category_current->term_id == $category->term_id ? " current-category" : "");?>" style="margin-right: 10px;">
<div class="slider-thumbs__icon">
<div class="slider-thumbs__caption caption">
<div class="caption__content">
<h2 class="caption__title caption__title--centered"><a href="<?=get_category_link($category);?>"><?=$category->name;?></a></h2>
</div>
</div>
</div>
</div>
<?}?>
<!-- SLIDER THUMBS -->
</div>
<div class="swiper-pagination slider-thumbs__pagination swiper-pagination1 swiper-pagination-bullets"><span class="swiper-pagination-bullet swiper-pagination-bullet-active"></span><span class="swiper-pagination-bullet"></span><span class="swiper-pagination-bullet"></span></div>
<span class="swiper-notification" aria-live="assertive" aria-atomic="true"></span></div>
<?
}
add_action( 'woocommerce_product_slider', 'product_slider' );
function product_slider($product) {
?>
<div <?php wc_product_class( 'swiper-slide slider-thumbs__slide slider-thumbs__slide--1h swiper-slide-active', $product ); ?> style="margin-right: 10px;">
<div class="slider-thumbs__image"><a href="<?=get_permalink( $product->get_id() );?>" ><?=$product->get_image();?></a><span class="slider-thumbs__price"><?=$product->get_price_html();?></span>
<div class="slider-thumbs__more">
<?
/**
* Hook: woocommerce_after_shop_loop_item.
*
* @hooked woocommerce_template_loop_product_link_close - 5
* @hooked woocommerce_template_loop_add_to_cart - 10
*/
do_action( 'woocommerce_after_shop_loop_item' );
?>
</div>
<div class="slider-thumbs__caption caption">
<div class="caption__content">
<h2 class="caption__title"><a href="<?=get_permalink( $product->get_id() );?>" ><?=$product->get_name();?></a></h2>
</div>
</div>
</div>
</div>
<?
}
add_action( 'woocommerce_products_slider_last_slide', 'products_slider_last' );
function products_slider_last($link) {
?>
<div class="swiper-slide last-slide slider-thumbs__slide slider-thumbs__slide--1h swiper-slide-active" style="margin-right: 10px;">
<a href="<?=$link;?>">
<img src="<?=get_template_directory_uri();?>/img/right.svg">
<h2 class="caption__title">Ежё товары</h2>
</a>
</div>
<?
}
add_action( 'woocommerce_products_slider', 'products_slider' );
function products_slider() {
wp_enqueue_style('main-styles', get_template_directory_uri() . '/css/product.css', array(), filemtime(get_template_directory() . '/css/product.css'), false);
$category_current = get_queried_object();
if (isset($category_current->term_id)){
// Use the current term's ID as the parent when a category is being viewed.
$parent = $category_current->term_id;
}else {
$parent = 0;
}
$args = array(
'parent' => $parent ,
'orderby' => 'ID',
'order' => 'ASC',
'show_count' => 1,
'hide_empty' => 1,
'hierarchical' => 0,
// 'depth' => 1,
// 'number' => 12,
'taxonomy' => 'product_cat' // mention taxonomy here.
);
$categories = get_categories( $args );
if ( $categories ) {
foreach ( $categories as $category ) {
?><div class="category-outer"><div class="category-title"><h2><?=$category->name;?></h2></div><div class="category-all"><a href="<?=get_category_link($category->term_id);?>">Все</a></div></div><?
$products = new WP_Query( array(
'post_type' => 'product',
'product_cat' => $category->slug,
) );
woocommerce_product_loop_start();
if ( $products->have_posts() ) {
while ( $products->have_posts() ) :
$products->the_post();
wc_get_template_part( 'content', 'product-slider' );
endwhile;
if($category->count > 10){
do_action( 'woocommerce_products_slider_last_slide', get_category_link($category->term_id) );
}
} else {
echo __( 'No products found' );
}
woocommerce_product_loop_end();
wp_reset_postdata();
};
}
else {
if (isset($category_current->slug)){
$terms = $category_current->slug;
}else {
$terms = "*";
}
$products = new WP_Query( array(
'post_type' => 'product',
/*
'tax_query' => array(
array(
'taxonomy' => 'product_cat',
'field' => 'slug',
'terms' => $terms,
),
)
*/
) );
if ( $products->have_posts() ) {
wc_get_template_part( 'content', 'product-loop-start' );
while ( have_posts() ) {
the_post();
/**
* Hook: woocommerce_shop_loop.
*/
do_action( 'woocommerce_shop_loop' );
wc_get_template_part( 'content', 'product' );
}
wc_get_template_part( 'content', 'product-loop-end' );
wp_reset_postdata();
};
}
}
add_action( 'woocommerce_product', 'product_card' );
function product_card($product) {
?>
<div <?php wc_product_class( 'card', $product ); ?>>
<div class="card__product"><a href="<?=get_permalink( $product->get_id() );?>"><?=$product->get_image();?></a> <div class="card__price"><?=$product->get_price_html();?></div></div>
<div class="card__details">
<h4 class="card__title"><a href="<?=get_permalink( $product->get_id() );?>"><?=$product->get_name();?></a></h4>
</div>
<div class="card__more">
<?
/**
* Hook: woocommerce_after_shop_loop_item.
*
* @hooked woocommerce_template_loop_product_link_close - 5
* @hooked woocommerce_template_loop_add_to_cart - 10
*/
do_action( 'woocommerce_after_shop_loop_item' );
?>
</div>
</div>
<?
} <file_sep>=== e-otstavka ===
<file_sep><?php
require __DIR__ . '/vendor/autoload.php';
use Automattic\WooCommerce\Client;
use Automattic\WooCommerce\HttpClient\HttpClientException;
// TODO: Add Badges Spicy
// TODO: Add Warehouse tag
// Script start
$rustart = getrusage();
function getWoocommerceConfig()
{
$woocommerce = new Client(
'https://mobile.leadergroup.by',
'ck_6e13eec6b7683d37d81cc7ba21cbc48873d261f9',
'cs_c31847d4a1328de09c4e2ae0e23d41008d2501ec',
[
'wp_api' => true,
'version' => 'wc/v3',
'query_string_auth' => false,
]
);
return $woocommerce;
}
/**
* Check if image exist on server
*
* @param string $image
* @return true
*/
function checkImageExists($image){
if (@getimagesize($image)) {
return true;
} else {
return false;
}
}
function writeToLog($data){
$fp = fopen(dirname(__FILE__) . '/data.txt', 'a');//opens file in append mode
fwrite($fp, print_r($data, true));
fclose($fp);
}
/**
* Find category in full list
*
* @param string $category, $allCategories
* @return category name
*/
function findRealCategory($category, $allCategories){
foreach ($allCategories as $allCategory){
if ($allCategory["id"] == $category){
$category = $allCategory["name"];
return $allCategory["name"];
}
}
}
/**
* Parse JSON file.
*
* @param string $file
* @return array
*/
function getJsonFromFile()
{
$file = dirname(__FILE__) . '/products.json';
$json = json_decode(file_get_contents($file), true);
return $json;
}
function checkProductBySku($skuCode)
{
$woocommerce = getWoocommerceConfig();
$products = $woocommerce->get('products', ['sku'=> $skuCode]);
foreach ($products as $product) {
$currentSku = strtolower($product->sku);
$skuCode = strtolower($skuCode);
if ($currentSku === $skuCode) {
return ['exist' => true, 'idProduct' => $product->id];
}
}
return ['exist' => false, 'idProduct' => null];
}
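/**
 * Create or update WooCommerce products from products.json.
 *
 * Each product entry is expected to provide: id (used as the SKU), name,
 * url (slug), desc, picture (an image URL or array of URLs), params and size
 * (attributes), category (an ID or array of IDs) and price['value'], as read
 * below. Existing products are matched by SKU and updated instead of created.
 */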
function createProducts()
{
$woocommerce = getWoocommerceConfig();
$products = getJsonFromFile()["products"];
$allCategories = getJsonFromFile()["categories"];
$imgCounter = 0;
$prodCount = 0;
foreach ($products as $product) {
/* Check the SKU before creating the product */
$productExist = checkProductBySku($product['id']);
$imagesFormated = array();
/*Main information */
$name = $product['name'];
$slug = $product['url'];
$sku = $product['id'];
$description = $product['desc'];
$images = $product['picture'];
$attributes = $product['params'];
array_splice($attributes, 0, 0, $product['size'] );
$categories = $product['category'];
$categoriesIds = array();
if (is_array($images)){
foreach ($images as $image){
if (checkImageExists($image) == true){
$imagesFormated[] = [
'src' => $image,
'position' => 0
];
$imgCounter++;
}
}
}else {
if (checkImageExists($images) == true){
$imagesFormated[] = [
'src' => $images,
'position' => 0
];
$imgCounter++;
}
}
if (is_array($categories)){
/* Prepare categories */
foreach ($categories as $category) {
$categoryName = findRealCategory($category, $allCategories);
if ($categoryName) {
$categoriesIds[] = ['id' => getCategoryIdByName($categoryName)];
}
}
}else {
$categoryName = findRealCategory($categories, $allCategories);
if ($categoryName) {
$categoriesIds[] = ['id' => getCategoryIdByName($categoryName)];
}else {
$categoriesIds = "";
}
}
$finalProduct = [
'name' => $name,
'slug' => $slug,
'sku' => $sku,
'description' => $description,
'regular_price' => $product['price']['value'],
'categories' => $categoriesIds,
'attributes' => getproductAtributesNames($attributes)
];
if (!empty($imagesFormated) && $imagesFormated[0]['src'] != ''){
$finalProduct['images'] = $imagesFormated;
}
else {
$finalProduct['images'] = '';
}
$fp = fopen(dirname(__FILE__) . '/data.txt', 'a');//opens file in append mode
print_r($finalProduct);
fwrite($fp, print_r($finalProduct, true));
if (!$productExist['exist']) {
$productResult = $woocommerce->post('products', $finalProduct);
echo "created product #" . $prodCount . " " . $finalProduct["name"] . " sku: " . $finalProduct["sku"] . "<br> \n \r\n";
fwrite($fp, "created product #" . $prodCount . " " . $finalProduct["name"] . " sku: " . $finalProduct["sku"] . "<br> \n \r\n");
} else {
/*Update product information */
$idProduct = $productExist['idProduct'];
$woocommerce->put('products/' . $idProduct, $finalProduct);
echo "updated product #" . $prodCount . " " . $finalProduct["name"] . " sku: " . $finalProduct["sku"] . "<br> \n \r\n";
fwrite($fp, "updated product #" . $prodCount . " " . $finalProduct["name"] . " sku: " . $finalProduct["sku"] . "<br> \n \r\n");
}
fclose($fp);
$prodCount++;
unset($name);
unset($slug);
unset($sku);
unset($description);
unset($imagesFormated);
unset($image);
unset($images);
unset($categoriesIds);
unset($attributes);
unset($finalProduct);
}
}
function createCategory($value)
{
$woocommerce = getWoocommerceConfig();
$categoryId = checkCategoryByname($value["name"]);
$data = [
'name' => $value["name"],
'parent'=> $value["parent"]
];
print_r($categoryId);
print_r($data);
writeToLog($data);
if (!$categoryId) {
$woocommerce->post('products/categories', $data);
}else {
$woocommerce->put('products/categories/'.$categoryId, $data);
}
}
function createCategories()
{
$products = getJsonFromFile()["products"];
$allCategories = getJsonFromFile()["categories"];
$allWithParent = array();
// loop through all categories and find each one's parent
foreach ($allCategories as $allCategory){
// if the category has a parent, bind them together
if ($allCategory["parent"] !== 0){
// loop through all categories
foreach ($allCategories as $findCategory){
// find the parent by id and get its name
if($findCategory["id"] == $allCategory["parent"]){
// get the parent's WooCommerce term ID from its name
$parentId = getCategoryIdByName($findCategory["name"]);
$category = array("name"=>$allCategory["name"], "parent"=>$parentId);
createCategory($category);
}
}
}
// if it does not have a parent, create it right away
else {
$category = array("name"=>$allCategory["name"], "parent"=>"0");
createCategory($category);
}
}
return " Done importing categories";
}
function checkCategoryByName($categoryName)
{
$categoryName = str_replace("\u{00A0}", " ", $categoryName); // normalise non-breaking spaces to plain spaces before searching
$woocommerce = getWoocommerceConfig();
$categories = $woocommerce->get('products/categories', ['posts_per_page' => 100, 'number'=> 100, 'per_page'=>100, 'search'=>$categoryName]);
$lastResponse = $woocommerce->http->getResponse();
$headers = $lastResponse->getHeaders();
$totalPages = $headers['X-WP-TotalPages'];
$i = 0;
$page = 1;
while ($i++ < $totalPages)
{
$categories = $woocommerce->get('products/categories', ['posts_per_page' => 100, 'number'=> 100, 'per_page'=>100, 'page'=> $page, 'search'=>$categoryName]);
foreach ($categories as $category) {
if ($category->name === $categoryName) {
return $category->id;
}
}
$page++;
}
return false;
}
/** CATEGORIES **/
function getCategoryIdByName($categoryName)
{
$woocommerce = getWoocommerceConfig();
$categories = $woocommerce->get('products/categories', ['posts_per_page' => 100, 'number'=> 100, 'per_page'=>100, 'search'=>$categoryName]);
$lastResponse = $woocommerce->http->getResponse();
$headers = $lastResponse->getHeaders();
$totalPages = $headers['X-WP-TotalPages'];
$i = 0;
$page = 1;
while ($i++ < $totalPages)
{
$categories = $woocommerce->get('products/categories', ['posts_per_page' => 100, 'number'=> 100, 'per_page'=>100, 'page'=> $page, 'search'=>$categoryName]);
foreach ($categories as $category) {
if ($category->name == $categoryName) {
return $category->id;
}
}
$page++;
}
}
function getproductAtributesNames($attributes)
{
$attr = array();
foreach ($attributes as $attribute) {
$attr[] =
array(
'name' => (isset($attribute['unit']) ? $attribute['unit'] : $attribute['id']),
'slug' => 'attr_' . (isset($attribute['unit']) ? $attribute['unit'] : $attribute['id']),
'visible' => true,
'variation' => true,
'options' => $attribute['value']
);
}
return $attr;
}
function prepareInitialConfig()
{
echo ('Importing data, wait...')."\n";
writeToLog( date("Y.m.d") . " " . date("h:i:sa") . "Importing data");
createCategories();
createProducts();
writeToLog("Done" . date("Y.m.d") . " " . date("h:i:sa") );
echo ('Done!')."\n";
}
prepareInitialConfig();
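// The import can be run from the CLI (e.g. `php import.php`, assuming that is
// this file's name) or via the browser; progress is echoed and appended to
// data.txt next to this script.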
function rutime($ru, $rus, $index) {
return ($ru["ru_$index.tv_sec"]*1000 + intval($ru["ru_$index.tv_usec"]/1000))
- ($rus["ru_$index.tv_sec"]*1000 + intval($rus["ru_$index.tv_usec"]/1000));
}
$ru = getrusage();
echo "This process used " . rutime($ru, $rustart, "utime") .
" ms for its computations\n";
echo "It spent " . rutime($ru, $rustart, "stime") .
" ms in system calls\n";
?><file_sep><?php
/**
* Template part for displaying page content in page.php.
*
* @link https://codex.wordpress.org/Template_Hierarchy
*
* @package Clean_Commerce
*/
?>
<div class="page__title-bar">
<h3>Популярные товары</h3>
</div>
<div class="cards cards--12 mb-20">
<div class="card">
<div class="card__product"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a> <div class="card__price">$9</div></div>
<div class="card__details">
<h4 class="card__title"><a href="shop-details.html" >Cola Soda</a></h4>
</div>
<div class="card__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="#">ADD TO CART</a></div>
</div>
<div class="card">
<div class="card__product"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a> <div class="card__price">$19</div></div>
<div class="card__details">
<h4 class="card__title"><a href="shop-details.html" >Burger Box</a></h4>
</div>
<div class="card__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="#">ADD TO CART</a></div>
</div>
<div class="card">
<div class="card__product"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a> <div class="card__price">$29</div></div>
<div class="card__details">
<h4 class="card__title"><a href="shop-details.html" >Blue Jeans</a></h4>
</div>
<div class="card__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="#">ADD TO CART</a></div>
</div>
<div class="card">
<div class="card__product"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a> <div class="card__price">$29</div></div>
<div class="card__details">
<h4 class="card__title"><a href="shop-details.html" >Food Box</a></h4>
</div>
<div class="card__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="#">ADD TO CART</a></div>
</div>
</div>
<file_sep><?php
/**
* Load files.
*
* @package Clean_Commerce
*/
/**
* Include default theme options.
*/
require_once get_template_directory() . '/inc/woocommerce-actions.php';
<file_sep><?php
/**
* Template part for displaying page content in page.php.
*
* @link https://codex.wordpress.org/Template_Hierarchy
*
* @package Clean_Commerce
*/
?>
<h2 class="page__title">Выделенные товары</h2>
<!-- SLIDER THUMBS -->
<div class="swiper-container slider-thumbs slider-init mb-20 s0 swiper-container-initialized swiper-container-horizontal" data-paginationtype="bullets" data-spacebetweenitems="10" data-itemsperview="auto">
<div class="swiper-wrapper" style="transform: translate3d(0px, 0px, 0px);">
<div class="swiper-slide slider-thumbs__slide slider-thumbs__slide--1h swiper-slide-active" style="margin-right: 10px;">
<div class="slider-thumbs__image"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a><span class="slider-thumbs__price">$27</span>
<div class="slider-thumbs__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="cart.html">ADD TO CART</a></div>
</div>
<div class="slider-thumbs__caption caption">
<div class="caption__content">
<h2 class="caption__title"><a href="shop-details.html" >Orange T-shirt</a></h2>
</div>
</div>
</div>
<div class="swiper-slide slider-thumbs__slide slider-thumbs__slide--1h swiper-slide-next" style="margin-right: 10px;">
<div class="slider-thumbs__image"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a><span class="slider-thumbs__price">$75</span>
<div class="slider-thumbs__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="cart.html">ADD TO CART</a></div>
</div>
<div class="slider-thumbs__caption caption">
<div class="caption__content">
<h2 class="caption__title"><a href="shop-details.html" >Black Dress</a></h2>
</div>
</div>
</div>
<div class="swiper-slide slider-thumbs__slide slider-thumbs__slide--1h" style="margin-right: 10px;">
<div class="slider-thumbs__image"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a><span class="slider-thumbs__price">$32</span>
<div class="slider-thumbs__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="cart.html">ADD TO CART</a></div>
</div>
<div class="slider-thumbs__caption caption">
<div class="caption__content">
<h2 class="caption__title"><a href="shop-details.html" >Blue Jeans</a></h2>
</div>
</div>
</div>
<div class="swiper-slide slider-thumbs__slide slider-thumbs__slide--1h" style="margin-right: 10px;">
<div class="slider-thumbs__image"><a href="shop-details.html" ><img src="/wp-content/uploads/2021/01/image-5.jpg" alt="" title=""></a><span class="slider-thumbs__price">$18</span>
<div class="slider-thumbs__more"><a class="button button--blue button--cart addtocart keychainify-checked" href="cart.html">ADD TO CART</a></div>
</div>
<div class="slider-thumbs__caption caption">
<div class="caption__content">
<h2 class="caption__title"><a href="shop-details.html" >Food Box</a></h2>
</div>
</div>
</div>
</div>
<div class="swiper-pagination slider-thumbs__pagination swiper-pagination0 swiper-pagination-bullets"><span class="swiper-pagination-bullet swiper-pagination-bullet-active"></span><span class="swiper-pagination-bullet"></span><span class="swiper-pagination-bullet"></span><span class="swiper-pagination-bullet"></span></div>
<span class="swiper-notification" aria-live="assertive" aria-atomic="true"></span>
</div><file_sep><?php
/**
* The template for displaying all pages.
*
* This is the template that displays all pages by default.
* Please note that this is the WordPress construct of pages
* and that other 'pages' on your WordPress site may use a
* different template.
*
* @link https://codex.wordpress.org/Template_Hierarchy
*
* @package Clean_Commerce
*/
get_header(); ?>
<?php get_template_part( 'template-parts/content', 'categories' ); ?>
<?php // get_template_part( 'template-parts/content', 'featured-slider' ); ?>
<?php // get_template_part( 'template-parts/content', 'categories-slider' ); ?>
<?php // get_template_part( 'template-parts/content', 'featured' ); ?>
<a href="/shop" class="button button--green button--full mb-20">Все товары</a>
<?php
/**
* Hook - clean_commerce_action_sidebar.
*
* @hooked: clean_commerce_add_sidebar - 10
*/
do_action( 'clean_commerce_action_sidebar' );
?>
<?php get_footer(); ?>
<file_sep><?php
/**
* Template part for displaying page content in page.php.
*
* @link https://codex.wordpress.org/Template_Hierarchy
*
* @package Clean_Commerce
*/
?>
<?php $args = array(
'parent' => 0,
'orderby' => 'ID',
'order' => 'ASC',
'show_count' => 1,
'hide_empty' => 1,
'exclude' => array(15),
'hierarchical' => true,
'depth' => 3,
// 'number' => 12,
'taxonomy' => 'product_cat' // mention taxonomy here.
);
?>
<?php $categories = get_categories( $args ); ?>
<div class="cards cards--12 mb-20">
<?php foreach ($categories as $category) {?>
<div class="category-card">
<div class="card__details">
<h4 class="card__title"><a href="<?=get_category_link($category);?>" ><?=$category->name;?></a></h4>
</div>
<div class="card__product">
<?php
// get the thumbnail id using the queried category term_id
$thumbnail_id = get_term_meta( $category->term_id, 'thumbnail_id', true );
// get the image URL
$image = wp_get_attachment_url( $thumbnail_id );
?>
<a href="<?=get_category_link($category);?>" >
<img src="<?=$image;?>" alt="" title="">
</a>
</div>
</div>
<?php } ?>
</div>
<file_sep><?php
/**
* Template part for displaying results in search pages.
*
* @link https://codex.wordpress.org/Template_Hierarchy
*
* @package Clean_Commerce
*/
?>
<div class="search__form mb-20">
<?php echo do_shortcode('[wcas-search-form]'); ?>
</div>
|
8806136003b6b91921a6b48bdfae711a66365949
|
[
"Markdown",
"Text",
"PHP"
] | 14 |
PHP
|
BelarusDigitalFuture/e-otstavka
|
12c81e53db67a44e4d33ac92eef3ef43a67ea9fe
|
be58b8bd1748877055a854a208ee96b6bdacc05c
|
refs/heads/master
|
<repo_name>rahimianarezoo/LIQID-Coding-Challenge<file_sep>/data.js
var productList = [
{
productTitle: 'LIQID Wealth',
amount: '500.000 €',
flag: 'surplus',
percent: '25.3%'
},
{
productTitle: 'LIQID Cash',
amount: '260.000 €',
flag: 'surplus',
percent: '15.3%'
},
{
productTitle: 'LIQID Venture',
amount: '928.000 €',
flag: 'deficit',
percent: '0.13%'
},
{
productTitle: 'LIQID Private Equity',
amount: '850.000 €',
flag: 'surplus',
percent: '50.8%'
},
{
productTitle: 'LIQID Real State',
amount: '799.000 €',
flag: 'surplus',
percent: '25.6%'
}
];<file_sep>/assets/scripts/loadProductList.js
function loadProductList(productList, wrapperId) {
const listContainer = document.getElementById(wrapperId);
if (listContainer) {
listContainer.innerHTML = '';
productList.forEach(product => {
let li = document.createElement('li');
li.classList.add('product-list__item');
let h4 = document.createElement('h4');
h4.classList.add('product-list__item__title');
let div = document.createElement('div');
div.classList.add('product-list__item__badget');
if (product.flag === 'deficit') {
div.classList.add('product-list__item__badget--red')
}
let span = document.createElement('span');
span.classList.add('product-list__item__badget__percent');
let a = document.createElement('a');
a.classList.add('product-list__item__link');
h4.innerHTML = product.productTitle;
span.innerHTML = product.percent;
div.innerHTML = product.amount;
div.appendChild(span);
a.innerHTML = 'Explore';
// The source of the link will be filled here in the real project.
a.href = '#';
li.appendChild(h4);
li.appendChild(div);
li.appendChild(a);
listContainer.appendChild(li);
});
} else {
throw new Error(`${wrapperId} is not defined`);
}
}<file_sep>/README.md
# LIQID-Coding-Challenge
- I used the BEM naming convention since it is less confusing than the other methods but still provides a good architecture.
- I didn’t use “sanitize.css” because it caused some difficulties for me (it lacks some CSS rules for resetting the default styling of elements, e.g. it does not remove the unwanted margin/padding of ul). Therefore I used only the parts of it that I needed and put them in the “reset.css” file.
- There are some minor spacing changes in the final implementation, which I applied because I found them easier to maintain.
- The mockup was provided in only two resolutions, one for 1920px and one for mobile, but since my display is 1366px wide, I added three more breakpoints (1366, 1024, 768) to make it look better on those screens as well.
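- For reference, a minimal sketch of how `loadProductList` from `assets/scripts/loadProductList.js` is meant to be wired up (the wrapper id and the `DOMContentLoaded` hook are assumptions, since the HTML markup is not part of this excerpt):

```js
// productList comes from data.js; 'product-list' is a hypothetical <ul> id in the markup.
document.addEventListener('DOMContentLoaded', () => {
  loadProductList(productList, 'product-list');
});
```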
|
80412ade5b75ef0c82891bba90adf7d298773d31
|
[
"JavaScript",
"Markdown"
] | 3 |
JavaScript
|
rahimianarezoo/LIQID-Coding-Challenge
|
b1b05f4b5bc2d99d13aabb27748239e964d3b9f7
|
a34dc02dc10269f8e2c16ba6fea1582eee07ee51
|
refs/heads/master
|
<file_sep># UserManagementAPI
Very basic API built using node.js and Express, connected to a MySQL database. This API was created for a demo webshop and includes many operations for working with user data. Security was out of scope in this project.
## Dependencies
- node.js
- npm
## Setting up the API
1. Clone this repository with `git clone https://github.com/strumswell/UserManagementAPI.git`
2. Install node packages with `npm install`
3. Create .env with the following content:
```
NODE_ENV=development
SQL_HOST=''
SQL_USER=''
SQL_PASSWORD=''
SQL_DATABASE=''
MAIL_USER=''
MAIL_PASSWORD=''
API_BASE_URL='https://api.example.com/'
AVATAR_LOCAL_PATH=''
AVATAR_PUBLIC_PATH='http://avatar.example.com'
```
4. Execute the app with `node app.js`

<file_sep>const pool = require('../util/db');
const bcrypt = require('bcrypt');
const mailer = require('../util/confirmMail');
const uuidv4 = require('uuid/v4');
const multer = require('multer');
const storage = multer.diskStorage({
destination: (req, file, cb) => {
cb(null, process.env.AVATAR_LOCAL_PATH);
},
filename: (req, file, cb) => {
console.log(file);
var filetype = '';
if(file.mimetype === 'image/gif') {
filetype = 'gif';
}
if(file.mimetype === 'image/png') {
filetype = 'png';
}
if(file.mimetype === 'image/jpeg') {
filetype = 'jpg';
}
cb(null, req.params.uuid + '.' + filetype);
}
});
const upload = multer({storage: storage});
// Send standardized response
function sendResponse(response, status, error, result) {
response.status(status).send(JSON.stringify({"status": status, "error": error, "result": result}));
}
// ==================
// Users routes
// ==================
module.exports = (app) => {
// Return all Users
app.get('/v1/users',(request, response) => {
let sql = "SELECT * FROM userdata";
let query = pool.query(sql, (error, results) => {
//Something's wrong internally
if(error) return sendResponse(response, 500, "Internal server error.", null);
// All good
sendResponse(response, 200, null, results);
});
});
// Return specific User
app.get('/v1/users/:uuid',(request, response) => {
let sql = "SELECT * FROM userdata WHERE uuid='"+request.params.uuid+"'";
let query = pool.query(sql, (error, results) => {
//Something's wrong internally; the error has a "code" when the DB doesn't respond (body of the Node error)
if(error) return sendResponse(response, 500, "Internal server error.", null);
// UUID is unknown and no changes were made
if(results.length < 1) return sendResponse(response, 404, "User not found.", null);
// We don't want to share password hash
delete results[0].password;
// All good
sendResponse(response, 200, null, results[0]);
});
});
// Create new User
app.post('/v1/users',(request, response) => {
let data = request.body;
if (data.password !== undefined) {
// Change plain text to hash
data.password = bcrypt.hashSync(data.password, 10);
} else {
// Cannot hash password because it's not set
return sendResponse(response, 400, "Bad request. Attributes may be missing, check docs at https://api.bolte.cloud/docs", null);
}
// Generate UUID
data.uuid = uuidv4();
// Build query and execute
let sql = "INSERT INTO userdata SET ?";
let query = pool.query(sql, data,(error, results) => {
// Missing or wrong attributes used
if(error && error.code === "ER_NO_DEFAULT_FOR_FIELD") return sendResponse(response, 400, "Bad request. Attributes may be missing, check docs at https://api.bolte.cloud/docs", null);
//Something's wrong internally; the error has a "code" when the DB doesn't respond (body of the Node error)
if(error) return sendResponse(response, 500, "Internal server error.", null);
// All good
mailer.sendEmailConfirmation(request.body.email, data.uuid, data.forename);
sendResponse(response, 200, null, {"uuid": data.uuid});
});
});
// Confirm E-Mail
app.get('/v1/users/:uuid/confirmEmail',(request, response) => {
// Build query and execute
let sql = "UPDATE userdata SET email_verified=true where uuid='"+request.params.uuid+"'";
let query = pool.query(sql,(error, results) => {
//Something's wrong internally; the error has a "code" when the DB doesn't respond (body of the Node error)
if(error) return sendResponse(response, 500, "Internal server error.", null);
// UUID is unknown
if(results.affectedRows < 1) return sendResponse(response, 404, "User not found.", null);
// All good
response.send("Deine E-Mail-Adresse wurde bestätigt!");
});
});
// Login user ; NOTE: security is out of scope for this PoC and therefore returns the userid
app.post('/v1/users/login',(request, response) => {
let data = request.body;
let sql = "SELECT uuid,password from userdata where email='"+request.body.email+"'";
let query = pool.query(sql, (error, results) => {
//Something's wrong internally; the error has a "code" when the DB doesn't respond (body of the Node error)
if(error) return sendResponse(response, 500, "Internal server error.", null);
// Id is unknown and no changes were made
if(results.length < 1) return sendResponse(response, 404, "User not found.", null);
// Check for password validity
if(bcrypt.compareSync(request.body.password, results[0].password)) {
sendResponse(response, 200, null, {"uuid": results[0].uuid});
} else {
sendResponse(response, 200, "Authentication failed!", null);
}
});
});
// Update specific User
app.put('/v1/users/:uuid',(request, response) => {
let data = request.body;
let password = data.password;
// Wanna change password?
if (password !== undefined) {
// Hash password and change request body
let hashed_password = bcrypt.hashSync(password, 10);
data.password = hashed_password;
}
// Update DB
let sql = "UPDATE userdata SET ? where uuid='"+request.params.uuid+"'";
let query = pool.query(sql, data,(error, results) => {
// Missing or wrong attributes used
if(error && error.code === "ER_PARSE_ERROR") return sendResponse(response, 400, "Bad request. Attributes may be missing, check docs at https://api.bolte.cloud/docs", null);
//Something's wrong internally; the error has a "code" when the DB doesn't respond (body of the Node error)
if(error) return sendResponse(response, 500, "Internal server error.", null);
// Id is unknown and no changes were made
if(results.affectedRows < 1) return sendResponse(response, 404, "User not found.", null);
// All good
sendResponse(response, 200, null, "User updated");
});
});
// Delete specific User
app.delete('/v1/users/:uuid',(request, response) => {
let sql = "DELETE FROM userdata WHERE uuid='"+request.params.uuid+"'";
let query = pool.query(sql, (error, results) => {
//Something's wrong internally
if(error) return sendResponse(response, 500, "Internal server error.", null);
// Id is unknown and no changes were made
if(results.affectedRows < 1) return sendResponse(response, 404, "User not found.", null);
// All good
sendResponse(response, 200, null, results);
});
});
// Upload avatar
app.post('/v1/users/:uuid/avatar', upload.any(), (request, response) => {
let data = request.files;
let fileUrl = process.env.AVATAR_PUBLIC_PATH + data[0].filename;
let sql = "UPDATE userdata SET avatar='"+fileUrl+"' WHERE uuid='"+request.params.uuid+"'";
let query = pool.query(sql, (error, results) => {
// Something's wrong internally; the error has a "code" when the DB doesn't respond (body of the Node error)
if(error) return sendResponse(response, 500, "Internal server error.", error);
// All good
sendResponse(response, 200, null, {'fileUrl': fileUrl});
});
});
// Get avatar file url
app.get('/v1/users/:uuid/avatar', (request, response) => {
let sql = "SELECT avatar FROM userdata WHERE uuid='"+request.params.uuid+"'";
let query = pool.query(sql, (error, results) => {
// Something's wrong internally; the error has a "code" when the DB doesn't respond (body of the Node error)
if(error) return sendResponse(response, 500, "Internal server error.", error);
// All good
sendResponse(response, 200, null, {'fileUrl': results[0].avatar});
});
});
}
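// Hypothetical wiring sketch -- app.js itself is not included in this excerpt, so the paths and port below are assumptions:
//   const express = require('express');
//   const app = express();
//   app.use(express.json());
//   require('./routes/users')(app); // the relative path is an assumption
//   app.listen(3000);               // the port is an assumption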
<file_sep>// Send standardized response
function sendResponse(response, status, error, result) {
response.status(status).send(JSON.stringify({"status": status, "error": error, "result": result}));
}
module.exports = (app) => {
// Serving unknown routes
app.use('/', (request, response) => {
response.send("Woah, what are you trying to do? You can find the documentation @ "+process.env.API_BASE_URL+"docs if you have no clue what you are doing.");
});
// Catch wrong JSON in requests
app.use(function (error, request, response, next) {
if (error instanceof SyntaxError) {
sendResponse(response, 400, "Invalid JSON.", null);
} else {
next();
}
});
}
<file_sep>const nodemailer = require('nodemailer');
const transporter = nodemailer.createTransport({
service: 'gmail',
auth: {
user: process.env.MAIL_USER,
pass: process.env.MAIL_PASSWORD
}
});
// Send Email confirmation
function sendEmailConfirmation(receiver, id, forname) {
var mailOptions = {
from: 'Webshop XYZ <<EMAIL>>',
to: receiver,
subject: 'Bestätige deine E-Mail!',
html: '<h3>Fast fertig, '+forname+'!</h3> Bestätige deine E-Mail mit einem Klick.<br><a href="'+process.env.API_BASE_URL+'v1/users/'+id+'/confirmEmail">Jetzt bestätigen.</a>'
};
transporter.sendMail(mailOptions, (error, info) => {
if (error) {
console.log(error);
} else {
console.log('Email sent: ' + receiver + "-" + id + " :::: " + info.response);
}
});
}
module.exports.sendEmailConfirmation = sendEmailConfirmation;
|
eab04a604d6d0d4c06199c92ad8bfff95e1b7f3b
|
[
"Markdown",
"JavaScript"
] | 4 |
Markdown
|
strumswell/UserManagementAPI
|
102ee29c60d0c22eb5edd274a3b2481b512d4dd1
|
7bb116788818917e6c97537cd34d429b8fcdce25
|
refs/heads/master
|
<file_sep><?php
session_start();
?>
<html>
<head>
<title>Test</title>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
</head>
<body>
<form name="form" action="add_submit.php" method="POST" enctype="multipart/form-data">
Name:<input type="text" name="name" id="name" >
Description:<input type="text" name="description" id="description" >
Price:<input type="text" name="price" id="price" >
Image:<input type="file" name="image" id="image" >
<input type="submit" name="submit" value="Submit" id="formsubmit">
</form>
<script>
$('#formsubmit').click(function(e) {
var error=0;
var name = $('#name').val();
if(name == ""){
$('#name').css('border-color', 'red');
error++;
}
else{
$('#name').css('border-color', 'green');
}
var description = $('#description').val();
if(description == ""){
$('#description').css('border-color', 'red');
error++;
}
else{
$('#description').css('border-color', 'green');
}
var price = $('#price').val();
if(price == ""){
$('#price').css('border-color', 'red');
error++;
}
else{
$('#price').css('border-color', 'green');
}
var image = $('#image').val();
if(image == ""){
$('#image').css('border-color', 'red');
error++;
}
else{
$('#image').css('border-color', 'green');
}
if(error > 0){
e.preventDefault();
}
});
</script>
</body>
</html><file_sep><?php
session_start();
?>
<html>
<head>
<title>Test</title>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
</head>
<body>
<form name="login-form" action="login_action.php" method="POST" >
<input type="text" name="username" id="username" placeholder="Username" >
<input type="<PASSWORD>" name="password" id="password" placeholder="<PASSWORD>">
<input type="submit" name="submit" value="Submit" id="formsubmit">
</form>
<?php
if(isset($_SESSION['message'])){
echo $_SESSION['message'];
}
unset( $_SESSION['message'] );
?>
<script>
$('#formsubmit').click(function(e) {
var error=0;
var username = $('#username').val();
if(username == ""){
$('#username').css('border-color', 'red');
error++;
}
else{
$('#username').css('border-color', 'green');
}
var password = $('#password').val();
if(password == ""){
$('#password').css('border-color', 'red');
error++;
}
else{
$('#password').css('border-color', 'green');
}
if(error > 0){
e.preventDefault();
}
});
</script>
</body>
</html><file_sep># corephpvalidation
corephp with validation
|
f87eef4437b96e390218cea862ecf659fded0678
|
[
"Markdown",
"PHP"
] | 3 |
PHP
|
manojasai/corephpvalidation
|
bc998a5e66c9260dba47ef8ddaae8194da79d72c
|
094a445153fbf2dfe9d7e463097a16bd4f1bc9c1
|
refs/heads/master
|
<repo_name>ArvoGuo/dir-href<file_sep>/temp/config.js
module.exports = {
host: 'http://c.oooooo6.com',
temp: '<a href="{url}" target="_blank" class="btn btn-primary">{urlName}</a>\n'
};<file_sep>/README.md
dir-href
## Description
Finds all .html files under the current directory, prompts you to name each one, and generates an HTML catalogue of links.
```javascript
assumption:
your host is : http://www.oooooo6.com
your files' paths on your server:
/Users/Arvo/github/dir-href/1.html
/Users/Arvo/github/dir-href/abc/2.html
/Users/Arvo/github/dir-href/abc/def/3.html
...
```
```javascript
=> //cd /Users/Arvo/github/dir-href && dir-href,
// you will get an HTML file that contains:
```
```html
<a href="http://www.oooooo6.com/1.html" target="_blank" class="btn btn-primary">my first html</a>
<a href="http://www.oooooo6.com/abc/2.html" target="_blank" class="btn btn-primary">my second html</a>
<a href="http://www.oooooo6.com/abc/def/3.html" target="_blank" class="btn btn-primary">my third html</a>
```
## Notes
The following directories are ignored by default:
```
'.DS_Store',
'.git',
'node_modules',
'bower_components',
'lib',
'test'
```
When naming a file, pressing Enter without typing anything will skip that file
## Installation
```
npm install dir-href -g
```
## Usage
```
cd dir/* some dir */
dir-href
```
## Demo
1. cd into the target directory

2. Run dir-href

3. Enter the host that replaces the local path, and a name for each generated link

4. Open catalogue.html to see the final result.

<file_sep>/app.js
#!/usr/bin/env node
var config = require('./temp/config.js');
var templatePath = __dirname + '/temp/template.html';
var path = require('path');
var readlineSync = require('readline-sync');
var fs = require('fs');
var dir = process.argv[2] || process.cwd();
var host = config.host;
var template = config.temp;
var html = '';
var needIgnoreFiles = [
'.DS_Store',
'.git',
'node_modules',
'bower_components',
'lib',
'test'
];
var distFileName = 'catalogue.html';
var matchTag = '.html';
/**===================================**/
/**============ prototype ============**/
/**===================================**/
Array.prototype.remove = function () {
var what, a = arguments, L = a.length, ax;
while (L && this.length) {
what = a[--L];
while ((ax = this.indexOf(what)) !== -1) {
this.splice(ax, 1);
}
}
return this;
};
/**===================================**/
/**============= Function ============**/
/**===================================**/
function ignoreFiles(files) {
needIgnoreFiles.forEach(function(f) {
files.remove(f);
});
}
function getFiles(path, files_) {
files_ = files_ || [];
var files = fs.readdirSync(path);
ignoreFiles(files);
for (var i in files) {
if (!files.hasOwnProperty(i)) {
continue;
}
var _path = path + '/' + files[i];
if (fs.statSync(_path).isDirectory()) {
getFiles(_path,files_);
} else {
if (files[i].match(matchTag) && files[i] != distFileName) {
files_.push((path + '/' + files[i]).replace(dir,host));
}
}
}
return files_;
}
/**===================================**/
/**=============== Main ==============**/
/**===================================**/
host = readlineSync.question('default path is: ' + dir + ', please enter your host: ').toString();
html = (function() {
var h = '';
getFiles(dir).forEach(function(item){
var urlName = readlineSync.question('href ( ' + item + ' ) name is : ').toString();
if (urlName) {
h += template.replace(/{url}/g, item).replace(/{urlName}/g,urlName);
}
});
return h;
})();
html = fs.readFileSync(templatePath, {encoding: 'utf8'}).replace(/{{content}}/g, html);
fs.writeFileSync(distFileName, html);
|
a2db35feb04952a7dcfdf2543463b1a80d32b277
|
[
"JavaScript",
"Markdown"
] | 3 |
JavaScript
|
ArvoGuo/dir-href
|
eb13f9f56c1e4c83ed82e21dde32ce869ffa1b30
|
96d58d6aea526e2c078cbf348acb2e8c6120f99c
|
refs/heads/master
|
<repo_name>JonathanBowker/ahmia-crawler<file_sep>/onionElasticBot/README.md
# onionElasticBot
Crawl .onion and .i2p websites from the Tor network
## Prerequisites
Please follow [installation guide](https://github.com/ahmia/ahmia-crawler)
## Usage
```sh
$ scrapy crawl OnionSpider -s DEPTH_LIMIT=2 -s ROBOTSTXT_OBEY=0
or
$ scrapy crawl OnionSpider -s DEPTH_LIMIT=5 -s LOG_LEVEL=INFO
or
$ scrapy crawl i2pSpider -s DEPTH_LIMIT=100 -s LOG_LEVEL=DEBUG -s ELASTICSEARCH_TYPE=i2p
or
$ scrapy crawl i2pSpider -s DEPTH_LIMIT=1 -s ROBOTSTXT_OBEY=0 -s ELASTICSEARCH_TYPE=i2p
or
$ scrapy crawl OnionSpider -o items.json -t json
or
$ scrapy crawl OnionSpider -s DEPTH_LIMIT=1 -s ALLOWED_DOMAINS=/home/juha/allowed_domains.txt -s TARGET_SITES=/home/juha/seed_list.txt -s ELASTICSEARCH_TYPE=targetitemtype
```
<file_sep>/onionElasticBot/onionElasticBot/pipelines.py
# -*- coding: utf-8 -*-
"""Pipelines"""
# Define your item pipelines here
#
import datetime
class AddTimestampPipeline(object):
"""Add timestamp to item"""
def process_item(self, item, spider):
"""Process an item"""
item["timestamp"] = datetime.datetime.now().strftime(
"%Y-%m-%dT%H:%M:%S")
return item
<file_sep>/onionElasticBot/onionElasticBot/spiders/i2pspider.py
# -*- coding: utf-8 -*-
from onionElasticBot.spiders.base import WebSpider
class i2pSpider(WebSpider):
NAME = "i2pSpider"
DEFAULT_ALLOWED_DOMAINS = ["i2p"]
DEFAULT_TARGET_SITES = ['http://nekhbet.com/i2p_links.shtml',]
<file_sep>/onionElasticBot/onionElasticBot/spiders/onionspider.py
# -*- coding: utf-8 -*-
from onionElasticBot.spiders.base import WebSpider
class OnionSpider(WebSpider):
NAME = "OnionSpider"
DEFAULT_ALLOWED_DOMAINS = ["onion"]
DEFAULT_TARGET_SITES = ['https://ahmia.fi/address/', 'http://torlinks6ob7o7zq.onion/', 'http://tt3j2x4k5ycaa5zt.onion/onions.php',]
|
62f2cddfba22d34cff692b0d5a48db51139520c2
|
[
"Markdown",
"Python"
] | 4 |
Markdown
|
JonathanBowker/ahmia-crawler
|
c7212ab61516eec1414dd01e6753bb60cd0f6d2f
|
b35c0ef29d17f47279414b423faa2feb30c25e9e
|
refs/heads/main
|
<repo_name>Mikhail-Martynenko/Project-Activities<file_sep>/README.md
# Project-Activities-
Appclication - the main folder
App1RemoteForAndriod - contains the application for Android.
|
3019787f4ecd06359b32a78e58aef41ada619d5e
|
[
"Markdown"
] | 1 |
Markdown
|
Mikhail-Martynenko/Project-Activities
|
d9fea9797d94f6b67b86d314beb25267c7f60166
|
043e91c99b775f4aa7a0165a23295a8434e01313
|
refs/heads/master
|
<repo_name>jtrimm007/image-scraper<file_sep>/src/MyTimerTask.java
import java.util.Date;
import java.util.TimerTask;
/**
* -------------------------------------------------
* File name: MyTimerTask.java
* Project name: Migration
* -------------------------------------------------
* Creator's name: <NAME>
* Email: <EMAIL>
* Creation date: May 24, 2018
* -------------------------------------------------
*/
/**
* <b>[Thread object to space out the time of pinging the website]</b>
* <hr>
* Date created: May 24, 2018
* <hr>
* @author <NAME>
*/
public class MyTimerTask extends TimerTask
{
@Override
public void run() {
System.out.println("Timer task started at:"+new Date());
completeTask();
System.out.println("Timer task finished at:"+new Date());
}
private void completeTask() {
try {
//assuming it takes 20 secs to complete the task
Thread.sleep(20000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
|
b2a7bf6077a828d5090b5c5aa94a8cb0a3c622a9
|
[
"Java"
] | 1 |
Java
|
jtrimm007/image-scraper
|
3bbd830f6366a9fed23b0ff477313ebcdeb12e56
|
9887d794eec92f000bb202120cd02fa8e2b42b37
|
refs/heads/master
|
<file_sep>import React from 'react';
import PropTypes from 'prop-types';
import styles from './styles.module.css';
const Page = ({ children }) => {
return (
<section className={styles.page}>
<main className={styles.main}>
{children}
</main>
<footer className={styles.footer}>powered by Traffiq</footer>
</section>
);
};
Page.propTypes = {
children: PropTypes.node.isRequired,
};
export default Page;
<file_sep>export const SET_LOADING = 'SET_LOADING';
export const SET_REDIRECT_URL = 'SET_REDIRECT_URL';
export const SET_ERROR = 'SET_ERROR';
export const SET_ESTIMATED_WAIT_TIME = 'SET_ESTIMATED_WAIT_TIME';
export const SET_SPECIAL_TITLE = 'SET_SPECIAL_TITLE';
<file_sep>import { useEffect, useRef } from 'react';
const useInterval = (callback, delay, options = {}) => {
const { jitter } = options;
const savedCallback = useRef();
useEffect(() => {
savedCallback.current = callback;
}, [callback]);
useEffect(() => {
const tick = () => {
savedCallback.current();
};
const getDelay = () => jitter ? delay + Math.round(Math.random() * jitter) : delay;
// Keep a reference to the pending timeout so the cleanup can actually cancel it
let timeoutId;
const setTimeoutRepeated = () => {
tick();
timeoutId = setTimeout(setTimeoutRepeated, getDelay());
};
if (delay) {
timeoutId = setTimeout(setTimeoutRepeated, getDelay());
}
return () => {
clearTimeout(timeoutId);
};
}, [delay, jitter]);
};
export default useInterval;
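// Example usage (hypothetical callback name): poll roughly every 500-700 ms,
// matching how Position.jsx calls it with POLLING_DELAY = 500 and JITTER_DELAY = 200.
// useInterval(() => pollStatus(), 500, { jitter: 200 });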
<file_sep>import axios from 'axios';
import config from '../config';
function getQueuePosition(clientId, appName) {
const options = {
headers: {
'X-Client-ID': clientId,
},
};
return axios.get(`${config.baseUrl}/queue/${appName}`, options);
}
export default getQueuePosition;
<file_sep>import React, { useContext } from 'react';
import styles from './styles.module.css';
import Position from '../Position';
import { useScenario } from '../../hooks';
import { store } from '../../store';
const Content = () => {
const { description } = useScenario();
const { state } = useContext(store);
const { specialTitle } = state;
const title = specialTitle || 'Waiting Queue';
return (
<section className={styles.section}>
<h1 className={styles.title}>{title}</h1>
<p>{description}</p>
<Position />
</section>
);
};
export default Content;
<file_sep>import axios from 'axios';
import config from '../config';
function joinQueue() {
const hostname = window.location.hostname;
return axios.post(`${config.baseUrl}/queue/queueUrl/${hostname}`);
}
export default joinQueue;
<file_sep>import moment from 'moment';
const withSeconds = 's [sec]';
const withMinutes = `m [min] ${withSeconds}`;
const withHours = `h [hr] ${withMinutes}`;
const makeReadableTime = (time) => {
const seconds = time / 1000;
const minutes = time / (1000 * 60);
if (seconds < 60) {
// Format in UTC so the epoch-based duration is not skewed by the local timezone offset
return moment.utc(time).format(withSeconds);
}
if (minutes < 60) {
return moment.utc(time).format(withMinutes);
}
return moment.utc(time).format(withHours);
};
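// Examples (with the UTC-based formatting above):
//   makeReadableTime(42 * 1000) -> "42 sec"
//   makeReadableTime(90 * 1000) -> "1 min 30 sec"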
export default makeReadableTime;
<file_sep>import { useContext } from 'react';
import { store } from '../store';
import { blocksAnimation, errorAnimation, readyAnimation } from '../assets/lottie';
import { scenarioTypes, colors } from '../common/constants';
const waitingScenario = {
scenarioType: scenarioTypes.waiting,
description: 'Your current position in the queue is',
documentTitle: '⌛ Waiting in line...',
animation: blocksAnimation,
backgroundColor: colors.WHITE,
};
const readyScenario = {
scenarioType: scenarioTypes.ready,
description: 'Your site is ready to be visited!',
documentTitle: '✅ Ready to go!',
animation: readyAnimation,
backgroundColor: colors.LIGHT_GREEN,
};
const errorScenario = {
scenarioType: scenarioTypes.error,
description: 'Uh oh, something went wrong! We\'re getting this fixed in the background; you\'ll be back in line momentarily.',
documentTitle: '❌ Something went wrong!',
animation: errorAnimation,
backgroundColor: colors.LIGHT_RED,
};
const useScenario = () => {
const { state } = useContext(store);
const { redirectUrl, error } = state;
let scenario = waitingScenario;
if (redirectUrl) {
scenario = readyScenario;
} else if (error) {
scenario = errorScenario;
}
return scenario;
};
export default useScenario;
<file_sep>import useInterval from './useInterval';
import useTabActive from './useTabActive';
import useScenario from './useScenario';
export { useInterval, useTabActive, useScenario };
<file_sep>import React from 'react';
import ReactDOM from 'react-dom';
import './index.css';
import App from './App';
import { GlobalProvider } from './store';
import { ThemeProvider } from 'evergreen-ui';
import theme from './theme';
ReactDOM.render(
<React.StrictMode>
<GlobalProvider>
<ThemeProvider value={theme}>
<App />
</ThemeProvider>
</GlobalProvider>
</React.StrictMode>,
document.getElementById('root'),
);
<file_sep>import { defaultTheme } from 'evergreen-ui';
export default {
...defaultTheme,
fontFamilies: {
display: 'Apercu',
ui: 'Apercu',
mono: 'Apercu',
},
};
<file_sep>import { useState, useEffect } from 'react';
const useTabActive = () => {
const [tabActive, setTabActive] = useState(!document.hidden);
const onVisibilityChange = () => {
if (document.visibilityState === 'visible') {
setTabActive(true);
} else {
setTabActive(false);
}
};
useEffect(() => {
document.addEventListener('visibilitychange', onVisibilityChange);
return () => {
document.removeEventListener('visibilitychange', onVisibilityChange);
};
}, [tabActive]);
return tabActive;
};
export default useTabActive;
<file_sep>import blocksAnimation from './blocks.json';
import errorAnimation from './error.json';
import readyAnimation from './green-check.json';
export { blocksAnimation, errorAnimation, readyAnimation };
<file_sep>export const colors = {
LIGHT_GREEN: '#94d2bd',
LIGHT_RED: '#f4978e',
WHITE: '#ffffff',
};
export const scenarioTypes = {
waiting: 'WAITING',
ready: 'READY',
error: 'ERROR',
};
<file_sep># Queue Frontend
Standalone React frontend where users can see and get updates to their queue position before being redirected to their destination site.
## Local Development
### Prerequisites
* Install `nvm`
* Install `yarn`
### Steps
Use the pinned Node version in the `.nvmrc` file.
```
nvm use
```
Install dependencies.
```
yarn install
```
Start the app.
```
yarn start
```<file_sep>import React, { useEffect, useContext, Fragment } from 'react';
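To point the frontend at a locally running backend instead of the deployed API, edit `src/config.js`. A minimal sketch (the local port is an assumption; the committed default is `https://api.traffiq.xyz`):
```js
// src/config.js
const config = {
  baseUrl: 'http://localhost:8080', // hypothetical local backend URL
};

export default config;
```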
import Page from '../components/Page';
import Content from '../components/Content';
import Animation from '../components/Animation';
import styles from './styles.module.css';
import { useScenario } from '../hooks';
import { store } from '../store';
import makeReadableTime from '../common/utils/makeReadableTime';
const App = () => {
const { backgroundColor, documentTitle } = useScenario();
const { state } = useContext(store);
const { estimatedWaitTime } = state;
useEffect(() => {
document.body.style.backgroundColor = backgroundColor;
document.title = `${documentTitle} | TraffiQ`;
}, [backgroundColor, documentTitle]);
return (
<Page>
<section className={styles.section}>
<Content />
<Animation />
</section>
<span className={styles.estimatedWaitTime}>
{estimatedWaitTime
? (
<Fragment>Estimated wait time is
{' '}
<em>{makeReadableTime(estimatedWaitTime)}</em>
</Fragment>
)
: (
null
)}
</span>
</Page>
);
};
export default App;
<file_sep>import React, { useState, useEffect, useContext, useCallback } from 'react';
import joinQueue from '../../api/joinQueue';
import getQueuePosition from '../../api/getQueuePosition';
import useInterval from '../../hooks/useInterval';
import styles from './styles.module.css';
import Spinner from '../Spinner';
import { store } from '../../store';
import { SET_ERROR, SET_ESTIMATED_WAIT_TIME, SET_LOADING, SET_REDIRECT_URL, SET_SPECIAL_TITLE } from '../../store/actions';
import { useScenario } from '../../hooks';
import { scenarioTypes } from '../../common/constants';
import PrimaryButton from '../PrimaryButton';
const POLLING_DELAY = 500;
const JITTER_DELAY = 200;
const Position = () => {
const [currentPosition, setCurrentPosition] = useState(null);
const [clientId, setClientId] = useState(null);
const [autoRedirect, setAutoRedirect] = useState(false);
const [startTime, setStartTime] = useState(null);
const [currentTime, setCurrentTime] = useState(null);
const [startPosition, setStartPosition] = useState(null);
const [appName, setAppName] = useState(null);
const { state, dispatch } = useContext(store);
const { redirectUrl } = state;
const { scenarioType } = useScenario();
const addUserToQueue = useCallback(async() => {
dispatch({ type: SET_LOADING, payload: true });
dispatch({ type: SET_ERROR, payload: null });
try {
const { data } = await joinQueue();
const time = new Date();
setStartTime(time);
setCurrentTime(time);
setCurrentPosition(data.position);
setStartPosition(data.position);
setClientId(data.clientId);
setAppName(data.appName);
dispatch({ type: SET_SPECIAL_TITLE, payload: data.message });
dispatch({ type: SET_ERROR, payload: null });
} catch (error) {
dispatch({ type: SET_ERROR, payload: error });
} finally {
dispatch({ type: SET_LOADING, payload: false });
}
}, [dispatch]);
useEffect(() => {
addUserToQueue();
}, [addUserToQueue]);
useInterval(async() => {
if (clientId && !redirectUrl && currentPosition !== -1) {
try {
const { data } = await getQueuePosition(clientId, appName);
if (data.redirectUrl) {
dispatch({ type: SET_REDIRECT_URL, payload: data.redirectUrl });
if (document.hasFocus()) {
setAutoRedirect(true);
window.location.replace(data.redirectUrl);
}
} else {
setCurrentPosition(data.position);
setCurrentTime(new Date());
dispatch({ type: SET_ERROR, payload: null });
}
} catch (error) {
dispatch({ type: SET_ERROR, payload: error });
setCurrentPosition(null);
} finally {
dispatch({ type: SET_LOADING, payload: false });
}
}
}, POLLING_DELAY, { jitter: JITTER_DELAY });
const handleRedirectClick = (e) => {
e.preventDefault();
window.location.href = redirectUrl;
};
useEffect(() => {
const dequeueRate = (currentTime - startTime) / (startPosition - currentPosition);
const estimatedTime = dequeueRate * currentPosition;
dispatch({ type: SET_ESTIMATED_WAIT_TIME, payload: estimatedTime });
}, [startTime, currentTime, startPosition, currentPosition]);
if (!autoRedirect && scenarioType === scenarioTypes.ready) {
return (
<PrimaryButton onClick={handleRedirectClick} size="large">Take me there</PrimaryButton>
);
}
if (currentPosition) {
return <span className={styles.number}>{currentPosition}</span>;
}
return <Spinner />;
};
export default Position;
<file_sep>const config = {
baseUrl: 'https://api.traffiq.xyz',
};
export default config;
<file_sep>import { SET_LOADING, SET_REDIRECT_URL, SET_ERROR, SET_ESTIMATED_WAIT_TIME, SET_SPECIAL_TITLE } from './actions';
const reducer = (state, { type, payload }) => {
switch (type) {
case SET_LOADING:
return { ...state, loading: payload };
case SET_REDIRECT_URL:
return { ...state, redirectUrl: payload };
case SET_ESTIMATED_WAIT_TIME:
return { ...state, estimatedWaitTime: payload };
case SET_SPECIAL_TITLE:
return { ...state, specialTitle: payload };
case SET_ERROR:
return { ...state, error: payload };
default:
return state;
}
};
export default reducer;
|
d7d8ebd29eff96b0b52b6e4132be6b2fca2649ac
|
[
"JavaScript",
"Markdown"
] | 19 |
JavaScript
|
Traffiq-Team/queue-frontend
|
a5ae233676d73221e91123a188e1debe783c1693
|
c9f924afe122736398723e83ff0e126c3d672fd8
|
refs/heads/master
|
<repo_name>deltatre-webplu/deltatre.utils<file_sep>/src/Deltatre.Utils/Dto/OperationResult_WithError.cs
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Linq;
using Deltatre.Utils.Types;
namespace Deltatre.Utils.Dto
{
/// <summary>
/// Represents the result obtained when performing an operation
/// </summary>
/// <typeparam name="TOutput">The type of the operation output</typeparam>
/// <typeparam name="TError">The type of operation errors</typeparam>
public sealed class OperationResult<TOutput, TError>
{
/// <summary>
/// Call this method to create an instance representing the result of a failed operation.
/// </summary>
/// <param name="errors">All the detected errors</param>
/// <exception cref="ArgumentNullException">Throws ArgumentNullException when parameter errors is null</exception>
/// <returns>An instance representing the result of a failed operation.</returns>
public static OperationResult<TOutput, TError> CreateFailure(NonEmptySequence<TError> errors)
{
if (errors == null)
throw new ArgumentNullException(nameof(errors));
return new OperationResult<TOutput, TError>(default, errors);
}
/// <summary>
/// Call this method to create an instance representing the result of a failed operation.
/// </summary>
/// <param name="error">The detected error</param>
/// <returns>An instance representing the result of a failed operation.</returns>
public static OperationResult<TOutput, TError> CreateFailure(TError error) =>
CreateFailure(new NonEmptySequence<TError>(new[] { error }));
/// <summary>
/// Call this method to create an instance representing the result of a successful operation. The list of errors will be set to an empty list.
/// </summary>
/// <param name="output">The operation output.</param>
/// <returns>An instance representing the result of a successful operation.</returns>
public static OperationResult<TOutput, TError> CreateSuccess(TOutput output) =>
new OperationResult<TOutput, TError>(output, Enumerable.Empty<TError>());
private readonly TOutput _output;
private OperationResult(TOutput output, IEnumerable<TError> errors)
{
_output = output;
Errors = new ReadOnlyCollection<TError>(errors.ToList());
}
/// <summary>
/// Gets a flag indicating whether the operation completed successfully.
/// </summary>
public bool IsSuccess => Errors.Count == 0;
/// <summary>
/// Gets the result produced from the operation.
/// Accessing this property throws <see cref="System.InvalidOperationException"/> when property <see cref="IsSuccess"/> is <see langword="false"/>.
/// </summary>
/// <exception cref="System.InvalidOperationException">
/// Throws <see cref="System.InvalidOperationException"/> when property <see cref="IsSuccess"/> is <see langword="false"/>.
/// </exception>
public TOutput Output
{
get
{
if (!this.IsSuccess)
{
throw new InvalidOperationException(
"Reading the operation output is not allowed because the operation is failed.");
}
return _output;
}
}
/// <summary>
/// Gets the first error contained in the errors list for the operation.
/// Accessing this property throws <see cref="InvalidOperationException"/> when property <see cref="IsSuccess"/> is <see langword="true"/>.
/// </summary>
/// <exception cref="InvalidOperationException">
/// Throws <see cref="InvalidOperationException"/> when property <see cref="IsSuccess"/> is <see langword="true"/>.
/// </exception>
public TError Error
{
get
{
if (this.IsSuccess)
{
throw new InvalidOperationException(
"Reading the error is not allowed because the operation succeed.");
}
return this.Errors[0];
}
}
/// <summary>
/// All the detected errors. In case of a successful operation this collection will be empty.
/// </summary>
public ReadOnlyCollection<TError> Errors { get; }
/// <summary>
/// Implicit type conversion from <typeparamref name="TOutput"/> to <see cref="OperationResult{TOutput, TError}" />
/// </summary>
/// <param name="value">An instance of type<typeparamref name="TOutput"/> to be converted to <see cref="OperationResult{TOutput, TError}"/></param>
public static implicit operator OperationResult<TOutput, TError>(TOutput value)
{
return CreateSuccess(value);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Parsers/DateParsersTest.cs
using System;
using System.Collections.Generic;
using NUnit.Framework;
using static Deltatre.Utils.Parsers.DateParsers;
namespace Deltatre.Utils.Tests.Parsers
{
[TestFixture]
public class DateParsersTest
{
[TestCase(null)]
[TestCase("")]
[TestCase(" ")]
public void ParseIso8601Date_Throws_When_Parameter_ToBeParsedIsNullOrWhiteSpace(string toBeParsed)
{
// ACT
Assert.Throws<ArgumentException>(() => ParseIso8601Date(toBeParsed));
}
// see https://docs.microsoft.com/en-us/dotnet/standard/base-types/standard-date-and-time-format-strings#Sortable
[Test]
public void ParseIso8601Date_Support_Microsoft_Sortable_Format_Specifier()
{
// ARRANGE
const string toBeParsed = "2008-04-10T06:30:00";
// ACT
var result = ParseIso8601Date(toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
// Check parsed value
var expected = new DateTimeOffset(new DateTime(2008, 4, 10, 6, 30, 0), TimeSpan.FromHours(2));
Assert.AreEqual(expected, result.Output);
}
// see https://docs.microsoft.com/en-us/dotnet/standard/base-types/standard-date-and-time-format-strings#the-round-trip-o-o-format-specifier
[TestCaseSource(nameof(Microsoft_RoundTrip_Format_Test_Case_Source))]
public void ParseIso8601Date_Support_Microsoft_RoundTrip_Format_Specifier((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(tuple.expected, result.Output);
}
[TestCaseSource(nameof(Extended_Formats_Test_Case_Source))]
public void ParseIso8601Date_Support_Extended_Formats((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(tuple.expected, result.Output);
}
[TestCaseSource(nameof(Basic_Formats_Test_Case_Source))]
public void ParseIso8601Date_Support_Basic_Formats((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(tuple.expected, result.Output);
}
[TestCaseSource(nameof(Extended_Formats_With_Minutes_Accuracy_Test_Case_Source))]
public void ParseIso8601Date_Support_Extended_Formats_With_Minutes_Accuracy((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(tuple.expected, result.Output);
}
[TestCaseSource(nameof(Basic_Formats_With_Minutes_Accuracy_Test_Case_Source))]
public void ParseIso8601Date_Support_Basic_Formats_With_Minutes_Accuracy((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(tuple.expected, result.Output);
}
[TestCaseSource(nameof(Extended_Formats_With_Hours_Accuracy_Test_Case_Source))]
public void ParseIso8601Date_Support_Extended_Formats_With_Hours_Accuracy((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(tuple.expected, result.Output);
}
[TestCaseSource(nameof(Basic_Formats_With_Hours_Accuracy_Test_Case_Source))]
public void ParseIso8601Date_Support_Basic_Formats_With_Hours_Accuracy((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(tuple.expected, result.Output);
}
[TestCase("not a date")]
[TestCase("28 November 1988 13:45:60")]
[TestCase("Monday 21st May 2018")]
[TestCase("2011-January-12")]
[TestCase("foo")]
public void ParseIso8601Date_Returns_Invalid_Result_When_String_Passed_In_Is_Not_Iso8601_Date(string toBeParsed)
{
// ACT
var result = ParseIso8601Date(toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsSuccess);
}
[TestCase("2018-05-21")]
[TestCase("20180521")]
public void ParseIso8601Date_Supports_Formats_With_Date_Only(string toBeParsed)
{
// ACT
var result = ParseIso8601Date(toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
// check parsed value
var expected = new DateTimeOffset(new DateTime(2018, 5, 21), TimeSpan.FromHours(2));
Assert.AreEqual(expected, result.Output);
}
[TestCaseSource(nameof(Javascript_Iso8601_Format))]
public void ParseIso8601Date_Javascript_Iso8601_Format((string toBeParsed, DateTimeOffset expected) tuple)
{
// ACT
var result = ParseIso8601Date(tuple.toBeParsed);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
// check parsed value
Assert.AreEqual(tuple.expected, result.Output);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Microsoft_RoundTrip_Format_Test_Case_Source()
{
yield return (
"2009-06-15T13:45:30.0080000",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30, 8), TimeSpan.FromHours(2))
);
yield return (
"2009-06-15T13:45:30.0080000Z",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30, 8), TimeSpan.Zero)
);
yield return (
"2009-06-15T13:45:30.0080000-07:00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30, 8), TimeSpan.FromHours(-7))
);
yield return (
"2009-06-15T13:45:30.0080000+03:00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30, 8), TimeSpan.FromHours(3))
);
yield return (
"2009-06-15T13:45:30.0080000+00:00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30, 8), TimeSpan.Zero)
);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Extended_Formats_Test_Case_Source()
{
// not padded without minutes
yield return (
"2009-06-15T13:45:30-7",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(-7))
);
yield return (
"2009-06-15T13:45:30+3",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(3))
);
yield return (
"2009-06-15T13:45:30+0",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
// padded without minutes
yield return (
"2009-06-15T13:45:30-07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(-7))
);
yield return (
"2009-06-15T13:45:30+03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(3))
);
yield return (
"2009-06-15T13:45:30+00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
// with minutes
yield return (
"2009-06-15T13:45:30-07:03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), new TimeSpan(0, -7, -3, 0))
);
yield return (
"2009-06-15T13:45:30+04:07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), new TimeSpan(0, 4, 7, 0))
);
yield return (
"2009-06-15T13:45:30+00:00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
// UTC
yield return (
"2009-06-15T13:45:30Z",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Basic_Formats_Test_Case_Source()
{
// not padded without minutes
yield return (
"20090615T134530-7",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(-7))
);
yield return (
"20090615T134530+3",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(3))
);
yield return (
"20090615T134530+0",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
// padded without minutes
yield return (
"20090615T134530-07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(-7))
);
yield return (
"20090615T134530+03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.FromHours(3))
);
yield return (
"20090615T134530+00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
// with minutes
yield return (
"20090615T134530-0703",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), new TimeSpan(0, -7, -3, 0))
);
yield return (
"20090615T134530+0407",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), new TimeSpan(0, 4, 7, 0))
);
yield return (
"20090615T134530+0000",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
// UTC
yield return (
"20090615T134530Z",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 30), TimeSpan.Zero)
);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Extended_Formats_With_Minutes_Accuracy_Test_Case_Source()
{
// not padded without minutes
yield return (
"2009-06-15T13:45-7",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(-7))
);
yield return (
"2009-06-15T13:45+3",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(3))
);
yield return (
"2009-06-15T13:45+0",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
// padded without minutes
yield return (
"2009-06-15T13:45-07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(-7))
);
yield return (
"2009-06-15T13:45+03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(3))
);
yield return (
"2009-06-15T13:45+00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
// with minutes
yield return (
"2009-06-15T13:45-07:03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), new TimeSpan(0, -7, -3, 0))
);
yield return (
"2009-06-15T13:45+04:07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), new TimeSpan(0, 4, 7, 0))
);
yield return (
"2009-06-15T13:45+00:00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
// UTC
yield return (
"2009-06-15T13:45Z",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Basic_Formats_With_Minutes_Accuracy_Test_Case_Source()
{
// not padded without minutes
yield return (
"20090615T1345-7",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(-7))
);
yield return (
"20090615T1345+3",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(3))
);
yield return (
"20090615T1345+0",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
// padded without minutes
yield return (
"20090615T1345-07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(-7))
);
yield return (
"20090615T1345+03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.FromHours(3))
);
yield return (
"20090615T1345+00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
// with minutes
yield return (
"20090615T1345-0703",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), new TimeSpan(0, -7, -3, 0))
);
yield return (
"20090615T1345+0407",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), new TimeSpan(0, 4, 7, 0))
);
yield return (
"20090615T1345+0000",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
// UTC
yield return (
"20090615T1345Z",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 45, 0), TimeSpan.Zero)
);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Extended_Formats_With_Hours_Accuracy_Test_Case_Source()
{
// not padded without minutes
yield return (
"2009-06-15T13-7",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(-7))
);
yield return (
"2009-06-15T13+3",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(3))
);
yield return (
"2009-06-15T13+0",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
// padded without minutes
yield return (
"2009-06-15T13-07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(-7))
);
yield return (
"2009-06-15T13+03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(3))
);
yield return (
"2009-06-15T13+00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
// with minutes
yield return (
"2009-06-15T13-07:03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), new TimeSpan(0, -7, -3, 0))
);
yield return (
"2009-06-15T13+04:07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), new TimeSpan(0, 4, 7, 0))
);
yield return (
"2009-06-15T13+00:00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
// UTC
yield return (
"2009-06-15T13Z",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Basic_Formats_With_Hours_Accuracy_Test_Case_Source()
{
// not padded without minutes
yield return (
"20090615T13-7",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(-7))
);
yield return (
"20090615T13+3",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(3))
);
yield return (
"20090615T13+0",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
// padded without minutes
yield return (
"20090615T13-07",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(-7))
);
yield return (
"20090615T13+03",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.FromHours(3))
);
yield return (
"20090615T13+00",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
// with minutes
yield return (
"20090615T13-0703",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), new TimeSpan(0, -7, -3, 0))
);
yield return (
"20090615T13+0407",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), new TimeSpan(0, 4, 7, 0))
);
yield return (
"20090615T13+0000",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
// UTC
yield return (
"20090615T13Z",
new DateTimeOffset(new DateTime(2009, 6, 15, 13, 0, 0), TimeSpan.Zero)
);
}
private static IEnumerable<(string toBeParsed, DateTimeOffset expected)> Javascript_Iso8601_Format()
{
yield return ("2013-02-04T22:44:30.652Z", new DateTimeOffset(new DateTime(2013, 2, 4, 22, 44, 30, 652), TimeSpan.Zero));
yield return ("2018-11-27T15:35:36.116+01:00", new DateTimeOffset(new DateTime(2018, 11, 27, 15, 35, 36, 116), TimeSpan.FromHours(1)));
}
}
}
<file_sep>/src/Deltatre.Utils/Randomization/RandomHelpers.cs
using System;
namespace Deltatre.Utils.Randomization
{
/// <summary>
/// Helper methods for generating random objects
/// </summary>
public static class RandomHelpers
{
private const string AlphanumericChars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
/// <summary>
/// Call this method to get a random alphanumeric string
/// </summary>
/// <param name="length">The lenght of the random string to be generated</param>
/// <returns>A string having the desired lenght composed by alphanumeric characters randomly chosen</returns>
/// <exception cref="ArgumentOutOfRangeException">Throws <see cref="ArgumentOutOfRangeException"/> when parameter <paramref name="length"/> is less than zero</exception>
public static string GetRandomAlphanumericString(int length)
{
if (length < 0)
throw new ArgumentOutOfRangeException(
nameof(length),
$"Parameter {nameof(length)} cannot be less than zero.");
if (length == 0)
return string.Empty;
var characters = new char[length];
for (int i = 0; i < length; i++)
{
var randomIndex = RandomGenerator.Instance.Next(AlphanumericChars.Length);
var randomChar = AlphanumericChars[randomIndex];
characters[i] = randomChar;
}
return new string(characters);
}
}
}
<file_sep>/src/Deltatre.Utils/Functional/Helpers.cs
using System;
using System.Collections.Generic;
using System.Linq;
using Deltatre.Utils.Dto;
using static Deltatre.Utils.Functional.Functions;
namespace Deltatre.Utils.Functional
{
/// <summary>
/// A collection of useful helper methods for common programming tasks
/// </summary>
public static class Helpers
{
/// <summary>
/// This helper retrieves a projection of the first item of a sequence which satisfies a predicate.
/// </summary>
/// <typeparam name="TSource">The type of items inside the source sequence</typeparam>
/// <typeparam name="TResult">The type of the projection being retrieved from the sequence</typeparam>
/// <param name="source">The source from which the projection is retrieved</param>
/// <param name="predicate">The predicate that must be satisfied</param>
/// <param name="projector"></param>
/// <returns>The result of the get operation being performed</returns>
public static OperationResult<TResult> GetFirst<TSource, TResult>(
IEnumerable<TSource> source,
Func<TSource, bool> predicate,
Func<TSource, TResult> projector)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
if (predicate == null)
throw new ArgumentNullException(nameof(predicate));
if (projector == null)
throw new ArgumentNullException(nameof(projector));
var firstMatch = source.FirstOrDefault(predicate);
var notFound = EqualityComparer<TSource>.Default.Equals(firstMatch, default(TSource));
if (notFound)
{
return OperationResult<TResult>.Failure;
}
var projection = projector(firstMatch);
return projection;
}
/// <summary>
/// This helper retrieves the first item of a sequence which satisfies a predicate.
/// </summary>
/// <typeparam name="T">The type of items inside the source sequence</typeparam>
/// <param name="source">The source from which the item is retrieved</param>
/// <param name="predicate">The predicate that must be satisfied</param>
/// <returns>The result of the get operation being performed</returns>
public static OperationResult<T> GetFirst<T>(
IEnumerable<T> source,
Func<T, bool> predicate) =>
GetFirst(source, predicate, Identity);
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/Enumerable/EnumerableExtensionsTest.cs
using System;
using System.Collections.Generic;
using System.Linq;
using Deltatre.Utils.Extensions.Enumerable;
using NUnit.Framework;
using Linq = System.Linq;
namespace Deltatre.Utils.Tests.Extensions.Enumerable
{
[TestFixture]
public class EnumerableExtensionsTest
{
[Test]
public void IsNullOrEmpty_Should_Return_True_When_Source_Is_Null()
{
// ACT
var result = EnumerableExtensions.IsNullOrEmpty<string>(null);
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void IsNullOrEmpty_Should_Return_True_When_Source_Is_Empty()
{
// ACT
var result = Linq.Enumerable.Empty<string>().IsNullOrEmpty();
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void IsNullOrEmpty_Should_Return_False_When_Source_Is_Not_Null_Or_Empty()
{
// ACT
var result = new [] { "test" }.IsNullOrEmpty();
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void ForEach_Throws_When_Source_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => EnumerableExtensions.ForEach<string>(null, _ => { }));
}
[Test]
public void ForEach_Throws_When_ToBeDone_Is_Null()
{
// ARRANGE
var source = Linq.Enumerable.Empty<string>();
// ACT
Assert.Throws<ArgumentNullException>(() => source.ForEach(null));
}
[Test]
public void ForEach_Calls_ToBeDone_For_Each_Item_In_Source()
{
// ARRANGE
var source = new[] {"this", "is", "a", "test"};
// ACT
var strings = new List<string>();
source.ForEach(item => strings.Add(item));
// ASSERT
CollectionAssert.AreEqual(new List<string> {"this", "is", "a", "test"}, strings);
}
[Test]
public void HasDuplicates_Throws_When_Source_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => EnumerableExtensions.HasDuplicates<string>(null));
}
[Test]
public void HasDuplicates_Returns_False_When_Items_In_Source_Are_Distinct()
{
// ARRANGE
var target = new[] {"foo", "bar"};
// ACT
var result = target.HasDuplicates();
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void HasDuplicates_Returns_True_When_Source_Has_Duplicates()
{
// ARRANGE
var target = new[] { "foo", "bar", "foo", "hello", "hello", "world", "bar", "webplu", "foo" };
// ACT
var result = target.HasDuplicates();
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void HasDuplicates_Returns_False_When_Items_In_Source_Are_Distinct_Based_On_Specified_Comparer()
{
// ARRANGE
var target = new[] { "foo", "bar", "HELLO", "WorlD" };
var comparer = StringComparer.InvariantCultureIgnoreCase;
// ACT
var result = target.HasDuplicates(comparer);
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void HasDuplicates_Returns_True_When_Source_Has_Duplicates_Based_On_Specified_Comparer()
{
// ARRANGE
var target = new[] { "foo", "bar", "FOO", "FOO", "baR", "hello", "world", "HellO", "webplu" };
var comparer = StringComparer.InvariantCultureIgnoreCase;
// ACT
var result = target.HasDuplicates(comparer);
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void ToNonEmptySequence_Throws_When_Parameter_Source_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => EnumerableExtensions.ToNonEmptySequence<string>(null));
}
[Test]
public void ToNonEmptySequence_Creates_Non_Empty_Sequence()
{
// ARRANGE
var items = new[] {"foo", "bar"};
// ACT
var result = items.ToNonEmptySequence();
// ASSERT
Assert.IsNotNull(result);
CollectionAssert.AreEqual(new[] { "foo", "bar" }, result);
}
[TestCaseSource(nameof(GetTestCasesForToHashSet))]
public void ToHashSet_Creates_An_HashSet_With_The_Unique_Elements_Of_Starting_Sequence_When_Equality_Comparer_Is_Not_Specified(
(IEnumerable<string> startingSequence, IEnumerable<string> uniqueItems) tuple)
{
// ACT
var result = EnumerableExtensions.ToHashSet(tuple.startingSequence);
// ASSERT
Assert.IsNotNull(result);
CollectionAssert.AreEquivalent(tuple.uniqueItems, result);
}
[TestCaseSource(nameof(GetTestCasesForToHashSetWithEqualityComparer))]
public void ToHashSet_Creates_An_HashSet_With_The_Unique_Elements_Of_Starting_Sequence_When_Equality_Comparer_Is_Specified(
(IEnumerable<string> startingSequence, IEnumerable<string> uniqueItems) tuple)
{
// ACT
var result = EnumerableExtensions.ToHashSet(tuple.startingSequence, StringComparer.InvariantCultureIgnoreCase);
// ASSERT
Assert.IsNotNull(result);
CollectionAssert.AreEquivalent(tuple.uniqueItems, result);
}
[Test]
public void SplitInBatches_Throws_When_Source_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => EnumerableExtensions.SplitInBatches<int>(null, 10));
}
[Test]
public void SplitInBatches_Throws_When_BatchSize_Is_Less_Than_Zero()
{
// ARRANGE
var target = Linq.Enumerable.Range(1, 100);
// ACT
Assert.Throws<ArgumentOutOfRangeException>(() => target.SplitInBatches(-5));
}
[Test]
public void SplitInBatches_Throws_When_BatchSize_Equals_Zero()
{
// ARRANGE
var target = Linq.Enumerable.Range(1, 100);
// ACT
Assert.Throws<ArgumentOutOfRangeException>(() => target.SplitInBatches(0));
}
[Test]
public void SplitInBatches_Returns_Empty_Sequence_When_Source_Is_Empty_Sequence()
{
// ARRANGE
var target = Linq.Enumerable.Empty<int>();
// ACT
var result = target.SplitInBatches(10).ToArray();
// ASSERT
Assert.IsNotNull(result);
Assert.IsEmpty(result);
}
[Test]
public void SplitInBatches_Returns_One_Batch_When_Source_Contains_Number_Of_Items_Less_Than_BatchSize()
{
// ARRANGE
var target = Linq.Enumerable.Range(1, 10);
// ACT
var result = target.SplitInBatches(15).ToArray();
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(1, result.Length);
// check batch content
var batch = result[0];
CollectionAssert.AreEqual(Linq.Enumerable.Range(1, 10), batch);
}
[Test]
public void SplitInBatches_Returns_One_Batch_When_Source_Contains_Number_Of_Items_Equal_To_BatchSize()
{
// ARRANGE
var target = Linq.Enumerable.Range(1, 10);
// ACT
var result = target.SplitInBatches(10).ToArray();
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(1, result.Length);
// check batch content
var batch = result[0];
CollectionAssert.AreEqual(Linq.Enumerable.Range(1, 10), batch);
}
[Test]
public void SplitInBatches_Works_When_Source_Contains_Number_Of_Items_Multiple_Of_BatchSize()
{
// ARRANGE
var target = Linq.Enumerable.Range(1, 20);
// ACT
var result = target.SplitInBatches(5).ToArray();
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(4, result.Length);
// check batch content
var batch1 = result[0];
CollectionAssert.AreEqual(Linq.Enumerable.Range(1, 5), batch1);
var batch2 = result[1];
CollectionAssert.AreEqual(Linq.Enumerable.Range(6, 5), batch2);
var batch3 = result[2];
CollectionAssert.AreEqual(Linq.Enumerable.Range(11, 5), batch3);
var batch4 = result[3];
CollectionAssert.AreEqual(Linq.Enumerable.Range(16, 5), batch4);
}
[Test]
public void SplitInBatches_Works_When_Source_Contains_Number_Of_Items_Not_Multiple_Of_BatchSize()
{
// ARRANGE
var target = Linq.Enumerable.Range(1, 13);
// ACT
var result = target.SplitInBatches(5).ToArray();
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(3, result.Length);
// check batch content
var batch1 = result[0];
CollectionAssert.AreEqual(Linq.Enumerable.Range(1, 5), batch1);
var batch2 = result[1];
CollectionAssert.AreEqual(Linq.Enumerable.Range(6, 5), batch2);
var batch3 = result[2];
CollectionAssert.AreEqual(Linq.Enumerable.Range(11, 3), batch3);
}
[Test]
public void ToNonEmptyList_Throws_ArgumentNullException_When_Source_Is_Null()
{
// ACT
var exception = Assert.Throws<ArgumentNullException>(() =>
EnumerableExtensions.ToNonEmptyList<string>(null));
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("source", exception.ParamName);
}
[Test]
public void ToNonEmptyList_Returns_Non_Null_Result()
{
// ARRANGE
var source = new[] { "foo", "bar" };
// ACT
var result = source.ToNonEmptyList();
// ASSERT
Assert.IsNotNull(result);
}
[Test]
public void ToNonEmptyList_Returns_Non_Empty_List_Containing_Items_Of_The_Provided_Sequence()
{
// ARRANGE
var source = new[] { "foo", "bar" };
// ACT
var result = source.ToNonEmptyList();
// ASSERT
CollectionAssert.AreEqual(source, result);
Assert.AreEqual(2, result.Count);
Assert.AreEqual("foo", result[0]);
Assert.AreEqual("bar", result[1]);
}
private static IEnumerable<(IEnumerable<string> startingSequence, IEnumerable<string> uniqueItems)>
GetTestCasesForToHashSet()
{
yield return (new string[0], new string[0]);
yield return (new[] {"foo"}, new[] {"foo"});
yield return (new[] {"foo", "bar"}, new[] {"foo", "bar"});
yield return (new[] {"foo", "bar", "foo", "bar"}, new[] {"foo", "bar"});
yield return (new[] {"foo", "bar", "FOO", "BAR", "Foo", "Bar", "FoO", "BaR" }, new[] { "foo", "bar", "FOO", "BAR", "Foo", "Bar", "FoO", "BaR" });
}
private static IEnumerable<(IEnumerable<string> startingSequence, IEnumerable<string> uniqueItems)>
GetTestCasesForToHashSetWithEqualityComparer()
{
yield return (new string[0], new string[0]);
yield return (new[] { "foo" }, new[] { "foo" });
yield return (new[] { "foo", "bar" }, new[] { "foo", "bar" });
yield return (new[] { "foo", "bar", "foo", "bar" }, new[] { "foo", "bar" });
yield return (new[] { "foo", "bar", "FOO", "BAR", "Foo", "Bar", "FoO", "BaR" }, new[] { "foo", "bar" });
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Dto/MaybeTest.cs
using Deltatre.Utils.Dto;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Dto
{
[TestFixture]
public sealed class MaybeTest
{
[Test]
public void Ctor_Allows_To_Create_New_Instance()
{
// ARRANGE
var value = new Person("Enrico");
// ACT
var result = new Maybe<Person>(value);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.HasValue);
Assert.AreSame(value, result.Value);
}
[Test]
public void None_Is_The_Empty_Instance()
{
// ACT
var result = Maybe<Person>.None;
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.HasValue);
}
[Test]
public void Value_Throws_InvalidOperationException_When_Instance_Is_Empty()
{
// ARRANGE
var target = Maybe<Person>.None;
// ACT
var exception = Assert.Throws<InvalidOperationException>(
() => _ = target.Value
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("Accessing the value of an empty Maybe instance is not allowed", exception.Message);
}
[Test]
public void Value_Does_Not_Throw_When_Instance_Is_Not_Empty()
{
// ARRANGE
var value = new Person("Enrico");
var target = new Maybe<Person>(value);
// ACT
Assert.DoesNotThrow(
() => _ = target.Value
);
}
[Test]
public void Value_Returns_Wrapped_Value_When_Instance_Is_Not_Empty()
{
// ARRANGE
var value = new Person("Enrico");
var target = new Maybe<Person>(value);
// ACT
var result = target.Value;
// ASSERT
Assert.IsNotNull(result);
Assert.AreSame(value, result);
}
[Test]
public void HasValue_Returns_True_When_Instance_Is_Not_Empty()
{
// ARRANGE
var value = new Person("Enrico");
var target = new Maybe<Person>(value);
// ACT
var result = target.HasValue;
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void HasValue_Returns_False_When_Instance_Is_Empty()
{
// ARRANGE
var target = Maybe<Person>.None;
// ACT
var result = target.HasValue;
// ASSERT
Assert.IsFalse(result);
}
}
}
<file_sep>/src/Deltatre.Utils/Extensions/ExpandoObjects/ExpandoObjectExtensions.cs
using System;
using System.Collections.Generic;
using System.Dynamic;
using Deltatre.Utils.Internals;
namespace Deltatre.Utils.Extensions.ExpandoObjects
{
/// <summary>
/// A collection of extension methods for the <see cref="ExpandoObject"/> class.
/// </summary>
public static class ExpandoObjectExtensions
{
/// <summary>
/// Performs a top level shallow merge between the current instance and the provided <see cref="ExpandoObject"/>.
/// The current instance is used as the target object and is modified.
/// The provided <see cref="ExpandoObject"/> is shallow merged over the current instance. Only the top level properties are shallow merged.
/// </summary>
/// <param name="target">
/// The object to be used as the target object for the shallow merge operation.
/// This object is modified by the shallow merge operation.
/// Cannot be <see langword="null"/>.
/// </param>
/// <param name="other">
/// The object to be shallow merged over <paramref name="target"/>.
/// Cannot be <see langword="null"/>.
/// </param>
/// <exception cref="ArgumentNullException">
/// Throws <see cref="ArgumentNullException"/> when <paramref name="target"/> is <see langword="null"/>
/// </exception>
/// <exception cref="ArgumentNullException">
/// Throws <see cref="ArgumentNullException"/> when <paramref name="other"/> is <see langword="null"/>
/// </exception>
public static void ShallowMergeWith(
this ExpandoObject target,
ExpandoObject other)
{
if (other == null)
throw new ArgumentNullException(nameof(other));
if (target == null)
throw new ArgumentNullException(nameof(target));
var targetAsDict = (IDictionary<string, object>)target;
var otherAsDict = (IDictionary<string, object>)other;
foreach (var key in otherAsDict.Keys)
{
targetAsDict[key] = otherAsDict[key];
}
}
/// <summary>
/// Returns a shallow clone of the current instance.
/// </summary>
/// <param name="source">
/// The object to be shallow cloned. Cannot be <see langword="null"/>.
/// </param>
/// <returns>
/// A shallow clone of the current instance.
/// </returns>
/// <exception cref="ArgumentNullException">
/// Throws <see cref="ArgumentNullException" /> when <paramref name="source"/> is <see langword="null"/>.
/// </exception>
public static ExpandoObject ShallowClone(this ExpandoObject source)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
var result = new ExpandoObject();
var resultAsDict = (IDictionary<string, object>)result;
foreach (var (key, value) in source)
{
resultAsDict[key] = value;
}
return result;
}
}
}
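// Editor's note: the block below is an illustrative usage sketch, not part of the original library.
// It uses only the ShallowMergeWith and ShallowClone extension methods defined above; the sketch
// class name is hypothetical.
namespace Deltatre.Utils.Extensions.ExpandoObjects
{
  internal static class ExpandoObjectExtensionsUsageSketch
  {
    internal static void Run()
    {
      var defaults = new ExpandoObject();
      var defaultsDictionary = (IDictionary<string, object>)defaults;
      defaultsDictionary["Title"] = "Default title";
      defaultsDictionary["PageSize"] = 20;

      var overrides = new ExpandoObject();
      ((IDictionary<string, object>)overrides)["Title"] = "Custom title";

      // Keeps an untouched copy of the defaults before merging.
      var backup = defaults.ShallowClone();

      // Top-level properties of 'overrides' win over the ones already present in 'defaults'.
      defaults.ShallowMergeWith(overrides);
      // defaultsDictionary["Title"] == "Custom title", defaultsDictionary["PageSize"] == 20

      Console.WriteLine($"{defaultsDictionary["Title"]}, backup entries: {((IDictionary<string, object>)backup).Count}");
    }
  }
}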
<file_sep>/tests/Deltatre.Utils.Tests/Resilience/PassThroughRetryStrategyTest.cs
using AutoFixture;
using Deltatre.Utils.Resilience;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Resilience
{
[TestFixture]
public sealed class PassThroughRetryStrategyTest
{
[Test]
public void Instance_Field_Returns_Non_Null_Instance_Of_PassThroughRetryStrategy()
{
// ASSERT
Assert.IsNotNull(PassThroughRetryStrategy.Instance);
}
[Test]
public void MaxRetryCount_Returns_Zero()
{
// ARRANGE
var target = PassThroughRetryStrategy.Instance;
// ACT
var result = target.MaxRetryCount;
// ASSERT
Assert.AreEqual(0, result);
}
[Test]
public void ComputeSleepDuration_Throws_InvalidOperationException_With_Any_RetryCount_Value()
{
// ARRANGE
var target = PassThroughRetryStrategy.Instance;
var fixture = new Fixture();
var retryCount = fixture.Create<int>();
// ACT
Assert.Throws<InvalidOperationException>(
() => target.ComputeSleepDuration(retryCount)
);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Azure/OsNotSupportedOnAzureAppServicesExceptionTest.cs
using Deltatre.Utils.Azure;
using NUnit.Framework;
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
namespace Deltatre.Utils.Tests.Azure
{
[TestFixture]
public sealed class OsNotSupportedOnAzureAppServicesExceptionTest
{
[Test]
public void Exception_Object_Can_Be_Serialized_And_Then_Deserialized()
{
// ARRANGE
var innerException = new InvalidOperationException("Something wrong happened");
var target = new OsNotSupportedOnAzureAppServicesException(
"The OSX operating system is not supported on Azure app services",
innerException)
{
OperatingSystem = "Hello world"
};
// ACT
var binaryFormatter = new BinaryFormatter();
OsNotSupportedOnAzureAppServicesException result = null;
using (var serializationStream = new MemoryStream())
{
binaryFormatter.Serialize(serializationStream, target);
serializationStream.Seek(0, SeekOrigin.Begin);
result = (OsNotSupportedOnAzureAppServicesException)binaryFormatter.Deserialize(serializationStream);
}
// ASSERT
Assert.IsNotNull(result);
Assert.IsNotNull(result.InnerException);
Assert.IsInstanceOf<InvalidOperationException>(result.InnerException);
Assert.AreEqual("The OSX operating system is not supported on Azure app services", result.Message);
Assert.AreEqual("Something wrong happened", result.InnerException.Message);
Assert.AreEqual("Hello world", result.OperatingSystem);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Dto/ValidationResultTest.cs
using Deltatre.Utils.Dto;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Dto
{
[TestFixture]
public sealed class ValidationResultTest
{
[Test]
public void Valid_Returns_Instance_For_Valid_Validation_Result()
{
// ACT
var result = ValidationResult<string>.Valid;
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsValid);
Assert.IsTrue(result.Errors.IsEmpty);
Assert.AreEqual(0, result.Errors.Length);
}
[Test]
public void Invalid_Returns_Instance_For_Invalid_Validation_Result()
{
// ARRANGE
var errors = new[]
{
"first name is missing",
"last name is invalid"
};
// ACT
var result = ValidationResult<string>.Invalid(errors);
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsValid);
Assert.AreEqual(2, result.Errors.Length);
CollectionAssert.AreEqual(errors, result.Errors);
}
[Test]
public void Invalid_Throws_ArgumentNullException_When_Errors_Is_Null()
{
// ACT
var exception = Assert.Throws<ArgumentNullException>(
() => ValidationResult<string>.Invalid(null)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("errors", exception.ParamName);
}
[Test]
public void Invalid_Throws_ArgumentException_When_Errors_Is_Empty_Collection()
{
// ARRANGE
var errors = Array.Empty<string>();
// ACT
var exception = Assert.Throws<ArgumentException>(
() => ValidationResult<string>.Invalid(errors)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("errors", exception.ParamName);
}
}
}
<file_sep>/src/Deltatre.Utils/Types/EndPoint.cs
using System;
namespace Deltatre.Utils.Types
{
/// <summary>
/// This struct represents an endpoint of an interval. It stores a value together with a flag indicating whether the value is included in or excluded from the interval.
/// </summary>
/// <typeparam name="T">The type of value</typeparam>
public struct Endpoint<T> : IComparable, IComparable<Endpoint<T>>, IEquatable<Endpoint<T>>
where T : struct, IComparable<T>, IEquatable<T>
{
/// <summary>
/// Constructs an inclusive <see cref="Endpoint{T}"/> of value <paramref name="value"/>
/// </summary>
/// <param name="value">The value of the endpoint</param>
/// <returns>Returns an inclusive <see cref="Endpoint{T}"/> of value <paramref name="value"/></returns>
public static Endpoint<T> Inclusive(T value)
{
return new Endpoint<T>(value, EndPointStatus.Inclusive);
}
/// <summary>
/// Constructs an exclusive <see cref="Endpoint{T}"/> of value <paramref name="value"/>
/// </summary>
/// <param name="value">The value of the endpoint</param>
/// <returns>Returns an exclusive <see cref="Endpoint{T}"/> of value <paramref name="value"/></returns>
public static Endpoint<T> Exclusive(T value)
{
return new Endpoint<T>(value, EndPointStatus.Exclusive);
}
private Endpoint(T value, EndPointStatus status)
{
Value = value;
Status = status;
}
/// <summary>
/// Value of the endpoint
/// </summary>
public T Value { get; }
/// <summary>
/// Status of the endpoint. It could be <see cref="EndPointStatus.Inclusive"/> or <see cref="EndPointStatus.Exclusive"/>
/// </summary>
public EndPointStatus Status { get; }
#region Equality
/// <summary>
/// Determines whether two instances of <see cref="Endpoint{T}"/> are equal.
/// Two <see cref="Endpoint{T}"/> are equal if both <see cref="Value"/> and <see cref="Status"/> are equal
/// </summary>
/// <param name="other">An object to compare the current instance with</param>
/// <returns>Boolean indicating whether the two instances are considered equal</returns>
public bool Equals(Endpoint<T> other)
{
return Value.Equals(other.Value) && Status == other.Status;
}
/// <summary>
/// Determines whether two instances of <see cref="Endpoint{T}"/> are equal.
/// Two <see cref="Endpoint{T}"/> are equal if both <see cref="Value"/> and <see cref="Status"/> are equal
/// </summary>
/// <param name="obj">An object to compare the current instance with</param>
/// <returns>Boolean indicating whether the two instances are considered equal</returns>
public override bool Equals(object obj)
{
return obj is Endpoint<T> endpoint && Equals(endpoint);
}
/// <summary>
/// Compares two <see cref='Endpoint{T}'/> objects. The result specifies whether they are equal.
/// </summary>
/// <returns>Boolean indicating whether the two instances are considered equal</returns>
public static bool operator ==(Endpoint<T> x, Endpoint<T> y)
{
return x.Equals(y);
}
/// <summary>
/// Compares two <see cref='Endpoint{T}'/> objects. The result specifies whether they are not equal.
/// </summary>
/// <returns>Boolean indicating whether the two instances are considered not equal</returns>
public static bool operator !=(Endpoint<T> x, Endpoint<T> y)
{
return !x.Equals(y);
}
/// <summary>
/// Generates a hash code for an instance of <see cref="Endpoint{T}"/>.
/// </summary>
/// <returns>Returns the generated hash code</returns>
public override int GetHashCode()
{
return Value.GetHashCode() ^ Status.GetHashCode();
}
#endregion
#region Comparison
/// <summary>
/// Compares this instance of <see cref="Endpoint{T}"/> with <paramref name="obj"/>
/// </summary>
/// <param name="obj">The object to compare this instance with</param>
/// <returns>
/// 1 : if this instance is greater than <paramref name="obj"/>
/// 0 : if this instance is equal to <paramref name="obj"/>
/// -1 : if this instance is lesser than <paramref name="obj"/>
/// </returns>
public int CompareTo(object obj)
{
if (obj == null)
throw new ArgumentNullException(nameof(obj));
if (!(obj is Endpoint<T>))
throw new ArgumentException($"Given object is not {nameof(Endpoint<T>)}");
return CompareTo((Endpoint<T>)obj);
}
/// <summary>
/// Compares this instance of <see cref="Endpoint{T}"/> with <paramref name="other"/>
/// </summary>
/// <param name="other">The object to compare this instance with</param>
/// <returns>
/// 1 : if this instance is greater than <paramref name="other"/>
/// 0 : if this instance is equal to <paramref name="other"/>
/// -1 : if this instance is lesser than <paramref name="other"/>
/// </returns>
public int CompareTo(Endpoint<T> other)
{
return Value.CompareTo(other.Value);
}
/// <summary>
/// Lesser than operator
/// </summary>
/// <param name="x">left side operand</param>
/// <param name="y">right side operand</param>
/// <returns>Returns a bool indicating whether or not <paramref name="x"/> is lesser than <paramref name="y"/></returns>
public static bool operator <(Endpoint<T> x, Endpoint<T> y)
{
return x.CompareTo(y) < 0;
}
/// <summary>
/// Lesser than or equal operator
/// </summary>
/// <param name="x">left side operand</param>
/// <param name="y">right side operand</param>
/// <returns>Returns a bool indicating whether or not <paramref name="x"/> is lesser than or equal <paramref name="y"/></returns>
public static bool operator <=(Endpoint<T> x, Endpoint<T> y)
{
return x.CompareTo(y) <= 0;
}
/// <summary>
/// Greater than operator
/// </summary>
/// <param name="x">left side operand</param>
/// <param name="y">right side operand</param>
/// <returns>Returns a bool indicating whether or not <paramref name="x"/> is greater than <paramref name="y"/></returns>
public static bool operator >(Endpoint<T> x, Endpoint<T> y)
{
return x.CompareTo(y) > 0;
}
/// <summary>
/// Greater than or equal operator
/// </summary>
/// <param name="x">left side operand</param>
/// <param name="y">right side operand</param>
/// <returns>Returns a bool indicating whether or not <paramref name="x"/> is greater than or equal <paramref name="y"/></returns>
public static bool operator >=(Endpoint<T> x, Endpoint<T> y)
{
return x.CompareTo(y) >= 0;
}
#endregion
/// <summary>
/// Prints a representation of <see cref="Endpoint{T}"/>
/// </summary>
/// <returns>The string representation of the endpoint</returns>
public override string ToString()
{
return $"Endpoint - value: {Value}, status: {Status}";
}
}
/// <summary>
/// This enum contains the possible statuses of an endpoint.
/// </summary>
public enum EndPointStatus
{
/// <summary>
/// The value of the endpoint is inclusive
/// </summary>
Inclusive = 0,
/// <summary>
/// The value of the endpoint is exclusive
/// </summary>
Exclusive = 1
}
}
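// Editor's note: the block below is an illustrative usage sketch, not part of the original library.
// It exercises the Endpoint<T> factory methods, comparison operators and equality members defined
// above; the sketch class name is hypothetical.
namespace Deltatre.Utils.Types
{
  internal static class EndpointUsageSketch
  {
    internal static void Run()
    {
      // Models the interval [5, 10): lower bound included, upper bound excluded.
      var lowerBound = Endpoint<int>.Inclusive(5);
      var upperBound = Endpoint<int>.Exclusive(10);

      // Comparison operators are based on Value only.
      var isWellFormed = lowerBound < upperBound; // true

      // Equality takes both Value and Status into account.
      var areEqual = Endpoint<int>.Inclusive(5) == Endpoint<int>.Exclusive(5); // false

      Console.WriteLine($"{lowerBound} | {upperBound} | well formed: {isWellFormed} | equal: {areEqual}");
    }
  }
}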
<file_sep>/tests/Deltatre.Utils.Tests/Types/NonEmptyListTest.cs
using Deltatre.Utils.Types;
using NUnit.Framework;
using System;
using System.Linq;
namespace Deltatre.Utils.Tests.Types
{
[TestFixture]
public class NonEmptyListTest
{
[Test]
public void Ctor_Throws_ArgumentNullException_When_Items_Is_Null()
{
// ACT
var exception = Assert.Throws<ArgumentNullException>(() => new NonEmptyList<string>(null));
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("items", exception.ParamName);
}
[Test]
public void Ctor_Throws_When_Items_Is_Empty_Sequence()
{
// ACT
var exception = Assert.Throws<ArgumentException>(() => new NonEmptyList<string>(Enumerable.Empty<string>()));
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("items", exception.ParamName);
}
[Test]
public void Ctor_Creates_An_Enumerable_Instance()
{
// ARRANGE
var items = new[] { "foo", "bar" };
// ACT
var result = new NonEmptyList<string>(items);
// ASSERT
Assert.IsNotNull(result);
CollectionAssert.AreEqual(new[] { "foo", "bar" }, result);
}
[Test]
public void Count_Returns_Number_Of_Items_In_List()
{
// ARRANGE
var items = new[] { "foo", "bar" };
var target = new NonEmptyList<string>(items);
// ACT
var result = target.Count;
// ASSERT
Assert.AreEqual(2, result);
}
[Test]
public void Indexer_Can_Be_Used_To_Access_List_Items()
{
// ARRANGE
var items = new[] { "foo", "bar" };
var target = new NonEmptyList<string>(items);
// ACT
var firstItem = target[0];
var secondItem = target[1];
// ASSERT
Assert.AreEqual("foo", firstItem);
Assert.AreEqual("bar", secondItem);
}
}
}
<file_sep>/src/Deltatre.Utils/GlobalSuppressions.cs
// This file is used by Code Analysis to maintain SuppressMessage
// attributes that are applied to this project.
// Project-level suppressions either have no target or are given
// a specific target and scoped to a namespace, type, member, etc.
using System.Diagnostics.CodeAnalysis;
[assembly: SuppressMessage("Globalization", "CA1303:Do not pass literals as localized parameters", Justification = "It's fine to have hardcoded messages in English.", Scope = "module")]
[assembly: SuppressMessage("Design", "CA1000:Do not declare static members on generic types", Justification = "It's fine to declare this static method", Scope = "member", Target = "~M:Deltatre.Utils.Dto.ValidationResult`1.Invalid(System.Collections.Generic.IEnumerable{`0})~Deltatre.Utils.Dto.ValidationResult`1")]
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/ILogger.cs
namespace Deltatre.Utils.Tests.Concurrency
{
public interface ILogger
{
void Log(string message);
}
}
<file_sep>/src/Deltatre.Utils/Concurrency/Extensions/EnumerableExtensions.cs
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
namespace Deltatre.Utils.Concurrency.Extensions
{
/// <summary>
/// Extension methods for enumerables
/// </summary>
public static class EnumerableExtensions
{
/// <summary>
/// Executes an asynchronous operation for each item inside a source sequence. These operations are run concurrently in a parallel fashion. The invocation returns a task which completes when all of the asynchronous operations (one for each item inside the source sequence) complete. It is possible to constrain the maximum number of parallel operations.
/// </summary>
/// <typeparam name="T">The type of the items inside <paramref name="source"/></typeparam>
/// <param name="source">The source sequence</param>
/// <param name="operation">The asynchronous operation to be executed for each item inside <paramref name="source"/></param>
/// <param name="maxDegreeOfParallelism">The maximum number of operations that are able to run in parallel. If null, no limits will be set for the maximum number of parallel operations (same behaviour as Task.WhenAll)</param>
/// <returns>A task which completes when all of the asynchronous operations (one for each item inside <paramref name="source"/>) complete</returns>
/// <exception cref="ArgumentNullException"><paramref name="source"/> is <c>null</c>.</exception>
/// <exception cref="ArgumentNullException"><paramref name="operation"/> is <c>null</c>.</exception>
/// <exception cref="ArgumentOutOfRangeException"><paramref name="maxDegreeOfParallelism"/> is less than or equal to zero.</exception>
public static Task ForEachAsync<T>(
this IEnumerable<T> source,
Func<T, Task> operation,
int? maxDegreeOfParallelism = null)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
if (operation == null)
throw new ArgumentNullException(nameof(operation));
EnsureValidMaxDegreeOfParallelism(maxDegreeOfParallelism);
return (maxDegreeOfParallelism == null)
? ApplyOperationToAllItems(source, operation)
: ApplyOperationToAllItemsWithConstrainedParallelism(source, operation, maxDegreeOfParallelism.Value);
}
private static Task ApplyOperationToAllItems<T>(
IEnumerable<T> items,
Func<T, Task> operation)
{
var tasks = items.Select(operation);
return Task.WhenAll(tasks);
}
private static async Task ApplyOperationToAllItemsWithConstrainedParallelism<T>(
IEnumerable<T> items,
Func<T, Task> operation,
int maxDegreeOfParallelism)
{
using (var throttler = new SemaphoreSlim(maxDegreeOfParallelism))
{
var tasks = new List<Task>();
foreach (var item in items)
{
await throttler.WaitAsync().ConfigureAwait(false);
#pragma warning disable IDE0039 // Use local function
Func<Task> bodyOfNewTask = async () =>
#pragma warning restore IDE0039 // Use local function
{
try
{
await operation(item).ConfigureAwait(false);
}
finally
{
throttler.Release();
}
};
tasks.Add(Task.Run(bodyOfNewTask));
}
await Task.WhenAll(tasks).ConfigureAwait(false);
}
}
/// <summary>
/// Executes an asynchronous operation for each item inside a source sequence. These operations are run concurrently in a parallel fashion. The invocation returns a task whose result is a sequence containing the results of all the asynchronous operations (in source sequence order). It is possible to constrain the maximum number of parallel operations.
/// </summary>
/// <typeparam name="TSource">The type of the items inside the source sequence</typeparam>
/// <typeparam name="TResult">The type of the object produced by invoking <paramref name="operation"/> on any item of <paramref name="source"/></typeparam>
/// <param name="source">The source sequence</param>
/// <param name="operation">The asynchronous operation to be executed for each item inside <paramref name="source"/>. This operation will produce a result of type <typeparamref name="TResult"/></param>
/// <param name="maxDegreeOfParallelism">The maximum number of operations that are able to run in parallel. If null, no limits will be set for the maximum number of parallel operations (same behaviour as Task.WhenAll)</param>
/// <returns>A task which completes when all of the asynchronous operations (one for each item inside <paramref name="source"/>) complete. This task will produce a sequence of objects of type <typeparamref name="TResult"/> which are the results (in source sequence order) of applying <paramref name="operation"/> to all items in <paramref name="source"/></returns>
/// <exception cref="ArgumentNullException"><paramref name="source"/> is <c>null</c>.</exception>
/// <exception cref="ArgumentNullException"><paramref name="operation"/> is <c>null</c>.</exception>
/// <exception cref="ArgumentOutOfRangeException"><paramref name="maxDegreeOfParallelism"/> is less than or equal to zero.</exception>
public static Task<TResult[]> ForEachAsync<TSource, TResult>(
this IEnumerable<TSource> source,
Func<TSource, Task<TResult>> operation,
int? maxDegreeOfParallelism = null)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
if (operation == null)
throw new ArgumentNullException(nameof(operation));
EnsureValidMaxDegreeOfParallelism(maxDegreeOfParallelism);
return (maxDegreeOfParallelism == null)
? ApplyOperationToAllItems(source, operation)
: ApplyOperationToAllItemsWithConstrainedParallelism(source, operation, maxDegreeOfParallelism.Value);
}
private static Task<TResult[]> ApplyOperationToAllItems<TItem, TResult>(
IEnumerable<TItem> items,
Func<TItem, Task<TResult>> operation)
{
var tasks = items.Select(operation);
return Task.WhenAll(tasks);
}
private static async Task<TResult[]> ApplyOperationToAllItemsWithConstrainedParallelism<TItem, TResult>(
IEnumerable<TItem> items,
Func<TItem, Task<TResult>> operation,
int maxDegreeOfParallelism)
{
var resultsByPositionInSource = new ConcurrentDictionary<long, TResult>();
using (var throttler = new SemaphoreSlim(maxDegreeOfParallelism))
{
var tasks = new List<Task>();
foreach (var itemWithIndex in items.Select((item, index) => new { item, index }))
{
await throttler.WaitAsync().ConfigureAwait(false);
#pragma warning disable IDE0039 // Use local function
Func<Task> bodyOfNewTask = async () =>
#pragma warning restore IDE0039 // Use local function
{
try
{
var item = itemWithIndex.item;
var positionInSource = itemWithIndex.index;
var result = await operation(item).ConfigureAwait(false);
resultsByPositionInSource.TryAdd(positionInSource, result);
}
finally
{
throttler.Release();
}
};
tasks.Add(Task.Run(bodyOfNewTask));
}
await Task.WhenAll(tasks).ConfigureAwait(false);
}
return Enumerable
.Range(0, resultsByPositionInSource.Count)
.Select(position => resultsByPositionInSource[position])
.ToArray();
}
private static void EnsureValidMaxDegreeOfParallelism(int? maxDegreeOfParallelism)
{
if (maxDegreeOfParallelism <= 0)
{
throw new ArgumentOutOfRangeException(
nameof(maxDegreeOfParallelism),
$"Invalid value for the maximum degree of parallelism: {maxDegreeOfParallelism}. The maximum degree of parallelism must be a positive integer.");
}
}
}
}
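// Editor's note: the block below is an illustrative usage sketch, not part of the original library.
// It shows both ForEachAsync overloads defined above with a bounded degree of parallelism; the
// sketch class name and the simulated I/O delay are hypothetical.
namespace Deltatre.Utils.Concurrency.Extensions
{
  internal static class ForEachAsyncUsageSketch
  {
    internal static async Task RunAsync()
    {
      var urls = new[] { "https://example.com/a", "https://example.com/b", "https://example.com/c" };

      // Overload without results: runs at most two operations at a time and
      // completes when all of them have completed.
      await urls.ForEachAsync(
        async url =>
        {
          await Task.Delay(100).ConfigureAwait(false); // simulate some I/O work
          Console.WriteLine($"processed {url}");
        },
        maxDegreeOfParallelism: 2).ConfigureAwait(false);

      // Overload with results: the results are returned in source sequence order.
      string[] descriptions = await urls.ForEachAsync(
        async url =>
        {
          await Task.Delay(100).ConfigureAwait(false); // simulate some I/O work
          return $"{url} has {url.Length} characters";
        },
        maxDegreeOfParallelism: 2).ConfigureAwait(false);

      Console.WriteLine(string.Join(Environment.NewLine, descriptions));
    }
  }
}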
<file_sep>/src/Deltatre.Utils/Resilience/WaitAndRetryWithRandomSleepDurationsConfiguration.cs
using System;
namespace Deltatre.Utils.Resilience
{
/// <summary>
/// This class encapsulates all the configurations for the
/// <see cref="WaitAndRetryWithRandomSleepDurations"/> retry strategy.
/// </summary>
public sealed class WaitAndRetryWithRandomSleepDurationsConfiguration
{
/// <summary>
/// Gets the number of allowed retries.
/// </summary>
/// <remarks>
/// If the number of allowed retries is zero (0) the resulting retry policy is a
/// pass-through which simply executes the provided operation and doesn't apply any retry in
/// case of errors.
/// </remarks>
public int MaxRetryCount { get; }
/// <summary>
/// Gets the maximum allowed duration for the sleep time between two consecutive retries
/// expressed as number of milliseconds.
/// </summary>
public int MaxSleepDurationInMilliseconds { get; }
/// <summary>
/// Initializes a new instance of the
/// <see cref="WaitAndRetryWithRandomSleepDurationsConfiguration"/> class.
/// </summary>
/// <param name="maxRetryCount">
/// The number of allowed retries. This parameter must be non negative.
/// If the value zero (0) is provided, the resulting retry strategy will be a pass-through
/// which doesn't perform any retry on errors.
/// </param>
/// <param name="maxSleepDurationInMilliseconds">
/// The maximum allowed duration for the sleep time between two consecutive retries.
/// This must be expressed as a number of milliseconds.
/// This parameter must be a positive integer number.
/// </param>
/// <exception cref="ArgumentOutOfRangeException">
/// When parameter <paramref name="maxRetryCount"/> is less than zero (0).
/// When parameter <paramref name="maxSleepDurationInMilliseconds"/> is less than or equal to zero (0).
/// </exception>
public WaitAndRetryWithRandomSleepDurationsConfiguration(
int maxRetryCount,
int maxSleepDurationInMilliseconds)
{
if (maxRetryCount < 0)
{
var message = FormattableString.Invariant(
$"Invalid number of allowed retries: {maxRetryCount}. The number of allowed retries must be a non negative integer number."
);
throw new ArgumentOutOfRangeException(nameof(maxRetryCount), message);
}
if (maxSleepDurationInMilliseconds <= 0)
{
var message = FormattableString.Invariant(
$"Invalid maximum allowed sleep duration: {maxSleepDurationInMilliseconds}. The maximum allowed sleep duration must be a positive integer number."
);
throw new ArgumentOutOfRangeException(nameof(maxSleepDurationInMilliseconds), message);
}
this.MaxRetryCount = maxRetryCount;
this.MaxSleepDurationInMilliseconds = maxSleepDurationInMilliseconds;
}
}
}
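// Editor's note: the block below is an illustrative usage sketch, not part of the original library.
// It only constructs the configuration type defined above to show its validation rules; the sketch
// class name is hypothetical.
namespace Deltatre.Utils.Resilience
{
  internal static class WaitAndRetryConfigurationUsageSketch
  {
    internal static void Run()
    {
      // Allows up to 3 retries, each preceded by a random sleep of at most 2 seconds.
      var configuration = new WaitAndRetryWithRandomSleepDurationsConfiguration(
        maxRetryCount: 3,
        maxSleepDurationInMilliseconds: 2_000);

      Console.WriteLine(
        $"retries: {configuration.MaxRetryCount}, max sleep (ms): {configuration.MaxSleepDurationInMilliseconds}");

      // A negative maxRetryCount or a non-positive maxSleepDurationInMilliseconds throws
      // ArgumentOutOfRangeException, while maxRetryCount equal to zero is allowed and results
      // in a pass-through strategy that performs no retries.
    }
  }
}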
<file_sep>/tests/Deltatre.Utils.Tests/Dto/OperationResultTest.cs
using Deltatre.Utils.Dto;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Dto
{
[TestFixture]
public class OperationResultTest
{
[Test]
public void CreateSuccess_Returns_An_Instance_Representing_Successful_Operation()
{
// ACT
var result = OperationResult<string>.CreateSuccess("hello world");
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual("hello world", result.Output);
}
[Test]
public void Failure_Is_An_Instance_Representing_The_Result_Of_A_Failed_Operation()
{
// ACT
var result = OperationResult<string>.Failure;
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsSuccess);
}
[Test]
public void Implicit_Conversion_From_TOutput_To_OperationResultOfTOutput_Is_Available()
{
// ACT
OperationResult<string> result = "Hello !";
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual("Hello !", result.Output);
}
[Test]
public void Output_Throws_InvalidOperationException_When_Operation_Is_Failed()
{
// ARRANGE
var target = OperationResult<string>.Failure;
// ACT
var exception = Assert.Throws<InvalidOperationException>(() => _ = target.Output);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("Reading the operation output is not allowed because the operation is failed.", exception.Message);
}
[Test]
public void Output_Does_Not_Throw_When_Operation_Is_Successful()
{
// ARRANGE
var target = OperationResult<string>.CreateSuccess("the message");
// ACT
Assert.DoesNotThrow(() => _ = target.Output);
}
[Test]
public void Output_Returns_The_Output_Of_Successful_Operation()
{
// ARRANGE
var target = OperationResult<string>.CreateSuccess("the message");
// ACT
var result = target.Output;
// ASSERT
Assert.AreEqual("the message", result);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Randomization/RandomHelpersTest.cs
using System;
using System.Collections.Generic;
using System.Linq;
using Deltatre.Utils.Randomization;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Randomization
{
[TestFixture]
public class RandomHelpersTest
{
[Test]
public void GetRandomAlphanumericString_Throws_Argument_Out_Of_Range_Exception_When_Length_Is_Less_Than_Zero()
{
// ACT
Assert.Throws<ArgumentOutOfRangeException>(() => RandomHelpers.GetRandomAlphanumericString(-3));
}
[Test]
public void GetRandomAlphanumericString_Returns_Empty_String_When_Length_Equals_Zero()
{
// ACT
var result = RandomHelpers.GetRandomAlphanumericString(0);
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(string.Empty, result);
}
[Test]
public void GetRandomAlphanumericString_Returns_Alphanumeric_String()
{
// ACT
var result = RandomHelpers.GetRandomAlphanumericString(8);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.All(char.IsLetterOrDigit));
}
[Test]
public void GetRandomAlphanumericString_Returns_String_Having_Requested_Length()
{
// ACT
var result = RandomHelpers.GetRandomAlphanumericString(8);
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(8, result.Length);
}
[Test]
public void GetRandomAlphanumericString_Returns_String_With_Random_Characters()
{
// ARRANGE
var results = new HashSet<string>();
// ACT
for (int i = 0; i < 1000; i++)
{
var result = RandomHelpers.GetRandomAlphanumericString(8);
var addedToSet = results.Add(result);
Assert.IsTrue(addedToSet);
}
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/Extensions/EnumerableExtensionsTest_ForEachAsync_WithResults.cs
using NUnit.Framework;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Deltatre.Utils.Concurrency.Extensions;
using System.Collections.Concurrent;
using System.Threading;
namespace Deltatre.Utils.Tests.Concurrency.Extensions
{
public partial class EnumerableExtensionsTest
{
[Test]
public void ForEachAsync_WithResults_Throws_ArgumentNullException_When_Source_Is_Null()
{
// ARRANGE
const int maxDegreeOfParallelism = 4;
Func<string, Task<bool>> operation = _ => Task.FromResult(true);
// ACT
Assert.ThrowsAsync<ArgumentNullException>(() =>
EnumerableExtensions.ForEachAsync(null, operation, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_WithResults_Throws_ArgumentNullException_When_Operation_Is_Null()
{
// ARRANGE
const int maxDegreeOfParallelism = 4;
var source = new[] { "hello", "world" };
Func<string, Task<bool>> operation = null;
// ACT
Assert.ThrowsAsync<ArgumentNullException>(() =>
source.ForEachAsync(operation, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_WithResults_Throws_ArgumentOutOfRangeException_When_MaxDegreeOfParallelism_Is_Less_Than_Zero()
{
// ARRANGE
const int maxDegreeOfParallelism = -3;
var source = new[] { "hello", "world" };
Func<string, Task<bool>> operation = _ => Task.FromResult(true);
// ACT
Assert.ThrowsAsync<ArgumentOutOfRangeException>(() =>
source.ForEachAsync(operation, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_WithResults_Throws_ArgumentOutOfRangeException_When_MaxDegreeOfParallelism_Equals_Zero()
{
// ARRANGE
const int maxDegreeOfParallelism = 0;
var source = new[] { "hello", "world" };
Func<string, Task<bool>> operation = _ => Task.FromResult(true);
// ACT
Assert.ThrowsAsync<ArgumentOutOfRangeException>(() =>
source.ForEachAsync(operation, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_WithResults_Does_Not_Throw_When_MaxDegreeOfParallelism_Is_Null()
{
// ARRANGE
var source = new[] { "hello", "world" };
Func<string, Task<string>> operation = item => Task.FromResult(item);
// ACT
Assert.DoesNotThrowAsync(() => source.ForEachAsync(operation, null));
}
[TestCase(2)]
[TestCase(3)]
[TestCase(4)]
[TestCase(null)]
public async Task ForEachAsync_WithResults_Executes_The_Operation_For_Each_Item_Inside_Source_Collection(int? maxDegreeOfParallelism)
{
// ARRANGE
var source = new[] { "foo", "bar", "buzz" };
var processedItems = new ConcurrentBag<string>();
Func<string, Task<string>> operation = item =>
{
processedItems.Add(item);
return Task.FromResult(item);
};
// ACT
await source.ForEachAsync(operation, maxDegreeOfParallelism).ConfigureAwait(false);
// ASSERT
CollectionAssert.AreEquivalent(new[] { "foo", "bar", "buzz" }, processedItems);
}
[TestCase(2)]
[TestCase(3)]
[TestCase(4)]
public async Task ForEachAsync_WithResults_Executes_A_Number_Of_Concurrent_Operations_Less_Than_Or_Equal_To_MaxDegreeOfParallelism(int maxDegreeOfParallelism)
{
// ARRANGE
var source = new[] { "foo", "bar", "buzz" };
var timeRanges = new ConcurrentBag<(DateTime start, DateTime end)>();
Func<string, Task<string>> operation = async item =>
{
var start = DateTime.Now;
await Task.Delay(500).ConfigureAwait(false);
timeRanges.Add((start, DateTime.Now));
return item;
};
// ACT
await source.ForEachAsync(operation, maxDegreeOfParallelism).ConfigureAwait(false);
// ASSERT
var timeRangesArray = timeRanges.ToArray();
for (int i = 0; i < timeRanges.Count; i++)
{
var current = timeRangesArray[i];
var others = GetOthers(timeRangesArray, i);
var overlaps = 0;
foreach (var item in others)
{
if (AreOverlapping(current, item))
{
overlaps++;
}
}
Assert.IsTrue(overlaps <= maxDegreeOfParallelism);
}
}
[TestCase(2)]
[TestCase(3)]
[TestCase(4)]
[TestCase(null)]
public async Task ForEachAsync_WithResults_Produces_Result_For_Each_Item_Inside_Source_Collection(int? maxDegreeOfParallelism)
{
// ARRANGE
var source = new[] { "foo", "bar", "buzz" };
Func<string, Task<string>> operation = item => Task.FromResult(item.ToUpperInvariant());
// ACT
var results = await source.ForEachAsync(operation, maxDegreeOfParallelism).ConfigureAwait(false);
// ASSERT
CollectionAssert.AreEquivalent(new[] { "FOO", "BAR", "BUZZ" }, results);
}
[TestCase(2)]
[TestCase(3)]
[TestCase(4)]
[TestCase(null)]
public async Task ForEachAsync_WithResults_Produces_Results_In_Source_Order(int? maxDegreeOfParallelism)
{
// ARRANGE
var source = new[] { "foo", "bar", "buzz" };
Func<string, Task<string>> operation = item => Task.FromResult(item.ToUpperInvariant());
// ACT
var results = await source.ForEachAsync(operation, maxDegreeOfParallelism).ConfigureAwait(false);
// ASSERT
CollectionAssert.AreEqual(new[] { "FOO", "BAR", "BUZZ" }, results);
}
}
}
<file_sep>/src/Deltatre.Utils/Extensions/String/StringExtensions.cs
using System;
using System.Text.RegularExpressions;
namespace Deltatre.Utils.Extensions.String
{
/// <summary>
/// Extension methods for strings
/// </summary>
public static class StringExtensions
{
private static readonly Regex HtmlTagRegex = new Regex(@"<\/?[A-Za-z0-9]+[^>]*>", RegexOptions.Compiled);
/// <summary>
/// This method is meant to be a simplistic tool to remove HTML tags (both opening and closing) from a string. Only the opening and closing tags will be removed; the tag content won't.
/// Opening tags not matching any closing tag will be removed.
/// Closing tags not matching any opening tag will be removed.
/// Encoded HTML tags won't be removed.
/// Don't use this method as a general-purpose string sanitization tool, especially in critical contexts: it is regular-expression based, and that approach is known to be suboptimal for parsing HTML.
/// </summary>
/// <param name="source">The string to be sanitized from HTML tags</param>
/// <returns>The source string cleaned by all the HTML tags</returns>
public static string StripHtmlTags(this string source)
{
if (string.IsNullOrWhiteSpace(source))
return source;
return HtmlTagRegex.Replace(source, string.Empty);
}
/// <summary>
/// Call this method if you want to process a string and be sure that its length doesn't exceed a maximum allowed length.
/// If the length of the source string is less than or equal to the maximum allowed length, then the source string is returned unchanged.
/// If the length of the source string is greater than the maximum allowed length, then a substring of the source string is returned.
/// You can specify an optional suffix to be appended at the end of the returned string if the returned string is a substring of the source string.
/// If the source string is truncated and a suffix is appended at the end, then the length of the returned string will be the maximum allowed length plus the length of the suffix.
/// </summary>
/// <param name="source">The source string to be processed</param>
/// <param name="maximumAllowedLength">The maximum allowed length of the source string. This value must be non negative.</param>
/// <param name="ellipsis">The suffix to be appended at the end of the returned string if the source string is truncated because its length exceeds the maximum allowed length. If you pass a null or white-space string, then no suffix will be appended at the end of the returned string</param>
/// <returns>A <see cref="string"/> which is the source string or a substring of the source string possibly enriched with the specified suffix</returns>
/// <exception cref="ArgumentOutOfRangeException">Throws <see cref="ArgumentOutOfRangeException"/> when parameter <paramref name="maximumAllowedLength"/> is less than zero</exception>
public static string Truncate(
this string source,
int maximumAllowedLength = 150,
string ellipsis = "...")
{
if (maximumAllowedLength < 0)
{
throw new ArgumentOutOfRangeException(
nameof(maximumAllowedLength),
$"Parameter '{nameof(maximumAllowedLength)}' must be non negative integer");
}
if (string.IsNullOrEmpty(source))
return source;
if (source.Length <= maximumAllowedLength)
return source;
var truncated = source.Substring(0, maximumAllowedLength);
if (string.IsNullOrWhiteSpace(ellipsis))
return truncated;
return string.Concat(truncated, ellipsis);
}
}
}
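// Editor's note: the block below is an illustrative usage sketch, not part of the original library.
// It uses only the StripHtmlTags and Truncate extension methods defined above; the sketch class
// name is hypothetical.
namespace Deltatre.Utils.Extensions.String
{
  internal static class StringExtensionsUsageSketch
  {
    internal static void Run()
    {
      // Tags are removed, the text between them is kept.
      var plainText = "<p>Hello <strong>world</strong></p>".StripHtmlTags();
      // plainText == "Hello world"

      // The string is cut at 10 characters and the default "..." suffix is appended,
      // so the returned value is 13 characters long.
      var truncated = "This sentence is far too long".Truncate(maximumAllowedLength: 10);
      // truncated == "This sente..."

      // Passing a null or white-space ellipsis returns the bare substring.
      var bare = "This sentence is far too long".Truncate(10, ellipsis: null);
      // bare == "This sente"

      Console.WriteLine($"{plainText} | {truncated} | {bare}");
    }
  }
}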
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/DateTime/DateTimeExtensionsTest.cs
using System;
using Deltatre.Utils.Extensions.DateTimeExtension;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Extensions.DateTimeExtension
{
[TestFixture]
public class DateTimeExtensionsTest
{
[Test]
public void Epoch_Should_Return_Unix_Epoch_Time()
{
// ARRANGE
var expectedEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
// ACT
var actualEpoch = DateTimeExtensions.Epoch;
// ASSERT
Assert.AreEqual(expectedEpoch, actualEpoch);
}
[Test]
public void ToUnixTimeSeconds_Should_Return_Zero_When_Called_With_Epoch()
{
// ARRANGE
var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
//ACT
var unixTimeSeconds = epoch.ToUnixTimeSeconds();
// ASSERT
Assert.AreEqual(0, unixTimeSeconds);
}
[Test]
public void ToUnixTimeSeconds_Should_Return_86400_When_Called_With_Epoch_Plus_One_Day()
{
// ARRANGE
var date = new DateTime(1970, 1, 2, 0, 0, 0, DateTimeKind.Utc);
//ACT
var unixTimeSeconds = date.ToUnixTimeSeconds();
// ASSERT
Assert.AreEqual(86400, unixTimeSeconds);
}
[Test]
public void ToUnixTimeSeconds_Should_Return_Minus_86400_When_Called_With_Epoch_Minus_One_Day()
{
// ARRANGE
var date = new DateTime(1969, 12, 31, 0, 0, 0, DateTimeKind.Utc);
//ACT
var unixTimeSeconds = date.ToUnixTimeSeconds();
// ASSERT
Assert.AreEqual(-86400, unixTimeSeconds);
}
}
}
<file_sep>/src/Deltatre.Utils/Mocking/RandomValueProvider.cs
using System;
using static Deltatre.Utils.Randomization.RandomGenerator;
namespace Deltatre.Utils.Mocking
{
/// <summary>
/// An implementation of <see cref="IRandomValueProvider"/> based on the
/// <see cref="Random"/> class.
/// </summary>
public sealed class RandomValueProvider : IRandomValueProvider
{
/// <summary>
/// Returns a random integer that is within a specified range
/// </summary>
/// <param name="minValue">
/// The inclusive lower bound of the random number returned
/// </param>
/// <param name="maxValue">
/// The exclusive upper bound of the random number returned. <paramref name="maxValue"/>
/// must be greater than or equal to <paramref name="minValue"/>.
/// </param>
/// <returns>
/// A 32-bit signed integer greater than or equal to <paramref name="minValue"/> and less than <paramref name="maxValue"/>;
/// that is, the range of return values includes <paramref name="minValue"/> but not <paramref name="maxValue"/>.
/// If <paramref name="minValue"/> equals <paramref name="maxValue"/>, <paramref name="minValue"/> is returned.
/// </returns>
/// <exception cref="ArgumentOutOfRangeException">
/// <paramref name="minValue"/> is greater than <paramref name="maxValue"/>
/// </exception>
public int Next(int minValue, int maxValue) => Instance.Next(minValue, maxValue);
/// <summary>
/// Returns a random floating-point number that is greater than or equal to 0.0 and less than 1.0
/// </summary>
/// <returns>
/// A double-precision floating point number that is greater than or equal to 0.0, and less than 1.0
/// </returns>
public double NextDouble() => Instance.NextDouble();
}
}
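// Editor's note: the block below is an illustrative usage sketch, not part of the original library.
// DiceRollerSketch is a hypothetical consumer type; it shows how depending on IRandomValueProvider
// instead of Random keeps randomness injectable (and therefore mockable in tests, as done elsewhere
// in this repository), while RandomValueProvider is the production implementation.
namespace Deltatre.Utils.Mocking
{
  internal sealed class DiceRollerSketch
  {
    private readonly IRandomValueProvider _randomValueProvider;

    public DiceRollerSketch(IRandomValueProvider randomValueProvider)
    {
      _randomValueProvider = randomValueProvider ?? throw new ArgumentNullException(nameof(randomValueProvider));
    }

    // Returns a value between 1 and 6: the upper bound passed to Next is exclusive.
    public int Roll() => _randomValueProvider.Next(1, 7);
  }

  internal static class DiceRollerSketchUsage
  {
    internal static void Run()
    {
      var roller = new DiceRollerSketch(new RandomValueProvider());
      Console.WriteLine($"rolled: {roller.Roll()}");
    }
  }
}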
<file_sep>/src/Deltatre.Utils/Randomization/RandomGenerator.cs
using System;
using System.Threading;
namespace Deltatre.Utils.Randomization
{
/// <summary>
/// Use this class to get a thread-local instance of the Random class. Instances obtained on different threads have different seeds.
/// </summary>
public static class RandomGenerator
{
private static int _seed = Environment.TickCount;
private static readonly ThreadLocal<Random> ThreadLocalRandom = new ThreadLocal<Random>(() =>
new Random(Interlocked.Increment(ref _seed))
);
/// <summary>
/// The thread-local instance of the Random class.
/// </summary>
public static Random Instance => ThreadLocalRandom.Value;
}
}
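// Editor's note: the block below is an illustrative usage sketch, not part of the original library.
// It shows why RandomGenerator.Instance is convenient in multi-threaded code: each thread gets its
// own Random with a distinct seed, so concurrent callers neither share state nor produce identical
// sequences. The sketch class name is hypothetical.
namespace Deltatre.Utils.Randomization
{
  internal static class RandomGeneratorUsageSketch
  {
    internal static void Run()
    {
      // Safe to call concurrently: every worker thread resolves its own Random instance.
      System.Threading.Tasks.Parallel.For(0, 4, _ =>
      {
        var value = RandomGenerator.Instance.Next(1, 101); // a value in [1, 100]
        Console.WriteLine($"thread {Thread.CurrentThread.ManagedThreadId} drew {value}");
      });
    }
  }
}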
<file_sep>/src/Deltatre.Utils/Resilience/IInfiniteRetryStrategy.cs
using System;
namespace Deltatre.Utils.Resilience
{
/// <summary>
/// Describes a strategy to perform infinite retries over a failing operation.
/// </summary>
public interface IInfiniteRetryStrategy
{
/// <summary>
/// Computes the amount of time that must be waited before doing another retry.
/// </summary>
/// <param name="retryCount">
/// An incremental integer value which acts as a retry counter.
/// This counter is always a positive integer number.
/// It takes the value 1 after the first attempt of performing the operation, before the first retry.
/// It takes the value 2 after the first retry, before the second retry.
/// It takes the value 3 after the second retry, before the third retry and so on.
/// </param>
/// <returns>
/// A <see cref="TimeSpan"/> representing the amount of time that must be waited before doing another retry.
/// </returns>
/// <exception cref="ArgumentOutOfRangeException">
/// When parameter <paramref name="retryCount"/> is less than 1
/// </exception>
TimeSpan ComputeSleepDuration(int retryCount);
}
}
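// Editor's note: the block below is an illustrative sketch of a possible implementation, not part of
// the original library. The type name is hypothetical; it honours the retryCount contract documented
// above by validating the counter and then returning a fixed sleep duration between retries.
namespace Deltatre.Utils.Resilience
{
  internal sealed class ConstantSleepInfiniteRetryStrategySketch : IInfiniteRetryStrategy
  {
    private readonly TimeSpan _sleepDuration;

    public ConstantSleepInfiniteRetryStrategySketch(TimeSpan sleepDuration)
    {
      if (sleepDuration < TimeSpan.Zero)
        throw new ArgumentOutOfRangeException(nameof(sleepDuration), "The sleep duration must be non negative.");

      _sleepDuration = sleepDuration;
    }

    public TimeSpan ComputeSleepDuration(int retryCount)
    {
      if (retryCount < 1)
        throw new ArgumentOutOfRangeException(nameof(retryCount), "The retry counter must be a positive integer number.");

      // Every retry waits the same fixed amount of time, regardless of how many attempts were made.
      return _sleepDuration;
    }
  }
}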
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/TimerAsyncTest_Ctor.cs
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Deltatre.Utils.Concurrency;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Concurrency
{
[TestFixture]
public partial class TimerAsyncTest
{
[Test]
public void Ctor_Throws_When_ScheduledAction_IsNull()
{
// ACT
Assert.Throws<ArgumentNullException>(
() => new TimerAsync(
null,
TimeSpan.FromSeconds(1),
TimeSpan.FromSeconds(1)));
}
[Test]
public void Ctor_Throws_When_DueTime_Is_Less_Than_Zero()
{
// ACT
Assert.Throws<ArgumentOutOfRangeException>(
() => new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromMilliseconds(-6),
TimeSpan.FromSeconds(10)));
}
[Test]
public void Ctor_Throws_When_Period_Is_Less_Than_Zero()
{
// ACT
Assert.Throws<ArgumentOutOfRangeException>(
() => new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromSeconds(10),
TimeSpan.FromMilliseconds(-3)));
}
[Test]
public void Ctor_Allows_To_Pass_DueTime_Zero()
{
// ACT
Assert.DoesNotThrow(
() => new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.Zero,
TimeSpan.FromSeconds(10)));
}
[Test]
public void Ctor_Allows_To_Pass_Period_Zero()
{
// ACT
Assert.DoesNotThrow(
() => new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromSeconds(5),
TimeSpan.Zero));
}
[Test]
public void Ctor_Allows_To_Pass_Infinite_DueTime()
{
Assert.DoesNotThrow(
() => new TimerAsync(
_ => Task.CompletedTask,
Timeout.InfiniteTimeSpan,
TimeSpan.FromSeconds(10)));
}
[Test]
public void Ctor_Allows_To_Pass_Infinite_Period()
{
Assert.DoesNotThrow(
() => new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromSeconds(5),
Timeout.InfiniteTimeSpan));
}
[Test]
public async Task It_Is_Possible_To_Execute_Background_Workload_Before_Previous_Execution_Completes()
{
// ARRANGE
var iterationInfos = new ConcurrentBag<(DateTime start, DateTime end, int iterationNumber)>();
int counter = 0;
Func<CancellationToken, Task> action = async _ =>
{
var iterationNumber = Interlocked.Increment(ref counter);
var startTime = DateTime.Now;
await Task.Delay(300, CancellationToken.None).ConfigureAwait(false);
var endTime = DateTime.Now;
iterationInfos.Add((startTime, endTime, iterationNumber));
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(40),
TimeSpan.FromMilliseconds(40),
canStartNextActionBeforePreviousIsCompleted: true))
{
// ACT
target.Start();
await Task.Delay(1000).ConfigureAwait(false);
await target.Stop().ConfigureAwait(false);
// ASSERT
Assert.GreaterOrEqual(iterationInfos.Count, 2);
// check the overlap
var timeFrames = iterationInfos
.OrderBy(tf => tf.iterationNumber)
.Select(tf => (tf.start, tf.end))
.ToArray<(DateTime start, DateTime end)>();
var timeFrame1 = timeFrames[0];
var timeFrame2 = timeFrames[1];
Assert.IsTrue(timeFrame1.end > timeFrame2.start);
}
}
[Test]
public async Task Amount_Of_Time_Before_First_Execution_Of_Background_Workload_Equals_DueTime()
{
// ARRANGE
var timeframes = new List<(DateTime start, DateTime end)>();
Func<CancellationToken, Task> action = async _ =>
{
var startTime = DateTime.Now;
await Task.Delay(500, CancellationToken.None).ConfigureAwait(false);
var endTime = DateTime.Now;
timeframes.Add((startTime, endTime));
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(300),
TimeSpan.FromMilliseconds(500)))
{
// ACT
var startingTime = DateTime.Now;
target.Start();
await Task.Delay(350).ConfigureAwait(false);
await target.Stop().ConfigureAwait(false);
// ASSERT
Assert.GreaterOrEqual(timeframes.Count, 1);
var actualDueTime = timeframes[0].start - startingTime;
Assert.GreaterOrEqual(actualDueTime, TimeSpan.FromMilliseconds(300));
}
}
[Test]
public async Task Amount_Of_Time_Between_Consecutives_Execution_Of_Background_Workload_Equals_Period()
{
// ARRANGE
var timeframes = new List<(DateTime start, DateTime end)>();
Func<CancellationToken, Task> action = async _ =>
{
var startTime = DateTime.Now;
await Task.Delay(500, CancellationToken.None).ConfigureAwait(false);
var endTime = DateTime.Now;
timeframes.Add((startTime, endTime));
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(300),
TimeSpan.FromMilliseconds(500)))
{
// ACT
target.Start();
await Task.Delay(2100).ConfigureAwait(false); // in this amount of time the background workload is executed at least 2 times
await target.Stop().ConfigureAwait(false);
// ASSERT
Assert.GreaterOrEqual(timeframes.Count, 2);
var timeframe1 = timeframes[0];
var timeframe2 = timeframes[1];
var actualPeriod = timeframe2.start - timeframe1.end;
Assert.GreaterOrEqual(actualPeriod.TotalMilliseconds, 500);
}
}
}
}
<file_sep>/src/Deltatre.Utils/Internals/KeyValuePairExtensions.cs
using System.Collections.Generic;
namespace Deltatre.Utils.Internals
{
internal static class KeyValuePairExtensions
{
internal static void Deconstruct<TKey, TValue>(
this KeyValuePair<TKey, TValue> source,
out TKey key,
out TValue value)
{
key = source.Key;
value = source.Value;
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Resilience/WaitAndRetryWithRandomSleepDurationsTest.cs
using AutoFixture;
using Deltatre.Utils.Mocking;
using Deltatre.Utils.Resilience;
using Moq;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Resilience
{
[TestFixture]
public sealed class WaitAndRetryWithRandomSleepDurationsTest
{
private Mock<IRandomValueProvider> _randomValueProviderMock;
[SetUp]
public void Init()
{
_randomValueProviderMock = new Mock<IRandomValueProvider>();
}
[Test]
public void MaxRetryCount_Returns_Configured_Max_Retry_Count_Value()
{
// ARRANGE
const int maxRetryCount = 5;
var target = this.CreateTarget(maxRetryCount: maxRetryCount);
// ACT
var result = target.MaxRetryCount;
// ASSERT
Assert.AreEqual(maxRetryCount, result);
}
[TestCase(0)]
[TestCase(-1)]
public void ComputeSleepDuration_Throws_ArgumentOutOfRangeException_When_RetryCount_Is_Less_Than_One(
int retryCount)
{
// ARRANGE
const int maxRetryCount = 3;
const int maxSleepDurationMs = 2_000;
var target = this.CreateTarget(maxRetryCount, maxSleepDurationMs);
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => target.ComputeSleepDuration(retryCount)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("retryCount", exception.ParamName);
}
[TestCase(4)]
[TestCase(5)]
public void ComputeSleepDuration_Throws_ArgumentOutOfRangeException_When_RetryCount_Is_Greater_Than_MaxRetryCount(
int retryCount)
{
// ARRANGE
const int maxRetryCount = 3;
const int maxSleepDurationMs = 2_000;
var target = this.CreateTarget(maxRetryCount, maxSleepDurationMs);
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => target.ComputeSleepDuration(retryCount)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("retryCount", exception.ParamName);
}
[Test]
public void ComputeSleepDuration_Does_Not_Throw_When_Retry_Count_Equals_One()
{
// ARRANGE
const int maxRetryCount = 4;
const int maxSleepDurationMs = 2_000;
var target = this.CreateTarget(maxRetryCount, maxSleepDurationMs);
const int retryCount = 1;
const double randomValue = 0.6d;
_randomValueProviderMock
.Setup(m => m.NextDouble())
.Returns(randomValue);
// ACT
Assert.DoesNotThrow(
() => target.ComputeSleepDuration(retryCount)
);
}
[Test]
public void ComputeSleepDuration_Does_Not_Throw_When_Retry_Count_Equals_MaxRetryCount()
{
// ARRANGE
const int maxRetryCount = 4;
const int maxSleepDurationMs = 2_000;
var target = this.CreateTarget(maxRetryCount, maxSleepDurationMs);
const int retryCount = maxRetryCount;
const double randomValue = 0.6d;
_randomValueProviderMock
.Setup(m => m.NextDouble())
.Returns(randomValue);
// ACT
Assert.DoesNotThrow(
() => target.ComputeSleepDuration(retryCount)
);
}
[Test]
public void ComputeSleepDuration_Does_Not_Throw_When_Retry_Count_Is_Between_One_And_MaxRetryCount()
{
// ARRANGE
const int maxRetryCount = 4;
const int maxSleepDurationMs = 2_000;
var target = this.CreateTarget(maxRetryCount, maxSleepDurationMs);
const int retryCount = 2;
const double randomValue = 0.6d;
_randomValueProviderMock
.Setup(m => m.NextDouble())
.Returns(randomValue);
// ACT
Assert.DoesNotThrow(
() => target.ComputeSleepDuration(retryCount)
);
}
[Test]
public void ComputeSleepDuration_Returns_Expected_Value()
{
// ARRANGE
const int maxRetryCount = 4;
const int maxSleepDurationMs = 2_000;
var target = this.CreateTarget(maxRetryCount, maxSleepDurationMs);
const int retryCount = 2;
const double randomValue = 0.6d;
_randomValueProviderMock
.Setup(m => m.NextDouble())
.Returns(randomValue);
// ACT
var result = target.ComputeSleepDuration(retryCount);
// ASSERT
const double expectedSleepDurationMs = randomValue * maxSleepDurationMs;
var expectedSleepDuration = TimeSpan.FromMilliseconds(expectedSleepDurationMs);
Assert.AreEqual(expectedSleepDuration, result);
}
[Test]
public void ComputeSleepDuration_Throws_InvalidOperationException_When_MaxRetryCount_Equals_Zero()
{
// ARRANGE
const int maxRetryCount = 0;
const int maxSleepDurationMs = 2_000;
var target = this.CreateTarget(maxRetryCount, maxSleepDurationMs);
var fixture = new Fixture();
var retryCount = fixture.Create<int>();
const double randomValue = 0.6d;
_randomValueProviderMock
.Setup(m => m.NextDouble())
.Returns(randomValue);
// ACT
Assert.Throws<InvalidOperationException>(
() => target.ComputeSleepDuration(retryCount)
);
}
private WaitAndRetryWithRandomSleepDurations CreateTarget(
int maxRetryCount = 3,
int maxSleepDurationMs = 1000)
{
var randomValueProvider = _randomValueProviderMock.Object;
var config = new WaitAndRetryWithRandomSleepDurationsConfiguration(
maxRetryCount,
maxSleepDurationMs);
return new WaitAndRetryWithRandomSleepDurations(
config,
randomValueProvider);
}
}
}
<file_sep>/src/Deltatre.Utils/Extensions/Enumerable/EnumerableExtensions.cs
using System;
using System.Collections.Generic;
using System.Linq;
using Deltatre.Utils.Types;
namespace Deltatre.Utils.Extensions.Enumerable
{
/// <summary>
/// Extension methods for enumerables
/// </summary>
public static class EnumerableExtensions
{
/// <summary>
/// Checks whether a sequence is null or empty
/// </summary>
public static bool IsNullOrEmpty<T>(this IEnumerable<T> source)
{
if (source == null)
return true;
return !source.Any();
}
/// <summary>
/// Performs the specified action on each element of the provided <see cref="IEnumerable{T}"/>
/// </summary>
/// <param name="source">The sequence of items on which to execute the specified action</param>
/// <param name="toBeDone">The action to be executed for each item in <paramref name="source"/></param>
/// <exception cref="ArgumentNullException"> Throws ArgumentNullException when parameter source is null </exception>
/// <exception cref="ArgumentNullException"> Throws ArgumentNullException when parameter toBeDone is null </exception>
public static void ForEach<T>(this IEnumerable<T> source, Action<T> toBeDone)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
if (toBeDone == null)
throw new ArgumentNullException(nameof(toBeDone));
foreach (var item in source)
toBeDone(item);
}
/// <summary>
/// Checks whether a sequence of items contains duplicate elements.
/// </summary>
/// <param name="source">The sequence of items to be checked for duplicates.</param>
/// <param name="comparer">The equality comparer to be used when checking for duplicates in <paramref name="source"/>. Pass null if you want to use the default equality comparer for type <typeparamref name="T"/>.</param>
/// <returns>A boolean value indicating whether or not <paramref name="source"/> contains duplicate elements.</returns>
/// <exception cref="ArgumentNullException">Throws ArgumentNullException when parameter source is null</exception>
/// <remarks> Use default equality comparer if parameter comparer is not specified </remarks>
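/// <example>
/// A minimal sketch of the expected behaviour (the sample values are illustrative only):
/// <code>
/// new[] { 1, 2, 2 }.HasDuplicates(); // true
/// new[] { 1, 2, 3 }.HasDuplicates(); // false
/// new[] { "a", "A" }.HasDuplicates(StringComparer.OrdinalIgnoreCase); // true
/// </code>
/// </example>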
public static bool HasDuplicates<T>(this IEnumerable<T> source, IEqualityComparer<T> comparer = null)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
var equalityComparer = comparer ?? EqualityComparer<T>.Default;
var set = new HashSet<T>(equalityComparer);
var hasDuplicates = !source.All(set.Add);
return hasDuplicates;
}
/// <summary>
/// Creates an instance of <see cref="NonEmptySequence{T}"/> containing the items of the provided sequence.
/// </summary>
/// <param name="source">The sequence of items to be used to create an instance of <see cref="NonEmptySequence{T}"/>.</param>
/// <typeparam name="T">The type of the items in <paramref name="source"/></typeparam>
/// <returns>An instance of <see cref="NonEmptySequence{T}"/> containing the items of <paramref name="source"/>.</returns>
/// <exception cref="ArgumentNullException">Throws <see cref="ArgumentNullException"/> when <paramref name="source"/> is null</exception>
public static NonEmptySequence<T> ToNonEmptySequence<T>(this IEnumerable<T> source)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
return new NonEmptySequence<T>(source);
}
/// <summary>
/// Creates an instance of <see cref="NonEmptyList{T}"/> containing the items of the provided sequence.
/// </summary>
/// <typeparam name="T">The type of the items in <paramref name="source"/></typeparam>
/// <param name="source">The sequence of items to be used to create an instance of <see cref="NonEmptyList{T}"/>.</param>
/// <returns>An instance of <see cref="NonEmptyList{T}"/> containing the items of <paramref name="source"/>.</returns>
/// <exception cref="ArgumentNullException">Throws <see cref="ArgumentNullException"/> when <paramref name="source"/> is null.</exception>
public static NonEmptyList<T> ToNonEmptyList<T>(this IEnumerable<T> source)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
return new NonEmptyList<T>(source);
}
/// <summary>
/// Creates a hash set from a sequence of items. The default equality comparer for the type argument will be used.
/// </summary>
/// <typeparam name="T">The type of the items inside the starting sequence</typeparam>
/// <param name="source">The starting sequence</param>
/// <returns>A hash set containing the unique elements of the starting sequence</returns>
public static HashSet<T> ToHashSet<T>(this IEnumerable<T> source) =>
source.ToHashSet(EqualityComparer<T>.Default);
/// <summary>
/// Creates a hash set from a sequence of items.
/// </summary>
/// <typeparam name="T">The type of the items inside the starting sequence</typeparam>
/// <param name="source">The starting sequence</param>
/// <param name="comparer">The equality comparer used by the hash set to find the unique elements inside the starting sequence</param>
/// <returns>A hash set containing the unique elements of the starting sequence</returns>
public static HashSet<T> ToHashSet<T>(this IEnumerable<T> source, IEqualityComparer<T> comparer) =>
new HashSet<T>(source, comparer);
/// <summary>
/// Splits a sequence into batches of fixed size
/// </summary>
/// <typeparam name="T">The type of the items inside the starting sequence</typeparam>
/// <param name="source">The starting sequence</param>
/// <param name="batchSize">The size of batches to be obtained by splitting the starting sequence</param>
/// <returns>A sequence containing the batches obtained by splitting the starting sequence</returns>
/// <exception cref="ArgumentNullException">Throws ArgumentNullException when parameter source is null</exception>
/// <exception cref="ArgumentOutOfRangeException">Throws ArgumentOutOfRangeException when parameter batchSize is less than or equal to zero</exception>
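/// <example>
/// A minimal sketch of the expected behaviour (the sample values are illustrative only):
/// <code>
/// var batches = new[] { 1, 2, 3, 4, 5 }.SplitInBatches(2);
/// // yields the batches [1, 2], [3, 4] and finally the partial batch [5]
/// </code>
/// </example>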
public static IEnumerable<IEnumerable<T>> SplitInBatches<T>(
this IEnumerable<T> source,
int batchSize)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
if(batchSize <= 0)
{
throw new ArgumentOutOfRangeException(
nameof(batchSize),
$"Batch size is expected to be a positive integer, but was {batchSize}.");
}
return SplitInBatchesImplementation(source, batchSize);
}
private static IEnumerable<IEnumerable<T>> SplitInBatchesImplementation<T>(
IEnumerable<T> source,
int batchSize)
{
var batch = new List<T>(batchSize);
foreach (var item in source)
{
batch.Add(item);
if (batch.Count == batchSize)
{
yield return batch;
batch = new List<T>(batchSize);
}
}
if (batch.Count > 0)
yield return batch;
}
}
}
<file_sep>/src/Deltatre.Utils/Concurrency/TimerAsync.cs
using System;
using System.Threading;
using System.Threading.Tasks;
namespace Deltatre.Utils.Concurrency
{
/// <summary>
/// Provides a mechanism to schedule the recurrent execution of a background workload (supporting cancellation) on a thread pool thread. The scheduling can be stopped later, when the background workload is no longer needed. It is possible to start and stop the timer freely as many times as you want.
/// This class is thread safe.
/// This class cannot be inherited.
/// </summary>
public sealed class TimerAsync : IDisposable
{
private readonly Func<CancellationToken, Task> scheduledAction;
private readonly TimeSpan dueTime;
private readonly TimeSpan period;
private readonly bool canStartNextActionBeforePreviousIsCompleted;
private readonly SemaphoreSlim semaphore; // provides a way to synchronize access to the isRunning private state
private CancellationTokenSource cancellationTokenSource;
private Task backgroundWorkloadTask;
private bool isDisposed;
private bool isRunning;
/// <summary>
/// This event is raised when an error occurs during the execution of the scheduled background workload. Subscribe to this event if you want to perform logging or do something else when an error occurs.
/// </summary>
public event EventHandler<Exception> OnError;
/// <summary>
/// Initializes a new instance of the <see cref="TimerAsync" /> class
/// </summary>
/// <param name="scheduledAction">The background workload to be scheduled</param>
/// <param name="dueTime">The delay befoe <paramref name="scheduledAction" /> is invoked for the first time. Pass the value <see cref="Timeout.InfiniteTimeSpan" /> if you never want to execute <paramref name="scheduledAction" /></param>
/// <param name="period">The delay between two consecutive invocations of <paramref name="scheduledAction" />. Pass the value <see cref="Timeout.InfiniteTimeSpan" /> if want to execute <paramref name="scheduledAction" /> once</param>
/// <param name="canStartNextActionBeforePreviousIsCompleted"> Whether or not waiting for the completion of <paramref name="scheduledAction" /> before starting the delay represented by <paramref name="period" /></param>
/// <exception cref="ArgumentNullException">Throws <see cref="ArgumentNullException"/> when parameter <paramref name="scheduledAction"/> is null</exception>
/// <exception cref="ArgumentOutOfRangeException">Throws <see cref="ArgumentOutOfRangeException"/> when either parameter <paramref name="dueTime"/> or <paramref name="period"/> is less than <see cref="TimeSpan.Zero"/> and other than <see cref="Timeout.InfiniteTimeSpan" /></exception>
public TimerAsync(
Func<CancellationToken, Task> scheduledAction,
TimeSpan dueTime,
TimeSpan period,
bool canStartNextActionBeforePreviousIsCompleted = false)
{
if (!isValidDelay(dueTime))
throw new ArgumentOutOfRangeException(
nameof(dueTime),
$"Parameter {nameof(dueTime)} must be a non negative delay or a delay of -1 milliseconds");
if (!isValidDelay(period))
throw new ArgumentOutOfRangeException(
nameof(period),
$"Parameter {nameof(period)} must be a non negative delay or a delay of -1 milliseconds");
this.scheduledAction = scheduledAction ?? throw new ArgumentNullException(nameof(scheduledAction));
this.dueTime = dueTime;
this.period = period;
this.canStartNextActionBeforePreviousIsCompleted = canStartNextActionBeforePreviousIsCompleted;
this.semaphore = new SemaphoreSlim(1);
bool isValidDelay(TimeSpan delay)
{
return delay >= TimeSpan.Zero || delay == Timeout.InfiniteTimeSpan;
}
}
/// <summary>
/// Starts the timer. You can safely call this method multiple times even if the timer is already started.
/// </summary>
/// <exception cref="ObjectDisposedException">Throws <see cref="ObjectDisposedException"/> if the instance on which you are calling <see cref="Start"/> has been disposed</exception>
public void Start()
{
if (this.isDisposed)
throw new ObjectDisposedException(typeof(TimerAsync).Name);
this.semaphore.Wait();
try
{
if (this.isRunning)
return;
this.cancellationTokenSource = new CancellationTokenSource();
this.isRunning = true;
this.backgroundWorkloadTask = this.RunBackgroundWorkload();
}
finally
{
this.semaphore.Release();
}
}
/// <summary>
/// Stops the timer. Calling this method cancels the scheduling of the background workload. You can safely call this method multiple times even if the timer is already stopped.
/// </summary>
/// <returns>A <see cref="Task"/> representing the ongoing operation of stopping the timer</returns>
/// <exception cref="ObjectDisposedException">Throws <see cref="ObjectDisposedException"/> if the instance on which you are calling <see cref="Stop"/> has been disposed</exception>
public async Task Stop()
{
if (this.isDisposed)
throw new ObjectDisposedException(typeof(TimerAsync).Name);
await this.semaphore.WaitAsync().ConfigureAwait(false);
try
{
if (!this.isRunning)
return;
this.cancellationTokenSource.Cancel();
await this.backgroundWorkloadTask.ConfigureAwait(false);
}
catch (OperationCanceledException)
{
// nothing to do here: task cancellation throws OperationCanceledException by design
}
finally
{
this.isRunning = false;
this.cancellationTokenSource?.Dispose();
this.cancellationTokenSource = null;
this.semaphore.Release();
}
}
private Task RunBackgroundWorkload()
{
return Task.Run(async () =>
{
try
{
await Task.Delay(this.dueTime, this.cancellationTokenSource.Token).ConfigureAwait(false);
while (true)
{
if (this.canStartNextActionBeforePreviousIsCompleted)
{
#pragma warning disable CS4014 // Because this call is not awaited, execution of the current method continues before the call is completed
this.scheduledAction(this.cancellationTokenSource.Token); // fire-and-forget call to the scheduled action without waiting for it to complete
#pragma warning restore CS4014 // Because this call is not awaited, execution of the current method continues before the call is completed
}
else
{
await this.scheduledAction(this.cancellationTokenSource.Token).ConfigureAwait(false);
}
await Task.Delay(this.period, this.cancellationTokenSource.Token).ConfigureAwait(false);
}
}
catch (OperationCanceledException)
{
// nothing to do here: task cancellation throws OperationCanceledException by design
}
catch (Exception exception)
{
this.isRunning = false;
try
{
this.OnError?.Invoke(this, exception);
}
#pragma warning disable RCS1075 // Avoid empty catch clause that catches System.Exception.
#pragma warning disable RECS0022 // A catch clause that catches System.Exception and has an empty body
catch(Exception)
#pragma warning restore RECS0022 // A catch clause that catches System.Exception and has an empty body
#pragma warning restore RCS1075 // Avoid empty catch clause that catches System.Exception.
{
// swallow any kind of error here (we know nothing about the event subscriber and it is possible that it throws an exception during execution)
}
}
}, this.cancellationTokenSource.Token);
}
/// <summary>
/// Release resources
/// </summary>
public void Dispose()
{
this.Dispose(true);
GC.SuppressFinalize(this);
}
/// <summary>
/// The class finalizer
/// </summary>
~TimerAsync()
{
this.Dispose(false);
}
private void Dispose(bool disposing)
{
if (this.isDisposed)
return;
if (disposing)
{
this.Stop().Wait();
this.cancellationTokenSource?.Dispose();
this.semaphore?.Dispose();
}
this.isDisposed = true;
}
}
}
<file_sep>/src/Deltatre.Utils/Extensions/Guid/GuidExtensions.cs
using System;
using System.Text.RegularExpressions;
namespace Deltatre.Utils.Extensions.Guid
{
/// <summary>
/// Extension methods for <see cref="System.Guid"/>.
/// </summary>
public static class GuidExtensions
{
/// <summary>
/// Converts a GUID to a Base64 string, removing "url-unfriendly" characters (/+=).
/// </summary>
/// <param name="value"></param>
/// <returns></returns>
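/// <example>
/// A minimal sketch (the exact output depends on the GUID value):
/// <code>
/// var shortId = Guid.NewGuid().ToShortString();
/// // a token of at most 22 characters, with '/', '+' and '=' removed
/// </code>
/// </example>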
public static string ToShortString(this System.Guid value)
{
return Regex.Replace(Convert.ToBase64String(value.ToByteArray()), "[/+=]", string.Empty);
}
}
}<file_sep>/README.md
# Deltatre Utils
This is a general-purpose library containing helpers for C# .NET projects.
The main tools you can find here are:
- extension methods for .NET BCL types
- data transfer objects useful for common programming tasks
- custom types that enforce safety invariants (for instance, non-empty string and non-empty sequence)
- a simple timer class for executing an asynchronous task periodically
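
As a quick orientation, here is a minimal usage sketch of the timer class (`TimerAsync` in `Deltatre.Utils.Concurrency`); the workload body and the delays shown are placeholders to adapt to your own scenario:

```csharp
using System;
using System.Threading.Tasks;
using Deltatre.Utils.Concurrency;

// inside an async method
using (var timer = new TimerAsync(
  scheduledAction: async cancellationToken =>
  {
    // placeholder workload: replace with your own background task
    await Task.Delay(100, cancellationToken).ConfigureAwait(false);
  },
  dueTime: TimeSpan.FromSeconds(1),  // delay before the first execution
  period: TimeSpan.FromSeconds(5)))  // delay between consecutive executions
{
  timer.OnError += (sender, exception) => Console.WriteLine(exception);
  timer.Start();

  // ... later, when the recurring workload is no longer needed
  await timer.Stop().ConfigureAwait(false);
}
```

Subscribing to `OnError` is optional: it is only needed if you want to be notified of exceptions thrown by the scheduled workload.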
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/ExpandoObjects/ExpandoObjectExtensionsTest_ShallowMergeWith.cs
using Deltatre.Utils.Extensions.ExpandoObjects;
using NUnit.Framework;
using System;
using System.Collections;
using System.Collections.Generic;
using System.Dynamic;
using System.IO;
namespace Deltatre.Utils.Tests.Extensions.ExpandoObjects
{
[TestFixture]
public sealed partial class ExpandoObjectExtensionsTest
{
[Test]
public void ShallowMergeWith_Throws_ArgumentNullException_When_Target_Is_Null()
{
// ARRANGE
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("FirstName", "Mario");
otherAsDict.Add("LastName", "Rossi");
// ACT
var exception = Assert.Throws<ArgumentNullException>(
() => ExpandoObjectExtensions.ShallowMergeWith(null, other)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("target", exception.ParamName);
}
[Test]
public void ShallowMergeWith_Throws_ArgumentNullException_When_Other_Is_Null()
{
// ARRANGE
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
// ACT
var exception = Assert.Throws<ArgumentNullException>(
() => target.ShallowMergeWith(null)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("other", exception.ParamName);
}
[Test]
public void ShallowMergeWith_Does_Not_Change_Target_When_Other_Has_No_Properties()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
// ACT
target.ShallowMergeWith(new ExpandoObject());
// ASSERT
Assert.AreEqual(targetAsDict.Count, 4);
Assert.IsTrue(targetAsDict.ContainsKey("FirstName"));
Assert.IsTrue(targetAsDict.ContainsKey("LastName"));
Assert.IsTrue(targetAsDict.ContainsKey("Age"));
Assert.IsTrue(targetAsDict.ContainsKey("Contact"));
Assert.AreEqual(targetAsDict["FirstName"], "Mario");
Assert.AreEqual(targetAsDict["LastName"], "Rossi");
Assert.AreEqual(targetAsDict["Age"], 50);
// check contact object
var contactObject = targetAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 4);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Region"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Province"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Italy");
Assert.AreEqual(contactObjectAsDict["Region"], "Lombardia");
Assert.AreEqual(contactObjectAsDict["Province"], "MI");
Assert.AreEqual(contactObjectAsDict["City"], "Milano");
}
[Test]
public void ShallowMergeWith_Copies_Properties_Of_Other_To_Target_When_Target_Has_No_Properties()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("FirstName", "Mario");
otherAsDict.Add("LastName", "Rossi");
otherAsDict.Add("Age", 50);
otherAsDict.Add("Contact", contact);
// ACT
var target = new ExpandoObject();
target.ShallowMergeWith(other);
// ASSERT
var targetAsDict = (IDictionary<string, object>)target;
Assert.AreEqual(targetAsDict.Count, 4);
Assert.IsTrue(targetAsDict.ContainsKey("FirstName"));
Assert.IsTrue(targetAsDict.ContainsKey("LastName"));
Assert.IsTrue(targetAsDict.ContainsKey("Age"));
Assert.IsTrue(targetAsDict.ContainsKey("Contact"));
Assert.AreEqual(targetAsDict["FirstName"], "Mario");
Assert.AreEqual(targetAsDict["LastName"], "Rossi");
Assert.AreEqual(targetAsDict["Age"], 50);
// check contact object
var contactObject = targetAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 4);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Region"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Province"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Italy");
Assert.AreEqual(contactObjectAsDict["Region"], "Lombardia");
Assert.AreEqual(contactObjectAsDict["Province"], "MI");
Assert.AreEqual(contactObjectAsDict["City"], "Milano");
}
[Test]
public void ShallowMergeWith_Keeps_Properties_Of_Target_Missing_In_Other_Unchanged()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("Age", 45);
otherAsDict.Add("Contact", contact);
otherAsDict.Add("Hobbies", new[] { "gaming", "football" });
// ACT
target.ShallowMergeWith(other);
// ASSERT
Assert.GreaterOrEqual(targetAsDict.Count, 2);
Assert.IsTrue(targetAsDict.ContainsKey("FirstName"));
Assert.IsTrue(targetAsDict.ContainsKey("LastName"));
Assert.AreEqual(targetAsDict["FirstName"], "Mario");
Assert.AreEqual(targetAsDict["LastName"], "Rossi");
}
[Test]
public void ShallowMergeWith_Adds_Properties_Of_Other_Missing_In_Target_To_Target()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("FirstName", "Mario");
otherAsDict.Add("LastName", "Rossi");
// ACT
target.ShallowMergeWith(other);
// ASSERT
Assert.GreaterOrEqual(targetAsDict.Count, 2);
Assert.IsTrue(targetAsDict.ContainsKey("FirstName"));
Assert.IsTrue(targetAsDict.ContainsKey("LastName"));
Assert.AreEqual(targetAsDict["FirstName"], "Mario");
Assert.AreEqual(targetAsDict["LastName"], "Rossi");
}
[Test]
public void ShallowMergeWith_Updates_Common_Properties_With_The_Value_Of_Other()
{
// ARRANGE
dynamic targetContact = new ExpandoObject();
targetContact.Country = "Italy";
targetContact.Region = "Lombardia";
targetContact.Province = "MI";
targetContact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", targetContact);
dynamic otherContact = new ExpandoObject();
otherContact.Country = "Spain";
otherContact.City = "Barcellona";
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("FirstName", "Giuseppe");
otherAsDict.Add("LastName", "Verdi");
otherAsDict.Add("Age", 25);
otherAsDict.Add("Contact", otherContact);
// ACT
target.ShallowMergeWith(other);
// ASSERT
Assert.AreEqual(targetAsDict.Count, 4);
Assert.IsTrue(targetAsDict.ContainsKey("FirstName"));
Assert.IsTrue(targetAsDict.ContainsKey("LastName"));
Assert.IsTrue(targetAsDict.ContainsKey("Age"));
Assert.IsTrue(targetAsDict.ContainsKey("Contact"));
Assert.AreEqual(targetAsDict["FirstName"], "Giuseppe");
Assert.AreEqual(targetAsDict["LastName"], "Verdi");
Assert.AreEqual(targetAsDict["Age"], 25);
// check contact object
var contactObject = targetAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 2);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Spain");
Assert.AreEqual(contactObjectAsDict["City"], "Barcellona");
}
[Test]
public void ShallowMergeWith_Does_Not_Change_Common_Properties_Having_Same_Value_In_Target_And_Other()
{
// ARRANGE
dynamic targetContact = new ExpandoObject();
targetContact.Country = "Italy";
targetContact.Region = "Lombardia";
targetContact.Province = "MI";
targetContact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", targetContact);
dynamic otherContact = new ExpandoObject();
otherContact.Country = "Spain";
otherContact.City = "Barcellona";
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("FirstName", "Mario");
otherAsDict.Add("LastName", "Rossi");
otherAsDict.Add("Age", 25);
otherAsDict.Add("Contact", otherContact);
// ACT
target.ShallowMergeWith(other);
// ASSERT
Assert.GreaterOrEqual(targetAsDict.Count, 2);
Assert.IsTrue(targetAsDict.ContainsKey("FirstName"));
Assert.IsTrue(targetAsDict.ContainsKey("LastName"));
Assert.AreEqual(targetAsDict["FirstName"], "Mario");
Assert.AreEqual(targetAsDict["LastName"], "Rossi");
}
[Test]
public void ShallowMergeWith_Executes_First_Level_Merge_Only()
{
// ARRANGE
var nestedObject1 = new ExpandoObject();
var nestedObject1AsDict = (IDictionary<string, object>)nestedObject1;
nestedObject1AsDict.Add("Key1", "Value1");
nestedObject1AsDict.Add("Key2", "Value2");
nestedObject1AsDict.Add("Key3", "Value3");
nestedObject1AsDict.Add("Key5", "Value5");
var nestedObject2 = new ExpandoObject();
var nestedObject2AsDict = (IDictionary<string, object>)nestedObject2;
nestedObject2AsDict.Add("Key2", "Value2");
nestedObject2AsDict.Add("Key3", "Changed");
nestedObject2AsDict.Add("Key4", "Value4");
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FooProperty", nestedObject1);
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("FooProperty", nestedObject2);
// ACT
target.ShallowMergeWith(other);
// ASSERT
Assert.AreEqual(targetAsDict.Count, 1);
Assert.IsTrue(targetAsDict.ContainsKey("FooProperty"));
// check nested object
var nestedObject = targetAsDict["FooProperty"];
Assert.IsNotNull(nestedObject);
Assert.IsInstanceOf<ExpandoObject>(nestedObject);
var nestedObjectAsDict = (IDictionary<string, object>)nestedObject;
Assert.AreEqual(3, nestedObjectAsDict.Count);
Assert.IsTrue(nestedObjectAsDict.ContainsKey("Key2"));
Assert.IsTrue(nestedObjectAsDict.ContainsKey("Key3"));
Assert.IsTrue(nestedObjectAsDict.ContainsKey("Key4"));
Assert.AreEqual(nestedObjectAsDict["Key2"], "Value2");
Assert.AreEqual(nestedObjectAsDict["Key3"], "Changed");
Assert.AreEqual(nestedObjectAsDict["Key4"], "Value4");
}
[Test]
public void ShallowMergeWith_Does_Not_Modify_Other_Parameter()
{
// ARRANGE
dynamic targetContact = new ExpandoObject();
targetContact.Country = "Italy";
targetContact.Region = "Lombardia";
targetContact.Province = "MI";
targetContact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", targetContact);
dynamic otherContact = new ExpandoObject();
otherContact.Country = "Spain";
otherContact.City = "Barcellona";
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict.Add("FirstName", "Giuseppe");
otherAsDict.Add("LastName", "Verdi");
otherAsDict.Add("Age", 25);
otherAsDict.Add("Contact", otherContact);
// ACT
target.ShallowMergeWith(other);
// ASSERT
Assert.AreEqual(otherAsDict.Count, 4);
Assert.IsTrue(otherAsDict.ContainsKey("FirstName"));
Assert.IsTrue(otherAsDict.ContainsKey("LastName"));
Assert.IsTrue(otherAsDict.ContainsKey("Age"));
Assert.IsTrue(otherAsDict.ContainsKey("Contact"));
Assert.AreEqual(otherAsDict["FirstName"], "Giuseppe");
Assert.AreEqual(otherAsDict["LastName"], "Verdi");
Assert.AreEqual(otherAsDict["Age"], 25);
// check contact object
var contactObject = otherAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 2);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Spain");
Assert.AreEqual(contactObjectAsDict["City"], "Barcellona");
}
[Test]
public void ShallowMergeWith_Is_Able_To_Shallow_Merge_Deeply_Nested_Objects()
{
// ARRANGE
dynamic targetSecondLevelObject = new ExpandoObject();
targetSecondLevelObject.TargetKey3 = "target value 3";
targetSecondLevelObject.TargetKey4 = "target value 4";
dynamic targetFirstLevelObject = new ExpandoObject();
targetFirstLevelObject.TargetKey1 = "target value 1";
targetFirstLevelObject.TargetKey2 = targetSecondLevelObject;
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict["Key"] = targetFirstLevelObject;
dynamic otherSecondLevelObject = new ExpandoObject();
otherSecondLevelObject.OtherKey3 = "other value 3";
otherSecondLevelObject.OtherKey4 = "other value 4";
dynamic otherFirstLevelObject = new ExpandoObject();
otherFirstLevelObject.OtherKey1 = "other value 1";
otherFirstLevelObject.OtherKey2 = otherSecondLevelObject;
var other = new ExpandoObject();
var otherAsDict = (IDictionary<string, object>)other;
otherAsDict["Key"] = otherFirstLevelObject;
// ACT
target.ShallowMergeWith(other);
// ASSERT
Assert.AreEqual(1, targetAsDict.Count);
Assert.IsTrue(targetAsDict.ContainsKey("Key"));
var firstLevel = targetAsDict["Key"];
Assert.IsNotNull(firstLevel);
Assert.IsInstanceOf<ExpandoObject>(firstLevel);
var firstLevelAsDict = (IDictionary<string, object>)firstLevel;
Assert.AreEqual(2, firstLevelAsDict.Count);
Assert.IsTrue(firstLevelAsDict.ContainsKey("OtherKey1"));
Assert.IsTrue(firstLevelAsDict.ContainsKey("OtherKey2"));
Assert.AreEqual(firstLevelAsDict["OtherKey1"], "other value 1");
var secondLevel = firstLevelAsDict["OtherKey2"];
Assert.IsNotNull(secondLevel);
Assert.IsInstanceOf<ExpandoObject>(secondLevel);
var secondLevelAsDict = (IDictionary<string, object>)secondLevel;
Assert.AreEqual(2, secondLevelAsDict.Count);
Assert.IsTrue(secondLevelAsDict.ContainsKey("OtherKey3"));
Assert.IsTrue(secondLevelAsDict.ContainsKey("OtherKey4"));
Assert.AreEqual(secondLevelAsDict["OtherKey3"], "other value 3");
Assert.AreEqual(secondLevelAsDict["OtherKey4"], "other value 4");
// check object references
Assert.AreSame(otherSecondLevelObject, secondLevel);
Assert.AreSame(otherFirstLevelObject, firstLevel);
}
}
}
<file_sep>/src/Deltatre.Utils/Dto/ValidationResult.cs
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
namespace Deltatre.Utils.Dto
{
/// <summary>
/// Represents the result of a validation process.
/// </summary>
/// <typeparam name="TError">
/// The type of the validation error.
/// </typeparam>
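/// <example>
/// A minimal usage sketch (the error type and the error message are placeholders):
/// <code>
/// var valid = ValidationResult<string>.Valid;
/// var invalid = ValidationResult<string>.Invalid(new[] { "name is required" });
/// // valid.IsValid is true; invalid.IsValid is false and invalid.Errors contains one item
/// </code>
/// </example>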
public sealed class ValidationResult<TError>
{
/// <summary>
/// Represents the validation result obtained when validating a valid item.
/// This field is read-only.
/// </summary>
/// <remarks>
/// A valid item is an item for which the validation process has detected no
/// validation errors.
/// </remarks>
public static readonly ValidationResult<TError> Valid = new ValidationResult<TError>(ImmutableArray<TError>.Empty);
/// <summary>
/// Creates an instance of <see cref="ValidationResult{TError}"/> representing the
/// validation result obtained when validating an invalid item.
/// </summary>
/// <param name="errors">
/// The collection of detected validation errors.
/// This parameter cannot be <see langword="null"/>.
/// The provided collection of validation errors must contain at least one item.
/// </param>
/// <returns>
/// An instance of <see cref="ValidationResult{TError}"/> representing the
/// validation result obtained when validating an invalid item.
/// </returns>
/// <remarks>
/// An invalid item is an item for which the validation process has detected at least
/// one validation error.
/// </remarks>
/// <exception cref="ArgumentNullException">
/// When <paramref name="errors"/> is <see langword="null"/>.
/// </exception>
/// <exception cref="ArgumentException">
/// When <paramref name="errors"/> is an empty sequence.
/// </exception>
public static ValidationResult<TError> Invalid(
IEnumerable<TError> errors)
{
if (errors == null)
throw new ArgumentNullException(nameof(errors));
var errorsArray = errors.ToImmutableArray();
if (errorsArray.IsEmpty)
{
throw new ArgumentException(
"You must provide at least one validation error.",
nameof(errors)
);
}
return new ValidationResult<TError>(errorsArray);
}
/// <summary>
/// Gets a flag indicating whether the validated item is a valid one.
/// </summary>
public bool IsValid => this.Errors.IsEmpty;
/// <summary>
/// Gets the collection of the detected validation errors.
/// </summary>
/// <remarks>
/// An empty collection of validation errors means that the validated item
/// is a valid one.
/// </remarks>
public ImmutableArray<TError> Errors { get; }
private ValidationResult(ImmutableArray<TError> errors)
{
this.Errors = errors;
}
}
}
<file_sep>/src/Deltatre.Utils/Types/Interval.cs
using System;
namespace Deltatre.Utils.Types
{
/// <summary>
/// This class represents an interval of values.
/// </summary>
/// <typeparam name="T">Type of values in the range. They must be struct and implement <see cref="IComparable{T}"/> and <see cref="IEquatable{T}"/></typeparam>
public sealed class Interval<T> : IEquatable<Interval<T>>
where T : struct, IComparable<T>, IEquatable<T>
{
/// <summary>
/// Represents an empty interval
/// </summary>
public static readonly Interval<T> Empty = new Interval<T>(Endpoint<T>.Exclusive(default(T)), Endpoint<T>.Exclusive(default(T)));
/// <summary>
/// Left endpoint of the interval
/// </summary>
public Endpoint<T> From { get; }
/// <summary>
/// Right endpoint of the interval
/// </summary>
public Endpoint<T> To { get; }
/// <summary>
/// Constructs an object of type <see cref="Interval{T}"/>
/// <paramref name="from"/> must be lesser than or equal <paramref name="to"/>
/// </summary>
/// <param name="from">Left endpoint of the interval</param>
/// <param name="to">Right endpoint of the interval</param>
/// <exception cref="ArgumentException">Thrown when <paramref name="from"/> is greater than <paramref name="to"/></exception>
public Interval(Endpoint<T> from, Endpoint<T> to)
{
if (from > to)
{
throw new ArgumentException($"{nameof(from)} cannot be greater than {nameof(to)}");
}
From = from;
To = to;
}
/// <summary>
/// Prints a representation of <see cref="Interval{T}"/>
/// </summary>
/// <returns>The string representation of the interval</returns>
public override string ToString()
{
var fromEndpointSymbol = From.Status == EndPointStatus.Inclusive ? '[' : '(';
var toEndpointSymbol = To.Status == EndPointStatus.Inclusive ? ']' : ')';
return $"{fromEndpointSymbol}{From.Value}, {To.Value}{toEndpointSymbol}";
}
#region Equality
/// <summary>
/// Determines whether two instances of <see cref="Interval{T}"/> are equal.
/// Two <see cref="Interval{T}"/> are equal if both <see cref="From"/> and <see cref="To"/> endpoints are equal
/// </summary>
/// <param name="other">An object to compare the current instance with</param>
/// <returns>Boolean indicating whether the two instances are considered equals</returns>
public bool Equals(Interval<T> other)
{
if (other == null)
{
return false;
}
return From == other.From && To == other.To;
}
/// <summary>
/// Determines whether two instances of <see cref="Interval{T}"/> are equal.
/// Two <see cref="Interval{T}"/> are equal if both <see cref="From"/> and <see cref="To"/> endpoints are equal
/// </summary>
/// <param name="obj">An object to compare the current instance with</param>
/// <returns>Boolean indicating whether the two instances are considered equals</returns>
public override bool Equals(object obj)
{
return obj is Interval<T> interval && Equals(interval);
}
/// <summary>
/// Compares two <see cref='Interval{T}'/> objects. The result specifies whether they are equal.
/// </summary>
/// <returns>Boolean indicating whether the two instances are considered equal</returns>
public static bool operator ==(Interval<T> x, Interval<T> y)
{
if (ReferenceEquals(x, y))
return true;
if (ReferenceEquals(x, null))
return false;
if (ReferenceEquals(y, null))
return false;
return x.Equals(y);
}
/// <summary>
/// Compares two <see cref='Interval{T}'/> objects. The result specifies whether they are not equal.
/// </summary>
/// <returns>Boolean indicating whether the two instances are considered not equal</returns>
public static bool operator !=(Interval<T> x, Interval<T> y)
{
return !(x == y);
}
/// <summary>
/// Generates a hash code for an instance of <see cref="Interval{T}"/>.
/// </summary>
/// <returns>Returns the generated hash code</returns>
public override int GetHashCode()
{
return From.GetHashCode() ^ To.GetHashCode();
}
#endregion
}
}
<file_sep>/src/Deltatre.Utils/Extensions/Dictionary/DictionaryExtensions.cs
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Threading;
namespace Deltatre.Utils.Extensions.Dictionary
{
/// <summary>
/// Extension methods for dictionaries
/// </summary>
public static class DictionaryExtensions
{
///<summary>Returns a read-only version of the <paramref name="source"/> dictionary</summary>
///<param name="source">A dictionary object</param>
///<returns cref="System.Collections.ObjectModel.ReadOnlyDictionary{TKey, TValue}">
///A read-only version of <paramref name="source"/>
///</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when parameter <paramref name="source"/> is null</exception>
public static ReadOnlyDictionary<TKey, TValue> AsReadOnly<TKey, TValue>(this IDictionary<TKey, TValue> source)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
return new ReadOnlyDictionary<TKey, TValue>(source);
}
/// <summary>Call this method if you want to convert a dictionary value to the type <see cref="string"/> in a safe way. The default value of type <see cref="string"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <see cref="string"/>. The current thread's culture will be used for the culture-specific aspects of the type conversion.</summary>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <see cref="string"/></param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <see cref="string"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <see cref="string"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="source"/> is null</exception>
public static string GetStringOrDefault(this IDictionary<string, object> source, string key) =>
GetValueOrDefault<string>(
source,
key);
/// <summary>Call this method if you want to convert a dictionary value to the type <see cref="string"/> in a safe way. The default value of type <see cref="string"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <see cref="string"/>.</summary>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <see cref="string"/></param>
/// <param name="provider">An object that supplies culture-specific formatting information</param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <see cref="string"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <see cref="string"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="provider"/> or <paramref name="source"/> is null</exception>
public static string GetStringOrDefault(this IDictionary<string, object> source, string key, IFormatProvider provider) =>
GetValueOrDefault<string>(
source,
key,
provider);
/// <summary>Call this method if you want to convert a dictionary value to the type <see cref="bool"/> in a safe way. The default value of type <see cref="bool"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <see cref="bool"/>. The current thread's culture will be used for the culture-specific aspects of the type conversion.</summary>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <see cref="bool"/></param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <see cref="bool"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <see cref="bool"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="source"/> is null</exception>
public static bool GetBoolOrDefault(this IDictionary<string, object> source, string key) =>
GetValueOrDefault<bool>(
source,
key,
Thread.CurrentThread.CurrentCulture);
/// <summary>Call this method if you want to convert a dictionary value to the type <see cref="bool"/> in a safe way. The default value of type <see cref="bool"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <see cref="bool"/>.</summary>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <see cref="bool"/></param>
/// <param name="provider">An object that supplies culture-specific formatting information</param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <see cref="bool"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <see cref="bool"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="provider"/> or <paramref name="source"/> is null</exception>
public static bool GetBoolOrDefault(this IDictionary<string, object> source, string key, IFormatProvider provider) =>
GetValueOrDefault<bool>(
source,
key,
provider);
/// <summary>Call this method if you want to convert a dictionary value to the type <see cref="int"/> in a safe way. The default value of type <see cref="int"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <see cref="int"/>. The current thread's culture will be used for the culture-specific aspects of the type conversion.</summary>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <see cref="int"/></param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <see cref="int"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <see cref="int"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="source"/> is null</exception>
public static int GetIntOrDefault(this IDictionary<string, object> source, string key) =>
GetValueOrDefault<int>(
source,
key,
Thread.CurrentThread.CurrentCulture);
/// <summary>Call this method if you want to convert a dictionary value to the type <see cref="int"/> in a safe way. The default value of type <see cref="int"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <see cref="int"/>.</summary>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <see cref="int"/></param>
/// <param name="provider">An object that supplies culture-specific formatting information</param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <see cref="int"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <see cref="int"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="provider"/> or <paramref name="source"/> is null</exception>
public static int GetIntOrDefault(this IDictionary<string, object> source, string key, IFormatProvider provider) =>
GetValueOrDefault<int>(
source,
key,
provider);
/// <summary>Call this method if you want to convert a dictionary value to the type <typeparamref name="T"/> in a safe way. The default value of type <typeparamref name="T"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <typeparamref name="T"/>. The current thread's culture will be used for the culture-specific aspects of the type conversion.</summary>
/// <typeparam name="T">The desired type of the returning value</typeparam>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <typeparamref name="T"/></param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <typeparamref name="T"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <typeparamref name="T"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="source"/> is null</exception>
public static T GetValueOrDefault<T>(this IDictionary<string, object> source, string key) =>
GetValueOrDefault<T>(
source,
key,
Thread.CurrentThread.CurrentCulture);
/// <summary>Call this method if you want to convert a dictionary value to the type <typeparamref name="T"/> in a safe way. The default value of type <typeparamref name="T"/> will be returned if either the source dictionary does not contain the provided key or the value cannot be converted to type <typeparamref name="T"/></summary>
/// <typeparam name="T">The desired type of the returning value</typeparam>
/// <param name="source">The source dictionary</param>
/// <param name="key">The dictionary key for which you want the value converted to type <typeparamref name="T"/></param>
/// <param name="provider">An object that supplies culture-specific formatting information</param>
/// <returns>The value of source dictionary corresponding to the provided key converted to type <typeparamref name="T"/>. If the conversion cannot be performed or the source dictionary doesn't contain the provided key then the default value of type <typeparamref name="T"/> is returned</returns>
/// <exception cref="ArgumentNullException"> Throws <see cref="ArgumentNullException"/> when either parameter <paramref name="key"/> or <paramref name="provider"/> or <paramref name="source"/> is null</exception>
public static T GetValueOrDefault<T>(
this IDictionary<string, object> source,
string key,
IFormatProvider provider)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
if (key == null)
throw new ArgumentNullException(nameof(key));
if (provider == null)
throw new ArgumentNullException(nameof(provider));
var sourceContainsKey = source.TryGetValue(key, out var value);
if(!sourceContainsKey)
return default(T);
try
{
return (T)Convert.ChangeType(value, typeof(T), provider);
}
catch
{
return default(T);
}
}
}
}<file_sep>/tests/Deltatre.Utils.Tests/Resilience/WaitAndRetryWithRandomSleepDurationsConfigurationTest.cs
using Deltatre.Utils.Resilience;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Resilience
{
[TestFixture]
public sealed class WaitAndRetryWithRandomSleepDurationsConfigurationTest
{
[Test]
public void Ctor_Creates_New_Instance()
{
// ARRANGE
const int maxRetryCount = 10;
const int maxSleepDurationMs = 5_000;
// ACT
var result = new WaitAndRetryWithRandomSleepDurationsConfiguration(
maxRetryCount,
maxSleepDurationMs);
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(maxRetryCount, result.MaxRetryCount);
Assert.AreEqual(maxSleepDurationMs, result.MaxSleepDurationInMilliseconds);
}
[Test]
public void Ctor_Allows_To_Create_Configuration_With_MaxRetryCount_Set_To_Zero()
{
// ARRANGE
const int maxRetryCount = 0;
const int maxSleepDurationMs = 5_000;
// ACT
Assert.DoesNotThrow(
() =>
new WaitAndRetryWithRandomSleepDurationsConfiguration(
maxRetryCount,
maxSleepDurationMs)
);
}
[Test]
public void Ctor_Throws_ArgumentOutOfRangeException_When_MaxRetryCount_Is_Less_Than_Zero()
{
// ARRANGE
const int maxRetryCount = -1;
const int maxSleepDurationMs = 5_000;
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => new WaitAndRetryWithRandomSleepDurationsConfiguration(
maxRetryCount,
maxSleepDurationMs)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("maxRetryCount", exception.ParamName);
}
[Test]
public void Ctor_Throws_ArgumentOutOfRangeException_When_MaxSleepDurationInMilliseconds_Is_Less_Than_Zero()
{
// ARRANGE
const int maxRetryCount = 3;
const int maxSleepDurationMs = -1;
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => new WaitAndRetryWithRandomSleepDurationsConfiguration(
maxRetryCount,
maxSleepDurationMs)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("maxSleepDurationInMilliseconds", exception.ParamName);
}
[Test]
public void Ctor_Throws_ArgumentOutOfRangeException_When_MaxSleepDurationInMilliseconds_Equals_Zero()
{
// ARRANGE
const int maxRetryCount = 3;
const int maxSleepDurationMs = 0;
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => new WaitAndRetryWithRandomSleepDurationsConfiguration(
maxRetryCount,
maxSleepDurationMs)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("maxSleepDurationInMilliseconds", exception.ParamName);
}
}
}
<file_sep>/src/Deltatre.Utils/Azure/OsNotSupportedOnAzureAppServicesException.cs
using System;
using System.Runtime.Serialization;
using System.Security.Permissions;
namespace Deltatre.Utils.Azure
{
/// <summary>
/// The exception which is thrown when the hosting environment OS is not supported on Azure app services.
/// This exception is thrown when the calling code performs an operation related to Azure app services, but the hosting environment
/// is running an OS which is not supported by Azure app services.
/// </summary>
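/// <example>
/// A minimal sketch of how calling code might raise this exception (the message and the OS description are placeholders):
/// <code>
/// throw new OsNotSupportedOnAzureAppServicesException("The current OS is not supported on Azure app services.")
/// {
///   OperatingSystem = Environment.OSVersion.ToString()
/// };
/// </code>
/// </example>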
[Serializable]
public class OsNotSupportedOnAzureAppServicesException : Exception
{
/// <summary>
/// Gets or sets the description of the OS executed by the hosting environment.
/// </summary>
public string OperatingSystem { get; set; }
/// <summary>
/// Initializes a new instance of <see cref="OsNotSupportedOnAzureAppServicesException"/> class.
/// </summary>
public OsNotSupportedOnAzureAppServicesException()
{
}
/// <summary>
/// Initializes a new instance of the <see cref="OsNotSupportedOnAzureAppServicesException"></see> class
/// with a specified error message.
/// </summary>
/// <param name="message">
/// The message that describes the error.
/// </param>
public OsNotSupportedOnAzureAppServicesException(
string message) : base(message)
{
}
/// <summary>
/// Initializes a new instance of the <see cref="OsNotSupportedOnAzureAppServicesException"></see> class
/// with a specified error message and a reference to the inner exception that is the cause of this exception.
/// </summary>
/// <param name="message">
/// The error message that explains the reason for the exception.
/// </param>
/// <param name="innerException">
/// The exception that is the cause of the current exception,
/// or a null reference (Nothing in Visual Basic) if no inner exception is specified.
/// </param>
public OsNotSupportedOnAzureAppServicesException(
string message,
Exception innerException) : base(message, innerException)
{
}
/// <summary>
/// Initializes a new instance of the <see cref="OsNotSupportedOnAzureAppServicesException"></see> class with serialized data.
/// </summary>
/// <param name="info">
/// The <see cref="SerializationInfo"></see> that holds the serialized object data about the exception being thrown.
/// </param>
/// <param name="context">
/// The <see cref="StreamingContext"></see> that contains contextual information about the source or destination.
/// </param>
/// <exception cref="ArgumentNullException">
/// The <paramref name="info">info</paramref> parameter is null.
/// </exception>
/// <exception cref="SerializationException">
/// The class name is null or <see cref="Exception.HResult"></see> is zero (0).
/// </exception>
protected OsNotSupportedOnAzureAppServicesException(
SerializationInfo info,
StreamingContext context) : base(info, context)
{
if (info == null)
throw new ArgumentNullException(nameof(info));
this.OperatingSystem = info.GetString(nameof(this.OperatingSystem));
}
/// <summary>
/// When overridden in a derived class, sets the <see cref="SerializationInfo"></see>
/// with information about the exception.
/// </summary>
/// <param name="info">
/// The <see cref="SerializationInfo"></see> that holds the serialized
/// object data about the exception being thrown.
/// </param>
/// <param name="context">
/// The <see cref="StreamingContext"></see> that contains contextual information about
/// the source or destination.
/// </param>
/// <exception cref="ArgumentNullException">
/// The <paramref name="info">info</paramref> parameter is a null reference
/// (Nothing in Visual Basic).
/// </exception>
[SecurityPermission(SecurityAction.Demand, SerializationFormatter = true)]
public override void GetObjectData(SerializationInfo info, StreamingContext context)
{
if (info == null)
throw new ArgumentNullException(nameof(info));
info.AddValue(nameof(this.OperatingSystem), this.OperatingSystem);
base.GetObjectData(info, context);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/TimerAsyncTest_Start.cs
using Deltatre.Utils.Extensions.Enumerable;
using Deltatre.Utils.Concurrency;
using NUnit.Framework;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
namespace Deltatre.Utils.Tests.Concurrency
{
public partial class TimerAsyncTest
{
[Test]
public async Task Start_Schedules_Execution_Of_Background_Workload()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = _ =>
{
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ACT
target.Start();
await Task.Delay(1200).ConfigureAwait(false); // wait for the execution of background workload at least two times
// ASSERT
        Assert.GreaterOrEqual(values.Count, 2);
Assert.IsTrue(values.All(value => value == 1));
}
}
[Test]
public void Start_Can_Be_Called_More_Than_Once_Without_Throwing_Exceptions()
{
// ARRANGE
using (var target = new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ASSERT
Assert.DoesNotThrow(() =>
{
target.Start();
target.Start();
});
}
}
[Test]
public async Task Start_Can_Be_Called_More_Than_Once_Without_Affecting_The_Execution_Of_Background_Workload()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = _ =>
{
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ACT
target.Start();
target.Start();
await Task.Delay(1200).ConfigureAwait(false);
// ASSERT
        Assert.GreaterOrEqual(values.Count, 2);
Assert.IsTrue(values.All(value => value == 1));
}
}
[Test]
public async Task Start_Is_Thread_Safe()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = _ =>
{
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
bool run = true;
var thread1 = new Thread(() =>
{
while (run)
{
target.Start();
}
});
var thread2 = new Thread(() =>
{
while (run)
{
target.Start();
}
});
var threads = new[] { thread1, thread2 };
// ACT
threads.ForEach(t => t.Start());
await Task.Delay(1200).ConfigureAwait(false);
run = false;
threads.ForEach(t => t.Join());
// ASSERT
        Assert.GreaterOrEqual(values.Count, 2);
Assert.IsTrue(values.All(value => value == 1));
}
}
[Test]
public void Start_Throws_ObjectDisposedException_When_Called_On_A_Disposed_Instance()
{
// ARRANGE
var timer = new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500));
// ACT
timer.Dispose();
Assert.Throws<ObjectDisposedException>(timer.Start);
}
[Test]
    public async Task Start_Executes_Background_Workload_Once_When_Period_Equals_Infinite_Timespan()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = _ =>
{
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
Timeout.InfiniteTimeSpan))
{
// ACT
target.Start();
await Task.Delay(2000).ConfigureAwait(false);
// ASSERT
CollectionAssert.AreEqual(new[] { 1 }, values);
}
}
[Test]
    public async Task Start_Never_Executes_Background_Workload_When_DueTime_Equals_Infinite_Timespan()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = _ =>
{
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
Timeout.InfiniteTimeSpan,
TimeSpan.FromMilliseconds(500)))
{
// ACT
target.Start();
await Task.Delay(2000).ConfigureAwait(false);
// ASSERT
CollectionAssert.IsEmpty(values);
}
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Functional/IdentityTest.cs
using NUnit.Framework;
using static Deltatre.Utils.Functional.Functions;
namespace Deltatre.Utils.Tests.Functional
{
[TestFixture]
public class IdentityTest
{
[Test]
public void Identity_Returns_Item_Passed_In()
{
// ARRANGE
const string item = "hello world";
// ACT
var result = Identity(item);
// ASSERT
Assert.AreEqual("hello world", result);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Types/NonEmptySequenceTest.cs
using System;
using System.Linq;
using Deltatre.Utils.Types;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Types
{
[TestFixture]
public class NonEmptySequenceTest
{
[Test]
public void Ctor_Throws_When_Items_Is_Null()
{
// ACT
var exception = Assert.Throws<ArgumentNullException>(() => new NonEmptySequence<string>(null));
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("items", exception.ParamName);
}
[Test]
public void Ctor_Throws_When_Items_Is_Empty_Sequence()
{
// ACT
var exception = Assert.Throws<ArgumentException>(() => new NonEmptySequence<string>(Enumerable.Empty<string>()));
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("items", exception.ParamName);
}
[Test]
public void Ctor_Creates_An_Enumerable_Instance()
{
// ARRANGE
var items = new[] {"foo", "bar"};
// ACT
var result = new NonEmptySequence<string>(items);
// ASSERT
Assert.IsNotNull(result);
CollectionAssert.AreEqual(new[] { "foo", "bar" }, result);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Dto/OperationResultTest_WithErrors.cs
using System;
using Deltatre.Utils.Dto;
using NUnit.Framework;
using Deltatre.Utils.Extensions.Enumerable;
using Deltatre.Utils.Types;
namespace Deltatre.Utils.Tests.Dto
{
[TestFixture]
public class OperationResultWithErrorsTest
{
[Test]
public void CreateFailure_Throws_When_Errors_Is_Null()
{
// ARRANGE
NonEmptySequence<string> nullSequence = null;
// ACT
var exception = Assert.Throws<ArgumentNullException>(() =>
OperationResult<string, string>.CreateFailure(nullSequence)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("errors", exception.ParamName);
}
[Test]
public void CreateFailure_Allows_To_Create_Failure_Result_Providing_A_NonEmptySequence_Of_One_Error()
{
// ARRANGE
var errors = new[] { "something bad occurred" }.ToNonEmptySequence();
// ACT
var result = OperationResult<string, string>.CreateFailure(errors);
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsSuccess);
Assert.IsNotNull(result.Errors);
CollectionAssert.AreEqual(errors, result.Errors);
}
[Test]
public void CreateFailure_Allows_To_Create_Failure_Result_Providing_A_NonEmptySequence_Of_Two_Errors()
{
// ARRANGE
var errors = new[] { "something bad occurred", "invalid prefix detected" }.ToNonEmptySequence();
// ACT
var result = OperationResult<string, string>.CreateFailure(errors);
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsSuccess);
Assert.IsNotNull(result.Errors);
CollectionAssert.AreEqual(errors, result.Errors);
}
[Test]
public void CreateFailure_Allows_To_Create_Failure_Result_Providing_A_Single_Error_Object()
{
// ARRANGE
const string error = "something bad occurred";
// ACT
var result = OperationResult<string, string>.CreateFailure(error);
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsSuccess);
Assert.IsNotNull(result.Errors);
CollectionAssert.AreEqual(new[] { error }, result.Errors);
}
[Test]
public void CreateSuccess_Creates_Success_Result()
{
// ACT
var result = OperationResult<string, string>.CreateSuccess("my normalized value");
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual("my normalized value", result.Output);
Assert.IsNotNull(result.Errors);
Assert.IsEmpty(result.Errors);
}
[Test]
public void Implicit_Conversion_From_TOutput_To_OperationResultOfTOutputTError_Is_Available()
{
// ACT
OperationResult<string, string> result = "Hello !";
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual("Hello !", result.Output);
Assert.IsNotNull(result.Errors);
Assert.IsEmpty(result.Errors);
}
[Test]
public void Output_Throws_InvalidOperationException_When_Operation_Is_Failed()
{
// ARRANGE
var target = OperationResult<string, string>.CreateFailure("error message");
// ACT
var exception = Assert.Throws<InvalidOperationException>(() => _ = target.Output);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("Reading the operation output is not allowed because the operation is failed.", exception.Message);
}
[Test]
public void Output_Does_Not_Throw_When_Operation_Is_Successful()
{
// ARRANGE
var target = OperationResult<string, string>.CreateSuccess("the message");
// ACT
Assert.DoesNotThrow(() => _ = target.Output);
}
[Test]
public void Output_Returns_The_Output_Of_Successful_Operation()
{
// ARRANGE
var target = OperationResult<string, string>.CreateSuccess("the message");
// ACT
var result = target.Output;
// ASSERT
Assert.AreEqual("the message", result);
}
[Test]
public void Error_Throws_InvalidOperationException_When_Operation_Is_Successful()
{
// ARRANGE
var target = OperationResult<string, string>.CreateSuccess("the message");
// ACT
var exception = Assert.Throws<InvalidOperationException>(
() => _ = target.Error
);
// ASSERT
Assert.IsNotNull(exception);
Assert.IsFalse(string.IsNullOrWhiteSpace(exception.Message));
}
[Test]
public void Error_Does_Not_Throw_When_Operation_Is_Failed_With_One_Error()
{
// ARRANGE
var target = OperationResult<string, string>.CreateFailure("something bad happened!");
// ACT
Assert.DoesNotThrow(() => _ = target.Error);
}
[Test]
public void Error_Does_Not_Throw_When_Operation_Is_Failed_With_Several_Errors()
{
// ARRANGE
var errors = new NonEmptySequence<string>(new[] { "something bad happened!", "kaboom!" });
var target = OperationResult<string, string>.CreateFailure(errors);
// ACT
Assert.DoesNotThrow(() => _ = target.Error);
}
[Test]
public void Error_Returns_The_Error_When_Operation_Is_Failed_With_One_Error()
{
// ARRANGE
var target = OperationResult<string, string>.CreateFailure("something bad happened!");
// ACT
var result = target.Error;
// ASSERT
Assert.AreEqual("something bad happened!", result);
}
[Test]
public void Error_Returns_First_Error_When_Operation_Is_Failed_With_Several_Errors()
{
// ARRANGE
var errors = new NonEmptySequence<string>(new[] { "something bad happened!", "kaboom!" });
var target = OperationResult<string, string>.CreateFailure(errors);
// ACT
var result = target.Error;
// ASSERT
Assert.AreEqual("something bad happened!", result);
}
}
}
<file_sep>/src/Deltatre.Utils/Dto/OperationResult.cs
using System;
namespace Deltatre.Utils.Dto
{
/// <summary>
/// Represents the result obtained when performing an operation
/// </summary>
/// <typeparam name="TOutput">The type of the operation output</typeparam>
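  /// <example>
  /// A minimal usage sketch added for illustration; the values below are arbitrary and not taken from the library:
  /// <code>
  /// var parsed = OperationResult&lt;int&gt;.CreateSuccess(42);
  /// if (parsed.IsSuccess)
  /// {
  ///   Console.WriteLine(parsed.Output); // prints 42
  /// }
  ///
  /// OperationResult&lt;int&gt; viaImplicitConversion = 42; // equivalent to CreateSuccess(42)
  ///
  /// var failed = OperationResult&lt;int&gt;.Failure;
  /// // Accessing failed.Output would throw InvalidOperationException.
  /// </code>
  /// </example>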
public sealed class OperationResult<TOutput>
{
/// <summary>
    /// The result of a failed operation. Accessing property <see cref="Output"/> on this instance throws <see cref="InvalidOperationException"/>.
/// </summary>
public static readonly OperationResult<TOutput> Failure = new OperationResult<TOutput>(false, default);
/// <summary>
/// Call this method to create an instance representing the result of a successful operation.
/// </summary>
/// <param name="output">The operation output.</param>
/// <returns>An instance representing the result of a successful operation.</returns>
public static OperationResult<TOutput> CreateSuccess(TOutput output) =>
new OperationResult<TOutput>(true, output);
private readonly TOutput _output;
private OperationResult(bool isSuccess, TOutput output)
{
IsSuccess = isSuccess;
_output = output;
}
/// <summary>
/// Gets a flag indicating whether the operation completed successfully.
/// </summary>
public bool IsSuccess { get; }
/// <summary>
/// Gets the result produced from the operation.
/// Accessing this property throws <see cref="System.InvalidOperationException"/> when property <see cref="IsSuccess"/> is <see langword="false"/>.
/// </summary>
/// <exception cref="System.InvalidOperationException">
/// Throws <see cref="System.InvalidOperationException"/> when property <see cref="IsSuccess"/> is <see langword="false"/>.
/// </exception>
public TOutput Output
{
get
{
if (!this.IsSuccess)
{
throw new InvalidOperationException(
"Reading the operation output is not allowed because the operation is failed.");
}
return _output;
}
}
/// <summary>
/// Implicit type conversion from <typeparamref name="TOutput"/> to <see cref="OperationResult{TOutput}" />
/// </summary>
    /// <param name="value">An instance of type <typeparamref name="TOutput"/> to be converted to <see cref="OperationResult{TOutput}"/></param>
public static implicit operator OperationResult<TOutput>(TOutput value)
{
return CreateSuccess(value);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Text/TextAnalyzerTest.cs
using Deltatre.Utils.Text;
using NUnit.Framework;
using System;
using System.Collections.Generic;
using System.Text;
namespace Deltatre.Utils.Tests.Text
{
[TestFixture]
public class TextAnalyzerTest
{
[Test]
public void Should_be_possible_to_Count_Words_for_Afrikaans_Text()
{
var target = CreateTarget();
var text =
"Ons sportliefhebbers eis ongeëwenaarde toegang enige tyd, enige plek en op 'n aantal toestelle. Om saam met die span te werk, gee ons die kundigheid en bewese tegnologie wat ons nodig het om so 'n ambisieuse nuwe diens bekend te stel.";
var result = target.CountWords(text, "af-ZA");
Assert.AreEqual(41, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Afrikaans_Text()
{
var target = CreateTarget();
var text =
"Ons sportliefhebbers eis ongeëwenaarde toegang enige tyd, enige plek en op 'n aantal toestelle. Om saam met die span te werk, gee ons die kundigheid en bewese tegnologie wat ons nodig het om so 'n ambisieuse nuwe diens bekend te stel.";
var result = target.CountCharacters(text, "af-ZA");
Assert.AreEqual(234, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Albanian_Text()
{
var target = CreateTarget();
var text =
"Tifozët tanë të sportit kërkojnë qasje të pakrahasueshme në çdo kohë, kudo dhe në një numër pajisjesh. Puna me ekipin na jep ekspertizë dhe teknologji të dëshmuar që na duhet për të filluar një shërbim të ri kaq ambicioz.";
var result = target.CountWords(text, "sq-AL");
Assert.AreEqual(39, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Albanian_Text()
{
var target = CreateTarget();
var text =
"Tifozët tanë të sportit kërkojnë qasje të pakrahasueshme në çdo kohë, kudo dhe në një numër pajisjesh. Puna me ekipin na jep ekspertizë dhe teknologji të dëshmuar që na duhet për të filluar një shërbim të ri kaq ambicioz.";
var result = target.CountCharacters(text, "sq-AL");
Assert.AreEqual(221, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Amharic_Text()
{
var target = CreateTarget();
var text = "የእኛ የስፖርት አድናቂዎች በማንኛውም ሰዓት ፣ በማንኛውም ቦታ እና በበርካታ መሳሪያዎች ላይ ደህንነቱ ያልተጠበቀ መዳረሻ ይጠይቃሉ። ከቡድኑ ጋር አብሮ መሥራት እንደዚህ ዓይነቱን ምኞት አዲስ አገልግሎት ለመጀመር የሚያስፈልገንን ሙያዊ እና የተረጋገጠ ቴክኖሎጂ ይሰጠናል።";
var result = target.CountWords(text, "am-ET");
Assert.AreEqual(32, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Amharic_Text()
{
var target = CreateTarget();
var text = "የእኛ የስፖርት አድናቂዎች በማንኛውም ሰዓት ፣ በማንኛውም ቦታ እና በበርካታ መሳሪያዎች ላይ ደህንነቱ ያልተጠበቀ መዳረሻ ይጠይቃሉ። ከቡድኑ ጋር አብሮ መሥራት እንደዚህ ዓይነቱን ምኞት አዲስ አገልግሎት ለመጀመር የሚያስፈልገንን ሙያዊ እና የተረጋገጠ ቴክኖሎጂ ይሰጠናል።";
var result = target.CountCharacters(text, "am-ET");
Assert.AreEqual(170, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Armenian_Text()
{
var target = CreateTarget();
var text = "Մեր մարզասերները պահանջում են անզուգական մուտք ցանկացած պահի, ցանկացած վայրում և մի շարք սարքերով: Թիմի հետ աշխատելը մեզ տալիս է փորձ և ապացուցված տեխնոլոգիա, որը մեզ անհրաժեշտ է նման հավակնոտ նոր ծառայություն սկսելու համար:";
var result = target.CountWords(text, "hy-AM");
Assert.AreEqual(34, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Armenian_Text()
{
var target = CreateTarget();
var text = "Մեր մարզասերները պահանջում են անզուգական մուտք ցանկացած պահի, ցանկացած վայրում և մի շարք սարքերով: Թիմի հետ աշխատելը մեզ տալիս է փորձ և ապացուցված տեխնոլոգիա, որը մեզ անհրաժեշտ է նման հավակնոտ նոր ծառայություն սկսելու համար:";
var result = target.CountCharacters(text, "hy-AM");
Assert.AreEqual(224, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Azeri_Text()
{
var target = CreateTarget();
var text = "İdman həvəskarlarımız hər yerdə, hər yerdə və bir sıra cihazlarda rakursuz çıxış tələb edirlər. Komanda ilə işləmək, belə bir iddialı yeni bir xidmətə başlamağımız üçün lazım olan təcrübə və sübut edilmiş texnologiya verir.";
var result = target.CountWords(text, "az-Cyrl-AZ");
Assert.AreEqual(33, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Azeri_Text()
{
var target = CreateTarget();
var text = "İdman həvəskarlarımız hər yerdə, hər yerdə və bir sıra cihazlarda rakursuz çıxış tələb edirlər. Komanda ilə işləmək, belə bir iddialı yeni bir xidmətə başlamağımız üçün lazım olan təcrübə və sübut edilmiş texnologiya verir.";
var result = target.CountCharacters(text, "az-Cyrl-AZ");
Assert.AreEqual(223, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_American_Text()
{
var target = CreateTarget();
var text =
"Forty-five seasons into his coaching career, <NAME> appears to be having more fun than ever. He's taken charge of a deep, malleable, veteran defense built on the same 3-4 principles that he first taught the New York Giants as a coordinator in the 1980s, complete with veteran linebackers and a diverse secondary. The 65-year-old's top two defensive assistants are his son, Steve, and one of his favorite former players, <NAME>, both of whom are roughly half his age. In <NAME>'s world, there's nothing more fun than beating the organization that once fired you (even if, you know, technically that was a different team) in a driving rainstorm under grey skies. The uglier, the better.";
var result = target.CountWords(text, "en-US");
Assert.AreEqual(117, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_American_Text()
{
var target = CreateTarget();
var text =
"Forty-five seasons into his coaching career, <NAME> appears to be having more fun than ever. He's taken charge of a deep, malleable, veteran defense built on the same 3-4 principles that he first taught the New York Giants as a coordinator in the 1980s, complete with veteran linebackers and a diverse secondary. The 65-year-old's top two defensive assistants are his son, Steve, and one of his favorite former players, <NAME>, both of whom are roughly half his age. In <NAME>'s world, there's nothing more fun than beating the organization that once fired you (even if, you know, technically that was a different team) in a driving rainstorm under grey skies. The uglier, the better.";
var result = target.CountCharacters(text, "en-US");
Assert.AreEqual(704, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Basque_Text()
{
var target = CreateTarget();
var text =
"Gure kirol zaleek paregabeko sarbidea eskatzen dute noiznahi, edozein lekutan eta hainbat gailutan. Taldearekin lan egiteak hain zerbitzu handinahi berria abian jartzeko behar dugun eskarmentua eta frogatutako teknologia eskaintzen digu";
var result = target.CountWords(text, "eu-ES");
Assert.AreEqual(30, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Basque_Text()
{
var target = CreateTarget();
var text =
"Gure kirol zaleek paregabeko sarbidea eskatzen dute noiznahi, edozein lekutan eta hainbat gailutan. Taldearekin lan egiteak hain zerbitzu handinahi berria abian jartzeko behar dugun eskarmentua eta frogatutako teknologia eskaintzen digu";
var result = target.CountCharacters(text, "eu-ES");
Assert.AreEqual(236, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Bengali_Text()
{
var target = CreateTarget();
var text =
"আমাদের ক্রীড়া অনুরাগীরা যে কোনও সময়, যে কোনও জায়গায় এবং বেশ কয়েকটি ডিভাইসে অবিচ্ছিন্ন অ্যাক্সেসের দাবি করে। দলের সাথে কাজ করা আমাদের এমন উচ্চাভিলাষী নতুন পরিষেবা চালু করার জন্য প্রয়োজনীয় দক্ষতা এবং প্রমাণিত প্রযুক্তি দেয়।";
var result = target.CountWords(text, "bn-IN");
Assert.AreEqual(35, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Bengali_Text()
{
var target = CreateTarget();
var text =
"আমাদের ক্রীড়া অনুরাগীরা যে কোনও সময়, যে কোনও জায়গায় এবং বেশ কয়েকটি ডিভাইসে অবিচ্ছিন্ন অ্যাক্সেসের দাবি করে। দলের সাথে কাজ করা আমাদের এমন উচ্চাভিলাষী নতুন পরিষেবা চালু করার জন্য প্রয়োজনীয় দক্ষতা এবং প্রমাণিত প্রযুক্তি দেয়।";
var result = target.CountCharacters(text, "bn-IN");
Assert.AreEqual(229, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Belarusian_Text()
{
var target = CreateTarget();
var text =
"Нашы аматары спорту патрабуюць неперасягненага доступу ў любы час і ў любым месцы і на некалькіх прыладах. Праца з камандай дае нам вопыт і правераныя тэхналогіі, неабходныя для запуску такой амбіцыйнай новай паслугі.";
var result = target.CountWords(text, "be-BY");
Assert.AreEqual(33, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Belarusian_Text()
{
var target = CreateTarget();
var text =
"Нашы аматары спорту патрабуюць неперасягненага доступу ў любы час і ў любым месцы і на некалькіх прыладах. Праца з камандай дае нам вопыт і правераныя тэхналогіі, неабходныя для запуску такой амбіцыйнай новай паслугі.";
var result = target.CountCharacters(text, "be-BY");
Assert.AreEqual(217, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Bulgarian_Text()
{
var target = CreateTarget();
var text =
"Нашите фенове на спорта изискват ненадминат достъп по всяко време, навсякъде и на редица устройства. Работата с екипа ни дава опит и доказана технология, от които се нуждаем, за да стартираме такава амбициозна нова услуга.";
var result = target.CountWords(text, "bg-BG");
Assert.AreEqual(35, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Bulgarian_Text()
{
var target = CreateTarget();
var text =
"Нашите фенове на спорта изискват ненадминат достъп по всяко време, навсякъде и на редица устройства. Работата с екипа ни дава опит и доказана технология, от които се нуждаем, за да стартираме такава амбициозна нова услуга.";
var result = target.CountCharacters(text, "bg-BG");
Assert.AreEqual(222, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Italian_Text()
{
var target = CreateTarget();
var text = "Stop all’aumento della cedolare secca sugli affitti. Si ferma la stretta sulla Flat tax per i lavoratori autonomi. Nuove risorse per il piano Impresa 4.0 ma per investimenti che siano sostenibili dal punto di vista ambientale. Il vertice di maggioranza di ieri ha corretto tre punti del disegno di legge di Bilancio, approvato «salvo intese» dal consiglio dei ministri due settimane fa ma non ancora arrivato in Parlamento. Oggi una nuova riunione dovrebbe limare gli ultimi punti del testo. Poi comincerà l’esame del Parlamento. E nuove modifiche sono possibili.";
var result = target.CountWords(text, "en-US");
Assert.AreEqual(89, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Italian_Text()
{
var target = CreateTarget();
var text = "Stop all’aumento della cedolare secca sugli affitti. Si ferma la stretta sulla Flat tax per i lavoratori autonomi. Nuove risorse per il piano Impresa 4.0 ma per investimenti che siano sostenibili dal punto di vista ambientale. Il vertice di maggioranza di ieri ha corretto tre punti del disegno di legge di Bilancio, approvato «salvo intese» dal consiglio dei ministri due settimane fa ma non ancora arrivato in Parlamento. Oggi una nuova riunione dovrebbe limare gli ultimi punti del testo. Poi comincerà l’esame del Parlamento. E nuove modifiche sono possibili.";
var result = target.CountCharacters(text, "en-US");
Assert.AreEqual(563, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Arabic_Text()
{
var target = CreateTarget();
var text = "رئاسة الحكومة، وذلك بعد أن وصل إلى ما سماه طريقا مسدودا. وقدم الحريري استقالته للرئيس ميشال عون.";
var result = target.CountWords(text, "ar-SA");
Assert.AreEqual(17, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Arabic_Text()
{
var target = CreateTarget();
var text = "رئاسة الحكومة، وذلك بعد أن وصل إلى ما سماه طريقا مسدودا. وقدم الحريري استقالته للرئيس ميشال عون.";
var result = target.CountCharacters(text, "ar-SA");
Assert.AreEqual(96, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Russian_Text()
{
var target = CreateTarget();
var text = "Рассказываем об американцах, которые оставили заметный след в еврокубках.";
var result = target.CountWords(text, "ru-RU");
Assert.AreEqual(9, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Russian_Text()
{
var target = CreateTarget();
var text = "Рассказываем об американцах, которые оставили заметный след в еврокубках.";
var result = target.CountCharacters(text, "ru-RU");
Assert.AreEqual(73, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Urdu_Text()
{
var target = CreateTarget();
var text = "ہیلو آپ سب کیسے ٹھیک ہیں";
var result = target.CountWords(text, "ur-PK");
Assert.AreEqual(6, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Urdu_Text()
{
var target = CreateTarget();
var text = "ہیلو آپ سب کیسے ٹھیک ہیں";
var result = target.CountCharacters(text, "ur-PK");
Assert.AreEqual(24, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Chinese_Simplified_Text()
{
var target = CreateTarget();
var text = "淘宝为第三方交易平台及互联网信息服务提供者,淘宝(含网站、客户端等)所展示的商品/服务的标题、价格、详情等信息内容系由店铺经营者发布,其真实性、准确性和合法性均由店铺经营者负责。淘宝提醒用户购买商品/服务前注意谨慎核实。如用户对商品/服务的标题、价格、详情等任何信息有任何疑问的,请在购买前通过阿里旺旺与店铺经营者沟通确认;淘宝存在海量店铺,如用户发现店铺内有任何违法/侵权信息,请立即向淘宝举报并提供有效线索。";
var result = target.CountWords(text, "zh-CN");
Assert.AreEqual(206, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Chinese_Simplified_Text()
{
var target = CreateTarget();
var text = "淘宝为第三方交易平台及互联网信息服务提供者,淘宝(含网站、客户端等)所展示的商品/服务的标题、价格、详情等信息内容系由店铺经营者发布,其真实性、准确性和合法性均由店铺经营者负责。淘宝提醒用户购买商品/服务前注意谨慎核实。如用户对商品/服务的标题、价格、详情等任何信息有任何疑问的,请在购买前通过阿里旺旺与店铺经营者沟通确认;淘宝存在海量店铺,如用户发现店铺内有任何违法/侵权信息,请立即向淘宝举报并提供有效线索。";
var result = target.CountCharacters(text, "zh-CN");
Assert.AreEqual(206, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Chinese_Traditional_Text()
{
var target = CreateTarget();
var text = "我們的體育迷要求隨時隨地在許多設備上獲得無與倫比的訪問權限。 與團隊合作為我們提供了開展如此雄心勃勃的新服務所需的專業知識和成熟技術。";
var result = target.CountWords(text, "zh-HK");
Assert.AreEqual(66, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Chinese_Traditional_Text()
{
var target = CreateTarget();
var text = "我們的體育迷要求隨時隨地在許多設備上獲得無與倫比的訪問權限。 與團隊合作為我們提供了開展如此雄心勃勃的新服務所需的專業知識和成熟技術。";
var result = target.CountCharacters(text, "zh-HK");
Assert.AreEqual(67, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Korean_Text()
{
var target = CreateTarget();
var text = "우리의 스포츠 팬들은 언제 어디서나 많은 장치에서 최고의 액세스를 요구합니다. 팀과 협력하면 야심 찬 새 서비스를 시작하는 데 필요한 전문성과 입증 된 기술이 제공됩니다.";
var result = target.CountWords(text, "ko-KR");
Assert.AreEqual(72, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Korean_Text()
{
var target = CreateTarget();
var text = "우리의 스포츠 팬들은 언제 어디서나 많은 장치에서 최고의 액세스를 요구합니다. 팀과 협력하면 야심 찬 새 서비스를 시작하는 데 필요한 전문성과 입증 된 기술이 제공됩니다.";
var result = target.CountCharacters(text, "ko-KR");
Assert.AreEqual(95, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Greek_Text()
{
var target = CreateTarget();
var text = "Οι φίλαθλοι μας απαιτούν απαράμιλλη πρόσβαση οποιαδήποτε στιγμή, οπουδήποτε και σε πολλές συσκευές. Η συνεργασία με την ομάδα μας προσφέρει την τεχνογνωσία και την αποδεδειγμένη τεχνολογία που χρειαζόμαστε για να ξεκινήσουμε μια τόσο φιλόδοξη νέα υπηρεσία.";
var result = target.CountWords(text, "el-GR");
Assert.AreEqual(36, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Greek_Text()
{
var target = CreateTarget();
var text = "Οι φίλαθλοι μας απαιτούν απαράμιλλη πρόσβαση οποιαδήποτε στιγμή, οπουδήποτε και σε πολλές συσκευές. Η συνεργασία με την ομάδα μας προσφέρει την τεχνογνωσία και την αποδεδειγμένη τεχνολογία που χρειαζόμαστε για να ξεκινήσουμε μια τόσο φιλόδοξη νέα υπηρεσία.";
var result = target.CountCharacters(text, "el-GR");
Assert.AreEqual(256, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Hebrew_Text()
{
var target = CreateTarget();
var text = "אוהדי הספורט שלנו דורשים גישה ללא תחרות בכל עת ובכל מקום ובמספר מכשירים. העבודה עם הצוות מעניקה לנו את המומחיות והטכנולוגיה המוכחת הדרושה לנו כדי להשיק שירות חדש ושאפתני כזה.";
var result = target.CountWords(text, "he-IL");
Assert.AreEqual(30, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Hebrew_Text()
{
var target = CreateTarget();
var text = "אוהדי הספורט שלנו דורשים גישה ללא תחרות בכל עת ובכל מקום ובמספר מכשירים. העבודה עם הצוות מעניקה לנו את המומחיות והטכנולוגיה המוכחת הדרושה לנו כדי להשיק שירות חדש ושאפתני כזה.";
var result = target.CountCharacters(text, "he-IL");
Assert.AreEqual(174, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Persian_Text()
{
var target = CreateTarget();
var text = "طرفداران ورزش ما خواستار دسترسی بی نظیر در هر زمان ، هر مکان و تعدادی دستگاه هستند. همکاری با تیم به ما مهارت و فناوری اثبات شده ای را برای راه اندازی چنین سرویس جدید جاه طلبانه می دهد.";
var result = target.CountWords(text, "fa-IR");
Assert.AreEqual(39, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Persian_Text()
{
var target = CreateTarget();
var text = "طرفداران ورزش ما خواستار دسترسی بی نظیر در هر زمان ، هر مکان و تعدادی دستگاه هستند. همکاری با تیم به ما مهارت و فناوری اثبات شده ای را برای راه اندازی چنین سرویس جدید جاه طلبانه می دهد.";
var result = target.CountCharacters(text, "fa-IR");
Assert.AreEqual(185, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Japanese_Text()
{
var target = CreateTarget();
var text =
"私たちのスポーツファンは、いつでも、どこでも、多くのデバイスで比類のないアクセスを求めています。チームと協力することで、このような野心的な新しいサービスを立ち上げるために必要な専門知識と実績のある技術を得ることができます。";
var result = target.CountWords(text, "ja-JP");
Assert.AreEqual(111, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Japanese_Text()
{
var target = CreateTarget();
var text =
"私たちのスポーツファンは、いつでも、どこでも、多くのデバイスで比類のないアクセスを求めています。チームと協力することで、このような野心的な新しいサービスを立ち上げるために必要な専門知識と実績のある技術を得ることができます。";
var result = target.CountCharacters(text, "ja-JP");
Assert.AreEqual(111, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Gujarati_Text()
{
var target = CreateTarget();
var text =
"અમારા રમત પ્રશંસકો ગમે ત્યારે, ગમે ત્યાં અને સંખ્યાબંધ ઉપકરણો પર અજોડ પ્રવેશની માંગ કરે છે. ટીમ સાથે કામ કરવાથી અમને આવી મહત્વાકાંક્ષી નવી સેવા શરૂ કરવાની આવશ્યકતા અને સાબિત તકનીક મળે છે.";
var result = target.CountWords(text, "gu-IN");
Assert.AreEqual(33, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Gujarati_Text()
{
var target = CreateTarget();
var text =
"અમારા રમત પ્રશંસકો ગમે ત્યારે, ગમે ત્યાં અને સંખ્યાબંધ ઉપકરણો પર અજોડ પ્રવેશની માંગ કરે છે. ટીમ સાથે કામ કરવાથી અમને આવી મહત્વાકાંક્ષી નવી સેવા શરૂ કરવાની આવશ્યકતા અને સાબિત તકનીક મળે છે.";
var result = target.CountCharacters(text, "gu-IN");
Assert.AreEqual(187, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Hindi_Text()
{
var target = CreateTarget();
var text =
"हमारे खेल प्रशंसक कभी भी, कहीं भी और कई उपकरणों पर बेजोड़ एक्सेस की मांग करते हैं। टीम के साथ काम करने से हमें विशेषज्ञता और सिद्ध तकनीक मिलती है, हमें इस तरह की महत्वाकांक्षी नई सेवा शुरू करने की जरूरत है।";
var result = target.CountWords(text, "hi-IN");
Assert.AreEqual(42, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Hindi_Text()
{
var target = CreateTarget();
var text =
"हमारे खेल प्रशंसक कभी भी, कहीं भी और कई उपकरणों पर बेजोड़ एक्सेस की मांग करते हैं। टीम के साथ काम करने से हमें विशेषज्ञता और सिद्ध तकनीक मिलती है, हमें इस तरह की महत्वाकांक्षी नई सेवा शुरू करने की जरूरत है।";
var result = target.CountCharacters(text, "hi-IN");
Assert.AreEqual(206, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Punjabi_Text()
{
var target = CreateTarget();
var text =
"ਸਾਡੇ ਖੇਡ ਪ੍ਰਸ਼ੰਸਕ ਕਦੇ ਵੀ, ਕਿਤੇ ਵੀ ਅਤੇ ਕਈ ਡਿਵਾਈਸਿਸਾਂ 'ਤੇ ਬੇਜੋੜ ਪਹੁੰਚ ਦੀ ਮੰਗ ਕਰਦੇ ਹਨ. ਟੀਮ ਨਾਲ ਕੰਮ ਕਰਨਾ ਸਾਨੂੰ ਮੁਹਾਰਤ ਅਤੇ ਸਾਬਤ ਤਕਨਾਲੋਜੀ ਦਿੰਦਾ ਹੈ ਜਿਸਦੀ ਸਾਨੂੰ ਅਜਿਹੀ ਮਹੱਤਵਪੂਰਣ ਨਵੀਂ ਸੇਵਾ ਸ਼ੁਰੂ ਕਰਨ ਦੀ ਜ਼ਰੂਰਤ ਹੈ.";
var result = target.CountWords(text, "pa-IN");
Assert.AreEqual(39, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Punjabi_Text()
{
var target = CreateTarget();
var text =
"ਸਾਡੇ ਖੇਡ ਪ੍ਰਸ਼ੰਸਕ ਕਦੇ ਵੀ, ਕਿਤੇ ਵੀ ਅਤੇ ਕਈ ਡਿਵਾਈਸਿਸਾਂ 'ਤੇ ਬੇਜੋੜ ਪਹੁੰਚ ਦੀ ਮੰਗ ਕਰਦੇ ਹਨ. ਟੀਮ ਨਾਲ ਕੰਮ ਕਰਨਾ ਸਾਨੂੰ ਮੁਹਾਰਤ ਅਤੇ ਸਾਬਤ ਤਕਨਾਲੋਜੀ ਦਿੰਦਾ ਹੈ ਜਿਸਦੀ ਸਾਨੂੰ ਅਜਿਹੀ ਮਹੱਤਵਪੂਰਣ ਨਵੀਂ ਸੇਵਾ ਸ਼ੁਰੂ ਕਰਨ ਦੀ ਜ਼ਰੂਰਤ ਹੈ.";
var result = target.CountCharacters(text, "pa-IN");
Assert.AreEqual(202, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Thai_Text()
{
var target = CreateTarget();
var text =
"แฟนกีฬาของเราต้องการการเข้าถึงที่ยอดเยี่ยมทุกที่ทุกเวลาและบนอุปกรณ์จำนวนมาก การทำงานกับทีมทำให้เรามีความเชี่ยวชาญและเทคโนโลยีที่พิสูจน์แล้วว่าเราจำเป็นต้องเปิดตัวบริการใหม่ที่มีความทะเยอทะยาน";
var result = target.CountWords(text, "th-TH");
Assert.AreEqual(2, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Thai_Text()
{
var target = CreateTarget();
var text =
"แฟนกีฬาของเราต้องการการเข้าถึงที่ยอดเยี่ยมทุกที่ทุกเวลาและบนอุปกรณ์จำนวนมาก การทำงานกับทีมทำให้เรามีความเชี่ยวชาญและเทคโนโลยีที่พิสูจน์แล้วว่าเราจำเป็นต้องเปิดตัวบริการใหม่ที่มีความทะเยอทะยาน";
var result = target.CountCharacters(text, "th-TH");
Assert.AreEqual(191, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Turkish_Text()
{
var target = CreateTarget();
var text =
"Sporseverler her zaman, her yerde ve çeşitli cihazlarda rakipsiz erişim talep ediyor. Ekip ile çalışmak bize bu kadar iddialı ve yeni bir servis başlatmak için ihtiyaç duyduğumuz uzmanlığı ve kanıtlanmış teknolojiyi sunuyor.";
var result = target.CountWords(text, "tr-TR");
Assert.AreEqual(32, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Turkish_Text()
{
var target = CreateTarget();
var text =
"Sporseverler her zaman, her yerde ve çeşitli cihazlarda rakipsiz erişim talep ediyor. Ekip ile çalışmak bize bu kadar iddialı ve yeni bir servis başlatmak için ihtiyaç duyduğumuz uzmanlığı ve kanıtlanmış teknolojiyi sunuyor.";
var result = target.CountCharacters(text, "tr-TR");
Assert.AreEqual(224, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Mongolian_Cyrillic_Text()
{
var target = CreateTarget();
var text =
"Манай спортын фэнүүд хэзээ ч, хаанаас ч, хэд хэдэн төхөөрөмжөөс давтагдашгүй хандалтыг шаарддаг. Багтай хамтарч ажиллах нь бидэнд маш том шинэ үйлчилгээг эхлүүлэхэд шаардлагатай туршлага, батлагдсан технологийг өгдөг.";
var result = target.CountWords(text, "mn-MN");
Assert.AreEqual(28, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Mongolian_Cyrillic_Text()
{
var target = CreateTarget();
var text =
"Манай спортын фэнүүд хэзээ ч, хаанаас ч, хэд хэдэн төхөөрөмжөөс давтагдашгүй хандалтыг шаарддаг. Багтай хамтарч ажиллах нь бидэнд маш том шинэ үйлчилгээг эхлүүлэхэд шаардлагатай туршлага, батлагдсан технологийг өгдөг.";
var result = target.CountCharacters(text, "mn-MN");
Assert.AreEqual(217, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Nepali_Text()
{
var target = CreateTarget();
var text =
"हाम्रो खेल फ्यानहरूले कुनै पनि समय, कहिँ पनि र धेरै उपकरणहरूमा अतुलनीय पहुँचको माग गर्दछ। टोलीसँग काम गर्दा हामीलाई विशेषज्ञता र प्रमाणित टेक्नोलोजी दिन्छ जुन हामीले यस्तो महत्वाकांक्षी नयाँ सेवा सुरू गर्न आवश्यक छ।";
var result = target.CountWords(text, "ne-NP");
Assert.AreEqual(34, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Nepali_Text()
{
var target = CreateTarget();
var text =
"हाम्रो खेल फ्यानहरूले कुनै पनि समय, कहिँ पनि र धेरै उपकरणहरूमा अतुलनीय पहुँचको माग गर्दछ। टोलीसँग काम गर्दा हामीलाई विशेषज्ञता र प्रमाणित टेक्नोलोजी दिन्छ जुन हामीले यस्तो महत्वाकांक्षी नयाँ सेवा सुरू गर्न आवश्यक छ।";
var result = target.CountCharacters(text, "ne-NP");
Assert.AreEqual(215, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Macedonian_Text()
{
var target = CreateTarget();
var text =
"Нашите fansубители на спортот бараат неповторлив пристап во секое време, насекаде и на голем број уреди. Работата со тимот ни дава експертиза и докажана технологија што ни е потребна за да започнеме ваква амбициозна нова услуга.";
var result = target.CountWords(text, "mk-MK");
Assert.AreEqual(36, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Macedonian_Text()
{
var target = CreateTarget();
var text =
"Нашите fansубители на спортот бараат неповторлив пристап во секое време, насекаде и на голем број уреди. Работата со тимот ни дава експертиза и докажана технологија што ни е потребна за да започнеме ваква амбициозна нова услуга.";
var result = target.CountCharacters(text, "mk-MK");
Assert.AreEqual(228, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_French_Text()
{
var target = CreateTarget();
var text =
"Nos fans de sport exigent un accès inégalé à tout moment, n'importe où et sur de nombreux appareils. Travailler avec l'équipe nous donne l'expertise et la technologie éprouvée dont nous avons besoin pour lancer un nouveau service aussi ambitieux.";
var result = target.CountWords(text, "fr-FR");
Assert.AreEqual(39, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_French_Text()
{
var target = CreateTarget();
var text =
"Nos fans de sport exigent un accès inégalé à tout moment, n'importe où et sur de nombreux appareils. Travailler avec l'équipe nous donne l'expertise et la technologie éprouvée dont nous avons besoin pour lancer un nouveau service aussi ambitieux.";
var result = target.CountCharacters(text, "fr-FR");
Assert.AreEqual(246, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Spanish_Text()
{
var target = CreateTarget();
var text =
"Nuestros fanáticos de los deportes exigen un acceso inigualable en cualquier momento, en cualquier lugar y en varios dispositivos. Trabajar con el equipo nos brinda la experiencia y la tecnología comprobada que necesitamos para lanzar un nuevo servicio tan ambicioso.";
var result = target.CountWords(text, "es-ES");
Assert.AreEqual(40, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Spanish_Text()
{
var target = CreateTarget();
var text =
"Nuestros fanáticos de los deportes exigen un acceso inigualable en cualquier momento, en cualquier lugar y en varios dispositivos. Trabajar con el equipo nos brinda la experiencia y la tecnología comprobada que necesitamos para lanzar un nuevo servicio tan ambicioso.";
var result = target.CountCharacters(text, "es-ES");
Assert.AreEqual(267, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_German_Text()
{
var target = CreateTarget();
var text =
"Unsere Sportfans fordern einen konkurrenzlosen Zugang zu jeder Zeit, an jedem Ort und auf einer Reihe von Geräten. Durch die Zusammenarbeit mit dem Team verfügen wir über das Fachwissen und die bewährte Technologie, die wir für die Einführung eines so ehrgeizigen neuen Dienstes benötigen.";
var result = target.CountWords(text, "de-DE");
Assert.AreEqual(44, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_German_Text()
{
var target = CreateTarget();
var text =
"Unsere Sportfans fordern einen konkurrenzlosen Zugang zu jeder Zeit, an jedem Ort und auf einer Reihe von Geräten. Durch die Zusammenarbeit mit dem Team verfügen wir über das Fachwissen und die bewährte Technologie, die wir für die Einführung eines so ehrgeizigen neuen Dienstes benötigen.";
var result = target.CountCharacters(text, "de-DE");
Assert.AreEqual(289, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Polish_Text()
{
var target = CreateTarget();
var text =
"Nasi fani sportu wymagają niezrównanego dostępu w dowolnym miejscu, czasie i na wielu urządzeniach. Współpraca z zespołem daje nam wiedzę i sprawdzoną technologię, której potrzebujemy do wprowadzenia tak ambitnej nowej usługi.";
var result = target.CountWords(text, "pl-PL");
Assert.AreEqual(31, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Polish_Text()
{
var target = CreateTarget();
var text =
"Nasi fani sportu wymagają niezrównanego dostępu w dowolnym miejscu, czasie i na wielu urządzeniach. Współpraca z zespołem daje nam wiedzę i sprawdzoną technologię, której potrzebujemy do wprowadzenia tak ambitnej nowej usługi.";
var result = target.CountCharacters(text, "pl-PL");
Assert.AreEqual(226, result);
}
[Test]
public void Should_be_possible_to_Count_Words_for_Vietnamese_Text()
{
var target = CreateTarget();
var text =
"Người hâm mộ thể thao của chúng tôi yêu cầu truy cập vô song mọi lúc, mọi nơi và trên một số thiết bị. Làm việc với nhóm cung cấp cho chúng tôi chuyên môn và công nghệ đã được chứng minh, chúng tôi cần triển khai một dịch vụ mới đầy tham vọng như vậy.";
var result = target.CountWords(text, "vi-VN");
Assert.AreEqual(56, result);
}
[Test]
public void Should_be_possible_to_Count_Characters_for_Vietnamese_Text()
{
var target = CreateTarget();
var text =
"Người hâm mộ thể thao của chúng tôi yêu cầu truy cập vô song mọi lúc, mọi nơi và trên một số thiết bị. Làm việc với nhóm cung cấp cho chúng tôi chuyên môn và công nghệ đã được chứng minh, chúng tôi cần triển khai một dịch vụ mới đầy tham vọng như vậy.";
var result = target.CountCharacters(text, "vi-VN");
Assert.AreEqual(251, result);
}
private static TextAnalyzer CreateTarget()
{
return new TextAnalyzer();
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Types/EndpointTest.cs
using System;
using Deltatre.Utils.Types;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Types
{
[TestFixture]
public class EndpointTest
{
[Test]
public void CreateInclusive_Should_Create_Inclusive_Endpoint()
{
// ARRANGE
var endpointValue = DateTimeOffset.Now;
// ACT
var endpoint = Endpoint<DateTimeOffset>.Inclusive(endpointValue);
      // ASSERT
Assert.IsNotNull(endpoint);
Assert.AreEqual(endpointValue, endpoint.Value);
Assert.AreEqual(EndPointStatus.Inclusive, endpoint.Status);
}
[Test]
public void CreateExclusive_Should_Create_Exclusive_Endpoint()
{
// ARRANGE
var endpointValue = DateTimeOffset.Now;
// ACT
var endpoint = Endpoint<DateTimeOffset>.Exclusive(endpointValue);
      // ASSERT
Assert.IsNotNull(endpoint);
Assert.AreEqual(endpointValue, endpoint.Value);
Assert.AreEqual(EndPointStatus.Exclusive, endpoint.Status);
}
[Test]
public void Generic_CompareTo_Should_Correctly_Handle_GreaterThan()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
var biggerEndpoint = Endpoint<DateTimeOffset>.Inclusive(biggerValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Exclusive(smallerValue);
// ACT
var result = biggerEndpoint.CompareTo(smallerEndpoint);
// ASSERT
Assert.Greater(result, 0);
}
[Test]
public void Generic_CompareTo_Should_Correctly_Handle_LowerThan()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
var biggerEndpoint = Endpoint<DateTimeOffset>.Exclusive(biggerValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Inclusive(smallerValue);
// ACT
var result = smallerEndpoint.CompareTo(biggerEndpoint);
// ASSERT
Assert.Less(result, 0);
}
[Test]
    public void Generic_CompareTo_Should_Correctly_Handle_Equality()
{
// ARRANGE
var now = DateTime.Now;
var firstValue = new DateTimeOffset(now);
var secondValue = new DateTimeOffset(now);
var firstEndpoint = Endpoint<DateTimeOffset>.Exclusive(firstValue);
var secondEndpoint = Endpoint<DateTimeOffset>.Inclusive(secondValue);
// ACT
var result = firstEndpoint.CompareTo(secondEndpoint);
// ASSERT
Assert.AreEqual(0, result);
}
[Test]
    public void NonGeneric_CompareTo_Should_Throw_When_Passed_An_Object_Of_Different_Type()
{
// ARRANGE
var endpoint = Endpoint<DateTimeOffset>.Inclusive(DateTimeOffset.Now);
const string differentType = "test";
// ACT & ASSERT
Assert.Throws<ArgumentException>(() => endpoint.CompareTo(differentType));
}
[Test]
public void NonGeneric_CompareTo_Should_Correctly_Handle_Null_Values()
{
// ARRANGE
var endpoint = Endpoint<DateTimeOffset>.Inclusive(DateTimeOffset.Now);
// ACT & ASSERT
Assert.Throws<ArgumentNullException>(() => endpoint.CompareTo((object) null));
}
[Test]
public void NonGeneric_CompareTo_Should_Correctly_Handle_GreaterThan()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
var biggerEndpoint = Endpoint<DateTimeOffset>.Exclusive(biggerValue);
object smallerEndpoint = Endpoint<DateTimeOffset>.Inclusive(smallerValue);
// ACT
var result = biggerEndpoint.CompareTo(smallerEndpoint);
// ASSERT
Assert.Greater(result, 0);
}
[Test]
public void NonGeneric_CompareTo_Should_Correctly_Handle_LowerThan()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
object biggerEndpoint = Endpoint<DateTimeOffset>.Inclusive(biggerValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Exclusive(smallerValue);
// ACT
var result = smallerEndpoint.CompareTo(biggerEndpoint);
// ASSERT
Assert.Less(result, 0);
}
[Test]
    public void NonGeneric_CompareTo_Should_Correctly_Handle_Equality()
{
// ARRANGE
var now = DateTime.Now;
var firstValue = new DateTimeOffset(now);
var secondValue = new DateTimeOffset(now);
var firstEndpoint = Endpoint<DateTimeOffset>.Inclusive(firstValue);
object secondEndpoint = Endpoint<DateTimeOffset>.Exclusive(secondValue);
// ACT
var result = firstEndpoint.CompareTo(secondEndpoint);
// ASSERT
Assert.AreEqual(0, result);
}
[Test]
public void GreaterThan_Operator_Should_Behave_Correctly()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
var biggerEndpoint = Endpoint<DateTimeOffset>.Exclusive(biggerValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Inclusive(smallerValue);
// ACT & ASSERT
Assert.IsTrue(biggerEndpoint > smallerEndpoint);
Assert.IsFalse(biggerEndpoint <= smallerEndpoint);
}
[Test]
public void GreaterThanOrEqual_Operator_Should_Behave_Correctly_When_Greater()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
var biggerEndpoint = Endpoint<DateTimeOffset>.Exclusive(biggerValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Inclusive(smallerValue);
// ACT & ASSERT
Assert.IsTrue(biggerEndpoint >= smallerEndpoint);
Assert.IsFalse(biggerEndpoint < smallerEndpoint);
}
[Test]
public void GreaterThanOrEqual_Operator_Should_Behave_Correctly_When_Equals()
{
// ARRANGE
var now = DateTime.Now;
var firstValue = new DateTimeOffset(now);
var secondValue = new DateTimeOffset(now);
var biggerEndpoint = Endpoint<DateTimeOffset>.Exclusive(firstValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Inclusive(secondValue);
// ACT & ASSERT
Assert.IsTrue(biggerEndpoint >= smallerEndpoint);
Assert.IsFalse(biggerEndpoint < smallerEndpoint);
}
[Test]
public void LesserThan_Operator_Should_Behave_Correctly()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
var biggerEndpoint = Endpoint<DateTimeOffset>.Exclusive(biggerValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Inclusive(smallerValue);
// ACT & ASSERT
Assert.IsTrue(smallerEndpoint < biggerEndpoint);
Assert.IsFalse(smallerEndpoint >= biggerEndpoint);
}
[Test]
public void LesserThanOrEqual_Operator_Should_Behave_Correctly_When_Greater()
{
// ARRANGE
var now = DateTime.Now;
var biggerValue = new DateTimeOffset(now);
var smallerValue = new DateTimeOffset(now).AddDays(-1);
var biggerEndpoint = Endpoint<DateTimeOffset>.Exclusive(biggerValue);
var smallerEndpoint = Endpoint<DateTimeOffset>.Inclusive(smallerValue);
// ACT & ASSERT
Assert.IsTrue(smallerEndpoint <= biggerEndpoint);
Assert.IsFalse(smallerEndpoint > biggerEndpoint);
}
[Test]
public void LesserThanOrEqual_Operator_Should_Behave_Correctly_When_Equals()
{
// ARRANGE
var now = DateTime.Now;
var firstValue = new DateTimeOffset(now);
var secondValue = new DateTimeOffset(now);
var firstEndpoint = Endpoint<DateTimeOffset>.Exclusive(firstValue);
var secondEndpoint = Endpoint<DateTimeOffset>.Inclusive(secondValue);
// ACT & ASSERT
Assert.IsTrue(secondEndpoint <= firstEndpoint);
Assert.IsFalse(secondEndpoint > firstEndpoint);
}
[Test]
public void Equals_Operator_Should_Return_False_If_Not_Equals_On_Status()
{
// ARRANGE
var now = DateTime.Now;
var firstValue = new DateTimeOffset(now);
var secondValue = new DateTimeOffset(now);
var firstEndpoint = Endpoint<DateTimeOffset>.Exclusive(firstValue);
var secondEndpoint = Endpoint<DateTimeOffset>.Inclusive(secondValue);
// ACT & ASSERT
Assert.IsFalse(firstEndpoint == secondEndpoint);
Assert.IsTrue(firstEndpoint != secondEndpoint);
}
[Test]
public void Equals_Operator_Should_Return_False_If_Not_Equals_On_Value()
{
// ARRANGE
var now = DateTime.Now;
var firstValue = new DateTimeOffset(now);
var secondValue = new DateTimeOffset(now).AddMilliseconds(1);
var firstEndpoint = Endpoint<DateTimeOffset>.Exclusive(firstValue);
var secondEndpoint = Endpoint<DateTimeOffset>.Exclusive(secondValue);
// ACT & ASSERT
Assert.IsFalse(firstEndpoint == secondEndpoint);
Assert.IsTrue(firstEndpoint != secondEndpoint);
}
[Test]
public void Equals_Operator_Should_Return_True_If_All_Properties_Are_Equal()
{
// ARRANGE
var now = DateTime.Now;
var firstValue = new DateTimeOffset(now);
var secondValue = new DateTimeOffset(now);
var firstEndpoint = Endpoint<DateTimeOffset>.Exclusive(firstValue);
var secondEndpoint = Endpoint<DateTimeOffset>.Exclusive(secondValue);
// ACT & ASSERT
Assert.IsTrue(firstEndpoint == secondEndpoint);
Assert.IsFalse(firstEndpoint != secondEndpoint);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Resilience/CappedExponentialRetryThenConstantTimeSleepsTest_ComputeSleepDuration.cs
using Deltatre.Utils.Resilience;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Resilience
{
[TestFixture]
public sealed partial class CappedExponentialRetryThenConstantTimeSleepsTest
{
[TestCase(-2)]
[TestCase(0)]
public void ComputeSleepDuration_Throws_ArgumentOutOfRangeException_When_RetryCount_Is_Not_Positive_Integer(
int retryCount)
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
var target = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => target.ComputeSleepDuration(retryCount)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("retryCount", exception.ParamName);
}
[Test]
public void ComputeSleepDuration_Applies_One_Exponential_Retry()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
var target = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ACT
const int retryCount = 1;
var result = target.ComputeSleepDuration(retryCount);
// ASSERT
var expectedSleepDuration = TimeSpan.FromMilliseconds(
Math.Pow(exponentialRetryBaseInMilliseconds, retryCount - 1)
);
Assert.AreEqual(expectedSleepDuration, result);
}
[Test]
public void ComputeSleepDuration_Applies_Two_Exponential_Retries()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
var target = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ACT
const int retryCount = 2;
var result = target.ComputeSleepDuration(retryCount);
// ASSERT
var expectedSleepDuration = TimeSpan.FromMilliseconds(
Math.Pow(exponentialRetryBaseInMilliseconds, retryCount - 1)
);
Assert.AreEqual(expectedSleepDuration, result);
}
[Test]
public void ComputeSleepDuration_Applies_Number_Of_Exponential_Retries_Equal_To_NumberOfRetriesWithExponentialSleepDuration()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
var target = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ACT
const int retryCount = numberOfRetriesWithExponentialSleepDuration;
var result = target.ComputeSleepDuration(retryCount);
// ASSERT
var expectedSleepDuration = TimeSpan.FromMilliseconds(
Math.Pow(exponentialRetryBaseInMilliseconds, retryCount - 1)
);
Assert.AreEqual(expectedSleepDuration, result);
}
[Test]
public void ComputeSleepDuration_Applies_ConstantTimeSleepDurationInMilliseconds_When_Retry_Count_Equals_NumberOfRetriesWithExponentialSleepDuration_Plus_One()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
var target = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ACT
const int retryCount = numberOfRetriesWithExponentialSleepDuration + 1;
var result = target.ComputeSleepDuration(retryCount);
// ASSERT
var expectedSleepDuration = TimeSpan.FromMilliseconds(constantTimeSleepDurationInMilliseconds);
Assert.AreEqual(expectedSleepDuration, result);
}
[Test]
public void ComputeSleepDuration_Applies_ConstantTimeSleepDurationInMilliseconds_When_Retry_Count_Is_Greater_Than_NumberOfRetriesWithExponentialSleepDuration_Plus_One()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
var target = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ACT
const int retryCount = numberOfRetriesWithExponentialSleepDuration + 2;
var result = target.ComputeSleepDuration(retryCount);
// ASSERT
var expectedSleepDuration = TimeSpan.FromMilliseconds(constantTimeSleepDurationInMilliseconds);
Assert.AreEqual(expectedSleepDuration, result);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/String/StringExtensionsTest_Truncate.cs
using Deltatre.Utils.Extensions.String;
using NUnit.Framework;
using System;
using System.Collections.Generic;
namespace Deltatre.Utils.Tests.Extensions.String
{
[TestFixture]
public partial class StringExtensionsTest
{
[Test]
public void Truncate_Throws_ArgumentOutOfRangeException_When_Parameter_MaximumAllowedLength_Is_Less_Than_Zero()
{
// ARRANGE
const string source = "Hello world";
const string ellipsis = "...";
// ACT
Assert.Throws<ArgumentOutOfRangeException>(() => source.Truncate(-3, ellipsis));
}
[TestCase(null)]
[TestCase("")]
public void Truncate_Returns_Source_String_When_Source_String_Is_Null_Or_Empty(string source)
{
// ACT
var result = source.Truncate(5, "...");
// ASSERT
Assert.AreEqual(source, result);
}
[TestCaseSource(nameof(GetTestCasesForSourceShorterThanMaxAllowedLength))]
public void Truncate_Returns_Source_When_Source_Length_Is_Less_Than_Max_Allowed_Length(
(string source, int maxLength) tuple)
{
// ACT
var result = tuple.source.Truncate(tuple.maxLength, "...");
// ASSERT
Assert.AreEqual(tuple.source, result);
}
[TestCaseSource(nameof(GetTestCasesForSourceLengthEqualToMaxAllowedLength))]
public void Truncate_Returns_Source_When_Source_Length_Equals_Max_Allowed_Length(
(string source, int maxLength) tuple)
{
// ACT
var result = tuple.source.Truncate(tuple.maxLength, "...");
// ASSERT
Assert.AreEqual(tuple.source, result);
}
[TestCaseSource(nameof(GetTestCasesForSourceLengthGreaterThanMaxAllowedLength))]
public void Truncate_Returns_Substring_When_Source_Length_Exceeds_Max_Allowed_Length(
(string source, int maxLength, string ellipsis, string expected) tuple)
{
// ACT
var result = tuple.source.Truncate(tuple.maxLength, tuple.ellipsis);
// ASSERT
Assert.AreEqual(tuple.expected, result);
}
[TestCase(null)]
[TestCase("")]
[TestCase(" ")]
public void Truncate_Does_Not_Add_Ellipsis_If_Ellipsis_Is_Null_Or_White_Space_And_Source_String_Is_Truncated(
string ellipsis)
{
// ACT
var result = "ciao".Truncate(2, ellipsis);
// ASSERT
Assert.AreEqual("ci", result);
}
private static IEnumerable<(string source, int maxLength)>
GetTestCasesForSourceShorterThanMaxAllowedLength()
{
yield return ("ciao", 5);
yield return (" ", 4);
yield return ("hello world", 12);
}
private static IEnumerable<(string source, int maxLength)>
GetTestCasesForSourceLengthEqualToMaxAllowedLength()
{
yield return ("ciao", 4);
yield return (" ", 4);
yield return ("hello world", 11);
yield return (string.Empty, 0);
}
private static IEnumerable<(string source, int maxLength, string ellipsis, string expected)>
GetTestCasesForSourceLengthGreaterThanMaxAllowedLength()
{
yield return ("ciao", 0, "...", "...");
yield return ("hello world", 0, "...", "...");
yield return ("ciao", 3, "...", "cia...");
yield return ("hello world", 3, "...", "hel...");
yield return ("hello world", 5, "...", "hello...");
yield return ("hello world", 6, "...", "hello ...");
yield return ("hello world", 7, "...", "hello w...");
yield return (" ", 3, "...", " ...");
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/TimerAsyncTest_Stop.cs
using Deltatre.Utils.Extensions.Enumerable;
using Deltatre.Utils.Concurrency;
using NUnit.Framework;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
namespace Deltatre.Utils.Tests.Concurrency
{
public partial class TimerAsyncTest
{
[Test]
public async Task Stop_Terminates_Execution_Of_Scheduled_Background_Workload()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = ct =>
{
ct.ThrowIfCancellationRequested();
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ACT
target.Start();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice
await target.Stop().ConfigureAwait(false);
var snapshot1 = values.ToArray();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice (but now the timer is stopped)
var snapshot2 = values.ToArray();
// ASSERT
Assert.GreaterOrEqual(snapshot1.Length, 2);
Assert.IsTrue(snapshot1.All(i => i == 1));
CollectionAssert.AreEqual(snapshot1, snapshot2);
}
}
[Test]
public void Stop_Can_Be_Called_More_Than_Once_Without_Throwing_Exceptions()
{
// ARRANGE
using (var target = new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ASSERT
Assert.DoesNotThrowAsync(async () =>
{
target.Start();
await Task.Delay(1200).ConfigureAwait(false);
await target.Stop().ConfigureAwait(false);
await target.Stop().ConfigureAwait(false);
});
}
}
[Test]
public async Task Stop_Can_Be_Called_More_Than_Once_Without_Changing_Expected_Behaviour()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = ct =>
{
ct.ThrowIfCancellationRequested();
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ACT
target.Start();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice
await target.Stop().ConfigureAwait(false);
await target.Stop().ConfigureAwait(false);
var snapshot1 = values.ToArray();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice (but now the timer is stopped)
await target.Stop().ConfigureAwait(false);
var snapshot2 = values.ToArray();
// ASSERT
Assert.GreaterOrEqual(snapshot1.Length, 2);
Assert.IsTrue(snapshot1.All(i => i == 1));
CollectionAssert.AreEqual(snapshot1, snapshot2);
}
}
[Test]
public async Task Stop_Is_Thread_Safe()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = ct =>
{
ct.ThrowIfCancellationRequested();
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
bool run = true;
var thread1 = new Thread(() =>
{
while (run)
{
target.Stop().Wait();
}
});
var thread2 = new Thread(() =>
{
while (run)
{
target.Stop().Wait();
}
});
var threads = new[] { thread1, thread2 };
// ACT
target.Start();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice
threads.ForEach(t => t.Start());
await Task.Delay(2000).ConfigureAwait(false);
run = false;
threads.ForEach(t => t.Join());
var snapshot1 = values.ToArray();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice (but now the timer is stopped)
var snapshot2 = values.ToArray();
// ASSERT
Assert.GreaterOrEqual(snapshot1.Length, 2);
Assert.IsTrue(snapshot1.All(i => i == 1));
CollectionAssert.AreEqual(snapshot1, snapshot2);
}
}
[Test]
public void Stop_Throws_ObjectDisposedException_When_Called_On_A_Disposed_Instance()
{
// ARRANGE
var timer = new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500));
// ACT & ASSERT
timer.Dispose();
Assert.ThrowsAsync<ObjectDisposedException>(timer.Stop);
}
[Test]
public void Stop_Can_Be_Called_Before_Start_Without_Throwing_Exceptions()
{
// ARRANGE
using (var target = new TimerAsync(
_ => Task.CompletedTask,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ASSERT
Assert.DoesNotThrowAsync(async () =>
{
await target.Stop().ConfigureAwait(false);
target.Start();
await Task.Delay(1200).ConfigureAwait(false);
});
}
}
[Test]
public async Task Stop_Can_Be_Called_Before_Start_Without_Changing_Expected_Behaviour()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = ct =>
{
ct.ThrowIfCancellationRequested();
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
// ACT
await target.Stop().ConfigureAwait(false);
target.Start();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice
await target.Stop().ConfigureAwait(false);
var snapshot1 = values.ToArray();
await Task.Delay(1200).ConfigureAwait(false); // in 1200 milliseconds we are sure that action is called at least twice (but now the timer is stopped)
var snapshot2 = values.ToArray();
// ASSERT
Assert.GreaterOrEqual(snapshot1.Length, 2);
Assert.IsTrue(snapshot1.All(i => i == 1));
CollectionAssert.AreEqual(snapshot1, snapshot2);
}
}
}
}
<file_sep>/src/Deltatre.Utils/Text/ITextAnalyzer.cs
using System.Globalization;
namespace Deltatre.Utils.Text
{
/// <summary>
/// Provides a set of utility methods to perform analysis on text
/// </summary>
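/// <example>
/// A minimal usage sketch, assuming the default <see cref="TextAnalyzer"/> implementation is used
/// (the input strings below are illustrative):
/// <code>
/// ITextAnalyzer analyzer = new TextAnalyzer();
/// var wordCount = analyzer.CountWords("hello world", "en-US"); // 2
/// var charCount = analyzer.CountCharacters("hello world", "en-US", new[] { " " }); // 10, spaces are skipped
/// </code>
/// </example>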
public interface ITextAnalyzer
{
/// <summary>
/// Gets the number of words contained in the provided text following the rules of the provided culture
/// </summary>
/// <param name="text">The text for which to count the words</param>
/// <param name="culture">The reference culture</param>
/// <returns>The number of words</returns>
int CountWords(string text, string culture);
/// <summary>
/// Gets the number of characters contained in the provided text following the rules of the provided culture
/// </summary>
/// <param name="text">The text for which to count the characters</param>
/// <param name="culture">The reference culture</param>
/// <param name="skippedChars">A list of characters which will not be counted</param>
/// <returns>The number of characters</returns>
int CountCharacters(string text, string culture, string[] skippedChars = null);
/// <summary>
/// Gets the <see cref="TextInfo"/> object of a specific culture
/// </summary>
/// <param name="culture">The culture for which to get the <see cref="TextInfo"/></param>
/// <returns>The <see cref="TextInfo"/> associated with the provided culture</returns>
TextInfo GetTextInfo(string culture);
}
}
<file_sep>/src/Deltatre.Utils/Parsers/DateParsers.cs
using System;
using System.Globalization;
using Deltatre.Utils.Dto;
namespace Deltatre.Utils.Parsers
{
/// <summary>
/// A collection of methods to parse dates expressed in different formats
/// </summary>
public static class DateParsers
{
private static readonly string[] Iso8601Formats = {
//Microsoft standard date and time format strings
"s",
"o",
//Extended formats
"yyyy-MM-ddTHH:mm:ssZ",
"yyyy-MM-ddTHH:mm:ssz",
"yyyy-MM-ddTHH:mm:sszz",
"yyyy-MM-ddTHH:mm:sszzz",
// Basic formats
"yyyyMMddTHHmmsszzz",
"yyyyMMddTHHmmsszz",
"yyyyMMddTHHmmssz",
"yyyyMMddTHHmmssZ",
//Extended formats with accuracy reduced to minutes
"yyyy-MM-ddTHH:mmZ",
"yyyy-MM-ddTHH:mmz",
"yyyy-MM-ddTHH:mmzz",
"yyyy-MM-ddTHH:mmzzz",
// Basic formats with accuracy reduced to minutes
"yyyyMMddTHHmmzzz",
"yyyyMMddTHHmmzz",
"yyyyMMddTHHmmz",
"yyyyMMddTHHmmZ",
//Extended formats with accuracy reduced to hours
"yyyy-MM-ddTHHZ",
"yyyy-MM-ddTHHz",
"yyyy-MM-ddTHHzz",
"yyyy-MM-ddTHHzzz",
// Basic formats with accuracy reduced to hours
"yyyyMMddTHHzzz",
"yyyyMMddTHHzz",
"yyyyMMddTHHz",
"yyyyMMddTHHZ",
// Formats with date only
"yyyy-MM-dd",
"yyyyMMdd",
// Javascript ISO8601 format
"yyyy-MM-ddTHH:mm:ss.fffZ",
"yyyy-MM-ddTHH:mm:ss.fffzzz"
};
/// <summary>
/// Parses a date and time of day represented as a string in the format ISO 8601.
/// </summary>
/// <param name="toBeParsed">The ISO 8601 string to be parsed</param>
/// <returns>An object of type ParsingResult which represents the result of the performed parsing operation</returns>
/// <exception cref="ArgumentException">Throws ArgumentException if a null or white space string is passed in</exception>
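/// <example>
/// A minimal usage sketch (the date literal below is illustrative):
/// <code>
/// var parsingResult = DateParsers.ParseIso8601Date("2019-04-01T10:30:00Z");
/// // parsingResult is an OperationResult&lt;DateTimeOffset&gt;: on success it wraps the parsed value,
/// // otherwise it equals OperationResult&lt;DateTimeOffset&gt;.Failure
/// </code>
/// </example>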
public static OperationResult<DateTimeOffset> ParseIso8601Date(string toBeParsed)
{
if (string.IsNullOrWhiteSpace(toBeParsed))
throw new ArgumentException($"Parameter '{nameof(toBeParsed)}' cannot be null or white space.");
var isSuccess = DateTimeOffset.TryParseExact(
toBeParsed,
Iso8601Formats,
CultureInfo.InvariantCulture,
DateTimeStyles.RoundtripKind,
out var result);
return isSuccess ?
OperationResult<DateTimeOffset>.CreateSuccess(result):
OperationResult<DateTimeOffset>.Failure;
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/TimerAsyncTest_OnError.cs
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Deltatre.Utils.Concurrency;
using Moq;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Concurrency
{
public partial class TimerAsyncTest
{
[Test]
public async Task OnError_Is_Raised_When_An_Error_Occurs_Inside_Background_Workload()
{
// ARRANGE
int counter = 0;
Func<CancellationToken, Task> action = _ =>
{
counter++;
if (counter >= 2)
{
throw new TestException("KABOOM !");
}
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
var mock = new Mock<ILogger>();
mock.Setup(m => m.Log(It.IsAny<string>())).Verifiable();
target.OnError += (object sender, Exception e) => mock.Object.Log($"An error occurred {e.Message}");
// ACT
target.Start();
await Task.Delay(2500).ConfigureAwait(false);
// ASSERT
mock.Verify(m => m.Log(It.IsAny<string>()), Times.Once());
mock.Verify(m => m.Log("An error occurred KABOOM !"), Times.Once());
}
}
[Test]
public async Task OnError_Is_Not_Called_When_Timer_Is_Stopped()
{
// ARRANGE
var values = new List<int>();
Func<CancellationToken, Task> action = _ =>
{
values.Add(1);
return Task.CompletedTask;
};
using (var target = new TimerAsync(
action,
TimeSpan.FromMilliseconds(500),
TimeSpan.FromMilliseconds(500)))
{
var mock = new Mock<ILogger>();
mock.Setup(m => m.Log(It.IsAny<string>())).Verifiable();
target.OnError += (object sender, Exception e) => mock.Object.Log($"An error occurred {e.Message}");
// ACT
target.Start();
await Task.Delay(2500).ConfigureAwait(false);
await target.Stop().ConfigureAwait(false);
await Task.Delay(2500).ConfigureAwait(false);
// ASSERT
mock.Verify(m => m.Log(It.IsAny<string>()), Times.Never());
}
}
}
}
<file_sep>/src/Deltatre.Utils/Text/TextAnalyzer.cs
using System.Globalization;
using System.Linq;
using System.Text.RegularExpressions;
namespace Deltatre.Utils.Text
{
/// <summary>
/// Provides a set of utility methods to perform analysis on text
/// </summary>
public class TextAnalyzer : ITextAnalyzer
{
private readonly string[] _cultureWordExceptions = new[] { "th-TH" };
/// <summary>
/// Gets the number of words contained in the provided text following the rules of the provided culture
/// </summary>
/// <param name="text">The text for which to count the words</param>
/// <param name="culture">The reference culture</param>
/// <returns>The number of words</returns>
public int CountWords(string text, string culture)
{
if (string.IsNullOrEmpty(text))
return 0;
var cultureInfo = new CultureInfo(culture);
if (!cultureInfo.TextInfo.IsRightToLeft && !IsWordsText(text, cultureInfo))
return CountCharacters(text, culture, new[] { " " });
return CalculateWordCount(text);
}
/// <summary>
/// Gets the number of characters contained in the provided text following the rules of the provided culture
/// </summary>
/// <param name="text">The text for which to count the characters</param>
/// <param name="culture">The reference culture</param>
/// <param name="skippedChars">A list of characters which will not be counted</param>
/// <returns>The number of characters</returns>
public int CountCharacters(string text, string culture, string[] skippedChars = null)
{
if (string.IsNullOrEmpty(text))
return 0;
if (skippedChars != null)
foreach (var skippedChar in skippedChars)
text = text.Replace(skippedChar, "");
return text.Length;
}
/// <summary>
/// Gets the <see cref="TextInfo"/> object of a specific culture
/// </summary>
/// <param name="culture">The culture for which to get the <see cref="TextInfo"/></param>
/// <returns>The <see cref="TextInfo"/> associated with the provided culture</returns>
public TextInfo GetTextInfo(string culture)
{
var cultureInfo = new CultureInfo(culture);
return cultureInfo.TextInfo;
}
private static int CalculateWordCount(string text)
{
var collection = Regex.Matches(text, @"[\S]+");
return collection.Count;
}
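// Heuristic: a text is treated as "word based" when the culture has no ANSI code page, belongs to the
// exception list above, or when none of the first (up to 10) characters falls in UnicodeCategory.OtherLetter
// (typical of ideographic scripts). For left-to-right text that is not word based, CountWords falls back
// to counting characters excluding spaces.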
private bool IsWordsText(string text, CultureInfo cultureInfo)
{
if (cultureInfo.TextInfo.ANSICodePage == 0 || _cultureWordExceptions.Contains(cultureInfo.Name))
return true;
var checkLength = text.Length < 10
? text.Length
: 10;
var otherLetterChars = 0;
var spaceChars = 0;
for (int t = 0; t < checkLength; t++)
{
var cat = char.GetUnicodeCategory(text[t]);
switch (cat)
{
case UnicodeCategory.OtherLetter:
otherLetterChars++;
break;
case UnicodeCategory.SpaceSeparator:
spaceChars++;
break;
}
}
return otherLetterChars == 0;
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/String/StringExtensionsTest_StripHtmlTags.cs
using Deltatre.Utils.Extensions.String;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Extensions.String
{
[TestFixture]
public partial class StringExtensionsTest
{
[Test]
public void StripHtmlTags_Returns_Null_For_Null_Source()
{
// ACT
var result = StringExtensions.StripHtmlTags(null);
// ASSERT
Assert.IsNull(result);
}
[Test]
public void StripHtmlTags_Returns_Empty_String_For_Empty_String_Source()
{
// ACT
var result = string.Empty.StripHtmlTags();
// ASSERT
Assert.IsNotNull(result);
Assert.IsEmpty(result);
}
[Test]
public void StripHtmlTags_Returns_White_Space_String_For_White_Space_String_Source()
{
// ACT
var result = StringExtensions.StripHtmlTags(" ");
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(" ", result);
}
[Test]
public void StripHtmlTags_Should_Return_Given_String_Without_Html_Tags()
{
// ARRANGE
const string source = "This <b>is a bold tag</b> and this <span>is a span tag</span>";
// ACT
var result = source.StripHtmlTags();
// ASSERT
Assert.IsNotNull(result);
const string expectedString = "This is a bold tag and this is a span tag";
Assert.AreEqual(expectedString, result);
}
[Test]
public void StripHtmlTags_Should_Return_Given_String_Without_Html_Tags_Leaving_Escaped_Html()
{
// ARRANGE
const string source = "<p>When you want to include <b>literal html</b> inside a web page you should encode it (e.g.: <h1> Title here </h1>)</p>";
// ACT
var result = source.StripHtmlTags();
// ASSERT
Assert.IsNotNull(result);
const string expectedString = "When you want to include literal html inside a web page you should encode it (e.g.: <h1> Title here </h1>)";
Assert.AreEqual(expectedString, result);
}
}
}
<file_sep>/src/Deltatre.Utils/Extensions/Array/ArrayExtensions.cs
using System;
using System.Linq;
using Deltatre.Utils.Randomization;
namespace Deltatre.Utils.Extensions.Array
{
/// <summary>
/// Extension methods for arrays
/// </summary>
public static class ArrayExtensions
{
/// <summary>
/// Call this method to get a random permutation of a given array
/// </summary>
/// <typeparam name="T">The type of elements contained in the array to be shuffled</typeparam>
/// <param name="source">The array to be shuffled</param>
/// <returns>A random permutation of the source array. Each possible permutation has the exact same likelihood</returns>
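/// <example>
/// A minimal usage sketch:
/// <code>
/// var deck = new[] { 1, 2, 3, 4, 5 };
/// var shuffled = deck.Shuffle(); // 'shuffled' is a new array; 'deck' is left untouched
/// </code>
/// </example>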
public static T[] Shuffle<T>(this T[] source)
{
if (source == null)
throw new ArgumentNullException(nameof(source));
if (source.Length == 0 || source.Length == 1)
return source.ToArray();
var result = source.ToArray();
for (var i = result.Length - 1; i >= 1; i--)
{
var j = RandomGenerator.Instance.Next(0, i + 1); // the upper bound of Next is exclusive: j must be allowed to equal i for an unbiased Fisher-Yates shuffle
var temp = result[j];
result[j] = result[i];
result[i] = temp;
}
return result;
}
}
}
<file_sep>/src/Deltatre.Utils/Azure/AzureHelpers.cs
using System;
using System.Runtime.InteropServices;
namespace Deltatre.Utils.Azure
{
/// <summary>
/// A collection of helper methods for applications running on the Azure cloud.
/// </summary>
public static class AzureHelpers
{
private const string AzureEnvironmentVariableForSiteName = "WEBSITE_SITE_NAME";
/// <summary>
/// Detects whether the running application is being hosted in the Azure cloud.
/// </summary>
/// <returns>
/// <see langword="true"/> if the running application is being hosted in Azure, <see langword="false"/> otherwise.
/// </returns>
public static bool IsRunningOnAzureAppService() =>
!string.IsNullOrWhiteSpace(
Environment.GetEnvironmentVariable(AzureEnvironmentVariableForSiteName)
);
/// <summary>
/// Returns the absolute path to a file which can be used to write logs so that they will be visible inside the Azure
/// log stream. Both Windows and Linux app services are supported.
/// </summary>
/// <returns>
/// The absolute path to a file that can be used to write logs so that they will be visible inside the Azure
/// log stream.
/// </returns>
/// <exception cref="OsNotSupportedOnAzureAppServicesException">
/// When the hosting environment is running an OS which is not compatible with Azure App Services (e.g.: OSX)
/// </exception>
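/// <example>
/// A minimal usage sketch, assuming the caller appends to the log stream file only when hosted on Azure
/// (<c>File</c> comes from <c>System.IO</c>):
/// <code>
/// if (AzureHelpers.IsRunningOnAzureAppService())
/// {
///   var logFilePath = AzureHelpers.GetAzureLogStreamFilePath();
///   File.AppendAllText(logFilePath, "Application started" + Environment.NewLine);
/// }
/// </code>
/// </example>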
public static string GetAzureLogStreamFilePath()
{
if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
{
return @"D:\home\LogFiles\Application\logs.txt";
}
else if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
{
return "/home/LogFiles/Application/logs.txt";
}
else
{
throw new OsNotSupportedOnAzureAppServicesException(
$"The currently running operating system ({RuntimeInformation.OSDescription}) is not compatible with Azure app services."
)
{
OperatingSystem = RuntimeInformation.OSDescription
};
}
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Reflection/ReflectionHelpersTest.cs
using Deltatre.Utils.Reflection;
using NUnit.Framework;
using System;
using System.Linq;
namespace Deltatre.Utils.Tests.Reflection
{
[TestFixture]
public class ReflectionHelpersTest
{
[Test]
public void IsDecoratorFor_Throws_ArgumentNullException_When_Type_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => ReflectionHelpers.IsDecoratorFor<IService>(null));
}
[Test]
public void IsDecoratorFor_Returns_True_When_Type_Is_Decorator_For_TDecoratee_And_Type_Ctor_Has_One_Parameter()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(SimpleDecorator));
// ASSERT
Assert.IsTrue(result);
}
[TestCase(typeof(ComplexDecorator1))]
[TestCase(typeof(ComplexDecorator2))]
public void IsDecoratorFor_Returns_True_When_Type_Is_Decorator_For_TDecoratee_And_Type_Ctor_Has_Two_Parameter(Type type)
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(type);
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void IsDecoratorFor_Returns_True_When_Type_Is_Decorator_For_TDecoratee_And_Is_Sealed_Class()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(SealedDecorator));
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void IsDecoratorFor_Returns_True_When_Type_Is_Decorator_For_TDecoratee_And_Is_Open_Generic_Type()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(GenericDecorator<>));
// ASSERT
Assert.IsTrue(result);
}
[Test]
public void IsDecoratorFor_Returns_True_When_Type_Is_Decorator_For_TDecoratee_And_Is_Closed_Generic_Type()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(GenericDecorator<string>));
// ASSERT
Assert.IsTrue(result);
}
[TestCase(typeof(AbstractDecoratorWithPublicCtor))]
[TestCase(typeof(AbstractDecoratorWithProtectedCtor))]
public void IsDecoratorFor_Returns_False_When_Type_Is_Decorator_For_TDecoratee_And_Is_Abstract_Class(Type type)
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(type);
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsDecoratorFor_Returns_False_When_Type_Depends_On_TDecoratee_But_Does_Not_Implement_TDecoratee()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(SomeClass));
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsDecoratorFor_Returns_False_When_Type_Does_Not_Depend_On_TDecoratee_But_Implements_TDecoratee()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(SomeOtherClass));
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsDecoratorFor_Returns_False_When_Type_Implements_Abstraction_And_Has_Non_Public_Ctor_Requiring_Abstraction()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(ClassWithNoPublicCtor));
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsDecoratorFor_Returns_False_When_Type_Has_Multiple_Public_Constructors()
{
// ACT
var result = ReflectionHelpers.IsDecoratorFor<IService>(typeof(DecoratorWithTwoPublicCtor));
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void LoadAssembliesFromBinariesFolder_Throws_ArgumentNullException_When_RunningApplicationAssembly_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => ReflectionHelpers.LoadAssembliesFromBinariesFolder(null));
}
[Test]
public void LoadAssembliesFromBinariesFolder_Returns_Assemblies_From_Binaries_Folder()
{
// ACT
var assemblies = ReflectionHelpers.LoadAssembliesFromBinariesFolder(this.GetType().Assembly);
// ASSERT
Assert.IsNotNull(assemblies);
Assert.GreaterOrEqual(assemblies.Count, 1);
Assert.IsTrue(assemblies.Any(a => a.GetName().Name == "Deltatre.Utils.Tests"));
}
public interface IService
{
}
public class Service : IService
{
}
public class SimpleDecorator : IService
{
private readonly IService _decoratee;
public SimpleDecorator(IService decoratee)
{
_decoratee = decoratee;
}
}
public sealed class SealedDecorator : IService
{
private readonly IService _decoratee;
public SealedDecorator(IService decoratee)
{
_decoratee = decoratee;
}
}
public class GenericDecorator<T> : IService
{
private readonly IService _decoratee;
public GenericDecorator(IService decoratee)
{
_decoratee = decoratee;
}
}
public class ComplexDecorator1 : IService
{
private readonly IService _decoratee;
private readonly string _name;
public ComplexDecorator1(IService decoratee, string name)
{
_decoratee = decoratee;
_name = name;
}
}
public class ComplexDecorator2 : IService
{
private readonly IService _decoratee;
private readonly string _name;
public ComplexDecorator2(string name, IService decoratee)
{
_decoratee = decoratee;
_name = name;
}
}
public abstract class AbstractDecoratorWithPublicCtor : IService
{
private readonly IService _decoratee;
public AbstractDecoratorWithPublicCtor(IService decoratee)
{
_decoratee = decoratee;
}
}
public abstract class AbstractDecoratorWithProtectedCtor : IService
{
private readonly IService _decoratee;
protected AbstractDecoratorWithProtectedCtor(IService decoratee)
{
_decoratee = decoratee;
}
}
public class SomeClass
{
private readonly IService _decoratee;
public SomeClass(IService decoratee)
{
_decoratee = decoratee;
}
}
public class SomeOtherClass : IService
{
}
public sealed class ClassWithNoPublicCtor : IService
{
private readonly IService _decoratee;
private ClassWithNoPublicCtor(IService decoratee)
{
_decoratee = decoratee;
}
public static ClassWithNoPublicCtor Build(IService decoratee)
{
return new ClassWithNoPublicCtor(decoratee);
}
}
public class DecoratorWithTwoPublicCtor : IService
{
private readonly IService _decoratee;
public DecoratorWithTwoPublicCtor()
:this(new Service())
{
}
public DecoratorWithTwoPublicCtor(IService decoratee)
{
_decoratee = decoratee;
}
}
}
}
<file_sep>/src/Deltatre.Utils/Resilience/WaitAndRetryWithRandomSleepDurations.cs
using Deltatre.Utils.Mocking;
using System;
namespace Deltatre.Utils.Resilience
{
/// <summary>
/// This class represents a wait and retry strategy for retries where the sleep time between
/// two consecutive retries is randomic.
/// </summary>
public sealed class WaitAndRetryWithRandomSleepDurations : IRetryStrategy
{
private readonly WaitAndRetryWithRandomSleepDurationsConfiguration _configuration;
private readonly IRandomValueProvider _randomValueProvider;
/// <summary>
/// Initializes a new instance of the <see cref="WaitAndRetryWithRandomSleepDurations"/> class.
/// </summary>
/// <param name="configuration">
/// The configuration object for the instance being created.
/// This parameter cannot be <see langword="null"/>.
/// </param>
/// <param name="randomValueProvider">
/// A service to generate random values.
/// This parameter cannot be <see langword="null"/>.
/// </param>
/// <exception cref="ArgumentNullException">
/// When <paramref name="configuration"/> is <see langword="null"/>
/// When <paramref name="randomValueProvider"/> is <see langword="null"/>
/// </exception>
public WaitAndRetryWithRandomSleepDurations(
WaitAndRetryWithRandomSleepDurationsConfiguration configuration,
IRandomValueProvider randomValueProvider)
{
_configuration = configuration ?? throw new ArgumentNullException(nameof(configuration));
_randomValueProvider = randomValueProvider ?? throw new ArgumentNullException(nameof(randomValueProvider));
}
/// <summary>
/// Gets the number of allowed retries.
/// </summary>
/// <remarks>
/// The number of allowed retries is a non negative integer number.
/// When this number is zero the retry strategy is basically a pass-through which simply executes the provided operation
/// without attempting any retry in case of errors.
/// When this number is greater than zero the retry strategy retries the provided operation in case of errors.
/// </remarks>
public int MaxRetryCount => _configuration.MaxRetryCount;
/// <summary>
/// Computes the amount of time that must be waited before doing another retry.
/// </summary>
/// <remarks>
/// Invoking this method always throws an <see cref="InvalidOperationException" /> exception for a retry strategy
/// having <see cref="MaxRetryCount" /> equal to zero (0).
/// A retry strategy having <see cref="MaxRetryCount" /> equal to zero (0) is a pass-through
/// which simply executes the provided operation and doesn't perform any retry in case of errors.
/// </remarks>
/// <param name="retryCount">
/// An incremental integer value which acts as a retry counter.
/// This counter is always a positive integer number.
/// It takes the value 1 after the first attempt of performing the operation, before the first retry.
/// It takes the value 2 after the first retry, before the second retry.
/// It takes the value 3 after the second retry, before the third retry and so on.
/// </param>
/// <returns>
/// A <see cref="TimeSpan" /> representing the amount of time that must be waited before doing another retry.
/// </returns>
/// <exception cref="InvalidOperationException">
/// When <see cref="MaxRetryCount" /> is zero (0).
/// In that case the retry policy is simply a pass-through for the provided
/// operation and no retries will be applied in case of errors. So, computing the sleep duration makes no sense in that case.
/// </exception>
/// <exception cref="ArgumentOutOfRangeException">
/// This exception is thrown by retry strategies having <see cref="MaxRetryCount" /> greater than zero (0)
/// when parameter <paramref name="retryCount" /> is less than 1 or greater than <see cref="MaxRetryCount" />.
/// </exception>
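/// <example>
/// A minimal usage sketch; 'configuration' and 'randomValueProvider' are assumed to be built elsewhere
/// (the former fixes <see cref="MaxRetryCount"/> and the maximum sleep duration, the latter is any
/// <see cref="IRandomValueProvider"/> implementation):
/// <code>
/// var strategy = new WaitAndRetryWithRandomSleepDurations(configuration, randomValueProvider);
/// var sleepBeforeFirstRetry = strategy.ComputeSleepDuration(1); // random duration between zero and the configured maximum
/// </code>
/// </example>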
public TimeSpan ComputeSleepDuration(int retryCount)
{
EnsureRetriesAreAllowed();
EnsureRetryCountIsInValidRange();
var randomNumberOfMilliseconds =
_randomValueProvider.NextDouble()
*
_configuration.MaxSleepDurationInMilliseconds;
return TimeSpan.FromMilliseconds(randomNumberOfMilliseconds);
void EnsureRetriesAreAllowed()
{
if (this.MaxRetryCount <= 0)
{
throw new InvalidOperationException("Retries are not allowed with this retry strategy");
}
}
void EnsureRetryCountIsInValidRange()
{
if (retryCount < 1 || retryCount > this.MaxRetryCount)
{
var message = FormattableString.Invariant(
$"The provided value for retry count is invalid: {retryCount}. Retry count must be greater than or equal to 1 and less than or equal to {this.MaxRetryCount}"
);
throw new ArgumentOutOfRangeException(nameof(retryCount), message);
}
}
}
}
}
<file_sep>/src/Deltatre.Utils/Mocking/ISystemClock.cs
using System;
namespace Deltatre.Utils.Mocking
{
/// <summary>
/// This interface is an abstraction representing the system clock. It is useful to write testable classes relying on the system clock.
/// </summary>
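/// <example>
/// A minimal sketch of a class made testable by depending on this abstraction instead of <see cref="DateTime.Now"/>
/// (the class name is illustrative):
/// <code>
/// public sealed class TokenFactory
/// {
///   private readonly ISystemClock _clock;
///   public TokenFactory(ISystemClock clock) => _clock = clock;
///   public DateTime ComputeExpiration() => _clock.UtcNow.AddMinutes(30);
/// }
/// </code>
/// </example>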
public interface ISystemClock
{
/// <summary>
/// Represents the current date and time, expressed as the local time.
/// </summary>
DateTime Now { get; }
/// <summary>
/// Represents the current date and time, expressed as the Coordinated Universal Time (UTC).
/// </summary>
DateTime UtcNow { get; }
/// <summary>
/// Gets a <see cref="DateTimeOffset"/> object that is set to the current date and time on the current computer,
/// with the offset set to the local time's offset from Coordinated Universal Time (UTC).
/// </summary>
DateTimeOffset OffsetNow { get; }
/// <summary>
/// Gets a <see cref="DateTimeOffset"/> object whose date and time are set to the current Coordinated Universal Time (UTC) date and time
/// and whose offset is set to <see cref="TimeSpan.Zero"/>
/// </summary>
DateTimeOffset OffsetUtcNow { get; }
}
}
<file_sep>/src/Deltatre.Utils/Extensions/DateTime/DateTimeExtensions.cs
using System;
namespace Deltatre.Utils.Extensions.DateTimeExtension
{
/// <summary>
/// A collection of helpers to work with date and times.
/// </summary>
public static class DateTimeExtensions
{
private static DateTime _epoch;
/// <summary>
/// Unix Epoch Time (1970-01-01T00:00:00Z)
/// </summary>
public static DateTime Epoch
{
get
{
if(_epoch == default(DateTime))
{
_epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
}
return _epoch;
}
}
/// <summary>
/// Returns the number of seconds that have elapsed since Epoch (1970-01-01T00:00:00Z)
/// </summary>
/// <param name="date">DateTime to convert in Unix Time</param>
/// <returns>The number of seconds that have elapsed since Epoch (1970-01-01T00:00:00Z)</returns>
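/// <example>
/// A minimal usage sketch:
/// <code>
/// var oneMinuteAfterEpoch = new DateTime(1970, 1, 1, 0, 1, 0, DateTimeKind.Utc);
/// var seconds = oneMinuteAfterEpoch.ToUnixTimeSeconds(); // 60
/// </code>
/// </example>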
public static int ToUnixTimeSeconds(this DateTime date)
{
var utcDate = date.ToUniversalTime();
return (int) utcDate.Subtract(Epoch).TotalSeconds;
}
}
}
<file_sep>/src/Deltatre.Utils/Helpers/ExpandoObjects/ExpandoObjectHelpers.cs
using Deltatre.Utils.Extensions.ExpandoObjects;
using System;
using System.Dynamic;
namespace Deltatre.Utils.Helpers.ExpandoObjects
{
/// <summary>
/// A collection of helper methods to work with instances of <see cref="ExpandoObject"/> class.
/// </summary>
public static class ExpandoObjectHelpers
{
/// <summary>
/// Executes the shallow merge of <paramref name="source"/> over <paramref name="target"/>
/// and returns a fresh new instance of <see cref="ExpandoObject"/> containing the result of
/// the shallow merge operation.
/// Parameter <paramref name="target"/> is used as the target of the shallow merge operation
/// and is not modified.
/// Parameter <paramref name="source"/> is shallow merged over <paramref name="target"/>
/// and is not modified.
/// The returned object contains the result of the shallow merge operation.
/// Only the top level properties of <paramref name="target"/> and <paramref name="source"/>
/// are shallow merged.
/// </summary>
/// <param name="target">
/// The object to be used as the target of the shallow merge operation.
/// Cannot be <see langword="null"/>. This object is not modified.
/// </param>
/// <param name="source">
/// The object to be merged over <paramref name="target"/>. Cannot be null.
/// This object is not modified.
/// </param>
/// <returns>
/// A fresh new instance of <see cref="ExpandoObject"/> containing the result of the shallow merge operation.
/// </returns>
/// <exception cref="ArgumentNullException">
/// Throws <see cref="ArgumentNullException"/> when <paramref name="target"/> is <see langword="null"/>.
/// </exception>
/// <exception cref="ArgumentNullException">
/// Throws <see cref="ArgumentNullException"/> when <paramref name="source"/> is <see langword="null"/>.
/// </exception>
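/// <example>
/// A minimal usage sketch (the property names are illustrative):
/// <code>
/// dynamic target = new ExpandoObject();
/// target.Name = "target";
/// target.Color = "red";
/// dynamic source = new ExpandoObject();
/// source.Color = "blue";
/// var merged = ExpandoObjectHelpers.ShallowMerge((ExpandoObject)target, (ExpandoObject)source);
/// // merged contains Name = "target" and Color = "blue"; neither input object is modified
/// </code>
/// </example>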
public static ExpandoObject ShallowMerge(ExpandoObject target, ExpandoObject source)
{
if (target == null)
throw new ArgumentNullException(nameof(target));
if (source == null)
throw new ArgumentNullException(nameof(source));
var result = target.ShallowClone();
result.ShallowMergeWith(source);
return result;
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/Array/ArrayExtensionsTest.cs
using System;
using Deltatre.Utils.Extensions.Array;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Extensions.Array
{
[TestFixture]
public class ArrayExtensionsTest
{
[Test]
public void Shuffle_Throws_When_Source_Is_Null()
{
// ARRANGE
string[] source = null;
// ACT
Assert.Throws<ArgumentNullException>(() => source.Shuffle());
}
[Test]
public void Shuffle_Returns_Empty_Array_When_Source_Is_Empty_Array()
{
// ARRANGE
var source = new string[0];
// ACT
var result = source.Shuffle();
// ASSERT
Assert.IsNotNull(result);
Assert.IsEmpty(result);
}
[Test]
public void Shuffle_Returns_Source_When_Source_Is_Array_With_One_Item()
{
// ARRANGE
var source = new[] {"foo"};
// ACT
var result = source.Shuffle();
// ASSERT
Assert.IsNotNull(result);
CollectionAssert.AreEqual(source, result);
}
[Test]
public void Shuffle_Returns_Permutation_Of_Source_When_Source_Is_Array_With_More_Than_One_Item()
{
// ARRANGE
var source = new[] { "foo", "bar", "buzz", "alpha", "beta", "gamma", "delta" }; // enough items to make the identity permutation statistically negligible
// ACT
var result = source.Shuffle();
// ASSERT
Assert.IsNotNull(result);
CollectionAssert.AreNotEqual(source, result);
CollectionAssert.AreEquivalent(source, result);
// check that source sequence is not changed
CollectionAssert.AreEqual(new[] { "foo", "bar", "buzz", "alpha", "beta", "gamma", "delta" }, source);
}
}
}
<file_sep>/src/Deltatre.Utils/Mocking/IRandomValueProvider.cs
using System;
using System.Diagnostics.CodeAnalysis;
namespace Deltatre.Utils.Mocking
{
/// <summary>
/// This interface is an abstraction over the <see cref="Random"/> class.
/// This is useful to write testable code depending upon the <see cref="Random"/> class.
/// </summary>
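/// <example>
/// A minimal sketch of a production implementation delegating to the per-thread <c>Random</c> instance
/// exposed by <c>Deltatre.Utils.Randomization.RandomGenerator</c> (the class name is illustrative):
/// <code>
/// public sealed class ThreadLocalRandomValueProvider : IRandomValueProvider
/// {
///   public int Next(int minValue, int maxValue) => RandomGenerator.Instance.Next(minValue, maxValue);
///   public double NextDouble() => RandomGenerator.Instance.NextDouble();
/// }
/// </code>
/// </example>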
public interface IRandomValueProvider
{
/// <summary>
/// Returns a random integer that is within a specified range
/// </summary>
/// <param name="minValue">
/// The inclusive lower bound of the random number returned
/// </param>
/// <param name="maxValue">
/// The exclusive upper bound of the random number returned. <paramref name="maxValue"/>
/// must be greater than or equal to <paramref name="minValue"/>.
/// </param>
/// <returns>
/// A 32-bit signed integer greater than or equal to <paramref name="minValue"/> and less than <paramref name="maxValue"/>;
/// that is, the range of return values includes <paramref name="minValue"/> but not <paramref name="maxValue"/>.
/// If <paramref name="minValue"/> equals <paramref name="maxValue"/>, <paramref name="minValue"/> is returned.
/// </returns>
/// <exception cref="ArgumentOutOfRangeException">
/// <paramref name="minValue"/> is greater than <paramref name="maxValue"/>
/// </exception>
[SuppressMessage("Naming", "CA1716:Identifiers should not match keywords", Justification = "We want to mimic the same public API offered by the Random class.")]
int Next(int minValue, int maxValue);
/// <summary>
/// Returns a random floating-point number that is greater than or equal to 0.0 and less than 1.0
/// </summary>
/// <returns>
/// A double-precision floating point number that is greater than or equal to 0.0, and less than 1.0
/// </returns>
double NextDouble();
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Randomization/RandomGeneratorTest.cs
using System;
using System.Linq;
using System.Threading;
using NUnit.Framework;
using Deltatre.Utils.Randomization;
namespace Deltatre.Utils.Tests.Randomization
{
[TestFixture]
public class RandomGeneratorTest
{
[Test]
public void Instance_Of_Random_Class_Is_Not_Null()
{
// ACT
var instance = RandomGenerator.Instance;
// ASSERT
Assert.IsNotNull(instance);
}
[Test]
public void Different_Threads_Get_Different_Instances_Of_Random_Class()
{
// ARRANGE
Random instance1 = null;
Random instance2 = null;
var thread1 = new Thread(() => instance1 = RandomGenerator.Instance);
var thread2 = new Thread(() => instance2 = RandomGenerator.Instance);
// ACT
thread1.Start();
thread2.Start();
thread1.Join();
thread2.Join();
// ASSERT
Assert.IsFalse(ReferenceEquals(instance1, instance2));
}
[Test]
public void Instances_Of_Different_Threads_Generate_Different_Sequences_Of_Random_Integers()
{
// ARRANGE
const int sequenceLength = 3;
var sequence1 = new int[sequenceLength];
var sequence2 = new int[sequenceLength];
var thread1 = new Thread(() =>
{
for (var i = 0; i < sequenceLength; i++)
{
sequence1[i] = RandomGenerator.Instance.Next((i + 1) * 10);
}
});
var thread2 = new Thread(() =>
{
for (var i = 0; i < sequenceLength; i++)
{
sequence2[i] = RandomGenerator.Instance.Next((i + 1) * 10);
}
});
// ACT
thread1.Start();
thread2.Start();
thread1.Join();
thread2.Join();
// ASSERT
(int sequence1Value, int sequence2Value)[] joinedSequence = sequence1.Zip(
sequence2,
(val1, val2) => (val1, val2)
).ToArray();
Assert.IsTrue(joinedSequence.All(i => i.sequence1Value != i.sequence2Value));
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Validators/UrlHelpersTest_IsValidHttpsAbsoluteUrl.cs
using NUnit.Framework;
using static Deltatre.Utils.Validators.UrlHelpers;
namespace Deltatre.Utils.Tests.Validators
{
[TestFixture]
public partial class UrlHelpersTest
{
[TestCase(null)]
[TestCase("")]
[TestCase(" ")]
public void IsValidHttpsAbsoluteUrl_Returns_False_If_Value_Is_Null_Or_White_Space(string value)
{
// ACT
var result = IsValidHttpsAbsoluteUrl(value);
// ASSERT
Assert.IsFalse(result);
}
[TestCase("http://contoso.it")]
[TestCase("http://contoso.it/")]
[TestCase("http://contoso.it:800")]
[TestCase("http://contoso.it:800/")]
[TestCase("http://www.facebook.com")]
[TestCase("http://www.facebook.com/")]
[TestCase("http://www.facebook.com:900")]
[TestCase("http://www.facebook.com:800/")]
[TestCase("http://www.facebook.com/foo")]
[TestCase("http://www.facebook.com/foo/")]
[TestCase("http://www.facebook.com:80/foo")]
[TestCase("http://www.facebook.com:56/foo/")]
[TestCase("http://www.facebook.com/foo/bar")]
[TestCase("http://www.facebook.com/foo/bar/")]
[TestCase("http://www.facebook.com:800/foo/bar")]
[TestCase("http://www.facebook.com:800/foo/bar/")]
[TestCase("http://www.facebook.com/foo/bar?key=value")]
[TestCase("http://www.facebook.com:800/foo/bar?key=value")]
[TestCase("http://www.facebook.com/foo/bar?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com:80/foo/bar?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com/foo/bar/?key=value")]
[TestCase("http://www.facebook.com:800/foo/bar/?key=value")]
[TestCase("http://www.facebook.com/foo/bar/?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com:800/foo/bar/?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com:800/foo/bar/?key=value&pippo=pluto#somewhere")]
[TestCase("http://www.facebook.com:800/foo/bar/index.html")]
public void IsValidHttpsAbsoluteUrl_Returns_False_If_Value_Is_Http_Absolute_Url(string value)
{
// ACT
var result = IsValidHttpsAbsoluteUrl(value);
// ASSERT
Assert.IsFalse(result);
}
[TestCase("https://contoso.it")]
[TestCase("https://contoso.it/")]
[TestCase("https://contoso.it:800")]
[TestCase("https://contoso.it:800/")]
[TestCase("https://www.facebook.com")]
[TestCase("https://www.facebook.com/")]
[TestCase("https://www.facebook.com:900")]
[TestCase("https://www.facebook.com:800/")]
[TestCase("https://www.facebook.com/foo")]
[TestCase("https://www.facebook.com/foo/")]
[TestCase("https://www.facebook.com:80/foo")]
[TestCase("https://www.facebook.com:56/foo/")]
[TestCase("https://www.facebook.com/foo/bar")]
[TestCase("https://www.facebook.com/foo/bar/")]
[TestCase("https://www.facebook.com:800/foo/bar")]
[TestCase("https://www.facebook.com:800/foo/bar/")]
[TestCase("https://www.facebook.com/foo/bar?key=value")]
[TestCase("https://www.facebook.com:800/foo/bar?key=value")]
[TestCase("https://www.facebook.com/foo/bar?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com:80/foo/bar?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com/foo/bar/?key=value")]
[TestCase("https://www.facebook.com:800/foo/bar/?key=value")]
[TestCase("https://www.facebook.com/foo/bar/?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com:800/foo/bar/?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com:800/foo/bar/?key=value&pippo=pluto#somewhere")]
[TestCase("https://www.facebook.com:800/foo/bar/index.html")]
public void IsValidHttpsAbsoluteUrl_Returns_True_If_Value_Is_Https_Absolute_Url(string value)
{
// ACT
var result = IsValidHttpsAbsoluteUrl(value);
// ASSERT
Assert.IsTrue(result);
}
[TestCase("hello")]
[TestCase("hello world")]
[TestCase("7")]
[TestCase("13")]
[TestCase("12345")]
[TestCase("http:// not an url")]
[TestCase("https:// not an url")]
[TestCase("foo=bar&key=value")]
[TestCase("?foo=bar&key=value")]
[TestCase("http")]
[TestCase("http://")]
[TestCase("https")]
[TestCase("https://")]
[TestCase("Some Random text 1 here !!! ()/89")]
public void IsValidHttpsAbsoluteUrl_Returns_False_If_Value_Is_Not_Url(string value)
{
// ACT
var result = IsValidHttpsAbsoluteUrl(value);
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsValidHttpsAbsoluteUrl_Returns_False_If_Value_Has_Protocol_Other_Than_Http_And_Https()
{
// ACT
var result = IsValidHttpsAbsoluteUrl("ftp://ftp.funet.fi/pub/standards/RFC/rfc959.txt");
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsValidHttpsAbsoluteUrl_Returns_False_If_Value_Has_Implicit_Protocol()
{
// ACT
var result = IsValidHttpsAbsoluteUrl("//developer.mozilla.org/en-US/docs/Learn");
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsValidHttpsAbsoluteUrl_Returns_False_If_Value_Has_Implicit_HostName()
{
// ACT
var result = IsValidHttpsAbsoluteUrl("/en-US/docs/Learn");
// ASSERT
Assert.IsFalse(result);
}
[TestCase("Skills/Infrastructure/Understanding_URLs")]
[TestCase("../CSS/display")]
public void IsValidHttpsAbsoluteUrl_Returns_False_If_Value_Is_Relative_Url(string value)
{
// ACT
var result = IsValidHttpsAbsoluteUrl(value);
// ASSERT
Assert.IsFalse(result);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/Extensions/EnumerableExtensionsTest_ForEachAsync.cs
using NUnit.Framework;
using System;
using System.Threading.Tasks;
using Deltatre.Utils.Concurrency.Extensions;
using System.Collections.Concurrent;
using System.Threading;
using System.Linq;
namespace Deltatre.Utils.Tests.Concurrency.Extensions
{
[TestFixture]
public partial class EnumerableExtensionsTest
{
[Test]
public void ForEachAsync_Throws_ArgumentNullException_When_Source_Is_Null()
{
// ARRANGE
const int maxDegreeOfParallelism = 4;
Func<string, Task> operation = _ => Task.FromResult(true);
// ACT
Assert.ThrowsAsync<ArgumentNullException>(() =>
EnumerableExtensions.ForEachAsync(null, operation, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_Throws_ArgumentNullException_When_Operation_Is_Null()
{
// ARRANGE
const int maxDegreeOfParallelism = 4;
var source = new[] { "hello", "world" };
// ACT
Assert.ThrowsAsync<ArgumentNullException>(() =>
source.ForEachAsync(null, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_Throws_ArgumentOutOfRangeException_When_MaxDegreeOfParallelism_Is_Less_Than_Zero()
{
// ARRANGE
const int maxDegreeOfParallelism = -3;
var source = new[] { "hello", "world" };
Func<string, Task> operation = _ => Task.FromResult(true);
// ACT
Assert.ThrowsAsync<ArgumentOutOfRangeException>(() =>
source.ForEachAsync(operation, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_Throws_ArgumentOutOfRangeException_When_MaxDegreeOfParallelism_Is_Equals_Zero()
{
// ARRANGE
const int maxDegreeOfParallelism = 0;
var source = new[] { "hello", "world" };
Func<string, Task> operation = _ => Task.FromResult(true);
// ACT
Assert.ThrowsAsync<ArgumentOutOfRangeException>(() =>
source.ForEachAsync(operation, maxDegreeOfParallelism));
}
[Test]
public void ForEachAsync_Does_Not_Throws_When_MaxDegreeOfParallelism_Is_Null()
{
// ARRANGE
var source = new[] { "hello", "world" };
Func<string, Task> operation = _ => Task.FromResult(true);
// ACT
Assert.DoesNotThrowAsync(() => source.ForEachAsync(operation, null));
}
[TestCase(2)]
[TestCase(3)]
[TestCase(4)]
[TestCase(null)]
public async Task ForEachAsync_Executes_The_Operation_For_Each_Item_Inside_Source_Collection(int? maxDegreeOfParallelism)
{
// ARRANGE
var source = new[] { "foo", "bar", "buzz" };
var processedItems = new ConcurrentBag<string>();
Func<string, Task> operation = item =>
{
processedItems.Add(item);
return Task.FromResult(true);
};
// ACT
await source.ForEachAsync(operation, maxDegreeOfParallelism).ConfigureAwait(false);
// ASSERT
CollectionAssert.AreEquivalent(new[] { "foo", "bar", "buzz" }, processedItems);
}
[TestCase(2)]
[TestCase(3)]
[TestCase(4)]
public async Task ForEachAsync_Executes_A_Number_Of_Concurrent_Operations_Less_Than_Or_Equal_To_MaxDegreeOfParallelism(int maxDegreeOfParallelism)
{
// ARRANGE
var source = new[] { "foo", "bar", "buzz" };
var timeRanges = new ConcurrentBag<(DateTime start, DateTime end)>();
Func<string, Task> operation = async _ =>
{
var start = DateTime.Now;
await Task.Delay(500).ConfigureAwait(false);
timeRanges.Add((start, DateTime.Now));
};
// ACT
await source.ForEachAsync(operation, maxDegreeOfParallelism).ConfigureAwait(false);
// ASSERT
var timeRangesArray = timeRanges.ToArray();
for (int i = 0; i < timeRanges.Count; i++)
{
var current = timeRangesArray[i];
var others = GetOthers(timeRangesArray, i);
var overlaps = 0;
foreach (var item in others)
{
if (AreOverlapping(current, item))
{
overlaps++;
}
}
Assert.IsTrue(overlaps <= maxDegreeOfParallelism);
}
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Validators/UrlHelpersTest_IsValidAbsoluteUrl.cs
using Deltatre.Utils.Validators;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Validators
{
[TestFixture]
public partial class UrlHelpersTest
{
[TestCase(null)]
[TestCase("")]
[TestCase(" ")]
public void IsValidAbsoluteUrl_Returns_False_If_Value_Is_Null_Or_White_Space(string value)
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl(value);
// ASSERT
Assert.IsFalse(result);
}
[TestCase("http://contoso.it")]
[TestCase("http://contoso.it/")]
[TestCase("http://contoso.it:800")]
[TestCase("http://contoso.it:800/")]
[TestCase("http://www.facebook.com")]
[TestCase("http://www.facebook.com/")]
[TestCase("http://www.facebook.com:900")]
[TestCase("http://www.facebook.com:800/")]
[TestCase("http://www.facebook.com/foo")]
[TestCase("http://www.facebook.com/foo/")]
[TestCase("http://www.facebook.com:80/foo")]
[TestCase("http://www.facebook.com:56/foo/")]
[TestCase("http://www.facebook.com/foo/bar")]
[TestCase("http://www.facebook.com/foo/bar/")]
[TestCase("http://www.facebook.com:800/foo/bar")]
[TestCase("http://www.facebook.com:800/foo/bar/")]
[TestCase("http://www.facebook.com/foo/bar?key=value")]
[TestCase("http://www.facebook.com:800/foo/bar?key=value")]
[TestCase("http://www.facebook.com/foo/bar?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com:80/foo/bar?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com/foo/bar/?key=value")]
[TestCase("http://www.facebook.com:800/foo/bar/?key=value")]
[TestCase("http://www.facebook.com/foo/bar/?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com:800/foo/bar/?key=value&pippo=pluto")]
[TestCase("http://www.facebook.com:800/foo/bar/?key=value&pippo=pluto#somewhere")]
[TestCase("http://www.facebook.com:800/foo/bar/index.html")]
public void IsValidAbsoluteUrl_Returns_True_If_Value_Is_Http_Absolute_Url(string value)
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl(value);
// ASSERT
Assert.IsTrue(result);
}
[TestCase("https://contoso.it")]
[TestCase("https://contoso.it/")]
[TestCase("https://contoso.it:800")]
[TestCase("https://contoso.it:800/")]
[TestCase("https://www.facebook.com")]
[TestCase("https://www.facebook.com/")]
[TestCase("https://www.facebook.com:900")]
[TestCase("https://www.facebook.com:800/")]
[TestCase("https://www.facebook.com/foo")]
[TestCase("https://www.facebook.com/foo/")]
[TestCase("https://www.facebook.com:80/foo")]
[TestCase("https://www.facebook.com:56/foo/")]
[TestCase("https://www.facebook.com/foo/bar")]
[TestCase("https://www.facebook.com/foo/bar/")]
[TestCase("https://www.facebook.com:800/foo/bar")]
[TestCase("https://www.facebook.com:800/foo/bar/")]
[TestCase("https://www.facebook.com/foo/bar?key=value")]
[TestCase("https://www.facebook.com:800/foo/bar?key=value")]
[TestCase("https://www.facebook.com/foo/bar?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com:80/foo/bar?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com/foo/bar/?key=value")]
[TestCase("https://www.facebook.com:800/foo/bar/?key=value")]
[TestCase("https://www.facebook.com/foo/bar/?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com:800/foo/bar/?key=value&pippo=pluto")]
[TestCase("https://www.facebook.com:800/foo/bar/?key=value&pippo=pluto#somewhere")]
[TestCase("https://www.facebook.com:800/foo/bar/index.html")]
public void IsValidAbsoluteUrl_Returns_True_If_Value_Is_Https_Absolute_Url(string value)
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl(value);
// ASSERT
Assert.IsTrue(result);
}
[TestCase("hello")]
[TestCase("hello world")]
[TestCase("7")]
[TestCase("13")]
[TestCase("12345")]
[TestCase("http:// not an url")]
[TestCase("https:// not an url")]
[TestCase("foo=bar&key=value")]
[TestCase("?foo=bar&key=value")]
[TestCase("http")]
[TestCase("http://")]
[TestCase("https")]
[TestCase("https://")]
[TestCase("Some Random text 1 here !!! ()/89")]
public void IsValidAbsoluteUrl_Returns_False_If_Value_Is_Not_Url(string value)
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl(value);
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsValidAbsoluteUrl_Returns_False_If_Value_Has_Protocol_Other_Than_Http_And_Https()
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl("ftp://ftp.funet.fi/pub/standards/RFC/rfc959.txt");
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsValidAbsoluteUrl_Returns_False_If_Value_Has_Implicit_Protocol()
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl("//developer.mozilla.org/en-US/docs/Learn");
// ASSERT
Assert.IsFalse(result);
}
[Test]
public void IsValidAbsoluteUrl_Returns_False_If_Value_Has_Implicit_HostName()
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl("/en-US/docs/Learn");
// ASSERT
Assert.IsFalse(result);
}
[TestCase("Skills/Infrastructure/Understanding_URLs")]
[TestCase("../CSS/display")]
public void IsValidAbsoluteUrl_Returns_False_If_Value_Is_Relative_Url(string value)
{
// ACT
var result = UrlHelpers.IsValidAbsoluteUrl(value);
// ASSERT
Assert.IsFalse(result);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/Dictionary/DictionaryExtensionsTest.cs
using System;
using System.Collections.Generic;
using System.Globalization;
using Deltatre.Utils.Extensions.Dictionary;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Extensions.Dictionary
{
[TestFixture]
public class DictionaryExtensionsTest
{
[Test]
public void AsReadOnly_Throws_When_Source_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => DictionaryExtensions.AsReadOnly<string, object>(null));
}
[Test]
public void AsReadOnly_Wraps_Source_In_ReadOnlyDictionary()
{
// ARRANGE
var source = new Dictionary<string, int>
{
["Foo"] = 10,
["Bar"] = 45
};
// ACT
var result = source.AsReadOnly();
// ASSERT
Assert.IsNotNull(result);
Assert.AreEqual(2, result.Count);
Assert.IsTrue(result.ContainsKey("Foo"));
Assert.IsTrue(result.ContainsKey("Bar"));
Assert.AreEqual(10, result["Foo"]);
Assert.AreEqual(45, result["Bar"]);
}
[Test]
public void GetValueOrDefault_Should_Throw_When_Source_Is_Null()
{
// ACT
Assert.Throws<ArgumentNullException>(() => DictionaryExtensions.GetValueOrDefault<object>(null, "key"));
}
[Test]
public void GetValueOrDefault_Should_Return_Specified_String_Type()
{
var source = new Dictionary<string, object>
{
["stringKey"] = "stringValue"
};
var result = source.GetValueOrDefault<string>("stringKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(string));
}
[Test]
public void GetValueOrDefault_Should_Return_Specified_Int_Type()
{
var source = new Dictionary<string, object>
{
["IntKey"] = 2
};
var result = source.GetValueOrDefault<int>("intKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(int));
}
[Test]
public void GetValueOrDefault_Should_Return_Specified_Bool_Type()
{
var source = new Dictionary<string, object>
{
["boolKey"] = true
};
var result = source.GetValueOrDefault<bool>("boolKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(bool));
}
[Test]
public void GetValueOrDefault_Should_Return_Default_Type_If_Not_Able_To_Cast_Type()
{
var source = new Dictionary<string, object>
{
["boolKey"] = true
};
var result = source.GetValueOrDefault<int>("boolKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(int));
}
[Test]
public void GetValueOrDefault_Should_Return_Catch_Default_Value_If_Not_Able_To_Cast_Type()
{
var source = new Dictionary<string, object>
{
["stringKey"] = "stringValue"
};
var result = source.GetValueOrDefault<int>("stringKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(int));
}
[Test]
public void GetValueOrDefault_Should_Return_Default_Type_If_Key_Is_Not_Found()
{
var source = new Dictionary<string, object>
{
["boolKey"] = true
};
var result = source.GetValueOrDefault<int>("noFoundKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(int));
}
[Test]
public void GetValueOrDefault_Should_Return_Default_Object_Type()
{
var source = new Dictionary<string, object>
{
["boolKey"] = new object()
};
var result = source.GetValueOrDefault<object>("boolKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(object));
}
[Test]
public void GetValueOrDefault_Should_Return_Default_Type_For_Custom_Class()
{
var source = new Dictionary<string, object>
{
["customKey"] = new ObjectDisposedException("x")
};
var result = source.GetValueOrDefault<ObjectDisposedException>("customKey");
var resultType = result.GetType();
Assert.AreEqual(resultType, typeof(ObjectDisposedException));
}
[Test]
public void GetValueOrDefault_Should_Return_DateTime_Culture_Info_Based()
{
var source = new Dictionary<string, object>
{
["dateKey"] = "8/18/2010"
};
var culture = new CultureInfo("en-CA");
var result = source.GetValueOrDefault<DateTime>("dateKey",culture);
var expectedDate = new DateTime(2010, 8, 18);
Assert.AreEqual(result.Date, expectedDate.Date);
}
[Test]
public void GetValueOrDefault_Should_Return_Default_DateTime_If_No_CultureInfo_Conversion_Is_Possible()
{
var source = new Dictionary<string, object>
{
["dateKey"] = "8/18/2010"
};
var culture = new CultureInfo("it-IT");
var result = source.GetValueOrDefault<DateTime>("dateKey", culture);
var notExpectedDate = new DateTime(2010, 8, 18);
var expectedDate = new DateTime(0001, 1, 1).Date;
Assert.AreNotEqual(result.Date, notExpectedDate.Date);
Assert.AreEqual(result.Date, expectedDate);
}
[Test]
public void GetValueOrDefault_Throws_If_Provider_Is_Null()
{
var source = new Dictionary<string, object>
{
["dateKey"] = "16/7/2018"
};
Assert.Throws<ArgumentNullException>(() => source.GetValueOrDefault<DateTime>("dateKey", null));
}
[Test]
public void GetStringOrDefault_Should_Return_The_Expected_String_Value()
{
var source = new Dictionary<string, object>
{
["stringKey"] = "valueString"
};
var result = source.GetStringOrDefault("stringKey");
Assert.AreEqual(result, "valueString");
}
[Test]
public void GetStringOrDefault_Should_Return_The_Default_String_Value()
{
var source = new Dictionary<string, object>
{
["stringKey"] = "valueString"
};
var result = source.GetStringOrDefault("stringKeyNotFound");
Assert.AreEqual(result, null);
}
[Test]
public void GetStringOrDefault_Should_Return_String_CultureInfo_Based()
{
var source = new Dictionary<string, object>
{
["doubleKey"] = 16325.62901
};
var culture = CultureInfo.CreateSpecificCulture("de-DE");
var result = source.GetStringOrDefault("doubleKey", culture);
Assert.AreEqual("16325,62901",result);
}
[Test]
public void GetBoolOrDefault_Should_Return_The_Expected_Bool_Value()
{
var source = new Dictionary<string, object>
{
["boolKey"] = true
};
var result = source.GetBoolOrDefault("boolKey");
Assert.AreEqual(result, true);
}
[Test]
public void GetBoolOrDefault_Should_Return_The_Default_Bool_Value()
{
var source = new Dictionary<string, object>
{
["boolKey"] = true
};
var result = source.GetBoolOrDefault("boolKeyNotFound");
Assert.AreEqual(result, false);
}
[Test]
public void GetIntOrDefault_Should_Return_The_Expected_Int_Value()
{
var source = new Dictionary<string, object>
{
["intKey"] = 10
};
var result = source.GetIntOrDefault("intKey");
Assert.AreEqual(result, 10);
}
[Test]
public void GetIntOrDefault_Should_Return_The_Default_Int_Value()
{
var source = new Dictionary<string, object>
{
["intKey"] = 15
};
var result = source.GetIntOrDefault("intKeyNotFound");
Assert.AreEqual(result, 0);
}
}
}<file_sep>/src/Deltatre.Utils/Validators/UrlHelpers.cs
using System;
namespace Deltatre.Utils.Validators
{
/// <summary>
/// A collection of helper methods to validate URLs
/// </summary>
public static class UrlHelpers
{
/// <summary>
/// Checks whether a string is a valid absolute URL. Both the HTTP and HTTPS schemes are taken into account.
/// </summary>
/// <param name="value">The string to be validated as an absolute URL</param>
/// <returns>True if parameter <paramref name="value"/> is a valid absolute URL, false otherwise</returns>
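/// <example>
/// A minimal usage sketch (illustrative only; the outcomes follow from the scheme check in the method body):
/// <code>
/// var valid = UrlHelpers.IsValidAbsoluteUrl("https://developer.mozilla.org/en-US/docs/Learn"); // true
/// var invalid = UrlHelpers.IsValidAbsoluteUrl("//developer.mozilla.org/en-US/docs/Learn"); // false (implicit protocol)
/// </code>
/// </example>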
public static bool IsValidAbsoluteUrl(string value)
{
if (string.IsNullOrWhiteSpace(value))
return false;
var isAbsoluteUrl = Uri.TryCreate(value, UriKind.Absolute, out var url);
if (!isAbsoluteUrl)
return false;
return url.Scheme == Uri.UriSchemeHttp || url.Scheme == Uri.UriSchemeHttps;
}
/// <summary>
/// Checks whether a string is a valid absolute URL having the HTTP scheme.
/// </summary>
/// <param name="value">The string to be validated as an absolute URL having the HTTP scheme.</param>
/// <returns>True if parameter <paramref name="value"/> is a valid absolute URL having the HTTP scheme, false otherwise.</returns>
public static bool IsValidHttpAbsoluteUrl(string value)
{
if (string.IsNullOrWhiteSpace(value))
return false;
var isAbsoluteUrl = Uri.TryCreate(value, UriKind.Absolute, out var url);
if (!isAbsoluteUrl)
return false;
return url.Scheme == Uri.UriSchemeHttp;
}
/// <summary>
/// Checks whether a string is a valid absolute URL having the HTTPS scheme.
/// </summary>
/// <param name="value">The string to be validated as an absolute URL having the HTTPS scheme.</param>
/// <returns>
/// True if parameter <paramref name="value"/> is a valid absolute URL having the HTTPS scheme, false otherwise.
/// </returns>
public static bool IsValidHttpsAbsoluteUrl(string value)
{
if (string.IsNullOrWhiteSpace(value))
return false;
var isAbsoluteUrl = Uri.TryCreate(value, UriKind.Absolute, out var url);
if (!isAbsoluteUrl)
return false;
return url.Scheme == Uri.UriSchemeHttps;
}
}
}
<file_sep>/src/Deltatre.Utils/Resilience/CappedExponentialRetryThenConstantTimeSleeps.cs
using System;
namespace Deltatre.Utils.Resilience
{
/// <summary>
/// This is an infinite retry strategy.
/// The idea is to perform a limited number of exponential retries,
/// followed by infinite retries interspersed with constant time sleeps.
/// For instance, you can decide to perform 3 exponential retries with a base of 2 milliseconds, followed by constant time sleeps of 5 seconds.
/// In that case the sleep durations between the retries are the following:
/// 1 millisecond (2^0, after the first attempt, before the first retry),
/// 2 milliseconds (2^1, after the first retry, before the second retry),
/// 4 milliseconds (2^2, after the second retry, before the third retry),
/// 5 seconds (after the third retry, before the fourth retry),
/// then forever 5 seconds (after the fourth retry and forever from that moment on).
/// </summary>
/// <remarks>
/// The series of exponential retries is computed by starting with the power 0, then the power 1 and so on.
/// Put another way, the series of exponents for the exponential retries starts with 0.
/// </remarks>
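/// <example>
/// A minimal sketch of the computed sleep durations (the input values below are arbitrary sample numbers):
/// <code>
/// var strategy = new CappedExponentialRetryThenConstantTimeSleeps(
/// numberOfRetriesWithExponentialSleepDuration: 3,
/// exponentialRetryBaseInMilliseconds: 2,
/// constantTimeSleepDurationInMilliseconds: 5_000);
/// strategy.ComputeSleepDuration(1); // 1 millisecond (2^0)
/// strategy.ComputeSleepDuration(2); // 2 milliseconds (2^1)
/// strategy.ComputeSleepDuration(3); // 4 milliseconds (2^2)
/// strategy.ComputeSleepDuration(4); // 5000 milliseconds (constant from now on)
/// </code>
/// </example>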
public sealed class CappedExponentialRetryThenConstantTimeSleeps : IInfiniteRetryStrategy
{
/// <summary>
/// Gets the number of retries for which the sleep duration is computed
/// by using the exponential strategy
/// </summary>
public int NumberOfRetriesWithExponentialSleepDuration { get; }
/// <summary>
/// Gets the base for the exponential retries expressed as number of milliseconds.
/// </summary>
public int ExponentialRetryBaseInMilliseconds { get; }
/// <summary>
/// Gets the sleep duration for the constant time retries expressed
/// as number of milliseconds.
/// </summary>
public int ConstantTimeSleepDurationInMilliseconds { get; }
/// <summary>
/// Creates a new instance of the <see cref="CappedExponentialRetryThenConstantTimeSleeps"/> class.
/// </summary>
public CappedExponentialRetryThenConstantTimeSleeps(
int numberOfRetriesWithExponentialSleepDuration,
int exponentialRetryBaseInMilliseconds,
int constantTimeSleepDurationInMilliseconds)
{
if (!IsPositiveInteger(numberOfRetriesWithExponentialSleepDuration))
{
throw new ArgumentOutOfRangeException(
nameof(numberOfRetriesWithExponentialSleepDuration),
ErrorMessageForPositiveIntegerParameter(nameof(numberOfRetriesWithExponentialSleepDuration)));
}
if (!IsPositiveInteger(exponentialRetryBaseInMilliseconds))
{
throw new ArgumentOutOfRangeException(
nameof(exponentialRetryBaseInMilliseconds),
ErrorMessageForPositiveIntegerParameter(nameof(exponentialRetryBaseInMilliseconds)));
}
if (!IsPositiveInteger(constantTimeSleepDurationInMilliseconds))
{
throw new ArgumentOutOfRangeException(
nameof(constantTimeSleepDurationInMilliseconds),
ErrorMessageForPositiveIntegerParameter(nameof(constantTimeSleepDurationInMilliseconds)));
}
this.NumberOfRetriesWithExponentialSleepDuration = numberOfRetriesWithExponentialSleepDuration;
this.ExponentialRetryBaseInMilliseconds = exponentialRetryBaseInMilliseconds;
this.ConstantTimeSleepDurationInMilliseconds = constantTimeSleepDurationInMilliseconds;
}
/// <summary>
/// Computes the amount of time that must be waited before doing another retry.
/// </summary>
/// <param name="retryCount">
/// An incremental integer value which acts as a retry counter.
/// This counter is always a positive integer number.
/// It takes the value 1 after the first attempt of performing the operation, before the first retry.
/// It takes the value 2 after the first retry, before the second retry.
/// It takes the value 3 after the second retry, before the third retry and so on.
/// </param>
/// <returns>
/// A <see cref="TimeSpan" /> representing the amount of time that must be waited before doing another retry.
/// </returns>
/// <exception cref="ArgumentOutOfRangeException">
/// When parameter <paramref name="retryCount" /> is less than 1
/// </exception>
public TimeSpan ComputeSleepDuration(int retryCount)
{
if (retryCount <= 0)
{
throw new ArgumentOutOfRangeException(
nameof(retryCount),
ErrorMessageForPositiveIntegerParameter(nameof(retryCount))
);
}
if (retryCount <= this.NumberOfRetriesWithExponentialSleepDuration)
{
var sleepDurationInMilliseconds = Math.Pow(this.ExponentialRetryBaseInMilliseconds, retryCount - 1);
return TimeSpan.FromMilliseconds(sleepDurationInMilliseconds);
}
return TimeSpan.FromMilliseconds(this.ConstantTimeSleepDurationInMilliseconds);
}
private static bool IsPositiveInteger(int number) => number > 0;
private static string ErrorMessageForPositiveIntegerParameter(string parameterName) =>
FormattableString.Invariant(
$"Parameter '{parameterName}' must be a positive integer"
);
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Types/IntervalTest.cs
using System;
using Deltatre.Utils.Types;
using NUnit.Framework;
namespace Deltatre.Utils.Tests.Types
{
[TestFixture]
public class IntervalTest
{
[Test]
public void Ctor_Should_Throw_When_From_GreaterThan_To()
{
// ARRANGE
var from = Endpoint<int>.Inclusive(2);
var to = Endpoint<int>.Inclusive(1);
// ACT & ASSERT
Assert.Throws<ArgumentException>(() => new Interval<int>(from, to));
}
[Test]
public void Ctor_Should_Correctly_Construct_Range_Object_When_Parameters_Are_Valid()
{
// ARRANGE
var from = Endpoint<int>.Inclusive(1);
var to = Endpoint<int>.Exclusive(2);
// ACT
var result = new Interval<int>(from, to);
// ASSERT
Assert.IsTrue(result.From == from);
Assert.IsTrue(result.To == to);
}
[Test]
public void Equals_Should_Return_False_If_Endpoint_Values_Are_Not_Equal()
{
// ARRANGE
var from = Endpoint<int>.Inclusive(1);
var to_2 = Endpoint<int>.Inclusive(2);
var to_3 = Endpoint<int>.Inclusive(3);
var interval1 = new Interval<int>(from, to_2);
var interval2 = new Interval<int>(from, to_3);
// ACT & ASSERT
Assert.IsFalse(interval1 == interval2);
Assert.IsTrue(interval1 != interval2);
}
[Test]
public void Equals_Should_Return_False_If_Endpoint_Statuses_Are_Not_Equal()
{
// ARRANGE
var from = Endpoint<int>.Inclusive(1);
var to_inclusive = Endpoint<int>.Inclusive(2);
var to_exclusive = Endpoint<int>.Exclusive(2);
var interval1 = new Interval<int>(from, to_inclusive);
var interval2 = new Interval<int>(from, to_exclusive);
// ACT & ASSERT
Assert.IsFalse(interval1 == interval2);
Assert.IsTrue(interval1 != interval2);
}
[Test]
public void Equals_Should_Return_True_If_Both_Intervals_Are_Null()
{
// ARRANGE
Interval<int> interval1 = null;
Interval<int> interval2 = null;
// ACT & ASSERT
Assert.IsTrue(interval1 == interval2);
Assert.IsFalse(interval1 != interval2);
}
[Test]
public void Equals_Should_Return_False_If_One_Interval_is_Null()
{
// ARRANGE
Interval<int> interval1 = null;
var interval2 = new Interval<int>(Endpoint<int>.Inclusive(1), Endpoint<int>.Exclusive(2));
// ACT & ASSERT
Assert.IsFalse(interval1 == interval2);
Assert.IsTrue(interval1 != interval2);
}
[Test]
public void Equals_Should_Return_True_If_Intervals_Are_Equal()
{
// ARRANGE
var interval1 = new Interval<int>(Endpoint<int>.Inclusive(1), Endpoint<int>.Exclusive(2));
var interval2 = new Interval<int>(Endpoint<int>.Inclusive(1), Endpoint<int>.Exclusive(2));
// ACT & ASSERT
Assert.IsTrue(interval1 == interval2);
Assert.IsFalse(interval1 != interval2);
}
[Test]
public void Equals_Should_Return_True_If_Intervals_Are_Both_Empty()
{
// ARRANGE
var interval1 = Interval<int>.Empty;
var interval2 = Interval<int>.Empty;
// ACT & ASSERT
Assert.IsTrue(interval1 == interval2);
Assert.IsFalse(interval1 != interval2);
}
}
}
<file_sep>/src/Deltatre.Utils/Functional/Identity.cs
namespace Deltatre.Utils.Functional
{
/// <summary>
/// A collection of useful functions
/// </summary>
public static partial class Functions
{
/// <summary>
/// An implementation of the mathematical function f(x) = x
/// </summary>
/// <typeparam name="T">The type of the item passed in</typeparam>
/// <param name="item">The item passed in</param>
/// <returns>The same item passed in</returns>
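/// <example>
/// A trivial usage sketch:
/// <code>
/// var result = Functions.Identity(42); // returns 42
/// </code>
/// </example>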
public static T Identity<T>(T item) => item;
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Helpers/ExpandoObjects/ExpandoObjectHelpersTest_ShallowMerge.cs
using NUnit.Framework;
using System;
using System.Collections.Generic;
using System.Dynamic;
using static Deltatre.Utils.Helpers.ExpandoObjects.ExpandoObjectHelpers;
namespace Deltatre.Utils.Tests.Helpers.ExpandoObjects
{
[TestFixture]
public sealed class ExpandoObjectHelpersTest
{
[Test]
public void ShallowMerge_Throws_ArgumentNullException_When_Target_Is_Null()
{
// ARRANGE
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FirstName", "Mario");
sourceAsDict.Add("LastName", "Rossi");
// ACT
var exception = Assert.Throws<ArgumentNullException>(
() => ShallowMerge(null, source)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("target", exception.ParamName);
}
[Test]
public void ShallowMerge_Throws_ArgumentNullException_When_Source_Is_Null()
{
// ARRANGE
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
// ACT
var exception = Assert.Throws<ArgumentNullException>(
() => ShallowMerge(target, null)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("source", exception.ParamName);
}
[Test]
public void ShallowMerge_Returns_Clone_Of_Target_When_Source_Has_No_Properties()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
// ACT
var source = new ExpandoObject();
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(resultAsDict.Count, 4);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.IsTrue(resultAsDict.ContainsKey("Age"));
Assert.IsTrue(resultAsDict.ContainsKey("Contact"));
Assert.AreEqual(resultAsDict["FirstName"], "Mario");
Assert.AreEqual(resultAsDict["LastName"], "Rossi");
Assert.AreEqual(resultAsDict["Age"], 50);
// check contact object
var contactObject = resultAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 4);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Region"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Province"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Italy");
Assert.AreEqual(contactObjectAsDict["Region"], "Lombardia");
Assert.AreEqual(contactObjectAsDict["Province"], "MI");
Assert.AreEqual(contactObjectAsDict["City"], "Milano");
}
[Test]
public void ShallowMerge_Copies_Properties_Of_Source_To_Result_When_Target_Has_No_Properties()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FirstName", "Mario");
sourceAsDict.Add("LastName", "Rossi");
sourceAsDict.Add("Age", 50);
sourceAsDict.Add("Contact", contact);
// ACT
var target = new ExpandoObject();
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(resultAsDict.Count, 4);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.IsTrue(resultAsDict.ContainsKey("Age"));
Assert.IsTrue(resultAsDict.ContainsKey("Contact"));
Assert.AreEqual(resultAsDict["FirstName"], "Mario");
Assert.AreEqual(resultAsDict["LastName"], "Rossi");
Assert.AreEqual(resultAsDict["Age"], 50);
// check contact object
var contactObject = resultAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 4);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Region"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Province"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Italy");
Assert.AreEqual(contactObjectAsDict["Region"], "Lombardia");
Assert.AreEqual(contactObjectAsDict["Province"], "MI");
Assert.AreEqual(contactObjectAsDict["City"], "Milano");
}
[Test]
public void ShallowMerge_Keeps_Properties_Of_Target_Missing_In_Source_Unchanged()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("Age", 45);
sourceAsDict.Add("Contact", contact);
sourceAsDict.Add("Hobbies", new[] { "gaming", "football" });
// ACT
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.GreaterOrEqual(resultAsDict.Count, 2);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.AreEqual(resultAsDict["FirstName"], "Mario");
Assert.AreEqual(resultAsDict["LastName"], "Rossi");
}
[Test]
public void ShallowMerge_Adds_Properties_Of_Source_Missing_In_Target_To_Result()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FirstName", "Mario");
sourceAsDict.Add("LastName", "Rossi");
// ACT
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.GreaterOrEqual(resultAsDict.Count, 2);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.AreEqual(resultAsDict["FirstName"], "Mario");
Assert.AreEqual(resultAsDict["LastName"], "Rossi");
}
[Test]
public void ShallowMerge_Updates_Common_Properties_With_The_Value_Of_Source()
{
// ARRANGE
dynamic targetContact = new ExpandoObject();
targetContact.Country = "Italy";
targetContact.Region = "Lombardia";
targetContact.Province = "MI";
targetContact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", targetContact);
dynamic sourceContact = new ExpandoObject();
sourceContact.Country = "Spain";
sourceContact.City = "Barcellona";
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FirstName", "Giuseppe");
sourceAsDict.Add("LastName", "Verdi");
sourceAsDict.Add("Age", 25);
sourceAsDict.Add("Contact", sourceContact);
// ACT
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(resultAsDict.Count, 4);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.IsTrue(resultAsDict.ContainsKey("Age"));
Assert.IsTrue(resultAsDict.ContainsKey("Contact"));
Assert.AreEqual(resultAsDict["FirstName"], "Giuseppe");
Assert.AreEqual(resultAsDict["LastName"], "Verdi");
Assert.AreEqual(resultAsDict["Age"], 25);
// check contact object
var contactObject = resultAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 2);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Spain");
Assert.AreEqual(contactObjectAsDict["City"], "Barcellona");
}
[Test]
public void ShallowMerge_Does_Not_Change_Common_Properties_Having_Same_Value_In_Target_And_Source()
{
// ARRANGE
dynamic targetContact = new ExpandoObject();
targetContact.Country = "Italy";
targetContact.Region = "Lombardia";
targetContact.Province = "MI";
targetContact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", targetContact);
dynamic sourceContact = new ExpandoObject();
sourceContact.Country = "Spain";
sourceContact.City = "Barcellona";
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FirstName", "Mario");
sourceAsDict.Add("LastName", "Rossi");
sourceAsDict.Add("Age", 25);
sourceAsDict.Add("Contact", sourceContact);
// ACT
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.GreaterOrEqual(resultAsDict.Count, 2);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.AreEqual(resultAsDict["FirstName"], "Mario");
Assert.AreEqual(resultAsDict["LastName"], "Rossi");
}
[Test]
public void ShallowMerge_Executes_First_Level_Merge_Only()
{
// ARRANGE
var nestedObject1 = new ExpandoObject();
var nestedObject1AsDict = (IDictionary<string, object>)nestedObject1;
nestedObject1AsDict.Add("Key1", "Value1");
nestedObject1AsDict.Add("Key2", "Value2");
nestedObject1AsDict.Add("Key3", "Value3");
nestedObject1AsDict.Add("Key5", "Value5");
var nestedObject2 = new ExpandoObject();
var nestedObject2AsDict = (IDictionary<string, object>)nestedObject2;
nestedObject2AsDict.Add("Key2", "Value2");
nestedObject2AsDict.Add("Key3", "Changed");
nestedObject2AsDict.Add("Key4", "Value4");
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FooProperty", nestedObject1);
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FooProperty", nestedObject2);
// ACT
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(resultAsDict.Count, 1);
Assert.IsTrue(resultAsDict.ContainsKey("FooProperty"));
// check nested object
var nestedObject = resultAsDict["FooProperty"];
Assert.IsNotNull(nestedObject);
Assert.IsInstanceOf<ExpandoObject>(nestedObject);
var nestedObjectAsDict = (IDictionary<string, object>)nestedObject;
Assert.AreEqual(3, nestedObjectAsDict.Count);
Assert.IsTrue(nestedObjectAsDict.ContainsKey("Key2"));
Assert.IsTrue(nestedObjectAsDict.ContainsKey("Key3"));
Assert.IsTrue(nestedObjectAsDict.ContainsKey("Key4"));
Assert.AreEqual(nestedObjectAsDict["Key2"], "Value2");
Assert.AreEqual(nestedObjectAsDict["Key3"], "Changed");
Assert.AreEqual(nestedObjectAsDict["Key4"], "Value4");
}
[Test]
public void ShallowMerge_Does_Not_Modify_Target_Parameter()
{
// ARRANGE
dynamic targetContact = new ExpandoObject();
targetContact.Country = "Italy";
targetContact.Region = "Lombardia";
targetContact.Province = "MI";
targetContact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", targetContact);
dynamic otherContact = new ExpandoObject();
otherContact.Country = "Spain";
otherContact.City = "Barcellona";
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FirstName", "Giuseppe");
sourceAsDict.Add("LastName", "Verdi");
sourceAsDict.Add("Age", 25);
sourceAsDict.Add("Contact", otherContact);
// ACT
_ = ShallowMerge(target, source);
// ASSERT
Assert.AreEqual(targetAsDict.Count, 4);
Assert.IsTrue(targetAsDict.ContainsKey("FirstName"));
Assert.IsTrue(targetAsDict.ContainsKey("LastName"));
Assert.IsTrue(targetAsDict.ContainsKey("Age"));
Assert.IsTrue(targetAsDict.ContainsKey("Contact"));
Assert.AreEqual(targetAsDict["FirstName"], "Mario");
Assert.AreEqual(targetAsDict["LastName"], "Rossi");
Assert.AreEqual(targetAsDict["Age"], 50);
// check contact object
var contactObject = targetAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 4);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Region"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Province"));
Assert.AreEqual(contactObjectAsDict["Country"], "Italy");
Assert.AreEqual(contactObjectAsDict["City"], "Milano");
Assert.AreEqual(contactObjectAsDict["Region"], "Lombardia");
Assert.AreEqual(contactObjectAsDict["Province"], "MI");
}
[Test]
public void ShallowMerge_Does_Not_Modify_Source_Parameter()
{
// ARRANGE
dynamic targetContact = new ExpandoObject();
targetContact.Country = "Italy";
targetContact.Region = "Lombardia";
targetContact.Province = "MI";
targetContact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", targetContact);
dynamic otherContact = new ExpandoObject();
otherContact.Country = "Spain";
otherContact.City = "Barcellona";
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict.Add("FirstName", "Giuseppe");
sourceAsDict.Add("LastName", "Verdi");
sourceAsDict.Add("Age", 25);
sourceAsDict.Add("Contact", otherContact);
// ACT
_ = ShallowMerge(target, source);
// ASSERT
Assert.AreEqual(sourceAsDict.Count, 4);
Assert.IsTrue(sourceAsDict.ContainsKey("FirstName"));
Assert.IsTrue(sourceAsDict.ContainsKey("LastName"));
Assert.IsTrue(sourceAsDict.ContainsKey("Age"));
Assert.IsTrue(sourceAsDict.ContainsKey("Contact"));
Assert.AreEqual(sourceAsDict["FirstName"], "Giuseppe");
Assert.AreEqual(sourceAsDict["LastName"], "Verdi");
Assert.AreEqual(sourceAsDict["Age"], 25);
// check contact object
var contactObject = sourceAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 2);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Spain");
Assert.AreEqual(contactObjectAsDict["City"], "Barcellona");
}
[Test]
public void ShallowMerge_Is_Able_To_Shallow_Merge_Deeply_Nested_Objects()
{
// ARRANGE
dynamic targetSecondLevelObject = new ExpandoObject();
targetSecondLevelObject.TargetKey3 = "target value 3";
targetSecondLevelObject.TargetKey4 = "target value 4";
dynamic targetFirstLevelObject = new ExpandoObject();
targetFirstLevelObject.TargetKey1 = "target value 1";
targetFirstLevelObject.TargetKey2 = targetSecondLevelObject;
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict["Key"] = targetFirstLevelObject;
dynamic sourceSecondLevelObject = new ExpandoObject();
sourceSecondLevelObject.SourceKey3 = "source value 3";
sourceSecondLevelObject.SourceKey4 = "source value 4";
dynamic sourceFirstLevelObject = new ExpandoObject();
sourceFirstLevelObject.SourceKey1 = "source value 1";
sourceFirstLevelObject.SourceKey2 = sourceSecondLevelObject;
var source = new ExpandoObject();
var sourceAsDict = (IDictionary<string, object>)source;
sourceAsDict["Key"] = sourceFirstLevelObject;
// ACT
var result = ShallowMerge(target, source);
// ASSERT
Assert.IsNotNull(result);
Assert.AreNotSame(result, target);
Assert.AreNotSame(result, source);
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(1, resultAsDict.Count);
Assert.IsTrue(resultAsDict.ContainsKey("Key"));
var firstLevel = resultAsDict["Key"];
Assert.IsNotNull(firstLevel);
Assert.IsInstanceOf<ExpandoObject>(firstLevel);
var firstLevelAsDict = (IDictionary<string, object>)firstLevel;
Assert.AreEqual(2, firstLevelAsDict.Count);
Assert.IsTrue(firstLevelAsDict.ContainsKey("SourceKey1"));
Assert.IsTrue(firstLevelAsDict.ContainsKey("SourceKey2"));
Assert.AreEqual(firstLevelAsDict["SourceKey1"], "source value 1");
var secondLevel = firstLevelAsDict["SourceKey2"];
Assert.IsNotNull(secondLevel);
Assert.IsInstanceOf<ExpandoObject>(secondLevel);
var secondLevelAsDict = (IDictionary<string, object>)secondLevel;
Assert.AreEqual(2, secondLevelAsDict.Count);
Assert.IsTrue(secondLevelAsDict.ContainsKey("SourceKey3"));
Assert.IsTrue(secondLevelAsDict.ContainsKey("SourceKey4"));
Assert.AreEqual(secondLevelAsDict["SourceKey3"], "source value 3");
Assert.AreEqual(secondLevelAsDict["SourceKey4"], "source value 4");
// check object references
Assert.AreSame(sourceSecondLevelObject, secondLevel);
Assert.AreSame(sourceFirstLevelObject, firstLevel);
}
}
}
<file_sep>/src/Deltatre.Utils/Reflection/ReflectionHelpers.cs
using System;
using System.Collections.ObjectModel;
using System.IO;
using System.Linq;
using System.Reflection;
namespace Deltatre.Utils.Reflection
{
/// <summary>
/// A collection of helper methods for common reflection tasks
/// </summary>
public static class ReflectionHelpers
{
private const string OnlyFilesWithDllExtensionPattern = "*.dll";
/// <summary>
/// Checks whether a given instance of <see cref="System.Type"/> represents a decorator for the abstraction <typeparamref name="TDecoratee"/>.
/// A decorator for <typeparamref name="TDecoratee"/> is a non abstract type which implements the abstraction <typeparamref name="TDecoratee"/> and has a single public constructor which requires an instance of <typeparamref name="TDecoratee"/>.
/// If there are multiple public constructors false is returned.
/// </summary>
/// <typeparam name="TDecoratee">The type being decorated</typeparam>
/// <param name="type">The type being checked in order to determine whether it is a decorator for type <typeparamref name="TDecoratee"/></param>
/// <returns>True if <paramref name="type"/> is a decorator for <typeparamref name="TDecoratee"/></returns>
/// <exception cref="ArgumentNullException">Throws <see cref="ArgumentNullException"/> when parameter <paramref name="type"/> is null</exception>
public static bool IsDecoratorFor<TDecoratee>(Type type) where TDecoratee : class
{
if (type == null)
throw new ArgumentNullException(nameof(type));
var isNonAbstractType = !type.IsAbstract;
var implementsAbstraction = typeof(TDecoratee).IsAssignableFrom(type);
if (!isNonAbstractType || !implementsAbstraction)
{
return false;
}
var publicConstructors = type.GetConstructors();
var hasNoPublicConstructors = publicConstructors.Length == 0;
var hasMultiplePublicConstructors = publicConstructors.Length > 1;
if (hasNoPublicConstructors || hasMultiplePublicConstructors)
{
return false;
}
var dependsOnAbstraction = publicConstructors[0].GetParameters().Any(p => p.ParameterType == typeof(TDecoratee));
return dependsOnAbstraction;
}
/// <summary>
/// Gets all assemblies in the binaries folder matching the specified search pattern.
/// </summary>
/// <param name="runningApplicationAssembly">The assembly of the currently running application. This is the application for which you want to query the binaries folder.</param>
/// <param name="searchPattern">
/// The search pattern to be used to filter the returned assemblies.
/// This parameter can contain a combination of valid literal path and wildcard (* and ?) characters,
/// but it doesn't support regular expressions. The only allowed file extensions are .dll and .exe (because these are the only file extensions of .NET assemblies).
/// If you pass a string null or white space all the files in the binaries folder with a .dll extension will be returned.
/// </param>
/// <param name="searchOption">
/// One of the enumeration values that specifies whether the search operation should include
/// all subdirectories or only the current directory.
/// </param>
/// <exception cref="ArgumentNullException">Throws <see cref="ArgumentNullException"/> when <paramref name="runningApplicationAssembly"/> is null</exception>
/// <returns>All the assemblies in the binaries folder matching the specified search pattern.</returns>
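/// <example>
/// A minimal sketch (the search pattern is just an illustrative value):
/// <code>
/// var assemblies = ReflectionHelpers.LoadAssembliesFromBinariesFolder(
/// Assembly.GetExecutingAssembly(),
/// searchPattern: "*.dll");
/// </code>
/// </example>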
public static ReadOnlyCollection<Assembly> LoadAssembliesFromBinariesFolder(
Assembly runningApplicationAssembly,
string searchPattern = null,
SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
if (runningApplicationAssembly == null)
throw new ArgumentNullException(nameof(runningApplicationAssembly));
var runningApplicationAssemblyFullPath = runningApplicationAssembly.Location;
var binariesFolderFullPath = Path.GetDirectoryName(runningApplicationAssemblyFullPath);
var normalizedSearchPattern = string.IsNullOrWhiteSpace(searchPattern) ?
OnlyFilesWithDllExtensionPattern :
searchPattern;
var assembliesFileNames = Directory.GetFiles(
binariesFolderFullPath,
normalizedSearchPattern,
searchOption);
return assembliesFileNames.Select(Assembly.LoadFile).ToList().AsReadOnly();
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Concurrency/Extensions/EnumerableExtensionsTest_Helpers.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Deltatre.Utils.Tests.Concurrency.Extensions
{
public partial class EnumerableExtensionsTest
{
private static bool AreOverlapping(
(DateTime start, DateTime end) first,
(DateTime start, DateTime end) second)
{
return first.end > second.start && second.end > first.start;
}
private static IEnumerable<T> GetOthers<T>(IEnumerable<T> items, int index)
{
var numberOfElementsBefore = index;
var itemsToSkip = index + 1;
return items.Take(numberOfElementsBefore).Concat(items.Skip(itemsToSkip));
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Resilience/CappedExponentialRetryThenConstantTimeSleepsTest_Ctor.cs
using Deltatre.Utils.Resilience;
using NUnit.Framework;
using System;
namespace Deltatre.Utils.Tests.Resilience
{
[TestFixture]
public sealed partial class CappedExponentialRetryThenConstantTimeSleepsTest
{
[Test]
public void Ctor_Creates_New_Instance()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
// ACT
var result = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ASSERT
Assert.IsNotNull(result);
}
[Test]
public void Ctor_Set_Value_To_NumberOfRetriesWithExponentialSleepDuration_Property()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
// ACT
var result = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ASSERT
Assert.AreEqual(numberOfRetriesWithExponentialSleepDuration, result.NumberOfRetriesWithExponentialSleepDuration);
}
[Test]
public void Ctor_Set_Value_To_ExponentialRetryBaseInMilliseconds_Property()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
// ACT
var result = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ASSERT
Assert.AreEqual(exponentialRetryBaseInMilliseconds, result.ExponentialRetryBaseInMilliseconds);
}
[Test]
public void Ctor_Set_Value_To_ConstantTimeSleepDurationInMilliseconds_Property()
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
// ACT
var result = new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds);
// ASSERT
Assert.AreEqual(constantTimeSleepDurationInMilliseconds, result.ConstantTimeSleepDurationInMilliseconds);
}
[TestCase(-3)]
[TestCase(0)]
public void Ctor_Throws_ArgumentOutOfRangeException_When_NumberOfRetriesWithExponentialSleepDuration_Is_Not_Positive_Integer(
int numberOfRetriesWithExponentialSleepDuration)
{
// ARRANGE
const int exponentialRetryBaseInMilliseconds = 2_000;
const int constantTimeSleepDurationInMilliseconds = 5_000;
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("numberOfRetriesWithExponentialSleepDuration", exception.ParamName);
}
[TestCase(-3)]
[TestCase(0)]
public void Ctor_Throws_ArgumentOutOfRangeException_When_ExponentialRetryBaseInMilliseconds_Is_Not_Positive_Integer(
int exponentialRetryBaseInMilliseconds)
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int constantTimeSleepDurationInMilliseconds = 5_000;
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("exponentialRetryBaseInMilliseconds", exception.ParamName);
}
[TestCase(-3)]
[TestCase(0)]
public void Ctor_Throws_ArgumentOutOfRangeException_When_ConstantTimeSleepDurationInMilliseconds_Is_Not_Positive_Integer(
int constantTimeSleepDurationInMilliseconds)
{
// ARRANGE
const int numberOfRetriesWithExponentialSleepDuration = 3;
const int exponentialRetryBaseInMilliseconds = 2_000;
// ACT
var exception = Assert.Throws<ArgumentOutOfRangeException>(
() => new CappedExponentialRetryThenConstantTimeSleeps(
numberOfRetriesWithExponentialSleepDuration,
exponentialRetryBaseInMilliseconds,
constantTimeSleepDurationInMilliseconds)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("constantTimeSleepDurationInMilliseconds", exception.ParamName);
}
}
}
<file_sep>/src/Deltatre.Utils/Resilience/PassThroughRetryStrategy.cs
using System;
namespace Deltatre.Utils.Resilience
{
/// <summary>
/// An implementation of <see cref="IRetryStrategy"/> having <see cref="IRetryStrategy.MaxRetryCount"/> equal to zero (0).
/// This retry strategy acts as a pass-through for the provided operation and doesn't perform any retry in case of errors.
/// </summary>
/// <remarks>
/// This is typically used in unit tests and whenever a placeholder for an instance of <see cref="IRetryStrategy"/> is needed.
/// </remarks>
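/// <example>
/// A minimal sketch (purely illustrative):
/// <code>
/// IRetryStrategy strategy = PassThroughRetryStrategy.Instance;
/// var allowedRetries = strategy.MaxRetryCount; // always 0
/// // calling strategy.ComputeSleepDuration(1) would throw InvalidOperationException
/// </code>
/// </example>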
public sealed class PassThroughRetryStrategy : IRetryStrategy
{
/// <summary>
/// Represents the pass-through retry strategy. This field is read-only.
/// </summary>
public static readonly PassThroughRetryStrategy Instance = new PassThroughRetryStrategy();
private PassThroughRetryStrategy() { }
/// <summary>
/// Gets the number of allowed retries.
/// </summary>
/// <remarks>
/// The number of allowed retries for <see cref="PassThroughRetryStrategy"/> is always zero (0).
/// </remarks>
public int MaxRetryCount => 0;
/// <summary>
/// Computes the amount of time that must be waited before doing another retry.
/// </summary>
/// <remarks>
/// Invoking <see cref="IRetryStrategy.ComputeSleepDuration(int)"/> always throws <see cref="InvalidOperationException"/>
/// for <see cref="PassThroughRetryStrategy"/>.
/// </remarks>
/// <param name="retryCount">
/// An incremental integer value which acts as a retry counter.
/// This counter is always a positive integer number.
/// It takes the value 1 after the first attempt of performing the operation, before the first retry.
/// It takes the value 2 after the first retry, before the second retry.
/// It takes the value 3 after the second retry, before the third retry and so on.
/// </param>
/// <returns>
/// A <see cref="TimeSpan" /> representing the amount of time that must be waited before doing another retry.
/// </returns>
/// <exception cref="InvalidOperationException">
/// Invoking this method always throws <see cref="InvalidOperationException"/>
/// </exception>
public TimeSpan ComputeSleepDuration(int retryCount)
{
var message = FormattableString.Invariant(
$"Retries are not allowed when using the {nameof(PassThroughRetryStrategy)} retry strategy."
);
throw new InvalidOperationException(message);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Functional/HelpersTest.cs
using System;
using NUnit.Framework;
using static Deltatre.Utils.Functional.Helpers;
namespace Deltatre.Utils.Tests.Functional
{
[TestFixture]
public class HelpersTest
{
[Test]
public void GetFirst_WithProjection_Throws_When_Source_Is_Null()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
Func<string, int> toLength = item => item.Length;
// ACT
Assert.Throws<ArgumentNullException>(() => GetFirst(null, startsWithLetterMCaseInsensitive, toLength));
}
[Test]
public void GetFirst_WithProjection_Throws_When_Predicate_Is_Null()
{
// ARRANGE
var source = new[] { "foo", "bar" };
Func<string, int> toLength = item => item.Length;
// ACT
Assert.Throws<ArgumentNullException>(() => GetFirst(source, null, toLength));
}
[Test]
public void GetFirst_WithProjection_Throws_When_Projection_Is_Null()
{
// ARRANGE
var source = new[] { "foo", "bar" };
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
Func<string, int> toLength = null;
// ACT
Assert.Throws<ArgumentNullException>(() => GetFirst(source, startsWithLetterMCaseInsensitive, toLength));
}
[Test]
public void GetFirst_WithProjection_Returns_Result_For_Item_Not_Found_When_None_Of_Items_Satisfies_Predicate()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
Func<string, int> toLength = item => item.Length;
var source = new[] { "foo", "bar" };
// ACT
var result = GetFirst(source, startsWithLetterMCaseInsensitive, toLength);
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsSuccess);
}
[Test]
public void GetFirst_WithProjection_Returns_Result_For_Item_Found_When_One_Of_Items_Satisfies_Predicate()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
Func<string, int> toLength = item => item.Length;
var source = new[] { "foo", "minimum" };
// ACT
var result = GetFirst(source, startsWithLetterMCaseInsensitive, toLength);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(7, result.Output);
}
[Test]
public void GetFirst_WithProjection_Returns_Result_For_First_Item_Found_When_Many_Of_Items_Satisfy_Predicate()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
Func<string, int> toLength = item => item.Length;
var source = new[] { "fooz", "minimum", "Max" };
// ACT
var result = GetFirst(source, startsWithLetterMCaseInsensitive, toLength);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual(7, result.Output);
}
[Test]
public void GetFirst_Throws_When_Source_Is_Null()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
// ACT
Assert.Throws<ArgumentNullException>(() => GetFirst(null, startsWithLetterMCaseInsensitive));
}
[Test]
public void GetFirst_Throws_When_Predicate_Is_Null()
{
// ARRANGE
var source = new[] { "foo", "bar" };
// ACT
Assert.Throws<ArgumentNullException>(() => GetFirst(source, null));
}
[Test]
public void GetFirst_Returns_Result_For_Item_Not_Found_When_None_Of_Items_Satisfies_Predicate()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
var source = new[] { "foo", "bar" };
// ACT
var result = GetFirst(source, startsWithLetterMCaseInsensitive);
// ASSERT
Assert.IsNotNull(result);
Assert.IsFalse(result.IsSuccess);
}
[Test]
public void GetFirst_Returns_Result_For_Item_Found_When_One_Of_Items_Satisfies_Predicate()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
var source = new[] { "foo", "minimum" };
// ACT
var result = GetFirst(source, startsWithLetterMCaseInsensitive);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual("minimum", result.Output);
}
[Test]
public void GetFirst_Returns_Result_For_First_Item_Found_When_Many_Of_Items_Satisfy_Predicate()
{
// ARRANGE
Func<string, bool> startsWithLetterMCaseInsensitive = item =>
item != null
&& item.StartsWith("M", StringComparison.InvariantCultureIgnoreCase);
var source = new[] { "fooz", "minimum", "Max" };
// ACT
var result = GetFirst(source, startsWithLetterMCaseInsensitive);
// ASSERT
Assert.IsNotNull(result);
Assert.IsTrue(result.IsSuccess);
Assert.AreEqual("minimum", result.Output);
}
}
}
<file_sep>/tests/Deltatre.Utils.Tests/Extensions/ExpandoObjects/ExpandoObjectExtensionsTest_ShallowClone.cs
using Deltatre.Utils.Extensions.ExpandoObjects;
using NUnit.Framework;
using System;
using System.Collections.Generic;
using System.Dynamic;
namespace Deltatre.Utils.Tests.Extensions.ExpandoObjects
{
[TestFixture]
public sealed partial class ExpandoObjectExtensionsTest
{
[Test]
public void ShallowClone_Throws_ArgumentNullException_When_Source_Is_Null()
{
// ACT
var exception = Assert.Throws<ArgumentNullException>(
() => ExpandoObjectExtensions.ShallowClone(null)
);
// ASSERT
Assert.IsNotNull(exception);
Assert.AreEqual("source", exception.ParamName);
}
[Test]
public void ShallowClone_Returns_Non_Null_Result()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
// ACT
var result = target.ShallowClone();
// ASSERT
Assert.IsNotNull(result);
}
[Test]
public void ShallowClone_Returns_Fresh_New_Object()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
// ACT
var result = target.ShallowClone();
// ASSERT
Assert.AreNotSame(result, target);
}
[Test]
public void ShallowClone_Returns_An_Object_Having_Same_Properties_As_Source()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
// ACT
var result = target.ShallowClone();
// ASSERT
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(resultAsDict.Count, 4);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.IsTrue(resultAsDict.ContainsKey("Age"));
Assert.IsTrue(resultAsDict.ContainsKey("Contact"));
Assert.AreEqual(resultAsDict["FirstName"], "Mario");
Assert.AreEqual(resultAsDict["LastName"], "Rossi");
Assert.AreEqual(resultAsDict["Age"], 50);
// check contact object
var contactObject = resultAsDict["Contact"];
Assert.IsNotNull(contactObject);
Assert.IsInstanceOf<ExpandoObject>(contactObject);
var contactObjectAsDict = (IDictionary<string, object>)contactObject;
Assert.AreEqual(contactObjectAsDict.Count, 4);
Assert.IsTrue(contactObjectAsDict.ContainsKey("Country"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Region"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("Province"));
Assert.IsTrue(contactObjectAsDict.ContainsKey("City"));
Assert.AreEqual(contactObjectAsDict["Country"], "Italy");
Assert.AreEqual(contactObjectAsDict["Region"], "Lombardia");
Assert.AreEqual(contactObjectAsDict["Province"], "MI");
Assert.AreEqual(contactObjectAsDict["City"], "Milano");
}
[Test]
public void ShallowClone_Returns_Shallow_Clone_Of_Source()
{
// ARRANGE
dynamic contact = new ExpandoObject();
contact.Country = "Italy";
contact.Region = "Lombardia";
contact.Province = "MI";
contact.City = "Milano";
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict.Add("FirstName", "Mario");
targetAsDict.Add("LastName", "Rossi");
targetAsDict.Add("Age", 50);
targetAsDict.Add("Contact", contact);
// ACT
var result = target.ShallowClone();
// ASSERT
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(resultAsDict.Count, 4);
Assert.IsTrue(resultAsDict.ContainsKey("FirstName"));
Assert.IsTrue(resultAsDict.ContainsKey("LastName"));
Assert.IsTrue(resultAsDict.ContainsKey("Contact"));
Assert.IsTrue(resultAsDict.ContainsKey("Age"));
Assert.AreSame(resultAsDict["FirstName"], targetAsDict["FirstName"]);
Assert.AreSame(resultAsDict["LastName"], targetAsDict["LastName"]);
Assert.AreSame(resultAsDict["Contact"], targetAsDict["Contact"]);
Assert.AreSame(resultAsDict["Age"], targetAsDict["Age"]);
}
[Test]
public void ShallowClone_Is_Able_To_Clone_Source_Having_Deeply_Nested_Properties()
{
// ARRANGE
dynamic secondLevelObject = new ExpandoObject();
secondLevelObject.Key3 = "Value3";
secondLevelObject.Key4 = "Value4";
dynamic firstLevelObject = new ExpandoObject();
firstLevelObject.Key1 = "Value1";
firstLevelObject.Key2 = secondLevelObject;
var target = new ExpandoObject();
var targetAsDict = (IDictionary<string, object>)target;
targetAsDict["key"] = firstLevelObject;
// ACT
var result = target.ShallowClone();
// ASSERT
var resultAsDict = (IDictionary<string, object>)result;
Assert.AreEqual(resultAsDict.Count, 1);
Assert.IsTrue(resultAsDict.ContainsKey("key"));
var firstLevel = resultAsDict["key"];
Assert.IsNotNull(firstLevel);
Assert.IsInstanceOf<ExpandoObject>(firstLevel);
var firstLevelAsDict = (IDictionary<string, object>)firstLevel;
Assert.AreEqual(2, firstLevelAsDict.Count);
Assert.IsTrue(firstLevelAsDict.ContainsKey("Key1"));
Assert.IsTrue(firstLevelAsDict.ContainsKey("Key2"));
Assert.AreEqual("Value1", firstLevelAsDict["Key1"]);
var secondLevel = firstLevelAsDict["Key2"];
Assert.IsNotNull(secondLevel);
Assert.IsInstanceOf<ExpandoObject>(secondLevel);
var secondLevelAsDict = (IDictionary<string, object>)secondLevel;
Assert.AreEqual(2, secondLevelAsDict.Count);
Assert.IsTrue(secondLevelAsDict.ContainsKey("Key3"));
Assert.IsTrue(secondLevelAsDict.ContainsKey("Key4"));
Assert.AreEqual("Value3", secondLevelAsDict["Key3"]);
Assert.AreEqual("Value4", secondLevelAsDict["Key4"]);
// check object references
Assert.AreSame(secondLevelObject, secondLevel);
Assert.AreSame(firstLevelObject, firstLevel);
}
}
}
<file_sep>/src/Deltatre.Utils/Mocking/MachineSystemClock.cs
using System;
namespace Deltatre.Utils.Mocking
{
/// <summary>
/// Concrete implementation of ISystemClock based on your machine's system clock.
/// </summary>
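/// <example>
/// A minimal sketch (typically an instance is injected wherever an ISystemClock is required):
/// <code>
/// ISystemClock clock = new MachineSystemClock();
/// var utcTimestamp = clock.UtcNow;
/// </code>
/// </example>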
public sealed class MachineSystemClock: ISystemClock
{
/// <summary>
/// The current date and time on this computer expressed as the local time
/// </summary>
public DateTime Now => DateTime.Now;
/// <summary>
/// The current date and time on this computer expressed as UTC time
/// </summary>
public DateTime UtcNow => DateTime.UtcNow;
/// <summary>
/// Gets a <see cref="DateTimeOffset" /> object that is set to the current date and time on the current computer,
/// with the offset set to the local time's offset from Coordinated Universal Time (UTC).
/// </summary>
public DateTimeOffset OffsetNow => DateTimeOffset.Now;
/// <summary>
/// Gets a <see cref="DateTimeOffset" /> object whose date and time are set to the current Coordinated Universal Time (UTC) date and time
/// and whose offset is set to <see cref="TimeSpan.Zero" />
/// </summary>
public DateTimeOffset OffsetUtcNow => DateTimeOffset.UtcNow;
}
}
<file_sep>/src/Deltatre.Utils/Dto/Maybe.cs
using System;
namespace Deltatre.Utils.Dto
{
/// <summary>
/// This type is a wrapper around a value.
/// The wrapper can have a value inside it or it can be empty.
/// The wrapped value can be read if and only if it is
/// available inside the wrapper.
/// The wrapper can be used as a placeholder for the value in all the scenarios where the value
/// itself can possibly be missing.
/// </summary>
/// <remarks>
/// A typical use case for this type is a method which fetches an item from a database.
/// The searched item can be available or not inside the database.
/// You can use Maybe as the return type for the method.
/// </remarks>
/// <typeparam name="T">
/// The type of the wrapped value.
/// </typeparam>
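/// <example>
/// A minimal sketch, assuming a hypothetical FindUserName method returning a Maybe wrapping a string:
/// <code>
/// var maybeName = FindUserName(42);
/// var name = maybeName.HasValue ? maybeName.Value : "unknown";
/// </code>
/// </example>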
public sealed class Maybe<T>
{
/// <summary>
/// Represents the empty <see cref="Maybe{T}"/> instance.
/// This field is read-only.
/// </summary>
/// <remarks>
/// Use this field each time you want to return an empty <see cref="Maybe{T}"/> instance.
/// </remarks>
public static readonly Maybe<T> None = new Maybe<T>(default, false);
private readonly T _value;
/// <summary>
/// Initializes a new instance of <see cref="Maybe{T}"/> class.
/// </summary>
/// <param name="value">
/// The value wrapped by the <see cref="Maybe{T}"/> instance.
/// </param>
/// <remarks>
/// By using this constructor you always get a non empty <see cref="Maybe{T}"/> instance.
/// </remarks>
public Maybe(T value): this(value, true)
{
}
private Maybe(T value, bool hasValue)
{
_value = value;
HasValue = hasValue;
}
/// <summary>
/// Gets a flag indicating whether the current instance is non empty.
/// An instance of <see cref="Maybe{T}"/> is non empty if and only if
/// it actually wraps a value.
/// </summary>
public bool HasValue { get; }
/// <summary>
/// Gets the wrapped value.
/// </summary>
/// <exception cref="InvalidOperationException">
/// When the current instance is an empty instance.
/// </exception>
/// <remarks>
/// Reading the wrapped value is allowed if and only if the current instance is non empty.
/// When trying to read the wrapped value from an empty instance an <see cref="InvalidOperationException"/>
/// is thrown.
/// </remarks>
public T Value
{
get
{
if (!this.HasValue)
{
throw new InvalidOperationException("Accessing the value of an empty Maybe instance is not allowed");
}
return this._value;
}
}
}
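/// <summary>
/// Illustrative sketch, not part of the library: a hypothetical lookup that returns
/// <see cref="Maybe{T}"/> instead of null when the requested item may be missing.
/// The method names and sample data are assumptions made for this example only.
/// </summary>
public static class MaybeUsageExample
{
    /// <summary>Returns the display name for a known identifier, or the empty instance when the identifier is unknown.</summary>
    public static Maybe<string> FindDisplayName(int id) =>
        id == 1
            ? new Maybe<string>("Administrator")
            : Maybe<string>.None;
    /// <summary>Shows the caller-side pattern: check <see cref="Maybe{T}.HasValue"/> before reading <see cref="Maybe{T}.Value"/>.</summary>
    public static string Describe(int id)
    {
        var result = FindDisplayName(id);
        return result.HasValue ? result.Value : "(not found)";
    }
}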
}
<file_sep>/src/Deltatre.Utils/Types/NonEmptySequence.cs
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
namespace Deltatre.Utils.Types
{
/// <inheritdoc />
/// <summary>
/// This type represents a non empty sequence of items
/// </summary>
/// <typeparam name="T">The type of items contained into the sequence</typeparam>
public sealed class NonEmptySequence<T>: IEnumerable<T>
{
private readonly T[] _items;
/// <summary>
/// Initializes a new instance of the <see cref="NonEmptySequence{T}"/> class.
/// </summary>
/// <param name="items">The items contained into the newly created instance of <see cref="NonEmptySequence{T}"/>.</param>
/// <exception cref="ArgumentNullException">Throws <see cref="ArgumentNullException"/> when <paramref name="items"/> is null</exception>
/// <exception cref="ArgumentException">Throws <see cref="ArgumentException"/> when <paramref name="items"/> is an empty sequence</exception>
public NonEmptySequence(IEnumerable<T> items)
{
if (items == null)
throw new ArgumentNullException(nameof(items));
var itemsArray = items.ToArray();
if (itemsArray.Length == 0)
throw new ArgumentException($"Parameter '{nameof(items)}' cannot be an empty sequence.", nameof(items));
_items = itemsArray;
}
IEnumerator<T> IEnumerable<T>.GetEnumerator() => ((IEnumerable<T>) _items).GetEnumerator();
IEnumerator IEnumerable.GetEnumerator() => _items.GetEnumerator();
}
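/// <summary>
/// Illustrative sketch, not part of the library: basic usage of <see cref="NonEmptySequence{T}"/>,
/// showing that the "at least one item" invariant is enforced at construction time.
/// The method below is an assumption made for this example only.
/// </summary>
public static class NonEmptySequenceUsageExample
{
    /// <summary>Builds a valid sequence, shows that an empty input is rejected, and counts the items via LINQ.</summary>
    public static int CountTags()
    {
        var tags = new NonEmptySequence<string>(new[] { "sport", "news" });
        try
        {
            // This construction fails: the type guarantees the sequence is never empty.
            _ = new NonEmptySequence<string>(Enumerable.Empty<string>());
        }
        catch (ArgumentException)
        {
            // Expected: empty input violates the invariant.
        }
        return tags.Count(); // works because NonEmptySequence<T> implements IEnumerable<T>
    }
}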
}
<file_sep>/src/Deltatre.Utils/Resilience/IRetryStrategy.cs
using System;
namespace Deltatre.Utils.Resilience
{
/// <summary>
/// Describes a strategy to perform retries over a failing operation.
/// </summary>
public interface IRetryStrategy
{
/// <summary>
/// Gets the number of allowed retries.
/// </summary>
/// <remarks>
/// The number of allowed retries for a retry strategy is a non negative integer number.
/// When this number is zero the retry strategy is basically a pass-through which simply executes the provided operation
/// without attempting any retry in case of errors.
/// When this number is greater than zero the retry strategy retries the provided operation in case of errors.
/// </remarks>
int MaxRetryCount { get; }
/// <summary>
/// Computes the amount of time that must be waited before doing another retry.
/// </summary>
/// <remarks>
/// Invoking this method always throws an <see cref="InvalidOperationException"/> exception for a retry strategy
/// having <see cref="MaxRetryCount"/> equal to zero (0).
/// A retry strategy having <see cref="MaxRetryCount"/> equal to zero (0) is a pass-through
/// which simply executes the provided operation and doesn't perform any retry in case of errors.
/// </remarks>
/// <param name="retryCount">
/// An incremental integer value which acts as a retry counter.
/// This counter is always a positive integer number.
/// It takes the value 1 after the first attempt of performing the operation, before the first retry.
/// It takes the value 2 after the first retry, before the second retry.
/// It takes the value 3 after the second retry, before the third retry and so on.
/// </param>
/// <returns>
/// A <see cref="TimeSpan"/> representing the amount of time that must be waited before doing another retry.
/// </returns>
/// <exception cref="InvalidOperationException">
/// When <see cref="MaxRetryCount"/> is zero (0). In that case the retry strategy is simply a pass-through for the provided
/// operation and no retries will be applied in case of errors, so computing the sleep duration makes no sense.
/// </exception>
/// <exception cref="ArgumentOutOfRangeException">
/// This exception is thrown by retry strategies having <see cref="MaxRetryCount"/> greater than zero (0)
/// when parameter <paramref name="retryCount"/> is less than 1 or greater than <see cref="MaxRetryCount"/>.
/// </exception>
TimeSpan ComputeSleepDuration(int retryCount);
}
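/// <summary>
/// Illustrative sketch, not part of the library: a minimal constant-backoff implementation
/// of <see cref="IRetryStrategy"/> that follows the contract documented above.
/// The type name is an assumption made for this example only.
/// </summary>
public sealed class ConstantBackoffRetryStrategyExample : IRetryStrategy
{
    private readonly TimeSpan _sleepDuration;
    /// <summary>Creates the strategy with the number of allowed retries and the fixed delay between retries.</summary>
    public ConstantBackoffRetryStrategyExample(int maxRetryCount, TimeSpan sleepDuration)
    {
        if (maxRetryCount < 0)
            throw new ArgumentOutOfRangeException(nameof(maxRetryCount), "The number of allowed retries cannot be negative.");
        MaxRetryCount = maxRetryCount;
        _sleepDuration = sleepDuration;
    }
    /// <summary>Gets the number of allowed retries.</summary>
    public int MaxRetryCount { get; }
    /// <summary>Returns the fixed delay, enforcing the exceptions described by the interface contract.</summary>
    public TimeSpan ComputeSleepDuration(int retryCount)
    {
        if (MaxRetryCount == 0)
            throw new InvalidOperationException("This strategy performs no retries, so there is no sleep duration to compute.");
        if (retryCount < 1 || retryCount > MaxRetryCount)
            throw new ArgumentOutOfRangeException(nameof(retryCount), $"Parameter '{nameof(retryCount)}' must be between 1 and {MaxRetryCount}.");
        return _sleepDuration;
    }
}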
}
<file_sep>/src/Deltatre.Utils/Types/NonEmptyList.cs
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
namespace Deltatre.Utils.Types
{
/// <inheritdoc />
/// <summary>
/// This type represents a non empty and read-only list of items.
/// </summary>
/// <typeparam name="T">The type of items contained into the sequence</typeparam>
public sealed class NonEmptyList<T> : IReadOnlyList<T>
{
private readonly List<T> _items;
/// <summary>
/// Initializes a new instance of the <see cref="NonEmptyList{T}"/> class.
/// </summary>
/// <param name="items">The items contained into the newly created instance of <see cref="NonEmptyList{T}"/>.</param>
/// <exception cref="ArgumentNullException">Throws <see cref="ArgumentNullException"/> when <paramref name="items"/> is null</exception>
/// <exception cref="ArgumentException">Throws <see cref="ArgumentException"/> when <paramref name="items"/> is an empty sequence</exception>
public NonEmptyList(IEnumerable<T> items)
{
if (items == null)
throw new ArgumentNullException(nameof(items));
var itemsList = items.ToList();
if (itemsList.Count == 0)
throw new ArgumentException($"Parameter '{nameof(items)}' cannot be an empty sequence.", nameof(items));
_items = itemsList;
}
/// <summary>Gets the element at the specified index in the read-only list.</summary>
/// <param name="index">The zero-based index of the element to get.</param>
/// <returns>The element at the specified index in the read-only list.</returns>
public T this[int index] => _items[index];
/// <summary>
/// Gets the number of elements in the collection.
/// </summary>
public int Count => _items.Count;
IEnumerator<T> IEnumerable<T>.GetEnumerator() => ((IEnumerable<T>)_items).GetEnumerator();
IEnumerator IEnumerable.GetEnumerator() => ((IEnumerable)_items).GetEnumerator();
}
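/// <summary>
/// Illustrative sketch, not part of the library: basic usage of <see cref="NonEmptyList{T}"/>.
/// The method and sample data are assumptions made for this example only.
/// </summary>
public static class NonEmptyListUsageExample
{
    /// <summary>Reads the first element, which is always safe because <see cref="NonEmptyList{T}.Count"/> is at least 1 by construction.</summary>
    public static string FirstEditor()
    {
        var editors = new NonEmptyList<string>(new[] { "Anna", "Bruno" });
        return editors[0];
    }
}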
}
| e5b1d4174142e71226f57fec59fa205099a4b6f4 | ["Markdown", "C#"] | 82 | C# | deltatre-webplu/deltatre.utils | 3e210d249adaa7b943bb4818ea35a3b9836e0024 | 41f36b71397adeaafb12737cbe34ac3d98913116 |