Contributing to APScheduler
===========================
.. highlight:: bash
If you wish to contribute a fix or feature to APScheduler, please follow these
guidelines.
When you make a pull request against the main APScheduler codebase, GitHub runs the
test suite against your modified code. Before making a pull request, you should ensure
that the modified code passes tests and code quality checks locally.
Running the test suite
----------------------
The test suite has dependencies on several external services, such as database servers.
To make this easy for the developer, a `docker compose`_ configuration is provided.
To use it, you need Docker_ (or a suitable replacement). On Linux, unless you're using
Docker Desktop, you may need to also install the compose (v2) plugin (named
``docker-compose-plugin``, or similar) separately.
Once you have the necessary tools installed, you can start the services with this
command::

    docker compose up -d
You can run the test suite in two ways: either with tox_, or by running pytest_
directly. To run tox_ against all supported Python versions present on your system::

    tox
Tox will handle the installation of dependencies in separate virtual environments.
To pass arguments to the underlying pytest_ command, you can add them after ``--``, like
this::

    tox -- -k somekeyword
To use pytest directly, you can set up a virtual environment and install the project in
development mode along with its test dependencies (virtualenv activation demonstrated
for Linux and macOS; on Windows you need ``venv\Scripts\activate`` instead)::

    python -m venv venv
    source venv/bin/activate
    pip install -e .[test]
Now you can just run pytest_::

    pytest
Building the documentation
--------------------------
To build the documentation, run ``tox -e docs``. This will place the documentation in
``build/sphinx/html`` where you can open ``index.html`` to view the formatted
documentation.
APScheduler uses ReadTheDocs_ to automatically build the documentation so the above
procedure is only necessary if you are modifying the documentation and wish to check the
results before committing.
APScheduler uses pre-commit_ to perform several code style/quality checks. It is
recommended to activate pre-commit_ on your local clone of the repository (using
``pre-commit install``) to ensure that your changes will pass the same checks on GitHub.
Making a pull request on GitHub
-------------------------------
To get your changes merged to the main codebase, you need a GitHub account.
#. Fork the repository (if you don't have your own fork of it yet) by navigating to the
   `main APScheduler repository`_ and clicking on "Fork" near the top right corner.
#. Clone the forked repository to your local machine with
   ``git clone git@github.com:yourusername/apscheduler``.
#. Create a branch for your pull request, like ``git checkout -b myfixname``.
#. Make the desired changes to the code base.
#. Commit your changes locally. If your changes close an existing issue, add the text
   ``Fixes #XXX.`` or ``Closes #XXX.`` to the commit message (where XXX is the issue
   number).
#. Push the changeset(s) to your forked repository (``git push``).
#. Navigate to the "Pull requests" page on the original repository (not your fork) and
   click "New pull request".
#. Click on the text "compare across forks".
#. Select your own fork as the head repository and then select the correct branch name.
#. Click on "Create pull request".
If you have trouble, consult the `pull request making guide`_ on opensource.com.
.. _Docker: https://docs.docker.com/desktop/#download-and-install
.. _docker compose: https://docs.docker.com/compose/
.. _tox: https://tox.readthedocs.io/en/latest/install.html
.. _pre-commit: https://pre-commit.com/#installation
.. _pytest: https://pypi.org/project/pytest/
.. _ReadTheDocs: https://readthedocs.org/
.. _main APScheduler repository: https://github.com/agronholm/apscheduler
.. _pull request making guide: https://opensource.com/article/19/7/create-pull-request-github
Integrating with application frameworks
=======================================
WSGI
----
To integrate APScheduler with web frameworks using WSGI_ (Web Server Gateway Interface),
you need to use the synchronous scheduler and start it as a side effect of importing the
module that contains your application instance::

    from apscheduler.schedulers.sync import Scheduler


    def app(environ, start_response):
        """Trivial example of a WSGI application."""
        response_body = b"Hello, World!"
        response_headers = [
            ("Content-Type", "text/plain"),
            ("Content-Length", str(len(response_body))),
        ]
        start_response("200 OK", response_headers)
        return [response_body]


    scheduler = Scheduler()
    scheduler.start_in_background()
Assuming you saved this as ``example.py``, you can now start the application with uWSGI_
with:
.. code-block:: bash

    uwsgi --enable-threads --http :8080 --wsgi-file example.py
The ``--enable-threads`` (or ``-T``) option is necessary because uWSGI disables threads
by default, which prevents the scheduler from working. See the
`uWSGI documentation <uWSGI-threads_>`_ for more details.
.. note::
    The :meth:`.schedulers.sync.Scheduler.start_in_background` method installs an
    :mod:`atexit` hook that shuts down the scheduler gracefully when the worker process
    exits.
.. _WSGI: https://wsgi.readthedocs.io/en/latest/what.html
.. _uWSGI: https://www.fullstackpython.com/uwsgi.html
.. _uWSGI-threads: https://uwsgi-docs.readthedocs.io/en/latest/WSGIquickstart.html#a-note-on-python-threads
ASGI
----
To integrate APScheduler with web frameworks using ASGI_ (Asynchronous Server Gateway
Interface), you need to use the asynchronous scheduler and tie its lifespan to the
lifespan of the application by wrapping it in middleware, as follows::

    from apscheduler.schedulers.async_ import AsyncScheduler


    async def app(scope, receive, send):
        """Trivial example of an ASGI application."""
        if scope["type"] == "http":
            await receive()
            await send(
                {
                    "type": "http.response.start",
                    "status": 200,
                    "headers": [
                        [b"content-type", b"text/plain"],
                    ],
                }
            )
            await send(
                {
                    "type": "http.response.body",
                    "body": b"Hello, world!",
                    "more_body": False,
                }
            )
        elif scope["type"] == "lifespan":
            while True:
                message = await receive()
                if message["type"] == "lifespan.startup":
                    await send({"type": "lifespan.startup.complete"})
                elif message["type"] == "lifespan.shutdown":
                    await send({"type": "lifespan.shutdown.complete"})
                    return


    async def scheduler_middleware(scope, receive, send):
        if scope["type"] == "lifespan":
            async with AsyncScheduler() as scheduler:
                await app(scope, receive, send)
        else:
            await app(scope, receive, send)
Assuming you saved this as ``example.py``, you can then run this with Hypercorn_:
.. code-block:: bash

    hypercorn example:scheduler_middleware
or with Uvicorn_:
.. code-block:: bash

    uvicorn example:scheduler_middleware
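If you want to see why the middleware ties the scheduler's lifespan to the
application's, you can drive the lifespan protocol by hand. The sketch below is
standard-library only: a dummy async context manager (``fake_scheduler``) stands in for
``AsyncScheduler``, so nothing here is APScheduler's actual API.

```python
import asyncio
from contextlib import asynccontextmanager


# Stand-in for AsyncScheduler, so the sketch runs without APScheduler installed
@asynccontextmanager
async def fake_scheduler():
    print("scheduler started")
    try:
        yield
    finally:
        print("scheduler stopped")


async def app(scope, receive, send):
    # Minimal lifespan-only ASGI application
    while True:
        message = await receive()
        if message["type"] == "lifespan.startup":
            await send({"type": "lifespan.startup.complete"})
        elif message["type"] == "lifespan.shutdown":
            await send({"type": "lifespan.shutdown.complete"})
            return


async def scheduler_middleware(scope, receive, send):
    if scope["type"] == "lifespan":
        async with fake_scheduler():
            await app(scope, receive, send)
    else:
        await app(scope, receive, send)


async def main():
    # Drive the lifespan protocol the way an ASGI server would
    messages = [{"type": "lifespan.startup"}, {"type": "lifespan.shutdown"}]
    sent = []

    async def receive():
        return messages.pop(0)

    async def send(message):
        sent.append(message)

    await scheduler_middleware({"type": "lifespan"}, receive, send)
    return sent


print(asyncio.run(main()))
```

Note that the "scheduler" is entered on startup and exited only after the shutdown
message arrives, which is exactly the window in which scheduled jobs can run.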
.. _ASGI: https://asgi.readthedocs.io/en/latest/index.html
.. _Hypercorn: https://gitlab.com/pgjones/hypercorn/
.. _Uvicorn: https://www.uvicorn.org/
###############################################
Migrating from previous versions of APScheduler
###############################################
.. py:currentmodule:: apscheduler
From v3.x to v4.0
=================
APScheduler 4.0 has undergone a partial rewrite since the 3.x series.
There is currently no way to automatically import schedules from a persistent 3.x job
store, but this shortcoming will be rectified before the final v4.0 release.
Terminology and architectural design changes
--------------------------------------------
The concept of a *job* has been split into :class:`Task`, :class:`Schedule` and
:class:`Job`. See the documentation of each class (and read the tutorial) to understand
their roles.
**Executors** have been replaced by *workers*. Workers were designed to be able to run
independently from schedulers. Workers now *pull* jobs from the data store instead of
the scheduler pushing jobs directly to them.
**Data stores**, previously called *job stores*, have been redesigned to work with
multiple running schedulers and workers, both for purposes of scalability and fault
tolerance. Many data store implementations were dropped because they were either too
burdensome to support, or the backing services were not sophisticated enough to handle
the increased requirements.
**Event brokers** are a new component in v4.0. They relay events between schedulers and
workers, enabling them to work together with a shared data store. External (as opposed
to local) event broker services are required in multi-node or multi-process deployment
scenarios.
**Triggers** are now stateful. This change was found to be necessary to properly support
combining triggers (:class:`~.triggers.combining.AndTrigger` and
:class:`~.triggers.combining.OrTrigger`), as they needed to keep track of the next run
times of all the triggers contained within. This change also enables some more
sophisticated custom trigger implementations.
**Time zone** support has been revamped to use :mod:`zoneinfo` (or `backports.zoneinfo`_
on Python versions earlier than 3.9) zones instead of pytz zones. You should not use
pytz with APScheduler anymore.
`Entry points`_ are no longer used or supported, as they were more trouble than they
were worth, particularly with packagers like py2exe or PyInstaller which by default did
not package distribution metadata. Thus, triggers and data stores have to be explicitly
instantiated.
.. _backports.zoneinfo: https://pypi.org/project/backports.zoneinfo/
.. _Entry points: https://packaging.python.org/en/latest/specifications/entry-points/
Scheduler changes
-----------------
The ``add_job()`` method is now :meth:`~Scheduler.add_schedule`. The scheduler still has
a method named :meth:`~Scheduler.add_job`, but this is meant for making one-off runs of a
task. Previously you would have had to call ``add_job()`` with a
:class:`~apscheduler.triggers.date.DateTrigger` using the current time as the run time.
The two most commonly used schedulers, ``BlockingScheduler`` and
``BackgroundScheduler``, have often caused confusion among users and have thus been
combined into :class:`~.schedulers.sync.Scheduler`. This new unified scheduler class
has two methods that replace the ``start()`` method used previously:
:meth:`~.schedulers.sync.Scheduler.run_until_stopped` and
:meth:`~.schedulers.sync.Scheduler.start_in_background`. The former should be used if
you previously used ``BlockingScheduler``, and the latter if you used
``BackgroundScheduler``.
The asyncio scheduler has been replaced with a more generic :class:`AsyncScheduler`,
which is based on AnyIO_ and thus also supports Trio_ in addition to :mod:`asyncio`.
The API of the async scheduler differs somewhat from its synchronous counterpart. In
particular, it **requires** itself to be used as an async context manager – whereas with
the synchronous scheduler, use as a context manager is recommended but not required.
All other scheduler implementations have been dropped because they were either too
burdensome to support, or did not seem necessary anymore. Some of the dropped
implementations (particularly Qt) are likely to be re-added before v4.0 final.
Schedulers no longer support multiple data stores. If you need this capability, you
should run multiple schedulers instead.
Configuring and running the scheduler has been radically simplified. The ``configure()``
method is gone, and all configuration is now passed as keyword arguments to the
scheduler class.
.. _AnyIO: https://pypi.org/project/anyio/
.. _Trio: https://pypi.org/project/trio/
Trigger changes
---------------
As the scheduler is no longer used to create triggers, any supplied datetimes will be
assumed to be in the local time zone. If you wish to change the local time zone, you
should set the ``TZ`` environment variable to either the name of the desired timezone
(e.g. ``Europe/Helsinki``) or to a path of a time zone file. See the tzlocal_
documentation for more information.
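For example, with the standard :mod:`zoneinfo` module (the ``Europe/Helsinki`` zone and
the fixed date here are arbitrary choices for illustration, and the system must have
IANA time zone data available):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Interpret a wall-clock time in an explicit IANA zone
helsinki = ZoneInfo("Europe/Helsinki")
dt = datetime(2023, 1, 15, 12, 0, tzinfo=helsinki)
print(dt.utcoffset())  # 2:00:00 (winter time, no DST in effect)
```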
**Jitter** support has been moved from individual triggers to the schedule level.
This not only simplified trigger design, but also enabled the scheduler to provide
information about the randomized jitter and the original run time to the user.
:class:`~.triggers.cron.CronTrigger` was changed to respect the standard order of
weekdays, so that Sunday is now 0 and Saturday is 6. If you used numbered weekdays
before, you must change your trigger configuration to match. If in doubt, use
abbreviated weekday names (e.g. ``sun``, ``fri``) instead.
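If you have numeric weekday values from a 3.x configuration, the conversion is a simple
rotation. The helper below is purely illustrative and not part of APScheduler:

```python
def convert_weekday_3x_to_4x(old: int) -> int:
    """Convert a 3.x weekday number (mon=0 .. sun=6) to the 4.0
    standard order (sun=0 .. sat=6)."""
    return (old + 1) % 7


# 3.x Monday (0) becomes 1; 3.x Sunday (6) becomes 0
print(convert_weekday_3x_to_4x(0), convert_weekday_3x_to_4x(6))  # 1 0
```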
:class:`~.triggers.interval.IntervalTrigger` was changed to start immediately, instead
of waiting for the first interval to pass. If you have workarounds in place to "fix"
the previous behavior, you should remove them.
.. _tzlocal: https://pypi.org/project/tzlocal/
From v3.0 to v3.2
=================
Prior to v3.1, the scheduler inadvertently exposed the ability to fetch and manipulate jobs before
the scheduler had been started. The scheduler now requires you to call ``scheduler.start()`` before
attempting to access any of the jobs in the job stores. To ensure that no old jobs are mistakenly
executed, you can start the scheduler in paused mode (``scheduler.start(paused=True)``) (introduced
in v3.2) to avoid any premature job processing.
From v2.x to v3.0
=================
The 3.0 series is API incompatible with previous releases due to a design overhaul.
Scheduler changes
-----------------
* The concept of "standalone mode" is gone. For ``standalone=True``, use
  :class:`~apscheduler.schedulers.blocking.BlockingScheduler` instead, and for
  ``standalone=False``, use
  :class:`~apscheduler.schedulers.background.BackgroundScheduler`.
  BackgroundScheduler matches the old default semantics.
* Job defaults (like ``misfire_grace_time`` and ``coalesce``) must now be passed in a
  dictionary as the ``job_defaults`` option to
  :meth:`~apscheduler.schedulers.base.BaseScheduler.configure`. When supplying an
  ini-style configuration as the first argument, they will need a corresponding
  ``job_defaults.`` prefix.
* The configuration key prefix for job stores was changed from ``jobstore.`` to
  ``jobstores.`` to match the dict-style configuration better.
* The ``max_runs`` option has been dropped since the run counter could not be reliably
  preserved when replacing a job with another one with the same ID. To make up for
  this, the ``end_date`` option was added to cron and interval triggers.
* The old thread pool is gone, replaced by ``ThreadPoolExecutor``.
  This means that the old ``threadpool`` options are no longer valid.
  See :ref:`scheduler-config` on how to configure executors.
* The trigger-specific scheduling methods have been removed entirely from the
  scheduler. Use the generic
  :meth:`~apscheduler.schedulers.base.BaseScheduler.add_job` method or the
  :meth:`~apscheduler.schedulers.base.BaseScheduler.scheduled_job` decorator instead.
  The signatures of these methods were changed significantly.
* The ``shutdown_threadpool`` and ``close_jobstores`` options have been removed from
  the :meth:`~apscheduler.schedulers.base.BaseScheduler.shutdown` method.
  Executors and job stores are now always shut down on scheduler shutdown.
* :meth:`~apscheduler.scheduler.Scheduler.unschedule_job` and
  :meth:`~apscheduler.scheduler.Scheduler.unschedule_func` have been replaced by
  :meth:`~apscheduler.schedulers.base.BaseScheduler.remove_job`. You can also
  unschedule a job by using the job handle returned from
  :meth:`~apscheduler.schedulers.base.BaseScheduler.add_job`.
Job store changes
-----------------
The job store system was completely overhauled for both efficiency and forwards compatibility.
Unfortunately, this means that the old data is not compatible with the new job stores.
If you need to migrate existing data from APScheduler 2.x to 3.x, contact the APScheduler author.
The Shelve job store had to be dropped because it could not support the new job store design.
Use SQLAlchemyJobStore with SQLite instead.
Trigger changes
---------------
From 3.0 onwards, triggers now require a pytz timezone. This is normally provided by the scheduler,
but if you were instantiating triggers manually before, then one must be supplied as the
``timezone`` argument.
The only other backwards incompatible change was that ``get_next_fire_time()`` takes two arguments
now: the previous fire time and the current datetime.
From v1.x to 2.0
================
There have been some API changes since the 1.x series. This document
explains the changes made to v2.0 that are incompatible with the v1.x API.
API changes
-----------
* The behavior of cron scheduling with regards to default values for omitted fields has
  been made more intuitive -- omitted fields lower than the least significant
  explicitly defined field will default to their minimum values, except for the week
  number and weekday fields
* SchedulerShutdownError has been removed -- jobs are now added tentatively and
  scheduled for real when/if the scheduler is restarted
* Scheduler.is_job_active() has been removed -- use ``job in scheduler.get_jobs()``
  instead
* dump_jobs() is now print_jobs() and prints directly to the given file, or sys.stdout
  if none is given
* The ``repeat`` parameter was removed from
  :meth:`~apscheduler.scheduler.Scheduler.add_interval_job` and
  :meth:`~apscheduler.scheduler.Scheduler.interval_schedule` in favor of the universal
  ``max_runs`` option
* :meth:`~apscheduler.scheduler.Scheduler.unschedule_func` now raises a KeyError if the
  given function is not scheduled
* The semantics of :meth:`~apscheduler.scheduler.Scheduler.shutdown` have changed --
  the method no longer accepts a numeric argument, but two booleans
Configuration changes
---------------------
* The scheduler can no longer be reconfigured while it's running
API reference
=============
Data structures
---------------
.. autoclass:: apscheduler.Task
.. autoclass:: apscheduler.Schedule
.. autoclass:: apscheduler.Job
.. autoclass:: apscheduler.JobInfo
.. autoclass:: apscheduler.JobResult
.. autoclass:: apscheduler.RetrySettings
Schedulers
----------
.. autoclass:: apscheduler.schedulers.sync.Scheduler
.. autoclass:: apscheduler.schedulers.async_.AsyncScheduler
Workers
-------
.. autoclass:: apscheduler.workers.sync.Worker
.. autoclass:: apscheduler.workers.async_.AsyncWorker
Data stores
-----------
.. autoclass:: apscheduler.abc.DataStore
.. autoclass:: apscheduler.abc.AsyncDataStore
.. autoclass:: apscheduler.datastores.memory.MemoryDataStore
.. autoclass:: apscheduler.datastores.sqlalchemy.SQLAlchemyDataStore
.. autoclass:: apscheduler.datastores.async_sqlalchemy.AsyncSQLAlchemyDataStore
.. autoclass:: apscheduler.datastores.mongodb.MongoDBDataStore
Event brokers
-------------
.. autoclass:: apscheduler.abc.EventBroker
.. autoclass:: apscheduler.abc.AsyncEventBroker
.. autoclass:: apscheduler.eventbrokers.local.LocalEventBroker
.. autoclass:: apscheduler.eventbrokers.async_local.LocalAsyncEventBroker
.. autoclass:: apscheduler.eventbrokers.asyncpg.AsyncpgEventBroker
.. autoclass:: apscheduler.eventbrokers.mqtt.MQTTEventBroker
.. autoclass:: apscheduler.eventbrokers.redis.RedisEventBroker
Serializers
-----------
.. autoclass:: apscheduler.abc.Serializer
.. autoclass:: apscheduler.serializers.cbor.CBORSerializer
.. autoclass:: apscheduler.serializers.json.JSONSerializer
.. autoclass:: apscheduler.serializers.pickle.PickleSerializer
Triggers
--------
.. autoclass:: apscheduler.abc.Trigger
.. autoclass:: apscheduler.triggers.date.DateTrigger
.. autoclass:: apscheduler.triggers.interval.IntervalTrigger
.. autoclass:: apscheduler.triggers.calendarinterval.CalendarIntervalTrigger
.. autoclass:: apscheduler.triggers.combining.AndTrigger
.. autoclass:: apscheduler.triggers.combining.OrTrigger
.. autoclass:: apscheduler.triggers.cron.CronTrigger
Events
------
.. autoclass:: apscheduler.Event
.. autoclass:: apscheduler.DataStoreEvent
.. autoclass:: apscheduler.TaskAdded
.. autoclass:: apscheduler.TaskUpdated
.. autoclass:: apscheduler.TaskRemoved
.. autoclass:: apscheduler.ScheduleAdded
.. autoclass:: apscheduler.ScheduleUpdated
.. autoclass:: apscheduler.ScheduleRemoved
.. autoclass:: apscheduler.JobAdded
.. autoclass:: apscheduler.JobRemoved
.. autoclass:: apscheduler.ScheduleDeserializationFailed
.. autoclass:: apscheduler.JobDeserializationFailed
.. autoclass:: apscheduler.SchedulerEvent
.. autoclass:: apscheduler.SchedulerStarted
.. autoclass:: apscheduler.SchedulerStopped
.. autoclass:: apscheduler.WorkerEvent
.. autoclass:: apscheduler.WorkerStarted
.. autoclass:: apscheduler.WorkerStopped
.. autoclass:: apscheduler.JobAcquired
.. autoclass:: apscheduler.JobReleased
Enumerated types
----------------
.. autoclass:: apscheduler.RunState
.. autoclass:: apscheduler.JobOutcome
.. autoclass:: apscheduler.ConflictPolicy
.. autoclass:: apscheduler.CoalescePolicy
Context variables
-----------------
See the :mod:`contextvars` module for information on how to work with context variables.
.. data:: apscheduler.current_scheduler
   :annotation: the current scheduler
   :type: ~contextvars.ContextVar[~typing.Union[Scheduler, AsyncScheduler]]

.. data:: apscheduler.current_worker
   :annotation: the current worker
   :type: ~contextvars.ContextVar[~typing.Union[Worker, AsyncWorker]]

.. data:: apscheduler.current_job
   :annotation: information on the job being currently run
   :type: ~contextvars.ContextVar[JobInfo]
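These variables follow the standard :mod:`contextvars` pattern. The sketch below is
standard-library only; ``current_job`` here is a hypothetical stand-in set by hand, not
the real APScheduler variable (which the worker sets for you):

```python
from contextvars import ContextVar

# Hypothetical stand-in for apscheduler.current_job
current_job = ContextVar("current_job")


def task_body():
    # Inside a running job, the worker would have set this variable already
    info = current_job.get()
    return f"running {info['task_id']}"


token = current_job.set({"task_id": "tick"})
try:
    print(task_body())  # running tick
finally:
    current_job.reset(token)
```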
Exceptions
----------
.. autoexception:: apscheduler.TaskLookupError
.. autoexception:: apscheduler.ScheduleLookupError
.. autoexception:: apscheduler.JobLookupError
.. autoexception:: apscheduler.JobResultNotReady
.. autoexception:: apscheduler.JobCancelled
.. autoexception:: apscheduler.JobDeadlineMissed
.. autoexception:: apscheduler.ConflictingIdError
.. autoexception:: apscheduler.SerializationError
.. autoexception:: apscheduler.DeserializationError
.. autoexception:: apscheduler.MaxIterationsReached
#####################
Extending APScheduler
#####################
This document is meant to explain how to develop your custom triggers and data stores.
Custom triggers
---------------
.. py:currentmodule:: apscheduler.triggers
The built-in triggers cover the needs of the majority of all users, particularly so when
combined using :class:`~.combining.AndTrigger` and :class:`~.combining.OrTrigger`.
However, some users may need specialized scheduling logic. This can be accomplished by
creating your own custom trigger class.
To implement your scheduling logic, create a new class that inherits from the
:class:`~..abc.Trigger` interface class::

    from __future__ import annotations

    from datetime import datetime

    from apscheduler.abc import Trigger


    class MyCustomTrigger(Trigger):
        def next(self) -> datetime | None:
            ...  # Your custom logic here

        def __getstate__(self):
            ...  # Return the serializable state here

        def __setstate__(self, state):
            ...  # Restore the state from the return value of __getstate__()
Requirements and constraints for trigger classes:

* :meth:`~..abc.Trigger.next` must always either return a timezone aware
  :class:`~datetime.datetime` object or :data:`None` if a new run time cannot be
  calculated
* :meth:`~..abc.Trigger.next` must never return the same :class:`~datetime.datetime`
  twice and never one that is earlier than the previously returned one
* :meth:`~..abc.Trigger.__setstate__` must accept the return value of
  :meth:`~..abc.Trigger.__getstate__` and restore the trigger to the functionally same
  state as the original
Triggers are stateful objects. The :meth:`~..abc.Trigger.next` method is where you
determine the next run time based on the current state of the trigger. The trigger's
internal state needs to be updated before returning to ensure that the trigger won't
return the same datetime on the next call. The trigger code does **not** need to be
thread-safe.
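As an illustration of these constraints, here is a minimal standard-library sketch of
the stateful pattern (a fixed-interval trigger). It deliberately does not inherit from
:class:`~..abc.Trigger`, so it can be run on its own; a real implementation would.

```python
from __future__ import annotations

from datetime import datetime, timedelta, timezone


class EveryNSeconds:
    """Illustrative trigger-like class that fires every `seconds` seconds,
    starting from `start`. Not a real APScheduler trigger."""

    def __init__(self, seconds: int, start: datetime):
        if start.tzinfo is None:
            raise ValueError("start must be timezone aware")
        self.interval = timedelta(seconds=seconds)
        self._next = start  # internal state, advanced on each call

    def next(self) -> datetime | None:
        run_time = self._next
        # Update the internal state *before* returning, so the same
        # datetime is never handed out twice
        self._next = run_time + self.interval
        return run_time

    def __getstate__(self):
        return {"interval": self.interval, "next": self._next}

    def __setstate__(self, state):
        self.interval = state["interval"]
        self._next = state["next"]


start = datetime(2023, 1, 1, tzinfo=timezone.utc)
trigger = EveryNSeconds(10, start)
print(trigger.next(), trigger.next())  # strictly increasing, never repeated
```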
Custom data stores
------------------
If you want to make use of some external service to store the scheduler data, and it's
not covered by a built-in data store implementation, you may want to create a custom
data store class. It should be noted that custom data stores are substantially harder to
implement than custom triggers.
Data store classes have the following design requirements:
* Must publish the appropriate events to an event broker
* Code must be thread safe (synchronous API) or task safe (asynchronous API)
The data store class needs to inherit from either :class:`~..abc.DataStore` or
:class:`~..abc.AsyncDataStore`, depending on whether you want to implement the store
using synchronous or asynchronous APIs:
.. tabs::

    .. code-tab:: python Synchronous

        from apscheduler.abc import DataStore, EventBroker


        class MyCustomDataStore(DataStore):
            def start(self, event_broker: EventBroker) -> None:
                ...  # Save the event broker in a member attribute and initialize the store

            def stop(self, *, force: bool = False) -> None:
                ...  # Shut down the store

            # See the interface class for the rest of the abstract methods

    .. code-tab:: python Asynchronous

        from apscheduler.abc import AsyncDataStore, AsyncEventBroker


        class MyCustomDataStore(AsyncDataStore):
            async def start(self, event_broker: AsyncEventBroker) -> None:
                ...  # Save the event broker in a member attribute and initialize the store

            async def stop(self, *, force: bool = False) -> None:
                ...  # Shut down the store

            # See the interface class for the rest of the abstract methods
Handling temporary failures
+++++++++++++++++++++++++++
If you plan to make the data store implementation public, it is strongly recommended
that you make an effort to ensure that the implementation can tolerate the loss of
connectivity to the backing store. The Tenacity_ library is used for this purpose by the
built-in stores to retry operations in case of a disconnection. If you use it to retry
operations when exceptions are raised, it is important to only do that in cases of
*temporary* errors, like connectivity loss, and not in cases like authentication
failure, missing database and so forth. See the built-in data store implementations and
Tenacity_ documentation for more information on how to pick the exceptions on which to
retry the operations.
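The principle (retry only on *temporary* errors, let permanent ones propagate) can be
sketched without Tenacity using a plain decorator. This is illustrative only; the
built-in stores use Tenacity, with exponential backoff rather than a fixed delay.

```python
import time
from functools import wraps


def retry_on(exceptions, attempts=3, delay=0.0):
    """Retry the wrapped function only when one of the given
    (presumed temporary) exception types is raised."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == attempts - 1:
                        raise  # out of attempts; give up
                    time.sleep(delay)
        return wrapper
    return decorator


calls = []

@retry_on((ConnectionError,), attempts=3)
def flaky_fetch():
    calls.append(1)
    if len(calls) < 3:
        raise ConnectionError("temporary outage")  # retried
    return "ok"


print(flaky_fetch(), len(calls))  # ok 3
```

An authentication failure (say, ``PermissionError``) would not match the listed
exception types and would propagate immediately instead of being retried.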
.. _Tenacity: https://pypi.org/project/tenacity/
APScheduler practical examples
==============================
.. highlight:: bash
This directory contains a number of practical examples for running APScheduler in a
variety of configurations.
Prerequisites
-------------
Most examples use one or more external services for data sharing and synchronization.
To start these services, you need Docker_ installed. Each example lists the services
it needs (if any) in the module file, so you can start these services selectively.
On Linux, if you're using the vendor provided system package for Docker instead of
Docker Desktop, you may need to install the compose (v2) plugin (named
``docker-compose-plugin``, or similar) separately.
.. note:: If you're still using the Python-based docker-compose tool (aka compose v1),
   replace ``docker compose`` with ``docker-compose``.
To start all the services, run this command anywhere within the project directory::

    docker compose up -d

To start just a specific service, you can pass its name as an argument::

    docker compose up -d postgresql

To shut down the services and delete all their data::

    docker compose down -v
In addition to having these background services running, you may need to install
specific extra dependencies, like database drivers. Each example module has its required
dependencies listed in the module comment at the top.
.. _Docker: https://docs.docker.com/desktop/#download-and-install
Standalone examples
-------------------
The examples in the ``standalone`` directory demonstrate how to run the scheduler in the
foreground, without anything else going on in the same process.
The directory contains four modules:

- ``async_memory.py``: Basic asynchronous scheduler using the default memory-based data
  store
- ``async_postgres.py``: Basic asynchronous scheduler using the asynchronous SQLAlchemy
  data store with a PostgreSQL back-end
- ``async_mysql.py``: Basic asynchronous scheduler using the asynchronous SQLAlchemy
  data store with a MySQL back-end
- ``sync_memory.py``: Basic synchronous scheduler using the default memory-based data
  store
Schedulers in web apps
----------------------
The examples in the ``web`` directory demonstrate how to run the scheduler inside a web
application (ASGI_ or WSGI_).
The directory contains five modules:

- ``asgi_noframework.py``: Trivial ASGI_ application, with middleware that starts and
  stops the scheduler as part of the ASGI lifecycle
- ``asgi_fastapi.py``: Trivial FastAPI_ application, with middleware that starts and
  stops the scheduler as part of the ASGI_ lifecycle
- ``asgi_starlette.py``: Trivial Starlette_ application, with middleware that starts
  and stops the scheduler as part of the ASGI_ lifecycle
- ``wsgi_noframework.py``: Trivial WSGI_ application where the scheduler is started in
  a background thread
- ``wsgi_flask.py``: Trivial Flask_ application where the scheduler is started in a
  background thread
.. note:: There is no Django example available yet.
To run any of the ASGI_ examples::

    uvicorn <filename_without_py_extension>:app

To run any of the WSGI_ examples::

    uwsgi -T --http :8000 --wsgi-file <filename>
.. _ASGI: https://asgi.readthedocs.io/en/latest/introduction.html
.. _WSGI: https://wsgi.readthedocs.io/en/latest/what.html
.. _FastAPI: https://fastapi.tiangolo.com/
.. _Starlette: https://www.starlette.io/
.. _Flask: https://flask.palletsprojects.com/
Separate scheduler and worker
-----------------------------
The example in the ``separate_worker`` directory demonstrates the ability to run
schedulers and workers separately. The directory contains three modules:

- ``sync_scheduler.py``: Runs a scheduler (without an internal worker) and adds/updates
  a schedule
- ``sync_worker.py``: Runs a worker only
- ``tasks.py``: Contains the task code (don't try to run this directly; it does
  nothing)
The task function is kept in a separate module because when you run either the
``sync_scheduler`` or ``sync_worker`` script, that script is imported as the
``__main__`` module. If the scheduler scheduled ``__main__:tick`` as the task, the
worker would not be able to find it, because the worker's own script would also be
named ``__main__``.
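You can see the problem by building the kind of module-qualified reference a scheduler
has to store. The format shown here is just the general ``module:qualname`` mechanism,
not necessarily the exact string APScheduler persists:

```python
def tick():
    print("Hello from tick")


# A worker must re-import the task from a textual reference built from the
# function's module and qualified name.
ref = f"{tick.__module__}:{tick.__qualname__}"
print(ref)
```

When this file is executed directly, the module part is ``__main__``, which is
meaningless to any other process; importing ``tick`` from a real module such as
``tasks`` yields a reference every worker can resolve.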
To run the example, you need to have both the worker and scheduler scripts running at
the same time. To run the worker::

    python sync_worker.py

To run the scheduler::

    python sync_scheduler.py
You can run multiple schedulers and workers at the same time within this example. If you
run multiple workers, the message might be printed on the console of a different worker
each time the job is run. Running multiple schedulers should have no visible effect, and
as long as at least one scheduler is running, the scheduled task should keep running
periodically on one of the workers.
from __future__ import annotations

from datetime import datetime

from sqlalchemy.ext.asyncio import create_async_engine

from apscheduler.datastores.async_sqlalchemy import AsyncSQLAlchemyDataStore
from apscheduler.eventbrokers.asyncpg import AsyncpgEventBroker
from apscheduler.schedulers.async_ import AsyncScheduler
from apscheduler.triggers.interval import IntervalTrigger


def tick():
    print("Hello, the time is", datetime.now())


async def original_app(scope, receive, send):
    """Trivial example of an ASGI application."""
    if scope["type"] == "http":
        await receive()
        await send(
            {
                "type": "http.response.start",
                "status": 200,
                "headers": [
                    [b"content-type", b"text/plain"],
                ],
            }
        )
        await send(
            {
                "type": "http.response.body",
                "body": b"Hello, world!",
                "more_body": False,
            }
        )
    elif scope["type"] == "lifespan":
        while True:
            message = await receive()
            if message["type"] == "lifespan.startup":
                await send({"type": "lifespan.startup.complete"})
            elif message["type"] == "lifespan.shutdown":
                await send({"type": "lifespan.shutdown.complete"})
                return


async def scheduler_middleware(scope, receive, send):
    if scope["type"] == "lifespan":
        engine = create_async_engine(
            "postgresql+asyncpg://postgres:secret@localhost/testdb"
        )
        data_store = AsyncSQLAlchemyDataStore(engine)
        event_broker = AsyncpgEventBroker.from_async_sqla_engine(engine)
        async with AsyncScheduler(data_store, event_broker) as scheduler:
            await scheduler.add_schedule(tick, IntervalTrigger(seconds=1), id="tick")
            await original_app(scope, receive, send)
    else:
        await original_app(scope, receive, send)
# This is just for consistency with the other ASGI examples
app = scheduler_middleware
APSchedulerSocket solves the problem of sharing a single 'APScheduler' object across a multithreaded application. It implements a simple client-server architecture to control the scheduled processes, which also makes it possible to use the scheduler from distributed processes or servers.
INSTALL
=======
::

    $ pip install APSchedulerSocket
USAGE
=====
::

    from apschedulersocket import schedulers
    from datetime import datetime

    def my_process():
        print("Executing my process: Hello world!")

    # Show the protocol messages
    schedulers.DEBUG = True

    # New SchedulerServer
    scheduler = schedulers.SchedulerServer(daemon=False)

    # Check if the server was not already started by another thread/process
    if not scheduler.client.is_server_running():
        scheduler.add_job(func=my_process,
                          id="my_process",
                          trigger='interval',
                          next_run_time=datetime.now(),
                          minutes=1)

    # start the apscheduler and the server
    print("Server started!")
    scheduler.start()

    print("Server state (1=running):",
          scheduler.client.call(["BackgroundScheduler", "state"]))
    print("The next run time of my_process:",
          scheduler.client.call(["my_process", "next_run_time"]))
    print("Pause the job:",
          scheduler.client.call(["my_process", "pause"]))
    print("Next run time of paused my_process (None=paused):",
          scheduler.client.call(["my_process", "next_run_time"]))
    print("Resume the job:",
          scheduler.client.call(["my_process", "resume"]))
    print("Wrong message to server:",
          scheduler.client.call(["Can you also cook?"]))
    print("Shutdown!",
          scheduler.client.call(["shutdown"]))
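Under the hood, each `client.call` sends one JSON-encoded array over a TCP socket and reads back a JSON answer of at most 1024 bytes. A minimal sketch of just the message framing (illustrative only; no running server needed):

```python
import json

MSG_MAX_LENGTH = 1024  # the client reads at most this many bytes per answer

def encode_call(call_array):
    # A call is a JSON array: [target, attribute, *arguments],
    # e.g. ["my_process", "pause"] or ["BackgroundScheduler", "state"]
    return json.dumps(call_array).encode("utf-8")

def decode_answer(raw):
    return json.loads(raw.decode("utf-8"))

msg = encode_call(["my_process", "next_run_time"])
print(msg)
print(decode_answer(b"null"))  # a paused job reports None as next_run_time
```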
DEVELOP
=======
::

    $ git clone https://github.com/rristow/APSchedulerSocket APSchedulerSocket
    $ cd APSchedulerSocket
    $ make
RUNNING TESTS
=============
::

    $ make test

# TODO!
import logging
import socket
import threading
import json
from datetime import date
from datetime import datetime
from apscheduler.schedulers.background import BackgroundScheduler
import dateutil.parser
logger = logging.getLogger()
DEFAULT_PORT = 1033
DEFAULT_ADDR = "127.0.0.1"
MSG_MAX_LENGTH = 1024
DEBUG = False
def echo(msg):
"""
Show more information in debug mode.
"""
if DEBUG:
print(msg)
class SchedulerClient():
"""
Client to communicate with the server and retrieve information about the scheduled jobs.
"""
def __init__(self, addr=DEFAULT_ADDR, port=DEFAULT_PORT):
self.addr = addr
self.port = port
def _read_msg(self, conn):
msg = conn.recv(MSG_MAX_LENGTH).decode("utf-8")
msg = json.loads(msg)
        # Check if it is an ISO date
        if isinstance(msg, str):
try:
msg = dateutil.parser.parse(msg)
except ValueError:
pass
echo(" :: C - %s" % msg)
return msg
def _send_msg(self, msg):
"""
        Open a connection, send the message/request to the server and
        return the connection so the answer can be read from it.
"""
conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
conn.connect((self.addr, self.port))
msg = json.dumps(msg)
msg = bytes(msg, 'utf-8')
echo("C => S - %s" % msg)
conn.sendall(msg)
return conn
def is_server_running(self):
"""
Check if the server is running
"""
try:
return self.call(["BackgroundScheduler", "state"]) == 1
except ConnectionRefusedError:
return False
def send_shutdown(self):
"""
Stop the server.
"""
return self.call(["shutdown"])
    def call(self, call_array):
        conn = self._send_msg(call_array)
        try:
            return self._read_msg(conn)
        finally:
            conn.close()
class BackgroundSchedulerAdjusted(BackgroundScheduler):
"""
A BackgroundScheduler object adjusted to return json Serialization values.
"""
def orig_get_jobs(self, jobstore=None, pending=None):
return super(BackgroundSchedulerAdjusted, self).get_jobs(jobstore, pending)
def get_jobs(self, jobstore=None, pending=None):
"""
Return just the id of the jobs
"""
jobs = super(BackgroundSchedulerAdjusted, self).get_jobs(jobstore, pending)
return [job.id for job in jobs]
class SchedulerServer():
"""
Integration between BackgroundScheduler and a Client-Server.
"""
background_scheduler = BackgroundSchedulerAdjusted()
def __init__(self, addr=DEFAULT_ADDR, port=DEFAULT_PORT, daemon=True):
self.addr = addr
self.port = port
self.daemon = daemon
self.client = SchedulerClient(addr, port)
def add_job(self, func, id, **args):
self.background_scheduler.add_job(func=func, id=id, **args)
def _read_msg(self, conn):
msg = conn.recv(MSG_MAX_LENGTH).decode("utf-8")
msg = json.loads(msg)
echo(" :: S - %s" % msg)
return msg
def _send_msg(self, conn, msg):
"""
        Encode and send an answer message back to the client.
"""
try:
msg = json.dumps(msg)
except TypeError:
msg = json.dumps("The result '%s' is not JSON serializable!" % repr(msg))
echo("S => C - %s" % msg)
msg = bytes(msg, 'utf-8')
conn.sendall(msg)
def _start_server(self):
server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server_socket.bind((self.addr, self.port))
server_socket.listen()
while True:
print()
conn, addr = server_socket.accept()
try:
print("Connection accepted from " + repr(addr[1]))
msg = self._read_msg(conn)
if msg[0] == 'shutdown':
self._send_msg(conn, True)
self._stop_scheduler()
server_socket.close()
return True
elif msg[0] == 'BackgroundScheduler':
attrib = getattr(self.background_scheduler, msg[1], None)
else:
job = self.background_scheduler.get_job(msg[0], None)
if not job:
raise Exception("There is no job with id '%s' (The valid options are: shutdown, "
"BackgroundScheduler or [job_id])." % msg[0])
attrib = getattr(job, msg[1])
if callable(attrib):
                    # Arguments were provided along with the call
                    if len(msg) > 2:
                        args = msg[2:]
ret = attrib(*args)
else:
ret = attrib()
else:
ret = attrib
# Handle datetime in json
if isinstance(ret, (datetime, date)):
ret = ret.isoformat()
self._send_msg(conn, ret)
except Exception as detail:
self._send_msg(conn, {"error": "%s" % detail})
logger.error("SchedulerServer error: %s" % detail)
finally:
conn.close()
def _stop_scheduler(self):
        # Stop the background scheduler if it is running
if self.background_scheduler.running:
self.background_scheduler.shutdown()
def start(self, *args, **kwargs):
# Start the server in a thread
threading.Thread(target=self._start_server, daemon=self.daemon).start()
# Start the background scheduler
self.background_scheduler.start(*args, **kwargs)
def shutdown(self):
# Stop the thread
self.client.send_shutdown()
self._stop_scheduler()
        print("Stopped")
import itertools
import tableprint as tp
from .molecular import *
import matplotlib.pyplot as plt
def calc(elements, depth, max=-1):
    elements = elements.elements
    combinaisons = []
for i in list(range(1,depth+1)) :
combinaisons+=list(itertools.combinations_with_replacement(elements,i))
results=mol_container()
impurities=mol_container()
for combinaison in combinaisons :
name=[]
mass=0
probability=1
impurity=0.
for element in combinaison :
name.append(element.name)
mass+=element.mass
if element.score>=1 :
element.score=1.
impurity+=1.
probability=probability*element.rate*element.score
        name.sort()
        _name = ''.join(name)
        mol = molecular(_name, {'mass': mass, 'probability': probability, 'elements': name})
        if mol.probability > max:
            if impurity == 0.:
                results.add_molecule(mol)
            else:
                impurities.add_molecule(mol)
results.sort()
impurities.sort()
return (results,impurities)
def charge_calculation(results,charge_range) :
results, impurities = results[0], results[1]
charged_results=ion_container()
charged_impurities = ion_container()
for ion in results.molecules :
charges=ion.get_charge(charge_range[0],charge_range[1])
for charge in charges :
mol=molecular(ion.name,{'mass':ion.mass/charge,'probability':ion.probability,'elements':[]},charge=charge)
charged_results.add_molecule(mol)
for ion in impurities.molecules :
charges=ion.get_charge(charge_range[0],charge_range[1])
for charge in charges :
mol=molecular(ion.name,{'mass':ion.mass/charge,'probability':ion.probability,'elements':[]},charge=charge)
charged_impurities.add_molecule(mol)
charged_results.sort()
charged_impurities.sort()
return (charged_results,charged_impurities)
def disp_results(results,imp=False) :
results, impurities = results[0], results[1]
yx=[]
for molecule in results.molecules :
yx.append((molecule.name,molecule.mass,molecule.charge,molecule.probability))
    tp.banner('Combination results')
    tp.table(yx, ['Combination', 'DA', 'Charge', 'Probability'], style='fancy_grid', width=25)
    if imp:
yx = []
for molecule in impurities.molecules:
yx.append((molecule.name, molecule.mass, molecule.charge, molecule.probability))
tp.banner('Impurities')
        tp.table(yx, ['Combination', 'DA', 'Charge', 'Relative Probability'], style='fancy_grid', width=25)
def plot_sim (results) :
results, impurities = results[0], results[1]
X=[molecule.mass for molecule in results.molecules]
Y=[molecule.probability for molecule in results.molecules]
plt.plot(X,Y)
plt.xlabel('DA')
plt.ylabel('Relative Intensity')
plt.show()
def save_results(results, file):
    results, impurities = results[0], results[1]
    yx = []
    for molecule in results.molecules:
        yx.append([molecule.name, molecule.mass, molecule.charge, molecule.probability])
    for molecule in impurities.molecules:
        yx.append([molecule.name, molecule.mass, molecule.charge, molecule.probability])
    with open(file, 'w') as output:
        for item in yx:
            output.write('%s\t%s\t%s\t%s\n' % (item[0], item[1], item[2], item[3]))
def disp_select(results,DA) :
results, impurities = results[0], results[1]
yx=[]
total=0
for molecule in results.molecules :
if molecule.mass == DA :
yx.append([molecule.name,molecule.charge,molecule.probability])
total+=molecule.probability
for molecule in yx :
molecule.append(molecule[-1]/total*100)
    tp.banner('DA Probability Results for %s' % DA)
    tp.table(yx, ['Combination', 'Charge', 'Overall Probability', 'DA Probability (%)'], style='fancy_grid', width=25)
yx=[]
for molecule in impurities.molecules :
if molecule.mass == DA :
yx.append([molecule.name,molecule.charge,molecule.probability])
total+=molecule.probability
for molecule in yx :
molecule.append(molecule[-1]/total*100)
    tp.banner('Impurities at %s' % DA)
    tp.table(yx, ['Combination', 'Charge', 'Relative Probability', 'DA Probability (%)'], style='fancy_grid', width=25)
def plot_results (results,mass_range=(0,1E3)) :
results, impurities = results[0], results[1]
for molecule in results.molecules :
if mass_range[0]<=molecule.mass<=mass_range[1] :
plt.plot(molecule.mass,molecule.probability,marker='+',color='red',ms=10)
plt.text(molecule.mass,molecule.probability,molecule.name,fontsize=12)
for molecule in impurities.molecules :
if mass_range[0]<=molecule.mass<=mass_range[1] :
plt.plot(molecule.mass,molecule.probability,marker='+',color='blue',ms=10)
plt.text(molecule.mass,molecule.probability,molecule.name,fontsize=12)
plt.xlabel('DA')
plt.ylabel('Probability')
plt.yscale('log')
    plt.show()
# APUtil
Utility classes and functions for arcpy. APUtil stands for **A**rc**P**y and **UTIL**ity.
## Install
```shell
python -m pip install aputil
```
Note: Future releases are published on PyPI.
## Example
### `aputil.xcursor`
#### Using `xcursor(cursor)`
```python
import arcpy, arcpy.da
from aputil import xcursor
feature_class = "points.shp"
with arcpy.da.SearchCursor(feature_class, ["FieldName"]) as cursor:
for row in xcursor(cursor):
print(row["FieldName"]) # instead of row[0]
# other examples
print(row.get("FieldName", "Default Value"))
print(row.get_by_index(0, "Default Value"))
```
#### Using `to_row()`
See `test/xcursor_test.py` (test `test_to_row`) for an example.
### `aputil.tcursor`
#### Using `tcursor(cursor)`
```python
import arcpy, arcpy.da
from aputil import tcursor
feature_class = "points.shp"
with arcpy.da.SearchCursor(feature_class, ["FieldName"]) as cursor:
for row in tcursor(cursor):
print(row.FieldName) # instead of row[0]
```
### `aputil.fc`
#### Using `use_memory()`
```python
import arcpy, arcpy.management
from aputil import fc
arcpy.env.workspace = r"c:\data"
with fc.use_memory() as copied:
print(arcpy.Exists(copied)) # false (not yet)
arcpy.management.CopyFeatures("buildings.shp", copied)
print(arcpy.Exists(copied)) # true
print(arcpy.Exists(copied)) # false
```
#### Using `count(fc)`
```python
import arcpy
from aputil import fc
record_count = fc.count(r"c:\data\buildings.shp")
print(record_count)
```
### `aputil.typings`
```python
import arcpy, arcpy.management
from aputil.typings import FeatureClassType
def create_feature_class() -> FeatureClassType:
return arcpy.management.CreateFeatureclass(r"c:\temp", "test.shp")
print(create_feature_class())
```
## Run Unit Tests
```shell
cd c:\projects\aputil
[conda activate arcgispro-py3]
python test.py
```
from collections.abc import Generator
from typing import List, Dict, Union
import arcpy, arcpy.da
__all__ = ["xcursor", "XRow"]
class XRow():
""" Wraps an arcpy cursor row. """
def __init__(self, row: List[any], fields: List[str]):
self.row = row
self.fields = fields
self._fields = {field_name.upper(): index for index, field_name in enumerate(fields)}
def __getitem__(self, index: Union[str, int]):
if isinstance(index, int):
return self.get_by_index(index)
return self.get(index)
def __repr__(self):
return "xcursor.XRow({}, {})".format(str(self.row), str(self.fields))
def get(self, field_name: str, default_value: any = None):
"""
Gets the field value for given field.
In addition to just using ["FieldName"], this method can
return a default value when the field's value is None.
"""
if field_name is None or field_name.upper() not in self._fields:
raise KeyError("Field {} does not exist.".format(field_name))
value = self.row[self._fields[field_name.upper()]]
        if value is None:
return default_value
return value
def get_by_index(self, index: int, default_value: any = None):
"""
Gets the field value for given index.
In addition to just using [index], this method can
return a default value when the field's value is None.
"""
if index >= len(self.row):
raise IndexError("Index {} is out of range.".format(index))
value = self.row[index]
        if value is None:
return default_value
return value
def to_dict(self) -> Dict[str, any]:
""" Returns a dictionary representation. """
return {field_name: value for field_name, value in zip(self._fields, self.row)} # pylint: disable=unnecessary-comprehension
def to_row(self, update_values: Dict[str, any] = None) -> List[any]:
""" Returns a copy of the row with updated values if provided. """
return [
value if not update_values or field_name not in update_values else update_values[field_name]
for field_name, value
in zip(self.fields, self.row)
]
def xcursor(cursor: arcpy.da.SearchCursor) -> Generator[XRow, None, None]:
"""
    Generator wrapping an arcpy cursor, yielding XRow instances that provide field access by name:
```python
import arcpy
from aputil import xcursor
feature_class = "points.shp"
with arcpy.da.SearchCursor(feature_class, ["FieldName"]) as cursor:
for row in xcursor(cursor):
print(row["FieldName"]) # instead of row[0]
```
"""
for row in cursor:
        yield XRow(row, cursor.fields)
from typing import Union, List, Dict
import arcpy
__all__ = ["ToolParameters"]
class ToolParameters:
"""
Wraps a list of `arcpy.Parameter`s and allows to index the parameter
by name. Example usage:
```python
import arcpy
from aputil.toolbox import ToolParameters
params = ToolParameters(arcpy.GetParameterInfo())
feature_class = params.get_string("feature_class") # retrieve string
count = params.get_int("count") # retrieve integer
distance = params.get_float("distance") # retrieve float
# and so on
```
"""
def __init__(self, parameters: List[arcpy.Parameter] = None, suppress_errors = True):
""" If `suppress_errors` is `False`, raises errors if a parameter does not
exist or its value conversion failed. """
self.parameters = {p.name: p for p in parameters} if parameters else {}
self.suppress_errors = suppress_errors
def __iter__(self):
self.iterator = iter(self.parameters.items())
return self
def __next__(self):
return next(self.iterator)
def get(self, name: str) -> Union[arcpy.Parameter, None]:
parameter = self.parameters.get(name)
if not parameter and not self.suppress_errors:
raise KeyError(f"Parameter with name {name} does not exist.")
return parameter
def get_params(self) -> List[arcpy.Parameter]:
return self.parameters.values()
def to_dict(self) -> Dict[str, arcpy.Parameter]:
return {**self.parameters}
def get_string(self, name: str, default_value: Union[str, None] = None) -> Union[str, None]:
parameter = self.get(name)
if parameter:
return parameter.valueAsText
return default_value
def get_int(self, name: str, default_value: Union[int, None] = None) -> Union[int, None]:
        """ Returns an int if the parameter's value can be converted
        into an int value. """
value = self.get_string(name)
if not value:
return default_value
try:
return int(value)
except ValueError as e:
if not self.suppress_errors:
raise e
return default_value
def get_float(self, name: str, default_value: Union[float, None] = None) -> Union[float, None]:
""" Returns a float if the parameter's value can be converted
into a float value. """
value = self.get_string(name)
if not value:
return default_value
try:
return float(value)
except ValueError as e:
if not self.suppress_errors:
raise e
return default_value
def get_bool(self, name: str, default_value: Union[bool, None] = None) -> Union[bool, None]:
        """ Returns `True` if the parameter exists and its value
        is a valid boolean value or a valid string or int representation. """
parameter = self.get(name)
if not parameter or not parameter.value:
return default_value
value = parameter.value
if isinstance(value, bool):
return value
if isinstance(value, (str, int, float)):
if str(value).lower() in ("true", "checked", "1", "1.0"):
return True
if str(value).lower() in ("false", "unchecked", "0", "0.0"):
return False
if not self.suppress_errors:
raise ValueError(f"Cannot convert {value} to boolean.")
return default_value
def is_defined(self, name: str) -> bool:
""" Returns `True` if parameter exists and the parameter's
value is not `None`, otherwise returns `False`. Does
not raise an error if parameter does not exist. """
parameter = self.parameters.get(name)
        return parameter is not None and parameter.value is not None
def get_multivalue(self, name: str, empty_list_if_no_value=True) -> List[str]:
parameter = self.get_string(name)
if parameter:
return parameter.split(";")
if empty_list_if_no_value:
return []
return None
def clear_messages(self) -> None:
""" Clears all messages at once. """
for param in self.get_params():
            param.clearMessage()
Compute APDEX from Apache-style logs.
Overview
========
Parses Apache-style logs and generates several statistics intended for a
website developer audience:
- APDEX (Application Performance inDEX, see http://www.apdex.org) ratio
(plotted)
Because you want to know how satisfied your users are.
- hit count (plotted)
Because achieving 100% APDEX is easy when there is nobody around.
- HTTP status codes, with optional detailed output of the most frequent URLs
per error status code, along with their most frequent referers
Because your forgot to update a link to that conditionally-used browser
compatibility javascript you renamed.
- Hottest pages (pages which use rendering time the most)
Because you want to know where to invest time to get highest user experience
improvement.
- ERP5 sites: per-module statistics, with module and document views separated
Because module and document types are not born equal in usage patterns.
Some parsing performance figures:
On a 2.3GHz Core i5, apachedex achieves 97000 lines/s
(pypy-c-jit-62994-bd32583a3f11-linux64) and 43000 lines/s (CPython 2.7).
Those figures were measured on a 3000000-hit logfile, with 3 --skip-base, 1
--erp5-base, 3 --base and --default set. --\*base values were similar in
simplicity to the ones provided in examples below.
What APacheDEX is not
=====================
APacheDEX does not produce website audience statistics like AWStats, Google
Analytics (etc) could do.
APacheDEX does not monitor website availability & resource usage like Zabbix,
Cacti, Ganglia, Nagios (etc) could do.
Requirements
============
Dependencies
------------
As such, apachedex has no strict dependencies outside of standard python 2.7
installation.
But generated output needs a few javascript files which come from other
projects:
- jquery.js
- jquery.flot.js
- jquery.flot.time.js (official flot plugin)
- jquery.flot.axislabels.js (third-party flot plugin)
If you installed apachedex (using an egg or with a distribution's package) you
should have them already.
If you are running from repository, you need to fetch them first::
python setup.py deps
Also, apachedex can make use of backports.lzma
(http://pypi.python.org/pypi/backports.lzma/) if it's installed to support xz
file compression.
Input
-----
All default "combined" log format fields are supported (more can easily be
added), plus %D.
Mandatory fields are (in any order) `%t`, `%r` (for request's URL), `%>s`,
`%{Referer}i`, `%D`. Just tell apachedex the value from your apache log
configuration (see `--logformat` argument documentation).
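For reference, a format matching all mandatory fields is Apache's stock `combined` log format with `%D` appended (the `combined_time` nickname below is arbitrary; check this against your own server configuration before relying on it):

```apache
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %D" combined_time
```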
Input files may be provided uncompressed or compressed in:
- bzip
- gzip2
- xz (if module backports.lzma is installed)
Input filename "-" is understood as stdin.
Output
------
The output is HTML + CSS + JS, so you need a web browser to read it.
Output filename "-" is understood as stdout.
Usage
=====
A few usage examples. See embedded help (`-h`/`--help`) for further options.
Most basic usage::
apachedex --default website access.log
Generate stand-alone output (suitable for inclusion in a mail, for example)::
apachedex --default website --js-embed access.log --out attachment.html
A log file with requests for 2 websites for which individual stats are
desired, and hits outside those base urls are ignored::
apachedex --base "/site1(/|$|\?)" "/site2(/|$|\?)"
A log file with a site section to ignore. Order does not matter::
apachedex --skip-base "/ignored(/|$|\?)" --default website
A mix of both above examples. Order matters !::
apachedex --skip-base "/site1/ignored(/|$|\?)" \
--base "/site1(/|$|\?)" "/site2(/|$|\?)"
Matching non-ASCII urls works by using urlencoded strings::
apachedex --base "/%E6%96%87%E5%AD%97%E5%8C%96%E3%81%91(/|$|\\?)" access.log
Naming websites so that the report looks less intimidating, by interleaving
"+"-prefixed titles with regexes (a title must come just before its regex)::
apachedex --default "Public website" --base "+Back office" \
"/backoffice(/|$|\\?)" "+User access" "/secure(/|$|\\?)" access.log
Saving the result of an analysis for faster reuse::
apachedex --default foo --format json --out save_state.json --period day \
access.log
Although not required, it is strongly advised to provide `--period` argument,
as mixing states saved with different periods (fixed or auto-detected from
data) give hard-to-read results and can cause problems if loaded data gets
converted to a larger period.
Continuing a saved analysis, updating collected data::
apachedex --default foo --format json --state-file save_state.json \
--out save_state.json --period day access.2.log
Generating HTML output from two state files, aggregating their content
without parsing more logs::
apachedex --default foo --state-file save_state.json save_state.2.json \
--period day --out index.html
Configuration files
===================
Providing a filename prefixed by "@" puts the content of that file in place of
that argument, recursively. Each file is loaded relative to the containing
directory of referencing file, or current working directory for command line.
- foo/dev.cfg::
--error-detail
@site.cfg
--stats
- foo/site.cfg::
--default Front-office
# This is a comment
--prefix "+Back office" "/back(/|$|\?)" # This is another comment
--skip-prefix "/baz/ignored(/|$|\?)" --prefix +Something "/baz(/|$|\?)"
- command line::
apachedex --skip-base "/ignored(/|$|\?)" @foo/dev.cfg --out index.html \
access.log
This is equivalent to::
apachedex --skip-base "/ignored(/|$|\?)" --error-detail \
--default Front-office --prefix "+Back office" "/back(/|$|\?)" \
--skip-prefix "/baz/ignored(/|$|\?)" --prefix +Something "/baz(/|$|\?)" \
--stats --out index.html access.log
Portability note: the use of paths containing directory elements inside
configuration files is discouraged, as it's not portable. This may change
later (ex: deciding that import paths are URLs and applying their rules).
Periods
=======
When providing the `--period` argument, two related settings are affected:
- the period represented by each point in a graph (most important for the
hit graph, as it represents the number of hits per such period)
- the period represented by each column in per-period tables (status codes
per date, hits per day...)
Also, when `--period` is not provided, apachedex uses a threshold to decide
when to switch to the next larger period. That threshold was chosen to
correspond to 200 graph points, which represents a varying number of table
columns.
.. table :: Details of `--period` argument
=========== ========== ========== ============== =========================
--period graph table to next period columns until next period
=========== ========== ========== ============== =========================
quarterhour minute 15 minutes 200 minutes 8 (3.3 hours)
halfday 30 minutes 12 hours 100 hours 9 (4.1 days)
day hour day 200 hours 9 (8.3 days)
week 6 hours week 1200 hours 8 (7.1 weeks)
month day month 5000 hours 7 (~6.7 months)
quarter 7 days quarter 1400 days 16 (15.3 weeks)
year month year (n/a) (infinity)
=========== ========== ========== ============== =========================
"7 days" periods used in `--period quarter` are not weeks strictly
speaking: a week starts on Monday or Sunday, depending on the locale.
"7 days" periods start on the first day of the year, for simplicity - and
performance. Weeks used for `--period week` are real weeks, although they
start on Monday independently of the locale.
When there are no hits for more than a graph period, placeholders are
generated at 0 hit value (which is the reality) and 100% apdex (this is
arbitrary). Those placeholders only affect graphs, and do not affect
averages nor table content.
Because not all graph periods are actually equal in length (because of
leap seconds, DST, leap years, and years containing a non-integer number
of weeks), some hit graph points are artificially corrected against these
effects. Here also, the correction only affects graphs, not averages nor
table content. For example, on non-leap years, the year's last "7 days"
period lasts a single day. The plotted hit count is then multiplied
by 7 (and by 3.5 on leap years).
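The 7x and 3.5x figures follow from simple arithmetic: a non-leap year leaves 365 % 7 = 1 day in its final "7 days" bucket, a leap year leaves 2. A small illustrative sketch of that scaling (not APacheDEX's actual implementation):

```python
def last_bucket_scale(days_in_year):
    # Days left over in the year's final 7-day bucket; a year that is an
    # exact multiple of 7 days would need no correction at all.
    remainder = days_in_year % 7 or 7
    return 7 / remainder

print(last_bucket_scale(365))  # 7.0 on non-leap years
print(last_bucket_scale(366))  # 3.5 on leap years
```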
Performance
===========
For better performance...
- pipe decompressed files to apachedex instead of having apachedex decompress
files itself::
bzcat access.log.bz2 | apachedex [...] -
- when letting apachedex decide statistic granularity with multiple log files,
provide earliest and latest log files first (whatever order) so apachedex can
adapt its data structure to analysed time range before there is too much
data::
apachedex [...] access.log.1.gz access.log.99.gz access.log.2.gz \
access.log.3.gz [...] access.98.gz
- parse log files in parallel processes, saving analysis output and aggregating
them in the end::
for LOG in access*.log; do
apachedex "$@" --format json --out "$LOG.json" "$LOG" &
done
wait
apachedex "$@" --out access.html --state-file access.*.json
If you have bash and have an xargs implementation supporting `-P`, you may
want to use `parallel_parse.sh` available in source distribution or from
repository.
Notes
=====
Loading saved states generated with different sets of parameters is not
prevented, but can produce nonsense/unreadable results. Or it can save the day
if you do want to mix different parameters (ex: you have some logs generated
with %T, others with %D).
It is unclear how saved state format will evolve. Be prepared to have
to regenerate saved states when you upgrade APacheDEX.
from __future__ import print_function
try:
import configparser
except ImportError:
import ConfigParser as configparser
import errno
import json
import os
import re
import subprocess
import sys
class VersioneerConfig:
"""Container for Versioneer configuration parameters."""
def get_root():
"""Get the project root directory.
We require that all commands are run from the project root, i.e. the
directory that contains setup.py, setup.cfg, and versioneer.py .
"""
root = os.path.realpath(os.path.abspath(os.getcwd()))
setup_py = os.path.join(root, "setup.py")
versioneer_py = os.path.join(root, "versioneer.py")
if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)):
# allow 'python path/to/setup.py COMMAND'
root = os.path.dirname(os.path.realpath(os.path.abspath(sys.argv[0])))
setup_py = os.path.join(root, "setup.py")
versioneer_py = os.path.join(root, "versioneer.py")
if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)):
err = ("Versioneer was unable to run the project root directory. "
"Versioneer requires setup.py to be executed from "
"its immediate directory (like 'python setup.py COMMAND'), "
"or in a way that lets it use sys.argv[0] to find the root "
"(like 'python path/to/setup.py COMMAND').")
raise VersioneerBadRootError(err)
try:
# Certain runtime workflows (setup.py install/develop in a setuptools
# tree) execute all dependencies in a single python process, so
# "versioneer" may be imported multiple times, and python's shared
# module-import table will cache the first one. So we can't use
# os.path.dirname(__file__), as that will find whichever
# versioneer.py was first imported, even in later projects.
me = os.path.realpath(os.path.abspath(__file__))
me_dir = os.path.normcase(os.path.splitext(me)[0])
vsr_dir = os.path.normcase(os.path.splitext(versioneer_py)[0])
if me_dir != vsr_dir:
print("Warning: build in %s is using versioneer.py from %s"
% (os.path.dirname(me), versioneer_py))
except NameError:
pass
return root
def get_config_from_root(root):
"""Read the project setup.cfg file to determine Versioneer config."""
# This might raise EnvironmentError (if setup.cfg is missing), or
# configparser.NoSectionError (if it lacks a [versioneer] section), or
# configparser.NoOptionError (if it lacks "VCS="). See the docstring at
# the top of versioneer.py for instructions on writing your setup.cfg .
setup_cfg = os.path.join(root, "setup.cfg")
    # ConfigParser.read_file() replaces the deprecated SafeConfigParser and
    # readfp(), which were removed in recent Python 3 releases.
    parser = configparser.ConfigParser()
    with open(setup_cfg, "r") as f:
        parser.read_file(f)
VCS = parser.get("versioneer", "VCS") # mandatory
def get(parser, name):
if parser.has_option("versioneer", name):
return parser.get("versioneer", name)
return None
cfg = VersioneerConfig()
cfg.VCS = VCS
cfg.style = get(parser, "style") or ""
cfg.versionfile_source = get(parser, "versionfile_source")
cfg.versionfile_build = get(parser, "versionfile_build")
cfg.tag_prefix = get(parser, "tag_prefix")
if cfg.tag_prefix in ("''", '""'):
cfg.tag_prefix = ""
cfg.parentdir_prefix = get(parser, "parentdir_prefix")
cfg.verbose = get(parser, "verbose")
return cfg
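# Example: a minimal [versioneer] section in setup.cfg, as consumed by
# get_config_from_root() above (the paths and prefixes shown are
# illustrative, not this project's actual values):
#
#     [versioneer]
#     VCS = git
#     style = pep440
#     versionfile_source = src/mypackage/_version.py
#     versionfile_build = mypackage/_version.py
#     tag_prefix = v
#     parentdir_prefix = mypackage-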
class NotThisMethod(Exception):
"""Exception raised if a method is not valid for the current scenario."""
# these dictionaries contain VCS-specific tools
LONG_VERSION_PY = {}
HANDLERS = {}
def register_vcs_handler(vcs, method): # decorator
"""Decorator to mark a method as the handler for a particular VCS."""
def decorate(f):
"""Store f in HANDLERS[vcs][method]."""
if vcs not in HANDLERS:
HANDLERS[vcs] = {}
HANDLERS[vcs][method] = f
return f
return decorate
def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
env=None):
"""Call the given command(s)."""
assert isinstance(commands, list)
p = None
for c in commands:
try:
dispcmd = str([c] + args)
# remember shell=False, so use git.cmd on windows, not just git
p = subprocess.Popen([c] + args, cwd=cwd, env=env,
stdout=subprocess.PIPE,
stderr=(subprocess.PIPE if hide_stderr
else None))
break
except EnvironmentError:
e = sys.exc_info()[1]
if e.errno == errno.ENOENT:
continue
if verbose:
print("unable to run %s" % dispcmd)
print(e)
return None, None
else:
if verbose:
print("unable to find command, tried %s" % (commands,))
return None, None
stdout = p.communicate()[0].strip()
if sys.version_info[0] >= 3:
stdout = stdout.decode()
if p.returncode != 0:
if verbose:
print("unable to run %s (error)" % dispcmd)
print("stdout was %s" % stdout)
return None, p.returncode
return stdout, p.returncode
LONG_VERSION_PY['git'] = '''
# This file helps to compute a version number in source trees obtained from
# git-archive tarball (such as those provided by github's download-from-tag
# feature). Distribution tarballs (built by setup.py sdist) and build
# directories (produced by setup.py build) will contain a much shorter file
# that just contains the computed version number.
# This file is released into the public domain. Generated by
# versioneer-0.18 (https://github.com/warner/python-versioneer)
"""Git implementation of _version.py."""
import errno
import os
import re
import subprocess
import sys
def get_keywords():
"""Get the keywords needed to look up the version information."""
# these strings will be replaced by git during git-archive.
# setup.py/versioneer.py will grep for the variable names, so they must
# each be defined on a line of their own. _version.py will just call
# get_keywords().
git_refnames = "%(DOLLAR)sFormat:%%d%(DOLLAR)s"
git_full = "%(DOLLAR)sFormat:%%H%(DOLLAR)s"
git_date = "%(DOLLAR)sFormat:%%ci%(DOLLAR)s"
keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
return keywords
class VersioneerConfig:
"""Container for Versioneer configuration parameters."""
def get_config():
"""Create, populate and return the VersioneerConfig() object."""
# these strings are filled in when 'setup.py versioneer' creates
# _version.py
cfg = VersioneerConfig()
cfg.VCS = "git"
cfg.style = "%(STYLE)s"
cfg.tag_prefix = "%(TAG_PREFIX)s"
cfg.parentdir_prefix = "%(PARENTDIR_PREFIX)s"
cfg.versionfile_source = "%(VERSIONFILE_SOURCE)s"
cfg.verbose = False
return cfg
class NotThisMethod(Exception):
"""Exception raised if a method is not valid for the current scenario."""
LONG_VERSION_PY = {}
HANDLERS = {}
def register_vcs_handler(vcs, method): # decorator
"""Decorator to mark a method as the handler for a particular VCS."""
def decorate(f):
"""Store f in HANDLERS[vcs][method]."""
if vcs not in HANDLERS:
HANDLERS[vcs] = {}
HANDLERS[vcs][method] = f
return f
return decorate
def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
env=None):
"""Call the given command(s)."""
assert isinstance(commands, list)
p = None
for c in commands:
try:
dispcmd = str([c] + args)
# remember shell=False, so use git.cmd on windows, not just git
p = subprocess.Popen([c] + args, cwd=cwd, env=env,
stdout=subprocess.PIPE,
stderr=(subprocess.PIPE if hide_stderr
else None))
break
except EnvironmentError:
e = sys.exc_info()[1]
if e.errno == errno.ENOENT:
continue
if verbose:
print("unable to run %%s" %% dispcmd)
print(e)
return None, None
else:
if verbose:
print("unable to find command, tried %%s" %% (commands,))
return None, None
stdout = p.communicate()[0].strip()
if sys.version_info[0] >= 3:
stdout = stdout.decode()
if p.returncode != 0:
if verbose:
print("unable to run %%s (error)" %% dispcmd)
print("stdout was %%s" %% stdout)
return None, p.returncode
return stdout, p.returncode
def versions_from_parentdir(parentdir_prefix, root, verbose):
"""Try to determine the version from the parent directory name.
Source tarballs conventionally unpack into a directory that includes both
the project name and a version string. We will also support searching up
    two directory levels for an appropriately named parent directory.
"""
rootdirs = []
for i in range(3):
dirname = os.path.basename(root)
if dirname.startswith(parentdir_prefix):
return {"version": dirname[len(parentdir_prefix):],
"full-revisionid": None,
"dirty": False, "error": None, "date": None}
else:
rootdirs.append(root)
root = os.path.dirname(root) # up a level
if verbose:
print("Tried directories %%s but none started with prefix %%s" %%
(str(rootdirs), parentdir_prefix))
raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
@register_vcs_handler("git", "get_keywords")
def git_get_keywords(versionfile_abs):
"""Extract version information from the given file."""
# the code embedded in _version.py can just fetch the value of these
# keywords. When used from setup.py, we don't want to import _version.py,
# so we do it with a regexp instead. This function is not used from
# _version.py.
keywords = {}
try:
        with open(versionfile_abs, "r") as f:
            for line in f.readlines():
                if line.strip().startswith("git_refnames ="):
                    mo = re.search(r'=\s*"(.*)"', line)
                    if mo:
                        keywords["refnames"] = mo.group(1)
                if line.strip().startswith("git_full ="):
                    mo = re.search(r'=\s*"(.*)"', line)
                    if mo:
                        keywords["full"] = mo.group(1)
                if line.strip().startswith("git_date ="):
                    mo = re.search(r'=\s*"(.*)"', line)
                    if mo:
                        keywords["date"] = mo.group(1)
except EnvironmentError:
pass
return keywords
@register_vcs_handler("git", "keywords")
def git_versions_from_keywords(keywords, tag_prefix, verbose):
"""Get version information from git keywords."""
if not keywords:
raise NotThisMethod("no keywords at all, weird")
date = keywords.get("date")
if date is not None:
# git-2.2.0 added "%%cI", which expands to an ISO-8601 -compliant
# datestamp. However we prefer "%%ci" (which expands to an "ISO-8601
# -like" string, which we must then edit to make compliant), because
# it's been around since git-1.5.3, and it's too difficult to
# discover which version we're using, or to work around using an
# older one.
date = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
refnames = keywords["refnames"].strip()
if refnames.startswith("$Format"):
if verbose:
print("keywords are unexpanded, not using")
raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
refs = set([r.strip() for r in refnames.strip("()").split(",")])
# starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
# just "foo-1.0". If we see a "tag: " prefix, prefer those.
TAG = "tag: "
tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
if not tags:
# Either we're using git < 1.8.3, or there really are no tags. We use
# a heuristic: assume all version tags have a digit. The old git %%d
# expansion behaves like git log --decorate=short and strips out the
# refs/heads/ and refs/tags/ prefixes that would let us distinguish
# between branches and tags. By ignoring refnames without digits, we
# filter out many common branch names like "release" and
# "stabilization", as well as "HEAD" and "master".
tags = set([r for r in refs if re.search(r'\d', r)])
if verbose:
print("discarding '%%s', no digits" %% ",".join(refs - tags))
if verbose:
print("likely tags: %%s" %% ",".join(sorted(tags)))
for ref in sorted(tags):
# sorting will prefer e.g. "2.0" over "2.0rc1"
if ref.startswith(tag_prefix):
r = ref[len(tag_prefix):]
if verbose:
print("picking %%s" %% r)
return {"version": r,
"full-revisionid": keywords["full"].strip(),
"dirty": False, "error": None,
"date": date}
# no suitable tags, so version is "0+unknown", but full hex is still there
if verbose:
print("no suitable tags, using unknown + full revision id")
return {"version": "0+unknown",
"full-revisionid": keywords["full"].strip(),
"dirty": False, "error": "no suitable tags", "date": None}
@register_vcs_handler("git", "pieces_from_vcs")
def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
"""Get version from 'git describe' in the root of the source tree.
This only gets called if the git-archive 'subst' keywords were *not*
expanded, and _version.py hasn't already been rewritten with a short
version string, meaning we're inside a checked out source tree.
"""
GITS = ["git"]
if sys.platform == "win32":
GITS = ["git.cmd", "git.exe"]
out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
hide_stderr=True)
if rc != 0:
if verbose:
print("Directory %%s not under git control" %% root)
raise NotThisMethod("'git rev-parse --git-dir' returned error")
# if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
# if there isn't one, this yields HEX[-dirty] (no NUM)
describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
"--always", "--long",
"--match", "%%s*" %% tag_prefix],
cwd=root)
# --long was added in git-1.5.5
if describe_out is None:
raise NotThisMethod("'git describe' failed")
describe_out = describe_out.strip()
full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
if full_out is None:
raise NotThisMethod("'git rev-parse' failed")
full_out = full_out.strip()
pieces = {}
pieces["long"] = full_out
pieces["short"] = full_out[:7] # maybe improved later
pieces["error"] = None
# parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
# TAG might have hyphens.
git_describe = describe_out
# look for -dirty suffix
dirty = git_describe.endswith("-dirty")
pieces["dirty"] = dirty
if dirty:
git_describe = git_describe[:git_describe.rindex("-dirty")]
# now we have TAG-NUM-gHEX or HEX
if "-" in git_describe:
# TAG-NUM-gHEX
mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
if not mo:
# unparseable. Maybe git-describe is misbehaving?
pieces["error"] = ("unable to parse git-describe output: '%%s'"
%% describe_out)
return pieces
# tag
full_tag = mo.group(1)
if not full_tag.startswith(tag_prefix):
if verbose:
fmt = "tag '%%s' doesn't start with prefix '%%s'"
print(fmt %% (full_tag, tag_prefix))
pieces["error"] = ("tag '%%s' doesn't start with prefix '%%s'"
%% (full_tag, tag_prefix))
return pieces
pieces["closest-tag"] = full_tag[len(tag_prefix):]
# distance: number of commits since tag
pieces["distance"] = int(mo.group(2))
# commit: short hex revision ID
pieces["short"] = mo.group(3)
else:
# HEX: no tags
pieces["closest-tag"] = None
count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
cwd=root)
pieces["distance"] = int(count_out) # total number of commits
# commit date: see ISO-8601 comment in git_versions_from_keywords()
date = run_command(GITS, ["show", "-s", "--format=%%ci", "HEAD"],
cwd=root)[0].strip()
pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
return pieces
def plus_or_dot(pieces):
"""Return a + if we don't already have one, else return a ."""
if "+" in pieces.get("closest-tag", ""):
return "."
return "+"
def render_pep440(pieces):
"""Build up version string, with post-release "local version identifier".
Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you
get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty
Exceptions:
1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += plus_or_dot(pieces)
rendered += "%%d.g%%s" %% (pieces["distance"], pieces["short"])
if pieces["dirty"]:
rendered += ".dirty"
else:
# exception #1
rendered = "0+untagged.%%d.g%%s" %% (pieces["distance"],
pieces["short"])
if pieces["dirty"]:
rendered += ".dirty"
return rendered
def render_pep440_pre(pieces):
"""TAG[.post.devDISTANCE] -- No -dirty.
Exceptions:
1: no tags. 0.post.devDISTANCE
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"]:
rendered += ".post.dev%%d" %% pieces["distance"]
else:
# exception #1
rendered = "0.post.dev%%d" %% pieces["distance"]
return rendered
def render_pep440_post(pieces):
"""TAG[.postDISTANCE[.dev0]+gHEX] .
The ".dev0" means dirty. Note that .dev0 sorts backwards
(a dirty tree will appear "older" than the corresponding clean one),
but you shouldn't be releasing software with -dirty anyways.
Exceptions:
1: no tags. 0.postDISTANCE[.dev0]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += ".post%%d" %% pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
rendered += plus_or_dot(pieces)
rendered += "g%%s" %% pieces["short"]
else:
# exception #1
rendered = "0.post%%d" %% pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
rendered += "+g%%s" %% pieces["short"]
return rendered
def render_pep440_old(pieces):
"""TAG[.postDISTANCE[.dev0]] .
The ".dev0" means dirty.
    Exceptions:
1: no tags. 0.postDISTANCE[.dev0]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += ".post%%d" %% pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
else:
# exception #1
rendered = "0.post%%d" %% pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
return rendered
def render_git_describe(pieces):
"""TAG[-DISTANCE-gHEX][-dirty].
Like 'git describe --tags --dirty --always'.
Exceptions:
1: no tags. HEX[-dirty] (note: no 'g' prefix)
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"]:
rendered += "-%%d-g%%s" %% (pieces["distance"], pieces["short"])
else:
# exception #1
rendered = pieces["short"]
if pieces["dirty"]:
rendered += "-dirty"
return rendered
def render_git_describe_long(pieces):
"""TAG-DISTANCE-gHEX[-dirty].
    Like 'git describe --tags --dirty --always --long'.
The distance/hash is unconditional.
Exceptions:
1: no tags. HEX[-dirty] (note: no 'g' prefix)
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
rendered += "-%%d-g%%s" %% (pieces["distance"], pieces["short"])
else:
# exception #1
rendered = pieces["short"]
if pieces["dirty"]:
rendered += "-dirty"
return rendered
def render(pieces, style):
"""Render the given version pieces into the requested style."""
if pieces["error"]:
return {"version": "unknown",
"full-revisionid": pieces.get("long"),
"dirty": None,
"error": pieces["error"],
"date": None}
if not style or style == "default":
style = "pep440" # the default
if style == "pep440":
rendered = render_pep440(pieces)
elif style == "pep440-pre":
rendered = render_pep440_pre(pieces)
elif style == "pep440-post":
rendered = render_pep440_post(pieces)
elif style == "pep440-old":
rendered = render_pep440_old(pieces)
elif style == "git-describe":
rendered = render_git_describe(pieces)
elif style == "git-describe-long":
rendered = render_git_describe_long(pieces)
else:
raise ValueError("unknown style '%%s'" %% style)
return {"version": rendered, "full-revisionid": pieces["long"],
"dirty": pieces["dirty"], "error": None,
"date": pieces.get("date")}
def get_versions():
"""Get version information or return default if unable to do so."""
# I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have
# __file__, we can work backwards from there to the root. Some
# py2exe/bbfreeze/non-CPython implementations don't do __file__, in which
# case we can only use expanded keywords.
cfg = get_config()
verbose = cfg.verbose
try:
return git_versions_from_keywords(get_keywords(), cfg.tag_prefix,
verbose)
except NotThisMethod:
pass
try:
root = os.path.realpath(__file__)
# versionfile_source is the relative path from the top of the source
# tree (where the .git directory might live) to this file. Invert
# this to find the root from __file__.
for i in cfg.versionfile_source.split('/'):
root = os.path.dirname(root)
except NameError:
return {"version": "0+unknown", "full-revisionid": None,
"dirty": None,
"error": "unable to find root of source tree",
"date": None}
try:
pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose)
return render(pieces, cfg.style)
except NotThisMethod:
pass
try:
if cfg.parentdir_prefix:
return versions_from_parentdir(cfg.parentdir_prefix, root, verbose)
except NotThisMethod:
pass
return {"version": "0+unknown", "full-revisionid": None,
"dirty": None,
"error": "unable to compute version", "date": None}
'''
@register_vcs_handler("git", "get_keywords")
def git_get_keywords(versionfile_abs):
"""Extract version information from the given file."""
# the code embedded in _version.py can just fetch the value of these
# keywords. When used from setup.py, we don't want to import _version.py,
# so we do it with a regexp instead. This function is not used from
# _version.py.
keywords = {}
try:
        with open(versionfile_abs, "r") as f:
            for line in f.readlines():
                if line.strip().startswith("git_refnames ="):
                    mo = re.search(r'=\s*"(.*)"', line)
                    if mo:
                        keywords["refnames"] = mo.group(1)
                if line.strip().startswith("git_full ="):
                    mo = re.search(r'=\s*"(.*)"', line)
                    if mo:
                        keywords["full"] = mo.group(1)
                if line.strip().startswith("git_date ="):
                    mo = re.search(r'=\s*"(.*)"', line)
                    if mo:
                        keywords["date"] = mo.group(1)
except EnvironmentError:
pass
return keywords
@register_vcs_handler("git", "keywords")
def git_versions_from_keywords(keywords, tag_prefix, verbose):
"""Get version information from git keywords."""
if not keywords:
raise NotThisMethod("no keywords at all, weird")
date = keywords.get("date")
if date is not None:
# git-2.2.0 added "%cI", which expands to an ISO-8601 -compliant
# datestamp. However we prefer "%ci" (which expands to an "ISO-8601
# -like" string, which we must then edit to make compliant), because
# it's been around since git-1.5.3, and it's too difficult to
# discover which version we're using, or to work around using an
# older one.
date = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
refnames = keywords["refnames"].strip()
if refnames.startswith("$Format"):
if verbose:
print("keywords are unexpanded, not using")
raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
refs = set([r.strip() for r in refnames.strip("()").split(",")])
# starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
# just "foo-1.0". If we see a "tag: " prefix, prefer those.
TAG = "tag: "
tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
if not tags:
# Either we're using git < 1.8.3, or there really are no tags. We use
# a heuristic: assume all version tags have a digit. The old git %d
# expansion behaves like git log --decorate=short and strips out the
# refs/heads/ and refs/tags/ prefixes that would let us distinguish
# between branches and tags. By ignoring refnames without digits, we
# filter out many common branch names like "release" and
# "stabilization", as well as "HEAD" and "master".
tags = set([r for r in refs if re.search(r'\d', r)])
if verbose:
print("discarding '%s', no digits" % ",".join(refs - tags))
if verbose:
print("likely tags: %s" % ",".join(sorted(tags)))
for ref in sorted(tags):
# sorting will prefer e.g. "2.0" over "2.0rc1"
if ref.startswith(tag_prefix):
r = ref[len(tag_prefix):]
if verbose:
print("picking %s" % r)
return {"version": r,
"full-revisionid": keywords["full"].strip(),
"dirty": False, "error": None,
"date": date}
# no suitable tags, so version is "0+unknown", but full hex is still there
if verbose:
print("no suitable tags, using unknown + full revision id")
return {"version": "0+unknown",
"full-revisionid": keywords["full"].strip(),
"dirty": False, "error": "no suitable tags", "date": None}
@register_vcs_handler("git", "pieces_from_vcs")
def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
"""Get version from 'git describe' in the root of the source tree.
This only gets called if the git-archive 'subst' keywords were *not*
expanded, and _version.py hasn't already been rewritten with a short
version string, meaning we're inside a checked out source tree.
"""
GITS = ["git"]
if sys.platform == "win32":
GITS = ["git.cmd", "git.exe"]
out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
hide_stderr=True)
if rc != 0:
if verbose:
print("Directory %s not under git control" % root)
raise NotThisMethod("'git rev-parse --git-dir' returned error")
# if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
# if there isn't one, this yields HEX[-dirty] (no NUM)
describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
"--always", "--long",
"--match", "%s*" % tag_prefix],
cwd=root)
# --long was added in git-1.5.5
if describe_out is None:
raise NotThisMethod("'git describe' failed")
describe_out = describe_out.strip()
full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
if full_out is None:
raise NotThisMethod("'git rev-parse' failed")
full_out = full_out.strip()
pieces = {}
pieces["long"] = full_out
pieces["short"] = full_out[:7] # maybe improved later
pieces["error"] = None
# parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
# TAG might have hyphens.
git_describe = describe_out
# look for -dirty suffix
dirty = git_describe.endswith("-dirty")
pieces["dirty"] = dirty
if dirty:
git_describe = git_describe[:git_describe.rindex("-dirty")]
# now we have TAG-NUM-gHEX or HEX
if "-" in git_describe:
# TAG-NUM-gHEX
mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
if not mo:
# unparseable. Maybe git-describe is misbehaving?
pieces["error"] = ("unable to parse git-describe output: '%s'"
% describe_out)
return pieces
# tag
full_tag = mo.group(1)
if not full_tag.startswith(tag_prefix):
if verbose:
fmt = "tag '%s' doesn't start with prefix '%s'"
print(fmt % (full_tag, tag_prefix))
pieces["error"] = ("tag '%s' doesn't start with prefix '%s'"
% (full_tag, tag_prefix))
return pieces
pieces["closest-tag"] = full_tag[len(tag_prefix):]
# distance: number of commits since tag
pieces["distance"] = int(mo.group(2))
# commit: short hex revision ID
pieces["short"] = mo.group(3)
else:
# HEX: no tags
pieces["closest-tag"] = None
count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
cwd=root)
pieces["distance"] = int(count_out) # total number of commits
# commit date: see ISO-8601 comment in git_versions_from_keywords()
date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"],
cwd=root)[0].strip()
pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
return pieces
def do_vcs_install(manifest_in, versionfile_source, ipy):
"""Git-specific installation logic for Versioneer.
For Git, this means creating/changing .gitattributes to mark _version.py
for export-subst keyword substitution.
"""
GITS = ["git"]
if sys.platform == "win32":
GITS = ["git.cmd", "git.exe"]
files = [manifest_in, versionfile_source]
if ipy:
files.append(ipy)
try:
me = __file__
if me.endswith(".pyc") or me.endswith(".pyo"):
me = os.path.splitext(me)[0] + ".py"
versioneer_file = os.path.relpath(me)
except NameError:
versioneer_file = "versioneer.py"
files.append(versioneer_file)
present = False
try:
        with open(".gitattributes", "r") as f:
            for line in f.readlines():
                if line.strip().startswith(versionfile_source):
                    if "export-subst" in line.strip().split()[1:]:
                        present = True
except EnvironmentError:
pass
if not present:
        with open(".gitattributes", "a+") as f:
            f.write("%s export-subst\n" % versionfile_source)
files.append(".gitattributes")
run_command(GITS, ["add", "--"] + files)
def versions_from_parentdir(parentdir_prefix, root, verbose):
"""Try to determine the version from the parent directory name.
Source tarballs conventionally unpack into a directory that includes both
the project name and a version string. We will also support searching up
    two directory levels for an appropriately named parent directory.
"""
rootdirs = []
for i in range(3):
dirname = os.path.basename(root)
if dirname.startswith(parentdir_prefix):
return {"version": dirname[len(parentdir_prefix):],
"full-revisionid": None,
"dirty": False, "error": None, "date": None}
else:
rootdirs.append(root)
root = os.path.dirname(root) # up a level
if verbose:
print("Tried directories %s but none started with prefix %s" %
(str(rootdirs), parentdir_prefix))
raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
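# Example (illustrative, not executed; the prefix and path are hypothetical):
# an sdist unpacked into "mypackage-1.2.3/" with
# parentdir_prefix = "mypackage-" yields:
#
#     versions_from_parentdir("mypackage-", "/tmp/mypackage-1.2.3", False)
#     # -> {"version": "1.2.3", "full-revisionid": None, "dirty": False,
#     #     "error": None, "date": None}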
SHORT_VERSION_PY = """
# This file was generated by 'versioneer.py' (0.18) from
# revision-control system data, or from the parent directory name of an
# unpacked source archive. Distribution tarballs contain a pre-generated copy
# of this file.
import json
version_json = '''
%s
''' # END VERSION_JSON
def get_versions():
return json.loads(version_json)
"""
def versions_from_file(filename):
"""Try to determine the version from _version.py if present."""
try:
with open(filename) as f:
contents = f.read()
except EnvironmentError:
raise NotThisMethod("unable to read _version.py")
mo = re.search(r"version_json = '''\n(.*)''' # END VERSION_JSON",
contents, re.M | re.S)
if not mo:
mo = re.search(r"version_json = '''\r\n(.*)''' # END VERSION_JSON",
contents, re.M | re.S)
if not mo:
raise NotThisMethod("no version_json in _version.py")
return json.loads(mo.group(1))
def write_to_version_file(filename, versions):
"""Write the given version number to the given _version.py file."""
os.unlink(filename)
contents = json.dumps(versions, sort_keys=True,
indent=1, separators=(",", ": "))
with open(filename, "w") as f:
f.write(SHORT_VERSION_PY % contents)
print("set %s to '%s'" % (filename, versions["version"]))
def plus_or_dot(pieces):
"""Return a + if we don't already have one, else return a ."""
if "+" in pieces.get("closest-tag", ""):
return "."
return "+"
def render_pep440(pieces):
"""Build up version string, with post-release "local version identifier".
Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you
get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty
Exceptions:
1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += plus_or_dot(pieces)
rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
if pieces["dirty"]:
rendered += ".dirty"
else:
# exception #1
rendered = "0+untagged.%d.g%s" % (pieces["distance"],
pieces["short"])
if pieces["dirty"]:
rendered += ".dirty"
return rendered
def render_pep440_pre(pieces):
"""TAG[.post.devDISTANCE] -- No -dirty.
Exceptions:
1: no tags. 0.post.devDISTANCE
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"]:
rendered += ".post.dev%d" % pieces["distance"]
else:
# exception #1
rendered = "0.post.dev%d" % pieces["distance"]
return rendered
def render_pep440_post(pieces):
"""TAG[.postDISTANCE[.dev0]+gHEX] .
The ".dev0" means dirty. Note that .dev0 sorts backwards
(a dirty tree will appear "older" than the corresponding clean one),
but you shouldn't be releasing software with -dirty anyways.
Exceptions:
1: no tags. 0.postDISTANCE[.dev0]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += ".post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
rendered += plus_or_dot(pieces)
rendered += "g%s" % pieces["short"]
else:
# exception #1
rendered = "0.post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
rendered += "+g%s" % pieces["short"]
return rendered
def render_pep440_old(pieces):
"""TAG[.postDISTANCE[.dev0]] .
The ".dev0" means dirty.
    Exceptions:
1: no tags. 0.postDISTANCE[.dev0]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += ".post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
else:
# exception #1
rendered = "0.post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
return rendered
def render_git_describe(pieces):
"""TAG[-DISTANCE-gHEX][-dirty].
Like 'git describe --tags --dirty --always'.
Exceptions:
1: no tags. HEX[-dirty] (note: no 'g' prefix)
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"]:
rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
else:
# exception #1
rendered = pieces["short"]
if pieces["dirty"]:
rendered += "-dirty"
return rendered
def render_git_describe_long(pieces):
"""TAG-DISTANCE-gHEX[-dirty].
    Like 'git describe --tags --dirty --always --long'.
The distance/hash is unconditional.
Exceptions:
1: no tags. HEX[-dirty] (note: no 'g' prefix)
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
else:
# exception #1
rendered = pieces["short"]
if pieces["dirty"]:
rendered += "-dirty"
return rendered
def render(pieces, style):
"""Render the given version pieces into the requested style."""
if pieces["error"]:
return {"version": "unknown",
"full-revisionid": pieces.get("long"),
"dirty": None,
"error": pieces["error"],
"date": None}
if not style or style == "default":
style = "pep440" # the default
if style == "pep440":
rendered = render_pep440(pieces)
elif style == "pep440-pre":
rendered = render_pep440_pre(pieces)
elif style == "pep440-post":
rendered = render_pep440_post(pieces)
elif style == "pep440-old":
rendered = render_pep440_old(pieces)
elif style == "git-describe":
rendered = render_git_describe(pieces)
elif style == "git-describe-long":
rendered = render_git_describe_long(pieces)
else:
raise ValueError("unknown style '%s'" % style)
return {"version": rendered, "full-revisionid": pieces["long"],
"dirty": pieces["dirty"], "error": None,
"date": pieces.get("date")}
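# Example (illustrative, not executed; the "pieces" values below are
# hypothetical, as produced by git_pieces_from_vcs() above):
#
#     pieces = {"closest-tag": "1.2.3", "distance": 2, "dirty": True,
#               "short": "abc1234", "long": "abc1234" + "0" * 33,
#               "error": None, "date": "2018-01-01T00:00:00+0000"}
#     render(pieces, "pep440")["version"]        # -> "1.2.3+2.gabc1234.dirty"
#     render(pieces, "git-describe")["version"]  # -> "1.2.3-2-gabc1234-dirty"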
class VersioneerBadRootError(Exception):
"""The project root directory is unknown or missing key files."""
def get_versions(verbose=False):
"""Get the project version from whatever source is available.
Returns dict with two keys: 'version' and 'full'.
"""
if "versioneer" in sys.modules:
# see the discussion in cmdclass.py:get_cmdclass()
del sys.modules["versioneer"]
root = get_root()
cfg = get_config_from_root(root)
assert cfg.VCS is not None, "please set [versioneer]VCS= in setup.cfg"
handlers = HANDLERS.get(cfg.VCS)
assert handlers, "unrecognized VCS '%s'" % cfg.VCS
verbose = verbose or cfg.verbose
assert cfg.versionfile_source is not None, \
"please set versioneer.versionfile_source"
assert cfg.tag_prefix is not None, "please set versioneer.tag_prefix"
versionfile_abs = os.path.join(root, cfg.versionfile_source)
# extract version from first of: _version.py, VCS command (e.g. 'git
# describe'), parentdir. This is meant to work for developers using a
# source checkout, for users of a tarball created by 'setup.py sdist',
# and for users of a tarball/zipball created by 'git archive' or github's
# download-from-tag feature or the equivalent in other VCSes.
get_keywords_f = handlers.get("get_keywords")
from_keywords_f = handlers.get("keywords")
if get_keywords_f and from_keywords_f:
try:
keywords = get_keywords_f(versionfile_abs)
ver = from_keywords_f(keywords, cfg.tag_prefix, verbose)
if verbose:
print("got version from expanded keyword %s" % ver)
return ver
except NotThisMethod:
pass
try:
ver = versions_from_file(versionfile_abs)
if verbose:
print("got version from file %s %s" % (versionfile_abs, ver))
return ver
except NotThisMethod:
pass
from_vcs_f = handlers.get("pieces_from_vcs")
if from_vcs_f:
try:
pieces = from_vcs_f(cfg.tag_prefix, root, verbose)
ver = render(pieces, cfg.style)
if verbose:
print("got version from VCS %s" % ver)
return ver
except NotThisMethod:
pass
try:
if cfg.parentdir_prefix:
ver = versions_from_parentdir(cfg.parentdir_prefix, root, verbose)
if verbose:
print("got version from parentdir %s" % ver)
return ver
except NotThisMethod:
pass
if verbose:
print("unable to compute version")
return {"version": "0+unknown", "full-revisionid": None,
"dirty": None, "error": "unable to compute version",
"date": None}
def get_version():
"""Get the short version string for this project."""
return get_versions()["version"]
def get_cmdclass():
"""Get the custom setuptools/distutils subclasses used by Versioneer."""
if "versioneer" in sys.modules:
del sys.modules["versioneer"]
# this fixes the "python setup.py develop" case (also 'install' and
# 'easy_install .'), in which subdependencies of the main project are
# built (using setup.py bdist_egg) in the same python process. Assume
# a main project A and a dependency B, which use different versions
# of Versioneer. A's setup.py imports A's Versioneer, leaving it in
# sys.modules by the time B's setup.py is executed, causing B to run
# with the wrong versioneer. Setuptools wraps the sub-dep builds in a
    # sandbox that restores sys.modules to its pre-build state, so the
# parent is protected against the child's "import versioneer". By
# removing ourselves from sys.modules here, before the child build
# happens, we protect the child from the parent's versioneer too.
# Also see https://github.com/warner/python-versioneer/issues/52
cmds = {}
# we add "version" to both distutils and setuptools
from distutils.core import Command
class cmd_version(Command):
description = "report generated version string"
user_options = []
boolean_options = []
def initialize_options(self):
pass
def finalize_options(self):
pass
def run(self):
vers = get_versions(verbose=True)
print("Version: %s" % vers["version"])
print(" full-revisionid: %s" % vers.get("full-revisionid"))
print(" dirty: %s" % vers.get("dirty"))
print(" date: %s" % vers.get("date"))
if vers["error"]:
print(" error: %s" % vers["error"])
cmds["version"] = cmd_version
# we override "build_py" in both distutils and setuptools
#
# most invocation pathways end up running build_py:
# distutils/build -> build_py
# distutils/install -> distutils/build ->..
# setuptools/bdist_wheel -> distutils/install ->..
# setuptools/bdist_egg -> distutils/install_lib -> build_py
# setuptools/install -> bdist_egg ->..
# setuptools/develop -> ?
# pip install:
# copies source tree to a tempdir before running egg_info/etc
# if .git isn't copied too, 'git describe' will fail
# then does setup.py bdist_wheel, or sometimes setup.py install
# setup.py egg_info -> ?
# we override different "build_py" commands for both environments
if "setuptools" in sys.modules:
from setuptools.command.build_py import build_py as _build_py
else:
from distutils.command.build_py import build_py as _build_py
class cmd_build_py(_build_py):
def run(self):
root = get_root()
cfg = get_config_from_root(root)
versions = get_versions()
_build_py.run(self)
# now locate _version.py in the new build/ directory and replace
# it with an updated value
if cfg.versionfile_build:
target_versionfile = os.path.join(self.build_lib,
cfg.versionfile_build)
print("UPDATING %s" % target_versionfile)
write_to_version_file(target_versionfile, versions)
cmds["build_py"] = cmd_build_py
if "cx_Freeze" in sys.modules: # cx_freeze enabled?
from cx_Freeze.dist import build_exe as _build_exe
# nczeczulin reports that py2exe won't like the pep440-style string
# as FILEVERSION, but it can be used for PRODUCTVERSION, e.g.
# setup(console=[{
# "version": versioneer.get_version().split("+", 1)[0], # FILEVERSION
# "product_version": versioneer.get_version(),
# ...
class cmd_build_exe(_build_exe):
def run(self):
root = get_root()
cfg = get_config_from_root(root)
versions = get_versions()
target_versionfile = cfg.versionfile_source
print("UPDATING %s" % target_versionfile)
write_to_version_file(target_versionfile, versions)
_build_exe.run(self)
os.unlink(target_versionfile)
with open(cfg.versionfile_source, "w") as f:
LONG = LONG_VERSION_PY[cfg.VCS]
f.write(LONG %
{"DOLLAR": "$",
"STYLE": cfg.style,
"TAG_PREFIX": cfg.tag_prefix,
"PARENTDIR_PREFIX": cfg.parentdir_prefix,
"VERSIONFILE_SOURCE": cfg.versionfile_source,
})
cmds["build_exe"] = cmd_build_exe
del cmds["build_py"]
if 'py2exe' in sys.modules: # py2exe enabled?
try:
from py2exe.distutils_buildexe import py2exe as _py2exe # py3
except ImportError:
from py2exe.build_exe import py2exe as _py2exe # py2
class cmd_py2exe(_py2exe):
def run(self):
root = get_root()
cfg = get_config_from_root(root)
versions = get_versions()
target_versionfile = cfg.versionfile_source
print("UPDATING %s" % target_versionfile)
write_to_version_file(target_versionfile, versions)
_py2exe.run(self)
os.unlink(target_versionfile)
with open(cfg.versionfile_source, "w") as f:
LONG = LONG_VERSION_PY[cfg.VCS]
f.write(LONG %
{"DOLLAR": "$",
"STYLE": cfg.style,
"TAG_PREFIX": cfg.tag_prefix,
"PARENTDIR_PREFIX": cfg.parentdir_prefix,
"VERSIONFILE_SOURCE": cfg.versionfile_source,
})
cmds["py2exe"] = cmd_py2exe
# we override different "sdist" commands for both environments
if "setuptools" in sys.modules:
from setuptools.command.sdist import sdist as _sdist
else:
from distutils.command.sdist import sdist as _sdist
class cmd_sdist(_sdist):
def run(self):
versions = get_versions()
self._versioneer_generated_versions = versions
# unless we update this, the command will keep using the old
# version
self.distribution.metadata.version = versions["version"]
return _sdist.run(self)
def make_release_tree(self, base_dir, files):
root = get_root()
cfg = get_config_from_root(root)
_sdist.make_release_tree(self, base_dir, files)
# now locate _version.py in the new base_dir directory
# (remembering that it may be a hardlink) and replace it with an
# updated value
target_versionfile = os.path.join(base_dir, cfg.versionfile_source)
print("UPDATING %s" % target_versionfile)
write_to_version_file(target_versionfile,
self._versioneer_generated_versions)
cmds["sdist"] = cmd_sdist
return cmds
CONFIG_ERROR = """
setup.cfg is missing the necessary Versioneer configuration. You need
a section like:
[versioneer]
VCS = git
style = pep440
versionfile_source = src/myproject/_version.py
versionfile_build = myproject/_version.py
tag_prefix =
parentdir_prefix = myproject-
You will also need to edit your setup.py to use the results:
import versioneer
setup(version=versioneer.get_version(),
cmdclass=versioneer.get_cmdclass(), ...)
Please read the docstring in ./versioneer.py for configuration instructions,
edit setup.cfg, and re-run the installer or 'python versioneer.py setup'.
"""
SAMPLE_CONFIG = """
# See the docstring in versioneer.py for instructions. Note that you must
# re-run 'versioneer.py setup' after changing this section, and commit the
# resulting files.
[versioneer]
#VCS = git
#style = pep440
#versionfile_source =
#versionfile_build =
#tag_prefix =
#parentdir_prefix =
"""
INIT_PY_SNIPPET = """
from ._version import get_versions
__version__ = get_versions()['version']
del get_versions
"""
def do_setup():
"""Main VCS-independent setup function for installing Versioneer."""
root = get_root()
try:
cfg = get_config_from_root(root)
except (EnvironmentError, configparser.NoSectionError,
configparser.NoOptionError) as e:
if isinstance(e, (EnvironmentError, configparser.NoSectionError)):
print("Adding sample versioneer config to setup.cfg",
file=sys.stderr)
with open(os.path.join(root, "setup.cfg"), "a") as f:
f.write(SAMPLE_CONFIG)
print(CONFIG_ERROR, file=sys.stderr)
return 1
print(" creating %s" % cfg.versionfile_source)
with open(cfg.versionfile_source, "w") as f:
LONG = LONG_VERSION_PY[cfg.VCS]
f.write(LONG % {"DOLLAR": "$",
"STYLE": cfg.style,
"TAG_PREFIX": cfg.tag_prefix,
"PARENTDIR_PREFIX": cfg.parentdir_prefix,
"VERSIONFILE_SOURCE": cfg.versionfile_source,
})
ipy = os.path.join(os.path.dirname(cfg.versionfile_source),
"__init__.py")
if os.path.exists(ipy):
try:
with open(ipy, "r") as f:
old = f.read()
except EnvironmentError:
old = ""
if INIT_PY_SNIPPET not in old:
print(" appending to %s" % ipy)
with open(ipy, "a") as f:
f.write(INIT_PY_SNIPPET)
else:
print(" %s unmodified" % ipy)
else:
print(" %s doesn't exist, ok" % ipy)
ipy = None
# Make sure both the top-level "versioneer.py" and versionfile_source
# (PKG/_version.py, used by runtime code) are in MANIFEST.in, so
# they'll be copied into source distributions. Pip won't be able to
# install the package without this.
manifest_in = os.path.join(root, "MANIFEST.in")
simple_includes = set()
try:
with open(manifest_in, "r") as f:
for line in f:
if line.startswith("include "):
for include in line.split()[1:]:
simple_includes.add(include)
except EnvironmentError:
pass
# That doesn't cover everything MANIFEST.in can do
# (http://docs.python.org/2/distutils/sourcedist.html#commands), so
# it might give some false negatives. Appending redundant 'include'
# lines is safe, though.
if "versioneer.py" not in simple_includes:
print(" appending 'versioneer.py' to MANIFEST.in")
with open(manifest_in, "a") as f:
f.write("include versioneer.py\n")
else:
print(" 'versioneer.py' already in MANIFEST.in")
if cfg.versionfile_source not in simple_includes:
print(" appending versionfile_source ('%s') to MANIFEST.in" %
cfg.versionfile_source)
with open(manifest_in, "a") as f:
f.write("include %s\n" % cfg.versionfile_source)
else:
print(" versionfile_source already in MANIFEST.in")
# Make VCS-specific changes. For git, this means creating/changing
# .gitattributes to mark _version.py for export-subst keyword
# substitution.
do_vcs_install(manifest_in, cfg.versionfile_source, ipy)
return 0
def scan_setup_py():
"""Validate the contents of setup.py against Versioneer's expectations."""
found = set()
setters = False
errors = 0
with open("setup.py", "r") as f:
for line in f.readlines():
if "import versioneer" in line:
found.add("import")
if "versioneer.get_cmdclass()" in line:
found.add("cmdclass")
if "versioneer.get_version()" in line:
found.add("get_version")
if "versioneer.VCS" in line:
setters = True
if "versioneer.versionfile_source" in line:
setters = True
if len(found) != 3:
print("")
print("Your setup.py appears to be missing some important items")
print("(but I might be wrong). Please make sure it has something")
print("roughly like the following:")
print("")
print(" import versioneer")
print(" setup( version=versioneer.get_version(),")
print(" cmdclass=versioneer.get_cmdclass(), ...)")
print("")
errors += 1
if setters:
print("You should remove lines like 'versioneer.VCS = ' and")
print("'versioneer.versionfile_source = ' . This configuration")
print("now lives in setup.cfg, and should be removed from setup.py")
print("")
errors += 1
return errors
if __name__ == "__main__":
cmd = sys.argv[1]
if cmd == "setup":
errors = do_setup()
errors += scan_setup_py()
if errors:
            sys.exit(1)
(function($) {
var options = {
xaxis: {
timezone: null, // "browser" for local to the client or timezone for timezone-js
timeformat: null, // format string to use
twelveHourClock: false, // 12 or 24 time in time mode
monthNames: null // list of names of months
}
};
// round to nearby lower multiple of base
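    // Example (illustrative): floorInBase(7.3, 2) === 6; floorInBase(-1, 2) === -2.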
function floorInBase(n, base) {
return base * Math.floor(n / base);
}
// Returns a string with the date d formatted according to fmt.
// A subset of the Open Group's strftime format is supported.
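    // Example (illustrative): formatDate(d, "%b %d %H:%M") yields strings
    // such as "Apr 01 09:05" for a date in the morning of April 1st.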
function formatDate(d, fmt, monthNames, dayNames) {
if (typeof d.strftime == "function") {
return d.strftime(fmt);
}
var leftPad = function(n, pad) {
n = "" + n;
pad = "" + (pad == null ? "0" : pad);
return n.length == 1 ? pad + n : n;
};
var r = [];
var escape = false;
var hours = d.getHours();
var isAM = hours < 12;
if (monthNames == null) {
monthNames = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];
}
if (dayNames == null) {
dayNames = ["Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"];
}
var hours12;
if (hours > 12) {
hours12 = hours - 12;
} else if (hours == 0) {
hours12 = 12;
} else {
hours12 = hours;
}
for (var i = 0; i < fmt.length; ++i) {
var c = fmt.charAt(i);
if (escape) {
switch (c) {
case 'a': c = "" + dayNames[d.getDay()]; break;
case 'b': c = "" + monthNames[d.getMonth()]; break;
case 'd': c = leftPad(d.getDate()); break;
case 'e': c = leftPad(d.getDate(), " "); break;
case 'H': c = leftPad(hours); break;
case 'I': c = leftPad(hours12); break;
case 'l': c = leftPad(hours12, " "); break;
case 'm': c = leftPad(d.getMonth() + 1); break;
case 'M': c = leftPad(d.getMinutes()); break;
// quarters not in Open Group's strftime specification
case 'q':
c = "" + (Math.floor(d.getMonth() / 3) + 1); break;
case 'S': c = leftPad(d.getSeconds()); break;
case 'y': c = leftPad(d.getFullYear() % 100); break;
case 'Y': c = "" + d.getFullYear(); break;
case 'p': c = (isAM) ? ("" + "am") : ("" + "pm"); break;
case 'P': c = (isAM) ? ("" + "AM") : ("" + "PM"); break;
case 'w': c = "" + d.getDay(); break;
}
r.push(c);
escape = false;
} else {
if (c == "%") {
escape = true;
} else {
r.push(c);
}
}
}
return r.join("");
}
// To have a consistent view of time-based data independent of which time
// zone the client happens to be in we need a date-like object independent
// of time zones. This is done through a wrapper that only calls the UTC
// versions of the accessor methods.
function makeUtcWrapper(d) {
function addProxyMethod(sourceObj, sourceMethod, targetObj, targetMethod) {
sourceObj[sourceMethod] = function() {
return targetObj[targetMethod].apply(targetObj, arguments);
};
};
var utc = {
date: d
};
// support strftime, if found
if (d.strftime != undefined) {
addProxyMethod(utc, "strftime", d, "strftime");
}
addProxyMethod(utc, "getTime", d, "getTime");
addProxyMethod(utc, "setTime", d, "setTime");
var props = ["Date", "Day", "FullYear", "Hours", "Milliseconds", "Minutes", "Month", "Seconds"];
for (var p = 0; p < props.length; p++) {
addProxyMethod(utc, "get" + props[p], d, "getUTC" + props[p]);
addProxyMethod(utc, "set" + props[p], d, "setUTC" + props[p]);
}
return utc;
};
// select time zone strategy. This returns a date-like object tied to the
// desired timezone
function dateGenerator(ts, opts) {
if (opts.timezone == "browser") {
return new Date(ts);
} else if (!opts.timezone || opts.timezone == "utc") {
return makeUtcWrapper(new Date(ts));
} else if (typeof timezoneJS != "undefined" && typeof timezoneJS.Date != "undefined") {
var d = new timezoneJS.Date();
// timezone-js is fickle, so be sure to set the time zone before
// setting the time.
d.setTimezone(opts.timezone);
d.setTime(ts);
return d;
} else {
return makeUtcWrapper(new Date(ts));
}
}
    // map of approximate time unit sizes in milliseconds
var timeUnitSize = {
"second": 1000,
"minute": 60 * 1000,
"hour": 60 * 60 * 1000,
"day": 24 * 60 * 60 * 1000,
"month": 30 * 24 * 60 * 60 * 1000,
"quarter": 3 * 30 * 24 * 60 * 60 * 1000,
"year": 365.2425 * 24 * 60 * 60 * 1000
};
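    // Example (illustrative): a 90-minute axis span equals
    // 1.5 * timeUnitSize.hour; note that "month" is approximated as 30 days.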
// the allowed tick sizes, after 1 year we use
// an integer algorithm
var baseSpec = [
[1, "second"], [2, "second"], [5, "second"], [10, "second"],
[30, "second"],
[1, "minute"], [2, "minute"], [5, "minute"], [10, "minute"],
[30, "minute"],
[1, "hour"], [2, "hour"], [4, "hour"],
[8, "hour"], [12, "hour"],
[1, "day"], [2, "day"], [3, "day"],
[0.25, "month"], [0.5, "month"], [1, "month"],
[2, "month"]
];
// we don't know which variant(s) we'll need yet, but generating both is
// cheap
var specMonths = baseSpec.concat([[3, "month"], [6, "month"],
[1, "year"]]);
var specQuarters = baseSpec.concat([[1, "quarter"], [2, "quarter"],
[1, "year"]]);
function init(plot) {
plot.hooks.processDatapoints.push(function (plot, series, datapoints) {
$.each(plot.getAxes(), function(axisName, axis) {
var opts = axis.options;
if (opts.mode == "time") {
axis.tickGenerator = function(axis) {
var ticks = [];
var d = dateGenerator(axis.min, opts);
var minSize = 0;
// make quarter use a possibility if quarters are
// mentioned in either of these options
var spec = (opts.tickSize && opts.tickSize[1] ===
"quarter") ||
(opts.minTickSize && opts.minTickSize[1] ===
"quarter") ? specQuarters : specMonths;
if (opts.minTickSize != null) {
if (typeof opts.tickSize == "number") {
minSize = opts.tickSize;
} else {
minSize = opts.minTickSize[0] * timeUnitSize[opts.minTickSize[1]];
}
}
for (var i = 0; i < spec.length - 1; ++i) {
if (axis.delta < (spec[i][0] * timeUnitSize[spec[i][1]]
+ spec[i + 1][0] * timeUnitSize[spec[i + 1][1]]) / 2
&& spec[i][0] * timeUnitSize[spec[i][1]] >= minSize) {
break;
}
}
var size = spec[i][0];
var unit = spec[i][1];
// special-case the possibility of several years
if (unit == "year") {
// if given a minTickSize in years, just use it,
// ensuring that it's an integer
if (opts.minTickSize != null && opts.minTickSize[1] == "year") {
size = Math.floor(opts.minTickSize[0]);
} else {
var magn = Math.pow(10, Math.floor(Math.log(axis.delta / timeUnitSize.year) / Math.LN10));
var norm = (axis.delta / timeUnitSize.year) / magn;
if (norm < 1.5) {
size = 1;
} else if (norm < 3) {
size = 2;
} else if (norm < 7.5) {
size = 5;
} else {
size = 10;
}
size *= magn;
}
// minimum size for years is 1
if (size < 1) {
size = 1;
}
}
axis.tickSize = opts.tickSize || [size, unit];
var tickSize = axis.tickSize[0];
unit = axis.tickSize[1];
var step = tickSize * timeUnitSize[unit];
if (unit == "second") {
d.setSeconds(floorInBase(d.getSeconds(), tickSize));
} else if (unit == "minute") {
d.setMinutes(floorInBase(d.getMinutes(), tickSize));
} else if (unit == "hour") {
d.setHours(floorInBase(d.getHours(), tickSize));
} else if (unit == "month") {
d.setMonth(floorInBase(d.getMonth(), tickSize));
} else if (unit == "quarter") {
d.setMonth(3 * floorInBase(d.getMonth() / 3,
tickSize));
} else if (unit == "year") {
d.setFullYear(floorInBase(d.getFullYear(), tickSize));
}
// reset smaller components
d.setMilliseconds(0);
                        if (step >= timeUnitSize.minute) {
                            d.setSeconds(0);
                        }
                        if (step >= timeUnitSize.hour) {
                            d.setMinutes(0);
                        }
                        if (step >= timeUnitSize.day) {
                            d.setHours(0);
                        }
                        if (step >= timeUnitSize.day * 4) {
                            d.setDate(1);
                        }
                        if (step >= timeUnitSize.month * 2) {
                            d.setMonth(floorInBase(d.getMonth(), 3));
                        }
                        if (step >= timeUnitSize.quarter * 2) {
                            d.setMonth(floorInBase(d.getMonth(), 6));
                        }
                        if (step >= timeUnitSize.year) {
                            d.setMonth(0);
                        }
var carry = 0;
var v = Number.NaN;
var prev;
do {
prev = v;
v = d.getTime();
ticks.push(v);
if (unit == "month" || unit == "quarter") {
if (tickSize < 1) {
// a bit complicated - we'll divide the
// month/quarter up but we need to take
// care of fractions so we don't end up in
// the middle of a day
d.setDate(1);
var start = d.getTime();
d.setMonth(d.getMonth() +
(unit == "quarter" ? 3 : 1));
var end = d.getTime();
d.setTime(v + carry * timeUnitSize.hour + (end - start) * tickSize);
carry = d.getHours();
d.setHours(0);
} else {
d.setMonth(d.getMonth() +
tickSize * (unit == "quarter" ? 3 : 1));
}
} else if (unit == "year") {
d.setFullYear(d.getFullYear() + tickSize);
} else {
d.setTime(v + step);
}
} while (v < axis.max && v != prev);
return ticks;
};
axis.tickFormatter = function (v, axis) {
var d = dateGenerator(v, axis.options);
// first check global format
if (opts.timeformat != null) {
return formatDate(d, opts.timeformat, opts.monthNames, opts.dayNames);
}
// possibly use quarters if quarters are mentioned in
// any of these places
var useQuarters = (axis.options.tickSize &&
axis.options.tickSize[1] == "quarter") ||
(axis.options.minTickSize &&
axis.options.minTickSize[1] == "quarter");
var t = axis.tickSize[0] * timeUnitSize[axis.tickSize[1]];
var span = axis.max - axis.min;
var suffix = (opts.twelveHourClock) ? " %p" : "";
var hourCode = (opts.twelveHourClock) ? "%I" : "%H";
var fmt;
if (t < timeUnitSize.minute) {
fmt = hourCode + ":%M:%S" + suffix;
} else if (t < timeUnitSize.day) {
if (span < 2 * timeUnitSize.day) {
fmt = hourCode + ":%M" + suffix;
} else {
fmt = "%b %d " + hourCode + ":%M" + suffix;
}
} else if (t < timeUnitSize.month) {
fmt = "%b %d";
} else if ((useQuarters && t < timeUnitSize.quarter) ||
(!useQuarters && t < timeUnitSize.year)) {
if (span < timeUnitSize.year) {
fmt = "%b";
} else {
fmt = "%b %Y";
}
} else if (useQuarters && t < timeUnitSize.year) {
if (span < timeUnitSize.year) {
fmt = "Q%q";
} else {
fmt = "Q%q %Y";
}
} else {
fmt = "%Y";
}
var rt = formatDate(d, fmt, opts.monthNames, opts.dayNames);
return rt;
};
}
});
});
}
$.plot.plugins.push({
init: init,
options: options,
name: 'time',
version: '1.0'
});
// Time-axis support used to be in Flot core, which exposed the
// formatDate function on the plot object. Various plugins depend
// on the function, so we need to re-expose it here.
$.plot.formatDate = formatDate;
})(jQuery);
(function ($) {
var options = { };
function canvasSupported() {
return !!document.createElement('canvas').getContext;
}
function canvasTextSupported() {
if (!canvasSupported()) {
return false;
}
var dummy_canvas = document.createElement('canvas');
var context = dummy_canvas.getContext('2d');
return typeof context.fillText == 'function';
}
function css3TransitionSupported() {
var div = document.createElement('div');
return typeof div.style.MozTransition != 'undefined' // Gecko
|| typeof div.style.OTransition != 'undefined' // Opera
|| typeof div.style.webkitTransition != 'undefined' // WebKit
|| typeof div.style.transition != 'undefined';
}
function AxisLabel(axisName, position, padding, plot, opts) {
this.axisName = axisName;
this.position = position;
this.padding = padding;
this.plot = plot;
this.opts = opts;
this.width = 0;
this.height = 0;
}
CanvasAxisLabel.prototype = new AxisLabel();
CanvasAxisLabel.prototype.constructor = CanvasAxisLabel;
function CanvasAxisLabel(axisName, position, padding, plot, opts) {
AxisLabel.prototype.constructor.call(this, axisName, position, padding,
plot, opts);
}
CanvasAxisLabel.prototype.calculateSize = function() {
if (!this.opts.axisLabelFontSizePixels)
this.opts.axisLabelFontSizePixels = 14;
if (!this.opts.axisLabelFontFamily)
this.opts.axisLabelFontFamily = 'sans-serif';
if (this.position == 'left' || this.position == 'right') {
this.width = this.opts.axisLabelFontSizePixels + this.padding;
this.height = 0;
} else {
this.width = 0;
this.height = this.opts.axisLabelFontSizePixels + this.padding;
}
};
CanvasAxisLabel.prototype.draw = function(box) {
var ctx = this.plot.getCanvas().getContext('2d');
ctx.save();
ctx.font = this.opts.axisLabelFontSizePixels + 'px ' +
this.opts.axisLabelFontFamily;
var width = ctx.measureText(this.opts.axisLabel).width;
var height = this.opts.axisLabelFontSizePixels;
var x, y, angle = 0;
if (this.position == 'top') {
x = box.left + box.width/2 - width/2;
y = box.top + height*0.72;
} else if (this.position == 'bottom') {
x = box.left + box.width/2 - width/2;
y = box.top + box.height - height*0.72;
} else if (this.position == 'left') {
x = box.left + height*0.72;
y = box.height/2 + box.top + width/2;
angle = -Math.PI/2;
} else if (this.position == 'right') {
x = box.left + box.width - height*0.72;
y = box.height/2 + box.top - width/2;
angle = Math.PI/2;
}
ctx.translate(x, y);
ctx.rotate(angle);
ctx.fillText(this.opts.axisLabel, 0, 0);
ctx.restore();
};
HtmlAxisLabel.prototype = new AxisLabel();
HtmlAxisLabel.prototype.constructor = HtmlAxisLabel;
function HtmlAxisLabel(axisName, position, padding, plot, opts) {
AxisLabel.prototype.constructor.call(this, axisName, position,
padding, plot, opts);
}
HtmlAxisLabel.prototype.calculateSize = function() {
var elem = $('<div class="axisLabels" style="position:absolute;">' +
this.opts.axisLabel + '</div>');
this.plot.getPlaceholder().append(elem);
// store height and width of label itself, for use in draw()
this.labelWidth = elem.outerWidth(true);
this.labelHeight = elem.outerHeight(true);
elem.remove();
this.width = this.height = 0;
if (this.position == 'left' || this.position == 'right') {
this.width = this.labelWidth + this.padding;
} else {
this.height = this.labelHeight + this.padding;
}
};
HtmlAxisLabel.prototype.draw = function(box) {
this.plot.getPlaceholder().find('#' + this.axisName + 'Label').remove();
        var elem = $('<div id="' + this.axisName +
                     'Label" class="axisLabels" style="position:absolute;">'
                     + this.opts.axisLabel + '</div>');
this.plot.getPlaceholder().append(elem);
if (this.position == 'top') {
elem.css('left', box.left + box.width/2 - this.labelWidth/2 + 'px');
elem.css('top', box.top + 'px');
} else if (this.position == 'bottom') {
elem.css('left', box.left + box.width/2 - this.labelWidth/2 + 'px');
elem.css('top', box.top + box.height - this.labelHeight + 'px');
} else if (this.position == 'left') {
elem.css('top', box.top + box.height/2 - this.labelHeight/2 + 'px');
elem.css('left', box.left + 'px');
} else if (this.position == 'right') {
elem.css('top', box.top + box.height/2 - this.labelHeight/2 + 'px');
elem.css('left', box.left + box.width - this.labelWidth + 'px');
}
};
CssTransformAxisLabel.prototype = new HtmlAxisLabel();
CssTransformAxisLabel.prototype.constructor = CssTransformAxisLabel;
function CssTransformAxisLabel(axisName, position, padding, plot, opts) {
HtmlAxisLabel.prototype.constructor.call(this, axisName, position,
padding, plot, opts);
}
CssTransformAxisLabel.prototype.calculateSize = function() {
HtmlAxisLabel.prototype.calculateSize.call(this);
this.width = this.height = 0;
if (this.position == 'left' || this.position == 'right') {
this.width = this.labelHeight + this.padding;
} else {
this.height = this.labelHeight + this.padding;
}
};
CssTransformAxisLabel.prototype.transforms = function(degrees, x, y) {
var stransforms = {
'-moz-transform': '',
'-webkit-transform': '',
'-o-transform': '',
'-ms-transform': ''
};
if (x != 0 || y != 0) {
var stdTranslate = ' translate(' + x + 'px, ' + y + 'px)';
stransforms['-moz-transform'] += stdTranslate;
stransforms['-webkit-transform'] += stdTranslate;
stransforms['-o-transform'] += stdTranslate;
stransforms['-ms-transform'] += stdTranslate;
}
if (degrees != 0) {
var stdRotate = ' rotate(' + degrees + 'deg)';
stransforms['-moz-transform'] += stdRotate;
stransforms['-webkit-transform'] += stdRotate;
stransforms['-o-transform'] += stdRotate;
stransforms['-ms-transform'] += stdRotate;
}
var s = 'top: 0; left: 0; ';
for (var prop in stransforms) {
if (stransforms[prop]) {
s += prop + ':' + stransforms[prop] + ';';
}
}
s += ';';
return s;
};
CssTransformAxisLabel.prototype.calculateOffsets = function(box) {
var offsets = { x: 0, y: 0, degrees: 0 };
if (this.position == 'bottom') {
offsets.x = box.left + box.width/2 - this.labelWidth/2;
offsets.y = box.top + box.height - this.labelHeight;
} else if (this.position == 'top') {
offsets.x = box.left + box.width/2 - this.labelWidth/2;
offsets.y = box.top;
} else if (this.position == 'left') {
offsets.degrees = -90;
offsets.x = box.left - this.labelWidth/2 + this.labelHeight/2;
offsets.y = box.height/2 + box.top;
} else if (this.position == 'right') {
offsets.degrees = 90;
offsets.x = box.left + box.width - this.labelWidth/2
- this.labelHeight/2;
offsets.y = box.height/2 + box.top;
}
return offsets;
};
CssTransformAxisLabel.prototype.draw = function(box) {
this.plot.getPlaceholder().find("." + this.axisName + "Label").remove();
var offsets = this.calculateOffsets(box);
var elem = $('<div class="axisLabels ' + this.axisName +
'Label" style="position:absolute; ' +
'color: ' + this.opts.color + '; ' +
this.transforms(offsets.degrees, offsets.x, offsets.y) +
'">' + this.opts.axisLabel + '</div>');
this.plot.getPlaceholder().append(elem);
};
IeTransformAxisLabel.prototype = new CssTransformAxisLabel();
IeTransformAxisLabel.prototype.constructor = IeTransformAxisLabel;
function IeTransformAxisLabel(axisName, position, padding, plot, opts) {
CssTransformAxisLabel.prototype.constructor.call(this, axisName,
position, padding,
plot, opts);
this.requiresResize = false;
}
IeTransformAxisLabel.prototype.transforms = function(degrees, x, y) {
// I didn't feel like learning the crazy Matrix stuff, so this uses
// a combination of the rotation transform and CSS positioning.
var s = '';
if (degrees != 0) {
var rotation = degrees/90;
while (rotation < 0) {
rotation += 4;
}
s += ' filter: progid:DXImageTransform.Microsoft.BasicImage(rotation=' + rotation + '); ';
// see below
this.requiresResize = (this.position == 'right');
}
if (x != 0) {
s += 'left: ' + x + 'px; ';
}
if (y != 0) {
s += 'top: ' + y + 'px; ';
}
return s;
};
IeTransformAxisLabel.prototype.calculateOffsets = function(box) {
var offsets = CssTransformAxisLabel.prototype.calculateOffsets.call(
this, box);
// adjust some values to take into account differences between
// CSS and IE rotations.
if (this.position == 'top') {
// FIXME: not sure why, but placing this exactly at the top causes
// the top axis label to flip to the bottom...
offsets.y = box.top + 1;
} else if (this.position == 'left') {
offsets.x = box.left;
offsets.y = box.height/2 + box.top - this.labelWidth/2;
} else if (this.position == 'right') {
offsets.x = box.left + box.width - this.labelHeight;
offsets.y = box.height/2 + box.top - this.labelWidth/2;
}
return offsets;
};
IeTransformAxisLabel.prototype.draw = function(box) {
CssTransformAxisLabel.prototype.draw.call(this, box);
if (this.requiresResize) {
var elem = this.plot.getPlaceholder().find("." + this.axisName + "Label");
// Since we used CSS positioning instead of transforms for
// translating the element, and since the positioning is done
// before any rotations, we have to reset the width and height
// in case the browser wrapped the text (specifically for the
// y2axis).
elem.css('width', this.labelWidth);
elem.css('height', this.labelHeight);
}
};
function init(plot) {
// This is kind of a hack. There are no hooks in Flot between
// the creation and measuring of the ticks (setTicks, measureTickLabels
// in setupGrid() ) and the drawing of the ticks and plot box
// (insertAxisLabels in setupGrid() ).
//
// Therefore, we use a trick where we run the draw routine twice:
// the first time to get the tick measurements, so that we can change
// them, and then have it draw it again.
var secondPass = false;
var axisLabels = {};
var axisOffsetCounts = { left: 0, right: 0, top: 0, bottom: 0 };
var defaultPadding = 2; // padding between axis and tick labels
plot.hooks.draw.push(function (plot, ctx) {
var hasAxisLabels = false;
if (!secondPass) {
// MEASURE AND SET OPTIONS
$.each(plot.getAxes(), function(axisName, axis) {
var opts = axis.options // Flot 0.7
|| plot.getOptions()[axisName]; // Flot 0.6
if (!opts || !opts.axisLabel || !axis.show)
return;
hasAxisLabels = true;
var renderer = null;
if (!opts.axisLabelUseHtml &&
navigator.appName == 'Microsoft Internet Explorer') {
                    var ua = navigator.userAgent;
                    var re = new RegExp("MSIE ([0-9]{1,}[\.0-9]{0,})");
                    var rv = -1;
                    if (re.exec(ua) != null) {
                        rv = parseFloat(RegExp.$1);
                    }
if (rv >= 9 && !opts.axisLabelUseCanvas && !opts.axisLabelUseHtml) {
renderer = CssTransformAxisLabel;
} else if (!opts.axisLabelUseCanvas && !opts.axisLabelUseHtml) {
renderer = IeTransformAxisLabel;
} else if (opts.axisLabelUseCanvas) {
renderer = CanvasAxisLabel;
} else {
renderer = HtmlAxisLabel;
}
} else {
if (opts.axisLabelUseHtml || (!css3TransitionSupported() && !canvasTextSupported()) && !opts.axisLabelUseCanvas) {
renderer = HtmlAxisLabel;
} else if (opts.axisLabelUseCanvas || !css3TransitionSupported()) {
renderer = CanvasAxisLabel;
} else {
renderer = CssTransformAxisLabel;
}
}
var padding = opts.axisLabelPadding === undefined ?
defaultPadding : opts.axisLabelPadding;
axisLabels[axisName] = new renderer(axisName,
axis.position, padding,
plot, opts);
// flot interprets axis.labelHeight and .labelWidth as
// the height and width of the tick labels. We increase
// these values to make room for the axis label and
// padding.
axisLabels[axisName].calculateSize();
// AxisLabel.height and .width are the size of the
// axis label and padding.
axis.labelHeight += axisLabels[axisName].height;
axis.labelWidth += axisLabels[axisName].width;
opts.labelHeight = axis.labelHeight;
opts.labelWidth = axis.labelWidth;
});
// if there are axis labels re-draw with new label widths and heights
if (hasAxisLabels) {
secondPass = true;
plot.setupGrid();
plot.draw();
}
} else {
// DRAW
$.each(plot.getAxes(), function(axisName, axis) {
var opts = axis.options // Flot 0.7
|| plot.getOptions()[axisName]; // Flot 0.6
if (!opts || !opts.axisLabel || !axis.show)
return;
axisLabels[axisName].draw(axis.box);
});
}
});
}
$.plot.plugins.push({
init: init,
options: options,
name: 'axisLabels',
version: '2.0b0'
});
})(jQuery);
/* end of apachedex/jquery.flot.axislabels.js (APacheDEX 1.8) */
function updateGraphTooltip(event, pos, item, previousIndex, tooltip, plot) {
if (item) {
if (previousIndex != item.dataIndex) {
previousIndex = item.dataIndex;
var plot_offset = plot.getPlotOffset();
var offset = plot.offset();
tooltip.find(".x").html(item.series.xaxis.tickFormatter(
item.datapoint[0], item.series.xaxis));
tooltip.find(".y").html(item.series.yaxis.options.axisLabel + " : " +
item.datapoint[1]);
tooltip.css("left", item.pageX - offset.left + plot_offset.left + 5
+ "px");
tooltip.show();
// query offsetHeight *after* making the tooltip visible
tooltip.css("top", item.pageY - offset.top + plot_offset.top - 5
- tooltip.prop("offsetHeight") + "px");
}
} else {
if (previousIndex != null) {
tooltip.hide();
previousIndex = null;
}
}
return previousIndex;
}
var scale_map = {
log100To0: [
function (v) { return -Math.log(101 - v); },
function (v) { return 101 - Math.exp(-v); }
],
log0ToAny: [
function (v) { return Math.log(v + 1); },
function (v) { return Math.exp(v) - 1; }
]
};
function updateAxisTransform(axis) {
if (axis != undefined) {
    var transform_list = scale_map[axis.transform];
if (transform_list == undefined) {
return;
}
axis.transform = transform_list[0];
axis.inverseTransform = transform_list[1];
}
}
function renderGraph(container) {
var container = $(container);
var previousIndex = null;
var tooltip = container.next(".tooltip");
var options = $.parseJSON(container.attr("data-options"));
updateAxisTransform(options.xaxis);
updateAxisTransform(options.yaxis);
var plot = $.plot(
container,
$.parseJSON(container.attr("data-points")),
options
);
tooltip.detach();
container.append(tooltip);
container.bind("plothover", function (event, pos, item) {
previousIndex = updateGraphTooltip(event, pos, item, previousIndex,
tooltip, plot);
});
}
function toggleGraph(node) {
var container = $(node).parent().find(".container");
  // Note: toggling *after* rendering causes layout problems with flot.
container.toggle();
if (container.attr("data-rendered-marker") == null) {
container.attr("data-rendered-marker", "rendered");
container.find(".graph").each(function (i){renderGraph(this)});
}
}
function hideGraph(node) {
$(node).parent().hide();
}
$(function() {
$(".graph:visible").each(function (i){renderGraph(this)});
$(".hidden_graph .container").draggable();
}); | APacheDEX | /APacheDEX-1.8.tar.gz/APacheDEX-1.8/apachedex/apachedex.js | apachedex.js |
from __future__ import print_function, division, absolute_import, \
unicode_literals
from cgi import escape
from collections import defaultdict, Counter
from datetime import datetime, timedelta, date, tzinfo
from functools import partial
from operator import itemgetter
from urllib import splittype, splithost
import argparse
import bz2
import calendar
import codecs
import functools
import gzip
import httplib
import itertools
import json
import math
import os
import pkgutil
import platform
import re
import shlex
import sys
import time
import traceback
try:
import pytz
except ImportError:
pytz = None
def getResource(name, encoding='utf-8'):
return pkgutil.get_data(__name__, name).decode(encoding)
def _wrapOpen(func):
@functools.wraps(func)
def wrapper(*args, **kw):
    encoding = kw.pop('encoding', None)
    errors = kw.pop('errors', 'strict')
    file_object = func(*args, **kw)
    if encoding is None:
      return file_object
    info = codecs.lookup(encoding)
srw = codecs.StreamReaderWriter(
file_object,
info.streamreader,
info.streamwriter,
errors,
)
srw.encoding = encoding
return srw
return wrapper
gzip_open = gzip.open
if sys.version_info >= (3, 3):
import lzma
lzma_open = lzma.open
bz2_open = bz2.open
_read_mode = 'rt'
else:
gzip_open = _wrapOpen(gzip_open)
bz2_open = _wrapOpen(bz2.BZ2File)
_read_mode = 'r'
try:
from backports import lzma
lzma_open = _wrapOpen(lzma.open)
except ImportError:
lzma = None
FILE_OPENER_LIST = [
(gzip_open, IOError),
(bz2_open, IOError),
]
if lzma is not None:
FILE_OPENER_LIST.append((lzma_open, lzma.LZMAError))
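A hypothetical Python 3 sketch of how a FILE_OPENER_LIST-style table can be used to auto-detect compression: try each opener, force the first read so the decompressor actually sees data, and fall back to plain open() when every decompressor rejects the file. The function name and fallback behaviour are illustrative, not taken from this module.

```python
import bz2
import gzip

# (opener, error-to-catch) pairs, mirroring the FILE_OPENER_LIST idea above.
OPENER_LIST = [
    (gzip.open, OSError),
    (bz2.open, OSError),
]

def open_log(path):
    for opener, error in OPENER_LIST:
        f = opener(path, 'rt', encoding='ascii', errors='replace')
        try:
            f.readline()  # decompress the first block to validate the format
        except error:
            f.close()
            continue
        f.seek(0)  # rewind so the caller sees the file from the start
        return f
    # No decompressor accepted the file: treat it as plain text.
    return open(path, 'rt', encoding='ascii', errors='replace')
```

Opening in text mode with `encoding='ascii', errors='replace'` matches the INPUT_ENCODING / INPUT_ENCODING_ERROR_HANDLER policy declared below.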
# XXX: what encoding? Apache doesn't document one, but requests are supposed
# to be urlencoded, so pure ASCII. Are timestamps localised?
# Unlike Apache, Caddy does not escape referrer headers, so Caddy log files
# may contain non-ASCII characters. We read them as ASCII, replacing
# non-ASCII characters with the unicode replacement character.
INPUT_ENCODING = 'ascii'
INPUT_ENCODING_ERROR_HANDLER = 'replace'
MONTH_VALUE_DICT = dict((y, x) for (x, y) in enumerate(('Jan', 'Feb', 'Mar',
'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'), 1))
MS_PER_S = 10 ** 3
US_PER_S = 10 ** 6
N_HOTTEST_PAGES_DEFAULT = 20
N_ERROR_URL = 10
N_REFERRER_PER_ERROR_URL = 5
N_USER_AGENT = 20
ITEMGETTER0 = itemgetter(0)
ITEMGETTER1 = itemgetter(1)
APDEX_TOLERATING_COEF = 4
AUTO_PERIOD_COEF = 200
# LARGER_THAN_INTEGER_STR compares greater than any string starting with a
# digit (x < LARGER_THAN_INTEGER_STR is True for any such x).
LARGER_THAN_INTEGER_STR = 'A'
SMALLER_THAN_INTEGER_STR = ''
def statusIsError(status):
return status[0] > '3'
def getClassForDuration(duration, threshold):
if duration <= threshold:
return ''
if duration <= threshold * APDEX_TOLERATING_COEF:
return 'warning'
return 'problem'
def getClassForStatusHit(hit, status):
if hit and statusIsError(status):
return 'problem'
return ''
def getDataPoints(apdex_dict, status_period_dict={}):
period_error_dict = defaultdict(int)
for status, period_dict in status_period_dict.iteritems():
if statusIsError(status):
for period, hit in period_dict.iteritems():
period_error_dict[period] += hit
# If there was an error, there was a hit, and apdex_dict must contain it
# (at same date).
assert len(set(period_error_dict) - set(apdex_dict)) == 0
return [
(
value_date,
apdex.getApdex() * 100,
apdex.hit,
period_error_dict.get(value_date, 0),
) for value_date, apdex in sorted(apdex_dict.iteritems(), key=ITEMGETTER0)
]
def prepareDataForGraph(daily_data, date_format, placeholder_delta,
coefficient_callback, x_min=None, x_max=None):
current_date = datetime.strptime(x_min or daily_data[0][0], date_format)
new_daily_data = []
append = new_daily_data.append
for (measure_date_string, apdex, hit, error_hit) in daily_data:
measure_date = datetime.strptime(measure_date_string, date_format)
if current_date < measure_date:
append((current_date.strftime(date_format), 100, 0, 0))
placeholder_end_date = measure_date - placeholder_delta
if placeholder_end_date > current_date:
append((placeholder_end_date.strftime(date_format), 100, 0, 0))
coef = coefficient_callback(measure_date)
append((measure_date_string, apdex, hit * coef, error_hit * coef))
current_date = measure_date + placeholder_delta
if x_max is not None and current_date < datetime.strptime(x_max,
date_format):
append((current_date.strftime(date_format), 100, 0, 0))
append((x_max, 100, 0, 0))
return new_daily_data
def graphPair(daily_data, date_format, graph_period, apdex_y_min=None,
hit_y_min=None, hit_y_max=None, apdex_y_scale=None, hit_y_scale=None):
date_list = [int(calendar.timegm(time.strptime(x[0], date_format)) * 1000)
for x in daily_data]
timeformat = '%Y/<br/>%m/%d<br/> %H:%M'
# There is room for about 10 labels on the X axis.
minTickSize = (max(1,
(date_list[-1] - date_list[0]) / (60 * 60 * 1000 * 10)), 'hour')
# Guesstimation: 6px per digit. If only em were allowed...
yLabelWidth = max(int(math.log10(max(x[2] for x in daily_data))) + 1,
3) * 6
return graph('apdex',
[zip(date_list, (round(x[1], 2) for x in daily_data))],
{
'xaxis': {
'mode': 'time',
'timeformat': timeformat,
'minTickSize': minTickSize,
},
'yaxis': {
'min': apdex_y_min,
'max': 100,
'axisLabel': 'apdex (%)',
'labelWidth': yLabelWidth,
'transform': apdex_y_scale,
},
'lines': {'show': True},
'grid': {
'hoverable': True,
},
},
) + graph('Hits (per %s)' % graph_period,
[
{
'label': 'Errors',
'data': zip(date_list, (x[3] for x in daily_data)),
'color': 'red',
},
{
'label': 'Hits',
'data': zip(date_list, (x[2] for x in daily_data)),
},
],
{
'xaxis': {
'mode': 'time',
'timeformat': timeformat,
'minTickSize': minTickSize,
},
'yaxis': {
'min': hit_y_min,
'max': hit_y_max,
'axisLabel': 'Hits',
'labelWidth': yLabelWidth,
'tickDecimals': 0,
'transform': hit_y_scale,
},
'lines': {'show': True},
'grid': {
'hoverable': True,
},
'legend': {
'backgroundOpacity': 0.25,
},
},
)
def graph(title, data, options={}):
result = []
append = result.append
append('<h2>%s</h2><div class="graph" '
'style="width:600px;height:300px" data-points="' % title)
append(escape(json.dumps(data), quote=True))
append('" data-options="')
append(escape(json.dumps(options), quote=True))
append('"></div><div class="tooltip">'
'<span class="x"></span><br/>'
'<span class="y"></span></div>')
return ''.join(result)
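The graph() helper above embeds JSON-serialized data inside HTML `data-*` attributes, which is why it escapes the serialized strings with `escape(..., quote=True)` (from the Python 2 `cgi` module). A minimal Python 3 sketch of the same pattern, using `html.escape` (the `graph_div` name is illustrative):

```python
import html
import json

def graph_div(title, data, options):
    # Double quotes inside the JSON must become &quot; so they do not
    # terminate the surrounding attribute value.
    return (
        '<h2>%s</h2>'
        '<div class="graph" style="width:600px;height:300px" '
        'data-points="%s" data-options="%s"></div>' % (
            html.escape(title),
            html.escape(json.dumps(data), quote=True),
            html.escape(json.dumps(options), quote=True),
        )
    )
```

The JavaScript side (renderGraph in apachedex.js) then reads the attributes back with `$.parseJSON(container.attr("data-points"))`.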
class APDEXStats(object):
def __init__(self, threshold, getDuration):
threshold *= US_PER_S
self.threshold = threshold
self.threshold4 = threshold * APDEX_TOLERATING_COEF
self.apdex_1 = 0
self.apdex_4 = 0
self.hit = 0
self.duration_total = 0
self.duration_max = 0
self.getDuration = getDuration
def accumulate(self, match):
duration = self.getDuration(match)
self.duration_total += duration
self.duration_max = max(self.duration_max, duration)
if not statusIsError(match.group('status')):
if duration <= self.threshold:
self.apdex_1 += 1
elif duration <= self.threshold4:
self.apdex_4 += 1
self.hit += 1
def accumulateFrom(self, other):
for attribute in ('apdex_1', 'apdex_4', 'hit', 'duration_total'):
setattr(self, attribute,
getattr(self, attribute) + getattr(other, attribute))
self.duration_max = max(self.duration_max, other.duration_max)
def getApdex(self):
if self.hit:
return (self.apdex_1 + self.apdex_4 * .5) / self.hit
return 1
def getAverage(self):
if self.hit:
return float(self.duration_total) / (US_PER_S * self.hit)
return 0
def getMax(self):
return float(self.duration_max) / US_PER_S
@staticmethod
def asHTMLHeader(overall=False):
return '<th>apdex</th><th>hits</th><th>avg (s)</th>' \
'<th%s>max (s)</th>' % (overall and ' class="overall_right"' or '')
def asHTML(self, threshold, overall=False):
apdex = self.getApdex()
average = self.getAverage()
maximum = self.getMax()
hit = self.hit
if hit:
extra_class = ''
apdex_style = 'color: #%s; background-color: #%s' % (
(apdex < .5 and 'f' or '0') * 3,
('%x' % (apdex * 0xf)) * 3,
)
else:
extra_class = 'no_hit'
apdex_style = ''
if overall:
extra_right_class = 'overall_right'
else:
extra_right_class = ''
return '<td style="%(apdex_style)s" class="%(extra_class)s group_left">' \
'%(apdex)i%%</td><td class="%(extra_class)s">%(hit)s</td>' \
'<td class="%(average_class)s %(extra_class)s">%(average).2f</td>' \
'<td class="%(max_class)s %(extra_class)s group_right ' \
'%(extra_right_class)s">%(max).2f</td>' % {
'extra_class': extra_class,
'apdex_style': apdex_style,
'apdex': round(apdex * 100),
'hit': hit,
'average_class': getClassForDuration(average, threshold),
'average': average,
'max_class': getClassForDuration(maximum, threshold),
'max': maximum,
'extra_right_class': extra_right_class,
}
@classmethod
def fromJSONState(cls, state, getDuration):
result = cls(0, getDuration)
result.__dict__.update(state)
return result
def asJSONState(self):
result = self.__dict__.copy()
del result['getDuration']
return result
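A back-of-the-envelope check of the Apdex formula implemented by APDEXStats.accumulate()/getApdex(): satisfied hits (duration at or under the threshold) count fully, tolerated hits (up to APDEX_TOLERATING_COEF = 4 times the threshold) count half, and slower hits count zero. `apdex_score` is a hypothetical standalone helper mirroring that logic, not part of the module:

```python
def apdex_score(durations, threshold):
    if not durations:
        return 1.0  # no hits: assume everything is fine, as getApdex() does
    satisfied = sum(1 for d in durations if d <= threshold)
    tolerated = sum(1 for d in durations if threshold < d <= threshold * 4)
    return (satisfied + tolerated * .5) / len(durations)
```

For example, with a 2-second threshold, hits of 1s, 1s, 5s and 20s yield (2 + 0.5) / 4 = 0.625.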
_APDEXDateDictAsJSONState = lambda date_dict: dict(((y, z.asJSONState())
for y, z in date_dict.iteritems()))
class GenericSiteStats(object):
def __init__(self, threshold, getDuration, suffix, error_detail=False,
user_agent_detail=False,
# Non-generic parameters
**kw):
self.threshold = threshold
self.suffix = suffix
self.error_detail = error_detail
self.status = defaultdict(partial(defaultdict, int))
if error_detail:
# status -> url -> referrer -> count
self.error_url_count = defaultdict(partial(defaultdict, Counter))
self.url_apdex = defaultdict(partial(APDEXStats, threshold, getDuration))
self.apdex = defaultdict(partial(APDEXStats, threshold, getDuration))
self.user_agent_detail = user_agent_detail
self.user_agent_counter = Counter()
def rescale(self, convert, getDuration):
for status, date_dict in self.status.iteritems():
new_date_dict = defaultdict(int)
for value_date, status_count in date_dict.iteritems():
new_date_dict[convert(value_date)] += status_count
self.status[status] = new_date_dict
new_apdex = defaultdict(partial(APDEXStats, self.threshold, getDuration))
for value_date, data in self.apdex.iteritems():
new_apdex[convert(value_date)].accumulateFrom(data)
self.apdex = new_apdex
def accumulate(self, match, url_match, value_date):
self.apdex[value_date].accumulate(match)
if url_match is None:
url = match.group('request')
else:
url = url_match.group('url')
# XXX: can eat memory if there are many different urls
self.url_apdex[url.split('?', 1)[0]].accumulate(match)
status = match.group('status')
self.status[status][value_date] += 1
if self.error_detail and statusIsError(status):
# XXX: can eat memory if there are many errors on many different urls
self.error_url_count[status][url][match.group('referer')] += 1
if self.user_agent_detail:
self.user_agent_counter[match.group('agent')] += 1
def getApdexData(self):
return getDataPoints(self.apdex, self.status)
def asHTML(self, date_format, placeholder_delta, graph_period,
graph_coefficient, encoding, stat_filter=lambda x: x,
x_min=None, x_max=None,
apdex_y_min=None, hit_y_min=None, hit_y_max=None,
apdex_y_scale=None, hit_y_scale=None,
n_hottest_pages=N_HOTTEST_PAGES_DEFAULT,
):
result = []
append = result.append
apdex = APDEXStats(self.threshold, None)
for data in self.apdex.itervalues():
apdex.accumulateFrom(data)
append('<h2>Overall</h2><table class="stats"><tr>')
append(APDEXStats.asHTMLHeader())
append('</tr><tr>')
append(apdex.asHTML(self.threshold))
append('</tr></table><h2>Hottest pages</h2><table class="stats"><tr>')
append(APDEXStats.asHTMLHeader())
append('<th>url</th></tr>')
for url, data in sorted(self.url_apdex.iteritems(), key=lambda x: x[1].getAverage() * x[1].hit,
reverse=True)[:n_hottest_pages]:
append('<tr>')
append(data.asHTML(self.threshold))
append('<td class="text">%s</td></tr>' % escape(url))
append('</table>')
if self.user_agent_detail:
append('<h2>User agents</h2><table class="stats"><tr><th>hits</th>'
'<th>user agent</th></tr>')
for user_agent, hit in self.user_agent_counter.most_common(N_USER_AGENT):
append('<tr><td>%s</td><td class="text">%s</td></tr>' % (hit, escape(user_agent)))
append('</table>')
column_set = set()
filtered_status = defaultdict(partial(defaultdict, int))
for status, date_dict in self.status.iteritems():
filtered_date_dict = filtered_status[status]
for value_date, value in date_dict.iteritems():
filtered_date_dict[stat_filter(value_date)] += value
column_set.update(filtered_date_dict)
column_list = sorted(column_set)
append('<h2>Hits per status code</h2><table class="stats"><tr>'
'<th>status</th><th>overall</th>')
for column in column_list:
append('<th>%s</th>' % column)
append('</tr>')
def hitTd(hit, status):
return '<td class="%s">%s</td>' % (getClassForStatusHit(hit, status), hit)
def statusAsHtml(status):
try:
definition = httplib.responses[int(status)]
except KeyError:
return status
else:
return '<abbr title="%s">%s</abbr>' % (definition, status)
has_errors = False
for status, data_dict in sorted(filtered_status.iteritems(),
key=ITEMGETTER0):
has_errors |= statusIsError(status)
append('<tr title="%s"><th>%s</th>' % (status, statusAsHtml(status)))
append(hitTd(sum(data_dict.itervalues()), status))
for column in column_list:
append(hitTd(data_dict[column], status))
append('</tr>')
append('</table>')
if self.error_detail and has_errors:
def getHitForUrl(referer_counter):
return sum(referer_counter.itervalues())
filtered_status_url = defaultdict(partial(defaultdict, dict))
for status, url_dict in self.error_url_count.iteritems():
filtered_status_url[status] = sorted(url_dict.iteritems(),
key=lambda x: getHitForUrl(x[1]), reverse=True)[:N_ERROR_URL]
append('<h3>Error detail</h3><table class="stats"><tr><th>status</th>'
'<th>hits</th><th>url</th><th>referers</th></tr>')
for status, url_list in sorted(filtered_status_url.iteritems(),
key=ITEMGETTER0):
append('<tr><th rowspan="%s">%s</th>' % (len(url_list),
statusAsHtml(status)))
first_url = True
for url, referer_counter in url_list:
if first_url:
first_url = False
else:
append('<tr>')
append('<td>%s</td><td class="text">%s</td>'
'<td class="text">%s</td>' % (
getHitForUrl(referer_counter),
escape(url),
'<br/>'.join('%i: %s' % (hit, escape(referer))
for referer, hit in referer_counter.most_common(
N_REFERRER_PER_ERROR_URL)),
))
append('</tr>')
append('</table>')
return '\n'.join(result)
@classmethod
def fromJSONState(cls, state, getDuration, suffix):
error_detail = state['error_detail']
result = cls(state['threshold'], getDuration, suffix, error_detail,
state.get('user_agent_detail', True))
if error_detail:
error_url_count = result.error_url_count
for state_status, state_url_dict in state['error_url_count'].iteritems():
url_dict = error_url_count[state_status]
for url, counter in state_url_dict.iteritems():
url_dict[url].update(counter)
for attribute_id in ('url_apdex', 'apdex'):
attribute = getattr(result, attribute_id)
for key, apdex_state in state[attribute_id].iteritems():
attribute[key] = APDEXStats.fromJSONState(apdex_state, getDuration)
status = result.status
for status_code, date_dict in state['status'].iteritems():
status[status_code].update(date_dict)
result.user_agent_counter.update(state['user_agent_counter'])
return result
def asJSONState(self):
return {
'threshold': self.threshold,
'error_detail': self.error_detail,
'error_url_count': getattr(self, 'error_url_count', None),
'url_apdex': _APDEXDateDictAsJSONState(self.url_apdex),
'apdex': _APDEXDateDictAsJSONState(self.apdex),
'status': self.status,
'user_agent_counter': self.user_agent_counter,
'user_agent_detail': self.user_agent_detail,
}
def accumulateFrom(self, other):
# XXX: ignoring: threshold, getDuration, suffix, error_detail,
# user_agent_detail.
# Assuming they are consistently set.
if self.error_detail:
for status, other_url_dict in other.error_url_count.iteritems():
url_dict = self.error_url_count[status]
for url, referer_counter in other_url_dict.iteritems():
url_dict[url].update(referer_counter)
for attribute_id in ('url_apdex', 'apdex'):
self_attribute = getattr(self, attribute_id)
for key, apdex_data in getattr(other, attribute_id).iteritems():
self_attribute[key].accumulateFrom(apdex_data)
status = self.status
for status_code, other_date_dict in other.status.iteritems():
date_dict = status[status_code]
for status_date, count in other_date_dict.iteritems():
date_dict[status_date] += count
self.user_agent_counter.update(other.user_agent_counter)
class ERP5SiteStats(GenericSiteStats):
"""
Heuristic used:
- ignore any GET parameter
- If the first in-site url chunk ends with "_module", count line as
belonging to a module
- If a line belongs to a module and has at least 2 slashes after module,
count line as belonging to a document of that module
"""
def __init__(self, threshold, getDuration, suffix, error_detail=False,
user_agent_detail=False, erp5_expand_other=False):
super(ERP5SiteStats, self).__init__(threshold, getDuration, suffix,
error_detail=error_detail, user_agent_detail=user_agent_detail)
self.expand_other = erp5_expand_other
# Key levels:
# - module id (string)
# - is document (bool)
# - date (string)
self.module = defaultdict(partial(defaultdict, partial(
defaultdict, partial(APDEXStats, threshold, getDuration))))
# Key levels:
# - id (string)
# => 'other' only if expand_other == False
# - date (string)
self.no_module = defaultdict(partial(
defaultdict, partial(APDEXStats, threshold, getDuration)))
self.site_search = defaultdict(partial(APDEXStats, threshold, getDuration))
def rescale(self, convert, getDuration):
super(ERP5SiteStats, self).rescale(convert, getDuration)
threshold = self.threshold
for document_dict in self.module.itervalues():
for is_document, date_dict in document_dict.iteritems():
new_date_dict = defaultdict(partial(APDEXStats, threshold, getDuration))
for value_date, data in date_dict.iteritems():
new_date_dict[convert(value_date)].accumulateFrom(data)
document_dict[is_document] = new_date_dict
for id_, date_dict in self.no_module.iteritems():
new_date_dict = defaultdict(partial(APDEXStats, threshold, getDuration))
for value_date, data in date_dict.iteritems():
new_date_dict[convert(value_date)].accumulateFrom(data)
self.no_module[id_] = new_date_dict
attribute = defaultdict(partial(APDEXStats, threshold, getDuration))
for value_date, data in self.site_search.iteritems():
attribute[convert(value_date)].accumulateFrom(data)
self.site_search = attribute
def accumulate(self, match, url_match, value_date):
split = self.suffix(url_match.group('url')).split('?', 1)[0].split('/')
if split and split[0].endswith('_module'):
super(ERP5SiteStats, self).accumulate(match, url_match, value_date)
module = split[0]
self.module[module][
len(split) > 1 and (split[1] != 'view' and '_view' not in split[1])
][value_date].accumulate(match)
elif split and split[0] == 'ERP5Site_viewSearchResult':
super(ERP5SiteStats, self).accumulate(match, url_match, value_date)
self.site_search[value_date].accumulate(match)
elif split and self.expand_other:
self.no_module[split[0]][value_date].accumulate(match)
else:
self.no_module['other'][value_date].accumulate(match)
def asHTML(self, date_format, placeholder_delta, graph_period, graph_coefficient,
encoding, stat_filter=lambda x: x, x_min=None, x_max=None,
apdex_y_min=None, hit_y_min=None, hit_y_max=None,
apdex_y_scale=None, hit_y_scale=None,
n_hottest_pages=N_HOTTEST_PAGES_DEFAULT,
):
result = []
append = result.append
append('<h2>Stats per module</h2><table class="stats stats_erp5"><tr>'
'<th rowspan="2" colspan="3">module</th>'
'<th colspan="4" class="overall_right">overall</th>')
module_document_overall = defaultdict(partial(APDEXStats, self.threshold,
None))
filtered_module = defaultdict(partial(defaultdict, partial(
defaultdict, partial(APDEXStats, self.threshold, None))))
other_overall = APDEXStats(self.threshold, None)
filtered_no_module = defaultdict(partial(
defaultdict, partial(APDEXStats, self.threshold, None)))
column_set = set()
for key, data_dict in self.no_module.iteritems():
filtered_id_dict = filtered_no_module[key]
for value_date, value in data_dict.iteritems():
filtered_id_dict[stat_filter(value_date)].accumulateFrom(value)
other_overall.accumulateFrom(value)
column_set.update(filtered_id_dict)
filtered_site_search = defaultdict(partial(APDEXStats, self.threshold,
None))
for value_date, value in self.site_search.iteritems():
filtered_site_search[stat_filter(value_date)].accumulateFrom(value)
column_set.update(filtered_site_search)
for key, is_document_dict in self.module.iteritems():
filtered_is_document_dict = filtered_module[key]
for key, data_dict in is_document_dict.iteritems():
filtered_data_dict = filtered_is_document_dict[key]
module_document_apdex = module_document_overall[key]
for value_date, value in data_dict.iteritems():
filtered_data_dict[stat_filter(value_date)].accumulateFrom(value)
module_document_apdex.accumulateFrom(value)
column_set.update(filtered_data_dict)
column_list = sorted(column_set)
for column in column_list:
append('<th colspan="4">%s</th>' % column)
append('</tr><tr>')
for i in xrange(len(column_list) + 1):
append(APDEXStats.asHTMLHeader(i == 0))
append('</tr>')
def apdexAsColumns(data_dict):
data_total = APDEXStats(self.threshold, None)
for data in data_dict.itervalues():
data_total.accumulateFrom(data)
append(data_total.asHTML(self.threshold, True))
for column in column_list:
append(data_dict[column].asHTML(self.threshold))
return data_total
def hiddenGraph(data_dict, title):
append('<td class="text group_right hidden_graph">')
data = getDataPoints(data_dict)
if len(data) > 1:
append('<span class="action" onclick="toggleGraph(this)">+</span>'
'<div class="positioner"><div class="container">'
'<div class="title">%s</div>'
'<div class="action close" onclick="hideGraph(this)">close</div>' %
title
)
append(graphPair(
prepareDataForGraph(
data,
date_format,
placeholder_delta,
graph_coefficient,
x_min=x_min,
x_max=x_max,
),
date_format,
graph_period,
apdex_y_min=apdex_y_min,
hit_y_min=hit_y_min,
hit_y_max=hit_y_max,
apdex_y_scale=apdex_y_scale,
hit_y_scale=hit_y_scale,
))
append('</div></div>')
append('</td>')
for module_id, data_dict in sorted(filtered_module.iteritems(),
key=ITEMGETTER0):
append('<tr class="group_top" title="%s (module)"><th rowspan="2">%s</th>'
'<th>module</th>' % (module_id, module_id))
hiddenGraph(self.module[module_id][False], module_id + ' (module)')
apdexAsColumns(data_dict[False])
append('</tr><tr class="group_bottom" title="%s (document)"><th>document</th>' % module_id)
hiddenGraph(self.module[module_id][True], module_id + ' (document)')
apdexAsColumns(data_dict[True])
append('</tr>')
append('<tr class="group_top group_bottom" title="site search"><th colspan="2">site search'
'</th>')
hiddenGraph(self.site_search, 'site search')
site_search_overall = apdexAsColumns(filtered_site_search)
append('</tr>')
for id_, date_dict in sorted(filtered_no_module.iteritems()):
append('<tr class="group_top group_bottom" title="%s"><th colspan="2">%s</th>'
% (id_, id_))
hiddenGraph(self.no_module[id_], id_)
apdexAsColumns(date_dict)
append('</tr>')
append('</table><h2>Per-level overall</h2><table class="stats"><tr>'
'<th>level</th>')
append(APDEXStats.asHTMLHeader())
append('</tr><tr><th>other</th>')
append(other_overall.asHTML(self.threshold))
append('</tr><tr><th>site search</th>')
append(site_search_overall.asHTML(self.threshold))
append('</tr><tr><th>module</th>')
append(module_document_overall[False].asHTML(self.threshold))
append('</tr><tr><th>document</th>')
append(module_document_overall[True].asHTML(self.threshold))
append('</tr></table>')
append(super(ERP5SiteStats, self).asHTML(date_format,
placeholder_delta, graph_period, graph_coefficient, encoding,
stat_filter=stat_filter,
x_min=x_min, x_max=x_max,
apdex_y_min=apdex_y_min, hit_y_min=hit_y_min, hit_y_max=hit_y_max,
apdex_y_scale=apdex_y_scale,
hit_y_scale=hit_y_scale,
n_hottest_pages=n_hottest_pages,
))
return '\n'.join(result)
@classmethod
def fromJSONState(cls, state, getDuration, suffix):
result = super(ERP5SiteStats, cls).fromJSONState(state, getDuration, suffix)
for module_id, module_dict_state in state['module'].iteritems():
module_dict = result.module[module_id]
for is_document, date_dict_state in module_dict_state.iteritems():
date_dict = module_dict[is_document == 'true']
for value_date, apdex_state in date_dict_state.iteritems():
date_dict[value_date] = APDEXStats.fromJSONState(apdex_state, getDuration)
for id_, date_dict in state['no_module'].iteritems():
no_module_dict = result.no_module[id_]
for value_date, apdex_state in date_dict.iteritems():
no_module_dict[value_date] = APDEXStats.fromJSONState(
apdex_state, getDuration)
for value_date, apdex_state in state['site_search'].iteritems():
result.site_search[value_date] = APDEXStats.fromJSONState(
apdex_state, getDuration)
return result
def asJSONState(self):
result = super(ERP5SiteStats, self).asJSONState()
result['module'] = module = {}
for module_id, module_dict in self.module.iteritems():
module_dict_state = module[module_id] = {}
for is_document, date_dict in module_dict.iteritems():
module_dict_state[is_document] = _APDEXDateDictAsJSONState(date_dict)
result['no_module'] = no_module = {}
for id_, date_dict in self.no_module.iteritems():
no_module[id_] = _APDEXDateDictAsJSONState(date_dict)
result['site_search'] = _APDEXDateDictAsJSONState(self.site_search)
return result
def accumulateFrom(self, other):
super(ERP5SiteStats, self).accumulateFrom(other)
module = self.module
for module_id, other_module_dict in other.module.iteritems():
module_dict = module[module_id]
for is_document, other_date_dict in other_module_dict.iteritems():
date_dict = module_dict[is_document]
for value_date, apdex in other_date_dict.iteritems():
date_dict[value_date].accumulateFrom(apdex)
for id_, other_date_dict in other.no_module.iteritems():
date_dict = self.no_module[id_]
for value_date, apdex in other_date_dict.iteritems():
date_dict.accumulateFrom(apdex)
attribute = self.site_search
for value_date, apdex in other.site_search.iteritems():
attribute[value_date].accumulateFrom(apdex)
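A hypothetical standalone sketch of the URL-classification heuristic that ERP5SiteStats.accumulate() applies (after the site prefix has been stripped by the suffix() callable): GET parameters are ignored, a first chunk ending in "_module" selects a module, and a second chunk that is not a view marks a document hit. The function name is illustrative:

```python
def classify_erp5_path(path):
    split = path.split('?', 1)[0].split('/')
    if split and split[0].endswith('_module'):
        # Same condition as accumulate(): a second chunk that is neither
        # "view" nor a "*_view*" page counts as a document access.
        is_document = (
            len(split) > 1 and split[1] != 'view' and '_view' not in split[1]
        )
        return ('module', split[0], is_document)
    if split and split[0] == 'ERP5Site_viewSearchResult':
        return ('site_search',)
    return ('other',)
```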
DURATION_US_FORMAT = '%D'
DURATION_MS_FORMAT = '%{ms}T'
DURATION_S_FORMAT = '%T'
server_name_group_dict = {
'%v': lambda x, path: x.group('servername') + '/' + path,
'%V': lambda x, path: x.group('canonical_servername') + '/' + path,
}
logformat_dict = {
'%h': r'(?P<host>[^ ]*)',
'%b': r'(?P<bytes>[0-9-]*?)',
'%l': r'(?P<ident>[^ ]*)',
'%u': r'(?P<user>[^ ]*)',
'%t': r'\[(?P<timestamp>\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2} [+-]\d{4})\]',
'%r': r'(?P<request>[^"]*)', # XXX: expected to be enclosed in ". See also REQUEST_PATTERN
'%>s': r'(?P<status>[0-9]*?)',
'%O': r'(?P<size>[0-9-]*?)',
'%{Referer}i': r'(?P<referer>[^"]*)', # XXX: expected to be enclosed in "
'%{REMOTE_USER}i': r'(?P<remote_user>[^"]*)', # XXX: expected to be enclosed in "
'%{User-Agent}i': r'(?P<agent>[^"]*)', # XXX: expected to be enclosed in "
DURATION_US_FORMAT: r'(?P<duration>[0-9]*)',
DURATION_MS_FORMAT: r'(?P<duration_ms>[0-9]*)',
DURATION_S_FORMAT: r'(?P<duration_s>[0-9]*)',
'%%': r'%',
'%v': r'(?P<servername>[^ ]*)',
'%V': r'(?P<canonical_servername>[^ ]*)',
# TODO: add more formats
}
# Expensive, but more robust, variants
expensive_logformat_dict = {
'%r': r'(?P<request>(\\.|[^\\"])*)',
'%{Referer}i': r'(?P<referer>(\\.|[^\\"])*)',
'%{User-Agent}i': r'(?P<agent>(\\.|[^\\"])*)',
'%{REMOTE_USER}i': r'(?P<remote_user>(\\.|[^\\"])*)',
}
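A hedged sketch (not this module's actual implementation, which lies outside this excerpt) of how a logformat_dict-style table can turn an Apache LogFormat string into a parsing regex: substitute each %-directive with its named-group pattern and leave literal text in place. The token regex and three-entry table below are simplified for illustration:

```python
import re

# Matches %{...}x directives (e.g. %{Referer}i) and short ones (%h, %>s, %%).
TOKEN_RE = re.compile(r'%\{[^}]*\}[a-zA-Z]|%>?[a-zA-Z%]')
TABLE = {
    '%h': r'(?P<host>[^ ]*)',
    '%r': r'(?P<request>[^"]*)',
    '%>s': r'(?P<status>[0-9]*?)',
}

def logformat_to_regex(fmt):
    # A callable replacement keeps backslashes in the patterns literal.
    return re.compile(
        TOKEN_RE.sub(lambda m: TABLE.get(m.group(0), re.escape(m.group(0))),
                     fmt))

match = logformat_to_regex('%h "%r" %>s').fullmatch(
    '127.0.0.1 "GET / HTTP/1.1" 200')
```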
REQUEST_PATTERN = re.compile('(?P<method>[^ ]*) (?P<url>[^ ]*)'
'( (?P<protocol>.*))?')
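The request line captured by %r has the shape "METHOD URL PROTOCOL"; REQUEST_PATTERN splits it into named groups, with the protocol part optional. A quick Python 3 check using the same pattern:

```python
import re

request_pattern = re.compile('(?P<method>[^ ]*) (?P<url>[^ ]*)'
                             '( (?P<protocol>.*))?')
m = request_pattern.match('GET /foo/bar?baz=1 HTTP/1.1')
```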
class AggregateSiteUrl(argparse.Action):
__argument_to_aggregator = {
'--base': GenericSiteStats,
'--erp5-base': ERP5SiteStats,
'--skip-base': None,
}
def __call__(self, parser, namespace, values, option_string=None):
action = base_action = self.__argument_to_aggregator[option_string]
site_list, site_caption_dict = getattr(namespace, self.dest)
next_value = iter(values).next
while True:
try:
value = next_value()
except StopIteration:
break
if value in site_caption_dict:
raise ValueError('Duplicate base: %r' % value)
if action is not None and value[0] == '+':
caption = value[1:]
try:
value = next_value()
except StopIteration:
raise ValueError('No base follows caption %r' % value)
else:
caption = value
site_caption_dict[value] = caption
match = re.compile(value).match
if base_action is not None:
match_suffix = re.compile(value + '(?P<suffix>.*)').match
action = partial(base_action,
suffix=lambda x: match_suffix(x).group('suffix'))
site_list.append((value, match, action))
class ShlexArgumentParser(argparse.ArgumentParser):
"""
Two objectives in this class:
- use shlex to split config files
- when recursively including files, do it from referer's path instead of
current working directory, to facilitate relative inclusion.
"""
# XXX: I whould like to be able to hook inside _read_args_from_files, but
# it would be dirtier. Instead, declare a private method doing similar
# replacement before handing args to original parse_known_args.
def __read_args_from_files(self, args, cwd):
new_args = []
append = new_args.append
extend = new_args.extend
args = iter(args)
for arg in args:
if arg[:1] in self.fromfile_prefix_chars:
filepath = arg[1:]
if not filepath:
filepath = next(args)
filepath = os.path.expanduser(filepath)
new_cwd = os.path.normpath(os.path.join(
cwd,
os.path.dirname(filepath),
))
try:
with open(os.path.join(new_cwd, os.path.basename(filepath))
) as in_file:
extend(self.__read_args_from_files(
shlex.split(in_file.read(), comments=True),
new_cwd,
))
except IOError, exc:
self.error(str(exc))
else:
append(arg)
return new_args
def parse_known_args(self, args=None, namespace=None):
if args is None:
args = sys.argv[1:]
else:
args = list(args)
args = self.__read_args_from_files(args, os.getcwd())
return super(ShlexArgumentParser, self).parse_known_args(args=args,
namespace=namespace)
_month_offset_cache = {}
def _asWeekString(dt):
year = dt.year
month = dt.month
day = dt.day
key = (year, month)
try:
offset = _month_offset_cache[key]
except KeyError:
# Substract 1 to exclude first day of month, and 1 to prepare for next
# operation (avoid substracting on each run).
offset = date(year, month, 1).timetuple().tm_yday - 2
_month_offset_cache[key] = offset
day_of_year = day + offset
day -= day_of_year - (day_of_year // 7 * 7)
if day < 1:
month -= 1
day += calendar.monthrange(year, month)[1]
assert day > 0 and month > 0, (dt, year, month, day)
return '%04i/%02i/%02i' % (year, month, day)
def _weekStringAsQuarterString(timestamp):
year, month, _ = timestamp.split('/')
return '%s/%02i' % (year, (int(month) - 1) // 3 * 3 + 1)
def _roundWeek(dt):
day_of_year = dt.timetuple().tm_yday
return dt - timedelta(day_of_year - ((day_of_year - 1) // 7 * 7 + 1))
def _getWeekCoefficient(dt):
if dt.month != 12:
return 1
# 32 = 31 days of December + 1 day so YYYY/12/31 is still 1 day of measure,
# and return value is 7.
return max(1, 7. / (32 - dt.day))
def _round6Hour(dt):
return dt.replace(hour=dt.hour // 6 * 6)
def _hourAsWeekString(timestamp):
dt = datetime.strptime(timestamp, '%Y/%m/%d %H')
return (dt - timedelta(dt.weekday())).date().strftime('%Y/%m/%d')
def _asHalfDayString(timestamp):
prefix, _ = timestamp.rsplit(':', 1)
prefix, hours = prefix.split(' ')
return '%s %02i' % (prefix, int(hours) // 12 * 12)
def _asQuarterHourString(timestamp):
prefix, minute = timestamp.rsplit(':', 1)
return '%s:%02i' % (prefix, int(minute) // 15 * 15)
# Key: argument (represents table granularity)
# Value:
# - cheap conversion from apache date format to graph granularity
# must be sortable consistently with time flow
# - conversion from gaph granularity to table granularity
# - graph granularity caption
# - format string to parse and generate graph granularity into/from
# datetime.datetime instance
# - period during which a placeholder point will be added if there is no data
# point
# - round a datetime.datetime instance so once represented using given format
# string it is a valid graph-granularity date for period
# - coefficient to apply to hit count for given (graph granularity)
# datetime.datetime. Most useful in case of "7 days", as last month's week
# may be a single day, causing graph to display a value up to 7 times lower
# than what it should be.
period_parser = {
'year': (
lambda x: x.strftime('%Y/%m'),
lambda x: x.split('/', 1)[0],
'month',
'%Y/%m',
# Longest month: 31 days
timedelta(31),
lambda x: x,
# Error margin without correction: 3/31 = 10%
lambda x: 31. / calendar.monthrange(x.year, x.month)[1],
),
'quarter': (
_asWeekString,
_weekStringAsQuarterString,
# Note: Not calendar weeks, but chunks of 7 days starting on first year's
# day. Cheaper to compute than locating first sunday/monday of the year.
'7 days',
'%Y/%m/%d',
timedelta(7),
_roundWeek,
# Error margin without correction: (366 % 7 = 2) 2/7 = 29%
_getWeekCoefficient,
),
'month': (
lambda x: x.strftime('%Y/%m/%d'),
lambda x: '/'.join(x.split('/', 2)[:2]),
'day',
'%Y/%m/%d',
# Longest day: 24 hours + 1h DST (never more ?)
timedelta(seconds=3600 * 25),
lambda x: x,
# Error margin without correction: (DST) 1/24 = 4%
lambda x: 1,
),
'week': (
lambda x: x.strftime('%Y/%m/%d ') + '%02i' % (x.hour // 6 * 6),
_hourAsWeekString,
'6 hours',
'%Y/%m/%d %H',
timedelta(seconds=3600 * 6),
_round6Hour,
# Error margin without correction: (DST) 1/6 = 17%
lambda x: 1,
),
'day': (
lambda x: x.strftime('%Y/%m/%d %H'),
lambda x: x.split(' ')[0],
'hour',
'%Y/%m/%d %H',
# Longest hour: 60 * 60 seconds + 1 leap second.
timedelta(seconds=3601),
lambda x: x,
# Error margin without correction: (leap) 1/3600 = .03%
lambda x: 1,
),
'halfday': (
lambda x: x.strftime('%Y/%m/%d %H:') + '%02i' % (x.minute // 30 * 30),
_asHalfDayString,
'30 minutes',
'%Y/%m/%d %H:%M',
timedelta(seconds=30 * 60),
lambda x: x.replace(minute=x.minute // 30 * 30),
lambda x: 1,
),
'quarterhour': (
lambda x: x.strftime('%Y/%m/%d %H:%M'),
_asQuarterHourString,
'minute',
'%Y/%m/%d %H:%M',
timedelta(seconds=60),
lambda x: x,
lambda x: 1,
),
}
apdex_y_scale_dict = {
'linear': None,
'log': 'log100To0',
}
hit_y_scale_dict = {
'linear': None,
'log': 'log0ToAny',
}
def asHTML(out, encoding, per_site, args, default_site, period_parameter_dict,
stats, site_caption_dict):
period = period_parameter_dict['period']
decimator = period_parameter_dict['decimator']
date_format = period_parameter_dict['date_format']
placeholder_delta = period_parameter_dict['placeholder_delta']
graph_period = period_parameter_dict['graph_period']
graph_coefficient = period_parameter_dict['graph_coefficient']
hit_y_max = args.fixed_yrange
if hit_y_max is not None:
apdex_y_min = hit_y_min = 0
if hit_y_max < 0:
hit_y_max = None
else:
apdex_y_min = hit_y_min = None
out.write('<!DOCTYPE html>\n<html><head><meta charset="%s">'
'<title>Stats</title><meta name="generator" content="APacheDEX" />' % encoding)
js_path = args.js
js_embed = js_path is None or args.js_embed
if js_embed:
out.write('<style>')
out.write(getResource('apachedex.css'))
out.write('</style>')
else:
out.write('<link rel="stylesheet" type="text/css" '
'href="%s/apachedex.css"/>' % js_path)
for script in ('jquery.js', 'jquery.flot.js', 'jquery.flot.time.js',
'jquery.flot.axislabels.js', 'jquery-ui.js', 'apachedex.js'):
if js_embed:
out.write('<script type="text/javascript">//<![CDATA[\n')
out.write(getResource(script))
out.write('\n//]]></script>')
else:
out.write('<script type="text/javascript" src="%s/%s"></script>' % (
js_path, script))
apdex_y_scale = apdex_y_scale_dict[args.apdex_yscale]
hit_y_scale = hit_y_scale_dict[args.hit_yscale]
out.write('</head><body><h1>Overall</h1>')
site_list = list(enumerate(sorted(per_site.iteritems(),
key=lambda x: site_caption_dict[x[0]])))
html_site_caption_dict = {}
for i, (site_id, _) in site_list:
html_site_caption_dict[site_id] = escape(site_caption_dict[site_id])
if len(per_site) > 1:
out.write('<h2>Index</h2><ol>')
for i, (site_id, _) in site_list:
out.write('<li><a href="#%s" title="%s">%s</a></li>' % (i,
escape(repr(site_id), quote=True), html_site_caption_dict[site_id]))
out.write('</ol>')
out.write('<h2>Parameters</h2><table class="stats">')
for caption, value in (
('apdex threshold', '%.2fs' % args.apdex),
('period', args.period or (period + ' (auto)')),
('timezone', args.to_timezone or "(input's)")
):
out.write('<tr><th class="text">%s</th><td>%s</td></tr>' % (
caption, value))
out.write('</table><h2>Hits per %s</h2><table class="stats">'
'<tr><th>date</th><th>hits</th></tr>' % period)
hit_per_day = defaultdict(int)
x_min = LARGER_THAN_INTEGER_STR
x_max = SMALLER_THAN_INTEGER_STR
for site_data in per_site.itervalues():
apdex_data_list = site_data.getApdexData()
if apdex_data_list:
x_min = min(x_min, apdex_data_list[0][0])
x_max = max(x_max, apdex_data_list[-1][0])
for hit_date, _, hit, _ in apdex_data_list:
hit_per_day[decimator(hit_date)] += hit
if x_min == LARGER_THAN_INTEGER_STR:
x_min = None
x_max = None
for hit_date, hit in sorted(hit_per_day.iteritems(), key=ITEMGETTER0):
out.write('<tr><td>%s</td><td>%s</td></tr>' % (hit_date, hit))
out.write('</table>')
n_hottest_pages = args.n_hottest_pages
for i, (site_id, data) in site_list:
out.write('<h1 id="%s" title="%s">%s</h1>' % (i, escape(repr(site_id),
quote=True), html_site_caption_dict[site_id]))
apdex_data = data.getApdexData()
if apdex_data:
out.write(
graphPair(
prepareDataForGraph(
apdex_data,
date_format,
placeholder_delta,
graph_coefficient,
x_min=x_min,
x_max=x_max,
),
date_format,
graph_period,
apdex_y_min=apdex_y_min,
hit_y_min=hit_y_min,
hit_y_max=hit_y_max,
apdex_y_scale=apdex_y_scale,
hit_y_scale=hit_y_scale,
)
)
out.write(data.asHTML(date_format, placeholder_delta, graph_period,
graph_coefficient, encoding, decimator,
x_min=x_min, x_max=x_max,
apdex_y_min=apdex_y_min, hit_y_min=hit_y_min, hit_y_max=hit_y_max,
apdex_y_scale=apdex_y_scale,
hit_y_scale=hit_y_scale,
n_hottest_pages=n_hottest_pages,
))
end_stat_time = time.time()
if args.stats:
out.write('<h1>Parsing stats</h1><table class="stats">')
buildno, builddate = platform.python_build()
end_parsing_time = stats['end_parsing_time']
parsing_time = end_parsing_time - stats['parsing_start_time']
all_lines = stats['all_lines']
for caption, value in (
('Execution date', datetime.now().isoformat()),
('Interpreter', '%s %s build %s (%s)' % (
platform.python_implementation(),
platform.python_version(),
buildno,
builddate,
)),
('State file count', stats['state_file_count']),
('State loading time', timedelta(seconds=stats['parsing_start_time']
- stats['loading_start_time'])),
('File count', stats['file_count']),
('Lines', all_lines),
('... malformed', stats['malformed_lines']),
('... URL-less', stats['no_url_lines']),
('... skipped (URL)', stats['skipped_lines']),
('... skipped (user agent)', stats['skipped_user_agent']),
('Parsing time', timedelta(seconds=parsing_time)),
('Parsing rate', '%i line/s' % (all_lines / parsing_time)),
('Rendering time', timedelta(seconds=(
end_stat_time - end_parsing_time))),
):
out.write('<tr><th class="text">%s</th><td>%s</td></tr>' % (
caption, value))
out.write('</table>')
out.write('</body></html>')
def asJSON(out, encoding, per_site, *_):
json.dump([(x, y.asJSONState()) for x, y in per_site.iteritems()], out)
format_generator = {
'html': (asHTML, 'utf-8'),
'json': (asJSON, 'ascii'),
}
ZERO_TIMEDELTA = timedelta(0, 0)
class AutoTZInfo(tzinfo):
"""
Only for fixed UTC offsets ([+-]HHMM)
Because datetime.strptime doesn't support %z.
"""
def __init__(self, name):
assert len(name) == 5, repr(name)
sign = name[0]
assert sign in '+-', sign
hour = int(name[1:3])
assert 0 <= hour <= 12, hour
minute = int(name[3:])
assert 0 <= minute < 60, minute
if sign == '-':
hour = -hour
minute = -minute
self.offset = timedelta(hours=hour, minutes=minute)
self.name = name
def utcoffset(self, dt):
return self.offset
def dst(self, dt):
return ZERO_TIMEDELTA
def tzname(self, dt):
return self.name
_tz_cache = {}
def getTZInfo(tz):
try:
return _tz_cache[tz]
except KeyError:
_tz_cache[tz] = tzi = AutoTZInfo(tz)
return tzi
def _gracefulExit(func):
@functools.wraps(func)
def wrapper(*args, **kw):
try:
return func(*args, **kw)
except KeyboardInterrupt:
sys.exit(1)
return wrapper
@_gracefulExit
def main():
parser = ShlexArgumentParser(description='Compute Apdex out of '
'apache-style log files', fromfile_prefix_chars='@')
parser.add_argument('logfile', nargs='*',
help='Log files to process. Use - for stdin.')
parser.add_argument('-l', '--logformat',
default='%h %l %u %t "%r" %>s %O "%{Referer}i" "%{User-Agent}i" %D',
help='Apache LogFormat used to generate provided logs. '
'Default: %(default)r')
parser.add_argument('-o', '--out', default='-',
help='Filename to write output to. Use - for stdout. Default: %(default)s')
parser.add_argument('-q', '--quiet', action='store_true',
help='Suppress warnings about malformed lines.')
parser.add_argument('-Q', '--no-progress', action='store_true',
help='Suppress progress indication (file being parsed, lines counter). '
'Does not imply -q.')
parser.add_argument('--state-file', nargs='+', default=[],
help='Use given JSON files as initial state. Use - for stdin.')
parser.add_argument('--to-timezone', help='Timezone to convert log '
'timestamps to before splitting days. If not provided, no conversion '
'happens. In addition to "Continent/City" format which know about DST '
'but requires pytz module, fixed UTC offsets can be provided in the '
'+hhmm form (ex: -0700 for UTC-7). This form does not require pytz '
'module.')
group = parser.add_argument_group('generated content (all formats)')
group.add_argument('-a', '--apdex', default=1.0, type=float,
help='First threshold for Apdex computation, in seconds. '
'Default: %(default).2fs')
group.add_argument('-e', '--error-detail', action='store_true',
help='Include detailed report (url & referers) for error statuses.')
group.add_argument('-u', '--user-agent-detail', action='store_true',
help='Include report of most frequent user agents.')
group.add_argument('--erp5-expand-other', action='store_true',
help='Expand ERP5 `other` stats')
group.add_argument('-f', '--format', choices=format_generator,
default='html', help='Format in which output should be generated.')
group.add_argument('-p', '--period', choices=period_parser,
help='Periodicity of sampling buckets. Default: (decide from data).')
group = parser.add_argument_group('generated content (html)')
group.add_argument('-s', '--stats', action='store_true',
help='Enable parsing stats (time spent parsing input, time spent '
'generating output, ...)')
group.add_argument('--js', help='Folder containing needed js files.')
group.add_argument('--js-embed', action='store_true',
help='Embed js files instead of linking to them.')
group.add_argument('--fixed-yrange', nargs='?', type=int, const=-1,
help='Fix graph vertical range: 0-100%% for apdex, 0-value for hits. '
'Negative value means hit max is adapted to data (used when this '
'argument is provided without value).')
group.add_argument('--apdex-yscale', default='linear',
choices=apdex_y_scale_dict,
help='apdex graph ordinate scale. Default: %(default)s')
group.add_argument('--hit-yscale', default='linear',
choices=hit_y_scale_dict,
help='hit graph ordinate scale. Default: %(default)s')
group.add_argument('--n-hottest-pages', type=int,
default=N_HOTTEST_PAGES_DEFAULT,
help='Number of hottest pages to display.')
group = parser.add_argument_group('site matching', 'Earlier arguments take '
'precedence. Arguments are Python regexes, matching urlencoded strings.'
'Regex matches can be named by providing a "+"-prefixed string before '
'regex.')
group.add_argument('-d', '--default',
help='Caption for lines matching no prefix, or skip them if not provided.')
group.add_argument('--base', dest='path', default=([], {}), nargs='+',
action=AggregateSiteUrl,
help='Title (optional) and regexes matching parts of a site.')
group.add_argument('--erp5-base', dest='path', nargs='+',
action=AggregateSiteUrl,
help='Similar to --base, but with specialised statistics. Ex: '
'"/erp5(/|$|\?)"')
group.add_argument('--skip-base', dest='path', nargs='+',
action=AggregateSiteUrl,
help='Absolute base url(s) to ignore.')
group.add_argument('--match-servername', choices=server_name_group_dict,
help='Prefix URL with (canonical) server name.')
group = parser.add_argument_group('filtering')
group.add_argument('--skip-user-agent', nargs='+', default=[],
action='append', help='List of user agents from which hits should be '
'ignored. Useful to exclude monitoring systems.')
args = parser.parse_args()
if not args.logfile and not args.state_file:
parser.error('Either --state-file or logfile arguments '
'must be specified.')
if DURATION_US_FORMAT in args.logformat:
getDuration = lambda x: int(x.group('duration'))
elif DURATION_MS_FORMAT in args.logformat:
getDuration = lambda x: int(x.group('duration_ms')) * MS_PER_S
elif DURATION_S_FORMAT in args.logformat:
getDuration = lambda x: int(x.group('duration_s')) * US_PER_S
else:
parser.error('Neither %D nor %T are present in logformat, apdex '
'cannot be computed.')
if args.match_servername is not None and \
args.match_servername not in args.logformat:
parser.error('--match-servername %s requested, but missing '
'from logformat.' % args.match_servername)
get_url_prefix = server_name_group_dict.get(args.match_servername,
lambda _, path: path)
line_regex = ''
expensive_line_regex = ''
try:
n = iter(args.logformat).next
while True:
key = None
expensive_char = char = n()
if char == '%':
fmt = n()
key = char + fmt
if fmt == '{':
while fmt != '}':
fmt = n()
key += fmt
key += n()
elif fmt == '>':
key += n()
# XXX: Consider unknown fields have no whitespaces (ie, support for
# quotes)
char = logformat_dict.get(key, r'\S*')
expensive_char = expensive_logformat_dict.get(key, char)
line_regex += char
expensive_line_regex += expensive_char
except StopIteration:
assert not key, key
matchline = re.compile(line_regex).match
expensive_matchline = re.compile(expensive_line_regex).match
matchrequest = REQUEST_PATTERN.match
if args.period is None:
next_period_data = ((x, y[4] * AUTO_PERIOD_COEF) for (x, y) in
sorted(period_parser.iteritems(), key=lambda x: x[1][4])).next
period, to_next_period = next_period_data()
original_period = period
earliest_date = latest_date = None
def getNextPeriod():
# datetime is slow (compared to string operations), but not many choices
return (datetime.strptime(earliest_date, date_format) + to_next_period
).strftime(date_format)
def rescale(x):
result = round_date(datetime.strptime(x, old_date_format)).strftime(date_format)
return result
else:
to_next_period = None
period = args.period
def _matchToDateTime(match):
dt, tz = match.group('timestamp').split()
day, month, rest = dt.split('/', 2)
return datetime.strptime(
'%s/%02i/%s' % (day, MONTH_VALUE_DICT[month], rest),
'%d/%m/%Y:%H:%M:%S').replace(tzinfo=getTZInfo(tz))
if args.to_timezone:
to_timezone = args.to_timezone
if re.match(r'^[+-]\d{4}$', to_timezone):
getTimezoneInfo = getTZInfo
else:
if pytz is None:
raise ValueError('pytz is not available, cannot convert timezone.')
getTimezoneInfo = pytz.timezone
tz_info = getTimezoneInfo(to_timezone)
matchToDateTime = lambda x: _matchToDateTime(x).astimezone(tz_info)
else:
matchToDateTime = _matchToDateTime
asDate, decimator, graph_period, date_format, placeholder_delta, \
round_date, graph_coefficient = period_parser[period]
site_list, site_caption_dict = args.path
default_site = args.default
if default_site is None:
default_action = None
if not [None for _, _, x in site_list if x is not None]:
parser.error('None of --default, --erp5-base and --base were '
'specified, nothing to do.')
else:
default_action = partial(GenericSiteStats, suffix=lambda x: x)
site_caption_dict[None] = default_site
infile_list = args.logfile
quiet = args.quiet
threshold = args.apdex
error_detail = args.error_detail
user_agent_detail = args.user_agent_detail
erp5_expand_other = args.erp5_expand_other
file_count = len(infile_list)
per_site = {}
if '-' in args.state_file and '-' in infile_list:
parser.error('stdin cannot be used both as log and state input.')
loading_start_time = time.time()
for state_file_name in args.state_file:
print('Loading %s...' % state_file_name, end='', file=sys.stderr)
if state_file_name == '-':
state_file = sys.stdin
else:
state_file = codecs.open(state_file_name, encoding='ascii')
with state_file:
load_start = time.time()
state = json.load(state_file)
for url, site_state in state:
if url is None:
site = None
action = default_action
else:
for site, prefix_match, action in site_list:
if site == url:
break
else:
site = None
action = default_action
if action is None:
print('Info: no prefix match %r, stats skipped' % url,
file='sys.stderr')
continue
site_stats = action.func.fromJSONState(site_state,
getDuration, action.keywords['suffix'])
if site in per_site:
per_site[site].accumulateFrom(site_stats)
else:
per_site[site] = site_stats
print('done (%s)' % timedelta(seconds=time.time() - load_start),
file=sys.stderr)
skip_user_agent = [re.compile(x).match
for x in itertools.chain(*args.skip_user_agent)]
malformed_lines = 0
skipped_lines = 0
no_url_lines = 0
all_lines = 0
skipped_user_agent = 0
show_progress = not args.no_progress
parsing_start_time = time.time()
for fileno, filename in enumerate(infile_list, 1):
if show_progress:
print('Processing %s [%i/%i]' % (filename, fileno, file_count),
file=sys.stderr)
if filename == '-':
logfile = sys.stdin
else:
for opener, exc in FILE_OPENER_LIST:
logfile = opener(filename, _read_mode, encoding=INPUT_ENCODING, errors=INPUT_ENCODING_ERROR_HANDLER)
try:
logfile.readline()
except exc:
continue
else:
logfile.seek(0)
break
else:
logfile = codecs.open(filename, _read_mode, encoding=INPUT_ENCODING, errors=INPUT_ENCODING_ERROR_HANDLER)
lineno = 0
for lineno, line in enumerate(logfile, 1):
if show_progress and lineno % 5000 == 0:
print(lineno, end='\r', file=sys.stderr)
match = matchline(line)
if match is None:
match = expensive_matchline(line)
if match is None:
if not quiet:
print('Malformed line at %s:%i: %r' % (filename, lineno, line),
file=sys.stderr)
malformed_lines += 1
continue
agent = match.group('agent')
if any(x(agent) for x in skip_user_agent):
skipped_user_agent += 1
continue
url_match = matchrequest(match.group('request'))
if url_match is None:
no_url_lines += 1
continue
url = url_match.group('url')
if url.startswith('http'):
url = splithost(splittype(url)[1])[1]
url = get_url_prefix(match, url)
for site, prefix_match, action in site_list:
if prefix_match(url) is not None:
break
else:
site = None
action = default_action
if action is None:
skipped_lines += 1
continue
hit_date = asDate(matchToDateTime(match))
if to_next_period is not None:
if latest_date is None or latest_date < hit_date:
latest_date = hit_date
if earliest_date is None or hit_date < earliest_date:
earliest_date = hit_date
next_period = getNextPeriod()
try:
while latest_date > next_period:
period, to_next_period = next_period_data()
next_period = getNextPeriod()
except StopIteration:
to_next_period = None
if original_period != period:
original_period = period
if show_progress:
print('Increasing period to %s...' % period, end='',
file=sys.stderr)
old_date_format = date_format
asDate, decimator, graph_period, date_format, placeholder_delta, \
round_date, graph_coefficient = period_parser[period]
latest_date = rescale(latest_date)
earliest_date = rescale(earliest_date)
period_increase_start = time.time()
for site_data in per_site.itervalues():
site_data.rescale(rescale, getDuration)
if show_progress:
print('done (%s)' % timedelta(seconds=time.time()
- period_increase_start), file=sys.stderr)
hit_date = asDate(matchToDateTime(match))
try:
site_data = per_site[site]
except KeyError:
site_data = per_site[site] = action(threshold, getDuration,
error_detail=error_detail, user_agent_detail=user_agent_detail,
erp5_expand_other=erp5_expand_other)
try:
site_data.accumulate(match, url_match, hit_date)
except Exception:
if not quiet:
print('Error analysing line at %s:%i: %r' % (filename, lineno, line),
file=sys.stderr)
traceback.print_exc(file=sys.stderr)
all_lines += lineno
if show_progress:
print(lineno, file=sys.stderr)
end_parsing_time = time.time()
generator, out_encoding = format_generator[args.format]
if args.out == '-':
out = codecs.getwriter(out_encoding)(sys.stdout)
else:
out = codecs.open(args.out, 'w', encoding=out_encoding)
with out:
generator(out, out_encoding, per_site, args, default_site, {
'period': period,
'decimator': decimator,
'date_format': date_format,
'placeholder_delta': placeholder_delta,
'graph_period': graph_period,
'graph_coefficient': graph_coefficient,
}, {
'state_file_count': len(args.state_file),
'loading_start_time': loading_start_time,
'parsing_start_time': parsing_start_time,
'end_parsing_time': end_parsing_time,
'file_count': file_count,
'all_lines': all_lines,
'malformed_lines': malformed_lines,
'no_url_lines': no_url_lines,
'skipped_lines': skipped_lines,
'skipped_user_agent': skipped_user_agent,
},
site_caption_dict,
)
if __name__ == '__main__':
__resource_base = os.path.join(*os.path.split(__file__)[:-1])
def getResource(name, encoding='utf-8'):
return codecs.open(
os.path.join(__resource_base, name),
encoding=encoding,
).read()
main()
from ._version import get_versions
__version__ = get_versions()['version']
del get_versions | APacheDEX | /APacheDEX-1.8.tar.gz/APacheDEX-1.8/apachedex/__init__.py | __init__.py |
(function(B){B.color={};B.color.make=function(F,E,C,D){var G={};G.r=F||0;G.g=E||0;G.b=C||0;G.a=D!=null?D:1;G.add=function(J,I){for(var H=0;H<J.length;++H){G[J.charAt(H)]+=I}return G.normalize()};G.scale=function(J,I){for(var H=0;H<J.length;++H){G[J.charAt(H)]*=I}return G.normalize()};G.toString=function(){if(G.a>=1){return"rgb("+[G.r,G.g,G.b].join(",")+")"}else{return"rgba("+[G.r,G.g,G.b,G.a].join(",")+")"}};G.normalize=function(){function H(J,K,I){return K<J?J:(K>I?I:K)}G.r=H(0,parseInt(G.r),255);G.g=H(0,parseInt(G.g),255);G.b=H(0,parseInt(G.b),255);G.a=H(0,G.a,1);return G};G.clone=function(){return B.color.make(G.r,G.b,G.g,G.a)};return G.normalize()};B.color.extract=function(D,C){var E;do{E=D.css(C).toLowerCase();if(E!=""&&E!="transparent"){break}D=D.parent()}while(!B.nodeName(D.get(0),"body"));if(E=="rgba(0, 0, 0, 0)"){E="transparent"}return B.color.parse(E)};B.color.parse=function(F){var E,C=B.color.make;if(E=/rgb\(\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*\)/.exec(F)){return C(parseInt(E[1],10),parseInt(E[2],10),parseInt(E[3],10))}if(E=/rgba\(\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*,\s*([0-9]+(?:\.[0-9]+)?)\s*\)/.exec(F)){return C(parseInt(E[1],10),parseInt(E[2],10),parseInt(E[3],10),parseFloat(E[4]))}if(E=/rgb\(\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*\)/.exec(F)){return C(parseFloat(E[1])*2.55,parseFloat(E[2])*2.55,parseFloat(E[3])*2.55)}if(E=/rgba\(\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\s*\)/.exec(F)){return C(parseFloat(E[1])*2.55,parseFloat(E[2])*2.55,parseFloat(E[3])*2.55,parseFloat(E[4]))}if(E=/#([a-fA-F0-9]{2})([a-fA-F0-9]{2})([a-fA-F0-9]{2})/.exec(F)){return C(parseInt(E[1],16),parseInt(E[2],16),parseInt(E[3],16))}if(E=/#([a-fA-F0-9])([a-fA-F0-9])([a-fA-F0-9])/.exec(F)){return C(parseInt(E[1]+E[1],16),parseInt(E[2]+E[2],16),parseInt(E[3]+E[3],16))}var 
D=B.trim(F).toLowerCase();if(D=="transparent"){return C(255,255,255,0)}else{E=A[D]||[0,0,0];return C(E[0],E[1],E[2])}};var A={aqua:[0,255,255],azure:[240,255,255],beige:[245,245,220],black:[0,0,0],blue:[0,0,255],brown:[165,42,42],cyan:[0,255,255],darkblue:[0,0,139],darkcyan:[0,139,139],darkgrey:[169,169,169],darkgreen:[0,100,0],darkkhaki:[189,183,107],darkmagenta:[139,0,139],darkolivegreen:[85,107,47],darkorange:[255,140,0],darkorchid:[153,50,204],darkred:[139,0,0],darksalmon:[233,150,122],darkviolet:[148,0,211],fuchsia:[255,0,255],gold:[255,215,0],green:[0,128,0],indigo:[75,0,130],khaki:[240,230,140],lightblue:[173,216,230],lightcyan:[224,255,255],lightgreen:[144,238,144],lightgrey:[211,211,211],lightpink:[255,182,193],lightyellow:[255,255,224],lime:[0,255,0],magenta:[255,0,255],maroon:[128,0,0],navy:[0,0,128],olive:[128,128,0],orange:[255,165,0],pink:[255,192,203],purple:[128,0,128],violet:[128,0,128],red:[255,0,0],silver:[192,192,192],white:[255,255,255],yellow:[255,255,0]}})(jQuery);
// the actual Flot code
(function($) {
// Cache the prototype hasOwnProperty for faster access
var hasOwnProperty = Object.prototype.hasOwnProperty;
///////////////////////////////////////////////////////////////////////////
// The Canvas object is a wrapper around an HTML5 <canvas> tag.
//
// @constructor
// @param {string} cls List of classes to apply to the canvas.
// @param {element} container Element onto which to append the canvas.
//
// Requiring a container is a little iffy, but unfortunately canvas
// operations don't work unless the canvas is attached to the DOM.
function Canvas(cls, container) {
var element = container.children("." + cls)[0];
if (element == null) {
element = document.createElement("canvas");
element.className = cls;
$(element).css({ direction: "ltr", position: "absolute", left: 0, top: 0 })
.appendTo(container);
// If HTML5 Canvas isn't available, fall back to [Ex|Flash]canvas
if (!element.getContext) {
if (window.G_vmlCanvasManager) {
element = window.G_vmlCanvasManager.initElement(element);
} else {
throw new Error("Canvas is not available. If you're using IE with a fall-back such as Excanvas, then there's either a mistake in your conditional include, or the page has no DOCTYPE and is rendering in Quirks Mode.");
}
}
}
this.element = element;
var context = this.context = element.getContext("2d");
// Determine the screen's ratio of physical to device-independent
// pixels. This is the ratio between the canvas width that the browser
// advertises and the number of pixels actually present in that space.
// The iPhone 4, for example, has a device-independent width of 320px,
// but its screen is actually 640px wide. It therefore has a pixel
// ratio of 2, while most normal devices have a ratio of 1.
var devicePixelRatio = window.devicePixelRatio || 1,
backingStoreRatio =
context.webkitBackingStorePixelRatio ||
context.mozBackingStorePixelRatio ||
context.msBackingStorePixelRatio ||
context.oBackingStorePixelRatio ||
context.backingStorePixelRatio || 1;
this.pixelRatio = devicePixelRatio / backingStoreRatio;
// Size the canvas to match the internal dimensions of its container
this.resize(container.width(), container.height());
// Collection of HTML div layers for text overlaid onto the canvas
this.textContainer = null;
this.text = {};
// Cache of text fragments and metrics, so we can avoid expensively
// re-calculating them when the plot is re-rendered in a loop.
this._textCache = {};
}
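	// Hypothetical usage sketch (illustrative only, not part of Flot): the
	// plot setup code further down in this file constructs wrappers like
	//
	//   var surface = new Canvas("flot-base", placeholder);
	//   var octx = surface.context;
	//
	// where "placeholder" is the jQuery-wrapped container div passed to
	// $.plot(). The constructor reuses any existing canvas with the given
	// class, so re-plotting into the same container does not leak elements.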
// Resizes the canvas to the given dimensions.
//
// @param {number} width New width of the canvas, in pixels.
	// @param {number} height New height of the canvas, in pixels.
Canvas.prototype.resize = function(width, height) {
if (width <= 0 || height <= 0) {
throw new Error("Invalid dimensions for plot, width = " + width + ", height = " + height);
}
var element = this.element,
context = this.context,
pixelRatio = this.pixelRatio;
// Resize the canvas, increasing its density based on the display's
// pixel ratio; basically giving it more pixels without increasing the
// size of its element, to take advantage of the fact that retina
// displays have that many more pixels in the same advertised space.
// Resizing should reset the state (excanvas seems to be buggy though)
if (this.width != width) {
element.width = width * pixelRatio;
element.style.width = width + "px";
this.width = width;
}
if (this.height != height) {
element.height = height * pixelRatio;
element.style.height = height + "px";
this.height = height;
}
		// Save the context, so we can reset in case we get replotted. The
		// restore ensures that we're really back at the initial state, and
		// should be safe even if we haven't saved the initial state yet.
context.restore();
context.save();
// Scale the coordinate space to match the display density; so even though we
// may have twice as many pixels, we still want lines and other drawing to
// appear at the same size; the extra pixels will just make them crisper.
context.scale(pixelRatio, pixelRatio);
};
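	// For example (assuming a display where devicePixelRatio is 2 and the
	// backing store ratio is 1), resize(300, 200) gives the element a
	// 600x400 pixel buffer while its CSS size stays 300x200px; the
	// context.scale(pixelRatio, pixelRatio) call then lets drawing code
	// keep using CSS-pixel coordinates while rendering at full density.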
// Clears the entire canvas area, not including any overlaid HTML text
Canvas.prototype.clear = function() {
this.context.clearRect(0, 0, this.width, this.height);
};
// Finishes rendering the canvas, including managing the text overlay.
Canvas.prototype.render = function() {
var cache = this._textCache;
// For each text layer, add elements marked as active that haven't
// already been rendered, and remove those that are no longer active.
for (var layerKey in cache) {
if (hasOwnProperty.call(cache, layerKey)) {
var layer = this.getTextLayer(layerKey),
layerCache = cache[layerKey];
layer.hide();
for (var styleKey in layerCache) {
if (hasOwnProperty.call(layerCache, styleKey)) {
var styleCache = layerCache[styleKey];
for (var key in styleCache) {
if (hasOwnProperty.call(styleCache, key)) {
var info = styleCache[key];
if (info.active) {
if (!info.rendered) {
layer.append(info.element);
info.rendered = true;
}
} else {
delete styleCache[key];
if (info.rendered) {
info.element.detach();
}
}
}
}
}
}
layer.show();
}
}
};
// Creates (if necessary) and returns the text overlay container.
//
// @param {string} classes String of space-separated CSS classes used to
// uniquely identify the text layer.
// @return {object} The jQuery-wrapped text-layer div.
Canvas.prototype.getTextLayer = function(classes) {
var layer = this.text[classes];
// Create the text layer if it doesn't exist
if (layer == null) {
// Create the text layer container, if it doesn't exist
if (this.textContainer == null) {
this.textContainer = $("<div class='flot-text'></div>")
.css({
position: "absolute",
top: 0,
left: 0,
bottom: 0,
right: 0,
'font-size': "smaller",
color: "#545454"
})
.insertAfter(this.element);
}
layer = this.text[classes] = $("<div></div>")
.addClass(classes)
.css({
position: "absolute",
top: 0,
left: 0,
bottom: 0,
right: 0
})
.appendTo(this.textContainer);
}
return layer;
};
// Creates (if necessary) and returns a text info object.
//
// The object looks like this:
//
// {
// width: Width of the text's wrapper div.
// height: Height of the text's wrapper div.
// active: Flag indicating whether the text should be visible.
// rendered: Flag indicating whether the text is currently visible.
// element: The jQuery-wrapped HTML div containing the text.
// }
//
// Canvas maintains a cache of recently-used text info objects; getTextInfo
// either returns the cached element or creates a new entry.
//
// @param {string} layer A string of space-separated CSS classes uniquely
// identifying the layer containing this text.
// @param {string} text Text string to retrieve info for.
// @param {(string|object)=} font Either a string of space-separated CSS
// classes or a font-spec object, defining the text's font and style.
// @param {number=} angle Angle at which to rotate the text, in degrees.
// Angle is currently unused; it will be implemented in the future.
// @return {object} A text info object.
Canvas.prototype.getTextInfo = function(layer, text, font, angle) {
var textStyle, layerCache, styleCache, info;
// Cast the value to a string, in case we were given a number or such
text = "" + text;
// If the font is a font-spec object, generate a CSS font definition
if (typeof font === "object") {
textStyle = font.style + " " + font.variant + " " + font.weight + " " + font.size + "px/" + font.lineHeight + "px " + font.family;
} else {
textStyle = font;
}
// Retrieve (or create) the cache for the text's layer and styles
layerCache = this._textCache[layer];
if (layerCache == null) {
layerCache = this._textCache[layer] = {};
}
styleCache = layerCache[textStyle];
if (styleCache == null) {
styleCache = layerCache[textStyle] = {};
}
info = styleCache[text];
// If we can't find a matching element in our cache, create a new one
if (info == null) {
var element = $("<div></div>").html(text)
.css({
position: "absolute",
top: -9999
})
.appendTo(this.getTextLayer(layer));
if (typeof font === "object") {
element.css({
font: textStyle,
color: font.color
});
} else if (typeof font === "string") {
element.addClass(font);
}
info = styleCache[text] = {
active: false,
rendered: false,
element: element,
width: element.outerWidth(true),
height: element.outerHeight(true)
};
element.detach();
}
return info;
};
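// Usage sketch (illustrative): measuring a label before placing it.
// Repeated calls with the same layer/text/font hit the cache:
//
//     var info = canvas.getTextInfo("flot-x-axis", "100", "tickLabel");
//     // info.width and info.height hold the measured size; active and
//     // rendered both start out false until addText and render run.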
// Adds a text string to the canvas text overlay.
//
// The text isn't drawn immediately; it is marked as active, which will
// result in its addition to the canvas on the next render pass.
//
// @param {string} layer A string of space-separated CSS classes uniquely
// identifying the layer containing this text.
// @param {number} x X coordinate at which to draw the text.
// @param {number} y Y coordinate at which to draw the text.
// @param {string} text Text string to draw.
// @param {(string|object)=} font Either a string of space-separated CSS
// classes or a font-spec object, defining the text's font and style.
// @param {number=} angle Angle at which to rotate the text, in degrees.
// Angle is currently unused; it will be implemented in the future.
// @param {string=} halign Horizontal alignment of the text; either "left",
// "center" or "right".
// @param {string=} valign Vertical alignment of the text; either "top",
// "middle" or "bottom".
Canvas.prototype.addText = function(layer, x, y, text, font, angle, halign, valign) {
var info = this.getTextInfo(layer, text, font, angle);
// Mark the div for inclusion in the next render pass
info.active = true;
// Tweak the div's position to match the text's alignment
if (halign == "center") {
x -= info.width / 2;
} else if (halign == "right") {
x -= info.width;
}
if (valign == "middle") {
y -= info.height / 2;
} else if (valign == "bottom") {
y -= info.height;
}
// Move the element to its final position within the container
info.element.css({
top: Math.round(y),
left: Math.round(x)
});
};
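// Usage sketch (illustrative): centering a label at canvas coordinates
// (200, 150); it becomes visible on the next render pass:
//
//     canvas.addText("flot-x-axis", 200, 150, "July", null, null,
//                    "center", "middle");
//     canvas.render();
//
// addText only marks the cached element active, so render() must run
// before the text actually appears in the overlay.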
// Removes one or more text strings from the canvas text overlay.
//
// If no parameters are given, all text within the layer is removed.
// The text is not actually removed; it is simply marked as inactive, which
// will result in its removal on the next render pass.
//
// @param {string} layer A string of space-separated CSS classes uniquely
// identifying the layer containing this text.
// @param {string} text Text string to remove.
// @param {(string|object)=} font Either a string of space-separated CSS
// classes or a font-spec object, defining the text's font and style.
// @param {number=} angle Angle at which the text is rotated, in degrees.
// Angle is currently unused; it will be implemented in the future.
Canvas.prototype.removeText = function(layer, text, font, angle) {
if (text == null) {
var layerCache = this._textCache[layer];
if (layerCache != null) {
for (var styleKey in layerCache) {
if (hasOwnProperty.call(layerCache, styleKey)) {
                    var styleCache = layerCache[styleKey];
for (var key in styleCache) {
if (hasOwnProperty.call(styleCache, key)) {
styleCache[key].active = false;
}
}
}
}
}
} else {
this.getTextInfo(layer, text, font, angle).active = false;
}
};
///////////////////////////////////////////////////////////////////////////
// The top-level container for the entire plot.
function Plot(placeholder, data_, options_, plugins) {
        // data is of the form:
// [ series1, series2 ... ]
// where series is either just the data as [ [x1, y1], [x2, y2], ... ]
// or { data: [ [x1, y1], [x2, y2], ... ], label: "some label", ... }
var series = [],
options = {
// the color theme used for graphs
colors: ["#edc240", "#afd8f8", "#cb4b4b", "#4da74d", "#9440ed"],
legend: {
show: true,
                noColumns: 1, // number of columns in legend table
labelFormatter: null, // fn: string -> string
labelBoxBorderColor: "#ccc", // border color for the little label boxes
container: null, // container (as jQuery object) to put legend in, null means default on top of graph
position: "ne", // position of default legend container within plot
margin: 5, // distance from grid edge to default legend container within plot
backgroundColor: null, // null means auto-detect
backgroundOpacity: 0.85, // set to 0 to avoid background
sorted: null // default to no legend sorting
},
xaxis: {
show: null, // null = auto-detect, true = always, false = never
position: "bottom", // or "top"
mode: null, // null or "time"
font: null, // null (derived from CSS in placeholder) or object like { size: 11, lineHeight: 13, style: "italic", weight: "bold", family: "sans-serif", variant: "small-caps" }
color: null, // base color, labels, ticks
tickColor: null, // possibly different color of ticks, e.g. "rgba(0,0,0,0.15)"
transform: null, // null or f: number -> number to transform axis
inverseTransform: null, // if transform is set, this should be the inverse function
min: null, // min. value to show, null means set automatically
max: null, // max. value to show, null means set automatically
autoscaleMargin: null, // margin in % to add if auto-setting min/max
ticks: null, // either [1, 3] or [[1, "a"], 3] or (fn: axis info -> ticks) or app. number of ticks for auto-ticks
tickFormatter: null, // fn: number -> string
labelWidth: null, // size of tick labels in pixels
labelHeight: null,
reserveSpace: null, // whether to reserve space even if axis isn't shown
tickLength: null, // size in pixels of ticks, or "full" for whole line
alignTicksWithAxis: null, // axis number or null for no sync
tickDecimals: null, // no. of decimals, null means auto
tickSize: null, // number or [number, "unit"]
minTickSize: null // number or [number, "unit"]
},
yaxis: {
autoscaleMargin: 0.02,
position: "left" // or "right"
},
xaxes: [],
yaxes: [],
series: {
points: {
show: false,
radius: 3,
lineWidth: 2, // in pixels
fill: true,
fillColor: "#ffffff",
symbol: "circle" // or callback
},
lines: {
// we don't put in show: false so we can see
// whether lines were actively disabled
lineWidth: 2, // in pixels
fill: false,
fillColor: null,
steps: false
// Omit 'zero', so we can later default its value to
// match that of the 'fill' option.
},
bars: {
show: false,
lineWidth: 2, // in pixels
barWidth: 1, // in units of the x axis
fill: true,
fillColor: null,
align: "left", // "left", "right", or "center"
horizontal: false,
zero: true
},
shadowSize: 3,
highlightColor: null
},
grid: {
show: true,
aboveData: false,
color: "#545454", // primary color used for outline and labels
backgroundColor: null, // null for transparent, else color
borderColor: null, // set if different from the grid color
tickColor: null, // color for the ticks, e.g. "rgba(0,0,0,0.15)"
margin: 0, // distance from the canvas edge to the grid
labelMargin: 5, // in pixels
axisMargin: 8, // in pixels
borderWidth: 2, // in pixels
minBorderMargin: null, // in pixels, null means taken from points radius
markings: null, // array of ranges or fn: axes -> array of ranges
markingsColor: "#f4f4f4",
markingsLineWidth: 2,
// interactive stuff
clickable: false,
hoverable: false,
autoHighlight: true, // highlight in case mouse is near
mouseActiveRadius: 10 // how far the mouse can be away to activate an item
},
interaction: {
redrawOverlayInterval: 1000/60 // time between updates, -1 means in same flow
},
hooks: {}
},
surface = null, // the canvas for the plot itself
overlay = null, // canvas for interactive stuff on top of plot
eventHolder = null, // jQuery object that events should be bound to
ctx = null, octx = null,
xaxes = [], yaxes = [],
plotOffset = { left: 0, right: 0, top: 0, bottom: 0},
plotWidth = 0, plotHeight = 0,
hooks = {
processOptions: [],
processRawData: [],
processDatapoints: [],
processOffset: [],
drawBackground: [],
drawSeries: [],
draw: [],
bindEvents: [],
drawOverlay: [],
shutdown: []
},
plot = this;
// public functions
plot.setData = setData;
plot.setupGrid = setupGrid;
plot.draw = draw;
plot.getPlaceholder = function() { return placeholder; };
plot.getCanvas = function() { return surface.element; };
plot.getPlotOffset = function() { return plotOffset; };
plot.width = function () { return plotWidth; };
plot.height = function () { return plotHeight; };
plot.offset = function () {
var o = eventHolder.offset();
o.left += plotOffset.left;
o.top += plotOffset.top;
return o;
};
plot.getData = function () { return series; };
plot.getAxes = function () {
var res = {}, i;
$.each(xaxes.concat(yaxes), function (_, axis) {
if (axis)
res[axis.direction + (axis.n != 1 ? axis.n : "") + "axis"] = axis;
});
return res;
};
plot.getXAxes = function () { return xaxes; };
plot.getYAxes = function () { return yaxes; };
plot.c2p = canvasToAxisCoords;
plot.p2c = axisToCanvasCoords;
plot.getOptions = function () { return options; };
plot.highlight = highlight;
plot.unhighlight = unhighlight;
plot.triggerRedrawOverlay = triggerRedrawOverlay;
plot.pointOffset = function(point) {
return {
left: parseInt(xaxes[axisNumber(point, "x") - 1].p2c(+point.x) + plotOffset.left, 10),
top: parseInt(yaxes[axisNumber(point, "y") - 1].p2c(+point.y) + plotOffset.top, 10)
};
};
plot.shutdown = shutdown;
plot.resize = function () {
var width = placeholder.width(),
height = placeholder.height();
surface.resize(width, height);
overlay.resize(width, height);
};
// public attributes
plot.hooks = hooks;
// initialize
initPlugins(plot);
parseOptions(options_);
setupCanvases();
setData(data_);
setupGrid();
draw();
bindEvents();
function executeHooks(hook, args) {
args = [plot].concat(args);
for (var i = 0; i < hook.length; ++i)
hook[i].apply(this, args);
}
function initPlugins() {
// References to key classes, allowing plugins to modify them
var classes = {
Canvas: Canvas
};
for (var i = 0; i < plugins.length; ++i) {
var p = plugins[i];
p.init(plot, classes);
if (p.options)
$.extend(true, options, p.options);
}
}
function parseOptions(opts) {
$.extend(true, options, opts);
if (options.xaxis.color == null)
options.xaxis.color = $.color.parse(options.grid.color).scale('a', 0.22).toString();
if (options.yaxis.color == null)
options.yaxis.color = $.color.parse(options.grid.color).scale('a', 0.22).toString();
if (options.xaxis.tickColor == null) // grid.tickColor for back-compatibility
options.xaxis.tickColor = options.grid.tickColor || options.xaxis.color;
if (options.yaxis.tickColor == null) // grid.tickColor for back-compatibility
options.yaxis.tickColor = options.grid.tickColor || options.yaxis.color;
if (options.grid.borderColor == null)
options.grid.borderColor = options.grid.color;
if (options.grid.tickColor == null)
options.grid.tickColor = $.color.parse(options.grid.color).scale('a', 0.22).toString();
// Fill in defaults for axis options, including any unspecified
// font-spec fields, if a font-spec was provided.
// If no x/y axis options were provided, create one of each anyway,
// since the rest of the code assumes that they exist.
var i, axisOptions, axisCount,
fontDefaults = {
style: placeholder.css("font-style"),
size: Math.round(0.8 * (+placeholder.css("font-size").replace("px", "") || 13)),
variant: placeholder.css("font-variant"),
weight: placeholder.css("font-weight"),
family: placeholder.css("font-family")
};
fontDefaults.lineHeight = fontDefaults.size * 1.15;
axisCount = options.xaxes.length || 1;
for (i = 0; i < axisCount; ++i) {
axisOptions = options.xaxes[i];
if (axisOptions && !axisOptions.tickColor) {
axisOptions.tickColor = axisOptions.color;
}
axisOptions = $.extend(true, {}, options.xaxis, axisOptions);
options.xaxes[i] = axisOptions;
if (axisOptions.font) {
axisOptions.font = $.extend({}, fontDefaults, axisOptions.font);
if (!axisOptions.font.color) {
axisOptions.font.color = axisOptions.color;
}
}
}
axisCount = options.yaxes.length || 1;
for (i = 0; i < axisCount; ++i) {
axisOptions = options.yaxes[i];
if (axisOptions && !axisOptions.tickColor) {
axisOptions.tickColor = axisOptions.color;
}
axisOptions = $.extend(true, {}, options.yaxis, axisOptions);
options.yaxes[i] = axisOptions;
if (axisOptions.font) {
axisOptions.font = $.extend({}, fontDefaults, axisOptions.font);
if (!axisOptions.font.color) {
axisOptions.font.color = axisOptions.color;
}
}
}
// backwards compatibility, to be removed in future
if (options.xaxis.noTicks && options.xaxis.ticks == null)
options.xaxis.ticks = options.xaxis.noTicks;
if (options.yaxis.noTicks && options.yaxis.ticks == null)
options.yaxis.ticks = options.yaxis.noTicks;
if (options.x2axis) {
options.xaxes[1] = $.extend(true, {}, options.xaxis, options.x2axis);
options.xaxes[1].position = "top";
}
if (options.y2axis) {
options.yaxes[1] = $.extend(true, {}, options.yaxis, options.y2axis);
options.yaxes[1].position = "right";
}
if (options.grid.coloredAreas)
options.grid.markings = options.grid.coloredAreas;
if (options.grid.coloredAreasColor)
options.grid.markingsColor = options.grid.coloredAreasColor;
if (options.lines)
$.extend(true, options.series.lines, options.lines);
if (options.points)
$.extend(true, options.series.points, options.points);
if (options.bars)
$.extend(true, options.series.bars, options.bars);
if (options.shadowSize != null)
options.series.shadowSize = options.shadowSize;
if (options.highlightColor != null)
options.series.highlightColor = options.highlightColor;
// save options on axes for future reference
for (i = 0; i < options.xaxes.length; ++i)
getOrCreateAxis(xaxes, i + 1).options = options.xaxes[i];
for (i = 0; i < options.yaxes.length; ++i)
getOrCreateAxis(yaxes, i + 1).options = options.yaxes[i];
// add hooks from options
for (var n in hooks)
if (options.hooks[n] && options.hooks[n].length)
hooks[n] = hooks[n].concat(options.hooks[n]);
executeHooks(hooks.processOptions, [options]);
}
function setData(d) {
series = parseData(d);
fillInSeriesOptions();
processData();
}
function parseData(d) {
var res = [];
for (var i = 0; i < d.length; ++i) {
var s = $.extend(true, {}, options.series);
if (d[i].data != null) {
s.data = d[i].data; // move the data instead of deep-copy
delete d[i].data;
$.extend(true, s, d[i]);
d[i].data = s.data;
}
else
s.data = d[i];
res.push(s);
}
return res;
}
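    // Normalization sketch (illustrative): both accepted input forms end
    // up as the same series structure, with options.series defaults
    // filled in:
    //
    //     parseData([ [[1, 2], [3, 4]] ]);
    //     parseData([ { data: [[1, 2], [3, 4]], label: "a" } ]);
    //
    // Each produces a series with data [[1, 2], [3, 4]] plus the default
    // lines/points/bars options; in the second form the extra keys
    // (label) are merged on top of those defaults.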
function axisNumber(obj, coord) {
var a = obj[coord + "axis"];
if (typeof a == "object") // if we got a real axis, extract number
a = a.n;
if (typeof a != "number")
a = 1; // default to first axis
return a;
}
function allAxes() {
// return flat array without annoying null entries
return $.grep(xaxes.concat(yaxes), function (a) { return a; });
}
function canvasToAxisCoords(pos) {
// return an object with x/y corresponding to all used axes
var res = {}, i, axis;
for (i = 0; i < xaxes.length; ++i) {
axis = xaxes[i];
if (axis && axis.used)
res["x" + axis.n] = axis.c2p(pos.left);
}
for (i = 0; i < yaxes.length; ++i) {
axis = yaxes[i];
if (axis && axis.used)
res["y" + axis.n] = axis.c2p(pos.top);
}
if (res.x1 !== undefined)
res.x = res.x1;
if (res.y1 !== undefined)
res.y = res.y1;
return res;
}
function axisToCanvasCoords(pos) {
// get canvas coords from the first pair of x/y found in pos
var res = {}, i, axis, key;
for (i = 0; i < xaxes.length; ++i) {
axis = xaxes[i];
if (axis && axis.used) {
key = "x" + axis.n;
if (pos[key] == null && axis.n == 1)
key = "x";
if (pos[key] != null) {
res.left = axis.p2c(pos[key]);
break;
}
}
}
for (i = 0; i < yaxes.length; ++i) {
axis = yaxes[i];
if (axis && axis.used) {
key = "y" + axis.n;
if (pos[key] == null && axis.n == 1)
key = "y";
if (pos[key] != null) {
res.top = axis.p2c(pos[key]);
break;
}
}
}
return res;
}
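    // Usage sketch (illustrative): round-tripping through the helpers
    // exposed as plot.c2p/plot.p2c. With a single x axis spanning 0-10
    // over a 500px-wide plot:
    //
    //     plot.p2c({ x: 5, y: 0 });       // left: 250, plus a top for y
    //     plot.c2p({ left: 250, top: 0 }); // x: 5 (and x1: 5), etc.
    //
    // canvasToAxisCoords reports one value per used axis (x1, x2, ...)
    // and aliases the first axes as plain x/y; axisToCanvasCoords stops
    // at the first coordinate it can resolve in each direction.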
function getOrCreateAxis(axes, number) {
if (!axes[number - 1])
axes[number - 1] = {
n: number, // save the number for future reference
direction: axes == xaxes ? "x" : "y",
options: $.extend(true, {}, axes == xaxes ? options.xaxis : options.yaxis)
};
return axes[number - 1];
}
function fillInSeriesOptions() {
var neededColors = series.length, maxIndex = -1, i;
// Subtract the number of series that already have fixed colors or
// color indexes from the number that we still need to generate.
for (i = 0; i < series.length; ++i) {
var sc = series[i].color;
if (sc != null) {
neededColors--;
if (typeof sc == "number" && sc > maxIndex) {
maxIndex = sc;
}
}
}
// If any of the series have fixed color indexes, then we need to
// generate at least as many colors as the highest index.
if (neededColors <= maxIndex) {
neededColors = maxIndex + 1;
}
// Generate all the colors, using first the option colors and then
// variations on those colors once they're exhausted.
var c, colors = [], colorPool = options.colors,
colorPoolSize = colorPool.length, variation = 0;
for (i = 0; i < neededColors; i++) {
c = $.color.parse(colorPool[i % colorPoolSize] || "#666");
// Each time we exhaust the colors in the pool we adjust
// a scaling factor used to produce more variations on
// those colors. The factor alternates negative/positive
// to produce lighter/darker colors.
// Reset the variation after every few cycles, or else
// it will end up producing only white or black colors.
if (i % colorPoolSize == 0 && i) {
if (variation >= 0) {
if (variation < 0.5) {
variation = -variation - 0.2;
} else variation = 0;
} else variation = -variation;
}
colors[i] = c.scale('rgb', 1 + variation);
}
// Finalize the series options, filling in their colors
var colori = 0, s;
for (i = 0; i < series.length; ++i) {
s = series[i];
// assign colors
if (s.color == null) {
s.color = colors[colori].toString();
++colori;
}
else if (typeof s.color == "number")
s.color = colors[s.color].toString();
// turn on lines automatically in case nothing is set
if (s.lines.show == null) {
var v, show = true;
for (v in s)
if (s[v] && s[v].show) {
show = false;
break;
}
if (show)
s.lines.show = true;
}
// If nothing was provided for lines.zero, default it to match
// lines.fill, since areas by default should extend to zero.
if (s.lines.zero == null) {
s.lines.zero = !!s.lines.fill;
}
// setup axes
s.xaxis = getOrCreateAxis(xaxes, axisNumber(s, "x"));
s.yaxis = getOrCreateAxis(yaxes, axisNumber(s, "y"));
}
}
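    // Example (illustrative): with the default five-color pool and seven
    // series, the first five series take the pool colors unchanged
    // (variation 0); series six and seven reuse the first two pool
    // colors scaled darker by the first negative variation step (-0.2).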
function processData() {
var topSentry = Number.POSITIVE_INFINITY,
bottomSentry = Number.NEGATIVE_INFINITY,
fakeInfinity = Number.MAX_VALUE,
i, j, k, m, length,
s, points, ps, x, y, axis, val, f, p,
data, format;
function updateAxis(axis, min, max) {
if (min < axis.datamin && min != -fakeInfinity)
axis.datamin = min;
if (max > axis.datamax && max != fakeInfinity)
axis.datamax = max;
}
$.each(allAxes(), function (_, axis) {
// init axis
axis.datamin = topSentry;
axis.datamax = bottomSentry;
axis.used = false;
});
for (i = 0; i < series.length; ++i) {
s = series[i];
s.datapoints = { points: [] };
executeHooks(hooks.processRawData, [ s, s.data, s.datapoints ]);
}
// first pass: clean and copy data
for (i = 0; i < series.length; ++i) {
s = series[i];
data = s.data;
format = s.datapoints.format;
if (!format) {
format = [];
// find out how to copy
format.push({ x: true, number: true, required: true });
format.push({ y: true, number: true, required: true });
if (s.bars.show || (s.lines.show && s.lines.fill)) {
var autoscale = !!((s.bars.show && s.bars.zero) || (s.lines.show && s.lines.zero));
format.push({ y: true, number: true, required: false, defaultValue: 0, autoscale: autoscale });
if (s.bars.horizontal) {
delete format[format.length - 1].y;
format[format.length - 1].x = true;
}
}
s.datapoints.format = format;
}
if (s.datapoints.pointsize != null)
continue; // already filled in
s.datapoints.pointsize = format.length;
ps = s.datapoints.pointsize;
points = s.datapoints.points;
var insertSteps = s.lines.show && s.lines.steps;
s.xaxis.used = s.yaxis.used = true;
for (j = k = 0; j < data.length; ++j, k += ps) {
p = data[j];
var nullify = p == null;
if (!nullify) {
for (m = 0; m < ps; ++m) {
val = p[m];
f = format[m];
if (f) {
if (f.number && val != null) {
val = +val; // convert to number
if (isNaN(val))
val = null;
else if (val == Infinity)
val = fakeInfinity;
else if (val == -Infinity)
val = -fakeInfinity;
}
if (val == null) {
if (f.required)
nullify = true;
if (f.defaultValue != null)
val = f.defaultValue;
}
}
points[k + m] = val;
}
}
if (nullify) {
for (m = 0; m < ps; ++m) {
val = points[k + m];
if (val != null) {
f = format[m];
// extract min/max info
if (f.x)
updateAxis(s.xaxis, val, val);
if (f.y)
updateAxis(s.yaxis, val, val);
}
points[k + m] = null;
}
}
else {
// a little bit of line specific stuff that
// perhaps shouldn't be here, but lacking
// better means...
if (insertSteps && k > 0
&& points[k - ps] != null
&& points[k - ps] != points[k]
&& points[k - ps + 1] != points[k + 1]) {
// copy the point to make room for a middle point
for (m = 0; m < ps; ++m)
points[k + ps + m] = points[k + m];
// middle point has same y
points[k + 1] = points[k - ps + 1];
// we've added a point, better reflect that
k += ps;
}
}
}
}
// give the hooks a chance to run
for (i = 0; i < series.length; ++i) {
s = series[i];
executeHooks(hooks.processDatapoints, [ s, s.datapoints]);
}
// second pass: find datamax/datamin for auto-scaling
for (i = 0; i < series.length; ++i) {
s = series[i];
            points = s.datapoints.points;
            ps = s.datapoints.pointsize;
format = s.datapoints.format;
var xmin = topSentry, ymin = topSentry,
xmax = bottomSentry, ymax = bottomSentry;
for (j = 0; j < points.length; j += ps) {
if (points[j] == null)
continue;
for (m = 0; m < ps; ++m) {
val = points[j + m];
f = format[m];
if (!f || f.autoscale === false || val == fakeInfinity || val == -fakeInfinity)
continue;
if (f.x) {
if (val < xmin)
xmin = val;
if (val > xmax)
xmax = val;
}
if (f.y) {
if (val < ymin)
ymin = val;
if (val > ymax)
ymax = val;
}
}
}
if (s.bars.show) {
// make sure we got room for the bar on the dancing floor
var delta;
switch (s.bars.align) {
case "left":
delta = 0;
break;
case "right":
delta = -s.bars.barWidth;
break;
case "center":
delta = -s.bars.barWidth / 2;
break;
default:
throw new Error("Invalid bar alignment: " + s.bars.align);
}
if (s.bars.horizontal) {
ymin += delta;
ymax += delta + s.bars.barWidth;
}
else {
xmin += delta;
xmax += delta + s.bars.barWidth;
}
}
updateAxis(s.xaxis, xmin, xmax);
updateAxis(s.yaxis, ymin, ymax);
}
$.each(allAxes(), function (_, axis) {
if (axis.datamin == topSentry)
axis.datamin = null;
if (axis.datamax == bottomSentry)
axis.datamax = null;
});
}
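    // Example (illustrative): for a filled line series the format array
    // has three entries (x, y, and a bottom y defaulting to 0), so
    // pointsize is 3 and the input data [[1, 5], [2, 7]] is flattened
    // into datapoints.points = [1, 5, 0, 2, 7, 0].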
function setupCanvases() {
// Make sure the placeholder is clear of everything except canvases
// from a previous plot in this container that we'll try to re-use.
placeholder.css("padding", 0) // padding messes up the positioning
.children(":not(.flot-base,.flot-overlay)").remove();
if (placeholder.css("position") == 'static')
placeholder.css("position", "relative"); // for positioning labels and overlay
surface = new Canvas("flot-base", placeholder);
overlay = new Canvas("flot-overlay", placeholder); // overlay canvas for interactive features
ctx = surface.context;
octx = overlay.context;
// define which element we're listening for events on
eventHolder = $(overlay.element).unbind();
// If we're re-using a plot object, shut down the old one
var existing = placeholder.data("plot");
if (existing) {
existing.shutdown();
overlay.clear();
}
// save in case we get replotted
placeholder.data("plot", plot);
}
function bindEvents() {
// bind events
if (options.grid.hoverable) {
eventHolder.mousemove(onMouseMove);
// Use bind, rather than .mouseleave, because we officially
// still support jQuery 1.2.6, which doesn't define a shortcut
// for mouseenter or mouseleave. This was a bug/oversight that
// was fixed somewhere around 1.3.x. We can return to using
// .mouseleave when we drop support for 1.2.6.
eventHolder.bind("mouseleave", onMouseLeave);
}
if (options.grid.clickable)
eventHolder.click(onClick);
executeHooks(hooks.bindEvents, [eventHolder]);
}
function shutdown() {
if (redrawTimeout)
clearTimeout(redrawTimeout);
eventHolder.unbind("mousemove", onMouseMove);
eventHolder.unbind("mouseleave", onMouseLeave);
eventHolder.unbind("click", onClick);
executeHooks(hooks.shutdown, [eventHolder]);
}
function setTransformationHelpers(axis) {
// set helper functions on the axis, assumes plot area
// has been computed already
function identity(x) { return x; }
var s, m, t = axis.options.transform || identity,
it = axis.options.inverseTransform;
// precompute how much the axis is scaling a point
// in canvas space
if (axis.direction == "x") {
s = axis.scale = plotWidth / Math.abs(t(axis.max) - t(axis.min));
m = Math.min(t(axis.max), t(axis.min));
}
else {
s = axis.scale = plotHeight / Math.abs(t(axis.max) - t(axis.min));
s = -s;
m = Math.max(t(axis.max), t(axis.min));
}
// data point to canvas coordinate
if (t == identity) // slight optimization
axis.p2c = function (p) { return (p - m) * s; };
else
axis.p2c = function (p) { return (t(p) - m) * s; };
// canvas coordinate to data point
if (!it)
axis.c2p = function (c) { return m + c / s; };
else
axis.c2p = function (c) { return it(m + c / s); };
}
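    // Worked example (illustrative): for an x axis with min 0, max 10
    // and plotWidth 500, scale = 500 / 10 = 50 and m = 0, so
    // p2c(4) = (4 - 0) * 50 = 200 and c2p(200) = 0 + 200 / 50 = 4.
    // For a y axis the scale is negated so larger data values map to
    // smaller (higher up) canvas coordinates.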
function measureTickLabels(axis) {
var opts = axis.options, ticks = axis.ticks || [],
axisw = opts.labelWidth || 0, axish = opts.labelHeight || 0,
legacyStyles = axis.direction + "Axis " + axis.direction + axis.n + "Axis",
layer = "flot-" + axis.direction + "-axis flot-" + axis.direction + axis.n + "-axis " + legacyStyles,
font = opts.font || "flot-tick-label tickLabel";
for (var i = 0; i < ticks.length; ++i) {
var t = ticks[i];
if (!t.label)
continue;
var info = surface.getTextInfo(layer, t.label, font);
if (opts.labelWidth == null)
axisw = Math.max(axisw, info.width);
if (opts.labelHeight == null)
axish = Math.max(axish, info.height);
}
axis.labelWidth = Math.ceil(axisw);
axis.labelHeight = Math.ceil(axish);
}
function allocateAxisBoxFirstPhase(axis) {
// find the bounding box of the axis by looking at label
// widths/heights and ticks, make room by diminishing the
// plotOffset; this first phase only looks at one
// dimension per axis, the other dimension depends on the
// other axes so will have to wait
var lw = axis.labelWidth,
lh = axis.labelHeight,
pos = axis.options.position,
tickLength = axis.options.tickLength,
axisMargin = options.grid.axisMargin,
padding = options.grid.labelMargin,
all = axis.direction == "x" ? xaxes : yaxes,
index, innermost;
// determine axis margin
var samePosition = $.grep(all, function (a) {
return a && a.options.position == pos && a.reserveSpace;
});
if ($.inArray(axis, samePosition) == samePosition.length - 1)
axisMargin = 0; // outermost
// determine tick length - if we're innermost, we can use "full"
if (tickLength == null) {
var sameDirection = $.grep(all, function (a) {
return a && a.reserveSpace;
});
innermost = $.inArray(axis, sameDirection) == 0;
if (innermost)
tickLength = "full";
else
tickLength = 5;
}
if (!isNaN(+tickLength))
padding += +tickLength;
// compute box
if (axis.direction == "x") {
lh += padding;
if (pos == "bottom") {
plotOffset.bottom += lh + axisMargin;
axis.box = { top: surface.height - plotOffset.bottom, height: lh };
}
else {
axis.box = { top: plotOffset.top + axisMargin, height: lh };
plotOffset.top += lh + axisMargin;
}
}
else {
lw += padding;
if (pos == "left") {
axis.box = { left: plotOffset.left + axisMargin, width: lw };
plotOffset.left += lw + axisMargin;
}
else {
plotOffset.right += lw + axisMargin;
axis.box = { left: surface.width - plotOffset.right, width: lw };
}
}
// save for future reference
axis.position = pos;
axis.tickLength = tickLength;
axis.box.padding = padding;
axis.innermost = innermost;
}
function allocateAxisBoxSecondPhase(axis) {
// now that all axis boxes have been placed in one
// dimension, we can set the remaining dimension coordinates
if (axis.direction == "x") {
axis.box.left = plotOffset.left - axis.labelWidth / 2;
axis.box.width = surface.width - plotOffset.left - plotOffset.right + axis.labelWidth;
}
else {
axis.box.top = plotOffset.top - axis.labelHeight / 2;
axis.box.height = surface.height - plotOffset.bottom - plotOffset.top + axis.labelHeight;
}
}
function adjustLayoutForThingsStickingOut() {
// possibly adjust plot offset to ensure everything stays
// inside the canvas and isn't clipped off
var minMargin = options.grid.minBorderMargin,
margins = { x: 0, y: 0 }, i, axis;
// check stuff from the plot (FIXME: this should just read
// a value from the series, otherwise it's impossible to
// customize)
if (minMargin == null) {
minMargin = 0;
for (i = 0; i < series.length; ++i)
minMargin = Math.max(minMargin, 2 * (series[i].points.radius + series[i].points.lineWidth/2));
}
margins.x = margins.y = Math.ceil(minMargin);
// check axis labels, note we don't check the actual
// labels but instead use the overall width/height to not
// jump as much around with replots
$.each(allAxes(), function (_, axis) {
var dir = axis.direction;
if (axis.reserveSpace)
margins[dir] = Math.ceil(Math.max(margins[dir], (dir == "x" ? axis.labelWidth : axis.labelHeight) / 2));
});
plotOffset.left = Math.max(margins.x, plotOffset.left);
plotOffset.right = Math.max(margins.x, plotOffset.right);
plotOffset.top = Math.max(margins.y, plotOffset.top);
plotOffset.bottom = Math.max(margins.y, plotOffset.bottom);
}
function setupGrid() {
var i, axes = allAxes(), showGrid = options.grid.show;
// Initialize the plot's offset from the edge of the canvas
for (var a in plotOffset) {
var margin = options.grid.margin || 0;
plotOffset[a] = typeof margin == "number" ? margin : margin[a] || 0;
}
executeHooks(hooks.processOffset, [plotOffset]);
// If the grid is visible, add its border width to the offset
for (var a in plotOffset) {
if(typeof(options.grid.borderWidth) == "object") {
plotOffset[a] += showGrid ? options.grid.borderWidth[a] : 0;
}
else {
plotOffset[a] += showGrid ? options.grid.borderWidth : 0;
}
}
// init axes
$.each(axes, function (_, axis) {
axis.show = axis.options.show;
if (axis.show == null)
axis.show = axis.used; // by default an axis is visible if it's got data
axis.reserveSpace = axis.show || axis.options.reserveSpace;
setRange(axis);
});
if (showGrid) {
var allocatedAxes = $.grep(axes, function (axis) { return axis.reserveSpace; });
$.each(allocatedAxes, function (_, axis) {
// make the ticks
setupTickGeneration(axis);
setTicks(axis);
snapRangeToTicks(axis, axis.ticks);
// find labelWidth/Height for axis
measureTickLabels(axis);
});
// with all dimensions calculated, we can compute the
// axis bounding boxes, starting from the outside
// (i.e. in reverse order)
for (i = allocatedAxes.length - 1; i >= 0; --i)
allocateAxisBoxFirstPhase(allocatedAxes[i]);
// make sure we've got enough space for things that
// might stick out
adjustLayoutForThingsStickingOut();
$.each(allocatedAxes, function (_, axis) {
allocateAxisBoxSecondPhase(axis);
});
}
plotWidth = surface.width - plotOffset.left - plotOffset.right;
plotHeight = surface.height - plotOffset.bottom - plotOffset.top;
// now that we've got the proper plot dimensions, we can compute the scaling
$.each(axes, function (_, axis) {
setTransformationHelpers(axis);
});
if (showGrid) {
drawAxisLabels();
}
insertLegend();
}
function setRange(axis) {
var opts = axis.options,
min = +(opts.min != null ? opts.min : axis.datamin),
max = +(opts.max != null ? opts.max : axis.datamax),
delta = max - min;
if (delta == 0.0) {
// degenerate case
var widen = max == 0 ? 1 : 0.01;
if (opts.min == null)
min -= widen;
// always widen max if we couldn't widen min to ensure we
// don't fall into min == max which doesn't work
if (opts.max == null || opts.min != null)
max += widen;
}
else {
// consider autoscaling
var margin = opts.autoscaleMargin;
if (margin != null) {
if (opts.min == null) {
min -= delta * margin;
// make sure we don't go below zero if all values
// are positive
if (min < 0 && axis.datamin != null && axis.datamin >= 0)
min = 0;
}
if (opts.max == null) {
max += delta * margin;
// symmetrically, don't go above zero if all
// values are negative
if (max > 0 && axis.datamax != null && axis.datamax <= 0)
max = 0;
}
}
}
axis.min = min;
axis.max = max;
}
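/*
The range logic above can be exercised on its own. The sketch below is
illustrative only (`computeRange` is a hypothetical name, not part of Flot's
API): it restates setRange as a pure function, widening a zero-span range so
min != max, and padding the data extremes by autoscaleMargin without crossing
zero.

```javascript
// Illustrative, standalone restatement of setRange as a pure function.
function computeRange(opts, datamin, datamax) {
    var min = +(opts.min != null ? opts.min : datamin),
        max = +(opts.max != null ? opts.max : datamax),
        delta = max - min;
    if (delta == 0.0) {
        // degenerate case: widen so we don't end up with min == max
        var widen = max == 0 ? 1 : 0.01;
        if (opts.min == null)
            min -= widen;
        if (opts.max == null || opts.min != null)
            max += widen;
    }
    else if (opts.autoscaleMargin != null) {
        if (opts.min == null) {
            min -= delta * opts.autoscaleMargin;
            // don't go below zero if all values are positive
            if (min < 0 && datamin != null && datamin >= 0)
                min = 0;
        }
        if (opts.max == null) {
            max += delta * opts.autoscaleMargin;
            if (max > 0 && datamax != null && datamax <= 0)
                max = 0;
        }
    }
    return { min: min, max: max };
}

// data spanning 2..10 with a 2% margin is widened slightly on each side
console.log(computeRange({ autoscaleMargin: 0.02 }, 2, 10));
```
*/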
function setupTickGeneration(axis) {
var opts = axis.options;
// estimate number of ticks
var noTicks;
if (typeof opts.ticks == "number" && opts.ticks > 0)
noTicks = opts.ticks;
else
// heuristic based on the model a*sqrt(x) fitted to
// some data points that seemed reasonable
noTicks = 0.3 * Math.sqrt(axis.direction == "x" ? surface.width : surface.height);
axis.delta = (axis.max - axis.min) / noTicks;
// Time mode was moved to a plug-in in 0.8, but since so many people use it
// we'll add an especially friendly error message to make sure they remembered to include it.
if (opts.mode == "time" && !axis.tickGenerator) {
throw new Error("Time mode requires the flot.time plugin.");
}
// Flot supports base-10 axes; any other mode is handled by a plug-in,
// like flot.time.js.
if (!axis.tickGenerator) {
axis.tickGenerator = function (axis) {
var maxDec = opts.tickDecimals,
dec = -Math.floor(Math.log(axis.delta) / Math.LN10);
if (maxDec != null && dec > maxDec)
dec = maxDec;
var magn = Math.pow(10, -dec),
norm = axis.delta / magn, // norm is between 1.0 and 10.0
size,
ticks = [],
start,
i = 0,
v = Number.NaN,
prev;
if (norm < 1.5)
size = 1;
else if (norm < 3) {
size = 2;
// special case for 2.5, requires an extra decimal
if (norm > 2.25 && (maxDec == null || dec + 1 <= maxDec)) {
size = 2.5;
++dec;
}
}
else if (norm < 7.5)
size = 5;
else size = 10;
size *= magn;
if (opts.minTickSize != null && size < opts.minTickSize)
size = opts.minTickSize;
axis.tickDecimals = Math.max(0, maxDec != null ? maxDec : dec);
axis.tickSize = opts.tickSize || size;
start = floorInBase(axis.min, axis.tickSize);
do {
prev = v;
v = start + i * axis.tickSize;
ticks.push(v);
++i;
} while (v < axis.max && v != prev);
return ticks;
};
axis.tickFormatter = function (value, axis) {
var factor = axis.tickDecimals ? Math.pow(10, axis.tickDecimals) : 1;
var formatted = "" + Math.round(value * factor) / factor;
// If tickDecimals was specified, ensure that we have exactly that
// much precision; otherwise default to the value's own precision.
if (axis.tickDecimals != null) {
var decimal = formatted.indexOf(".");
var precision = decimal == -1 ? 0 : formatted.length - decimal - 1;
if (precision < axis.tickDecimals) {
return (precision ? formatted : formatted + ".") + ("" + factor).substr(1, axis.tickDecimals - precision);
}
}
return formatted;
};
}
if ($.isFunction(opts.tickFormatter))
axis.tickFormatter = function (v, axis) { return "" + opts.tickFormatter(v, axis); };
if (opts.alignTicksWithAxis != null) {
var otherAxis = (axis.direction == "x" ? xaxes : yaxes)[opts.alignTicksWithAxis - 1];
if (otherAxis && otherAxis.used && otherAxis != axis) {
// consider snapping min/max to outermost nice ticks
var niceTicks = axis.tickGenerator(axis);
if (niceTicks.length > 0) {
if (opts.min == null)
axis.min = Math.min(axis.min, niceTicks[0]);
if (opts.max == null && niceTicks.length > 1)
axis.max = Math.max(axis.max, niceTicks[niceTicks.length - 1]);
}
axis.tickGenerator = function (axis) {
// copy ticks, scaled to this axis
var ticks = [], v, i;
for (i = 0; i < otherAxis.ticks.length; ++i) {
v = (otherAxis.ticks[i].v - otherAxis.min) / (otherAxis.max - otherAxis.min);
v = axis.min + v * (axis.max - axis.min);
ticks.push(v);
}
return ticks;
};
// we might need an extra decimal since forced
// ticks don't necessarily fit naturally
if (!axis.mode && opts.tickDecimals == null) {
var extraDec = Math.max(0, -Math.floor(Math.log(axis.delta) / Math.LN10) + 1),
ts = axis.tickGenerator(axis);
// only proceed if the tick interval rounded
// with an extra decimal doesn't give us a
// trailing zero
if (!(ts.length > 1 && /\..*0$/.test((ts[1] - ts[0]).toFixed(extraDec))))
axis.tickDecimals = extraDec;
}
}
}
}
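/*
The "nice number" rounding at the heart of the default tick generator above
can be isolated. The sketch below is illustrative only (`niceTickSize` is a
hypothetical name): it shows how a raw tick interval is snapped to 1, 2, 2.5
or 5 times a power of ten, ignoring the tickDecimals/minTickSize handling.

```javascript
// Snap a raw tick interval to a "nice" size: 1, 2, 2.5 or 5 times a power of ten.
function niceTickSize(delta) {
    var dec = -Math.floor(Math.log(delta) / Math.LN10),
        magn = Math.pow(10, -dec),
        norm = delta / magn, // norm is between 1.0 and 10.0
        size;
    if (norm < 1.5)
        size = 1;
    else if (norm < 3) {
        size = 2;
        // special case for 2.5, which needs an extra decimal
        if (norm > 2.25)
            size = 2.5;
    }
    else if (norm < 7.5)
        size = 5;
    else
        size = 10;
    return size * magn;
}

console.log(niceTickSize(2.3)); // 2.5
console.log(niceTickSize(0.7)); // 0.5
console.log(niceTickSize(42));  // 50
```
*/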
function setTicks(axis) {
var oticks = axis.options.ticks, ticks = [];
if (oticks == null || (typeof oticks == "number" && oticks > 0))
ticks = axis.tickGenerator(axis);
else if (oticks) {
if ($.isFunction(oticks))
// generate the ticks
ticks = oticks(axis);
else
ticks = oticks;
}
// clean up/labelify the supplied ticks, copy them over
var i, v;
axis.ticks = [];
for (i = 0; i < ticks.length; ++i) {
var label = null;
var t = ticks[i];
if (typeof t == "object") {
v = +t[0];
if (t.length > 1)
label = t[1];
}
else
v = +t;
if (label == null)
label = axis.tickFormatter(v, axis);
if (!isNaN(v))
axis.ticks.push({ v: v, label: label });
}
}
function snapRangeToTicks(axis, ticks) {
if (axis.options.autoscaleMargin && ticks.length > 0) {
// snap to ticks
if (axis.options.min == null)
axis.min = Math.min(axis.min, ticks[0].v);
if (axis.options.max == null && ticks.length > 1)
axis.max = Math.max(axis.max, ticks[ticks.length - 1].v);
}
}
function draw() {
surface.clear();
executeHooks(hooks.drawBackground, [ctx]);
var grid = options.grid;
// draw background, if any
if (grid.show && grid.backgroundColor)
drawBackground();
if (grid.show && !grid.aboveData) {
drawGrid();
}
for (var i = 0; i < series.length; ++i) {
executeHooks(hooks.drawSeries, [ctx, series[i]]);
drawSeries(series[i]);
}
executeHooks(hooks.draw, [ctx]);
if (grid.show && grid.aboveData) {
drawGrid();
}
surface.render();
}
function extractRange(ranges, coord) {
var axis, from, to, key, axes = allAxes();
for (var i = 0; i < axes.length; ++i) {
axis = axes[i];
if (axis.direction == coord) {
key = coord + axis.n + "axis";
if (!ranges[key] && axis.n == 1)
key = coord + "axis"; // support x1axis as xaxis
if (ranges[key]) {
from = ranges[key].from;
to = ranges[key].to;
break;
}
}
}
// backwards-compat stuff, to be removed in the future
if (!ranges[key]) {
axis = coord == "x" ? xaxes[0] : yaxes[0];
from = ranges[coord + "1"];
to = ranges[coord + "2"];
}
// auto-reverse as an added bonus
if (from != null && to != null && from > to) {
var tmp = from;
from = to;
to = tmp;
}
return { from: from, to: to, axis: axis };
}
function drawBackground() {
ctx.save();
ctx.translate(plotOffset.left, plotOffset.top);
ctx.fillStyle = getColorOrGradient(options.grid.backgroundColor, plotHeight, 0, "rgba(255, 255, 255, 0)");
ctx.fillRect(0, 0, plotWidth, plotHeight);
ctx.restore();
}
function drawGrid() {
var i, axes, bw, bc;
ctx.save();
ctx.translate(plotOffset.left, plotOffset.top);
// draw markings
var markings = options.grid.markings;
if (markings) {
if ($.isFunction(markings)) {
axes = plot.getAxes();
// xmin etc. is backwards compatibility, to be
// removed in the future
axes.xmin = axes.xaxis.min;
axes.xmax = axes.xaxis.max;
axes.ymin = axes.yaxis.min;
axes.ymax = axes.yaxis.max;
markings = markings(axes);
}
for (i = 0; i < markings.length; ++i) {
var m = markings[i],
xrange = extractRange(m, "x"),
yrange = extractRange(m, "y");
// fill in missing
if (xrange.from == null)
xrange.from = xrange.axis.min;
if (xrange.to == null)
xrange.to = xrange.axis.max;
if (yrange.from == null)
yrange.from = yrange.axis.min;
if (yrange.to == null)
yrange.to = yrange.axis.max;
// clip
if (xrange.to < xrange.axis.min || xrange.from > xrange.axis.max ||
yrange.to < yrange.axis.min || yrange.from > yrange.axis.max)
continue;
xrange.from = Math.max(xrange.from, xrange.axis.min);
xrange.to = Math.min(xrange.to, xrange.axis.max);
yrange.from = Math.max(yrange.from, yrange.axis.min);
yrange.to = Math.min(yrange.to, yrange.axis.max);
if (xrange.from == xrange.to && yrange.from == yrange.to)
continue;
// then draw
xrange.from = xrange.axis.p2c(xrange.from);
xrange.to = xrange.axis.p2c(xrange.to);
yrange.from = yrange.axis.p2c(yrange.from);
yrange.to = yrange.axis.p2c(yrange.to);
if (xrange.from == xrange.to || yrange.from == yrange.to) {
// draw line
ctx.beginPath();
ctx.strokeStyle = m.color || options.grid.markingsColor;
ctx.lineWidth = m.lineWidth || options.grid.markingsLineWidth;
ctx.moveTo(xrange.from, yrange.from);
ctx.lineTo(xrange.to, yrange.to);
ctx.stroke();
}
else {
// fill area
ctx.fillStyle = m.color || options.grid.markingsColor;
ctx.fillRect(xrange.from, yrange.to,
xrange.to - xrange.from,
yrange.from - yrange.to);
}
}
}
// draw the ticks
axes = allAxes();
bw = options.grid.borderWidth;
for (var j = 0; j < axes.length; ++j) {
var axis = axes[j], box = axis.box,
t = axis.tickLength, x, y, xoff, yoff;
if (!axis.show || axis.ticks.length == 0)
continue;
ctx.lineWidth = 1;
// find the edges
if (axis.direction == "x") {
x = 0;
if (t == "full")
y = (axis.position == "top" ? 0 : plotHeight);
else
y = box.top - plotOffset.top + (axis.position == "top" ? box.height : 0);
}
else {
y = 0;
if (t == "full")
x = (axis.position == "left" ? 0 : plotWidth);
else
x = box.left - plotOffset.left + (axis.position == "left" ? box.width : 0);
}
// draw tick bar
if (!axis.innermost) {
ctx.strokeStyle = axis.options.color;
ctx.beginPath();
xoff = yoff = 0;
if (axis.direction == "x")
xoff = plotWidth + 1;
else
yoff = plotHeight + 1;
if (ctx.lineWidth == 1) {
if (axis.direction == "x") {
y = Math.floor(y) + 0.5;
} else {
x = Math.floor(x) + 0.5;
}
}
ctx.moveTo(x, y);
ctx.lineTo(x + xoff, y + yoff);
ctx.stroke();
}
// draw ticks
ctx.strokeStyle = axis.options.tickColor;
ctx.beginPath();
for (i = 0; i < axis.ticks.length; ++i) {
var v = axis.ticks[i].v;
xoff = yoff = 0;
if (isNaN(v) || v < axis.min || v > axis.max
// skip those lying on the axes if we got a border
|| (t == "full"
&& ((typeof bw == "object" && bw[axis.position] > 0) || bw > 0)
&& (v == axis.min || v == axis.max)))
continue;
if (axis.direction == "x") {
x = axis.p2c(v);
yoff = t == "full" ? -plotHeight : t;
if (axis.position == "top")
yoff = -yoff;
}
else {
y = axis.p2c(v);
xoff = t == "full" ? -plotWidth : t;
if (axis.position == "left")
xoff = -xoff;
}
if (ctx.lineWidth == 1) {
if (axis.direction == "x")
x = Math.floor(x) + 0.5;
else
y = Math.floor(y) + 0.5;
}
ctx.moveTo(x, y);
ctx.lineTo(x + xoff, y + yoff);
}
ctx.stroke();
}
// draw border
if (bw) {
// If either borderWidth or borderColor is an object, then draw the border
// line by line instead of as one rectangle
bc = options.grid.borderColor;
if(typeof bw == "object" || typeof bc == "object") {
if (typeof bw !== "object") {
bw = {top: bw, right: bw, bottom: bw, left: bw};
}
if (typeof bc !== "object") {
bc = {top: bc, right: bc, bottom: bc, left: bc};
}
if (bw.top > 0) {
ctx.strokeStyle = bc.top;
ctx.lineWidth = bw.top;
ctx.beginPath();
ctx.moveTo(0 - bw.left, 0 - bw.top/2);
ctx.lineTo(plotWidth, 0 - bw.top/2);
ctx.stroke();
}
if (bw.right > 0) {
ctx.strokeStyle = bc.right;
ctx.lineWidth = bw.right;
ctx.beginPath();
ctx.moveTo(plotWidth + bw.right / 2, 0 - bw.top);
ctx.lineTo(plotWidth + bw.right / 2, plotHeight);
ctx.stroke();
}
if (bw.bottom > 0) {
ctx.strokeStyle = bc.bottom;
ctx.lineWidth = bw.bottom;
ctx.beginPath();
ctx.moveTo(plotWidth + bw.right, plotHeight + bw.bottom / 2);
ctx.lineTo(0, plotHeight + bw.bottom / 2);
ctx.stroke();
}
if (bw.left > 0) {
ctx.strokeStyle = bc.left;
ctx.lineWidth = bw.left;
ctx.beginPath();
ctx.moveTo(0 - bw.left/2, plotHeight + bw.bottom);
ctx.lineTo(0 - bw.left/2, 0);
ctx.stroke();
}
}
else {
ctx.lineWidth = bw;
ctx.strokeStyle = options.grid.borderColor;
ctx.strokeRect(-bw/2, -bw/2, plotWidth + bw, plotHeight + bw);
}
}
ctx.restore();
}
function drawAxisLabels() {
$.each(allAxes(), function (_, axis) {
if (!axis.show || axis.ticks.length == 0)
return;
var box = axis.box,
legacyStyles = axis.direction + "Axis " + axis.direction + axis.n + "Axis",
layer = "flot-" + axis.direction + "-axis flot-" + axis.direction + axis.n + "-axis " + legacyStyles,
font = axis.options.font || "flot-tick-label tickLabel",
tick, x, y, halign, valign;
surface.removeText(layer);
for (var i = 0; i < axis.ticks.length; ++i) {
tick = axis.ticks[i];
if (!tick.label || tick.v < axis.min || tick.v > axis.max)
continue;
if (axis.direction == "x") {
halign = "center";
x = plotOffset.left + axis.p2c(tick.v);
if (axis.position == "bottom") {
y = box.top + box.padding;
} else {
y = box.top + box.height - box.padding;
valign = "bottom";
}
} else {
valign = "middle";
y = plotOffset.top + axis.p2c(tick.v);
if (axis.position == "left") {
x = box.left + box.width - box.padding;
halign = "right";
} else {
x = box.left + box.padding;
}
}
surface.addText(layer, x, y, tick.label, font, null, halign, valign);
}
});
}
function drawSeries(series) {
if (series.lines.show)
drawSeriesLines(series);
if (series.bars.show)
drawSeriesBars(series);
if (series.points.show)
drawSeriesPoints(series);
}
function drawSeriesLines(series) {
function plotLine(datapoints, xoffset, yoffset, axisx, axisy) {
var points = datapoints.points,
ps = datapoints.pointsize,
prevx = null, prevy = null;
ctx.beginPath();
for (var i = ps; i < points.length; i += ps) {
var x1 = points[i - ps], y1 = points[i - ps + 1],
x2 = points[i], y2 = points[i + 1];
if (x1 == null || x2 == null)
continue;
// clip with ymin
if (y1 <= y2 && y1 < axisy.min) {
if (y2 < axisy.min)
continue; // line segment is outside
// compute new intersection point
x1 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
y1 = axisy.min;
}
else if (y2 <= y1 && y2 < axisy.min) {
if (y1 < axisy.min)
continue;
x2 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
y2 = axisy.min;
}
// clip with ymax
if (y1 >= y2 && y1 > axisy.max) {
if (y2 > axisy.max)
continue;
x1 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
y1 = axisy.max;
}
else if (y2 >= y1 && y2 > axisy.max) {
if (y1 > axisy.max)
continue;
x2 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
y2 = axisy.max;
}
// clip with xmin
if (x1 <= x2 && x1 < axisx.min) {
if (x2 < axisx.min)
continue;
y1 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
x1 = axisx.min;
}
else if (x2 <= x1 && x2 < axisx.min) {
if (x1 < axisx.min)
continue;
y2 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
x2 = axisx.min;
}
// clip with xmax
if (x1 >= x2 && x1 > axisx.max) {
if (x2 > axisx.max)
continue;
y1 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
x1 = axisx.max;
}
else if (x2 >= x1 && x2 > axisx.max) {
if (x1 > axisx.max)
continue;
y2 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
x2 = axisx.max;
}
if (x1 != prevx || y1 != prevy)
ctx.moveTo(axisx.p2c(x1) + xoffset, axisy.p2c(y1) + yoffset);
prevx = x2;
prevy = y2;
ctx.lineTo(axisx.p2c(x2) + xoffset, axisy.p2c(y2) + yoffset);
}
ctx.stroke();
}
function plotLineArea(datapoints, axisx, axisy) {
var points = datapoints.points,
ps = datapoints.pointsize,
bottom = Math.min(Math.max(0, axisy.min), axisy.max),
i = 0, top, areaOpen = false,
ypos = 1, segmentStart = 0, segmentEnd = 0;
// we process each segment in two turns, first forward
// direction to sketch out top, then once we hit the
// end we go backwards to sketch the bottom
while (true) {
if (ps > 0 && i > points.length + ps)
break;
i += ps; // ps is negative if going backwards
var x1 = points[i - ps],
y1 = points[i - ps + ypos],
x2 = points[i], y2 = points[i + ypos];
if (areaOpen) {
if (ps > 0 && x1 != null && x2 == null) {
// at turning point
segmentEnd = i;
ps = -ps;
ypos = 2;
continue;
}
if (ps < 0 && i == segmentStart + ps) {
// done with the reverse sweep
ctx.fill();
areaOpen = false;
ps = -ps;
ypos = 1;
i = segmentStart = segmentEnd + ps;
continue;
}
}
if (x1 == null || x2 == null)
continue;
// clip x values
// clip with xmin
if (x1 <= x2 && x1 < axisx.min) {
if (x2 < axisx.min)
continue;
y1 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
x1 = axisx.min;
}
else if (x2 <= x1 && x2 < axisx.min) {
if (x1 < axisx.min)
continue;
y2 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
x2 = axisx.min;
}
// clip with xmax
if (x1 >= x2 && x1 > axisx.max) {
if (x2 > axisx.max)
continue;
y1 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
x1 = axisx.max;
}
else if (x2 >= x1 && x2 > axisx.max) {
if (x1 > axisx.max)
continue;
y2 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
x2 = axisx.max;
}
if (!areaOpen) {
// open area
ctx.beginPath();
ctx.moveTo(axisx.p2c(x1), axisy.p2c(bottom));
areaOpen = true;
}
// first check the case where both endpoints are outside
if (y1 >= axisy.max && y2 >= axisy.max) {
ctx.lineTo(axisx.p2c(x1), axisy.p2c(axisy.max));
ctx.lineTo(axisx.p2c(x2), axisy.p2c(axisy.max));
continue;
}
else if (y1 <= axisy.min && y2 <= axisy.min) {
ctx.lineTo(axisx.p2c(x1), axisy.p2c(axisy.min));
ctx.lineTo(axisx.p2c(x2), axisy.p2c(axisy.min));
continue;
}
// else it's a bit more complicated, there might
// be a flat maxed out rectangle first, then a
// triangular cutout or reverse; to find these
// keep track of the current x values
var x1old = x1, x2old = x2;
// clip the y values, without shortcutting, we
// go through all cases in turn
// clip with ymin
if (y1 <= y2 && y1 < axisy.min && y2 >= axisy.min) {
x1 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
y1 = axisy.min;
}
else if (y2 <= y1 && y2 < axisy.min && y1 >= axisy.min) {
x2 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
y2 = axisy.min;
}
// clip with ymax
if (y1 >= y2 && y1 > axisy.max && y2 <= axisy.max) {
x1 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
y1 = axisy.max;
}
else if (y2 >= y1 && y2 > axisy.max && y1 <= axisy.max) {
x2 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
y2 = axisy.max;
}
// if the x value was changed we got a rectangle
// to fill
if (x1 != x1old) {
ctx.lineTo(axisx.p2c(x1old), axisy.p2c(y1));
// it goes to (x1, y1), but we fill that below
}
// fill triangular section; this sometimes results
// in redundant points if (x1, y1) hasn't changed
// from the previous lineTo, but we just ignore that
ctx.lineTo(axisx.p2c(x1), axisy.p2c(y1));
ctx.lineTo(axisx.p2c(x2), axisy.p2c(y2));
// fill the other rectangle if it's there
if (x2 != x2old) {
ctx.lineTo(axisx.p2c(x2), axisy.p2c(y2));
ctx.lineTo(axisx.p2c(x2old), axisy.p2c(y2));
}
}
}
ctx.save();
ctx.translate(plotOffset.left, plotOffset.top);
ctx.lineJoin = "round";
var lw = series.lines.lineWidth,
sw = series.shadowSize;
// FIXME: consider another form of shadow when filling is turned on
if (lw > 0 && sw > 0) {
// draw shadow as a thick and thin line with transparency
ctx.lineWidth = sw;
ctx.strokeStyle = "rgba(0,0,0,0.1)";
// position shadow at angle from the mid of line
var angle = Math.PI/18;
plotLine(series.datapoints, Math.sin(angle) * (lw/2 + sw/2), Math.cos(angle) * (lw/2 + sw/2), series.xaxis, series.yaxis);
ctx.lineWidth = sw/2;
plotLine(series.datapoints, Math.sin(angle) * (lw/2 + sw/4), Math.cos(angle) * (lw/2 + sw/4), series.xaxis, series.yaxis);
}
ctx.lineWidth = lw;
ctx.strokeStyle = series.color;
var fillStyle = getFillStyle(series.lines, series.color, 0, plotHeight);
if (fillStyle) {
ctx.fillStyle = fillStyle;
plotLineArea(series.datapoints, series.xaxis, series.yaxis);
}
if (lw > 0)
plotLine(series.datapoints, 0, 0, series.xaxis, series.yaxis);
ctx.restore();
}
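/*
Each edge clip in plotLine above follows the same pattern; the sketch below is
illustrative only (`clipYMin` is a hypothetical name) and pulls out the y-min
case: move the endpoint lying below the bound up to the intersection point, or
drop the segment entirely if both endpoints are below it.

```javascript
// Clip a segment (x1,y1)-(x2,y2) against a lower y bound, as in plotLine.
function clipYMin(x1, y1, x2, y2, min) {
    if (y1 <= y2 && y1 < min) {
        if (y2 < min)
            return null; // segment is entirely outside
        // compute new intersection point
        x1 = (min - y1) / (y2 - y1) * (x2 - x1) + x1;
        y1 = min;
    }
    else if (y2 <= y1 && y2 < min) {
        if (y1 < min)
            return null;
        x2 = (min - y1) / (y2 - y1) * (x2 - x1) + x1;
        y2 = min;
    }
    return [x1, y1, x2, y2];
}

console.log(clipYMin(0, -1, 2, 3, 0)); // [ 0.5, 0, 2, 3 ]
console.log(clipYMin(0, -3, 2, -1, 0)); // null
```
*/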
function drawSeriesPoints(series) {
function plotPoints(datapoints, radius, fillStyle, offset, shadow, axisx, axisy, symbol) {
var points = datapoints.points, ps = datapoints.pointsize;
for (var i = 0; i < points.length; i += ps) {
var x = points[i], y = points[i + 1];
if (x == null || x < axisx.min || x > axisx.max || y < axisy.min || y > axisy.max)
continue;
ctx.beginPath();
x = axisx.p2c(x);
y = axisy.p2c(y) + offset;
if (symbol == "circle")
ctx.arc(x, y, radius, 0, shadow ? Math.PI : Math.PI * 2, false);
else
symbol(ctx, x, y, radius, shadow);
ctx.closePath();
if (fillStyle) {
ctx.fillStyle = fillStyle;
ctx.fill();
}
ctx.stroke();
}
}
ctx.save();
ctx.translate(plotOffset.left, plotOffset.top);
var lw = series.points.lineWidth,
sw = series.shadowSize,
radius = series.points.radius,
symbol = series.points.symbol;
// If the user sets the line width to 0, we change it to a very
// small value. A line width of 0 seems to force the default of 1.
// Doing the conditional here allows the shadow setting to still be
// optional even with a lineWidth of 0.
if (lw == 0)
lw = 0.0001;
if (lw > 0 && sw > 0) {
// draw shadow in two steps
var w = sw / 2;
ctx.lineWidth = w;
ctx.strokeStyle = "rgba(0,0,0,0.1)";
plotPoints(series.datapoints, radius, null, w + w/2, true,
series.xaxis, series.yaxis, symbol);
ctx.strokeStyle = "rgba(0,0,0,0.2)";
plotPoints(series.datapoints, radius, null, w/2, true,
series.xaxis, series.yaxis, symbol);
}
ctx.lineWidth = lw;
ctx.strokeStyle = series.color;
plotPoints(series.datapoints, radius,
getFillStyle(series.points, series.color), 0, false,
series.xaxis, series.yaxis, symbol);
ctx.restore();
}
function drawBar(x, y, b, barLeft, barRight, offset, fillStyleCallback, axisx, axisy, c, horizontal, lineWidth) {
var left, right, bottom, top,
drawLeft, drawRight, drawTop, drawBottom,
tmp;
// in horizontal mode, we start the bar from the left
// instead of from the bottom so it appears to be
// horizontal rather than vertical
if (horizontal) {
drawBottom = drawRight = drawTop = true;
drawLeft = false;
left = b;
right = x;
top = y + barLeft;
bottom = y + barRight;
// account for negative bars
if (right < left) {
tmp = right;
right = left;
left = tmp;
drawLeft = true;
drawRight = false;
}
}
else {
drawLeft = drawRight = drawTop = true;
drawBottom = false;
left = x + barLeft;
right = x + barRight;
bottom = b;
top = y;
// account for negative bars
if (top < bottom) {
tmp = top;
top = bottom;
bottom = tmp;
drawBottom = true;
drawTop = false;
}
}
// clip
if (right < axisx.min || left > axisx.max ||
top < axisy.min || bottom > axisy.max)
return;
if (left < axisx.min) {
left = axisx.min;
drawLeft = false;
}
if (right > axisx.max) {
right = axisx.max;
drawRight = false;
}
if (bottom < axisy.min) {
bottom = axisy.min;
drawBottom = false;
}
if (top > axisy.max) {
top = axisy.max;
drawTop = false;
}
left = axisx.p2c(left);
bottom = axisy.p2c(bottom);
right = axisx.p2c(right);
top = axisy.p2c(top);
// fill the bar
if (fillStyleCallback) {
c.beginPath();
c.moveTo(left, bottom);
c.lineTo(left, top);
c.lineTo(right, top);
c.lineTo(right, bottom);
c.fillStyle = fillStyleCallback(bottom, top);
c.fill();
}
// draw outline
if (lineWidth > 0 && (drawLeft || drawRight || drawTop || drawBottom)) {
c.beginPath();
// FIXME: inline moveTo is buggy with excanvas
c.moveTo(left, bottom + offset);
if (drawLeft)
c.lineTo(left, top + offset);
else
c.moveTo(left, top + offset);
if (drawTop)
c.lineTo(right, top + offset);
else
c.moveTo(right, top + offset);
if (drawRight)
c.lineTo(right, bottom + offset);
else
c.moveTo(right, bottom + offset);
if (drawBottom)
c.lineTo(left, bottom + offset);
else
c.moveTo(left, bottom + offset);
c.stroke();
}
}
function drawSeriesBars(series) {
function plotBars(datapoints, barLeft, barRight, offset, fillStyleCallback, axisx, axisy) {
var points = datapoints.points, ps = datapoints.pointsize;
for (var i = 0; i < points.length; i += ps) {
if (points[i] == null)
continue;
drawBar(points[i], points[i + 1], points[i + 2], barLeft, barRight, offset, fillStyleCallback, axisx, axisy, ctx, series.bars.horizontal, series.bars.lineWidth);
}
}
ctx.save();
ctx.translate(plotOffset.left, plotOffset.top);
// FIXME: figure out a way to add shadows (for instance along the right edge)
ctx.lineWidth = series.bars.lineWidth;
ctx.strokeStyle = series.color;
var barLeft;
switch (series.bars.align) {
case "left":
barLeft = 0;
break;
case "right":
barLeft = -series.bars.barWidth;
break;
case "center":
barLeft = -series.bars.barWidth / 2;
break;
default:
throw new Error("Invalid bar alignment: " + series.bars.align);
}
var fillStyleCallback = series.bars.fill ? function (bottom, top) { return getFillStyle(series.bars, series.color, bottom, top); } : null;
plotBars(series.datapoints, barLeft, barLeft + series.bars.barWidth, 0, fillStyleCallback, series.xaxis, series.yaxis);
ctx.restore();
}
function getFillStyle(filloptions, seriesColor, bottom, top) {
var fill = filloptions.fill;
if (!fill)
return null;
if (filloptions.fillColor)
return getColorOrGradient(filloptions.fillColor, bottom, top, seriesColor);
var c = $.color.parse(seriesColor);
c.a = typeof fill == "number" ? fill : 0.4;
c.normalize();
return c.toString();
}
function insertLegend() {
placeholder.find(".legend").remove();
if (!options.legend.show)
return;
var fragments = [], entries = [], rowStarted = false,
lf = options.legend.labelFormatter, s, label;
// Build a list of legend entries, with each having a label and a color
for (var i = 0; i < series.length; ++i) {
s = series[i];
if (s.label) {
label = lf ? lf(s.label, s) : s.label;
if (label) {
entries.push({
label: label,
color: s.color
});
}
}
}
// Sort the legend using either the default or a custom comparator
if (options.legend.sorted) {
if ($.isFunction(options.legend.sorted)) {
entries.sort(options.legend.sorted);
} else if (options.legend.sorted == "reverse") {
entries.reverse();
} else {
var ascending = options.legend.sorted != "descending";
entries.sort(function(a, b) {
return a.label == b.label ? 0 : (
(a.label < b.label) != ascending ? 1 : -1 // Logical XOR
);
});
}
}
// Generate markup for the list of entries, in their final order
for (var i = 0; i < entries.length; ++i) {
var entry = entries[i];
if (i % options.legend.noColumns == 0) {
if (rowStarted)
fragments.push('</tr>');
fragments.push('<tr>');
rowStarted = true;
}
fragments.push(
'<td class="legendColorBox"><div style="border:1px solid ' + options.legend.labelBoxBorderColor + ';padding:1px"><div style="width:4px;height:0;border:5px solid ' + entry.color + ';overflow:hidden"></div></div></td>' +
'<td class="legendLabel">' + entry.label + '</td>'
);
}
if (rowStarted)
fragments.push('</tr>');
if (fragments.length == 0)
return;
var table = '<table style="font-size:smaller;color:' + options.grid.color + '">' + fragments.join("") + '</table>';
if (options.legend.container != null)
$(options.legend.container).html(table);
else {
var pos = "",
p = options.legend.position,
m = options.legend.margin;
if (m[0] == null)
m = [m, m];
if (p.charAt(0) == "n")
pos += 'top:' + (m[1] + plotOffset.top) + 'px;';
else if (p.charAt(0) == "s")
pos += 'bottom:' + (m[1] + plotOffset.bottom) + 'px;';
if (p.charAt(1) == "e")
pos += 'right:' + (m[0] + plotOffset.right) + 'px;';
else if (p.charAt(1) == "w")
pos += 'left:' + (m[0] + plotOffset.left) + 'px;';
var legend = $('<div class="legend">' + table.replace('style="', 'style="position:absolute;' + pos +';') + '</div>').appendTo(placeholder);
if (options.legend.backgroundOpacity != 0.0) {
// put in the transparent background
// separately to avoid blended labels and
// label boxes
var c = options.legend.backgroundColor;
if (c == null) {
c = options.grid.backgroundColor;
if (c && typeof c == "string")
c = $.color.parse(c);
else
c = $.color.extract(legend, 'background-color');
c.a = 1;
c = c.toString();
}
var div = legend.children();
$('<div style="position:absolute;width:' + div.width() + 'px;height:' + div.height() + 'px;' + pos +'background-color:' + c + ';"> </div>').prependTo(legend).css('opacity', options.legend.backgroundOpacity);
}
}
}
// interactive features
var highlights = [],
redrawTimeout = null;
// returns the data item the mouse is over, or null if none is found
function findNearbyItem(mouseX, mouseY, seriesFilter) {
var maxDistance = options.grid.mouseActiveRadius,
smallestDistance = maxDistance * maxDistance + 1,
item = null, foundPoint = false, i, j, ps;
for (i = series.length - 1; i >= 0; --i) {
if (!seriesFilter(series[i]))
continue;
var s = series[i],
axisx = s.xaxis,
axisy = s.yaxis,
points = s.datapoints.points,
mx = axisx.c2p(mouseX), // precompute some stuff to make the loop faster
my = axisy.c2p(mouseY),
maxx = maxDistance / axisx.scale,
maxy = maxDistance / axisy.scale;
ps = s.datapoints.pointsize;
// with inverse transforms, we can't use the maxx/maxy
// optimization, sadly
if (axisx.options.inverseTransform)
maxx = Number.MAX_VALUE;
if (axisy.options.inverseTransform)
maxy = Number.MAX_VALUE;
if (s.lines.show || s.points.show) {
for (j = 0; j < points.length; j += ps) {
var x = points[j], y = points[j + 1];
if (x == null)
continue;
// For points and lines, the cursor must be within a
// certain distance to the data point
if (x - mx > maxx || x - mx < -maxx ||
y - my > maxy || y - my < -maxy)
continue;
// We have to calculate distances in pixels, not in
// data units, because the scales of the axes may be different
var dx = Math.abs(axisx.p2c(x) - mouseX),
dy = Math.abs(axisy.p2c(y) - mouseY),
dist = dx * dx + dy * dy; // we save the sqrt
// strict < means the first match wins a tie; since we
// scan the series in reverse order, the point from the
// series drawn on top takes precedence
if (dist < smallestDistance) {
smallestDistance = dist;
item = [i, j / ps];
}
}
}
if (s.bars.show && !item) { // no other point can be nearby
var barLeft = s.bars.align == "left" ? 0 : -s.bars.barWidth/2,
barRight = barLeft + s.bars.barWidth;
for (j = 0; j < points.length; j += ps) {
var x = points[j], y = points[j + 1], b = points[j + 2];
if (x == null)
continue;
// for a bar graph, the cursor must be inside the bar
if (series[i].bars.horizontal ?
(mx <= Math.max(b, x) && mx >= Math.min(b, x) &&
my >= y + barLeft && my <= y + barRight) :
(mx >= x + barLeft && mx <= x + barRight &&
my >= Math.min(b, y) && my <= Math.max(b, y)))
item = [i, j / ps];
}
}
}
if (item) {
i = item[0];
j = item[1];
ps = series[i].datapoints.pointsize;
return { datapoint: series[i].datapoints.points.slice(j * ps, (j + 1) * ps),
dataIndex: j,
series: series[i],
seriesIndex: i };
}
return null;
}
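/*
The hit-testing above compares squared distances so that no Math.sqrt is
needed per point. The sketch below is illustrative only (`nearestPoint` is a
hypothetical name): it shows the core of that loop on a plain array of [x, y]
pairs, returning the index of the nearest point within maxDistance, or null.

```javascript
// Find the nearest point within maxDistance, comparing squared distances.
function nearestPoint(points, mouseX, mouseY, maxDistance) {
    var smallest = maxDistance * maxDistance + 1, best = null;
    for (var i = 0; i < points.length; ++i) {
        var dx = points[i][0] - mouseX,
            dy = points[i][1] - mouseY,
            dist = dx * dx + dy * dy; // we save the sqrt
        if (dist <= smallest) { // <= lets later points win ties
            smallest = dist;
            best = i;
        }
    }
    return best;
}

console.log(nearestPoint([[0, 0], [10, 10], [11, 9]], 10, 9, 5)); // 2
```
*/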
function onMouseMove(e) {
if (options.grid.hoverable)
triggerClickHoverEvent("plothover", e,
function (s) { return s["hoverable"] != false; });
}
function onMouseLeave(e) {
if (options.grid.hoverable)
triggerClickHoverEvent("plothover", e,
function (s) { return false; });
}
function onClick(e) {
triggerClickHoverEvent("plotclick", e,
function (s) { return s["clickable"] != false; });
}
// trigger click or hover event (they send the same parameters
// so we share their code)
function triggerClickHoverEvent(eventname, event, seriesFilter) {
var offset = eventHolder.offset(),
canvasX = event.pageX - offset.left - plotOffset.left,
canvasY = event.pageY - offset.top - plotOffset.top,
pos = canvasToAxisCoords({ left: canvasX, top: canvasY });
pos.pageX = event.pageX;
pos.pageY = event.pageY;
var item = findNearbyItem(canvasX, canvasY, seriesFilter);
if (item) {
// fill in mouse pos for any listeners out there
item.pageX = parseInt(item.series.xaxis.p2c(item.datapoint[0]) + offset.left + plotOffset.left, 10);
item.pageY = parseInt(item.series.yaxis.p2c(item.datapoint[1]) + offset.top + plotOffset.top, 10);
}
if (options.grid.autoHighlight) {
// clear auto-highlights
for (var i = 0; i < highlights.length; ++i) {
var h = highlights[i];
if (h.auto == eventname &&
!(item && h.series == item.series &&
h.point[0] == item.datapoint[0] &&
h.point[1] == item.datapoint[1]))
unhighlight(h.series, h.point);
}
if (item)
highlight(item.series, item.datapoint, eventname);
}
placeholder.trigger(eventname, [ pos, item ]);
}
function triggerRedrawOverlay() {
var t = options.interaction.redrawOverlayInterval;
if (t == -1) { // skip event queue
drawOverlay();
return;
}
if (!redrawTimeout)
redrawTimeout = setTimeout(drawOverlay, t);
}
function drawOverlay() {
redrawTimeout = null;
// draw highlights
octx.save();
overlay.clear();
octx.translate(plotOffset.left, plotOffset.top);
var i, hi;
for (i = 0; i < highlights.length; ++i) {
hi = highlights[i];
if (hi.series.bars.show)
drawBarHighlight(hi.series, hi.point);
else
drawPointHighlight(hi.series, hi.point);
}
octx.restore();
executeHooks(hooks.drawOverlay, [octx]);
}
function highlight(s, point, auto) {
if (typeof s == "number")
s = series[s];
if (typeof point == "number") {
var ps = s.datapoints.pointsize;
point = s.datapoints.points.slice(ps * point, ps * (point + 1));
}
var i = indexOfHighlight(s, point);
if (i == -1) {
highlights.push({ series: s, point: point, auto: auto });
triggerRedrawOverlay();
}
else if (!auto)
highlights[i].auto = false;
}
function unhighlight(s, point) {
if (s == null && point == null) {
highlights = [];
triggerRedrawOverlay();
return;
}
if (typeof s == "number")
s = series[s];
if (typeof point == "number") {
var ps = s.datapoints.pointsize;
point = s.datapoints.points.slice(ps * point, ps * (point + 1));
}
var i = indexOfHighlight(s, point);
if (i != -1) {
highlights.splice(i, 1);
triggerRedrawOverlay();
}
}
function indexOfHighlight(s, p) {
for (var i = 0; i < highlights.length; ++i) {
var h = highlights[i];
if (h.series == s && h.point[0] == p[0]
&& h.point[1] == p[1])
return i;
}
return -1;
}
function drawPointHighlight(series, point) {
var x = point[0], y = point[1],
axisx = series.xaxis, axisy = series.yaxis,
highlightColor = (typeof series.highlightColor === "string") ? series.highlightColor : $.color.parse(series.color).scale('a', 0.5).toString();
if (x < axisx.min || x > axisx.max || y < axisy.min || y > axisy.max)
return;
var pointRadius = series.points.radius + series.points.lineWidth / 2;
octx.lineWidth = pointRadius;
octx.strokeStyle = highlightColor;
var radius = 1.5 * pointRadius;
x = axisx.p2c(x);
y = axisy.p2c(y);
octx.beginPath();
if (series.points.symbol == "circle")
octx.arc(x, y, radius, 0, 2 * Math.PI, false);
else
series.points.symbol(octx, x, y, radius, false);
octx.closePath();
octx.stroke();
}
function drawBarHighlight(series, point) {
var highlightColor = (typeof series.highlightColor === "string") ? series.highlightColor : $.color.parse(series.color).scale('a', 0.5).toString(),
fillStyle = highlightColor,
barLeft = series.bars.align == "left" ? 0 : -series.bars.barWidth/2;
octx.lineWidth = series.bars.lineWidth;
octx.strokeStyle = highlightColor;
drawBar(point[0], point[1], point[2] || 0, barLeft, barLeft + series.bars.barWidth,
0, function () { return fillStyle; }, series.xaxis, series.yaxis, octx, series.bars.horizontal, series.bars.lineWidth);
}
function getColorOrGradient(spec, bottom, top, defaultColor) {
if (typeof spec == "string")
return spec;
else {
// assume this is a gradient spec; IE currently only
// supports a simple vertical gradient properly, so that's
// what we support too
var gradient = ctx.createLinearGradient(0, top, 0, bottom);
for (var i = 0, l = spec.colors.length; i < l; ++i) {
var c = spec.colors[i];
if (typeof c != "string") {
var co = $.color.parse(defaultColor);
if (c.brightness != null)
co = co.scale('rgb', c.brightness);
if (c.opacity != null)
co.a *= c.opacity;
c = co.toString();
}
gradient.addColorStop(i / (l - 1), c);
}
return gradient;
}
}
}
// Add the plot function to the top level of the jQuery object
$.plot = function(placeholder, data, options) {
//var t0 = new Date();
var plot = new Plot($(placeholder), data, options, $.plot.plugins);
//(window.console ? console.log : alert)("time used (msecs): " + ((new Date()).getTime() - t0.getTime()));
return plot;
};
$.plot.version = "0.8.0-beta";
$.plot.plugins = [];
// Also add the plot function as a chainable property
$.fn.plot = function(data, options) {
return this.each(function() {
$.plot(this, data, options);
});
    };
// round to nearby lower multiple of base
function floorInBase(n, base) {
return base * Math.floor(n / base);
}
})(jQuery);
// File: apachedex/jquery.flot.js (from APacheDEX 1.8)
import numpy as np
from copy import copy
def create_beam_slices(z, n_slices=10, len_slice=None):
    """Calculates the slice limits along z of a particle distribution for a
given number of slices or slice length.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
    n_slices : int
        Number of longitudinal slices in which to divide the particle
        distribution. Not used if len_slice is specified.
    len_slice : float
        Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
    A tuple containing an array with the slice limits and an integer with the
    number of slices, which might have been redefined.
"""
max_z = np.max(z)
min_z = np.min(z)
if len_slice is not None:
n_slices = int(np.round((max_z-min_z)/len_slice))
slice_lims = np.linspace(min_z, max_z, n_slices+1)
return slice_lims, n_slices
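A minimal usage sketch of the slicing logic above (the function is re-stated here so the snippet runs on its own; the 10 um bunch length is an illustrative assumption):

```python
import numpy as np

# Re-statement of create_beam_slices (above) so this sketch is self-contained.
def create_beam_slices(z, n_slices=10, len_slice=None):
    max_z = np.max(z)
    min_z = np.min(z)
    if len_slice is not None:
        n_slices = int(np.round((max_z - min_z) / len_slice))
    slice_lims = np.linspace(min_z, max_z, n_slices + 1)
    return slice_lims, n_slices

# A 10 um long longitudinal distribution cut into 1 um slices.
z = np.linspace(0., 10e-6, 1000)
slice_lims, n_slices = create_beam_slices(z, len_slice=1e-6)
```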
def weighted_std(values, weights=None):
    """Calculates the weighted standard deviation of the given values
    Parameters
    ----------
    values: array
        Contains the values to be analyzed
    weights : array
        Contains the weights of the values to analyze. If None, all values
        are weighted equally.
    Returns
    -------
    A float with the value of the standard deviation
    """
    # A scalar default such as weights=1 would make np.average fail, so a
    # uniform weight array is built instead when no weights are given.
    if weights is None:
        weights = np.ones_like(values)
    mean_val = np.average(values, weights=np.abs(weights))
    std = np.sqrt(np.average((values-mean_val)**2, weights=np.abs(weights)))
    return std
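As a standalone sanity check (using a simplified re-statement with required weights), uniform weights should reduce the result to the population standard deviation:

```python
import numpy as np

# Simplified re-statement of weighted_std (above), weights required.
def weighted_std(values, weights):
    mean_val = np.average(values, weights=np.abs(weights))
    return np.sqrt(np.average((values - mean_val) ** 2,
                              weights=np.abs(weights)))

values = np.array([1., 2., 3., 4.])
# With uniform weights this matches the (population) standard deviation.
sigma = weighted_std(values, np.ones_like(values))
```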
def slope_of_correlation(y, x, w=None):
"""Calculates the slope of the correlation between two variables x and y
according to y = slope*x
Parameters
----------
    x : array
        Contains the x values
    y : array
Contains the y values
w : array
Contains the weights of the values
Returns
-------
A float with the value of the slope
"""
a = np.average(y*x, weights=w)
b = np.average(y, weights=w)
c = np.average(x, weights=w)
d = np.average(x**2, weights=w)
return (a-b*c) / (d-c**2)
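A quick standalone check of the slope formula above (function re-stated so the snippet is self-contained): for perfectly correlated data y = 3*x, the recovered slope should be exactly 3.

```python
import numpy as np

# Re-statement of slope_of_correlation (above), self-contained.
def slope_of_correlation(y, x, w=None):
    a = np.average(y * x, weights=w)
    b = np.average(y, weights=w)
    c = np.average(x, weights=w)
    d = np.average(x ** 2, weights=w)
    return (a - b * c) / (d - c ** 2)

x = np.array([0., 1., 2., 3.])
y = 3. * x  # perfectly correlated data
slope = slope_of_correlation(y, x)
```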
def remove_correlation(x, y, w=None, order=1):
"""Removes the correlation between two variables x and y, where y=y(x), up
to the desired order.
Parameters
----------
x: array
Contains the x values
y: array
Contains the y values
w : array
Contains the weights of the values
order : int
        Determines the order of the polynomial fit and, thus, the highest
        correlation order to remove.
Returns
-------
An array containing the new values of y
"""
fit_coefs = np.polyfit(x, y, order, w=w)
for i, coef in enumerate(reversed(fit_coefs[:-1])):
y = y - coef * x**(i+1)
return y
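Note that only the fitted coefficients of order 1 and above are subtracted, so a constant offset survives. A self-contained sketch (function re-stated) illustrating this on noise-free linear data:

```python
import numpy as np

# Re-statement of remove_correlation (above), self-contained.
def remove_correlation(x, y, w=None, order=1):
    fit_coefs = np.polyfit(x, y, order, w=w)
    for i, coef in enumerate(reversed(fit_coefs[:-1])):
        y = y - coef * x ** (i + 1)
    return y

x = np.linspace(0., 1., 50)
y = 2. * x + 0.5  # linear correlation on top of a constant offset
y_new = remove_correlation(x, y, order=1)
# The slope is removed; the constant term (intercept) is kept.
```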
def reposition_bunch(beam_data, avg_pos):
"""Reposition bunch with the specified averages"""
q = beam_data[6]
for i, new_avg in enumerate(avg_pos):
if new_avg is not None:
current_avg = np.average(beam_data[i], weights=q)
beam_data[i] += new_avg - current_avg
def get_particle_subset(beam_data, subset_size, preserve_charge=True):
"""
Get a random subsample of the particles in a distribution.
    Parameters
    ----------
beam_data : list
Contains the beam data as [x, y, z, px, py, pz, q].
subset_size : int
Number of particles which the subset should have.
preserve_charge : bool
Whether the total charge of the distribution should be preserved.
If True, the charge of the output particles will be increased so that
the total charge remains the same.
"""
# Make sure subset size is an int.
subset_size = int(subset_size)
x = beam_data[0]
y = beam_data[1]
z = beam_data[2]
px = beam_data[3]
py = beam_data[4]
pz = beam_data[5]
q = beam_data[6]
    if subset_size < len(q):
        i = np.arange(len(q))
        # Sample without replacement so that particles are not repeated.
        i = np.random.choice(i, size=subset_size, replace=False)
x = x[i]
y = y[i]
z = z[i]
px = px[i]
py = py[i]
pz = pz[i]
if preserve_charge:
q_tot = np.sum(q)
q_part = q_tot/subset_size
q = np.ones(subset_size)*q_part
else:
q = q[i]
else:
print('Subset size is larger than original number of particles. '
'No operation performed.')
return [x, y, z, px, py, pz, q]
def join_infile_path(*paths):
"""
Join path components using '/' as separator.
This method is defined as an alternative to os.path.join, which uses '\\'
as separator in Windows environments and is therefore not valid to navigate
within data files.
Parameters
----------
*paths: all strings with path components to join
Returns
-------
A string with the complete path using '/' as separator.
"""
# Join path components
path = '/'.join(paths)
    # Correct double slashes, if any are present
    path = path.replace('//', '/')
return path
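A standalone example of the joining behavior above (function re-stated): trailing slashes in individual components are collapsed, so mixed styles still produce a clean in-file path.

```python
# Re-statement of join_infile_path (above), self-contained.
def join_infile_path(*paths):
    path = '/'.join(paths)
    path = path.replace('//', '/')
    return path

# Components with and without trailing slashes are both tolerated.
dataset_path = join_infile_path('data', 'particles/', 'electrons', 'x')
```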
def calculate_slice_average(slice_vals, slice_weights):
"""
Calculates the weighted average of the computed sliced values and their
corresponding weights, not taking into account any possible NaN values in
the 'slice_vals' array.
Parameters
----------
slice_vals: array
Array containing the slice values.
slice_weights: array
        Array containing the statistical weights of each slice.
Returns
-------
The weighted average of 'slice_vals'.
"""
slice_vals, slice_weights = filter_nans(slice_vals, slice_weights)
return np.average(slice_vals, weights=slice_weights)
def filter_nans(data, data_weights):
"""
    Removes NaN values from a data array and the corresponding values in the
    weights array.
Parameters
----------
data: array
Array containing the data to filter.
data_weights: array
Array with the same size as data containing the weights.
Returns
-------
Filtered data and data_weights arrays.
"""
filter_idx = np.isfinite(data)
data_weights_f = data_weights[filter_idx]
data_f = data[filter_idx]
return data_f, data_weights_f
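A self-contained check of the NaN filtering above (function re-stated): entries with NaN data are dropped together with their weights.

```python
import numpy as np

# Re-statement of filter_nans (above), self-contained.
def filter_nans(data, data_weights):
    filter_idx = np.isfinite(data)
    return data[filter_idx], data_weights[filter_idx]

data = np.array([1., np.nan, 3., np.nan])
weights = np.array([0.1, 0.2, 0.3, 0.4])
data_f, weights_f = filter_nans(data, weights)
```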
def determine_statistically_relevant_slices(slice_weights, min_fraction=1e-4):
"""
Determines which beam slices have a statistically relevant weight
Parameters
----------
slice_weights: array
Array containing the statistical weight of each slice.
min_fraction: float
Minimum fraction of the total weight of the particle distribution that
a slice needs to have to be considered statistically relevant.
Returns
-------
A boolean array of the same dimensions as slice_weights.
"""
total_weight = np.sum(slice_weights)
slice_weights_rel = slice_weights / total_weight
return slice_weights_rel > min_fraction
def get_only_statistically_relevant_slices(slice_param, slice_weights,
min_fraction=1e-4,
replace_with_nans=False):
"""
Get the slice parameters only of slices which have a statistically
relevant weight.
Parameters
----------
    slice_param: array
        Array containing the values of the slice parameter.
slice_weights: array
Array containing the statistical weight of each slice.
min_fraction: float
Minimum fraction of the total weight of the particle distribution that
a slice needs to have to be considered statistically relevant.
replace_with_nans: bool
If True, the slice parameters and weights of non-relevant slices are
replaced by NaN instead of being removed.
Returns
-------
    The slice_param and slice_weights arrays with non-relevant slices removed
    or, if replace_with_nans is True, replaced by NaN.
"""
    # Avoid shadowing the built-in `filter`.
    relevant = determine_statistically_relevant_slices(slice_weights,
                                                       min_fraction)
    if replace_with_nans:
        inv_filter = np.logical_not(relevant)
        slice_param = copy(slice_param)
        slice_weights = copy(slice_weights)
        slice_param[inv_filter] = np.nan
        slice_weights[inv_filter] = np.nan
        return slice_param, slice_weights
    else:
        return slice_param[relevant], slice_weights[relevant]
# File: aptools/helper_functions.py (from APtools 0.2.4)
import scipy.constants as ct
import numpy as np
def plasma_frequency(plasma_dens):
    """Calculate the plasma frequency from its density
Parameters
----------
plasma_dens : float
The plasma density in units of cm-3
Returns
-------
A float with plasma frequency in units of 1/s
"""
return np.sqrt(ct.e**2 * plasma_dens*1e6 / (ct.m_e*ct.epsilon_0))
def plasma_skin_depth(plasma_dens):
    """Calculate the plasma skin depth from its density
Parameters
----------
plasma_dens : float
The plasma density in units of cm-3
Returns
-------
A float with the plasma skin depth in meters
"""
return ct.c / plasma_frequency(plasma_dens)
def plasma_wavelength(plasma_dens):
    """Calculate the plasma wavelength from its density
Parameters
----------
plasma_dens : float
The plasma density in units of cm-3
Returns
-------
A float with the plasma wavelength in meters
"""
return 2*ct.pi*ct.c / plasma_frequency(plasma_dens)
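A numerical sanity check of the three relations above (re-stated so the snippet is self-contained) at a typical LWFA density of 1e18 cm^-3, where the plasma frequency is ~5.6e13 rad/s, the skin depth ~5.3 um and the plasma wavelength ~33 um:

```python
import numpy as np
import scipy.constants as ct

# Re-statements of the plasma relations above, self-contained.
def plasma_frequency(plasma_dens):
    return np.sqrt(ct.e**2 * plasma_dens * 1e6 / (ct.m_e * ct.epsilon_0))

def plasma_skin_depth(plasma_dens):
    return ct.c / plasma_frequency(plasma_dens)

def plasma_wavelength(plasma_dens):
    return 2 * ct.pi * ct.c / plasma_frequency(plasma_dens)

n_p = 1e18  # cm^-3
w_p = plasma_frequency(n_p)
k_p_inv = plasma_skin_depth(n_p)
lambda_p = plasma_wavelength(n_p)
```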
def plasma_cold_non_relativisct_wave_breaking_field(plasma_dens):
    """Calculate the cold, non-relativistic wave-breaking field from the
plasma density.
Parameters
----------
plasma_dens : float
The plasma density in units of cm-3
Returns
-------
A float with the field value in V/m
"""
return ct.m_e*ct.c/ct.e * plasma_frequency(plasma_dens)
def plasma_focusing_gradient_blowout(n_p):
"""Calculate the plasma focusing gradient assuming blowout regime
Parameters
----------
    n_p : float
        The plasma density in units of cm-3
Returns
-------
A float with the focusing gradient value in T/m
"""
return ct.m_e*plasma_frequency(n_p)**2 / (2*ct.e*ct.c)
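A standalone check of the blowout gradient above (formula re-stated with the plasma frequency inlined): at 1e18 cm^-3 it evaluates to roughly 3e7 T/m (30 MT/m).

```python
import numpy as np
import scipy.constants as ct

# Re-statement of the blowout focusing gradient above, self-contained.
def plasma_focusing_gradient_blowout(n_p):
    w_p = np.sqrt(ct.e**2 * n_p * 1e6 / (ct.m_e * ct.epsilon_0))
    return ct.m_e * w_p**2 / (2 * ct.e * ct.c)

g = plasma_focusing_gradient_blowout(1e18)  # ~3e7 T/m
```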
def plasma_focusing_gradient_linear(n_p, dist_from_driver, a_0, w_0):
"""Calculate the plasma focusing gradient assuming linear regime.
Parameters
----------
    n_p : float
        The plasma density in units of cm-3
    dist_from_driver : float
        Distance from the driver center in units of m.
a_0 : float
Peak normalized vector potential of the laser.
w_0 : float
Spot size (w_0) of the laser pulse in units of m.
Returns
-------
A float with the focusing gradient value in T/m.
"""
w_p = plasma_frequency(n_p)
k_p = w_p/ct.c
E_0 = ct.m_e*ct.c*w_p/ct.e
K = (8*np.pi/np.e)**(1/4)*a_0/(k_p*w_0)
return -E_0*K**2*k_p*np.sin(k_p*dist_from_driver)/ct.c
def laser_frequency(l_lambda):
"""Calculate the laser frequency from its wavelength.
Parameters
----------
l_lambda : float
The laser wavelength in meters
Returns
-------
A float with laser frequency in units of 1/s
"""
return 2*ct.pi*ct.c / l_lambda
def laser_rayleigh_length(w_0, l_lambda):
"""Calculate the Rayleigh length of the laser assuming a Gaussian profile.
Parameters
----------
w_0 : float
The laser beam waist in meters, i. e., 1/e in field or 1/e^2 in
        intensity. Calculate from FWHM as FWHM/sqrt(2*log(2)).
l_lambda : float
The laser wavelength in meters
Returns
-------
A float with Rayleigh length in m
"""
return ct.pi * w_0**2 / l_lambda
def laser_radius_at_z_pos(w_0, l_lambda, z):
"""Calculate the laser radius (W) at a distance z from its focal position.
Parameters
----------
w_0 : float
The laser beam waist in meters, i. e., 1/e in field or 1/e^2 in
        intensity. Calculate from FWHM as FWHM/sqrt(2*log(2)).
l_lambda : float
The laser wavelength in meters.
z : float or array
Distance from the focal position (in meters) at which to calculate the
laser radius.
Returns
-------
A float or array with laser radius (W) in meters
"""
z_r = laser_rayleigh_length(w_0, l_lambda)
return w_0 * np.sqrt(1 + (z/z_r)**2)
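A standalone check of the Gaussian-beam relations above (re-stated here), using an illustrative 20 um waist and 800 nm wavelength: the Rayleigh length is ~1.57 mm, and at z = z_R the radius grows to w_0*sqrt(2).

```python
import numpy as np

# Re-statements of the Gaussian-beam relations above, self-contained.
def laser_rayleigh_length(w_0, l_lambda):
    return np.pi * w_0**2 / l_lambda

def laser_radius_at_z_pos(w_0, l_lambda, z):
    z_r = laser_rayleigh_length(w_0, l_lambda)
    return w_0 * np.sqrt(1 + (z / z_r)**2)

w_0 = 20e-6   # 20 um waist (illustrative)
l_0 = 0.8e-6  # 800 nm (Ti:sapphire)
z_r = laser_rayleigh_length(w_0, l_0)
w_at_zr = laser_radius_at_z_pos(w_0, l_0, z_r)
```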
def self_guiding_threshold_a0_blowout(plasma_dens, l_lambda):
"""Get minimum a0 to fulfill self-guiding condition in the blowout regime.
For more details see W. Lu - 2007 - Designing LWFA in the blowout regime
(https://ieeexplore.ieee.org/iel5/4439904/4439905/04440664.pdf).
Parameters
----------
plasma_dens : float
The plasma density in units of cm-3
l_lambda : float
The laser wavelength in meters
Returns
-------
A float with the value of the threshold a0
"""
w_p = plasma_frequency(plasma_dens)
w_0 = laser_frequency(l_lambda)
a_0 = (w_0/w_p)**(2/5)
return a_0
def plasma_density_for_self_guiding_blowout(w_0, a_0, l_0=None):
"""Get plasma density fulfilling self-guiding condition in blowout regime.
    For more information see W. Lu - 2007 - Generating multi-GeV electron
    bunches using single stage laser wakefield acceleration in a 3D nonlinear
    regime
(https://journals.aps.org/prab/pdf/10.1103/PhysRevSTAB.10.061301)
Parameters
----------
w_0 : float
The laser beam waist in meters, i. e., 1/e in field or 1/e^2 in
        intensity. Calculate from FWHM as FWHM/sqrt(2*log(2)).
a_0 : float
The laser a_0
l_0 : float
The laser wavelength in meters. Only necessary to check that the
self-guiding threshold is met.
Returns
-------
A float with the value of the plasma density in units of cm-3
"""
k_p = 2 * np.sqrt(a_0)/w_0
n_p = k_p**2 * ct.m_e*ct.epsilon_0*ct.c**2/ct.e**2 * 1e-6
if l_0 is not None:
a_0_thres = self_guiding_threshold_a0_blowout(n_p, l_0)
if a_0 < a_0_thres:
print("Warning: laser a0 does not meet self-guiding conditions.")
print("Value provided: {}, threshold value: {}".format(a_0,
a_0_thres))
return n_p
def laser_energy(a_0, l_0, lon_fwhm, w_0):
"""Calculate laser pulse energy assuming Gaussian profile.
Parameters
----------
a_0 : float
The laser a_0
l_0 : float
The laser wavelength in meters.
lon_fwhm : float
Longitudinal FWHM of the intensity in seconds.
w_0 : float
The laser beam waist in meters, i. e., 1/e in field or 1/e^2 in
        intensity. Calculate from FWHM as FWHM/sqrt(2*log(2)).
Returns
-------
A float with the value of the energy in Joules
"""
i_peak = 2*np.pi**2*ct.epsilon_0*ct.m_e**2*ct.c**5 * a_0**2 / (ct.e*l_0)**2
s_x = w_0 / 2
s_y = s_x
s_z = lon_fwhm / (2*np.sqrt(2*np.log(2)))
l_ene = (2*np.pi)**(3/2) * s_x * s_y * s_z * i_peak
return l_ene
def laser_peak_intensity(a_0, l_0, z=None, w_0=None):
"""Calculate laser pulse peak intensity assuming Gaussian profile.
Parameters
----------
a_0 : float
The laser a_0.
l_0 : float
The laser wavelength in meters.
z : float or array
Distance to the focal position of the laser pulse.
w_0 : float
The laser beam waist in meters, i. e., 1/e in field or 1/e^2 in
        intensity. Calculate from FWHM as FWHM/sqrt(2*log(2)). Only needed if
z is not None.
Returns
-------
A float with the value of the peak power in W/m^2.
"""
if z is not None:
z_r = laser_rayleigh_length(w_0, l_0)
a_peak = a_0 / np.sqrt(1 + (z/z_r)**2)
else:
a_peak = a_0
k = 2*np.pi**2*ct.epsilon_0*ct.m_e**2*ct.c**5
i_peak = k * a_peak**2 / (ct.e*l_0)**2
return i_peak
def laser_peak_power(a_0, l_0, w_0):
"""Calculate laser pulse peak power assuming Gaussian profile.
Parameters
----------
a_0 : float
The laser a_0
l_0 : float
The laser wavelength in meters.
w_0 : float
The laser beam waist in meters, i. e., 1/e in field or 1/e^2 in
        intensity. Calculate from FWHM as FWHM/sqrt(2*log(2)).
Returns
-------
    A float with the value of the peak power in Watts
"""
i_peak = laser_peak_intensity(a_0, l_0)
p_peak = np.pi * w_0**2 * i_peak / 2
return p_peak
def laser_w0_for_self_guiding_blowout(n_p, a_0, l_0=None):
"""Get laser spot size fulfilling self-guiding condition in blowout regime.
    For more information see W. Lu - 2007 - Generating multi-GeV electron
bunches using single stage laser wakefield acceleration in a 3D nonlinear
regime (https://journals.aps.org/prab/pdf/10.1103/PhysRevSTAB.10.061301)
Parameters
----------
n_p : float
The plasma density in units of cm-3
a_0 : float
The laser a_0
l_0 : float
The laser wavelength in meters. Only necessary to check that the
self-guiding threshold is met.
Returns
-------
A float with the value of w_0 in meters
"""
k_p = plasma_frequency(n_p) / ct.c
w_0 = 2 * np.sqrt(a_0)/k_p
if l_0 is not None:
a_0_thres = self_guiding_threshold_a0_blowout(n_p, l_0)
if a_0 < a_0_thres:
print("Warning: laser a0 does not meet self-guiding conditions.")
print("Value provided: {}, threshold value: {}".format(a_0,
a_0_thres))
return w_0
def matched_laser_pulse_duration_blowout(n_p, a_0):
"""Get maximum matched laser pulse duration in the blowout regime.
For more details see W. Lu - 2007 - Designing LWFA in the blowout regime
(https://ieeexplore.ieee.org/iel5/4439904/4439905/04440664.pdf).
Parameters
----------
n_p : float
The plasma density in units of cm-3
a_0 : float
The laser a_0
Returns
-------
A float with the value of t_FWHM in seconds
"""
k_p = plasma_frequency(n_p) / ct.c
t_FWHM = 2/ct.c * np.sqrt(a_0)/k_p
return t_FWHM
def matched_beam_size(beam_ene, beam_em, n_p=None, k_x=None):
"""Get matched beam size for the plasma focusing fields.
The focusing gradient, k_x, can be provided or calculated from the plasma
density, n_p.
Parameters
----------
beam_ene : float
Unitless electron beam mean energy (beta*gamma)
beam_em : float
The electron beam normalized emittance in m*rad
n_p : float
The plasma density in units of cm-3
k_x : float
The plasma transverse focusing gradient in T/m
Returns
-------
A float with the value of beam size in meters
"""
# matched beta function
b_x = matched_plasma_beta_function(beam_ene, n_p, k_x)
# matched beam size
s_x = np.sqrt(b_x*beam_em/beam_ene)
return s_x
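A self-contained sketch of the matched-size calculation above, folding in the blowout focusing gradient (the beam energy, emittance and density below are illustrative assumptions): for gamma ~ 1000, 1 um normalized emittance and 1e18 cm^-3, the matched size is ~0.5 um.

```python
import numpy as np
import scipy.constants as ct

# Sketch of matched_beam_size (above) using the blowout gradient.
def matched_beam_size(beam_ene, beam_em, n_p):
    w_p = np.sqrt(ct.e**2 * n_p * 1e6 / (ct.m_e * ct.epsilon_0))
    k_x = ct.m_e * w_p**2 / (2 * ct.e * ct.c)      # blowout gradient, T/m
    w_x = np.sqrt(ct.c * ct.e / ct.m_e * k_x / beam_ene)  # betatron freq.
    b_x = ct.c / w_x                               # matched beta function
    return np.sqrt(b_x * beam_em / beam_ene)

s_x = matched_beam_size(beam_ene=1000., beam_em=1e-6, n_p=1e18)
```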
def matched_plasma_beta_function(beam_ene, n_p=None, k_x=None,
regime='Blowout', dist_from_driver=None,
a_0=None, w_0=None):
"""Get beta function from the plasma focusing fields.
The focusing gradient, k_x, can be provided or calculated from the plasma
density, n_p.
Parameters
----------
beam_ene : float
Unitless electron beam mean energy (beta*gamma)
n_p : float
The plasma density in units of cm-3
k_x : float
The plasma transverse focusing gradient in T/m
regime : string
        Specify the acceleration regime ('Linear' or 'Blowout') for which to
calculate the focusing fields. Only used if k_x is not provided.
    dist_from_driver : float
Distance from the driver center in units of m. Only needed for Linear
regime.
a_0 : float
Peak normalized vector potential of the laser. Only needed for Linear
regime.
w_0 : float
Spot size (w_0) of the laser pulse in units of m. Only needed for
Linear regime.
Returns
-------
A float with the value of the beta function in meters
"""
if k_x is None:
if n_p is None:
raise ValueError("No values for the plasma density and focusing"
" gradient have been provided.")
else:
if regime == 'Blowout':
k_x = plasma_focusing_gradient_blowout(n_p)
elif regime == 'Linear':
k_x = plasma_focusing_gradient_linear(n_p, dist_from_driver,
a_0, w_0)
else:
raise ValueError("Unrecognized acceleration regime")
# betatron frequency
w_x = np.sqrt(ct.c*ct.e/ct.m_e * k_x/beam_ene)
# beta function
b_x = ct.c/w_x
return b_x
def maximum_wakefield_plasma_lens(q_tot, s_z, s_r, n_p):
"""Calculates the maximum focusing gradient induced by beam wakefields in
an active plasma lens using linear theory.
Formula obtained from https://arxiv.org/pdf/1802.02750.pdf
Parameters
----------
q_tot : float
Total beam charge in C
s_z : float
RMS beam length in m
s_r : float
RMS beam size in m
n_p : float
The plasma density in units of cm-3
Returns
-------
A float with the value of the focusing gradient in T/m
"""
k_p = 1 / plasma_skin_depth(n_p)
a = 1 + k_p**2 * s_r**2 / 2
b = 1 + np.sqrt(8*ct.pi) * k_p**2 * s_z**2
g_max = q_tot * ct.mu_0 * ct.c * k_p**2 * s_z / (2*ct.pi * s_r**2 * a * b)
    return g_max
# File: aptools/plasma_accel/general_equations.py (from APtools 0.2.4)
from typing import Optional
import numpy as np
import scipy.constants as ct
from openpmd_api import (Series, Access, Dataset, Mesh_Record_Component,
Unit_Dimension)
from aptools import __version__
from .particle_distribution import ParticleDistribution
SCALAR = Mesh_Record_Component.SCALAR
def save_distribution(
distribution: ParticleDistribution,
file_path: str,
data_format: str,
**kwargs
) -> None:
"""Save particle distribution to file in the specified format.
Parameters
----------
distribution : ParticleDistribution
The particle distribution to save.
file_path : str
Path to the file in which to save the data.
data_format : str
Internal format of the data. Possible values
are 'astra', 'csrtrack' and 'openpmd'.
Other Parameters
----------------
**kwargs
Additional parameters to be passed to the particle savers.
"""
_savers[data_format](distribution, file_path, **kwargs)
def save_to_astra(
distribution: ParticleDistribution,
file_path: str
) -> None:
"""Save particle distribution to text file in ASTRA format.
Parameters
----------
distribution : ParticleDistribution
The particle distribution to save.
file_path : str
Path to the file in which to save the data.
"""
# Get beam data
m_species = distribution.m_species
q_species = distribution.q_species
x_orig = distribution.x
y_orig = distribution.y
z_orig = distribution.z
px_orig = distribution.px * m_species * ct.c**2 / ct.e # eV/c
py_orig = distribution.py * m_species * ct.c**2 / ct.e # eV/c
pz_orig = distribution.pz * m_species * ct.c**2 / ct.e # eV/c
q_orig = distribution.q * 1e9 # nC
w = distribution.w
# Determine particle index (type of species).
if m_species == ct.m_e and q_species == -ct.e:
index = 1
elif m_species == ct.m_e and q_species == ct.e:
index = 2
elif m_species == ct.m_p and q_species == ct.e:
index = 3
else:
raise ValueError(
'Only electrons, positrons and protons are supported when saving '
'to ASTRA.')
# Create arrays
x = np.zeros(q_orig.size + 1)
y = np.zeros(q_orig.size + 1)
z = np.zeros(q_orig.size + 1)
px = np.zeros(q_orig.size + 1)
py = np.zeros(q_orig.size + 1)
pz = np.zeros(q_orig.size + 1)
q = np.zeros(q_orig.size + 1)
# Reference particle
x[0] = np.average(x_orig, weights=w)
y[0] = np.average(y_orig, weights=w)
z[0] = np.average(z_orig, weights=w)
px[0] = np.average(px_orig, weights=w)
py[0] = np.average(py_orig, weights=w)
pz[0] = np.average(pz_orig, weights=w)
q[0] = np.sum(q_orig) / len(q_orig)
# Put relative to reference particle
x[1::] = x_orig
y[1::] = y_orig
z[1::] = z_orig - z[0]
px[1::] = px_orig
py[1::] = py_orig
pz[1::] = pz_orig - pz[0]
q[1::] = q_orig
t = z / ct.c
# Add flags and indices
ind = np.ones(q.size) * index
flag = np.ones(q.size) * 5
# Save to file
data = np.column_stack((x, y, z, px, py, pz, t, q, ind, flag))
np.savetxt(
file_path,
data,
'%1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %i %i'
)
def save_to_csrtrack(
distribution: ParticleDistribution,
file_path: str
) -> None:
"""Save particle distribution to text file using the CSRtrack fmt1 format.
Parameters
----------
distribution : ParticleDistribution
The particle distribution to save.
file_path : str
Path to the file in which to save the data.
"""
# Get beam data.
x_orig = distribution.x
y_orig = distribution.y
z_orig = distribution.z
px_orig = distribution.px * ct.m_e*ct.c**2/ct.e
py_orig = distribution.py * ct.m_e*ct.c**2/ct.e
pz_orig = distribution.pz * ct.m_e*ct.c**2/ct.e
q_orig = distribution.q
# Create arrays.
x = np.zeros(q_orig.size+2)
y = np.zeros(q_orig.size+2)
z = np.zeros(q_orig.size+2)
px = np.zeros(q_orig.size+2)
py = np.zeros(q_orig.size+2)
pz = np.zeros(q_orig.size+2)
q = np.zeros(q_orig.size+2)
# Reference particle.
x[1] = np.average(x_orig, weights=q_orig)
y[1] = np.average(y_orig, weights=q_orig)
z[1] = np.average(z_orig, weights=q_orig)
px[1] = np.average(px_orig, weights=q_orig)
py[1] = np.average(py_orig, weights=q_orig)
pz[1] = np.average(pz_orig, weights=q_orig)
q[1] = sum(q_orig)/len(q_orig)
# Relative coordinates.
x[2::] = x_orig - x[1]
y[2::] = y_orig - y[1]
z[2::] = z_orig - z[1]
px[2::] = px_orig - px[1]
py[2::] = py_orig - py[1]
pz[2::] = pz_orig - pz[1]
q[2::] = q_orig
# Save to file.
data = np.column_stack((z, x, y, pz, px, py, q))
if not file_path.endswith('.fmt1'):
file_path += '.fmt1'
np.savetxt(
file_path,
data,
'%1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %1.12e'
)
def save_to_openpmd(
distribution: ParticleDistribution,
file_path: str,
species_name: Optional[str] = 'particle_distribution'
) -> None:
"""
Save particle distribution to an HDF5 file following the openPMD standard.
Parameters
----------
distribution : ParticleDistribution
The particle distribution to save.
file_path : str
Path to the file in which to save the data.
species_name : str
Optional. Name under which the particle species should be stored.
"""
# Get beam data
x = np.ascontiguousarray(distribution.x)
y = np.ascontiguousarray(distribution.y)
z = np.ascontiguousarray(distribution.z)
px = np.ascontiguousarray(distribution.px)
py = np.ascontiguousarray(distribution.py)
pz = np.ascontiguousarray(distribution.pz)
w = np.ascontiguousarray(distribution.w)
q_species = distribution.q_species
m_species = distribution.m_species
# Save to file
if not file_path.endswith('.h5'):
file_path += '.h5'
opmd_series = Series(file_path, Access.create)
# Set basic attributes.
opmd_series.set_software('APtools', __version__)
opmd_series.set_particles_path('particles')
# Create iteration
it = opmd_series.iterations[0]
# Create particles species.
particles = it.particles[species_name]
# Create additional necessary arrays and constants.
px = px * m_species * ct.c
py = py * m_species * ct.c
pz = pz * m_species * ct.c
# Generate datasets.
d_x = Dataset(x.dtype, extent=x.shape)
d_y = Dataset(y.dtype, extent=y.shape)
d_z = Dataset(z.dtype, extent=z.shape)
d_px = Dataset(px.dtype, extent=px.shape)
d_py = Dataset(py.dtype, extent=py.shape)
d_pz = Dataset(pz.dtype, extent=pz.shape)
d_w = Dataset(w.dtype, extent=w.shape)
d_q = Dataset(np.dtype('float64'), extent=[1])
d_m = Dataset(np.dtype('float64'), extent=[1])
d_xoff = Dataset(np.dtype('float64'), extent=[1])
d_yoff = Dataset(np.dtype('float64'), extent=[1])
d_zoff = Dataset(np.dtype('float64'), extent=[1])
# Record data.
particles['position']['x'].reset_dataset(d_x)
particles['position']['y'].reset_dataset(d_y)
particles['position']['z'].reset_dataset(d_z)
particles['positionOffset']['x'].reset_dataset(d_xoff)
particles['positionOffset']['y'].reset_dataset(d_yoff)
particles['positionOffset']['z'].reset_dataset(d_zoff)
particles['momentum']['x'].reset_dataset(d_px)
particles['momentum']['y'].reset_dataset(d_py)
particles['momentum']['z'].reset_dataset(d_pz)
particles['weighting'][SCALAR].reset_dataset(d_w)
particles['charge'][SCALAR].reset_dataset(d_q)
particles['mass'][SCALAR].reset_dataset(d_m)
    # Prepare for writing.
particles['position']['x'].store_chunk(x)
particles['position']['y'].store_chunk(y)
particles['position']['z'].store_chunk(z)
particles['positionOffset']['x'].make_constant(0.)
particles['positionOffset']['y'].make_constant(0.)
particles['positionOffset']['z'].make_constant(0.)
particles['momentum']['x'].store_chunk(px)
particles['momentum']['y'].store_chunk(py)
particles['momentum']['z'].store_chunk(pz)
particles['weighting'][SCALAR].store_chunk(w)
particles['charge'][SCALAR].make_constant(q_species)
particles['mass'][SCALAR].make_constant(m_species)
# Set units.
particles['position'].unit_dimension = {Unit_Dimension.L: 1}
particles['positionOffset'].unit_dimension = {Unit_Dimension.L: 1}
particles['momentum'].unit_dimension = {
Unit_Dimension.L: 1,
Unit_Dimension.M: 1,
Unit_Dimension.T: -1,
}
particles['charge'].unit_dimension = {
Unit_Dimension.T: 1,
Unit_Dimension.I: 1,
}
particles['mass'].unit_dimension = {Unit_Dimension.M: 1}
# Set weighting attributes.
particles['position'].set_attribute('macroWeighted', np.uint32(0))
particles['positionOffset'].set_attribute(
'macroWeighted', np.uint32(0))
particles['momentum'].set_attribute('macroWeighted', np.uint32(0))
particles['weighting'][SCALAR].set_attribute(
'macroWeighted', np.uint32(1))
particles['charge'][SCALAR].set_attribute(
'macroWeighted', np.uint32(0))
particles['mass'][SCALAR].set_attribute('macroWeighted', np.uint32(0))
particles['position'].set_attribute('weightingPower', 0.)
particles['positionOffset'].set_attribute('weightingPower', 0.)
particles['momentum'].set_attribute('weightingPower', 1.)
particles['weighting'][SCALAR].set_attribute('weightingPower', 1.)
particles['charge'][SCALAR].set_attribute('weightingPower', 1.)
particles['mass'][SCALAR].set_attribute('weightingPower', 1.)
# Flush data.
opmd_series.flush()
_savers = {
'astra': save_to_astra,
'csrtrack': save_to_csrtrack,
'openpmd': save_to_openpmd
}
from typing import Optional
import numpy as np
import scipy.constants as ct
from h5py import File
from aptools.helper_functions import join_infile_path
from .particle_distribution import ParticleDistribution
def read_distribution(
file_path: str,
data_format: str,
**kwargs
) -> ParticleDistribution:
"""Read particle distribution from file.
Parameters
----------
file_path : str
Path to the file with particle data
data_format : str
Internal format of the data. Possible values
are 'astra', 'csrtrack' and 'openpmd'.
Other Parameters
----------------
**kwargs
Additional parameters to be passed to the particle readers.
Returns
-------
ParticleDistribution
The particle distribution.
"""
return _readers[data_format](file_path, **kwargs)
def read_from_astra(
file_path: str,
remove_non_standard: bool = True
) -> ParticleDistribution:
"""Read particle distribution from ASTRA.
Parameters
----------
file_path : str
Path to the file with particle data
remove_non_standard : bool
Determines whether non-standard particles (those with a status flag
other than 5) should be removed from the read data.
Returns
-------
ParticleDistribution
The particle distribution.
"""
# Read data.
data = np.genfromtxt(file_path)
# Get status flag and remove non-standard particles, if needed.
status_flag = data[:, 9]
if remove_non_standard:
data = data[np.where(status_flag == 5)]
# Extract phase space and particle index.
x = data[:, 0]
y = data[:, 1]
z = data[:, 2]
px = data[:, 3]
py = data[:, 4]
pz = data[:, 5]
q = data[:, 7] * 1e-9
index = data[:, 8]
# Apply reference particle.
z[1:] += z[0]
pz[1:] += pz[0]
# Determine charge and mass of particle species.
assert index[0] in [1, 2, 3], (
'Only electrons, positrons and protons are supported when reading '
'ASTRA distributions')
q_species = ct.e * (-1 if index[0] in [1, 3] else 1)
m_species = ct.m_e if index[0] in [1, 2] else ct.m_p
# Convert momentum to normalized units (beta * gamma).
px /= m_species * ct.c**2 / ct.e
py /= m_species * ct.c**2 / ct.e
pz /= m_species * ct.c**2 / ct.e
# Get particle weights.
w = q / q_species
# Return distribution.
return ParticleDistribution(x, y, z, px, py, pz, w, q_species, m_species)
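The momentum conversion above relies on ASTRA storing momenta in eV/c: dividing by the particle rest energy in eV (m c^2 / e) yields the dimensionless beta*gamma used throughout this module. A minimal sketch of that conversion (the 100 MeV/c value is made up for illustration):

```python
import scipy.constants as ct

# ASTRA momenta are given in eV/c; dividing by the rest energy in eV
# (m c^2 / e) yields the dimensionless beta*gamma used internally.
pz_eV = 100e6  # hypothetical 100 MeV/c electron momentum
rest_energy_eV = ct.m_e * ct.c**2 / ct.e  # ~0.511 MeV for an electron
pz_norm = pz_eV / rest_energy_eV  # beta*gamma, ~195.7 here
```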
def read_from_csrtrack(
file_path: str,
) -> ParticleDistribution:
"""Read particle distribution from CSRtrack.
Parameters
----------
file_path : str
Path to the file with particle data
Returns
-------
ParticleDistribution
The particle distribution.
"""
# Read data.
data = np.genfromtxt(file_path)
# Extract phase space,
z = data[1:, 0]
x = data[1:, 1]
y = data[1:, 2]
pz = data[1:, 3] / (ct.m_e*ct.c**2/ct.e)
px = data[1:, 4] / (ct.m_e*ct.c**2/ct.e)
py = data[1:, 5] / (ct.m_e*ct.c**2/ct.e)
q = data[1:, 6]
# Apply reference particle.
x[1:] += x[0]
y[1:] += y[0]
z[1:] += z[0]
px[1:] += px[0]
py[1:] += py[0]
pz[1:] += pz[0]
# Determine charge and mass of particle species (only electrons in
# CSRtrack).
q_species = -ct.e
m_species = ct.m_e
# Get particle weights.
w = q / q_species
# Return distribution.
return ParticleDistribution(x, y, z, px, py, pz, w, q_species, m_species)
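Both the ASTRA and CSRtrack readers use the same reference-particle convention: the first row holds the absolute coordinates of a reference particle and the remaining rows hold offsets relative to it, so adding the reference value back restores absolute coordinates. A small sketch of that operation with made-up numbers:

```python
import numpy as np

# Row 0 is the reference particle; rows 1: store offsets relative to it.
z = np.array([10.0, -0.5, 0.5, 1.0])
z[1:] += z[0]  # restore absolute longitudinal positions
# z is now [10.0, 9.5, 10.5, 11.0]
```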
def read_from_openpmd(
file_path: str,
species_name: Optional[str] = None
) -> ParticleDistribution:
"""Read particle distribution from ASTRA.
Parameters
----------
file_path : str
Path to the file with particle data
species_name : str, Optional
Name of the particle species. Optional if only one particle species
is available in the openpmd file.
Returns
-------
ParticleDistribution
The particle distribution.
"""
# Open file.
file_content = File(file_path, mode='r')
# Get base path in file.
iteration = list(file_content['/data'].keys())[0]
base_path = '/data/{}'.format(iteration)
# Get path under which particle data is stored.
particles_path = file_content.attrs['particlesPath'].decode()
# Get list of available species.
available_species = list(
file_content[join_infile_path(base_path, particles_path)])
assert len(available_species) > 0, (
"No particle species found in '{}'.".format(file_path))
# If not specified, read the first species.
if species_name is None:
species_name = available_species[0]
# Get species.
beam_species = file_content[
join_infile_path(base_path, particles_path, species_name)]
# Get phase space and attributes.
mass = beam_species['mass']
charge = beam_species['charge']
position = beam_species['position']
position_off = beam_species['positionOffset']
momentum = beam_species['momentum']
m_species = mass.attrs['value'] * mass.attrs['unitSI']
q_species = charge.attrs['value'] * charge.attrs['unitSI']
x = (position['x'][:] * position['x'].attrs['unitSI'] +
position_off['x'].attrs['value'] * position_off['x'].attrs['unitSI'])
y = (position['y'][:] * position['y'].attrs['unitSI'] +
position_off['y'].attrs['value'] * position_off['y'].attrs['unitSI'])
z = (position['z'][:] * position['z'].attrs['unitSI'] +
position_off['z'].attrs['value'] * position_off['z'].attrs['unitSI'])
px = momentum['x'][:] * momentum['x'].attrs['unitSI'] / (m_species*ct.c)
py = momentum['y'][:] * momentum['y'].attrs['unitSI'] / (m_species*ct.c)
pz = momentum['z'][:] * momentum['z'].attrs['unitSI'] / (m_species*ct.c)
w = beam_species['weighting'][:]
# Return distribution.
return ParticleDistribution(x, y, z, px, py, pz, w, q_species, m_species)
_readers = {
'astra': read_from_astra,
'csrtrack': read_from_csrtrack,
'openpmd': read_from_openpmd
}
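The `_readers` dict (like `_savers` above) implements a simple dispatch pattern: `read_distribution` looks the format name up and calls the matching reader, so supporting a new format only requires registering one entry. A toy sketch of the same pattern, with made-up names:

```python
# Toy dispatch table mirroring the _readers/_savers pattern.
def _read_demo(path):
    return ('demo', path)

_demo_readers = {'demo': _read_demo}

def read_demo(path, fmt):
    # Look up the reader for the requested format and delegate to it.
    try:
        reader = _demo_readers[fmt]
    except KeyError:
        raise ValueError(f"Unknown data format '{fmt}'") from None
    return reader(path)
```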
import scipy.constants as ct
import numpy as np
from aptools.helper_functions import (weighted_std, create_beam_slices,
remove_correlation,
calculate_slice_average)
def twiss_parameters(x, px, pz, py=None, w=None, emitt='tr',
disp_corrected=False, corr_order=1):
"""Calculate the alpha and beta functions of the beam in a certain
transverse plane
Parameters
----------
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
px : array
Contains the transverse momentum of the beam particles in the same
plane as x in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum of the beam particles in the opposite
plane as x in non-dimensional units (beta*gamma). Necessary if
disp_corrected=True or emitt='ph'.
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma).
w : array or single value
Statistical weight of the particles.
emitt : str
Determines which emittance to use to calculate the Twiss parameters.
Possible values are 'tr' for trace-space emittance and 'ph' for
phase-space emittance
disp_corrected : bool
Whether or not to correct for dispersion contributions.
corr_order : int
Highest order up to which dispersion effects should be corrected.
Returns
-------
A tuple with the value of the alpha, beta [m] and gamma [m^-1] functions
"""
if emitt == 'ph':
em_x = normalized_transverse_rms_emittance(x, px, py, pz, w,
disp_corrected, corr_order)
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
x_avg = np.average(x, weights=w)
px_avg = np.average(px, weights=w)
# center x and px
x = x - x_avg
px = px - px_avg
if disp_corrected:
# remove x-gamma correlation
dgamma = (gamma - gamma_avg)/gamma_avg
x = remove_correlation(dgamma, x, w, corr_order)
b_x = np.average(x**2, weights=w)*gamma_avg/em_x
a_x = -np.average(x*px, weights=w)/em_x
elif emitt == 'tr':
em_x = transverse_trace_space_rms_emittance(x, px, py, pz, w,
disp_corrected, corr_order)
xp = px/pz
# center x and xp
x_avg = np.average(x, weights=w)
xp_avg = np.average(xp, weights=w)
x = x - x_avg
xp = xp - xp_avg
if disp_corrected:
# remove x-gamma correlation
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
dgamma = (gamma - gamma_avg)/gamma_avg
x = remove_correlation(dgamma, x, w, corr_order)
# remove xp-gamma correlation
xp = remove_correlation(dgamma, xp, w, corr_order)
b_x = np.average(x**2, weights=w)/em_x
a_x = -np.average(x*xp, weights=w)/em_x
g_x = (1 + a_x**2)/b_x
return (a_x, b_x, g_x)
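For the trace-space branch, the Twiss functions follow directly from the second-order moments: with emittance eps = sqrt(<x^2><x'^2> - <x x'>^2), beta = <x^2>/eps, alpha = -<x x'>/eps and gamma = (1 + alpha^2)/beta. A sketch on a synthetic correlated beam (the sizes and correlation are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(0.0, 1e-6, n)             # transverse position [m]
xp = 2.0 * x + rng.normal(0.0, 1e-6, n)  # correlated trace-space angle
# Center both coordinates before taking second-order moments.
x = x - np.average(x)
xp = xp - np.average(xp)
eps = np.sqrt(np.average(x**2) * np.average(xp**2) - np.average(x * xp)**2)
beta = np.average(x**2) / eps      # ~1 m for these parameters
alpha = -np.average(x * xp) / eps  # ~-2
gamma = (1 + alpha**2) / beta      # ~5 m^-1
```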
def dispersion(x, px, py, pz, gamma_ref=None, w=None):
"""Calculate the first-order dispersion from the beam distribution
Parameters
----------
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
px : array
Contains the transverse momentum of the beam particles in the same
plane as x in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum of the beam particles in the opposite
plane as x in non-dimensional units (beta*gamma).
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma).
gamma_ref : float
Reference energy for the dispersive element. If 'None' this will be the
beam average energy.
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the value of the dispersion in m.
"""
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
if gamma_ref is None:
gamma_ref = np.average(gamma, weights=w)
dgamma = (gamma - gamma_ref)/gamma_ref
fit_coefs = np.polyfit(dgamma, x, 1, w=w)
disp = fit_coefs[0]
return disp
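`dispersion` fits x against the relative energy deviation; the linear coefficient is the first-order dispersion D, i.e. x ≈ D * (gamma - gamma_ref)/gamma_ref plus betatron motion. A sketch on synthetic data with an assumed D of 0.25 m:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
dgamma = rng.normal(0.0, 1e-3, n)             # relative energy deviation
x = 0.25 * dgamma + rng.normal(0.0, 1e-6, n)  # dispersive + betatron parts
disp = np.polyfit(dgamma, x, 1)[0]            # fitted slope, ~0.25 m
```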
def rms_length(z, w=None):
"""Calculate the RMS bunch length of the provided particle
distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the RMS length value in meters.
"""
s_z = weighted_std(z, weights=w)
return s_z
def rms_size(x, w=None):
"""Calculate the RMS bunch size of the provided particle
distribution
Parameters
----------
x : array
Contains the transverse position of the particles in units of meters
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the RMS length value in meters.
"""
s_x = weighted_std(x, weights=w)
return s_x
def mean_kinetic_energy(px, py, pz, w=None):
"""Calculate the mean kinetic energy of the provided particle distribution
Parameters
----------
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the mean kinetic energy in non-dimmensional
units, i.e. [1/(m_e c**2)]
"""
return np.average(np.sqrt(np.square(px) + np.square(py) + np.square(pz)),
weights=w)
def mean_energy(px, py, pz, w=None):
"""Calculate the mean energy of the provided particle distribution
Parameters
----------
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the mean energy in non-dimmensional units, i.e. [1/(m_e c**2)]
"""
return np.average(np.sqrt(1 + px**2 + py**2 + pz**2), weights=w)
def rms_energy_spread(px, py, pz, w=None):
"""Calculate the absotule RMS energy spread of the provided particle
distribution
Parameters
----------
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the energy spread value in non-dimmensional units,
i.e. [1/(m_e c**2)]
"""
part_ene = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
ene_std = weighted_std(part_ene, weights=w)
return ene_std
def fwhm_energy_spread(z, px, py, pz, w=None, n_slices=10, len_slice=None):
"""Calculate the absolute FWHM energy spread of the provided particle
distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
n_slices : int
Number of longitudinal slices in which to divide the particle
distribution. Not used if len_slice is specified.
len_slice : float
Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A float with the energy spread value in non-dimmensional units,
i.e. [1/(m_e c**2)]
"""
part_ene = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
slice_lims, n_slices = create_beam_slices(z, n_slices, len_slice)
gamma_hist, z_edges = np.histogram(part_ene, bins=n_slices, weights=w)
slice_pos = z_edges[1:] - abs(z_edges[1]-z_edges[0])/2
peak = max(gamma_hist)
slices_in_fwhm = slice_pos[np.where(gamma_hist >= peak/2)]
fwhm = max(slices_in_fwhm) - min(slices_in_fwhm)
return fwhm
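`fwhm_energy_spread` estimates the FWHM by histogramming the particle energies and measuring the span of the bins above half the peak. For a Gaussian energy distribution this should approach 2.355 sigma; a sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(2)
ene = rng.normal(1000.0, 5.0, 200_000)       # synthetic Lorentz factors
hist, edges = np.histogram(ene, bins=100)
centers = edges[1:] - abs(edges[1] - edges[0]) / 2
in_fwhm = centers[hist >= hist.max() / 2]    # bins above half the peak
fwhm = in_fwhm.max() - in_fwhm.min()         # ~2.355 * 5 ≈ 11.8
```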
def relative_fwhm_energy_spread(z, px, py, pz, w=None, n_slices=10,
len_slice=None):
"""Calculate the relative RMS energy spread of the provided particle
distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
n_slices : int
Number of longitudinal slices in which to divide the particle
distribution. Not used if len_slice is specified.
len_slice : float
Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A float with the relative energy spread value.
"""
abs_spread = fwhm_energy_spread(z, px, py, pz, w=w, n_slices=n_slices,
len_slice=len_slice)
mean_ene = mean_energy(px, py, pz, w)
rel_spread = abs_spread/mean_ene
return rel_spread
def relative_rms_energy_spread(px, py, pz, w=None):
"""Calculate the relative RMS energy spread of the provided particle
distribution
Parameters
----------
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the relative energy spread value.
"""
abs_spread = rms_energy_spread(px, py, pz, w)
mean_ene = mean_energy(px, py, pz, w)
rel_spread = abs_spread/mean_ene
return rel_spread
def longitudinal_energy_chirp(z, px, py, pz, w=None):
"""Calculate the longitudinal energy chirp, K, of the provided particle
distribution in units of m**(-1). It is defined as dE/<E> = K*dz.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the chirp value in units of m^(-1)
"""
ene = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
mean_ene = np.average(ene, weights=w)
mean_z = np.average(z, weights=w)
dE_rel = (ene-mean_ene) / mean_ene
dz = z - mean_z
p = np.polyfit(dz, dE_rel, 1)
K = p[0]
return K
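The chirp K is the slope of a linear fit of relative energy deviation versus longitudinal offset, dE/<E> = K dz. A sketch on synthetic data with an assumed chirp of 2e4 m^-1:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
z = rng.normal(0.0, 5e-6, n)  # longitudinal position [m]
# Energies with a linear z-correlation plus uncorrelated noise.
gamma = 1000.0 * (1.0 + 2.0e4 * z) + rng.normal(0.0, 0.5, n)
dE_rel = (gamma - np.average(gamma)) / np.average(gamma)
dz = z - np.average(z)
K = np.polyfit(dz, dE_rel, 1)[0]  # fitted chirp, ~2e4 m^-1
```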
def rms_relative_correlated_energy_spread(z, px, py, pz, w=None):
"""Calculate the correlated energy spread of the provided particle
distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the relative correlated energy spread (dimensionless).
"""
K = longitudinal_energy_chirp(z, px, py, pz, w)
mean_z = np.average(z, weights=w)
dz = z - mean_z
corr_ene = K*dz
corr_ene_sp = weighted_std(corr_ene, w)
return corr_ene_sp
def rms_relative_uncorrelated_energy_spread(z, px, py, pz, w=None):
"""Calculate the uncorrelated energy spread of the provided particle
distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the relative uncorrelated energy spread (dimensionless).
"""
if len(z) > 1:
ene = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
mean_ene = np.average(ene, weights=w)
mean_z = np.average(z, weights=w)
dE = ene-mean_ene
dz = z - mean_z
p = np.polyfit(dz, dE, 1)
K = p[0]
unc_ene = ene - K*dz
unc_ene_sp = weighted_std(unc_ene, w)/mean_ene
else:
unc_ene_sp = 0
return unc_ene_sp
def normalized_transverse_rms_emittance(x, px, py=None, pz=None, w=None,
disp_corrected=False, corr_order=1):
"""Calculate the normalized transverse RMS emittance without dispersion
contributions of the particle distribution in a given plane.
Parameters
----------
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
px : array
Contains the transverse momentum of the beam particles in the same
plane as x in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum of the beam particles in the opposite
plane as x in non-dimensional units (beta*gamma). Necessary if
disp_corrected=True.
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma). Necessary if disp_corrected=True.
w : array or single value
Statistical weight of the particles.
disp_corrected : bool
Whether or not to correct for dispersion contributions.
corr_order : int
Highest order up to which dispersion effects should be corrected.
Returns
-------
A float with the emittance value in units of m * rad
"""
if len(x) > 1:
if disp_corrected:
# remove x-gamma correlation
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
dgamma = (gamma - gamma_avg)/gamma_avg
x = remove_correlation(dgamma, x, w, corr_order)
cov_x = np.cov(x, px, aweights=np.abs(w))
em_x = np.sqrt(np.linalg.det(cov_x.astype(np.float32, copy=False)))
else:
em_x = 0
return em_x
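The RMS emittance is the square root of the determinant of the 2x2 covariance matrix of (x, px); for an uncorrelated Gaussian beam it reduces to sigma_x * sigma_px. A sketch with made-up beam sizes:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
x = rng.normal(0.0, 1e-6, n)   # position [m]
px = rng.normal(0.0, 2e-3, n)  # normalized momentum (beta*gamma)
# Weighted covariance matrix of the (x, px) phase space.
cov = np.cov(x, px, aweights=np.ones(n))
emit = np.sqrt(np.linalg.det(cov))  # ~sigma_x * sigma_px = 2e-9 m rad
```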
def geometric_transverse_rms_emittance(x, px, py, pz, w=None,
disp_corrected=False, corr_order=1):
"""Calculate the geometric transverse RMS emittance without dispersion
contributions of the particle distribution in a given plane.
Parameters
----------
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
px : array
Contains the transverse momentum of the beam particles in the same
plane as x in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum of the beam particles in the opposite
plane as x in non-dimensional units (beta*gamma).
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma).
w : array or single value
Statistical weight of the particles.
disp_corrected : bool
Whether or not to correct for dispersion contributions.
corr_order : int
Highest order up to which dispersion effects should be corrected.
Returns
-------
A float with the emittance value in units of m * rad
"""
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
em_x = normalized_transverse_rms_emittance(x, px, py, pz, w,
disp_corrected, corr_order)
return em_x / gamma_avg
def normalized_transverse_trace_space_rms_emittance(
x, px, py, pz, w=None, disp_corrected=False, corr_order=1):
"""Calculate the normalized trasnverse trace-space RMS emittance of the
particle distribution in a given plane.
Parameters
----------
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
px : array
Contains the transverse momentum of the beam particles in the same
plane as x in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum of the beam particles in the opposite
plane as x in non-dimensional units (beta*gamma).
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma).
w : array or single value
Statistical weight of the particles.
disp_corrected : bool
Whether or not to correct for dispersion contributions.
corr_order : int
Highest order up to which dispersion effects should be corrected.
Returns
-------
A float with the emittance value in units of m * rad
"""
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
em_x = transverse_trace_space_rms_emittance(x, px, py, pz, w,
disp_corrected, corr_order)
return em_x * gamma_avg
def transverse_trace_space_rms_emittance(x, px, py=None, pz=None, w=None,
disp_corrected=False, corr_order=1):
"""Calculate the trasnverse trace-space RMS emittance of the
particle distribution in a given plane.
Parameters
----------
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
px : array
Contains the transverse momentum of the beam particles in the same
plane as x in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum of the beam particles in the opposite
plane as x in non-dimensional units (beta*gamma). Necessary if
disp_corrected=True.
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma). Necessary if disp_corrected=True.
w : array or single value
Statistical weight of the particles.
disp_corrected : bool
Whether or not to correct for dispersion contributions.
corr_order : int
Highest order up to which dispersion effects should be corrected.
Returns
-------
A float with the emittance value in units of m * rad
"""
if len(x) > 1:
xp = px/pz
if disp_corrected:
# remove x-gamma correlation
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
dgamma = (gamma - gamma_avg)/gamma_avg
x = remove_correlation(dgamma, x, w, corr_order)
# remove xp-gamma correlation
xp = remove_correlation(dgamma, xp, w, corr_order)
cov_x = np.cov(x, xp, aweights=np.abs(w))
em_x = np.sqrt(np.linalg.det(cov_x.astype(np.float32, copy=False)))
else:
em_x = 0
return em_x
def longitudinal_rms_emittance(z, px, py, pz, w=None):
"""Calculate the longitudinal RMS emittance of the particle
distribution in a given plane.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
Returns
-------
A float with the emittance value in units of m
"""
g = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
cov_l = np.cov(z, g, aweights=np.abs(w))
em_l = np.sqrt(np.linalg.det(cov_l.astype(np.float32, copy=False)))
return em_l
def peak_current(z, q, n_slices=10, len_slice=None):
"""Calculate the peak current of the given particle distribution.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
q : array
Contains the charge of the particles in C
n_slices : int
Number of longitudinal slices in which to divide the particle
distribution. Not used if len_slice is specified.
len_slice : float
Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
The absolute value of the peak current in Ampere.
"""
current_prof, *_ = current_profile(z, q, n_slices=n_slices,
len_slice=len_slice)
current_prof = abs(current_prof)
return max(current_prof)
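`current_profile` (defined elsewhere in this module) essentially histograms the charge along z and divides each bin by the slice duration. A sketch of that calculation for a Gaussian bunch, where the analytic peak is Q c / (sqrt(2 pi) sigma_z); the numbers are made up:

```python
import numpy as np
import scipy.constants as ct

rng = np.random.default_rng(5)
n = 100_000
sigma_z = 3e-6
z = rng.normal(0.0, sigma_z, n)      # longitudinal positions [m]
q = np.full(n, 100e-12 / n)          # 100 pC split evenly over particles
hist, edges = np.histogram(z, bins=50, weights=q)  # charge per slice
dt = (edges[1] - edges[0]) / ct.c    # slice duration [s]
i_peak = np.abs(hist).max() / dt     # ~Q c / (sqrt(2 pi) sigma_z) ≈ 4 kA
```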
def fwhm_length(z, q, n_slices=10, len_slice=0.1e-6):
"""Calculate the FWHM length of the given particle distribution.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
q : array
Contains the charge of the particles in C
n_slices : int
Number of longitudinal slices in which to divide the particle
distribution. Not used if len_slice is specified.
len_slice : float
Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
The FWHM value in metres.
"""
current_prof, z_edges = current_profile(z, q, n_slices=n_slices,
len_slice=len_slice)
slice_pos = z_edges[1:] - abs(z_edges[1]-z_edges[0])/2
current_prof = abs(current_prof)
i_peak = max(current_prof)
i_half = i_peak/2
slices_in_fwhm = slice_pos[np.where(current_prof >= i_half)]
fwhm = max(slices_in_fwhm) - min(slices_in_fwhm)
return fwhm
def relative_rms_slice_energy_spread(z, px, py, pz, w=None, n_slices=10,
len_slice=None):
"""Calculate the relative RMS slice energy spread of the provided particle
distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
n_slices : int
Number of longitudinal slices in which to divide the particle
distribution. Not used if len_slice is specified.
len_slice : float
Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A tuple containing:
- An array with the relative energy spread value in each slice.
- An array with the statistical weight of each slice.
- An array with the slice edges.
- A float with the weighted average of the slice values.
"""
slice_lims, n_slices = create_beam_slices(z, n_slices, len_slice)
slice_ene_sp = np.zeros(n_slices)
slice_weight = np.zeros(n_slices)
for i in np.arange(0, n_slices):
a = slice_lims[i]
b = slice_lims[i+1]
slice_particle_filter = (z > a) & (z <= b)
if slice_particle_filter.any():
px_slice = px[slice_particle_filter]
py_slice = py[slice_particle_filter]
pz_slice = pz[slice_particle_filter]
if hasattr(w, '__iter__'):
w_slice = w[slice_particle_filter]
else:
w_slice = w
slice_ene_sp[i] = relative_rms_energy_spread(px_slice, py_slice,
pz_slice, w_slice)
slice_weight[i] = np.sum(w_slice)
slice_avg = calculate_slice_average(slice_ene_sp, slice_weight)
return slice_ene_sp, slice_weight, slice_lims, slice_avg
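The slice diagnostics all share the masking loop above: bin particles by z, evaluate the statistic on each slice, and average the slice results weighted by the total particle weight in each slice. A self-contained sketch of that pattern using a made-up per-particle quantity:

```python
import numpy as np

rng = np.random.default_rng(6)
z = rng.uniform(0.0, 1.0, 10_000)
vals = z**2                                # hypothetical slice quantity
edges = np.linspace(z.min(), z.max(), 11)  # edges of 10 slices
slice_val = np.zeros(10)
slice_weight = np.zeros(10)
for i in range(10):
    # Select the particles falling in slice i (half-open interval).
    mask = (z > edges[i]) & (z <= edges[i + 1])
    if mask.any():
        slice_val[i] = np.average(vals[mask])
        slice_weight[i] = mask.sum()
# Weight each slice result by its particle count; ~E[z^2] = 1/3 here.
slice_avg = np.average(slice_val, weights=slice_weight)
```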
def rms_relative_uncorrelated_slice_energy_spread(z, px, py, pz, w=None,
n_slices=10, len_slice=None):
"""Calculate the uncorrelated slcie energy spread of the provided particle
distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
px : array
Contains the transverse momentum in the x direction of the
beam particles in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum in the y direction of the
beam particles in non-dimensional units (beta*gamma)
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
n_slices : int
Number of longitudinal slices in which to divide the particle
distribution. Not used if len_slice is specified.
len_slice : float
Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A tuple containing:
- An array with the relative energy spread value in each slice.
- An array with the statistical weight of each slice.
- An array with the slice edges.
- A float with the weighted average of the slice values.
"""
slice_lims, n_slices = create_beam_slices(z, n_slices, len_slice)
slice_ene_sp = np.zeros(n_slices)
slice_weight = np.zeros(n_slices)
for i in np.arange(0, n_slices):
a = slice_lims[i]
b = slice_lims[i+1]
slice_particle_filter = (z > a) & (z <= b)
if slice_particle_filter.any():
z_slice = z[slice_particle_filter]
px_slice = px[slice_particle_filter]
py_slice = py[slice_particle_filter]
pz_slice = pz[slice_particle_filter]
if hasattr(w, '__iter__'):
w_slice = w[slice_particle_filter]
else:
w_slice = w
slice_ene_sp[i] = rms_relative_uncorrelated_energy_spread(
z_slice, px_slice, py_slice, pz_slice, w_slice)
slice_weight[i] = np.sum(w_slice)
slice_avg = calculate_slice_average(slice_ene_sp, slice_weight)
return slice_ene_sp, slice_weight, slice_lims, slice_avg
def normalized_transverse_rms_slice_emittance(
z, x, px, py=None, pz=None, w=None, disp_corrected=False, corr_order=1,
n_slices=10, len_slice=None):
"""Calculate the normalized transverse RMS slice emittance of the particle
distribution in a given plane.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
px : array
Contains the transverse momentum of the beam particles in the same
plane as x in non-dimensional units (beta*gamma)
py : array
Contains the transverse momentum of the beam particles in the opposite
plane as x in non-dimensional units (beta*gamma). Necessary if
disp_corrected=True.
pz : array
Contains the longitudinal momentum of the beam particles in
non-dimensional units (beta*gamma). Necessary if disp_corrected=True.
w : array or single value
Statistical weight of the particles.
disp_corrected : bool
Whether or not to correct for dispersion contributions.
corr_order : int
Highest order up to which dispersion effects should be corrected.
n_slices : int
Number of longitudinal slices in which to divide the particle
distribution. Not used if len_slice is specified.
len_slice : float
Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A tuple containing:
- An array with the emittance value in each slice in units of m * rad.
- An array with the statistical weight of each slice.
- An array with the slice edges.
- A float with the weigthed average of the slice values.
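
    Examples
    --------
    Illustrative sketch with a randomly generated bunch (all numbers below
    are hypothetical)::

        z = np.random.normal(0., 1e-6, 10000)
        x = np.random.normal(0., 1e-6, 10000)
        px = np.random.normal(0., 1e-2, 10000)
        w = np.ones(10000)
        em_sl, w_sl, lims, em_avg = normalized_transverse_rms_slice_emittance(
            z, x, px, w=w, n_slices=5)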
"""
if disp_corrected:
# remove x-gamma correlation
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
dgamma = (gamma - gamma_avg)/gamma_avg
x = remove_correlation(dgamma, x, w, corr_order)
slice_lims, n_slices = create_beam_slices(z, n_slices, len_slice)
slice_em = np.zeros(n_slices)
slice_weight = np.zeros(n_slices)
for i in np.arange(0, n_slices):
a = slice_lims[i]
b = slice_lims[i+1]
slice_particle_filter = (z > a) & (z <= b)
if slice_particle_filter.any():
x_slice = x[slice_particle_filter]
px_slice = px[slice_particle_filter]
if hasattr(w, '__iter__'):
w_slice = w[slice_particle_filter]
else:
w_slice = w
slice_em[i] = normalized_transverse_rms_emittance(
x_slice, px_slice, w=w_slice)
slice_weight[i] = np.sum(w_slice)
slice_avg = calculate_slice_average(slice_em, slice_weight)
return slice_em, slice_weight, slice_lims, slice_avg
def slice_twiss_parameters(
z, x, px, pz, py=None, w=None, disp_corrected=False, corr_order=1,
n_slices=10, len_slice=None):
"""Calculate the Twiss parameters for each longitudinal slice of the
particle distribution in a given plane.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
x : array
Contains the transverse position of the particles in one of the
transverse planes in units of meters
    px : array
        Contains the transverse momentum of the beam particles in the same
        plane as x in non-dimensional units (beta*gamma)
    pz : array
        Contains the longitudinal momentum of the beam particles in
        non-dimensional units (beta*gamma).
    py : array
        Contains the transverse momentum of the beam particles in the
        transverse plane opposite to x in non-dimensional units
        (beta*gamma). Necessary if disp_corrected=True.
    w : array or single value
        Statistical weight of the particles.
    disp_corrected : bool
        Whether or not to correct for dispersion contributions.
    corr_order : int
        Highest order up to which dispersion effects should be corrected.
    n_slices : int
        Number of longitudinal slices in which to divide the particle
        distribution. Not used if len_slice is specified.
    len_slice : float
        Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A tuple containing:
- A list with the arrays of the alpha, beta [m] and gamma [m^-1] functions.
- An array with the statistical weight of each slice.
- An array with the slice edges.
- A list with the weighted average slice values of alpha, beta and gamma.
"""
if disp_corrected:
# remove x-gamma correlation
gamma = np.sqrt(1 + np.square(px) + np.square(py) + np.square(pz))
gamma_avg = np.average(gamma, weights=w)
dgamma = (gamma - gamma_avg)/gamma_avg
x = remove_correlation(dgamma, x, w, corr_order)
slice_lims, n_slices = create_beam_slices(z, n_slices, len_slice)
slice_alpha = np.zeros(n_slices)
slice_beta = np.zeros(n_slices)
slice_gamma = np.zeros(n_slices)
slice_weight = np.zeros(n_slices)
for i in np.arange(0, n_slices):
a = slice_lims[i]
b = slice_lims[i+1]
slice_particle_filter = (z > a) & (z <= b)
if slice_particle_filter.any():
x_slice = x[slice_particle_filter]
px_slice = px[slice_particle_filter]
pz_slice = pz[slice_particle_filter]
if hasattr(w, '__iter__'):
w_slice = w[slice_particle_filter]
else:
w_slice = w
slice_alpha[i], slice_beta[i], slice_gamma[i] = twiss_parameters(
x_slice, px_slice, pz_slice, w=w_slice)
slice_weight[i] = np.sum(w_slice)
slice_twiss_params = [slice_alpha, slice_beta, slice_gamma]
alpha_avg = calculate_slice_average(slice_alpha, slice_weight)
beta_avg = calculate_slice_average(slice_beta, slice_weight)
gamma_avg = calculate_slice_average(slice_gamma, slice_weight)
slice_avgs = [alpha_avg, beta_avg, gamma_avg]
return slice_twiss_params, slice_weight, slice_lims, slice_avgs
def energy_profile(z, px, py, pz, w=None, n_slices=10, len_slice=None):
"""Calculate the sliced longitudinal energy profile of the distribution
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
    px : array
        Contains the transverse momentum in the x direction of the
        beam particles in non-dimensional units (beta*gamma)
    py : array
        Contains the transverse momentum in the y direction of the
        beam particles in non-dimensional units (beta*gamma)
    pz : array
        Contains the longitudinal momentum of the beam particles in
        non-dimensional units (beta*gamma)
    w : array or single value
        Statistical weight of the particles.
    n_slices : int
        Number of longitudinal slices in which to divide the particle
        distribution. Not used if len_slice is specified.
    len_slice : float
        Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A tuple containing:
- An array with the mean energy value in each slice.
- An array with the statistical weight of each slice.
- An array with the slice edges.
"""
slice_lims, n_slices = create_beam_slices(z, n_slices, len_slice)
slice_ene = np.zeros(n_slices)
slice_weight = np.zeros(n_slices)
for i in np.arange(0, n_slices):
a = slice_lims[i]
b = slice_lims[i+1]
slice_particle_filter = (z > a) & (z <= b)
if slice_particle_filter.any():
px_slice = px[slice_particle_filter]
py_slice = py[slice_particle_filter]
pz_slice = pz[slice_particle_filter]
if hasattr(w, '__iter__'):
w_slice = w[slice_particle_filter]
else:
w_slice = w
slice_ene[i] = mean_energy(px_slice, py_slice, pz_slice, w_slice)
slice_weight[i] = np.sum(w_slice)
return slice_ene, slice_weight, slice_lims
def current_profile(z, q, n_slices=10, len_slice=None):
"""Calculate the current profile of the given particle distribution.
Parameters
----------
z : array
Contains the longitudinal position of the particles in units of meters
q : array
Contains the charge of the particles in C
    n_slices : int
        Number of longitudinal slices in which to divide the particle
        distribution. Not used if len_slice is specified.
    len_slice : float
        Length of the longitudinal slices. If not None, replaces n_slices.
Returns
-------
A tuple containing:
- An array with the current of each slice in units of A.
- An array with the slice edges along z.
"""
slice_lims, n_slices = create_beam_slices(z, n_slices, len_slice)
sl_len = slice_lims[1] - slice_lims[0]
charge_hist, z_edges = np.histogram(z, bins=n_slices, weights=q)
sl_dur = sl_len/ct.c
current_prof = charge_hist/sl_dur
return current_prof, z_edges
def energy_spectrum(px, py, pz, w=None, bins=10):
"""Calculate the energy spectrum (histogram) of the given particle
distribution.
Parameters
----------
    px : array
        Contains the transverse momentum in the x direction of the
        beam particles in non-dimensional units (beta*gamma)
    py : array
        Contains the transverse momentum in the y direction of the
        beam particles in non-dimensional units (beta*gamma)
    pz : array
        Contains the longitudinal momentum of the beam particles in
        non-dimensional units (beta*gamma)
w : array or single value
Statistical weight of the particles.
bins : int
Number of bins of the histogram.
Returns
-------
A tuple containing:
- An array with the energy histogram (normalized to 1).
- An array with the bin edges of the histogram.
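
    Examples
    --------
    Illustrative sketch (all numbers below are hypothetical)::

        px = np.zeros(10000)
        py = np.zeros(10000)
        pz = np.random.normal(200., 2., 10000)
        w = np.ones(10000)
        hist, edges = energy_spectrum(px, py, pz, w=w, bins=50)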
"""
if w is not None:
w = np.abs(w)
gamma = np.sqrt(1 + px**2 + py**2 + pz**2)
    ene_hist, bin_edges = np.histogram(gamma, bins=bins, weights=w)
    # Avoid in-place true division, which fails on the integer histogram
    # returned by np.histogram when no weights are given.
    ene_hist = ene_hist / np.max(ene_hist)
return ene_hist, bin_edges
def general_analysis(x, y, z, px, py, pz, q, n_slices=10, len_slice=None,
print_params=False):
"""Quick method to analyze the most relevant beam parameters at once.
Parameters
----------
x : array
Contains the transverse position of the particles in the x
transverse plane in units of meter
y : array
Contains the transverse position of the particles in the y
transverse plane in units of meter
z : array
Contains the longitudinal position of the particles in units of meters
    px : array
        Contains the transverse momentum of the beam particles in the x
        direction in non-dimensional units (beta*gamma)
    py : array
        Contains the transverse momentum of the beam particles in the y
        direction in non-dimensional units (beta*gamma)
    pz : array
        Contains the longitudinal momentum of the beam particles in
        non-dimensional units (beta*gamma).
    q : array
        Charge of the particles in Coulomb.
    n_slices : int
        Number of longitudinal slices in which to divide the particle
        distribution. Not used if len_slice is specified.
    len_slice : float
        Length of the longitudinal slices. If not None, replaces n_slices.
print_params : bool
If True, the parameters of the analyzed distribution will also be
printed.
Returns
-------
    A dictionary containing the centroid position, pointing angle, Twiss
    parameters, bunch length, divergence, energy, peak current, charge and
    the total and slice emittance and energy spread.
"""
q_tot = np.sum(q)
a_x, b_x, g_x = twiss_parameters(x, px, pz, py, w=q)
a_y, b_y, g_y = twiss_parameters(y, py, pz, px, w=q)
ene = mean_energy(px, py, pz, w=q)
ene_sp = relative_rms_energy_spread(px, py, pz, w=q)
enespls, sl_w, sl_lim, ene_sp_sl = relative_rms_slice_energy_spread(
z, px, py, pz, w=q, n_slices=n_slices, len_slice=len_slice)
em_x = normalized_transverse_rms_emittance(x, px, py, pz, w=q)
em_y = normalized_transverse_rms_emittance(y, py, px, pz, w=q)
emsx, sl_w, sl_lim, em_sl_x = normalized_transverse_rms_slice_emittance(
z, x, px, py, pz, w=q, n_slices=n_slices, len_slice=len_slice)
emsy, sl_w, sl_lim, em_sl_y = normalized_transverse_rms_slice_emittance(
z, y, py, px, pz, w=q, n_slices=n_slices, len_slice=len_slice)
i_peak = peak_current(z, q, n_slices=n_slices, len_slice=len_slice)
z_fwhm = fwhm_length(z, q, n_slices=n_slices, len_slice=len_slice)
s_z = rms_length(z, w=q)
s_x = rms_size(x, w=q)
s_y = rms_size(y, w=q)
s_px = np.std(px/pz)
s_py = np.std(py/pz)
x_centroid = np.average(x, weights=q)
y_centroid = np.average(y, weights=q)
z_centroid = np.average(z, weights=q)
px_centroid = np.average(px, weights=q)
py_centroid = np.average(py, weights=q)
theta_x = px_centroid/ene
theta_y = py_centroid/ene
params_dict = {
'x_avg': x_centroid,
'y_avg': y_centroid,
'z_avg': z_centroid,
'theta_x': theta_x,
'theta_y': theta_y,
'sigma_x': s_x,
'sigma_y': s_y,
'sigma_z': s_z,
'z_fwhm': z_fwhm,
'sigma_px': s_px,
'sigma_py': s_py,
'alpha_x': a_x,
'alpha_y': a_y,
'beta_x': b_x,
'beta_y': b_y,
'gamma_x': g_x,
'gamma_y': g_y,
'emitt_nx': em_x,
'emitt_ny': em_y,
'emitt_nx_sl': em_sl_x,
'emitt_ny_sl': em_sl_y,
'ene_avg': ene,
'rel_ene_sp': ene_sp,
'rel_ene_sp_sl': ene_sp_sl,
'i_peak': i_peak,
'q': q_tot
}
if print_params:
        print('Parameters of particle distribution:')
print('-'*80)
print('number of particles = {}'.format(len(x)))
print('q_tot = {:1.2e} C'.format(q_tot))
print('alpha_x = {:1.2e}, alpha_y = {:1.2e}'.format(a_x, a_y))
print('beta_x = {:1.2e} m, beta_y = {:1.2e} m'.format(b_x, b_y))
print('gamma_x = {:1.2e} 1/m, gamma_y = {:1.2e} 1/m'.format(g_x, g_y))
print('sigma_x = {:1.2e} m, sigma_y = {:1.2e} m'.format(s_x, s_y))
print('sigma_z = {:1.2e} m (sigma_t = {:1.2e} s)'.format(
s_z, s_z/ct.c))
print('z_fwhm = {:1.2e} m (t_fwhm = {:1.2e} s)'.format(
z_fwhm, z_fwhm/ct.c))
print('i_peak = {:1.2e} kA'.format(i_peak*1e-3))
print('norm_emitt_x = {:1.2e} m, norm_emitt_y = {:1.2e} m'.format(
em_x, em_y))
print(('norm_emitt_x_sl = {:1.2e} m, '
+ 'norm_emitt_y_sl = {:1.2e} m').format(em_sl_x, em_sl_y))
print('gamma_avg = {:1.2e} (ene_avg = {:1.2e} MeV)'.format(
ene, ene*ct.m_e*ct.c**2/ct.e*1e-6))
print('gamma_spread = {:1.2e} ({:1.2e} %)'.format(ene_sp, ene_sp*100))
print('gamma_spread_sl = {:1.2e} ({:1.2e} %)'.format(
ene_sp_sl, ene_sp_sl*100))
print('-'*80)
return params_dict | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/data_analysis/beam_diagnostics.py | beam_diagnostics.py |
import numpy as np
from aptools.data_analysis.beam_diagnostics import twiss_parameters
def modify_twiss_parameters_all_beam(beam_data, betax_target=None,
alphax_target=None, gammax_target=None,
betay_target=None, alphay_target=None,
gammay_target=None):
"""
Modifies the transverse distribution of a particle bunch in both transverse
planes so that it has the specified Twiss parameters.
Parameters
----------
beam_data: iterable
List, tuple or array containing the beam data arrays as
[x, y, z, px, py, pz, q].
betax_target: float
Target beta in the x-plane (horizontal) of the resulting distribution.
Not necessary if alphax_target and gammax_target are already provided.
alphax_target: float
Target alpha in the x-plane (horizontal) of the resulting distribution.
Not necessary if betax_target and gammax_target are already provided.
gammax_target: float
Target gamma in the x-plane (horizontal) of the resulting distribution.
Not necessary if betax_target and alphax_target are already provided.
betay_target: float
Target beta in the y-plane (vertical) of the resulting distribution.
Not necessary if alphay_target and gammay_target are already provided.
alphay_target: float
Target alpha in the y-plane (vertical) of the resulting distribution.
Not necessary if betay_target and gammay_target are already provided.
gammay_target: float
Target gamma in the y-plane (vertical) of the resulting distribution.
Not necessary if betay_target and alphay_target are already provided.
Returns
-------
A tuple with 7 arrays containing the 6D components and charge of the
modified distribution.
"""
x, y, z, px, py, pz, q = beam_data
x, px = modify_twiss_parameters(x, px, pz, q, betax_target, alphax_target,
gammax_target)
y, py = modify_twiss_parameters(y, py, pz, q, betay_target, alphay_target,
gammay_target)
return x, y, z, px, py, pz, q
def modify_twiss_parameters(x, px, pz, weights=None, beta_target=None,
alpha_target=None, gamma_target=None):
"""
Modifies the transverse distribution of a particle bunch so that it has
the specified Twiss parameters.
Parameters
----------
x: array
Transverse position of the particles in meters.
px: array
        Transverse momentum of the particles in the same plane as the x-array.
        The momentum should be in non-dimensional units (beta*gamma).
    pz: array
        Longitudinal momentum of the beam particles in non-dimensional units
        (beta*gamma).
weights: array
Statistical weight of the particles.
beta_target: float
Target beta of the resulting distribution. Not necessary if
alpha_target and gamma_target are already provided.
alpha_target: float
Target alpha of the resulting distribution. Not necessary if
beta_target and gamma_target are already provided.
gamma_target: float
Target gamma of the resulting distribution. Not necessary if
beta_target and alpha_target are already provided.
Returns
-------
A tuple with the new x and px arrays satisfying the target Twiss
parameters.
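
    Examples
    --------
    Illustrative sketch (all numbers below are hypothetical)::

        x = np.random.normal(0., 1e-6, 10000)
        px = np.random.normal(0., 1e-2, 10000)
        pz = np.random.normal(200., 2., 10000)
        x_new, px_new = modify_twiss_parameters(
            x, px, pz, beta_target=0.05, alpha_target=0.)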
"""
# Check target Twiss parameters
if beta_target is not None:
bx = beta_target
if alpha_target is not None:
ax = alpha_target
gx = (1+ax**2)/bx
elif gamma_target is not None:
gx = gamma_target
ax = np.sqrt(bx*gx-1)
else:
print('Not enough parameters, please specify also a target alpha'
' or gamma value.')
return x, px
elif alpha_target is not None:
ax = alpha_target
if gamma_target is not None:
gx = gamma_target
bx = (1+ax**2)/gx
else:
print('Not enough parameters, please specify also a target beta or'
' gamma value.')
return x, px
elif gamma_target is not None:
print('Not enough parameters, please specify also a target beta or'
' alpha value.')
return x, px
else:
print('No target Twiss parameters specified. Please provide at least'
' two.')
return x, px
# Calculate initial Twiss parameters
ax_0, bx_0, gx_0 = twiss_parameters(x, px, pz, w=weights)
# Calculate transform matrix assuming M12=0
M11 = np.sqrt(bx/bx_0)
M22 = np.sqrt(bx_0/bx)
M21 = (M22*ax_0 - ax/M11)/bx_0
M = np.zeros((2, 2))
M[0, 0] = M11
M[1, 0] = M21
M[1, 1] = M22
# Apply transform matrix
xp = px/pz
x_new, xp_new = M.dot(np.vstack((x, xp)))
px_new = xp_new*pz
return x_new, px_new | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/data_processing/beam_operations.py | beam_operations.py |
import numpy as np
from aptools.helper_functions import weighted_std
def filter_beam(beam_matrix, min_range, max_range):
"""Filter the beam particles keeping only those within a given range.
Parameters
----------
beam_matrix : array
M x N matrix containing all M components of the N particles.
min_range : array
Array of size M with the minimum value for each given component. For
values which are 'None' no filtering is performed.
max_range : array
Array of size M with the maximum value for each given component. For
values which are 'None' no filtering is performed.
Returns
-------
A M x N' matrix containing the particles within the given range. N' is the
number of particles after filtering.
"""
elements_to_keep = np.ones_like(beam_matrix[0])
for i, arr in enumerate(beam_matrix):
if min_range[i] is not None:
elements_to_keep = np.where(
arr < min_range[i], 0, elements_to_keep)
if max_range[i] is not None:
elements_to_keep = np.where(
arr > max_range[i], 0, elements_to_keep)
elements_to_keep = np.array(elements_to_keep, dtype=bool)
return beam_matrix[:, elements_to_keep]
def filter_beam_sigma(beam_matrix, max_sigma, w=None):
"""Filter the beam particles keeping only those within a certain number of
sigmas.
Parameters
----------
beam_matrix : array
M x N matrix containing all M components of the N particles.
max_sigma : array
Array of size M with the maximum number of sigmas of each component.
        For each component x, particles with x < x_avg - x_std * max_sigma or
        x > x_avg + x_std * max_sigma will be discarded. If max_sigma is None
        for some component, no filtering is performed.
w : array
Statistical weight of the particles.
Returns
-------
A M x N' matrix containing the particles within the given range. N' is the
number of particles after filtering.
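
    Examples
    --------
    Illustrative sketch (all numbers below are hypothetical)::

        beam = np.random.normal(size=(6, 10000))
        beam_filtered = filter_beam_sigma(
            beam, max_sigma=[3., 3., None, None, None, None])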
"""
elements_to_keep = np.ones_like(beam_matrix[0])
for i, arr in enumerate(beam_matrix):
if max_sigma[i] is not None:
arr_mean = np.average(arr, weights=w)
arr_std = weighted_std(arr, weights=w)
min_range = arr_mean - max_sigma[i] * arr_std
max_range = arr_mean + max_sigma[i] * arr_std
elements_to_keep = np.where(
(arr < min_range) | (arr > max_range), 0, elements_to_keep)
elements_to_keep = np.array(elements_to_keep, dtype=bool)
return beam_matrix[:, elements_to_keep] | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/data_processing/beam_filtering.py | beam_filtering.py |
import numpy as np
import scipy.constants as ct
from scipy.stats import truncnorm
import aptools.data_handling.reading as dr
import aptools.data_handling.saving as ds
import aptools.data_analysis.beam_diagnostics as bd
import aptools.data_processing.beam_operations as bo
def generate_gaussian_bunch_from_twiss(
a_x, a_y, b_x, b_y, en_x, en_y, ene, ene_sp, s_t, q_tot, n_part, x_c=0,
y_c=0, z_c=0, lon_profile='gauss', min_len_scale_noise=None,
sigma_trunc_lon=None, smooth_sigma=None, smooth_trunc=None,
save_to_file=False, save_to_code='astra',
save_to_path=None, file_name=None, perform_checks=False):
"""
Creates a transversely Gaussian particle bunch with the specified Twiss
parameters.
Parameters
----------
a_x : float
Alpha parameter in the x-plane.
a_y : float
Alpha parameter in the y-plane.
b_x : float
Beta parameter in the x-plane in units of m.
b_y : float
Beta parameter in the y-plane in units of m.
en_x : float
Normalized trace-space emittance in the x-plane in units of m*rad.
en_y : float
Normalized trace-space emittance in the y-plane in units of m*rad.
ene: float
        Mean bunch energy in non-dimensional units (beta*gamma).
    ene_sp: float
        Relative energy spread in %.
    s_t: float
        Bunch duration in seconds. If lon_profile='gauss', this corresponds
        to the RMS duration. If lon_profile='flattop' or
        lon_profile='flattop_smoothed', this is instead the full flat-top
        length.
q_tot: float
Total bunch charge in C.
n_part: int
Total number of particles in the bunch.
x_c: float
Central bunch position in the x-plane in units of m.
y_c: float
Central bunch position in the y-plane in units of m.
z_c: float
Central bunch position in the z-plane in units of m.
    lon_profile: string
        Longitudinal profile of the bunch. Possible values are 'gauss',
        'flattop' and 'flattop_smoothed'.
min_len_scale_noise: float
(optional) If specified, a different algorithm to generate a less noisy
longitudinal profile is used. This algorithm creates a profile that is
smooth for a longitudinal binning of the bunch with
bin lengths >= min_len_scale_noise
sigma_trunc_lon: float
(optional) If specified, it truncates the longitudinal distribution of
the bunch between [z_c-sigma_trunc_lon*s_z, z_c+sigma_trunc_lon*s_z].
Only used when lon_profile = 'gauss' and required if
min_len_scale_noise is specified.
smooth_sigma: float
The sigma of the Gaussian longitudinal smoothing applied to the
flat-top profile when lon_profile='flattop_smoothed'. Units are in
seconds.
smooth_trunc: float
Number of sigmas after which to truncate the Gaussian smoothing when
lon_profile='flattop_smoothed'
save_to_file: bool
Whether to save the generated distribution to a file.
save_to_code: string
(optional) Name of the target code that will use the saved file.
Possible values are 'csrtrack', 'astra' and 'fbpic'. Required if
save_to_file=True.
save_to_path: string
(optional) Path to the folder where to save the data. Required if
save_to_file=True.
file_name: string
(optional) Name of the file where to store the beam data. Required if
save_to_file=True.
perform_checks: bool
Whether to compute and print the parameters of the generated bunch.
Returns
-------
The 6D components and charge of the bunch in 7 arrays.
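
    Examples
    --------
    Illustrative sketch (all numbers below are hypothetical)::

        x, y, z, px, py, pz, q = generate_gaussian_bunch_from_twiss(
            a_x=0., a_y=0., b_x=1e-3, b_y=1e-3, en_x=1e-6, en_y=1e-6,
            ene=600., ene_sp=0.1, s_t=10e-15, q_tot=30e-12, n_part=1e4)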
"""
print('Generating particle distribution... ', end='')
# Calculate necessary values
n_part = int(n_part)
ene_sp = ene_sp/100
ene_sp_abs = ene_sp*ene
s_z = s_t*ct.c
em_x = en_x/ene
em_y = en_y/ene
g_x = (1+a_x**2)/b_x
g_y = (1+a_y**2)/b_y
s_x = np.sqrt(em_x*b_x)
s_y = np.sqrt(em_y*b_y)
s_xp = np.sqrt(em_x*g_x)
s_yp = np.sqrt(em_y*g_y)
p_x = -a_x*em_x/(s_x*s_xp)
p_y = -a_y*em_y/(s_y*s_yp)
# Create longitudinal distributions
if lon_profile == 'gauss':
z = _create_gaussian_longitudinal_profile(z_c, s_z, n_part,
sigma_trunc_lon,
min_len_scale_noise)
elif lon_profile == 'flattop':
z = _create_flattop_longitudinal_profile(z_c, s_z, n_part,
min_len_scale_noise)
elif lon_profile == 'flattop_smoothed':
z = _create_flattop_longitudinal_profile_with_smoothing(
z_c, s_z, n_part, min_len_scale_noise, smooth_sigma, smooth_trunc)
    # Define again n_part in case it changed when creating the long. profile
n_part = len(z)
pz = np.random.normal(ene, ene_sp_abs, n_part)
# Create normalized gaussian distributions
u_x = np.random.standard_normal(n_part)
v_x = np.random.standard_normal(n_part)
u_y = np.random.standard_normal(n_part)
v_y = np.random.standard_normal(n_part)
# Calculate transverse particle distributions
x = s_x*u_x
xp = s_xp*(p_x*u_x + np.sqrt(1-np.square(p_x))*v_x)
y = s_y*u_y
yp = s_yp*(p_y*u_y + np.sqrt(1-np.square(p_y))*v_y)
# Change from slope to momentum
px = xp*pz
py = yp*pz
# Charge
q = np.ones(n_part)*(q_tot/n_part)
print('Done.')
# Save to file
if save_to_file:
print('Saving to file... ', end='')
ds.save_beam(
save_to_code, [x, y, z, px, py, pz, q], save_to_path, file_name)
print('Done.')
if perform_checks:
_check_beam_parameters(x, y, z, px, py, pz, q)
return x, y, z, px, py, pz, q
def generate_from_file_modifying_twiss(
code_name, file_path, read_kwargs={}, alphax_target=None,
betax_target=None, alphay_target=None, betay_target=None,
save_to_file=False, save_to_code='astra', save_to_path=None,
file_name=None, save_kwargs={}, perform_checks=False):
"""
    Reads a particle distribution from a file and modifies its Twiss
    parameters in both transverse planes.
Parameters
----------
code_name: str
Name of the tracking or PIC code of the data to read. Possible values
are 'csrtrack', 'astra', 'openpmd', 'osiris' and 'hipace'
file_path: str
Path of the file containing the data
    read_kwargs: dict
        Dictionary containing optional parameters for the read_beam function.
    alphax_target: float
        Target alpha in the x-plane (horizontal) of the resulting
        distribution.
    betax_target: float
        Target beta in the x-plane (horizontal) of the resulting
        distribution.
    alphay_target: float
        Target alpha in the y-plane (vertical) of the resulting distribution.
    betay_target: float
        Target beta in the y-plane (vertical) of the resulting distribution.
save_to_file: bool
Whether to save the generated distribution to a file.
save_to_code: string
(optional) Name of the target code that will use the saved file.
Possible values are 'csrtrack', 'astra' and 'fbpic'. Required if
save_to_file=True.
save_to_path: string
(optional) Path to the folder where to save the data. Required if
save_to_file=True.
file_name: string
(optional) Name of the file where to store the beam data. Required if
save_to_file=True.
save_kwargs: dict
Dictionary containing optional parameters for the save_beam function.
perform_checks: bool
Whether to compute and print the parameters of the generated bunch.
Returns
-------
A tuple with 7 arrays containing the 6D components and charge of the
modified distribution.
"""
# Read distribution
x, y, z, px, py, pz, q = dr.read_beam(code_name, file_path, **read_kwargs)
# Modify Twiss parameters
x, y, z, px, py, pz, q = bo.modify_twiss_parameters_all_beam(
[x, y, z, px, py, pz, q], alphax_target=alphax_target,
betax_target=betax_target, alphay_target=alphay_target,
betay_target=betay_target)
# Save to file
if save_to_file:
print('Saving to file... ', end='')
ds.save_beam(
save_to_code, [x, y, z, px, py, pz, q], save_to_path, file_name,
**save_kwargs)
print('Done.')
# Perform checks
if perform_checks:
_check_beam_parameters(x, y, z, px, py, pz, q)
return x, y, z, px, py, pz, q
def _create_gaussian_longitudinal_profile(z_c, s_z, n_part, sigma_trunc_lon,
min_len_scale_noise):
""" Creates a Gaussian longitudinal profile """
# Make sure number of particles is an integer
n_part = int(n_part)
if min_len_scale_noise is None:
if sigma_trunc_lon is not None:
z = truncnorm.rvs(-sigma_trunc_lon, sigma_trunc_lon, loc=z_c,
scale=s_z, size=n_part)
else:
z = np.random.normal(z_c, s_z, n_part)
else:
tot_len = 2*sigma_trunc_lon*s_z
n_slices = int(np.round(tot_len/(min_len_scale_noise)))
part_per_slice = 2*sigma_trunc_lon*n_part/n_slices * truncnorm.pdf(
np.linspace(-sigma_trunc_lon, sigma_trunc_lon, n_slices),
-sigma_trunc_lon, sigma_trunc_lon)
part_per_slice = part_per_slice.astype(int)
slice_edges = np.linspace(z_c-sigma_trunc_lon*s_z,
z_c+sigma_trunc_lon*s_z,
n_slices+1)
z = _create_smooth_z_array(part_per_slice, slice_edges)
return z
def _create_flattop_longitudinal_profile(z_c, length, n_part,
min_len_scale_noise):
""" Creates a flattop longitudinal profile """
# Make sure number of particles is an integer
n_part = int(n_part)
if min_len_scale_noise is None:
z = np.random.uniform(z_c-length/2, z_c+length/2, n_part)
else:
n_slices = int(np.round(length/(min_len_scale_noise)))
part_per_slice = np.round(np.ones(n_slices)*n_part/n_slices)
part_per_slice = part_per_slice.astype(int)
slice_edges = np.linspace(z_c-length/2, z_c+length/2, n_slices+1)
z = _create_smooth_z_array(part_per_slice, slice_edges)
return z
def _create_flattop_longitudinal_profile_with_smoothing(z_c, length, n_part,
min_len_scale_noise,
smooth_sigma,
smooth_trunc):
""" Creates a flattop longitudinal profile with gaussian smoothing at the
head and tail"""
# Number of particles
smooth_sigma = ct.c*smooth_sigma
n_plat = n_part * length/(length+np.sqrt(2*np.pi*smooth_sigma**2))
n_smooth = n_part * (np.sqrt(2*np.pi*smooth_sigma**2)
/ (length+np.sqrt(2*np.pi*smooth_sigma**2)))
# Create flattop and gaussian profiles
z_plat = _create_flattop_longitudinal_profile(length/2, length, n_plat,
min_len_scale_noise)
z_smooth = _create_gaussian_longitudinal_profile(0, smooth_sigma, n_smooth,
smooth_trunc,
min_len_scale_noise)
# Concatenate both profiles
z = np.concatenate((z_smooth[np.where(z_smooth <= 0)],
z_plat,
z_smooth[np.where(z_smooth > 0)] + length))
# Center distribution around desired position
z = z - length/2 + z_c
return z
def _create_smooth_z_array(part_per_slice, slice_edges):
""" Creates the z array of the distribution when forced to be smooth """
z = np.array([])
for i, part in enumerate(part_per_slice):
z_sl = np.linspace(slice_edges[i], slice_edges[i+1], part+2)[1:-1]
np.random.shuffle(z_sl)
z = np.concatenate((z, z_sl))
return z
def _check_beam_parameters(x, y, z, px, py, pz, q):
""" Analyzes and prints the parameters of the generated distribution """
print('Performing checks... ', end='')
beam_params = bd.general_analysis(x, y, z, px, py, pz, q)
b_x = beam_params['beta_x']
b_y = beam_params['beta_y']
a_x = beam_params['alpha_x']
a_y = beam_params['alpha_y']
s_x = beam_params['sigma_x']
s_y = beam_params['sigma_y']
s_z = beam_params['sigma_z']
em_x = beam_params['emitt_nx']
em_y = beam_params['emitt_ny']
ene = beam_params['ene_avg']
ene_sp = beam_params['rel_ene_sp']
print('Done.')
print('Generated beam with:')
print('-'*80)
print('alpha_x = {:1.2e}, alpha_y = {:1.2e}'.format(a_x, a_y))
print('beta_x = {:1.2e} m, beta_y = {:1.2e} m'.format(b_x, b_y))
print('sigma_x = {:1.2e} m, sigma_y = {:1.2e} m'.format(s_x, s_y))
print('sigma_z = {:1.2e} m (sigma_t = {:1.2e} s)'.format(s_z, s_z/ct.c))
print('norm_emitt_x = {:1.2e} m, norm_emitt_y = {:1.2e} m'.format(em_x,
em_y))
print('gamma_avg = {:1.2e}, gamma_spread = {:1.2e}'.format(ene, ene_sp))
print('-'*80) | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/utilities/bunch_generation.py | bunch_generation.py |
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.gridspec as gs
from matplotlib import ticker
from matplotlib.colorbar import Colorbar
import matplotlib.patheffects as path_effects
from matplotlib import colors
def add_projection(
x, bins, main_ax, subplot_spec, fig, orientation='horizontal'):
x_proj, x_bins = np.histogram(x, bins=bins)
x_pos = x_bins[1:] - (x_bins[1]-x_bins[0])
if orientation == 'horizontal':
gs_p = gs.GridSpecFromSubplotSpec(
2, 1, subplot_spec, height_ratios=[1, 0.2])
ax_p = fig.add_subplot(gs_p[-1])
elif orientation == 'vertical':
gs_p = gs.GridSpecFromSubplotSpec(
1, 2, subplot_spec, width_ratios=[0.2, 1])
ax_p = fig.add_subplot(gs_p[0])
ax_p.patch.set_alpha(0)
if orientation == 'horizontal':
ax_p.plot(x_pos, x_proj, c='k', lw=0.5, alpha=0.5)
ax_p.fill_between(x_pos, x_proj, facecolor='tab:gray', alpha=0.3)
xlim = main_ax.get_xlim()
ax_p.set_xlim(xlim)
ylim = list(ax_p.get_ylim())
ylim[0] = 0
ax_p.set_ylim(ylim)
elif orientation == 'vertical':
ax_p.plot(x_proj, x_pos, c='k', lw=0.5, alpha=0.5)
ax_p.fill_betweenx(x_pos, x_proj, facecolor='tab:gray', alpha=0.3)
ylim = main_ax.get_ylim()
ax_p.set_ylim(ylim)
xlim = list(ax_p.get_xlim())
xlim[0] = 0
ax_p.set_xlim(xlim)
ax_p.axis('off')
def create_vertical_colorbars(
images, labels, subplot_spec, fig=None, n_ticks=3, **kwargs):
if not isinstance(images, list):
images = [images]
if not isinstance(labels, list):
labels = [labels]
n_cbars = len(images)
cbar_gs = gs.GridSpecFromSubplotSpec(
n_cbars, 1, subplot_spec=subplot_spec, **kwargs)
if fig is None:
fig = plt.gcf()
for image, label, cbar_ss in zip(images, labels, cbar_gs):
ax = fig.add_subplot(cbar_ss)
tick_locator = ticker.MaxNLocator(nbins=n_ticks)
Colorbar(ax, image, ticks=tick_locator, label=label)
def add_text(ax, x, y, text, **kwargs):
fc = colors.to_rgba('white')
ec = colors.to_rgba('tab:gray')
bbox = dict(
boxstyle="round",
ec=ec,
fc=fc,
alpha=0.7
)
label = ax.text(
x, y, text, transform=ax.transAxes, fontsize=8, bbox=bbox, **kwargs)
label.set_path_effects(
[path_effects.Stroke(linewidth=1, foreground='white'),
path_effects.Normal()]) | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/plotting/utils.py | utils.py |
import numpy as np
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import matplotlib.patheffects as path_effects
import scipy.constants as ct
import aptools.data_analysis.beam_diagnostics as bd
from aptools.data_handling.reading import read_beam
from aptools.plotting.plot_types import scatter_histogram
from aptools.plotting.rc_params import rc_params
from aptools.plotting.utils import (
add_projection, create_vertical_colorbars, add_text)
from aptools.helper_functions import (
get_only_statistically_relevant_slices, weighted_std)
def phase_space_overview_from_file(
code_name, file_path, rasterized_scatter=None, show=True, **kwargs):
x, y, z, px, py, pz, q = read_beam(code_name, file_path, **kwargs)
phase_space_overview(x, y, z, px, py, pz, q,
rasterized_scatter=rasterized_scatter,
show=show)
def phase_space_overview(x, y, z, px, py, pz, q, rasterized_scatter=None,
                         show=True):
    """Plot the x-px, y-py and z-pz phase spaces annotated with key beam
    parameters."""
em_x = bd.normalized_transverse_rms_emittance(x, px, w=q) * 1e6
em_y = bd.normalized_transverse_rms_emittance(y, py, w=q) * 1e6
a_x, b_x, g_x = bd.twiss_parameters(x, px, pz, w=q)
a_y, b_y, g_y = bd.twiss_parameters(y, py, pz, w=q)
s_x = bd.rms_size(x, w=q)
s_y = bd.rms_size(y, w=q)
em_l = bd.longitudinal_rms_emittance(z, px, py, pz, w=q) * 1e6
dz = z - np.average(z, weights=q)
s_z = bd.rms_length(z, w=q)
    s_g = bd.relative_rms_energy_spread(px, py, pz, w=q)
s_g_sl, w_sl, sl_ed, s_g_sl_av = bd.relative_rms_slice_energy_spread(
z, px, py, pz, w=q, n_slices=10)
c_prof, _ = bd.current_profile(z, q, n_slices=50)
    c_peak = np.max(np.abs(c_prof)) * 1e-3  # kA
# make plot
plt.figure(figsize=(8, 3))
text_labels = []
with plt.rc_context(rc_params):
# x - px
ax_1 = plt.subplot(131)
scatter_histogram(x*1e6, px, rasterized=rasterized_scatter)
plt.xlabel("x [$\\mu m$]")
plt.ylabel("$p_x \\ \\mathrm{[m_e c]}$")
text_labels += [
plt.text(0.1, 0.9, '$\\epsilon_{n,x} = $'
+ '{}'.format(np.around(em_x, 3))
+ '$\\ \\mathrm{\\mu m}$',
transform=ax_1.transAxes, fontsize=8),
plt.text(0.1, 0.8,
'$\\beta_{x} = $' + '{}'.format(np.around(b_x, 3))
                     + ' m', transform=ax_1.transAxes, fontsize=8),
plt.text(0.1, 0.7,
'$\\alpha_{x} = $' + '{}'.format(np.around(a_x, 3)),
transform=ax_1.transAxes, fontsize=8),
plt.text(0.1, 0.6, '$\\sigma_{x} = $'
+ '{}'.format(np.around(s_x*1e6, 3))
+ '$\\ \\mathrm{\\mu m}$', transform=ax_1.transAxes,
fontsize=8),
]
# y - py
ax_2 = plt.subplot(132)
scatter_histogram(y * 1e6, py, rasterized=rasterized_scatter)
plt.xlabel("y [$\\mu m$]")
plt.ylabel("$p_y \\ \\mathrm{[m_e c]}$")
text_labels += [
plt.text(0.1, 0.9, '$\\epsilon_{n,y} = $'
+ '{}'.format(np.around(em_y, 3))
+ '$\\ \\mathrm{\\mu m}$',
transform=ax_2.transAxes, fontsize=8),
plt.text(0.1, 0.8,
'$\\beta_{y} = $' + '{}'.format(np.around(b_y, 3))
                     + ' m', transform=ax_2.transAxes, fontsize=8),
plt.text(0.1, 0.7,
'$\\alpha_{y} = $' + '{}'.format(np.around(a_y, 3)),
transform=ax_2.transAxes, fontsize=8),
plt.text(0.1, 0.6, '$\\sigma_{y} = $'
+ '{}'.format(np.around(s_y*1e6, 3))
+ '$\\ \\mathrm{\\mu m}$', transform=ax_2.transAxes,
fontsize=8)
]
# z - pz
ax_3 = plt.subplot(133)
scatter_histogram(dz / ct.c * 1e15, pz, rasterized=rasterized_scatter)
plt.xlabel("$\\Delta z$ [fs]")
plt.ylabel("$p_z \\ \\mathrm{[m_e c]}$")
text_labels += [
plt.text(0.1, 0.9, '$\\epsilon_{L} = $'
+ '{}'.format(np.around(em_l, 3))
+ '$\\ \\mathrm{\\mu m}$', transform=ax_3.transAxes,
fontsize=8),
plt.text(0.1, 0.8, '$\\sigma_\\gamma/\\gamma=$'
+ '{}'.format(np.around(s_g*1e2, 3)) + '$\\%$',
transform=ax_3.transAxes, fontsize=8),
plt.text(0.1, 0.7, '$\\sigma^s_\\gamma/\\gamma=$'
+ '{}'.format(np.around(s_g_sl_av*1e2, 3)) + '$\\%$',
transform=ax_3.transAxes, fontsize=8),
plt.text(0.1, 0.6, '$\\sigma_z=$'
+ '{}'.format(np.around(s_z/ct.c*1e15, 3)) + ' fs',
transform=ax_3.transAxes, fontsize=8),
plt.text(0.1, 0.5, '$I_{peak}=$'
+ '{}'.format(np.around(c_peak, 2)) + ' kA',
transform=ax_3.transAxes, fontsize=8)
]
for label in text_labels:
label.set_path_effects(
[path_effects.Stroke(linewidth=1, foreground='white'),
path_effects.Normal()])
plt.tight_layout()
if show:
plt.show()
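# For reference, the normalized transverse RMS emittance annotated above
# reduces to sqrt(<x^2><p_x^2> - <x p_x>^2) with charge-weighted central
# moments. Standalone sketch (hypothetical name, not part of the module
# API) of what `bd.normalized_transverse_rms_emittance` is expected to
# evaluate:
def _example_normalized_emittance(x, px, w):
    """Return the normalized RMS emittance of an (x, px) distribution."""
    import numpy as np
    x_c = x - np.average(x, weights=w)
    px_c = px - np.average(px, weights=w)
    x2 = np.average(x_c ** 2, weights=w)
    px2 = np.average(px_c ** 2, weights=w)
    xpx = np.average(x_c * px_c, weights=w)
    return np.sqrt(x2 * px2 - xpx ** 2)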
def slice_analysis(x, y, z, px, py, pz, q, n_slices=50, len_slice=None,
                   ene_bins=50, left=0.125, right=0.875, top=0.98, bottom=0.13,
                   xlim=None, ylim=None, add_labels=False, include_twiss=False,
                   fig=None, rasterized_scatter=None, show=True):
    """Plot the longitudinal phase space together with the slice energy
    spread, slice emittance and, optionally, slice Twiss parameters."""
# analyze beam
current_prof, z_edges = bd.current_profile(z, q, n_slices=n_slices,
len_slice=len_slice)
ene_spectrum, ene_spec_edgs = bd.energy_spectrum(px, py, pz, w=q,
bins=ene_bins)
slice_ene, *_ = bd.energy_profile(
z, px, py, pz, w=q, n_slices=n_slices, len_slice=len_slice)
slice_ene_sp, *_ = bd.relative_rms_slice_energy_spread(
z, px, py, pz, w=q, n_slices=n_slices, len_slice=len_slice)
sl_tw, sl_w, *_ = bd.slice_twiss_parameters(
z, x, px, pz, w=q, n_slices=n_slices, len_slice=len_slice)
alpha_x, *_ = get_only_statistically_relevant_slices(
sl_tw[0], sl_w, replace_with_nans=True)
beta_x, *_ = get_only_statistically_relevant_slices(
sl_tw[1], sl_w, replace_with_nans=True)
sl_tw, *_ = bd.slice_twiss_parameters(
z, y, py, pz, w=q, n_slices=n_slices, len_slice=len_slice)
alpha_y, *_ = get_only_statistically_relevant_slices(
sl_tw[0], sl_w, replace_with_nans=True)
beta_y, *_ = get_only_statistically_relevant_slices(
sl_tw[1], sl_w, replace_with_nans=True)
slice_em_x, *_ = bd.normalized_transverse_rms_slice_emittance(
z, x, px, w=q, n_slices=n_slices, len_slice=len_slice)
slice_em_y, *_ = bd.normalized_transverse_rms_slice_emittance(
z, y, py, w=q, n_slices=n_slices, len_slice=len_slice)
s_z = bd.rms_length(z, w=q)
len_fwhm = bd.fwhm_length(z, q, n_slices=n_slices, len_slice=len_slice)
ene_sp_tot = bd.relative_rms_energy_spread(px, py, pz, w=q)
# perform operations
gamma = np.sqrt(1 + px**2 + py**2 + pz**2)
ene = gamma * ct.m_e*ct.c**2/ct.e * 1e-9 # GeV
z_center = np.average(z, weights=q)
dz = z_edges[1] - z_edges[0]
slice_z = (z_edges[1:] - dz/2 - z_center) * 1e6 # micron
current_prof = np.abs(current_prof) * 1e-3 # kA
peak_current = np.nanmax(current_prof)
s_t = s_z * 1e15/ct.c
len_fwhm *= 1e15/ct.c # fs
slice_ene *= ct.m_e*ct.c**2/ct.e * 1e-9 # GeV
ene_spec_edgs = ene_spec_edgs[:-1] + (ene_spec_edgs[1]-ene_spec_edgs[0])/2
ene_spec_edgs *= ct.m_e*ct.c**2/ct.e * 1e-9 # GeV
slice_ene_sp *= 1e2 # %
ene_sp_tot *= 1e2 # %
slice_em_x *= 1e6 # micron
slice_em_y *= 1e6 # micron
max_beta = np.nanmax(beta_x)
if max_beta <= 0.1:
beta_units = 'mm'
beta_x *= 1e3
beta_y *= 1e3
else:
beta_units = 'm'
max_ene = np.nanmax(ene)
if max_ene <= 1:
ene_units = 'MeV'
ene *= 1e3
ene_spec_edgs *= 1e3
else:
ene_units = 'GeV'
ene_mean = np.average(ene, weights=q)
# make plot
if include_twiss:
nrows = 3
hr = [2.5, 1, 1]
fh = 3.3
else:
nrows = 2
hr = [2.5, 1]
fh = 2.5
if fig is None:
fig = plt.figure(figsize=(4, fh))
gs = gridspec.GridSpec(nrows, 2, height_ratios=hr,
width_ratios=[1, 0.02], hspace=0.1, wspace=0.05,
figure=fig, left=left, right=right,
top=top, bottom=bottom)
leg_frac = 0.25 # space to reserve for legend
with plt.rc_context(rc_params):
ax_or = plt.subplot(gs[0])
pscatt = scatter_histogram((z-z_center)*1e6, ene, bins=300,
weights=np.abs(q)*1e15,
rasterized=rasterized_scatter)
plt.ylabel('Energy [{}]'.format(ene_units))
plt.tick_params(axis='x', which='both', labelbottom=False)
params_text = ('$\\langle E \\rangle = '
+ '{:0.1f}$ {}\n'.format(ene_mean, ene_units)
+ '$\\sigma_\\mathrm{E,rel}='
+ '{:0.1f}$ %\n'.format(ene_sp_tot)
+ '$I_\\mathrm{peak}='
+ '{:0.1f}$ kA\n'.format(peak_current)
+ '$\\sigma_t='
+ '{:0.1f}$ fs'.format(s_t))
plt.text(0.98, 0.95, params_text, transform=ax_or.transAxes,
fontsize=6, horizontalalignment='right',
verticalalignment='top')
if add_labels:
plt.text(0.03, 0.05, '(a)', transform=ax_or.transAxes, fontsize=6,
horizontalalignment='left', verticalalignment='bottom')
if xlim is None:
xlim = list(plt.xlim())
xlim[0] -= (xlim[1] - xlim[0])/8
xlim[1] += (xlim[1] - xlim[0])/3
plt.xlim(xlim)
if ylim is None:
ylim = list(plt.ylim())
ylim[0] -= (ylim[1] - ylim[0])/3
plt.ylim(ylim)
# current profile plot
z_or = ax_or.get_zorder()
pos = list(ax_or.get_position().bounds)
pos[3] /= 5
ax_or.patch.set_alpha(0)
ax = fig.add_axes(pos)
ax.set_zorder(z_or-1)
plt.plot(slice_z, current_prof, c='k', lw=0.5, alpha=0.5)
plt.fill_between(slice_z, current_prof, facecolor='tab:gray',
alpha=0.3)
ax.spines['left'].set_position('zero')
ax.spines['left'].set_color('tab:grey')
ax.tick_params(axis='y', colors='tab:grey', labelsize=6,
direction="in", pad=-4)
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
plt.tick_params(axis='x', which='both', labelbottom=False)
for label in ax.yaxis.get_ticklabels():
label.set_horizontalalignment('left')
label.set_verticalalignment('bottom')
plt.xlim(xlim)
ylim_c = list(plt.ylim())
ylim_c[0] = 0
plt.ylim(ylim_c)
plt.ylabel('I [kA]', color='tab:gray', fontsize=6)
# energy profile plot
pos = list(ax_or.get_position().bounds)
pos[2] /= 8
ax = fig.add_axes(pos)
ax.set_zorder(z_or-1)
plt.plot(ene_spectrum, ene_spec_edgs, c='k', lw=0.5, alpha=0.5)
plt.fill_betweenx(ene_spec_edgs, ene_spectrum, facecolor='tab:gray',
alpha=0.3)
plt.gca().axis('off')
plt.ylim(ylim)
xlim_e = list(plt.xlim())
xlim_e[0] = 0
plt.xlim(xlim_e)
# colorbar
ax = plt.subplot(gs[1])
matplotlib.colorbar.Colorbar(ax, pscatt, label='Q [fC]')
# slice parameters plot
plt.subplot(gs[2])
l1 = plt.plot(slice_z, slice_ene_sp, lw=1, c='tab:green',
label='$\\sigma_\\gamma/\\gamma$')
plt.ylabel('$\\sigma_\\gamma/\\gamma$ [%]')
if include_twiss:
plt.tick_params(axis='x', which='both', labelbottom=False)
else:
plt.xlabel('$\\Delta z \\ [\\mathrm{\\mu m}]$')
plt.xlim(xlim)
ax = plt.twinx()
l2 = plt.plot(slice_z, slice_em_x, lw=1, c='tab:blue',
label='$\\epsilon_{n,x}$')
l3 = plt.plot(slice_z, slice_em_y, lw=1, c='tab:orange',
label='$\\epsilon_{n,y}$')
plt.ylabel('$\\epsilon_{n} \\ [\\mathrm{\\mu m}]$')
lines = l1 + l2 + l3
labels = [line.get_label() for line in lines]
plt.legend(lines, labels, fontsize=6, frameon=False,
loc='center right', borderaxespad=0.3)
if add_labels:
plt.text(0.03, 0.05, '(b)', transform=plt.gca().transAxes,
fontsize=6, horizontalalignment='left',
verticalalignment='bottom')
if include_twiss:
plt.subplot(gs[4])
l1 = plt.plot(slice_z, beta_x, lw=1, c='tab:blue',
label='$\\beta_x$')
l2 = plt.plot(slice_z, beta_y, lw=1, c='tab:orange',
label='$\\beta_y$')
plt.xlabel('$\\Delta z \\ [\\mathrm{\\mu m}]$')
plt.ylabel('$\\beta$ [{}]'.format(beta_units))
# make room for legend
ylim = list(plt.ylim())
ylim[1] += (ylim[1] - ylim[0]) * leg_frac
plt.ylim(ylim)
plt.xlim(xlim)
plt.twinx()
l3 = plt.plot(slice_z, alpha_x, lw=1, c='tab:blue', ls='--',
label='$\\alpha_x$')
l4 = plt.plot(slice_z, alpha_y, lw=1, c='tab:orange', ls='--',
label='$\\alpha_y$')
lines = l1 + l2 + l3 + l4
labels = [line.get_label() for line in lines]
plt.legend(lines, labels, fontsize=6, ncol=1, frameon=False,
loc='center right', borderaxespad=0.3,
labelspacing=0.20)
if add_labels:
plt.text(0.03, 0.05, '(c)', transform=plt.gca().transAxes,
fontsize=6, horizontalalignment='left',
verticalalignment='bottom')
plt.ylabel('$\\alpha$')
if show:
plt.show()
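# The slice quantities computed above all follow the same pattern: bin the
# beam longitudinally and evaluate a charge-weighted statistic in each bin.
# Minimal standalone sketch (hypothetical helper, not part of the module):
def _example_slice_rms(z, v, w, n_slices=10):
    """Return the weighted RMS of `v` in each longitudinal slice."""
    import numpy as np
    edges = np.linspace(z.min(), z.max(), n_slices + 1)
    idx = np.clip(np.digitize(z, edges) - 1, 0, n_slices - 1)
    rms = np.full(n_slices, np.nan)
    for i in range(n_slices):
        sel = idx == i
        if np.any(sel):
            mean = np.average(v[sel], weights=w[sel])
            rms[i] = np.sqrt(np.average((v[sel] - mean) ** 2,
                                        weights=w[sel]))
    return rms, edges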
def energy_vs_z(
        z, px, py, pz, q, n_slices=50, len_slice=None, ene_bins=50,
        xlim=None, ylim=None, show_text=True, x_proj=True, y_proj=True,
        cbar=True, cbar_width=0.02, left=0.125, right=0.875, top=0.98,
        bottom=0.13, fig=None, rasterized_scatter=None, show=True):
    """Plot particle energy vs longitudinal position with current and
    energy-spectrum projections."""
# analyze beam
current_prof, z_edges = bd.current_profile(z, q, n_slices=n_slices,
len_slice=len_slice)
ene_spectrum, ene_spec_edgs = bd.energy_spectrum(px, py, pz, w=q,
bins=ene_bins)
s_z = bd.rms_length(z, w=q)
len_fwhm = bd.fwhm_length(z, q, n_slices=n_slices, len_slice=len_slice)
ene_sp_tot = bd.relative_rms_energy_spread(px, py, pz, w=q)
# perform operations
gamma = np.sqrt(1 + px**2 + py**2 + pz**2)
ene = gamma * ct.m_e*ct.c**2/ct.e * 1e-9 # GeV
z_center = np.average(z, weights=q)
dz = z_edges[1] - z_edges[0]
slice_z = (z_edges[1:] - dz/2 - z_center) * 1e6 # micron
current_prof = np.abs(current_prof) * 1e-3 # kA
peak_current = np.nanmax(current_prof)
s_t = s_z * 1e15/ct.c
len_fwhm *= 1e15/ct.c # fs
ene_spec_edgs = ene_spec_edgs[:-1] + (ene_spec_edgs[1]-ene_spec_edgs[0])/2
ene_spec_edgs *= ct.m_e*ct.c**2/ct.e * 1e-9 # GeV
ene_sp_tot *= 1e2 # %
max_ene = np.nanmax(ene)
if max_ene <= 1:
ene_units = 'MeV'
ene *= 1e3
ene_spec_edgs *= 1e3
else:
ene_units = 'GeV'
ene_mean = np.average(ene, weights=q)
# make plot
if fig is None:
fig = plt.figure(figsize=(4, 2.5))
if cbar:
gs = gridspec.GridSpec(
1, 2, width_ratios=[1, cbar_width], hspace=0.1, wspace=0.05,
figure=fig, left=left, right=right, top=top, bottom=bottom)
else:
gs = gridspec.GridSpec(
1, 1, figure=fig, left=left, right=right, top=top, bottom=bottom)
with plt.rc_context(rc_params):
ax_or = plt.subplot(gs[0])
pscatt = scatter_histogram((z-z_center)*1e6, ene, bins=300,
weights=np.abs(q)*1e15,
rasterized=rasterized_scatter)
plt.xlabel('$\\Delta z \\ [\\mathrm{\\mu m}]$')
plt.ylabel('Energy [{}]'.format(ene_units))
if show_text:
params_text = ('$\\langle E \\rangle = '
+ '{:0.1f}$ {}\n'.format(ene_mean, ene_units)
+ '$\\sigma_\\mathrm{E,rel}='
+ '{:0.1f}$ %\n'.format(ene_sp_tot)
+ '$I_\\mathrm{peak}='
+ '{:0.1f}$ kA\n'.format(peak_current)
+ '$\\sigma_t='
+ '{:0.1f}$ fs'.format(s_t))
plt.text(0.98, 0.95, params_text, transform=ax_or.transAxes,
fontsize=6, horizontalalignment='right',
verticalalignment='top')
if xlim is not None:
plt.xlim(xlim)
else:
xlim = list(plt.xlim())
if y_proj:
xlim[0] -= (xlim[1] - xlim[0])/8
if show_text:
xlim[1] += (xlim[1] - xlim[0])/3
plt.xlim(xlim)
if ylim is not None:
plt.ylim(ylim)
else:
ylim = list(plt.ylim())
if x_proj:
ylim[0] -= (ylim[1] - ylim[0])/3
plt.ylim(ylim)
# current profile plot
if x_proj:
z_or = ax_or.get_zorder()
pos = list(ax_or.get_position().bounds)
pos[3] /= 5
ax_or.patch.set_alpha(0)
ax = fig.add_axes(pos)
ax.set_zorder(z_or-1)
plt.plot(slice_z, current_prof, c='k', lw=0.5, alpha=0.5)
plt.fill_between(
slice_z, current_prof, facecolor='tab:gray', alpha=0.3)
ax.spines['left'].set_position('zero')
ax.spines['left'].set_color('tab:grey')
ax.tick_params(
axis='y', colors='tab:grey', labelsize=6, direction="in",
pad=-4)
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
plt.tick_params(axis='x', which='both', labelbottom=False)
for label in ax.yaxis.get_ticklabels():
label.set_horizontalalignment('left')
label.set_verticalalignment('bottom')
plt.xlim(xlim)
ylim_c = list(plt.ylim())
ylim_c[0] = 0
plt.ylim(ylim_c)
plt.ylabel('I [kA]', color='tab:gray', fontsize=6)
# energy profile plot
if y_proj:
z_or = ax_or.get_zorder()
pos = list(ax_or.get_position().bounds)
pos[2] /= 8
ax_or.patch.set_alpha(0)
ax = fig.add_axes(pos)
ax.set_zorder(z_or-1)
plt.plot(ene_spectrum, ene_spec_edgs, c='k', lw=0.5, alpha=0.5)
plt.fill_betweenx(
ene_spec_edgs, ene_spectrum, facecolor='tab:gray', alpha=0.3)
plt.gca().axis('off')
plt.ylim(ylim)
xlim_e = list(plt.xlim())
xlim_e[0] = 0
plt.xlim(xlim_e)
# colorbar
if cbar:
ax = plt.subplot(gs[1])
matplotlib.colorbar.Colorbar(ax, pscatt, label='Q [fC]')
if show:
plt.show()
def full_phase_space(x, y, z, px, py, pz, q, show=True, **kwargs):
    """Plot the horizontal, vertical and longitudinal phase spaces side
    by side."""
fig = plt.figure(figsize=(12, 3))
grid = gridspec.GridSpec(1, 3, figure=fig, wspace=0.55)
hor_phase_space(
x, px, q, pz, subplot_spec=grid[0], fig=fig, show=False, **kwargs)
ver_phase_space(
y, py, q, pz, subplot_spec=grid[1], fig=fig, show=False, **kwargs)
lon_phase_space(
z, pz, q, subplot_spec=grid[2], fig=fig, show=False, **kwargs)
if show:
plt.show()
def lon_phase_space(
        z, pz, q, beam_info=True, bins=300, **kwargs):
    """Plot the longitudinal phase space, optionally annotated with basic
    beam parameters."""
if beam_info:
# analyze beam
        if isinstance(bins, (tuple, list)):
bins_x = bins[0]
else:
bins_x = bins
i_peak = bd.peak_current(z, q, n_slices=bins_x) * 1e-3 # kA
tau_fwhm = bd.fwhm_length(z, q, n_slices=bins_x) * 1e15/ct.c # fs
s_t = bd.rms_length(z, w=q) * 1e15/ct.c # fs
pz_avg = np.average(pz, weights=q)
s_pz = weighted_std(pz, weights=q) / pz_avg * 100 # %
pz_avg *= ct.m_e*ct.c**2/ct.e * 1e-9 # GeV
if pz_avg < 0.1:
pz_units = 'MeV/c'
pz_avg *= 1e3
else:
pz_units = 'GeV/c'
text = (
            '$\\bar{p_z} = $' + '{:0.2f}'.format(pz_avg)
            + ' ' + pz_units + '\n'
+ '$\\sigma_{p_z} = $' + '{}'.format(np.around(s_pz, 3))
+ '$\\%$\n'
+ '$I_\\mathrm{peak}=' + '{:0.2f}$ kA\n'.format(i_peak)
+ '$\\sigma_t=' + '{:0.1f}$ fs\n'.format(s_t)
+ '$\\tau_{FWHM}=' + '{:0.1f}$ fs'.format(tau_fwhm)
)
    else:
        text = None
# Center in z.
z_avg = np.average(z, weights=q)
delta_z = z - z_avg
phase_space_plot(
x=delta_z * 1e6,
y=pz,
w=np.abs(q),
x_name='\\Delta z',
y_name='p_z',
w_name='Q',
x_units='µm',
y_units='m_e c',
w_units='C',
text=text,
bins=bins,
**kwargs
)
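# The FWHM duration quoted above can be estimated from the current profile
# alone. Coarse, bin-resolution sketch (hypothetical name, not part of the
# module API), assuming a single-peaked profile:
def _example_fwhm(profile, bin_width):
    """Return the full width at half maximum of a binned profile."""
    import numpy as np
    profile = np.abs(np.asarray(profile, dtype=float))
    above = profile >= profile.max() / 2
    return np.count_nonzero(above) * bin_width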
def hor_phase_space(x, px, q, pz=None, beam_info=True, **kwargs):
    """Plot the horizontal phase space, optionally annotated with basic
    beam parameters."""
if beam_info:
em_x = bd.normalized_transverse_rms_emittance(x, px, w=q) * 1e6
s_x = bd.rms_size(x, w=q)
text = (
'$\\epsilon_{n,x} = $' + '{}'.format(np.around(em_x, 3))
+ '$\\ \\mathrm{\\mu m}$\n'
+ '$\\sigma_{x} = $' + '{}'.format(np.around(s_x*1e6, 3))
+ '$\\ \\mathrm{\\mu m}$'
)
if pz is not None:
a_x, b_x, g_x = bd.twiss_parameters(x, px, pz, w=q)
if b_x <= 0.1:
beta_units = 'mm'
b_x *= 1e3
else:
beta_units = 'm'
text += (
'\n'
+ '$\\beta_{x} = $' + '{}'.format(np.around(b_x, 3))
                + ' ' + beta_units + '\n'
+ '$\\alpha_{x} = $' + '{}'.format(np.around(a_x, 3))
)
else:
text = None
phase_space_plot(
x=x * 1e6,
y=px,
w=np.abs(q),
x_name='x',
y_name='p_x',
w_name='Q',
x_units='µm',
y_units='m_e c',
w_units='C',
text=text,
**kwargs
)
def ver_phase_space(y, py, q, pz=None, beam_info=True, **kwargs):
    """Plot the vertical phase space, optionally annotated with basic
    beam parameters."""
if beam_info:
em_y = bd.normalized_transverse_rms_emittance(y, py, w=q) * 1e6
s_y = bd.rms_size(y, w=q)
text = (
'$\\epsilon_{n,y} = $' + '{}'.format(np.around(em_y, 3))
+ '$\\ \\mathrm{\\mu m}$\n'
+ '$\\sigma_{y} = $' + '{}'.format(np.around(s_y*1e6, 3))
+ '$\\ \\mathrm{\\mu m}$'
)
if pz is not None:
a_y, b_y, g_y = bd.twiss_parameters(y, py, pz, w=q)
if b_y <= 0.1:
beta_units = 'mm'
b_y *= 1e3
else:
beta_units = 'm'
text += (
'\n'
+ '$\\beta_{y} = $' + '{}'.format(np.around(b_y, 3))
                + ' ' + beta_units + '\n'
+ '$\\alpha_{y} = $' + '{}'.format(np.around(a_y, 3))
)
else:
text = None
phase_space_plot(
x=y * 1e6,
y=py,
w=np.abs(q),
x_name='y',
y_name='p_y',
w_name='Q',
x_units='µm',
y_units='m_e c',
w_units='C',
text=text,
**kwargs
)
def phase_space_plot(
        x, y, w=None, x_name='', y_name='', w_name='',
        x_units='', y_units='', w_units='', x_lim=None, y_lim=None,
        x_projection=True, y_projection=True, projection_space=True,
        bins=300, rasterized=False,
        s=1, cmap='plasma', center_lines=False,
        text=None, cbar=True, cbar_ticks=3, cbar_width=0.05,
        subplot_spec=None, fig=None, tight_layout=False, show=True):
    """Generic 2D phase-space plot with optional projections, colorbar and
    text annotations."""
if cbar:
n_cols = 2
width_ratios = [1, cbar_width]
figsize = (4 * (1 + cbar_width), 4)
else:
n_cols = 1
width_ratios = None
figsize = (4, 4)
with plt.rc_context(rc_params):
if fig is None:
fig = plt.figure(figsize=figsize)
if subplot_spec is None:
grid = gridspec.GridSpec(
1, n_cols, width_ratios=width_ratios, figure=fig, wspace=0.05)
else:
grid = gridspec.GridSpecFromSubplotSpec(
1, n_cols, subplot_spec, width_ratios=width_ratios,
wspace=0.05)
ax = fig.add_subplot(grid[0])
img = scatter_histogram(
x, y, bins=bins, weights=w, range=[x_lim, y_lim], s=s,
cmap=cmap, rasterized=rasterized, ax=ax)
if center_lines:
ax.axvline(np.average(x, weights=w), ls='--', lw=0.5, c='k')
ax.axhline(np.average(y, weights=w), ls='--', lw=0.5, c='k')
x_label = ''
if len(x_name) > 0:
x_label += '${}$'.format(x_name)
if len(x_units) > 0:
x_label += ' [${}$]'.format(x_units)
y_label = ''
if len(y_name) > 0:
y_label += '${}$'.format(y_name)
if len(y_units) > 0:
y_label += ' [${}$]'.format(y_units)
ax.set_xlabel(x_label)
ax.set_ylabel(y_label)
if projection_space:
if x_projection:
ylim = list(ax.get_ylim())
ylim[0] -= (ylim[1] - ylim[0])/5
ax.set_ylim(ylim)
if y_projection:
xlim = list(ax.get_xlim())
xlim[0] -= (xlim[1] - xlim[0])/5
ax.set_xlim(xlim)
        if isinstance(bins, (tuple, list)):
bins_x, bins_y = bins
else:
bins_x = bins_y = bins
if x_projection:
add_projection(x, bins_x, ax, grid[0], fig)
if y_projection:
add_projection(y, bins_y, ax, grid[0], fig, orientation='vertical')
if text is not None:
add_text(ax, 0.05, 0.05, text, va='bottom', ha='left')
# generate colorbar
if w is not None and cbar:
cbar_label = ''
if len(w_name) > 0:
cbar_label += '${}$'.format(w_name)
if len(w_units) > 0:
cbar_label += ' [${}$]'.format(w_units)
create_vertical_colorbars(
img, cbar_label, grid[1], fig, n_ticks=cbar_ticks)
if tight_layout:
try:
grid.tight_layout(fig)
except Exception:
fig.tight_layout()
if show:
plt.show() | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/plotting/quick_diagnostics.py | quick_diagnostics.py |
from os import path
import numpy as np
import scipy.constants as ct
from openpmd_api import (Series, Access, Dataset, Mesh_Record_Component,
Unit_Dimension)
from deprecated import deprecated
from aptools.helper_functions import reposition_bunch, get_particle_subset
from aptools import __version__
SCALAR = Mesh_Record_Component.SCALAR
@deprecated(
version="0.2.0",
reason=("This method is replaced by those in the new "
"`particle_distributions.save` module.")
)
def save_beam(code_name, beam_data, folder_path, file_name, reposition=False,
avg_pos=[None, None, None], avg_mom=[None, None, None],
n_part=None, **kwargs):
"""Converts particle data from one code to another.
Parameters
----------
code_name : str
Name of the target tracking or PIC code. Possible values are
'csrtrack', 'astra', 'fbpic' and 'openpmd'.
beam_data : list
Contains the beam data as [x, y, z, px, py, pz, q], where the positions
        have units of meters, momentum is in non-dimensional units (beta*gamma)
and q is in Coulomb.
folder_path : str
Path to the folder in which to save the data
file_name : str
Name of the file to save, without extension
    reposition : bool
        Optional. Whether to reposition the particle distribution in space
        and/or momentum, centered on the coordinates specified in avg_pos and
        avg_mom.
    avg_pos : list
        Optional, only used if reposition=True. Contains the new average
        positions of the beam after repositioning. Should be specified as
        [x_avg, y_avg, z_avg] in meters. Setting a component to None prevents
        repositioning in that coordinate.
    avg_mom : list
        Optional, only used if reposition=True. Contains the new
        average momentum of the beam after repositioning. Should be specified
        as [px_avg, py_avg, pz_avg] in non-dimensional units (beta*gamma).
        Setting a component to None prevents repositioning in that coordinate.
n_part : int
Optional. Number of particles to save. Must be lower than the original
number of particles. Particles to save are chosen randomly.
"""
save_beam_for = {'csrtrack': save_for_csrtrack_fmt1,
'astra': save_for_astra,
'fbpic': save_for_fbpic,
'openpmd': save_to_openpmd_file}
save_beam_for[code_name](beam_data, folder_path, file_name, reposition,
avg_pos, avg_mom, n_part, **kwargs)
def save_for_csrtrack_fmt1(beam_data, folder_path, file_name, reposition=False,
avg_pos=[None, None, None],
avg_mom=[None, None, None], n_part=None):
"""Saves particle data for CSRtrack in fmt1 format.
Parameters
----------
beam_data : list
Contains the beam data as [x, y, z, px, py, pz, q], where the positions
        have units of meters, momentum is in non-dimensional units (beta*gamma)
and q is in Coulomb.
folder_path : str
Path to the folder in which to save the data
file_name : str
Name of the file to save without extension
    reposition : bool
        Optional. Whether to reposition the particle distribution in space
        and/or momentum, centered on the coordinates specified in avg_pos and
        avg_mom.
    avg_pos : list
        Optional, only used if reposition=True. Contains the new average
        positions of the beam after repositioning. Should be specified as
        [x_avg, y_avg, z_avg] in meters. Setting a component to None prevents
        repositioning in that coordinate.
    avg_mom : list
        Optional, only used if reposition=True. Contains the new
        average momentum of the beam after repositioning. Should be specified
        as [px_avg, py_avg, pz_avg] in non-dimensional units (beta*gamma).
        Setting a component to None prevents repositioning in that coordinate.
n_part : int
Optional. Number of particles to save. Must be lower than the original
number of particles. Particles to save are chosen randomly.
"""
# Perform repositioning of original distribution
if reposition:
reposition_bunch(beam_data, avg_pos+avg_mom)
# Create subset of n_part
if n_part is not None:
beam_data = get_particle_subset(
beam_data, n_part, preserve_charge=True)
# Get beam data
x_orig = beam_data[0]
y_orig = beam_data[1]
xi_orig = beam_data[2]
px_orig = beam_data[3]*ct.m_e*ct.c**2/ct.e
py_orig = beam_data[4]*ct.m_e*ct.c**2/ct.e
pz_orig = beam_data[5]*ct.m_e*ct.c**2/ct.e
q_orig = beam_data[6]
# Create arrays
x = np.zeros(q_orig.size+2)
y = np.zeros(q_orig.size+2)
xi = np.zeros(q_orig.size+2)
px = np.zeros(q_orig.size+2)
py = np.zeros(q_orig.size+2)
pz = np.zeros(q_orig.size+2)
q = np.zeros(q_orig.size+2)
# Reference particle
x[1] = np.average(x_orig, weights=q_orig)
y[1] = np.average(y_orig, weights=q_orig)
xi[1] = np.average(xi_orig, weights=q_orig)
px[1] = np.average(px_orig, weights=q_orig)
py[1] = np.average(py_orig, weights=q_orig)
pz[1] = np.average(pz_orig, weights=q_orig)
q[1] = sum(q_orig)/len(q_orig)
# Relative coordinates
x[2::] = x_orig - x[1]
y[2::] = y_orig - y[1]
xi[2::] = xi_orig - xi[1]
px[2::] = px_orig - px[1]
py[2::] = py_orig - py[1]
pz[2::] = pz_orig - pz[1]
q[2::] = q_orig
# Save to file
data = np.column_stack((xi, x, y, pz, px, py, q))
file_name += '.fmt1'
np.savetxt(path.join(folder_path, file_name), data,
'%1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %1.12e')
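# In the fmt1 layout written above, the first row is a header, the second
# holds the reference particle and the remaining rows store coordinates
# relative to it. Standalone round-trip sketch (hypothetical name):
def _example_fmt1_column(x_abs, w):
    """Build one fmt1-style column and reconstruct absolute values."""
    import numpy as np
    col = np.zeros(x_abs.size + 2)
    col[1] = np.average(x_abs, weights=w)  # reference particle
    col[2:] = x_abs - col[1]  # relative coordinates
    reconstructed = col[2:] + col[1]
    return col, reconstructed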
def save_for_astra(beam_data, folder_path, file_name, reposition=False,
avg_pos=[None, None, None], avg_mom=[None, None, None],
n_part=None):
"""Saves particle data in ASTRA format.
Parameters
----------
beam_data : list
Contains the beam data as [x, y, z, px, py, pz, q], where the positions
        have units of meters, momentum is in non-dimensional units (beta*gamma)
and q is in Coulomb.
folder_path : str
Path to the folder in which to save the data
file_name : str
Name of the file to save without extension
    reposition : bool
        Optional. Whether to reposition the particle distribution in space
        and/or momentum, centered on the coordinates specified in avg_pos and
        avg_mom.
    avg_pos : list
        Optional, only used if reposition=True. Contains the new average
        positions of the beam after repositioning. Should be specified as
        [x_avg, y_avg, z_avg] in meters. Setting a component to None prevents
        repositioning in that coordinate.
    avg_mom : list
        Optional, only used if reposition=True. Contains the new
        average momentum of the beam after repositioning. Should be specified
        as [px_avg, py_avg, pz_avg] in non-dimensional units (beta*gamma).
        Setting a component to None prevents repositioning in that coordinate.
n_part : int
Optional. Number of particles to save. Must be lower than the original
number of particles. Particles to save are chosen randomly.
"""
# Perform repositioning of original distribution
if reposition:
reposition_bunch(beam_data, avg_pos+avg_mom)
# Create subset of n_part
if n_part is not None:
beam_data = get_particle_subset(
beam_data, n_part, preserve_charge=True)
# Get beam data
x_orig = beam_data[0]
y_orig = beam_data[1]
xi_orig = beam_data[2]
px_orig = beam_data[3]*ct.m_e*ct.c**2/ct.e
py_orig = beam_data[4]*ct.m_e*ct.c**2/ct.e
pz_orig = beam_data[5]*ct.m_e*ct.c**2/ct.e
q_orig = beam_data[6]*1e9 # nC
# Create arrays
x = np.zeros(q_orig.size+1)
y = np.zeros(q_orig.size+1)
xi = np.zeros(q_orig.size+1)
px = np.zeros(q_orig.size+1)
py = np.zeros(q_orig.size+1)
pz = np.zeros(q_orig.size+1)
q = np.zeros(q_orig.size+1)
# Reference particle
x[0] = np.average(x_orig, weights=q_orig)
y[0] = np.average(y_orig, weights=q_orig)
xi[0] = np.average(xi_orig, weights=q_orig)
px[0] = np.average(px_orig, weights=q_orig)
py[0] = np.average(py_orig, weights=q_orig)
pz[0] = np.average(pz_orig, weights=q_orig)
q[0] = sum(q_orig)/len(q_orig)
# Put relative to reference particle
x[1::] = x_orig
y[1::] = y_orig
xi[1::] = xi_orig - xi[0]
px[1::] = px_orig
py[1::] = py_orig
pz[1::] = pz_orig - pz[0]
q[1::] = q_orig
t = xi/ct.c
# Add flags and indices
ind = np.ones(q.size)
flag = np.ones(q.size)*5
# Save to file
data = np.column_stack((x, y, xi, px, py, pz, t, q, ind, flag))
file_name += '.txt'
np.savetxt(
path.join(folder_path, file_name), data,
'%1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %1.12e %i %i')
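# In the ASTRA layout above, the first row is the reference particle and
# only the longitudinal position and momentum of the remaining rows are
# stored relative to it. Standalone sketch (hypothetical name):
def _example_astra_longitudinal(xi_abs, w):
    """Return the ASTRA-style longitudinal column for `xi_abs`."""
    import numpy as np
    xi = np.zeros(xi_abs.size + 1)
    xi[0] = np.average(xi_abs, weights=w)  # reference particle
    xi[1:] = xi_abs - xi[0]  # relative longitudinal positions
    return xi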
def save_for_fbpic(beam_data, folder_path, file_name, reposition=False,
avg_pos=[None, None, None], avg_mom=[None, None, None],
n_part=None):
"""Saves particle data in in a format that can be read by FBPIC.
Parameters
----------
beam_data : list
Contains the beam data as [x, y, z, px, py, pz, q], where the positions
        have units of meters, momentum is in non-dimensional units (beta*gamma)
and q is in Coulomb.
folder_path : str
Path to the folder in which to save the data
file_name : str
Name of the file to save without extension
    reposition : bool
        Optional. Whether to reposition the particle distribution in space
        and/or momentum, centered on the coordinates specified in avg_pos and
        avg_mom.
    avg_pos : list
        Optional, only used if reposition=True. Contains the new average
        positions of the beam after repositioning. Should be specified as
        [x_avg, y_avg, z_avg] in meters. Setting a component to None prevents
        repositioning in that coordinate.
    avg_mom : list
        Optional, only used if reposition=True. Contains the new
        average momentum of the beam after repositioning. Should be specified
        as [px_avg, py_avg, pz_avg] in non-dimensional units (beta*gamma).
        Setting a component to None prevents repositioning in that coordinate.
n_part : int
Optional. Number of particles to save. Must be lower than the original
number of particles. Particles to save are chosen randomly.
"""
# Perform repositioning of original distribution
if reposition:
reposition_bunch(beam_data, avg_pos+avg_mom)
# Create subset of n_part
if n_part is not None:
beam_data = get_particle_subset(
beam_data, n_part, preserve_charge=True)
# Get beam data
x = beam_data[0]
y = beam_data[1]
xi = beam_data[2]
px = beam_data[3]
py = beam_data[4]
pz = beam_data[5]
# Save to file
data = np.column_stack((x, y, xi, px, py, pz))
file_name += '.txt'
np.savetxt(path.join(folder_path, file_name), data,
'%1.12e %1.12e %1.12e %1.12e %1.12e %1.12e')
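# The plain-text file written above can be read back directly with
# `np.loadtxt`. Standalone round-trip sketch (hypothetical name) using an
# in-memory buffer instead of a file:
def _example_fbpic_roundtrip(x, y, xi, px, py, pz):
    """Write the six-column layout to a buffer and read it back."""
    import io
    import numpy as np
    buf = io.StringIO()
    np.savetxt(buf, np.column_stack((x, y, xi, px, py, pz)),
               '%1.12e %1.12e %1.12e %1.12e %1.12e %1.12e')
    buf.seek(0)
    return np.loadtxt(buf)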
def save_to_openpmd_file(
beam_data, folder_path, file_name, reposition=False,
avg_pos=[None, None, None], avg_mom=[None, None, None], n_part=None,
species_name='particle_beam'):
"""
Saves particle data to an HDF5 file following the openPMD standard.
Parameters
----------
beam_data : list
Contains the beam data as [x, y, z, px, py, pz, q], where the positions
        have units of meters, momentum is in non-dimensional units (beta*gamma)
and q is in Coulomb.
folder_path : str
Path to the folder in which to save the data
file_name : str
Name of the file to save without extension
    reposition : bool
        Optional. Whether to reposition the particle distribution in space
        and/or momentum, centered on the coordinates specified in avg_pos and
        avg_mom.
    avg_pos : list
        Optional, only used if reposition=True. Contains the new average
        positions of the beam after repositioning. Should be specified as
        [x_avg, y_avg, z_avg] in meters. Setting a component to None prevents
        repositioning in that coordinate.
    avg_mom : list
        Optional, only used if reposition=True. Contains the new
        average momentum of the beam after repositioning. Should be specified
        as [px_avg, py_avg, pz_avg] in non-dimensional units (beta*gamma).
        Setting a component to None prevents repositioning in that coordinate.
n_part : int
Optional. Number of particles to save. Must be lower than the original
number of particles. Particles to save are chosen randomly.
species_name : str
Optional. Name under which the particle species should be stored.
"""
# Perform repositioning of original distribution
if reposition:
reposition_bunch(beam_data, avg_pos+avg_mom)
# Create subset of n_part
if n_part is not None:
beam_data = get_particle_subset(
beam_data, n_part, preserve_charge=True)
# Get beam data
x = np.ascontiguousarray(beam_data[0])
y = np.ascontiguousarray(beam_data[1])
z = np.ascontiguousarray(beam_data[2])
px = np.ascontiguousarray(beam_data[3])
py = np.ascontiguousarray(beam_data[4])
pz = np.ascontiguousarray(beam_data[5])
q = np.ascontiguousarray(beam_data[6])
# Save to file
file_path = path.join(folder_path, file_name)
if not file_path.endswith('.h5'):
file_path += '.h5'
opmd_series = Series(file_path, Access.create)
# Set basic attributes.
opmd_series.set_software('APtools', __version__)
opmd_series.set_particles_path('particles')
# Create iteration
it = opmd_series.iterations[0]
# Create particles species.
particles = it.particles[species_name]
# Create additional necessary arrays and constants.
w = np.abs(q) / ct.e
m = ct.m_e
q = -ct.e
px = px * m * ct.c
py = py * m * ct.c
pz = pz * m * ct.c
# Generate datasets.
d_x = Dataset(x.dtype, extent=x.shape)
d_y = Dataset(y.dtype, extent=y.shape)
d_z = Dataset(z.dtype, extent=z.shape)
d_px = Dataset(px.dtype, extent=px.shape)
d_py = Dataset(py.dtype, extent=py.shape)
d_pz = Dataset(pz.dtype, extent=pz.shape)
d_w = Dataset(w.dtype, extent=w.shape)
d_q = Dataset(np.dtype('float64'), extent=[1])
d_m = Dataset(np.dtype('float64'), extent=[1])
d_xoff = Dataset(np.dtype('float64'), extent=[1])
d_yoff = Dataset(np.dtype('float64'), extent=[1])
d_zoff = Dataset(np.dtype('float64'), extent=[1])
# Record data.
particles['position']['x'].reset_dataset(d_x)
particles['position']['y'].reset_dataset(d_y)
particles['position']['z'].reset_dataset(d_z)
particles['positionOffset']['x'].reset_dataset(d_xoff)
particles['positionOffset']['y'].reset_dataset(d_yoff)
particles['positionOffset']['z'].reset_dataset(d_zoff)
particles['momentum']['x'].reset_dataset(d_px)
particles['momentum']['y'].reset_dataset(d_py)
particles['momentum']['z'].reset_dataset(d_pz)
particles['weighting'][SCALAR].reset_dataset(d_w)
particles['charge'][SCALAR].reset_dataset(d_q)
particles['mass'][SCALAR].reset_dataset(d_m)
    # Prepare for writing.
particles['position']['x'].store_chunk(x)
particles['position']['y'].store_chunk(y)
particles['position']['z'].store_chunk(z)
particles['positionOffset']['x'].make_constant(0.)
particles['positionOffset']['y'].make_constant(0.)
particles['positionOffset']['z'].make_constant(0.)
particles['momentum']['x'].store_chunk(px)
particles['momentum']['y'].store_chunk(py)
particles['momentum']['z'].store_chunk(pz)
particles['weighting'][SCALAR].store_chunk(w)
particles['charge'][SCALAR].make_constant(q)
particles['mass'][SCALAR].make_constant(m)
# Set units.
particles['position'].unit_dimension = {Unit_Dimension.L: 1}
particles['positionOffset'].unit_dimension = {Unit_Dimension.L: 1}
particles['momentum'].unit_dimension = {
Unit_Dimension.L: 1,
Unit_Dimension.M: 1,
Unit_Dimension.T: -1,
}
particles['charge'].unit_dimension = {
Unit_Dimension.T: 1,
Unit_Dimension.I: 1,
}
particles['mass'].unit_dimension = {Unit_Dimension.M: 1}
# Set weighting attributes.
particles['position'].set_attribute('macroWeighted', np.uint32(0))
particles['positionOffset'].set_attribute(
'macroWeighted', np.uint32(0))
particles['momentum'].set_attribute('macroWeighted', np.uint32(0))
particles['weighting'][SCALAR].set_attribute(
'macroWeighted', np.uint32(1))
particles['charge'][SCALAR].set_attribute(
'macroWeighted', np.uint32(0))
particles['mass'][SCALAR].set_attribute('macroWeighted', np.uint32(0))
particles['position'].set_attribute('weightingPower', 0.)
particles['positionOffset'].set_attribute('weightingPower', 0.)
particles['momentum'].set_attribute('weightingPower', 1.)
particles['weighting'][SCALAR].set_attribute('weightingPower', 1.)
particles['charge'][SCALAR].set_attribute('weightingPower', 1.)
particles['mass'][SCALAR].set_attribute('weightingPower', 1.)
# Flush data.
opmd_series.flush() | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/data_handling/saving.py | saving.py |
from aptools.data_handling.reading import read_beam
from aptools.data_handling.saving import save_beam
def convert_beam(orig_code, final_code, orig_path, final_path, final_file_name,
reposition=False, avg_pos=[None, None, None],
avg_mom=[None, None, None], n_part=None, read_kwargs={},
save_kwargs={}):
"""Converts particle data from one code to another.
Parameters
----------
orig_code : str
Name of the tracking or PIC code of the original data. Possible values
are 'csrtrack', 'astra' and 'openpmd'
final_code : str
Name of the tracking or PIC code in which to convert the data. Possible
values are 'csrtrack', 'astra', 'fbpic' and 'openpmd'
orig_path : str
Path of the file containing the original data
final_path : str
Path to the folder in which to save the converted data
final_file_name : str
Name of the file to save, without extension
reposition : bool
        Optional. Whether to reposition the particle distribution in space
        and/or momentum, centering it at the coordinates specified in avg_pos
        and avg_mom.
avg_pos : list
        Optional, only used if reposition=True. Contains the new average
positions of the beam after repositioning. Should be specified as
[x_avg, y_avg, z_avg] in meters. Setting a component as None prevents
repositioning in that coordinate.
avg_mom : list
        Optional, only used if reposition=True. Contains the new
        average momentum of the beam after repositioning. Should be specified
        as [px_avg, py_avg, pz_avg] in non-dimensional units (beta*gamma).
Setting a component as None prevents repositioning in that coordinate.
n_part : int
Optional. Number of particles to save. Must be lower than the original
number of particles. Particles to save are chosen randomly.
    read_kwargs : dict
        Optional. Keyword arguments passed to the data reader. Currently, the
        only such parameter is 'species_name', used when reading data from PIC
        codes.
    save_kwargs : dict
        Optional. Keyword arguments passed to the data writer.
"""
x, y, z, px, py, pz, q = read_beam(orig_code, orig_path, **read_kwargs)
beam_data = [x, y, z, px, py, pz, q]
save_beam(final_code, beam_data, final_path, final_file_name, reposition,
avg_pos, avg_mom, n_part, **save_kwargs) | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/data_handling/conversion.py | conversion.py |
import numpy as np
import scipy.constants as ct
from h5py import File as H5File
from deprecated import deprecated
from aptools.helper_functions import join_infile_path, reposition_bunch
from aptools.plasma_accel.general_equations import plasma_skin_depth
from aptools.data_processing.beam_filtering import filter_beam
@deprecated(
version="0.2.0",
reason=("This method is replaced by those in the new "
"`particle_distributions.read` module.")
)
def read_beam(code_name, file_path, reposition=False,
avg_pos=[None, None, None], avg_mom=[None, None, None],
filter_min=[None, None, None, None, None, None, None],
filter_max=[None, None, None, None, None, None, None],
**kwargs):
"""Reads particle data from the specified code.
Parameters
----------
code_name : str
Name of the tracking or PIC code of the data to read. Possible values
are 'csrtrack', 'astra', 'openpmd', 'osiris', 'hipace' and 'fbpic'.
file_path : str
Path of the file containing the data
reposition : bool
        Optional. Whether to reposition the particle distribution in space
        and/or momentum, centering it at the coordinates specified in avg_pos
        and avg_mom.
avg_pos : list
        Optional, only used if reposition=True. Contains the new average
positions of the beam after repositioning. Should be specified as
[x_avg, y_avg, z_avg] in meters. Setting a component as None prevents
repositioning in that coordinate.
avg_mom : list
        Optional, only used if reposition=True. Contains the new
        average momentum of the beam after repositioning. Should be specified
        as [px_avg, py_avg, pz_avg] in non-dimensional units (beta*gamma).
Setting a component as None prevents repositioning in that coordinate.
filter_min, filter_max : list
List of length 7 with the minimum and maximum value for each particle
        component, ordered as [x, y, z, px, py, pz, q]. Particles outside the
given range will be filtered out. For values which are 'None', no
filtering is performed.
Other Parameters
----------------
**kwargs
This method takes additional keyword parameters that might be needed
for some data readers. Possible parameters are 'species_name' and
'plasma_dens'.
"""
read_beam_from = {'csrtrack': read_csrtrack_data_fmt1,
'astra': read_astra_data,
'openpmd': read_openpmd_beam,
'osiris': read_osiris_beam,
'hipace': read_hipace_beam,
'fbpic': read_fbpic_input_beam}
x, y, z, px, py, pz, q = read_beam_from[code_name](file_path, **kwargs)
if reposition:
reposition_bunch([x, y, z, px, py, pz, q], avg_pos+avg_mom)
if any(filter_min + filter_max):
x, y, z, px, py, pz, q = filter_beam(
np.array([x, y, z, px, py, pz, q]), filter_min, filter_max)
return x, y, z, px, py, pz, q
def read_csrtrack_data_fmt1(file_path):
"""Reads particle data from CSRtrack in fmt1 format and returns it in the
    units used by APtools.
Parameters
----------
file_path : str
Path to the file with particle data
Returns
-------
A tuple with 7 arrays containing the 6D phase space and charge of the
particles.
"""
data = np.genfromtxt(file_path)
z = data[1:, 0]
x = data[1:, 1]
y = data[1:, 2]
pz = data[1:, 3] / (ct.m_e*ct.c**2/ct.e)
px = data[1:, 4] / (ct.m_e*ct.c**2/ct.e)
py = data[1:, 5] / (ct.m_e*ct.c**2/ct.e)
q = data[1:, 6]
x[1:] += x[0]
y[1:] += y[0]
z[1:] += z[0]
px[1:] += px[0]
py[1:] += py[0]
pz[1:] += pz[0]
return x, y, z, px, py, pz, q
def read_astra_data(file_path, remove_non_standard=True):
    """Reads particle data from ASTRA and returns it in the units used by
APtools.
Parameters
----------
file_path : str
Path to the file with particle data
remove_non_standard : bool
Determines whether non-standard particles (those with a status flag
other than 5) should be removed from the read data.
Returns
-------
A tuple with 7 arrays containing the 6D phase space and charge of the
particles.
"""
data = np.genfromtxt(file_path)
status_flag = data[:, 9]
if remove_non_standard:
data = data[np.where(status_flag == 5)]
x = data[:, 0]
y = data[:, 1]
z = data[:, 2]
px = data[:, 3] / (ct.m_e*ct.c**2/ct.e)
py = data[:, 4] / (ct.m_e*ct.c**2/ct.e)
pz = data[:, 5] / (ct.m_e*ct.c**2/ct.e)
z[1:] += z[0]
pz[1:] += pz[0]
q = data[:, 7] * 1e-9
return x, y, z, px, py, pz, q
def read_openpmd_beam(file_path, species_name=None):
    """Reads particle data from an h5 file following the openPMD standard and
    returns it in the units used by APtools.
Parameters
----------
file_path : str
Path to the file with particle data
species_name : str, Optional
Name of the particle species. Optional if only one particle species
is available in the openpmd file.
Returns
-------
A tuple with 7 arrays containing the 6D phase space and charge of the
particles.
"""
file_content = H5File(file_path, mode='r')
# get base path in file
iteration = list(file_content['/data'].keys())[0]
base_path = '/data/{}'.format(iteration)
# get path under which particle data is stored
particles_path = file_content.attrs['particlesPath'].decode()
# list available species
available_species = list(
file_content[join_infile_path(base_path, particles_path)])
if species_name is None:
if len(available_species) == 1:
species_name = available_species[0]
else:
raise ValueError(
'More than one particle species is available. '
'Please specify a `species_name`. '
'Available species are: ' + str(available_species))
# get species
beam_species = file_content[
join_infile_path(base_path, particles_path, species_name)]
# get data
mass = beam_species['mass']
charge = beam_species['charge']
position = beam_species['position']
position_off = beam_species['positionOffset']
momentum = beam_species['momentum']
m = mass.attrs['value'] * mass.attrs['unitSI']
q = charge.attrs['value'] * charge.attrs['unitSI']
x = (position['x'][:] * position['x'].attrs['unitSI'] +
position_off['x'].attrs['value'] * position_off['x'].attrs['unitSI'])
y = (position['y'][:] * position['y'].attrs['unitSI'] +
position_off['y'].attrs['value'] * position_off['y'].attrs['unitSI'])
z = (position['z'][:] * position['z'].attrs['unitSI'] +
position_off['z'].attrs['value'] * position_off['z'].attrs['unitSI'])
px = momentum['x'][:] * momentum['x'].attrs['unitSI'] / (m*ct.c)
py = momentum['y'][:] * momentum['y'].attrs['unitSI'] / (m*ct.c)
pz = momentum['z'][:] * momentum['z'].attrs['unitSI'] / (m*ct.c)
w = beam_species['weighting'][:]
q *= w
return x, y, z, px, py, pz, q
def read_hipace_beam(file_path, plasma_dens):
    """Reads particle data from a HiPACE particle file and returns it in the
    units used by APtools.
Parameters
----------
file_path : str
Path to the file with particle data
plasma_dens : float
        Plasma density in units of cm^{-3} used to convert the beam data to
        non-normalized units.
Returns
-------
A tuple with 7 arrays containing the 6D phase space and charge of the
particles.
"""
s_d = plasma_skin_depth(plasma_dens)
file_content = H5File(file_path, mode='r')
# sim parameters
n_cells = file_content.attrs['NX']
sim_size = (file_content.attrs['XMAX'] - file_content.attrs['XMIN'])
cell_vol = np.prod(sim_size/n_cells)
q_norm = cell_vol * plasma_dens * 1e6 * s_d**3 * ct.e
# get data
q = np.array(file_content.get('q')) * q_norm
x = np.array(file_content.get('x2')) * s_d
y = np.array(file_content.get('x3')) * s_d
z = np.array(file_content.get('x1')) * s_d
px = np.array(file_content.get('p2'))
py = np.array(file_content.get('p3'))
pz = np.array(file_content.get('p1'))
return x, y, z, px, py, pz, q
def read_osiris_beam(file_path, plasma_dens):
    """Reads particle data from an OSIRIS particle file and returns it in the
    units used by APtools.
Parameters
----------
file_path : str
Path to the file with particle data
plasma_dens : float
        Plasma density in units of cm^{-3} used to convert the beam data to
        non-normalized units.
Returns
-------
A tuple with 7 arrays containing the 6D phase space and charge of the
particles.
"""
s_d = plasma_skin_depth(plasma_dens)
file_content = H5File(file_path, mode='r')
# get data
q = np.array(file_content.get('q')) * ct.e
x = np.array(file_content.get('x2')) * s_d
y = np.array(file_content.get('x3')) * s_d
z = np.array(file_content.get('x1')) * s_d
px = np.array(file_content.get('p2'))
py = np.array(file_content.get('p3'))
pz = np.array(file_content.get('p1'))
return x, y, z, px, py, pz, q
def read_fbpic_input_beam(file_path, q_tot):
"""Reads particle data from an FBPIC input beam file
Parameters
----------
file_path : str
Path to the file with particle data
q_tot: float
Total beam charge.
Returns
-------
A tuple with 7 arrays containing the 6D phase space and charge of the
particles.
"""
data = np.genfromtxt(file_path)
x = data[:, 0]
y = data[:, 1]
z = data[:, 2]
px = data[:, 3]
py = data[:, 4]
pz = data[:, 5]
q = np.ones(len(x))*q_tot/len(x)
return x, y, z, px, py, pz, q | APtools | /APtools-0.2.4-py3-none-any.whl/aptools/data_handling/reading.py | reading.py |
=====
APy
=====
With *APy* you can make a Python API and then serve it online, make scalable,
intercommunicable applications, document and organize your code, make command-line
interfaces quickly, reuse it in other applications, and more cool stuff...
*APy* provides utilities to easily create applications or modules that must
interact with other applications or modules. It's just that simple.
*APy* can receive Python functions (and RESTful objects soon) and collects
their data, docstrings, and annotations. *APy* then stores all functions and
manages them by context. One can also retrieve a *Context* object to easily
access all the functions of a given context.
Some simple usage looks like this::
#!/usr/bin/env python
from apy.core.api import Api
api = Api()
@api.add()
def foo(a, b, c):
return a * b + c
with api.context("web"):
@api.add()
def foo(a, b):
return a + b
    if __name__ == "__main__":
        with api.context("web") as web:
            print(web.foo(1, 2))
To download and install *APy* use::
pip install git+https://github.com/Jbat1Jumper/APy.git
Or any python3 pip shortcut that you may have.
`Full documentation here <https://github.com/Jbat1Jumper/APy/wiki>`_ | APy2 | /APy2-0.1.tar.gz/APy-0.1/README.rst | README.rst |
from .context import Context
from .function import Function
from .resource import Resource
from ..util.simple_match import smatch
class Api():
def __init__(self):
self._functions = {}
self._resources = {}
self._context = []
def context(self, name=None):
name = name or self.current_context()
return Context(self, name)
def current_context(self):
if not self._context:
return "root"
return self._context[-1]
def enter_context(self, name):
self._context.append(name)
def exit_context(self, name=None):
if self._context:
self._context.pop(-1)
def add(self, name=None, context=None):
from inspect import isfunction
def decorator(x):
if isfunction(x) or isinstance(x, Function):
y = self._add_function(x, name, context)
elif isinstance(x, Resource):
y = self._add_resource(x, name, context)
else:
                raise TypeError(
                    "can only add functions or Resource objects, "
                    "got %r" % type(x))
return y
return decorator
def _add_function(self, f, name, context):
if hasattr(f, "_func"):
f = f._func
y = Function(f)
y.name = name or y.name
y.context = context or self.current_context()
y.key = "%s.%s" % (str(y.context), str(y.name))
self._functions[y.key] = y
return y
    def _add_resource(self, r, name, context):
raise NotImplementedError("add_resource")
def find_functions(self, name="*", context=None):
results = []
context = context or self.current_context()
for foo in self._functions.values():
if smatch(foo.name, name) and smatch(foo.context, context):
results.append(foo)
return results
def get_function(self, name, context=None):
context = context or self.current_context()
key = "%s.%s" % (str(context), str(name))
if key in self._functions:
return self._functions[key]
return None | APy2 | /APy2-0.1.tar.gz/APy-0.1/apy2/core/api.py | api.py |
import os
import sys
try:
from apyk.functions import *
from apyk.ui_backupscreen import Ui_BackupDialog
from apyk.ui_main import *
except BaseException:
from functions import *
from ui_backupscreen import Ui_BackupDialog
from ui_main import *
version = "0.0.2"
class NewThread(QtCore.QThread):
'''Thread for install package from wheel'''
NTSignal = QtCore.pyqtSignal(object)
def __init__(self, action, param, parent=None):
QtCore.QThread.__init__(self, parent)
self.alive = True
self.action = action
self.param = param
def CheckAdb(self):
'''This is to check if ADB is in the system environment variables'''
try:
devnull = open(os.devnull, 'w')
subprocess.call("adb", stdout=devnull, stderr=devnull)
return True
except FileNotFoundError:
return False
def run(self):
'''Run thread'''
if self.alive == True:
adb = self.CheckAdb()
if self.action == 'check':
if adb == True:
out = subprocess.run(['adb', 'devices'],
stdout=subprocess.PIPE)
out = out.stdout.decode('utf8')
self.NTSignal.emit(out)
else:
self.NTSignal.emit(False)
elif self.action == 'packages':
for pkg in run("adb shell pm list packages"):
pkg = pkg.split(':')
pkg = pkg[1]
if self.param == True:
name = AppLookup(pkg)
if name == False:
self.NTSignal.emit(pkg)
else:
self.NTSignal.emit(pkg + ' | ' + name)
else:
self.NTSignal.emit(pkg)
elif self.action == 'backup':
for pkg in self.param[0]:
results = []
for out in run("adb shell pm path {0}".format(pkg)):
results.append(out)
if len(results) > 1: # more than one apk, so, select named base.apk
target = [apk for apk in results if 'base' in apk][0]
else: # select the only apk
target = results[0]
# Split package word
target = target.split('package:')[1]
self.NTSignal.emit(b('Current package: ') + pkg)
self.NTSignal.emit(b('APK path: ') + target)
for out in run(r"adb pull {0} {1}\\{2}".format(target, self.param[1], pkg)):
if 'does not exist' in out:
self.NTSignal.emit(
b('Status: ') + bred('Remote object does not exist'))
else:
self.NTSignal.emit(
b('Status: ') + bgreen('Successfully exported'))
self.NTSignal.emit('---------')
self.NTSignal.emit('Step')
class BackupWindow(QtWidgets.QDialog, Ui_BackupDialog):
'''Window for processing selected packages'''
def __init__(self, parent, items):
QtWidgets.QDialog.__init__(self, parent)
self.setupUi(self)
self.setWindowIcon(QtGui.QIcon(os.path.join(ThisDir(), 'icon.ico')))
self.items = items
self.SetLabelCount()
self.SetBarConfig()
self.count = 0
self.canStart = True
self.Destiny = ''
self.RequestFolder()
# Create Thread Class for allow access from other methods
if self.canStart == True:
self.CollectThread = NewThread(
'backup', [self.items, self.Destiny])
self.StartBackup()
else:
self.Cancelled()
def StartBackup(self):
self.CollectThread.NTSignal.connect(self.BackupTHROut)
self.CollectThread.finished.connect(self.AfterBackup)
self.CollectThread.start()
self.curr_process.setText('Backing up')
def RequestFolder(self):
dest = GetDestiny(self)
if dest == False:
self.canStart = False
else:
self.Destiny = dest
def Cancelled(self):
self.curr_process.setText('The backup process has been cancelled')
def BackupTHROut(self, output):
        if 'Step' not in output:  # Show output in 'Console'
self.out.append(output)
elif 'Step' in output: # Modify progressbar status
self.count += 1
self.progressBar.setValue(self.count)
def AfterBackup(self):
self.CollectThread.alive = False
self.curr_process.setText('Finished')
def SetLabelCount(self):
'''Assign number of packets to be processed to the label'''
self.pkgs_count.setText(str(len(self.items)))
def SetBarConfig(self):
self.progressBar.setMaximum(len(self.items))
def closeEvent(self, event):
        if hasattr(self, 'CollectThread'):
if self.CollectThread.alive == True:
ask = Ask(
self, 'Exit?', 'There is a process in place, are you sure you want to leave?')
if ask == True:
self.CollectThread.alive = False
self.CollectThread.wait()
event.accept()
else:
event.ignore()
else:
event.accept()
class MainWindow(QtWidgets.QMainWindow, Ui_MainWindow):
'''Main window class'''
def __init__(self):
QtWidgets.QMainWindow.__init__(self)
self.setupUi(self)
self.setWindowIcon(QtGui.QIcon(os.path.join(ThisDir(), 'icon.ico')))
self.list_pkgs.itemSelectionChanged.connect(self.ShowSelected)
self.filter_box.textChanged.connect(self.FilterPkgs)
self.btn_backup.clicked.connect(self.StartBackup)
self.actionAbout.triggered.connect(lambda: about(self, version))
self.selected_pkgs.setHidden(True)
self.btn_backup.setHidden(True)
self.status_label.setHidden(True)
self.Continue = True
self.DoingSomething = False
self.AppLookup = False
act = SearchName(self)
if act == True:
self.AppLookup = True
self.CheckDevice()
def StartBackup(self):
'''This starts the backup process, assigns the selected packets to a list and sends them to the window where they will be processed.'''
pkgs = []
for row in range(self.selected_pkgs.count()):
itemobj = self.selected_pkgs.item(row)
item = itemobj.text()
pkgs.append(item)
BW = BackupWindow(self, pkgs)
BW.show()
def StatusLab(self, text):
'''Displays the status label and assigns the received text to it'''
self.status_label.setHidden(False)
self.status_label.setText(text)
def CheckDevice(self):
'''Start the thread to check the connected devices'''
self.StatusLab('Checking device...')
self.CheckThread = NewThread('check', None)
self.CheckThread.NTSignal.connect(self.CheckTHROutput)
self.CheckThread.finished.connect(self.AfterCheck)
self.CheckThread.start()
self.DoingSomething = True
def CheckTHROutput(self, output):
'''Gets the output of the device check thread'''
if output == False: # error with adb command
self.Continue = False
self.StatusLab("ADB was not detected")
else:
if '\tdevice' in output:
device = [int(s) for s in output.split() if s.isdigit()]
if len(device) == 1:
self.field_device.setText(str(device[0]))
else:
self.Continue = False
self.StatusLab(
'More than one device has been detected')
else:
self.Continue = False
if 'unauthorized' in output:
self.StatusLab('Device unauthorized')
else:
self.StatusLab('No device detected')
def AfterCheck(self):
        '''Executed at the end of the device check thread'''
        self.DoingSomething = False
# Continue is temporal, maybe is better show an alert an exit
if self.Continue == True:
self.GetPackages()
def GetPackages(self):
        '''Start the thread to get the packages from the connected device'''
self.StatusLab('Searching packages...')
self.GetPkgsThread = NewThread('packages', self.AppLookup)
self.GetPkgsThread.NTSignal.connect(self.ShowOutput)
self.GetPkgsThread.finished.connect(self.AfterPackages)
self.GetPkgsThread.start()
self.DoingSomething = True
def ShowOutput(self, out):
'''Method for show console output in each Thread call'''
self.list_pkgs.addItem(out)
def AfterPackages(self):
        '''Runs at the end of the thread to get the packages from the connected device'''
        self.DoingSomething = False
self.filter_box.setEnabled(True)
self.status_label.setHidden(True)
self.select_all.setEnabled(True)
self.deselect_all.setEnabled(True)
self.clear_filter.setEnabled(True)
def FilterPkgs(self, text):
        '''Filters the list of packages based on the given text'''
for row in range(self.list_pkgs.count()):
itemobj = self.list_pkgs.item(row)
item = itemobj.text()
if text:
itemobj.setHidden(not ffilter(text, item))
else:
itemobj.setHidden(False)
def ShowSelected(self):
'''It is executed each time an item is selected from the list of packages and is displayed in a list part'''
listwid = self.list_pkgs
selcwid = self.selected_pkgs
selected = listwid.selectedItems()
if len(selected) != 0:
self.selected_pkgs.setHidden(False)
self.btn_backup.setHidden(False)
selcwid.clear()
for x in selected:
selcwid.addItem(x.text().split(' | ')[0])
else:
self.selected_pkgs.setHidden(True)
self.btn_backup.setHidden(True)
def closeEvent(self, event):
if self.DoingSomething == True:
ask = Ask(
self, 'Exit?', 'There is a process in place, are you sure you want to leave?')
if ask == True:
KillAdbServer()
sys.exit()
else:
event.ignore()
else:
KillAdbServer()
event.accept()
def main():
app = QtWidgets.QApplication([])
window = MainWindow()
window.show()
app.exec_()
if __name__ == "__main__":
main() | APyK | /APyK-0.0.2-py3-none-any.whl/apyk/win_main.py | win_main.py |
import os
import subprocess
import google_play_scraper as gps
from PyQt5 import QtWidgets
def KillAdbServer():
os.system("adb kill-server")
def Ask(self, title, msg):
bsgbox = QtWidgets.QMessageBox
answ = bsgbox.question(self, title, msg, bsgbox.Yes | bsgbox.No)
if answ == bsgbox.Yes:
return True
else:
return False
def GetDestiny(self):
fname = QtWidgets.QFileDialog.getExistingDirectory(
self, 'Select a directory', '')
if fname:
return fname
else:
return False
def AppLookup(package):
try:
result = gps.app(package)
return result['title']
except gps.exceptions.NotFoundError:
return False
def ThisDir():
current_dir = os.path.dirname(os.path.realpath(__file__))
return current_dir
def run(command):
p = subprocess.Popen(command,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT, shell=True)
for line in iter(p.stdout.readline, b''):
yield line.decode('utf8').rstrip()
def about(self, v):
QAlert = QtWidgets.QMessageBox()
QAlert.about(self, "About APyK",
'APyK v{0} by Pedro Torcatt\n\nhttps://github.com/Jalkhov/apyk\n\nIcon by GuillenDesign'.format(v))
def SearchName(self):
ask = Ask(self, 'App name lookup',
'Do you want to activate the search for package names? This option will try to search for the app names of the obtained packages. Apps that are not found in the playstore will not be modified. This option needs internet connection and may take longer. Activate?')
return ask
def ffilter(text, item):
return text in item
def b(text):
'''Add bold tags to received text'''
return '<b>' + text + '</b>'
def bgreen(text):
'''Add bold tags to received text'''
return '<b><span style="color:green;">' + text + '</span></b>'
def bred(text):
'''Add bold tags to received text'''
return '<b><span style="color:red;">' + text + '</span></b>' | APyK | /APyK-0.0.2-py3-none-any.whl/apyk/functions.py | functions.py |
# AQAInterpreter
[](https://pypi.org/project/AQAInterpreter)
[](https://pypi.org/project/AQAInterpreter)
-----
**Table of Contents**
- [Installation](#installation)
- [License](#license)
## Installation
```console
pip install AQAInterpreter
```
## License
`AQAInterpreter` is distributed under the terms of the [MIT](https://spdx.org/licenses/MIT.html) license.
| AQAInterpreter | /aqainterpreter-0.0.7.tar.gz/aqainterpreter-0.0.7/README.md | README.md |
\pagebreak
# Analysis
## Background
Pseudo-code is a series of code-like statements used to describe an existing algorithm or plan out a new one. Everyone has their own unique style of writing pseudo-code. It might look something like a recipe book with clear individual steps to follow, or it could look something like an existing high-level language, with compilers or interpreters already available.
Pseudo-code is 'problem-oriented', the idea being you first write your solution in pseudo-code, and then when the time comes to program a solution, your brain is free to think about the specific implementation details of the chosen high-level language. Therefore, the purpose of writing pseudo-code is to prototype and plan.
However, some people find writing pseudo-code tedious or boring, and would prefer going straight into interpreted, weakly-typed scripting languages such as Python or JavaScript, which trade robustness for rapid iteration speed, enabling faster prototyping.
**the imprecision of natural language**
Pseudo-code can also be ambiguous. For example, some languages like Lua start array indexing at `1`, so `array[1]` returns the first element `"a"`.
``` {.lua .numberLines}
-- lua code
array = {"a", "b", "c"}
print(array[1]) -- prints `a`
```
However, other languages like Python start array indexing at `0`, so `array[1]` in this case returns the second element `"b"`.
``` {.python .numberLines}
# python code
array = ["a", "b", "c"]
print(array[1]) # prints `b`
```
Imagine if both of these snippets were written in pseudo-code instead of a well-defined language. It would be impossible to determine correctly whether the program was intended to print `a` or `b`. There are many other ways ambiguity can be introduced in pseudo-code, for example inclusive/exclusive for loops, rounding floats, or the use of `Nil` or `None` types.
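One of these ambiguities can be made concrete. A pseudo-code loop written `FOR i ← 1 TO 3` could plausibly be read as including or excluding the end value; a minimal Python sketch of both readings:

``` {.python .numberLines}
# Two plausible readings of the pseudo-code loop "FOR i <- 1 TO 3"

# Reading 1: inclusive of the end value (the reading AQA's spec takes)
inclusive = [i for i in range(1, 3 + 1)]

# Reading 2: exclusive of the end value (how Python's range() behaves)
exclusive = [i for i in range(1, 3)]

print(inclusive)  # prints [1, 2, 3]
print(exclusive)  # prints [1, 2]
```

Without a precise spec there is no way to tell which reading the author intended.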
However, general-purpose pseudo-code is very different from AQA's pseudo-code, which, despite being referred to as pseudo-code, has strict rules.
``` {.python .numberLines}
# AQA Pseudo-code
array = ["a", "b", "c"]
OUTPUT array[1]
```
Following the spec, arrays in AQA Pseudo-code start indexing at `0`, so the output is `b`. Due to the consistency of the spec, we were able to unambiguously determine the output.
This consistency means that it would be possible to write a translator that would take any set of pseudo-code following AQA's style and convert it to the corresponding machine code. This blurs the line between pseudo-code and *real languages*. So henceforth, correct AQA pseudo-code following the spec will just be referred to as AQA code.
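To illustrate what such a translator might look like (a hypothetical sketch, not the design of the finished program), a first step could map a one-statement subset of AQA code onto Python:

``` {.python .numberLines}
# Hypothetical sketch: translate a tiny subset of AQA code to Python.
# Only lines of the form "OUTPUT <expression>" are rewritten; everything
# else is passed through unchanged. A real translator would need a proper
# lexer and parser rather than string matching.

def translate_line(line: str) -> str:
    stripped = line.strip()
    if stripped.startswith("OUTPUT "):
        return "print(" + stripped[len("OUTPUT "):] + ")"
    return line

aqa_source = [
    'array = ["a", "b", "c"]',
    "OUTPUT array[1]",
]

python_source = "\n".join(translate_line(line) for line in aqa_source)
exec(python_source)  # prints: b
```

Translating to Python (and letting the Python interpreter handle the final step down to machine code) keeps the sketch short; the point is only that AQA code is regular enough to be translated mechanically.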
## Justification
Some people may argue that a translator for AQA code would be unnecessary and would hinder students. AQA code is mainly used in offline, on-paper examinations without a computer, so a tool to generate machine code would not be needed, and it may confuse students as to what the purpose of pseudo-code is. Furthermore, where a working algorithm is needed, it would be sufficient to manually translate AQA pseudo-code into an existing high-level language where a compiler or interpreter is already available.
However, I would argue that an AQA code translator would have real-world uses, not just as a research project. For example, using this tool would give students more experience and knowledge of AQA code, which would aid reading and comprehension skills. Moreover, it would mean that manually rewriting AQA code into another high-level language, for example Python, would be unneeded, and it would avoid the chance of bugs being introduced in the rewrite, saving students a large amount of time.
It could also help teachers, who could demonstrate automatically tracing an algorithm using a debugger, or aid examiners in marking. However, the problem is mainly being solved for students to aid learning, so I will attempt to find a student as my primary client.
## Research
I have chosen to involve as student named Reece in year 10, who is interested in testing my project. He is currently studying GCSE computer science and intents to take the A level. A so, he will be my primary end user. To research this problem I have chosen to produce a Questionnaire to gauge his needs. The responses were collected in person and have been summarized.
1. **What is your opinion on AQA 'pseudo-code'?**
Using 'pseudo-code' is useful for learning programming; however, it is impractical to use.
2. **How would a pseudo-code interpreter be useful in learning?**
'pseudo-code' is used on exam papers, so having a good understanding of it is important. A 'pseudo-code' interpreter could make it easier to get to grips with the language.
3. **How important is performance, or is usability a more important factor?**
Usability is more important; the solution should be user friendly before being fast.
4. **Should the program work in vscode/pycharm or be an online IDE?**
I personally use vscode, so a vscode extension would be nice and would integrate with my workflow, but I know other people use pycharm, which is on the school computers, or the online IDE called repl.it.
5. **If you want an online IDE**
1. **Should be the default colour scheme be light or dark?**
Every programmer prefers dark theme.
2. **What colours should the syntax highlighting be?**
I use atom syntax highlighting, I even have an extension for it in vscode.
6. **How should the program display errors?**
The interpreter should tell the user what line the error was found on, and ideally a clear description on how to fix it.
7. **Other requirements?**
I would like a translucent background because they look cool.
## Analysis of research
Students who are interested in coding care a lot about aesthetics and developer experience. They expect tools like auto-completion and customisable themes to be present in any development environment they use. Specifically, my client expects first-class support in vscode, pycharm and repl.it. However, my goal will be to prioritise core language features and get a working AQA pseudo-code translator before tackling the editor features my client requests.
## Background
There are three methods that could be used to convert AQA pseudo-code to machine code: writing a **compiler**, an **interpreter** or a **source-to-source compiler**. Compilers and interpreters produce a lower-level representation of the source code, such as an intermediate representation (IR) or machine code. A source-to-source compiler, also known as a transpiler, converts the source to a different high-level language, for example converting AQA pseudo-code to Python. The resultant high-level language can then be run with a compiler or interpreter that is already available.
**compilers**

- **Compilers** first scan/tokenize the source code, producing a list of tokens, which are then parsed, producing an intermediate format such as byte-code. This is then converted into *machine code*. For example the *clang* compiler for *C++* converts the source code to *LLVM bytecode*, which is then converted to standalone *machine code* for each system, for example *x86*. However, *Java bytecode* is distributed standalone and each system requires a *JVM* (Java Virtual Machine) installed to do the final conversion step to *machine code*.
- **Interpreters** scan and parse but the source code is executed statement by statement and converted to *machine code* on the fly. Interpreters are simpler to implement but can be slower than the final output of a compiled language.
- **Transpilers** scan and parse, but the intermediate form is converted to another high-level language where a compiler or interpreter is already available. For example, the *Nim* programming language works by first transpiling to either *C*, *C++* or *JavaScript*.
- Other notes: the syntax of languages can be expressed with a formal grammar. Tools like yacc and lex are known as compiler-compilers, as they create a compiler given the grammar of the language to be built as input. However, I will not be using these, as I am interested in learning how a translator works.
**Advantages and disadvantages of each approach**
The advantage of a compiler is that it can optimize the resulting machine code, making the executable more efficient. However, a disadvantage is that the machine code is not portable and cannot be copied over to different systems. Furthermore, the compilation step may take a large amount of time for complex projects, which means that errors also take a long time to show up. This reduces iteration speed and can result in a worse developer experience than an interpreter, where the errors show up much quicker.
Therefore I will create an interpreter, as it is the simplest to implement, and based on my user research usability and developer experience are more important factors than performance.
## Analysis of existing solutions
Currently, there are no existing solutions for translating AQA 'pseudo-code'. However, I found two translators for IB 'pseudo-code'. One of them is a website called [EZ Pseudocode](http://ibcomp.fis.edu/pseudocode/pcode.html), officially endorsed by the IB computer science page. The website is written entirely in *HTML* and *JavaScript*, including the translation logic. This makes running programs feel snappy, as the *JavaScript* code runs client-side and does not have to wait on a network request. Moreover, the website has the ability to save and load programs from the browser's cookies, which is a nice feature.
However, as noted in the comments of the sample program, the algorithm works by transpiling the user's code into *JavaScript* using a basic find-and-replace. This is not very robust and can lead to many bugs. For example, if the user enters `output "/ mod /"`, you would expect the string `"/ mod /"` to be printed. Instead, the string `/ % /` is printed, because the `translate` function calls `line.replace()` on the whole line. To fix this bug, the algorithm would need to tokenize the input, which is much more complicated.
``` {.js .numberLines}
function translate(line) {
line = line.replace(/ mod /g, " % ") // The bug is here
var lin = line.trim();
var sp = lin.indexOf(" ");
var first = "";
if (startswith(lin, "if")) { first = "if" }
...
}
```
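To illustrate this class of bug outside the site's own code (in Python, with hypothetical function names), a naive global replace corrupts string literals, whereas even a simple quote-aware pass leaves them alone:

```python
# Hypothetical illustration of the bug class: a naive global replace
# rewrites " mod " even when it appears inside a string literal.
def naive_translate(line: str) -> str:
    return line.replace(" mod ", " % ")

# A sketch of a safer approach: track whether we are inside a quoted
# string and only substitute outside of string literals.
def quote_aware_translate(line: str) -> str:
    out, in_string, quote = [], False, ""
    i = 0
    while i < len(line):
        ch = line[i]
        if in_string:
            out.append(ch)
            if ch == quote:          # closing quote ends the literal
                in_string = False
        elif ch in "\"'":
            in_string, quote = True, ch
            out.append(ch)
        elif line.startswith(" mod ", i):
            out.append(" % ")
            i += 5
            continue
        else:
            out.append(ch)
        i += 1
    return "".join(out)

print(naive_translate('output "/ mod /"'))        # string literal corrupted
print(quote_aware_translate('output "/ mod /"'))  # string literal preserved
```

A full tokenizer generalises the `in_string` flag into proper scanner state, which is the approach my own implementation takes.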
Another solution, also for IB 'pseudo-code', is written by [Deep Jain](https://pseudocode.deepjain.com/). It is much more complicated and includes syntax highlighting and auto-completion powered by an open-source code editor called [ace](https://github.com/ajaxorg/ace). It defaults to dark mode, which makes the website easier on the eyes. It has large buttons at the top and the ability to save and load files from the user's cookies, similar to the previous program. However, the web page makes an *HTTP* request to a server to translate the code, making it closed source and also slower than the previous solution.
Both programs are websites, which is convenient as the user does not have to download any language tools. Websites are also portable and can be accessed on any computer with an internet connection. Therefore I would consider developing a simple online IDE as well, secondary to a programmed solution.
Another disadvantage is that both solutions are limited to the IB computer science syllabus and not AQA's. Focusing my project on AQA's 'pseudo-code' will make my project unique. My solution should also be open source like the first example allowing the user to view the source code to better understand how their code was interpreted.
## Project Requirements
1. Create a tree-walk interpreter for all the features in AQA's 'pseudo-code' command set including but not limited to:
- `REPEAT`, `WHILE`, `FOR`, `IF`, `RECORD`, `SUBROUTINE` constructs
- `INPUT`, `OUTPUT`, `LEN`, `POSITION`, `SUBSTRING`, `RANDOM_INT` functions
- `STRING`, `INT`, `REAL`, `BOOL` types
- `INPUT`, `OUTPUT` operations
- variables, constants and arrays
    If it is not possible to implement all of these features, the language should at least be Turing complete. For a language to be considered Turing complete it needs at least arithmetic operations, control flow (`WHILE` / `REPEAT` loops), and access to arbitrary memory (arrays).
2. Additionally, I would like to make keywords case insensitive, giving the user the ability to style code to their preference.
3. The program should accept a large range of input. For example, the 'pseudo-code' command set uses the Unicode multiplication sign `(×)` whereas most programming languages use `(*)`, as it can be typed on a traditional keyboard. My program should accept both of these symbols, making it adaptable. A table of these special symbols is shown below.
| Traditional | Unicode |
| :---------: | :-----: |
| * | × |
| / | ÷ |
| != | `≠` |
| <= | `≤` |
| >= | `≥` |
| <- | `←` |
4. Robust error handling, informing the user of the line on which syntax errors occurred.
5. Create an online IDE for users to quickly try out the language without having to install any extra language tools on their local machine.
6. Add syntax highlighting to highlight keywords and constructs, following the colours of the atom text editor, as it was my client's preference.
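Requirement 3's symbol handling could be sketched as a small normalisation table mapping each Unicode spelling to its ASCII equivalent. The names here are illustrative, not my final implementation, and a real scanner would apply the mapping per token so that string literals are left untouched:

```python
# Illustrative mapping of Unicode operator spellings to their ASCII
# equivalents, matching the table of special symbols above.
EQUIVALENT_SYMBOLS = {
    "×": "*",
    "÷": "/",
    "≠": "!=",
    "≤": "<=",
    "≥": ">=",
    "←": "<-",
}

def normalise(source: str) -> str:
    """Rewrite Unicode operator spellings to ASCII (sketch only: a real
    scanner would do this at the token level, not on raw source text)."""
    for unicode_sym, ascii_sym in EQUIVALENT_SYMBOLS.items():
        source = source.replace(unicode_sym, ascii_sym)
    return source

print(normalise("a ← b × 2"))  # a <- b * 2
```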
# Documented design
## Language Choice
To translate 'pseudo-code' I am going to build a *tree-walk* interpreter. The rough structure of my implementation is based on the book *Crafting Interpreters* (ISBN 9780990582939) by *Robert Nystrom*, which is written in *Java*. I have decided to use *Python* instead, as it has a simple and readable syntax and is dynamically typed. This means I can re-use *Python's* base types, which support string concatenation and integers of arbitrary precision, meaning that integers will never overflow. *Python's* slower performance is not an issue, as having a robust solution is a higher priority, and Python is widely understood and popular. Python is also multi-paradigm and supports OOP, which is the main language feature I will use to structure my code. I also intend to use modules and split my code across multiple files to separate concerns.
## High level system overview
Converting 'pseudo-code' to *machine code* comprises several stages. After the user's source code is read in from a file or *stdin*, it is stored in a variable of type string and resides in memory. The first main stage is known as scanning or tokenizing, where the alphabet of characters is grouped together to form tokens that represent the grammar and punctuation of the language.
During this stage lexical analysis is performed to group the right number of characters into the right token. Some tokens are made up of single characters such as (+) and (-), whereas other tokens are made up of a set number of characters such as FOR and IF.
STRING and NUMBER literals are made up of a variable number of characters and need to be handled correctly. If the digit 3 is found inside a string, it should be treated as part of the STRING literal and not as its own NUMBER literal. Therefore the scanner will need to treat the same character differently depending on its state, such as whether it has seen an opening `"` or `'`. NUMBER literals include floats, so the scanner should also handle decimal points correctly.
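This stateful behaviour can be sketched as follows. This is a deliberately minimal scanner, not my full implementation: it only recognises STRING and NUMBER tokens and skips everything else, but it shows how a digit inside quotes stays part of the string, and how a number may contain a decimal point:

```python
# Minimal sketch of a stateful scanner: the same character (e.g. a digit)
# is treated differently depending on whether we are inside a string.
def scan(source: str) -> list[tuple[str, str]]:
    tokens, i = [], 0
    while i < len(source):
        ch = source[i]
        if ch in "\"'":                      # opening quote: consume until match
            quote, start = ch, i + 1
            i += 1
            while i < len(source) and source[i] != quote:
                i += 1
            tokens.append(("STRING", source[start:i]))
            i += 1                           # skip the closing quote
        elif ch.isdigit():
            start = i
            # consume digits and an optional decimal point for REAL literals
            while i < len(source) and (source[i].isdigit() or source[i] == "."):
                i += 1
            tokens.append(("NUMBER", source[start:i]))
        else:
            i += 1                           # everything else skipped in this sketch
    return tokens

print(scan('OUTPUT "3 lucky" 3.5'))  # [('STRING', '3 lucky'), ('NUMBER', '3.5')]
```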
**Source code**
``` {.aqa .numberLines}
IDENTIFIER `i`
| ╭───── ASSIGNMENT token
↓ ↓
i <- 1
WHILE i <= 5
IF i = 3 ←─ INTEGER literal
OUTPUT "3 is a lucky number" ←─ STRING literal
ELSE ←─ ELSE token
OUTPUT i ←─ IDENTIFIER token
ENDIF ←─ ENDIF token
i <- i + 1
ENDWHILE ←─ ENDWHILE token
```
**Table of tokens**
| Token | Value |
| :--------- | :-------------------: |
| IDENTIFIER | 'i' |
| ASSIGNMENT | |
| NUMBER | 1 |
| WHILE | |
| IDENTIFIER | 'i' |
| LESS_EQUAL | |
| NUMBER | 5 |
| IF | |
| IDENTIFIER | 'i' |
| EQUAL | |
| NUMBER | 3 |
| PRINT | |
| STRING | '3 is a lucky number' |
| ELSE | |
| PRINT | |
| IDENTIFIER | 'i' |
| END | |
| IDENTIFIER | 'i' |
| ASSIGNMENT | |
| IDENTIFIER | 'i' |
| ADD | |
| NUMBER | 1 |
| END | |
| EOF | |
Looking at the table, the scanner has produced 24 separate tokens, including an EOF (End Of File) token. The variable `i` is given a special IDENTIFIER token. We can use a hashmap data structure to store the value of `i` as it is incremented at the end of each iteration of the `WHILE` loop, so that it is printed correctly each time.
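As a minimal sketch of that hashmap idea (using a plain Python dict; my real symbol table class may differ), variable names map to their current values, so a re-assignment simply overwrites the old value:

```python
# A dict-backed symbol table: names map to current values.
symbols: dict[str, object] = {}

symbols["i"] = 1                 # i <- 1
symbols["i"] = symbols["i"] + 1  # i <- i + 1 overwrites the old value
print(symbols["i"])  # 2
```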
**Parsing**
The next step is parsing, where we convert the alphabet of tokens into expressions. This will be modelled as an Abstract Syntax Tree (AST). The nesting of nodes inside a tree allows us to represent the nesting of our `FOR` and `IF` blocks, as well as correctly defining the order of operations of expressions following BIDMAS.
To do this the parser could use two possible methods to recognize the start and end of our `FOR` and `IF` blocks. Method 1 involves counting indentation levels, which would require our scanner to emit INDENT tokens matching tab characters or spaces. This can be complicated and error-prone where the user inconsistently mixes tabs and spaces. However, it would make the use of the `ENDFOR` and `ENDIF` keywords optional.
The second method is to completely ignore the indentation and only look at the `ENDFOR` and `ENDIF` keywords to determine the end of our `FOR` and `IF` blocks. This is simpler and less error-prone, as it makes leading spaces or tabs optional, but the user can still include them for readability. Therefore, this is the design I'll choose to use.
That aside, after parsing our AST looks like this:

During this stage the parser performs syntactic analysis, mapping tokens to constructs such as `WHILE` and `IF`. The parser sees a `WHILE` token, so it knows what follows has to be a condition. Every statement thereafter is nested inside the `WHILE` block until the parser sees the `ENDWHILE` token, producing a tree data structure that represents the order of operations. The final stage is interpreting this tree.
The tree is interpreted from the leaves to the root. The source code is made up of two statements. The first is the variable declaration `i <- 1` and the next is the `WHILE` loop. The while loop is made of the condition `i <= 5` and two statements: the `IF` statement and another assignment `i <- i + 1`. The `IF` statement also consists of a condition, `i = 3`, with only a single `=` and not a double `==`. This is because `<-` is used as the assignment operator, so the comparison operator can be a single equals sign, unlike in other languages. The `IF` statement has a then and an else branch, each consisting of a single `OUTPUT` statement. Each construct like the `WHILE` and the `IF` can nest any number of statements.
A symbol table is used to keep track of the `i` variable as its value changes throughout the program. From here, instead of traversing the tree and emitting byte-code, we'll take a simpler but less efficient approach and run the Python equivalent for each statement. So `OUTPUT` is mapped to the Python `print()` function.
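A sketch of this leaves-to-root evaluation, with made-up node classes (my real node names may differ): each node evaluates its children before combining them, and `OUTPUT` maps straight onto Python's `print()`:

```python
from dataclasses import dataclass

# Illustrative AST node classes (names are assumptions for this sketch).
@dataclass
class Literal:
    value: object

@dataclass
class Binary:
    left: object
    op: str
    right: object

def evaluate(node):
    """Walk the tree bottom-up: children are evaluated before parents."""
    if isinstance(node, Literal):
        return node.value
    if isinstance(node, Binary):
        left, right = evaluate(node.left), evaluate(node.right)
        return left + right if node.op == "ADD" else left * right
    raise TypeError(f"unknown node: {node!r}")

def execute_output(expr):
    # OUTPUT maps straight onto Python's print()
    print(evaluate(expr))

# OUTPUT 1 + 2 * 3  -- the TIMES node sits deeper in the tree, so it is
# evaluated first, giving the correct order of operations.
execute_output(Binary(Literal(1), "ADD", Binary(Literal(2), "TIMES", Literal(3))))  # 7
```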
## Language syntax
Backus-Naur Form (BNF) is a useful notation for describing the grammar of languages. BNF is a series of rules, each consisting of a head and a body, making up a production. The head is on the LHS of the `'::='` and the body is on the RHS of the `'::='`. A production can either be *terminal* or *non-terminal*. A *terminal* production matches string literals, number literals or tokens. A *non-terminal* production matches other rules. Note: keywords are case insensitive, so `PRINT` or `print` or any other casing is perfectly valid. Although this gives the user fewer options for valid variable names, it gives the language more flexibility in the valid source code it accepts. Each BNF statement will be accompanied by a syntax diagram, for a clearer explanation.
Moreover the meta-characters `(*)`, `(?)` and `(|)` will be used. The `(*)` means zero or more, the `(?)` means zero or one and the `(|)` means or.
`<program> ::= <declarations> "EOF"`

`<declarations> ::= <declaration>*`

`<declaration> ::= (<variable_declaration> | <statement>)?`

`<statement> ::= (<printStatement> | <whileStatement> | <forStatement> | <ifStatement>)?`

`<variable_declaration> ::= "IDENTIFIER" "<-" <expression>`

`<end> ::= "END" | "ENDIF" | "ENDWHILE" | "ENDFOR"`

`<printStatement> ::= ( "OUTPUT" | "PRINT" ) <expression>`

`<ifStatement> ::= "IF" <expression> ("THEN" | ":")? <declarations> ("ELSE" ":"? <declarations>)? <end>`

`<whileStatement> ::= "WHILE" <expression> ("THEN" | ":")? <declarations> <end> `

`<forStatement> ::= "FOR" <variable_declaration> "TO" <expression> ("STEP" <expression>)? <declarations> <end>`

`<expression> ::= <logic_or>`

`<logic_or> ::= <logic_and> ( OR <logic_and> )*`

`<logic_and> ::= <equality> ( AND <equality> )*`

`<equality> ::= <comparison> ( ( "==" | "!=" ) <comparison> )*`

`<comparison> ::= <term> ( ( ">" | ">=" | "<" | "<=" ) <term> )*`

`<term> ::= <factor> ( ( "-" | "+" ) <factor> )*`

`<factor> ::= <unary> ( ( "/" | "*" ) <unary> )*`

`<unary> ::= ( NOT | "-" ) <unary> | <primary>`

`<primary> ::= INTEGER | REAL | STRING | None | True | False | "(" <expression> ")"`

## User Interface
My program will have a basic command line interface. It should let the user pick from running the program via the Read Eval Print Loop (REPL), passing in the program as a string, or reading in the program from a file. The program should also display a helpful message when called with the `--help` flag. Below is a draft of what this might look like.
``` {.bash .numberLines}
# display help message
$ python aqainterpreter.py --help
Usage: aqainterpreter.py [OPTIONS] [FILENAME]
Options:
-c, --cmd TEXT
--help Show this message and exit.
# incorrect usage
$ python aqainterpreter.py filename -c cmd
Usage: aqainterpreter.py [OPTIONS] [FILENAME]
Try 'aqainterpreter.py --help' for help.
Error: cannot specify both filename and cmd at same time
# starting the repl (read-eval-print-loop)
$ python aqainterpreter.py
> OUTPUT 'Hi!'
Hi!
# program read in from file
$ python aqainterpreter.py input.txt
Hi!
# program passed in as a string
$ python aqainterpreter.py --cmd "OUTPUT 'Hi!'"
Hi!
```
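A sketch of how this interface could be wired up, using Python's standard `argparse` module (the final program may use a different CLI library, and `read_source` is a hypothetical helper standing in for the real entry point):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="aqainterpreter.py")
    parser.add_argument("filename", nargs="?")          # optional [FILENAME]
    parser.add_argument("-c", "--cmd",
                        help="program passed in as a string")
    return parser

def read_source(argv=None):
    """Return the source text to interpret, or None to start the REPL."""
    args = build_parser().parse_args(argv)
    if args.filename and args.cmd:
        # mirrors the draft's error message for conflicting options
        build_parser().error("cannot specify both filename and cmd at same time")
    if args.cmd is not None:
        return args.cmd                      # program passed as a string
    if args.filename is not None:
        with open(args.filename) as f:
            return f.read()                  # program read in from a file
    return None                              # no input: fall through to the REPL

print(read_source(["--cmd", "OUTPUT 'Hi!'"]))  # OUTPUT 'Hi!'
```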
# Technical Solution
**module hierarchy**
{ width=30% }
**Class diagrams**
{ width=30% }
{ width=20% }
{ width=30% }

## project structure
\TECHNICAL_SOLUTION
## syntax highlighting
Another of my client's needs was syntax highlighting for the vscode editor. Due to time limitations, I instead prioritised the syntax highlighting of the code snippets in this document. This document was produced with pandoc, which accepts KDE-style XML syntax definition files, so I wrote one for AQA pseudo-code. Unfortunately I couldn't get comments to work, which is why AQA pseudo-code comments appear black in this document whereas they appear green in Python snippets. The XML file below contains regular expressions and is written in a much more declarative style than the tokenizer and parser I wrote in Python. In fact it is only 114 lines, compared to my scanner's 203.
**aqa.xml**
``` {.xml .numberLines}
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE language>
<language name="AQA" section="Markup" version="7" kateversion="2.4"
extensions="*.aqa" mimetype="application/json">
<highlighting>
<list name="Constants">
<item>NOT</item>
<item>TRUE</item>
<item>FALSE</item>
<item>NONE</item>
<item>AND</item>
<item>OR</item>
<item>THEN</item>
<item>WHILE</item>
<item>DO</item>
<item>END</item>
<item>ENDIF</item>
<item>ENDWHILE</item>
<item>ENDFOR</item>
<item>FOR</item>
<item>TO</item>
<item>STEP</item>
<item>IDENTIFIER</item>
<item>EOF</item>
</list>
<list name="Built_In">
<item>OUTPUT</item>
<item>PRINT</item>
</list>
<list name="Control_Flow">
<item>IF</item>
<item>ELSE</item>
</list>
<list name="Data_Types">
<item>INTEGER</item>
<item>REAL</item>
<item>STRING</item>
</list>
<list name="Operators">
<item> + </item>
<item> - </item>
<item> * </item>
<item> × </item>
<item> / </item>
<item> ÷ </item>
<item> = </item>
<item> > </item>
<item> ≥ </item>
<item> >= </item>
<item> ≤ </item>
<item> ≠ </item>
<item> != </item>
</list>
<contexts>
<context name="Normal" lineEndContext="#stay">
      <DetectChar char="&quot;" context="String_Value"
        attribute="Style_String_Value"/>
      <DetectChar char="'" context="String_Value"
        attribute="Style_String_Value"/>
<DetectSpaces context="#stay" attribute="Style_Normal" />
<keyword String="Constants" context="#stay" attribute="Style_Keyword"/>
<keyword String="Built_In" context="#stay" attribute="Style_Function"/>
<keyword String="Control_Flow" context="#stay" attribute="Style_Control_Flow"/>
<keyword String="Data_Types" context="#stay" attribute="Style_String_Key"/>
<keyword String="Operators" context="#stay" attribute="Style_Operator"/>
      <Detect2Chars char="&lt;" char1="-" context="#stay" attribute="Style_Operator"/>
<RegExpr String="-?[0-9]+\.[0-9]+(?:[eE][+-]?[0-9]+)?" context="#stay"
attribute="Style_Float" />
<RegExpr String="-?[0-9]+(?:[eE][+-]?[0-9]+)?" context="#stay"
attribute="Style_Decimal"/>
</context>
<context name="String_Value" lineEndContext="#pop" attribute="Style_String_Value">
      <DetectChar char="&quot;" context="#pop" attribute="Style_String_Value" />
      <RegExpr String="\\(?:[&quot;\\/bfnrt]|u[0-9a-fA-F]{4})" context="#stay"
        attribute="Style_String_Value_Char" />
</context>
      <!-- <context name="Comment" lineEndContext="#pop" attribute="Style_Comment">
        <RegExpr String="." attribute="Style_Comment"/>
      </context> -->
</contexts>
<itemDatas>
<itemData name="Style_Normal" defStyleNum="dsNormal" />
<itemData name="Style_Comment" defStyleNum="dsComment" />
<itemData name="Style_Decimal" defStyleNum="dsDecVal" />
<itemData name="Style_Float" defStyleNum="dsFloat" />
<itemData name="Style_String_Key" defStyleNum="dsDataType" />
<itemData name="Style_String_Value" defStyleNum="dsString" />
<itemData name="Style_Control_Flow" defStyleNum="dsControlFlow" />
<itemData name="Style_Function" defStyleNum="dsFunction" />
<itemData name="Style_Operator" defStyleNum="dsOperator" />
<itemData name="Style_String_Value_Char" defStyleNum="dsChar" />
<itemData name="Style_Keyword" defStyleNum="dsKeyword" />
</itemDatas>
</highlighting>
<general>
<comments>
<comment name="singleLine" start="#"/>
</comments>
<keywords casesensitive="0"/>
</general>
</language>
```
Then to load this file into pandoc, the document generator used to create this document, I use the following command.
``` {.bash .numberLines}
pandoc metadata.yaml report.md \
--output=out.pdf \
--syntax-definition=aqa.xml \
--pdf-engine=xelatex \
--table-of-contents \
--number-sections
```
`metadata.yaml` describes extra information such as the headers and footers used in this document, and `aqa.xml` is the KDE-style XML file shown previously.
# Testing
To perform quality assurance on my project, I created several unit tests inside the `test_.py` file. These tests were run using Python's `pytest` library. The `pytest` library automatically runs all functions and files prefixed with `test_`, hence the strange name `test_.py`. Below is a more detailed snippet of all the code samples that were used for testing. However, as the set of strings matched by a context-free grammar is infinite, it is impossible to test every pseudo-code input that could be run by my program. Therefore my testing covers the main language features I managed to implement, plus a couple of extra pseudo-code programs that exercise multiple language features at once.
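The shape of these tests can be sketched as follows. Here `interpret` is a toy stand-in that only understands `OUTPUT` of a literal; the assumption is that the real entry point returns (or prints) the program's output so it can be compared against the expected text:

```python
# Toy stand-in for the real interpreter entry point, used only to show
# the structure of the pytest tests (functions prefixed with test_).
def interpret(source: str) -> str:
    outputs = []
    for line in source.splitlines():
        line = line.strip()
        if line.startswith("OUTPUT "):
            outputs.append(line[len("OUTPUT "):].strip("'\""))
    return "\n".join(outputs)

def test_output_statement():
    assert interpret("OUTPUT 'Hi!'") == "Hi!"

def test_multiple_statements():
    assert interpret("OUTPUT 'a'\nOUTPUT 'b'") == "a\nb"
```

When pytest collects `test_.py` it discovers and runs every such `test_` function automatically, reporting each assertion failure with the expected and actual values.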
## testing expressions
``` {.aqa .numberLines}
OUTPUT 1 + 1 # 2
OUTPUT 1 - 1 # 0
OUTPUT -1 # -1
OUTPUT 2 * 1 # 2
OUTPUT 2 × 1 # 2
OUTPUT 2 / 1 # 2.0
OUTPUT 2 ÷ 1 # 2.0
OUTPUT "hi" + "÷" # "hi÷"
OUTPUT "a" * 3 # "aaa"
OUTPUT 1 > 0 # True
OUTPUT 1 ≥ 0 # True
OUTPUT 1 >= 1 # True
OUTPUT 1 < 0 # False
OUTPUT 1 ≤ 0 # False
OUTPUT 1 <= 1 # True
```
## testing comments
``` {.aqa .numberLines}
# a comments
```
## testing assignment
``` {.aqa .numberLines}
a <- 0
a <- a + 1
OUTPUT a # 1
```
## testing if statements
``` {.aqa .numberLines}
IF True
OUTPUT "yes"
ENDIF # yes
IF False
OUTPUT "yes"
ENDIF # (no output)
IF True
IF True
OUTPUT "yes"
ENDIF
ENDIF # "yes"
```
## testing while loops
``` {.aqa .numberLines}
a <- 1
WHILE a <= 3 DO
OUTPUT a
a <- a + 1
ENDWHILE # 1, 2, 3
# fibonacci sequence
a <- 1
b <- 1
c <- 2
count <- 0
WHILE count != 2
OUTPUT a
a <- b + c
OUTPUT b
b <- c + a
OUTPUT c
c <- a + b
count <- count + 1
ENDWHILE # 1, 1, 2, 3, 5, 8
```
## testing for loops
``` {.aqa .numberLines}
FOR a <- 1 TO 1
OUTPUT a
ENDFOR # 1
FOR a <- 1 TO 1 STEP 1
OUTPUT a
ENDFOR # 1
FOR a <- 1 TO 1 STEP -1
OUTPUT a
ENDFOR # 1
FOR a <- 1 TO 3
OUTPUT a
ENDFOR # 1, 2, 3
FOR a <- 1 TO 3 STEP 1
OUTPUT a
ENDFOR # 1, 2, 3
FOR a <- 3 TO 1
OUTPUT a
ENDFOR # 3, 2, 1
FOR a <- 3 TO 1 STEP -1
OUTPUT a
ENDFOR # 3, 2, 1
FOR a <- 1 TO 5 STEP 2
OUTPUT a
ENDFOR # 1, 3, 5
FOR a <- 5 TO 1 STEP -2
OUTPUT a
ENDFOR # 5, 3, 1
FOR a <- 1 TO 2
FOR b <- 1 TO 2
OUTPUT a
OUTPUT b
OUTPUT ''
ENDFOR
ENDFOR # 1,1 1,2 2,1 2,2
FOR a <- 1 TO 12
FOR b <- 1 TO 12
OUTPUT a + " × " + b + " = " + (a * b)
END
END
# 1 × 1 = 1
# 1 × 2 = 2
# 1 × 3 = 3
# 1 × 4 = 4
# 1 × 5 = 5
# 1 × 6 = 6
# ... all the times tables up to 12 × 12
```
## Tokens and AST
Here are the tokens and the AST generated by my program for a couple of the tests, as showing them all would be too long.
### program 1
``` {.aqa .numberLines}
OUTPUT 1 + 2 * 3
```
**tokens**
``` {.python .numberLines}
[Token(type='PRINT', lexeme='OUTPUT', line=1),
Token(type='NUMBER', lexeme='1', line=1),
Token(type='ADD', lexeme='', line=1),
Token(type='NUMBER', lexeme='2', line=1),
Token(type='TIMES', lexeme='', line=1),
Token(type='NUMBER', lexeme='3', line=1),
Token(type='EOF', lexeme='', line=2)]
```
**ast**
``` {.python .numberLines}
[
Print(
expression=Binary(
left=Literal(value=1),
operator=Token(type="ADD", lexeme="", line=1),
right=Binary(
left=Literal(value=2),
operator=Token(type="TIMES", lexeme="", line=1),
right=Literal(value=3),
),
)
)
]
```
### program 2
``` {.aqa .numberLines}
IF True
IF True
OUTPUT "yes"
ENDIF
ENDIF
```
**tokens**
``` {.python .numberLines}
[Token(type='IF', lexeme='IF', line=1),
Token(type='TRUE', lexeme='True', line=1),
Token(type='IF', lexeme='IF', line=2),
Token(type='TRUE', lexeme='True', line=2),
Token(type='PRINT', lexeme='OUTPUT', line=3),
Token(type='STRING', lexeme='"yes"', line=3),
Token(type='END', lexeme='ENDIF', line=4),
Token(type='END', lexeme='ENDIF', line=5),
Token(type='EOF', lexeme='', line=6)]
```
**ast**
``` {.python .numberLines}
[
If(
condition=Literal(value=True),
then_branch=[
If(
condition=Literal(value=True),
then_branch=[Print(expression=Literal(value="yes"))],
else_branch=[],
)
],
else_branch=[],
)
]
```
### program 3
``` {.aqa .numberLines}
FOR a <- 1 TO 12
FOR b <- 1 TO 12
OUTPUT a + " × " + b + " = " + (a * b)
END
END
```
**tokens**
``` {.python .numberLines}
[Token(type='FOR', lexeme='FOR', line=1),
Token(type='IDENTIFIER', lexeme='a', line=1),
Token(type='ASSIGNMENT', lexeme='', line=1),
Token(type='NUMBER', lexeme='1', line=1),
Token(type='TO', lexeme='TO', line=1),
Token(type='NUMBER', lexeme='12', line=1),
Token(type='FOR', lexeme='FOR', line=2),
Token(type='IDENTIFIER', lexeme='b', line=2),
Token(type='ASSIGNMENT', lexeme='', line=2),
Token(type='NUMBER', lexeme='1', line=2),
Token(type='TO', lexeme='TO', line=2),
Token(type='NUMBER', lexeme='12', line=2),
Token(type='PRINT', lexeme='OUTPUT', line=3),
Token(type='IDENTIFIER', lexeme='a', line=3),
Token(type='ADD', lexeme='', line=3),
Token(type='STRING', lexeme='" × "', line=3),
Token(type='ADD', lexeme='', line=3),
Token(type='IDENTIFIER', lexeme='b', line=3),
Token(type='ADD', lexeme='', line=3),
Token(type='STRING', lexeme='" = "', line=3),
Token(type='ADD', lexeme='', line=3),
Token(type='LEFT_PAREN', lexeme='', line=3),
Token(type='IDENTIFIER', lexeme='a', line=3),
Token(type='TIMES', lexeme='', line=3),
Token(type='IDENTIFIER', lexeme='b', line=3),
Token(type='RIGHT_PAREN', lexeme='', line=3),
Token(type='END', lexeme='END', line=4),
Token(type='END', lexeme='END', line=5),
Token(type='EOF', lexeme='', line=6)]
```
**ast (wow this is long)**
``` {.python .numberLines}
[
Var(
name=Token(type="IDENTIFIER", lexeme="a", line=1), initialiser=Literal(value=1)
),
While(
condition=Binary(
left=Variable(name=Token(type="IDENTIFIER", lexeme="a", line=1)),
operator=Token(type="LESS_EQUAL", lexeme="", line=0),
right=Literal(value=12),
),
body=[
Var(
name=Token(type="IDENTIFIER", lexeme="b", line=2),
initialiser=Literal(value=1),
),
While(
condition=Binary(
left=Variable(name=Token(type="IDENTIFIER", lexeme="b", line=2)),
operator=Token(type="LESS_EQUAL", lexeme="", line=0),
right=Literal(value=12),
),
body=[
Print(
expression=Binary(
left=Binary(
left=Binary(
left=Binary(
left=Variable(
name=Token(
type="IDENTIFIER", lexeme="a", line=3
)
),
operator=Token(type="ADD", lexeme="", line=3),
right=Literal(value=" × "),
),
operator=Token(type="ADD", lexeme="", line=3),
right=Variable(
name=Token(
type="IDENTIFIER", lexeme="b", line=3
)
),
),
operator=Token(type="ADD", lexeme="", line=3),
right=Literal(value=" = "),
),
operator=Token(type="ADD", lexeme="", line=3),
right=Grouping(
expression=Binary(
left=Variable(
name=Token(
type="IDENTIFIER", lexeme="a", line=3
)
),
operator=Token(type="TIMES", lexeme="", line=3),
right=Variable(
name=Token(
type="IDENTIFIER", lexeme="b", line=3
)
),
)
),
)
),
Var(
name=Token(type="IDENTIFIER", lexeme="b", line=2),
initialiser=Binary(
left=Variable(
name=Token(type="IDENTIFIER", lexeme="b", line=2)
),
operator=Token(type="ADD", lexeme="", line=0),
right=Literal(value=1),
),
),
],
),
Var(
name=Token(type="IDENTIFIER", lexeme="a", line=1),
initialiser=Binary(
left=Variable(name=Token(type="IDENTIFIER", lexeme="a", line=1)),
operator=Token(type="ADD", lexeme="", line=0),
right=Literal(value=1),
),
),
],
),
]
```
# Evaluation
My program achieves a large number of my project requirements, so I consider it a success. The first objective was partially met. I implemented `WHILE`, `FOR` and `IF` statements but missed out `REPEAT`, `RECORD` and `SUBROUTINE` statements. I also didn't implement any of the functions, including the call stack, but I did have all the data types. Other things that are missing were constants and arrays.
Since I didn't implement arrays, my solution wasn't Turing complete, but it did feature control flow, so at least we were halfway there.
My project was expressive enough to write some novel programs, such as the Fibonacci sequence and displaying the times tables. And I successfully implemented a scanner, a parser and a tree-walk interpreter.
Objectives two and three were met fully. My program has case-insensitive keywords due to the `.lower()` in `self.source[self._start : self._current].lower()` in `scanner.py` on line 160. The capitalisation of all the keywords in this document was purely stylistic. Objective three was also fully met, due to more effort in `scanner.py`. In fact, there aren't many other languages that accept special symbols such as `×`, `÷`, `≤` and `≠`, unless you use a special font with ligatures. The support of these symbols means my project is more suitable for source code that is to be printed.
Objective 4 was also met. My program shows helpful error messages. For example if the user entered:
``` {.aqa .numberLines}
IF True
OUTPUT "HI"
```
Then, due to the logic on lines 217-218 (and 225-226) in `parser.py`:
``` {.python .numberLines}
if self._peek().type == EOF:
    raise self._error(self._peek(), "Expected END after IF statement")
```
The user would see the message `[line 3] Error at '': Expected END after IF statement`. This message clearly informs the programmer that they forgot an `END` statement at the end of their `IF` statement, and shows the line on which the parser encountered the error. It is triggered because the parser ran into the `EOF` (End Of File) token while the `IF` statement had not yet been closed. Based on this example and others in `parser.py`, I would conclude that my program does indeed feature robust error handling and useful error messages.
Objective five was not quite met. I did have a prototype of an online IDE that would allow people to run pseudo-code right in their browser without having to have a Python environment set up. Below is a screenshot as well as the HTML. However, I didn't have time to hook it up to my AQA interpreter or deploy it live on the web.
If the code editor looks familiar, it is because it uses the Monaco editor, which also powers VS Code (the editor my client uses).
``` html
<!DOCTYPE html>
<html>
<head>
<title>browser-amd-editor</title>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
</head>
<body>
<h2>Monaco Editor Sample</h2>
<button onclick="run()">click me</button>
<div id="container" style="width: 800px; height: 600px; border: 1px solid grey">
</div>
<script
src=
'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.36.1/min/vs/loader.min.js'
>
</script>
<script>
// fetch(document.URL + "api/run")
console.log(document.URL);
require.config(
{
paths:{
vs:'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.36.1/min/vs'
}}
);
require(['vs/editor/editor.main'], function () {
window.editor = monaco.editor.create(
document.getElementById('container'), {
value: 'print("Hi")',
language: 'python'
});
});
function run() {
console.log(window.editor.getValue());
}
</script>
</body>
</html>
```

Objective 6 was also met, as you will no doubt have seen from the syntax highlighting of the AQA code snippets throughout this document. The code for this is also explained in the technical solution.
| AQAInterpreter | /aqainterpreter-0.0.7.tar.gz/aqainterpreter-0.0.7/report/report.md | report.md |
Lox is a scripting language with curly braces, similar to JS. The book takes you through writing a tree-walk implementation in Java and a byte-code implementation in C.
## Tree walk interpreter
1. scanning / tokenizing
```julia
"output 5 + 31" -> ["output", "5", "+", "31"]
```
Each element in the above list is known as a lexeme.
2. parsing
```julia
["output", "5", "+", "31"] ->
output # statement
|
+ # binary operation
/ \
5 31 # constants
```
3. Traverse tree and evaluate it
4. Or traverse tree and transpile it to ruby
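Step 3 can be sketched as a tiny evaluator over the tree drawn above. This is an illustration only: tuples stand in for real AST node classes, and only `+` is handled.

```python
def evaluate(node):
    """Tree-walk evaluation: operators are ("+", left, right) tuples,
    constants are plain numbers."""
    if isinstance(node, tuple):
        op, left, right = node
        if op == "+":
            return evaluate(left) + evaluate(right)
        raise ValueError("unknown operator: " + op)
    return node

evaluate(("+", 5, 31))  # the tree for "output 5 + 31" evaluates to 36
```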
- Compiler-compilers are programs which take in a grammar and create an interpreter for said grammar, for example Yacc and Lex
- Languages can be dynamically typed or statically typed, meaning type checking is done at runtime or at compile time respectively; Lox is dynamically typed
Most language grammars use a flavour of BNF.
A 'string' is made up of characters from the 'alphabet'.
| program | Lexical grammar | Syntactic grammar |
| -------------- | --------------- | ----------------- |
| alphabet | characters | tokens |
| string | tokens | expressions |
| implemented by | scanner | parser |
A production has a head and a body:
- the head is the name
- the body is the actual rule
The body can be terminal or non-terminal: a terminal production is part of the alphabet, while a non-terminal rule can link to another rule.
For example:
```julia
# BNF style grammar
# non-terminal rules
sentence -> "There are " + num + " " + "animals at the " + place
num -> num + digit
num -> digit
# terminal rules
digit -> 1
digit -> 2
digit -> 3
...
digit -> 9
place -> "Farm"
place -> "Zoo"
# e.g. `There are 3 animals at the Zoo`
```
python asts can be printed via
```python
from ast import dump, parse
print(dump(parse('1 + 1'), indent=4))
```
I want the traceback to either use Python technology or implement my own. If the user has a function, I want to support a full traceback of their program, with an optional flag that makes the full traceback of interpreter.py be shown instead.
For now let's show the full traceback, which we don't mind seeing, and we'll check this issue out later when we go about implementing functions.
Currently skipped
- assignment syntax returning a value so `a <- b <- 1` doesn't work. p122 8.4
- expression blocks p125 8.5
- expression statements working in the REPL
The bnf the book uses is
```ruby
forStmt -> "for" "(" ( varDecl | exprStmt | ";" ) expression? ";" expression? ")" statement ;
```
Actually, for arrays we probably want a `len` function, so maybe we should do functions first. Or should I finish for loops for the sake of completion?
OK, for the billionth time, here is the BNF for a for loop:
```ruby
forStmt -> "for" varDecl "to" expression ("step" expression)? list[stmt]
```
OK, we have a problem where the varDecl does a lookahead, but that will break in a for loop, so in the statement() function we look ahead to the end of the line before a FOR or NEWLINE.
OK, what is the actual parser code for a `for_statement`?
## BNF
expressions
```ruby
program -> declaration* EOF
declaration -> varDecl | stmt
stmt -> printStmt
| ifStmt
| whileStmt
| forStmt
varDecl -> IDENTIFIER "<-" expression
printStmt -> ( "PRINT" | "OUTPUT" ) expression
ifStmt -> "IF" expression ( "THEN" | ":" )? declaration* ( "ELSE" declaration* )? "ENDIF"
whileStmt -> "WHILE" expression ( "DO" | ":" ) declaration* "ENDWHILE"
forStmt -> "FOR" varDecl "TO" expression ( "STEP" expression )? declaration*
expression -> assignment
logic_or -> logic_and ( "or" logic_and )*
logic_and -> equality ( "and" equality )*
equality -> comparison ( ( "!=" | "==" ) comparison )*
comparison -> term ( ( ">" | ">=" | "<" | "<=" ) term )*
term -> factor ( ( "-" | "+" ) factor )*
factor -> unary ( ( "/" | "*" ) unary )*
unary -> ( "!" | "-" ) unary | primary
primary -> INTEGER | REAL | STRING | "True" | "False" |
"None" | "(" expression ")"
```
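A rule like `term` above maps directly onto a recursive-descent parsing function. A self-contained sketch, with `factor` simplified to a bare integer literal (so this is not the real parser code):

```python
def parse_factor(tokens, pos):
    # factor is simplified to a bare integer literal for this sketch
    return int(tokens[pos]), pos + 1

def parse_term(tokens, pos=0):
    # term -> factor ( ( "-" | "+" ) factor )*
    node, pos = parse_factor(tokens, pos)
    while pos < len(tokens) and tokens[pos] in ("+", "-"):
        op = tokens[pos]
        right, pos = parse_factor(tokens, pos + 1)
        node = (op, node, right)  # fold as we go: left-associative
    return node, pos

tree, _ = parse_term(["5", "+", "31", "-", "2"])  # ("-", ("+", 5, 31), 2)
```

The `while` loop mirrors the `( ... )*` repetition in the grammar, and building the tuple inside the loop is what makes `-` and `+` left-associative.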
\pagebreak
```aqa
i <- 1
WHILE i <= 5
IF i = 3
OUTPUT "3 is a lucky number"
ELSE
PRINT(i)
END
i <- i + 1
END
``` | AQAInterpreter | /aqainterpreter-0.0.7.tar.gz/aqainterpreter-0.0.7/report/notes.md | notes.md |
# AQIPython
## Description
AQIPython is a Python module that calculates the Air Quality Index (AQI) for various air pollutants based on different standards. The module takes pollutant concentration values in parts per million (PPM), milligrams per cubic meter (mg/m³), and micrograms per cubic meter (µg/m³) and provides the corresponding AQI value.
## Installation
To install AQIPython, you can use pip, the Python package manager. Open your terminal or command prompt and run the following command:
```shell
pip install AQIPython
```
## Usage
Here's an example of how to use AQIPython to calculate the AQI for different pollutants:
```python
from AQIPython import calculate_aqi
AQI = calculate_aqi('IN', 'CO', 3.5, "ug/m3")
print(AQI)
```
Make sure to replace the country code, pollutant code, concentration value, and unit with your actual data. The calculate_aqi() function takes four arguments: the country code (e.g., 'IN' for India), the pollutant code (e.g., 'CO' for carbon monoxide), the concentration value, and the concentration unit (e.g., 'ug/m3' for micrograms per cubic meter).
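Under the hood, AQI calculators of this kind typically interpolate linearly between concentration breakpoints. A self-contained sketch for 24-hour US PM2.5 (illustrative only: the breakpoints follow the pre-2024 US EPA table, and this is not AQIPython's internal code):

```python
# (C_low, C_high, I_low, I_high) breakpoints for 24-hour PM2.5 in ug/m3
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400),
    (350.5, 500.4, 401, 500),
]

def pm25_aqi(concentration):
    """Linear interpolation within the matching breakpoint interval."""
    for c_low, c_high, i_low, i_high in PM25_BREAKPOINTS:
        if c_low <= concentration <= c_high:
            return round((i_high - i_low) / (c_high - c_low)
                         * (concentration - c_low) + i_low)
    raise ValueError("concentration out of range")

pm25_aqi(12.0)  # 50: the top of the "Good" band
```

Each pollutant and each national standard has its own breakpoint table, which is why the library takes both a country code and a pollutant code.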
## Supported Pollutants
AQIPython supports the following pollutants for calculating AQI:
- *PM25* (particulate matter with a diameter of 2.5 micrometers or less)
- *PM10* (particulate matter with a diameter of 10 micrometers or less)
- *NO2* (nitrogen dioxide)
- *SO2* (sulfur dioxide)
- *CO* (carbon monoxide)
- *O3* (ozone)
- *PB* (lead) (not supported for the US standard)
## Supported Standards
AQIPython supports the following Standards for calculating AQI:
- IN (India)
- US (United States of America)
## Supported units
AQIPython supports the following units for calculating AQI:
- ug/m3 (micrograms per cubic meter)
- mg/m3 (milligrams per cubic meter)
- ppm (parts per million)
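For gases, converting between these units is a linear scaling: at 25 °C and 1 atm, mg/m3 = ppm × M / 24.45, where M is the pollutant's molar mass in g/mol. A sketch of this general formula (not AQIPython's internal code; the molar masses are standard values):

```python
MOLAR_MASS = {"CO": 28.01, "NO2": 46.01, "SO2": 64.07, "O3": 48.00}  # g/mol
MOLAR_VOLUME = 24.45  # litres per mole of an ideal gas at 25 C and 1 atm

def ppm_to_mg_m3(pollutant, ppm):
    return ppm * MOLAR_MASS[pollutant] / MOLAR_VOLUME

def mg_m3_to_ppm(pollutant, mg_m3):
    return mg_m3 * MOLAR_VOLUME / MOLAR_MASS[pollutant]

ppm_to_mg_m3("CO", 9.0)  # roughly 10.3 mg/m3
```

Note that this applies only to gases; particulate matter (PM2.5, PM10) is measured directly as mass per volume, so ppm is not meaningful for it.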
## Contributing
If you'd like to contribute to AQIPython, please take a look at [Contributing Guidelines](https://github.com/Spritan/AQIPython/blob/main/CONTRIBUTING.md).
## Author
- [@Spritan](https://github.com/Spritan)
- [@dev-rajk](https://github.com/dev-rajk)
## Changelog
### [1.0.0] - 2023-07-10
- Initial release of AQIPython.
- Support for Indian and US standard AQI calculation.
- Calculate AQI for PM2.5, PM10, NO2, SO2, CO, and O3 pollutants.
- Provide function calculate_aqi() to calculate AQI based on pollutant concentrations.
## License
This module is released under the MIT License.
| AQIPython | /AQIPython-1.0.1.tar.gz/AQIPython-1.0.1/README.md | README.md |
#######################################################################
AQoPA
#######################################################################
Automated Quality of Protection Analysis tool of QoP-ML models. AQoPA is available in two modes: console and GUI mode.
--------------------------------------------------
Instruction for PIP and VIRTUALENV users is below.
--------------------------------------------------
INSTRUCTIONS FOR GNU/LINUX
==========================
1) INSTALLATION
Installation steps for GNU/Linux (Debian, Ubuntu)
1. Install Python 2.6+: sudo apt-get install python
2. Install wxPython 2.8: sudo apt-get install python-wxgtk2.8 python-wxtools wx2.8-i18n
3. Install Python PLY package: sudo apt-get install python-ply
4. Download and extract AQoPA from website: http://qopml.org/aqopa/
Installation steps for GNU/Linux (CentOS, Fedora, OpenSUSE, RedHat)
1. Install Python 2.6+: yum install python
2. Download and install wxPython 2.8 from this website: http://www.wxpython.org/download.php
3. Install Python PLY package: yum install python-ply
4. Download and extract AQoPA from website: http://qopml.org/aqopa/
2) RUN
-----------------------
GNU/Linux GUI version:
-----------------------
python bin/aqopa-gui
---------------------------
GNU/Linux console version:
---------------------------
python bin/aqopa-console
Run 'python bin/aqopa-console -h' to see all available options.
INSTRUCTIONS FOR MICROSOFT WINDOWS
==================================
Tested on Windows 7.
1) INSTALLATION
1. Download and install Python 2.7 from website: http://www.python.org/download/releases/2.7.6/ (Python will be installed into "C:\Python27" directory by default.)
2. Add Python directory to environment variable PATH:
- Open command line as Administrator: Start > cmd (mouse right-click -> Run as Administrator)
- Run: wmic ENVIRONMENT where "name='Path' and username='<%USERNAME%>'" set VariableValue="%Path%;C:\Python27\"
- Restart Windows
3. Download and install wxPython 2.8 from website: http://www.wxpython.org/download.php#stable
4. Download and extract python PLY 3.4 package from website: http://www.dabeaz.com/ply/
5. Install PLY 3.4:
- Open command line: Start > cmd
- Go to extracted directory with ply-3.4
- Run: python setup.py install
6. Download and extract AQoPA from website: http://qopml.org/aqopa/
2) RUN
------------------------------
Microsoft Windows GUI version:
------------------------------
1. Go to extracted AQoPA directory.
2. Double click aqopa-gui
----------------------------------
Microsoft Windows console version:
----------------------------------
1. Open command line: Start > cmd
2. Go to extracted AQoPA directory.
3. Run 'python bin/aqopa-console -h' to see all available options.
===========================================================================
=== INSTRUCTIONS FOR PIP & VIRTUALENV USERS ===
===========================================================================
INSTRUCTIONS FOR GNU/LINUX
==========================
1) INSTALLATION
Instalation steps for GNU/Linux (Debian, Ubuntu)
1. Install PIP: sudo apt-get install python-pip
2. Install PLY 3.4: using pip sudo pip install PLY
3. Install AQoPA using pip: sudo pip install AQoPA
Installing wxPython without virtualenv:
---------------------------------------
4. Install wxPython 2.8 sudo apt-get install python-wxgtk2.8 python-wxtools wx2.8-i18n
Installing wxPython with virtualenv:
------------------------------------
When using virtualenv the wxPython package may not be installed in the virtual environment. It is installed into the global python environment.
In order to make wxPython visible in virtual environment you need to create wx files in your virtual environment.
We assume that using apt-get the wxPython package has been installed in "/usr/lib/python2.7/dist-packages/" directory and the content of wx.pth file is "wx-2.8-gtk2-unicode". Otherwise, you have to find out where is wx.pth file and check its content.
4. Install wxPython 2.8 sudo apt-get install python-wxgtk2.8 python-wxtools wx2.8-i18n
5. Update wxPython paths. Replace with the path of virtualenv you have created:
- echo "/usr/lib/python2.7/dist-packages/wx-2.8-gtk2-unicode" > <virtual_env_path>/lib/python2.7/site-packages/wx.pth
- ln -s /usr/lib/python2.7/dist-packages/wxversion.py <virtual_env_path>/lib/python2.7/site-packages/wxversion.py
2) RUN
-----------------------
GNU/Linux GUI version:
-----------------------
Run aqopa-gui command: aqopa-gui
---------------------------
GNU/Linux console version:
---------------------------
Run aqopa-console command: aqopa-console
Type 'aqopa-console -h' to see all available options.
INSTRUCTIONS FOR MICROSOFT WINDOWS
==================================
Tested on Windows 7.
1) INSTALLATION
1. Download and install Python 2.7 from website: http://www.python.org/download/releases/2.7.6/ (Python will be installed into "C:\Python27" directory by default.)
2. Download and install wxPython 2.8 from website: http://www.wxpython.org/download.php#stable
3. Download and run pip-win 1.6 from website: https://sites.google.com/site/pydatalog/python/pip-for-windows (Pip-win will install some Python packages after its first run.)
4. Install PLY using pip-win. Write pip install PLY in the text input and click Run.
5. Install AQoPA using pip-win. Write pip install AQoPA in the text input and click Run.
2) RUN
------------------------------
Microsoft Windows GUI version:
------------------------------
1. Open directory "C:\Python27\Scripts" (assuming that Python has been installed in "C:\Python27").
2. Run file aqopa-gui.exe
----------------------------------
Microsoft Windows console version:
----------------------------------
1. Open command line (cmd).
2. Go to "C:\Python27\Scripts" (assuming that Python has been installed in "C:\Python27").
3. Run aqopa-console.exe -h to show the help of AQoPA console command.
| AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/README.txt | README.txt |
# AQoPA
Automated Quality of Protection Analysis tool of QoP-ML models. AQoPA is available in two modes: console and GUI mode.
[Project Homepage](http://qopml.org)
*Instruction for **pip** and **virtualenv** users is below.*
## INSTRUCTIONS FOR GNU/LINUX
### INSTALLATION
Installation steps for GNU/Linux (Debian, Ubuntu)
1. Install **Python 2.7** ```sudo apt-get install python```
2. Install **wxPython 2.8** ```sudo apt-get install python-wxgtk2.8 python-wxtools wx2.8-i18n```
3. Install Python **PLY** package ```sudo apt-get install python-ply```
4. Download and extract AQoPA from [http://qopml.org/aqopa/](http://qopml.org/aqopa/)
Installation steps for GNU/Linux (CentOS, Fedora, OpenSUSE, RedHat)
1. Install **Python 2.7** ```yum install python```
2. Download and install **wxPython 2.8** from [http://www.wxpython.org/download.php](http://www.wxpython.org/download.php)
3. Install Python **PLY** package ```yum install python-ply```
4. Download and extract **AQoPA** from [http://qopml.org/aqopa/](http://qopml.org/aqopa/)
### RUN
#### GNU/Linux GUI version:
python bin/aqopa-gui
#### GNU/Linux console version:
python bin/aqopa-console
Run 'python bin/aqopa-console -h' to see all available options.
## INSTRUCTIONS FOR MICROSOFT WINDOWS
Tested on Windows 7.
### INSTALLATION
1. Download and install **Python 2.7** from [http://www.python.org/download/releases/2.7.6/](http://www.python.org/download/releases/2.7.6/) (Python will be installed into "C:\Python27" directory by default.)
2. Add Python directory to environment variable PATH:
- Open command line as Administrator: *Start > cmd (mouse right-click -> Run as Administrator)*
- Run ```wmic ENVIRONMENT where "name='Path' and username='<%USERNAME%>'" set VariableValue="%Path%;C:\Python27\"```
- Restart Windows
3. Download and install **wxPython 2.8** from [http://www.wxpython.org/download.php#stable](http://www.wxpython.org/download.php#stable)
4. Download and extract python **PLY 3.4** package from [http://www.dabeaz.com/ply/](http://www.dabeaz.com/ply/)
5. Install **PLY 3.4**:
- Open command line: *Start > cmd*
- Go to extracted directory with **ply-3.4**
- Run ```python setup.py install```
6. Download and extract **AQoPA** from website: [http://qopml.org/aqopa/](http://qopml.org/aqopa/)
### RUN
#### Microsoft Windows GUI version:
1. Go to extracted AQoPA directory.
2. Double click **aqopa-gui**
#### Microsoft Windows console version:
1. Open command line: *Start > cmd*
2. Go to extracted AQoPA directory.
3. Run **python bin/aqopa-console -h** to see all available options.
---
## INSTRUCTIONS FOR PIP & VIRTUALENV USERS
## INSTRUCTIONS FOR GNU/LINUX
### INSTALLATION
#### Installation steps for GNU/Linux (Debian, Ubuntu)
1. Install **PIP** ```sudo apt-get install python-pip```
2. Install **PLY 3.4** using pip ```sudo pip install PLY```
3. Install **AQoPA** using pip ```sudo pip install AQoPA```
#### Installing wxPython without virtualenv:
4. Install **wxPython 2.8** ```sudo apt-get install python-wxgtk2.8 python-wxtools wx2.8-i18n```
#### Installing wxPython with virtualenv:
When using **virtualenv** the **wxPython** package may not be installed in the virtual environment. It is installed into the global python environment.
In order to make wxPython visible in virtual environment you need to create wx files in your virtual environment.
We assume that using **apt-get** the **wxPython** package has been installed in "/usr/lib/python2.7/dist-packages/" directory and the content of wx.pth file is "wx-2.8-gtk2-unicode". Otherwise, you have to find out where is wx.pth file and check its content.
4. Install **wxPython 2.8** ```sudo apt-get install python-wxgtk2.8 python-wxtools wx2.8-i18n```
5. Update **wxPython paths**. Replace with the path of virtualenv you have created:
- ```echo "/usr/lib/python2.7/dist-packages/wx-2.8-gtk2-unicode" > <virtual_env_path>/lib/python2.7/site-packages/wx.pth```
- ```ln -s /usr/lib/python2.7/dist-packages/wxversion.py <virtual_env_path>/lib/python2.7/site-packages/wxversion.py```
### RUN
#### GNU/Linux GUI version:
Run aqopa-gui command: **aqopa-gui**
#### GNU/Linux console version:
Run aqopa-console command: **aqopa-console**
Run **aqopa-console -h** to see all available options.
## INSTRUCTIONS FOR MICROSOFT WINDOWS
Tested on Windows 7.
### INSTALLATION
1. Download and install **Python 2.7** from [http://www.python.org/download/releases/2.7.6/](http://www.python.org/download/releases/2.7.6/) (Python will be installed into "C:\Python27" directory by default.)
2. Download and install **wxPython 2.8** from [http://www.wxpython.org/download.php#stable](http://www.wxpython.org/download.php#stable)
3. Download and run **pip-win 1.6** from [https://sites.google.com/site/pydatalog/python/pip-for-windows](https://sites.google.com/site/pydatalog/python/pip-for-windows) (Pip-win will install some Python packages after its first run.)
4. Install **PLY** using **pip-win**. Write **pip install PLY** in the text input and click Run.
5. Install **AQoPA** using **pip-win**. Write **pip install AQoPA** in the text input and click Run.
### RUN
#### Microsoft Windows GUI version:
1. Open directory "C:\Python27\Scripts" (assuming that Python has been installed in "C:\Python27").
2. Double click **aqopa-gui.exe**
#### Microsoft Windows console version:
1. Open command line (cmd).
2. Go to "C:\Python27\Scripts" (assuming that Python has been installed in "C:\Python27").
3. Run **aqopa-console.exe -h** to show the help of AQoPA console command.
| AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/README.md | README.md |
import sys
import wx
import wx.grid
import Utility
import Structs
import os
"""
@brief Security Mechanisms Evaluation Tool
@file SMETool.py
@author
@date
@date edited on 01-07-2014 by Katarzyna Mazur (visual improvements mainly)
"""
#MODEL LIBRARY
class ModelLibraryDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
# create txt ctrls
self.modelDescTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, "")
self.modelDescTxtCtrl.SetEditable(False)
self.modelDescTxtCtrl.AppendText("Model Description:")
self.tree_ctrl_1 = wx.TreeCtrl(self, 2, style=wx.TR_HAS_BUTTONS | wx.TR_DEFAULT_STYLE | wx.SUNKEN_BORDER)
self.loadModelBtn = wx.Button(self, 1, ("Load model"))
self.Bind(wx.EVT_BUTTON, self.onClickLoadModel, id=1)
root = self.tree_ctrl_1.AddRoot("Models:")
self.tree_ctrl_1.AppendItem(root, "TLS cryptographic protocol")
self.Bind(wx.EVT_TREE_ITEM_ACTIVATED, self.onClickModelTreeNode, id=2)
self.currentlySelected = ""
self.__set_properties()
self.__do_layout()
def onClickLoadModel(self, e):
if(self.currentlySelected=="TLS cryptographic protocol"):
smetool.OnLoadAll(None)
self.Close()
def onClickModelTreeNode(self, e):
self.currentlySelected = self.tree_ctrl_1.GetItemText(e.GetItem())
if(self.currentlySelected=="TLS cryptographic protocol"):
self.modelDescTxtCtrl.Clear()
self.modelDescTxtCtrl.AppendText("TLS cryptographic protocol description")
def __set_properties(self):
self.SetTitle(("Model's Library"))
self.SetClientSize(wx.Size(600, 400))
def __do_layout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
groupBoxAll = wx.StaticBox(self, label="Model's Library")
groupBoxAllSizer = wx.StaticBoxSizer(groupBoxAll, wx.HORIZONTAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.tree_ctrl_1, 1, wx.EXPAND, 5)
descBox = wx.StaticBox(self, label="Model's Description")
descBoxSizer = wx.StaticBoxSizer(descBox, wx.HORIZONTAL)
descBoxSizer.Add(self.modelDescTxtCtrl, 1, wx.EXPAND, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadModelBtn, 0, wx.EXPAND, 5)
groupBoxAllSizer.Add(sizer1, 1, wx.ALL | wx.EXPAND, 5)
groupBoxAllSizer.Add(descBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(groupBoxAllSizer, 1, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(buttonsSizer, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
################################################################## CATEGORY CLASSES ##################################################################
class AddCategoryDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
# create labels
self.label1 = wx.StaticText(self, wx.ID_ANY, ("Name:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Description:"))
# create txt ctrls
self.catTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, "", size=(200,-1))
self.catDescTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, "", size=(200,-1))
# create buttons
self.addBtn = wx.Button(self, label="Add")
self.cancelBtn = wx.Button(self, label="Cancel")
# do some buttons-bindings
self.addBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.cancelBtn.Bind(wx.EVT_BUTTON, self.onClickCancel)
self.__set_properties()
self.__do_layout()
def onClickAdd(self, e):
flag = True
category = Structs.Category(self.catTxtCtrl.GetValue(),self.catDescTxtCtrl.GetValue())
for cat_item in Structs.categoryList:
if(category.name==cat_item.name):
flag = False
if(flag):
Structs.categoryList.append(category)
else:
SMETool.ShowMessage(smetool, "Duplicate!")
return
SMETool.populatelctrl1(smetool, Structs.categoryList)
self.Close()
def onClickCancel(self, e):
self.Close()
def __set_properties(self):
self.SetTitle(("Add a Category"))
self.SetSize((365, 216))
def __do_layout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
groupBox = wx.StaticBox(self, label="Add the Category")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.catTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.catDescTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.cancelBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EditCategoryDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.index = smetool.categoriesListView.GetFocusedItem()
# create labels
self.label1 = wx.StaticText(self, wx.ID_ANY, ("Name:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Description:"))
#create text ctrls
self.catNameTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, Structs.categoryList[self.index].name, size=(200,-1))
self.catDescTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, Structs.categoryList[self.index].description, size=(200,-1))
# create buttons
self.applyBtn = wx.Button(self, label="Apply")
self.cancelBtn = wx.Button(self, label="Cancel")
# do some buttons bindings
self.applyBtn.Bind(wx.EVT_BUTTON, self.onClickApply)
self.cancelBtn.Bind(wx.EVT_BUTTON, self.onClickCancel)
self.__set_properties()
self.__do_layout()
def onClickApply(self, e):
flag = True
category = Structs.Category(self.catNameTxtCtrl.GetValue(),self.catDescTxtCtrl.GetValue())
for cat_item in Structs.categoryList:
if(category.name==cat_item.name):
flag = False
if(flag or Structs.categoryList[self.index].name==category.name):
i=-1
for fact in Structs.factList:
i=i+1
if(Structs.categoryList[self.index].name==fact.category):
Structs.factList[i].category = category.name
SMETool.onFactChange(smetool, Structs.factList[i].name+'('+Structs.categoryList[self.index].name+')', Structs.factList[i].name+'('+fact.category+')')
SMETool.populatelctrl2(smetool, Structs.factList)
del Structs.categoryList[self.index]
Structs.categoryList.append(category)
else:
SMETool.ShowMessage(smetool, "Duplicate!")
SMETool.populatelctrl1(smetool, Structs.categoryList)
self.Close()
def onClickCancel(self, e):
self.Close()
def __set_properties(self):
self.SetTitle(("Edit the Category"))
self.SetSize((365, 216))
def __do_layout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
groupBox = wx.StaticBox(self, label="Edit the Category")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.catNameTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.catDescTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.applyBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.cancelBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
################################################################## CATEGORY CLASSES ##################################################################
################################################################## FACT CLASSES ##################################################################
class AddFactDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
# create labels
self.label1 = wx.StaticText(self, wx.ID_ANY, ("Name:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Category:"))
self.label_3 = wx.StaticText(self, wx.ID_ANY, ("Description:"))
self.label_4 = wx.StaticText(self, wx.ID_ANY, ("Value:"))
# create txt ctrls
self.factNameTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, "", size=(200,-1))
self.factNameTxtCtrl.Disable()
self.factDescTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, "", size=(200,-1))
self.factValTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, "", size=(200,-1))
# create combo
self.factCatComboBox = wx.ComboBox(self, wx.ID_ANY, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
# create buttons
self.addBtn = wx.Button(self, label="Add")
self.cancelBtn = wx.Button(self, label="Cancel")
# do some buttons-bindings
self.addBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.cancelBtn.Bind(wx.EVT_BUTTON, self.onClickCancel)
self.__set_properties()
self.__do_layout()
def onClickAdd(self, e):
flag = True
if(self.factCatComboBox.GetValue()==''):
SMETool.ShowMessage(smetool, "Category can't be empty!")
return
f = 1
fact_name = 'f%d' % (f,)
if Structs.factList:
while(flag):
for item in Structs.factList:
if(item.category==self.factCatComboBox.GetValue()):
if(fact_name==item.name):
f=f+1
fact_name = 'f%d' % (f,)
flag = True
break
else:
flag = False
else:
flag = False
flag = True
fact = Structs.Fact(fact_name,self.factCatComboBox.GetValue(),self.factDescTxtCtrl.GetValue(),self.factValTxtCtrl.GetValue())
for fact_item in Structs.factList:
if(fact.name+fact.category==fact_item.name+fact_item.category):
flag = False
if(flag):
Structs.factList.append(fact)
else:
SMETool.ShowMessage(smetool, "Duplicate!")
return
SMETool.populatelctrl2(smetool, Structs.factList)
self.Close()
def onClickCancel(self, e):
self.Close()
def catlist(self):
somelist = []
[somelist.append(category.name) for category in Structs.categoryList]
return somelist
def __set_properties(self):
self.SetTitle(("Add a Fact"))
self.SetSize((365, 316))
def __do_layout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
groupBox = wx.StaticBox(self, label="Add a Fact")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factNameTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_3, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.factDescTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(self.label_4, 0, wx.ALIGN_LEFT, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer4.Add(self.factValTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.cancelBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer4, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EditFactDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.index = smetool.factsListView.GetFocusedItem()
# create labels
self.label1 = wx.StaticText(self, wx.ID_ANY, ("Name:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Category:"))
self.label_3 = wx.StaticText(self, wx.ID_ANY, ("Description:"))
self.label_4 = wx.StaticText(self, wx.ID_ANY, ("Value:"))
# create text ctrls
self.factNameTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, Structs.factList[self.index].name, size=(200,-1))
self.factDescTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, Structs.factList[self.index].description, size=(200,-1))
self.factValTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, Structs.factList[self.index].value, size=(200,-1))
# create combo
        self.factCatComboBox = wx.ComboBox(self, wx.ID_ANY, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200,-1))
self.factCatComboBox.SetValue(Structs.factList[self.index].category)
# create buttons
self.applyBtn = wx.Button(self, label="Apply")
self.cancelBtn = wx.Button(self, label="Cancel")
# do some buttons-bindings
self.applyBtn.Bind(wx.EVT_BUTTON, self.onClickApply)
self.cancelBtn.Bind(wx.EVT_BUTTON, self.onClickCancel)
self.__set_properties()
self.__do_layout()
    def onClickApply(self, e):
        flag = True
        fact = Structs.Fact(self.factNameTxtCtrl.GetValue(), self.factCatComboBox.GetValue(), self.factDescTxtCtrl.GetValue(), self.factValTxtCtrl.GetValue())
        original = Structs.factList[self.index]
        # the edited fact may keep its own name/category; only other facts count as duplicates
        if (fact.name, fact.category) != (original.name, original.category):
            for fact_item in Structs.factList:
                if (fact.name, fact.category) == (fact_item.name, fact_item.category):
                    flag = False
                    break
        if flag:
            SMETool.onFactChange(smetool, original.name+'('+original.category+')', fact.name+'('+fact.category+')')
            del Structs.factList[self.index]
            Structs.factList.append(fact)
        else:
            SMETool.ShowMessage(smetool, "Duplicate!")
        SMETool.populatelctrl2(smetool, Structs.factList)
        self.Close()
def onClickCancel(self, e):
self.Close()
    def catlist(self):
        return [category.name for category in Structs.categoryList]
def __set_properties(self):
self.SetTitle(("Edit the Fact"))
self.SetSize((365, 316))
def __do_layout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
groupBox = wx.StaticBox(self, label="Edit the Fact")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factNameTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_3, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.factDescTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(self.label_4, 0, wx.ALIGN_LEFT, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer4.Add(self.factValTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.applyBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.cancelBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer4, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class ViewFactsDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
# create list box
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=self.populateFactView())
self.__set_properties()
self.__do_layout()
def __set_properties(self):
self.SetSize((574, 826))
self.list_box_1.SetMinSize((550, 788))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="All Available Facts")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
groupBoxSizer.Add(self.list_box_1, 1, wx.EXPAND, 5)
mainSizer = wx.BoxSizer(wx.VERTICAL)
mainSizer.Add(groupBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
def populateFactView(self):
somelist = []
somelist.append("")
for category in Structs.categoryList:
somelist.append("Category name: "+category.name+" ("+category.description+")")
for fact in Structs.factList:
if(fact.category==category.name):
somelist.append(fact.name+"("+fact.category+") = "+fact.value)
somelist.append("")
return somelist
################################################################## FACT CLASSES ##################################################################
################################################################## SECURITY ATTRIBUTE CLASSES ##################################################################
class AddSADialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
#create labels
self.label1 = wx.StaticText(self, wx.ID_ANY, ("Name:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Description:"))
# create text ctrls
self.SAName = wx.TextCtrl(self, wx.ID_ANY, "", size=(200, -1))
self.SADesc = wx.TextCtrl(self, wx.ID_ANY, "", size=(200, -1))
#create buttons
self.addBtn = wx.Button(self, label="Add")
self.cancelBtn = wx.Button(self, label="Cancel")
# do some buttons-bindings
self.addBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.cancelBtn.Bind(wx.EVT_BUTTON, self.onClickCancel)
self.__set_properties()
self.__do_layout()
def onClickAdd(self, e):
flag = True
sa = Structs.SecurityAttribute(self.SAName.GetValue(),self.SADesc.GetValue())
        for sa_item in Structs.saList:
            if sa.name == sa_item.name:
                flag = False
                break
if(flag):
Structs.saList.append(sa)
else:
SMETool.ShowMessage(smetool, "Duplicate!")
return
SMETool.populatelctrl3(smetool, Structs.saList)
self.Close()
def onClickCancel(self, e):
self.Close()
def __set_properties(self):
self.SetTitle(("Add a Security Attribute"))
self.SetSize((365, 216))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Add a Security Attribute")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.SAName, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.SADesc, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.cancelBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EditSADialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.index = smetool.SAListView.GetFocusedItem()
# create labels
self.label1 = wx.StaticText(self, wx.ID_ANY, ("Name:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Description:"))
#create txt ctrls
self.SANameTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, Structs.saList[self.index].name, size=(200,-1))
self.SADescTxtCtrl = wx.TextCtrl(self, wx.ID_ANY, Structs.saList[self.index].description, size=(200,-1))
# create buttons
self.applyBtn = wx.Button(self, label="Apply")
self.cancelBtn = wx.Button(self, label="Cancel")
# do some buttons-bindings
self.applyBtn.Bind(wx.EVT_BUTTON, self.onClickApply)
self.cancelBtn.Bind(wx.EVT_BUTTON, self.onClickCancel)
self.__set_properties()
self.__do_layout()
    def onClickApply(self, e):
        flag = True
        sa = Structs.SecurityAttribute(self.SANameTxtCtrl.GetValue(), self.SADescTxtCtrl.GetValue())
        old_name = Structs.saList[self.index].name
        # the attribute may keep its own name; only other attributes count as duplicates
        if sa.name != old_name:
            for sa_item in Structs.saList:
                if sa.name == sa_item.name:
                    flag = False
                    break
        if flag:
            # propagate the rename to facts orders and evaluation rules that reference it,
            # using the index captured when the dialog was opened
            for item in Structs.foList:
                if item.security_attribute == old_name:
                    item.security_attribute = sa.name
            SMETool.populatelctrl6(smetool, Structs.foList)
            for item in Structs.erList:
                if item.security_attribute == old_name:
                    item.security_attribute = sa.name
            SMETool.populatelctrl10(smetool, Structs.erList)
            del Structs.saList[self.index]
            Structs.saList.append(sa)
        else:
            SMETool.ShowMessage(smetool, "Duplicate!")
        SMETool.populatelctrl3(smetool, Structs.saList)
        self.Close()
def onClickCancel(self, e):
self.Close()
def __set_properties(self):
self.SetTitle(("Edit the Security Attribute"))
self.SetSize((365, 216))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Edit the Security Attribute")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.SANameTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.SADescTxtCtrl, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.applyBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.cancelBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
################################################################## SECURITY ATTRIBUTE CLASSES ##################################################################
################################################################## RULE CLASSES ##################################################################
class AddRuleDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=[])
# create labels
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose fact category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose fact:"))
# create buttons
self.AND_Btn = wx.Button(self, label="AND")
self.OR_Btn = wx.Button(self, label="OR")
self.NEG_Btn = wx.Button(self, label="NEG")
self.IMPLY_Btn = wx.Button(self, label="IMPLY")
self.addRuleBtn = wx.Button(self, label="Add")
self.undoRuleBtn = wx.Button(self, label="Undo")
self.completeRuleBtn = wx.Button(self, label="Complete")
# do some buttons-bindings
self.AND_Btn.Bind(wx.EVT_BUTTON, self.onClickAND)
self.OR_Btn.Bind(wx.EVT_BUTTON, self.onClickOR)
self.NEG_Btn.Bind(wx.EVT_BUTTON, self.onClickNEG)
self.IMPLY_Btn.Bind(wx.EVT_BUTTON, self.onClickIMPLY)
self.addRuleBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.undoRuleBtn.Bind(wx.EVT_BUTTON, self.onClickUndo)
self.completeRuleBtn.Bind(wx.EVT_BUTTON, self.onClickComplete)
# create combo boxes
        self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
        self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
self.combo_box_2.Disable()
# do some combo-bindings
self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
self.__set_properties()
self.__do_layout()
    def catlist(self):
        return [category.name for category in Structs.categoryList]
    def factlist(self, cat):
        return [fact.value for fact in Structs.factList if fact.category == cat]
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickAND(self, e):
self.list_box_1.Append(Structs.AND)
def onClickOR(self, e):
self.list_box_1.Append(Structs.OR)
def onClickNEG(self, e):
self.list_box_1.Append(Structs.NEG)
def onClickIMPLY(self, e):
self.list_box_1.Append(Structs.IMPLY)
    def onClickAdd(self, e):
        fact_label = self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")"
        items = self.list_box_1.GetItems()
        if items and items[-1] == Structs.NEG:
            # merge a pending NEG token with the fact it negates
            self.list_box_1.Delete(len(items)-1)
            self.list_box_1.Append(Structs.NEG+fact_label)
        else:
            self.list_box_1.Append(fact_label)
    def onClickUndo(self, e):
        count = self.list_box_1.GetCount()
        if count:
            self.list_box_1.Delete(count-1)
def onClickComplete(self, e):
Structs.ruleList.append(Structs.Rule(self.list_box_1.GetItems()))
SMETool.populatelctrl4(smetool, Structs.ruleList)
self.Close()
def __set_properties(self):
self.SetTitle(("Create a Rule"))
self.SetSize((708, 500))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Create a Rule")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.AND_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(self.OR_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(self.NEG_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(self.IMPLY_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addRuleBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoRuleBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeRuleBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EditRuleDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=self.fillListBox())
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact Category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact:"))
        self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
        self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
        self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
self.combo_box_2.Disable()
# create buttons
self.IMPLY_Btn = wx.Button(self, label="IMPLY")
self.AND_Btn = wx.Button(self, label="AND")
self.OR_Btn = wx.Button(self, label="OR")
self.NEG_Btn = wx.Button(self, label="NEG")
self.addRuleBtn = wx.Button(self, label="Add")
self.undoRuleBtn = wx.Button(self, label="Undo")
self.completeRuleBtn = wx.Button(self, label="Complete")
# do some buttons-bindings
self.AND_Btn.Bind(wx.EVT_BUTTON, self.onClickAND)
self.OR_Btn.Bind(wx.EVT_BUTTON, self.onClickOR)
self.NEG_Btn.Bind(wx.EVT_BUTTON, self.onClickNEG)
self.IMPLY_Btn.Bind(wx.EVT_BUTTON, self.onClickIMPLY)
self.addRuleBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.undoRuleBtn.Bind(wx.EVT_BUTTON, self.onClickUndo)
self.completeRuleBtn.Bind(wx.EVT_BUTTON, self.onClickComplete)
self.__set_properties()
self.__do_layout()
def fillListBox(self):
somelist = []
for ele in Structs.ruleList[smetool.ruleListView.GetFocusedItem()].elements:
somelist.append(ele)
return somelist
def catlist(self):
somelist = []
for category in Structs.categoryList:
somelist.append(category.name)
return somelist
def factlist(self, cat):
somelist = []
for fact in Structs.factList:
if(fact.category==cat):
somelist.append(fact.value)
return somelist
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickAND(self, e):
self.list_box_1.Append(Structs.AND)
def onClickOR(self, e):
self.list_box_1.Append(Structs.OR)
def onClickNEG(self, e):
self.list_box_1.Append(Structs.NEG)
def onClickIMPLY(self, e):
self.list_box_1.Append(Structs.IMPLY)
def onClickAdd(self, e):
self.list_box_1.Append(self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")")
    def onClickUndo(self, e):
        count = self.list_box_1.GetCount()
        if count:
            self.list_box_1.Delete(count-1)
def onClickComplete(self, e):
del Structs.ruleList[smetool.ruleListView.GetFocusedItem()]
Structs.ruleList.append(Structs.Rule(self.list_box_1.GetItems()))
SMETool.populatelctrl4(smetool, Structs.ruleList)
self.Close()
def __set_properties(self):
self.SetTitle(("Edit the Rule"))
self.SetSize((708, 500))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Edit the Rule")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.AND_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(self.OR_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(self.NEG_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(self.IMPLY_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addRuleBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoRuleBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeRuleBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
################################################################## RULE CLASSES ##################################################################
#FACTS ORDER
class AddFODialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=[])
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact Category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact:"))
self.label_3 = wx.StaticText(self, wx.ID_ANY, ("Choose Security Attribute:"))
        self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN, size=(200, -1))
self.combo_box_2.Disable()
        self.combo_box_3 = wx.ComboBox(self, wx.ID_ANY, choices=self.securitylist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
# create buttons
self.addFactsOrderBtn = wx.Button(self, label="Add")
self.undoFactsOrderBtn = wx.Button(self, label="Undo")
self.completeFactsOrderBtn = wx.Button(self,label="Complete")
        # note: the AND_Btn/OR_Btn/NEG_Btn attribute names are reused here for the order operators
        self.AND_Btn = wx.Button(self, label="LESSER")
        self.OR_Btn = wx.Button(self, label="GREATER")
        self.NEG_Btn = wx.Button(self, label="EQUALS")
# do some buttons-bindings
self.addFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.undoFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickUndo)
self.completeFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickComplete)
self.AND_Btn.Bind(wx.EVT_BUTTON, self.onClickLESSER)
self.OR_Btn.Bind(wx.EVT_BUTTON, self.onClickGREATER)
self.NEG_Btn.Bind(wx.EVT_BUTTON, self.onClickEQUALS)
self.__set_properties()
self.__do_layout()
def catlist(self):
somelist = []
for category in Structs.categoryList:
somelist.append(category.name)
return somelist
def factlist(self, cat):
somelist = []
for fact in Structs.factList:
if(fact.category==cat):
somelist.append(fact.value)
return somelist
def securitylist(self):
somelist = []
for security in Structs.saList:
somelist.append(security.name)
return somelist
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickLESSER(self, e):
self.list_box_1.Append(Structs.LESSER)
def onClickGREATER(self, e):
self.list_box_1.Append(Structs.GREATER)
def onClickEQUALS(self, e):
self.list_box_1.Append(Structs.EQUALS)
def onClickAdd(self, e):
self.list_box_1.Append(self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")")
    def onClickUndo(self, e):
        count = self.list_box_1.GetCount()
        if count:
            self.list_box_1.Delete(count-1)
def onClickComplete(self, e):
Structs.foList.append(Structs.FactsOrder(self.list_box_1.GetItems(),self.combo_box_3.GetValue()))
SMETool.populatelctrl6(smetool, Structs.foList)
self.Close()
def __set_properties(self):
self.SetTitle(("Create a Facts Order"))
self.SetSize((708, 550))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Create a Facts Order")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_3, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.combo_box_3, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer4.Add(self.AND_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer4.Add(self.OR_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer4.Add(self.NEG_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addFactsOrderBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoFactsOrderBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeFactsOrderBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer4, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EditFODialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=self.fillListBox())
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact Category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact:"))
self.label_3 = wx.StaticText(self, wx.ID_ANY, ("Choose Security Attribute:"))
        self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
        self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
        self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
        self.combo_box_2.Disable()
        self.combo_box_3 = wx.ComboBox(self, wx.ID_ANY, choices=self.securitylist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200, -1))
self.combo_box_3.SetValue(Structs.foList[smetool.factsOrderListView.GetFocusedItem()].security_attribute)
# create buttons
self.addFactsOrderBtn = wx.Button(self, label="Add")
self.undoFactsOrderBtn = wx.Button(self, label="Undo")
self.completeFactsOrderBtn = wx.Button(self,label="Complete")
        # note: the AND_Btn/OR_Btn/NEG_Btn attribute names are reused here for the order operators
        self.AND_Btn = wx.Button(self, label="LESSER")
        self.OR_Btn = wx.Button(self, label="GREATER")
        self.NEG_Btn = wx.Button(self, label="EQUALS")
# do some buttons-bindings
self.addFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.undoFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickUndo)
self.completeFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickComplete)
self.AND_Btn.Bind(wx.EVT_BUTTON, self.onClickLESSER)
self.OR_Btn.Bind(wx.EVT_BUTTON, self.onClickGREATER)
self.NEG_Btn.Bind(wx.EVT_BUTTON, self.onClickEQUALS)
self.__set_properties()
self.__do_layout()
def fillListBox(self):
somelist = []
for ele in Structs.foList[smetool.factsOrderListView.GetFocusedItem()].elements:
somelist.append(ele)
return somelist
def catlist(self):
somelist = []
for category in Structs.categoryList:
somelist.append(category.name)
return somelist
def factlist(self, cat):
somelist = []
for fact in Structs.factList:
if(fact.category==cat):
somelist.append(fact.value)
return somelist
def securitylist(self):
somelist = []
for security in Structs.saList:
somelist.append(security.name)
return somelist
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickLESSER(self, e):
self.list_box_1.Append(Structs.LESSER)
def onClickGREATER(self, e):
self.list_box_1.Append(Structs.GREATER)
def onClickEQUALS(self, e):
self.list_box_1.Append(Structs.EQUALS)
def onClickAdd(self, e):
self.list_box_1.Append(self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")")
    def onClickUndo(self, e):
        count = self.list_box_1.GetCount()
        if count:
            self.list_box_1.Delete(count-1)
def onClickComplete(self, e):
del Structs.foList[smetool.factsOrderListView.GetFocusedItem()]
Structs.foList.append(Structs.FactsOrder(self.list_box_1.GetItems(),self.combo_box_3.GetValue()))
SMETool.populatelctrl6(smetool, Structs.foList)
self.Close()
def __set_properties(self):
self.SetTitle(("Edit the Facts Order"))
self.SetSize((708, 550))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
        groupBox = wx.StaticBox(self, label="Edit the Facts Order")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_3, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.combo_box_3, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer4.Add(self.AND_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer4.Add(self.OR_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer4.Add(self.NEG_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addFactsOrderBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoFactsOrderBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeFactsOrderBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer4, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
#EVALUATION RULES
class AddERDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=[])
# create labels
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact Category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact:"))
self.label_3 = wx.StaticText(self, wx.ID_ANY, ("Choose Security Attribute:"))
self.label_4 = wx.StaticText(self, wx.ID_ANY, ("Enter Influence Value:"))
# create buttons
        self.AND_Btn = wx.Button(self, label="AND")
        self.OR_Btn = wx.Button(self, label="OR")
        # note: the NEG_Btn attribute is reused here for the IMPLY operator
        self.NEG_Btn = wx.Button(self, label="IMPLY")
self.addBtn = wx.Button(self, label="Add")
self.undoBtn = wx.Button(self, label="Undo")
self.completeBtn = wx.Button(self, label="Complete")
# do some buttons-bindings
self.AND_Btn.Bind(wx.EVT_BUTTON, self.onClickAND)
self.OR_Btn.Bind(wx.EVT_BUTTON, self.onClickOR)
self.NEG_Btn.Bind(wx.EVT_BUTTON, self.onClickIMPLY)
self.addBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.undoBtn.Bind(wx.EVT_BUTTON, self.onClickUndo)
self.completeBtn.Bind(wx.EVT_BUTTON, self.onClickComplete)
# create combos
        self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200,-1))
        self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200,-1))
        self.combo_box_2.Disable()
        self.combo_box_3 = wx.ComboBox(self, wx.ID_ANY, choices=self.securitylist(), style=wx.CB_DROPDOWN|wx.CB_READONLY, size=(200,-1))
# do some combo bindings
self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
# create txt ctrls
self.textctrl_1 = wx.TextCtrl(self, wx.ID_ANY, size=(200,-1))
self.__set_properties()
self.__do_layout()
def catlist(self):
somelist = []
for category in Structs.categoryList:
somelist.append(category.name)
return somelist
def factlist(self, cat):
somelist = []
for fact in Structs.factList:
if(fact.category==cat):
somelist.append(fact.value)
return somelist
def securitylist(self):
somelist = []
for security in Structs.saList:
somelist.append(security.name)
return somelist
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickAND(self, e):
self.list_box_1.Append(Structs.AND)
def onClickOR(self, e):
self.list_box_1.Append(Structs.OR)
def onClickIMPLY(self, e):
self.list_box_1.Append(Structs.IMPLY)
def onClickAdd(self, e):
self.list_box_1.Append(self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")")
    def onClickUndo(self, e):
        # remove the last item, guarding against an empty list
        index = self.list_box_1.GetCount()
        if index > 0:
            self.list_box_1.Delete(index-1)
def onClickComplete(self, e):
if self.textctrl_1.GetValue()=='':
influence_value=0
else:
influence_value=self.textctrl_1.GetValue()
Structs.erList.append(Structs.EvaluationRule(self.list_box_1.GetItems(),influence_value,self.combo_box_3.GetValue()))
SMETool.populatelctrl10(smetool, Structs.erList)
self.Close()
def __set_properties(self):
self.SetTitle(("Create an Evaluation Rule"))
self.SetSize((708, 570))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Create an Evaluation Rule")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_3, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.combo_box_3, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(self.label_4, 0, wx.ALIGN_LEFT, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer4.Add(self.textctrl_1, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer5 = wx.BoxSizer(wx.HORIZONTAL)
sizer5.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer5.Add(self.AND_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer5.Add(self.OR_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer5.Add(self.NEG_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer5.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer4, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer5, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EditERDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=self.fillListBox())
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact Category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact:"))
self.label_3 = wx.StaticText(self, wx.ID_ANY, ("Choose Security Attribute:"))
self.label_4 = wx.StaticText(self, wx.ID_ANY, ("Enter Influence Value:"))
self.AND_Btn = wx.Button(self, 1, ("AND"))
self.Bind(wx.EVT_BUTTON, self.onClickAND, id=1)
self.OR_Btn = wx.Button(self, 2, ("OR"))
self.Bind(wx.EVT_BUTTON, self.onClickOR, id=2)
        self.NEG_Btn = wx.Button(self, 3, ("IMPLY"))  # note: named NEG_Btn but it inserts the IMPLY operator
self.Bind(wx.EVT_BUTTON, self.onClickIMPLY, id=3)
self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
self.combo_box_2.Disable()
self.combo_box_3 = wx.ComboBox(self, wx.ID_ANY, choices=self.securitylist(), style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
self.combo_box_3.SetValue(Structs.erList[smetool.evaluationRulesListView.GetFocusedItem()].security_attribute)
self.textctrl_1 = wx.TextCtrl(self, wx.ID_ANY, size=(200,-1))
        # str(): influence may have been stored as the int 0 when no value was entered
        self.textctrl_1.SetValue(str(Structs.erList[smetool.evaluationRulesListView.GetFocusedItem()].influence))
self.addERBtn = wx.Button(self, 7, ("Add"))
self.Bind(wx.EVT_BUTTON, self.onClickAdd, id=7)
self.undoERBtn = wx.Button(self, 8, ("Undo"))
self.Bind(wx.EVT_BUTTON, self.onClickUndo, id=8)
self.completeERBtn = wx.Button(self, 9, ("Complete"))
self.Bind(wx.EVT_BUTTON, self.onClickComplete, id=9)
self.__set_properties()
self.__do_layout()
def fillListBox(self):
somelist = []
for ele in Structs.erList[smetool.evaluationRulesListView.GetFocusedItem()].elements:
somelist.append(ele)
return somelist
def catlist(self):
somelist = []
for category in Structs.categoryList:
somelist.append(category.name)
return somelist
def factlist(self, cat):
somelist = []
for fact in Structs.factList:
if(fact.category==cat):
somelist.append(fact.value)
return somelist
def securitylist(self):
somelist = []
for security in Structs.saList:
somelist.append(security.name)
return somelist
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickAND(self, e):
self.list_box_1.Append(Structs.AND)
def onClickOR(self, e):
self.list_box_1.Append(Structs.OR)
def onClickIMPLY(self, e):
self.list_box_1.Append(Structs.IMPLY)
def onClickAdd(self, e):
self.list_box_1.Append(self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")")
    def onClickUndo(self, e):
        # remove the last item, guarding against an empty list
        index = self.list_box_1.GetCount()
        if index > 0:
            self.list_box_1.Delete(index-1)
def onClickComplete(self, e):
if self.textctrl_1.GetValue()=='':
influence_value=0
else:
influence_value=self.textctrl_1.GetValue()
del Structs.erList[smetool.evaluationRulesListView.GetFocusedItem()]
Structs.erList.append(Structs.EvaluationRule(self.list_box_1.GetItems(),influence_value,self.combo_box_3.GetValue()))
SMETool.populatelctrl10(smetool, Structs.erList)
self.Close()
def __set_properties(self):
self.SetTitle(("Edit the Evaluation Rule"))
self.SetSize((708, 570))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Edit the Evaluation Rule")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_3, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.combo_box_3, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(self.label_4, 0, wx.ALIGN_LEFT, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer4.Add(self.textctrl_1, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer5 = wx.BoxSizer(wx.HORIZONTAL)
sizer5.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer5.Add(self.AND_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer5.Add(self.OR_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer5.Add(self.NEG_Btn, 0, wx.ALIGN_CENTER_HORIZONTAL, 5)
sizer5.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addERBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoERBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeERBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer4, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer5, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
#CASE
class AddCaseDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=[])
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact Category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact:"))
self.label_4 = wx.StaticText(self, wx.ID_ANY, ("Enter Description:"))
self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
self.combo_box_2.Disable()
self.textctrl_1 = wx.TextCtrl(self, wx.ID_ANY, size=(200,-1))
# create buttons
self.addCaseBtn = wx.Button(self, label="Add")
self.undoCaseBtn = wx.Button(self, label="Undo")
self.completeCaseBtn = wx.Button(self, label="Complete")
# do some buttons-bindings
self.addCaseBtn.Bind(wx.EVT_BUTTON, self.onClickAdd)
self.undoCaseBtn.Bind(wx.EVT_BUTTON, self.onClickUndo)
self.completeCaseBtn.Bind(wx.EVT_BUTTON, self.onClickComplete)
self.__set_properties()
self.__do_layout()
def catlist(self):
somelist = []
for category in Structs.categoryList:
somelist.append(category.name)
return somelist
def factlist(self, cat):
somelist = []
for fact in Structs.factList:
if(fact.category==cat):
somelist.append(fact.value)
return somelist
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickAdd(self, e):
self.list_box_1.Append(self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")")
    def onClickUndo(self, e):
        # remove the last item, guarding against an empty list
        index = self.list_box_1.GetCount()
        if index > 0:
            self.list_box_1.Delete(index-1)
def onClickComplete(self, e):
Structs.caseList.append(Structs.Case(self.createCasename(),self.list_box_1.GetItems(),self.textctrl_1.GetValue()))
SMETool.populatelctrl11(smetool, Structs.caseList)
self.Close()
    def createCasename(self):
        # pick the lowest unused "Case N" name
        existing = set(case.casename for case in Structs.caseList)
        newi = 1
        while "Case "+str(newi) in existing:
            newi = newi+1
        return "Case "+str(newi)
def __set_properties(self):
self.SetTitle(("Create a Case"))
self.SetSize((708, 490))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Create a Case")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_4, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.textctrl_1, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addCaseBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoCaseBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeCaseBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EditCaseDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=self.fillListBox())
self.label_1 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact Category:"))
self.label_2 = wx.StaticText(self, wx.ID_ANY, ("Choose Fact:"))
self.label_4 = wx.StaticText(self, wx.ID_ANY, ("Enter Description:"))
self.factCatComboBox = wx.ComboBox(self, 10, choices=self.catlist(), style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
self.Bind(wx.EVT_COMBOBOX, self.OnSelect1, id=10)
self.combo_box_2 = wx.ComboBox(self, wx.ID_ANY, choices=[], style=wx.CB_DROPDOWN|wx.TE_READONLY, size=(200,-1))
self.combo_box_2.Disable()
self.textctrl_1 = wx.TextCtrl(self, wx.ID_ANY, size=(200,-1))
self.textctrl_1.SetValue(Structs.caseList[smetool.casesListView.GetFocusedItem()].description)
self.addCaseBtn = wx.Button(self, 7, ("Add"))
self.Bind(wx.EVT_BUTTON, self.onClickAdd, id=7)
self.undoCaseBtn = wx.Button(self, 8, ("Undo"))
self.Bind(wx.EVT_BUTTON, self.onClickUndo, id=8)
self.completeCaseBtn = wx.Button(self, 9, ("Complete"))
self.Bind(wx.EVT_BUTTON, self.onClickComplete, id=9)
self.__set_properties()
self.__do_layout()
def fillListBox(self):
somelist = []
for fact in Structs.caseList[smetool.casesListView.GetFocusedItem()].facts:
somelist.append(fact)
return somelist
def catlist(self):
somelist = []
for category in Structs.categoryList:
somelist.append(category.name)
return somelist
def factlist(self, cat):
somelist = []
for fact in Structs.factList:
if(fact.category==cat):
somelist.append(fact.value)
return somelist
def OnSelect1(self, e):
self.combo_box_2.Enable()
self.combo_box_2.SetItems(self.factlist(self.factCatComboBox.GetValue()))
def onClickAdd(self, e):
self.list_box_1.Append(self.combo_box_2.GetValue()+"("+self.factCatComboBox.GetValue()+")")
    def onClickUndo(self, e):
        # remove the last item, guarding against an empty list
        index = self.list_box_1.GetCount()
        if index > 0:
            self.list_box_1.Delete(index-1)
def onClickComplete(self, e):
casename = Structs.caseList[smetool.casesListView.GetFocusedItem()].casename
del Structs.caseList[smetool.casesListView.GetFocusedItem()]
Structs.caseList.append(Structs.Case(casename,self.list_box_1.GetItems(),self.textctrl_1.GetValue()))
SMETool.populatelctrl11(smetool, Structs.caseList)
self.Close()
def __set_properties(self):
self.SetTitle(("Edit the Case"))
self.SetSize((708, 490))
self.list_box_1.SetMinSize((574, 300))
def __do_layout(self):
groupBox = wx.StaticBox(self, label="Edit the Case")
groupBoxSizer = wx.StaticBoxSizer(groupBox, wx.VERTICAL)
mainSizer = wx.BoxSizer(wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(self.label_1, 0, wx.ALIGN_LEFT, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer1.Add(self.factCatComboBox, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(self.label_2, 0, wx.ALIGN_LEFT, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer2.Add(self.combo_box_2, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(self.label_4, 0, wx.ALIGN_LEFT, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
sizer3.Add(self.textctrl_1, 0, wx.EXPAND | wx.ALIGN_RIGHT, 5)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(self.addCaseBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.undoCaseBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
buttonsSizer.Add(self.completeCaseBtn, 0, wx.ALIGN_RIGHT | wx.ALIGN_BOTTOM, 5)
groupBoxSizer.Add(self.list_box_1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer1, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer2, 0, wx.EXPAND | wx.ALL, 5)
groupBoxSizer.Add(sizer3, 0, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
mainSizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EvaluateCaseDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=[])
self.list_box_1.Enable(False)
self.closeBtn = wx.Button(self, 9, ("Close"))
self.detailsBtn = wx.Button(self, 10, ("Details"))
self.grid_1 = wx.grid.Grid(self, wx.ID_ANY, size=(1, 1))
self.Bind(wx.EVT_BUTTON, self.onClickClose, id=9)
self.Bind(wx.EVT_BUTTON, self.onClickDetails, id=10)
self.resultList = []
self.factsetList = []
self.__set_properties()
self.__do_layout()
self.evaluate()
#Evaluation related
def getleftside(self, somelist):
leftside = []
for item in somelist:
if item==Structs.IMPLY:
break
else:
leftside.append(item)
return leftside
def getrightside(self, somelist):
flag = False
rightside = []
for item in somelist:
if flag:
rightside.append(item)
if item==Structs.IMPLY:
flag = True
return rightside
def analyse(self, startevalset):
previousevalset = []
additevalset = []
currevalset = []
for fact in startevalset:
for rule in Structs.ruleList:
if fact in rule.elements:
if fact in self.getleftside(rule.elements):
for item in self.getrightside(rule.elements):
if item!=Structs.IMPLY and item!=Structs.AND and item!=Structs.OR and item!=Structs.NEG and item not in additevalset:
additevalset.append(item)
currevalset.extend(additevalset)
while additevalset:
additevalset = []
for fact in currevalset:
for rule in Structs.ruleList:
if fact in rule.elements:
if fact in self.getleftside(rule.elements):
for item in self.getrightside(rule.elements):
if item!=Structs.IMPLY and item!=Structs.AND and item!=Structs.OR and item!=Structs.NEG and item not in additevalset:
additevalset.append(item)
for element in additevalset:
if element not in currevalset:
currevalset.append(element)
if set(previousevalset)==set(additevalset):
break
previousevalset = additevalset
        for fact in list(currevalset):  # iterate over a copy; removing while iterating skips elements
            if fact in startevalset:
                currevalset.remove(fact)
currevalset.extend(startevalset)
del startevalset[0:len(startevalset)]
del previousevalset[0:len(previousevalset)]
del additevalset[0:len(additevalset)]
return currevalset
def createsecattdict(self):
secattdict = {}
for secatt in Structs.saList:
secattdict.update({secatt.name:0})
return secattdict
def skipconflrules(self, currevalset):
evalrules = []
for rule in Structs.erList:
evalrules.append(rule)
currevalrules = evalrules
skippedcomprules = []
tempfactelements = []
compositerules = []
for evalrule in evalrules:
if len(evalrule.elements)>2:
tempfactelements = []
for element in evalrule.elements:
if(element!=Structs.AND and element!=Structs.OR and element!=Structs.IMPLY):
tempfactelements.append(element)
for fact in tempfactelements:
if fact not in currevalset:
if evalrule not in skippedcomprules:
skippedcomprules.append(evalrule)
        # iterate over a snapshot: currevalrules aliases evalrules, so removing
        # from it while looping over evalrules directly would skip elements
        for evalrule in list(evalrules):
            for skipped in skippedcomprules:
                if(evalrule==skipped):
                    currevalrules.remove(skipped)
for evalrule in evalrules:
if len(evalrule.elements)>2:
if evalrule not in compositerules:
compositerules.append(evalrule)
for evalrulecomp in compositerules:
            for evalrule in list(evalrules):  # snapshot: currevalrules aliases this list and is mutated below
for element in evalrulecomp.elements:
if(element!=Structs.AND and element!=Structs.OR and element!=Structs.IMPLY):
if element in evalrule.elements and len(evalrule.elements)<=2 and evalrule.security_attribute==evalrulecomp.security_attribute:
if evalrule in currevalrules:
currevalrules.remove(evalrule)
del skippedcomprules[0:len(skippedcomprules)]
del tempfactelements[0:len(tempfactelements)]
del compositerules[0:len(compositerules)]
return currevalrules
def evaluate(self):
startset = []
additionalfacts = []
deletedfacts = []
for item in Structs.caseList[smetool.casesListView.GetFocusedItem()].facts:
startset.append(item)
        declared = False  # must be initialised before the loop, or the first "if not declared" below raises NameError
        for fact in startset:
if fact.startswith(Structs.NEG):
continue
for evalrule in Structs.erList:
if fact in self.getleftside(evalrule.elements):
declared = True
if not declared:
for order in Structs.foList:
if fact in order.elements:
for element in order.elements:
if element!=Structs.GREATER and element!=Structs.LESSER and element!=Structs.EQUALS and element!=Structs.GREATER+Structs.EQUALS and element!=Structs.LESSER+Structs.EQUALS and element!=fact:
if element not in additionalfacts:
additionalfacts.append(element)
if element not in deletedfacts:
deletedfacts.append(fact)
break
declared = False
for fact in additionalfacts:
startset.append(fact)
for fact in deletedfacts:
if fact in startset:
startset.remove(fact)
currevalset = self.analyse(startset)
secattdict = self.createsecattdict()
currevalrules = self.skipconflrules(currevalset)
rulestoskip = []
declared = False
for fact in currevalset:
if fact.startswith(Structs.NEG):
continue
for evalrule in currevalrules:
if fact in self.getleftside(evalrule.elements):
for secatt in secattdict:
if secatt==evalrule.security_attribute:
if evalrule not in rulestoskip:
secattdict[secatt] = secattdict[secatt] + int(evalrule.influence)
if len(evalrule.elements)>2 :
if evalrule not in rulestoskip:
rulestoskip.append(evalrule)
        for fact in list(currevalset):  # iterate over a copy; removing while iterating skips elements
            if fact in Structs.caseList[smetool.casesListView.GetFocusedItem()].facts:
                currevalset.remove(fact)
self.resultList.append(Structs.caseList[smetool.casesListView.GetFocusedItem()].casename+":")
self.resultList.append("")
self.grid_1.SetRowLabelValue(0, Structs.caseList[smetool.casesListView.GetFocusedItem()].casename)
        col = 0  # renamed from "iter", which shadows the builtin
        for key, value in secattdict.items():
            self.resultList.append(key+" - "+str(value))
            self.grid_1.SetColLabelValue(col, key)
            self.grid_1.SetCellValue(0, col, str(value))
            col = col+1
self.resultList.append("")
self.factsetList.append("Set of facts for "+Structs.caseList[smetool.casesListView.GetFocusedItem()].casename+":")
self.factsetList.append("")
for fact in currevalset:
self.factsetList.append(fact)
del currevalset[0:len(currevalset)]
del currevalrules[0:len(currevalrules)]
del rulestoskip[0:len(rulestoskip)]
del additionalfacts[0:len(additionalfacts)]
del deletedfacts[0:len(deletedfacts)]
#End evaluation related
def onClickClose(self, e):
self.Close()
def onClickDetails(self, e):
self.list_box_1.Enable(True)
self.list_box_1.Clear()
self.list_box_1.AppendItems(self.resultList)
self.list_box_1.AppendItems(self.factsetList)
def __set_properties(self):
self.SetTitle(("Case QoP Evaluation"))
self.SetSize((800, 473))
self.grid_1.CreateGrid(1, len(Structs.saList))
for col in range(len(Structs.saList)):
self.grid_1.SetColSize(col, 100)
self.grid_1.SetMinSize((700, 300))
self.grid_1.EnableEditing(False)
self.list_box_1.SetMinSize((550, 100))
def __do_layout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
buttonsSizer.Add(self.detailsBtn, 0, wx.ALL | wx.EXPAND, 5)
buttonsSizer.Add(self.closeBtn, 0, wx.ALL | wx.EXPAND, 5)
sizer_1 = wx.BoxSizer(wx.HORIZONTAL)
sizer_1.Add(self.grid_1, 1, wx.EXPAND, 5)
sizer_2 = wx.BoxSizer(wx.HORIZONTAL)
sizer_2.Add(self.list_box_1, 1, wx.EXPAND, 5)
mainSizer.Add(sizer_1, 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(sizer_2, 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(buttonsSizer, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
class EvaluateAllCasesDialog(wx.Dialog):
def __init__(self, *args, **kwds):
kwds["style"] = wx.DEFAULT_DIALOG_STYLE
wx.Dialog.__init__(self, *args, **kwds)
self.list_box_1 = wx.ListBox(self, wx.ID_ANY, choices=[])
self.list_box_1.Enable(False)
self.closeBtn = wx.Button(self, 9, ("Close"))
self.detailsBtn = wx.Button(self, 10, ("Details"))
self.grid_1 = wx.grid.Grid(self, wx.ID_ANY, size=(1, 1))
self.Bind(wx.EVT_BUTTON, self.onClickClose, id=9)
self.Bind(wx.EVT_BUTTON, self.onClickDetails, id=10)
self.resultList = []
self.factsetList = []
self.minsecattdict = self.createminsecattdict()
self.maxsecattdict = self.createsecattdict()
self.treshsecattdict = self.createsecattdict()
self.__set_properties()
self.__do_layout()
self.evaluateall()
#Evaluation related
def getleftside(self, somelist):
leftside = []
for item in somelist:
if item==Structs.IMPLY:
break
else:
leftside.append(item)
return leftside
def getrightside(self, somelist):
flag = False
rightside = []
for item in somelist:
if flag:
rightside.append(item)
if item==Structs.IMPLY:
flag = True
return rightside
def analyse(self, startevalset):
previousevalset = []
additevalset = []
currevalset = []
for fact in startevalset:
for rule in Structs.ruleList:
if fact in rule.elements:
if fact in self.getleftside(rule.elements):
for item in self.getrightside(rule.elements):
if item!=Structs.IMPLY and item!=Structs.AND and item!=Structs.OR and item!=Structs.NEG and item not in additevalset:
additevalset.append(item)
currevalset.extend(additevalset)
while additevalset:
additevalset = []
for fact in currevalset:
for rule in Structs.ruleList:
if fact in rule.elements:
if fact in self.getleftside(rule.elements):
for item in self.getrightside(rule.elements):
if item!=Structs.IMPLY and item!=Structs.AND and item!=Structs.OR and item!=Structs.NEG and item not in additevalset:
additevalset.append(item)
for element in additevalset:
if element not in currevalset:
currevalset.append(element)
if set(previousevalset)==set(additevalset):
break
previousevalset = additevalset
        for fact in list(currevalset):  # iterate over a copy; removing while iterating skips elements
            if fact in startevalset:
                currevalset.remove(fact)
currevalset.extend(startevalset)
del startevalset[0:len(startevalset)]
del previousevalset[0:len(previousevalset)]
del additevalset[0:len(additevalset)]
return currevalset
def createsecattdict(self):
secattdict = {}
for secatt in Structs.saList:
secattdict.update({secatt.name:0})
return secattdict
def createminsecattdict(self):
secattdict = {}
for secatt in Structs.saList:
secattdict.update({secatt.name:100})
return secattdict
def skipconflrules(self, currevalset):
evalrules = []
for rule in Structs.erList:
evalrules.append(rule)
currevalrules = evalrules
skippedcomprules = []
tempfactelements = []
compositerules = []
for evalrule in evalrules:
if len(evalrule.elements)>2:
tempfactelements = []
for element in evalrule.elements:
if(element!=Structs.AND and element!=Structs.OR and element!=Structs.IMPLY):
tempfactelements.append(element)
for fact in tempfactelements:
if fact not in currevalset:
if evalrule not in skippedcomprules:
skippedcomprules.append(evalrule)
        # iterate over a snapshot: currevalrules aliases evalrules, so removing
        # from it while looping over evalrules directly would skip elements
        for evalrule in list(evalrules):
            for skipped in skippedcomprules:
                if(evalrule==skipped):
                    currevalrules.remove(skipped)
for evalrule in evalrules:
if len(evalrule.elements)>2:
if evalrule not in compositerules:
compositerules.append(evalrule)
for evalrulecomp in compositerules:
            for evalrule in list(evalrules):  # snapshot: currevalrules aliases this list and is mutated below
for element in evalrulecomp.elements:
if(element!=Structs.AND and element!=Structs.OR and element!=Structs.IMPLY):
if element in evalrule.elements and len(evalrule.elements)<=2 and evalrule.security_attribute==evalrulecomp.security_attribute:
if evalrule in currevalrules:
currevalrules.remove(evalrule)
del skippedcomprules[0:len(skippedcomprules)]
del tempfactelements[0:len(tempfactelements)]
del compositerules[0:len(compositerules)]
return currevalrules
def evaluate(self, iterator):
startset = []
additionalfacts = []
deletedfacts = []
for item in Structs.caseList[iterator].facts:
startset.append(item)
        declared = False  # must be initialised before the loop, or the first "if not declared" below raises NameError
        for fact in startset:
if fact.startswith(Structs.NEG):
continue
for evalrule in Structs.erList:
if fact in self.getleftside(evalrule.elements):
declared = True
if not declared:
for order in Structs.foList:
if fact in order.elements:
for element in order.elements:
if element!=Structs.GREATER and element!=Structs.LESSER and element!=Structs.EQUALS and element!=Structs.GREATER+Structs.EQUALS and element!=Structs.LESSER+Structs.EQUALS and element!=fact:
if element not in additionalfacts:
additionalfacts.append(element)
if element not in deletedfacts:
deletedfacts.append(fact)
break
declared = False
for fact in additionalfacts:
startset.append(fact)
for fact in deletedfacts:
if fact in startset:
startset.remove(fact)
currevalset = self.analyse(startset)
secattdict = self.createsecattdict()
currevalrules = self.skipconflrules(currevalset)
rulestoskip = []
declared = False
for fact in currevalset:
if fact.startswith(Structs.NEG):
continue
for evalrule in currevalrules:
if fact in self.getleftside(evalrule.elements):
for secatt in secattdict:
if secatt==evalrule.security_attribute:
if evalrule not in rulestoskip:
secattdict[secatt] = secattdict[secatt] + int(evalrule.influence)
if len(evalrule.elements)>2 :
if evalrule not in rulestoskip:
rulestoskip.append(evalrule)
        for fact in list(currevalset):  # iterate over a copy; removing while iterating skips elements
            if fact in Structs.caseList[iterator].facts:
                currevalset.remove(fact)
self.resultList.append(Structs.caseList[iterator].casename+":")
self.resultList.append("")
self.grid_1.SetRowLabelValue(iterator, Structs.caseList[iterator].casename)
        col = 0  # renamed from "iter", which shadows the builtin
        for key, value in secattdict.items():
            self.resultList.append(key+" - "+str(value))
            self.grid_1.SetColLabelValue(col, key)
            self.grid_1.SetCellValue(iterator, col, str(value))
            col = col+1
            if self.minsecattdict[key]>=value and value!=0:
                self.minsecattdict[key] = value
            if self.maxsecattdict[key]<=value:
                self.maxsecattdict[key] = value
self.resultList.append("")
self.factsetList.append("Set of facts for "+Structs.caseList[iterator].casename+":")
self.factsetList.append("")
for fact in currevalset:
self.factsetList.append(fact)
self.factsetList.append("")
del currevalset[0:len(currevalset)]
del currevalrules[0:len(currevalrules)]
del rulestoskip[0:len(rulestoskip)]
del additionalfacts[0:len(additionalfacts)]
del deletedfacts[0:len(deletedfacts)]
    def evaluateall(self):
        val = ""
        for ci in range(len(Structs.caseList)):
            self.evaluate(ci)
#self.list_box_1.AppendItems(self.resultList)
#self.list_box_1.AppendItems(self.factsetList)
for key, value in self.treshsecattdict.items():
self.treshsecattdict[key] = float(self.maxsecattdict[key]-self.minsecattdict[key])/5
for row in range(len(Structs.caseList)):
#print "\nCASE:"+str(row+1)+"\n"
for col in range(len(Structs.saList)):
newmin = self.minsecattdict[self.grid_1.GetColLabelValue(col)]
#print "starting newmin:" +str(newmin) +" sec_att: "+self.grid_1.GetColLabelValue(col)
#print "value:" + str(self.grid_1.GetCellValue(row, col))
for i in range(6):
if float(self.grid_1.GetCellValue(row, col))==0:
val = "no"
break
elif float(self.grid_1.GetCellValue(row, col))>newmin and i<5 or i==0:
newmin = newmin + self.treshsecattdict[self.grid_1.GetColLabelValue(col)]
#print "Newmin: "+str(newmin)
else:
if i==1:
val = "very low"
if i==2:
val = "low"
if i==3:
val = "medium"
if i==4:
val = "high"
if i==5:
val = "very high"
break
#print "Cell: "+str(float(self.grid_1.GetCellValue(row, col)))
#print "i "+str(i)
self.grid_1.SetCellValue(row, col, self.grid_1.GetCellValue(row, col)+" ("+val+")")
#print self.grid_1.GetCellValue(row, col)
#print self.treshsecattdict
#print self.minsecattdict
#print self.maxsecattdict
        # editing is already disabled via EnableEditing(False) in __set_properties;
        # the former SetReadOnly call here used out-of-range indices (counts, not valid row/col indices)
#End evaluation related
def onClickClose(self, e):
self.Close()
def onClickDetails(self, e):
self.list_box_1.Enable(True)
self.list_box_1.Clear()
self.list_box_1.AppendItems(self.resultList)
self.list_box_1.AppendItems(self.factsetList)
def __set_properties(self):
self.SetTitle(("All Cases QoP Evaluation"))
self.SetSize((800, 473))
self.grid_1.CreateGrid(len(Structs.caseList), len(Structs.saList))
for col in range(len(Structs.saList)):
self.grid_1.SetColSize(col, 100)
self.grid_1.SetMinSize((700, 300))
self.grid_1.EnableEditing(False)
self.list_box_1.SetMinSize((550, 100))
def __do_layout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
buttonsSizer.Add(self.detailsBtn, 0, wx.ALL | wx.EXPAND, 5)
buttonsSizer.Add(self.closeBtn, 0, wx.ALL | wx.EXPAND, 5)
sizer_1 = wx.BoxSizer(wx.HORIZONTAL)
sizer_1.Add(self.grid_1, 1, wx.EXPAND, 5)
sizer_2 = wx.BoxSizer(wx.HORIZONTAL)
sizer_2.Add(self.list_box_1, 1, wx.EXPAND, 5)
mainSizer.Add(sizer_1, 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(sizer_2, 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(buttonsSizer, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(mainSizer)
self.CentreOnParent()
self.Layout()
#MAINFRAME
class SMETool(wx.Frame):
def __init__(self, *args, **kwds):
#kwds["style"] = wx.CAPTION | wx.CLOSE_BOX | wx.MINIMIZE_BOX | wx.MAXIMIZE | wx.MAXIMIZE_BOX | wx.SYSTEM_MENU | wx.RESIZE_BORDER | wx.CLIP_CHILDREN
wx.Frame.__init__(self, *args, **kwds)
#MENUBAR
self.sme_menubar = wx.MenuBar()
wxglade_tmp_menu = wx.Menu()
wxglade_tmp_menu2 = wx.Menu()
# about menu item
itemAbout = wx.MenuItem(wxglade_tmp_menu, wx.NewId(), u"&About SMETool\tCTRL+I", "", wx.ITEM_NORMAL)
self.Bind(wx.EVT_MENU, self.OnAboutBox, itemAbout)
itemAbout.SetBitmap(wx.Bitmap(self.CreatePath4Icons('about.png')))
wxglade_tmp_menu.AppendItem(itemAbout)
# separator
wxglade_tmp_menu.AppendSeparator()
# save all menu item
saveAllItem = wx.MenuItem(wxglade_tmp_menu, wx.NewId(), u"&Save All\tCTRL+S", "", wx.ITEM_NORMAL)
saveAllItem.Enable(False)
self.Bind(wx.EVT_MENU, self.OnSaveAll, saveAllItem)
wxglade_tmp_menu.AppendItem(saveAllItem)
# separator
wxglade_tmp_menu.AppendSeparator()
# exit item
exitItem = wx.MenuItem(wxglade_tmp_menu, wx.NewId(), u"&Quit\tCTRL+Q", "", wx.ITEM_NORMAL)
exitItem.SetBitmap(wx.Bitmap(self.CreatePath4Icons('exit.png')))
self.Bind(wx.EVT_MENU, self.OnExit, exitItem)
wxglade_tmp_menu.AppendItem(exitItem)
item = wx.MenuItem(wxglade_tmp_menu2,9999, u"&Browse Models\tCTRL+M", "", wx.ITEM_NORMAL)
item.SetBitmap(wx.Bitmap(self.CreatePath4Icons('lib.png')))
self.Bind(wx.EVT_MENU, self.OnModelLibrary, id=9999)
item.Enable(True)
wxglade_tmp_menu2.AppendItem(item)
self.sme_menubar.Append(wxglade_tmp_menu, ("Menu"))
self.sme_menubar.Append(wxglade_tmp_menu2, ("Library"))
self.SetMenuBar(self.sme_menubar)
self.losma = wx.Bitmap(self.CreatePath4Icons('logo_SMETool_small.png'), wx.BITMAP_TYPE_PNG)
self.SetIcon(wx.Icon(self.CreatePath4Icons('appicon.png'), wx.BITMAP_TYPE_PNG))
self.notebook_1 = wx.Notebook(self, wx.ID_ANY, style=0)
        # pretty, tiny icons for the notebook tabs
il = wx.ImageList(20, 20)
self.categoriesTabImg = il.Add(wx.Bitmap(self.CreatePath4Icons('categories.png'), wx.BITMAP_TYPE_PNG))
self.factsTabImg = il.Add(wx.Bitmap(self.CreatePath4Icons('facts.png'), wx.BITMAP_TYPE_PNG))
self.secAttrTabImg = il.Add(wx.Bitmap(self.CreatePath4Icons('sec_attr.png'), wx.BITMAP_TYPE_PNG))
self.rulesTabImg = il.Add(wx.Bitmap(self.CreatePath4Icons('rules.png'), wx.BITMAP_TYPE_PNG))
self.factsOrderTabImg = il.Add(wx.Bitmap(self.CreatePath4Icons('facts_order.png'), wx.BITMAP_TYPE_PNG))
self.evaluationRulesTabImg = il.Add(wx.Bitmap(self.CreatePath4Icons('ev_rules.png'), wx.BITMAP_TYPE_PNG))
self.casesTabImg = il.Add(wx.Bitmap(self.CreatePath4Icons('cases.png'), wx.BITMAP_TYPE_PNG))
self.notebook_1.AssignImageList(il)
# -------------------------- TAB1 --------------------------
# main panel for the first tab
self.categoriesPanel = wx.Panel(self.notebook_1, wx.ID_ANY)
# smetool logo
self.logoSmall1 = wx.StaticBitmap(self.categoriesPanel, -1, self.losma)
# create buttons, give them actual, meaningful names
self.loadCategoriesBtn = wx.Button(self.categoriesPanel, label="Load")
self.saveCategoriesBtn = wx.Button(self.categoriesPanel, label="Save")
self.addCategoriesBtn = wx.Button(self.categoriesPanel, label="Add")
self.deleteCategoriesBtn = wx.Button(self.categoriesPanel, label="Delete")
self.editCategoriesBtn = wx.Button(self.categoriesPanel, label="Edit")
# do some buttons-bindings
self.loadCategoriesBtn.Bind(wx.EVT_BUTTON, self.onClickLoadCategories)
self.saveCategoriesBtn.Bind(wx.EVT_BUTTON, self.onClickSaveCategories)
self.addCategoriesBtn.Bind(wx.EVT_BUTTON, self.onClickAddCategory)
self.deleteCategoriesBtn.Bind(wx.EVT_BUTTON, self.onClickDeleteCategory)
self.editCategoriesBtn.Bind(wx.EVT_BUTTON, self.onClickEditCategory)
# create list view
self.categoriesListView = wx.ListView(self.categoriesPanel, wx.ID_ANY, style=wx.LC_REPORT | wx.SUNKEN_BORDER)
self.categoriesListView.InsertColumn(0, 'Category')
self.categoriesListView.SetColumnWidth(0, 200)
self.categoriesListView.InsertColumn(1, 'Description')
self.categoriesListView.SetColumnWidth(1, 500)
self.populatelctrl1(Structs.categoryList)
# -------------------------- TAB2 --------------------------
#main panel for the second tab
self.factsPanel = wx.Panel(self.notebook_1, wx.ID_ANY)
# smetool logo
self.logoSmall2 = wx.StaticBitmap(self.factsPanel, -1, self.losma)
# create buttons, give them actual, meaningful names
self.loadFactsBtn = wx.Button(self.factsPanel, label="Load")
self.saveFactsBtn = wx.Button(self.factsPanel, label="Save")
self.addFactsBtn = wx.Button(self.factsPanel, label="Add")
self.deleteFactBtn = wx.Button(self.factsPanel, label="Delete")
self.editFactBtn = wx.Button(self.factsPanel, label="Edit")
self.viewFactsBtn = wx.Button(self.factsPanel, label="View")
# do some buttons-bindings
self.loadFactsBtn.Bind(wx.EVT_BUTTON, self.onClickLoadFacts)
self.saveFactsBtn.Bind(wx.EVT_BUTTON, self.onClickSaveFacts)
self.addFactsBtn.Bind(wx.EVT_BUTTON, self.onClickAddFact)
self.deleteFactBtn.Bind(wx.EVT_BUTTON, self.onClickDeleteFact)
self.editFactBtn.Bind(wx.EVT_BUTTON, self.onClickEditFact)
self.viewFactsBtn.Bind(wx.EVT_BUTTON, self.onClickViewFact)
# create list view
self.factsListView = wx.ListView(self.factsPanel, wx.ID_ANY, style=wx.LC_REPORT | wx.SUNKEN_BORDER)
self.factsListView.InsertColumn(0, 'Name')
self.factsListView.SetColumnWidth(0, 100)
self.factsListView.InsertColumn(1, 'Category')
self.factsListView.SetColumnWidth(1, 200)
self.factsListView.InsertColumn(2, 'Description')
self.factsListView.SetColumnWidth(2, 200)
self.factsListView.InsertColumn(3, 'Value')
self.factsListView.SetColumnWidth(3, 200)
self.populatelctrl2(Structs.factList)
# -------------------------- TAB3 --------------------------
#main panel for the third tab
self.SAPanel = wx.Panel(self.notebook_1, wx.ID_ANY)
# smetool logo
self.logoSmall3 = wx.StaticBitmap(self.SAPanel, -1, self.losma)
# create buttons, give them actual, meaningful names
self.loadSABtn = wx.Button(self.SAPanel, label="Load")
self.saveSABtn = wx.Button(self.SAPanel, label="Save")
self.addSABtn = wx.Button(self.SAPanel, label="Add")
self.deleteSABtn = wx.Button(self.SAPanel, label="Delete")
self.editSABtn = wx.Button(self.SAPanel, label="Edit")
# do some buttons-bindings
self.loadSABtn.Bind(wx.EVT_BUTTON, self.onClickLoadSA)
self.saveSABtn.Bind(wx.EVT_BUTTON, self.onClickSaveSA)
self.addSABtn.Bind(wx.EVT_BUTTON, self.onClickAddSA)
self.deleteSABtn.Bind(wx.EVT_BUTTON, self.onClickDeleteSA)
self.editSABtn.Bind(wx.EVT_BUTTON, self.onClickEditSA)
# create list view
self.SAListView = wx.ListView(self.SAPanel, wx.ID_ANY, style=wx.LC_REPORT | wx.SUNKEN_BORDER)
self.SAListView.InsertColumn(0, 'Security Attribute')
self.SAListView.SetColumnWidth(0, 200)
self.SAListView.InsertColumn(1, 'Description')
self.SAListView.SetColumnWidth(1, 500)
self.populatelctrl3(Structs.saList)
# -------------------------- TAB4 --------------------------
# main panel for the fourth tab
self.rulePanel = wx.Panel(self.notebook_1, wx.ID_ANY)
# smetool logo
self.logoSmall4 = wx.StaticBitmap(self.rulePanel, -1, self.losma)
# create buttons, give them actual, meaningful names
self.loadRuleBtn = wx.Button(self.rulePanel, label="Load")
self.saveRuleBtn = wx.Button(self.rulePanel, label="Save")
self.addRuleBtn = wx.Button(self.rulePanel, label="Add")
self.deleteRuleBtn = wx.Button(self.rulePanel, label="Delete")
self.editRuleBtn = wx.Button(self.rulePanel, label="Edit")
# do some buttons-bindings
self.loadRuleBtn.Bind(wx.EVT_BUTTON, self.onClickLoadRules)
self.saveRuleBtn.Bind(wx.EVT_BUTTON, self.onClickSaveRules)
self.addRuleBtn.Bind(wx.EVT_BUTTON, self.onClickAddRule)
self.deleteRuleBtn.Bind(wx.EVT_BUTTON, self.onClickDeleteRule)
self.editRuleBtn.Bind(wx.EVT_BUTTON, self.onClickEditRule)
# create list view
self.ruleListView = wx.ListView(self.rulePanel, wx.ID_ANY, style=wx.LC_REPORT | wx.SUNKEN_BORDER)
self.ruleListView.InsertColumn(0, 'Rule')
self.ruleListView.SetColumnWidth(0, 1000)
self.populatelctrl4(Structs.ruleList)
# -------------------------- TAB5 --------------------------
#main panel for the fifth tab
self.factsOrderPanel = wx.Panel(self.notebook_1, wx.ID_ANY)
# smetool logo
self.logoSmall5 = wx.StaticBitmap(self.factsOrderPanel, -1, self.losma)
# create buttons, give them actual, meaningful names
self.loadFactsOrderBtn = wx.Button(self.factsOrderPanel, label="Load")
self.saveFactsOrderBtn = wx.Button(self.factsOrderPanel, label="Save")
self.addFactsOrderBtn = wx.Button(self.factsOrderPanel, label="Add")
self.deleteFactsOrderBtn = wx.Button(self.factsOrderPanel, label="Delete")
self.editFactsOrderBtn = wx.Button(self.factsOrderPanel, label="Edit")
# do some buttons-bindings
self.loadFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickLoadFOs)
self.saveFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickSaveFOs)
self.addFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickAddFO)
self.deleteFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickDeleteFO)
self.editFactsOrderBtn.Bind(wx.EVT_BUTTON, self.onClickEditFO)
# create list view
self.factsOrderListView = wx.ListView(self.factsOrderPanel, wx.ID_ANY, style=wx.LC_REPORT | wx.SUNKEN_BORDER)
self.factsOrderListView.InsertColumn(0, 'Facts Order')
self.factsOrderListView.SetColumnWidth(0, 500)
self.factsOrderListView.InsertColumn(1, 'Security Attribute')
self.factsOrderListView.SetColumnWidth(1, 500)
self.populatelctrl6(Structs.foList)
# -------------------------- TAB6 --------------------------
#main panel for the sixth tab
self.evaluationRulesPanel = wx.Panel(self.notebook_1, wx.ID_ANY)
# smetool logo
self.logoSmall6 = wx.StaticBitmap(self.evaluationRulesPanel, -1, self.losma)
# create buttons, give them actual, meaningful names
self.loadEvaluationRulesBtn = wx.Button(self.evaluationRulesPanel, label="Load")
self.saveEvaluationRulesBtn = wx.Button(self.evaluationRulesPanel, label="Save")
self.addEvaluationRulesBtn = wx.Button(self.evaluationRulesPanel, label="Add")
self.deleteEvaluationRuleBtn = wx.Button(self.evaluationRulesPanel, label="Delete")
self.editEvaluationRuleBtn = wx.Button(self.evaluationRulesPanel, label="Edit")
# do some buttons-bindings
self.loadEvaluationRulesBtn.Bind(wx.EVT_BUTTON, self.onClickLoadERs)
self.saveEvaluationRulesBtn.Bind(wx.EVT_BUTTON, self.onClickSaveERs)
self.addEvaluationRulesBtn.Bind(wx.EVT_BUTTON, self.onClickAddER)
self.deleteEvaluationRuleBtn.Bind(wx.EVT_BUTTON, self.onClickDeleteER)
self.editEvaluationRuleBtn.Bind(wx.EVT_BUTTON, self.onClickEditER)
# create list view
self.evaluationRulesListView = wx.ListView(self.evaluationRulesPanel, wx.ID_ANY, style=wx.LC_REPORT | wx.SUNKEN_BORDER)
self.evaluationRulesListView.InsertColumn(0, 'Evaluation rule')
self.evaluationRulesListView.SetColumnWidth(0, 300)
self.evaluationRulesListView.InsertColumn(1, 'Influence Value')
self.evaluationRulesListView.SetColumnWidth(1, 100)
self.evaluationRulesListView.InsertColumn(2, 'Security Attribute')
self.evaluationRulesListView.SetColumnWidth(2, 500)
self.populatelctrl10(Structs.erList)
# -------------------------- TAB7 --------------------------
#main panel for the seventh tab
self.casesPanel = wx.Panel(self.notebook_1, wx.ID_ANY)
# smetool logo
self.logoSmall7 = wx.StaticBitmap(self.casesPanel, -1, self.losma)
# create buttons, give them actual, meaningful names
self.loadCasesBtn = wx.Button(self.casesPanel, label="Load")
self.saveCasesBtn = wx.Button(self.casesPanel, label="Save")
self.addCasesBtn = wx.Button(self.casesPanel, label="Add")
self.deleteCasesBtn = wx.Button(self.casesPanel, label="Delete")
self.editCasesBtn = wx.Button(self.casesPanel, label="Edit")
self.evaluateCasesBtn = wx.Button(self.casesPanel, label="Evaluate")
self.evaluateAllCasesBtn = wx.Button(self.casesPanel, label="Evaluate All")
# do some buttons-bindings
self.loadCasesBtn.Bind(wx.EVT_BUTTON, self.onClickLoadCases)
self.saveCasesBtn.Bind(wx.EVT_BUTTON, self.onClickSaveCases)
self.addCasesBtn.Bind(wx.EVT_BUTTON, self.onClickAddCase)
self.deleteCasesBtn.Bind(wx.EVT_BUTTON, self.onClickDeleteCase)
self.editCasesBtn.Bind(wx.EVT_BUTTON, self.onClickEditCase)
self.evaluateCasesBtn.Bind(wx.EVT_BUTTON, self.onClickEvaluateCase)
self.evaluateAllCasesBtn.Bind(wx.EVT_BUTTON, self.onClickEvaluateAllCases)
# create list view
self.casesListView = wx.ListView(self.casesPanel, wx.ID_ANY, style=wx.LC_REPORT | wx.SUNKEN_BORDER)
self.casesListView.InsertColumn(0, 'Case Number')
self.casesListView.SetColumnWidth(0, 200)
self.casesListView.InsertColumn(1, 'Facts')
self.casesListView.SetColumnWidth(1, 500)
self.casesListView.InsertColumn(2, 'Description')
self.casesListView.SetColumnWidth(2, 500)
self.populatelctrl11(Structs.caseList)
self.__set_properties()
self.__do_layout()
def CreatePath4Icons(self, resourceName):
        return os.path.join(os.path.dirname(os.path.abspath(__file__)), "img", resourceName)
def ShowMessage(self, message):
wx.MessageBox(message, 'Dialog', wx.OK | wx.ICON_INFORMATION)
def OnAboutBox(self, e):
info = wx.AboutDialogInfo()
info.SetIcon(wx.Icon(self.CreatePath4Icons("logo_SMETool_small.png"), wx.BITMAP_TYPE_PNG))
info.SetName("Security Mechanisms Evaluation Tool")
info.SetVersion('(v 0.9.3)')
info.SetDescription("Michail Mokkas, [email protected]")
info.SetCopyright('(C) 2014')
wx.AboutBox(info)
def OnModelLibrary(self, e):
dia = ModelLibraryDialog(self, -1, 'Model Library')
dia.ShowModal()
dia.Destroy()
def OnLoadAll(self, e):
dir = os.path.dirname(os.path.abspath(__file__))
Utility.loadCategories(dir+"/files/categoryList.pickle")
Utility.loadFacts(dir+"/files/factList.pickle")
Utility.loadSA(dir+"/files/saList.pickle")
Utility.loadRules(dir+"/files/ruleList.pickle")
Utility.loadFOs(dir+"/files/foList.pickle")
Utility.loadERs(dir+"/files/erList.pickle")
Utility.loadCases(dir+"/files/caseList.pickle")
self.populatelctrl1(Structs.categoryList)
self.populatelctrl2(Structs.factList)
self.populatelctrl3(Structs.saList)
self.populatelctrl4(Structs.ruleList)
self.populatelctrl6(Structs.foList)
self.populatelctrl10(Structs.erList)
self.populatelctrl11(Structs.caseList)
    def OnSaveAll(self, e):
        saveFileDialog = wx.FileDialog(self, "Choose A File To Save Model To:", "", "",
                                       "SME Tool Files (*.sme)|*.sme",
                                       wx.FD_SAVE | wx.FD_OVERWRITE_PROMPT)
        if saveFileDialog.ShowModal() == wx.ID_OK:
            path = saveFileDialog.GetPath()
            Utility.saveCategories(path+"_category.sme")
            Utility.saveFacts(path+"_fact.sme")
            Utility.saveSA(path+"_sa.sme")
            Utility.saveRules(path+"_rule.sme")
            Utility.saveFOs(path+"_fo.sme")
            Utility.saveERs(path+"_er.sme")
            Utility.saveCases(path+"_case.sme")
        saveFileDialog.Destroy()
        self.populatelctrl1(Structs.categoryList)
        self.populatelctrl2(Structs.factList)
        self.populatelctrl3(Structs.saList)
        self.populatelctrl4(Structs.ruleList)
        self.populatelctrl6(Structs.foList)
        self.populatelctrl10(Structs.erList)
        self.populatelctrl11(Structs.caseList)
def OnExit(self, e):
self.Close()
#Category
    def onClickLoadCategories(self, e):
        dlg = wx.FileDialog(self, "Choose a file to load categories from:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.loadCategories(dlg.GetPath())
            self.populatelctrl1(Structs.categoryList)
        dlg.Destroy()
    def onClickSaveCategories(self, e):
        dlg = wx.FileDialog(self, "Choose a file to save categories to:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.saveCategories(dlg.GetPath())
        dlg.Destroy()
def onClickAddCategory(self, e):
dia = AddCategoryDialog(self, -1, 'Add a category')
dia.ShowModal()
dia.Destroy()
    def onClickDeleteCategory(self, e):
        catname = Structs.categoryList[self.categoriesListView.GetFocusedItem()].name
        # iterate over a copy so facts can be removed safely while looping
        for item in list(Structs.factList):
            if item.category == catname:
                self.onFactDelete(item.name+'('+item.category+')')
                Structs.factList.remove(item)
        self.populatelctrl2(Structs.factList)
        del Structs.categoryList[self.categoriesListView.GetFocusedItem()]
        self.populatelctrl1(Structs.categoryList)
def onClickEditCategory(self, e):
dia = EditCategoryDialog(self, -1, 'Edit the Category')
dia.ShowModal()
dia.Destroy()
def populatelctrl1(self, thelist):
self.categoriesListView.DeleteAllItems()
for item in thelist:
self.categoriesListView.Append((item.name,item.description))
#Fact
    def onClickLoadFacts(self, e):
        dlg = wx.FileDialog(self, "Choose a file to load facts from:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.loadFacts(dlg.GetPath())
            self.populatelctrl2(Structs.factList)
        dlg.Destroy()
    def onClickSaveFacts(self, e):
        dlg = wx.FileDialog(self, "Choose a file to save facts to:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.saveFacts(dlg.GetPath())
        dlg.Destroy()
def onClickAddFact(self, e):
dia = AddFactDialog(self, -1, 'Add a fact')
dia.ShowModal()
dia.Destroy()
    def onClickDeleteFact(self, e):
        idx = self.factsListView.GetFocusedItem()
        self.onFactDelete(Structs.factList[idx].name+'('+Structs.factList[idx].category+')')
        del Structs.factList[idx]
        self.populatelctrl2(Structs.factList)
def onClickEditFact(self, e):
dia = EditFactDialog(self, -1, 'Edit the Fact')
dia.ShowModal()
dia.Destroy()
def onClickViewFact(self, e):
dia = ViewFactsDialog(self, -1, 'View Facts')
dia.ShowModal()
dia.Destroy()
def populatelctrl2(self, thelist):
self.factsListView.DeleteAllItems()
for item in thelist:
self.factsListView.Append((item.name,item.category,item.description,item.value))
    def onFactDelete(self, catfact):
        # drop every rule, facts order, evaluation rule and case that
        # references the deleted fact (rebuilding the lists avoids the
        # delete-while-iterating index bugs)
        Structs.ruleList[:] = [r for r in Structs.ruleList if catfact not in r.elements]
        self.populatelctrl4(Structs.ruleList)
        Structs.foList[:] = [fo for fo in Structs.foList if catfact not in fo.elements]
        self.populatelctrl6(Structs.foList)
        Structs.erList[:] = [er for er in Structs.erList if catfact not in er.elements]
        self.populatelctrl10(Structs.erList)
        Structs.caseList[:] = [c for c in Structs.caseList if catfact not in c.facts]
        self.populatelctrl11(Structs.caseList)
    def onFactChange(self, oldcatfact, newcatfact):
        # rename the fact everywhere it is referenced (the original version
        # never reset its inner index per item, corrupting later entries)
        for rule in Structs.ruleList:
            rule.elements = [newcatfact if el == oldcatfact else el for el in rule.elements]
        self.populatelctrl4(Structs.ruleList)
        for fo in Structs.foList:
            fo.elements = [newcatfact if el == oldcatfact else el for el in fo.elements]
        self.populatelctrl6(Structs.foList)
        for er in Structs.erList:
            er.elements = [newcatfact if el == oldcatfact else el for el in er.elements]
        self.populatelctrl10(Structs.erList)
        for case in Structs.caseList:
            case.facts = [newcatfact if f == oldcatfact else f for f in case.facts]
        self.populatelctrl11(Structs.caseList)
#Security attribute
    def onClickLoadSA(self, e):
        dlg = wx.FileDialog(self, "Choose a file to load the security attributes from:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.loadSA(dlg.GetPath())
            self.populatelctrl3(Structs.saList)
        dlg.Destroy()
    def onClickSaveSA(self, e):
        dlg = wx.FileDialog(self, "Choose a file to save the security attributes to:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.saveSA(dlg.GetPath())
        dlg.Destroy()
def onClickAddSA(self, e):
dia = AddSADialog(self, -1, 'Add a security attribute')
dia.ShowModal()
dia.Destroy()
    def onClickDeleteSA(self, e):
        saname = Structs.saList[self.SAListView.GetFocusedItem()].name
        # rebuild the dependent lists instead of deleting while iterating
        Structs.foList[:] = [fo for fo in Structs.foList if fo.security_attribute != saname]
        self.populatelctrl6(Structs.foList)
        Structs.erList[:] = [er for er in Structs.erList if er.security_attribute != saname]
        self.populatelctrl10(Structs.erList)
        del Structs.saList[self.SAListView.GetFocusedItem()]
        self.populatelctrl3(Structs.saList)
def onClickEditSA(self, e):
dia = EditSADialog(self, -1, 'Edit the security attribute')
dia.ShowModal()
dia.Destroy()
def populatelctrl3(self, thelist):
self.SAListView.DeleteAllItems()
for item in thelist:
self.SAListView.Append((item.name,item.description))
#Rule
    def onClickLoadRules(self, e):
        dlg = wx.FileDialog(self, "Choose a file to load the rules from:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.loadRules(dlg.GetPath())
            self.populatelctrl4(Structs.ruleList)
        dlg.Destroy()
    def onClickSaveRules(self, e):
        dlg = wx.FileDialog(self, "Choose a file to save the rules to:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.saveRules(dlg.GetPath())
        dlg.Destroy()
def onClickAddRule(self, e):
dia = AddRuleDialog(self, -1, 'Add a Rule')
dia.ShowModal()
dia.Destroy()
def onClickDeleteRule(self, e):
del Structs.ruleList[self.ruleListView.GetFocusedItem()]
self.populatelctrl4(Structs.ruleList)
def onClickEditRule(self, e):
dia = EditRuleDialog(self, -1, 'Edit the Rule')
dia.ShowModal()
dia.Destroy()
def populatelctrl4(self, thelist):
self.ruleListView.DeleteAllItems()
for item in thelist:
ele = ''
for element in item.elements:
ele = ele + element
self.ruleListView.Append((ele,)) # (ele,) != (ele) != (ele,"")
#Facts order
    def onClickLoadFOs(self, e):
        dlg = wx.FileDialog(self, "Choose a file to load facts order from:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.loadFOs(dlg.GetPath())
            self.populatelctrl6(Structs.foList)
        dlg.Destroy()
    def onClickSaveFOs(self, e):
        dlg = wx.FileDialog(self, "Choose a file to save facts order to:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.saveFOs(dlg.GetPath())
        dlg.Destroy()
def onClickAddFO(self, e):
dia = AddFODialog(self, -1, 'Add the Facts Order')
dia.ShowModal()
dia.Destroy()
def onClickDeleteFO(self, e):
del Structs.foList[self.factsOrderListView.GetFocusedItem()]
self.populatelctrl6(Structs.foList)
def onClickEditFO(self, e):
dia = EditFODialog(self, -1, 'Edit the Facts Order')
dia.ShowModal()
dia.Destroy()
def populatelctrl6(self, thelist):
ele = ''
self.factsOrderListView.DeleteAllItems()
for item in thelist:
for element in item.elements:
ele = ele + element
self.factsOrderListView.Append((ele,item.security_attribute))
ele = ''
#Evaluation rules
    def onClickLoadERs(self, e):
        dlg = wx.FileDialog(self, "Choose a File to Load Evaluation Rules from:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.loadERs(dlg.GetPath())
            self.populatelctrl10(Structs.erList)
        dlg.Destroy()
    def onClickSaveERs(self, e):
        dlg = wx.FileDialog(self, "Choose a File to Save Evaluation Rules to:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.saveERs(dlg.GetPath())
        dlg.Destroy()
def onClickAddER(self, e):
dia = AddERDialog(self, -1, 'Add an Evaluation Rule')
dia.ShowModal()
dia.Destroy()
def onClickDeleteER(self, e):
del Structs.erList[self.evaluationRulesListView.GetFocusedItem()]
self.populatelctrl10(Structs.erList)
def onClickEditER(self, e):
dia = EditERDialog(self, -1, 'Edit the Evaluation Rules')
dia.ShowModal()
dia.Destroy()
def populatelctrl10(self, thelist):
ele = ''
self.evaluationRulesListView.DeleteAllItems()
for item in thelist:
for element in item.elements:
ele = ele + element
self.evaluationRulesListView.Append((ele, item.influence, item.security_attribute))
ele = ''
#Case
    def onClickLoadCases(self, e):
        dlg = wx.FileDialog(self, "Choose a File to Load Cases from:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.loadCases(dlg.GetPath())
            self.populatelctrl11(Structs.caseList)
        dlg.Destroy()
    def onClickSaveCases(self, e):
        dlg = wx.FileDialog(self, "Choose a File to Save Cases to:")
        if dlg.ShowModal() == wx.ID_OK:
            Utility.saveCases(dlg.GetPath())
        dlg.Destroy()
def onClickAddCase(self, e):
dia = AddCaseDialog(self, -1, 'Add a Case')
dia.ShowModal()
dia.Destroy()
def onClickDeleteCase(self, e):
del Structs.caseList[self.casesListView.GetFocusedItem()]
self.populatelctrl11(Structs.caseList)
def onClickEditCase(self, e):
dia = EditCaseDialog(self, -1, 'Edit the Case')
dia.ShowModal()
dia.Destroy()
def onClickEvaluateCase(self, e):
dia = EvaluateCaseDialog(self, -1, 'Case QoP Evaluation')
dia.ShowModal()
dia.Destroy()
def onClickEvaluateAllCases(self, e):
        dia = EvaluateAllCasesDialog(self, -1, 'All Cases QoP Evaluation')
dia.ShowModal()
dia.Destroy()
def populatelctrl11(self, thelist):
ele = ''
self.casesListView.DeleteAllItems()
for item in thelist:
for element in item.facts:
ele = ele + ' ' + element
self.casesListView.Append((item.casename, ele, item.description))
ele = ''
def __set_properties(self):
self.SetTitle(("Security Mechanisms Evaluation Tool"))
def __do_layout(self):
# -------------------------- TAB1 --------------------------
# create horizontal sizer for all the buttons
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(self.logoSmall1, 0, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadCategoriesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.saveCategoriesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.addCategoriesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.deleteCategoriesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.editCategoriesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.categoriesListView, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.categoriesPanel.SetSizer(sizer)
# -------------------------- TAB2 --------------------------
# create horizontal sizer for all the buttons
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(self.logoSmall2, 0, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadFactsBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.saveFactsBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.addFactsBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.deleteFactBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.editFactBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.viewFactsBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.factsListView, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.factsPanel.SetSizer(sizer)
# -------------------------- TAB3 --------------------------
# create horizontal sizer for all the buttons
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(self.logoSmall3, 0, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadSABtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.saveSABtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.addSABtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.deleteSABtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.editSABtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.SAListView, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.SAPanel.SetSizer(sizer)
# -------------------------- TAB4 --------------------------
# create horizontal sizer for all the buttons
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(self.logoSmall4, 0, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadRuleBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.saveRuleBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.addRuleBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.deleteRuleBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.editRuleBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.ruleListView, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.rulePanel.SetSizer(sizer)
# -------------------------- TAB5 --------------------------
# create horizontal sizer for all the buttons
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(self.logoSmall5, 0, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadFactsOrderBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.saveFactsOrderBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.addFactsOrderBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.deleteFactsOrderBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.editFactsOrderBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.factsOrderListView, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.factsOrderPanel.SetSizer(sizer)
# -------------------------- TAB6 --------------------------
# create horizontal sizer for all the buttons
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(self.logoSmall6, 0, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadEvaluationRulesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.saveEvaluationRulesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.addEvaluationRulesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.deleteEvaluationRuleBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.editEvaluationRuleBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.evaluationRulesListView, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.evaluationRulesPanel.SetSizer(sizer)
# -------------------------- TAB7 --------------------------
# create horizontal sizer for all the buttons
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(self.logoSmall7, 0, wx.ALIGN_CENTER, 5)
buttonsSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsSizer.Add(self.loadCasesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.saveCasesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.addCasesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.deleteCasesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.editCasesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.evaluateCasesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
buttonsSizer.Add(self.evaluateAllCasesBtn, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.casesListView, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(buttonsSizer, 0, wx.EXPAND | wx.ALL, 5)
self.casesPanel.SetSizer(sizer)
#ALLTABS
self.notebook_1.AddPage(self.categoriesPanel, ("Categories"))
self.notebook_1.SetPageImage(0, self.categoriesTabImg)
self.notebook_1.AddPage(self.factsPanel, ("Facts"))
self.notebook_1.SetPageImage(1, self.factsTabImg)
self.notebook_1.AddPage(self.SAPanel, ("Security Attributes"))
self.notebook_1.SetPageImage(2, self.secAttrTabImg)
self.notebook_1.AddPage(self.rulePanel, ("Rules"))
self.notebook_1.SetPageImage(3, self.rulesTabImg)
self.notebook_1.AddPage(self.factsOrderPanel, ("Facts Order"))
self.notebook_1.SetPageImage(4, self.factsOrderTabImg)
self.notebook_1.AddPage(self.evaluationRulesPanel, ("Evaluation Rules"))
self.notebook_1.SetPageImage(5, self.evaluationRulesTabImg)
self.notebook_1.AddPage(self.casesPanel, ("Cases"))
self.notebook_1.SetPageImage(6, self.casesTabImg)
sizer = wx.BoxSizer(wx.HORIZONTAL)
        sizer.Add(self.notebook_1, 1, wx.EXPAND)
self.SetSizer(sizer)
self.Layout()
if __name__ == "__main__":
app = wx.App()
smetool = SMETool(None, wx.ID_ANY, "")
smetool.SetClientSize(wx.Size(850,550))
smetool.CenterOnScreen()
smetool.Show()
app.SetTopWindow(smetool)
    app.MainLoop()

# -------------------- end of sme/SMETool.py --------------------
import os
import Structs
import pickle
import wx
# Read from and write to a file; os.path.join keeps the paths inside the
# "files" directory portable across platforms
def readFile(filename):
    filename = os.path.join("files", filename)
    f = open(filename)
    lines = [line.strip() for line in f]
    f.close()
    return lines
# First clears the file, then writes one list item per line
def writeFile(filename, thelist):
    filename = os.path.join("files", filename)
    f = open(filename, 'w')
for item in thelist:
f.write("%s\n" % item)
f.close()
#picklingMethods
def saveCategories(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.categoryList, handle)
def loadCategories(path):
with open(path, 'rb') as handle:
Structs.categoryList = pickle.load(handle)
def saveFacts(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.factList, handle)
def loadFacts(path):
with open(path, 'rb') as handle:
Structs.factList = pickle.load(handle)
def saveSA(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.saList, handle)
def loadSA(path):
with open(path, 'rb') as handle:
Structs.saList = pickle.load(handle)
def saveRules(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.ruleList, handle)
def loadRules(path):
with open(path, 'rb') as handle:
Structs.ruleList = pickle.load(handle)
def saveFOs(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.foList, handle)
def loadFOs(path):
with open(path, 'rb') as handle:
Structs.foList = pickle.load(handle)
def saveERs(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.erList, handle)
def loadERs(path):
with open(path, 'rb') as handle:
Structs.erList = pickle.load(handle)
def saveCases(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.caseList, handle)
def loadCases(path):
with open(path, 'rb') as handle:
Structs.caseList = pickle.load(handle)
def saveEvaluations(path):
with open(path, 'wb') as handle:
pickle.dump(Structs.evalList, handle)
def loadEvaluations(path):
with open(path, 'rb') as handle:
        Structs.evalList = pickle.load(handle)

# -------------------- end of sme/Utility.py --------------------
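The save*/load* helpers in Utility.py above all follow the same pickle round-trip pattern. A minimal standalone sketch of that pattern is below; the helper names and the temporary directory are illustrative, not AQoPA APIs.

```python
import os
import pickle
import tempfile

def save_list(path, items):
    # Same pattern as saveCategories/saveFacts/...: dump one object per file
    with open(path, 'wb') as handle:
        pickle.dump(items, handle)

def load_list(path):
    # Same pattern as loadCategories/loadFacts/...: read the object back
    with open(path, 'rb') as handle:
        return pickle.load(handle)

# Round-trip check in a throwaway directory instead of the hard-coded "files"
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, 'categories.pkl')
save_list(path, ['cat1', 'cat2'])
assert load_list(path) == ['cat1', 'cat2']
```

Each AQoPA helper differs only in which `Structs` list it pickles, so a single parameterized pair like this could replace all of them.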
import wx
import wx.lib.newevent
import threading
from aqopa.model.parser import ParserException, ModelParserException,\
MetricsParserException, ConfigurationParserException
from aqopa.model.parser.lex_yacc.grammar import algorithms
from aqopa.model.store import QoPMLModelStore
from aqopa.model import HostProcess, name_indexes, original_name,\
HostSubprocess, WhileInstruction, IfInstruction
from aqopa.simulator import Simulator,\
expression, state, equation, metrics, communication, scheduler, predefined, algorithm
from aqopa.simulator.state import Executor,\
AssignmentInstructionExecutor, IfInstructionExecutor,\
ProcessInstructionExecutor, SubprocessInstructionExecutor,\
FinishInstructionExecutor, CommunicationInstructionExecutor,\
ContinueInstructionExecutor, WhileInstructionExecutor, Host, Process,\
CallFunctionInstructionExecutor, PrintExecutor, BreakInstructionExecutor
from aqopa.simulator.error import EnvironmentDefinitionException
class Builder():
"""
Builder that builds environment elements
"""
def build_store(self):
"""
Builds store - keeps all parsed elements from qopml model
"""
return QoPMLModelStore()
def _build_functions(self, store):
"""
Validate and build functions
"""
functions = []
errors = []
for f in store.functions:
err = False
for ef in functions:
if ef.name == f.name:
errors.append("Function %s redeclared" % f.name)
err = True
break
if not err:
functions.append(f)
if len(errors) > 0:
raise EnvironmentDefinitionException('Functions redeclaration.', errors=errors)
return functions
def _build_equations(self, store, functions):
"""
Validate, build and return simulation equations build from parsed equations.
"""
validator = equation.Validator()
validator.validate(store.equations, functions)
equations = []
for parsed_equation in store.equations:
equations.append(equation.Equation(parsed_equation.composite, parsed_equation.simple))
return equations
def _build_hosts(self, store, version, functions, channels, populator, reducer):
"""
Rebuild hosts - create new instances with updated instructions lists according to version.
"""
def remove_unused_subprocesses(instructions_list, run_process):
"""
            Creates the instructions list according to the "run process":
            removes subprocesses that are not selected to run.
"""
if run_process.all_subprocesses_active:
return instructions_list
new_instructions_list = []
for instruction in instructions_list:
if isinstance(instruction, HostSubprocess):
subprocess = instruction
if subprocess.name in run_process.active_subprocesses:
new_instructions_list.append(subprocess)
subprocess.instructions_list = \
remove_unused_subprocesses(subprocess.instructions_list,
run_process)
else:
new_instructions_list.append(instruction)
if isinstance(instruction, WhileInstruction):
instruction.instructions = \
remove_unused_subprocesses(instruction.instructions,
run_process)
if isinstance(instruction, IfInstruction):
instruction.true_instructions = \
remove_unused_subprocesses(instruction.true_instructions,
run_process)
instruction.false_instructions = \
remove_unused_subprocesses(instruction.false_instructions,
run_process)
return new_instructions_list
def get_channels_assgined_to_host(run_host, parsed_host, built_channels):
"""
            Creates a list of all channels that the host can use.
            The version ("run host") is checked first, then the parsed host.
"""
channel_names = []
if run_host.all_channels_active:
if parsed_host.all_channels_active:
return built_channels
else:
channel_names.extend([ c for c in parsed_host.active_channels ])
else:
channel_names.extend([ c for c in run_host.active_channels ])
# channel_names - names of channels that host can use
channels = []
for channel_name in channel_names:
for channel in built_channels:
if channel.name == channel_name:
channels.append(channel)
return channels
def get_channels_assigned_to_process(run_process, parsed_process, built_channels):
"""
            Creates a list of all channels that the process can use,
            based on the channels declared by the parsed process.
"""
channel_names = []
            # Get channel names used by the process
if parsed_process.all_channels_active:
return built_channels
else:
for channel_name in parsed_process.active_channels:
channel_names.append(channel_name)
# channel_names - names of channels that host can use
channels = []
for channel_name in channel_names:
for channel in built_channels:
if channel.name == channel_name:
channels.append(channel)
return channels
def build_host_instructions_list(parsed_host, run_host, repetition_number, channels):
"""
Create host's and its processes' instruction lists
according to "run host" (repetitions)
"""
def find_process(instructions_list, process_name):
for instr in instructions_list:
if not isinstance(instr, HostProcess):
continue
if instr.name == process_name:
return instr
return None
processes_numbers = {}
host_instructions_list = []
for run_process in run_host.run_processes:
# Find parsed process
parsed_process = find_process(parsed_host.instructions_list, run_process.process_name)
if parsed_process is None:
raise EnvironmentDefinitionException("Process '%s' does not exist in host '%s'." %
(run_process.process_name, parsed_host.name))
                # Clone the parsed process, because the builder
                # may change its instructions list
                # and remove some subprocesses that would be
                # needed by another version
parsed_process = parsed_process.clone()
# Define initial process number (if needed)
if run_process.process_name not in processes_numbers:
processes_numbers[run_process.process_name] = 0
# If process has follower
if run_process.follower:
# Define initial process number for following process (if needed)
if run_process.follower.process_name not in processes_numbers:
processes_numbers[run_process.follower.process_name] = 0
# Get channels assigned to process and its follower (if needed)
process_channels = get_channels_assigned_to_process(run_process, parsed_process, channels)
follower_channels = []
if run_process.follower:
follower_parsed_process = find_process(parsed_host.instructions_list, run_process.follower.process_name)
follower_channels = get_channels_assigned_to_process(run_process.follower, follower_parsed_process, channels)
for i in range(0, run_process.repetitions):
# Create instructions list
instructions_list = remove_unused_subprocesses(parsed_process.instructions_list, run_process)
# Create new simulation process
simulated_process = Process(parsed_process.name, instructions_list)
# Connect with channels
for ch in process_channels:
ch.connect_with_process(simulated_process)
# Update process index
process_number = processes_numbers[run_process.process_name] + i
simulated_process.add_name_index(process_number)
                    # Do the same for the follower
if run_process.follower:
# Find process of follower
follower_parsed_process = find_process(parsed_host.instructions_list, run_process.follower.process_name)
# Build instructions list for follower
follower_instructions_list = remove_unused_subprocesses(follower_parsed_process.instructions_list, run_process.follower)
# Create simulated follower
simulated_follower = Process(follower_parsed_process.name, follower_instructions_list)
# Connect with channels
for ch in follower_channels:
ch.connect_with_process(simulated_follower)
# Update follower index
follower_number = processes_numbers[run_process.follower.process_name] + i
simulated_follower.add_name_index(follower_number)
simulated_process.follower = simulated_follower
host_instructions_list.append(simulated_process)
return host_instructions_list
def set_scheduler(host, algorithm):
"""
Build and set host's scheduler
"""
host.set_scheduler(scheduler.create(host, algorithm))
built_hosts = []
        # hosts_numbers keeps the next free name index for each repeated host
hosts_numbers = {}
for run_host in version.run_hosts:
if run_host.host_name not in hosts_numbers:
hosts_numbers[run_host.host_name] = 0
# Create prototype parsed host for this "run host"
parsed_host = store.find_host(run_host.host_name)
assigned_channels = get_channels_assgined_to_host(run_host, parsed_host, channels)
for i in range(0, run_host.repetitions):
# Build next instructions list for next repeated host
instructions_list = build_host_instructions_list(parsed_host, run_host, i, channels)
simulation_host = Host(parsed_host.name, instructions_list)
# Set the number of host
host_number = hosts_numbers[run_host.host_name] + i
simulation_host.add_name_index(host_number)
# Set scheduler
set_scheduler(simulation_host, parsed_host.schedule_algorithm)
for ch in assigned_channels:
ch.connect_with_host(simulation_host)
built_hosts.append(simulation_host)
hosts_numbers[run_host.host_name] += run_host.repetitions
return built_hosts
def _set_hosts_predefined_values(self, store, hosts, populator):
def set_predefined_variables(host, predefined_values, populator):
"""
Populate predefined values with expressions
and save them as variables in host
"""
for predefined_value in predefined_values:
populated_value = populator.populate(predefined_value.expression.clone(), host)
# print 'Setting predefined variable ', predefined_value.variable_name, \
# ' in host ', host.name, ' with value ', unicode(populated_value), \
# ' (', getattr(populated_value, '_host_name', 'None'), ')'
host.set_variable(predefined_value.variable_name, populated_value)
for host in hosts:
parsed_host = store.find_host(host.original_name())
# Save predefined values as variables
set_predefined_variables(host, parsed_host.predefined_values, populator)
return hosts
def _build_expression_populator(self, reducer):
"""
Build and return object that populates expressions
with variables' values.
"""
return expression.Populator(reducer)
def _build_expression_checker(self, populator):
"""
Build and return object that checks the logic value of expressions.
"""
return expression.Checker(populator)
def _build_expression_reducer(self, equations):
"""
Build and return object that reduces expressions.
"""
return expression.Reducer(equations)
def _build_channels(self, store, version):
"""
Validate, build and return simulation channels build from parsed channels.
Includes channel repetitions.
"""
built_channels = []
for parsed_channel in store.channels:
channel = communication.Channel(parsed_channel.name, parsed_channel.buffor_size, parsed_channel.tag_name)
built_channels.append(channel)
return built_channels
def _build_topology(self, topology_rules, hosts):
def find_left_hosts(topology_host, hosts):
found_hosts = []
for host in hosts:
# If host has the same identifier
if host.original_name() == topology_host.identifier:
# If no range is specified
if topology_host.index_range is None:
found_hosts.append(host)
else:
# Range is specified
start_i = topology_host.index_range[0]
end_i = topology_host.index_range[1]
i = host.get_name_index()
if (start_i is None or i >= start_i) and (end_i is None or i <= end_i):
found_hosts.append(host)
return found_hosts
def find_right_hosts(topology_host, hosts, current_host):
if topology_host is None:
return []
found_hosts = []
for host in hosts:
# If host has the same identifier
if host.original_name() == topology_host.identifier:
# If no range is specified
if topology_host.index_range is None:
# If no index shift is specified
if topology_host.i_shift is None:
found_hosts.append(host)
else:
# Index shift is specified
i = current_host.get_name_index()
i += topology_host.i_shift
shifted_host_name = topology_host.identifier + "." + str(i)
if host.name == shifted_host_name:
found_hosts.append(host)
else:
# Range is specified
start_i = topology_host.index_range[0]
end_i = topology_host.index_range[1]
i = host.get_name_index()
if (start_i is None or i >= start_i) and (end_i is None or i <= end_i):
found_hosts.append(host)
return found_hosts
def add_connection(topology, from_host, to_host, parameters):
if from_host not in topology:
topology[from_host] = {'hosts': [], 'parameters': {}}
if to_host not in topology[from_host]['hosts']:
topology[from_host]['hosts'].append(to_host)
for parameter in parameters:
if parameter not in topology[from_host]['parameters']:
topology[from_host]['parameters'][parameter] = {}
topology[from_host]['parameters'][parameter][to_host] = parameters[parameter]
return topology
topology = {}
for rule in topology_rules:
for left_host in find_left_hosts(rule.left_host, hosts):
for right_host in find_right_hosts(rule.right_host, hosts, left_host):
if rule.arrow == '->' or rule.arrow == '<->':
topology = add_connection(topology, left_host, right_host, rule.parameters)
if rule.arrow == '<-' or rule.arrow == '<->':
topology = add_connection(topology, right_host, left_host, rule.parameters)
return topology
def _build_channels_manager(self, channels, built_hosts, version, store):
"""
Build channels manager
"""
mgr = communication.Manager(channels)
for name in version.communication['mediums']:
topology_rules = version.communication['mediums'][name]['topology']['rules']
default_params = version.communication['mediums'][name]['default_parameters']
mgr.add_medium(name, self._build_topology(topology_rules, built_hosts), default_params)
for name in store.mediums:
if not mgr.has_medium(name):
topology_rules = store.mediums[name]['topology']['rules']
default_params = store.mediums[name]['default_parameters']
mgr.add_medium(name, self._build_topology(topology_rules, built_hosts), default_params)
return mgr
def _build_predefined_functions_manager(self, context):
"""
Build manager for predefined functions
"""
return predefined.FunctionsManager(context)
def _build_metrics_manager(self, store, hosts, version):
"""
Build and return metrics manager.
"""
host_metrics = []
for metrics_data in store.metrics_datas:
blocks = []
# Build list of metrics blocks
for block in metrics_data.blocks:
params = block.header.params[1:]
service_params = block.header.services_params
metrics_block = metrics.Block(params, service_params)
for metric in block.metrics:
m = metrics.Metric(metric.arguments[0], metric.arguments[1:len(params)+1],
metric.arguments[len(params)+1:])
metrics_block.add_metric(m)
blocks.append(metrics_block)
# get or create host metrics with given name
hm = None
for existing_hm in host_metrics:
if existing_hm.name == metrics_data.name:
hm = existing_hm
break
if hm is None:
hm = metrics.HostMetrics(metrics_data.name)
host_metrics.append(hm)
if metrics_data.plus or metrics_data.star:
if metrics_data.plus:
hm.plus_blocks = blocks
else: # star
hm.star_blocks = blocks
                # If this host metrics has no normal blocks, search for them
                # in the host metrics with the simple (unqualified) name,
                # e.g. hm1 rather than hm1.1, and assign them if found
if len(hm.normal_blocks) == 0:
hm_original_name = original_name(hm.name)
hm_original = None
for existing_hm in host_metrics:
if original_name(existing_hm.name) == hm_original_name \
and existing_hm != hm:
hm_original = existing_hm
break
if hm_original:
hm.normal_blocks = hm_original.normal_blocks
else: # metrics_data normal
hm_original_name = original_name(hm.name)
# Assign normal block to all host metrics with the same original name
for existing_hm in host_metrics:
if original_name(existing_hm.name) == hm_original_name:
                        # Assign normal block to all host metrics with the same original name
# (including the current one - possibly created)
existing_hm.normal_blocks = blocks
# Connect host metrics with hosts
for metrics_set in version.metrics_sets:
for h in hosts:
if h.original_name() == metrics_set.host_name:
for host_metric in host_metrics:
if host_metric.name == metrics_set.configuration_name:
host_metric.connected_hosts.append(h)
break
return metrics.Manager(host_metrics)
def _build_algorithms_resolver(self, store):
"""
"""
resolver = algorithm.AlgorithmResolver()
for alg_name in store.algorithms:
resolver.add_algorithm(alg_name, store.algorithms[alg_name])
return resolver
def _build_context(self, store, version):
"""
Builds context with initial state.
"""
functions = self._build_functions(store)
equations = self._build_equations(store, functions)
expression_reducer = self._build_expression_reducer(equations)
expression_populator = self._build_expression_populator(expression_reducer)
expression_checker = self._build_expression_checker(expression_populator)
channels = self._build_channels(store, version)
hosts = self._build_hosts(store, version, functions, channels,
expression_populator, expression_reducer)
# Context
c = state.Context(version)
c.functions = functions
c.hosts = hosts
c.expression_reducer = expression_reducer
c.expression_checker = expression_checker
c.expression_populator = expression_populator
c.metrics_manager = self._build_metrics_manager(store, hosts, version)
c.channels_manager = self._build_channels_manager(channels, hosts, version, store)
c.algorithms_resolver = self._build_algorithms_resolver(store)
# Predefined manager
predefined_functions_manager = self._build_predefined_functions_manager(c)
expression_populator.predefined_functions_manager = predefined_functions_manager
expression_reducer.predefined_functions_manager = predefined_functions_manager
# Predefined hosts' variables
self._set_hosts_predefined_values(store, hosts, expression_populator)
return c
def build_executor(self):
"""
Creates executor for simulation
"""
e = Executor()
e.append_instruction_executor(AssignmentInstructionExecutor())
e.append_instruction_executor(CallFunctionInstructionExecutor())
e.append_instruction_executor(ProcessInstructionExecutor())
e.append_instruction_executor(SubprocessInstructionExecutor())
e.append_instruction_executor(CommunicationInstructionExecutor())
e.append_instruction_executor(FinishInstructionExecutor())
e.append_instruction_executor(ContinueInstructionExecutor())
e.append_instruction_executor(BreakInstructionExecutor())
e.append_instruction_executor(IfInstructionExecutor())
e.append_instruction_executor(WhileInstructionExecutor())
return e
def build_simulator(self, store, version):
"""
Creates simulator for particular version.
"""
sim = Simulator(self._build_context(store, version))
sim.set_executor(self.build_executor())
return sim
def build_model_parser(self, store, modules):
"""
        Builds a parser that parses a model written in QoPML
        and populates the store.
"""
from aqopa.model.parser.lex_yacc import LexYaccParser
from aqopa.model.parser.lex_yacc.grammar import main,\
functions, channels, equations, expressions, instructions,\
hosts, modules as modules_module, communication as comm_grammar
parser = LexYaccParser()
parser.set_store(store) \
.add_extension(main.ModelParserExtension()) \
.add_extension(modules_module.ModelParserExtension()) \
.add_extension(functions.ModelParserExtension()) \
.add_extension(channels.ModelParserExtension()) \
.add_extension(equations.ModelParserExtension()) \
.add_extension(expressions.ModelParserExtension()) \
.add_extension(comm_grammar.ModelParserExtension()) \
.add_extension(algorithms.ModelParserExtension()) \
.add_extension(instructions.ModelParserExtension()) \
.add_extension(hosts.ModelParserExtension())
for m in modules:
parser = m.extend_model_parser(parser)
return parser.build()
def build_metrics_parser(self, store, modules):
"""
        Builds a parser that parses metrics written in QoPML
        and populates the store.
"""
from aqopa.model.parser.lex_yacc import LexYaccParser
from aqopa.model.parser.lex_yacc.grammar import main, metrics as metrics_grammar
parser = LexYaccParser()
parser.set_store(store)\
.add_extension(main.MetricsParserExtension())\
.add_extension(metrics_grammar.MetricsParserExtension())
for m in modules:
parser = m.extend_metrics_parser(parser)
return parser.build()
def build_config_parser(self, store, modules):
"""
        Builds a parser that parses the configuration written in QoPML
        and populates the store.
"""
from aqopa.model.parser.lex_yacc import LexYaccParser
from aqopa.model.parser.lex_yacc.grammar import versions, main, communication as comm_grammar
parser = LexYaccParser()
parser.set_store(store)\
.add_extension(main.ConfigParserExtension())\
.add_extension(comm_grammar.ConfigParserExtension()) \
.add_extension(versions.ConfigParserExtension())
for m in modules:
parser = m.extend_config_parser(parser)
return parser.build()
class Interpreter():
"""
Interpreter is responsible for parsing the model,
creating the environment for simulations,
manipulating selected models.
"""
def __init__(self, builder=None, model_as_text="",
metrics_as_text="", config_as_text=""):
self.builder = builder if builder is not None else Builder()
self.model_as_text = model_as_text
self.metrics_as_text = metrics_as_text
self.config_as_text = config_as_text
self.store = None
self._modules = []
def set_qopml_model(self, model_as_text):
"""
Set qopml model that will be interpreted.
"""
self.model_as_text = model_as_text
return self
def set_qopml_metrics(self, metrics_as_text):
"""
Set qopml metrics that will be interpreted.
"""
self.metrics_as_text = metrics_as_text
return self
def set_qopml_config(self, config_as_text):
"""
Set qopml configuration that will be interpreted.
"""
self.config_as_text = config_as_text
return self
def register_qopml_module(self, qopml_module):
"""
Registers new module
"""
if qopml_module in self._modules:
raise EnvironmentDefinitionException(u"QoPML Module '%s' is already registered." % unicode(qopml_module))
self._modules.append(qopml_module)
return self
def parse(self, all_modules):
"""
Parses the model from model_as_text field and populates the store.
"""
if len(self.model_as_text) == 0:
raise EnvironmentDefinitionException("QoPML Model not provided.")
self.store = self.builder.build_store()
parser = self.builder.build_model_parser(self.store, all_modules)
parser.parse(self.model_as_text)
if len(parser.get_syntax_errors()) > 0:
raise ModelParserException('Invalid syntax.', syntax_errors=parser.get_syntax_errors())
parser = self.builder.build_metrics_parser(self.store, all_modules)
parser.parse(self.metrics_as_text)
if len(parser.get_syntax_errors()) > 0:
raise MetricsParserException('Invalid syntax.', syntax_errors=parser.get_syntax_errors())
parser = self.builder.build_config_parser(self.store, all_modules)
parser.parse(self.config_as_text)
if len(parser.get_syntax_errors()) > 0:
raise ConfigurationParserException('Invalid syntax.', syntax_errors=parser.get_syntax_errors())
def install_modules(self, simulator):
""" """
raise NotImplementedError()
def run(self):
""" Runs all simulations """
raise NotImplementedError()
class ConsoleInterpreter(Interpreter):
def __init__(self, builder=None, model_as_text="",
metrics_as_text="", config_as_text=""):
Interpreter.__init__(self, builder, model_as_text, metrics_as_text, config_as_text)
self.simulators = []
def save_states_to_file(self, simulator):
"""
        Tells the simulator to write the flow of states to a file.
"""
f = open('VERSION_%s_STATES_FLOW' % simulator.context.version.name, 'w')
simulator.get_executor().prepend_instruction_executor(PrintExecutor(f))
def prepare(self):
""" Prepares for run """
for version in self.store.versions:
simulator = self.builder.build_simulator(self.store, version)
self.simulators.append(simulator)
self.install_modules(simulator)
def install_modules(self, simulator):
""" """
for m in self._modules:
m.install_console(simulator)
def run(self):
""" Runs all simulations """
for s in self.simulators:
s.prepare()
s.run()
self.on_finished(s)
def on_finished(self, simulator):
pass
def is_finished(self):
for s in self.simulators:
if not s.is_simulation_finished():
return False
return True
class GuiInterpreter(Interpreter):
def __init__(self, builder=None, model_as_text="",
metrics_as_text="", config_as_text=""):
Interpreter.__init__(self, builder=builder,
model_as_text=model_as_text,
metrics_as_text=metrics_as_text,
config_as_text=config_as_text)
self.simulators = []
def prepare(self):
""" Prepares for run """
for version in self.store.versions:
simulator = self.builder.build_simulator(self.store, version)
self.install_modules(simulator)
self.simulators.append(simulator)
def install_modules(self, simulator):
""" """
for m in self._modules:
m.install_gui(simulator)
def run_simulation(self, simulator):
""" """
simulator.prepare()
simulator.run()
        return simulator

# -------------------- end of aqopa/app.py --------------------
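The nested dictionary that `Builder._build_topology` produces is easiest to see in isolation. The sketch below copies the `add_connection` helper from the source verbatim and exercises it the way a bidirectional `A <-> B` rule would; the host names and the `time` parameter are made-up examples.

```python
# Verbatim copy of the add_connection helper from Builder._build_topology:
# topology maps each sender host to its reachable hosts and to
# per-parameter, per-receiver values.
def add_connection(topology, from_host, to_host, parameters):
    if from_host not in topology:
        topology[from_host] = {'hosts': [], 'parameters': {}}
    if to_host not in topology[from_host]['hosts']:
        topology[from_host]['hosts'].append(to_host)
    for parameter in parameters:
        if parameter not in topology[from_host]['parameters']:
            topology[from_host]['parameters'][parameter] = {}
        topology[from_host]['parameters'][parameter][to_host] = parameters[parameter]
    return topology

topology = {}
# A bidirectional rule "A <-> B" results in two calls, one per direction
topology = add_connection(topology, 'A.0', 'B.0', {'time': 5})
topology = add_connection(topology, 'B.0', 'A.0', {'time': 5})
assert topology['A.0']['hosts'] == ['B.0']
assert topology['A.0']['parameters']['time']['B.0'] == 5
assert topology['B.0']['hosts'] == ['A.0']
```

This matches the arrow handling in `_build_topology`: `->` adds the left-to-right connection, `<-` the right-to-left one, and `<->` adds both.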
import optparse
import sys
import os
from aqopa import VERSION
from aqopa.bin import console, gui
def gui_command():
app = gui.AqopaApp(False)
app.MainLoop()
def console_command():
parser = optparse.OptionParser()
parser.usage = "%prog [options]"
parser.add_option("-f", "--model-file", dest="model_file", metavar="FILE",
help="specifies model file")
parser.add_option("-m", "--metrics-file", dest="metrics_file", metavar="FILE",
help="specifies file with metrics")
parser.add_option("-c", "--config-file", dest="config_file", metavar="FILE",
help="specifies file with modules configuration")
parser.add_option("-s", "--states", dest="save_states", action="store_true", default=False,
help="save states flow in a file")
parser.add_option("-p", '--progressbar', dest="show_progressbar", action="store_true", default=False,
help="show the progressbar of the simulation")
parser.add_option("-V", '--version', dest="show_version", action="store_true", default=False,
help="show version of AQoPA")
parser.add_option("-d", "--debug", dest="debug", action="store_true", default=False,
help="DEBUG mode")
(options, args) = parser.parse_args()
if options.show_version:
print "AQoPA (version %s)" % VERSION
sys.exit(0)
if not options.model_file:
parser.error("no qopml model file specified")
if not os.path.exists(options.model_file):
parser.error("qopml model file '%s' does not exist" % options.model_file)
if not options.metrics_file:
parser.error("no metrics file specified")
if not os.path.exists(options.metrics_file):
parser.error("metrics file '%s' does not exist" % options.metrics_file)
if not options.config_file:
parser.error("no configuration file specified")
if not os.path.exists(options.config_file):
parser.error("configuration file '%s' does not exist" % options.config_file)
f = open(options.model_file, 'r')
qopml_model = f.read()
f.close()
f = open(options.metrics_file, 'r')
qopml_metrics = f.read()
f.close()
f = open(options.config_file, 'r')
qopml_config = f.read()
f.close()
console.run(qopml_model, qopml_metrics, qopml_config,
save_states=options.save_states, debug=options.debug,
                show_progressbar=options.show_progressbar)

# -------------------- end of aqopa/cmd.py --------------------
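The `console_command` function above relies on `optparse` from the standard library. A small runnable sketch of how two of those option definitions behave; `parse_args` accepts an explicit argument list, which makes the behavior easy to check. The file name used here is a made-up example.

```python
import optparse

# Two of the options defined in console_command, reproduced standalone
parser = optparse.OptionParser()
parser.usage = "%prog [options]"
parser.add_option("-f", "--model-file", dest="model_file", metavar="FILE",
                  help="specifies model file")
parser.add_option("-s", "--states", dest="save_states", action="store_true",
                  default=False, help="save states flow in a file")

# Parsing an explicit list instead of sys.argv
options, args = parser.parse_args(["-f", "model.qopml", "-s"])
assert options.model_file == "model.qopml"
assert options.save_states is True
assert args == []
```

Note that `optparse` is deprecated in favor of `argparse` in current Python, but it is what this (Python 2 era) code base uses.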
import wx
import os
# AQoPA imports
import aqopa
# AQoPA gui imports
from aqopa.gui.models_lib_gui import LibraryFrame, EVT_MODEL_SELECTED
from aqopa.gui.main_notebook_gui import MainNotebook
"""
@file main_frame_gui.py
@brief GUI for the main frame (the one with the tabs on it)
@author Damian Rusinek <[email protected]>
@date created on 05-09-2013 by Damian Rusinek
@date edited on 07-05-2014 by Katarzyna Mazur (visual improvements)
"""
class MainFrame(wx.Frame):
""" """
def __init__(self, *args, **kwargs):
wx.Frame.__init__(self, *args, **kwargs)
###########
# MENUBAR
###########
# create menubar
menuBar = wx.MenuBar()
#create main menu, lets call it 'file' menu
fileMenu = wx.Menu()
# create menu item = about AQoPA
item = wx.MenuItem(fileMenu, wx.NewId(), u"&About AQoPA\tCTRL+I")
item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('about.png')))
fileMenu.AppendItem(item)
self.Bind(wx.EVT_MENU, self.OnAbout, item)
fileMenu.AppendSeparator()
# create menu item = quit AQoPa
item = wx.MenuItem(fileMenu, wx.NewId(), u"&Quit\tCTRL+Q")
item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('exit.png')))
fileMenu.AppendItem(item)
self.Bind(wx.EVT_MENU, self.OnQuit, item)
# add 'file' menu to the menubar
menuBar.Append(fileMenu, "&Menu")
# create library menu, here u can find modules library
libraryMenu = wx.Menu()
# create models menu item
item = wx.MenuItem(libraryMenu, wx.NewId(), u"&Browse models\tCTRL+M")
item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('models_lib.png')))
libraryMenu.AppendItem(item)
self.Bind(wx.EVT_MENU, self.OnBrowseModels, item)
# create metric menu item
item = wx.MenuItem(libraryMenu, wx.NewId(), u"&Browse metrics\tCTRL+F")
item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('metrics.png')))
libraryMenu.AppendItem(item)
# add 'library' menu to the menubar
menuBar.Append(libraryMenu, "&Library")
self.SetMenuBar(menuBar)
###################
# SIZERS & EVENTS
###################
self.mainNotebook = MainNotebook(self)
logoPanel = wx.Panel(self)
pic = wx.StaticBitmap(logoPanel)
pic.SetBitmap(wx.Bitmap(self.CreatePath4Resource('logo.png')))
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
sizer.Add(logoPanel, 0, wx.RIGHT| wx.ALL|wx.EXPAND, 5)
s2 = wx.BoxSizer(wx.VERTICAL)
s2.Add(sizer, 0, wx.LEFT| wx.ALL|wx.EXPAND, 5)
s2.Add(self.mainNotebook, 1, wx.ALL|wx.EXPAND, 5)
self.SetSizer(s2)
self.SetIcon(wx.Icon(self.CreatePath4Resource('app_logo.png'), wx.BITMAP_TYPE_PNG))
self.SetMinSize(wx.Size(900, 700))
self.CenterOnScreen()
self.Layout()
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
return os.path.join(tmp[0], 'bin', 'assets', resourceName)
def OnQuit(self, event=None):
"""
@brief closes the application
"""
self.Close()
def OnBrowseModels(self, event=None):
"""
@brief shows the library frame (models library window)
"""
libraryFrame = LibraryFrame(self, title="Models Library")
libraryFrame.Show(True)
libraryFrame.CentreOnParent()
#libraryFrame.Maximize(True)
libraryFrame.Bind(EVT_MODEL_SELECTED, self.OnLibraryModelSelected)
def OnLibraryModelSelected(self, event):
""" """
self.mainNotebook.SetModelData(event.model_data)
self.mainNotebook.SetMetricsData(event.metrics_data)
self.mainNotebook.SetVersionsData(event.versions_data)
# set filenames on GUI
self.mainNotebook.modelTab.SetFilenameOnGUI(event.filenames['model'])
self.mainNotebook.metricsTab.SetFilenameOnGUI(event.filenames['metrics'])
self.mainNotebook.versionsTab.SetFilenameOnGUI(event.filenames['versions'])
def OnAbout(self, event=None):
""" Show about info """
description = """AQoPA stands for Automated Quality of Protection Analysis Tool
for QoPML models."""
licence = """AQoPA is free software; you can redistribute
it and/or modify it under the terms of the GNU General Public License as
published by the Free Software Foundation; either version 2 of the License,
or (at your option) any later version.
AQoPA is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE."""
logo_filepath = self.CreatePath4Resource('logo.png')
info = wx.AboutDialogInfo()
info.SetIcon(wx.Icon(logo_filepath, wx.BITMAP_TYPE_PNG))
info.SetName('AQoPA')
info.SetVersion(aqopa.VERSION)
info.SetDescription(description)
info.SetCopyright('(C) 2013 QoPML Project')
info.SetWebSite('http://www.qopml.org')
info.SetLicence(licence)
info.AddDeveloper('Damian Rusinek')
info.AddDocWriter('Damian Rusinek')
info.AddArtist('QoPML Project')
info.AddTranslator('Damian Rusinek')
wx.AboutBox(info) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/gui/main_frame_gui.py | main_frame_gui.py |
import wx
import os
import wx.lib.newevent
# AQoPA gui imports
from aqopa.gui.modules_panel_gui import ModulesPanel, EVT_MODULES_CHANGED
from aqopa.gui.mmv_panel_gui import MMVPanel
from aqopa.gui.run_panel_gui import RunPanel, EVT_MODEL_PARSED
from aqopa.gui.results_panel_gui import ResultsPanel
"""
@file main_notebook_gui.py
@brief GUI for the main notebook, where we attach our AQoPA tabs
@author Damian Rusinek
@date created on 05-09-2013 by Damian Rusinek
@date edited on 07-05-2014 by Katarzyna Mazur (visual improvements mainly)
"""
# modules communication events
ModuleSimulationRequestEvent, EVT_MODULE_SIMULATION_REQUEST = wx.lib.newevent.NewEvent() # Parameters: module
ModuleSimulationAllowedEvent, EVT_MODULE_SIMULATION_ALLOWED = wx.lib.newevent.NewEvent() # Parameters: interpreter
ModuleSimulationFinishedEvent, EVT_MODULE_SIMULATION_FINISHED = wx.lib.newevent.NewEvent()
class MainNotebook(wx.Notebook):
""" """
def __init__(self, parent):
wx.Notebook.__init__(self, parent)
###########
# MODULES
###########
self.availableModules = []
from aqopa.module import timeanalysis
timeanalysis_module = timeanalysis.Module()
timeanalysis_module.get_gui().Bind(EVT_MODULE_SIMULATION_REQUEST,
self.OnModuleSimulationRequest)
timeanalysis_module.get_gui().Bind(EVT_MODULE_SIMULATION_FINISHED,
self.OnModuleSimulationFinished)
self.availableModules.append(timeanalysis_module)
from aqopa.module import energyanalysis
m = energyanalysis.Module(timeanalysis_module)
self.availableModules.append(m)
from aqopa.module import reputation
self.availableModules.append(reputation.Module())
# list containing notebook images:
# .ico seem to be more OS portable, although we use .png here
# the (20, 20) is the size in pixels of the images
il = wx.ImageList(20, 20)
modelsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('models_lib.png'), wx.BITMAP_TYPE_PNG))
metricsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('metrics.png'), wx.BITMAP_TYPE_PNG))
versionsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('versions.png'), wx.BITMAP_TYPE_PNG))
runTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('run.png'), wx.BITMAP_TYPE_PNG))
modulesTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('modules.png'), wx.BITMAP_TYPE_PNG))
resultsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('results.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
###########
# TABS
###########
self.modelTab = MMVPanel(self)
self.modelTab.Layout()
self.Bind(wx.EVT_TEXT, self.OnModelTextChange, self.modelTab.dataTextArea)
self.AddPage(self.modelTab, "Model")
self.SetPageImage(0, modelsTabImg)
self.metricsTab = MMVPanel(self)
self.metricsTab.Layout()
self.Bind(wx.EVT_TEXT, self.OnModelTextChange, self.metricsTab.dataTextArea)
self.AddPage(self.metricsTab, "Metrics")
self.SetPageImage(1, metricsTabImg)
self.versionsTab = MMVPanel(self)
self.versionsTab.Layout()
self.Bind(wx.EVT_TEXT, self.OnModelTextChange, self.versionsTab.dataTextArea)
self.versionsTab.Layout()
self.AddPage(self.versionsTab, "Versions")
self.SetPageImage(2, versionsTabImg)
self.modulesTab = ModulesPanel(self, modules=self.availableModules)
self.modulesTab.Bind(EVT_MODULES_CHANGED, self.OnModulesChange)
self.modulesTab.Layout()
self.AddPage(self.modulesTab, "Modules")
self.SetPageImage(3, modulesTabImg)
self.runTab = RunPanel(self)
self.runTab.SetAllModules(self.availableModules)
self.runTab.Layout()
self.runTab.Bind(EVT_MODEL_PARSED, self.OnModelParsed)
self.AddPage(self.runTab, "Run")
self.SetPageImage(4, runTabImg)
self.resultsTab = ResultsPanel(self)
self.resultsTab.Layout()
self.AddPage(self.resultsTab, "Results")
self.SetPageImage(5, resultsTabImg)
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
return os.path.join(tmp[0], 'bin', 'assets', resourceName)
def LoadModelFile(self, filePath):
self.modelTab.dataTextArea.LoadFile(filePath)
def LoadMetricsFile(self, filePath):
self.metricsTab.dataTextArea.LoadFile(filePath)
def LoadVersionsFile(self, filePath):
self.versionsTab.dataTextArea.LoadFile(filePath)
def SetModelData(self, data):
self.modelTab.dataTextArea.SetValue(data)
def SetMetricsData(self, data):
self.metricsTab.dataTextArea.SetValue(data)
def SetVersionsData(self, data):
self.versionsTab.dataTextArea.SetValue(data)
def GetModelData(self):
return self.modelTab.dataTextArea.GetValue().strip()
def GetMetricsData(self):
return self.metricsTab.dataTextArea.GetValue().strip()
def GetVersionsData(self):
return self.versionsTab.dataTextArea.GetValue().strip()
def OnModelTextChange(self, event):
self.runTab.SetModel(self.GetModelData(),
self.GetMetricsData(),
self.GetVersionsData())
event.Skip()
def OnModulesChange(self, event):
self.runTab.SetSelectedModules(event.modules)
self.resultsTab.SetSelectedModules(event.modules)
def OnModelParsed(self, event):
self.resultsTab.ClearResults()
event.Skip()
def OnModuleSimulationRequest(self, event):
""" """
gui = event.module.get_gui()
self.runTab.runButton.Enable(False)
self.runTab.parseButton.Enable(False)
wx.PostEvent(gui, ModuleSimulationAllowedEvent(interpreter=self.runTab.interpreter))
def OnModuleSimulationFinished(self, event):
""" """
self.runTab.parseButton.Enable(True) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/gui/main_notebook_gui.py | main_notebook_gui.py |
import wx
import wx.lib.newevent
import wx.combo
from aqopa.gui.combo_check_box import ComboCheckBox
from aqopa.gui.general_purpose_frame_gui import GeneralFrame
"""
@file modules_panel_gui.py
@brief GUI for the 'Modules' tab on AQoPA's main window (panel)
@author Damian Rusinek <[email protected]>
@date created on 05-09-2013 by Damian Rusinek
@date edited on 07-05-2014 by Katarzyna Mazur
"""
ModulesChangedEvent, EVT_MODULES_CHANGED = wx.lib.newevent.NewEvent()
class ModulesPanel(wx.Panel):
"""
Panel used for selecting modules and configuring them.
"""
def __init__(self, *args, **kwargs):
self.allModules = kwargs['modules']
del kwargs['modules']
wx.Panel.__init__(self, *args, **kwargs)
# our main sizer
mainSizer = wx.BoxSizer(wx.VERTICAL)
# 'Select' button = select chosen modules
self.selectButton = wx.Button(self, label="Select")
        # 'Configure' button = configure the selected module; clicking it
        # brings up a new window where the chosen module can be configured
self.configureButton = wx.Button(self, label="Configure")
self.configureButton.Disable()
# create group boxes, aka static boxes
modulesSelectionBox = wx.StaticBox(self, label="Select modules")
modulesConfigurationBox = wx.StaticBox(self, label="Configure modules")
mainBox = wx.StaticBox(self, label="Modules")
# create sizers = some kind of layout management
modulesSelectionBoxSizer = wx.StaticBoxSizer(modulesSelectionBox, wx.HORIZONTAL)
modulesConfigurationBoxSizer = wx.StaticBoxSizer(modulesConfigurationBox, wx.HORIZONTAL)
mainBoxSizer = wx.StaticBoxSizer(mainBox, wx.VERTICAL)
# create labels, aka static texts
selectModulesLabel = wx.StaticText(self, label="Choose modules for analysis and click the 'Select'\nbutton to add them to the configuration panel.")
        configureModulesLabel = wx.StaticText(self, label="Choose a module from the selected modules and\nconfigure them one by one.")
# create combocheckbox, empty at first
self.comboCheckBox = wx.combo.ComboCtrl(self)
self.tcp = ComboCheckBox()
self.comboCheckBox.SetPopupControl(self.tcp)
self.comboCheckBox.SetText('...')
# create ordinary combobox for module configuration
self.modulesConfComboBox = wx.ComboBox(self, style=wx.TE_READONLY)
# add tooltipz = make user's life easier
modulesSelectionBox.SetToolTip(wx.ToolTip("Select modules for analysis"))
modulesConfigurationBox.SetToolTip(wx.ToolTip("Configure chosen modules"))
# align 'select modules' group box
modulesSelectionBoxSizer.Add(selectModulesLabel, 1, wx.ALL | wx.EXPAND, 5)
modulesSelectionBoxSizer.Add(self.comboCheckBox, 1, wx.ALL | wx.EXPAND, 5)
modulesSelectionBoxSizer.Add(self.selectButton, 0, wx.ALL | wx.EXPAND, 5)
# align 'configure modules' group box
modulesConfigurationBoxSizer.Add(configureModulesLabel, 1, wx.ALL | wx.EXPAND, 5)
modulesConfigurationBoxSizer.Add(self.modulesConfComboBox, 1, wx.ALL | wx.EXPAND, 5)
modulesConfigurationBoxSizer.Add(self.configureButton, 0, wx.ALL | wx.EXPAND, 5)
# do some bindings:
self.selectButton.Bind(wx.EVT_BUTTON, self.OnSelectButtonClicked)
self.configureButton.Bind(wx.EVT_BUTTON, self.OnConfigureButtonClicked)
#self.tcp.checkBoxList.Bind(wx.EVT_CHECKLISTBOX, self.OnCheckBoxChange)
# names of the modules to appeared in the config combobox
self.modulesNames4Combo = []
for m in self.allModules:
gui = m.get_gui()
self.modulesNames4Combo.append(gui.get_name())
# fill combocheckbox with modules names
self.tcp.SetChoices(self.modulesNames4Combo)
        for i in range(0, 4):
            mainSizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(modulesSelectionBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
        for i in range(0, 3):
            mainSizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(modulesConfigurationBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
mainBoxSizer.Add(mainSizer, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(mainBoxSizer)
def OnSelectButtonClicked(self, event):
"""
@brief grabs selected modules from
combocheckbox widget, puts them into
conf combobox
"""
# add selected modules to config-combo box - u can
# configure only selected modules, one by one
self.FillUpComboWithModules(self.tcp.GetSelectedItems())
# check which module was selected, make a list out of the selected modules
        modules = [self.allModules[i] for i in range(self.tcp.checkBoxList.GetCount())
                   if self.tcp.checkBoxList.IsChecked(i)]
# perform event - modules were selected
wx.PostEvent(self, ModulesChangedEvent(modules=modules, all_modules=self.allModules))
def OnConfigureButtonClicked(self, event):
"""
@brief configures chosen module [ideally, do it in
a new window]
"""
# print self.modulesConfComboBox.GetStringSelection()
# get selected module from combo
selectedModule = self.modulesConfComboBox.GetValue()
        for m in self.allModules:
            if m.get_gui().get_name() == selectedModule:
# new window (frame, actually) where we open up a
# panel received from module/name/gui.py[get_configuration_panel]
confWindow = GeneralFrame(self, "Module Configuration", "Configuration", "config.png")
panel = m.get_gui().get_configuration_panel(confWindow)
confWindow.AddPanel(panel)
confWindow.Show()
break
def FillUpComboWithModules(self, modules):
"""
@brief adds selected modules to the combobox
"""
# clear combo, do not remember prev choices
self.modulesConfComboBox.Clear()
# DO NOT select, selecting can mess up many things
self.modulesConfComboBox.SetSelection(-1)
self.modulesConfComboBox.SetValue("")
# add all chosen modules to the combo
self.modulesConfComboBox.AppendItems(modules)
self.modulesConfComboBox.Refresh()
# enable button if combo is not empty =
# we can actually configure some modules
        if modules:
self.configureButton.Enable()
else:
self.configureButton.Disable() | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/gui/modules_panel_gui.py | modules_panel_gui.py |
import os
import wx
import wx.richtext
import wx.lib.newevent
"""
@file models_lib_gui.py
@brief GUI for the models library window (in python's nomenclature it's a wx frame)
@author Damian Rusinek
@date created on 05-09-2013 by Damian Rusinek
@date edited on 05-05-2014 by Katarzyna Mazur (visual improvements)
"""
ModelSelectedEvent, EVT_MODEL_SELECTED = wx.lib.newevent.NewEvent()
class LibraryTree(wx.TreeCtrl):
""" """
def __init__(self, *args, **kwargs):
wx.TreeCtrl.__init__(self, *args, **kwargs)
models_item = self.AddRoot(text="Models")
models_dir = os.path.join(os.path.dirname(__file__),
os.pardir,
'library',
'models')
import xml.etree.ElementTree as ET
for dir_root, dirs, files in os.walk(models_dir):
if 'meta.xml' in files:
item_key = os.path.basename(dir_root)
tree = ET.parse(os.path.join(dir_root, 'meta.xml'))
root = tree.getroot()
name_child = root.find('name')
if name_child is not None:
item = self.AppendItem(models_item, text=name_child.text)
author = root.find('author').text if root.find('author') is not None else ''
author_email = root.find('author_email').text if root.find('author_email') is not None else ''
description = root.find('description').text if root.find('description') is not None else ''
model_file = ''
metrics_file = ''
versions_file = ''
files = root.find('files')
if files is not None:
model_file = files.find('model').text if files.find('model') is not None else ''
metrics_file = files.find('metrics').text if files.find('metrics') is not None else ''
versions_file = files.find('versions').text if files.find('versions') is not None else ''
model_data = {
'root': dir_root,
'name': name_child.text,
'author': author,
'author_email': author_email,
'description': description,
'files': {
'model': model_file,
'metrics': metrics_file,
'versions': versions_file,
}
}
self.SetPyData(item, model_data)
self.ExpandAll()
    def GetModelData(self, item):
        """
        @brief returns the model data attached to the given tree item
        (stored earlier with SetPyData), or None if there is none
        """
        return self.GetPyData(item)
class ModelDescriptionPanel(wx.Panel):
"""
@brief Creates panel for displaying information about the chosen model
"""
def __init__(self, *args, **kwargs):
"""
@brief Initializes and aligns gui elements, inits model's data
"""
wx.Panel.__init__(self, *args, **kwargs)
# class' data
self.model_data = None
# create dicts for opened files [only for
# those from models library]
self.filenames = {}
# create text area (text edit, disabled, not editable) to show model's description
__txtStyle = wx.TE_MULTILINE | wx.TE_READONLY | wx.TE_AUTO_URL
self.modelsDescriptionText = wx.TextCtrl(self, style=__txtStyle)
# create group boxes aka static boxes
self.modelAboutBox = wx.StaticBox(self, label="About the model ...")
self.modelsGeneralInfoBox = wx.StaticBox(self, label="General information")
self.modelsDescriptionBox = wx.StaticBox(self, label="Description")
self.modelsTreeBox = wx.StaticBox(self, label="Models library", style=wx.SUNKEN_BORDER)
# create sizers = some kind of layout management
modelAboutBoxSizer = wx.StaticBoxSizer(self.modelAboutBox, wx.VERTICAL)
modelGeneralInfoBoxSizer = wx.StaticBoxSizer(self.modelsGeneralInfoBox, wx.VERTICAL)
modelDescriptionBoxSizer = wx.StaticBoxSizer(self.modelsDescriptionBox, wx.VERTICAL)
modelsTreeBoxSizer = wx.StaticBoxSizer(self.modelsTreeBox, wx.VERTICAL)
# create buttons
self.loadModelButton = wx.Button(self, label="Load model")
self.loadModelButton.Hide()
cancelButton = wx.Button(self, label="Close")
# create static texts aka labels
moduleNameLabel = wx.StaticText(self, label="Name: ")
moduleAuthorsNameLabel = wx.StaticText(self, label="Author: ")
moduleAuthorsEmailLabel = wx.StaticText(self, label="E-mail: ")
# create font for static texts = same as default panel font, just bold
defSysFont = self.GetFont()
boldFont = wx.Font(pointSize=defSysFont.GetPointSize(),
family=defSysFont.GetFamily(),
style=defSysFont.GetStyle(),
weight=wx.FONTWEIGHT_BOLD)
# create static texts will we displayed in bold
moduleNameLabel.SetFont(boldFont)
moduleAuthorsNameLabel.SetFont(boldFont)
moduleAuthorsEmailLabel.SetFont(boldFont)
# create editable static texts (labels) - will be modified
self.moduleNameText = wx.StaticText(self)
self.moduleAuthorsNameText = wx.StaticText(self)
self.moduleAuthorsEmailText = wx.StaticText(self)
# do some bindings -
# loadModelButton loads the chosen model to the text area
self.loadModelButton.Bind(wx.EVT_LEFT_UP, self.OnLoadModelClicked)
# cancelButton simply closes the model's lib window (frame)
cancelButton.Bind(wx.EVT_BUTTON, self.OnCancelClicked)
# create'n'align models lib tree
self.modelsTree = LibraryTree(self)
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.modelsTree, 1, wx.ALL | wx.EXPAND)
modelsTreeBoxSizer.Add(sizer, 1, wx.ALL | wx.EXPAND, 5)
# add some vertical space to make gui more readable
__verticalSpace = 25
# align name label and name
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(moduleNameLabel, 0, wx.ALL | wx.LEFT, 5)
sizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(self.moduleNameText, 1, wx.ALL | wx.ALIGN_CENTRE_HORIZONTAL, 5)
modelGeneralInfoBoxSizer.Add(sizer, 0, wx.ALL | wx.EXPAND, 5)
# align author label and author
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(moduleAuthorsNameLabel, 0, wx.ALL | wx.LEFT, 5)
sizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(self.moduleAuthorsNameText, 1, wx.ALL | wx.ALIGN_CENTRE_HORIZONTAL, 5)
modelGeneralInfoBoxSizer.Add(sizer, 0, wx.ALL | wx.EXPAND, 5)
# align email label and email
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(moduleAuthorsEmailLabel, 0, wx.ALL | wx.LEFT, 5)
sizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(self.moduleAuthorsEmailText, 1, wx.ALL | wx.ALIGN_CENTRE_HORIZONTAL, 5)
modelGeneralInfoBoxSizer.Add(sizer, 0, wx.ALL | wx.EXPAND, 5)
# add group box 'general model info' at the top of the 'model's about' group box
modelAboutBoxSizer.Add(modelGeneralInfoBoxSizer, 0, wx.ALL | wx.TOP | wx.EXPAND, 10)
# add text area to the 'model's description' group box
modelDescriptionBoxSizer.Add(self.modelsDescriptionText, 1, wx.EXPAND | wx.TOP | wx.ALL, 5)
# add 'model's description' group box to the 'model's about' group box
modelAboutBoxSizer.Add(modelDescriptionBoxSizer, 1, wx.ALL | wx.TOP | wx.EXPAND, 10)
# align buttons
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(self.loadModelButton, 0, wx.LEFT | wx.ALL, 5)
sizer.Add(cancelButton, 0, wx.LEFT | wx.ALL, 5)
# add buttons to 'about the model' group box
modelAboutBoxSizer.Add(sizer, 0, wx.ALIGN_RIGHT | wx.RIGHT, 10)
# align all the gui elements together on the panel
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(modelsTreeBoxSizer, 1, wx.ALL | wx.EXPAND) # or simply sizer.Add(modelsTreeBoxSizer, 0, wx.EXPAND)
sizer.Add(modelAboutBoxSizer, 1, wx.EXPAND)
# do the final alignment
self.SetSizer(sizer)
self.Layout()
def OnCancelClicked(self, event):
"""
@brief closes the frame (as well as the panel)
"""
frame = self.GetParent()
frame.Close()
def __CreateLibraryPath(self, filename):
"""
@brief creates a filepath displayed on GUI,
the filepath is the path of the
model/metric/version chosen from
AQoPA's model library
"""
__mainPath = os.path.split(self.model_data['root'])[0]
        for i in range(0, 4):
__mainPath = os.path.split(__mainPath)[0]
__mainPath += "/library/models/"
__chosenModelPath = os.path.split(self.model_data['root'])[1]
__mainPath += __chosenModelPath
__mainPath += "/"
__mainPath += filename
return __mainPath
def ShowModel(self, model_data):
""" """
self.model_data = model_data
self.moduleNameText.SetLabel(model_data['name'])
self.moduleAuthorsNameText.SetLabel(model_data['author'])
self.moduleAuthorsEmailText.SetLabel(model_data['author_email'])
self.modelsDescriptionText.SetValue(model_data['description'])
self.filenames['model'] = self.__CreateLibraryPath(os.path.split(self.model_data['files']['model'])[1])
self.filenames['metrics'] = self.__CreateLibraryPath(os.path.split(self.model_data['files']['metrics'])[1])
self.filenames['versions'] = self.__CreateLibraryPath(os.path.split(self.model_data['files']['versions'])[1])
self.modelsDescriptionText.Show()
self.loadModelButton.Show()
self.Layout()
def OnLoadModelClicked(self, event=None):
""" """
        with open(os.path.join(self.model_data['root'], self.model_data['files']['model'])) as f:
            model_data = f.read()
        with open(os.path.join(self.model_data['root'], self.model_data['files']['metrics'])) as f:
            metrics_data = f.read()
        with open(os.path.join(self.model_data['root'], self.model_data['files']['versions'])) as f:
            versions_data = f.read()
evt = ModelSelectedEvent(model_data=model_data,
metrics_data=metrics_data,
versions_data=versions_data,
filenames=self.filenames)
wx.PostEvent(self, evt)
class LibraryFrame(wx.Frame):
""" """
def __init__(self, *args, **kwargs):
wx.Frame.__init__(self, *args, **kwargs)
###################
# SIZERS & EVENTS
###################
self.modelDescriptionPanel = ModelDescriptionPanel(self)
self.modelsTree = self.modelDescriptionPanel.modelsTree
self.modelDescriptionPanel.Bind(EVT_MODEL_SELECTED, self.OnLoadModelSelected)
# fill panel with linear gradient to make it look fancy = NOT NOW!
#self.modelDescriptionPanel.Bind(wx.EVT_PAINT, self.OnPaintPrettyPanel)
self.modelsTree.Bind(wx.EVT_TREE_SEL_CHANGED, self.OnModelSelected)
self.modelsTree.Bind(wx.EVT_TREE_ITEM_ACTIVATED, self.OnModelDoubleClicked)
# set window's icon
self.SetIcon(wx.Icon(self.CreatePath4Resource('models_lib.ico'), wx.BITMAP_TYPE_ICO))
# set minimum windows' size - you can make it bigger, but not smaller!
self.SetMinSize(wx.Size(800, 450))
# do the final alignment
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(self.modelDescriptionPanel, 4, wx.EXPAND)
self.SetSizer(sizer)
# center model's lib window on a screen
self.CentreOnParent()
self.Layout()
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
return os.path.join(tmp[0], 'bin', 'assets', resourceName)
def GetFilenames(self):
return self.modelDescriptionPanel.filenames
def OnPaintPrettyPanel(self, event):
# establish the painting canvas
dc = wx.PaintDC(self.modelDescriptionPanel)
x = 0
y = 0
w, h = self.GetSize()
dc.GradientFillLinear((x, y, w, h), '#606060', '#E0E0E0', nDirection=wx.NORTH)
def OnModelSelected(self, event=None):
""" """
itemID = event.GetItem()
if not itemID.IsOk():
            itemID = self.modelsTree.GetSelection()
model_data = self.modelsTree.GetPyData(itemID)
if model_data:
self.modelDescriptionPanel.ShowModel(model_data)
def OnModelDoubleClicked(self, event=None):
""" """
itemID = event.GetItem()
if not itemID.IsOk():
            itemID = self.modelsTree.GetSelection()
model_info = self.modelsTree.GetPyData(itemID)
if model_info:
            with open(os.path.join(model_info['root'], model_info['files']['model'])) as f:
                model_data = f.read()
            with open(os.path.join(model_info['root'], model_info['files']['metrics'])) as f:
                metrics_data = f.read()
            with open(os.path.join(model_info['root'], model_info['files']['versions'])) as f:
                versions_data = f.read()
evt = ModelSelectedEvent(model_data=model_data,
metrics_data=metrics_data,
versions_data=versions_data,
filenames=self.GetFilenames())
wx.PostEvent(self, evt)
self.Close()
def OnLoadModelSelected(self, event=None):
""" """
wx.PostEvent(self, event)
self.Close() | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/gui/models_lib_gui.py | models_lib_gui.py |
import wx
import wx.combo
"""
@file combo_check_box.py
@brief mixed combo-check-box widget, needed for modules configuration
and versions choosing [and probably some more GUI stuff]
@author Katarzyna Mazur
@date created on 12-05-2014 by Katarzyna Mazur
"""
class ComboCheckBox(wx.combo.ComboPopup):
def Init(self):
"""
@brief overrides Init method from the ComboPopup base class
"""
# possible choices (checkboxes) displayed in combobox, empty at first
self.choicesList = []
def OnComboKeyEvent(self, event):
"""
@brief receives key events from the parent ComboCtrl
"""
wx.ComboPopup.OnComboKeyEvent(self, event)
def Create(self, parent):
"""
@brief creates the popup child control, returns true for success
"""
# create checkbox list - use default wxpython's widget
self.checkBoxList = wx.CheckListBox(parent)
# react on checking / unchecking checkboxes in the checkbox list
self.checkBoxList.Bind(wx.EVT_LISTBOX, self.OnListBoxClicked)
return True
def ClearChoices(self):
del self.choicesList[:]
def SetChoices(self, choices):
"""
@brief initializes combobox with checkboxes values
"""
# clear current content from combobox
self.checkBoxList.Clear()
# add all the elements from the list to the combo -
self.checkBoxList.AppendItems(choices)
def CheckIfAnythingIsSelected(self):
"""
@brief checks if anything is selected on combobox,
if at least one item is selected, returns True,
otherwise False
"""
        for i in range(self.checkBoxList.GetCount()):
            if self.checkBoxList.IsChecked(i):
                return True
        return False
def GetSelectedItemsCount(self):
"""
@brief counts selected items (checkboxes) in
our combobox, returns the number of the
checked checkboxes
"""
        count = 0
        for i in range(self.checkBoxList.GetCount()):
            if self.checkBoxList.IsChecked(i):
                count += 1
        return count
def GetControl(self):
"""
@brief returns the widget that is to be used for the popup
"""
return self.checkBoxList
    def OnListBoxClicked(self, evt):
        """
        @brief handler for (un)checking items in the checkbox list;
        wx ignores an event handler's return value, so just delegate
        """
        return self.GetSelectedItems()
def GetSelectedItems(self):
"""
@brief returns selected items in checkbox list
"""
return [str(self.checkBoxList.GetString(i)) for i in range(self.checkBoxList.GetCount()) if
self.checkBoxList.IsChecked(i)]
def OnPopup(self):
"""
@brief called immediately after the popup is shown
"""
wx.combo.ComboPopup.OnPopup(self)
def GetAdjustedSize(self, minWidth, prefHeight, maxHeight):
"""
@brief returns final size of popup - set prefHeight to 100
in order to drop-down the popup and add vertical scrollbar
if needed
"""
return wx.combo.ComboPopup.GetAdjustedSize(self, minWidth, 100, maxHeight)
"""
@brief Simple testing (example usage), should be removed or commented
"""
"""
# fast'n'dirty code for testing purposes ONLY!
class TestMePanel(wx.Panel):
def __init__(self, parent, log):
self.log = log
wx.Panel.__init__(self, parent, -1)
fgs = wx.FlexGridSizer(cols=3, hgap=10, vgap=10)
# this is how we should use this combo check box widget
# EXAMPLE USAGE
cc = wx.combo.ComboCtrl(self)
self.tcp = ComboCheckBox()
cc.SetPopupControl(self.tcp)
self.tcp.SetChoices(['Time Analysis', 'Energy consumption'])
# second 'Set' removes previous - that's perfectly what we want
self.tcp.SetChoices(['Time Analysis', 'Energy consumption', 'Reputation'])
cc.SetText('...')
# other, non important stuff
fgs.Add(cc, 1, wx.EXPAND | wx.ALL, 20)
butt = wx.Button(self, -0, "Check Selected")
fgs.Add(butt, 1, wx.EXPAND | wx.ALL, 20)
butt.Bind(wx.EVT_BUTTON, self.OnButtonClicked)
box = wx.BoxSizer()
box.Add(fgs, 1, wx.EXPAND | wx.ALL, 20)
self.CentreOnParent()
self.SetSizer(box)
def OnButtonClicked(self, event=None):
print self.tcp.GetSelectedItems()
class MainFrame(wx.Frame):
def __init__(self, parent):
wx.Frame.__init__(self, parent)
panel = TestMePanel(self, None)
if __name__ == "__main__":
app = wx.App(0)
frame = MainFrame(None)
frame.CenterOnScreen()
frame.Show()
app.MainLoop()
""" | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/gui/combo_check_box.py | combo_check_box.py |
import wx
"""
@file results_panel_gui.py
@brief GUI for the 'Results' tab on AQoPA's main window (panel)
@author Damian Rusinek <[email protected]>
@date created on 05-09-2013 by Damian Rusinek
@date edited on 09-05-2014 by Katarzyna Mazur (visual improvements mainly)
"""
class ResultsPanel(wx.Panel):
""" """
def __init__(self, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
self.selectedModules = []
self.moduleResultPanel = {}
self._BuildMainLayout()
def _BuildMainLayout(self):
mainSizer = wx.BoxSizer(wx.VERTICAL)
self.modulesChooserBox = wx.StaticBox(self, label="Modules")
self.modulesChooserBox.Hide()
self.modulesChooserBoxSizer = wx.StaticBoxSizer(self.modulesChooserBox, wx.HORIZONTAL)
# create combobox, empty at first
self.modulesChooserComboBox = wx.ComboBox(self, style=wx.TE_READONLY)
# clear combo, do not remember prev choices
self.modulesChooserComboBox.Clear()
# DO NOT select, selecting can mess up many things
self.modulesChooserComboBox.SetSelection(-1)
self.modulesChooserComboBox.SetValue("")
self.modulesChooserComboBox.Hide()
# bind selection event - if user selects the module,
# AQoPA will show up some results panels
self.modulesChooserComboBox.Bind(wx.EVT_COMBOBOX, self.OnModuleSelected)
# create static text - simple information for the user whatz goin' on with GUI
self.modulesInfo = wx.StaticText(self, label="Select module to see the analysis results.")
self.modulesInfo.Hide()
# add text near the combocheckbox
self.modulesChooserBoxSizer.Add(self.modulesInfo, 0, wx.ALL | wx.EXPAND, 5)
# add some horizontal space - empty label its simple and effective
self.modulesChooserBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
# add combocheck box to the panel that will show up after selecting
# modules in the modules tab from the main window
self.modulesChooserBoxSizer.Add(self.modulesChooserComboBox, 1, wx.ALL | wx.EXPAND, 5)
self.resultsBox = wx.StaticBox(self, label="Results")
self.resultsBox.Hide()
self.resultsBoxSizer = wx.StaticBoxSizer(self.resultsBox, wx.VERTICAL)
mainSizer.Add(self.modulesChooserBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
# add an empty static text - sth like a vertical spacer
        for i in range(0, 2):
            mainSizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(self.resultsBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
self.SetSizer(mainSizer)
self.Layout()
def _BuildModulesLayout(self):
"""
@brief builds a main layout
"""
for m in self.selectedModules:
if m in self.moduleResultPanel:
continue
gui = m.get_gui()
resultPanel = gui.get_results_panel(self)
self.resultsBoxSizer.Add(resultPanel, 1, wx.ALL | wx.EXPAND, 5)
self.moduleResultPanel[m] = resultPanel
self.Layout()
resultPanel.Hide()
# get unselected modules, make a list out of them
        uncheckedModules = [m for m in self.moduleResultPanel if m not in self.selectedModules]
# delete unchecked panels [modules actually]
for m in uncheckedModules:
self.moduleResultPanel[m].Destroy()
del self.moduleResultPanel[m]
self.Layout()
def SetSelectedModules(self, modules):
""" """
self.selectedModules = modules
if len(self.selectedModules) > 0:
self.resultsBox.Show()
self.modulesChooserBox.Show()
# DO NOT select, selecting can mess up many things
# instead, clear combo and its selection
self.modulesChooserComboBox.Clear()
self.modulesChooserComboBox.SetSelection(-1)
self.modulesChooserComboBox.SetValue("...")
# add selected modules to the combobox
for m in self.selectedModules:
self.modulesChooserComboBox.Append(m.get_gui().get_name())
# refresh combo and show it
self.modulesChooserComboBox.Refresh()
self.modulesChooserComboBox.Show()
self.modulesInfo.Show()
else:
self.resultsBox.Hide()
# clear'n' hide combobox if we do not need
# to see analysis results (0 modules were selected)
self.modulesChooserComboBox.Clear()
self.modulesChooserComboBox.Hide()
self.modulesChooserBox.Hide()
self.modulesInfo.Hide()
self._BuildModulesLayout()
def ClearResults(self):
""" """
for m in self.selectedModules:
gui = m.get_gui()
gui.on_parsed_model()
def OnModuleSelected(self, event):
"""
@brief when user selects the module in combobox,
appropriate result panel shows up, yay!
"""
# get current selection
selectedModule = self.modulesChooserComboBox.GetValue()
# hide all panels
for m in self.moduleResultPanel:
self.moduleResultPanel[m].Hide()
# get the module object from the selected modules list,
# simply find it by name - if its the same as the selected
# on combo - that's the module we're looking for - grab it
for m in self.selectedModules :
if m.get_gui().get_name() == selectedModule :
currentModule = m
break
# show the above-found module
self.moduleResultPanel[currentModule].Show()
        self.Layout()

# --- end of file: aqopa/gui/results_panel_gui.py (package: AQoPA-0.9.5) ---
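`_BuildModulesLayout` above keeps the result-panel dictionary in sync with the currently selected modules. The set-difference step it performs can be sketched as a pure function (a minimal sketch; the standalone name `unchecked_modules` is illustrative and not part of AQoPA):

```python
def unchecked_modules(panel_modules, selected_modules):
    # modules that still have a result panel but are no longer selected,
    # in the iteration order of the panel collection
    return [m for m in panel_modules if m not in selected_modules]
```

The panels of the returned modules are then destroyed and removed from the dictionary, while newly selected modules get fresh panels appended to the sizer.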
import wx
import os
"""
@file general_purpose_frame_gui.py
@brief GUI (frame = window) for modules configuration, results presentation, etc...
@author Katarzyna Mazur
@date created on 19-06-2014 by Katarzyna Mazur
"""
class GeneralFrame(wx.Frame):
"""
@brief frame (simply: window) for configuring chosen modules
"""
def __init__(self, parent, windowTitle, boxTitle, icon):
wx.Frame.__init__(self, parent)
self.SetIcon(wx.Icon(self.CreatePath4Resource(icon), wx.BITMAP_TYPE_PNG))
self.SetTitle(windowTitle)
# create static box aka group box
self.groupBox = wx.StaticBox(self, label=boxTitle)
self.groupBoxSizer = wx.StaticBoxSizer(self.groupBox, wx.VERTICAL)
# create buttons - simple 'OK' and 'Cancel' will be enough
self.OKButton = wx.Button(self, label="OK")
self.cancelButton = wx.Button(self, label="Cancel")
# do some bindings - for now on OK, as well as the cancel button
# will simply close the module configuration window
self.OKButton.Bind(wx.EVT_BUTTON, self.OnOKButtonClicked)
self.cancelButton.Bind(wx.EVT_BUTTON, self.OnCancelButtonClicked)
# align buttons
bottomSizer = wx.BoxSizer(wx.HORIZONTAL)
bottomSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
bottomSizer.Add(self.OKButton, 0, wx.ALIGN_CENTER | wx.ALL, 5)
bottomSizer.Add(self.cancelButton, 0, wx.ALIGN_CENTER | wx.ALL, 5)
self.sizer = wx.BoxSizer(wx.VERTICAL)
self.sizer.Add(self.groupBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
self.sizer.Add(bottomSizer, 0, wx.EXPAND, 5)
self.SetSizer(self.sizer)
self.CentreOnScreen()
self.Layout()
def AddPanel(self, panel) :
self.panel = panel
self.groupBoxSizer.Add(self.panel, 1, wx.EXPAND | wx.ALL, 5)
self.SetSizer(self.sizer)
self.sizer.Layout()
def DoFit(self):
self.sizer.Fit(self)
self.CentreOnScreen()
self.Layout()
def SetWindowSize(self, width, height):
self.SetClientSize(wx.Size(width, height))
self.CentreOnScreen()
self.Layout()
def OnOKButtonClicked(self, event) :
self.Close()
def OnCancelButtonClicked(self, event) :
self.Close()
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
        return os.path.join(tmp[0], 'bin', 'assets', resourceName)

# --- end of file: aqopa/gui/general_purpose_frame_gui.py (package: AQoPA-0.9.5) ---
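The `CreatePath4Resource` helper used throughout these classes resolves a resource name against the package's `bin/assets` directory, relative to the module's own location. A minimal standalone sketch of the same path logic (the function name and the sample path in the comment are illustrative, not part of AQoPA):

```python
import os

def resource_path(module_file, resource_name):
    # take the parent of the directory containing the module file,
    # e.g. '/pkg/aqopa/gui/x.py' -> '/pkg/aqopa'
    package_dir = os.path.split(os.path.dirname(module_file))[0]
    # resources live in the sibling 'bin/assets' directory
    return os.path.join(package_dir, 'bin', 'assets', resource_name)
```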
import wx
import wx.lib.newevent
import wx.lib.delayedresult
import sys
import time
import traceback
# AQoPA imports
from aqopa import app
from aqopa.model.parser import MetricsParserException,\
ConfigurationParserException, ModelParserException
from aqopa.simulator.error import EnvironmentDefinitionException,\
RuntimeException
"""
@file run_panel_gui.py
@brief GUI for the 'Run' tab on AQoPA's main window (panel)
@author Damian Rusinek <[email protected]>
@date created on 05-09-2013 by Damian Rusinek
@date edited on 08-05-2014 by Katarzyna Mazur (visual improvements)
"""
# model parsing events
ModelParsedEvent, EVT_MODEL_PARSED = wx.lib.newevent.NewEvent()
ModelParseErrorEvent, EVT_MODEL_PARSE_ERROR = wx.lib.newevent.NewEvent()
class RunPanel(wx.Panel):
""" """
def __init__(self, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
###############
# SIMULATION
###############
self.qopml_model = ""
self.qopml_metrics = ""
self.qopml_configuration = ""
self.allModules = []
self.selectedModules = []
self.interpreter = None
self.finishedSimulators = []
self.progressTimer = wx.Timer(self)
self.Bind(wx.EVT_TIMER, self.OnProgressTimerTick, self.progressTimer)
###############
# LAYOUT
###############
panelSizer = wx.BoxSizer(wx.HORIZONTAL)
# create the top panel - it can be the parsing panel
# (when parsing the model) or the run panel = when
# running the simulation process (analysis)
topPanel = wx.Panel(self, style=wx.ALIGN_CENTER)
topPanel.SetSizer(panelSizer)
# build panels
self.parsingPanel = self._BuildParsingPanel(topPanel)
self.runPanel = self._BuildRunPanel(topPanel)
# align panels
panelSizer.Add(self.parsingPanel, 1, wx.ALL | wx.EXPAND, 5)
panelSizer.Add(self.runPanel, 1, wx.ALL | wx.EXPAND, 5)
# hide the run panel (for now on) it will be visible
# after parsing the model and clicking the 'run' button
self.runPanel.Hide()
# align panels
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(topPanel, 1, wx.ALL | wx.EXPAND, 5)
# create the bottom panel - the one with the buttons
bottomPanel = wx.Panel(self, style=wx.ALIGN_CENTER)
# create buttons
self.parseButton = wx.Button(bottomPanel, label="Parse")
self.parseButton.SetToolTip(wx.ToolTip("Parse chosen model"))
self.runButton = wx.Button(bottomPanel, label="Run")
self.runButton.SetToolTip(wx.ToolTip("Start the simulation process"))
self.runButton.Enable(False)
#self.cleanButton = wx.Button(bottomPanel, label="Clean")
# properly align the bottom panel, buttons on the right
panelSizer = wx.BoxSizer(wx.HORIZONTAL)
panelSizer.Add(self.parseButton, 0, wx.LEFT | wx.ALL, 5)
panelSizer.Add(self.runButton, 0, wx.LEFT | wx.ALL, 5)
#panelSizer.Add(self.cleanButton, 0, wx.LEFT | wx.ALL, 5)
bottomPanel.SetSizer(panelSizer)
sizer.Add(bottomPanel, 0, wx.ALIGN_RIGHT | wx.RIGHT, 5)
self.SetSizer(sizer)
###############
# EVENTS
###############
self.parseButton.Bind(wx.EVT_BUTTON, self.OnParseClicked)
self.runButton.Bind(wx.EVT_BUTTON, self.OnRunClicked)
#self.cleanButton.Bind(wx.EVT_BUTTON, self.OnCleanClicked)
self.Bind(EVT_MODEL_PARSED, self.OnModelParsed)
def _BuildParsingPanel(self, parent):
"""
@brief creates the parsing panel
@return brand new parsing panel
"""
# new panel, we will return it and set as the main panel
# for AQoPA's main windows's Run tab
panel = wx.Panel(parent)
# create group boxes aka static boxes
parsingInfoBox = wx.StaticBox(panel, label="Model parsing information")
# create sizers = some kind of layout management
sizer = wx.BoxSizer(wx.HORIZONTAL)
parsingInfoBoxSizer = wx.StaticBoxSizer(parsingInfoBox, wx.HORIZONTAL)
# create text area for displaying parsing information
self.parseResult = wx.TextCtrl(panel, style=wx.TE_MULTILINE | wx.TE_READONLY)
# do the final panel alignment
parsingInfoBoxSizer.Add(self.parseResult, 1, wx.ALL | wx.EXPAND, 5)
sizer.Add(parsingInfoBoxSizer, 1, wx.EXPAND, 5)
panel.SetSizer(sizer)
return panel
def _BuildRunPanel(self, parent):
"""
@brief creates the run panel
@return brand new run panel
"""
panel = wx.Panel(parent)
sizer = wx.BoxSizer(wx.VERTICAL)
# create group boxes aka static boxes
self.statusStaticBox = wx.StaticBox(panel, label="Status: ")
timeStaticBox = wx.StaticBox(panel, label="Analysis Time")
runInfoBox = wx.StaticBox(panel, label="Model run information")
# create sizers = some kind of layout management
self.statusStaticBoxSizer = wx.StaticBoxSizer(self.statusStaticBox, wx.VERTICAL)
timeStaticBoxSizer = wx.StaticBoxSizer(timeStaticBox, wx.VERTICAL)
runInfoBoxSizer = wx.StaticBoxSizer(runInfoBox, wx.VERTICAL)
progressSizer = wx.BoxSizer(wx.HORIZONTAL)
# create labels
self.percentLabel = wx.StaticText(panel, label="0%")
self.dotsLabel = wx.StaticText(panel, label=".")
self.analysisTime = wx.StaticText(panel, label='---')
# create text area
self.runResult = wx.TextCtrl(panel, style=wx.TE_MULTILINE | wx.TE_READONLY)
# add content to sizers
progressSizer.Add(self.dotsLabel, 0, wx.ALIGN_LEFT, 5)
progressSizer.Add(self.percentLabel, 0, wx.ALIGN_LEFT, 5)
self.statusStaticBoxSizer.Add(progressSizer, 0, wx.ALL | wx.ALIGN_CENTER, 5)
timeStaticBoxSizer.Add(self.analysisTime, 0, wx.ALL | wx.ALIGN_CENTER, 5)
runInfoBoxSizer.Add(self.runResult, 1, wx.ALL | wx.EXPAND, 5)
sizer.Add(self.statusStaticBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(timeStaticBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(runInfoBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
# do the final alignment
panel.SetSizer(sizer)
return panel
def ShowPanel(self, panel):
self.parsingPanel.Hide()
self.runPanel.Hide()
panel.Show()
self.Layout()
def SetModel(self, model, metrics, configuration):
""" """
self.qopml_model = model
self.qopml_metrics = metrics
self.qopml_configuration = configuration
def SetSelectedModules(self, modules):
""" """
self.selectedModules = modules
def SetAllModules(self, modules):
""" """
self.allModules = modules
def OnParseClicked(self, event):
""" """
self.interpreter = app.GuiInterpreter(
model_as_text=self.qopml_model,
metrics_as_text=self.qopml_metrics,
config_as_text=self.qopml_configuration)
for m in self.selectedModules:
self.interpreter.register_qopml_module(m)
try:
resultMessage = ""
error = False
self.interpreter.parse(self.allModules)
resultMessage = "SUCCESFULLY PARSED\n\n Now you can run simulation."
wx.PostEvent(self, ModelParsedEvent())
except EnvironmentDefinitionException, e:
error = True
resultMessage = "ENVIRONMENT ERROR\n"
resultMessage += "%s\n" % unicode(e)
except ModelParserException, e:
error = True
resultMessage = "MODEL SYNTAX ERROR\n"
if len(e.syntax_errors):
resultMessage += "\n".join(e.syntax_errors)
except MetricsParserException, e:
error = True
resultMessage = "METRICS SYNTAX ERROR\n"
if len(e.syntax_errors):
resultMessage += "\n".join(e.syntax_errors)
except ConfigurationParserException, e:
error = True
resultMessage = "VERSIONS SYNTAX ERROR\n"
if len(e.syntax_errors):
resultMessage += "\n".join(e.syntax_errors)
if error:
resultMessage += "\nModel may include syntax parsed by modules (eg. metrics, configuration). "+\
"Have you selected modules?"
wx.PostEvent(self, ModelParseErrorEvent(error=resultMessage))
if resultMessage != "":
self.parseResult.SetValue(resultMessage)
self.ShowPanel(self.parsingPanel)
def OnModelParsed(self, event):
""" """
self.runButton.Enable(True)
def OnRunClicked(self, event):
""" """
if not self.selectedModules:
dial = wx.MessageDialog(None, "You have to choose some modules!", 'Warning', wx.OK | wx.ICON_EXCLAMATION)
dial.ShowModal()
else :
try:
self.DisableModulesSelection(True)
self.statusStaticBox.SetLabel("Status: running")
self.analysisTime.SetLabel("---")
self.percentLabel.SetLabel("0%")
self.runResult.SetValue("")
self.runButton.Enable(False)
self.ShowPanel(self.runPanel)
self.finishedSimulators = []
self.simulatorIndex = 0
self.startAnalysisTime = time.time()
self.interpreter.prepare()
self.progressTimer.Start(1000)
simulator = self.interpreter.simulators[self.simulatorIndex]
wx.lib.delayedresult.startWorker(self.OnSimulationFinished,
self.interpreter.run_simulation,
wargs=(simulator,),
jobID = self.simulatorIndex)
except EnvironmentDefinitionException, e:
self.statusStaticBox.SetLabel("Status: error")
self.runButton.Enable(True)
errorMessage = "Error on creating environment: %s\n" % e
if len(e.errors) > 0:
errorMessage += "Errors:\n"
errorMessage += "\n".join(e.errors)
self.runResult.SetValue(errorMessage)
self.progressTimer.Stop()
except Exception, e:
self.statusStaticBox.SetLabel("Error")
self.runButton.Enable(True)
sys.stderr.write(traceback.format_exc())
errorMessage = "Unknown error\n"
self.runResult.SetValue(errorMessage)
self.progressTimer.Stop()
self.ShowPanel(self.runPanel)
def OnSimulationFinished(self, result):
""" """
simulator = self.interpreter.simulators[self.simulatorIndex]
self.simulatorIndex += 1
self.finishedSimulators.append(simulator)
resultMessage = None
try :
simulator = result.get()
self.PrintProgressbar(self.GetProgress())
for m in self.selectedModules:
gui = m.get_gui()
gui.on_finished_simulation(simulator)
runResultValue = self.runResult.GetValue()
resultMessage = runResultValue + \
"Version %s finished successfully.\n\n" \
% simulator.context.version.name
except RuntimeException, e:
runResultValue = self.runResult.GetValue()
resultMessage = runResultValue + \
"Version %s finished with error: \nHost: %s \nInstruction: %s\nError: %s \n\n" \
% (simulator.context.version.name,
unicode(simulator.context.get_current_host()),
unicode(simulator.context.get_current_instruction()),
e.args[0])
except Exception, e:
sys.stderr.write(traceback.format_exc())
runResultValue = self.runResult.GetValue()
resultMessage = runResultValue + \
"Version %s finished with unknown error.\n\n" \
% simulator.context.version.name
if resultMessage:
self.runResult.SetValue(resultMessage)
if len(self.finishedSimulators) == len(self.interpreter.simulators):
self.OnAllSimulationsFinished()
else:
simulator = self.interpreter.simulators[self.simulatorIndex]
wx.lib.delayedresult.startWorker(self.OnSimulationFinished,
self.interpreter.run_simulation,
wargs=(simulator,),
jobID = self.simulatorIndex)
def OnAllSimulationsFinished(self):
""" """
self.DisableModulesSelection(False)
self.progressTimer.Stop()
self.PrintProgressbar(1)
self.endAnalysisTime = time.time()
timeDelta = self.endAnalysisTime - self.startAnalysisTime
analysisTimeLabel = "%.4f s" % timeDelta
self.analysisTime.SetLabel(analysisTimeLabel)
self.Layout()
for m in self.selectedModules:
gui = m.get_gui()
gui.on_finished_all_simulations(self.interpreter.simulators)
################
# PROGRESS BAR
################
def OnProgressTimerTick(self, event):
""" """
progress = self.GetProgress()
if progress == 1:
self.progressTimer.Stop()
self.PrintProgressbar(progress)
    def GetProgress(self):
        """ Returns the overall progress in [0, 1] averaged over all simulators. """
        total = 0.0
        progress_sum = 0.0
        for simulator in self.interpreter.simulators:
            total += 1
            progress_sum += simulator.context.get_progress()
        progress = 0
        if total > 0:
            progress = progress_sum / total
        return progress
def PrintProgressbar(self, progress):
"""
Prints the formatted progressbar showing the progress of simulation.
"""
percentage = str(int(round(progress*100))) + '%'
self.percentLabel.SetLabel(percentage)
self.runPanel.Layout()
if progress == 1:
self.statusStaticBox.SetLabel("Status: finished")
self.runPanel.Layout()
self.dotsLabel.SetLabel('')
self.runPanel.Layout()
else:
dots = self.dotsLabel.GetLabel()
if len(dots) > 10:
dots = "."
else:
dots += ".."
self.dotsLabel.SetLabel(dots)
self.runPanel.Layout()
#def OnCleanClicked(self, event):
# self.parseResult.Clear()
def DisableModulesSelection(self, value) :
"""
@brief disables/enables elements on 'Modules' tab (panel),
thanks to such approach we can get rid of some errors, simply
disable modules selection/configuration when the simulation
is running, and re-enable it when simulation ends
"""
# get run panel parent - that is, the wx Notebook
notebook = self.GetParent()
        # get the modules panel - the page at index 3 of the wx Notebook
        modulesTab = notebook.GetPage(3)
# disable or enable gui elements (depends on 'value')
if value :
modulesTab.selectButton.Disable()
modulesTab.configureButton.Disable()
modulesTab.comboCheckBox.Disable()
modulesTab.modulesConfComboBox.Disable()
else :
modulesTab.selectButton.Enable()
modulesTab.configureButton.Enable()
modulesTab.comboCheckBox.Enable()
            modulesTab.modulesConfComboBox.Enable()

# --- end of file: aqopa/gui/run_panel_gui.py (package: AQoPA-0.9.5) ---
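`RunPanel.GetProgress` averages the per-simulator progress values into one overall fraction that drives the percentage label. The same computation as a pure function (a sketch; the name `average_progress` is illustrative and not part of AQoPA):

```python
def average_progress(progress_values):
    """Overall progress in [0, 1]: the mean of the per-simulator
    progress values, or 0 when there are no simulators yet."""
    if not progress_values:
        return 0
    return sum(progress_values) / float(len(progress_values))
```

The GUI polls this value once per second via `self.progressTimer` and stops the timer when it reaches 1.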
import wx
import wx.richtext
"""
@file mmv_panel_gui.py
@brief GUI for the 'Model', 'Metrics' and 'Versions' tabs on AQoPA's main window (panel)
@author Damian Rusinek <[email protected]>
@date created on 05-09-2013 by Damian Rusinek
@date edited on 06-05-2014 by Katarzyna Mazur (visual improvements mainly)
"""
class MMVPanel(wx.Panel):
"""
@brief panel containing text area (and buttons 2) for one of model parts:
model, metrics, configuration, used for creating tabs on AQoPA's
main window
"""
def __init__(self, *args, **kwargs):
"""
@brief Initializes and aligns all the gui elements for
tabs: model, metrics and versions
"""
wx.Panel.__init__(self, *args, **kwargs)
# create group boxes aka static boxes
self.tabBox = wx.StaticBox(self, label="File: ")
# create sizers = some kind of layout management
self.tabBoxSizer = wx.StaticBoxSizer(self.tabBox, wx.HORIZONTAL)
# create text area; here we display model, metric or version content
self.dataTextArea = wx.richtext.RichTextCtrl(self, style=wx.TE_MULTILINE | wx.TE_NO_VSCROLL)
# create buttons - simple 'Load' and 'Save' will be enough
self.loadButton = wx.Button(self, label="Load")
self.saveButton = wx.Button(self, label="Save")
        # the third button clears the content of the
        # model/metrics/versions text area when clicked
        self.cleanButton = wx.Button(self, label="Clear")
        # create a label for displaying information about the cursor
        # position in the opened file
self.cursorInfoLabel = wx.StaticText(self)
# create checkbox - make model/metric/version editable / not editable
self.editable = wx.CheckBox(self, -1, label="Editable")
# do some bindings -
# show cursor position below the text area
self.dataTextArea.Bind(wx.EVT_KEY_UP, self.printCursorInfo)
self.dataTextArea.Bind(wx.EVT_LEFT_UP, self.printCursorInfo)
# bind buttons to appropriate actions
self.loadButton.Bind(wx.EVT_BUTTON, self.OnLoadClicked)
self.saveButton.Bind(wx.EVT_BUTTON, self.OnSaveClicked)
self.cleanButton.Bind(wx.EVT_BUTTON, self.OnCleanClicked)
# bind checkbox state with the appropriate action - simply
# allow / do not allow to edit opened model/metric/version
self.editable.Bind(wx.EVT_CHECKBOX, self.OnCheckBoxClicked)
# at first, we are not allowed to edit
self.editable.SetValue(False)
        # pretend that we clicked the checkbox, so its event handler gets called
wx.PostEvent(self.editable, wx.CommandEvent(wx.wxEVT_COMMAND_CHECKBOX_CLICKED))
# align buttons and the checkbox
bottomSizer = wx.BoxSizer(wx.HORIZONTAL)
bottomSizer.Add(self.editable, 0, wx.EXPAND | wx.ALL, 5)
# create empty static box (label) in order to make a horizontal
# gap between the checkbox and the buttons
bottomSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
bottomSizer.Add(self.loadButton, 0, wx.ALIGN_CENTER | wx.ALL, 5)
bottomSizer.Add(self.saveButton, 0, wx.ALIGN_CENTER | wx.ALL, 5)
bottomSizer.Add(self.cleanButton, 0, wx.ALIGN_CENTER | wx.ALL, 5)
# add some 'useful' tooltips
self.loadButton.SetToolTip(wx.ToolTip("Load from HDD"))
self.saveButton.SetToolTip(wx.ToolTip("Save to HDD"))
self.cleanButton.SetToolTip(wx.ToolTip("Clear all"))
# add text area to the fancy group box with a filename above the displayed file content
self.tabBoxSizer.Add(self.dataTextArea, 1, wx.EXPAND | wx.ALL, 5)
# do the final alignment
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.tabBoxSizer, 1, wx.EXPAND | wx.ALL, 5)
sizer.Add(self.cursorInfoLabel, 0, wx.EXPAND | wx.ALL, 5)
sizer.Add(bottomSizer, 0, wx.EXPAND, 5)
self.SetSizer(sizer)
self.Layout()
def OnLoadClicked(self, event):
""" Load file to text area """
ofdlg = wx.FileDialog(self, "Load file", "", "", "QoP-ML Files (*.qopml)|*.qopml",
wx.FD_OPEN | wx.FD_FILE_MUST_EXIST)
ofdlg.ShowModal()
if ofdlg.GetPath():
wildcard, types = wx.richtext.RichTextBuffer.GetExtWildcard(save=False)
fileType = types[ofdlg.GetFilterIndex()]
self.dataTextArea.LoadFile(ofdlg.GetPath(), fileType)
self.SetFilenameOnGUI(ofdlg.GetPath())
ofdlg.Destroy()
def OnSaveClicked(self, event):
""" Save text area value to file """
ofdlg = wx.FileDialog(self, "Save file", "", "", "QoP-ML Files (*.qopml)|*.qopml",
wx.FD_SAVE)
ofdlg.ShowModal()
if ofdlg.GetPath() :
# add *.qopml extension if not given
validated = self.ValidateFilename(ofdlg.GetPath())
f = open(validated, "w")
f.write(self.dataTextArea.GetValue())
f.close()
# save on gui name of the saved file (with *.qopml) extension
self.SetFilenameOnGUI(validated)
ofdlg.Destroy()
def OnCleanClicked(self, event):
self.dataTextArea.SetValue("")
def printCursorInfo(self, event):
""" """
pos = self.dataTextArea.GetInsertionPoint()
xy = self.dataTextArea.PositionToXY(pos)
self.cursorInfoLabel.SetLabel("Line: %d, %d"
% (xy[1]+1, xy[0]+1))
self.Layout()
event.Skip()
def OnPaintPrettyPanel(self, event):
# establish the painting canvas
dc = wx.PaintDC(self)
x = 0
y = 0
w, h = self.GetSize()
dc.GradientFillLinear((x, y, w, h), '#606060', '#E0E0E0', nDirection=wx.NORTH)
def SetFilenameOnGUI(self, filename):
"""
@brief sets the title of the group box
to the opened/saved filename
"""
self.tabBox.SetLabel("File: "+filename)
    def ValidateFilename(self, filename):
        """
        @brief adds the *.qopml extension to the filename
               we want to save (only if it is missing)
        @return the validated filename
        """
        if filename.endswith(".qopml"):
            return filename
        return filename + ".qopml"
    def OnCheckBoxClicked(self, event):
        """
        @brief makes the opened model/metrics/versions content
               editable or read-only, depending on the checkbox state
        """
        self.dataTextArea.SetEditable(event.IsChecked())

# --- end of file: aqopa/gui/mmv_panel_gui.py (package: AQoPA-0.9.5) ---
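`MMVPanel.ValidateFilename` enforces the `*.qopml` extension on save. The rule in isolation, as a pure function mirroring the method (a sketch; the standalone name `ensure_qopml_extension` is illustrative):

```python
def ensure_qopml_extension(filename):
    # append '.qopml' only when the name does not already end with it
    if filename.endswith('.qopml'):
        return filename
    return filename + '.qopml'
```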
import os
import wx
import wx.richtext
import wx.lib.newevent
import wx.lib.delayedresult
import aqopa
"""
from aqopa.gui.main_notebook_gui import MainNotebook, EVT_MODULE_SIMULATION_REQUEST, EVT_MODULE_SIMULATION_ALLOWED, EVT_MODULE_SIMULATION_FINISHED
from aqopa.gui.main_frame_gui import LibraryFrame, EVT_MODEL_SELECTED
"""
# AQoPA gui imports
from aqopa.gui.models_lib_gui import LibraryFrame, EVT_MODEL_SELECTED
from aqopa.gui.modules_panel_gui import ModulesPanel, EVT_MODULES_CHANGED
from aqopa.gui.mmv_panel_gui import MMVPanel
from aqopa.gui.run_panel_gui import RunPanel, EVT_MODEL_PARSED
from aqopa.gui.results_panel_gui import ResultsPanel
# modules communication events
ModuleSimulationRequestEvent, EVT_MODULE_SIMULATION_REQUEST = wx.lib.newevent.NewEvent() # Parameters: module
ModuleSimulationAllowedEvent, EVT_MODULE_SIMULATION_ALLOWED = wx.lib.newevent.NewEvent() # Parameters: interpreter
ModuleSimulationFinishedEvent, EVT_MODULE_SIMULATION_FINISHED = wx.lib.newevent.NewEvent()
class MainNotebook(wx.Notebook):
""" """
def __init__(self, parent):
wx.Notebook.__init__(self, parent)
###########
# MODULES
###########
# here you can add modules to the GUI version of AQoPA
self.availableModules = []
# add time analysis module
from aqopa.module import timeanalysis
timeanalysis_module = timeanalysis.Module()
timeanalysis_module.get_gui().Bind(EVT_MODULE_SIMULATION_REQUEST,
self.OnModuleSimulationRequest)
timeanalysis_module.get_gui().Bind(EVT_MODULE_SIMULATION_FINISHED,
self.OnModuleSimulationFinished)
self.availableModules.append(timeanalysis_module)
# add energy analysis module - it depends on time analysis module
from aqopa.module import energyanalysis
energyanalysis_module = energyanalysis.Module(timeanalysis_module)
self.availableModules.append(energyanalysis_module)
# add reputation module
from aqopa.module import reputation
self.availableModules.append(reputation.Module())
# add qop module - KM
from aqopa.module import qopanalysis
self.availableModules.append(qopanalysis.Module())
# add finance module - it depends on energy analysis module - KM
from aqopa.module import financialanalysis
fm = financialanalysis.Module(energyanalysis_module)
self.availableModules.append(fm)
# add gogreen! module - it depends on energy analysis module - KM
from aqopa.module import greenanalysis
gm = greenanalysis.Module(energyanalysis_module)
self.availableModules.append(gm)
        # list containing notebook tab images:
        # .ico files seem more portable across OSes, although we use .png here;
        # (20, 20) is the image size in pixels
il = wx.ImageList(20, 20)
modelsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('models_lib.png'), wx.BITMAP_TYPE_PNG))
metricsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('metrics.png'), wx.BITMAP_TYPE_PNG))
versionsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('versions.png'), wx.BITMAP_TYPE_PNG))
runTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('run.png'), wx.BITMAP_TYPE_PNG))
modulesTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('modules.png'), wx.BITMAP_TYPE_PNG))
resultsTabImg = il.Add(wx.Bitmap(self.CreatePath4Resource('results.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
###########
# TABS
###########
self.modelTab = MMVPanel(self)
self.modelTab.Layout()
self.Bind(wx.EVT_TEXT, self.OnModelTextChange, self.modelTab.dataTextArea)
self.AddPage(self.modelTab, "Model")
self.SetPageImage(0, modelsTabImg)
self.metricsTab = MMVPanel(self)
self.metricsTab.Layout()
self.Bind(wx.EVT_TEXT, self.OnModelTextChange, self.metricsTab.dataTextArea)
self.AddPage(self.metricsTab, "Metrics")
self.SetPageImage(1, metricsTabImg)
self.versionsTab = MMVPanel(self)
self.versionsTab.Layout()
self.Bind(wx.EVT_TEXT, self.OnModelTextChange, self.versionsTab.dataTextArea)
self.versionsTab.Layout()
self.AddPage(self.versionsTab, "Versions")
self.SetPageImage(2, versionsTabImg)
self.modulesTab = ModulesPanel(self, modules=self.availableModules)
self.modulesTab.Bind(EVT_MODULES_CHANGED, self.OnModulesChange)
self.modulesTab.Layout()
self.AddPage(self.modulesTab, "Modules")
self.SetPageImage(3, modulesTabImg)
self.runTab = RunPanel(self)
self.runTab.SetAllModules(self.availableModules)
self.runTab.Layout()
self.runTab.Bind(EVT_MODEL_PARSED, self.OnModelParsed)
self.AddPage(self.runTab, "Run")
self.SetPageImage(4, runTabImg)
self.resultsTab = ResultsPanel(self)
self.resultsTab.Layout()
self.AddPage(self.resultsTab, "Results")
self.SetPageImage(5, resultsTabImg)
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
return os.path.join(tmp[0], 'bin', 'assets', resourceName)
def LoadModelFile(self, filePath):
self.modelTab.dataTextArea.LoadFile(filePath)
def LoadMetricsFile(self, filePath):
self.metricsTab.dataTextArea.LoadFile(filePath)
def LoadVersionsFile(self, filePath):
self.versionsTab.dataTextArea.LoadFile(filePath)
def SetModelData(self, data):
self.modelTab.dataTextArea.SetValue(data)
def SetMetricsData(self, data):
self.metricsTab.dataTextArea.SetValue(data)
def SetVersionsData(self, data):
self.versionsTab.dataTextArea.SetValue(data)
def GetModelData(self):
return self.modelTab.dataTextArea.GetValue().strip()
def GetMetricsData(self):
return self.metricsTab.dataTextArea.GetValue().strip()
def GetVersionsData(self):
return self.versionsTab.dataTextArea.GetValue().strip()
def OnModelTextChange(self, event):
self.runTab.SetModel(self.GetModelData(),
self.GetMetricsData(),
self.GetVersionsData())
event.Skip()
def OnModulesChange(self, event):
self.runTab.SetSelectedModules(event.modules)
self.resultsTab.SetSelectedModules(event.modules)
def OnModelParsed(self, event):
self.resultsTab.ClearResults()
event.Skip()
def OnModuleSimulationRequest(self, event):
""" """
gui = event.module.get_gui()
self.runTab.runButton.Enable(False)
self.runTab.parseButton.Enable(False)
wx.PostEvent(gui, ModuleSimulationAllowedEvent(interpreter=self.runTab.interpreter))
def OnModuleSimulationFinished(self, event):
""" """
self.runTab.parseButton.Enable(True)
class MainFrame(wx.Frame):
""" """
def __init__(self, *args, **kwargs):
wx.Frame.__init__(self, *args, **kwargs)
###########
# MENUBAR
###########
# create menubar
menuBar = wx.MenuBar()
        # create the main menu; let's call it the 'file' menu
fileMenu = wx.Menu()
# create menu item = about AQoPA
item = wx.MenuItem(fileMenu, wx.NewId(), u"&About AQoPA\tCTRL+I")
item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('about.png')))
fileMenu.AppendItem(item)
self.Bind(wx.EVT_MENU, self.OnAbout, item)
fileMenu.AppendSeparator()
# create menu item = quit AQoPa
item = wx.MenuItem(fileMenu, wx.NewId(), u"&Quit\tCTRL+Q")
item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('exit.png')))
fileMenu.AppendItem(item)
self.Bind(wx.EVT_MENU, self.OnQuit, item)
# add 'file' menu to the menubar
menuBar.Append(fileMenu, "&Menu")
        # create the library menu - here you can browse the models library
libraryMenu = wx.Menu()
# create models menu item
item = wx.MenuItem(libraryMenu, wx.NewId(), u"&Browse models\tCTRL+M")
item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('lib.png')))
libraryMenu.AppendItem(item)
self.Bind(wx.EVT_MENU, self.OnBrowseModels, item)
# create metric menu item
# item = wx.MenuItem(libraryMenu, wx.NewId(), u"&Browse metrics\tCTRL+F")
# item.SetBitmap(wx.Bitmap(self.CreatePath4Resource('metrics.png')))
# libraryMenu.AppendItem(item)
# add 'library' menu to the menubar
menuBar.Append(libraryMenu, "&Library")
self.SetMenuBar(menuBar)
###################
# SIZERS & EVENTS
###################
self.mainNotebook = MainNotebook(self)
logoPanel = wx.Panel(self)
pic = wx.StaticBitmap(logoPanel)
pic.SetBitmap(wx.Bitmap(self.CreatePath4Resource('logo.png')))
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
sizer.Add(logoPanel, 0, wx.RIGHT| wx.ALL|wx.EXPAND, 5)
s2 = wx.BoxSizer(wx.VERTICAL)
s2.Add(sizer, 0, wx.LEFT| wx.ALL|wx.EXPAND, 5)
s2.Add(self.mainNotebook, 1, wx.ALL|wx.EXPAND, 5)
self.SetSizer(s2)
self.SetIcon(wx.Icon(self.CreatePath4Resource('app_logo.png'), wx.BITMAP_TYPE_PNG))
# the size of the whole application window
self.SetClientSize(wx.Size(800, 600))
self.CenterOnScreen()
self.Layout()
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
return os.path.join(tmp[0], 'bin', 'assets', resourceName)
def OnQuit(self, event=None):
"""
@brief closes the application
"""
self.Close()
def OnBrowseModels(self, event=None):
"""
@brief shows the library frame (models library window)
"""
libraryFrame = LibraryFrame(self, title="Models Library")
libraryFrame.Show(True)
libraryFrame.CentreOnParent()
#libraryFrame.Maximize(True)
libraryFrame.Bind(EVT_MODEL_SELECTED, self.OnLibraryModelSelected)
def OnLibraryModelSelected(self, event):
""" """
self.mainNotebook.SetModelData(event.model_data)
self.mainNotebook.SetMetricsData(event.metrics_data)
self.mainNotebook.SetVersionsData(event.versions_data)
# set filenames on GUI
self.mainNotebook.modelTab.SetFilenameOnGUI(event.filenames['model'])
self.mainNotebook.metricsTab.SetFilenameOnGUI(event.filenames['metrics'])
self.mainNotebook.versionsTab.SetFilenameOnGUI(event.filenames['versions'])
def OnAbout(self, event=None):
""" Show about info """
description = """AQoPA stands for Automated Quality of Protection Analysis Tool
for QoPML models."""
licence = """AQoPA is free software; you can redistribute
it and/or modify it under the terms of the GNU General Public License as
published by the Free Software Foundation; either version 2 of the License,
or (at your option) any later version.
AQoPA is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE."""
logo_filepath = self.CreatePath4Resource('logo.png')
info = wx.AboutDialogInfo()
info.SetIcon(wx.Icon(logo_filepath, wx.BITMAP_TYPE_PNG))
info.SetName('AQoPA')
info.SetVersion(aqopa.VERSION)
info.SetDescription(description)
info.SetCopyright('(C) 2013 QoPML Project')
info.SetWebSite('http://www.qopml.org')
info.SetLicence(licence)
info.AddDeveloper('Damian Rusinek')
info.AddDocWriter('Damian Rusinek')
info.AddArtist('QoPML Project')
info.AddTranslator('Damian Rusinek')
wx.AboutBox(info)
class AqopaApp(wx.App):
def OnInit(self):
self.mainFrame = MainFrame(None,
title="Automated Quality of Protection Analysis Tool")
self.mainFrame.Show(True)
self.mainFrame.CenterOnScreen()
# self.mainFrame.Maximize(True)
self.SetTopWindow(self.mainFrame)
        return True

# --- end of file: aqopa/bin/gui.py (package: AQoPA-0.9.5) ---
import os
import sys
import threading
import time
from aqopa.model.parser import ModelParserException,\
MetricsParserException, ConfigurationParserException
from aqopa.app import Builder, ConsoleInterpreter
from aqopa.simulator import EnvironmentDefinitionException
from aqopa.module import timeanalysis, energyanalysis, reputation, financialanalysis, greenanalysis, qopanalysis
from aqopa.simulator.error import RuntimeException
class ProgressThread(threading.Thread):
def __init__(self, f, interpreter, *args, **kwargs):
super(ProgressThread, self).__init__(*args, **kwargs)
self.file = f
self.interpreter = interpreter
self.signs = '|/-\\'
self.sign_no = 0
self.finished = False
def get_progress(self):
all_progress = 0.0
sum_progress = 0.0
for s in self.interpreter.simulators:
all_progress += 1
sum_progress += s.context.get_progress()
progress = 0
if all_progress > 0:
progress = sum_progress / all_progress
return progress
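# get_progress() above averages the per-simulator progress values into a single
# 0.0-1.0 figure. A standalone sketch of the same averaging, with plain floats
# standing in for simulator contexts (the function name is illustrative only):

```python
def average_progress(progress_values):
    """Return the mean of 0.0-1.0 progress values, or 0 when there are none."""
    total = 0.0
    count = 0
    for value in progress_values:
        total += value
        count += 1
    return total / count if count > 0 else 0
```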
def run(self):
progress = self.get_progress()
while progress < 1 and not self.interpreter.is_finished() and not self.finished:
self.print_progressbar(progress)
time.sleep(0.2)
progress = self.get_progress()
self.print_progressbar(progress)
self.file.write("\n")
def print_progressbar(self, progress):
"""
Prints the formatted progressbar showing the progress of simulation.
"""
self.sign_no += 1
sign = self.signs[self.sign_no % len(self.signs)]
percentage = str(int(round(progress*100))) + '%'
percentage = (' ' * (5-len(percentage))) + percentage
bar = ('#' * int(round(progress*20))) + (' ' * (20 - int(round(progress*20))))
self.file.write("\r%c[%s] %s" % (sign, bar, percentage))
self.file.flush()
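# print_progressbar() above builds its output from three parts: a spinner
# character, a 20-slot bar, and a percentage right-aligned in five columns.
# A minimal standalone sketch of the same formatting (format_progressbar is an
# illustrative name, not part of AQoPA):

```python
def format_progressbar(progress, width=20):
    """Return the bar/percentage part for a 0.0-1.0 progress value."""
    percentage = str(int(round(progress * 100))) + '%'
    percentage = (' ' * (5 - len(percentage))) + percentage  # right-align in 5 chars
    filled = int(round(progress * width))
    bar = ('#' * filled) + (' ' * (width - filled))
    return "[%s] %s" % (bar, percentage)
```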
def run(qopml_model, qopml_metrics, qopml_configuration,
save_states = False, debug = False, show_progressbar = False):
# Prepare all available modules
# here you can add modules to the console version of AQoPA
time_module = timeanalysis.Module()
energy_module = energyanalysis.Module(time_module)
reputation_module = reputation.Module()
financial_module = financialanalysis.Module(energy_module)
green_module = greenanalysis.Module(energy_module)
qop_module = qopanalysis.Module()
available_modules = [time_module, energy_module, reputation_module, green_module, qop_module]
############### DEBUG ###############
if debug:
builder = Builder()
store = builder.build_store()
parser = builder.build_model_parser(store, available_modules)
parser.lexer.input(qopml_model)
while True:
print parser.lexer.current_state()
tok = parser.lexer.token()
if not tok:
break
print tok
print ""
print 'Errors: ' + str(parser.get_syntax_errors())
print ""
print ""
return
#####################################
interpreter = ConsoleInterpreter()
try:
interpreter.set_qopml_model(qopml_model)
interpreter.set_qopml_metrics(qopml_metrics)
interpreter.set_qopml_config(qopml_configuration)
interpreter.register_qopml_module(time_module)
# interpreter.register_qopml_module(energy_module)
# interpreter.register_qopml_module(reputation_module)
# interpreter.register_qopml_module(financial_module)
# interpreter.register_qopml_module(green_module)
# interpreter.register_qopml_module(qop_module)
interpreter.parse(available_modules)
interpreter.prepare()
if save_states:
for simulator in interpreter.simulators:
interpreter.save_states_to_file(simulator)
if show_progressbar:
progressbar_thread = ProgressThread(sys.stdout, interpreter)
progressbar_thread.start()
try:
interpreter.run()
except RuntimeException:
if show_progressbar:
progressbar_thread.finished = True
raise
except EnvironmentDefinitionException, e:
sys.stderr.write('Error on creating environment: %s\n' % e)
if len(e.errors) > 0:
sys.stderr.write('Errors:\n')
sys.stderr.write('\n'.join(e.errors))
sys.stderr.write('\n')
except ModelParserException, e:
sys.stderr.write('Model parsing error: %s\n' % e)
if len(e.syntax_errors):
sys.stderr.write('Syntax errors:\n')
sys.stderr.write('\n'.join(e.syntax_errors))
sys.stderr.write('\n')
except MetricsParserException, e:
sys.stderr.write('Metrics parsing error: %s\n' % e)
if len(e.syntax_errors):
sys.stderr.write('Syntax errors:\n')
sys.stderr.write('\n'.join(e.syntax_errors))
sys.stderr.write('\n')
except ConfigurationParserException, e:
sys.stderr.write('Configuration parsing error: %s\n' % e)
if len(e.syntax_errors):
sys.stderr.write('Syntax errors:\n')
sys.stderr.write('\n'.join(e.syntax_errors))
        sys.stderr.write('\n')

# ==== end of file: aqopa/bin/console.py (AQoPA-0.9.5) ====
import wx
import os
from aqopa.gui.general_purpose_frame_gui import GeneralFrame
"""
@file gui.py
@brief gui file for the greenanalysis module
@author Katarzyna Mazur
"""
class SingleVersionPanel(wx.Panel):
def __init__(self, module, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
self.module = module
self.versionSimulator = {}
# ################
# VERSION BOX
#################
versionBox = wx.StaticBox(self, label="Version")
versionsLabel = wx.StaticText(self, label="Choose Version To See\nAnalysis Results:")
self.versionsList = wx.ComboBox(self, style=wx.TE_READONLY, size=(200, -1))
self.versionsList.Bind(wx.EVT_COMBOBOX, self.OnVersionChanged)
versionBoxSizer = wx.StaticBoxSizer(versionBox, wx.HORIZONTAL)
versionBoxSizer.Add(versionsLabel, 0, wx.ALL | wx.ALIGN_CENTER, 5)
versionBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
versionBoxSizer.Add(self.versionsList, 1, wx.ALL | wx.ALIGN_CENTER, 5)
##################################
# CO2 RESULTS BOX
##################################
self.co2Box = wx.StaticBox(self, label="The Carbon Dioxide Emissions Analysis Results")
self.co2Label = wx.StaticText(self, label="CO2 produced per \none kWh [pounds]:")
self.co2Input = wx.TextCtrl(self, size=(200, -1))
co2Sizer = wx.BoxSizer(wx.HORIZONTAL)
co2Sizer.Add(self.co2Label, 0, wx.ALL | wx.EXPAND, 5)
co2Sizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
co2Sizer.Add(self.co2Input, 1, wx.ALL | wx.EXPAND | wx.ALIGN_RIGHT, 5)
hostsBox, hostsBoxSizer = self._BuildHostsBoxAndSizer()
co2BoxSizer = wx.StaticBoxSizer(self.co2Box, wx.VERTICAL)
co2BoxSizer.Add(co2Sizer, 0, wx.ALL | wx.EXPAND)
co2BoxSizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
co2BoxSizer.Add(hostsBoxSizer, 1, wx.ALL | wx.EXPAND)
#################
# BUTTONS LAY
#################
self.showCo2ResultsBtn = wx.Button(self, label="Show")
self.showCo2ResultsBtn.Bind(wx.EVT_BUTTON, self.OnShowCo2ResultsBtnClicked)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
buttonsSizer.Add(self.showCo2ResultsBtn, 0, wx.ALL | wx.EXPAND, 5)
#################
# MAIN LAY
#################
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(versionBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(co2BoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(buttonsSizer, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(sizer)
self.SetVersionsResultsVisibility(False)
def OnShowCo2ResultsBtnClicked(self, event):
co2Text = self.co2Input.GetValue().strip()
try:
co2 = float(co2Text)
except ValueError:
wx.MessageBox("'%s' is not valid CO2 amount. Please correct it." % co2Text,
'Error', wx.OK | wx.ICON_ERROR)
return
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
selected_host = self._GetSelectedHost(simulator)
all_emissions = self.module.calculate_all_emissions(simulator, simulator.context.hosts, co2)
# get some financial info from module
minemission, minhost = self.module.get_min_emission(simulator, simulator.context.hosts)
maxemission, maxhost = self.module.get_max_emission(simulator, simulator.context.hosts)
total_emission = self.module.get_total_emission(simulator, simulator.context.hosts)
avg_emission = self.module.get_avg_emission(simulator, simulator.context.hosts)
curr_emission = all_emissions[selected_host]
# after all calculations, build the GUI
title = "Carbon Dioxide Emissions Analysis for Host: "
title += selected_host.original_name()
co2Window = GeneralFrame(self, "Carbon Dioxide Emissions Analysis Results", title, "modules_results.png")
co2Panel = wx.Panel(co2Window)
        # conversion constant used to convert pounds to kilograms
        # (1 pound = 0.45359237 kg)
        conv_constant = 0.45359237
# ########################################################################
# ACTUAL CARBON DIOXIDE EMISSIONS
#########################################################################
actualEmissionsBox = wx.StaticBox(co2Panel, label="Actual Carbon Dioxide Emissions of CPU")
actualEmissionsBoxSizer = wx.StaticBoxSizer(actualEmissionsBox, wx.VERTICAL)
#########################################################################
# emission of the selected host
#########################################################################
infoLabel = "Emission for Host: "
hostInfoLabel = wx.StaticText(co2Panel, label=infoLabel)
        co2Label = "%.15f" % curr_emission + " pounds"
hostCO2Label = wx.StaticText(co2Panel, label=co2Label)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer1.Add(hostCO2Label, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# minimal emission of version (minimal emissions of every host in given version)
#########################################################################
infoLabel = "Minimal Version Emission (Host: " + minhost.original_name() + ")"
hostInfoLabel = wx.StaticText(co2Panel, label=infoLabel)
        co2Label = "%.15f" % minemission + " pounds"
hostCO2Label = wx.StaticText(co2Panel, label=co2Label)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer2.Add(hostCO2Label, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# maximal emission of version (maximal emissions of every host in given version)
#########################################################################
infoLabel = "Maximal Version Emission (Host: " + maxhost.original_name() + ")"
hostInfoLabel = wx.StaticText(co2Panel, label=infoLabel)
        co2Label = "%.15f" % maxemission + " pounds"
hostCO2Label = wx.StaticText(co2Panel, label=co2Label)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer3.Add(hostCO2Label, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# average version emission
#########################################################################
infoLabel = "Average Version Emission: "
hostInfoLabel = wx.StaticText(co2Panel, label=infoLabel)
        co2Label = "%.15f" % avg_emission + " pounds"
hostCO2Label = wx.StaticText(co2Panel, label=co2Label)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer4.Add(hostCO2Label, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# total version emission
#########################################################################
infoLabel = "Total Version Emission: "
hostInfoLabel = wx.StaticText(co2Panel, label=infoLabel)
        co2Label = "%.15f" % total_emission + " pounds"
hostCO2Label = wx.StaticText(co2Panel, label=co2Label)
sizer5 = wx.BoxSizer(wx.HORIZONTAL)
sizer5.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer5.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer5.Add(hostCO2Label, 0, wx.ALL | wx.EXPAND, 5)
actualEmissionsBoxSizer.Add(sizer1, 0, wx.ALL | wx.EXPAND, 5)
actualEmissionsBoxSizer.Add(sizer2, 0, wx.ALL | wx.EXPAND, 5)
actualEmissionsBoxSizer.Add(sizer3, 0, wx.ALL | wx.EXPAND, 5)
actualEmissionsBoxSizer.Add(sizer4, 0, wx.ALL | wx.EXPAND, 5)
actualEmissionsBoxSizer.Add(sizer5, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# MAIN LAYOUT
#########################################################################
mainSizer = wx.BoxSizer(wx.VERTICAL)
mainSizer.Add(actualEmissionsBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
co2Panel.SetSizer(mainSizer)
co2Panel.Layout()
co2Window.CentreOnScreen()
co2Window.AddPanel(co2Panel)
co2Window.SetWindowSize(600, 300)
co2Window.Show()
def _GetSelectedHost(self, simulator):
host = None
# get selected host name from combo
hostName = self.hostsList.GetValue()
# find host with the selected name
for h in simulator.context.hosts:
if h.original_name() == hostName:
host = h
break
return host
def _PopulateComboWithHostsNames(self, simulator):
        hostsNames = []
        for h in simulator.context.hosts:
            if h.original_name() not in hostsNames:
                hostsNames.append(h.original_name())
self.hostsList.Clear()
self.hostsList.AppendItems(hostsNames)
# ################
# LAYOUT
#################
def _BuildHostsBoxAndSizer(self):
""" """
        self.chooseHostLbl = wx.StaticText(self, label="Choose Host To See\nits Total Cost:")
self.hostsList = wx.ComboBox(self, style=wx.TE_READONLY, size=(200, -1))
self.hostsBox = wx.StaticBox(self, label="Host(s)")
self.hostsBoxSizer = wx.StaticBoxSizer(self.hostsBox, wx.HORIZONTAL)
self.hostsBoxSizer.Add(self.chooseHostLbl, 0, wx.ALL | wx.EXPAND, 5)
self.hostsBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
self.hostsBoxSizer.Add(self.hostsList, 1, wx.ALL | wx.EXPAND | wx.ALIGN_RIGHT, 5)
return self.hostsBox, self.hostsBoxSizer
#################
# REACTIONS
#################
def AddFinishedSimulation(self, simulator):
""" """
version = simulator.context.version
self.versionsList.Append(version.name)
self.versionSimulator[version.name] = simulator
def OnVersionChanged(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
self._PopulateComboWithHostsNames(simulator)
self.SetVersionsResultsVisibility(True)
def RemoveAllSimulations(self):
""" """
self.versionsList.Clear()
self.versionsList.SetValue("")
self.versionSimulator = {}
self.hostsList.Clear()
self.hostsList.SetValue("")
self.SetVersionsResultsVisibility(False)
def SetVersionsResultsVisibility(self, visible):
""" """
widgets = []
widgets.append(self.hostsList)
widgets.append(self.co2Box)
widgets.append(self.hostsBox)
widgets.append(self.chooseHostLbl)
for w in widgets:
if visible:
w.Show()
else:
w.Hide()
self.Layout()
class MainResultsNotebook(wx.Notebook):
def __init__(self, module, *args, **kwargs):
wx.Notebook.__init__(self, *args, **kwargs)
self.module = module
il = wx.ImageList(24, 24)
singleVersionImg = il.Add(wx.Bitmap(self.CreatePath4Resource('gogreen.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
self.oneVersionTab = SingleVersionPanel(self.module, self)
self.AddPage(self.oneVersionTab, "Single Version")
self.SetPageImage(0, singleVersionImg)
self.oneVersionTab.Layout()
def OnParsedModel(self):
""" """
self.oneVersionTab.RemoveAllSimulations()
def OnSimulationFinished(self, simulator):
""" """
self.oneVersionTab.AddFinishedSimulation(simulator)
def OnAllSimulationsFinished(self, simulators):
""" """
pass
    def CreatePath4Resource(self, resourceName):
        """
        @brief creates and returns path to the
               given file in the resource
               ('assets') dir
        @return path to the resource
        """
        # resources live in <package root>/bin/assets, two levels above this
        # module's directory; os.path.dirname is portable, unlike searching
        # for a '/' character by hand
        module_dir = os.path.dirname(__file__)
        path = os.path.dirname(os.path.dirname(module_dir))
        return os.path.join(path, 'bin', 'assets', resourceName)
class ModuleGui(wx.EvtHandler):
def __init__(self, module):
""" """
wx.EvtHandler.__init__(self)
self.module = module
self.mainResultNotebook = None
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def get_name(self):
return "Carbon Dioxide Emissions Analysis"
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def get_configuration_panel(self, parent):
""" Returns WX panel with configuration controls. """
panel = wx.Panel(parent)
sizer = wx.BoxSizer(wx.VERTICAL)
text = wx.StaticText(panel, label="Module does not need to be configured.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
text = wx.StaticText(panel, label="All result options will be available after results are calculated.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
panel.SetSizer(sizer)
return panel
def get_results_panel(self, parent):
"""
Create main result panel existing from the beginning
which will be extended when versions' simulations are finished.
"""
self.mainResultNotebook = MainResultsNotebook(self.module, parent)
return self.mainResultNotebook
def on_finished_simulation(self, simulator):
""" """
self.mainResultNotebook.OnSimulationFinished(simulator)
def on_finished_all_simulations(self, simulators):
"""
Called once for all simulations after all of them are finished.
"""
self.mainResultNotebook.OnAllSimulationsFinished(simulators)
def on_parsed_model(self):
""" """
        self.mainResultNotebook.OnParsedModel()

# ==== end of file: aqopa/module/greenanalysis/gui.py (AQoPA-0.9.5) ====
from aqopa import module
from .gui import ModuleGui
from aqopa.simulator.state import HOOK_TYPE_SIMULATION_FINISHED
from .console import PrintResultsHook
class Module(module.Module):
def __init__(self, energyanalysis_module):
self.energyanalysis_module = energyanalysis_module
self.carbon_dioxide_emissions = {}
self.pounds_of_co2_per_kWh = 0
def get_pounds_of_co2_per_kWh(self):
return self.pounds_of_co2_per_kWh
def set_pounds_of_co2_per_kWh(self, pounds_of_co2_per_kWh):
self.pounds_of_co2_per_kWh = pounds_of_co2_per_kWh
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def _install(self, simulator):
"""
"""
return simulator
def install_console(self, simulator):
""" Install module for console simulation """
self._install(simulator)
hook = PrintResultsHook(self, simulator)
simulator.register_hook(HOOK_TYPE_SIMULATION_FINISHED, hook)
return simulator
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def __convert_to_joules(self, millijoules):
return millijoules / 1000.0
def __convert_to_kWh(self, joules):
return joules / 3600000.0
def calculate_emission(self, consumed_joules, pounds_of_co2_per_kWh):
kWhs = self.__convert_to_kWh(consumed_joules)
pounds = kWhs * pounds_of_co2_per_kWh
return pounds
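# calculate_emission() above chains two steps: joules -> kWh (divide by
# 3,600,000, since 1 kWh = 3.6 MJ) and kWh -> pounds of CO2 (multiply by the
# per-kWh emission factor). The same arithmetic as standalone functions
# (the names are illustrative, not part of the module API):

```python
def joules_to_kwh(joules):
    # 1 kWh = 3,600,000 J
    return joules / 3600000.0

def co2_pounds(consumed_joules, pounds_of_co2_per_kwh):
    return joules_to_kwh(consumed_joules) * pounds_of_co2_per_kwh
```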
def calculate_emission_for_host(self, simulator, host, pounds_of_co2_per_kWh):
all_consumptions = self.get_all_hosts_consumption(simulator)
joules = all_consumptions[host]['energy']
pounds_for_host = self.calculate_emission(joules, pounds_of_co2_per_kWh)
return pounds_for_host
def calculate_all_emissions(self, simulator, hosts, pounds_of_co2_per_kWh):
all_emissions = {}
for host in hosts:
all_emissions[host] = self.calculate_emission_for_host(simulator, host, pounds_of_co2_per_kWh)
self.add_co2_emission(simulator, host, all_emissions[host])
return all_emissions
def add_co2_emission(self, simulator, host, co2_emission):
# add a new simulator if not available yet
if simulator not in self.carbon_dioxide_emissions:
self.carbon_dioxide_emissions[simulator] = {}
# add a new host if not available yet
if host not in self.carbon_dioxide_emissions[simulator]:
self.carbon_dioxide_emissions[simulator][host] = []
        # add the amount of released carbon dioxide for the
        # host - but only if we have not added it yet and
        # if it is not 'empty'
if co2_emission not in self.carbon_dioxide_emissions[simulator][host] and co2_emission:
self.carbon_dioxide_emissions[simulator][host].append(co2_emission)
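# add_co2_emission() above lazily creates the nested dict/list structure and
# appends a value only when it is truthy and not already recorded. The same
# pattern on a plain dict (record_emission is an illustrative name):

```python
def record_emission(store, host, emission):
    """Append emission for host, skipping falsy values and repeats."""
    values = store.setdefault(host, [])
    if emission and emission not in values:
        values.append(emission)
    return store
```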
def get_min_emission(self, simulator, hosts):
host = hosts[0]
min_cost = self.carbon_dioxide_emissions[simulator][hosts[0]]
if len(min_cost) > 0:
for h in hosts:
if self.carbon_dioxide_emissions[simulator][h] < min_cost:
min_cost = self.carbon_dioxide_emissions[simulator][h]
host = h
return min_cost[0], host
else:
return 0, None
def get_max_emission(self, simulator, hosts):
host = hosts[0]
max_cost = self.carbon_dioxide_emissions[simulator][hosts[0]]
if len(max_cost) > 0:
for h in hosts:
if self.carbon_dioxide_emissions[simulator][h] > max_cost:
max_cost = self.carbon_dioxide_emissions[simulator][h]
host = h
return max_cost[0], host
        else:
return 0, None
def get_avg_emission(self, simulator, hosts):
cost_sum = 0.0
i = 0
for host in hosts:
for cost in self.carbon_dioxide_emissions[simulator][host]:
cost_sum += cost
i += 1
if i != 0:
return cost_sum / i
else:
return 0
def get_total_emission(self, simulator, hosts):
cost_sum = 0.0
for host in hosts:
for cost in self.carbon_dioxide_emissions[simulator][host]:
cost_sum += cost
return cost_sum
def get_all_emissions(self, simulator):
if simulator not in self.carbon_dioxide_emissions:
return []
return self.carbon_dioxide_emissions[simulator]
def get_all_hosts_consumption(self, simulator):
hosts = simulator.context.hosts
voltage = self.energyanalysis_module.get_voltage()
consumptions = self.energyanalysis_module.get_hosts_consumptions(simulator, hosts, voltage)
        return consumptions

# ==== end of file: aqopa/module/greenanalysis/__init__.py (AQoPA-0.9.5) ====
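# The min/max/avg/total helpers above all walk the per-host emission lists.
# A compact sketch of the same statistics over a plain mapping of host name to
# a single emission value (a simplifying assumption for brevity; the module
# itself keeps a list per host):

```python
def emission_stats(per_host):
    """per_host: dict of host name -> emission in pounds; None when empty."""
    if not per_host:
        return None
    total = sum(per_host.values())
    minimum = min(per_host, key=per_host.get)
    maximum = max(per_host, key=per_host.get)
    return {'total': total,
            'avg': total / len(per_host),
            'min': (per_host[minimum], minimum),
            'max': (per_host[maximum], maximum)}
```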
import wx
import os
from aqopa.gui.general_purpose_frame_gui import GeneralFrame
from sme.SMETool import SMETool
"""
@file __init__.py
@brief gui file for the qopanalysis module
@author Katarzyna Mazur
"""
class SingleVersionPanel(wx.Panel):
    """
    Frame presenting results for one simulation.
    The simulator may be retrieved from the module,
    because each module has its own simulator.
    """
def __init__(self, module, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
self.module = module
self.versionSimulator = {}
self.all_facts = []
self.occured_facts = []
#################
# VERSION BOX
#################
versionBox = wx.StaticBox(self, label="Version")
versionsLabel = wx.StaticText(self, label="Choose Version To See\nAnalysis Results:")
self.versionsList = wx.ComboBox(self, style=wx.TE_READONLY)
self.versionsList.Bind(wx.EVT_COMBOBOX, self.OnVersionChanged)
versionBoxSizer = wx.StaticBoxSizer(versionBox, wx.HORIZONTAL)
versionBoxSizer.Add(versionsLabel, 0, wx.ALL | wx.ALIGN_CENTER, 5)
versionBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
versionBoxSizer.Add(self.versionsList, 1, wx.ALL | wx.ALIGN_CENTER, 5)
##################################
# QoP PARAMETERS BOX
##################################
self.qopParamsBox = wx.StaticBox(self, label="The QoP Analysis Results")
hostsBox, hostsBoxSizer = self._BuildHostsBoxAndSizer()
qopParamsBoxSizer = wx.StaticBoxSizer(self.qopParamsBox, wx.VERTICAL)
qopParamsBoxSizer.Add(hostsBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
#################
# BUTTONS LAY
#################
self.showQoPBtn = wx.Button(self, label="Show")
self.showQoPBtn.Bind(wx.EVT_BUTTON, self.OnShowQoPBtnClicked)
#self.launchSMEBtn = wx.Button(self, label="Evaluate")
#self.launchSMEBtn.Bind(wx.EVT_BUTTON, self.OnLaunchSMEClicked)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
buttonsSizer.Add(self.showQoPBtn, 0, wx.ALL | wx.EXPAND, 5)
#buttonsSizer.Add(self.launchSMEBtn, 0, wx.ALL | wx.EXPAND, 5)
#################
# MAIN LAY
#################
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(versionBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(qopParamsBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(buttonsSizer, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(sizer)
self.SetVersionsResultsVisibility(False)
def OnShowQoPBtnClicked(self, event):
hostName = self.hostsList.GetValue()
if hostName != "" :
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
host = self._GetSelectedHost(simulator)
self.ShowQoPParameters(simulator, host)
def OnLaunchSMEClicked(self, event):
smetool = SMETool(None)
        smetool.SetClientSize(wx.Size(800, 450))
smetool.CentreOnScreen()
smetool.Show()
def _GetSelectedHost(self, simulator):
host = None
# get selected module name from combo
hostName = self.hostsList.GetValue()
# find host with the selected name
for h in simulator.context.hosts:
if h.original_name() == hostName:
host = h
break
return host
def _PopulateComboWithHostsNames(self, simulator):
        hostsNames = []
        for h in simulator.context.hosts:
            if h.original_name() not in hostsNames:
                hostsNames.append(h.original_name())
self.hostsList.Clear()
self.hostsList.AppendItems(hostsNames)
#################
# LAYOUT
#################
def _BuildHostsBoxAndSizer(self):
""" """
        self.chooseHostLbl = wx.StaticText(self, label="Choose Host To See\nits QoP Parameters:")
self.hostsList = wx.ComboBox(self, style=wx.TE_READONLY)
self.hostsBox = wx.StaticBox(self, label="Host(s)")
self.hostsBoxSizer = wx.StaticBoxSizer(self.hostsBox, wx.HORIZONTAL)
self.hostsBoxSizer.Add(self.chooseHostLbl, 0, wx.ALL | wx.EXPAND, 5)
self.hostsBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
self.hostsBoxSizer.Add(self.hostsList, 1, wx.ALL | wx.EXPAND | wx.ALIGN_RIGHT, 5)
return self.hostsBox, self.hostsBoxSizer
#################
# REACTIONS
#################
def AddFinishedSimulation(self, simulator):
""" """
version = simulator.context.version
self.versionsList.Append(version.name)
self.versionSimulator[version.name] = simulator
def OnVersionChanged(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
self._PopulateComboWithHostsNames(simulator)
self.SetVersionsResultsVisibility(True)
def RemoveAllSimulations(self):
""" """
self.versionsList.Clear()
self.versionsList.SetValue("")
self.versionSimulator = {}
self.hostsList.Clear()
self.hostsList.SetValue("")
self.SetVersionsResultsVisibility(False)
def SetVersionsResultsVisibility(self, visible):
""" """
widgets = []
widgets.append(self.hostsList)
widgets.append(self.hostsBox)
widgets.append(self.showQoPBtn)
#widgets.append(self.launchSMEBtn)
widgets.append(self.qopParamsBox)
widgets.append(self.chooseHostLbl)
for w in widgets:
if visible:
w.Show()
else:
w.Hide()
self.Layout()
def ShowQoPParameters(self, simulator, host) :
title = "QoP Parameters for host: "
title += host.original_name()
qopsWindow = GeneralFrame(self, "QoP Analysis Results", title, "modules_results.png")
qopParamsPanel = wx.Panel(qopsWindow)
# get all & occured facts
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
# simply copy lists
self.occured_facts = self._GetoOccuredFacts(simulator)[:]
self.all_facts = self._GetAllFacts(simulator)[:]
##################################
# ALL FACTS LAYOUT
##################################
all_factsListBox = wx.ListBox(qopParamsPanel, choices=self.all_facts)
all_factsBox = wx.StaticBox(qopParamsPanel, label="All Facts")
allFactsBoxSizer = wx.StaticBoxSizer(all_factsBox, wx.VERTICAL)
allFactsBoxSizer.Add(all_factsListBox, 1, wx.ALL | wx.EXPAND, 5)
        ##################################
        # OCCURRED FACTS LAYOUT
        ##################################
        occured_factsListBox = wx.ListBox(qopParamsPanel, choices=self.occured_facts)
        occured_factsBox = wx.StaticBox(qopParamsPanel, label="Occurred Facts")
occuredFactsBoxSizer = wx.StaticBoxSizer(occured_factsBox, wx.VERTICAL)
occuredFactsBoxSizer.Add(occured_factsListBox, 1, wx.ALL | wx.EXPAND, 5)
sizer = wx.BoxSizer(wx.HORIZONTAL)
sizer.Add(allFactsBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
sizer.Add(occuredFactsBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
qopParamsPanel.SetSizer(sizer)
qopParamsPanel.Layout()
qopsWindow.CentreOnScreen()
qopsWindow.AddPanel(qopParamsPanel)
qopsWindow.SetWindowSize(600, 300)
qopsWindow.Show()
# some kind of debugging
#print "Occured facts from GUI: "+str(self._GetoOccuredFacts(simulator))
#print "All facts from GUI: "+str(self._GetAllFacts(simulator))
def _GetAllFacts(self, simulator):
return self.module.get_all_facts()
def _GetoOccuredFacts(self, simulator):
host = None
# get all hosts assigned to this simulator
allHosts = self.module.occured_facts[simulator]
# get the name of the host selected on hosts combo box
hostName = self.hostsList.GetValue()
        # from all hosts get the selected one - it is the host
        # with the same name as selected on the combobox
for h in allHosts :
if h.original_name() == hostName :
host = h
break
# get all facts for the particular simulator and host
return self.module.get_occured_facts(simulator, host)
class MainResultsNotebook(wx.Notebook):
""" """
def __init__(self, module, *args, **kwargs):
wx.Notebook.__init__(self, *args, **kwargs)
self.module = module
il = wx.ImageList(24, 24)
singleVersionImg = il.Add(wx.Bitmap(self.CreatePath4Resource('qop.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
self.oneVersionTab = SingleVersionPanel(self.module, self)
self.AddPage(self.oneVersionTab, "Single Version")
self.SetPageImage(0, singleVersionImg)
self.oneVersionTab.Layout()
def OnParsedModel(self):
""" """
self.oneVersionTab.RemoveAllSimulations()
def OnSimulationFinished(self, simulator):
""" """
self.oneVersionTab.AddFinishedSimulation(simulator)
def OnAllSimulationsFinished(self, simulators):
""" """
pass
    def CreatePath4Resource(self, resourceName):
        """
        @brief creates and returns path to the
               given file in the resource
               ('assets') dir
        @return path to the resource
        """
        # resources live in <package root>/bin/assets, two levels above this
        # module's directory; os.path.dirname is portable, unlike searching
        # for a '/' character by hand
        module_dir = os.path.dirname(__file__)
        path = os.path.dirname(os.path.dirname(module_dir))
        return os.path.join(path, 'bin', 'assets', resourceName)
class ModuleGui(wx.EvtHandler):
def __init__(self, module):
""" """
wx.EvtHandler.__init__(self)
self.module = module
self.mainResultNotebook = None
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def get_name(self):
return "QoP Analysis"
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def get_configuration_panel(self, parent):
""" Returns WX panel with configuration controls. """
panel = wx.Panel(parent)
sizer = wx.BoxSizer(wx.VERTICAL)
text = wx.StaticText(panel, label="Module does not need to be configured.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
text = wx.StaticText(panel, label="All result options will be available after results are calculated.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
panel.SetSizer(sizer)
return panel
def get_results_panel(self, parent):
"""
Create main result panel existing from the beginning
which will be extended when versions' simulations are finished.
"""
self.mainResultNotebook = MainResultsNotebook(self.module, parent)
return self.mainResultNotebook
def on_finished_simulation(self, simulator):
""" """
self.mainResultNotebook.OnSimulationFinished(simulator)
def on_finished_all_simulations(self, simulators):
"""
Called once for all simulations after all of them are finished.
"""
self.mainResultNotebook.OnAllSimulationsFinished(simulators)
def on_parsed_model(self):
""" """
        self.mainResultNotebook.OnParsedModel()

# ==== end of file: aqopa/module/qopanalysis/gui.py (AQoPA-0.9.5) ====
from .qop_param import QoPParameter
from aqopa.simulator.state import Hook, ExecutionResult
from aqopa.model import AssignmentInstruction,\
CallFunctionInstruction, IfInstruction, WhileInstruction,\
CommunicationInstruction, CallFunctionExpression, TupleExpression, ComparisonExpression, \
IdentifierExpression
"""
@file hook.py
@author Katarzyna Mazur
"""
class PreInstructionHook(Hook):
"""
Execution hook executed before default core execution of each instruction.
Returns execution result.
"""
    # static, because the list must be shared by every
    # simulator and its hosts
all_facts = []
def __init__(self, module, simulator):
self.module = module
self.simulator = simulator
def execute(self, context, **kwargs):
instruction = context.get_current_instruction()
# temp - test me!
# if instruction.__class__ is CallFunctionInstruction:
# print "function_name: "+str(instruction.function_name)
# print "arguments: "+str(instruction.arguments)
# print "qop_arguments: "+str(instruction.qop_arguments)
# for i in instruction.arguments:
# print "qop_argument:"+str(i.identifier)
if instruction.__class__ not in [AssignmentInstruction, CallFunctionInstruction,
IfInstruction, WhileInstruction]:
return
self._update_occured_facts(context)
self._update_all_facts(context)
return ExecutionResult()
def _update_occured_facts(self, context):
"""
Update all facts in context according to current instruction.
"""
# get current instruction
instruction = context.get_current_instruction()
# check the instruction type and extract the expression from it
if isinstance(instruction, AssignmentInstruction):
expression = instruction.expression
elif isinstance(instruction, CallFunctionInstruction):
expression = CallFunctionExpression(instruction.function_name, instruction.arguments, instruction.qop_arguments)
else:
expression = instruction.condition
# Extract fact details for each expression in the instruction.
# Some instructions contain multiple expressions (tuples, nested call functions).
fact = self._get_occured_facts_details_for_expression(context, expression)
# add fact to the module's fact list
self.module.add_occured_fact(self.simulator, context.get_current_host(), fact)
def _get_occured_facts_details_for_expression(self, context, expression):
if isinstance(expression, TupleExpression):
return self._get_occured_facts_details_for_tuple_expression(context, expression)
elif isinstance(expression, CallFunctionExpression):
return self._get_occured_facts_details_for_simple_expression(context, expression)
elif isinstance(expression, ComparisonExpression):
return self._get_occured_facts_details_for_comparison_expression(context, expression)
return []
def _get_occured_facts_details_for_tuple_expression(self, context, expression):
qop_args = []
for i in range(0, len(expression.elements)):
qop_arg = self._get_occured_facts_details_for_expression(context, expression.elements[i])
if isinstance(qop_arg, list):
qop_args.extend(qop_arg)
else:
qop_args.append(qop_arg)
return qop_args
def _get_occured_facts_details_for_simple_expression(self, context, expression):
qop_args = []
for expr in expression.qop_arguments:
#e = self._get_occured_facts_details_for_expression(context, expr)
qop_args.append(str(expr))
return qop_args
def _get_occured_facts_details_for_comparison_expression(self, context, expression):
qop_args = []
l = self._get_occured_facts_details_for_expression(context, expression.left)
r = self._get_occured_facts_details_for_expression(context, expression.right)
if isinstance(l, list):
qop_args.extend(l)
else:
qop_args.append(l)
if isinstance(r, list):
qop_args.extend(r)
else:
qop_args.append(r)
return qop_args
def _update_all_facts(self, context):
# make a list from all the facts within the simulator and its hosts,
# simply concatenate all the facts from all the simulators and its all
# hosts into a one, big list of facts available within the loaded model
facts = self.module.get_occured_facts(self.simulator, context.get_current_host())
for fact in facts:
if fact not in PreInstructionHook.all_facts:
PreInstructionHook.all_facts.append(fact)
self.module.set_all_facts(PreInstructionHook.all_facts)
# --- aqopa/module/qopanalysis/hook.py ---
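The `_get_occured_facts_details_*` helpers above walk an expression tree and pull out the `qop_arguments` as fact names. A self-contained sketch of that recursion, using stand-in expression classes (`TupleExpr` and `CallExpr` are assumptions for illustration, not AQoPA's model classes):

```python
class TupleExpr:
    """Stand-in for a tuple expression with nested elements."""
    def __init__(self, elements):
        self.elements = elements


class CallExpr:
    """Stand-in for a call-function expression carrying qop arguments."""
    def __init__(self, qop_arguments):
        self.qop_arguments = qop_arguments


def collect_facts(expression):
    """Recursively collect qop arguments, flattening nested tuples."""
    if isinstance(expression, TupleExpr):
        facts = []
        for element in expression.elements:
            sub = collect_facts(element)
            facts.extend(sub if isinstance(sub, list) else [sub])
        return facts
    if isinstance(expression, CallExpr):
        return [str(arg) for arg in expression.qop_arguments]
    return []  # unsupported expression types yield no facts


expr = TupleExpr([CallExpr(['f1', 'f2']), CallExpr(['f3'])])
print(collect_facts(expr))  # ['f1', 'f2', 'f3']
```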
from aqopa import module
from aqopa.simulator.state import HOOK_TYPE_SIMULATION_FINISHED, HOOK_TYPE_PRE_INSTRUCTION_EXECUTION
from .gui import ModuleGui
from .console import PrintResultsHook
from .hook import PreInstructionHook
"""
@file __init__.py
@brief initial file for the qopanalysis module
@author Katarzyna Mazur
"""
class Module(module.Module):
def __init__(self):
# all occurred facts in a host, format (simulator is the dict's key):
# { simulator: {host0: [f1,f2, ... fn], host1: [f1,f2, ..., fm]} }
self.occured_facts = {}
# all facts in a model
self.all_facts = []
# qopanalysis params list in a host, format: (simulator is the dict's key):
# { simulator: {host0: [[qop1, qop2, ..., qopn], ..., [qop1, qop2, ..., qopm]], host1: [[qop1, qop2, ..., qopk], ..., [qop1, qop2, ..., qopx]]} }
self.qopParams = {}
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def _install(self, simulator):
hook = PreInstructionHook(self, simulator)
simulator.register_hook(HOOK_TYPE_PRE_INSTRUCTION_EXECUTION, hook)
return simulator
def install_console(self, simulator):
""" Install module for console simulation """
self._install(simulator)
hook = PrintResultsHook(self, simulator)
simulator.register_hook(HOOK_TYPE_SIMULATION_FINISHED, hook)
return simulator
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def get_all_facts(self):
"""
@brief returns a list which contains
all the facts available in a model
"""
return self.all_facts
def set_all_facts(self, facts_list):
"""
@brief sets all facts for the loaded
model
"""
self.all_facts = facts_list[:]
def add_occured_fact(self, simulator, host, fact):
"""
@brief adds a new, occured fact to the list
of occured facts for the particular host
present in the QoP-ML's model
"""
# add a new simulator if not available yet
if simulator not in self.occured_facts:
self.occured_facts[simulator] = {}
# add a new host if not available yet
if host not in self.occured_facts[simulator]:
self.occured_facts[simulator][host] = []
# add a new fact for the host - but only if we
# have not added it yet and if it is not empty
if str(fact) not in self.occured_facts[simulator][host] and fact:
# if the fact is actually a list of facts,
if isinstance(fact, list):
# add all the elements from the facts list
for f in fact:
if str(f) not in self.occured_facts[simulator][host]:
self.occured_facts[simulator][host].append(str(f))
else:
self.occured_facts[simulator][host].append(str(fact))
def get_occured_facts(self, simulator, host):
"""
@brief gets a list of all occurred facts
for the particular host present in the
QoP-ML model
"""
if simulator not in self.occured_facts:
self.occured_facts[simulator] = {}
if host not in self.occured_facts[simulator]:
self.occured_facts[simulator][host] = []
return self.__make_list_flat(self.occured_facts[simulator][host])
def get_qop_params(self):
pass
def add_qop_param(self):
pass
def __make_list_flat(self, l):
"""
@brief an element of the list may itself be
a list, so recursively flatten the list given
in the argument and return the flat result
"""
ans = []
for i in l:
if isinstance(i, list):
ans.extend(self.__make_list_flat(i))
else:
ans.append(i)
return ans
# --- aqopa/module/qopanalysis/__init__.py ---
import os
import re
import wx
import wx.animate
import wx.lib.scrolledpanel as scrolled
import wx.lib.delayedresult
from aqopa.model import name_indexes
from aqopa.bin import gui as aqopa_gui
from aqopa.simulator.error import RuntimeException
from aqopa.gui.general_purpose_frame_gui import GeneralFrame
"""
@file gui.py
@brief GUI for the energy analysis panel
@author Damian Rusinek <[email protected]>
@date created on 06-09-2013 by Damian Rusinek
@date edited on 25-06-2014 by Katarzyna Mazur (visual improvements mainly)
"""
class SingleVersionPanel(wx.Panel):
"""
Frame presenting results for one simulation.
The simulator may be retrieved from the module,
because each module has its own simulator.
"""
def __init__(self, module, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
self.module = module
self.versionSimulator = {}
self.hostChoosePanels = []  # Panels used to choose hosts for energy consumption results
self.checkBoxInformations = {}  # Maps host checkbox to a (host name, index-range widget) tuple
self.hostCheckBoxes = []  # List of checkboxes with hosts' names used for host selection
#################
# VERSION BOX
#################
versionBox = wx.StaticBox(self, label="Version")
versionsLabel = wx.StaticText(self, label="Choose Version To See\nAnalysis Results:")
self.versionsList = wx.ComboBox(self, style=wx.TE_READONLY)
self.versionsList.Bind(wx.EVT_COMBOBOX, self.OnVersionChanged)
versionBoxSizer = wx.StaticBoxSizer(versionBox, wx.HORIZONTAL)
versionBoxSizer.Add(versionsLabel, 0, wx.ALL | wx.ALIGN_CENTER, 5)
versionBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
versionBoxSizer.Add(self.versionsList, 1, wx.ALL | wx.ALIGN_CENTER, 5)
##################################
# ENERGY CONSUMPTION BOX
##################################
self.consumptionsBox = wx.StaticBox(self, label="Energy consumption results")
self.voltageLabel = wx.StaticText(self, label="Enter the voltage value:")
self.voltageInput = wx.TextCtrl(self, size=(200, 20))
voltageHBoxSizer = wx.BoxSizer(wx.HORIZONTAL)
voltageHBoxSizer.Add(self.voltageLabel, 0, wx.ALL | wx.EXPAND, 10)
voltageHBoxSizer.Add(self.voltageInput, 1, wx.ALL | wx.CENTER, 10)
operationBox, operationBoxSizer = self._BuildOperationsBoxAndSizer()
hostsBox, hostsBoxSizer = self._BuildHostsBoxAndSizer()
consumptionsBoxSizer = wx.StaticBoxSizer(self.consumptionsBox, wx.VERTICAL)
consumptionsHBoxSizer = wx.BoxSizer(wx.HORIZONTAL)
consumptionsHBoxSizer.Add(operationBoxSizer, 0, wx.ALL | wx.EXPAND)
consumptionsHBoxSizer.Add(hostsBoxSizer, 1, wx.ALL | wx.EXPAND)
self.showConsumptionBtn = wx.Button(self, label="Show")
self.showConsumptionBtn.Bind(wx.EVT_BUTTON, self.OnShowConsumptionButtonClicked)
consumptionsBoxSizer.Add(voltageHBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
consumptionsBoxSizer.Add(consumptionsHBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
consumptionsBoxSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
consumptionsBoxSizer.Add(self.showConsumptionBtn, 0, wx.ALIGN_RIGHT | wx.ALL, 5)
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(versionBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(consumptionsBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
self.SetSizer(sizer)
self.SetVersionsResultsVisibility(False)
#################
# REACTIONS
#################
def AddFinishedSimulation(self, simulator):
""" """
version = simulator.context.version
self.versionsList.Append(version.name)
self.versionSimulator[version.name] = simulator
def OnVersionChanged(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
self._BuildHostsChoosePanel(simulator)
self.SetVersionsResultsVisibility(True)
def OnShowConsumptionButtonClicked(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
hosts = self._GetSelectedHosts(simulator)
if len(hosts) == 0:
wx.MessageBox("Please select hosts.", 'Error', wx.OK | wx.ICON_ERROR)
return
voltageText = self.voltageInput.GetValue().strip()
try:
voltage = float(voltageText)
except ValueError:
wx.MessageBox("Voltage '%s' is incorrect float number. Please correct it." % voltageText,
'Error', wx.OK | wx.ICON_ERROR)
return
self.module.set_voltage(voltage)
if self.oneECRB.GetValue():
self.ShowHostsConsumption(simulator, hosts, voltage)
elif self.avgECRB.GetValue():
self.ShowAverageHostsConsumption(simulator, hosts, voltage)
elif self.minECRB.GetValue():
self.ShowMinimalHostsConsumption(simulator, hosts, voltage)
elif self.maxECRB.GetValue():
self.ShowMaximalHostsConsumption(simulator, hosts, voltage)
def RemoveAllSimulations(self):
""" """
self.versionsList.Clear()
self.versionsList.SetValue("")
self.versionSimulator = {}
self.hostChoosePanels = []
self.checkBoxInformations = {}
self.hostCheckBoxes = []
self.hostsBoxSizer.Clear(True)
self.SetVersionsResultsVisibility(False)
#################
# LAYOUT
#################
def _BuildOperationsBoxAndSizer(self):
""" """
self.operationBox = wx.StaticBox(self, label="Operation")
self.oneECRB = wx.RadioButton(self, label="Energy consumption of one host")
self.avgECRB = wx.RadioButton(self, label="Average hosts' energy consumption")
self.minECRB = wx.RadioButton(self, label="Minimal hosts' energy consumption")
self.maxECRB = wx.RadioButton(self, label="Maximal hosts' energy consumption")
operationBoxSizer = wx.StaticBoxSizer(self.operationBox, wx.VERTICAL)
operationBoxSizer.Add(self.oneECRB, 0, wx.ALL)
operationBoxSizer.Add(self.avgECRB, 0, wx.ALL)
operationBoxSizer.Add(self.minECRB, 0, wx.ALL)
operationBoxSizer.Add(self.maxECRB, 0, wx.ALL)
return self.operationBox, operationBoxSizer
def _BuildHostsBoxAndSizer(self):
""" """
self.hostsBox = wx.StaticBox(self, label="Host(s)")
self.hostsBoxSizer = wx.StaticBoxSizer(self.hostsBox, wx.VERTICAL)
return self.hostsBox, self.hostsBoxSizer
def _BuildHostsChoosePanel(self, simulator):
""" """
for p in self.hostChoosePanels:
p.Destroy()
self.hostChoosePanels = []
self.checkBoxInformations = {}
self.hostCheckBoxes = []
self.hostsBoxSizer.Layout()
hosts = simulator.context.hosts
hostsIndexes = {}
for h in hosts:
name = h.original_name()
indexes = name_indexes(h.name)
index = indexes[0]
if name not in hostsIndexes or index > hostsIndexes[name]:
hostsIndexes[name] = index
for hostName in hostsIndexes:
panel = wx.Panel(self)
panelSizer = wx.BoxSizer(wx.HORIZONTAL)
ch = wx.CheckBox(panel, label=hostName, size=(120, 20))
textCtrl = wx.TextCtrl(panel, size=(200, 20))
textCtrl.SetValue("0")
rangeLabel = "Available range: 0"
if hostsIndexes[hostName] > 0:
rangeLabel += " - %d" % hostsIndexes[hostName]
maxLbl = wx.StaticText(panel, label=rangeLabel)
panelSizer.Add(ch, 0, wx.ALL | wx.ALIGN_CENTER)
panelSizer.Add(textCtrl, 1, wx.ALL | wx.ALIGN_CENTER)
panelSizer.Add(maxLbl, 0, wx.ALL | wx.ALIGN_CENTER)
panel.SetSizer(panelSizer)
self.hostsBoxSizer.Add(panel, 1, wx.ALL)
self.checkBoxInformations[ch] = (hostName, textCtrl)
self.hostChoosePanels.append(panel)
self.hostCheckBoxes.append(ch)
self.hostsBoxSizer.Layout()
self.Layout()
def SetVersionsResultsVisibility(self, visible):
""" """
widgets = []
widgets.append(self.consumptionsBox)
widgets.append(self.operationBox)
widgets.append(self.oneECRB)
widgets.append(self.avgECRB)
widgets.append(self.minECRB)
widgets.append(self.maxECRB)
widgets.append(self.hostsBox)
widgets.append(self.showConsumptionBtn)
widgets.append(self.voltageLabel)
widgets.append(self.voltageInput)
for w in widgets:
if visible:
w.Show()
else:
w.Hide()
self.Layout()
#################
# STATISTICS
#################
def _GetSelectedHosts(self, simulator):
""" Returns list of hosts selected by the user """
def ValidateHostsRange(indexesRange):
""" Checks that the text is a comma-separated list of indexes/ranges, e.g. 0,12,20-25,30 """
return re.match(r'^\d+(-\d+)?(,\d+(-\d+)?)*$', indexesRange)
def GetIndexesFromRange(indexesRange):
""" Extracts numbers list of hosts from range text """
indexes = []
ranges = indexesRange.split(',')
for r in ranges:
parts = r.split('-')
if len(parts) == 1:
indexes.append(int(parts[0]))
else:
for i in range(int(parts[0]), int(parts[1])+1):
indexes.append(i)
return indexes
hosts = []
for ch in self.hostCheckBoxes:
if not ch.IsChecked():
continue
hostName, hostRangeTextCtrl = self.checkBoxInformations[ch]
indexesRange = hostRangeTextCtrl.GetValue()
if not ValidateHostsRange(indexesRange):
wx.MessageBox("Range '%s' for host '%s' is invalid. Valid example: 0,12,20-25,30."
% (indexesRange, hostName), 'Error', wx.OK | wx.ICON_ERROR)
break
else:
indexes = GetIndexesFromRange(indexesRange)
for h in simulator.context.hosts:
hostIndexes = name_indexes(h.name)
if h.original_name() == hostName and hostIndexes[0] in indexes:
hosts.append(h)
return hosts
def ShowHostsConsumption(self, simulator, hosts, voltage):
"""
@brief shows hosts' energy consumption in a new window
"""
consumptions = self.module.get_hosts_consumptions(simulator, hosts, voltage)
lblText = ""
for h in hosts:
lblText += "%s: %.6f J" % (h.name, consumptions[h]['energy'])
error = h.get_finish_error()
if error is not None:
lblText += " (Not Finished - %s)" % error
lblText += "\n\n"
# create a new frame to show energy analysis results on it
hostsEnergyWindow = GeneralFrame(self, "Energy Analysis Results", "Hosts' Consumption", "modules_results.png")
# create scrollable panel
hostsPanel = scrolled.ScrolledPanel(hostsEnergyWindow)
# create informational label
lbl = wx.StaticText(hostsPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
hostsPanel.SetSizer(sizer)
hostsPanel.SetupScrolling(scroll_x=True)
hostsPanel.Layout()
# add panel on a window
hostsEnergyWindow.AddPanel(hostsPanel)
# center window on a screen
hostsEnergyWindow.CentreOnScreen()
# show the results on the new window
hostsEnergyWindow.Show()
def ShowAverageHostsConsumption(self, simulator, hosts, voltage):
""" """
def GetVal(consumptions, hosts):
total = 0.0
for h in hosts:
total += consumptions[h]['energy']
return total / float(len(hosts))
val = GetVal(self.module.get_hosts_consumptions(simulator, hosts, voltage), hosts)
lblText = "Average: %.6f J" % val
# create a new frame to show energy analysis results on it
avgEnergyWindow = GeneralFrame(self, "Energy Analysis Results", "Average Hosts' Consumption", "modules_results.png")
# create scrollable panel
avgPanel = scrolled.ScrolledPanel(avgEnergyWindow)
# create informational label
lbl = wx.StaticText(avgPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
avgPanel.SetSizer(sizer)
avgPanel.Layout()
# add panel on a window
avgEnergyWindow.AddPanel(avgPanel)
# show the results on the new window
avgEnergyWindow.Show()
def ShowMinimalHostsConsumption(self, simulator, hosts, voltage):
""" """
def GetVal(consumptions, hosts):
val = None
for h in hosts:
v = consumptions[h]['energy']
if val is None or v < val:
val = v
return val
val = GetVal(self.module.get_hosts_consumptions(simulator, hosts, voltage), hosts)
lblText = "Minimum: %.6f J" % val
# create a new frame to show energy analysis results on it
minEnergyWindow = GeneralFrame(self, "Energy Analysis Results", "Minimal Hosts' Consumption", "modules_results.png")
# create scrollable panel
minPanel = scrolled.ScrolledPanel(minEnergyWindow)
# create informational label
lbl = wx.StaticText(minPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
minPanel.SetSizer(sizer)
minPanel.Layout()
# add panel on a window
minEnergyWindow.AddPanel(minPanel)
# center window on a screen
minEnergyWindow.CentreOnScreen()
# show the results on the new window
minEnergyWindow.Show()
def ShowMaximalHostsConsumption(self, simulator, hosts, voltage):
""" """
def GetVal(consumptions, hosts):
val = 0.0
for h in hosts:
v = consumptions[h]['energy']
if v > val:
val = v
return val
val = GetVal(self.module.get_hosts_consumptions(simulator, hosts, voltage), hosts)
lblText = "Maximum: %.6f J" % val
# create a new frame to show energy analysis results on it
maxEnergyWindow = GeneralFrame(self, "Energy Analysis Results", "Maximal Hosts' Consumption", "modules_results.png")
# create scrollable panel
maxPanel = scrolled.ScrolledPanel(maxEnergyWindow)
# create informational label
lbl = wx.StaticText(maxPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
maxPanel.SetSizer(sizer)
maxPanel.Layout()
# add panel on a window
maxEnergyWindow.AddPanel(maxPanel)
# center window on a screen
maxEnergyWindow.CentreOnScreen()
# show the results on the new window
maxEnergyWindow.Show()
class MainResultsNotebook(wx.Notebook):
""" """
def __init__(self, module, *args, **kwargs):
wx.Notebook.__init__(self, *args, **kwargs)
self.module = module
il = wx.ImageList(24,24)
singleVersionImg = il.Add(wx.Bitmap(self.CreatePath4Resource('energy.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
self.oneVersionTab = SingleVersionPanel(self.module, self)
self.AddPage(self.oneVersionTab, "Single Version")
self.SetPageImage(0, singleVersionImg)
self.oneVersionTab.Layout()
def OnParsedModel(self):
""" """
self.oneVersionTab.RemoveAllSimulations()
def OnSimulationFinished(self, simulator):
""" """
self.oneVersionTab.AddFinishedSimulation(simulator)
def OnAllSimulationsFinished(self, simulators):
""" """
pass
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
# get the parent directory portably, instead of searching for '/'
path = os.path.dirname(tmp[0])
return os.path.join(path, 'bin', 'assets', resourceName)
class ModuleGui(wx.EvtHandler):
"""
Class used by GUI version of AQoPA.
"""
def __init__(self, module):
""" """
wx.EvtHandler.__init__(self)
self.module = module
self.mainResultNotebook = None
def get_name(self):
return "Energy Analysis"
def get_configuration_panel(self, parent):
""" Returns WX panel with configuration controls. """
panel = wx.Panel(parent)
sizer = wx.BoxSizer(wx.VERTICAL)
text = wx.StaticText(panel, label="Module does not need to be configured.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
text = wx.StaticText(panel, label="All result options will be available after results are calculated.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
text = wx.StaticText(panel, label="Module requires Time Analysis module.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
panel.SetSizer(sizer)
return panel
def get_results_panel(self, parent):
"""
Creates the main results panel, which exists from the beginning
and is extended as versions' simulations finish.
"""
self.mainResultNotebook = MainResultsNotebook(self.module, parent)
return self.mainResultNotebook
def on_finished_simulation(self, simulator):
""" """
self.mainResultNotebook.OnSimulationFinished(simulator)
def on_finished_all_simulations(self, simulators):
"""
Called once for all simulations after all of them are finished.
"""
self.mainResultNotebook.OnAllSimulationsFinished(simulators)
def on_parsed_model(self):
""" """
self.mainResultNotebook.OnParsedModel()
# --- aqopa/module/energyanalysis/gui.py ---
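`_GetSelectedHosts` above accepts host index ranges typed as text (e.g. `0,12,20-25,30`). A standalone sketch of that validation and expansion (the anchored regex here is an assumption that tightens the original unanchored check):

```python
import re

# Anchored pattern: comma-separated single indexes or low-high ranges.
RANGE_RE = re.compile(r'^\d+(-\d+)?(,\d+(-\d+)?)*$')


def parse_indexes(indexes_range):
    """Expand a range string such as '0,12,20-22' into a list of ints."""
    if not RANGE_RE.match(indexes_range):
        raise ValueError("invalid range: %r" % indexes_range)
    indexes = []
    for part in indexes_range.split(','):
        bounds = part.split('-')
        if len(bounds) == 1:
            indexes.append(int(bounds[0]))
        else:
            indexes.extend(range(int(bounds[0]), int(bounds[1]) + 1))
    return indexes


print(parse_indexes("0,12,20-22"))  # [0, 12, 20, 21, 22]
```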
from aqopa.model import MetricsServiceParam
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
class Builder():
def create_metrics_services_param_energy(self, token):
"""
metrics_services_param : SQLPARAN CURRENT COLON metrics_services_energy_type LPARAN metrics_services_energy_unit RPARAN SQRPARAN
| SQLPARAN CURRENT COLON LISTENING COLON EXACT LPARAN metrics_services_energy_unit RPARAN SQRPARAN
"""
if len(token) == 9:
return MetricsServiceParam(token[2], token[4], token[6])
else:
param_name = token[4] + ':' + token[6]
return MetricsServiceParam(token[2], param_name, token[6])
class ModelParserExtension(LexYaccParserExtension):
"""
Extension for timeanalysis module communications time
"""
#######################
# Communication Time
#######################
def sending_current_default_parameter(self, t):
"""
medium_default_parameter : SENDING_CURRENT_DEFAULT_PARAMETER EQUAL comm_current_value
"""
t[0] = {'default_sending_current': t[3]}
def receiving_current_default_parameter(self, t):
"""
medium_default_parameter : RECEIVING_CURRENT_DEFAULT_PARAMETER EQUAL comm_current_value
"""
t[0] = {'default_receiving_current': t[3]}
def listening_current_default_parameter(self, t):
"""
medium_default_parameter : LISTENING_CURRENT_DEFAULT_PARAMETER EQUAL comm_current_value
"""
t[0] = {'default_listening_current': t[3]}
def topology_rule_sending_current_parameter(self, t):
"""
topology_rule_parameter : SENDING_CURRENT_PARAMETER EQUAL comm_current_value
"""
t[0] = {'sending_current': t[3]}
def topology_rule_receiving_current_parameter(self, t):
"""
topology_rule_parameter : RECEIVING_CURRENT_PARAMETER EQUAL comm_current_value
"""
t[0] = {'receiving_current': t[3]}
def topology_rule_listening_current_parameter(self, t):
"""
topology_rule_parameter : LISTENING_CURRENT_PARAMETER EQUAL comm_current_value
"""
t[0] = {'listening_current': t[3]}
def comm_current_value(self, t):
"""
comm_current_value : comm_current_metric
| comm_current_algorithm
"""
t[0] = t[1]
def comm_current_metric(self, t):
"""
comm_current_metric : number comm_current_metric_unit
"""
t[0] = {
'type': 'metric',
'value': t[1],
'unit': t[2]
}
def comm_current_algorithm(self, t):
"""
comm_current_algorithm : IDENTIFIER SQLPARAN comm_current_metric_unit SQRPARAN
| IDENTIFIER
"""
unit = 'mA'
if len(t) == 5:
unit = t[3]
t[0] = {
'type': 'algorithm',
'name': t[1],
'unit': unit
}
def comm_current_metric_unit(self, t):
"""
comm_current_metric_unit : MA
"""
t[0] = t[1]
def _extend(self):
""" """
self.parser.add_reserved_word('mA', 'MA', state='communication', case_sensitive=True)
self.parser.add_reserved_word('default_sending_current', 'SENDING_CURRENT_DEFAULT_PARAMETER',
state='communication',)
self.parser.add_reserved_word('default_receiving_current', 'RECEIVING_CURRENT_DEFAULT_PARAMETER',
state='communication',)
self.parser.add_reserved_word('default_listening_current', 'LISTENING_CURRENT_DEFAULT_PARAMETER',
state='communication',)
self.parser.add_reserved_word('sending_current', 'SENDING_CURRENT_PARAMETER',
state='communication',)
self.parser.add_reserved_word('receiving_current', 'RECEIVING_CURRENT_PARAMETER',
state='communication',)
self.parser.add_reserved_word('listening_current', 'LISTENING_CURRENT_PARAMETER',
state='communication',)
self.parser.add_rule(self.sending_current_default_parameter)
self.parser.add_rule(self.receiving_current_default_parameter)
self.parser.add_rule(self.listening_current_default_parameter)
self.parser.add_rule(self.topology_rule_sending_current_parameter)
self.parser.add_rule(self.topology_rule_receiving_current_parameter)
self.parser.add_rule(self.topology_rule_listening_current_parameter)
self.parser.add_rule(self.comm_current_value)
self.parser.add_rule(self.comm_current_metric)
self.parser.add_rule(self.comm_current_algorithm)
self.parser.add_rule(self.comm_current_metric_unit)
class ConfigParserExtension(LexYaccParserExtension):
"""
Extension for timeanalysis module communications time
"""
#######################
# Communication Time
#######################
def version_sending_current_default_parameter(self, t):
"""
version_medium_default_parameter : SENDING_CURRENT_DEFAULT_PARAMETER EQUAL version_comm_current_value
"""
t[0] = {'default_sending_current': t[3]}
def version_receiving_current_default_parameter(self, t):
"""
version_medium_default_parameter : RECEIVING_CURRENT_DEFAULT_PARAMETER EQUAL version_comm_current_value
"""
t[0] = {'default_receiving_current': t[3]}
def version_topology_rule_sending_current_parameter(self, t):
"""
version_topology_rule_parameter : SENDING_CURRENT_PARAMETER EQUAL version_comm_current_value
"""
t[0] = {'sending_current': t[3]}
def version_topology_rule_receiving_current_parameter(self, t):
"""
version_topology_rule_parameter : RECEIVING_CURRENT_PARAMETER EQUAL version_comm_current_value
"""
t[0] = {'receiving_current': t[3]}
def version_listening_current_default_parameter(self, t):
"""
version_medium_default_parameter : LISTENING_CURRENT_DEFAULT_PARAMETER EQUAL version_comm_current_value
"""
t[0] = {'default_listening_current': t[3]}
def version_topology_rule_listening_current_parameter(self, t):
"""
version_topology_rule_parameter : LISTENING_CURRENT_PARAMETER EQUAL version_comm_current_value
"""
t[0] = {'listening_current': t[3]}
def version_comm_current_value(self, t):
"""
version_comm_current_value : version_comm_current_metric
| version_comm_current_algorithm
"""
t[0] = t[1]
def version_comm_current_metric(self, t):
"""
version_comm_current_metric : number version_comm_current_metric_unit
"""
t[0] = {
'type': 'metric',
'value': t[1],
'unit': t[2]
}
def version_comm_current_algorithm(self, t):
"""
version_comm_current_algorithm : IDENTIFIER SQLPARAN version_comm_current_metric_unit SQRPARAN
| IDENTIFIER
"""
unit = 'mA'
if len(t) == 5:
unit = t[3]
t[0] = {
'type': 'algorithm',
'name': t[1],
'unit': unit
}
def version_comm_current_metric_unit(self, t):
"""
version_comm_current_metric_unit : MA
"""
t[0] = t[1]
def _extend(self):
""" """
self.parser.add_reserved_word('mA', 'MA', state='versioncommunication', case_sensitive=True)
self.parser.add_reserved_word('default_sending_current', 'SENDING_CURRENT_DEFAULT_PARAMETER',
state='versioncommunication',)
self.parser.add_reserved_word('default_receiving_current', 'RECEIVING_CURRENT_DEFAULT_PARAMETER',
state='versioncommunication',)
self.parser.add_reserved_word('default_listening_current', 'LISTENING_CURRENT_DEFAULT_PARAMETER',
state='versioncommunication',)
self.parser.add_reserved_word('sending_current', 'SENDING_CURRENT_PARAMETER',
state='versioncommunication',)
self.parser.add_reserved_word('receiving_current', 'RECEIVING_CURRENT_PARAMETER',
state='versioncommunication',)
self.parser.add_reserved_word('listening_current', 'LISTENING_CURRENT_PARAMETER',
state='versioncommunication',)
self.parser.add_rule(self.version_sending_current_default_parameter)
self.parser.add_rule(self.version_receiving_current_default_parameter)
self.parser.add_rule(self.version_topology_rule_sending_current_parameter)
self.parser.add_rule(self.version_topology_rule_receiving_current_parameter)
self.parser.add_rule(self.version_listening_current_default_parameter)
self.parser.add_rule(self.version_topology_rule_listening_current_parameter)
self.parser.add_rule(self.version_comm_current_value)
self.parser.add_rule(self.version_comm_current_metric)
self.parser.add_rule(self.version_comm_current_algorithm)
self.parser.add_rule(self.version_comm_current_metric_unit)
class MetricsParserExtension(LexYaccParserExtension):
"""
Extension for parsing energy analysis module's metrics
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
##################################
# Metrics energy consumption
##################################
def metrics_services_param_energy(self, t):
"""
metrics_services_param : SQLPARAN CURRENT COLON metrics_services_energy_type LPARAN metrics_services_energy_unit RPARAN SQRPARAN
"""
t[0] = self.builder.create_metrics_services_param_energy(t)
def metrics_services_energy_param_type(self, t):
"""
metrics_services_energy_type : EXACT
"""
t[0] = t[1].lower()
def metrics_services_energy_unit(self, t):
"""
metrics_services_energy_unit : MA
"""
t[0] = t[1].lower()
def _extend(self):
""" """
self.parser.add_reserved_word('current', 'CURRENT', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('exact', 'EXACT', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('mA', 'MA', state='metricsprimhead', case_sensitive=True)
self.parser.add_rule(self.metrics_services_param_energy)
self.parser.add_rule(self.metrics_services_energy_param_type)
self.parser.add_rule(self.metrics_services_energy_unit)
# --- aqopa/module/energyanalysis/parser.py ---
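The metrics grammar above declares currents in milliamps (`current:exact(mA)`), while the analysis code divides by 1000 to work in amperes. A short sketch of that unit handling, plus the energy relation the module ultimately relies on, E = U·I·t (function names here are illustrative, not AQoPA API):

```python
def metric_current_to_amps(value_ma):
    """Convert a parsed current metric from mA (the grammar's unit) to A."""
    return float(value_ma) / 1000.0


def energy_joules(voltage_v, current_ma, time_s):
    """Energy in joules: E = U * I * t (volts * amps * seconds)."""
    return voltage_v * metric_current_to_amps(current_ma) * time_s


# e.g. a 20 mA operation at 3 V lasting 2 s consumes ~0.12 J
print(energy_joules(3.0, 20.0, 2.0))
```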
from aqopa import module
from aqopa.module.energyanalysis.console import PrintResultsHook
from aqopa.simulator.error import RuntimeException
from aqopa.simulator.state import HOOK_TYPE_SIMULATION_FINISHED
from .gui import ModuleGui
from aqopa.module.energyanalysis.parser import MetricsParserExtension, ConfigParserExtension, ModelParserExtension
from aqopa.model import WhileInstruction, AssignmentInstruction,\
CallFunctionInstruction, CallFunctionExpression, IfInstruction
class Module(module.Module):
"""
"""
def __init__(self, timeanalysis_module):
""" """
self.guis = {} # Divided by simulators - the reason for dict
self.timeanalysis_module = timeanalysis_module
self.voltage = 0
def get_voltage(self):
return self.voltage
def set_voltage(self, voltage):
self.voltage = voltage
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def extend_metrics_parser(self, parser):
"""
Overridden
"""
parser.add_extension(MetricsParserExtension())
return parser
def extend_model_parser(self, parser):
"""
Overridden
"""
parser.add_extension(ModelParserExtension())
return parser
def extend_config_parser(self, parser):
"""
Overridden
"""
parser.add_extension(ConfigParserExtension())
return parser
def _install(self, simulator):
"""
"""
return simulator
def install_console(self, simulator):
""" Install module for console simulation """
self._install(simulator)
hook = PrintResultsHook(self, simulator)
simulator.register_hook(HOOK_TYPE_SIMULATION_FINISHED, hook)
return simulator
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def _get_current_from_metric(self, metric, default=None):
"""
Returns current in A
"""
block = metric.block
for i in range(0, len(block.service_params)):
sparam = block.service_params[i]
if sparam.service_name.lower() != "current":
continue
metric_type = sparam.param_name.lower()
if metric_type != "exact":
continue
# metric_unit = sparam.unit - mA
return float(metric.service_arguments[i]) / 1000.0
return default
def _get_current_for_expression(self, metrics_manager, host, expression):
"""
Returns current (in A) from metric for cpu.
"""
current = None
# Get current metric for expression
metric = metrics_manager.find_primitive(host, expression)
if metric:
current = self._get_current_from_metric(metric, default=None)
# If metric not found, find default value
if current is None:
expression = CallFunctionExpression('cpu')
metric = metrics_manager.find_primitive(host, expression)
if metric:
current = self._get_current_from_metric(metric, default=None)
if current is None:
current = 0.0
return current
def _get_current_for_communication(self, context, host, channel, metric, message, receiver=None):
"""
        Returns current (in A) of sending/receiving/listening for a message.
"""
if metric['type'] == 'metric':
metric_value = metric['value']
elif metric['type'] == 'algorithm':
algorithm_name = metric['name']
if not context.algorithms_resolver.has_algorithm(algorithm_name):
raise RuntimeException("Communication algorithm {0} undeclared.".format(algorithm_name))
link_quality = context.channels_manager.get_router().get_link_quality(channel.tag_name,
message.sender,
receiver)
alg = context.algorithms_resolver.get_algorithm(algorithm_name)
variables = {
'link_quality': link_quality,
alg['parameter']: message.expression,
}
metric_value = context.algorithms_resolver.calculate(context, host, algorithm_name, variables)
else:
return 0
# unit = metric['unit']
# exact current (only mA)
return metric_value / 1000.0
def _get_current_sending_for_link(self, context, channel, sender, message, receiver):
"""
Returns current (in A) of sending between sender and receiver.
"""
metric = context.channels_manager.get_router().get_link_parameter_value('sending_current',
channel.tag_name, sender, receiver)
if metric is None:
return 0.0
return self._get_current_for_communication(context, sender, channel, metric, message, receiver)
def _get_current_receiving_for_link(self, context, channel, sender, message, receiver):
"""
        Returns current (in A) of receiving between sender and receiver.
"""
metric = context.channels_manager.get_router().get_link_parameter_value('receiving_current',
channel.tag_name, sender, receiver)
if metric is None:
return 0.0
return self._get_current_for_communication(context, sender, channel, metric, message, receiver)
def _get_current_waiting_for_link(self, context, channel, sender, message, receiver):
"""
        Returns current (in A) of listening (waiting for a message) between sender and receiver.
"""
metric = context.channels_manager.get_router().get_link_parameter_value('listening_current',
channel.tag_name, sender, receiver)
if metric is None:
return 0.0
return self._get_current_for_communication(context, sender, channel, metric, message, receiver)
def get_hosts_consumptions(self, simulator, hosts, voltage):
"""
Calculates energy consumption of hosts.
The unit of voltage is V (Volt).
"""
def combine_waiting_time_tuples(unsorted_time_tuples):
"""
Returns list of sorted time tuples (from, to) without overlapping.
"""
unsorted_time_tuples.sort(key=lambda tt: tt[0])
t_tuples = []
if len(unsorted_time_tuples) > 0:
current_time_from = unsorted_time_tuples[0][0]
current_time_to = unsorted_time_tuples[0][1]
current_current = unsorted_time_tuples[0][2]
i = 0
while i < len(unsorted_time_tuples):
t = unsorted_time_tuples[i]
# inside
# |---10---|
# |-X--|
if t[0] >= current_time_from and t[1] <= current_time_to:
# if inside is more energy consuming
# |---10---|
# |-15-|
if t[2] > current_current:
if t[0] > current_time_from:
t_tuples.append((current_time_from, t[0], current_current))
if current_time_to > t[1]:
new_tuple = (t[1], current_time_to, current_current)
added = False
j = i
while j < len(unsorted_time_tuples):
tt = unsorted_time_tuples[j]
if tt[0] > t[1]:
unsorted_time_tuples.insert(j, new_tuple)
added = True
break
j += 1
if not added:
unsorted_time_tuples.append(new_tuple)
current_time_from = t[0]
current_time_to = t[1]
current_current = t[2]
# overlapping
# |---10--|
# |--X--|
if t[0] < current_time_to and t[1] > current_time_to:
# Add left |---10|
if t[0] > current_time_from:
t_tuples.append((current_time_from, t[0], current_current))
# Add new tuple for right |X--|
new_tuple = (current_time_to, t[1], t[2])
added = False
j = i
while j < len(unsorted_time_tuples):
tt = unsorted_time_tuples[j]
if tt[0] > current_time_to:
unsorted_time_tuples.insert(j, new_tuple)
added = True
break
j += 1
if not added:
unsorted_time_tuples.append(new_tuple)
# Update currents
                        current_time_from = t[0]
                        current_current = max(current_current, t[2])
# later
# |---X--|
# |-Y--|
if t[0] > current_time_to:
if current_time_to > current_time_from:
t_tuples.append((current_time_from, current_time_to, current_current))
current_time_from = t[0]
current_time_to = t[1]
current_current = t[2]
i += 1
if current_time_to > current_time_from:
t_tuples.append((current_time_from, current_time_to, current_current))
return t_tuples
def calculate_consumptions_to_remove(waiting_tuples, transmitting_tuples):
"""
Returns list of waiting tuples without times when host was transmitting.
"""
consumptions = {
'energy': 0.0,
'amp-hour': 0.0,
}
for wtt in waiting_tuples:
for ttt in transmitting_tuples:
# overlapping
if ttt[0] <= wtt[1] and ttt[1] >= wtt[0]:
time = 0
# inside
if ttt[0] >= wtt[0] and ttt[1] <= wtt[1]:
time = ttt[1] - ttt[0]
# oversize
if ttt[0] < wtt[0] and ttt[1] > wtt[1]:
time = wtt[1] - wtt[0]
# left side
if ttt[0] < wtt[0] and ttt[1] <= wtt[1]:
time = ttt[1] - wtt[0]
# right side
if ttt[0] >= wtt[0] and ttt[1] > wtt[1]:
time = wtt[1] - ttt[0]
consumptions['energy'] += voltage * wtt[2] * time / 1000.0
consumptions['amp-hour'] += wtt[2] * time / 1000.0 / 3600.0
return consumptions
metrics_manager = simulator.context.metrics_manager
timetraces = self.timeanalysis_module.get_timetraces(simulator)
# Clear results
hosts_consumption = {}
for h in hosts:
hosts_consumption[h] = {
'energy': 0.0,
'amp-hour': 0.0,
}
# Traverse timetraces
        # Additionally create a list of instruction finish times for each host
for timetrace in timetraces:
# Omit hosts that are not given in parameter
if timetrace.host not in hosts:
continue
energy_consumption = 0.0
amp_hour = 0.0
# Get expressions from timetrace
for simple_expression, time in timetrace.expressions_details:
current = self._get_current_for_expression(metrics_manager, timetrace.host, simple_expression)
# Calculate consumption
energy_consumption += voltage * current * time
                amp_hour += current * time / 3600.0
# print timetrace.host.name, 'cpu', unicode(simple_expression), current, time, voltage * time * current
hosts_consumption[timetrace.host]['energy'] += energy_consumption
hosts_consumption[timetrace.host]['amp-hour'] += amp_hour
# Traverse channel traces
# Look for times when host was waiting for message or sending a message
channels_traces = self.timeanalysis_module.get_all_channel_message_traces(simulator)
for channel in channels_traces:
channel_traces = channels_traces[channel]
host_channel_transmission_time_tuples = {} # Keeps list of tuples when host was transmitting message
host_channel_waiting_time_tuples = {} # Keeps list of tuples when host was waiting for message
# Traverse each trace and get the time of waiting
for trace in channel_traces:
# Add sending energy consumption for sender
if trace.sender in hosts:
current_sending = self._get_current_sending_for_link(simulator.context, channel,
trace.sender, trace.message,
trace.receiver)
energy_consumption = (voltage * current_sending * trace.sending_time / 1000.0)
amp_hour = current_sending * trace.sending_time / 1000.0 / 3600.0
hosts_consumption[trace.sender]['energy'] += energy_consumption
hosts_consumption[trace.sender]['amp-hour'] += amp_hour
if trace.sender not in host_channel_transmission_time_tuples:
host_channel_transmission_time_tuples[trace.sender] = []
host_channel_transmission_time_tuples[trace.sender].append((trace.sent_at,
trace.sent_at + trace.sending_time))
# Add time tuple for receiver if he is in asked hosts
if trace.receiver in hosts:
current_receiving = self._get_current_receiving_for_link(simulator.context, channel,
trace.sender, trace.message,
trace.receiver)
energy_consumption = voltage * current_receiving * trace.receiving_time / 1000.0
amp_hour = current_receiving * trace.receiving_time / 1000.0 / 3600.0
hosts_consumption[trace.receiver]['energy'] += energy_consumption
hosts_consumption[trace.receiver]['amp-hour'] += amp_hour
if trace.receiver not in host_channel_waiting_time_tuples:
host_channel_waiting_time_tuples[trace.receiver] = []
current_listening = self._get_current_waiting_for_link(simulator.context, channel,
trace.sender, trace.message, trace.receiver)
host_channel_waiting_time_tuples[trace.receiver].append((trace.started_waiting_at,
trace.started_receiving_at,
current_listening))
if trace.receiver not in host_channel_transmission_time_tuples:
host_channel_transmission_time_tuples[trace.receiver] = []
host_channel_transmission_time_tuples[trace.receiver].append((trace.started_receiving_at,
trace.started_receiving_at
+ trace.receiving_time))
for host in hosts:
# Handle waiting tuples
if host in host_channel_waiting_time_tuples:
waiting_tuples = combine_waiting_time_tuples(host_channel_waiting_time_tuples[host])
# Calculate consumption when host theoretically was waiting
# but in fact it was sending or receiving message
# This value will be removed from host's consumption
consumptions_to_remove = {
'energy': 0.0,
'amp-hour': 0.0,
}
if host in host_channel_transmission_time_tuples:
consumptions_to_remove = calculate_consumptions_to_remove(
waiting_tuples, host_channel_transmission_time_tuples[host])
# Add waiting consumption
for tf, tt, c in waiting_tuples:
energy_consumption = voltage * c * (tt-tf) / 1000.0
                    amp_hour = c * (tt-tf) / 1000.0 / 3600.0
hosts_consumption[host]['energy'] += energy_consumption
hosts_consumption[host]['amp-hour'] += amp_hour
# Remove surplus consumption
hosts_consumption[host]['energy'] -= consumptions_to_remove['energy']
hosts_consumption[host]['amp-hour'] -= consumptions_to_remove['amp-hour']
return hosts_consumption | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/module/energyanalysis/__init__.py | __init__.py |
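The per-segment arithmetic used throughout `get_hosts_consumptions` can be isolated into a small sketch (assuming, as the communication branches above do, times in milliseconds and currents in amperes; the helper name `consumption` is illustrative, not part of the module API):

```python
def consumption(voltage, current, time_ms):
    """Return (energy in J, charge in Ah) for one transmission segment."""
    seconds = time_ms / 1000.0
    energy = voltage * current * seconds     # E = U * I * t
    amp_hour = current * seconds / 3600.0    # Ah = I * t / 3600
    return energy, amp_hour

# e.g. a 2 s transmission drawing 20 mA at 3 V
print(consumption(3.0, 0.02, 2000.0))
```

Note that the CPU-expression branch multiplies `voltage * current * time` directly, so this sketch mirrors only the communication branches, which convert milliseconds to seconds.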
"""
@file       gui.py
@brief      GUI for the reputation analysis panel
@author     Damian Rusinek <[email protected]>
@date       created on 06-09-2013 by Damian Rusinek
@date       edited on 27-06-2014 by Katarzyna Mazur (visual improvements mainly)
"""
import os
import re
import wx
import wx.animate
import wx.lib.scrolledpanel as scrolled
import wx.lib.delayedresult
from aqopa.model import name_indexes
from aqopa.bin import gui as aqopa_gui
from aqopa.simulator.error import RuntimeException
from aqopa.gui.general_purpose_frame_gui import GeneralFrame
class SingleVersionPanel(wx.Panel):
"""
Frame presenting results for one simulation.
    Simulator may be retrieved from module,
because each module has its own simulator.
"""
def __init__(self, module, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
self.module = module
self.versionSimulator = {}
        self.hostChoosePanels = []      # Panels used to choose hosts for reputation results
        self.checkBoxInformations = {}  # Maps a host checkbox to (host name, index-range text ctrl)
        self.hostCheckBoxes = []        # List of checkboxes with hosts names used for hosts' selection
#################
# VERSION BOX
#################
versionBox = wx.StaticBox(self, label="Version")
versionsLabel = wx.StaticText(self, label="Choose Version To See\nAnalysis Results:")
self.versionsList = wx.ComboBox(self, style=wx.TE_READONLY)
self.versionsList.Bind(wx.EVT_COMBOBOX, self.OnVersionChanged)
versionBoxSizer = wx.StaticBoxSizer(versionBox, wx.HORIZONTAL)
versionBoxSizer.Add(versionsLabel, 0, wx.ALL | wx.ALIGN_CENTER, 5)
versionBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
versionBoxSizer.Add(self.versionsList, 1, wx.ALL | wx.ALIGN_CENTER, 5)
##################################
# ENERGY CONSUMPTION BOX
##################################
self.consumptionsBox = wx.StaticBox(self, label="Reputation results")
hostsBox, hostsBoxSizer = self._BuildHostsBoxAndSizer()
reputationBoxSizer = wx.StaticBoxSizer(self.consumptionsBox, wx.VERTICAL)
reputationHBoxSizer = wx.BoxSizer(wx.HORIZONTAL)
reputationHBoxSizer.Add(hostsBoxSizer, 1, wx.ALL | wx.EXPAND)
self.showReputationBtn = wx.Button(self, label="Show")
self.showReputationBtn.Bind(wx.EVT_BUTTON, self.OnShowReputationButtonClicked)
reputationBoxSizer.Add(reputationHBoxSizer, 0, wx.ALL | wx.EXPAND)
buttonsBoxSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsBoxSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
buttonsBoxSizer.Add(self.showReputationBtn, 0, wx.ALL | wx.ALIGN_RIGHT)
reputationBoxSizer.Add(buttonsBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(versionBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(reputationBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
self.SetSizer(sizer)
self.SetVersionsResultsVisibility(False)
#################
# REACTIONS
#################
def AddFinishedSimulation(self, simulator):
""" """
version = simulator.context.version
self.versionsList.Append(version.name)
self.versionSimulator[version.name] = simulator
def OnVersionChanged(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
self._BuildHostsChoosePanel(simulator)
self.SetVersionsResultsVisibility(True)
def OnShowReputationButtonClicked(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
hosts = self._GetSelectedHosts(simulator)
if len(hosts) == 0:
wx.MessageBox("Please select hosts.", 'Error', wx.OK | wx.ICON_ERROR)
return
self.ShowHostsReputation(simulator, hosts)
def RemoveAllSimulations(self):
""" """
self.versionsList.Clear()
self.versionsList.SetValue("")
self.versionSimulator = {}
self.hostChoosePanels = []
self.checkBoxInformations = {}
self.hostCheckBoxes = []
self.hostsBoxSizer.Clear(True)
self.SetVersionsResultsVisibility(False)
#################
# LAYOUT
#################
def _BuildHostsBoxAndSizer(self):
""" """
self.hostsBox = wx.StaticBox(self, label="Host(s)")
self.hostsBoxSizer = wx.StaticBoxSizer(self.hostsBox, wx.VERTICAL)
return self.hostsBox, self.hostsBoxSizer
def _BuildHostsChoosePanel(self, simulator):
""" """
for p in self.hostChoosePanels:
p.Destroy()
self.hostChoosePanels = []
self.checkBoxInformations = {}
self.hostCheckBoxes = []
self.hostsBoxSizer.Layout()
hosts = simulator.context.hosts
hostsIndexes = {}
for h in hosts:
name = h.original_name()
indexes = name_indexes(h.name)
index = indexes[0]
if name not in hostsIndexes or index > hostsIndexes[name]:
hostsIndexes[name] = index
for hostName in hostsIndexes:
panel = wx.Panel(self)
panelSizer = wx.BoxSizer(wx.HORIZONTAL)
ch = wx.CheckBox(panel, label=hostName, size=(120, 20))
textCtrl = wx.TextCtrl(panel, size=(200, 20))
textCtrl.SetValue("0")
rangeLabel = "Available range: 0"
if hostsIndexes[hostName] > 0:
rangeLabel += " - %d" % hostsIndexes[hostName]
maxLbl = wx.StaticText(panel, label=rangeLabel)
panelSizer.Add(ch, 0, wx.ALL | wx.ALIGN_CENTER)
            panelSizer.Add(wx.StaticText(panel), 1, wx.ALL | wx.ALIGN_RIGHT)
panelSizer.Add(textCtrl, 0, wx.ALL | wx.ALIGN_CENTER)
panelSizer.Add(maxLbl, 0, wx.ALL | wx.ALIGN_CENTER)
panel.SetSizer(panelSizer)
self.hostsBoxSizer.Add(panel, 1, wx.ALL)
self.checkBoxInformations[ch] = (hostName, textCtrl)
self.hostChoosePanels.append(panel)
self.hostCheckBoxes.append(ch)
self.hostsBoxSizer.Layout()
self.Layout()
def SetVersionsResultsVisibility(self, visible):
""" """
widgets = []
widgets.append(self.consumptionsBox)
widgets.append(self.hostsBox)
widgets.append(self.showReputationBtn)
for w in widgets:
if visible:
w.Show()
else:
w.Hide()
self.Layout()
#################
# STATISTICS
#################
def _GetSelectedHosts(self, simulator):
""" Returns list of hosts selected by user """
        def ValidateHostsRange(indexesRange):
            """ Checks the range syntax, e.g. 0,12,20-25,30 """
            return re.match(r'^\d+(-\d+)?(,\d+(-\d+)?)*$', indexesRange)
def GetIndexesFromRange(indexesRange):
""" Extracts numbers list of hosts from range text """
indexes = []
ranges = indexesRange.split(',')
for r in ranges:
parts = r.split('-')
if len(parts) == 1:
indexes.append(int(parts[0]))
else:
for i in range(int(parts[0]), int(parts[1])+1):
indexes.append(i)
return indexes
hosts = []
for ch in self.hostCheckBoxes:
if not ch.IsChecked():
continue
hostName, hostRangeTextCtrl = self.checkBoxInformations[ch]
indexesRange = hostRangeTextCtrl.GetValue()
if not ValidateHostsRange(indexesRange):
wx.MessageBox("Range '%s' for host '%s' is invalid. Valid example: 0,12,20-25,30."
% (indexesRange, hostName), 'Error', wx.OK | wx.ICON_ERROR)
break
else:
indexes = GetIndexesFromRange(indexesRange)
for h in simulator.context.hosts:
hostIndexes = name_indexes(h.name)
if h.original_name() == hostName and hostIndexes[0] in indexes:
hosts.append(h)
return hosts
def ShowHostsReputation(self, simulator, hosts):
""" """
lblText = ""
for h in hosts:
lblText += "%s: " % (h.name,)
error = h.get_finish_error()
if error is not None:
lblText += " (Not Finished - %s)" % error
host_vars = self.module.get_host_vars(h)
for var_name in host_vars:
lblText = "%s: %s" % (var_name, unicode(host_vars[var_name]))
lblText += "\n\n"
# create a new frame to show time analysis results on it
hostsReputationWindow = GeneralFrame(self, "Reputation Analysis Results", "Host's Reputation", "modules_results.png")
# create scrollable panel
hostsPanel = scrolled.ScrolledPanel(hostsReputationWindow)
# create informational label
lbl = wx.StaticText(hostsPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
hostsPanel.SetSizer(sizer)
hostsPanel.SetupScrolling(scroll_x=True)
hostsPanel.Layout()
# add panel on a window
hostsReputationWindow.AddPanel(hostsPanel)
# center window on a screen
hostsReputationWindow.CentreOnScreen()
# show the results on the new window
hostsReputationWindow.Show()
class MainResultsNotebook(wx.Notebook):
""" """
def __init__(self, module, *args, **kwargs):
wx.Notebook.__init__(self, *args, **kwargs)
il = wx.ImageList(24, 24)
singleVersionImg = il.Add(wx.Bitmap(self.CreatePath4Resource('reputation.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
self.module = module
self.oneVersionTab = SingleVersionPanel(self.module, self)
self.AddPage(self.oneVersionTab, "Single Version")
self.SetPageImage(0, singleVersionImg)
self.oneVersionTab.Layout()
def OnParsedModel(self):
""" """
self.oneVersionTab.RemoveAllSimulations()
def OnSimulationFinished(self, simulator):
""" """
self.oneVersionTab.AddFinishedSimulation(simulator)
def OnAllSimulationsFinished(self, simulators):
""" """
pass
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
        tmp = os.path.split(os.path.dirname(__file__))
        # get the parent directory portably instead of searching for '/'
        path = os.path.dirname(tmp[0])
        return os.path.join(path, 'bin', 'assets', resourceName)
class ModuleGui(wx.EvtHandler):
"""
Class used by GUI version of AQoPA.
"""
def __init__(self, module):
""" """
wx.EvtHandler.__init__(self)
self.module = module
self.mainResultNotebook = None
def get_name(self):
return "Reputation"
def get_configuration_panel(self, parent):
""" Returns WX panel with configuration controls. """
panel = wx.Panel(parent)
sizer = wx.BoxSizer(wx.VERTICAL)
text = wx.StaticText(panel, label="Module does not need to be configured.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
text = wx.StaticText(panel, label="All result options will be available after results are calculated.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
panel.SetSizer(sizer)
return panel
def get_results_panel(self, parent):
"""
Create main result panel existing from the beginning
which will be extended when versions' simulations are finished.
"""
self.mainResultNotebook = MainResultsNotebook(self.module, parent)
return self.mainResultNotebook
def on_finished_simulation(self, simulator):
""" """
self.mainResultNotebook.OnSimulationFinished(simulator)
def on_finished_all_simulations(self, simulators):
"""
Called once for all simulations after all of them are finished.
"""
self.mainResultNotebook.OnAllSimulationsFinished(simulators)
def on_parsed_model(self):
""" """
self.mainResultNotebook.OnParsedModel() | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/module/reputation/gui.py | gui.py |
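The host index-range syntax accepted by `_GetSelectedHosts` (e.g. `0,12,20-25,30`) can be sketched as a standalone parser; this mirrors the nested `GetIndexesFromRange` helper above:

```python
def parse_indexes(indexes_range):
    """Expand a range string like '0,12,20-25,30' into a list of ints."""
    indexes = []
    for part in indexes_range.split(','):
        bounds = part.split('-')
        if len(bounds) == 1:
            indexes.append(int(bounds[0]))
        else:
            # inclusive range, e.g. '20-25' -> 20, 21, ..., 25
            indexes.extend(range(int(bounds[0]), int(bounds[1]) + 1))
    return indexes

print(parse_indexes("0,12,20-25,30"))
# [0, 12, 20, 21, 22, 23, 24, 25, 30]
```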
from math import ceil
import random
from aqopa.module.reputation.algorithm import update_vars
from aqopa.simulator.state import Hook, ExecutionResult
from aqopa.model import AssignmentInstruction,\
CallFunctionInstruction, IfInstruction, WhileInstruction,\
CommunicationInstruction, CallFunctionExpression, TupleExpression, ComparisonExpression
from aqopa.module.timeanalysis.error import TimeSynchronizationException
from aqopa.simulator.error import RuntimeException
class PreInstructionHook(Hook):
"""
Execution hook executed before default core execution of each instruction.
Returns execution result.
"""
def __init__(self, module, simulator):
""" """
self.module = module
self.simulator = simulator
def _prepare_host_variables(self, host):
"""
Assign the init vars to host if not yet done.
"""
if host in self.module.reputation_vars:
return
self.module.reputation_vars[host] = {}
for v in self.module.init_vars:
self.module.reputation_vars[host][v] = self.module.init_vars[v]
def execute(self, context, **kwargs):
"""
"""
instruction = context.get_current_instruction()
if instruction.__class__ not in [AssignmentInstruction, CallFunctionInstruction,
IfInstruction, WhileInstruction]:
return
expression = None
if isinstance(instruction, AssignmentInstruction):
expression = instruction.expression
elif isinstance(instruction, CallFunctionInstruction):
expression = CallFunctionExpression(instruction.function_name,
instruction.arguments,
instruction.qop_arguments)
else:
expression = instruction.condition
self._update_vars(context, expression)
return ExecutionResult()
def _update_vars(self, context, expression):
"""
Update reputation variables in host according to current instruction.
"""
if isinstance(expression, CallFunctionExpression):
self._update_vars_simple(context, expression)
elif isinstance(expression, TupleExpression):
self._update_vars_tuple(context, expression)
elif isinstance(expression, ComparisonExpression):
self._update_vars_comparison(context, expression)
def _update_vars_simple(self, context, expression):
"""
Update reputation variables in host according to call function expression.
"""
# Firstly update vars according to nested call functions
for e in expression.arguments:
self._update_vars(context, e)
# Now update vars according to current call function
host = context.get_current_host()
self._prepare_host_variables(host)
metric_expression = CallFunctionExpression(
expression.function_name, expression.arguments, []
)
metric = context.metrics_manager\
.find_primitive(host, metric_expression)
if metric:
block = metric.block
algorithm_name = None
for i in range(0, len(block.service_params)):
sparam = block.service_params[i]
if sparam.service_name.lower() != "reputation":
continue
metric_type = sparam.param_name.lower()
metric_value = metric.service_arguments[i]
if metric_type == "algorithm":
algorithm_name = metric_value
break
if algorithm_name:
algorithm = self.module.get_algorithm(algorithm_name)
if len(expression.qop_arguments) != len(algorithm['parameters']):
                    raise RuntimeException('Reputation algorithm "%s" requires %d parameters, %d given.'
% (algorithm_name, len(algorithm['parameters']),
len(expression.qop_arguments)))
vars = self.module.get_host_vars(host)
i = 0
for qop_arg in expression.qop_arguments:
try:
val = float(qop_arg)
vars[algorithm['parameters'][i]] = val
i += 1
except ValueError:
                        raise RuntimeException('Reputation argument "%s" in expression "%s" is not a float number.'
% (qop_arg, unicode(expression)))
vars = update_vars(host, algorithm['instructions'], vars)
for var_name in self.module.init_vars:
self.module.set_reputation_var(host, var_name, vars[var_name])
def _update_vars_tuple(self, context, expression):
"""
Update reputation variables in host according to tuple expression.
"""
# Update vars according to tuple elements
for e in expression.elements:
self._update_vars(context, e)
def _update_vars_comparison(self, context, expression):
"""
Update reputation variables in host according to comparison expression.
"""
# Update vars according to compared expressions
self._update_vars(context, expression.left)
self._update_vars(context, expression.right) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/module/reputation/hook.py | hook.py |
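`update_vars` (imported from `.algorithm`, which is not shown here) walks instruction dictionaries of the shape built by the module's parser (`{'type': 'assignment', ...}` and `{'type': 'if', ...}`). A minimal evaluator over that format might look like the sketch below — a simplification that uses Python's `eval` and therefore does not handle the model's `&&`, `||`, `used()` or `exists()` condition operators:

```python
def run_instructions(instructions, vars):
    """Evaluate parsed reputation instructions against a variable dict."""
    for ins in instructions:
        if ins['type'] == 'assignment':
            # expressions are plain arithmetic strings, e.g. 'rep+bonus'
            vars[ins['identifier']] = eval(ins['expression'], {}, vars)
        elif ins['type'] == 'if':
            branch = ('true_instructions' if eval(ins['condition'], {}, vars)
                      else 'false_instructions')
            run_instructions(ins[branch], vars)
    return vars

program = [
    {'type': 'assignment', 'identifier': 'rep', 'expression': 'rep+bonus'},
    {'type': 'if', 'condition': 'rep>1.0',
     'true_instructions': [{'type': 'assignment', 'identifier': 'rep',
                            'expression': '1.0'}],
     'false_instructions': []},
]
print(run_instructions(program, {'rep': 0.9, 'bonus': 0.3}))
```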
from aqopa.model import MetricsServiceParam
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
class Builder():
def create_metrics_services_param_reputation(self, token):
"""
metrics_services_param : SQLPARAN REPUTATION COLON ALGORITHM SQRPARAN
"""
return MetricsServiceParam(token[2], token[4])
class MetricsParserExtension(LexYaccParserExtension):
"""
    Extension for parsing the reputation module's metrics
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
def word_reputation(self, t):
t.lexer.push_state('reputationprimhead')
return t
def word_algorithm(self, t):
t.lexer.pop_state()
return t
##########################################
# TOKENS
##########################################
##########################################
# RULES
##########################################
def rule_metrics_services_param_reputation(self, t):
"""
metrics_services_param : SQLPARAN REPUTATION COLON ALGORITHM SQRPARAN
"""
t[0] = self.builder.create_metrics_services_param_reputation(t)
def _extend(self):
""" """
self.parser.add_state('reputationprimhead', 'inclusive')
self.parser.add_reserved_word('reputation', 'REPUTATION', func=self.word_reputation, state='metricsprimhead',
case_sensitive=True)
self.parser.add_token('COLON', r':', states=['reputationprimhead'])
self.parser.add_reserved_word('algorithm', 'ALGORITHM', func=self.word_algorithm, state='reputationprimhead',
case_sensitive=True)
self.parser.add_rule(self.rule_metrics_services_param_reputation)
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing the reputation module's model
"""
def __init__(self, module):
LexYaccParserExtension.__init__(self)
self.module = module
self.builder = Builder()
self.open_blocks_cnt = 0
##########################################
# TOKENS
##########################################
def token_reputation(self, t):
r'reputation'
t.lexer.push_state('reputation')
return t
def token_blockopen(self, t):
r"\{"
self.open_blocks_cnt += 1
return t
def token_blockclose(self, t):
r"\}"
self.open_blocks_cnt -= 1
if self.open_blocks_cnt == 0:
t.lexer.pop_state()
return t
##########################################
# RULES
##########################################
def reputation_specification(self, t):
"""
module_specification : REPUTATION BLOCKOPEN reputation_init_vars reputation_algorithms BLOCKCLOSE
"""
pass
def reputation_algorithms(self, t):
"""
reputation_algorithms : reputation_algorithm
| reputation_algorithms reputation_algorithm
"""
pass
def reputation_algorithm(self, t):
"""
reputation_algorithm : ALGORITHM IDENTIFIER LPARAN identifiers_list RPARAN BLOCKOPEN reputation_instructions BLOCKCLOSE
"""
self.module.algorithms[t[2]] = {
'parameters': t[4],
'instructions': t[7],
}
def reputation_init_vars(self, t):
"""
reputation_init_vars : reputation_init_var
| reputation_init_vars reputation_init_var
"""
pass
def reputation_init_var(self, t):
"""
reputation_init_var : HASH IDENTIFIER EQUAL number SEMICOLON
"""
self.module.init_vars[t[2]] = t[4]
def reputation_instructions(self, t):
"""
reputation_instructions : reputation_instruction
| reputation_instructions reputation_instruction
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def reputation_instruction(self, t):
"""
reputation_instruction : reputation_instruction_assignment SEMICOLON
| reputation_instruction_conditional
| reputation_instruction_conditional SEMICOLON
"""
t[0] = t[1]
def reputation_instruction_assignment(self, t):
"""
reputation_instruction_assignment : IDENTIFIER EQUAL reputation_expression
"""
t[0] = {'type': 'assignment', 'identifier': t[1], 'expression': t[3]}
def reputation_expression_operations(self, t):
"""
reputation_expression : reputation_expression PLUS reputation_expression
| reputation_expression MINUS reputation_expression
| reputation_expression TIMES reputation_expression
| reputation_expression DIVIDE reputation_expression
"""
t[0] = ''.join(t[1:])
def reputation_expression_values(self, t):
"""
reputation_expression : number
| IDENTIFIER
"""
t[0] = str(t[1])
def reputation_expression_uminus(self, t):
"""
reputation_expression : MINUS reputation_expression %prec UMINUS
"""
        t[0] = ''.join(t[1:])
def reputation_expression_paran(self, t):
"""
reputation_expression : LPARAN reputation_expression RPARAN
"""
t[0] = ''.join(t[1:])
def reputation_instruction_conditional(self, t):
"""
reputation_instruction_conditional : IF LPARAN reputation_expression_conditional RPARAN BLOCKOPEN reputation_instructions BLOCKCLOSE
| IF LPARAN reputation_expression_conditional RPARAN BLOCKOPEN reputation_instructions BLOCKCLOSE ELSE BLOCKOPEN reputation_instructions BLOCKCLOSE
"""
if_instruction = {'type': 'if', 'condition': t[3], 'true_instructions': t[6], 'false_instructions': []}
if len(t) > 8:
if_instruction['false_instructions'] = t[10]
t[0] = if_instruction
def reputation_expression_conditional_comparison(self, t):
"""
reputation_expression_conditional : reputation_expression EQUAL EQUAL reputation_expression
| reputation_expression EXCLAMATION EQUAL reputation_expression
| reputation_expression GREATER reputation_expression
| reputation_expression GREATER EQUAL reputation_expression
| reputation_expression SMALLER reputation_expression
| reputation_expression SMALLER EQUAL reputation_expression
| reputation_expression_conditional AND AND reputation_expression_conditional
| reputation_expression_conditional OR OR reputation_expression_conditional
"""
t[0] = ''.join(t[1:])
def reputation_expression_conditional_operators(self, t):
"""
reputation_expression_conditional : USED LPARAN IDENTIFIER RPARAN
| EXISTS LPARAN IDENTIFIER RPARAN
"""
t[0] = ''.join(t[1:])
def reputation_expression_conditional_paran(self, t):
"""
reputation_expression_conditional : LPARAN reputation_expression_conditional RPARAN
"""
t[0] = ''.join(t[1:])
def _extend(self):
""" """
self.parser.add_state('reputation', 'inclusive')
self.parser.add_token('REPUTATION', func=self.token_reputation, states=['modules'])
self.parser.add_reserved_word('algorithm', 'ALGORITHM', state='reputation', case_sensitive=True)
self.parser.add_reserved_word('if', 'IF', state='reputation', case_sensitive=True)
self.parser.add_reserved_word('else', 'ELSE', state='reputation', case_sensitive=True)
self.parser.add_reserved_word('used', 'USED', state='reputation', case_sensitive=True)
self.parser.add_reserved_word('exists', 'EXISTS', state='reputation', case_sensitive=True)
self.parser.add_token('BLOCKOPEN', func=self.token_blockopen, states=['reputation'])
self.parser.add_token('BLOCKCLOSE', func=self.token_blockclose, states=['reputation'])
self.parser.add_token('PLUS', r'\+', states=['reputation'])
self.parser.add_token('MINUS', r'\-', states=['reputation'])
self.parser.add_token('TIMES', r'\*', states=['reputation'])
self.parser.add_token('DIVIDE', r'/', states=['reputation'])
self.parser.add_token('HASH', r"\#", states=['reputation'])
self.parser.add_token('GREATER', r'\>', states=['reputation'])
self.parser.add_token('SMALLER', r'\<', states=['reputation'])
self.parser.add_token('EXCLAMATION', r'\!', states=['reputation'])
self.parser.add_token('AND', r'\&', states=['reputation'])
self.parser.add_token('OR', r'\|', states=['reputation'])
self.parser.add_precedence(['PLUS', 'MINUS'], 'left')
self.parser.add_precedence(['TIMES', 'DIVIDE'], 'left')
self.parser.add_precedence(['UMINUS'], 'right')
# Rules
self.parser.add_rule(self.reputation_specification)
self.parser.add_rule(self.reputation_algorithms)
self.parser.add_rule(self.reputation_algorithm)
self.parser.add_rule(self.reputation_init_vars)
self.parser.add_rule(self.reputation_init_var)
self.parser.add_rule(self.reputation_instructions)
self.parser.add_rule(self.reputation_instruction)
self.parser.add_rule(self.reputation_instruction_assignment)
self.parser.add_rule(self.reputation_expression_operations)
self.parser.add_rule(self.reputation_expression_paran)
self.parser.add_rule(self.reputation_expression_uminus)
self.parser.add_rule(self.reputation_expression_values)
self.parser.add_rule(self.reputation_instruction_conditional)
self.parser.add_rule(self.reputation_expression_conditional_comparison)
self.parser.add_rule(self.reputation_expression_conditional_operators)
self.parser.add_rule(self.reputation_expression_conditional_paran)

# ===== aqopa/module/reputation/parser.py (AQoPA 0.9.5) =====
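The conditional-expression rules above capture the matched tokens back into a plain string with `''.join(t[1:])`; that text is only evaluated later, by the computational parser in `algorithm.py`. A minimal illustration of this parse-now / evaluate-later split (the `evaluate` helper below is a hypothetical toy, not the AQoPA API):

```python
import re

# Parse phase: keep the condition as text, exactly as the
# ''.join(t[1:]) rules do.
stored_condition = ''.join(['rep', '>=', '50'])   # -> 'rep>=50'

def evaluate(condition, variables):
    # Toy evaluator standing in for compute_conditional_expression():
    # split "name op number" and apply the operator.
    m = re.match(r'([_a-zA-Z]\w*)(==|!=|>=|<=|>|<)(\d+(?:\.\d+)?)', condition)
    name, op, rhs = m.group(1), m.group(2), float(m.group(3))
    lhs = variables[name]
    return {'==': lhs == rhs, '!=': lhs != rhs, '>=': lhs >= rhs,
            '<=': lhs <= rhs, '>': lhs > rhs, '<': lhs < rhs}[op]

result = evaluate(stored_condition, {'rep': 75.0})
```

Deferring evaluation this way lets the same stored expression be re-run against each host's current variable values.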
from copy import copy, deepcopy
from aqopa import module
from aqopa.module.reputation.hook import PreInstructionHook
from aqopa.simulator.state import HOOK_TYPE_SIMULATION_FINISHED, HOOK_TYPE_PRE_INSTRUCTION_EXECUTION
from aqopa.model import CallFunctionExpression
from .parser import MetricsParserExtension, ModelParserExtension
from .gui import ModuleGui
from .console import PrintResultsHook
class Module(module.Module):
"""
"""
def __init__(self):
""" """
self.guis = {}             # keyed by simulator, hence a dict
self.init_vars = {}        # keyed by variable name, hence a dict
self.algorithms = {}       # keyed by algorithm name, hence a dict
self.reputation_vars = {}  # keyed by host, hence a dict
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def extend_metrics_parser(self, parser):
"""
Overridden
"""
parser.add_extension(MetricsParserExtension())
return parser
def extend_model_parser(self, parser):
"""
Overridden
"""
parser.add_extension(ModelParserExtension(self))
return parser
def _install(self, simulator):
"""
"""
hook = PreInstructionHook(self, simulator)
simulator.register_hook(HOOK_TYPE_PRE_INSTRUCTION_EXECUTION, hook)
return simulator
def install_console(self, simulator):
""" Install module for console simulation """
self._install(simulator)
hook = PrintResultsHook(self, simulator)
simulator.register_hook(HOOK_TYPE_SIMULATION_FINISHED, hook)
return simulator
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def get_algorithm(self, name):
"""
Returns algorithm by name.
"""
return deepcopy(self.algorithms[name])
def set_reputation_var(self, host, var, val):
""" """
if host not in self.reputation_vars:
self.reputation_vars[host] = {}
self.reputation_vars[host][var] = val
def get_host_vars(self, host):
"""
"""
if host not in self.reputation_vars:
return {}
return copy(self.reputation_vars[host])

# ===== aqopa/module/reputation/__init__.py (AQoPA 0.9.5) =====
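`set_reputation_var` and `get_host_vars` above keep a per-host dictionary of reputation variables and hand out copies so callers cannot mutate the module's state. A self-contained sketch of the same pattern (the class and method names below are illustrative, not part of AQoPA):

```python
from copy import copy

class ReputationStore:
    """Stand-alone version of the per-host variable store kept by Module."""

    def __init__(self):
        self.reputation_vars = {}  # host -> {var: value}

    def set_var(self, host, var, val):
        # setdefault creates the inner dict on first use, like the
        # "if host not in self.reputation_vars" guard above
        self.reputation_vars.setdefault(host, {})[var] = val

    def get_host_vars(self, host):
        # Return a shallow copy so callers cannot mutate the store.
        return copy(self.reputation_vars.get(host, {}))

store = ReputationStore()
store.set_var('Server.0', 'rep', 75.0)
snapshot = store.get_host_vars('Server.0')
snapshot['rep'] = 0.0  # mutates only the copy, not the store
```

The shallow copy is enough here because the stored values are plain numbers; nested mutable values would need `deepcopy`.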
from aqopa.model import IfInstruction, WhileInstruction, AssignmentInstruction
from aqopa.model.parser.lex_yacc import LexYaccParser, LexYaccParserExtension
from aqopa.simulator.state import Process
class ComputionalParserExtension(LexYaccParserExtension):
def __init__(self):
LexYaccParserExtension.__init__(self)
self.open_blocks_cnt = 0
##########################################
# TOKENS
##########################################
def token_identifier(self, t):
r'[_a-zA-Z][_a-zA-Z0-9]*'
words = self.parser.get_reserved_words()
states_stack = []
states_stack.extend(t.lexer.lexstatestack)
states_stack.append(t.lexer.current_state())
i = len(states_stack)-1
while i >= 0:
state = states_stack[i]
if state in words:
state_words = words[state]
for state_word in state_words:
tvalue = t.value
state_word_value = state_word
word_tuple = state_words[state_word]
# if not case sensitive
if not word_tuple[2]:
tvalue = tvalue.lower()
state_word_value = state_word_value.lower()
if tvalue == state_word_value:
# If function exists
if word_tuple[1]:
t = word_tuple[1](t)
t.type = word_tuple[0]
break
i -= 1
return t
def token_float(self, t):
r'([1-9][0-9]*\.[0-9]+)|(0\.[0-9]+)'
t.value = float(t.value)
return t
def token_integer(self, t):
r'0|[1-9][0-9]*'
t.value = int(t.value)
return t
def token_blockopen(self, t):
r"\{"
self.open_blocks_cnt += 1
return t
def token_blockclose(self, t):
r"\}"
self.open_blocks_cnt -= 1
if self.open_blocks_cnt == 0:
t.lexer.pop_state()
return t
##########################################
# RULES
##########################################
def rule_number(self, t):
"""
number : FLOAT
| INTEGER
"""
t[0] = t[1]
def reputation_expression_operations(self, t):
"""
reputation_expression : reputation_expression PLUS reputation_expression
| reputation_expression MINUS reputation_expression
| reputation_expression TIMES reputation_expression
| reputation_expression DIVIDE reputation_expression
"""
if t[2] == '+':
t[0] = t[1] + t[3]
elif t[2] == '-':
t[0] = t[1] - t[3]
elif t[2] == '/':
t[0] = t[1] / t[3]
elif t[2] == '*':
t[0] = t[1] * t[3]
def reputation_expression_values(self, t):
"""
reputation_expression : number
| IDENTIFIER
"""
try:
t[0] = float(t[1])
except ValueError:
t[0] = self.parser.vars[t[1]]
def reputation_expression_uminus(self, t):
"""
reputation_expression : MINUS reputation_expression %prec UMINUS
"""
t[0] = - t[2]
def reputation_expression_paran(self, t):
"""
reputation_expression : LPARAN reputation_expression RPARAN
"""
t[0] = t[2]
def reputation_expression_conditional_comparison(self, t):
"""
reputation_expression_conditional : reputation_expression EQUAL EQUAL reputation_expression
| reputation_expression EXCLAMATION EQUAL reputation_expression
| reputation_expression GREATER reputation_expression
| reputation_expression GREATER EQUAL reputation_expression
| reputation_expression SMALLER reputation_expression
| reputation_expression SMALLER EQUAL reputation_expression
| reputation_expression AND AND reputation_expression
| reputation_expression OR OR reputation_expression
"""
if t[3] == '=':
if t[2] == '=':
t[0] = t[1] == t[4]
elif t[2] == '!':
t[0] = t[1] != t[4]
elif t[2] == '>':
t[0] = t[1] >= t[4]
elif t[2] == '<':
t[0] = t[1] <= t[4]
elif t[2] == '>':
t[0] = t[1] > t[3]
elif t[2] == '<':
t[0] = t[1] < t[3]
elif t[2] == '&' and t[3] == '&':
t[0] = t[1] and t[4]
elif t[2] == '|' and t[3] == '|':
t[0] = t[1] or t[4]
else:
t[0] = False
def reputation_expression_conditional_operators(self, t):
"""
reputation_expression_conditional : USED LPARAN IDENTIFIER RPARAN
| EXISTS LPARAN IDENTIFIER RPARAN
"""
if t[1].lower() == 'used':
t[0] = self.parser.process_uses_variable(t[3])
else:
t[0] = self.parser.host.has_variable(t[3])
def reputation_expression_conditional_paran(self, t):
"""
reputation_expression_conditional : LPARAN reputation_expression_conditional RPARAN
"""
t[0] = t[2]
def _extend(self):
""" """
self.parser.add_reserved_word('if', 'IF', case_sensitive=True)
self.parser.add_reserved_word('else', 'ELSE', case_sensitive=True)
self.parser.add_reserved_word('used', 'USED', case_sensitive=True)
self.parser.add_reserved_word('exists', 'EXISTS', case_sensitive=True)
self.parser.add_token('FLOAT', func=self.token_float)
self.parser.add_token('INTEGER', func=self.token_integer)
self.parser.add_token('IDENTIFIER', func=self.token_identifier)
self.parser.add_token('SEMICOLON', r';')
self.parser.add_token('EQUAL', r'\=')
self.parser.add_token('LPARAN', r'\(')
self.parser.add_token('RPARAN', r'\)')
self.parser.add_token('SQLPARAN', r'\[')
self.parser.add_token('SQRPARAN', r'\]')
self.parser.add_token('BLOCKOPEN', func=self.token_blockopen)
self.parser.add_token('BLOCKCLOSE', func=self.token_blockclose)
self.parser.add_token('PLUS', r'\+')
self.parser.add_token('MINUS', r'\-')
self.parser.add_token('TIMES', r'\*')
self.parser.add_token('DIVIDE', r'/')
self.parser.add_token('GREATER', r'\>')
self.parser.add_token('SMALLER', r'\<')
self.parser.add_token('EXCLAMATION', r'\!')
self.parser.add_token('AND', r'\&')
self.parser.add_token('OR', r'\|')
self.parser.add_precedence(['PLUS', 'MINUS'], 'left')
self.parser.add_precedence(['TIMES', 'DIVIDE'], 'left')
self.parser.add_precedence(['UMINUS'], 'right')
# Rules
self.parser.add_rule(self.rule_number)
self.parser.add_rule(self.reputation_expression_operations)
self.parser.add_rule(self.reputation_expression_paran)
self.parser.add_rule(self.reputation_expression_uminus)
self.parser.add_rule(self.reputation_expression_values)
self.parser.add_rule(self.reputation_expression_conditional_comparison)
self.parser.add_rule(self.reputation_expression_conditional_operators)
self.parser.add_rule(self.reputation_expression_conditional_paran)
class ComputionalParser(LexYaccParser):
_conditional_expression_instance = None
_simple_expression_instance = None
def __init__(self):
self.host = None
self.vars = {}
LexYaccParser.__init__(self)
def restart(self):
self.yaccer.restart()
def parse_expression(self, s, host, rep_vars):
self.host = host
self.vars = rep_vars
return self.yaccer.parse(input=s, lexer=self.lexer)
def process_uses_variable(self, variable_name):
""" Returns True if variable with given name is defined anywhere in process """
index = 0
# work on a copy so that the extend() calls below do not mutate
# the process's actual instruction list
instructions = list(self.host.get_current_process().instructions_list)
while index < len(instructions):
instruction = instructions[index]
if isinstance(instruction, IfInstruction):
instructions.extend(instruction.true_instructions)
instructions.extend(instruction.false_instructions)
elif isinstance(instruction, WhileInstruction):
instructions.extend(instruction.instructions)
elif isinstance(instruction, AssignmentInstruction):
if instruction.variable_name == variable_name:
return True
index += 1
return False
@staticmethod
def conditional_expression_instance():
if ComputionalParser._conditional_expression_instance is None:
parser_ext = ComputionalParserExtension()
ComputionalParser._conditional_expression_instance = ComputionalParser()
ComputionalParser._conditional_expression_instance.add_extension(parser_ext)
ComputionalParser._conditional_expression_instance.start_symbol = 'reputation_expression_conditional'
ComputionalParser._conditional_expression_instance.build()
else:
ComputionalParser._conditional_expression_instance.restart()
return ComputionalParser._conditional_expression_instance
@staticmethod
def simple_expression_instance():
if ComputionalParser._simple_expression_instance is None:
parser_ext = ComputionalParserExtension()
ComputionalParser._simple_expression_instance = ComputionalParser()
ComputionalParser._simple_expression_instance.add_extension(parser_ext)
ComputionalParser._simple_expression_instance.start_symbol = 'reputation_expression'
ComputionalParser._simple_expression_instance.build()
else:
ComputionalParser._simple_expression_instance.restart()
return ComputionalParser._simple_expression_instance
def compute_conditional_expression(host, expression, vars):
"""
Computes the result of conditional expression.
"""
parser = ComputionalParser.conditional_expression_instance()
return parser.parse_expression(expression, host, vars)
def compute_simple_expression(host, expression, vars):
"""
Computes the result of simple expression.
"""
parser = ComputionalParser.simple_expression_instance()
return parser.parse_expression(expression, host, vars)
def update_vars(host, instructions, vars):
"""
Update vars according to the instructions of the algorithm.
"""
index = 0
while index < len(instructions):
instruction = instructions[index]
if instruction['type'] == 'assignment':
vars[instruction['identifier']] = compute_simple_expression(host, instruction['expression'], vars)
elif instruction['type'] == 'if':
condition = compute_conditional_expression(host, instruction['condition'], vars)
if condition:
if_instructions = instruction['true_instructions']
else:
if_instructions = instruction['false_instructions']
if_index = index + 1
for i in if_instructions:
instructions.insert(if_index, i)
if_index += 1
index += 1
return vars

# ===== aqopa/module/reputation/algorithm.py (AQoPA 0.9.5) =====
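`update_vars` above executes an algorithm iteratively by splicing the chosen branch of an `if` instruction into the instruction list, rather than recursing into the branch. A self-contained sketch of that splicing loop (with plain callables standing in for the parsed expressions):

```python
def run_instructions(instructions, vars):
    # Walk the instruction list; the taken branch of an 'if' is
    # spliced in right after it, mirroring update_vars() above.
    instructions = list(instructions)  # copy so splicing is safe
    index = 0
    while index < len(instructions):
        ins = instructions[index]
        if ins['type'] == 'assignment':
            vars[ins['identifier']] = ins['expression'](vars)
        elif ins['type'] == 'if':
            branch = (ins['true_instructions'] if ins['condition'](vars)
                      else ins['false_instructions'])
            instructions[index + 1:index + 1] = branch  # splice branch in
        index += 1
    return vars

program = [
    {'type': 'assignment', 'identifier': 'rep', 'expression': lambda v: 50.0},
    {'type': 'if',
     'condition': lambda v: v['rep'] >= 50,
     'true_instructions': [
         {'type': 'assignment', 'identifier': 'trusted',
          'expression': lambda v: 1.0}],
     'false_instructions': [
         {'type': 'assignment', 'identifier': 'trusted',
          'expression': lambda v: 0.0}]},
]
result = run_instructions(program, {})
```

Splicing keeps the interpreter a single flat loop, at the cost of growing the working instruction list while the algorithm runs.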
import os
import re
import wx
import wx.animate
import wx.lib.scrolledpanel as scrolled
import wx.lib.delayedresult
from aqopa.model import name_indexes
from aqopa.bin import gui as aqopa_gui
from aqopa.simulator.error import RuntimeException
from aqopa.gui.combo_check_box import ComboCheckBox
from aqopa.gui.general_purpose_frame_gui import GeneralFrame
"""
@file timeanalysis.py
@brief GUI for the main time analysis results, where we can see actual analysis results [time]
@author Damian Rusinek <[email protected]>
@date created on 05-09-2013 by Damian Rusinek
@date edited on 25-06-2014 by Katarzyna Mazur (visual improvements mainly)
"""
class SingleVersionPanel(wx.Panel):
"""
Frame presenting results for one simulation.
Simulator may be retrieved from the module,
because each module has its own simulator.
"""
def __init__(self, module, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
self.module = module
self.versionSimulator = {}
self.hostChoosePanels = [] # Panels used to choose hosts for times results
self.checkBoxInformations = {}  # Maps host checkbox to (host name, index-range text ctrl)
self.hostCheckBoxes = [] # List of checkboxes with hosts names used for hosts' selection
#################
# VERSION BOX
#################
versionBox = wx.StaticBox(self, label="Version")
versionsLabel = wx.StaticText(self, label="Choose Version To See\nAnalysis Results:")
self.versionsList = wx.ComboBox(self, style=wx.TE_READONLY)
self.versionsList.Bind(wx.EVT_COMBOBOX, self.OnVersionChanged)
versionBoxSizer = wx.StaticBoxSizer(versionBox, wx.HORIZONTAL)
versionBoxSizer.Add(versionsLabel, 0, wx.ALL | wx.ALIGN_CENTER, 5)
versionBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
versionBoxSizer.Add(self.versionsList, 1, wx.ALL | wx.ALIGN_CENTER, 5)
#################
# TOTAL TIME BOX
#################
self.totalTimeBox = wx.StaticBox(self, label="Total time")
self.totalTimeLabel = wx.StaticText(self, label="---")
totalTimeBoxSizer = wx.StaticBoxSizer(self.totalTimeBox, wx.VERTICAL)
totalTimeBoxSizer.Add(self.totalTimeLabel, 1, wx.ALL | wx.ALIGN_CENTER, 5)
#################
# TIMES BOX
#################
self.timesBox = wx.StaticBox(self, label="Times")
operationBox, operationBoxSizer = self._BuildOperationsBoxAndSizer()
hostsBox, hostsBoxSizer = self._BuildHostsBoxAndSizer()
timesBoxSizer = wx.StaticBoxSizer(self.timesBox, wx.VERTICAL)
timesHBoxSizer = wx.BoxSizer(wx.HORIZONTAL)
timesHBoxSizer.Add(operationBoxSizer, 0, wx.ALL | wx.EXPAND)
timesHBoxSizer.Add(hostsBoxSizer, 1, wx.ALL | wx.EXPAND)
self.showTimeBtn = wx.Button(self, label="Show")
self.showTimeBtn.Bind(wx.EVT_BUTTON, self.OnShowTimeButtonClicked)
timesBoxSizer.Add(timesHBoxSizer, 0, wx.ALL | wx.EXPAND)
timesBoxSizer.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
timesBoxSizer.Add(self.showTimeBtn, 0, wx.ALIGN_RIGHT | wx.ALL, 5)
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(versionBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(totalTimeBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(timesBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
self.SetSizer(sizer)
self.SetVersionsResultsVisibility(False)
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
# strip the last path component to get the package root
# (portable alternative to searching for '/' by hand)
path = os.path.dirname(tmp[0])
return os.path.join(path, 'bin', 'assets', resourceName)
#################
# REACTIONS
#################
def AddFinishedSimulation(self, simulator):
""" """
version = simulator.context.version
self.versionsList.Append(version.name)
self.versionSimulator[version.name] = simulator
def OnVersionChanged(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
totalTime = self.GetTotalTime(simulator)
self.totalTimeLabel.SetLabel("%.6f s" % totalTime)
self._BuildHostsChoosePanel(simulator)
self.SetVersionsResultsVisibility(True)
def OnShowTimeButtonClicked(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
hosts = self._GetSelectedHosts(simulator)
if len(hosts) == 0:
wx.MessageBox("Please select hosts.", 'Error', wx.OK | wx.ICON_ERROR)
return
if self.oneTimeRB.GetValue():
self.ShowHostsTimes(simulator, hosts)
elif self.avgTimeRB.GetValue():
self.ShowAverageHostsTime(simulator, hosts)
elif self.minTimeRB.GetValue():
self.ShowMinimalHostsTime(simulator, hosts)
elif self.maxTimeRB.GetValue():
self.ShowMaximalHostsTime(simulator, hosts)
def RemoveAllSimulations(self):
""" """
self.versionsList.Clear()
self.versionsList.SetValue("")
self.versionSimulator = {}
self.hostChoosePanels = []
self.checkBoxInformations = {}
self.hostCheckBoxes = []
self.hostsBoxSizer.Clear(True)
self.SetVersionsResultsVisibility(False)
#################
# LAYOUT
#################
def _BuildOperationsBoxAndSizer(self):
""" """
self.operationBox = wx.StaticBox(self, label="Operation")
self.oneTimeRB = wx.RadioButton(self, label="Total host's time")
self.avgTimeRB = wx.RadioButton(self, label="Average hosts' time")
self.minTimeRB = wx.RadioButton(self, label="Minimal hosts' time")
self.maxTimeRB = wx.RadioButton(self, label="Maximal hosts' time")
operationBoxSizer = wx.StaticBoxSizer(self.operationBox, wx.VERTICAL)
operationBoxSizer.Add(self.oneTimeRB, 0, wx.ALL)
operationBoxSizer.Add(self.avgTimeRB, 0, wx.ALL)
operationBoxSizer.Add(self.minTimeRB, 0, wx.ALL)
operationBoxSizer.Add(self.maxTimeRB, 0, wx.ALL)
return self.operationBox, operationBoxSizer
def _BuildHostsBoxAndSizer(self):
""" """
self.hostsBox = wx.StaticBox(self, label="Host(s)")
self.hostsBoxSizer = wx.StaticBoxSizer(self.hostsBox, wx.VERTICAL)
return self.hostsBox, self.hostsBoxSizer
def _BuildHostsChoosePanel(self, simulator):
""" """
for p in self.hostChoosePanels:
p.Destroy()
self.hostChoosePanels = []
self.checkBoxInformations = {}
self.hostCheckBoxes = []
self.hostsBoxSizer.Layout()
hosts = simulator.context.hosts
hostsIndexes = {}
for h in hosts:
name = h.original_name()
indexes = name_indexes(h.name)
index = indexes[0]
if name not in hostsIndexes or index > hostsIndexes[name]:
hostsIndexes[name] = index
for hostName in hostsIndexes:
panel = wx.Panel(self)
panelSizer = wx.BoxSizer(wx.HORIZONTAL)
ch = wx.CheckBox(panel, label=hostName, size=(120, 20))
textCtrl = wx.TextCtrl(panel, size=(300, 20))
textCtrl.SetValue("0")
rangeLabel = "Available range: 0"
if hostsIndexes[hostName] > 0:
rangeLabel += " - %d" % hostsIndexes[hostName]
maxLbl = wx.StaticText(panel, label=rangeLabel)
panelSizer.Add(ch, 0, wx.ALL | wx.ALIGN_CENTER)
panelSizer.Add(textCtrl, 1, wx.ALL | wx.ALIGN_CENTER)
panelSizer.Add(maxLbl, 0, wx.ALL | wx.ALIGN_CENTER)
panel.SetSizer(panelSizer)
self.hostsBoxSizer.Add(panel, 1, wx.ALL)
self.checkBoxInformations[ch] = (hostName, textCtrl)
self.hostChoosePanels.append(panel)
self.hostCheckBoxes.append(ch)
self.hostsBoxSizer.Layout()
self.Layout()
def SetVersionsResultsVisibility(self, visible):
""" """
widgets = []
widgets.append(self.timesBox)
widgets.append(self.totalTimeBox)
widgets.append(self.totalTimeLabel)
widgets.append(self.operationBox)
widgets.append(self.oneTimeRB)
widgets.append(self.avgTimeRB)
widgets.append(self.minTimeRB)
widgets.append(self.maxTimeRB)
widgets.append(self.hostsBox)
widgets.append(self.showTimeBtn)
for w in widgets:
if visible:
w.Show()
else:
w.Hide()
self.Layout()
#################
# STATISTICS
#################
def _GetSelectedHosts(self, simulator):
""" Returns list of hosts selected by user """
def ValidateHostsRange(indexesRange):
""" Checks the text is a comma-separated list of indexes or ranges, e.g. 0,12,20-25,30 """
return re.match(r'^\d+(-\d+)?(,\d+(-\d+)?)*$', indexesRange)
def GetIndexesFromRange(indexesRange):
""" Extracts numbers list of hosts from range text """
indexes = []
ranges = indexesRange.split(',')
for r in ranges:
parts = r.split('-')
if len(parts) == 1:
indexes.append(int(parts[0]))
else:
for i in range(int(parts[0]), int(parts[1])+1):
indexes.append(i)
return indexes
hosts = []
for ch in self.hostCheckBoxes:
if not ch.IsChecked():
continue
hostName, hostRangeTextCtrl = self.checkBoxInformations[ch]
indexesRange = hostRangeTextCtrl.GetValue()
if not ValidateHostsRange(indexesRange):
wx.MessageBox("Range '%s' for host '%s' is invalid. Valid example: 0,12,20-25,30."
% (indexesRange, hostName), 'Error', wx.OK | wx.ICON_ERROR)
break
else:
indexes = GetIndexesFromRange(indexesRange)
for h in simulator.context.hosts:
hostIndexes = name_indexes(h.name)
if h.original_name() == hostName and hostIndexes[0] in indexes:
hosts.append(h)
return hosts
def GetTotalTime(self, simulator):
""" Return total time of simulated version. """
totalTime = 0
hosts = simulator.context.hosts
for h in hosts:
t = self.module.get_current_time(simulator, h)
if t > totalTime:
totalTime = t
return totalTime
def ShowHostsTimes(self, simulator, hosts):
""" """
lblText = ""
for h in hosts:
lblText += "%s: %.6f s" % (h.name,self.module.get_current_time(simulator, h))
error = h.get_finish_error()
if error is not None:
lblText += " (Not Finished - %s)" % error
lblText += "\n\n"
# create a new frame to show time analysis results on it
hostsTimeWindow = GeneralFrame(self, "Time Analysis Results", "Host's Time", "modules_results.png")
# create scrollable panel
hostsPanel = scrolled.ScrolledPanel(hostsTimeWindow)
# create informational label
lbl = wx.StaticText(hostsPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 1, wx.ALL | wx.EXPAND, 5)
hostsPanel.SetSizer(sizer)
hostsPanel.SetupScrolling(scroll_x=True)
hostsPanel.Layout()
# add panel on a window
hostsTimeWindow.AddPanel(hostsPanel)
# center window on a screen
hostsTimeWindow.CentreOnScreen()
# show the results on the new window
hostsTimeWindow.Show()
def ShowAverageHostsTime(self, simulator, hosts):
"""
@brief shows average time [in ms] -
present results in a new window (frame, actually)
"""
def GetVal(simulator, hosts):
# 'total' avoids shadowing the built-in sum()
total = 0.0
n = len(hosts)
for h in hosts:
total += self.module.get_current_time(simulator, h)
return total / float(n)
# create a new frame to show time analysis results on it
avgTimeWindow = GeneralFrame(self, "Time Analysis Results", "Average Time", "modules_results.png")
# create scrollable panel
avgPanel = scrolled.ScrolledPanel(avgTimeWindow)
# get average time and make a label out of it
avg = GetVal(simulator, hosts)
lblText = "Average: %.6f s" % avg
lbl = wx.StaticText(avgPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
avgPanel.SetSizer(sizer)
avgPanel.Layout()
# add panel on a window
avgTimeWindow.AddPanel(avgPanel)
# center window on a screen
avgTimeWindow.CentreOnScreen()
# show the results on the new window
avgTimeWindow.Show()
def ShowMinimalHostsTime(self, simulator, hosts):
"""
@brief shows minimum hosts time
"""
def GetVal(simulator, hosts):
val = None
for h in hosts:
t = self.module.get_current_time(simulator, h)
if val is None or t < val:
val = t
return val
val = GetVal(simulator, hosts)
lblText = "Minimum: %.6f s" % val
# create a new frame to show time analysis results on it
minTimeWindow = GeneralFrame(self, "Time Analysis Results", "Minimal Time", "modules_results.png")
# create scrollable panel
minPanel = scrolled.ScrolledPanel(minTimeWindow)
# create informational label
lbl = wx.StaticText(minPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
minPanel.SetSizer(sizer)
minPanel.Layout()
# add panel on a window
minTimeWindow.AddPanel(minPanel)
# center window on a screen
minTimeWindow.CentreOnScreen()
# show the results on the new window
minTimeWindow.Show()
def ShowMaximalHostsTime(self, simulator, hosts):
""" """
def GetVal(simulator, hosts):
val = 0.0
for h in hosts:
t = self.module.get_current_time(simulator, h)
if t > val:
val = t
return val
val = GetVal(simulator, hosts)
lblText = "Maximum: %.6f s" % val
# create a new frame to show time analysis results on it
maxTimeWindow = GeneralFrame(self, "Time Analysis Results", "Maximal Time", "modules_results.png")
# create scrollable panel
maxPanel = scrolled.ScrolledPanel(maxTimeWindow)
# create informational label
lbl = wx.StaticText(maxPanel, label=lblText)
# sizer to align gui elements properly
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(lbl, 0, wx.ALL | wx.EXPAND, 5)
maxPanel.SetSizer(sizer)
maxPanel.Layout()
# add panel on a window
maxTimeWindow.AddPanel(maxPanel)
# center window on a screen
maxTimeWindow.CentreOnScreen()
# show the results on the new window
maxTimeWindow.Show()
#TIME_TYPE_MAX = 1
#TIME_TYPE_AVG = 2
#TIME_TYPE_TOTAL = 3
#
#class VersionsChartsPanel(wx.Panel):
#
# def __init__(self, module, *args, **kwargs):
# wx.Panel.__init__(self, *args, **kwargs)
#
# self.module = module
#
# self.simulators = []
# self.chartPanel = None
#
# sizer = wx.BoxSizer(wx.VERTICAL)
#
# self.chartTypeBox = wx.StaticBox(self, label="Chart Type")
# self.chartTypeBoxSizer = wx.StaticBoxSizer(self.chartTypeBox, wx.HORIZONTAL)
#
# self.totalTimeOnRepetitionBtn = wx.Button(self, label="T_Total / N")
# self.avgTimeOnRepetitionBtn = wx.Button(self, label="T_AVG / N")
# self.totalTimeOnVersionsBtn = wx.Button(self, label="T_Total / Version")
# self.totalTimeOnMetricsBtn = wx.Button(self, label="T_Total / M")
# self.avgTimeOnMetricsBtn = wx.Button(self, label="T_AVG / M")
#
# self.totalTimeOnRepetitionBtn.Bind(wx.EVT_BUTTON, self.OnTimeTotalOnRepetitionBtnClicked)
# self.avgTimeOnRepetitionBtn.Bind(wx.EVT_BUTTON, self.OnTimeAvgOnRepetitionBtnClicked)
# self.totalTimeOnVersionsBtn.Bind(wx.EVT_BUTTON, self.OnTimeTotalOnVersionsBtnClicked)
# self.totalTimeOnMetricsBtn.Bind(wx.EVT_BUTTON, self.OnTimeTotalOnMetricsBtnClicked)
# self.avgTimeOnMetricsBtn.Bind(wx.EVT_BUTTON, self.OnTimeAvgOnMetricsBtnClicked)
#
# self.chartTypeBoxSizer.Add(self.totalTimeOnRepetitionBtn, 1, wx.ALL | wx.ALIGN_CENTER, 5)
# self.chartTypeBoxSizer.Add(self.avgTimeOnRepetitionBtn, 1, wx.ALL | wx.ALIGN_CENTER, 5)
# self.chartTypeBoxSizer.Add(self.totalTimeOnVersionsBtn, 1, wx.ALL | wx.ALIGN_CENTER, 5)
# self.chartTypeBoxSizer.Add(self.totalTimeOnMetricsBtn, 1, wx.ALL | wx.ALIGN_CENTER, 5)
# self.chartTypeBoxSizer.Add(self.avgTimeOnMetricsBtn, 1, wx.ALL | wx.ALIGN_CENTER, 5)
#
# sizer.Add(self.chartTypeBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
#
#
# self.chartConfigBox = wx.StaticBox(self, label="Chart Configuration")
# self.chartConfigBoxSizer = wx.StaticBoxSizer(self.chartConfigBox, wx.VERTICAL)
#
# sizer.Add(self.chartConfigBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
#
# self.SetSizer(sizer)
#
# def _GetHostsRepetitions(self):
# hostsRepetitions = {}
#
# for s in self.simulators:
# for rh in s.context.version.run_hosts:
# if rh.host_name not in hostsRepetitions:
# hostsRepetitions[rh.host_name] = 1
# if rh.repetitions > hostsRepetitions[rh.host_name]:
# hostsRepetitions[rh.host_name] = rh.repetitions
#
# return hostsRepetitions
#
# def _GetHosts(self, simulators, hostName):
# simulatorsHosts = {}
# for s in simulators:
# simulatorsHosts[s] = []
# for h in s.context.hosts:
# if h.original_name() == hostName:
# simulatorsHosts[s].append(h)
# return simulatorsHosts
#
# def _MetricExistsIn(self, metric, metrics):
# """ """
# for m in metrics:
# if self._AreMetricsEqual(m, metric):
# return True
# return False
#
# def _AreMetricsEqual(self, leftMetric, rightMetric):
# """ """
# if len(leftMetric) != len(rightMetric):
# return False
#
# for leftMetricsSet in leftMetric:
# found = False
# for rightMetricsSet in rightMetric:
# if leftMetricsSet.host_name == rightMetricsSet.host_name:
# found= True
# if leftMetricsSet.configuration_name != rightMetricsSet.configuration_name:
# return False
# if not found:
# return False
# return True
#
# def _GetMetric(self, simulator):
# """ """
# return simulator.context.version.metrics_sets
#
# def _GetMetrics(self):
# """ """
# metrics = []
# for s in self.simulators:
# metric = self._GetMetric(s)
# if not self._MetricExistsIn(metric, metrics):
# metrics.append(metric)
# return metrics
#
# #################
# # REACTIONS
# #################
#
# def OnTimeTotalOnRepetitionBtnClicked(self, event):
# self.chartConfigBox.SetLabel("T_Total/N Chart Configuration")
# self.BuildTimesByRepetitionsChartPanel(TIME_TYPE_MAX)
#
# def OnTimeAvgOnRepetitionBtnClicked(self, event):
# self.chartConfigBox.SetLabel("TAvg/N Chart Configuration")
# self.BuildTimesByRepetitionsChartPanel(TIME_TYPE_AVG)
#
# def OnTimeTotalOnVersionsBtnClicked(self, event):
# self.chartConfigBox.SetLabel("TTotal/Version Chart Configuration")
# self.BuildTimesByVersionsChartPanel(TIME_TYPE_TOTAL)
#
# def OnTimeTotalOnMetricsBtnClicked(self, event):
# self.chartConfigBox.SetLabel("T_Total/M Chart Configuration")
# self.BuildTimesByMetricsChartPanel(TIME_TYPE_MAX)
#
# def OnTimeAvgOnMetricsBtnClicked(self, event):
# self.chartConfigBox.SetLabel("TAvg/M Chart Configuration")
# self.BuildTimesByMetricsChartPanel(TIME_TYPE_AVG)
#
# def OnShowChartTTotalVarRepButtonClicked(self, event):
# """ """
# if len(self.curves) == 0:
# wx.MessageBox("No curves defined. You must add at least one curve.",
# 'Error', wx.OK | wx.ICON_ERROR)
# else:
# self.CalculateAndShowChartByRepFrame(TIME_TYPE_MAX)
#
# def OnShowChartTAvgVarRepButtonClicked(self, event):
# """ """
# if len(self.curves) == 0:
# wx.MessageBox("No curves defined. You must add at least one curve.",
# 'Error', wx.OK | wx.ICON_ERROR)
# else:
# self.CalculateAndShowChartByRepFrame(TIME_TYPE_AVG)
#
# def OnShowChartTTotalVarMetricsButtonClicked(self, event):
# """ """
# if len(self.curves) == 0:
# wx.MessageBox("No curves defined. You must add at least one curve.",
# 'Error', wx.OK | wx.ICON_ERROR)
# else:
# self.CalculateAndShowChartByMetricsFrame(TIME_TYPE_MAX)
#
# def OnShowChartTAvgVarMetricsButtonClicked(self, event):
# """ """
# if len(self.curves) == 0:
# wx.MessageBox("No curves defined. You must add at least one curve.",
# 'Error', wx.OK | wx.ICON_ERROR)
# else:
# self.CalculateAndShowChartByMetricsFrame(TIME_TYPE_AVG)
#
# def OnShowChartTTotalVarVersionButtonClicked(self, event):
# """ """
# if len(self.simulators) == 0:
# wx.MessageBox("There are no finished simulations yet. Please wait and try again.",
# 'Error', wx.OK | wx.ICON_ERROR)
# else:
# self.CalculateAndShowChartByVersions()
#
# def OnAddCurveButtonClicked(self, event):
# """ """
# lblParts = []
# curveSimulators = []
# for ch in self.checkboxSimulator:
# if ch.IsChecked():
# curveSimulators.append(self.checkboxSimulator[ch])
# lblParts.append(self.checkboxSimulator[ch].context.version.name)
#
# if len(curveSimulators) == 0:
# return
#
# self.curves.append(curveSimulators)
#
# curveLabel = "%d. Versions: %s" % (len(self.curves), ', '.join(lblParts))
# text = wx.StaticText(self.curvesListPanel, label=curveLabel)
# self.curvesListPanelSizer.Add(text)
#
# self.Layout()
#
# #################
# # LAYOUT
# #################
#
# def BuildCurvesPanel(self, parent, parentSizer, onShowChartClicked):
# """ """
# rightBox = wx.StaticBox(parent, label="Define curves")
# rightBoxSizer = wx.StaticBoxSizer(rightBox, wx.HORIZONTAL)
# parentSizer.Add(rightBoxSizer, 4, wx.ALL, 5)
#
# addCurveBox = wx.StaticBox(parent, label="Add curve")
# addCurveBoxSizer = wx.StaticBoxSizer(addCurveBox, wx.VERTICAL)
# rightBoxSizer.Add(addCurveBoxSizer, 0, wx.ALL, 5)
#
# self.checkboxSimulator = {}
#
# for s in self.simulators:
# version = s.context.version
# ch = wx.CheckBox(parent, label=version.name)
# addCurveBoxSizer.Add(ch)
# self.checkboxSimulator[ch] = s
#
# addCurveButton = wx.Button(parent, label="Add curve")
# addCurveBoxSizer.Add(addCurveButton, 0, wx.ALIGN_CENTER)
# addCurveButton.Bind(wx.EVT_BUTTON, self.OnAddCurveButtonClicked)
#
# curvesBox = wx.StaticBox(parent, label="Curves")
# curvesBoxSizer = wx.StaticBoxSizer(curvesBox, wx.VERTICAL)
# rightBoxSizer.Add(curvesBoxSizer, 1, wx.ALL, 5)
#
# self.curvesListPanel = wx.Panel(parent)
# self.curvesListPanelSizer = wx.BoxSizer(wx.VERTICAL)
# self.curvesListPanel.SetSizer(self.curvesListPanelSizer)
# curvesBoxSizer.Add(self.curvesListPanel, 1, wx.ALL | wx.EXPAND, 5)
#
# showChartBtn = wx.Button(parent, label="Show Chart")
# showChartBtn.Bind(wx.EVT_BUTTON, onShowChartClicked)
# curvesBoxSizer.Add(showChartBtn, 0, wx.ALIGN_CENTER)
#
# def BuildTimesByRepetitionsChartPanel(self, timeType):
# """ """
# if self.chartPanel:
# self.chartPanel.Destroy()
# self.chartPanel = None
#
# self.chartPanel = wx.Panel(self)
# chartPanelSizer = wx.BoxSizer(wx.HORIZONTAL)
# self.chartPanel.SetSizer(chartPanelSizer)
#
# leftBox = wx.StaticBox(self.chartPanel, label="Choose host")
# leftBoxSizer = wx.StaticBoxSizer(leftBox, wx.VERTICAL)
# chartPanelSizer.Add(leftBoxSizer, 1, wx.ALL, 5)
#
# self.hostRadios = []
#
# hostsRepetitions = self._GetHostsRepetitions()
# for hostName in hostsRepetitions:
# radioBtn = wx.RadioButton(self.chartPanel, label=hostName)
# leftBoxSizer.Add(radioBtn)
# self.hostRadios.append(radioBtn)
#
# onShowChartBtnClicked = None
# if timeType == TIME_TYPE_MAX:
# onShowChartBtnClicked = self.OnShowChartTTotalVarRepButtonClicked
# elif timeType == TIME_TYPE_AVG:
# onShowChartBtnClicked = self.OnShowChartTAvgVarRepButtonClicked
#
# self.BuildCurvesPanel(self.chartPanel, chartPanelSizer, onShowChartBtnClicked)
#
# self.chartConfigBoxSizer.Add(self.chartPanel, 1, wx.ALL | wx.EXPAND, 5)
# self.Layout()
#
# self.curves = []
#
# def BuildTimesByVersionsChartPanel(self, timeType):
# """ """
# if self.chartPanel:
# self.chartPanel.Destroy()
# self.chartPanel = None
#
# self.chartPanel = wx.Panel(self)
# chartPanelSizer = wx.BoxSizer(wx.HORIZONTAL)
# self.chartPanel.SetSizer(chartPanelSizer)
#
# self.BuildCurvesPanel(self.chartPanel, chartPanelSizer, self.OnShowChartTTotalVarVersionButtonClicked)
#
# self.chartConfigBoxSizer.Add(self.chartPanel, 1, wx.ALL | wx.EXPAND, 5)
# self.Layout()
#
# self.curves = []
#
# def BuildTimesByMetricsChartPanel(self, timeType):
# """ """
# if self.chartPanel:
# self.chartPanel.Destroy()
# self.chartPanel = None
#
# self.chartPanel = wx.Panel(self)
# chartPanelSizer = wx.BoxSizer(wx.HORIZONTAL)
# self.chartPanel.SetSizer(chartPanelSizer)
#
# leftBox = wx.StaticBox(self.chartPanel, label="Metrics")
# leftBoxSizer = wx.StaticBoxSizer(leftBox, wx.VERTICAL)
# chartPanelSizer.Add(leftBoxSizer, 0, wx.ALL, 5)
#
# self.metrics = self._GetMetrics()
#
# i = 0
# for metric in self.metrics:
# i += 1
# metricPanel = wx.Panel(self.chartPanel)
# metricPanelSizer = wx.BoxSizer(wx.HORIZONTAL)
# metricPanel.SetSizer(metricPanelSizer)
#
# leftBoxSizer.Add(metricPanel)
#
# metricNumber = "%d ." % i
# lbl = wx.StaticText(metricPanel, label=metricNumber)
# metricPanelSizer.Add(lbl)
#
# metricsConfigPanel = wx.Panel(metricPanel)
# metricsConfigPanelSizer = wx.BoxSizer(wx.VERTICAL)
# metricsConfigPanel.SetSizer(metricsConfigPanelSizer)
#
# metricPanelSizer.Add(metricsConfigPanel)
#
# for mSet in metric:
# lbl = wx.StaticText(metricsConfigPanel, label="Host: %s -> Config: %s" % (mSet.host_name, mSet.configuration_name))
# metricsConfigPanelSizer.Add(lbl)
#
# onShowChartBtnClicked = None
# if timeType == TIME_TYPE_MAX:
# onShowChartBtnClicked = self.OnShowChartTTotalVarMetricsButtonClicked
# elif timeType == TIME_TYPE_AVG:
# onShowChartBtnClicked = self.OnShowChartTAvgVarMetricsButtonClicked
#
# self.BuildCurvesPanel(self.chartPanel, chartPanelSizer, onShowChartBtnClicked)
#
# self.chartConfigBoxSizer.Add(self.chartPanel, 1, wx.ALL | wx.EXPAND, 5)
# self.Layout()
#
# self.curves = []
#
# def CalculateAndShowChartByRepFrame(self, timeType):
# """ """
#
# def TTotal(module, simulator, hosts):
# val = 0.0
# for h in hosts:
# t = module.get_current_time(simulator, h)
# if t > val:
# val = t
# return val
#
# def TAvg(module, simulator, hosts):
# s = 0.0
# for h in hosts:
# s += module.get_current_time(simulator, h)
# l = len(hosts)
# if l == 0:
# return 0
# return s / float(l)
#
# hostName = None
# for radio in self.hostRadios:
# if radio.GetValue():
# hostName = radio.GetLabel()
# break
# if not hostName:
# return
#
# curvesData = []
#
# chartFun = lambda x : x
# chartTitle = ""
# xLabel = ""
# yLabel = ""
#
# if timeType == TIME_TYPE_MAX:
# chartFun = TTotal
# chartTitle = "Chart: Total Time / Repetitions"
# xLabel = "Repetitions"
# yLabel = "T_Total"
# else:
# chartFun = TAvg
# chartTitle = "Chart: TimeAvg / Repetitions"
# xLabel = "Repetitions"
# yLabel = "TAvg"
#
# i = 0
# for curveSimulators in self.curves:
#
# i += 1
# label = "%d." % i
#
# values = []
# hostsBySimulator = self._GetHosts(curveSimulators, hostName)
# for s in hostsBySimulator:
# values.append((len(hostsBySimulator[s]), chartFun(self.module, s, hostsBySimulator[s])))
#
# values.sort(key=lambda t: t[0])
# curveData = (label, values)
# curvesData.append(curveData)
#
# self.ShowChartFrame(chartTitle, xLabel, yLabel, curvesData)
#
# def CalculateAndShowChartByMetricsFrame(self, timeType):
# """ """
#
# def TTotal(values):
# val = 0.0
# for t in values:
# if t > val:
# val = t
# return val
#
# def TAvg(values):
# s = 0.0
# for v in values:
# s += v
# l = len(values)
# if l == 0:
# return 0
# return s / float(l)
#
# def buildLegend(parent):
#
# mainBox = wx.StaticBox(parent, label="Metrics")
# mainBoxSizer = wx.StaticBoxSizer(mainBox,wx.VERTICAL)
#
# i = 0
# for metric in self.metrics:
# i += 1
# metricPanel = wx.Panel(parent)
# metricPanelSizer = wx.BoxSizer(wx.HORIZONTAL)
# metricPanel.SetSizer(metricPanelSizer)
#
# mainBoxSizer.Add(metricPanel)
#
# metricNumber = "%d ." % i
# lbl = wx.StaticText(metricPanel, label=metricNumber)
# metricPanelSizer.Add(lbl)
#
# metricsConfigPanel = wx.Panel(metricPanel)
# metricsConfigPanelSizer = wx.BoxSizer(wx.VERTICAL)
# metricsConfigPanel.SetSizer(metricsConfigPanelSizer)
#
# metricPanelSizer.Add(metricsConfigPanel)
#
# for mSet in metric:
# lbl = wx.StaticText(metricsConfigPanel, label="Host: %s -> Config: %s" % (mSet.host_name, mSet.configuration_name))
# metricsConfigPanelSizer.Add(lbl)
#
# return mainBoxSizer
#
# curvesData = []
#
# chartFun = lambda x : x
# chartTitle = ""
# xLabel = ""
# yLabel = ""
#
# if timeType == TIME_TYPE_MAX:
# chartFun = TTotal
# chartTitle = "Chart: Total Time / Metric"
# xLabel = "Metric"
# yLabel = "T_Total"
# else:
# chartFun = TAvg
# chartTitle = "Chart: TimeAvg / Metric"
# xLabel = "Metric"
# yLabel = "TAvg"
#
# c = 0
# for curveSimulators in self.curves:
# c += 1
#
# values = []
# i = 0
# for m in self.metrics:
# i += 1
# time_values = []
#
# for s in curveSimulators:
# currentMetric = self._GetMetric(s)
# if self._AreMetricsEqual(m, currentMetric):
# time_values.extend([ self.module.get_current_time(s, h)
# for h in s.context.hosts])
#
# values.append((i, chartFun(time_values)))
#
# values.sort(key=lambda t: t[0])
# curveData = ("%d." % c, values)
# curvesData.append(curveData)
#
# self.ShowChartFrame(chartTitle, xLabel, yLabel, curvesData,
# buildLegendPanelFun=buildLegend)
#
# def CalculateAndShowChartByVersions(self):
# """ """
# def chartFun(module, simulator, hosts):
# val = 0.0
# for h in hosts:
# t = module.get_current_time(simulator, h)
# if t > val:
# val = t
# return val
#
# chartTitle = "TTotal / Version"
# xLabel = "Version"
# yLabel = "TTotal"
# curvesData = []
#
# c = 0
# for curveSimulators in self.curves:
# c += 1
#
# values = []
# i = 0
# for s in curveSimulators:
# i += 1
# values.append((i, chartFun(self.module, s, s.context.hosts)))
#
# values.sort(key=lambda t: t[0])
#
# curveData = ("%d." % c, values)
# curvesData.append(curveData)
#
# self.ShowChartFrame(chartTitle, xLabel, yLabel, curvesData)
#
# def AddFinishedSimulation(self, simulator):
# """ """
# self.simulators.append(simulator)
#
# self.chartTypeBox.Show()
# self.totalTimeOnRepetitionBtn.Show()
# self.totalTimeOnMetricsBtn.Show()
# self.totalTimeOnVersionsBtn.Show()
# self.avgTimeOnRepetitionBtn.Show()
# self.avgTimeOnMetricsBtn.Show()
#
# self.chartConfigBox.Show()
# self.chartConfigBox.SetLabel("Chart Configuration")
# self.Layout()
#
# def RemoveAllSimulations(self):
# """ """
# self.simulators = []
# self.chartTypeBox.Hide()
# self.totalTimeOnRepetitionBtn.Hide()
# self.totalTimeOnMetricsBtn.Hide()
# self.totalTimeOnVersionsBtn.Hide()
# self.avgTimeOnRepetitionBtn.Hide()
# self.avgTimeOnMetricsBtn.Hide()
#
# self.chartConfigBox.Hide()
# self.chartConfigBoxSizer.Clear(True)
# self.Layout()
#
# def ShowChartFrame(self, chartTitle, xTitle, yTitle, curvesData,
# buildLegendPanelFun = None):
# """ Shows frame with gnuplot chart """
#
# from wx.lib.plot import PlotCanvas, PolyMarker, PolyLine, PlotGraphics
# import random
#
# def drawPlot(chartTitle, xTitle, yTitle, curvesData):
# """ """
# plots = []
#
# for curveData in curvesData:
# cr = random.randint(0, 255)
# cg = random.randint(0, 255)
# cb = random.randint(0, 255)
# markers = PolyMarker(curveData[1], legend=curveData[0],
# colour=wx.Color(cr,cg,cb), size=2)
# line = PolyLine(curveData[1], legend=curveData[0],
# colour=wx.Color(cr,cg,cb), width=1)
# plots.append(markers)
# plots.append(line)
#
# return PlotGraphics(plots, chartTitle,
# xTitle, yTitle)
#
# frame = wx.Frame(None, title=chartTitle)
# panel = wx.Panel(frame)
# panelSizer = wx.BoxSizer(wx.HORIZONTAL)
# panel.SetSizer(panelSizer)
#
# frameSizer = wx.BoxSizer(wx.VERTICAL)
# frameSizer.Add(panel, 1, wx.ALL | wx.EXPAND, 5)
# frame.SetSizer(frameSizer)
#
# canvas = PlotCanvas(panel)
# canvas.Draw(drawPlot(chartTitle, xTitle, yTitle, curvesData))
#
# if buildLegendPanelFun:
# legendPanel = buildLegendPanelFun(panel)
# panelSizer.Add(legendPanel, 0, wx.ALL, 5)
# panelSizer.Add(canvas, 1, wx.EXPAND)
#
# panelSizer.Layout()
#
# frame.Maximize(True)
# frame.Show()
TIME_TYPE_TOTAL = 1
TIME_TYPE_AVG = 2
class DistributedVersionPanel(wx.Panel):
def __init__(self, *args):
wx.Panel.__init__(self, *args)
self.repetitionText = wx.StaticText(self, label="")
self.maximumBox = wx.StaticBox(self, label="Version with maximum execution time")
maximumBoxSizer = wx.StaticBoxSizer(self.maximumBox, wx.VERTICAL)
hs = wx.BoxSizer(wx.HORIZONTAL)
self.versionsLabel = wx.StaticText(self, label="Version:")
hs.Add(self.versionsLabel, 0, wx.ALIGN_LEFT | wx.EXPAND, 5)
hs.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
self.maximumVersionText = wx.StaticText(self, label="")
hs.Add(self.maximumVersionText, 0, wx.ALIGN_LEFT, 5)
maximumBoxSizer.Add(hs, 0, wx.EXPAND, 5)
hs = wx.BoxSizer(wx.HORIZONTAL)
self.timeLabel = wx.StaticText(self, label="Time:")
hs.Add(self.timeLabel, 0, wx.ALIGN_LEFT | wx.EXPAND, 5)
hs.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
self.maximumTimeText = wx.StaticText(self, label="")
hs.Add(self.maximumTimeText, 0, wx.ALIGN_LEFT, 5)
maximumBoxSizer.Add(hs, 0, wx.EXPAND, 5)
hs = wx.BoxSizer(wx.HORIZONTAL)
self.hostsNumberLabel = wx.StaticText(self, label="Number of simultaneous clients:")
hs.Add(self.hostsNumberLabel, 0, wx.ALIGN_LEFT | wx.EXPAND, 5)
hs.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
self.maximumRepetitionText = wx.StaticText(self, label="")
hs.Add(self.maximumRepetitionText, 0, wx.ALIGN_LEFT, 5)
maximumBoxSizer.Add(hs, 0, wx.EXPAND, 5)
self.resultsBox = wx.StaticBox(self, label="Optimization results")
self.resultsBoxSizer = wx.StaticBoxSizer(self.resultsBox, wx.VERTICAL)
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(self.repetitionText, 0, wx.ALIGN_CENTER|wx.ALL, 5)
sizer.Add(maximumBoxSizer, 0, wx.EXPAND|wx.ALL, 5)
sizer.Add(self.resultsBoxSizer, 0, wx.EXPAND|wx.ALL, 5)
# labels
versionLbl = wx.StaticText(self, label="Version", size=(200, -1))
timeLbl = wx.StaticText(self, label="Time", size=(200, -1))
hostsLbl = wx.StaticText(self, label="Number of simultaneous\nhosts", size=(200, -1))
hS = wx.BoxSizer(wx.HORIZONTAL)
hS.Add(versionLbl, 0)
hS.Add(timeLbl, 0)
hS.Add(hostsLbl, 0)
self.resultsBoxSizer.Add(hS, 0, wx.ALL | wx.EXPAND, 5)
# results report
self.reportText = ""
self.reportTextCtrl = wx.StaticText(self, label=self.reportText, style=wx.TE_MULTILINE)
self.resultsBoxSizer.Add(self.reportTextCtrl, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(sizer)
self.CenterOnParent()
self.Layout()
def SetValues(self, versionsTxt, timeTxt, hostsNumberTxt, reportTxt):
self.maximumVersionText.SetLabel(versionsTxt)
self.maximumTimeText.SetLabel(timeTxt)
self.maximumRepetitionText.SetLabel(hostsNumberTxt)
self.reportTextCtrl.SetLabel(reportTxt)
self.Refresh()
self.Layout()
class DistributedSystemOptimizationPanel(wx.ScrolledWindow):
def __init__(self, module, *args, **kwargs):
wx.ScrolledWindow.__init__(self, *args, **kwargs)
self.module = module
self.checkBoxes = []
self.checkBoxToSimulator = {}
self.optimizationResults = {}
# combobox with time types
self.timeComboBox = wx.ComboBox(self, choices=["Average", "Total"], style=wx.CB_READONLY, size=(220, 20))
# host combobox
self.hostCombo = wx.ComboBox(self, style=wx.TE_READONLY, size=(220, 20))
# create combocheckbox, empty at first
self.comboCheckBox = wx.combo.ComboCtrl(self, style=wx.TE_READONLY, size=(220, 20))
self.tcp = ComboCheckBox()
self.comboCheckBox.SetPopupControl(self.tcp)
self.comboCheckBox.SetText('...')
# labels
hostText = wx.StaticText(self, label="Repeated host:")
versionsText = wx.StaticText(self, label="Versions:")
timeTypeText = wx.StaticText(self, label="Time type:")
toleranceText = wx.StaticText(self, label="Tolerance: (in %)")
self.toleranceTextCtrl = wx.TextCtrl(self, size=(220, 20))
# info label
descText = "The optimization algorithm finds the number of simultaneous clients for each version such that the execution times of the protocols are equal (within the given tolerance)."
# create static boxes aka group boxes
configurationBox = wx.StaticBox(self, label="Optimization configuration")
configurationBox.SetToolTip(wx.ToolTip(descText))
# create sizers
configurationBoxSizer = wx.StaticBoxSizer(configurationBox, wx.VERTICAL)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
# add repeated host
sizer1.Add(hostText, 1, wx.ALL | wx.EXPAND, 5)
sizer1.Add(self.hostCombo, 1, wx.ALL, 5)
# add versions
sizer2.Add(versionsText, 1, wx.ALL | wx.EXPAND, 5)
sizer2.Add(self.comboCheckBox, 1, wx.ALL, 5)
# add time type: avg or total
sizer3.Add(timeTypeText, 1, wx.ALL | wx.EXPAND, 5)
sizer3.Add(self.timeComboBox, 1, wx.ALL, 5)
# add tolerance percent
sizer4.Add(toleranceText, 1, wx.ALL | wx.EXPAND, 5)
sizer4.Add(self.toleranceTextCtrl, 1, wx.ALL, 5)
self.startButton = wx.Button(self, label="Start optimization")
self.startButton.Bind(wx.EVT_BUTTON, self.OnStartClick)
configurationBoxSizer.Add(sizer1, 0, wx.ALIGN_CENTER|wx.ALL, 5)
configurationBoxSizer.Add(sizer2, 0, wx.ALIGN_CENTER|wx.ALL, 5)
configurationBoxSizer.Add(sizer3, 0, wx.ALIGN_CENTER|wx.ALL, 5)
configurationBoxSizer.Add(sizer4, 0, wx.ALIGN_CENTER|wx.ALL, 5)
configurationBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
configurationBoxSizer.Add(self.startButton, 0, wx.ALIGN_CENTER|wx.ALL)
self.reportText = ""
# OPTIMIZATION PROCESS
self.module.get_gui().Bind(aqopa_gui.EVT_MODULE_SIMULATION_ALLOWED, self.OnSimulationAllowed)
self.interpreter = None # Interpreter used to run simulators
self.maxTime = 0 # Maximum execution time (avg or total)
self.timeType = None # The type of time calculation (avg or total)
self.hostName = None # Name of host used to calculate avg time
self.maxSimulator = None # The simulator with maximum execution time
self.maxRepetition = 0 # The number of simultaneous clients in the scenario with maximum execution time
self.newSimulator = None
self.previousRepetition = 0
self.previousTime = 0
self.currentRepetition = 0
self.currentTime = 0
self.tolerance = 0.05
self.progressTimer = wx.Timer(self)
self.dots = 0
self.Bind(wx.EVT_TIMER, self.OnProgressTimerTick, self.progressTimer)
self.processBox = wx.StaticBox(self, label="Optimization process")
processBoxSizer = wx.StaticBoxSizer(self.processBox, wx.VERTICAL)
self.statusText = wx.StaticText(self, label="Not started")
self.dotsText = wx.StaticText(self, label="")
self.dotsText.Hide()
self.repetitionText = wx.StaticText(self, label="")
self.repetitionText.Hide()
#self.maximumBox = wx.StaticBox(self, label="Version with maximum execution time")
#maximumBoxSizer = wx.StaticBoxSizer(self.maximumBox, wx.VERTICAL)
#hs = wx.BoxSizer(wx.HORIZONTAL)
#self.versionsLabel = wx.StaticText(self, label="Version:")
#hs.Add(self.versionsLabel, 0, wx.ALIGN_LEFT | wx.EXPAND, 5)
#hs.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
self.maximumVersionText = wx.StaticText(self, label="")
self.maximumVersionText.Hide()
#hs.Add(self.maximumVersionText, 0, wx.ALIGN_LEFT, 5)
#maximumBoxSizer.Add(hs, 0, wx.EXPAND, 5)
hs = wx.BoxSizer(wx.HORIZONTAL)
# self.timeLabel = wx.StaticText(self, label="Time:")
# hs.Add(self.timeLabel, 0, wx.ALIGN_LEFT | wx.EXPAND, 5)
# hs.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
self.maximumTimeText = wx.StaticText(self, label="")
self.maximumTimeText.Hide()
# hs.Add(self.maximumTimeText, 0, wx.ALIGN_LEFT, 5)
# maximumBoxSizer.Add(hs, 0, wx.EXPAND, 5)
# hs = wx.BoxSizer(wx.HORIZONTAL)
# self.hostsNumberLabel = wx.StaticText(self, label="Number of\nsimultaneous clients:")
# hs.Add(self.hostsNumberLabel, 0, wx.ALIGN_LEFT | wx.EXPAND, 5)
# hs.Add(wx.StaticText(self), 1, wx.EXPAND, 5)
self.maximumRepetitionText = wx.StaticText(self, label="")
self.maximumRepetitionText.Hide()
# hs.Add(self.maximumRepetitionText, 0, wx.ALIGN_LEFT, 5)
# maximumBoxSizer.Add(hs, 0, wx.EXPAND, 5)
# self.resultsBox = wx.StaticBox(self, label="Optimization results")
# self.resultsBoxSizer = wx.StaticBoxSizer(self.resultsBox, wx.VERTICAL)
processBoxSizer.Add(self.statusText, 0, wx.ALIGN_CENTER|wx.ALL, 5)
processBoxSizer.Add(self.dotsText, 0, wx.ALIGN_CENTER|wx.ALL, 5)
processBoxSizer.Add(self.repetitionText, 0, wx.ALIGN_CENTER|wx.ALL, 5)
#processBoxSizer.Add(maximumBoxSizer, 0, wx.EXPAND|wx.ALL, 5)
#processBoxSizer.Add(self.resultsBoxSizer, 0, wx.EXPAND|wx.ALL, 5)
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(configurationBoxSizer, 0, wx.EXPAND|wx.ALL, 5)
sizer.Add(processBoxSizer, 0, wx.EXPAND|wx.ALL, 5)
self.SetSizer(sizer)
self.CentreOnParent()
self.SetScrollRate(0, 10)
def DisableGUI(self, value):
"""
@brief disables/enables GUI elements
(action depends on value)
"""
if value :
self.hostCombo.Disable()
self.timeComboBox.Disable()
self.comboCheckBox.Disable()
self.toleranceTextCtrl.Disable()
self.startButton.Disable()
else :
self.hostCombo.Enable()
self.timeComboBox.Enable()
self.comboCheckBox.Enable()
self.toleranceTextCtrl.Enable()
self.startButton.Enable()
def _OptimizationStep(self):
"""
Implements one step of optimization.
"""
nextRepetition = self._GenerateNewRepetitionNumber(self.maxTime, self.previousTime,
self.currentTime, self.previousRepetition, self.currentRepetition)
if nextRepetition is None:
self.currentTime = None
self.currentRepetition = None
self._FinishSimulatorOptimization()
return
if nextRepetition == self.currentRepetition:
self._FinishSimulatorOptimization()
return
self.previousRepetition = self.currentRepetition
self.currentRepetition = nextRepetition
simulator = self.optimizedSimulators[self.optimizedSimulatorIndex]
newVersion = simulator.context.version.clone()
changed = False
for rh in newVersion.run_hosts:
if rh.host_name == self.hostName:
rh.repetitions = nextRepetition
changed = True
break
if not changed:
self.currentTime = None
self._FinishSimulatorOptimization()
return
self.repetitionText.SetLabel("Number of simultaneous hosts: %d" % nextRepetition)
self.newSimulator = self.interpreter.builder.build_simulator(self.interpreter.store, newVersion)
self.interpreter.install_modules(self.newSimulator)
wx.lib.delayedresult.startWorker(self._FinishStep,
self.interpreter.run_simulation,
wargs=(self.newSimulator,))
def _FinishStep(self, result):
"""
Handles the result of one optimization step (simulation of new repetition number).
"""
try :
result.get()
self.previousTime = self.currentTime
self.currentTime = self._GetTime(self.newSimulator, self.timeType, self.hostName)
if (self.currentTime == self.previousTime) and (self.currentRepetition == self.previousRepetition):
self._FinishSimulatorOptimization()
return
if (self.currentTime <= self.previousTime) and (self.currentRepetition >= self.previousRepetition):
self.currentTime = None
self.currentRepetition = None
self._FinishSimulatorOptimization()
return
relError = abs(self.maxTime - self.currentTime)/self.maxTime
if relError <= self.tolerance:
self._FinishSimulatorOptimization()
return
self._OptimizationStep()
except RuntimeException, e:
self.currentTime = None
self._FinishSimulatorOptimization()
except Exception, e:
import traceback
print traceback.format_exc()
self.currentTime = None
self._FinishSimulatorOptimization()
def _FinishSimulatorOptimization(self):
"""
Actions taken when optimization of one simulator is finished.
"""
timeText = "Failed"
repetitionText = ""
simulator = self.optimizedSimulators[self.optimizedSimulatorIndex]
self.optimizationResults[simulator] = (self.currentTime, self.currentRepetition)
if self.currentTime is not None:
timeText = "%.6f s" % self.currentTime
repetitionText = "%d" % self.currentRepetition
# hS = wx.BoxSizer(wx.HORIZONTAL)
# hS.Add(wx.StaticText(self, label=simulator.context.version.name, size=(200, -1)), 0)
# hS.Add(wx.StaticText(self, label=timeText, size=(200, -1)), 0)
# hS.Add(wx.StaticText(self, label=repetitionText, size=(200, -1)), 0)
# self.resultsBoxSizer.Add(hS, 0, wx.ALL | wx.EXPAND, 0)
self.Layout()
# # # create a new frame to show time analysis results on it
# resultsWindow = GeneralFrame(self.GetParent(), "Time Analysis Results", "Optimization Process", "modules_results.png")
# # # create scrollable panel
# maxPanel = DistributedVersionPanel(resultsWindow)
# maxPanel.SetValues(self.maximumVersionText.GetLabelText(), self.maximumTimeText.GetLabelText(),
# self.maximumRepetitionText.GetLabelText(), self.reportText)
# # # add panel on a window
# resultsWindow.AddPanel(maxPanel)
# resultsWindow.SetWindowSize(600,350)
# # # show the results on the new window
# resultsWindow.Show()
self.optimizedSimulatorIndex += 1
self._OptimizeNextSimulator()
def _OptimizeNextSimulator(self):
"""
Finds the number of simultaneous clients for the next simulator
from the ``optimizedSimulators`` class field. The next simulator
is at index ``optimizedSimulatorIndex``.
When no more simulators are left, the optimization process is finished.
"""
if self.optimizedSimulatorIndex >= len(self.optimizedSimulators):
self.progressTimer.Stop()
self.startButton.Enable(True)
self.statusText.SetLabel("Finished")
self.dotsText.SetLabel("")
self.dotsText.Hide()
self.repetitionText.SetLabel("")
self.repetitionText.Hide()
# Create report.
self.reportText = "The execution time of the scenario {0} ({1} simultaneous clients) is similar (with the {2}% tolerance): \n".format(
self.maxSimulator.context.version.name, self.maxRepetition, self.tolerance*100.0)
simulatorsReportTexts = []
for simulator in self.optimizedSimulators:
results = self.optimizationResults[simulator]
if results[0] is None:
continue
simulatorsReportTexts.append(" - to the scenario {0} with {1} simultaneous clients".format(
simulator.context.version.name, results[1]))
self.reportText += " and\n".join(simulatorsReportTexts) + "."
#reportTextCtrl = wx.StaticText(self, label=self.reportText, style=wx.TE_MULTILINE)
#self.resultsBoxSizer.Add(reportTextCtrl, 0, wx.ALL | wx.EXPAND, 10)
#self.Layout()
# all simulations are finished, we can present the results in a new window
if not self.progressTimer.IsRunning() :
# create a new frame to show time analysis results on it
resultsWindow = GeneralFrame(self.GetParent(), "Time Analysis Results", "Optimization Process", "modules_results.png")
# create scrollable panel
maxPanel = DistributedVersionPanel(resultsWindow)
maxPanel.SetValues(self.maximumVersionText.GetLabelText(), self.maximumTimeText.GetLabelText(),
self.maximumRepetitionText.GetLabelText(), self.reportText)
# add panel on a window
resultsWindow.AddPanel(maxPanel)
resultsWindow.SetWindowSize(600,350)
# show the results on the new window
resultsWindow.Show()
wx.PostEvent(self.module.get_gui(), aqopa_gui.ModuleSimulationFinishedEvent())
return
simulator = self.optimizedSimulators[self.optimizedSimulatorIndex]
self.previousRepetition = 0
self.previousTime = 0
self.currentRepetition = self._GetHostRepetition(simulator, self.hostName)
self.currentTime = self._GetTime(simulator, self.timeType, self.hostName)
self.statusText.SetLabel("Working on %s" % simulator.context.version.name)
self.repetitionText.SetLabel("")
self._OptimizationStep()
def _GetTime(self, simulator, timeType, hostName):
"""
Returns the execution time of the simulator.
The ``timeType`` parameter selects how the time is calculated:
either the total execution time, or the average execution time
of host ``hostName``.
"""
times = self.module.current_times[simulator]
if timeType == TIME_TYPE_TOTAL:
return max([times[i] for i in times ])
else: # TIME_TYPE_AVG
nb_hosts = 0
sum_times = 0.0
for h in times:
if h.original_name() == hostName:
nb_hosts += 1
sum_times += times[h]
if nb_hosts > 0:
return sum_times / nb_hosts
return 0
def _GetHostRepetition(self, simulator, hostName):
"""
Returns the number of repetitions of host ``hostName``.
"""
version = simulator.context.version
nb = 0
for rh in version.run_hosts:
if rh.host_name == hostName:
nb += rh.repetitions
return nb
def _GetMaximumTimeSimulator(self, simulators, timeType, hostName):
"""
Returns a pair (simulator, time) with the maximum execution
time, depending on the type of time selected.
"""
simulator = None
time = 0.0
for s in simulators:
t = self._GetTime(s, timeType, hostName)
if t >= time:
simulator = s
time = t
return (simulator, time)
def _GenerateNewRepetitionNumber(self, maxTime, previousTime, currentTime,
previousRepetition, currentRepetition):
"""
Returns the next number of simultaneous hosts, moving closer to the
final result. Assumes that time grows linearly with the number of hosts
(time = a * repetitions + b) and returns the repetition count for which
the time would equal ``maxTime``; returns None when the interpolation
is not possible.
"""
if previousTime == currentTime:
if currentRepetition == 0:
return None
a = currentTime / currentRepetition
if a == 0:
return None
return int(maxTime / a)
else:
if currentRepetition == previousRepetition:
return None
a = (currentTime - previousTime) / (currentRepetition - previousRepetition)
b = currentTime - currentRepetition * a
return int((maxTime - b) / a)
def RemoveAllSimulations(self):
""" """
self.checkBoxes = []
self.checkBoxToSimulator = {}
self.optimizedSimulators = []
self.optimizationResults = {}
self.optimizedSimulatorIndex = 0
self.startButton.Enable(False)
self.hostCombo.Clear()
self.hostCombo.SetValue("")
self.processBox.Hide()
self.statusText.Hide()
self.dotsText.Hide()
self.repetitionText.Hide()
# clear versions combocheckbox
self.tcp.ClearChoices()
# self.resultsBox.Hide()
# self.resultsBoxSizer.Clear(True)
self.Layout()
def AddFinishedSimulation(self, simulator):
""" """
version = simulator.context.version
self.checkBoxToSimulator[version.name] = simulator
self.Layout()
def OnAllSimulationsFinished(self, simulators):
""" """
versionsList = []
items = []
for s in simulators:
version = s.context.version
versionsList.append(version.name)
for rh in version.run_hosts:
if rh.host_name not in items:
items.append(rh.host_name)
# fill combocheckbox with versions names
self.tcp.SetChoices(versionsList)
self.hostCombo.Clear()
self.hostCombo.AppendItems(items)
self.hostCombo.Select(0)
self.startButton.Enable(True)
self.Layout()
def OnStartClick(self, event=None):
""" """
self.processBox.Show()
self.statusText.Show()
self.dotsText.Show()
self.repetitionText.Show()
# self.maximumBox.Show()
# self.versionsLabel.Show()
# self.timeLabel.Show()
# self.hostsNumberLabel.Show()
# self.maximumRepetitionText.Show()
# self.maximumTimeText.Show()
# self.maximumVersionText.Show()
#
# self.resultsBox.Show()
# Check whether at least 2 versions were selected; if not,
# do not start the optimization process - the parameters
# must be corrected first (select at least 2 versions).
simulators = []
vs = self.tcp.GetSelectedItems()
for v in vs :
simulators.append(self.checkBoxToSimulator[v])
if len(simulators) < 2:
wx.MessageBox('Please select at least 2 versions.', 'Error',
wx.OK | wx.ICON_ERROR)
return
try:
self.tolerance = float(self.toleranceTextCtrl.GetValue())/100.0
except ValueError:
wx.MessageBox('Tolerance must be a number.', 'Error',
wx.OK | wx.ICON_ERROR)
return
if self.tolerance >= 1:
wx.MessageBox('Tolerance must be smaller than 100%.', 'Error',
wx.OK | wx.ICON_ERROR)
return
if self.tolerance <= 0.01:
wx.MessageBox('Tolerance must be greater than 1%.', 'Error',
wx.OK | wx.ICON_ERROR)
return
self.timeType = TIME_TYPE_TOTAL
if self.timeComboBox.GetValue() == "Average" :
self.timeType = TIME_TYPE_AVG
self.hostName = self.hostCombo.GetValue()
self.maxSimulator, self.maxTime = self._GetMaximumTimeSimulator(simulators, self.timeType, self.hostName)
self.maxRepetition = self._GetHostRepetition(self.maxSimulator, self.hostName)
self.maximumVersionText.SetLabel("%s" % self.maxSimulator.context.version.name)
self.maximumTimeText.SetLabel("%.5f ms" % self.maxTime)
self.maximumRepetitionText.SetLabel("%d" % self.maxRepetition)
if self.maxTime == 0:
wx.MessageBox('Maximum time is equal to 0. Cannot optimize.', 'Error',
wx.OK | wx.ICON_ERROR)
return
self.startButton.Enable(False)
self.statusText.SetLabel("Waiting for the simulator")
# self.resultsBoxSizer.Clear(True)
# hS = wx.BoxSizer(wx.HORIZONTAL)
# hS.Add(wx.StaticText(self, label="Version", size=(200, -1)), 0)
# hS.Add(wx.StaticText(self, label="Time", size=(200, -1)), 0)
# hS.Add(wx.StaticText(self, label="Number of simultaneous hosts", size=(200, -1)), 0)
# self.resultsBoxSizer.Add(hS, 0, wx.ALL | wx.EXPAND, 0)
self.Layout()
simulators.remove(self.maxSimulator)
self.progressTimer.Start(500)
self.optimizedSimulators = simulators
self.optimizedSimulatorIndex = 0
wx.PostEvent(self.module.get_gui(), aqopa_gui.ModuleSimulationRequestEvent(module=self.module))
def OnSimulationAllowed(self, event):
""" """
self.interpreter = event.interpreter
self._OptimizeNextSimulator()
################
# PROGRESS BAR
################
def OnProgressTimerTick(self, event):
""" """
self.dots = (self.dots + 1) % 10
self.dotsText.SetLabel("." * self.dots)
self.Layout()
class MainResultsNotebook(wx.Notebook):
""" """
def __init__(self, module, *args, **kwargs):
wx.Notebook.__init__(self, *args, **kwargs)
# tab images
il = wx.ImageList(20, 20)
singleVersionImg = il.Add(wx.Bitmap(self.CreatePath4Resource('PuzzlePiece.png'), wx.BITMAP_TYPE_PNG))
distributedVersionImg = il.Add(wx.Bitmap(self.CreatePath4Resource('PuzzlePieces.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
self.module = module
# single version tab
self.oneVersionTab = SingleVersionPanel(self.module, self)
self.AddPage(self.oneVersionTab, "Single Version")
self.SetPageImage(0, singleVersionImg)
self.oneVersionTab.Layout()
# distributed version tab
self.distributedOptimizationTab = DistributedSystemOptimizationPanel(self.module, self)
self.AddPage(self.distributedOptimizationTab, "Distributed System Optimization")
self.SetPageImage(1, distributedVersionImg)
self.distributedOptimizationTab.Layout()
# self.compareTab = VersionsChartsPanel(self.module, self)
# self.AddPage(self.compareTab, "Versions' Charts")
# self.compareTab.Layout()
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns path to the
given file in the resource
('assets') dir
@return path to the resource
"""
tmp = os.path.split(os.path.dirname(__file__))
# find last / character in path
idx = tmp[0].rfind('/')
# get substring - path for resource
path = tmp[0][0:idx]
return os.path.join(path, 'bin', 'assets', resourceName)
def OnParsedModel(self):
""" """
self.oneVersionTab.RemoveAllSimulations()
# self.compareTab.RemoveAllSimulations()
self.distributedOptimizationTab.RemoveAllSimulations()
def OnSimulationFinished(self, simulator):
""" """
self.oneVersionTab.AddFinishedSimulation(simulator)
# self.compareTab.AddFinishedSimulation(simulator)
self.distributedOptimizationTab.AddFinishedSimulation(simulator)
def OnAllSimulationsFinished(self, simulators):
""" """
self.distributedOptimizationTab.OnAllSimulationsFinished(simulators)
class ModuleGui(wx.EvtHandler):
"""
Class used by GUI version of AQoPA.
"""
def __init__(self, module):
""" """
wx.EvtHandler.__init__(self)
self.module = module
self.mainResultNotebook = None
def get_name(self):
return "Time Analysis"
def get_configuration_panel(self, parent):
""" Returns WX panel with configuration controls. """
panel = wx.Panel(parent)
sizer = wx.BoxSizer(wx.VERTICAL)
text = wx.StaticText(panel, label="Module does not need to be configured.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
text = wx.StaticText(panel, label="All result options will be available after results are calculated.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
panel.SetSizer(sizer)
return panel
def get_results_panel(self, parent):
"""
        Creates the main result panel that exists from the beginning
        and is extended when versions' simulations are finished.
"""
self.mainResultNotebook = MainResultsNotebook(self.module, parent)
return self.mainResultNotebook
def on_finished_simulation(self, simulator):
""" """
self.mainResultNotebook.OnSimulationFinished(simulator)
def on_finished_all_simulations(self, simulators):
"""
Called once for all simulations after all of them are finished.
"""
self.mainResultNotebook.OnAllSimulationsFinished(simulators)
def on_parsed_model(self):
""" """
self.mainResultNotebook.OnParsedModel() | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/module/timeanalysis/gui.py | gui.py |
from math import ceil
import random
from aqopa.simulator.state import Hook, ExecutionResult
from aqopa.model import AssignmentInstruction,\
CallFunctionInstruction, IfInstruction, WhileInstruction,\
CommunicationInstruction, CallFunctionExpression, TupleExpression, ComparisonExpression, COMMUNICATION_TYPE_OUT, \
COMMUNICATION_TYPE_IN
from aqopa.module.timeanalysis.error import TimeSynchronizationException
from aqopa.simulator.error import RuntimeException
class PreInstructionHook(Hook):
"""
Execution hook executed before default core execution of each instruction.
Returns execution result.
"""
def __init__(self, module, simulator):
""" """
self.module = module
self.simulator = simulator
def execute(self, context, **kwargs):
"""
"""
instruction = context.get_current_instruction()
if instruction.__class__ not in [AssignmentInstruction, CallFunctionInstruction,
IfInstruction, WhileInstruction, CommunicationInstruction]:
return
if isinstance(instruction, CommunicationInstruction):
return self._execute_communication_instruction(context, **kwargs)
else:
self._update_time(context)
return ExecutionResult()
def _update_time(self, context):
"""
Update times in context according to current instruction.
"""
instruction = context.get_current_instruction()
if isinstance(instruction, AssignmentInstruction):
expression = instruction.expression
elif isinstance(instruction, CallFunctionInstruction):
expression = CallFunctionExpression(instruction.function_name,
instruction.arguments,
instruction.qop_arguments)
else:
expression = instruction.condition
        # Return details for each expression in the instruction.
        # Some instructions may contain multiple expressions (tuples, nested call functions).
total_time, time_details = self._get_time_details_for_expression(context, expression)
if total_time > 0:
host = context.get_current_host()
self.module.add_timetrace(self.simulator, host, host.get_current_process(),
instruction, time_details,
self.module.get_current_time(self.simulator, host),
total_time)
self.module.set_current_time(self.simulator, host, self.module.get_current_time(self.simulator, host) + total_time)
def _get_time_details_for_expression(self, context, expression):
"""
        Returns the total time and per-expression details of executing the expression.
"""
if isinstance(expression, TupleExpression):
return self._get_time_details_for_tuple_expression(context, expression)
elif isinstance(expression, CallFunctionExpression):
return self._get_time_details_for_simple_expression(context, expression)
elif isinstance(expression, ComparisonExpression):
return self._get_time_details_for_comparison_expression(context, expression)
return 0, []
def _get_time_details_for_tuple_expression(self, context, expression):
"""
Calculate execution time for tuple expression.
"""
total_time = 0
time_details = []
for i in range(0, len(expression.elements)):
element_total_time, element_time_details = self._get_time_details_for_expression(context, expression.elements[i])
total_time += element_total_time
time_details.extend(element_time_details)
return total_time, time_details
def _get_time_details_for_comparison_expression(self, context, expression):
"""
        Calculate execution time for comparison expression.
"""
total_time = 0
time_details = []
element_total_time, element_time_details = self._get_time_details_for_expression(context, expression.left)
total_time += element_total_time
time_details.extend(element_time_details)
element_total_time, element_time_details = self._get_time_details_for_expression(context, expression.right)
total_time += element_total_time
time_details.extend(element_time_details)
return total_time, time_details
def _get_time_for_exact_range_metric(self, metric_type, metric_unit, metric_value, expression, host, context):
"""
        Returns execution time of an expression which has an exact or range metric assigned.
"""
if metric_unit in ["ms", "s"]:
# Get time
if metric_type == "exact":
time_value = float(metric_value)
else: # range
mvalues = metric_value.split('..')
val_from = float(mvalues[0])
val_to = float(mvalues[1])
time_value = val_from + (val_to-val_from)*random.random()
# Update time according to the time unit
if metric_unit == "ms":
return time_value / 1000.0
else: # metric_unit == "s":
return time_value
elif metric_unit in ["mspb", "mspB", "kbps", "mbps"]:
mparts = metric_value.split(':')
time_metric = mparts[0]
indexes = None
if len(mparts) == 2:
indexes = [int(i) for i in mparts[1].split(',')]
# Get time
if metric_type == "exact":
time_value = float(time_metric)
else: # range
mvalues = time_metric.split('..')
val_from = float(mvalues[0])
val_to = float(mvalues[1])
time_value = val_from + (val_to-val_from)*random.random()
size = 0
if indexes is None:
populated_expression = context.expression_populator.populate(expression,
context.get_current_host())
size = context.metrics_manager.get_expression_size(populated_expression,
context, context.get_current_host())
else:
for index in indexes:
populated_expression = context.expression_populator.populate(
expression.arguments[index-1], context.get_current_host())
size += context.metrics_manager.get_expression_size(
populated_expression, context, context.get_current_host())
if metric_unit in ["mspb", "mspB"]:
msperbyte = float(time_value)
if metric_unit == "mspb":
msperbyte /= 8.0
elif metric_unit == "kbps":
msperbyte = 8000.0 / 1024.0 / time_value
else: # mbps
msperbyte = 8000.0 / 1024.0 / 1024.0 / time_value
# Make s from ms
sperbyte = msperbyte / 1000.0
return sperbyte * size
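    # Example metric values handled above (hypothetical metrics entries):
    #   exact: value "12.5" with unit "ms"   -> 0.0125 s
    #   range: value "10..20" with unit "ms" -> random time in [0.010 s, 0.020 s]
    #   size-based: value "0.5:1,2" with unit "mspB" -> 0.5 ms per byte of
    #   the combined sizes of arguments 1 and 2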
def _get_time_for_block_metric(self, metric_type, metric_unit, metric_value, expression, host, context):
"""
        Returns execution time of an expression which has a block metric assigned.
"""
mparts = metric_value.split(':')
element_index = None
if len(mparts) == 3:
element_index = int(mparts[0])-1
unit_time = float(mparts[1])
unit_size = int(mparts[2])
else:
unit_time = float(mparts[0])
unit_size = int(mparts[1])
time_unit = metric_unit[0]
if time_unit == "ms":
unit_time /= 1000.0
size_unit = metric_unit[1]
expression_to_populate = expression if element_index is None else expression.arguments[element_index]
populated_expression = context.expression_populator.populate(expression_to_populate, host)
argument_size = context.metrics_manager.get_expression_size(populated_expression,
context, host)
units = ceil(argument_size / float(unit_size))
time = units * unit_time
if size_unit == 'b':
time *= 8.0
return time
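    # Example block metric (hypothetical): value "1:2:16" with unit ("ms", "B")
    # charges 2 ms per started 16-byte block of the 1st argument's size;
    # a 40-byte argument gives ceil(40 / 16.0) * 0.002 s = 0.006 s.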
def _get_time_details_for_simple_expression(self, context, expression):
"""
Calculate time for expression.
"""
time = 0
time_details = []
metric = context.metrics_manager\
.find_primitive(context.get_current_host(), expression)
if metric:
block = metric.block
for i in range(0, len(block.service_params)):
sparam = block.service_params[i]
if sparam.service_name.lower().strip() != "time":
continue
metric_type = sparam.param_name.lower().strip()
metric_unit = sparam.unit
metric_value = metric.service_arguments[i].strip()
if metric_type in ["exact", "range"]:
time = self._get_time_for_exact_range_metric(metric_type, metric_unit, metric_value,
expression, context.get_current_host(), context)
elif metric_type == "block":
time = self._get_time_for_block_metric(metric_type, metric_unit, metric_value,
expression, context.get_current_host(), context)
elif metric_type == 'algorithm':
algorithm_name = metric_value
if not context.algorithms_resolver.has_algorithm(algorithm_name):
raise RuntimeException("Communication algorithm {0} undeclared.".format(algorithm_name))
alg = context.algorithms_resolver.get_algorithm(algorithm_name)
variables = {alg['parameter']: expression}
time = context.algorithms_resolver.calculate(context, context.get_current_host(),
algorithm_name, variables)
# Update time according to the time unit
if metric_unit == "ms":
time /= 1000.0
if time > 0:
time_details.append((expression, time))
for expr in expression.arguments:
argument_time, argument_time_details = self._get_time_details_for_expression(context, expr)
time += argument_time
time_details.extend(argument_time_details)
return time, time_details
def _get_time_for_communication_step(self, context, host, channel, metric, message, receiver=None):
""" Returns time of sending/receiving message """
if metric['type'] == 'metric':
metric_value = metric['value']
elif metric['type'] == 'algorithm':
algorithm_name = metric['name']
if not context.algorithms_resolver.has_algorithm(algorithm_name):
raise RuntimeException("Communication algorithm {0} undeclared.".format(algorithm_name))
link_quality = context.channels_manager.get_router().get_link_quality(channel.tag_name,
message.sender,
receiver)
alg = context.algorithms_resolver.get_algorithm(algorithm_name)
variables = {
'link_quality': link_quality,
alg['parameter']: message.expression,
}
metric_value = context.algorithms_resolver.calculate(context, host, algorithm_name, variables)
else:
return 0
unit = metric['unit']
if unit == 'ms':
# exact time
return metric_value / 1000.0
else:
# time depending on the size
populated_expression = context.expression_populator.populate(message.expression, message.sender)
# size in bytes
size = context.metrics_manager.get_expression_size(populated_expression, context, message.sender)
mspB = 0
if unit == 'mspB':
mspB = float(metric_value)
elif unit == 'mspb':
mspB = float(metric_value) * 8.0
elif unit == 'kbps':
mspB = 1.0 / (float(metric_value) * 0.128)
elif unit == 'mbps':
mspB = 1.0 / (float(metric_value) * 0.000128)
spB = mspB / 1000.0
return spB * size
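    # Example (hypothetical): unit 'kbps' with metric_value 1000 gives
    # 1000 * 1024 / 8 = 128000 bytes/s, i.e. mspB = 1/128 ms per byte,
    # so a 256-byte message takes 256 / 128000.0 = 0.002 s.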
def _find_communication_metric(self, context, channel, sender, receiver=None):
"""
Return the communication metric from topology.
"""
return context.channels_manager.get_router().get_link_parameter_value('time', channel.tag_name, sender, receiver)
def _get_time_of_sending_point_to_point(self, context, channel, message, request):
""" Returns time of message sending process """
metric = self._find_communication_metric(context, channel, message.sender, receiver=request.receiver)
if metric is None:
return 0
return self._get_time_for_communication_step(context, message.sender, channel, metric, message, receiver=request.receiver)
def _get_time_of_sending_boradcast(self, context, channel, message):
""" Returns time of message sending process """
metric = self._find_communication_metric(context, channel, message.sender, receiver=None)
if metric is None:
return 0
return self._get_time_for_communication_step(context, message.sender, channel, metric, message, receiver=None)
def _get_time_of_receiving(self, context, channel, message, request):
""" Returns time of message receiving process """
metric = self._find_communication_metric(context, channel, message.sender, receiver=request.receiver)
if metric is None:
return 0
return self._get_time_for_communication_step(context, message.sender, channel, metric, message, receiver=request.receiver)
def _execute_communication_instruction(self, context, **kwargs):
""" """
channel = context.channels_manager.find_channel_for_current_instruction(context)
if not channel:
return ExecutionResult(result_kwargs=kwargs)
if kwargs is None:
kwargs = {}
instruction = context.get_current_instruction()
        # TODO: Consider the situation when a host has many contexts and in one context it is waiting
        # for a message while in another it can go further (non-communication instruction)
# Check if other hosts should send or receive their message through this channel before
current_host_time = self.module.get_current_time(self.simulator, context.get_current_host())
min_hosts_time = (context.get_current_host(), current_host_time)
delay_communication_execution = False
for h in context.hosts:
# Omit finished hosts
if h.finished():
continue
# Omit current host
if h == context.get_current_host():
continue
# Omit hosts not using current channel
if not channel.is_connected_with_host(h):
continue
# Check the time of other hosts
host_time = self.module.get_current_time(self.simulator, h)
# If hosts have the same time - the IN instruction should be executed before OUT execution
# and other instructions (non-communication) should be executed before as well
if host_time == min_hosts_time[1]:
for instructions_context in h.get_all_instructions_contexts():
if instructions_context.finished():
continue
host_current_instruction = instructions_context.get_current_instruction()
if isinstance(host_current_instruction, CommunicationInstruction):
if instruction.is_out() and not host_current_instruction.is_out():
host_channel = context.channels_manager.find_channel_for_host_instruction(
context, h, host_current_instruction)
if host_channel:
if not host_channel.is_waiting_on_instruction(h, host_current_instruction):
delay_communication_execution = True
else:
delay_communication_execution = True
            # If the host checked in the loop is in the past relative to the current host's time
if host_time < min_hosts_time[1]:
                # In general, the instruction in the host with the smaller time
                # should be executed earlier
host_delay_communication_execution = True
# The only exception is when the host in the past is waiting on IN instruction
# and the request cannot be fulfilled
current_instruction = h.get_current_instructions_context().get_current_instruction()
if isinstance(current_instruction, CommunicationInstruction):
if not current_instruction.is_out():
# The checked host (the one in the past) is waiting for the message
# Lets check if the messages for the host are in the channel
host_channel = context.channels_manager.find_channel_for_host_instruction(
context, h, current_instruction)
if host_channel:
# If host is not waiting on IN instruction - let it go first
# If host is waiting - check if it is fulfilled
if host_channel.is_waiting_on_instruction(h, current_instruction):
request = host_channel.get_existing_request_for_instruction(h, current_instruction)
if not request.ready_to_fulfill():
host_delay_communication_execution = False
if host_delay_communication_execution:
delay_communication_execution = True
## Delay execution of this instruction
## if needed according to previous check
if delay_communication_execution:
return ExecutionResult(custom_index_management=True,
finish_instruction_execution=True,
result_kwargs=kwargs)
##############################################
# Now the host with minimal time is executed
##############################################
if instruction.is_out():
# OUT instruction
sender = context.get_current_host()
            # Get the message being sent from the upper executor or build a new one
if 'sent_message' in kwargs:
sent_message = kwargs['sent_message']
else:
sent_message = context.channels_manager.build_message(
context.get_current_host(),
context.get_current_host().get_variable(instruction.variable_name).clone(),
context.expression_checker)
kwargs['sent_message'] = sent_message
sender_time = self.module.get_current_time(self.simulator, sender)
# DEBUG #
# print 'sending message from ', sender.name, 'at', sender_time
# DEBUG #
all_requests = channel.get_filtered_requests(sent_message, context.channels_manager.get_router())
accepted_requests = []
for request in all_requests:
if self.module.get_request_created_time(self.simulator, request) > sender_time:
sent_message.cancel_for_request(request)
else:
accepted_requests.append(request)
if len(accepted_requests) > 1:
# broadcast
# Send message with broadcast metric, update sender's time and fill in the sending time in message
broadcast_time = self._get_time_of_sending_boradcast(context, channel, sent_message)
self.module.add_message_sent_time(self.simulator, sent_message, sender_time)
self.module.add_message_sending_time(self.simulator, sent_message, broadcast_time)
self.module.set_current_time(self.simulator, sender, sender_time + broadcast_time)
for request in accepted_requests:
if request.is_waiting and request.assigned_message is None:
receiving_time = self._get_time_of_receiving(context, channel, sent_message, request)
receiving_time = max(receiving_time, broadcast_time)
started_waiting_at = self.module.get_request_created_time(self.simulator, request)
receiver_time = self.module.get_current_time(self.simulator, request.receiver)
receiver_time = max(receiver_time, sender_time)
self.module.set_current_time(self.simulator, request.receiver,
receiver_time + receiving_time)
# Add timetrace to module
self.module.add_channel_message_trace(self.simulator, channel, sent_message,
sent_message.sender, sender_time, broadcast_time,
request.receiver, started_waiting_at,
receiver_time, receiving_time)
# DEBUG #
# print 'msg from', sent_message.sender.name, \
# 'at', sender_time, '(', broadcast_time, 'ms ) ', 'to', len(accepted_requests), 'of', \
# len(all_requests), 'hosts', '- message', unicode(sent_message.expression)
# DEBUG #
elif len(accepted_requests) == 1:
# point to point
# Send message with link metric, update sender's time and fill in the sending time in message
request = accepted_requests[0]
sending_time = self._get_time_of_sending_point_to_point(context, channel,
sent_message, request)
self.module.add_message_sent_time(self.simulator, sent_message, sender_time)
self.module.add_message_sending_time(self.simulator, sent_message, sending_time)
self.module.set_current_time(self.simulator, sender, sender_time + sending_time)
receiving_time = self._get_time_of_receiving(context, channel, sent_message, request)
receiving_time = max(receiving_time, sending_time)
receiver_time = self.module.get_current_time(self.simulator, request.receiver)
receiver_time = max(receiver_time, sender_time)
self.module.set_current_time(self.simulator, request.receiver,
receiver_time + receiving_time)
started_waiting_at = self.module.get_request_created_time(self.simulator, request)
# Add timetrace to module
self.module.add_channel_message_trace(self.simulator, channel, sent_message,
sent_message.sender, sender_time, sending_time,
request.receiver, started_waiting_at,
receiver_time, receiving_time)
# DEBUG #
# print 'msg from', sent_message.sender.name, 'to', request.receiver.name, \
# 'at', sender_time, '(', sending_time, 'ms ) ', 'and started receiving at', \
# receiver_time, '(t:', receiving_time, 'ms, wait since:', started_waiting_at, 'ms)', \
# 'message', unicode(sent_message.expression)
# DEBUG #
else: # zero receivers
# Send message with broadcast metric and update sender's time - no receivers
broadcast_time = self._get_time_of_sending_boradcast(context, channel, sent_message)
self.module.add_message_sent_time(self.simulator, sent_message, sender_time)
self.module.add_message_sending_time(self.simulator, sent_message, broadcast_time)
self.module.set_current_time(self.simulator, sender, sender_time + broadcast_time)
# DEBUG #
# print 'all requests', len(all_requests), ' - accepted:', len(accepted_requests)
# print 'msg from', sent_message.sender.name, \
# 'at', sender_time, '(', broadcast_time, 'ms ) ', \
# 'message', unicode(sent_message.expression)
# DEBUG #
else:
# IN instruction
if 'messages_request' in kwargs:
request = kwargs['messages_request']
else:
request = context.channels_manager.build_message_request(context.get_current_host(),
instruction,
context.expression_populator)
kwargs['messages_request'] = request
# If waiting request has NOT been created and added before
if not channel.is_waiting_on_instruction(request.receiver, request.instruction):
receiver = context.get_current_host()
self.module.add_request_created_time(self.simulator, request,
self.module.get_current_time(self.simulator, receiver))
# DEBUG #
# print 'msg requested by', receiver.name, 'at', self.module.get_current_time(self.simulator, receiver)
# DEBUG #
if request.assigned_message is None:
# Set messages from the past as cancelled for this request
all_messages = channel.get_filtered_messages(request, context.channels_manager.get_router())
accepted_messages = []
for message in all_messages:
if self.module.get_request_created_time(self.simulator, request) > \
self.module.get_message_sent_time(self.simulator, message):
message.cancel_for_request(request)
else:
accepted_messages.append(message)
if len(accepted_messages) > 0:
message = accepted_messages[0]
sending_time = self.module.get_message_sending_time(self.simulator, message)
receiving_time = self._get_time_of_receiving(context, channel, message, request)
receiving_time = max(receiving_time, sending_time)
receiver_time = self.module.get_current_time(self.simulator, request.receiver)
self.module.set_current_time(self.simulator, request.receiver,
receiver_time + receiving_time)
# Add timetrace to module
message_sent_time = self.module.get_message_sent_time(self.simulator, message)
started_waiting_at = self.module.get_request_created_time(self.simulator, request)
self.module.add_channel_message_trace(self.simulator, channel, message,
message.sender, message_sent_time, sending_time,
request.receiver, started_waiting_at,
receiver_time, receiving_time)
return ExecutionResult(result_kwargs=kwargs) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/module/timeanalysis/hook.py | hook.py |
import sys
from aqopa.simulator.state import Hook
class PrintResultsHook(Hook):
def __init__(self, module, simulator, output_file=sys.stdout):
""" """
self.module = module
self.simulator = simulator
self.output_file = output_file
def execute(self, context, **kwargs):
""" """
self.output_file.write('-'*80)
self.output_file.write('\n')
self.output_file.write('Module\tTime Analysis (time in s)')
self.output_file.write('\n')
self.output_file.write('Version\t%s\n\n' % self.simulator.context.version.name)
if self.simulator.infinite_loop_occured():
self.output_file.write('ERROR\tInfinite loop on {0} -> {1}\n'.format(
unicode(self.simulator.context.get_current_host()),
unicode(self.simulator.context.get_current_instruction())))
self.output_file.write('\n')
for h in context.hosts:
self.output_file.write('{0}\t{1}\t'.format(h.name, str(self.module.get_current_time(self.simulator, h))))
if h.finished():
self.output_file.write('Finished')
if h.get_finish_error():
self.output_file.write(' with error\t{0}'.format(h.get_finish_error()))
else:
self.output_file.write('NOT Finished\t{0}'.format(unicode(h.get_current_instructions_context()\
.get_current_instruction())))
self.output_file.write("\n")
self.output_file.write('\n')
self.output_file.write('Dropped messages\n')
i = 0
for c in context.channels_manager.channels:
if c.get_dropped_messages_nb() > 0:
i += 1
self.output_file.write('%s\t%d\n' % (c.name, c.get_dropped_messages_nb()))
if i == 0:
self.output_file.write('None\n')
self.output_file.write('\n') | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/module/timeanalysis/console.py | console.py |
from aqopa.model import MetricsServiceParam
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
class Builder():
def create_metrics_services_param_time(self, token):
"""
metrics_services_param : SQLPARAN TIME COLON metrics_services_time_type LPARAN metrics_services_time_unit RPARAN SQRPARAN
| SQLPARAN TIME COLON metrics_services_time_two_params_type LPARAN metrics_services_exact_time_unit COMMA metrics_services_size_unit RPARAN SQRPARAN
"""
if len(token) == 9:
return MetricsServiceParam(token[2], token[4], token[6])
else:
return MetricsServiceParam(token[2], token[4], (token[6], token[8]))
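    # Example (hypothetical metrics declarations matched by the rules above):
    #   [time: exact(ms)]    -> MetricsServiceParam('time', 'exact', 'ms')
    #   [time: block(ms, B)] -> MetricsServiceParam('time', 'block', ('ms', 'B'))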
class ModelParserExtension(LexYaccParserExtension):
"""
Extension for timeanalysis module communications time
"""
#######################
# Communication Time
#######################
def time_default_parameter(self, t):
"""
medium_default_parameter : TIME_DEFAULT_PARAMETER EQUAL comm_time_value
"""
t[0] = {'default_time': t[3]}
def topology_rule_time_parameter(self, t):
"""
topology_rule_parameter : TIME_PARAMETER EQUAL comm_time_value
"""
t[0] = {'time': t[3]}
def comm_time_value(self, t):
"""
comm_time_value : comm_time_metric
| comm_time_algorithm
"""
t[0] = t[1]
def comm_time_metric(self, t):
"""
comm_time_metric : number comm_time_metric_unit
"""
t[0] = {
'type': 'metric',
'value': t[1],
'unit': t[2]
}
def comm_time_algorithm(self, t):
"""
comm_time_algorithm : IDENTIFIER SQLPARAN comm_time_metric_unit SQRPARAN
| IDENTIFIER
"""
unit = 'ms'
if len(t) == 5:
unit = t[3]
t[0] = {
'type': 'algorithm',
'name': t[1],
'unit': unit
}
def comm_time_metric_unit(self, t):
"""
comm_time_metric_unit : MS
| MSPBIT
| MSPBYTE
| KBYTEPS
| MBYTEPS
"""
t[0] = t[1]
def _extend(self):
""" """
self.parser.add_reserved_word('ms', 'MS', state='communication', case_sensitive=True)
self.parser.add_reserved_word('mspb', 'MSPBIT', state='communication', case_sensitive=True)
self.parser.add_reserved_word('mspB', 'MSPBYTE', state='communication', case_sensitive=True)
self.parser.add_reserved_word('kbps', 'KBYTEPS', state='communication', case_sensitive=True)
self.parser.add_reserved_word('mbps', 'MBYTEPS', state='communication', case_sensitive=True)
self.parser.add_reserved_word('default_time', 'TIME_DEFAULT_PARAMETER', state='communication',)
self.parser.add_reserved_word('time', 'TIME_PARAMETER', state='communication',)
self.parser.add_rule(self.time_default_parameter)
self.parser.add_rule(self.topology_rule_time_parameter)
self.parser.add_rule(self.comm_time_value)
self.parser.add_rule(self.comm_time_metric)
self.parser.add_rule(self.comm_time_metric_unit)
self.parser.add_rule(self.comm_time_algorithm)
class ConfigParserExtension(LexYaccParserExtension):
"""
Extension for timeanalysis module communications time
"""
#######################
# Communication Time
#######################
def version_time_default_parameter(self, t):
"""
version_medium_default_parameter : TIME_DEFAULT_PARAMETER EQUAL version_comm_time_value
"""
t[0] = {'default_time': t[3]}
def version_topology_rule_time_parameter(self, t):
"""
version_topology_rule_parameter : TIME_PARAMETER EQUAL version_comm_time_value
"""
t[0] = {'time': t[3]}
def version_comm_time_value(self, t):
"""
version_comm_time_value : version_comm_time_metric
| version_comm_time_algorithm
"""
t[0] = t[1]
def version_comm_time_metric(self, t):
"""
version_comm_time_metric : number version_comm_time_metric_unit
"""
t[0] = {
'type': 'metric',
'value': t[1],
'unit': t[2]
}
def version_comm_time_algorithm(self, t):
"""
version_comm_time_algorithm : IDENTIFIER SQLPARAN version_comm_time_metric_unit SQRPARAN
| IDENTIFIER
"""
unit = 'ms'
if len(t) == 5:
unit = t[3]
t[0] = {
'type': 'algorithm',
'name': t[1],
'unit': unit
}
def version_comm_time_metric_unit(self, t):
"""
version_comm_time_metric_unit : MS
| MSPBIT
| MSPBYTE
| KBYTEPS
| MBYTEPS
"""
t[0] = t[1]
def _extend(self):
""" """
self.parser.add_reserved_word('ms', 'MS', state='versioncommunication', case_sensitive=True)
self.parser.add_reserved_word('mspb', 'MSPBIT', state='versioncommunication', case_sensitive=True)
self.parser.add_reserved_word('mspB', 'MSPBYTE', state='versioncommunication', case_sensitive=True)
self.parser.add_reserved_word('kbps', 'KBYTEPS', state='versioncommunication', case_sensitive=True)
self.parser.add_reserved_word('mbps', 'MBYTEPS', state='versioncommunication', case_sensitive=True)
self.parser.add_reserved_word('default_time', 'TIME_DEFAULT_PARAMETER', state='versioncommunication',)
self.parser.add_reserved_word('time', 'TIME_PARAMETER', state='versioncommunication',)
self.parser.add_rule(self.version_time_default_parameter)
self.parser.add_rule(self.version_topology_rule_time_parameter)
self.parser.add_rule(self.version_comm_time_value)
self.parser.add_rule(self.version_comm_time_metric)
self.parser.add_rule(self.version_comm_time_metric_unit)
self.parser.add_rule(self.version_comm_time_algorithm)
class MetricsParserExtension(LexYaccParserExtension):
"""
Extension for parsing timeanalysis module metrics
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
#######################
# Metrics Time
#######################
def metrics_services_param_time(self, t):
"""
metrics_services_param : SQLPARAN TIME COLON metrics_services_time_type LPARAN metrics_services_time_unit RPARAN SQRPARAN
| SQLPARAN TIME COLON metrics_services_time_two_params_type LPARAN metrics_services_exact_time_unit COMMA metrics_services_size_unit RPARAN SQRPARAN
"""
t[0] = self.builder.create_metrics_services_param_time(t)
def metrics_services_time_one_param_type(self, t):
"""
metrics_services_time_type : EXACT
| RANGE
| ALGORITHM
"""
t[0] = t[1].lower()
def metrics_services_time_two_params_type(self, t):
"""
metrics_services_time_two_params_type : BLOCK
"""
t[0] = t[1].lower()
def metrics_services_time_unit(self, t):
"""
metrics_services_time_unit : S
| MS
| MSPBIT
| MSPBYTE
| KBYTEPS
| MBYTEPS
"""
t[0] = t[1]
def metrics_services_exact_time_unit(self, t):
"""
metrics_services_exact_time_unit : S
| MS
"""
t[0] = t[1].lower()
def metrics_services_size_unit(self, t):
"""
metrics_services_size_unit : B
| b
"""
t[0] = t[1]
def _extend(self):
""" """
self.parser.add_token('COMMA', r'\,', states=['metricsprimhead'])
self.parser.add_reserved_word('time', 'TIME', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('exact', 'EXACT', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('range', 'RANGE', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('s', 'S', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('ms', 'MS', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('mspb', 'MSPBIT', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('mspB', 'MSPBYTE', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('kbps', 'KBYTEPS', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('mbps', 'MBYTEPS', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('algorithm', 'ALGORITHM', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('block', 'BLOCK', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('b', 'b', state='metricsprimhead', case_sensitive=True)
self.parser.add_reserved_word('B', 'B', state='metricsprimhead', case_sensitive=True)
self.parser.add_rule(self.metrics_services_param_time)
self.parser.add_rule(self.metrics_services_time_one_param_type)
self.parser.add_rule(self.metrics_services_time_two_params_type)
self.parser.add_rule(self.metrics_services_time_unit)
self.parser.add_rule(self.metrics_services_exact_time_unit)
self.parser.add_rule(self.metrics_services_size_unit) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/module/timeanalysis/parser.py | parser.py |
from aqopa import module
from aqopa.simulator.state import HOOK_TYPE_PRE_INSTRUCTION_EXECUTION,\
HOOK_TYPE_SIMULATION_FINISHED
from .parser import MetricsParserExtension, ConfigParserExtension, ModelParserExtension
from .hook import PreInstructionHook
from .model import TimeTrace, ChannelMessageTrace
from .gui import ModuleGui
from .console import PrintResultsHook
class Module(module.Module):
"""
"""
def __init__(self):
""" """
self.timetraces = {} # Generated timetraces list for each simulator
# (divided by simulators - the reason for dict)
self.current_times = {} # Current times of hosts, key - host instance divided by simulators
self.guis = {} # Divided by simulators - the reason for dict
self.channel_request_times = {} # Times when a request has been created
# (divided by simulators - the reason for dict)
self.channel_message_times = {} # Times when a message has been sent
# (divided by simulators - the reason for dict)
self.channel_message_sending_times = {} # The time of sending a message
# (divided by simulators - the reason for dict)
self.channel_message_traces = {} # Time traces for communication steps
# (divided by simulators - the reason for dict)
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def extend_model_parser(self, parser):
"""
Overridden
"""
parser.add_extension(ModelParserExtension())
return parser
def extend_metrics_parser(self, parser):
"""
Overridden
"""
parser.add_extension(MetricsParserExtension())
return parser
def extend_config_parser(self, parser):
"""
Overridden
"""
parser.add_extension(ConfigParserExtension())
return parser
def _install(self, simulator):
"""
"""
hook = PreInstructionHook(self, simulator)
simulator.register_hook(HOOK_TYPE_PRE_INSTRUCTION_EXECUTION, hook)
return simulator
def install_console(self, simulator):
""" Install module for console simulation """
self._install(simulator)
hook = PrintResultsHook(self, simulator)
simulator.register_hook(HOOK_TYPE_SIMULATION_FINISHED, hook)
return simulator
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def add_timetrace(self, simulator, host, process, instruction, expressions_details, started_at, length):
""" """
if simulator not in self.timetraces:
self.timetraces[simulator] = []
tt = self.timetraces[simulator]
tt.append(TimeTrace(host, process, instruction, expressions_details, started_at, length))
def get_timetraces(self, simulator):
""" """
if simulator not in self.timetraces:
return []
return self.timetraces[simulator]
def add_channel_message_trace(self, simulator, channel, message,
sender, sent_at, sending_time,
receiver, started_waiting_at,
started_receiving_at, receiving_time):
if simulator not in self.channel_message_traces:
self.channel_message_traces[simulator] = {}
if channel not in self.channel_message_traces[simulator]:
self.channel_message_traces[simulator][channel] = []
cmt = self.channel_message_traces[simulator][channel]
cmt.append(ChannelMessageTrace(channel, message,
sender, sent_at, sending_time,
receiver, started_waiting_at,
started_receiving_at, receiving_time))
def get_channel_message_traces(self, simulator, channel):
""" """
if simulator not in self.channel_message_traces:
return []
if channel not in self.channel_message_traces[simulator]:
return []
return self.channel_message_traces[simulator][channel]
def get_all_channel_message_traces(self, simulator):
""" """
if simulator not in self.channel_message_traces:
return []
return self.channel_message_traces[simulator]
def set_current_time(self, simulator, host, time):
""" """
if simulator not in self.current_times:
self.current_times[simulator] = {}
self.current_times[simulator][host] = time
def get_current_time(self, simulator, host):
""" """
if simulator not in self.current_times:
self.current_times[simulator] = {}
if host not in self.current_times[simulator]:
self.current_times[simulator][host] = 0
return self.current_times[simulator][host]
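The per-simulator, per-host bookkeeping above initializes missing keys by hand before every lookup. A minimal sketch of the same default-to-zero behavior using `collections.defaultdict` — shown only as an illustration of the pattern, not the module's actual storage:

```python
# Sketch only: stand-in for the nested current_times dict used above.
from collections import defaultdict

current_times = defaultdict(lambda: defaultdict(float))

# an unknown (simulator, host) pair defaults to 0, as in get_current_time
print(current_times['sim1']['hostA'])

current_times['sim1']['hostA'] = 12.5
print(current_times['sim1']['hostA'])
```

The explicit dicts in the module keep the stored structure inspectable (no factory-created empty entries on read), which is a reasonable trade-off here.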
def add_message_sent_time(self, simulator, message, time):
""" """
if simulator not in self.channel_message_times:
self.channel_message_times[simulator] = {}
self.channel_message_times[simulator][message] = time
def get_message_sent_time(self, simulator, message):
if simulator not in self.channel_message_times:
return None
if message not in self.channel_message_times[simulator]:
return None
return self.channel_message_times[simulator][message]
def add_message_sending_time(self, simulator, message, time):
""" """
if simulator not in self.channel_message_sending_times:
self.channel_message_sending_times[simulator] = {}
self.channel_message_sending_times[simulator][message] = time
def get_message_sending_time(self, simulator, message):
if simulator not in self.channel_message_sending_times:
return None
if message not in self.channel_message_sending_times[simulator]:
return None
return self.channel_message_sending_times[simulator][message]
def add_request_created_time(self, simulator, request, time):
""" """
if simulator not in self.channel_request_times:
self.channel_request_times[simulator] = {}
self.channel_request_times[simulator][request] = time
def get_request_created_time(self, simulator, request):
if simulator not in self.channel_request_times:
return None
if request not in self.channel_request_times[simulator]:
return None
return self.channel_request_times[simulator][request]

# source: aqopa/module/timeanalysis/__init__.py (AQoPA 0.9.5)
"""
@file gui.py
@brief gui file for the financialanalysis module
@author Katarzyna Mazur
"""
import wx
import os
from aqopa.gui.general_purpose_frame_gui import GeneralFrame
class SingleVersionPanel(wx.Panel):
def __init__(self, module, *args, **kwargs):
wx.Panel.__init__(self, *args, **kwargs)
self.module = module
self.versionSimulator = {}
# ################
# VERSION BOX
#################
versionBox = wx.StaticBox(self, label="Version")
versionsLabel = wx.StaticText(self, label="Choose Version To See\nAnalysis Results:")
self.versionsList = wx.ComboBox(self, style=wx.TE_READONLY, size=(200, -1))
self.versionsList.Bind(wx.EVT_COMBOBOX, self.OnVersionChanged)
versionBoxSizer = wx.StaticBoxSizer(versionBox, wx.HORIZONTAL)
versionBoxSizer.Add(versionsLabel, 0, wx.ALL | wx.ALIGN_CENTER, 5)
versionBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
versionBoxSizer.Add(self.versionsList, 1, wx.ALL | wx.ALIGN_CENTER, 5)
##################################
# FINANCIAL RESULTS BOX
##################################
self.cashBox = wx.StaticBox(self, label="The Financial Analysis Results")
self.cashLabel = wx.StaticText(self, label="Price of one kWh [$]:")
self.cashInput = wx.TextCtrl(self, size=(200, -1))
cashSizer = wx.BoxSizer(wx.HORIZONTAL)
cashSizer.Add(self.cashLabel, 0, wx.ALL | wx.EXPAND, 5)
cashSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
cashSizer.Add(self.cashInput, 1, wx.ALL | wx.EXPAND | wx.ALIGN_RIGHT, 5)
hostsBox, hostsBoxSizer = self._BuildHostsBoxAndSizer()
cashBoxSizer = wx.StaticBoxSizer(self.cashBox, wx.VERTICAL)
cashBoxSizer.Add(cashSizer, 0, wx.ALL | wx.EXPAND)
cashBoxSizer.Add(wx.StaticText(self), 0, wx.ALL | wx.EXPAND, 5)
cashBoxSizer.Add(hostsBoxSizer, 1, wx.ALL | wx.EXPAND)
#################
# BUTTONS LAY
#################
self.showFinanceResultsBtn = wx.Button(self, label="Show")
self.showFinanceResultsBtn.Bind(wx.EVT_BUTTON, self.OnShowFinanceResultsBtnClicked)
buttonsSizer = wx.BoxSizer(wx.HORIZONTAL)
buttonsSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
buttonsSizer.Add(self.showFinanceResultsBtn, 0, wx.ALL | wx.EXPAND, 5)
#################
# MAIN LAY
#################
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(versionBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(cashBoxSizer, 0, wx.ALL | wx.EXPAND, 5)
sizer.Add(buttonsSizer, 0, wx.ALL | wx.EXPAND, 5)
self.SetSizer(sizer)
self.SetVersionsResultsVisibility(False)
def OnShowFinanceResultsBtnClicked(self, event):
cashText = self.cashInput.GetValue().strip()
try:
price = float(cashText)
except ValueError:
wx.MessageBox("'%s' is not a valid price. Please correct it." % cashText,
'Error', wx.OK | wx.ICON_ERROR)
return
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
selected_host = self._GetSelectedHost(simulator)
# get some financial info from module
all_costs = self.module.calculate_all_costs(simulator, simulator.context.hosts, price)
mincost, minhost = self.module.get_min_cost(simulator, simulator.context.hosts)
maxcost, maxhost = self.module.get_max_cost(simulator, simulator.context.hosts)
total_cost = self.module.get_total_cost(simulator, simulator.context.hosts)
avg_cost = self.module.get_avg_cost(simulator, simulator.context.hosts)
curr_cost = all_costs[selected_host]
# after all calculations, build the GUI
title = "Financial Analysis for Host: "
title += selected_host.original_name()
cashWindow = GeneralFrame(self, "Financial Analysis Results", title, "modules_results.png")
cashPanel = wx.Panel(cashWindow)
# ########################################################################
# ACTUAL COSTS
#########################################################################
actualCostsBox = wx.StaticBox(cashPanel, label="Actual Costs of CPU Power Consumption")
actualCostsBoxSizer = wx.StaticBoxSizer(actualCostsBox, wx.VERTICAL)
#########################################################################
# cost of the selected host
#########################################################################
infoLabel = "Cost for Host: "
hostInfoLabel = wx.StaticText(cashPanel, label=infoLabel)
costLabel ="%.15f" % curr_cost + " $"
hostCostLabel = wx.StaticText(cashPanel, label=costLabel)
sizer1 = wx.BoxSizer(wx.HORIZONTAL)
sizer1.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer1.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer1.Add(hostCostLabel, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# minimal cost of version (minimal cost of every host in given version)
#########################################################################
infoLabel = "Minimal Version Cost (Host: " + minhost.original_name() + ")"
hostInfoLabel = wx.StaticText(cashPanel, label=infoLabel)
costLabel ="%.15f" % mincost + " $"
hostCostLabel = wx.StaticText(cashPanel, label=costLabel)
sizer2 = wx.BoxSizer(wx.HORIZONTAL)
sizer2.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer2.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer2.Add(hostCostLabel, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# maximal cost of version (maximal cost of every host in given version)
#########################################################################
infoLabel = "Maximal Version Cost (Host: " + maxhost.original_name() + ")"
hostInfoLabel = wx.StaticText(cashPanel, label=infoLabel)
costLabel ="%.15f" % maxcost + " $"
hostCostLabel = wx.StaticText(cashPanel, label=costLabel)
sizer3 = wx.BoxSizer(wx.HORIZONTAL)
sizer3.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer3.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer3.Add(hostCostLabel, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# average version cost
#########################################################################
infoLabel = "Average Version Cost: "
hostInfoLabel = wx.StaticText(cashPanel, label=infoLabel)
costLabel ="%.15f" % avg_cost + " $"
hostCostLabel = wx.StaticText(cashPanel, label=costLabel)
sizer4 = wx.BoxSizer(wx.HORIZONTAL)
sizer4.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer4.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer4.Add(hostCostLabel, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# total version cost
#########################################################################
infoLabel = "Total Version Cost: "
hostInfoLabel = wx.StaticText(cashPanel, label=infoLabel)
costLabel ="%.15f" % total_cost + " $"
hostCostLabel = wx.StaticText(cashPanel, label=costLabel)
sizer5 = wx.BoxSizer(wx.HORIZONTAL)
sizer5.Add(hostInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
sizer5.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
sizer5.Add(hostCostLabel, 0, wx.ALL | wx.EXPAND, 5)
actualCostsBoxSizer.Add(sizer1, 0, wx.ALL | wx.EXPAND, 5)
actualCostsBoxSizer.Add(sizer2, 0, wx.ALL | wx.EXPAND, 5)
actualCostsBoxSizer.Add(sizer3, 0, wx.ALL | wx.EXPAND, 5)
actualCostsBoxSizer.Add(sizer4, 0, wx.ALL | wx.EXPAND, 5)
actualCostsBoxSizer.Add(sizer5, 0, wx.ALL | wx.EXPAND, 5)
#########################################################################
# MAIN LAYOUT
#########################################################################
mainSizer = wx.BoxSizer(wx.VERTICAL)
mainSizer.Add(actualCostsBoxSizer, 1, wx.ALL | wx.EXPAND, 5)
cashPanel.SetSizer(mainSizer)
cashPanel.Layout()
cashWindow.CentreOnScreen()
cashWindow.AddPanel(cashPanel)
cashWindow.SetWindowSize(600, 300)
cashWindow.Show()
def _GetSelectedHost(self, simulator):
host = None
# get selected host name from combo
hostName = self.hostsList.GetValue()
# find host with the selected name
for h in simulator.context.hosts:
if h.original_name() == hostName:
host = h
break
return host
def _PopulateComboWithHostsNames(self, simulator):
hostsNames = []
for h in simulator.context.hosts:
if h.original_name() not in hostsNames:
hostsNames.append(h.original_name())
self.hostsList.Clear()
self.hostsList.AppendItems(hostsNames)
# ################
# LAYOUT
#################
def _BuildHostsBoxAndSizer(self):
""" """
self.chooseHostLbl = wx.StaticText(self, label="Choose Host To See\nits Total Cost:")
self.hostsList = wx.ComboBox(self, style=wx.TE_READONLY, size=(200, -1))
self.hostsBox = wx.StaticBox(self, label="Host(s)")
self.hostsBoxSizer = wx.StaticBoxSizer(self.hostsBox, wx.HORIZONTAL)
self.hostsBoxSizer.Add(self.chooseHostLbl, 0, wx.ALL | wx.EXPAND, 5)
self.hostsBoxSizer.Add(wx.StaticText(self), 1, wx.ALL | wx.EXPAND, 5)
self.hostsBoxSizer.Add(self.hostsList, 1, wx.ALL | wx.EXPAND | wx.ALIGN_RIGHT, 5)
return self.hostsBox, self.hostsBoxSizer
#################
# REACTIONS
#################
def AddFinishedSimulation(self, simulator):
""" """
version = simulator.context.version
self.versionsList.Append(version.name)
self.versionSimulator[version.name] = simulator
def OnVersionChanged(self, event):
""" """
versionName = self.versionsList.GetValue()
simulator = self.versionSimulator[versionName]
self._PopulateComboWithHostsNames(simulator)
self.SetVersionsResultsVisibility(True)
def RemoveAllSimulations(self):
""" """
self.versionsList.Clear()
self.versionsList.SetValue("")
self.versionSimulator = {}
self.hostsList.Clear()
self.hostsList.SetValue("")
self.SetVersionsResultsVisibility(False)
def SetVersionsResultsVisibility(self, visible):
""" """
widgets = []
widgets.append(self.cashLabel)
widgets.append(self.cashBox)
widgets.append(self.cashInput)
widgets.append(self.hostsList)
widgets.append(self.hostsBox)
widgets.append(self.showFinanceResultsBtn)
widgets.append(self.chooseHostLbl)
for w in widgets:
if visible:
w.Show()
else:
w.Hide()
self.Layout()
class MainResultsNotebook(wx.Notebook):
def __init__(self, module, *args, **kwargs):
wx.Notebook.__init__(self, *args, **kwargs)
self.module = module
il = wx.ImageList(24, 24)
singleVersionImg = il.Add(wx.Bitmap(self.CreatePath4Resource('cash.png'), wx.BITMAP_TYPE_PNG))
self.AssignImageList(il)
self.oneVersionTab = SingleVersionPanel(self.module, self)
self.AddPage(self.oneVersionTab, "Single Version")
self.SetPageImage(0, singleVersionImg)
self.oneVersionTab.Layout()
def OnParsedModel(self):
""" """
self.oneVersionTab.RemoveAllSimulations()
def OnSimulationFinished(self, simulator):
""" """
self.oneVersionTab.AddFinishedSimulation(simulator)
def OnAllSimulationsFinished(self, simulators):
""" """
pass
def CreatePath4Resource(self, resourceName):
"""
@brief creates and returns the path to the
given file in the resource
('assets') dir
@return path to the resource
"""
# go two directories up from this module (aqopa/module/financialanalysis)
# using os.path instead of searching for '/' characters, which fails on Windows
aqopa_dir = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
return os.path.join(aqopa_dir, 'bin', 'assets', resourceName)
class ModuleGui(wx.EvtHandler):
def __init__(self, module):
""" """
wx.EvtHandler.__init__(self)
self.module = module
self.mainResultNotebook = None
def get_name(self):
return "Financial Analysis"
def get_configuration_panel(self, parent):
""" Returns WX panel with configuration controls. """
panel = wx.Panel(parent)
sizer = wx.BoxSizer(wx.VERTICAL)
text = wx.StaticText(panel, label="Module does not need to be configured.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
text = wx.StaticText(panel, label="All result options will be available after results are calculated.")
sizer.Add(text, 0, wx.ALL | wx.EXPAND, 5)
panel.SetSizer(sizer)
return panel
# panel = wx.Panel(parent)
# # create group boxes, aka static boxes
# costConfBox = wx.StaticBox(panel, label="Cost Per One Kilowatt-Hour")
# # create info label
# moduleInfoLabel = wx.StaticText(panel, label="To obtain meaningful results, you need to select The Time Analysis and The Energy Analysis Modules as well. "
# "Also, remember to give the price of one kilowatt-hour in US dollars.")
# cashInfoLabel = wx.StaticText(panel, label="Cost per kWh [$]")
# # create sizers = some kind of layout management
# sizer = wx.StaticBoxSizer(costConfBox, wx.VERTICAL)
# inputCashSizer = wx.BoxSizer(wx.HORIZONTAL)
# mainSizer = wx.BoxSizer(wx.VERTICAL)
# # create line edit field
# cashInputText = wx.TextCtrl(panel)
# inputCashSizer.Add(cashInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
# inputCashSizer.Add(cashInputText, 1, wx.ALL | wx.EXPAND, 5)
# # lay them all out
# sizer.Add(moduleInfoLabel, 0, wx.ALL | wx.EXPAND, 5)
# sizer.Add(wx.StaticText(panel), 1, wx.ALL | wx.EXPAND, 5)
# sizer.Add(inputCashSizer, 1, wx.ALL | wx.EXPAND, 5)
# mainSizer.Add(sizer, 0, wx.ALL | wx.EXPAND, 5)
# panel.SetSizer(mainSizer)
# return panel, cashInputText.GetValue()
def get_results_panel(self, parent):
"""
Create main result panel existing from the beginning
which will be extended when versions' simulations are finished.
"""
self.mainResultNotebook = MainResultsNotebook(self.module, parent)
return self.mainResultNotebook
def on_finished_simulation(self, simulator):
""" """
self.mainResultNotebook.OnSimulationFinished(simulator)
def on_finished_all_simulations(self, simulators):
"""
Called once for all simulations after all of them are finished.
"""
self.mainResultNotebook.OnAllSimulationsFinished(simulators)
def on_parsed_model(self):
""" """
self.mainResultNotebook.OnParsedModel()

# source: aqopa/module/financialanalysis/gui.py (AQoPA 0.9.5)
"""
@file __init__.py
@brief initial file for the financialanalysis module
@author Katarzyna Mazur
"""
from aqopa import module
from aqopa.simulator.state import HOOK_TYPE_SIMULATION_FINISHED
from .gui import ModuleGui
from .console import PrintResultsHook
class Module(module.Module):
def __init__(self, energyanalysis_module):
self.energyanalysis_module = energyanalysis_module
self.consumption_costs = {}
self.cost_per_kWh = 0
def get_cost_per_kWh(self):
return self.cost_per_kWh
def set_cost_per_kWh(self, cost_per_kWh):
self.cost_per_kWh = cost_per_kWh
def get_gui(self):
if not getattr(self, '__gui', None):
setattr(self, '__gui', ModuleGui(self))
return getattr(self, '__gui', None)
def _install(self, simulator):
"""
"""
return simulator
def install_console(self, simulator):
""" Install module for console simulation """
self._install(simulator)
hook = PrintResultsHook(self, simulator)
simulator.register_hook(HOOK_TYPE_SIMULATION_FINISHED, hook)
return simulator
def install_gui(self, simulator):
""" Install module for gui simulation """
self._install(simulator)
return simulator
def __convert_to_joules(self, millijoules):
return millijoules / 1000.0
def __convert_to_kWh(self, joules):
return joules / 3600000.0
def calculate_cost(self, consumed_joules, cost_per_kWh):
kWhs = self.__convert_to_kWh(consumed_joules)
cost = kWhs * cost_per_kWh
return cost
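The conversion chain above (millijoules to joules to kWh to dollars) can be checked with a standalone sketch; the only domain fact involved is that 1 kWh equals 3,600,000 J. Names here are local to the example:

```python
# 1 kWh = 3,600,000 J; cost = energy in kWh times the price per kWh,
# mirroring __convert_to_kWh and calculate_cost above.
def cost_of_joules(consumed_joules, cost_per_kWh):
    kwhs = consumed_joules / 3600000.0
    return kwhs * cost_per_kWh

# 7.2 MJ is exactly 2 kWh
print(cost_of_joules(7200000.0, 0.15))
```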
def calculate_cost_for_host(self, simulator, host, cost_per_kWh):
all_consumptions = self.get_all_hosts_consumption(simulator, simulator.context.hosts)
joules = all_consumptions[host]['energy']
cost_for_host = self.calculate_cost(joules, cost_per_kWh)
return cost_for_host
def calculate_all_costs(self, simulator, hosts, cost_per_kWh):
all_costs = {}
for host in hosts:
all_costs[host] = self.calculate_cost_for_host(simulator, host, cost_per_kWh)
self.add_cost(simulator, host, all_costs[host])
return all_costs
def add_cost(self, simulator, host, cost):
"""
@brief adds cost of power consumption to
the list of cost consumptions for the
particular host present in the
QoP-ML's model
"""
# add a new simulator if not available yet
if simulator not in self.consumption_costs:
self.consumption_costs[simulator] = {}
# add a new host if not available yet
if host not in self.consumption_costs[simulator]:
self.consumption_costs[simulator][host] = []
# add cost for the host - but only if we
# have not added it yet and if it is not 'empty'
if cost not in self.consumption_costs[simulator][host] and cost:
self.consumption_costs[simulator][host].append(cost)
def get_min_cost(self, simulator, hosts):
"""Returns the minimal host cost in the version and the host it belongs to."""
host = hosts[0]
costs = self.consumption_costs[simulator].get(host, [])
if not costs:
return 0, host
min_cost = costs[0]
for h in hosts:
h_costs = self.consumption_costs[simulator].get(h, [])
if h_costs and h_costs[0] < min_cost:
min_cost = h_costs[0]
host = h
return min_cost, host
def get_max_cost(self, simulator, hosts):
"""Returns the maximal host cost in the version and the host it belongs to."""
host = hosts[0]
costs = self.consumption_costs[simulator].get(host, [])
if not costs:
return 0, host
max_cost = costs[0]
for h in hosts:
h_costs = self.consumption_costs[simulator].get(h, [])
if h_costs and h_costs[0] > max_cost:
max_cost = h_costs[0]
host = h
return max_cost, host
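The min/max lookups above scan the per-host cost lists kept in `consumption_costs`, comparing the first recorded cost of each host. A simplified standalone sketch of the same selection over stand-in data (a plain dict, not the module's API):

```python
# Stand-in data: per-host cost lists, as kept in consumption_costs.
costs = {'hostA': [0.004], 'hostB': [0.001], 'hostC': [0.009]}

def min_cost(costs):
    # pick the host whose first recorded cost is the smallest
    best_cost, best_host = None, None
    for host, host_costs in costs.items():
        if host_costs and (best_cost is None or host_costs[0] < best_cost):
            best_cost, best_host = host_costs[0], host
    return best_cost, best_host

print(min_cost(costs))  # hostB has the cheapest recorded cost
```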
def get_avg_cost(self, simulator, hosts):
cost_sum = 0.0
i = 0
for host in hosts:
for cost in self.consumption_costs[simulator][host]:
cost_sum += cost
i += 1
if i != 0:
return cost_sum / i
else:
return 0
def get_total_cost(self, simulator, hosts):
cost_sum = 0.0
for host in hosts:
for cost in self.consumption_costs[simulator][host]:
cost_sum += cost
return cost_sum
def get_all_costs(self, simulator):
if simulator not in self.consumption_costs:
return []
return self.consumption_costs[simulator]
# def set_all_costs(self, consumption_costs):
# self.consumption_costs = copy.deepcopy(consumption_costs)
def get_all_hosts_consumption(self, simulator, hosts):
voltage = self.energyanalysis_module.get_voltage()
consumptions = self.energyanalysis_module.get_hosts_consumptions(simulator, hosts, voltage)
return consumptions

# source: aqopa/module/financialanalysis/__init__.py (AQoPA 0.9.5)
from aqopa.simulator.state import InstructionsContext,\
InstructionsList, Process
class Scheduler():
def __init__(self, host):
raise NotImplementedError()
def finished(self):
"""
Returns True if host is finished.
"""
raise NotImplementedError()
def get_current_context(self):
"""
Returns instructions context executed in current state.
"""
raise NotImplementedError()
def get_all_contexts(self):
"""
Return all contexts in this scheduler.
"""
raise NotImplementedError()
def get_instructions_context_of_instruction(self, instruction):
"""
Returns instruction context with the instruction from parameter
as the current instruction
"""
raise NotImplementedError()
def goto_next_instruction_context(self):
"""
Move hosts state to next instructions list.
"""
raise NotImplementedError()
def get_contexts_number(self):
"""
Returns the number of contexts in scheduler.
"""
raise NotImplementedError()
def get_unfinished_contexts_number(self):
"""
Returns the number of contexts in scheduler that are not yet finished.
"""
raise NotImplementedError()
def is_current_context_first(self):
"""
Returns True if the current context is the first unfinished.
"""
raise NotImplementedError()
class FifoScheduler(Scheduler):
def __init__(self, host):
""" """
self.host = host
self.context = self._build_context()
def _build_context(self):
""" """
context = InstructionsContext(self.host)
context.add_instructions_list(self.host.instructions_list)
return context
def finished(self):
""" """
return self.context.finished()
def get_current_context(self):
""" """
return self.context
def get_all_contexts(self):
""" """
return [self.context]
def get_instructions_context_of_instruction(self, instruction):
"""
Returns instruction context with the instruction from parameter
as the current instruction
"""
if self.context.get_current_instruction() == instruction:
return self.context
return None
def goto_next_instruction_context(self):
""" """
# Fifo has only one context
pass
def get_contexts_number(self):
"""
Returns the number of contexts in scheduler.
"""
return 1
def get_unfinished_contexts_number(self):
"""
Returns the number of contexts in scheduler that are not yet finished.
"""
return 1 if not self.context.finished() else 0
def is_current_context_first(self):
"""
Returns True if the current context is the first unfinished.
"""
return True
class RoundRobinScheduler(Scheduler):
def __init__(self, host):
""" """
self.host = host
self.contexts = []
self._current_context_index = 0
self._build_contexts()
def _build_contexts(self):
""" """
host_instructions_context = InstructionsContext(self.host)
host_instructions_list = []
host_context_added = False
for i in self.host.instructions_list:
if isinstance(i, Process):
process = i
if len(process.instructions_list) > 0:
process_context = InstructionsContext(self.host)
self.contexts.append(process_context)
process_context.add_instructions_list(
process.instructions_list,
process)
else:
host_instructions_list.append(i)
if not host_context_added:
self.contexts.append(host_instructions_context)
host_context_added = True
if len(host_instructions_list) > 0:
host_instructions_context.add_instructions_list(InstructionsList(host_instructions_list))
def finished(self):
""" """
for c in self.contexts:
if not c.finished():
return False
return True
def get_current_context(self):
""" """
return self.contexts[self._current_context_index]
def get_all_contexts(self):
""" """
return self.contexts
def get_instructions_context_of_instruction(self, instruction):
"""
Returns instruction context with the instruction from parameter
as the current instruction
"""
for icontext in self.contexts:
if not icontext.finished() and icontext.get_current_instruction() == instruction:
return icontext
return None
def goto_next_instruction_context(self):
""" """
now_index = self._current_context_index
self._current_context_index = (self._current_context_index + 1) % len(self.contexts)
while now_index != self._current_context_index and self.get_current_context().finished():
self._current_context_index = (self._current_context_index + 1) % len(self.contexts)
def get_contexts_number(self):
"""
Returns the number of contexts in scheduler.
"""
return len(self.contexts)
def get_unfinished_contexts_number(self):
"""
Returns the number of contexts in scheduler that are not yet finished.
"""
nb = 0
for c in self.contexts:
if not c.finished():
nb += 1
return nb
def is_current_context_first(self):
"""
Returns True if the current context is the first unfinished.
"""
indexes = []
i = 0
for c in self.contexts:
if not c.finished():
indexes.append(i)
i += 1
return len(indexes) > 0 and self._current_context_index == min(indexes)
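The `goto_next_instruction_context` method above advances modulo the number of contexts, skipping finished ones and stopping after one full cycle. The same advance logic in isolation, with finished states modeled as booleans for the sketch:

```python
# finished_flags[i] is True when context i is finished; returns the index
# of the next unfinished context, or the start index after one full cycle.
def next_context_index(current, finished_flags):
    n = len(finished_flags)
    start = current
    current = (current + 1) % n
    while current != start and finished_flags[current]:
        current = (current + 1) % n
    return current

print(next_context_index(0, [False, True, False]))  # skips finished context 1
```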
SCHEDULE_ALGORITHM_ROUND_ROBIN = 'rr'
SCHEDULE_ALGORITHM_FIFO = 'fifo'
def create(host, algorithm):
if algorithm == SCHEDULE_ALGORITHM_FIFO:
return FifoScheduler(host)
elif algorithm == SCHEDULE_ALGORITHM_ROUND_ROBIN:
return RoundRobinScheduler(host)
raise NotImplementedError("Unknown scheduling algorithm: %s" % algorithm)

# source: aqopa/simulator/scheduler.py (AQoPA 0.9.5)
from aqopa.simulator.error import RuntimeException
from aqopa.model import TupleExpression, CallFunctionExpression, CallFunctionInstruction, IdentifierExpression, TupleElementExpression
class HostMetrics():
"""
Whole metrics configuration for particular hosts.
Configuration includes the name, the device configuration params,
three types of metrics: normal, plus and star.
"""
def __init__(self, name):
self.name = name
self.configurations = []
self.normal_blocks = []
self.plus_blocks = []
self.star_blocks = []
self.connected_hosts = []
def is_connected_with_host(self, host):
""" """
return host in self.connected_hosts
class Block():
"""
Block of metrics containing all metrics with the same params
and service params definition.
"""
def __init__(self, params, service_params):
self.params = params
self.service_params = service_params
self.metrics = []
def add_metric(self, metric):
""" """
self.metrics.append(metric)
metric.block = self
def find_primitives(self, function_name, qop_arguments):
""" """
if len(qop_arguments) < len(self.params):
return []
found_metrics = []
for m in self.metrics:
if m.function_name != function_name:
continue
params_ok = True
for i in range(0, len(m.arguments)):
if qop_arguments[i] != m.arguments[i]:
params_ok = False
if not params_ok:
continue
found_metrics.append(m)
return found_metrics
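`Block.find_primitives` above matches a metric row when the function name agrees and every declared argument equals the corresponding qop argument of the call. A condensed sketch of that matching rule, using simplified stand-in classes rather than the real model objects:

```python
class MetricRow(object):
    def __init__(self, function_name, arguments):
        self.function_name = function_name
        self.arguments = arguments

def find_primitives(metrics, function_name, qop_arguments):
    found = []
    for m in metrics:
        if m.function_name != function_name:
            continue
        # every declared argument must equal the call's qop argument
        if all(qop_arguments[i] == m.arguments[i] for i in range(len(m.arguments))):
            found.append(m)
    return found

rows = [MetricRow('enc', ['aes', '128']), MetricRow('enc', ['aes', '256'])]
matches = find_primitives(rows, 'enc', ['aes', '256', 'cbc'])
print([m.arguments for m in matches])
```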
class Metric():
"""
One metrics row with defined: call arguments and service arguments.
Call arguments are used for metric lookup and service arguments are used
to define new state or by modules to calculate some informations.
"""
def __init__(self, function_name, arguments, service_arguments):
self.function_name = function_name
self.arguments = arguments
self.service_arguments = service_arguments
self.block = None
class Manager():
"""
Metrics manager.
Class used for operations on metrics: searching, etc.
"""
def __init__(self, host_metrics):
self.host_metrics = host_metrics
def find_primitive(self, host, call_function_expression):
"""
Method finds metric for function call in host.
Method searches the metric in three kinds: normal, plus and star.
If more than one metrics is found, runtime exception is raised.
"""
found_host_metric = None
for host_metric in self.host_metrics:
if host_metric.is_connected_with_host(host):
found_host_metric = host_metric
break
if not found_host_metric:
return None
normal_metrics = []
plus_metrics = []
star_metrics = []
# all three kinds of metrics (normal, plus, star) are searched the same way
for blocks, found_metrics in ((found_host_metric.normal_blocks, normal_metrics),
(found_host_metric.plus_blocks, plus_metrics),
(found_host_metric.star_blocks, star_metrics)):
for mb in blocks:
if len(mb.params) <= len(call_function_expression.qop_arguments):
found_metrics.extend(mb.find_primitives(
call_function_expression.function_name,
call_function_expression.qop_arguments))
if len(normal_metrics) + len(plus_metrics) + len(star_metrics) > 1:
raise RuntimeException("Found many metrics for function '%s' with qopanalysis arguments: %s." \
% (call_function_expression.function_name,
', '.join(call_function_expression.qop_arguments)))
if len(normal_metrics) > 0:
return normal_metrics[0]
if len(plus_metrics) > 0:
return plus_metrics[0]
if len(star_metrics) > 0:
return star_metrics[0]
return None
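`get_expression_size` below interprets "ratio" metric values of the form `<1-based argument index>:<ratio>`; for example `1:0.5` means half the size of the call's first argument. The parsing step in isolation, with argument sizes passed as a plain list for the sketch:

```python
def ratio_size(metric_value, argument_sizes):
    # "<1-based argument index>:<ratio>", e.g. "1:0.5" = 50% of argument 1
    mparts = metric_value.split(':')
    element_index = int(mparts[0]) - 1
    percent = float(mparts[1])
    return argument_sizes[element_index] * percent

print(ratio_size('1:0.5', [1024.0]))  # compression to half of the input
```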
def get_expression_size(self, expression, context, host):
"""
Returns the size in bytes of expression according to metrics.
Metrics can specify an exact size (e.g. in bytes, bits, kilobytes, kilobits, megabytes, megabits)
or a ratio size (e.g. 0.5 equals 50%) used e.g. in compression.
The expression cannot contain variables - it must be filled with the variables' values.
"""
size = 0
if isinstance(expression, IdentifierExpression):
            # If the expression is a variable, get the size of its value
return self.get_expression_size(host.get_variable(expression.identifier), context, host)
if isinstance(expression, TupleElementExpression):
variable = host.get_variable(expression.variable_name)
variable = context.expression_reducer.reduce(variable)
if not isinstance(variable, TupleExpression):
raise RuntimeException('Cannot get tuple element on expression: %s.' % unicode(variable))
return self.get_expression_size(variable.elements[expression.index], context, host)
elif isinstance(expression, TupleExpression):
# If expression is tuple, just sum its elements' sizes
for expr in expression.elements:
size += self.get_expression_size(expr, context, host)
return size
if isinstance(expression, CallFunctionExpression) or isinstance(expression, CallFunctionInstruction):
expression = context.expression_reducer.reduce(expression)
metric = self.find_primitive(host, expression)
if not metric:
raise RuntimeException("Cannot get expression size: No metric found for expression '%s'."
% unicode(expression))
block = metric.block
for i in range(0, len(block.service_params)):
sparam = block.service_params[i]
if sparam.service_name.lower() != "size":
continue
metric_type = sparam.param_name.lower()
metric_unit = sparam.unit
metric_value = metric.service_arguments[i]
if metric_type == "ratio":
mparts = metric_value.split(':')
element_index = int(mparts[0])-1
percent = float(mparts[1])
size = self.get_expression_size(expression.arguments[element_index], context, host) \
* percent
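                    # Illustration (hypothetical metric values, not taken from any model):
                    # a "ratio" value of "2:0.5" means the result size is 50% of the
                    # size of the 2nd argument, e.g. a 100 B argument gives 50.0 B.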
elif metric_type == "sum_ratio":
ratios = metric_value.split(',')
size = 0.0
for ratio in ratios:
rparts = ratio.strip().split(':')
element_index = int(rparts[0])-1
percent = float(rparts[1])
size += self.get_expression_size(expression.arguments[element_index], context, host) \
* percent
elif metric_type == "exact":
if metric_unit == 'B':
size = float(metric_value)
elif metric_unit == 'b':
size = float(metric_value)/8.0
else:
raise RuntimeException('Cannot get expression size: Unsupported size value for exact type.')
elif metric_type == "block":
mparts = metric_value.split(':')
element_index = int(mparts[0])-1
unit_value = int(mparts[1])
factor = 1.0
if metric_unit == 'b':
factor = 1.0 / 8.0
argument_size = self.get_expression_size(expression.arguments[element_index], context, host)
argument_size_excess = argument_size % unit_value
size = argument_size
if argument_size_excess > 0:
size += unit_value - argument_size_excess
size *= factor
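                    # Illustration (hypothetical metric values, not taken from any model):
                    # a "block" value of "1:16" pads the 1st argument's size up to a
                    # multiple of 16; e.g. argument_size = 100 -> excess = 100 % 16 = 4,
                    # so size = 100 + (16 - 4) = 112 (then times 1/8 if the unit is bits).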
elif metric_type == "nested":
mparts = metric_value.split(':')
element_index = int(mparts[0])-1
nested_element_index = int(mparts[1])-1
nested_expression = expression.arguments[element_index]
if isinstance(nested_expression, IdentifierExpression):
nested_expression = host.get_variable(nested_expression.identifier)
if not isinstance(nested_expression, CallFunctionExpression) and \
not isinstance(nested_expression, CallFunctionInstruction):
raise RuntimeException('Cannot get nested expression size: Not a function call.')
size = self.get_expression_size(nested_expression.arguments[nested_element_index], context, host)
else:
raise RuntimeException('Cannot get expression size: Unsupported size type.')
return size
raise RuntimeException('Cannot get expression size: Unsupported expression type. Expression: {0}.'
                              .format(unicode(expression)))

# ---- end of aqopa/simulator/metrics.py ----
import copy
from aqopa.model import IdentifierExpression, CallFunctionExpression, TupleExpression,\
BooleanExpression, ComparisonExpression, TupleElementExpression
from aqopa.simulator.error import RuntimeException
class Populator():
def __init__(self, reducer):
self.reducer = reducer
self.predefined_functions_manager = None
def is_function_predefined(self, fun_name):
return self.predefined_functions_manager is not None \
and self.predefined_functions_manager.is_function_predefined(fun_name)
def populate(self, expression, host, main_expression=None):
"""
        Returns a new expression with variable names replaced
        by copies of the values of those variables taken from the host.
        The reducer is used for special types of variables,
        e.g. tuple element expressions.
"""
if main_expression is None:
main_expression = expression
variables = host.get_variables()
if isinstance(expression, IdentifierExpression):
if expression.identifier not in variables:
raise RuntimeException("Variable {0} undefined in expression '{1}'."
.format(expression.identifier, unicode(main_expression)))
return variables[expression.identifier].clone()
if isinstance(expression, CallFunctionExpression):
if self.is_function_predefined(expression.function_name):
populated = self.predefined_functions_manager.populate_call_function_expression_result(expression,
host, self)
return populated.clone()
arguments = []
for arg in expression.arguments:
arguments.append(self.populate(arg, host, main_expression=main_expression))
qop_arguments = []
for qop_arg in expression.qop_arguments:
qop_arguments.append(qop_arg)
return CallFunctionExpression(expression.function_name, arguments, qop_arguments)
if isinstance(expression, TupleExpression):
elements = []
for e in expression.elements:
elements.append(self.populate(e, host, main_expression=main_expression))
return TupleExpression(elements)
if isinstance(expression, TupleElementExpression):
if expression.variable_name not in variables:
raise RuntimeException("Variable {0} does not exist in host {1}. Trying expression '{2}'.".format(
expression.variable_name, host.name, unicode(main_expression)))
expr = variables[expression.variable_name]
if not isinstance(expr, TupleExpression):
expr = self.reducer.reduce(expr)
if not isinstance(expr, TupleExpression):
raise RuntimeException(
"Cannot compute expression '{0}' in host {1}. Variable {2} is not a tuple. It is: {3}."
.format(unicode(main_expression), host.name, expression.variable_name, unicode(expr)))
if len(expr.elements) <= expression.index:
raise RuntimeException(
"Cannot compute expression '{0}' in host {1}. "
"Variable {2} does not have index {3}. It has {4} elements: {5}."
.format(unicode(main_expression), host.name, expression.variable_name, expression.index,
len(expr.elements), unicode(expr)))
return self.populate(expr.elements[expression.index], host, main_expression=main_expression)
return expression.clone()
class Checker():
"""
Expression checker.
Class used to check the result of expressions.
"""
def __init__(self, populator):
self.populator = populator
def _are_equal(self, left, right):
"""
Method checks if both expressions are the same.
        The expressions cannot contain identifiers - all identifiers should have been replaced with their values.
Checks whether all function calls and boolean expressions have the same values,
function names and parameters' values.
"""
if left.__class__ != right.__class__:
return False
if isinstance(left, BooleanExpression):
return left.val == right.val
if isinstance(left, CallFunctionExpression):
if self.populator.is_function_predefined(left.function_name):
return self.populator.predefined_functions_manager\
.are_equal_call_function_expressions(left, right)
if left.function_name != right.function_name:
return False
if len(left.arguments) != len(right.arguments):
return False
for i in range(0, len(left.arguments)):
if not self._are_equal(left.arguments[i], right.arguments[i]):
return False
return True
def result(self, condition, host):
"""
        Method checks the result of the condition.
        Returns True if the condition is true or can be reduced to a true condition.
        At the moment only comparison conditions are available.
"""
if isinstance(condition, BooleanExpression):
return condition.is_true()
if isinstance(condition, ComparisonExpression):
left = condition.left
right = condition.right
left = self.populator.populate(left, host)
right = self.populator.populate(right, host)
            # Additional populate pass to resolve predefined functions
left = self.populator.populate(left, host)
right = self.populator.populate(right, host)
left = self.populator.reducer.reduce(left)
right = self.populator.reducer.reduce(right)
if condition.is_equal_type():
result = self._are_equal(left, right)
else:
result = not self._are_equal(left, right)
del left
del right
return result
return False
class ReductionPoint():
"""
Class representing point where expression can be reduced.
"""
def __init__(self, equation, expression, replacement, modified_part=None, modified_element_info=None):
self.equation = equation # Equation that is used to reduce
self.expression = expression # Expression that will be reduced
# Reduction can replace whole expression
# or the part of expression
self.replacement = replacement # Expression that will replace reduced part
self.replaced = None # Part of expression that was replaced with replacement
self.modified_part = modified_part # Part of expression that will be modified
# If None, whole expression should be replaced
self.modified_element_info = modified_element_info # Information which element
# of modified part should be replaced
def equals_to(self, reduction_point):
"""
        Returns True if self and reduction_point try to reduce the expression in the same place.
"""
return self.expression == reduction_point.expression \
and self.modified_part == reduction_point.modified_part \
and self.modified_element_info == reduction_point.modified_element_info
def _get_replaced_part(self):
"""
Returns the part of expression that will be replaced.
"""
if isinstance(self.modified_part, CallFunctionExpression):
return self.modified_part.arguments[self.modified_element_info]
def _replace_part(self, replacement):
"""
Replace the part of modified_part according to modified_element_info.
"""
if isinstance(self.modified_part, CallFunctionExpression):
self.modified_part.arguments[self.modified_element_info] = replacement
def reduce(self):
"""
        Returns the reduced expression.
        Method saves information needed for rolling back the reduction.
"""
if self.modified_part is None:
return self.replacement
self.replaced = self._get_replaced_part()
self._replace_part(self.replacement)
return self.expression
def rollback(self):
"""
Rolls back the reduction.
Returns expression to the form before reduction.
"""
if self.modified_part is None:
return self.expression
self._replace_part(self.replaced)
self.replaced = None
return self.expression
class Reducer():
"""
Expression reducer.
    Class used to reduce complex expressions using equations.
"""
def __init__(self, equations):
self.equations = equations
self.predefined_functions_manager = None
def _get_reduction_points_for_equation(self, equation, whole_expression, current_expression,
parent_expression=None, modified_element_info=None):
"""
Recursive strategy of finding points.
"""
points = []
variables = {}
if equation.can_reduce(current_expression, equation.composite, variables, self.predefined_functions_manager):
if isinstance(equation.simple, IdentifierExpression):
simpler_value = variables[equation.simple.identifier]
else:
simpler_value = equation.simple
points.append(ReductionPoint(equation,
expression=whole_expression,
replacement=simpler_value,
modified_part=parent_expression,
modified_element_info=modified_element_info))
if isinstance(current_expression, CallFunctionExpression):
for i in range(0, len(current_expression.arguments)):
e = current_expression.arguments[i]
for p in self._get_reduction_points_for_equation(equation,
whole_expression,
current_expression=e,
parent_expression=current_expression,
modified_element_info=i):
points.append(p)
return points
def get_reduction_points_for_equation(self, equation, expression):
"""
Method returns list of points where equation can reduce expression.
Example:
equation: f(x) = x
expression: f(a(f(b())))
^ ^
Two reduction points selected above,
"""
return self._get_reduction_points_for_equation(equation=equation,
whole_expression=expression,
current_expression=expression,
parent_expression=None,
modified_element_info=None)
def _get_reduction_points(self, expression):
"""
Method finds points in expression where it can be reduced.
"""
points = []
for eq in self.equations:
for p in self.get_reduction_points_for_equation(eq, expression):
points.append(p)
return points
def reduce(self, expression):
"""
        Reduces the expression using the equations.
Returns reduced expression.
Raises exception if ambiguity is found.
"""
continue_reducing = True
"""
# Wrap expression and user wrapper variable
# Used to simulate real pass-by-reference
# If expression was passed without wrapped variable and equation wanted to
# replace whole exception, it would not work, because whole variable cannot be changed
# For example:
# eq enc(K, dec(K, X)) = X
# expression: enc(k(), dec(k(), b()) should be reduced to b()
# If we pass whole expression variable to reduce we cannot change it whole
# def f(x,v) -> x = v
# e = A(a=1)
# f(e, A(a=2)) - will not work, because e is reference to instance, but passed by value
# When we pass wrapper, we can change its element (ie. expression)
"""
# Reduce until no reduction can be performed.
# One reduction can give way for another reduction.
while continue_reducing:
continue_reducing = False
# For each equation we find points where expression can be reduced
reduction_points = self._get_reduction_points(expression)
if len(reduction_points) > 0:
# For each point:
# - temporary reduce at this point
# - remove used point from reduction points list
# - generate new reduction points list for reduced expression
# - if any of points from old list is not present in new list raise ambiguity exception
# ! New reduction points may come
for reduction_point in reduction_points:
# Generate list with reduction points before reduction
old_reduction_points = copy.copy(reduction_points)
old_reduction_points.remove(reduction_point)
# Reduce temporary
expression = reduction_point.reduce()
# Generate new reduction points
new_reduction_points = self._get_reduction_points(expression)
for old_reduction_point in old_reduction_points:
found = False
for new_reduction_point in new_reduction_points:
if old_reduction_point.equals_to(new_reduction_point):
found = True
break
if not found:
                            raise RuntimeException("Equations '%s' and '%s' are ambiguous for expression: %s." %
(unicode(old_reduction_point.equation),
unicode(reduction_point.equation),
unicode(expression)))
# Cancel reduction
expression = reduction_point.rollback()
# Ambiguity error checked
# Make factual reduction
reduction_point = reduction_points[0]
# Reduce and commit reduction
expression = reduction_point.reduce()
continue_reducing = True
        return expression

# ---- end of aqopa/simulator/expression.py ----
from aqopa.model import CallFunctionExpression, IdentifierExpression,\
BooleanExpression
from aqopa.simulator.error import EnvironmentDefinitionException
class Equation():
"""
Equation built for simulation from parsed equation.
"""
def __init__(self, composite, simple):
self.composite = composite
self.simple = simple
def __unicode__(self):
""" """
return u"eq %s = %s" % (unicode(self.composite), unicode(self.simple))
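    # Illustration (example equation, mirroring the one used in the reducer's
    # comments): for "eq enc(K, dec(K, X)) = X" the composite part is
    # enc(K, dec(K, X)) and the simple part is X.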
def _are_equal(self, left_expression, right_expression, predefined_functions_manager=None):
"""
        Method returns True when the expressions are equal.
        Equality covers called function names, logic values and identifier names.
"""
if left_expression.__class__ != right_expression.__class__:
return False
if isinstance(left_expression, BooleanExpression):
return left_expression.val == right_expression.val
if isinstance(left_expression, IdentifierExpression):
return left_expression.identifier == right_expression.identifier
if isinstance(left_expression, CallFunctionExpression):
if predefined_functions_manager:
if predefined_functions_manager.is_function_predefined(left_expression.function_name):
return predefined_functions_manager.are_equal_call_function_expressions(left_expression,
right_expression)
if left_expression.function_name != right_expression.function_name:
return False
if len(left_expression.arguments) != len(right_expression.arguments):
return False
for i in range(0, len(left_expression.arguments)):
if not self._are_equal(left_expression.arguments[i], right_expression.arguments[i],
predefined_functions_manager):
return False
return True
def can_reduce(self, reduced_expression, composite_expression, variables, predefined_functions_manager=None):
"""
        Method returns True if reduced_expression can be reduced using composite_expression.
        Recursive strategy.
        composite_expression is a parameter, because it changes while the reduction check descends.
        Otherwise it would be possible to retrieve it from the composite field of self.
        In the variables parameter the method saves the values of identifiers from composite_expression.
        For each identifier from composite_expression the method saves the subexpression of
        reduced_expression that stands in the same place.
"""
if isinstance(composite_expression, IdentifierExpression):
if composite_expression.identifier not in variables:
variables[composite_expression.identifier] = reduced_expression
return True
else:
current_val = variables[composite_expression.identifier]
if self._are_equal(current_val, reduced_expression, predefined_functions_manager):
return True
if isinstance(composite_expression, BooleanExpression):
if not isinstance(reduced_expression, BooleanExpression):
return False
return self._are_equal(composite_expression, reduced_expression, predefined_functions_manager)
if isinstance(composite_expression, CallFunctionExpression):
            # Predefined functions cannot be used in equations.
if not isinstance(reduced_expression, CallFunctionExpression):
return False
if composite_expression.function_name != reduced_expression.function_name:
return False
if len(composite_expression.arguments) != len(reduced_expression.arguments):
return False
for i in range(0, len(composite_expression.arguments)):
if not self.can_reduce(reduced_expression.arguments[i], composite_expression.arguments[i],
variables, predefined_functions_manager):
return False
return True
return False
class Validator():
def _find_function(self, functions, name):
""""""
for f in functions:
if f.name == name:
return f
return None
def _validate_function_names(self, expression, functions):
"""
        Method checks if all functions exist and are called with the correct number of parameters.
Returns True or raises EnvironmentDefinitionException.
"""
if isinstance(expression, CallFunctionExpression):
function = self._find_function(functions, expression.function_name)
if not function:
raise EnvironmentDefinitionException('Function %s does not exist.' % expression.function_name)
if len(function.params) != len(expression.arguments):
raise EnvironmentDefinitionException(
'Function %s called with wrong number of arguments - expected: %d, got: %d.'
% (expression.function_name, len(function.params), len(expression.arguments)))
for arg in expression.arguments:
if isinstance(arg, CallFunctionExpression):
self._validate_function_names(arg, functions)
return True
def _are_expressions_the_same(self, left, right, check_variables=False, variables=None):
"""
        Method checks if the expressions are the same from the equations' point of view, which means that
        both expressions could be used to reduce another expression.
        The method can also check the names of variables from both expressions: it checks
        whether variable X from the left expression stands in exactly the same places (and no more)
        as the corresponding variable Y from the right expression.
Example:
f(x,y,x,x) == f(a,b,a,a) - are the same
f(x,y,x,x) == f(a,b,a,b) - are not the same, because second b should be a
"""
if variables is None:
variables = {}
if left.__class__ != right.__class__:
return False
if isinstance(left, IdentifierExpression):
if not check_variables:
return True
if left.identifier in variables:
return variables[left.identifier] == right.identifier
else:
variables[left.identifier] = right.identifier
return True
if isinstance(left, BooleanExpression):
return left.val == right.val
if isinstance(left, CallFunctionExpression):
if left.function_name != right.function_name:
return False
if len(left.arguments) != len(right.arguments):
return False
for i in range(0, len(left.arguments)):
if not self._are_expressions_the_same(left.arguments[i], right.arguments[i], check_variables, variables):
return False
return True
return False
def _expression_contains_identifier(self, expression, identifier):
"""
Returns True if expression contains an identifier expression
with name equal to second parameter.
"""
if isinstance(expression, IdentifierExpression):
return expression.identifier == identifier
if isinstance(expression, CallFunctionExpression):
for arg in expression.arguments:
if self._expression_contains_identifier(arg, identifier):
return True
return False
return False
def _validate_syntax(self, parsed_equations, functions):
"""
        Method checks the syntax of equations:
        - does the composite part include the identifier from the simple part (if the simple part is an identifier)?
        - do all functions exist and are they called with the correct number of parameters?
Returns True or raises EnvironmentDefinitionException.
"""
errors = []
for eq in parsed_equations:
try:
self._validate_function_names(eq.composite, functions)
except EnvironmentDefinitionException, e:
errors.append(e.args[0])
if isinstance(eq.simple, IdentifierExpression):
if not self._expression_contains_identifier(eq.composite, eq.simple.identifier):
errors.append("Equation '%s' does not have identifier from simple expression '%s' in composite expression."
% (unicode(eq), eq.simple.identifier))
if len(errors) > 0:
raise EnvironmentDefinitionException('Invalid syntax.', errors=errors)
def validate(self, parsed_equations, functions):
"""
Validates equations parsed from model according to functions.
Returns True if validation is passed or raises EnvironmentDefinitionException.
"""
# Validate syntax - function names and parametrs counts
self._validate_syntax(parsed_equations, functions)
errors = []
# Search for equations that can reduce themselves
# Check all possible pairs of equations
for i in range (0, len(parsed_equations)):
has_the_same_expression = False
for j in range(i+1, len(parsed_equations)):
variables_map = {}
eq_left = parsed_equations[i]
eq_right = parsed_equations[j]
if self._are_expressions_the_same(eq_left.composite, eq_right.composite, check_variables=True, variables=variables_map):
if variables_map[eq_left.simple.identifier] == eq_right.simple.identifier:
has_the_same_expression = True
break
if has_the_same_expression:
errors.append("Equations '%s' and '%s' are the same." % (unicode(eq_left), unicode(eq_right)))
# Todo: Check ambiguity
        return True

# ---- end of aqopa/simulator/equation.py ----
from aqopa.model import BooleanExpression, IdentifierExpression,\
CallFunctionExpression, TupleExpression, TupleElementExpression,\
AssignmentInstruction, CommunicationInstruction, FinishInstruction,\
ContinueInstruction, CallFunctionInstruction, IfInstruction,\
WhileInstruction, HostSubprocess, COMMUNICATION_TYPE_OUT,\
original_name, name_indexes, BreakInstruction
from aqopa.simulator.error import RuntimeException
class Context():
def __init__(self, version):
self.version = version
self.hosts = [] # List of all hosts in this context
self.functions = [] # List of all functions in this context
self.expression_populator = None
self.expression_checker = None
self.expression_reducer = None
self.metrics_manager = None
self.channels_manager = None
self.algorithms_resolver = None
self._current_host_index = 0
self._previous_host_index = -1
def get_current_host(self):
"""
Return host being executed at this step.
"""
return self.hosts[self._current_host_index]
def get_current_instruction(self):
"""
Return instruction being executed at this step.
"""
return self.get_current_host().get_current_instructions_context().get_current_instruction()
def all_hosts_finished(self):
"""
Returns True if all hosts are in FINISHED state
and no next states can be generated.
"""
for h in self.hosts:
if not h.finished():
return False
return True
def get_progress(self):
"""
        Returns a number between 0 and 1 representing the progress of
simulation.
"""
        total = 0
        ended = 0
        for h in self.hosts:
            if h.finished():
                ended += 1
            total += 1
        return float(ended)/float(total) if total > 0 else 0
def has_epoch_ended(self):
"""
        Returns True when all hosts have finished their epoch: each host has tried to execute
        an instruction in all of its instruction contexts.
        Going to the next state is done for one host per next-state generation step, so to finish
        one loop over all hosts' instruction contexts (say N of them), the simulator must perform N steps.
"""
for h in self.hosts:
if not h.finished():
if not h.has_epoch_ended():
return False
return True
def any_host_changed(self):
"""
        Returns True if any host has changed in the last next-state generation loop performed
for all hosts.
"""
for h in self.hosts:
if h.has_changed():
return True
return False
def mark_all_hosts_unchanged(self):
"""
Sets the state of all hosts to unchanged.
Used before each next state generation loop.
"""
for h in self.hosts:
h.mark_unchanged()
def goto_next_host(self):
"""
Context is moved to the next host so that next state generation step is performed for next host.
"""
self._previous_host_index = self._current_host_index
self._current_host_index = (self._current_host_index + 1) % len(self.hosts)
        # Go to the next host until the current host is not finished
while self._current_host_index != self._previous_host_index and self.get_current_host().finished():
self._current_host_index = (self._current_host_index + 1) % len(self.hosts)
# ----------- Hook
HOOK_TYPE_PRE_HOST_LIST_EXECUTION = 1
HOOK_TYPE_PRE_INSTRUCTION_EXECUTION = 2
HOOK_TYPE_POST_INSTRUCTION_EXECUTION = 3
HOOK_TYPE_SIMULATION_FINISHED = 4
class Hook():
"""
Hooks are executed in many places of simulation.
Hooks can be added by modules.
"""
def execute(self, context, **kwargs):
"""
Method changes the context.
        May return an execution result if used in an executors hook.
"""
raise NotImplementedError()
# ----------- Host Process
HOST_STATUS_RUNNING = 1
HOST_STATUS_FINISHED = 2
class Host():
"""
Simulation equivalent of host
"""
def __init__(self, name, instructions_list, predefined_variables=None):
self.name = name
self.instructions_list = instructions_list
self._variables = {}
if predefined_variables is not None:
self._variables = predefined_variables
self._scheduler = None
self._changed = False
self._status = HOST_STATUS_RUNNING
self._finish_error = None
        self._touches = 0  # How many times the host has been touched.
        # When touches is greater than the number of instruction contexts, the epoch ends.
def __unicode__(self):
return u"host %s" % unicode(self.name)
def original_name(self):
""""""
return original_name(self.name)
def add_name_index(self, index):
"""
Add index to the name.
Before: name = ch, index = 1. After: name = ch.1
"""
self.name += ".%d" % index
def get_name_index(self):
"""
Returns repetition index of this host
"""
elems = self.name.split('.')
if len(elems) < 2:
return 0
return int(elems[1])
def set_scheduler(self, scheduler):
"""Set scheduler"""
self._scheduler = scheduler
def set_variable(self, name, value):
"""Set hotst's variable"""
self._variables[name] = value
def get_variable(self, name):
""" Get host's variable """
if name not in self._variables:
raise RuntimeException("Variable '%s' undefined in host '%s'." % (name, self.name))
return self._variables[name]
def has_variable(self, name):
""" Returns True if variable is defined """
return name in self._variables
def get_variables(self):
""" Get variables dict """
return self._variables
def touch(self):
"""
Host is touched before the execution of its instruction.
        It is used to check whether all processes have been tried - hence whether the epoch has ended.
        Touching starts each time the first instruction context is executed.
"""
        # Touches equal to zero means that a new epoch has been started.
        # The counter is started when the first context is executed.
if self._touches == 0:
if self._scheduler.is_current_context_first():
self._touches += 1
else:
self._touches += 1
def has_epoch_ended(self):
"""
        Returns True when the host has tried to execute each instruction context in the current epoch.
"""
return self._touches > self._scheduler.get_unfinished_contexts_number()
def mark_changed(self):
""" Marks host changed. Means that host have changes in last state. """
self._changed = True
def mark_unchanged(self):
""" Marks host unchanged. Clears changes from last state. """
self._changed = False
self._touches = 0
def has_changed(self):
""" Returns True if host has changed """
return self._changed
def goto_next_instructions_context(self):
"""
Moves host to next instructions context.
"""
self._scheduler.goto_next_instruction_context()
if self._scheduler.finished() and not self.finished():
self.finish_successfuly()
def get_current_instructions_context(self):
"""
Returns the current instructions context retrieved from scheduler.
"""
return self._scheduler.get_current_context()
def get_all_instructions_contexts(self):
"""
        Returns all instruction contexts retrieved from the scheduler.
"""
return self._scheduler.get_all_contexts()
def get_instructions_context_of_instruction(self, instruction):
"""
Returns instruction context with the instruction from parameter
as the current instruction
"""
return self._scheduler.get_instructions_context_of_instruction(instruction)
def get_current_process(self):
""" """
return self.get_current_instructions_context().get_process_of_current_list()
def check_if_finished(self):
"""
Method checks if host has no more instructions to execute
and updates host's status.
"""
if self._scheduler.finished():
if self._status == HOST_STATUS_RUNNING:
self._status = HOST_STATUS_FINISHED
self._finish_error = None
def finished(self):
"""
Returns True when host is finished
"""
return self._status == HOST_STATUS_FINISHED
def finish_successfuly(self):
"""
Finish host execution without error
"""
self._status = HOST_STATUS_FINISHED
self._finish_error = None
def finish_failed(self, error):
"""
Finish host execution with error
"""
self._status = HOST_STATUS_FINISHED
self._finish_error = error
def get_finish_error(self):
""" """
return self._finish_error
class Process():
def __init__(self, name, instructions_list):
self.name = name
self.instructions_list = instructions_list
self.follower = None
def original_name(self):
""""""
return original_name(self.name)
def add_name_index(self, index):
"""
Add index to the name.
Before: name = ch, index = 1. After: name = ch.1
"""
self.name += ".%d" % index
def __unicode__(self):
return u"process %s" % unicode(self.name)
# ----------- Instructions Context
class InstructionsList:
def __init__(self, instructions_list, process=None):
self.process = process
self.instructions_list = instructions_list
self._current_instruction_index = 0
def get_current_instruction(self):
""" """
return self.instructions_list[self._current_instruction_index]
def goto_next_instruction(self):
""" """
self._current_instruction_index += 1
def finished(self):
"""
Returns True if list is finished.
"""
return self._current_instruction_index >= len(self.instructions_list)
class InstructionsContext:
def __init__(self, host):
self.host = host
self.stack = [] # Stack of instructions list
def _get_current_list(self):
"""
Returns currently executed list of instructions.
"""
if len(self.stack) == 0:
return None
return self.stack[len(self.stack)-1]
def get_current_instruction(self):
"""
Returns currently executed instruction.
"""
return self._get_current_list().get_current_instruction() if self._get_current_list() is not None else None
def get_process_of_current_list(self):
"""
Returns the process that current list is in.
"""
return self._get_current_list().process if self._get_current_list() is not None else None
def add_instructions_list(self, instructions_list, process=None):
"""
Add instructions list to the stack.
"""
l = InstructionsList(instructions_list, process)
self.stack.append(l)
def goto_next_instruction(self):
"""
Moves context to the next instruction.
"""
# Go to next instruction in current list
self._get_current_list().goto_next_instruction()
# Check if process has follower and if it should be started now
current_process = self.get_process_of_current_list()
if current_process is not None and current_process.follower is not None:
# We will move to next instruction only if the stack has been popped at least once and no follower was added
move_to_next_instruction = False
while len(self.stack) > 0 and self._get_current_list().finished():
self.stack.pop()
# Stack is popped so we set move flag to True, because in upper instructions list
# the pointer should be moved
move_to_next_instruction = True
new_current_process = self.get_process_of_current_list()
# Compare processes before and after stack pop
# If they are different it means that process has been finished
if new_current_process != current_process:
# Add follower's instructions to the context
instructions_list = current_process.follower.instructions_list
# If follower has at least one instruction
if len(instructions_list) > 0:
self.add_instructions_list(instructions_list, current_process.follower)
# When new instructions are added to the context it should not be moved to the next instruction
# because we would omit the first new instruction
move_to_next_instruction = False
break
if not self.finished() and move_to_next_instruction:
# If context is still not finished and new current list is not LOOP
# Go to next instruction in new current list
if not isinstance(self._get_current_list().get_current_instruction(), WhileInstruction):
self._get_current_list().goto_next_instruction()
# Now we handle the situation when current process has no follower
# While current list is finished but context is not finished
while not self.finished() and self._get_current_list().finished():
# Remove current list
self.stack.pop()
# If context is still not finished and new current list is not LOOP
# Go to next instruction in new current list
if not self.finished():
if not isinstance(self._get_current_list().get_current_instruction(), WhileInstruction):
self._get_current_list().goto_next_instruction()
# And repeat this process
# When moving to the next instruction in instructions context is finished
# update host's status - it may become finished
self.host.check_if_finished()
def finished(self):
"""
Returns True if context is finished.
"""
if len(self.stack) == 0:
return True
if len(self.stack) == 1 and self._get_current_list().finished():
return True
return False
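# Illustration (not part of AQoPA): a minimal, standalone sketch of the
# stack-of-instruction-lists idea behind InstructionsContext — entering a
# nested block pushes a new list, and finishing a list pops it and advances
# the pointer in the list below. Names here are hypothetical toy classes.

```python
class _Cursor:
    """A pointer into one list of instructions."""
    def __init__(self, items):
        self.items = items
        self.index = 0

    def current(self):
        return self.items[self.index]

    def advance(self):
        self.index += 1

    def finished(self):
        return self.index >= len(self.items)


class _Context:
    """A stack of cursors; the top cursor is the innermost block."""
    def __init__(self):
        self.stack = []

    def push(self, items):
        self.stack.append(_Cursor(items))

    def current(self):
        return self.stack[-1].current() if self.stack else None

    def step(self):
        # Advance in the innermost list, then pop every finished list,
        # moving the pointer in each enclosing list past the block entry.
        self.stack[-1].advance()
        while self.stack and self.stack[-1].finished():
            self.stack.pop()
            if self.stack:
                self.stack[-1].advance()


ctx = _Context()
ctx.push(['a', 'block', 'b'])
ctx.step()                      # now at 'block'
ctx.push(['x', 'y'])            # entering the block pushes its body
trace = []
while ctx.stack:
    trace.append(ctx.current())
    ctx.step()
```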
# ----------- Executor
class InstructionExecutor():
"""
Abstract base class for executors in the chain of responsibility.
"""
def can_execute_instruction(self, instruction):
"""
Returns True if instruction can be executed by executor.
"""
raise NotImplementedError()
def execute_instruction(self, context, **kwargs):
"""
Changes the context according to the execution of the current instruction.
The `kwargs` parameter keeps variables created by previous executors.
Returns an ExecutionResult describing the execution and the next steps:
whether the remaining executors should be skipped, whether the instruction
consumes CPU time, whether the executor manages the instruction index itself, etc.
"""
raise NotImplementedError()
class ExecutionResult():
"""
The result of execution of one instruction by one executor.
"""
def __init__(self, consumes_cpu=False,
custom_index_management=False,
finish_instruction_execution=False,
result_kwargs=None):
""" """
self.consumes_cpu = consumes_cpu
self.custom_index_management = custom_index_management
self.finish_instruction_execution = finish_instruction_execution
self.result_kwargs = result_kwargs
class HookExecutor(InstructionExecutor):
"""
Class executes hooks.
"""
def __init__(self):
self._hooks = [] # List of hooks
def add_hook(self, hook):
""" Adds hook to the list """
self._hooks.append(hook)
return self
def remove_hook(self, hook):
""" Removes hook from the list """
self._hooks.remove(hook)
return self
def execute_instruction(self, context, **kwargs):
""" Overriden """
consumes_cpu = False
custom_index_management = False
finish_instruction_execution = False
exec_kwargs = kwargs
for h in self._hooks:
result = h.execute(context, **exec_kwargs)
if result:
if result.consumes_cpu:
consumes_cpu = True
if result.custom_index_management:
custom_index_management = True
if result.finish_instruction_execution:
finish_instruction_execution = True
if result.result_kwargs is not None:
exec_kwargs.update(result.result_kwargs)
return ExecutionResult(consumes_cpu, custom_index_management,
finish_instruction_execution,
result_kwargs=exec_kwargs)
def can_execute_instruction(self, instruction):
""" Overriden """
return True
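# Illustrative sketch (hypothetical names, not AQoPA's API) of the result
# merging done by HookExecutor above: boolean flags from the hooks are OR-ed
# together, and keyword results from earlier hooks are visible to later ones.

```python
def run_hooks(hooks, **kwargs):
    """Run every hook, OR-ing flags and accumulating keyword results."""
    consumes_cpu = False
    merged = dict(kwargs)
    for hook in hooks:
        result = hook(**merged)          # each hook may return (flag, extras)
        if result:
            flag, extras = result
            consumes_cpu = consumes_cpu or flag
            if extras:
                merged.update(extras)    # later hooks see earlier results
    return consumes_cpu, merged


hooks = [
    lambda **kw: (False, {'seen': True}),
    lambda **kw: (kw.get('seen', False), None),  # reads what the first hook set
]
flag, merged = run_hooks(hooks)
```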
class PrintExecutor(InstructionExecutor):
"""
Executor that writes the current instruction to the stream.
"""
def __init__(self, f):
self.file = f # File to write instruction to
self.result = ExecutionResult()
def execute_instruction(self, context, **kwargs):
""" Overriden """
self.file.write("Host: %s \t" % context.get_current_host().name)
instruction = context.get_current_instruction()
simples = [AssignmentInstruction, CommunicationInstruction, FinishInstruction, ContinueInstruction]
for s in simples:
if isinstance(instruction, s):
self.file.write(unicode(instruction))
if isinstance(instruction, CallFunctionInstruction):
self.file.write(instruction.function_name + '(...)')
if isinstance(instruction, IfInstruction):
self.file.write('if (%s) ...' % unicode(instruction.condition))
if isinstance(instruction, WhileInstruction):
self.file.write('while (%s) ...' % unicode(instruction.condition))
if isinstance(instruction, Process):
self.file.write('process %s ...' % unicode(instruction.name))
if isinstance(instruction, HostSubprocess):
self.file.write('subprocess %s ...' % unicode(instruction.name))
self.file.write("\n")
return self.result
def can_execute_instruction(self, instruction):
""" Overriden """
return True
class AssignmentInstructionExecutor(InstructionExecutor):
"""
Executes assignment instructions.
"""
def _compute_current_expression(self, expression, context):
"""
Computes the expression from current assignment instruction.
"""
if isinstance(expression, BooleanExpression):
return expression.clone()
if isinstance(expression, IdentifierExpression) or isinstance(expression, CallFunctionExpression) \
or isinstance(expression, TupleExpression) or isinstance(expression, TupleElementExpression):
# Population of variables which were previously populated (i.e. additional attributes)
return context.expression_populator.populate(expression, context.get_current_host())
raise RuntimeException("Expression '%s' cannot be a value of variable.")
def execute_instruction(self, context, **kwargs):
""" Overriden """
instruction = context.get_current_instruction()
expression = self._compute_current_expression(instruction.expression, context)
expression = context.expression_reducer.reduce(expression)
context.get_current_host().set_variable(instruction.variable_name, expression)
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, AssignmentInstruction)
class CallFunctionInstructionExecutor(InstructionExecutor):
"""
Executes call function instructions.
Dummy executor just to show that call function instruction
consumes cpu and changes host.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, CallFunctionInstruction)
class ProcessInstructionExecutor(InstructionExecutor):
"""
Executes process instructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
process_instruction = context.get_current_instruction()
instructions_list = process_instruction.instructions_list
# If process has at least one instruction
if len(instructions_list) > 0:
context.get_current_host().get_current_instructions_context().add_instructions_list(instructions_list,
process_instruction)
else:
# Go to next instruction if process has no instructions
context.get_current_host().get_current_instructions_context().goto_next_instruction()
context.get_current_host().mark_changed()
return ExecutionResult(custom_index_management=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, Process)
class SubprocessInstructionExecutor(InstructionExecutor):
"""
Executes subprocess instructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
subprocess_instruction = context.get_current_instruction()
current_process = context.get_current_host().get_current_process()
instructions_list = subprocess_instruction.instructions_list
if len(instructions_list) > 0:
context.get_current_host().get_current_instructions_context().add_instructions_list(instructions_list, current_process)
else:
context.get_current_host().get_current_instructions_context().goto_next_instruction()
context.get_current_host().mark_changed()
return ExecutionResult(custom_index_management=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, HostSubprocess)
class CommunicationInstructionExecutor(InstructionExecutor):
"""
Executes communication in-out instructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
instruction = context.get_current_instruction()
channel = context.channels_manager.find_channel_for_current_instruction(context)
if context.channels_manager.find_channel(instruction.channel_name) is None:
raise RuntimeException("Channel {0} undefined.".format(instruction.channel_name))
if not channel:
context.get_current_host().get_current_instructions_context().goto_next_instruction()
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True,
custom_index_management=True)
if instruction.communication_type == COMMUNICATION_TYPE_OUT:
if kwargs is not None and 'sent_message' in kwargs:
message = kwargs['sent_message']
else:
message = context.channels_manager.build_message(
context.get_current_host(),
context.get_current_host().get_variable(instruction.variable_name).clone(),
context.expression_checker)
kwargs['sent_message'] = message
channel.send_message(context.get_current_host(), message, context.channels_manager.get_router())
# Go to next instruction
context.get_current_host().get_current_instructions_context().goto_next_instruction()
context.get_current_host().mark_changed()
else:
if kwargs is not None and 'messages_request' in kwargs:
request = kwargs['messages_request']
else:
request = context.channels_manager.build_message_request(context.get_current_host(), instruction,
context.expression_populator)
kwargs['messages_request'] = request
channel.wait_for_message(request)
return ExecutionResult(consumes_cpu=True,
custom_index_management=True,
result_kwargs=kwargs)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, CommunicationInstruction)
class FinishInstructionExecutor(InstructionExecutor):
"""
Executes finish (end, stop) instructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
instruction = context.get_current_instruction()
if instruction.command == "end":
for h in context.hosts:
h.finish_successfuly()
else:
msg = 'Executed stop instruction in host {0}'.format(context.get_current_host().name)
for h in context.hosts:
if h == context.get_current_host():
h.finish_failed('Executed stop instruction')
else:
h.finish_failed(msg)
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, FinishInstruction)
class ContinueInstructionExecutor(InstructionExecutor):
"""
Executes continue instructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
instructions_context = context.get_current_host().get_current_instructions_context()
instruction = instructions_context.get_current_instruction()
while instruction and not isinstance(instruction, WhileInstruction):
instructions_context.stack.pop()
instruction = instructions_context.get_current_instruction()
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True,
custom_index_management=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, ContinueInstruction)
class BreakInstructionExecutor(InstructionExecutor):
"""
Executes break instructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
instructions_context = context.get_current_host().get_current_instructions_context()
instruction = instructions_context.get_current_instruction()
while instruction and not isinstance(instruction, WhileInstruction):
instructions_context.stack.pop()
instruction = instructions_context.get_current_instruction()
instructions_context.goto_next_instruction()
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True,
custom_index_management=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, BreakInstruction)
class IfInstructionExecutor(InstructionExecutor):
"""
Executes if-clause insructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
instruction = context.get_current_instruction()
current_process = context.get_current_host().get_current_process()
condition_result = context.expression_checker.result(instruction.condition,
context.get_current_host())
if condition_result:
instructions_list = instruction.true_instructions
else:
instructions_list = instruction.false_instructions
if len(instructions_list) > 0:
context.get_current_host().get_current_instructions_context().add_instructions_list(
instructions_list,
current_process)
else:
context.get_current_host().get_current_instructions_context().goto_next_instruction()
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True,
custom_index_management=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, IfInstruction)
class WhileInstructionExecutor(InstructionExecutor):
"""
Executes while-clause instructions.
"""
def execute_instruction(self, context, **kwargs):
""" Overriden """
instruction = context.get_current_instruction()
current_process = context.get_current_host().get_current_process()
condition_result = context.expression_checker.result(instruction.condition,
context.get_current_host())
if condition_result:
instructions_list = instruction.instructions
if len(instructions_list) > 0:
context.get_current_host().get_current_instructions_context().add_instructions_list(
instructions_list,
current_process)
else:
context.get_current_host().get_current_instructions_context().goto_next_instruction()
context.get_current_host().mark_changed()
return ExecutionResult(consumes_cpu=True,
custom_index_management=True)
def can_execute_instruction(self, instruction):
""" Overriden """
return isinstance(instruction, WhileInstruction)
class Executor():
"""
Class that executes instructions to move the simulation to the next state.
"""
def __init__(self):
self.executors = [] # List of instruction executors
def prepend_instruction_executor(self, instruction_executor):
"""
Adds instruction executor at the beginning of the executors list.
"""
self.executors.insert(0, instruction_executor)
def append_instruction_executor(self, instruction_executor):
"""
Adds instruction executor to the end of the executors list.
"""
self.executors.append(instruction_executor)
def execute_instruction(self, context):
"""
Executes one instruction of context which is equal to going to the next state.
The executed instruction is taken from the current host of the context.
"""
execution_result = None
cpu_time_consumed = False
# Execute instructions in one instructions context until
# an instruction that consumes CPU time is executed.
# The method also checks whether the host was not stopped or ended meanwhile
# and whether the context is not finished
# (e.g. when only information about instructions is printed)
while not cpu_time_consumed and not context.get_current_host().finished() \
and not context.get_current_host().get_current_instructions_context().finished():
instr = context.get_current_instruction()
custom_instructions_index_change = False
exec_kwargs = {}
for e in self.executors:
if e.can_execute_instruction(instr):
# Execute current instruction by current executor.
# Executor can change index.
execution_result = e.execute_instruction(context, **exec_kwargs)
# If executor does not return result
# create a default one
if not execution_result:
execution_result = ExecutionResult()
if execution_result.consumes_cpu:
cpu_time_consumed = True
# Check if executor changes instructions index itself.
if execution_result.custom_index_management:
custom_instructions_index_change = True
if execution_result.result_kwargs is not None:
exec_kwargs.update(execution_result.result_kwargs)
# Omit other executors if the result says that
if execution_result.finish_instruction_execution:
break
# If index is not changed by executor,
# method has to change it.
if not custom_instructions_index_change:
context.get_current_host().get_current_instructions_context().goto_next_instruction()
# Finish execution of instruction
if execution_result and execution_result.finish_instruction_execution:
break
# Change the index of instructions in current host
# according to the scheduler algorithm.
# It moves index to next instructions context.
context.get_current_host().goto_next_instructions_context() | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/simulator/state.py | state.py |
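# Minimal chain-of-responsibility sketch mirroring the dispatch loop in
# Executor.execute_instruction: every executor that accepts the instruction
# runs, and the loop stops early when a result asks to finish. The toy
# executors below are hypothetical, not AQoPA's.

```python
class Upper:
    """Accepts any string and appends its uppercase form."""
    def can_execute(self, instr):
        return isinstance(instr, str)

    def execute(self, instr, out):
        out.append(instr.upper())


class Stopper:
    """Accepts only 'stop' and asks to skip the remaining executors."""
    def can_execute(self, instr):
        return instr == 'stop'

    def execute(self, instr, out):
        out.append('halt')
        return 'finish'


def dispatch(executors, instr):
    out = []
    for e in executors:
        if e.can_execute(instr):
            if e.execute(instr, out) == 'finish':
                break  # omit the remaining executors
    return out


chain = [Stopper(), Upper()]
normal = dispatch(chain, 'run')    # only Upper accepts 'run'
stopped = dispatch(chain, 'stop')  # Stopper accepts and short-circuits
```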
from aqopa.model import original_name, CallFunctionExpression,\
IdentifierExpression
from aqopa.simulator.error import RuntimeException
###############################
# PREDEFINED FUNCTIONS
###############################
# Function ID
def _predefined_id_function__populate(call_function_expression, host, populator, context=None):
existing_host_name = getattr(call_function_expression, '_host_name', None)
if existing_host_name is not None:
return call_function_expression
arguments = call_function_expression.arguments
if len(arguments) == 0:
expr_host_name = host.name
else:
expr_host_name = arguments[0].identifier
if expr_host_name == original_name(expr_host_name):
expr_host_name += ".0"
setattr(call_function_expression, '_host_name', expr_host_name)
return call_function_expression
def _predefined_id_function__are_equal(left, right):
if left.function_name != right.function_name:
return False
left_host_name = getattr(left, '_host_name', None)
right_host_name = getattr(right, '_host_name', None)
if left_host_name is None or right_host_name is None:
return False
return left_host_name == right_host_name
# Function routing_next
def _predefined_routing_next_function__populate(call_function_expression, host, populator, context=None):
topology_name = call_function_expression.arguments[0].identifier
receiver_function_id_expression = populator.populate(call_function_expression.arguments[1], host)
receiver_function_id_expression = _predefined_id_function__populate(receiver_function_id_expression,
host, populator)
receiver_name = getattr(receiver_function_id_expression, '_host_name')
receiver = None
for h in context.hosts:
if h.name == receiver_name:
receiver = h
break
sender = host
sender_name = sender.name
if len(call_function_expression.arguments) > 2:
sender_function_id_expression = populator.populate(call_function_expression.arguments[2], host)
sender_function_id_expression = _predefined_id_function__populate(sender_function_id_expression,
host, populator)
sender_name = getattr(sender_function_id_expression, '_host_name')
if sender_name != sender.name:
sender = None
for h in context.hosts:
if h.name == sender_name:
sender = h
break
if sender is None:
raise RuntimeException("Host '%s' undefined." % sender_name)
if receiver is None:
raise RuntimeException("Host '%s' undefined." % receiver_name)
if sender == receiver:
next_host = receiver
else:
next_host = context.channels_manager.get_router().get_next_hop_host(topology_name, sender, receiver)
if next_host is None:
raise RuntimeException("The route from host '%s' to host '%s' cannot be found."
% (sender_name, receiver_name))
# DEBUG
# print host.name, ': ', unicode(call_function_expression), ' -> ', next_host.name
# DEBUG
id_function = CallFunctionExpression('id', arguments=[IdentifierExpression(next_host.name)])
setattr(id_function, '_host_name', next_host.name)
return id_function
###############################
# PREDEFINED MANAGEMENT
###############################
class FunctionsManager():
def __init__(self, context):
self.context = context
self.functions_map = {
'id': {
'populate': _predefined_id_function__populate,
'are_equal': _predefined_id_function__are_equal
},
'routing_next': {
'populate': _predefined_routing_next_function__populate,
'are_equal': None
}
}
def is_function_predefined(self, function_name):
return function_name in self.functions_map
def populate_call_function_expression_result(self, call_function_expression, host, populator):
fun_name = call_function_expression.function_name
if not self.is_function_predefined(fun_name):
return call_function_expression
return self.functions_map[fun_name]['populate'](call_function_expression, host, populator, context=self.context)
def are_equal_call_function_expressions(self, left, right):
if not isinstance(left, CallFunctionExpression) or not isinstance(right, CallFunctionExpression):
raise RuntimeException("Trying to compare predefined functions expressions, but they are not.")
return self.functions_map[left.function_name]['are_equal'](left, right) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/simulator/predefined.py | predefined.py |
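# Sketch of the dispatch-table pattern used by FunctionsManager above:
# behaviour for each predefined function name is looked up in a dict of
# handler functions, and unknown names are passed through untouched.
# The handlers here are placeholders, not AQoPA's real ones.

```python
def _populate_id(expr, host):
    # Placeholder standing in for _predefined_id_function__populate
    return ('id', host)


def _populate_routing_next(expr, host):
    # Placeholder standing in for _predefined_routing_next_function__populate
    return ('routing_next', host)


FUNCTIONS = {
    'id': {'populate': _populate_id},
    'routing_next': {'populate': _populate_routing_next},
}


def populate(name, expr, host):
    if name not in FUNCTIONS:       # not predefined: leave untouched
        return expr
    return FUNCTIONS[name]['populate'](expr, host)


unchanged = populate('hash', 'expr', 'A.0')
handled = populate('id', 'expr', 'A.0')
```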
from aqopa.simulator.state import HOOK_TYPE_PRE_HOST_LIST_EXECUTION,\
HookExecutor, HOOK_TYPE_PRE_INSTRUCTION_EXECUTION,\
HOOK_TYPE_POST_INSTRUCTION_EXECUTION, HOOK_TYPE_SIMULATION_FINISHED
from aqopa.simulator.error import RuntimeException,\
EnvironmentDefinitionException, InfiniteLoopException
class Simulator():
"""
Interpreter's Model Simulator
"""
def __init__(self, context):
self.context = context # Simulation context (keeps information about current state)
self._ready = False
self._hooks = {}
self._modules = []
self._executor = None
self._before_instruction_executor = HookExecutor()
self._after_instruction_executor = HookExecutor()
self._first_loop = True
self._infinite_loop_error = False
def _execute_hook(self, hook_type):
"""
Execute all hooks of given type.
Pre instruction and post instruction hooks
cannot be executed manually.
"""
if hook_type not in self._hooks:
return
if hook_type in [HOOK_TYPE_PRE_INSTRUCTION_EXECUTION, HOOK_TYPE_POST_INSTRUCTION_EXECUTION]:
raise RuntimeException("Cannot execute pre instruction and post instruction hooks manually.")
for h in self._hooks[hook_type]:
h.execute(self.context)
def _install_modules(self):
"""
Method installs registered modules.
"""
if self._ready:
raise EnvironmentDefinitionException('Cannot install modules in prepared simulation, they were already installed.')
for m in self._modules:
m.install(self)
def _internal_goto_next_state(self):
"""
Internally goes to next state.
The difference is that it does not check
whether simulation is finished.
"""
if self.context.has_epoch_ended():
self._execute_hook(HOOK_TYPE_PRE_HOST_LIST_EXECUTION)
if self._first_loop:
self._first_loop = False
else:
# If nothing has changed in this next state generation loop
if not self.context.any_host_changed():
# Raise a runtime error if any channel has dropped a message
for ch in self.context.channels_manager.channels:
if ch.get_dropped_messages_nb() > 0:
raise InfiniteLoopException()
for h in self.context.hosts:
if not h.finished():
h.finish_failed(u'Infinite loop occurred on instruction: %s' %
unicode(h.get_current_instructions_context().get_current_instruction()))
self.context.mark_all_hosts_unchanged()
self.context.get_current_host().touch()
self._executor.execute_instruction(self.context)
self.context.goto_next_host()
# Public
def get_executor(self):
""" executor getter """
return self._executor
def set_executor(self, executor):
""" executor setter """
self._executor = executor
def infinite_loop_occured(self):
""" Returns True if infinite loop occured """
return self._infinite_loop_error
def register_module(self, module):
"""
Register module for simulation.
"""
if self._ready:
raise EnvironmentDefinitionException('Cannot register module in prepared simulation.')
self._modules.append(module)
return self
def prepare(self):
"""
Prepare simulator to start.
"""
self._install_modules()
self._first_loop = True
self._ready = True
return self
def register_hook(self, hook_type, hook):
"""
Registers new hook of particular type
"""
if hook_type not in self._hooks:
self._hooks[hook_type] = []
if hook in self._hooks[hook_type]:
raise EnvironmentDefinitionException(u"Hook '%s' is already registered." % unicode(hook))
self._hooks[hook_type].append(hook)
if hook_type == HOOK_TYPE_PRE_INSTRUCTION_EXECUTION:
self._before_instruction_executor.add_hook(hook)
if hook_type == HOOK_TYPE_POST_INSTRUCTION_EXECUTION:
self._after_instruction_executor.add_hook(hook)
return self
def is_ready_to_run(self):
"""
Returns True if simulator is ready to run
"""
return self._ready
def is_simulation_finished(self):
"""
Returns True if simulation has ended.
Simulation can end with success or error (e.g. an infinite loop occurred).
"""
return self.is_ready_to_run() and (self.context.all_hosts_finished() or self.infinite_loop_occured())
def count_dropped_messages(self):
"""
Counts left (undelivered) messages as dropped on each channel
"""
for ch in self.context.channels_manager.channels:
ch.add_left_messages_to_dropped()
def run(self):
"""
Runs whole simulation process.
"""
if not self.is_ready_to_run():
raise EnvironmentDefinitionException("Simulation is not yet ready to run.")
self._executor.prepend_instruction_executor(self._before_instruction_executor)
self._executor.append_instruction_executor(self._after_instruction_executor)
while not self.is_simulation_finished():
try:
self._internal_goto_next_state()
except InfiniteLoopException:
self._infinite_loop_error = True
self.count_dropped_messages()
self._execute_hook(HOOK_TYPE_SIMULATION_FINISHED) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/simulator/__init__.py | __init__.py |
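# Standalone sketch of the progress check in _internal_goto_next_state: each
# full pass over the hosts clears their "changed" flags, and a pass in which
# nothing changed signals an infinite loop. This is a toy model; AQoPA's real
# check also inspects channel drop counters before failing the hosts.

```python
class _Host:
    def __init__(self, steps, blocked=False):
        self.steps = steps
        self.blocked = blocked
        self.changed = False

    def finished(self):
        return not self.blocked and self.steps == 0

    def run_once(self):
        if self.blocked or self.steps == 0:
            return            # waiting forever: no progress, no change
        self.steps -= 1
        self.changed = True   # real hosts mark themselves via touch()


def run(hosts, max_epochs=100):
    for epoch in range(max_epochs):
        if all(h.finished() for h in hosts):
            return 'finished'
        # After the first epoch, a pass with no change means no progress.
        if epoch > 0 and not any(h.changed for h in hosts):
            return 'infinite loop'
        for h in hosts:
            h.changed = False
        for h in hosts:
            h.run_once()
    return 'timeout'


ok = run([_Host(2), _Host(3)])
stuck = run([_Host(1, blocked=True)])
```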
from aqopa.model import AlgWhile, TupleExpression, AlgCallFunction, AlgReturn, AlgIf, AlgAssignment
from aqopa.simulator.error import RuntimeException
class AlgorithmResolver():
def __init__(self):
self.algorithms = {}
def add_algorithm(self, name, algorithm):
self.algorithms[name] = algorithm
def has_algorithm(self, name):
return name in self.algorithms
def get_algorithm(self, name):
return self.algorithms[name]
def calculate(self, context, host, alg_name, variables=None):
if not self.has_algorithm(alg_name):
return 0
if variables is None:
variables = {}
return AlgorithmCalculator(context, host, alg_name, variables, self.algorithms[alg_name]).calculate()
class AlgorithmCalculator():
def __init__(self, context, host, algorithm_name, variables, algorithm):
self.context = context
self.host = host
self.algorithm_name = algorithm_name
self.variables = variables
self.algorithm = algorithm
self.instructions = algorithm['instructions']
self.link_quality = variables['link_quality'] if 'link_quality' in variables else 1
self.instructions_stack = [algorithm['instructions']]
self.instructions_stack_index = 0
self.instructions_lists_indexes = {0: 0}
self.left_associative_operators = ['--', '-', '+', '*', '/', '==', '!=', '<=', '>=', '>', '<', '&&', '||']
self.right_associative_operators = []
self.all_operators = self.left_associative_operators + self.right_associative_operators
self.return_value = None
def get_index_in_current_list(self):
""" Returns the index of instruction in current list """
return self.instructions_lists_indexes[self.instructions_stack_index]
def in_main_stack(self):
""" Returns True when current instruction is in main stack """
return self.instructions_stack_index == 0
def has_current_instruction(self):
""" """
return self.get_index_in_current_list() < len(self.instructions_stack[self.instructions_stack_index])
def get_current_instruction(self):
""" Returns the instruction that should be executed next """
return self.instructions_stack[self.instructions_stack_index][self.get_index_in_current_list()]
def finished(self):
""" Returns True when algorithm is finished """
return self.return_value is not None or (not self.has_current_instruction() and self.in_main_stack())
def goto_next_instruction(self):
""" """
self.instructions_lists_indexes[self.instructions_stack_index] += 1
while not self.finished() and not self.has_current_instruction():
self.instructions_stack.pop()
del self.instructions_lists_indexes[self.instructions_stack_index]
self.instructions_stack_index -= 1
if not self.finished():
if not isinstance(self.get_current_instruction(), AlgWhile):
self.instructions_lists_indexes[self.instructions_stack_index] += 1
def add_instructions_list(self, instructions):
""" Adds new instructions list to the stack """
self.instructions_stack.append(instructions)
self.instructions_stack_index += 1
self.instructions_lists_indexes[self.instructions_stack_index] = 0
def calculate_function_value(self, call_function_instruction):
if call_function_instruction.function_name == 'size':
var_name = call_function_instruction.args[0]
if var_name not in self.variables:
raise RuntimeException("Variable {0} not defined in communication algorithm {1}."
.format(var_name, self.algorithm_name))
value = self.variables[var_name]
# If tuple element expression
if len(call_function_instruction.args) > 1:
if not isinstance(value, TupleExpression):
raise RuntimeException("Variable {0} in communication algorithm {1} is not tuple, it is: {2}."
.format(var_name, self.algorithm_name, unicode(value)))
index = call_function_instruction.args[1]
if len(value.elements) <= index:
raise RuntimeException("Variable {0} in communication algorithm {1} has "
"{2} elements while index {3} is asked."
.format(var_name, self.algorithm_name, len(value.elements), index))
value = value.elements[index]
return self.context.metrics_manager.get_expression_size(value, self.context, self.host)
elif call_function_instruction.function_name == 'quality':
if self.link_quality is None:
raise RuntimeException("Link quality is undefined in {0} algorithm. "
.format(self.algorithm_name))
return self.link_quality
raise RuntimeException("Unresolved reference to function {0}() in algorithm {1}."
.format(call_function_instruction.function_name, self.algorithm_name))
def _is_operation_token(self, token):
return isinstance(token, basestring) and token in self.all_operators
def _operator_order(self, operator):
"""
Returns the order of operator as number.
"""
orders = [['==', '!=', '<=', '>=', '>', '<', '&&', '||'], ['--', '-', '+'], ['*', '/']]
for i in range(0, len(orders)):
if operator in orders[i]:
return i
raise RuntimeException("Operator {0} undefined in algorithm {1}.".format(operator, self.algorithm_name))
def _make_rpn(self, expression):
""" """
stack = []
rpn = []
for token in expression:
# if operator
if self._is_operation_token(token):
while len(stack) > 0:
top_operator = stack[len(stack)-1]
# if current operator is left-associative and its order is lower or equal than top operator
# or current operator is right-associative and its order is lower than top operator
if (token in self.left_associative_operators
and self._operator_order(token) <= self._operator_order(top_operator))\
or (token in self.right_associative_operators
and self._operator_order(token) < self._operator_order(top_operator)):
rpn.append(stack.pop())
else:
break
stack.append(token)
elif token == '(':
stack.append(token)
            elif token == ')':
                found_paren = False
                while len(stack) > 0:
                    top_operator = stack[len(stack)-1]
                    if top_operator == '(':
                        found_paren = True
                        stack.pop()
                        break
                    else:
                        rpn.append(stack.pop())
                if not found_paren:
                    raise RuntimeException("Incorrect number of brackets in algorithm {0}.".format(self.algorithm_name))
else: # else number
if isinstance(token, AlgCallFunction):
token = self.calculate_function_value(token)
elif isinstance(token, basestring):
if token not in self.variables:
raise RuntimeException("Variable {0} not defined in communication algorithm {1}."
.format(token, self.algorithm_name))
token = self.variables[token]
rpn.append(float(token))
while len(stack) > 0:
rpn.append(stack.pop())
return rpn
def _calculate_operation(self, operator, left, right):
        """
        Applies a binary arithmetic, comparison or logical operator to two operands.
        """
if operator == '+':
return left + right
elif operator == '-':
return left - right
elif operator == '*':
return left * right
elif operator == '/':
return left / right
elif operator == '==':
return left == right
elif operator == '!=':
return left != right
elif operator == '>=':
return left >= right
elif operator == '>':
return left > right
elif operator == '<=':
return left <= right
        elif operator == '<':
            return left < right
        elif operator == '&&':
            return left and right
        elif operator == '||':
            return left or right
        else:
            raise RuntimeException("Incorrect operator {0} in algorithm {1}."
                                   .format(operator, self.algorithm_name))
def _calculate_rpn(self, rpn_elements):
        """
        Evaluates an expression given in Reverse Polish Notation.
        """
stack = []
for token in rpn_elements:
# if operator
if self._is_operation_token(token):
if token == '--':
value = stack.pop()
value = - value
stack.append(value)
else:
a = stack.pop()
b = stack.pop()
stack.append(self._calculate_operation(token, b, a))
else: # number
stack.append(token)
return stack.pop()
def calculate_value(self, expression):
rpn_elements = self._make_rpn(expression)
return self._calculate_rpn(rpn_elements)
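The two steps above, infix-to-RPN conversion and RPN evaluation, can be sketched as a standalone, simplified version (the operator set is reduced and all names here are illustrative, not part of AQoPA):

```python
def to_rpn(tokens):
    # Precedence mirrors _operator_order: comparisons < additive < multiplicative
    order = {'==': 0, '!=': 0, '<': 0, '>': 0, '+': 1, '-': 1, '*': 2, '/': 2}
    stack, rpn = [], []
    for tok in tokens:
        if tok in order:
            # Pop operators of higher-or-equal precedence (left-associative)
            while stack and stack[-1] in order and order[tok] <= order[stack[-1]]:
                rpn.append(stack.pop())
            stack.append(tok)
        elif tok == '(':
            stack.append(tok)
        elif tok == ')':
            while stack and stack[-1] != '(':
                rpn.append(stack.pop())
            stack.pop()  # drop the '('
        else:
            rpn.append(float(tok))
    while stack:
        rpn.append(stack.pop())
    return rpn

def eval_rpn(rpn):
    ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
           '*': lambda a, b: a * b, '/': lambda a, b: a / b}
    stack = []
    for tok in rpn:
        if tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(tok)
    return stack.pop()
```

For example, `eval_rpn(to_rpn(['2', '+', '3', '*', '4']))` honours precedence and yields 14.0.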
def execute_current_instruction(self):
current_instruction = self.get_current_instruction()
if isinstance(current_instruction, AlgReturn):
self.return_value = self.calculate_value(current_instruction.expression)
self.goto_next_instruction()
elif isinstance(current_instruction, AlgWhile):
if len(current_instruction.instructions) > 0 and self.calculate_value(current_instruction.condition):
self.add_instructions_list(current_instruction.instructions)
else:
self.goto_next_instruction()
elif isinstance(current_instruction, AlgIf):
if self.calculate_value(current_instruction.condition):
instructions = current_instruction.true_instructions
else:
instructions = current_instruction.false_instructions
if len(instructions) > 0:
self.add_instructions_list(instructions)
else:
self.goto_next_instruction()
elif isinstance(current_instruction, AlgAssignment):
self.variables[current_instruction.identifier] = self.calculate_value(current_instruction.expression)
self.goto_next_instruction()
def calculate(self):
while not self.finished():
self.execute_current_instruction()
if self.return_value is None:
raise RuntimeException("Algorithm {0} has no return value. Did you forget to use return instruction?"
.format(self.algorithm_name))
        return self.return_value

# end of aqopa/simulator/algorithm.py
# begin aqopa/simulator/communication.py
import copy
from aqopa.model import CommunicationInstruction, TupleExpression, ComparisonExpression, COMPARISON_TYPE_EQUAL, \
CallFunctionExpression, AlgWhile, AlgReturn, AlgIf, AlgAssignment, AlgCallFunction
from aqopa.simulator.error import RuntimeException
class ChannelMessage():
"""
Message sent through channel.
"""
def __init__(self, sender, expression, expression_checker):
self.sender = sender # The host that sent the expression
self.expression = expression # The sent expression
self.expression_checker = expression_checker # checker used to check if message passes given filters
        self.not_for_requests = []  # List of requests that this message must not be used by
def cancel_for_request(self, request):
self.not_for_requests.append(request)
def is_for_request(self, request):
return request not in self.not_for_requests
    def is_used(self):
        return len(self.not_for_requests) > 0
def pass_filters(self, filters):
"""
Returns True if message passes all given filters
"""
if len(filters) == 0:
return True
else:
if not isinstance(self.expression, TupleExpression) \
or (len(self.expression.elements) < len(filters)):
return False
for i in range(0, len(filters)):
f = filters[i]
if isinstance(f, basestring): # star - accept everything
continue
else:
cmp_expr = ComparisonExpression(f, self.expression.elements[i], COMPARISON_TYPE_EQUAL)
if not self.expression_checker.result(cmp_expr, self.sender):
return False
return True
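The filter semantics above can be illustrated with a simplified standalone check (here plain equality replaces the expression checker, so this is only a sketch with illustrative names):

```python
def passes_filters(elements, filters):
    """'*' accepts any value at its position; otherwise values must match.
    Simplified: real AQoPA filters are compared through the expression checker."""
    if not filters:
        return True
    # A message with fewer elements than filters cannot match
    if len(elements) < len(filters):
        return False
    return all(f == '*' or f == e for f, e in zip(filters, elements))
```

A message `('request', 'id7', 'payload')` passes the filter list `['request', '*']`, while `('response', 'id7')` does not.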
class ChannelMessageRequest():
"""
Representation of hosts' requests for messages.
"""
def __init__(self, receiver, communication_instruction, expression_populator):
""" """
self.is_waiting = True
self.receiver = receiver
self.instruction = communication_instruction
self.expression_populator = expression_populator # populator used to populate current version of filters
self.assigned_message = None
def get_populated_filters(self):
"""
Get list of filters with populated expressions - ready for comparison.
"""
filters = []
for f in self.instruction.filters:
if f == '*':
filters.append(f)
else:
filters.append(self.expression_populator.populate(f, self.receiver))
return filters
def clean(self):
self.is_waiting = False
self.assigned_message = None
def start_waiting(self):
self.is_waiting = True
def get_message_from_buffer(self, buffer):
        """
        Takes the first message matching the filters from the given buffer.
        Returns True if a message was assigned, False otherwise.
        """
# Get messages only if receiver is actually waiting
filters = self.get_populated_filters()
if self.is_waiting:
if self.assigned_message is None and len(buffer) > 0:
for message in buffer:
# Check if message passes the filters
if not message.pass_filters(filters):
continue
self.assigned_message = message
buffer.remove(message)
return True
return False
def ready_to_fulfill(self):
        """
        Returns True when the request has received the message it is waiting for.
        """
return self.assigned_message is not None
def fulfill(self):
"""
Assigns requested variables with received messages.
"""
if not self.ready_to_fulfill():
raise RuntimeException("Request of host {0} in instruction {1} is not ready to fulfill."
.format(self.receiver.name, unicode(self.instruction)))
# Set variable sent in message
self.receiver.set_variable(self.instruction.variable_name, self.assigned_message.expression.clone())
# Move instructions context to the next instruction
# print unicode(self.receiver.get_instructions_context_of_instruction(self.instruction).get_current_instruction())
self.receiver.get_instructions_context_of_instruction(self.instruction)\
.goto_next_instruction()
# print unicode(self.receiver.get_instructions_context_of_instruction(self.instruction))
self.receiver.mark_changed()
class Channel():
"""
Simulation channel.
"""
def __init__(self, name, buffer_size, tag_name=None):
self.name = name
self.tag_name = tag_name
        self._buffer_size = buffer_size  # Size of buffer; negative means that the buffer is unlimited,
                                         # zero means that the channel is synchronous
self._connected_hosts = [] # List of hosts that can use this channel
self._connected_processes = [] # List of processes that can use this channel
self._buffers = {} # Hosts' buffers - the host is the key, the list of sent messages is the value
self._waiting_requests = []
self._dropped_messages_cnt = 0
def connect_with_host(self, host):
"""
Connect channel with host if it is not already connected.
"""
if host in self._connected_hosts:
return
self._connected_hosts.append(host)
def is_connected_with_host(self, host):
"""
Returns True if host is connected with this channel.
"""
return host in self._connected_hosts
def connect_with_process(self, process):
"""
Connect channel with process if it is not already connected.
"""
if process in self._connected_processes:
return
self._connected_processes.append(process)
def is_connected_with_process(self, process):
"""
Returns True if process is connected with this channel.
"""
return process in self._connected_processes
def is_synchronous(self):
"""
Returns True if channel is synchronous.
"""
return self._buffer_size == 0
def has_unlimited_buffer(self):
"""
Returns True if channel has unlimited buffer.
"""
return self._buffer_size < 0
def get_dropped_messages_nb(self):
"""
Return number of messages dropped on this channel
"""
return self._dropped_messages_cnt
def get_buffer_for_host(self, host):
"""
Returns the content of buffer assigned to host.
"""
if host not in self._buffers:
self._buffers[host] = []
return self._buffers[host]
def get_existing_request_for_instruction(self, receiver, instruction):
"""
Returns request on which channel is waiting for message or None if it is not waiting.
"""
for i in range(0, len(self._waiting_requests)):
request = self._waiting_requests[i]
if request.instruction == instruction and request.receiver == receiver:
return request
return None
def is_waiting_on_instruction(self, receiver, instruction):
"""
Returns True if channel is waiting for message.
"""
request = self.get_existing_request_for_instruction(receiver, instruction)
if request:
return request.is_waiting
return False
def wait_for_message(self, request):
"""
Add host to the queue of hosts waiting for messages.
Host can wait for many expressions.
"""
if not self.is_connected_with_host(request.receiver):
raise RuntimeException("Channel '%s' is not connected with host '%s'." % (self.name, request.receiver.name))
# Check if this request already exists
existing_request = self.get_existing_request_for_instruction(request.receiver, request.instruction)
# If it does not exist add
if not existing_request:
# print 'IN', unicode(self.name), unicode(request.receiver), unicode(request.instruction)
self._waiting_requests.append(request)
else:
# If request exists, check if it is waiting on IN instruction now
if not existing_request.is_waiting:
                # If it is not (the request stays in the channel only to fill the buffer), start waiting on this request
existing_request.start_waiting()
# Check if request can be bound with expressions
self._bind_messages_with_receivers()
def get_filtered_requests(self, message, router):
"""
Returns list of requests that can accept the message
"""
requests = []
for request in self._waiting_requests:
# Check if there is a link between sender and receiver
if not router.link_exists(self, message.sender, request.receiver):
continue
# Check if message has not been declined for waiting host
if not message.is_for_request(request):
continue
# Check if message passes the filters
filters = request.get_populated_filters()
if not message.pass_filters(filters):
continue
requests.append(request)
return requests
def get_filtered_messages(self, request, router):
"""
Returns list of messages from buffer that can be assigned to request
"""
messages = []
buffer = self.get_buffer_for_host(request.receiver)
filters = request.get_populated_filters()
for message in buffer:
# Check if there is a link between sender and receiver
if not router.link_exists(self, message.sender, request.receiver):
continue
# Check if message passes the filters
if not message.pass_filters(filters):
continue
messages.append(message)
return messages
def send_message(self, sender_host, message, router):
"""
Accept message with expressions.
"""
if not self.is_connected_with_host(sender_host):
raise RuntimeException("Channel '%s' is not connected with host '%s'." % (self.name, sender_host.name))
# print 'OUT', unicode(self.name), unicode(sender_host), unicode(message.expression)
# Put sent message in the buffers of receivers
# Receivers are retrieved from the requests present in the channel
# When the channel is synchronous the buffers are cleaned after the binding try
# and requests are removed after they are fulfilled
for request in self.get_filtered_requests(message, router):
if request.receiver not in self._buffers:
self._buffers[request.receiver] = []
self._buffers[request.receiver].append(message)
# Check if request can be bound with expressions
self._bind_messages_with_receivers()
def _bind_messages_with_receivers(self):
"""
Assign messages from buffers to requests.
"""
        # Update all requests (iterate over a copy, as fulfilled requests
        # may be removed from the list on synchronous channels)
        for request in list(self._waiting_requests):
            # Add messages from buffer to request
            # Add no more than the request wants (i.e. one message)
if request.receiver not in self._buffers:
self._buffers[request.receiver] = []
request.get_message_from_buffer(self._buffers[request.receiver])
# Here the request is filled with all requested messages
# or there was not enough messages in the buffer
# If the request is filled with all requested messages = ready to fulfill
if request.ready_to_fulfill():
# Fulfill it
request.fulfill()
# print "Binded: ", unicode(request.instruction), unicode(request.receiver)
# If channel is synchronous, delete the request - a new one will be created
                # when the instruction is executed again
if self.is_synchronous():
self._waiting_requests.remove(request)
else:
# If channel is asynchronous, clean the request - still accept messages to the buffer
request.clean()
# Clean buffers if channel is synchronous
if self.is_synchronous():
for i in self._buffers:
self._buffers[i] = []
        elif self._buffer_size > 0:  # Channel has a limited buffer
            for i in self._buffers:
                if len(self._buffers[i]) > self._buffer_size:
                    self._dropped_messages_cnt += len(self._buffers[i]) - self._buffer_size
                    # Delete with a slice: removing by index while the list
                    # shrinks would skip elements
                    del self._buffers[i][self._buffer_size:]
def add_left_messages_to_dropped(self):
"""
If there are any undelivered messages, add their number to the number of dropped messages.
"""
for i in self._buffers:
self._dropped_messages_cnt += len(self._buffers[i])
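The buffer-limit handling above boils down to trimming each buffer to the configured size and counting what was dropped; a minimal sketch (names are illustrative):

```python
def trim_buffer(buf, limit):
    """Trims buf in place to at most limit messages; returns how many were dropped."""
    dropped = max(0, len(buf) - limit)
    # Slice deletion removes everything past the limit in one step
    del buf[limit:]
    return dropped
```

For instance, trimming a four-message buffer to two drops the last two messages and reports 2.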
class Manager():
""" Channels manager class """
def __init__(self, channels):
self.channels = channels
self.router = Router()
def add_medium(self, name, topology, default_parameters):
self.router.add_medium(name, topology, default_parameters)
def has_medium(self, name):
return self.router.has_medium(name)
# def add_algorithm(self, name, algorithm):
# self.algorithm_resolver.add_algorithm(name, algorithm)
#
# def has_algorithm(self, name):
# return self.algorithm_resolver.has_algorithm(name)
#
# def get_algorithm(self, name):
# return self.algorithm_resolver.get_algorithm(name)
def get_router(self):
return self.router
# def get_algorithm_resolver(self):
# return self.algorithm_resolver
def find_channel(self, name, predicate=None):
        """
        Returns the channel with the given name satisfying the optional predicate.
        """
for ch in self.channels:
if ch.name == name:
if predicate is not None and not predicate(ch):
continue
return ch
return None
def find_channel_for_current_instruction(self, context):
"""
Finds channel connected with process/host from current instruction
and with the channel name from instruction.
"""
instruction = context.get_current_instruction()
if not isinstance(instruction, CommunicationInstruction):
return None
return self.find_channel_for_host_instruction(context, context.get_current_host(),
instruction)
def find_channel_for_host_instruction(self, context, host, instruction):
"""
Finds channel connected with process/host from current instruction
of passed host and with the channel name from instruction.
"""
process = host.get_current_instructions_context().get_process_of_current_list()
if process:
return self.find_channel(instruction.channel_name,
predicate=lambda x: x.is_connected_with_process(process))
else:
return self.find_channel(instruction.channel_name,
predicate=lambda x: x.is_connected_with_host(host))
def build_message(self, sender, expression, expression_checker):
"""
Creates a message that will be sent.
"""
return ChannelMessage(sender, expression, expression_checker)
def build_message_request(self, receiver, communication_instruction,
expression_populator):
"""
Creates a request for message when host executes instruction IN
"""
return ChannelMessageRequest(receiver, communication_instruction, expression_populator)
class Router():
def __init__(self):
self.mediums = {} # { MEDIUM_NAME -> {
# 'default_q' -> ...,
# 'default_...' -> ...,
# 'default_...' -> ...,
# 'topology' ->
# { SENDER -> {
# 'hosts': [HOST, HOST, ...]
# 'q': { HOST -> QUALITY, HOST -> QUALITY, ... },
# PARAMETER: { HOST -> QUALITY, HOST -> QUALITY, ... },
# ...
# } }
# }
# }
self.routing = {} # { MEDIUM_NAME ->
# SENDER -> {
# NEXT -> [RECEIVER1, RECEIVER2, ...),
# ...
# }
# }
def add_medium(self, name, topology, default_parameters):
self.mediums[name] = {
'topology': topology,
'default_parameters': default_parameters}
# { MEDIUM_NAME -> {
# 'default_parameters' -> {
# 'default_q' -> ...,
# 'default_...' -> ...,
# 'default_...' -> ...,
# }
#
# 'topology' ->
# { SENDER -> {
# 'hosts': [HOST, HOST, ...]
# 'q': { HOST -> QUALITY, HOST -> QUALITY, ... }
# PARAMETER: { HOST -> VALUE, HOST -> VALUE, ... },
# ...
# } }
# }
# }
self.routing[name] = {} # SENDER -> {
# NEXT -> [RECEIVER1, RECEIVER2, ...),
# ...
# }
def has_medium(self, name):
return name in self.mediums
def link_exists(self, channel, sender, receiver):
"""
        Returns True if a link exists between sender and receiver.
        If a communication structure is defined, the link must be specified in it.
        If the channel's medium has no structure, it is assumed that the channel may be used by all hosts.
"""
medium_name = channel.tag_name
if medium_name not in self.mediums:
return True
topology = self.mediums[medium_name]['topology']
if len(topology) == 0:
return True
return self.get_link_quality(medium_name, sender, receiver) is not None
def get_link_parameter_value(self, parameter, medium_name, sender, receiver=None, default=None, no_link_value=None):
"""
        Returns the value of the parameter between sender and receiver in the medium.
        When receiver is None, the value for a broadcasting sender is returned.
"""
def topology_default():
defaults = self.mediums[medium_name]['default_parameters']
default_parameter_name = 'default_'+parameter
if default_parameter_name in defaults:
return defaults[default_parameter_name]
return default
if medium_name not in self.mediums:
return default
medium = self.mediums[medium_name]
if len(medium['topology']) == 0:
return default
if sender not in medium['topology']:
return no_link_value
sender_hosts = medium['topology'][sender]['hosts']
sender_topology = medium['topology'][sender]['parameters']
if parameter not in sender_topology:
if receiver in sender_hosts:
return topology_default()
else:
return no_link_value
sender_parameters = sender_topology[parameter]
if receiver not in sender_parameters:
if receiver in sender_hosts:
return topology_default()
else:
return no_link_value
return sender_parameters[receiver]
def get_link_quality(self, medium_name, sender, receiver, default=1, no_link_value=None):
        """
        Returns the quality of the link between sender and receiver.
        """
return self.get_link_parameter_value('q', medium_name, sender, receiver, default=default,
no_link_value=no_link_value)
def get_sender_links_qualities(self, medium_name, sender, exclude_broadcast=False):
        """
        Returns a dictionary mapping each host reachable from sender to its link quality.
        """
medium = self.mediums[medium_name]
if sender not in medium['topology']:
return {}
default_quality = 1
defaults = medium['default_parameters']
if 'default_q' in defaults:
default_quality = defaults['default_q']
sender_topology = medium['topology'][sender]
sender_qualities = sender_topology['parameters']['q'] if 'q' in sender_topology['parameters'] else {}
qualities = {}
for h in sender_topology['hosts']:
if h is None and exclude_broadcast:
continue
if h in sender_qualities:
qualities[h] = sender_qualities[h]
else:
qualities[h] = default_quality
return qualities
def _find_existing_next_hop_host(self, topology_name, sender, receiver):
"""
Finds existing next hop host in the path from sender to receiver.
If path does not exist, return None.
"""
routing = self.routing[topology_name]
        if sender not in routing:
return None
sender_routing = routing[sender]
for next_hop in sender_routing:
if receiver in sender_routing[next_hop]:
return next_hop
return None
def get_hosts_sending_to_receiver(self, medium_name, receiver):
"""
        Returns a dictionary:
            host -> quality
        containing hosts that can send a message to the receiver.
"""
if not self.has_medium(medium_name):
return {}
hosts = {}
for sender in self.mediums[medium_name]['topology']:
q = self.get_link_quality(medium_name, sender, receiver)
if q is not None:
hosts[sender] = q
return hosts
def get_next_hop_host(self, medium_name, sender, receiver):
"""
Returns the next host which is in the path between sender and receiver.
If path does not exist, None is returned.
"""
# Check if path already exists
existing_next_hop = self._find_existing_next_hop_host(medium_name, sender, receiver)
if existing_next_hop is not None:
return existing_next_hop
        # Build path using Dijkstra's algorithm
topology = self.mediums[medium_name]['topology']
def find_closest_host(distances, out):
closest_host = None
d = -1
for h in distances:
if d < 0 or d > distances[h]:
if h not in out:
closest_host = h
d = distances[h]
return closest_host, d
        # Dijkstra
        distances = {sender: 0}
        out = []
        closest, closest_distance = find_closest_host(distances, out)
        while (closest is not None) and (closest != receiver):
            if closest in topology:
                qualities = self.get_sender_links_qualities(medium_name, closest, exclude_broadcast=True)
                for next_host in qualities:
                    if (next_host not in distances) \
                            or ((closest_distance + qualities[next_host]) < distances[next_host]):
                        distances[next_host] = closest_distance + qualities[next_host]
            out.append(closest)
            closest, closest_distance = find_closest_host(distances, out)
# for h in distances:
# print h.name, distances[h]
def update_paths(medium_name, receiver, distances):
routing = self.routing[medium_name]
# Start from receiver
current_host = receiver
distance = distances[receiver]
# Add receiver to path
hosts_path = [receiver]
# Repeat until we finish handling sender
while distance > 0:
# Find all hosts that are connected with current host
previous_hosts = self.get_hosts_sending_to_receiver(medium_name, current_host)
for prev_host in previous_hosts:
# If prev host has not calculated distance, omit him
if prev_host not in distances:
continue
# Check if this host is on the shortest path
prev_quality = previous_hosts[prev_host]
if distances[prev_host] + prev_quality == distances[current_host]:
# Go one step earlier
current_host = prev_host
# Decrease current distance
distance = distances[prev_host]
if prev_host not in routing:
routing[prev_host] = {}
next_host = hosts_path[0]
if next_host not in routing[prev_host]:
routing[prev_host][next_host] = []
for i in range(0, len(hosts_path)):
routing[prev_host][next_host].append(hosts_path[i])
# Add host to path
hosts_path.insert(0, prev_host)
break
# Printer().print_routing(self.routing[medium_name])
if receiver not in distances:
            raise RuntimeException("The path between {0} and {1} is undefined.".format(sender.name, receiver.name))
update_paths(medium_name, receiver, distances)
return self._find_existing_next_hop_host(medium_name, sender, receiver)
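The routing above is a Dijkstra shortest-path search over link qualities followed by backtracking to record the path; a compact standalone equivalent that returns only the first hop (the graph shape and names are illustrative, not AQoPA's):

```python
import heapq

def next_hop(graph, sender, receiver):
    """First hop on the cheapest sender->receiver path.
    graph maps node -> {neighbour: cost}; returns None when no path exists."""
    counter = 0  # tie-breaker so the heap never compares nodes directly
    heap = [(0, counter, sender, None)]
    seen = set()
    while heap:
        dist, _, node, first = heapq.heappop(heap)
        if node in seen:
            continue
        seen.add(node)
        if node == receiver:
            return first
        for nxt, cost in graph.get(node, {}).items():
            if nxt not in seen:
                counter += 1
                # Remember the first hop taken from the sender
                heapq.heappush(heap, (dist + cost, counter, nxt,
                                      nxt if first is None else first))
    return None  # no path between sender and receiver
```

With `{'A': {'B': 1, 'C': 5}, 'B': {'C': 1}, 'C': {}}`, the next hop from A to C is B, since A-B-C costs 2 versus 5 for the direct link.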
class Printer():
def print_topology(self, topology):
for sender in topology:
snlen = len(sender.name)
print sender.name
for next in topology[sender]['quality']:
print " " * snlen, ' -> ', next.name, ' : ', topology[sender]['quality'][next]
def print_routing(self, routing):
for sender in routing:
snlen = len(sender.name)
print sender.name
for next in routing[sender]:
nnlen = len(next.name)
print " " * snlen, ' -> ', next.name
for receiver in routing[sender][next]:
                    print " " * (snlen+nnlen+4), ' -> ', receiver.name

# end of aqopa/simulator/communication.py
import copy
#from aqopa.simulator.error import RuntimeException
'''
Created on 23-04-2013
@author: Damian Rusinek <[email protected]>
'''
################################################
# Names functions
################################################
def original_name(name):
"""
Return name without indexes.
"""
return name.split('.')[0]
def name_indexes(name):
    """
    Return the numeric indexes of a name (the dot-separated parts after the base name).
    """
    return [int(i) for i in name.split('.')[1:]]
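For example, the dotted-index naming convention handled above works like this (self-contained copies of the two helpers, for illustration only):

```python
def original_name(name):
    # 'Server.1.2' -> 'Server'
    return name.split('.')[0]

def name_indexes(name):
    # 'Server.1.2' -> [1, 2]; plain 'Server' -> []
    return [int(i) for i in name.split('.')[1:]]
```

So a host instance named `'Server.1.2'` has base name `'Server'` and indexes `[1, 2]`.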
################################################
# QoPML Model Elements
################################################
class Function():
def __init__(self, name, params = [], qop_params = [], comment = ""):
self.name = name
self.params = params
self.qop_params = qop_params
self.comment = comment
    def clone(self):
        return Function(copy.copy(self.name), copy.deepcopy(self.params),
                        copy.deepcopy(self.qop_params), copy.copy(self.comment))
def __unicode__(self):
comment = ""
if self.comment:
comment = "( %s )" % self.comment
qop_params = ""
if len(self.qop_params):
qop_params = [ '%s : %s' % (name, ', '.join(pars)) for (name, pars) in self.qop_params ]
qop_params = ', '.join(qop_params)
qop_params = "[%s]" % qop_params
return u"fun %s(%s) %s %s" % (self.name, ', '.join(self.params), qop_params, comment)
COMMUNICATION_TYPE_IN = 1
COMMUNICATION_TYPE_OUT = 2
class Channel():
def __init__(self, name, buffor_size, tag_name=None):
self.name = name
        self.buffor_size = buffor_size  # Buffer size < 0 means unlimited
self.tag_name = tag_name
def __unicode__(self):
buffor_size = str(self.buffor_size) if self.buffor_size >= 0 else "*"
return u"channel %s [%s]" % (self.name, buffor_size)
def clone(self):
return Channel(copy.copy(self.name), copy.copy(self.buffor_size))
class Equation():
def __init__(self, simple, composite):
self.simple = simple
self.composite = composite
def __unicode__(self):
return u"eq %s = %s" % ( unicode(self.composite), unicode(self.simple) )
def clone(self):
return Equation(self.simple.clone(), self.composite.clone())
################################
# Expressions
################################
class BooleanExpression():
def __init__(self, val):
self.val = bool(val)
def clone(self):
return BooleanExpression(copy.copy(self.val))
def __unicode__(self):
return u"true" if self.val else u"false"
def is_true(self):
return self.val
class IdentifierExpression():
def __init__(self, identifier):
self.identifier = identifier
def __unicode__(self):
return unicode(self.identifier)
def clone(self):
return IdentifierExpression(copy.copy(self.identifier))
class CallFunctionExpression():
def __init__(self, function_name, arguments=[], qop_arguments=[]):
self.function_name = function_name
self.arguments = arguments
self.qop_arguments = qop_arguments
def __unicode__(self):
u = u"%s(%s)" % (unicode(self.function_name), unicode(', '.join([unicode(a) for a in self.arguments])))
if len(self.qop_arguments) > 0:
u += "[%s]" % unicode(', '.join([unicode(a) for a in self.qop_arguments]))
return u
def clone(self):
# Regular clone
expr = CallFunctionExpression(copy.copy(self.function_name),
[a.clone() for a in self.arguments],
[copy.copy(a) for a in self.qop_arguments])
# Copy additional values (may come from predefined functions or keep calculated size, etc.)
regular_vars = ['function_name', 'arguments', 'qop_arguments']
for attr_name in self.__dict__:
if attr_name not in regular_vars:
setattr(expr, attr_name, copy.deepcopy(self.__dict__[attr_name]))
return expr
COMPARISON_TYPE_EQUAL = 1
COMPARISON_TYPE_NOT_EQUAL = 2
class ComparisonExpression():
def __init__(self, left, right, comparison_type):
self.left = left
self.right = right
self.comparison_type = comparison_type
def __unicode__(self):
return "%s %s %s" % (unicode(self.left), u'==' if self.is_equal_type() else '!=',
unicode(self.right))
def is_equal_type(self):
return self.comparison_type == COMPARISON_TYPE_EQUAL
def clone(self):
return ComparisonExpression(self.left.clone(), self.right.clone(), self.comparison_type)
class TupleExpression():
def __init__(self, elements):
self.elements = elements
def __unicode__(self):
return u"(%s)" % unicode(', '.join([unicode(e) for e in self.elements]))
def clone(self):
return TupleExpression([e.clone() for e in self.elements])
class TupleElementExpression():
def __init__(self, variable_name, index):
self.variable_name = variable_name
self.index = index
def __unicode__(self):
return u"%s[%d]" % (unicode(self.variable_name), self.index)
def clone(self):
return TupleElementExpression(copy.copy(self.variable_name),
copy.copy(self.index))
################################
# Instructions
################################
class CallFunctionInstruction():
def __init__(self, function_name, arguments=[], qop_arguments=[]):
self.function_name = function_name
self.arguments = arguments
self.qop_arguments = qop_arguments
def __unicode__(self):
return u"%s(%s)[%s];" % ( unicode(self.function_name), unicode(', '.join([ unicode(a) for a in self.arguments])),
unicode(', '.join([ unicode(a) for a in self.qop_arguments])) )
def clone(self):
return CallFunctionInstruction(copy.copy(self.function_name),
[ a.clone() for a in self.arguments ],
[ copy.copy(a) for a in self.qop_arguments ])
class FinishInstruction():
def __init__(self, command):
self.command = command
def clone(self):
return FinishInstruction(copy.copy(self.command))
def __unicode__(self):
return "%s;" % unicode(self.command)
class ContinueInstruction():
def clone(self):
return ContinueInstruction()
def __unicode__(self):
return u"continue;"
class BreakInstruction():
def clone(self):
return BreakInstruction()
def __unicode__(self):
return u"break;"
class AssignmentInstruction():
def __init__(self, variable_name, expression):
self.variable_name = variable_name
self.expression = expression
def clone(self):
return AssignmentInstruction(copy.copy(self.variable_name),
self.expression.clone())
def __unicode__(self):
return u"%s = %s;" % (unicode(self.variable_name), unicode(self.expression))
class CommunicationInstruction():
def __init__(self, communication_type, channel_name, variable_name, filters):
self.communication_type = communication_type
self.channel_name = channel_name
self.variable_name = variable_name
self.filters = filters
def clone(self):
filters = []
for f in self.filters:
if isinstance(f, basestring):
filters.append(copy.copy(f))
else:
filters.append(f.clone())
return CommunicationInstruction(copy.copy(self.communication_type),
copy.copy(self.channel_name),
copy.copy(self.variable_name),
filters)
def is_out(self):
return self.communication_type == COMMUNICATION_TYPE_OUT
def __unicode__(self):
if self.communication_type == COMMUNICATION_TYPE_IN:
filters_str = u""
if len(self.filters) > 0:
filters_str = u", ".join([unicode(f) for f in self.filters])
filters_str = u": |%s|" % filters_str
return u"in (%s: %s%s);" % (unicode(self.channel_name),
unicode(self.variable_name),
filters_str)
else:
return u"out (%s: %s);" % (unicode(self.channel_name), unicode(self.variable_name))
class IfInstruction():
def __init__(self, condition, true_instructions, false_instructions=[]):
self.condition = condition
self.true_instructions = true_instructions
self.false_instructions = false_instructions
def clone(self):
t_instructions = []
for i in self.true_instructions:
t_instructions.append(i.clone())
f_instructions = []
for i in self.false_instructions:
f_instructions.append(i.clone())
return IfInstruction(self.condition.clone(), t_instructions, f_instructions)
def __unicode__(self):
return u"if (%s) ..." % unicode(self.condition)
class WhileInstruction():
def __init__(self, condition, instructions):
self.condition = condition
self.instructions = instructions
def __unicode__(self):
return u"while (%s) ..." % unicode(self.condition)
def clone(self):
instructions = []
for i in self.instructions:
instructions.append(i.clone())
return WhileInstruction(self.condition.clone(), instructions)
################################
# Hosts
################################
class HostSubprocess():
def __init__(self, name, instructions_list):
self.name = name
self.instructions_list = instructions_list
self.all_channels_active = False
self.active_channels = []
def clone(self):
instructions_list = []
for i in self.instructions_list:
instructions_list.append(i.clone())
p = HostSubprocess(copy.copy(self.name), instructions_list)
p.all_channels_active = copy.copy(self.all_channels_active)
p.active_channels = copy.deepcopy(self.active_channels)
return p
def __unicode__(self):
return u"subprocess %s" % unicode(self.name)
class HostProcess():
def __init__(self, name, instructions_list):
self.name = name
self.instructions_list = instructions_list
self.all_channels_active = False
self.active_channels = []
self.follower = None
def clone(self):
instructions_list = []
for i in self.instructions_list:
instructions_list.append(i.clone())
p = HostProcess(copy.copy(self.name), instructions_list)
p.all_channels_active = copy.copy(self.all_channels_active)
p.active_channels = copy.deepcopy(self.active_channels)
return p
def __unicode__(self):
return u"process %s" % unicode(self.name)
class Host():
    def __init__(self, name, schedule_algorithm, instructions_list, predefined_values=None):
        self.name = name
        self.schedule_algorithm = schedule_algorithm
        self.instructions_list = instructions_list
        # Avoid a shared mutable default argument
        self.predefined_values = predefined_values if predefined_values is not None else []
self.all_channels_active = False
self.active_channels = []
def __unicode__(self):
return u"host %s" % unicode(self.name)
def clone(self):
instructions_list = []
for i in self.instructions_list:
if isinstance(i, HostProcess):
instructions_list.append(i.clone())
else:
instructions_list.append(i)
predefined_values = []
for i in self.predefined_values:
predefined_values.append(i.clone())
h = Host(copy.copy(self.name), copy.copy(self.schedule_algorithm), instructions_list, predefined_values)
        h.all_channels_active = copy.copy(self.all_channels_active)
        h.active_channels = copy.deepcopy(self.active_channels)
return h
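# Example (sketch, not part of the original module; the schedule algorithm name
# is illustrative): Host.clone() deep-copies nested HostProcess instructions but
# reuses non-process instructions, so the clone's processes can be mutated
# without touching the original:
#
#     host = Host(u"Server", u"fifo", [HostProcess(u"p1", [])])
#     copied = host.clone()
#     assert copied.instructions_list[0] is not host.instructions_list[0]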
################################
# Versions
################################
class Version():
def __init__(self, name):
self.name = name
self.run_hosts = []
self.metrics_sets = []
self.communication = {'mediums': {}}
def __unicode__(self):
        return u"version %s" % unicode(self.name)
def clone(self):
r_hosts = []
for rh in self.run_hosts:
r_hosts.append(rh.clone())
m_sets = []
for ms in self.metrics_sets:
m_sets.append(ms.clone())
v = Version(copy.copy(self.name))
v.run_hosts = r_hosts
v.metrics_sets = m_sets
return v
class VersionRunHost():
def __init__(self, host_name):
self.host_name = host_name
self.all_channels_active = False
self.active_channels = []
self.repetitions = 1
self.repeated_channels = [] # deprecated
self.run_processes = []
def __unicode__(self):
return u"run host %s (...)" % self.host_name
def clone(self):
r_processes = []
for rp in self.run_processes:
r_processes.append(rp.clone())
rh = VersionRunHost(copy.copy(self.host_name))
rh.all_channels_active = self.all_channels_active
rh.active_channels = copy.deepcopy(self.active_channels)
rh.repetitions = self.repetitions
rh.repeated_channels = copy.deepcopy(self.repeated_channels)
rh.run_processes = r_processes
return rh
class VersionRunProcess():
def __init__(self, process_name):
self.process_name = process_name
self.all_subprocesses_active = False
self.active_subprocesses = []
self.repetitions = 1
self.repeated_channels = [] # deprecated
self.follower = None
def __unicode__(self):
s = u"run %s (...)" % self.process_name
if self.follower:
s += " -> %s" % unicode(self.follower)
return s
def clone(self):
rp = VersionRunProcess(copy.copy(self.process_name))
rp.all_subprocesses_active = self.all_subprocesses_active
rp.active_subprocesses = copy.deepcopy(self.active_subprocesses)
rp.repetitions = self.repetitions
rp.repeated_channels = copy.deepcopy(self.repeated_channels)
if self.follower:
rp.follower = self.follower.clone()
return rp
################################
# Metrics
################################
class MetricsConfiguration():
def __init__(self, name, specifications):
self.name = name
self.specifications = specifications
def __unicode__(self):
return u"conf %s { ... }" % self.name
def clone(self):
return MetricsConfiguration(copy.copy(self.name), copy.deepcopy(self.specifications))
class MetricsSet():
def __init__(self, host_name, configuration_name):
self.host_name = host_name
self.configuration_name = configuration_name
def __unicode__(self):
return u"set host %s (%s) { ... }" % (self.host_name, self.configuration_name)
def clone(self):
return MetricsSet(copy.copy(self.host_name), copy.copy(self.configuration_name))
class MetricsData():
    def __init__(self, name, blocks, plus=False, star=False):
self.name = name
self.blocks = blocks
self.star = star
self.plus = plus
def __unicode__(self):
return u"data %s { ... }" % self.name
def clone(self):
blocks = []
for b in self.blocks:
blocks.append(b.clone())
return MetricsData(copy.copy(self.name), blocks, copy.copy(self.plus), copy.copy(self.star))
class MetricsPrimitiveBlock():
def __init__(self, header, metrics):
self.header = header
self.metrics = metrics
def clone(self):
metrics = []
for m in self.metrics:
metrics.append(m.clone())
return MetricsPrimitiveBlock(self.header.clone(), metrics)
class MetricsPrimitiveHeader():
def __init__(self, params, services_params):
self.params = params
self.services_params = services_params
def clone(self):
params = []
for p in self.params:
params.append(p.clone())
s_params = []
for p in self.services_params:
s_params.append(p.clone())
return MetricsPrimitiveHeader(params, s_params)
class MetricsServiceParam():
def __init__(self, service_name, param_name, unit=None):
self.service_name = service_name
self.param_name = param_name
self.unit = unit
def clone(self):
return MetricsServiceParam(copy.copy(self.service_name),
copy.copy(self.param_name),
copy.copy(self.unit))
class MetricsPrimitive():
def __init__(self, arguments):
self.arguments = arguments
def clone(self):
return MetricsPrimitive(copy.deepcopy(self.arguments))
################################
# Communication
################################
class TopologyRuleHost():
def __init__(self, identifier, index_range=None, i_shift=None):
self.identifier = identifier
self.index_range = index_range
self.i_shift = i_shift
def __unicode__(self):
s = self.identifier
if self.index_range is not None:
s += "["
if self.index_range[0] is not None:
s += str(self.index_range[0])
s += ":"
if self.index_range[1] is not None:
s += str(self.index_range[1])
s += "]"
elif self.i_shift is not None:
s += "[i"
if self.i_shift >= 0:
s += "+"
s += str(self.i_shift) + "]"
return unicode(s)
class TopologyRule():
def __init__(self, left_host, arrow, right_host, parameters=None):
self.left_host = left_host
self.right_host = right_host
self.arrow = arrow
self.parameters = parameters if parameters is not None else {}
def __unicode__(self):
params_list = []
for p in self.parameters:
params_list.append(u"{0}={1}".format(p, self.parameters[p]))
params_str = u""
if len(params_list) > 0:
params_str = u" : " + u", ".join(params_list)
right_host_str = u"*"
if self.right_host is not None:
right_host_str = unicode(self.right_host)
return u"{0} {1} {2}{3};".format(unicode(self.left_host), self.arrow, right_host_str, params_str)
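# Example (sketch, not part of the original module): rendering a topology rule.
# Under Python 2, unicode(...) dispatches to __unicode__, so:
#
#     rule = TopologyRule(TopologyRuleHost("Client", index_range=(0, 5)),
#                         "->", None, {"q": 1})
#     unicode(rule)  # -> u"Client[0:5] -> * : q=1;"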
################################
# Algorithms
################################
class AlgCallFunction():
def __init__(self, function_name, args):
self.function_name = function_name
self.args = args
class AlgWhile():
def __init__(self, condition, instructions):
self.condition = condition
self.instructions = instructions
class AlgIf():
def __init__(self, condition, true_instructions, false_instructions):
self.condition = condition
self.true_instructions = true_instructions
self.false_instructions = false_instructions
class AlgReturn():
def __init__(self, expression):
self.expression = expression
class AlgAssignment():
def __init__(self, identifier, expression):
self.identifier = identifier
self.expression = expression | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/__init__.py | __init__.py |
_tabversion = '3.8'
_lr_method = 'LALR'
_lr_signature = 'D349F105D4176011A9739F6739BFAE91'
_lr_action_items = {'STAR':([33,84,152,153,154,155,],[40,92,161,-96,163,-95,]),'MA':([118,119,120,137,],[-99,-98,136,136,]),'TIME_PARAMETER':([169,172,190,],[178,178,178,]),'EQUAL':([96,97,98,99,102,176,178,179,180,182,],[108,109,110,111,116,188,191,192,193,194,]),'SQRPARAN':([63,67,68,69,70,88,136,138,140,141,142,143,150,151,167,173,175,183,185,206,207,],[77,80,-56,-57,-52,-53,-31,-12,-14,-11,-15,-13,159,160,174,186,187,196,199,208,209,]),'MEDIUM_SPECIFICATION':([30,36,37,47,117,],[35,-44,35,-45,-49,]),'RECEIVING_CURRENT_PARAMETER':([169,172,190,],[179,179,179,]),'ARROWLEFT':([146,147,149,174,186,187,199,],[155,-76,-77,-82,-83,-84,-85,]),'COMM_PLUS':([183,],[197,]),'SET':([12,14,15,19,49,50,],[13,-29,13,-30,-33,-32,]),'SEMICOLON':([38,39,100,115,118,119,121,122,123,124,125,126,127,129,130,131,134,135,136,138,139,140,141,142,143,159,160,162,163,164,165,166,174,177,181,184,186,187,196,199,200,201,202,203,204,205,208,209,],[49,50,112,133,-99,-98,-9,-20,-28,-19,-17,-6,-4,-1,-8,-3,-58,-22,-31,-12,-5,-14,-11,-15,-13,-27,-7,-81,-79,-78,-80,171,-82,189,-89,198,-83,-84,-86,-85,-91,-90,-2,-16,-18,-10,-88,-87,]),'RPARAN':([31,32,33,40,42,43,64,84,92,94,],[38,39,41,51,52,-100,-101,93,105,106,]),'TOPOLOGY_SPECIFICATION':([101,112,133,],[114,-54,-55,]),'SENDING_CURRENT_DEFAULT_PARAMETER':([87,101,112,133,],[98,98,-54,-55,]),'COMMA':([42,43,64,67,68,69,70,88,94,118,119,122,123,124,127,130,131,135,136,138,139,140,141,142,143,159,160,177,181,184,200,201,202,203,204,205,],[53,-100,-101,79,-56,-57,-52,-53,53,-99,-98,-20,-28,-19,-4,-8,-3,-22,-31,-12,-5,-14,-11,-15,-13,-27,-7,190,-89,190,-91,-90,-2,-16,-18,-10,]),'KBYTEPS':([118,119,128,144,],[-99,-98,140,140,]),'SENDING_CURRENT_PARAMETER':([169,172,190,],[182,182,182,]),'INTEGER':([45,91,108,109,110,111,116,156,168,170,173,188,191,192,193,194,195,197,],[61,61,118,118,118,118,118,167,175,167,185,118,118,118,118,118,206,207,]),'IDENTIFIER':([6,16,22,27,33,46,53,55,57,79,84,85,108,109,110,111,132,145,148,152,153,154,155,157,171,189,191,192,193,194,198,],[9,21,28,32,43,63,64,69,72,69,43,95,123,123,123,130,147,-62,147,-94,-96,164,-95,-63,-66,-72,130,123,123,123,-67,]),'$end':([2,3,5,11,],[-92,0,-93,-21,]),'RECEIVING_CURRENT_DEFAULT_PARAMETER':([87,101,112,133,],[96,96,-54,-55,]),'COLON':([156,161,162,163,164,165,166,167,170,174,186,187,196,199,208,209,],[168,169,-81,-79,-78,-80,172,173,168,-82,-83,-84,-86,-85,-88,-87,]),'VERSIONS_SPECIFICATION':([0,2,3,5,11,],[1,-92,1,-93,-21,]),'ARROWBOTH':([146,147,149,174,186,187,199,],[153,-76,-77,-82,-83,-84,-85,]),'Q_PARAMETER':([169,172,190,],[176,176,176,]),'HOST':([13,17,],[16,22,]),'VERSION':([4,7,8,10,26,29,],[6,6,-23,-24,-25,-26,]),'BLOCKOPEN':([1,9,25,34,41,44,51,52,56,76,77,80,83,93,105,106,114,],[4,12,30,45,-46,54,-47,-48,71,-43,87,-50,91,-73,-74,-75,132,]),'COMMUNICATION_SPECIFICATION':([18,20,24,62,66,75,78,82,89,],[25,-34,-35,-36,-37,-39,-40,-38,-41,]),'SQLPARAN':([35,44,76,90,123,130,147,164,],[46,55,-43,55,137,144,156,170,]),'BLOCKCLOSE':([7,8,10,18,20,23,24,26,29,36,37,45,47,48,54,58,59,60,61,62,65,66,71,74,75,76,78,80,81,82,83,86,89,90,93,103,104,105,106,107,113,117,145,148,157,158,171,189,198,],[11,-23,-24,26,-34,29,-35,-25,-26,-44,48,62,-45,-42,66,-64,75,-59,76,-36,78,-37,82,-60,-39,-43,-40,-50,89,-38,-68,-65,-41,-69,-73,117,-70,-74,-75,-71,-51,-49,-62,158,-63,-61,-66,-72,-67,]),'ARROWRIGHT':([58,76,80,83,90,93,104,105,106,146,147,149,174,186,187,199,],[73,-43,-50,-68,-69,-73,-70,-74,-75,152,-76,-77,-82,-83,-84,-85,]),'LPARAN':([21,28,72,95,],[27,33,84,84,]),'MSPBIT':([118,119,128,144,],[-99,-98,138,138,]),'RUN':([14,15,18,19,20,24,45,49,50,54,58,59,60,62,65,66,71,73,74,75,76,78,80,81,82,83,86,89,90,93,104,105,106,107,],[-29,17,17,-30,-34,-35,57,-33,-32,57,-64,57,-59,-36,57,-37,57,85,-60,-39,-43,-40,-50,57,-38,-68,-65,-41,-69,-73,-70,-74,-75,-71,]),'MBYTEPS':([118,119,128,144,],[-99,-98,142,142,]),'COMM_MINUS':([183,],[195,]),'FLOAT':([108,109,110,111,116,188,191,192,193,194,],[119,119,119,119,119,119,119,119,119,119,]),'QUALIFIED_IDENTIFIER':([27,55,79,],[31,68,68,]),'LISTENING_CURRENT_PARAMETER':([169,172,190,],[180,180,180,]),'I_INDEX':([170,],[183,]),'LISTENING_CURRENT_DEFAULT_PARAMETER':([87,101,112,133,],[97,97,-54,-55,]),'TIME_DEFAULT_PARAMETER':([87,101,112,133,],[99,99,-54,-55,]),'QUALITY_DEFAULT_PARAMETER':([87,101,112,133,],[102,102,-54,-55,]),'MS':([118,119,128,144,],[-99,-98,141,141,]),'MSPBYTE':([118,119,128,144,],[-99,-98,143,143,]),}
_lr_action = {}
for _k, _v in _lr_action_items.items():
for _x,_y in zip(_v[0],_v[1]):
if not _x in _lr_action: _lr_action[_x] = {}
_lr_action[_x][_k] = _y
del _lr_action_items
_lr_goto_items = {'version_topology_rule_left_hosts':([132,148,],[146,146,]),'version_run_hosts':([15,],[18,]),'version_topology_rule_parameters':([169,172,],[177,184,]),'version_run_processes':([45,54,71,],[59,65,81,]),'number':([108,109,110,111,116,188,191,192,193,194,],[120,120,120,128,134,200,128,120,120,120,]),'version_topology_host_with_i_index':([154,],[162,]),'version_repetition':([34,83,],[44,90,]),'version_run_process':([45,54,59,65,71,81,],[60,60,74,74,60,74,]),'version_topology_rule_parameter':([169,172,190,],[181,181,201,]),'version_topology_arrow':([146,],[154,]),'version_medium_default_parameters':([87,],[101,]),'version_comm_time_metric':([111,191,],[131,131,]),'version_medium_elements':([87,],[103,]),'version_comm_specifications':([30,],[37,]),'version_communication':([18,],[23,]),'version_repetition_channels_list':([55,],[67,]),'version_run_host':([15,18,],[20,24,]),'specification':([0,3,],[2,5,]),'versions_list':([4,],[7,]),'version_comm_current_metric_unit':([120,137,],[135,150,]),'version':([4,7,],[8,10,]),'version_comm_time_value':([111,191,],[129,202,]),'version_run_process_follower':([73,],[86,]),'version_medium_default_parameter':([87,101,],[100,115,]),'metrics_sets':([12,],[15,]),'version_repetition_channels':([44,90,],[56,104,]),'version_topology_rule_right_hosts':([154,],[166,]),'version_topology_rule':([132,148,],[145,157,]),'version_run_process_base':([45,54,59,65,71,81,],[58,58,58,58,58,58,]),'metrics_set':([12,15,],[14,19,]),'version_comm_time_metric_unit':([128,144,],[139,151,]),'identifiers_list':([33,84,],[42,94,]),'version_comm_current_algorithm':([108,109,110,192,193,194,],[122,122,122,122,122,122,]),'configuration':([0,],[3,]),'version_repetition_channel':([55,79,],[70,88,]),'version_comm_current_metric':([108,109,110,192,193,194,],[124,124,124,124,124,124,]),'version_subprocesses_list':([72,95,],[83,107,]),'version_medium_topology':([101,],[113,]),'version_topology_host_with_indicies':([132,148,154,],[149,149,165,]),'version_comm_specification':([30,37,],[36,47,]),'version_channels':([28,],[34,]),'version_topology_rules_list':([132,],[148,]),'version_comm_current_value':([108,109,110,192,193,194,],[121,125,126,203,204,205,]),'version_comm_time_algorithm':([111,191,],[127,127,]),}
_lr_goto = {}
for _k, _v in _lr_goto_items.items():
for _x, _y in zip(_v[0], _v[1]):
if not _x in _lr_goto: _lr_goto[_x] = {}
_lr_goto[_x][_k] = _y
del _lr_goto_items
_lr_productions = [
("S' -> configuration","S'",1,None,None,None),
('version_medium_default_parameter -> TIME_DEFAULT_PARAMETER EQUAL version_comm_time_value','version_medium_default_parameter',3,'p_version_time_default_parameter','parser.py',116),
('version_topology_rule_parameter -> TIME_PARAMETER EQUAL version_comm_time_value','version_topology_rule_parameter',3,'p_version_topology_rule_time_parameter','parser.py',122),
('version_comm_time_value -> version_comm_time_metric','version_comm_time_value',1,'p_version_comm_time_value','parser.py',128),
('version_comm_time_value -> version_comm_time_algorithm','version_comm_time_value',1,'p_version_comm_time_value','parser.py',129),
('version_comm_time_metric -> number version_comm_time_metric_unit','version_comm_time_metric',2,'p_version_comm_time_metric','parser.py',135),
('version_medium_default_parameter -> SENDING_CURRENT_DEFAULT_PARAMETER EQUAL version_comm_current_value','version_medium_default_parameter',3,'p_version_sending_current_default_parameter','parser.py',145),
('version_comm_time_algorithm -> IDENTIFIER SQLPARAN version_comm_time_metric_unit SQRPARAN','version_comm_time_algorithm',4,'p_version_comm_time_algorithm','parser.py',145),
('version_comm_time_algorithm -> IDENTIFIER','version_comm_time_algorithm',1,'p_version_comm_time_algorithm','parser.py',146),
('version_medium_default_parameter -> RECEIVING_CURRENT_DEFAULT_PARAMETER EQUAL version_comm_current_value','version_medium_default_parameter',3,'p_version_receiving_current_default_parameter','parser.py',151),
('version_topology_rule_parameter -> SENDING_CURRENT_PARAMETER EQUAL version_comm_current_value','version_topology_rule_parameter',3,'p_version_topology_rule_sending_current_parameter','parser.py',157),
('version_comm_time_metric_unit -> MS','version_comm_time_metric_unit',1,'p_version_comm_time_metric_unit','parser.py',159),
('version_comm_time_metric_unit -> MSPBIT','version_comm_time_metric_unit',1,'p_version_comm_time_metric_unit','parser.py',160),
('version_comm_time_metric_unit -> MSPBYTE','version_comm_time_metric_unit',1,'p_version_comm_time_metric_unit','parser.py',161),
('version_comm_time_metric_unit -> KBYTEPS','version_comm_time_metric_unit',1,'p_version_comm_time_metric_unit','parser.py',162),
('version_comm_time_metric_unit -> MBYTEPS','version_comm_time_metric_unit',1,'p_version_comm_time_metric_unit','parser.py',163),
('version_topology_rule_parameter -> RECEIVING_CURRENT_PARAMETER EQUAL version_comm_current_value','version_topology_rule_parameter',3,'p_version_topology_rule_receiving_current_parameter','parser.py',163),
('version_medium_default_parameter -> LISTENING_CURRENT_DEFAULT_PARAMETER EQUAL version_comm_current_value','version_medium_default_parameter',3,'p_version_listening_current_default_parameter','parser.py',169),
('version_topology_rule_parameter -> LISTENING_CURRENT_PARAMETER EQUAL version_comm_current_value','version_topology_rule_parameter',3,'p_version_topology_rule_listening_current_parameter','parser.py',175),
('version_comm_current_value -> version_comm_current_metric','version_comm_current_value',1,'p_version_comm_current_value','parser.py',181),
('version_comm_current_value -> version_comm_current_algorithm','version_comm_current_value',1,'p_version_comm_current_value','parser.py',182),
('specification -> VERSIONS_SPECIFICATION BLOCKOPEN versions_list BLOCKCLOSE','specification',4,'p_versions_specification','versions.py',184),
('version_comm_current_metric -> number version_comm_current_metric_unit','version_comm_current_metric',2,'p_version_comm_current_metric','parser.py',188),
('versions_list -> version','versions_list',1,'p_versions_list','versions.py',191),
('versions_list -> versions_list version','versions_list',2,'p_versions_list','versions.py',192),
('version -> VERSION IDENTIFIER BLOCKOPEN metrics_sets version_run_hosts BLOCKCLOSE','version',6,'p_version','versions.py',198),
('version -> VERSION IDENTIFIER BLOCKOPEN metrics_sets version_run_hosts version_communication BLOCKCLOSE','version',7,'p_version','versions.py',199),
('version_comm_current_algorithm -> IDENTIFIER SQLPARAN version_comm_current_metric_unit SQRPARAN','version_comm_current_algorithm',4,'p_version_comm_current_algorithm','parser.py',198),
('version_comm_current_algorithm -> IDENTIFIER','version_comm_current_algorithm',1,'p_version_comm_current_algorithm','parser.py',199),
('metrics_sets -> metrics_set','metrics_sets',1,'p_metrics_sets','versions.py',205),
('metrics_sets -> metrics_sets metrics_set','metrics_sets',2,'p_metrics_sets','versions.py',206),
('version_comm_current_metric_unit -> MA','version_comm_current_metric_unit',1,'p_version_comm_current_metric_unit','parser.py',212),
('metrics_set -> SET HOST IDENTIFIER LPARAN IDENTIFIER RPARAN SEMICOLON','metrics_set',7,'p_metrics_set','versions.py',217),
('metrics_set -> SET HOST IDENTIFIER LPARAN QUALIFIED_IDENTIFIER RPARAN SEMICOLON','metrics_set',7,'p_metrics_set','versions.py',218),
('version_run_hosts -> version_run_host','version_run_hosts',1,'p_version_run_hosts','versions.py',224),
('version_run_hosts -> version_run_hosts version_run_host','version_run_hosts',2,'p_version_run_hosts','versions.py',225),
('version_run_host -> RUN HOST IDENTIFIER version_channels BLOCKOPEN BLOCKCLOSE','version_run_host',6,'p_version_run_host','versions.py',236),
('version_run_host -> RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN BLOCKCLOSE','version_run_host',7,'p_version_run_host','versions.py',237),
('version_run_host -> RUN HOST IDENTIFIER version_channels version_repetition version_repetition_channels BLOCKOPEN BLOCKCLOSE','version_run_host',8,'p_version_run_host','versions.py',238),
('version_run_host -> RUN HOST IDENTIFIER version_channels BLOCKOPEN version_run_processes BLOCKCLOSE','version_run_host',7,'p_version_run_host','versions.py',239),
('version_run_host -> RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN version_run_processes BLOCKCLOSE','version_run_host',8,'p_version_run_host','versions.py',240),
('version_run_host -> RUN HOST IDENTIFIER version_channels version_repetition version_repetition_channels BLOCKOPEN version_run_processes BLOCKCLOSE','version_run_host',9,'p_version_run_host','versions.py',241),
('version_communication -> COMMUNICATION_SPECIFICATION BLOCKOPEN version_comm_specifications BLOCKCLOSE','version_communication',4,'p_version_communication','communication.py',241),
('version_repetition -> BLOCKOPEN INTEGER BLOCKCLOSE','version_repetition',3,'p_version_repetition','versions.py',247),
('version_comm_specifications -> version_comm_specification','version_comm_specifications',1,'p_version_comm_specifications','communication.py',250),
('version_comm_specifications -> version_comm_specifications version_comm_specification','version_comm_specifications',2,'p_version_comm_specifications','communication.py',251),
('version_channels -> LPARAN RPARAN','version_channels',2,'p_version_channels','versions.py',253),
('version_channels -> LPARAN STAR RPARAN','version_channels',3,'p_version_channels','versions.py',254),
('version_channels -> LPARAN identifiers_list RPARAN','version_channels',3,'p_version_channels','versions.py',255),
('version_comm_specification -> MEDIUM_SPECIFICATION SQLPARAN IDENTIFIER SQRPARAN BLOCKOPEN version_medium_elements BLOCKCLOSE','version_comm_specification',7,'p_version_medium_specification','communication.py',261),
('version_repetition_channels -> SQLPARAN version_repetition_channels_list SQRPARAN','version_repetition_channels',3,'p_version_repetition_channels','versions.py',267),
('version_medium_elements -> version_medium_default_parameters version_medium_topology','version_medium_elements',2,'p_version_medium_elements','communication.py',269),
('version_repetition_channels_list -> version_repetition_channel','version_repetition_channels_list',1,'p_version_repetition_channels_list','versions.py',273),
('version_repetition_channels_list -> version_repetition_channels_list COMMA version_repetition_channel','version_repetition_channels_list',3,'p_version_repetition_channels_list','versions.py',274),
('version_medium_default_parameters -> version_medium_default_parameter SEMICOLON','version_medium_default_parameters',2,'p_version_medium_default_parameters','communication.py',278),
('version_medium_default_parameters -> version_medium_default_parameters version_medium_default_parameter SEMICOLON','version_medium_default_parameters',3,'p_version_medium_default_parameters','communication.py',279),
('version_repetition_channel -> QUALIFIED_IDENTIFIER','version_repetition_channel',1,'p_version_repetition_channel','versions.py',285),
('version_repetition_channel -> IDENTIFIER','version_repetition_channel',1,'p_version_repetition_channel','versions.py',286),
('version_medium_default_parameter -> QUALITY_DEFAULT_PARAMETER EQUAL number','version_medium_default_parameter',3,'p_version_quality_default_parameter','communication.py',287),
('version_run_processes -> version_run_process','version_run_processes',1,'p_version_run_processes','versions.py',292),
('version_run_processes -> version_run_processes version_run_process','version_run_processes',2,'p_version_run_processes','versions.py',293),
('version_medium_topology -> TOPOLOGY_SPECIFICATION BLOCKOPEN version_topology_rules_list BLOCKCLOSE','version_medium_topology',4,'p_version_medium_topology','communication.py',293),
('version_topology_rules_list -> version_topology_rule','version_topology_rules_list',1,'p_version_topology_rules_list','communication.py',299),
('version_topology_rules_list -> version_topology_rules_list version_topology_rule','version_topology_rules_list',2,'p_version_topology_rules_list','communication.py',300),
('version_run_process -> version_run_process_base','version_run_process',1,'p_version_run_process','versions.py',304),
('version_run_process -> version_run_process_base ARROWRIGHT version_run_process_follower','version_run_process',3,'p_version_run_process','versions.py',305),
('version_topology_rule -> version_topology_rule_left_hosts version_topology_arrow version_topology_rule_right_hosts SEMICOLON','version_topology_rule',4,'p_version_topology_rule_point_to_point','communication.py',311),
('version_topology_rule -> version_topology_rule_left_hosts version_topology_arrow version_topology_rule_right_hosts COLON version_topology_rule_parameters SEMICOLON','version_topology_rule',6,'p_version_topology_rule_point_to_point','communication.py',312),
('version_run_process_base -> RUN IDENTIFIER version_subprocesses_list','version_run_process_base',3,'p_version_run_process_base','versions.py',311),
('version_run_process_base -> RUN IDENTIFIER version_subprocesses_list version_repetition','version_run_process_base',4,'p_version_run_process_base','versions.py',312),
('version_run_process_base -> RUN IDENTIFIER version_subprocesses_list version_repetition version_repetition_channels','version_run_process_base',5,'p_version_run_process_base','versions.py',313),
('version_run_process_follower -> RUN IDENTIFIER version_subprocesses_list','version_run_process_follower',3,'p_version_run_process_follower','versions.py',319),
('version_topology_rule -> version_topology_rule_left_hosts ARROWRIGHT STAR COLON version_topology_rule_parameters SEMICOLON','version_topology_rule',6,'p_version_topology_rule_boradcast','communication.py',321),
('version_subprocesses_list -> LPARAN RPARAN','version_subprocesses_list',2,'p_version_subprocesses_list','versions.py',325),
('version_subprocesses_list -> LPARAN STAR RPARAN','version_subprocesses_list',3,'p_version_subprocesses_list','versions.py',326),
('version_subprocesses_list -> LPARAN identifiers_list RPARAN','version_subprocesses_list',3,'p_version_subprocesses_list','versions.py',327),
('version_topology_rule_left_hosts -> IDENTIFIER','version_topology_rule_left_hosts',1,'p_version_topology_rule_left_hosts','communication.py',330),
('version_topology_rule_left_hosts -> version_topology_host_with_indicies','version_topology_rule_left_hosts',1,'p_version_topology_rule_left_hosts','communication.py',331),
('version_topology_rule_right_hosts -> IDENTIFIER','version_topology_rule_right_hosts',1,'p_version_topology_rule_right_hosts','communication.py',340),
('version_topology_rule_right_hosts -> STAR','version_topology_rule_right_hosts',1,'p_version_topology_rule_right_hosts','communication.py',341),
('version_topology_rule_right_hosts -> version_topology_host_with_indicies','version_topology_rule_right_hosts',1,'p_version_topology_rule_right_hosts','communication.py',342),
('version_topology_rule_right_hosts -> version_topology_host_with_i_index','version_topology_rule_right_hosts',1,'p_version_topology_rule_right_hosts','communication.py',343),
('version_topology_host_with_indicies -> IDENTIFIER SQLPARAN INTEGER SQRPARAN','version_topology_host_with_indicies',4,'p_version_topology_host_with_indicies','communication.py',355),
('version_topology_host_with_indicies -> IDENTIFIER SQLPARAN INTEGER COLON SQRPARAN','version_topology_host_with_indicies',5,'p_version_topology_host_with_indicies','communication.py',356),
('version_topology_host_with_indicies -> IDENTIFIER SQLPARAN COLON INTEGER SQRPARAN','version_topology_host_with_indicies',5,'p_version_topology_host_with_indicies','communication.py',357),
('version_topology_host_with_indicies -> IDENTIFIER SQLPARAN INTEGER COLON INTEGER SQRPARAN','version_topology_host_with_indicies',6,'p_version_topology_host_with_indicies','communication.py',358),
('version_topology_host_with_i_index -> IDENTIFIER SQLPARAN I_INDEX SQRPARAN','version_topology_host_with_i_index',4,'p_version_topology_host_with_i_index','communication.py',374),
('version_topology_host_with_i_index -> IDENTIFIER SQLPARAN I_INDEX COMM_PLUS INTEGER SQRPARAN','version_topology_host_with_i_index',6,'p_version_topology_host_with_i_index','communication.py',375),
('version_topology_host_with_i_index -> IDENTIFIER SQLPARAN I_INDEX COMM_MINUS INTEGER SQRPARAN','version_topology_host_with_i_index',6,'p_version_topology_host_with_i_index','communication.py',376),
('version_topology_rule_parameters -> version_topology_rule_parameter','version_topology_rule_parameters',1,'p_version_topology_rule_parameters','communication.py',391),
('version_topology_rule_parameters -> version_topology_rule_parameters COMMA version_topology_rule_parameter','version_topology_rule_parameters',3,'p_version_topology_rule_parameters','communication.py',392),
('version_topology_rule_parameter -> Q_PARAMETER EQUAL number','version_topology_rule_parameter',3,'p_version_topology_rule_quality_parameter','communication.py',400),
('configuration -> specification','configuration',1,'p_rule_configuration','main.py',402),
('configuration -> configuration specification','configuration',2,'p_rule_configuration','main.py',403),
('version_topology_arrow -> ARROWRIGHT','version_topology_arrow',1,'p_version_topology_arrow','communication.py',406),
('version_topology_arrow -> ARROWLEFT','version_topology_arrow',1,'p_version_topology_arrow','communication.py',407),
('version_topology_arrow -> ARROWBOTH','version_topology_arrow',1,'p_version_topology_arrow','communication.py',408),
('empty -> <empty>','empty',0,'p_rule_empty','main.py',409),
('number -> FLOAT','number',1,'p_rule_number','main.py',415),
('number -> INTEGER','number',1,'p_rule_number','main.py',416),
('identifiers_list -> IDENTIFIER','identifiers_list',1,'p_rule_identifiers_list','main.py',422),
('identifiers_list -> identifiers_list COMMA IDENTIFIER','identifiers_list',3,'p_rule_identifiers_list','main.py',423),
('qualified_identifiers_list -> QUALIFIED_IDENTIFIER','qualified_identifiers_list',1,'p_rule_qualified_identifiers_list','main.py',433),
('qualified_identifiers_list -> qualified_identifiers_list COMMA QUALIFIED_IDENTIFIER','qualified_identifiers_list',3,'p_rule_qualified_identifiers_list','main.py',434),
] | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/parsetab.py | parsetab.py |
from ply.lex import LexError
from aqopa.model.parser import ParserException, QoPMLModelParser
##########################################
# EXTENDERS
##########################################
class LexYaccParserExtension():
    """
    Abstract extension of a parser (tokens, states, rules, etc.).
    """
def __init__(self):
self.parser = None # Extended parser
def _save_parser(self, parser):
if not self.parser:
self.parser = parser
def _extend(self):
raise NotImplementedError()
def extend(self, parser):
self._save_parser(parser)
self._extend()
##########################################
# MODEL PARSER
##########################################
class LexYaccParser(QoPMLModelParser):
"""
Parser class
"""
def __init__(self):
self.tokens = () # lex tokens
self.precedence = [] # lex precedence
self.states = [] # lex states
self.reserved_words = {} # lex reserved_words
self.start_symbol = None # grammar starting symbol
self.lexer = None # lex object
self.yaccer = None # yacc object
self.extensions = [] # objects that extend parser (instances of LexYaccParserExtension)
self.store = None # store for built objects
        self.syntax_errors = []        # list of syntax errors that occurred while parsing
def get_syntax_errors(self):
return self.syntax_errors
def add_extension(self, ext):
self.extensions.append(ext)
return self
def set_store(self, store):
self.store = store
return self
def build(self, **kwargs):
if not self.lexer:
for e in self.extensions:
e.extend(self)
# Build the lexer
from ply import lex
self.lexer = lex.lex(object=self, **kwargs)
# Build the yacc
import ply.yacc as yacc
self.yaccer = yacc.yacc(module=self, start=self.start_symbol, **kwargs)
return self
def parse(self, s):
try:
            self.yaccer.parse(input=s, lexer=self.lexer)
except LexError:
pass
return self.store
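
# Typical usage (hedged sketch; `SomeExtension` and `store` stand in for a
# concrete LexYaccParserExtension subclass and a model store object):
#
#     parser = LexYaccParser().add_extension(SomeExtension()).set_store(store)
#     store = parser.build().parse(model_text)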
# LEX
t_ignore = " \t"
def t_newline(self, t):
r'\n+'
t.lexer.lineno += t.value.count("\n")
def t_error(self, t):
last_cr = t.lexer.lexdata.rfind('\n',0,t.lexpos)
if last_cr < 0:
last_cr = 0
column = (t.lexpos - last_cr) + 1
self.syntax_errors.append(("Line [%s:%s, pos:%s]: Illegal character '%s'" % (t.lexer.lineno, column, t.lexer.lexpos, t.value[0])))
next_cr = t.lexer.lexdata.find('\n',t.lexpos)
if next_cr < 0:
next_cr = len(t.lexer.lexdata)
t.lexer.skip(next_cr - t.lexer.lexpos)
def add_state(self, name, state_type):
if state_type not in ['inclusive', 'exclusive']:
            raise ParserException('Invalid state type. State type must be either inclusive or exclusive.')
self.states.append((name, state_type))
def add_precedence(self, token_names, precedence):
"""
        Adds precedence for the given tokens.
"""
args = [precedence]
for t in token_names:
args.append(t)
self.precedence.append(tuple(args))
    def add_token(self, name, regex=None, func=None, precedence=None, states=None, include_in_tokens=True):
        """
        Adds a new token and saves it as an attribute of this parser object,
        which PLY requires when building the lexer.
        """
        states = states or []
if include_in_tokens:
if not name in self.tokens:
self.tokens += (name,)
field_name = name
states_str = '_'.join(states)
if len(states_str):
field_name = states_str + '_' + name
if regex and func:
            raise ParserException('Cannot add token with both a regex and a function.')
if regex:
setattr(self, 't_%s' % field_name, regex)
if func:
setattr(self, 't_%s' % field_name, func)
if precedence:
self.add_precedence([name], precedence)
def add_reserved_word(self, word, token, func=None, state='INITIAL', case_sensitive=True):
"""
        Adds a token representing a reserved word for a particular state (INITIAL by default)
"""
if not token in self.tokens:
self.tokens += (token,)
if state not in self.reserved_words:
self.reserved_words[state] = {}
self.reserved_words[state][word] = (token, func, case_sensitive)
def get_reserved_words(self):
"""
        Returns all reserved words
"""
return self.reserved_words
# YACC
def add_rule(self, func):
setattr(self, 'p_%s' % (func.__name__), func)
def p_error(self, t):
if not t:
self.syntax_errors.append("Syntax error 'Unexpected end of file'")
else:
last_cr = t.lexer.lexdata.rfind('\n',0,t.lexpos)
if last_cr < 0:
last_cr = 0
column = (t.lexpos - last_cr) + 1
            self.syntax_errors.append(("Line [%s:%s, pos: %s]: Syntax error near '%s'" % (t.lexer.lineno, column, t.lexer.lexpos, t.value)))
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import Version, VersionRunProcess, VersionRunHost,\
MetricsSet
class Builder():
"""
    Builder for creating version objects
"""
def build_version(self, token):
"""
version : VERSION IDENTIFIER BLOCKOPEN metrics_sets version_run_hosts BLOCKCLOSE
| VERSION IDENTIFIER BLOCKOPEN metrics_sets version_run_hosts version_communication BLOCKCLOSE
"""
v = Version(token[2])
for metrics_set in token[4]:
v.metrics_sets.append(metrics_set)
for run_host in token[5]:
v.run_hosts.append(run_host)
if len(token) == 8:
v.communication = token[6]
return v
def create_metrics_set(self, token):
"""
metrics_set : SET HOST IDENTIFIER LPARAN IDENTIFIER RPARAN SEMICOLON
| SET HOST IDENTIFIER LPARAN QUALIFIED_IDENTIFIER RPARAN SEMICOLON
"""
return MetricsSet(token[3], token[5])
def build_run_host(self, token):
"""
version_run_host : RUN HOST IDENTIFIER version_channels BLOCKOPEN BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition version_repetition_channels BLOCKOPEN BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels BLOCKOPEN version_run_processes BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN version_run_processes BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition version_repetition_channels BLOCKOPEN version_run_processes BLOCKCLOSE
"""
run_host = VersionRunHost(token[3])
if '*' in token[4]:
run_host.all_channels_active = True
else:
run_host.active_channels = token[4]
if len(token) == 8:
if isinstance(token[5], int): # RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN BLOCKCLOSE
run_host.repetitions = token[5]
else: # RUN HOST IDENTIFIER version_channels BLOCKOPEN version_run_processes BLOCKCLOSE
run_host.run_processes = token[6]
elif len(token) == 9:
if isinstance(token[6], list): # RUN HOST IDENTIFIER version_channels version_repetition version_repetition_channels BLOCKOPEN BLOCKCLOSE
run_host.repetitions = token[5]
run_host.repeated_channels = token[6]
else: # RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN version_run_processes BLOCKCLOSE
run_host.repetitions = token[5]
run_host.run_processes = token[7]
elif len(token) == 10:
run_host.repetitions = token[5]
run_host.repeated_channels = token[6]
run_host.run_processes = token[8]
return run_host
def build_run_process(self, token):
"""
version_run_process : version_run_process_base
| version_run_process_base ARROWRIGHT version_run_process_follower
"""
run_process = token[1]
if len(token) == 4:
run_process.follower = token[3]
return run_process
def build_run_process_base(self, token):
"""
version_run_process_base : RUN IDENTIFIER version_subprocesses_list
| RUN IDENTIFIER version_subprocesses_list version_repetition
| RUN IDENTIFIER version_subprocesses_list version_repetition version_repetition_channels
"""
run_process = VersionRunProcess(token[2])
if '*' in token[3]:
run_process.all_subprocesses_active = True
else:
for identifier in token[3]:
run_process.active_subprocesses.append(identifier)
if len(token) > 4:
run_process.repetitions = token[4]
if len(token) > 5:
for identifier in token[5]:
run_process.repeated_channels.append(identifier)
return run_process
def build_run_process_follower(self, token):
"""
version_run_process_follower : RUN IDENTIFIER version_subprocesses_list
"""
run_process = VersionRunProcess(token[2])
if '*' in token[3]:
run_process.all_subprocesses_active = True
else:
for identifier in token[3]:
run_process.active_subprocesses.append(identifier)
return run_process
class ConfigParserExtension(LexYaccParserExtension):
"""
    Extension for parsing the versions configuration
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.open_blocks_cnt = 0
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
def word_versions_specification(self, t):
t.lexer.push_state('versions')
return t
##########################################
# TOKENS
##########################################
def token_block_open(self, t):
r'{'
self.open_blocks_cnt += 1
return t
def token_block_close(self, t):
r'}'
self.open_blocks_cnt -= 1
if self.open_blocks_cnt == 0:
t.lexer.pop_state()
return t
def token_sq_lparan(self, t):
r'\['
t.lexer.push_state('versionsrepeatedchannels')
return t
def token_sq_rparan(self, t):
r'\]'
t.lexer.pop_state()
return t
def token_error(self, t):
self.parser.t_error(t)
def t_newline(self, t):
r'\n+'
t.lexer.lineno += t.value.count("\n")
##########################################
# RULES
##########################################
def versions_specification(self, t):
"""
specification : VERSIONS_SPECIFICATION BLOCKOPEN versions_list BLOCKCLOSE
"""
pass
def versions_list(self, t):
"""
versions_list : version
| versions_list version
"""
pass
def version(self, t):
"""
version : VERSION IDENTIFIER BLOCKOPEN metrics_sets version_run_hosts BLOCKCLOSE
| VERSION IDENTIFIER BLOCKOPEN metrics_sets version_run_hosts version_communication BLOCKCLOSE
"""
self.parser.store.versions.append(self.builder.build_version(t))
def metrics_sets(self, t):
"""
metrics_sets : metrics_set
| metrics_sets metrics_set
"""
if len(t) == 3:
t[0] = t[1]
t[0].append(t[2])
else:
t[0] = []
t[0].append(t[1])
def metrics_set(self, t):
"""
metrics_set : SET HOST IDENTIFIER LPARAN IDENTIFIER RPARAN SEMICOLON
| SET HOST IDENTIFIER LPARAN QUALIFIED_IDENTIFIER RPARAN SEMICOLON
"""
t[0] = self.builder.create_metrics_set(t)
def version_run_hosts(self, t):
"""
version_run_hosts : version_run_host
| version_run_hosts version_run_host
"""
if len(t) == 3:
t[0] = t[1]
t[0].append(t[2])
else:
t[0] = []
t[0].append(t[1])
def version_run_host(self, t):
"""
version_run_host : RUN HOST IDENTIFIER version_channels BLOCKOPEN BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition version_repetition_channels BLOCKOPEN BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels BLOCKOPEN version_run_processes BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition BLOCKOPEN version_run_processes BLOCKCLOSE
| RUN HOST IDENTIFIER version_channels version_repetition version_repetition_channels BLOCKOPEN version_run_processes BLOCKCLOSE
"""
t[0] = self.builder.build_run_host(t)
def version_repetition(self, t):
"""
version_repetition : BLOCKOPEN INTEGER BLOCKCLOSE
"""
t[0] = t[2]
def version_channels(self, t):
"""
version_channels : LPARAN RPARAN
| LPARAN STAR RPARAN
| LPARAN identifiers_list RPARAN
"""
if len(t) == 3:
t[0] = []
elif len(t) == 4:
if t[2] == '*':
t[0] = ['*']
else:
t[0] = t[2]
def version_repetition_channels(self, t):
"""
version_repetition_channels : SQLPARAN version_repetition_channels_list SQRPARAN
"""
t[0] = t[2]
def version_repetition_channels_list(self, t):
"""
version_repetition_channels_list : version_repetition_channel
| version_repetition_channels_list COMMA version_repetition_channel
"""
if len(t) == 4:
t[0] = t[1]
t[0].append(t[3])
else:
t[0] = []
t[0].append(t[1])
def version_repetition_channel(self, t):
"""
version_repetition_channel : QUALIFIED_IDENTIFIER
| IDENTIFIER
"""
t[0] = t[1]
def version_run_processes(self, t):
"""
version_run_processes : version_run_process
| version_run_processes version_run_process
"""
if len(t) == 3:
t[0] = t[1]
t[0].append(t[2])
else:
t[0] = []
t[0].append(t[1])
def version_run_process(self, t):
"""
version_run_process : version_run_process_base
| version_run_process_base ARROWRIGHT version_run_process_follower
"""
t[0] = self.builder.build_run_process(t)
def version_run_process_base(self, t):
"""
version_run_process_base : RUN IDENTIFIER version_subprocesses_list
| RUN IDENTIFIER version_subprocesses_list version_repetition
| RUN IDENTIFIER version_subprocesses_list version_repetition version_repetition_channels
"""
t[0] = self.builder.build_run_process_base(t)
def version_run_process_follower(self, t):
"""
version_run_process_follower : RUN IDENTIFIER version_subprocesses_list
"""
t[0] = self.builder.build_run_process_follower(t)
def version_subprocesses_list(self, t):
"""
version_subprocesses_list : LPARAN RPARAN
| LPARAN STAR RPARAN
| LPARAN identifiers_list RPARAN
"""
if len(t) == 3:
t[0] = []
elif len(t) == 4:
if t[2] == '*':
t[0] = ['*']
else:
t[0] = t[2]
def _extend(self):
self.parser.add_state('versions', 'inclusive')
self.parser.add_state('versionsrepeatedchannels', 'exclusive')
self.parser.add_reserved_word('versions', 'VERSIONS_SPECIFICATION', func=self.word_versions_specification)
self.parser.add_reserved_word('version', 'VERSION', state='versions')
self.parser.add_reserved_word('run', 'RUN', state='versions')
self.parser.add_reserved_word('set', 'SET', state='versions')
self.parser.add_reserved_word('host', 'HOST', state='versions')
self.parser.add_token('BLOCKOPEN', func=self.token_block_open, states=['versions'])
self.parser.add_token('BLOCKCLOSE', func=self.token_block_close, states=['versions'])
self.parser.add_token('ARROWRIGHT', r'\-\>', states=['versions'])
self.parser.add_token('SQLPARAN', func=self.token_sq_lparan, states=['versions'])
self.parser.add_token('IDENTIFIER', r'[_a-zA-Z][_a-zA-Z0-9]*', states=['versionsrepeatedchannels'])
self.parser.add_token('QUALIFIED_IDENTIFIER', r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9][0-9]*)+',
states=['versionsrepeatedchannels'])
self.parser.add_token('COMMA', r',', states=['versionsrepeatedchannels'])
self.parser.add_token('SQRPARAN', func=self.token_sq_rparan, states=['versionsrepeatedchannels'])
self.parser.add_token('error', func=self.token_error, states=['versionsrepeatedchannels'], include_in_tokens=False)
self.parser.add_token('ignore', "\t ", states=['versionsrepeatedchannels'], include_in_tokens=False)
self.parser.add_token('newline', func=self.t_newline, states=['versionsrepeatedchannels'], include_in_tokens=False)
self.parser.add_rule(self.versions_specification)
self.parser.add_rule(self.versions_list)
self.parser.add_rule(self.version)
self.parser.add_rule(self.metrics_sets)
self.parser.add_rule(self.metrics_set)
self.parser.add_rule(self.version_run_hosts)
self.parser.add_rule(self.version_run_host)
self.parser.add_rule(self.version_repetition)
self.parser.add_rule(self.version_channels)
self.parser.add_rule(self.version_repetition_channels)
self.parser.add_rule(self.version_repetition_channels_list)
self.parser.add_rule(self.version_repetition_channel)
self.parser.add_rule(self.version_run_processes)
self.parser.add_rule(self.version_run_process)
self.parser.add_rule(self.version_run_process_base)
self.parser.add_rule(self.version_run_process_follower)
        self.parser.add_rule(self.version_subprocesses_list)
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import MetricsConfiguration, MetricsData,\
MetricsPrimitiveBlock, MetricsPrimitiveHeader, MetricsPrimitive, \
MetricsServiceParam
class Builder():
def create_metrics_configuration(self, token):
"""
metrics_configuration : METRICS_CONFIGURATION LPARAN IDENTIFIER RPARAN BLOCKOPEN metrics_configuration_params BLOCKCLOSE
"""
hostname = token[3]
host_conf_params = token[6]
return MetricsConfiguration(hostname, host_conf_params)
def create_metrics_data(self, token):
"""
metrics_data : METRICS_DATA LPARAN IDENTIFIER RPARAN BLOCKOPEN metrics_data_blocks BLOCKCLOSE
| METRICS_DATA PLUS LPARAN QUALIFIED_IDENTIFIER RPARAN BLOCKOPEN metrics_data_blocks BLOCKCLOSE
| METRICS_DATA STAR LPARAN QUALIFIED_IDENTIFIER RPARAN BLOCKOPEN metrics_data_blocks BLOCKCLOSE
"""
if len(token) == 8:
name = token[3]
blocks = token[6]
md = MetricsData(name, blocks)
elif len(token) == 9:
name = token[4]
blocks = token[7]
plus = token[2] == '+'
star = token[2] == '*'
md = MetricsData(name, blocks, plus, star)
return md
def create_metrics_block(self, token):
"""
metrics_data_block : metrics_primitives_head metrics_primitives
"""
return MetricsPrimitiveBlock(token[1], token[2])
def create_metrics_header(self, token):
"""
metrics_primitives_head : METRICS_PRIMITIVES_HEAD metrics_params metrics_services_params SEMICOLON
"""
return MetricsPrimitiveHeader(token[2], token[3])
def create_metrics_primitive(self, token):
"""
metrics_primitive : METRICS_PRIMITIVE metrics_primitive_arguments SEMICOLON
"""
return MetricsPrimitive(token[2])
def create_metrics_services_param_size(self, token):
"""
metrics_services_param : SQLPARAN SIZE COLON metrics_services_size_type_unit LPARAN metrics_services_size_unit RPARAN SQRPARAN
| SQLPARAN SIZE COLON metrics_services_size_type_non_unit SQRPARAN
"""
unit = token[6] if len(token) == 9 else None
return MetricsServiceParam(token[2], token[4], unit)
class MetricsParserExtension(LexYaccParserExtension):
"""
Extension for parsing metrics
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
def word_metrics_specification(self, t):
t.lexer.push_state('metrics')
return t
def word_metrics_data(self, t):
t.lexer.push_state('metricsdata')
return t
def word_metrics_configuration(self, t):
t.lexer.push_state('metricsconfiguration')
return t
def word_metricsdata_primhead(self, t):
t.lexer.push_state('metricsprimhead')
return t
def word_metricsdata_primitive(self, t):
t.lexer.push_state('metricsprimitive')
return t
##########################################
# TOKENS
##########################################
def token_metrics_block_close(self, t):
r"\}"
t.lexer.pop_state()
return t
def token_metricsconfiguration_block_open(self, t):
r"\{"
t.lexer.push_state('metricsconfigurationparams')
return t
def token_metricsconfigurationparams_block_close(self, t):
r"\}"
t.lexer.pop_state()
t.lexer.pop_state()
return t
def token_metricsconfigurationparams_equal(self, t):
r'\='
t.lexer.push_state('metricsconfigurationparamsvalue')
return t
def token_metricsconfigurationparamsvalue_paramvalue(self, t):
r'[^;]+'
t.lexer.pop_state()
return t
def token_metricsdata_block_close(self, t):
r"\}"
t.lexer.pop_state()
return t
def token_metricsprimhead_semicolon(self, t):
r"\;"
t.lexer.pop_state()
return t
def token_metricsprimitive_sqlparan(self, t):
r"\["
t.lexer.push_state('metricsprimitiveargumentvalue')
return t
def token_metricsprimitive_semicolon(self, t):
r"\;"
t.lexer.pop_state()
return t
def token_metricsprimitiveargumentvalue_argumentvalue(self, t):
r'[^]]+'
t.lexer.pop_state()
return t
def token_error(self, t):
self.parser.t_error(t)
def t_newline(self, t):
r'\n+'
t.lexer.lineno += t.value.count("\n")
##########################################
# RULES
##########################################
def metrics_specification(self, t):
"""
specification : METRICS_SPECIFICATION BLOCKOPEN metrics_configurations metrics_datas BLOCKCLOSE
"""
pass
def metrics_configurations(self, t):
"""
metrics_configurations : metrics_configuration
| metrics_configurations metrics_configuration
"""
pass
def metrics_configuration(self, t):
"""
metrics_configuration : METRICS_CONFIGURATION LPARAN IDENTIFIER RPARAN BLOCKOPEN metrics_configuration_params BLOCKCLOSE
"""
self.parser.store.metrics_configurations.append(self.builder.create_metrics_configuration(t))
def metrics_configuration_params(self, t):
"""
metrics_configuration_params : metrics_configuration_param
| metrics_configuration_params metrics_configuration_param
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def metrics_configuration_param(self, t):
"""
metrics_configuration_param : IDENTIFIER EQUAL PARAMVALUE SEMICOLON
"""
        t[0] = (t[1], t[3])
def metrics_datas(self, t):
"""
metrics_datas : metrics_data
| metrics_datas metrics_data
"""
pass
def metrics_data(self, t):
"""
metrics_data : METRICS_DATA LPARAN IDENTIFIER RPARAN BLOCKOPEN metrics_data_blocks BLOCKCLOSE
| METRICS_DATA PLUS LPARAN QUALIFIED_IDENTIFIER RPARAN BLOCKOPEN metrics_data_blocks BLOCKCLOSE
| METRICS_DATA STAR LPARAN QUALIFIED_IDENTIFIER RPARAN BLOCKOPEN metrics_data_blocks BLOCKCLOSE
"""
self.parser.store.metrics_datas.append(self.builder.create_metrics_data(t))
def metrics_data_blocks(self, t):
"""
metrics_data_blocks : metrics_data_block
| metrics_data_blocks HASH metrics_data_block
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[3])
def metrics_data_block(self, t):
"""
metrics_data_block : metrics_primitives_head metrics_primitives
"""
t[0] = self.builder.create_metrics_block(t)
def metrics_primitives_head(self, t):
"""
metrics_primitives_head : METRICS_PRIMITIVES_HEAD metrics_params metrics_services_params SEMICOLON
"""
t[0] = self.builder.create_metrics_header(t)
def metrics_params(self, t):
"""
metrics_params : metrics_param
| metrics_params metrics_param
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def metrics_param(self, t):
"""
metrics_param : SQLPARAN IDENTIFIER SQRPARAN
"""
t[0] = t[2]
def metrics_services_params(self, t):
"""
metrics_services_params : metrics_services_param
| metrics_services_params metrics_services_param
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def metrics_primitives(self, t):
"""
metrics_primitives : metrics_primitive
| metrics_primitives metrics_primitive
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def metrics_primitive(self, t):
"""
metrics_primitive : METRICS_PRIMITIVE metrics_primitive_arguments SEMICOLON
"""
t[0] = self.builder.create_metrics_primitive(t)
def metrics_primitive_arguments(self, t):
"""
metrics_primitive_arguments : metrics_primitive_argument
| metrics_primitive_arguments metrics_primitive_argument
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def metrics_primitive_argument(self, t):
"""
metrics_primitive_argument : SQLPARAN ARGUMENTVALUE SQRPARAN
"""
t[0] = t[2].strip()
#######################
# Metrics Size
#######################
def metrics_services_param_size(self, t):
"""
metrics_services_param : SQLPARAN SIZE COLON metrics_services_size_type_unit LPARAN metrics_services_size_unit RPARAN SQRPARAN
| SQLPARAN SIZE COLON metrics_services_size_type_non_unit SQRPARAN
"""
t[0] = self.builder.create_metrics_services_param_size(t)
def metrics_services_size_type_unit(self, t):
"""
metrics_services_size_type_unit : EXACT
| BLOCK
"""
t[0] = t[1].lower()
def metrics_services_size_type_non_unit(self, t):
"""
metrics_services_size_type_non_unit : RATIO
| SUM_RATIO
| NESTED
"""
t[0] = t[1].lower()
def metrics_services_size_unit(self, t):
"""
metrics_services_size_unit : B
"""
t[0] = t[1].lower()
def _extend(self):
self.parser.add_state('metrics', 'inclusive')
self.parser.add_state('metricsdata', 'inclusive')
self.parser.add_state('metricsprimhead', 'inclusive')
self.parser.add_state('metricsconfiguration', 'inclusive')
self.parser.add_state('metricsconfigurationparams', 'exclusive')
self.parser.add_state('metricsconfigurationparamsvalue', 'exclusive')
self.parser.add_state('metricsprimitive', 'inclusive')
self.parser.add_state('metricsprimitiveargumentvalue', 'exclusive')
        self.parser.add_reserved_word('metrics', 'METRICS_SPECIFICATION', func=self.word_metrics_specification)
self.parser.add_reserved_word('conf', 'METRICS_CONFIGURATION', func=self.word_metrics_configuration, state='metrics')
self.parser.add_reserved_word('data', 'METRICS_DATA', func=self.word_metrics_data, state='metrics')
self.parser.add_reserved_word('primhead', 'METRICS_PRIMITIVES_HEAD', func=self.word_metricsdata_primhead, state='metricsdata')
self.parser.add_reserved_word('primitive', 'METRICS_PRIMITIVE', func=self.word_metricsdata_primitive, state='metricsdata')
self.parser.add_token('QUALIFIED_IDENTIFIER', r'[_a-zA-Z][_a-zA-Z0-9]*(\.[1-9][0-9]*)+', states=['metrics', 'metricsdata'])
self.parser.add_token('BLOCKCLOSE', func=self.token_metrics_block_close, states=['metrics'])
# METRICS CONFIGURATION
self.parser.add_token('BLOCKOPEN', func=self.token_metricsconfiguration_block_open, states=['metricsconfiguration'])
# Metrics configuration params state
self.parser.add_token('error', func=self.token_error, states=['metricsconfigurationparams'], include_in_tokens=False)
self.parser.add_token('ignore', "\t ", states=['metricsconfigurationparams'], include_in_tokens=False)
self.parser.add_token('newline', func=self.t_newline, states=['metricsconfigurationparams'], include_in_tokens=False)
self.parser.add_token('BLOCKCLOSE', func=self.token_metricsconfigurationparams_block_close, states=['metricsconfigurationparams'])
self.parser.add_token('IDENTIFIER', r'[_a-zA-Z][_a-zA-Z0-9]*', states=['metricsconfigurationparams'])
self.parser.add_token('SEMICOLON', r'\;', states=['metricsconfigurationparams'])
self.parser.add_token('EQUAL', func=self.token_metricsconfigurationparams_equal, states=['metricsconfigurationparams'])
# Metrics configuration params value state
self.parser.add_token('error', func=self.token_error, states=['metricsconfigurationparamsvalue'], include_in_tokens=False)
self.parser.add_token('ignore', "\t ", states=['metricsconfigurationparamsvalue'], include_in_tokens=False)
self.parser.add_token('newline', func=self.t_newline, states=['metricsconfigurationparamsvalue'], include_in_tokens=False)
self.parser.add_token('PARAMVALUE', func=self.token_metricsconfigurationparamsvalue_paramvalue, states=['metricsconfigurationparamsvalue'])
# METRICS DATA
self.parser.add_token('PLUS', r"\+", states=['metricsdata'])
self.parser.add_token('HASH', r"\#", states=['metricsdata'])
self.parser.add_token('BLOCKCLOSE', func=self.token_metricsdata_block_close, states=['metricsdata'])
# Primhead
self.parser.add_token('SEMICOLON', func=self.token_metricsprimhead_semicolon, states=['metricsprimhead'])
self.parser.add_reserved_word('size', 'SIZE', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('exact', 'EXACT', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('ratio', 'RATIO', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('sum_ratio', 'SUM_RATIO', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('block', 'BLOCK', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('nested', 'NESTED', state='metricsprimhead', case_sensitive=False)
self.parser.add_reserved_word('B', 'B', state='metricsprimhead')
self.parser.add_reserved_word('ms', 'ms', state='metricsprimhead')
        # Primitive
self.parser.add_token('SQLPARAN', func=self.token_metricsprimitive_sqlparan, states=['metricsprimitive'])
self.parser.add_token('SEMICOLON', func=self.token_metricsprimitive_semicolon, states=['metricsprimitive'])
self.parser.add_token('error', func=self.token_error, states=['metricsprimitiveargumentvalue'], include_in_tokens=False)
self.parser.add_token('ignore', "\t ", states=['metricsprimitiveargumentvalue'], include_in_tokens=False)
self.parser.add_token('newline', func=self.t_newline, states=['metricsprimitiveargumentvalue'], include_in_tokens=False)
self.parser.add_token('ARGUMENTVALUE', func=self.token_metricsprimitiveargumentvalue_argumentvalue, states=['metricsprimitiveargumentvalue'])
self.parser.add_rule(self.metrics_specification)
self.parser.add_rule(self.metrics_configurations)
self.parser.add_rule(self.metrics_configuration)
self.parser.add_rule(self.metrics_configuration_params)
self.parser.add_rule(self.metrics_configuration_param)
self.parser.add_rule(self.metrics_datas)
self.parser.add_rule(self.metrics_data)
self.parser.add_rule(self.metrics_data_blocks)
self.parser.add_rule(self.metrics_data_block)
self.parser.add_rule(self.metrics_primitives_head)
self.parser.add_rule(self.metrics_params)
self.parser.add_rule(self.metrics_param)
self.parser.add_rule(self.metrics_services_params)
self.parser.add_rule(self.metrics_primitives)
self.parser.add_rule(self.metrics_primitive)
self.parser.add_rule(self.metrics_primitive_arguments)
self.parser.add_rule(self.metrics_primitive_argument)
self.parser.add_rule(self.metrics_services_param_size)
self.parser.add_rule(self.metrics_services_size_type_unit)
self.parser.add_rule(self.metrics_services_size_type_non_unit)
self.parser.add_rule(self.metrics_services_size_unit) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/grammar/metrics.py | metrics.py |
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import AssignmentInstruction,\
CommunicationInstruction,\
WhileInstruction, IfInstruction, ContinueInstruction, FinishInstruction,\
CallFunctionInstruction, COMMUNICATION_TYPE_OUT, COMMUNICATION_TYPE_IN,\
HostSubprocess, CallFunctionExpression, IdentifierExpression, BreakInstruction
class Builder():
def build_subprocess(self, token):
"""
host_subprocess : SUBPROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE
| SUBPROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE SEMICOLON
"""
s = HostSubprocess(token[2], token[5])
all_channels_active = '*' in token[3]
if all_channels_active:
s.all_channels_active = True
else:
s.active_channels = token[3]
return s
def build_communication_instruction(self, token):
"""
instruction_communication : IN LPARAN IDENTIFIER COLON IDENTIFIER RPARAN SEMICOLON
| IN LPARAN IDENTIFIER COLON IDENTIFIER COLON PIPE instruction_in_filters_list PIPE RPARAN SEMICOLON
| OUT LPARAN IDENTIFIER COLON IDENTIFIER RPARAN SEMICOLON
"""
communication_type = COMMUNICATION_TYPE_OUT
if token[1].lower() == 'in':
communication_type = COMMUNICATION_TYPE_IN
filters = []
if len(token) == 12:
filters = token[8]
return CommunicationInstruction(communication_type, token[3], token[5], filters)
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing instructions
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.open_blocks_cnt_by_state = {}
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
def word_subprocess_specification(self, t):
t.lexer.push_state('subprocess')
return t
##########################################
# TOKENS
##########################################
def token_block_open(self, t):
r'{'
state = t.lexer.current_state()
state += str(t.lexer.lexstatestack.count(state))
if state not in self.open_blocks_cnt_by_state:
self.open_blocks_cnt_by_state[state] = 0
self.open_blocks_cnt_by_state[state] += 1
return t
def token_block_close(self, t):
r'}'
state = t.lexer.current_state()
state += str(t.lexer.lexstatestack.count(state))
self.open_blocks_cnt_by_state[state] -= 1
if self.open_blocks_cnt_by_state[state] == 0:
t.lexer.pop_state()
return t
def token_host_rparan(self, t):
r'\)'
t.lexer.push_state('hostrparan')
return t
##########################################
# RULES
##########################################
def instructions_list(self, t):
"""
instructions_list : instruction
| instructions_list instruction
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def instruction(self, t):
"""
instruction : instruction_assignment
| instruction_subprocess
| instruction_communication
| instruction_while
| instruction_if
| instruction_special_command
| instruction_call_function
"""
t[0] = t[1]
def instruction_subprocess(self, t):
"""
instruction_subprocess : SUBPROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE
| SUBPROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE SEMICOLON
"""
t[0] = self.builder.build_subprocess(t)
def instruction_assignment(self, t):
"""
instruction_assignment : IDENTIFIER EQUAL expression_simple SEMICOLON
"""
t[0] = AssignmentInstruction(t[1], t[3])
def instruction_communication(self, t):
"""
instruction_communication : IN LPARAN IDENTIFIER COLON IDENTIFIER RPARAN SEMICOLON
| IN LPARAN IDENTIFIER COLON IDENTIFIER COLON PIPE instruction_in_filters_list PIPE RPARAN SEMICOLON
| OUT LPARAN IDENTIFIER COLON IDENTIFIER RPARAN SEMICOLON
"""
t[0] = self.builder.build_communication_instruction(t)
def instruction_in_filters_list(self, t):
"""
instruction_in_filters_list : instruction_in_filter
| instruction_in_filters_list COMMA instruction_in_filter
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[3])
def instruction_in_filter(self, t):
"""
instruction_in_filter : STAR
| expression_call_function
| IDENTIFIER
"""
if (t[1] == '*') or isinstance(t[1], CallFunctionExpression):
t[0] = t[1]
else:
t[0] = IdentifierExpression(t[1])
def instruction_while(self, t):
"""
instruction_while : WHILE LPARAN expression_conditional RPARAN BLOCKOPEN instructions_list BLOCKCLOSE
| WHILE LPARAN expression_conditional RPARAN BLOCKOPEN instructions_list BLOCKCLOSE SEMICOLON
"""
t[0] = WhileInstruction(t[3], t[6])
def instruction_if(self, t):
"""
instruction_if : IF LPARAN expression_conditional RPARAN BLOCKOPEN instructions_list BLOCKCLOSE
| IF LPARAN expression_conditional RPARAN BLOCKOPEN instructions_list BLOCKCLOSE SEMICOLON
| IF LPARAN expression_conditional RPARAN BLOCKOPEN instructions_list BLOCKCLOSE ELSE BLOCKOPEN instructions_list BLOCKCLOSE
| IF LPARAN expression_conditional RPARAN BLOCKOPEN instructions_list BLOCKCLOSE ELSE BLOCKOPEN instructions_list BLOCKCLOSE SEMICOLON
"""
conditional = t[3]
true_instructions = t[6]
false_instructions = []
if len(t) > 10:
false_instructions = t[10]
t[0] = IfInstruction(conditional, true_instructions, false_instructions)
def instruction_special_command(self, t):
"""
instruction_special_command : CONTINUE SEMICOLON
| BREAK SEMICOLON
| STOP SEMICOLON
| END SEMICOLON
"""
if t[1].lower() == 'continue':
t[0] = ContinueInstruction()
elif t[1].lower() == 'break':
t[0] = BreakInstruction()
elif t[1].lower() == 'end':
t[0] = FinishInstruction('end')
elif t[1].lower() == 'stop':
t[0] = FinishInstruction('stop')
def instruction_call_function(self, t):
"""
instruction_call_function : expression_call_function SEMICOLON
"""
t[0] = CallFunctionInstruction(t[1].function_name, t[1].arguments, t[1].qop_arguments)
def _extend(self):
self.parser.add_state('subprocess', 'inclusive')
self.parser.add_reserved_word('subprocess', 'SUBPROCESS', func=self.word_subprocess_specification, state='process')
self.parser.add_token('BLOCKOPEN', func=self.token_block_open, states=['subprocess'])
self.parser.add_token('BLOCKCLOSE', func=self.token_block_close, states=['subprocess'])
self.parser.add_token('RPARAN', func=self.token_host_rparan, states=['subprocess'])
self.parser.add_rule(self.instructions_list)
self.parser.add_rule(self.instruction)
self.parser.add_rule(self.instruction_subprocess)
self.parser.add_rule(self.instruction_assignment)
self.parser.add_rule(self.instruction_communication)
self.parser.add_rule(self.instruction_in_filters_list)
self.parser.add_rule(self.instruction_in_filter)
self.parser.add_rule(self.instruction_while)
self.parser.add_rule(self.instruction_if)
self.parser.add_rule(self.instruction_special_command)
self.parser.add_rule(self.instruction_call_function) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/grammar/instructions.py | instructions.py |
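The left-recursive list idiom used by `instructions_list` and `instruction_in_filters_list` above can be exercised in isolation. In the sketch below (a hypothetical helper, not part of AQoPA), a plain Python list stands in for PLY's `YaccProduction`: index 0 is the result slot, indices 1 and up hold the matched grammar symbols.

```python
# Illustrative sketch of the left-recursive list-building pattern used by
# rules such as instructions_list. A plain list mimics PLY's YaccProduction:
# index 0 is the result slot, indices 1+ hold the matched grammar symbols.
def reduce_list(t):
    if len(t) == 2:
        # base production: list -> element
        t[0] = [t[1]]
    else:
        # recursive production: list -> list element
        t[0] = t[1]
        t[0].append(t[2])
    return t[0]

# instructions_list : instruction
assert reduce_list([None, 'instr1']) == ['instr1']
# instructions_list : instructions_list instruction
assert reduce_list([None, ['instr1'], 'instr2']) == ['instr1', 'instr2']
```

Each recursive reduction reuses the already-built list from `t[1]`, so a list of n elements is assembled in n reductions without copying.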
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import BooleanExpression, TupleExpression,\
TupleElementExpression, IdentifierExpression, ComparisonExpression,\
CallFunctionExpression, COMPARISON_TYPE_EQUAL, COMPARISON_TYPE_NOT_EQUAL
class Builder():
def build_expression_call_function(self, token):
"""
expression_call_function : IDENTIFIER LPARAN RPARAN
| IDENTIFIER LPARAN expression_function_arguments RPARAN
| IDENTIFIER LPARAN RPARAN SQLPARAN expression_function_qop_arguments SQRPARAN
| IDENTIFIER LPARAN expression_function_arguments RPARAN SQLPARAN expression_function_qop_arguments SQRPARAN
"""
arguments = []
qop_arguments = []
if len(token) == 5:
arguments = token[3]
elif len(token) == 7:
qop_arguments = token[5]
elif len(token) == 8:
arguments = token[3]
qop_arguments = token[6]
return CallFunctionExpression(token[1], arguments, qop_arguments)
def build_expression_call_id_function(self, token):
"""
expression_call_function : ID LPARAN RPARAN
| ID LPARAN IDENTIFIER RPARAN
| ID LPARAN QUALIFIED_IDENTIFIER RPARAN
"""
arguments = []
if len(token) == 5:
arguments.append(IdentifierExpression(token[3]))
return CallFunctionExpression(token[1], arguments)
def build_expression_call_routing_next(self, token):
"""
expression_call_function : ROUTING_NEXT LPARAN IDENTIFIER COMMA expression_call_routing_next_argument RPARAN
| ROUTING_NEXT LPARAN IDENTIFIER COMMA expression_call_routing_next_argument COMMA expression_call_routing_next_argument RPARAN
"""
arguments = [IdentifierExpression(token[3]), token[5]]
if len(token) == 9:
arguments.append(token[7])
return CallFunctionExpression(token[1], arguments)
def build_comparison_expression(self, token):
"""
expression_comaprison : expression_simple EQUAL EQUAL expression_simple
| expression_simple EXCLAMATION EQUAL expression_simple
"""
comparison_type = COMPARISON_TYPE_EQUAL if token[2] == '=' else COMPARISON_TYPE_NOT_EQUAL
return ComparisonExpression(token[1], token[4], comparison_type)
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing expressions
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
##########################################
# TOKENS
##########################################
##########################################
# RULES
##########################################
def expression_conditional(self, t):
"""
expression_conditional : expression_comaprison
| BOOL
"""
if isinstance(t[1], bool):
t[0] = BooleanExpression(t[1])
else:
t[0] = t[1]
def expression_tuple(self, t):
"""
expression_tuple : LPARAN RPARAN
| LPARAN expression_function_arguments RPARAN
"""
elements = []
if len(t) > 3:
elements = t[2]
t[0] = TupleExpression(elements)
def expression_tuple_element(self, t):
"""
expression_tuple_element : IDENTIFIER SQLPARAN INTEGER SQRPARAN
"""
t[0] = TupleElementExpression(t[1], t[3])
def expression_function_arguments(self, t):
"""
expression_function_arguments : expression_simple
| expression_function_arguments COMMA expression_simple
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[3])
def expression_call_function(self, t):
"""
expression_call_function : IDENTIFIER LPARAN RPARAN
| IDENTIFIER LPARAN expression_function_arguments RPARAN
| IDENTIFIER LPARAN RPARAN SQLPARAN expression_function_qop_arguments SQRPARAN
| IDENTIFIER LPARAN expression_function_arguments RPARAN SQLPARAN expression_function_qop_arguments SQRPARAN
"""
t[0] = self.builder.build_expression_call_function(t)
def expression_call_id_function(self, t):
"""
expression_call_function : ID LPARAN RPARAN
| ID LPARAN IDENTIFIER RPARAN
| ID LPARAN QUALIFIED_IDENTIFIER RPARAN
"""
t[0] = self.builder.build_expression_call_id_function(t)
def expression_call_routing_next_function(self, t):
"""
expression_call_function : ROUTING_NEXT LPARAN IDENTIFIER COMMA expression_call_routing_next_argument RPARAN
| ROUTING_NEXT LPARAN IDENTIFIER COMMA expression_call_routing_next_argument COMMA expression_call_routing_next_argument RPARAN
"""
t[0] = self.builder.build_expression_call_routing_next(t)
def expression_call_routing_next_argument(self, t):
"""
expression_call_routing_next_argument : expression_call_function
| expression_tuple_element
| IDENTIFIER
"""
if isinstance(t[1], str) or isinstance(t[1], unicode):
t[0] = IdentifierExpression(t[1])
else:
t[0] = t[1]
def expression_function_qop_arguments(self, t):
"""
expression_function_qop_arguments : TEXT
| expression_function_qop_arguments COMMA TEXT
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1].strip())
else:
t[0] = t[1]
t[0].append(t[3].strip())
def expression_simple(self, t):
"""
expression_simple : expression_call_function
| expression_tuple_element
| expression_tuple
| IDENTIFIER
| BOOL
"""
if isinstance(t[1], bool):
t[0] = BooleanExpression(t[1])
elif isinstance(t[1], str) or isinstance(t[1], unicode):
t[0] = IdentifierExpression(t[1])
else:
t[0] = t[1]
def expression_comaprison(self, t):
"""
expression_comaprison : expression_simple EQUAL EQUAL expression_simple
| expression_simple EXCLAMATION EQUAL expression_simple
"""
t[0] = self.builder.build_comparison_expression(t)
def _extend(self):
self.parser.add_reserved_word('routing_next', 'ROUTING_NEXT', case_sensitive=True)
self.parser.add_reserved_word('id', 'ID', case_sensitive=True)
self.parser.add_rule(self.expression_conditional)
self.parser.add_rule(self.expression_tuple)
self.parser.add_rule(self.expression_tuple_element)
self.parser.add_rule(self.expression_function_arguments)
self.parser.add_rule(self.expression_call_function)
self.parser.add_rule(self.expression_call_id_function)
self.parser.add_rule(self.expression_call_routing_next_function)
self.parser.add_rule(self.expression_call_routing_next_argument)
self.parser.add_rule(self.expression_function_qop_arguments)
self.parser.add_rule(self.expression_simple)
self.parser.add_rule(self.expression_comaprison) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/grammar/expressions.py | expressions.py |
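`build_expression_call_function` above picks the argument and qop-argument slots by the length of the matched production. The standalone sketch below (hypothetical helper, not AQoPA code) shows the same dispatch, returning a plain tuple in place of `CallFunctionExpression` so it stays self-contained.

```python
# Dispatch on the matched production length, as build_expression_call_function
# does. token is a plain list standing in for PLY's YaccProduction.
def split_call(token):
    arguments, qop_arguments = [], []
    if len(token) == 5:          # IDENTIFIER ( args )
        arguments = token[3]
    elif len(token) == 7:        # IDENTIFIER ( ) [ qop ]
        qop_arguments = token[5]
    elif len(token) == 8:        # IDENTIFIER ( args ) [ qop ]
        arguments = token[3]
        qop_arguments = token[6]
    return (token[1], arguments, qop_arguments)

# f(a, b)
assert split_call([None, 'f', '(', ['a', 'b'], ')']) == ('f', ['a', 'b'], [])
# f()[q1]
assert split_call([None, 'f', '(', ')', '[', ['q1'], ']']) == ('f', [], ['q1'])
```

The bare `IDENTIFIER ( )` case has length 4 and falls through every branch, leaving both argument lists empty, which matches the grammar's no-argument alternative.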
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import Host, HostProcess
class Builder():
"""
    Builder for creating host and process objects
"""
def build_host(self, token):
"""
host : HOST IDENTIFIER LPARAN host_schedule_algorithm RPARAN host_channels BLOCKOPEN host_body BLOCKCLOSE
| HOST IDENTIFIER LPARAN host_schedule_algorithm RPARAN host_channels BLOCKOPEN host_predefined_values host_body BLOCKCLOSE
"""
all_channels_active = '*' in token[6]
instructions_list = []
predefined_values = []
if len(token) == 10:
instructions_list = token[8]
elif len(token) == 11:
predefined_values = token[8]
instructions_list = token[9]
h = Host(token[2], token[4], instructions_list, predefined_values)
if all_channels_active:
h.all_channels_active = True
else:
h.active_channels = token[6]
return h
def build_process(self, token):
"""
host_process : PROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE
| PROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE SEMICOLON
"""
p = HostProcess(token[2], token[5])
all_channels_active = '*' in token[3]
if all_channels_active:
p.all_channels_active = True
else:
p.active_channels = token[3]
return p
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing hosts
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.open_blocks_cnt_by_state = {}
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
def word_hosts_specification(self, t):
t.lexer.push_state('hosts')
return t
def word_host_specification(self, t):
t.lexer.push_state('host')
return t
def word_process_specification(self, t):
t.lexer.push_state('process')
return t
##########################################
# TOKENS
##########################################
def token_block_open(self, t):
r'{'
state = t.lexer.current_state()
if state not in self.open_blocks_cnt_by_state:
self.open_blocks_cnt_by_state[state] = 0
self.open_blocks_cnt_by_state[state] += 1
return t
def token_block_close(self, t):
r'}'
state = t.lexer.current_state()
self.open_blocks_cnt_by_state[state] -= 1
if self.open_blocks_cnt_by_state[state] == 0:
t.lexer.pop_state()
return t
def token_host_rparan(self, t):
r'\)'
t.lexer.push_state('hostrparan')
return t
def token_host_sq_lparan(self, t):
r'\['
t.lexer.pop_state()
t.lexer.push_state('functionqopargs')
return t
def token_host_any_char(self, t):
r"."
t.lexer.pop_state()
t.lexer.skip(-1)
def token_qop_sq_rparan(self, t):
r'\]'
t.lexer.pop_state()
return t
def token_error(self, t):
self.parser.t_error(t)
def t_newline(self, t):
r'\n+'
t.lexer.lineno += t.value.count("\n")
##########################################
# RULES
##########################################
def hosts_specification(self, t):
"""
specification : HOSTS_SPECIFICATION BLOCKOPEN hosts_list BLOCKCLOSE
"""
pass
def hosts_list(self, t):
"""
hosts_list : host
| hosts_list host
"""
pass
def host(self, t):
"""
host : HOST IDENTIFIER LPARAN host_schedule_algorithm RPARAN host_channels BLOCKOPEN host_body BLOCKCLOSE
| HOST IDENTIFIER LPARAN host_schedule_algorithm RPARAN host_channels BLOCKOPEN host_predefined_values host_body BLOCKCLOSE
"""
self.parser.store.hosts.append(self.builder.build_host(t))
def host_schedule_algorithm(self, t):
"""
host_schedule_algorithm : ROUND_ROBIN
| FIFO
"""
t[0] = t[1]
def host_channels(self, t):
"""
host_channels : LPARAN RPARAN
| LPARAN STAR RPARAN
| LPARAN identifiers_list RPARAN
"""
        # len(t) == 3 matches the empty LPARAN RPARAN production; the other
        # two productions both have len(t) == 4 and are told apart by t[2]
        if len(t) == 3:
            t[0] = []
        elif t[2] == '*':
            t[0] = ['*']
        else:
            t[0] = t[2]
def host_predefined_values(self, t):
"""
host_predefined_values : host_predefined_value
| host_predefined_values host_predefined_value
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def host_predefined_value(self, t):
"""
host_predefined_value : HASH instruction_assignment
"""
t[0] = t[2]
def host_body(self, t):
"""
host_body : host_process
| host_body host_process
"""
if len(t) == 2:
t[0] = []
t[0].append(t[1])
else:
t[0] = t[1]
t[0].append(t[2])
def host_process(self, t):
"""
host_process : PROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE
| PROCESS IDENTIFIER host_channels BLOCKOPEN instructions_list BLOCKCLOSE SEMICOLON
"""
t[0] = self.builder.build_process(t)
def _extend(self):
self.parser.add_state('hosts', 'inclusive')
self.parser.add_state('host', 'inclusive')
self.parser.add_state('process', 'inclusive')
self.parser.add_state('hostrparan', 'exclusive')
self.parser.add_state('functionqopargs', 'exclusive')
self.parser.add_reserved_word('hosts', 'HOSTS_SPECIFICATION', func=self.word_hosts_specification)
self.parser.add_reserved_word('host', 'HOST', func=self.word_host_specification, state='hosts')
self.parser.add_reserved_word('process', 'PROCESS', func=self.word_process_specification, state='host')
self.parser.add_reserved_word('rr', 'ROUND_ROBIN', state='host')
self.parser.add_reserved_word('fifo', 'FIFO', state='host')
self.parser.add_reserved_word('in', 'IN', state='host')
self.parser.add_reserved_word('out', 'OUT', state='host')
self.parser.add_reserved_word('while', 'WHILE', state='host')
self.parser.add_reserved_word('if', 'IF', state='host')
self.parser.add_reserved_word('else', 'ELSE', state='host')
self.parser.add_reserved_word('continue', 'CONTINUE', state='host')
self.parser.add_reserved_word('break', 'BREAK', state='host')
self.parser.add_reserved_word('stop', 'STOP', state='host')
self.parser.add_reserved_word('end', 'END', state='host')
self.parser.add_token('BLOCKOPEN', func=self.token_block_open, states=['hosts', 'host', 'process'])
self.parser.add_token('BLOCKCLOSE', func=self.token_block_close, states=['hosts', 'host', 'process'])
self.parser.add_token('HASH', r'\#', states=['host'])
self.parser.add_token('RPARAN', func=self.token_host_rparan, states=['host', 'process'])
self.parser.add_token('SQLPARAN', func=self.token_host_sq_lparan, states=['hostrparan'])
self.parser.add_token('ANYCHAR', func=self.token_host_any_char, states=['hostrparan'], include_in_tokens=False)
self.parser.add_token('error', func=self.token_error, states=['hostrparan'], include_in_tokens=False)
self.parser.add_token('ignore', "\t", states=['hostrparan'], include_in_tokens=False)
self.parser.add_token('newline', func=self.t_newline, states=['hostrparan'], include_in_tokens=False)
self.parser.add_token('error', func=self.token_error, states=['functionqopargs'], include_in_tokens=False)
self.parser.add_token('ignore', "\t", states=['functionqopargs'], include_in_tokens=False)
self.parser.add_token('newline', func=self.t_newline, states=['functionqopargs'], include_in_tokens=False)
self.parser.add_token('SQRPARAN', func=self.token_qop_sq_rparan, states=['functionqopargs'])
self.parser.add_token('COMMA', r'\,', states=['functionqopargs'])
self.parser.add_token('TEXT', r'[-_A-Za-z0-9 ]+', states=['functionqopargs'])
self.parser.add_rule(self.hosts_specification)
self.parser.add_rule(self.hosts_list)
self.parser.add_rule(self.host)
self.parser.add_rule(self.host_schedule_algorithm)
self.parser.add_rule(self.host_channels)
self.parser.add_rule(self.host_predefined_values)
self.parser.add_rule(self.host_predefined_value)
self.parser.add_rule(self.host_body)
self.parser.add_rule(self.host_process) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/grammar/hosts.py | hosts.py |
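`token_block_open` and `token_block_close` above keep a brace depth per lexer state and pop the state only when the outermost block closes. The self-contained sketch below (hypothetical helper, not AQoPA code) replays that counting over a character stream.

```python
# Illustrative sketch of the per-state brace counting in token_block_open /
# token_block_close: the lexer state is popped only when the brace depth for
# that state returns to zero, i.e. when the outermost block closes.
def scan_blocks(text, state='host'):
    depth = {state: 0}
    pops = 0
    for ch in text:
        if ch == '{':
            depth[state] += 1
        elif ch == '}':
            depth[state] -= 1
            if depth[state] == 0:
                pops += 1          # lexer.pop_state() would fire here
    return pops

assert scan_blocks('{ { } }') == 1   # nested block does not pop the state
assert scan_blocks('{ } { }') == 2   # two top-level blocks -> two pops
```

Keeping the counter keyed by state lets nested sections (hosts containing hosts' processes, each with their own braces) track their depth independently.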
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing the modules specification
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.open_blocks_cnt = 0
##########################################
# RESERVED WORDS
##########################################
def word_modules_specification(self, t):
t.lexer.push_state('modules')
return t
##########################################
# TOKENS
##########################################
def token_blockopen(self, t):
r"\{"
self.open_blocks_cnt += 1
return t
def token_blockclose(self, t):
r"\}"
self.open_blocks_cnt -= 1
if self.open_blocks_cnt == 0:
t.lexer.pop_state()
return t
##########################################
# RULES
##########################################
def modules_specification(self, t):
"""
specification : MODULES_SPECIFICATION BLOCKOPEN modules_specifications_list BLOCKCLOSE
"""
pass
def modules_specifications_list(self, t):
"""
modules_specifications_list : module_specification
| modules_specifications_list module_specification
"""
pass
def modules_specification_empty(self, t):
"""
module_specification : empty
"""
pass
def _extend(self):
self.parser.add_state('modules', 'inclusive')
self.parser.add_reserved_word('modules', 'MODULES_SPECIFICATION', func=self.word_modules_specification,)
self.parser.add_token('BLOCKOPEN', func=self.token_blockopen, states=['modules'])
self.parser.add_token('BLOCKCLOSE', func=self.token_blockclose, states=['modules'])
self.parser.add_rule(self.modules_specification)
self.parser.add_rule(self.modules_specifications_list)
self.parser.add_rule(self.modules_specification_empty) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/grammar/modules.py | modules.py |
import re
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
class ModelParserExtension(LexYaccParserExtension):
def __init__(self):
LexYaccParserExtension.__init__(self)
##########################################
# RESERVED WORDS
##########################################
def word_bool(self, t):
        # bool('false') is truthy (any non-empty string is); compare the lexeme instead
        t.value = t.value.lower() == 'true'
return t
##########################################
# TOKENS
##########################################
def token_identifier(self, t):
r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9]+)*'
qualified_regexp = r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9]+)+'
if re.match(qualified_regexp, t.value):
t.type = 'QUALIFIED_IDENTIFIER'
return t
words = self.parser.get_reserved_words()
states_stack = []
states_stack.extend(t.lexer.lexstatestack)
states_stack.append(t.lexer.current_state())
i = len(states_stack)-1
while i >= 0:
state = states_stack[i]
if state in words:
state_words = words[state]
for state_word in state_words:
tvalue = t.value
state_word_value = state_word
word_tuple = state_words[state_word]
# if not case sensitive
if not word_tuple[2]:
tvalue = tvalue.lower()
state_word_value = state_word_value.lower()
if tvalue == state_word_value:
# If function exists
if word_tuple[1]:
t = word_tuple[1](t)
t.type = word_tuple[0]
break
i -= 1
return t
def token_float(self, t):
r'([1-9][0-9]*\.[0-9]+)|(0\.[0-9]+)'
t.value = float(t.value)
return t
def token_integer(self, t):
r'0|[1-9][0-9]*'
t.value = int(t.value)
return t
def token_comment(self, t):
r'\%[^\n]*'
pass
##########################################
# RULES
##########################################
def rule_model(self, t):
"""
model : specification
| model specification
"""
pass
def rule_empty(self, t):
"""
empty :
"""
pass
def rule_number(self, t):
"""
number : FLOAT
| INTEGER
"""
t[0] = t[1]
def rule_identifiers_list(self, t):
"""
identifiers_list : IDENTIFIER
| identifiers_list COMMA IDENTIFIER
"""
if len(t) == 2:
t[0] = [t[1]]
else:
t[0] = t[1]
t[0].append(t[3])
def rule_qualified_identifiers_list(self, t):
"""
qualified_identifiers_list : QUALIFIED_IDENTIFIER
| qualified_identifiers_list COMMA QUALIFIED_IDENTIFIER
"""
if len(t) == 2:
t[0] = [t[1]]
else:
t[0] = t[1]
t[0].append(t[3])
def _extend(self):
self.parser.add_reserved_word('true', 'BOOL', func=self.word_bool)
self.parser.add_reserved_word('false', 'BOOL', func=self.word_bool)
self.parser.add_token('COMMA', r',')
self.parser.add_token('FLOAT', func=self.token_float)
self.parser.add_token('INTEGER', func=self.token_integer)
self.parser.add_token('QUALIFIED_IDENTIFIER')
self.parser.add_token('IDENTIFIER', func=self.token_identifier)
self.parser.add_token('TEXT', r'[-_A-Za-z0-9 ]+')
self.parser.add_token('SEMICOLON', r';')
self.parser.add_token('COLON', r':',)
self.parser.add_token('STAR', r'\*')
self.parser.add_token('EQUAL', r'\=')
self.parser.add_token('EXCLAMATION', r'\!')
self.parser.add_token('PIPE', r'\|')
self.parser.add_token('LPARAN', r'\(')
self.parser.add_token('RPARAN', r'\)')
self.parser.add_token('SQLPARAN', r'\[')
self.parser.add_token('SQRPARAN', r'\]')
self.parser.add_token('BLOCKOPEN', r'{')
self.parser.add_token('BLOCKCLOSE', r'}')
        self.parser.add_token('COMMENT', func=self.token_comment, include_in_tokens=False)
self.parser.add_rule(self.rule_model)
self.parser.add_rule(self.rule_empty)
self.parser.add_rule(self.rule_number)
self.parser.add_rule(self.rule_identifiers_list)
self.parser.add_rule(self.rule_qualified_identifiers_list)
self.parser.start_symbol = 'model'
class MetricsParserExtension(LexYaccParserExtension):
"""
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
##########################################
# RESERVED WORDS
##########################################
def word_bool(self, t):
        # bool('false') is truthy (any non-empty string is); compare the lexeme instead
        t.value = t.value.lower() == 'true'
return t
##########################################
# TOKENS
##########################################
def token_identifier(self, t):
r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9]+)*'
qualified_regexp = r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9]+)+'
if re.match(qualified_regexp, t.value):
t.type = 'QUALIFIED_IDENTIFIER'
return t
words = self.parser.get_reserved_words()
states_stack = []
states_stack.extend(t.lexer.lexstatestack)
states_stack.append(t.lexer.current_state())
i = len(states_stack)-1
while i >= 0:
state = states_stack[i]
if state in words:
state_words = words[state]
for state_word in state_words:
tvalue = t.value
state_word_value = state_word
word_tuple = state_words[state_word]
# if not case sensitive
if not word_tuple[2]:
tvalue = tvalue.lower()
state_word_value = state_word_value.lower()
if tvalue == state_word_value:
# If function exists
if word_tuple[1]:
t = word_tuple[1](t)
t.type = word_tuple[0]
break
i -= 1
return t
def token_float(self, t):
r'([1-9][0-9]*\.[0-9]+)|(0\.[0-9]+)'
t.value = float(t.value)
return t
def token_integer(self, t):
r'0|[1-9][0-9]*'
t.value = int(t.value)
return t
def token_comment(self, t):
r'\%[^\n]*'
pass
##########################################
# RULES
##########################################
def rule_metrics(self, t):
"""
metrics : specification
| metrics specification
"""
pass
def rule_empty(self, t):
"""
empty :
"""
pass
def rule_number(self, t):
"""
number : FLOAT
| INTEGER
"""
t[0] = t[1]
def rule_identifiers_list(self, t):
"""
identifiers_list : IDENTIFIER
| identifiers_list COMMA IDENTIFIER
"""
if len(t) == 2:
t[0] = [t[1]]
else:
t[0] = t[1]
t[0].append(t[3])
def rule_qualified_identifiers_list(self, t):
"""
qualified_identifiers_list : QUALIFIED_IDENTIFIER
| qualified_identifiers_list COMMA QUALIFIED_IDENTIFIER
"""
if len(t) == 2:
t[0] = [t[1]]
else:
t[0] = t[1]
t[0].append(t[3])
def _extend(self):
self.parser.add_token('COMMA', r',')
self.parser.add_token('FLOAT', func=self.token_float)
self.parser.add_token('INTEGER', func=self.token_integer)
self.parser.add_token('QUALIFIED_IDENTIFIER')
self.parser.add_token('IDENTIFIER', func=self.token_identifier)
self.parser.add_token('TEXT', r'[-_A-Za-z0-9 ]+')
self.parser.add_token('SEMICOLON', r';')
self.parser.add_token('COLON', r':',)
self.parser.add_token('STAR', r'\*')
self.parser.add_token('EQUAL', r'\=')
self.parser.add_token('LPARAN', r'\(')
self.parser.add_token('RPARAN', r'\)')
self.parser.add_token('SQLPARAN', r'\[')
self.parser.add_token('SQRPARAN', r'\]')
self.parser.add_token('BLOCKOPEN', r'{')
self.parser.add_token('BLOCKCLOSE', r'}')
        self.parser.add_token('COMMENT', func=self.token_comment, include_in_tokens=False)
self.parser.add_rule(self.rule_metrics)
self.parser.add_rule(self.rule_empty)
self.parser.add_rule(self.rule_number)
self.parser.add_rule(self.rule_identifiers_list)
self.parser.add_rule(self.rule_qualified_identifiers_list)
self.parser.start_symbol = 'metrics'
class ConfigParserExtension(LexYaccParserExtension):
"""
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
##########################################
# RESERVED WORDS
##########################################
def word_bool(self, t):
        # bool('false') is truthy (any non-empty string is); compare the lexeme instead
        t.value = t.value.lower() == 'true'
return t
##########################################
# TOKENS
##########################################
def token_identifier(self, t):
r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9]+)*'
qualified_regexp = r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9]+)+'
if re.match(qualified_regexp, t.value):
t.type = 'QUALIFIED_IDENTIFIER'
return t
words = self.parser.get_reserved_words()
states_stack = []
states_stack.extend(t.lexer.lexstatestack)
states_stack.append(t.lexer.current_state())
i = len(states_stack)-1
while i >= 0:
state = states_stack[i]
if state in words:
state_words = words[state]
for state_word in state_words:
tvalue = t.value
state_word_value = state_word
word_tuple = state_words[state_word]
# if not case sensitive
if not word_tuple[2]:
tvalue = tvalue.lower()
state_word_value = state_word_value.lower()
if tvalue == state_word_value:
# If function exists
if word_tuple[1]:
t = word_tuple[1](t)
t.type = word_tuple[0]
break
i -= 1
return t
def token_float(self, t):
r'([1-9][0-9]*\.[0-9]+)|(0\.[0-9]+)'
t.value = float(t.value)
return t
def token_integer(self, t):
r'0|[1-9][0-9]*'
t.value = int(t.value)
return t
def token_comment(self, t):
r'\%[^\n]*'
pass
##########################################
# RULES
##########################################
def rule_configuration(self, t):
"""
configuration : specification
| configuration specification
"""
pass
def rule_empty(self, t):
"""
empty :
"""
pass
def rule_number(self, t):
"""
number : FLOAT
| INTEGER
"""
t[0] = t[1]
def rule_identifiers_list(self, t):
"""
identifiers_list : IDENTIFIER
| identifiers_list COMMA IDENTIFIER
"""
if len(t) == 2:
t[0] = [t[1]]
else:
t[0] = t[1]
t[0].append(t[3])
def rule_qualified_identifiers_list(self, t):
"""
qualified_identifiers_list : QUALIFIED_IDENTIFIER
| qualified_identifiers_list COMMA QUALIFIED_IDENTIFIER
"""
if len(t) == 2:
t[0] = [t[1]]
else:
t[0] = t[1]
t[0].append(t[3])
def _extend(self):
self.parser.add_reserved_word('true', 'BOOL', func=self.word_bool)
self.parser.add_reserved_word('false', 'BOOL', func=self.word_bool)
self.parser.add_token('COMMA', r',')
self.parser.add_token('FLOAT', func=self.token_float)
self.parser.add_token('INTEGER', func=self.token_integer)
self.parser.add_token('QUALIFIED_IDENTIFIER')
self.parser.add_token('IDENTIFIER', func=self.token_identifier)
self.parser.add_token('TEXT', r'[-_A-Za-z0-9 ]+')
self.parser.add_token('SEMICOLON', r';')
self.parser.add_token('COLON', r':',)
self.parser.add_token('STAR', r'\*')
self.parser.add_token('EQUAL', r'\=')
self.parser.add_token('LPARAN', r'\(')
self.parser.add_token('RPARAN', r'\)')
self.parser.add_token('SQLPARAN', r'\[')
self.parser.add_token('SQRPARAN', r'\]')
self.parser.add_token('BLOCKOPEN', r'{')
self.parser.add_token('BLOCKCLOSE', r'}')
        self.parser.add_token('COMMENT', func=self.token_comment, include_in_tokens=False)
self.parser.add_rule(self.rule_configuration)
self.parser.add_rule(self.rule_empty)
self.parser.add_rule(self.rule_number)
self.parser.add_rule(self.rule_identifiers_list)
self.parser.add_rule(self.rule_qualified_identifiers_list)
self.parser.start_symbol = 'configuration' | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/grammar/main.py | main.py |
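`token_identifier` above promotes a matched identifier to `QUALIFIED_IDENTIFIER` when it carries one or more `.<number>` suffixes. The sketch below (a hypothetical standalone helper, not AQoPA code) applies the same two regular expressions to show the split.

```python
import re

# Illustrative sketch of the classification done by token_identifier: a
# qualified identifier is a plain identifier followed by at least one
# ".<number>" suffix (e.g. "server.1").
QUALIFIED_REGEXP = r'[_a-zA-Z][_a-zA-Z0-9]*(\.[0-9]+)+'

def classify_identifier(value):
    if re.match(QUALIFIED_REGEXP, value):
        return 'QUALIFIED_IDENTIFIER'
    return 'IDENTIFIER'

assert classify_identifier('server') == 'IDENTIFIER'
assert classify_identifier('server.1') == 'QUALIFIED_IDENTIFIER'
assert classify_identifier('server.1.2') == 'QUALIFIED_IDENTIFIER'
```

The lexer rule itself matches with `(\.[0-9]+)*` (zero or more suffixes), so both token kinds go through one rule and the stricter `(\.[0-9]+)+` pattern decides the final token type.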
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import Channel
class Builder():
"""
Builder for creating channel objects
"""
def build_channels(self, token):
"""
channel : CHANNEL identifiers_list LPARAN channel_buffor RPARAN SEMICOLON
| CHANNEL identifiers_list LPARAN channel_buffor RPARAN SQLPARAN IDENTIFIER SQRPARAN SEMICOLON
"""
tag_name = token[7] if len(token) == 10 else None
channels = []
for name in token[2]:
buffer_size = token[4]
if (isinstance(buffer_size, str) or isinstance(buffer_size, unicode)) \
and buffer_size == "*":
buffer_size = -1
channels.append(Channel(name, buffer_size, tag_name))
return channels
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing channels
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
def word_channels_specification(self, t):
t.lexer.push_state('channels')
return t
##########################################
# TOKENS
##########################################
def token_block_close(self, t):
r'}'
t.lexer.pop_state()
return t
##########################################
# RULES
##########################################
def channels_specification(self, t):
"""
specification : CHANNELS_SPECIFICATION BLOCKOPEN channels_list BLOCKCLOSE
"""
pass
def channels_list(self, t):
"""
channels_list : channel
| channels_list channel
"""
pass
def channel(self, t):
"""
channel : CHANNEL identifiers_list LPARAN channel_buffor RPARAN SEMICOLON
| CHANNEL identifiers_list LPARAN channel_buffor RPARAN SQLPARAN IDENTIFIER SQRPARAN SEMICOLON
"""
for ch in self.builder.build_channels(t):
self.parser.store.channels.append(ch)
def channel_buffor(self, t):
"""
channel_buffor : STAR
| INTEGER
"""
t[0] = t[1]
def _extend(self):
self.parser.add_state('channels', 'inclusive')
self.parser.add_reserved_word('channels', 'CHANNELS_SPECIFICATION', func=self.word_channels_specification)
self.parser.add_reserved_word('channel', 'CHANNEL', state='channels')
self.parser.add_token('BLOCKCLOSE', func=self.token_block_close, states=['channels'])
self.parser.add_rule(self.channels_specification)
self.parser.add_rule(self.channels_list)
self.parser.add_rule(self.channel)
self.parser.add_rule(self.channel_buffor) | AQoPA | /AQoPA-0.9.5.tar.gz/AQoPA-0.9.5/aqopa/model/parser/lex_yacc/grammar/channels.py | channels.py |
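`build_channels` above maps the `*` buffer token to the internal value -1, meaning an unbounded channel buffer. A minimal standalone sketch of that convention (hypothetical helper, not part of AQoPA):

```python
# Illustrative sketch of the buffer-size convention in build_channels:
# a '*' buffer means an unbounded channel, encoded internally as -1.
def normalize_buffer_size(token):
    if isinstance(token, str) and token == '*':
        return -1
    return token

assert normalize_buffer_size('*') == -1   # unbounded buffer
assert normalize_buffer_size(10) == 10    # bounded buffer keeps its size
```

Downstream code can then treat any negative size as "never blocks on send" without re-checking for the `*` string.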
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import AlgWhile, AlgCallFunction, AlgIf, AlgReturn, AlgAssignment
class Builder():
"""
    Builder for creating algorithm objects
"""
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing algorithms
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
self.open_blocks_cnt = 0
##########################################
# RESERVED WORDS
##########################################
def word_algorithms_specification(self, t):
t.lexer.push_state('algorithms')
return t
##########################################
# TOKENS
##########################################
def token_block_open(self, t):
r'{'
self.open_blocks_cnt += 1
return t
def token_block_close(self, t):
r'}'
self.open_blocks_cnt -= 1
if self.open_blocks_cnt == 0:
t.lexer.pop_state()
return t
##########################################
# RULES
##########################################
def algorithms_specification(self, t):
"""
specification : ALGORITHMS_SPECIFICATION BLOCKOPEN algorithms_list BLOCKCLOSE
"""
pass
def algorithms_list(self, t):
"""
algorithms_list : algorithm
| algorithms_list algorithm
"""
pass
def algorithm(self, t):
"""
algorithm : ALGORITHM IDENTIFIER LPARAN IDENTIFIER RPARAN BLOCKOPEN algorithm_instructions BLOCKCLOSE
"""
self.parser.store.algorithms[t[2]] = {'parameter': t[4], 'instructions': t[7]}
def algorithm_instructions(self, t):
"""
algorithm_instructions : algorithm_instruction
| algorithm_instructions algorithm_instruction
"""
if len(t) == 3:
t[0] = t[1]
t[0].append(t[2])
else:
t[0] = []
t[0].append(t[1])
def algorithm_instruction(self, t):
"""
algorithm_instruction : algorithm_instruction_assignment SEMICOLON
| algorithm_instruction_return SEMICOLON
| algorithm_instruction_if
| algorithm_instruction_while
"""
t[0] = t[1]
def algorithm_instruction_assignment(self, t):
"""
algorithm_instruction_assignment : IDENTIFIER EQUAL algorithm_expression
"""
t[0] = AlgAssignment(identifier=t[1], expression=t[3])
def algorithm_instruction_return(self, t):
"""
algorithm_instruction_return : RETURN algorithm_expression
"""
t[0] = AlgReturn(expression=t[2])
def algorithm_instruction_if(self, t):
"""
algorithm_instruction_if : IF LPARAN algorithm_expression_conditional RPARAN BLOCKOPEN algorithm_instructions BLOCKCLOSE
| IF LPARAN algorithm_expression_conditional RPARAN BLOCKOPEN algorithm_instructions BLOCKCLOSE ELSE BLOCKOPEN algorithm_instructions BLOCKCLOSE
"""
if len(t) == 8:
t[0] = AlgIf(condition=t[3], true_instructions=t[6], false_instructions=[])
else:
t[0] = AlgIf(condition=t[3], true_instructions=t[6], false_instructions=t[10])
def algorithm_instruction_while(self, t):
"""
algorithm_instruction_while : WHILE LPARAN algorithm_expression_conditional RPARAN BLOCKOPEN algorithm_instructions BLOCKCLOSE
"""
t[0] = AlgWhile(condition=t[3], instructions=t[6])
def algorithm_expression_conditional_comparison(self, t):
"""
algorithm_expression_conditional : algorithm_expression EQUAL EQUAL algorithm_expression
| algorithm_expression EXCLAMATION EQUAL algorithm_expression
| algorithm_expression GREATER algorithm_expression
| algorithm_expression GREATER EQUAL algorithm_expression
| algorithm_expression SMALLER algorithm_expression
| algorithm_expression SMALLER EQUAL algorithm_expression
| algorithm_expression_conditional AND AND algorithm_expression_conditional
| algorithm_expression_conditional OR OR algorithm_expression_conditional
"""
t[0] = t[1]
sign = t[2]
if len(t) == 5:
sign += t[3]
t[0].append(sign)
t[0].extend(t[4])
else:
t[0].append(sign)
t[0].extend(t[3])
def algorithm_expression_conditional_paran(self, t):
"""
algorithm_expression_conditional : LPARAN algorithm_expression_conditional RPARAN
"""
        t[0] = t[2]
        t[0].insert(0, '(')  # Python lists have no prepend(); insert at the front
        t[0].append(')')
def algorithm_expression_simple(self, t):
"""
algorithm_expression : number
| IDENTIFIER
"""
t[0] = [t[1]]
def algorithm_expression_uminus(self, t):
"""
algorithm_expression : MINUS algorithm_expression %prec UMINUS
"""
        t[0] = t[2]
        t[0].insert(0, '--')  # Python lists have no prepend(); insert at the front
def algorithm_expression_paran(self, t):
"""
algorithm_expression : LPARAN algorithm_expression RPARAN
"""
        t[0] = t[2]
        t[0].insert(0, '(')  # Python lists have no prepend(); insert at the front
        t[0].append(')')
def algorithm_expression_operations(self, t):
"""
algorithm_expression : algorithm_expression PLUS algorithm_expression
| algorithm_expression MINUS algorithm_expression
| algorithm_expression TIMES algorithm_expression
| algorithm_expression DIVIDE algorithm_expression
"""
t[0] = t[1]
t[0].append(t[2])
t[0].extend(t[3])
def algorithm_expression_function(self, t):
"""
algorithm_expression : QUALITY LPARAN RPARAN
| SIZE LPARAN IDENTIFIER RPARAN
| SIZE LPARAN IDENTIFIER SQLPARAN INTEGER SQRPARAN RPARAN
"""
args = []
if len(t) == 5:
args = [t[3]]
elif len(t) == 8:
args = [t[3], t[5]]
t[0] = [AlgCallFunction(t[1], args)]
# Predefined functions
def _extend(self):
self.parser.add_state('algorithms', 'inclusive')
self.parser.add_reserved_word('algorithms', 'ALGORITHMS_SPECIFICATION', func=self.word_algorithms_specification)
self.parser.add_reserved_word('alg', 'ALGORITHM', state='algorithms')
self.parser.add_reserved_word('if', 'IF', state='algorithms', case_sensitive=True)
self.parser.add_reserved_word('while', 'WHILE', state='algorithms', case_sensitive=True)
self.parser.add_reserved_word('else', 'ELSE', state='algorithms', case_sensitive=True)
self.parser.add_reserved_word('quality', 'QUALITY', state='algorithms', case_sensitive=True)
self.parser.add_reserved_word('size', 'SIZE', state='algorithms', case_sensitive=True)
self.parser.add_reserved_word('return', 'RETURN', state='algorithms', case_sensitive=True)
self.parser.add_token('BLOCKOPEN', func=self.token_block_open, states=['algorithms'])
self.parser.add_token('BLOCKCLOSE', func=self.token_block_close, states=['algorithms'])
self.parser.add_token('ARROWRIGHT', r'\-\>', states=['algorithms'])
self.parser.add_token('ARROWLEFT', r'\<\-', states=['algorithms'])
self.parser.add_token('ARROWBOTH', r'\<\-\>', states=['algorithms'])
self.parser.add_token('PLUS', r'\+', states=['algorithms'])
self.parser.add_token('MINUS', r'\-', states=['algorithms'])
self.parser.add_token('TIMES', r'\*', states=['algorithms'])
self.parser.add_token('DIVIDE', r'/', states=['algorithms'])
self.parser.add_token('GREATER', r'\>', states=['algorithms'])
self.parser.add_token('SMALLER', r'\<', states=['algorithms'])
self.parser.add_token('EXCLAMATION', r'\!', states=['algorithms'])
self.parser.add_token('AND', r'\&', states=['algorithms'])
self.parser.add_token('OR', r'\|', states=['algorithms'])
self.parser.add_rule(self.algorithms_specification)
self.parser.add_rule(self.algorithms_list)
self.parser.add_rule(self.algorithm)
self.parser.add_rule(self.algorithm_instructions)
self.parser.add_rule(self.algorithm_instruction)
self.parser.add_rule(self.algorithm_instruction_assignment)
self.parser.add_rule(self.algorithm_instruction_return)
self.parser.add_rule(self.algorithm_instruction_if)
self.parser.add_rule(self.algorithm_instruction_while)
self.parser.add_rule(self.algorithm_expression_conditional_comparison)
self.parser.add_rule(self.algorithm_expression_conditional_paran)
self.parser.add_rule(self.algorithm_expression_simple)
self.parser.add_rule(self.algorithm_expression_uminus)
self.parser.add_rule(self.algorithm_expression_paran)
self.parser.add_rule(self.algorithm_expression_operations)
self.parser.add_rule(self.algorithm_expression_function)
self.parser.add_precedence(['PLUS', 'MINUS'], 'left')
self.parser.add_precedence(['TIMES', 'DIVIDE'], 'left')
        self.parser.add_precedence(['UMINUS'], 'right')

# --- aqopa/model/parser/lex_yacc/grammar/algorithms.py (AQoPA 0.9.5) ---
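For reference, the grammar rules above appear to accept input of the following shape. This sketch is inferred from the rule docstrings (e.g. `algorithm : ALGORITHM IDENTIFIER LPARAN IDENTIFIER RPARAN BLOCKOPEN ...`); the algorithm name, parameter, and values are illustrative, not taken from a real model:

```
algorithms {
  alg enc_time(msg) {
    x = size(msg) * 2;
    if (x >= 100) {
      x = x - 1;
    } else {
      x = x + 1;
    }
    while (x > 0) {
      x = x - quality();
    }
    return -x;
  }
}
```

Note how assignments and `return` statements require a trailing semicolon, while `if` and `while` blocks do not, matching the `algorithm_instruction` rule.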
from aqopa.model.parser.lex_yacc import LexYaccParserExtension
from aqopa.model import Equation, BooleanExpression,\
IdentifierExpression, CallFunctionExpression
class Builder():
def build_equation(self, token):
"""
equation : EQUATION equation_complex_expression EQUAL equation_simple_expression SEMICOLON
"""
return Equation(token[4], token[2])
def build_equation_complex_expression(self, token):
"""
equation_complex_expression : IDENTIFIER LPARAN equation_function_arguments RPARAN
| IDENTIFIER LPARAN RPARAN
"""
if len(token) == 5:
return CallFunctionExpression(token[1], token[3])
return CallFunctionExpression(token[1], [])
class ModelParserExtension(LexYaccParserExtension):
"""
    Extension for parsing the equations specification
"""
def __init__(self):
LexYaccParserExtension.__init__(self)
self.builder = Builder()
##########################################
# RESERVED WORDS
##########################################
def word_equations_specification(self, t):
t.lexer.push_state('equations')
return t
##########################################
# TOKENS
##########################################
def token_block_close(self, t):
r'}'
t.lexer.pop_state()
return t
##########################################
# RULES
##########################################
def equations_specification(self, t):
"""
specification : EQUATIONS_SPECIFICATION BLOCKOPEN equations_list BLOCKCLOSE
| EQUATIONS_SPECIFICATION BLOCKOPEN BLOCKCLOSE
"""
pass
def equations_list(self, t):
"""
equations_list : equation
| equations_list equation
"""
pass
def equation(self, t):
"""
equation : EQUATION equation_complex_expression EQUAL equation_simple_expression SEMICOLON
"""
self.parser.store.equations.append(self.builder.build_equation(t))
def equation_complex_expression(self, t):
"""
equation_complex_expression : IDENTIFIER LPARAN equation_function_arguments RPARAN
| IDENTIFIER LPARAN RPARAN
"""
t[0] = self.builder.build_equation_complex_expression(t)
def equation_simple_expression(self,t):
"""
equation_simple_expression : IDENTIFIER
| BOOL
"""
        if isinstance(t[1], basestring):  # Python 2: basestring covers both str and unicode
t[0] = IdentifierExpression(t[1])
else:
t[0] = BooleanExpression(t[1])
def equation_function_arguments(self, t):
"""
equation_function_arguments : equation_expression
| equation_function_arguments COMMA equation_expression
"""
if len(t) == 2:
t[0] = [t[1]]
else:
t[0] = t[1]
t[0].append(t[3])
def equation_expression(self, t):
"""
equation_expression : equation_complex_expression
| equation_simple_expression
"""
t[0] = t[1]
def _extend(self):
self.parser.add_state('equations', 'inclusive')
self.parser.add_reserved_word('equations', 'EQUATIONS_SPECIFICATION', func=self.word_equations_specification)
self.parser.add_reserved_word('eq', 'EQUATION', state='equations')
self.parser.add_token('BLOCKCLOSE', func=self.token_block_close, states=['equations'])
self.parser.add_rule(self.equations_specification)
self.parser.add_rule(self.equations_list)
self.parser.add_rule(self.equation)
self.parser.add_rule(self.equation_complex_expression)
self.parser.add_rule(self.equation_simple_expression)
self.parser.add_rule(self.equation_function_arguments)
        self.parser.add_rule(self.equation_expression)

# --- aqopa/model/parser/lex_yacc/grammar/equations.py (AQoPA 0.9.5) ---
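Similarly, an equations block this extension appears to parse would look like the following. The sketch is inferred from the rule docstrings (`equation : EQUATION equation_complex_expression EQUAL equation_simple_expression SEMICOLON`); the function and variable names are illustrative, and the `true` literal assumes the `BOOL` token defined elsewhere in the parser covers boolean literals:

```
equations {
  eq dec(enc(data, key), key) = data;
  eq verify(sign(data, key), key) = true;
}
```

Each equation's left side is a (possibly nested) function call and its right side is a single identifier or boolean, as required by `equation_simple_expression`.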