Column schema (name: dtype, min / max):

Title: stringlengths, 11 / 150
A_Id: int64, 518 / 72.5M
Users Score: int64, -42 / 283
Q_Score: int64, 0 / 1.39k
ViewCount: int64, 17 / 1.71M
Database and SQL: int64, 0 / 1
Tags: stringlengths, 6 / 105
Answer: stringlengths, 14 / 4.78k
GUI and Desktop Applications: int64, 0 / 1
System Administration and DevOps: int64, 0 / 1
Networking and APIs: int64, 0 / 1
Other: int64, 0 / 1
CreationDate: stringlengths, 23 / 23
AnswerCount: int64, 1 / 55
Score: float64, -1 / 1.2
is_accepted: bool, 2 classes
Q_Id: int64, 469 / 42.4M
Python Basics and Environment: int64, 0 / 1
Data Science and Machine Learning: int64, 0 / 1
Web Development: int64, 1 / 1
Available Count: int64, 1 / 15
Question: stringlengths, 17 / 21k
GAE - Deployment Error: "AttributeError: can't set attribute"
12,912,373
0
11
4,377
0
python,google-app-engine,deployment
This also happens if your default_error value overlaps with your static_dirs in app.yaml.
0
1
0
0
2012-04-25T11:55:00.000
6
0
false
10,315,069
0
0
1
4
When I try to deploy my app I get the following error:

Starting update of app: flyingbat123, version: 0-1
Getting current resource limits.
Password for avigmati:
Traceback (most recent call last):
  File "C:\Program Files (x86)\Google\google_appengine\appcfg.py", line 125, in run_file(__file__, globals())
  File "C:\Program Files (x86)\Google\google_appengine\appcfg.py", line 121, in run_file execfile(script_path, globals_)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 4062, in main(sys.argv)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 4053, in main result = AppCfgApp(argv).Run()
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 2543, in Run self.action(self)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 3810, in __call__ return method()
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 3006, in Update self.UpdateVersion(rpcserver, self.basepath, appyaml)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 2995, in UpdateVersion self.options.max_size)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 2122, in DoUpload resource_limits = GetResourceLimits(self.rpcserver, self.config)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 355, in GetResourceLimits resource_limits.update(GetRemoteResourceLimits(rpcserver, config))
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appcfg.py", line 326, in GetRemoteResourceLimits version=config.version)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 379, in Send self._Authenticate()
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 437, in _Authenticate super(HttpRpcServer, self)._Authenticate()
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 281, in _Authenticate auth_token = self._GetAuthToken(credentials[0], credentials[1])
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 233, in _GetAuthToken e.headers, response_dict)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 94, in __init__ self.reason = args["Error"]
AttributeError: can't set attribute
2012-04-25 19:30:15 (Process exited with code 1)

The following is my app.yaml:
application: flyingbat123
version: 0-1
runtime: python
api_version: 1
threadsafe: no

It seems like an authentication error, but I'm entering a valid email and password. What am I doing wrong?
GAE - Deployment Error: "AttributeError: can't set attribute"
10,871,690
1
11
4,377
0
python,google-app-engine,deployment
I had the same problem. After inserting logger.warn(body) I got this: WARNING appengine_rpc.py:231 Error=BadAuthentication Info=InvalidSecondFactor. The standard error message could have been more helpful, but this makes me wonder: shouldn't I be using an application-specific password?
0
1
0
0
2012-04-25T11:55:00.000
6
0.033321
false
10,315,069
0
0
1
4
[Question text identical to the first record for Q_Id 10,315,069 above.]
GAE - Deployment Error: "AttributeError: can't set attribute"
12,750,238
2
11
4,377
0
python,google-app-engine,deployment
I know this doesn't answer the OP's question, but it may help others who hit problems with the --oauth2 option mentioned elsewhere in this question. I have 2-step verification enabled and had been using an application-specific password, but found it tedious to look up and paste the long string every day or so. Using --oauth2 on its own returned "This application does not exist (app_id=u'my-app-id')", but by adding the --no_cookies option, i.e. appcfg.py --oauth2 --no_cookies update my-app-folder\, I can now authenticate each time just by clicking [Allow access] in the browser window that opens. I'm using Python SDK 1.7.2 on Windows 7. NOTE: I found this solution elsewhere but can't remember where, so I can't properly attribute it. Sorry.
0
1
0
0
2012-04-25T11:55:00.000
6
0.066568
false
10,315,069
0
0
1
4
[Question text identical to the first record for Q_Id 10,315,069 above.]
Eclipse / PyDev overrides @sys, cannot find Python 64bits interpreter
10,343,117
0
2
321
0
python,linux,eclipse,pydev
I don't really think there's anything that can be done on the PyDev side... it seems @sys is resolved based on the kind of process you're running (not your system), so if you use a 64-bit VM it should (I think) work... Other than that, you may have to provide the actual path instead of using @sys...
0
1
0
1
2012-04-25T12:05:00.000
1
1.2
true
10,315,232
0
0
1
1
I'm working in a multiuser environment with the following setup: a 64-bit Linux environment (users can log in to different servers) and a 32-bit Eclipse (IBM Eclipse RSA-RTE), so the Java VM, Eclipse and PyDev are all 32-bit. The Python 3 interpreter is only available as a 64-bit build at the moment. In the PyDev preferences I want to set the path to the Python interpreter like this: /app/python/@sys/3.2.2/bin/python. In Eclipse/PyDev, @sys resolves to i386_linux26 even though the system is actually amd64_linux26. So unless I explicitly write amd64_linux26 instead of @sys, PyDev cannot find the Python 3 interpreter, which only exists for 64-bit. The link works as expected outside Eclipse/PyDev, e.g. in the terminal. Any ideas how to force Eclipse/PyDev to use the real value of @sys? Thanks in advance!
openerp echo the return result of a function
10,324,086
0
1
517
0
python,openerp
If your purpose is to debug, the simplest solution is to add print statements in your code and then run the server in a console.
0
0
0
0
2012-04-25T12:07:00.000
2
0
false
10,315,257
0
0
1
1
In OpenERP, I'm working on a dummy function that (for example) returns the sum of a certain field over the selected records. For instance, you select 3 invoices and it returns the sum of the quantities in the invoice lines. I think the function that performs the sum is correct, and even if it isn't, I just need help displaying the result of the function in a popup box when it is called. For that, I've added an action similar to "Confirm Invoices" found on the invoice object. To make myself clearer: when Confirm Invoices is pressed, its function is called and the previously opened popup is closed because of this line found in the function: return {'type': 'ir.actions.act_window_close'}. How can I tell my function instead (of closing) to display the result that is available after executing the function?
migrating business logic to services: alternatives to Thrift
10,328,853
1
0
254
0
c++,python,architecture,thrift
You could consider:
- the already-mentioned CORBA solution: built-in marshaling and a compact binary protocol;
- a REST (HTTP and JSON based) server: simple, a bit chatty on the network, and you need to serialize your data to JSON;
- AMQP messaging plus JSON or some other serializer: you need to serialize your data to JSON or something else like Google protocol buffers, and the plus is that scaling out to more servers later will be simpler.
A minimal sketch of the REST option follows this record.
0
0
0
1
2012-04-25T14:18:00.000
2
1.2
true
10,317,632
0
0
1
2
I am building an application which has an application-based front-end in C++/Qt and a web-based front-end in Python (using the Django framework). I'm trying to migrate the architecture to be services-based, as both of these front-ends have business logic embedded in them, which makes them hard to maintain. I'm thinking of choosing Thrift to write the RPC services, which can then be consumed by the other modules in the system and by the Python code. However, Thrift does not seem to work well on Windows, so I'm left with the option of converting the Thrift output to some C++ structures, which then need to be serialized/de-serialized again so that the services can be consumed by Qt/C++. Python code can consume these Thrift services easily. In this process I need to convert/serialize the structure first according to the Thrift IDL and then with some custom code. Any suggestions for changing the architecture so that it stays simple, works with multiple languages and is quick to implement?
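To make the REST option above concrete, here is a minimal sketch using only the Python standard library; the /sum endpoint, port and payload shape are invented for illustration, and a real deployment would sit behind a proper WSGI or HTTP server.

```python
# Minimal JSON-over-HTTP sketch of the REST option (standard library only).
# The endpoint name, port and payload layout are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class BusinessLogicHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/sum":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        result = {"total": sum(payload.get("values", []))}   # stand-in for real business logic
        body = json.dumps(result).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), BusinessLogicHandler).serve_forever()
```

Both the Qt/C++ front-end and the Django code could then POST JSON to the same endpoint, which keeps the business logic in one place.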
migrating business logic to services: alternatives to Thrift
10,317,981
1
0
254
0
c++,python,architecture,thrift
I've implemented something similar using omniORB. It has bindings for python and for C++. It's really easy in python and performs very well.
0
0
0
1
2012-04-25T14:18:00.000
2
0.099668
false
10,317,632
0
0
1
2
[Question text identical to the first record for Q_Id 10,317,632 above.]
Django with psycopg2 plugin
20,124,244
-1
6
4,157
1
python,django
sudo apt-get install python-psycopg2 should work fine; it was the solution for me as well.
0
0
0
0
2012-04-25T18:25:00.000
4
-0.049958
false
10,321,568
0
0
1
2
I've been reading the Django Book and it's great so far, except when something doesn't work properly. I have been trying for two days to install the psycopg2 plugin with no luck. I navigate to the unzipped directory and run setup.py install, and it returns "You must have postgresql dev for building a serverside extension or libpq-dev for client side." I don't know what any of this means, and Google returns results tossing around a lot of terms I don't really understand. I've been trying to learn Django for about a week now, plus Linux, so any help would be great. Thanks. Btw, I have installed PostgreSQL and pgAdmin III from an installer pack. I also tried sudo apt-get post.... and some stuff happens... but I'm lost.
Django with psycopg2 plugin
22,528,687
3
6
4,157
1
python,django
I'm working on Xubuntu (12.04) and I encountered the same error when I wanted to install django-toolbelt. I solved it with the following commands:
sudo apt-get install python-dev
sudo apt-get install libpq-dev
sudo apt-get install python-psycopg2
I hope this information is helpful for someone else.
0
0
0
0
2012-04-25T18:25:00.000
4
0.148885
false
10,321,568
0
0
1
2
[Question text identical to the first record for Q_Id 10,321,568 above.]
Using sphinx-api-doc when both sphinx and django are in multiple virtualenv
10,322,795
0
1
388
0
django,virtualenv,documentation-generation,python-sphinx
The API documentation for your code can only be generated with proper access to your code, so the answer will be "no, you'll need to have them both in the same virtualenv". Some extra thoughts: if your code's virtualenv isn't isolated from the system's Python packages, you could install Sphinx globally, but you probably don't and shouldn't want that. I'd just add Sphinx to your code's virtualenv; I don't think you'll have to worry about the overhead of a few extra kilobytes.
0
0
0
0
2012-04-25T19:43:00.000
1
1.2
true
10,322,632
1
0
1
1
I have multiple Django projects running different Django versions, each in its own virtualenv. I want to use the sphinx-api-doc command to generate API docs for the Django projects. However, I don't want to install Sphinx directly in the system and would like to install it in a separate virtualenv. Since only one virtualenv can be activated at a time, I am not able to use sphinx-api-doc. Is there a way to use sphinx-api-doc with Sphinx and Django in independent virtualenvs, or is installing Sphinx directly in the system the only way to go?
Steps to access Django application hosted in VM from Windows 7 client
10,331,810
0
0
422
1
django,wxpython,sql-server-2008-r2,vmware,python-2.7
Maybe this could help you a bit, although my set-up is slightly different. I am running an ASP.NET web app developed on Windows 7 via VMware Fusion on OS X. I access the web app from outside the VM (a browser on the Mac or on other computers/phones within the network). Here are the needed settings:
- Network adapter set to Bridged, so that the VM has its own IP address.
- Configure the VM to have a static IP.
At this point, the VM is acting as its own machine, so you can access it as if it were another server sitting on the network.
0
1
0
0
2012-04-26T10:21:00.000
2
1.2
true
10,331,518
0
0
1
1
We have developed an application using Django 1.3.1 and Python 2.7.2, with SQL Server 2008 as the database. All of these are hosted on a Windows 2008 R2 operating system running in a VM. The clients have Windows 7 as their OS. We developed the application without a VM in mind, and all of a sudden the client has come back saying they can only host the application on a VM. Now the challenge is to access the application on the server, which is on a VM, from the clients. If anyone has built this kind of application, please share the steps to access the application on the VM. I am good with standalone systems but have no knowledge of VM accessibility. We have finished the project and are waiting for someone to respond ASAP. Thanks in advance for your guidance. Regards, Shiva.
Eclipse with Webfaction and Django
10,332,409
4
2
146
0
python,eclipse,pydev,webfaction
Don't do that. Your host is for hosting. Your personal machine is for developing. Edit and run your code locally. When it's ready, upload it to Webfaction. Don't edit code on your server.
0
0
0
0
2012-04-26T11:18:00.000
1
1.2
true
10,332,337
0
0
1
1
This is my first time purchasing hosting, and I opted for Webfaction.com to host my Django application. So far I've been using Eclipse to write all my code and manage my Django application, and I'm not ready to use Vim as a text editor yet. Now my question is: how can I use Eclipse to write my code and manage all my files while being connected to my Webfaction account?
Django Application Assign Stylesheet -- don't want to add it to app's index file? Can it be dynamic?
10,336,728
0
0
76
0
python,django
If I'm reading your question correctly, the first part asks how to make the stylesheet dynamic: "I am unable to figure out how to make my stylesheet dynamic for front end." For that, the Django admin follows the convention of adding {% block extra_head %} (or something similarly named, sorry, I don't remember the specifics), which is exactly what it sounds like: a block inside the <head> tag. This lets you load a stylesheet from any template; just define that block in your base_site.html and implement it in the templates that extend base_site.html. But the end of your question seems to say you want to define the stylesheet in one place and include it for every request: "My only aim is to how i define my app's stylesheet on one place and applicable through out my application." For that, you could set up a directive in your settings.py, say DEFAULT_STYLESHEET, and include it in your base_site.html template: put the CSS link in the extra_head block. If you need to override it, just implement that block and voila! (A small sketch of the settings-based approach follows this record.)
0
0
0
0
2012-04-26T15:28:00.000
1
1.2
true
10,336,582
0
0
1
1
I am new to the Django framework, so kindly bear with me if my question is a novice one. I have created a polls application using the Django framework. I am unable to figure out how to make my stylesheet dynamic for the front end. I don't want to hard-code it in my base_site.html or index.html files, because I also have multiple views that render different template files. My only aim is to define my app's stylesheet in one place and have it apply throughout my application.
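One way to sketch the settings-based idea from the answer above is a small context processor. DEFAULT_STYLESHEET, the module path and the fallback value are hypothetical; the processor would be registered in the template settings (TEMPLATE_CONTEXT_PROCESSORS on old Django versions, the TEMPLATES option on newer ones).

```python
# myapp/context_processors.py  (module name and setting are hypothetical)
from django.conf import settings

def default_stylesheet(request):
    # Exposes {{ DEFAULT_STYLESHEET }} to every template rendered with a
    # RequestContext, so base_site.html can emit a single <link> tag that
    # all views inherit.
    return {"DEFAULT_STYLESHEET": getattr(settings, "DEFAULT_STYLESHEET", "css/site.css")}
```

base_site.html would then reference the variable once, and every template that extends it picks the stylesheet up without repeating the link.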
How to migrate a python site to another machine?
10,341,733
13
5
2,657
0
python,django,virtualenv
In case you are using pip for package management, you can easily recreate the virtualenv on another system: on system1, run pip freeze --local > requirements.txt and copy that file to system2. Over there, create and activate the virtualenv and use pip install -r requirements.txt to install all the packages that were installed in the previous virtualenv. Your Python code can simply be copied to the new system; I'd run find -name '*.pyc' -delete first though, since you usually do not want to move compiled code (even if it's just Python bytecode) between machines.
0
0
0
0
2012-04-26T21:30:00.000
1
1.2
true
10,341,707
0
0
1
1
I would like to know how to set up, on a local machine for development, a complex Python website that is currently running in a production environment. Currently the site uses Python combined with Django apps (registration + CMS modules) in a virtual environment.
Python big list and input to database
10,345,847
4
1
122
0
python,database,sqlite
Consider just doing a commit after every 1,000 records or so; a minimal sketch of this follows this record.
0
0
0
0
2012-04-27T06:31:00.000
3
1.2
true
10,345,821
0
0
1
3
I have to parse HTML files which can contain up to 500,000 links, of which about 400,000 will be ones I want. Should I first put all the links that satisfy the condition into a new list and then insert the elements of that list into the database, or should I add each link to the database (SQLite) and commit it as soon as I find one that satisfies the condition? Is a large number of commits a problem? I do not want to lose data in the case of a failure such as a power cut; that's why I want to commit after each insert into the database. What is the best way to place a large number of items in the database?
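A minimal sketch of the batched-commit idea from the answer above, assuming a simple links table and an iterable of matching links; the batch size of 1,000 is the answer's suggestion, not a hard rule.

```python
import sqlite3

def store_links(links, batch_size=1000):
    # Commit every `batch_size` inserts so a crash or power cut loses at most one batch.
    conn = sqlite3.connect("links.db")
    conn.execute("CREATE TABLE IF NOT EXISTS links (url TEXT)")
    pending = 0
    for url in links:
        conn.execute("INSERT INTO links (url) VALUES (?)", (url,))
        pending += 1
        if pending >= batch_size:
            conn.commit()
            pending = 0
    conn.commit()   # flush the final partial batch
    conn.close()
```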
Python big list and input to database
10,345,975
1
1
122
0
python,database,sqlite
If these links are spread across several files, what about a commit after processing each file? Then you could also remember which files you have processed. In the case of a single file, record the file offset after each commit for a clean continuation.
0
0
0
0
2012-04-27T06:31:00.000
3
0.066568
false
10,345,821
0
0
1
3
[Question text identical to the first record for Q_Id 10,345,821 above.]
Python big list and input to database
10,346,693
0
1
122
0
python,database,sqlite
You can try using a NoSQL database like Mongo. With Mongo I added 500,000 documents with 6 fields each in about 15 seconds (on my old laptop), and simple queries take about 0.023 seconds.
0
0
0
0
2012-04-27T06:31:00.000
3
0
false
10,345,821
0
0
1
3
[Question text identical to the first record for Q_Id 10,345,821 above.]
How can I communicate between a Siemens S7-1200 and python?
10,782,983
6
10
38,421
0
python,plc,siemens,s7-1200
After failing with libnodave and OPC, I created a TCON, TSEND and TRECV communication thing. It transmits a byte over TCP and it works. (A sketch of the Python end of such a setup follows this record.)
0
0
0
1
2012-04-27T18:30:00.000
7
1.2
true
10,355,953
0
0
1
3
I am running a process on a S7-1200 plc and I need it to send a start signal to my python script, after the script is done running it needs to send something back to the plc to initiate the next phase. Oh, and it has to be done in ladder. Is there a quick and dirty way to send things over profibus or am I better off using just a RS232 thing?
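For the TCON/TSEND/TRECV approach in the accepted answer, the Python side can be a plain TCP socket. This is only a sketch: the port number, the one-byte start and done codes, and the PLC-side block configuration are assumptions for illustration.

```python
import socket

HOST, PORT = "0.0.0.0", 2000    # port must match whatever the TCON block is configured for (assumption)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind((HOST, PORT))
server.listen(1)

conn, addr = server.accept()    # the PLC's TCON block opens the connection
start = conn.recv(1)            # a single byte sent by TSEND acts as the start signal
if start:
    # ... run the actual Python processing here ...
    conn.sendall(b"\x01")       # reply byte, read by TRECV, kicks off the next phase
conn.close()
server.close()
```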
How can I communicate between a Siemens S7-1200 and python?
24,056,273
2
10
38,421
0
python,plc,siemens,s7-1200
There is a commercial library called "S7connector" by Rothenbacher GmbH (obviously not the "s7connector" on SourceForge). It is for the .NET framework, so it could be used with IronPython. It does work with S7-1200 PLCs; you just have to make sure a DB you want to read from / write to is not an optimized S7-1200 style DB but an S7-300/400-compatible one, an option you can set when creating a DB in TIA Portal. This library also lets you read and write all I/O ports, both via the "shadow registers" (not sure what they're called officially) and directly, overriding the former.
0
0
0
1
2012-04-27T18:30:00.000
7
0.057081
false
10,355,953
0
0
1
3
[Question text identical to the first record for Q_Id 10,355,953 above.]
How can I communicate between a Siemens S7-1200 and python?
10,773,413
1
10
38,421
0
python,plc,siemens,s7-1200
The best way to communicate with S7-1200 PLC CPUs is with OPC UA or Classic OPC (commonly known as OPC DA). Libnodave is made for the S7-300 and S7-400, not for the S7-1200 (2.x firmware). If you use a third-party solution to communicate with an S7-1200 (or S7-1500) you have to decrease the security level on the PLC by allowing the put and get mechanism. Put and get are pure evil to use: you open up the memory of the CPU to every process. Don't use them any more; Siemens should actually block this. This applies to all firmware releases for the S7-1200. Siemens pushes people to use OPC UA as the default communication from the PLC, which makes sense, because OPC UA is the protocol for Industry 4.0 and IIoT. Edit: rewrote everything; the info was heavily outdated. If you use a firmware 2 or 3 S7-1200, consider replacement or an upgrade. These versions are no longer supported and contain the worm issue.
0
0
0
1
2012-04-27T18:30:00.000
7
0.028564
false
10,355,953
0
0
1
3
[Question text identical to the first record for Q_Id 10,355,953 above.]
Getting db_type() error while using django-facebook connect for DjangoApp
10,486,708
1
0
282
1
python,django,facebook,sqlite
You should use django-facebook instead, it does that and more and it is actively supported :)
0
0
0
0
2012-04-27T19:17:00.000
1
1.2
true
10,356,581
0
0
1
1
I'm using Django 1.4, sqlite3 and django-facebookconnect, following the instructions in the wiki to set it up. "python manage.py syncdb" throws an error:

Creating tables ...
Creating table auth_permission
Creating table auth_group_permissions
Creating table auth_group
Creating table auth_user_user_permissions
Creating table auth_user_groups
Creating table auth_user
Creating table django_content_type
Creating table django_session
Creating table django_site
Creating table blog_post
Creating table blog_comment
Creating table django_admin_log
Traceback (most recent call last):
  File "manage.py", line 10, in execute_from_command_line(sys.argv)
  File "/usr/local/lib/python2.7/dist-packages/django/core/management/__init__.py", line 443, in execute_from_command_line utility.execute()
  File "/usr/local/lib/python2.7/dist-packages/django/core/management/__init__.py", line 382, in execute self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/usr/local/lib/python2.7/dist-packages/django/core/management/base.py", line 196, in run_from_argv self.execute(*args, **options.__dict__)
  File "/usr/local/lib/python2.7/dist-packages/django/core/management/base.py", line 232, in execute output = self.handle(*args, **options)
  File "/usr/local/lib/python2.7/dist-packages/django/core/management/base.py", line 371, in handle return self.handle_noargs(**options)
  File "/usr/local/lib/python2.7/dist-packages/django/core/management/commands/syncdb.py", line 91, in handle_noargs sql, references = connection.creation.sql_create_model(model, self.style, seen_models)
  File "/usr/local/lib/python2.7/dist-packages/django/db/backends/creation.py", line 44, in sql_create_model col_type = f.db_type(connection=self.connection)
TypeError: db_type() got an unexpected keyword argument 'connection'

Is there any solution?
Why am I getting error: "Value Error" - invalid literal for int() with base 10: 'matthew'
10,366,811
1
0
965
0
python,django,literals,base
What kind of field is assignedTo? I'd guess it's a ForeignKey, in which case the filter is comparing against an id while you're passing a string. Am I right? The problem here is that assignedTo is being treated as an int (either it is an int, or it is compared on the basis of an int, such as a foreign-key id), and you're passing a string to compare it to, which is invalid. (A sketch of filtering through the related model's field follows this record.)
0
0
0
0
2012-04-28T19:11:00.000
2
0.099668
false
10,366,687
0
0
1
1
I am using Django/Python. Here's where the data comes from: the user selects one or more options from a <select name="op_assignedTo">, where I use <option value="username">. The data is sent via POST and collected in a Django view: op_assignedTo = request.POST.getlist('op_assignedTo'). But the following line gives me the error: assignedTo_list = Item.objects.filter(assignedTo__in=op_assignedTo). I took that line from numerous other answers on Stack Overflow. I am confused by the error, because even the line temp = Item.objects.filter(assignedTo='matthew') gives the same error: "Value Error" - invalid literal for int() with base 10: 'matthew'. If the first part of my post doesn't quite make sense, please just look at the last line of code I posted. Thanks all!
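If the guess in the answer above is right and assignedTo is a ForeignKey to Django's auth User model, the filter has to traverse the relation and compare usernames rather than primary keys. A sketch under that assumption; the import path is a placeholder.

```python
# views.py sketch; assumes assignedTo is a ForeignKey to django.contrib.auth's User model
from django.http import HttpResponse
from myapp.models import Item            # hypothetical import path for the Item model

def items_for_users(request):
    op_assignedTo = request.POST.getlist('op_assignedTo')    # e.g. ['matthew', 'jane']
    # Follow the relation and compare usernames instead of integer primary keys:
    assignedTo_list = Item.objects.filter(assignedTo__username__in=op_assignedTo)
    return HttpResponse(", ".join(str(item.pk) for item in assignedTo_list))
```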
button visibility in openerp
10,379,449
1
1
2,413
0
python,openerp
Depending on the logged-in user: you can use the variable 'uid', but I don't think you can do 'uid.name' or 'uid.groups_id', so the easier method will be the second one.
Depending on the groups: for example, if some users are managers and others are not, create a 'Manager' group (in an XML file!) and add that group to the managers. Then change the field in the XML like this: <field name="name" string="this is the string" groups="my_module.my_reference_to_the_group"/>. The field will only be visible to managers.
0
0
0
0
2012-04-30T00:07:00.000
4
0.049958
false
10,377,131
0
0
1
1
I am using OpenERP 5.16 web. Is there any way to hide a button depending on the logged-in user? Or how can I control group visibility depending on the user's group?
Can I use get on a key in a jinja template?
10,386,640
4
0
1,136
0
python,google-app-engine,python-2.7,jinja2
Looks like you're confused between NDB keys and db keys. The db.Key class (here shown as datastore_types.Key) does not have a get() method. However the NDB Key class (which would be google.appengine.ext.ndb.key.Key) does.
0
0
0
0
2012-04-30T04:43:00.000
1
1.2
true
10,378,591
0
0
1
1
I have a list of keys and I'm trying to get the object(s) in a Jinja2 template: {{item.cities[0].get().name}} UndefinedError: 'google.appengine.api.datastore_types.Key object' has no attribute 'get' I thought one could use get() on a key even in a template but here I get the error. Is it true that it can't be done?
Django social networking site with heroku
10,381,862
1
1
675
0
python,django,apache,heroku,social-networking
Apache Solr for fast indexing, virtualenv, a library that provides connection pooling (SQLAlchemy), and django-evolution or South for migrations.
0
0
0
0
2012-04-30T08:57:00.000
1
0.197375
false
10,380,922
0
0
1
1
I am planning to develop a social networking site in Python/Django. I have decided to use the following technologies to implement it, and I have some doubts about them; if anyone can help me it would be appreciated. I want to avoid bottlenecks when it scales into the thousands of connections.
- Apache as the web server
- Mailgun cloud-based email service (Heroku add-on)
- RabbitMQ as a message queue (Heroku add-on), if required
- MySQL 5.1 as the database system (Xeround add-on)
- Git for file content management
- Memcache to reduce database load (optional)
- Heroku as the cloud platform (staging and live)
Which storage should I use for static file delivery, or is there a Heroku add-on for static or content delivery? Please advise. Thanking you in advance.
Generating random string based on some hex
10,382,646
0
0
730
0
python,string,random,passwords,md5
Short answer: you can't. Longer answer: an md5 hash contains 128 bits of information, so to store it you also need 128 bits. The closest you get from that to a human-readable form would probably be to base64-encode it, which leaves you with 22 characters (24 with padding). That's probably as short as it gets (a small sketch of this follows this record). Where does the randomness in your md5 hash come from anyway? md5 hashes aren't random, so you're probably hashing something random (what?) to get them, and by doing so you can't increase the entropy in any way, only decrease it. You could probably create your own way to encode the checksum using a larger range of characters from the Unicode range... but that would mean you had to select a suitable set of characters that anybody will know how to pronounce. Something like ☺ ⚓ ⚔ ☂ ☏ would seem fairly clear, but some symbols, like ♨, not so much...
0
0
0
0
2012-04-30T09:48:00.000
3
0
false
10,381,594
0
0
1
1
I have an md5 checksum in Python, e.g. s = '14966ba801aed57c2771c7487c7b194a'. What I want is to shorten it into a string using the characters 'a-zA-Z0-9_.-', without losing the entropy of my random md5 checksum. The output has to be pronounceable, so I can't just do binascii.unhexlify(s). Nor can I do base64.encodestring(s) and cut it, because then I would lose entropy. Any ideas on how to solve this without mapping an insane number (256) of hex pairs (00->FF) to different letters? The reason I want this is to be able to say a whole md5 checksum over the phone, but using the whole alphabet plus numbers plus some special characters.
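The 22-character figure in the answer above comes straight from the standard library: convert the hex checksum back to its 16 raw bytes and URL-safe base64 them. A small sketch using the value from the question.

```python
import base64

s = "14966ba801aed57c2771c7487c7b194a"              # the hex md5 from the question
raw = bytes.fromhex(s)                               # back to the raw 16 bytes (128 bits)
short = base64.urlsafe_b64encode(raw).rstrip(b"=")   # drop the '==' padding
print(short.decode("ascii"))                         # FJZrqAGu1XwnccdIfHsZSg  (22 chars from a-zA-Z0-9_-)
```

To reverse it, re-append the '==' padding before calling base64.urlsafe_b64decode. Whether 22 mixed-case characters are actually easier to say over the phone than 32 hex digits is another question.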
Executing a function on datetimes generated randomly each day for each user
10,383,029
0
0
177
0
python,ios,datetime,heroku,scheduled-tasks
There are certainly other ways. Whether they're better is a different matter. For instance: suppose there are n minutes left before the end of a given user's day and they haven't had their notification yet. Then send them a notification now with probability 1/n (a small sketch of this follows this record). This way you don't need the huge list of random datetimes, but every minute you still need to iterate over all your users, see whether they've been notified yet, and compute random numbers for them all. It's a little more computation in total (though I doubt the difference is significant) and means that all your database updates are small. Or: each time you notify a user, you generate their next update time. That way the next-update times get computed incrementally but are still known in advance. (If your number of users is small enough that on most minutes there isn't a notification, you can make the scheduling smarter, but I won't say more about that, because with that few users the amount of work your software needs to do is negligible anyway and there's no point optimizing for that case.)
0
0
0
0
2012-04-30T11:18:00.000
2
1.2
true
10,382,810
0
0
1
1
An interesting conundrum. Here's what I want to do: I have a Pyramid (python 2.7.2) website running on Heroku which pushes notifications to my iPhone app users. Each day, every user needs a push notification sent to them at a randomly generated time between 10:00am and 10:00pm (it obviously needs to know the users timezone as well). My current plan is the following: Use a persistent worker process to trigger a function every 1 minute on the minute. Each minute, it will call a function (on a different thread so as not to interrupt the timer) which will do 2 things: Check if it's 11:00pm for each timezone (which will happen 24 times a day, once for each timezone). If true, it will call a function which loops through every user in that respective timezone and generates their random time for the next day, then stores it in the Mongo database. On each minute, the worker will also loop through the users and check if they have their notification due at that time. If it's due, then send the notification. My question is: Is there a better way of doing this that doesn't require generating a huge list of random datetimes every day beforehand?
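A sketch of the 'probability 1/n' idea from the accepted answer, run once per minute for each not-yet-notified user; the window end and the send/mark hooks are placeholders the application would supply.

```python
import random
from datetime import time

WINDOW_END = time(22, 0)    # 10:00 pm in the user's local timezone (assumed fixed here)

def maybe_notify(user, now_local, send_push, mark_notified):
    """Call once per minute for each user who has not been notified today."""
    minutes_left = (WINDOW_END.hour - now_local.hour) * 60 + (WINDOW_END.minute - now_local.minute)
    if minutes_left <= 0:
        return False
    # Firing with probability 1/minutes_left spreads the notification uniformly
    # over the remaining minutes, with no precomputed schedule to store.
    if random.random() < 1.0 / minutes_left:
        send_push(user)         # placeholder hook
        mark_notified(user)     # placeholder hook
        return True
    return False
```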
Can python's mechanize extract the text associated with a control?
10,408,283
1
0
153
0
python,web-crawler,mechanize
Look for text on the sibling nodes and the parent node's text, because that's where they frequently are. LXML might be able to help if you actually have to parse the html.
0
0
0
0
2012-04-30T17:21:00.000
1
1.2
true
10,387,816
0
0
1
1
I'm writing a crawler, and I keep encountering forms controls for which mechanize can give me no information beyond type. Is there any way that I can get the human-readable text associated with the control? I know this is a bit of a fuzzy area, since there's no perfect way of getting that information, but perhaps something can help?
Creating a web query form
10,389,650
1
2
2,579
0
python,django,forms
Hm. I don't believe any utility like this exists. It would be nice if there were a reverse ModelForm: it would look at the field types and get the data ranges for each field for a search form. For now I think you are stuck with creating a text box and a date-picker range and processing that data in a view (a short sketch follows this record).
0
0
0
0
2012-04-30T19:42:00.000
3
0.066568
false
10,389,594
0
0
1
1
I have two models: Director, and Film. I want to create a web query form so that a user can search something like "All films from director 'Steven Spielberg' between 1990 and 1998". Just curious what the best and simplest way to do this would be? Thanks,
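To illustrate the 'process that data in a view' part of the answer above, here is a sketch with hypothetical Director/Film fields and GET parameter names; the form itself can be a plain text box plus two year fields.

```python
# views.py sketch; model fields and GET parameter names are hypothetical
from django.shortcuts import render
from myapp.models import Film

def film_search(request):
    films = Film.objects.all()
    director = request.GET.get("director")       # e.g. "Steven Spielberg"
    year_from = request.GET.get("year_from")     # e.g. "1990"
    year_to = request.GET.get("year_to")         # e.g. "1998"
    if director:
        films = films.filter(director__name__icontains=director)
    if year_from and year_to:
        films = films.filter(year__range=(year_from, year_to))
    return render(request, "films/search_results.html", {"films": films})
```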
Abstract model class in django, but with table
10,393,642
0
1
657
0
python,django
You could try returning None or raising NotImplementedError in __new__ of the base class. I don't know if that would affect anything else, but it's worth a shot.
0
0
0
0
2012-04-30T21:15:00.000
1
0
false
10,390,689
0
0
1
1
In Django, if I do an abstract model class, and then have actual derived classes, only these classes will have an associated table, and the abstract class cannot be instantiated by itself. If I remove the abstract=True meta information, then an actual table is created for the base class, but doing so allows client code to create an object of the base class. Is there a way of forcing client code to always instantiate derived classes, while having a table associated to the base class ?
Remote API is extremely slow
10,398,726
2
2
415
0
python,google-app-engine
Don't forget that the remoteapi executes your code locally and only calls appengine servers for datastore/blobstore/etc. operations. So in essence, you're running code that's hitting a database living over the network. It's definitely slower.
0
1
0
1
2012-05-01T04:24:00.000
2
0.197375
false
10,393,531
0
0
1
1
I use the remote API for some utility tasks, and I've noticed that it is orders of magnitude slower than code running on App Engine. A simple get_by_id(list) took a couple of minutes using the remote API, and a couple of seconds running on App Engine. The logs show that the remote API fetched the objects separately, taking a couple of seconds each, whereas on App Engine the whole list of objects is retrieved in about the same time. Is there any way to improve this situation?
mongo db or redis for a facebook like site?
10,396,700
2
0
1,158
1
python,django,database-design,mongodb,redis
There's a huge distinction to be made between Redis and MongoDB for your particular needs, in that Redis, unlike MongoDB, doesn't facilitate value queries. You can use MongoDB to embed the comments within the post document, which means you get the post and the comments in a single query, yet you could also query for post documents based on tags, the author, etc. You'll definitely want to go with MongoDB. Redis is great, but it's not a proper fit for what I'd believe you'll need from it.
0
0
0
0
2012-05-01T10:16:00.000
4
0.099668
false
10,396,315
0
0
1
3
I'm building a social app in Django; the architecture of the site will be very similar to Facebook. There will be posts, and posts will have comments. Both posts and comments will have metadata like date, author, tags and votes. I decided to go with a NoSQL database because of the ease with which we can add new features. I settled on MongoDB, as I can easily store a post and its comments in a single document. I'm having second thoughts now: would Redis be better than Mongo for this kind of app? Update: I have decided to go with MongoDB, and will use Redis for the user home page and home page if necessary.
mongo db or redis for a facebook like site?
10,403,789
0
0
1,158
1
python,django,database-design,mongodb,redis
First, loosely couple your app and your persistence so that you can swap them out at a very granular level. For example, you want to be able to move one service from mongo to redis as your needs evolve. Be able to measure your services and appropriately respond to them individually. Second, you are unlikely to find one persistence solution that fits every workflow in your application at scale. Don't be afraid to use more than one. Mongo is a good tool for a set of problems, as is Redis, just not necessarily the same problems.
0
0
0
0
2012-05-01T10:16:00.000
4
0
false
10,396,315
0
0
1
3
[Question text identical to the first record for Q_Id 10,396,315 above.]
mongo db or redis for a facebook like site?
10,396,466
1
0
1,158
1
python,django,database-design,mongodb,redis
These things are subjective and can be looked at from different angles, but if you have already decided to go with a NoSQL solution and are trying to choose between MongoDB and Redis, I think it is better to go with MongoDB, as you should be able to store a large number of posts and MongoDB documents are better suited to representing posts. Redis can only store up to its memory limit, but it is super fast. So if you need to index some things, you can save the posts in MongoDB and then keep the ids of the posts in Redis for faster access.
0
0
0
0
2012-05-01T10:16:00.000
4
0.049958
false
10,396,315
0
0
1
3
[Question text identical to the first record for Q_Id 10,396,315 above.]
Converting JavaScript into Python bytecode
10,406,398
0
4
846
0
javascript,python,google-app-engine
To my knowledge, there are no complete and robust implementations of Javascript interpreters on Python. Your best option is probably to deploy an alternate version of your app with the Rhino interpreter in Java, and call this as a web service with the main version of your app.
0
0
0
1
2012-05-01T13:26:00.000
2
0
false
10,398,315
1
0
1
1
I am trying to execute simple JavaScript code in a pure Python environment (Google App Engine). I've tried PYJON, but it does not seem mature enough for real use (it does not handle e.g. forward-referenced functions or do-while, and it hangs on array usage). One idea would be to use pynarcissus to convert the JavaScript into a syntax tree and then convert this tree into a Python AST which could be compiled into Python bytecode. Has anybody done this before? Any problems with this idea?
Django reportlab - Print current page (admin console) to PDF
10,422,659
1
1
976
0
python,django,pdf,reportlab
If html2pdf doesn't do what you need, you can do everything you want with ReportLab. Have a look at the ReportLab manual, in particular the parts on Platypus. This is the part of the ReportLab library that lets you build PDFs out of objects representing page parts (paragraphs, tables, frames, layouts, etc.); a minimal sketch follows this record.
0
0
0
0
2012-05-02T02:46:00.000
2
0.099668
false
10,407,046
0
0
1
1
What I am trying to accomplish is to allow users to view information in the django admin console and allow them to save and print out a PDF of the information infront of them based upon how ever they sorted/filtered the data. I have seen a lot of documentation on report lab but mostly for just drawing lines and what not. How can I simply output the admin results to a PDF? If that is even possible. I am open to other suggestions if report lab is not the ideal way to get this done. Thanks in advance.
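A minimal Platypus sketch along the lines of the answer above; the rows argument stands in for whatever filtered data the admin view produced, and the column names are invented.

```python
# Minimal ReportLab/Platypus sketch; `rows` stands in for the filtered admin data.
from reportlab.lib.pagesizes import A4
from reportlab.lib.styles import getSampleStyleSheet
from reportlab.platypus import Paragraph, SimpleDocTemplate, Table

def build_pdf(path, rows):
    styles = getSampleStyleSheet()
    story = [
        Paragraph("Filtered results", styles["Heading1"]),
        Table([["Name", "Status"]] + rows),     # header row followed by the data rows
    ]
    SimpleDocTemplate(path, pagesize=A4).build(story)

build_pdf("results.pdf", [["Widget A", "active"], ["Widget B", "retired"]])
```

In a Django view you could pass a file-like object such as io.BytesIO (or the HttpResponse itself) instead of a filename, so the user downloads the PDF directly.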
Python program to Website Application
10,414,361
1
2
575
0
python,django,web,cgi,pyramid
If the application needs to talk to a database that already exists, then Django won't buy you much value IMO, since the admin interface won't work for that part unless the DB schema adheres to what Django expects (auto-increment int primary keys etc.); the same goes for any other web framework that presumes a particular schema. So SQLAlchemy is your best bet. It has an ORM layer, but you don't have to use it; you can get a lot of bang for the buck just using the query interfaces (a small sketch follows this record). As far as web frameworks go, that narrows it down to anything that can use SQLAlchemy, which is anything but Django, Zope and probably web2py, for the reasons mentioned above. For Zope, its value is somewhat derived from the fact that it's backed by ZODB, but ZODB is not going to help you at all with your existing database and data. Out of what is left of the web frameworks, the selection criterion I would use is how capable they are at routing requests to views, and how well that matches your URL-generation strategy. IMO Pyramid is very flexible in this area, but you might not need that; you may be able to get by with Flask or Bottle, or even straight WebOb. Another, slightly less important, criterion is the template engine/language; most frameworks will support the more popular ones, such as Jinja2. My personal choice is Pyramid because it scales nicely from super easy to super hairy in the request-routing department. But again, depending on how you want your URLs to work, you may not need that.
0
0
0
0
2012-05-02T05:31:00.000
5
0.039979
false
10,408,208
0
0
1
4
I'm working on a project that is converting a 50 MB Python graduated-interval-recall rating system for pictures and text into a website-based application (and then designing a website around it). It needs to connect to a database to store user information frequently, so it needs to be run server-side, correct? Assuming I know nothing, what is the best structure to complete this? There seem to be many different options and I feel lost. I've been using CGI to create a web UI for the original Python code. Is this even possible to implement? What about Pyramid / uWSGI / Pylons / Flask or Django? (although I was told to refrain from Django for this project)
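To illustrate the 'just use SQLAlchemy's query interfaces against the existing schema' advice in the first answer above: a sketch using table reflection. The connection URL, table name and score column are placeholders, and the autoload_with/select spelling shown targets SQLAlchemy 1.4 or newer.

```python
# SQLAlchemy Core sketch: query an existing table without declaring ORM models.
# URL, table and column names are placeholders; syntax assumes SQLAlchemy 1.4+.
from sqlalchemy import MetaData, Table, create_engine, select

engine = create_engine("sqlite:///existing_ratings.db")
metadata = MetaData()
ratings = Table("ratings", metadata, autoload_with=engine)   # reflect the existing schema

with engine.connect() as conn:
    top = conn.execute(select(ratings).where(ratings.c.score >= 4).limit(10)).fetchall()
    for row in top:
        print(dict(row._mapping))
```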
Python program to Website Application
10,408,572
0
2
575
0
python,django,web,cgi,pyramid
Django has a command (./manage.py inspectdb) that can help you create initial models from your current database. If you decide to redesign the database, this will still make it easier to move the data into the new schema. Personally I like Django, but the others may be very well suited to your application. To communicate back to the server you could probably use AJAX.
0
0
0
0
2012-05-02T05:31:00.000
5
0
false
10,408,208
0
0
1
4
[Question text identical to the first record for Q_Id 10,408,208 above.]
Python program to Website Application
10,409,701
7
2
575
0
python,django,web,cgi,pyramid
Well, it may be difficult to give you good advice because the description of your project is quite vague - what in the world is "a 50mb python Graduated interval recall rating system for pictures and text program"??? :) - but I'll try to outline the difference between the options you're listing.

Django is a sort of integrated solution: it includes a templating system, an ORM, a forms framework and lots more. Because those things are all closely tied together, Django provides some niceties such as a built-in admin interface, pluggable apps etc., which make kick-starting development of a traditional website easier, as you don't need to build those things yourself. For example, to build a blog site with Django you need to define a database model, a couple of routes and a couple of views, and that's it: you can add and edit blog entries using the built-in admin interface and authenticate using the pluggable authentication module. But there's a price, of course. To ensure all those bits work together, Django to some extent requires you to use the technologies provided by Django, i.e. you have to define your models using the Django ORM and write templates using Django templates. You can swap different bits for something else, but they understandably won't work well with the rest of the framework; for instance, you can use another ORM, such as SQLAlchemy, to access the database, but such models won't work with Django's admin interface. To some extent Django also expects a particular structure for the database tables (i.e. it expects to be able to create those tables based on models defined in Python code), which makes working with an existing database more difficult. Also, my understanding is that it expects you to have an SQL database. So, in my opinion, Django is a very good choice for building a "typical" Django website (it was built for news websites) which can make use of existing pluggable apps and other Django features.

Pyramid, on the other hand, does not require you to use any particular technology for database access - in fact, it does not require you to have a database at all; you can build an application which works with data stored on the filesystem, in an object database such as ZODB, or in some distributed NoSQL storage. Maybe even some XML file and a bunch of images... your imagination is your limit. When using an SQL database, it does not expect the database to have a certain structure. Also, SQLAlchemy, the recommended ORM for Pyramid, is considered to be more flexible and powerful than the Django ORM. It does not require you to use any particular templating library or form library, so you can choose whatever suits your needs best. Pyramid does not even require you to use route mapping, which is a cornerstone feature of most web frameworks; in addition to route mapping, Pyramid supports URL traversal, which can be a very powerful way to work with hierarchical data structures. While not requiring you to use any particular technology, Pyramid does provide some sane templates for typical use cases. The cost of this flexibility is that it may be more difficult to find existing "apps" which can be plugged into your very custom Pyramid website without any changes - although excellent WSGI support in Pyramid leverages that.

Pylons is called Pyramid now, after the project merged with repoze.bfg some time ago. uWSGI is more of an application/protocol to serve a Pyramid application (or another WSGI-conformant application). Flask - never used it; maybe someone else will give you an overview.

So, in short, the choice between Django and Pyramid boils down to the question "How much of Django's built-in features will I be able to use on my site?", because if you're not going to use Django's automatic admin or make heavy use of third-party pluggable apps, everything else is better in Pyramid :)
0
0
0
0
2012-05-02T05:31:00.000
5
1
false
10,408,208
0
0
1
4
[Question text identical to the first record for Q_Id 10,408,208 above.]
Python program to Website Application
10,408,245
1
2
575
0
python,django,web,cgi,pyramid
I have been told that Pylons is quite good (its newer incarnation is Pyramid), but I personally use Django and I'm quite happy with it. Do not even try to use CGI - that's the same mistake I made, and I figured out later that changing all the HTML was a pain in the arse.
0
0
0
0
2012-05-02T05:31:00.000
5
0.039979
false
10,408,208
0
0
1
4
I'm working on a project which is converting a 50mb python Graduated interval recall rating system for pictures and text program to a website-based application. (and then design a website around it) It needs to connect to a database to store user information frequently, so it needs to be run server side, correct? Assuming I know nothing, what is the best structure to complete this? There seem to be many different options and I feel lost. I've been using CGI to create a web UI for the original python code. Is this even possible to implement? What about pyramid / uWSGI / pylon / flask or Django? (although I was told to refrain from it for this project)
Remove leading and trailing slash / in python
67,345,321
1
88
95,279
0
python,django,path,strip
you can try: "/get/category".strip("/")
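For illustration, a minimal sketch of how this applies to the Django request path described in the question (the variable names are made up for the example):

    # str.strip removes the given characters from both ends of the string only.
    path = "/get/category"
    print(path.strip("/"))        # -> "get/category"
    # Inside a Django view you would typically do the same on request.path:
    # trimmed = request.path.strip("/")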
0
0
0
0
2012-05-02T06:35:00.000
4
0.049958
false
10,408,826
0
0
1
1
I am using request.path to return the current URL in Django, and it is returning /get/category. I need it as get/category (without leading and trailing slash). How can I do this?
How to run nginx + python (without django)
10,412,251
0
7
7,440
0
python,nginx,web,fastcgi
All the same, you must use a WSGI server, as nginx does not fully support this protocol on its own.
0
0
0
1
2012-05-02T10:39:00.000
4
0
false
10,412,063
0
0
1
2
I want to have simple program in python that can process different requests (POST, GET, MULTIPART-FORMDATA). I don't want to use a complete framework. I basically need to be able to get GET and POST params - probably (but not necessarily) in a way similar to PHP. To get some other SERVER variables like REQUEST_URI, QUERY, etc. I have installed nginx successfully, but I've failed to find a good example on how to do the rest. So a simple tutorial or any directions and ideas on how to setup nginx to run certain python process for certain virtual host would be most welcome!
How to run nginx + python (without django)
10,417,619
4
7
7,440
0
python,nginx,web,fastcgi
You should look into using Flask -- it's an extremely lightweight interface to a WSGI server (werkzeug) which also includes a templating library, should you ever want to use one. But you can totally ignore it if you'd like.
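For illustration, a minimal sketch of what the suggested Flask approach could look like (the route and parameter names here are invented for the example):

    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/submit", methods=["GET", "POST"])
    def submit():
        # request.args holds GET query parameters, request.form holds POST form data
        name = request.args.get("name") or request.form.get("name", "")
        # request.environ exposes server variables such as QUERY_STRING / REQUEST_METHOD
        query = request.environ.get("QUERY_STRING", "")
        return "name=%s query=%s" % (name, query)

    if __name__ == "__main__":
        app.run()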
0
0
0
1
2012-05-02T10:39:00.000
4
1.2
true
10,412,063
0
0
1
2
I want to have simple program in python that can process different requests (POST, GET, MULTIPART-FORMDATA). I don't want to use a complete framework. I basically need to be able to get GET and POST params - probably (but not necessarily) in a way similar to PHP. To get some other SERVER variables like REQUEST_URI, QUERY, etc. I have installed nginx successfully, but I've failed to find a good example on how to do the rest. So a simple tutorial or any directions and ideas on how to setup nginx to run certain python process for certain virtual host would be most welcome!
How can I use the Django ORM in my Tornado application?
10,415,532
12
22
6,269
0
python,django,tornado
Add the path to the Django project to the Tornado application's PYTHONPATH env-var and set DJANGO_SETTINGS_MODULE appropriately. You should then be able to import your models and use then as normal with Django taking care of initial setup on the first import. You shouldn't require any symlinks.
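A rough sketch of the setup described above (the path, settings module name and model are placeholders; details depend on your Django and Tornado versions):

    import os, sys

    # Make the Django project importable and point Django at its settings module
    sys.path.append("/path/to/django_project")                          # placeholder path
    os.environ["DJANGO_SETTINGS_MODULE"] = "django_project.settings"   # placeholder module

    import tornado.web
    from myapp.models import Player   # hypothetical Django model

    class PlayerHandler(tornado.web.RequestHandler):
        def get(self, player_id):
            player = Player.objects.get(pk=player_id)   # Django ORM call inside Tornado
            self.write({"name": player.name})

    application = tornado.web.Application([(r"/player/(\d+)", PlayerHandler)])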
0
0
0
0
2012-05-02T14:06:00.000
3
1.2
true
10,415,429
0
0
1
1
I have an existing Django application with a database and corresponding models.py file. I have a new Tornado application that provides a web service to other applications. It needs to read/write from that same database, and there is code in the models file I'd like to use. How can I best use the Django database and models in my Tornado request handlers? Is it as simple as making a symbolic link to the models.py Django project folder, importing Django modules, and using it? I guess I'd have to do settings.configure(), right? Thanks!
How does pgBouncer help to speed up Django
10,419,731
11
30
41,803
0
python,django,postgresql,connection-pooling,pgbouncer
PgBouncer reduces the latency in establishing connections by serving as a proxy which maintains a connection pool. This may help speed up your application if you're opening many short-lived connections to Postgres. If you only have a small number of connections, you won't see much of a win.
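If you do put PgBouncer in front of Postgres, the Django side usually only needs to point at the pooler instead of the database directly - a sketch (the host/port values are common PgBouncer defaults and the credentials are placeholders):

    # settings.py (sketch)
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql_psycopg2",
            "NAME": "mydb",        # placeholder database name
            "USER": "myuser",      # placeholder user
            "PASSWORD": "secret",  # placeholder password
            "HOST": "127.0.0.1",   # PgBouncer host
            "PORT": "6432",        # PgBouncer's default listen port
        }
    }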
0
0
0
1
2012-05-02T18:34:00.000
2
1
false
10,419,665
0
0
1
2
I have some management commands that are based on gevent. Since my management command makes thousands to requests, I can turn all socket calls into non-blocking calls using Gevent. This really speeds up my application as I can make requests simultaneously. Currently the bottleneck in my application seems to be Postgres. It seems that this is because the Psycopg library that is used for connecting to Django is written in C and does not support asynchronous connections. I've also read that using pgBouncer can speed up Postgres by 2X. This sounds great but it would be great if someone could explain how pgBouncer works and helps? Thanks
How does pgBouncer help to speed up Django
10,420,469
105
30
41,803
0
python,django,postgresql,connection-pooling,pgbouncer
Besides saving the overhead of connect & disconnect where this is otherwise done on each request, a connection pooler can funnel a large number of client connections down to a small number of actual database connections. In PostgreSQL, the optimal number of active database connections is usually somewhere around ((2 * core_count) + effective_spindle_count). Above this number, both throughput and latency get worse. NOTE: Recent versions have improved concurrency, so in 2022 I would recommend something more like ((4 * core_count) + effective_spindle_count). Sometimes people will say "I want to support 2000 users, with fast response time." It is pretty much guaranteed that if you try to do that with 2000 actual database connections, performance will be horrible. If you have a machine with four quad-core processors and the active data set is fully cached, you will see much better performance for those 2000 users by funneling the requests through about 35 database connections. To understand why that is true, this thought experiment should help. Consider a hypothetical database server machine with only one resource to share -- a single core. This core will time-slice equally among all concurrent requests with no overhead. Let's say 100 requests all come in at the same moment, each of which needs one second of CPU time. The core works on all of them, time-slicing among them until they all finish 100 seconds later. Now consider what happens if you put a connection pool in front which will accept 100 client connections but make only one request at a time to the database server, putting any requests which arrive while the connection is busy into a queue. Now when 100 requests arrive at the same time, one client gets a response in 1 second; another gets a response in 2 seconds, and the last client gets a response in 100 seconds. Nobody had to wait longer to get a response, throughput is the same, but the average latency is 50.5 seconds rather than 100 seconds. A real database server has more resources which can be used in parallel, but the same principle holds, once they are saturated, you only hurt things by adding more concurrent database requests. It is actually worse than the example, because with more tasks you have more task switches, increased contention for locks and cache, L2 and L3 cache line contention, and many other issues which cut into both throughput and latency. On top of that, while a high work_mem setting can help a query in a number of ways, that setting is the limit per plan node for each connection, so with a large number of connections you need to leave this very small to avoid flushing cache or even leading to swapping, which leads to slower plans or such things as hash tables spilling to disk. Some database products effectively build a connection pool into the server, but the PostgreSQL community has taken the position that since the best connection pooling is done closer to the client software, they will leave it to the users to manage this. Most poolers will have some way to limit the database connections to a hard number, while allowing more concurrent client requests than that, queuing them as necessary. This is what you want, and it should be done on a transactional basis, not per statement or connection.
0
0
0
1
2012-05-02T18:34:00.000
2
1.2
true
10,419,665
0
0
1
2
I have some management commands that are based on gevent. Since my management command makes thousands to requests, I can turn all socket calls into non-blocking calls using Gevent. This really speeds up my application as I can make requests simultaneously. Currently the bottleneck in my application seems to be Postgres. It seems that this is because the Psycopg library that is used for connecting to Django is written in C and does not support asynchronous connections. I've also read that using pgBouncer can speed up Postgres by 2X. This sounds great but it would be great if someone could explain how pgBouncer works and helps? Thanks
get_application_id() behaviour with aliased app id
10,423,664
3
1
124
0
python,google-app-engine
No - get_application_id returns the ID of the app that is actually serving your request. You can examine the hostname to see if the request was directed to oldappid.appspot.com.
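A small sketch of the hostname check suggested above (webapp2 is assumed as the handler framework; "oldappid" and "newappid" are placeholders):

    import webapp2

    class MainHandler(webapp2.RequestHandler):
        def get(self):
            host = self.request.host   # e.g. "oldappid.appspot.com" or "newappid.appspot.com"
            if host.startswith("oldappid."):
                original_id = "oldappid"   # request arrived via the aliased (old) app id
            else:
                original_id = "newappid"
            self.response.write(original_id)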
0
1
0
0
2012-05-02T23:44:00.000
1
1.2
true
10,423,245
0
0
1
1
I was forced to alias my app name after migrating to the High Replication Datastore. I use the google.appengine.api.app_identity.get_application_id() method throughout my app, but now it returns the new app id instead of the original one even when visiting via the old app id url. Is there a way to output the original app id? thanks
Pointers on using celery with sorl-thumbnails with remote storages?
11,048,085
4
11
1,459
0
python,django,amazon-s3,celery,sorl-thumbnail
As I understand it, Sorl works correctly with S3 storage, but it is very slow. I assume you know which image sizes you need. You should launch a celery task after the image has been uploaded. In the task, call sorl.thumbnail.default.backend.get_thumbnail(file, geometry_string, **options). Sorl will generate the thumbnail and upload it to S3. The next time you request the image from a template it is already cached and served directly from Amazon's servers. As for a clean way to handle a placeholder thumbnail image while the real one is being processed: for this you will need to override the Sorl backend. Add a new argument to the get_thumbnail function, e.g. generate=False. When you call the function from celery, pass generate=True, and change its logic so that if the thumbnail is not present and generate is True it works just like the standard backend, but if generate is False it returns your placeholder image with text like "We are processing your image now, come back later" and does not call backend._create_thumbnail. You can also launch a task in this case, if you think a thumbnail could be accidentally deleted. I hope this helps
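A bare-bones sketch of the celery task described above (the task name and geometry are examples, the old-style task decorator is assumed, and the generate-flag customisation is left out):

    from celery.task import task               # old-style (celery 2.x/3.0) decorator
    from sorl.thumbnail import default

    @task
    def generate_thumbnail(image_field_file, geometry="100x100"):
        # Runs outside the request cycle: downloads the original from S3,
        # creates the thumbnail and uploads it back to the storage backend.
        return default.backend.get_thumbnail(image_field_file, geometry, crop="center")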
0
1
0
0
2012-05-03T02:48:00.000
3
0.26052
false
10,424,456
0
0
1
1
I'm surprised I don't see anything but "use celery" when searching for how to use celery tasks with sorl-thumbnails and S3. The problem: using remote storages causes massive delays when generating thumbnails (think 100s+ for a page with many thumbnails) while the thumbnail engine downloads originals from remote storage, crunches them, then uploads back to s3. Where is a good place to set up the celery task within sorl, and what should I call? Any of your experiences / ideas would be greatly appreciated. I will start digging around Sorl internals to find a more useful place to delay this task, but there are a few more things I'm curious about if this has been solved before. What image is returned immediately? Sorl must be told somehow that the image returned is not the real thumbnail. The cache must be invalidated when celery finishes the task. Handle multiple thumbnail generation requests cleanly (only need the first one for a given cache key) For now, I've temporarily solved this by using an nginx reverse proxy cache that can serve hits while the backend spends time generating expensive pages (resizing huge PNGs on a huge product grid) but it's a very manual process.
Dynamic notifications pop up
10,429,261
4
2
1,343
0
php,python,facebook
Facebook calls an AJAX endpoint every few seconds to keep the client-side UI fresh. The payload from this endpoint contains updates for ticker, newsfeed, notifications, messages and various other statuses. You can view this by opening Facebook in Google Chrome and looking at the network tab in Chrome Developer Tools.
0
0
0
0
2012-05-03T09:17:00.000
1
1.2
true
10,428,414
0
0
1
1
How is the dynamic notification updates are displayed in facebook.Also here at stackoverflow, why isn't the notifications pop up immediately as notifications arises.They aren't displayed until i refresh the page.
Allowing tags with Google App Engine and Jinja2
10,441,370
-1
10
4,529
0
python,escaping,whitespace,jinja2,webapp2
The easiest way to do this is to escape the field yourself and then add the line breaks. When you pass it to the Jinja template, mark it as safe so it isn't autoescaped again.
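A small sketch of that approach (assuming the markupsafe package that ships with Jinja2; the function and variable names are made up):

    from markupsafe import Markup, escape

    def format_post(raw_text):
        # Escape the user input first, then turn newlines into <br> tags.
        escaped = escape(raw_text)                       # returns a Markup-safe string
        with_breaks = escaped.replace("\n", Markup("<br>"))
        return Markup(with_breaks)                       # marked safe for the template

    # In the handler: template.render(body=format_post(post.body))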
0
0
0
0
2012-05-03T17:30:00.000
8
-0.024995
false
10,436,458
0
0
1
2
In my web app, the user can make blog posts. When I display the blog post, newlines aren't shown because I didn't replace the new lines with <br> tags. The problem is that I've turned autoescaping on in Jinja, so <br> tags are escaped. I don't want to temporarily disable autoescaping, I want to specifically allow <br> tags. How would I do this?
Allowing tags with Google App Engine and Jinja2
10,436,756
-1
10
4,529
0
python,escaping,whitespace,jinja2,webapp2
The solution was to put <pre></pre> tags around the area where I had the content.
0
0
0
0
2012-05-03T17:30:00.000
8
-0.024995
false
10,436,458
0
0
1
2
In my web app, the user can make blog posts. When I display the blog post, newlines aren't shown because I didn't replace the new lines with <br> tags. The problem is that I've turned autoescaping on in Jinja, so <br> tags are escaped. I don't want to temporarily disable autoescaping, I want to specifically allow <br> tags. How would I do this?
Fabric + django asynchronous prompt for sudo password
10,439,756
1
1
626
0
python,ajax,django,sudo,fabric
I can't think of a way to do a password prompt only if required... you could prompt before and cache it as required, though, and the backend would have access. To pass the sudo password to the fabric command, you can use sudo -S... i.e. echo password | sudo -S command
0
0
0
0
2012-05-03T21:36:00.000
2
0.099668
false
10,439,654
0
0
1
2
I'm working on the deployment tool in Django and fabric. The case is putting some parameters (like hostname and username) in the initial form, then let Django app to call fabric methods to do the rest and collect the output in the web browser. IF there is a password prompt from OS to fabric (ie. running sudo commands etc.), I would like to popup the one-field form for the password to be put in it (for example using jQuery UI elements). The person will fill the password field for user prompted and fabric will continue to do the things. Is this situation possible to be implemented? I was thinking about some async calls to browser, but I have no idea how it can be done from the other side. Probably there is another way. Please let me know if you have any suggestions. Thanks!
Fabric + django asynchronous prompt for sudo password
10,439,758
2
1
626
0
python,ajax,django,sudo,fabric
Yes: capture the password exception, then pop up the form, and run the fabric script again with env.password = userpassword. If you want to continue from where you caught the exception, keep a variable that records what has been done so far (e.g. nlinesexecuted) and save it when you catch the exception. Use that when you rerun the script to continue where you left off.
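A rough sketch of what retrying with the supplied password could look like on the Django side (the function name, command and return values are hypothetical):

    from fabric.api import env, sudo

    def run_deploy(host, user, password=None):
        env.host_string = "%s@%s" % (user, host)
        if password:
            env.password = password        # password collected from the popup form
        env.abort_on_prompts = True        # fail fast instead of prompting interactively
        try:
            sudo("service myapp restart")  # hypothetical command
            return "ok"
        except SystemExit:
            # abort_on_prompts raises when a password would be needed:
            # signal the frontend to show the password popup and call us again.
            return "password-required"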
0
0
0
0
2012-05-03T21:36:00.000
2
1.2
true
10,439,654
0
0
1
2
I'm working on the deployment tool in Django and fabric. The case is putting some parameters (like hostname and username) in the initial form, then let Django app to call fabric methods to do the rest and collect the output in the web browser. IF there is a password prompt from OS to fabric (ie. running sudo commands etc.), I would like to popup the one-field form for the password to be put in it (for example using jQuery UI elements). The person will fill the password field for user prompted and fabric will continue to do the things. Is this situation possible to be implemented? I was thinking about some async calls to browser, but I have no idea how it can be done from the other side. Probably there is another way. Please let me know if you have any suggestions. Thanks!
Distributed server model
10,440,880
1
1
221
1
python,distributed-computing
The general way to handle this is to have the threads report their status back to the server daemon. If you haven't seen a status update within the last 5N seconds, then you kill the thread and start another. You can keep track of the current active threads that you've spun up in a list, then just loop through them occasionally to determine state. You of course should also fix the errors in your program that are causing threads to exit prematurely. Premature exits and killing a thread could also leave your program in an unexpected, non-atomic state. You should probably also have the server daemon run a cleanup process that makes sure any items in your queue, or whatever you're using to determine the workload, get reset after a certain period of inactivity.
0
1
0
0
2012-05-03T22:39:00.000
1
1.2
true
10,440,277
0
0
1
1
Lets say I have 100 servers each running a daemon - lets call it server - that server is responsible for spawning a thread for each user of this particular service (lets say 1000 threads per server). Every N seconds each thread does something and gets information for that particular user (this request/response model cannot be changed). The problem I a have is sometimes a thread hangs and stops doing something. I need some way to know that users data is stale, and needs to be refreshed. The only idea I have is every 5N seconds have the thread update a MySQL record associated with that user (a last_scanned column in the users table), and another process that checks that table every 15N seconds, if the last_scanned column is not current, restart the thread.
plone how to add content rule for event which after end date should be moved to another folder
10,448,068
4
2
213
0
python,plone
Not sure you can do this with a content rule; there is no code running at that exact time. You'd need to run an external cron job to trigger a scan for expired events. Why not just use a collection to list expired events in the other location?
0
0
0
0
2012-05-04T11:23:00.000
1
1.2
true
10,447,858
0
0
1
1
I wish to create a content rule for an event such that after expiry date of the event i.e end date, it should be moved to another folder. How do I specify the content rule. Please guide. Using Plone 4.1
Django: How can I find methods/functions filling in the specific template
10,451,563
1
0
66
0
python,django
In a well-designed Django project, you should only have to edit the template. Good design provides clean separation. It's possible the developer was forced to do something unusual... but you could try to edit the template and see what happens (make a backup first).
0
0
0
0
2012-05-04T14:59:00.000
2
0.099668
false
10,451,323
0
0
1
2
Say, I find some bug in a web interface. I open firebug and discover element and class, id of this element. By them I can then identify a template which contains variables, tags and so on. How can I move forward and reveal in which .py files these variables are filled in? I know how it works in Lift framework: when you've found a template there are elements with attributes bound to snippets. So you can easily proceed to specific snippets and edit code. How does it work in django? May be, I suppose wrong process... then point me to the right algorithm, please.
Django: How can I find methods/functions filling in the specific template
10,451,982
1
0
66
0
python,django
Determining template variable resolution is all about Context. Use the URL to identify the view being invoked. Look at the view's return and note a) the template being used, and b) any values being passed in the Context used when the template is being rendered. Look at settings.py for the list of TEMPLATE_CONTEXT_PROCESSORS. These are routines that are called automatically and invisibly to add values to the Context being passed to the template. This is sort of a Man Behind the Curtain™ process that can really trip you up if you don't know about it. Check to see if there are any magic template tags being called (either in the template in question, in a template it extends, or in a template that includes the template) that might be modifying the Context. Sometimes I need to use an old-school Django snippet called {%expr%} that can do evaluation in the template, but I always use it as close to the point of need as possible to highlight the fact it is being used. Note that because of the way Django template variables are resolved, {{foo.something}} could be either a value or a callable method. I have serious issues with this syntax, but that's the way they wrote it.
0
0
0
0
2012-05-04T14:59:00.000
2
1.2
true
10,451,323
0
0
1
2
Say, I find some bug in a web interface. I open firebug and discover element and class, id of this element. By them I can then identify a template which contains variables, tags and so on. How can I move forward and reveal in which .py files these variables are filled in? I know how it works in Lift framework: when you've found a template there are elements with attributes bound to snippets. So you can easily proceed to specific snippets and edit code. How does it work in django? May be, I suppose wrong process... then point me to the right algorithm, please.
Django - Is it better to install packages to virtualenv/system or include them within the project?
10,459,311
5
1
121
0
python,django,project
The best way is to use a separate virtualenv for each project. There is nothing messy about it (use virtualenvwrapper). Sharing a library between projects is always a potential risk: what if you want to upgrade the library in one project and use an older version in another? Also, pip freeze will list the actual set of packages for the project, not some list you have to filter manually.
0
0
0
0
2012-05-05T05:09:00.000
1
0.761594
false
10,459,041
1
0
1
1
The large number of apps/packages which can be used in python/django is a great advantage of both. This also raises a question about handling these installed applications/library, especially when there are multiple environments in which the project has to be deployed. Installing such third party libraries to the system does not seem ideal to me. Thus after some research, I found that there are two possible ways to go namely virtualenv or including the package within the project folder. But the problems are that creating a virtualenv for each project is kind of messy and on the other side, including large packages within the project directory increases the project size and also creates import problems. I have found kind of a middle ground between the above two methods which is to install libraries which can be shared with multiple projects into a virtualenv and smaller project specific libraries within the project. For example, for a django project, I would install django into a virtualenv and other libraries used in the project for example xlwrt, dojango etc are included within a "lib" folder within the project. Is this the best way to go or are there better alternative methods??
Sending objects from Jinja Templates to Python
10,460,937
1
1
1,379
0
python,html,forms,google-app-engine,jinja2
Why do this? Any logic that you implement in the template is accessible to you in the controller of your app, including any variables that you place in the template context. If the data has been changed due to interaction with the user, then the best way to retrieve data, in my opinion, is to set up a form and use the normal POST method to send the request and the required data, correctly encoded and escaped, back to your program. In this way, you are protected from XSS issues, among other inconveniences. I would never do any processing in a template, and only use any local logic to modify the presentation itself. EDIT: Taking into account your scenario, I suggest the following: (1) the user presses a button on a page and invokes a GET handler; (2) the GET handler queries a database and receives a list of images; (3) the list is cached, maybe in memcache, and the key is sent, encoded as a parameter in the GET URL displayed by the template, together with the list of images; (4) the list of images gets passed to the template engine for display; (5) another button is pressed and a different GET handler is invoked, which uses the key received in the GET URL - after sanitising and validation - to retrieve the cached list. If you don't want the intermediate step of caching a key-value pair, you may want to encode the whole list in the GET URL, and the sanitising and validation step should be as easy on the whole list as on a key to the list. Both methods avoid a round trip to the database, protect you from malicious use, and respect the separation of data, presentation, and logic.
0
0
0
0
2012-05-05T09:55:00.000
2
1.2
true
10,460,716
0
0
1
2
I started using Jinja Templating with Python to develop web apps. With Jinja, I am able to send objects from my Python code to my index.html, but is it possible to receive objects from my index.html to my Python code? For example, passing a list back and forth. If so, do you have any examples? Thank You!
Sending objects from Jinja Templates to Python
10,460,906
-1
1
1,379
0
python,html,forms,google-app-engine,jinja2
Just a thought.. Have you tried accessing the variables in the dict you passed to jinja after processing the template?
0
0
0
0
2012-05-05T09:55:00.000
2
-0.099668
false
10,460,716
0
0
1
2
I started using Jinja Templating with Python to develop web apps. With Jinja, I am able to send objects from my Python code to my index.html, but is it possible to receive objects from my index.html to my Python code? For example, passing a list back and forth. If so, do you have any examples? Thank You!
is it possible to run a task scheduler in bottle web framework
11,097,542
0
2
663
0
python,task,scheduler,bottle
I would suggest threading; it allows the web server to be unaffected by the scheduled tasks, which will either be in a queue or written into the code itself.
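A small sketch of running APScheduler alongside Bottle in the same process (the APScheduler 2.x import path is assumed, and the job and interval are examples):

    from apscheduler.scheduler import Scheduler   # APScheduler 2.x-style API
    import bottle

    sched = Scheduler()

    @sched.interval_schedule(minutes=5)
    def cleanup():
        # periodic task runs in a background thread, independent of web requests
        print("running scheduled cleanup")

    @bottle.route("/")
    def index():
        return "hello"

    sched.start()                     # starts the scheduler's background thread
    bottle.run(host="localhost", port=8080)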
0
1
0
0
2012-05-07T05:55:00.000
1
1.2
true
10,477,310
0
0
1
1
Does anyone have any examples on how to integrate a task scheduler in Bottle. Something like APScheduler or sched?
Simplest framework for converting python app into webapp?
10,493,743
2
3
1,030
0
python,django
CherryPy would serve you the best if you are looking for a minimalist one.
0
0
0
0
2012-05-08T05:45:00.000
5
0.07983
false
10,493,188
0
0
1
2
I have a small python app that I would like to convert into a webapp. I have look at some frameworks out there such as django. I need some suggestions on a very simple framework. All my webapp will have is a textbox and two buttons.
Simplest framework for converting python app into webapp?
10,497,657
0
3
1,030
0
python,django
+1 for bottle. Nevertheless, consider that the scope usually grows when developing web apps and that text box and those 2 buttons will later on need: templates, lots of routes, maybe translations, etc. So, always consider using something that would allow you to move into some powerful framework such as web2py or Django.
0
0
0
0
2012-05-08T05:45:00.000
5
0
false
10,493,188
0
0
1
2
I have a small python app that I would like to convert into a webapp. I have look at some frameworks out there such as django. I need some suggestions on a very simple framework. All my webapp will have is a textbox and two buttons.
Is there a workaround to get floating divs in Report Lab?
10,517,245
1
2
819
0
python,pdf,pdf-generation,reportlab,xhtml2pdf
If you cannot achieve the results you need with xhtml2pdf, I suggest you use ReportLab directly. ReportLab contains support for RML, ReportLab's own markup language that lets you easily create formatted text, and has a support library called Platypus that makes layout fairly simple, using Python objects to represent document parts and page layouts. The reason you are having problems, by the way, is that xhtml2pdf has to essentially act like an HTML rendering engine that outputs to PDF rather than to the screen directly. As it took a long time and a lot of effort to make good rendering engines for browsers, so, too, will it take xhtml2pdf a lot of effort to reach similar quality. This isn't to say that xhtml2pdf is bad, just that it's going to take time for it to be as good as rendering in a browser, and if PDF output for its own sake is what you are really interested in, I think using ReportLab directly is a better choice.
0
0
0
0
2012-05-08T14:06:00.000
1
0.197375
false
10,500,237
0
0
1
1
I generate PDFs with the xhtml2pdf Python package. The output is not optimal. I use floating divs in order to place images and text on the page. In HTML this works but after PDF rendering, images and text ar placed underneath eachother which is not what I want. From surfing the web I learned that the Report Lab package that is used by xhtml2pdf can not handle floating divs. Does a workaround exist? I have tried webkit rendering via QT but the resulting PDFs are of low quality, i.e. character spacing is completely wrong.
Display exceptions in mod_wsgi hosted cherrypy app
13,859,477
2
0
166
0
python,mod-wsgi,cherrypy
If you set the environment configuration to production, embedded or test_suite, then log.screen is automatically set to False. Manually setting it back to True will give you a full traceback displayed on screen. Alternatively, you'd have to start digging in your web server logs.
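For illustration, a minimal sketch of turning tracebacks back on in a CherryPy app (these are standard CherryPy config keys; whether you want them enabled in production is another matter):

    import cherrypy

    cherrypy.config.update({
        "log.screen": True,               # print log output (incl. tracebacks) to stdout/stderr
        "request.show_tracebacks": True,  # include the traceback in the error page itself
    })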
0
0
0
0
2012-05-08T14:24:00.000
1
1.2
true
10,500,536
0
0
1
1
I have a cherrypy app with works fine on my local machine, but not on my server. If an exception is thrown in a request handler, I see the following message in my browser: The server encountered an unexpected condition which prevented it from fulfilling the request. How can I tell cherrypy to display the exception? I see no log information in the appache log or somewhere else. I would expect some flag where I can say "DEBUG=True" or something like that. But I could not find anything in the documentation.
Detecting File Download Completion Status
12,107,307
-1
1
431
0
python,django
I don't know if this is exactly what you want, but you could check the file size at two points in time and then compare them. Of course, if the download pauses for some reason, this solution is not very good.
0
0
0
0
2012-05-09T04:04:00.000
1
1.2
true
10,509,661
0
0
1
1
I would just like to ask if it's possible for me to check the status of a file (csv file) download in Django . I have a template page with a button, which when clicked, opens a 'Save As' dialog box for downloading a file from the server (file content is retrieved from db). after the download is complete i want to update my DB with its status (if download complete status = Downloaded ,if abort status = aborted , if Cancel Status= Canceled . How can i do this ?
Django users post to twitter
10,516,808
2
1
783
0
python,django,twitter
You could just extract the code into your own project and that will work. But the benefits of using an open source library is that there's a good chance when Twitter or Social Network X changes it's API, the library, if popular, would get updated as opposed to you needing to make the change.
0
0
0
0
2012-05-09T09:54:00.000
3
0.132549
false
10,513,759
0
0
1
1
I use Django-social-auth to authenticate the users of a Django project. So I guess I have all the necessary information about a Twitter user to make a post on their behalf on their Twitter account. So do I really need to install a new app? or with the info at hand how would I do that? Isn't it just a matter of posting to a Twitter API with the relevant info?
Execute java methodes via a Python or Perl client
10,519,519
0
3
114
0
java,python,perl,client
What you are talking about is Web Services. A corollary to this is XML and SOAP. In Java, Python, C#, C++... any language, you can create a Web Service that conforms to a standard pattern. Using NetBeans (Oracle's Java IDE) it is easy to create Java web services. Otherwise, use Google to search for "web services tutorial [your programming language]".
0
1
0
1
2012-05-09T15:42:00.000
1
0
false
10,519,454
0
0
1
1
I have a java application as server (installed on Tomcat/Apache) and another java application as client. The client's task is to get some arguments and pass them to the server and call an adequate method on the server to be execute. I want to have the client in other languages like Perl, Python or TCL. So, I‌ need to know how to establish the communication and what is the communication structure. I'm not seeking for some codes but rather to know more about how to execute some java codes via other languages. I try to google it, but I mostly found the specific question/answer and not a tutorial or something like that. I wonder if I should search for a specific expression ? Do you know any tutorial or site whom explains such structures considering all aspects ? Many thanks Bye.
convert text post into xml file using python in google app engine
10,561,318
2
0
265
0
python,google-app-engine,app-inventor
The solution I found was to leave Shival Wolf's code on App Engine unchanged, and to replace the 'postfile' block in the App Inventor code with a 'posttext' block with the text you want to send attached to it. Also change the filename variable to the name you want the file to have, including the file type (i.e. .xml, .csv, .txt etc). This appears to work for me.
0
1
1
0
2012-05-09T18:47:00.000
1
0.379949
false
10,522,243
0
0
1
1
newbie here in need of help. Using App Inventor and App Engine. Learning python as I go along. Still early days. Need to post text data from AI to app engine, save in blob store as file (.xml), to be emailed as an attachment. Am able to send pictures using Shival Wolf's wolfwebmail2, and am sure with a bit of playing with the code I can change it to save the text post as a file in blob store to do the same operation. As stated, newbie learning fast. Many thanks in advance for any pointers.
How can I place my Django/Celery periodic tasks and regular tasks in separate files?
10,550,920
4
1
303
0
python,django,celery,code-organization,django-celery
Celery has a configuration setting named CELERY_IMPORTS. It's a list of module names which need to be imported and scanned for existing Celery task functions. django-celery effectively adds something like each app's tasks.py there. In your case, you can add the extra modules you want to use (for example each app's check.py) to CELERY_IMPORTS in your settings.
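A sketch of what that could look like in settings.py (the app and module names are placeholders for your own tasks.py/check.py files):

    # settings.py (sketch)
    CELERY_IMPORTS = (
        "myapp.tasks",     # regular tasks
        "myapp.check",     # periodic tasks kept in a separate module
        "otherapp.check",
    )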
0
0
0
0
2012-05-10T12:20:00.000
1
1.2
true
10,533,864
0
0
1
1
All my celery tasks are contained under a tasks.py under each Django app of mine. It's quite cluttered. I'd like to move my celery periodic tasks into a check.py file under each app, mainly to make it easier to organise and manage my code. Is there a provision in django-celery to do this? Thanks.
Using sqlalchemy in pyramid_jqm
10,555,714
2
0
101
1
python,sqlalchemy,pyramid
You have to setup your project in the same way that the alchemy scaffold is constructed. Put "sqlalchemy" in your setup.py requires field and run "python setup.py develop" to install the dependency. This is all just python and unrelated to Pyramid.
0
0
0
0
2012-05-11T12:08:00.000
1
1.2
true
10,551,042
0
0
1
1
I know that pyramid comes with a scaffold for sqlalchemy. But what if I'm using the pyramid_jqm scaffold. How would you integrate or use sqlalchemy then? When I create a model.py and import from sqlalchemy I get an error that he couldnt find the module.
How can I make Django's error emails report `locals()`?
10,552,934
3
4
200
0
python,django,debugging
Solved it by including 'include_html': 'True', next to 'class': 'django.utils.log.AdminEmailHandler', in the LOGGING setting in settings.py. Now I get HTML reports via email that contain all of the locals() info.
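For reference, a sketch of the relevant part of the LOGGING dict (only the handler and logger entries that matter here; the rest of a real LOGGING config is omitted, and a boolean True works as well as the string):

    # settings.py (partial sketch)
    LOGGING = {
        "version": 1,
        "disable_existing_loggers": False,
        "handlers": {
            "mail_admins": {
                "level": "ERROR",
                "class": "django.utils.log.AdminEmailHandler",
                "include_html": True,   # adds the full debug page (with locals()) to the email
            },
        },
        "loggers": {
            "django.request": {
                "handlers": ["mail_admins"],
                "level": "ERROR",
                "propagate": True,
            },
        },
    }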
0
0
0
0
2012-05-11T13:06:00.000
2
1.2
true
10,551,960
0
0
1
1
When your Django app raises an exception, Django sends you an email to report you of the error which includes some useful information. That's great. But there's one piece of information that would be helpful in debugging but is missing from those emails: locals(). How can I make Django's error emails include locals()?
Looking for a Django nag system
10,552,835
1
3
121
0
python,django
You can do it client-side by adding those nag messages to cookies and showing them on every page load, with a close handler that removes the cookie.
0
0
0
0
2012-05-11T13:24:00.000
2
0.099668
false
10,552,242
0
0
1
1
I have a Django app. I've implemented a few "nag" boxes in a rather manual way, and I'm now looking for a dedicated module/framework for such nags. What do I mean by "nag"? Those are little boxes that show up near the top of each page on the site, similar to Django's built-in messages, that tell the user of something that requires his attention. For example, if a user's credit card is about to expire, we need to show a nag saying "Your credit card is about to expire, click here to enter a new one". If a recent credit card charge failed, we need to show a nag. If he hasn't verified his email address, we need to show a nag. Why not use the built-in messages framework? Because these nags are a bit different than messages. Messages are shown one time, and then they're cleared, while nags should show every time on every page on the website that the user visits. Nags should have a "close" button which will actually function as a "snooze" button, causing the message not to be shown for a specified time period, like 24 hours. Did anyone implement something like this? A framework in which I can create these nags, specify their conditions for appearing, their snooze abilities, and possibly more features?
Is mixing Clojure with Python a good idea?
10,558,919
14
8
3,402
0
python,clojure
I built an embarrassingly parallel number-crunching application with a backend in Clojure (on an arbitrary number of machines) and a frontend in Ruby on Rails. I don't particularly like RoR, but this was a zero-budget project at the time and we had a Rails programmer at hand who was willing to work for free. The Clojure part consisted of (roughly) a controller, number crunching nodes, and a server implementing a JSON-over-HTTP API which was the interface to the Rails web app. The Clojure nodes used RabbitMQ to talk to each other. Because we defined clear APIs between different parts of the application, it was easy to later rewrite the frontend in Clojure (because that better suited our needs). If you're working on a distributed project with a long life span and continuous development effort, it could make sense to design the application as a number of separate modules that communicate through well defined APIs (json, bson, ... over AMQP, HTTP, ... or a database). That means you can get started quickly using a language you're comfortable with, and rewrite parts in another language at a later stage if necessary.
0
0
0
0
2012-05-11T20:16:00.000
4
1.2
true
10,558,044
1
0
1
3
I am working on a big project that involves a lot of web based and AI work. I am extremely comfortable with Python, though my only concern is with concurrent programming and scaling this project to make it work on clusters. Thus, Clojure for AI and support for Java function calls and bring about concurrent programming. Is this a good idea to do all the web-based api work with Python and let Clojure take care of most of the concurrent AI work? Edit: Let me explain the interaction in detail. Python would be doing most of the dirty work (scraping, image processing, improving the database and all that.) Clojure, if possible, would either deal with the data base or get the data from Python. I expect something CPython sort of linking with Python and Clojure. Edit2: Might be a foolish question to ask, but this being a rather long term project which will evolve quite a bit and go under several iterations, is Clojure a language here to stay? Is it portable enough?
Is mixing Clojure with Python a good idea?
10,558,126
4
8
3,402
0
python,clojure
If you can build both sides to use data and pure(ish) functions to communicate, then this should work very well. Wrapping your Clojure functions in web services that take and return JSON (or, preferably, Clojure forms) should make them accessible to your Python-based front end with no extra fuss. Of course it's more fun to write it in Clojure all the way through. ;) If this is a long-term project, then building clean functional (as in: takes and returns values) interfaces that exchange data becomes even more important, because it gives you the ability to evolve the components independently.
0
0
0
0
2012-05-11T20:16:00.000
4
0.197375
false
10,558,044
1
0
1
3
I am working on a big project that involves a lot of web based and AI work. I am extremely comfortable with Python, though my only concern is with concurrent programming and scaling this project to make it work on clusters. Thus, Clojure for AI and support for Java function calls and bring about concurrent programming. Is this a good idea to do all the web-based api work with Python and let Clojure take care of most of the concurrent AI work? Edit: Let me explain the interaction in detail. Python would be doing most of the dirty work (scraping, image processing, improving the database and all that.) Clojure, if possible, would either deal with the data base or get the data from Python. I expect something CPython sort of linking with Python and Clojure. Edit2: Might be a foolish question to ask, but this being a rather long term project which will evolve quite a bit and go under several iterations, is Clojure a language here to stay? Is it portable enough?
Is mixing Clojure with Python a good idea?
10,562,377
0
8
3,402
0
python,clojure
In such scenarios I personally like to proceed in the following sequence. Divide the system into subsystems with a "very clear" definition of what each subsystem does, and that definition should follow the principle of "do one thing and keep it simple". At this stage don't think about languages etc. Choose the platform (not the language) on which these subsystems will run, e.g. JVM, Python VM, NodeJS, CLR (Mono), other VMs. Try to select a few platforms - or if possible just one - as that makes life easier down the road in terms of complexity. Then choose the language to program those platforms in. This is very subjective, but for the JVM you can go with Clojure or Jython (in case you like dynamic languages, as I do). As far as Clojure's future is concerned, this is a language developed by a community of amazing programmers and not by some corporation. I hope that clears up your doubt about the "long term" concern with Clojure. By the way, Clojure is a LISP, so you can modify the language the way you want and fix things yourself even if no one else does it for you.
0
0
0
0
2012-05-11T20:16:00.000
4
0
false
10,558,044
1
0
1
3
I am working on a big project that involves a lot of web based and AI work. I am extremely comfortable with Python, though my only concern is with concurrent programming and scaling this project to make it work on clusters. Thus, Clojure for AI and support for Java function calls and bring about concurrent programming. Is this a good idea to do all the web-based api work with Python and let Clojure take care of most of the concurrent AI work? Edit: Let me explain the interaction in detail. Python would be doing most of the dirty work (scraping, image processing, improving the database and all that.) Clojure, if possible, would either deal with the data base or get the data from Python. I expect something CPython sort of linking with Python and Clojure. Edit2: Might be a foolish question to ask, but this being a rather long term project which will evolve quite a bit and go under several iterations, is Clojure a language here to stay? Is it portable enough?
running python module from django shell (manage.py)
18,869,159
0
1
3,142
0
python,database,django,api
Try django-extensions; it has a runscript command that runs scripts you put in a scripts directory. Oddly, the command seems to be missing from the main documentation, but if you google django-extensions runscript you will find examples and documentation.
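A minimal sketch of how runscript is typically laid out (the model and data are placeholders; the script just needs a top-level run() function):

    # scripts/populate.py inside your Django project (with django-extensions installed)
    from myapp.models import Player   # hypothetical model

    def run():
        # invoked via: python manage.py runscript populate
        for name in ("alice", "bob"):
            Player.objects.get_or_create(name=name)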
0
0
0
0
2012-05-12T04:12:00.000
3
0
false
10,561,025
0
0
1
1
I'm try to populate data into my db and I'd like to use the django API to do so. I've written a python module that allows me to this very easily. However, within the django shell (manage.py), it doesn't seem possible to run my "populate.py" file. Is it possible to run my populate.py from the django shell? Or alternatively, should I instead be running a regular python terminal and importing the necessary django components? Either way, I would greatly appreciate some pointers.
HTML scraping: iterating through nested directories
10,562,395
2
0
273
0
python,html,iteration,web-scraping
Recursion is usually the easiest way to go. However, that might hit the recursion limit after some time if someone creates a directory with a symlink to itself or a parent.
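A sketch of recursively walking an HTML directory listing with only the standard library (the URL handling, the trailing-slash "folder" rule and the loop guard are assumptions for illustration):

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links.extend(v for k, v in attrs if k == "href")

    def crawl(url, visited=None):
        visited = visited if visited is not None else set()
        if url in visited:               # guards against symlink-style loops
            return
        visited.add(url)
        parser = LinkParser()
        parser.feed(urlopen(url).read().decode("utf-8", "ignore"))
        for href in parser.links:
            target = urljoin(url, href)
            if target.endswith("/"):     # treat a trailing slash as a "folder"
                crawl(target, visited)
            else:
                print("file:", target)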
0
0
1
0
2012-05-12T09:07:00.000
2
0.197375
false
10,562,380
0
0
1
1
I need to scrape a website that has a basic folder system, with folders labled with keywords - some of the folders contain text files. I need to scan all the pages (folders) and check the links to new folders, record keywords and files. My main problem ise more abstract: if there is a directory with nested folders and unknown "depth", what is the most pythonc way to iterate through all of them. [if the "depth" would be known, it would be a really simple for loop). Ideas greatly appriciated.
I am getting incompatitble debugger version in eclipse while using pydev
10,644,024
0
0
113
0
python,eclipse,pydev
Maybe you have an old version of the debugger in your PYTHONPATH? Please check your interpreter configuration and see if you didn't add it there by accident.
0
1
0
0
2012-05-12T15:36:00.000
1
0
false
10,565,079
0
0
1
1
i get a incompatible debugger version in eclipse while using pydev my default port of the pydev debugger is 5678
How to find the error on web2py and GAE SDK when I got Internal error - Ticket issued: unknow
10,566,246
1
2
554
0
python,google-app-engine,web2py
You have to read the logs on GAE dashboard to figure out the Python exception it is throwing
0
1
0
0
2012-05-12T17:25:00.000
1
1.2
true
10,565,929
0
0
1
1
I'm working with Web2py and Google App Engine SDK. I have an action that works using the WSGI version, but fails when running on SDK. Inside this action, there are no imports specific from GAE libraries... but I can't figure out what is wrong cause I only got the message: Internal error Ticket issued: unknown And there is no ticket showing the error. How can I debug web2py when working with GAE and specifically in this case?
Django-Pinax : How do you use a pinax app apart from what you get with a pinax base project?
10,571,697
3
15
4,505
0
python,django,pinax,django-1.3
The problem that pinax solves is that it avoids you hunting around for the best app that does something, as pinax bundles it together for you. So if you want to get something up and running quickly, pinax makes that easy. For example, it is - by far - the quickest way to get a django project going with twitter bootstrap + other common plugins.
0
0
0
0
2012-05-13T04:26:00.000
5
0.119427
false
10,569,310
0
0
1
3
I am trying to understand Pinax and plan to use it in my next project. I have started with a pinax basic project, and now I have something to go with runserver. Now, I understand that I can customize the initial setup that I got from pinax and customize the profiles, themes, etc as per my requirements. But is that all that pinax provides ? I am very confused here, like I want to use the pinax phileo app in my project, so how does pinax helps me do that ? My Effort : I searched and found that I have to install it with pip install phileo Then, add it to INSTALLED_APPS and use it as required. But what did pinax do in this ? Pinax has phileo featured on its website, but why ? Since I could have used it just like any other app on my non-pinax django project. So, my question in a nutshell is : What does pinax provide after a base project and default templates that come with pinax ? Right, now it feels like pinax just provides a base project with some apps already working with some default templates. [ That's it ? ] Then, what about other apps featured on pinax's website that do not come with base projects ? Please, help clear up the confusion ! Update My question is somewhat - What is the significance of pinax-ecosystem when we already have them listed somewhere like djangopackages.com ?
Django-Pinax : How do you use a pinax app apart from what you get with a pinax base project?
10,908,828
9
15
4,505
0
python,django,pinax,django-1.3
You seem to be assuming that unless all of Pinax is useful, Pinax as a project isn't useful. It was never the intention that Pinax be a single thing, all of which you use on a given project. If all you find helpful is the project layout, that's fine. Pinax suggests a standard project layout (which you can use alone with pinax-project-zero). If all you find helpful is the pinax-project-account (django-user-accounts and a few other things, already integrated with templates following bootstrap class naming) as a starting point for you site, that's great. Pinax is fundamentally about getting you started sooner and pinax-project-account is a suitable starting point for most sites with user accounts. Once you have a project, you are free to add any Django apps you want. There's nothing that requires you to use Pinax apps. "So", you ask, "why does Pinax even bother having apps?". Well, because apps aren't isolated. Reusability isn't just at the level of app but also groups of apps. Take, for example, a waiting list app, an invitations app, a referral code app, a points app, a badges app. Sure these can be developed and used independently. But if they are developed with the same mind set you can make sure the waiting list app and invitations app and referral code app work well with the user account app (and don't duplicate anything). You can make sure the referral code app plays nicely with the points app and the points app plays nicely with the badges app. You can make sure your forum app doesn't try to do something your moderation app already provides. Or that each app isn't trying to solve avatars its own way. So Pinax isn't trying to be a "directory" of apps. It's a family of apps, themes and starter projects written with each other in mind.
0
0
0
0
2012-05-13T04:26:00.000
5
1.2
true
10,569,310
0
0
1
3
I am trying to understand Pinax and plan to use it in my next project. I have started with a pinax basic project, and now I have something to go with runserver. Now, I understand that I can customize the initial setup that I got from pinax and customize the profiles, themes, etc as per my requirements. But is that all that pinax provides ? I am very confused here, like I want to use the pinax phileo app in my project, so how does pinax helps me do that ? My Effort : I searched and found that I have to install it with pip install phileo Then, add it to INSTALLED_APPS and use it as required. But what did pinax do in this ? Pinax has phileo featured on its website, but why ? Since I could have used it just like any other app on my non-pinax django project. So, my question in a nutshell is : What does pinax provide after a base project and default templates that come with pinax ? Right, now it feels like pinax just provides a base project with some apps already working with some default templates. [ That's it ? ] Then, what about other apps featured on pinax's website that do not come with base projects ? Please, help clear up the confusion ! Update My question is somewhat - What is the significance of pinax-ecosystem when we already have them listed somewhere like djangopackages.com ?
Django-Pinax : How do you use a pinax app apart from what you get with a pinax base project?
10,692,708
1
15
4,505
0
python,django,pinax,django-1.3
Pinax 0.7 was bundled with some apps and starter projects, like social_projects, which could be used to build a site more quickly, but things changed in Pinax 0.9. I think Pinax reinvented its structure for a few reasons (e.g. in Pinax 0.7 some people were complaining that it was messy to customize starter projects or to use only a subset of a starter project, so Pinax 0.9 provides more flexibility). Some of the apps included in a Pinax website are coupled with each other in a way that makes it easier to deploy them together; sometimes all you have to do is install them and the apps will communicate with each other.
0
0
0
0
2012-05-13T04:26:00.000
5
0.039979
false
10,569,310
0
0
1
3
I am trying to understand Pinax and plan to use it in my next project. I have started with a pinax basic project, and now I have something to go with runserver. Now, I understand that I can customize the initial setup that I got from pinax and customize the profiles, themes, etc as per my requirements. But is that all that pinax provides ? I am very confused here, like I want to use the pinax phileo app in my project, so how does pinax helps me do that ? My Effort : I searched and found that I have to install it with pip install phileo Then, add it to INSTALLED_APPS and use it as required. But what did pinax do in this ? Pinax has phileo featured on its website, but why ? Since I could have used it just like any other app on my non-pinax django project. So, my question in a nutshell is : What does pinax provide after a base project and default templates that come with pinax ? Right, now it feels like pinax just provides a base project with some apps already working with some default templates. [ That's it ? ] Then, what about other apps featured on pinax's website that do not come with base projects ? Please, help clear up the confusion ! Update My question is somewhat - What is the significance of pinax-ecosystem when we already have them listed somewhere like djangopackages.com ?
Multiple logins accessing specific results without using groups in Django
10,572,800
0
0
43
0
python,django
What I can think of: You can use a custom authentication backend. Allow someone to enter a playername and password (or whatever you use for authorisation). Log in the user associated with that player (in order to be able to keep using Django's authentication system), but store which player was selected. When editing player results, instead of checking only whether the player belongs to the correct user, also check whether it is in fact the player logged in as (if any).
0
0
0
0
2012-05-13T04:37:00.000
1
0
false
10,569,341
0
0
1
1
I have a Django app set up using standard Django user auth. In which users can log in, add their players (Player model - Foreign Key to User model), and add results for their players (Results model - Foreign Key to Player model). I would like to also have the players (each entry in Player model) that the user creates be able to log in, but only be able to edit/add their own results. Just to be clear, the user and player would see the same results for that player, and each be able to add and edit them. I don't believe this can be done with groups as the group would have to be user specific. Can anyone point me in the right direction please?
Send GET request to Plone form built with zope.formlib
10,571,827
2
2
319
0
python,plone
zope.formlib applies actions based on the presence of the action name in the request; usually this is done by giving the submit button the name of the action. By including that name in your GET request you thus invoke the action. Note that actions are prefixed by both the form identifier and the 'action' keyword, so the save action will generally be using the parameter name form.action.save: @@page?param1=myparam1&param2=myparam2&form.action.save=Save Easiest way to discover the exact name of your action parameters is to just look at the output generated for your form and look for the .action. names.
0
0
0
0
2012-05-13T09:23:00.000
1
1.2
true
10,570,652
0
0
1
1
I have a form built with zope.formlib. When I fill in the form and push submit, I can see the result in the browser. Now I would like to reference these results from other places, so the results would be available to users without filling in the form. I've tried to build a URL by adding the parameters after the form URL to do a GET request, like this: @@page?param1=myparam1&param2=myparam2 This way I can't get the result to load; it only shows the form. Is there something missing? Does zope.formlib allow GET requests?
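A quick way to exercise the accepted suggestion from Python. The host, site path and parameter names below merely mirror the example URL in the answer; substitute your own (Python 2-style urllib/urllib2, to match the era).

```python
# Sketch only: the URL pieces are placeholders for your Plone site and form.
import urllib
import urllib2

params = urllib.urlencode({
    'param1': 'myparam1',
    'param2': 'myparam2',
    'form.action.save': 'Save',  # including the action name triggers the form action
})
url = 'http://localhost:8080/plone-site/@@page?' + params
print(urllib2.urlopen(url).read())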
How can I disable the Django Celery admin modules?
10,572,634
1
8
3,492
0
python,django,django-admin,celery,django-celery
You can simply unregister django-celery's models, e.g. admin.site.unregister(CeleryModelIdoNotWantInAdmin). A fuller sketch follows below.
0
0
0
0
2012-05-13T13:03:00.000
3
0.066568
false
10,571,960
0
0
1
1
I have no need for the celery modules in my Django admin. Is there a way I could remove them?
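A slightly fuller sketch of the unregister approach from the answer above. The model names are the ones django-celery registered in its admin around that time; they may differ between versions, so check djcelery/admin.py if anything looks off.

```python
# Put this in one of your own admin.py files, loaded after django-celery's admin.
from django.contrib import admin
from djcelery.models import (CrontabSchedule, IntervalSchedule,
                             PeriodicTask, TaskState, WorkerState)

for model in (CrontabSchedule, IntervalSchedule, PeriodicTask,
              TaskState, WorkerState):
    try:
        admin.site.unregister(model)
    except admin.sites.NotRegistered:
        pass  # not registered in this django-celery version; nothing to hide
```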
Where are the logs from BackgroundThreads on App Engine?
10,573,307
3
2
354
0
python,google-app-engine,backend,background-thread
There is a combobox in the top left corner of the admin console that lists your application's versions and backends; switch to the backend there and you will see the backend logs.
0
1
0
0
2012-05-13T16:13:00.000
1
0.53705
false
10,573,217
0
0
1
1
I'm writing an app that writes log entries from a BackgroundThread object on a backend instance. My problem is that I don't know how to access the logs. The docs say, "A background thread's os.environ and logging entries are independent of those of the spawning thread," and indeed, the log entries don't show up with the backend instance's entries on the admin console. But the admin console doesn't offer an option for showing the background threads. appcfg request_logs doesn't seem to be the answer either. Does anybody know?
Localhost is not refreshing/reseting
10,575,238
2
0
437
1
python,google-app-engine
Those warnings shouldn't prevent you from seeing new content; they simply mean that you are missing some libraries necessary to run local versions of Cloud SQL (MySQL) and the Images API. The first thing to do is to try clearing your browser cache. What changes did you make to your Hello World app?
0
1
0
0
2012-05-13T20:57:00.000
3
0.132549
false
10,575,184
0
0
1
3
I am an absolute beginner using Google App Engine with Python 2.7. I was successful in creating the helloworld app, but any changes I make to the original app don't show up at localhost:8080. Is there a way to reset/refresh the localhost? I tried creating new projects/directories with different content, but my localhost constantly shows the old "Hello world!" I get the following in the log window: WARNING 2012-05-13 20:54:25,536 rdbms_mysqldb.py:74] The rdbms API is not available because the MySQLdb library could not be loaded. WARNING 2012-05-13 20:54:26,496 datastore_file_stub.py:518] Could not read datastore data from c:\users\tomek\appdata\local\temp\dev_appserver.datastore WARNING 2012-05-13 20:54:26,555 dev_appserver.py:3401] Could not initialize images API; you are likely missing the Python "PIL" module. ImportError: No module named _imaging Please help...
Localhost is not refreshing/reseting
10,593,822
0
0
437
1
python,google-app-engine
Press CTRL-F5 in your browser while on the page; this forces a cache refresh.
0
1
0
0
2012-05-13T20:57:00.000
3
0
false
10,575,184
0
0
1
3
I am an absolute beginner using Google App Engine with Python 2.7. I was successful in creating the helloworld app, but any changes I make to the original app don't show up at localhost:8080. Is there a way to reset/refresh the localhost? I tried creating new projects/directories with different content, but my localhost constantly shows the old "Hello world!" I get the following in the log window: WARNING 2012-05-13 20:54:25,536 rdbms_mysqldb.py:74] The rdbms API is not available because the MySQLdb library could not be loaded. WARNING 2012-05-13 20:54:26,496 datastore_file_stub.py:518] Could not read datastore data from c:\users\tomek\appdata\local\temp\dev_appserver.datastore WARNING 2012-05-13 20:54:26,555 dev_appserver.py:3401] Could not initialize images API; you are likely missing the Python "PIL" module. ImportError: No module named _imaging Please help...
Localhost is not refreshing/reseting
41,388,817
0
0
437
1
python,google-app-engine
You can try opening up the DOM inspector (Mac: alt+command+i, Windows: shift+control+i), then reloading the page. It's weird, but it works for me.
0
1
0
0
2012-05-13T20:57:00.000
3
0
false
10,575,184
0
0
1
3
I am an absolute beginner using Google App Engine with Python 2.7. I was successful in creating the helloworld app, but any changes I make to the original app don't show up at localhost:8080. Is there a way to reset/refresh the localhost? I tried creating new projects/directories with different content, but my localhost constantly shows the old "Hello world!" I get the following in the log window: WARNING 2012-05-13 20:54:25,536 rdbms_mysqldb.py:74] The rdbms API is not available because the MySQLdb library could not be loaded. WARNING 2012-05-13 20:54:26,496 datastore_file_stub.py:518] Could not read datastore data from c:\users\tomek\appdata\local\temp\dev_appserver.datastore WARNING 2012-05-13 20:54:26,555 dev_appserver.py:3401] Could not initialize images API; you are likely missing the Python "PIL" module. ImportError: No module named _imaging Please help...
Python with HTML front-end
10,576,447
0
2
4,095
0
python,html
Yes, it is possible. You can use a web framework such as Django to do this. You receive the input from the HTML form via the name attributes of its fields, and when you send the output back you can use the value attribute. Web frameworks use handlers (views) to receive data from, and post data back to, the HTML form. In order to use a web framework, you need to install it on your server, just like you installed Python on your Linux server. A minimal sketch follows below.
0
0
0
0
2012-05-13T21:34:00.000
2
0
false
10,575,432
0
0
1
1
I have a Python script that runs on my Linux server. I want to make a simple HTML page to be the "front end" of my script. It will just take 2 values from a web form (that the user will enter when accessing the webpage) and send them to the Python script in the background as variables; then Python will operate on these values and make all the stuff work. Is it possible? Thanks!
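A minimal sketch of the idea from the answer above, assuming Django. The field names, URL wiring and the placeholder function are made up for illustration; CSRF handling and the template itself are left out for brevity.

```python
# views.py (hypothetical): the HTML form would contain
# <input name="value1"> and <input name="value2"> posting to this view.
from django.http import HttpResponse


def my_existing_logic(a, b):
    # Placeholder for the real script's work; swap in your own function.
    return "%s + %s" % (a, b)


def run_script(request):
    value1 = request.POST.get('value1', '')
    value2 = request.POST.get('value2', '')
    return HttpResponse("Result: %s" % my_existing_logic(value1, value2))
```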
How to use Products.csvreplicata 1.1.7 with Products.PressRoom to export PressContacts in Plone 4.1
10,596,347
1
1
153
0
python,plone
Go to Site setup / CSV Replicata tool, and select PressRoom content(s) as exportable (and then select the schemata you want to be considered during import/export).
0
0
0
0
2012-05-14T05:26:00.000
1
1.2
true
10,577,866
0
1
1
1
How to use Products.csvreplicata 1.1.7 with Products.PressRoom 3.18 to export PressContacts to csv in Plone 4.1? Or is there any other product to import/export all the PressRoom contacts into csv.
Selenium: Testing pop-up windows
10,586,476
0
5
7,687
0
python,html,selenium
For the Selenium RC API, you need to use the SelectWindow command to switch to the pop-up window. The window can be specified either by its name (as given in the JavaScript window.open() call) or by its title. To switch back to the main window, use SelectWindow(None). A minimal sketch follows below.
0
0
1
0
2012-05-14T09:36:00.000
2
0
false
10,580,772
0
0
1
1
I have an issue when trying to test a web application with Selenium/Python. Basically I can't test elements of a pop-up window. A scenario: I can test all elements of a page, but when I click on a button that opens up a small pop-up box I can't test the elements on the pop-up. It's like the pop-up isn't in focus or active. I can test elements on the next page; for example, clicking a button brings me on to the next page, and I can work with elements on the 'next' page. So the problem seems to be popup-specific. I could post code, but to be honest it might confuse things at this stage; I may post code in a later post. Thanks.
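A minimal Selenium RC sketch of the SelectWindow suggestion from the answer above. The locators and the pop-up name "popupName" are placeholders for whatever your page and its window.open() call actually use.

```python
from selenium import selenium  # old Selenium RC Python client

sel = selenium("localhost", 4444, "*firefox", "http://example.com/")
sel.start()
sel.open("/page-with-popup")
sel.click("id=open-popup-button")          # button that opens the pop-up
sel.wait_for_pop_up("popupName", "30000")  # name passed to window.open()
sel.select_window("popupName")             # focus the pop-up window
sel.click("id=button-inside-popup")        # now its elements are reachable
sel.select_window("null")                  # back to the main window
sel.stop()
```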
How to use mysql in gevent based programs in python?
12,335,813
1
4
1,197
1
python,mysql,gevent
Postgres may be better suited due to its asynchronous capabilities
0
1
0
0
2012-05-14T09:41:00.000
2
1.2
true
10,580,835
0
0
1
2
I have found that ultramysql meets my requirement, but it has no documentation and no Windows binary package. I have a program heavy on internet downloads and MySQL inserts, so I use gevent to solve the multi-download-task problem. After I download and parse the web pages, I need to insert the data into MySQL. Does monkey.patch_all() make MySQL operations async? Can anyone show me a correct way to go?
How to use mysql in gevent based programs in python?
13,006,283
1
4
1,197
1
python,mysql,gevent
I think one solution is to use pymysql. Since pymysql uses plain Python sockets, it should work with gevent after the monkey patch. A rough sketch follows below.
0
1
0
0
2012-05-14T09:41:00.000
2
0.099668
false
10,580,835
0
0
1
2
I have found that ultramysql meets my requirement, but it has no documentation and no Windows binary package. I have a program heavy on internet downloads and MySQL inserts, so I use gevent to solve the multi-download-task problem. After I download and parse the web pages, I need to insert the data into MySQL. Does monkey.patch_all() make MySQL operations async? Can anyone show me a correct way to go?
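A rough sketch of the pymysql suggestion above. The connection details, table and parsing step are placeholders; the important parts are that monkey.patch_all() runs before anything opens a socket, and that pymysql is pure Python, so its sockets get greened too.

```python
from gevent import monkey
monkey.patch_all()  # must happen before other imports create sockets

import gevent
import pymysql
import urllib2


def download_and_parse(url):
    # Placeholder for the real download/parse step.
    return urllib2.urlopen(url).read()[:200]


def fetch_and_store(url):
    body = download_and_parse(url)
    conn = pymysql.connect(host='localhost', user='me', passwd='secret', db='crawl')
    try:
        cur = conn.cursor()
        cur.execute("INSERT INTO pages (url, body) VALUES (%s, %s)", (url, body))
        conn.commit()
    finally:
        conn.close()


urls = ['http://example.com/a', 'http://example.com/b']
gevent.joinall([gevent.spawn(fetch_and_store, u) for u in urls])
```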
Building an infrastructure for developing web-applications using multiple programming languages(Python Java C#)
10,581,791
1
1
274
0
java,python,architecture
I am not sure if I understood the question correctly, but I suppose that you want to build a web app out of multiple languages. My first guess is service-oriented programming: you build services in multiple languages and they communicate through JSON or XML. A toy sketch of such a service follows below.
0
0
0
0
2012-05-14T10:37:00.000
3
1.2
true
10,581,737
1
0
1
3
I've searched for this in different places but I haven't found an exhaustive answer. Suppose we have separate modules written in different languages, each of which realizes a certain part of the logic. We do not have permission to recompile them into one particular language (for example using JPython). I'm a real novice at this part of programming, so it is hard to find the words to describe this problem. It seems I am looking for something like Maven, but for multiple languages, and in addition the modules may be precompiled. Is this theoretically possible?
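A toy illustration of the service-oriented idea in the answer above: each module, whatever its language, exposes a small HTTP endpoint that speaks JSON. The port and payload here are arbitrary, and only the Python 2 standard library is used.

```python
import json
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer  # Python 2 stdlib


class ModuleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Whatever this module computes, serialised as JSON for the other services.
        payload = json.dumps({"module": "pricing", "answer": 42})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)


if __name__ == "__main__":
    HTTPServer(("", 8001), ModuleHandler).serve_forever()
```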
Building an infrastructure for developing web-applications using multiple programming languages(Python Java C#)
10,582,487
1
1
274
0
java,python,architecture
I think rocco337's service-oriented approach is really a good idea, but there's one slight downside to it: the traffic between your services caused by all those HTTP requests. I heard Amazon suffered from this, but they managed it, I guess, because they are a giant. The alternative below has its own downside too; think of it as a quick-and-dirty option. A web application I built recently was based on Python, PHP and a bunch of C modules, and the way I mingled them was with simple command-line and shell scripts; Python works really well as a glue language (a bare-bones sketch follows below). A. Asynchronous approach (when your module needs more than a few seconds to finish its job): open up a thread, run the command-line application (written in Java, C#, whatever), show whatever view you want while waiting for the result; when you get the result from the command line, let the user reload or use AJAX to refresh your view with the result. B. Synchronous approach (the job is fairly simple): run the command-line application, wait until you get the result, then show the user a view with that result. Good luck with your project!
0
0
0
0
2012-05-14T10:37:00.000
3
0.066568
false
10,581,737
1
0
1
3
I've searched for this in different places but I haven't found an exhaustive answer. Suppose we have separate modules written in different languages, each of which realizes a certain part of the logic. We do not have permission to recompile them into one particular language (for example using JPython). I'm a real novice at this part of programming, so it is hard to find the words to describe this problem. It seems I am looking for something like Maven, but for multiple languages, and in addition the modules may be precompiled. Is this theoretically possible?
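A bare-bones version of the command-line glue described in the answer above (the synchronous "B" case). The jar name and arguments are invented; the same pattern works for any external binary.

```python
import subprocess


def run_legacy_module(arg):
    # Run the external module (Java here, but C# or anything else works the same way).
    proc = subprocess.Popen(["java", "-jar", "legacy-module.jar", arg],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = proc.communicate()
    if proc.returncode != 0:
        raise RuntimeError("module failed: %s" % err)
    return out


if __name__ == "__main__":
    print(run_legacy_module("some-input"))
```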
Building an infrastructure for developing web-applications using multiple programming languages(Python Java C#)
10,581,810
1
1
274
0
java,python,architecture
It is certainly possible. For example, I have developed web applications using a mix of: Java (for back-end interfaces, APIs and performance-sensitive code), Clojure (for the main web app on the server side), and JavaScript (for client-side code in the browser). Overall, this has worked pretty well in that you can use each of the languages for its specific strengths, though you should be warned that it does require you to be pretty multi-skilled in all the languages to be effective, and it does require a lot more configuration of your environment to get everything working smoothly. Tricks I have found useful: be very clear about why you are using each language (for example, JavaScript might be limited strictly to client-side code in the browser); it helps enormously if the languages run on the same platform (for example, Clojure and Java both run on the JVM, which makes interoperability much easier as you don't need any extra code for interfacing between the two); use an IDE that supports all of the languages you intend to use (in my case Eclipse); and use a good build system with multi-language support, since you want to be able to do a "one click build" that includes all the languages. IMHO Maven is the best tool for this (especially if you use something like the Eclipse plugin for integration with the IDE).
0
0
0
0
2012-05-14T10:37:00.000
3
0.066568
false
10,581,737
1
0
1
3
I've searched for this in different places but I haven't found an exhaustive answer. Suppose we have separate modules written in different languages, each of which realizes a certain part of the logic. We do not have permission to recompile them into one particular language (for example using JPython). I'm a real novice at this part of programming, so it is hard to find the words to describe this problem. It seems I am looking for something like Maven, but for multiple languages, and in addition the modules may be precompiled. Is this theoretically possible?
Django:limiting user to login once at a time
10,588,902
0
2
1,188
0
python,django
You will want to use Django sessions in your login view. Depending on how you set up the login view, you might want to query the Session objects, filter them, and then compare datetime.now() against NameOfQuerySessionVariable.expire_date. A minimal sketch follows below.
0
0
0
0
2012-05-14T17:33:00.000
1
0
false
10,588,289
0
0
1
1
Django allows a user to be logged in from multiple computers, in different sessions. Is there a way to limit a user from logging in from multiple machines at the same time? That is, if there's a live session with the user logged in on a browser or a computer, you must not allow him to log in at another computer. This would be a useful hack for security purposes. Please advise.
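A minimal sketch of the session-based idea in the answer above: right after a successful login, drop every other live session that belongs to the same user. Scanning all sessions is fine for small sites but does not scale well; storing the current session key on the user's profile would be the tidier variant.

```python
from datetime import datetime

from django.contrib.sessions.models import Session


def drop_other_sessions(user, current_session_key):
    """Call from the login view right after login(request, user)."""
    for session in Session.objects.filter(expire_date__gt=datetime.now()):
        data = session.get_decoded()
        if (str(data.get('_auth_user_id')) == str(user.pk)
                and session.session_key != current_session_key):
            session.delete()  # forces the older login out
```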
Consuming a RESTful API with Django
10,591,191
1
11
6,261
0
python,django,django-models
Make the REST calls using the built-in urllib (a bit clunky but functional) and wrap the interface in a class, with a method for each remote call. Your class can then translate to and from native Python types. That is what I'd do, anyway! A rough sketch follows below.
0
0
0
0
2012-05-14T20:24:00.000
2
0.099668
false
10,590,497
0
0
1
1
I'm building a Django application that needs to interact with a 3rd party RESTful API, making various GETs, PUTs, etc to that resource. What I'm looking for is a good way to represent that API within Django. The most obvious, but perhaps less elegant solution seems to be creating a model that has various methods mapping to webservice queries. On the other hand, it seems that using something like a custom DB backend would provide more flexibility and be better integrated into Django's ORM. Caveat: This is the first real project I've done with Django, so it's possible I'm missing something obvious here.
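A rough cut of the "wrap it in a plain class" suggestion from the answer above. The base URL, endpoint and JSON shape are invented, and urllib2 is used to match the Python 2 era; swap in whatever the real API expects.

```python
import json
import urllib
import urllib2


class ThirdPartyClient(object):
    """Thin wrapper: one method per remote call, JSON in and out."""

    def __init__(self, base_url, api_key):
        self.base_url = base_url.rstrip('/')
        self.api_key = api_key

    def _get(self, path, **params):
        params['api_key'] = self.api_key
        url = "%s/%s?%s" % (self.base_url, path, urllib.urlencode(params))
        return json.loads(urllib2.urlopen(url).read())

    def _put(self, path, payload):
        req = urllib2.Request("%s/%s" % (self.base_url, path),
                              data=json.dumps(payload),
                              headers={'Content-Type': 'application/json'})
        req.get_method = lambda: 'PUT'  # urllib2 only does GET/POST by default
        return json.loads(urllib2.urlopen(req).read())

    def get_widget(self, widget_id):
        return self._get("widgets/%s" % widget_id)


# usage (hypothetical endpoint):
# client = ThirdPartyClient("https://api.example.com/v1", "secret")
# widget = client.get_widget(123)
```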