Columns (name, type, range):
Title (string): lengths 11 to 150
A_Id (int64): 518 to 72.5M
Users Score (int64): -42 to 283
Q_Score (int64): 0 to 1.39k
ViewCount (int64): 17 to 1.71M
Database and SQL (int64): 0 to 1
Tags (string): lengths 6 to 105
Answer (string): lengths 14 to 4.78k
GUI and Desktop Applications (int64): 0 to 1
System Administration and DevOps (int64): 0 to 1
Networking and APIs (int64): 0 to 1
Other (int64): 0 to 1
CreationDate (string): lengths 23 to 23
AnswerCount (int64): 1 to 55
Score (float64): -1 to 1.2
is_accepted (bool): 2 classes
Q_Id (int64): 469 to 42.4M
Python Basics and Environment (int64): 0 to 1
Data Science and Machine Learning (int64): 0 to 1
Web Development (int64): 1 to 1
Available Count (int64): 1 to 15
Question (string): lengths 17 to 21k
ImportError: cannot import name signals
24,360,080
-1
7
16,619
0
python,django,unit-testing,importerror
This is easy to solve. If you have already written settings.py (most likely), navigate into the directory that contains the settings.py file, start python, and run import settings. These commands should do the trick. Then go to any folder and continue with execution.
0
0
0
0
2011-05-12T19:06:00.000
3
-0.066568
false
5,983,100
0
0
1
3
I'm using Django 1.3.0 with Python 2.7.1. In every test where I write the following imports I get the ImportError above: from django.utils import unittest from django.test.client import Client The full stack trace: File "C:\Program Files (x86)\j2ee\plugins\org.python.pydev.debug_1.6.3.2010100513\pysrc\runfiles.py", line 342, in __get_module_from_str mod = __import__(modname) File "C:/Users/benjamin/workspace/BookIt/src/BookIt/tests\basic_flow.py", line 11, in from django.test.client import Client File "C:\Python27\lib\site-packages\django\test\__init__.py", line 5, in from django.test.client import Client, RequestFactory File "C:\Python27\lib\site-packages\django\test\client.py", line 21, in from django.test import signals ImportError: cannot import name signals ERROR: Module: basic_flow could not be imported. Any ideas why this is happening?
Should I migrate from Django 1.2.5 to 1.3.x? Are there any undocumented issues?
5,987,736
0
7
929
0
python,django,release-management
Migrating to a new version of Django (especially from the directly previous release) is really easy and takes less than ten minutes for one project (if it's not super complex). If it's running for 18 to 24 months I would rather upgrade. After one or two more releases of Django (one major release each year) they will simply not provide security updates for older versions anymore. To be safe you can make the step to 1.3 and then won't need to upgrade further, unless your project runs even longer.
0
0
0
0
2011-05-13T01:36:00.000
3
0
false
5,986,388
0
0
1
2
Short Question Are there any compelling reasons to update Django 1.2.5 to 1.3, if the noted changes in the release notes do not impact my application directly? EDIT: To better clarify my question (thanks S.Lott!): Are there any issues not mentioned in Django 1.3's release notes that I should be aware of if upgrading from version 1.2.5? Background I have done all of my development on 1.2.5 with no troubles. I will be deploying my application within the next week or two and its life cycle will be 18 - 24 months of constant use. Secondary (more general question) Just prior to a release, is it good practice to get the latest and greatest (stable) version of your dependencies?
Should I migrate from Django 1.2.5 to 1.3.x? Are there any undocumented issues?
5,990,072
0
7
929
0
python,django,release-management
As I once read somewhere, Django has releases mostly because people ask about releases. Other than that, it's perfectly ok to stick to the trunk revision.
0
0
0
0
2011-05-13T01:36:00.000
3
0
false
5,986,388
0
0
1
2
Short Question Are there any compelling reasons to update Django 1.2.5 to 1.3, if the noted changes in the release notes do not impact my application directly? EDIT: To better clarify my question (thanks S.Lott!): Are there any issues not mentioned in Django 1.3's release notes that I should be aware of if upgrading from version 1.2.5? Background I have done all of my development on 1.2.5 with no troubles. I will be deploying my application within the next week or two and its life cycle will be 18 - 24 months of constant use. Secondary (more general question) Just prior to a release, is it good practice to get the latest and greatest (stable) version of your dependencies?
Linux user scheme for a Django production server
5,986,912
2
6
925
0
python,django,linux,nginx,uwsgi
I like having regular users on a system: multiple admins show up in sudo logs -- there's nothing quite like asking a specific person why they made a specific change. Not all tasks require admin privileges, but admin-level mistakes can be more costly to repair. It is easier to manage ~/.ssh/authorized_keys if each file contains only keys from a specific user -- if you get four or five different users in the file, it's harder to manage. Small point :) but it is so easy to write cat ~/.ssh/id_rsa.pub | ssh user@remotehost "cat - > ~/.ssh/authorized_keys" -- if one must use >> instead, it's precarious. :) But you're right, you can do all your work as root and not bother with regular user accounts.
0
1
0
0
2011-05-13T01:51:00.000
1
1.2
true
5,986,472
0
0
1
1
I'm currently trying to set up an nginx + uWSGI server for my Django homepage. Some tutorials advise me to create specific UNIX users for certain daemons, like an nginx user for the nginx daemon and so on. As I'm new to Linux administration, I thought I would just create a second user for running all the processes (nginx, uWSGI etc.), but it turned out that I need some --system users for that. The main question is: what users would you set up for an nginx + uWSGI server and how would you work with them? Say I have a server with freshly installed Debian Squeeze. Should I install all the packages, the virtual environment and set up all the directories as the root user and then create system users to run the scripts?
Detect if .swf has transparent background
6,071,331
0
9
625
0
java,php,python,flash
The swf files are handled by plugins running inside or outside the browser. There is no way to tell if a certain flash has a transparent background or not... I guess you can assume that all of them are transparent.
0
0
0
0
2011-05-13T06:41:00.000
6
0
false
5,988,172
0
0
1
4
Is there a way to do this programmatically in PHP, Python or Java? Use case: User uploads .swf through an upload form. Detect if it has a transparent background. If it does, change it to something else, e.g. white.
Detect if .swf has transparent background
6,078,955
1
9
625
0
java,php,python,flash
Flash is a bunch of compiled binary data. You might be able to hunt down a third-party library which can decompile the flash file and get some of the data from the root movie scene, but it's a large stretch. As others have recommended, a checkbox during upload would be the best route to take.
0
0
0
0
2011-05-13T06:41:00.000
6
0.033321
false
5,988,172
0
0
1
4
Is there a way to do this programmatically in PHP, Python or Java? Use case: User uploads .swf through an upload form. Detect if it has a transparent background. If it does, change it to something else, e.g. white.
Detect if .swf has transparent background
6,060,270
0
9
625
0
java,php,python,flash
I think you can read the .html where the .swf is placed and see if there is a parameter called "wmode" that is set to "transparent". If that's the case you can get the html file and read it with Python, Java or PHP. It's an ugly solution, but it's a solution. PS: Sorry for my poor English.
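For what it's worth, a minimal sketch of that check in Python, assuming you already have the embedding HTML as a string (the file name is a placeholder):

```python
import re

def embeds_transparent_swf(html):
    # Looks for "wmode" followed by "transparent" within the same tag, which covers both
    # the wmode="transparent" attribute and the <param name="wmode" value="transparent"> form.
    # This only inspects the HTML around the SWF, not the SWF file itself.
    return re.search(r'wmode[^>]*transparent', html, re.IGNORECASE) is not None

with open("page.html") as f:  # placeholder file containing the embed markup
    print(embeds_transparent_swf(f.read()))
```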
0
0
0
0
2011-05-13T06:41:00.000
6
0
false
5,988,172
0
0
1
4
Is there a way to do this programmatically in PHP, Python or Java? Use case: User uploads .swf through an upload form. Detect if it has a transparent background. If it does, change it to something else, e.g. white.
Detect if .swf has transparent background
6,079,767
4
9
625
0
java,php,python,flash
This is very difficult to determine, because all SWFs are actually transparent (or they all can be) - it's just that they can contain shapes which cover the entire stage, which make them appear non-transparent. To determine this programmatically, you'd need to loop over every shape in the SWF and look at its bounds. However, that won't be enough since any shape can be changed by ActionScript blocks and new shapes can be created at run time. But really, making them transparent is the hard task. Since you want to make sure they appear non-transparent, you can just deal with this at display time - put a colored div behind the object or embed tag.
0
0
0
0
2011-05-13T06:41:00.000
6
1.2
true
5,988,172
0
0
1
4
Is there a way to do this programmatically in PHP, Python or Java? Use case: User uploads .swf through an upload form. Detect if it has a transparent background. If it does, change it to something else, e.g. white.
Python: tool to keep track of deployments
5,988,654
1
5
332
0
python,deployment,web-deployment
pip freeze gives you a listing of all installed packages. Bonus: if you redirect the output to a file, you can use it as part of your deployment process to install all those packages (pip can programmatically install all packages from the file). I see you're already using virtualenv. Good. You can run pip freeze -E myvirtualenv > myproject.reqs to generate a dependency file that doubles as a status report of the Python environment.
0
1
0
0
2011-05-13T06:41:00.000
2
0.099668
false
5,988,177
0
0
1
1
I'm looking for a tool to keep track of "what's running where". We have a bunch of servers, and on each of those a bunch of projects. These projects may be running on a specific version (hg tag/commit nr) and have their requirements at specific versions as well. Fabric looks like a great start to do the actual deployments by automating the ssh part. However, once a deployment is done there is no overview of what was done. Before reinventing the wheel I'd like to check here on SO as well (I did my best w/ Google but could be looking for the wrong keywords). Is there any such tool already? (In practice I'm deploying Django projects, but I'm not sure that's relevant for the question; anything that keeps track of pip/virtualenv installs or server state in general should be fine) many thanks, Klaas ========== EDIT FOR TEMP. SOLUTION ========== For now, we've chosen to simply store this information in a simple key-value store (in our case: the filesystem) that we take great care to back up (in our case: using a DCVS). We keep track of this store with the same deployment tool that we use to do the actual deploys (in our case: fabric) Passwords are stored inside a TrueCrypt volume that's stored inside our key-value store. ========== I will still gladly accept any answer when some kind of Open Source solution to this problem pops up somewhere. I might share (part of) our solution somewhere myself in the near future.
PyCharm: DJANGO_SETTINGS_MODULE is undefined
65,421,376
2
21
18,748
0
python,django,environment-variables,pycharm
I faced the same problem and the proposed solutions did not work. I solved it by going to "File > Settings...", then searching for "Django console" and adding the following environment variable: DJANGO_SETTINGS_MODULE=mysite.settings
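For reference, the same thing can be done in code before Django settings are imported; a minimal sketch where mysite.settings stands in for your own settings module:

```python
import os

# Equivalent of the environment variable set in the PyCharm run/console configuration.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
```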
0
0
0
0
2011-05-13T07:23:00.000
6
0.066568
false
5,988,597
1
0
1
3
I am using PyCharm IDE and when I run any file.py I get this error: ..raise ImportError("Settings cannot be imported, because environment variable %s is undefined." % ENVIRONMENT_VARIABLE) ImportError: Settings cannot be imported, because environment variable DJANGO_SETTINGS_MODULE is undefined. How to configure DJANGO_SETTINGS_MODULE environment variable in PyCharm?
PyCharm: DJANGO_SETTINGS_MODULE is undefined
7,150,995
9
21
18,748
0
python,django,environment-variables,pycharm
You can go to the toolbar dropdown, select your project -> Edit Configurations, and in the Environment variables area enter DJANGO_SETTINGS_MODULE=mysite.settings, then save.
0
0
0
0
2011-05-13T07:23:00.000
6
1
false
5,988,597
1
0
1
3
I am using PyCharm IDE and when I run any file.py I get this error: ..raise ImportError("Settings cannot be imported, because environment variable %s is undefined." % ENVIRONMENT_VARIABLE) ImportError: Settings cannot be imported, because environment variable DJANGO_SETTINGS_MODULE is undefined. How to configure DJANGO_SETTINGS_MODULE environment variable in PyCharm?
PyCharm: DJANGO_SETTINGS_MODULE is undefined
55,761,299
0
21
18,748
0
python,django,environment-variables,pycharm
For newcomers, here is some additional information. In PyCharm 2019.1 the place to set DJANGO_SETTINGS_MODULE is: Settings -> Languages and Frameworks -> Django, in the right pane. In the field just below the Django project root you can give the path of your settings file, and in the environment variables field you can set the DJANGO_SETTINGS_MODULE value.
0
0
0
0
2011-05-13T07:23:00.000
6
0
false
5,988,597
1
0
1
3
I am using PyCharm IDE and when I run any file.py I get this error: ..raise ImportError("Settings cannot be imported, because environment variable %s is undefined." % ENVIRONMENT_VARIABLE) ImportError: Settings cannot be imported, because environment variable DJANGO_SETTINGS_MODULE is undefined. How to configure DJANGO_SETTINGS_MODULE environment variable in PyCharm?
How to push changes through django?
5,995,348
0
0
2,657
0
python,django
Sounds like an important task ;) Your python scripts (.py) are compiled to bytecode files (.pyc) automatically. If the .py file has changed or there is no .pyc file, your python interpreter will automatically compile. AFAIK, template files (.html) are not cached in memory. But .pyc files are, and often it is necessary to tell apache to reload. To me, it sounds more like you're changing the wrong .html file or the wrong part of the html code. With respect to compiling, python is much like php, except that it saves its bytecode for later use.
0
0
0
0
2011-05-13T16:24:00.000
2
1.2
true
5,994,922
0
0
1
1
I've been given the keys to a website written in Python on the Django framework. I have a php background, so I'm not used to having to worry about compilation, etc. I'm just trying to change the copyright in the footer. I found the template that holds the html... it's called base.html, in the 'templates' folder. I changed the copyright from 2010 to 2011, but it's not showing up on the site. Do I need to recompile anything in order for my changes to show up?
Timed Quiz: How to consider internet interruptions?
5,995,769
2
4
466
0
python,django,session,timer
Either you do the clock on the client side, in which case they can always cheat somehow, or you do it on the server side, and then you aren't taking into account these interruptions. To reduce cheating somewhat and still allow for interruptions, you could do a 'keep alive'. Here the client side code announces to the server that it is still there every so often, say every 5 seconds. The server side notes when it stops getting these messages, and pauses/stops the clock. However it still has the start and end time, so you know how long it really took in wall time, and also how long it took while the client was supposedly there. With these two pieces of information you could very easily track down odd behaviour and blacklist people. Blacklisted people might not be aware that they are blacklisted, but their quiz scores don't show up for other users of your quiz system.
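A rough sketch of that keep-alive in Django, assuming a hypothetical QuizAttempt model with a last_seen field; the client would call this view every few seconds via ajax:

```python
# views.py -- hedged sketch; the quiz app and QuizAttempt model are hypothetical.
from datetime import datetime

from django.http import HttpResponse

from quiz.models import QuizAttempt  # hypothetical app and model


def heartbeat(request, attempt_id):
    attempt = QuizAttempt.objects.get(pk=attempt_id, user=request.user)
    attempt.last_seen = datetime.now()  # gaps between updates show when the client disappeared
    attempt.save()
    return HttpResponse("ok")
```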
0
0
0
1
2011-05-13T17:36:00.000
4
0.099668
false
5,995,674
0
0
1
4
I am preparing a Test or Quiz in Django. The quiz needs to be completed in a certain time frame, say 30 minutes for 40 questions. I can always initiate a clock at the start of the test, and then calculate the time by the time the Quiz is completed. However it's likely that during the attempt, there may be issues such as internet connection drops, or system crashes/power outages etc. I need a strategy to figure out when such an accident happened, and stop the clock, then let the user take the test again from where it stopped, and start the clock again. What is the right strategy? Any help including sample code/examples/ideas is most welcome
Timed Quiz: How to consider internet interruptions?
5,995,763
2
4
466
0
python,django,session,timer
The simplest way would be to add a timestamp when the person starts the quiz and then compare that to when they submit. Of course, this doesn't take into account connection drops, crashes, etc... like you mentioned. To account for these issues I'd probably use something like node.js. Each client has "check-in" when they connect to the quiz. Then at regular intervals (every 1s, 10s, 1m, etc...) the client checks in. If at these intervals the client doesn't check-in you can assume they've had the connection drop. You could keep track of when they connect again and start the timer from where they left off. This is my initial thought on how to keep track of connection drops and crashes. The same could be done with a front-end ajax call to a Django view.
0
0
0
1
2011-05-13T17:36:00.000
4
0.099668
false
5,995,674
0
0
1
4
I am preparing a Test or Quiz in Django. The quiz needs to be completed in a certain time frame, say 30 minutes for 40 questions. I can always initiate a clock at the start of the test, and then calculate the time by the time the Quiz is completed. However it's likely that during the attempt, there may be issues such as internet connection drops, or system crashes/power outages etc. I need a strategy to figure out when such an accident happened, and stop the clock, then let the user take the test again from where it stopped, and start the clock again. What is the right strategy? Any help including sample code/examples/ideas is most welcome
Timed Quiz: How to consider internet interruptions?
6,015,282
2
4
466
0
python,django,session,timer
Your strategy should depend on the importance of the test and the ability to retake the whole test. Is the test/quiz for fun or for competence/knowledge checking? Are you dealing with logged-in users? Are tests generated randomly from a large pool of available questions? These are the questions you need to answer yourself first. Remember that: a malicious user CAN simulate a connection outage / power failure, the only clock you can trust is the one on the server side, and everything on the browser side can be manipulated (think firebug/console js injection). My approach would be: Inform users that TIME is an important factor and connection issues may not be taken into account when the grade is given..., Serve only one question, wait for the answer, serve another one, The whole test time should be calculated as the SUM of each answer time: save each "question sent" / "answer received" timestamp and calculate the answer time from them; time between questions wouldn't count, and you'd get extra insight into which questions were harder / took longer to answer. Add some kind of heartbeat to your question page (like an ajax request every X seconds); when the heartbeat stops you can (depending on the options you have): invalidate the question and notify the user via a dialog that he has connection issues and has to refresh to get a new question instead, if you have a larger pool of questions to use, or pause time on the server side (and for example dim the question page so the user cannot answer until his connection is restored) - IMO only for games/fun quizzes/tests. Save information on the server side on each interruption, which would later ease the decision to allow retaking the whole test, e.g. he was fine until the 20th question and then on 3-4 easy questions in a row he was dropping...
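A sketch of the per-question timing part of this answer, with hypothetical models; the total is the sum of the individual answer times, so pauses between questions don't count:

```python
# models.py -- hypothetical models, purely illustrative.
from django.db import models


class QuizAttempt(models.Model):
    user_id = models.IntegerField()


class QuestionAttempt(models.Model):
    quiz_attempt = models.ForeignKey(QuizAttempt)
    question_id = models.IntegerField()
    sent_at = models.DateTimeField()               # "question sent" timestamp
    answered_at = models.DateTimeField(null=True)  # "answer received" timestamp


def total_answer_time(quiz_attempt):
    # Sum only the time spent on questions that were actually answered.
    answered = quiz_attempt.questionattempt_set.exclude(answered_at=None)
    return sum((a.answered_at - a.sent_at).seconds for a in answered)
```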
0
0
0
1
2011-05-13T17:36:00.000
4
1.2
true
5,995,674
0
0
1
4
I am preparing a Test or Quiz in Django. The quiz needs to be completed in a certain time frame, say 30 minutes for 40 questions. I can always initiate a clock at the start of the test, and then calculate the time by the time the Quiz is completed. However it's likely that during the attempt, there may be issues such as internet connection drops, or system crashes/power outages etc. I need a strategy to figure out when such an accident happened, and stop the clock, then let the user take the test again from where it stopped, and start the clock again. What is the right strategy? Any help including sample code/examples/ideas is most welcome
Timed Quiz: How to consider internet interruptions?
5,996,776
0
4
466
0
python,django,session,timer
The problem with pausing the clock when the connection to the user drops, is that the user could just disconnect their computer from the internet each time they received a new question, and then reconnect once they had worked out the answer. One thing you could do, is give the user a certain amount of time for each question. The clock is started when the user successfully receives the question to their browser, and if the user submits an answer before the time limit, it is accepted, otherwise it is void. That would mean if a user lost connection it would only affect the question they are currently on. But it would also mean that the user would have no flexibility in how much time they want to allot to each question, you decide for them. I was thinking you could do something like removing the question from the screen unless the connection to the server was still alive, but the user could always just screen-shot the question before disconnecting.
0
0
0
1
2011-05-13T17:36:00.000
4
0
false
5,995,674
0
0
1
4
I am preparing a Test or Quiz in Django. The quiz needs to be completed in a certain time frame, say 30 minutes for 40 questions. I can always initiate a clock at the start of the test, and then calculate the time by the time the Quiz is completed. However it's likely that during the attempt, there may be issues such as internet connection drops, or system crashes/power outages etc. I need a strategy to figure out when such an accident happened, and stop the clock, then let the user take the test again from where it stopped, and start the clock again. What is the right strategy? Any help including sample code/examples/ideas is most welcome
How to get an app name using python in django
68,934,242
0
26
31,008
0
python,django,django-views
I believe the updated solution is view.__module__. This returns your app_name both in Django and Django Rest Framework. My scenario was dynamically getting the module or app_name from the view call so that I could run an access permission check for that particular module.
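A tiny illustration of what that looks like (get_app_name is a hypothetical helper; view can be a function-based view or a DRF view instance):

```python
def get_app_name(view):
    # e.g. "blog.views" -> "blog"; __module__ exists on plain view functions,
    # class-based views and DRF view instances alike.
    return view.__module__.split(".")[0]
```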
0
0
0
0
2011-05-14T06:32:00.000
9
0
false
6,000,205
0
0
1
1
If you are in the view and want to retrieve the app name using Python (the app name will be used for further logic), how would you do it?
Hot deployment using mod_wsgi, Python and Django on Apache
6,007,285
3
4
768
1
python,django,apache2,mod-wsgi,hotdeploy
Just touching the wsgi file always worked for me.
0
0
0
0
2011-05-15T05:26:00.000
2
0.291313
false
6,006,666
0
0
1
1
I have set up an Apache server with mod_wsgi, python_sql, mysql and django. Everything works fine, except for the fact that if I make some code changes, they are not reflected immediately, though I think that everything is compiled on the fly when it comes to python/mod_wsgi. I have to shut down the server and come back again to see the changes. Can someone point me to how hot deployment can be achieved with the above setup? Thanks, Neeraj
Python / TG2 - Sprox
6,636,134
0
1
160
0
python,web-applications,sprox
You should be able to pass the form to the template and render it there while adding any custom css rule inside the template itself, for example using a <style> tag inside the template head. If you really want to add custom CSS on the Python side you can try to add a CSSLink/CSSSource to the css key inside the form's __base_widget_args__ dictionary.
1
0
0
0
2011-05-15T02:34:00.000
1
1.2
true
6,006,773
0
0
1
1
I am looking into using sprox but I can't seem to find any information about styling the generated form. I am sure it's got to be something obvious but I didn't see it in the docs or find anything using a google/google groups search. Ideally I would use sprox to generate the form but be able to pass in some css for layout. I could just manually create the forms, but with the built-in validation and the selects/drop-downs pulling in data it seemed worth a look. In a perfect world I would use sprox, pass it to the template and then let my designer have at it for formatting/styling the resulting widget, leaving me to not have to fuss with it. TIA!
How to prioritize internationalization parameters
6,007,173
5
3
140
0
python,django,google-app-engine,internationalization,nlp
If you have a saved preference somewhere, then that would be the first choice. The cookie value is, presumably, what they chose last time they were around so that would be the first thing to check. The hl parameter is something that Google has figured out and they probably know what they're doing so that seems like a sensible third choice. Then we have the HTTP headers or a final default so check the accept language header next. And finally, have a default language in place just in case all else fails. So, in order: Saved preference. Cookie. hl parameter. HTTP accept language header. Built in default. Ideally you'd backtrack up the list once you get a language from somewhere so that you'd have less work to do on the next request. For example, if you ended up getting the language from the accept language header, you'd want to: set hl (possibly redirecting), store it in the cookie, and save the preference in their user settings (if you have such a permanent store and a signed in person).
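A hedged sketch of that fallback order as one helper; the cookie name, the preferred_language attribute and DEFAULT_LANGUAGE are placeholders for your own storage and configuration:

```python
DEFAULT_LANGUAGE = "en"  # hypothetical built-in default


def pick_language(request, user=None):
    # Order: saved preference, cookie, hl parameter, Accept-Language header, default.
    if user is not None and getattr(user, "preferred_language", None):
        return user.preferred_language
    if request.COOKIES.get("lang"):
        return request.COOKIES["lang"]
    if request.GET.get("hl"):
        return request.GET["hl"]
    accept = request.META.get("HTTP_ACCEPT_LANGUAGE", "")
    if accept:
        # Take the first language tag, ignoring quality values.
        return accept.split(",")[0].split(";")[0]
    return DEFAULT_LANGUAGE
```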
0
0
1
0
2011-05-15T06:19:00.000
1
1.2
true
6,006,839
0
0
1
1
Given that these can all have different values - the HTTP browser accept-language header, the HTTP GET language parameter (e.g. hl=en or hl=fr), and the cookie value for language choice - how should we decide which language to display pages in if deciding based on these values? It's also conceivable to save the user's preferred language to the data layer as a fourth way to let agents and users decide the language. Thanks in advance for answers and comments
How to create a Django 'like' button for anonymous users?
6,006,961
1
3
1,150
0
python,django,django-models,django-views,django-sessions
If you don't have any way of identifying your users then your best bet is to store this info in a browser cookie or HTML5 local storage. (I don't advise using flash cookies since there is a long debate about them and they are harder to implement)
0
0
0
0
2011-05-15T06:32:00.000
3
0.066568
false
6,006,882
0
0
1
2
I am using Django and my website has no user profiles, so all users are anonymous. I want to implement a 'like' system. How do I restrict a user to liking only once? Thanks.
How to create a Django 'like' button for anonymous users?
6,006,968
0
3
1,150
0
python,django,django-models,django-views,django-sessions
You can't 100% restrict multiple votes, but you can make it very difficult for a regular user, by using: a cookie a DB entry with the voter's IP
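One way to sketch the IP half of that idea with a hypothetical Django model; the cookie would be checked in the view before the database is touched at all:

```python
# models.py -- hypothetical, for illustration only.
from django.db import models


class Like(models.Model):
    item_id = models.IntegerField()
    ip_address = models.IPAddressField()

    class Meta:
        unique_together = ("item_id", "ip_address")

# In the view:
#   ip = request.META.get("REMOTE_ADDR")
#   like, created = Like.objects.get_or_create(item_id=item_id, ip_address=ip)
#   # "created" is False if this IP has already liked the item.
```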
0
0
0
0
2011-05-15T06:32:00.000
3
0
false
6,006,882
0
0
1
2
I am using Django and my website has no user profiles, so all users are anonymous. I want to implement a 'like' system. How do I restrict a user to liking only once? Thanks.
How to validate JSON with simplejson
6,011,614
2
1
817
0
python,django,json,simplejson
Find the first non-whitespace character. If it's "<" then you have HTML. Also, check the content type header and HTTP status code.
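A small sketch combining those checks with simply attempting to decode the body (the helper name is made up):

```python
import simplejson


def parse_json_or_none(body, content_type="", status=200):
    # Cheap sanity checks first: status code, content type, leading "<".
    if status != 200 or "html" in content_type.lower():
        return None
    if body.lstrip().startswith("<"):
        return None
    try:
        return simplejson.loads(body)
    except ValueError:  # not valid JSON after all
        return None
```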
0
0
1
0
2011-05-15T22:08:00.000
2
0.197375
false
6,011,596
1
0
1
1
Occasionally I'm querying a server for JSON and receiving a 404 HTML page when the requested data is not available. Thus, I must have a check to ensure that the JSON I'm expecting is actually JSON rather than HTML. I'm accomplishing this now by checking whether a string that I can expect to be in the HTML is contained in the response, but I think there has to be a better way to do this.
Advice: where to situate html table content: in JS or HTML
6,022,254
4
0
77
0
javascript,python,html,dhtml
If your API can output a JSON document with your data, you gain significant flexibility and future-proofing. This might even be something your users will want to access directly for their own external consumption. And of course your JS code can easily generate a table from this data. However nobody here can tell you whether this is worth doing or not, as that depends entirely on the scope of your project and opportunity cost of time spent re-architecting.
0
0
1
0
2011-05-16T19:07:00.000
1
1.2
true
6,022,119
0
0
1
1
SHORT: My python code generates a webpage with a table. I'm considering rewriting it to generate a js file instead, one that holds the table contents in an array, and then letting the table be generated client-side. I am not sure of the pros and cons. Anyone care to offer their experience/insight? Are there other solutions? LONG: The web page contains a single table and an embedded gmap. The table is a set of locations with several columns of location-stats and also two navigation columns. One nav column consists of onclicks that recenter the embedded gmap to the lat,lon of the location. The other nav column consists of hrefs that open a new window with a gmap centered on the lat,lon. Until recently, my python code would do some number crunching on a list of files, and then generate the html file. I also wrote a js file that keeps the webpage liquid upon browser window resizing. Recently, I modified my python code so that it: places the lat,lon info in a custom attribute of the tr elements and no longer produces the nav column tds; and then I wrote a js function that loops through the trs onLoad, reads the lat,lon from the custom attribute and inserts the nav tds. FWIW, this reduced the size of the html file by 70% while increasing the js by 10%. OK, so now I am debating whether I should go all the way and write my python code to generate 2 files: an essentially abstract html file, and a js file containing a js array of the locations and their stats.
How to run code in an Amazon EC2 instance?
71,252,207
0
71
66,912
0
python,amazon-ec2
Simply push your code to GitHub, clone it on the EC2 instance, and run it there.
0
0
0
1
2011-05-17T11:29:00.000
5
0
false
6,030,115
0
0
1
2
I understand nearly nothing about the functioning of EC2. I created an Amazon Web Services (AWS) account. Then I launched an EC2 instance. And now I would like to execute Python code in this instance, and I don't know how to proceed. Is it necessary to load the code somewhere in the instance? Or into Amazon's S3 and link it to the instance? Is there a guide that explains the possible ways of using an instance? I feel like a man in front of a flying saucer's dashboard without a user's guide.
How to run code in an Amazon EC2 instance?
12,026,840
4
71
66,912
0
python,amazon-ec2
Launch your instance through Amazon's Management Console -> Instance Actions -> Connect (more details in the getting started guide), launch the Java-based SSH client, go to Plugins -> SCFTP File Transfer, upload your files, and run your files in the background (with '&' at the end or using nohup). Be sure to select an AMI with python included; you can check by typing 'python' in the shell. If your app requires any unorthodox packages you'll have to install them.
0
0
0
1
2011-05-17T11:29:00.000
5
0.158649
false
6,030,115
0
0
1
2
I understand nearly nothing about the functioning of EC2. I created an Amazon Web Services (AWS) account. Then I launched an EC2 instance. And now I would like to execute Python code in this instance, and I don't know how to proceed. Is it necessary to load the code somewhere in the instance? Or into Amazon's S3 and link it to the instance? Is there a guide that explains the possible ways of using an instance? I feel like a man in front of a flying saucer's dashboard without a user's guide.
Python or Java for text processing (text mining, information retrieval, natural language processing)
6,030,342
2
10
10,439
0
java,python,nlp,information-retrieval,text-mining
It's not the language you have to evaluate, but the frameworks and app servers for clustering, data storage/retrieval, etc. available for the language. You can use Jython to get all the Java enterprise technologies for a high-load system and still do the text parsing with Python.
0
0
0
1
2011-05-17T11:46:00.000
4
0.099668
false
6,030,291
1
0
1
3
I'm soon to start on a new project where I am going to do lots of text processing tasks like searching, categorization/classifying, clustering, and so on. There's going to be a huge amount of documents that need to be processed; probably millions of documents. After the initial processing, it also has to be able to be updated daily with multiple new documents. Can I use Python to do this, or is Python too slow? Is it best to use Java? If possible, I would prefer Python since that's what I have been using lately. Plus, I would finish the coding part much faster. But it all depends on Python's speed. I have used Python for some small scale text processing tasks with only a couple of thousand documents, but I am not sure how well it scales up.
Python or Java for text processing (text mining, information retrieval, natural language processing)
6,030,330
3
10
10,439
0
java,python,nlp,information-retrieval,text-mining
Just write it; the biggest flaw programmers have is premature optimization. Work on a project, write it out and get it working. Then go back, fix the bugs and ensure that it's optimized. There are going to be a number of people harping on about the speed of x vs y and how y is better than x, but at the end of the day it's just a language. It's not what a language is, but how it does things.
0
0
0
1
2011-05-17T11:46:00.000
4
0.148885
false
6,030,291
1
0
1
3
I'm soon to start on a new project where I am going to do lots of text processing tasks like searching, categorization/classifying, clustering, and so on. There's going to be a huge amount of documents that need to be processed; probably millions of documents. After the initial processing, it also has to be able to be updated daily with multiple new documents. Can I use Python to do this, or is Python too slow? Is it best to use Java? If possible, I would prefer Python since that's what I have been using lately. Plus, I would finish the coding part much faster. But it all depends on Python's speed. I have used Python for some small scale text processing tasks with only a couple of thousand documents, but I am not sure how well it scales up.
Python or Java for text processing (text mining, information retrieval, natural language processing)
6,030,370
9
10
10,439
0
java,python,nlp,information-retrieval,text-mining
It's very difficult to answer questions like this without trying. So why don't you: figure out what would be a difficult operation; implement that (and I mean the simplest, quickest hack that you can make work); run it with a lot of data, and see how long it takes; figure out if it's too slow. I've done this in the past and it's really the way to see if something performs well enough for your needs.
0
0
0
1
2011-05-17T11:46:00.000
4
1
false
6,030,291
1
0
1
3
I'm soon to start on a new project where I am going to do lots of text processing tasks like searching, categorization/classifying, clustering, and so on. There's going to be a huge amount of documents that need to be processed; probably millions of documents. After the initial processing, it also has to be able to be updated daily with multiple new documents. Can I use Python to do this, or is Python too slow? Is it best to use Java? If possible, I would prefer Python since that's what I have been using lately. Plus, I would finish the coding part much faster. But it all depends on Python's speed. I have used Python for some small scale text processing tasks with only a couple of thousand documents, but I am not sure how well it scales up.
Outputting and Responding to Javascript with PyQt
6,036,473
0
3
1,332
0
javascript,python,screen-scraping,pyqt,pyside
I think you are confused about where things happen, so it is not clear to me what it is you are attempting to do, but let's make a guess. I think you want to automate the use of a web site, where you have to call up a selection page, tick a box, click a button and handle the resulting download. If you only want to do it a few times, for testing the site, then check out Watir and Selenium. If you really wish to code it up in Python, then you will have to understand the page sent with the check box well enough that you can find and extract the form, create a POST from the fields in that form, and send the POST to get your download. If the page contains javascript this might add/remove fields or inhibit you from creating a valid post. Then you will have to catch and save the resulting download. And you will have to make panic changes to your code every time the site changes its html pages. I don't envy you that job one bit.
1
0
0
0
2011-05-17T14:24:00.000
2
0
false
6,032,287
0
0
1
1
I am trying to use PyQt to load the html of a web page which can then be manipulated and fed back to the page for web scraping. I am basically trying to log into a page with Javascript on it, search for documents to download (by selecting a check box next to the correct one's names), and then clicking a download button which pops out another page. Does anyone know the functions I would use? Is there a way to discuss this without going into Classes? (My understanding of Classes is not as good as it could be, I am trying to learn, I'm still something of a beginner). Sorry if I didn't explain this well. I'm trying to use either PyQt or PySide to do this.
Two sets of users (teacher and student) in Django authentication
6,033,103
1
2
1,239
0
python,django
You could use permissions. When they sign up if they're a Teacher give them content creation permissions. If they're a student they don't get the permissions. In the user profile I would just have a field that says which type they are. Unless a lot of the data is different I wouldn't have two user profiles.
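A hedged sketch of that setup with Django's built-in auth; the profile model, boolean field and permission codename are all made up for illustration:

```python
from django.contrib.auth.models import Permission, User
from django.db import models


class UserProfile(models.Model):
    user = models.OneToOneField(User)
    is_teacher = models.BooleanField(default=False)  # one field instead of two profile models


def make_teacher(user):
    # Give teachers a hypothetical "can add content page" permission; students never get it.
    perm = Permission.objects.get(codename="add_contentpage")
    user.user_permissions.add(perm)
```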
0
0
0
0
2011-05-17T15:17:00.000
1
1.2
true
6,033,067
0
0
1
1
I'm building a web application where I have 2 sets of users (students and teachers). Teachers should be able to create their account, create a page of their content. Students should be able to create an account to sign up for this content. I am currently using django-registration to handle registration but I am wondering what's the best way to handle these 2 sets of users and still be able to use the Django authentication framework? I have heard about having multiple profiles but would like some opinions. Thanks!
django and calling cli java application
40,871,485
0
1
793
0
java,python,django,command-line-interface
Yeah, subprocess is a great module for calling command-line scripts. If you are worried that the process will take too long for the user to wait around for the HTML, consider using celery to asynchronously call your external scripts as a background process.
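A minimal sketch of calling a Java CLI application from Python with subprocess; the jar path, main class and arguments are placeholders:

```python
import subprocess


def run_java_tool(extra_args):
    # -cp sets the classpath; check_output captures whatever the Java program prints to stdout.
    cmd = ["java", "-cp", "/path/to/app.jar", "com.example.Main"] + list(extra_args)
    return subprocess.check_output(cmd)

# e.g. output = run_java_tool(["--input", "data.txt"])
```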
0
0
0
0
2011-05-17T21:13:00.000
2
0
false
6,037,210
0
0
1
1
I am just trying to gather some information on whether it is possible to tie in a CLI java application and set up the classpath using python, and also to pass data around and share information between the two applications (web frontend and java application). I currently have a java application and am looking to see if it is possible to interface the two, django and java, since I am familiar with Django, instead of having to learn a new web framework like Wicket. I understand this may be more python-specific than django; any help would be appreciated. Thanks
Detecting blog or forum software using python?
6,037,452
0
0
649
0
python
Some sites will set the 'generator' meta-tag in the html head.
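A quick sketch of reading that tag (regex-based for brevity; a real HTML parser would be more robust):

```python
import re
import urllib2


def detect_generator(url):
    html = urllib2.urlopen(url).read()
    # e.g. <meta name="generator" content="WordPress 3.1" />
    match = re.search(r'<meta[^>]+name=["\']generator["\'][^>]+content=["\']([^"\']+)',
                      html, re.IGNORECASE)
    return match.group(1) if match else None
```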
0
0
0
1
2011-05-17T21:29:00.000
2
0
false
6,037,379
0
0
1
1
Is there a way, besides checking for known signatures in the site content, to find out what kind of software the website is running, e.g. vBulletin, WP etc., preferably in python?
Python logging module on Eclipse
6,045,239
2
0
1,139
0
python,eclipse,logging
It will depend on how you configure your logging system. If you use only print statements, the output should be shown in the console view of Eclipse. If you use logging and you configured a console handler, it should also be displayed in the Eclipse console view. If you configured only a file handler in the logging configuration, you'll have to tail the log files ;)
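A minimal configuration with a console handler, which is what makes the output show up in Eclipse's Console view:

```python
import logging

# basicConfig attaches a StreamHandler (console output) when no handlers exist yet.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)

log = logging.getLogger(__name__)
log.debug("this line appears in the Eclipse Console view")
```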
0
0
0
1
2011-05-18T12:10:00.000
2
0.197375
false
6,044,443
0
0
1
1
Where should I see the logging output in Eclipse while debugging? And when running?
Stripping irrelevant parts of a web page
6,097,514
1
2
171
0
python,screen-scraping,web-scraping
One approach is to compare the structure of multiple webpages that share the same template. In this case you would compare multiple SO questions. Then you can determine which content is static (useless) or dynamic (useful). This field is known as wrapper induction. Unfortunately it is harder than it sounds!
0
0
1
0
2011-05-18T21:19:00.000
4
0.049958
false
6,051,175
0
0
1
1
Is there an API or systematic way of stripping irrelevant parts of a web page while scraping it via Python? For instance, take this very page -- the only important parts are the question and the answers, not the sidebar column, header, etc. One can guess things like that, but is there any smart way of doing it?
What is the Django filter for truncating characters?
6,052,784
7
5
6,402
0
python,django
You can use the slice notation: {{ a_string_variable|slice:":5" }} This would give you the first 5 characters in the string.
0
0
0
0
2011-05-18T21:35:00.000
4
1
false
6,051,301
0
0
1
1
On the Django docs for built in tags and filters they give a filter to truncate words but not characters (letters/numbers/spaces, etc). Is there such a thing available?
Is it safe to write your own table creation SQL for use with Django, when the generated tables are not enough?
6,053,509
2
2
112
1
python,sql,django,postgresql
Yes. I don't see why not, but that would be most unconventional and breaking convention usually leads to complications down the track. Describe the problem you think it will solve and perhaps someone can offer a more conventional solution.
0
0
0
0
2011-05-19T03:30:00.000
3
0.132549
false
6,053,426
0
0
1
1
I need to have some references in my table and a bunch of "deferrable initially deferred" modifiers, but I can't find a way to make this work in the default generated Django code. Is it safe to create the table manually and still use Django models?
To use Facebook Javascript or not
6,064,508
0
3
130
0
python,facebook,oauth
I would think newsfeed, but if there is a difference, it's very small.
0
0
0
0
2011-05-19T10:01:00.000
2
0
false
6,056,936
0
0
1
2
I'm planning to start a web app that uses the Facebook Python SDK. I found two examples in the SDK, newsfeed and oauth. Since I'm going to deploy the app on Google App Engine I'm confused about which one to base my project on. The oauth example uses OAuth 2.0 directly and the newsfeed example relies on the cookie saved by the Facebook Javascript SDK to get the user id of the active user. Which one of the two examples would you prefer if you want your app to be less CPU intensive?
To use Facebook Javascript or not
6,064,561
0
3
130
0
python,facebook,oauth
If it is a Javascript SDK then it runs in the client browser, not on the GAE server, so that should be the one with less (server) CPU usage.
0
0
0
0
2011-05-19T10:01:00.000
2
0
false
6,056,936
0
0
1
2
I'm planning to start a web app that uses the Facebook Python SDK. I found two examples in the SDK, newsfeed and oauth. Since I'm going to deploy the app on Google App Engine I'm confused about which one to base my project on. The oauth example uses OAuth 2.0 directly and the newsfeed example relies on the cookie saved by the Facebook Javascript SDK to get the user id of the active user. Which one of the two examples would you prefer if you want your app to be less CPU intensive?
Python and Django basics
6,060,065
0
2
1,236
0
python,django
The best resource in my opinion is right here at Stack Overflow. I haven't found any good python or django books that were better than just doing it yourself. If you know how to program, jump right in. Start making a django app. You'll find it pretty straightforward. I prefer vim but a lot of people use Eclipse or Komodo.
0
0
0
0
2011-05-19T13:51:00.000
3
0
false
6,059,744
0
0
1
1
I am a Java developer and am going to try out Python/Django for the first time and I had some basic questions from my end. Which are the go-to python forums? Which are the go-to django forums? What are the go-to IDEs? (I use Eclipse for Java) Are there any books you encourage or discourage me from reading? Any final words on my conversion to the dark side of dynamic programming? I do realize that I can just Google this stuff but then I wouldn't be getting the experts' feedback on what is good or not, so thanks ahead of time. K
Google App Engine Versioning in the Datastore
6,063,370
7
15
1,749
0
python,google-app-engine,google-cloud-datastore
Datastore has no concept of versions. When you update a model definition, any entities you create in the future will be of the new type, and the old ones will be of the old type. This frequently leads to runtime errors if your code is not aware of these changes.
0
1
0
0
2011-05-19T18:35:00.000
2
1
false
6,063,286
0
0
1
1
Google App Engine has the concept of app versions. i.e., you can have multiple versions of your app running concurrently and accessible at different subdomains. For instance: http://1.my-app-name.appspot.com, http://2.my-app-name.appspot.com. What aspects of the app are actually "versioned" by this? Is it only the Python + Static files codebase? Does the datastore have the concept of "versions"? If not, then what happens when I update the definition of a Google App Engine model? Thanks!
Is there a way to get the time in milliseconds for an ad-hoc query against CouchDB using couchdb-python?
11,588,230
0
2
209
0
python,couchdb,pymongo,couchdb-python
Use the view() function to run permanent views in the database. It is more efficient; in many cases, temporary views can be very costly.
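For reference, the two calls in couchdb-python look roughly like this (server URL, database and view names are placeholders):

```python
import couchdb

db = couchdb.Server()["mydb"]  # defaults to http://localhost:5984/

# Ad-hoc (temporary) view: the map function is shipped and executed on every call.
temporary = db.query("function(doc) { emit(doc.type, 1); }")

# Permanent view: defined once in a design document and indexed incrementally.
permanent = db.view("mydesign/by_type")
```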
0
0
0
0
2011-05-19T20:18:00.000
2
0
false
6,064,385
0
0
1
1
I'm using ad-hoc JavaScript map functions in couchdb-python through its query() function. Is there a way of getting the time the query takes to process? I've tried timing the script, but it's pretty obvious to me that the time I'm getting is not correct. If I iterate over the ViewResult that the query() function returns and print all the results, I believe I get an answer that's closer to the truth, but I don't want the printing to be included in my timing.. Anyone got any ideas? Thanks a bunch!
What would be a good strategy to implement functionality similar to facebook 'likes'?
6,067,968
1
0
104
1
python,architecture
You will actually only need the user_likes table. The like_count is calculated from that table. You will only need to store that if you need to gain performance, but since you're using memcached, it may be a good idea not to store the aggregated value in the database, but to store it only in memcached.
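A rough sketch of keeping the aggregate only in memcached (python-memcached client; the DB helpers and key names are hypothetical):

```python
import memcache

mc = memcache.Client(["127.0.0.1:11211"])


def like_count(item_id):
    key = "likes:%d" % item_id
    count = mc.get(key)
    if count is None:
        count = count_likes_in_db(item_id)  # hypothetical query against user_likes
        mc.set(key, count)
    return count


def add_like(item_id, user_id):
    insert_user_like(item_id, user_id)  # hypothetical insert into user_likes
    mc.incr("likes:%d" % item_id)       # does nothing if the key is not cached yet
```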
0
0
0
1
2011-05-20T05:46:00.000
3
0.066568
false
6,067,919
0
0
1
2
I have a website where people post comments, pictures, and other content. I want to add a feature so that users can like/unlike these items. I use a database to store all the content. There are a few approaches I am looking at: Method 1: Add a 'like_count' column to the table, and increment it whenever someone likes an item Add a 'user_likes' table to keep track of everything the user has liked. Pros: Simple to implement, minimal queries required. Cons: The item needs to be refreshed with each change in like count. I have a whole list of items cached, which will break. Method 2: Create a new table 'like_summary' and store the total likes of each item in that table Add a 'user_likes' table to keep track of everything the user has liked. Cache the like_summary data in memcache, and only flush it if the value changes Pros: Less load on the main items table, it can be cached without worrying. Cons: Too many hits on memcache (a page shows 20 items, which need to be loaded from memcache), might be slow Any suggestions?
What would be a good strategy to implement functionality similar to facebook 'likes'?
6,067,953
1
0
104
1
python,architecture
One relation table that does a many-to-many mapping between user and item should do the trick.
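In Django terms that mapping is essentially one field; a hedged sketch with a hypothetical Item model:

```python
from django.contrib.auth.models import User
from django.db import models


class Item(models.Model):
    # The join table behind this field is exactly the user<->item "likes" mapping.
    liked_by = models.ManyToManyField(User, related_name="liked_items")

# item.liked_by.add(request.user)     # like
# item.liked_by.remove(request.user)  # unlike
# item.liked_by.count()               # like count
```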
0
0
0
1
2011-05-20T05:46:00.000
3
0.066568
false
6,067,919
0
0
1
2
I have a website where people post comments, pictures, and other content. I want to add a feature so that users can like/unlike these items. I use a database to store all the content. There are a few approaches I am looking at: Method 1: Add a 'like_count' column to the table, and increment it whenever someone likes an item Add a 'user_likes' table to keep track of everything the user has liked. Pros: Simple to implement, minimal queries required. Cons: The item needs to be refreshed with each change in like count. I have a whole list of items cached, which will break. Method 2: Create a new table 'like_summary' and store the total likes of each item in that table Add a 'user_likes' table to keep track of everything the user has liked. Cache the like_summary data in memcache, and only flush it if the value changes Pros: Less load on the main items table, it can be cached without worrying. Cons: Too many hits on memcache (a page shows 20 items, which need to be loaded from memcache), might be slow Any suggestions?
Flexible parsing of text with regular expressions in Java or Python
6,075,053
0
0
446
0
java,python,regex
You should be fine with java regular expressions and it should be a trivial exercise to support named captures. After all it is just mapping capture group numbers to names. I even have code for this around somewhere, but can't share due to copyright reasons. You could put regular expressions to parse the individual parts of such listings in a text file and make those part of your configuration. Regular expressions are compiled at run-time, so this should be fairly dynamic. If you want a more flexible system (albeit at the cost of a pre-compilation step), have a look at parser generators like JavaCC or ANTLR. These allow you to create context-free grammars which are considerably more powerful than regexp.
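For comparison, named captures in Python's re module look like this; the pattern is a toy example and not tuned to the real schedule layout:

```python
import re

# Hypothetical pattern for one flight-segment line, purely illustrative.
segment = re.compile(
    r"(?P<day>[A-Z]{2})\s+(?P<flight>\d+)\s+.*?(?P<dep>[A-Z]{3})\s+(?P<dep_time>\d{4})"
)

m = segment.search("TU 180     320 PHX 0745  SAN 0857")
if m:
    print(m.groupdict())  # {'day': 'TU', 'flight': '180', 'dep': 'PHX', 'dep_time': '0745'}
```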
0
0
0
0
2011-05-20T16:22:00.000
3
0
false
6,074,884
0
0
1
2
I am working on some code to parse text into XML. I am currently using java and jaxb to handle the XML and the in-program representation of my data. I need to setup an easily expandable and adaptable method to parse the info from my text files into my java classes. The data will for the most part stay the same, but I need to be able to support later changes in the text input format. (I am parsing airline pilot flight schedules, and I want to support the schedules of other airlines down the road.) It seems like regular expressions are the way to go, but the little I have worked with java RE makes it seem a poor solution compared to python - named captures specifically. But, I know less about python than I do about Java! So, I am looking for a modular system to parse text data that I can easily adapt, extend, and distribute later on. I am willing to learn more python if I need it, but my time and abilities are limited. Any suggestions? An example of the text I am parsing follows. ================================================================================================= 8122 TU REPORT AT 06.45/N EFFECTIVE JUN 08-JUN 29 1 CAPT, 1 F/O DAY FLT. EQP DEPARTS ARRIVES BLK. BLK. DUTY CR. LAYOVER MO TU WE TR FR SA SU TU 180 320 PHX 0745 SAN 0857* 1.12 -- -- -- -- -- -- TU 005 320 SAN 0950 PHX 1106 1.16 -- 8 -- -- -- -- -- TU 592 L 320 PHX 1215 MCI 1652 2.37 -- 15 -- -- -- -- -- Radisson A/P 5.05 8.22 5.05 MCI 12.18 -- 22 -- -- -- -- -- (816) 464-2423 -- 29 -- WE 403 B 320 MCI 0610 PHX 0657 2.47 WE 149 320 PHX 0859 CMH 1547 3.48 Holiday Inn City Center 6.35 9.37 6.35 CMH 15.13 (614) 221-3281 TH 335 B 320 CMH 0800 PHX 0913 4.13 TH 343 L 320 PHX 1029 PVR 1508 2.39 Marriott Casamagna 6.52 9.23 6.52 PVR 15.52 52-322-2260000 TRANS: Hotel Shuttle FR 621 320 PVR 0815 PHX 0839 2.24 2.24 3.39 2.24 CREDIT HRS. 21.00 BLK. HRS. 20.56 LDGS: 8 TAFB 74.24 =================================================================================================
Flexible parsing of text with regular expressions in Java or Python
6,075,184
2
0
446
0
java,python,regex
Those look like fixed-width fields, which are probably a good choice for simple string splitting. The only thing it looks like you could use regular expressions for is to determine what type of record you are looking at, although it looks like the indentation level is also useful for determining that.
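A sketch of the fixed-width approach in Python; the slice offsets below are guesses from the sample text and would need to be verified against the real report layout:

```python
def parse_segment(line):
    # Hypothetical column offsets -- adjust after inspecting the actual files.
    return {
        "day": line[0:2].strip(),
        "flight": line[3:8].strip(),
        "equipment": line[12:16].strip(),
        "departs": line[16:28].strip(),
        "arrives": line[28:40].strip(),
    }


print(parse_segment("TU  180     320 PHX 0745  SAN 0857*   1.12"))
```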
0
0
0
0
2011-05-20T16:22:00.000
3
0.132549
false
6,074,884
0
0
1
2
I am working on some code to parse text into XML. I am currently using java and jaxb to handle the XML and the in-program representation of my data. I need to setup an easily expandable and adaptable method to parse the info from my text files into my java classes. The data will for the most part stay the same, but I need to be able to support later changes in the text input format. (I am parsing airline pilot flight schedules, and I want to support the schedules of other airlines down the road.) It seems like regular expressions are the way to go, but the little I have worked with java RE makes it seem a poor solution compared to python - named captures specifically. But, I know less about python than I do about Java! So, I am looking for a modular system to parse text data that I can easily adapt, extend, and distribute later on. I am willing to learn more python if I need it, but my time and abilities are limited. Any suggestions? An example of the text I am parsing follows. ================================================================================================= 8122 TU REPORT AT 06.45/N EFFECTIVE JUN 08-JUN 29 1 CAPT, 1 F/O DAY FLT. EQP DEPARTS ARRIVES BLK. BLK. DUTY CR. LAYOVER MO TU WE TR FR SA SU TU 180 320 PHX 0745 SAN 0857* 1.12 -- -- -- -- -- -- TU 005 320 SAN 0950 PHX 1106 1.16 -- 8 -- -- -- -- -- TU 592 L 320 PHX 1215 MCI 1652 2.37 -- 15 -- -- -- -- -- Radisson A/P 5.05 8.22 5.05 MCI 12.18 -- 22 -- -- -- -- -- (816) 464-2423 -- 29 -- WE 403 B 320 MCI 0610 PHX 0657 2.47 WE 149 320 PHX 0859 CMH 1547 3.48 Holiday Inn City Center 6.35 9.37 6.35 CMH 15.13 (614) 221-3281 TH 335 B 320 CMH 0800 PHX 0913 4.13 TH 343 L 320 PHX 1029 PVR 1508 2.39 Marriott Casamagna 6.52 9.23 6.52 PVR 15.52 52-322-2260000 TRANS: Hotel Shuttle FR 621 320 PVR 0815 PHX 0839 2.24 2.24 3.39 2.24 CREDIT HRS. 21.00 BLK. HRS. 20.56 LDGS: 8 TAFB 74.24 =================================================================================================
How can I override Django filter
6,079,021
0
0
174
0
python,django
Monkeypatch django.utils.timesince.timesince with your desired function.
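A hedged sketch of that monkeypatch; depending on the Django version, django.template.defaultfilters may hold its own reference to the original function and need rebinding as well:

```python
from django.utils import timesince as timesince_module

_original_timesince = timesince_module.timesince


def fast_timesince(d, now=None):
    # Placeholder for the more efficient implementation; delegates to the original here.
    return _original_timesince(d, now)


timesince_module.timesince = fast_timesince
```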
0
0
0
0
2011-05-21T01:19:00.000
2
0
false
6,078,990
0
0
1
1
I want to override Django's timesince template filter to create a more efficient filter for getting relative times. How can I do it?
django Error (EXTERNAL IP)
6,080,084
0
3
1,988
0
python,django,url
I guess the main problem is in your view file, where you are working with the HTTP request/response objects. Check that all the settings in settings.py are accurate, and also use a try/except block to find the error more precisely.
0
0
0
0
2011-05-21T06:22:00.000
4
0
false
6,080,024
0
0
1
1
I am receiving notifications every time someone enters an address that does not exist. Traceback (most recent call last): File "/home/user/opt/local/django/core/handlers/base.py", line 100, in get_response File "/web/blog/views.py", line 33, in post File "/home/user/local/django/db/models/manager.py", line 132, in get File "/home/user/opt/local/django/db/models/query.py", line 347, in get DoesNotExist: Post matching query does not exist. How do I solve it?
ipython equivalent for javascript/coffeescript for node.js?
6,081,947
7
33
6,571
0
javascript,node.js,ipython,coffeescript
To my knowledge, node and coffee are the only full-featured command-line REPLs for Node.js and CoffeeScript (respectively) right now. In their latest iterations, both offer some degree of colorful output, pretty printing, and completion.
0
0
0
0
2011-05-21T07:13:00.000
6
1
false
6,080,242
1
0
1
1
More specifically, is there a REPL that has (more) colorful output, pretty printing, tab completion and the other goodies that ipython has for node.js javascript/coffeescript?
Preserve POST variables during login redirect in GAE?
6,084,633
1
3
375
0
python,google-app-engine,http-post
The general problem with capturing a POST and turning it into a GET is first that the query string on a GET has a browser-dependent limited size, and second that a POST may be form/multi-part (what to do with the uploaded file becomes an issue). An approach that might work for you is to accept the POST and save the data, then redirect to a page that requires login, passing the Key(s) (or enough information to reconstruct them) in the query string. The handler for that URL then assumes successful login, and fixes up the saved data (say, to associate it with the logged-in user) as appropriate. People who decide not to login will leave orphaned records, which you can clean up via a cron job.
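A rough sketch of that flow with the webapp framework of the time; PendingSubmission and the /resume URL are hypothetical:

```python
from google.appengine.api import users
from google.appengine.ext import db, webapp


class PendingSubmission(db.Model):  # hypothetical holding area for the POST data
    payload = db.TextProperty()


class SubmitHandler(webapp.RequestHandler):
    def post(self):
        # Save the POST body first, then send the user through login with only a key
        # in the query string; a /resume handler would re-associate the data afterwards.
        pending = PendingSubmission(payload=self.request.body)
        pending.put()
        self.redirect(users.create_login_url("/resume?key=%s" % pending.key()))
```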
0
1
0
0
2011-05-21T20:01:00.000
2
0.099668
false
6,084,123
0
0
1
1
In a form, I submit data to a python webapp handler (all Google App Engine based) using an HTTP POST request. In this script, I first check if the user is logged in and, if not, I use users.create_login_url(...) to redirect the user to the login page first. How can I ensure that after login the user is not just forwarded to my python script again, but that the POST variables are also preserved? The only way I found was turning all POST variables into URL parameters and adding them to the URL. Is that possible at all?
What's your folder layout for a Flask app divided in modules?
6,091,602
5
17
7,631
0
python,flask,flask-sqlalchemy
Actually I found out what I was looking for. Instead of importing flaskext.sqlalchemy in the main __init__ you import it in the model. After that you import the model in the main __init__, start it with db.init_app() and pass in the app configuration. It is not as flexible as the skeleton shown in @Sean's post, but it was what I wanted to know. If I weren't just toying around, the skeleton would probably be the one I'd use.
0
0
0
0
2011-05-22T15:47:00.000
2
0.462117
false
6,089,020
0
0
1
1
I am experimenting with Flask coming from Django and I really like it. There is just one problem that I ran into. I read the flask docs and the part about big applications or something like that and it explains a way to divide your project in packages, each one with its own static and templates folder as well as its own views module. the thing is that I cannot find a way that works to put the models in there using SQLAlchemy with the Flask extension. It works from the interactive prompt to create the tables, but when i use it inside the code it breaks. So I wanted to know how more experienced Flask developers solved this.
How do I secure REST calls I am making in-app?
6,098,282
4
11
4,137
0
python,google-app-engine,rest,restful-authentication,tipfy
Securing a javascript client is nearly impossible; at the server, you have no fool-proof way to differentiate between a human using a web browser and a well-crafted script. SSL encrypts data over the wire but decrypts at the edges, so that's no help. It prevents man-in-the-middle attacks, but does nothing to verify the legitimacy of the original client. OAuth is good for securing requests between two servers, but for a Javascript client, it doesn't really help: anyone reading your javascript code can find your consumer key/secret, and then they can forge signed requests. Some things you can do to mitigate API scraping: Generate short-lived session cookies when someone visits your website. Require a valid session cookie to invoke the REST API. Generate short-lived request tokens and include them in your website HTML; require a valid request token inside each API request. Require users of your website to log in (Google Accounts / OpenID); check auth cookie before handling API requests. Rate-limit API requests. If you see too many requests from one client in a short time frame, block them.
0
0
1
0
2011-05-23T00:36:00.000
3
0.26052
false
6,091,784
0
0
1
1
I have an application that has a "private" REST API; I use RESTful URLs when making Ajax calls from my own webpages. However, this is unsecure, and anyone could make those same calls if they knew the URL patterns. What's the best (or standard) way to secure these calls? Is it worth looking at something like OAuth now if I intend to release an API in the future, or am I mixing two separate strategies together? I am using Google App Engine for Python and Tipfy.
How to use backends in google app engine without wasting cpu resources?
6,098,645
1
1
1,080
0
python,google-app-engine,google-cloud-datastore,backend
The general advice for optimizing CPU usage is to minimize RPCs, understand how to use the datastore efficiently and use appstats to find your bottlenecks. For specific optimization advice, we would need to see some code. While backends can be configured to handle public requests, they aren't intended to replace normal instances. Backends are designed for resource-intensive offline processing. Normal instances are created and destroyed automatically in response to request volume; backends have to be configured and instantiated explicitly by an administrator, thus they are not well-suited to handling traffic spikes. They're also more expensive: keeping a backend instance online for 24 hours will cost you $3.84, whether the instance is handling requests or not.
0
1
0
0
2011-05-23T11:55:00.000
1
1.2
true
6,096,806
0
0
1
1
While processing data in the datastore using backends, App Engine is using my CPU resources completely. How do I process my data without wasting CPU resources? Can I have the entire app on a backend without wasting CPU resources? Am I missing something? If the question is too vague, ask me to clarify. Thanks.
Does reportlab's renderPM work on Google appengine?
6,166,841
0
2
234
0
python,image,google-app-engine,reportlab
Just to close the question - as indicated by Wooble, reportlab itself works fine on Appengine (being pure python) but the RenderPM library doesn't.
0
1
0
0
2011-05-23T15:45:00.000
1
1.2
true
6,099,592
0
0
1
1
I wanted to use ReportLab's RenderPM to generate images on Google App Engine, but it looks like it depends on C libraries. Does anyone know if it's possible to get it working? Thanks, Richard
Multi-domain authentication options for google app engine
6,110,157
0
1
796
0
python,google-app-engine,authentication
Authentication does not imply authorisation. All that the federated ID system does for your application is give you a username/userid that you can trust. So you can set up your user accounts tied to this information and rely on the fact that whenever you see that userid you are talking to the same user. Or in the case of domain-wide applications, whenever you see someone with that domain in their userid. It is completely up to your application to decide if that userid has any meaning in your application. If I log in to your app now with my google account, it should say "Oh I haven't seen you before, would you like to join?" ... it should (depending on your app) not just assume I'm authorised to use your application, just because I told you my name. I'm not sure where you got the "domain login model" from? The only two choices are Google Account and Open/FederatedID; neither of those attempts to restrict user access. In your final example, users spanning multiple google accounts will see different results depending on whether they have enabled multiple sign-in or not. Most users will be presented with a screen to choose which google account they mean before continuing.
0
1
0
0
2011-05-24T07:53:00.000
1
0
false
6,107,318
0
0
1
1
I am looking for some suggestions to implement authentication (and authorization) in our GAE app. Assuming that our app is called someapp, our requirement is as follows: someapp is primarily for google apps users of the domain its installed for but can also authenticate users from other google apps domains. For example, lets say google apps is configured on domainX.com and domainY.com. Additionally the admin for domainX.com has added someapp to their domain from the apps marketplace. The admin for domainX.com invites [email protected] and [email protected] log on to the application. Both google app domain users should be able to use their SSO (single sign-on) functionality. As far as we know, current authentication options in the app engine allow either domain login, which allows only the users of one domain to log in to the app or federated/openid login which would allow the users of any domain to log in to the app. There is no in-between option which would allow only the users of previously authorized domains to log on to the app. Does that mean our only option is to leave aside google apps authentication and implement our own custom authentication? Also in our sample scenario above, what if domainX.com and domainY.com have both added someapp. If [email protected] navigates to someapp.appspot.com, which installation of the app will be used, the one installed on domainX.com or the one installed on domainY.com.
How do I remove south from a django project
6,886,466
5
6
4,611
0
python,django,django-south
Remove 'south' from INSTALLED_APPS, remove south_migrations table from DB. Also, you'll need to delete the Migrations folders from your app folders.
0
0
0
0
2011-05-24T08:55:00.000
4
0.244919
false
6,107,978
0
0
1
3
I installed south and tried a few changes using it, which didn't exactly work out the way I wanted it to. Thankfully, my data is safe but locked into south. I want to remove south and use syncdb normally now, how do I do that without affecting my data?
How do I remove south from a django project
6,108,006
3
6
4,611
0
python,django,django-south
What does it mean for your data to be "locked into" South? The data lives in the database, and South simply creates the schema for you and migrates it when necessary. If you remove South, the data will stay exactly the same.
0
0
0
0
2011-05-24T08:55:00.000
4
0.148885
false
6,107,978
0
0
1
3
I installed south and tried a few changes using it, which didn't exactly work out the way I wanted it to. Thankfully, my data is safe but locked into south. I want to remove south and use syncdb normally now, how do I do that without affecting my data?
How do I remove south from a django project
6,108,042
10
6
4,611
0
python,django,django-south
Remove 'south' from INSTALLED_APPS, remove south_migrations table from DB.
0
0
0
0
2011-05-24T08:55:00.000
4
1.2
true
6,107,978
0
0
1
3
I installed south and tried a few changes using it, which didn't exactly work out the way I wanted it to. Thankfully, my data is safe but locked into south. I want to remove south and use syncdb normally now, how do I do that without affecting my data?
Google App Engine Locking
6,112,067
2
5
467
0
python,google-app-engine
Instantiating an email object certainly does not count against your "recipients emailed" quota. Like other App Engine services, you consume quota when you trigger an RPC, i.e. call send(). If you intended to email 1500 recipients and App Engine says you emailed 45,000, your code has a bug.
0
1
0
0
2011-05-24T11:21:00.000
2
0.197375
false
6,109,602
0
0
1
2
just wondering if anyone of you has come across this. I'm playing around with the Python mail API on Google App Engine and I created an app that accepts a message body and address via POST, creates an entity in the datastore, then a cron job is run every minute, grabs 200 entities and sends out the emails, then deletes the entities. I ran an experiment with 1500 emails, had 1500 entities created in the datastore and 1500 emails were sent out. I then look at my stats and see that approx. 45,000 recipients were used from the quota, how is that possible? So my question is at which point does the "Recipients Emailed" quota actually count? At the point where I create a mail object or when I actually send() it? I was hoping for the second, but the quotas seem to show something different. I do pass the mail object around between crons and tasks, etc. Anybody has any info on this? Thanks. Update: Turns out I actually was sending out 45k emails with a queue of only 1500. It seems that one cron job runs until the previous one is finished and works out with the same entities. So the question changes to "how do I lock the entities and make sure nobody selects them before sending the emails"? Thanks again!
Google App Engine Locking
6,141,535
3
5
467
0
python,google-app-engine
Use tasks to send the email. Create a task that takes a key as an argument, retrieves the stored entity for that key, then sends the email. When your handler receives the body and address, store that as you do now but then enqueue a task to do the send and pass the key of your datastore object to the task so it knows which object to send an email for. You may find that the body and address are small enough that you can simply pass them as arguments to a task and have the task send the email without having to store anything directly in the datastore. This also has the advantage that if you want to impose a limit on the number of emails sent within a given amount of time (quota) you can set up a task queue with that rate.
0
1
0
0
2011-05-24T11:21:00.000
2
0.291313
false
6,109,602
0
0
1
2
just wondering if anyone of you has come across this. I'm playing around with the Python mail API on Google App Engine and I created an app that accepts a message body and address via POST, creates an entity in the datastore, then a cron job is run every minute, grabs 200 entities and sends out the emails, then deletes the entities. I ran an experiment with 1500 emails, had 1500 entities created in the datastore and 1500 emails were sent out. I then look at my stats and see that approx. 45,000 recipients were used from the quota, how is that possible? So my question is at which point does the "Recipients Emailed" quota actually count? At the point where I create a mail object or when I actually send() it? I was hoping for the second, but the quotas seem to show something different. I do pass the mail object around between crons and tasks, etc. Anybody has any info on this? Thanks. Update: Turns out I actually was sending out 45k emails with a queue of only 1500. It seems that one cron job runs until the previous one is finished and works out with the same entities. So the question changes to "how do I lock the entities and make sure nobody selects them before sending the emails"? Thanks again!
Help me decide what to use with Google App Engine for this practical work
6,116,489
4
0
346
0
java,python,google-app-engine
I would recommend using Python + Django framework. I love Java, but for the Google App Engine there is much more documentation online for Python.
0
1
0
0
2011-05-24T20:09:00.000
3
0.26052
false
6,116,236
0
0
1
1
I'm working on a practical work for college, and I have to develop a web-app that could be used by all the teachers from my province. The application should let the users (teachers) manage some information related to their daily duties. One of the requirements is that I must use Google App Engine platform for developing and hosting the web application. I have 2 months to finish the work. I have some intermediate knowledge of C++, so what language (Python or Java ) and web framework do you think would the best to develop the application in less time? I know this is not a strictly programming questions, but please don't delete this post at least until I get a few answer in order to have an idea of how to proceed. Many thanks in advance!
Which python project to choose for a database-heavy application
6,124,043
0
1
899
0
python,web-applications,web-frameworks
My gut says you want to use SQLAlchemy as the ORM. Turbogears does this out of the box, and probably is the largest player in the "not Django" space. There was some work on pulling in SQLAlchemy for (or in addition to!) Django's ORM, but I don't know how current that work is (a quick google search found articles from 2008-2009 as the top hits)
0
0
0
0
2011-05-25T11:45:00.000
4
0
false
6,123,950
0
0
1
3
I plan to develop a rather database heavy (~100 tables) web application in python. The focus is on providing a nice and task-optimized interface for people that edit or navigate through the data. Other focuses are: Handle lots of data and complex queries. Internationalization (translation, timezones, currencies) Mailings (bulk emailing as well as notifications) Easy integration into other websites (pull data from or push data to the application) A role based authentication scheme. (ideally enforcing one role at a time) It should be easy and fast (for python programmers) to create custom forms and workflows to work with the data. I've read a lot about django, turbogears, pyramid, webcore, … but I'm still having a hard time to figure out where to start. My current evaluation would suggest that turbogears is the way to go. Pyramid seems too much to learn about. Django seems to be too publishing focused. WebCore seems a bit to immature to base such a project on it. Am I overlooking something? Are there other more suitable python frameworks? Is my information about some of them plain wrong? Which framework would you choose for this project, and why?
Which python project to choose for a database-heavy application
6,124,088
3
1
899
0
python,web-applications,web-frameworks
Imo the only part of django that might be "too" publishing oriented is the admin, but I have seen plenty of django applications doing stuff neatly. Django has plenty of apps available covering what you want to do, but the only roadblock you might find is the part about handling lots of data and complex queries. You will probably move out of django ORM land, but you might even move out of SQLAlchemy land too. Most of these projects use ORMs, so I would look into SQLAlchemy first and evaluate how to use it for your needs. Second, I would just go through the tutorials of the following projects; reading about them is good, but a small tutorial/project (or mini prototype) is the only way to see if the project fits your programming style: pyramid, turbogears, and django. They have, afaik, the largest communities. The best tool will be the one you feel more comfortable with. They all have excellent documentation, good supportive communities, and are mature enough for solid projects, and given the very subtle differences, you can probably use any of them for your needs.
0
0
0
0
2011-05-25T11:45:00.000
4
1.2
true
6,123,950
0
0
1
3
I plan to develop a rather database heavy (~100 tables) web application in python. The focus is on providing a nice and task-optimized interface for people that edit or navigate through the data. Other focuses are: Handle lots of data and complex queries. Internationalization (translation, timezones, currencies) Mailings (bulk emailing as well as notifications) Easy integration into other websites (pull data from or push data to the application) A role based authentication scheme. (ideally enforcing one role at a time) It should be easy and fast (for python programmers) to create custom forms and workflows to work with the data. I've read a lot about django, turbogears, pyramid, webcore, … but I'm still having a hard time to figure out where to start. My current evaluation would suggest that turbogears is the way to go. Pyramid seems too much to learn about. Django seems to be too publishing focused. WebCore seems a bit to immature to base such a project on it. Am I overlooking something? Are there other more suitable python frameworks? Is my information about some of them plain wrong? Which framework would you choose for this project, and why?
Which python project to choose for a database-heavy application
6,124,031
0
1
899
0
python,web-applications,web-frameworks
The number of tables is not relevant for speeds etc. and not relevant for the choice of the framework. Recommendation: use SQLAlchemy as ORM between database and application. Go for Pyramid as web framework. Pyramid is easy, well-documented, test and very flexible in all aspects. Forms etc. can be easily created using "colander" + "deform" add-ons.
0
0
0
0
2011-05-25T11:45:00.000
4
0
false
6,123,950
0
0
1
3
I plan to develop a rather database heavy (~100 tables) web application in python. The focus is on providing a nice and task-optimized interface for people that edit or navigate through the data. Other focuses are: Handle lots of data and complex queries. Internationalization (translation, timezones, currencies) Mailings (bulk emailing as well as notifications) Easy integration into other websites (pull data from or push data to the application) A role based authentication scheme. (ideally enforcing one role at a time) It should be easy and fast (for python programmers) to create custom forms and workflows to work with the data. I've read a lot about django, turbogears, pyramid, webcore, … but I'm still having a hard time to figure out where to start. My current evaluation would suggest that turbogears is the way to go. Pyramid seems too much to learn about. Django seems to be too publishing focused. WebCore seems a bit to immature to base such a project on it. Am I overlooking something? Are there other more suitable python frameworks? Is my information about some of them plain wrong? Which framework would you choose for this project, and why?
Get a python program I've written onto Blackberry or other mobile platform?
6,128,379
1
5
2,149
0
python,blackberry,cross-platform
Given the number of platforms you listed, iPhone, Symbian, Android, BlackBerry, and Windows Mobile I'd suggest you look into a web framework you can integrate your logic into. I know Django is quite popular. Putting a web-frontend on your app does mean your users do have to be connected to the Internet to use your application and you have to have it hosted publicly on the Internet- but I think the Pros far outweigh the Cons. If you develop your application to run on the phone, you have to address every platform you want it to run on; conversely, if you host your app on the web, any standards compliant browser should be able to present your application to the user. This also means the application isn't tied to the device. Should the user change phones, or loose their phone- the application (and their data) is not lost or compromised. This also means the users can access the application from their desktop, tablet, nettop, PS3, wifi-connected toaster etc. I know this isn't really what you are looking for; its a suggestion to the fundamental design of your application; but with the little info you've posted about the application- there was nothing suggesting it 'can not' be hosted on a the web using standards compliant technologies. FWIW- making a mobile application more 'future proof' will only pay out in the end. mobile platforms change faster then just about any other consumer technology. My $0.02
0
0
0
0
2011-05-25T15:41:00.000
3
0.066568
false
6,127,135
0
0
1
1
The said program runs only on my PC as of now. I've been searching through StackOverflow, and I've found out about RhoMobile's Rhodes which allows you to write the app in Ruby once and run it in multiple mobile platforms: iPhone, Symbian, Android, BlackBerry, and Windows Mobile. Is there anything similar for Python? If not how would I go about doing it? Thanks in advance!
SOAP Method Max Item Number
6,132,117
0
0
86
0
python,soap,suds
There is no such limit. It's just on the server side so that big queries wouldn't hinder the server's work.
0
0
0
0
2011-05-25T23:22:00.000
1
1.2
true
6,132,080
0
0
1
1
I was wondering if there was a maximum limit to the number of items that could be received through a SOAP method, or if the server I'm communicating with just has a strange limit. When using Python's framework Suds, I used a method called getRecords from a database of about 39,000 rows. Unfortunately, when I actually get the results, I only get a list of about 250. Of course, this is data for each row that is necessary for the system to work. I was just curious if the reason why I was being limited was based upon a limit set by SOAP. Thanks!
Adding/removing product fields to Product class in Cartridge?
6,141,939
0
0
295
0
python,django
The only way is to subclass the product class, and then use your subclass instead anywhere you need access to the extra fields. Just bear in mind that spots that require the parent class will accept the subclass but spots that require the subclass will not accept the parent class. It's like putting a round peg into a square hole versus putting a square peg into a round hole.
0
0
0
0
2011-05-26T15:22:00.000
1
0
false
6,141,026
0
0
1
1
Is there any way to add fields to the default product class provided by Cartridge? I'm attempting to set up a parent-child relationship where the parent is the Product class and the child embellishes it with custom product fields. Is this at all possible? Thanks. Mark Anderson
Can auto transfer data from memcached to mysql DB?
6,146,042
1
1
329
1
python,mysql,django,memcached
Memcached is not a persistent store, so if you need your data to be durable at all then you will need to store them in a persistent store immediately. So you need to put them somewhere - possibly a MySQL table - as soon as the data arrive, and make sure they are fsync'd to disc. Storing them in memcached as well only speeds up access, so it is a nice to have. Memcached can discard any objects at any time for any reason, not just when they expire.
0
0
0
0
2011-05-26T19:06:00.000
1
0.197375
false
6,143,748
0
0
1
1
I'm building a real-time service, so my real-time data needs to be stored in memcached before being written to the DB (to avoid reading/writing the MySQL DB too much). I want to move data to the MySQL DB when certain events occur, e.g. before data expires or when there is least-recently-used (LRU) data. What is the solution for my problem? My system uses memcached, MySQL, Django and python-memcache. Thanks.
Python on the web: executing code as it's processed?
6,159,307
1
1
971
0
python,cgi
Probably the best approach to something like this is to separate your concerns. Make an ajax-driven "console" type display that, for instance, polls a log file which is written to in the worker process.
0
0
0
0
2011-05-27T03:10:00.000
3
0.066568
false
6,147,461
0
0
1
1
I made a python application that I'd like to deploy to the web. I'm on a Mac, so I enabled the web server and dropped it in my cgi-bin, and it works fine. The problem is, the application does some intensive computations, and I would really like to let the user know what's going on while it's executing. Even though I have print statements scattered throughout the code, it doesn't output anything to my browser until the entire thing is done executing. Is there any way I can fix this so that output appears as the code executes?
How to upload current date/time into App Engine with Bulkloader tool?
6,153,856
1
1
389
0
python,google-app-engine,bulkloader
I would do this: add a property to my bulkuploader.yaml for the modified time and use an import transform to get the date.
0
1
0
0
2011-05-27T07:41:00.000
1
1.2
true
6,149,263
0
0
1
1
How can I add a last modified time property to my entity kind that gets updated during a bulk upload? I'm currently using appcfg upload_data to upload a csv up to my high replication datastore. I plan to have this as a cron job to do a one-way sync from our internal database to the datastore. In order to account for stale records, I'd like to have it update a last modified time property and then do a map reduce to delete old records (older than a week). Records will be updated using the key property. What would be the best way to create the last modified time, considering I want to reserve the ability to use the Datastore Admin to delete the entire entity kind if I need to? Create an object model to "initialize" the datastore entity with all the necessary fields? Add a property to my bulkuploader.yaml for the modified time and use an import transform to get the date? Other... Thanks in advance!
How to keep static resource in memory in python web app?
6,149,856
2
0
150
0
python,image,web-applications,resources
The images will most likely be cached by the system virtual memory unless you are running out of memory. In any case, it's common practice to always run webapps behind a fast webserver and let the webserver serve static images. The webapp could simply send the img tags to the browser, and then the browser would load them from the webserver.
0
0
0
0
2011-05-27T08:19:00.000
2
0.197375
false
6,149,611
1
0
1
1
I implemented a python web app which generates images dynamically using small pieces of images. Each time a user visits the page, a script runs and all the small images are loaded from disk to generate the big image. I think loading the small images from disk is quite a big overhead. Is it possible to load all the small images once, so that all other python scripts can use them freely whenever called?
Proper way of avoiding to store the same attachment twice
7,260,768
1
1
99
0
python,project,task,openerp
I think building a wizard is the only way that will work, because there isn't a real link between attachment and project.task. If I were you, I would build a wizard that walks the parent relation to build a list of all ancestor task ids, plus the current task id. Then have the wizard open the attachment window using that list of ids as one of the domain search criteria.
0
0
0
0
2011-05-27T12:01:00.000
1
1.2
true
6,151,975
0
0
1
1
I'm using the project.task model, where delegation creates a parent/child link between both tasks. When delegating, I would like the person who gets the delegated task to also have access to the attachments on the original task; how can I avoid having to actually copy them? I've thought about using an <act_window> or a wizard which checks if there is a parent task and, if so, (also) shows the parent task attachments. The problem with act_window is that you would need to specify 2 different act_window records, and that would still only cover one parent and one child relation (the task could be delegated further). The wizard approach seems to be a lot of overkill work for something that could maybe be solved more easily (hence the question).
Appengine - Storing a Pickled in Datastore
6,157,563
1
2
853
0
python,google-app-engine,pickle
Never mind. I just ran tests with both. It appears that you cannot use TextProperty with pickle. It will cause errors. Using it with BlobProperty, on the other hand, works perfectly.
0
1
0
0
2011-05-27T20:27:00.000
3
1.2
true
6,157,367
0
0
1
2
In Google Appengine, I'm interested in pickling an object and storing it in the datastore. I don't need to index it. Is there any difference if I store it as a BlobProperty or TextProperty? Which one is better?
Appengine - Storing a Pickled in Datastore
6,157,457
4
2
853
0
python,google-app-engine,pickle
BlobProperty can store binary data while TextProperty can store only strings. You can use BlobProperty, as TextProperty is basically a BlobProperty with encoding.
0
1
0
0
2011-05-27T20:27:00.000
3
0.26052
false
6,157,367
0
0
1
2
In Google Appengine, I'm interested in pickling an object and storing it in the datastore. I don't need to index it. Is there any difference if I store it as a BlobProperty or TextProperty? Which one is better?
Key generation in Google App Engine
6,164,669
1
3
421
0
python,google-app-engine
Keys in App Engine are based on: The keys of the ancestor entities of the entity, if any. The kind name of the entity. Either an auto-generated integer id or a user-assigned key_name. The integer IDs are allocated in generally-increasing blocks to various instances of the application, so that they can be guaranteed to be unique but are not guaranteed to actually get assigned to entities in a monotonically increasing fashion. The keys do not use anything like a universally unique ID.
0
1
0
0
2011-05-28T04:35:00.000
4
0.049958
false
6,159,666
0
0
1
1
If you have ever used Google App Engine, it generates a key for every single instance of a model created. It's pretty neat. I'm looking into building something like that. Do they do it so that the key is based on the content? Or do they just take a random choice from a-zA-Z0-9 like 50 times and build a string out of it? That sounds reasonable, because the chance that two keys would be the same would be lower than 1/10^89.
Appcelerator Python Application Requires Silverlight?
6,165,444
0
0
161
0
python,silverlight,client-side,appcelerator
I got the answer from the appcelerator developer center: the php/python/ruby scripts are executed in the context of Titanium Desktop. So you will need to run it inside a session of the Ti Desktop application, and not on a webpage. Ti Desktop is a modified webkit that knows how to handle these scripts. So apparently this tag is not for websites.
0
0
0
0
2011-05-28T17:06:00.000
1
1.2
true
6,163,105
1
0
1
1
Something that I couldn't understand from the appcelerator website: if I use appcelerator to write a client-side python application for the web, will it require that users have silverlight installed on their machines?
Python Markdown module choking on unicode conversion, utf-8
6,166,085
1
1
957
0
python,unicode,utf-8,markdown
The \xe2\x80\x9c is U+201C LEFT DOUBLE QUOTATION MARK (a "smart quote") when decoded as UTF-8. The two occurrences of \xe2\x80" are not valid UTF-8 sequences and the presence there of a " (a "dumb" quote) is suspicious. You appear to have a mangling problem or an encoding problem, or both. We need to sort that out before we get to the task of replacing e.g. smart quotes by dumb quotes. Exactly how are "people submitting stuff"? What transformations has it gone through before markdown does unicode(txt, 'utf-8')?
0
0
0
0
2011-05-29T04:08:00.000
1
0.197375
false
6,165,893
1
0
1
1
I'm using the markdown module from web2py to handle marked up text. The problem is, people are submitting stuff with smartquotes, special characters etc, and I need to replace those with their equivalents. I have text like this: '\n\r\nThe Colonels face paled a bit. \xe2\x80\x9cBut, then \xe2\x80" excuse my boldness, sir \xe2\x80" our going to Uvar now' How do I ensure that calling unicode(txt, 'utf-8') like it does on the text internally inside markdown will not throw an error? The fancy special quotes that word processing programs insert are the normal cause, but there seem to be many characters which are an issue.
Django: Model.objects.get fails with ImportError exception
6,170,760
0
1
1,042
0
python,exception,django-models,importerror
Get rid of the models directory, and put all your models in a models.py file, unless you have a VERY good reason not to. Change your imports to from core.models import Variable (rename your Variables class to Variable - django models should be named as the singular not the plural). The problem probably has to do with the fact that your models live in namespaces other than models; ie. models.storage. The django infrastructure expects certain things to be in certain places. If you're going to put models in separate namespaces like you're doing, you should be importing them from the __init__.py file within the models module. Don't do this, again, unless you have a very good reason. Finally, when asking questions of this nature, you should provide a lot more information. You should show the code of all of the relevant files. You should provide detail on what you've done that deviates from the convention of django (in this case, a models directory rather than a models.py file). You should also show relevant settings from settings.py, in this case, your INSTALLED_APPS setting. You should also tell us what version of django and python you are using; this is sometimes relevant. More information upfront is much better than less information.
0
0
0
0
2011-05-29T08:14:00.000
1
1.2
true
6,166,654
0
0
1
1
I've model which represents settings in my system and I use it from another part of my application so that import has 3 levels WORKING CODE <- Module <- Model Model Variables from django.db import models class Variables(models.Model): key = models.CharField(max_length = 20, verbose_name = 'Variable') value = models.CharField(max_length = 1024) class Meta: app_label = 'core' def __unicode__(self): return '%s: %s' % (self.key, self.value,) Here is the code I'm using it from Module variables.py from core.models.storage import Variables def get_var(name): return Variables.objects.get(key = name) Module config.py var = get_var('some_key') When I use this stuff from django shell everything works well but when I call get_var function I've ImportError exception storage.py from django.db import models class Variables(models.Model): key = models.CharField(max_length = 20, verbose_name = 'Variable') value = models.CharField(max_length = 1024) class Meta: app_label = 'core' def __unicode__(self): return '%s: %s' % (self.key, self.value,) File "monitor_cli.py", line 19, in print worker.get_provider() File "/home/sultan/Project/monitor/app/worker.py", line 14, in get_provider print Variables.objects.get(pk=1) File "/usr/local/lib/python2.6/dist-packages/django/db/models/manager.py", line 132, in get return self.get_query_set().get(*args, **kwargs) File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 341, in get clone = self.filter(*args, **kwargs) File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 550, in filter return self._filter_or_exclude(False, *args, **kwargs) File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 568, in _filter_or_exclude clone.query.add_q(Q(*args, **kwargs)) File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/query.py", line 1172, in add_q can_reuse=used_aliases, force_having=force_having) File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/query.py", line 1060, in add_filter negate=negate, process_extras=process_extras) File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/query.py", line 1226, in setup_joins field, model, direct, m2m = opts.get_field_by_name(name) File "/usr/local/lib/python2.6/dist-packages/django/db/models/options.py", line 307, in get_field_by_name cache = self.init_name_map() File "/usr/local/lib/python2.6/dist-packages/django/db/models/options.py", line 337, in init_name_map for f, model in self.get_all_related_m2m_objects_with_model(): File "/usr/local/lib/python2.6/dist-packages/django/db/models/options.py", line 414, in get_all_related_m2m_objects_with_model cache = self._fill_related_many_to_many_cache() File "/usr/local/lib/python2.6/dist-packages/django/db/models/options.py", line 428, in _fill_related_many_to_many_cache for klass in get_models(): File "/usr/local/lib/python2.6/dist-packages/django/db/models/loading.py", line 167, in get_models self._populate() File "/usr/local/lib/python2.6/dist-packages/django/db/models/loading.py", line 61, in _populate self.load_app(app_name, True) File "/usr/local/lib/python2.6/dist-packages/django/db/models/loading.py", line 76, in load_app app_module = import_module(app_name) File "/usr/local/lib/python2.6/dist-packages/django/utils/importlib.py", line 35, in import_module __import__(name) ImportError: No module named c
Database Based Job scheduler
6,184,556
1
0
710
1
php,python,database
Here's a possible solution: a script, either in php or python, performing your database tasks, plus a scheduler (cron for linux, or the windows task scheduler) where you set the frequency of your jobs. I'm using this solution for multiple projects. Very easy to set up.
0
0
0
0
2011-05-31T07:50:00.000
2
1.2
true
6,184,491
0
0
1
1
I need a job scheduler (a library) that queries a db every 5 minutes and, based on time, triggers events which have expired and reruns them on failure. It should be in Python or PHP. I researched and came up with Advanced Python Scheduler, but it is not appropriate because it only schedules the jobs in its own job store. Instead, I want it to take jobs from a database. I also found Taskforest, which exactly fits my needs except that it is a text-file based scheduler, meaning the jobs have to be added to the text file either through the scheduler or manually, which I don't want to do. Could anyone suggest something useful?
Allowing Users logged in with multiple accounts at once to the same Django site
6,249,925
1
1
329
0
python,django,session,django-sessions,django-users
I ended up storing the per-site data in the user's session, i.e. session['site_id_1'] = user_obj_1, session['site_id_2'] = user_obj_2, etc... Instead of logging in, I just store the user data in the appropriate key. Instead of logging out, I delete the key for the site.
0
0
0
0
2011-05-31T08:41:00.000
2
1.2
true
6,184,994
0
0
1
1
I'm creating a widget that gets installed on various different sites and I need distinct users for each site. Problem is, the same person browsing might have 2 different sites open at once that use my widget. This means that I need users to be logged in with multiple accounts simultaneously to the same Django site. From my understanding, Django usually assumes that only 1 user is logged in per session. What's the simplest and most effective way to go about this?
Ant simulation: it's better to create a Process/Thread for each Ant or something else?
6,189,789
2
6
1,030
0
python,multithreading,resources,simulation,multiprocess
I wrote an ant simulation (for finding a good TSP solution) and I wouldn't recommend a thread-based solution. I use a loop to calculate the next step for each ant, so my ants do not really behave concurrently (but synchronize after each step). I don't see any reason to model those ants with threads. It's no advantage in terms of run-time behavior, nor is it an advantage in terms of elegance (of the code)! It might be, admittedly, slightly more realistic to use threads since real ants are concurrent, but for simulation purposes this is IMHO negligible.
0
0
0
1
2011-05-31T14:49:00.000
5
0.07983
false
6,189,398
1
0
1
2
The simple study is: ant life simulation. I'm creating an OO structure that has a class for the anthill, a class for the ant and a class for the whole simulator. Now I'm brainstorming on "how to" make the ants 'live'... I know that there are projects like this already started, but I'm brainstorming; I'm not looking for a just-ready-to-eat dish. Honestly I have to run some tests to understand "what is better"; AFAIK threads, in Python, use less memory than processes. What the "ants" have to do when you start the simulation is just: move around in a random direction; if they find food -> eat it / bring it to the anthill; if they find an ant from another anthill that is transporting food -> attack -> collect the food -> do what they have to do... and so on. That means that I have to "share" information across ants and across the whole environment. So I rephrase: is it better to create a Process/Thread for each ant, or something else? EDIT: Because of my question "what is better", I've upvoted all the smart answers that I received, and I also put a comment on them. After my tests, I'll accept the best answer.
Ant simulation: it's better to create a Process/Thread for each Ant or something else?
6,189,548
1
6
1,030
0
python,multithreading,resources,simulation,multiprocess
I agree with @delan - it seems like overkill to allocate a whole thread per Ant, especially if you are looking to scale this to a whole anthill with thousands of the critters running around. Instead you might consider using a thread to update many ants in a single "cycle". Depending on how you write it - you need to carefully consider what data needs to be shared - you might even be able to use a pool of these threads to scale up your simulation. Also keep in mind that in CPython the GIL prevents multiple native threads from executing code at the same time.
0
0
0
1
2011-05-31T14:49:00.000
5
0.039979
false
6,189,398
1
0
1
2
The simple study is: ant life simulation. I'm creating an OO structure that has a class for the anthill, a class for the ant and a class for the whole simulator. Now I'm brainstorming on "how to" make the ants 'live'... I know that there are projects like this already started, but I'm brainstorming; I'm not looking for a just-ready-to-eat dish. Honestly I have to run some tests to understand "what is better"; AFAIK threads, in Python, use less memory than processes. What the "ants" have to do when you start the simulation is just: move around in a random direction; if they find food -> eat it / bring it to the anthill; if they find an ant from another anthill that is transporting food -> attack -> collect the food -> do what they have to do... and so on. That means that I have to "share" information across ants and across the whole environment. So I rephrase: is it better to create a Process/Thread for each ant, or something else? EDIT: Because of my question "what is better", I've upvoted all the smart answers that I received, and I also put a comment on them. After my tests, I'll accept the best answer.
How to sort collections based on current user locale on a Django site
9,914,642
0
0
393
0
python,django,internationalization,locale,python-babel
The solution I ended up taking is just converting the utf-8 string to ASCII and stripping the diacritics just for the sort operation. Not ideal but it ended up working for this specific case.
0
0
0
0
2011-05-31T17:51:00.000
1
1.2
true
6,191,477
0
0
1
1
I need to sort a collection of objects by a utf-8 string property (built via ActiveRecord). Currently the code is sorting by ASCII order via the order_by method, however this needs to be changed to locale.strcoll. Unfortunately using the built in locale functionality requires changing the culture for the entire application, not just the current request. I've looked at the Babel library but it does not appear to provide the functionality I need. The only other option I have been able to find is pyICU, however getting the ICU libraries installed in my environment will prevent this as a viable solution. Are there any other options?
What's the advantage or benefit of Slug field in Django?
6,192,676
8
1
3,559
0
python,django,slug
The slug provides a human-friendly url fragment for the page. This is often useful when people are deciding whether or not to click a link. There's one in the url of this page, for example: http://stackoverflow.com/questions/6192655/whats-the-advantage-or-benefit-of-slug-field-in-django You can actually get to this question without the slug (in StackOverflow's system), but a slug is a more friendly and more semantic way to address the page. Search engines do place some weight on words that appear in the address of the page. One downside of using only the slug to address a page is that if you change the content of the page and wish to change the title, you have to decide between changing the slug, or leaving it as is (and thus not reflecting the content of the page). StackOverflow's compromise of having the slug but not relying on it is one solution. Nothing in Django requires you to use slugs in your application, but it's a convenience that's present because many of us do.
0
0
0
0
2011-05-31T19:40:00.000
1
1.2
true
6,192,655
0
0
1
1
What's the benefit of the slug field? Does it make the url more search engine friendly? If it does, how? Isn't a meaningful page title search engine friendly enough?
How to insert a checkbox in a django form
58,656,035
-2
47
114,804
0
python,django,django-forms
You can just add required=False on your forms.BooleanField() args.
0
0
0
0
2011-06-01T01:46:00.000
4
-0.099668
false
6,195,424
0
0
1
1
I have a settings page where users can select whether they want to receive a newsletter or not. I want a checkbox for this, and I want Django to check it if 'newsletter' is true in the database. How can I implement this in Django?
Game Engine Remake - Trouble Choosing a Language /API(Java or Python)
6,195,610
0
1
213
0
java,python
I'd recommend going with PySFML and, of course, Python. If you do your Python programming correctly, and if you really are willing to fiddle with C or ASM Python plugins for faster computations, you shouldn't really take too many performance hits.
1
0
0
1
2011-06-01T02:04:00.000
3
1.2
true
6,195,508
0
0
1
2
The engine I've been wanting to remake is from a PlayStation 1 game called Final Fantasy Tactics, and the game is basically a 2.5D game I guess you could say. Low-resolution sprites and textures, and 3D maps for battlefields. The plan is to mainly load the graphics from a disc, or .iso (I already know the sectors the graphics need to be read from) and fill in the rest with game logic and graphics routines, and probably load other things from the disc like the map data. I want this to be a multiplatform project, because I use Linux and would like for more people to join in on the project once I have enough done (and it's easy to get more people through platforms like Windows). I'll be making a website to host the project. Also, none of the graphics will be distributed, they'll have to be loaded from your own disc. I'd rather not have to deal with any legal issues.. At least not soon after the project is hosted on my site. But anyway, here's my dilemma- I know quite a bit of Java, and some Python, but I'm worried about performance/feature issues if I make this engine using one of these two languages. I chose them due to familiarity and platform independence, but I haven't gotten too into graphics programming yet. I'm very much willing to learn, however, and I've done quite a bit of ASM work on the game- looking at graphics routines and whatnot. What would be the best route to take for a project like this? Oh, and keep in mind I'll eventually want to add higher-resolution textures in an .iso restructuring patch or something. I'm assuming based on my results on Google that I could go with something like Pygame + OpenGL, JOGL, Pyglet, etc. Any suggestions on an API? Which has plenty of documentation/support for game or graphics programming? Do they have any serious performance hits? Thank you for your time.
Game Engine Remake - Trouble Choosing a Language /API(Java or Python)
6,195,870
0
1
213
0
java,python
At the end of the day if you're passionate about the project and committed to getting the most out of the language you choose, the performance difference between java and python will be minimal if non-existent. Personally speaking, the biggest challenge is finishing the project once it loses novelty and initial momentum. I suggest you go with whichever language you're more passionate about and are interested in plumbing the depths of, or one that could boost your resume. Secondly, as you mention you're hoping to attract contributors, you may want to factor that into your decision. I can't comment much here, but have a look at similar projects with lots of activity. Good luck!
1
0
0
1
2011-06-01T02:04:00.000
3
0
false
6,195,508
0
0
1
2
The engine I've been wanting to remake is from a PlayStation 1 game called Final Fantasy Tactics, and the game is basically a 2.5D game I guess you could say. Low-resolution sprites and textures, and 3D maps for battlefields. The plan is to mainly load the graphics from a disc, or .iso (I already know the sectors the graphics need to be read from) and fill in the rest with game logic and graphics routines, and probably load other things from the disc like the map data. I want this to be a multiplatform project, because I use Linux and would like for more people to join in on the project once I have enough done (and it's easy to get more people through platforms like Windows). I'll be making a website to host the project. Also, none of the graphics will be distributed, they'll have to be loaded from your own disc. I'd rather not have to deal with any legal issues.. At least not soon after the project is hosted on my site. But anyway, here's my dilemma- I know quite a bit of Java, and some Python, but I'm worried about performance/feature issues if I make this engine using one of these two languages. I chose them due to familiarity and platform independence, but I haven't gotten too into graphics programming yet. I'm very much willing to learn, however, and I've done quite a bit of ASM work on the game- looking at graphics routines and whatnot. What would be the best route to take for a project like this? Oh, and keep in mind I'll eventually want to add higher-resolution textures in an .iso restructuring patch or something. I'm assuming based on my results on Google that I could go with something like Pygame + OpenGL, JOGL, Pyglet, etc. Any suggestions on an API? Which has plenty of documentation/support for game or graphics programming? Do they have any serious performance hits? Thank you for your time.
How can keep my libraries , modules or jar files Synced across diff projects
6,199,290
0
1
229
0
java,php,python,linux,sync
For Java projects, you could give Maven a try, and configure your own repository on your server. I don't know if it can be used for the other languages, however.
0
0
0
1
2011-06-01T09:23:00.000
3
0
false
6,198,941
1
0
1
1
I am doing various projects across different computers and servers, and in different languages like PHP, Python, Java etc. Now on every computer I have to install / download various supporting libraries, like javascript libraries for PHP, jar files for Java and many python modules. Is there a way that I can make an online folder on the server with only the libraries and then automatically sync them across different computers? There may be some solution out there for this, but I don't know it. For Java and PHP there is no need to install them, but I don't know whether python modules or libraries like south, PIL, matplotlib etc. work this way or not. Is there anything which can help me with this?
Python desktop app linked to PHP web app?
6,200,853
0
1
227
0
php,python
There are a million ways to do this. I suggest you write the local data gathering first, to know the amount and format of the data (and because getting Windows hardware information via Python seems to be the hardest part). Then write the web page login, to see if you can get HTTPS or have to take care of security yourself. With these constraints it is much easier to make a recommendation.
0
0
0
1
2011-06-01T11:55:00.000
2
0
false
6,200,707
0
0
1
1
Well, I want to make a web app in php where the user can create an account and log in, then download a desktop app made in python, log in there also with the username and the password from the web app, and then have it run in the tray. The project has no real purpose; I want to do it for fun and practice, but I do have some problems. How could I link a web app to a desktop application? The desktop app should gather information about the user's system, hardware, memory used (like the windows rating) and then send it to the web app and display it in the user's panel. Any ideas? Thanks.
Python Web Framework for Small Team
6,207,234
1
3
1,144
0
python,django,pylons,cherrypy
I would recommend Django or TurboGears.
0
0
0
0
2011-06-01T20:26:00.000
5
0.039979
false
6,207,211
0
0
1
1
I have 4 days off and I will use this time to rewrite our RoR (Ruby on Rails) application in a python web framework, just for fun ;-] (and why not make the switch; RoR is great but keeps changing all the time, which can be exhausting.) I don't know the python web frameworks very well; I've looked at web.py, django, cherry.py, pylons/pyramid and a few others. Our requirements are (but everything can be irrelevant): MVC (strict or not), small team (2-3 people, including one designer), fun to use, REST support, multilevel caching (DB query, page cache), Nginx support (X-Accel-Redirect file download), heavy traffic (1,200,000 ~ views), URL rewriting (multi-domain support, not only subdomains), not a problem if it's not hype, not a problem if there are no plugins, either SQL or NOSQL (can be fun to try NOSQL). So what would you advise?
Django view and separate processes
6,213,293
0
1
488
0
python,django,multiprocessing
Andrey Fedoseev gave a great suggestion, but let me come up with a more general solution. You can create some WaitingTasks model into which your view puts new tasks. Then you can use any method to process those waiting tasks - cronjob, upstart daemon, whatever - writing back progress and result. (In fact celery uses a similar approach, only with RabbitMQ.)
0
0
0
0
2011-06-02T07:50:00.000
3
1.2
true
6,211,868
0
0
1
1
I would like to do something similar: f(n) calculates n! , this obviously takes a long time to do, so the calculations need to run in a separate process from the django view. Additionally I would like the view to return a response immediately (ex. progress 0% ) and subsequent polling needs to update progress, so the view needs to communicate with the above process. What would be the best way to achieve this?
How to extract text from a web page that requires logging in using python and beautiful soup?
6,216,054
3
0
563
0
python,urllib2,beautifulsoup
BeautifulSoup is for parsing html once you've already fetched it. You can fetch the html using any standard url fetching library. I prefer curl, but as you tagged your post with urllib2, python's built-in urllib2 also works well. If you're saying that after logging in the response html is the same as for those who are not logged in, I'm gonna guess that your login is failing for some reason. If you are using urllib2, are you making sure to store the cookie properly after your first login and then passing this cookie to urllib2 when you are sending the request for the data? It would help if you posted the code you are using to make the two requests (the initial login, and the attempt to fetch the data).
0
0
1
0
2011-06-02T14:21:00.000
1
1.2
true
6,215,808
0
0
1
1
I have to retrieve some text from a website called morningstar.com. To access that data I have to log in. Once I log in and provide the url of the web page, I get the HTML text of a normal user (not logged in). As a result I am not able to access that information. Any solutions?
target both android and iphone with Python
6,217,111
0
3
1,256
0
python,android,iphone,beeware
PhoneGap is one way you can target both the iPhone and Android, through JavaScript and HTML.
1
0
0
1
2011-06-02T15:47:00.000
4
0
false
6,216,890
0
0
1
1
I want to develop an application targeting both Android and iPhone. I guess they use Java and Objective-C. Can I use a single language like Python? Is it the best route? Will I lose performance, features, etc. by using Python? Are there any limitations that I will run into?
Repeating function over certain amount of time
6,219,095
2
2
244
0
python,mobile
You can use the time module and its sleep function to pause between runs; a minimal loop is sketched after this entry.
0
0
0
0
2011-06-02T18:59:00.000
3
0.132549
false
6,219,063
0
0
1
1
I am writing a program that uploads files from my Nokia cell phone to the web server, and I have already finished writing that part. But my program only does its job once; I want to call that function again and again, say every 5 minutes, and I do not know how to do that.
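An illustrative loop for the answer above. The upload_files function name is made up here; it stands in for the upload code the question says is already written.

    import time

    def upload_files():
        # the existing, already-working upload code goes here
        pass

    while True:
        upload_files()
        time.sleep(5 * 60)   # wait five minutes before uploading again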
Capturing CAPTCHA image with Python
6,232,816
4
0
540
0
python
The only way to save the image would be to make a single call to the CAPTCHA URL programmatically, save the result, and then present that saved result to the user; a small sketch follows after this entry. The whole point of CAPTCHA is that each request generates a unique/different response/image.
0
0
1
1
2011-06-03T21:17:00.000
1
1.2
true
6,232,780
0
0
1
1
I tried using mechanize to see the URL of the image, but it's a dynamic page that generates a different image each time. I was wondering if there was any way to "capture" this image for viewing/saving. Thanks!
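A small sketch of the single-fetch idea in the answer above, assuming plain urllib2; the URL and file name are placeholders.

    import urllib2

    # Fetch the dynamically generated image exactly once ...
    captcha_url = 'http://example.com/captcha'            # placeholder URL
    image_bytes = urllib2.urlopen(captcha_url).read()

    # ... and save the bytes; later viewing uses this file, not a new request
    # (a new request would return a different image).
    with open('captcha.png', 'wb') as f:
        f.write(image_bytes)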
UnicodeEncodeError in django app
6,233,867
1
1
1,361
0
python,django,unicode,webfaction
Yes, u'\u2122' is the trademark sign. Somewhere in your code, you should be either encoding your unicode data with a codec that supports that character (utf-8, or one of cp1250 to cp1258, etc.) or avoiding an automatic, unexpected conversion to a byte string (which uses ASCII, and ASCII doesn't support that character); an illustrative snippet follows after this entry. Which action is needed, and where? No idea, as you haven't supplied a traceback... please edit your question to include the full traceback, and format it as code so that it's legible.
0
0
0
0
2011-06-03T22:22:00.000
2
0.099668
false
6,233,230
0
0
1
2
I'm trying to create an app using Django on WebFaction. I was basically messing around with the Amazon API, and when one of the search results has a trademark symbol that is passed to my template, the error is thrown. I'm getting the error: Caught UnicodeEncodeError while rendering: 'ascii' codec can't encode character u'\u2122' in position 9: ordinal not in range(128), and was wondering if anyone knew what the fix is.
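An illustrative snippet of the two situations the answer above describes: explicitly encoding with a codec that can represent the trademark sign, versus the implicit ASCII conversion that raises the error (Python 2 syntax).

    # -*- coding: utf-8 -*-
    name = u'Acme\u2122'              # unicode data containing the trademark sign

    encoded = name.encode('utf-8')    # fine: UTF-8 can represent u'\u2122'

    try:
        str(name)                     # implicit ASCII encode -> UnicodeEncodeError
    except UnicodeEncodeError, exc:
        print 'implicit ascii encoding failed:', exc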
UnicodeEncodeError in django app
6,233,347
3
1
1,361
0
python,django,unicode,webfaction
It probably means you are calling str() on a piece of unicode data; the str function could be called ascii, to better describe what it does! Your templates will be totally happy with unicode data, so given that you are using Django, I suspect the problem is in a __unicode__ method or some such; a sketch follows after this entry. Unicode is a tricky subject, so have a Google for "python unicode" to get a feel for it. It's tricky to help you further without seeing some more code, but the gist is to try to use unicode strings all through your application! Python has a unicode() built-in that works exactly like str for simple strings but also works fine with unicode strings; it's better to use that.
0
0
0
0
2011-06-03T22:22:00.000
2
1.2
true
6,233,230
0
0
1
2
I'm trying to create an app using Django on WebFaction. I was basically messing around with the Amazon API, and when one of the search results has a trademark symbol that is passed to my template, the error is thrown. I'm getting the error: Caught UnicodeEncodeError while rendering: 'ascii' codec can't encode character u'\u2122' in position 9: ordinal not in range(128), and was wondering if anyone knew what the fix is.
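A short sketch of the __unicode__ suggestion in the answer above; the Product model is a hypothetical stand-in for whatever model holds the Amazon search results.

    # -*- coding: utf-8 -*-
    from django.db import models

    class Product(models.Model):              # hypothetical model for illustration
        name = models.CharField(max_length=200)

        def __unicode__(self):                # return unicode; avoid str() here
            return self.name                  # fine even when name is u'Acme\u2122'

As the answer notes, wherever plain str() is being called on such data, unicode() is the safer choice.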
How much is the difference between html parsing and web crawling in python
6,236,831
1
4
2,480
0
python,django,web-crawler
An HTML parser will parse the page, and you can collect the links present in it. You can add these links to a queue and visit those pages. Combine these steps in a loop and you have made a basic crawler; a bare-bones version is sketched after this entry. Crawling libraries are ready-to-use solutions that do the crawling for you. They provide more features, such as detection of recursive links, cycles, etc. A lot of the features you would otherwise want to code have already been implemented in these libraries. However, the first option is preferable if you have special requirements that the libraries do not satisfy.
0
0
1
0
2011-06-04T12:41:00.000
3
0.066568
false
6,236,794
0
0
1
1
I need to grab some data from other websites for my Django website. Now I am confused about whether I should use Python parsing libraries or web crawling libraries. Do search-engine libraries also fall in the same category? I want to know how big the difference is between the two, and if I want to use those functions inside my website, which should I use?
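A bare-bones version of the parse-collect-visit loop the answer above describes, using urllib2 and BeautifulSoup 3 as assumed stand-ins; depth limits, politeness delays, and robots.txt handling are deliberately left out.

    import urllib2
    import urlparse
    from BeautifulSoup import BeautifulSoup   # BeautifulSoup 3.x

    def crawl(start_url, max_pages=10):
        queue = [start_url]
        seen = set()
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urllib2.urlopen(url).read()
            except Exception:
                continue                       # skip pages that fail to load
            soup = BeautifulSoup(html)
            for a in soup.findAll('a', href=True):
                # resolve relative links and queue them for a later visit
                queue.append(urlparse.urljoin(url, a['href']))
        return seen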
Appengine ACL with Google Authentication
6,246,968
5
3
871
0
python,django,google-app-engine,authentication,authorization
You'll need to do this yourself: implement the ACL with a datastore model keyed by the user's user_id, and fetch and check it on each request; a rough sketch follows after this entry. The Users API doesn't provide anything like this built in.
0
1
0
0
2011-06-04T21:49:00.000
2
1.2
true
6,239,612
0
0
1
1
I would like to implement an ACL with Google authentication, and need some pointers on whether this is possible. Use case: Page X is accessible only to [email protected]. Page Y is accessible to everyone who belongs to group Y. After registration, a moderator will add the user to group Y or reject them. Pages are not accessible if the user does not belong to either of the above two. Unauthorized viewing is prohibited even if the user is authenticated successfully. I am planning to use Django for my project; any support provided by Django would be useful. Thanks in advance.
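A rough App Engine sketch of the answer's suggestion: a datastore entity keyed by user_id that is checked on every request. The GroupMembership model and the 'group-y' name are made up for illustration (the webapp and db APIs of that era are assumed).

    from google.appengine.api import users
    from google.appengine.ext import db, webapp

    class GroupMembership(db.Model):                   # hypothetical ACL entity
        user_id = db.StringProperty(required=True)
        group = db.StringProperty(required=True)       # e.g. 'group-y'

    def user_in_group(group_name):
        user = users.get_current_user()
        if user is None:
            return False
        query = GroupMembership.all()
        query.filter('user_id =', user.user_id())
        query.filter('group =', group_name)
        return query.get() is not None

    class PageY(webapp.RequestHandler):
        def get(self):
            if not user_in_group('group-y'):
                self.error(403)                        # authenticated but not authorized
                return
            self.response.out.write('Welcome to page Y')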