Title
stringlengths
11
150
A_Id
int64
518
72.5M
Users Score
int64
-42
283
Q_Score
int64
0
1.39k
ViewCount
int64
17
1.71M
Database and SQL
int64
0
1
Tags
stringlengths
6
105
Answer
stringlengths
14
4.78k
GUI and Desktop Applications
int64
0
1
System Administration and DevOps
int64
0
1
Networking and APIs
int64
0
1
Other
int64
0
1
CreationDate
stringlengths
23
23
AnswerCount
int64
1
55
Score
float64
-1
1.2
is_accepted
bool
2 classes
Q_Id
int64
469
42.4M
Python Basics and Environment
int64
0
1
Data Science and Machine Learning
int64
0
1
Web Development
int64
1
1
Available Count
int64
1
15
Question
stringlengths
17
21k
Add content rule based on the form field value input data in Plone?
11,633,491
2
2
340
0
python,plone,ploneformgen
To be able to do this, you need to create 2 content rules, each with a different condition and destination folder to move the content to (using the "move to folder" action). In each content rule, add a TALES expression condition. Then do something like: python: request.form.get('value-to-check', False) == 'foobar' Obviously, you'll need to customize the expression a bit.
0
0
0
0
2012-07-24T10:46:00.000
1
1.2
true
11,629,204
0
0
1
1
I created a form using PloneFormGen, e.g. Product code: Gear-12, Value: 2000. Based on the input value, i.e. if Value is > 5000, the saved data entry should be moved to folder 1; otherwise it should be moved to folder 2. How can a TALES expression be used to trigger this kind of content rule? I am using Plone 4.1.
python locale.setlocale fails when running apache
11,727,380
1
0
958
0
python,django-templates,apache2,django-views,locale
I managed to solve the problem by explicitly calling locale.setlocale(locale.LC_ALL, 'en_US.UTF-8'). I'm not sure why it wasn't working without the en_US.UTF-8 parameter, as the locale setting is 'en_US.UTF-8'. If anyone knows why I needed an explicit call when the Apache process runs the code, but not when I test it anywhere else, I'd still be interested in an answer, but I'll mark this as solved.
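A minimal sketch of the explicit-locale fix described above (it assumes the en_US.UTF-8 locale is actually generated on the host):

```python
import locale

# Name the locale explicitly instead of relying on the process environment,
# which is often empty when Apache/mod_wsgi starts the interpreter.
locale.setlocale(locale.LC_ALL, 'en_US.UTF-8')

print('{0:n}'.format(42424242))  # -> '42,424,242'
```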
0
0
0
0
2012-07-24T21:00:00.000
2
1.2
true
11,639,235
0
0
1
1
I've got a simple Ubuntu/Django/Apache server set up and I'm having trouble formatting some of the numbers that I want to display in my Django templates. When I run the code locally (i.e. on my work machine) using the Django test server, everything formats with no problem. Likewise, when I open up IDLE on the server I can do this: >>> import locale >>> locale.setlocale(locale.LC_ALL,'') 'en_US.UTF-8' >>> '{0:n}'.format(42424242) '42,424,242' However, whenever I run the Apache server and test the code live, it fails and I get outputs like: '42424242'. I prepended a print statement to the locale.setlocale(locale.LC_ALL,'') call that I have in my view.py file, and all I found in the Apache error log was [Tue Jul 24 15:26:56 2012] [error] C. Could it be that the Apache process doesn't have permission to access the native locale setting?
Windmill-Without web browser
12,344,550
0
1
364
0
python,selenium,windmill,browser-testing
If you're looking to run Windmill in headless mode (no monitor) you can do it by running Xvfb :99 -ac & DISPLAY=:99 windmill firefox -e test=/path/to/your/test.py
0
0
1
1
2012-07-25T08:17:00.000
1
0
false
11,645,451
0
0
1
1
In Selenium testing there is HtmlUnitDriver, which lets you run tests without a browser. I need to do the same with Windmill. Is there a way to do this in Windmill? Thanks!
Add logging to tests results in jenkins
11,650,753
1
2
3,561
0
python,jenkins,jython,nose
There is no way to achieve this from your tests. The report generator simply won't display the output unless there are errors. You will have to get the sources for Jenkins itself (the JUnit runner is built into it) and patch the reporter or write your own plugin.
0
1
0
0
2012-07-25T09:35:00.000
2
0.099668
false
11,646,830
0
0
1
2
I am using nose in Jenkins to test a project. If a test fails, I get a short report of the output in the console. I would like to have this output regardless of the test result, so even if a test passes I want to be able to see its output to stderr/stdout. At the moment I can turn off capturing by calling nose with --nocapture; however, this results in all the output being under the project's console log that Jenkins creates by default. How do I tell nose/capture to append the captured output to each test result shown in Jenkins? I use xunit to generate a JUnit-compatible XML file which is in turn used by Jenkins to generate its reports. Edit: additional info as requested. URL in Jenkins (after the build-number part): /testReport/testDesignParser.testDesignCsvParser.testDesignCsvParser/testDesignCsvParser/test/? I know that this design is not pretty, but that's how it is now. If it matters: `testDesignParser.testDesignCsvParser.testDesignCsvParser` is the module, `testDesignCsvParser` the class, and `test` the (member) test function.
Add logging to tests results in jenkins
11,742,115
2
2
3,561
0
python,jenkins,jython,nose
With the latest Jenkins there is an option to save the output (Retain long standard output/error) right under the post build step belonging to JUnit. Additionally I run nose with --nocapture. This gives me a console output view on every test (an option on the left menu when I have a test opened)
0
1
0
0
2012-07-25T09:35:00.000
2
1.2
true
11,646,830
0
0
1
2
I am using nose in Jenkins to test a project. If a test fails, I get a short report of the output in the console. I would like to have this output regardless of the test result, so even if a test passes I want to be able to see its output to stderr/stdout. At the moment I can turn off capturing by calling nose with --nocapture; however, this results in all the output being under the project's console log that Jenkins creates by default. How do I tell nose/capture to append the captured output to each test result shown in Jenkins? I use xunit to generate a JUnit-compatible XML file which is in turn used by Jenkins to generate its reports. Edit: additional info as requested. URL in Jenkins (after the build-number part): /testReport/testDesignParser.testDesignCsvParser.testDesignCsvParser/testDesignCsvParser/test/? I know that this design is not pretty, but that's how it is now. If it matters: `testDesignParser.testDesignCsvParser.testDesignCsvParser` is the module, `testDesignCsvParser` the class, and `test` the (member) test function.
Not able to install python mysql module
11,653,215
4
0
62
1
python,mysql,django
You need to install the client libraries. The Python module is a wrapper around the client libraries. You don't need to install the server.
0
0
0
0
2012-07-25T15:19:00.000
1
0.664037
false
11,653,040
0
0
1
1
I am trying to connect to MySQL in Django. It asked me to install the module, and the module prerequisites are "MySQL 3.23.32 or higher" etc. Do I really need to install MySQL itself, or can I just connect to a remote one?
Django DynamoDB Database backend
11,679,929
6
8
6,014
0
python,django,amazon-web-services,amazon-dynamodb,django-database
I think the answer is there's no easy way. Django supports relational databases, but DynamoDB is NoSQL. There doesn't appear to be a backend for django-nonrel, an unofficial fork for non relational databases. If you want to use amazon to host the database, you could use their RDS service and configure Django as you would for MySQL.
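If you go the RDS route, the Django side is just a regular MySQL configuration; a hedged sketch of what the DATABASES setting might look like (hostname, database name and credentials are placeholders, not values from the question):

```python
# settings.py -- illustrative values only
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',
        'USER': 'myuser',
        'PASSWORD': 'secret',
        'HOST': 'mydb.xxxxxxxxxxxx.us-east-1.rds.amazonaws.com',  # RDS endpoint
        'PORT': '3306',
    }
}
```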
0
0
0
0
2012-07-26T22:27:00.000
1
1.2
true
11,678,950
0
0
1
1
Is it possible to set up AWS DynamoDB as the database backend for a Django server? If so, how would I go about doing this? Thanks!
AppEngine datastore reserve ID range
11,681,451
1
0
178
0
python,google-app-engine,google-cloud-datastore
I found the answer: db.allocate_id_range(...)
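A hedged sketch of how that call can be used on the old GAE Python db API; the model name and ID range are made up, and the return value (a key-range status) is ignored here:

```python
from google.appengine.ext import db

class MyModel(db.Model):
    name = db.StringProperty()

# Reserve IDs 1..10000 for MyModel so the datastore never auto-assigns them.
# The range is anchored to a key of the kind whose ID space you want to reserve.
db.allocate_id_range(db.Key.from_path('MyModel', 1), 1, 10000)
```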
0
1
0
0
2012-07-27T04:15:00.000
1
1.2
true
11,681,354
0
0
1
1
Is it possible to tell AppEngine Datastore to reserve a range of IDs, which should never be allocated to models?
Twitter Bootstrap Website Deployed with GAE
20,403,908
4
7
9,488
0
python,google-app-engine,twitter-bootstrap,blogs
In addition to the answers above, also note that the order of declarations is important. The - url: /.* part should be the last one, in this case
0
0
0
0
2012-07-27T04:42:00.000
4
0.197375
false
11,681,557
0
0
1
1
So I'm a programming noob. I have been following many of the Udacity classes and I am slowly learning to code. Alright, so here's my question. I have built the basic HTML files of my blog using Twitter Bootstrap, as it is so simple to use. Now what I would like to do is combine the great templates that Bootstrap provides with the simple hosting services of Google App Engine. This is where my noobness comes in and I'm totally lost. Any help would be appreciated; don't be afraid to hurt my feelings, I am a noob and will understand if this is completely impossible.
How to show continuous real time updates like facebook ticker, meetup.com home page does?
11,688,432
0
8
5,085
0
php,python,node.js,asynchronous,real-time
You could use a poll, a long-poll, or, if you want, a push system. Easiest to implement would be a poll; however, all solutions require client-side coding, and the performance impact depends on your choice. A poll with a short interval effectively makes a request every, say, 100 ms to simulate real time. A long-poll has less impact, but it keeps a lot of requests open for a longer time.
0
0
1
0
2012-07-27T13:08:00.000
4
0
false
11,688,397
0
0
1
1
How do I show continuous real-time updates in the browser, like the Facebook ticker or the meetup.com home page does, in Python, PHP or node.js, and what would be the performance impact on the server side? Also, how could we achieve the same updates if the page is cached by a CDN like Akamai?
How can I prevent my website from being "hit-boosted"?
11,697,490
2
4
216
0
php,python,http,header,hit
The best way to do it is pattern-recognition, since most proxies won't tell you that they are a proxy: if you see certain spikes of traffic, flag them and don't add them to the hitcount. Alternatively, if (s)he's using the same proxies over and over again, just blacklist those IP addresses. You could also try to detect proxies by using some sort of API proxy list service or checking for listening proxy servers.
0
0
1
0
2012-07-28T01:22:00.000
2
0.197375
false
11,697,457
0
0
1
1
I am making a social site where users can post content and the content has views. Whenever a user from a different IP address views the content, the view count is incremented; multiple requests coming from the same IP address do not count. However, lately someone has been iterating through a list of proxies or something and artificially increasing the view counts. How can I prevent this? Is there something I can do by checking headers or something? Thanks.
How to launch Django development server with a different database setting (not default)
11,702,027
5
5
1,173
0
python,database,django,settings,local
You can keep two different settings files and, when running manage.py, do: python manage.py runserver --settings=[projectname].[settingsfile]. Change the settings file according to the database you want.
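A minimal sketch of that two-settings-files approach, assuming the DATABASES dict already holds both the "default" and "clean" aliases as the question describes (the module name and import path are assumptions):

```python
# settings_clean.py -- reuse everything from the normal settings,
# but make the "clean" database the default one.
from settings import *  # noqa

DATABASES['default'] = DATABASES['clean']
```

You would then run something like python manage.py runserver ip:port --settings=settings_clean (adjust the dotted path to wherever the file lives in your project).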
0
0
0
0
2012-07-28T14:30:00.000
2
1.2
true
11,701,887
0
0
1
1
I have different configurations for the Django database in settings, one named "default" and one named "clean". How can I run the development server (python manage.py runserver ip:port) so that it binds to the "clean" database setting and not the default?
Using Regular Expression with Twill
11,712,834
0
3
225
0
python,regex,beautifulsoup,twill
I'd rather use CSS selectors or "real" regexps on the page source. Twill is, AFAIK, no longer being worked on. Have you tried BeautifulSoup or PyQuery with CSS selectors?
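For instance, with bs4's CSS-selector support, pulling image URLs out of the raw page source could look like this (the selector and file name are made up for illustration):

```python
from bs4 import BeautifulSoup

html = open('page.html').read()            # or whatever page source you fetched
soup = BeautifulSoup(html, 'html.parser')

# A CSS selector instead of a hand-rolled regex over the raw HTML
for img in soup.select('div.gallery img[src]'):
    print(img['src'])
```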
0
0
1
0
2012-07-29T00:57:00.000
2
0
false
11,705,835
0
0
1
1
I'm currently using urllib2 and BeautifulSoup to open and parse HTML data. However, I've run into a problem with a site that uses JavaScript to load the images after the page has been rendered (I'm trying to find the image source for a certain image on the page). I'm thinking Twill could be a solution, and am trying to open the page and use a regular expression with 'find' to return the HTML string I'm looking for. I'm having some trouble getting this to work, though, and can't seem to find any documentation or examples on how to use regular expressions with Twill. Any help or advice on how to do this, or on how to solve this problem in general, would be much appreciated.
Get dynamic html table using selenium & parse it using beautifulsoup
11,707,106
0
1
1,222
0
python,regex,selenium,webdriver,beautifulsoup
You'd need to figure out what HTTP requests the Javascript is making, and make the same ones in your Python code. You can do this by using your favorite browser's development tools, or wireshark if forced.
0
0
1
0
2012-07-29T03:31:00.000
2
0
false
11,706,424
0
0
1
1
I'm trying to get the content of a HTML table generated dynamically by JavaScript in a webpage & parse it using BeautifulSoup to use certain values from the table. Since the content is generated by JavaScript it's not available in source (driver.page_source). Is there any other way to obtain the content and use it? It's table containing list of tasks, I need to parse the table and identify whether specific task I'm searching for is available.
Need to create thumbnail from user submitted url like reddit/facebook
11,748,617
1
2
1,630
0
python,django,image
You'll first need to parse the html content for img src urls with something like lxml or BeautifulSoup. Then, you can feed one of those img src urls into sorl-thumbnail or easy-thumbnails as Edmon suggests.
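A hedged sketch of that first step, extracting an img src from a user-submitted URL with BeautifulSoup before handing it to the thumbnailer (the function name and the choice of urllib are illustrative, not the asker's code):

```python
from urllib.request import urlopen
from urllib.parse import urljoin
from bs4 import BeautifulSoup

def first_image_url(page_url):
    html = urlopen(page_url).read()
    soup = BeautifulSoup(html, 'html.parser')
    img = soup.find('img', src=True)        # first <img> that has a src attribute
    # Resolve relative srcs against the page URL before thumbnailing
    return urljoin(page_url, img['src']) if img else None
```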
0
0
0
0
2012-07-29T21:53:00.000
3
1.2
true
11,713,284
0
0
1
1
I am pretty new to Django so I am creating a project to learn more about how it works. Right now I have a model that contains a URL field. I want to automatically generate a thumbnail from this url field by taking an appropriate image from the webite like facebook or reddit does. I'm guessing that I should store this image in an image field. What would be a good way to select an ideal image from the website and how can I accomplish this? EDIT- I'm trying to take actual images from the website rather than a picture of the website
Django: Static files referring to one another.
11,713,704
3
1
67
0
python,django,static-files,django-staticfiles
The static file finders will collocate the contents of all of your static folders under a single root upon collection, which means you can reference static resources relatively. For instance, if you had a logo in project/commons/static/images/logo.png, you can reuse it in a stylesheet in some other application, say project/myapp/static/css/myapp.css, by relatively referencing the image, i.e. ../images/logo.png.
0
0
0
0
2012-07-29T22:35:00.000
1
1.2
true
11,713,588
0
0
1
1
I have a directory with static files, myapp/static/, and within that there are static/css, static/javascript, and static/images. How can these files refer to one another? For instance, the CSS has to use the image files for the background of certain pages. Do you have to hard-code the URL, or can you do it relatively?
Generate a pdf with python
11,725,677
5
6
3,407
0
python,pdf,latex
I would suggest using the LaTeX approach. It is cross-platform, works in many different languages and is easy to maintain. Plus it's non-commercial!
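A hedged sketch of the template-then-pdflatex route the question proposes (the template, file names and the pdflatex binary on PATH are all assumptions):

```python
import subprocess

TEMPLATE = r"""
\documentclass{article}
\begin{document}
\section*{%(title)s}
%(body)s
\end{document}
"""

def build_pdf(title, body, tex_path='report.tex'):
    with open(tex_path, 'w') as f:
        f.write(TEMPLATE % {'title': title, 'body': body})
    # -interaction=nonstopmode keeps pdflatex from stopping at a prompt on errors
    subprocess.check_call(['pdflatex', '-interaction=nonstopmode', tex_path])

build_pdf('Invoice', 'Hello, \\LaTeX{} world.')
```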
0
0
0
1
2012-07-30T16:29:00.000
3
1.2
true
11,725,645
0
0
1
1
I'm trying to develop a small script that generates a complete new PDF file, mainly text and tables, as its result, and I'm searching for the best way to do it. I've read about ReportLab, which seems pretty good. It has only one drawback as far as I can see: it is quite hard to write a template without the commercial version, and the code seems to be hard to maintain. So I've searched for a more suitable way and found xhtml2pdf, but this software is quite old and cannot generate tables spanning two or more pages. The last solution in my mind is to generate a .tex file with a template framework and later call pdftex as a subprocess. I would implement the last one and go with LaTeX. Would you do so, or do you have better ideas?
application run slowly under uwsgi threaded mode
11,733,510
0
1
947
0
python,memcached,uwsgi
You are probably experiencing the python GIL overhead. Try adding a second process to see if results are better.
0
1
0
0
2012-07-31T05:06:00.000
1
0
false
11,733,437
0
0
1
1
We use uwsgi + nginx to build the web site. Recently we wanted to improve the QPS of our site, so we decided to switch the uwsgi mode from prefork to threaded, but we found something very bad. When using prefork mode with 5 workers, the request time is 10-20 ms, but in threaded mode (one worker, 5 threads) it increases to 100-200 ms, which is too bad. We found that memcache.Client takes most of the time, which makes the request time increase. Please help me understand where the problem is and how to solve it, thank you! PS: code: import memcache client = memcache.Client(['127.0.0.1:11211']) client.get('mykey')
http server using python & gevent(not using apache)
11,740,272
1
2
1,332
0
python,apache,nginx,gevent,httpserver
In my opinion, you will never get the same level of security with a pure-Python server that you can have with major web servers such as Apache and Nginx. These are well tested before being released, so by using a stable build and configuring it properly you will be close to the maximum security possible. Pure-Python servers are very useful during development, but I do not know of any that can claim to compete with them on security testing / bug reports / quick fixes. This is why it is generally advisable to put one of those servers in front of the pure-Python server, using, for example, options like ProxyPass.
0
0
0
1
2012-07-31T10:08:00.000
2
0.099668
false
11,737,754
0
0
1
2
Just using Python and gevent.server to serve a simple login server (it just checks some data and does some DB operations), would it be a problem when it's under a DDoS attack? Would it be better to use Apache/nginx to serve the HTTP requests?
http server using python & gevent(not using apache)
11,740,845
3
2
1,332
0
python,apache,nginx,gevent,httpserver
If you are implementing your own HTTP server with gevent.server, I advise against it; you should instead use gevent.pywsgi, which provides a full-featured, stable, and thoroughly tested HTTP server. It is not as fast as gevent.wsgi, which is backed by libevent-http, but it has more features that you are likely to need, like HTTPS. Gevent is much more likely to survive a DDoS attack than Apache, and nginx is about as good as gevent in this regard, although I don't see why you would use it if you can do just fine with your pure Python server. Using nginx would make sense if you had multiple backends behind the same server, like your auth server together with some static file serving (which could be done entirely by nginx) and possibly other subsystems or virtual hosts, all served through a single nginx configuration.
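For reference, a minimal gevent.pywsgi server of the kind being discussed might look like this (the handler is a placeholder for the real login/DB logic):

```python
from gevent.pywsgi import WSGIServer

def app(environ, start_response):
    # Placeholder for the actual auth check / DB lookup
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'ok']

WSGIServer(('0.0.0.0', 8000), app).serve_forever()
```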
0
0
0
1
2012-07-31T10:08:00.000
2
0.291313
false
11,737,754
0
0
1
2
Just using Python and gevent.server to serve a simple login server (it just checks some data and does some DB operations), would it be a problem when it's under a DDoS attack? Would it be better to use Apache/nginx to serve the HTTP requests?
Python HTML email : customizing output color-coding
11,868,901
0
0
1,729
0
python
All right. Here's what worked for me, just in case anybody bumps into the same problem. I had to add a newline (i.e. \n) after every tag in the HTML table, and then everything worked fine. PS: One clue as to whether this will help you is that I am creating one big string of HTML.
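A hedged sketch of the pattern being described: build the HTML table with a newline after every tag, then send it as a text/html MIME part (addresses, SMTP host and row data are placeholders):

```python
import smtplib
from email.mime.text import MIMEText

rows = [('abcd', 24222, 'xyz', 'A'), ('abcd', 24222, 'xyz', 'B')]

parts = ['<table>']
for name, num, code, grade in rows:
    color = 'green' if grade == 'A' else 'red'
    parts.append('<tr>')
    parts.append('<td>%s</td><td>%s</td><td>%s</td>' % (name, num, code))
    parts.append('<td><font color="%s">%s</font></td>' % (color, grade))
    parts.append('</tr>')
parts.append('</table>')

html = '\n'.join(parts)  # newline after every tag, as described above

msg = MIMEText(html, 'html')
msg['Subject'] = 'Report'
msg['From'] = 'me@example.com'
msg['To'] = 'you@example.com'
smtplib.SMTP('localhost').sendmail(msg['From'], [msg['To']], msg.as_string())
```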
0
0
0
1
2012-07-31T16:43:00.000
2
1.2
true
11,745,033
0
0
1
1
I am trying to send a Python (2.6) HTML email with color-coded output. My script creates an output string which I format to look like a table (using str.format). It prints okay on the screen: abcd 24222 xyz A abcd 24222 xyz B abcd 24222 xyz A abcd 24222 xyz D. But I also need to send it as an email message, and I need to have A (say in green), B (in red), etc. How could I do it? What I've tried is to attach FONT COLOR = #somecolor and /FONT tags at the front and back of A, B, etc., and I wrote a method/module which adds table, tr and td tags at appropriate parts of the string so that the message would look like an HTML table in the email. But there is an issue with this approach: it doesn't always work properly. The emails (obtained by running the exact same script) differ, many times with misaligned members and mysterious tr's or td's appearing at different locations each time, even though my HTML table creation is correct. Any help would be appreciated.
SQLAlchemy - Query show results where records exist in both table
11,747,157
1
1
144
1
python,sqlalchemy
If there is a foreign key defined between the tables, SQLAlchemy will figure out the join condition for you; there is no need for additional filters.
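In query form the suggestion is just an inner join, which on its own drops items with zero tiers; this sketch assumes the question's Item and ItemTier mapped classes and an open session:

```python
# Inner join: only Item rows with at least one related ItemTier are returned.
results = session.query(Item).join(ItemTier).all()

# If one row per matching tier is too many, collapse the duplicates:
results = session.query(Item).join(ItemTier).distinct().all()
```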
0
0
0
0
2012-07-31T18:24:00.000
2
1.2
true
11,746,610
0
0
1
1
I have an items table that is related to an item_tiers table. The second table consists of inventory receipts for an item in the items table. There can be 0 or more records in the item_tiers table related to a single record in the items table. How can I, using query, get only records that have 1 or more records in item tiers.... results = session.query(Item).filter(???).join(ItemTier) Where the filter piece, in pseudo code, would be something like ... if the item_tiers table has one or more records related to item.
many selects (dropdowns) on html form, how to get just the value of the select that was changed
11,755,603
1
0
130
0
javascript,python,html,forms,cgi
If each select is the only value that's needed, then every select is basically a form on its own. You could either remove all the other selects when you activate a single select (which is prone to errors), or simply put every select in its own form instead of using one giant form. Otherwise all of the data is going to be sent.
0
0
1
0
2012-08-01T08:34:00.000
1
1.2
true
11,755,474
0
0
1
1
In a python cgi script I have many selects in a form (100 or so), and each select has 5 or 6 options to choose from. I don't want to have a separate submit button, so I am using onchange="submit();" to submit the form as soon as an option is selected from one of the many selects. When I read the form data with form.keys() the name of every select on the form is listed instead of just the one that was changed. This requires me to compare the value selected in each select with the starting value to find out which one changed and this of course is very slow. How can I just get the new value of the one select that was changed?
Django: writing test for multiple database site
11,758,203
0
0
1,276
0
python,django
I finally got the tests running; here's what I did: (1) disabled the DATABASE_ROUTERS setting when running tests, (2) kept the B alias in the DATABASES setting but with the same name as A, and (3) appended B's INSTALLED_APPS entries that weren't already present to A's INSTALLED_APPS.
0
0
0
0
2012-08-01T09:13:00.000
2
0
false
11,756,115
0
0
1
1
I have 2 sites: A and B. A relies on some tables from B so it has an entry in its DATABASES settings pointing to B together with some entries under its DATABASE_ROUTERS settings to route certain model access to B's database. Now I'm trying to write a test on A but just running manage.py test immediately fails because some of A's models relies on some models covered by the tables coming from B, and B's complete database tables hasn't been created yet. So my question is, how do I tweak my TEST_RUNNER to first run syncdb on B against B's test db so then when I run manage.py test on A it can find the tables from B that it relies on? I hope that makes sense.
python interpreter on non-pydev projects?
11,758,291
0
1
113
0
python,eclipse,pydev
PyDev should be working fine. In project properties, you can set interpreter, PYTHONPATH and other PyDev related settings. To manually trigger code analysis, right-click on project, file or folder and select PyDev->Code analysis
0
0
0
1
2012-08-01T09:18:00.000
1
0
false
11,756,207
1
0
1
1
We have successfully added the PyDev plugin to our Eclipse; as a result, it detects errors and so on in PyDev projects. The question is: is there any way to use PyDev's abilities (e.g. error detection) in non-PyDev projects (e.g. a Java project)? We are actually developing an Eclipse plugin that contains some .py files, and we want it to interpret them as a side feature.
Which Python framework is flexible and similar to CodeIgniter in PHP?
11,760,761
1
0
1,400
0
python,django,pylons,cherrypy
For the fastest development you may dive into Django. But Django is probably not the fastest solution. Flask is lighter. Also you can try Pyramid.
0
0
0
1
2012-08-01T12:25:00.000
4
0.049958
false
11,759,164
0
0
1
1
The requirement is to develop an HTML-based Facebook app. It would not be content-based like a newspaper site, but will mostly have user-generated data that would be aggregated and presented from a database + memcache. The app would contain 4-5 pages at most, with different purposes. We decided to write the app in Python instead of PHP and tried to evaluate Django. However, we found Django is not as flexible as CodeIgniter in PHP, which imposes fewer restrictions and rules and lets you do what you want to do. CodeIgniter is a minimalistic PHP MVC framework, which we would have chosen if we were to develop in PHP. Can you please suggest a flexible and minimalistic Python-based web framework? I have heard of Pylons, CherryPy and web.py, but I am completely unaware of their usage and structure.
Maintaining a (singleton) process with mod_wsgi?
11,768,941
0
1
396
0
python,apache,mod-wsgi,web.py
Easy. Don't restart Apache, don't set maximum-requests and don't change the code in the WSGI script file. Are you saying that you are seeing restarts even when you leave Apache completely untouched? And yes it sounds like you should be re-architecting your system. A web process that takes that long to startup is crazy.
0
1
0
0
2012-08-01T14:46:00.000
1
1.2
true
11,761,785
0
0
1
1
I have a Python web.py app with a long (minutes) start-up time that I'd like to host in Apache with mod_wsgi. The long-term answer may be "rewrite the app." But in the short term I'd like to configure mod_wsgi to: (1) use a single process to serve the app (I can do this with WSGIDaemonProcess processes=1), and (2) keep using that process without killing it off periodically. Is #2 doable? Or are there other stopgap solutions I can use to host this app? Thanks!
Memory usage in django-imagekit is unacceptable -- ideas on fixes?
11,834,793
3
3
353
0
python,django,image-processing,heroku,django-imagekit
Try changing the image size with PIL from the console and see if memory usage is okay. Image resizing is a simple task; I don't believe you need side applications. Besides, split your task into 3 tasks (3 images?).
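A quick console check along those lines might be (paths and target size are placeholders):

```python
from PIL import Image

im = Image.open('uploaded.jpg')
im.thumbnail((600, 600))       # resizes in place, preserving aspect ratio
im.save('uploaded_600.jpg')
```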
0
0
0
0
2012-08-01T15:15:00.000
1
1.2
true
11,762,290
0
0
1
1
Django-imagekit, which I'm using to process user uploaded images on a social media website, uses an unacceptably high level of memory. I'm looking for ideas on how to get around this problem. We are using django-imagekit to copy user uploaded images it into three predefined sizes, and saves the four copies (3 processed plus 1 original) into our AmazonS3 bucket. This operation is quickly causing us to go over our memory limit on our Heroku dynos. On the django-imagekit github page, I've seen a few suggestions for hacking the library to use less memory. I see three options: Try to hack django-imagekit, and deal with the ensuing update problems from using a modified third party library Use a different imaging processing library Do something different entirely -- resize the images on in the browser perhaps? Or use a third party service? Or...? I'm looking for advice on which of these routes to take. In particular, if you are familiar with django-imagekit, or if you know of / are using a different image processing library in a Django app, I'd love to hear your thoughts. Thanks a lot! Clay
Pass information from javascript to django app and back
11,762,988
1
13
10,912
0
javascript,python,ajax,django
Yes, it is possible. Pass the id as a parameter to the view you will use inside your app, e.g. def example_view(request, id), and in urls.py use something like: url(r'^example_view/(?P<id>\d+)/', 'App.views.example_view'). A request to the URL /example_view/8 will then call the view with id set to 8, which you can use to look up, for example, the 8th record of a specific table in your database.
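Put together, a hedged sketch of the view behind that URL pattern, answering the AJAX call with JSON (the model, field names and lookup are placeholders, not the asker's schema):

```python
# urls.py (pattern as in the answer, older string-based syntax):
#   url(r'^example_view/(?P<id>\d+)/$', 'App.views.example_view')

# views.py
import json
from django.http import HttpResponse

def example_view(request, id):
    record = MyModel.objects.get(pk=id)          # placeholder model
    data = {'id': record.pk, 'name': record.name}
    return HttpResponse(json.dumps(data), content_type='application/json')
```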
0
0
0
0
2012-08-01T15:32:00.000
3
0.066568
false
11,762,629
0
0
1
1
So I'm trying to basically set up a webpage where a user chooses an id, the webpage then sends the id information to python, where python uses the id to query a database, and then returns the result to the webpage for display. I'm not quite sure how to do this. I know how to use an ajax call to call the data generated by python, but I'm unsure of how to communicate the initial id information to the django app. Is it possible to say, query a url like ./app/id (IE /app/8), and then use the url information to give python the info? How would I go about editing urls.py and views.py to do that? Thanks,
Submitting Multiple Forms At The Same Time (Edit Profile Page)
11,764,721
0
0
146
0
python,google-app-engine,profile
I've used django on most of my webapps, but the concept should be the same; I use ajax to send the data to the backend whenever the user hits submit (and the form returns false) so the user can keep editing it. With ajax, you can send the data to different handlers on the backend. Also, using jQuery, you can set flags to see if fields have been changed, to avoid sending the ajax message in the first place. Ajax requests behave almost exactly like standard HTTP requests, but I believe the header indicates AJAX. If you're looking at strictly backend, then you will need to do multiple "if" statements on the backend and check one field at a time to see if it has been changed. On the backend you should still be able to call other handlers (passing them the same request).
0
0
0
0
2012-08-01T17:37:00.000
1
1.2
true
11,764,579
0
0
1
1
My question I suppose is rather simple. Basically, I have a profile. It has many variables being passed in. For instance, name, username, profile picture, and many others that are updated by their own respective pages. So one page would be used to update the profile picture, and that form would submit data from the form to the handler, and put() it to the database. What i'm trying to do here, is put all of the forms used to edit the profile on one single page at the same time. Would I need one huge handler to deal with that page? When I hit 'save' at the bottom of the page, how do I avoid overwriting data that hasn't been modified? Currently, say I have 5 profile variables, they map to 5 handlers, and 5 separate pages that contain their own respective form. Thanks.
How to use a different database host in development vs deployment?
11,765,289
1
1
120
0
python,django
You can have two copies of your settings.py file, one for production and one for development. Whatever you need to be the default, name it as settings.py Just set DJANGO_SETTINGS_MODULE to the python path for the file that you would like to use. So, if your settings files are myproject/settings.py, myproject/settings_dev.py; you can then do: $ DJANGO_SETTINGS_MODULE=settings_dev python manage.py shell From the myproject directory.
0
0
0
0
2012-08-01T18:14:00.000
2
1.2
true
11,765,123
0
0
1
1
I'm new to Python and Django and have over the past few weeks managed to set up my first deployment - a very basic site with user authentication and a few pages, which I hope to fill with content in the next couple of weeks. I have managed to find the answer to probably 40+ questions I have encountered so far by searching Google / StackOverflow / Django docs etc., but now I have one I can't seem to find a good answer to (perhaps because I don't know how best to search for it): when I develop on my local machine I need my settings.py file to point to the remote database ('HOST': 'www.mysite.com',) but when I deploy to a shared hosting service provider they require the use of localhost ('HOST': '', in settings.py). Since I host my code on GitHub and want to mirror it to the server, is there a way to resolve this so I don't have to make a manual edit to settings.py each time after uploading changes to the server?
Parsing a utf-8 encoded web page with some gb2312 body text with Python
11,767,390
1
0
587
0
python,encoding,character-encoding,web-scraping,beautifulsoup
The simplest way might be to parse the page twice, once as UTF-8, and once as GB2312. Then extract the relevant section from the GB2312 parse. I don't know much about GB2312, but looking it up it appears to at least agree with ASCII on the basic letters, numbers, etc. So you should still be able to parse the HTML structure using GB2312, which would hopefully give you enough information to extract the part you need. This may be the only way to do it, actually. In general, GB2312-encoded text won't be valid UTF-8, so trying to decode it as UTF-8 should lead to errors. The BeautifulSoup documentation says: In rare cases (usually when a UTF-8 document contains text written in a completely different encoding), the only way to get Unicode may be to replace some characters with the special Unicode character “REPLACEMENT CHARACTER” (U+FFFD, �). If Unicode, Dammit needs to do this, it will set the .contains_replacement_characters attribute to True on the UnicodeDammit or BeautifulSoup object. This makes it sound like BeautifulSoup just ignores decoding errors and replaces the erroneous characters with U+FFFD. If this is the case (i.e., if your document has contains_replacement_characters == True), then there is no way to get the original data back from document once it's been decoded as UTF-8. You will have to do something like what I suggested above, decoding the entire document twice with different codecs.
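A hedged sketch of that double decode using bs4's from_encoding override; the id used to locate the GB2312 section is an assumption:

```python
from bs4 import BeautifulSoup

raw = open('page.html', 'rb').read()     # undecoded bytes as received

# Pass 1: trust the declared UTF-8 for the overall document structure.
utf8_soup = BeautifulSoup(raw, 'html.parser')

# Pass 2: force GB2312 and pull out only the section known to be mis-encoded.
gb_soup = BeautifulSoup(raw, 'html.parser', from_encoding='gb2312')
body_text = gb_soup.find(id='content').get_text()
```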
0
0
1
0
2012-08-01T20:23:00.000
1
1.2
true
11,767,001
0
0
1
1
I'm trying to parse a web page using Python's beautiful soup Python parser, and am running into an issue. The header of the HTML we get from them declares a utf-8 character set, so Beautiful Soup encodes the whole document in utf-8, and indeed the HTML tags are encoded in UTF-8 so we get back a nicely structured HTML page. The trouble is, this stupid website injects gb2312-encoded body text into the page that gets parsed as utf-8 by beautiful soup. Is there a way to convert the text from this "gb2312 pretending to be utf-8" state to "proper expression of the character set in utf-8?"
Distinguishing post request's from possible poster elements
11,770,627
0
0
53
0
python,html,post,bottle
You could add a hidden input field to each form on the page with a specific value. On the server side, check the value of this field to detect which form the post request came from.
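In Bottle that could look like the following; the route, hidden-field name and form ids are assumptions for illustration:

```python
from bottle import post, request

# Each HTML form carries: <input type="hidden" name="form_id" value="...">
@post('/submit')
def submit():
    form_id = request.forms.get('form_id')
    if form_id == 'add-row':
        pass   # handle the "add" form's fields here
    elif form_id == 'delete-row':
        pass   # handle the "delete" form's fields here
    return 'ok'
```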
0
0
1
0
2012-08-02T02:36:00.000
1
1.2
true
11,770,312
0
0
1
1
So, the issue I'm running into is: how do I know which element of my page made a POST request? I have multiple elements that can make the POST request on the page, but how do I get the values from the element that created the request? It seems like this would be fairly trivial, but I have come up with nothing, and after quite a few Google searches I have come up with nothing again. Is there any way to do this using Bottle? I had an idea to add a route for an SQL page (with authentication, of course) to provide the action for the form and use the template to render the id in the action, but I was thinking there had to be a better way to do this without routing another page.
Import data into Google App Engine in a way that is "easy" for the user of the application
11,780,827
1
1
181
0
python,google-app-engine,google-cloud-datastore,data-import
Have the user upload the file, then start a task that runs the import, and email results/errors to the user at the end. The other way I have done it: have the user create the spreadsheet in Google Docs and supply the sheet key (or the link, if it's published), then start a task that processes the spreadsheet directly from Google Docs.
0
1
0
0
2012-08-02T14:00:00.000
2
0.099668
false
11,779,033
0
0
1
1
Building an application using Python in GAE that handles a lot of user data such as contacts, appointments, etc... Would like to allow users to import their old data from other applications. For example an appointment might look like: Start time Duration Service Customer Id 2012-08-02 09:50AM, 01:00:00, Hair cut, 94782910, 2012-08-02 10:50AM, 00:30:00, Dye job, 42548910, ... I'm unfamiliar with accepted practices with handling this type of situation. I also see issues with handling this on google app engine specifically because requests can not take longer than 30 seconds. Ideally, it seems like users should be able to upload CSV files of their data via a web page, but I don't really know of a good way to do this with app engine. Another way I can think of would be to let users cut and paste text directly into and HTML text area. Then javascript could be used to iterate the data and POST it to the server one row at a time or in small chunks. This sounds really sketchy to me though. Any ideas on what a "good" way to handle this would be? Thanks so much!
ImportError: No Module Named bs4 (BeautifulSoup)
63,454,563
0
170
402,243
0
python,beautifulsoup,flask,importerror
In case you are behind corporate proxy then try using following command pip install --proxy=http://www-YOUR_PROXY_URL.com:PROXY_PORT BeautifulSoup4
0
0
0
0
2012-08-02T18:47:00.000
22
0
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
ImportError: No Module Named bs4 (BeautifulSoup)
39,798,150
5
170
402,243
0
python,beautifulsoup,flask,importerror
If you use PyCharm, go to Preferences - Project Interpreter - and install bs4. If you install the package named BeautifulSoup instead, it will still report that there is no module named bs4.
0
0
0
0
2012-08-02T18:47:00.000
22
0.045423
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
ImportError: No Module Named bs4 (BeautifulSoup)
49,884,772
5
170
402,243
0
python,beautifulsoup,flask,importerror
I will advise you to uninstall the bs4 library by using this command: pip uninstall bs4 and then install it using this command: sudo apt-get install python3-bs4 I was facing the same problem in my Linux Ubuntu when I used the following command for installing bs4 library: pip install bs4
0
0
0
0
2012-08-02T18:47:00.000
22
0.045423
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
ImportError: No Module Named bs4 (BeautifulSoup)
51,779,869
5
170
402,243
0
python,beautifulsoup,flask,importerror
If you are using Anaconda for package management, following should do: conda install -c anaconda beautifulsoup4
0
0
0
0
2012-08-02T18:47:00.000
22
0.045423
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
ImportError: No Module Named bs4 (BeautifulSoup)
67,411,090
3
170
402,243
0
python,beautifulsoup,flask,importerror
This worked for me: pipenv install beautifulsoup4 (or, inside the project's virtualenv, pipenv run pip install beautifulsoup4).
0
0
0
0
2012-08-02T18:47:00.000
22
0.027266
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
ImportError: No Module Named bs4 (BeautifulSoup)
58,264,670
2
170
402,243
0
python,beautifulsoup,flask,importerror
pip install --user BeautifulSoup4
0
0
0
0
2012-08-02T18:47:00.000
22
0.01818
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
ImportError: No Module Named bs4 (BeautifulSoup)
70,539,039
0
170
402,243
0
python,beautifulsoup,flask,importerror
One more solution for PyCharm: Go to File -> Settings -> Python Interpreter, click on plus sign and find beautifulsoup4. Click install.
0
0
0
0
2012-08-02T18:47:00.000
22
0
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
ImportError: No Module Named bs4 (BeautifulSoup)
58,678,662
1
170
402,243
0
python,beautifulsoup,flask,importerror
A lot of tutorials/references were written for Python 2 and tell you to use pip install somename. If you're using Python 3 you want to change that to pip3 install somename.
0
0
0
0
2012-08-02T18:47:00.000
22
0.009091
false
11,783,875
1
0
1
8
I'm working in Python and using Flask. When I run my main Python file on my computer, it works perfectly, but when I activate venv and run the Flask Python file in the terminal, it says that my main Python file has "No Module Named bs4." Any comments or advice is greatly appreciated.
Django:How to stop a form from showing up when user enters the direct url of the page containing the form?
11,785,422
0
0
55
0
python,django,url
Can you just use HTTP verbs? If both of your entries into the URL are from forms, then you could simply not support GET access to that URL at all.
0
0
0
0
2012-08-02T20:18:00.000
1
0
false
11,785,116
0
0
1
1
Suppose I have two modelforms 'A' and 'B' associated with models 'C' and 'D' respectively.Model 'D' has a foreign key of model 'C'.So objects of models should be created first.Now when user submits form 'A',an object of 'C' is generated.Now to send the id of object of model 'C' I'm using a url like this "/{{ object.id }}/".This way the modelform 'B' gets to know which object of model 'C' should be associated with the object of model 'D'. Now the problem I'm facing is if I enter the url "/{{ object.id }}/",I get to see the modelform 'B' which I don't want.What can I do?
django-admin.py: command not found
24,193,778
1
3
6,301
0
python,django,django-admin
I have the same problem. For me, django-admin.py was in ~/.local/bin; this is because I ran pip install --user django.
0
1
0
0
2012-08-03T07:41:00.000
3
0.066568
false
11,791,368
0
0
1
1
I have django-admin.py in /usr/local/bin and I have tried all the help given on the web to make a symbolic link, but it still says django-admin.py: command not found. I am trying to start my first project in Django: django-admin.py startproject mysite. No matter what I do, I just keep getting django-admin.py: command not found. I am using Ubuntu 11.10. Thanks.
Can I run PHP and Python on the same Apache server using mod_vhost_alias and mod_wsgi?
36,646,397
0
3
6,266
1
php,python,apache,mod-vhost-alias
I faced the same situation; initially I was searching around on Google, but later I realised the cause and fixed it. I'm using the EC2 service in AWS with Ubuntu, and I created aliases for PHP and Python individually, and now I can access both.
0
0
0
1
2012-08-03T12:53:00.000
2
0
false
11,796,126
0
0
1
1
I currently run my own server "in the cloud" with PHP using mod_fastcgi and mod_vhost_alias. My mod_vhost_alias config uses a VirtualDocumentRoot of /var/www/%0/htdocs so that I can serve any domain that routes to my server's IP address out of a directory with that name. I'd like to begin writing and serving some Python projects from my server, but I'm unsure how to configure things so that each site has access to the appropriate script processor. For example, for my blog, dead-parrot.com, I'm running a PHP blog platform (Habari, not WordPress). But I'd like to run an app I've written in Flask on not-dead-yet.com. I would like to enable Python execution with as little disruption to my mod_vhost_alias configuration as possible, so that I can continue to host new domains on this server simply by adding an appropriate directory. I'm willing to alter the directory structure, if necessary, but would prefer not to add additional, specific vhost config files for every new Python-running domain, since apart from being less convenient than my current setup with just PHP, it seems kind of hacky to have to name these earlier alphabetically to get Apache to pick them up before the single mod_vhost_alias vhost config. Do you know of a way that I can set this up to run Python and PHP side-by-side as conveniently as I do just PHP? Thanks!
python Django custom error msg for any error (apart from try,except)
11,799,755
0
0
480
0
python,django
In Django, if an error occurs, it isn't actually propagated to the user automatically; rather, Django returns a 500 error page (with a traceback if it is in debug mode). If the other server is intelligent, it will realize that there is a 500 response instead of a 200. You can then define a "500.html" template in your root template directory to handle errors (or use the default Django template). GLHF
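For the AJAX case specifically, one hedged option is to point handler500 at a view that returns a small JSON body instead of the HTML error page (names are illustrative):

```python
# urls.py
#   handler500 = 'myapp.views.server_error'

# myapp/views.py
import json
from django.http import HttpResponse

def server_error(request):
    # Called for any unhandled exception once DEBUG is False
    body = json.dumps({'error': 'error occurred'})
    return HttpResponse(body, status=500, content_type='application/json')
```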
0
0
0
0
2012-08-03T13:06:00.000
2
1.2
true
11,796,333
0
0
1
1
In Django, whenever an error occurs and the code is not inside a try block, an exception is raised. At that point, if it is not in a try block, can we display a message instead of the error page? What I am actually asking is: is there anything in Django (like signals) that gets activated on an error and can print such a message? In my case it is an AJAX request, so what I want is: when something is not inside a try block and it still raises an error, it should at least send back an error message (to the other server which made the AJAX call to our server) saying "error occurred".
python bottle css image and css routing
11,800,422
4
1
1,803
0
python,python-2.7,bottle
In order for a web browser to be able to download and render the CSS or image, it will either have to be part of your page (where people can view it by viewing the source of the page) or accessible at a URL. So if you're trying to prevent people from being able to look at just your CSS or just your image, the answer is that there's no way around it.
0
0
0
0
2012-08-03T17:07:00.000
2
1.2
true
11,800,219
0
0
1
1
How would I go about linking CSS and images to a template without routing them through Bottle (@route('/image/') or @route('/css/')) and using a static_file return? I am unable to link the CSS normally (it can't find the CSS/image), and if I do it through static_file anyone can go to that link and view the CSS/image (i.e. www.mysite.com/css/css.css or www.mysite.com/image/image.png). Is there any way to get around this issue?
Which is more expensive (in $): database memory or processing power?
11,806,820
4
3
612
0
python,postgresql,heroku,hosting
Difficult to answer without testing it out but you might want to answer these questions: 1) How expensive is the diff operation? Run a test or compute the complexity. If diff operation is on really large files or rapidly changing files, you might want to modify the algorithm. Storing diffs doesn't seem like a great solution if the files are large, change little or change rapidly over time. 2) How many times would you need to generate the same diff with the same files and is there a time bound associated with this? - If the same diff is generated over and over again in a short span of time, you might want to cache it and not write it to a database. If the diff is accessed sporadically over time (Few days, months), you might want to store it that is after analyzing 1 above. You might benchmark using costs on Amazon Web Services. Again you have choices there. You could just use a single EC2 instance for everything or split the workflow against RDS, EC2 and S3 and then analyze the cost. Depends on what level of scale you desire.
0
0
0
0
2012-08-04T07:12:00.000
3
1.2
true
11,806,700
0
0
1
2
This is a fairly abstract question, I hope it is within bounds. I'm about 5 months into my coding career in web development. I've found that there's often a tension between CPU and storage resources. Put simply, you can use less of one and more of the other, or vice versa (then throw in the speed consideration). I'm now getting to the point of deploying my first app for production, so this balance is now a matter of real dollars and cents. The thing is this: I really don't have any idea what kind of balance I should be looking for. Here's some salient examples that might illuminate the balance to be struck in different case scenarios. Background I am working on an app that does alot of diffs between text. Users will call on pages that contain diffs displayed in html. A lot. First Case Should I run a diff each time a page is displayed, or should I run the diff once, store it, and call it each time a page is displayed? Second Case I have coded up an algorithm that summarises diffs. It's about 110 lines of code, and it uses 4 or 5 loops and subloops. Again, should I run this once and store the results, so that they can be called on later, or should I just run the algorithm each time a page is displayed? Would also love to hear your views on the best tools to use to quantify the balance.
Which is more expensive (in $): database memory or processing power?
11,809,281
2
3
612
0
python,postgresql,heroku,hosting
My suggestion would be to store the cache in DB tables, not in memory. If the entries are referenced a lot, they will be in memory anyway (in disk buffers). The advantage of this approach is that the diffs will compete for a place in core with the other DB tables, which is always smarter than pre-allocating (and managing) XXX bytes of memory. An additional advantage is that maintaining {hitcount, date of access, ...} for the cache entries is relatively easy, and its management can all be done in SQL. And remember: disk space is cheap. It is very easy to have an XXX GB cache on disk and effectively use only XXX MB of it; the hard hitters will be in memory while the long tail sits on disk, and it is always possible to grow or shrink the cache. Cost estimate for the uncached version: I/O + buffer memory cost for the 2 files, CPU + memory cost for the diff operation, buffer memory for the result. Cost estimate for the cached version: I/O to fetch the diff, CPU + memory for the query, buffer memory for the result. Comparing the two: the uncached version has a larger I/O cost (given the diff is smaller than the sum of the two files), the uncached version always has a larger memory footprint, and the query cost could be smaller than the diff-execution cost... or it could be larger.
0
0
0
0
2012-08-04T07:12:00.000
3
0.132549
false
11,806,700
0
0
1
2
This is a fairly abstract question, I hope it is within bounds. I'm about 5 months into my coding career in web development. I've found that there's often a tension between CPU and storage resources. Put simply, you can use less of one and more of the other, or vice versa (then throw in the speed consideration). I'm now getting to the point of deploying my first app for production, so this balance is now a matter of real dollars and cents. The thing is this: I really don't have any idea what kind of balance I should be looking for. Here's some salient examples that might illuminate the balance to be struck in different case scenarios. Background I am working on an app that does alot of diffs between text. Users will call on pages that contain diffs displayed in html. A lot. First Case Should I run a diff each time a page is displayed, or should I run the diff once, store it, and call it each time a page is displayed? Second Case I have coded up an algorithm that summarises diffs. It's about 110 lines of code, and it uses 4 or 5 loops and subloops. Again, should I run this once and store the results, so that they can be called on later, or should I just run the algorithm each time a page is displayed? Would also love to hear your views on the best tools to use to quantify the balance.
Fastest way to get map coordinates in a javascript function from Django
11,808,220
1
2
391
0
javascript,python,django
I think the best way would be to store this data in a database, for a couple of reasons: You will be able to perform some queries on this data, like "give me all points in the view port" or "give me all points that are 5km from some other point" - even if you don't need it now it might be very useful in the future, especially if you think about having around 1000 points There are wonderful utils to keep coordinates in the database that integrate very well with Django - you should definitely check postgis and geodjango You are using Django, so you probably have some other data in the database and it's nice to have everything in one place Unless it is some kind of static data that you'd like to display on the map that is not likely to change, it doesn't feel right to keep it anywhere else than in the database. If for some reason you don't want to use any database (even though you can always store this data in a file with sqlite) you can also try storing it in some python object and then send them to js (so the second option), and the third one I think is the worst - you are not able to make any operation with this data outside javascript, it would be very hard to read or debug (syntax errors for example). Hth
0
0
0
0
2012-08-04T09:44:00.000
1
1.2
true
11,807,565
0
0
1
1
So...i have used a map in a project which needs coordinates to set markers to different positions. There are many options available to me to get the coordinates. Store the coordinates in a database and use django view to get and forward the coordinates to the javascript function using an ajax response. Store it into a python list or dictionary and send the data when needed to that javascript func. Hard code the coordinates in a HTML tag attribute and get them via javascript and then set the marker. Use files and get data through file I/O in django view and forward that to the javascript function. I want to know which of these techniques is efficient for about 50 set of coordinates and which one will be more sufficient if my set increases to about 1000? If you have a better way to do this...please share it.. Thanks
How to Accomplish App Version Switching Using Buildout?
11,897,450
1
2
170
0
python,virtualenv,buildout,gunicorn,supervisord
Just like with your point-mod_wsgi-at-a-different-folder solution, you can do the same with gunicorn/buildout. Just set up your latest buildout in a different directory, stop the old gunicorn and start the new. There'll be a short delay between stopping the one and starting the other, of course. Alternative: set up the new one with a different port number, change the nginx config and kick ngnix if you really want zero downtime-seconds.
0
0
0
0
2012-08-05T02:46:00.000
1
1.2
true
11,813,585
0
0
1
1
My application is developed with Flask and uses buildout to handle dependency isolation. I plan to use Gunicorn and supervisord as wsgi container and process manager, in front of which there is Nginx doing load balancing. Here is the problem when deploying a new version of the application: everything is builtout in a subfolder, how to restart the gunicorn server so that the version switching can take place gracefully? I come up with some solutions of course: Ditch gunicorn and superviosrd, and turn to apache mod_wsgi, so when deploying a new version I could simply change the folder in .wsgi file and the server will restart. Use virtualenv and install gunicorn, supervisord, as well as my application package in it, so when switching version I just restart it using supervisorctl. Is there a 'pure' buildout way that can accomplish this situation? Or any in-use production solutions will all be appreciated. Thanks in advance.
pycharm convert tabs to spaces automatically
43,841,682
12
131
141,288
0
python,pycharm
Ctrl + Shift + A opens a popup window for selecting actions; choose "To Spaces" to convert all tabs to spaces, or "To Tabs" to convert all spaces to tabs.
0
0
0
0
2012-08-05T11:49:00.000
8
1
false
11,816,147
1
0
1
5
I am using the PyCharm IDE for Python development. It works perfectly fine for Django code, so I suspected that converting tabs to spaces is the default behaviour; however, in the Python IDE I am getting errors everywhere because it can't convert tabs to spaces automatically. Is there a way to achieve this?
pycharm convert tabs to spaces automatically
20,491,867
64
131
141,288
0
python,pycharm
For selections, you can also convert the selected text using the "To Spaces" function. I usually just invoke it via Ctrl-Shift-A and then find "To Spaces" from there.
0
0
0
0
2012-08-05T11:49:00.000
8
1
false
11,816,147
1
0
1
5
I am using the PyCharm IDE for Python development. It works perfectly fine for Django code, so I suspected that converting tabs to spaces is the default behaviour; however, in the Python IDE I am getting errors everywhere because it can't convert tabs to spaces automatically. Is there a way to achieve this?
pycharm convert tabs to spaces automatically
54,234,510
6
131
141,288
0
python,pycharm
Ctrl+Alt+Shift+L -> reformat the whole file :)
0
0
0
0
2012-08-05T11:49:00.000
8
1
false
11,816,147
1
0
1
5
I am using the PyCharm IDE for Python development. It works perfectly fine for Django code, so I suspected that converting tabs to spaces is the default behaviour; however, in the Python IDE I am getting errors everywhere because it can't convert tabs to spaces automatically. Is there a way to achieve this?
pycharm convert tabs to spaces automatically
66,066,077
1
131
141,288
0
python,pycharm
Just to note: PyCharm's "To Spaces" function only works on indent tabs at the beginning of a line, not interstitial tabs within a line of text - for example, when you are trying to format columns in monospaced text.
0
0
0
0
2012-08-05T11:49:00.000
8
0.024995
false
11,816,147
1
0
1
5
I am using the PyCharm IDE for Python development. It works perfectly fine for Django code, so I suspected that converting tabs to spaces is the default behaviour; however, in the Python IDE I am getting errors everywhere because it can't convert tabs to spaces automatically. Is there a way to achieve this?
pycharm convert tabs to spaces automatically
57,950,986
1
131
141,288
0
python,pycharm
For me it was having a file called ~/.editorconfig that was overriding my tab settings. I removed that (surely that will bite me again someday) but it fixed my pycharm issue
0
0
0
0
2012-08-05T11:49:00.000
8
0.024995
false
11,816,147
1
0
1
5
I am using the PyCharm IDE for Python development. It works perfectly fine for Django code, so I suspected that converting tabs to spaces is the default behaviour; however, in the Python IDE I am getting errors everywhere because it can't convert tabs to spaces automatically. Is there a way to achieve this?
Inconsistent SignatureDoesNotMatch Amazon S3 with django-pipeline, s3boto and storages
57,364,827
0
3
5,584
0
python,django,amazon-s3,boto,django-storage
Simple workaround for me was to generate a new access key with only alphanumeric characters (ie no special characters such as "/", "+", etc. which AWS sometimes adds to the keys).
0
0
0
0
2012-08-05T22:23:00.000
5
0
false
11,820,566
0
0
1
1
I have 2 files compiled by django-pipeline along with s3boto: master.css and master.js. They are set to "Public" in my buckets. However, when I access them, sometimes master.css is served, sometimes it errs with SignatureDoesNotMatch. The same with master.js. This doesn't happen on Chrome. What could I be missing? EDIT: It now happens on Chrome too.
Django and Bootstrap: What app is recommended?
11,821,278
6
70
50,626
0
python,django,twitter-bootstrap
I've been using django-crispy-forms with Bootstrap for the last couple of months and it has been quite useful. Forms render exactly as they're meant to. If you do any custom form rendering, though, be prepared to define your forms in code rather than in the template, using helpers.
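As a rough illustration of the "define your forms in code, using helpers" point, here is a small django-crispy-forms sketch; the field names and button label are made up, not from the answer.

from django import forms
from crispy_forms.helper import FormHelper
from crispy_forms.layout import Submit

class ContactForm(forms.Form):
    name = forms.CharField()
    email = forms.EmailField()

    def __init__(self, *args, **kwargs):
        super(ContactForm, self).__init__(*args, **kwargs)
        # the helper tells crispy how to render the form with Bootstrap markup
        self.helper = FormHelper()
        self.helper.form_method = 'post'
        self.helper.add_input(Submit('submit', 'Send'))

# in the template: {% load crispy_forms_tags %} ... {% crispy form %}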
0
0
0
0
2012-08-05T23:52:00.000
4
1
false
11,821,116
0
0
1
1
I want to start using Twitter's Bootstrap for a recently started Django app. I have quite a bit of experience with Django, but I'm totally new to Bootstrap. What's the best way to proceed? Are there any particular Boostrap apps for Django you would recommend or have experience with? I understand that I could use Bootstrap directly, without any special Bootstrap-specific Django apps. However, I also read that the form rendering doesn't come out particularly well without a little server side support (rendering the Bootstrap specific CSS into the form HTML, for example). There seem to be several projects, such as crispy forms, django-bootstrap-toolkit, etc. Looking at their project pages, I can see different levels of activity and support. If I decide to go with one of those, I would of course like to pick one which has some momentum and therefore a good likelihood of staying supported and maintained for a while. This is very important and so even if the particular app doesn't have all possible features or is a bit less flexible, it might still be a good choice, due to support/freshness, availability of examples, etc. Thank you for any recommendations or feedback.
Python multiprocessing and Django - I'm confused
11,823,901
0
1
685
0
python,django,multiprocessing
You can try Celery, as it's Django-friendly. But to be honest I'm not fond of it (bugs :) - we are going to switch to Gearman. Writing your own job producers and consumers (workers) is kind of fun!
0
0
0
0
2012-08-06T06:42:00.000
2
0
false
11,823,586
1
0
1
1
I'm trying to write a web app in Python, which is to consist of two parts: A Django-based user interface, which allows each user to set up certain tasks. Worker processes (one per user), which, when started by the user, perform the tasks in the background without freezing the UI. Since any object I create in a view is not persistent, I have no way of keeping track of worker processes. I'm not even sure how to approach this task. Any ideas?
Selectively indexing subdomains
11,829,421
1
0
134
0
python,seo,indexing,robots.txt,googlebot
No. The search engine should not care what script generates the pages. As long as the pages generated by the webapps are indexed, you should be fine. Second question: you should create a separate robots.txt per subdomain. That is, when robots.txt is fetched from a particular subdomain, return a robots.txt file that pertains to that subdomain only. So if you want the subdomain indexed, have that robots file allow everything. If you don't want it indexed, have the robots file deny everything.
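A tiny sketch of what "a separate robots.txt per subdomain" can look like when a single script serves every subdomain - branch on the requesting host; the helper and the subdomain names are only examples, not from the answer.

# hypothetical helper inside default_script.py
ALLOWED_SUBDOMAINS = set(['subdomain1.xyzdomain.com'])

def robots_txt_for(host):
    """Return the robots.txt body for the subdomain that asked for it."""
    if host in ALLOWED_SUBDOMAINS:
        return "User-agent: *\nDisallow:\n"      # allow indexing
    return "User-agent: *\nDisallow: /\n"        # block indexing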
0
1
0
1
2012-08-06T13:34:00.000
2
1.2
true
11,829,360
0
0
1
1
I am working on a web application which allows users to create their own webapps in turn. For each new webapp created by my application I assign a new subdomain, e.g. subdomain1.xyzdomain.com, subdomain2.xyzdomain.com etc. All these webapps are stored in a database and are served by a Python script (say default_script.py) kept in /var/www/. Until now, I have blocked search engine indexing for the directory (/var/www/) using robots.txt, which essentially blocks indexing of all my scripts, including default_script.py, as well as the content served for multiple webapps through that default_script.py script. But now I want some of those subdomains to be indexed. After searching for a while I was able to figure out a way to block indexing of my scripts by explicitly specifying them in robots.txt. But I am still doubtful about the following: Will blocking my default_script.py from indexing also block indexing of all content that is served from default_script.py? If I let it be indexed, will default_script.py start showing up in search results as well? How can I allow indexing of some of the subdomains selectively? E.g.: index subdomain1.xyzdomain.com but NOT subdomain2.xyzdomain.com.
python: how to pass parameters to application in eclipse without using command prompt
11,843,883
3
1
94
0
python,django
You can create a "Run Configuration" in Eclipse to invoke manage.py. There is an "Arguments" tab that allows you to provide command line arguments.
0
0
0
0
2012-08-07T10:18:00.000
1
0.53705
false
11,843,817
0
0
1
1
I am doing a project on Django, and every time we want to run the project we have to type commands in the command prompt. Can we pass arguments to manage.py in Eclipse (the IDE) itself, or in the run configuration? Please throw some light on this.
How to change ancestor of an NDB record?
11,855,209
9
5
1,422
1
python,google-app-engine,google-cloud-datastore
The only way to change the ancestor of an entity is to delete the old one and create a new one with a new key. This must be done for all child (and grand child, etc) entities in the ancestor path. If this isn't possible, then your listed solution works. This is required because the ancestor path of an entity is part of its unique key. Parents of entities (i.e., entities in the ancestor path) need not exist, so changing a parent's key will leave the children in the datastore with no parent.
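Since the key (and therefore the ancestor path) is immutable, "re-parenting" boils down to copy-then-delete. A rough NDB sketch of that, assuming a Message model (your own model name) and ideally run as a cross-group transaction; everything here is illustrative rather than a verbatim recipe from the answer.

from google.appengine.ext import ndb

@ndb.transactional(xg=True)
def reparent_message(old_msg_key, new_contact_key):
    old_msg = old_msg_key.get()
    # copy the property values under the new ancestor, then drop the old entity
    new_msg = Message(parent=new_contact_key, **old_msg.to_dict())
    new_msg.put()
    old_msg_key.delete()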
0
1
0
0
2012-08-07T21:04:00.000
1
1.2
true
11,854,137
0
0
1
1
In the High-Replication Datastore (I'm using NDB), the consistency is eventual. In order to get a guaranteed complete set, ancestor queries can be used. Ancestor queries also provide a great way to get all the "children" of a particular ancestor with kindless queries. In short, being able to leverage the ancestor model is hugely useful in GAE. The problem I seem to have is rather simplistic. Let's say I have a contact record and a message record. A given contact record is being treated as the ancestor for each message. However, it is possible that two contacts are created for the same person (user error, different data points, whatever). This situation produces two contact records, which have messages related to them. I need to be able to "merge" the two records, and bring put all the messages into one big pile. Ideally, I'd be able to modify ancestor for one of the record's children. The only way I can think of doing this, is to create a mapping and make my app check to see if record has been merged. If it has, look at the mappings to find one or more related records, and perform queries against those. This seems hugely inefficient. Is there more of "by the book" way of handling this use case?
Swapy could not be used to access swing properties of swing based java application.How to access swing properties of a java application
12,048,644
5
0
1,126
0
python,swing,ui-automation,pywinauto
Pywinauto uses standard windows API calls. Unfortunately many UI libraries (like Swing/QT/GTK) do not respond in a typical way to the API calls used - so unfortunately pywinauto usually cannot get the control information. (I am the Author of pywinauto). I can't give you a way to access the properties of the Swing controls.
0
1
0
0
2012-08-08T08:07:00.000
1
1.2
true
11,860,280
0
0
1
1
I am using Swapy (a desktop automation tool which uses the pywinauto Python package) to automate desktop UI activities, but Swapy does not recognize the properties of a Swing-based Java application, although it can recognize the properties of other applications like Notepad, Windows Media Player etc. Can anybody please explain the reason for this problem, and can I use Swing Explorer for this Swing-based application, of which I do not have the code, just the application? If I can't use it, please give me a way/solution to access the properties of a Swing-based Java application. Thanks in advance.
Create a receipt for a user form submission
11,879,610
1
0
504
0
python,form-submit
@Serafeim, your approach is very good for the situation. Here are some ideas of extending it: Make sure that the secret_word (in hashing terms it is called salt) is long enough. Make the end function a bit more complex, e.g. hash = h(h(username) + month + year + h(salt)) Use a bit more complex hash function, e.g. SHA1 Don't give the end user the whole hash value. E.g. md5 hex digest contains 32 digits, but it would be enough to have first 5-10 digits of the hash in the report. Updated: In case you have resources, generate a random salt per user. Then even if somehow a user will learn the salt and the hash function, it will be still useless for the others.
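A small sketch of the salted, truncated receipt described above (SHA-1 with a nested hash, handing the user only a short hex prefix); the salt string is a placeholder you would keep secret, and the helper names are made up.

import hashlib

SALT = 'replace-with-a-long-random-secret'   # placeholder secret

def h(value):
    return hashlib.sha1(value).hexdigest()

def receipt(username, month, year):
    digest = h(h(username) + str(month) + str(year) + h(SALT))
    return digest[:8]    # give the user only the first 8 hex digits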
0
0
0
0
2012-08-09T06:11:00.000
1
1.2
true
11,877,666
1
0
1
1
There is a requirement that our users should complete and submit a form once a month. So, each month we should have a form that will contain data for the triplet (username, month, year). I want our users to be able to certify that they did actually submit the form for that particular month by creating a receipt for them. So, for each month there will be a report containing the data the user submitted along with the receipt. I don't want the users to be able to create that receipt by themselves though. What I was thinking was to create a string that contained username, month, year, secret_word and give the md5 hash of that string to the users as their receipt. That way because the users won't have the secret word they won't be able to generate the md5 hash. However my users will probably complain when they see the complexity of that md5 hash. Also if the find out the secret word they will be able to create receipts for everybody. Is there a standard way of doing what I ask ? Could you recommend me any other possible solutions ? I am using Python but some pseudocode or link to the appropriate methods would be ok.
How do I make Django use a different database besides the 'default'?
11,878,547
1
0
304
1
python,database,django,configuration
You can just use a different settings.py in your production environment. Or - which is a bit cleaner - you might want to create a file settings_local.py next to settings.py where you define a couple of settings that are specific for the current machine (like DEBUG, DATABASES, MEDIA_ROOT etc.) and do a from settings_local import * at the beginning of your generic settings.py file. Of course settings.py must not overwrite these imported settings.
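The settings_local.py idea usually reduces to a couple of lines in settings.py; a minimal sketch, wrapped in try/except so machines without the file still start (the file name is just a convention):

# in settings.py
try:
    from settings_local import *   # machine-specific DEBUG, DATABASES, MEDIA_ROOT, ...
except ImportError:
    pass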
0
0
0
0
2012-08-09T07:14:00.000
3
0.066568
false
11,878,454
0
0
1
1
I am relatively new to Django and one thing that has been on my mind is changing the database that will be used when running the project. By default, the DATABASES 'default' is used to run my test project. But in the future, I want to be able to define a 'production' DATABASES configuration and have it use that instead. In a production environment, I won't be able to "manage.py runserver" so I can't really set the settings. I read a little bit about "routing" the database to use another database, but is there an easier way so that I won't need to create a new router every time I have another database I want to use (e.g. I can have test database, production database, and development database)?
Django:Any way to get date and name of uploaded file later?
11,882,269
1
4
1,226
0
python,django
You can use the original file's name as part of the file name when storing in the disk, and you probably can use the file's creation/modification date for the upload date. IMO, you should just store it explicitly in the database.
0
0
0
0
2012-08-09T10:58:00.000
2
0.099668
false
11,881,900
0
0
1
1
Is there any way to get the uploaded file date and name which we have stored into the database using forms ? Right now I am just creating two more database tuples for name and date and storing them like this file_name = request.FILES['file'].name for file_name and storing date using upload_date = datetime.datetime.now()
Should PostgreSQL connections be pooled in a Python web app, or create a new connection per request?
11,889,137
1
8
9,082
1
python,postgresql,web-applications,flask,psycopg2
I think connection pooling is the best thing to do if this application is to serve multiple clients concurrently.
0
0
0
0
2012-08-09T17:48:00.000
5
0.039979
false
11,889,104
0
0
1
3
I'm building a web app in Python (using Flask). I do not intend to use SQLAlchemy or similar ORM system, rather I'm going to use Psycopg2 directly. Should I open a new database connection (and subsequently close it) for each new request? Or should I use something to pool these connections?
Should PostgreSQL connections be pooled in a Python web app, or create a new connection per request?
11,889,659
3
8
9,082
1
python,postgresql,web-applications,flask,psycopg2
The answer depends on how many such requests will happen, and how many of them run concurrently, in your web app. Connection pooling is usually a better idea if you expect your web app to be busy with hundreds or even thousands of concurrently logged-in users. If you are only doing this as a side project and expect fewer than a few hundred users, you can probably get away without pooling.
0
0
0
0
2012-08-09T17:48:00.000
5
0.119427
false
11,889,104
0
0
1
3
I'm building a web app in Python (using Flask). I do not intend to use SQLAlchemy or similar ORM system, rather I'm going to use Psycopg2 directly. Should I open a new database connection (and subsequently close it) for each new request? Or should I use something to pool these connections?
Should PostgreSQL connections be pooled in a Python web app, or create a new connection per request?
61,078,209
0
8
9,082
1
python,postgresql,web-applications,flask,psycopg2
Pooling seems to be totally impossible in the context of Flask, FastAPI and everything relying on WSGI/ASGI dedicated servers with multiple workers. The reason for this behaviour is simple: you have no control over the pooling or the master thread/process. A pooling instance is only usable by a single thread serving a set of clients - so by just one worker. Any other worker will get its own pool, and therefore there cannot be any sharing of established connections. Logically it's also impossible, because you cannot share these object states across threads/processes in a multi-core environment with Python (2.x - 3.8).
0
0
0
0
2012-08-09T17:48:00.000
5
0
false
11,889,104
0
0
1
3
I'm building a web app in Python (using Flask). I do not intend to use SQLAlchemy or similar ORM system, rather I'm going to use Psycopg2 directly. Should I open a new database connection (and subsequently close it) for each new request? Or should I use something to pool these connections?
Specify Python Version for Virtualenv in Requirements.txt
11,890,145
3
27
20,644
0
python,virtualenv,pip
it would be really convenient not to have to tell every new person joining the team how to set up their virtualenv Just add it to the normal set of instructions you give new members when they join, right in the same place where you tell them about the internal documentation wiki, the password to the wifi, and the phone number of the sandwich delivery shop. It would be extremely inconvenient not to tell people and have them figure it out themselves - the first time they submit something that uses collections.Counter, only to find it broke the build because the server doesn't have 2.7.x.
0
0
0
0
2012-08-09T18:43:00.000
5
0.119427
false
11,889,932
1
0
1
1
I'm using virtualenv to develop a django application with a team. The server we're deploying on is running python 2.6, but the default for our machines is 2.7.3. Is there any way to specify python version in the requirements.txt file, or something similar, within the code base? I know requirements.txt is a pip thing, and python version is a virtualenv thing, but it would be really convenient not to have to tell every new person joining the team how to set up their virtualenv.
Preparing a pyramid app for production
11,898,284
3
3
2,476
0
python,pyramid,production
Well, the big difference between python setup.py develop and python setup.py install is that install will install the package in your site-packages directory, while develop will install an egg-link that points to the directory for development. So yes, you can technically use either method. But depending on how you built your project, installing into site-packages might be a bad idea. Why? File uploads or anything else your app might generate, like dynamic files. If your app doesn't use config files to find out where to save its files, installing it into site-packages and running it may end up writing files into the site-packages directory. In other words, you have to make sure that all files and directories that may be generated can be located via config files. If all dynamic directories are pointed to in the configs, then installing is fine. All you'll have to do is create a folder with a production.ini file and run pserve production.ini. That way the code can live anywhere on your machine, and you can also use uWSGI or any other WSGI server you like. I think installing the code isn't a bad thing, and keeping data apart from the application is a good thing; it has some advantages for deployment, I guess.
0
0
0
1
2012-08-10T01:18:00.000
1
1.2
true
11,894,210
0
0
1
1
So as I near the production phase of my web project, I've been wondering how exactly to deploy a Pyramid app. In the docs, it says to use ../bin/python setup.py develop to put the app in development mode. Is there another mode that is designed for production? Or do I just use ../bin/python setup.py install?
Testing memory usage of python frameworks in Virtualenv
12,218,779
1
1
1,578
0
python,memory,virtualenv,web-frameworks
It depends on how you're going to run the application in your environment. There are many different ways to run Python web apps. Recently popular methods seem to be Gunicorn and uWSGI. So you'd be best off running the application as you would in your environment and you could simply use a process monitor to see how much memory and CPU is being used by the process running your applicaiton.
0
0
0
1
2012-08-10T01:37:00.000
3
0.066568
false
11,894,333
1
0
1
1
I'm creating an app in several different python web frameworks to see which has the better balance of being comfortable for me to program in and performance. Is there a way of reporting the memory usage of a particular app that is being run in virtualenv? If not, how can I find the average, maximum and minimum memory usage of my web framework apps?
How can I check programmatically the status of my task queue in Google Appengine?
11,903,487
0
6
585
0
python,google-app-engine,queue
There is a task queue link on your appengine console where you can look at the pending tasks, statistics and see what's going on.
0
1
0
0
2012-08-10T13:41:00.000
2
0
false
11,902,954
0
0
1
1
I am firing up a queue to complete some tasks in Python Appengine application. Is there a way to get the status of the queue? I would like to check whether it is still running or has incomplete tasks.
Selective user registration using django_registration?
11,909,991
0
1
82
0
python,django,authentication,django-registration
To limit registrations to people already in the database, you will need some way to identify them. Require the club administrator to enter an email address for each member entered. Require the user to supply that address when registering. Send the registration link to that address, including the primary key of the user record in the link. When the user clicks the link, in your django view examine the link and make sure the key matches, then complete the registration.
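A rough sketch of step 3 - e-mailing the member a registration link that carries the record's key - using Django's send_mail; the URL layout, sender address and field names are assumptions (in practice you would also sign or tokenize the link rather than exposing the raw primary key).

from django.core.mail import send_mail

def send_registration_link(member):
    # member is the club-member record the administrator just created
    link = 'https://example.com/register/%d/' % member.pk
    send_mail('Complete your registration',
              'Please finish signing up here: %s' % link,
              '[email protected]',
              [member.email])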
0
0
0
0
2012-08-10T21:57:00.000
2
1.2
true
11,909,800
0
0
1
2
I have a site catering to multiple clubs, and each club has administrators who maintain a database of club members. I want to limit site registration only to members who have explicitly been added to the club's database. How do I go about auto-generating and sending out registration links to members as they are added to the database? In other words, I want registration to be initiated only by club administrators.
Selective user registration using django_registration?
11,909,979
0
1
82
0
python,django,authentication,django-registration
You said you have a database of club members already, so you must have a primary key or a tuple which is unique in that database - like a club registration number - which club members should already know. Tell users to give their primary key value (club registration number) at the time of registration. Make that club registration number also a primary key in the new table you create for the user after registration; if somebody re-uses that club registration number to re-register, it will fail because a row associated with that club registration number will already exist. Also show a warning message at registration time that one club registration number can be used for only one registration on the site.
0
0
0
0
2012-08-10T21:57:00.000
2
0
false
11,909,800
0
0
1
2
I have a site catering to multiple clubs, and each club has administrators who maintain a database of club members. I want to limit site registration only to members who have explicitly been added to the club's database. How do I go about auto-generating and sending out registration links to members as they are added to the database? In other words, I want registration to be initiated only by club administrators.
How to deal with deployments to a single host with multiple application servers using Fabric?
11,918,085
2
3
202
0
python,host,fabric
It's just Python, so do what you need to do to keep them separate. You can define the directory differences in a dictionary or some YAML file that's read into the script. There isn't anything built into Fabric that forces you to do it one way or provides any specific way to do this. Essentially, just keep in mind that it's not a DSL, it's a full Python file, and you'll stumble onto what works best for you and your environment.
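For example (paths and app names invented, not from the answer), a dict plus a parameterised task lets one fabfile target any of the app servers on that single host:

from fabric.api import cd, env, run

env.hosts = ['appbox.example.com']   # the single physical host

APP_DIRS = {
    'billing': '/srv/apps/billing',
    'reports': '/srv/apps/reports',
}

def deploy(name):
    """Usage: fab deploy:billing -- deploy one app server by name."""
    with cd(APP_DIRS[name]):
        run('git pull && ./restart.sh')

def deploy_all():
    for name in APP_DIRS:
        deploy(name)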
0
1
0
0
2012-08-10T23:00:00.000
1
0.379949
false
11,910,295
0
0
1
1
I have many application servers running on the same host. Every application server is installed in a different directory. How should I tackle deployments on the servers, using Fabric? I would like to be able to perform deployments on each server separately, and on subsets of servers. Clearly the env.hosts parameter has no use here, since all servers are on the same host. Same goes for the env.roledefs parameter. These come in handy when every server is installed on a different host. How should I deal with grouping of the servers, and setting separate environment parameters for each one of them which the fab tool can read and apply.
Cron job on Appengine - first time on start?
11,918,536
2
0
211
0
python,google-app-engine,cron
There is no "launch" the app in production as such. You deploy the app for the first time and crontab is now present and crontab scheduling is started. So I assume you really mean you would like to run the cron job every time you deploy a new version of your application in addition to the cron schedule. The cron handler is callable by you, so why not just wrap appcfg in a script that calls the cron handler after you do the deploy. Use wget/curl etc.....
0
1
0
1
2012-08-11T21:37:00.000
1
0.379949
false
11,917,869
0
0
1
1
I am successfully running a cron job every hour on Google Appengine. However I would like it to start when I launch the app. Now it does the first cron job 1 hour after the start. I am using Python.
Migrate from JavaScript to Python on Google Apps
11,927,684
3
2
157
0
javascript,python,google-apps-script
From what I can tell, the JavaScript-like language is the only one offered for Google Apps Script. You seem to have confused it with Google App Engine, which is a platform-as-a-service that you can use to write your own applications, and offers Java, Python, and Go runtime environments. It is not a scripting language for Google Apps products such as Docs spreadsheets; that's what Apps Script is for.
0
0
0
0
2012-08-13T02:40:00.000
1
1.2
true
11,927,604
1
0
1
1
I've inherited a fairly complex Googledoc spreadsheet with some scripted functionality implemented in the Google App Engine. The original coder used the JavaScript environment. Personally, I'm more comfortable with Python and I'm running into all kinds of weird errors on the JavaScript environment. I'd like to just scrap what we have and rewrite the same scripts in Python, an exercise in translation, if you will...I'm wondering if there's a way to do that keeping the original spreadsheet so I don't have to recreate all the existing spreadsheet structure (several tabs, each with built-in conditional formatting, filters, etc. not to mention a length and complex submission form). So, in short, I'd like to switch from JavaScript to Python in GAE -- is it possible? If so, how? If not, is there a way to copy the whole spreadsheet but start fresh with a blank Python script? Thanks in advance.
OpenERP : Saving field value (amount) into an account
11,929,906
1
1
226
0
python,openerp,accounting
You can override "pay_and_reconcile" function to write in account field, this function is called at time of Pay. action_date_assign() action_move_create() action_number() this 3 function are called at time of validating invoice. You can override any one from this or you can add your own function . in workflow for the "open" activity.
0
0
0
0
2012-08-13T06:20:00.000
2
1.2
true
11,929,073
0
0
1
1
I'm new to OpenERP and Python and I need some help saving an amount into a particular account. I have created a field in the invoice form that calculates a specific amount based on some code and displays that amount in the field. What I want to do is to associate an account with this field, so that when the invoice is validated and/or paid, this amount is saved into an account and later on I can see it in the journal entries and/or chart of accounts. Any idea how to do that?
Secured communication between two web servers (Amazon EC2 with Django and Google App Engine)
11,960,819
1
3
705
0
python,django,google-app-engine,amazon-ec2,urllib
For server-to-server communication, traditional security advice would recommend some sort of IP range restriction at the web server level for the URLs, in addition to whatever default security is in place. However, since you are making the call from one cloud provider to another cloud provider, your ability to permanently control the IP address of either the client or the server may be diminished. That said, I would recommend using a standard username/password authentication mechanism and HTTPS for transport security. Basic auth with a username/password would be my recommendation (https://username:[email protected]/). In addition, I would make sure to enforce a lockout based on a certain number of failed attempts within a specific time window. This discourages attempts to brute-force the password. Depending on what web framework you are using on App Engine, there is probably already support for some or all of what I just mentioned. If you update this question with more specifics on your architecture, or open a new question with more information, we could give you a more accurate recommendation.
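For the App Engine → Django callback itself, here is a bare-bones Python 2 sketch of Basic auth over HTTPS; the URL and credentials are placeholders, not values from the question.

import base64
import urllib2

def notify_django(payload):
    req = urllib2.Request('https://example.com/api/upload-complete/', data=payload)
    token = base64.b64encode('serviceuser:secret-password')   # placeholder credentials
    req.add_header('Authorization', 'Basic ' + token)
    return urllib2.urlopen(req).read()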
0
1
0
0
2012-08-13T20:49:00.000
3
0.066568
false
11,942,094
0
0
1
2
I have a website which uses Amazon EC2 with Django and Google App Engine for its powerful Image API and image serving infrastructure. When a user uploads an image the browser makes an AJAX request to my EC2 server for the Blobstore upload url. I'm fetching this through my Django server so I can check whether the user is authenticated or not and then the server needs to get the url from the App Engine server. After the upload is complete and processed in App Engine I need to send the upload info back to the django server so I can build the required model instances. How can I accomplish this? I was thinking to use urllib but how can I secure this to make sure the urls will only get accessed by my servers only and not by a web user? Maybe some sort of secret key?
Secured communication between two web servers (Amazon EC2 with Django and Google App Engine)
11,960,193
2
3
705
0
python,django,google-app-engine,amazon-ec2,urllib
Apart from the HTTPS call (which you should be making to transfer the info to Django), you can go with AES encryption (use PyCrypto or any other library). It takes a secret key to encrypt your message.
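A minimal PyCrypto sketch of that suggestion - encrypting the message with a shared secret before it leaves the server; key handling is deliberately simplified and the key value is a placeholder.

from Crypto import Random
from Crypto.Cipher import AES

KEY = b'0123456789abcdef'          # 16-byte shared secret (placeholder)

def encrypt(message):
    iv = Random.new().read(AES.block_size)
    return iv + AES.new(KEY, AES.MODE_CFB, iv).encrypt(message)

def decrypt(blob):
    iv, body = blob[:AES.block_size], blob[AES.block_size:]
    return AES.new(KEY, AES.MODE_CFB, iv).decrypt(body)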
0
1
0
0
2012-08-13T20:49:00.000
3
1.2
true
11,942,094
0
0
1
2
I have a website which uses Amazon EC2 with Django and Google App Engine for its powerful Image API and image serving infrastructure. When a user uploads an image the browser makes an AJAX request to my EC2 server for the Blobstore upload url. I'm fetching this through my Django server so I can check whether the user is authenticated or not and then the server needs to get the url from the App Engine server. After the upload is complete and processed in App Engine I need to send the upload info back to the django server so I can build the required model instances. How can I accomplish this? I was thinking to use urllib but how can I secure this to make sure the urls will only get accessed by my servers only and not by a web user? Maybe some sort of secret key?
Migrate django1.2 project to django1.4
11,947,415
0
1
168
0
python,django,migration
If you're using virtualenv, just create a new virtualenv with the versions of django/python that you want, workon this virtualenv and run your testsuite against it. BTW, you might want to be careful with the word 'migration' whilst in a Django context. Migration normally refers to model migrations with South when you're making changes to the tables in the database.
0
0
0
0
2012-08-14T05:17:00.000
4
0
false
11,946,144
0
0
1
3
We developed a project with Django 1.2 and Python 2.4. Now we want to migrate the project to the latest versions (Django 1.4 and Python 2.7). I am very new to migration; can anyone please advise on this? What things do I need to take care of? Do we need to rewrite all the code again?
Migrate django1.2 project to django1.4
11,955,904
2
1
168
0
python,django,migration
This is what we are doing (we're upgrading ~60Kloc from Django 0.97 to 1.4): create an upgrade branch of your code create a virtualenv for working on the upgrade download the "next" version of Django (if you prefer small steps), or the Django version you want to end up with, and place it into your own version control system (VCS). check out Django from your VCS to the root of your virtualenv. repeat until done: run your testsuite (and coverage). fix any problems add a comment in your root __init__.py file indicating which Django version your code works with (this will save you a lot of time one day :-) merge your trunk out to your upgrade branch (to get all the changes that have happened while you were working on the upgrade). run your testsuite, fix any problems, then check-in the merge. finally: reintegrate your upgrade branch back into trunk. Now you've upgraded your code (you'll still have to plan the deployment of the upgrade, but that's another question). ps: we store Django in our VCS so we can keep track of any changes we need to make to Django itself (especially needed if you don't want to go to 1.4, but still might need one or two fixes from that version).
0
0
0
0
2012-08-14T05:17:00.000
4
1.2
true
11,946,144
0
0
1
3
We developed a project with Django 1.2 and Python 2.4. Now we want to migrate the project to the latest versions (Django 1.4 and Python 2.7). I am very new to migration; can anyone please advise on this? What things do I need to take care of? Do we need to rewrite all the code again?
Migrate django1.2 project to django1.4
11,946,791
-1
1
168
0
python,django,migration
Python does not guarantee backward compatibility between versions; consider that you may run into some issues migrating from 2.4 to 2.7.
0
0
0
0
2012-08-14T05:17:00.000
4
-0.049958
false
11,946,144
0
0
1
3
We developed a project with Django 1.2 and Python 2.4. Now we want to migrate the project to the latest versions (Django 1.4 and Python 2.7). I am very new to migration; can anyone please advise on this? What things do I need to take care of? Do we need to rewrite all the code again?
schedule number of web dynos by time of day
11,950,901
1
15
1,533
0
python,django,dynamic,heroku
It's not built into the platform but should be pretty easy to implement via scheduler and using your API token.
0
0
0
0
2012-08-14T09:15:00.000
3
0.066568
false
11,949,240
0
0
1
1
Is there a way to use the Heroku scheduler to start and stop web dynos for specific periods of the day? Like say during business hours 2 dynos and at night only 1 dyno? I really would like to avoid putting the normal user/pass credentials into the app itself, so I'm looking for a secure way to do this (apart from doing it manually each day for each app). Using the "heroku ps:scale web=2" directly would naturally be nice but as far as I know this is not supported. Thanks for any feedback in advance...
How to implement Haystack across all pages in website?
11,962,907
2
0
138
0
python,django,django-templates
It's a good idea to have all pages extend a base template. So you can have one template (e.g. base.html) that contains the basic structure of your site (headers, footers, boilerplate). Then you can extend this template for each page of your site. (i.e. {% extends 'base.html' %}). Following this structure, you should be able to put your search form in the base template and have it appear on all pages.
0
0
0
0
2012-08-15T00:41:00.000
1
1.2
true
11,962,749
0
0
1
1
I'm creating a site like craigslist and need to implement a search feature where customers can search for a key term and see results. For example, searching for "lamp" will create a result page with all the posts that are related to lamps. I'm using Haystack / Solr to search the contents. However, at the moment, users have to go to a specific search page where they can then narrow their results. How do I implement it in such a way that the search bar can appear in my header on every page? I'm using Django.
Website sync of contacts and reminders with iCloud
11,973,429
1
4
1,657
0
php,python,api,sync,icloud
To the best of my knowledge, there is no way to interface with iCloud directly; it can only be done through an iOS or Mac OS app, and by calling the correct iCloud Objective-C APIs with UI/NSDocument classes. Since you are not using Cocoa, let alone Objective-C, you will most likely not be able to do this. I may be wrong of course, as I haven't conducted an in-depth search into this.
0
1
0
0
2012-08-15T13:23:00.000
4
0.049958
false
11,970,079
0
0
1
2
I'm building a custom CRM web-based system and have integrated synchronization of contacts and reminders with Google Apps, and I need to do the same with Apple iCloud. Is there any way to do it? I haven't found any official API for this purpose. The CRM is written in PHP, but I'm able to use Python for this purpose as well.
Website sync of contacts and reminders with iCloud
12,255,882
0
4
1,657
0
php,python,api,sync,icloud
I would recommend that you sync using the google contacts api. Then, you can tell iPhone people to use that instead of iCloud.
0
1
0
0
2012-08-15T13:23:00.000
4
0
false
11,970,079
0
0
1
2
I'm building a custom CRM web-based system and have integrated synchronization of contacts and reminders with Google Apps, and I need to do the same with Apple iCloud. Is there any way to do it? I haven't found any official API for this purpose. The CRM is written in PHP, but I'm able to use Python for this purpose as well.
Regularly updated data and the Search API
11,983,057
2
1
519
0
google-app-engine,python-2.7,google-cloud-datastore,gae-search
It is true that the Search API's documents can include numeric data, and can easily be updated, but as you say, if you're doing a lot of updates, it could be non-optimal to be modifying the documents so frequently. One design you might consider would store the numeric data in Datastore entities, but make heavy use of a cache as well-- either memcache or a backend in-memory cache. Cross-reference the docs and their associated entities (that is, design the entities to include a field with the associated doc id, and the docs to include a field with the associated entity key). If your application domain is such that the doc id and the datastore entity key name can be the same string, then this is even more straightforward. Then, in the cache, index the numeric field information by doc id. This would let you efficiently fetch the associated numeric information for the docs retrieved by your queries. You'd of course need to manage the cache on updates to the datastore entities. This could work well as long as the size of your cache does not need to be prohibitively large. If your doc id and associated entity key name can be the same string, then I think you may be able to leverage ndb's caching support to do much of this.
0
0
0
0
2012-08-16T02:26:00.000
1
1.2
true
11,979,898
1
0
1
1
I have an application which requires very flexible searching functionality. As part of this, users will need have the ability to do full-text searching of a number of text fields but also filter by a number of numeric fields which record data which is updated on a regular basis (at times more than once or twice a minute). This data is stored in an NDB datastore. I am currently using the Search API to create document objects and indexes to search the text-data and I am aware that I can also add numeric values to these documents for indexing. However, with the dynamic nature of these numeric fields I would be constantly updating (deleting and recreating) the documents for the search API index. Even if I allowed the search API to use the older data for a period it would still need to be updated a few times a day. To me, this doesn't seem like an efficient way to store this data for searching, particularly given the number of search queries will be considerably less than the number of updates to the data. Is there an effective way I can deal with this dynamic data that is more efficient than having to be constantly revising the search documents? My only thoughts on the idea is to implement a two-step process where the results of a full-text search are then either used in a query against the NDB datastore or manually filtered using Python. Neither seems ideal, but I'm out of ideas. Thanks in advance for any assistance.
How to separate test types using Django
11,982,939
1
1
545
0
python,django,unit-testing,testing,django-testing
The way my company organises tests is to split them into two broad categories. Unit and functional. The unit tests live inside the Django test discovery. manage.py test will run them. The functional tests live outside of that directory. They are run either manually or by the CI. Buildbot in this case. They are still run with the unittest textrunner. We also have a subcategory of functional tests called stress tests. These are tests that can't be run in parallel because they are doing rough things to the servers. Like switching off the database and seeing what happens. The CI can then run each test type as a different step. Tests can be decorated with skipif. It's not a perfect solution but it is quite clear and easy to understand.
0
0
0
1
2012-08-16T07:37:00.000
2
0.099668
false
11,982,638
0
0
1
1
I have a series of tests in Django that are categorised into various "types", such as "unit", "functional", "slow", "performance", ... Currently I'm annotating them with a decorator that is used to only run tests of a certain type (similar to @skipIf(...)), but this doesn't seem like an optimal approach. I'm wondering if there is a better way to do separation of tests into types? I'm open to using different test runners, extending the existing django testing framework, building suites or even using another test framework if that doesn't sacrifice other benefits. The underlying reason for wanting to do this is to run an efficient build pipeline, and as such my priorities are to: ensure that my continuous integration runs check the unit tests first, possibly parallelise some test runs skip some classes of test altogether
Conflict resolution in ZODB
11,996,422
1
2
565
0
python,django,zodb
IOBucket is part of the persistence structure of a BTree; it exists to try and reduce conflict errors, and it does try and resolve conflicts where possible. That said, conflicts are not always avoidable, and you should restart your transaction. In Zope, for example, the whole request is re-run up to 5 times if a ConflictError is raised. Conflicts are ZODB's way of handling the (hopefully rare) occasion where two different requests tried to change the exact same data structure. Restarting your transaction means calling transaction.begin() and applying the same changes again. The .begin() will fetch any changes made by the other process and your commit will be based on the fresh data.
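The "restart your transaction" advice typically looks like a small retry loop; here is a sketch, where do_changes() stands in for your own writes (the helper name and retry count are assumptions).

import transaction
from ZODB.POSException import ConflictError

def commit_with_retries(do_changes, attempts=5):
    for _ in range(attempts):
        try:
            transaction.begin()    # pick up the other process's committed state
            do_changes()
            transaction.commit()
            return
        except ConflictError:
            transaction.abort()    # discard our stale changes and try again
    raise RuntimeError('still conflicting after %d attempts' % attempts)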
0
0
0
1
2012-08-16T15:56:00.000
1
1.2
true
11,991,114
0
0
1
1
I do run parallel write requests on my ZODB. I do have multiple BTree instances inside my ZODB. Once the server accesses the same objects inside such a BTree, I get a ConflictError for the IOBucket class. For all my Django bases classes I do have _p_resolveconflict set up, but can't implement it for IOBucket 'cause its a C based class. I did a deeper analysis, but still don't understand why it complains about the IOBucket class and what it writes into it. Additionally, what would be the right strategy to resolve it? Thousand thanks for any help!
How to automatically capitalize field on form submission in Django?
12,001,435
1
7
11,826
0
python,django,sorting,django-models,django-forms
If you want to ensure your data is consistent, I'm not sure that capitalizing at the form / view level is the best way to go. What happens when you add a Product through the admin where you're not using that form / save method? If you forget the capital, you're in for data inconsistency. You could instead use your model's save method, or even use the pre_save signal that django sends. This way, data is always treated the same, regardless of where it came from.
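For instance, a save() override (or the equivalent pre_save handler) keeps the rule in one place; the model below is a cut-down guess at the ProductForm's model, not the asker's actual code.

from django.db import models

class Product(models.Model):
    title = models.CharField(max_length=200)

    def save(self, *args, **kwargs):
        # normalise the title no matter where the object was created
        if self.title:
            self.title = self.title[0].upper() + self.title[1:]
        super(Product, self).save(*args, **kwargs)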
0
0
0
0
2012-08-16T23:40:00.000
6
0.033321
false
11,996,963
0
0
1
2
I have a ProductForm where users can add a Product to the database with information like title, price, and condition. How do I make it so that when the user submits the form, the first letter of the title field is automatically capitalized? For example, if a user types "excellent mattress" in the form, django saves it as "Excellent mattress" to the database. Just for reference, the reason I ask is because when I display all the product objects on a page, Django's sort feature by title is case-sensitive. As such, "Bravo", "awful", "Amazing" would be sorted as "Amazing", "Bravo", "awful" when as users, we know that is not alphabetical. Thanks for the help!
How to automatically capitalize field on form submission in Django?
57,403,421
0
7
11,826
0
python,django,sorting,django-models,django-forms
If you need the first letter of every word capitalized, use val.title() in Jeremy Lewis' answer. If you use val.capitalize(), "hello world" becomes "Hello world"; with title() you get "Hello World".
0
0
0
0
2012-08-16T23:40:00.000
6
0
false
11,996,963
0
0
1
2
I have a ProductForm where users can add a Product to the database with information like title, price, and condition. How do I make it so that when the user submits the form, the first letter of the title field is automatically capitalized? For example, if a user types "excellent mattress" in the form, django saves it as "Excellent mattress" to the database. Just for reference, the reason I ask is because when I display all the product objects on a page, Django's sort feature by title is case-sensitive. As such, "Bravo", "awful", "Amazing" would be sorted as "Amazing", "Bravo", "awful" when as users, we know that is not alphabetical. Thanks for the help!
Script to open web pages, fill texts and click buttons
28,718,663
-2
1
4,599
0
python,facebook,text,login,fill
You can also take a look at IEC, which uses the Windows API to run an instance of Internet Explorer and send commands to it. It may not be good for large-scale automation, but it is very easy to use.
0
0
1
0
2012-08-18T22:05:00.000
3
-0.132549
false
12,022,570
0
0
1
1
I'd like to write a script, preferably in Python, to fill text areas in web pages and then click certain buttons. I've come across some solutions for this but none worked, mainly because cookies were not stored properly. For example, there was a Python script to log in to Facebook, which did seem to get it right in the shell, but when I opened Facebook in the browser it was logged out as if nothing had happened. Also, that code was hard-coded for Facebook and I'm asking for something more general. So please, if anyone has been successful with this kind of thing, your advice is much needed. Open a web page, fill text into specified text elements, click a specified button, save cookies - that's all. Many thanks.
Django multiple photo download
12,027,094
1
0
282
0
python,django
I think the only way to do it is in the backend, because on the frontend you would only select which photos you want to download and send the ids (or some identifiers) to the server side; the server then retrieves those selected photos from the filesystem (based on the identifiers), compresses them into a single file and returns that compressed file in a response as attached content. If you did it on the frontend, how would you get each file and compress them all? Doing it on the server side is the best solution in my opinion :)
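A rough sketch of that backend flow - the ids arrive in the POST, the files are zipped in memory and returned as an attachment; the Photo model and its image field are assumptions standing in for the asker's own gallery models.

import os
import zipfile
from io import BytesIO
from django.http import HttpResponse

def download_photos(request):
    ids = request.POST.getlist('photo_ids')
    buf = BytesIO()
    archive = zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED)
    for photo in Photo.objects.filter(pk__in=ids):    # Photo is your model
        path = photo.image.path                       # ImageField on the model
        archive.write(path, arcname=os.path.basename(path))
    archive.close()
    response = HttpResponse(buf.getvalue(), content_type='application/zip')
    response['Content-Disposition'] = 'attachment; filename="photos.zip"'
    return response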
0
0
0
0
2012-08-19T13:56:00.000
2
1.2
true
12,027,033
0
0
1
1
I developed a photo gallery in Python, and now I want to add a new feature, "Download Multiple Photos": a user can select some photos to download and the system creates a compressed file with those photos. In your opinion: on the frontend, what is the best way to send the ids - JSON? hidden inputs? And on the backend, is there a Django library that compresses the selected photos and returns the compressed file? Thanks, Marco
Running algorithms in compiled C/C++ code within a Java/PHP/Python framework?
12,033,703
0
2
419
0
java,php,c++,python,c
For Java, you can search JNI (Java Native Interface), there're a lot of guides telling how to use it.
0
0
0
1
2012-08-19T18:27:00.000
4
0
false
12,028,908
0
0
1
1
Occasionally, I have come across programming techniques that involve creating application frameworks or websites in Java, PHP or Python, but when complex algorithms are needed, writing those out in C or C++ and running them as API-like function calls within your Java/PHP/Python code. I have been googling and searching around the net for this, and unless I don't know the name of the practice, I can't seem to find anything on it. To put simply, how can I: Create functions or classes in C or C++ Compile them into a DLL/binary/some form Run the functions from - Java PHP Python I suspect JSON/XML like output and input must be created between the Java/PHP/Python and the C/C++ function so the data can be easily bridged, but that is okay. I'm just not sure how to approach this technique, but it seems like a very smart way to take advantage of the great features of Java, PHP, and Python while at the same time utilizing the very fast programming languages for large, complex tasks. The other thought going through my head is if I am creating functions using only literals in Java/PHP/Python, will it go nearly as fast as C anyway? The specific tasks I'm looking to work with C/C++ on is massive loops, pinging a database, and analyzing maps. No work has started yet, its all theory now.
From MongoDB to PostgreSQL - Django
15,858,338
1
2
1,475
1
python,django,mongodb,database-migration,django-postgresql
Whether the migration is easy or hard depends on a very large number of things including how many different versions of data structures you have to accommodate. In general you will find it a lot easier if you approach this in stages: Ensure that all the Mongo data is consistent in structure with your RDBMS model and that the data structure versions are all the same. Move your data. Expect that problems will be found and you will have to go back to step 1. The primary problems you can expect are data validation problems because you are moving from a less structured data platform to a more structured one. Depending on what you are doing regarding MapReduce you may have some work there as well.
0
0
0
0
2012-08-20T08:25:00.000
2
0.099668
false
12,034,390
0
0
1
1
Could any one shed some light on how to migrate my MongoDB to PostgreSQL? What tools do I need, what about handling primary keys and foreign key relationships, etc? I had MongoDB set up with Django, but would like to convert it back to PostgreSQL.
To json or not to json
12,039,958
1
1
151
0
python,django,json
It depends entirely on what you're trying to do. render_to_response passes some data to a template and renders an HTML document, while returning a JSON object hands the caller raw data. If you want to present a usable page to a human, use render_to_response. If you're simply passing some data to a jQuery element, then returning simplejson.dumps() is perfectly valid. There are other ways to return JSON, but that's by far the easiest and most robust. To explain more, it would help if you elaborated on exactly what the infinite scroll view is.
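To make the contrast concrete, here is a sketch of the two kinds of view side by side; the model, template and URL parameter names are invented, and the simplejson import matches the Django versions of that era.

from django.http import HttpResponse
from django.shortcuts import render_to_response
from django.utils import simplejson

def scroll_page(request):
    # human-facing page: hand records to a template
    items = Item.objects.all()[:20]                  # Item is your model
    return render_to_response('items.html', {'items': items})

def autocomplete(request):
    # machine-facing endpoint: hand raw matches to the jQuery widget
    term = request.GET.get('term', '')
    names = list(Item.objects.filter(name__icontains=term)
                             .values_list('name', flat=True)[:10])
    return HttpResponse(simplejson.dumps(names), mimetype='application/json')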
0
0
0
0
2012-08-20T14:36:00.000
1
1.2
true
12,039,689
0
0
1
1
New to Django and Python. I am using MySQL as a backend. I have two views: an infinite scroll call that calls all the records in tableA and an autocomplete field that queries tableB and returns matching records from a column. My infinite scroll and autocomplete were created using help from various separate tutorials around the web. In my infinite scroll, I am currently returning a render_to_response object (I based it off the Django beginner's tutorial). My autocomplete returns simplejson (I based it off some articles I googled). They both are returning records from a DB, so shouldn't the responses be similar? When should I use json (or simplejson, in my case) and when shouldn't I? Thx!
Call python script from html page
12,043,550
1
0
1,126
0
html,ajax,django,python-3.x,cherrypy
No, you don't need a web framework, but in general it's a good idea. Django seems like brutal overkill for this. CherryPy or Pyramid or some micro framework seems better. You can have an HTML page that calls the CherryPy server, but since this page obviously is a part of the system/service you are building, serving it from the server makes more sense. Sure, why not.
0
0
1
0
2012-08-20T18:49:00.000
1
0.197375
false
12,043,333
0
0
1
1
I have to make an HTML page with CSS and JavaScript where I enter a URL in a form. With this URL, I have to get some information from the page's HTML using a Python 3.2 script. I started learning Python a few days ago and I have some questions: Do I need CherryPy/Django to do that? (I'm asking because I executed a script to get the entire HTML without using CherryPy/Django and it works - no interaction with the browser.) The CherryPy examples have the HTML built into the Python code. Must I write the HTML in the Python script, or can I have an HTML page that calls the script with AJAX (or anything else)? If I can use AJAX, is XmlHttpRequest a good choice? Thank you! :D
python - web2py - can't seem to find lxml - ActivePython - windows7
12,047,285
1
0
277
0
python,lxml,web2py,activepython
If you are using the Windows binary version of web2py, it comes with its own Python 2.5 interpreter and is self-contained, so it won't use your system's Python 2.7 nor see any of its modules. Instead, you should switch to running web2py from source. It's just as easy as the binary version -- just download the zip file and unzip it. You can then import lxml without moving anything to the application's /modules folder.
0
0
1
0
2012-08-20T23:41:00.000
1
1.2
true
12,046,683
0
0
1
1
I have been using ActivePython on Windows 7 and lxml seems to be working without an issue. There were a lot of other third-party packages I had, and they were working too - until I wanted to use them inside web2py. All the others seem to work if I copy them directly into c:/web2py/applications/myApp/modules. With lxml, it seems I need to copy something else. I have a third-party module which imports lxml like this: from lxml.etree import tostring. It ends up throwing "No module named lxml.etree". My test program outside web2py runs without an issue with both these modules. When I do a pypm files lxml I see this: %APPDATA%\Python\Python27\site-packages\lxml-2.3-py2.7.egg-info. What else should I copy along with the lxml directory into the modules directory? Pretty sure it's me doing something wrong rather than web2py, but I can't put a finger on it. web2py version = Version 1.99.7 (2012-03-04 22:12:08) stable