Dataset columns (type and min/max value, or min/max string length):

Column                              Type           Min        Max
Available Count                     int64          1          31
AnswerCount                         int64          1          35
GUI and Desktop Applications        int64          0          1
Users Score                         int64          -17        588
Q_Score                             int64          0          6.79k
Python Basics and Environment       int64          0          1
Score                               float64        -1         1.2
Networking and APIs                 int64          0          1
Question                            stringlengths  15         7.24k
Database and SQL                    int64          0          1
Tags                                stringlengths  6          76
CreationDate                        stringlengths  23         23
System Administration and DevOps    int64          0          1
Q_Id                                int64          469        38.2M
Answer                              stringlengths  15         7k
Data Science and Machine Learning   int64          0          1
ViewCount                           int64          13         1.88M
is_accepted                         bool           2 classes
Web Development                     int64          0          1
Other                               int64          1          1
Title                               stringlengths  15         142
A_Id                                int64          518        72.2M
1
3
0
11
6
0
1
0
I want to send some modem AT commands using Python code, and am wondering what the keycode is for the key combination control+z. Gath
0
python
2009-09-16T07:30:00.000
0
1,431,495
Key code? If you send AT commands you are probably sending strings with ascii text and control codes, right? Ctrl-Z is usually 26 (decimal). So chr(26) should work, or if it's a part of a string, '\x1a' as 26 decimal is 1A hex. That said, Ctrl-Z is not usually a part of the AT command set... so if this doesn't help you, maybe you could explain more what you are trying to do and why you would need to send Ctrl-Z.
0
13,499
false
0
1
What is the keycode for control+z key in python?
1,431,524
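A minimal sketch of the chr(26) / '\x1a' approach described in the answer above, assuming the third-party pyserial package; the port name, baud rate, and phone number are placeholders, and the AT+CMGS (send SMS) flow is the common case where a Ctrl-Z terminator is expected.

    import serial  # third-party pyserial package (assumed installed)

    CTRL_Z = b"\x1a"  # ASCII 26, i.e. chr(26) as a single byte

    # Port name, baud rate, and phone number are placeholders for illustration.
    ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
    ser.write(b'AT+CMGS="+15551234567"\r')    # example command that expects a Ctrl-Z terminator
    ser.write(b"Hello from Python" + CTRL_Z)  # message body, ended with Ctrl-Z
    print(ser.read(64))                       # read back the modem's response
    ser.close()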
1
3
0
2
1
0
0.132549
0
I have a cherrypy web server that uses large amounts of HTML data. Is there any way in Python to minimize the HTML so that all comments, spaces, etc., are removed?
0
python,html,minimize
2009-09-17T07:59:00.000
0
1,437,357
There are bindings to tidy for Python, called mxTidy, from eGenix (Marc André Lemburg).
0
1,549
false
1
1
Python HTML Minimizer
1,531,684
2
2
1
0
7
0
0
0
I am embedding a C++ library (binding done with SIP) in my Python application. Under certain circumstances (error cases), this library uses exit(), which causes my entire application to exit. Is there a way to catch this event, or do I need to modify the library to handle error cases differently? Thank you very much,
0
python,exception,binding,exit
2009-09-17T15:19:00.000
0
1,439,533
You can override the library linking with LD_LIBRARY_PATH and make your own exit function. Works fine.
0
1,407
false
0
1
How to catch exit() in embedded C++ module from python code?
8,099,938
2
2
1
7
7
0
1.2
0
I am embedding a C++ library (binding done with SIP) in my Python application. Under certain circumstances (error cases), this library uses exit(), which causes my entire application to exit. Is there a way to catch this event, or do I need to modify the library to handle error cases differently? Thank you very much,
0
python,exception,binding,exit
2009-09-17T15:19:00.000
0
1,439,533
You must modify the source of the library. There is no "exception handling" in C and exit() does not return to the calling code under any circumstances.
0
1,407
true
0
1
How to catch exit() in embedded C++ module from python code?
1,439,585
3
5
0
1
7
1
0.039979
0
I write a lot of scripts in Python to analyze and plot experimental data as well as write simple simulations to test how theories fit the data. The scripts tend to be very procedural; calculate some property, calculate some other property, plot properties, analyze plot... Rather than just writing a procedure, would there be any benefit to using a Class? I can bury the actual analysis in functions so I can pass the data to the function and let it do its thing, but the functions are not contained in a Class. What sort of drawbacks would a Class overcome and what would be the purpose of using a Class if it can be written procedurally? If this has been posted before, my apologies, just point me in that direction.
0
python,oop,class-design,procedural-programming
2009-09-17T18:07:00.000
0
1,440,434
OOP lends itself well to complex programs. It's great for capturing the state and behavior of real world concepts and orchestrating the interplay between them. Good OO code is easy to read/understand, protects your data's integrity, and maximizes code reuse. I'd say code reuse is one big advantage to keeping your frequently used calculations in a class.
0
4,545
false
0
1
Class usage in Python
1,440,734
3
5
0
1
7
1
0.039979
0
I write a lot of scripts in Python to analyze and plot experimental data as well as write simple simulations to test how theories fit the data. The scripts tend to be very procedural; calculate some property, calculate some other property, plot properties, analyze plot... Rather than just writing a procedure, would there be any benefit to using a Class? I can bury the actual analysis in functions so I can pass the data to the function and let it do its thing, but the functions are not contained in a Class. What sort of drawbacks would a Class overcome and what would be the purpose of using a Class if it can be written procedurally? If this has been posted before, my apologies, just point me in that direction.
0
python,oop,class-design,procedural-programming
2009-09-17T18:07:00.000
0
1,440,434
Object-oriented programming isn't the solution to every coding problem. In Python, functions are objects. You can mix as many objects and functions as you want. Modules with functions are already objects with properties. If you find yourself passing a lot of the same variables around — state — an object is probably better suited. If you have a lot of classes with class methods, or methods that don't use self very much, then functions are probably better.
0
4,545
false
0
1
Class usage in Python
1,447,527
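To make the "shared state belongs on an object" point from the answer above concrete, here is a small illustrative sketch; the class name and analysis methods are invented placeholders, not anything from the question.

    class Experiment:
        """Bundle a dataset with the routines that operate on it,
        instead of passing the same data to every free function."""

        def __init__(self, data):
            self.data = list(data)

        def mean(self):
            return sum(self.data) / len(self.data)

        def centered(self):
            m = self.mean()
            return [x - m for x in self.data]

    exp = Experiment([1.0, 2.0, 4.0])
    print(exp.mean())      # 2.333...
    print(exp.centered())  # the data with its mean removed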
3
5
0
4
7
1
0.158649
0
I write a lot of scripts in Python to analyze and plot experimental data as well as write simple simulations to test how theories fit the data. The scripts tend to be very procedural; calculate some property, calculate some other property, plot properties, analyze plot... Rather than just writing a procedure, would there be any benefit to using a Class? I can bury the actual analysis in functions so I can pass the data to the function and let it do its thing, but the functions are not contained in a Class. What sort of drawbacks would a Class overcome and what would be the purpose of using a Class if it can be written procedurally? If this has been posted before, my apologies, just point me in that direction.
0
python,oop,class-design,procedural-programming
2009-09-17T18:07:00.000
0
1,440,434
You don't need to use classes in Python - it doesn't force you to do OOP. If you're more comfortable with the functional style, that's fine. I use classes when I want to model some abstraction which has variations, and I want to model those variations using classes. As the word "class" implies, they're useful mainly when the stuff you are working with falls naturally into various classes. When just manipulating large datasets, I've not found an overarching need to follow an OOP paradigm just for the sake of it.
0
4,545
false
0
1
Class usage in Python
1,440,655
3
7
0
0
8
0
0
0
I'd like to know if it's possible to find out the "command" that a PID is set to. When I say command, I mean what you see in the last column when you run the command "top" in a linux shell. I'd like to get this information from Python somehow when I have a specific PID. Any help would be great. Thanks.
0
python,linux,process
2009-09-17T19:44:00.000
1
1,440,941
The proc filesystem exports this (and other) information. Look at /proc/PID/cmdline (and the /proc/PID/exe symlink).
0
9,565
false
0
1
Finding the command for a specific PID in Linux from Python
1,440,965
3
7
0
5
8
0
0.141893
0
I'd like to know if it's possible to find out the "command" that a PID is set to. When I say command, I mean what you see in the last column when you run the command "top" in a linux shell. I'd like to get this information from Python somehow when I have a specific PID. Any help would be great. Thanks.
0
python,linux,process
2009-09-17T19:44:00.000
1
1,440,941
Look in /proc/$PID/cmdline, and then os.readlink() on /proc/$PID/exe.

/proc/$PID/cmdline is not necessarily going to be correct, as a program can change its argument vector or it may not contain a full path. Three examples of this from my current process list are:

    avahi-daemon: chroot helper
    qmgr -l -t fifo -u
    /usr/sbin/postgrey --pidfile=/var/run/postgrey.pid --daemonize --inet=127.0.0.1:60000 --delay=55

The first one is obvious - it's not a valid path or program name. The second is just an executable with no path name. The third looks OK, but that whole command line is actually in argv[0], with spaces separating the arguments; normally you should have NUL-separated arguments. All this goes to show that /proc/$PID/cmdline (or the ps(1) output) is not reliable.

However, nor is /proc/$PID/exe. Usually it is a symlink to the executable that is the main text segment of the process, but sometimes it has " (deleted)" after it if the executable is no longer in the filesystem. Also, the program that is the text segment is not always what you want. For instance, /proc/$PID/exe for that /usr/sbin/postgrey example above is /usr/bin/perl; this will be the case for all interpreted scripts (#!).

I settled on parsing /proc/$PID/cmdline: take the first element of the vector, look for spaces in it, and take everything before the first space. If that was an executable file, I stopped there. Otherwise I did a readlink(2) on /proc/$PID/exe and removed any " (deleted)" string on the end. The first part will fail if the executable filename actually has spaces in it; there's not much you can do about that.

BTW, the argument to use ps(1) instead of /proc/$PID/cmdline does not apply in this case, since you are going to fall back to /proc/$PID/exe anyway. You will be dependent on the /proc filesystem, so you may as well read it with read(2) instead of pipe(2), fork(2), execve(2), readdir(3)..., write(2), read(2). While ps and /proc/$PID/cmdline may look the same from the point of view of lines of Python code, there's a whole lot more going on behind the scenes with ps.
0
9,565
false
0
1
Finding the command for a specific PID in Linux from Python
1,443,544
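A rough Python sketch of the fallback strategy described in the answer above (Linux-only and best-effort, with exactly the caveats hedged there):

    import os

    def command_for_pid(pid):
        """Try /proc/<pid>/cmdline first, then fall back to /proc/<pid>/exe."""
        try:
            with open("/proc/%d/cmdline" % pid, "rb") as f:
                argv = f.read().split(b"\0")
            first = argv[0].decode(errors="replace").split(" ")[0]
            if first and os.path.isfile(first):
                return first
        except OSError:
            pass
        try:
            # Strip the " (deleted)" suffix the kernel appends for unlinked binaries.
            return os.readlink("/proc/%d/exe" % pid).replace(" (deleted)", "")
        except OSError:
            return None

    print(command_for_pid(os.getpid()))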
3
7
0
6
8
0
1
0
I'd like to know if it's possible to find out the "command" that a PID is set to. When I say command, I mean what you see in the last column when you run the command "top" in a linux shell. I'd like to get this information from Python somehow when I have a specific PID. Any help would be great. Thanks.
0
python,linux,process
2009-09-17T19:44:00.000
1
1,440,941
Look in /proc/$PID/cmdline
0
9,565
false
0
1
Finding the command for a specific PID in Linux from Python
1,440,969
2
5
0
0
1
0
0
1
I want to simulate MyApp, which imports a module (ResourceX) that requires a resource that is not available at the time and will not work. A solution for this is to make and import a mock module of ResourceX (named ResourceXSimulated) and divert it to MyApp as ResourceX. I want to do this in order to avoid breaking a lot of code and getting all kinds of exceptions from MyApp. I am using Python and it should be something like: "import ResourceXSimulated as ResourceX", so that "ResourceX.getData()" actually calls ResourceXSimulated.getData(). Looking forward to finding out if Python supports this kind of redirection. Cheers. ADDITIONAL INFO: I have access to the source files. UPDATE: I am thinking of adding as little code as possible to MyApp regarding using the fake module, and adding this code near the import statements.
0
python,testing,mocking,module,monkeypatching
2009-09-18T08:12:00.000
0
1,443,173
Yes, Python can do that, and so long as the methods exposed in the ResourceXSimulated module "look and smell" like those of the original module, the application should not see much of any difference (other than, I'm assuming, bogus data fillers, different response times and such).
0
256
false
0
1
Is it possible to divert a module in python? (ResourceX diverted to ResourceXSimulated)
1,443,195
2
5
0
1
1
0
0.039979
1
I want to simulate MyApp, which imports a module (ResourceX) that requires a resource that is not available at the time and will not work. A solution for this is to make and import a mock module of ResourceX (named ResourceXSimulated) and divert it to MyApp as ResourceX. I want to do this in order to avoid breaking a lot of code and getting all kinds of exceptions from MyApp. I am using Python and it should be something like: "import ResourceXSimulated as ResourceX", so that "ResourceX.getData()" actually calls ResourceXSimulated.getData(). Looking forward to finding out if Python supports this kind of redirection. Cheers. ADDITIONAL INFO: I have access to the source files. UPDATE: I am thinking of adding as little code as possible to MyApp regarding using the fake module, and adding this code near the import statements.
0
python,testing,mocking,module,monkeypatching
2009-09-18T08:12:00.000
0
1,443,173
Yes, it's possible. Some starters:

- You can "divert" modules by manipulating sys.modules. It keeps a record of imported modules, and there you can make your module appear under the same name as the original one. You must do this manipulation before any module imports the module you want to fake, though.
- You can also make a package with a different name, but in that package actually use the original module name for your completely different module. This works well as long as the original module isn't installed.

In none of these cases can you use both modules at the same time; for that you need to monkey-patch the original module. And of course, it's perfectly possible to just call the new module by the old name, but it might be confusing.
0
256
false
0
1
Is it possible to divert a module in python? (ResourceX diverted to ResourceXSimulated)
1,443,281
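A minimal sketch of the sys.modules approach from the answer above; the module name and getData() come from the question, and the stub is built inline here purely for illustration (normally ResourceXSimulated would live in its own file and be imported instead).

    import sys
    import types

    # Build the stand-in module (or: import ResourceXSimulated and use that object).
    fake = types.ModuleType("ResourceX")
    fake.getData = lambda: "simulated data"

    # Register it under the original name *before* anything imports ResourceX.
    sys.modules["ResourceX"] = fake

    import ResourceX  # resolves to the stub registered above, no file needed
    print(ResourceX.getData())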
1
4
0
8
17
0
1
0
I need to identify which files are binary and which are text in a directory. I tried using mimetypes, but it isn't a good idea in my case because it can't identify all files' MIME types, and I have strange ones here... I just need to know: binary or text. Simple? But I couldn't find a solution... Thanks
0
python,text,binary,file-type
2009-09-18T19:58:00.000
0
1,446,549
It's inherently not simple. There's no way of knowing for sure, although you can take a reasonably good guess in most cases. Things you might like to do:

- Look for known magic numbers in binary signatures
- Look for the Unicode byte-order-mark at the start of the file
- If the file is regularly 00 xx 00 xx 00 xx (for arbitrary xx) or vice versa, that's quite possibly UTF-16
- Otherwise, look for 0s in the file; a file with a 0 in it is unlikely to be a single-byte-encoding text file

But it's all heuristic - it's quite possible to have a file which is a valid text file and a valid image file, for example. It would probably be nonsense as a text file, but legitimate in some encoding or other...
0
20,864
false
0
1
How to identify binary and text files using Python?
1,446,580
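A small sketch of the NUL-byte heuristic from the answer above; as stressed there, this is only a guess, and the paths used are just examples.

    def looks_binary(path, blocksize=4096):
        """Heuristic: a NUL byte in the first block usually means 'not text'."""
        with open(path, "rb") as f:
            chunk = f.read(blocksize)
        if chunk.startswith(b"\xef\xbb\xbf"):  # UTF-8 byte-order mark -> text
            return False
        # Note: UTF-16 text contains NULs and would be misclassified by this simple check.
        return b"\x00" in chunk

    print(looks_binary("/bin/ls"))        # typically True on Linux
    print(looks_binary("/etc/hostname"))  # typically False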
1
3
0
0
2
0
0
0
I'm building a mobile photo sharing site in Python similar to TwitPic and have been exploring various queues to handle the image processing. I've looked into RabbitMQ and ActiveMQ but I'm thinking that there is a better solution for my use case. I'm looking for something a little more lightweight. I'm open to any suggestions.
0
python,queue
2009-09-20T01:33:00.000
0
1,450,038
Are you considering single machine architecture, or a cluster of machines? Forwarding the image to an available worker process on the same machine or a different machine isn't profoundly different, particularly if you use TCP sockets. Knowing what workers are available, spawning more if necessary and the resources are available, having a fail-safe mechanism if a worker crashes, etc, gradually make the problem more complicated. It could be something as simple as using httplib to push the image to a private server running Apache or twisted and a collection of cgi applications. When you add another server, round robin the request amongst them.
0
290
false
0
1
Which queue is most appropriate?
1,450,105
5
9
0
1
25
1
0.022219
0
I wanted to know what kind of applications Python is suited for. I am new to the Python world, but I know it's a scripting language like Perl. I was not sure about the kind of applications one would build using Python and would certainly appreciate it if someone could provide some useful information.
0
python
2009-09-21T01:24:00.000
0
1,452,509
Well, the short answer is, since you mentioned Perl, anything you could build in Perl you could build in Python. You can build anything in any language, and if the language has easy C bindings, you could even do it efficiently. Now, this being the case, the question becomes somewhat philosophical. Python has as a key tenet "There should only be one way to do it". Perl is exactly the opposite. The key tenet of Perl is "There Is More Than One Way To Do It" (TIMTOWTDI, or Tim Toady to his friends ;) ). How do you like to do things? One clear and shining path, agreed upon by most? Or perhaps you value the almost infinite number of solution paths that any task has in Perl? So, assuming that your task is I/O bound (like most things) rather than CPU bound (real-time programming or games, or nipple-crinkling number crunching), then Python would be suitable. Whether its philosophy suits you is the key question.
0
80,717
false
0
1
What kind of applications are built using Python?
1,452,566
5
9
0
1
25
1
0.022219
0
I wanted to know what kind of applications Python is suited for. I am new to the Python world, but I know it's a scripting language like Perl. I was not sure about the kind of applications one would build using Python and would certainly appreciate it if someone could provide some useful information.
0
python
2009-09-21T01:24:00.000
0
1,452,509
Most of the 3d packages these days, such as Maya, SoftImage, Houdini, RealFlow, Blender, etc. all use Python as an embedded scripting and plugin language.
0
80,717
false
0
1
What kind of applications are built using Python?
1,452,576
5
9
0
10
25
1
1
0
I wanted to know what kind of applications Python is suited for. I am new to the Python world, but I know it's a scripting language like Perl. I was not sure about the kind of applications one would build using Python and would certainly appreciate it if someone could provide some useful information.
0
python
2009-09-21T01:24:00.000
0
1,452,509
You say: "I am new to the Python world but I know it's a scripting language." I think the distinction between "scripting languages" and "programming languages" is quite arbitrary. Nearly every language developed in the last 10-20 years has some kind of runtime support, usually in the form of a bytecode interpreter or virtual machine. Python is no different: it gets compiled to bytecode and the bytecode is executed by the Python runtime. The point is, I would say there are very few things you can do in Java, C#, Ruby, etc., that you couldn't do in Python. That said, however, different languages have different strengths. So there are certainly some kinds of programs that would be better suited to being written in Python. It really depends on what you want the programming language to do for you, and what you want to do yourself. The right answer depends on what kinds of problems you're interested in solving.
0
80,717
false
0
1
What kind of applications are built using Python?
1,452,546
5
9
0
33
25
1
1.2
0
I wanted to know what kind of applications Python is suited for. I am new to the Python world, but I know it's a scripting language like Perl. I was not sure about the kind of applications one would build using Python and would certainly appreciate it if someone could provide some useful information.
0
python
2009-09-21T01:24:00.000
0
1,452,509
It's hard to think of kinds of general applications where Python would be unsuitable, but there are several kinds where, like just about all higher-level languages akin to it, it might be considered a peculiar and probably inferior choice.

In "hard real time" applications, all dynamic memory allocation and freeing, and especially garbage collection, are quite understandably frowned upon; this rules out almost all modern languages (including Python, but also Java, C#, etc, etc), since almost all of them rely on dynamic memory handling and garbage collection of some kind or other.

If you're programming for an "embedded device" which you expect to be produced and sold in huge numbers, every bit of ROM may add measurably to the overall costs, so you want a language focused on squeezing the application down to the last possible bit -- any language that relies on a rich supporting runtime environment or operating system (including Python, and, again, also Java, C#, etc, etc) would no doubt force you to spend extra on many more bits of ROM (consider threaded-interpretive languages like good old Forth: they can make a substantial application's code be measurably more compact than straightforward machine code would!).

There may be other niches that share similar constraints (mostly focused on MEMORY: focus on using as few bits as possible and/or strictly confining execution within precisely predefined limits -- no dynamism, no allocation, no garbage collection, etc, etc), and basically the case would once again incline in similar ways (for example, there are server applications, intended to run on myriads of servers, which can save many megabytes per server if coded in C++ [especially if without "allegedly-smart" pointers;-)] rather than Java, Python, C#, and so on).

Of course there are excellent reasons most modern languages (Python, Java, C#, etc) choose to do dynamic memory allocation, garbage collection, and so forth, despite the importance of application niches where those techniques are a negative aspect: essentially, if you can possibly afford such nice memory handling, writing applications becomes MUCH, MUCH easier, and a whole class of problems and bugs connected with the need to carefully manage memory if you lack such support can go away -- programmer productivity really soars... IF garbage collection and the like can be afforded at all, that is. For example, if an application was going to run on a few hundreds or thousands of servers, I probably wouldn't bother coding it in C++ with manual memory management in order to save memory; it's only at tens and hundreds of thousands of servers that the economics of all those extra megabytes really kicks in.

Note that, despite the common misconception that "interpreted languages" (ones with a rich underlying runtime or VM, like Java, C#, Python, etc) "are slow", in fact for most CPU-intensive applications (such as scientific computation), Python is perfectly suitable, as long as the "rich supporting runtime environment" (e.g. numpy) is factored in. So, that is not really a factor -- though memory consumption and garbage collection CAN be, in some niches.
0
80,717
true
0
1
What kind of applications are built using Python?
1,452,532
5
9
0
0
25
1
0
0
I wanted to know what kind of applications Python is suited for. I am new to the Python world, but I know it's a scripting language like Perl. I was not sure about the kind of applications one would build using Python and would certainly appreciate it if someone could provide some useful information.
0
python
2009-09-21T01:24:00.000
0
1,452,509
Bittorrent was built on Python.
0
80,717
false
0
1
What kind of applications are built using Python?
1,452,528
3
11
0
-3
177
1
-0.054491
0
What are people's experiences with any of the Git modules for Python? (I know of GitPython, PyGit, and Dulwich - feel free to mention others if you know of them.) I am writing a program which will have to interact (add, delete, commit) with a Git repository, but have no experience with Git, so one of the things I'm looking for is ease of use/understanding with regards to Git. The other things I'm primarily interested in are maturity and completeness of the library, a reasonable lack of bugs, continued development, and helpfulness of the documentation and developers. If you think of something else I might want/need to know, please feel free to mention it.
0
python,git
2009-09-21T19:10:00.000
0
1,456,269
For the record, none of the aforementioned Git Python libraries seem to contain a "git status" equivalent, which is really the only thing I would want since dealing with the rest of the git commands via subprocess is so easy.
0
120,684
false
0
1
Python Git Module experiences?
2,180,936
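Since the answer above leans on driving git through subprocess, here is a hedged sketch of a "git status" equivalent done that way; it assumes the git command-line tool is on PATH and Python 3.7+ for the text=True flag.

    import subprocess

    def git_status(repo_path="."):
        """Return (state, path) pairs from 'git status --porcelain'."""
        out = subprocess.check_output(
            ["git", "status", "--porcelain"], cwd=repo_path, text=True
        )
        return [(line[:2], line[3:]) for line in out.splitlines()]

    print(git_status())  # run from inside some Git working copy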
3
11
0
0
177
1
0
0
What are people's experiences with any of the Git modules for Python? (I know of GitPython, PyGit, and Dulwich - feel free to mention others if you know of them.) I am writing a program which will have to interact (add, delete, commit) with a Git repository, but have no experience with Git, so one of the things I'm looking for is ease of use/understanding with regards to Git. The other things I'm primarily interested in are maturity and completeness of the library, a reasonable lack of bugs, continued development, and helpfulness of the documentation and developers. If you think of something else I might want/need to know, please feel free to mention it.
0
python,git
2009-09-21T19:10:00.000
0
1,456,269
The git interaction library part of StGit is actually pretty good. However, it isn't broken out as a separate package but if there is sufficient interest, I'm sure that can be fixed. It has very nice abstractions for representing commits, trees etc, and for creating new commits and trees.
0
120,684
false
0
1
Python Git Module experiences?
1,478,107
3
11
0
18
177
1
1
0
What are people's experiences with any of the Git modules for Python? (I know of GitPython, PyGit, and Dulwich - feel free to mention others if you know of them.) I am writing a program which will have to interact (add, delete, commit) with a Git repository, but have no experience with Git, so one of the things I'm looking for is ease of use/understanding with regards to Git. The other things I'm primarily interested in are maturity and completeness of the library, a reasonable lack of bugs, continued development, and helpfulness of the documentation and developers. If you think of something else I might want/need to know, please feel free to mention it.
0
python,git
2009-09-21T19:10:00.000
0
1,456,269
Maybe it helps, but Bazaar and Mercurial are both using dulwich for their Git interoperability. Dulwich is probably different from the others in the sense that it's a reimplementation of git in Python. The others might just be wrappers around Git's commands (so they could be simpler to use from a high-level point of view: commit/add/delete), which probably means their API is very close to git's command line, so you'll need to gain experience with Git.
0
120,684
false
0
1
Python Git Module experiences?
1,458,963
1
2
0
0
1
0
0
0
Microsoft Word has "send as attachment" functionality which creates a new message in Outlook with the document attached. I would like to replace Outlook with a custom mail agent, but I do not know how to achieve this. Now my mail agent is simply a program that runs, and takes a file name as parameter. As far as I know, "send as attachment" is using some DLL/API called MAPI. I would need to change my app so that it does not simply accept file name arguments, but can receive MAPI(?) calls MS Word uses when "sending as attachment". Further, I need to change the default mail agent by creating my own MAPI32.dll stub which simply redirects to my app. I'd appreciate if anyone had more info on how this could be achieved!
0
c#,python,outlook,ms-word,mapi
2009-09-22T07:53:00.000
0
1,458,690
OK, to answer my own question. I need to build a DLL with "MAPISendDocuments" and/or "MAPISendMail" functions defined. This DLL can have any name, and is referenced in the registry at HKLM/Software/Clients/Mail/MyMailApp/DLLPath. Found examples using Delphi...
0
3,109
false
0
1
How to create a MAPI32.dll stub to be able to "send as attachment" from MS Word?
1,483,115
1
8
0
1
41
1
0.024995
0
I'm trying to find a way to lazily load a module-level variable. Specifically, I've written a tiny Python library to talk to iTunes, and I want to have a DOWNLOAD_FOLDER_PATH module variable. Unfortunately, iTunes won't tell you where its download folder is, so I've written a function that grabs the filepath of a few podcast tracks and climbs back up the directory tree until it finds the "Downloads" directory. This takes a second or two, so I'd like to have it evaluated lazily, rather than at module import time. Is there any way to lazily assign a module variable when it's first accessed or will I have to rely on a function?
0
python,module,variables,lazy-loading,itunes
2009-09-22T22:32:00.000
0
1,462,986
If that variable lived in a class rather than a module, then you could overload __getattr__, or better yet, populate it in __init__.
0
14,915
false
0
1
Lazy module variables--can it be done?
1,463,036
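A sketch of the "put it on a class and overload __getattr__" idea from the answer above; the folder-finding function is a stand-in for the slow iTunes lookup described in the question, and all names here are illustrative.

    def _find_download_folder():
        # Placeholder for the real (slow) directory-walking logic.
        return "/tmp/Downloads"

    class _Settings:
        def __getattr__(self, name):
            # Only called when normal attribute lookup fails, i.e. on first access.
            if name == "DOWNLOAD_FOLDER_PATH":
                value = _find_download_folder()
                setattr(self, name, value)  # cache it; __getattr__ is skipped next time
                return value
            raise AttributeError(name)

    settings = _Settings()
    print(settings.DOWNLOAD_FOLDER_PATH)  # computed lazily, then cached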
1
4
0
1
3
0
0.049958
0
I am currently running a high-traffic python/django website using Apache and mod_wsgi. I'm hoping that there's a faster webserver configuration out there, and I've heard a fair number of recommendations for lighttpd and fastcgi. Is this setup faster than apache+mod_wsgi for serving dynamic django pages (I'm already convinced that lighttpd can server static files better)? The benchmarks online are either poorly conducted or inconclusive so I'm looking for some personal anecdotes. What architectural benefits does lighttpd + fastcgi provide? I understand that lighttpd uses epoll, and that a fastcgi process will be multithreaded. Also, having two separate processes, one for lighttpd and one for the python interpreter, will be largely beneficial. I am aware of tornado and its ability to handle thousands of file descriptors with much fewer threads using epoll and callbacks. However, I'd prefer to stick with django for now. Thanks, Ken
0
python,django,apache,fastcgi,lighttpd
2009-09-24T04:26:00.000
0
1,469,770
Doesn't answer your question, but do you already use caching for your site? Like memcached? This might give you a better performance gain than going through the mess of switching webservers.
0
3,952
false
1
1
Better webserver performance for Python Django: Apache mod_wsgi or Lighttpd fastcgi
1,470,459
3
5
0
0
4
1
0
0
I'm setting up a web application to use IronPython for scripting various user actions and I'll be exposing various business objects ready for accessing by the script. I want to make it impossible for the user to import the CLR or other assemblies in order to keep the script's capabilities simple and restricted to the functionality I expose in my business objects. How do I prevent the CLR and other assemblies/modules from being imported?
0
python,ironpython
2009-09-25T20:33:00.000
0
1,479,454
If you'd like to disable certain built-in modules I'd suggest filing a feature request over at ironpython.codeplex.com. This should be an easy enough thing to implement. Otherwise you could look at Importer.cs and disallow the import there, or you could simply delete ClrModule.cs from IronPython and re-build (and potentially remove any references to it).
0
996
false
0
1
IronPython - How to prevent CLR (and other modules) from being imported
1,480,581
3
5
0
0
4
1
0
0
I'm setting up a web application to use IronPython for scripting various user actions and I'll be exposing various business objects ready for accessing by the script. I want to make it impossible for the user to import the CLR or other assemblies in order to keep the script's capabilities simple and restricted to the functionality I expose in my business objects. How do I prevent the CLR and other assemblies/modules from being imported?
0
python,ironpython
2009-09-25T20:33:00.000
0
1,479,454
In case anyone comes across this thread from Google still (like I did): I managed to disable 'import clr' in Python scripts by commenting out the line //[assembly: PythonModule("clr", typeof(IronPython.Runtime.ClrModule))] in ClrModule.cs, but I'm not convinced this is a full solution to preventing unwanted access, since you will still need to override things like the file builtin.
0
996
false
0
1
IronPython - How to prevent CLR (and other modules) from being imported
59,887,790
3
5
0
1
4
1
1.2
0
I'm setting up a web application to use IronPython for scripting various user actions and I'll be exposing various business objects ready for accessing by the script. I want to make it impossible for the user to import the CLR or other assemblies in order to keep the script's capabilities simple and restricted to the functionality I expose in my business objects. How do I prevent the CLR and other assemblies/modules from being imported?
0
python,ironpython
2009-09-25T20:33:00.000
0
1,479,454
You'll have to search the script for the imports you don't want them to use, and reject the script in toto if it contains any of them. Basically, just reject the script if it contains Assembly.Load, import or AddReference.
0
996
true
0
1
IronPython - How to prevent CLR (and other modules) from being imported
1,479,480
1
4
0
12
33
0
1
0
Is there a python equivalent of phpMyAdmin? Here's why I'm looking for a python version of phpmyadmin: While I agree that phpmyadmin really rocks, I don't want to run php on my server. I'd like to move from apache2-prefork to apache2-mpm-worker. Worker blows the doors off of prefork for performance, but php5 doesn't work with worker. (Technically it does, but it's far more complicated.) The extra memory and performance penalty for having php on this server is large to me.
1
python,phpmyadmin
2009-09-26T04:51:00.000
0
1,480,453
You can use phpMyAdmin for a Python project, because phpMyAdmin is meant for MySQL databases. If you are using MySQL, then regardless of whether you are using PHP or Python, you can use phpMyAdmin.
0
23,600
false
0
1
phpMyAdmin equivalent in python?
1,480,549
1
1
0
1
0
0
1.2
0
I would like to write a small script that does the following (and that I can then run using my crontab): Look into a directory that contains directories whose names are in some date format, e.g. 30-10-09. Convert the directory name to the date it represents (of course, I could put this information as a string into a file in these directories, that doesn't matter to me). Compare each date with the current system time and find the one that has a specific time difference to the current system date, e.g. less than two days. Then, do something with the files in that directory (e.g., paste them together and send an email). I know a little bash scripting, but I don't know whether bash can itself handle this. I think I could do this in R, but the server where this needs to run doesn't have R. I'm curious anyway to learn a little bit of either Python or Ruby (both of which are on the server). Can someone point me in the right direction what might be the best way to do this?
0
python,date,scripting
2009-09-28T14:45:00.000
1
1,487,450
I would suggest using Python. You'll need the following functions:

- os.listdir gives you the directory contents, as a list of strings
- time.strptime(name, "%d-%m-%y") will try to parse such a string, and return a time tuple. You get a ValueError exception if parsing fails.
- time.mktime will convert a time tuple into seconds since the epoch
- time.time returns seconds since the epoch
- the smtplib module can send emails, assuming you know what SMTP server to use. Alternatively, you can run /usr/lib/sendmail through the subprocess module (assuming /usr/lib/sendmail is correctly configured)
0
289
true
0
1
Time difference between system date and string, e.g. from directory name?
1,487,702
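Putting the pieces listed in the answer above together, a rough sketch; the base directory and the two-day window are example values, not anything from the question.

    import os
    import time

    BASE = "/path/to/archive"   # example base directory
    MAX_AGE = 2 * 24 * 60 * 60  # "less than two days", in seconds

    now = time.time()
    for name in os.listdir(BASE):
        try:
            parsed = time.strptime(name, "%d-%m-%y")  # e.g. "30-10-09"
        except ValueError:
            continue                                  # not a date-named directory
        age = now - time.mktime(parsed)
        if 0 <= age < MAX_AGE:
            print("recent enough:", os.path.join(BASE, name))
            # ...paste the files together and hand them to smtplib here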
1
5
0
1
6
1
0.039979
0
Perl habits die hard. Variable declaration, scoping, and global/local are different between the 2 languages. Is there a set of recommended Python language idioms that will render the transition from Perl coding to Python coding less painful? Subtle variable misspelling can waste an extraordinary amount of time. I understand the variable declaration issue is quasi-religious among Python folks. I'm not arguing for language changes or features, just a reliable bridge between the 2 languages that will not let my Perl habits sink my Python efforts. Thanks.
0
python,perl,transitions
2009-09-28T21:02:00.000
0
1,489,355
In Python $_ does not exist except in the Python shell, and variables with global scope are frowned upon. In practice this has two major effects:

- In Python you can't use regular expressions as naturally as in Perl, so matching each iterated $_ and similarly catching matches is more cumbersome
- Python functions tend to be called explicitly or have default variables

However these differences are fairly minor when one considers that in Python just about everything becomes a class. When I used to do Perl I thought of "carving"; in Python I rather feel I am "composing". Python doesn't have the idiomatic richness of Perl and I think it is probably a mistake to attempt to do the translation.
0
539
false
0
1
Managing Perl habits in a Python environment
1,489,635
1
10
0
0
447
1
0
0
How do I find out which directories are listed in my system’s PYTHONPATH variable, from within a Python script (or the interactive shell)?
0
python,python-module,pythonpath
2009-09-28T22:01:00.000
1
1,489,599
If using conda, you can get the env prefix using os.environ["CONDA_PREFIX"].
0
822,516
false
0
1
How do I find out my PYTHONPATH using Python?
62,773,911
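Alongside the conda note above, the usual stdlib way to inspect the effective search path and the raw environment variable is a quick sketch like this:

    import os
    import sys

    print(sys.path)                                   # the effective module search path
    print(os.environ.get("PYTHONPATH", "(not set)"))  # the raw environment variable, if any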
1
5
0
2
6
0
0.07983
1
I'm trying to create XML using the ElementTree object structure in python. It all works very well except when it comes to processing instructions. I can create a PI easily using the factory function ProcessingInstruction(), but it doesn't get added into the elementtree. I can add it manually, but I can't figure out how to add it above the root element where PI's are normally placed. Anyone know how to do this? I know of plenty of alternative methods of doing it, but it seems that this must be built in somewhere that I just can't find.
0
python,xml,elementtree
2009-09-29T00:09:00.000
0
1,489,949
Yeah, I don't believe it's possible, sorry. ElementTree provides a simpler interface to (non-namespaced) element-centric XML processing than DOM, but the price for that is that it doesn't support the whole XML infoset. There is no apparent way to represent the content that lives outside the root element (comments, PIs, the doctype and the XML declaration), and these are also discarded at parse time. (Aside: this appears to include any default attributes specified in the DTD internal subset, which makes ElementTree strictly-speaking a non-compliant XML processor.) You can probably work around it by subclassing or monkey-patching the Python native ElementTree implementation's write() method to call _write on your extra PIs before _writeing the _root, but it could be a bit fragile. If you need support for the full XML infoset, probably best stick with DOM.
0
4,332
false
0
1
ElementTree in Python 2.6.2 Processing Instructions support?
1,490,057
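One of the "plenty of alternative methods" the question alludes to, sketched here for completeness rather than the monkey-patching route the answer mentions: serialize the processing instruction and the root element separately and write them out yourself. The PI target and content are illustrative only.

    import xml.etree.ElementTree as ET

    root = ET.Element("data")
    ET.SubElement(root, "item").text = "value"

    # Example PI; the target/content are placeholders.
    pi = ET.ProcessingInstruction("xml-stylesheet", 'type="text/xsl" href="style.xsl"')

    with open("out.xml", "w", encoding="utf-8") as f:
        f.write("<?xml version='1.0' encoding='utf-8'?>\n")
        f.write(ET.tostring(pi, encoding="unicode") + "\n")  # PI above the root element
        f.write(ET.tostring(root, encoding="unicode"))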
1
5
0
2
1
1
0.07983
0
I have a memory- and CPU-intensive problem to solve and I need to benchmark the different solutions in Ruby and Python on different platforms. To do the benchmark, I need to measure the time taken and the memory occupied by objects (not the entire program, but a selected list of objects) in both Python and Ruby. Please recommend ways to do it, and also let me know if it is possible to do it without using OS-specific tools (like Task Manager and ps). Thanks! Update: Yes, I know that both Python and Ruby are not strong in performance and there are better alternatives like C, C++, Java etc. I am actually more interested in comparing the performance of Python and Ruby. And please, no flame-wars.
0
python,ruby,performance,memory-management
2009-09-29T06:15:00.000
0
1,490,841
If you are using Python for CPU-intensive algorithmic tasks, I suggest using Numpy/Scipy to speed up your numerical calculations and the Psyco JIT compiler for everything else. Your speeds can approach those of much lower-level languages if you use optimized components.
0
1,564
false
0
1
Comparing performance between ruby and python code
1,491,010
6
6
0
5
5
0
0.16514
0
I'm looking at implementing a fuzzy logic controller based on either the PyFuzzy (Python) or FFLL (C++) libraries. I'd prefer to work with Python but am unsure if the performance will be acceptable in the embedded environment it will work in (either ARM or embedded x86 proc, both with ~64 MB of RAM). The main concern is that response times are as fast as possible (an update rate of 5 Hz+ would be ideal; >2 Hz is required). The system would be reading from multiple (probably 5) sensors from an RS232 port and providing 2/3 outputs based on the results of the fuzzy evaluation. Should I be concerned that Python will be too slow for this task?
0
python,c,embedded,fuzzy-logic
2009-09-30T13:32:00.000
1
1,498,155
Make it work, then make it work fast.
0
1,593
false
0
1
Performance of Python worth the cost?
1,498,739
6
6
0
1
5
0
0.033321
0
I'm looking at implementing a fuzzy logic controller based on either the PyFuzzy (Python) or FFLL (C++) libraries. I'd prefer to work with Python but am unsure if the performance will be acceptable in the embedded environment it will work in (either ARM or embedded x86 proc, both with ~64 MB of RAM). The main concern is that response times are as fast as possible (an update rate of 5 Hz+ would be ideal; >2 Hz is required). The system would be reading from multiple (probably 5) sensors from an RS232 port and providing 2/3 outputs based on the results of the fuzzy evaluation. Should I be concerned that Python will be too slow for this task?
0
python,c,embedded,fuzzy-logic
2009-09-30T13:32:00.000
1
1,498,155
If most of your runtime is spent in C libraries, the language you use to call these libraries isn't important. What language are your time-eating libraries written in ?
0
1,593
false
0
1
Performance of Python worth the cost?
1,499,253
6
6
0
0
5
0
0
0
I'm looking at implementing a fuzzy logic controller based on either the PyFuzzy (Python) or FFLL (C++) libraries. I'd prefer to work with Python but am unsure if the performance will be acceptable in the embedded environment it will work in (either ARM or embedded x86 proc, both with ~64 MB of RAM). The main concern is that response times are as fast as possible (an update rate of 5 Hz+ would be ideal; >2 Hz is required). The system would be reading from multiple (probably 5) sensors from an RS232 port and providing 2/3 outputs based on the results of the fuzzy evaluation. Should I be concerned that Python will be too slow for this task?
0
python,c,embedded,fuzzy-logic
2009-09-30T13:32:00.000
1
1,498,155
From your description, speed should not be much of a concern (and you can use C, cython, whatever you want to make it faster), but memory would be. For environments with 64 MB max (where the OS and everything else should fit as well, right?), I think there is a good chance that Python may not be the right tool for target deployment. If you have non-trivial logic to handle, I would still prototype in Python, though.
0
1,593
false
0
1
Performance of Python worth the cost?
1,502,231
6
6
0
0
5
0
0
0
I'm looking at implementing a fuzzy logic controller based on either the PyFuzzy (Python) or FFLL (C++) libraries. I'd prefer to work with Python but am unsure if the performance will be acceptable in the embedded environment it will work in (either ARM or embedded x86 proc, both with ~64 MB of RAM). The main concern is that response times are as fast as possible (an update rate of 5 Hz+ would be ideal; >2 Hz is required). The system would be reading from multiple (probably 5) sensors from an RS232 port and providing 2/3 outputs based on the results of the fuzzy evaluation. Should I be concerned that Python will be too slow for this task?
0
python,c,embedded,fuzzy-logic
2009-09-30T13:32:00.000
1
1,498,155
I never really measured the performance of pyfuzzy's examples, but the new version 0.1.0 can read FCL files just as FFLL does. Just describe your fuzzy system in this format, write some wrappers, and check the performance of both variants. For reading FCL with pyfuzzy you need the antlr Python runtime, but after reading you should be able to pickle the read object, so you don't need the antlr overhead on the target.
0
1,593
false
0
1
Performance of Python worth the cost?
1,612,690
6
6
0
35
5
0
1.2
0
I'm looking at implementing a fuzzy logic controller based on either the PyFuzzy (Python) or FFLL (C++) libraries. I'd prefer to work with Python but am unsure if the performance will be acceptable in the embedded environment it will work in (either ARM or embedded x86 proc, both with ~64 MB of RAM). The main concern is that response times are as fast as possible (an update rate of 5 Hz+ would be ideal; >2 Hz is required). The system would be reading from multiple (probably 5) sensors from an RS232 port and providing 2/3 outputs based on the results of the fuzzy evaluation. Should I be concerned that Python will be too slow for this task?
0
python,c,embedded,fuzzy-logic
2009-09-30T13:32:00.000
1
1,498,155
In general, you shouldn't obsess over performance until you've actually seen it become a problem. Since we don't know the details of your app, we can't say how it'd perform if implemented in Python. And since you haven't implemented it yet, neither can you. Implement the version you're most comfortable with, and can implement fastest, first. Then benchmark it. And if it is too slow, you have three options which should be done in order:

1. First, optimize your Python code
2. If that's not enough, write the most performance-critical functions in C/C++, and call that from your Python code
3. And finally, if you really need top performance, you might have to rewrite the whole thing in C++. But then at least you'll have a working prototype in Python, and you'll have a much clearer idea of how it should be implemented. You'll know what pitfalls to avoid, and you'll have an already correct implementation to test against and compare results to.
0
1,593
true
0
1
Performance of Python worth the cost?
1,498,214
6
6
0
12
5
0
1
0
I'm looking at implementing a fuzzy logic controller based on either the PyFuzzy (Python) or FFLL (C++) libraries. I'd prefer to work with Python but am unsure if the performance will be acceptable in the embedded environment it will work in (either ARM or embedded x86 proc, both with ~64 MB of RAM). The main concern is that response times are as fast as possible (an update rate of 5 Hz+ would be ideal; >2 Hz is required). The system would be reading from multiple (probably 5) sensors from an RS232 port and providing 2/3 outputs based on the results of the fuzzy evaluation. Should I be concerned that Python will be too slow for this task?
0
python,c,embedded,fuzzy-logic
2009-09-30T13:32:00.000
1
1,498,155
Python is very slow at handling large amounts of non-string data. For some operations, you may see that it is 1000 times slower than C/C++, so yes, you should look into this and do the necessary benchmarks before you write time-critical algorithms in Python. However, you can extend Python with modules in C/C++ code, so that time-critical things are fast, while you are still able to use Python for the main code.
0
1,593
false
0
1
Performance of Python worth the cost?
1,498,176
2
3
0
-1
14
0
-0.066568
0
On a Linux box I want to run a Python script as another user. I've already made a wrapper program in C++ that calls the script, since I've realized that the ownership of running the script is decided by the ownership of the python interpreter. After that I change the C++ program to a different user and run the C++ program. This setup doesn't seem to be working. Any ideas?
0
python,linux
2009-09-30T16:31:00.000
1
1,499,268
Use the command sudo. In order to run a program as a user, the system must "authenticate" that user. Obviously, root can run any program as any user, and any user can su to another user with a password. The program sudo can be configured to allow a group of users to sudo a particular command as a particular user. For example, you could create a group scriptUsers and a user scriptRun. Then, configure sudo to let any user in scriptUsers become scriptRun ONLY to run your script.
0
33,209
false
0
1
Running python script as another user
1,499,282
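A hedged sketch of the group/dedicated-user arrangement described in the answer above, driven from Python; the group and user names are the hypothetical ones from that answer, the script path is invented, and the sudoers line (edited with visudo) must already be in place.

    import subprocess

    # Assumed sudoers entry (illustrative only):
    #   %scriptUsers ALL=(scriptRun) NOPASSWD: /usr/bin/python3 /opt/app/task.py
    subprocess.run(
        ["sudo", "-u", "scriptRun", "/usr/bin/python3", "/opt/app/task.py"],
        check=True,
    )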
2
3
0
0
14
0
0
0
On a Linux box I want to run a Python script as another user. I've already made a wrapper program in C++ that calls the script, since I've realized that the ownership of running the script is decided by the ownership of the python interpreter. After that I change the C++ program to a different user and run the C++ program. This setup doesn't seem to be working. Any ideas?
0
python,linux
2009-09-30T16:31:00.000
1
1,499,268
Give those users the ability to sudo su $dedicated_username and tailor the permissions on your system so that $dedicated_user has sufficient, but not excessive, access.
0
33,209
false
0
1
Running python script as another user
1,499,313
1
3
0
2
5
0
0.132549
0
I'm using cProfile, pstats and Gprof2dot to profile a rather long Python script. The results tell me that the most time is spent calling a method in an object I've defined. However, what I would really like to know is exactly which line number within that function is eating up the time. Any ideas how to get this additional information? (By the way, I'm using Python 2.6 on OS X Snow Leopard, if that helps...)
0
python,scripting,numbers,profiler,line
2009-09-30T20:46:00.000
0
1,500,564
cProfile does not track line numbers within a function; it only tracks the line number of where the function was defined. cProfile attempts to duplicate the behavior of profile (which is pure Python). profile uses pstats to store the data from running, and pstats only stores line numbers for function definitions, not for individual Python statements. If you need to figure out with finer granularity what is eating all your time, then you need to refactor your big function into several, smaller functions.
0
2,434
false
0
1
cProfile and Python: Finding the specific line number that code spends most time on
1,500,818
3
4
0
5
4
1
0.244919
0
Because I'm a Python fan, I'd like to learn the .NET framework using IronPython. Would I be missing out on something? Is this in some way not recommended? EDIT: I'm pretty knowledgeable about Java (so learning/using a new language is not a problem for me). If needed, will I be able to use everything I learned in IronPython (excluding language features) to write C# code?
0
c#,.net,ironpython
2009-10-01T15:55:00.000
0
1,504,804
If I wanted to just "learn the framework", I would do it in C# or VB for two main reasons:

- Intellisense - the framework is huge, and being offered suggestions for function overloads is one of the ways to find new stuff. There's almost no good intellisense for the framework with IronPython at the moment (Michael Foord has done some work on building the appropriate info for Wing, but I haven't tried it myself).
- Code samples - pretty much all the educational material that exists about the .NET framework is given with C# or VB. You'll be much more on your own with IronPython.
0
477
false
1
1
Using IronPython to learn the .NET framework, is this bad?
1,551,802
3
4
0
11
4
1
1.2
0
Because I'm a Python fan, I'd like to learn the .NET framework using IronPython. Would I be missing out on something? Is this in some way not recommended? EDIT: I'm pretty knowledgeable about Java (so learning/using a new language is not a problem for me). If needed, will I be able to use everything I learned in IronPython (excluding language features) to write C# code?
0
c#,.net,ironpython
2009-10-01T15:55:00.000
0
1,504,804
No, it sounds like a good way to learn to me. You get to stick with a language and syntax that you are familiar with, and learn about the huge range of classes available in the framework, and how the CLR supports your code. Once you've got to grips with some of the framework and the CLR services you could always pick up C# in the future. By that point it will just be a minor syntax change from what you already know. Bear in mind that if you are thinking with respect to a career, you won't find many IronPython jobs, but like I say, this could be a good way to learn about the framework first, then build on that with C# in a month or two's time.
0
477
true
1
1
Using IronPython to learn the .NET framework, is this bad?
1,504,823
3
4
0
5
4
1
0.244919
0
Because I'm a Python fan, I'd like to learn the .NET framework using IronPython. Would I be missing out on something? Is this in some way not recommended? EDIT: I'm pretty knowledgeable about Java (so learning/using a new language is not a problem for me). If needed, will I be able to use everything I learned in IronPython (excluding language features) to write C# code?
0
c#,.net,ironpython
2009-10-01T15:55:00.000
0
1,504,804
You can definitely do that to learn the class library, but I'm not sure if it's such a good idea when it comes to fundamental CLR concepts (e.g. delegates and events). You'll need to pay attention and distinguish what is strictly an IronPython feature, and what is CLR feature exposed in IronPython in a way that matches its dynamic semantics better.
0
477
false
1
1
Using IronPython to learn the .NET framework, is this bad?
1,504,904
1
7
0
314
166
1
1.2
0
What is the cleanest and most Pythonic way to get tomorrow's date? There must be a better way than to add one to the day, handle days at the end of the month, etc.
0
python,datetime,date,time
2009-10-01T22:45:00.000
0
1,506,901
datetime.date.today() + datetime.timedelta(days=1) should do the trick
0
110,489
true
0
1
Cleanest and most Pythonic way to get tomorrow's date?
1,506,916
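The accepted one-liner above, as a complete runnable snippet:

    import datetime

    tomorrow = datetime.date.today() + datetime.timedelta(days=1)
    print(tomorrow)  # month and year rollovers are handled by the timedelta arithmetic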
1
5
0
7
7
1
1
0
I've seen some comparisons between Smalltalk and Ruby on the one hand and Ruby and Python on the other, but not between Python and Smalltalk. I'd especially like to know what the fundamental differences in Implementation, Syntax, Extensibility and Philosophy are. For example, Python does not seem to have Metaclasses. Smalltalk has no concept of generators. And although both are said to be dynamically typed, I believe that Python does not do dynamic method dispatch. Is this correct?
0
python,comparison,language-features,smalltalk,language-comparisons
2009-10-02T08:09:00.000
0
1,508,256
Python certainly does have metaclasses. Smalltalk has some unusual features:

- Has a rather simple syntax and only about 6 (!) keywords. Everything else (including defining new classes) is accomplished by calling methods (sending messages in Smalltalk). This allows you to create some DSL within the language.
- In Smalltalk, you don't store source files, but instead have one big memory image and you modify it on the fly. You can also modify most of the Smalltalk itself (and possibly break it ;)
0
3,672
false
0
1
How does Smalltalk (Pharo for example) compare to Python?
1,508,312
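Since the answer above notes that Python does have metaclasses, a tiny illustrative example; the registry idea and all names are invented for the sketch.

    class Registry(type):
        """Metaclass that records every class created with it."""
        classes = []

        def __new__(mcls, name, bases, namespace):
            cls = super().__new__(mcls, name, bases, namespace)
            Registry.classes.append(cls)
            return cls

    class Plugin(metaclass=Registry):
        pass

    print(Registry.classes)  # [<class '__main__.Plugin'>]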
3
4
0
0
6
1
0
0
I have an application that generates some large log files > 500MB. I have written some utilities in Python that allow me to quickly browse the log file and find data of interest. But I now get some datasets where the file is too big to load it all into memory. I thus want to scan the document once, build an index and then only load the section of the document into memory that I want to look at at a time. This works for me when I open a 'file', read it one line at a time and store the offset with file.tell(). I can then come back to that section of the file later with file.seek(offset, 0). My problem is, however, that I may have UTF-8 in the log files so I need to open them with the codecs module (codecs.open(<filename>, 'r', 'utf-8')). With the resulting object I can call seek and tell, but they do not match up. I assume that codecs needs to do some buffering or maybe it returns character counts instead of bytes from tell? Is there a way around this?
0
python,utf-8,codec,seek
2009-10-02T15:17:00.000
0
1,510,188
Update: You can't do seek/tell on the object returned by codecs.open(). You need to use a normal file, and decode the strings to unicode after reading. I do not know why it doesn't work, but I can't make it work. The seek seems to only work once, for example; then you need to close and reopen the file, which is of course not useful. The tell does not use character positions, but doesn't show you where your position in the stream is (but probably where the underlying file object is in reading from disk). So probably because of some sort of underlying buffering, you can't do it. But decoding after reading works just fine, so go for that.
0
2,140
false
0
1
Can seek and tell work with UTF-8 encoded documents in Python?
1,510,303
3
4
0
3
6
1
1.2
0
I have an application that generates some large log files > 500MB. I have written some utilities in Python that allow me to quickly browse the log file and find data of interest. But I now get some datasets where the file is too big to load it all into memory. I thus want to scan the document once, build an index and then only load the section of the document into memory that I want to look at at a time. This works for me when I open a 'file', read it one line at a time and store the offset with file.tell(). I can then come back to that section of the file later with file.seek(offset, 0). My problem is, however, that I may have UTF-8 in the log files so I need to open them with the codecs module (codecs.open(<filename>, 'r', 'utf-8')). With the resulting object I can call seek and tell, but they do not match up. I assume that codecs needs to do some buffering or maybe it returns character counts instead of bytes from tell? Is there a way around this?
0
python,utf-8,codec,seek
2009-10-02T15:17:00.000
0
1,510,188
If true, this sounds like a bug or limitation of the codecs module, as it's probably confusing byte and character offsets. I would use the regular open() function for opening the file; then seek()/tell() will give you byte offsets that are always consistent. Whenever you want to read, use f.readline().decode('utf-8'). Beware, though, that using the f.read() function can land you in the middle of a multi-byte character, thus producing a UTF-8 decode error. readline() will always work. This doesn't transparently handle the byte-order mark for you, but chances are your log files do not have BOMs anyway.
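A minimal sketch of that approach, assuming a UTF-8 log file opened in binary mode (the filename and the 'ERROR' filter are only illustrative): byte offsets from tell() stay consistent, and each line is decoded after reading.

offsets = []
with open('app.log', 'rb') as f:            # plain file, binary mode: tell()/seek() are byte offsets
    while True:
        offset = f.tell()                   # byte position of the line we are about to read
        line = f.readline()
        if not line:
            break
        if b'ERROR' in line:                # index only the lines of interest
            offsets.append(offset)

with open('app.log', 'rb') as f:            # later: jump straight back to an indexed line
    for offset in offsets:
        f.seek(offset, 0)
        print(f.readline().decode('utf-8'))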
0
2,140
true
0
1
Can seek and tell work with UTF-8 encoded documents in Python?
1,510,276
3
4
0
2
6
1
0.099668
0
I have an application that generates some large log files > 500MB. I have written some utilities in Python that allow me to quickly browse the log file and find data of interest. But I now get some datasets where the file is too big to load it all into memory. I thus want to scan the document once, build an index and then only load the section of the document into memory that I want to look at at a time. This works for me when I open a 'file', read it one line at a time and store the offset from file.tell(). I can then come back to that section of the file later with file.seek( offset, 0 ). My problem, however, is that I may have UTF-8 in the log files, so I need to open them with the codecs module (codecs.open(<filename>, 'r', 'utf-8')). With the resulting object I can call seek and tell, but they do not match up. I assume that codecs needs to do some buffering, or maybe it returns character counts instead of bytes from tell? Is there a way around this?
0
python,utf-8,codec,seek
2009-10-02T15:17:00.000
0
1,510,188
For UTF-8, you don't actually need to open the file with codecs.open. Instead, it is reliable to read the file as a byte string first, and only then decode an individual section (invoking the .decode method on the string). Breaking the file at line boundaries is safe; the only unsafe way to split it would be in the middle of a multi-byte character (which you can recognize from its byte values being >= 128).
0
2,140
false
0
1
Can seek and tell work with UTF-8 encoded documents in Python?
1,510,282
1
3
0
4
4
1
0.26052
0
All. I am trying to find a python module that I can use to parse a cron entry and get the next time it will run. With perl I use the Schedule::Cron::Events module but I would like to convert to python. Thanks in advance.
0
python,module,cron
2009-10-02T21:22:00.000
0
1,511,854
I could be wrong, but doesn't python-crontab offer ways to read and write the crontab, while offering nothing for parsing an entry to determine the time until the next run of a job?
0
8,165
false
0
1
Parse a cron entry in Python
3,256,325
1
2
0
1
5
1
1.2
0
I've got several eggs I maintain on PyPI, but up until now I've always focused on Python 2.5.x. I'd like to release my eggs under both Python 2.5 & Python 2.6 in an automated fashion, i.e. running tests, generating docs, preparing eggs, uploading to PyPI. How do you guys achieve this? A related question: how do I tag an egg as "version independent", i.e. working under all versions of Python?
0
python,release-management,pypi
2009-10-03T02:45:00.000
0
1,512,644
You don't need to release eggs for anything other than Windows, and then only if your package uses C extensions, so that it has compiled parts. Otherwise you simply release one source distribution; that will be enough for all Python versions on all platforms. Automating the test runs for different versions is tricky if you don't have a buildbot, but once you have run the tests with both 2.5 and 2.6, releasing is just a question of running python setup.py sdist register upload, and it doesn't matter what Python version you use to run that.
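As a rough sketch of what that looks like in practice (the package name and metadata below are placeholders, not anything real), a pure-Python project only needs a small setup.py, after which the command above builds and uploads the source distribution:

# setup.py -- minimal metadata for a pure-Python source distribution
from distutils.core import setup

setup(
    name='mypackage',                  # placeholder PyPI name
    version='0.3.1',
    description='Example package released as a source distribution',
    author='Your Name',
    author_email='you@example.com',
    packages=['mypackage'],            # package directories to include
)

# Release, run with whichever Python you tested with:
#   python setup.py sdist register upload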
0
817
true
0
1
Python Pypi: what is your process for releasing packages for different Python versions? (Linux)
1,513,884
1
6
0
9
2
0
1.2
1
I am making a community for web-comic artists who will be able to sync their existing websites to this site. However, I am debating which CMS I should use: Drupal or Wordpress. I have heard great things about Drupal, and that it is really aimed at social networking. I actually got to play a little bit in the back end of Drupal and it seemed quite complicated to me, but I am not going to give up on fully understanding how Drupal works. As for Wordpress, I am very familiar with the framework. I have the ability to extend it to do what I want, but I am hesitating because I think the framework is not built for communities (I think it may slow down in the future). I also have an unrelated question: should I go with a Python CMS? I have heard very great things about Python and how much better it is compared to PHP. Your advice is appreciated.
0
python,wordpress,drupal,content-management-system,social-networking
2009-10-03T07:17:00.000
0
1,513,062
Difficult decision. Normally I would say 'definitely Drupal' without hesitation, as Drupal was built as a system for community sites from the beginning, whereas Wordpress still shows its heritage as a blogging solution; at least that's what I hear quite often. But then, I've been working with Drupal all the time recently and haven't had a closer look at Wordpress for quite a while. That said, Drupal has grown into a pretty complex system over the years, so there is quite a learning curve for newcomers. Given that you are already familiar with Wordpress, it might be more efficient for you to go with that, provided it can do all that you need. So I would recommend Drupal, but you should probably first get some opinions from people experienced with Wordpress concerning the possibility of turning it into a community site. As for the Python vs. PHP CMS question, I'd say that the quality of a CMS is a function of the ability of its developers, the maturity of the system, the surrounding 'ecosystem', etc., and not of the particular language used to build it. (And discussions about the quality of one established language vs. another? Well - let's just not go there ;)
0
3,472
true
0
1
Drupal or Wordpress CMS as a Social Network?
1,513,657
1
1
0
0
0
0
1.2
0
I can't seem to fetch the verifiedEmail field when trying to log in to AOL's OpenID on my site. Every other provider that I know of provides this property, but not AOL. I realize that AOL somehow uses an old OpenID version, although is it feasible to just assume that their e-mail ends in @aol.com? I'm using the RPXNow library with Python.
0
python,openid,rpxnow
2009-10-03T11:40:00.000
0
1,513,543
I believe that OpenID lets the user decide how much information to "share" during the login process. I can't say that I am an expert on the subject, but I know that my identity at myopenid.com lets me specify precisely what information to make available. Is it possible that the AOL default is to share nothing? If this is the case, then you may want to do an email authorization directly with the user if the OpenID provider doesn't seem to have the information. OpenID doesn't mandate that this information is available so I would assume that you will have to handle the case of it not being there in application code.
0
166
true
0
1
verifiedEmail AOL OpenID
1,513,794
3
5
0
3
3
1
0.119427
0
This is what I've done for a project. I have a few data structures that are basically dictionaries with some methods that operate on the data. When I save them to disk, I write them out to .py files as code that, when imported as a module, will load the same data into such a data structure. Is this reasonable? Are there any big disadvantages? The advantage I see is that when I want to operate on the saved data, I can quickly import the modules I need. Also, the modules can be used separately from the rest of the application because you don't need a separate parser or loader functionality.
0
python,data-persistence,dynamic-import
2009-10-03T16:40:00.000
0
1,514,228
The biggest drawback is that it's a potential security problem, since it's hard to guarantee that the files won't contain arbitrary code, which could be really bad. So don't use this approach if anyone other than you has write access to the files.
0
351
false
0
1
Is it reasonable to save data as python modules?
1,514,234
3
5
0
3
3
1
1.2
0
This is what I've done for a project. I have a few data structures that are basically dictionaries with some methods that operate on the data. When I save them to disk, I write them out to .py files as code that, when imported as a module, will load the same data into such a data structure. Is this reasonable? Are there any big disadvantages? The advantage I see is that when I want to operate on the saved data, I can quickly import the modules I need. Also, the modules can be used separately from the rest of the application because you don't need a separate parser or loader functionality.
0
python,data-persistence,dynamic-import
2009-10-03T16:40:00.000
0
1,514,228
It's reasonable, and I do it all the time. Obviously it's not a format you use to exchange data, so it's not a good format for anything like a save file. But for example, when I do migrations of websites to Plone, I often get data about the site (such as a list of which pages should be migrated, a list of how old URLs should be mapped to new ones, or lists of tags). These you typically get in Word or Excel format. Also, the data often needs a bit of massaging, and I end up with what for all intents and purposes are dictionaries mapping one URL to some other information. Sure, I could save that as CSV and parse it into a dictionary. But instead I typically save it as a Python file with a dictionary. Saves code. So, yes, it's reasonable; no, it's not a format you should use for any sort of save file. It is, however, often used for data that straddles the border with configuration, like the above.
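A rough sketch of that pattern (the file name and the URL mapping are invented for illustration): dump the dictionary as a .py file, then import it later with no parsing step.

import pprint

url_map = {'/old/about.html': '/about', '/old/news.html': '/blog'}   # example data

# Write the data out as an importable module.
with open('urlmap.py', 'w') as out:
    out.write('# generated file: just a dict literal\n')
    out.write('URL_MAP = %s\n' % pprint.pformat(url_map))

# Later, from any script that has this directory on sys.path:
import urlmap
print(urlmap.URL_MAP['/old/about.html'])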
0
351
true
0
1
Is it reasonable to save data as python modules?
1,514,248
3
5
0
7
3
1
1
0
This is what I've done for a project. I have a few data structures that are basically dictionaries with some methods that operate on the data. When I save them to disk, I write them out to .py files as code that, when imported as a module, will load the same data into such a data structure. Is this reasonable? Are there any big disadvantages? The advantage I see is that when I want to operate on the saved data, I can quickly import the modules I need. Also, the modules can be used separately from the rest of the application because you don't need a separate parser or loader functionality.
0
python,data-persistence,dynamic-import
2009-10-03T16:40:00.000
0
1,514,228
By operating this way, you may gain some modicum of convenience, but you pay many kinds of price for that. The space it takes to save your data, and the time it takes to both save and reload it, go up substantially; and your security exposure is unbounded -- you must ferociously guard the paths from which you reload modules, as it would provide an easy avenue for any attacker to inject code of their choice to be executed under your userid (pickle itself is not rock-solid, security-wise, but, compared to this arrangement, it shines;-). All in all, I prefer a simpler and more traditional arrangement: executable code lives in one module (on a typical code-loading path, that does not need to be R/W once the module's compiled) -- it gets loaded just once and from an already-compiled form. Data live in their own files (or portions of DB, etc) in any of the many suitable formats, mostly standard ones (possibly including multi-language ones such as JSON, CSV, XML, ... &c, if I want to keep the option open to easily load those data from other languages in the future).
0
351
false
0
1
Is it reasonable to save data as python modules?
1,514,250
1
7
0
0
29
1
0
0
This is a very basic question - but I haven't been able to find an answer by searching online. I am using python to control ArcGIS, and I have a simple python script, that calls some pre-written code. However, when I make a change to the pre-written code, it does not appear to result in any change. I import this module, and have tried refreshing it, but nothing happens. I've even moved the file it calls to another location, and the script still works fine. One thing I did yesterday was I added the folder where all my python files are to the sys path (using sys.append('path') ), and I wonder if that made a difference. Thanks in advance, and sorry for the sloppy terminology.
0
python,refresh,reload
2009-10-04T18:23:00.000
0
1,517,038
I had the exact same issue creating a geoprocessing script for ArcGIS 10.2. I had a python toolbox script, a tool script and then a common script. I had a parameter for Dev/Test/Prod in the tool that would control which version of the code was run: Dev would run the code in the dev folder, Test from the test folder, and Prod from the prod folder. Changes to the common dev script would not run when the tool was run from ArcCatalog; closing ArcCatalog made no difference. Even though I selected Dev or Test it would always run from the prod folder. Adding reload(myCommonModule) to the tool script resolved this issue.
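In other words, Python caches an imported module, so editing the source on disk has no effect until the module is explicitly reloaded. A tiny sketch (the module name is hypothetical):

import myCommonModule        # first import: executed once and cached

# ... edit myCommonModule.py on disk ...

reload(myCommonModule)       # Python 2 built-in: re-executes the module in place
# Python 3 equivalent:
#   import importlib
#   importlib.reload(myCommonModule)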
0
106,481
false
0
1
python refresh/reload
27,299,101
2
2
0
1
0
0
1.2
0
I have an open source project containing both Python and C code. I'm wondering whether there is any use for distutils for me, because I'm planning to do an ubuntu/debian package. The C code is not something that I could or want to use as a Python extension. The C and Python programs communicate over TCP/IP through localhost. So the bottom line here is: while I'm learning packaging, does the usage of distutils-specific files only make me more confused, since I can't use my C code as Python extensions? Or should I divide my C and Python functionality into separate projects to be able to understand packaging concepts better?
0
python,c,packaging,distutils
2009-10-06T06:17:00.000
1
1,523,874
distutils can be used to install end user programs, but it's most useful when using it for Python libraries, as it can create source packages and also install them in the correct place. For that I would say it's more or less required. But for an end user Python program you can also use make or whatever you like and are used to, as you don't need to install any code in the Python site-packages directory, you don't need to put your code onto PyPI, and it doesn't need to be accessible from other Python code. I don't think distutils will be any more or less complicated to use for installing an end-user program compared to other tools. All such install/packaging tools are hella-complex, as Cartman would have said.
0
284
true
0
1
Reasons to use distutils when packaging C/Python project
1,523,993
2
2
0
1
0
0
0.099668
0
I have an open source project containing both Python and C code. I'm wondering whether there is any use for distutils for me, because I'm planning to do an ubuntu/debian package. The C code is not something that I could or want to use as a Python extension. The C and Python programs communicate over TCP/IP through localhost. So the bottom line here is: while I'm learning packaging, does the usage of distutils-specific files only make me more confused, since I can't use my C code as Python extensions? Or should I divide my C and Python functionality into separate projects to be able to understand packaging concepts better?
0
python,c,packaging,distutils
2009-10-06T06:17:00.000
1
1,523,874
Because it uses a unified python setup.py install command? distutils, or setuptools? Whatever, just use one of those. For development, it's also really useful because you don't have to care about where to find such-and-such a dependency. As long as it's standard Python/basic system library stuff, setup.py should find it for you. With setup.py, you no longer need ./configure stuff or ugly autotools to create huge Makefiles. It just works (tm).
0
284
false
0
1
Reasons to use distutils when packaging C/Python project
1,525,194
4
11
0
2
6
0
0.036348
0
I am basically from the world of C language programming, now delving into the world of scripting languages like Ruby and Python. I am wondering how to do debugging. At present the steps I follow are: I complete a large script, comment everything but the portion I want to check, and execute the script. Though it works, I am not able to debug the way I would in, say, a VC++ environment or something like that. My question is, is there any better way of debugging? Note: I guess it may be a repeated question; if so, please point me to the answer.
0
python,ruby,scripting-language
2009-10-07T06:41:00.000
0
1,529,896
Scripting languages are no different from other languages in the sense that you still have to break your problems into manageable pieces -- that is, functions. So, instead of testing the whole script after finishing the whole script, I prefer to test the small functions before integrating them. TDD always helps.
0
1,224
false
0
1
Debugging a scripting language like ruby
1,529,931
4
11
0
10
6
0
1.2
0
I am basically from the world of C language programming, now delving into the world of scripting languages like Ruby and Python. I am wondering how to do debugging. At present the steps I follow are: I complete a large script, comment everything but the portion I want to check, and execute the script. Though it works, I am not able to debug the way I would in, say, a VC++ environment or something like that. My question is, is there any better way of debugging? Note: I guess it may be a repeated question; if so, please point me to the answer.
0
python,ruby,scripting-language
2009-10-07T06:41:00.000
0
1,529,896
Your sequence seems entirely backwards to me. Here's how I do it: I write a test for the functionality I want. I start writing the script, executing bits and verifying test results. I review what I'd done to document and publish. Specifically, I execute before I complete. It's way too late by then. There are debuggers, of course, but with good tests and good design, I've almost never needed one.
0
1,224
true
0
1
Debugging a scripting language like ruby
1,529,996
4
11
0
0
6
0
0
0
I am basically from the world of C language programming, now delving into the world of scripting languages like Ruby and Python. I am wondering how to do debugging. At present the steps I follow are: I complete a large script, comment everything but the portion I want to check, and execute the script. Though it works, I am not able to debug the way I would in, say, a VC++ environment or something like that. My question is, is there any better way of debugging? Note: I guess it may be a repeated question; if so, please point me to the answer.
0
python,ruby,scripting-language
2009-10-07T06:41:00.000
0
1,529,896
The debugging method you described is perfect for a static language like C++, but given that the language is so different, the coding methods are similarly different. One of the most important things in a dynamic language such as Python or Ruby is the interactive toplevel (what you get by typing, say, python on the command line). This means that running a part of your program is very easy. Even if you've written a large program before testing (which is a bad idea), it is hopefully separated into many functions. So, open up your interactive toplevel, do an import thing (for whatever thing happens to be) and then you can easily start testing your functions one by one, just calling them on the toplevel. Of course, for a more mature project, you probably want to write out an actual test suite, and most languages have a method to do that (in Python, this is doctest and nose; I don't know about other languages). At first, though, when you're writing something not particularly formal, just remember a few simple rules of debugging dynamic languages: Start small. Don't write large programs and test them. Test each function as you write it, at least cursorily. Use the toplevel. Running small pieces of code in a language like Python is extremely lightweight: fire up the toplevel and run it. Compare that with writing a complete program and then compiling and running it in, say, C++. Use the fact that you can quickly check the correctness of any function. Debuggers are handy, but often so are print statements. If you're only running a single function, debugging with print statements isn't that inconvenient, and it also frees you from dragging along an IDE.
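For example, a typical toplevel session while a module is being written might look like this (the module and function names are made up):

>>> import parser_utils                  # the module currently under development
>>> parser_utils.split_fields('a;b;c')
['a', 'b', 'c']
>>> parser_utils.split_fields('')        # poke at edge cases interactively
[]
>>> reload(parser_utils)                 # after editing the source, reload and retry
<module 'parser_utils' from 'parser_utils.py'>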
0
1,224
false
0
1
Debugging a scripting language like ruby
1,531,112
4
11
0
2
6
0
0.036348
0
I am basically from the world of C language programming, now delving into the world of scripting languages like Ruby and Python. I am wondering how to do debugging. At present the steps I follow are: I complete a large script, comment everything but the portion I want to check, and execute the script. Though it works, I am not able to debug the way I would in, say, a VC++ environment or something like that. My question is, is there any better way of debugging? Note: I guess it may be a repeated question; if so, please point me to the answer.
0
python,ruby,scripting-language
2009-10-07T06:41:00.000
0
1,529,896
"My question is, is there any better way of debugging?" Yes. Your approach, "1. I complete a large script, 2. Comment everything but the portion I want to check, 3. Execute the script" is not really the best way to write any software in any language (sorry, but that's the truth.) Do not write a large anything. Ever. Do this. Decompose your problem into classes of objects. For each class, write the class by: 2a. Outline the class, focusing on the external interface, not the implementation details. 2b. Write tests to prove that interface works. 2c. Run the tests. They'll fail, since you only outlined the class. 2d. Fix the class until it passes the test. 2e. At some point, you'll realize your class designs aren't optimal. Refactor your design, assuring your tests still pass. Now, write your final script. It should be short. All the classes have already been tested. 3a. Outline the script. Indeed, you can usually just write the script. 3b. Write some test cases that prove the script works. 3c. Run the tests. They may pass. You're done. 3d. If the tests don't pass, fix things until they do. Write many small things. It works out much better in the long run than writing a large thing and commenting parts of it out.
0
1,224
false
0
1
Debugging a scripting language like ruby
1,530,723
2
6
0
0
3
1
0
0
I heard that Python has automated "garbage collection", but C++ does not. What does that mean?
0
c++,python,garbage-collection
2009-10-07T08:21:00.000
0
1,530,245
As you have got your answer, it's now worth knowing the cons of automated garbage collection: it requires large amounts of extra memory and is not suitable for applications with hard real-time deadlines.
0
1,606
false
0
1
I heard that Python has automated "garbage collection", but C++ does not. What does that mean?
1,532,646
2
6
0
2
3
1
0.066568
0
I heard that Python has automated "garbage collection", but C++ does not. What does that mean?
0
c++,python,garbage-collection
2009-10-07T08:21:00.000
0
1,530,245
It basically refers to the way they handle memory resources. When you need memory, you usually ask the OS for it and later give it back. With Python you don't need to worry about returning it; with C++ you need to track what you asked for and return it yourself. One is easier, the other more performant; you choose your tool.
0
1,606
false
0
1
I heard that Python has automated "garbage collection", but C++ does not. What does that mean?
1,530,267
5
6
0
5
6
1
0.16514
0
I need to transfer large files across the network and need to create checksums for them on an hourly basis, so the speed of generating the checksum is critical for me. Somehow I can't make zlib.crc32 and zlib.adler32 work with files larger than 4GB on a Windows XP Pro 64bit machine. I suspect I've hit the 32bit limitation here? Using hashlib.md5 I could get a result, but the problem is the speed: it takes roughly 5 minutes to generate an md5 for a 4.8GB file, and task manager shows that the process is using one core only. My questions are: is there a way to make crc work on large files? I prefer to use crc over md5. If not, then is there a way to speed up md5.hexdigest()/md5.digest(), or in this case any hashlib hexdigest/digest? Maybe splitting it into a multi-threaded process? How do I do that? PS: I'm working on something similar to an "Asset Management" system, kind of like svn, but the assets consist of large compressed image files. The files have tiny incremental changes. The hashing/checksum is needed for detecting changes and error detection.
0
python,multithreading,md5,crc32,hashlib
2009-10-07T16:28:00.000
0
1,532,720
It's an algorithm selection problem, rather than a library/language selection problem! There appear to be two points to consider primarily: how much would the disk I/O affect the overall performance? what is the expected reliability of the error detection feature? Apparently, the answer to the second question is something like 'some false negatives allowed', since the reliability of any 32-bit hash, relative to a 4GB message, even in a moderately noisy channel, is not going to be virtually absolute. Assuming that I/O can be improved through multithreading, we may choose a hash that doesn't require a sequential scan of the complete message. Instead we can maybe work the file in parallel, hashing individual sections and either combining the hash values or appending them, to form a longer, more reliable error detection device. The next step could be to formalize this handling of files as ordered sections, and to transmit them as such (to be re-glued together at the recipient's end). This approach, along with additional information about the way the files are produced (for example, they may be exclusively modified by append, like log files), may even allow you to limit the amount of hash calculation required. The added complexity of this approach needs to be weighed against the desire to have zippy fast CRC calculation. Side note: Adler32 is not limited to message sizes below a particular threshold; it may just be a limit of the zlib API. (BTW, the reference I found about zlib.adler32 used a buffer, and well... that approach is to be avoided in the context of our huge messages, in favor of streamed processing: read a little from the file, calculate, repeat...)
0
17,263
false
0
1
the fastest way to create checksum for large files in python
1,533,036
5
6
0
1
6
1
0.033321
0
I need to transfer large files across the network and need to create checksums for them on an hourly basis, so the speed of generating the checksum is critical for me. Somehow I can't make zlib.crc32 and zlib.adler32 work with files larger than 4GB on a Windows XP Pro 64bit machine. I suspect I've hit the 32bit limitation here? Using hashlib.md5 I could get a result, but the problem is the speed: it takes roughly 5 minutes to generate an md5 for a 4.8GB file, and task manager shows that the process is using one core only. My questions are: is there a way to make crc work on large files? I prefer to use crc over md5. If not, then is there a way to speed up md5.hexdigest()/md5.digest(), or in this case any hashlib hexdigest/digest? Maybe splitting it into a multi-threaded process? How do I do that? PS: I'm working on something similar to an "Asset Management" system, kind of like svn, but the assets consist of large compressed image files. The files have tiny incremental changes. The hashing/checksum is needed for detecting changes and error detection.
0
python,multithreading,md5,crc32,hashlib
2009-10-07T16:28:00.000
0
1,532,720
You cannot possibly use more than one core to calculate the MD5 hash of a large file because of the very nature of MD5: it expects a message to be broken up into chunks and fed into the hashing function in strict sequence. However, you can use one thread to read the file into an internal queue, and then calculate the hash in a separate thread, so that reading and hashing overlap. I do not think, though, that this will give you any significant performance boost. The fact that it takes so long to process a big file might be due to "unbuffered" reads. Try reading, say, 16 KB at a time and then feeding the content in chunks to the hashing function.
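A minimal sketch of that buffered, incremental approach (the chunk size and filename are illustrative):

import hashlib

def file_md5(path, chunk_size=64 * 1024):
    """Hash a file incrementally so it never has to fit in memory."""
    h = hashlib.md5()
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)   # read modest, power-of-two sized blocks
            if not chunk:
                break
            h.update(chunk)              # chunks must be fed in strict order
    return h.hexdigest()

print(file_md5('big_asset.bin'))         # illustrative filename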
0
17,263
false
0
1
the fastest way to create checksum for large files in python
1,532,764
5
6
0
3
6
1
0.099668
0
I need to transfer large files across the network and need to create checksums for them on an hourly basis, so the speed of generating the checksum is critical for me. Somehow I can't make zlib.crc32 and zlib.adler32 work with files larger than 4GB on a Windows XP Pro 64bit machine. I suspect I've hit the 32bit limitation here? Using hashlib.md5 I could get a result, but the problem is the speed: it takes roughly 5 minutes to generate an md5 for a 4.8GB file, and task manager shows that the process is using one core only. My questions are: is there a way to make crc work on large files? I prefer to use crc over md5. If not, then is there a way to speed up md5.hexdigest()/md5.digest(), or in this case any hashlib hexdigest/digest? Maybe splitting it into a multi-threaded process? How do I do that? PS: I'm working on something similar to an "Asset Management" system, kind of like svn, but the assets consist of large compressed image files. The files have tiny incremental changes. The hashing/checksum is needed for detecting changes and error detection.
0
python,multithreading,md5,crc32,hashlib
2009-10-07T16:28:00.000
0
1,532,720
First, there is nothing inherent in any of the CRC algorithms that would prevent them working on an arbitrary length of data (however, a particular implementation might well impose a limit). However, in a file syncing application, that probably doesn't matter, as you may not want to hash the entire file when it gets large, just chunks anyway. If you hash the entire file, and the hashes at each end differ, you have to copy the entire file. If you hash fixed-sized chunks, then you only have to copy the chunks whose hash has changed. If most of the changes to the files are localized (e.g. database) then this will likely require much less copying (and it's easier to spread per-chunk calculations across multiple cores). As for the hash algorithm itself, the basic tradeoff is speed vs. lack of collisions (two different data chunks yielding the same hash). CRC-32 is fast, but with only 2^32 unique values, collisions may be seen. MD5 is much slower, but has 2^128 unique values, so collisions will almost never be seen (but are still theoretically possible). The larger hashes (SHA1, SHA256, ...) have even more unique values, but are slower still; I doubt you need them: you're worried about accidental collisions, unlike digital signature applications, where you're worried about deliberately (maliciously) engineered collisions. It sounds like you're trying to do something very similar to what the rsync utility does. Can you just use rsync?
0
17,263
false
0
1
the fastest way to create checksum for large files in python
1,533,255
5
6
0
2
6
1
0.066568
0
I need to transfer large files across the network and need to create checksums for them on an hourly basis, so the speed of generating the checksum is critical for me. Somehow I can't make zlib.crc32 and zlib.adler32 work with files larger than 4GB on a Windows XP Pro 64bit machine. I suspect I've hit the 32bit limitation here? Using hashlib.md5 I could get a result, but the problem is the speed: it takes roughly 5 minutes to generate an md5 for a 4.8GB file, and task manager shows that the process is using one core only. My questions are: is there a way to make crc work on large files? I prefer to use crc over md5. If not, then is there a way to speed up md5.hexdigest()/md5.digest(), or in this case any hashlib hexdigest/digest? Maybe splitting it into a multi-threaded process? How do I do that? PS: I'm working on something similar to an "Asset Management" system, kind of like svn, but the assets consist of large compressed image files. The files have tiny incremental changes. The hashing/checksum is needed for detecting changes and error detection.
0
python,multithreading,md5,crc32,hashlib
2009-10-07T16:28:00.000
0
1,532,720
You might be hitting a size limit for files in XP. The 64-bit edition gives you more address space (removing the 2GB (or so) per-application limit), but probably does nothing for the file size problem.
0
17,263
false
0
1
the fastest way to create checksum for large files in python
1,540,992
5
6
0
1
6
1
0.033321
0
I need to transfer large files across the network and need to create checksums for them on an hourly basis, so the speed of generating the checksum is critical for me. Somehow I can't make zlib.crc32 and zlib.adler32 work with files larger than 4GB on a Windows XP Pro 64bit machine. I suspect I've hit the 32bit limitation here? Using hashlib.md5 I could get a result, but the problem is the speed: it takes roughly 5 minutes to generate an md5 for a 4.8GB file, and task manager shows that the process is using one core only. My questions are: is there a way to make crc work on large files? I prefer to use crc over md5. If not, then is there a way to speed up md5.hexdigest()/md5.digest(), or in this case any hashlib hexdigest/digest? Maybe splitting it into a multi-threaded process? How do I do that? PS: I'm working on something similar to an "Asset Management" system, kind of like svn, but the assets consist of large compressed image files. The files have tiny incremental changes. The hashing/checksum is needed for detecting changes and error detection.
0
python,multithreading,md5,crc32,hashlib
2009-10-07T16:28:00.000
0
1,532,720
md5 itself can't be run in parallel. However, you can md5 the file in sections (in parallel) and then take an md5 of the list of hashes. However, that assumes that the hashing is not I/O-limited, which I would suspect it is. As Anton Gogolev suggests, make sure that you're reading the file efficiently (in large power-of-2 chunks). Once you've done that, make sure the file isn't fragmented. Also, a hash such as sha256 should be selected rather than md5 for new projects. Are the zlib checksums much faster than md5 for 4GB files?
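A rough sketch of that section-hashing idea (the section size, filename and use of multiprocessing are assumptions; the combined digest is only comparable to another digest computed the same way, and a real tool would read each section in smaller blocks):

import hashlib
import multiprocessing
import os

PATH = 'big_asset.bin'                    # illustrative filename
SECTION = 256 * 1024 * 1024               # 256 MB sections, illustrative

def hash_section(offset):
    h = hashlib.md5()
    with open(PATH, 'rb') as f:
        f.seek(offset)
        h.update(f.read(SECTION))         # hash one fixed-size section
    return h.hexdigest()

if __name__ == '__main__':
    offsets = range(0, os.path.getsize(PATH), SECTION)
    pool = multiprocessing.Pool()
    section_hashes = pool.map(hash_section, offsets)                    # one section per worker
    combined = hashlib.md5(''.join(section_hashes).encode('ascii')).hexdigest()   # hash of the hashes
    print(combined)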
0
17,263
false
0
1
the fastest way to create checksum for large files in python
1,532,779
9
10
1
0
2
1
0
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
As someone familiar with C# and .NET you should consider IronPython. Python for .NET. This would be a good way to leverage what you know and learn a new dynamic language at the same time.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,470
9
10
1
7
2
1
1
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
I Am a little scared of C++ because developing anything in it takes ages. I'm not sure how you can say that when you say yourself that you have no experience in the language. C++ is a good tool for some things, Python is good for other things. What you want to do should be driving this decision, not the technology in and of itself. C# programmer or not, I would assume that you can pick up any language, but a language is just a tool, so your question is difficult to answer.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,460
9
10
1
5
2
1
0.099668
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
Python may be easier to get started with, but a dynamically typed scripting language is a very different language from C# or C++. You will learn more about programming learning it than you will by hopping to a close cousin of a language you already know. Really, solid familiarity with at least one scripting language (Python, Perl and Ruby are the favorites) should be a requirement for all programmers.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,467
9
10
1
2
2
1
1.2
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
C# is a little closer to Java and C++ than it is to Python, so learn Python first out of the two. However, my advice would be: Stick with your current language and learn more techniques, such as a wider range of algorithms, functional programming, design by contract, unit testing, OOAD, etc. learn C (focus on figuring out pointers, multi-dimensional arrays, data structures like linked lists, and resource management like memory allocation/deallocation, file handles, etc) learn Assembly (on a modern platform with a flat memory architecture, but doing low-level stuff like talking to hardware or drawing on a canvas) learn Python or Ruby. Chances are, you'll stick with one of these for a while, knowing all of the above, unless some hot new language has come along by then.
0
1,673
true
0
1
C++ or Python for C# programmer?
1,534,729
9
10
1
3
2
1
0.059928
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
If you want to apply to Google then Python might be the one to go for; MS will surely like the C# you already have. If nothing else, the competition would not be as fierce, since there are far more people out there with many years of C++ experience. Also, Python gives you broader language skills and would be a good path to more languages and scripting. But, as has been said and will be said again, choose your tool wisely and see whether it's a nail or a screw you're trying to secure.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,483
9
10
1
0
2
1
0
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
You might be interested in looking at Windows Powershell. It's the latest scripting technology from Microsoft, built on .NET, and can be extended via C#. Granted, it's not as portable as C++ or Python, but it would leverage your C#/.NET experience more readily. Otherwise, I would suggest C++ (and possibly C). Microsoft builds a lot more of its products with C/C++ than with Python.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,589
9
10
1
1
2
1
0.019997
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
C++ is usually used when speed and low-level OS access are involved. It's a good skill to have if you want to expand. Python allows you to do things quickly, it's quite easy to learn, and it provides more power than you'd expect from a scripting language; it's probably one of the fastest ones out there. C++ isn't exactly slow to develop in; if you've got an IDE it's not hard to write per se, but the syntax is going to get you.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,645
9
10
1
1
2
1
0.019997
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
If you want to apply to Google and/ or Microsoft then I'd say that of the two you need both! Given more choice, probably C++ and one other language - either dynamic, functional, or both (Scala might be a good choice too). It's not necessarily about whether you'd use the languages themselves but more about the different approaches they require and encourage. If you continue to be "scared" by C++ you're probably going to struggle applying as a dev at either of those organisations - unless you are highly specialised elsewhere.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,651
9
10
1
1
2
1
0.019997
0
I am a corporate C# programmer. I found some time to invest in myself and stumbled upon a dilemma. Where do I go from here? C#/.NET is easy to learn, develop for, etc. In the future I would want to apply to Microsoft or Google, and want to invest my spare time wisely, so what I learn will flourish in the future. So: Python or C++ for a C# programmer? I am a little scared of C++ because developing anything in it takes ages. Python is easy, but I take it as a child's-play language, which still needs lots of patching to be a mature development tool/language. Any C# developers having the same dilemma?
0
c#,c++,python
2009-10-07T21:52:00.000
0
1,534,450
Why not learn some of each. Studying a language for a week or so won't make you an expert, but it will answer a lot of questions in your head and plant a seed for the future. It's important to not just read through exercises. Find some simple problems that can be programmed in a page or two at most and solve them with each language. That will help you to learn the strengths and weaknesses in the context of the way you think and how you solve problems.
0
1,673
false
0
1
C++ or Python for C# programmer?
1,534,556
4
9
0
1
7
0
0.022219
0
I want to write a hit counter script to keep track of hits on images on a website and the originating IPs. Impressions are upwards of hundreds of thousands per day, so the counters will be incremented many times a second. I'm looking for a simple, self-hosted method (php, python scripts, etc.). I was thinking of using MySQL to keep track of this, but I'm guessing there's a more efficient way. What are good methods of keeping counters?
0
php,python,mysql,tracking
2009-10-08T02:12:00.000
0
1,535,261
If accuracy is important, you can do it slightly slower with MySQL... create a HEAP / Memory table to store your counter values. These are in-memory tables that are blazingly fast. You can write the data into a normal table at intervals. Based on the App Engine ideas, you could use memcache as a temporary store for your counter. Incrementing a memcache counter is faster than using the MySQL heap tables (I think). Once every five or ten seconds, you could read the memcache counter and write that number into your DB.
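A rough sketch of the memcache idea, assuming the python-memcached client and a memcached instance on localhost (the key names and flush interval are illustrative):

import memcache

mc = memcache.Client(['127.0.0.1:11211'])

def record_hit(image_id):
    key = 'hits:%s' % image_id
    if mc.incr(key) is None:     # incr returns None when the key does not exist yet
        mc.add(key, 1)           # first hit (small race here; acceptable for a sketch)

def flush_counter(image_id):
    """Run every few seconds from a background job; returns the count to persist in MySQL."""
    return mc.get('hits:%s' % image_id) or 0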
0
6,857
false
0
1
How to write an efficient hit counter for websites
1,535,794
4
9
0
4
7
0
0.088656
0
I want to write a hit counter script to keep track of hits on images on a website and the originating IPs. Impressions are upwards of hundreds of thousands per day, so the counters will be incremented many times a second. I'm looking for a simple, self-hosted method (php, python scripts, etc.). I was thinking of using MySQL to keep track of this, but I'm guessing there's a more efficient way. What are good methods of keeping counters?
0
php,python,mysql,tracking
2009-10-08T02:12:00.000
0
1,535,261
You could take your webserver's access log (Apache: access.log) and evaluate it periodically (via a cronjob), provided you do not need to have the data at hand at the exact moment in time when someone visits your site. Usually the access.log is generated anyway and contains the requested resource as well as time, date and the user's IP. This way you do not have to route all traffic through a PHP script. Lean, mean counting machine.
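A rough sketch of such a cronjob in Python (the log path and field positions assume Apache's common/combined log format; adjust them to your own LogFormat):

from collections import defaultdict

hits = defaultdict(int)                                  # (image path, client IP) -> count

with open('/var/log/apache2/access.log') as log:         # assumed log location
    for line in log:
        parts = line.split()
        if len(parts) < 7:
            continue
        ip, path = parts[0], parts[6]                    # IP is field 1, request path is field 7
        if path.endswith(('.png', '.jpg', '.gif')):      # only count image hits
            hits[(path, ip)] += 1

for (path, ip), count in sorted(hits.items()):
    print('%s %s %d' % (path, ip, count))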
0
6,857
false
0
1
How to write an efficient hit counter for websites
1,536,049
4
9
0
7
7
0
1
0
I want to write a hit counter script to keep track of hits on images on a website and the originating IPs. Impressions are upwards of hundreds of thousands per day, so the counters will be incremented many times a second. I'm looking for a simple, self-hosted method (php, python scripts, etc.). I was thinking of using MySQL to keep track of this, but I'm guessing there's a more efficient way. What are good methods of keeping counters?
0
php,python,mysql,tracking
2009-10-08T02:12:00.000
0
1,535,261
A fascinating subject. Incrementing a counter, simple as it may be, just has to be a transaction... meaning, it can lock out the whole DB for longer than makes sense!-) It can easily be the bottleneck for the whole system. If you need rigorously exact counts but don't need them to be instantly up-to-date, my favorite approach is to append the countable information to a log (switching logs as often as needed for data freshness purposes). Once a log is closed (with thousands of countable events in it), a script can read it and update all that's needed in a single transaction -- maybe not intuitive, but much faster than thousands of single locks. Then there's extremely-fast counters that are only statistically accurate -- but since you don't say that such imprecision is acceptable, I'm not going to explain them in more depth.
0
6,857
false
0
1
How to write an efficient hit counter for websites
1,535,311
4
9
0
0
7
0
0
0
I want to write a hit counter script to keep track of hits on images on a website and the originating IPs. Impressions are upwards of hundreds of thousands per day, so the counters will be incremented many times a second. I'm looking for a simple, self-hosted method (php, python scripts, etc.). I was thinking of using MySQL to keep track of this, but I'm guessing there's a more efficient way. What are good methods of keeping counters?
0
php,python,mysql,tracking
2009-10-08T02:12:00.000
0
1,535,261
I've done something very similar, on a similar scale (multiple servers, hundreds of domains, several thousand hits per hour) and log file analysis was definitely the way to go. (It also checked hit rates, weighted them by file type, and blacklisted IP addresses at the firewall if they were making too many requests; its intended purpose was to auto-block bad bots, not to just be a counter, but counting was an essential piece of it.) No performance impact on the web server process itself, since it's not doing any additional work there, and you could easily publish periodically-updated hit counts by injecting them into the site's database every minute/5 minutes/100 hits/whatever without having to lock the relevant row/table/database (depending on the locking mechanism in use) on every hit.
0
6,857
false
0
1
How to write an efficient hit counter for websites
1,900,337
1
3
0
2
2
0
1.2
0
I'm starting a new webapp project in Python to get into the Agile mind-set and I'd like to do things "properly" with regards to deployment. However, I'm finding the whole virtualenv/fabric/zc.buildout/etc stuff a little confusing - I'm used to just FTP'ing PHP files to a server and pointing a webserver at it. After deployment the server set-up would look something like: Nginx --proxy-to--> WSGI Webserver (Spawning) --> WSGI Middleware --> WSGI App (probably MNML or similar) with the python webserver being managed by supervisord. What sort of deployment set-up/packages/apps should I be looking into? And is there a specific directory structure I need to stick to with my app to ease deployment?
0
python,deployment,virtualenv
2009-10-08T11:40:00.000
0
1,537,298
You already mentioned buildout, and it's all you need. Google for example buildouts for the different parts. It takes a while to set it up the first time, but then you can reuse the setup between different projects too. Let supervisord start everything, not just the Python server. Then start supervisord at reboot, either from cron or init.d.
0
1,087
true
1
1
What do I need to know/learn for automated python deployment?
1,537,585
2
3
0
0
0
0
0
1
I have a server that has to respond to HTTP and XML-RPC requests. Right now I have an instance of SimpleXMLRPCServer, and an instance of BaseHTTPServer.HTTPServer with a custom request handler, running on different ports. I'd like to run both services on a single port. I think it should be possible to modify the CGIXMLRPCRequestHandler class to also serve custom HTTP requests on some paths, or alternately, to use multiple request handlers based on what path is requested. I'm not really sure what the cleanest way to do this would be, though.
0
python,http,xml-rpc
2009-10-08T19:45:00.000
0
1,540,011
Is there a reason not to run a real webserver out front with URL rewrites to the two ports you are using now? It's going to make life much easier in the long run.
0
849
false
0
1
Python HTTP server with XML-RPC
1,540,053
2
3
0
0
0
0
1.2
1
I have a server that has to respond to HTTP and XML-RPC requests. Right now I have an instance of SimpleXMLRPCServer, and an instance of BaseHTTPServer.HTTPServer with a custom request handler, running on different ports. I'd like to run both services on a single port. I think it should be possible to modify the CGIXMLRPCRequestHandler class to also serve custom HTTP requests on some paths, or alternately, to use multiple request handlers based on what path is requested. I'm not really sure what the cleanest way to do this would be, though.
0
python,http,xml-rpc
2009-10-08T19:45:00.000
0
1,540,011
Use the SimpleXMLRPCDispatcher class directly from your own request handler.
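A rough sketch of what that can look like, using the Python 2-era module names from the question (the /RPC2 path, port and example function are illustrative; _marshaled_dispatch is technically a private helper, but it is the hook SimpleXMLRPCServer itself uses):

import BaseHTTPServer
from SimpleXMLRPCServer import SimpleXMLRPCDispatcher

dispatcher = SimpleXMLRPCDispatcher(allow_none=False, encoding=None)
dispatcher.register_function(lambda x, y: x + y, 'add')      # illustrative RPC method

class CombinedHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path == '/RPC2':                             # XML-RPC requests go here
            body = self.rfile.read(int(self.headers['Content-Length']))
            response = dispatcher._marshaled_dispatch(body)
            self.send_response(200)
            self.send_header('Content-Type', 'text/xml')
            self.send_header('Content-Length', str(len(response)))
            self.end_headers()
            self.wfile.write(response)
        else:
            self.send_error(404)

    def do_GET(self):                                        # ordinary HTTP on every other path
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.end_headers()
        self.wfile.write('<html><body>hello</body></html>')

BaseHTTPServer.HTTPServer(('', 8000), CombinedHandler).serve_forever()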
0
849
true
0
1
Python HTTP server with XML-RPC
1,543,370
3
8
0
7
68
0
1
1
If yes are there any frameworks/Tutorials/tips/etc recommended? N00b at Python but I have tons of PHP experience and wanted to expand my skill set. I know Python is great at server side execution, just wanted to know about client side as well.
0
python,client-side
2009-10-08T20:27:00.000
0
1,540,214
Silverlight can run IronPython, so you can make Silverlight applications. Which is client-side.
0
54,388
false
0
1
Can Python be used for client side web development?
1,540,379
3
8
0
-1
68
0
-0.024995
1
If yes are there any frameworks/Tutorials/tips/etc recommended? N00b at Python but I have tons of PHP experience and wanted to expand my skill set. I know Python is great at server side execution, just wanted to know about client side as well.
0
python,client-side
2009-10-08T20:27:00.000
0
1,540,214
No. Browsers don't run Python.
0
54,388
false
0
1
Can Python be used for client side web development?
1,540,233
3
8
0
3
68
0
0.07486
1
If yes are there any frameworks/Tutorials/tips/etc recommended? N00b at Python but I have tons of PHP experience and wanted to expand my skill set. I know Python is great at server side execution, just wanted to know about client side as well.
0
python,client-side
2009-10-08T20:27:00.000
0
1,540,214
On Windows, any language that registers for the Windows Scripting Host can run in IE. At least the ActiveState version of Python could do that; I seem to recall that has been superseded by a more official version these days. But that solution requires the user to install a python interpreter and run some script or .reg file to put the correct "magic" into the registry for the hooks to work.
0
54,388
false
0
1
Can Python be used for client side web development?
7,437,506
1
3
0
1
8
0
0.066568
0
I'm writing a small Python script to notify me when a certain condition is met. I used smtplib, which does the emailing for me, but I also want the script to call my cell phone as well. I can't find a free library for phone calling. Does anyone know of any?
0
python,phone-call
2009-10-09T15:38:00.000
0
1,544,550
I've used Skype4Py very successfully. Keep in mind though it does require Skype to be installed and costs the standard rate for SkypeOut.
0
7,113
false
0
1
Is there a free python library for phone calling?
1,545,365
1
2
0
1
1
0
0.099668
1
I need to implement a small test utility which consumes extremely simple SOAP XML (HTTP POST) messages. This is a protocol which I have to support, and it's not my design decision to use SOAP (just trying to prevent those "why do you use protocol X?" answers) I'd like to use stuff that's already in the basic python 2.6.x installation. What's the easiest way to do that? The sole SOAP message is really simple, I'd rather not use any enterprisey tools like WSDL class generation if possible. I already implemented the same functionality earlier in Ruby with just plain HTTPServlet::AbstractServlet and REXML parser. Worked fine. I thought I could a similar solution in Python with BaseHTTPServer, BaseHTTPRequestHandler and the elementree parser, but it's not obvious to me how I can read the contents of my incoming SOAP POST message. The documentation is not that great IMHO.
0
python,http,soap
2009-10-10T09:49:00.000
0
1,547,520
I wrote something like this in Boo, using a .Net HTTPListener, because I too had to implement someone else's defined WSDL. The WSDL I was given used document/literal form (you'll need to make some adjustments to this information if your WSDL uses rpc/encoded). I wrapped the HTTPListener in a class that allowed client code to register callbacks by SOAP action, and then gave that class a Start method that would kick off the HTTPListener. You should be able to do something very similar in Python, with a do_POST() method on a BaseHTTPRequestHandler to: extract the SOAP action from the HTTP headers; use elementtree to extract the SOAP header and SOAP body from the POSTed HTTP request; call the defined callback for the SOAP action, sending these extracted values; return the response text given by the callback in a corresponding SOAP envelope; if the callback raises an exception, catch it and re-wrap it as a SOAP fault. Then you just implement a callback per SOAP action, which gets the XML content passed to it, parses this with elementtree, performs the desired action (or a mock action if this is a tester), and constructs the necessary response XML (I was not too proud to just create this explicitly using string interpolation, but you could use elementtree to create it by serializing a Python response object). It will help if you can get some real SOAP sample messages in order to help you not tear out your hair, especially in the part where you create the necessary response XML.
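A bare-bones sketch of that kind of handler, again with Python 2-era module names (the SOAP namespace, the callback registry and the response envelope are illustrative and would need to match your actual WSDL):

import BaseHTTPServer
from xml.etree import ElementTree

SOAP_ENV = '{http://schemas.xmlsoap.org/soap/envelope/}'
callbacks = {}                                    # SOAPAction -> function(body_element) -> xml string

class SoapHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_POST(self):
        action = self.headers.get('SOAPAction', '').strip('"')
        data = self.rfile.read(int(self.headers['Content-Length']))
        envelope = ElementTree.fromstring(data)
        body = envelope.find(SOAP_ENV + 'Body')
        result_xml = callbacks[action](body)      # the registered callback builds the inner XML
        reply = ('<?xml version="1.0"?>'
                 '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
                 '<soap:Body>%s</soap:Body></soap:Envelope>' % result_xml)
        self.send_response(200)
        self.send_header('Content-Type', 'text/xml')
        self.end_headers()
        self.wfile.write(reply)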
0
515
false
0
1
A minimalist, non-enterprisey approach for a SOAP server in Python
1,547,642
1
6
0
1
14
0
0.033321
1
I have two websites, one in PHP and one in Python. When a user sends a request to the server, I need PHP/Python to send an HTTP POST request to a remote server. I want to reply to the user immediately, without waiting for a response from the remote server. Is it possible to continue running a PHP/Python script after sending a response to the user? In that case I'd first reply to the user and only then send the HTTP POST request to the remote server. Is it possible to create a non-blocking HTTP client in PHP/Python without handling the response at all? A solution with the same logic in both PHP and Python would be preferable for me. Thanks.
0
php,python,nonblocking
2009-10-12T16:22:00.000
0
1,555,517
What you need to do is have the PHP script launch a separate script that performs the remote server call in the background, so the main script can send its response to the user right away.
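On the Python side, a self-contained sketch of the same idea, doing the slow POST from a background thread after the user's response has been sent; the URL and payload are placeholders:

```python
# Fire-and-forget POST: the user's response is not held up by the remote server.
import threading
import urllib
import urllib2

def post_in_background(url, params):
    data = urllib.urlencode(params)

    def worker():
        try:
            urllib2.urlopen(url, data, timeout=10)  # response is ignored on purpose
        except Exception:
            pass  # or log it; the user has already been answered

    t = threading.Thread(target=worker)
    t.daemon = True   # don't keep the process alive just for this call
    t.start()

# ... send your normal response to the user first, then:
post_in_background('http://remote.example.com/hook', {'status': 'done'})
```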
0
12,400
false
0
1
sending a non-blocking HTTP POST request
1,555,614
3
8
0
2
1
0
0.049958
0
We have a sizable code base in Perl. For the foreseeable future, our codebase will remain in Perl. However, we're looking into adding a GUI-based dashboard utility. We are considering writing the dashboard in Python (using tkinter or wx). The problem, however, is that we would like to leverage our existing Perl codebase in the Python GUI. So... any suggestions on how to achieve this? We are considering a few options: Write executables (in Perl) that mimic function calls; invoke those Perl executables from Python as system calls. Write Perl executables on-the-fly inside the Python dashboard, and invoke the (temporary) Perl executable. Find some kind of Perl-to-Python converter or binding. Any other ideas? I'd love to hear if other people have confronted this problem. Unfortunately, it's not an option to convert the codebase itself to Python at this time.
0
python,perl
2009-10-12T20:22:00.000
1
1,556,668
Well, if you really want to write the GUI in another language (which, seriously, is just a bad idea, since it will cost you more than it could ever benefit you), the thing you should do is the following: Document your Perl app in terms of the services it provides, using XML Schema Definition (XSD) for the data types and Web Service Description Language (WSDL) for the actual service. Implement the services in Perl, possibly using Catalyst::Controller::SOAP, or just XML::Compile::SOAP. Consume the services from your whatever-language GUI interface. Profit. But honestly, I really suggest taking a look at the Perl GTK2 binding; it is awesome, including features such as implementing a Gtk class entirely in Perl and using it as an argument to a function written in C; for instance, you can write a model class for a Gtk tree entirely in Perl.
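For the "consume the services" step, a rough sketch of what the Python GUI side can do with just the standard library; the endpoint URL, namespace, and operation name are placeholders for whatever the Perl SOAP layer ends up exposing:

```python
# Hand-rolled SOAP call from the dashboard; fine when the messages are simple.
import urllib2

ENDPOINT = 'http://localhost:3000/soap'   # hypothetical Catalyst/Perl endpoint
ENVELOPE = ('<?xml version="1.0"?>'
            '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
            '<soap:Body><GetDashboardData xmlns="urn:example:dashboard"/></soap:Body>'
            '</soap:Envelope>')

request = urllib2.Request(ENDPOINT, ENVELOPE, {
    'Content-Type': 'text/xml; charset=utf-8',
    'SOAPAction': '"urn:example:dashboard#GetDashboardData"',
})
print(urllib2.urlopen(request).read())
```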
0
406
false
0
1
Recommendations for perl-to-python interoperation?
1,557,216
3
8
0
1
1
0
0.024995
0
We have a sizable code base in Perl. For the foreseeable future, our codebase will remain in Perl. However, we're looking into adding a GUI-based dashboard utility. We are considering writing the dashboard in Python (using tkinter or wx). The problem, however, is that we would like to leverage our existing Perl codebase in the Python GUI. So... any suggestions on how to achieve this? We are considering a few options: Write executables (in Perl) that mimic function calls; invoke those Perl executables from Python as system calls. Write Perl executables on-the-fly inside the Python dashboard, and invoke the (temporary) Perl executable. Find some kind of Perl-to-Python converter or binding. Any other ideas? I'd love to hear if other people have confronted this problem. Unfortunately, it's not an option to convert the codebase itself to Python at this time.
0
python,perl
2009-10-12T20:22:00.000
1
1,556,668
Interesting project: I would opt for loose coupling and consider an XML-RPC or JSON-based approach.
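A sketch of what that looks like from the Python GUI side, assuming the Perl code base is wrapped in an XML-RPC endpoint (for example with RPC::XML); the URL and the dashboard.summary method are placeholders:

```python
# Loose coupling over XML-RPC: the GUI only knows the endpoint, not the Perl internals.
import xmlrpclib

perl_backend = xmlrpclib.ServerProxy('http://localhost:8080/RPC2')
summary = perl_backend.dashboard.summary()   # hypothetical remote method
print(summary)
```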
0
406
false
0
1
Recommendations for perl-to-python interoperation?
1,560,979
3
8
0
7
1
0
1
0
We have a sizable code base in Perl. For the foreseeable future, our codebase will remain in Perl. However, we're looking into adding a GUI-based dashboard utility. We are considering writing the dashboard in Python (using tkinter or wx). The problem, however, is that we would like to leverage our existing Perl codebase in the Python GUI. So... any suggestions on how to achieve this? We are considering a few options: Write executables (in Perl) that mimic function calls; invoke those Perl executables from Python as system calls. Write Perl executables on-the-fly inside the Python dashboard, and invoke the (temporary) Perl executable. Find some kind of Perl-to-Python converter or binding. Any other ideas? I'd love to hear if other people have confronted this problem. Unfortunately, it's not an option to convert the codebase itself to Python at this time.
0
python,perl
2009-10-12T20:22:00.000
1
1,556,668
I hate to be another one in the chorus, but... Avoid the use of an alternate language. Use Wx so its native look and feel makes the application look "real" to non-technical audiences. Download the Padre source code and see how it does Wx Perl code, then steal rampantly from its best tricks, or maybe just gut it and use the application skeleton (using the Artistic half of the Perl dual license to make it legal). Build your own Strawberry Perl subclass to package the application as an MSI installer and push it out across the corporate Active Directory domain. Of course, I only say all this because you said "Dashboard", which I read as "Corporate", which then makes me assume a Microsoft AD network...
0
406
false
0
1
Recommendations for perl-to-python interoperation?
1,557,825
1
2
0
17
11
1
1
0
Usually I tend to install things via the package manager for unixy stuff. However, when I programmed a lot of Perl, I would use CPAN for newer versions and all that. In general, I used to install system stuff via the package manager and language stuff via its own package manager (gem / easy_install|pip / cpan). Now that I'm using Python primarily, I am wondering what best practice is.
0
python,setuptools,distutils,pip
2009-10-13T10:25:00.000
0
1,559,372
There are two completely opposing camps: one in favor of system-provided packages, and one in favor of separate installation. I'm personally in the "system packages" camp. I'll provide arguments from each side below. Pro system packages: the system packager already takes care of dependencies and compliance with overall system policies (such as file layout). System packages provide security updates while still caring about not breaking compatibility, so they sometimes backport security fixes that the upstream authors did not backport. System packages are "safe" with respect to system upgrades: after a system upgrade, you probably also have a new Python version, but all your Python modules are still there if they come from a system packager. That's all personal experience with Debian. Con system packages: not all software may be provided as a system package, or not in the latest version; installing stuff yourself into the system may break system packages. Upgrades may break your application. Pro separate installation: Some people (in particular web application developers) argue that you absolutely need a repeatable setup, with just the packages you want, and completely decoupled from the system Python. This goes beyond self-installed vs. system packages, since even for self-installed you might still modify the system Python; with the separate installation, you won't. As Lennart discusses, there are now dedicated tool chains to support this setup. People argue that only this approach can guarantee repeatable results. Con separate installation: you need to deal with bug fixes yourself, and you need to make sure all your users use the separate installation. In the case of web applications, the latter is typically easy to achieve.
0
1,252
false
0
1
Which is the most pythonic: installing python modules via a package manager ( macports, apt) or via pip/easy_install/setuptools
1,559,521
1
6
0
1
4
0
0.033321
0
Right now it's a Gmail box, but sooner or later I want it to scale. I want to sync a copy of a live personal mailbox (inbox and outbox) somewhere else, but I don't want to affect the unread state of any unread messages. What type of access will make this easiest? I can't find any information on whether IMAP will affect the read state, though it appears I can manually reset a message to unread. POP by definition doesn't affect unread state, but nobody seems to use POP to access their Gmail; why is that?
0
python,gmail,imap,pop3,imaplib
2009-10-14T04:27:00.000
0
1,564,237
Nobody uses POP because typically they want the extra functionality of IMAP, such as tracking message state. When that functionality is only getting in your way and needs workarounds, I think using POP's your best bet!-)
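A minimal sketch with the standard library's poplib, which never touches IMAP flags; the credentials are placeholders, and for Gmail you have to enable POP access in the account settings first:

```python
# Read messages over POP3 so the IMAP/webmail unread state is left alone.
import poplib
from email import message_from_string

conn = poplib.POP3_SSL('pop.gmail.com', 995)
conn.user('user@example.com')   # placeholder credentials
conn.pass_('app-password')

count, _mbox_size = conn.stat()
for i in range(1, count + 1):
    _resp, lines, _octets = conn.retr(i)
    msg = message_from_string('\n'.join(lines))
    print(msg['Subject'])

conn.quit()
```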
0
5,515
false
0
1
get email unread content, without affecting unread state
1,564,294
1
2
0
1
1
0
0.099668
1
Standard libraries (xmlrpclib+SimpleXMLRPCServer in Python 2 and xmlrpc.server in Python 3) report all errors (including usage errors) as Python exceptions, which is not suitable for public services: exception strings are often not easily understandable without Python knowledge and might expose sensitive information. It's not hard to fix this, but I'd prefer to avoid reinventing the wheel. Is there a third-party library with better error reporting? I'm interested in good fault messages for all usage errors, and in hiding internals when reporting internal errors (the details are better handled with logging). xmlrpclib already has the constants for such errors: NOT_WELLFORMED_ERROR, UNSUPPORTED_ENCODING, INVALID_ENCODING_CHAR, INVALID_XMLRPC, METHOD_NOT_FOUND, INVALID_METHOD_PARAMS, INTERNAL_ERROR.
0
python,xml-rpc
2009-10-15T10:50:00.000
0
1,571,598
I don't think you have a library-specific problem. When using any library or framework, you typically want to trap all errors, log them somewhere, and throw up "Oops, we're having problems. You may want to contact us at [email protected] with error number 100 and tell us what you did." So wrap your fallible entry points in try/except blocks, create a generic logger, and off you go...
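If you do end up rolling it yourself, here is a sketch of that wrap-and-log pattern on top of the standard SimpleXMLRPCServer; the divide function and the log file name are just placeholders:

```python
# Internal errors are logged server-side; clients only ever see a generic Fault.
import functools
import logging
import xmlrpclib
from SimpleXMLRPCServer import SimpleXMLRPCServer

logging.basicConfig(filename='rpc-errors.log', level=logging.ERROR)

def public(func):
    @functools.wraps(func)
    def wrapper(*args):
        try:
            return func(*args)
        except xmlrpclib.Fault:
            raise                    # deliberate faults go to the client unchanged
        except Exception:
            logging.exception('unhandled error in %s', func.__name__)
            raise xmlrpclib.Fault(xmlrpclib.INTERNAL_ERROR,
                                  'Internal error; the details have been logged.')
    return wrapper

@public
def divide(a, b):
    return a / b

server = SimpleXMLRPCServer(('localhost', 8000))
server.register_function(divide)     # registered under the wrapped name 'divide'
server.serve_forever()
```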
0
1,435
false
0
1
XML-RPC server with better error reporting
1,608,160