Charming Python: Decorators make magic easy

Doing a lot by doing very little

Decorators have something in common with previous metaprogramming abstractions introduced to Python: they do not actually do anything you could not do without them. As Michele Simionato and I pointed out in earlier Charming Python installments, it was possible even in Python 1.5 to manipulate Python class creation without the "metaclass" hook. Decorators are similar in their ultimate banality. All a decorator does is modify the function or method that is defined immediately after the decorator.

Listing 1.

    class C:
        def foo(cls, y):
            print "classmethod", cls, y
        foo = classmethod(foo)

Though classmethod() is a built-in, there is nothing unique about it; you could also have "rolled your own" method-transforming function.

Listing 2.

    def enhanced(meth):
        def new(self, y):
            print "I am enhanced"
            return meth(self, y)
        return new

    class C:
        def bar(self, x):
            print "some method says:", x
        bar = enhanced(bar)
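For comparison, here is a small illustration (added here, not one of the article's listings) of how the decorator syntax introduced in Python 2.4 expresses the same rebindings as Listings 1 and 2; it reuses the enhanced() function defined in Listing 2:

    # Equivalent to Listings 1 and 2, but with "@" syntax instead of
    # rebinding the name after the definition. enhanced() is the
    # method-transforming function from Listing 2.
    class C:
        @classmethod
        def foo(cls, y):
            print "classmethod", cls, y

        @enhanced
        def bar(self, x):
            print "some method says:", x

    C.foo(42)           # prints "classmethod", the class, and 42
    C().bar("hello")    # prints "I am enhanced", then "some method says: hello"

The @ line and the explicit rebinding do exactly the same thing; the decorator form simply puts the transformation next to the signature instead of after the body.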

Charming Python: Distributing computing with RPyC

Back in 2002, I wrote a series of articles on "distributing computing" (see Resources). At that time, RPyC did not exist, but I covered the Python tool Pyro and the library xmlrpclib. Surprisingly little has changed in the last seven years; the space RPyC steps into is very much the one already occupied by Pyro and XML-RPC. In a nutshell, RPyC 3.0+ has two modes: a "classic mode," which was already available prior to version 3, and a "services mode," which was introduced in version 3.

Background on distributing computing

In the paradigm of stand-alone personal computing, a user's workstation contains a number of resources that are used to run an application: disk storage for programs and data; a CPU; volatile memory; a video display monitor; a keyboard and pointing device; and perhaps peripheral I/O devices such as printers, scanners, sound systems, modems, and game inputs. Distributed computing parcels these resources out among machines connected over a network; in the end, what distributed computers share are sets of responsibilities.
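To make the two modes concrete, here is a rough sketch of my own (not code from the article), assuming RPyC 3.x is installed; the hostnames and the services-mode port are placeholders, while 18812 is RPyC's customary classic-mode default:

    import rpyc

    # Classic mode: the server hands over its whole interpreter.
    # Assumes a classic server (e.g., the bundled rpyc_classic.py
    # script) is already listening on "somehost".
    conn = rpyc.classic.connect("somehost")   # default port 18812
    remote_os = conn.modules.os               # the *server's* os module
    print(remote_os.getcwd())                 # working dir of the remote process
    conn.close()

    # Services mode: clients can reach only methods named exposed_*.
    class UpperService(rpyc.Service):
        def exposed_upper(self, text):        # callable by clients
            return text.upper()

        def internal_helper(self):            # invisible to clients
            return "hidden"

    # Server side:
    #     from rpyc.utils.server import ThreadedServer
    #     ThreadedServer(UpperService, port=18861).start()
    # Client side:
    #     conn = rpyc.connect("somehost", 18861)
    #     print(conn.root.upper("rpyc"))      # -> "RPYC"

The asymmetry is the point: classic mode trades away all isolation for convenience, while services mode whitelists exactly what a client may touch.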

Charming Python: Easy Web data collection with mechanize and Beautiful Soup

Writing scripts to interact with Web sites is possible with the basic Python modules, but you don't want to if you don't have to. The modules urllib and urllib2 in Python 2.x, along with the unified urllib.* subpackages in Python 3.0, do a passable job of fetching resources at the ends of URLs. However, when you want to do any sort of moderately sophisticated interaction with the contents you find at a Web page, you really need the mechanize library (see Resources for a download link). One of the big difficulties with automating Web scraping, or otherwise simulating user interaction with Web sites, is server use of cookies to track session progress; mechanize manages cookies transparently. Python's mechanize is inspired by Perl's WWW::Mechanize, which has a similar range of capabilities. A close friend of mechanize is the equally excellent library Beautiful Soup (see Resources for a download link).

A real-life example

I have used mechanize in several programming projects.
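As a taste of that workflow, here is a sketch of my own rather than the article's scraper; it assumes mechanize and the bs4 distribution of Beautiful Soup are installed, and the URL and the form field name "q" are hypothetical:

    import mechanize
    from bs4 import BeautifulSoup

    br = mechanize.Browser()
    br.set_handle_robots(False)           # many scrapers need this; scrape politely
    br.open("http://example.com/search")  # hypothetical search page

    br.select_form(nr=0)                  # first form on the page
    br["q"] = "charming python"           # assumed name of the query field
    response = br.submit()                # session cookies carried along automatically

    # Hand the result to Beautiful Soup to pick the page apart.
    soup = BeautifulSoup(response.read(), "html.parser")
    for link in soup.find_all("a"):
        print(link.get("href"))

The division of labor is typical: mechanize drives the stateful interaction (forms, cookies, navigation), and Beautiful Soup handles the messy parsing of whatever HTML comes back.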
