
Aleksandar Veselinovic

2012.

OVERVIEW

Introduction
Syntax
Types and objects
Operators and expressions
Structure and control flow
Functions and functional programming
Classes and OOP
Modules, packages, and distribution
Input and output
Execution environment
Testing, debugging, profiling, and tuning

ONLINE RESOURCES

PYTHON INTERPRETER

Implementations: CPython, Jython, IronPython (Python for .NET), PyPy.

Python is effectively defined by its implementation, and CPython is the
default Python. CPython is a bytecode interpreter. It has a foreign
function interface with several languages including C, for which one must
explicitly write bindings in a language other than Python.

DATA MODEL

Type hierarchy:

- None, NotImplemented, Ellipsis
- Numbers (all immutable)
  - numbers.Integral: plain integers, long integers, Booleans
  - numbers.Real: floats
  - numbers.Complex
- Sequences
  - Immutable: strings, Unicode, tuples
  - Mutable: lists, byte arrays
- Sets: set (mutable), frozen set (immutable)
- Mappings: dictionaries
- Callable: user defined functions, user defined methods, generator
  functions, built-in functions, built-in methods, class types, classic
  classes, class instances
- Modules
- Classes and class instances
- Files
- Internal types: code objects, frame objects, traceback objects, slice
  objects, static method objects, class method objects
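The mutable/immutable split above can be seen directly at the prompt; a minimal sketch (the variable names are illustrative):

```python
# Mutability in practice: lists accept item assignment, tuples reject
# any mutation, and frozenset has no mutating methods at all.
nums = [1, 2, 3]
nums[0] = 99  # fine: lists are mutable
try:
    t = (1, 2, 3)
    t[0] = 99
    tuple_mutable = True
except TypeError:
    tuple_mutable = False
frozen = frozenset([1, 2])
has_add = hasattr(frozen, 'add')  # False: frozenset cannot grow
```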

OPERATIONS

Integer types: int (32 bits or more, with the Boolean subtype) and long
(unlimited precision).

Numeric operations: x+y, x-y, x*y, x // y, x%y, -x, +x, abs(x), int(x),
long(x), complex(re, im), divmod(x, y), pow(x, y), x ** y, round(x[, n]),
z.conjugate(), z.real, z.imag, math.trunc(x), math.floor(x), math.ceil(x).

Integer (bitwise) operations: x|y, x^y, x&y, x << n, x >> n, ~x.

Sequence operations: x in s, x not in s, s+t, s * n / n * s, s[i], s[i:j],
s[i:j:k], len(s), min(s), max(s), s.index(i), s.count(i).

Additional methods:
- For floats only (numbers.Real): float.as_integer_ratio(),
  float.is_integer(), float.hex(), float.fromhex()
- numbers.Integral: int.bit_length(), long.bit_length()

COMPARISONS

Comparisons (all objects): <, <=, >, >=, ==, !=. Sequence types compare
element by element; class instances can customize comparison via
__cmp__(). Membership tests: in, not in. Object identity: is, is not.

Truth value testing: the following values are false:
- None
- False
- zero of any numeric type: 0, 0L, 0.0, 0j
- empty sequence: '', (), []
- empty mapping: {}
- user defined classes whose __nonzero__() or __len__() returns False or
  zero

Everything else is true. Boolean operations: and, or, not.

BUILT-IN FUNCTIONS

Refrain from using names that hide built-in functions. Common errors: id,
min, max.

If you are using Vim, add this to your .vimrc:


let python_highlight_builtins=1
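Shadowing in action: a small sketch of what goes wrong when a built-in name like id is rebound (the variable names are illustrative):

```python
# Rebinding "id" hides the built-in function in this scope.
id = 42
try:
    id(object())  # fails now: an int is not callable
    shadowed = False
except TypeError:
    shadowed = True
del id  # deleting the shadowing name makes the built-in visible again
works_again = callable(id)
```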

LISTS AND TUPLES


>>> a = [66.25, 333, 333, 1, 1234.5]
>>> print a.count(333), a.count(66.25), a.count('x')
2 1 0
>>> a.insert(2, -1)
>>> a.append(333)
>>> a
[66.25, 333, -1, 333, 1, 1234.5, 333]
>>> a.remove(333)
>>> a
[66.25, -1, 333, 1, 1234.5, 333]
>>> a.reverse()
>>> a
[333, 1234.5, 1, 333, -1, 66.25]
>>> a.sort()
>>> a
[-1, 1, 66.25, 333, 333, 1234.5]
>>> squares = []
>>> for x in range(10):
...     squares.append(x**2)
...
>>> squares
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
>>> squares = [x**2 for x in range(10)]
>>>
>>> [(x, y) for x in [1,2,3] for y in [3,1,4] if x != y]
[(1, 3), (1, 4), (2, 3), (2, 1), (2, 4), (3, 1), (3, 4)]
>>> combs = []
>>> for x in [1,2,3]:
...     for y in [3,1,4]:
...         if x != y:
...             combs.append((x, y))
...
>>> combs
[(1, 3), (1, 4), (2, 3), (2, 1), (2, 4), (3, 1), (3, 4)]

>>> t = 12345, 54321, 'hello!'


>>> t[0]
12345
>>> t
(12345, 54321, 'hello!')
>>> # Tuples may be nested:
... u = t, (1, 2, 3, 4, 5)
>>> u
((12345, 54321, 'hello!'), (1, 2, 3, 4, 5))
>>> # Tuples are immutable:
... t[0] = 88888
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'tuple' object does not support item assignment
>>> # but they can contain mutable objects:
... v = ([1, 2, 3], [3, 2, 1])
>>> v
([1, 2, 3], [3, 2, 1])

>>> import itertools


>>> import pprint
>>> pprint.pprint(list(itertools.permutations("spam")))


SETS
>>> basket = ['apple', 'orange', 'apple', 'pear', 'orange', 'banana']
>>> fruit = set(basket)  # create a set without duplicates
>>> fruit
set(['orange', 'pear', 'apple', 'banana'])
>>> 'orange' in fruit  # fast membership testing
True
>>> 'crabgrass' in fruit
False
>>> # Demonstrate set operations on unique letters from two words
...
>>> a = set('abracadabra')
>>> b = set('alacazam')
>>> a  # unique letters in a
set(['a', 'r', 'b', 'c', 'd'])
>>> a - b  # letters in a but not in b
set(['r', 'd', 'b'])
>>> a | b  # letters in either a or b
set(['a', 'c', 'r', 'd', 'b', 'm', 'z', 'l'])
>>> a & b  # letters in both a and b
set(['a', 'c'])
>>> a ^ b  # letters in a or b but not both
set(['r', 'd', 'b', 'm', 'z', 'l'])
>>> # Similarly to list comprehensions, set comprehensions are also supported:
>>> a = {x for x in 'abracadabra' if x not in 'abc'}
>>> a
set(['r', 'd'])

DICTIONARIES
>>> tel = {'jack': 4098, 'sape': 4139}
>>> tel['guido'] = 4127
>>> tel
{'sape': 4139, 'guido': 4127, 'jack': 4098}
>>> tel['jack']
4098
>>> del tel['sape']
>>> tel['irv'] = 4127
>>> tel
{'guido': 4127, 'irv': 4127, 'jack': 4098}
>>> tel.keys()
['guido', 'irv', 'jack']
>>> 'guido' in tel
True

>>> dict([('sape', 4139), ('guido', 4127), ('jack', 4098)])
{'sape': 4139, 'jack': 4098, 'guido': 4127}
>>> # Dictionary comprehension:
>>> {x: x**2 for x in (2, 4, 6)}
{2: 4, 4: 16, 6: 36}

Everything in Python is built with dictionaries: class properties,
methods, imports...

If order is important, there is an ordered dictionary:
collections.OrderedDict.
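A minimal sketch of OrderedDict preserving insertion order (the keys here are illustrative):

```python
from collections import OrderedDict

d = OrderedDict()
d['first'] = 1
d['second'] = 2
d['third'] = 3
order = list(d.keys())  # keys come back in insertion order
```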

LOOPING TECHNIQUES
>>> for i, v in enumerate(['tic', 'tac', 'toe']):
...     print i, v
...
0 tic
1 tac
2 toe
>>> questions = ['name', 'quest', 'favorite color']
>>> answers = ['lancelot', 'the holy grail', 'blue']
>>> for q, a in zip(questions, answers):
...     print 'What is your {0}? It is {1}.'.format(q, a)
...
What is your name? It is lancelot.
What is your quest? It is the holy grail.
What is your favorite color? It is blue.
>>> for i in reversed(xrange(1,10,2)):
...     print i,
...
9 7 5 3 1
>>> basket = ['apple', 'orange', 'apple', 'pear', 'orange', 'banana']
>>> for f in sorted(set(basket)):
...     print f
...
apple
banana
orange
pear

GENERATORS

>>> l = [1, 2, 3, 4, 5]
>>> d = (str(x) for x in l if x % 2 == 0)
>>> d
<generator object <genexpr> at 0x106708410>
>>> tuple(d)
('2', '4')
>>> d
<generator object <genexpr> at 0x106708410>
>>> tuple(d)  # A generator can be consumed only once.
()

def countdown(n):
    print "Counting down from", n
    while n > 0:
        yield n
        n -= 1
    print "Done counting down"

for i in countdown(10):
    print i

# Sum up the bytes transferred in an Apache server log using
# generator expressions.
wwwlog = open("access-log")
bytecolumn = (line.rsplit(None, 1)[1] for line in wwwlog)
bytes = (int(x) for x in bytecolumn if x != '-')
print "Total", sum(bytes)
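The Apache-log pipeline above depends on a real access-log file; here is a self-contained sketch of the same technique over made-up log lines (the byte count is the last column, "-" for no body):

```python
# Same generator pipeline, but over in-memory sample lines instead of
# a real "access-log" file. Nothing is computed until sum() pulls.
log_lines = [
    '127.0.0.1 - - "GET / HTTP/1.0" 200 1024',
    '127.0.0.1 - - "GET /x HTTP/1.0" 404 -',
    '127.0.0.1 - - "GET /y HTTP/1.0" 200 2048',
]
bytecolumn = (line.rsplit(None, 1)[1] for line in log_lines)
byte_counts = (int(x) for x in bytecolumn if x != '-')
total = sum(byte_counts)
```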

COROUTINES

def grep(pattern):
    print "Looking for %s" % pattern
    while True:
        line = (yield)
        if pattern in line:
            print line,

g = grep("python")
g.next()  # Prime the coroutine: run it up to the first (yield).
g.send("Yeah, but no, but yeah, but no")
g.send("A series of tubes")
g.send("python generators rock!")
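Having to prime the coroutine before the first send() is easy to forget. A common convenience is a small decorator that primes it automatically; a sketch in Python 3 syntax (the names coroutine and accumulate are illustrative):

```python
def coroutine(func):
    """Advance a new coroutine to its first yield so callers can
    send() into it immediately."""
    def start(*args, **kwargs):
        g = func(*args, **kwargs)
        next(g)  # prime: run up to the first yield
        return g
    return start

@coroutine
def accumulate():
    total = 0
    while True:
        total += (yield total)  # receive a value, yield the running total

acc = accumulate()
results = [acc.send(1), acc.send(2), acc.send(3)]  # running totals
```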

NAMESPACES AND SCOPES

What is a namespace? A mapping from names to objects, implemented as
dictionaries. There is no relation between names in different modules.

Examples: built-in names, global names in a module, local names in a
function invocation.

Lifetime: the built-in namespace is created when Python starts (it is
also called the __builtin__ module); a module's global namespace is
created when the module definition is read in; a function's local
namespace is created when the function is called and deleted when the
function returns.

#!/usr/bin/python

# Lexical scoping.
def multiply_b_f(value):
    def multiply_by(x):
        return x * value
    return multiply_by

my_func = multiply_b_f(2)
value = 3
print my_func(10)
>>>
20
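Closures capture variables, not values, which leads to the classic late-binding surprise; a short sketch (names are illustrative):

```python
# All three lambdas share the one loop variable "i"; by the time any of
# them runs, i is 2.
late = [lambda: i for i in range(3)]
late_results = [f() for f in late]
# A default argument is evaluated at definition time, freezing the value.
early = [lambda i=i: i for i in range(3)]
early_results = [f() for f in early]
```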

What is a scope? A textual region of a Python program where a namespace
is directly accessible.

- Determined statically, used dynamically.
- Assignments to names always go into the innermost scope.
- The global scope of a function defined in a module is that module's
  namespace, no matter from where or by what alias the function is
  called.
- Class definitions place yet another namespace in the local scope.

During execution, there are at least three nested scopes:
1. the innermost scope, which contains local names
2. the scopes of any enclosing functions
3. the current module's global names
4. the built-in names

CLASSES

- Data attributes override method attributes with the same name.
- Passing an object is cheap since only a pointer is passed by the
  implementation; and if a function modifies an object passed as an
  argument, the caller will see the change.

class Mapping:
    def __init__(self, iterable):
        self.items_list = []
        self.__update(iterable)
    def update(self, iterable):
        for item in iterable:
            self.items_list.append(item)
    __update = update  # Private copy of original update() method.

class MappingSubclass(Mapping):
    def update(self, keys, values):
        # Provides a new signature for update()
        # but does not break __init__().
        for item in zip(keys, values):
            self.items_list.append(item)

>>> class Complex:
...     def __init__(self, realpart, imagpart):
...         self.r = realpart
...         self.i = imagpart
...
>>> x = Complex(3.0, -4.5)
>>> x.r, x.i
(3.0, -4.5)

class B:
    pass
class C(B):
    pass
class D(C):
    pass

for c in [B, C, D]:
    try:
        raise c()
    except D:
        print "D"
    except C:
        print "C"
    except B:
        print "B"
# Prints B, C, D: the first except clause that matches the raised class
# (or one of its base classes) wins.

class Foo(object):
    # Class variable.
    DUMMY = 1
    def bar(self):
        return self.DUMMY + 1
    def baz(self, new_value):
        self.DUMMY = new_value

a = Foo()
b = Foo()
b.baz(2)
# Which one fails?
assert Foo.DUMMY == a.DUMMY
assert Foo.DUMMY == b.DUMMY
# A: self.__class__.DUMMY
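The answer above can be unpacked step by step: assignment through self creates an instance attribute that shadows the class variable. A minimal sketch (names are illustrative):

```python
class Foo(object):
    DUMMY = 1  # class variable, shared through the class

a = Foo()
b = Foo()
b.DUMMY = 2  # creates an instance attribute on b only

class_value = Foo.DUMMY  # still 1: the class was never touched
a_value = a.DUMMY        # 1: lookup falls through to the class
b_value = b.DUMMY        # 2: the instance attribute shadows the class one
```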

GENERATOR VS. ITERATOR

Generators and iterators work the same. But not in a multithreaded
environment! Think about counters:
- You cannot call a generator that is already executing.
- You can lock-protect iterator state and call it many times
  concurrently.

def squares(start, stop):
    """Generator."""
    for i in xrange(start, stop):
        yield i * i

class Squares(object):
    """Iterator."""
    def __init__(self, start, stop):
        self.start = start
        self.stop = stop
    def __iter__(self):
        return self
    def next(self):
        # Lock here.
        if self.start >= self.stop:
            raise StopIteration
        current = self.start * self.start
        self.start += 1
        return current

for i in squares(1, 5):
    print i,
# Inline generator:
for i in (i*i for i in xrange(1, 5)):
    print i,
sq_range = Squares(1, 5)
for i in sq_range:
    print i,
>>> 1 4 9 16
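A lock-protected variant of the Squares iterator is sketched below, filling in the "# Lock here." comment with threading.Lock (written so it runs on both Python 2 and 3; the class name is illustrative):

```python
import threading

class SafeSquares(object):
    """Iterator whose state is guarded by a lock, so one shared
    instance can be consumed from several threads."""
    def __init__(self, start, stop):
        self.start = start
        self.stop = stop
        self._lock = threading.Lock()

    def __iter__(self):
        return self

    def __next__(self):
        with self._lock:  # only one thread may advance the state
            if self.start >= self.stop:
                raise StopIteration
            current = self.start * self.start
            self.start += 1
            return current

    next = __next__  # Python 2 spelling of the same protocol method

values = list(SafeSquares(1, 5))
```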

FUNCTION ARGUMENTS

Never pass an argument by keyword to a function that doesn't explicitly
define one. If you do, you have effectively turned the parameter name
into a global name: renaming the parameter breaks your call.

In your tests, call the function the same way you use it in production
code: it can catch these bugs.
def foo(x, y):
    print x ** y

foo(2, 3)
foo(2, y=3)
foo(x=2, y=3)

def foo(base, exponent):
    print base ** exponent

foo(x=2, y=3)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: foo() got an unexpected keyword argument 'x'

CONTEXT MANAGERS

f = open("hello.txt")
try:
    for line in f:
        print line,
finally:
    f.close()

with open("hello.txt") as f:
    for line in f:
        print line,

__enter__() defines what the context manager should do at the beginning
of the block created by the with statement. Note that the return value of
__enter__ is bound to the target of the with statement, or the name after
the as.

__exit__(self, exception_type, exception_value, traceback) defines what
the context manager should do after its block has been executed (or
terminates).

__enter__ and __exit__ can be useful for specific classes that have
well-defined and common behavior for setup and cleanup.

import sys
from StringIO import StringIO

class redirect_stdout:
    def __init__(self, target):
        # Save stdout and target.
        self.stdout = sys.stdout
        self.target = target
    def __enter__(self):
        # Runs before the block: replace stdout with target.
        sys.stdout = self.target
    def __exit__(self, type, value, tb):
        # Runs after the block: restore stdout.
        sys.stdout = self.stdout

out = StringIO()
with redirect_stdout(out):
    # Print goes to the StringIO object now!
    print 'Test'
# Verify:
>>> out.getvalue() == 'Test\n'
True
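The same pattern can be written with less boilerplate using contextlib.contextmanager: code before the yield plays the role of __enter__, code after it the role of __exit__. A minimal sketch (the events list and names are illustrative):

```python
from contextlib import contextmanager

events = []

@contextmanager
def managed(name):
    events.append("enter " + name)     # runs like __enter__
    try:
        yield name                     # the value bound after "as"
    finally:
        events.append("exit " + name)  # runs like __exit__

with managed("resource") as r:
    events.append("using " + r)
```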

DECORATORS

- Decorator expressions are evaluated when the function is defined, in
  the scope that contains the function definition.
- The result must be a callable, which is invoked with the function
  object as the only argument.
- The returned value is bound to the function name instead of the
  function object. Multiple decorators are applied in nested fashion.

import time

def timeit(func):
    """Decorator for measuring function run time.

    Args:
        func: Function to be wrapped, passed implicitly through the
            "@..." call.
    Returns:
        Wrapped function.
    """
    def function_call_wrap(*args, **kwargs):
        try:
            start_time = time.time()
            return func(*args, **kwargs)
        finally:
            print "%s() took %fms." % (
                func.func_name, (time.time() - start_time) * 1000)
    return function_call_wrap

def sleep1():
    time.sleep(1)

@timeit
def sleep2():
    time.sleep(2)
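One caveat with decorators like timeit above: the wrapper replaces the original function's name and docstring. functools.wraps fixes that; a small sketch (the logged decorator and calls list are illustrative):

```python
import functools

calls = []

def logged(func):
    @functools.wraps(func)  # copy __name__, __doc__, etc. to the wrapper
    def wrapper(*args, **kwargs):
        calls.append(func.__name__)
        return func(*args, **kwargs)
    return wrapper

@logged
def add(x, y):
    """Add two numbers."""
    return x + y

result = add(2, 3)
preserved_name = add.__name__  # "add", not "wrapper"
```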

RUN VS. IMPORT

- Module imports trigger __init__.py execution. Imports actually run
  code.
- Running code from the same folder would not see it as a module, and
  __init__.py wouldn't run! Read an explanation at
  http://stackoverflow.com/a/465129.
- Stay on the safe side: know what you are initializing.
- from foo import bar considered harmful.

MODULES AND EXCEPTIONS

- Modules or packages should define their own domain-specific base
  exception class, which should be subclassed from the built-in Exception
  class.
- Modules should have short, all-lowercase names. (Though there are
  historical exceptions: StringIO.)
- Module-level exceptions enable catching all errors that can be raised
  by one module only. Very useful for debugging and testing.

class Error(Exception):
    """Base class for exceptions in this module."""

class RedisLockError(Error):
    """Base class for lock exceptions."""
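A sketch of why a module-level base exception is useful: one except clause catches every error the module can raise (the subclass names here are hypothetical):

```python
class Error(Exception):
    """Base class for exceptions in this (hypothetical) module."""

class LockError(Error):
    """Raised when a lock cannot be acquired."""

class ConfigError(Error):
    """Raised on a bad configuration value."""

caught = []
for exc_type in (LockError, ConfigError):
    try:
        raise exc_type("boom")
    except Error as e:  # one clause catches every error from this module
        caught.append(type(e).__name__)
```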

EXCEPTION CATCHING

- When catching exceptions, mention specific exceptions whenever possible
  instead of using a bare except: clause.
- If you want to catch all exceptions that signal program errors, use
  except Exception: (a bare except is equivalent to except
  BaseException:).

try:
    do_something()
except:
    # Diaper pattern.
    print "Error"
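A minimal sketch of the recommended style: name the one exception you expect instead of using a diaper except (parse_port is an illustrative helper):

```python
def parse_port(text):
    """Convert text to a port number, catching only the expected error."""
    try:
        return int(text)
    except ValueError:  # specific: malformed input, nothing else
        return None

ok = parse_port("8080")
bad = parse_port("eighty")  # ValueError handled, None returned
```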

DOCUMENTATION

- PEP-257 talks about docstring conventions.
- Comments that contradict the code are worse than no comments. Always
  make a priority of keeping the comments up-to-date when the code
  changes!
- pydoc -p <port>: see all modules in production. Use it as a local
  Python library reference.

from collections import namedtuple
from functools import wraps

def convert_to_named_tuple(original_class):
    """Replace class declaration with named tuple.

    This decorator is to be used when one uses a class for storing
    constants.

    Note: this will work only for classes with constants, not with any
    other declared methods.

    Example usage::

        @convert_to_named_tuple
        class important_constants(object):
            PI = 3.141
            e = 2.718

        print important_constants.PI  # Prints 3.141
        important_constants.PI = 2    # Raises exception!

    Args:
        original_class: A class declaration.
    Returns:
        Named tuple object in place of the decorated class.
    """
    @wraps(original_class)
    def replace_class_with_named_tuple():
        constant_value_dict = dict()
        for attribute in original_class.__dict__:
            if not attribute.startswith("__"):
                constant_value_dict[attribute] = (
                    original_class.__dict__[attribute])
        replacement_tuple = namedtuple(
            original_class.__class__.__name__,
            " ".join(constant_value_dict.iterkeys()))
        args = constant_value_dict.values()
        return replacement_tuple(*args)
    return replace_class_with_named_tuple()

ARGPARSE

import argparse

parser = argparse.ArgumentParser(
    prog="smm",
    description="Management")
parser.add_argument(
    "--version",
    action="version",
    version="%(prog)s 0.1")
parser.add_argument(
    "--log",
    action="store",
    choices=("debug", "info", "warning", "error", "critical"),
    default="warning")

# Default arguments for each subparser: add/rm/list.
parent_parser = argparse.ArgumentParser(add_help=False)
parent_parser.add_argument(
    "-l", "--labels",
    required=True,
    type=labels.valid_label_pair,
    nargs="+",
    metavar=("label1=value1", "label2=value2"),
    help="labels as 'key=value' pairs")
parent_parser.add_argument(
    "-m", "--machines",
    required=True,
    type=machines.valid_machine_name,
    nargs="+",
    metavar=("machine_1", "machine_2"),
    help="machine names")
parent_parser.add_argument(
    "--log",
    action="store",
    choices=("debug", "info", "warning", "error", "critical"),
    default="warning")

# Subparsers for add, rm, and list inherit the same parent arguments.
subparsers = parser.add_subparsers(
    title="subcommands",
    description="valid subcommands",
    dest="subparser_name",
    help="sub-commands")
parser_add = subparsers.add_parser(
    "add",
    parents=[parent_parser],
    help="add labels to machines")
parser_add.add_argument(
    "add",
    help="add labels",
    action="store_true")
parser_add.set_defaults(func=add_machines)
#...

def main():
    args = parser.parse_args()
    logger.initialize_logger(args.log)
    logger.LOG.debug("Parsed command line and initialized logger.")
    logger.LOG.debug("Command line parsed: %s", args)
    logger.LOG.debug("Dispatching:")
    args.func(args)
    logger.LOG.debug("Done!")

Try not to use the optparse module; it is deprecated.

GFLAGS

Google's command line parsing library:
http://code.google.com/p/python-gflags/

It has increased flexibility, including built-in support for Python
types, and the ability to define flags in the source file in which they
are used (a major difference from OptParse).

import gflags

FLAGS = gflags.FLAGS

gflags.DEFINE_integer(
    "port",
    9001,
    "service port")
gflags.RegisterValidator(
    "port",
    lambda port: 1024 < port <= 65535,
    message="must be in (1024, 65535] range")

def main(argv):
    port = FLAGS.port

PEP8

- The Python style guide: python.org/dev/peps/pep-0008/
- Read it, and then read it again. It will teach you to write better and
  more reliable Python code. Add comments, be verbose, keep it clean.
- Code should be written to minimize the time it would take for someone
  else to understand it (The Art of Readable Code, Dustin Boswell and
  Trevor Foucher).
- New code should (must) have test coverage. Use asserts, but remember
  that they are completely ignored when the code is run in optimized mode
  (python -O).

PEP-8 ON IMPORTS

- Always use the absolute package path for all imports. Even now that
  PEP-328 (Imports: Multi-Line and Absolute/Relative) is fully
  implemented in Python 2.5, its style of explicit relative imports is
  actively discouraged; absolute imports are more portable and usually
  more readable.
- Imports are always put at the top of the file, just after any module
  comments and docstrings, and before module globals and constants.
  Imports should be grouped in the following order:
  1. Standard library imports
  2. Related third party imports
  3. Local application/library specific imports
- You should put a blank line between each group of imports.

IS NONE VS. == NONE

- PEP 8 says: comparisons to singletons like None should always be done
  with is or is not, never the equality operators (==, !=).
- Beware of writing if x when you really mean if x is not None, e.g.
  when testing whether a variable or argument that defaults to None was
  set to some other value. The other value might have a type (such as a
  container) that could be false in a boolean context! A class is free
  to implement comparison any way it chooses, and it can choose to make
  comparison against None mean something (as the Equal class below does).
- Use x is not y instead of not x is y. Operator priority can be
  confusing and the second statement can be read as (not x) is y.

class Zero():
    """A class that is zero."""
    def __nonzero__(self):
        return False

class Len0():
    """A class with zero length."""
    def __len__(self):
        return 0

class Equal():
    """A class that is equal to everything."""
    def __eq__(self, other):
        return True

stuff = [None, False, 0, 0L, 0.0, 0j,
         (), [], {}, set(), '', float('NaN'), float('inf'),
         Zero(), Len0(), Equal()]
for x in stuff:
    if x is None:
        print("{} is None".format(x))
    if x == None:
        print("{} == None".format(x))
>>>
None is None
None == None
<__main__.Equal instance at 0x84a80> == None

__DEL__ AND MEMORY

import gc
import pprint
from collections import Counter
from functools import wraps

class SomeClass(object):
    pass

class SomeNastyClass(object):
    # Confuse garbage collection by adding a __del__ method. If a
    # circular reference is created it wouldn't know which one to
    # dispose of first and would let them stay in memory!
    def __del__(self):
        pass

def leaky_function():
    """Leaky function."""
    foo = SomeNastyClass()
    bar = SomeNastyClass()
    foo.other = bar
    bar.other = foo
    del foo
    del bar
    return

def non_leaky_function():
    """Non leaky function."""
    foo = SomeClass()
    bar = SomeClass()
    foo.other = bar
    bar.other = foo
    del foo
    del bar
    return

def log_memory_leaks(func, logger_func):
    """Decorator for detecting memory leaks.

    Log what was not garbage collected after the function has returned.

    Args:
        func: Function to be wrapped, passed implicitly through "@..."
            call.
        logger_func: Logging function to be called around the wrapped
            function.
    Returns:
        Wrapped function.
    """
    @wraps(func)
    def function_call_wrap(*args, **kwargs):
        # Force garbage collection.
        gc.collect()
        # Count instances by type before the function run.
        before = Counter([type(i) for i in gc.get_objects()])
        try:
            return func(*args, **kwargs)
        finally:
            gc.collect()
            # Count instances by type after the run. Ignore the object
            # "before" created in this decorator.
            after = Counter(
                [type(i) for i in gc.get_objects() if i is not before])
            # Find instance types whose counts changed during the run.
            instance_diff = {
                i: after[i] - before[i]
                for i in after if after[i] != before[i]}
            if instance_diff:
                logger_func(
                    "Memory usage after %s(args=%s, kwargs=%s): %s",
                    func.func_name, args, kwargs,
                    pprint.pformat(instance_diff))
    return function_call_wrap

WHY TEST

- No standard scoping: once a variable has come into existence it
  remains until the enclosing function exits, not when the enclosing
  block terminates.
- No concept of data privacy, only obfuscation.
- No concept of declaration leads to ambiguity when you have multiple
  scopes. Instead of having one simple var keyword, Python has the
  global and nonlocal keywords (the latter is only available in
  Python 3).
- More errors are detected at run time than is desirable. Basically you
  have to make sure that all your code has been executed before you can
  say that the program is even semantically correct.
- Your friend: https://nose.readthedocs.org/en/latest/

PYCHARM COMMERCIAL

>>> # Python 2.X allows rebinding the names True and False:


>>> True == False
False
>>> True = True
>>> True = False
>>> True == True
True
>>> True == False
True

http://www.jetbrains.com/pycharm/

EXAMPLE: SUM

Compute the sum 1 + 2 + ... + 10^9 = 500000000500000000.

#!/usr/bin/perl
use integer; $sum = 0; $sum += $_ for (1 .. 1000000000); print $sum;
>
real 1m21.774s
user 1m21.656s
sys 0m0.061s

#include "stdio.h"
int main(int argc, char const *argv[]) {
    long sum = 0;
    for (long i = 1; i <= 1000000000L; sum += i++)
        ;
    printf("%ld\n", sum);
    return 0;
}
>
real 0m2.465s
user 0m2.461s
sys 0m0.002s

#!/usr/bin/python
print sum(xrange(1, 1000000001))
>
real 0m12.114s
user 0m12.087s
sys 0m0.012s

package main

import (
    "fmt"
    "runtime"
)

func summer(ch chan<- uint64, from uint64, to uint64) {
    var sum uint64 = 0
    for i := from; i <= to; i++ {
        sum += i
    }
    // Send the result.
    ch <- sum
}

func main() {
    const upper uint64 = 1000000000
    const workers uint64 = 8
    var start_interval uint64 = 1
    const step uint64 = upper / workers
    // Make a channel that can buffer up to $workers numbers.
    ch := make(chan uint64, workers)
    // Use up to 8 CPUs. This should nicely use a quad core CPU with
    // hyperthreading.
    runtime.GOMAXPROCS(8)
    // Dispatch workers, each with a different number segment.
    for i := uint64(0); i < workers; i++ {
        go summer(ch, start_interval, start_interval+step-1)
        start_interval += step
    }
    // Read out results as they keep arriving to the channel (we block
    // on the channel until a value is ready).
    var sum uint64 = 0
    for i := uint64(0); i < workers; i++ {
        sum += <-ch
    }
    fmt.Println(sum)
}
>
real 0m0.302s
user 0m2.165s
sys 0m0.004s

PARALLELIZE SUM!
#!/usr/bin/python
import multiprocessing

UPPER = 1000000000
WORKERS = 8
STEP = UPPER / WORKERS

pool = multiprocessing.Pool(processes=WORKERS)
ranges = (xrange(lo, hi + 1)
          for (lo, hi) in zip(xrange(1, UPPER, STEP),
                              xrange(STEP, UPPER + 1, STEP)))
print sum(pool.map(sum, ranges))
>
real 0m2.008s
user 0m13.991s
sys 0m0.051s

From 12 to 2 seconds.

FIBONACCI NUMBERS

f(n) = 0 if n = 0; f(n) = 1 if n = 1; f(n) = f(n-1) + f(n-2) otherwise.
f(30) = 832040

#!/usr/bin/python
import time

def timeit(func):
    def function_call_wrap(*args, **kwargs):
        try:
            start_time = time.time()
            return func(*args, **kwargs)
        finally:
            print(time.time() - start_time)
    return function_call_wrap

def fib(n):
    assert n >= 0
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fib(n-1) + fib(n-2)

@timeit
def fib30():
    fib(30)

fib30()

$ python3.3 fib.py
0.794215202331543

#!/usr/bin/python
from timeit import timeit

def fib(n):
    assert n >= 0
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fib(n-1) + fib(n-2)

print(timeit(
    stmt="fib(30)",
    setup="from __main__ import fib",
    number=1))

$ python3.3 fib.py
0.8095278141554445

#!/usr/bin/python
from functools import lru_cache
from timeit import timeit

@lru_cache(maxsize=512)
def fib(n):
    assert n >= 0
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fib(n-1) + fib(n-2)

print(timeit(
    stmt="fib(30)",
    setup="from __main__ import fib",
    number=1))

$ python3.3 fib.py
0.0003443881869316101
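For interpreters without functools.lru_cache (it arrived in Python 3.2), the same speedup can be had with a hand-rolled memoization dict; a sketch (the _cache name is illustrative):

```python
_cache = {0: 0, 1: 1}  # seed with the base cases

def fib(n):
    if n not in _cache:
        _cache[n] = fib(n - 1) + fib(n - 2)
    return _cache[n]

answer = fib(30)  # each fib(k) is now computed exactly once
```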

USEFUL LIBRARIES
matplotlib
PIL
django
NumPy

pip: https://pypi.python.org/pypi/pip

CRITICISM

- Python is highly typed. Do you get function overloading based upon
  parameter type? No. Can you stipulate the type of a parameter in a
  function declaration? No, this has to be coded within the function.
- Mutable default arguments (def foo(a="abc", b=[])).
- Documentation is generally good, but quite often it doesn't go into
  enough detail.
- Python is deceptively easy to start with, but to write serious code
  you have to know hidden stuff.
- All function arguments are essentially global variables! If you
  rename them you can break some code! Partially fixed in Python 3+.
- White space complicates refactoring.
- Anonymous (lambda) functions are limited in their ability. However,
  one can declare a named function in an inner scope and use that
  instead.
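The mutable-default-argument criticism deserves a concrete sketch: the default list is created once, at definition time, and shared by every call (the function names are illustrative):

```python
def append_bad(item, bucket=[]):
    # The [] above was created once, when "def" executed, and is reused.
    bucket.append(item)
    return bucket

first = append_bad(1)
second = append_bad(2)  # same list as before, now [1, 2]

def append_good(item, bucket=None):
    if bucket is None:   # the usual fix: create a fresh list per call
        bucket = []
    bucket.append(item)
    return bucket

third = append_good(3)
```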

GIL

The GIL is a mutex that prevents multiple native threads from executing
Python bytecodes at once. This lock is necessary mainly because CPython's
memory management is not thread-safe. However, since the GIL exists,
other features have grown to depend on the guarantees that it enforces.

It prevents multithreaded CPython programs from taking full advantage of
multiprocessor systems.

2.X OR 3.X
$ python
Python 2.7.2 (default, Oct 11 2012, 20:14:37)
[GCC 4.2.1 Compatible Apple Clang 4.0 (tags/Apple/clang-418.0.60)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 3/2
1
>>> ^D
$ python3.3
Python 3.3.1 (v3.3.1:d9893d13c628, Apr 6 2013, 11:07:11)
[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 3/2
1.5
>>> ^D
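For Python 2 code that wants the 3.X division semantics shown above, a __future__ import changes the meaning of / in that module; a small sketch:

```python
from __future__ import division  # a no-op on Python 3

true_div = 3 / 2    # 1.5: "/" is true division in this module now
floor_div = 3 // 2  # 1: floor division keeps its own operator
```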

- Some libraries are still not ported to 3.X.
- 2.7.X is the last 2.X version.
- New features are added to 3.X; it is a better language.

http://wiki.python.org/moin/Python2orPython3

GOOD TO KNOW

- #!env python can mask the process name: you wouldn't see the name of
  the code running when listing processes on a machine.
- Unicode vs. normal strings: the size difference exists for ASCII
  characters as well. u"Aleksa" takes twice the size of "Aleksa" (or
  even four times!). There are differences between 2.X and 3.X here.
- In all Python projects you do not cd into a lower directory to run
  things. You stay at the top and run everything from there so that all
  of the system can access all the modules and files.

http://www.aleksa.org/2013/03/python-resources.html

FAQ

Q&A
