Faster Python/Django Programming

DESCRIPTION

The slides contain some tips on good programming practices in Python/Django, as well as some tools and techniques to make applications faster.

TRANSCRIPT

Faster Programming in Python/Django

Subit Raj Pokharel (@rajsubit, rajsubit)

First code, then optimize.

Response Time Limits

● 0.1 seconds: the limit for the user to feel that the system is reacting instantaneously
● 1 second: the limit for the user's flow of thought to stay uninterrupted
● 10 seconds: the limit for keeping the user's attention focused on the dialogue

http://www.nngroup.com/articles/response-times-3-important-limits/

Time-consuming functions

New Relic (http://newrelic.com/)

import requests
import BeautifulSoup   # BeautifulSoup 3 API, as used on the slides

def find_recent_blog_post():
    ...
    # fetch the blog's RSS feed and keep the two most recent posts
    url = requests.get("http://blog.flipkarma.com/feed")
    soup = BeautifulSoup.BeautifulSoup(url.text)
    recent_post = soup.findAll('item')[:2]
    ...

A simple example

Analysis with Line Profiler

Line #  Hits   Time       Per Hit     % Time  Line Contents
==============================================================
    70     1   1112952    1112952.0     73.4  url = requests.get("http://blog.flipkarma.com/feed")
    71     1    103695     103695.0      6.8  soup = BeautifulSoup.BeautifulSoup(url.text)
    72     1      2186       2186.0      0.1  recent_post = soup.findAll('item')[:2]

https://github.com/rkern/line_profiler
$ pip install line_profiler
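One way to get a per-line report like the one above: decorate the function with @profile (injected by kernprof at run time, no import needed) and run the script under kernprof. This is a minimal sketch; the file name blog_feed.py is an illustrative choice, not from the slides.

# blog_feed.py  (hypothetical file name)
import requests
import BeautifulSoup   # BeautifulSoup 3, as on the slides

@profile                 # made available by kernprof; do not import it
def find_recent_blog_post():
    url = requests.get("http://blog.flipkarma.com/feed")
    soup = BeautifulSoup.BeautifulSoup(url.text)
    return soup.findAll('item')[:2]

if __name__ == '__main__':
    find_recent_blog_post()

# Run and print the line-by-line timings:
#   $ kernprof -l -v blog_feed.py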

Optimization with Memcached

# Original code
url = requests.get("http://blog.flipkarma.com/feed")
soup = BeautifulSoup.BeautifulSoup(url.text)
recent_post = soup.findAll('item')[:2]

# Using Memcached
key = "flipkarma_blog_post"
cache_time = 10000          # time to live in seconds
result = cache.get(key)
if result:
    recent_post = result
else:
    url = requests.get("http://blog.flipkarma.com/feed")
    soup = BeautifulSoup.BeautifulSoup(url.text)
    recent_post = soup.findAll('item')[:2]
    cache.set(key, recent_post, cache_time)

http://memcached.org/
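The cache.get()/cache.set() calls above are Django's cache framework; for them to hit memcached, settings.py needs a memcached backend. A minimal sketch, assuming memcached is running locally on its default port:

# settings.py (sketch)
CACHES = {
    'default': {
        # requires the python-memcached binding for this backend
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',   # local memcached on the default port assumed
    }
}

# View or helper code then uses the configured default cache:
#   from django.core.cache import cache
#   cache.set('flipkarma_blog_post', recent_post, 10000)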

Line #  Hits   Time     Per Hit   % Time  Line Contents
==============================================================
    65     1        3       3.0      0.0  cache_time = 10000  # time to live in seconds
    66     1    17973   17973.0      1.3  result = cache.get(cache_key)
    67     1        4       4.0      0.0  if result:
    68     1       31      31.0      0.0      parameters["result"] = result
    69     1        3       3.0      0.0  if not result:
    74                                        url = requests.get("http://blog.flipkarma.com/feed")
                                              soup = BeautifulSoup.BeautifulSoup(url.text)
                                              recent_post = soup.findAll('item')[:2]
                                              cache.set(cache_key, recent_post, cache_time)

Result

New Relic

● analyze the application
● help understand the stories of application data
● real-time business insights

http://newrelic.com/

Profiling

Python Tools:
● cProfile (quick usage sketch below)
● profile
● hotshot
● Django Debug Toolbar
● Line Profiler
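cProfile and profile ship with the standard library; hotshot was an older alternative. A minimal cProfile sketch, reusing find_recent_blog_post() from the earlier slide; the file names feed.prof and some_script.py are illustrative only.

import cProfile
import pstats

# Profile one call and dump the stats to a file
cProfile.run('find_recent_blog_post()', 'feed.prof')

# Print the ten most expensive entries by cumulative time
pstats.Stats('feed.prof').sort_stats('cumulative').print_stats(10)

# Whole scripts can also be profiled from the shell:
#   $ python -m cProfile -s cumulative some_script.py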

Django Debug Toolbar

SQL Query

Query Time

Reducing Queries

Query Time

Some Tips

If it's fast enough, don't optimize it.

Find the slowest step first.

Make the slowest operation faster.

● Python function calls have an overhead; inline them if possible
● List comprehensions are faster than equivalent for loops with if checks (see the sketch below)
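A small illustration of the list-comprehension point; posts is a hypothetical list of objects with published and title attributes:

# Explicit loop with a condition
titles = []
for post in posts:
    if post.published:
        titles.append(post.title)

# Equivalent list comprehension: the loop runs in optimized bytecode and
# avoids the repeated titles.append lookup, so it is usually a bit faster
titles = [post.title for post in posts if post.published]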

Reduce the number of hits.

Perform each operation as rarely as possible.

● Cache the result if speed is more important than memory (a memoization sketch follows)
● Move slower operations inside if conditions
● Assign the result of an operation to a variable only if necessary
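A minimal sketch of the "cache the result" tip using a plain dict; expensive_lookup() is a hypothetical stand-in for any slow operation (network call, heavy query):

_cache = {}

def expensive_lookup(key):
    ...   # stand-in for the slow work

def cached_lookup(key):
    # trade memory for speed: compute once per key, reuse on later calls
    if key not in _cache:
        _cache[key] = expensive_lookup(key)
    return _cache[key]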

Reducing Queries

● Use class-based views
● Proper use of querysets
● select_related() and prefetch_related() (see the sketch below)
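A short sketch of the last bullet; Post, its author ForeignKey, and its tags ManyToManyField are hypothetical models used only for illustration:

# N+1 pattern: one query for the posts, then one more query per post.author
for post in Post.objects.all():
    print(post.author.name)

# select_related() follows the ForeignKey with a SQL join: one query total
for post in Post.objects.select_related('author'):
    print(post.author.name)

# prefetch_related() issues a second query and joins in Python;
# use it for ManyToMany and reverse relations
for post in Post.objects.prefetch_related('tags'):
    print([tag.name for tag in post.tags.all()])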

Change the algorithm. This has the second-biggest impact on speed.

The largest impact comes from eliminating code.

Functionality is an asset. Code is a liability.

Thank You
