Big-O Cheat Sheet

Know Thy Complexities!


Hi there! This webpage covers the space and time Big-O complexities of common algorithms used in Computer Science. When preparing for technical interviews in the past, I found myself spending hours crawling the internet putting together the best, average, and worst case complexities for search and sorting algorithms so that I wouldn't be stumped when asked about them. Over the last few years, I've interviewed at several Silicon Valley startups, and also some bigger companies, like Yahoo, eBay, LinkedIn, and Google, and each time that I prepared for an interview, I thought to myself "Why oh why hasn't someone created a nice Big-O cheat sheet?". So, to save all of you fine folks a ton of time, I went ahead and created one. Enjoy!
Legend: Good / Fair / Poor (cells in the original chart are color-coded).

Searching

- Depth First Search (DFS), on a graph of |V| vertices and |E| edges: time O(|E| + |V|); space complexity (worst) O(|V|)
- Breadth First Search (BFS), on a graph of |V| vertices and |E| edges: time O(|E| + |V|); space complexity (worst) O(|V|)
- Binary search, on a sorted array of n elements: average O(log(n)), worst O(log(n)); space O(1)
- Linear (Brute Force) search, on an array: average O(n), worst O(n); space O(1)
- Shortest path by Dijkstra, using a Min-heap as priority queue, on a graph with |V| vertices and |E| edges: average O((|V| + |E|) log |V|), worst O((|V| + |E|) log |V|); space O(|V|)
- Shortest path by Dijkstra, using an unsorted array as priority queue, on a graph with |V| vertices and |E| edges: average O(|V|^2), worst O(|V|^2); space O(|V|)
- Shortest path by Bellman-Ford, on a graph with |V| vertices and |E| edges: average O(|V| |E|), worst O(|V| |E|); space O(|V|)
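To make the O(log(n)) binary-search bound concrete, here is a minimal sketch in Python (the function name and style are my own, not from the page): each iteration halves the search interval, so at most log(n) iterations are needed.

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent.

    Each loop iteration halves the interval [lo, hi], so the loop
    runs O(log(n)) times -- matching the table above.
    """
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1
```

Note the precondition from the table's "Data Structure" column: the array must already be sorted, or the halving argument does not hold.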

Sorting

- Quicksort, array: best O(n log(n)), average O(n log(n)), worst O(n^2); worst-case auxiliary space O(n)
- Mergesort, array: best O(n log(n)), average O(n log(n)), worst O(n log(n)); worst-case auxiliary space O(n)
- Heapsort, array: best O(n log(n)), average O(n log(n)), worst O(n log(n)); worst-case auxiliary space O(1)
- Bubble Sort, array: best O(n), average O(n^2), worst O(n^2); worst-case auxiliary space O(1)
- Insertion Sort, array: best O(n), average O(n^2), worst O(n^2); worst-case auxiliary space O(1)
- Selection Sort, array: best O(n^2), average O(n^2), worst O(n^2); worst-case auxiliary space O(1)
- Bucket Sort, array: best O(n+k), average O(n+k), worst O(n^2); worst-case auxiliary space O(nk)
- Radix Sort, array: best O(nk), average O(nk), worst O(nk); worst-case auxiliary space O(n+k)
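As an illustration of the mergesort row, here is a minimal top-down mergesort sketch in Python (my own formulation, not from the page): the array is halved log(n) times and each level does O(n) merge work, which is where the O(n log(n)) time and O(n) auxiliary space come from.

```python
def merge_sort(arr):
    """Return a sorted copy of arr using top-down mergesort."""
    if len(arr) <= 1:
        return list(arr)
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # recursion depth is O(log(n))
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time, using O(n) extra space.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```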

Data Structures

Average-case time, worst-case time, and worst-case space for the operations Indexing, Search, Insertion, and Deletion:

- Basic Array: average indexing O(1), search O(n), insertion -, deletion -; worst case the same; space O(n)
- Dynamic Array: average indexing O(1), search O(n), insertion O(n), deletion O(n); worst case the same; space O(n)
- Singly-Linked List: average indexing O(n), search O(n), insertion O(1), deletion O(1); worst case the same; space O(n)
- Doubly-Linked List: average indexing O(n), search O(n), insertion O(1), deletion O(1); worst case the same; space O(n)
- Skip List: average O(log(n)) for all four operations; worst case O(n) for all four; space O(n log(n))
- Hash Table: average indexing -, search O(1), insertion O(1), deletion O(1); worst case O(n) for search, insertion, and deletion; space O(n)
- Binary Search Tree: average O(log(n)) for all four operations; worst case O(n) for all four; space O(n)
- Cartesian Tree: average indexing -, search O(log(n)), insertion O(log(n)), deletion O(log(n)); worst case O(n) for search, insertion, and deletion; space O(n)
- B-Tree: average O(log(n)) for all four operations; worst case O(log(n)) for all four; space O(n)
- Red-Black Tree: average O(log(n)) for all four operations; worst case O(log(n)) for all four; space O(n)
- Splay Tree: average indexing -, search O(log(n)), insertion O(log(n)), deletion O(log(n)); worst case O(log(n)) for search, insertion, and deletion; space O(n)
- AVL Tree: average O(log(n)) for all four operations; worst case O(log(n)) for all four; space O(n)
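To make the hash-table row concrete: Python's built-in dict is a hash table, so its search, insertion, and deletion are O(1) on average and O(n) only in the worst case of pathological collisions. A minimal sketch (the example data is mine):

```python
# Python's dict is a hash table: average-case O(1) search,
# insertion, and deletion, matching the table above.
phone_book = {}
phone_book["alice"] = "555-0100"   # insertion: O(1) average
phone_book["bob"] = "555-0199"     # insertion: O(1) average
found = "alice" in phone_book      # search: O(1) average
del phone_book["bob"]              # deletion: O(1) average
```

Note the "-" in the Indexing column: a hash table has no notion of "the k-th element", which is why ordered structures like balanced trees are still worth the O(log(n)) cost when you need ordered traversal or rank queries.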


Heaps

Time complexity per heap implementation, for the operations Heapify, Find Max, Extract Max, Increase Key, Insert, Delete, and Merge:

- Linked List (sorted): Heapify -, Find Max O(1), Extract Max O(1), Increase Key O(n), Insert O(n), Delete O(1), Merge O(m+n)
- Linked List (unsorted): Heapify -, Find Max O(n), Extract Max O(n), Increase Key O(1), Insert O(1), Delete O(1), Merge O(1)
- Binary Heap: Heapify O(n), Find Max O(1), Extract Max O(log(n)), Increase Key O(log(n)), Insert O(log(n)), Delete O(log(n)), Merge O(m+n)
- Binomial Heap: Heapify -, Find Max O(log(n)), Extract Max O(log(n)), Increase Key O(log(n)), Insert O(log(n)), Delete O(log(n)), Merge O(log(n))
- Fibonacci Heap: Heapify -, Find Max O(1), Extract Max O(log(n))*, Increase Key O(1)*, Insert O(1), Delete O(log(n))*, Merge O(1)

(* amortized)
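Python's standard-library heapq module implements the binary-heap row (as a min-heap rather than a max-heap): heapify is O(n), and push/pop are O(log(n)). A small sketch of those operations:

```python
import heapq

data = [9, 4, 7, 1, 8]
heapq.heapify(data)              # heapify: O(n)
heapq.heappush(data, 3)          # insert: O(log(n))
smallest = heapq.heappop(data)   # extract min: O(log(n))
# heapq is a min-heap; to get the "Extract Max" behavior from the
# table, the usual trick is to push negated keys.
```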

Graphs

Node / edge management cost per representation, for Storage, Add Vertex, Add Edge, Remove Vertex, Remove Edge, and Query (are two vertices adjacent?):

- Adjacency list: Storage O(|V|+|E|), Add Vertex O(1), Add Edge O(1), Remove Vertex O(|V|+|E|), Remove Edge O(|E|), Query O(|V|)
- Incidence list: Storage O(|V|+|E|), Add Vertex O(1), Add Edge O(1), Remove Vertex O(|E|), Remove Edge O(|E|), Query O(|E|)
- Adjacency matrix: Storage O(|V|^2), Add Vertex O(|V|^2), Add Edge O(1), Remove Vertex O(|V|^2), Remove Edge O(1), Query O(1)
- Incidence matrix: Storage O(|V| |E|), Add Vertex O(|V| |E|), Add Edge O(|V| |E|), Remove Vertex O(|V| |E|), Remove Edge O(|V| |E|), Query O(|E|)
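A minimal adjacency-list sketch in Python (a dict of neighbor lists; the representation and helper names are my own): adding a vertex or a directed edge is O(1), while an adjacency query scans one neighbor list, O(|V|) in the worst case.

```python
graph = {}                          # vertex -> list of neighbors

def add_vertex(v):
    graph.setdefault(v, [])         # add vertex: O(1)

def add_edge(u, v):
    add_vertex(u)
    add_vertex(v)
    graph[u].append(v)              # add directed edge: O(1)

def adjacent(u, v):
    # Query: linear scan of u's neighbor list, O(|V|) worst case.
    return v in graph.get(u, [])

add_edge("a", "b")
add_edge("a", "c")
```

Storing a set of neighbors instead of a list would make the adjacency query O(1) on average, at the cost of losing insertion order; the table describes the classic list-based variant.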

Notation for asymptotic growth

- Θ (theta): bound is upper and lower, tight [1]; growth is equal [2]
- O (big-oh): bound is upper, tightness unknown; growth is less than or equal [3]
- o (small-oh): bound is upper, not tight; growth is less than
- Ω (big omega): bound is lower, tightness unknown; growth is greater than or equal
- ω (small omega): bound is lower, not tight; growth is greater than

[1] Big O is an upper bound, while Omega is a lower bound. Theta requires both Big O and Omega, which is why it's referred to as a tight bound (it must be both the upper and the lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time but has no stated upper limit. An algorithm taking Theta(n log n) is far preferable, since it takes at least n log n time (Omega(n log n)) and no more than n log n time (O(n log n)).

[2] f(n) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets larger. In other words, the growth rate of f(n) is asymptotically proportional to g(n).

[3] Same idea, except here the growth rate is no faster than g(n). Big-oh is the most useful notation because it represents worst-case behavior.


In short, if an algorithm is __ then its performance is __:

- o(n): performance < n
- O(n): performance ≤ n
- Θ(n): performance = n
- Ω(n): performance ≥ n
- ω(n): performance > n


Big-O Complexity Chart


This interactive chart, created by our friends over at MeteorCharts, shows the number of operations (y axis) required to obtain a result as the number of elements (x axis) increases. O(n!) is the worst complexity shown, requiring 720 operations for just 6 elements, while O(1) is the best, requiring only a constant number of operations for any number of elements.
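The chart's numbers are easy to reproduce. Here is a small sketch evaluating the standard growth functions at n = 6 (the dictionary and its labels are mine; the chart may plot the curves with different constants):

```python
import math

n = 6
growth = {
    "O(1)": 1,
    "O(log n)": math.log2(n),
    "O(n)": n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)": n ** 2,
    "O(2^n)": 2 ** n,
    "O(n!)": math.factorial(n),
}
# O(n!) already requires 720 operations at just 6 elements,
# while O(1) stays constant no matter how large n gets.
```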


Contributors

Edit these tables!

1. Eric Rowell
2. Quentin Pleple
3. Nick Dizazzo
4. Michael Abed
5. Adam Forsyth
6. Jay Engineer
7. Josh Davis
8. makosblade
9. Alejandro Ramirez
10. Joel Friedly
11. Robert Burke
12. David Dorfman
13. Eric Lefevre-Ardant
14. Thomas Dybdahl Ahle


148 comments

Michael Mitchell, 7 months ago:
This is great. Maybe you could include some resources (links to Khan Academy, MOOCs, etc.) that would explain each of these concepts for people trying to learn them.

Amanda Harlin, in reply to Michael Mitchell, 7 months ago:
Yes! Please & thank you

Cam Tyler, in reply to Michael Mitchell, 6 months ago:
This explanation in 'plain English' helps: http://stackoverflow.com/quest...

Arjan Nieuwenhuizen, in reply to Michael Mitchell, 5 months ago:
Here are the links that I know of. #1) http://aduni.org/courses/algor... #2) http://ocw.mit.edu/courses/ele... #3) https://www.udacity.com/course... (probably as good as, or maybe better than, #2, but I have not had a chance to look at it: http://ocw.mit.edu/courses/ele...) Sincerely, Arjan. P.S. https://www.coursera.org/cours... This course has just begun on Coursera (dated 1 July 2013) and looks very good.

fireheron, in reply to Arjan Nieuwenhuizen, 2 months ago:
Thank you Arjan. Especially the coursera.org one ;-)

Gokce Toykuyu, a year ago:
Could we add some tree algorithms and complexities? Thanks. I really like the Red-Black trees ;)

ericdrowell (Mod), in reply to Gokce Toykuyu, a year ago:
Excellent idea. I'll add a section that compares insertion, deletion, and search complexities for specific data structures.

Darren Le Redgatr, 7 months ago:
I came here from an idle Twitter click. I have no idea what it's talking about, or any of the comments, but I love the fact there are people out there this clever. Makes me think that one day humanity will come good. Cheers.

Jon Renner, 7 months ago:
This is god's work.

Adam Heinermann, 6 months ago:
Is there a printer-friendly version?

qwertykeyboard, 7 months ago:
It would be very helpful to have export to PDF. Thx

Gene, in reply to qwertykeyboard, a month ago:
You could convert the document yourself using Pandoc: http://johnmacfarlane.net/pand... It might take you a long time to get it working, but Pandoc is an amazing one-stop shop for file conversion, and it's cross-platform compatible. If I understand big-oh notation correctly, I might say "I estimate your learning rate for learning Pandoc will be O(1)."


Valentin Stanciu, 7 months ago:
1. Deletion/insertion in a singly-linked list is implementation dependent. For the question "Here's a pointer to an element; how long does it take to delete it?", singly-linked lists take O(n), since you have to search for the element that points to the element being deleted. Doubly-linked lists solve this problem.
2. Hashes come in a million varieties. However, with a good distribution function they are O(log n) worst case. Using a double hashing algorithm, you end up with a worst case of O(log log n).
3. For trees, the table should probably also contain heaps and the complexities for the operation "Get Minimum".

Guest, 7 months ago:
Finals are already over... This should have been shared a week ago! Would have saved me like 45 minutes of using Wikipedia.

Blake Jennings, 7 months ago:
i'm literally crying

tempire, 7 months ago:
This chart seems misleading. Big O is worst case, not average case; ~ is average case. O(...) shouldn't be used in the average case columns.

guest, in reply to tempire, 7 months ago:
I think big O is just an upper bound. It could be used for all (best, worst and average) cases. Am I wrong?

Luis, in reply to guest, 7 months ago:
You are right.

Oleksandr, in reply to Luis, 2 months ago:
@Luis That is WRONG. @tempire is correct. Big O cannot be used for lower, average, and upper bounds. Big O (Omicron) is the worst-case scenario: it is the upper bound for the algorithm. For instance, in a linear search algorithm, the worst case is when the list is completely out of order, i.e. the list is sorted but backwards. Omega is the lower bound. This is almost pointless to have; you would rather have Big O than Omega, because it is like saying "it will take more than five dollars to get to N.Y." versus "it will always take, at most, 135 dollars to get to New York." The first bit of information (Omega) is essentially useless; the second, however, gives you the constraint. Theta is the upper and lower bound together. This is the most beneficial piece of information to have about an algorithm, but unfortunately it is usually very hard to find. You can usually find the average for an algorithm's efficiency by testing it in average and worst cases together; this is simply a computational exercise to extract empirical data. There is another problem I do not like: the color scheme is sometimes wrong. O(n) is better than O(log(n))? In what way? 1024 vs. 10 increments that a sort algorithm has to perform, for instance? All in all this is good information, but in its current state, to the novice, it honestly needs to be taken with a grain of salt and fact-checked against a good algorithms book. This is just MHO, so if I'm off base or incorrect then feel free to flame me. :)

Luis, in reply to Oleksandr, 2 months ago:
@Oleksandr You are confused. Your example about the dollars states specific amounts (e.g. "at most 135 dollars"), but big O and related concepts are used to bound the order (linear, exponential, etc.) of a function that describes how an algorithm grows (in space, time, etc.) with problem size. To be more appropriate, your example should be modified to say something like "it takes at most 2$ per mile" (linear). With this in mind, you can thus understand how big O can be used both for, say, the best and the worst case. Take your linear search. As the size of the problem grows (the array to be searched grows in size), the best case still has an upper time bound of O(1) (it takes constant time to find an element at index 0, or another fixed position), while the worst case (the object is in the last index where we look) has an upper time bound of O(n) (it takes a number of steps of order equal to the problem size, n, until we find the object in the last index where we look).

Oleksandr1, in reply to Luis, a month ago:
You make a very poor assumption that because a specific value is given, it must be a linear function. It is in fact any polynomial function of my choice, given its parameters and any number of Lagrange constants, which will produce a value of 135, or any such number I specify to be used in the example. The point is that Big O is the upper bound of a function; in fact there are an infinite number of Big O's for any elementary function. Big O cannot be used for the best-case scenario; this is a complete misunderstanding of Omega vs. Omicron. You should read up on this, because it is very important. As for the example, $135 was given as an upper bound and $5 was the Omega value. I'm not sure why you don't understand a very clear analogy, but for you I'll change the situation and values: given an unknown function, it will run more than five iterations (Omega), BUT it will never run more than 135 iterations, 135 being the Omicron value. On the linear search algorithm, forgive me, I meant to say linear sort algorithm, which has its worst-case scenario when the list fed to the algorithm is in order, but backwards. I agree with what you said about the linear search algorithm.

(see more)

Yavuz Yetim, in reply to Oleksandr1, a month ago:
@Oleksandr @Luis IMHO, there are three different statements in this argument that lead to the eventual misunderstanding. I agree with Luis that the table is correct and not useless, but I also agree with Oleksandr that it's not complete (though I disagree that it is incomplete because of a mismatch between best/average case and big-O; see Statement iii and Example (a) at the end). The main confusion is between the terms "case" and "bound". These are orthogonal terms and do not have any relation to each other. For example, you can have a lower bound for the average case, or an upper bound for the best case (in total, 9 different, correct combinations, each useful for a different use case, and none of them useless or meaningless). Here are the statements in this argument: Statement i) "The table is wrong in using Big-O notation for all columns." This statement is false, because the table is correct. Big-O notation does not have anything to do with the worst, average, or best case. Big-O notation is only a representation for a function. Say the best-case run time of an algorithm for a given input of size n is exactly (3*n + 1). One correct representation of this function is O(n). Therefore, writing O(n) for a best-case entry is correct.

(see more)


ericdrowell (Mod), in reply to tempire, 7 months ago:
I'll try to clarify that. Thanks!

Antoine Grondin, 7 months ago:
I think DFS and BFS, under Search, would be more appropriately listed as Graph instead of Tree.


ericdrowell (Mod), in reply to Antoine Grondin, 7 months ago:
Fixed! Thanks

Quentin Pleple, in reply to Antoine Grondin, 7 months ago:
Agreed

Stephane Duguay, 7 months ago:
Hi, I'd like to use a French version of this page in class. Should I translate it on another website, or can you support localization while I do the data entry for French? I'm interested!

Marten Czech, in reply to Stephane Duguay, 7 months ago:
learn English!

Marcus, in reply to Marten Czech, 2 months ago:
Maybe he means he wants to deliver it to French students. He is offering to do the data entry for French, but clearly speaks English (from his comment). Don't be ignorant; there is no reason that everything should be in English.

Marten Czech, in reply to Marcus, 2 months ago:
The IT world ticks in English; the sooner the French realize that, the faster we can go together.

Jon Renner, 5 months ago:
Any way I can get a PDF version without taking screenshots myself?

Anonimancio Cobardoso, 7 months ago:
You could include a chart with a logarithmic scale. Looks nicer IMHO.

sigmaalgebra, 7 months ago:
You omitted an in-place, guaranteed O(n log(n)) array sort, e.g., heapsort. You omitted radix sort, which can be faster than any of the algorithms you mentioned. You might mention SAT and related NP-complete problems, where the best known algorithm for a problem of size n is O(2^n). You might also include an actual, precise definition of O().

Gábor Nádai, 7 months ago:
Nice.

IvanKuckir, 7 months ago:
Do you really find this useful? When talking about complexity, you must talk about some specific algorithm. But when you know the algorithm, you already know the complexity, am I wrong? Does anybody just learn the pairs algorithm_name : complexity, without any idea how the algorithm works? OMG...

ericdrowell (Mod), in reply to IvanKuckir, 7 months ago:
have you never had a technical interview before?

IvanKuckir, in reply to ericdrowell, 7 months ago:
No, I am still a student. And I think that if an employer wants you to know just the algorithm complexity, but not the whole algorithm, there is something wrong with that company...

Yeukhon Wong, in reply to IvanKuckir, 7 months ago:
That's too strong. There are simply too many algorithms. Also, just because certain companies ask about algorithms does not imply other companies have a lower expectation. Most of the top companies I know of don't even go into Red-Black trees. They are interested in basic tree/graph and sorting algorithms, and they give you one or two puzzles that don't really help in real life. Half of the Google interview questions are good, but the other half are puzzles that I (and certainly a lot of people) find less helpful. One I do find useful is fitting GBs of data into 1M of memory, if I remember correctly. Also, not everyone will remember the complexity. Certain people will never use algorithms beyond tree search or sorting; they might not even need streaming algorithms.

chriswashere818, in reply to IvanKuckir, 7 months ago:
I don't know every algorithm out there, including some of these, because I am on the hardware side of computer science. So yes, even though I know some of these, I don't know all of them, and therefore I found it useful. Anyway, complexity is not always apparent. I've worked with algorithms that are O(n^2.836)... how is that obvious?

IvanKuckir, in reply to chriswashere818, 7 months ago:
You are probably talking about some fast matrix multiplication, aren't you? :) I have implemented the Strassen algorithm once; it has complexity around n^2.8...

chriswashere818, in reply to IvanKuckir, 7 months ago:
Yes, that's the one.

Ankush Gupta, 7 months ago:
Awesome resource! You should add Dijkstra using a Fibonacci Heap!

AmitK, 7 months ago:
It's pretty handy!

maxw3st, a year ago:
This gives me some excellent homework to do of a variety I'm not getting in classes. Thank you.

Nikola, 4 months ago:
Extremely useful, and on an easy-to-remember URL. Thank you!

Mohamed Shimran, 6 months ago:
This is damn great :) Ultimate programming tutorials

ct01, 7 months ago:
Why is space complexity O(|V|) "good" in some cases and "fair" in others?

soulmachine, 7 months ago:
awesome!

Nir Alfasi, 7 months ago:
Complexity for BFS/DFS is O(|V| + |E|), since we're traversing each node/edge only once. Good job!



Page styling via Bootstrap. Comments via Disqus. Algorithm detail via Wikipedia. Big-O complexity chart via MeteorCharts. Table source hosted on GitHub. Mashup via @ericdrowell.
