DSA cheat sheet

Time complexity

| Data Structure | Insertion | Deletion | Indexing | Reading | Search (Optimized) | Search (Worst Case) | Sorting |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Arrays & Strings | O(n) | O(n) | O(1) | O(1) | O(log n) (binary search, if sorted) | O(n) | O(n log n) |
| Linked List | O(1) | O(1) | O(n) | O(n) | O(n) (linked lists are not usually sorted) | O(n) | O(n log n) |
| Hash Table | O(1) | O(1) | O(1) | O(1) | - | O(n) | - |
| Stack/Queue/Deque | O(1) | O(1) | - | - | - | - | - |
| Heap | O(log n) | O(log n) | - | - | - | O(n) | O(n log n) |
| Binary Search Tree | O(log n) | O(log n) | - | O(n) | O(log n) (BSTs are sorted by definition) | O(n) | O(n log n) |

Note that the hash table and binary search tree figures are average-case: a hash table with many collisions and an unbalanced BST both degrade to O(n) for insertion, deletion, and search.
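To make the Arrays & Strings row concrete: indexing is O(1), but search drops from O(n) to O(log n) only when the data is kept sorted so binary search applies. Here is a minimal Python sketch of that distinction (the function names are illustrative, not from any particular library; `bisect_left` is from the standard library):

```python
from bisect import bisect_left

def binary_search(sorted_items, target):
    """O(log n) search: valid only because the list is sorted."""
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

def linear_search(items, target):
    """O(n) search: works on unsorted data, but may inspect every element."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

data = [3, 7, 11, 15, 42, 99]  # already sorted
assert binary_search(data, 42) == 4
assert linear_search(data, 42) == 4
assert binary_search(data, 5) == -1  # absent values are handled too
```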

The O(1), O(n), O(log n), and O(n log n) notations in the table are time complexities. They are used in computer science to describe how the performance of an algorithm changes as the input grows.

Here’s a brief explanation of what each notation means:

- O(1): constant time complexity. The operation takes a constant amount of time, regardless of the size of the input data.
- O(n): linear time complexity. The time taken by the operation grows linearly with the size of the input data.
- O(log n): logarithmic time complexity. The time taken grows logarithmically with the size of the input. This is often seen in operations involving binary trees or when using binary search.
- O(n log n): linearithmic time complexity. This is commonly seen in efficient sorting algorithms like quicksort, mergesort, and heapsort.

These notations provide a high-level understanding of how the performance of an operation will change as the size of the input data increases. They help us estimate the efficiency of an algorithm or a specific operation within a data structure.
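As a quick sketch, each complexity class above can be paired with a representative operation. The function names below are just illustrative labels:

```python
def constant(items):
    """O(1): a single index access, independent of len(items)."""
    return items[0]

def linear(items):
    """O(n): touches every element exactly once."""
    return sum(items)

def logarithmic(sorted_items, target):
    """O(log n): halves the search space on each step (binary search)."""
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo

def linearithmic(items):
    """O(n log n): a comparison-based sort (Python's built-in Timsort)."""
    return sorted(items)
```

Doubling the input size roughly doubles the work for `linear`, adds only one extra step for `logarithmic`, and leaves `constant` unchanged; that scaling behavior is exactly what the notation captures.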

Bhargava Mummadireddy
Software Craftsman