
Intro to DSA & Big O Notation

Linda Hamilton
2024-09-19


Points to keep in mind while mastering DSA:

Mastering DSA is what qualifies you for the high salaries offered to software engineers.
DSA is a major part of software engineering.
Before writing code, make sure you understand the big picture; only then dive into the details.
It is all about understanding concepts intuitively and then translating them into code in any language, since DSA is language-agnostic.
Each upcoming concept builds on the previous ones in some way, so do not skip topics or move on until you have thoroughly mastered the current one through practice.
When we learn concepts intuitively, we gain a deeper understanding of the material, which helps us retain it longer.
If you follow these suggestions, you have nothing to lose.

Linear DS:
Arrays
Linked List (LL) & Doubly Linked List (DLL)
Stack
Queue & Circular Queue

Non-linear DS:
Trees
Graphs

Big O Notation

Understanding this notation is essential for comparing the performance of algorithms.
It is a mathematical way of comparing how efficiently algorithms run.

Time Complexity

The faster the code runs, the lower its time complexity.
Very important for most interviews.

Space Complexity

Rarely considered compared to time complexity, since storage is cheap.
Still worth understanding, as interviewers may ask about it too.

Three Greek letters:

  1. Omega
  2. Theta
  3. Omicron, i.e. Big-O [most common]

Algorithm cases

  1. Best case [denoted by Omega]
  2. Average case [denoted by Theta]
  3. Worst case [denoted by Omicron]

Technically, there is no such thing as a best-case or average-case Big-O; those cases are denoted by Omega and Theta respectively.
With Big-O we are always measuring the worst case; the sketch below illustrates the three cases.
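A minimal linear search sketch (using a hypothetical linearSearch helper) illustrates the three cases: the target at the very first index is the best case (Omega), a target around the middle is the average case (Theta), and a missing target is the worst case (Big-O).

// Hypothetical helper: linear search, used only to illustrate the three cases
function linearSearch(arr, target){
  for(let i=0; i<arr.length; i++){
    if(arr[i] === target) return i; // found at index i
  }
  return -1; // not found
}

const items = [7, 3, 9, 1, 5];
linearSearch(items, 7);   // best case: first element checked      -> Omega(1)
linearSearch(items, 9);   // average case: found around the middle -> Theta(n)
linearSearch(items, 42);  // worst case: not present, scans all n  -> O(n)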

## O(n): Efficient Code
Proportional.
It is simplified by dropping the constant values.
An operation happens 'n' times, where n is passed as an argument, as shown below.
On the graph it is always a straight line, as the number of operations is directly proportional to n.
X axis: value of n.
Y axis: number of operations.

// O(n)
function printItems(n){
  for(let i=1; i<=n; i++){
    console.log(i);
  }
}
printItems(9);

// O(n) + O(n) i.e O(2n) operations. As we drop constants, it eventually becomes O(n)
function printItems(n){
  for(let i=0; i<n; i++){
    console.log(i);
  }
  for(let j=0; j<n; j++){
    console.log(j);
  }
}
printItems(10);

## O(n^2):
Nested loops.
The number of items output is n*n for an input of n.
function printItems(n){
  for(let i=0; i<n; i++){
    console.log('\n');
    for(let j=0; j<n; j++){
      console.log(i, j);
    }
  }
}
printItems(4);

## O(n^3):
The number of items output is n*n*n for an input of n.
// O(n*n*n)
function printItems(n){
  for(let i=0; i<n; i++){
    console.log(`Outer Iteration ${i}`);
    for(let j=0; j<n; j++){
      console.log(`  Mid Iteration ${j}`);
      for(let k=0; k<n; k++){
        //console.log("Inner");
        console.log(`    Inner Iteration ${i} ${j} ${k}`);
      }
    }
  }
}
printItems(3);


## Comparison of Time Complexity:
O(n) > O(n*n), where '>' means 'more efficient than'.


## Drop non-dominants:
// O(n*n) + O(n)
function printItems(n){
  // O(n*n): nested for loop
  for(let i=0; i<n; i++){
    for(let j=0; j<n; j++){
      console.log(i, j);
    }
  }
  // O(n): single for loop
  for(let k=0; k<n; k++){
    console.log(k);
  }
}
printItems(4);

The complexity of the above code is O(n*n) + O(n).
By dropping non-dominants, it becomes O(n*n).
O(n) becomes negligible as n grows; O(n*n) is the dominant term and O(n) is the non-dominant term here.

## O(1):
Referred to as constant time, i.e. the number of operations does not change as 'n' changes.
A fixed number of operations irrespective of the size of the input.
MOST EFFICIENT. Nothing is more efficient than this.
On the graph it is a flat line sitting just above the x-axis.


// O(1): a constant number of operations (three additions), no matter how large n is
function printItems(n){
  return n+n+n+n;
}
printItems(3);


## Comparison of Time Complexity:
O(1) > O(n) > O(n*n)
## O(log n):
Divide and conquer technique.
Partitioning into halves until the goal is achieved.

log(base 2) of 8 = 3, i.e. we are basically asking: 2 raised to what power gives 8? That power denotes the number of operations needed to get to the result.

To put it another way: how many times do we need to divide 8 into halves (this is what makes 2 the base of the logarithm) until we are left with a single target item? The answer is 3.

Ex. An amazing application: for an array of size 1,000,000,000, how many times do we need to cut it in half to get to the target item?
log(base 2) of 1,000,000,000 ≈ 30
i.e. about 30 halvings will get us to the target item.

Hence, if we search in a linear fashion, we may need to scan a billion items in the array.
But if we use the divide & conquer approach, we can find the item in only about 30 steps.
This is the immense power of O(log n); see the binary search sketch below.
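A minimal binary search sketch (assuming the array is already sorted) shows this halving in action; each pass of the loop discards half of the remaining range:

// O(log n): binary search on a sorted array (minimal sketch)
function binarySearch(sortedArr, target){
  let low = 0;
  let high = sortedArr.length - 1;
  while(low <= high){
    const mid = Math.floor((low + high) / 2);
    if(sortedArr[mid] === target) return mid;  // found the target
    if(sortedArr[mid] < target) low = mid + 1; // discard the left half
    else high = mid - 1;                       // discard the right half
  }
  return -1; // target not present
}
binarySearch([1, 3, 5, 7, 9, 11, 13, 15], 11); // found in at most log(base 2) of 8 = 3 halvings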

## Comparison of Time Complexity:
O(1) > O(log n) > O(n) > O(n*n)
Best is O(1) or O(log n).
Acceptable is O(n).
O(n log n):
Used by some sorting algorithms.
The most efficient sorting we can achieve, unless we are sorting only numbers; see the merge sort sketch below.
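As one concrete illustration, here is a minimal merge sort sketch (one of the O(n log n) sorts): the array is halved roughly log n times, and each level of halving does O(n) merging work.

// O(n log n): merge sort (minimal sketch)
function mergeSort(arr){
  if(arr.length <= 1) return arr;            // base case: already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid)); // sort the left half
  const right = mergeSort(arr.slice(mid));   // sort the right half
  // merge the two sorted halves: O(n) work per level of recursion
  const merged = [];
  let i = 0, j = 0;
  while(i < left.length && j < right.length){
    if(left[i] <= right[j]) merged.push(left[i++]);
    else merged.push(right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
mergeSort([5, 2, 9, 1, 7]); // [1, 2, 5, 7, 9]
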
Tricky Interview Question: Different Terms for Inputs.
function printItems(a,b){
  // O(a)
  for(let i=0; i<a; i++){
    console.log(i);
  }
  // O(b)
  for(let j=0; j<b; j++){
    console.log(j);
  }
}
printItems(3,5);

O(a) + O(b): we can't treat both variables as 'n'. Suppose a is 1 and b is 1 billion; the two loops will behave very differently. Hence, we call it O(a + b).
Similarly, if these were nested for loops, it would become O(a * b), as sketched below.
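A minimal sketch of that nested version (hypothetical printPairs helper), which performs a * b operations:

// O(a * b): nested loops over two independent inputs
function printPairs(a, b){
  for(let i=0; i<a; i++){
    for(let j=0; j<b; j++){
      console.log(i, j);
    }
  }
}
printPairs(3, 5); // 3 * 5 = 15 operations
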
## Arrays
No reindexing is required in arrays for push-pop operations, hence both are O(1).
Adding/removing at the end of an array is O(1).

Reindexing is required in arrays for shift-unshift operations, hence both are O(n), where n is the number of items in the array.
Adding/removing at the front of an array is O(n).

Inserting anywhere in an array except at the start and end positions:
myArr.splice(indexForOperation, itemsToBeRemoved, contentToBeInserted)
The part of the array after the insertion point has to be reindexed.
Hence, it is O(n) and not O(0.5 n), as Big-O always measures the worst case, not the average case; 0.5 is a constant, so it is dropped.
The same applies when removing an item from an array, as the items after it have to be reindexed.
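A short sketch of these operations on a JavaScript array, annotated with their costs:

// Array operation costs (n = number of items in the array)
const myArr = [10, 20, 30, 40];

myArr.push(50);         // O(1): add at the end, no reindexing
myArr.pop();            // O(1): remove from the end, no reindexing

myArr.unshift(5);       // O(n): add at the front, every existing item is reindexed
myArr.shift();          // O(n): remove from the front, every remaining item is reindexed

myArr.splice(2, 0, 25); // O(n): insert at index 2, items after it are reindexed
myArr.splice(2, 1);     // O(n): remove the item at index 2, items after it are reindexed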


Finding an item in an array:
If it's by value: O(n)
If it's by index: O(1)

Select a DS based on the use case.
For index-based access, an array is a great choice.
If a lot of insertions and deletions happen at the beginning, use some other DS, as the reindexing will make an array slow.

Comparison of time complexities for n = 100:

O(1) = 1
O(log n) = ~7
O(n) = 100
O(n^2) = 10,000

Comparison of time complexities for n = 1000:

O(1) = 1
O(log n) = ~10
O(n) = 1000
O(n^2) = 1,000,000
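A tiny snippet (hypothetical operationCounts helper) that reproduces these counts, rounding the logarithm up:

// Rough operation counts for each complexity class
function operationCounts(n){
  return {
    'O(1)':     1,
    'O(log n)': Math.ceil(Math.log2(n)),
    'O(n)':     n,
    'O(n^2)':   n * n
  };
}
console.log(operationCounts(100));  // { 'O(1)': 1, 'O(log n)': 7, 'O(n)': 100, 'O(n^2)': 10000 }
console.log(operationCounts(1000)); // { 'O(1)': 1, 'O(log n)': 10, 'O(n)': 1000, 'O(n^2)': 1000000 }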

We mainly focus on these 4:
O(n*n): nested loops
O(n): proportional
O(log n): divide and conquer
O(1): constant

O(n!) usually happens when we deliberately write terrible code.
O(n*n): horrible algorithms.
O(n log n): acceptable, and used by some sorting algorithms.
O(n): acceptable.
O(log n), O(1): best.

The space complexity of almost all data structures is the same, i.e. O(n).
For sorting algorithms, space complexity ranges from O(n) down to O(log n) or O(1).

Time complexity varies from algorithm to algorithm.

For anything other than numbers (e.g. strings), the best time complexity we can get for sorting is O(n log n), as in Quick Sort, Merge Sort, Tim Sort and Heap Sort.

The best way to apply what you have learned is to write as much code as possible.

Choose which DS to use for a given problem statement based on the pros and cons of each DS.

For more information, refer to: bigocheatsheet.com
