# 338. Counting Bits

[Try it on leetcode](https://leetcode.com/problems/counting-bits/)

## Description

Given an integer n, return an array ans of length n + 1 such that for each i (0 <= i <= n), ans[i] is the number of 1's in the binary representation of i.

 

Example 1:

Input: n = 2
Output: [0,1,1]
Explanation:
0 --> 0
1 --> 1
2 --> 10

Example 2:

Input: n = 5
Output: [0,1,1,2,1,2]
Explanation:
0 --> 0
1 --> 1
2 --> 10
3 --> 11
4 --> 100
5 --> 101

 

Constraints:

0 <= n <= 10^5

Follow up:

It is very easy to come up with a solution with a run time of O(n log n). Can you do it in linear time O(n) and possibly in a single pass?
Can you do it without using any built-in function (i.e., like __builtin_popcount in C++)?
## Solution (Python)

```Python
from typing import List


class Solution:
    def countBits(self, n: int) -> List[int]:
        return self.dynamicprogramming(n)

    # Time Complexity:  O(n log n)
    # Space Complexity: O(n)
    def naive(self, n: int) -> List[int]:
        return [self.countones(num) for num in range(0, n + 1)]

    # Time Complexity:  O(n)
    # Space Complexity: O(n)
    def dynamicprogramming(self, n: int) -> List[int]:
        dp = [0] * (n + 1)
        offset = 1  # largest power of two <= index
        for index in range(1, n + 1):
            if index == offset * 2:
                offset *= 2
            # Clearing the most significant bit of index gives index - offset,
            # so index has exactly one more set bit than index - offset.
            dp[index] = dp[index - offset] + 1
        return dp

    # Brian Kernighan's trick: num &= num - 1 clears the lowest set bit.
    def countones(self, num: int) -> int:
        cnt = 0
        while num:
            cnt += 1
            num &= num - 1
        return cnt
```
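The DP fills `dp[index]` from `dp[index - offset]`, where `offset` is the largest power of two not exceeding `index`: removing that leading bit leaves a smaller number whose count is already known, so `dp[index] = dp[index - offset] + 1` (for example, `dp[5] = dp[5 - 4] + 1 = dp[1] + 1 = 2`). A minimal sanity check against the two examples above, sketched as a `__main__` block that is not part of the original solution:

```Python
if __name__ == "__main__":
    solution = Solution()

    # Matches Example 1 and Example 2 from the description.
    assert solution.countBits(2) == [0, 1, 1]
    assert solution.countBits(5) == [0, 1, 1, 2, 1, 2]

    # The O(n log n) popcount approach should agree with the O(n) DP.
    assert solution.naive(16) == solution.dynamicprogramming(16)

    print("all checks passed")
```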