You are given an array \(A\) of \(N\) positive integers and \(Q\) queries. In each query, you are given an index \(i\) \((1\leq i \leq N)\). Find an index \(j\) \((1\leq j \leq N)\) such that \((i\) & \(j) = 0\) and \(A_i + A_j\) is maximized, and print the maximum possible value of \(A_i+A_j\) for each query. If no such index \(j\) exists, then the answer is \(-1\).
Note: Here, \(\&\) is a bitwise AND operator.
Input format

The first line contains a single integer \(T\) denoting the number of test cases. For each test case:

- The first line contains two space-separated integers \(N\) and \(Q\).
- The second line contains \(N\) space-separated integers \(A_1, A_2, \dots, A_N\).
- Each of the next \(Q\) lines contains a single integer \(i\), the queried index.
Output format
For each test case, print \(Q\) lines, where the \(i^{th}\) line contains the answer to the \(i^{th}\) query.
Constraints
\(1 \leq T \leq 20\)
\(1 \leq N, Q \leq 10^5\)
\(1 \leq A_i \leq 10^9\)
\(1 \leq Q_i \leq N\)
Sample explanation

In the 1st query, we can choose index \(1\), as \((1\) & \(2) = 0\) and \((7 + 2) = 9\) is the maximum possible value we can get.

In the 2nd query, we can choose index \(3\), as \((3\) & \(4) = 0\) and \((8 + 6) = 14\) is the maximum possible value we can get.
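The condition \((i\) & \(j) = 0\) means that \(j\) must be a submask of the bitwise complement of \(i\). One way to answer all queries within the given constraints is a sum-over-subsets (SOS) dynamic program that precomputes, for every mask, the maximum \(A_j\) over indices \(j\) that are submasks of it. The sketch below is illustrative only and is not part of the statement; the function name `solve` and its in-memory interface are assumptions (the statement only fixes the console format).

```python
def solve(A, queries):
    # A is treated as 1-indexed: A[0] is an unused placeholder,
    # A[1..n] are the array values. queries is a list of indices i.
    n = len(A) - 1
    B = max(1, n.bit_length())      # enough bits to represent every index 1..n
    NEG = float('-inf')

    # best[m] will hold max A_j over all indices j (1 <= j <= n)
    # that are submasks of m.
    best = [NEG] * (1 << B)
    for j in range(1, n + 1):
        best[j] = A[j]

    # SOS DP: fold each bit in turn so best[m] also covers
    # submasks with that bit cleared.
    for b in range(B):
        for m in range(1 << B):
            if m & (1 << b):
                other = best[m ^ (1 << b)]
                if other > best[m]:
                    best[m] = other

    full = (1 << B) - 1
    out = []
    for i in queries:
        c = full & ~i               # indices j with i & j == 0 are submasks of c
        out.append(A[i] + best[c] if best[c] != NEG else -1)
    return out
```

Precomputation costs \(O(2^B \cdot B)\) per test case (with \(B \approx \log_2 N\)), after which each query is answered in \(O(1)\), which fits \(N, Q \leq 10^5\).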