Untitled

Time Limit: 2000/1000 MS (Java/Others)
Memory Limit: 65536/65536 K (Java/Others)
Problem Description
There is an integer $a$ and $n$ integers $b_1, \ldots, b_n$. We select some of the numbers $b_1, \ldots, b_n$ and arrange them in any order as $c_1, \ldots, c_r$, so that $a \bmod c_1 \bmod c_2 \bmod \cdots \bmod c_r = 0$ (i.e., $a$ is replaced by its remainder modulo $c_i$ at each step, and after the last step $a$ must equal $0$). Please determine the minimum possible value of $r$. If the goal cannot be achieved, print $-1$ instead.
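To make the step-by-step process concrete, here is a minimal C++ sketch (an illustration, not part of the original statement): it applies a chosen sequence of moduli to $a$ one at a time. The helper name `apply_chain` and the example values, taken from the first sample below, are assumptions used only for demonstration.

```cpp
#include <cstdio>
#include <vector>

// Apply a chosen sequence c_1, ..., c_r of moduli to a, one at a time,
// and return the final value of a.
long long apply_chain(long long a, const std::vector<long long>& c) {
    for (long long ci : c) {
        a %= ci;  // a becomes the remainder of a divided by c_i
    }
    return a;
}

int main() {
    // First sample: a = 9, choosing the sequence (7, 2) gives
    // 9 mod 7 = 2, then 2 mod 2 = 0, so r = 2 operations suffice.
    std::vector<long long> c = {7, 2};
    std::printf("%lld\n", apply_chain(9, c));  // prints 0
    return 0;
}
```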
Input
The first line contains one integer $T \leq 5$, the number of test cases. Each test case consists of two lines: the first contains two integers $n$ and $a$ ($1 \leq n \leq 20$, $1 \leq a \leq 10^6$); the second contains $n$ integers $b_1, \ldots, b_n$ ($1 \leq b_i \leq 10^6$ for all $1 \leq i \leq n$).
Output
For each test case, print the answer on a single line, for $T$ lines in total.
Sample Input
2
2 9
2 7
2 9
6 7
Sample Output
2
-1
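As a worked check of the samples (an explanatory note, not part of the original statement): in the first test case, $9 \bmod 7 = 2$ and $2 \bmod 2 = 0$, so $r = 2$ suffices, while a single operation cannot reach $0$ because neither $9 \bmod 2 = 1$ nor $9 \bmod 7 = 2$ is $0$. In the second test case, the only possible remainders are $9 \bmod 6 = 3$ and $9 \bmod 7 = 2$, and applying the remaining number changes nothing ($3 \bmod 7 = 3$, $2 \bmod 6 = 2$), so $0$ is unreachable and the answer is $-1$.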