You are given a matrix A of dimension N*M consisting of non-negative integers, where N is the number of rows and M is the number of columns. Number the rows of the matrix from 1 to N from top to bottom, and number the columns from 1 to M from left to right. Let Aij be the element at the intersection of the ith row and jth column. You have to determine the largest square submatrix whose sum of elements is less than or equal to a given number S.
A square submatrix, as its name suggests, has the same number of rows and columns.
Input :
The first line contains three space-separated integers N, M and S, representing the number of rows, the number of columns and the maximum sum limit respectively.
Each of the next N lines contains M space-separated integers describing the matrix: the jth integer of the (i+1)th line is the element at the ith row and jth column of matrix A.
Output :
If there is no such square submatrix, output a single integer -1.
Otherwise, on the first line print a single integer: the number of rows in the largest square submatrix.
On the next line print two space-separated integers r and c, the row and column numbers of the top-left corner of that submatrix. If there is more than one such submatrix, output the one with the smallest r. If several submatrices share that r, output the one with the smallest c.
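The I/O format above can be sketched as a minimal Python driver. This is an illustration, not part of the problem statement; `parse_input` and the `solve` parameter are hypothetical names, and `solve(N, M, S, A)` is assumed to return `(k, r, c)` for the best square, or `None` when no square fits.

```python
import sys

def parse_input(text):
    # First line: N, M, S. Then N rows of M integers each.
    data = text.split()
    N, M, S = int(data[0]), int(data[1]), int(data[2])
    vals = list(map(int, data[3:3 + N * M]))
    A = [vals[i * M:(i + 1) * M] for i in range(N)]
    return N, M, S, A

def main(solve):
    # solve(N, M, S, A) -> (k, r, c) for the best square, or None.
    N, M, S, A = parse_input(sys.stdin.read())
    ans = solve(N, M, S, A)
    if ans is None:
        print(-1)          # no square submatrix has sum <= S
    else:
        k, r, c = ans
        print(k)           # side length of the largest valid square
        print(r, c)        # 1-based top-left coordinates
```

Reading the whole input at once with `sys.stdin.read()` is the usual choice here, since N*M can be up to 6.25 million integers.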
Constraints :
1 <= N <= 2500
1 <= M <= 2500
0 <= Aij <= 10000 where 1 <= i <= N, 1 <= j <= M.
0 <= S <= 10^12
Explanation :
There are three submatrices with the maximum number of rows (2) having sum <= 5. Their top-left coordinates are (1,2), (2,1) and (2,2). We output the one with the minimum r, i.e. (1,2).
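One possible approach, sketched below under my own assumptions (the function name `largest_square` is not from the statement): precompute 2D prefix sums so any square's sum costs O(1), then binary-search on the side length. The binary search is valid because all entries are non-negative, so any sub-square of a valid square is also valid, making feasibility monotone in the side length.

```python
def largest_square(N, M, S, A):
    # P[i][j] = sum of A[0..i-1][0..j-1] (2D prefix sums, 1-based border of zeros).
    P = [[0] * (M + 1) for _ in range(N + 1)]
    for i in range(N):
        row, prev, Ai = P[i + 1], P[i], A[i]
        for j in range(M):
            row[j + 1] = row[j] + prev[j + 1] - prev[j] + Ai[j]

    def first_fit(k):
        # First k x k square with sum <= S, scanning top-to-bottom then
        # left-to-right, which matches the smallest-r, then smallest-c rule.
        for i in range(N - k + 1):
            for j in range(M - k + 1):
                s = P[i + k][j + k] - P[i][j + k] - P[i + k][j] + P[i][j]
                if s <= S:
                    return (i + 1, j + 1)  # 1-based coordinates
        return None

    # Binary search on the side length k (feasibility is monotone).
    lo, hi, best = 1, min(N, M), None
    while lo <= hi:
        mid = (lo + hi) // 2
        pos = first_fit(mid)
        if pos is not None:
            best = (mid, pos[0], pos[1])
            lo = mid + 1
        else:
            hi = mid - 1
    return best  # None means the answer is -1
```

This runs in O(N*M*log(min(N, M))) checks, which fits the 2500x2500 limit comfortably in a compiled language; in pure Python the O(N*M) two-pointer variant may be needed for speed.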