The problem of determining the longest increasing subsequence in a list of numbers is already a classic in programming competitions. Despite this fact, that is the problem you must solve here. But, to keep you from yawning while solving it, we introduced a small change: the list of numbers will be given as a bidimensional matrix, and the increasing sequence must be "embedded" in a submatrix of the original matrix.

Let's define the problem more precisely. The linearization of a bidimensional matrix is the concatenation of its lines, from the first to the last. A submatrix is a rectangular region of a bidimensional matrix (with sides parallel to the sides of the matrix). The size of a submatrix is its number of elements.

You must write a program that, given a matrix of integers, determines the largest submatrix such that, when linearized, it results in an increasing sequence. The figure below shows some examples of submatrices of largest size which contain increasing sequences. Notice that more than one such submatrix may exist in a given matrix.

Input

The input contains several test cases. The first line of a test case contains two integers N and M, indicating the matrix dimensions (1 ≤ N, M ≤ 600). Each of the next N lines contains M integers, separated by a space, describing the elements of the matrix. Element Xi,j of the matrix is the j-th integer of the i-th line in the input (-10^6 ≤ Xi,j ≤ 10^6). The end of input is indicated by a line containing only two zeros, separated by a space.

Output

For each test case in the input your program must print one single line, containing the size of the largest submatrix that, when linearized, results in an increasing sequence.

Sample Input

3 3
1 2 5
4 6 7
10 8 3
3 4
1 2 1 2
9 6 7 3
8 7 2 8
4 2
-23 -12
0 2

16 15
57 33
4 4
2 2 2 2
2 2 2 2
2 2 2 2
2 2 2 2
0 0

Sample Output

4
3
4
1
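As an illustration of the definitions above (linearization, submatrix, size), here is a brute-force sketch in Python. It enumerates every submatrix, linearizes it row by row, and keeps the largest one whose linearization is strictly increasing (the all-2s sample case, whose answer is 1, indicates that "increasing" means strictly increasing). This sketch is far too slow for the full 600 × 600 limits and is only meant to make the definitions concrete; the function name is invented for illustration, not part of the problem statement.

```python
def largest_increasing_submatrix(mat):
    """Brute force: try every rectangular submatrix and check whether
    its linearization (concatenation of its rows, first to last) is
    strictly increasing. O(N^2 * M^2) submatrices, each checked in
    O(N * M) time -- fine for the samples, too slow for N, M = 600."""
    n, m = len(mat), len(mat[0])
    best = 0
    for top in range(n):
        for left in range(m):
            for bottom in range(top, n):
                for right in range(left, m):
                    # Linearize the submatrix row by row.
                    seq = [mat[r][c]
                           for r in range(top, bottom + 1)
                           for c in range(left, right + 1)]
                    # Keep it only if strictly increasing.
                    if all(a < b for a, b in zip(seq, seq[1:])):
                        best = max(best, len(seq))
    return best
```

For example, on the first sample matrix the 2×2 submatrix [[2, 5], [6, 7]] linearizes to 2, 5, 6, 7, so:

```python
largest_increasing_submatrix([[1, 2, 5], [4, 6, 7], [10, 8, 3]])  # → 4
```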