User blog comment:Ikosarakt1/Coding strings as ordinals (for Friedman's n(k))/@comment-5529393-20130715142426/@comment-5529393-20130807022513

Okay, here is a bad sequence of length greater than F(k) using k+4 letters.

The first five letters are k+3, k+4, k+4, k+4, k+3. Next, we put k+2 at the indices 6, 13, 27, 55, ... and also at the indices 9, 19, 39, 79, ... (in each family, every index is twice the previous one plus 1). We have chosen the indices so that any block subsequence {i, i+1, ..., 2i} past the initial prefix (i ≥ 5) contains exactly two instances of k+2. Also, the intervals between consecutive instances of k+2 have lengths 2, 3, 5, 7, 11, 15, ..., and the ith interval has length at least i+1. So we can put the ith sequence of F(k) using the first k letters in the ith interval, and use the letter k+1 to pad out the remainder of the interval. This continues until the F(k)th interval, which extends the length of the sequence past F(k) (in fact, to about 2^(F(k)/2), since the k+2 indices grow geometrically).
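As a sanity check on the index choice (this sketch is my own, not part of the original argument): both families 6, 13, 27, 55, ... and 9, 19, 39, 79, ... follow the rule p → 2p + 1, and we can verify by brute force that every block {i, ..., 2i} with i ≥ 5 picks up exactly two of these positions.

```python
def k2_positions(limit):
    """1-indexed positions that receive the letter k+2, up to `limit`.
    Two interleaved families starting at 6 and 9; each next term is 2p + 1."""
    pos = []
    for start in (6, 9):
        p = start
        while p <= limit:
            pos.append(p)
            p = 2 * p + 1
    return sorted(pos)

# Verify: every block {i, i+1, ..., 2i} with i >= 5 contains
# exactly two k+2 positions.
limit = 10_000
pos = set(k2_positions(limit))
for i in range(5, limit // 2):
    count = sum(1 for p in range(i, 2 * i + 1) if p in pos)
    assert count == 2, (i, count)
```

The check passes because if p is the first k+2 position in the block, the next one is at most 2i (the previous position was below i, and doubling-plus-one keeps it inside the block), while the one after that is 2p + 1 > 2i.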

Now, given block subsequences [i, 2i] and [j, 2j] with i < j, each contains exactly two instances of k+2, and any embedding must map them to each other. So the interval between the k+2's in [i, 2i] must map into the interval between the k+2's in [j, 2j]. But the earlier interval contains an earlier sequence from F(k), while the later interval contains a later sequence from F(k) along with k+1's. By the definition of the F(k) sequences, no earlier sequence maps into a later one, so no embedding of [i, 2i] into [j, 2j] exists, and the sequence is bad.
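For reference, here is the badness condition being used, written out in Python (my own illustration of the definition, not code from the construction): a sequence x is bad when no block x_i, ..., x_{2i} embeds as a (not necessarily contiguous) subsequence into a later block x_j, ..., x_{2j}.

```python
def is_subsequence(a, b):
    """True if a occurs as a (not necessarily contiguous) subsequence of b,
    preserving order."""
    it = iter(b)
    return all(c in it for c in a)  # `in` consumes the iterator up to each match

def is_bad(x):
    """Friedman's condition: no block x[i..2i] embeds into a later block
    x[j..2j].  Blocks are 1-indexed as in the comment; x is a 0-indexed list."""
    n = len(x)
    for i in range(1, n // 2 + 1):
        for j in range(i + 1, n // 2 + 1):
            if is_subsequence(x[i - 1:2 * i], x[j - 1:2 * j]):
                return False
    return True
```

For example, with one letter, [1, 1, 1] is bad but [1, 1, 1, 1] is not (the block at i = 1 embeds into the block at j = 2), which matches n(1) = 3.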

Looking at it more carefully, it seems we can reduce to k+3 letters by letting the first five letters be k+3, k+3, k+1, k+1, k+3. Please check whether this is correct.