Incorrect processing of authentication data in server mode
The code assumes that the authentication data (user:password) is less than 100 bytes and will be read by a single InputStream.read call.
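For context, InputStream.read makes no guarantee about how many bytes a single call returns; it may deliver only part of what the peer sent, so a robust reader must loop. A minimal sketch (ReadFullyDemo and readFully are illustrative names, not part of c.java):

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    // Loop until exactly n bytes have been read; a single read() may return fewer.
    static byte[] readFully(InputStream in, int n) throws IOException {
        byte[] buf = new byte[n];
        int off = 0;
        while (off < n) {
            int r = in.read(buf, off, n - off);
            if (r < 0)
                throw new EOFException("stream closed before " + n + " bytes arrived");
            off += r;
        }
        return buf;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "user:password".getBytes();
        byte[] got = readFully(new ByteArrayInputStream(data), data.length);
        System.out.println(new String(got)); // prints "user:password"
    }
}
```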
Here is an echo-like sample:
import com.kx.c;
import java.io.IOException;
import java.net.ServerSocket;
import java.util.Arrays;

public class Test {
    public static void main(String[] args) throws IOException, c.KException {
        ServerSocket ss = new ServerSocket(1111);
        c conn = new c(ss, new c.IAuthenticate() {
            @Override
            public boolean authenticate(String s) {
                System.out.println("auth: " + s + "; len: " + s.length());
                return true;
            }
        });
        while (true) {
            Object[] objs = conn.readMsg();
            if (((byte) objs[0]) == 0) {
                System.out.println("async");
            } else {
                System.out.println("sync");
                conn.kr(objs[1]);
            }
        }
    }
}
And here is how to make it hang (both the Java server and the q client side):
q)(`$":localhost:1111:",200#"u") "aaa"
Here is what we get on the Java side before the hang:
auth: uuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuu; len: 97
Thanks for the details, we'll fix.
Change added to the code if you want to test. Will do a new release soon.
In your implementation, you rely on all of the connection data being available on the server side at once. However, is that guaranteed in general? I would read until I get '\0' and not rely on the fact that read returns 0. In my view that would be a cleaner implementation.
Also, if read returns -1 while reading the authentication data, you can throw an EOFException.
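One way to implement both suggestions (a sketch only, not the actual c.java code; AuthReadDemo and readCredentials are illustrative names, and the real kdb+ handshake also includes a capability byte before the terminating NUL, which is omitted here for simplicity):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class AuthReadDemo {
    // Read the credentials byte by byte until the '\0' terminator,
    // regardless of how the bytes are split across read() calls.
    static String readCredentials(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        int b;
        while ((b = in.read()) > 0)   // stop at '\0' (or at end of stream)
            buf.write(b);
        if (b == -1)                  // peer closed the connection mid-handshake
            throw new EOFException("connection closed during authentication");
        return buf.toString();
    }

    public static void main(String[] args) throws IOException {
        byte[] login = "user:password\0".getBytes();
        System.out.println(readCredentials(new ByteArrayInputStream(login))); // prints "user:password"
    }
}
```

Reading one byte at a time is cheap enough here since the credentials are short and sent once per connection; a buffered stream would remove even that cost.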
Further improvements made.