hadoop - TApplicationException: Required field 'client_protocol' is unset
I am developing a client in C#.
I have a Hive server (apache-hive-0.14.0) running on my machine, and I also have access to a Cloudera Distribution (CDH) 4.6.0 cluster.
When I connect to the CDH cluster's HiveServer2 with my Thrift client, I get the following error:
TApplicationException: Required field 'client_protocol' is unset! Struct: TOpenSessionReq(client_protocol:null, username:hive, password:hive)
I am passing the correct protocol version from the client, but something seems to go wrong along the way.
On top of this, if I point the client at localhost (where my own Hive server is running), everything works fine.
Please tell me what is going wrong here. Code:
var socket = new TSocket("XXX.XXX.XXX.XXX", 10000);
TStreamTransport sTransport = (TStreamTransport)socket;
var transport = new TBufferedTransport(socket);
TTransport underlyingTransport = transport;
var proto = new TBinaryProtocol(transport);
var client = new TCLIService.Client(proto);
transport.Open();

TOpenSessionReq req = new TOpenSessionReq(TProtocolVersion.HIVE_CLI_SERVICE_PROTOCOL_V6);
req.Username = "hive";
req.Password = "hive";
TOpenSessionResp oSResponse = client.OpenSession(req);
TSessionHandle sessionHandle = oSResponse.SessionHandle;

TExecuteStatementReq execReq = new TExecuteStatementReq(sessionHandle, "select * from emp");
TExecuteStatementResp exSResponse = client.ExecuteStatement(execReq);
TOperationHandle operationHandle = exSResponse.OperationHandle;

TFetchResultsReq fechReq = new TFetchResultsReq(operationHandle, TFetchOrientation.FETCH_FIRST, 1);
TFetchResultsResp fechRes = client.FetchResults(fechReq);
TRowSet resultSet = fechRes.Results;
List<TRow> resultRows = resultSet.Rows;
foreach (TRow row in resultRows)
{
    var val = row.ColVals[0];
    System.Console.WriteLine(val.StringVal);
}

TCloseOperationReq closeOprReq = new TCloseOperationReq(operationHandle);
client.CloseOperation(closeOprReq);
TCloseSessionReq creq = new TCloseSessionReq(sessionHandle);
client.CloseSession(creq);
I believe this is a problem with the Hive-JDBC version: the client libraries must match the Hive version running on the server. Using a client built against the same Hive version as the server should solve your problem.
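If you go the JDBC route instead of hand-rolled Thrift, a minimal sketch looks like the following. The host name cdh-host, port 10000, and the hive/hive credentials are placeholders for your environment; the key point is that the hive-jdbc jar on the classpath must match the Hive version running on the server, otherwise you hit exactly this kind of protocol mismatch.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {

    // Build a HiveServer2 JDBC URL, e.g. jdbc:hive2://cdh-host:10000/default
    static String hiveUrl(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) throws Exception {
        // HiveServer2 driver class (shipped in the hive-jdbc jar).
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // cdh-host / hive / hive are placeholders for your cluster.
        try (Connection conn = DriverManager.getConnection(
                hiveUrl("cdh-host", 10000, "default"), "hive", "hive");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select * from emp")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```

The JDBC driver performs the TOpenSessionReq handshake for you, which is why keeping the driver and server versions in sync makes the client_protocol error go away.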