
Arbitrary web APIs do not work with some URLs

Open · meigaoms opened this issue 5 years ago · 1 comment

Describe the bug
Following the sample code here: https://docs.microsoft.com/en-us/azure/cognitive-services/big-data/samples-python#arbitrary-web-apis, I was able to run the sample as written. However, it fails when I swap in this URL: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104
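As a quick sanity check (my own addition, not part of the sample), the same URL can be fetched directly with the Python requests library on the driver; if that succeeds while the HTTPTransformer pipeline below fails, the handshake_failure is likely specific to the JVM-side HTTP client / TLS configuration rather than to the endpoint itself:

import requests

# Hypothetical check: request the failing URL directly from Python.
url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104")
resp = requests.get(url)
print(resp.status_code)   # 200 would indicate the endpoint itself is reachable
print(resp.text[:500])    # beginning of the JSON payload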

To Reproduce

from requests import Request
from mmlspark.io.http import HTTPTransformer, http_udf
from pyspark.sql.functions import udf, col

# Adapted from the "Arbitrary web APIs" sample: the country argument is ignored
# and every row requests the same Wikimedia pageviews URL.
def world_bank_request(country):
  return Request("GET", "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104")

df = (spark.createDataFrame([("br",), ("usa",)], ["country"])
  .withColumn("request", http_udf(world_bank_request)(col("country"))))

client = (HTTPTransformer()
      .setConcurrency(3)
      .setInputCol("request")
      .setOutputCol("response"))

# Decode the HTTP response body into a string.
def get_response_body(resp):
  return resp.entity.content.decode()

display(client.transform(df).select("country", udf(get_response_body)(col("response")).alias("response")))
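
Note that get_response_body assumes every row carries a response. A defensive variant (a minimal sketch, assuming the response column can be null for rows whose request failed; it does not prevent the task-level crash shown in the stacktrace below) could look like:

def get_response_body(resp):
  # Assumption: resp (or its entity) may be null when the HTTP call failed.
  if resp is None or resp.entity is None or resp.entity.content is None:
    return None
  return resp.entity.content.decode()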

Expected behavior
Returns the content of this page: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104

Info (please complete the following information):

  • MMLSpark Version: [2.11]
  • Spark Version: [2.4.5]
  • Spark Platform: [Databricks]
  • Databricks Runtime Version: 6.6 ML (includes Apache Spark 2.4.5)

Stacktrace

Spark 2.4.5 stderr log page for app-20210119170012-0000/1 (showing the last 102400 bytes of 546687; earlier output truncated):
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:51:21 ERROR Executor: Exception in task 0.1 in stage 50.0 (TID 336)
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:53:41 INFO CoarseGrainedExecutorBackend: Got assigned task 339
21/01/19 18:53:41 INFO Executor: Running task 0.0 in stage 51.0 (TID 339)
21/01/19 18:53:41 INFO TorrentBroadcast: Started reading broadcast variable 51
21/01/19 18:53:41 INFO MemoryStore: Block broadcast_51_piece0 stored as bytes in memory (estimated size 52.4 KB, free 21.0 GB)
21/01/19 18:53:41 INFO TorrentBroadcast: Reading broadcast variable 51 took 10 ms
21/01/19 18:53:41 INFO MemoryStore: Block broadcast_51 stored as values in memory (estimated size 215.2 KB, free 21.0 GB)
21/01/19 18:53:41 INFO PythonRunner: Times: total = 90, boot = 46, init = 44, finish = 0
21/01/19 18:53:41 INFO TransportClientFactory: Found inactive connection to /10.139.64.6:37539, creating a new one.
21/01/19 18:53:41 INFO TransportClientFactory: Successfully created connection to /10.139.64.6:37539 after 3 ms (0 ms spent in bootstraps)
21/01/19 18:53:41 INFO CodeGenerator: Code generated in 46.233333 ms
21/01/19 18:53:43 INFO PythonUDFRunner: Times: total = 1623, boot = 48, init = 1575, finish = 0
21/01/19 18:53:43 INFO CodeGenerator: Generated method too long to be JIT compiled: org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage3.serializefromobject_doConsume_0$ is 10183 bytes
21/01/19 18:53:43 INFO CodeGenerator: Code generated in 64.344481 ms
21/01/19 18:53:43 INFO PythonUDFRunner: Times: total = 43, boot = -1640, init = 1683, finish = 0
21/01/19 18:53:43 INFO Executor: Finished task 0.0 in stage 51.0 (TID 339). 2048 bytes result sent to driver
21/01/19 18:53:43 INFO CoarseGrainedExecutorBackend: Got assigned task 340
21/01/19 18:53:43 INFO Executor: Running task 0.0 in stage 52.0 (TID 340)
21/01/19 18:53:43 INFO CoarseGrainedExecutorBackend: Got assigned task 342
21/01/19 18:53:43 INFO Executor: Running task 2.0 in stage 52.0 (TID 342)
21/01/19 18:53:43 INFO TorrentBroadcast: Started reading broadcast variable 52
21/01/19 18:53:43 INFO MemoryStore: Block broadcast_52_piece0 stored as bytes in memory (estimated size 52.4 KB, free 21.0 GB)
21/01/19 18:53:43 INFO TorrentBroadcast: Reading broadcast variable 52 took 10 ms
21/01/19 18:53:43 INFO MemoryStore: Block broadcast_52 stored as values in memory (estimated size 215.2 KB, free 21.0 GB)
21/01/19 18:53:43 INFO PythonRunner: Times: total = 3, boot = -125, init = 128, finish = 0
21/01/19 18:53:43 INFO PythonRunner: Times: total = 45, boot = -33, init = 78, finish = 0
21/01/19 18:53:45 INFO PythonUDFRunner: Times: total = 1726, boot = 47, init = 1679, finish = 0
21/01/19 18:53:45 INFO PythonUDFRunner: Times: total = 1750, boot = 95, init = 1655, finish = 0
21/01/19 18:53:45 INFO PythonUDFRunner: Times: total = 43, boot = -1697, init = 1740, finish = 0
21/01/19 18:53:45 INFO Executor: Finished task 0.0 in stage 52.0 (TID 340). 2048 bytes result sent to driver
21/01/19 18:53:45 INFO PythonUDFRunner: Times: total = 42, boot = -1703, init = 1745, finish = 0
21/01/19 18:53:45 INFO Executor: Finished task 2.0 in stage 52.0 (TID 342). 2048 bytes result sent to driver
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 344
21/01/19 18:53:45 INFO Executor: Running task 0.0 in stage 53.0 (TID 344)
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 346
21/01/19 18:53:45 INFO Executor: Running task 2.0 in stage 53.0 (TID 346)
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 348
21/01/19 18:53:45 INFO TorrentBroadcast: Started reading broadcast variable 53
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 350
21/01/19 18:53:45 INFO Executor: Running task 4.0 in stage 53.0 (TID 348)
21/01/19 18:53:45 INFO Executor: Running task 6.0 in stage 53.0 (TID 350)
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 352
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 354
21/01/19 18:53:45 INFO Executor: Running task 8.0 in stage 53.0 (TID 352)
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 356
21/01/19 18:53:45 INFO Executor: Running task 10.0 in stage 53.0 (TID 354)
21/01/19 18:53:45 INFO Executor: Running task 12.0 in stage 53.0 (TID 356)
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 358
21/01/19 18:53:45 INFO Executor: Running task 14.0 in stage 53.0 (TID 358)
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 360
21/01/19 18:53:45 INFO Executor: Running task 16.0 in stage 53.0 (TID 360)
21/01/19 18:53:45 INFO CoarseGrainedExecutorBackend: Got assigned task 362
21/01/19 18:53:45 INFO Executor: Running task 18.0 in stage 53.0 (TID 362)
21/01/19 18:53:45 INFO MemoryStore: Block broadcast_53_piece0 stored as bytes in memory (estimated size 52.4 KB, free 21.0 GB)
21/01/19 18:53:45 INFO TorrentBroadcast: Reading broadcast variable 53 took 12 ms
21/01/19 18:53:45 INFO MemoryStore: Block broadcast_53 stored as values in memory (estimated size 215.2 KB, free 21.0 GB)
21/01/19 18:53:45 INFO PythonRunner: Times: total = 3, boot = -54, init = 57, finish = 0
21/01/19 18:53:45 INFO PythonRunner: Times: total = 50, boot = -34, init = 84, finish = 0
21/01/19 18:53:45 INFO PythonRunner: Times: total = 46, boot = -33, init = 79, finish = 0
21/01/19 18:53:45 INFO PythonRunner: Times: total = 3, boot = -78, init = 81, finish = 0
21/01/19 18:53:45 INFO PythonRunner: Times: total = 475, boot = 431, init = 44, finish = 0
21/01/19 18:53:45 INFO PythonRunner: Times: total = 427, boot = 383, init = 43, finish = 1
21/01/19 18:53:46 INFO PythonRunner: Times: total = 378, boot = 334, init = 44, finish = 0
21/01/19 18:53:46 INFO PythonRunner: Times: total = 330, boot = 287, init = 43, finish = 0
21/01/19 18:53:46 INFO PythonRunner: Times: total = 91, boot = 47, init = 44, finish = 0
21/01/19 18:53:46 INFO PythonRunner: Times: total = 523, boot = 479, init = 44, finish = 0
21/01/19 18:53:46 INFO PythonUDFRunner: Times: total = 241, boot = -402, init = 643, finish = 0
21/01/19 18:53:46 INFO PythonUDFRunner: Times: total = 670, boot = -12, init = 682, finish = 0
21/01/19 18:53:46 INFO HandlingUtils: sending http://api.worldbank.org/v2/country/br?format=json
21/01/19 18:53:46 INFO PythonUDFRunner: Times: total = 47, boot = -686, init = 733, finish = 0
21/01/19 18:53:46 INFO Executor: Finished task 8.0 in stage 53.0 (TID 352). 2048 bytes result sent to driver
21/01/19 18:53:46 INFO HandlingUtils: finished sending (47ms) http://api.worldbank.org/v2/country/br?format=json
21/01/19 18:53:46 INFO PythonUDFRunner: Times: total = 112, boot = -664, init = 775, finish = 1
21/01/19 18:53:46 INFO Executor: Finished task 10.0 in stage 53.0 (TID 354). 2409 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 2245, boot = 189, init = 2055, finish = 1
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 1820, boot = 95, init = 1725, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 2262, boot = 237, init = 2025, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 1992, boot = 386, init = 1606, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 45, boot = -1856, init = 1901, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 18.0 in stage 53.0 (TID 362). 2048 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 45, boot = -1812, init = 1857, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 12.0 in stage 53.0 (TID 356). 2048 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 45, boot = -1773, init = 1818, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 0.0 in stage 53.0 (TID 344). 2048 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 1978, boot = 293, init = 1685, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 2319, boot = 92, init = 2227, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 2319, boot = 140, init = 2179, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 43, boot = -1937, init = 1980, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 6.0 in stage 53.0 (TID 350). 2048 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 1955, boot = 191, init = 1764, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 43, boot = -2209, init = 2252, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 4.0 in stage 53.0 (TID 348). 2048 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 42, boot = -1787, init = 1829, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 16.0 in stage 53.0 (TID 360). 2048 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 49, boot = -1565, init = 1614, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 14.0 in stage 53.0 (TID 358). 2086 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 41, boot = -1584, init = 1625, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 2.0 in stage 53.0 (TID 346). 2086 bytes result sent to driver
21/01/19 18:53:47 INFO CoarseGrainedExecutorBackend: Got assigned task 364
21/01/19 18:53:47 INFO Executor: Running task 0.0 in stage 54.0 (TID 364)
21/01/19 18:53:47 INFO CoarseGrainedExecutorBackend: Got assigned task 366
21/01/19 18:53:47 INFO Executor: Running task 2.0 in stage 54.0 (TID 366)
21/01/19 18:53:47 INFO CoarseGrainedExecutorBackend: Got assigned task 368
21/01/19 18:53:47 INFO Executor: Running task 4.0 in stage 54.0 (TID 368)
21/01/19 18:53:47 INFO CoarseGrainedExecutorBackend: Got assigned task 370
21/01/19 18:53:47 INFO Executor: Running task 6.0 in stage 54.0 (TID 370)
21/01/19 18:53:47 INFO TorrentBroadcast: Started reading broadcast variable 54
21/01/19 18:53:47 INFO MemoryStore: Block broadcast_54_piece0 stored as bytes in memory (estimated size 52.4 KB, free 21.0 GB)
21/01/19 18:53:47 INFO TorrentBroadcast: Reading broadcast variable 54 took 10 ms
21/01/19 18:53:47 INFO MemoryStore: Block broadcast_54 stored as values in memory (estimated size 215.2 KB, free 21.0 GB)
21/01/19 18:53:47 INFO PythonRunner: Times: total = 3, boot = -153, init = 156, finish = 0
21/01/19 18:53:47 INFO PythonRunner: Times: total = 4, boot = -149, init = 153, finish = 0
21/01/19 18:53:47 INFO PythonRunner: Times: total = 43, boot = -1596, init = 1639, finish = 0
21/01/19 18:53:47 INFO PythonRunner: Times: total = 44, boot = -1631, init = 1675, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 43, boot = -122, init = 165, finish = 0
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 43, boot = -138, init = 181, finish = 0
21/01/19 18:53:47 INFO HandlingUtils: sending http://api.worldbank.org/v2/country/usa?format=json
21/01/19 18:53:47 INFO HandlingUtils: finished sending (11ms) http://api.worldbank.org/v2/country/usa?format=json
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 42, boot = -164, init = 206, finish = 0
21/01/19 18:53:47 INFO Executor: Finished task 0.0 in stage 54.0 (TID 364). 2048 bytes result sent to driver
21/01/19 18:53:47 INFO PythonUDFRunner: Times: total = 67, boot = -133, init = 199, finish = 1
21/01/19 18:53:47 INFO Executor: Finished task 6.0 in stage 54.0 (TID 370). 2391 bytes result sent to driver
21/01/19 18:53:49 INFO PythonUDFRunner: Times: total = 1595, boot = -134, init = 1729, finish = 0
21/01/19 18:53:49 INFO PythonUDFRunner: Times: total = 2, boot = -1678, init = 1680, finish = 0
21/01/19 18:53:49 INFO Executor: Finished task 4.0 in stage 54.0 (TID 368). 2048 bytes result sent to driver
21/01/19 18:53:49 INFO PythonUDFRunner: Times: total = 1653, boot = -137, init = 1789, finish = 1
21/01/19 18:53:49 INFO PythonUDFRunner: Times: total = 3, boot = -1735, init = 1738, finish = 0
21/01/19 18:53:49 INFO Executor: Finished task 2.0 in stage 54.0 (TID 366). 2048 bytes result sent to driver
21/01/19 18:54:07 INFO CoarseGrainedExecutorBackend: Got assigned task 372
21/01/19 18:54:07 INFO Executor: Running task 0.0 in stage 56.0 (TID 372)
21/01/19 18:54:07 INFO CoarseGrainedExecutorBackend: Got assigned task 374
21/01/19 18:54:07 INFO Executor: Running task 2.0 in stage 56.0 (TID 374)
21/01/19 18:54:07 INFO TorrentBroadcast: Started reading broadcast variable 56
21/01/19 18:54:07 INFO MemoryStore: Block broadcast_56_piece0 stored as bytes in memory (estimated size 52.5 KB, free 21.0 GB)
21/01/19 18:54:07 INFO TorrentBroadcast: Reading broadcast variable 56 took 9 ms
21/01/19 18:54:07 INFO MemoryStore: Block broadcast_56 stored as values in memory (estimated size 215.3 KB, free 21.0 GB)
21/01/19 18:54:07 INFO PythonRunner: Times: total = 3, boot = -19403, init = 19406, finish = 0
21/01/19 18:54:07 INFO PythonRunner: Times: total = 45, boot = -19445, init = 19490, finish = 0
21/01/19 18:54:07 INFO CodeGenerator: Code generated in 43.21933 ms
21/01/19 18:54:08 INFO PythonUDFRunner: Times: total = 1582, boot = -19406, init = 20988, finish = 0
21/01/19 18:54:08 INFO PythonUDFRunner: Times: total = 1582, boot = -19414, init = 20996, finish = 0
21/01/19 18:54:08 INFO CodeGenerator: Generated method too long to be JIT compiled: org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage3.serializefromobject_doConsume_0$ is 10183 bytes
21/01/19 18:54:08 INFO CodeGenerator: Code generated in 64.339038 ms
21/01/19 18:54:08 INFO PythonUDFRunner: Times: total = 44, boot = -21026, init = 21070, finish = 0
21/01/19 18:54:08 INFO PythonUDFRunner: Times: total = 44, boot = -21015, init = 21059, finish = 0
21/01/19 18:54:08 INFO Executor: Finished task 0.0 in stage 56.0 (TID 372). 2048 bytes result sent to driver
21/01/19 18:54:08 INFO Executor: Finished task 2.0 in stage 56.0 (TID 374). 2048 bytes result sent to driver
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 376
21/01/19 18:54:08 INFO Executor: Running task 0.0 in stage 57.0 (TID 376)
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 378
21/01/19 18:54:08 INFO Executor: Running task 2.0 in stage 57.0 (TID 378)
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 380
21/01/19 18:54:08 INFO Executor: Running task 4.0 in stage 57.0 (TID 380)
21/01/19 18:54:08 INFO TorrentBroadcast: Started reading broadcast variable 57
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 382
21/01/19 18:54:08 INFO Executor: Running task 6.0 in stage 57.0 (TID 382)
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 384
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 386
21/01/19 18:54:08 INFO Executor: Running task 8.0 in stage 57.0 (TID 384)
21/01/19 18:54:08 INFO Executor: Running task 10.0 in stage 57.0 (TID 386)
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 388
21/01/19 18:54:08 INFO Executor: Running task 12.0 in stage 57.0 (TID 388)
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 390
21/01/19 18:54:08 INFO Executor: Running task 14.0 in stage 57.0 (TID 390)
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 392
21/01/19 18:54:08 INFO Executor: Running task 16.0 in stage 57.0 (TID 392)
21/01/19 18:54:08 INFO CoarseGrainedExecutorBackend: Got assigned task 394
21/01/19 18:54:08 INFO Executor: Running task 18.0 in stage 57.0 (TID 394)
21/01/19 18:54:08 INFO MemoryStore: Block broadcast_57_piece0 stored as bytes in memory (estimated size 52.5 KB, free 21.0 GB)
21/01/19 18:54:08 INFO TorrentBroadcast: Reading broadcast variable 57 took 10 ms
21/01/19 18:54:08 INFO MemoryStore: Block broadcast_57 stored as values in memory (estimated size 215.3 KB, free 21.0 GB)
21/01/19 18:54:08 INFO PythonRunner: Times: total = 2, boot = -19460, init = 19462, finish = 0
21/01/19 18:54:08 INFO PythonRunner: Times: total = 45, boot = -21053, init = 21098, finish = 0
21/01/19 18:54:08 INFO PythonRunner: Times: total = 45, boot = -21011, init = 21056, finish = 0
21/01/19 18:54:08 INFO PythonRunner: Times: total = 45, boot = -19456, init = 19501, finish = 0
21/01/19 18:54:08 INFO PythonRunner: Times: total = 49, boot = -20995, init = 21044, finish = 0
21/01/19 18:54:09 INFO PythonRunner: Times: total = 45, boot = -20947, init = 20992, finish = 0
21/01/19 18:54:09 INFO PythonRunner: Times: total = 53, boot = -21052, init = 21105, finish = 0
21/01/19 18:54:09 INFO PythonRunner: Times: total = 53, boot = -21040, init = 21093, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 47, boot = -1717, init = 1764, finish = 0
21/01/19 18:54:09 INFO PythonRunner: Times: total = 45, boot = -21012, init = 21057, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 50, boot = -142, init = 192, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 50, boot = -142, init = 192, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 57, boot = -19402, init = 19459, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 54, boot = -24, init = 78, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 52, boot = 24, init = 28, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 54, boot = -24, init = 78, finish = 0
21/01/19 18:54:09 INFO PythonRunner: Times: total = 45, boot = -21012, init = 21057, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 59, boot = -19407, init = 19465, finish = 1
21/01/19 18:54:09 INFO HandlingUtils: sending https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 70, boot = 32, init = 38, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 71, boot = -8, init = 79, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 74, boot = 26, init = 48, finish = 0
21/01/19 18:54:09 INFO Executor: Finished task 16.0 in stage 57.0 (TID 392). 2048 bytes result sent to driver
21/01/19 18:54:09 INFO Executor: Finished task 6.0 in stage 57.0 (TID 382). 2048 bytes result sent to driver
21/01/19 18:54:09 INFO Executor: Finished task 8.0 in stage 57.0 (TID 384). 2048 bytes result sent to driver
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 79, boot = 30, init = 49, finish = 0
21/01/19 18:54:09 INFO Executor: Finished task 18.0 in stage 57.0 (TID 394). 2048 bytes result sent to driver
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 78, boot = 16, init = 62, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 83, boot = 19, init = 64, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 83, boot = -2, init = 85, finish = 0
21/01/19 18:54:09 INFO Executor: Finished task 14.0 in stage 57.0 (TID 390). 2048 bytes result sent to driver
21/01/19 18:54:09 INFO Executor: Finished task 2.0 in stage 57.0 (TID 378). 2048 bytes result sent to driver
21/01/19 18:54:09 INFO Executor: Finished task 0.0 in stage 57.0 (TID 376). 2048 bytes result sent to driver
21/01/19 18:54:09 ERROR PythonUDFRunner: Python worker exited unexpectedly (crashed)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/databricks/spark/python/pyspark/worker.py", line 460, in main
    eval_type = read_int(infile)
  File "/databricks/spark/python/pyspark/serializers.py", line 832, in read_int
    raise EOFError
EOFError

	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:540)
	at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$1.read(PythonUDFRunner.scala:81)
	at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$1.read(PythonUDFRunner.scala:64)
	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:494)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage4.processNext(Unknown Source)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:640)
	at org.apache.spark.sql.execution.collect.UnsafeRowBatchUtils$.encodeUnsafeRows(UnsafeRowBatchUtils.scala:62)
	at org.apache.spark.sql.execution.collect.Collector$$anonfun$2.apply(Collector.scala:159)
	at org.apache.spark.sql.execution.collect.Collector$$anonfun$2.apply(Collector.scala:158)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.doRunTask(Task.scala:140)
	at org.apache.spark.scheduler.Task.run(Task.scala:113)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$17.apply(Executor.scala:606)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1541)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:612)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:54:09 ERROR PythonUDFRunner: This may have been caused by a prior exception:
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:54:09 ERROR Executor: Exception in task 10.0 in stage 57.0 (TID 386)
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:54:09 INFO CoarseGrainedExecutorBackend: Got assigned task 397
21/01/19 18:54:09 INFO Executor: Running task 10.2 in stage 57.0 (TID 397)
21/01/19 18:54:09 INFO PythonRunner: Times: total = 42, boot = -357, init = 399, finish = 0
21/01/19 18:54:09 INFO PythonUDFRunner: Times: total = 45, boot = -358, init = 402, finish = 1
21/01/19 18:54:09 INFO HandlingUtils: sending https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104
21/01/19 18:54:09 ERROR PythonUDFRunner: Python worker exited unexpectedly (crashed)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/databricks/spark/python/pyspark/worker.py", line 460, in main
    eval_type = read_int(infile)
  File "/databricks/spark/python/pyspark/serializers.py", line 832, in read_int
    raise EOFError
EOFError

	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:540)
	at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$1.read(PythonUDFRunner.scala:81)
	at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$1.read(PythonUDFRunner.scala:64)
	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:494)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage4.processNext(Unknown Source)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:640)
	at org.apache.spark.sql.execution.collect.UnsafeRowBatchUtils$.encodeUnsafeRows(UnsafeRowBatchUtils.scala:62)
	at org.apache.spark.sql.execution.collect.Collector$$anonfun$2.apply(Collector.scala:159)
	at org.apache.spark.sql.execution.collect.Collector$$anonfun$2.apply(Collector.scala:158)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.doRunTask(Task.scala:140)
	at org.apache.spark.scheduler.Task.run(Task.scala:113)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$17.apply(Executor.scala:606)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1541)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:612)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:54:09 ERROR PythonUDFRunner: This may have been caused by a prior exception:
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:54:09 ERROR Executor: Exception in task 10.2 in stage 57.0 (TID 397)
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:159)
	at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2041)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1145)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1388)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1416)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1400)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:355)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.sendWithRetries(HTTPClients.scala:76)
	at com.microsoft.ml.spark.io.http.HandlingUtils$.advanced(HTTPClients.scala:128)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.HandlingUtils$$anonfun$advancedUDF$1.apply(HTTPClients.scala:141)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient$$anonfun$handle$1.apply(HTTPClients.scala:160)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2$$anon$4.block(ExecutionContextImpl.scala:48)
	at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
	at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$2.blockOn(ExecutionContextImpl.scala:45)
	at scala.concurrent.package$.blocking(package.scala:123)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.handle(HTTPClients.scala:160)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:58)
	at com.microsoft.ml.spark.io.http.HTTPClient$$anonfun$sendRequestWithContext$1.apply(HTTPClients.scala:57)
	at scala.Option.map(Option.scala:146)
	at com.microsoft.ml.spark.io.http.HTTPClient$class.sendRequestWithContext(HTTPClients.scala:57)
	at com.microsoft.ml.spark.io.http.AsyncHTTPClient.sendRequestWithContext(HTTPClients.scala:153)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at com.microsoft.ml.spark.io.http.AsyncClient$$anonfun$1$$anonfun$apply$1.apply(Clients.scala:58)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
21/01/19 18:54:09 INFO Executor: Executor is trying to kill task 4.0 in stage 57.0 (TID 380), reason: Stage cancelled
21/01/19 18:54:09 INFO Executor: Executor is trying to kill task 12.0 in stage 57.0 (TID 388), reason: Stage cancelled
21/01/19 18:54:10 INFO PythonUDFRunner: Times: total = 1603, boot = -1708, init = 3311, finish = 0
21/01/19 18:54:10 INFO Executor: Executor killed task 12.0 in stage 57.0 (TID 388), reason: Stage cancelled
21/01/19 18:54:10 INFO PythonUDFRunner: Times: total = 1673, boot = 45, init = 1628, finish = 0
21/01/19 18:54:10 INFO Executor: Executor killed task 4.0 in stage 57.0 (TID 380), reason: Stage cancelled
21/01/19 19:04:15 INFO CoarseGrainedExecutorBackend: Got assigned task 399
21/01/19 19:04:15 INFO Executor: Running task 0.0 in stage 58.0 (TID 399)
21/01/19 19:04:15 INFO TorrentBroadcast: Started reading broadcast variable 58
21/01/19 19:04:15 INFO MemoryStore: Block broadcast_58_piece0 stored as bytes in memory (estimated size 51.6 KB, free 21.0 GB)
21/01/19 19:04:15 INFO TorrentBroadcast: Reading broadcast variable 58 took 9 ms
21/01/19 19:04:15 INFO MemoryStore: Block broadcast_58 stored as values in memory (estimated size 212.9 KB, free 21.0 GB)
21/01/19 19:04:15 INFO TransportClientFactory: Found inactive connection to /10.139.64.6:37539, creating a new one.
21/01/19 19:04:15 INFO TransportClientFactory: Successfully created connection to /10.139.64.6:37539 after 3 ms (0 ms spent in bootstraps)
21/01/19 19:04:15 INFO CodeGenerator: Code generated in 49.520089 ms
21/01/19 19:04:17 INFO CodeGenerator: Generated method too long to be JIT compiled: org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage3.serializefromobject_doConsume_0$ is 10180 bytes
21/01/19 19:04:17 INFO CodeGenerator: Code generated in 67.030832 ms
21/01/19 19:04:17 INFO PythonUDFRunner: Times: total = 1663, boot = 48, init = 1615, finish = 0
21/01/19 19:04:17 INFO HandlingUtils: sending https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104
21/01/19 19:04:17 ERROR PythonUDFRunner: Python worker exited unexpectedly (crashed)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/databricks/spark/python/pyspark/worker.py", line 460, in main
    eval_type = read_int(infile)
  File "/databricks/spark/python/pyspark/serializers.py", line 832, in read_int
    raise EOFError
EOFError

	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:540)
	at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$1.read(PythonUDFRunner.scala:81)
	at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$1.read(PythonUDFRunner.scala:64)
	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:494)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage4.processNext(Unknown Source)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:640)
	at org.apache.spark.sql.execution.collect.UnsafeRowBatchUtils$.encodeUnsafeRows(UnsafeRowBatchUtils.scala:62)
	at org.apache.spark.sql.execution.collect.Collector$$anonfun$2.apply(Collector.scala:159)
	at org.apache.spark.sql.execution.collect.Collector$$anonfun$2.apply(Collector.scala:158)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.doRunTask(Task.scala:140)
	at org.apache.spark.scheduler.Task.run(Task.scala:113)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$17.apply(Executor.scala:606)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1541)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:612)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	... (same SSL handshake stack trace as above, from sun.security.ssl through org.apache.http and com.microsoft.ml.spark.io.http down to the ForkJoin worker thread)
21/01/19 19:04:17 ERROR PythonUDFRunner: This may have been caused by a prior exception:
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	... (same stack trace as above)
21/01/19 19:04:17 ERROR Executor: Exception in task 0.0 in stage 58.0 (TID 399)
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	... (same stack trace as above)
21/01/19 19:04:17 INFO CoarseGrainedExecutorBackend: Got assigned task 400
21/01/19 19:04:17 INFO Executor: Running task 0.1 in stage 58.0 (TID 400)
21/01/19 19:04:17 INFO PythonUDFRunner: Times: total = 2, boot = -238, init = 240, finish = 0
21/01/19 19:04:17 INFO HandlingUtils: sending https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104
21/01/19 19:04:17 ERROR PythonUDFRunner: Python worker exited unexpectedly (crashed)
	... (same EOFError traceback and SSLHandshakeException cause as in task 0.0 above)
21/01/19 19:04:17 ERROR PythonUDFRunner: This may have been caused by a prior exception:
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	... (same stack trace as above)
21/01/19 19:04:17 ERROR Executor: Exception in task 0.1 in stage 58.0 (TID 400)
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	... (same stack trace as above)
21/01/19 19:04:17 INFO CoarseGrainedExecutorBackend: Got assigned task 401
21/01/19 19:04:17 INFO Executor: Running task 0.2 in stage 58.0 (TID 401)
21/01/19 19:04:17 INFO PythonUDFRunner: Times: total = 45, boot = -99, init = 143, finish = 1
21/01/19 19:04:17 INFO HandlingUtils: sending https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104
21/01/19 19:04:17 ERROR PythonUDFRunner: Python worker exited unexpectedly (crashed)
	... (same EOFError traceback and SSLHandshakeException cause as above)
21/01/19 19:04:17 ERROR PythonUDFRunner: This may have been caused by a prior exception:
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	... (same stack trace as above)
21/01/19 19:04:17 ERROR Executor: Exception in task 0.2 in stage 58.0 (TID 401)
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
	... (same stack trace as above)


Additional context
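The repeated root cause in the log above is a javax.net.ssl.SSLHandshakeException ("Received fatal alert: handshake_failure") raised inside the executor JVM's Apache HttpClient stack while connecting to wikimedia.org, so the Python worker crash (EOFError) is only a downstream symptom. As a quick sanity check (an editor-added sketch, not part of the original report, and assuming only that the requests library is available on the driver), the same URL can be fetched directly from Python; if that succeeds while the HTTPTransformer call still fails, the problem is likely in the JVM-side TLS negotiation (for example cipher-suite or SNI support) rather than in the endpoint itself.

```python
# Hypothetical diagnostic snippet (not from the original issue): request the same
# Wikimedia REST endpoint directly with the requests library, bypassing the
# JVM HTTP client that raises the handshake_failure in the stack trace above.
import requests

url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia.org/all-access/all-agents/Physics/monthly/20151104/20161104")

resp = requests.get(url, timeout=30)
print(resp.status_code)  # 200 is expected if the endpoint is reachable from the driver
print(resp.text[:200])   # beginning of the JSON payload with the monthly pageview counts
```

A successful response here would suggest the failure is confined to the TLS handshake performed by the cluster JVM in this setup, which narrows down where to look next.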

AB#1984589

meigaoms, Jan 19 '21 19:01

👋 Thanks for opening your first issue here! If you're reporting a 🐞 bug, please make sure you include steps to reproduce it.

welcome[bot], Jan 19 '21 19:01