
Fixing the "Accept timed out" error that appears after Spark Streaming has been running for a while

Accept timed out exception stack trace

18/07/13 23:14:35 ERROR PythonRDD: Error while sending iterator
java.net.SocketTimeoutException: Accept timed out
        at java.net.PlainSocketImpl.socketAccept(Native Method)
        at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
        at java.net.ServerSocket.implAccept(ServerSocket.java:545)
        at java.net.ServerSocket.accept(ServerSocket.java:513)
        at org.apache.spark.api.python.PythonRDD$$anon$1.run(PythonRDD.scala:397)
18/07/13 23:22:46 INFO ContextCleaner: Cleaned accumulator 1265984
18/07/13 23:22:46 INFO ContextCleaner: Cleaned accumulator 1261984
18/07/13 23:22:46 INFO ContextCleaner: Cleaned accumulator 1291349
18/07/13 23:22:46 INFO ContextCleaner: Cleaned accumulator 1261975

Cause

Spark treats different hostname aliases of the same IP as different hosts. For example, if the Spark Streaming node's hosts file contains a line such as 127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4 (note that plain localhost is not among the aliases), executors will occasionally hit the Accept timed out error.
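
The error itself surfaces on the JVM side of PythonRDD, which opens a local server socket and waits briefly for the Python side to connect back; when nothing connects in time, accept() throws the SocketTimeoutException shown in the stack trace. The snippet below is a minimal sketch in plain Python, not Spark's internal code: it assumes the server binds to whatever "localhost" resolves to and uses a short, illustrative 3-second accept timeout, and it only reproduces the symptom you would see when the other side never manages to connect.

import socket

# Minimal sketch, not Spark's internal code: bind a server socket on the
# address that "localhost" resolves to and wait a short time for a peer to
# connect back. If name resolution is inconsistent (e.g. a broken /etc/hosts
# entry), nothing ever connects and accept() times out -- the same symptom
# as the SocketTimeoutException reported by PythonRDD above.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((socket.gethostbyname("localhost"), 0))  # port 0 = any free port
server.settimeout(3.0)            # assumed short accept timeout, for illustration
server.listen(1)
try:
    conn, _ = server.accept()     # no client connects in this sketch
except socket.timeout:
    print("Accept timed out")     # mirrors the error in the log above
finally:
    server.close()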

Solution

In the hosts file, change the line 127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4 to 127.0.0.1 localhost and the error no longer appears.
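
After editing the hosts file, a quick sanity check on the Spark Streaming node is to confirm that localhost now resolves cleanly. The check below is not part of the original fix; it just uses Python's standard socket module to verify the forward and reverse lookups.

import socket

# Run on the Spark Streaming node after changing /etc/hosts:
# "localhost" should map straight to 127.0.0.1, and the reverse lookup of
# 127.0.0.1 should no longer return an alias like localhost.localdomain.
print(socket.gethostbyname("localhost"))   # expect: 127.0.0.1
print(socket.getfqdn("127.0.0.1"))         # typically: localhost after the change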



Author: 祗談風月
Link: https://www.jianshu.com/p/e008431fd7de

