
Spark on Hadoop delegation-token expiry: HDFS_DELEGATION_TOKEN

Date: 2023-08-04 18:59:47


A Spark Streaming application exited on its own after running for 7 days. The log showed: token for xxx (user name): HDFS_DELEGATION_TOKEN owner=xxxx@, renewer=yarn, realUser=, issueDate=1581323654722, maxDate=1581928454722, sequenceNumber=6445344, masterKeyId=1583) is expired, current time: 2020-02-17 16:37:40,567+0800 expected renewal time: 2020-02-17 16:34:14,722+0800. A Kerberos ticket had already been obtained with kinit before the application was submitted, yet the log shows the failure happened when Spark Streaming's checkpoint operation touched HDFS and found the delegation token expired.
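The seven-day run time is no coincidence: issueDate and maxDate in the message are exactly seven days apart, which matches HDFS's default delegation-token maximum lifetime (dfs.namenode.delegation.token.max-lifetime). A quick sanity check on the epoch-millisecond values from the log:

```shell
# Delegation-token timestamps from the log (milliseconds since the epoch)
issue_date=1581323654722
max_date=1581928454722

# Lifetime in days: millisecond difference / (1000 * 60 * 60 * 24)
lifetime_days=$(( (max_date - issue_date) / 86400000 ))
echo "token max lifetime: ${lifetime_days} days"
```

Once maxDate passes, the token can no longer be renewed at all, so any long-running job that relies on a one-off token will die at this boundary.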

Solution:

spark-submit <other options...> --keytab /home/keytabs/xxxx.keytab --principal xxxx --conf spark.hadoop.fs.hdfs.impl.disable.cache=true

Here xxxx only hides the real user name, and an actual submission will of course need other options configured as well. With --keytab and --principal, Spark can log in from the keytab and obtain fresh delegation tokens itself instead of depending on the token issued at submit time.
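For context, a fuller submission might look like the sketch below. The master/deploy-mode, class name, jar, and keytab path are illustrative placeholders; only --keytab, --principal, and the fs.hdfs.impl.disable.cache setting are the actual fix described above:

```shell
# Sketch of submitting a long-running streaming job on a Kerberized
# YARN cluster. com.example.MyStreamingApp and my-streaming-app.jar
# are placeholders, not names from this article.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyStreamingApp \
  --keytab /home/keytabs/xxxx.keytab \
  --principal xxxx \
  --conf spark.hadoop.fs.hdfs.impl.disable.cache=true \
  my-streaming-app.jar
```

Disabling the HDFS FileSystem cache forces new FileSystem instances to be created with the current (re-obtained) credentials rather than reusing a cached instance that still carries the expired token.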

The full exception is shown below:

(user names redacted)

20/02/17 16:37:40 ERROR util.Utils: Uncaught exception in thread Thread-5

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): token (token for hadoop: HDFS_DELEGATION_TOKEN owner=xxxx@, renewer=yarn, realUser=, issueDate=1581323654722, maxDate=1581928454722, sequenceNumber=6445344, masterKeyId=1583) is expired, current time: 2020-02-17 16:37:40,567+0800 expected renewal time: 2020-02-17 16:34:14,722+0800
	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
	at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2126)
	at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1262)
	at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1258)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1258)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1418)
	at org.apache.spark.scheduler.EventLoggingListener.stop(EventLoggingListener.scala:232)
	at org.apache.spark.SparkContext$$anonfun$stop$7$$anonfun$apply$mcV$sp$5.apply(SparkContext.scala:1831)
	at org.apache.spark.SparkContext$$anonfun$stop$7$$anonfun$apply$mcV$sp$5.apply(SparkContext.scala:1831)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1831)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1295)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1830)
	at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(…)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1963)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
