java hadoop spark error

While working on a project, I called Thread.currentThread().setContextClassLoader(classLoaderContext); when the project runs, it fails at the Hadoop call below:
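For reference, the JDK method is Thread.setContextClassLoader (there is no "setClassLoaderContext"). A minimal sketch of setting and restoring it; the URLClassLoader here is a hypothetical stand-in for whatever classLoaderContext is in the project:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ContextClassLoaderDemo {
    public static void main(String[] args) {
        Thread t = Thread.currentThread();
        ClassLoader original = t.getContextClassLoader();
        // Custom loader that delegates to the original (illustrative only)
        ClassLoader custom = new URLClassLoader(new URL[0], original);
        t.setContextClassLoader(custom);
        System.out.println(t.getContextClassLoader() == custom); // prints true
        t.setContextClassLoader(original); // restore afterwards
    }
}
```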

    public FSDataInputStream getFSDataInputStream(String srcFilePath) {
        // NOTE: this sets a JVM system property, not an environment variable
        System.setProperty("HADOOP_HOME", "/usr/local/hadoop");
        FSDataInputStream dataInputStream = null;
        String fileName = String.format("data-%s", System.currentTimeMillis());
        String destFilePath = TEMP_FILE_PATH + fileName;
        final String path = srcFilePath;
        Configuration config = new Configuration();
        // config.set("fs.default.name", "hdfs://master:9000");
        config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        FileSystem fileSystem = null;
        try {
            fileSystem = FileSystem.get(URI.create(path), config);

            if (!fileSystem.exists(new Path(path))) {
                logger.error("File does not exist");
                return null;
            }
            dataInputStream = fileSystem.open(new Path(path));
            return dataInputStream;
        } catch (IOException ex) {
            // getStackTrace().toString() only prints the array reference;
            // log the exception object itself instead
            logger.error("Failed to open HDFS file", ex);
            return null;
        }
    }
The error is thrown at the line marked in bold (the FileSystem.get call); the error message is:

[DEBUG] 2016-07-30 13:11:26,209(466) --> [main] org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:320): Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:302)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:327)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2753)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2745)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2611)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at com.itcchina.zs57s.splitter.tools.HadoopTools.getFSDataInputStream(HadoopTools.java:74)
at com.itcchina.zs57s.splitter.handler.HadoopWorkItem.start(HadoopWorkItem.java:37)
at ItcWorkFlowClassify.ItcWorkItem.runWorker(ItcWorkItem.java:297)
at ItcWorkFlowClassify.ItcWorkFlow.start(ItcWorkFlow.java:71)
at ItcWorkFlowClassify.ItcWorkItem.runWorker(ItcWorkItem.java:276)
at Main.main(Main.java:25)
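For context on the message above: Hadoop's Shell class resolves the home directory from the hadoop.home.dir system property first, then falls back to the HADOOP_HOME environment variable. Calling System.setProperty("HADOOP_HOME", …) as in the snippet above sets neither of those, because a JVM system property named "HADOOP_HOME" is not an environment variable. A minimal sketch of the distinction (plain Java, no Hadoop dependency needed):

```java
public class HadoopHomeDemo {
    public static void main(String[] args) {
        // A system property named "HADOOP_HOME" does NOT become an
        // environment variable; System.getenv() will not see it.
        System.setProperty("HADOOP_HOME", "/usr/local/hadoop");

        // Hadoop's Shell.checkHadoopHome() reads this system property
        // first, so setting it before any Hadoop class loads is the
        // usual in-process workaround:
        System.setProperty("hadoop.home.dir", "/usr/local/hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```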

婷婷同学_ · 770 views
1 Answer

慕村9548890

I found the problem myself: I was passing the wrong arguments to Runtime.getRuntime().exec(). I had supplied the second (envp) parameter, so the child process could not see the environment variables.

    ExecutorService executorService = Executors.newCachedThreadPool();
    executorService.execute(new Runnable() {
        public void run() {
            switch (taskCommandType) {
                case start:
                    try {
                        // envp == null: the child inherits the parent's environment
                        currentProcess = Runtime.getRuntime().exec(
                                String.format("java -jar %s %s %s",
                                        itcWorkFlow4jFilePath, m_mainWorkFilePath, fileList[0]),
                                null);
                        // DataInputStream.readLine is deprecated; a BufferedReader
                        // would be the idiomatic replacement
                        DataInputStream dataInputStream =
                                new DataInputStream(currentProcess.getErrorStream());
                        String tempStr;
                        StringBuffer errorMessage = new StringBuffer();
                        while ((tempStr = dataInputStream.readLine()) != null) {
                            errorMessage.append(tempStr);
                        }
                        dataInputStream.close();
                        logger.error(errorMessage.toString());
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    break;
                case stop:
                    currentProcess.destroy();
                    break;
                case list:
                    break;
            }
        }
    });
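The effect described in the answer can be checked directly: Runtime.exec(cmd, envp) with a non-null envp replaces the child's entire environment, while envp == null makes the child inherit the parent's. A minimal POSIX-only sketch (it assumes /usr/bin/env exists on the machine):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ExecEnvDemo {
    // Runs /usr/bin/env in a child process with the given envp and
    // returns everything the child printed (its environment listing).
    static String run(String[] envp) {
        StringBuilder out = new StringBuilder();
        try {
            Process p = Runtime.getRuntime().exec(new String[]{"/usr/bin/env"}, envp);
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) out.append(line).append('\n');
            }
            p.waitFor();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String inherited = run(null);          // null envp: parent env inherited
        String cleared = run(new String[0]);   // empty envp: child env is wiped
        System.out.println("inherited is empty: " + inherited.trim().isEmpty());
        System.out.println("cleared is empty:   " + cleared.trim().isEmpty());
    }
}
```

With an empty envp the child sees no variables at all, which is why a spawned JVM would fail to find HADOOP_HOME.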
