I am facing a strange problem: I cannot see my HDFS files. Whenever I run hadoop fs -ls I get the error below:
hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.
ls: Cannot access .: No such file or directory.
I can run commands like copyFromLocal, but I still cannot see the files in HDFS afterwards, and because of this I cannot run my Pig scripts either. If I run hadoop fs -ls /* I get the error below:
hadoop fs -ls /*
Warning: $HADOOP_HOME is deprecated.
Found 1 items
drwxr-xr-x - hduser hadoop 0 2014-02-14 16:49 /app/hadoop
ls: Cannot access /bin: No such file or directory.
ls: Cannot access /boot: No such file or directory.
ls: Cannot access /Data: No such file or directory.
ls: Cannot access /dev: No such file or directory.
ls: Cannot access /etc: No such file or directory.
Found 1 items
drwxr-xr-x - hduser hadoop 0 2014-02-19 13:02 /home/hduser
ls: Cannot access /lib: No such file or directory.
ls: Cannot access /lib64: No such file or directory.
ls: Cannot access /lost+found: No such file or directory.
ls: Cannot access /media: No such file or directory.
ls: Cannot access /misc: No such file or directory.
ls: Cannot access /mnt: No such file or directory.
ls: Cannot access /net: No such file or directory.
ls: Cannot access /opt: No such file or directory.
ls: Cannot access /proc: No such file or directory.
ls: Cannot access /root: No such file or directory.
ls: Cannot access /sbin: No such file or directory.
ls: Cannot access /selinux: No such file or directory.
ls: Cannot access /srv: No such file or directory.
ls: Cannot access /sys: No such file or directory.
ls: Cannot access /tftpboot: No such file or directory.
ls: Cannot access /usr: No such file or directory.
ls: Cannot access /var: No such file or directory.
ls: Cannot access /zookeeper.out: No such file or directory.
ls: Cannot access /zookeeper_server.pid: No such file or directory.
Can anyone tell me what the problem might be? I have a 7-node Hadoop cluster that had been working fine; I have only been hitting this problem for about 2 days. I have already tried restarting the cluster and restarting the nodes, but the issue persists.
Thanks,
There is nothing wrong with your Hadoop installation.
hadoop fs -ls
shows no output because there are no files or directories in the HDFS home directory of the current user (the user running the command). Once you create that directory (e.g. hadoop fs -mkdir /user/hduser, a path that depends on your username) and copy files into it, they will show up.
Run this command instead:
hadoop fs -ls /
rather than
hadoop fs -ls /*
This will work and give you the correct output. The reason the second form fails is that the shell expands /* against the local filesystem before Hadoop ever runs, so hadoop fs -ls is handed local paths such as /bin and /boot, which do not exist in HDFS.
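The glob expansion can be demonstrated without a cluster at all, because it is done by your local shell, not by Hadoop. A minimal sketch, using a throwaway directory under /tmp (the path is only for illustration) and echo standing in for the hadoop binary:

```shell
# Create a scratch directory with two entries (illustrative path).
mkdir -p /tmp/glob-demo/a /tmp/glob-demo/b

# echo stands in for hadoop so we can see the argument list the shell
# actually builds: the pattern is replaced by matching LOCAL paths
# before the command even starts.
echo hadoop fs -ls /tmp/glob-demo/*
# Prints: hadoop fs -ls /tmp/glob-demo/a /tmp/glob-demo/b

# Clean up the scratch directory.
rm -rf /tmp/glob-demo
```

The same expansion turns hadoop fs -ls /* into hadoop fs -ls /app /bin /boot ..., which is why your output lists only /app/hadoop and /home/hduser (paths that happen to exist in both the local filesystem and HDFS) while every other argument fails.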
Also, it looks like you have not set all the required environment variables on your PATH correctly. You need to configure JAVA_HOME before configuring Hadoop itself.
Refer to this article for the Hadoop configuration steps:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
and grant the appropriate permissions to the user you want to run Hadoop as.
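As a rough sketch of that configuration step (all paths below are examples; adjust them to your JDK and install location), the setup in that tutorial amounts to pointing Hadoop at the JDK in conf/hadoop-env.sh and exporting the Hadoop variables in the hadoop user's ~/.bashrc:

```shell
# In conf/hadoop-env.sh -- point Hadoop at your JDK (example path):
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

# In ~/.bashrc of the hadoop user -- example install location:
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin

# Hadoop 1.x prints "Warning: $HADOOP_HOME is deprecated" when
# HADOOP_HOME is set; if I recall correctly this suppresses it,
# but verify against your Hadoop version:
export HADOOP_HOME_WARN_SUPPRESS=1
```

This is only a configuration fragment, not a complete setup; the linked tutorial walks through the full procedure.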