
Disk space and usage on Azure: df shows full but du doesn't add up

I have an Azure virtual machine with four external disks mounted.

df -h

Filesystem      Size  Used Avail Use% Mounted on
/dev/sda1        29G   28G     0 100% /
none            4.0K     0  4.0K   0% /sys/fs/cgroup
udev            1.7G   12K  1.7G   1% /dev
tmpfs           345M  460K  344M   1% /run
none            5.0M     0  5.0M   0% /run/lock
none            1.7G     0  1.7G   0% /run/shm
none            100M     0  100M   0% /run/user
none             64K     0   64K   0% /etc/network/interfaces.dynamic.d
/dev/sdb1       133G   31G   96G  25% /mnt
/dev/sdc1       197G  647M  187G   1% /home/hduser/mydata
/dev/sdd1       296G   82G  199G  30% /var/data/HadoopData

df -h shows the root disk (/dev/sda1) as 100% full.

du -sh /*

9.7M    /bin
44M /boot
12K /dev
6.4M    /etc
1.5G    /home
0   /initrd.img
0   /initrd.img.old
176M    /lib
4.0K    /lib64
16K /lost+found
4.0K    /media
31G /mnt
4.0K    /opt
du: cannot access ‘/proc/16100/task/16100/fd/4’: No such file or directory
du: cannot access ‘/proc/16100/task/16100/fdinfo/4’: No such file or directory
du: cannot access ‘/proc/16100/fd/4’: No such file or directory
du: cannot access ‘/proc/16100/fdinfo/4’: No such file or directory
0   /proc
156K    /root
460K    /run
9.4M    /sbin
4.0K    /srv
0   /sys
313M    /tmp
1.1G    /usr
83G /var
0   /vmlinuz
0   /vmlinuz.old

How should I solve this issue? I checked each filesystem for open deleted files, but none are present. I am running Hadoop and HBase on this VM.
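When df and du disagree this badly on /, the usual suspect is a process holding deleted files open (Hadoop and HBase daemons often keep rotated or deleted logs open). A quick way to re-check, assuming lsof is installed:

```shell
# List open files whose on-disk link count is 0 (deleted but still
# held open). Their space is not freed until the owning process
# closes the descriptor or is restarted.
sudo lsof +L1 | awk '$NF ~ "^/" { print $1, $2, $NF }'
```

If anything large shows up, restarting the listed process (or truncating the file through /proc/&lt;pid&gt;/fd) releases the space.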

Adding the hidden-file listing:

ls -altr

total 156
-rw-r--r--  1 root root    140 Feb 20  2014 .profile
drwxr-xr-x 22 root root   4096 Aug 24 08:05 ..
drwx------  2 root root   4096 Aug 24 14:02 .ssh
-rw-r--r--  1 root root      0 Aug 25 05:06 .mongorc.js
-rw-r--r--  1 root root   3172 Sep  9 05:25 .bashrc
-rw-r--r--  1 root root    363 Sep  9 13:41 .dbshell
-rw-r--r--  1 root root     75 Sep 14 09:38 .selected_editor
-rw-------  1 root root 120842 Dec 21 08:25 .bash_history
-rw-------  1 root root   6342 Dec 21 08:53 .viminfo
drwx------  3 root root   4096 Dec 21 08:53 .

I checked these hidden files with du -sh. None of them looks like the problem.

du -sh /var/*
1.3M    /var/backups
143M    /var/cache
4.0K    /var/crash
82G /var/data
4.0K    /var/gearmand.pid
147M    /var/lib
4.0K    /var/local
0   /var/lock
427M    /var/log
4.0K    /var/mail
4.0K    /var/opt
0   /var/run
36K /var/spool
4.0K    /var/tmp

Directories in /var/data:

ls -altr

total 40
drwxrwxr-x  4 azureuser azureuser  4096 Aug 26 09:41 jar
drwxr-xr-x  2 root      root       4096 Sep  9 13:19 maketsv
drwxr-xr-x  8 root      root       4096 Sep 10 07:24 HadoopData
drwxrwxrwx  4 root      root       4096 Sep 11 09:33 HadoopOperations
drwxr-xr-x 13 root      root       4096 Sep 15 06:13 ..
-rwxrwxrwx  1 root      root      13990 Dec  3 12:49 supervisord.conf
drwxr-xr-x  6 root      root       4096 Dec 21 06:07 .

There could be hidden directories under /. You can get a list of hidden files with ls -altr and then run du -sh on any directories you haven't already checked.
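To cover the hidden top-level entries in a single pass (the glob patterns are standard shell, nothing here is specific to this VM):

```shell
# -x stays on the root filesystem, so the separately mounted
# /mnt, /home/hduser/mydata and /var/data/HadoopData are not counted.
# /.[!.]* matches hidden entries such as /.cache without matching . or ..
sudo du -xsh /* /.[!.]* 2>/dev/null | sort -h
```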

It also looks like an issue with /var. /var itself is not on /dev/sdd1; only /var/data/HadoopData is. So navigate into /var and see which directories are taking up the space. The most likely culprit is /var/log.
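Two things are worth checking here. First, du crosses mount points by default, so the 83G reported for /var includes the 82G that actually lives on /dev/sdd1; the -x flag limits du to the filesystem /var itself is on. Second, anything written to /var/data/HadoopData before the disk was mounted is hidden underneath the mount point; a bind mount of / exposes it. A sketch (the /tmp/rootonly path is just an example):

```shell
# Size of /var counting only the root filesystem (excludes sdd1):
sudo du -xsh /var

# Inspect what sits *underneath* the mount points by bind-mounting
# the root filesystem to a scratch directory:
sudo mkdir -p /tmp/rootonly
sudo mount --bind / /tmp/rootonly
sudo du -sh /tmp/rootonly/var/data /tmp/rootonly/mnt
sudo umount /tmp/rootonly
```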

You can run this command to get the size of each and every directory:

find . -type d -print0 | xargs -0 du -sh
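An alternative that is often easier to read is GNU du's --max-depth, which gives per-directory totals one level at a time:

```shell
# Totals for the directories directly under /var, largest last;
# -x again keeps the count to the root filesystem.
sudo du -xh --max-depth=1 /var | sort -h
```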
