hdfs commands are the commands used to operate the Hadoop HDFS file system. Common commands include: 1. the ls command; 2. the cat command; 3. the mkdir command; 4. the rm command; 5. the put command; 6. the cp command; 7. the copyFromLocal command; 8. the get command; 9. the copyToLocal command; 10. the mv command; and so on.
The operating environment of this tutorial: Linux 5.9.8, Dell G3 computer.
What is the hdfs command?
An hdfs command is a command used to operate the Hadoop HDFS (Hadoop Distributed File System).
To operate the HDFS file system, you can use either hadoop fs or hdfs dfs; both have the same effect. (The hadoop dfs command is deprecated and no longer recommended.)
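For example, the following two commands are interchangeable and both list the root directory (assuming a running, configured HDFS cluster):
hadoop fs -ls /
hdfs dfs -ls /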
Some commonly used commands in the Hadoop HDFS system
1. hadoop fs (hdfs dfs) file operations
ls displays all files or folders in the directory
Usage: hadoop fs -ls [directory in URI form]
Example: hadoop fs -ls / displays all files and directories in the root directory.
You can add the -R option to list the contents of the directory recursively.
Example: hadoop fs -ls -R /
cat View file content
Usage: hadoop fs -cat URI [URI …]
Example: hadoop fs -cat /in/test2.txt
mkdir Create directory
Usage: hadoop fs -mkdir [directory in URI form]
Example: hadoop fs -mkdir /test
To create multi-level (nested) directories, add -p.
Example: hadoop fs -mkdir -p /a/b/c
rm Delete a directory or file
Usage: hadoop fs -rm [file path]
Example: hadoop fs -rm /test1.txt
To delete a folder, add -r.
Example: hadoop fs -rm -r /test
put Copy files to HDFS
Copies files from the local file system to HDFS; it can also read from standard input and write to the destination, in which case dst is a file.
Usage: hadoop fs -put <localsrc> ... <dst>
Example:
hadoop fs -put /usr/wisedu/temp/test1.txt /
Read from standard input: hadoop fs -put - /in/myword
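A minimal sketch of the standard-input form (the text and the target path /in/myword are only illustrative):
echo "hello hdfs" | hadoop fs -put - /in/myword
hadoop fs -cat /in/myword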
cp Copy files within HDFS
Usage: hadoop fs -cp URI [URI ...] <dest>
Copies files from the source path to the destination path. Multiple source paths are allowed, in which case the destination must be a directory.
Example:
hadoop fs -cp /in/myword /word
copyFromLocal Copy local files to HDFS
Usage: hadoop fs -copyFromLocal <localsrc> URI
Similar to the put command, except that the source path must be a local file.
get Copy files to the local file system
Usage: hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>
Copy files to the local file system. Files that failed the CRC check can be copied using the -ignorecrc option. Use the -crc option to copy the file along with the CRC information.
Example: hadoop fs -get /word /usr/wisedu/temp/word.txt
copyToLocal Copy files to the local system
Usage: hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
Similar to the get command except that the target path is limited to a local file.
Example: hadoop fs -copyToLocal /word /usr/wisedu/temp/word.txt
mv Move files
Moves files from the source path to the destination path. Multiple source paths are allowed, in which case the destination must be a directory. Moving files across different file systems is not allowed.
Usage: hadoop fs -mv URI [URI ...] <dest>
Example: hadoop fs -mv /in/test2.txt /test2.txt
du Display file size
Display the size of all files in the directory.
Usage: hadoop fs -du URI [URI ...]
Example: hadoop fs -du /
To display the total size of the directory as a single summary, add the -s option.
Example: hadoop fs -du -s /
touchz Create an empty file
Usage: hadoop fs -touchz URI [URI …]
Create an empty file with 0 bytes
Example: hadoop fs -touchz /empty.txt
chmod changes file permissions
Usage: hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI ...]
Similar to the chmod command on the Linux platform; it changes the permissions of a file. With -R, changes are made recursively through the directory structure. The user running the command must be the owner of the file or the superuser.
Example: First create a normal user test: sudo useradd -m test
Then, as the wisedu user, create a hello.txt file in the HDFS directory /a. At this point the test user still has permission to read /a/hello.txt.
Switch back to the wisedu user and change the permissions so that files under /a are no longer readable by other users: hadoop fs -chmod -R o-r /a. If you then switch to the test user and try to view /a/hello.txt, HDFS reports a permission error.
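A rough command-line sketch of this walkthrough (the user names wisedu and test and the path /a follow the example above; exact error messages vary by Hadoop version):
sudo useradd -m test
# as the wisedu user: create the directory and file, then remove read access for others
hadoop fs -mkdir -p /a
echo "hello" | hadoop fs -put - /a/hello.txt
hadoop fs -chmod -R o-r /a
# as the test user: reading the file now fails with a permission error
hadoop fs -cat /a/hello.txt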
chown Change the file owner
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
Change the owner of the file. Using -R will cause changes to be made recursively through the directory structure. The user of the command must be a superuser.
Example: hadoop fs -chown -R test /a
chgrp changes the group where the file is located
Usage: hadoop fs -chgrp [-R] GROUP URI [URI ...]
Change the group to which the file belongs. Using -R will cause changes to be made recursively through the directory structure. The user of the command must be the owner of the file or the superuser.
Example: hadoop fs -chgrp -R test /a
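As a quick check after running chown or chgrp (the exact listing depends on your cluster):
hadoop fs -ls /    # the owner and group columns for /a should now show test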
2. hdfs dfsadmin management commands
1) -report
View basic information and statistics of the file system.
Example: hdfs dfsadmin -report
2) -safemode
enter | leave | get | wait: Safe mode commands. Safe mode is a NameNode state in which the namespace is read-only (no changes are accepted) and blocks are neither replicated nor deleted. The NameNode enters safe mode automatically at startup and leaves it automatically once the configured minimum percentage of blocks satisfies the minimum replication requirement. enter puts the NameNode into safe mode manually, and leave takes it out.
Example: hdfs dfsadmin -safemode get
hdfs dfsadmin -safemode enter
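A short illustrative sequence; the get subcommand simply reports whether safe mode is on or off:
hdfs dfsadmin -safemode enter
hdfs dfsadmin -safemode get    # typically reports that safe mode is ON
hdfs dfsadmin -safemode leave
hdfs dfsadmin -safemode get    # typically reports that safe mode is OFF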
3) -refreshNodes
Re-reads the hosts and exclude files so that newly added nodes, or nodes being decommissioned from the cluster, are recognized by the NameNode. Use this command when adding or decommissioning nodes.
Example: hdfs dfsadmin -refreshNodes
4) -finalizeUpgrade
Finalizes an HDFS upgrade. The DataNodes delete their previous-version working directories, after which the NameNode does the same.
5) -upgradeProgress
status | details | force: Request the current upgrade status of the system, the details of the upgrade status, or force the upgrade to proceed.
6) -metasave filename
Saves the NameNode's main data structures to <filename> in the directory specified by the hadoop.log.dir property.
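Example (meta.log is just an illustrative file name; the file is written on the NameNode under the hadoop.log.dir directory):
hdfs dfsadmin -metasave meta.log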
7) -setQuota <quota> <dirname>...<dirname>
Sets a quota <quota> for each directory <dirname>. The directory quota is a long integer that places a hard limit on the number of names under the directory tree.
8) -clrQuota <dirname>...<dirname>
Clears the quota setting for each directory <dirname>.
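A brief illustrative sequence (the directory /quota_dir and the limit of 10 names are arbitrary values for this sketch):
hadoop fs -mkdir /quota_dir
hdfs dfsadmin -setQuota 10 /quota_dir
hdfs dfsadmin -clrQuota /quota_dir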
9) -help
Display help information