In a Hadoop cluster, how do you contribute a limited/specific amount of storage as a slave (DataNode) to the cluster?

We can do this easily by creating a partition of the specific size we want on the storage device attached to the DataNode.

Prasantmahato
Oct 18, 2020
Hadoop Cluster

In this task, I used an external EBS block volume of 1 GB and devoted all of its space to the DataNode directory.

Procedure

Step 1

I created a volume of 1 GB and attached it to my DataNode instance.

Attached the external block volume.
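Once the volume is attached, you can confirm the operating system sees it. A minimal sketch, assuming the new device shows up as /dev/xvdf (the name on your instance may differ):

```shell
# List block devices; the new 1 GB volume should appear
# without any partitions or mount point yet.
lsblk

# fdisk -l prints the raw device and its size in detail.
# (/dev/xvdf is an assumption -- check lsblk output first.)
fdisk -l /dev/xvdf
```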

Step 2

Since the storage device I attached is new, we first have to create a partition on it. I created a partition spanning all of the available space.

Created a partition.
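The partitioning step can be sketched with fdisk. This assumes the device is /dev/xvdf; inside fdisk the keystrokes are interactive, shown here as comments:

```shell
# Open the new device in fdisk (run as root).
fdisk /dev/xvdf
# Inside fdisk, the interactive sequence is roughly:
#   n        -> new partition
#   p        -> primary
#   1        -> partition number
#   <Enter>  -> accept default first sector
#   <Enter>  -> accept default last sector (use all remaining space)
#   w        -> write the partition table and exit

# Ask the kernel to re-read the partition table so /dev/xvdf1 appears.
partprobe /dev/xvdf
```

To take only part of the device instead of the whole space, give an explicit size (e.g. `+512M`) at the last-sector prompt; that is what makes the contributed storage "limited".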

Step 3

I formatted the partition with the XFS file system.

Formatting the partition.
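Formatting is a single command, assuming the partition created above is /dev/xvdf1:

```shell
# Create an XFS file system on the new partition (run as root).
# WARNING: this erases anything already on /dev/xvdf1.
mkfs.xfs /dev/xvdf1
```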

Step 4

I mounted the partition on the directory /dn1, which is the directory provided to Hadoop to be shared.

Mounting the partition on the directory.
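A sketch of the mount step, assuming the partition is /dev/xvdf1 and that /dn1 is the directory configured as the DataNode's data directory in hdfs-site.xml:

```shell
# Create the mount point and mount the formatted partition on it.
mkdir -p /dn1
mount /dev/xvdf1 /dn1

# /dn1 should match the DataNode data directory configured in
# hdfs-site.xml, i.e. something like the following inside the
# <configuration> element:
#   <property>
#     <name>dfs.datanode.data.dir</name>
#     <value>/dn1</value>
#   </property>
```

Because HDFS only writes into the configured data directory, the DataNode can now use at most the size of the file system mounted there.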

Step 5

The partition I created is now successfully attached to the DataNode directory that is to be shared.

Successfully attached the partition to the directory.
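You can verify the mount, and optionally make it survive reboots. The device and mount point names are the same assumptions as before:

```shell
# Confirm the 1 GB file system is mounted on the DataNode directory.
df -h /dn1

# Optional: persist the mount across reboots by adding an fstab entry
# (assumes /dev/xvdf1 and XFS as above).
echo '/dev/xvdf1 /dn1 xfs defaults 0 0' >> /etc/fstab
```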

Step 6

I started the DataNode daemon, and now exactly the amount of storage I wanted is shared with the cluster.

Started my DataNode daemon.
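The final step can be sketched as follows; the exact start command depends on your Hadoop version:

```shell
# Start the DataNode daemon.
# Hadoop 1.x / 2.x style:
hadoop-daemon.sh start datanode
# On Hadoop 3.x the equivalent is:
#   hdfs --daemon start datanode

# From the NameNode, check that this node now reports
# roughly 1 GB of configured capacity.
hdfs dfsadmin -report
```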

In this simple way, we can contribute a limited/specific amount of storage to a Hadoop cluster.

Thank you!
