Profile
Strong experience with Hadoop distributions such as Cloudera and Hortonworks.
Expertise in the Hadoop framework and Big Data concepts.
Configured a fully distributed multi-node cluster in the AWS cloud.
Working knowledge of the Hadoop architecture and its ecosystem.
Commissioning and decommissioning nodes on a running cluster, including balancing HDFS block data.
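A minimal sketch of the decommissioning-and-balancing workflow above, assuming a standard HDFS setup (the hostname and exclude-file path are illustrative; the actual exclude file is whatever dfs.hosts.exclude points to in hdfs-site.xml):

```shell
# Add the node to be decommissioned to the HDFS exclude file
# (path is an assumption for this sketch).
echo "worker-node-07.example.com" >> /etc/hadoop/conf/dfs.exclude

# Tell the NameNode to re-read its include/exclude lists and begin decommissioning.
hdfs dfsadmin -refreshNodes

# Check progress: the node moves from "Decommission In Progress" to "Decommissioned".
hdfs dfsadmin -report

# After commissioning or decommissioning nodes, rebalance HDFS so that no
# DataNode's utilization deviates from the cluster average by more than 10%.
hdfs balancer -threshold 10
```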
Adding and removing Hadoop services.
Adding new nodes to an existing cluster.
Hands-on experience installing, configuring, and using Hadoop ecosystem components such as Hive, Pig, Sqoop, and Flume.
Experienced in loading data into the cluster from dynamically generated files using Flume, from RDBMS tables using Sqoop, and from the local file system.
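Representative ingestion commands for the sources above (the JDBC URL, credentials, table name, and paths are placeholders, not real systems):

```shell
# Import an RDBMS table into HDFS with Sqoop (placeholder connection details).
sqoop import \
  --connect jdbc:mysql://db-host.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Copy a file from the local file system into the cluster.
hdfs dfs -put /tmp/events.log /data/raw/events/

# Flume ingestion is driven by an agent configuration rather than a one-off
# command; the config file name here is hypothetical.
flume-ng agent --name agent1 --conf /etc/flume/conf \
  --conf-file /etc/flume/conf/spooldir-to-hdfs.conf
```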
Troubleshooting, performance tuning, and resolving Hadoop issues using the CLI and web UI.
Working knowledge of DistCp (distributed copy) for transferring data from one cluster to another.
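A typical DistCp invocation for the cluster-to-cluster transfer described above (NameNode addresses and paths are illustrative):

```shell
# Copy a directory tree from the source cluster to the target cluster,
# updating only files that changed (-update) and preserving block size
# and permissions (-pbp).
hadoop distcp -update -pbp \
  hdfs://nn1.example.com:8020/data/warehouse \
  hdfs://nn2.example.com:8020/data/warehouse
```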
Working knowledge of creating HDFS snapshots of data.
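The snapshot workflow above can be sketched with standard HDFS commands (the directory path and snapshot name are illustrative):

```shell
# Enable snapshots on a directory (admin command), then create a named snapshot.
hdfs dfsadmin -allowSnapshot /data/warehouse
hdfs dfs -createSnapshot /data/warehouse before-upgrade

# The snapshot is a read-only view under the .snapshot subdirectory.
hdfs dfs -ls /data/warehouse/.snapshot/before-upgrade
```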