Job summary:


Title:
Hadoop Cluster Admin with Hortonworks

Location:
Omaha, Nebraska, United States

Length and terms:
Long Term Contract - W2 or C2C


Position created on 04/26/2017 08:07 pm

Job description:


Required Technical Skills

In-depth understanding of Linux.

Knowledge of Java.

In-depth knowledge of Hadoop ecosystem tools, Kerberos, Ranger, Hadoop authentication/authorization, and NoSQL databases.

NoSQL data modeling techniques and MapReduce design patterns.

Must have experience in a production environment with the Hortonworks distribution.

Responsibilities

The Hadoop Admin is responsible for administering the Hadoop cluster and applying new versions and patches through Ambari.

Assist in implementing security for Hadoop big data projects.

Assist the build team with environment requirements; design and develop new processes for better maintenance of the environments.

Help the application and operations teams troubleshoot performance and configuration issues.

Participate in the on-call rotation.

The candidate should be able to complete most big data ecosystem administration and design tasks independently, with little guidance.

Main Accountability #1: Hadoop Cluster Administration

Responsible for implementation and ongoing administration of Hadoop infrastructure.

Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.

Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, creating Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users (a sketch of this onboarding flow appears at the end of this list).

Cluster maintenance, as well as creation and removal of nodes.

Performance tuning of Hadoop clusters and Hadoop MapReduce routines.

Screen Hadoop cluster job performance and perform capacity planning.

Monitor Hadoop cluster connectivity and security.

Manage and review Hadoop log files.

File system management and monitoring.

HDFS support and maintenance.

Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.

Develop new processes for better maintenance of the environments.

Set up best practices for monitoring the cluster.

Implement security for Hadoop big data.

Help the application and operations teams troubleshoot performance issues.

Point of Contact for Vendor escalation.

Participate in the on-call rotation to provide 24x7x365 production support.

Experience working with the Linux platform and setting up high availability (HA).

Experience with NoSQL databases is a plus.

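As a rough illustration of the user-onboarding steps above (Linux account, Kerberos principal, HDFS home directory, access smoke test), the following is a minimal sketch only. The user name, realm, and keytab path are hypothetical, and it simply wraps the standard useradd, kadmin.local, hdfs, and kinit command-line tools; a real rollout would follow the cluster's own standards and its Ambari/Ranger policies.

    #!/usr/bin/env python3
    """Sketch: onboard a new Hadoop user on a Kerberized HDP cluster.

    Assumptions (hypothetical values, adjust for the real cluster):
      - user "jdoe", Kerberos realm "EXAMPLE.COM", keytab under /etc/security/keytabs
      - runs as root on a node with useradd, kadmin.local, hdfs, and kinit installed
    """
    import subprocess

    USER = "jdoe"                      # hypothetical new user
    REALM = "EXAMPLE.COM"              # hypothetical Kerberos realm
    KEYTAB = f"/etc/security/keytabs/{USER}.keytab"

    def run(cmd):
        """Run a command and fail loudly if it returns non-zero."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Create the Linux account.
    run(["useradd", "-m", USER])

    # 2. Create a Kerberos principal and export its keytab (needs kadmin rights).
    run(["kadmin.local", "-q", f"addprinc -randkey {USER}@{REALM}"])
    run(["kadmin.local", "-q", f"xst -k {KEYTAB} {USER}@{REALM}"])

    # 3. Create the user's HDFS home directory (as the HDFS superuser).
    run(["hdfs", "dfs", "-mkdir", "-p", f"/user/{USER}"])
    run(["hdfs", "dfs", "-chown", f"{USER}:hadoop", f"/user/{USER}"])

    # 4. Smoke test: obtain a ticket as the new principal and list the home dir.
    run(["kinit", "-kt", KEYTAB, f"{USER}@{REALM}"])
    run(["hdfs", "dfs", "-ls", f"/user/{USER}"])

Hive and Pig access checks would follow the same pattern, for example a beeline connection to HiveServer2 using the new principal.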

Main Accountability #2: DBA Responsibilities Performed by the Hadoop Administrator

Data modeling, design, and implementation based on recognized standards.

Software installation and configuration.

Database backup and recovery.

Database connectivity and security.

Performance monitoring and tuning.

Disk space management.

Software patches and upgrades.

Automate manual tasks (see the sketch immediately after this list for one example).
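
As one example of the disk space management and "automate manual tasks" duties above, the following is a small sketch that checks cluster-wide HDFS usage and flags it when it crosses a limit. The 80% threshold is an arbitrary illustration, and the parsing of the dfsadmin report is an assumption, since its exact format varies between Hadoop versions.

    #!/usr/bin/env python3
    """Sketch: routine HDFS disk-space check via 'hdfs dfsadmin -report'."""
    import re
    import subprocess

    THRESHOLD_PCT = 80.0  # example alert threshold, not a recommendation

    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout

    # The first "DFS Used%" line in the report describes the whole cluster.
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if match is None:
        raise SystemExit("Could not find 'DFS Used%' in the dfsadmin report")

    used_pct = float(match.group(1))
    print(f"Cluster DFS usage: {used_pct:.2f}%")

    if used_pct > THRESHOLD_PCT:
        # In practice this would page the on-call admin or open a ticket.
        print(f"WARNING: HDFS usage is above the {THRESHOLD_PCT}% threshold")

Scheduled through cron or an existing monitoring framework, a check like this replaces a manual review of the dfsadmin report.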

 

Main Accountability #3: Process Support

Develop infrastructure documentation. Responsible for code migrations from development to QA and production, and for providing operational instructions for deployments.

Execute and provide feedback on operational policies, procedures, processes, and standards.

Support any process that needs attention in the production environment.

Please send your resume to venu.gopal@msysinc.com.


Contact the recruiter working on this position:



The recruiter working on this position is Venu Gopal.
His contact number is +1 (919) 371-2451
His contact email is venu.gopal@msysinc.com

Our recruiters will be more than happy to help you get this contract.