Job summary:


Title:
Hadoop/Big Data Developer with Core Java

Location:
Raleigh, NC, USA

Length and terms:
6+ Months - W2 or C2C


Position created on 03/06/2018 03:08 pm

Job description:


JobID: 1423

Position: Hadoop/Big Data Developer with Core Java

Location:  Raleigh, NC
Duration: 6+ Months

Experience Required

7+ years of development experience

Programming languages: Java and Python required; knowledge of C is a plus

Experience with Hadoop development and QA

Experience with NFS (preferably ONTAP) and shared file systems development

Experience with Hortonworks technologies (HDP, Ambari) and Cloudera technologies (CDH) and their management software

Knowledge of Hadoop ecosystem components such as MapReduce, Spark, HBase, Ranger, and Sentry

Security framework experience with Kerberos and LDAP

Work:

Add features to the NFS Connector to make it Hadoop Compatible File System (HCFS) and HDFS API compliant

Enhance metadata management for the NFS Connector

Ambari integration for NFS connector

Ranger and Kerberos support for NFS Connector

Product Packaging with management interface across Cloudera and Hortonworks

Add a logging mechanism for the NFS Connector

Spark/Hive integrations.

Garbage collection management.

Make the NFS Connector compliant with Atlas, Flume, and Sqoop

Make Hadoop management APIs work with the NFS Connector

Enable NFS as the default file system for Hadoop and Spark.
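For candidates unfamiliar with the last task: making an HCFS-compliant connector the default file system is typically a matter of Hadoop configuration. The sketch below shows the general shape of such a setup in core-site.xml; the class names, NFS host, and port are illustrative assumptions, not the actual connector's values.

```xml
<!-- core-site.xml sketch: register an HCFS implementation for the
     nfs:// scheme and make it the cluster's default file system.
     Class names and the nfs://filer.example.com:2049 endpoint are
     hypothetical placeholders for this posting. -->
<configuration>
  <property>
    <name>fs.nfs.impl</name>
    <value>org.apache.hadoop.fs.nfs.NFSv3FileSystem</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.nfs.impl</name>
    <value>org.apache.hadoop.fs.nfs.NFSv3AbstractFilesystem</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>nfs://filer.example.com:2049</value>
  </property>
</configuration>
```

With a setting like this in place, `hdfs dfs -ls /` and Spark jobs resolve unqualified paths against the NFS connector instead of HDFS.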


Contact the recruiter working on this position:



The recruiter working on this position is Rakesh Murali.
His contact number is +1 (510) 470-6154.
His contact email is rakesh_murali@msysinc.com.

Our recruiters will be more than happy to help you get this contract.