Job summary:
Title:
AWS ETL Developer
Location:
Raleigh, NC, United States
Length and terms:
Long term - W2 or C2C
Position created on 10/06/2020 08:39 pm
Job description:
Interview Type: Skype *** Very long-term project; initial PO for 1 year, expected to run 4+ years *** Remote during COVID, then onsite; the candidate must pick up a laptop in person for remote work.
The ETL Developer is responsible for:
- Work with Data Architects, Analysts, and Scientists to aid in BIDP data pipeline efforts
- Configure ingestion and format validation of the new workstreams
- Verify initial and incremental data uploads and maintain the workstreams
- Implement data pipeline-based technologies within and associated with the BIDP environment
- Perform data pipeline testing
- Conduct data pipeline optimization, troubleshooting, and debugging; report regularly on the health and performance of jobs
- Maintain ownership of release activities that interact with pipeline projects
- Support and improve data pipeline automation
- Assist in discovering, evaluating, and qualifying new technologies around BIDP data pipeline functions
- Partner with BIDP stakeholders to establish and maintain policies, procedures, operational standards around the data pipeline functions
- Help the business leverage the BIDP through the data pipeline process
15%
Perform data pipeline process management and lifecycle support by outlining the process and setting the boundaries for data consumption and processing within the BIDP. Provide architecture and data flows for the data pipelines. Document the requirements of the data pipeline process and tools, and manage their development. Work with stakeholders in the development and maintenance of all data pipeline-based documentation. Take part in the development and implementation of data consumption tools within the BIDP environment. Ensure that data pipelines comply with all regulatory standards. Prepare and report on activities and utilization around data pipelines.
65%
Develop data pipelines, processes, and designs for data being consumed within the BIDP. Identify trends and opportunities for growth through analysis of complex data sets. Evaluate methods and provide source-to-target mappings and information-model specification documents for data sets. Work directly with stakeholders to gather requirements for data pipelines for the BIDP. Work closely with the business to understand and maintain focus on their analytical needs, including identifying critical metrics and KPIs, and deliver actionable insights to relevant stakeholders. Define and implement data acquisition and integration logic, selecting the appropriate combination of methods and tools within the defined technology stack to ensure optimal scalability and performance of the solution.
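For illustration only (not part of the posting's requirements): a minimal sketch of one common acquisition-and-integration pattern on this stack, loading files from S3 into Redshift with a COPY statement issued through the boto3 Redshift Data API. The bucket, table, cluster, and IAM role names are hypothetical placeholders.

    import boto3

    # Redshift Data API client; the region and all identifiers below are hypothetical.
    redshift_data = boto3.client("redshift-data", region_name="us-east-1")

    COPY_SQL = """
        COPY analytics.orders_staging
        FROM 's3://example-bidp-landing/orders/2020/10/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
        FORMAT AS PARQUET;
    """

    def load_orders():
        """Submit the COPY and return the statement id so its status can be polled later."""
        response = redshift_data.execute_statement(
            ClusterIdentifier="example-bidp-cluster",
            Database="bidp",
            DbUser="etl_user",
            Sql=COPY_SQL,
        )
        return response["Id"]

    if __name__ == "__main__":
        print("Submitted COPY statement:", load_orders())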
20%
Conduct testing of tools and data pipelines. Perform root cause analysis on all processes, resolve all production issues, validate all data, perform routine tests on data sets, and provide support for all data pipeline connections. Document all test procedures for BIDP data pipeline tools and processes, and coordinate with stakeholders and exchange partners to resolve issues and maintain quality.
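As a hedged example of the routine data-set checks described above (the thresholds, column semantics, and sample numbers are assumptions, not taken from the posting), a simple post-load validation might compare row counts and flag null business keys:

    def validate_load(source_rows, target_rows, null_key_count):
        """Return a list of validation failures; an empty list means the load passed."""
        failures = []
        if target_rows != source_rows:
            failures.append(
                f"row count mismatch: source={source_rows}, target={target_rows}"
            )
        if null_key_count > 0:
            failures.append(f"{null_key_count} rows loaded with a NULL business key")
        return failures

    if __name__ == "__main__":
        # Example run with made-up numbers; real counts would come from the
        # source extract and the target table.
        print(validate_load(source_rows=1000, target_rows=998, null_key_count=2))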
Competencies, Knowledge, Skills and Abilities Required in this Position
- Strong knowledge and hands-on experience with data pipeline tools
- Experience in data modeling
- AWS
- Knowledge of compliance frameworks (PCI, SOX, SOC 2, ISO 27001) and the ability to apply their requirements and concepts to a complex environment
- Experience in supporting a data warehouse in identifying and revising reporting requirements
- Knowledge and hands-on expertise with processing confidential data and information according to guidelines
- Strong interpersonal communication and ability to solve complex problems
- Hands-on experience with databases, e.g., SQL, MySQL, Oracle, Redshift (preferred), etc.
- Experience in model design, segmentation techniques, and ETL frameworks
- Knowledge and understanding of micro-services architecture
- Strong analytic skills, including mining, evaluation, analysis, and visualization
- AWS, Azure, or GCP background (preferably all three)
- Experience with enterprise-level cloud-based development, deployment, and auditing, including PaaS, IaaS, and SaaS (preferred)
- Proficiency in scripting and markup languages, e.g., XML, JavaScript, etc.
Required skills:
- 7 years of experience
- 5 years of ETL experience
- 5 years of experience with AWS services, especially Data Pipeline, Glue, Kinesis, and Redshift Spectrum (see the sketch after this list)
- 3 years of in-depth experience with Amazon Redshift and working knowledge of an S3 data lake environment; AWS Certified Database certification (preferred)
- 3 years of experience in schema design and data modeling
- 5 years of knowledge of compliance frameworks (PCI, SOX, SOC 2, ISO 27001) and the ability to apply their requirements and concepts to a complex environment
- Knowledge and hands-on expertise with Structured Query Language (SQL)
- Strong interpersonal communication and ability to solve complex problems
- Experience in debugging
- Strong technical experience, knowledge, and understanding of micro-services architecture
- 5 years of experience with enterprise-level cloud-based development, deployment, and auditing, including PaaS, IaaS, and SaaS (preferred)
- Excellent knowledge of data backup, recovery, and integrity for database solutions
- 1 year of experience with/knowledge of Dell Boomi
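For context on the AWS services named above, a minimal sketch (the job name and region are hypothetical) of starting an AWS Glue job run with boto3 and polling it to a terminal state, the kind of orchestration an ETL developer in this role would automate:

    import time
    import boto3

    # Glue client; the region and job name below are hypothetical placeholders.
    glue = boto3.client("glue", region_name="us-east-1")

    def run_glue_job(job_name="example-bidp-ingest-job"):
        """Start the Glue job, poll until it reaches a terminal state, and return that state."""
        run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
        while True:
            state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
            if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
                return state
            time.sleep(30)

    if __name__ == "__main__":
        print("Glue job finished with state:", run_glue_job())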
Contact the recruiter working on this position:
The recruiter working on this position is Abaka Kartik (Shaji Team)
His/her contact number is +1 (571) 281-2089
His/her contact email is karthik.abaka@msysinc.com
Our recruiters will be more than happy to help you get this contract.