Job summary:


Title:
ETL Developer with Azure - Onsite

Location:
Washington, DC, United States

Length and terms:
Long term - W2, C2C, or 1099


Position created on 04/17/2024 04:10 PM

Job description:


*** W2, 1099, or C2C *** Webcam interview *** Long-term project (projects with this customer usually run for multiple years) *** Onsite ***

Short Description:

The Enterprise Data team at OCTO requires an ETL data engineer to support data operations for its Cloud Data Exchange. The resource will use native Azure tools to perform ETL, data loading, and data transformation tasks.

Job Description:

The ETL data engineer will support the OCTO Enterprise Data team with data curation, processing, and transformation. Specifically, the ETL data engineer will be responsible for the following tasks:

Responsibilities:

  • Analyzes, designs, develops, implements, replicates, and supports complex enterprise data projects.
  • Interfaces with other agencies; consults with and informs user departments on system requirements; advises on current-state environment constraints and operating difficulties; advises on and resolves problems using cloud solutions; and develops and replicates future enhancements to the District's data systems.
  • Applies strong knowledge of Extraction, Transformation, and Loading (ETL) processes using frameworks such as Azure Data Factory, Synapse, Databricks, and Informatica, gathering requirements from stakeholders or analyzing existing code to perform enhancements or new development.
  • Establishes cloud and on-premises connectivity across systems such as ADLS, ADF, Synapse, and Databricks.
  • Applies hands-on experience with Azure cloud services such as Azure Data Factory, Azure Synapse, MS SQL Server, Azure SQL Database, Azure Data Lake Storage Gen2, and Blob Storage, as well as Python.
  • Creates end-to-end pipelines that read data from multiple sources or source systems and load it into the landing layer or SQL tables.
  • Familiarity/experience with data integration and data pipeline tools (e.g., Informatica, Synapse, Apache NiFi, Apache Airflow)
  • Familiarity/experience with various data formats, including database-specific (Oracle, SQL Server, DB2, Quickbase), text (CSV, XML), and binary (Parquet, Avro)
  • Develops, standardizes, and optimizes existing data workflows/pipelines, adhering to best practices.
  • Adheres and contributes to enterprise data governance standards by ensuring data accuracy, consistency, security, and reliability.
  • Automates, monitors, and manages data pipelines and workflows, including alerting.
  • Analyzes and evaluates system changes to determine feasibility, and provides alternative solutions along with backup and rollback procedures.
  • Works on the development of new systems and on upgrades and enhancements to existing systems, ensuring systems follow approved standards and remain consistent after changes.
  • Develops complex programs and reports in database query languages.
  • Familiarity/experience with data visualization tools.
  • Familiarity/experience handling and securing sensitive data according to its level of sensitivity
  • Demonstrates expertise in conveying technical and functional concepts for a specific technical specialty.
  • Identifies improvements to project standards to achieve high-quality services/products. This is a professional position that may require subject matter expertise consistent with demanding and rare technological skills.
  • May require coordination of programming activities being conducted by the application development team
  • Confers with other business and technical personnel to resolve problems of intent, inaccuracy, or feasibility of computer processing and project design.
  • Works with the necessary and interested personnel to determine whether modifications or enhancements are needed.
  • Leverages excellent written and verbal communication skills to develop new business process and programming solutions as directed by business and technical stakeholders.
  • May coordinate activities of application developers.
  • Able to identify best practices and standards for the use of the product.
  • Proven track record of hands-on technical design and code work within large, complex systems.
  • Proven hands-on technical work with a variety of technologies.
  • Demonstrated technical expertise integrating a variety of diverse technical environments and cross platform technologies.
  • Delivers support and design for industry-specific applications that require integration with statewide systems or applications.
  • Interacts with executive level business users or technical experts.
  • Advanced experience in the required technical subject matter.
  • May function as a niche technical SME (Subject Matter Expert).
  • Has proven experience across large and complex implementations and systems.

Minimum Education/Certification Requirements:

  • Bachelor's degree in Information Technology or a related field, or equivalent experience

Required Skills:

  • Strong knowledge of developing Extract, Transform, Load (ETL) processes, including end-to-end pipelines that load data from multiple sources (15 years)
  • Ability to gather and document requirements for data extraction, transformation, and load processes (15 years)
  • Understanding of data warehousing, data lake, business intelligence, and information management concepts and standards (15 years)
  • Ability to advise internal and external customers on appropriate tools and systems for complex data processing challenges (15 years)
  • Knowledge and use of SQL for relational databases (11 years)
  • Experience with various data formats, including database-specific (Oracle, SQL, Postgres, DB2), text (CSV, XML), and binary (Parquet, Avro) (11 years)
  • Contribution to enterprise data governance standards by ensuring accuracy, consistency, security, and reliability (7 years)
  • Strong experience with Microsoft Azure tools such as Azure Data Factory, Synapse, SQL Database, Data Lake Storage Gen2, and Blob Storage (5 years)
  • Experience with data integration and data pipeline tools such as Informatica PowerCenter, Apache NiFi, Apache Airflow, and FME (5 years)
  • Strong communication skills, both oral and written (3 years)
  • Ability to provide excellent customer service to external partners (3 years)
  • Ability to work independently or as part of a larger team (3 years)

Highly Desired Skills:

  • Experience with visualization and reporting software such as MicroStrategy, Tableau, and Esri ArcGIS (3 years)
  • Experience performing data functions with Databricks (3 years)

Contact the recruiter working on this position:



The recruiter working on this position is Nadeem Ahmed Razvi (Shaji Team).
His/her contact number is +1 (202) 738-1674.
His/her contact email is nadeem@msysinc.com.

Our recruiters will be more than happy to help you get this contract.