Job summary:


Title:
Data Engineer with Databricks - Hybrid

Location:
Lansing, MI, United States

Length and terms:
Long term - W2 or C2C


Position created on 03/12/2025 09:35 pm

Job description:


*** Very long-term project: initial PO for 1 year; projects with this customer usually run 3-5 years *** ***Hybrid: 2 days a week onsite*** ***In-person interview***

Description:

The position is responsible for providing ongoing maintenance and support of the Michigan Disease Surveillance System (MDSS). MDSS is a complex application that supports communicable disease surveillance, registries, and case management systems that are critical to supporting effective responses to public health emergencies and reducing the burden of communicable diseases. MDSS is undergoing modernization to enhance the stability and functionality of the system, with phase 1 already completed.

The resource is integral to developing, maintaining, and enhancing MDHHS' MDSS phase 1. This includes ensuring automated processes are functioning, streamlining critical business processes, maintaining data integrity, ensuring SEM/SUITE compliance, and securing the application. The resource also performs as a technical lead and provides technical guidance to the other developers in the department. As a technical lead, the resource participates in a variety of analytical assignments that provide for the enhancement, integration, maintenance, and implementation of projects. The resource will also provide technical oversight to other developers on the team who support other critical applications. Without this resource on staff, MDHHS would be unable to maintain, enhance, and support the modernized MDSS, which could lead to errors causing application outages and data integrity issues, and eventually to incorrect patient information being processed and reported.

Required Skills

  • 12+ years developing complex database systems.
  • 8+ years Databricks.
  • 8+ years using Elasticsearch, Kibana.
  • 8+ years using Python/Scala.
  • 8+ years Oracle.
  • 5+ years experience with Extract, Transform, and Load (ETL) processes and developing data pipelines.
  • 5+ years experience with AWS.
  • 5+ years experience with data warehousing, data visualization tools, and data integrity.
  • 5+ years using CMM/CMMI Level 3 methods and practices.
  • 5+ years implementing agile development processes, including test-driven development.

Nice to have:

  • 3+ years experience with or knowledge of creating CI/CD pipelines using Azure DevOps.

Contact the recruiter working on this position:



The recruiter working on this position is Rohit (Shaji Team) Bala
His/her contact number is
His/her contact email is rohit@msysinc.com

Our recruiters will be more than happy to help you secure this contract.