Job summary:
Title:
Data Architect - Remote
Location:
Remote
Length and terms:
Long-term - W2 or C2C
Position created on 01/24/2025 07:29 pm
Job description:
*** Webcam interview *** Long-term project *** Remote ***
Skills: Azure Data Factory, Azure Data Lake, Informatica, Guidewire, Python/Spark
Job Description:
We are looking for an experienced Data Architect with a strong background in building scalable data solutions, integrating enterprise systems, and leveraging cloud-based technologies. The ideal candidate will have hands-on experience with Azure, ETL tools, and advanced data pipeline creation, along with deep expertise in the Property and Casualty (P&C) insurance domain and Guidewire Data.
Key Responsibilities:
- Design and build metadata-driven data pipelines using tools such as Azure Data Factory (ADF) and Informatica.
- Develop and optimize Operational Data Stores (ODS) leveraging Azure Data Lake.
- Implement and manage data solutions on Azure, ensuring efficient cloud resource utilization and cost optimization.
- Use Azure Functions for data processing automation and orchestration.
- Work with Guidewire Data and ensure seamless integration and processing.
- Write robust and scalable code using Python, T-SQL, and Spark to support custom data transformation processes.
- Integrate and process data from diverse sources into Azure Data Lake and SQL Server.
- Apply Hadoop knowledge (a plus) to large-scale data processing and storage needs.
- Utilize prior Property and Casualty (P&C) insurance domain experience to align technical solutions with business requirements.
- Collaborate with stakeholders to gather requirements, define data strategies, and translate business goals into technical implementations.
- Provide clear and effective communication across technical and non-technical teams.
Required Skills:
- Azure: Strong experience with Azure Data Factory, Azure Data Lake, Azure Functions, and general cloud architecture.
- ETL Tools: Expertise in Informatica and building scalable ETL workflows.
- SQL Server: Advanced knowledge of SQL Server and T-SQL for data management and processing.
- Programming: Proficiency in Python and Spark for data engineering tasks.
- Guidewire Data: Experience working with Guidewire Data.
- ODS Development: Proven expertise in building ODS from Data Lakes.
- Cloud Optimization: Hands-on experience with Azure cloud cost optimization strategies.
- P&C Domain: Strong understanding of the Property and Casualty insurance domain.
- Communication Skills: Excellent verbal and written communication skills are crucial for this role.
- Data Migration: Experience working on large-scale data migration projects.
Contact the recruiter working on this position:
The recruiter working on this position is Sandeep Maraganti (Shaji Team).
You can reach him by email at sandeep.maraganti@msysinc.com.
Our recruiters will be more than happy to help you get this contract.