***W2 Only - No C2C*** Long-term project; webcam interview; hourly contract or full-time with benefits; please check our benefits here: https://msysinc.com/benefits/
Skills and Qualifications:
· Data Engineer with a Bachelor's degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems, or Engineering is required
· 1+ year experience with Snowflake database - MUST HAVE
· AWS cloud experience (EC2, S3, Lambda, EMR, RDS, Redshift)
· Experience in ETL and ELT workflow management - MUST HAVE
· Familiarity with AWS Data and Analytics technologies such as Glue, Athena, Spectrum, Data Pipeline
· Experience building internal cloud to cloud integrations is ideal
· Experience with streaming technologies (e.g., Spark Streaming) or message brokers such as Kafka is a plus
· 3+ years of Data Management Experience
· 3+ years of batch ETL tool experience (Data Stage / Informatica / Talend)
· 3+ years’ experience developing, deploying and supporting scalable and high-performance data pipelines (leveraging distributed, data movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
· 2+ years’ experience with Hadoop Ecosystem (HDFS/S3, Hive, Spark)
· 2+ years' experience in software engineering, leveraging Java, Python, Scala, etc.
· 2+ years’ advanced distributed schema and SQL development skills including partitioning for performance of ingestion and consumption patterns
· 2+ years’ experience with distributed NoSQL databases (Apache Cassandra, Graph databases, Document Store databases)
· Experience in the financial services, banking, and/or insurance industries is a nice-to-have
The recruiter working on this position is Sarath Chandra (Ravi Team).
His contact number is +1 (703) 646-8499.
His contact email is sarath@msysinc.com.
Our recruiters will be more than happy to help you get this contract.