Senior Associate Data Engineer
Remote (PST working hours preferred)
6+ Months
Hands-on Spark and Scala coding (a minimal sketch follows this list)
Knowledge of Hive tables
Knowledge of Parquet and ORC file formats
Experience in Snowflake
Experience/knowledge of AWS, Azure, or GCP; Java; Hadoop
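For context on the Spark/Scala, Hive, and Parquet/ORC items above, here is a minimal Scala sketch of reading a Hive-registered table with Spark and writing it out in both columnar formats. The table name and output paths are placeholders for illustration, not part of the role description:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object HiveToColumnarFormats {
  def main(args: Array[String]): Unit = {
    // Hive support lets Spark resolve tables registered in the Hive metastore.
    val spark = SparkSession.builder()
      .appName("hive-to-parquet-orc")
      .enableHiveSupport()
      .getOrCreate()

    // "sales.orders" and the output paths below are hypothetical examples.
    val orders = spark.table("sales.orders")

    // Persist the same data in both columnar formats (Parquet and ORC).
    orders.write.mode(SaveMode.Overwrite).parquet("s3a://example-bucket/curated/orders_parquet")
    orders.write.mode(SaveMode.Overwrite).orc("s3a://example-bucket/curated/orders_orc")

    spark.stop()
  }
}
```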
We would like to interview candidates who have the following skills:
Extensive work experience in SQL, JavaScript, and Node.js (must have)
Working experience in Snowflake (must have)
Working experience in Azure Data Factory (nice to have)
Asset management experience (nice to have)
We are looking for a Senior Associate Data Engineer to be part of our team of top-notch technologists. You will lead and deliver technical solutions for large-scale digital transformation projects. Working with the latest data technologies in the industry, you will be instrumental in helping our clients evolve for a more digital future.
Your Impact:
· Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients' business
· Translate client requirements to system design and develop a solution that delivers business value
· Lead, design, develop and deliver large-scale data systems, data processing and data transformation projects
· Automate data platform operations and manage the post-production system and processes
· Conduct technical feasibility assessments and provide estimates for design and development of the solution
Skills & Experience:
· Demonstrable experience with data platforms, including implementation of end-to-end data pipelines
· Ability to apply Snowflake best practices to a Snowflake Data Warehouse; strong development background in Snowflake stored procedures using JavaScript and SQL, data modeling, and Snowflake database setup, configuration, and deployment
· Hands-on experience with Azure cloud data services
· Implementation experience with column-oriented database technologies, NoSQL database technologies (e.g., Cosmos DB), and traditional database systems (e.g., SQL Server, MySQL)
· Experience implementing data pipelines for both streaming and batch integrations using tools/frameworks such as Azure Data Factory, Spark, Spark Streaming, and Python scripting (see the sketch after this list)
· Ability to take on module- or track-level responsibilities and contribute to tasks hands-on; experience in data modeling, warehouse design, and fact/dimension implementations
· Experience working with code repositories and continuous integration
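To illustrate the streaming/batch pipeline and Snowflake items above, here is a minimal Scala sketch assuming the open-source Snowflake Spark connector is on the classpath. Every path, table name, and connection value is a placeholder (in practice they would come from configuration or a secret store), and the schema is purely illustrative:

```scala
import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}

object StreamingToSnowflakeSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("streaming-to-snowflake")
      .getOrCreate()

    // Placeholder connection options; real values belong in a secret store, not code.
    val sfOptions = Map(
      "sfURL"       -> "myaccount.snowflakecomputing.com",
      "sfUser"      -> "ETL_USER",
      "sfPassword"  -> sys.env.getOrElse("SNOWFLAKE_PASSWORD", ""),
      "sfDatabase"  -> "ANALYTICS",
      "sfSchema"    -> "RAW",
      "sfWarehouse" -> "LOAD_WH"
    )

    // Streaming source: JSON files landing in an illustrative ADLS staging folder.
    val events = spark.readStream
      .format("json")
      .schema("event_id STRING, event_ts TIMESTAMP, payload STRING")
      .load("abfss://landing@examplestorage.dfs.core.windows.net/events/")

    // The Snowflake connector is a batch sink, so each micro-batch is appended via foreachBatch.
    val query = events.writeStream
      .foreachBatch { (batch: DataFrame, _: Long) =>
        batch.write
          .format("net.snowflake.spark.snowflake") // Snowflake Spark connector source name
          .options(sfOptions)
          .option("dbtable", "EVENTS_RAW")
          .mode(SaveMode.Append)
          .save()
      }
      .option("checkpointLocation",
        "abfss://landing@examplestorage.dfs.core.windows.net/checkpoints/events/")
      .start()

    query.awaitTermination()
  }
}
```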
Regards,
Satya
Technical Recruiter
Key Business Solutions, Inc || Office: 916 646 2080 Ext 216 || Fax: 916 646 2081 || Email: satya@keybusinessglobal.com || Website: www.key-soft.com || Yahoo IM/G Talk: satyakeysoft