Wednesday, June 4, 2025

[xrecnet] Local Preferred: Urgent need for Sr. Data Engineer @ NYC, NY (Min 13 Years EXP Required)

Note: Please send your updated resume with contact details to hari.prasad@vajraasys.com

 

Dear,

 

Hope you are doing well.

Please find the requirement below; if you are comfortable and interested, please share your updated resume with contact details ASAP.

 

Role: Senior GCP - Data Engineer

Location: NYC, NY (Day 1 Onsite) - Only local candidates or those in the EST time zone may apply

Duration: Long Term Contract

Experience: Minimum 13 years required

 

Mandatory Details –

* Cloud Platform: GCP (Google Cloud Platform)

* Pipeline Tools: Dataflow (Apache Beam), Pub/Sub, Cloud Storage

* Orchestration: Cloud Composer (Airflow)

* Data Warehouse: BigQuery (must), Snowflake (migration knowledge)

* Data Modeling & Optimization: Schema mapping, partitioning, clustering, query tuning

* Languages: Python, SQL (advanced proficiency)

* Validation & Monitoring: Data quality checks, validation rules, pipeline observability

* Certifications: GCP-Certified Data Engineer

 

Job Description:

•            Pipeline rebuild, ingestion, transformation, and performance tuning: BigQuery, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage; Cloud Composer (Airflow) for orchestration; ETL/ELT migration from Snowflake to GCP; schema mapping, partitioning, clustering, and query tuning; Python/SQL proficiency; data quality and validation.

•            GCP-Certified Data Engineer with 5+ years of experience in cloud data engineering, preferably with prior Snowflake-to-BigQuery migration experience.
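To illustrate the partitioning and clustering work described above, here is a minimal Python sketch that builds BigQuery-style CREATE TABLE DDL. The dataset, table, and column names (analytics.events, event_date, customer_id) are hypothetical placeholders, not from this posting:

```python
# Sketch: generating BigQuery DDL with date partitioning and clustering.
# Table and column names below are illustrative assumptions only.

def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Return a CREATE TABLE statement partitioned by a date column
    and clustered on the given columns."""
    cols = ",\n  ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE {table} (\n  {cols}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = partitioned_table_ddl(
    table="analytics.events",
    columns=[("event_date", "DATE"), ("customer_id", "STRING"), ("amount", "NUMERIC")],
    partition_col="event_date",
    cluster_cols=["customer_id"],
)
print(ddl)
```

Partitioning on the date column prunes scanned data per query, and clustering on a frequently filtered column further reduces slot usage — the core of the query-tuning work the role calls for.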

 

Key Responsibilities:

* Design, build, and optimize scalable ETL/ELT pipelines using Dataflow and Pub/Sub

* Orchestrate workflows using Cloud Composer (Apache Airflow)

* Migrate large-scale structured and semi-structured datasets from AWS/Snowflake to BigQuery

* Optimize BigQuery queries using partitioning, clustering, and resource management techniques

* Implement data quality checks, validation rules, and pipeline observability

* Collaborate with analytics, product, and business stakeholders to meet data requirements
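The data-quality responsibility above can be sketched as a simple row-level validation pass of the kind a pipeline might run before loading into BigQuery. The rule names and fields here are hypothetical examples, not requirements from the posting:

```python
# Sketch: row-level data quality checks before a warehouse load.
# Rules and record fields are illustrative assumptions only.

def validate_rows(rows, rules):
    """Apply each named rule to every row; return (valid_rows, errors),
    where errors pairs a row index with the names of failed rules."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        failed = [name for name, check in rules.items() if not check(row)]
        if failed:
            errors.append((i, failed))
        else:
            valid.append(row)
    return valid, errors

rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 2, "amount": -3.0},
]

valid, errors = validate_rows(rows, rules)
print(len(valid), errors)  # 1 [(1, ['id_present']), (2, ['amount_non_negative'])]
```

In practice the same pattern shows up as assertions inside a Dataflow DoFn or as post-load checks orchestrated from Cloud Composer, with the error list feeding pipeline observability metrics.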

 

 

Thanks and Regards,
Hari Prasad | Lead Technical Recruiter | Vajraasys Limited
E: hari.prasad@vajraasys.com | vajraasys.com
3515 Plymouth Blvd, Suite 205, Plymouth, MN 55447

--
You received this message because you are subscribed to the Google Groups "Xrecnet IT Recruiters Network - Corp to Corp IT Jobs & Hotlists" group.