Data Warehouse Consultant w/ ETL (3 Positions)
Remote
6+ Months
Must have: C#, Python, and excellent communication skills
Note: Candidates will be required to take a practicum/skills evaluation in C# or Python and SQL. C# skills are a MUST, as C# is their ETL language.
Job Description:
The Enterprise Data Engineer is responsible for designing and implementing diverse and large-scale cloud and on-premise data environments utilized by ERP and its clients.
The Enterprise Data Engineer demonstrates an understanding of the operational goals and downstream applications of ERP data as they relate to the organizational mission, and ensures that all security, access, privacy, performance, uptime, and data ingestion pipeline and transformation (ETL) requirements are met or exceeded to facilitate utilization and support successful outcomes.
Creates and maintains optimal data pipeline infrastructure, environments and architecture across a variety of cloud and on-premise platforms in support of numerous concurrent engagements.
Identifies, proposes and implements internal data process improvements based on industry best practices for security, performance, accuracy, stability and scalability purposes.
Maintains a detailed understanding of, and ensures compliance with, all applicable privacy and security requirements for relevant data types, sources and contexts (e.g., FERPA, HIPAA, CJIS).
Builds and deploys infrastructure, tools and resources required for optimal extraction, transformation, and loading of data from a wide variety of data sources in various formats using SQL, AWS, Azure and other 'big data' technologies.
Supports the development of analytics tools, visualizations, dashboards and other interfaces that utilize the data pipelines to provide actionable insights in alignment with the organizational mission and goals.
Works effectively with internal and external stakeholders, including the Executive, Project Management, Data Science, Web, and Research teams, to support all aspects of project execution.
Identifies, develops and implements appropriate infrastructure (cloud and on-premises), frameworks, models, tools, resources and processes to support the work of the organization.
Collaborates with a high performing and diverse team of data scientists to ensure data infrastructure effectively supports analytical needs, including the creation of data tools and machine learning models.
Designs, develops and programs methods, processes, and systems to consolidate and analyze unstructured, diverse data sources to generate actionable insights and solutions for clients/partners and ERP.
Maintains current, accessible and clear documentation pertaining to all data infrastructure, environments and services.
Knowledge, Skills, Abilities, and Qualifications
Significant experience designing complex database environments using a variety of platforms, environments and schema methodologies, including MS SQL, Postgres and MySQL.
High degree of proficiency with SQL programming and ETL including advanced SQL queries, functions, triggers and stored procedures.
At least two years of experience working with AWS. Ideal skills include AWS Glue, Amazon Athena, AWS Transfer Family, EC2, Lambda, RDS, Route 53, S3, VPC, CloudFormation, CloudWatch, and Elastic Beanstalk.
Ability to propose, architect, design, implement and manage complex data infrastructure in cloud and on-premises environments, including AWS and Azure, with the selection and deployment of optimal services, tools and resources to support functional, performance, privacy and security requirements.
Training in data science, statistical modeling, visualization and analysis techniques, including the use of collaborative interfaces such as Jupyter.
Experience developing in C#, Python and related frameworks
Proficient with relational database methodologies, including referential integrity and data normalization.
Experience working on large-scale cloud-native and internally hosted big data and data science initiatives.
Experience with machine learning, predictive analytics and data visualization.
An intrinsic commitment to improving public education, equity and life outcomes.
Education and Experience Requirements:
An advanced degree in a quantitative field - computer science, engineering, statistics, operations research, economics, etc.
Minimum of five years of progressive relevant experience in a data-intensive setting.
Regards,
Satya
Technical Recruiter
Key Business Solutions, Inc. || Office: 916 646 2080 Ext 216 || Fax: 916 646 2081 || Email: satya@keybusinessglobal.com || Website: www.key-soft.com || Yahoo IM/G Talk: satyakeysoft
Notice: This email is not intended to be a solicitation. Please accept our apologies and reply in the subject heading with REMOVE to be removed from our Mailing list. Thank You.