Koch

Lead Data Engineer

Description

The responsibilities will include:

• Hands-on experience with AWS technologies and at least one full life cycle project using various cloud technologies

• Hands-on expertise in writing, modifying, and tuning SQL queries

• Optimize, scale, and tune the data warehouse on AWS Redshift and the data lake on AWS S3

• Analysis, design, development, testing, and support of Talend DI/Big Data solutions

• Backtrack and troubleshoot failures and provide fixes

• Provide on-demand support for data feeds from various ERP source systems into the data warehouse

• Proactively identify gaps and propose improved solutions to the existing system

• Ability to work independently on full life cycle projects

• Support Spark and AWS EMR configurations, along with Spark scripting

• Analyze large, complex data sets to resolve data quality issues

• Document requirements and create/update system specifications as well as code documentation

• Design and develop new Talend Big Data jobs using best practices laid down by the GP Technical Lead

• Adapt to the technical stack used by the Procurement Data Lake and Data Warehouse

• Day-to-day work will be directed by the technical lead, who is part of the KTC; all team members are required to join a daily status call with the GP Technical Lead
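To illustrate the kind of SQL tuning the role calls for, here is a minimal sketch of the inspect-the-plan-then-adjust loop, shown with SQLite (from the Python standard library) purely so it runs anywhere. The table, column names, and data are hypothetical; Redshift tuning itself uses EXPLAIN together with distribution and sort keys rather than B-tree indexes, but the workflow is analogous.

```python
import sqlite3

# Hypothetical purchase-order table; names and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE po_lines (po_id INTEGER, vendor TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO po_lines VALUES (?, ?, ?)",
    [(i, f"vendor{i % 50}", i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return the query plan as a single string (EXPLAIN QUERY PLAN detail column)."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " | ".join(r[-1] for r in rows)

query = "SELECT SUM(amount) FROM po_lines WHERE vendor = 'vendor7'"
before = plan(query)   # without an index: a full table SCAN

conn.execute("CREATE INDEX idx_vendor ON po_lines (vendor)")
after = plan(query)    # with the index: a SEARCH USING INDEX

print(before)
print(after)
```

The same loop applies on Redshift: run EXPLAIN, spot the expensive step, adjust keys or rewrite the query, and re-check the plan.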

A Day In The Life Could Include:

(job responsibilities)

    • Provide on-demand support; activities include all critical-, high-, medium-, and low-priority issues
    • Handle SSIS/Talend/DMS/Glue (and other AWS services) ETL job failures with code and data fixes
    • Make application configuration changes related to Talend ETL packages as well as SSIS
    • Create new ETL jobs in Talend DI

    Apply data engineering concepts (ETL, near-/real-time streaming, data structures)
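The ETL concept named above can be sketched in a few lines. This is a toy, stdlib-only example: the ERP feed, field names, and target table are all assumptions, and a production job here would be built in Talend or Glue rather than hand-rolled Python.

```python
import csv
import io
import sqlite3

# Hypothetical ERP invoice feed; contents are illustrative only.
RAW_FEED = """po_id,vendor,amount
1001,Acme,250.00
1002,Globex,
1003,Acme,99.50
"""

def extract(text):
    """Extract: parse the raw CSV feed into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with a missing amount and cast types."""
    return [
        (int(r["po_id"]), r["vendor"], float(r["amount"]))
        for r in rows
        if r["amount"]
    ]

def load(rows, conn):
    """Load: write cleaned rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS purchase_orders "
        "(po_id INTEGER, vendor TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO purchase_orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_FEED)), conn)
loaded = conn.execute("SELECT COUNT(*) FROM purchase_orders").fetchone()[0]
print(loaded)  # the row with a missing amount is dropped, so 2 rows load
```

The same extract/transform/load split is what a Talend DI job or Glue script expresses, just with managed connectors in place of the hand-written functions.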

    What You Will Need To Bring With You:

    (experience & education required)

    • Bachelor’s degree in Engineering (preferably Analytics, MIS, or Computer Science); Master’s degree preferred
    • Minimum 8+ years of ETL experience with AWS, Talend Big Data, data analytics, and data lake ETL using Glue, Spark scripting (good to have), AWS EMR (moderate), AWS DMS (moderate), AWS Redshift, AWS S3, AWS EC2, AWS CloudWatch, AWS RDS (moderate), AWS IAM, Microsoft SQL/PLSQL, Microsoft SSIS (moderate), GitLab, and DevOps (CI/CD)
    • Working/hands-on knowledge of Lambda (moderate)
    • Leverage Kinesis/Firehose for real-time extraction (good to have)
    • Experience with PySpark and Glue (good to have)
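The real-time extraction item above usually boils down to windowed aggregation over a stream of records. The sketch below simulates that in pure Python so it is self-contained; a real consumer would pull records from Kinesis (for example via boto3) or receive batches from Firehose, and the event shape here is an assumption.

```python
from collections import defaultdict

# Hypothetical stream of procurement events; shape and values are illustrative.
events = [
    {"ts": 0,  "vendor": "Acme",   "amount": 10.0},
    {"ts": 4,  "vendor": "Acme",   "amount": 5.0},
    {"ts": 7,  "vendor": "Globex", "amount": 2.0},
    {"ts": 11, "vendor": "Acme",   "amount": 1.0},
]

def window_totals(stream, window_seconds=5):
    """Sum amounts per vendor inside fixed (tumbling) time windows."""
    totals = defaultdict(float)
    for e in stream:
        # ts 0-4 fall in window 0, ts 5-9 in window 1, and so on.
        window = e["ts"] // window_seconds
        totals[(window, e["vendor"])] += e["amount"]
    return dict(totals)

result = window_totals(events)
print(result)
```

In a Kinesis consumer the loop body stays the same; only the source of `events` changes from an in-memory list to records fetched from a shard.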
    What Will Put You Ahead:

    (experience & education preferred)

    • AWS certification (Solutions Architect, DevOps, or Big Data)
    • Experience with AWS-native tools – Python, Spark, Lambda, Glue
    • Strong English language communication skills
    • Ability to pull together complex and disparate data sources, warehouse them, and architect a foundation to produce BI and analytical content while operating in a fluid, rapidly changing data environment
    • 2+ years of Procure-to-Pay domain knowledge

    Koch is proud to be an equal opportunity workplace.
