Koch

Data Engineer


Description

The Data Engineer will be part of a global team creating new solutions and improving existing ones for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. The Koch Technology Center (KTC) is being developed in India to extend Koch's IT operations and to act as a hub for innovation in the IT function. As KTC rapidly scales up its operations in India, its employees will have opportunities to carve out career paths within the organization. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out the Koch Technology Center (KTC) over the next several years. Working closely with global colleagues will provide employees with significant international exposure.

The Data Engineer will report to the KBS Security Supervisor at the KTC and will be responsible for creating valuable solutions for KBS and its customers. The role is responsible for end-to-end implementation of technology solutions that address customer requests and serve business needs and priorities. The Data Engineer is expected to continually learn about new applications, features, and tools that can be used in new solutions.

A Day In The Life Could Include:

(job responsibilities)

  • Work with the data architect and data engineering teams to design and build BI and analytics solutions
  • Implement batch and real-time data movement design patterns and define best practices in data engineering.
  • Design and develop optimal cloud data solutions (lakes, warehouses, marts, analytics) by collaborating with diverse IT teams including business analysts, project managers, architects, developers, and vendors
  • Work closely with data engineers and data analysts to procure, blend, and analyze data for quality and distribution, ensuring key elements are harmonized and modeled for effective analytics while operating in a fluid, rapidly changing data environment
  • Build data pipelines from a wide variety of sources
  • Demonstrate strong conceptual, analytical, and problem-solving skills and ability to articulate ideas and technical solutions effectively to external IT partners as well as internal data team members
  • Work with cross-functional teams, on-shore/off-shore, development/QA teams/Vendors in a matrixed environment for data delivery
  • Backtrack and troubleshoot failures, and provide fixes as needed
  • Update and maintain key data cloud solution deliverables and diagrams
  • Ensure conformance and compliance using Georgia-Pacific data architecture guidelines and enterprise data strategic vision
  • Provide consultant support in analysis, solution design, and service delivery

What You Will Need To Bring With You:

(experience & education required)

  • Around 6 years of overall IT experience
  • At least 4 years of hands-on experience in designing, implementing, and managing large-scale data and ETL solutions with AWS IaaS and PaaS compute, storage, and database services (S3, RDS, Lambda, IAM, Redshift, Glue, EMR, Kinesis, Athena).
  • Hands-on experience with the AWS monitoring stack, including CloudTrail, CloudWatch, and EventBridge.
  • Design, develop, test, and deploy pipelines with batch, real-time, and near-real-time capabilities across our segments.
  • 2+ years of experience in leading and delivering data analytics projects.
  • Strong knowledge of Data Engineering, Data Warehousing, and database concepts
  • Be able to analyze large complex data sets to resolve data quality issues

 What Will Put You Ahead:

(experience & education preferred)

  • AWS certifications like Solution Architect (SAA/SAP) or Data Analytics Specialty (DAS)
  • Hands-on experience with AWS data technologies and at least one full-lifecycle project building a data solution in AWS.
  • Understanding of common DevSecOps/DataOps and CI/CD processes, methodologies, and technologies such as GitLab, Terraform, etc.
  • Strong knowledge of PySpark, SQL, Redshift stored procedures, Kinesis, and AWS Glue.
  • IAM policies and best practices, with an emphasis on custom security
  • Independent problem solver.

Koch is proud to be an equal opportunity workplace.
