
Data Quality Engineer

Description

The Data Quality Engineer will be part of the Data & Analytics COE (Centre of Excellence), supporting multiple businesses and ongoing Data Analytics, Big Data, Data Science & AI, Data Governance and Data Management projects for Koch Industries.

The Data Quality Engineer will work closely with key stakeholders within the program to identify the key systems and their data entities across the end-to-end data process, and will create and maintain functional Information Asset Registers and identify the Data Owners, Stewards and Custodians. You will also be responsible for triaging data quality issues and providing support and advice on how they are to be resolved. Ultimately, the role will drive the business's understanding of data flows and their dependencies, working closely with global counterparts on enterprise-wide delivery.

A Day in The Life Could Include:

(job responsibilities) 

  • Hold business discussions with stakeholders, data owners and stewards to understand the business use cases, the criticality of systems, and the connectivity of the source systems
  • Analyse the source systems and their data to discover business-critical use cases and identify the CDEs (Critical Data Elements)
  • Identify the business domains, build the Data Dictionary (structures, schemas, etc.) for each source system, and map it to the respective domains and systems
  • Identify and document the upstream and downstream systems for each data attribute, understand the format and type of the data, and capture the System of Record for each attribute
  • Document the data lineage for each CDE, following the flow of data from upstream to downstream systems (SORs to the reporting layer)
  • Participate in daily standups and provide technical thought leadership across all aspects of technical delivery
  • Collaborate with US leaders and teams
  • Highlight any delays or issues promptly
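As a minimal sketch of the kind of data quality profiling this role involves, the following checks a Critical Data Element for completeness and validity. The column names, rules and sample data are illustrative assumptions, not drawn from the posting, and use only the Python standard library:

```python
import csv
import io

# Hypothetical rules for two CDEs; names, columns and allowed
# values below are illustrative, not from the job description.
RULES = {
    "customer_id": {"required": True},
    "country":     {"required": True, "allowed": {"US", "IN", "DE"}},
}

def profile(rows, rules):
    """Count missing and invalid values per attribute across all rows."""
    stats = {col: {"missing": 0, "invalid": 0} for col in rules}
    total = 0
    for row in rows:
        total += 1
        for col, rule in rules.items():
            value = (row.get(col) or "").strip()
            if not value:
                if rule.get("required"):
                    stats[col]["missing"] += 1
            elif "allowed" in rule and value not in rule["allowed"]:
                stats[col]["invalid"] += 1
    return total, stats

# Tiny inline sample standing in for a source-system extract.
sample = io.StringIO(
    "customer_id,country\n"
    "C001,US\n"
    ",IN\n"
    "C003,XX\n"
)
total, stats = profile(csv.DictReader(sample), RULES)
for col, s in stats.items():
    print(f"{col}: {s['missing']} missing, {s['invalid']} invalid of {total}")
```

In practice such rules would be expressed in a dedicated tool (e.g. IDQ or a Spark job) rather than hand-rolled, but the completeness/validity dimensions profiled are the same.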

What You Will Need to Bring with You: 

(experience & education required) 

  1. Bachelor's/Master's degree in Computer Science/Information Technology with 5+ years of data analytics experience
  2. Coding experience in Python, Spark, SQL and Bash scripting
  3. Experience implementing Data Governance, Data Analysis, Data Lineage, Data Catalogue, Data Transformation, Data Quality and Data Cleansing frameworks
  4. ETL: experience identifying and mapping data flows, data profiling, data engineering and data migration using Talend, Matillion, Informatica, etc.
  5. Experience implementing data lakes and data warehouses using Redshift, Snowflake and other open-source/cloud services
  6. Experience building a data virtualization layer using Denodo
  7. Experience assessing data quality and performing data analysis using data quality tools such as IDQ, SAS DQ, etc.
  8. Experience in advanced analytics, message queuing, stream processing and highly scalable big data stores (Hadoop, Spark, Kafka), as well as ETL/data pipelines and workflow management
  9. Experience with AWS cloud services: EC2, EMR, RDS, Redshift, etc.
  10. Experience in data requirements capture and documentation
  11. Experience using Data Governance and data profiling tools such as EBX
  12. Understanding of Agile and CI/CD delivery methodology and DevOps for Big Data solutions
  13. Strong customer focus, collaboration and problem-solving skills
  14. Exceptional communication skills

What Will Put You Ahead:

(experience & education preferred) 

  • Experience as a Business Analyst or Data Analyst, with a strong understanding of data exploration and analysis
  • Good understanding of one or more business domains such as supply chain, manufacturing and logistics
  • Good understanding of cloud-based implementations

Koch is proud to be an equal opportunity workplace.
