
Lead Data Quality Engineer

This job posting is no longer active.


Lead Data Engineer – Quality & Operations

Georgia-Pacific (GP) is among the world's leading manufacturers of bath tissue, paper towels, napkins, tableware, paper-based packaging, office papers, cellulose, specialty fibers, nonwoven fabrics, building products and related chemicals. Our building products business makes DensGlass® gypsum board often seen in commercial construction, DryPly® plywood and RESI-MIX® wood adhesives, among others. Our containerboard and packaging business offers everything from high-end graphic packaging to bulk bins, as well as Golden Isles fluff pulp. You may also recognize consumer brands like Angel Soft®, Brawny®, and Dixie® on retail shelves, and enMotion® towels, Compact® bath tissue and SmartStock® cutlery dispensers when you are away from home. Our GP Harmon business is one of the world's largest recyclers of paper, metal and plastics. As a Koch Company, we create long-term value by using resources efficiently to provide innovative products and solutions that meet the needs of customers and society, while operating in a manner that is environmentally and socially responsible, and economically sound. Headquartered in Atlanta, GA, we employ approximately 35,000 people. For more information, visit www.gp.com.

We are seeking a highly motivated, forward-thinking Lead Data Engineer to support the enterprise Collaboration and Support Center (CSC) Manufacturing Excellence group and the GP Decision Analytics Group. The role is focused on developing and automating data analytics solutions used across 150+ facilities within multiple divisions. The GP CSC and the Decision Analytics Group function as a Center of Excellence for all things related to manufacturing and commercial transformation across Georgia-Pacific. These groups create sustainable value and competitive advantage by leveraging analytics, information technology, and actionable insights across the enterprise while focusing on the future possibilities of analytics. This role will have the opportunity to leverage the latest Big Data, Cloud and Analytics technologies and partner with our operations, engineering and data science community to expand our modeling and decision-making capabilities focused on asset health, asset optimization and process optimization.

What You Will Do In Your Role

  • Serve as the hands-on data engineering lead for Quality and Operations on the Enterprise Data and Analytics team, managing the Data Lake and helping the business develop, deploy and manage predictive and prescriptive models that create business value through optimization of manufacturing facilities.
  • Develop critical data pipelines and data quality jobs in the AWS environment, working with Lambda, Glue, Python, SQL and NoSQL databases.
  • Enhance and optimize existing data quality processes, including automated testing and alerting on critical data assets.
  • Assist Data Science teams with preparing, cleansing, and delivering analytical datasets for machine learning models.
  • Monitor, deploy and troubleshoot SAS models in production.
  • Participate in various engineering projects building data product tools utilizing batch and streaming data.
  • Participate in the on-call support rotation, troubleshooting and improving existing operations processes including logging, alerting and monitoring.
  • Manage your own learning and contribute to the technical skill building of the team.
  • Develop deep technical expertise in data movement patterns, practices and tools.
  • Demonstrate technical, team and solution leadership through strong communication skills to recommend actionable, data-driven solutions.
  • Collaborate with team members, business stakeholders and data SMEs to elicit requirements, develop a technical design and implement a solution.

The Experience You Will Bring


  • Bachelor’s degree
  • 5+ years of Data Engineering experience 
  • 2+ years working with AWS serverless technologies like AWS Lambda, AWS Glue, Kinesis, DynamoDB and Redshift or similar services
  • Data Concepts knowledge (ETL, near-/real-time streaming, data structures, data modeling, metadata, and workflow management)
  • 5+ years of experience with databases, SQL, and data warehousing concepts
  • Experience drawing technical diagrams and presenting to the team

What Will Put You Ahead

  • Bachelor’s degree in Engineering (preferably Analytics, MIS or Computer Science)
  • 2+ years of experience with code management tools (Git/GitHub, Bitbucket, SVN, TFS)
  • 2+ years of Python development experience
  • Experience in Big Data projects (Hadoop/EMR, Hive, Kinesis, Spark)
  • Master’s degree
  • AWS certifications
  • Manufacturing IT systems knowledge (e.g., OSIsoft PI)
  • Markup Languages (JSON, XML, YAML)
  • Front-end UI and scripting experience including React, .NET, Tableau, Power BI and/or QlikSense
  • SAS tools and scripting experience
  • Experience with DevOps processes and tools like Terraform
  • Ability to integrate complex and disparate data sources, warehouse them, and architect a foundation for producing BI and analytical content while operating in a fluid, rapidly changing data environment

Salary and Benefits Commensurate with Experience.
Equal Opportunity Employer.
Except where prohibited by state law, all offers of employment are conditioned upon successfully passing a drug test.

This employer uses E-Verify. Please visit the following website for additional information: www.kochcareers.com/doc/Everify.pdf
