Koch

Senior Data Engineer

Description

Role Summary:

Georgia-Pacific is seeking a hands-on Senior Data Engineer for its data engineering program, which is delivering a ground-up data lake build. This role is responsible for ensuring that data engineering designs and implementations are aligned to best practices and applied consistently to deliver business value. The Senior Data Engineer must bring proven expertise and development skill in data engineering, including modern cloud data lake ingestion, migration, management, and consumption patterns that enable analytics solution development and delivery across the enterprise at scale.

Responsibilities:

  • Apply senior-level engineering and software development skills to deliver full data lifecycle frameworks and patterns, including data acquisition, migration, storage, transformation, preparation, and consumption, supporting a ground-up Enterprise Data Lake build for analytics solutions at different maturity levels across data science and operational teams
  • Perform hands-on data engineering design and implementation, including data ingestion, data models, data structures, data storage, high-throughput data processing, data pipelines, and data monitoring at scale
  • Follow data engineering best practices with considerations for high data availability, computational efficiency, cost, and quality
  • Build and maintain environments, processes, functionalities, and tools to improve all stages of data lake implementation and analytics solution development, e.g., proof of concepts, prototypes, and production
  • Implement capability for repeatable, configuration driven automation throughout the data lifecycle
  • Maintain awareness of relevant technology trends and product updates (e.g., AWS services)
  • Apply senior-level skills in general IT concepts, strategies, methodologies, and modern application and data engineering architectures and approaches, including cloud, streaming, event-based, IoT data, and edge server capabilities
  • Write complex programs, implement architectures, and enable automation in cloud environments
  • Implement configuration driven, reusable, automation frameworks consistent with designed approaches
  • Apply DevOps best practices, CI/CD processes and tools, and testing frameworks
  • Optimize data solutions for multi-petabyte data systems using batch, streaming, and event-driven approaches
  • Work within agile methodologies and cross-functional teams (Product Owners, Scrum Masters, Developers, Test Engineers), including backlog grooming and tooling
  • Ensure alignment with data governance best practices
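The "repeatable, configuration-driven automation" responsibility above can be illustrated with a minimal sketch. All names here (SourceConfig, run_pipeline, the sample config) are hypothetical illustrations, not Georgia-Pacific's actual framework: the idea is that new data sources are onboarded by adding configuration rather than new pipeline code.

```python
# Hypothetical sketch of a configuration-driven ingestion step.
import json
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SourceConfig:
    name: str           # logical source name, e.g. "orders"
    fmt: str            # payload format, e.g. "csv" or "json"
    target_prefix: str  # e.g. a data lake zone/key prefix

# Registry of format-specific loaders: a new format needs one entry here,
# and a new source needs only a SourceConfig -- the pipeline stays unchanged.
LOADERS: Dict[str, Callable[[str], List[dict]]] = {
    "json": lambda raw: json.loads(raw),
    "csv": lambda raw: [
        dict(zip(raw.splitlines()[0].split(","), line.split(",")))
        for line in raw.splitlines()[1:]
    ],
}

def run_pipeline(cfg: SourceConfig, raw: str) -> List[dict]:
    """Parse the raw payload per its config and tag each record with its lake target."""
    records = LOADERS[cfg.fmt](raw)
    for rec in records:
        rec["_target"] = f"{cfg.target_prefix}/{cfg.name}"
    return records

if __name__ == "__main__":
    cfg = SourceConfig(name="orders", fmt="csv", target_prefix="raw-zone")
    print(run_pipeline(cfg, "id,amount\n1,9.99\n2,5.00"))
```

In a real data lake build the loader registry would typically dispatch to services such as AWS Glue or Lambda rather than in-process parsers, but the configuration-over-code structure is the same.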

Qualifications

  • 5+ years of professional experience in data engineering-related roles (full software development lifecycle, DBA, data architect, data engineer)
  • 3+ years specific experience in modern data engineering and cloud data practices including various data lake ingestion techniques, ETL/ELT, consumption, and operations
  • Working experience in data analytics (data wrangling, mining, integration, analysis, visualization, data modeling, and reporting) using Business Intelligence (BI) tools
  • Expertise in AWS services including S3, EC2, SQS, EMR, Kinesis, Lambda, Step Functions, Glue, Redshift, Athena, DynamoDB, API Gateway, CloudWatch, and IAM, plus infrastructure-as-code tooling such as Terraform
  • Programming background and ability to utilize a variety of software, languages, and tools, e.g., Python, Scala, Java, Spark, Hive, and SQL
  • Linux proficiency including Korn shell, scripting, and regular expressions
  • Degree in Computer Science or equivalent work experience
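The scripting and regular-expression qualification above can be sketched briefly. The log format and field names here are hypothetical examples, not a Georgia-Pacific system:

```python
# Hypothetical sketch: extracting named fields from an access-log line with a regex.
import re

LOG_PATTERN = re.compile(
    r'^\[(?P<ts>[^\]]+)\]\s+"(?P<method>\w+)\s+(?P<path>\S+)"\s+(?P<status>\d{3})$'
)

def parse_log_line(line: str) -> dict:
    """Return named fields from one log line, or an empty dict when it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else {}

if __name__ == "__main__":
    sample = '[2024-01-15T10:02:33Z] "GET /data/orders.csv" 200'
    print(parse_log_line(sample))
```

The same pattern translates directly to shell tooling (`grep -E`, `awk`) for one-off log triage on Linux hosts.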

Koch is proud to be an equal opportunity workplace
