Thanks for your interest in the Lead Data Engineer position. Unfortunately, this position has been closed.

What You Bring

  • Bachelor's or master's degree in Computer Science or a related technical field, or an equivalent combination of education and experience.
  • 7+ years of relevant experience in large-scale software development.
  • 5+ years of data engineering experience.
  • Experience designing, estimating, and implementing complex software projects involving RDBMS systems, SQL, and SQL analytical functions.
  • 5+ years of proven experience in ETL/ELT/in-database processing.
  • Solid experience creating stored procedures to enrich, validate, and load data.
  • 5+ years of experience with stored procedures, PL/SQL, SQL queries, data analysis, and validation.
  • 5+ years of relevant experience with Teradata: loading data into Teradata production and development warehouses using BTEQ, FastLoad, FastExport, MultiLoad, and ETL tools like Informatica.
  • Experience with source code control tools like GitHub or Bitbucket.
  • Extensive experience with performance and scalability tuning, including an understanding of Teradata capabilities and how to optimize designs for performance within the Teradata architecture.
  • Familiarity with the principles of Domain Driven Design.
  • Proven ability to quickly pick up new languages, technologies, and frameworks.
  • Experience participating in key business, architectural and technical decisions, providing guidance and mentorship to other engineers.
  • Experience in Agile/Scrum application development.
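To make the "SQL analytical functions" and "stored procedures to enrich, validate and load data" bullets concrete, here is a minimal, runnable sketch of that validate-enrich-load pattern. It uses Python's built-in sqlite3 as a stand-in for Teradata (so there is no real stored procedure or BTEQ involved), and all table and column names are invented for illustration:

```python
import sqlite3

# sqlite3 stands in for an RDBMS so the pattern runs anywhere;
# on Teradata this logic would live in a stored procedure or BTEQ script.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_sales (region TEXT, sale_day TEXT, amount REAL);
INSERT INTO raw_sales VALUES
  ('EMEA', '2023-01-01', 100.0),
  ('EMEA', '2023-01-02', 250.0),
  ('APAC', '2023-01-01', 75.0),
  ('APAC', '2023-01-02', NULL);   -- invalid row, filtered out below
CREATE TABLE clean_sales (
  region TEXT, sale_day TEXT, amount REAL, region_rank INTEGER);
""")

# Validate (drop NULL amounts) and enrich (rank sales within each region
# via an analytical window function) in one in-database INSERT..SELECT.
conn.execute("""
INSERT INTO clean_sales
SELECT region, sale_day, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS region_rank
FROM raw_sales
WHERE amount IS NOT NULL
""")

top = conn.execute(
    "SELECT region, amount FROM clean_sales WHERE region_rank = 1 ORDER BY region"
).fetchall()
print(top)  # [('APAC', 75.0), ('EMEA', 250.0)]
```

The same `RANK() OVER (PARTITION BY ...)` construct is standard SQL and is supported by Teradata's analytical functions; only the loading mechanics differ.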

The following skills and experience are also relevant to our overall environment and are nice to have:

  • Strong programming experience, Python or Scala preferred.
  • Experience working in a public cloud environment, particularly AWS.
  • Experience with data warehouse tools like Snowflake.
  • Experience with messaging/streaming/complex event processing tooling and frameworks such as Kinesis, Kafka, Spark Streaming, Flink, NiFi, etc.
  • Experience working with NoSQL data stores such as HBase, DynamoDB, etc.
  • Experience building RESTful APIs to enable data consumption.
  • Experience with infrastructure-as-code tools such as Terraform or CloudFormation and automation tools such as Jenkins or CircleCI.
  • Knowledge of Nike Technology landscape including Dimensional data at Nike.
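As an illustration of the "RESTful APIs to enable data consumption" bullet, here is a self-contained sketch using only the Python standard library: a toy read-only endpoint that serves a data slice as JSON, plus a client request against it. The `/sales` path and the payload are invented for this example; a production service would sit behind a proper framework and authentication.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Invented sample data slice exposed for consumption.
SALES = [{"region": "EMEA", "amount": 250.0}, {"region": "APAC", "amount": 75.0}]

class SalesHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/sales":
            body = json.dumps(SALES).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), SalesHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/sales") as resp:
    data = json.load(resp)
server.shutdown()
print(data[0]["region"])  # EMEA
```

The design point the bullet is after: consumers get data through a stable HTTP/JSON contract rather than direct warehouse access, so the storage layer can change without breaking clients.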