We are looking for a seasoned Data Engineer to join a project for a client in the fiber optic industry. The project involves migrating an existing Data Lake, currently accessed via SAS, to Google Cloud. This is a full-time engagement with a duration of 3+ months.

Requirements:

    • SQL, dbt, git, Airflow (Python), ODE Generic Export Framework
    • SAS, Google Cloud
    • KNIME, VS Code, DIL-Pipelines, InnovatorX (MDD software, interfaces, documentation), Iceberg/BigLake tables
    • English – B2 or better

Responsibilities:

    • Lead and execute the migration of the Data Lake from SAS to Google Cloud.
    • Ensure data integrity, performance optimization, and system stability during the migration.
    • Collaborate with stakeholders to align technical solutions with business needs.
    • Implement best practices for data engineering and cloud-based architecture.
    • Provide documentation and knowledge transfer to internal teams.

We offer:

    • Remote, full-time work
    • Competitive compensation
    • World-class team
    • Interesting, challenging tasks