Data Engineer

  • Full-Time
  • Remote
  • IntegriChain
Job Description

Mission:

Deliver modern data pipelines and create custom data extracts that meet the needs of both internal and external customers.


Duties:

  • Develop, support, and refine new data pipelines, data models, business logic, data schemas as code, and analytics to product specifications.
  • Prototype and optimize data type checks to ensure data uniformity prior to load.
  • Develop and refine both streaming and batch data pipeline frameworks.
  • Maintain, improve, and develop expertise in existing production data, models, and algorithms.
  • Learn and apply business domain knowledge and how it maps to the underlying data sources.
  • Define, document, and maintain a data dictionary covering data definitions, data sources, and the business meaning and usage of information.
  • Identify and validate opportunities to reuse existing data and algorithms.
  • Work with stakeholders to gather requirements for merging, de-duplicating, and standardizing data.
  • Collaborate on the design and implementation of data standardization procedures.
  • Share team responsibilities, such as contributing to the development of data warehouses and productizing algorithms created by Data Science team members.


Qualifications and Competencies:

  • Bachelor's degree in a technical field or equivalent work experience.
  • 2-3+ years of experience building data pipelines and using ETL tools; Python programming experience preferred.
  • 3+ years of experience with at least one relational database platform (SQL Server, Oracle, PostgreSQL, MySQL) and its query languages (SQL, PL/SQL).
  • 1+ years of experience developing with modern, industry-standard big data frameworks on AWS or other cloud services.
  • Experience with common GitHub developer practices and paradigms.
  • Experience working with agile methodologies and cross-functional teams.
  • Knowledge of Redshift or another columnar database is preferred.
  • Knowledge of AWS services and Airflow is a plus.
  • Experience building AWS data pipelines using Python and an S3 data lake is a plus.
  • Knowledge of the specialty pharmaceutical and retail pharmacy domains is a plus.