<p>Data Warehouse Engineer (Draper, UT)</p>
<p>Design large-scale data models in Snowflake. Use Azure Synapse Analytics to extract and transform data, ensuring seamless integration and real-time updates. Build and maintain secure, compliant data processing pipelines, tables, views, procedures, and datasets. Utilize APIs to extract information from second- and third-party data sources in Parquet, JSON, and XML formats, and transform that information to improve operational efficiency and accuracy. Integrate, transform, and consolidate data from various structured and unstructured systems into Kimball dimensional data models to support analytics, data science, and machine learning. Engineer data profiling and quality checks using Python and SQL. Review ETL/ELT code and pipelines, and assess tools and procedures for enhanced data validation, code quality, and engineering efficiency. Develop and deploy microservices. Configure and manage CI/CD pipelines to automate deployment of data projects, ensuring rapid and reliable releases within an Agile framework. Monitor, troubleshoot, and resolve issues in production environments.</p>
<p>Requires: Master's degree in Business Analytics, 1 year of experience in the job offered, and proficiency with Azure Synapse or Matillion, Snowflake or BigQuery, Azure DevOps or GitLab, SQL Server, APIs, Parquet, JSON, machine learning, Python, ETL, CI/CD, and Agile.</p>
<p>#LI-DNI</p>