It's fun to work in a company where people truly BELIEVE in what they're doing!
We're committed to bringing passion and customer focus to the business.
The Data Engineer will help design, build, and maintain modern data pipelines and reporting solutions using Microsoft and cloud-based technologies. This role will provide hands-on experience with ETL/ELT development, BI tools, and cross-functional collaboration across finance, operations, and IT teams. The ideal candidate has a strong foundation in SQL and Python and is excited to grow their skills in Azure, Microsoft Fabric, and other enterprise data platforms.
Essential Functions
- Assist in developing SQL queries, views, and stored procedures for data transformation, reporting, and analytics.
- Support the creation and maintenance of ETL/ELT pipelines using Azure Data Factory, Synapse pipelines, or Microsoft Fabric.
- Learn and contribute to data modeling and transformations in Microsoft Fabric (Lakehouse, Dataflows, Power Query, and Notebooks).
- Help maintain and monitor data pipelines running in Azure (Data Factory, Logic Apps, Storage Accounts, etc.).
- Build and enhance dashboards and reports in Power BI to visualize KPIs, financial results, and operational metrics.
- Work with business stakeholders to gather reporting requirements and translate them into technical specifications.
- Contribute to automation projects using Python, SQL, Power Automate, and Fabric pipelines to reduce manual effort.
- Participate in data validation, cleansing, and quality assurance to ensure accurate reporting.
- Gain exposure to cloud data warehouses such as Snowflake and Azure Synapse.
- Learn and apply version control (Git/GitHub), DevOps practices, and CI/CD pipelines in a Microsoft-centric environment.
Competencies
- Knowledge of cloud platforms (Azure, AWS, or GCP), with a focus on Azure services.
- Familiarity with Git/GitHub, Docker, or the Linux command line.
- Understanding of financial or retail data concepts (a plus, not required).
- Solid SQL skills, with the ability to query and transform data from multiple systems.
- Proficiency in Python for data manipulation (pandas, PySpark, etc.).
- Familiarity with Microsoft technologies, especially Power BI, Power Automate, Excel, and SQL Server.
- Interest in learning Microsoft Fabric, including Lakehouse, OneLake, Dataflows, and integrated pipelines.
- Exposure to Azure Data Services (Data Factory, Synapse, Logic Apps, Fabric).
- Strong analytical, problem-solving, and communication skills.
Education/Experience
- Bachelor’s degree in Data Science, Computer Science, Information Systems, Finance/Accounting, or a related field (or equivalent experience).
- 1–3 years of professional experience in data analysis, BI development, or a related technical role (internships and academic projects considered).
- Hands-on experience with Microsoft Fabric or Azure Synapse Analytics.
Must be authorized to work in the United States.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!