Schwarz Partners has an exciting opportunity available for a Data Engineer in Carmel, IN! Data Engineers build data pipelines that transform raw, unstructured data into formats that can be used for analysis. They are responsible for creating and maintaining the analytics infrastructure that enables almost every other data function, including architectures such as databases, servers, and large-scale processing systems. A Data Engineer uses a range of technologies to collect and map an organization’s data landscape, helping decision-makers find cost savings and optimization opportunities. In addition, Data Engineers use this data to surface trends in collected analytics information, encouraging transparency with stakeholders.
Schwarz Partners is one of the largest independent manufacturers of corrugated sheets and packaging materials in the U.S. Through our family of companies, we continuously build and strengthen our capabilities. You’ll find our products wherever goods are packaged, shipped, and sold—from innovative retail packaging to colorful in-store displays at pharmacies and grocers. You also may have spotted our trucks on the highway. Schwarz Partners is built around the idea that independence and innovation go hand in hand. Our structure allows us to adapt to change quickly, get new ideas off the ground, and thrive in the marketplace. Our people are empowered to tap into their talents, build their skills, and grow with us.
ESSENTIAL JOB FUNCTIONS FOR THIS POSITION:
- Build and maintain the infrastructure required for optimal extraction, transformation, and loading (ETL) of data from a wide variety of data sources using SQL and cloud technologies (see the sketch after this list).
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal data-related process improvements.
- Work with stakeholders, including data, design, product, and executive teams, and assist them with data-related technical issues.
- Configure and design applications to better serve enterprise needs.
- Prepare data for prescriptive and predictive modeling.
- Communicate effectively with application vendors.
- Assist in the creation and quality assurance review of design documents and test results to ensure all project requirements are satisfied.
- Advise on and implement improvements to data warehousing and data workflow architecture.
- Think outside the box to identify improvement and efficiency opportunities that streamline business and operational workflows.
- Document high-level business workflows and transform them into low-level technical requirements.
- Analyze complex information sets and communicate the findings in a clear, well-thought-out, and well-organized manner.
- Communicate at varying levels of detail (30,000 ft. view, 10,000 ft. view, granular level) and produce corresponding documentation at varying levels of abstraction.
- Be an advocate for best practices and continued learning.
- Communicate with business stakeholders on the status of projects and issues.
- Prioritize and multi-task across duties at any given time.
- Demonstrate solid communication and interpersonal skills.
- Comply with company policies and procedures and all applicable laws and regulations.
- Perform general DBA work as needed.
- Maintain and troubleshoot existing ETL processes.
- Create and maintain BI reports.
- Additional duties as assigned.
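To give a concrete flavor of the ETL work described above, here is a minimal PySpark sketch of an extract-transform-load step into a Delta table. All table and column names (raw.orders, curated.orders, order_id, and so on) are hypothetical placeholders, not references to any actual Schwarz Partners system.

```python
# Minimal ETL sketch: read a raw table, clean it, and load it as a curated Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read the raw source table (hypothetical name).
raw = spark.read.table("raw.orders")

# Transform: standardize types and drop incomplete rows.
clean = (
    raw.withColumn("order_date", F.to_date("order_date"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())
)

# Load: write a curated Delta table, partitioned for downstream query pruning.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("curated.orders"))
```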
REQUIRED EDUCATION / EXPERIENCE:
- Bachelor’s degree in Computer Science or 4+ years’ experience in a related field.
PREFERRED EDUCATION / EXPERIENCE:
- Experience developing data workflows.
- Ability to perform prescriptive and predictive modeling.
REQUIRED SKILLS:
- Demonstrated experience with SQL in a large database environment.
- Direct experience utilizing SQL to develop queries or profile data.
- Experience in quantitative and qualitative analysis of data.
- Experienced-level skills in Systems Analysis.
- Experienced-level skills in Systems Engineering.
- Ability to function as a self-starter.
REQUIRED MICROSOFT FABRIC SKILLS:
- Strong grasp of OneLake concepts: lakehouses vs. warehouses, shortcuts, mirroring, item/workspace structure.
- Hands-on experience with Delta Lake (Parquet, Delta tables, partitioning, V-Order, Z-Order, VACUUM retention); see the maintenance sketch after this list.
- Understanding of Direct Lake, Import, and DirectQuery trade-offs and when to use each.
- Experience designing star schemas and modern medallion architectures (bronze/silver/gold).
- Experience with Spark/PySpark notebooks (jobs, clusters, caching, optimization, broadcast joins); see the promotion sketch after this list.
- Data Factory in Fabric (Pipelines): activities, triggers, parameterization, error handling/retries.
- Dataflows Gen2 (Power Query/M) for ELT, incremental refresh, and reusable transformations.
- Building/optimizing semantic models; DAX (measures, calculation groups, aggregations).
- Ability to multi-task, think and react quickly, apply attention to detail and follow-up, and work effectively and collegially with management staff and end users.
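As referenced in the medallion-architecture and PySpark items above, here is a minimal bronze-to-silver promotion sketch using a broadcast join; all table and column names (bronze.sales_raw, silver.dim_product, sale_id, and so on) are hypothetical.

```python
# Bronze -> silver promotion sketch: deduplicate raw data, standardize types,
# and enrich with a small dimension table via a broadcast join.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("medallion_silver").getOrCreate()

bronze = spark.read.table("bronze.sales_raw")    # raw landed data
dim = spark.read.table("silver.dim_product")     # small dimension table

silver = (
    bronze.dropDuplicates(["sale_id"])
          .withColumn("sale_ts", F.to_timestamp("sale_ts"))
          .withColumn("sale_date", F.to_date("sale_ts"))
          # broadcast() ships the small dimension to every executor,
          # avoiding a shuffle of the large fact table.
          .join(broadcast(dim), "product_id", "left")
)

(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("sale_date")
       .saveAsTable("silver.sales"))
```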
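And as referenced in the Delta Lake item, a minimal maintenance sketch for the table created above might look like the following; the table name and Z-Order column are again hypothetical. In Fabric lakehouses, V-Order is applied at write time by default, so it needs no explicit call here.

```python
# Delta maintenance sketch: compact small files with Z-Order clustering, then
# vacuum data files that are no longer referenced by the table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_maintenance").getOrCreate()

# Compact small files and co-locate rows on a frequently filtered column.
spark.sql("OPTIMIZE silver.sales ZORDER BY (product_id)")

# Keep 7 days (168 hours) of history, the Delta default; shorter retention
# windows require an explicit safety-check override.
spark.sql("VACUUM silver.sales RETAIN 168 HOURS")
```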
Our organization is an Equal Opportunity Employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.