BT-156 – Data Engineer Analyst
Skill Level: Entry/Junior
Location: Chantilly/Herndon
- **MUST HAVE AN ACTIVE TS OR TS/SCI CLEARANCE TO APPLY. Those without an active security clearance will not be considered.**
Role Description
This is an entry-level opportunity to begin your career as a Data Engineer. As a Junior Analyst, you will learn the fundamentals of building and maintaining large-scale data pipelines on a government data platform. You will work alongside senior engineers, assisting with development, monitoring systems, and helping to ensure the quality and reliability of our data. This role is ideal for a motivated individual with a foundational understanding of data concepts and a strong desire to learn and grow.
Responsibilities
- Assist senior engineers in the development and maintenance of data pipelines.
- Support the operations of existing pipelines by monitoring their performance and responding to alerts under supervision.
- Execute pre-defined scripts and processes for data ingestion and transformation.
- Help run data quality checks and document the results.
- Participate in team meetings and Agile/Scrum ceremonies, learning the development lifecycle.
- Contribute to the creation and upkeep of technical documentation for data processes.
Required Qualifications
- 0-2 years of professional or internship experience in a technical or data-related field.
- A Bachelor's degree in Computer Science, Information Systems, Engineering, or a related discipline.
- Basic knowledge of SQL and at least one scripting language (Python preferred).
- A foundational understanding of data concepts (e.g., databases, ETL).
- Strong problem-solving skills and an eagerness to learn new technologies.
- Experience with cloud data platforms (e.g., Databricks, Snowflake, AWS S3, Azure Data Lake).
- A foundational understanding of metadata management and data governance concepts.
Preferred Qualifications
- Hands-on experience developing integrations for an enterprise data catalog platform (e.g., Collibra, Alation).
- Experience with workflow orchestration tools (e.g., Airflow, Prefect).
- Familiarity with Infrastructure-as-Code (e.g., Terraform).
- Exposure to streaming data technologies (e.g., Kafka).