Job Title: Big Data Platform Engineer
Duration: 12 Months
Location: Charlotte, NC - Hybrid Role
Job Summary:
We are seeking a highly skilled and motivated Tools Administration & Engineering Specialist to join our Data Private Cloud team. This role involves working closely with other technical staff to manage, support, and engineer platform tools across various big data and cloud environments, including Cloudera Data Platform (CDP), Hortonworks, and OpenShift Container Platform (OCP).
Key Responsibilities:
- Administer and support tools on the Data Private Cloud, including CDP, Hortonworks (HWX), and MapR.
- Install, configure, and maintain data analytics and virtualization tools such as Dremio, JupyterHub, and AtScale across multiple clusters.
- Develop proof-of-concept solutions leveraging CDP and OCP technologies.
- Deploy tools and troubleshoot issues, perform root cause analysis, and remediate vulnerabilities.
- Act as a technical subject matter expert, supporting programming staff during development, testing, and implementation phases.
- Develop automation scripts for configuration and maintenance of data virtualization tools.
- Lead complex platform design, coding, and testing efforts.
- Drive advanced modeling, simulation, and analysis initiatives.
- Maintain comprehensive documentation of Hadoop cluster configurations, processes, and procedures.
- Generate reports on cluster usage, performance metrics, and capacity utilization.
- Work closely with data engineers, data scientists, and other stakeholders to understand their requirements and provide necessary support.
- Collaborate with IT infrastructure teams to integrate Dremio and Hadoop clusters with existing systems and services.
Required Skills & Experience:
- Strong experience with big data platforms: MapR, Hortonworks, Cloudera Data Platform.
- Hands-on expertise with data analytics and virtualization tools: Dremio, JupyterHub, AtScale.
- Proficiency in deploying and managing tools in cloud and containerized environments (CDP, OCP).
- Solid understanding of platform engineering, automation scripting, and DevOps practices.
- Proven ability to troubleshoot complex issues and perform root cause analysis.
- Experience in leading technical efforts and mentoring team members.
Preferred Qualifications:
- Certifications in Cloudera, OpenShift, or related technologies.
- Experience with enterprise-level data lake architectures and governance.