Specific skills required:
Python Programming:
Strong proficiency in Python for data processing, analysis, and automation.
Experience with libraries such as Pandas, NumPy, and PySpark.
Ability to write efficient, maintainable, and scalable code for data pipelines and analytics (see the sketch below).
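As a rough illustration of the kind of Pandas/NumPy work this role involves, the sketch below cleans and aggregates a hypothetical trade file; the file path and column names are assumptions for illustration only, not part of the role's actual systems.

```python
import numpy as np
import pandas as pd

# Hypothetical trade file; the path and column names are illustrative only.
trades = pd.read_csv("trades.csv", parse_dates=["trade_time"])

# Vectorised cleaning and enrichment rather than row-by-row loops.
trades["notional"] = trades["price"] * trades["quantity"]
trades["side"] = trades["side"].str.upper()

# Per-instrument aggregation using NumPy-backed groupby operations.
summary = (
    trades.groupby("instrument")
    .agg(total_notional=("notional", "sum"),
         avg_price=("price", "mean"),
         trade_count=("price", "count"))
    .reset_index()
)
print(summary.head())
```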
Data Engineering & Development:
Experience designing and building robust ETL/ELT pipelines (see the sketch after this list).
Familiarity with data modeling, data warehousing concepts, and schema design.
Proficiency in working with structured and unstructured data.
Experience with version control systems (e.g., Git) and CI/CD pipelines.
Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and containerization (e.g., Docker, Kubernetes) is a plus.
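A minimal sketch of the extract/transform/load pattern referenced above is given below; the CSV path, table name, and the use of SQLite as a stand-in for a warehouse are all hypothetical assumptions, not a prescribed implementation.

```python
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    """Extract: read raw positions from a CSV file (path is hypothetical)."""
    return pd.read_csv(path)

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Transform: drop incomplete rows and standardise column names."""
    clean = raw.dropna(subset=["instrument", "quantity"]).copy()
    clean.columns = [c.strip().lower() for c in clean.columns]
    return clean

def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    """Load: write the curated table to the target store (SQLite stands in here)."""
    df.to_sql("positions", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:
        load(transform(extract("raw_positions.csv")), conn)
```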
Real-Time Data Feeds:
Experience integrating data platforms with real-time messaging systems (e.g., Solace, ION, Kafka); see the consumer sketch below.
Deep understanding of real-time market data feeds and data processing techniques.
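The sketch below shows what a simple real-time consumer might look like using the kafka-python client; the topic name, broker address, and message fields are assumptions, and Solace or ION integrations would use their own client SDKs instead.

```python
import json
from kafka import KafkaConsumer  # kafka-python package

# Topic name, broker address, and message schema are hypothetical.
consumer = KafkaConsumer(
    "market-data.fx",
    bootstrap_servers="localhost:9092",
    group_id="analytics-pipeline",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="latest",
)

for message in consumer:
    tick = message.value
    # Downstream processing of each tick, e.g. enrichment or aggregation.
    print(tick.get("symbol"), tick.get("bid"), tick.get("ask"))
```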
Financial Markets:
Strong domain knowledge in Fixed Income, FX, and Equities trading.
DevOps:
Experience implementing DevOps practices to automate the build, test, and deployment of data solutions.
Monitoring Tools:
Experience using tools such as ITRS to monitor the performance and health of data systems.