DevOps Engineer
Millennium is a top-tier global hedge fund with a strong commitment to leveraging innovations in technology and data science to solve complex problems for the business. We are assembling a strong Commodities Technology team to build our next-generation in-house Commodities platform. Commodities Technology provides a dynamic and fast-paced environment with excellent growth opportunities.
Responsibilities
- Work closely with quants, portfolio managers, risk managers, and other engineers in New York, Chicago, Miami, and Bangalore to develop data-intensive, multi-asset analytics for our Commodities platform
- Gather requirements and user feedback in collaboration with fellow engineers and project leads
- Design, build, and refactor robust software applications with clean, concise code, following Agile and continuous delivery practices
- Automate system maintenance tasks, end-of-day processing jobs, data integrity checks, and bulk data loads/extracts
- Stay abreast of industry trends, new platforms, and tools, and build the business case for adopting new technologies
- Develop new tools and infrastructure using Python (Flask/FastAPI) or Java (Spring Boot) and a relational data backend on AWS (Aurora/Redshift/Athena/S3)
- Support users and operational flows for the quantitative risk, senior management, and portfolio management teams using the tools developed
Qualifications/Skills Required
- Advanced degree in computer science or another scientific field
- 3+ years of experience with CI/CD tools such as TeamCity, Jenkins, Octopus Deploy, and ArgoCD
- AWS cloud infrastructure design, implementation, and support, with experience across multiple AWS services
- Infrastructure as code: deploying cloud infrastructure using Terraform or CloudFormation
- Knowledge of Python (Flask/FastAPI/Django)
- Demonstrated expertise in containerizing applications and orchestrating them in Kubernetes environments
- Experience with at least one monitoring/observability stack (Datadog, ELK, Splunk, Loki, Grafana)
- Strong knowledge of Unix or Linux
- Strong communication skills to collaborate with various stakeholders
- Able to work independently in a fast-paced environment
- Detail oriented, organized, demonstrating thoroughness and strong ownership of work
- Experience working in a production environment
- Some experience with relational and non-relational databases
Qualifications/Nice to Have
- Experience with a messaging middleware platform such as Solace, Kafka, or RabbitMQ
- Experience with Snowflake and distributed processing technologies (e.g., Hadoop, Flink, Spark)