Senior Cloud Data Engineer, Framework Engineering
Job Description
Job Overview
The Senior Cloud Data Engineer is responsible for the fidelity of Arrowstreet Capital's data-driven investment process. The role involves building visibility and recovery mechanisms for our modern distributed investment process. The daily investment process includes processing external vendor data and managing the flow of that data through pre-trade, optimization, post-trade, and reporting. The modern investment process is driven by many common frameworks, distributed platforms, and applications with complex dependencies, which requires a systematic approach and consistent tooling to maintain a comprehensive view and reduce process debt.
The ideal candidate is interested in connecting modern technology with the processes that drive the daily automated investment process. This role is a great opportunity for a technologist interested in the end-to-end investment process and in working with the latest cloud-native technology.
Responsibilities
- Develop cloud-native solutions that enable visibility, support rapid response, and reduce mean time to recovery for the end-to-end data-driven investment process.
- Design and build a system that interacts with distributed investment process applications to obtain the information needed for observability and management.
- Work with key stakeholders to build tooling that visualizes the investment process network and identifies potential system risk factors and capacity issues.
- Design and build the ability to run simulations through the entire system.
- Apply data warehousing concepts, dimensional modeling, and data modeling techniques.
- Take ownership of assigned deliverables and complete them in a timely manner with little day-to-day management.
- Assess the trade-offs of different candidate systems, processes, and technologies.
- Work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal financial and system data visualization solutions.
- Partner with cross-functional teams to gather and document requirements that meet business needs, then use those requirements to design, develop, test, and implement reports and dashboards that turn the underlying data into cutting-edge process management products.
Qualifications
- Bachelor’s degree in computer science, systems analysis, or a related field, or equivalent experience
- 5+ years of experience building cloud-based applications in Python
- 5+ years of experience with cloud services (AWS, GCP, Azure), with AWS preferred.
- 5+ years of experience with AWS data services such as Redshift, Aurora PostgreSQL, S3 (Parquet), Athena, or DynamoDB
- 3+ years of experience in developing and optimizing data transformation processes using tools like Apache Spark or cloud-native services (e.g., AWS Glue, EMR).
- 2+ years of experience building CI/CD pipelines, with strong knowledge of Git.
- 2+ years of experience with container technologies such as Docker and Kubernetes.
- 2+ years of experience writing Infrastructure as Code with Python, Terraform, or CloudFormation
- Familiarity with tools such as Amazon OpenSearch, Prometheus, CloudWatch, and CloudTrail, as well as IAM resource and role policies
- Prior experience with object-oriented languages like C# or Java is a plus.
- Prior AWS certifications would be a plus
- Previous experience writing cloud-native applications using AWS SAM (or the Serverless Framework), Lambda, Step Functions, and/or DynamoDB would be a plus.
We maintain a friendly, team-oriented environment and place a high value on professionalism, attitude, and initiative.