Senior Data Engineer (AWS, SQL, Python)
Curve was founded with a rebellious spirit and a lofty vision: to truly simplify your finances, so you can focus on what matters most in life. That’s why Curve puts your finances at your fingertips, so you can make smart choices about how to spend, send, see and save your money. We help you control your financial life, so you can go out and live the life you want to live.
With Curve you can spend from all your accounts, track your spending behaviour for insights, and rely on security that protects you from fraud. For the first time, you get clear insight into and control of all your money in one beautiful place.
Who we’re looking for
We’re looking for a Senior Data Engineer to join the core analytics team. Our mission is to build a robust platform that collects data from multiple parts of the system and enables its activation for business and product use. We believe data is at the core of the business. We hate latency and approximation.
Our timeline is full of exciting projects and ideas we want to implement. This role is a great opportunity for a Senior Data Engineer who loves data, mathematics, software architecture, system architecture and programming, and wants to have an impact and build systems from scratch. You will mainly be working with AWS (Redshift, S3, EC2, Kinesis, Data Pipeline), Chartio and Tableau for analytics and visualisation, Python, and Snowplow.
What will your day involve
- Managing and administering our Data Warehouse (Redshift)
- Designing and building pipelines to collect data from various data sources, both real-time (Kinesis/Snowplow) and batch (AWS Data Pipeline, Lambda, Airflow)
- Mentoring junior data engineers by providing feedback, leading critiques, and elevating the output of the team
- Being part of the Technical Design Authority and providing technical leadership, direction and best practices to the wider engineering teams
Who you should be
- You have at least 6 years’ experience with data warehouse and pipeline design
- You have production experience with both real-time event streaming (Snowplow/Kinesis/Kafka/Microsoft Azure Event Hubs/Google Pub/Sub) and batch processing jobs using schedulers (AWS Data Pipeline/Airflow/Oozie/Luigi)
- You have a proven track record with a modern programming language (any of Python/Scala/Go is ideal)
- You are an expert in SQL (MS SQL/MySQL/Postgres/PL/SQL)
- You are experienced with NoSQL databases such as Elasticsearch and MongoDB